Sample records for decentralized workflow management

  1. A three-level atomicity model for decentralized workflow management systems

    NASA Astrophysics Data System (ADS)

    Ben-Shaul, Israel Z.; Heineman, George T.

    1996-12-01

    A workflow management system (WFMS) employs a workflow manager (WM) to execute and automate the various activities within a workflow. To protect the consistency of data, the WM encapsulates each activity with a transaction; a transaction manager (TM) then guarantees the atomicity of activities. Since workflows often group several activities together, the TM is responsible for guaranteeing the atomicity of these units. There are scalability issues, however, with centralized WFMSs. Decentralized WFMSs provide an architecture for multiple autonomous WFMSs to interoperate, thus accommodating multiple workflows and geographically-dispersed teams. When atomic units are composed of activities spread across multiple WFMSs, however, there is a conflict between global atomicity and local autonomy of each WFMS. This paper describes a decentralized atomicity model that enables workflow administrators to specify the scope of multi-site atomicity based upon the desired semantics of multi-site tasks in the decentralized WFMS. We describe an architecture that realizes our model and execution paradigm.
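
    As a rough illustration of the idea of scoping multi-site atomicity, the sketch below shows one way an administrator might declare how far atomicity extends for a unit of activities that spans several autonomous WFMSs. This is a hypothetical Python sketch, not the authors' model or implementation; all names are invented.

      # Hypothetical sketch: declaring the atomicity scope of a multi-site unit.
      from dataclasses import dataclass
      from enum import Enum


      class AtomicityScope(Enum):
          LOCAL = "local"            # atomic only within the owning WFMS
          NEGOTIATED = "negotiated"  # atomic across the sites that agree to cooperate
          GLOBAL = "global"          # atomic across every participating WFMS


      @dataclass
      class AtomicUnit:
          name: str
          activities: list[tuple[str, str]]  # (site, activity) pairs
          scope: AtomicityScope

          def sites(self) -> set[str]:
              return {site for site, _ in self.activities}


      unit = AtomicUnit(
          name="multi-site-build",
          activities=[("site-A", "compile"), ("site-B", "test"), ("site-A", "release")],
          scope=AtomicityScope.NEGOTIATED,
      )
      print(unit.sites())  # which WFMSs must agree before the unit can commit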

  2. Guest Editor's Introduction

    NASA Astrophysics Data System (ADS)

    Chrysanthis, Panos K.

    1996-12-01

    This special issue focuses on current efforts to represent and support workflows that integrate information systems and human resources within a business or manufacturing enterprise. Workflows may also be viewed as an emerging computational paradigm for effective structuring of cooperative applications involving human users and access to diverse data types not necessarily maintained by traditional database management systems. A workflow is an automated organizational process (also called a business process) which consists of a set of activities or tasks that need to be executed in a particular controlled order over a combination of heterogeneous database systems and legacy systems. Within workflows, tasks are performed cooperatively by either human or computational agents in accordance with their roles in the organizational hierarchy. The challenge in facilitating the implementation of workflows lies in developing efficient workflow management systems. A workflow management system (also called a workflow server, workflow engine or workflow enactment system) provides the necessary interfaces for coordination and communication among human and computational agents to execute the tasks involved in a workflow, and controls the execution orderings of tasks as well as the flow of data that these tasks manipulate. That is, the workflow management system is responsible for correctly and reliably supporting the specification, execution, and monitoring of workflows. The six papers selected (out of the twenty-seven submitted for this special issue of Distributed Systems Engineering) address different aspects of these three functional components of a workflow management system. In the first paper, `Correctness issues in workflow management', Kamath and Ramamritham discuss the important issue of correctness in workflow management, which is a prerequisite for the use of workflows in the automation of critical organizational/business processes. In particular, this paper examines the issues of execution atomicity and failure atomicity, differentiating between the correctness requirements arising from system failures and those arising from logical failures, and surveys techniques that can be used to ensure data consistency in workflow management systems. While the first paper is concerned with correctness assuming transactional workflows in which selective transactional properties are associated with individual tasks or the entire workflow, the second paper, `Scheduling workflows by enforcing intertask dependencies' by Attie et al, assumes that the tasks can be either transactions or other activities involving legacy systems. This second paper describes the modelling and specification of conditions involving events and dependencies among tasks within a workflow using temporal logic and finite state automata. It also presents a scheduling algorithm that enforces all stated dependencies by executing at any given time only those events that are allowed by all the dependency automata and in an order as specified by the dependencies. In any system with decentralized control, there is a need to effectively cope with the tension that exists between autonomy and consistency requirements. In `A three-level atomicity model for decentralized workflow management systems', Ben-Shaul and Heineman focus on the specific requirement of enforcing failure atomicity in decentralized, autonomous and interacting workflow management systems.
Their paper describes a model in which each workflow manager must be able to specify the sequence of tasks that comprise an atomic unit for the purposes of correctness, and the degrees of local and global atomicity for the purpose of cooperation with other workflow managers. The paper also discusses a realization of this model in which treaties and summits provide an agreement mechanism, while underlying transaction managers are responsible for maintaining failure atomicity. The fourth and fifth papers are experience papers describing a workflow management system and a large scale workflow application, respectively. Schill and Mittasch, in `Workflow management systems on top of OSF DCE and OMG CORBA', describe a decentralized workflow management system and discuss its implementation using two standardized middleware platforms, namely, OSF DCE and OMG CORBA. The system supports a new approach to workflow management, introducing several new concepts such as data type management for integrating various types of data and quality of service for various services provided by servers. A problem common to both database applications and workflows is the handling of missing and incomplete information. This is particularly pervasive in an `electronic market' with a huge number of retail outlets producing and exchanging volumes of data, the application discussed in `Information flow in the DAMA project beyond database managers: information flow managers'. Motivated by the need for a method that allows a task to proceed in a timely manner if not all data produced by other tasks are available by its deadline, Russell et al propose an architectural framework and a language that can be used to detect, approximate and, later on, to adjust missing data if necessary. The final paper, `The evolution towards flexible workflow systems' by Nutt, is complementary to the other papers and is a survey of issues and of work related to both workflow and computer supported collaborative work (CSCW) areas. In particular, the paper provides a model and a categorization of the dimensions which workflow management and CSCW systems share. Besides summarizing the recent advancements towards efficient workflow management, the papers in this special issue suggest areas open to investigation and it is our hope that they will also provide the stimulus for further research and development in the area of workflow management systems.

  3. Decentralizing the Team Station: Simulation before Reality as a Best-Practice Approach.

    PubMed

    Charko, Jackie; Geertsen, Alice; O'Brien, Patrick; Rouse, Wendy; Shahid, Ammarah; Hardenne, Denise

    2016-01-01

    The purpose of this article is to share the logistical planning requirements and simulation experience of one Canadian hospital as it prepared its staff for the change from a centralized inpatient unit model to the decentralized design planned for its new community hospital. With the commitment and support of senior leadership, project management resources and clinical leads worked collaboratively to design a decentralized prototype in the form of a pod-style environment in the hospital's current setting. Critical success factors included engaging the right stakeholders, providing an opportunity to test new workflows and technology, creating a strong communication plan and building on lessons learned as subsequent pod prototypes are launched.

  4. Economic analysis of centralized vs. decentralized electronic data capture in multi-center clinical studies.

    PubMed

    Walden, Anita; Nahm, Meredith; Barnett, M Edwina; Conde, Jose G; Dent, Andrew; Fadiel, Ahmed; Perry, Theresa; Tolk, Chris; Tcheng, James E; Eisenstein, Eric L

    2011-01-01

    New data management models are emerging in multi-center clinical studies. We evaluated the incremental costs associated with decentralized vs. centralized models. We developed clinical research network economic models to evaluate three data management models: centralized, decentralized with local software, and decentralized with shared database. Descriptive information from three clinical research studies served as inputs for these models. The primary outcome was total data management costs. Secondary outcomes included: data management costs for sites, local data centers, and central coordinating centers. Both decentralized models were more costly than the centralized model for each clinical research study: the decentralized with local software model was the most expensive. Decreasing the number of local data centers and case book pages reduced cost differentials between models. Decentralized vs. centralized data management in multi-center clinical research studies is associated with increases in data management costs.
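
    To make the cost comparison concrete, the toy model below follows the cost structure named in the abstract (site costs, local data center costs, central coordinating center costs) and shows how adding local data centers raises the total. All figures and cost drivers are invented for illustration; they do not reproduce the study's inputs or results.

      # Toy cost model; every number below is hypothetical.
      def total_cost(n_sites, n_local_centers, pages_per_casebook,
                     site_cost=5_000, local_center_cost=40_000,
                     central_cost=120_000, per_page_cost=20):
          sites = n_sites * site_cost
          local = n_local_centers * local_center_cost
          central = central_cost + n_sites * pages_per_casebook * per_page_cost
          return sites + local + central


      centralized = total_cost(n_sites=20, n_local_centers=0, pages_per_casebook=100)
      decentralized = total_cost(n_sites=20, n_local_centers=5, pages_per_casebook=100)
      print(decentralized - centralized)  # incremental cost of decentralizing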

  5. Economic Analysis of Centralized vs. Decentralized Electronic Data Capture in Multi-Center Clinical Studies

    PubMed Central

    Walden, Anita; Nahm, Meredith; Barnett, M. Edwina; Conde, Jose G.; Dent, Andrew; Fadiel, Ahmed; Perry, Theresa; Tolk, Chris; Tcheng, James E.; Eisenstein, Eric L.

    2012-01-01

    Background New data management models are emerging in multi-center clinical studies. We evaluated the incremental costs associated with decentralized vs. centralized models. Methods We developed clinical research network economic models to evaluate three data management models: centralized, decentralized with local software, and decentralized with shared database. Descriptive information from three clinical research studies served as inputs for these models. Main Outcome Measures The primary outcome was total data management costs. Secondary outcomes included: data management costs for sites, local data centers, and central coordinating centers. Results Both decentralized models were more costly than the centralized model for each clinical research study: the decentralized with local software model was the most expensive. Decreasing the number of local data centers and case book pages reduced cost differentials between models. Conclusion Decentralized vs. centralized data management in multi-center clinical research studies is associated with increases in data management costs. PMID:21335692

  6. Back to basics: does decentralization improve health system performance? Evidence from Ceara in north-east Brazil.

    PubMed Central

    Atkinson, Sarah; Haran, Dave

    2004-01-01

    OBJECTIVE: To examine whether decentralization has improved health system performance in the State of Ceara, north-east Brazil. METHODS: Ceara is strongly committed to decentralization. A survey across 45 local (municipio) health systems collected data on performance and formal organization, including decentralization, informal management and local political culture. The indicators for informal management and local political culture were based on prior ethnographic research. Data were analysed using analysis of variance, Duncan's post-hoc test and multiple regression. FINDINGS: Decentralization was associated with improved performance, but only for 5 of our 22 performance indicators. Moreover, in the multiple regression, decentralization explained the variance in only one performance indicator; indicators for informal management and political culture appeared to be more important influences. However, some indicators for informal management were themselves associated with decentralization but not any of the political culture indicators. CONCLUSION: Good management practices in the study led to decentralized local health systems rather than vice versa. Any apparent association between decentralization and performance seems to be an artefact of the informal management, and the wider political culture in which a local health system is embedded strongly influences the performance of local health systems. PMID:15640917

  7. Responsibility Center Management: Lessons from 25 Years of Decentralized Management.

    ERIC Educational Resources Information Center

    Strauss, Jon C.; Curry, John R.

    Decentralization of authority is a natural act in universities, but decentralization of responsibility is not. A problem faced by universities is the decoupling of academic authority from financial responsibility. The solution proposed in this book for the coupling is Responsibility Center Management (RCM), also called Revenue Responsibility…

  8. Expansion of a residency program through provision of second-shift decentralized services.

    PubMed

    Host, Brian D; Anderson, Michael J; Lucas, Paul D

    2014-12-15

    The rationale for and logistics of the expansion of a postgraduate year 1 (PGY1) residency program in a community hospital are described. Baptist Health Lexington, a nonprofit community hospital in Lexington, Kentucky, sought to expand the PGY1 program by having residents perform second-shift decentralized pharmacist functions. Program expansion was predicated on aligning resident staffing functions with current hospitalwide initiatives involving medication reconciliation and patient education. The focus was to integrate residents into the workflow while allowing them more time to practice as pharmacists and contribute to departmental objectives. The staffing function would increase residents' overall knowledge of departmental operations and foster their sense of independence and ownership. The decentralized functions would include initiation of clinical pharmacokinetic consultations, admission medication reconciliation, discharge teaching for patients with heart failure, and order-entry support from decentralized locations. The program grew from three to five residents and established a staffing rotation for second-shift decentralized coverage. The increased time spent staffing did not detract from the time allotted to previously established learning experiences and enhanced overall continuity of the staffing experience. The change also emphasized to the residents the importance of integration of distributive and clinical functions within the department. Pharmacist participation in admission and discharge medication reconciliation activities has also increased patient satisfaction, evidenced by follow-up surveys conducted by the hospital. A PGY1 residency program was expanded through the provision of second-shift decentralized clinical services, which helped provide residents with increased patient exposure and enhanced staffing experience. Copyright © 2014 by the American Society of Health-System Pharmacists, Inc. All rights reserved.

  9. Kwf-Grid workflow management system for Earth science applications

    NASA Astrophysics Data System (ADS)

    Tran, V.; Hluchy, L.

    2009-04-01

    In this paper, we present a workflow management tool for Earth science applications in EGEE. The tool was originally developed within the K-wf Grid project for GT4 middleware and has many advanced features such as semi-automatic workflow composition, a user-friendly GUI for managing workflows, and knowledge management. In EGEE, we are porting the workflow management tool to gLite middleware for Earth science applications. The K-wf Grid workflow management system was developed within the "Knowledge-based Workflow System for Grid Applications" project under the 6th Framework Programme. The workflow management system is intended to semi-automatically compose a workflow of Grid services, execute the composed workflow application in a Grid computing environment, monitor the performance of the Grid infrastructure and the Grid applications, analyze the resulting monitoring information, capture the knowledge that is contained in the information by means of intelligent agents, and finally to reuse the joint knowledge gathered from all participating users in a collaborative way in order to efficiently construct workflows for new Grid applications. K-wf Grid workflow engines can support different types of jobs (e.g., GRAM jobs, web services) in a workflow. A new class of gLite job has been added to the system, allowing it to manage and execute gLite jobs on the EGEE infrastructure. The GUI has been adapted to the requirements of EGEE users, and a new credential management servlet has been added to the portal. Porting the K-wf Grid workflow management system to gLite allows EGEE users to use the system and benefit from its advanced features. The system has initially been tested and evaluated with applications from the ES clusters.
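
    The point that one workflow engine can drive heterogeneous job types (GRAM jobs, web services, gLite jobs) can be illustrated with the minimal Python sketch below. The class names and the dispatch loop are hypothetical and do not correspond to the actual K-wf Grid interfaces.

      # Hypothetical sketch: an engine runs a workflow whose nodes are different job types.
      from abc import ABC, abstractmethod


      class Job(ABC):
          @abstractmethod
          def submit(self) -> str: ...


      class GramJob(Job):
          def submit(self) -> str:
              return "submitted via GRAM"


      class WebServiceJob(Job):
          def submit(self) -> str:
              return "invoked as a web service"


      class GliteJob(Job):
          def submit(self) -> str:
              return "submitted to the gLite WMS"


      def run_workflow(jobs: list[Job]) -> None:
          # The engine treats every node uniformly; each job type knows how to submit itself.
          for job in jobs:
              print(job.submit())


      run_workflow([GramJob(), WebServiceJob(), GliteJob()])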

  10. Decentralized Budgeting in Education: Model Variations and Practitioner Perspectives.

    ERIC Educational Resources Information Center

    Hall, George; Metsinger, Jackie; McGinnis, Patricia

    In educational settings, decentralized budgeting refers to various fiscal practices that disperse budgeting responsibility away from central administration to the line education units. This distributed decision-making is common to several financial management models. Among the many financial management models that employ decentralized budgeting…

  11. Workflow management systems in radiology

    NASA Astrophysics Data System (ADS)

    Wendler, Thomas; Meetz, Kirsten; Schmidt, Joachim

    1998-07-01

    In a situation of shrinking health care budgets, increasing cost pressure and growing demands to increase the efficiency and quality of medical services, health care enterprises are forced to optimize or completely re-design their processes. Although information technology is widely agreed to contribute to cost reduction and efficiency improvement, the real success factors are the re-definition and automation of processes: Business Process Re-engineering and Workflow Management. In this paper we discuss architectures for the use of workflow management systems in radiology. We propose to move forward from information systems in radiology (RIS, PACS) to Radiology Management Systems, in which workflow functionality (process definitions and process automation) is implemented through autonomous workflow management systems (WfMS). In a workflow-oriented architecture, an autonomous workflow enactment service communicates with workflow client applications via standardized interfaces. We discuss the need for and the benefits of such an approach, emphasizing the separation of the workflow management system from the application systems and the consequences that this separation has for the architecture of workflow-oriented information systems, including an appropriate workflow terminology and the definition of standard interfaces for workflow-aware application systems. Workflow studies in various institutions have shown that most processes in radiology are well structured and suited to a workflow management approach. Numerous commercially available workflow management systems were investigated, and some of them, which are process-oriented and application-independent, appear suitable for use in radiology.

  12. Taking advantage of HTML5 browsers to realize the concepts of session state and workflow sharing in web-tool applications

    NASA Astrophysics Data System (ADS)

    Suftin, I.; Read, J. S.; Walker, J.

    2013-12-01

    Scientists prefer not having to be tied down to a specific machine or operating system in order to analyze local and remote data sets or publish work. Increasingly, analysis has been migrating to decentralized web services and data sets, using web clients to provide the analysis interface. While simplifying workflow access, analysis, and publishing of data, the move does bring with it its own unique set of issues. Web clients used for analysis typically offer workflows geared towards a single user, with steps and results that are often difficult to recreate and share with others. Furthermore, workflow results often may not be easily used as input for further analysis. Older browsers further complicate things by having no way to maintain larger chunks of information, often offloading the job of storage to the back-end server or trying to squeeze it into a cookie. It has been difficult to provide a concept of "session storage" or "workflow sharing" without a complex orchestration of the back-end for storage depending on either a centralized file system or database. With the advent of HTML5, browsers gained the ability to store more information through the use of the Web Storage API (a browser-cookie holds a maximum of 4 kilobytes). Web Storage gives us the ability to store megabytes of arbitrary data in-browser either with an expiration date or just for a session. This allows scientists to create, update, persist and share their workflow without depending on the backend to store session information, providing the flexibility for new web-based workflows to emerge. In the DSASWeb portal ( http://cida.usgs.gov/DSASweb/ ), using these techniques, the representation of every step in the analyst's workflow is stored as plain-text serialized JSON, which we can generate as a text file and provide to the analyst as an upload. This file may then be shared with others and loaded back into the application, restoring the application to the state it was in when the session file was generated. A user may then view results produced during that session or go back and alter input parameters, creating new results and producing new, unique sessions which they can then again share. This technique not only provides independence for the user to manage their session as they like, but also allows much greater freedom for the application provider to scale out without having to worry about carrying over user information or maintaining it in a central location.
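
    The serialize-and-restore idea described above (every workflow step kept as plain-text JSON that can be downloaded, shared, and re-loaded) can be sketched in a few lines. The sketch below is hypothetical Python; the step names and fields are invented and are not the actual DSASweb session format, and the browser-side Web Storage calls are not shown.

      import json

      # A session: an ordered list of workflow steps with their parameters.
      session = {
          "tool": "shoreline-analysis",
          "steps": [
              {"step": "upload", "params": {"file": "shorelines.zip"}},
              {"step": "cast-transects", "params": {"spacing_m": 50}},
              {"step": "calculate-rates", "params": {"method": "linear-regression"}},
          ],
      }

      # "Download": persist the session as a plain-text file the analyst can share.
      with open("session.json", "w") as fh:
          json.dump(session, fh, indent=2)

      # "Upload": restore the state, after which any step can be re-run or altered.
      with open("session.json") as fh:
          restored = json.load(fh)

      assert restored == session
      print(f"restored {len(restored['steps'])} workflow steps")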

  13. Providing leadership to a decentralized total quality process.

    PubMed

    Diederich, J J; Eisenberg, M

    1993-01-01

    Integrating total quality management into the culture of an organization and the daily work of employees requires a decentralized leadership structure that encourages all employees to become involved. This article, based upon the experience of the University of Michigan Hospitals Professional Services Divisional Lead Team, outlines a process for decentralizing the total quality management process.

  14. Research and Implementation of Key Technologies in Multi-Agent System to Support Distributed Workflow

    NASA Astrophysics Data System (ADS)

    Pan, Tianheng

    2018-01-01

    In recent years, the combination of workflow management systems and multi-agent technology has become a hot research field. The lack of flexibility in workflow management systems can be remedied by introducing multi-agent collaborative management. The workflow management system described here adopts a distributed structure, which addresses the fragility of the traditional centralized workflow architecture. In this paper, the agents of the distributed workflow management system are divided according to their functions, the execution process of each type of agent is analyzed, and key technologies such as process execution and resource management are examined.
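
    The division of agents by function can be pictured with the small sketch below, in which a process agent drives task execution while a resource agent decides where each task runs. The agent roles, allocation rule and interaction shown here are invented for illustration and are not taken from the paper.

      # Hypothetical sketch: function-specific agents cooperating on a distributed workflow.
      class ResourceAgent:
          def __init__(self, resources):
              self.resources = list(resources)

          def allocate(self, task: str) -> str:
              # Naive round-robin allocation; a real system would negotiate.
              resource = self.resources.pop(0)
              self.resources.append(resource)
              return resource


      class ProcessAgent:
          def __init__(self, tasks, resource_agent: ResourceAgent):
              self.tasks = list(tasks)
              self.resource_agent = resource_agent

          def run(self) -> None:
              for task in self.tasks:
                  node = self.resource_agent.allocate(task)
                  print(f"executing '{task}' on {node}")


      ProcessAgent(["review", "approve", "archive"],
                   ResourceAgent(["node-1", "node-2"])).run()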

  15. Strategies of organization and service for the critical-care laboratory.

    PubMed

    Fleisher, M; Schwartz, M K

    1990-08-01

    Critical-care medicine requires rapidity of treatment decisions and clinical management. To meet the objectives of critical-care medicine, the critical-care laboratory must consider four major aspects of laboratory organization in addition to analytical responsibilities: specimen collection and delivery, training of technologists, selection of reliable instrumentation, and efficient data dissemination. One must also consider the advantages and disadvantages of centralization vs decentralization, the influence of such a laboratory on patient care and personnel needs, and the space required for optimal operation. Centralization may lead to workflow interruption and increased turnaround time (TAT); decentralization requires redundancy of instrumentation and staff but may shorten TAT. Minimal TAT is the hallmark of efficient laboratory service. We surveyed 55 laboratories in 33 hospitals and found that virtually all hospitals with 200 or more beds had a critical-care laboratory operating as a satellite of the main laboratory. We present data on actual TAT, although these were available in only eight of the 15 routine laboratories that provided emergency service and in eight of the 40 critical-care laboratories. In meeting the challenges of an increasing workload, a reduced clinical laboratory work force, and the need to reduce TAT, changes in traditional laboratory practice are mandatory. An increased reliance on whole-blood analysis, for example, should eliminate delays associated with sample preparation, reduce the potential hazards associated with centrifugation, and eliminate excess specimen handling.

  16. Agile parallel bioinformatics workflow management using Pwrake.

    PubMed

    Mishima, Hiroyuki; Sasaki, Kensaku; Tanaka, Masahiro; Tatebe, Osamu; Yoshiura, Koh-Ichiro

    2011-09-08

    In bioinformatics projects, scientific workflow systems are widely used to manage computational procedures. Full-featured workflow systems have been proposed to fulfil the demand for workflow management. However, such systems tend to be over-weighted for actual bioinformatics practices. We realize that quick deployment of cutting-edge software implementing advanced algorithms and data formats, and continuous adaptation to changes in computational resources and the environment are often prioritized in scientific workflow management. These features have a greater affinity with the agile software development method through iterative development phases after trial and error. Here, we show the application of a scientific workflow system Pwrake to bioinformatics workflows. Pwrake is a parallel workflow extension of Ruby's standard build tool Rake, the flexibility of which has been demonstrated in the astronomy domain. Therefore, we hypothesize that Pwrake also has advantages in actual bioinformatics workflows. We implemented the Pwrake workflows to process next generation sequencing data using the Genomic Analysis Toolkit (GATK) and Dindel. GATK and Dindel workflows are typical examples of sequential and parallel workflows, respectively. We found that in practice, actual scientific workflow development iterates over two phases, the workflow definition phase and the parameter adjustment phase. We introduced separate workflow definitions to help focus on each of the two developmental phases, as well as helper methods to simplify the descriptions. This approach increased iterative development efficiency. Moreover, we implemented combined workflows to demonstrate modularity of the GATK and Dindel workflows. Pwrake enables agile management of scientific workflows in the bioinformatics domain. The internal domain specific language design built on Ruby gives the flexibility of rakefiles for writing scientific workflows. Furthermore, readability and maintainability of rakefiles may facilitate sharing workflows among the scientific community. Workflows for GATK and Dindel are available at http://github.com/misshie/Workflows.
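
    Pwrake workflows are written as Ruby rakefiles, so the dependency-driven style the abstract refers to is that of Rake file tasks. As a rough, language-shifted illustration only, the Python sketch below mimics that style: each target declares its prerequisites and an action, and the runner resolves prerequisites before running the target. File names and actions are invented.

      # Rough Python analogue of a dependency-driven (rakefile-like) workflow; not Pwrake syntax.
      TASKS = {
          # target: (prerequisites, action)
          "aligned.bam": (["reads.fastq"], "run aligner"),
          "variants.vcf": (["aligned.bam"], "run variant caller"),
          "report.html": (["variants.vcf"], "render report"),
      }


      def build(target, done=None):
          """Resolve prerequisites depth-first, running each task at most once."""
          done = set() if done is None else done
          if target in done:
              return
          prereqs, action = TASKS.get(target, ([], None))
          for prereq in prereqs:
              build(prereq, done)
          if action:
              print(f"{target}: {action}")
          done.add(target)


      build("report.html")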

  17. Agile parallel bioinformatics workflow management using Pwrake

    PubMed Central

    2011-01-01

    Background In bioinformatics projects, scientific workflow systems are widely used to manage computational procedures. Full-featured workflow systems have been proposed to fulfil the demand for workflow management. However, such systems tend to be over-weighted for actual bioinformatics practices. We realize that quick deployment of cutting-edge software implementing advanced algorithms and data formats, and continuous adaptation to changes in computational resources and the environment are often prioritized in scientific workflow management. These features have a greater affinity with the agile software development method through iterative development phases after trial and error. Here, we show the application of a scientific workflow system Pwrake to bioinformatics workflows. Pwrake is a parallel workflow extension of Ruby's standard build tool Rake, the flexibility of which has been demonstrated in the astronomy domain. Therefore, we hypothesize that Pwrake also has advantages in actual bioinformatics workflows. Findings We implemented the Pwrake workflows to process next generation sequencing data using the Genomic Analysis Toolkit (GATK) and Dindel. GATK and Dindel workflows are typical examples of sequential and parallel workflows, respectively. We found that in practice, actual scientific workflow development iterates over two phases, the workflow definition phase and the parameter adjustment phase. We introduced separate workflow definitions to help focus on each of the two developmental phases, as well as helper methods to simplify the descriptions. This approach increased iterative development efficiency. Moreover, we implemented combined workflows to demonstrate modularity of the GATK and Dindel workflows. Conclusions Pwrake enables agile management of scientific workflows in the bioinformatics domain. The internal domain specific language design built on Ruby gives the flexibility of rakefiles for writing scientific workflows. Furthermore, readability and maintainability of rakefiles may facilitate sharing workflows among the scientific community. Workflows for GATK and Dindel are available at http://github.com/misshie/Workflows. PMID:21899774

  18. Engaging Social Capital for Decentralized Urban Stormwater Management

    EPA Science Inventory

    Decentralized approaches to urban stormwater management, whereby installations of green infrastructure (e.g., rain gardens, bioswales, and constructed wetlands) are dispersed throughout a management area, are cost-effective solutions with co-benefits beyond water abatement. Inste...

  19. Game-Based Virtual Worlds as Decentralized Virtual Activity Systems

    NASA Astrophysics Data System (ADS)

    Scacchi, Walt

    There is widespread interest in the development and use of decentralized systems and virtual world environments as possible new places for engaging in collaborative work activities. Similarly, there is widespread interest in stimulating new technological innovations that enable people to come together through social networking, file/media sharing, and networked multi-player computer game play. A decentralized virtual activity system (DVAS) is a networked computer supported work/play system whose elements and social activities can be both virtual and decentralized (Scacchi et al. 2008b). Massively multi-player online games (MMOGs) such as World of Warcraft and online virtual worlds such as Second Life are each popular examples of a DVAS. Furthermore, these systems are beginning to be used for research, development, and education activities in different science, technology, and engineering domains (Bainbridge 2007, Bohannon et al. 2009; Rieber 2005; Scacchi and Adams 2007; Shaffer 2006), which are also of interest here. This chapter explores two case studies of DVASs developed at the University of California at Irvine that employ game-based virtual worlds to support collaborative work/play activities in different settings. The settings include those that model and simulate practical or imaginative physical worlds in different domains of science, technology, or engineering through alternative virtual worlds where players/workers engage in different kinds of quests or quest-like workflows (Jakobsson 2006).

  20. Financial management systems under decentralization and their effect on malaria control in Uganda.

    PubMed

    Kivumbi, George W; Nangendo, Florence; Ndyabahika, Boniface Rutagira

    2004-01-01

    A descriptive case study with multiple sites and a single level of analysis was carried out in four purposefully selected administrative districts of Uganda to investigate the effect of financial management systems under decentralization on malaria control. Data were primarily collected from 36 interviews with district managers, staff at health units and local leaders. A review of records and documents related to decentralization at the central and district level was also used to generate data for the study. We found that a long, tedious, and bureaucratic process combined with lack of knowledge in working with new financial systems by several actors characterized financial flow under decentralization. This affected the timely use of financial resources for malaria control in that there were funds in the system that could not be accessed for use. We were also told that sometimes these funds were returned to the central government because of non-use due to difficulties in accessing them and/or stringent conditions not to divert them to other uses. Our data showed that a cocktail of bureaucratic control systems, corruption and incompetence make the financial management system under decentralization counter-productive for malaria control. The main conclusion is that good governance through appropriate and efficient financial management systems is very important for effective malaria control under decentralization.

  1. Decentralization's impact on the health workforce: Perspectives of managers, workers and national leaders

    PubMed Central

    Kolehmainen-Aitken, Riitta-Liisa

    2004-01-01

    Designers and implementers of decentralization and other reform measures have focused much attention on financial and structural reform measures, but ignored their human resource implications. Concern is mounting about the impact that the reallocation of roles and responsibilities has had on the health workforce and its management, but the experiences and lessons of different countries have not been widely shared. This paper examines evidence from published literature on decentralization's impact on the demand side of the human resource equation, as well as the factors that have contributed to the impact. The elements that make such an impact analysis exceptionally complex are identified. They include the mode of decentralization that a country is implementing, the level of responsibility for the salary budget and pay determination, and the civil service status of transferred health workers. The main body of the paper is devoted to examining decentralization's impact on human resource issues from three different perspectives: that of local health managers, health workers themselves, and national health leaders. These three groups have different concerns in the human resource realm, and consequently, have been differently affected by decentralization processes. The paper concludes with recommendations regarding three key concerns that national authorities and international agencies should give prompt attention to. They are (1) defining the essential human resource policy, planning and management skills for national human resource managers who work in decentralized countries, and developing training programs to equip them with such skills; (2) supporting research that focuses on improving the knowledge base of how different modes of decentralization impact on staffing equity; and (3) identifying factors that most critically influence health worker motivation and performance under decentralization, and documenting the most cost-effective best practices to improve them. Notable experiences from South Africa, Ghana, Indonesia and Mexico are shared in an annex. PMID:15144558

  2. Engaging Social Capital for Decentralized Urban Stormwater Management (Paper in Non-EPA Proceedings)

    EPA Science Inventory

    Decentralized approaches to urban stormwater management, whereby installations of green infrastructure (e.g., rain gardens, bioswales, constructed wetlands) are dispersed throughout a management area, are cost-effective solutions with co-benefits beyond just water abatement. Inst...

  3. A characterization of workflow management systems for extreme-scale applications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ferreira da Silva, Rafael; Filgueira, Rosa; Pietri, Ilia

    The automation of the execution of computational tasks is at the heart of improving scientific productivity. Over the last years, scientific workflows have been established as an important abstraction that captures data processing and computation of large and complex scientific applications. By allowing scientists to model and express entire data processing steps and their dependencies, workflow management systems relieve scientists from the details of an application and manage its execution on a computational infrastructure. As the resource requirements of today’s computational and data science applications that process vast amounts of data keep increasing, there is a compelling case for a new generation of advances in high-performance computing, commonly termed extreme-scale computing, which will bring forth multiple challenges for the design of workflow applications and management systems. This paper presents a novel characterization of workflow management systems using features commonly associated with extreme-scale computing applications. We classify 15 popular workflow management systems in terms of workflow execution models, heterogeneous computing environments, and data access methods. Finally, the paper also surveys workflow applications and identifies gaps for future research on the road to extreme-scale workflows and management systems.

  4. A characterization of workflow management systems for extreme-scale applications

    DOE PAGES

    Ferreira da Silva, Rafael; Filgueira, Rosa; Pietri, Ilia; ...

    2017-02-16

    The automation of the execution of computational tasks is at the heart of improving scientific productivity. Over the last years, scientific workflows have been established as an important abstraction that captures data processing and computation of large and complex scientific applications. By allowing scientists to model and express entire data processing steps and their dependencies, workflow management systems relieve scientists from the details of an application and manage its execution on a computational infrastructure. As the resource requirements of today’s computational and data science applications that process vast amounts of data keep increasing, there is a compelling case for a new generation of advances in high-performance computing, commonly termed extreme-scale computing, which will bring forth multiple challenges for the design of workflow applications and management systems. This paper presents a novel characterization of workflow management systems using features commonly associated with extreme-scale computing applications. We classify 15 popular workflow management systems in terms of workflow execution models, heterogeneous computing environments, and data access methods. Finally, the paper also surveys workflow applications and identifies gaps for future research on the road to extreme-scale workflows and management systems.

  5. [Analysis of the healthcare service decentralization process in Côte d'Ivoire].

    PubMed

    Soura, B D; Coulibaly, S S

    2014-01-01

    The decentralization of healthcare services is becoming increasingly important in strategies of public sector management. This concept is analyzed from various points of view, including legal, economic, political, and sociological. Several typologies have been proposed in the literature to analyze this decentralization process, which can take different forms ranging from simple deconcentration to more elaborate devolution. In some instances, decentralization can be analyzed by the degree of autonomy given to local authorities. This article applies these typologies to analyze the healthcare system decentralization process in Cote d'Ivoire. Special attention is paid to the new forms of community healthcare organizations. These decentralized structures enjoy a kind of autonomy, with characteristics closer to those of devolution. The model might serve as an example for population involvement in defining and managing healthcare problems in Cote d'Ivoire. We end with proposals for the improvement of the process.

  6. Progress on big data publication and documentation for machine-to-machine discovery, access, and processing

    NASA Astrophysics Data System (ADS)

    Walker, J. I.; Blodgett, D. L.; Suftin, I.; Kunicki, T.

    2013-12-01

    High-resolution data for use in environmental modeling is increasingly becoming available at broad spatial and temporal scales. Downscaled climate projections, remotely sensed landscape parameters, and land-use/land-cover projections are examples of datasets that may exceed an individual investigation's data management and analysis capacity. To allow projects on limited budgets to work with many of these data sets, the burden of working with them must be reduced. The approach being pursued at the U.S. Geological Survey Center for Integrated Data Analytics uses standard self-describing web services that allow machine-to-machine data access and manipulation. These techniques have been implemented and deployed in production-level server-based Web Processing Services that can be accessed from a web application or scripted workflow. Data publication techniques that allow machine-interpretation of large collections of data have also been implemented for numerous datasets at U.S. Geological Survey data centers as well as partner agencies and academic institutions. Discovery of data services is accomplished using a method in which a machine-generated metadata record holds content--derived from the data's source web service--that is intended for human interpretation as well as machine interpretation. A distributed search application has been developed that demonstrates the utility of a decentralized search of data-owner metadata catalogs from multiple agencies. The integrated but decentralized system of metadata, data, and server-based processing capabilities will be presented. The design, utility, and value of these solutions will be illustrated with applied science examples and success stories. Datasets such as the EPA's Integrated Climate and Land Use Scenarios, USGS/NASA MODIS derived land cover attributes, and downscaled climate projections from several sources are examples of data this system includes. These and other datasets have been published as standard, self-describing web services that provide the ability to inspect and subset the data. This presentation will demonstrate this file-to-web service concept and how it can be used from script-based workflows or web applications.
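
    The machine-to-machine pattern described here amounts to a script asking a self-describing service for just the subset it needs rather than downloading a whole archive. The sketch below shows that client-side pattern in Python; the URL and parameter names are placeholders, not actual USGS/CIDA endpoints.

      # Hypothetical client: request a spatial/temporal subset from a (placeholder) web service.
      import json
      import urllib.parse
      import urllib.request

      BASE_URL = "https://example.org/data-service/subset"  # placeholder endpoint

      params = urllib.parse.urlencode({
          "dataset": "downscaled-precipitation",
          "bbox": "-93.0,43.0,-89.0,47.0",       # west,south,east,north
          "time": "2000-01-01/2010-12-31",
          "format": "json",
      })

      with urllib.request.urlopen(f"{BASE_URL}?{params}") as response:
          subset = json.load(response)

      print(f"received {len(subset.get('features', []))} features")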

  7. Bioinformatics workflows and web services in systems biology made easy for experimentalists.

    PubMed

    Jimenez, Rafael C; Corpas, Manuel

    2013-01-01

    Workflows are useful for performing data analysis and integration in systems biology. Workflow management systems can help users create workflows without any previous knowledge of programming and web services. However, the computational skills required to build such workflows are usually above the level most biological experimentalists are comfortable with. In this chapter we introduce workflow management systems that reuse existing workflows instead of creating them, making it easier for experimentalists to perform computational tasks.

  8. EVALUATION OF ECONOMIC INCENTIVES FOR DECENTRALIZED STORMWATER RUNOFF MANAGEMENT

    EPA Science Inventory

    Impervious surfaces in urban and suburban areas can lead to excess stormwater runoff throughout a watershed, typically resulting in widespread hydrologic and ecological alteration of receiving streams. Decentralized stormwater management may improve stream ecosystems by reducing ...

  9. Strategies of Educational Decentralization: Key Questions and Core Issues.

    ERIC Educational Resources Information Center

    Hanson, E. Mark

    1998-01-01

    Explains key issues and forces that shape organization and management strategies of educational decentralization, using examples from Colombia, Venezuela, Argentina, Nicaragua, and Spain. Core decentralization issues include national and regional goals, planning, political stress, resource distribution, infrastructure development, and job…

  10. Radiology information system: a workflow-based approach.

    PubMed

    Zhang, Jinyan; Lu, Xudong; Nie, Hongchao; Huang, Zhengxing; van der Aalst, W M P

    2009-09-01

    Introducing workflow management technology in healthcare is a promising way of dealing with the problem that current healthcare information systems cannot provide sufficient support for process management, although several challenges still exist. The purpose of this paper is to study a method of developing a workflow-based information system, with the radiology department as a use case. First, a workflow model of a typical radiology process was established. Second, based on the model, the system could be designed and implemented as a group of loosely coupled components. Each component corresponded to one task in the process and could be assembled by the workflow management system. Legacy systems could be treated as special components, which also corresponded to tasks and were integrated by converting their non-workflow-aware interfaces into standard ones. Finally, a workflow dashboard was designed and implemented to provide an integral view of radiology processes. The workflow-based radiology information system was deployed in the radiology department of Zhejiang Chinese Medicine Hospital in China. The results showed that it could be adjusted flexibly in response to the needs of a changing process and that it enhanced process management in the department. It also provides a more workflow-aware integration method compared with other approaches such as IHE-based ones. The workflow-based approach is a new method of developing radiology information systems with more flexibility, more process management functionality and more workflow-aware integration. The work described in this paper is an initial endeavor toward introducing workflow management technology in healthcare.
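
    The "legacy system as a special component" idea can be sketched as a thin adapter that gives a non-workflow-aware system the same task interface as every other component, so the workflow engine can assemble it like the rest. The sketch below is hypothetical Python; the component and system names are invented and are not from the paper.

      # Hypothetical sketch: uniform task components plus an adapter around a legacy system.
      from typing import Protocol


      class TaskComponent(Protocol):
          def execute(self, study_id: str) -> str: ...


      class RegistrationComponent:
          def execute(self, study_id: str) -> str:
              return f"{study_id}: patient registered"


      class LegacyRis:
          # Pre-existing system with its own, non-standard call convention.
          def do_schedule(self, identifier: str) -> str:
              return f"{identifier}: examination scheduled (legacy RIS)"


      class LegacyRisAdapter:
          def __init__(self, legacy: LegacyRis):
              self.legacy = legacy

          def execute(self, study_id: str) -> str:
              return self.legacy.do_schedule(study_id)


      def run_process(components: list[TaskComponent], study_id: str) -> None:
          for component in components:
              print(component.execute(study_id))


      run_process([RegistrationComponent(), LegacyRisAdapter(LegacyRis())], "ST-0042")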

  11. Responsibility Center Management: A Financial Paradigm and Alternative to Decentralized Budgeting.

    ERIC Educational Resources Information Center

    Hensley, Phyllis A.; Bava, D. J.; Brennan, Denise C.

    This study examined the implementation of Responsibility Center Management (RCM) systems in two institutions of higher education: the Graduate School of Business at Institution and the Center of Collaborative Education and Professional Studies at Institution B. RCM is a management and budgeting process for universities that decentralizes authority…

  12. Progress in digital color workflow understanding in the International Color Consortium (ICC) Workflow WG

    NASA Astrophysics Data System (ADS)

    McCarthy, Ann

    2006-01-01

    The ICC Workflow WG serves as the bridge between ICC color management technologies and use of those technologies in real world color production applications. ICC color management is applicable to and is used in a wide range of color systems, from highly specialized digital cinema color special effects to high volume publications printing to home photography. The ICC Workflow WG works to align ICC technologies so that the color management needs of these diverse use case systems are addressed in an open, platform independent manner. This report provides a high level summary of the ICC Workflow WG objectives and work to date, focusing on the ways in which workflow can impact image quality and color systems performance. The 'ICC Workflow Primitives' and 'ICC Workflow Patterns and Dimensions' workflow models are covered in some detail. Consider the questions, "How much of dissatisfaction with color management today is the result of 'the wrong color transformation at the wrong time' and 'I can't get to the right conversion at the right point in my work process'?" Put another way, consider how image quality through a workflow can be negatively affected when the coordination and control level of the color management system is not sufficient.

  13. PARTICIPATORY STORM WATER MANAGEMENT AND SUSTAINABILITY – WHAT ARE THE CONNECTIONS?

    EPA Science Inventory

    Urban stormwater is typically conveyed to centralized infrastructure, and there is great potential for reducing stormwater runoff quantity through decentralization. For areas which are already developed, decentralization of stormwater management involves private property and poss...

  14. Leadership in Decentralized Schools.

    ERIC Educational Resources Information Center

    Madsen, Jean

    1997-01-01

    Summarizes a study that examined principals' leadership in three private schools and its implications for decentralized public schools. With the increase of charter and privatized managed schools, principals will need to redefine their leadership styles. Private schools, as decentralized entities, offer useful perspectives on developing school…

  15. Decentralized Decision Making Toward Educational Goals.

    ERIC Educational Resources Information Center

    Monahan, William W.; Johnson, Homer M.

    This monograph provides guidelines to help those school districts considering a more decentralized form of management. The authors discuss the levels at which different types of decisions should be made, describe the changing nature of the educational environment, identify different centralization-decentralization models, and suggest a flexible…

  16. Empowerment or Impediment? School Governance in the School-Based Management Era in Hong Kong

    ERIC Educational Resources Information Center

    Kwan, Paula; Li, Benjamin Yuet-man

    2015-01-01

    Following the international trend in education towards democracy and decentralization, the Hong Kong government introduced a school-based management (SBM) system about two decades ago. It is widely recognized in the literature that decentralization, empowering school level management and marginalizing the influence of the intermediate level of…

  17. Decentralized water resources management in Mozambique: Challenges of implementation at the river basin level

    NASA Astrophysics Data System (ADS)

    Inguane, Ronaldo; Gallego-Ayala, Jordi; Juízo, Dinis

    In the context of integrated water resources management implementation, the decentralization of water resources management (DWRM) at the river basin level is a crucial aspect for its success. However, decentralization requires the creation of new institutions on the ground, to stimulate an environment enabling stakeholder participation and integration into the water management decision-making process. In 1991, Mozambique began restructuring its water sector toward operational decentralized water resources management. Within this context of decentralization, new legal and institutional frameworks have been created, e.g., Regional Water Administrations (RWAs) and River Basin Committees. This paper identifies and analyzes the key institutional challenges and opportunities of DWRM implementation in Mozambique. The paper uses a critical social science research methodology for in-depth analysis of the roots of the constraining factors for the implementation of DWRM. The results obtained suggest that RWAs should be designed considering the specific geographic and infrastructural conditions of their jurisdictional areas and that priorities should be selected in their institutional capacity building strategies that match local realities. Furthermore, the results also indicate that RWAs have enjoyed limited support from basin stakeholders, mainly in basins with less hydraulic infrastructure, in securing water availability for their users and minimizing the effect of climate variability.

  18. Generic worklist handler for workflow-enabled products

    NASA Astrophysics Data System (ADS)

    Schmidt, Joachim; Meetz, Kirsten; Wendler, Thomas

    1999-07-01

    Workflow management (WfM) is an emerging field of medical information technology. It appears to be a promising key technology for modeling, optimizing and automating processes, for the sake of improved efficiency, reduced costs and improved patient care. The application of WfM concepts requires the standardization of architectures and interfaces. A component of central interest proposed in this report is a generic work list handler: a standardized interface between a workflow enactment service and an application system. Application systems with embedded work list handlers will be called 'Workflow-Enabled Application Systems'. In this paper we discuss the functional requirements of work list handlers, as well as their integration into workflow architectures and interfaces. To lay the foundation for this specification, basic workflow terminology, the fundamentals of workflow management and - later in the paper - the available standards defined by the Workflow Management Coalition are briefly reviewed.
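
    The minimal operations such a work list handler exposes to an application (list the work items assigned to it, start one, report completion) can be sketched as below. This is a hypothetical Python sketch loosely following Workflow Management Coalition client-application concepts; the class and method names are not the WfMC API.

      # Hypothetical sketch of a generic work list handler interface.
      from dataclasses import dataclass, field


      @dataclass
      class WorkItem:
          item_id: str
          activity: str
          state: str = "offered"


      @dataclass
      class WorklistHandler:
          items: list[WorkItem] = field(default_factory=list)

          def worklist(self) -> list[WorkItem]:
              return [item for item in self.items if item.state == "offered"]

          def start(self, item_id: str) -> None:
              self._find(item_id).state = "running"

          def complete(self, item_id: str) -> None:
              self._find(item_id).state = "completed"

          def _find(self, item_id: str) -> WorkItem:
              return next(item for item in self.items if item.item_id == item_id)


      handler = WorklistHandler([WorkItem("wi-1", "report-findings")])
      print([item.activity for item in handler.worklist()])  # ['report-findings']
      handler.start("wi-1")
      handler.complete("wi-1")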

  19. Decentralized approaches to wastewater treatment and management: applicability in developing countries.

    PubMed

    Massoud, May A; Tarhini, Akram; Nasr, Joumana A

    2009-01-01

    Providing reliable and affordable wastewater treatment in rural areas is a challenge in many parts of the world, particularly in developing countries. The problems and limitations of the centralized approaches for wastewater treatment are progressively surfacing. Centralized wastewater collection and treatment systems are costly to build and operate, especially in areas with low population densities and dispersed households. Developing countries lack both the funding to construct centralized facilities and the technical expertise to manage and operate them. Alternatively, the decentralized approach for wastewater treatment which employs a combination of onsite and/or cluster systems is gaining more attention. Such an approach allows for flexibility in management, and simple as well as complex technologies are available. The decentralized system is not only a long-term solution for small communities but is more reliable and cost effective. This paper presents a review of the various decentralized approaches to wastewater treatment and management. A discussion as to their applicability in developing countries, primarily in rural areas, and challenges faced is emphasized all through the paper. While there are many impediments and challenges towards wastewater management in developing countries, these can be overcome by suitable planning and policy implementation. Understanding the receiving environment is crucial for technology selection and should be accomplished by conducting a comprehensive site evaluation process. Centralized management of the decentralized wastewater treatment systems is essential to ensure they are inspected and maintained regularly. Management strategies should be site specific accounting for social, cultural, environmental and economic conditions in the target area.

  20. An Auto-management Thesis Program WebMIS Based on Workflow

    NASA Astrophysics Data System (ADS)

    Chang, Li; Jie, Shi; Weibo, Zhong

    An auto-management WebMIS based on workflow for a bachelor thesis program is presented in this paper. A module used for workflow dispatching is designed and realized using MySQL and J2EE according to the working principle of a workflow engine. The module can automatically dispatch the workflow according to the system date, login information and the work status of the user. The WebMIS changes management from manual work to computer-based work, which not only standardizes the thesis program but also keeps the data and documents clean and consistent.
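
    The dispatching rule described above (choose the next workflow step from the system date, the logged-in user, and the item's work status) can be sketched as a single function. The roles, states and deadlines below are invented for illustration and are not from the paper's implementation.

      # Hypothetical sketch of date/role/status-driven workflow dispatching.
      from datetime import date


      def dispatch(today: date, role: str, status: str) -> str:
          proposal_deadline = date(2024, 3, 1)   # hypothetical milestones
          defense_deadline = date(2024, 6, 1)

          if status == "draft" and role == "student":
              return "submit proposal" if today <= proposal_deadline else "request extension"
          if status == "submitted" and role == "advisor":
              return "review and grade proposal"
          if status == "approved" and today <= defense_deadline:
              return "schedule defense"
          return "no task pending for this user"


      print(dispatch(date(2024, 2, 10), "student", "draft"))     # submit proposal
      print(dispatch(date(2024, 4, 2), "advisor", "submitted"))  # review and grade proposal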

  1. Evolutionary Concepts for Decentralized Air Traffic Flow Management

    NASA Technical Reports Server (NTRS)

    Adams, Milton; Kolitz, Stephan; Milner, Joseph; Odoni, Amedeo

    1997-01-01

    Alternative concepts for modifying the policies and procedures under which the air traffic flow management system operates are described, and an approach to the evaluation of those concepts is discussed. Here, air traffic flow management includes all activities related to the management of the flow of aircraft and related system resources from 'block to block.' The alternative concepts represent stages in the evolution from the current system, in which air traffic management decision making is largely centralized within the FAA, to a more decentralized approach wherein the airlines and other airspace users collaborate in air traffic management decision making with the FAA. The emphasis in the discussion is on a viable medium-term partially decentralized scenario representing a phase of this evolution that is consistent with the decision-making approaches embodied in proposed Free Flight concepts for air traffic management. System-level metrics for analyzing and evaluating the various alternatives are defined, and a simulation testbed developed to generate values for those metrics is described. The fundamental issue of modeling airline behavior in decentralized environments is also raised, and an example of such a model, which deals with the preservation of flight bank integrity in hub airports, is presented.

  2. Refueling Strategies for a Team of Cooperating AUVs

    DTIC Science & Technology

    2011-01-01

    manager, and thus the constraint a centrally managed underwater network imposes on the mission. Task management utilizing Robust Decentralized Task ...the computational complexity. A bid based approach to task management has also been studied as a possible means of decentralization of group task ...currently performing another task . In [18], ground robots perform distributed task allocation using the ASyMTRy-D algorithm, which is based on CNP

  3. Decentralization of storm runoff via engagement of social and cultural capitals - implications for the management of flood risk at the municipal scale

    EPA Science Inventory

    This research tests a novel method that focuses limited community resources on a decentralized approach to storm water management. A reverse auction was used to relieve legal constraints on management implementation on private land. Residents voluntarily bid on rain gardens and r...

  4. Contractual or Responsive Accountability? Neo-Centralist 'Self-Management' or Systemic Subsidiarity? Tasmanian Parents' and Other Stakeholders' Policy Preferences.

    ERIC Educational Resources Information Center

    Macpherson, R. J. S.

    When state governments in Australia decentralized many administrative responsibilities to schools in the late 1980s and early 1990s, it was assumed that they would develop fresh management, development, and governance capacities. In general, such decentralization attempted to replace bureaucracies with corporate management, limit school evaluation…

  5. Decentralized and Tactical Air Traffic Flow Management

    NASA Technical Reports Server (NTRS)

    Odoni, Amedeo R.; Bertsimas, Dimitris

    1997-01-01

    This project dealt with the following topics: 1. Review and description of the existing air traffic flow management system (ATFM) and identification of aspects with potential for improvement. 2. Identification and review of existing models and simulations dealing with all system segments (enroute, terminal area, ground). 3. Formulation of concepts for overall decentralization of the ATFM system, ranging from moderate decentralization to full decentralization. 4. Specification of the modifications to the ATFM system required to accommodate each of the alternative concepts. 5. Identification of issues that need to be addressed with regard to: determination of the way the ATFM system would be operating; types of flow management strategies that would be used; and estimation of the effectiveness of ATFM with regard to reducing delay and re-routing costs. 6. Concept evaluation through identification of criteria and methodologies for accommodating the interests of stakeholders and of approaches to optimization of operational procedures for all segments of the ATFM system.

  6. The DBCLS BioHackathon: standardization and interoperability for bioinformatics web services and workflows. The DBCLS BioHackathon Consortium*.

    PubMed

    Katayama, Toshiaki; Arakawa, Kazuharu; Nakao, Mitsuteru; Ono, Keiichiro; Aoki-Kinoshita, Kiyoko F; Yamamoto, Yasunori; Yamaguchi, Atsuko; Kawashima, Shuichi; Chun, Hong-Woo; Aerts, Jan; Aranda, Bruno; Barboza, Lord Hendrix; Bonnal, Raoul Jp; Bruskiewich, Richard; Bryne, Jan C; Fernández, José M; Funahashi, Akira; Gordon, Paul Mk; Goto, Naohisa; Groscurth, Andreas; Gutteridge, Alex; Holland, Richard; Kano, Yoshinobu; Kawas, Edward A; Kerhornou, Arnaud; Kibukawa, Eri; Kinjo, Akira R; Kuhn, Michael; Lapp, Hilmar; Lehvaslaiho, Heikki; Nakamura, Hiroyuki; Nakamura, Yasukazu; Nishizawa, Tatsuya; Nobata, Chikashi; Noguchi, Tamotsu; Oinn, Thomas M; Okamoto, Shinobu; Owen, Stuart; Pafilis, Evangelos; Pocock, Matthew; Prins, Pjotr; Ranzinger, René; Reisinger, Florian; Salwinski, Lukasz; Schreiber, Mark; Senger, Martin; Shigemoto, Yasumasa; Standley, Daron M; Sugawara, Hideaki; Tashiro, Toshiyuki; Trelles, Oswaldo; Vos, Rutger A; Wilkinson, Mark D; York, William; Zmasek, Christian M; Asai, Kiyoshi; Takagi, Toshihisa

    2010-08-21

    Web services have become a key technology for bioinformatics, since life science databases are globally decentralized and the exponential increase in the amount of available data demands efficient systems without the need to transfer entire databases for every step of an analysis. However, various incompatibilities among database resources and analysis services make it difficult to connect and integrate these into interoperable workflows. To resolve this situation, we invited domain specialists from web service providers, client software developers, Open Bio* projects, the BioMoby project and researchers of emerging areas where a standard exchange data format is not well established, for an intensive collaboration entitled the BioHackathon 2008. The meeting was hosted by the Database Center for Life Science (DBCLS) and Computational Biology Research Center (CBRC) and was held in Tokyo from February 11th to 15th, 2008. In this report we highlight the work accomplished and the common issues that arose from this event, including the standardization of data exchange formats and services in the emerging fields of glycoinformatics, biological interaction networks, text mining, and phyloinformatics. In addition, common shared object development based on BioSQL, as well as technical challenges in large data management, asynchronous services, and security are discussed. Consequently, we improved the interoperability of web services in several fields; however, further cooperation among major database centers and continued collaborative efforts between service providers and software developers are still necessary for an effective advance in bioinformatics web service technologies.

  7. The DBCLS BioHackathon: standardization and interoperability for bioinformatics web services and workflows. The DBCLS BioHackathon Consortium*

    PubMed Central

    2010-01-01

    Web services have become a key technology for bioinformatics, since life science databases are globally decentralized and the exponential increase in the amount of available data demands efficient systems without the need to transfer entire databases for every step of an analysis. However, various incompatibilities among database resources and analysis services make it difficult to connect and integrate these into interoperable workflows. To resolve this situation, we invited domain specialists from web service providers, client software developers, Open Bio* projects, the BioMoby project and researchers of emerging areas where a standard exchange data format is not well established, for an intensive collaboration entitled the BioHackathon 2008. The meeting was hosted by the Database Center for Life Science (DBCLS) and Computational Biology Research Center (CBRC) and was held in Tokyo from February 11th to 15th, 2008. In this report we highlight the work accomplished and the common issues that arose from this event, including the standardization of data exchange formats and services in the emerging fields of glycoinformatics, biological interaction networks, text mining, and phyloinformatics. In addition, common shared object development based on BioSQL, as well as technical challenges in large data management, asynchronous services, and security are discussed. Consequently, we improved the interoperability of web services in several fields; however, further cooperation among major database centers and continued collaborative efforts between service providers and software developers are still necessary for an effective advance in bioinformatics web service technologies. PMID:20727200

  8. REVERSE AUCTION RESULTS FOR IMPLEMENTATION OF DECENTRALIZED RETROFIT BEST MANAGEMENT PRACTICES IN A SMALL URBAN WATERSHED (CINCINNATI OH)

    EPA Science Inventory

    Although urban stormwater is typically conveyed to centralized infrastructure, there is great potential for reducing stormwater runoff quantity through decentralization. In this case we hypothesize that smaller-scale retrofit best management practices (BMPs) such as rain gardens ...

  9. Decentralization of Education in Indonesia--A Study on Education Development Gaps in the Provincial Areas

    ERIC Educational Resources Information Center

    Winardi

    2017-01-01

    Decentralization is acknowledged as the handover of authority from central government to local government, including giving broader authority to local governments to manage education. This study aims to discover the education development gaps between regions in Indonesia that have resulted from decentralization. This research method uses descriptive…

  10. Reverse auction results for implementation of decentralized retrofit best management practices in a small urban watershed (Cincinnati OH) / Participatory storm water management and sustainability – what are the connections?

    EPA Science Inventory

    Urban stormwater is typically conveyed to centralized infrastructure, and there is great potential for reducing stormwater runoff quantity through decentralization. In this case we hypothesize that smaller-scale retrofit best management practices (BMPs) such as rain gardens and r...

  11. Effectiveness of a decentralized stormwater management program in the reduction of runoff volume

    EPA Science Inventory

    A decentralized, retrofit approach to storm water management was implemented in a small suburban drainage on the basis of a voluntary reverse auction. This effort led to the installation of 83 rain gardens and 176 rain barrels on approximately 20 percent of 350 residential proper...

  12. Human resources for health and decentralization policy in the Brazilian health system

    PubMed Central

    2011-01-01

    Background The Brazilian health reform process, following the establishment of the Unified Health System (SUS), has had a strong emphasis on decentralization, with a special focus on financing, management and inter-managerial agreements. Brazil is a federal country and the Ministry of Health (MoH), through the Secretary of Labour Management and Health Education, is responsible for establishing national policy guidelines for health labour management, and also for implementing strategies for the decentralization of management of labour and education in the federal states. This paper assesses whether the process of decentralizing human resources for health (HRH) management and organization to the level of the state and municipal health departments has involved investments in technical, political and financial resources at the national level. Methods The research methods used comprise a survey of HRH managers of states and major municipalities (including capitals) and focus groups with these HRH managers - all by geographic region. The results were obtained by combining survey and focus group data, and also through triangulation with the results of previous research. Results The results of this evaluation showed the evolution of the policy, previously restricted to the field of 'personnel administration' and now expanded to a conceptual model for health labour management and education, identifying progress, setbacks, critical issues and challenges for the consolidation of the decentralized model for HRH management. The results showed that 76.3% of the health departments have an HRH unit. It was observed that 63.2% have an HRH information system. However, in most health departments, the HRH unit uses only the payroll and administrative records as data sources. Concerning education in health, 67.6% of the HRH managers mentioned existing cooperation with educational and teaching institutions for training and/or specialization of health workers. Among them, specialization courses account for 61.4% and short courses for 56.1%. Conclusions Due to decentralization, the HRH area has been restructured and policies beyond traditional administrative activities have been developed. However, twenty years on from the establishment of the SUS, there remains a low level of institutionalization in the HRH area, despite recent efforts of the MoH. PMID:21586156

  13. Human resources for health and decentralization policy in the Brazilian health system.

    PubMed

    Pierantoni, Celia Regina; Garcia, Ana Claudia P

    2011-05-17

    The Brazilian health reform process, following the establishment of the Unified Health System (SUS), has had a strong emphasis on decentralization, with a special focus on financing, management and inter-managerial agreements. Brazil is a federal country and the Ministry of Health (MoH), through the Secretary of Labour Management and Health Education, is responsible for establishing national policy guidelines for health labour management, and also for implementing strategies for the decentralization of management of labour and education in the federal states. This paper assesses whether the process of decentralizing human resources for health (HRH) management and organization to the level of the state and municipal health departments has involved investments in technical, political and financial resources at the national level. The research methods used comprise a survey of HRH managers of states and major municipalities (including capitals) and focus groups with these HRH managers - all by geographic region. The results were obtained by combining survey and focus group data, and also through triangulation with the results of previous research. The results of this evaluation showed the evolution of the policy, previously restricted to the field of 'personnel administration' and now expanded to a conceptual model for health labour management and education, identifying progress, setbacks, critical issues and challenges for the consolidation of the decentralized model for HRH management. The results showed that 76.3% of the health departments have an HRH unit. It was observed that 63.2% have an HRH information system. However, in most health departments, the HRH unit uses only the payroll and administrative records as data sources. Concerning education in health, 67.6% of the HRH managers mentioned existing cooperation with educational and teaching institutions for training and/or specialization of health workers. Among them, specialization courses account for 61.4% and short courses for 56.1%. Due to decentralization, the HRH area has been restructured and policies beyond traditional administrative activities have been developed. However, twenty years on from the establishment of the SUS, there remains a low level of institutionalization in the HRH area, despite recent efforts of the MoH.

  14. High-volume workflow management in the ITN/FBI system

    NASA Astrophysics Data System (ADS)

    Paulson, Thomas L.

    1997-02-01

    The Identification Tasking and Networking (ITN) Federal Bureau of Investigation system will manage the processing of more than 70,000 submissions per day. The workflow manager controls the routing of each submission through a combination of automated and manual processing steps whose exact sequence is dynamically determined by the results at each step. For most submissions, one or more of the steps involve the visual comparison of fingerprint images. The ITN workflow manager is implemented within a scalable client/server architecture. The paper describes the key aspects of the ITN workflow manager design which allow the high volume of daily processing to be successfully accomplished.

  15. Symmetric Link Key Management for Secure Neighbor Discovery in a Decentralized Wireless Sensor Network

    DTIC Science & Technology

    2017-09-01

    Master's thesis by Kelvin T. Chew, September 2017.

  16. Decentralization and human resource management in the health sector: a case study (1996-1998) from Nampula province, Mozambique.

    PubMed

    Saide, M A; Stewart, D E

    2001-01-01

    Despite political, cultural and geographical diversity, health care reforms implemented in many developing countries share a number of common features regarding management and structural issues. Decentralization of decision-making from the central authority to local and provincial levels is generally regarded in the literature to be an important way of achieving a more equitable distribution of health care and better management practices, aligned with local priorities and needs. However, in the absence of clear guidelines, continuous monitoring and an adequate supply of financial and human resources, decentralization processes are more likely to have a low impact on the process of health care reform and can, to a certain extent, provoke inequalities between regions in the same country. This qualitative study in Nampula province, Mozambique, was conducted to assess the impact of decentralization, through an analysis of the viewpoints of provincial health managers regarding their perceptions of the process, particularly with regard to the management of basic and elementary nurses. Secondary data from Nampula provincial reports and documents from the Mozambican Health Ministry were also reviewed and comparisons made with the experiences of other developing countries.

  17. 32 CFR Appendix B to Part 324 - System of Records Notice

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... organizationally decentralized system, describe each level of organization or element that maintains a portion of... manager should be indicated. For geographically separated or organizationally decentralized activities...

  18. 32 CFR Appendix B to Part 324 - System of Records Notice

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... organizationally decentralized system, describe each level of organization or element that maintains a portion of... manager should be indicated. For geographically separated or organizationally decentralized activities...

  19. 32 CFR Appendix B to Part 324 - System of Records Notice

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... organizationally decentralized system, describe each level of organization or element that maintains a portion of... manager should be indicated. For geographically separated or organizationally decentralized activities...

  20. 32 CFR Appendix B to Part 324 - System of Records Notice

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... organizationally decentralized system, describe each level of organization or element that maintains a portion of... manager should be indicated. For geographically separated or organizationally decentralized activities...

  1. Context-aware workflow management of mobile health applications.

    PubMed

    Salden, Alfons; Poortinga, Remco

    2006-01-01

    We propose a medical application management architecture that allows medical (IT) experts to readily design, develop and deploy context-aware mobile health (m-health) applications or services. In particular, we elaborate on how our application workflow management architecture enables chaining, coordinating, composing, and adapting context-sensitive medical application components such that critical Quality of Service (QoS) and Quality of Context (QoC) requirements typical for m-health applications or services can be met. This functional architectural support requires learning modules for distilling application-critical selection of attention and anticipation models. These models will help medical experts construct and adjust m-health application workflows and workflow strategies on the fly. We illustrate our context-aware workflow management paradigm for an m-health data delivery problem, in which optimal communication network configurations have to be determined.
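
    One way to picture the QoS-driven adaptation this architecture targets: chain the application components and pick a communication configuration whose expected latency and reliability satisfy the requirements of the delivery task, falling back to workflow adaptation when none does. The configuration names, QoS figures and thresholds below are invented for illustration and are not part of the authors' architecture.

      from dataclasses import dataclass
      from typing import List, Optional

      @dataclass
      class NetworkConfig:
          name: str
          latency_ms: float        # expected end-to-end latency
          reliability: float       # expected delivery probability

      def select_config(configs: List[NetworkConfig],
                        max_latency_ms: float,
                        min_reliability: float) -> Optional[NetworkConfig]:
          """Pick the first configuration that satisfies the QoS requirements."""
          for cfg in configs:
              if cfg.latency_ms <= max_latency_ms and cfg.reliability >= min_reliability:
                  return cfg
          return None   # no configuration fits: trigger workflow adaptation instead

      candidates = [
          NetworkConfig("wlan_direct", latency_ms=40, reliability=0.90),
          NetworkConfig("cellular", latency_ms=80, reliability=0.99),
      ]
      # A vital-signs alarm might require low latency and high reliability.
      print(select_config(candidates, max_latency_ms=100, min_reliability=0.95))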

  2. Concept of an innovative water management system with decentralized water reclamation and cascading material-cycle for agricultural areas.

    PubMed

    Fujiwara, T

    2012-01-01

    Unlike in urban areas where intensive water reclamation systems are available, development of decentralized technologies and systems is required for water use to be sustainable in agricultural areas. To overcome various water quality issues in those areas, a research project entitled 'Development of an innovative water management system with decentralized water reclamation and cascading material-cycle for agricultural areas under the consideration of climate change' was launched in 2009. This paper introduces the concept of this research and provides detailed information on each of its research areas: (1) development of a diffuse agricultural pollution control technology using catch crops; (2) development of a decentralized differentiable treatment system for livestock and human excreta; and (3) development of a cascading material-cycle system for water pollution control and value-added production. The author also emphasizes that the innovative water management system for agricultural areas should incorporate a strategy for the voluntary collection of bio-resources.

  3. Medication Management: The Macrocognitive Workflow of Older Adults With Heart Failure.

    PubMed

    Mickelson, Robin S; Unertl, Kim M; Holden, Richard J

    2016-10-12

    Older adults with chronic disease struggle to manage complex medication regimens. Health information technology has the potential to improve medication management, but only if it is based on a thorough understanding of the complexity of medication management workflow as it occurs in natural settings. Prior research reveals that patient work related to medication management is complex, cognitive, and collaborative. Macrocognitive processes are theorized as how people individually and collaboratively think in complex, adaptive, and messy nonlaboratory settings supported by artifacts. The objective of this research was to describe and analyze the work of medication management by older adults with heart failure, using a macrocognitive workflow framework. We interviewed and observed 61 older patients along with 30 informal caregivers about self-care practices including medication management. Descriptive qualitative content analysis methods were used to develop categories, subcategories, and themes about macrocognitive processes used in medication management workflow. We identified 5 high-level macrocognitive processes affecting medication management-sensemaking, planning, coordination, monitoring, and decision making-and 15 subprocesses. Data revealed workflow as occurring in a highly collaborative, fragile system of interacting people, artifacts, time, and space. Process breakdowns were common and patients had little support for macrocognitive workflow from current tools. Macrocognitive processes affected medication management performance. Describing and analyzing this performance produced recommendations for technology supporting collaboration and sensemaking, decision making and problem detection, and planning and implementation.

  4. Collective and decentralized management model in public hospitals: perspective of the nursing team.

    PubMed

    Bernardes, Andrea; Cecilio, Luiz Carlos de Oliveira; Evora, Yolanda Dora Martinez; Gabriel, Carmen Silvia; Carvalho, Mariana Bernardes de

    2011-01-01

    This research aims to present the implementation of the collective and decentralized management model in functional units of a public hospital in the city of Ribeirão Preto, state of São Paulo, according to the view of the nursing staff and the health technical assistant. This historical and organizational case study used qualitative thematic content analysis proposed by Bardin for data analysis. The institution started the decentralization of its administrative structure in 1999, through collective management, which permitted several internal improvements, with positive repercussions for the care delivered to users. The top-down implementation of the process seems to have jeopardized workers' adherence, although collective management has intensified communication and the sharing of power and decision-making. The study shows that there is still much work to be done to concretize this innovative management proposal, despite the advances regarding the quality of care.

  5. What supervisors want to know about decentralization.

    PubMed

    Boissoneau, R; Belton, P

    1991-06-01

    Many organizations in various industries have tended to move away from strict centralization, yet some centralization is still vital to top management. With 19 of the 22 executives interviewed favoring or implementing some form of decentralization, it is probable that traditionally centralized organizations will follow the trend and begin to decentralize their organizational structures. The incentives and advantages of decentralization are too attractive to ignore. Decentralization provides responsibility, clear objectives, accountability for results, and more efficient and effective decision making. However, one must remember that decentralization can be overextended and that centralization is still viable in certain functions. Finding the correct balance between control and autonomy is a key to decentralization. Too much control and too much autonomy are the primary reasons for decentralization failures. In today's changing, competitive environment, structures must be continuously redefined, with the goal of finding an optimal balance between centralization and decentralization. Organizations are cautioned not to seek out and install a single philosopher-king to impose unified direction, but to unify leadership goals, participation, style, and control to develop improved methods of making all responsible leaders of one mind about the organization's needs and goals.

  6. Inferring Clinical Workflow Efficiency via Electronic Medical Record Utilization

    PubMed Central

    Chen, You; Xie, Wei; Gunter, Carl A; Liebovitz, David; Mehrotra, Sanjay; Zhang, He; Malin, Bradley

    2015-01-01

    Complexity in clinical workflows can lead to inefficiency in making diagnoses, ineffectiveness of treatment plans and uninformed management of healthcare organizations (HCOs). Traditional strategies to manage workflow complexity are based on measuring the gaps between workflows defined by HCO administrators and the actual processes followed by staff in the clinic. However, existing methods tend to neglect the influences of EMR systems on the utilization of workflows, which could be leveraged to optimize workflows facilitated through the EMR. In this paper, we introduce a framework to infer clinical workflows through the utilization of an EMR and show how such workflows roughly partition into four types according to their efficiency. Our framework infers workflows at several levels of granularity through data mining technologies. We study four months of EMR event logs from a large medical center, including 16,569 inpatient stays, and illustrate that approximately 95% of workflows are efficient and that 80% of patients are on such workflows. At the same time, we show that the remaining 5% of workflows may be inefficient due to a variety of factors, such as complex patients. PMID:26958173
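
    A rough impression of how workflows might be inferred from EMR utilization data: group logged events by inpatient stay, order them by time, and treat the resulting event sequence together with its duration as the observed workflow for that stay, which can then be binned or clustered by efficiency. This is a simplified, hypothetical sketch, not the data-mining framework used in the paper; the event types and the 72-hour threshold are invented.

      from collections import defaultdict
      from datetime import datetime

      # Hypothetical EMR event log: (stay_id, timestamp, event type)
      log = [
          ("stay-1", datetime(2015, 1, 3, 8, 0),  "admission"),
          ("stay-1", datetime(2015, 1, 3, 9, 30), "lab_order"),
          ("stay-1", datetime(2015, 1, 4, 10, 0), "discharge"),
          ("stay-2", datetime(2015, 1, 3, 8, 0),  "admission"),
          ("stay-2", datetime(2015, 1, 7, 16, 0), "discharge"),
      ]

      def infer_workflows(events):
          """Return, per stay, the ordered event sequence and its duration in hours."""
          by_stay = defaultdict(list)
          for stay, ts, ev in events:
              by_stay[stay].append((ts, ev))
          workflows = {}
          for stay, items in by_stay.items():
              items.sort()
              duration_h = (items[-1][0] - items[0][0]).total_seconds() / 3600
              workflows[stay] = ([ev for _, ev in items], duration_h)
          return workflows

      for stay, (sequence, hours) in infer_workflows(log).items():
          # A crude efficiency flag: stays far above a duration threshold need review.
          flag = "inefficient?" if hours > 72 else "efficient"
          print(stay, sequence, f"{hours:.1f}h", flag)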

  7. Seasonal and situational impacts on the effectiveness of a decentralized stormwater management program in the reduction of runoff volume (Cincinnati OH; USA)

    EPA Science Inventory

    A decentralized, retrofit approach to storm water management was implemented in a small suburban drainage on the basis of a voluntary reverse auction. This campaign led to the installation of 83 rain gardens and 176 rain barrels on approximately 20 percent of 350 residential prop...

  8. DServO: A Peer-to-Peer-based Approach to Biomedical Ontology Repositories.

    PubMed

    Mambone, Zakaria; Savadogo, Mahamadi; Some, Borlli Michel Jonas; Diallo, Gayo

    2015-01-01

    We present in this poster an extension of the ServO ontology server system, which adopts a decentralized Peer-To-Peer approach for managing multiple heterogeneous knowledge organization systems. It relies on the use of the JXTA protocol coupled with information retrieval techniques to provide a decentralized infrastructure for managing multiple instances of ontology repositories.

  9. Medication Management: The Macrocognitive Workflow of Older Adults With Heart Failure

    PubMed Central

    2016-01-01

    Background Older adults with chronic disease struggle to manage complex medication regimens. Health information technology has the potential to improve medication management, but only if it is based on a thorough understanding of the complexity of medication management workflow as it occurs in natural settings. Prior research reveals that patient work related to medication management is complex, cognitive, and collaborative. Macrocognitive processes are theorized as how people individually and collaboratively think in complex, adaptive, and messy nonlaboratory settings supported by artifacts. Objective The objective of this research was to describe and analyze the work of medication management by older adults with heart failure, using a macrocognitive workflow framework. Methods We interviewed and observed 61 older patients along with 30 informal caregivers about self-care practices including medication management. Descriptive qualitative content analysis methods were used to develop categories, subcategories, and themes about macrocognitive processes used in medication management workflow. Results We identified 5 high-level macrocognitive processes affecting medication management—sensemaking, planning, coordination, monitoring, and decision making—and 15 subprocesses. Data revealed workflow as occurring in a highly collaborative, fragile system of interacting people, artifacts, time, and space. Process breakdowns were common and patients had little support for macrocognitive workflow from current tools. Conclusions Macrocognitive processes affected medication management performance. Describing and analyzing this performance produced recommendations for technology supporting collaboration and sensemaking, decision making and problem detection, and planning and implementation. PMID:27733331

  10. A Model of Workflow Composition for Emergency Management

    NASA Astrophysics Data System (ADS)

    Xin, Chen; Bin-ge, Cui; Feng, Zhang; Xue-hui, Xu; Shan-shan, Fu

    Commonly used workflow technology is not flexible enough to deal with concurrent emergency situations. The paper proposes a novel model for defining emergency plans, in which workflow segments appear as a constituent part. A formal abstraction containing four operations is defined to compose workflow segments under constraint rules. A software system for constructing and composing business process resources is implemented and integrated into the Emergency Plan Management Application System.
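
    The abstract does not spell out its four composition operations, so the sketch below only illustrates the general idea of composing workflow segments under a constraint rule, using two made-up operations (sequential and parallel composition) over segments represented as activity lists; it is not the paper's formal abstraction.

      # A workflow segment is modelled here simply as an ordered list of activities.
      def seq(a, b):
          """Sequential composition: b starts after a finishes."""
          return a + b

      def par(a, b):
          """Parallel composition: interleave branches behind a fork/join pair."""
          return ["fork"] + a + b + ["join"]

      def compose(segments, max_len=10):
          """Compose segments under a simple constraint rule limiting plan size."""
          plan = []
          for segment in segments:
              candidate = seq(plan, segment)
              if len(candidate) > max_len:
                  raise ValueError("constraint violated: emergency plan too large")
              plan = candidate
          return plan

      evacuate = ["alert_public", "open_shelters"]
      rescue = ["dispatch_teams", "triage"]
      print(compose([evacuate, par(rescue, ["set_up_medical_post"])]))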

  11. Worklist handling in workflow-enabled radiological application systems

    NASA Astrophysics Data System (ADS)

    Wendler, Thomas; Meetz, Kirsten; Schmidt, Joachim; von Berg, Jens

    2000-05-01

    For the next generation of integrated information systems for health care applications, more emphasis has to be put on systems which, by design, support the reduction of cost, the increase in efficiency and the improvement of the quality of services. A substantial contribution to this will be the modeling, optimization, automation and enactment of processes in health care institutions. One of the perceived key success factors for the system integration of processes will be the application of workflow management, with workflow management systems as key technology components. In this paper we address workflow management in radiology. We focus on an important aspect of workflow management, the generation and handling of worklists, which automatically provide workflow participants with work items that reflect tasks to be performed. The display of worklists and the functions associated with work items are the visible part for the end-users of an information system using a workflow management approach. Appropriate worklist design and implementation will influence the user friendliness of a system and will largely influence work efficiency. Technically, in current imaging department information system environments (modality-PACS-RIS installations), a data-driven approach has been taken: worklists, if present at all, are generated from filtered views on application databases. In a future workflow-based approach, worklists will be generated by autonomous workflow services based on explicit process models and organizational models. This process-oriented approach will provide us with an integral view of entire health care processes or sub-processes. The paper describes the basic mechanisms of this approach and summarizes its benefits.

  12. Papers by the Decentralized Wastewater Management MOU Partnership

    EPA Pesticide Factsheets

    Four position papers for state, local, and tribal government officials and interested stakeholders. These papers include information on the uses and benefits of decentralized wastewater treatment and examples of its effective use.

  13. 32 CFR Appendix F to Part 505 - Example of a System of Records Notice

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ...) System Location: Specify the address of the primary system and any decentralized elements, including... title and duty address of the system manager. For decentralized systems, show the locations, the...

  14. 32 CFR Appendix F to Part 505 - Example of a System of Records Notice

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ...) System Location: Specify the address of the primary system and any decentralized elements, including... title and duty address of the system manager. For decentralized systems, show the locations, the...

  15. Electronic data capture and DICOM data management in multi-center clinical trials

    NASA Astrophysics Data System (ADS)

    Haak, Daniel; Page, Charles-E.; Deserno, Thomas M.

    2016-03-01

    Providing eligibility, efficacy and security evaluation by quantitative and qualitative disease findings, medical imaging has become increasingly important in clinical trials. Here, subjects' data is today captured in electronic case report forms (eCRFs), which are offered by electronic data capture (EDC) systems. However, integration of subjects' medical image data into eCRFs is insufficiently supported. Neither integration of subjects' digital imaging and communications in medicine (DICOM) data, nor communication with picture archiving and communication systems (PACS), is possible. This complicates the workflow of the study personnel, especially regarding studies with distributed data capture at multiple sites. Hence, in this work, a system architecture is presented which connects an EDC system, a PACS and a DICOM viewer via the web access to DICOM objects (WADO) protocol. The architecture is implemented using the open source tools OpenClinica, DCM4CHEE and Weasis. The eCRF forms the primary endpoint for the study personnel, where subjects' image data is stored and retrieved. Background communication with the PACS is completely hidden from the users. Data privacy and consistency are ensured by automatic de-identification and re-labelling of DICOM data with context information (e.g. study and subject identifiers), respectively. The system is exemplarily demonstrated in a clinical trial, where computed tomography (CT) data is captured de-centrally from the subjects and centrally read by a chief radiologist to decide on inclusion of the subjects in the trial. Errors, latency and costs in the EDC workflow are reduced, while a research database is implicitly built up in the background.
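
    The WADO protocol mentioned above retrieves individual DICOM objects over HTTP by passing the study, series and object unique identifiers as query parameters. The fragment below sketches how such a request URL can be assembled; the server address and UIDs are placeholders, and in the described setup they would be the de-identified, re-labelled identifiers stored with the eCRF.

      from urllib.parse import urlencode

      def wado_url(base, study_uid, series_uid, object_uid):
          """Build a WADO-URI request for a single DICOM object."""
          params = {
              "requestType": "WADO",
              "studyUID": study_uid,
              "seriesUID": series_uid,
              "objectUID": object_uid,
              "contentType": "application/dicom",   # ask for the raw DICOM object
          }
          return f"{base}?{urlencode(params)}"

      # Placeholder endpoint and UIDs; real values come from the eCRF context data.
      print(wado_url("https://pacs.example.org/wado",
                     "1.2.840.99.1", "1.2.840.99.1.1", "1.2.840.99.1.1.1"))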

  16. Formula Funding and Decentralized Management of Schools--Has It Improved Resource Allocation in Schools in Sri Lanka?

    ERIC Educational Resources Information Center

    Arunatilake, Nisha; Jayawardena, Priyanka

    2010-01-01

    Using the experience of the Educational Quality Inputs (EQI) Scheme in Sri Lanka the paper examines the distributional aspects of formula-based funding and efficiency of decentralized management of education funds in a developing country setting. The study finds that the EQI fund distribution is largely pro-poor. However, results show that to…

  17. DAISY-DAMP: A distributed AI system for the dynamic allocation and management of power

    NASA Technical Reports Server (NTRS)

    Hall, Steven B.; Ohler, Peter C.

    1988-01-01

    One of the critical parameters that must be addressed when designing a loosely coupled Distributed AI SYstem (DAISY) has to do with the degree to which authority is centralized or decentralized. The decision to implement the Dynamic Allocation and Management of Power (DAMP) system as a network of cooperating agents mandated this study. The DAISY-DAMP problem is described; the component agents of the system are characterized; and the communication protocols of the system are elucidated. The motivations and advantages of designing the system with decentralized authority are discussed. Progress in the area of Speech Act theory is proposed as playing a role in constructing decentralized systems.

  18. Design and implementation of workflow engine for service-oriented architecture

    NASA Astrophysics Data System (ADS)

    Peng, Shuqing; Duan, Huining; Chen, Deyun

    2009-04-01

    As computer networks develop rapidly and enterprise applications become increasingly distributed, traditional workflow engines have some deficiencies, such as complex structure, poor stability, poor portability, little reusability and difficult maintenance. In this paper, in order to improve the stability, scalability and flexibility of workflow management systems, a four-layer workflow engine architecture based on SOA is put forward according to the XPDL standard of the Workflow Management Coalition; the route control mechanism in the control model is realized and scheduling strategies for cyclic and acyclic routing are designed; and the workflow engine, which adopts technologies such as XML, JSP and EJB, is implemented.

  19. Scientific Data Management (SDM) Center for Enabling Technologies. Final Report, 2007-2012

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ludascher, Bertram; Altintas, Ilkay

    Our contributions to advancing the State of the Art in scientific workflows have focused on the following areas: Workflow development; Generic workflow components and templates; Provenance collection and analysis; and, Workflow reliability and fault tolerance.

  20. Workflow technology: the new frontier. How to overcome the barriers and join the future.

    PubMed

    Shefter, Susan M

    2006-01-01

    Hospitals are catching up to the business world in the introduction of technology systems that support professional practice and workflow. The field of case management is highly complex and interrelates with diverse groups in diverse locations. The last few years have seen the introduction of Workflow Technology Tools, which can improve the quality and efficiency of discharge planning by the case manager. Despite the availability of these wonderful new programs, many case managers are hesitant to adopt the new technology and workflow. For a myriad of reasons, a computer-based workflow system can seem like a brick wall. This article discusses, from a practitioner's point of view, how professionals can gain confidence and skill to get around the brick wall and join the future.

  1. Decentralizing conservation and diversifying livelihoods within Kanchenjunga Conservation Area, Nepal.

    PubMed

    Parker, Pete; Thapa, Brijesh; Jacob, Aerin

    2015-12-01

    To alleviate poverty and enhance conservation in resource dependent communities, managers must identify existing livelihood strategies and the associated factors that impede household access to livelihood assets. Researchers increasingly advocate reallocating management power from exclusionary central institutions to a decentralized system of management based on local and inclusive participation. However, it is yet to be shown if decentralizing conservation leads to diversified livelihoods within a protected area. The purpose of this study was to identify and assess factors affecting household livelihood diversification within Nepal's Kanchenjunga Conservation Area Project, the first protected area in Asia to decentralize conservation. We randomly surveyed 25% of Kanchenjunga households to assess household socioeconomic and demographic characteristics and access to livelihood assets. We used a cluster analysis with the ten most common income generating activities (both on- and off-farm) to group the strategies households use to diversify livelihoods, and a multinomial logistic regression to identify predictors of livelihood diversification. We found four distinct groups of household livelihood strategies with a range of diversification that directly corresponded to household income. The predictors of livelihood diversification were more related to pre-existing socioeconomic and demographic factors (e.g., more landholdings and livestock, fewer dependents, receiving remittances) than activities sponsored by decentralizing conservation (e.g., microcredit, training, education, interaction with project staff). Taken together, our findings indicate that without direct policies to target marginalized groups, decentralized conservation in Kanchenjunga will continue to exclude marginalized groups, limiting a household's ability to diversify their livelihood and perpetuating their dependence on natural resources. Copyright © 2015 Elsevier Ltd. All rights reserved.

  2. Common Workflow Service: Standards Based Solution for Managing Operational Processes

    NASA Astrophysics Data System (ADS)

    Tinio, A. W.; Hollins, G. A.

    2017-06-01

    The Common Workflow Service is a collaborative and standards-based solution for managing mission operations processes using techniques from the Business Process Management (BPM) discipline. This presentation describes the CWS and its benefits.

  3. Decentralized Real-Time Scheduling

    DTIC Science & Technology

    1990-08-01

    must provide several alternative resource management policies, including FIFO and deadline queueing for shared resources that are not available. 5...When demand exceeds the supply of shared resources (even within a single switch), some calls cannot be completed. In that case, a call’s priority...associated chiefly with the need to manage resources in a timely and decentralized fashion. The Alpha programming model permits the convenient expression of

  4. Improving investigational drug service operations through development of an innovative computer system.

    PubMed

    Sweet, Burgunda V; Tamer, Helen R; Siden, Rivka; McCreadie, Scott R; McGregory, Michael E; Benner, Todd; Tankanow, Roberta M

    2008-05-15

    The development of a computerized system for protocol management, dispensing, inventory accountability, and billing by the investigational drug service (IDS) of a university health system is described. After an unsuccessful search for a commercial system that would accommodate the variation among investigational protocols and meet regulatory requirements, the IDS worked with the health-system pharmacy's information technology staff and informatics pharmacists to develop its own system. The informatics pharmacists observed work-flow and information capture in the IDS and identified opportunities for improved efficiency with an automated system. An iterative build-test-design process was used to provide the flexibility needed for individual protocols. The intent was to design a system that would support most IDS processes, using components that would allow automated backup and redundancies. A browser-based system was chosen to allow remote access. Servers, bar-code scanners, and printers were integrated into the final system design. Initial implementation involved 10 investigational protocols chosen on the basis of dispensing volume and complexity of study design. Other protocols were added over a two-year period; all studies whose drugs were dispensed from the IDS were added, followed by those for which the drugs were dispensed from decentralized pharmacy areas. The IDS briefly used temporary staff to free pharmacist and technician time for system implementation. Decentralized pharmacy areas that rarely dispense investigational drugs continue to use manual processes, with subsequent data transcription into the system. Through the university's technology transfer division, the system was licensed by an external company for sale to other IDSs. The WebIDS system has improved daily operations, enhanced safety and efficiency, and helped meet regulatory requirements for investigational drugs.

  5. An architecture model for multiple disease management information systems.

    PubMed

    Chen, Lichin; Yu, Hui-Chu; Li, Hao-Chun; Wang, Yi-Van; Chen, Huang-Jen; Wang, I-Ching; Wang, Chiou-Shiang; Peng, Hui-Yu; Hsu, Yu-Ling; Chen, Chi-Huang; Chuang, Lee-Ming; Lee, Hung-Chang; Chung, Yufang; Lai, Feipei

    2013-04-01

    Disease management is a program which attempts to overcome the fragmentation of the healthcare system and improve the quality of care. Many studies have proven the effectiveness of disease management. However, case managers spend the majority of their time on documentation and coordinating the members of the care team. They need a tool to support their daily practice and optimize the inefficient workflow. Several discussions have indicated that information technology plays an important role in the era of disease management. While applications have been developed, it is inefficient to develop an information system for each disease management program individually. The aim of this research is to support the work of disease management, reform the inefficient workflow, and propose an architecture model that enhances the reusability and time savings of information system development. The proposed architecture model has been successfully implemented in two disease management information systems, and the result was evaluated through reusability analysis, time-consumed analysis, pre- and post-implementation workflow analysis, and a user questionnaire survey. The reusability of the proposed model was high, less than half of the time was consumed, and the workflow had been improved. The overall user feedback is positive. The supportiveness during daily workflow is high. The system empowers the case managers with better information and leads to better decision making.

  6. Design and implementation of a secure workflow system based on PKI/PMI

    NASA Astrophysics Data System (ADS)

    Yan, Kai; Jiang, Chao-hui

    2013-03-01

    Traditional workflow systems have the following weaknesses in privilege management: low privilege-management efficiency, an overburdened administrator, lack of a trusted authority, etc. A secure workflow model based on PKI/PMI is therefore proposed after studying the security requirements of workflow systems in depth. This model can achieve static and dynamic authorization by verifying a user's identity through a public key certificate (PKC) and validating the user's privilege information using an attribute certificate (AC) in the workflow system. Practice shows that this system can meet the security requirements of a WfMS. Moreover, it not only improves system security but also ensures integrity, confidentiality, availability and non-repudiation of the data in the system.
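
    The two-step authorization the model describes, identity first and privileges second, can be pictured with the toy check below. A real deployment would verify certificate signatures, chains and revocation with a proper PKI library; here the 'certificates' are plain records, and the role requirement is an invented example.

      from dataclasses import dataclass
      from datetime import date

      @dataclass
      class PublicKeyCert:          # stand-in for an X.509 public key certificate
          subject: str
          valid_until: date

      @dataclass
      class AttributeCert:          # stand-in for a PMI attribute certificate
          holder: str
          roles: tuple
          valid_until: date

      def authorize(task_role, pkc, ac, today=None):
          """Allow a workflow task only if identity and privilege checks both pass."""
          today = today or date.today()
          if pkc.valid_until < today:                      # step 1: authentication (PKC)
              return False, "identity certificate expired"
          if ac.holder != pkc.subject or ac.valid_until < today:
              return False, "attribute certificate invalid"
          if task_role not in ac.roles:                    # step 2: authorization (AC)
              return False, "role not granted for this task"
          return True, "authorized"

      pkc = PublicKeyCert("alice", date(2026, 1, 1))
      ac = AttributeCert("alice", ("approver",), date(2026, 1, 1))
      print(authorize("approver", pkc, ac, today=date(2025, 6, 1)))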

  7. Innovations in Medication Preparation Safety and Wastage Reduction: Use of a Workflow Management System in a Pediatric Hospital.

    PubMed

    Davis, Stephen Jerome; Hurtado, Josephine; Nguyen, Rosemary; Huynh, Tran; Lindon, Ivan; Hudnall, Cedric; Bork, Sara

    2017-01-01

    Background: USP <797> regulatory requirements have mandated that pharmacies improve aseptic techniques and cleanliness of the medication preparation areas. In addition, the Institute for Safe Medication Practices (ISMP) recommends that technology and automation be used as much as possible for preparing and verifying compounded sterile products. Objective: To determine the benefits associated with the implementation of the workflow management system, such as reducing medication preparation and delivery errors, reducing quantity and frequency of medication errors, avoiding costs, and enhancing the organization's decision to move toward positive patient identification (PPID). Methods: At Texas Children's Hospital, data were collected and analyzed from January 2014 through August 2014 in the pharmacy areas in which the workflow management system would be implemented. Data were excluded for September 2014 during the workflow management system oral liquid implementation phase. Data were collected and analyzed from October 2014 through June 2015 to determine whether the implementation of the workflow management system reduced the quantity and frequency of reported medication errors. Data collected and analyzed during the study period included the quantity of doses prepared, number of incorrect medication scans, number of doses discontinued from the workflow management system queue, and the number of doses rejected. Data were collected and analyzed to identify patterns of incorrect medication scans, to determine reasons for rejected medication doses, and to determine the reduction in wasted medications. Results: During the 17-month study period, the pharmacy department dispensed 1,506,220 oral liquid and injectable medication doses. From October 2014 through June 2015, the pharmacy department dispensed 826,220 medication doses that were prepared and checked via the workflow management system. Of those 826,220 medication doses, there were 16 reported incorrect volume errors. The error rate after the implementation of the workflow management system averaged 8.4%, which was a 1.6% reduction. After the implementation of the workflow management system, the average number of reported oral liquid medication and injectable medication errors decreased to 0.4 and 0.2 times per week, respectively. Conclusion: The organization was able to achieve its purpose and goal of improving the provision of quality pharmacy care through optimal medication use and safety by reducing medication preparation errors. Error rates decreased and the workflow processes were streamlined, which has led to seamless operations within the pharmacy department. There has been significant cost avoidance and waste reduction and enhanced interdepartmental satisfaction due to the reduction of reported medication errors.

  8. PANORAMA: An approach to performance modeling and diagnosis of extreme-scale workflows

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Deelman, Ewa; Carothers, Christopher; Mandal, Anirban

    Here we report that computational science is well established as the third pillar of scientific discovery and is on par with experimentation and theory. However, as we move closer toward the ability to execute exascale calculations and process the ensuing extreme-scale amounts of data produced by both experiments and computations alike, the complexity of managing the compute and data analysis tasks has grown beyond the capabilities of domain scientists. Therefore, workflow management systems are absolutely necessary to ensure current and future scientific discoveries. A key research question for these workflow management systems concerns the performance optimization of complex calculation and data analysis tasks. The central contribution of this article is a description of the PANORAMA approach for modeling and diagnosing the run-time performance of complex scientific workflows. This approach integrates extreme-scale systems testbed experimentation, structured analytical modeling, and parallel systems simulation into a comprehensive workflow framework called Pegasus for understanding and improving the overall performance of complex scientific workflows.

  9. PANORAMA: An approach to performance modeling and diagnosis of extreme-scale workflows

    DOE PAGES

    Deelman, Ewa; Carothers, Christopher; Mandal, Anirban; ...

    2015-07-14

    Here we report that computational science is well established as the third pillar of scientific discovery and is on par with experimentation and theory. However, as we move closer toward the ability to execute exascale calculations and process the ensuing extreme-scale amounts of data produced by both experiments and computations alike, the complexity of managing the compute and data analysis tasks has grown beyond the capabilities of domain scientists. Therefore, workflow management systems are absolutely necessary to ensure current and future scientific discoveries. A key research question for these workflow management systems concerns the performance optimization of complex calculation and data analysis tasks. The central contribution of this article is a description of the PANORAMA approach for modeling and diagnosing the run-time performance of complex scientific workflows. This approach integrates extreme-scale systems testbed experimentation, structured analytical modeling, and parallel systems simulation into a comprehensive workflow framework called Pegasus for understanding and improving the overall performance of complex scientific workflows.

  10. Ecology Based Decentralized Agent Management System

    NASA Technical Reports Server (NTRS)

    Peysakhov, Maxim D.; Cicirello, Vincent A.; Regli, William C.

    2004-01-01

    The problem of maintaining a desired number of mobile agents on a network is not trivial, especially if we want a completely decentralized solution. Decentralized control makes a system more robust and less susceptible to partial failures. The problem is exacerbated on wireless ad hoc networks where host mobility can result in significant changes in the network size and topology. In this paper we propose an ecology-inspired approach to the management of the number of agents. The approach associates agents with living organisms and tasks with food. Agents procreate or die based on the abundance of uncompleted tasks (food). We performed a series of experiments investigating properties of such systems and analyzed their stability under various conditions. We concluded that the ecology-based metaphor can be successfully applied to the management of agent populations on wireless ad hoc networks.
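
    The procreate-or-die rule can be captured in a few lines: in each control cycle every agent tries to consume one uncompleted task, well-fed agents sometimes spawn a new agent, and starving agents die, so the population tracks the task load. The rates and thresholds below are invented for illustration, not values studied in the paper.

      import random

      def step(agents, tasks, spawn_prob=0.5, starve_limit=2):
          """One control cycle: agents eat tasks, reproduce when fed, die when starved."""
          survivors = []
          for hunger in agents:                 # each agent tracked by cycles since last meal
              if tasks > 0:
                  tasks -= 1                    # task (food) consumed and completed
                  survivors.append(0)
                  if random.random() < spawn_prob:
                      survivors.append(0)       # abundance of food -> procreate
              elif hunger + 1 < starve_limit:
                  survivors.append(hunger + 1)  # no food, getting hungrier
              # else: the agent starves and is removed from the population
          return survivors, tasks

      random.seed(1)
      agents = [0, 0, 0]                        # three agents, none hungry
      for load in [10, 6, 0, 0, 0]:             # incoming tasks per cycle
          agents, _ = step(agents, load)
          print(len(agents), "agents after a cycle with", load, "tasks")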

  11. Modelling and analysis of workflow for lean supply chains

    NASA Astrophysics Data System (ADS)

    Ma, Jinping; Wang, Kanliang; Xu, Lida

    2011-11-01

    Cross-organisational workflow systems are a component of enterprise information systems which support collaborative business processes among organisations in a supply chain. Currently, the majority of workflow systems are developed from the perspective of information modelling without considering the actual requirements of supply chain management. In this article, we focus on the modelling and analysis of cross-organisational workflow systems in the context of the lean supply chain (LSC) using Petri nets. First, the article describes the assumed conditions of cross-organisational workflow nets according to the idea of the LSC and then discusses the standardisation of collaborative business processes between organisations in the context of the LSC. Second, the concept of labelled time Petri nets (LTPNs) is defined through combining labelled Petri nets with time Petri nets, and the concept of labelled time workflow nets (LTWNs) is also defined based on LTPNs. Cross-organisational labelled time workflow nets (CLTWNs) are then defined based on LTWNs. Third, the article proposes the notion of OR-silent CLTWNs and an approach to verifying the soundness of LTWNs and CLTWNs. Finally, this article illustrates how to use the proposed method by a simple example. The purpose of this research is to establish a formal method for the modelling and analysis of workflow systems for the LSC. This study initiates a new perspective of research on cross-organisational workflow management and promotes the operations management of LSCs in real-world settings.
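
    For orientation, one plausible way to write down a labelled time Petri net of the kind the article combines (a transition labelling on top of static firing intervals) is the tuple below. This is a generic, textbook-style formulation offered as a reading aid, not necessarily the authors' exact definition.

      $N = (P, T, F, \Sigma, \ell, I)$, where $P$ and $T$ are finite, disjoint sets of places and transitions, $F \subseteq (P \times T) \cup (T \times P)$ is the flow relation, $\Sigma$ is a finite label alphabet, $\ell : T \to \Sigma$ assigns a label to each transition, and $I : T \to \mathbb{Q}_{\ge 0} \times (\mathbb{Q}_{\ge 0} \cup \{\infty\})$ assigns to each transition a static earliest/latest firing interval.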

  12. Development of a user customizable imaging informatics-based intelligent workflow engine system to enhance rehabilitation clinical trials

    NASA Astrophysics Data System (ADS)

    Wang, Ximing; Martinez, Clarisa; Wang, Jing; Liu, Ye; Liu, Brent

    2014-03-01

    Clinical trials typically need to collect, track and analyze multimedia data according to a defined workflow. Currently, clinical trial data management requirements are usually addressed with custom-built systems, and workflow design varies from trial to trial. A traditional pre-defined custom-built system is usually limited to a specific clinical trial and normally requires time-consuming and resource-intensive software development. To provide a solution, we present a user-customizable, imaging informatics-based intelligent workflow engine system for managing stroke rehabilitation clinical trials. The intelligent workflow engine provides flexibility in building and tailoring the workflow at various stages of a clinical trial. By providing a way to tailor and automate the workflow, the system saves time and reduces errors in clinical trials. Although our system is designed for rehabilitation clinical trials, it may be extended to other imaging-based clinical trials as well.

  13. NeuroManager: a workflow analysis based simulation management engine for computational neuroscience

    PubMed Central

    Stockton, David B.; Santamaria, Fidel

    2015-01-01

    We developed NeuroManager, an object-oriented simulation management software engine for computational neuroscience. NeuroManager automates the workflow of simulation job submissions when using heterogeneous computational resources, simulators, and simulation tasks. The object-oriented approach (1) provides flexibility to adapt to a variety of neuroscience simulators, (2) simplifies the use of heterogeneous computational resources, from desktops to supercomputer clusters, and (3) improves tracking of simulator/simulation evolution. We implemented NeuroManager in MATLAB, a widely used engineering and scientific language, for its signal and image processing tools, prevalence in electrophysiology analysis, and increasing use in college biology education. To design and develop NeuroManager we analyzed the workflow of simulation submission for a variety of simulators, operating systems, and computational resources, including the handling of input parameters, data, models, results, and analyses. This resulted in a 22-stage simulation submission workflow. The software incorporates progress notification, automatic organization, labeling, and time-stamping of data and results, and integrated access to MATLAB's analysis and visualization tools. NeuroManager provides users with the tools to automate daily tasks, and assists principal investigators in tracking and recreating the evolution of research projects performed by multiple people. Overall, NeuroManager provides the infrastructure needed to improve workflow, manage multiple simultaneous simulations, and maintain provenance of the potentially large amounts of data produced during the course of a research project. PMID:26528175

  14. NeuroManager: a workflow analysis based simulation management engine for computational neuroscience.

    PubMed

    Stockton, David B; Santamaria, Fidel

    2015-01-01

    We developed NeuroManager, an object-oriented simulation management software engine for computational neuroscience. NeuroManager automates the workflow of simulation job submissions when using heterogeneous computational resources, simulators, and simulation tasks. The object-oriented approach (1) provides flexibility to adapt to a variety of neuroscience simulators, (2) simplifies the use of heterogeneous computational resources, from desktops to supercomputer clusters, and (3) improves tracking of simulator/simulation evolution. We implemented NeuroManager in MATLAB, a widely used engineering and scientific language, for its signal and image processing tools, prevalence in electrophysiology analysis, and increasing use in college biology education. To design and develop NeuroManager we analyzed the workflow of simulation submission for a variety of simulators, operating systems, and computational resources, including the handling of input parameters, data, models, results, and analyses. This resulted in a 22-stage simulation submission workflow. The software incorporates progress notification, automatic organization, labeling, and time-stamping of data and results, and integrated access to MATLAB's analysis and visualization tools. NeuroManager provides users with the tools to automate daily tasks, and assists principal investigators in tracking and recreating the evolution of research projects performed by multiple people. Overall, NeuroManager provides the infrastructure needed to improve workflow, manage multiple simultaneous simulations, and maintain provenance of the potentially large amounts of data produced during the course of a research project.
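
    The staged-submission idea can be illustrated with a minimal sketch (in Python rather than MATLAB, with invented stage names; this is not NeuroManager's actual API): each stage transforms a shared context and is time-stamped as it completes.

        from datetime import datetime

        def prepare_inputs(ctx):
            ctx["input_files"] = [f"param_set_{i}.txt" for i in range(ctx["n_sims"])]
            return ctx

        def submit_jobs(ctx):
            # A real engine would hand these to a simulator or cluster scheduler here.
            ctx["jobs"] = [{"input": f, "status": "submitted"} for f in ctx["input_files"]]
            return ctx

        def collect_results(ctx):
            for job in ctx["jobs"]:
                job["status"] = "done"
            return ctx

        PIPELINE = [prepare_inputs, submit_jobs, collect_results]   # stand-in for the 22 stages

        def run_pipeline(ctx):
            log = []
            for stage in PIPELINE:
                ctx = stage(ctx)
                log.append((stage.__name__, datetime.now().isoformat()))  # automatic time-stamping
            return ctx, log

        ctx, history = run_pipeline({"n_sims": 3})
        for name, stamp in history:
            print(f"{stamp}  {name} completed")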

  15. Nexus: A modular workflow management system for quantum simulation codes

    NASA Astrophysics Data System (ADS)

    Krogel, Jaron T.

    2016-01-01

    The management of simulation workflows represents a significant task for the individual computational researcher. Automation of the required tasks involved in simulation work can decrease the overall time to solution and reduce sources of human error. A new simulation workflow management system, Nexus, is presented to address these issues. Nexus is capable of automated job management on workstations and resources at several major supercomputing centers. Its modular design allows many quantum simulation codes to be supported within the same framework. Current support includes quantum Monte Carlo calculations with QMCPACK, density functional theory calculations with Quantum Espresso or VASP, and quantum chemical calculations with GAMESS. Users can compose workflows through a transparent, text-based interface, resembling the input file of a typical simulation code. A usage example is provided to illustrate the process.
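
    The flavour of such a text-like workflow description can be conveyed with a small, self-contained sketch; the dictionary keys and step names below are illustrative assumptions, not the real Nexus input syntax.

        # Declarative list of simulation steps with dependencies, executed in
        # dependency order. run() is a stand-in for actual job submission.
        workflow = [
            {"name": "relax", "code": "quantum_espresso", "inputs": {"ecutwfc": 80}},
            {"name": "scf",   "code": "quantum_espresso", "depends_on": ["relax"]},
            {"name": "dmc",   "code": "qmcpack",          "depends_on": ["scf"],
             "inputs": {"walkers": 512, "timestep": 0.01}},
        ]

        def ready(step, done):
            return all(dep in done for dep in step.get("depends_on", []))

        def run(step):
            print(f"submitting {step['name']} ({step['code']})")

        done = set()
        while len(done) < len(workflow):
            for step in workflow:
                if step["name"] not in done and ready(step, done):
                    run(step)
                    done.add(step["name"])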

  16. Biowep: a workflow enactment portal for bioinformatics applications.

    PubMed

    Romano, Paolo; Bartocci, Ezio; Bertolini, Guglielmo; De Paoli, Flavio; Marra, Domenico; Mauri, Giancarlo; Merelli, Emanuela; Milanesi, Luciano

    2007-03-08

    The huge amount of biological information, its distribution over the Internet and the heterogeneity of available software tools make the adoption of new data integration and analysis network tools a necessity in bioinformatics. ICT standards and tools, such as Web Services and Workflow Management Systems (WMS), can support the creation and deployment of such systems. Many Web Services are already available and some WMS have been proposed. They assume that researchers know which bioinformatics resources can be reached through a programmatic interface and that they are skilled in programming and building workflows; they are therefore not accessible to the majority of researchers without such skills. A portal enabling these researchers to benefit from the new technologies has been missing. We designed biowep, a web-based client application that allows for the selection and execution of a set of predefined workflows. The system is available on-line. The biowep architecture includes a Workflow Manager, a User Interface and a Workflow Executor. The task of the Workflow Manager is the creation and annotation of workflows; these can be created by using either the Taverna Workbench or BioWMS. Enactment of workflows is carried out by FreeFluo for Taverna workflows and by BioAgent/Hermes, a mobile agent-based middleware, for BioWMS ones. The main processing steps of workflows are annotated on the basis of their input and output, elaboration type and application domain, using a classification of bioinformatics data and tasks. The interface supports user authentication and profiling. Workflows can be selected on the basis of users' profiles and can be searched through their annotations, and results can be saved. We developed a web system that supports the selection and execution of predefined workflows, thus simplifying access for all researchers. The implementation of Web Services allowing specialized software to interact with an exhaustive set of biomedical databases and analysis software, together with the creation of effective workflows, can significantly improve the automation of in-silico analysis. Biowep is available to interested researchers as a reference portal; they are invited to submit their workflows to the workflow repository. Biowep is being further developed within the Laboratory of Interdisciplinary Technologies in Bioinformatics - LITBIO.

  17. Biowep: a workflow enactment portal for bioinformatics applications

    PubMed Central

    Romano, Paolo; Bartocci, Ezio; Bertolini, Guglielmo; De Paoli, Flavio; Marra, Domenico; Mauri, Giancarlo; Merelli, Emanuela; Milanesi, Luciano

    2007-01-01

    Background The huge amount of biological information, its distribution over the Internet and the heterogeneity of available software tools make the adoption of new data integration and analysis network tools a necessity in bioinformatics. ICT standards and tools, such as Web Services and Workflow Management Systems (WMS), can support the creation and deployment of such systems. Many Web Services are already available and some WMS have been proposed. They assume that researchers know which bioinformatics resources can be reached through a programmatic interface and that they are skilled in programming and building workflows; they are therefore not accessible to the majority of researchers without such skills. A portal enabling these researchers to benefit from the new technologies has been missing. Results We designed biowep, a web-based client application that allows for the selection and execution of a set of predefined workflows. The system is available on-line. The biowep architecture includes a Workflow Manager, a User Interface and a Workflow Executor. The task of the Workflow Manager is the creation and annotation of workflows; these can be created by using either the Taverna Workbench or BioWMS. Enactment of workflows is carried out by FreeFluo for Taverna workflows and by BioAgent/Hermes, a mobile agent-based middleware, for BioWMS ones. The main processing steps of workflows are annotated on the basis of their input and output, elaboration type and application domain, using a classification of bioinformatics data and tasks. The interface supports user authentication and profiling. Workflows can be selected on the basis of users' profiles and can be searched through their annotations, and results can be saved. Conclusion We developed a web system that supports the selection and execution of predefined workflows, thus simplifying access for all researchers. The implementation of Web Services allowing specialized software to interact with an exhaustive set of biomedical databases and analysis software, together with the creation of effective workflows, can significantly improve the automation of in-silico analysis. Biowep is available to interested researchers as a reference portal; they are invited to submit their workflows to the workflow repository. Biowep is being further developed within the Laboratory of Interdisciplinary Technologies in Bioinformatics – LITBIO. PMID:17430563

  18. Case mix management education in a Canadian hospital.

    PubMed

    Moffat, M; Prociw, M

    1992-01-01

    The Sunnybrook Health Science Centre's matrix organization model includes a traditional departmental structure, a strategic program-based structure and a case management-based structure--the Clinical Unit structure. The Clinical Unit structure allows the centre to give responsibility for the management of case mix and volume to decentralized Clinical Unit teams, each of which manages its own budget. To train physicians and nurses in their respective roles of Medical Unit directors and Nursing Unit directors, Sunnybrook designed unique short courses on financial management and budgeting, and case-costing and case mix management. This paper discusses how these courses were organized, details their contents and explains how they fit into Sunnybrook's program of decentralized management.

  19. Content and Workflow Management for Library Websites: Case Studies

    ERIC Educational Resources Information Center

    Yu, Holly, Ed.

    2005-01-01

    Using database-driven web pages or web content management (WCM) systems to manage increasingly diverse web content and to streamline workflows is a commonly practiced solution recognized in libraries today. However, limited library web content management models and funding constraints prevent many libraries from purchasing commercially available…

  20. Scientific Workflow Management in Proteomics

    PubMed Central

    de Bruin, Jeroen S.; Deelder, André M.; Palmblad, Magnus

    2012-01-01

    Data processing in proteomics can be a challenging endeavor, requiring extensive knowledge of many different software packages, all with different algorithms, data format requirements, and user interfaces. In this article we describe the integration of a number of existing programs and tools in Taverna Workbench, a scientific workflow manager currently being developed in the bioinformatics community. We demonstrate how a workflow manager provides a single, visually clear and intuitive interface to complex data analysis tasks in proteomics, from raw mass spectrometry data to protein identifications and beyond. PMID:22411703

  1. Flexible Workflow Software enables the Management of an Increased Volume and Heterogeneity of Sensors, and evolves with the Expansion of Complex Ocean Observatory Infrastructures.

    NASA Astrophysics Data System (ADS)

    Tomlin, M. C.; Jenkyns, R.

    2015-12-01

    Ocean Networks Canada (ONC) collects data from observatories in the northeast Pacific, Salish Sea, Arctic Ocean, Atlantic Ocean, and land-based sites in British Columbia. Data are streamed, collected autonomously, or transmitted via satellite from a variety of instruments. The Software Engineering group at ONC develops and maintains Oceans 2.0, an in-house software system that acquires and archives data from sensors, and makes data available to scientists, the public, government and non-government agencies. The Oceans 2.0 workflow tool was developed by ONC to manage a large volume of tasks and processes required for instrument installation, recovery and maintenance activities. Since 2013, the workflow tool has supported 70 expeditions and grown to include 30 different workflow processes for the increasing complexity of infrastructures at ONC. The workflow tool strives to keep pace with an increasing heterogeneity of sensors, connections and environments by supporting versioning of existing workflows, and allowing the creation of new processes and tasks. Despite challenges in training and gaining mutual support from multidisciplinary teams, the workflow tool has become invaluable in project management in an innovative setting. It provides a collective place to contribute to ONC's diverse projects and expeditions and encourages more repeatable processes, while promoting interactions between the multidisciplinary teams who manage various aspects of instrument development and the data they produce. The workflow tool inspires documentation of terminologies and procedures, and effectively links to other tools at ONC such as JIRA, Alfresco and Wiki. Motivated by growing sensor schemes, modes of collecting data, archiving, and data distribution at ONC, the workflow tool ensures that infrastructure is managed completely from instrument purchase to data distribution. It integrates all areas of expertise and helps fulfill ONC's mandate to offer quality data to users.

  2. Workflow Automation: A Collective Case Study

    ERIC Educational Resources Information Center

    Harlan, Jennifer

    2013-01-01

    Knowledge management has proven to be a sustainable competitive advantage for many organizations. Knowledge management systems are abundant, with multiple functionalities. The literature reinforces the use of workflow automation with knowledge management systems to benefit organizations; however, it was not known if process automation yielded…

  3. Strategic Alignment: Recruiting Students in a Highly Decentralized Environment

    ERIC Educational Resources Information Center

    Levin, Richard

    2016-01-01

    All enrollment managers face some level of challenge related to decentralized decision making and operations. Policies and practices can vary considerably by academic area, creating administrative complexity, restricting the scope and speed of institutional initiatives, and limiting potential efficiencies. Central attempts to standardize or…

  4. [Significant changes in the health system decentralization process in Brazil].

    PubMed

    Viana, Ana Luiza d'Avila; Heimann, Luiza S; de Lima, Luciana Dias; de Oliveira, Roberta Gondim; Rodrigues, Sergio da Hora

    2002-01-01

    This article discusses the trends and limits of the Brazilian health system decentralization process, identifying the three elements that constitute the strategic induction performed by the national system administrator in accordance with the guidelines contained in the Operational Norms of the Unified National Health System: systemic rationality, intergovernmental and service provider financing, and health care model. The effects of the Federal regulations are analyzed based on the results of the evaluation study focused on the implementation of the full management scheme at the Municipal level. The decentralization strategy induced by Basic Operational Norm 96 has succeeded in improving institutional conditions, management autonomy, and supply, as measured by the Federal resources transferred, installed capacity, production, and coverage of outpatient and hospital services, with the Municipalities authorized to conduct fully autonomous management, without altering the existing patterns of inequity in the distribution of funds to poorer Municipalities.

  5. Nexus: a modular workflow management system for quantum simulation codes

    DOE PAGES

    Krogel, Jaron T.

    2015-08-24

    The management of simulation workflows is a significant task for the individual computational researcher. Automation of the required tasks involved in simulation work can decrease the overall time to solution and reduce sources of human error. A new simulation workflow management system, Nexus, is presented to address these issues. Nexus is capable of automated job management on workstations and resources at several major supercomputing centers. Its modular design allows many quantum simulation codes to be supported within the same framework. Current support includes quantum Monte Carlo calculations with QMCPACK, density functional theory calculations with Quantum Espresso or VASP, and quantum chemical calculations with GAMESS. Users can compose workflows through a transparent, text-based interface, resembling the input file of a typical simulation code. A usage example is provided to illustrate the process.

  6. Implementation of a decentralized community-based treatment program to improve the management of Buruli ulcer in the Ouinhi district of Benin, West Africa

    PubMed Central

    Amoussouhoui, Arnaud Setondji; Wadagni, Anita Carolle; Johnson, Roch Christian; Aoulou, Paulin; Agbo, Inès Elvire; Houezo, Jean-Gabin; Boyer, Micah; Nichter, Mark

    2018-01-01

    Background Mycobacterium ulcerans infection, commonly known as Buruli ulcer (BU), is a debilitating neglected tropical disease. Its management remains complex and has three main components: antibiotic treatment combining rifampicin and streptomycin for 56 days, wound dressings and skin grafts for large ulcerations, and physical therapy to prevent functional limitations after care. In Benin, BU patient care is being integrated into the government health system. In this paper, we report on an innovative pilot program designed to introduce BU decentralization in Ouinhi district, one of Benin’s most endemic districts previously served by centralized hospital-based care. Methodology/Principal findings We conducted intervention-oriented research implemented in four steps: baseline study, training of health district clinical staff, outreach education, outcome and impact assessments. Study results demonstrated that early BU lesions (71% of all detected cases) could be treated in the community following outreach education, and that most of the afflicted were willing to accept decentralized treatment. Ninety-three percent were successfully treated with antibiotics alone. The impact evaluation found that community confidence in decentralized BU care was greatly enhanced by clinic staff who came to be seen as having expertise in the care of most chronic wounds. Conclusions/Significance This study documents a successful BU outreach and decentralized care program reaching early BU cases not previously treated by a proactive centralized BU program. The pilot program further demonstrates the added value of integrated wound management for NTD control. PMID:29529087

  7. Implementation of a decentralized community-based treatment program to improve the management of Buruli ulcer in the Ouinhi district of Benin, West Africa.

    PubMed

    Amoussouhoui, Arnaud Setondji; Sopoh, Ghislain Emmanuel; Wadagni, Anita Carolle; Johnson, Roch Christian; Aoulou, Paulin; Agbo, Inès Elvire; Houezo, Jean-Gabin; Boyer, Micah; Nichter, Mark

    2018-03-01

    Mycobacterium ulcerans infection, commonly known as Buruli ulcer (BU), is a debilitating neglected tropical disease. Its management remains complex and has three main components: antibiotic treatment combining rifampicin and streptomycin for 56 days, wound dressings and skin grafts for large ulcerations, and physical therapy to prevent functional limitations after care. In Benin, BU patient care is being integrated into the government health system. In this paper, we report on an innovative pilot program designed to introduce BU decentralization in Ouinhi district, one of Benin's most endemic districts previously served by centralized hospital-based care. We conducted intervention-oriented research implemented in four steps: baseline study, training of health district clinical staff, outreach education, outcome and impact assessments. Study results demonstrated that early BU lesions (71% of all detected cases) could be treated in the community following outreach education, and that most of the afflicted were willing to accept decentralized treatment. Ninety-three percent were successfully treated with antibiotics alone. The impact evaluation found that community confidence in decentralized BU care was greatly enhanced by clinic staff who came to be seen as having expertise in the care of most chronic wounds. This study documents a successful BU outreach and decentralized care program reaching early BU cases not previously treated by a proactive centralized BU program. The pilot program further demonstrates the added value of integrated wound management for NTD control.

  8. Nitrogen Control Through Decentralized Wastewater Treatment: Process Performance and Alternative Management Strategies

    EPA Science Inventory

    Decentralized or onsite wastewater treatment (OWT) systems have long been implicated in being a major source of N inputs to surface and ground waters and numerous regulatory bodies have promulgated strict total N (TN) effluent standards in N-sensitive areas. These standards, howe...

  9. JMS: An Open Source Workflow Management System and Web-Based Cluster Front-End for High Performance Computing.

    PubMed

    Brown, David K; Penkler, David L; Musyoka, Thommas M; Bishop, Özlem Tastan

    2015-01-01

    Complex computational pipelines are becoming a staple of modern scientific research. Often these pipelines are resource intensive and require days of computing time. In such cases, it makes sense to run them over high performance computing (HPC) clusters where they can take advantage of the aggregated resources of many powerful computers. In addition to this, researchers often want to integrate their workflows into their own web servers. In these cases, software is needed to manage the submission of jobs from the web interface to the cluster and then return the results once the job has finished executing. We have developed the Job Management System (JMS), a workflow management system and web interface for high performance computing (HPC). JMS provides users with a user-friendly web interface for creating complex workflows with multiple stages. It integrates this workflow functionality with the resource manager, a tool that is used to control and manage batch jobs on HPC clusters. As such, JMS combines workflow management functionality with cluster administration functionality. In addition, JMS provides developer tools including a code editor and the ability to version tools and scripts. JMS can be used by researchers from any field to build and run complex computational pipelines and provides functionality to include these pipelines in external interfaces. JMS is currently being used to house a number of bioinformatics pipelines at the Research Unit in Bioinformatics (RUBi) at Rhodes University. JMS is an open-source project and is freely available at https://github.com/RUBi-ZA/JMS.
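
    The glue between a web request and the cluster's resource manager can be sketched as follows, assuming a Slurm-style sbatch command for concreteness; JMS itself targets the cluster's resource manager generically, so treat this as an illustration of the pattern rather than JMS code.

        import os, re, subprocess, tempfile

        def submit_stage(commands, job_name="web_job"):
            """Write a batch script for one workflow stage and hand it to the scheduler."""
            script = "#!/bin/bash\n#SBATCH --job-name={}\n{}\n".format(job_name, "\n".join(commands))
            with tempfile.NamedTemporaryFile("w", suffix=".sh", delete=False) as f:
                f.write(script)
                path = f.name
            try:
                out = subprocess.run(["sbatch", path], capture_output=True, text=True, check=True)
            finally:
                os.unlink(path)
            match = re.search(r"(\d+)", out.stdout)   # Slurm replies "Submitted batch job <id>"
            return match.group(1) if match else None

        # A web endpoint would call submit_stage(...) and later poll the scheduler
        # to report job progress back to the browser.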

  10. JMS: An Open Source Workflow Management System and Web-Based Cluster Front-End for High Performance Computing

    PubMed Central

    Brown, David K.; Penkler, David L.; Musyoka, Thommas M.; Bishop, Özlem Tastan

    2015-01-01

    Complex computational pipelines are becoming a staple of modern scientific research. Often these pipelines are resource intensive and require days of computing time. In such cases, it makes sense to run them over high performance computing (HPC) clusters where they can take advantage of the aggregated resources of many powerful computers. In addition to this, researchers often want to integrate their workflows into their own web servers. In these cases, software is needed to manage the submission of jobs from the web interface to the cluster and then return the results once the job has finished executing. We have developed the Job Management System (JMS), a workflow management system and web interface for high performance computing (HPC). JMS provides users with a user-friendly web interface for creating complex workflows with multiple stages. It integrates this workflow functionality with the resource manager, a tool that is used to control and manage batch jobs on HPC clusters. As such, JMS combines workflow management functionality with cluster administration functionality. In addition, JMS provides developer tools including a code editor and the ability to version tools and scripts. JMS can be used by researchers from any field to build and run complex computational pipelines and provides functionality to include these pipelines in external interfaces. JMS is currently being used to house a number of bioinformatics pipelines at the Research Unit in Bioinformatics (RUBi) at Rhodes University. JMS is an open-source project and is freely available at https://github.com/RUBi-ZA/JMS. PMID:26280450

  11. A Tool Supporting Collaborative Data Analytics Workflow Design and Management

    NASA Astrophysics Data System (ADS)

    Zhang, J.; Bao, Q.; Lee, T. J.

    2016-12-01

    Collaborative experiment design could significantly enhance the sharing and adoption of the data analytics algorithms and models emerging in Earth science. Existing data-oriented workflow tools, however, are not suitable for supporting collaborative design of such workflows: for example, they do not support real-time co-design, track how a workflow evolves over time based on changing designs contributed by multiple Earth scientists, or capture and retrieve collaboration knowledge about workflow design (the discussions that lead to a design). To address these challenges, we have designed and developed a technique supporting collaborative data-oriented workflow composition and management as a key component toward supporting big data collaboration through the Internet. Reproducibility and scalability are two major targets demanding fundamental infrastructural support. One outcome of the project is a software tool that supports an elastic number of groups of Earth scientists collaboratively designing and composing data analytics workflows through the Internet. Instead of reinventing the wheel, we have extended an existing workflow tool, VisTrails, into an online collaborative environment as a proof of concept.

  12. Centralization and Decentralization of Schools' Physical Facilities Management in Nigeria

    ERIC Educational Resources Information Center

    Ikoya, Peter O.

    2008-01-01

    Purpose: This research aims to examine the difference in the availability, adequacy and functionality of physical facilities in centralized and decentralized schools districts, with a view to making appropriate recommendations to stakeholders on the reform programmes in the Nigerian education sector. Design/methodology/approach: Principals,…

  13. Leadership and the Decentralized Control of Schools

    ERIC Educational Resources Information Center

    Steinberg, Matthew P.

    2013-01-01

    This review examines the literature related to leadership and the decentralized control of schools. It first considers the distinctive goals of public and private agencies, the specific constraints that shape the autonomy of leaders in different sectors, and the ways in which new models of public management are infusing public agencies with…

  14. Lessons from implementing a combined workflow-informatics system for diabetes management.

    PubMed

    Zai, Adrian H; Grant, Richard W; Estey, Greg; Lester, William T; Andrews, Carl T; Yee, Ronnie; Mort, Elizabeth; Chueh, Henry C

    2008-01-01

    Shortcomings surrounding the care of patients with diabetes have been attributed largely to a fragmented, disorganized, and duplicative health care system that focuses more on acute conditions and complications than on managing chronic disease. To address these shortcomings, we developed a diabetes registry population management application to change the way our staff manages patients with diabetes. Use of this new application has helped us coordinate the responsibilities for intervening and monitoring patients in the registry among different users. Our experiences using this combined workflow-informatics intervention system suggest that integrating a chronic disease registry into clinical workflow for the treatment of chronic conditions creates a useful and efficient tool for managing disease.

  15. Investigation on the governance model and effect of medical schools merged with comprehensive universities in China.

    PubMed

    Bai, Ge; Luo, Li

    2013-08-01

    This investigation analyzes the management of medical schools merged with comprehensive universities, using internet searches and a research review to reveal the management models used and the effects of the mergers. The analysis indicates that governance models fall into two patterns: centralized management and decentralized management. Eight universities representing the two models were selected and evaluated comprehensively. Among them, the universities that adopted decentralized management showed greater development after the merger, based on a comparison of freshman quality, faculty, teaching, and research between the two patterns. © 2013 Wiley Publishing Asia Pty Ltd and Chinese Cochrane Center, West China Hospital of Sichuan University.

  16. The social control of energy: A case for the promise of decentralized solar technologies

    NASA Astrophysics Data System (ADS)

    Gilmer, R. W.

    1980-05-01

    Decentralized solar technology and centralized electric utilities were contrasted in the ways they assign property rights in capital and energy output; in the assignment of operational control; and in the means of monitoring, policing, and enforcing property rights. An analogy was drawn between the decision of an energy consumer to use decentralized solar and the decision of a firm to vertically integrate, that is, to extend the boundary of the firm by making inputs or further processing output. Decentralized solar energy production offers the small energy consumer the chance to cut ties to outside suppliers--to vertically integrate energy production into the home or business. The development of this analogy provides insight into important noneconomic aspects of solar energy, and it points clearly to the lighter burdens of social management offered by decentralized solar technology.

  17. Quality Metadata Management for Geospatial Scientific Workflows: from Retrieving to Assessing with Online Tools

    NASA Astrophysics Data System (ADS)

    Leibovici, D. G.; Pourabdollah, A.; Jackson, M.

    2011-12-01

    Experts and decision-makers use or develop models to monitor global and local changes of the environment. Their activities require the combination of data and processing services in a flow of operations and spatial data computations: a geospatial scientific workflow. The seamless ability to generate, re-use and modify a geospatial scientific workflow is an important requirement, but the quality of outcomes is equally important [1]. Metadata information attached to the data and processes, and particularly their quality, is essential to assess the reliability of the scientific model that represents a workflow [2]. Management tools that deal with qualitative and quantitative metadata measures of the quality associated with a workflow are therefore required by modellers. To ensure interoperability, ISO and OGC standards [3] are to be adopted, allowing, for example, metadata profiles to be defined and retrieved via web service interfaces. However, these standards need a few extensions for workflows, particularly in the context of geoprocess metadata. We propose to fill this gap (i) through the provision of a metadata profile for the quality of processes, and (ii) through a framework, based on XPDL [4], to manage the quality information. Web Processing Services are used to implement a range of metadata analyses on the workflow in order to evaluate and present quality information at different levels of the workflow. This generates the metadata quality, stored in the XPDL file. The focus is (a) on the visual representations of the quality, summarizing the retrieved quality information either from the standardized metadata profiles of the components or from non-standard quality information, e.g., Web 2.0 information, and (b) on the estimated qualities of the outputs derived from meta-propagation of uncertainties (a principle that we have introduced [5]). An a priori validation of the future decision-making supported by the workflow outputs is then provided using the meta-propagated qualities, obtained without running the workflow [6], together with a visualization that points out, on the workflow graph itself, where the workflow needs better data or better processes. [1] Leibovici, DG, Hobona, G, Stock, K, Jackson, M (2009) Qualifying geospatial workflow models for adaptive controlled validity and accuracy. In: IEEE 17th GeoInformatics, 1-5. [2] Leibovici, DG, Pourabdollah, A (2010a) Workflow Uncertainty using a Metamodel Framework and Metadata for Data and Processes. OGC TC/PC Meetings, September 2010, Toulouse, France. [3] OGC (2011) www.opengeospatial.org. [4] XPDL (2008) Workflow Process Definition Interface - XML Process Definition Language. Workflow Management Coalition, Document WfMC-TC-1025, 2008. [5] Leibovici, DG, Pourabdollah, A, Jackson, M (2011) Meta-propagation of Uncertainties for Scientific Workflow Management in Interoperable Spatial Data Infrastructures. In: Proceedings of the European Geosciences Union (EGU2011), April 2011, Austria. [6] Pourabdollah, A, Leibovici, DG, Jackson, M (2011) MetaPunT: an Open Source tool for Meta-Propagation of uncerTainties in Geospatial Processing. In: Proceedings of OSGIS2011, June 2011, Nottingham, UK.
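
    The idea of estimating output quality without running the workflow can be illustrated with a toy propagation over the workflow graph; the combination rule below (a node's own quality times its weakest input) is an assumption made for this sketch, not the formula introduced in [5].

        quality = {"soil_map": 0.9, "rain_grid": 0.8, "interpolate": 0.95, "overlay": 0.9}
        workflow = {                      # node -> upstream nodes it consumes
            "interpolate": ["rain_grid"],
            "overlay": ["soil_map", "interpolate"],
        }

        def propagated(node):
            """A priori quality of a node's output, computed without executing the workflow."""
            inputs = workflow.get(node, [])
            if not inputs:
                return quality[node]
            return quality[node] * min(propagated(p) for p in inputs)

        print(round(propagated("overlay"), 3))   # estimate attached to the final output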

  18. Workflow computing. Improving management and efficiency of pathology diagnostic services.

    PubMed

    Buffone, G J; Moreau, D; Beck, J R

    1996-04-01

    Traditionally, information technology in health care has helped practitioners to collect, store, and present information and also to add a degree of automation to simple tasks (instrument interfaces supporting result entry, for example). Thus commercially available information systems do little to support the need to model, execute, monitor, coordinate, and revise the various complex clinical processes required to support health-care delivery. Workflow computing, which is already implemented and improving the efficiency of operations in several nonmedical industries, can address the need to manage complex clinical processes. Workflow computing not only provides a means to define and manage the events, roles, and information integral to health-care delivery but also supports the explicit implementation of policy or rules appropriate to the process. This article explains how workflow computing may be applied to health-care and the inherent advantages of the technology, and it defines workflow system requirements for use in health-care delivery with special reference to diagnostic pathology.

  19. Six hospitals describe decentralization, cost containment, and downsizing.

    PubMed

    Lineweaver, L A; Battle, C E; Schilling, R M; Nall, C M

    1999-01-01

    Decentralization, cost containment, and downsizing continue in full force as healthcare organizations continue to adapt to constant economic change. Hospitals are forced to take a second and third look at how health care is managed in order to survive. Six Northwest Florida hospitals were surveyed in an effort to explore current changes within the healthcare delivery system. This article provides both managers and staff with an overview of recent healthcare changes in an area of the country with implications for staff development.

  20. Historical and Cultural Perspectives on Centralization/Decentralization in Continuing Education.

    ERIC Educational Resources Information Center

    Edelson, Paul J.

    1995-01-01

    Views centralization/decentralization from four perspectives: historical, as an outgrowth of professionalism, in the culture of higher education, and management theory. Suggests that some form of centralized control will always be necessary if continuing education is to function in a larger organization, but smaller units may be the wave of the…

  1. Human Resource Support for School Principals in Two, Urban School Districts: An Exploratory Study

    ERIC Educational Resources Information Center

    Lochmiller, Chad R.

    2010-01-01

    School districts are increasingly focused on instructional practice in classrooms. Many urban school districts have shifted decision-making responsibility to school principals in order to improve instruction. This reform strategy has been referred to as decentralization or school-based management. Decentralization has a significant influence on…

  2. Supporting Collaborative Model and Data Service Development and Deployment with DevOps

    NASA Astrophysics Data System (ADS)

    David, O.

    2016-12-01

    Adopting DevOps practices for model service development and deployment enables a community to engage in service-oriented modeling and data management. The Cloud Services Integration Platform (CSIP), developed over the last five years at Colorado State University, provides for collaborative integration of environmental models into scalable model and data services as a micro-services platform with API and deployment infrastructure. Originally developed to support USDA natural resource applications, it proved suitable for a wider range of applications in the environmental modeling domain. As its scope and visibility extended, it became apparent that community integration and adequate workflow support through the full model development and application cycle drove successful outcomes. DevOps provides best practices, tools, and organizational structures to optimize the transition from model service development to deployment by minimizing the (i) operational burden and (ii) turnaround time for modelers. We have developed and implemented a methodology to fully automate a suite of applications for application lifecycle management, version control, continuous integration, container management, and container scaling to enable model and data service developers in various institutions to collaboratively build, run, deploy, test, and scale services within minutes. To date, more than 160 model and data services are available for applications in hydrology (PRMS, Hydrotools, CFA, ESP), water and wind erosion prediction (WEPP, WEPS, RUSLE2), soil quality trends (SCI, STIR), water quality analysis (SWAT-CP, WQM, CFA, AgES-W), stream degradation assessment (SWAT-DEG), hydraulics (cross-section), and grazing management (GRAS). In addition, supporting data services include soil (SSURGO), ecological site (ESIS), climate (CLIGEN, WINDGEN), land management and crop rotations (LMOD), and pesticides (WQM), developed using this workflow automation and decentralized governance.

  3. The Invasive Species Forecasting System

    NASA Technical Reports Server (NTRS)

    Schnase, John; Most, Neal; Gill, Roger; Ma, Peter

    2011-01-01

    The Invasive Species Forecasting System (ISFS) provides computational support for the generic work processes found in many regional-scale ecosystem modeling applications. Decision support tools built using ISFS allow a user to load point occurrence field sample data for a plant species of interest and quickly generate habitat suitability maps for geographic regions of management concern, such as a national park, monument, forest, or refuge. This type of decision product helps resource managers plan invasive species protection, monitoring, and control strategies for the lands they manage. Until now, scientists and resource managers have lacked the data-assembly and computing capabilities to produce these maps quickly and cost efficiently. ISFS focuses on regional-scale habitat suitability modeling for invasive terrestrial plants. ISFS's component architecture emphasizes simplicity and adaptability. Its core services can be easily adapted to produce model-based decision support tools tailored to particular parks, monuments, forests, refuges, and related management units. ISFS can be used to build standalone run-time tools that require no connection to the Internet, as well as fully Internet-based decision support applications. ISFS provides the core data structures, operating system interfaces, network interfaces, and inter-component constraints comprising the canonical workflow for habitat suitability modeling. The predictors, analysis methods, and geographic extents involved in any particular model run are elements of the user space and arbitrarily configurable by the user. ISFS provides small, lightweight, readily hardened core components of general utility. These components can be adapted to unanticipated uses, are tailorable, and require at most a loosely coupled, nonproprietary connection to the Web. Users can invoke capabilities from a command line; programmers can integrate ISFS's core components into more complex systems and services. Taken together, these features enable a degree of decentralization and distributed ownership that have helped other types of scientific information services succeed in recent years.

  4. HMIS and decision-making in Zambia: re-thinking information solutions for district health management in decentralized health systems.

    PubMed

    Mutemwa, Richard I

    2006-01-01

    At the onset of health system decentralization as a primary health care strategy, which constituted a key feature of health sector reforms across the developing world, efficient and effective health management information systems (HMIS) were widely acknowledged and adopted as a critical element of district health management strengthening programmes. The focal concern was about the performance and long-term sustainability of decentralized district health systems. The underlying logic was that effective and efficient HMIS would provide district health managers with the information required to make effective strategic decisions that are the vehicle for district performance and sustainability in these decentralized health systems. However, this argument is rooted in normative management and decision theory without significant unequivocal empirical corroboration. Indeed, extensive empirical evidence continues to indicate that managers' decision-making behaviour and the existence of other forms of information outside the HMIS, within the organizational environment, suggest a far more tenuous relationship between the presence of organizational management information systems (such as HMIS) and effective strategic decision-making. This qualitative comparative case-study conducted in two districts of Zambia focused on investigating the presence and behaviour of five formally identified, different information forms, including that from HMIS, in the strategic decision-making process. The aim was to determine the validity of current arguments for HMIS, and establish implications for current HMIS policies. Evidence from the eight strategic decision-making processes traced in the study confirmed the existence of different forms of information in the organizational environment, including that provided by the conventional HMIS. These information forms attach themselves to various organizational management processes and key aspects of organizational routine. The study results point to the need for a radical re-think of district health management information solutions in ways that account for the existence of other information forms outside the formal HMIS in the district health system.

  5. wft4galaxy: a workflow testing tool for galaxy.

    PubMed

    Piras, Marco Enrico; Pireddu, Luca; Zanetti, Gianluigi

    2017-12-01

    Workflow managers for scientific analysis provide a high-level programming platform facilitating standardization, automation, collaboration and access to sophisticated computing resources. The Galaxy workflow manager provides a prime example of this type of platform. As compositions of simpler tools, workflows effectively comprise specialized computer programs implementing often very complex analysis procedures. To date, no simple way to automatically test Galaxy workflows and ensure their correctness has appeared in the literature. With wft4galaxy we offer a tool that brings automated testing to Galaxy workflows, making it feasible to introduce continuous integration into their development and ensuring that defects are detected promptly. wft4galaxy can be easily installed as a regular Python program or launched directly as a Docker container; the latter reduces installation effort to a minimum. Available at https://github.com/phnmnl/wft4galaxy under the Academic Free License v3.0. marcoenrico.piras@crs4.it. © The Author 2017. Published by Oxford University Press.
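
    The testing pattern being automated can be sketched generically: run a workflow on fixed inputs and compare every produced dataset with an expected file. The run_workflow function below is a placeholder for invoking a Galaxy server; it is not the wft4galaxy API.

        import hashlib

        def run_workflow(inputs):
            # Placeholder for a real workflow invocation; returns {output_name: bytes}.
            return {"uppercase.txt": inputs["text.txt"].upper()}

        def checksum(data):
            return hashlib.sha256(data).hexdigest()

        def run_test_case(inputs, expected_outputs):
            produced = run_workflow(inputs)
            return [name for name, expected in expected_outputs.items()
                    if checksum(produced.get(name, b"")) != checksum(expected)]

        failures = run_test_case({"text.txt": b"hello galaxy"},
                                 {"uppercase.txt": b"HELLO GALAXY"})
        print("PASS" if not failures else f"FAIL: {failures}")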

  6. Decentralized School Governance Policy: A Comparative Study of General Public Schools and Community-Managed Schools in Nepal

    ERIC Educational Resources Information Center

    Khanal, Mukunda Mani

    2016-01-01

    The literature reviewed for this study revealed that the movement toward decentralizing responsibility of school governance to communities has become a global policy in the contemporary world. With the aim of enhancing greater community participation and retaining students in public schools, the Government of Nepal introduced two different…

  7. Decentralizing Education in Transition Societies: Case Studies from Central and Eastern Europe. WBI Learning Resources Series.

    ERIC Educational Resources Information Center

    Fiszbein, Ariel, Ed.

    This book is about education system reform in Central and Eastern Europe, with emphasis on decentralization and management. In the past, local authorities served as implementation arms of the central ministry, while finance and decision-making were controlled by the central government, leaving local communities with little influence. New education…

  8. SU-F-T-251: The Quality Assurance for the Heavy Patient Load Department in the Developing Country: The Primary Experience of An Entire Workflow QA Process Management in Radiotherapy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Xie, J; Wang, J; Peng, J

    Purpose: To implement an entire-workflow quality assurance (QA) process in a radiotherapy department and to reduce radiotherapy error rates through entire-workflow management in a developing country. Methods: The entire-workflow QA process runs from patient registration to the end of the last treatment, covering all steps of the radiotherapy process. The chart-check error rate is used to evaluate the entire-workflow QA process. Two to three qualified senior medical physicists checked the documents of every patient before the first treatment fraction, and random checks of the treatment history during treatment were also performed. Treatment data for around 6000 patients before and after implementing the entire-workflow QA process were compared from May 2014 to December 2015. Results: A systematic checklist was established. It mainly includes patient registration, treatment plan QA, information export to the OIS (Oncology Information System), treatment QA documents and QA of the treatment history. The error rate derived from the chart check decreased from 1.7% to 0.9% after introducing the entire-workflow QA process. All errors found before the first treatment fraction were corrected as soon as the oncologist re-confirmed them, and reinforced staff training followed to prevent those errors. Conclusion: The entire-workflow QA process improved the safety and quality of radiotherapy in our department, and we consider that our QA experience can be applicable to heavily loaded radiotherapy departments in developing countries.

  9. Workflow as a Service in the Cloud: Architecture and Scheduling Algorithms.

    PubMed

    Wang, Jianwu; Korambath, Prakashan; Altintas, Ilkay; Davis, Jim; Crawl, Daniel

    2014-01-01

    With more and more workflow systems adopting the cloud as their execution environment, it becomes increasingly challenging to efficiently manage various workflows, virtual machines (VMs) and workflow executions on VM instances. To make the system scalable and easy to extend, we design a Workflow as a Service (WFaaS) architecture with independent services. A core part of the architecture is how to efficiently respond to continuous workflow requests from users and schedule their executions in the cloud. Based on different targets, we propose four heuristic workflow scheduling algorithms for the WFaaS architecture and, via experimental studies, analyze their differences and best usages in terms of performance, cost and the price/performance ratio.
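
    One simple heuristic of the kind such a scheduler might apply is sketched below: for each ready task, pick the cheapest VM type that still meets the task's deadline. The VM catalogue and tasks are invented for illustration and are not the four algorithms proposed in the paper.

        VM_TYPES = [
            {"name": "small",  "speed": 1.0, "price_per_hour": 0.10},
            {"name": "medium", "speed": 2.0, "price_per_hour": 0.25},
            {"name": "large",  "speed": 4.0, "price_per_hour": 0.60},
        ]

        def choose_vm(task_hours_at_unit_speed, deadline_hours):
            feasible = [vm for vm in VM_TYPES
                        if task_hours_at_unit_speed / vm["speed"] <= deadline_hours]
            if not feasible:                       # no type meets the deadline: best effort
                return max(VM_TYPES, key=lambda vm: vm["speed"])
            # task cost on a VM = runtime * hourly price; take the cheapest feasible option
            return min(feasible,
                       key=lambda vm: (task_hours_at_unit_speed / vm["speed"]) * vm["price_per_hour"])

        for work, deadline in [(2.0, 3.0), (8.0, 2.5)]:
            vm = choose_vm(work, deadline)
            print(f"work={work}h, deadline={deadline}h -> {vm['name']}")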

  10. Ergonomic design for dental offices.

    PubMed

    Ahearn, David J; Sanders, Martha J; Turcotte, Claudia

    2010-01-01

    The increasing complexity of the dental office environment influences productivity and workflow for dental clinicians. Advances in technology, and with it the range of products needed to provide services, have led to sprawl in operatory setups and the potential for awkward postures for dental clinicians during the delivery of oral health services. Although ergonomics often addresses the prevention of musculoskeletal disorders for specific populations of workers, concepts of workflow and productivity are integral to improved practice in work environments. This article provides suggestions for improving workflow and productivity for dental clinicians. The article applies ergonomic principles to dental practice issues such as equipment and supply management, office design, and workflow management. Implications for improved ergonomic processes and future research are explored.

  11. Project management training : final report.

    DOT National Transportation Integrated Search

    2011-01-01

    In 2005 the Indiana Department of Transportation (INDOT) went through a complete reorganization of its operations going from centralized to decentralized (District) management. This reorganization gave Districts autonomy to manage construction projec...

  12. Project management training : [technical summary].

    DOT National Transportation Integrated Search

    2011-01-01

    In 2005, the Indiana Department of Transportation (INDOT) went through a complete reorganization of its operations going from centralized to decentralized (District) management. This reorganization gave Districts autonomy to manage construction proje...

  13. The recent process of decentralization and democratic management of education in Brazil

    NASA Astrophysics Data System (ADS)

    Santos Filho, José Camilo Dos

    1993-09-01

    Brazilian society is entering a new historical period in which the principle of decentralization is beginning to predominate over centralization, which held sway during the last 25 years. In contrast to recent Brazilian history, there is now a search for political, democratic and participatory decentralization more consonant with grass-roots aspirations. The first section of this article presents a brief analysis of some decentralization policies implemented by the military regime of 1964, and discusses relevant facts related to the resistance of civil society to state authoritarianism, and to the struggle for the democratization and organization of civil society up to the end of the 1970s. The second section analyzes some new experiences of democratic public school administration initiated in the 1970s and 1980s. The final section discusses the move toward decentralization and democratization of public school administration in the new Federal and State Constitutions, and in the draft of the new Law of National Education.

  14. Detecting distant homologies on protozoans metabolic pathways using scientific workflows.

    PubMed

    da Cruz, Sérgio Manuel Serra; Batista, Vanessa; Silva, Edno; Tosta, Frederico; Vilela, Clarissa; Cuadrat, Rafael; Tschoeke, Diogo; Dávila, Alberto M R; Campos, Maria Luiza Machado; Mattoso, Marta

    2010-01-01

    Bioinformatics experiments are typically composed of programs in pipelines manipulating enormous quantities of data. An interesting approach for managing those experiments is through workflow management systems (WfMS). In this work we discuss WfMS features to support genome homology workflows and present some relevant issues for typical genomic experiments. Our evaluation used the Kepler WfMS to manage a real genomic pipeline, named OrthoSearch, originally defined as a Perl script. We present a case study detecting distant homologies in trypanosomatid metabolic pathways. Our results reinforce the benefits of WfMS over scripting languages and point out challenges for WfMS in distributed environments.

  15. The path dependence of district manager decision-space in Ghana

    PubMed Central

    Kwamie, Aku; van Dijk, Han; Ansah, Evelyn K; Agyepong, Irene Akua

    2016-01-01

    The district health system in Ghana today is characterized by high resource-uncertainty and narrow decision-space. This article builds a theory-driven historical case study to describe the influence of path-dependent administrative, fiscal and political decentralization processes on development of the district health system and district manager decision-space. Methods included a non-exhaustive literature review of democratic governance in Ghana, and key informant interviews with high-level health system officials integral to the development of the district health system. Through our analysis we identified four periods of district health system progression: (1) development of the district health system (1970–85); (2) Strengthening District Health Systems Initiative (1986–93); (3) health sector reform planning and creation of the Ghana Health Service (1994–96) and (4) health sector reform implementation (1997–2007). It was observed that district manager decision-space steadily widened during periods (1) and (2), due to increases in managerial profile, and concerted efforts at managerial capacity strengthening. Periods (3) and (4) saw initial augmentation of district health system financing, further widening managerial decision-space. However, the latter half of period 4 witnessed district manager decision-space contraction. Formalization of Ghana Health Service structures influenced by self-reinforcing tendencies towards centralized decision-making, national and donor shifts in health sector financing, and changes in key policy actors all worked to the detriment of the district health system, reversing early gains from bottom-up development of the district health system. Policy feedback mechanisms have been influenced by historical and contemporary sequencing of local government and health sector decentralization. An initial act of administrative decentralization, followed by incomplete political and fiscal decentralization has ensured that the balance of power has remained at national level, with strong vertical accountabilities and dependence of the district on national level. This study demonstrates that the rhetoric of decentralization does not always mirror actual implementation, nor always result in empowered local actors. PMID:26318537

  16. SynTrack: DNA Assembly Workflow Management (SynTrack) v2.0.1

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    MENG, XIANWEI; SIMIRENKO, LISA

    2016-12-01

    SynTrack is a dynamic, workflow-driven data management system that tracks the DNA build process: management of the hierarchical relationships of DNA fragments; monitoring of process tasks for the assembly of multiple DNA fragments into final constructs; creation of vendor order forms with selectable building blocks; organization of plate layouts and barcodes for vendor/PCR/fusion/chewback/bioassay/glycerol/master plate maps (default/condensed); creation or updating of pre-assembly/assembly process workflows with selected building blocks; generation of Echo pooling instructions based on plate maps; tracking of building block orders, received and finally assembled, for delivery; bulk updating of colony or PCR amplification information, fusion PCR and chewback results; updating of QA/QC outcomes from .csv and .xlsx template files; re-work of assembly workflows before and after sequencing validation; and tracking of plate/well data changes and status updates, with reporting of master plate status and QC outcomes.

  17. Improving diabetes population management efficiency with an informatics solution.

    PubMed

    Zai, Adrian; Grant, Richard; Andrews, Carl; Yee, Ronnie; Chueh, Henry

    2007-10-11

    Despite intensive resource use for diabetes management in the U.S., our care continues to fall short of evidence-based goals, partly due to system inefficiencies. Diabetes registries are increasingly being utilized as a critical tool for population level disease management by providing real-time data. Since the successful adoption of a diabetes registry depends on how well it integrates with disease management workflows, we optimized our current diabetes management workflow and designed our registry application around it.

  18. Hermes: Seamless delivery of containerized bioinformatics workflows in hybrid cloud (HTC) environments

    NASA Astrophysics Data System (ADS)

    Kintsakis, Athanassios M.; Psomopoulos, Fotis E.; Symeonidis, Andreas L.; Mitkas, Pericles A.

    Hermes introduces a new "describe once, run anywhere" paradigm for the execution of bioinformatics workflows in hybrid cloud environments. It combines the traditional features of parallelization-enabled workflow management systems and of distributed computing platforms in a container-based approach. It offers seamless deployment, overcoming the burden of setting up and configuring the software and network requirements. Most importantly, Hermes fosters the reproducibility of scientific workflows by supporting standardization of the software execution environment, thus leading to consistent scientific workflow results and accelerating scientific output.
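
    As a sketch of the "describe once, run anywhere" idea (not Hermes' actual interface, which the abstract does not detail), a workflow step can name a container image and a command, so the executing host needs only a container runtime rather than the tool's own software stack. The image name, command and mount point below are assumptions.

      # Illustrative containerized workflow step; image, command and paths are hypothetical.
      step = {
          "name":  "align_reads",
          "image": "biocontainers/bwa:v0.7.17",          # assumed image
          "cmd":   ["bwa", "mem", "ref.fa", "reads.fq"],
          "mount": "/data",
      }

      def docker_command(step, workdir):
          """Build the docker invocation that runs one step inside its container."""
          return ["docker", "run", "--rm",
                  "-v", f"{workdir}:{step['mount']}",
                  "-w", step["mount"],
                  step["image"], *step["cmd"]]

      print(" ".join(docker_command(step, "/tmp/experiment")))
      # a runner would execute this command via subprocess where Docker is available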

  19. Health sector decentralization and local decision-making: Decision space, institutional capacities and accountability in Pakistan.

    PubMed

    Bossert, Thomas John; Mitchell, Andrew David

    2011-01-01

    Health sector decentralization has been widely adopted to improve delivery of health services. While many argue that the institutional capacities and mechanisms of accountability required to transform decentralized decision-making into improvements in local health systems are lacking, few empirical studies exist which measure these concepts or relate them to one another. Based on research instruments administered to a sample of 91 health sector decision-makers in 17 districts of Pakistan, this study analyzes relationships between three dimensions of decentralization: decentralized authority (referred to as "decision space"), institutional capacities, and accountability to local officials. Composite quantitative indicators of these three dimensions were constructed within four broad health functions (strategic and operational planning, budgeting, human resources management, and service organization/delivery) and on an overall/cross-function basis. Three main findings emerged. First, district-level respondents report varying degrees of each dimension despite being under a single decentralization regime and facing similar rules across provinces. Second, within dimensions of decentralization, particularly decision space and capacities, synergies exist between levels reported by respondents in one function and those reported in other functions (statistically significant coefficients of correlation ranging from ρ=0.22 to ρ=0.43). Third, synergies exist across dimensions of decentralization, particularly in terms of an overall indicator of institutional capacities (significantly correlated with both overall decision space (ρ=0.39) and accountability (ρ=0.23)). This study demonstrates that decentralization is a varied experience: some district-level officials make greater use of decision space than others, and those who do so also tend to have more capacity to make decisions and are held more accountable to elected local officials for such choices. These findings suggest that Pakistan's decentralization policy should focus on synergies among dimensions of decentralization to encourage more use of de jure decision space, work toward more uniform institutional capacity, and encourage greater accountability to local elected officials. Copyright © 2010 Elsevier Ltd. All rights reserved.

  20. Flexible Early Warning Systems with Workflows and Decision Tables

    NASA Astrophysics Data System (ADS)

    Riedel, F.; Chaves, F.; Zeiner, H.

    2012-04-01

    Decision support systems that facilitate communication and collaboration are an essential part of early warning systems and systems for crisis management. Often, official policies specify how different organizations collaborate and what information is communicated to whom. For early warning systems it is crucial that information is exchanged dynamically in a timely manner and that all participants get exactly the information they need to fulfil their role in the crisis management process. Information technology obviously lends itself to automating parts of the process. We have found, however, that in current operational systems the information logistics processes are hard-coded, even though they are subject to change. In addition, systems are tailored to the policies and requirements of a particular organization, and changes can require major software refactoring. We seek to develop a system that can be deployed and adapted to multiple organizations with different, dynamic runtime policies. A major requirement for such a system is that changes can be applied locally without affecting larger parts of the system. In addition to the flexibility regarding changes in policies and processes, the system needs to be able to evolve; when new information sources become available, it should be possible to integrate and use these in the decision process. In general, this kind of flexibility comes with a significant increase in complexity, which implies that only IT professionals can maintain a system that can be reconfigured and adapted; end-users are unable to utilise the provided flexibility. In the business world similar problems arise, and previous work has suggested using business process management systems (BPMS) or workflow management systems (WfMS) to guide and automate early warning processes or crisis management plans. However, the usability and flexibility of current WfMS are limited, because current notations and user interfaces are still not suitable for end-users, and workflows are usually only suited for rigid processes. We show how improvements can be achieved by using decision tables and rule-based adaptive workflows. Decision tables have been shown to be an intuitive tool that domain experts can use to express rule sets that can be interpreted automatically at runtime. Adaptive workflows use a rule-based approach to increase the flexibility of workflows by providing mechanisms to adapt workflows based on context changes, human intervention and availability of services. The combination of workflows, decision tables and rule-based adaptation creates a framework that opens up new possibilities for flexible and adaptable workflows, especially for use in early warning and crisis management systems.
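
    A minimal sketch of the decision-table idea described above: domain experts fill in condition/action rows, and the system evaluates them against the current context at runtime. The hazard classes, thresholds and recipient roles are invented for illustration.

      # Illustrative decision table for an early-warning information-logistics step.
      # Each row: conditions on the context -> who receives which message.
      DECISION_TABLE = [
          # (hazard,    min severity, recipients,                    message template)
          ("flood",     3,            ["civil_protection", "mayor"], "Evacuate zone {zone}"),
          ("flood",     1,            ["civil_protection"],          "Monitor gauges in zone {zone}"),
          ("wildfire",  2,            ["fire_brigade"],              "Dispatch units to zone {zone}"),
      ]

      def decide(context):
          """Return the actions of the first matching row (rows are ordered by priority)."""
          for hazard, min_sev, recipients, template in DECISION_TABLE:
              if context["hazard"] == hazard and context["severity"] >= min_sev:
                  return [(r, template.format(**context)) for r in recipients]
          return []

      print(decide({"hazard": "flood", "severity": 3, "zone": "B"}))
      # -> [('civil_protection', 'Evacuate zone B'), ('mayor', 'Evacuate zone B')]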

  1. The institutionalization of River Basin Management as politics of scale - Insights from Mongolia

    NASA Astrophysics Data System (ADS)

    Houdret, Annabelle; Dombrowsky, Ines; Horlemann, Lena

    2014-11-01

    River Basin Management (RBM) as an approach to sustainable water use has become the dominant model of water governance. Its introduction, however, entails a fundamental realignment and rescaling of water-sector institutions along hydrological boundaries. Creating such a new governance scale is inherently political, and is being described as politics of scale. This paper analyzes how the politics of scale play out in the institutionalization of RBM in Mongolia. It furthermore scrutinizes the role of the broader political decentralization process in the introduction of RBM, an issue that has so far received little attention. Finally, it assesses whether the river basin is an adequate water management scale in Mongolia. This article finds that institutionalizing RBM in Mongolia is indeed a highly political negotiation process that does not only concern the choice of the governance scale, but also its detailed institutional design. It furthermore reveals that Mongolia's incomplete political decentralization process has for a long time negatively impacted the decentralization of water-related tasks and the implementation of RBM. However, the 2011 Budget Law and the 2012 Water Law provide for a fiscal strengthening of local governments and clearer sharing of responsibilities among the various different institutions involved in water management. Nevertheless, only if the 2012 Water Law is complemented by adequate by-laws - and if the newly created river basin institutions are adequately equipped - can RBM be effectively put into practice. This article confirms the usefulness of a politics-of-scale approach to understand scalar practices and changes in water management. However, the article also argues for a broadening of the analytical perspective to take the interdependencies between changes in water governance and other political processes, such as decentralization, into account.

  2. RESTFul based heterogeneous Geoprocessing workflow interoperation for Sensor Web Service

    NASA Astrophysics Data System (ADS)

    Yang, Chao; Chen, Nengcheng; Di, Liping

    2012-10-01

    Advanced sensors on board satellites offer detailed Earth observations. A workflow is one approach for designing, implementing and constructing a flexible and live link between these sensors' resources and users. It can coordinate, organize and aggregate distributed sensor Web services to meet the requirements of a complex Earth observation scenario. A RESTful-based workflow interoperation method is proposed to integrate heterogeneous workflows into an interoperable unit. The Atom protocols are applied to describe and manage workflow resources. The XML Process Definition Language (XPDL) and Business Process Execution Language (BPEL) workflow standards are applied to structure, separately, a workflow that accesses sensor information and one that processes it. Then, a scenario for nitrogen dioxide (NO2) from a volcanic eruption is used to investigate the feasibility of the proposed method. The RESTful-based workflow interoperation system can describe, publish, discover, access and coordinate heterogeneous Geoprocessing workflows.
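
    The RESTful interaction pattern the abstract describes can be sketched as follows, with workflow definitions (for example an XPDL document for the sensor-access workflow) exposed as addressable resources; the service root, endpoints and payloads are hypothetical, not those of the authors' system.

      # Hypothetical RESTful publication, discovery and invocation of workflow resources.
      import requests

      BASE = "http://workflows.example.org"                        # assumed service root

      # Publish a sensor-access workflow definition (e.g. an XPDL document) as a resource.
      xpdl_doc = "<Package Id='no2_sensor_access'>...</Package>"   # placeholder definition
      created = requests.post(f"{BASE}/workflows", data=xpdl_doc,
                              headers={"Content-Type": "application/xml"})
      workflow_url = created.headers.get("Location")               # URI of the new resource

      # Later, another (heterogeneous) workflow discovers it and triggers an execution.
      listing = requests.get(f"{BASE}/workflows", params={"keyword": "NO2"})
      run = requests.post(f"{workflow_url}/executions",
                          json={"region": "volcanic_plume", "date": "2010-04-15"})
      print(listing.status_code, run.status_code)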

  3. Workflow as a Service in the Cloud: Architecture and Scheduling Algorithms

    PubMed Central

    Wang, Jianwu; Korambath, Prakashan; Altintas, Ilkay; Davis, Jim; Crawl, Daniel

    2017-01-01

    With more and more workflow systems adopting the cloud as their execution environment, it becomes increasingly challenging to efficiently manage various workflows, virtual machines (VMs) and workflow execution on VM instances. To make the system scalable and easy to extend, we design a Workflow as a Service (WFaaS) architecture with independent services. A core part of the architecture is how to efficiently respond to continuous workflow requests from users and schedule their executions in the cloud. Based on different targets, we propose four heuristic workflow scheduling algorithms for the WFaaS architecture, and analyze the differences and best uses of the algorithms in terms of performance, cost and the price/performance ratio via experimental studies. PMID:29399237
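
    The abstract does not spell out the four heuristics, but a sketch of the kind of greedy rule such a WFaaS scheduler might apply is shown below: each incoming workflow goes to the VM with the earliest estimated finish time, and a new VM is started (at extra cost) only when the deadline would otherwise be missed. All parameters are invented.

      # Toy greedy scheduler for workflow requests arriving at a WFaaS layer.
      VM_STARTUP = 60.0   # seconds to boot a new VM (assumed)

      def schedule(workflows, vms):
          """workflows: list of (name, runtime, deadline); vms: list of busy-until times."""
          plan = []
          for name, runtime, deadline in sorted(workflows, key=lambda w: w[2]):
              i = min(range(len(vms)), key=lambda j: vms[j])     # earliest available VM
              finish = vms[i] + runtime
              if finish > deadline:                              # deadline would be missed:
                  vms.append(VM_STARTUP)                         # provision a new VM instead
                  i, finish = len(vms) - 1, VM_STARTUP + runtime
              vms[i] = finish
              plan.append((name, i, finish))
          return plan

      print(schedule([("wfA", 120, 400), ("wfB", 300, 350), ("wfC", 50, 500)], [0.0]))
      # -> [('wfB', 0, 300.0), ('wfA', 1, 180.0), ('wfC', 1, 230.0)]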

  4. Commentary on "Finance, Management, and Costs of Public and Private Schools in Indonesia" and "Do Local Contributions Affect the Efficiency of Public Primary Schools?"

    ERIC Educational Resources Information Center

    Berger, Mark C.

    1996-01-01

    Studies on Indonesia and the Philippines in this special issue examine how local financial control affects costs of providing primary schooling. In both countries, schools with greater financial decentralization operated more efficiently. These results have important implications for U.S. schools, where decentralization reforms in Kentucky and…

  5. A Two-Stage Probabilistic Approach to Manage Personal Worklist in Workflow Management Systems

    NASA Astrophysics Data System (ADS)

    Han, Rui; Liu, Yingbo; Wen, Lijie; Wang, Jianmin

    The application of workflow scheduling to managing an individual actor's personal worklist is one area that can bring great improvement to business processes. However, current deterministic work cannot adapt to the dynamics and uncertainties in the management of personal worklists. For such an issue, this paper proposes a two-stage probabilistic approach which aims at assisting actors to flexibly manage their personal worklists. To be specific, the approach analyzes, at the first stage, every activity instance's continuous probability of satisfying its deadline. Based on this stochastic analysis result, at the second stage, an innovative scheduling strategy is proposed to minimize the overall deadline violation cost for an actor's personal worklist. Simultaneously, the strategy recommends to the actor a feasible worklist of activity instances which meet the required bottom line of successful execution. The effectiveness of our approach is evaluated in a real-world workflow management system and with large-scale simulation experiments.
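
    A sketch of the two-stage idea under simplifying assumptions the abstract does not state: stage one estimates each activity instance's probability of meeting its deadline from a normally distributed remaining duration; stage two orders the worklist by expected violation cost and keeps only instances whose success probability clears a threshold. All figures are invented.

      # Two-stage sketch: (1) deadline-satisfaction probability, (2) ordering and filtering.
      from math import erf, sqrt

      def p_on_time(mean, sd, time_left):
          """P(duration <= time_left) for a Normal(mean, sd) duration (assumed model)."""
          return 0.5 * (1.0 + erf((time_left - mean) / (sd * sqrt(2.0))))

      def plan_worklist(items, threshold=0.5):
          """items: (name, mean duration, sd, time left, violation cost)."""
          scored = [(name, p_on_time(m, s, t), cost) for name, m, s, t, cost in items]
          # Work first on instances whose expected violation cost is highest.
          scored.sort(key=lambda x: (1.0 - x[1]) * x[2], reverse=True)
          return [(name, round(p, 2)) for name, p, cost in scored if p >= threshold]

      print(plan_worklist([("review", 4, 1.0, 5, 10),
                           ("approve", 2, 0.5, 2, 50),
                           ("archive", 1, 0.2, 6, 1)]))
      # -> [('approve', 0.5), ('review', 0.84), ('archive', 1.0)]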

  6. Contextual cloud-based service oriented architecture for clinical workflow.

    PubMed

    Moreno-Conde, Jesús; Moreno-Conde, Alberto; Núñez-Benjumea, Francisco J; Parra-Calderón, Carlos

    2015-01-01

    Regarding the acceptance of systems within the healthcare domain, multiple papers have highlighted the importance of integrating tools with the clinical workflow. This paper analyses how clinical context management could be deployed in order to promote the adoption of advanced cloud services within the clinical workflow. This deployment will be able to be integrated with the specifications promoted by the eHealth European Interoperability Framework. Throughout this paper, a cloud-based service-oriented architecture is proposed. This architecture implements a context management system aligned with the HL7 standard known as CCOW.

  7. Overcoming Barriers to Technology Adoption in Small Manufacturing Enterprises (SMEs)

    DTIC Science & Technology

    2003-06-01

    automates quote-generation, order-processing workflow management, performance analysis, and accounting functions. Ultimately, it will enable Magdic...that Magdic implement an MES instead. The MES, in addition to solving the problem of document management, would automate quote-generation, order processing, workflow management, performance analysis, and accounting functions. To help Magdic personnel learn about the MES, TIDE personnel provided

  8. Development of a novel imaging informatics-based system with an intelligent workflow engine (IWEIS) to support imaging-based clinical trials

    PubMed Central

    Wang, Ximing; Liu, Brent J; Martinez, Clarisa; Zhang, Xuejun; Winstein, Carolee J

    2015-01-01

    Imaging-based clinical trials can benefit from a solution to efficiently collect, analyze, and distribute multimedia data at various stages within the workflow. Currently, the data management needs of these trials are typically addressed with custom-built systems. However, software development of custom-built systems for versatile workflows can be resource-consuming. To address these challenges, we present a system with a workflow engine for imaging-based clinical trials. The system enables a project coordinator to build a data collection and management system specifically related to the study protocol workflow without programming. A Web Access to DICOM Objects (WADO) module with novel features is integrated to further facilitate imaging-related studies. The system was initially evaluated in an imaging-based rehabilitation clinical trial. The evaluation shows that the cost of system development can be much reduced compared with a custom-built system. By providing a solution to customize a system and automate the workflow, the system will save development time and reduce errors, especially for imaging-based clinical trials. PMID:25870169

  9. Attitudes and perceptions of stakeholders on decentralization of health services in Uganda: the case of Lira and Apac districts.

    PubMed

    Anokbonggo, W W; Ogwal-Okeng, J W; Ross-Degnan, D; Aupont, O

    2004-02-01

    In Uganda, the decentralization of administrative functions, management, and responsibility for health care to districts, which began in 1994, resulted in fundamental changes in health care delivery. Since the introduction of the policy in Uganda, little information has been available on stakeholders' perceptions about the benefits of the policy and how decentralization affected health care delivery. The objectives were to identify the perceptions and beliefs of key stakeholders on the impact and process of decentralization and on the operations of health services in two districts in Uganda, and to report their suggestions to improve future implementation of similar policies. We used qualitative research methods that included focus group discussions with 90 stakeholders from both study districts. The sample population comprised 12 health workers from the two hospitals, 11 district health administrators, and 67 Local Council Leaders. The main outcomes of interest were the perceptions and concerns of stakeholders on the impact of decentralization on district health services. There was a general consensus that decentralization empowered local administrative and political decision-making. Among stakeholders, the policy was perceived to have created a sense of ownership and responsibility. Major problems that were said to be associated with decentralization included political harassment of civil servants, increased nepotism, inadequate financial resources, and mismanagement of resources. This study elicited perceptions about critical factors upon which successful implementation of the decentralization policy depended. These included: appreciation of the role of all stakeholders by district politicians; adequate availability and efficient utilization of resources; reasonably developed infrastructure prior to the policy change; appropriate sensitisation and training of those implementing policies; and the good will and active involvement of the local community. In the absence of these factors, implementation of decentralization of services to districts may not immediately make economic and administrative sense.

  10. How to Take HRMS Process Management to the Next Level with Workflow Business Event System

    NASA Technical Reports Server (NTRS)

    Rajeshuni, Sarala; Yagubian, Aram; Kunamaneni, Krishna

    2006-01-01

    Oracle Workflow with the Business Event System offers a complete process management solution for enterprises to manage business processes cost-effectively. Using Workflow event messaging, event subscriptions, AQ Servlet and advanced queuing technologies, this presentation will demonstrate the step-by-step design and implementation of system solutions in order to integrate two dissimilar systems and establish communication remotely. As a case study, the presentation walks you through the process of propagating organization name changes in other applications that originated from the HRMS module without changing applications code. The solution can be applied to your particular business cases for streamlining or modifying business processes across Oracle and non-Oracle applications.

  11. Public Managers Should Be Proactive

    ERIC Educational Resources Information Center

    Carlson, Thomas S.

    1976-01-01

    Future public managers should be proactive by creating management processes before problems arise. Planning prevents reactive or crisis managing. Future managers should also be prepared to meet dilemmas and paradoxes such as centralization versus decentralization of decision-making and work processes, politics versus administration dichotomy, and…

  12. Application of decentralized cooperative problem solving in dynamic flexible scheduling

    NASA Astrophysics Data System (ADS)

    Guan, Zai-Lin; Lei, Ming; Wu, Bo; Wu, Ya; Yang, Shuzi

    1995-08-01

    The object of this study is to discuss an intelligent solution to the problem of task allocation in shop-floor scheduling. For this purpose, the technique of distributed artificial intelligence (DAI) is applied. Intelligent agents (IAs) are used to realize decentralized cooperation, and negotiation is realized by using message passing based on the contract net model. Multiple agents, such as manager agents, workcell agents, and workstation agents, make game-like decisions based on multiple-criteria evaluations. This procedure of decentralized cooperative problem solving makes local scheduling possible. By integrating these multiple local schedules, dynamic flexible scheduling for the whole shop-floor production can be realized.
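
    A small sketch of the contract-net negotiation described above: a manager agent announces a task, workcell agents bid according to a local criterion (here, current load), and the task is awarded to the best bid. Agent names and the bidding criterion are illustrative, not taken from the paper.

      # Contract-net sketch: announce -> bid -> award, with in-process message passing.
      class WorkcellAgent:
          def __init__(self, name, load):
              self.name, self.load = name, load
          def bid(self, task):
              # Bid = estimated completion time given the current queue (lower is better).
              return self.load + task["effort"]

      class ManagerAgent:
          def __init__(self, workcells):
              self.workcells = workcells
          def allocate(self, task):
              bids = {w.name: w.bid(task) for w in self.workcells}   # announcement + bids
              winner = min(bids, key=bids.get)                       # award to the best bid
              next(w for w in self.workcells if w.name == winner).load += task["effort"]
              return winner, bids

      manager = ManagerAgent([WorkcellAgent("cell_A", 3), WorkcellAgent("cell_B", 1)])
      print(manager.allocate({"id": "job-17", "effort": 2}))
      # -> ('cell_B', {'cell_A': 5, 'cell_B': 3})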

  13. Defining Usability Heuristics for Adoption and Efficiency of an Electronic Workflow Document Management System

    ERIC Educational Resources Information Center

    Fuentes, Steven

    2017-01-01

    Usability heuristics have been established for different uses and applications as general guidelines for user interfaces. These can affect the implementation of industry solutions and play a significant role regarding cost reduction and process efficiency. The area of electronic workflow document management (EWDM) solutions, also known as…

  14. Towards an intelligent hospital environment: OR of the future.

    PubMed

    Sutherland, Jeffrey V; van den Heuvel, Willem-Jan; Ganous, Tim; Burton, Matthew M; Kumar, Animesh

    2005-01-01

    Patients, providers, payers, and government demand more effective and efficient healthcare services, and the healthcare industry needs innovative ways to re-invent core processes. Business process reengineering (BPR) showed that adopting new hospital information systems can leverage this transformation and that workflow management technologies can automate process management. Our research indicates that workflow technologies in healthcare require real-time patient monitoring, detection of adverse events, and adaptive responses to breakdowns in normal processes. Adaptive workflow systems are rarely implemented, making current workflow implementations inappropriate for healthcare. The advent of evidence-based medicine, guideline-based practice, and a better understanding of cognitive workflow, combined with novel technologies including Radio Frequency Identification (RFID), mobile/wireless technologies, internet workflow, intelligent agents, and Service Oriented Architectures (SOA), opens up new and exciting ways of automating business processes. Total situational awareness of events, timing, and location of healthcare activities can generate self-organizing change in the behaviors of humans and machines. A test bed of a novel approach towards continuous process management was designed for the new Weinburg Surgery Building at the University of Maryland Medical Center. Early results based on clinical process mapping and analysis of patient flow bottlenecks demonstrated 100% improvement in delivery of supplies and instruments at surgery start time. This work has been directly applied to the design of the DARPA Trauma Pod research program, in which robotic surgery will be performed on wounded soldiers on the battlefield.

  15. Web tools for predictive toxicology model building.

    PubMed

    Jeliazkova, Nina

    2012-07-01

    The development and use of web tools in chemistry have accumulated more than 15 years of history already. Powered by advances in Internet technologies, the current generation of web systems is starting to expand into areas traditional for desktop applications. The web platforms integrate data storage, cheminformatics and data analysis tools. The ease of use and the collaborative potential of the web is compelling, despite the challenges. The topic of this review is a set of recently published web tools that facilitate predictive toxicology model building. The focus is on software platforms offering web access to chemical structure-based methods, although some of the frameworks could also provide bioinformatics or hybrid data analysis functionalities. A number of historical and current developments are cited. In order to provide a comparable assessment, the following characteristics are considered: support for workflows, descriptor calculations, visualization, modeling algorithms, data management and data sharing capabilities, availability of GUI or programmatic access, and implementation details. The success of the Web is largely due to its highly decentralized, yet sufficiently interoperable model for information access. The expected future convergence between cheminformatics and bioinformatics databases provides new challenges toward management and analysis of large data sets. The web tools in predictive toxicology will likely continue to evolve toward the right mix of flexibility, performance, scalability, interoperability, sets of unique features offered, friendly user interfaces, programmatic access for advanced users, platform independence, results reproducibility, curation and crowdsourcing utilities, collaborative sharing and secure access.

  16. Responsibility Center Management: An Assessment of RCM at IUPUI.

    ERIC Educational Resources Information Center

    Robbins, David L.; Rooney, Patrick Michael

    1995-01-01

    Indiana University-Purdue University at Indianapolis is the first public institution to implement Responsibility Center Management (RCM), a comprehensive, decentralized, incentive-based financial management system. RCM has strengthened academic planning, budget management, general accountability, and multiyear fiscal planning. Organizational…

  17. Identification and induction of human, social, and cultural capitals through an experimental approach to stormwater management

    EPA Science Inventory

    Decentralized stormwater management is based on the dispersal of stormwater management practices (SWMP) throughout a watershed to manage stormwater runoff volume and potentially restore natural hydrologic processes. This approach to stormwater management is increasingly popular b...

  18. Managing and Communicating Operational Workflow: Designing and Implementing an Electronic Outpatient Whiteboard.

    PubMed

    Steitz, Bryan D; Weinberg, Stuart T; Danciu, Ioana; Unertl, Kim M

    2016-01-01

    Healthcare team members in emergency department contexts have used electronic whiteboard solutions to help manage operational workflow for many years. Ambulatory clinic settings have highly complex operational workflow, but are still limited in electronic assistance to communicate and coordinate work activities. To describe and discuss the design, implementation, use, and ongoing evolution of a coordination and collaboration tool supporting ambulatory clinic operational workflow at Vanderbilt University Medical Center (VUMC). The outpatient whiteboard tool was initially designed to support healthcare work related to an electronic chemotherapy order-entry application. After a highly successful initial implementation in an oncology context, a high demand emerged across the organization for the outpatient whiteboard implementation. Over the past 10 years, developers have followed an iterative user-centered design process to evolve the tool. The electronic outpatient whiteboard system supports 194 separate whiteboards and is accessed by over 2800 distinct users on a typical day. Clinics can configure their whiteboards to support unique workflow elements. Since initial release, features such as immunization clinical decision support have been integrated into the system, based on requests from end users. The success of the electronic outpatient whiteboard demonstrates the usefulness of an operational workflow tool within the ambulatory clinic setting. Operational workflow tools can play a significant role in supporting coordination, collaboration, and teamwork in ambulatory healthcare settings.

  19. Confidentiality Protection of User Data and Adaptive Resource Allocation for Managing Multiple Workflow Performance in Service-Based Systems

    ERIC Educational Resources Information Center

    An, Ho

    2012-01-01

    In this dissertation, two interrelated problems of service-based systems (SBS) are addressed: protecting users' data confidentiality from service providers, and managing performance of multiple workflows in SBS. Current SBSs pose serious limitations to protecting users' data confidentiality. Since users' sensitive data is sent in…

  20. Changes in the cardiac rehabilitation workflow process needed for the implementation of a self-management system.

    PubMed

    Wiggers, Anne-Marieke; Vosbergen, Sandra; Kraaijenhagen, Roderik; Jaspers, Monique; Peek, Niels

    2013-01-01

    E-health interventions are of growing importance for the self-management of chronic conditions. This study aimed to describe the process adaptations that are needed in cardiac rehabilitation (CR) to implement a self-management system, called MyCARDSS. We created a generic workflow model based on interviews and observations at three CR clinics. Subsequently, a workflow model of the ideal situation after implementation of MyCARDSS was created. We found that the implementation will increase the complexity of existing working procedures because 1) not all patients will use MyCARDSS, 2) there is a transfer of tasks and responsibilities from professionals to patients, and 3) information in MyCARDSS needs to be synchronized with the EPR system for professionals.

  1. Optimizing CyberShake Seismic Hazard Workflows for Large HPC Resources

    NASA Astrophysics Data System (ADS)

    Callaghan, S.; Maechling, P. J.; Juve, G.; Vahi, K.; Deelman, E.; Jordan, T. H.

    2014-12-01

    The CyberShake computational platform is a well-integrated collection of scientific software and middleware that calculates 3D simulation-based probabilistic seismic hazard curves and hazard maps for the Los Angeles region. Currently each CyberShake model comprises about 235 million synthetic seismograms from about 415,000 rupture variations computed at 286 sites. CyberShake integrates large-scale parallel and high-throughput serial seismological research codes into a processing framework in which early stages produce files used as inputs by later stages. Scientific workflow tools are used to manage the jobs, data, and metadata. The Southern California Earthquake Center (SCEC) developed the CyberShake platform using USC High Performance Computing and Communications systems and open-science NSF resources. CyberShake calculations were migrated to the NSF Track 1 system NCSA Blue Waters when it became operational in 2013, via an interdisciplinary team approach including domain scientists, computer scientists, and middleware developers. Due to the excellent performance of Blue Waters and CyberShake software optimizations, we reduced the makespan (a measure of wallclock time-to-solution) of a CyberShake study from 1467 to 342 hours. We will describe the technical enhancements behind this improvement, including judicious introduction of new GPU software, improved scientific software components, increased workflow-based automation, and Blue Waters-specific workflow optimizations. Our CyberShake performance improvements highlight the benefits of scientific workflow tools. The CyberShake workflow software stack includes the Pegasus Workflow Management System (Pegasus-WMS, which includes Condor DAGMan), HTCondor, and Globus GRAM, with Pegasus-mpi-cluster managing the high-throughput tasks on the HPC resources. The workflow tools handle data management, automatically transferring about 13 TB back to SCEC storage. We will present performance metrics from the most recent CyberShake study, executed on Blue Waters. We will compare the performance of CPU and GPU versions of our large-scale parallel wave propagation code, AWP-ODC-SGT. Finally, we will discuss how these enhancements have enabled SCEC to move forward with plans to increase the CyberShake simulation frequency to 1.0 Hz.

  2. Agent-based Modeling to Simulate the Diffusion of Water-Efficient Innovations and the Emergence of Urban Water Sustainability

    NASA Astrophysics Data System (ADS)

    Kanta, L.; Giacomoni, M.; Shafiee, M. E.; Berglund, E.

    2014-12-01

    The sustainability of water resources is threatened by urbanization, as increasing demands deplete water availability, and changes to the landscape alter runoff and the flow regime of receiving water bodies. Utility managers typically manage urban water resources through the use of centralized solutions, such as large reservoirs, which may be limited in their ability to balance the needs of urbanization and ecological systems. Decentralized technologies, on the other hand, may improve the health of the water resources system and deliver urban water services. For example, low impact development technologies, such as rainwater harvesting, and water-efficient technologies, such as low-flow faucets and toilets, may be adopted by households to retain rainwater and reduce demands, offsetting the need for new centralized infrastructure. Decentralized technologies may create new complexities in infrastructure and water management, as decentralization depends on community behavior and participation beyond traditional water resources planning. Messages about water shortages and water quality from peers and the water utility managers can influence the adoption of new technologies. As a result, feedbacks between consumers and water resources emerge, creating a complex system. This research develops a framework to simulate the diffusion of water-efficient innovations and the sustainability of urban water resources, by coupling models of households in a community, hydrologic models of a water resources system, and a cellular automata model of land use change. Agent-based models are developed to simulate the land use and water demand decisions of individual households, and behavioral rules are encoded to simulate communication with other agents and adoption of decentralized technologies, using a model of the diffusion of innovation. The framework is applied for an illustrative case study to simulate water resources sustainability over a long-term planning horizon.
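
    A toy version of the diffusion-of-innovation rule sketched in the abstract (not the authors' coupled hydrologic/cellular-automata model): each household agent adopts a water-efficient technology with a probability that grows with the fraction of adopting neighbours and with utility messaging during shortages. All parameters are invented.

      # Toy agent-based diffusion of a water-efficient technology on a ring of households.
      import random
      random.seed(1)

      N, STEPS = 100, 20
      adopted = [False] * N
      adopted[0] = True                              # a single early adopter

      def step(shortage_msg):
          base = 0.05 if shortage_msg else 0.01      # utility messaging boosts adoption (assumed)
          snapshot = adopted[:]
          for i in range(N):
              if snapshot[i]:
                  continue
              neighbours = [snapshot[(i - 1) % N], snapshot[(i + 1) % N]]
              p = base + 0.4 * (sum(neighbours) / len(neighbours))
              if random.random() < p:
                  adopted[i] = True

      for t in range(STEPS):
          step(shortage_msg=(t > 10))                # a shortage announced after step 10
      print(f"adoption after {STEPS} steps: {sum(adopted)}/{N} households")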

  3. Different approaches for centralized and decentralized water system management in multiple decision makers' problems

    NASA Astrophysics Data System (ADS)

    Anghileri, D.; Giuliani, M.; Castelletti, A.

    2012-04-01

    There is general agreement that one of the most challenging issues related to water system management is the presence of many, often conflicting interests as well as the presence of several independent decision makers. The traditional approach to multi-objective water systems management is centralized management, in which an ideal central regulator coordinates the operation of the whole system, exploiting all the available information and balancing all the operating objectives. Although this approach yields Pareto-optimal solutions representing the maximum achievable benefit, it is based on assumptions which strongly limit its application in real-world contexts: 1) top-down management, 2) existence of a central regulation institution, 3) complete information exchange within the system, 4) perfect economic efficiency. A bottom-up decentralized approach therefore seems more suitable for real-world applications, since different reservoir operators may maintain their independence. In this work we tested the consequences of a change in the water management approach, moving from a centralized toward a decentralized one. In particular we compared three different cases: the centralized management approach; the independent management approach, where each reservoir operator takes the daily release decision maximizing (or minimizing) his operating objective independently of the others; and an intermediate approach, leading to the Nash equilibrium of the associated game, where different reservoir operators try to model the behaviours of the other operators. The three approaches are demonstrated using a test case study composed of two reservoirs regulated for the minimization of flooding in different locations. The operating policies are computed by solving one single multi-objective optimal control problem, in the centralized management approach; multiple single-objective optimization problems, i.e. one for each operator, in the independent case; and using techniques related to game theory to describe the interaction between the two operators, in the last approach. Computational results show that the Pareto-optimal control policies obtained in the centralized approach dominate the control policies of both decentralized management cases and that the so-called price of anarchy increases moving toward the independent management approach. However, the Nash equilibrium solution seems to be the most promising alternative because it represents a good compromise, maximizing management efficiency without limiting the behaviours of the reservoir operators.
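
    A toy illustration of the comparison the abstract makes, under assumptions that are ours rather than the authors': two reservoir operators each choose to hold or release water, and the matrix gives the flood damage at each operator's site. A centralized planner minimizes total damage, a pure-strategy Nash equilibrium is what independent but mutually aware operators settle on, and the ratio of the two totals is the price of anarchy.

      # Toy 2-operator flood game (costs; lower is better). Payoffs are invented.
      ACTIONS = ["hold", "release"]
      COST = {   # (action_1, action_2) -> (damage at site 1, damage at site 2)
          ("hold", "hold"):       (4, 4),
          ("hold", "release"):    (1, 5),
          ("release", "hold"):    (5, 1),
          ("release", "release"): (2, 2),
      }

      def is_nash(a1, a2):
          c1, c2 = COST[(a1, a2)]
          return (all(COST[(x, a2)][0] >= c1 for x in ACTIONS) and
                  all(COST[(a1, y)][1] >= c2 for y in ACTIONS))

      social = {k: sum(v) for k, v in COST.items()}
      central_best = min(social, key=social.get)
      nash = [k for k in COST if is_nash(*k)]
      worst_nash = max(nash, key=lambda k: social[k])
      print("centralized optimum:", central_best, "total damage", social[central_best])
      print("Nash equilibria:", nash,
            "price of anarchy:", social[worst_nash] / social[central_best])
      # Here the centralized optimum ('release', 'release') has total damage 4, the unique
      # Nash equilibrium ('hold', 'hold') has total damage 8, so the price of anarchy is 2.0.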

  4. Combination of decentralized waste drying and SSF techniques for household biowaste minimization and ethanol production.

    PubMed

    Sotiropoulos, A; Vourka, I; Erotokritou, A; Novakovic, J; Panaretou, V; Vakalis, S; Thanos, T; Moustakas, K; Malamis, D

    2016-06-01

    The results of the demonstration of an innovative household biowaste management and treatment scheme, established in two Greek municipalities for the production of lignocellulosic ethanol using dehydrated household biowaste as a substrate, are presented within this research. This is the first time that biowaste drying was tested at a decentralized level for the production of ethanol using the Simultaneous Saccharification and Fermentation (SSF) process at a pilot scale in Greece. The decentralized biowaste drying method proved that household biowaste mass and volume reduction may reach 80% through the dehydration process used. The chemical characteristics related to lignocellulosic ethanol production proved to differ substantially between seasons; thus, special attention should be given to the process applied for ethanol production, mainly regarding the enzyme quality and quantity used during the pretreatment stage. The maximum ethanol production achieved was 29.12 g/L, approximately 60% of the maximum theoretical yield based on the substrate's sugar content. The use of decentralized waste drying as an alternative approach for household biowaste minimization and the production of second-generation ethanol is considered a promising approach for efficient biowaste management and treatment in the future. Copyright © 2016 Elsevier Ltd. All rights reserved.

  5. China's Quest for Management of Academic Medicine.

    ERIC Educational Resources Information Center

    Butler, William T.; Ruma, Steven J.

    1990-01-01

    China's national medical universities are strategically managed by the public health ministry, with increasing decentralization in management and organizational operational planning and control. The universities do not have enough trained personnel to absorb the additional responsibilities, and need training in adapting Western-style management to…

  6. Scientific Data Management (SDM) Center for Enabling Technologies. 2007-2012

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ludascher, Bertram; Altintas, Ilkay

    Over the past five years, our activities have both established Kepler as a viable scientific workflow environment and demonstrated its value across multiple science applications. We have published numerous peer-reviewed papers on the technologies highlighted in this short paper and have given Kepler tutorials at SC06, SC07, SC08, and SciDAC 2007. Our outreach activities have allowed scientists to learn best practices and better utilize Kepler to address their individual workflow problems. Our contributions to advancing the state-of-the-art in scientific workflows have focused on the following areas; progress in each is described in subsequent sections. Workflow development: the development of a deeper understanding of scientific workflows "in the wild" and of the requirements for support tools that allow easy construction of complex scientific workflows. Generic workflow components and templates: the development of generic actors (i.e. workflow components and processes) which can be broadly applied to scientific problems. Provenance collection and analysis: the design of a flexible provenance collection and analysis infrastructure within the workflow environment. Workflow reliability and fault tolerance: the improvement of the reliability and fault-tolerance of workflow environments.

  7. The path dependence of district manager decision-space in Ghana.

    PubMed

    Kwamie, Aku; van Dijk, Han; Ansah, Evelyn K; Agyepong, Irene Akua

    2016-04-01

    The district health system in Ghana today is characterized by high resource-uncertainty and narrow decision-space. This article builds a theory-driven historical case study to describe the influence of path-dependent administrative, fiscal and political decentralization processes on development of the district health system and district manager decision-space. Methods included a non-exhaustive literature review of democratic governance in Ghana, and key informant interviews with high-level health system officials integral to the development of the district health system. Through our analysis we identified four periods of district health system progression: (1) development of the district health system (1970-85); (2) Strengthening District Health Systems Initiative (1986-93); (3) health sector reform planning and creation of the Ghana Health Service (1994-96) and (4) health sector reform implementation (1997-2007). It was observed that district manager decision-space steadily widened during periods (1) and (2), due to increases in managerial profile, and concerted efforts at managerial capacity strengthening. Periods (3) and (4) saw initial augmentation of district health system financing, further widening managerial decision-space. However, the latter half of period 4 witnessed district manager decision-space contraction. Formalization of Ghana Health Service structures influenced by self-reinforcing tendencies towards centralized decision-making, national and donor shifts in health sector financing, and changes in key policy actors all worked to the detriment of the district health system, reversing early gains from bottom-up development of the district health system. Policy feedback mechanisms have been influenced by historical and contemporary sequencing of local government and health sector decentralization. An initial act of administrative decentralization, followed by incomplete political and fiscal decentralization has ensured that the balance of power has remained at national level, with strong vertical accountabilities and dependence of the district on national level. This study demonstrates that the rhetoric of decentralization does not always mirror actual implementation, nor always result in empowered local actors. © The Author 2015. Published by Oxford University Press in association with The London School of Hygiene and Tropical Medicine.

  8. A QoS Management Technique of Urgent Information Provision in ITS Services Using DSRC for Autonomous Base Stations

    NASA Astrophysics Data System (ADS)

    Shimura, Akitoshi; Aizono, Takeiki; Hiraiwa, Masashi; Sugano, Shigeki

    A QoS management technique based on an autonomous decentralized mobility system, which is an autonomous decentralized system enhanced to provide mobile stations with information about urgent roadway situations, is proposed in this paper. This technique enables urgent messages to be flexibly and quickly transmitted to mobile stations by multiple decentralized base stations using dedicated short range communication. It also supports the easy addition of additional base stations. Each station autonomously creates information-delivery communities based on the urgency of the messages it receives through the roadside network and the distances between the senders and receivers. Each station dynamically determines the urgency of messages according to the message content and the speed of the mobile stations. Evaluation of this technique applied to the Smart Gateway system, which provides driving-assistance services to mobile stations through dedicated short-range communication, demonstrated its effectiveness and that it is suitable for actual systems.
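
    The urgency-and-distance rule the abstract describes can be sketched as follows; the content classes, speed threshold and community radius are invented, not taken from the Smart Gateway system.

      # Sketch: derive message urgency from content and receiver speed, then form a
      # delivery community of base stations whose radius grows with urgency.
      URGENCY = {"obstacle": 3, "congestion": 2, "roadworks": 1}   # assumed content classes

      def urgency(msg, receiver_speed_kmh):
          level = URGENCY.get(msg["type"], 1)
          return level + 1 if receiver_speed_kmh > 80 else level   # faster vehicles: more urgent

      def delivery_community(stations, sender_pos, level, radius_per_level=500.0):
          """Base stations within level * radius_per_level metres relay the message."""
          limit = level * radius_per_level
          return [name for name, pos in stations.items() if abs(pos - sender_pos) <= limit]

      stations = {"BS1": 0.0, "BS2": 600.0, "BS3": 1400.0}         # positions along the road (m)
      lvl = urgency({"type": "obstacle"}, receiver_speed_kmh=100)
      print(lvl, delivery_community(stations, sender_pos=0.0, level=lvl))
      # -> 4 ['BS1', 'BS2', 'BS3']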

  9. FluxCTTX: A LIMS-based tool for management and analysis of cytotoxicity assays data

    PubMed Central

    2015-01-01

    Background Cytotoxicity assays have been used by researchers to screen for cytotoxicity in compound libraries. Researchers can either look for cytotoxic compounds or screen "hits" from initial high-throughput drug screens for unwanted cytotoxic effects before investing in their development as a pharmaceutical. These assays may be used as an alternative to animal experimentation and are becoming increasingly important in modern laboratories. However, the execution of these assays at large scale and across different laboratories requires, among other things, the management of protocols, reagents and cell lines, as well as of the data produced, which can be a challenge. The management of all this information is greatly improved by the use of computational tools to save time and guarantee quality. However, a tool designed specifically to perform this task for cytotoxicity assays is not yet available. Results In this work, we have used a workflow-based LIMS (the Flux system) and the Together Workflow Editor as a framework to develop FluxCTTX, a tool for management of data from cytotoxicity assays performed at different laboratories. The main contribution is a workflow that represents all stages of the assay, which has been developed and uploaded into Flux. This workflow models the activities of cytotoxicity assays performed as described in the OECD 129 Guidance Document. Conclusions FluxCTTX presents a solution for the management of the data produced by cytotoxicity assays performed in interlaboratory comparisons. Its adoption will contribute to guaranteeing the quality of activities in the process of cytotoxicity testing and enforce the use of Good Laboratory Practices (GLP). Furthermore, the workflow developed is complete and can be adapted to other contexts and different tests for management of other types of data. PMID:26696462

  10. Wireless remote control clinical image workflow: utilizing a PDA for offsite distribution

    NASA Astrophysics Data System (ADS)

    Liu, Brent J.; Documet, Luis; Documet, Jorge; Huang, H. K.; Muldoon, Jean

    2004-04-01

    Last year at RSNA we presented an application to perform wireless remote control of PACS image distribution utilizing a handheld device such as a Personal Digital Assistant (PDA). This paper describes the clinical experiences, including workflow scenarios, of implementing the PDA application to route exams from the clinical PACS archive server to various locations for offsite distribution of clinical PACS exams. By utilizing this remote control application, radiologists can manage image workflow distribution with a single wireless handheld device without impacting their clinical workflow on diagnostic PACS workstations. A PDA application was designed and developed to allow a physician to perform DICOM Query and C-Move requests from a clinical PACS archive to a CD-burning device for automatic burning of PACS data for offsite distribution. In addition, it was also used for convenient routing of historical PACS exams to the local web server, local workstations, and teleradiology systems. The application was evaluated by radiologists as well as other clinical staff who need to distribute PACS exams to offsite referring physicians' offices and offsite radiologists. An application for image workflow management utilizing wireless technology was implemented in a clinical environment and evaluated. A PDA application was successfully utilized to perform DICOM Query and C-Move requests from the clinical PACS archive to various offsite exam distribution devices. Clinical staff can utilize the PDA to manage image workflow and PACS exam distribution conveniently for offsite consultations by referring physicians and radiologists. This solution allows radiologists to expand their effectiveness in health care delivery, both within the radiology department and offsite, by improving their clinical workflow.
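
    The PDA application described above predates today's open-source toolkits, but the DICOM interaction it performs (query the archive, then request a C-MOVE to a destination such as a CD-burning station) can be sketched with the pynetdicom library; the host, port, AE titles and study UID below are placeholders.

      # Sketch of a C-MOVE request routing a study from a PACS archive to a CD burner.
      from pydicom.dataset import Dataset
      from pynetdicom import AE
      from pynetdicom.sop_class import StudyRootQueryRetrieveInformationModelMove

      ae = AE(ae_title="PDA_ROUTER")
      ae.add_requested_context(StudyRootQueryRetrieveInformationModelMove)

      identifier = Dataset()
      identifier.QueryRetrieveLevel = "STUDY"
      identifier.StudyInstanceUID = "1.2.840.99999.1"              # placeholder UID

      assoc = ae.associate("pacs-archive.example.org", 104, ae_title="PACS_ARCHIVE")
      if assoc.is_established:
          # Ask the archive to push the study to the CD-burning station's AE title.
          for status, _ in assoc.send_c_move(identifier, "CD_BURNER",
                                             StudyRootQueryRetrieveInformationModelMove):
              print("C-MOVE status:", status.Status if status else "timeout")
          assoc.release()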

  11. Workflow-enabled distributed component-based information architecture for digital medical imaging enterprises.

    PubMed

    Wong, Stephen T C; Tjandra, Donny; Wang, Huili; Shen, Weimin

    2003-09-01

    Few information systems today offer a flexible means to define and manage the automated part of radiology processes, which provide clinical imaging services for the entire healthcare organization. Even fewer of them provide a coherent architecture that can easily cope with heterogeneity and inevitable local adaptation of applications and can integrate clinical and administrative information to aid better clinical, operational, and business decisions. We describe an innovative enterprise architecture of image information management systems to fill these needs. Such a system is based on the interplay of production workflow management, distributed object computing, Java and Web techniques, and in-depth domain knowledge in radiology operations. Our design adopts the "4+1" architectural-view approach. In this new architecture, PACS and RIS become one, while user interaction can be automated by customized workflow processes. Clinical service applications are implemented as active components. They can reasonably be substituted by locally adapted applications and can be replicated for fault tolerance and load balancing. Furthermore, the workflow-enabled digital radiology system would provide powerful query and statistical functions for managing resources and improving productivity. This paper will potentially lead to a new direction in image information management. We illustrate the innovative design with examples taken from an implemented system.

  12. Planning Management Training Programs for Organizational Development

    ERIC Educational Resources Information Center

    Alpander, Guvenc G.

    1974-01-01

    To investigate means of converting management development programs into a successful organizational development process, managers' attitudes toward centralization and decentralization of functions and decisions, the importance of performed functions, their personal effectiveness, their managerial style, and what they prefer for executive…

  13. Generation of Conflict Resolution Maneuvers for Air Traffic Management

    DOT National Transportation Integrated Search

    1997-01-01

    We explore the use of distributed on-line motion planning algorithms for multiple mobile agents, in Air Traffic Management Systems (ATMS). The work is motivated by current trends in ATMS to move towards decentralized air traffic management, in which ...

  14. Understanding the dispensary workflow at the Birmingham Free Clinic: a proposed framework for an informatics intervention.

    PubMed

    Fisher, Arielle M; Herbert, Mary I; Douglas, Gerald P

    2016-02-19

    The Birmingham Free Clinic (BFC) in Pittsburgh, Pennsylvania, USA is a free, walk-in clinic that serves medically uninsured populations through the use of volunteer health care providers and an on-site medication dispensary. The introduction of an electronic medical record (EMR) has improved several aspects of clinic workflow. However, pharmacists' tasks involving medication management and dispensing have become more challenging since EMR implementation due to its inability to support workflows between the medical and pharmaceutical services. To inform the design of a systematic intervention, we conducted a needs assessment study to identify workflow challenges and process inefficiencies in the dispensary. We used contextual inquiry to document the dispensary workflow and facilitate identification of critical aspects of intervention design specific to the user. Pharmacists were observed according to contextual inquiry guidelines. Graphical models were produced to aid data and process visualization. We created a list of themes describing workflow challenges and asked the pharmacists to rank them in order of significance to narrow the scope of intervention design. Three pharmacists were observed at the BFC. Observer notes were documented and analyzed to produce 13 themes outlining the primary challenges pharmacists encounter during dispensation at the BFC. The dispensary workflow is labor intensive, redundant, and inefficient when integrated with the clinical service. Observations identified inefficiencies that may benefit from the introduction of informatics interventions including: medication labeling, insufficient process notification, triple documentation, and inventory control. We propose a system for Prescription Management and General Inventory Control (RxMAGIC). RxMAGIC is a framework designed to mitigate workflow challenges and improve the processes of medication management and inventory control. While RxMAGIC is described in the context of the BFC dispensary, we believe it will be generalizable to pharmacies in other low-resource settings, both domestically and internationally.

  15. Developing integrated workflows for the digitisation of herbarium specimens using a modular and scalable approach.

    PubMed

    Haston, Elspeth; Cubey, Robert; Pullan, Martin; Atkins, Hannah; Harris, David J

    2012-01-01

    Digitisation programmes in many institutes frequently involve disparate and irregular funding, diverse selection criteria and scope, with different members of staff managing and operating the processes. These factors have influenced the decision at the Royal Botanic Garden Edinburgh to develop an integrated workflow for the digitisation of herbarium specimens which is modular and scalable, enabling a single overall workflow to be used for all digitisation projects. This integrated workflow comprises three principal elements: a specimen workflow, a data workflow and an image workflow. The specimen workflow is strongly linked to curatorial processes which will impact on the prioritisation, selection and preparation of the specimens. The importance of including a conservation element within the digitisation workflow is highlighted. The data workflow includes the concept of three main categories of collection data: label data, curatorial data and supplementary data. It is shown that each category of data has its own properties which influence the timing of data capture within the workflow. Development of software has been carried out for the rapid capture of curatorial data, and optical character recognition (OCR) software is being used to increase the efficiency of capturing label data and supplementary data. The large number and size of the images have necessitated the inclusion of automated systems within the image workflow.

  16. Distributed architecture and distributed processing mode in urban sewage treatment

    NASA Astrophysics Data System (ADS)

    Zhou, Ruipeng; Yang, Yuanming

    2017-05-01

    Decentralized rural sewage treatment facilities are spread over a broad area, which makes their operation and management difficult. Based on an analysis of rural sewage treatment models and in response to these challenges, we describe the principle, structure and function of a distributed remote monitoring system built around networking and network communication technologies, and we use case analysis to explore the features of the remote monitoring system in the daily operation and management of decentralized rural sewage treatment facilities. Practice shows that the remote monitoring system provides technical support for the long-term operation and effective supervision of the facilities and reduces operating, maintenance and supervision costs.

  17. Decentralized asset management for collaborative sensing

    NASA Astrophysics Data System (ADS)

    Malhotra, Raj P.; Pribilski, Michael J.; Toole, Patrick A.; Agate, Craig

    2017-05-01

    There has been increased impetus to leverage Small Unmanned Aerial Systems (SUAS) for collaborative sensing applications in which many platforms work together to provide critical situation awareness in dynamic environments. Such applications require critical sensor observations to be made at the right place and time to facilitate the detection, tracking, and classification of ground-based objects. This further requires rapid response to real-world events and the balancing of multiple, competing mission objectives. In this context, human operators become overwhelmed with the management of many platforms. Further, current automated planning paradigms tend to be centralized and do not scale up well to many collaborating platforms. We introduce a decentralized approach based upon information theory and distributed fusion, which enables us to scale up to large numbers of collaborating SUAS platforms. This is exercised against a military application involving the autonomous detection, tracking, and classification of critical mobile targets. We further show that, based upon Monte Carlo simulation results, our decentralized approach outperforms more static management strategies employed by human operators and achieves similar results to a centralized approach while being scalable and robust to degradation of communication. Finally, we describe the limitations of our approach and future directions for our research.
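
    The abstract does not give the authors' algorithm, but the core idea of information-theoretic sensor tasking can be sketched generically: each platform scores candidate objects by the expected reduction in classification entropy from one more observation and claims the highest-scoring one. The platform names, sensor confusion matrix, and beliefs below are hypothetical, and a real decentralized system would deconflict these choices over the fusion network.

        import math

        def entropy(p):
            """Shannon entropy (bits) of a discrete belief distribution."""
            return -sum(x * math.log2(x) for x in p if x > 0)

        def expected_posterior_entropy(belief, confusion):
            """Expected entropy after one observation with the given confusion matrix."""
            result = 0.0
            for z in range(len(belief)):                     # possible measurements
                pz = sum(confusion[z][c] * belief[c] for c in range(len(belief)))
                if pz == 0:
                    continue
                posterior = [confusion[z][c] * belief[c] / pz for c in range(len(belief))]
                result += pz * entropy(posterior)
            return result

        # Current class beliefs for three ground objects, e.g. [car, truck, decoy].
        beliefs = {
            "obj1": [0.4, 0.4, 0.2],
            "obj2": [0.9, 0.05, 0.05],
            "obj3": [0.34, 0.33, 0.33],
        }

        # Simple sensor model: probability of reporting class z given true class c.
        confusion = [
            [0.8, 0.1, 0.1],
            [0.1, 0.8, 0.1],
            [0.1, 0.1, 0.8],
        ]

        # Each platform greedily claims the unassigned object with the largest
        # expected information gain.
        platforms = ["suas_a", "suas_b"]
        unassigned = set(beliefs)
        for platform in platforms:
            best = max(
                sorted(unassigned),
                key=lambda obj: entropy(beliefs[obj])
                - expected_posterior_entropy(beliefs[obj], confusion),
            )
            unassigned.remove(best)
            gain = entropy(beliefs[best]) - expected_posterior_entropy(beliefs[best], confusion)
            print(f"{platform} -> {best} (expected gain {gain:.2f} bits)")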

  18. Managing and Communicating Operational Workflow

    PubMed Central

    Weinberg, Stuart T.; Danciu, Ioana; Unertl, Kim M.

    2016-01-01

    Background: Healthcare team members in emergency department contexts have used electronic whiteboard solutions to help manage operational workflow for many years. Ambulatory clinic settings have highly complex operational workflow, but are still limited in electronic assistance to communicate and coordinate work activities. Objective: To describe and discuss the design, implementation, use, and ongoing evolution of a coordination and collaboration tool supporting ambulatory clinic operational workflow at Vanderbilt University Medical Center (VUMC). Methods: The outpatient whiteboard tool was initially designed to support healthcare work related to an electronic chemotherapy order-entry application. After a highly successful initial implementation in an oncology context, a high demand emerged across the organization for the outpatient whiteboard implementation. Over the past 10 years, developers have followed an iterative user-centered design process to evolve the tool. Results: The electronic outpatient whiteboard system supports 194 separate whiteboards and is accessed by over 2800 distinct users on a typical day. Clinics can configure their whiteboards to support unique workflow elements. Since initial release, features such as immunization clinical decision support have been integrated into the system, based on requests from end users. Conclusions: The success of the electronic outpatient whiteboard demonstrates the usefulness of an operational workflow tool within the ambulatory clinic setting. Operational workflow tools can play a significant role in supporting coordination, collaboration, and teamwork in ambulatory healthcare settings. PMID:27081407

  19. The future of scientific workflows

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Deelman, Ewa; Peterka, Tom; Altintas, Ilkay

    Today’s computational, experimental, and observational sciences rely on computations that involve many related tasks. The success of a scientific mission often hinges on the computer automation of these workflows. In April 2015, the US Department of Energy (DOE) invited a diverse group of domain and computer scientists from national laboratories supported by the Office of Science, the National Nuclear Security Administration, from industry, and from academia to review the workflow requirements of DOE’s science and national security missions, to assess the current state of the art in science workflows, to understand the impact of emerging extreme-scale computing systems on those workflows, and to develop requirements for automated workflow management in future and existing environments. This article is a summary of the opinions of over 50 leading researchers attending this workshop. We highlight use cases, computing systems, workflow needs and conclude by summarizing the remaining challenges this community sees that inhibit large-scale scientific workflows from becoming a mainstream tool for extreme-scale science.

  20. The economics of central billing offices.

    PubMed

    Woodcock, E; Nguyen, L

    2000-01-01

    The anticipation of economies of scale in physician billing has led many medical practices to consolidate their billing operations. This article analyzes these economies of scale, comparing performance indicators from centralized and decentralized operations. While consolidation provides compliance, control and information, diseconomies of scale can exist in the centralized receivables management process. The authors conclude that physician practices should consider a hybrid approach to billing, thus reaping the benefits of both centralization and decentralization.

  1. Intelligent Decentralized Control In Large Distributed Computer Systems

    DTIC Science & Technology

    1988-04-01

    decentralized. The goal is to find a way for the agents to coordinate their actions to maximize some index of system performance. (Our main...shown in Figure 4.13. The controller observes the environment through sensors, and then may issue a command (i.e., take action) to affect the...the Hypothesis Generator and the Belief Manager, and finally actions are issued by the Action Generator, the Experiment Generator, or the Reflex

  2. Essays on remote monitoring as an emerging tool for centralized management of decentralized wastewater systems

    NASA Astrophysics Data System (ADS)

    Solomon, Clement

    According to the United States Environmental Protection Agency (USEPA), nearly one in four households in the United States depends on an individual septic system (commonly referred to as an onsite system or a decentralized wastewater system) to treat and disperse wastewater. More than half of these systems are over 30 years old, and surveys indicate at least 10 to 20% might not be functioning properly. The USEPA concluded in its 1997 report to Congress that adequately managed decentralized wastewater systems (DWS) are a cost-effective and long-term option for meeting public health and water quality goals, particularly in less densely populated areas. The major challenge, however, is the absence of a guiding national regulatory framework based on consistent performance-based standards and the lack of proper management of DWS. These inconsistencies pose a significant threat to our water resources, local economies, and public health. This dissertation addresses key policy and regulatory strategies needed in response to the new realities confronting decentralized wastewater management. The two core objectives of this research are to demonstrate the centralized management of DWS paradigm and to present a scientific methodology to develop performance-based standards (a regulatory shift from prescriptive methods) using remote monitoring. The underlying remote monitoring architecture for centralized DWS management and the value of science-based policy making are presented. Traditionally, prescriptive standards using conventional grab sampling data are the norm by which most standards are set. Three case studies that support the potential of remote monitoring as a tool for standards development and system management are presented. The results revealed a vital role for remote monitoring in the development of standardized protocols, policies and procedures that are greatly lacking in this field. This centralized management and remote monitoring paradigm fits well and complements current USEPA policy (13 elements of management); meets the growing need for quantitative data (objective and numerical); has better time efficiencies as real-time events are sampled and translated into machine-readable signals in a short period of time; allows cost-saving, rapid response to system recovery and operation; produces labor and economic efficiencies through targeted responses; and improves the quality and operational costs of any management program. This project was funded by the USEPA grant # C-82878001 as part of the National Onsite Demonstration Project (NODP), West Virginia University.

  3. Project management plan : Dallas Integrated Corridor Management (ICM) demonstration project.

    DOT National Transportation Integrated Search

    2010-12-01

    The Dallas Integrated Corridor Management System Demonstration Project is a multi-agency, decentralized operation which will utilize a set of regional systems to integrate the operations of the corridor. The purpose of the Dallas ICM System is to im...

  4. Implementing Knowledge Management as a Strategic Initiative

    DTIC Science & Technology

    2003-12-01

    Quality Management (TQM); Development Metrics Standards; Philosophy Hierarchical, Centralized or Decentralized; Sociolinguistics ...disciplines of operations research, logic, psychology, philosophy, sociolinguistics, management science, management information science, organizational...needs of customers for America and its Allies.” (CECOM AC Strategic Plan, 2001) Given the mission and vision statements, an organization needs to

  5. Schedule-Aware Workflow Management Systems

    NASA Astrophysics Data System (ADS)

    Mans, Ronny S.; Russell, Nick C.; van der Aalst, Wil M. P.; Moleman, Arnold J.; Bakker, Piet J. M.

    Contemporary workflow management systems offer work-items to users through specific work-lists. Users select the work-items they will perform without having a specific schedule in mind. However, in many environments work needs to be scheduled and performed at particular times. For example, in hospitals many work-items are linked to appointments, e.g., a doctor cannot perform surgery without reserving an operating theater and making sure that the patient is present. One of the problems when applying workflow technology in such domains is the lack of calendar-based scheduling support. In this paper, we present an approach that supports the seamless integration of unscheduled (flow) and scheduled (schedule) tasks. Using CPN Tools we have developed a specification and simulation model for schedule-aware workflow management systems. Based on this a system has been realized that uses YAWL, Microsoft Exchange Server 2007, Outlook, and a dedicated scheduling service. The approach is illustrated using a real-life case study at the AMC hospital in the Netherlands. In addition, we elaborate on the experiences obtained when developing and implementing a system of this scale using formal techniques.
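
    As a minimal sketch of the distinction the paper draws (and not the YAWL/CPN Tools implementation it describes), a work-list can hold ordinary "flow" items alongside "schedule" items that carry an appointment and are only offered once their reserved slot arrives. The item names and times below are hypothetical.

        from dataclasses import dataclass, field
        from datetime import datetime, timedelta
        from typing import Optional

        @dataclass
        class WorkItem:
            name: str
            appointment: Optional[datetime] = None  # None => ordinary flow item

            def is_offerable(self, now: datetime) -> bool:
                return self.appointment is None or self.appointment <= now

        @dataclass
        class WorkList:
            items: list = field(default_factory=list)

            def offer(self, now: datetime):
                """Return the work-items a user may pick up at time `now`."""
                return [w for w in self.items if w.is_offerable(now)]

        now = datetime(2024, 1, 8, 9, 0)
        wl = WorkList(items=[
            WorkItem("review lab results"),                               # flow task
            WorkItem("surgery", appointment=now + timedelta(hours=3)),    # schedule task
            WorkItem("pre-operative check", appointment=now - timedelta(minutes=30)),
        ])

        # 'surgery' is withheld until its reserved operating-theatre slot is reached.
        print([w.name for w in wl.offer(now)])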

  6. Automated lattice data generation

    NASA Astrophysics Data System (ADS)

    Ayyar, Venkitesh; Hackett, Daniel C.; Jay, William I.; Neil, Ethan T.

    2018-03-01

    The process of generating ensembles of gauge configurations (and measuring various observables over them) can be tedious and error-prone when done "by hand". In practice, most of this procedure can be automated with the use of a workflow manager. We discuss how this automation can be accomplished using Taxi, a minimal Python-based workflow manager built for generating lattice data. We present a case study demonstrating this technology.
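
    The serial dependency structure that such a workflow manager automates can be illustrated with a small sketch; the function names here are hypothetical stand-ins and do not reproduce Taxi's actual API.

        def generate_configuration(seed_cfg, index):
            """Stand-in for a gauge-configuration update (e.g. one HMC trajectory)."""
            return f"cfg_{index} (from {seed_cfg})"

        def measure_observable(cfg):
            """Stand-in for measuring an observable on one configuration."""
            return f"plaquette({cfg})"

        def run_stream(n_configs):
            """Run a single Markov-chain stream: generate, then measure, in order."""
            results = []
            cfg = "cfg_0"
            for i in range(1, n_configs + 1):
                cfg = generate_configuration(cfg, i)      # depends on previous config
                results.append(measure_observable(cfg))   # depends on the new config
            return results

        for line in run_stream(3):
            print(line)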

  7. qPortal: A platform for data-driven biomedical research.

    PubMed

    Mohr, Christopher; Friedrich, Andreas; Wojnar, David; Kenar, Erhan; Polatkan, Aydin Can; Codrea, Marius Cosmin; Czemmel, Stefan; Kohlbacher, Oliver; Nahnsen, Sven

    2018-01-01

    Modern biomedical research aims at drawing biological conclusions from large, highly complex biological datasets. It has become common practice to make extensive use of high-throughput technologies that produce large amounts of heterogeneous data. In addition to the ever-improving accuracy, methods are getting faster and cheaper, resulting in a steadily increasing need for scalable data management and easily accessible means of analysis. We present qPortal, a platform providing users with an intuitive way to manage and analyze quantitative biological data. The backend leverages a variety of concepts and technologies, such as relational databases, data stores, data models and means of data transfer, as well as front-end solutions to give users access to data management and easy-to-use analysis options. Users are empowered to conduct their experiments from the experimental design to the visualization of their results through the platform. Here, we illustrate the feature-rich portal by simulating a biomedical study based on publicly available data. We demonstrate the software's strength in supporting the entire project life cycle. The software supports the project design and registration, empowers users to do all-digital project management and finally provides means to perform analysis. We compare our approach to Galaxy, one of the most widely used scientific workflow and analysis platforms in computational biology. Application of both systems to a small case study shows the differences between a data-driven approach (qPortal) and a workflow-driven approach (Galaxy). qPortal, a one-stop-shop solution for biomedical projects, offers up-to-date analysis pipelines, quality control workflows, and visualization tools. Through intensive user interactions, appropriate data models have been developed. These models build the foundation of our biological data management system and provide possibilities to annotate data, query metadata for statistics and future re-analysis on high-performance computing systems via coupling of workflow management systems. Integration of project and data management as well as workflow resources in one place presents clear advantages over existing solutions.

  8. Big Data Challenges in Global Seismic 'Adjoint Tomography' (Invited)

    NASA Astrophysics Data System (ADS)

    Tromp, J.; Bozdag, E.; Krischer, L.; Lefebvre, M.; Lei, W.; Smith, J.

    2013-12-01

    The challenge of imaging Earth's interior on a global scale is closely linked to the challenge of handling large data sets. The related iterative workflow involves five distinct phases, namely, 1) data gathering and culling, 2) synthetic seismogram calculations, 3) pre-processing (time-series analysis and time-window selection), 4) data assimilation and adjoint calculations, 5) post-processing (pre-conditioning, regularization, model update). In order to implement this workflow on modern high-performance computing systems, a new seismic data format is being developed. The Adaptable Seismic Data Format (ASDF) is designed to replace currently used data formats with a more flexible format that allows for fast parallel I/O. The metadata is divided into abstract categories, such as "source" and "receiver", along with provenance information for complete reproducibility. The structure of ASDF is designed keeping in mind three distinct applications: earthquake seismology, seismic interferometry, and exploration seismology. Existing time-series analysis tool kits, such as SAC and ObsPy, can be easily interfaced with ASDF so that seismologists can use robust, previously developed software packages. ASDF accommodates an automated, efficient workflow for global adjoint tomography. Manually managing the large number of simulations associated with the workflow can rapidly become a burden, especially with increasing numbers of earthquakes and stations. Therefore, it is of importance to investigate the possibility of automating the entire workflow. Scientific Workflow Management Software (SWfMS) allows users to execute workflows almost routinely. SWfMS provides additional advantages. In particular, it is possible to group independent simulations in a single job to fit the available computational resources. They also give a basic level of fault resilience as the workflow can be resumed at the correct state preceding a failure. Some of the best candidates for our particular workflow are Kepler and Swift, and the latter appears to be the most serious candidate for a large-scale workflow on a single supercomputer, remaining sufficiently simple to accommodate further modifications and improvements.
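
    The hierarchical grouping idea behind such a container format can be sketched with plain HDF5; the layout below is only an illustration assuming h5py is available, and it is not the actual ASDF specification.

        import h5py
        import numpy as np

        # Minimal sketch of a hierarchical seismic container: waveforms grouped per
        # station, with source and provenance information stored as attributes.
        with h5py.File("example_event.h5", "w") as f:
            f.attrs["provenance"] = "synthetic example; processing chain recorded here"

            src = f.create_group("source")
            src.attrs["event_id"] = "EVT_demo"
            src.attrs["origin_time"] = "2013-01-01T00:00:00Z"

            waveforms = f.create_group("waveforms")
            for station in ("II.AAK", "IU.ANMO"):
                g = waveforms.create_group(station)
                g.attrs["network"], g.attrs["station"] = station.split(".")
                # one trace per component, stored as a separate dataset
                for component in ("Z", "N", "E"):
                    data = np.random.randn(1000).astype("float32")
                    d = g.create_dataset(component, data=data)
                    d.attrs["sampling_rate_hz"] = 20.0

        # Parallel I/O would come from opening the same container with an
        # MPI-enabled HDF5 build; the layout itself stays unchanged.
        print("wrote example_event.h5")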

  9. Developing integrated workflows for the digitisation of herbarium specimens using a modular and scalable approach

    PubMed Central

    Haston, Elspeth; Cubey, Robert; Pullan, Martin; Atkins, Hannah; Harris, David J

    2012-01-01

    Digitisation programmes in many institutes frequently involve disparate and irregular funding, diverse selection criteria and scope, with different members of staff managing and operating the processes. These factors have influenced the decision at the Royal Botanic Garden Edinburgh to develop an integrated workflow for the digitisation of herbarium specimens which is modular and scalable to enable a single overall workflow to be used for all digitisation projects. This integrated workflow is comprised of three principal elements: a specimen workflow, a data workflow and an image workflow. The specimen workflow is strongly linked to curatorial processes which will impact on the prioritisation, selection and preparation of the specimens. The importance of including a conservation element within the digitisation workflow is highlighted. The data workflow includes the concept of three main categories of collection data: label data, curatorial data and supplementary data. It is shown that each category of data has its own properties which influence the timing of data capture within the workflow. Development of software has been carried out for the rapid capture of curatorial data, and optical character recognition (OCR) software is being used to increase the efficiency of capturing label data and supplementary data. The large number and size of the images has necessitated the inclusion of automated systems within the image workflow. PMID:22859881

  10. RESTORING SUBURBAN WATERSHEDS USING A MULTIDISCIPLINARY APPROACH TO STORMWATER MANAGEMENT

    EPA Science Inventory

    In mixed-use, suburban watersheds, stormwater runoff from impervious surfaces on both public and private property impairs stream ecosystems. Decentralized stormwater management, which distributes stormwater infiltration and retention devices throughout watersheds, is more effect...

  11. Hierarchical Management Information Systems: A Decentralized Approach for University Administration

    ERIC Educational Resources Information Center

    Wager, J. James

    1977-01-01

    A Hierarchical Management Information System (HMIS) provides decision-making as well as operational information to all groups of the institution in a timely and predictable manner. Its operational aspects and benefits are discussed. (Author/LBH)

  12. The Archive Solution for Distributed Workflow Management Agents of the CMS Experiment at LHC

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kuznetsov, Valentin; Fischer, Nils Leif; Guo, Yuyi

    The CMS experiment at the CERN LHC developed the Workflow Management Archive system to persistently store unstructured framework job report documents produced by distributed workflow management agents. In this paper we present its architecture, implementation, deployment, and integration with the CMS and CERN computing infrastructures, such as central HDFS and Hadoop Spark cluster. The system leverages modern technologies such as a document oriented database and the Hadoop eco-system to provide the necessary flexibility to reliably process, store, and aggregate O(1M) documents on a daily basis. We describe the data transformation, the short and long term storage layers, the query language, along with the aggregation pipeline developed to visualize various performance metrics to assist CMS data operators in assessing the performance of the CMS computing system.

  13. The Archive Solution for Distributed Workflow Management Agents of the CMS Experiment at LHC

    DOE PAGES

    Kuznetsov, Valentin; Fischer, Nils Leif; Guo, Yuyi

    2018-03-19

    The CMS experiment at the CERN LHC developed the Workflow Management Archive system to persistently store unstructured framework job report documents produced by distributed workflow management agents. In this paper we present its architecture, implementation, deployment, and integration with the CMS and CERN computing infrastructures, such as central HDFS and Hadoop Spark cluster. The system leverages modern technologies such as a document oriented database and the Hadoop eco-system to provide the necessary flexibility to reliably process, store, and aggregate O(1M) documents on a daily basis. We describe the data transformation, the short and long term storage layers, the query language, along with the aggregation pipeline developed to visualize various performance metrics to assist CMS data operators in assessing the performance of the CMS computing system.

  14. A patient workflow management system built on guidelines.

    PubMed Central

    Dazzi, L.; Fassino, C.; Saracco, R.; Quaglini, S.; Stefanelli, M.

    1997-01-01

    To provide high quality, shared, and distributed medical care, clinical and organizational issues need to be integrated. This work describes a methodology for developing a Patient Workflow Management System, based on a detailed model of both the medical work process and the organizational structure. We assume that the medical work process is represented through clinical practice guidelines, and that an ontological description of the organization is available. Thus, we developed tools 1) to acquire the medical knowledge contained in a guideline, 2) to translate the derived formalized guideline into a computational formalism, namely a Petri Net, and 3) to maintain different representation levels. The high level representation guarantees that the Patient Workflow follows the guideline prescriptions, while the low level takes into account the specific organization characteristics and allows allocating resources for managing a specific patient in daily practice. PMID:9357606
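
    The encoding of a guideline step as a Petri Net transition can be illustrated with a toy marking/firing sketch; this is not the authors' tool, and the place and transition names are hypothetical.

        class PetriNet:
            def __init__(self, marking):
                self.marking = dict(marking)          # place -> token count
                self.transitions = {}                 # name -> (inputs, outputs)

            def add_transition(self, name, inputs, outputs):
                self.transitions[name] = (inputs, outputs)

            def enabled(self, name):
                inputs, _ = self.transitions[name]
                return all(self.marking.get(p, 0) > 0 for p in inputs)

            def fire(self, name):
                if not self.enabled(name):
                    raise RuntimeError(f"transition {name!r} is not enabled")
                inputs, outputs = self.transitions[name]
                for p in inputs:
                    self.marking[p] -= 1
                for p in outputs:
                    self.marking[p] = self.marking.get(p, 0) + 1

        # A two-step fragment of a guideline: order a test, then review the result.
        net = PetriNet({"patient_admitted": 1, "lab_available": 1})
        net.add_transition("order_test", ["patient_admitted", "lab_available"],
                           ["test_ordered"])
        net.add_transition("review_result", ["test_ordered"], ["result_reviewed"])

        net.fire("order_test")
        print(net.enabled("review_result"), net.marking)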

  15. Rethinking Clinical Workflow.

    PubMed

    Schlesinger, Joseph J; Burdick, Kendall; Baum, Sarah; Bellomy, Melissa; Mueller, Dorothee; MacDonald, Alistair; Chern, Alex; Chrouser, Kristin; Burger, Christie

    2018-03-01

    The concept of clinical workflow borrows from management and leadership principles outside of medicine. The only way to rethink clinical workflow is to understand the neuroscience principles that underlie attention and vigilance. With any implementation to improve practice, there are human factors that can promote or impede progress. Modulating the environment and working as a team to take care of patients is paramount. Clinicians must continually rethink clinical workflow, evaluate progress, and understand that other industries have something to offer. Then, novel approaches can be implemented to take the best care of patients.

  16. Web-Accessible Scientific Workflow System for Performance Monitoring

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Roelof Versteeg; Trevor Rowe

    2006-03-01

    We describe the design and implementation of a web accessible scientific workflow system for environmental monitoring. This workflow environment integrates distributed, automated data acquisition with server side data management and information visualization through flexible browser based data access tools. Component technologies include a rich browser-based client (using dynamic JavaScript and HTML/CSS) for data selection, a back-end server which uses PHP for data processing, user management, and result delivery, and third party applications which are invoked by the back-end using webservices. This environment allows for reproducible, transparent result generation by a diverse user base. It has been implemented for several monitoring systems with different degrees of complexity.

  17. Workflow Management for Complex HEP Analyses

    NASA Astrophysics Data System (ADS)

    Erdmann, M.; Fischer, R.; Rieger, M.; von Cube, R. F.

    2017-10-01

    We present the novel Analysis Workflow Management (AWM) that provides users with the tools and competences of professional large-scale workflow systems, e.g. Apache’s Airavata[1]. The approach presents a paradigm shift from executing parts of the analysis to defining the analysis. Within AWM an analysis consists of steps. For example, a step defines running a certain executable over multiple files of an input data collection. Each call to the executable for one of those input files can be submitted to the desired run location, which could be the local computer or a remote batch system. An integrated software manager enables automated user installation of dependencies in the working directory at the run location. Each execution of a step item creates one report for bookkeeping purposes containing error codes and output data or file references. Required files, e.g. created by previous steps, are retrieved automatically. Since data storage and run locations are exchangeable from the steps' perspective, computing resources can be used opportunistically. A visualization of the workflow as a graph of the steps in the web browser provides a high-level view of the analysis. The workflow system is developed and tested alongside a ttbb cross section measurement where, for instance, the event selection is represented by one step and a Bayesian statistical inference is performed by another. The clear interface and dependencies between steps enable a make-like execution of the whole analysis.
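
    The make-like behaviour described above can be sketched with a topological ordering over step dependencies; this is an illustration only, not AWM's actual interface, and the step names are hypothetical.

        from graphlib import TopologicalSorter

        def run_step(name):
            # Stand-in for submitting the step's executable to a run location.
            return {"step": name, "exit_code": 0, "outputs": [f"{name}.root"]}

        # step -> set of steps it depends on (hypothetical analysis)
        dependencies = {
            "skim": set(),
            "event_selection": {"skim"},
            "statistical_inference": {"event_selection"},
            "plots": {"event_selection", "statistical_inference"},
        }

        reports = {}
        for step in TopologicalSorter(dependencies).static_order():
            missing = [d for d in dependencies[step] if reports[d]["exit_code"] != 0]
            if missing:
                raise RuntimeError(f"cannot run {step}: failed inputs {missing}")
            reports[step] = run_step(step)
            print("ran", step, "->", reports[step]["outputs"])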

  18. A Workflow-based Intelligent Network Data Movement Advisor with End-to-end Performance Optimization

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhu, Michelle M.; Wu, Chase Q.

    2013-11-07

    Next-generation eScience applications often generate large amounts of simulation, experimental, or observational data that must be shared and managed by collaborative organizations. Advanced networking technologies and services have been rapidly developed and deployed to facilitate such massive data transfer. However, these technologies and services have not been fully utilized mainly because their use typically requires significant domain knowledge and in many cases application users are not even aware of their existence. By leveraging the functionalities of an existing Network-Aware Data Movement Advisor (NADMA) utility, we propose a new Workflow-based Intelligent Network Data Movement Advisor (WINDMA) with end-to-end performance optimization for this DOE funded project. This WINDMA system integrates three major components: resource discovery, data movement, and status monitoring, and supports the sharing of common data movement workflows through account and database management. This system provides a web interface and interacts with existing data/space management and discovery services such as Storage Resource Management, transport methods such as GridFTP and GlobusOnline, and network resource provisioning brokers such as ION and OSCARS. We demonstrate the efficacy of the proposed transport-support workflow system in several use cases based on its implementation and deployment in DOE wide-area networks.

  19. From chart tracking to workflow management.

    PubMed Central

    Srinivasan, P.; Vignes, G.; Venable, C.; Hazelwood, A.; Cade, T.

    1994-01-01

    The current interest in system-wide integration appears to be based on the assumption that an organization, by digitizing information and accepting a common standard for the exchange of such information, will improve the accessibility of this information and automatically experience benefits resulting from its more productive use. We do not dispute this reasoning, but assert that an organization's capacity for effective change is proportional to the understanding of the current structure among its personnel. Our workflow manager is based on the use of a Parameterized Petri Net (PPN) model which can be configured to represent an arbitrarily detailed picture of an organization. The PPN model can be animated to observe the model organization in action, and the results of the animation analyzed. This simulation is a dynamic ongoing process which changes with the system and allows members of the organization to pose "what if" questions as a means of exploring opportunities for change. We present the "workflow management system" as the natural successor to the tracking program, incorporating modeling, scheduling, reactive planning, performance evaluation, and simulation. This workflow management system is more than adequate for meeting the needs of a paper chart tracking system, and, as the patient record is computerized, will serve as a planning and evaluation tool in converting the paper-based health information system into a computer-based system. PMID:7950051

  20. [OR management - Checklists for OR-design for OR-managers - results of a workshop].

    PubMed

    Bock, Matthias; Steinmeyer-Bauer, Klaus; Schüpfer, Guido

    2014-10-01

    The construction of an operating room (OR) suite represents an important intermediate- and long-term investment. The planning process starts with the quantitative estimation of the procedures to be carried out, which defines the operative capacity for the lifetime of the facility. This permits the calculation of the number of ORs and the definition of the resources for the recovery room, the intermediate care and intensive care unit. The planners should integrate the new facility into the workflow, workload and logistics of the entire hospital. Simulating the flow of patients and accompanying persons, and the routes of the personnel, is helpful for this purpose. Separating structures for outpatients from those for inpatients and avoiding de-centralized rooms helps in designing an efficient and safe OR suite. The design of the single ORs should be flexible to permit changes or technical innovations during their use period. Mobile equipment is preferable to permanently installed devices. We consider an area of at least 45 m² adequate for any general OR. The space requirements are elevated for hybrid ORs and rooms dedicated to robotic surgery. The design of the suite should separate the flow of personnel, patients and logistics. Surgical instruments and their logistics should be standardized. Dedicated locations for a simultaneous preparation of the instrumentation tables permit parallel processing. Thus an adequate capacity of preparation rooms and storage rooms is necessary. Dressing rooms, rest rooms, showers and lounges are important for the working conditions and should be planned in an adequate size and number.

  1. Dynamic Voltage Frequency Scaling Simulator for Real Workflows Energy-Aware Management in Green Cloud Computing

    PubMed Central

    Cotes-Ruiz, Iván Tomás; Prado, Rocío P.; García-Galán, Sebastián; Muñoz-Expósito, José Enrique; Ruiz-Reyes, Nicolás

    2017-01-01

    Nowadays, the growing computational capabilities of Cloud systems rely on the reduction of the consumed power of their data centers to make them sustainable and economically profitable. The efficient management of computing resources is at the heart of any energy-aware data center and of special relevance is the adaptation of its performance to workload. Intensive computing applications in diverse areas of science generate complex workloads called workflows, whose successful management in terms of energy saving is still at its beginning. WorkflowSim is currently one of the most advanced simulators for research on workflow processing, offering advanced features such as task clustering and failure policies. In this work, an expected power-aware extension of WorkflowSim is presented. This new tool integrates a power model based on a computing-plus-communication design to allow the optimization of new management strategies in energy saving considering computing, reconfiguration and network costs as well as quality of service, and it incorporates the preeminent strategy for on-host energy saving: Dynamic Voltage Frequency Scaling (DVFS). The simulator is designed to be consistent in different real scenarios and to include a wide repertoire of DVFS governors. Results showing the validity of the simulator in terms of resource utilization, frequency and voltage scaling, power, energy and time saving are presented. Also, results achieved by the intra-host DVFS strategy with different governors are compared to those of the data center using a recent and successful DVFS-based inter-host scheduling strategy as an overlapping mechanism to the intra-host DVFS technique. PMID:28085932
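
    The kind of trade-off a DVFS governor makes can be shown with a toy power model; this is not WorkflowSim's actual model, and the frequencies, coefficients and task sizes below are hypothetical. The governor picks the lowest frequency that still meets a task's deadline and compares the energy used at each choice.

        FREQS_GHZ = [1.0, 1.5, 2.0, 2.6]          # available P-states
        BASE_POWER_W = 60.0                        # static/idle host power
        DYN_COEFF = 25.0                           # dynamic-power coefficient

        def runtime_s(work_gcycles, f_ghz):
            return work_gcycles / f_ghz

        def power_w(f_ghz):
            # dynamic power grows roughly with f^3 when voltage scales with frequency
            return BASE_POWER_W + DYN_COEFF * f_ghz ** 3

        def energy_j(work_gcycles, f_ghz):
            return power_w(f_ghz) * runtime_s(work_gcycles, f_ghz)

        work, deadline = 5000.0, 3000.0            # giga-cycles, seconds
        feasible = [f for f in FREQS_GHZ if runtime_s(work, f) <= deadline]
        chosen = min(feasible)                      # slowest feasible frequency
        for f in FREQS_GHZ:
            tag = "  <- chosen" if f == chosen else ""
            print(f"{f:.1f} GHz: {runtime_s(work, f):7.0f} s, "
                  f"{energy_j(work, f) / 1e6:6.2f} MJ{tag}")

    Running the sketch shows that the slowest frequency meeting the deadline also uses the least energy, which is the core intuition behind on-host DVFS strategies.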

  2. Dynamic Voltage Frequency Scaling Simulator for Real Workflows Energy-Aware Management in Green Cloud Computing.

    PubMed

    Cotes-Ruiz, Iván Tomás; Prado, Rocío P; García-Galán, Sebastián; Muñoz-Expósito, José Enrique; Ruiz-Reyes, Nicolás

    2017-01-01

    Nowadays, the growing computational capabilities of Cloud systems rely on the reduction of the consumed power of their data centers to make them sustainable and economically profitable. The efficient management of computing resources is at the heart of any energy-aware data center and of special relevance is the adaptation of its performance to workload. Intensive computing applications in diverse areas of science generate complex workloads called workflows, whose successful management in terms of energy saving is still at its beginning. WorkflowSim is currently one of the most advanced simulators for research on workflow processing, offering advanced features such as task clustering and failure policies. In this work, an expected power-aware extension of WorkflowSim is presented. This new tool integrates a power model based on a computing-plus-communication design to allow the optimization of new management strategies in energy saving considering computing, reconfiguration and network costs as well as quality of service, and it incorporates the preeminent strategy for on-host energy saving: Dynamic Voltage Frequency Scaling (DVFS). The simulator is designed to be consistent in different real scenarios and to include a wide repertoire of DVFS governors. Results showing the validity of the simulator in terms of resource utilization, frequency and voltage scaling, power, energy and time saving are presented. Also, results achieved by the intra-host DVFS strategy with different governors are compared to those of the data center using a recent and successful DVFS-based inter-host scheduling strategy as an overlapping mechanism to the intra-host DVFS technique.

  3. Scheduling Multilevel Deadline-Constrained Scientific Workflows on Clouds Based on Cost Optimization

    DOE PAGES

    Malawski, Maciej; Figiela, Kamil; Bubak, Marian; ...

    2015-01-01

    This paper presents a cost optimization model for scheduling scientific workflows on IaaS clouds such as Amazon EC2 or RackSpace. We assume multiple IaaS clouds with heterogeneous virtual machine instances, with a limited number of instances per cloud and hourly billing. Input and output data are stored on a cloud object store such as Amazon S3. Applications are scientific workflows modeled as DAGs as in the Pegasus Workflow Management System. We assume that tasks in the workflows are grouped into levels of identical tasks. Our model is specified using mathematical programming languages (AMPL and CMPL) and allows us to minimize the cost of workflow execution under deadline constraints. We present results obtained using our model and the benchmark workflows representing real scientific applications in a variety of domains. The data used for evaluation come from the synthetic workflows and from general purpose cloud benchmarks, as well as from the data measured in our own experiments with Montage, an astronomical application, executed on the Amazon EC2 cloud. We indicate how this model can be used for scenarios that require resource planning for scientific workflows and their ensembles.
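
    The decision the mathematical-programming model encodes can be approximated by a brute-force sketch: pick an instance type and VM count so that a level of identical tasks finishes before the deadline at minimal cost under hourly billing. This is an illustration only, not the AMPL/CMPL model itself; the instance names, speeds and prices are hypothetical.

        import math

        instance_types = {              # name: (relative speed, price per hour, max VMs)
            "small":  (1.0, 0.10, 20),
            "large":  (2.0, 0.25, 10),
            "xlarge": (4.0, 0.60, 5),
        }

        n_tasks, task_hours_on_small, deadline_h = 120, 0.5, 6.0

        best = None
        for name, (speed, price, max_vms) in instance_types.items():
            for vms in range(1, max_vms + 1):
                per_task = task_hours_on_small / speed
                # tasks of the level are spread evenly over the VMs
                makespan = math.ceil(n_tasks / vms) * per_task
                if makespan > deadline_h:
                    continue
                cost = vms * math.ceil(makespan) * price   # hourly billing
                if best is None or cost < best[0]:
                    best = (cost, name, vms, makespan)

        cost, name, vms, makespan = best
        print(f"cheapest feasible plan: {vms} x {name}, "
              f"makespan {makespan:.1f} h, cost ${cost:.2f}")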

  4. Digitization workflows for flat sheets and packets of plants, algae, and fungi

    PubMed Central

    Nelson, Gil; Sweeney, Patrick; Wallace, Lisa E.; Rabeler, Richard K.; Allard, Dorothy; Brown, Herrick; Carter, J. Richard; Denslow, Michael W.; Ellwood, Elizabeth R.; Germain-Aubrey, Charlotte C.; Gilbert, Ed; Gillespie, Emily; Goertzen, Leslie R.; Legler, Ben; Marchant, D. Blaine; Marsico, Travis D.; Morris, Ashley B.; Murrell, Zack; Nazaire, Mare; Neefus, Chris; Oberreiter, Shanna; Paul, Deborah; Ruhfel, Brad R.; Sasek, Thomas; Shaw, Joey; Soltis, Pamela S.; Watson, Kimberly; Weeks, Andrea; Mast, Austin R.

    2015-01-01

    Effective workflows are essential components in the digitization of biodiversity specimen collections. To date, no comprehensive, community-vetted workflows have been published for digitizing flat sheets and packets of plants, algae, and fungi, even though latest estimates suggest that only 33% of herbarium specimens have been digitally transcribed, 54% of herbaria use a specimen database, and 24% are imaging specimens. In 2012, iDigBio, the U.S. National Science Foundation’s (NSF) coordinating center and national resource for the digitization of public, nonfederal U.S. collections, launched several working groups to address this deficiency. Here, we report the development of 14 workflow modules with 7–36 tasks each. These workflows represent the combined work of approximately 35 curators, directors, and collections managers representing more than 30 herbaria, including 15 NSF-supported plant-related Thematic Collections Networks and collaboratives. The workflows are provided for download as Portable Document Format (PDF) and Microsoft Word files. Customization of these workflows for specific institutional implementation is encouraged. PMID:26421256

  5. Real-Time System for Water Modeling and Management

    NASA Astrophysics Data System (ADS)

    Lee, J.; Zhao, T.; David, C. H.; Minsker, B.

    2012-12-01

    Working closely with the Texas Commission on Environmental Quality (TCEQ) and the University of Texas at Austin (UT-Austin), we are developing a real-time system for water modeling and management using advanced cyberinfrastructure, data integration and geospatial visualization, and numerical modeling. The state of Texas suffered a severe drought in 2011 that cost the state $7.62 billion in agricultural losses (crops and livestock). Devastating situations such as this could potentially be avoided with better water modeling and management strategies that incorporate state of the art simulation and digital data integration. The goal of the project is to prototype a near-real-time decision support system for river modeling and management in Texas that can serve as a national and international model to promote more sustainable and resilient water systems. The system uses National Weather Service current and predicted precipitation data as input to the Noah-MP Land Surface model, which forecasts runoff, soil moisture, evapotranspiration, and water table levels given land surface features. These results are then used by a river model called RAPID, along with an error model currently under development at UT-Austin, to forecast stream flows in the rivers. Model forecasts are visualized as a Web application for TCEQ decision makers, who issue water diversion (withdrawal) permits and any needed drought restrictions; permit holders; and reservoir operation managers. Users will be able to adjust model parameters to predict the impacts of alternative curtailment scenarios or weather forecasts. A real-time optimization system under development will help TCEQ to identify optimal curtailment strategies to minimize impacts on permit holders and protect health and safety. To develop the system we have implemented RAPID as a remotely-executed modeling service using the Cyberintegrator workflow system with input data downloaded from the North American Land Data Assimilation System. The Cyberintegrator workflow system provides RESTful web services for users to provide inputs, execute workflows, and retrieve outputs. Along with REST endpoints, PAW (Publishable Active Workflows) provides the web user interface toolkit for us to develop web applications with scientific workflows. The prototype web application is built on top of workflows with PAW, so that users will have a user-friendly web environment to provide input parameters, execute the model, and visualize/retrieve the results using geospatial mapping tools. In future work the optimization model will be developed and integrated into the workflow.

  6. DECENTRALIZED STORMWATER MANAGEMENT: RETROFITTING HOMES, RESTORING WATERSHEDS

    EPA Science Inventory

    Stormwater runoff from impervious surfaces in urban and suburban areas has led to human safety risks and widespread stream ecosystem impairment. While centralized stormwater management can minimize large fluctuations in stream flows and flooding risk to urban areas, this approac...

  7. Management Development at Hewlett-Packard.

    ERIC Educational Resources Information Center

    Nilsson, William P.

    This presentation describes the principles and policies underlying the successful management development program at Hewlett-Packard Company, a manufacturer of electronic instruments and components. The company is organized into relatively independent product divisions with decentralized decision-making responsibilities, flexible working hours, and…

  8. Opportunistic Computing with Lobster: Lessons Learned from Scaling up to 25k Non-Dedicated Cores

    NASA Astrophysics Data System (ADS)

    Wolf, Matthias; Woodard, Anna; Li, Wenzhao; Hurtado Anampa, Kenyi; Yannakopoulos, Anna; Tovar, Benjamin; Donnelly, Patrick; Brenner, Paul; Lannon, Kevin; Hildreth, Mike; Thain, Douglas

    2017-10-01

    We previously described Lobster, a workflow management tool for exploiting volatile opportunistic computing resources for computation in HEP. We will discuss the various challenges that have been encountered while scaling up the simultaneous CPU core utilization and the software improvements required to overcome these challenges. Categories: Workflows can now be divided into categories based on their required system resources. This allows the batch queueing system to optimize assignment of tasks to nodes with the appropriate capabilities. Within each category, limits can be specified for the number of running jobs to regulate the utilization of communication bandwidth. System resource specifications for a task category can now be modified while a project is running, avoiding the need to restart the project if resource requirements differ from the initial estimates. Lobster now implements time limits on each task category to voluntarily terminate tasks. This allows partially completed work to be recovered. Workflow dependency specification: One workflow often requires data from other workflows as input. Rather than waiting for earlier workflows to be completed before beginning later ones, Lobster now allows dependent tasks to begin as soon as sufficient input data has accumulated. Resource monitoring: Lobster utilizes a new capability in Work Queue to monitor the system resources each task requires in order to identify bottlenecks and optimally assign tasks. The capability of the Lobster opportunistic workflow management system for HEP computation has been significantly increased. We have demonstrated efficient utilization of 25 000 non-dedicated cores and achieved a data input rate of 30 Gb/s and an output rate of 500GB/h. This has required new capabilities in task categorization, workflow dependency specification, and resource monitoring.
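
    The per-category resource limits described above can be illustrated with a small configuration sketch; this is hypothetical and is not Lobster's actual configuration interface.

        from dataclasses import dataclass

        @dataclass
        class TaskCategory:
            name: str
            cores: int
            memory_mb: int
            max_running: int        # cap on simultaneously running tasks
            wall_time_s: int        # voluntary time limit; partial work is recovered

            def fits(self, node_cores, node_memory_mb):
                """Can a task of this category be placed on the given node?"""
                return self.cores <= node_cores and self.memory_mb <= node_memory_mb

        categories = {
            "simulation": TaskCategory("simulation", cores=4, memory_mb=8000,
                                       max_running=500, wall_time_s=6 * 3600),
            "merge":      TaskCategory("merge", cores=1, memory_mb=2000,
                                       max_running=50, wall_time_s=3600),
        }

        # Limits can be adjusted while a project runs, e.g. after monitoring shows
        # the initial memory estimate was too low:
        categories["simulation"].memory_mb = 12000
        print(categories["simulation"].fits(node_cores=8, node_memory_mb=16000))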

  9. Applying Content Management to Automated Provenance Capture

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Schuchardt, Karen L.; Gibson, Tara D.; Stephan, Eric G.

    2008-04-10

    Workflows and data pipelines are becoming increasingly valuable in both computational and experimental sciences. These automated systems are capable of generating significantly more data within the same amount of time than their manual counterparts. Automatically capturing and recording data provenance and annotation as part of these workflows is critical for data management, verification, and dissemination. Our goal in addressing the provenance challenge was to develop an end-to-end system that demonstrates real-time capture, persistent content management, and ad-hoc searches of both provenance and metadata using open source software and standard protocols. We describe our prototype, which extends the Kepler workflow tools for the execution environment, the Scientific Annotation Middleware (SAM) content management software for data services, and an existing HTTP-based query protocol. Our implementation offers several unique capabilities, and through the use of standards, is able to provide access to the provenance record to a variety of commonly available client tools.

  10. Management of Audio-Visual Media Services. Part II. Practical Management Methods.

    ERIC Educational Resources Information Center

    Price, Robert V.

    1978-01-01

    This paper furnishes a framework that allows the local audiovisual administrator to develop a management system necessary for the instructional support of teaching through modern media and educational technology. The structure of this framework rests on organizational patterns which are explained in four categories: complete decentralization,…

  11. Evolution of Management Thought in the Medieval Times.

    ERIC Educational Resources Information Center

    Sharma, C. L.

    The medieval times witnessed progress toward the growth of larger and more complex organizations and the application of increasingly sophisticated management techniques. Feudalism contributed the concept of decentralization. The concepts evolved by the Catholic Church can scarcely be improved on and are very much pertinent to the management of…

  12. From decentralization to commonization of HIV healthcare resources: keys to reduction in health disparity and equitable distribution of health services in Nigeria.

    PubMed

    Oleribe, Obinna Ositadimma; Oladipo, Olabisi Abiodun; Ezieme, Iheaka Paul; Crossey, Mary Margaret Elizabeth; Taylor-Robinson, Simon David

    2016-01-01

    Access to quality care is essential for improved health outcomes. Decentralization improves access to healthcare services at lower levels of care, but it does not dismantle structural, funding and programming restrictions to access, resulting in inequity and inequality in population health. Unlike decentralization, the Commonization Model of care reduces health inequalities and inequity and dismantles structural, funding and other program-related obstacles to population health. Excellence and Friends Management Care Center (EFMC), using the Commonization Model (CM), fully integrated HIV services into core health services in 121 supported facilities. This initiative improved access to care, treatment, and support services, reduced stigmatization/discrimination, and improved uptake of HTC. We call on governments to adequately finance CM for health systems restructuring towards better health outcomes.

  13. Optimizing decentralized production-distribution planning problem in a multi-period supply chain network under uncertainty

    NASA Astrophysics Data System (ADS)

    Nourifar, Raheleh; Mahdavi, Iraj; Mahdavi-Amiri, Nezam; Paydar, Mohammad Mahdi

    2017-09-01

    Decentralized supply chain management is found to be significantly relevant in today's competitive markets. Production and distribution planning is posed as an important optimization problem in supply chain networks. Here, we propose a multi-period decentralized supply chain network model with uncertainty. The imprecision related to uncertain parameters like demand and price of the final product is modeled with stochastic and fuzzy numbers. We provide a mathematical formulation of the problem as a bi-level mixed integer linear programming model. Due to the problem's complexity, a solution structure is developed that incorporates a novel heuristic based on the Kth-best algorithm, a fuzzy approach and a chance-constraint approach. Ultimately, a numerical example is constructed and worked through to demonstrate the applicability of the optimization model. A sensitivity analysis is also performed.

  14. Producing an Infrared Multiwavelength Galactic Plane Atlas Using Montage, Pegasus, and Amazon Web Services

    NASA Astrophysics Data System (ADS)

    Rynge, M.; Juve, G.; Kinney, J.; Good, J.; Berriman, B.; Merrihew, A.; Deelman, E.

    2014-05-01

    In this paper, we describe how to leverage cloud resources to generate large-scale mosaics of the galactic plane in multiple wavelengths. Our goal is to generate a 16-wavelength infrared Atlas of the Galactic Plane at a common spatial sampling of 1 arcsec, processed so that they appear to have been measured with a single instrument. This will be achieved by using the Montage image mosaic engine to process observations from the 2MASS, GLIMPSE, MIPSGAL, MSX and WISE datasets, over a wavelength range of 1 μm to 24 μm, and by using the Pegasus Workflow Management System for managing the workload. When complete, the Atlas will be made available to the community as a data product. We are generating images that cover ±180° in Galactic longitude and ±20° in Galactic latitude, to the extent permitted by the spatial coverage of each dataset. Each image will be 5°x5° in size (including an overlap of 1° with neighboring tiles), resulting in an atlas of 1,001 images. The final size will be about 50 TB. This paper will focus on the computational challenges, solutions, and lessons learned in producing the Atlas. To manage the computation we are using the Pegasus Workflow Management System, a mature, highly fault-tolerant system now in release 4.2.2 that has found wide applicability across many science disciplines. A scientific workflow describes the dependencies between the tasks and in most cases the workflow is described as a directed acyclic graph, where the nodes are tasks and the edges denote the task dependencies. A defining property for a scientific workflow is that it manages data flow between tasks. Applied to the galactic plane project, each 5°x5° mosaic is a Pegasus workflow. Pegasus is used to fetch the source images, execute the image mosaicking steps of Montage, and store the final outputs in a storage system. As these workflows are very I/O intensive, care has to be taken when choosing what infrastructure to execute the workflow on. In our setup, we choose to use dynamically provisioned compute clusters running on the Amazon Elastic Compute Cloud (EC2). All our instances are using the same base image, which is configured to come up as a master node by default. The master node is a central instance from where the workflow can be managed. Additional worker instances are provisioned and configured to accept work assignments from the master node. The system allows for adding/removing workers in an ad hoc fashion, and could be run in large configurations. To date we have performed 245,000 CPU hours of computing and generated 7,029 images totaling 30 TB. With the current setup, our runtime would be 340,000 CPU hours for the whole project. Using spot m2.4xlarge instances, the cost would be approximately $5,950. Using faster AWS instances, such as cc2.8xlarge, could potentially decrease the total CPU hours and further reduce the compute costs. The paper will explore these tradeoffs.

  15. How changing quality management influenced PGME accreditation: a focus on decentralization and quality improvement.

    PubMed

    Akdemir, Nesibe; Lombarts, Kiki M J M H; Paternotte, Emma; Schreuder, Bas; Scheele, Fedde

    2017-06-02

    Evaluating the quality of postgraduate medical education (PGME) programs through accreditation is common practice worldwide. Accreditation is shaped by educational quality and quality management. An appropriate accreditation design is important, as it may drive improvements in training. Moreover, accreditors determine whether a PGME program passes the assessment, which may have major consequences, such as starting, continuing or discontinuing PGME. However, there is limited evidence for the benefits of different choices in accreditation design. Therefore, this study aims to explain how changing views on educational quality and quality management have impacted the design of the PGME accreditation system in the Netherlands. To determine the historical development of the Dutch PGME accreditation system, we conducted a document analysis of accreditation documents spanning the past 50 years and a vision document outlining the future system. A template analysis technique was used to identify the main elements of the system. Four themes in the Dutch PGME accreditation system were identified: (1) objectives of accreditation, (2) PGME quality domains, (3) quality management approaches and (4) actors' responsibilities. Major shifts have taken place regarding decentralization, residency performance and physician practice outcomes, and quality improvement. Decentralization of the responsibilities of the accreditor was absent in 1966, but this has been slowly changing since 1999. In the future system, there will be nearly a maximum degree of decentralization. A focus on outcomes and quality improvement has been introduced in the current system. The number of formal documents striving for quality assurance has increased enormously over the past 50 years, which has led to increased bureaucracy. The future system needs to decrease the number of standards to focus on measurable outcomes and to strive for quality improvement. The challenge for accreditors is to find the right balance between trusting and controlling medical professionals. Their choices will be reflected in the accreditation design. The four themes could enhance international comparisons and encourage better choices in the design of accreditation systems.

  16. Implementation of Cyberinfrastructure and Data Management Workflow for a Large-Scale Sensor Network

    NASA Astrophysics Data System (ADS)

    Jones, A. S.; Horsburgh, J. S.

    2014-12-01

    Monitoring with in situ environmental sensors and other forms of field-based observation presents many challenges for data management, particularly for large-scale networks consisting of multiple sites, sensors, and personnel. The availability and utility of these data in addressing scientific questions relies on effective cyberinfrastructure that facilitates transformation of raw sensor data into functional data products. It also depends on the ability of researchers to share and access the data in useable formats. In addition to addressing the challenges presented by the quantity of data, monitoring networks need practices to ensure high data quality, including procedures and tools for post processing. Data quality is further enhanced if practitioners are able to track equipment, deployments, calibrations, and other events related to site maintenance and associate these details with observational data. In this presentation we will describe the overall workflow that we have developed for research groups and sites conducting long term monitoring using in situ sensors. Features of the workflow include: software tools to automate the transfer of data from field sites to databases, a Python-based program for data quality control post-processing, a web-based application for online discovery and visualization of data, and a data model and web interface for managing physical infrastructure. By automating the data management workflow, the time from collection to analysis is reduced and sharing and publication is facilitated. The incorporation of metadata standards and descriptions and the use of open-source tools enhances the sustainability and reusability of the data. We will describe the workflow and tools that we have developed in the context of the iUTAH (innovative Urban Transitions and Aridregion Hydrosustainability) monitoring network. The iUTAH network consists of aquatic and climate sensors deployed in three watersheds to monitor Gradients Along Mountain to Urban Transitions (GAMUT). The variety of environmental sensors and the multi-watershed, multi-institutional nature of the network necessitate a well-planned and efficient workflow for acquiring, managing, and sharing sensor data, which should be useful for similar large-scale and long-term networks.
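
    A small sketch illustrates the kind of post-processing checks such a quality-control step performs; this is not the iUTAH tool itself, and the variable, range and step thresholds are hypothetical.

        # Flag raw sensor values that fall outside a plausible range or jump faster
        # than the instrument could physically change between readings.
        RANGE = (-5.0, 40.0)        # plausible water temperature range, deg C
        MAX_STEP = 2.0              # max credible change between 15-minute readings

        def quality_flags(series):
            flags = []
            previous = None
            for value in series:
                if value is None:
                    flag = "missing"
                elif not (RANGE[0] <= value <= RANGE[1]):
                    flag = "out_of_range"
                elif previous is not None and abs(value - previous) > MAX_STEP:
                    flag = "spike"
                else:
                    flag = "ok"
                flags.append(flag)
                if flag == "ok":
                    previous = value   # only trusted values anchor the step check
            return flags

        raw = [12.1, 12.3, None, 12.4, 55.0, 12.6, 18.9]
        for value, flag in zip(raw, quality_flags(raw)):
            print(value, flag)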

  17. End-to-end interoperability and workflows from building architecture design to one or more simulations

    DOEpatents

    Chao, Tian-Jy; Kim, Younghun

    2015-02-10

    An end-to-end interoperability and workflows from building architecture design to one or more simulations, in one aspect, may comprise establishing a BIM enablement platform architecture. A data model defines data entities and entity relationships for enabling the interoperability and workflows. A data definition language may be implemented that defines and creates a table schema of a database associated with the data model. Data management services and/or application programming interfaces may be implemented for interacting with the data model. Web services may also be provided for interacting with the data model via the Web. A user interface may be implemented that communicates with users and uses the BIM enablement platform architecture, the data model, the data definition language, data management services and application programming interfaces to provide functions to the users to perform work related to building information management.
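
    As a rough illustration of the data definition language idea (defining and creating a table schema for the data model), the sketch below builds a toy schema in SQLite; the entity and column names are hypothetical and do not come from the patent.

      # Toy schema for building/simulation entities and their relationship.
      import sqlite3

      conn = sqlite3.connect(":memory:")
      conn.executescript("""
      CREATE TABLE building (
          building_id INTEGER PRIMARY KEY,
          name        TEXT NOT NULL,
          bim_model   TEXT                    -- reference to the source BIM file
      );
      CREATE TABLE simulation (
          simulation_id INTEGER PRIMARY KEY,
          building_id   INTEGER NOT NULL REFERENCES building(building_id),
          engine        TEXT NOT NULL,        -- e.g. an energy or airflow simulator
          status        TEXT DEFAULT 'pending'
      );
      """)
      conn.execute("INSERT INTO building (name, bim_model) VALUES (?, ?)",
                   ("Office Tower A", "tower_a.ifc"))
      conn.execute("INSERT INTO simulation (building_id, engine) VALUES (1, 'energy')")
      for row in conn.execute("SELECT * FROM simulation"):
          print(row)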

  18. Identification and Management of Information Problems by Emergency Department Staff

    PubMed Central

    Murphy, Alison R.; Reddy, Madhu C.

    2014-01-01

    Patient-care teams frequently encounter information problems during their daily activities. These information problems include wrong, outdated, conflicting, incomplete, or missing information. Information problems can negatively impact the patient-care workflow, lead to misunderstandings about patient information, and potentially lead to medical errors. Existing research focuses on understanding the cause of these information problems and the impact that they can have on the hospital’s workflow. However, there is limited research on how patient-care teams currently identify and manage information problems that they encounter during their work. Through qualitative observations and interviews in an emergency department (ED), we identified the types of information problems encountered by ED staff, and examined how they identified and managed the information problems. We also discuss the impact that these information problems can have on the patient-care teams, including the cascading effects of information problems on workflow and the ambiguous accountability for fixing information problems within collaborative teams. PMID:25954457

  19. Coupling of a continuum ice sheet model and a discrete element calving model using a scientific workflow system

    NASA Astrophysics Data System (ADS)

    Memon, Shahbaz; Vallot, Dorothée; Zwinger, Thomas; Neukirchen, Helmut

    2017-04-01

    Scientific communities generate complex simulations through orchestration of semi-structured analysis pipelines, which involve the execution of large workflows on multiple, distributed and heterogeneous computing and data resources. Modeling ice dynamics of glaciers requires workflows consisting of many non-trivial, computationally expensive processing tasks which are coupled to each other. From this domain, we present an e-Science use case, a workflow, which requires the execution of a continuum ice flow model and a discrete element based calving model in an iterative manner. Apart from the execution, this workflow also contains data format conversion tasks that support the execution of ice flow and calving by means of transition through sequential, nested and iterative steps. Thus, the management and monitoring of all the processing tasks, including data management and transfer, of the workflow model becomes more complex. From the implementation perspective, this workflow model was initially developed as a set of scripts using static data input and output references. In the course of application usage, as more scripts or modifications were introduced per user requirements, debugging and validation of results became more cumbersome. To address these problems, we identified the need for a high-level scientific workflow tool through which all the above mentioned processes can be achieved in an efficient and usable manner. We decided to make use of the e-Science middleware UNICORE (Uniform Interface to Computing Resources), which allows seamless and automated access to different heterogeneous and distributed resources and is supported by a scientific workflow engine. Based on this, we developed a high-level scientific workflow model for coupling of massively parallel High-Performance Computing (HPC) jobs: a continuum ice sheet model (Elmer/Ice) and a discrete element calving and crevassing model (HiDEM). In our talk we present how the use of a high-level scientific workflow middleware makes reproducibility of results more convenient and also provides a reusable and portable workflow template that can be deployed across different computing infrastructures. Acknowledgements: This work was kindly supported by NordForsk as part of the Nordic Center of Excellence (NCoE) eSTICC (eScience Tools for Investigating Climate Change at High Northern Latitudes) and the Top-level Research Initiative NCoE SVALI (Stability and Variation of Arctic Land Ice).
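
    The iterative coupling pattern described above can be sketched as a simple loop; the function names below are hypothetical placeholders standing in for the continuum model, the calving model, and the data format conversions, not the real Elmer/Ice or HiDEM interfaces.

      # Schematic coupling loop: continuum ice flow step, conversion to a
      # particle representation, discrete calving step, conversion back.
      def run_ice_flow(state):
          return state                         # stand-in for the continuum model

      def run_calving(state):
          return state                         # stand-in for the discrete model

      def to_particle_mesh(state):
          return {"particles": state}          # placeholder format conversion

      def to_continuum_mesh(state):
          return {"front": state}              # placeholder format conversion

      def coupled_run(initial_geometry, n_cycles=3):
          geometry = initial_geometry
          for cycle in range(n_cycles):
              flow_state = run_ice_flow(geometry)        # continuum step
              particles = to_particle_mesh(flow_state)   # data conversion
              calved = run_calving(particles)            # discrete calving step
              geometry = to_continuum_mesh(calved)       # back-conversion
              print(f"cycle {cycle}: calving front updated")
          return geometry

      coupled_run({"front": "initial"})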

  20. USING MARKET INCENTIVES TO PROMOTE DECENTRALIZED STORMWATER MANAGEMENT

    EPA Science Inventory

    Stormwater runoff from impervious surfaces in urban and suburban areas has led to human safety risks and widespread stream ecosystem impairment. While centralized stormwater management can minimize large fluctuations in stream flows and flooding risk to urban areas, this approac...

  1. SHIWA Services for Workflow Creation and Sharing in Hydrometeorology

    NASA Astrophysics Data System (ADS)

    Terstyanszky, Gabor; Kiss, Tamas; Kacsuk, Peter; Sipos, Gergely

    2014-05-01

    Researchers want to run scientific experiments on Distributed Computing Infrastructures (DCI) to access large pools of resources and services. Running these experiments requires specific expertise that they may not have. Workflows can hide resources and services as a virtualisation layer providing a user interface that researchers can use. There are many scientific workflow systems but they are not interoperable. Learning a workflow system and creating workflows may require significant effort. Considering these efforts, it is not reasonable to expect that researchers will learn new workflow systems if they want to run workflows developed in other workflow systems. Overcoming this requires workflow interoperability solutions that allow workflow sharing. The FP7 'Sharing Interoperable Workflow for Large-Scale Scientific Simulation on Available DCIs' (SHIWA) project developed the Coarse-Grained Interoperability concept (CGI). It enables recycling and sharing workflows of different workflow systems and executing them on different DCIs. SHIWA developed the SHIWA Simulation Platform (SSP) to implement the CGI concept, integrating three major components: the SHIWA Science Gateway, the workflow engines supported by the CGI concept, and the DCI resources where workflows are executed. The science gateway contains a portal, a submission service, a workflow repository and a proxy server to support the whole workflow life-cycle. The SHIWA Portal allows workflow creation, configuration, execution and monitoring through a Graphical User Interface using the WS-PGRADE workflow system as the host workflow system. The SHIWA Repository stores the formal description of workflows and workflow engines plus executables and data needed to execute them. It offers a wide range of browse and search operations. To support non-native workflow execution the SHIWA Submission Service imports the workflow and workflow engine from the SHIWA Repository. This service either invokes pre-deployed workflow engines locally or remotely, or submits workflow engines together with the workflow to local or remote resources to execute the workflows. The SHIWA Proxy Server manages certificates needed to execute the workflows on different DCIs. Currently SSP supports sharing of ASKALON, Galaxy, GWES, Kepler, LONI Pipeline, MOTEUR, Pegasus, P-GRADE, ProActive, Triana, Taverna and WS-PGRADE workflows. Further workflow systems can be added to the simulation platform as required by research communities. The FP7 'Building a European Research Community through Interoperable Workflows and Data' (ER-flow) project disseminates the achievements of the SHIWA project to build workflow user communities across Europe. ER-flow provides application support to research communities within the project (Astrophysics, Computational Chemistry, Heliophysics and Life Sciences) and beyond it (Hydrometeorology and Seismology) to develop, share and run workflows through the simulation platform. The simulation platform supports four usage scenarios: creating and publishing workflows in the repository, searching and selecting workflows in the repository, executing non-native workflows, and creating and running meta-workflows. The presentation will outline the CGI concept, the SHIWA Simulation Platform, the ER-flow usage scenarios and how the Hydrometeorology research community runs simulations on SSP.
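
    The decision made by the submission service (invoke a pre-deployed workflow engine, or ship the engine together with the workflow to the target resource) can be sketched as follows; the engine and resource names are hypothetical and the real service is considerably more involved.

      # Toy sketch of the submission-service decision described above.
      PRE_DEPLOYED = {("taverna", "local-cluster"), ("moteur", "grid-site-1")}

      def submit(workflow, engine, resource):
          if (engine, resource) in PRE_DEPLOYED:
              return f"invoke pre-deployed {engine} on {resource} for {workflow}"
          return f"stage {engine} with {workflow} and submit both to {resource}"

      print(submit("flood-forecast.wf", "taverna", "local-cluster"))
      print(submit("flood-forecast.wf", "kepler", "grid-site-2"))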

  2. Optimization of tomographic reconstruction workflows on geographically distributed resources

    DOE PAGES

    Bicer, Tekin; Gursoy, Doga; Kettimuthu, Rajkumar; ...

    2016-01-01

    New technological advancements in synchrotron light sources enable data acquisitions at unprecedented levels. This emergent trend affects not only the size of the generated data but also the need for larger computational resources. Although beamline scientists and users have access to local computational resources, these are typically limited and can result in extended execution times. Applications that are based on iterative processing as in tomographic reconstruction methods require high-performance compute clusters for timely analysis of data. Here, time-sensitive analysis and processing of Advanced Photon Source data on geographically distributed resources are focused on. Two main challenges are considered: (i) modeling of the performance of tomographic reconstruction workflows and (ii) transparent execution of these workflows on distributed resources. For the former, three main stages are considered: (i) data transfer between storage and computational resources, (ii) wait/queue time of reconstruction jobs at compute resources, and (iii) computation of reconstruction tasks. These performance models allow evaluation and estimation of the execution time of any given iterative tomographic reconstruction workflow that runs on geographically distributed resources. For the latter challenge, a workflow management system is built, which can automate the execution of workflows and minimize the user interaction with the underlying infrastructure. The system utilizes Globus to perform secure and efficient data transfer operations. The proposed models and the workflow management system are evaluated by using three high-performance computing and two storage resources, all of which are geographically distributed. Workflows were created with different computational requirements using two compute-intensive tomographic reconstruction algorithms. Experimental evaluation shows that the proposed models and system can be used for selecting the optimum resources, which in turn can provide up to 3.13× speedup (on experimented resources). Furthermore, the error rates of the models range between 2.1 and 23.3% (considering workflow execution times), where the accuracy of the model estimations increases with higher computational demands in reconstruction tasks.
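
    A much-simplified version of the three-stage execution-time model (data transfer, queue wait, computation) can be written as a single sum; the formulas and the example numbers below are illustrative assumptions, not the calibrated models from the paper.

      # Rough end-to-end runtime estimate for a reconstruction workflow.
      def estimate_runtime(data_gb, bandwidth_gbps, queue_wait_s,
                           iterations, core_seconds_per_iter, cores):
          transfer_s = data_gb * 8 / bandwidth_gbps            # storage -> compute
          compute_s = iterations * core_seconds_per_iter / cores
          return transfer_s + queue_wait_s + compute_s

      # e.g. 200 GB dataset, 5 Gbps link, 10 min queue, 100 iterations on 512 cores
      print(estimate_runtime(200, 5, 600, 100, 2048, 512), "seconds")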

  3. Optimization of tomographic reconstruction workflows on geographically distributed resources

    PubMed Central

    Bicer, Tekin; Gürsoy, Doǧa; Kettimuthu, Rajkumar; De Carlo, Francesco; Foster, Ian T.

    2016-01-01

    New technological advancements in synchrotron light sources enable data acquisitions at unprecedented levels. This emergent trend affects not only the size of the generated data but also the need for larger computational resources. Although beamline scientists and users have access to local computational resources, these are typically limited and can result in extended execution times. Applications that are based on iterative processing as in tomographic reconstruction methods require high-performance compute clusters for timely analysis of data. Here, time-sensitive analysis and processing of Advanced Photon Source data on geographically distributed resources are focused on. Two main challenges are considered: (i) modeling of the performance of tomographic reconstruction workflows and (ii) transparent execution of these workflows on distributed resources. For the former, three main stages are considered: (i) data transfer between storage and computational resources, (ii) wait/queue time of reconstruction jobs at compute resources, and (iii) computation of reconstruction tasks. These performance models allow evaluation and estimation of the execution time of any given iterative tomographic reconstruction workflow that runs on geographically distributed resources. For the latter challenge, a workflow management system is built, which can automate the execution of workflows and minimize the user interaction with the underlying infrastructure. The system utilizes Globus to perform secure and efficient data transfer operations. The proposed models and the workflow management system are evaluated by using three high-performance computing and two storage resources, all of which are geographically distributed. Workflows were created with different computational requirements using two compute-intensive tomographic reconstruction algorithms. Experimental evaluation shows that the proposed models and system can be used for selecting the optimum resources, which in turn can provide up to 3.13× speedup (on experimented resources). Moreover, the error rates of the models range between 2.1 and 23.3% (considering workflow execution times), where the accuracy of the model estimations increases with higher computational demands in reconstruction tasks. PMID:27359149

  4. Optimization of tomographic reconstruction workflows on geographically distributed resources

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bicer, Tekin; Gursoy, Doga; Kettimuthu, Rajkumar

    New technological advancements in synchrotron light sources enable data acquisitions at unprecedented levels. This emergent trend affects not only the size of the generated data but also the need for larger computational resources. Although beamline scientists and users have access to local computational resources, these are typically limited and can result in extended execution times. Applications that are based on iterative processing as in tomographic reconstruction methods require high-performance compute clusters for timely analysis of data. Here, time-sensitive analysis and processing of Advanced Photon Source data on geographically distributed resources are focused on. Two main challenges are considered: (i) modeling of the performance of tomographic reconstruction workflows and (ii) transparent execution of these workflows on distributed resources. For the former, three main stages are considered: (i) data transfer between storage and computational resources, (ii) wait/queue time of reconstruction jobs at compute resources, and (iii) computation of reconstruction tasks. These performance models allow evaluation and estimation of the execution time of any given iterative tomographic reconstruction workflow that runs on geographically distributed resources. For the latter challenge, a workflow management system is built, which can automate the execution of workflows and minimize the user interaction with the underlying infrastructure. The system utilizes Globus to perform secure and efficient data transfer operations. The proposed models and the workflow management system are evaluated by using three high-performance computing and two storage resources, all of which are geographically distributed. Workflows were created with different computational requirements using two compute-intensive tomographic reconstruction algorithms. Experimental evaluation shows that the proposed models and system can be used for selecting the optimum resources, which in turn can provide up to 3.13× speedup (on experimented resources). Furthermore, the error rates of the models range between 2.1 and 23.3% (considering workflow execution times), where the accuracy of the model estimations increases with higher computational demands in reconstruction tasks.

  5. Experimental evaluation of a flexible I/O architecture for accelerating workflow engines in ultrascale environments

    DOE PAGES

    Duro, Francisco Rodrigo; Blas, Javier Garcia; Isaila, Florin; ...

    2016-10-06

    The increasing volume of scientific data and the limited scalability and performance of storage systems are currently presenting a significant limitation for the productivity of the scientific workflows running on both high-performance computing (HPC) and cloud platforms. Clearly needed is better integration of storage systems and workflow engines to address this problem. This paper presents and evaluates a novel solution that leverages codesign principles for integrating Hercules—an in-memory data store—with a workflow management system. We consider four main aspects: workflow representation, task scheduling, task placement, and task termination. As a result, the experimental evaluation on both cloud and HPC systems demonstrates significant performance and scalability improvements over existing state-of-the-art approaches.
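
    One plausible reading of the task-placement aspect mentioned above is a locality-aware choice that prefers nodes already holding a task's inputs in the in-memory store; the sketch below is an assumption-based illustration, not the Hercules implementation.

      # Pick the node holding the most of a task's inputs in memory;
      # ties resolve to the first node in the list.
      def place_task(task_inputs, memory_store, nodes):
          return max(nodes, key=lambda n: sum(1 for obj in task_inputs
                                              if n in memory_store.get(obj, set())))

      store = {"partition-0": {"node-a"}, "partition-1": {"node-a", "node-b"}}
      print(place_task(["partition-0", "partition-1"], store,
                       ["node-a", "node-b", "node-c"]))   # prints: node-a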

  6. School-Based Management in Hong Kong: Centralizing or Decentralizing

    ERIC Educational Resources Information Center

    Pang, I-Wah

    2008-01-01

    This paper examined the debate on a reform of school-based management in Hong Kong, which was to set up the Incorporated Management Committee (IMC) to manage the subsidized school. The nature of the debate during legislation and the characteristics of the reform were examined. The advantages, disadvantages and the implications of the reform were…

  7. Metadata Management on the SCEC PetaSHA Project: Helping Users Describe, Discover, Understand, and Use Simulation Data in a Large-Scale Scientific Collaboration

    NASA Astrophysics Data System (ADS)

    Okaya, D.; Deelman, E.; Maechling, P.; Wong-Barnum, M.; Jordan, T. H.; Meyers, D.

    2007-12-01

    Large scientific collaborations, such as the SCEC Petascale Cyberfacility for Physics-based Seismic Hazard Analysis (PetaSHA) Project, involve interactions between many scientists who exchange ideas and research results. These groups must organize, manage, and make accessible their community materials of observational data, derivative (research) results, computational products, and community software. The integration of scientific workflows as a paradigm to solve complex computations provides advantages of efficiency, reliability, repeatability, choices, and ease of use. The underlying resource needed for a scientific workflow to function and create discoverable and exchangeable products is the construction, tracking, and preservation of metadata. In the scientific workflow environment there is a two-tier structure of metadata. Workflow-level metadata and provenance describe operational steps, identity of resources, execution status, and product locations and names. Domain-level metadata essentially define the scientific meaning of data, codes and products. To a large degree the metadata at these two levels are separate. However, between these two levels is a subset of metadata produced at one level but is needed by the other. This crossover metadata suggests that some commonality in metadata handling is needed. SCEC researchers are collaborating with computer scientists at SDSC, the USC Information Sciences Institute, and Carnegie Mellon Univ. in order to perform earthquake science using high-performance computational resources. A primary objective of the "PetaSHA" collaboration is to perform physics-based estimations of strong ground motion associated with real and hypothetical earthquakes located within Southern California. Construction of 3D earth models, earthquake representations, and numerical simulation of seismic waves are key components of these estimations. Scientific workflows are used to orchestrate the sequences of scientific tasks and to access distributed computational facilities such as the NSF TeraGrid. Different types of metadata are produced and captured within the scientific workflows. One workflow within PetaSHA ("Earthworks") performs a linear sequence of tasks with workflow and seismological metadata preserved. Downstream scientific codes ingest these metadata produced by upstream codes. The seismological metadata uses attribute-value pairing in plain text; an identified need is to use more advanced handling methods. Another workflow system within PetaSHA ("Cybershake") involves several complex workflows in order to perform statistical analysis of ground shaking due to thousands of hypothetical but plausible earthquakes. Metadata management has been challenging due to its construction around a number of legacy scientific codes. We describe difficulties arising in the scientific workflow due to the lack of this metadata and suggest corrective steps, which in some cases include the cultural shift of domain science programmers coding for metadata.
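
    The attribute-value metadata handling mentioned above can be illustrated with a small sketch that parses plain-text pairs and merges them with workflow-level provenance; the keys shown are hypothetical examples, not the actual SCEC metadata vocabulary.

      # Parse attribute-value metadata and merge it with workflow-level records.
      def parse_attribute_value(text):
          meta = {}
          for line in text.splitlines():
              if "=" in line:
                  key, value = line.split("=", 1)
                  meta[key.strip()] = value.strip()
          return meta

      domain_meta = parse_attribute_value("""
      source_model = hypothetical_fault_A
      grid_spacing_m = 200
      """)
      workflow_meta = {"job_id": "run-0042", "site": "example-compute-site"}

      # crossover metadata: combine what downstream codes need from both levels
      print({**workflow_meta, **domain_meta})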

  8. Lessons from Implementing a Combined Workflow–Informatics System for Diabetes Management

    PubMed Central

    Zai, Adrian H.; Grant, Richard W.; Estey, Greg; Lester, William T.; Andrews, Carl T.; Yee, Ronnie; Mort, Elizabeth; Chueh, Henry C.

    2008-01-01

    Shortcomings surrounding the care of patients with diabetes have been attributed largely to a fragmented, disorganized, and duplicative health care system that focuses more on acute conditions and complications than on managing chronic disease. To address these shortcomings, we developed a diabetes registry population management application to change the way our staff manages patients with diabetes. Use of this new application has helped us coordinate the responsibilities for intervening and monitoring patients in the registry among different users. Our experiences using this combined workflow-informatics intervention system suggest that integrating a chronic disease registry into clinical workflow for the treatment of chronic conditions creates a useful and efficient tool for managing disease. PMID:18436907

  9. A pattern-based analysis of clinical computer-interpretable guideline modeling languages.

    PubMed

    Mulyar, Nataliya; van der Aalst, Wil M P; Peleg, Mor

    2007-01-01

    Languages used to specify computer-interpretable guidelines (CIGs) differ in their approaches to addressing particular modeling challenges. The main goals of this article are: (1) to examine the expressive power of CIG modeling languages, and (2) to define the differences, from the control-flow perspective, between process languages in workflow management systems and modeling languages used to design clinical guidelines. The pattern-based analysis was applied to guideline modeling languages Asbru, EON, GLIF, and PROforma. We focused on control-flow and left other perspectives out of consideration. We evaluated the selected CIG modeling languages and identified their degree of support of 43 control-flow patterns. We used a set of explicitly defined evaluation criteria to determine whether each pattern is supported directly, indirectly, or not at all. PROforma offers direct support for 22 of 43 patterns, Asbru 20, GLIF 17, and EON 11. All four directly support basic control-flow patterns, cancellation patterns, and some advanced branching and synchronization patterns. None support multiple instances patterns. They offer varying levels of support for synchronizing merge patterns and state-based patterns. Some support a few scenarios not covered by the 43 control-flow patterns. CIG modeling languages are remarkably close to traditional workflow languages from the control-flow perspective, but cover many fewer workflow patterns. CIG languages offer some flexibility that supports modeling of complex decisions and provide ways for modeling some decisions not covered by workflow management systems. Workflow management systems may be suitable for clinical guideline applications.

  10. STORMWATER, PARTICIPATORY ENVIRONMENTAL MANAGEMENT, AND SUSTAINABILITY – WHAT ARE THE CONNECTIONS?

    EPA Science Inventory

    Urban stormwater is typically conveyed to centralized infrastructure, and there is great potential for reducing stormwater runoff quantity through decentralization. In this case we hypothesize that smaller-scale retrofit best management practices (BMPs) such as rain gardens and r...

  11. Asterism: an integrated, complete, and open-source approach for running seismologist continuous data-intensive analysis on heterogeneous systems

    NASA Astrophysics Data System (ADS)

    Ferreira da Silva, R.; Filgueira, R.; Deelman, E.; Atkinson, M.

    2016-12-01

    We present Asterism, an open source data-intensive framework, which combines the Pegasus and dispel4py workflow systems. Asterism aims to simplify the effort required to develop data-intensive applications that run across multiple heterogeneous resources, without users having to: re-formulate their methods according to different enactment systems; manage the data distribution across systems; parallelize their methods; co-place and schedule their methods with computing resources; and store and transfer large/small volumes of data. Asterism's key element is to leverage the strengths of each workflow system: dispel4py allows developing scientific applications locally and then automatically parallelize and scale them on a wide range of HPC infrastructures with no changes to the application's code; Pegasus orchestrates the distributed execution of applications while providing portability, automated data management, recovery, debugging, and monitoring, without users needing to worry about the particulars of the target execution systems. Asterism leverages the level of abstractions provided by each workflow system to describe hybrid workflows where no information about the underlying infrastructure is required beforehand. The feasibility of Asterism has been evaluated using the seismic ambient noise cross-correlation application, a common data-intensive analysis pattern used by many seismologists. The application preprocesses (Phase1) and cross-correlates (Phase2) traces from several seismic stations. The Asterism workflow is implemented as a Pegasus workflow composed of two tasks (Phase1 and Phase2), where each phase represents a dispel4py workflow. Pegasus tasks describe the in/output data at a logical level, the data dependency between tasks, and the e-Infrastructures and the execution engine to run each dispel4py workflow. We have instantiated the workflow using data from 1000 stations from the IRIS services, and run it across two heterogeneous resources described as Docker containers: MPI (Container2) and Storm (Container3) clusters (Figure 1). Each dispel4py workflow is mapped to a particular execution engine, and data transfers between resources are automatically handled by Pegasus. Asterism is freely available online at http://github.com/dispel4py/pegasus_dispel4py.
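
    The cross-correlation step (Phase2) can be illustrated with a small numpy/scipy sketch on synthetic noise records; this only shows the analysis pattern and is not the Asterism or dispel4py code.

      # Cross-correlate synthetic noise records from two stations and report
      # the lag of the correlation peak.
      import numpy as np
      from scipy.signal import correlate, correlation_lags

      rng = np.random.default_rng(0)
      fs = 20.0                                   # samples per second (assumed)
      n = int(fs * 3600)                          # one hour of synthetic noise
      station_a = rng.standard_normal(n)
      station_b = np.roll(station_a, 40) + 0.5 * rng.standard_normal(n)

      xcorr = correlate(station_a, station_b, mode="full", method="fft")
      lags = correlation_lags(len(station_a), len(station_b), mode="full")
      lag = lags[np.argmax(xcorr)]
      print(f"peak correlation at lag {lag} samples ({lag / fs:.2f} s)")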

  12. Streamlining the Change Management with Business Rules

    NASA Technical Reports Server (NTRS)

    Savela, Christopher

    2015-01-01

    This presentation discusses how the organization is trying to streamline workflows and the change management process with business rules. In looking for ways to make things more efficient and save money, one approach is to reduce the work that workflow task approvers have to do when reviewing affected items. The presentation shares the technical details of the business rules, how to implement them, and how to speed up the development process by using the API to demonstrate the rules in action.

  13. Reconceptualizing the Self-Managing School

    ERIC Educational Resources Information Center

    Caldwell, Brian J.

    2008-01-01

    Contrary to the claims of its critics, the introduction of self-managing schools under the ERA and its counterpart in other countries did not lead to the privatization of public education. Self-managing schools have been one manifestation of a general trend to decentralization in public education in many countries since the late 1960s. The…

  14. Losing Voice? Educational Management Organizations and Charter Schools' Educational Programs

    ERIC Educational Resources Information Center

    Bulkley, Katrina

    2005-01-01

    Charter schools are one form of decentralizing public education by shifting power into the hands of school stakeholders by providing them with more "voice" in day-to-day decisions. However, the increasing involvement of educational management organizations (EMOs) as managers of charter schools raises new questions about the influence of school…

  15. Overview of devolution of health services in the Philippines.

    PubMed

    Grundy, J; Healy, V; Gorgolon, L; Sandig, E

    2003-01-01

    In 1991 the Philippines Government introduced a major devolution of national government services, which included the first wave of health sector reform, through the introduction of the Local Government Code of 1991. The Code devolved basic services for agriculture extension, forest management, health services, barangay (township) roads and social welfare to Local Government Units. In 1992, the Philippines Government devolved the management and delivery of health services from the National Department of Health to locally elected provincial, city and municipal governments. The aim of this review is to (i) Provide a background to the introduction of devolution to the health system in the Philippines and to (ii) describe the impact of devolution on the structure and functioning of the health system in defined locations. International literature was reviewed on the subjects of decentralization. Rapid appraisals of health management systems were conducted in both provinces. Additional data were accessed from the rural health information system and previous consultant reports. Subsequent to the introduction of devolution, quality and coverage of health services declined in some locations, particularly in rural and remote areas. It was found that in 1992-1997, system effects included a breakdown in management systems between levels of government, declining utilization particularly in the hospital sector, poor staff morale, a decline in maintenance of infrastructure and under financing of operational costs of services. The aim of decentralization is to widen decision-making space of middle level managers, enhance resource allocations from central to peripheral areas and to improve the efficiency and effectiveness of health services management. The findings of the historical review of devolution in the Philippines reveals some consistencies with the international literature, which describe some negative effects of decentralization, and provide a rationale for the Philippines in undertaking a second wave of reform in order to 'make devolution work'.

  16. COSMOS: Python library for massively parallel workflows

    PubMed Central

    Gafni, Erik; Luquette, Lovelace J.; Lancaster, Alex K.; Hawkins, Jared B.; Jung, Jae-Yoon; Souilmi, Yassine; Wall, Dennis P.; Tonellato, Peter J.

    2014-01-01

    Summary: Efficient workflows to shepherd clinically generated genomic data through the multiple stages of a next-generation sequencing pipeline are of critical importance in translational biomedical science. Here we present COSMOS, a Python library for workflow management that allows formal description of pipelines and partitioning of jobs. In addition, it includes a user interface for tracking the progress of jobs, abstraction of the queuing system and fine-grained control over the workflow. Workflows can be created on traditional computing clusters as well as cloud-based services. Availability and implementation: Source code is available for academic non-commercial research purposes. Links to code and documentation are provided at http://lpm.hms.harvard.edu and http://wall-lab.stanford.edu. Contact: dpwall@stanford.edu or peter_tonellato@hms.harvard.edu. Supplementary information: Supplementary data are available at Bioinformatics online. PMID:24982428

  17. COSMOS: Python library for massively parallel workflows.

    PubMed

    Gafni, Erik; Luquette, Lovelace J; Lancaster, Alex K; Hawkins, Jared B; Jung, Jae-Yoon; Souilmi, Yassine; Wall, Dennis P; Tonellato, Peter J

    2014-10-15

    Efficient workflows to shepherd clinically generated genomic data through the multiple stages of a next-generation sequencing pipeline are of critical importance in translational biomedical science. Here we present COSMOS, a Python library for workflow management that allows formal description of pipelines and partitioning of jobs. In addition, it includes a user interface for tracking the progress of jobs, abstraction of the queuing system and fine-grained control over the workflow. Workflows can be created on traditional computing clusters as well as cloud-based services. Source code is available for academic non-commercial research purposes. Links to code and documentation are provided at http://lpm.hms.harvard.edu and http://wall-lab.stanford.edu. dpwall@stanford.edu or peter_tonellato@hms.harvard.edu. Supplementary data are available at Bioinformatics online. © The Author 2014. Published by Oxford University Press.

  18. Outcome assessment of decentralization of antiretroviral therapy provision in a rural district of Malawi using an integrated primary care model.

    PubMed

    Chan, Adrienne K; Mateyu, Gabriel; Jahn, Andreas; Schouten, Erik; Arora, Paul; Mlotha, William; Kambanji, Marion; van Lettow, Monique

    2010-06-01

    To assess the effect of decentralization (DC) of antiretroviral therapy (ART) provision in a rural district of Malawi using an integrated primary care model. Between October 2004 and December 2008, 8093 patients (63% women) were registered for ART. Of these, 3440 (43%) were decentralized to health centres for follow-up ART care. We applied multivariate regression analysis that adjusted for sex, age, clinical stage at initiation, type of regimen, presence of side effects because of ART, and duration of treatment and follow-up at site of analysis. Patients managed at health centres had lower mortality [adjusted OR 0.19 (95% C.I. 0.15-0.25)] and lower loss to follow-up (defaulted from treatment) [adjusted OR 0.48 (95% C.I. 0.40-0.58)]. During the first 10 months of follow-up, those decentralized to health centres were approximately 60% less likely to default than those not decentralized; and after 10 months of follow-up, 40% less likely to default. DC was significantly associated with a reduced risk of death from 0 to 25 months of follow-up. The lower mortality may be explained by the selection of stable patients for DC, and the mentorship and supportive supervision of lower cadre health workers to identify and refer complicated cases. Decentralization of follow-up ART care to rural health facilities, using an integrated primary care model, appears a safe and effective way to rapidly scale-up ART and improves both geographical equity in access to HIV-related services and adherence to ART.

  19. ERM Ideas and Innovations

    ERIC Educational Resources Information Center

    Schmidt, Kari

    2012-01-01

    In this column, the author discusses how the management of e-books has introduced, at many libraries and in varying degrees, the challenges of maintaining effective technical services workflows. Four different e-book workflows are identified and explored, and the author takes a closer look at how particular variables for each are affected, such as…

  20. Responsibility-Centered Management: A 10-Year Nursing Assessment.

    ERIC Educational Resources Information Center

    McBride, Angela Barron; Neiman, Sandra; Johnson, James

    2000-01-01

    Describes the implementation of responsibility-centered management, a decentralized model giving deans responsibility for expanding and using resources, at Indiana University's nursing school. Discusses how it led to creation of an information-rich environment, strategic decision making, and a performance-based reward structure. (SK)

  1. [Regional health systems management: a case study in Rio Grande do Sul, Brazil].

    PubMed

    Lima, Juliano de Carvalho; Rivera, Francisco Javier Uribe

    2006-10-01

    This article analyzes the management system in a health district in the State of Rio Grande do Sul, Brazil, through qualitative analysis, using a case study as the methodology and macro-organization theory as the analytical framework. For the current management system in the 6th Health Region, a clear mission statement and wide acceptance by health workers are facilitating factors for the current organizational practices within the health system. Nevertheless, the way health coordinators are currently prioritizing their time has diverted necessary resources from critical problems towards more remedial issues. The 6th Health Region has encouraged social control (or public oversight) in order to improve accountability. However, there is room for improvement in quality assurance management, since there were no well-defined goals, objectives, or accountability. Decentralized consultancy provided to the municipalities and the funding model itself have both promoted decentralization and autonomy, although the strategy requires better regional integration and greater commitment in managerial practices.

  2. Septic Systems Case Studies

    EPA Pesticide Factsheets

    A collection of septic systems case studies to help community planners, elected officials, health department staff, state officials, and interested citizens explore alternatives for managing their decentralized wastewater treatment systems.

  3. Experimental evaluation of a flexible I/O architecture for accelerating workflow engines in ultrascale environments

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Duro, Francisco Rodrigo; Blas, Javier Garcia; Isaila, Florin

    The increasing volume of scientific data and the limited scalability and performance of storage systems are currently presenting a significant limitation for the productivity of the scientific workflows running on both high-performance computing (HPC) and cloud platforms. Clearly needed is better integration of storage systems and workflow engines to address this problem. This paper presents and evaluates a novel solution that leverages codesign principles for integrating Hercules—an in-memory data store—with a workflow management system. We consider four main aspects: workflow representation, task scheduling, task placement, and task termination. As a result, the experimental evaluation on both cloud and HPC systems demonstrates significant performance and scalability improvements over existing state-of-the-art approaches.

  4. Tools for automated acoustic monitoring within the R package monitoR

    USGS Publications Warehouse

    Katz, Jonathan; Hafner, Sasha D.; Donovan, Therese

    2016-01-01

    The R package monitoR contains tools for managing an acoustic-monitoring program including survey metadata, template creation and manipulation, automated detection and results management. These tools are scalable for use with small projects as well as larger long-term projects and those with expansive spatial extents. Here, we describe the typical workflow when using the tools in monitoR. This workflow utilizes a generic sequence of functions, with the option for either binary point matching or spectrogram cross-correlation detectors.
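
    As a rough Python illustration of the spectrogram cross-correlation idea behind one of the detectors (monitoR itself is an R package, so this is not its API), a template can be slid across a survey spectrogram and scored by normalized correlation, with detections taken above an assumed cutoff.

      # Slide a spectrogram template across a survey and flag high-scoring frames.
      import numpy as np

      rng = np.random.default_rng(1)
      survey = rng.random((64, 500))            # freq bins x time frames
      template = rng.random((64, 20))           # short call template
      survey[:, 300:320] = template             # embed the "call" in the survey

      def scores(survey, template):
          t = (template - template.mean()) / template.std()
          n_frames = survey.shape[1] - template.shape[1] + 1
          out = np.empty(n_frames)
          for i in range(n_frames):
              w = survey[:, i:i + template.shape[1]]
              w = (w - w.mean()) / w.std()
              out[i] = np.mean(t * w)           # normalized correlation score
          return out

      detections = np.flatnonzero(scores(survey, template) > 0.8)  # assumed cutoff
      print("detections start at frames:", detections)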

  5. Decentralization in Indonesia: lessons from cost recovery rate of district hospitals.

    PubMed

    Maharani, Asri; Femina, Devi; Tampubolon, Gindo

    2015-07-01

    In 1991, Indonesia began a process of decentralization in the health sector which had implications for the country's public hospitals. The public hospitals were given greater authority to manage their own personnel, finance and procurement, with which they were allowed to operate commercial sections in addition to offering public services. These public services are subsidized by the government, although patients still pay certain proportion of fees. The main objectives of health sector decentralization are to increase the ability of public hospitals to cover their costs and to reduce government subsidies. This study investigates the consequences of decentralization on cost recovery rate of public hospitals at district level. We examine five service units (inpatient, outpatient, operating room, laboratory and radiology) in three public hospitals. We find that after 20 years of decentralization, district hospitals still depend on government subsidies, demonstrated by the fact that the cost recovery rate of most service units is less than one. The commercial sections fail to play their role as revenue generator as they are still subsidized by the government. We also find that the bulk of costs are made up of staff salaries and incentives in all units except radiology. As this study constitutes exploratory research, further investigation is needed to find out the reasons behind these results. Published by Oxford University Press in association with The London School of Hygiene and Tropical Medicine © The Author 2014; all rights reserved.

  6. A big data approach for climate change indicators processing in the CLIP-C project

    NASA Astrophysics Data System (ADS)

    D'Anca, Alessandro; Conte, Laura; Palazzo, Cosimo; Fiore, Sandro; Aloisio, Giovanni

    2016-04-01

    Defining and implementing processing chains with multiple (e.g. tens or hundreds of) data analytics operators can be a real challenge in many practical scientific use cases such as climate change indicators. This is usually done via scripts (e.g. bash) on the client side and requires climate scientists to take care of, implement and replicate workflow-like control logic aspects (which may be error-prone too) in their scripts, along with the expected application-level part. Moreover, the large amount of data and the strong I/O demand pose additional challenges related to performance. In this regard, production-level tools for climate data analysis are mostly sequential and there is a lack of big data analytics solutions implementing fine-grain data parallelism or adopting stronger parallel I/O strategies, data locality, workflow optimization, etc. High-level solutions leveraging workflow-enabled big data analytics frameworks for eScience could help scientists in defining and implementing the workflows related to their experiments by exploiting a more declarative, efficient and powerful approach. This talk will start by introducing the main needs and challenges regarding big data analytics workflow management for eScience and will then provide some insights about the implementation of some real use cases related to some climate change indicators on large datasets produced in the context of the CLIP-C project, an EU FP7 project aiming at providing access to climate information of direct relevance to a wide variety of users, from scientists to policy makers and private sector decision makers. All the proposed use cases have been implemented exploiting the Ophidia big data analytics framework. The software stack includes an internal workflow management system, which coordinates, orchestrates, and optimises the execution of multiple scientific data analytics and visualization tasks. Real-time monitoring of workflow execution is also supported through a graphical user interface. In order to address the challenges of the use cases, the implemented data analytics workflows include parallel data analysis, metadata management, virtual file system tasks, map generation, rolling of datasets, and import/export of datasets in NetCDF format. The use cases have been implemented on an HPC cluster of 8 nodes (16 cores/node) of the Athena Cluster available at the CMCC Supercomputing Centre. Benchmark results will also be presented during the talk.
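
    A single climate-change indicator of the kind such workflows compute can be sketched in a few lines on a synthetic gridded field; the "summer days" index below (days with daily maximum temperature above 25 degC) is a generic illustration and does not use the Ophidia operators.

      # Count of hot days per grid cell over one year of synthetic data.
      import numpy as np

      rng = np.random.default_rng(2)
      tmax = rng.normal(loc=18.0, scale=8.0, size=(365, 90, 180))  # (day, lat, lon), degC

      summer_days = (tmax > 25.0).sum(axis=0)
      print("summer-day index, field mean:", summer_days.mean())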

  7. School-Site Management.

    ERIC Educational Resources Information Center

    English, Fenwick W.

    1989-01-01

    School-site management embodies the concept that decisions should be made at the lowest possible level in organizations and intends that no decision be made without the input of those affected by them. The concept also suggests the empowerment of individual units of the system, particularly, the teachers. Centralization versus decentralization is…

  8. A Preliminary Inquiry into School-Based Management.

    ERIC Educational Resources Information Center

    Brown, Daniel J.

    General interest in decentralized decision-making in education is increasing in both Canada and the United States. This paper attempts a preliminary study of school-based management, which shifts some budgetary decision-making authority from the central office to individual schools. Although many academic specialties have explored decentralization…

  9. Implementing School-Based Management in Indonesia. RTI Research Report Series. Occasional Paper

    ERIC Educational Resources Information Center

    Heyward, Mark; Cannon, Robert A.; Sarjono

    2011-01-01

    Indonesia, the world's fourth most populous nation, has been decentralizing its education sector for the past decade. In this context, school-based management is essential for improving the quality of education. A mixed-method, multisite assessment of a project that aimed to improve the management and governance of basic education in Indonesia…

  10. Focusing on the big picture: urban vegetation and eco ...

    EPA Pesticide Factsheets

    Trees and vegetation can be key components of urban green infrastructure and green spaces such as parks and residential yards. Large trees, characterized by broad canopies and high leaf and stem volumes, can intercept a substantial amount of stormwater while promoting evapotranspiration and reducing stormwater runoff and pollutant loads. Urban vegetation cover, height, and volume are likely to be affected not only by local climatic characteristics, but also by complex socio-economic dynamics resulting from management practices and residents' preferences. We examine the benefits provided by private greenspace and present preliminary findings related to the climatic and socio-economic drivers correlated with structural complexity of residential urban vegetation. We use laser (LiDAR) and multispectral remotely-sensed data collected throughout 1400+ neighborhoods and 1.2+ million residential yards across 8 US cities to carry out this analysis. We discuss principles and opportunities to enhance stormwater management using residential greenspace, as well as the larger implications for decentralized stormwater management at city-wide scale.

  11. Framing the difficulties resulting from implementing a Participatory Management Model in a public hospital.

    PubMed

    Bernardes, Andrea; Cummings, Greta; Évora, Yolanda Dora Martinez; Gabriel, Carmen Silvia

    2012-01-01

    This study aims to address difficulties reported by the nursing team during the process of changing the management model in a public hospital in Brazil. This qualitative study used thematic content analysis as proposed by Bardin, and data were analyzed using the theoretical framework of Bolman and Deal. The vertical implementation of Participatory Management contradicted its underlying philosophy and thereby negatively influenced employee acceptance of the change. The decentralized structure of the Participatory Management Model was implemented but shared decision-making was only partially utilized. Despite facilitation of the communication process within the unit, more significant difficulties arose from a lack of inter-unit communication. Values and principles need to be shared by teams; however, that will happen only if managers restructure accountabilities by changing the job descriptions of all team members. Innovative management models that depart from the premise of decentralized decision-making and increased communication encourage accountability, increased motivation and satisfaction, and contribute to improving the quality of care. The contribution of the study is that it describes the complexity of implementing an innovative management model, examines dissent and intentionally acknowledges the difficulties faced by employees in the organization.

  12. A microseismic workflow for managing induced seismicity risk at CO2 storage projects

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Matzel, E.; Morency, C.; Pyle, M.

    2015-10-27

    It is well established that fluid injection has the potential to induce earthquakes—from microseismicity to large, damaging events—by altering state-of-stress conditions in the subsurface. While induced seismicity has not been a major operational issue for carbon storage projects to date, a seismicity hazard exists and must be carefully addressed. Two essential components of effective seismic risk management are (1) sensitive microseismic monitoring and (2) robust data interpretation tools. This report describes a novel workflow, based on advanced processing algorithms applied to microseismic data, to help improve management of seismic risk. This workflow has three main goals: (1) to improve the resolution and reliability of passive seismic monitoring, (2) to extract additional, valuable information from continuous waveform data that is often ignored in standard processing, and (3) to minimize the turn-around time between data collection, interpretation, and decision-making. These three objectives can allow for a better-informed and rapid response to changing subsurface conditions.

  13. Implementing CORAL: An Electronic Resource Management System

    ERIC Educational Resources Information Center

    Whitfield, Sharon

    2011-01-01

    A 2010 electronic resource management survey conducted by Maria Collins of North Carolina State University and Jill E. Grogg of University of Alabama Libraries found that the top six electronic resources management priorities included workflow management, communications management, license management, statistics management, administrative…

  14. A framework for service enterprise workflow simulation with multi-agents cooperation

    NASA Astrophysics Data System (ADS)

    Tan, Wenan; Xu, Wei; Yang, Fujun; Xu, Lida; Jiang, Chuanqun

    2013-11-01

    Dynamic process modelling for service businesses is a key technique for Service-Oriented information systems and service business management, and the workflow model of business processes is the core part of service systems. Service business workflow simulation is the prevalent approach for analysing service business processes dynamically. The generic method for service business workflow simulation is based on discrete-event queuing theory, which lacks flexibility and scalability. In this paper, we propose a service workflow-oriented framework for the process simulation of service businesses using multi-agent cooperation to address the above issues. The social rationality of agents is introduced into the proposed framework. Adopting rationality as one social factor for decision-making strategies, flexible scheduling of activity instances has been implemented. A system prototype has been developed to validate the proposed simulation framework through a business case study.
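
    A minimal sketch of agents competing for activity instances with a social-rationality weight is given below; the scoring rule and parameters are assumptions used only to illustrate the flexible scheduling idea, not the framework's actual algorithm.

      # Agents bid for each activity; the bid mixes individual rationality
      # (prefer a light current load) with a social-rationality weight.
      import random

      random.seed(0)
      agents = [{"name": "a1", "load": 2, "rationality": 0.8},
                {"name": "a2", "load": 0, "rationality": 0.3},
                {"name": "a3", "load": 1, "rationality": 0.6}]

      def bid(agent):
          individual = 1.0 / (1 + agent["load"])
          return 0.5 * individual + 0.5 * agent["rationality"] + random.uniform(0, 0.05)

      for activity in ["check-order", "pack-order", "ship-order"]:
          winner = max(agents, key=bid)
          winner["load"] += 1
          print(activity, "->", winner["name"])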

  15. Improving data collection, documentation, and workflow in a dementia screening study.

    PubMed

    Read, Kevin B; LaPolla, Fred Willie Zametkin; Tolea, Magdalena I; Galvin, James E; Surkis, Alisa

    2017-04-01

    A clinical study team performing three multicultural dementia screening studies identified the need to improve data management practices and facilitate data sharing. A collaboration was initiated with librarians as part of the National Library of Medicine (NLM) informationist supplement program. The librarians identified areas for improvement in the studies' data collection, entry, and processing workflows. The librarians' role in this project was to meet needs expressed by the study team around improving data collection and processing workflows to increase study efficiency and ensure data quality. The librarians addressed the data collection, entry, and processing weaknesses through standardizing and renaming variables, creating an electronic data capture system using REDCap, and developing well-documented, reproducible data processing workflows. NLM informationist supplements provide librarians with valuable experience in collaborating with study teams to address their data needs. For this project, the librarians gained skills in project management, REDCap, and understanding of the challenges and specifics of a clinical research study. However, the time and effort required to provide targeted and intensive support for one study team was not scalable to the library's broader user community.

  16. Scalable and cost-effective NGS genotyping in the cloud.

    PubMed

    Souilmi, Yassine; Lancaster, Alex K; Jung, Jae-Yoon; Rizzo, Ettore; Hawkins, Jared B; Powles, Ryan; Amzazi, Saaïd; Ghazal, Hassan; Tonellato, Peter J; Wall, Dennis P

    2015-10-15

    While next-generation sequencing (NGS) costs have plummeted in recent years, cost and complexity of computation remain substantial barriers to the use of NGS in routine clinical care. The clinical potential of NGS will not be realized until robust and routine whole genome sequencing data can be accurately rendered to medically actionable reports within a time window of hours and at scales of economy in the 10's of dollars. We take a step towards addressing this challenge, by using COSMOS, a cloud-enabled workflow management system, to develop GenomeKey, an NGS whole genome analysis workflow. COSMOS implements complex workflows making optimal use of high-performance compute clusters. Here we show that the Amazon Web Service (AWS) implementation of GenomeKey via COSMOS provides a fast, scalable, and cost-effective analysis of both public benchmarking and large-scale heterogeneous clinical NGS datasets. Our systematic benchmarking reveals important new insights and considerations to produce clinical turn-around of whole genome analysis optimization and workflow management including strategic batching of individual genomes and efficient cluster resource configuration.

  17. Decentralization and health system performance - a focused review of dimensions, difficulties, and derivatives in India.

    PubMed

    Panda, Bhuputra; Thakur, Harshad P

    2016-10-31

    One of the principal goals of any health care system is to improve health through the provision of clinical and public health services. Decentralization as a reform measure aims to improve inputs, management processes and health outcomes, and has political, administrative and financial connotations. It is argued that the robustness of a health system in achieving desirable outcomes is contingent upon the width and depth of 'decision space' at the local level. Studies have used different approaches to examine one or more facets of decentralization and its effect on health system functioning; however, lack of consensus on an acceptable framework is a critical gap in determining its quantum and quality. Theorists have resorted to concepts of 'trust', 'convenience' and 'mutual benefits' to explain, define and measure components of governance in health. In the emerging 'continuum of health services' model, the challenge lies in identifying variables of performance (fiscal allocation, autonomy at local level, perception of key stakeholders, service delivery outputs, etc.) through the prism of decentralization in the first place, and in establishing directed relationships among them. This focused review paper conducted extensive web-based literature search, using PubMed and Google Scholar search engines. After screening of key words and study objectives, we retrieved 180 articles for next round of screening. One hundred and four full articles (three working papers and 101 published papers) were reviewed in totality. We attempted to summarize existing literature on decentralization and health systems performance, explain key concepts and essential variables, and develop a framework for further scientific scrutiny. Themes are presented in three separate segments of dimensions, difficulties and derivatives. Evaluation of local decision making and its effect on health system performance has been studied in a compartmentalized manner. There is sparse evidence about innovations attributable to decentralization. We observed that in India, there is very scant evaluative study on the subject. We didn't come across a single study examining the perception and experiences of local decision makers about the opportunities and challenges they faced. The existing body of evidences may be inadequate to feed into sound policy making. The principles of management hinge on measurement of inputs, processes and outputs. In the conceptual framework we propose three levels of functions (health systems functions, management functions and measurement functions) being intricately related to inputs, processes and outputs. Each level of function encompasses essential elements derived from the synthesis of information gathered through literature review and non-participant observation. We observed that it is difficult to quantify characteristics of governance at institutional, system and individual levels except through proxy means. There is an urgent need to sensitize governments and academia about how best more objective evaluation of 'shared governance' can be undertaken to benefit policy making. The future direction of enquiry should focus on context-specific evidence of its effect on the entire spectrum of health system, with special emphasis on efficiency, community participation, human resource management and quality of services.

  18. PGen: large-scale genomic variations analysis workflow and browser in SoyKB.

    PubMed

    Liu, Yang; Khan, Saad M; Wang, Juexin; Rynge, Mats; Zhang, Yuanxun; Zeng, Shuai; Chen, Shiyuan; Maldonado Dos Santos, Joao V; Valliyodan, Babu; Calyam, Prasad P; Merchant, Nirav; Nguyen, Henry T; Xu, Dong; Joshi, Trupti

    2016-10-06

    With the advances in next-generation sequencing (NGS) technology and significant reductions in sequencing costs, it is now possible to sequence large collections of germplasm in crops for detecting genome-scale genetic variations and to apply the knowledge towards improvements in traits. To efficiently facilitate large-scale NGS resequencing data analysis of genomic variations, we have developed "PGen", an integrated and optimized workflow using the Extreme Science and Engineering Discovery Environment (XSEDE) high-performance computing (HPC) virtual system, iPlant cloud data storage resources and Pegasus workflow management system (Pegasus-WMS). The workflow allows users to identify single nucleotide polymorphisms (SNPs) and insertion-deletions (indels), perform SNP annotations and conduct copy number variation analyses on multiple resequencing datasets in a user-friendly and seamless way. We have developed both a Linux version in GitHub ( https://github.com/pegasus-isi/PGen-GenomicVariations-Workflow ) and a web-based implementation of the PGen workflow integrated within the Soybean Knowledge Base (SoyKB), ( http://soykb.org/Pegasus/index.php ). Using PGen, we identified 10,218,140 single-nucleotide polymorphisms (SNPs) and 1,398,982 indels from analysis of 106 soybean lines sequenced at 15X coverage. 297,245 non-synonymous SNPs and 3330 copy number variation (CNV) regions were identified from this analysis. SNPs identified using PGen from additional soybean resequencing projects adding to 500+ soybean germplasm lines in total have been integrated. These SNPs are being utilized for trait improvement using genotype to phenotype prediction approaches developed in-house. In order to browse and access NGS data easily, we have also developed an NGS resequencing data browser ( http://soykb.org/NGS_Resequence/NGS_index.php ) within SoyKB to provide easy access to SNP and downstream analysis results for soybean researchers. PGen workflow has been optimized for the most efficient analysis of soybean data using thorough testing and validation. This research serves as an example of best practices for development of genomics data analysis workflows by integrating remote HPC resources and efficient data management with ease of use for biological users. PGen workflow can also be easily customized for analysis of data in other species.
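
    As a rough, hedged illustration of the kind of variant classification such a pipeline reports (the SNP and indel counts above), the Python sketch below tallies SNPs and indels from a VCF file by comparing REF and ALT allele lengths. The file name and the simplified classification rule are assumptions for illustration and are not part of PGen.

        # Minimal sketch: classify VCF records as SNPs or indels by allele length.
        # "variants.vcf" is a hypothetical input; real pipelines handle
        # multi-allelic sites, filters and symbolic alleles more carefully.
        def count_variants(vcf_path):
            snps, indels = 0, 0
            with open(vcf_path) as vcf:
                for line in vcf:
                    if line.startswith("#"):          # skip header lines
                        continue
                    fields = line.rstrip("\n").split("\t")
                    ref, alts = fields[3], fields[4].split(",")
                    for alt in alts:
                        if len(ref) == 1 and len(alt) == 1:
                            snps += 1                 # single-nucleotide change
                        else:
                            indels += 1               # length difference -> indel
            return snps, indels

        if __name__ == "__main__":
            s, i = count_variants("variants.vcf")
            print(f"SNPs: {s}, indels: {i}")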

  19. Talkoot Portals: Discover, Tag, Share, and Reuse Collaborative Science Workflows

    NASA Astrophysics Data System (ADS)

    Wilson, B. D.; Ramachandran, R.; Lynnes, C.

    2009-05-01

    A small but growing number of scientists are beginning to harness Web 2.0 technologies, such as wikis, blogs, and social tagging, as a transformative way of doing science. These technologies provide researchers with easy mechanisms to critique, suggest and share ideas, data and algorithms. At the same time, large suites of algorithms for science analysis are being made available as remotely-invokable Web Services, which can be chained together to create analysis workflows. This provides the research community an unprecedented opportunity to collaborate by sharing their workflows with one another, reproducing and analyzing research results, and leveraging colleagues' expertise to expedite the process of scientific discovery. However, wikis and similar technologies are limited to text, static images and hyperlinks, providing little support for collaborative data analysis. A team of information technology and Earth science researchers from multiple institutions has come together to improve community collaboration in science analysis by developing a customizable "software appliance" to build collaborative portals for Earth Science services and analysis workflows. The critical requirement is that researchers (not just information technologists) be able to build collaborative sites around service workflows within a few hours. We envision online communities coming together, much like Finnish "talkoot" (a barn raising), to build a shared research space. Talkoot extends a freely available, open source content management framework with a series of modules specific to Earth Science for registering, creating, managing, discovering, tagging and sharing Earth Science web services and workflows for science data processing, analysis and visualization. Users will be able to author a "science story" in shareable web notebooks, including plots or animations, backed up by an executable workflow that directly reproduces the science analysis. New services and workflows of interest will be discoverable using tag search, and advertised using "service casts" and "interest casts" (Atom feeds). Multiple science workflow systems will be plugged into the system, with initial support for UAH's Mining Workflow Composer and the open-source Active BPEL engine, and JPL's SciFlo engine and the VizFlow visual programming interface. With the ability to share and execute analysis workflows, Talkoot portals can be used to do collaborative science in addition to communicating ideas and results. It will be useful for different science domains, mission teams, research projects and organizations. Thus, it will help to solve the "sociological" problem of bringing together disparate groups of researchers, and the technical problem of advertising, discovering, developing, documenting, and maintaining inter-agency science workflows. The presentation will discuss the goals of and barriers to Science 2.0, the social web technologies employed in the Talkoot software appliance (e.g. CMS, social tagging, personal presence, advertising by feeds, etc.), illustrate the resulting collaborative capabilities, and show early prototypes of the web interfaces (e.g. embedded workflows).

  20. Computers and Management Structure: Some Empirical Findings Re-examined

    ERIC Educational Resources Information Center

    Robey, Daniel

    1977-01-01

    Studies that relate computerization to either centralization or decentralization of organizational decision making are reviewed. Four issues are addressed that relate to conceptual or methodological problems. (Author/MLF)

  1. A Customized Approach to Talent Management at the University of Pennsylvania

    ERIC Educational Resources Information Center

    Edwards, Beverly

    2008-01-01

    The University of Pennsylvania places great emphasis on talent management, specifically on attracting and retaining top-notch people. One way it accomplishes this is by offering several avenues by which its employees can further their careers. Penn's large, decentralized structure provides significant opportunities for career growth; however,…

  2. An Evolving Simulation/Gaming Process to Facilitate Adaptive Watershed Management in Northern Mountainous Thailand

    ERIC Educational Resources Information Center

    Barnaud, Cecile; Promburom, Tanya; Trebuil, Guy; Bousquet, Francois

    2007-01-01

    The decentralization of natural resource management provides an opportunity for communities to increase their participation in related decision making. Research should propose adapted methodologies enabling the numerous stakeholders of these complex socioecological settings to define their problems and identify agreed-on solutions. This article…

  3. How Much Is Enough? Minimal Responses of Water Quality and Stream Biota to Partial Retrofit Stormwater Management in a Suburban Neighborhood

    EPA Science Inventory

    Decentralized stormwater management approaches (e.g., biofiltration swales, pervious pavement, green roofs, rain gardens) that capture, detain, infiltrate, and filter runoff are now commonly used to minimize the impacts of stormwater runoff from impervious surfaces on aquatic eco...

  4. Energy Management System Successful in Indiana Elementary School.

    ERIC Educational Resources Information Center

    School Business Affairs, 1984

    1984-01-01

    The new Oregon-Davis Elementary School in rural Indiana embodies state-of-the-art energy management. Its environmental systems include thorough insulation, dual heating and cooling equipment for flexible loads, and decentralized computer controls. A heat recovery unit and variable-air-volume discharge ducts also contribute to conservation. (MCG)

  5. Computerized management information systems and organizational structures

    NASA Technical Reports Server (NTRS)

    Zannetos, Z. S.; Sertel, M. R.

    1970-01-01

    The computerized management of information systems and organizational structures is discussed. The subjects presented are: (1) critical factors favoring centralization and decentralization of organizations, (2) classification of organizations by relative structure, (3) attempts to measure change in organization structure, and (4) impact of information technology developments on organizational structure changes.

  6. School-Based Management and Arts Education: Lessons from Chicago

    ERIC Educational Resources Information Center

    Fitzpatrick, Kate R.

    2012-01-01

    School-based management, or local school control, is an organizational school reform effort aimed at decentralizing school decision-making that has become prevalent in districts throughout the United States. Using the groundbreaking Chicago system of local school control as an exemplar, this article outlines the implications of such reform efforts…

  7. The Rise of Networks: How Decentralized Management Is Improving Schools

    ERIC Educational Resources Information Center

    Kelleher, Maureen

    2014-01-01

    School districts across the country are shifting away from their traditional management paradigm--a central office that directs its schools through uniform mandates and policies--toward a new vision where district leaders support autonomous schools while holding them accountable for student performance. The advent of new governance mechanisms…

  8. Towards Exascale Seismic Imaging and Inversion

    NASA Astrophysics Data System (ADS)

    Tromp, J.; Bozdag, E.; Lefebvre, M. P.; Smith, J. A.; Lei, W.; Ruan, Y.

    2015-12-01

    Post-petascale supercomputers are now available to solve complex scientific problems that were thought unreachable a few decades ago. They also bring a cohort of concerns tied to obtaining optimum performance. Several issues are currently being investigated by the HPC community. These include energy consumption, fault resilience, scalability of the current parallel paradigms, workflow management, I/O performance and feature extraction with large datasets. In this presentation, we focus on the last three issues. In the context of seismic imaging and inversion, in particular for simulations based on adjoint methods, workflows are well defined. They consist of a few collective steps (e.g., mesh generation or model updates) and of a large number of independent steps (e.g., forward and adjoint simulations of each seismic event, pre- and postprocessing of seismic traces). The greater goal is to reduce the time to solution, that is, obtaining a more precise representation of the subsurface as fast as possible. This brings us to consider both the workflow in its entirety and the parts comprising it. The usual approach is to speed up the purely computational parts based on code optimization in order to reach higher FLOPS and better memory management. This still remains an important concern, but larger scale experiments show that the imaging workflow suffers from severe I/O bottlenecks. Such limitations occur both for purely computational data and seismic time series. The latter are dealt with by the introduction of a new Adaptable Seismic Data Format (ASDF). Parallel I/O libraries, namely HDF5 and ADIOS, are used to drastically reduce the cost of disk access. Parallel visualization tools, such as VisIt, are able to take advantage of ADIOS metadata to extract features and display massive datasets. Because large parts of the workflow are embarrassingly parallel, we are investigating the possibility of automating the imaging process with the integration of scientific workflow management tools, specifically Pegasus.
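
    The split described above, a few collective steps plus many independent per-event simulations, can be sketched with a plain process pool. The sketch below is a generic illustration with placeholder functions (run_forward_simulation, update_model); it is not the solver or workflow tooling used by the authors.

        # Generic sketch of an adjoint-style iteration: independent per-event work
        # fans out in parallel, then a collective model update gathers the results.
        from concurrent.futures import ProcessPoolExecutor

        def run_forward_simulation(event_id, model):
            # Placeholder: in reality this launches a large MPI solver per event.
            return {"event": event_id, "misfit": 0.0}

        def update_model(model, event_results):
            # Placeholder collective step (e.g., gradient summation, smoothing).
            return model

        def one_iteration(model, event_ids, workers=4):
            with ProcessPoolExecutor(max_workers=workers) as pool:
                results = list(pool.map(run_forward_simulation,
                                        event_ids, [model] * len(event_ids)))
            return update_model(model, results)

        if __name__ == "__main__":
            print(one_iteration(model={"vp": 1.0}, event_ids=range(8)))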

  9. Design and Applications of a GeoSemantic Framework for Integration of Data and Model Resources in Hydrologic Systems

    NASA Astrophysics Data System (ADS)

    Elag, M.; Kumar, P.

    2016-12-01

    Hydrologists today have to integrate resources such as data and models, which originate and reside in multiple autonomous and heterogeneous repositories over the Web. Several resource management systems have emerged within geoscience communities for sharing long-tail data, which are collected by individual or small research groups, and long-tail models, which are developed by scientists or small modeling communities. While these systems have increased the availability of resources within geoscience domains, deficiencies remain due to the heterogeneity in the methods, which are used to describe, encode, and publish information about resources over the Web. This heterogeneity limits our ability to access the right information in the right context so that it can be efficiently retrieved and understood without the Hydrologist's mediation. A primary challenge of the Web today is the lack of the semantic interoperability among the massive number of resources, which already exist and are continually being generated at rapid rates. To address this challenge, we have developed a decentralized GeoSemantic (GS) framework, which provides three sets of micro-web services to support (i) semantic annotation of resources, (ii) semantic alignment between the metadata of two resources, and (iii) semantic mediation among Standard Names. Here we present the design of the framework and demonstrate its application for semantic integration between data and models used in the IML-CZO. First we show how the IML-CZO data are annotated using the Semantic Annotation Services. Then we illustrate how the Resource Alignment Services and Knowledge Integration Services are used to create a semantic workflow among TopoFlow model, which is a spatially-distributed hydrologic model and the annotated data. Results of this work are (i) a demonstration of how the GS framework advances the integration of heterogeneous data and models of water-related disciplines by seamless handling of their semantic heterogeneity, (ii) an introduction of new paradigm for reusing existing and new standards as well as tools and models without the need of their implementation in the Cyberinfrastructures of water-related disciplines, and (iii) an investigation of a methodology by which distributed models can be coupled in a workflow using the GS services.
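
    As a hedged illustration of what calling such micro-web services might look like from a client script, the sketch below posts dataset metadata to an annotation service and then requests an alignment against model metadata. The endpoints, payloads and response shapes are invented placeholders, not the GS framework's actual interface.

        # Sketch: send a dataset's metadata to a hypothetical annotation service,
        # then ask a hypothetical alignment service to match it against a model.
        import requests

        ANNOTATE_URL = "https://gs.example.org/annotate"   # hypothetical
        ALIGN_URL = "https://gs.example.org/align"         # hypothetical

        dataset_meta = {"variable": "soil moisture", "units": "m3 m-3"}
        model_meta = {"input": "volumetric water content", "units": "m3 m-3"}

        annotated = requests.post(ANNOTATE_URL, json=dataset_meta, timeout=10).json()
        alignment = requests.post(ALIGN_URL,
                                  json={"resource_a": annotated,
                                        "resource_b": model_meta},
                                  timeout=10).json()
        print(alignment)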

  10. Three alternative structural configurations for phlebotomy: a comparison of effectiveness.

    PubMed

    Mannion, Heidi; Nadder, Teresa

    2007-01-01

    This study was designed to compare the effectiveness of three alternative structural configurations for inpatient phlebotomy. It was hypothesized that decentralized phlebotomy was less effective than centralized inpatient phlebotomy. A non-experimental prospective survey design was conducted at the institution level. Laboratory managers completed an organizational survey and collected data on inpatient blood specimens during a 30-day data collection period. A random sample (n=31) of hospitals with onsite laboratories in the United States was selected from a database purchased from the Joint Commission on Accreditation of Healthcare Organizations (JCAHO). Effectiveness of the blood collection process was measured by the percentage of specimens rejected during the data collection period. Analysis of variance showed a statistically significant difference in the percentage of specimens rejected for centralized, hybrid, and decentralized phlebotomy configurations [F (2, 28) = 4.27, p = .02] with an effect size of .23. Post-hoc comparison using Tukey's HSD indicated that the mean percentage of specimens rejected for centralized phlebotomy (M = .045, SD = 0.36) was significantly different from the decentralized configuration (M = 1.42, SD = 0.92, p = .03); the centralized configuration was thus found to be more effective than the decentralized configuration.
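
    For readers unfamiliar with the analysis used here, the sketch below runs the same kind of one-way ANOVA and a pairwise follow-up comparison on made-up specimen-rejection percentages; the numbers are hypothetical and do not reproduce the study's data.

        # Hypothetical rejection-rate data (%) for three phlebotomy configurations.
        from scipy import stats

        centralized = [0.3, 0.5, 0.4, 0.6, 0.5]
        hybrid = [0.8, 1.0, 0.7, 0.9, 1.1]
        decentralized = [1.2, 1.6, 1.4, 1.5, 1.3]

        f_stat, p_value = stats.f_oneway(centralized, hybrid, decentralized)
        print(f"F = {f_stat:.2f}, p = {p_value:.3f}")

        # A simple follow-up contrast (Welch t-test) between the extreme groups;
        # the study itself used Tukey's HSD for post-hoc comparisons.
        t_stat, p_pair = stats.ttest_ind(centralized, decentralized, equal_var=False)
        print(f"centralized vs decentralized: t = {t_stat:.2f}, p = {p_pair:.3f}")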

  11. Matching services with local preferences: managing primary education services in a rural district of India.

    PubMed

    Subrahmanian, R

    1999-02-01

    India's poorest households have particularly little access to education. Urgent reforms are therefore needed to improve the universal availability of quality basic services and universal access to those services. At least 32 million children in India are estimated to not be enrolled and attending school. These children must be brought into schools in order to meet the goal of Universal Elementary Education (UEE). Widespread support exists for the decentralization of public services due to the equity and efficiency benefits associated with it. In particular, decentralization is seen to facilitate the matching of services with local preferences, increasing the chances of meeting policy goals. This approach is explored in the context of research conducted in a village of Raichur district, where poor households' preferences with regard to school timing are analyzed. Sections consider the equity and efficiency merits of decentralization, the agenda for improving education service delivery in India, users' relationship to the education system in Raichur district, how preferences are revealed, whose preferences are important in the conflict between local and policy perspectives, preference heterogeneity in the village context, and whether aspects of education services can be selectively decentralized.

  12. Development of an Excel-based laboratory information management system for improving workflow efficiencies in early ADME screening.

    PubMed

    Lu, Xinyan

    2016-01-01

    There is a clear requirement for enhancing laboratory information management during early absorption, distribution, metabolism and excretion (ADME) screening. The application of a commercial laboratory information management system (LIMS) is limited by complexity, insufficient flexibility, high costs and extended timelines. An improved custom in-house LIMS for ADME screening was developed using Excel. All Excel templates were generated through macros and formulae, and information flow was streamlined as much as possible. This system has been successfully applied in task generation, process control and data management, with a reduction in both labor time and human error rates. An Excel-based LIMS can provide a simple, flexible and cost/time-saving solution for improving workflow efficiencies in early ADME screening.
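
    As a hedged illustration of how an Excel-based LIMS template might be generated programmatically (the paper used Excel macros and formulae; the openpyxl-based sketch below and its column names are assumptions, not the authors' implementation):

        # Sketch: generate a simple ADME sample-tracking template as an .xlsx file.
        from openpyxl import Workbook

        def build_template(path="adme_tracking_template.xlsx"):
            wb = Workbook()
            ws = wb.active
            ws.title = "Samples"
            # Hypothetical columns for an early-ADME screening worklist.
            ws.append(["Sample ID", "Compound", "Assay", "Plate", "Well",
                       "Submitted", "Status", "Result"])
            ws.freeze_panes = "A2"          # keep the header row visible
            wb.save(path)
            return path

        if __name__ == "__main__":
            print("Wrote", build_template())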

  13. Integrated Automatic Workflow for Phylogenetic Tree Analysis Using Public Access and Local Web Services.

    PubMed

    Damkliang, Kasikrit; Tandayya, Pichaya; Sangket, Unitsa; Pasomsub, Ekawat

    2016-11-28

    At present, coding sequences (CDS) have been discovered and larger CDS are being revealed frequently. Approaches and related tools have also been developed and upgraded concurrently, especially for phylogenetic tree analysis. This paper proposes an integrated automatic Taverna workflow for the phylogenetic tree inferring analysis using public access web services at European Bioinformatics Institute (EMBL-EBI) and Swiss Institute of Bioinformatics (SIB), and our own deployed local web services. The workflow input is a set of CDS in the Fasta format. The workflow supports 1,000 to 20,000 bootstrap replicates. The workflow performs tree inference using the Parsimony (PARS), Distance Matrix - Neighbor Joining (DIST-NJ), and Maximum Likelihood (ML) algorithms of the EMBOSS PHYLIPNEW package, based on our proposed Multiple Sequence Alignment (MSA) similarity score. The local web services are implemented and deployed into two types using the Soaplab2 and Apache Axis2 deployment. These provide SOAP and Java Web Service (JWS) WSDL endpoints to Taverna Workbench, a workflow manager. The workflow has been validated, the performance has been measured, and its results have been verified. Our workflow's execution time is less than ten minutes for inferring a tree with 10,000 bootstrap replicates. This paper proposes a new integrated automatic workflow which will be beneficial to bioinformaticians with an intermediate level of knowledge and experience. All local services have been deployed at our portal http://bioservices.sci.psu.ac.th.

  14. Integrated Automatic Workflow for Phylogenetic Tree Analysis Using Public Access and Local Web Services.

    PubMed

    Damkliang, Kasikrit; Tandayya, Pichaya; Sangket, Unitsa; Pasomsub, Ekawat

    2016-03-01

    At present, coding sequences (CDS) have been discovered and larger CDS are being revealed frequently. Approaches and related tools have also been developed and upgraded concurrently, especially for phylogenetic tree analysis. This paper proposes an integrated automatic Taverna workflow for the phylogenetic tree inferring analysis using public access web services at European Bioinformatics Institute (EMBL-EBI) and Swiss Institute of Bioinformatics (SIB), and our own deployed local web services. The workflow input is a set of CDS in the Fasta format. The workflow supports 1,000 to 20,000 bootstrap replicates. The workflow performs tree inference using the Parsimony (PARS), Distance Matrix - Neighbor Joining (DIST-NJ), and Maximum Likelihood (ML) algorithms of the EMBOSS PHYLIPNEW package, based on our proposed Multiple Sequence Alignment (MSA) similarity score. The local web services are implemented and deployed into two types using the Soaplab2 and Apache Axis2 deployment. These provide SOAP and Java Web Service (JWS) WSDL endpoints to Taverna Workbench, a workflow manager. The workflow has been validated, the performance has been measured, and its results have been verified. Our workflow's execution time is less than ten minutes for inferring a tree with 10,000 bootstrap replicates. This paper proposes a new integrated automatic workflow which will be beneficial to bioinformaticians with an intermediate level of knowledge and experience. All local services have been deployed at our portal http://bioservices.sci.psu.ac.th.

  15. Interplay between Clinical Guidelines and Organizational Workflow Systems. Experience from the MobiGuide Project.

    PubMed

    Shabo, Amnon; Peleg, Mor; Parimbelli, Enea; Quaglini, Silvana; Napolitano, Carlo

    2016-12-07

    Implementing a decision-support system within a healthcare organization requires integration of clinical domain knowledge with resource constraints. Computer-interpretable guidelines (CIG) are excellent instruments for addressing clinical aspects while business process management (BPM) languages and Workflow (Wf) engines manage the logistic organizational constraints. Our objective is the orchestration of all the relevant factors needed for a successful execution of patient's care pathways, especially when spanning the continuum of care, from acute to community or home care. We considered three strategies for integrating CIGs with organizational workflows: extending the CIG or BPM languages and their engines, or creating an interplay between them. We used the interplay approach to implement a set of use cases arising from a CIG implementation in the domain of Atrial Fibrillation. To provide a more scalable and standards-based solution, we explored the use of Cross-Enterprise Document Workflow Integration Profile. We describe our proof-of-concept implementation of five use cases. We utilized the Personal Health Record of the MobiGuide project to implement a loosely-coupled approach between the Activiti BPM engine and the Picard CIG engine. Changes in the PHR were detected by polling. IHE profiles were used to develop workflow documents that orchestrate cross-enterprise execution of cardioversion. Interplay between CIG and BPM engines can support orchestration of care flows within organizational settings.
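
    The loosely-coupled interplay described, detecting PHR changes by polling and then driving the workflow engine, can be outlined roughly as below. The endpoints, payloads and polling interval are hypothetical placeholders; the real Activiti and Picard engine APIs are not reproduced here.

        # Rough sketch of a polling bridge between a PHR and a workflow engine.
        # Both URLs and the JSON shapes are hypothetical.
        import time
        import requests

        PHR_URL = "https://phr.example.org/api/changes"        # hypothetical
        BPM_URL = "https://bpm.example.org/api/process-start"  # hypothetical

        def poll_once(last_seen_id):
            resp = requests.get(PHR_URL, params={"since": last_seen_id}, timeout=10)
            resp.raise_for_status()
            for change in resp.json().get("changes", []):
                # Each relevant PHR change triggers an organizational workflow.
                requests.post(BPM_URL, json={"processKey": "cardioversion",
                                             "patientId": change["patientId"]},
                              timeout=10)
                last_seen_id = change["id"]
            return last_seen_id

        if __name__ == "__main__":
            last = 0
            while True:
                last = poll_once(last)
                time.sleep(60)   # polling interval (hypothetical)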

  16. A federated information management system for the Deep Space Network. M.S. Thesis - Univ. of Southern California

    NASA Technical Reports Server (NTRS)

    Dobinson, E.

    1982-01-01

    General requirements for an information management system for the deep space network (DSN) are examined. A concise review of available database management system technology is presented. It is recommended that a federation of logically decentralized databases be implemented for the Network Information Management System of the DSN. Overall characteristics of the federation are specified, as well as reasons for adopting this approach.

  17. Management system, organizational climate and performance relationships

    NASA Technical Reports Server (NTRS)

    Davis, B. D.

    1979-01-01

    Seven aerospace firms were investigated to determine if a relationship existed among management systems, organizational climate, and organization performance. Positive relationships were found between each of these variables, but a statistically significant relationship existed only between the management system and organizational climate. The direction and amount of communication and the degree of decentralized decision-making, elements of the management system, also had a statistically significant relationship with organization performance.

  18. [Mechanisms for allocating financial resources after decentralization in the state of Jalisco].

    PubMed

    Pérez-Núñez, Ricardo; Arredondo-López, Armando; Pelcastre, Blanca

    2006-01-01

    To analyze, from the decision maker's perspective, the financial resource allocation process of the health services of the state of Jalisco (SSJ, per its abbreviation in Spanish), within the context of decentralization. Through a qualitative approximation using semi-structured individual interviews of key personnel in managerial positions as the method for compiling information, the experience of the SSJ in financial resource allocation was documented. From September to November 2003, the perception of managers and administrators regarding their level of autonomy in decision-making was explored as well as the process they follow for the allocation of financial resources, in order to identify the criteria they use and their justifications. From the point of view of decision-makers, autonomy of the SSJ has increased considerably since decentralization was implemented, although the degree of decision-making freedom remains limited due mainly to high administrative costs associated with salaries. In this sense, the implications attributable to labor situations that are still centralized are evident. Some innovative systems for financial resource allocation have been established in the SSJ for the sanitary regions and hospitals based upon administrative-managerial and productivity incentives. Adjustments were also made for degree of marginalization and population lag, under the equity criterion. General work conditions and decision-making autonomy of the sanitary regions constitute outstanding aspects pending decentralization. Although decentralization has granted more autonomy to the SSJ, the level of decision-making freedom for allocating financial resources has been held within the highest hierarchical levels.

  19. Decentralized control of human visceral leishmaniasis in endemic urban areas of Brazil: a literature review.

    PubMed

    Menon, Sonia S; Rossi, Rodolfo; Nshimyumukiza, Leon; Zinszer, Kate

    2016-01-01

    Human migration and concomitant HIV infections are likely to bring about major changes in the epidemiology of some parasitic infections in Brazil. Human visceral leishmaniasis (HVL) control is particularly fraught with intricacies. It is against a backdrop of decentralized health care that the complex HVL control initiatives are brought to bear. This comprehensive review aims to explore the obstacles facing decentralized HVL control in urban endemic areas in Brazil. A literature search was carried out in December 2015 by means of three databases: MEDLINE, Google Scholar, and Web of Science. Although there have been many strides that have been made in elucidating the eco-epidemiology of Leishmania infantum, which forms the underpinnings of the national control program, transmission risk factors for HVL are still insufficiently elucidated in urban settings. Decentralized HVL epidemiological surveillance and control for animal reservoirs and vectors may compromise sustainability. In addition, it may hamper timely human HVL case management. With the burgeoning of the HIV-HVL co-infection, the potential human transmission may be underestimated. HVL is a disease with focal transmission at a critical juncture, which warrants that the bottlenecks facing the control program within contexts of decentralized healthcare systems be taken into account. In addition, HIV-driven HVL epidemics may substantially increase the transmission potential of the human reservoir. Calculating the basic reproductive number to fine-tune interventions will have to take into consideration the specific socio-economic development context.

  20. Using Semantic Components to Represent Dynamics of an Interdisciplinary Healthcare Team in a Multi-Agent Decision Support System.

    PubMed

    Wilk, Szymon; Kezadri-Hamiaz, Mounira; Rosu, Daniela; Kuziemsky, Craig; Michalowski, Wojtek; Amyot, Daniel; Carrier, Marc

    2016-02-01

    In healthcare organizations, clinical workflows are executed by interdisciplinary healthcare teams (IHTs) that operate in ways that are difficult to manage. Responding to a need to support such teams, we designed and developed the MET4 multi-agent system that allows IHTs to manage patients according to presentation-specific clinical workflows. In this paper, we describe a significant extension of the MET4 system that allows for supporting rich team dynamics (understood as team formation, management and task-practitioner allocation), including selection and maintenance of the most responsible physician and more complex rules of selecting practitioners for the workflow tasks. In order to develop this extension, we introduced three semantic components: (1) a revised ontology describing concepts and relations pertinent to IHTs, workflows, and managed patients, (2) a set of behavioral rules describing the team dynamics, and (3) an instance base that stores facts corresponding to instances of concepts from the ontology and to relations between these instances. The semantic components are represented in first-order logic and they can be automatically processed using theorem proving and model finding techniques. We employ these techniques to find models that correspond to specific decisions controlling the dynamics of IHT. In the paper, we present the design of extended MET4 with a special focus on the new semantic components. We then describe its proof-of-concept implementation using the WADE multi-agent platform and the Z3 solver (theorem prover/model finder). We illustrate the main ideas discussed in the paper with a clinical scenario of an IHT managing a patient with chronic kidney disease.
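
    To make the model-finding step concrete, here is a small, hedged sketch using the Z3 Python bindings: given workflow tasks, practitioners and a role constraint (all invented for illustration), the solver finds an allocation that satisfies them. This is not the MET4 ontology or rule set, only the flavour of the technique.

        # Sketch: find a task-practitioner allocation with the Z3 solver.
        from z3 import Bool, Solver, Or, And, Not, sat, is_true

        tasks = ["assess", "prescribe", "follow_up"]
        practitioners = ["nurse", "physician"]
        # Hypothetical role constraint: only a physician may prescribe.
        allowed = {"assess": {"nurse", "physician"},
                   "prescribe": {"physician"},
                   "follow_up": {"nurse", "physician"}}

        assign = {(t, p): Bool(f"{t}_{p}") for t in tasks for p in practitioners}
        s = Solver()
        for t in tasks:
            # Each task is allocated to exactly one practitioner.
            s.add(Or([assign[(t, p)] for p in practitioners]))
            for p1 in practitioners:
                for p2 in practitioners:
                    if p1 < p2:
                        s.add(Not(And(assign[(t, p1)], assign[(t, p2)])))
            # Respect the role constraint.
            for p in practitioners:
                if p not in allowed[t]:
                    s.add(Not(assign[(t, p)]))

        if s.check() == sat:
            m = s.model()
            for (t, p), var in assign.items():
                if is_true(m.evaluate(var, model_completion=True)):
                    print(t, "->", p)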

  1. Decentralization strategies and provider incentives in healthcare: evidence from the english national health service.

    PubMed

    Mannion, Russell; Goddard, Maria; Kuhn, Michael; Bate, Angela

    2005-01-01

    This article examines the incentive effects of delegating operational and financial decision making from central government to local healthcare providers. It addresses the economic consequences of a contemporary policy initiative in the English National Health Service (NHS)-earned autonomy. This policy entails awarding operational autonomy to 'front-line' organisations that are assessed to be meeting national performance targets. In doing so, it introduces new types of incentives into the healthcare system, changes the nature of established agency relationships and represents a novel approach to performance management. Theoretical elements of a principal-agent model are used to examine the impact of decentralization in the context of the results of an empirical study that elicited the perceptions of senior hospital managers regarding the incentive effects of earned autonomy. A multi-method approach was adopted. In order to capture the breadth of policy impact, we conducted a national postal questionnaire survey of all Chief Executives in acute-care hospital Trusts in England (n = 173). To provide added depth and richness to our understanding of the impact and incentive effects of earned autonomy at an organisational level, we interviewed senior managers in a purposeful sample of eight acute-care hospital Trusts. This theoretical framework and our empirical work suggest that some aspects of the earned autonomy as currently implemented in the NHS serve to weaken the potential incentive effect of decentralization. In particular, the nature of the freedoms is such that many senior managers do not view autonomy as a particularly valuable prize. This suggests that incentives associated with the policy will be insufficiently powerful to motivate providers to deliver better performance. We also found that principal commitment may be a problem in the NHS. Some hospital managers reported that they already enjoyed a large degree of autonomy, regardless of their current performance ratings. We also found evidence that the objectives of providers may differ from those of both the central government and local purchasers. There is, therefore, a risk that granting greater autonomy will allow providers to pursue their own objectives which, whilst not self-serving, may still jeopardize the achievement of strategic goals. It is apparent that the design and implementation features of decentralizing policies such as earned autonomy require careful attention if an optimal balance is to be struck between central oversight and local autonomy in the delivery of healthcare.

  2. A network approach to decentralized coordination of energy production-consumption grids.

    PubMed

    Omodei, Elisa; Arenas, Alex

    2018-01-01

    Energy grids are facing a relatively new paradigm consisting in the formation of local distributed energy sources and loads that can operate in parallel independently from the main power grid (usually called microgrids). One of the main challenges in the management of microgrid-like networks is that of self-adapting to production and demand in a decentralized, coordinated way. Here, we propose a stylized model that allows us to analytically predict the coordination of the elements in the network, depending on the network topology. Surprisingly, almost global coordination is attained when users interact locally, with a small neighborhood, instead of the obvious but more costly all-to-all coordination. We compute analytically the optimal value of coordinated users in random homogeneous networks. The methodology proposed opens a new way of confronting the analysis of energy demand-side management in networked systems.
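
    A toy version of coordinating with a small neighborhood instead of all-to-all can be written in a few lines with networkx; the update rule below (averaging net load with neighbors) is an assumption for illustration and is not the stylized model analysed in the paper.

        # Toy sketch: local averaging of net load (production minus demand) on a
        # random network, illustrating neighborhood-level coordination.
        import random
        import networkx as nx

        random.seed(1)
        G = nx.erdos_renyi_graph(50, 0.1, seed=1)
        net_load = {n: random.uniform(-1.0, 1.0) for n in G.nodes()}

        def local_step(loads):
            new = {}
            for n in G.nodes():
                group = [loads[n]] + [loads[m] for m in G.neighbors(n)]
                new[n] = sum(group) / len(group)   # share imbalance locally
            return new

        for _ in range(20):
            net_load = local_step(net_load)

        spread = max(net_load.values()) - min(net_load.values())
        print(f"residual imbalance spread after local coordination: {spread:.3f}")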

  3. An ontological knowledge framework for adaptive medical workflow.

    PubMed

    Dang, Jiangbo; Hedayati, Amir; Hampel, Ken; Toklu, Candemir

    2008-10-01

    As emerging technologies, the semantic Web and SOA (Service-Oriented Architecture) allow a BPMS (Business Process Management System) to automate business processes that can be described as services, which in turn can be used to wrap existing enterprise applications. A BPMS provides tools and methodologies to compose Web services that can be executed as business processes and monitored by BPM (Business Process Management) consoles. Ontologies are a formal declarative knowledge representation model. They provide a foundation upon which machine-understandable knowledge can be obtained, and as a result, they make machine intelligence possible. Healthcare systems can adopt these technologies to make them ubiquitous, adaptive, and intelligent, and then serve patients better. This paper presents an ontological knowledge framework that covers the healthcare domains that a hospital encompasses, from medical and administrative tasks to hospital assets, medical insurance, patient records, drugs, and regulations. Therefore, our ontology makes our vision of personalized healthcare possible by capturing all necessary knowledge for a complex personalized healthcare scenario involving patient care, insurance policies, drug prescriptions, and compliance. For example, our ontology enables a workflow management system to allow users, from physicians to administrative assistants, to manage, and even create, new context-aware medical workflows and execute them on the fly.
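
    As a hedged sketch of what capturing healthcare knowledge in an ontology can look like in code (using rdflib with invented classes and properties, not the paper's OWL ontology):

        # Sketch: a tiny RDF/RDFS graph relating tasks, assets and regulations.
        from rdflib import Graph, Namespace, RDF, RDFS, Literal

        HOSP = Namespace("http://example.org/hospital#")   # hypothetical namespace
        g = Graph()
        g.bind("hosp", HOSP)

        # Hypothetical classes and one instance of a medical workflow task.
        for cls in ("MedicalTask", "HospitalAsset", "InsurancePolicy", "Drug"):
            g.add((HOSP[cls], RDF.type, RDFS.Class))
        g.add((HOSP.AdministerDrug, RDF.type, HOSP.MedicalTask))
        g.add((HOSP.AdministerDrug, HOSP.requiresAsset, HOSP.InfusionPump))
        g.add((HOSP.AdministerDrug, RDFS.label, Literal("Administer prescribed drug")))

        print(g.serialize(format="turtle"))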

  4. Task Management in the New ATLAS Production System

    NASA Astrophysics Data System (ADS)

    De, K.; Golubkov, D.; Klimentov, A.; Potekhin, M.; Vaniachine, A.; Atlas Collaboration

    2014-06-01

    This document describes the design of the new Production System of the ATLAS experiment at the LHC [1]. The Production System is the top level workflow manager which translates physicists' needs for production level processing and analysis into actual workflows executed across over a hundred Grid sites used globally by ATLAS. As the production workload increased in volume and complexity in recent years (the ATLAS production tasks count is above one million, with each task containing hundreds or thousands of jobs) there is a need to upgrade the Production System to meet the challenging requirements of the next LHC run while minimizing the operating costs. In the new design, the main subsystems are the Database Engine for Tasks (DEFT) and the Job Execution and Definition Interface (JEDI). Based on users' requests, DEFT manages inter-dependent groups of tasks (Meta-Tasks) and generates corresponding data processing workflows. The JEDI component then dynamically translates the task definitions from DEFT into actual workload jobs executed in the PanDA Workload Management System [2]. We present the requirements, design parameters, basics of the object model and concrete solutions utilized in building the new Production System and its components.

  5. Optimizing insulin pump therapy: the potential advantages of using a structured diabetes management program.

    PubMed

    Lange, Karin; Ziegler, Ralph; Neu, Andreas; Reinehr, Thomas; Daab, Iris; Walz, Marion; Maraun, Michael; Schnell, Oliver; Kulzer, Bernhard; Reichel, Andreas; Heinemann, Lutz; Parkin, Christopher G; Haak, Thomas

    2015-03-01

    Use of continuous subcutaneous insulin infusion (CSII) therapy improves glycemic control, reduces hypoglycemia and increases treatment satisfaction in individuals with diabetes. As a number of patient- and clinician-related factors can hinder the effectiveness and optimal usage of CSII therapy, new approaches are needed to address these obstacles. Ceriello and colleagues recently proposed a model of care that incorporates the collaborative use of structured SMBG into a formal approach to personalized diabetes management within all diabetes populations. We adapted this model for use in CSII-treated patients in order to enable the implementation of a workflow structure that enhances patient-physician communication and supports patients' diabetes self-management skills. We recognize that time constraints and current reimbursement policies pose significant challenges to healthcare providers integrating the Personalised Diabetes Management (PDM) process into clinical practice. We believe, however, that the time invested in modifying practice workflow and learning to apply the various steps of the PDM process will be offset by improved workflow and more effective patient consultations. This article describes how to implement PDM into clinical practice as a systematic, standardized process that can optimize CSII therapy.

  6. Pegasus Workflow Management System: Helping Applications From Earth and Space

    NASA Astrophysics Data System (ADS)

    Mehta, G.; Deelman, E.; Vahi, K.; Silva, F.

    2010-12-01

    Pegasus WMS is a Workflow Management System that can manage large-scale scientific workflows across Grid, local and Cloud resources simultaneously. Pegasus WMS provides a means for representing the workflow of an application in an abstract XML form, agnostic of the resources available to run it and the location of data and executables. It then compiles these workflows into concrete plans by querying catalogs and farming computations across local and distributed computing resources, as well as emerging commercial and community cloud environments in an easy and reliable manner. Pegasus WMS optimizes the execution as well as data movement by leveraging existing Grid and cloud technologies via a flexible pluggable interface and provides advanced features like reusing existing data, automatic cleanup of generated data, and recursive workflows with deferred planning. It also captures all the provenance of the workflow from the planning stage to the execution of the generated data, helping scientists to accurately measure performance metrics of their workflow as well as data reproducibility issues. Pegasus WMS was initially developed as part of the GriPhyN project to support large-scale high-energy physics and astrophysics experiments. Direct funding from the NSF enabled support for a wide variety of applications from diverse domains including earthquake simulation, bacterial RNA studies, helioseismology and ocean modeling. Earthquake Simulation: Pegasus WMS was recently used in a large scale production run in 2009 by the Southern California Earthquake Centre to run 192 million loosely coupled tasks and about 2000 tightly coupled MPI style tasks on National Cyber infrastructure for generating a probabilistic seismic hazard map of the Southern California region. SCEC ran 223 workflows over a period of eight weeks, using on average 4,420 cores, with a peak of 14,540 cores. A total of 192 million files were produced totaling about 165TB out of which 11TB of data was saved. Astrophysics: The Laser Interferometer Gravitational-Wave Observatory (LIGO) uses Pegasus WMS to search for binary inspiral gravitational waves. A month of LIGO data requires many thousands of jobs, running for days on hundreds of CPUs on the LIGO Data Grid (LDG) and Open Science Grid (OSG). Ocean Temperature Forecast: Researchers at the Jet Propulsion Laboratory are exploring Pegasus WMS to run ocean forecast ensembles of the California coastal region. These models produce a number of daily forecasts for water temperature, salinity, and other measures. Helioseismology: The Solar Dynamics Observatory (SDO) is NASA's most important solar physics mission of this coming decade. Pegasus WMS is being used to analyze the data from SDO, which will be predominantly used to learn about solar magnetic activity and to probe the internal structure and dynamics of the Sun with helioseismology. Bacterial RNA studies: SIPHT is an application in bacterial genomics, which predicts sRNA (small non-coding RNAs)-encoding genes in bacteria. This project currently provides a web-based interface using Pegasus WMS at the backend to facilitate large-scale execution of the workflows on varied resources and provide better notifications of task/workflow completion.
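
    To convey the core idea, an abstract, resource-agnostic workflow compiled into a concrete plan by consulting catalogs, the sketch below uses plain Python dictionaries. It illustrates the concept only; it is not the Pegasus WMS API or its XML workflow format.

        # Sketch: "plan" an abstract workflow by resolving logical transformations
        # and logical files against simple catalogs (all entries hypothetical).
        abstract_workflow = [
            {"id": "j1", "transformation": "extract", "inputs": ["raw.dat"],
             "outputs": ["clean.dat"], "parents": []},
            {"id": "j2", "transformation": "analyze", "inputs": ["clean.dat"],
             "outputs": ["result.dat"], "parents": ["j1"]},
        ]
        transformation_catalog = {"extract": "/opt/bin/extract",
                                  "analyze": "/opt/bin/analyze"}
        replica_catalog = {"raw.dat": "gsiftp://storage.example.org/raw.dat"}

        def plan(jobs, site="cluster-A"):
            """Map each abstract job to a concrete invocation on a chosen site."""
            concrete = []
            for job in jobs:
                concrete.append({
                    "id": job["id"],
                    "site": site,
                    "executable": transformation_catalog[job["transformation"]],
                    "stage_in": [replica_catalog.get(f, f) for f in job["inputs"]],
                    "outputs": job["outputs"],
                    "after": job["parents"],
                })
            return concrete

        for step in plan(abstract_workflow):
            print(step)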

  7. Influence of forest management systems on natural resource use and provision of ecosystem services in Tanzania.

    PubMed

    Strauch, Ayron M; Rurai, Masegeri T; Almedom, Astier M

    2016-09-15

    Social, religious and economic facets of rural livelihoods in Sub-Saharan Africa are heavily dependent on natural resources, but improper resource management, drought, and social instability frequently lead to their unsustainable exploitation. In rural Tanzania, natural resources are often governed locally by informal systems of traditional resource management (TRM), defined as cultural practices developed within the context of social and religious institutions over hundreds of years. However, following independence from colonial rule, centralized governments began to exercise jurisdictional control over natural resources. Following decades of mismanagement that resulted in lost ecosystem services, communities demanded change. To improve resource protection and participation in management among stakeholders, the Tanzanian government began to decentralize management programs in the early 2000s. We investigated these two differing management approaches (traditional and decentralized government) in Sonjo communities, to examine local perceptions of resource governance, management influences on forest use, and their consequences for forest and water resources. While 97% of households understood the regulations governing traditionally-managed forests, this was true for only 39% of households for government-managed forests, leading to differences in forest use. Traditional management practices resulted in improved forest condition and surface water quality. This research provides an essential case study demonstrating the importance of TRM in shaping decision frameworks for natural resource planning and management. Copyright © 2016 Elsevier Ltd. All rights reserved.

  8. An ontology-based framework for bioinformatics workflows.

    PubMed

    Digiampietri, Luciano A; Perez-Alcazar, Jose de J; Medeiros, Claudia Bauzer

    2007-01-01

    The proliferation of bioinformatics activities brings new challenges - how to understand and organise these resources, how to exchange and reuse successful experimental procedures, and to provide interoperability among data and tools. This paper describes an effort toward these directions. It is based on combining research on ontology management, AI and scientific workflows to design, reuse and annotate bioinformatics experiments. The resulting framework supports automatic or interactive composition of tasks based on AI planning techniques and takes advantage of ontologies to support the specification and annotation of bioinformatics workflows. We validate our proposal with a prototype running on real data.
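
    The automatic composition of tasks based on AI planning can be illustrated by a very small forward-chaining planner that chains tools whose declared inputs are already available; the tool names and data types below are invented and are not the framework's ontology.

        # Sketch: forward-chain bioinformatics tools by matching input/output types.
        tools = [
            {"name": "assemble", "needs": {"reads"},           "gives": {"contigs"}},
            {"name": "annotate", "needs": {"contigs"},         "gives": {"genes"}},
            {"name": "align",    "needs": {"genes", "ref_db"}, "gives": {"alignment"}},
        ]

        def compose(available, goal):
            """Return an ordered list of tools that produces `goal` from `available`."""
            plan, have = [], set(available)
            progress = True
            while goal not in have and progress:
                progress = False
                for tool in tools:
                    if tool["name"] not in [t["name"] for t in plan] \
                            and tool["needs"] <= have:
                        plan.append(tool)
                        have |= tool["gives"]
                        progress = True
            return [t["name"] for t in plan] if goal in have else None

        print(compose({"reads", "ref_db"}, "alignment"))
        # -> ['assemble', 'annotate', 'align']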

  9. Towards Decentralized and Goal-Oriented Models of Institutional Resource Allocation: The Spanish Case

    ERIC Educational Resources Information Center

    Lopez, Maria Jose Gonzalez

    2006-01-01

    The search for more flexibility in financial management of public universities demands adjustments in budgeting strategies. International studies on this topic recommend wider financial autonomy for management units, the use of budgeting models based on performance, the implementation of formula systems for the determination of financial needs of…

  10. Department Level Information Resource Management: A Theoretical Argument for a Decentralized Approach.

    ERIC Educational Resources Information Center

    Beath, Cynthia Mathis; Straub, Detmar W.

    1991-01-01

    Explores where the responsibility for information resources management (IRM) can lie, identifying entities which might carry IRM tasks: (1) individuals; (2) departments; (3) institutions; and (4) markets. It is argued that the IRM function should be located at the department level, and that associated departmental costs may be overshadowed by the…

  11. Developing Guidelines for IRM: A Grassroots Process in a Decentralized Environment.

    ERIC Educational Resources Information Center

    Balkan, Lore; Sheldon, Philip

    1990-01-01

    The offices of Information Resource Management and Institutional Research at Virginia Tech developed a set of guidelines for information management. This article describes the historical evolution, the forces that motivated the development of the guidelines, and the consensus-building activities that led to the acceptance of the guidelines.…

  12. Toolkit Approach to Integrating Library Resources into the Learning Management System

    ERIC Educational Resources Information Center

    Black, Elizabeth L.

    2008-01-01

    As use of learning management systems (LMS) increases, it is essential that librarians are there. Ohio State University Libraries took a toolkit approach to integrate library content in the LMS to facilitate creative and flexible interactions between librarians, students and faculty in Ohio State University's large and decentralized academic…

  13. Financial Decentralization in Malaysian Schools: Strategies for Effective Implementation

    ERIC Educational Resources Information Center

    Radzi, Norfariza Mohd; Ghani, Muhammad Faizal A.; Siraj, Saedah; Afshari, Mojgan

    2013-01-01

    This article presents findings on the essential strategies required at the school site and the relevant people responsible for the effective implementation of school-based financial management in Malaysia. Many lessons have been learned since more than a decade of the school-based financial management reform in Malaysia through the establishment…

  14. Democracy, Decentralization and School-Based Management in Spain.

    ERIC Educational Resources Information Center

    Hanson, E. Mark; Ulrich, Carolyn

    This paper presents findings of a study that described and analyzed the first 5 years (1985-90) of the Spanish experience in school-based management (SBM). The Spanish experience is instructive because the country, formerly comprised of independent territories, made a swift and peaceful transition to democracy. As a means of reinforcing the…

  15. Polarity management: the key challenge for integrated health systems.

    PubMed

    Burns, L R

    1999-01-01

    Integrated health systems are confronted with numerous dilemmas that must be managed. Many of these dilemmas are an inherent part of the system's structure, given that multiple competing hospitals, medical groups, and (sometimes) health plans are often under one organizational roof. This article presents an analysis of these dilemmas--referred to in the management literature as polarities--as they are found in six integrated health systems in Illinois. The nine polarities that must be managed include (1) hospital systems that want to be organizations of physicians; (2) system expansion by growing the physician component; (3) system centralization and physician decentralization; (4) centripetal and centrifugal forces involving physicians; (5) system objectives and physician interests; (6) system centralization and hospital decentralization; (7) primary care physicians and specialists; (8) physician autonomy via collectivization; and (9) vertical and virtual integration. The article identifies some of the solutions to the polarities that have been enacted by systems. In general, executives and physicians in integrated health systems must attend to the processes of integration as much as or more than the structures of integration.

  16. Teaching Workflow Analysis and Lean Thinking via Simulation: A Formative Evaluation

    PubMed Central

    Campbell, Robert James; Gantt, Laura; Congdon, Tamara

    2009-01-01

    This article presents the rationale for the design and development of a video simulation used to teach lean thinking and workflow analysis to health services and health information management students enrolled in a course on the management of health information. The discussion includes a description of the design process, a brief history of the use of simulation in healthcare, and an explanation of how video simulation can be used to generate experiential learning environments. Based on the results of a survey given to 75 students as part of a formative evaluation, the video simulation was judged effective because it allowed students to visualize a real-world process (concrete experience), contemplate the scenes depicted in the video along with the concepts presented in class in a risk-free environment (reflection), develop hypotheses about why problems occurred in the workflow process (abstract conceptualization), and develop solutions to redesign a selected process (active experimentation). PMID:19412533

  17. Jflow: a workflow management system for web applications.

    PubMed

    Mariette, Jérôme; Escudié, Frédéric; Bardou, Philippe; Nabihoudine, Ibouniyamine; Noirot, Céline; Trotard, Marie-Stéphane; Gaspin, Christine; Klopp, Christophe

    2016-02-01

    Biologists produce large data sets and are in need of rich and simple web portals in which they can upload and analyze their files. Providing such tools requires masking the complexity induced by the underlying High Performance Computing (HPC) environment. The connection between interface and computing infrastructure is usually specific to each portal. With Jflow, we introduce a Workflow Management System (WMS), composed of jQuery plug-ins, which can easily be embedded in any web application, and a Python library providing all the features required to set up, run and monitor workflows. Jflow is available under the GNU General Public License (GPL) at http://bioinfo.genotoul.fr/jflow. The package comes with full documentation, a quick start guide and a running test portal. Jerome.Mariette@toulouse.inra.fr. © The Author 2015. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
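
    To give a feel for the kind of Python-side workflow definition such a library exposes, here is a deliberately generic sketch; the class names and methods are hypothetical and are not the actual Jflow API (see the documentation at the URL above for the real interface).

        # Hypothetical, Jflow-inspired sketch: a workflow declared as ordered
        # components, then run step by step (no web or HPC layer shown).
        class Component:
            def __init__(self, name, func):
                self.name, self.func = name, func

            def run(self, data):
                print(f"[{self.name}] running")
                return self.func(data)

        class MiniWorkflow:
            def __init__(self, name):
                self.name, self.components = name, []

            def add_component(self, name, func):
                self.components.append(Component(name, func))

            def run(self, data):
                for comp in self.components:
                    data = comp.run(data)
                return data

        wf = MiniWorkflow("fastq_qc")
        wf.add_component("trim", lambda reads: [r[:50] for r in reads])
        wf.add_component("count", lambda reads: len(reads))
        print(wf.run(["ACGT" * 30, "TTGA" * 30]))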

  18. Virtual Sensor Web Architecture

    NASA Astrophysics Data System (ADS)

    Bose, P.; Zimdars, A.; Hurlburt, N.; Doug, S.

    2006-12-01

    NASA envisions the development of smart sensor webs, intelligent and integrated observation network that harness distributed sensing assets, their associated continuous and complex data sets, and predictive observation processing mechanisms for timely, collaborative hazard mitigation and enhanced science productivity and reliability. This paper presents Virtual Sensor Web Infrastructure for Collaborative Science (VSICS) Architecture for sustained coordination of (numerical and distributed) model-based processing, closed-loop resource allocation, and observation planning. VSICS's key ideas include i) rich descriptions of sensors as services based on semantic markup languages like OWL and SensorML; ii) service-oriented workflow composition and repair for simple and ensemble models; event-driven workflow execution based on event-based and distributed workflow management mechanisms; and iii) development of autonomous model interaction management capabilities providing closed-loop control of collection resources driven by competing targeted observation needs. We present results from initial work on collaborative science processing involving distributed services (COSEC framework) that is being extended to create VSICS.

  19. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Garzoglio, Gabriele

    The Fermilab Grid and Cloud Computing Department and the KISTI Global Science experimental Data hub Center are working on a multi-year Collaborative Research and Development Agreement. Building on the knowledge developed in the first year on how to provision and manage a federation of virtual machines through Cloud management systems, in this second year we expanded the work on provisioning and federation, increasing both the scale and diversity of solutions, and we started to build on-demand services on the established fabric, introducing the paradigm of Platform as a Service to assist with the execution of scientific workflows. We have enabled scientific workflows of stakeholders to run on multiple cloud resources at the scale of 1,000 concurrent machines. The demonstrations have been in the areas of (a) Virtual Infrastructure Automation and Provisioning, (b) Interoperability and Federation of Cloud Resources, and (c) On-demand Services for Scientific Workflows.

  20. IceProd 2 Usage Experience

    NASA Astrophysics Data System (ADS)

    Delventhal, D.; Schultz, D.; Diaz Velez, J. C.

    2017-10-01

    IceProd is a data processing and management framework developed by the IceCube Neutrino Observatory for processing of Monte Carlo simulations, detector data, and data driven analysis. It runs as a separate layer on top of grid and batch systems. This is accomplished by a set of daemons which process job workflow, maintaining configuration and status information on the job before, during, and after processing. IceProd can also manage complex workflow DAGs across distributed computing grids in order to optimize usage of resources. IceProd has recently been rewritten to increase its scaling capabilities, handle user analysis workflows together with simulation production, and facilitate the integration with 3rd party scheduling tools. IceProd 2, the second generation of IceProd, has been running in production for several months now. We share our experience setting up the system and things we’ve learned along the way.

  1. Decentralized Energy Management System for Networked Microgrids in Grid-connected and Islanded Modes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wang, Zhaoyu; Chen, Bokan; Wang, Jianhui

    This paper proposes a decentralized energy management system (EMS) for the coordinated operation of networked Microgrids (MGs) in a distribution system. In the grid-connected mode, the distribution network operator (DNO) and each MG are considered as distinct entities with individual objectives to minimize their own operation costs. It is assumed that both dispatchable and renewable energy source (RES)-based distributed generators (DGs) exist in the distribution network and the networked MGs. In order to coordinate the operation of all entities, we apply a decentralized bi-level algorithm to solve the problem, with the first level conducting negotiations among all entities and the second level updating the non-converging penalties. In the islanded mode, the objective of each MG is to maintain a reliable power supply to its customers. In order to take into account the uncertainties of DG outputs and load consumption, we formulate the problems as two-stage stochastic programs. The first stage determines base generation setpoints based on the forecasts, and the second stage adjusts the generation outputs based on the realized scenarios. Case studies of a distribution system with networked MGs demonstrate the effectiveness of the proposed methodology in both grid-connected and islanded modes.
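
    For orientation, a generic two-stage stochastic program can be written as the template below. The symbols are not taken from the paper; they only illustrate the structure the abstract describes, with first-stage decisions fixed before uncertainty is revealed and recourse decisions made per realized scenario.

```latex
% Generic two-stage stochastic program (illustrative template, not the paper's exact formulation)
\begin{align*}
\min_{x \in X} \quad & c^{\top}x \;+\; \mathbb{E}_{\xi}\bigl[\,Q(x,\xi)\,\bigr]
  && \text{(first stage: base setpoints)} \\
Q(x,\xi) \;=\; \min_{y \ge 0} \quad & q(\xi)^{\top}y
  \quad \text{s.t.}\quad W y = h(\xi) - T x
  && \text{(second stage: recourse adjustments)}
\end{align*}
```

    Here x would play the role of the base generation setpoints and y the scenario-dependent output adjustments.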

  2. The Study on Flood Reduction and Securing Instreamflow by applying Decentralized Rainwater Retention Facilities for Chunggyechun in Seoul of Korea

    NASA Astrophysics Data System (ADS)

    Park, J. H.; Jun, S. M.; Park, C. G.

    2014-12-01

    Abnormal climate phenomena and urbanization have recently caused changes in the hydrological environment. To restore the hydrological cycle in urban areas, fundamental solutions such as decentralized rainwater management systems and Low Impact Development (LID) techniques may be chosen. In this study, SWMM 5 was used to analyze the effects of decentralized stormwater retention on preventing urban floods and securing the instreamflow. The Chunggyechun stream watershed (21.29 ㎢), which is located in Seoul, Korea, and is fully developed as an urban area, was selected as the study watershed, and the runoff characteristics of the urban stream with various LID techniques (permeable pavement, small rainwater storage tanks, large rainwater storage tanks) were analyzed. According to the simulation results, the permeability of pavement materials and detention storage at the surface soil layer have a strong effect on the flood discharge, and the initial rainfall retention in the rainwater storage tanks helps reduce the flood peak. The peak discharge decreased by 22% for the design precipitation. Moreover, the instreamflow increased by 55% when adequate LID techniques were used. These data could serve as basis data for designing urban flood prevention facilities and for urban regeneration planning from the viewpoint of integrated watershed management.

  3. Creating a comprehensive customer service program to help convey critical and acute results of radiology studies.

    PubMed

    Towbin, Alexander J; Hall, Seth; Moskovitz, Jay; Johnson, Neil D; Donnelly, Lane F

    2011-01-01

    Communication of acute or critical results between the radiology department and referring clinicians has been a deficiency of many radiology departments. The failure to perform or document these communications can lead to poor patient care, patient safety issues, medical-legal issues, and complaints from referring clinicians. To mitigate these factors, a communication and documentation tool was created and incorporated into our departmental customer service program. This article will describe the implementation of a comprehensive customer service program in a hospital-based radiology department. A comprehensive customer service program was created in the radiology department. Customer service representatives were hired to answer the telephone calls to the radiology reading rooms and to help convey radiology results. The radiologists, referring clinicians, and customer service representatives were then linked via a novel workflow management system. This workflow management system provided tools to help facilitate the communication needs of each group. The number of studies with results conveyed was recorded from the implementation of the workflow management system. Between the implementation of the workflow management system on August 1, 2005, and June 1, 2009, 116,844 radiology results were conveyed to the referring clinicians and documented in the system. This accounts for more than 14% of the 828,516 radiology cases performed in this time frame. We have been successful in creating a comprehensive customer service program to convey and document communication of radiology results. This program has been widely used by the ordering clinicians as well as radiologists since its inception.

  4. Multi-core processing and scheduling performance in CMS

    NASA Astrophysics Data System (ADS)

    Hernández, J. M.; Evans, D.; Foulkes, S.

    2012-12-01

    Commodity hardware is going many-core. We might soon not be able to satisfy the job memory needs per core in the current single-core processing model in High Energy Physics. In addition, an ever increasing number of independent and incoherent jobs running on the same physical hardware without sharing resources might significantly affect processing performance. It will be essential to effectively utilize the multi-core architecture. CMS has incorporated support for multi-core processing in the event processing framework and the workload management system. Multi-core processing jobs share common data in memory, such as the code libraries, detector geometry and conditions data, resulting in a much lower memory usage than standard single-core independent jobs. Exploiting this new processing model requires a new model in computing resource allocation, departing from the standard single-core allocation for a job. The experiment job management system needs to have control over a larger quantum of resource, since multi-core aware jobs require the scheduling of multiple cores simultaneously. CMS is exploring the approach of using whole nodes as the unit in the workload management system, where all cores of a node are allocated to a multi-core job. Whole-node scheduling allows for optimization of the data/workflow management (e.g. I/O caching, local merging), but efficient utilization of all scheduled cores is challenging. Dedicated whole-node queues have been set up at all Tier-1 centers for exploring multi-core processing workflows in CMS. We present an evaluation of the performance of scheduling and executing multi-core workflows in whole-node queues compared to the standard single-core processing workflows.
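
    A toy sketch of the whole-node idea follows: instead of handing out single cores, the workload manager claims all cores of a node and runs one multi-core job per node. The node names, core counts and data structures are invented for illustration and are not the CMS workload management code.

```python
# Toy whole-node allocation (illustration only, not the CMS workload management system).
def assign_whole_nodes(jobs, nodes):
    """jobs: list of (name, cores_needed); nodes: list of (hostname, cores_available)."""
    assignments = []
    free_nodes = list(nodes)
    for name, cores_needed in jobs:
        for i, (hostname, cores_available) in enumerate(free_nodes):
            if cores_available >= cores_needed:
                # The whole node is claimed even if the job asks for fewer cores,
                # which is why keeping all scheduled cores busy is the hard part.
                assignments.append((name, hostname, cores_available))
                free_nodes.pop(i)
                break
    return assignments


if __name__ == "__main__":
    jobs = [("reco_multicore", 8), ("sim_multicore", 8)]
    nodes = [("wn001", 8), ("wn002", 16)]
    print(assign_whole_nodes(jobs, nodes))
```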

  5. Talkoot Portals: Discover, Tag, Share, and Reuse Collaborative Science Workflows (Invited)

    NASA Astrophysics Data System (ADS)

    Wilson, B. D.; Ramachandran, R.; Lynnes, C.

    2009-12-01

    A small but growing number of scientists are beginning to harness Web 2.0 technologies, such as wikis, blogs, and social tagging, as a transformative way of doing science. These technologies provide researchers with easy mechanisms to critique, suggest and share ideas, data and algorithms. At the same time, large suites of algorithms for science analysis are being made available as remotely-invokable Web Services, which can be chained together to create analysis workflows. This provides the research community an unprecedented opportunity to collaborate by sharing their workflows with one another, reproducing and analyzing research results, and leveraging colleagues’ expertise to expedite the process of scientific discovery. However, wikis and similar technologies are limited to text, static images and hyperlinks, providing little support for collaborative data analysis. A team of information technology and Earth science researchers from multiple institutions has come together to improve community collaboration in science analysis by developing a customizable “software appliance” to build collaborative portals for Earth Science services and analysis workflows. The critical requirement is that researchers (not just information technologists) be able to build collaborative sites around service workflows within a few hours. We envision online communities coming together, much like Finnish “talkoot” (a barn raising), to build a shared research space. Talkoot extends a freely available, open source content management framework with a series of modules specific to Earth Science for registering, creating, managing, discovering, tagging and sharing Earth Science web services and workflows for science data processing, analysis and visualization. Users will be able to author a “science story” in shareable web notebooks, including plots or animations, backed up by an executable workflow that directly reproduces the science analysis. New services and workflows of interest will be discoverable using tag search, and advertised using “service casts” and “interest casts” (Atom feeds). Multiple science workflow systems will be plugged into the system, with initial support for UAH’s Mining Workflow Composer and the open-source Active BPEL engine, and JPL’s SciFlo engine and the VizFlow visual programming interface. With the ability to share and execute analysis workflows, Talkoot portals can be used to do collaborative science in addition to communicating ideas and results. It will be useful for different science domains, mission teams, research projects and organizations. Thus, it will help to solve the “sociological” problem of bringing together disparate groups of researchers, and the technical problem of advertising, discovering, developing, documenting, and maintaining inter-agency science workflows. The presentation will discuss the goals of and barriers to Science 2.0, the social web technologies employed in the Talkoot software appliance (e.g. CMS, social tagging, personal presence, advertising by feeds, etc.), illustrate the resulting collaborative capabilities, and show early prototypes of the web interfaces (e.g. embedded workflows).

  6. Ctrl "C"-Ctrl "V"; Using Gaming Peripherals to Improve Library Workflows and Enhance Staff Efficiency

    ERIC Educational Resources Information Center

    Litsey, Ryan; Harris, Rea; London, Jessie

    2018-01-01

    Library workflows are an area where repetitive stress can potentially reduce staff efficiency. Day to day activities that require a repetitive motion can bring about physical and psychological fatigue. For library managers, it is important to seek ways in which this type of repetitive stress can be alleviated while having the added benefit of…

  7. Improving data collection, documentation, and workflow in a dementia screening study

    PubMed Central

    Read, Kevin B.; LaPolla, Fred Willie Zametkin; Tolea, Magdalena I.; Galvin, James E.; Surkis, Alisa

    2017-01-01

    Background A clinical study team performing three multicultural dementia screening studies identified the need to improve data management practices and facilitate data sharing. A collaboration was initiated with librarians as part of the National Library of Medicine (NLM) informationist supplement program. The librarians identified areas for improvement in the studies’ data collection, entry, and processing workflows. Case Presentation The librarians’ role in this project was to meet needs expressed by the study team around improving data collection and processing workflows to increase study efficiency and ensure data quality. The librarians addressed the data collection, entry, and processing weaknesses through standardizing and renaming variables, creating an electronic data capture system using REDCap, and developing well-documented, reproducible data processing workflows. Conclusions NLM informationist supplements provide librarians with valuable experience in collaborating with study teams to address their data needs. For this project, the librarians gained skills in project management, REDCap, and understanding of the challenges and specifics of a clinical research study. However, the time and effort required to provide targeted and intensive support for one study team was not scalable to the library’s broader user community. PMID:28377680

  8. Managing the CMS Data and Monte Carlo Processing during LHC Run 2

    NASA Astrophysics Data System (ADS)

    Wissing, C.; CMS Collaboration

    2017-10-01

    In order to cope with the challenges expected during LHC Run 2, CMS put in place a number of enhancements in the main software packages and the tools used for centrally managed processing. In this presentation we highlight these improvements, which allow CMS to deal with the increased trigger output rate, the increased pileup and the evolution in computing technology. The overall system aims at high flexibility, improved operational efficiency and largely automated procedures. The tight coupling of workflow classes to types of sites has been drastically relaxed. Reliable and high-performing networking between most of the computing sites and the successful deployment of a data federation allow the execution of workflows using remote data access. That required the development of a largely automated system to assign workflows and to handle the necessary pre-staging of data. Another step towards flexibility has been the introduction of one large global HTCondor pool for all types of processing workflows and analysis jobs. Besides classical Grid resources, some opportunistic resources as well as Cloud resources have been integrated into that pool, which provides access to more than 200k CPU cores.

  9. The Protein Information Management System (PiMS): a generic tool for any structural biology research laboratory

    PubMed Central

    Morris, Chris; Pajon, Anne; Griffiths, Susanne L.; Daniel, Ed; Savitsky, Marc; Lin, Bill; Diprose, Jonathan M.; Wilter da Silva, Alan; Pilicheva, Katya; Troshin, Peter; van Niekerk, Johannes; Isaacs, Neil; Naismith, James; Nave, Colin; Blake, Richard; Wilson, Keith S.; Stuart, David I.; Henrick, Kim; Esnouf, Robert M.

    2011-01-01

    The techniques used in protein production and structural biology have been developing rapidly, but techniques for recording the laboratory information produced have not kept pace. One approach is the development of laboratory information-management systems (LIMS), which typically use a relational database schema to model and store results from a laboratory workflow. The underlying philosophy and implementation of the Protein Information Management System (PiMS), a LIMS development specifically targeted at the flexible and unpredictable workflows of protein-production research laboratories of all scales, is described. PiMS is a web-based Java application that uses either Postgres or Oracle as the underlying relational database-management system. PiMS is available under a free licence to all academic laboratories either for local installation or for use as a managed service. PMID:21460443

  10. The Protein Information Management System (PiMS): a generic tool for any structural biology research laboratory.

    PubMed

    Morris, Chris; Pajon, Anne; Griffiths, Susanne L; Daniel, Ed; Savitsky, Marc; Lin, Bill; Diprose, Jonathan M; da Silva, Alan Wilter; Pilicheva, Katya; Troshin, Peter; van Niekerk, Johannes; Isaacs, Neil; Naismith, James; Nave, Colin; Blake, Richard; Wilson, Keith S; Stuart, David I; Henrick, Kim; Esnouf, Robert M

    2011-04-01

    The techniques used in protein production and structural biology have been developing rapidly, but techniques for recording the laboratory information produced have not kept pace. One approach is the development of laboratory information-management systems (LIMS), which typically use a relational database schema to model and store results from a laboratory workflow. The underlying philosophy and implementation of the Protein Information Management System (PiMS), a LIMS development specifically targeted at the flexible and unpredictable workflows of protein-production research laboratories of all scales, is described. PiMS is a web-based Java application that uses either Postgres or Oracle as the underlying relational database-management system. PiMS is available under a free licence to all academic laboratories either for local installation or for use as a managed service.

  11. Adaptive Workflows for Diabetes Management: Self-Management Assistant and Remote Treatment for Diabetes.

    PubMed

    Contreras, Iván; Kiefer, Stephan; Vehi, Josep

    2017-01-01

    Diabetes self-management is a crucial element for all people with diabetes and those at risk for developing the disease. Diabetic patients should be empowered to increase their self-management skills in order to prevent or delay the complications of diabetes. This work presents the proposal and first development stages of a smartphone application focused on the empowerment of patients with diabetes. The concept of this interventional tool is based on the personalization of the user experience from an adaptive and dynamic perspective. The segmentation of the population and the dynamic treatment of user profiles across the different experience levels is the main challenge of the implementation. The Self-Management Assistant and Remote Treatment for Diabetes aims to develop a platform that integrates a series of innovative models and tools, rigorously tested and supported by the research literature in diabetes, together with the use of a proven engine to manage workflows for healthcare.

  12. Workflows in bioinformatics: meta-analysis and prototype implementation of a workflow generator.

    PubMed

    Garcia Castro, Alexander; Thoraval, Samuel; Garcia, Leyla J; Ragan, Mark A

    2005-04-07

    Computational methods for problem solving need to interleave information access and algorithm execution in a problem-specific workflow. The structures of these workflows are defined by a scaffold of syntactic, semantic and algebraic objects capable of representing them. Despite the proliferation of GUIs (Graphic User Interfaces) in bioinformatics, only some of them provide workflow capabilities; surprisingly, no meta-analysis of workflow operators and components in bioinformatics has been reported. We present a set of syntactic components and algebraic operators capable of representing analytical workflows in bioinformatics. Iteration, recursion, the use of conditional statements, and management of suspend/resume tasks have traditionally been implemented on an ad hoc basis and hard-coded; by having these operators properly defined it is possible to use and parameterize them as generic re-usable components. To illustrate how these operations can be orchestrated, we present GPIPE, a prototype graphic pipeline generator for PISE that allows the definition of a pipeline, parameterization of its component methods, and storage of metadata in XML formats. This implementation goes beyond the macro capacities currently in PISE. As the entire analysis protocol is defined in XML, a complete bioinformatic experiment (linked sets of methods, parameters and results) can be reproduced or shared among users. http://if-web1.imb.uq.edu.au/Pise/5.a/gpipe.html (interactive), ftp://ftp.pasteur.fr/pub/GenSoft/unix/misc/Pise/ (download). From our meta-analysis we have identified syntactic structures and algebraic operators common to many workflows in bioinformatics. The workflow components and algebraic operators can be assimilated into re-usable software components. GPIPE, a prototype implementation of this framework, provides a GUI builder to facilitate the generation of workflows and integration of heterogeneous analytical tools.
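
    A small sketch of what such generic, parameterizable operators might look like when expressed as re-usable components: sequencing, a conditional branch and bounded iteration composed into one pipeline. The function names and the example workflow are illustrative only and are not the GPIPE implementation.

```python
# Illustrative generic workflow operators (not the GPIPE implementation).
def sequence(*steps):
    """Compose steps so the output of one becomes the input of the next."""
    def run(data):
        for step in steps:
            data = step(data)
        return data
    return run


def conditional(predicate, if_true, if_false):
    """Branch the workflow on a runtime condition."""
    return lambda data: if_true(data) if predicate(data) else if_false(data)


def iterate(step, until):
    """Repeat a step until a stopping condition holds (bounded iteration)."""
    def run(data):
        while not until(data):
            data = step(data)
        return data
    return run


if __name__ == "__main__":
    workflow = sequence(
        lambda seqs: [s.upper() for s in seqs],
        conditional(lambda seqs: len(seqs) > 2,
                    lambda seqs: seqs[:2],
                    lambda seqs: seqs),
        iterate(lambda seqs: seqs + ["PAD"], until=lambda seqs: len(seqs) >= 4),
    )
    print(workflow(["acgt", "ttga", "ggcc"]))
```

    Because each operator is an ordinary value, it can be parameterized, stored (for example as XML in the abstract above) and re-used across analysis protocols.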

  13. Support for Taverna workflows in the VPH-Share cloud platform.

    PubMed

    Kasztelnik, Marek; Coto, Ernesto; Bubak, Marian; Malawski, Maciej; Nowakowski, Piotr; Arenas, Juan; Saglimbeni, Alfredo; Testi, Debora; Frangi, Alejandro F

    2017-07-01

    To address the increasing need for collaborative endeavours within the Virtual Physiological Human (VPH) community, the VPH-Share collaborative cloud platform allows researchers to expose and share sequences of complex biomedical processing tasks in the form of computational workflows. The Taverna Workflow System is a very popular tool for orchestrating complex biomedical and bioinformatics processing tasks in the VPH community. This paper describes the VPH-Share components that support the building and execution of Taverna workflows, and explains how they interact with other VPH-Share components to improve the capabilities of the VPH-Share platform. Taverna workflow support is delivered by the Atmosphere cloud management platform and the VPH-Share Taverna plugin. These components are explained in detail, along with the two main procedures that were developed to enable this seamless integration: workflow composition and execution. The main outcomes are: 1) seamless integration of VPH-Share with other components and systems; 2) an extended range of different tools for workflows; 3) successful integration of scientific workflows from other VPH projects; and 4) execution speed improvements for medical applications. The presented workflow integration provides VPH-Share users with a wide range of different possibilities to compose and execute workflows, such as desktop or online composition, online batch execution, multithreading, remote execution, etc. The specific advantages of each supported tool are presented, as are the roles of Atmosphere and the VPH-Share plugin within the VPH-Share project. The combination of the VPH-Share plugin and Atmosphere endows the VPH-Share infrastructure with far more flexible, powerful and usable capabilities for the VPH-Share community. As both components can continue to evolve and improve independently, further improvements are still to be developed and will be described. Copyright © 2017 Elsevier B.V. All rights reserved.

  14. SigWin-detector: a Grid-enabled workflow for discovering enriched windows of genomic features related to DNA sequences.

    PubMed

    Inda, Márcia A; van Batenburg, Marinus F; Roos, Marco; Belloum, Adam S Z; Vasunin, Dmitry; Wibisono, Adianto; van Kampen, Antoine H C; Breit, Timo M

    2008-08-08

    Chromosome location is often used as a scaffold to organize genomic information in both the living cell and molecular biological research. Thus, ever-increasing amounts of data about genomic features are stored in public databases and can be readily visualized by genome browsers. To perform in silico experimentation conveniently with this genomics data, biologists need tools to process and compare datasets routinely and explore the obtained results interactively. The complexity of such experimentation requires these tools to be based on an e-Science approach, hence generic, modular, and reusable. A virtual laboratory environment with workflows, workflow management systems, and Grid computation are therefore essential. Here we apply an e-Science approach to develop SigWin-detector, a workflow-based tool that can detect significantly enriched windows of (genomic) features in a (DNA) sequence in a fast and reproducible way. For proof-of-principle, we utilize a biological use case to detect regions of increased and decreased gene expression (RIDGEs and anti-RIDGEs) in human transcriptome maps. We improved the original method for RIDGE detection by replacing the costly step of estimation by random sampling with a faster analytical formula for computing the distribution of the null hypothesis being tested and by developing a new algorithm for computing moving medians. SigWin-detector was developed using the WS-VLAM workflow management system and consists of several reusable modules that are linked together in a basic workflow. The configuration of this basic workflow can be adapted to satisfy the requirements of the specific in silico experiment. As we show with the results from analyses in the biological use case on RIDGEs, SigWin-detector is an efficient and reusable Grid-based tool for discovering windows enriched for features of a particular type in any sequence of values. Thus, SigWin-detector provides the proof-of-principle for the modular e-Science based concept of integrative bioinformatics experimentation.
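
    A bare-bones illustration of the sliding-window idea behind such detectors: compute a statistic (here the median) over each window and flag windows exceeding a threshold. The window size, threshold and toy data are arbitrary; this is only a conceptual sketch, not the SigWin-detector algorithm or its statistical test.

```python
# Sliding-window enrichment sketch (illustration only, not the SigWin-detector algorithm).
from statistics import median


def enriched_windows(values, window_size, threshold):
    """Return (start_index, window_median) for every window whose median exceeds the threshold."""
    hits = []
    for start in range(len(values) - window_size + 1):
        window_median = median(values[start:start + window_size])
        if window_median > threshold:
            hits.append((start, window_median))
    return hits


if __name__ == "__main__":
    expression = [1, 1, 2, 8, 9, 7, 8, 2, 1, 1]  # toy expression profile along a chromosome
    print(enriched_windows(expression, window_size=3, threshold=5))
```

    The abstract's contribution lies precisely in avoiding this naive recomputation, by replacing random sampling with an analytical null distribution and a faster moving-median algorithm.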

  15. PIMS-Universal Payload Information Management

    NASA Technical Reports Server (NTRS)

    Elmore, Ralph; McNair, Ann R. (Technical Monitor)

    2002-01-01

    As the overall manager and integrator of International Space Station (ISS) science payloads and experiments, the Payload Operations Integration Center (POIC) at Marshall Space Flight Center had a critical need to provide an information management system for the exchange and management of ISS payload files as well as to coordinate ISS payload related operational changes. The POIC's information management system has a fundamental requirement to provide secure operational access not only to users physically located at the POIC, but also to provide collaborative access to remote experimenters and International Partners. The Payload Information Management System (PIMS) is a ground based electronic document configuration management and workflow system that was built to service that need. Functionally, PIMS provides the following document management related capabilities: 1. File access control, storage and retrieval from a central repository vault. 2. Collection of supplemental data about files in the vault. 3. File exchange with the PIMS GUI client, or any FTP connection. 4. Placement of files into an FTP accessible dropbox for pickup by interfacing facilities, including files transmitted for spacecraft uplink. 5. Transmission of email messages to users notifying them of new version availability. 6. Polling of intermediate facility dropboxes for files that will automatically be processed by PIMS. 7. An API that allows other POIC applications to access PIMS information. Functionally, PIMS provides the following Change Request processing capabilities: 1. The ability to create, view, manipulate, and query information about Operations Change Requests (OCRs). 2. An adaptable workflow approval of OCRs with routing through developers, facility leads, POIC leads, reviewers, and implementers. Email messages can be sent to users either involving them in the workflow process or simply notifying them of OCR approval progress. All PIMS document management and OCR workflow controls are coordinated through and routed to individual users' "to do" list tasks. A user is given a task when it is their turn to perform some action relating to the approval of the Document or OCR. The user's available actions are restricted to only functions available for the assigned task. Certain actions, such as review or action implementation by non-PIMS users, can also be coordinated through automated emails.

  16. A WATERSHED APPROACH TO DRINKING WATER QUALITY

    EPA Science Inventory

    The purpose of this presentation is to describe emerging technologies and strategies managing watersheds with the goal of protecting drinking water sources. Included are discussions on decentralized wastewater treatment, whole organism biomonitor detection systems, treatment of...

  17. Decentralized Systems: Developing Partnerships to Broaden Opportunities Using the CWSRF

    EPA Pesticide Factsheets

    Many states maximized the effect of their CWSRF ARRA grant by partnering with other state agencies, local governments, and nonprofit organizations to manage many projects to repair and replace failing onsite treatment systems.

  18. DIaaS: Data-Intensive workflows as a service - Enabling easy composition and deployment of data-intensive workflows on Virtual Research Environments

    NASA Astrophysics Data System (ADS)

    Filgueira, R.; Ferreira da Silva, R.; Deelman, E.; Atkinson, M.

    2016-12-01

    We present the Data-Intensive workflows as a Service (DIaaS) model for enabling easy data-intensive workflow composition and deployment on clouds using containers. The backbone of the DIaaS model is Asterism, an integrated solution for running data-intensive stream-based applications on heterogeneous systems, which combines the benefits of the dispel4py and Pegasus workflow systems. The stream-based executions of an Asterism workflow are managed by dispel4py, while the data movement between different e-Infrastructures and the coordination of the application execution are automatically managed by Pegasus. DIaaS combines the Asterism framework with Docker containers to provide an integrated, complete, easy-to-use and portable approach to running data-intensive workflows on distributed platforms. Three containers make up the DIaaS model: a Pegasus node, an MPI cluster and an Apache Storm cluster. Container images are described as Dockerfiles (available online at http://github.com/dispel4py/pegasus_dispel4py), linked to Docker Hub for continuous integration (automated image builds) and for image storing and sharing. In this model, all the software required to run scientific applications (workflow systems and execution engines) is packed into the containers, which significantly reduces the effort (and possible human errors) required by scientists or VRE administrators to build such systems. The most common use of DIaaS will be to act as a backend of VREs or Scientific Gateways to run data-intensive applications, deploying cloud resources upon request. We have demonstrated the feasibility of DIaaS using the data-intensive seismic ambient noise cross-correlation application (Figure 1). The application preprocesses (Phase 1) and cross-correlates (Phase 2) traces from several seismic stations. The application is submitted via Pegasus (Container 1), and Phase 1 and Phase 2 are executed in the MPI (Container 2) and Storm (Container 3) clusters, respectively. Although both phases could be executed within the same environment, this setup demonstrates the flexibility of DIaaS to run applications across e-Infrastructures. In summary, DIaaS delivers specialized software to execute data-intensive applications in a scalable, efficient and robust manner, reducing engineering time and computational cost.
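
    To make the stream-based execution style concrete, here is a generator-based toy pipeline in the same spirit (preprocess, then cross-correlate); it is only a sketch, and the function names and correlation measure are invented rather than taken from the dispel4py or Pegasus APIs.

```python
# Toy stream pipeline in the spirit of Phase 1 -> Phase 2 (not the dispel4py/Pegasus APIs).
def preprocess(traces):
    """Phase 1: normalize each incoming trace as it streams through."""
    for station, samples in traces:
        peak = max(abs(s) for s in samples) or 1.0
        yield station, [s / peak for s in samples]


def cross_correlate(traces):
    """Phase 2: pair consecutive traces and emit a simple zero-lag correlation."""
    previous = None
    for station, samples in traces:
        if previous is not None:
            prev_station, prev_samples = previous
            score = sum(a * b for a, b in zip(prev_samples, samples))
            yield (prev_station, station, score)
        previous = (station, samples)


if __name__ == "__main__":
    stream = [("STA1", [0.1, 0.4, -0.2]), ("STA2", [0.2, 0.3, -0.1]), ("STA3", [0.0, 0.5, -0.3])]
    for pair in cross_correlate(preprocess(stream)):
        print(pair)
```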

  19. [Application of information management system about medical equipment].

    PubMed

    Hang, Jianjin; Zhang, Chaoqun; Wu, Xiang-Yang

    2011-05-01

    Based on workflow practice, an information management system for medical equipment was developed, and its functions, such as data gathering, browsing, querying and counting, are introduced. With dynamic and complete case management of medical equipment, the system improved the management of medical equipment.

  20. VisTrails SAHM: visualization and workflow management for species habitat modeling

    USGS Publications Warehouse

    Morisette, Jeffrey T.; Jarnevich, Catherine S.; Holcombe, Tracy R.; Talbert, Colin B.; Ignizio, Drew A.; Talbert, Marian; Silva, Claudio; Koop, David; Swanson, Alan; Young, Nicholas E.

    2013-01-01

    The Software for Assisted Habitat Modeling (SAHM) has been created to both expedite habitat modeling and help maintain a record of the various input data, pre- and post-processing steps and modeling options incorporated in the construction of a species distribution model through the established workflow management and visualization VisTrails software. This paper provides an overview of the VisTrails:SAHM software including a link to the open source code, a table detailing the current SAHM modules, and a simple example modeling an invasive weed species in Rocky Mountain National Park, USA.

  1. Scientific Workflows + Provenance = Better (Meta-)Data Management

    NASA Astrophysics Data System (ADS)

    Ludaescher, B.; Cuevas-Vicenttín, V.; Missier, P.; Dey, S.; Kianmajd, P.; Wei, Y.; Koop, D.; Chirigati, F.; Altintas, I.; Belhajjame, K.; Bowers, S.

    2013-12-01

    The origin and processing history of an artifact is known as its provenance. Data provenance is an important form of metadata that explains how a particular data product came about, e.g., how and when it was derived in a computational process, which parameter settings and input data were used, etc. Provenance information provides transparency and helps to explain and interpret data products. Other common uses and applications of provenance include quality control, data curation, result debugging, and more generally, 'reproducible science'. Scientific workflow systems (e.g. Kepler, Taverna, VisTrails, and others) provide controlled environments for developing computational pipelines with built-in provenance support. Workflow results can then be explained in terms of workflow steps, parameter settings, input data, etc. using provenance that is automatically captured by the system. Scientific workflows themselves provide a user-friendly abstraction of the computational process and are thus a form of ('prospective') provenance in their own right. The full potential of provenance information is realized when combining workflow-level information (prospective provenance) with trace-level information (retrospective provenance). To this end, the DataONE Provenance Working Group (ProvWG) has developed an extension of the W3C PROV standard, called D-PROV. Whereas PROV provides a 'least common denominator' for exchanging and integrating provenance information, D-PROV adds new 'observables' that describe workflow-level information (e.g., the functional steps in a pipeline), as well as workflow-specific trace-level information (timestamps for each workflow step executed, the inputs and outputs used, etc.). Using examples, we will demonstrate how the combination of prospective and retrospective provenance provides added value in managing scientific data. The DataONE ProvWG is also developing tools based on D-PROV that allow scientists to get more mileage from provenance metadata. DataONE is a federation of member nodes that store data and metadata for discovery and access. By enriching metadata with provenance information, search and reuse of data is enhanced, and the 'social life' of data (being the product of many workflow runs, different people, etc.) is revealed. We are currently prototyping a provenance repository (PBase) to demonstrate what can be achieved with advanced provenance queries. The ProvExplorer and ProPub tools support advanced ad-hoc querying and visualization of provenance as well as customized provenance publications (e.g., to address privacy issues, or to focus provenance on relevant details). In a parallel line of work, we are exploring ways to add provenance support to widely-used scripting platforms (e.g. R and Python) and then expose that information via D-PROV.
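
    A minimal sketch of the distinction: the declared pipeline is the prospective provenance, and the records captured per run are the retrospective provenance. The step names and record fields below are invented for illustration and are not the D-PROV schema.

```python
# Illustrative prospective vs retrospective provenance capture (not the D-PROV schema).
import time

# Prospective provenance: the declared pipeline itself.
PIPELINE = ["load_observations", "quality_filter", "compute_anomaly"]


def run_step(name, inputs):
    """Run one step and return a retrospective provenance record for it."""
    started = time.time()
    outputs = [f"{name}_output.dat"]  # stand-in for the real computation
    return {
        "step": name,
        "inputs": inputs,
        "outputs": outputs,
        "started": started,
        "ended": time.time(),
    }


def run_pipeline(initial_inputs):
    trace, inputs = [], initial_inputs
    for step in PIPELINE:
        record = run_step(step, inputs)
        trace.append(record)
        inputs = record["outputs"]
    return trace


if __name__ == "__main__":
    for record in run_pipeline(["raw_observations.csv"]):
        print(record["step"], "<-", record["inputs"])
```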

  2. Cloud-based bioinformatics workflow platform for large-scale next-generation sequencing analyses

    PubMed Central

    Liu, Bo; Madduri, Ravi K; Sotomayor, Borja; Chard, Kyle; Lacinski, Lukasz; Dave, Utpal J; Li, Jianqiang; Liu, Chunchen; Foster, Ian T

    2014-01-01

    Due to the upcoming data deluge of genome data, the need for storing and processing large-scale genome data, easy access to biomedical analyses tools, efficient data sharing and retrieval has presented significant challenges. The variability in data volume results in variable computing and storage requirements, therefore biomedical researchers are pursuing more reliable, dynamic and convenient methods for conducting sequencing analyses. This paper proposes a Cloud-based bioinformatics workflow platform for large-scale next-generation sequencing analyses, which enables reliable and highly scalable execution of sequencing analyses workflows in a fully automated manner. Our platform extends the existing Galaxy workflow system by adding data management capabilities for transferring large quantities of data efficiently and reliably (via Globus Transfer), domain-specific analyses tools preconfigured for immediate use by researchers (via user-specific tools integration), automatic deployment on Cloud for on-demand resource allocation and pay-as-you-go pricing (via Globus Provision), a Cloud provisioning tool for auto-scaling (via HTCondor scheduler), and the support for validating the correctness of workflows (via semantic verification tools). Two bioinformatics workflow use cases as well as performance evaluation are presented to validate the feasibility of the proposed approach. PMID:24462600

  3. Development of the workflow kine systems for support on KAIZEN.

    PubMed

    Mizuno, Yuki; Ito, Toshihiko; Yoshikawa, Toru; Yomogida, Satoshi; Morio, Koji; Sakai, Kazuhiro

    2012-01-01

    In this paper, we introduce a new workflow line system consisting of location and image recording, which enables the acquisition of workflow information and its analysis and display. From the results of a workflow line investigation, we consider the anticipated effects on KAIZEN and the associated problems. Workflow line information includes location information and action content information. These technologies suggest viewpoints to support improvement, for example the elimination of unnecessary movement, the redesign of layouts, and the review of work procedures. In a manufacturing factory, it was clear that there was considerable movement away from the standard operation place and accumulated residence time. As a concrete result of this investigation, a more efficient layout was suggested by the system. In the case of a hospital, similarly, it was found that the workflow has problems of layout and setup operations when compared with the effective movement patterns of experts. This system can be adapted to routine as well as non-routine work. Through the development of this system, which can fit and adapt to industrial diversification, more effective "visual management" (visualization of work) is expected in the future.

  4. Cloud-based bioinformatics workflow platform for large-scale next-generation sequencing analyses.

    PubMed

    Liu, Bo; Madduri, Ravi K; Sotomayor, Borja; Chard, Kyle; Lacinski, Lukasz; Dave, Utpal J; Li, Jianqiang; Liu, Chunchen; Foster, Ian T

    2014-06-01

    Due to the upcoming data deluge of genome data, the need for storing and processing large-scale genome data, easy access to biomedical analyses tools, efficient data sharing and retrieval has presented significant challenges. The variability in data volume results in variable computing and storage requirements, therefore biomedical researchers are pursuing more reliable, dynamic and convenient methods for conducting sequencing analyses. This paper proposes a Cloud-based bioinformatics workflow platform for large-scale next-generation sequencing analyses, which enables reliable and highly scalable execution of sequencing analyses workflows in a fully automated manner. Our platform extends the existing Galaxy workflow system by adding data management capabilities for transferring large quantities of data efficiently and reliably (via Globus Transfer), domain-specific analyses tools preconfigured for immediate use by researchers (via user-specific tools integration), automatic deployment on Cloud for on-demand resource allocation and pay-as-you-go pricing (via Globus Provision), a Cloud provisioning tool for auto-scaling (via HTCondor scheduler), and the support for validating the correctness of workflows (via semantic verification tools). Two bioinformatics workflow use cases as well as performance evaluation are presented to validate the feasibility of the proposed approach. Copyright © 2014 Elsevier Inc. All rights reserved.

  5. A case study on the impacts of computerized provider order entry (CPOE) system on hospital clinical workflow.

    PubMed

    Mominah, Maher; Yunus, Faisel; Househ, Mowafa S

    2013-01-01

    Computerized provider order entry (CPOE) is a health informatics system that helps health care providers create and manage orders for medications and other health care services. Through the automation of the ordering process, CPOE has improved the overall efficiency of hospital processes and workflow. In Saudi Arabia, CPOE has been used for years, with only a few studies evaluating the impacts of CPOE on clinical workflow. In this paper, we discuss the experience of a local hospital with the use of CPOE and its impacts on clinical workflow. Results show that there are many issues related to the implementation and use of CPOE within Saudi Arabia that must be addressed, including design, training, medication errors, alert fatigue, and system dep Recommendations for improving CPOE use within Saudi Arabia are also discussed.

  6. Management of large-scale technology

    NASA Technical Reports Server (NTRS)

    Levine, A.

    1985-01-01

    Two major themes are addressed in this assessment of the management of large-scale NASA programs: (1) how a high-technology agency was managed through a decade marked by a rapid expansion of funds and manpower in the first half and an almost as rapid contraction in the second; and (2) how NASA combined central planning and control with decentralized project execution.

  7. Responsibility, Authority, and Accountability in School-Based and Non-School-Based Management: Principals' Coping Strategies

    ERIC Educational Resources Information Center

    Grinshtain, Yael; Gibton, Dan

    2018-01-01

    Purpose: The purpose of this paper is to understand how primary school principals in Israel cope with the gaps between authority and responsibility in their work, deriving from partially implemented decentralization processes, and how this relates to school-based management (SBM) and accountability principles. Design/methodology/approach: Using…

  8. Decentralization and Participatory Decision-Making: Implementing School-Based Management in the Abbott Districts.

    ERIC Educational Resources Information Center

    Walker, Elaine M.

    2000-01-01

    This study examined issues faced during implementation of school-based management (SBM) in New Jersey's special needs or Abbott districts, using a literature review, surveys of K-12 schools, and focus groups with central office administrators. The study examined forms of SBM, team operations, local autonomy versus state power, skills required to…

  9. A Hybrid Task Graph Scheduler for High Performance Image Processing Workflows.

    PubMed

    Blattner, Timothy; Keyrouz, Walid; Bhattacharyya, Shuvra S; Halem, Milton; Brady, Mary

    2017-12-01

    Designing applications for scalability is key to improving their performance in hybrid and cluster computing. Scheduling code to utilize parallelism is difficult, particularly when dealing with data dependencies, memory management, data motion, and processor occupancy. The Hybrid Task Graph Scheduler (HTGS) is an abstract execution model, framework, and API that increases programmer productivity when implementing hybrid workflows for multi-core and multi-GPU systems. HTGS manages dependencies between tasks, represents CPU and GPU memories independently, overlaps computations with disk I/O and memory transfers, keeps multiple GPUs occupied, and uses all available compute resources. Through these abstractions, data motion and memory are explicit; this makes data locality decisions more accessible. To demonstrate the HTGS application program interface (API), we present implementations of two example algorithms: (1) a matrix multiplication that shows how easily task graphs can be used; and (2) a hybrid implementation of microscopy image stitching that reduces code size by ≈43% compared to a manually coded hybrid workflow implementation and showcases the minimal overhead of task graphs in HTGS. Both of the HTGS-based implementations show good performance. In image stitching, the HTGS implementation achieves performance similar to the hybrid workflow implementation. Matrix multiplication with HTGS achieves 1.3× and 1.8× speedup over the multi-threaded OpenBLAS library for 16k × 16k and 32k × 32k matrices, respectively.
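
    A compact sketch of the underlying idea, a graph of tasks whose dependencies gate execution, with ready tasks dispatched to a pool of workers. It is written in Python for brevity, uses invented task names, and is not the HTGS C++ API.

```python
# Dependency-gated task execution sketch (not the HTGS C++ API).
from concurrent.futures import ThreadPoolExecutor


def execute_graph(tasks, dependencies, max_workers=4):
    """tasks: name -> callable; dependencies: name -> set of prerequisite names."""
    done = set()
    with ThreadPoolExecutor(max_workers=max_workers) as pool:
        while len(done) < len(tasks):
            # A task is ready once all of its prerequisites have completed.
            ready = [name for name in tasks
                     if name not in done and dependencies.get(name, set()) <= done]
            if not ready:
                raise ValueError("cyclic or unsatisfiable dependencies")
            for name, _ in zip(ready, pool.map(lambda n: tasks[n](), ready)):
                done.add(name)
    return done


if __name__ == "__main__":
    tasks = {"read_tiles": lambda: print("read"),
             "stitch": lambda: print("stitch"),
             "write_mosaic": lambda: print("write")}
    deps = {"stitch": {"read_tiles"}, "write_mosaic": {"stitch"}}
    execute_graph(tasks, deps)
```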

  10. A two-level discount model for coordinating a decentralized supply chain considering stochastic price-sensitive demand

    NASA Astrophysics Data System (ADS)

    Heydari, Jafar; Norouzinasab, Yousef

    2015-12-01

    In this paper, a discount model is proposed to coordinate pricing and ordering decisions in a two-echelon supply chain (SC). Demand is stochastic and price sensitive, while lead times are fixed. Decentralized decision making, in which the downstream member decides on the selling price and order size, is investigated. Then, joint pricing and ordering decisions are derived for the case in which both members act as a single entity aiming to maximize whole-SC profit. Finally, a coordination mechanism based on quantity discounts is proposed to coordinate both pricing and ordering decisions simultaneously. The proposed two-level discount policy can be characterized from two aspects: (1) the marketing viewpoint: a retail price discount to increase the demand, and (2) the operations management viewpoint: a wholesale price discount to induce the retailer to adjust its order quantity and selling price jointly. Results of numerical experiments demonstrate that the proposed policy is suitable for coordinating the SC and improves the profitability of the SC as well as of all SC members in comparison with decentralized decision making.
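
    As a schematic of the two levers described above, written only as a generic template rather than the paper's exact model: the wholesale price decreases once the order quantity passes a breakpoint, and the coordinated retail price is set below the decentralized price to lift the price-sensitive demand. All symbols and breakpoints are illustrative assumptions.

```latex
% Illustrative two-level discount structure (generic template, not the paper's formulation)
\begin{align*}
w(q) &= \begin{cases} w_0, & q < q^{*} \\ w_0\,(1-\delta_w), & q \ge q^{*} \end{cases}
 && \text{(wholesale quantity discount)} \\
p' &= p\,(1-\delta_r), \qquad \mathbb{E}\bigl[D(p')\bigr] > \mathbb{E}\bigl[D(p)\bigr]
 && \text{(retail price discount under price-sensitive demand)}
\end{align*}
```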

  11. Federalism and decentralization: impact on international and Brazilian health policies.

    PubMed

    Leite, Valéria Rodrigues; de Vasconcelos, Cipriano Maia; Lima, Kenio Costa

    2011-01-01

    This article discusses the implications of decentralization in the light of international and Brazilian federalism, and its effects on public health policy. In a comparative analysis among countries, the authors find there is no single model; rather, each country has a unique structure of institutions and norms that have important implications for the operation of its health system. Brazil shares some similarities with other countries that have adopted a decentralized system and is assuming features ever closer to U.S. federalism, with a complex web of relationships. The degree of inequality among Brazilian municipalities and states, along with the budgetary imbalances caused by the minimal levels of resource utilization, undermines Brazil's constitutional principles and, consequently, its federalism. To ensure the constitutional mandate in Brazil, it is essential, as in other countries, to create a stable source of funds and increase the volume and efficiency of spending. Also important are investing in the training of managers, improving information systems, strengthening the principles of autonomy and interdependence, and defining patterns of cooperation within the federation.

  12. Fuzzy bilevel programming with multiple non-cooperative followers: model, algorithm and application

    NASA Astrophysics Data System (ADS)

    Ke, Hua; Huang, Hu; Ralescu, Dan A.; Wang, Lei

    2016-04-01

    In centralized decision problems, it is not complicated for decision-makers to select modelling techniques under uncertainty. When a decentralized decision problem is considered, however, choosing appropriate models is no longer easy, due to the difficulty in estimating the other decision-makers' inconclusive decision criteria. These decision criteria may vary among decision-makers because of their particular risk tolerances and management requirements. Considering the general differences among decision-makers in decentralized systems, we propose a general framework of fuzzy bilevel programming including hybrid models (integrated with different modelling methods at different levels). Specifically, we discuss two of these models, which may have wide applications in many fields. Furthermore, we apply the two proposed models to formulate a pricing decision problem in a decentralized supply chain with fuzzy coefficients. In order to solve these models, a hybrid intelligent algorithm integrating fuzzy simulation, neural networks and particle swarm optimization based on a penalty function approach is designed. Some suggestions on the applications of these models are also presented.
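
    For orientation, a generic bilevel program with multiple non-cooperative followers can be written as below; this is a standard textbook form, not the specific fuzzy models proposed in the paper, and the symbols are illustrative.

```latex
% Generic bilevel program with m non-cooperative followers (textbook form, not the paper's fuzzy models)
\begin{align*}
\max_{x}\;      & F\bigl(x,\, y_1^{*}, \ldots, y_m^{*}\bigr) && \text{(leader's decision)} \\
\text{s.t.}\;   & y_i^{*} \in \arg\max_{y_i}\; f_i\bigl(x,\, y_i,\, y_{-i}^{*}\bigr),
                \quad i = 1, \ldots, m && \text{(followers' non-cooperative equilibrium)}
\end{align*}
```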

  13. 76 FR 54190 - Proposed Privacy Act System of Records

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-08-31

    ... decentralized, with each mission area and agency managing its respective FOIA programs. At the end of each year...-know basis. Role-based access controls are used, and FX is only accessible via the Internet using USDA...

  14. [Ongoing, flexible distance learning through the Internet: course on decentralized management of human resources in health care].

    PubMed

    Struchiner, Miriam; Roschke, Maria Alice; Ricciardi, Regina Maria Vieira

    2002-03-01

    This paper describes the Course on Decentralized Management of Human Resources in Health Care, which is an Internet-based distance learning program to train and provide continuing education for health care professionals. The program is an initiative of the Pan American Health Organization, and it was organized in response to the growing need for self-reliant professionals who can constantly upgrade their knowledge without having to leave their place of work. The proposed model promotes an educational process that brings together theory and practice in realistic and relevant contexts and that maximizes the participation of students, both individually and in groups. The program has been evaluated in pilot studies in Brazil, Chile, and Peru. Following these assessments, the course has been adapted to facilitate its implementation and to adjust its contents to fit each country's circumstances.

  15. A network approach to decentralized coordination of energy production-consumption grids

    PubMed Central

    Arenas, Alex

    2018-01-01

    Energy grids are facing a relatively new paradigm consisting in the formation of local distributed energy sources and loads that can operate in parallel, independently of the main power grid (usually called microgrids). One of the main challenges in managing microgrid-like networks is that of self-adapting to production and demand in a decentralized, coordinated way. Here, we propose a stylized model that allows the coordination of the elements in the network to be predicted analytically, depending on the network topology. Surprisingly, almost global coordination is attained when users interact locally, with a small neighborhood, instead of the obvious but more costly all-to-all coordination. We compute analytically the optimal value of coordinated users in random homogeneous networks. The proposed methodology opens a new way of confronting the analysis of energy demand-side management in networked systems. PMID:29364962

  16. A new approach to implementing decentralized wastewater treatment concepts.

    PubMed

    van Afferden, Manfred; Cardona, Jaime A; Lee, Mi-Yong; Subah, Ali; Müller, Roland A

    2015-01-01

    Planners and decision-makers in the wastewater sector are often confronted with the problem of identifying adequate development strategies and the most suitable finance schemes for decentralized wastewater infrastructure. This research focuses on providing an approach in support of such decision-making. It is based on basic principles that stand for an integrated perspective towards sustainable wastewater management. We operationalize these principles by means of a geographic information system (GIS)-based approach, 'Assessment of Local Lowest-Cost Wastewater Solutions' (ALLOWS). The main product of ALLOWS is the identification of cost-effective local wastewater management solutions for any given demographic and physical context. By using universally available input data, the tool allows decision-makers to compare different wastewater solutions for any given wastewater situation. This paper introduces the ALLOWS-GIS tool. Its application and functionality are illustrated by assessing different wastewater solutions for two neighboring communities in rural Jordan.

  17. A Scalable, Open Source Platform for Data Processing, Archiving and Dissemination

    DTIC Science & Technology

    2016-01-01

    The report describes the Object Oriented Data Technology (OODT) big data toolkit developed by NASA and the Workflow INstance Generation and Selection (WINGS) scientific workflow system, applied to several challenge big data problems, and demonstrates the utility of OODT-WINGS in addressing them. Specific demonstrated analyses are addressed. Keywords: open source software, Apache, Object Oriented Data Technology, OODT, semantic workflows, WINGS, big data, workflow management.

  18. Voice, Collaboration and School Culture: Creating a Community for School Improvement. Evaluation of the Pioneer SCBM Schools, Hawaii's School/Community-Based Management Initiative. Executive Summary.

    ERIC Educational Resources Information Center

    Izu, Jo Ann; And Others

    Site-based management is designed to bring decision making to the school level and involve all stakeholders in a process that will result ultimately in improved student outcomes. Enacted into law in June 1989, Hawaii's School/Community-Based Management Initiative (SCBM) is part of a national trend toward decentralizing decision making and…

  19. Multi-core processing and scheduling performance in CMS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hernandez, J. M.; Evans, D.; Foulkes, S.

    2012-01-01

    Commodity hardware is going many-core. We might soon not be able to satisfy the job memory needs per core in the current single-core processing model in High Energy Physics. In addition, an ever increasing number of independent and incoherent jobs running on the same physical hardware without sharing resources might significantly affect processing performance. It will be essential to effectively utilize the multi-core architecture. CMS has incorporated support for multi-core processing in the event processing framework and the workload management system. Multi-core processing jobs share common data in memory, such as the code libraries, detector geometry and conditions data, resulting in a much lower memory usage than standard single-core independent jobs. Exploiting this new processing model requires a new model in computing resource allocation, departing from the standard single-core allocation for a job. The experiment job management system needs to have control over a larger quantum of resource, since multi-core aware jobs require the scheduling of multiple cores simultaneously. CMS is exploring the approach of using whole nodes as the unit in the workload management system, where all cores of a node are allocated to a multi-core job. Whole-node scheduling allows for optimization of the data/workflow management (e.g. I/O caching, local merging), but efficient utilization of all scheduled cores is challenging. Dedicated whole-node queues have been set up at all Tier-1 centers for exploring multi-core processing workflows in CMS. We present an evaluation of the performance of scheduling and executing multi-core workflows in whole-node queues compared to the standard single-core processing workflows.

  20. Akuna: An Open Source User Environment for Managing Subsurface Simulation Workflows

    NASA Astrophysics Data System (ADS)

    Freedman, V. L.; Agarwal, D.; Bensema, K.; Finsterle, S.; Gable, C. W.; Keating, E. H.; Krishnan, H.; Lansing, C.; Moeglein, W.; Pau, G. S. H.; Porter, E.; Scheibe, T. D.

    2014-12-01

    The U.S. Department of Energy (DOE) is investing in development of a numerical modeling toolset called ASCEM (Advanced Simulation Capability for Environmental Management) to support modeling analyses at legacy waste sites. ASCEM is an open source and modular computing framework that incorporates new advances and tools for predicting contaminant fate and transport in natural and engineered systems. The ASCEM toolset includes both a Platform with Integrated Toolsets (called Akuna) and a High-Performance Computing multi-process simulator (called Amanzi). The focus of this presentation is on Akuna, an open-source user environment that manages subsurface simulation workflows and associated data and metadata. In this presentation, key elements of Akuna are demonstrated, which includes toolsets for model setup, database management, sensitivity analysis, parameter estimation, uncertainty quantification, and visualization of both model setup and simulation results. A key component of the workflow is in the automated job launching and monitoring capabilities, which allow a user to submit and monitor simulation runs on high-performance, parallel computers. Visualization of large outputs can also be performed without moving data back to local resources. These capabilities make high-performance computing accessible to the users who might not be familiar with batch queue systems and usage protocols on different supercomputers and clusters.

  1. Cost comparison of centralized and decentralized wastewater management systems using optimization model.

    PubMed

    Jung, Youngmee Tiffany; Narayanan, N C; Cheng, Yu-Ling

    2018-05-01

    There is a growing interest in decentralized wastewater management (DWWM) as a potential alternative to centralized wastewater management (CWWM) in developing countries. However, the comparative cost of CWWM and DWWM is not well understood. In this study, the cost of cluster-type DWWM is simulated and compared to the cost of CWWM in Alibag, India. A three-step model is built to simulate a broad range of potential DWWM configurations with varying number and layout of cluster subsystems. The considered DWWM scheme consists of cluster subsystems, each of which uses simplified sewers and DEWATS (Decentralized Wastewater Treatment Systems). The CWWM considered uses a conventional sewer network and an activated sludge plant. The results show that the cost of DWWM can vary significantly with the number and layout of the comprising cluster subsystems. The cost of DWWM increased nonlinearly with an increasing number of clusters, mainly due to the loss of economies of scale for DEWATS. For configurations with the same number of cluster subsystems, the cost of DWWM varied by ±5% around the mean, depending on the layout of the cluster subsystems. In comparison to CWWM, DWWM was of lower cost than CWWM when configured with fewer than 16 clusters in Alibag, with significantly lower operation and maintenance requirements, but with higher capital and land requirements for construction. The study demonstrates that cluster-type DWWM using simplified sewers and DEWATS may be a cost-competitive alternative to CWWM, when carefully configured to lower the cost. Copyright © 2018 Elsevier Ltd. All rights reserved.
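
    The nonlinear cost growth with the number of clusters is the classic loss of economies of scale. The sketch below illustrates the effect with a generic power-law cost curve; the coefficient, exponent, and flow values are arbitrary illustrative assumptions, not figures from the paper.

```python
# Illustrative sketch (not the paper's model): a power-law economies-of-scale
# cost curve, cost(Q) = k * Q**b with b < 1, shows why splitting a fixed total
# flow Q across n equal cluster plants raises total treatment cost nonlinearly.
# k and b below are arbitrary illustrative values, not calibrated to Alibag.

def treatment_cost(flow_m3_per_day, k=1000.0, b=0.7):
    """Capital-style cost of one plant treating the given flow."""
    return k * flow_m3_per_day ** b

def decentralized_cost(total_flow, n_clusters, k=1000.0, b=0.7):
    """Total cost when the flow is split evenly across n cluster plants."""
    return n_clusters * treatment_cost(total_flow / n_clusters, k, b)

if __name__ == "__main__":
    total = 5000.0  # m3/day, illustrative
    for n in (1, 4, 16, 64):
        ratio = decentralized_cost(total, n) / treatment_cost(total)
        print(f"{n:3d} clusters -> {ratio:.2f}x the single-plant treatment cost")
    # With b = 0.7 the ratio grows as n**(1-b): 1.00x, 1.52x, 2.30x, 3.48x.
```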

  2. POTENTIAL AQUATIC COMMUNITY IMPROVEMENT THROUGH A MULTIDISCIPLINARY STORMWATER MANAGEMENT EXPERIMENT

    EPA Science Inventory

    Small-scale urban stream restoration efforts (e.g., riparian planting and in-stream habitat structures) often fail to improve ecological structure and function due to the continuous hydrologic and chemical disturbances posed by impervious surfaces upstream. Decentralized stormwater...

  3. Implementing standards for the interoperability among healthcare providers in the public regionalized Healthcare Information System of the Lombardy Region.

    PubMed

    Barbarito, Fulvio; Pinciroli, Francesco; Mason, John; Marceglia, Sara; Mazzola, Luca; Bonacina, Stefano

    2012-08-01

    Information technologies (ITs) have now entered the everyday workflow of a variety of healthcare providers with a certain degree of independence. This independence may be the cause of difficulty in interoperability between information systems, and it can be overcome through the implementation and adoption of standards. Here we present the case of the Lombardy Region, in Italy, which has been able, in the last 10 years, to set up the Regional Social and Healthcare Information System, connecting all the healthcare providers within the region, and providing full access to clinical and health-related documents independently from the healthcare organization that generated the document itself. This goal, in a region with almost 10 million citizens, was achieved through a twofold approach: first, the political and operative push towards the adoption of the Health Level 7 (HL7) standard within single hospitals and, second, the provision of a technological infrastructure for data sharing based on interoperability specifications recognized at the regional level for messages transmitted from healthcare providers to the central domain. The adoption of such regional interoperability specifications enabled the communication among heterogeneous systems placed in different hospitals in Lombardy. Integrating the Healthcare Enterprise (IHE) integration profiles, which refer to HL7 standards, are adopted within hospitals for message exchange and for the definition of integration scenarios. The IHE patient administration management (PAM) profile with its different workflows is adopted for patient management, whereas the Scheduled Workflow (SWF), the Laboratory Testing Workflow (LTW), and the Ambulatory Testing Workflow (ATW) are adopted for order management. At present, the system manages 4,700,000 pharmacological e-prescriptions and 1,700,000 e-prescriptions for laboratory exams per month. It produces, monthly, 490,000 laboratory medical reports, 180,000 radiology medical reports, 180,000 first aid medical reports, and 58,000 discharge summaries. Hence, although work is still in progress, the Lombardy Region healthcare system is a fully interoperable social healthcare system connecting patients, healthcare providers, healthcare organizations, and healthcare professionals in a large and heterogeneous territory through the implementation of international health standards. Copyright © 2012 Elsevier Inc. All rights reserved.

  4. Parametric Workflow (BIM) for the Repair Construction of Traditional Historic Architecture in Taiwan

    NASA Astrophysics Data System (ADS)

    Ma, Y.-P.; Hsu, C. C.; Lin, M.-C.; Tsai, Z.-W.; Chen, J.-Y.

    2015-08-01

    In Taiwan, numerous existing traditional buildings are constructed with wooden structures, brick structures, and stone structures. This paper focuses on Taiwanese traditional historic architecture, targets traditional wooden-structure buildings as the design proposition, and develops a BIM workflow for modeling complex wooden combination geometry, integrating it with traditional 2D documents, and visualizing repair construction assumptions within the 3D model representation. The goal of this article is to explore the current problems to overcome in wooden historic building conservation, and to introduce BIM technology for conserving, documenting, managing, and creating full engineering drawings and information to effectively support historic conservation. Although BIM is mostly oriented to current construction praxis, there have been some attempts to investigate its applicability in historic conservation projects. This article also illustrates the importance and advantages of using a BIM workflow in the repair construction process, compared with a generic workflow.

  5. Task Delegation Based Access Control Models for Workflow Systems

    NASA Astrophysics Data System (ADS)

    Gaaloul, Khaled; Charoy, François

    e-Government organisations are facilitated and conducted using workflow management systems. Role-based access control (RBAC) is recognised as an efficient access control model for large organisations. The application of RBAC in workflow systems cannot, however, grant permissions to users dynamically while business processes are being executed. We currently observe a move away from predefined, strict workflow modelling towards approaches supporting flexibility at the organisational level. One specific approach is that of task delegation. Task delegation is a mechanism that supports organisational flexibility and ensures delegation of authority in access control systems. In this paper, we propose a Task-oriented Access Control (TAC) model based on RBAC to address these requirements. We aim to reason about tasks from organisational and resource perspectives to analyse and specify authorisation constraints. Moreover, we present a fine-grained access control protocol to support delegation based on the TAC model.
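
    As a rough illustration of how task delegation can extend static role-based permissions, the sketch below grants a delegatee a task's permissions only while a delegation is active. It is a toy example with invented names, not the paper's formal TAC model or protocol.

```python
# Minimal sketch of the idea (not the paper's formal TAC model): static
# role-based permissions plus per-task delegation records that grant a
# delegatee a task's permissions only while that delegation is active.

ROLE_PERMS = {
    "clerk":   {"submit_request"},
    "manager": {"submit_request", "approve_request"},
}

USER_ROLES = {"alice": "manager", "bob": "clerk"}

TASK_PERMS = {"approve_travel": {"approve_request"}}

# (task, delegator, delegatee) tuples for currently active delegations.
active_delegations = set()

def delegate(task, delegator, delegatee):
    """Delegator hands the permissions of one task to a delegatee."""
    if TASK_PERMS[task] <= ROLE_PERMS[USER_ROLES[delegator]]:
        active_delegations.add((task, delegator, delegatee))

def can_perform(user, task, permission):
    """Allowed via the user's own role, or via an active delegation of the task."""
    if permission in ROLE_PERMS[USER_ROLES[user]]:
        return True
    return any(t == task and d == user and permission in TASK_PERMS[t]
               for (t, _, d) in active_delegations)

if __name__ == "__main__":
    print(can_perform("bob", "approve_travel", "approve_request"))   # False
    delegate("approve_travel", "alice", "bob")
    print(can_perform("bob", "approve_travel", "approve_request"))   # True
```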

  6. A Web application for the management of clinical workflow in image-guided and adaptive proton therapy for prostate cancer treatments.

    PubMed

    Yeung, Daniel; Boes, Peter; Ho, Meng Wei; Li, Zuofeng

    2015-05-08

    Image-guided radiotherapy (IGRT), based on radiopaque markers placed in the prostate gland, was used for proton therapy of prostate patients. Orthogonal X-rays and the IBA Digital Image Positioning System (DIPS) were used for setup correction prior to treatment and were repeated after treatment delivery. Following a rationale for margin estimates similar to that of van Herk,(1) the daily post-treatment DIPS data were analyzed to determine if an adaptive radiotherapy plan was necessary. A Web application using ASP.NET MVC5, Entity Framework, and an SQL database was designed to automate this process. The designed features included state-of-the-art Web technologies, a domain model closely matching the workflow, a database supporting concurrency and data mining, access to the DIPS database, secure user access and role management, and graphing and analysis tools. The Model-View-Controller (MVC) paradigm allowed clean domain logic, unit testing, and extensibility. Client-side technologies, such as jQuery, jQuery plug-ins, and Ajax, were adopted to achieve a rich user environment and fast response. Data models included patients, staff, treatment fields and records, correction vectors, DIPS images, and association logics. Data entry, analysis, workflow logics, and notifications were implemented. The system effectively modeled the clinical workflow and IGRT process.
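
    The adaptive decision described above rests on separating systematic and random setup errors from repeated position measurements. The sketch below shows one common way to do that, using the van Herk population margin recipe M = 2.5Σ + 0.7σ; the data, the 5 mm threshold, and the single-axis treatment are illustrative assumptions, not the clinic's actual rule or code.

```python
# Rough sketch of the kind of analysis described (not the clinical system's
# actual rule): pool daily post-treatment shifts per patient, estimate the
# systematic error (Sigma, SD of per-patient means) and random error (sigma,
# root-mean-square of per-patient SDs), and apply the van Herk margin recipe
# M = 2.5*Sigma + 0.7*sigma. The 5 mm action threshold below is illustrative.

import statistics
from math import sqrt

def population_margin_mm(shifts_by_patient):
    """shifts_by_patient: {patient_id: [daily shift in mm along one axis, ...]}"""
    means = [statistics.mean(s) for s in shifts_by_patient.values()]
    sds   = [statistics.stdev(s) for s in shifts_by_patient.values()]
    Sigma = statistics.stdev(means)                      # systematic error
    sigma = sqrt(sum(sd * sd for sd in sds) / len(sds))  # random error
    return 2.5 * Sigma + 0.7 * sigma

if __name__ == "__main__":
    daily_shifts = {            # invented example data, in mm
        "p1": [1.2, 0.8, 1.5, 1.1],
        "p2": [-0.4, 0.1, -0.2, 0.0],
        "p3": [2.1, 2.4, 1.9, 2.3],
    }
    margin = population_margin_mm(daily_shifts)
    print(f"required margin ~ {margin:.1f} mm")
    if margin > 5.0:                     # illustrative threshold only
        print("flag cohort/plan for adaptive review")
```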

  7. An access control model with high security for distributed workflow and real-time application

    NASA Astrophysics Data System (ADS)

    Han, Ruo-Fei; Wang, Hou-Xiang

    2007-11-01

    The traditional mandatory access control policy (MAC) is regarded as a policy with strict regulation and poor flexibility. The security policy of MAC is so restrictive that few information systems would adopt it at the cost of usability, except in particular cases with high security requirements such as military or government applications. However, with the increasing requirement for flexibility, even some access control systems in military applications have switched to role-based access control (RBAC), which is well known as flexible. Although RBAC can meet the demands for flexibility, it is weak in dynamic authorization and consequently cannot fit well in workflow management systems. The task-role-based access control (T-RBAC) model is then introduced to solve the problem. It combines the advantages of RBAC and task-based access control (TBAC), which uses tasks to manage permissions dynamically. To satisfy the requirements of systems that are distributed, well defined by workflow processes, and critical in time accuracy, this paper analyzes the spirit of MAC and introduces it into an improved T&RBAC model based on T-RBAC. Finally, a conceptual task-role-based access control model with high security for distributed workflow and real-time application (A_T&RBAC) is built, and its performance is briefly analyzed.
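
    Conceptually, layering MAC onto a task-role model means a request must pass both a flexible role/task check and a strict clearance check. The sketch below shows that combination with invented users, labels, and tasks; it is a conceptual illustration, not the A_T&RBAC model itself.

```python
# Conceptual sketch (not the A_T&RBAC model itself): a mandatory no-read-up
# clearance check layered on top of a task/role permission check, so a
# request must pass both the flexible RBAC-style test and the strict MAC test.

LEVELS = {"public": 0, "confidential": 1, "secret": 2}

USER_CLEARANCE = {"ops_officer": "secret", "analyst": "confidential"}
USER_ROLES     = {"ops_officer": {"scheduler"}, "analyst": {"viewer"}}
TASK_ROLES     = {"dispatch_mission": {"scheduler"}}   # roles allowed to run the task
OBJECT_LABEL   = {"mission_plan": "secret"}

def authorized(user, task, obj):
    """Grant access only if the role/task rule and the MAC label check both pass."""
    rbac_ok = bool(USER_ROLES[user] & TASK_ROLES.get(task, set()))
    mac_ok = LEVELS[USER_CLEARANCE[user]] >= LEVELS[OBJECT_LABEL[obj]]
    return rbac_ok and mac_ok

if __name__ == "__main__":
    print(authorized("ops_officer", "dispatch_mission", "mission_plan"))  # True
    print(authorized("analyst", "dispatch_mission", "mission_plan"))      # False (both checks fail)
```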

  8. Site-Based Management in Education: How To Make It Work in Your School.

    ERIC Educational Resources Information Center

    Candoli, I. Carl

    This handbook explains site-based management (SBM) in schools to help those who wish to decentralize their operation so that the campus-level staff have flexibility in making decisions that affect the students at their location. The text is divided into four major sections, each devoted to a particular area of concern under the SBM concept.…

  9. How Schools and Students Respond to School Improvement Programs: The Case of Brazil's PDE

    ERIC Educational Resources Information Center

    Carnoy, Martin; Gove, Amber K.; Loeb, Susanna; Marshall, Jeffrey H.; Socias, Miguel

    2008-01-01

    This study uses rich empirical data from Brazil to assess how a government program (PDE) that decentralizes school management decisions changes what goes on in schools and how these changes affect student outcomes. It appears that the PDE resulted in some improvements in management and learning materials, but little change in other areas including…

  10. Reinventing School-Based Management: A School Board Guide to School-Based Improvement.

    ERIC Educational Resources Information Center

    Drury, Darrel W.

    This report critiques the movement to decentralize decision making in public education. It provides an indepth examination of school-based management (SBM) with the aim of revealing why this type of reform seems to have had so little payoff for students. It addresses several key questions: What are the objectives of SBM, and are these objectives…

  11. Staggering successes amid controversy in California water management

    NASA Astrophysics Data System (ADS)

    Lund, J. R.

    2012-12-01

    Water in California has always been important and controversial, and it probably always will be. California has a large, growing economy and population in a semi-arid climate. But California's aridity, hydrologic variability, and water controversies have not precluded considerable economic successes. The successes of California's water system have stemmed from the decentralization of water management with historically punctuated periods of more centralized strategic decision-making. Decentralized management has allowed California's water users to efficiently explore incremental solutions to water problems, ranging from early local development of water systems (such as Hetch Hetchy, Owens Valley, and numerous local irrigation projects) to more contemporary efforts at water conservation, water markets, wastewater reuse, and conjunctive use of surface and groundwater. In the cacophony of local and stakeholder interests, strategic decisions have been more difficult, and consequently occur less frequently. California state water projects and Sacramento Valley flood control are examples where decades of effort, crises, floods and droughts were needed to mobilize local interests to agree to major strategic decisions. Currently, the state is faced with making strategic environmental and water management decisions regarding its deteriorating Sacramento-San Joaquin Delta. Not surprisingly, human uncertainties and physical and fiscal non-stationarities dominate this process.

  12. Cyberinfrastructure for End-to-End Environmental Explorations

    NASA Astrophysics Data System (ADS)

    Merwade, V.; Kumar, S.; Song, C.; Zhao, L.; Govindaraju, R.; Niyogi, D.

    2007-12-01

    The design and implementation of a cyberinfrastructure for End-to-End Environmental Exploration (C4E4) is presented. The C4E4 framework addresses the need for an integrated data/computation platform for studying broad environmental impacts by combining heterogeneous data resources with state-of-the-art modeling and visualization tools. With Purdue being a TeraGrid Resource Provider, C4E4 builds on top of the Purdue TeraGrid data management system and Grid resources, and integrates them through a service-oriented workflow system. It allows researchers to construct environmental workflows for data discovery, access, transformation, modeling, and visualization. Using the C4E4 framework, we have implemented an end-to-end SWAT simulation and analysis workflow that connects our TeraGrid data and computation resources. It enables researchers to conduct comprehensive studies on the impact of land management practices in the St. Joseph watershed using data from various sources in hydrologic, atmospheric, agricultural, and other related disciplines.

  13. gProcess and ESIP Platforms for Satellite Imagery Processing over the Grid

    NASA Astrophysics Data System (ADS)

    Bacu, Victor; Gorgan, Dorian; Rodila, Denisa; Pop, Florin; Neagu, Gabriel; Petcu, Dana

    2010-05-01

    The Environment oriented Satellite Data Processing Platform (ESIP) is developed through the SEE-GRID-SCI project (SEE-GRID eInfrastructure for regional eScience), co-funded by the European Commission through FP7 [1]. The gProcess Platform [2] is a set of tools and services supporting the development and execution over the Grid of workflow-based processing, and particularly satellite imagery processing. ESIP [3], [4] is built on top of the gProcess platform by adding a set of satellite image processing software modules and meteorological algorithms. Satellite images can reveal and supply important information on earth surface parameters, climate data, pollution levels, and weather conditions that can be used in different research areas. Generally, the processing algorithms for satellite images can be decomposed into a set of modules that form a graph representation of the processing workflow. Two types of workflows can be defined in the gProcess platform: the abstract workflow (PDG - Process Description Graph), in which the user defines the algorithm conceptually, and the instantiated workflow (iPDG - instantiated PDG), which is the mapping of the PDG pattern onto particular satellite image and meteorological data [5]. The gProcess platform allows the definition of complex workflows by combining data resources, operators, services and sub-graphs. The gProcess platform is developed for the gLite middleware that is available in the EGEE and SEE-GRID infrastructures [6]. gProcess exposes its functionality through web services [7]. The Editor Web Service retrieves information on available resources that are used to develop complex workflows (available operators, sub-graphs, services, supported resources, etc.). The Manager Web Service deals with resource management (uploading new resources such as workflows, operators, services, data, etc.) and in addition retrieves information on workflows. The Executor Web Service manages the execution of the instantiated workflows on the Grid infrastructure. In addition, this web service monitors the execution and generates statistical data that are important for evaluating performance and optimizing execution. The Viewer Web Service allows access to input and output data. To prove and validate the utility of the gProcess and ESIP platforms, the GreenView and GreenLand applications were developed. The GreenView-related functionality includes the refinement of some meteorological data, such as temperature, and the calibration of the satellite images based on field measurements. The GreenLand application performs the classification of satellite images by using a set of vegetation indices. The gProcess and ESIP platforms are also used in the GiSHEO project [8] to support the processing of Earth Observation data over the Grid in eGLE (the GiSHEO eLearning Environment). Performance assessment experiments have revealed that workflow-based execution can improve the execution time of a satellite image processing algorithm [9]. Executing all workflow nodes on different machines is not always the best solution: some nodes are more time consuming than others, and the total execution time suffers because the slower nodes hold up the rest. It is therefore important to balance the workflow nodes correctly. Based on an optimization strategy, the workflow nodes can be grouped horizontally, vertically or in a hybrid approach.
In this way, the grouped operators are executed on one machine and the data transfer between workflow nodes is reduced. The dynamic nature of the Grid infrastructure makes it more exposed to failures, which can occur at the level of worker nodes, service availability, storage elements, etc. Currently, gProcess supports some basic error prevention and error management solutions; in the future, more advanced error prevention and management solutions will be integrated into the gProcess platform.
    References
    [1] SEE-GRID-SCI Project, http://www.see-grid-sci.eu/
    [2] Bacu V., Stefanut T., Rodila D., Gorgan D., Process Description Graph Composition by gProcess Platform. HiPerGRID - 3rd International Workshop on High Performance Grid Middleware, 28 May, Bucharest. Proceedings of CSCS-17 Conference, Vol.2., ISSN 2066-4451, pp. 423-430, (2009).
    [3] ESIP Platform, http://wiki.egee-see.org/index.php/JRA1_Commonalities
    [4] Gorgan D., Bacu V., Rodila D., Pop Fl., Petcu D., Experiments on ESIP - Environment oriented Satellite Data Processing Platform. SEE-GRID-SCI User Forum, 9-10 Dec 2009, Bogazici University, Istanbul, Turkey, ISBN: 978-975-403-510-0, pp. 157-166 (2009).
    [5] Radu, A., Bacu, V., Gorgan, D., Diagrammatic Description of Satellite Image Processing Workflow. Workshop on Grid Computing Applications Development (GridCAD) at the SYNASC Symposium, 28 September 2007, Timisoara, IEEE Computer Press, ISBN 0-7695-3078-8, 2007, pp. 341-348 (2007).
    [6] Gorgan D., Bacu V., Stefanut T., Rodila D., Mihon D., Grid based Satellite Image Processing Platform for Earth Observation Applications Development. IDAACS'2009 - IEEE Fifth International Workshop on "Intelligent Data Acquisition and Advanced Computing Systems: Technology and Applications", 21-23 September, Cosenza, Italy, IEEE Published in Computer Press, 247-252 (2009).
    [7] Rodila D., Bacu V., Gorgan D., Integration of Satellite Image Operators as Workflows in the gProcess Application. Proceedings of ICCP2009 - IEEE 5th International Conference on Intelligent Computer Communication and Processing, 27-29 Aug, 2009 Cluj-Napoca. ISBN: 978-1-4244-5007-7, pp. 355-358 (2009).
    [8] GiSHEO consortium, Project site, http://gisheo.info.uvt.ro
    [9] Bacu V., Gorgan D., Graph Based Evaluation of Satellite Imagery Processing over Grid. ISPDC 2008 - 7th International Symposium on Parallel and Distributed Computing, July 1-5, 2008, Krakow, Poland. IEEE Computer Society 2008, ISBN: 978-0-7695-3472-5, pp. 147-154.
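
    As a toy illustration of the grouping idea, the sketch below performs a simple "vertical" grouping, merging one-to-one chains of workflow nodes so they would run on one machine and avoid intermediate data transfers. It is not gProcess code, and the example graph is invented.

```python
# Illustrative sketch only (not gProcess code): "vertical" grouping of a
# workflow graph, merging single-predecessor/single-successor chains so the
# chained operators run on one machine and inter-node data transfer is avoided.

from collections import defaultdict

def vertical_groups(edges, nodes):
    """edges: list of (src, dst); returns lists of nodes to co-locate."""
    succ, pred = defaultdict(list), defaultdict(list)
    for s, d in edges:
        succ[s].append(d)
        pred[d].append(s)

    grouped, groups = set(), []
    for n in nodes:
        if n in grouped:
            continue
        chain = [n]
        cur = n
        # Extend the chain while the link is one-to-one in both directions.
        while len(succ[cur]) == 1 and len(pred[succ[cur][0]]) == 1:
            cur = succ[cur][0]
            chain.append(cur)
        grouped.update(chain)
        groups.append(chain)
    return groups

if __name__ == "__main__":
    nodes = ["ingest", "calibrate", "index", "classify", "merge"]
    edges = [("ingest", "calibrate"), ("calibrate", "index"),
             ("calibrate", "classify"), ("index", "merge"), ("classify", "merge")]
    print(vertical_groups(edges, nodes))
    # [['ingest', 'calibrate'], ['index'], ['classify'], ['merge']]
```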

  14. Experiences and lessons learned from creating a generalized workflow for data publication of field campaign datasets

    NASA Astrophysics Data System (ADS)

    Santhana Vannan, S. K.; Ramachandran, R.; Deb, D.; Beaty, T.; Wright, D.

    2017-12-01

    This paper summarizes the workflow challenges of curating and publishing data produced from disparate data sources and provides a generalized workflow solution to efficiently archive data generated by researchers. The Oak Ridge National Laboratory Distributed Active Archive Center (ORNL DAAC) for biogeochemical dynamics and the Global Hydrology Resource Center (GHRC) DAAC have been collaborating on the development of a generalized workflow solution to efficiently manage the data publication process. The generalized workflow presented here is built on lessons learned from implementations of the workflow system. Data publication consists of the following steps: (1) accepting the data package from the data providers and ensuring the full integrity of the data files; (2) identifying and addressing data quality issues; (3) assembling standardized, detailed metadata and documentation, including file-level details, processing methodology, and characteristics of data files; (4) setting up data access mechanisms; (5) setting up the data in data tools and services for improved data dissemination and user experience; (6) registering the dataset in online search and discovery catalogues; and (7) preserving the data location through Digital Object Identifiers (DOIs). We will describe the steps taken to automate and realize efficiencies in the above process. The goals of the workflow system are to reduce the time taken to publish a dataset, to increase the quality of documentation and metadata, and to track individual datasets through the data curation process. Utilities developed to achieve these goals will be described. We will also share the metrics-driven value of the workflow system and discuss future steps towards the creation of a common software framework.
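
    One practical way to track individual datasets through such a curation process is a small ordered-stage record with timestamps, as in the sketch below. This is a generic illustration with invented stage names, not the ORNL DAAC/GHRC implementation.

```python
# Simple sketch (not the ORNL DAAC/GHRC system): tracking a dataset through
# ordered curation stages so the time spent in each stage can be measured,
# one of the stated goals of the generalized workflow.

from datetime import datetime, timezone

STAGES = ["received", "quality_checked", "metadata_complete",
          "access_configured", "registered", "doi_assigned", "published"]

class DatasetRecord:
    def __init__(self, name):
        self.name = name
        self.history = [("received", datetime.now(timezone.utc))]

    def advance(self):
        """Move the dataset to the next curation stage and timestamp it."""
        current = STAGES.index(self.history[-1][0])
        if current + 1 < len(STAGES):
            self.history.append((STAGES[current + 1], datetime.now(timezone.utc)))

    def status(self):
        return self.history[-1][0]

if __name__ == "__main__":
    ds = DatasetRecord("field_campaign_soil_moisture")   # hypothetical dataset name
    for _ in range(3):
        ds.advance()
    print(ds.status())                          # access_configured
    print([stage for stage, _ in ds.history])   # stages passed so far
```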

  15. Democratizing Authority in the Built Environment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Andersen, Michael P; Kolb, John; Chen, Kaifei

    Operating systems and applications in the built environment have relied upon central authorization and management mechanisms which restrict their scalability, especially with respect to administrative overhead. We propose a new set of primitives encompassing syndication, security, and service execution that unifies the management of applications and services across the built environment, while enabling participants to individually delegate privilege across multiple administrative domains with no loss of security or manageability. We show how to leverage a decentralized authorization syndication platform to extend the design of building operating systems beyond the single administrative domain of a building. The authorization system leveraged is based on blockchain smart contracts to permit decentralized and democratized delegation of authorization without central trust. Upon this, a publish/subscribe syndication tier and a containerized service execution environment are constructed. Combined, these mechanisms solve problems of delegation, federation, device protection and service execution that arise throughout the built environment. We leverage a high-fidelity city-scale emulation to verify the scalability of the authorization tier, and briefly describe a prototypical democratized operating system for the built environment using this foundation.
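
    The core idea of decentralized delegation, independent of any blockchain machinery, is that each grant in a chain must be issued by the current holder and can only narrow the delegated permissions. The sketch below checks exactly that, with no cryptography and invented names; it is a conceptual illustration, not the authors' system.

```python
# Conceptual sketch only (no blockchain or cryptography): checking a chain of
# delegation grants in which each grant's issuer must be the previous grantee
# and can only pass on a subset of the permissions it was itself granted.

def chain_grants(root_authority, root_permissions, chain, requested):
    """
    chain: list of (issuer, grantee, permissions) from the root outward.
    Returns True if the final grantee legitimately holds `requested`.
    """
    holder, held = root_authority, set(root_permissions)
    for issuer, grantee, perms in chain:
        perms = set(perms)
        if issuer != holder or not perms <= held:
            return False            # broken chain or privilege escalation
        holder, held = grantee, perms
    return requested in held

if __name__ == "__main__":
    chain = [
        ("campus_ops", "building_7", {"read_sensors", "actuate_hvac"}),
        ("building_7", "hvac_app", {"actuate_hvac"}),
    ]
    print(chain_grants("campus_ops", {"read_sensors", "actuate_hvac", "admin"},
                       chain, "actuate_hvac"))   # True
    print(chain_grants("campus_ops", {"read_sensors"},
                       chain, "actuate_hvac"))   # False: root never held actuate_hvac
```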

  16. Adopting a corporate perspective on databases. Improving support for research and decision making.

    PubMed

    Meistrell, M; Schlehuber, C

    1996-03-01

    The Veterans Health Administration (VHA) is at the forefront of designing and managing health care information systems that accommodate the needs of clinicians, researchers, and administrators at all levels. Rather than using one single-site, centralized corporate database, VHA has constructed several large databases with different configurations to meet the needs of users with different perspectives. The largest VHA database is the Decentralized Hospital Computer Program (DHCP), a multisite, distributed data system that uses decoupled hospital databases. The centralization of DHCP policy has promoted data coherence, whereas the decentralization of DHCP management has permitted system development to be done with maximum relevance to the users' local practices. A more recently developed VHA data system, the Event Driven Reporting system (EDR), uses multiple, highly coupled databases to provide workload data at facility, regional, and national levels. The EDR automatically posts a subset of DHCP data to local and national VHA management. The development of the EDR illustrates how adoption of a corporate perspective can offer significant database improvements at reasonable cost and with modest impact on the legacy system.

  17. Coupled economic-coastline modeling with suckers and free riders

    NASA Astrophysics Data System (ADS)

    Williams, Zachary C.; McNamara, Dylan E.; Smith, Martin D.; Murray, A. Brad.; Gopalakrishnan, Sathya

    2013-06-01

    Erosion is a natural trend along most sandy coastlines. Humans often respond to shoreline erosion with beach nourishment to maintain coastal property values. Locally extending the shoreline through nourishment alters alongshore sediment transport and changes shoreline dynamics in adjacent coastal regions. If left unmanaged, sandy coastlines can have spatially complex or simple patterns of erosion due to the relationship between large-scale morphology and the local wave climate. Using a numerical model that simulates spatially decentralized and locally optimal nourishment decisions characteristic of much of U.S. East Coast beach management, we find that human erosion intervention does not simply reflect the alongshore erosion pattern. Spatial interactions generate feedbacks in economic and physical variables that lead to widespread emergence of "free riders" and "suckers" with subsequent inequality in the alongshore distribution of property value. Along cuspate coastlines, such as those found along the U.S. Southeast Coast, these long-term property value differences span an order of magnitude. Results imply that spatially decentralized management of nourishment can lead to property values that are divorced from spatial erosion signals; this management approach is unlikely to be optimal.

  18. Aboard the "Moving School."

    ERIC Educational Resources Information Center

    Ainscow, Mel; Hopkins, David

    1992-01-01

    In many countries, education legislation embodies contradictory pressures for centralization and decentralization. In the United Kingdom, there is growing government control over policy and direction of schools; schools are also being given more responsibility for resource management. "Moving" schools within Improving the Quality of…

  19. Shared Decisions That Count.

    ERIC Educational Resources Information Center

    Schlechty, Phillip C.

    1993-01-01

    Advocates of participatory leadership, site-based management, and decentralization often assume that changing decision-making group composition will automatically improve the quality of decisions being made. Stakeholder satisfaction does not guarantee quality results. This article offers a framework for moving the decision-making discussion from…

  20. A Spike Cocktail Approach to Improve Microbial Performance Monitoring for Water Reuse

    EPA Science Inventory

    Water reuse, via either centralized treatment of traditional wastewater or decentralized treatment and on-site reuse, is becoming an increasingly important element of sustainable water management. Despite advances in waterborne pathogen detection methods, low and highly variable ...

  1. From the desktop to the grid: scalable bioinformatics via workflow conversion.

    PubMed

    de la Garza, Luis; Veit, Johannes; Szolek, Andras; Röttig, Marc; Aiche, Stephan; Gesing, Sandra; Reinert, Knut; Kohlbacher, Oliver

    2016-03-12

    Reproducibility is one of the tenets of the scientific method. Scientific experiments often comprise complex data flows, selection of adequate parameters, and analysis and visualization of intermediate and end results. Breaking down the complexity of such experiments into the joint collaboration of small, repeatable, well-defined tasks, each with well-defined inputs, parameters, and outputs, offers immediate benefits such as identifying bottlenecks and pinpointing sections that could benefit from parallelization. Workflows rest upon the notion of splitting complex work into the joint effort of several manageable tasks. There are several engines that give users the ability to design and execute workflows. Each engine was created to address certain problems of a specific community, therefore each one has its advantages and shortcomings. Furthermore, not all features of all workflow engines are royalty-free, an aspect that could potentially drive away members of the scientific community. We have developed a set of tools that enables the scientific community to benefit from workflow interoperability. We developed a platform-free structured representation of the parameters, inputs, and outputs of command-line tools in so-called Common Tool Descriptor documents. We have also overcome the shortcomings and combined the features of two royalty-free workflow engines with a substantial user community: the Konstanz Information Miner, an engine which we see as a formidable workflow editor, and the Grid and User Support Environment, a web-based framework able to interact with several high-performance computing resources. We have thus created a free and highly accessible way to design workflows on a desktop computer and execute them on high-performance computing resources. Our work will not only reduce the time spent on designing scientific workflows, but also make executing workflows on remote high-performance computing resources more accessible to technically inexperienced users. We strongly believe that our efforts not only decrease the turnaround time to obtain scientific results but also have a positive impact on reproducibility, thus elevating the quality of obtained scientific results.
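
    The value of a platform-neutral tool description is easiest to see in miniature: given a structured record of a tool's parameters, any engine can render a command line from it. The sketch below uses a deliberately simplified, made-up descriptor (not the actual Common Tool Descriptor schema) and a hypothetical `aligner` tool.

```python
# Hedged sketch: a minimal, made-up structured descriptor of a command-line
# tool (deliberately simpler than the actual Common Tool Descriptor format)
# and a function that renders it into an argument vector, illustrating why a
# platform-neutral description of parameters, inputs, and outputs makes the
# same tool usable from different workflow engines.

from dataclasses import dataclass, field

@dataclass
class ToolDescriptor:
    executable: str
    parameters: dict = field(default_factory=dict)   # name -> {"flag": str, "default": value}

    def to_command(self, values=None):
        """Build an argv list from defaults, overridden by caller-supplied values."""
        values = values or {}
        argv = [self.executable]
        for name, spec in self.parameters.items():
            value = values.get(name, spec.get("default"))
            if value is not None:
                argv += [spec["flag"], str(value)]
        return argv

if __name__ == "__main__":
    aligner = ToolDescriptor(
        executable="aligner",                      # hypothetical tool name
        parameters={"input": {"flag": "-i"},
                    "output": {"flag": "-o"},
                    "threads": {"flag": "-t", "default": 4}},
    )
    print(aligner.to_command({"input": "reads.fq", "output": "hits.sam"}))
    # ['aligner', '-i', 'reads.fq', '-o', 'hits.sam', '-t', '4']
```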

  2. Managing the life cycle of electronic clinical documents.

    PubMed

    Payne, Thomas H; Graham, Gail

    2006-01-01

    To develop a model of the life cycle of clinical documents from inception to use in a person's medical record, including workflow requirements from clinical practice, local policy, and regulation. We propose a model for the life cycle of clinical documents as a framework for research on documentation within electronic medical record (EMR) systems. Our proposed model includes three axes: the stages of the document, the roles of those involved with the document, and the actions those involved may take on the document at each stage. The model includes the rules that describe who (in what role) can perform what actions on the document, and at what stages they can perform them. Rules are derived from the needs of clinicians and the requirements of hospital bylaws and regulators. Our model encompasses current practices for paper medical records and workflow in some EMR systems. Commercial EMR systems include methods for implementing document workflow rules. Workflow rules that are part of this model mirror functionality in the Department of Veterans Affairs (VA) EMR system, where the Authorization/Subscription Utility permits document life cycle rules to be written in English-like fashion. Creating a model of the life cycle of clinical documents serves as a framework for discussion of document workflow, how rules governing workflow can be implemented in EMR systems, and future research on electronic documentation.
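
    The three-axis model (stage, role, action) lends itself to a simple rule-table reading: a rule says which role may take which action at which document stage. The sketch below is a generic illustration with invented stages, roles, and actions, not the VA Authorization/Subscription Utility's rule language.

```python
# Small sketch of the three-axis idea (stages, roles, actions), not the VA
# Authorization/Subscription Utility itself: a rule table keyed by document
# stage and user role that answers "who may do what, and when".

RULES = {
    ("draft", "author"):        {"edit", "sign"},
    ("draft", "attending"):     {"view"},
    ("signed", "attending"):    {"view", "cosign", "addend"},
    ("cosigned", "author"):     {"view", "addend"},
    ("cosigned", "attending"):  {"view", "addend"},
}

def may(role, action, stage):
    """True if a user in `role` may perform `action` on a document in `stage`."""
    return action in RULES.get((stage, role), set())

if __name__ == "__main__":
    print(may("author", "edit", "draft"))        # True
    print(may("author", "edit", "cosigned"))     # False: signed content is no longer editable
    print(may("attending", "cosign", "signed"))  # True
```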

  3. Development and Appraisal of Multiple Accounting Record System (Mars).

    PubMed

    Yu, H C; Chen, M C

    2016-01-01

    The aim of the system is to achieve simplification of workflow, reduction of recording time, and increased income for the study hospital. The project team decided to develop a multiple accounting record system that automatically generates account records from the nursing records, reducing the time and effort required for nurses to review the procedure and record material consumption in a separate note. Three configuration files were identified to demonstrate the relationship between treatments and reimbursement items. The workflow was simplified. Nurses saved an average of 10 minutes of daily recording time, and the reimbursement points increased by 7.49%. The project streamlined the workflow and provides the institute with a better approach to financial management.

  4. Family planning and sexual health organizations: management lessons for health system reform.

    PubMed

    Ambegaokar, Maia; Lush, Louisiana

    2004-10-01

    Advocates of health system reform are calling for, among other things, decentralized, autonomous managerial and financial control, use of contracting and incentives, and a greater reliance on market mechanisms in the delivery of health services. The family planning and sexual health (FP&SH) sector already has experience of these. In this paper, we set forth three typical means of service provision within the FP&SH sector since the mid-1900s: independent not-for-profit providers, vertical government programmes and social marketing programmes. In each case, we present the context within which the service delivery mechanism evolved, the management techniques that characterize it and the lessons learned in FP&SH that are applicable to the wider debate about improving health sector management. We conclude that the FP&SH sector can provide both positive and negative lessons in the areas of autonomous management, use of incentives to providers and acceptors, balancing of centralization against decentralization, and employing private sector marketing and distribution techniques for delivering health services. This experience has not been adequately acknowledged in the debates about how to improve the quality and quantity of health services for the poor in developing countries. Health sector reform advocates and FP&SH advocates should collaborate within countries and regions to apply these management lessons. Copyright 2004 Oxford University Press

  5. Managing diverse occupational therapy resources in a creative, corporate model.

    PubMed

    Baptiste, S

    1993-10-01

    Two occupational therapy departments were amalgamated into a corporate whole and charged with the development of a workable, corporate structure. The departmental model which was developed served to enhance the concepts of quality of working life, employee autonomy, management team and quality circle theory. This paper provides a background from business and organizational literature, and outlines the development of the departmental model, in concert with the adoption of the client-centred model of occupational performance as a departmental basis for practice. This development was taking place concurrently with larger, institutional changes towards a decentralized clinical programme management model. The discussion highlights the level of staff satisfaction with the changes, areas of concern during the development of the system, and plans for future growth. During this period of massive and critical change in the delivery of health care services, there has been a trend towards restructuring health care institutions into decentralized models. This paper describes the experience of one occupational therapy department in developing an innovative departmental structure involving participatory management amalgamation. It is believed that this experience provides occupational therapy work units with one viable option for a renewed management model. Staff skill sets can be maximized and optimal potential realized while faced with inevitable resource shrinkage and service reorganization.

  6. Improving Clinical Workflow in Ambulatory Care: Implemented Recommendations in an Innovation Prototype for the Veteran’s Health Administration

    PubMed Central

    Patterson, Emily S.; Lowry, Svetlana Z.; Ramaiah, Mala; Gibbons, Michael C.; Brick, David; Calco, Robert; Matton, Greg; Miller, Anne; Makar, Ellen; Ferrer, Jorge A.

    2015-01-01

    Introduction: Human factors workflow analyses in healthcare settings prior to technology implementation are recommended to improve workflow in ambulatory care settings. In this paper we describe how insights from a workflow analysis conducted by NIST were implemented in a software prototype developed for a Veteran’s Health Administration (VHA) VAi2 innovation project, and the associated lessons learned. Methods: We organize the original recommendations and associated stages and steps visualized in process maps from NIST and the VA’s lessons learned from implementing the recommendations in the VAi2 prototype according to four stages: 1) before the patient visit, 2) during the visit, 3) discharge, and 4) visit documentation. NIST recommendations to improve workflow in ambulatory care (outpatient) settings and process map representations were based on reflective statements collected during one-hour discussions with three physicians. The development of the VAi2 prototype was initially conducted independently of the NIST recommendations, but at a midpoint in the process development, all of the implementation elements were compared with the NIST recommendations and lessons learned were documented. Findings: Story-based displays and templates with default preliminary order sets were used to support scheduling, time-critical notifications, drafting medication orders, and supporting a diagnosis-based workflow. These templates enabled customization to the level of diagnostic uncertainty. Functionality was designed to support cooperative work across interdisciplinary team members, including shared documentation sessions with tracking of text modifications, medication lists, and patient education features. Displays were customized to the role and included access for consultants and site-defined educator teams. Discussion: Workflow, usability, and patient safety can be enhanced through clinician-centered design of electronic health records. The lessons learned from implementing NIST recommendations to improve workflow in ambulatory care using an EHR provide a first step in moving from a billing-centered perspective on how to maintain accurate, comprehensive, and up-to-date information about a group of patients to a clinician-centered perspective. These recommendations point the way towards a “patient visit management system,” which incorporates broader notions of supporting workload management, supporting flexible flow of patients and tasks, enabling accountable distributed work across members of the clinical team, and supporting dynamic tracking of steps in tasks that have longer time distributions. PMID:26290887

  7. Use of contextual inquiry to understand anatomic pathology workflow: Implications for digital pathology adoption

    PubMed Central

    Ho, Jonhan; Aridor, Orly; Parwani, Anil V.

    2012-01-01

    Background: For decades, anatomic pathology (AP) workflow has been a highly manual process based on the use of an optical microscope and glass slides. Recent innovations in scanning and digitizing of entire glass slides are accelerating a move toward widespread adoption and implementation of a workflow based on digital slides and their supporting information management software. To support the design of digital pathology systems and ensure their adoption into pathology practice, the needs of the main users within the AP workflow, the pathologists, should be identified. Contextual inquiry is a qualitative, user-centered, social method designed to identify and understand users’ needs and is utilized for collecting, interpreting, and aggregating detailed aspects of work. Objective: Contextual inquiry was utilized to document current AP workflow, identify processes that may benefit from the introduction of digital pathology systems, and establish design requirements for digital pathology systems that will meet pathologists’ needs. Materials and Methods: Pathologists were observed and interviewed at a large academic medical center according to contextual inquiry guidelines established by Holtzblatt et al. 1998. Notes representing user-provided data were documented during observation sessions. An affinity diagram, a hierarchical organization of the notes based on common themes in the data, was created. Five graphical models were developed to help visualize the data, including sequence, flow, artifact, physical, and cultural models. Results: A total of six pathologists were observed by a team of two researchers. A total of 254 affinity notes were documented and organized using a system based on topical hierarchy, including 75 third-level, 24 second-level, and five main-level categories, including technology, communication, synthesis/preparation, organization, and workflow. Current AP workflow was labor intensive and lacked scalability. A large number of processes that may possibly improve following the introduction of digital pathology systems were identified. These work processes included case management, case examination and review, and final case reporting. Furthermore, a digital slide system should integrate with the anatomic pathology laboratory information system. Conclusions: To our knowledge, this is the first study that utilized the contextual inquiry method to document AP workflow. Findings were used to establish key requirements for the design of digital pathology systems. PMID:23243553

  8. An experimental paradigm for team decision processes

    NASA Technical Reports Server (NTRS)

    Serfaty, D.; Kleinman, D. L.

    1986-01-01

    The study of distributed information processing and decision making is presently hampered by two factors: (1) the inherent complexity of the mathematical formulation of decentralized problems has prevented the development of models that could be used to predict performance in a distributed environment; and (2) the lack of comprehensive scientific empirical data on human team decision making has hindered the development of significant descriptive models. As part of a comprehensive effort to find a new framework for multihuman decision making problems, a novel experimental research paradigm was developed involving human teams in decision making tasks. Attempts to construct parts of an integrated model with ideas from queueing networks, team theory, distributed estimation and decentralized resource management are described.

  9. A novel test method to determine the filter material service life of decentralized systems treating runoff from traffic areas.

    PubMed

    Huber, Maximilian; Welker, Antje; Dierschke, Martina; Drewes, Jörg E; Helmreich, Brigitte

    2016-09-01

    In recent years, there has been a significant increase in the development and application of technical decentralized filter systems for the treatment of runoff from traffic areas. However, there are still many uncertainties regarding the service life and the performance of filter materials that are employed in decentralized treatment systems. These filter media are designed to prevent the transport of pollutants into the environment. A novel pilot-scale test method was developed to determine - within a few days - the service lives and long-term removal efficiencies for dissolved heavy metals in stormwater treatment systems. The proposed method consists of several steps including preloading the filter media in a pilot-scale model with copper and zinc by a load of n-1 years of the estimated service life (n). Subsequently, three representative rain events are simulated to evaluate the long-term performance by dissolved copper and zinc during the last year of application. The presented results, which verified the applicability of this method, were obtained for three filter channel systems and six filter shaft systems. The performance of the evaluated systems varied largely for both tested heavy metals and during all three simulated rain events. A validation of the pilot-scale assessment method with field measurements was also performed for two systems. Findings of this study suggest that this novel method does provide a standardized and accurate estimation of service intervals of decentralized treatment systems employing various filter materials. The method also provides regulatory authorities, designers, and operators with an objective basis for performance assessment and supports stormwater managers to make decisions for the installation of such decentralized treatment systems. Copyright © 2016 Elsevier Ltd. All rights reserved.

  10. Life-Cycle Cost and Environmental Assessment of Decentralized Nitrogen Recovery Using Ion Exchange from Source-Separated Urine through Spatial Modeling.

    PubMed

    Kavvada, Olga; Tarpeh, William A; Horvath, Arpad; Nelson, Kara L

    2017-11-07

    Nitrogen standards for discharge of wastewater effluent into aquatic bodies are becoming more stringent, requiring some treatment plants to reduce effluent nitrogen concentrations. This study aimed to assess, from a life-cycle perspective, an innovative decentralized approach to nitrogen recovery: ion exchange of source-separated urine. We modeled an approach in which nitrogen from urine at individual buildings is sorbed onto resins, then transported by truck to regeneration and fertilizer production facilities. To provide insight into impacts from transportation, we enhanced the traditional economic and environmental assessment approach by combining spatial analysis, system-scale evaluation, and detailed last-mile logistics modeling using the city of San Francisco as an illustrative case study. The major contributor to energy intensity and greenhouse gas (GHG) emissions was the production of sulfuric acid to regenerate resins, rather than transportation. Energy and GHG emissions were not significantly sensitive to the number of regeneration facilities. Cost, however, increased with decentralization as rental costs per unit area are higher for smaller areas. The metrics assessed (unit energy, GHG emissions, and cost) were not significantly influenced by facility location in this high-density urban area. We determined that this decentralized approach has lower cost, unit energy, and GHG emissions than centralized nitrogen management via nitrification-denitrification if fertilizer production offsets are taken into account.

  11. Experimenting with Decentralization: The Politics of Change.

    ERIC Educational Resources Information Center

    Wohlstetter, Priscilla

    The relationship between the political context of school districts and their choices of decentralization policy is explored in this paper. It was expected that district politics would affect decentralization policies in two ways: the form of decentralization adopted and the degree of change. The decision to decentralize in three large urban school…

  12. Workflow management in large distributed systems

    NASA Astrophysics Data System (ADS)

    Legrand, I.; Newman, H.; Voicu, R.; Dobre, C.; Grigoras, C.

    2011-12-01

    The MonALISA (Monitoring Agents using a Large Integrated Services Architecture) framework provides a distributed service system capable of controlling and optimizing large-scale, data-intensive applications. An essential part of managing large-scale, distributed data-processing facilities is a monitoring system for computing facilities, storage, networks, and the very large number of applications running on these systems in near realtime. All this monitoring information gathered for all the subsystems is essential for developing the required higher-level services—the components that provide decision support and some degree of automated decisions—and for maintaining and optimizing workflow in large-scale distributed systems. These management and global optimization functions are performed by higher-level agent-based services. We present several applications of MonALISA's higher-level services including optimized dynamic routing, control, data-transfer scheduling, distributed job scheduling, dynamic allocation of storage resource to running jobs and automated management of remote services among a large set of grid facilities.

  13. 78 FR 20087 - Privacy Act of 1974; Proposed New System of Records

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-04-03

    ... is comprised of two components--Enterprise Content Management (ECM) and the Account Management System (AMS). The heart of the system is the ECM component, which manages the workflows that were developed..., digital media, and/or CD-ROM. PAS is a customized module within USDA's Enterprise Content Management (ECM...

  14. Inventory-based landscape-scale simulation of management effectiveness and economic feasibility with BioSum

    Treesearch

    Jeremy S. Fried; Larry D. Potts; Sara M. Loreno; Glenn A. Christensen; R. Jamie Barbour

    2017-01-01

    The Forest Inventory and Analysis (FIA)-based BioSum (Bioregional Inventory Originated Simulation Under Management) is a free policy analysis framework and workflow management software solution. It addresses complex management questions concerning forest health and vulnerability for large, multimillion acre, multiowner landscapes using FIA plot data as the initial...

  15. Organizational Structure at the Crossroads.

    ERIC Educational Resources Information Center

    Person, Ruth

    1994-01-01

    Because colleges and universities have adopted new information technology idiosyncratically, formal structures to manage and govern its use have not evolved at the same pace. In creating such structures, issues to be considered include centralization vs. decentralization, attitudes toward change, institutional diversity, entrepreneurial spirit,…

  16. Implementation of retrofit BMPs in a suburban watershed via economic incentives

    EPA Science Inventory

    Urban stormwater is typically conveyed to centralized infrastructure, and there is great potential for reducing stormwater runoff quantity through decentralization. In this case we hypothesize that smaller-scale retrofit best management practices (BMPs) such as rain gardens and r...

  17. Measuring Charter School Efficiency: An Early Appraisal

    ERIC Educational Resources Information Center

    Carpenter, Dick M., II; Noller, Scott L.

    2010-01-01

    In an era of increased accountability and challenging times for public finance, charter schools built on decentralization, grassroots accountability, and market forces may provide, in the spirit of "educational laboratories," lessons for increasing student achievement more efficiently through diverse and innovative management,…

  18. Towards a Scalable and Adaptive Application Support Platform for Large-Scale Distributed E-Sciences in High-Performance Network Environments

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wu, Chase Qishi; Zhu, Michelle Mengxia

    The advent of large-scale collaborative scientific applications has demonstrated the potential for broad scientific communities to pool globally distributed resources to produce unprecedented data acquisition, movement, and analysis. System resources including supercomputers, data repositories, computing facilities, network infrastructures, storage systems, and display devices have been increasingly deployed at national laboratories and academic institutes. These resources are typically shared by large communities of users over Internet or dedicated networks and hence exhibit an inherent dynamic nature in their availability, accessibility, capacity, and stability. Scientific applications using either experimental facilities or computation-based simulations with various physical, chemical, climatic, and biological models feature diverse scientific workflows as simple as linear pipelines or as complex as directed acyclic graphs, which must be executed and supported over wide-area networks with massively distributed resources. Application users oftentimes need to manually configure their computing tasks over networks in an ad hoc manner, hence significantly limiting the productivity of scientists and constraining the utilization of resources. The success of these large-scale distributed applications requires a highly adaptive and massively scalable workflow platform that provides automated and optimized computing and networking services. This project is to design and develop a generic Scientific Workflow Automation and Management Platform (SWAMP), which contains a web-based user interface specially tailored for a target application, a set of user libraries, and several easy-to-use computing and networking toolkits for application scientists to conveniently assemble, execute, monitor, and control complex computing workflows in heterogeneous high-performance network environments. SWAMP will enable the automation and management of the entire process of scientific workflows with the convenience of a few mouse clicks while hiding the implementation and technical details from end users. Particularly, we will consider two types of applications with distinct performance requirements: data-centric and service-centric applications. For data-centric applications, the main workflow task involves large-volume data generation, catalog, storage, and movement typically from supercomputers or experimental facilities to a team of geographically distributed users; while for service-centric applications, the main focus of workflow is on data archiving, preprocessing, filtering, synthesis, visualization, and other application-specific analysis. We will conduct a comprehensive comparison of existing workflow systems and choose the best suited one with open-source code, a flexible system structure, and a large user base as the starting point for our development. Based on the chosen system, we will develop and integrate new components including a black box design of computing modules, performance monitoring and prediction, and workflow optimization and reconfiguration, which are missing from existing workflow systems. A modular design for separating specification, execution, and monitoring aspects will be adopted to establish a common generic infrastructure suited for a wide spectrum of science applications.
We will further design and develop efficient workflow mapping and scheduling algorithms to optimize the workflow performance in terms of minimum end-to-end delay, maximum frame rate, and highest reliability. We will develop and demonstrate the SWAMP system in a local environment, the grid network, and the 100 Gbps Advanced Network Initiative (ANI) testbed. The demonstration will target scientific applications in climate modeling and high energy physics, and the functions to be demonstrated include workflow deployment, execution, steering, and reconfiguration. Throughout the project period, we will work closely with the science communities in the fields of climate modeling and high energy physics including Spallation Neutron Source (SNS) and Large Hadron Collider (LHC) projects to mature the system for production use.
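
    As a small illustration of workflow mapping for minimum end-to-end delay, the sketch below assigns the modules of a linear pipeline to resources by dynamic programming over per-module compute times and inter-resource transfer times. It is a generic textbook-style example with made-up numbers, not SWAMP's mapping or scheduling algorithm.

```python
# Illustrative sketch only (not SWAMP's algorithms): mapping a linear pipeline
# of modules onto networked resources to minimize end-to-end delay, taken as
# the sum of per-module compute times plus inter-resource transfer times, via
# a small dynamic program over (module, resource) states. All numbers are invented.

def map_pipeline(compute, transfer):
    """
    compute[m][r]: time of module m on resource r.
    transfer[a][b]: time to move a module's output from resource a to resource b.
    Returns (best end-to-end delay, resource assignment per module).
    """
    n_modules, n_resources = len(compute), len(compute[0])
    best = [compute[0][r] for r in range(n_resources)]
    back = [[None] * n_resources]
    for m in range(1, n_modules):
        cur, choice = [], []
        for r in range(n_resources):
            costs = [best[p] + transfer[p][r] for p in range(n_resources)]
            p = min(range(n_resources), key=costs.__getitem__)
            cur.append(costs[p] + compute[m][r])
            choice.append(p)
        best, back = cur, back + [choice]
    r = min(range(n_resources), key=best.__getitem__)
    assignment = [r]
    for m in range(n_modules - 1, 0, -1):
        r = back[m][r]
        assignment.append(r)
    return min(best), assignment[::-1]

if __name__ == "__main__":
    compute = [[4, 2], [3, 6], [1, 1]]          # 3 modules, 2 resources
    transfer = [[0, 2], [2, 0]]                  # symmetric link delays
    print(map_pipeline(compute, transfer))       # (8, [0, 0, 0])
```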

  19. Risk management frameworks: supporting the next generation of Murray-Darling Basin water sharing plans

    NASA Astrophysics Data System (ADS)

    Podger, G. M.; Cuddy, S. M.; Peeters, L.; Smith, T.; Bark, R. H.; Black, D. C.; Wallbrink, P.

    2014-09-01

    Water jurisdictions in Australia are required to prepare and implement water resource plans. In developing these plans the common goal is realising the best possible use of the water resources - maximising outcomes while minimising negative impacts. This requires managing the risks associated with assessing and balancing cultural, industrial, agricultural, social and environmental demands for water within a competitive and resource-limited environment. Recognising this, conformance to international risk management principles (ISO 31000:2009) has been embedded within the Murray-Darling Basin Plan. Yet, to date, there has been little strategic investment by water jurisdictions in bridging the gap between principle and practice. The ISO 31000 principles and the risk management framework that embodies them align well with an adaptive management paradigm within which to conduct water resource planning. They also provide an integrative framework for the development of workflows that link risk analysis with risk evaluation and mitigation (adaptation) scenarios, providing a transparent, repeatable and robust platform. This study, through a demonstration use case and a series of workflows, shows policy makers how these principles can be used to support the development of the next generation of water sharing plans in 2019. The workflows consider the effects of uncertainty in climate and flow inputs and in model parameters on irrigation and hydropower production, on meeting environmental flow objectives and on recreational use of the water resource. The results provide insights into the risks associated with meeting a range of different objectives.

  20. Fuzzy Adaptive Decentralized Optimal Control for Strict Feedback Nonlinear Large-Scale Systems.

    PubMed

    Sun, Kangkang; Sui, Shuai; Tong, Shaocheng

    2018-04-01

    This paper considers the optimal decentralized fuzzy adaptive control design problem for a class of interconnected large-scale nonlinear systems in strict feedback form and with unknown nonlinear functions. Fuzzy logic systems are introduced to learn the unknown dynamics and cost functions, and a state estimator is developed. By applying the state estimator and the backstepping recursive design algorithm, a decentralized feedforward controller is established. By using the backstepping decentralized feedforward control scheme, the considered interconnected large-scale nonlinear system in strict feedback form is changed into an equivalent affine large-scale nonlinear system. Subsequently, an optimal decentralized fuzzy adaptive control scheme is constructed. The whole optimal decentralized fuzzy adaptive controller is composed of a decentralized feedforward control and an optimal decentralized control. It is proved that the developed optimal decentralized controller can ensure that all the variables of the control system are uniformly ultimately bounded and that the cost functions are minimized. Two simulation examples are provided to illustrate the validity of the developed optimal decentralized fuzzy adaptive control scheme.

  1. Using R in Taverna: RShell v1.2

    PubMed Central

    Wassink, Ingo; Rauwerda, Han; Neerincx, Pieter BT; Vet, Paul E van der; Breit, Timo M; Leunissen, Jack AM; Nijholt, Anton

    2009-01-01

    Background: R is the statistical language commonly used by many life scientists in (omics) data analysis. At the same time, these complex analyses benefit from a workflow approach, such as that used by the open source workflow management system Taverna. However, Taverna had limited support for R, because it supported just a few data types and only a single output. Also, there was no support for graphical output and persistent sessions. Altogether this made using R in Taverna impractical. Findings: We have developed an R plugin for Taverna: RShell, which provides R functionality within workflows designed in Taverna. In order to fully support the R language, our RShell plugin directly uses the R interpreter. The RShell plugin consists of a Taverna processor for R scripts and an RShell Session Manager that communicates with the R server. We made the RShell processor highly configurable, allowing the user to define multiple inputs and outputs. Also, various data types are supported, such as strings, numeric data and images. To limit data transport between multiple RShell processors, the RShell plugin also supports persistent sessions. Here, we will describe the architecture of RShell and the new features that are introduced in version 1.2, i.e.: i) Support for R up to and including R version 2.9; ii) Support for persistent sessions to limit data transfer; iii) Support for vector graphics output through PDF; iv) Syntax highlighting of the R code; v) Improved usability through fewer port types. Our new RShell processor is backwards compatible with workflows that use older versions of the RShell processor. We demonstrate the value of the RShell processor by a use-case workflow that maps oligonucleotide probes designed with DNA sequence information from Vega onto the Ensembl genome assembly. Conclusion: Our RShell plugin enables Taverna users to employ R scripts within their workflows in a highly configurable way. PMID:19607662
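
    RShell embeds the R interpreter inside Taverna processors with multiple inputs and outputs. Outside Taverna, the same multi-input/multi-output pattern can be sketched, for orientation only, by invoking an R script through Rscript; the script contents, the argument layout, and the assumption that Rscript is on the PATH are all illustrative and are not RShell's actual mechanism.

        import os, subprocess, tempfile

        def run_r(script_path, inputs):
            """Run an R script with Rscript, passing inputs as trailing arguments
            and treating each line the script prints as one output value."""
            cmd = ["Rscript", script_path] + [str(v) for v in inputs]
            result = subprocess.run(cmd, capture_output=True, text=True, check=True)
            return result.stdout.strip().splitlines()

        # Hypothetical two-output R script, written to a temporary file for the sketch.
        r_code = (
            'args <- as.numeric(commandArgs(trailingOnly = TRUE))\n'
            'cat(mean(args), "\\n")\n'
            'cat(sd(args), "\\n")\n'
        )
        with tempfile.NamedTemporaryFile("w", suffix=".R", delete=False) as f:
            f.write(r_code)
            script = f.name
        try:
            mean_out, sd_out = run_r(script, [1, 2, 3, 4])   # two inputs in, two outputs back
            print("mean:", mean_out, "sd:", sd_out)
        finally:
            os.remove(script)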

  2. Workflow in clinical trial sites & its association with near miss events for data quality: ethnographic, workflow & systems simulation.

    PubMed

    de Carvalho, Elias Cesar Araujo; Batilana, Adelia Portero; Claudino, Wederson; Reis, Luiz Fernando Lima; Schmerling, Rafael A; Shah, Jatin; Pietrobon, Ricardo

    2012-01-01

    With the exponential expansion of clinical trials conducted in BRIC (Brazil, Russia, India, and China) and VISTA (Vietnam, Indonesia, South Africa, Turkey, and Argentina) countries, corresponding gains in cost and enrolment efficiency quickly outpace the consonant metrics in traditional countries in North America and the European Union. However, questions still remain regarding the quality of data being collected in these countries. We used ethnographic, mapping and computer simulation studies to identify/address areas of threat to near miss events for data quality in two cancer trial sites in Brazil. Two sites in Sao Paulo and Rio de Janeiro were evaluated using ethnographic observations of workflow during subject enrolment and data collection. Emerging themes related to threats to near miss events for data quality were derived from observations. They were then transformed into workflows using UML-AD and modeled using System Dynamics. 139 tasks were observed and mapped through the ethnographic study. The UML-AD detected four major activities in the workflow: evaluation of potential research subjects prior to signature of informed consent, visits to obtain the subject's informed consent, regular data collection sessions following the study protocol and closure of the study protocol for a given project. Field observations pointed to three major emerging themes: (a) lack of a standardized process for data registration at the source document, (b) multiplicity of data repositories and (c) scarcity of decision support systems at the point of research intervention. Simulation with the policy model demonstrates a reduction of the rework problem. Patterns of threats to data quality at the two sites were similar to the threats reported in the literature for American sites. Clinical trial site managers need to reorganize staff workflow by using information technology more efficiently, establish new standard procedures and manage professionals to reduce near miss events and save time/cost. Clinical trial sponsors should improve relevant support systems.

  3. Workflow in Clinical Trial Sites & Its Association with Near Miss Events for Data Quality: Ethnographic, Workflow & Systems Simulation

    PubMed Central

    Araujo de Carvalho, Elias Cesar; Batilana, Adelia Portero; Claudino, Wederson; Lima Reis, Luiz Fernando; Schmerling, Rafael A.; Shah, Jatin; Pietrobon, Ricardo

    2012-01-01

    Background: With the exponential expansion of clinical trials conducted in BRIC (Brazil, Russia, India, and China) and VISTA (Vietnam, Indonesia, South Africa, Turkey, and Argentina) countries, corresponding gains in cost and enrolment efficiency quickly outpace the consonant metrics in traditional countries in North America and the European Union. However, questions still remain regarding the quality of data being collected in these countries. We used ethnographic, mapping and computer simulation studies to identify/address areas of threat to near miss events for data quality in two cancer trial sites in Brazil. Methodology/Principal Findings: Two sites in Sao Paulo and Rio de Janeiro were evaluated using ethnographic observations of workflow during subject enrolment and data collection. Emerging themes related to threats to near miss events for data quality were derived from observations. They were then transformed into workflows using UML-AD and modeled using System Dynamics. 139 tasks were observed and mapped through the ethnographic study. The UML-AD detected four major activities in the workflow: evaluation of potential research subjects prior to signature of informed consent, visits to obtain the subject's informed consent, regular data collection sessions following the study protocol and closure of the study protocol for a given project. Field observations pointed to three major emerging themes: (a) lack of a standardized process for data registration at the source document, (b) multiplicity of data repositories and (c) scarcity of decision support systems at the point of research intervention. Simulation with the policy model demonstrates a reduction of the rework problem. Conclusions/Significance: Patterns of threats to data quality at the two sites were similar to the threats reported in the literature for American sites. Clinical trial site managers need to reorganize staff workflow by using information technology more efficiently, establish new standard procedures and manage professionals to reduce near miss events and save time/cost. Clinical trial sponsors should improve relevant support systems. PMID:22768105

  4. Implementation of Epic Beaker Anatomic Pathology at an Academic Medical Center.

    PubMed

    Blau, John Larry; Wilford, Joseph D; Dane, Susan K; Karandikar, Nitin J; Fuller, Emily S; Jacobsmeier, Debbie J; Jans, Melissa A; Horning, Elisabeth A; Krasowski, Matthew D; Ford, Bradley A; Becker, Kent R; Beranek, Jeanine M; Robinson, Robert A

    2017-01-01

    Beaker is a relatively new laboratory information system (LIS) offered by Epic Systems Corporation as part of its suite of health-care software and bundled with its electronic medical record, EpicCare. It is divided into two modules, Beaker anatomic pathology (Beaker AP) and Beaker Clinical Pathology. In this report, we describe our experience implementing Beaker AP version 2014 at an academic medical center with a go-live date of October 2015. This report covers preimplementation preparations and challenges beginning in September 2014, issues discovered soon after go-live in October 2015, and some post go-live optimizations using data from meetings, debriefings, and the project closure document. We share specific issues that we encountered during implementation, including difficulties with the proposed frozen section workflow, developing a shared specimen source dictionary, and implementation of the standard Beaker workflow in a large institution with trainees. We share specific strategies that we used to overcome these issues for a successful Beaker AP implementation. Several areas of the laboratory required adaptation of the default Beaker build parameters to meet the needs of the workflow in a busy academic medical center. In a few areas, our laboratory was unable to use the Beaker functionality to support our workflow, and we have continued to use paper or have altered our workflow. In spite of several difficulties that required creative solutions before go-live, the implementation has been successful based on satisfaction surveys completed by pathologists and others who use the software. However, optimization of Beaker workflows has remained an ongoing process from go-live to the present time. The Beaker AP LIS can be successfully implemented at an academic medical center but requires significant forethought, creative adaptation, and continued shared management of the ongoing product by institutional and departmental information technology staff as well as laboratory managers to meet the needs of the laboratory.

  5. Explaining and Influencing Chinese Arms Transfers

    DTIC Science & Technology

    1995-02-01

    this volume: Jonathan W. Pierce [] Secretary: Laura Hall [] Circulation Manager: Myma Morgan INSS publishes McNair Papers to provoke thought and inform...their affiliated factories were directed to decentralize their decision-making processes, grant more autonomy to managers, use excess capacity to...intimacy of ties between Moscow and 10 EXPLAINING AND INFLUENCING CHINESE ARMS TRANSFERS New Delhi, and concern that the United States was still in retreat

  6. Voice, Collaboration and School Culture: Creating a Community for School Improvement. Evaluation of the Pioneer SCBM Schools, Hawaii's School/Community-Based Management Initiative.

    ERIC Educational Resources Information Center

    Izu, Jo Ann; And Others

    Hawaii's School/Community-Based Management Initiative (SCBM), which was enacted into law in 1989, is part of a national trend toward decentralizing decision making and increasing school autonomy that arose during the 1980s. A voluntary program, SCBM offers schools flexibility, autonomy, and a small amount of resources in exchange for…

  7. New Water Management Institutions in Mexico’s ‘New Culture of Water’: Emerging Opportunities and Challenges for Effective Use of Climate Knowledge and Climate Science

    NASA Astrophysics Data System (ADS)

    Wilder, M.; Varady, R. G.; Pineda Pablos, N.; Browning-Aiken, A.; Diaz Caravantes, R.; Garfin, G.

    2007-05-01

    Since 1992, Mexico has developed a new set of water management institutions to usher in a ‘new culture of water’ that focuses on decentralized governance and formalized participation of local water users. Reforms to the national water legislation in April 2004 regionalized the governance of water and highlighted the importance of river basin councils as a mechanism for integrated management of major watersheds across Mexico. As a result of the dramatic national water policy reforms, water service delivery in Mexico has been decentralized to the state and municipal level, resulting in a critical new role for municipal governments charged with this important function. A network of river basin councils and accompanying sub-basin councils has been developed to undertake watershed planning. Decentralization and local participation policies embody numerous significant goals and promises, including greater efficiency, more financial accountability, fostering the beginnings of a sense of local stewardship of precious resources, and enhanced environmental sustainability. This paper examines the implications of municipalized water services and emerging river basin councils for utilization of climate knowledge and climate science. We analyze whether these changes open new windows of opportunity for meaningful use of climate science (e.g., forecasts, models). How effectively are municipal water managers and river basin councils utilizing climate knowledge and climate science, and for what purposes? Are there ways to improve the fit between the needs of water managers and river basin councils and the science that is currently available? What is the role of local participation in water policy making in urban settings and river basin councils? The study found overall that the promises and potential for effective utilization of climate science/knowledge to enhance sustainability exist, but are not yet being adequately realized. Binational efforts to develop climate science and information-sharing mechanisms across the Sonora/Arizona border and to work with local communities and stakeholders to improve the fit between science and social stakeholders’ needs should help realize the potential offered by Mexico’s emerging water management institutions and enhance sustainable policy making.

  8. Reorganization of secondary medical care in the Israeli Defense Forces Medical Corps: A cost-effect analysis.

    PubMed

    Yagil, Yael; Arnon, Ronen; Ezra, Vered; Ashkenazi, Isaac

    2006-12-01

    To increase accessibility and availability of secondary medical care, 10 secondary unit specialist clinics were established side-by-side with five existing regional specialist centers, thus achieving decentralization. The purpose was to analyze the impact of this reorganization on overall consumption of secondary medical care and expenditures. Consumption of secondary medical care was analyzed by using computerized clinic and Medical Corps databases. Functional efficiency and budgetary expenditures were evaluated in four representative unit specialist clinics. The reorganization resulted in an 8% increase in total secondary care consumption over 2.5 years. The establishment of unit specialist clinics did not achieve increased accessibility or availability for military personnel. Functional analysis of representative unit specialist clinics showed diversity in efficiency, differences in physicians' performance, and excess expenditures. The decentralizing reorganization of secondary medical care generated an increase in medical care consumption, possibly because of supply-induced demand. The uniform inefficiency of the unit specialist clinics might have been related to incorrect planning and management. The decentralization of secondary medical care within the Israeli Defense Forces has not proved to be cost-efficient.

  9. Design of a decentralized reusable research database architecture to support data acquisition in large research projects.

    PubMed

    Iavindrasana, Jimison; Depeursinge, Adrien; Ruch, Patrick; Spahni, Stéphane; Geissbuhler, Antoine; Müller, Henning

    2007-01-01

    The diagnostic and therapeutic processes, as well as the development of new treatments, are hindered by the fragmentation of the information that underlies them. In a multi-institutional research study database, the clinical information system (CIS) contains the primary data input. A substantial share of the budget of large-scale clinical studies is often spent on data creation and maintenance. The objective of this work is to design a decentralized, scalable, reusable database architecture with lower maintenance costs for managing and integrating the distributed heterogeneous data required as the basis for a large-scale research project. Technical and legal aspects are taken into account based on various use case scenarios. The architecture contains four layers: data storage and access decentralized at their production source, a connector acting as a proxy between the CIS and the external world, an information mediator as a data access point, and the client side. The proposed design will be implemented inside six clinical centers participating in the @neurIST project as part of a larger system on data integration and reuse for aneurysm treatment.

  10. Towards seamless workflows in agile data science

    NASA Astrophysics Data System (ADS)

    Klump, J. F.; Robertson, J.

    2017-12-01

    Agile workflows are a response to projects with requirements that may change over time. They prioritise rapid and flexible responses to change, preferring to adapt to changes in requirements rather than predict them before a project starts. This suits the needs of research very well because research is inherently agile in its methodology. The adoption of agile methods has made collaborative data analysis much easier in a research environment fragmented across institutional data stores, HPC, personal and lab computers and more recently cloud environments. Agile workflows use tools that share a common worldview: in an agile environment, there may be more than one valid version of data, code or environment in play at any given time. All of these versions need references and identifiers. For example, a team of developers following the git-flow conventions (github.com/nvie/gitflow) may have several active branches, one for each strand of development. These workflows allow rapid and parallel iteration while maintaining identifiers pointing to individual snapshots of data and code and allowing rapid switching between strands. In contrast, versioning in research data management is currently geared towards managing data for reproducibility and long-term preservation of the record of science. While both are important goals in the persistent curation domain of the institutional research data infrastructure, current tools emphasise planning over adaptation and can introduce unwanted rigidity by insisting on a single valid version or point of truth. In the collaborative curation domain of a research project, things are more fluid. However, there is no equivalent to the "versioning iso-surface" of the git protocol for the management and versioning of research data. At CSIRO we are developing concepts and tools for the agile management of software code and research data for virtual research environments, based on our experiences of actual data analytics projects in the geosciences. We use code management that allows researchers to interact with the code through tools like Jupyter Notebooks while data are held in an object store. Our aim is an architecture allowing seamless integration of code development, data management, and data processing in virtual research environments.
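
    As a toy illustration of the "more than one valid version at a time" idea (not the CSIRO tooling), the sketch below names every data snapshot by a content hash, loosely mirroring how git identifies commits by the hash of their content; the object store here is just an in-memory dictionary standing in for a real one.

        import hashlib, json

        store = {}  # identifier -> snapshot; an in-memory stand-in for an object store

        def snapshot(data: dict) -> str:
            """Store a snapshot and return a short content-derived identifier."""
            blob = json.dumps(data, sort_keys=True).encode()
            ident = hashlib.sha256(blob).hexdigest()[:12]
            store[ident] = data
            return ident

        v1 = snapshot({"samples": [1.0, 2.0], "units": "ppm"})
        v2 = snapshot({"samples": [1.0, 2.0, 2.5], "units": "ppm"})
        print(v1, v2)        # two valid versions coexist, each with its own identifier
        print(store[v1])     # switching between strands is just a lookup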

  11. Barriers to effective, safe communication and workflow between nurses and non-consultant hospital doctors during out-of-hours.

    PubMed

    Brady, Anne-Marie; Byrne, Gobnait; Quirke, Mary Brigid; Lynch, Aine; Ennis, Shauna; Bhangu, Jaspreet; Prendergast, Meabh

    2017-11-01

    This study aimed to evaluate the nature and type of communication and workflow arrangements between nurses and doctors out-of-hours (OOH). Effective communication and workflow arrangements between nurses and doctors are essential to minimize risk in hospital settings, particularly in the out-of-hours period. Timely patient flow is a priority for all healthcare organizations and the quality of communication and workflow arrangements influences patient safety. Qualitative descriptive design and data collection methods included focus groups and individual interviews. A 500-bed tertiary referral acute hospital in Ireland. Junior and senior Non-Consultant Hospital Doctors, staff nurses and nurse managers. Both nurses and doctors acknowledged the importance of good interdisciplinary communication and collaborative working in sustaining effective workflow and enabling a supportive working environment and patient safety. Indeed, issues of safety and missed care OOH were found to be primarily due to difficulties of communication and workflow. Medical workflow OOH is often dependent on cues and communication to/from nursing. However, communication systems, and in particular the bleep system, considered central to the process of communication between doctors and nurses OOH, can contribute to workflow challenges and increased staff stress. It was reported as commonplace for routine work that should be completed during normal hours to fall into the OOH period when resources were most limited, further compounding risk to patient safety. Enhancement of communication strategies between nurses and doctors has the potential to remove barriers to effective decision-making and patient flow. © The Author 2017. Published by Oxford University Press in association with the International Society for Quality in Health Care. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com

  12. Widening the adoption of workflows to include human and human-machine scientific processes

    NASA Astrophysics Data System (ADS)

    Salayandia, L.; Pinheiro da Silva, P.; Gates, A. Q.

    2010-12-01

    Scientific workflows capture knowledge in the form of technical recipes to access and manipulate data that help scientists manage and reuse established expertise to conduct their work. Libraries of scientific workflows are being created in particular fields, e.g., Bioinformatics, which, combined with cyber-infrastructure environments that provide on-demand access to data and tools, result in powerful workbenches for scientists in those communities. The focus in these particular fields, however, has been more on automating rather than documenting scientific processes. As a result, technical barriers have impeded a wider adoption of scientific workflows by scientific communities that do not rely as heavily on cyber-infrastructure and computing environments. Semantic Abstract Workflows (SAWs) are introduced to widen the applicability of workflows as a tool to document scientific recipes or processes. SAWs are intended to capture a scientist’s perspective on how she or he would collect, filter, curate, and manipulate data to create the artifacts that are relevant to her/his work. In contrast, scientific workflows describe the process from the point of view of how technical methods and tools are used to conduct the work. By focusing on a higher level of abstraction that is closer to a scientist's understanding, SAWs effectively capture the controlled vocabularies that reflect a particular scientific community, as well as the types of datasets and methods used in a particular domain. From there on, SAWs provide the flexibility to adapt to different environments to carry out the recipes or processes. These environments range from manual fieldwork to highly technical cyber-infrastructure environments, such as those already supported by scientific workflows. Two cases, one from Environmental Science and another from Geophysics, are presented as illustrative examples.

  13. Process improvement for the safe delivery of multidisciplinary-executed treatments-A case in Y-90 microspheres therapy.

    PubMed

    Cai, Bin; Altman, Michael B; Garcia-Ramirez, Jose; LaBrash, Jason; Goddu, S Murty; Mutic, Sasa; Parikh, Parag J; Olsen, Jeffrey R; Saad, Nael; Zoberi, Jacqueline E

    To develop a safe and robust workflow for yttrium-90 (Y-90) radioembolization procedures in a multidisciplinary team environment. A generalized Define-Measure-Analyze-Improve-Control (DMAIC)-based approach to process improvement was applied to a Y-90 radioembolization workflow. In the first DMAIC cycle, events with the Y-90 workflow were defined and analyzed. To improve the workflow, a web-based interactive electronic white board (EWB) system was adopted as the central communication platform and information processing hub. The EWB-based Y-90 workflow then underwent a second DMAIC cycle. Out of 245 treatments, three misses that went undetected until treatment initiation were recorded over a period of 21 months, and root-cause analysis was performed to determine the causes of each incident and opportunities for improvement. The EWB-based Y-90 process was further improved via new rules to define reliable sources of information as inputs into the planning process, as well as new checkpoints to ensure this information was communicated correctly throughout the process flow. After implementation of the revised EWB-based Y-90 workflow, after two DMAIC-like cycles, there were zero misses out of 153 patient treatments in 1 year. The DMAIC-based approach adopted here allowed the iterative development of a robust workflow to achieve an adaptable, event-minimizing planning process in a complex setting that requires the participation of multiple teams for Y-90 microspheres therapy. Implementation of such a workflow using the EWB or a similar platform with a DMAIC-based process improvement approach could be expanded to other treatment procedures, especially those requiring multidisciplinary management. Copyright © 2016 American Brachytherapy Society. Published by Elsevier Inc. All rights reserved.

  14. The Kiel data management infrastructure - arising from a generic data model

    NASA Astrophysics Data System (ADS)

    Fleischer, D.; Mehrtens, H.; Schirnick, C.; Springer, P.

    2010-12-01

    The Kiel Data Management Infrastructure (KDMI) started from a cooperation of three large-scale projects (SFB574, SFB754 and Cluster of Excellence The Future Ocean) and the Leibniz Institute of Marine Sciences (IFM-GEOMAR). The common strategy for project data management is a single person collecting and transforming data according to the requirements of the targeted data center(s). The intention of the KDMI cooperation is to avoid redundant and potentially incompatible data management efforts for scientists and data managers and to create a single sustainable infrastructure. An increased level of complexity in the conceptual planning arose from the diversity of marine disciplines and the approximately 1000 scientists involved. KDMI key features focus on data provenance, which we consider to comprise the entire workflow from field sampling through lab work to data calculation and evaluation. Managing the data of each individual project participant in this way yields the data management for the entire project and ensures the reusability of (meta)data. Accordingly, scientists provide a workflow definition of their data creation procedures resulting in their target variables. The central idea in the development of the KDMI presented here is based on the object-oriented programming concept, which allows one object definition (workflow) and an unlimited number of object instances (data). Each definition is created by a graphical user interface and produces XML output stored in a database using a generic data model. On creation of a data instance the KDMI translates the definition into web forms for the scientist; the generic data model then accepts all information input following the given data provenance definition. An important aspect of the implementation phase is the possibility of a successive transition from daily measurement routines resulting in single spreadsheet files, with well-known points of failure and limited reusability, to a central infrastructure as a single point of truth. The data provenance approach has the following positive side effects: (1) the scientist designs the extent and timing of data and metadata prompts via workflow definitions himself, while (2) consistency and completeness (mandatory information) of metadata in the resulting XML document can be checked by XML validation. (3) Storage of the entire data creation process (including raw data and processing steps) provides a multidimensional quality history accessible by all researchers, in addition to the commonly applied one-dimensional quality flag system. (4) The KDMI can be extended to other scientific disciplines by adding new workflows and domain-specific outputs assisted by the KDMI-Team. The KDMI is a social-network-inspired system, but instead of sharing private lives it is a sharing platform for daily scientific work, data and their provenance.
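
    As a loose, hypothetical sketch of the "one workflow definition, many data instances" idea described above (not the KDMI code), the snippet below keeps a definition listing mandatory metadata prompts, checks each instance against it, and serializes the instance to XML in the spirit of the generic data model; the workflow and field names are invented.

        import xml.etree.ElementTree as ET

        definition = {  # hypothetical provenance definition for one lab workflow
            "name": "ctd_cast",
            "fields": [
                {"name": "station", "mandatory": True},
                {"name": "depth_m", "mandatory": True},
                {"name": "analyst", "mandatory": False},
            ],
        }

        def make_instance(values):
            """Validate one data instance against the definition and emit XML."""
            missing = [f["name"] for f in definition["fields"]
                       if f["mandatory"] and f["name"] not in values]
            if missing:
                raise ValueError(f"mandatory metadata missing: {missing}")
            root = ET.Element("instance", workflow=definition["name"])
            for name, value in values.items():
                ET.SubElement(root, name).text = str(value)
            return ET.tostring(root, encoding="unicode")

        print(make_instance({"station": "PS-12", "depth_m": 350}))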

  15. CERES AuTomAted job Loading SYSTem (CATALYST): An automated workflow manager for satellite data production

    NASA Astrophysics Data System (ADS)

    Gleason, J. L.; Hillyer, T. N.; Wilkins, J.

    2012-12-01

    The CERES Science Team integrates data from 5 CERES instruments onboard the Terra, Aqua and NPP missions. The processing chain fuses CERES observations with data from 19 other unique sources. The addition of CERES Flight Model 5 (FM5) onboard NPP, coupled with ground processing system upgrades further emphasizes the need for an automated job-submission utility to manage multiple processing streams concurrently. The operator-driven, legacy-processing approach relied on manually staging data from magnetic tape to limited spinning disk attached to a shared memory architecture system. The migration of CERES production code to a distributed, cluster computing environment with approximately one petabyte of spinning disk containing all precursor input data products facilitates the development of a CERES-specific, automated workflow manager. In the cluster environment, I/O is the primary system resource in contention across jobs. Therefore, system load can be maximized with a throttling workload manager. This poster discusses a Java and Perl implementation of an automated job management tool tailored for CERES processing.
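
    Since I/O is described above as the primary contended resource, the core of a throttling workload manager can be sketched, purely for illustration, as a semaphore that caps how many I/O-heavy jobs run at once; the limit of four and the sleep standing in for real work are assumptions, not CATALYST's actual parameters.

        import random, threading, time

        MAX_CONCURRENT_JOBS = 4                       # assumed throttle limit
        slots = threading.BoundedSemaphore(MAX_CONCURRENT_JOBS)

        def run_job(job_id):
            with slots:                               # blocks until a slot frees up
                time.sleep(random.uniform(0.1, 0.3))  # stand-in for I/O-heavy work
                print(f"job {job_id} finished")

        threads = [threading.Thread(target=run_job, args=(i,)) for i in range(10)]
        for t in threads:
            t.start()
        for t in threads:
            t.join()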

  16. Workflow Challenges of Enterprise Imaging: HIMSS-SIIM Collaborative White Paper.

    PubMed

    Towbin, Alexander J; Roth, Christopher J; Bronkalla, Mark; Cram, Dawn

    2016-10-01

    With the advent of digital cameras, there has been an explosion in the number of medical specialties using images to diagnose or document disease and guide interventions. In many specialties, these images are not added to the patient's electronic medical record and are not distributed so that other providers caring for the patient can view them. As hospitals begin to develop enterprise imaging strategies, they have found that there are multiple challenges preventing the implementation of systems to manage image capture, image upload, and image management. This HIMSS-SIIM white paper will describe the key workflow challenges related to enterprise imaging and offer suggestions for potential solutions to these challenges.

  17. Case Report: Activity Diagrams for Integrating Electronic Prescribing Tools into Clinical Workflow

    PubMed Central

    Johnson, Kevin B.; FitzHenry, Fern

    2006-01-01

    To facilitate the future implementation of an electronic prescribing system, this case study modeled prescription management processes in various primary care settings. The Vanderbilt e-prescribing design team conducted initial interviews with clinic managers, physicians and nurses, and then represented the sequences of steps carried out to complete prescriptions in activity diagrams. The diagrams covered outpatient prescribing for patients during a clinic visit and between clinic visits. Practice size, practice setting, and practice specialty type influenced the prescribing processes used. The model developed may be useful to others engaged in building or tailoring an e-prescribing system to meet the specific workflows of various clinic settings. PMID:16622168

  18. U-Form vs. M-Form: How to Understand Decision Autonomy Under Healthcare Decentralization?

    PubMed Central

    Bustamante, Arturo Vargas

    2016-01-01

    For more than three decades healthcare decentralization has been promoted in developing countries as a way of improving the financing and delivery of public healthcare. Decision autonomy under healthcare decentralization would determine the role and scope of responsibility of local authorities. Jalal Mohammed, Nicola North, and Toni Ashton analyze decision autonomy within decentralized services in Fiji. They conclude that the narrow decision space allowed to local entities might have limited the benefits of decentralization for users and providers. To discuss the costs and benefits of healthcare decentralization, this paper uses the U-form and M-form typology to further illustrate the role of decision autonomy under healthcare decentralization. This paper argues that when evaluating healthcare decentralization, it is important to determine whether the benefits from decentralization are greater than its costs. The U-form and M-form framework is proposed as a useful typology to evaluate different types of institutional arrangements under healthcare decentralization. Under this model, the more decentralized organizational form (M-form) is superior if the benefits from flexibility exceed the costs of duplication, and the more centralized organizational form (U-form) is superior if the savings from economies of scale outweigh the costs of the decision-making process that runs from the center to the regions. Budgetary and financial autonomy and effective mechanisms to hold local governments accountable for their spending behavior are key decision autonomy variables that could sway the cost-benefit analysis of healthcare decentralization. PMID:27694684

  19. 47 CFR 202.2 - Criteria and guidance.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... operations management will remain decentralized in order to retain flexibility in the use of individual... responsible for telecommunications systems operations, and the carriers, are responsible for planning with respect to emergency operations. Guidance in this matter has been issued from a number of sources and...

  20. The Shepherd Creek experience and some lessons learned

    EPA Science Inventory

    A decentralized, retrofit approach to storm water management was implemented in a small suburban drainage on the basis of a voluntary reverse auction. This campaign led to the installation of 83 rain gardens and 176 rain barrels on approximately 20 percent of 350 residential prop...

  1. WASTE-TO-RESOURCE: NOVEL MEMBRANE SYSTEMS FOR SAFE AND SUSTAINABLE BRINE MANAGEMENT

    EPA Science Inventory

    Decentralized waste-to-reuse systems will be optimized to maximize resource and energy recovery and minimize chemicals and energy use. This research will enhance fundamental knowledge on simultaneous heat and mass transport through membranes, lower process costs, and furthe...

  2. [Plansalud: Decentralized and agreed sector plan for the capacity development in health, Peru 2010-2014].

    PubMed

    Huamán-Angulo, Lizardo; Liendo-Lucano, Lindaura; Nuñez-Vergara, Manuel

    2011-06-01

    Human resources are the backbone of health sector actions; however, they do not necessarily receive the greatest attention. Therefore, the Ministry of Health of Peru (MINSA), together with regional governments, led the Decentralized and Agreed Sector Plan for Capacity Development in Health 2010-2014 (PLANSALUD) with the aim of strengthening the capacities of Human Resources for Health (HRH) and contributing to the efficient development of health care with quality, relevance, equity and multiculturalism, in the context of decentralization, Universal Health Insurance (AUS) and health policies. To achieve this goal, three components were proposed (technical assistance, joint training, and education-health articulation) that bring together an important set of interventions, which are planned and defined at the national, regional and local levels, thus contributing to improving governance capacity, management capability and the delivery of health services. This paper presents a first overview of PLANSALUD, including aspects related to planning, management, financing, structure and functioning, as well as monitoring and evaluation measures.

  3. Optimal joint management of a coastal aquifer and a substitute resource

    NASA Astrophysics Data System (ADS)

    Moreaux, M.; Reynaud, A.

    2004-06-01

    This article characterizes the optimal joint management of a coastal aquifer and a costly water substitute. For this purpose we use a mathematical representation of the aquifer that incorporates the displacement of the interface between the seawater and the freshwater of the aquifer. We identify the spatial cost externalities created by users on each other and we show that the optimal water supply depends on the location of users. Users located in the coastal zone exclusively use the costly substitute. Those located in the more upstream area are supplied from the aquifer. At the optimum their withdrawal must take into account the cost externalities they impose on users located downstream. Lastly, users located in an intermediate zone use the aquifer but incur a surface transportation cost. We show that the optimum can be implemented in a decentralized economy through a very simple Pigouvian tax. Finally, the optimal and decentralized extraction policies are simulated on a very simple example.
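
    For orientation only (the paper's exact expressions are not reproduced here), the textbook Pigouvian idea is that each user faces a unit tax equal to the marginal external cost their withdrawal imposes on the others, so that private and social marginal costs coincide at the optimum; the symbols below (withdrawal w_i, benefit B_i, own cost c_i, other users' costs C_j) are generic placeholders, not the authors' notation.

        % Generic Pigouvian-tax condition (illustrative placeholders only)
        \[
          \tau_i \;=\; \sum_{j \neq i} \frac{\partial C_j}{\partial w_i},
          \qquad
          B_i'(w_i) \;=\; c_i'(w_i) + \tau_i
          \quad \text{at the decentralized optimum.}
        \]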

  4. A software tool to analyze clinical workflows from direct observations.

    PubMed

    Schweitzer, Marco; Lasierra, Nelia; Hoerbst, Alexander

    2015-01-01

    Observational data on clinical processes need to be managed in a convenient way so that process information is reliable, valid and usable for further analysis. However, existing tools for recording observations fall short in the systematic collection of data for specific workflow recordings. We present a software tool that was developed to facilitate the analysis of clinical process observations. The tool was successfully used in the OntoHealth project to build, store and analyze observations of routine diabetes consultations.

  5. A Web application for the management of clinical workflow in image‐guided and adaptive proton therapy for prostate cancer treatments

    PubMed Central

    Boes, Peter; Ho, Meng Wei; Li, Zuofeng

    2015-01-01

    Image‐guided radiotherapy (IGRT), based on radiopaque markers placed in the prostate gland, was used for proton therapy of prostate patients. Orthogonal X‐rays and the IBA Digital Image Positioning System (DIPS) were used for setup correction prior to treatment and were repeated after treatment delivery. Following a rationale for margin estimates similar to that of van Herk,(1) the daily post‐treatment DIPS data were analyzed to determine if an adaptive radiotherapy plan was necessary. A Web application using ASP.NET MVC5, Entity Framework, and an SQL database was designed to automate this process. The designed features included state‐of‐the‐art Web technologies, a domain model closely matching the workflow, a database supporting concurrency and data mining, access to the DIPS database, secured user access and roles management, and graphing and analysis tools. The Model‐View‐Controller (MVC) paradigm allowed clean domain logic, unit testing, and extensibility. Client‐side technologies, such as jQuery, jQuery Plug‐ins, and Ajax, were adopted to achieve a rich user environment and fast response. Data models included patients, staff, treatment fields and records, correction vectors, DIPS images, and association logics. Data entry, analysis, workflow logics, and notifications were implemented. The system effectively modeled the clinical workflow and IGRT process. PACS number: 87 PMID:26103504
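
    The margin rationale cited above follows van Herk; as a reminder of the commonly quoted population-based recipe (CTV-to-PTV margin of roughly 2.5·Sigma + 0.7·sigma, with Sigma the systematic and sigma the random setup-error SD), here is a small illustration on made-up per-axis setup errors; none of the numbers are the paper's data.

        import statistics

        def van_herk_margin(systematic_sd_mm, random_sd_mm):
            """Population-based margin recipe: M = 2.5*Sigma + 0.7*sigma (mm)."""
            return 2.5 * systematic_sd_mm + 0.7 * random_sd_mm

        # Hypothetical per-patient mean shifts (mm) along one axis across a population:
        patient_means = [1.2, -0.4, 0.8, 0.3, -1.1]
        sigma_systematic = statistics.stdev(patient_means)   # Sigma: SD of the means
        sigma_random = 1.5                                    # sigma: assumed pooled random SD
        print(f"margin = {van_herk_margin(sigma_systematic, sigma_random):.1f} mm")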

  6. Optical zone centration: a retrospective analysis of the excimer laser after three years

    NASA Astrophysics Data System (ADS)

    Vervecken, Filip; Trau, Rene; Mertens, Erik L.; Vanhorenbeeck, R.; Van Aerde, F.; Zen, J.; Haustrate, F.; Tassignon, Marie J.

    1996-12-01

    The aim of this study was to evaluate the influence of the mechanical factor 'decentration' on the visual outcome after PRK. One hundred eyes of 70 patients were included. The mean decentration was 0.27 +/- 0.18 mm. Decentration was less than 0.5 mm in 84 percent of the cases. The importance of decentration was investigated through the statistical correlation between decentration from the pupil center and the visual outcome. We did not find any statistically significant association for decentrations of less than 1 mm. Our conclusion is that decentration, if less than 1 mm, does not play an important role in the final visual outcome after PRK.

  7. Using conceptual work products of health care to design health IT.

    PubMed

    Berry, Andrew B L; Butler, Keith A; Harrington, Craig; Braxton, Melissa O; Walker, Amy J; Pete, Nikki; Johnson, Trevor; Oberle, Mark W; Haselkorn, Jodie; Paul Nichol, W; Haselkorn, Mark

    2016-02-01

    This paper introduces a new, model-based design method for interactive health information technology (IT) systems. This method extends workflow models with models of conceptual work products. When the health care work being modeled is substantially cognitive, tacit, and complex in nature, graphical workflow models can become too complex to be useful to designers. Conceptual models complement and simplify workflows by providing an explicit specification for the information product they must produce. We illustrate how conceptual work products can be modeled using standard software modeling language, which allows them to provide fundamental requirements for what the workflow must accomplish and the information that a new system should provide. Developers can use these specifications to envision how health IT could enable an effective cognitive strategy as a workflow with precise information requirements. We illustrate the new method with a study conducted in an outpatient multiple sclerosis (MS) clinic. This study shows specifically how the different phases of the method can be carried out, how the method allows for iteration across phases, and how the method generated a health IT design for case management of MS that is efficient and easy to use. Copyright © 2015 Elsevier Inc. All rights reserved.

  8. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chao, Tian-Jy; Kim, Younghun

    An end-to-end interoperability and workflows from building architecture design to one or more simulations, in one aspect, may comprise establishing a BIM enablement platform architecture. A data model defines data entities and entity relationships for enabling the interoperability and workflows. A data definition language may be implemented that defines and creates a table schema of a database associated with the data model. Data management services and/or application programming interfaces may be implemented for interacting with the data model. Web services may also be provided for interacting with the data model via the Web. A user interface may be implemented that communicates with users and uses the BIM enablement platform architecture, the data model, the data definition language, data management services and application programming interfaces to provide functions to the users to perform work related to building information management.
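
    As a minimal sketch of the kind of data-definition step described above (a data model of entities and attributes turned into a table schema), the snippet below emits CREATE TABLE statements and applies them to an in-memory SQLite database; the entity and column names are illustrative and are not the actual BIM data model.

        import sqlite3

        # Hypothetical mini data model: entity name -> {column: SQL type}
        entities = {
            "Building": {"id": "INTEGER PRIMARY KEY", "name": "TEXT", "gross_area": "REAL"},
            "SimulationRun": {"id": "INTEGER PRIMARY KEY", "building_id": "INTEGER",
                              "engine": "TEXT", "started_at": "TEXT"},
        }

        def to_ddl(model):
            """Turn the data-model description into CREATE TABLE statements."""
            statements = []
            for table, columns in model.items():
                cols = ",\n  ".join(f"{c} {t}" for c, t in columns.items())
                statements.append(f"CREATE TABLE {table} (\n  {cols}\n);")
            return "\n\n".join(statements)

        print(to_ddl(entities))
        con = sqlite3.connect(":memory:")   # apply the generated schema
        con.executescript(to_ddl(entities))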

  9. Vel-IO 3D: A tool for 3D velocity model construction, optimization and time-depth conversion in 3D geological modeling workflow

    NASA Astrophysics Data System (ADS)

    Maesano, Francesco E.; D'Ambrogi, Chiara

    2017-02-01

    We present Vel-IO 3D, a tool for 3D velocity model creation and time-depth conversion, as part of a workflow for 3D model building. The workflow addresses the management of large subsurface datasets, mainly seismic lines and well logs, and the construction of a 3D velocity model able to describe the variation of the velocity parameters related to strong facies and thickness variability and to high structural complexity. Although it is applicable in many geological contexts (e.g. foreland basins, large intermountain basins), it is particularly suitable in wide flat regions, where subsurface structures have no surface expression. The Vel-IO 3D tool is composed of three scripts, written in Python 2.7.11, that automate i) the 3D instantaneous velocity model building, ii) the velocity model optimization, iii) the time-depth conversion. They determine a 3D geological model that is consistent with the primary geological constraints (e.g. depth of the markers on wells). The proposed workflow and the Vel-IO 3D tool have been tested, during the EU-funded Project GeoMol, by the construction of the 3D geological model of a flat region, 5700 km2 in area, located in the central part of the Po Plain. The final 3D model showed the efficiency of the workflow and the Vel-IO 3D tool in the management of large amounts of data in both the time and depth domains. A four-layer layer-cake velocity model was applied to a succession several thousand metres thick (5000-13,000 m), with 15 horizons from the Triassic up to the Pleistocene, complicated by Mesozoic extensional tectonics and by buried thrusts related to the Southern Alps and Northern Apennines.
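
    A common parameterization behind instantaneous-velocity models of this kind is a linear velocity law v(z) = v0 + k·z per layer, for which depth follows from two-way time in closed form, z(t) = (v0/k)(e^(k·t) - 1) with t the one-way time. The sketch below applies that standard relation; the v0 and k values are illustrative, not calibrated GeoMol parameters.

        import math

        def depth_from_twt(twt_s, v0_ms, k_per_s):
            """Depth (m) of a reflector at two-way time twt_s for v(z) = v0 + k*z."""
            t_oneway = twt_s / 2.0
            if k_per_s == 0:
                return v0_ms * t_oneway
            return (v0_ms / k_per_s) * (math.exp(k_per_s * t_oneway) - 1.0)

        # Example: a horizon picked at 2.4 s TWT, with assumed v0 = 1800 m/s, k = 0.5 1/s
        print(f"{depth_from_twt(2.4, 1800.0, 0.5):.0f} m")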

  10. A tutorial of diverse genome analysis tools found in the CoGe web-platform using Plasmodium spp. as a model

    PubMed Central

    Castillo, Andreina I; Nelson, Andrew D L; Haug-Baltzell, Asher K; Lyons, Eric

    2018-01-01

    Abstract Integrated platforms for storage, management, analysis and sharing of large quantities of omics data have become fundamental to comparative genomics. CoGe (https://genomevolution.org/coge/) is an online platform designed to manage and study genomic data, enabling both data- and hypothesis-driven comparative genomics. CoGe’s tools and resources can be used to organize and analyse both publicly available and private genomic data from any species. Here, we demonstrate the capabilities of CoGe through three example workflows using 17 Plasmodium genomes as a model. Plasmodium genomes present unique challenges for comparative genomics due to their rapidly evolving and highly variable genomic AT/GC content. These example workflows are intended to serve as templates to help guide researchers who would like to use CoGe to examine diverse aspects of genome evolution. In the first workflow, trends in genome composition and amino acid usage are explored. In the second, changes in genome structure and the distribution of synonymous (Ks) and non-synonymous (Kn) substitution values are evaluated across species with different levels of evolutionary relatedness. In the third workflow, microsyntenic analyses of multigene families’ genomic organization are conducted using two Plasmodium-specific gene families—serine repeat antigen, and cytoadherence-linked asexual gene—as models. In general, these example workflows show how to achieve quick, reproducible and shareable results using the CoGe platform. We were able to replicate previously published results, as well as leverage CoGe’s tools and resources to gain additional insight into various aspects of Plasmodium genome evolution. Our results highlight the usefulness of the CoGe platform, particularly in understanding complex features of genome evolution. Database URL: https://genomevolution.org/coge/
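
    The first example workflow above explores trends in genome composition; as a trivial, self-contained illustration of that kind of summary statistic (not a CoGe tool), the snippet below computes the GC content of a made-up DNA fragment.

        def gc_content(seq: str) -> float:
            """Fraction of G and C bases in a DNA sequence."""
            seq = seq.upper()
            gc = sum(seq.count(b) for b in "GC")
            return gc / len(seq) if seq else 0.0

        fragment = "ATATTTATGCAATTAAAATATGGTTCAAT"   # illustrative, AT-rich fragment
        print(f"GC content: {gc_content(fragment):.1%}")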

  11. Task-technology fit of video telehealth for nurses in an outpatient clinic setting.

    PubMed

    Cady, Rhonda G; Finkelstein, Stanley M

    2014-07-01

    Incorporating telehealth into outpatient care delivery supports management of consumer health between clinic visits. Task-technology fit is a framework for understanding how technology helps and/or hinders a person during work processes. Evaluating the task-technology fit of video telehealth for personnel working in a pediatric outpatient clinic and providing care between clinic visits ensures the information provided matches the information needed to support work processes. The workflow of advanced practice registered nurse (APRN) care coordination provided via telephone and video telehealth was described and measured using a mixed-methods workflow analysis protocol that incorporated cognitive ethnography and time-motion study. Qualitative and quantitative results were merged and analyzed within the task-technology fit framework to determine the workflow fit of video telehealth for APRN care coordination. Incorporating video telehealth into APRN care coordination workflow provided visual information unavailable during telephone interactions. Despite additional tasks and interactions needed to obtain the visual information, APRN workflow efficiency, as measured by time, was not significantly changed. Analyzed within the task-technology fit framework, the increased visual information afforded by video telehealth supported the assessment and diagnostic information needs of the APRN. Telehealth must provide the right information to the right clinician at the right time. Evaluating task-technology fit using a mixed-methods protocol ensured rigorous analysis of fit within work processes and identified workflows that benefit most from the technology.

  12. Work Flow Analysis Report Consisting of Work Management - Preventive Maintenance - Materials and Equipment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    JENNINGS, T.L.

    The Work Flow Analysis Report will be used to facilitate the requirements for implementing the Work Control module of Passport. The report consists of workflow integration processes for Work Management, Preventive Maintenance, and Materials and Equipment.

  13. Hbim Methodology as a Bridge Between Italy and Argentina

    NASA Astrophysics Data System (ADS)

    Moreira, A.; Quattrini, R.; Maggiolo, G.; Mammoli, R.

    2018-05-01

    The availability of efficient HBIM workflows could represent a very important step towards more efficient management of historical real estate. The present work shows how to obtain accurate and reliable information on heritage buildings through reality capture and 3D modelling to support restoration purposes or knowledge-based applications. Two case studies metaphorically join Italy and Argentina. The research article explains the workflows applied at Palazzo Ferretti in Ancona and the Manzana Histórica de la Universidad National del Litoral, providing a constructive comparison and blending technological and theoretical approaches. In a bottom-up process, the assessment of the two case studies validates a workflow that achieves useful and proper data enrichment of each HBIM model. Another key aspect is the Level of Development (LOD) evaluation of both models: different ranges and scales are defined in America (100-500) and in Italy (A-G); nevertheless, it is possible to obtain shared standard procedures, facilitating HBIM development and diffusion in operating workflows.

  14. A decentralized and onsite wastewater management course: bringing together global concerns and practical pedagogy.

    PubMed

    Gaulke, L S; Borgford-Parnell, J L; Stensel, H D

    2008-01-01

    This paper reports on the design, implementation, and results of a course focused on decentralized and onsite wastewater treatment in global contexts. Problem-based learning was the primary pedagogical method, with which students tackled real-world problems and designed systems to meet the needs of diverse populations. Both learning and course evaluations demonstrated that the course was successful in fulfilling learning objectives, increasing student design skills, and raising awareness of global applications. Based on this experience, a list of recommendations was created for co-developing and team-teaching multidisciplinary design courses. These recommendations include ideas for aligning student and teacher goals, overcoming barriers to effective group work, and embedding continuous course assessments. Copyright IWA Publishing 2008.

  15. Balancing Officer Community Manpower through Decentralization: Granular Programming Revisited (1REV)

    DTIC Science & Technology

    2017-08-01

    supply-demand imbalances Economic theory identifies costs and benefits associated with decentralization. On the benefits side, decentralized decision...patterns rather than costs. Granular programming as a decentralized, market-based initiative The costs and benefits of decentralized (instead of...paygrade-specific rates were based on average MPN costs by paygrade. The benefits of this approach to granular programming are that it is conceptually

  16. The impact of automation on organizational changes in a community hospital clinical microbiology laboratory.

    PubMed

    Camporese, Alessandro

    2004-06-01

    The diagnosis of infectious diseases and the role of the microbiology laboratory are currently undergoing a process of change. The need for overall efficiency in providing results is now given the same importance as accuracy. This means that laboratories must be able to produce quality results in less time, with the capacity to interpret the results clinically. To improve the clinical impact of microbiology results, the new challenge facing the microbiologist has become one of process management instead of pure analysis. A proper project management process designed to improve workflow, reduce analytical time, and provide the same high-quality results without losing valuable time for treating the patient has become essential. Our objective was to study the impact of introducing automation and computerization into the microbiology laboratory, and of reorganizing the laboratory workflow, i.e. scheduling personnel to work shifts covering both the entire day and the entire week. In our laboratory, the introduction of automation and computerization, as well as the reorganization of personnel, and thus of the workflow itself, has resulted in an improvement in response time and greater efficiency in diagnostic procedures.

  17. KNIME4NGS: a comprehensive toolbox for next generation sequencing analysis.

    PubMed

    Hastreiter, Maximilian; Jeske, Tim; Hoser, Jonathan; Kluge, Michael; Ahomaa, Kaarin; Friedl, Marie-Sophie; Kopetzky, Sebastian J; Quell, Jan-Dominik; Mewes, H Werner; Küffner, Robert

    2017-05-15

    Analysis of Next Generation Sequencing (NGS) data requires the processing of large datasets by chaining various tools with complex input and output formats. In order to automate data analysis, we propose to standardize NGS tasks into modular workflows. This simplifies reliable handling and processing of NGS data, and corresponding solutions become substantially more reproducible and easier to maintain. Here, we present a documented, Linux-based toolbox of 42 processing modules that are combined to construct workflows facilitating a variety of tasks such as DNAseq and RNAseq analysis. We also describe important technical extensions. The high-throughput executor (HTE) helps to increase reliability and to reduce manual intervention when processing complex datasets. We also provide a dedicated binary manager that assists users in obtaining the modules' executables and keeping them up to date. As the basis for this actively developed toolbox, we use the workflow management software KNIME. See http://ibisngs.github.io/knime4ngs for nodes and user manual (GPLv3 license). robert.kueffner@helmholtz-muenchen.de. Supplementary data are available at Bioinformatics online.
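
    As a rough illustration of the kind of modular chaining such a toolbox automates, the following sketch composes placeholder NGS steps and retries transient failures in the spirit of a high-throughput executor. Module names, commands, and the retry policy are hypothetical and are not the KNIME4NGS API.

        import subprocess
        from dataclasses import dataclass
        from typing import List

        @dataclass
        class Module:
            """One processing step wrapping an external tool (names are placeholders)."""
            name: str
            command: List[str]

        def run_workflow(modules: List[Module], retries: int = 2) -> None:
            """Run modules in order; retry transient failures like a simple executor."""
            for module in modules:
                for attempt in range(1, retries + 2):
                    result = subprocess.run(module.command, capture_output=True, text=True)
                    if result.returncode == 0:
                        print(f"[ok] {module.name}")
                        break
                    print(f"[retry {attempt}] {module.name}: {result.stderr.strip()}")
                else:
                    raise RuntimeError(f"{module.name} failed after {retries + 1} attempts")

        if __name__ == "__main__":
            # Hypothetical RNA-seq chain; real module names and flags will differ.
            workflow = [
                Module("quality_control", ["echo", "fastqc sample.fastq"]),
                Module("alignment",       ["echo", "align sample.fastq -o sample.bam"]),
                Module("quantification",  ["echo", "count sample.bam -o counts.tsv"]),
            ]
            run_workflow(workflow)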

  18. Implementation of a 'lean' cytopathology service: towards routine same-day reporting.

    PubMed

    Hewer, Ekkehard; Hammer, Caroline; Fricke-Vetsch, Daniela; Baumann, Cinzia; Perren, Aurel; Schmitt, Anja M

    2018-05-01

    To systematically assess the effects of a Lean management intervention in an academic cytopathology service, we monitored outcomes, including specimen turnaround times, during stepwise implementation of a lean cytopathology workflow for gynaecological and non-gynaecological cytology. The intervention resulted in a major reduction of turnaround times for both gynaecological (3rd quartile 4.1 vs 2.3 working days) and non-gynaecological cytology (3rd quartile 1.9 vs 1.2 working days). Introduction of fully electronic reporting had an additional effect beyond continuous staining of slides alone. The rate of non-gynaecological specimens reported the same day increased from 4.5% to 56.5% of specimens received before noon. Lean management principles provide a useful framework for organizing a cytopathology workflow. Stepwise implementation, beginning with a simplified gynaecological cytology workflow, allowed involved staff to monitor the effects of individual changes and allowed for a smooth transition. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2018. All rights reserved. No commercial use is permitted unless otherwise expressly granted.

  19. Accrual Budgeting: Experiences of Other Nations and Implications for the United States

    DTIC Science & Technology

    2000-02-01

    Used as a Tool in Addressing Performance Management Challenges Proponents described accrual budgeting as a useful, if not critical , tool in...accountable in these more decentralized systems. As a result, some proponents view accrual budgeting as critical to establishing more performance-focused...the federal government and thus plays a critical role in the decision-making process. Policymakers, managers, and the American people rely on it to

  20. Do Community-Managed Schools Work? An Evaluation of El Salvador's EDUCO Program. Working Paper Series on Impact Evaluation of Education Reforms. Paper No. 8.

    ERIC Educational Resources Information Center

    Jimenez, Emmanuel; Sawada, Yasuyuki

    This paper measures the effects on student outcomes of decentralizing educational responsibility to communities and schools. In El Salvador, community-managed schools emerged during the 1980s when public schools could not be extended to rural areas because of the country's civil war. In 1991, El Salvador's Ministry of Education decided to draw on…

  1. U-Form vs. M-Form: How to Understand Decision Autonomy Under Healthcare Decentralization? Comment on "Decentralisation of Health Services in Fiji: A Decision Space Analysis".

    PubMed

    Bustamante, Arturo Vargas

    2016-06-07

    For more than three decades healthcare decentralization has been promoted in developing countries as a way of improving the financing and delivery of public healthcare. Decision autonomy under healthcare decentralization would determine the role and scope of responsibility of local authorities. Jalal Mohammed, Nicola North, and Toni Ashton analyze decision autonomy within decentralized services in Fiji. They conclude that the narrow decision space allowed to local entities might have limited the benefits of decentralization on users and providers. To discuss the costs and benefits of healthcare decentralization this paper uses the U-form and M-form typology to further illustrate the role of decision autonomy under healthcare decentralization. This paper argues that when evaluating healthcare decentralization, it is important to determine whether the benefits from decentralization are greater than its costs. The U-form and M-form framework is proposed as a useful typology to evaluate different types of institutional arrangements under healthcare decentralization. Under this model, the more decentralized organizational form (M-form) is superior if the benefits from flexibility exceed the costs of duplication and the more centralized organizational form (U-form) is superior if the savings from economies of scale outweigh the costly decision-making process from the center to the regions. Budgetary and financial autonomy and effective mechanisms to maintain local governments accountable for their spending behavior are key decision autonomy variables that could sway the cost-benefit analysis of healthcare decentralization. © 2016 by Kerman University of Medical Sciences.

  2. The impact of missing sensor information on surgical workflow management.

    PubMed

    Liebmann, Philipp; Meixensberger, Jürgen; Wiedemann, Peter; Neumuth, Thomas

    2013-09-01

    Sensor systems in the operating room may encounter intermittent data losses that reduce the performance of surgical workflow management systems (SWFMS). Sensor data loss could impact SWFMS-based decision support, device parameterization, and information presentation. The purpose of this study was to understand the robustness of surgical process models when sensor information is partially missing. We tested how the SWFMS behaves when the sensor system that tracks the progress of a surgical intervention delivers erroneous data or no data at all. The individual surgical process models (iSPMs) from 100 different cataract procedures performed by 3 ophthalmologic surgeons were used to select a randomized subset and create a generalized surgical process model (gSPM). A disjoint subset was selected from the iSPMs and used to simulate the surgical process against the gSPM. The loss of sensor data was simulated by removing some information from one task in the iSPM. The effect of missing sensor data was measured using several metrics: (a) successful relocation of the path in the gSPM, (b) the number of steps to find the converging point, and (c) the perspective with the highest occurrence of unsuccessful path findings. A gSPM built using 30% of the iSPMs successfully found the correct path in 90% of the cases. The most critical sensor data were the information regarding the instrument used by the surgeon. We found that use of a gSPM to provide input data for a SWFMS is robust and can be accurate despite missing sensor data. A surgical workflow management system can provide the surgeon with workflow guidance in the OR for most cases. Sensor systems for surgical process tracking can be evaluated based on the stability and accuracy of functional and spatial operative results.
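
    The following sketch illustrates, under strong simplifying assumptions, the kind of simulation described: a generalized model is reduced to a set of observed task transitions, one task's sensor information is dropped, and the steps needed to find a converging point are counted. Task names and the matching rule are invented for illustration and do not reproduce the study's iSPM/gSPM formalism.

        import random

        def build_gspm(ispms):
            """Collect the set of observed task transitions from a sample of iSPMs."""
            transitions = set()
            for ispm in ispms:
                transitions.update(zip(ispm, ispm[1:]))
            return transitions

        def relocate(observed, gspm_transitions):
            """Simulate one missing task and count steps until the path is valid again."""
            damaged = list(observed)
            missing_index = random.randrange(1, len(damaged) - 1)
            damaged[missing_index] = None          # sensor data loss for one task
            steps = 0
            for i in range(missing_index + 1, len(damaged) - 1):
                steps += 1
                if (damaged[i], damaged[i + 1]) in gspm_transitions:
                    return True, steps             # converging point found
            return False, steps

        if __name__ == "__main__":
            random.seed(0)
            # Toy cataract-like task sequences; real iSPMs carry far richer perspectives.
            ispms = [["incision", "capsulorhexis", "phaco", "irrigation", "lens", "closing"]
                     for _ in range(30)]
            gspm = build_gspm(ispms)
            ok, steps = relocate(ispms[0], gspm)
            print("relocated:", ok, "after", steps, "steps")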

  3. The development and implementation of MOSAIQ Integration Platform (MIP) based on the radiotherapy workflow

    NASA Astrophysics Data System (ADS)

    Yang, Xin; He, Zhen-yu; Jiang, Xiao-bo; Lin, Mao-sheng; Zhong, Ning-shan; Hu, Jiang; Qi, Zhen-yu; Bao, Yong; Li, Qiao-qiao; Li, Bao-yue; Hu, Lian-ying; Lin, Cheng-guang; Gao, Yuan-hong; Liu, Hui; Huang, Xiao-yan; Deng, Xiao-wu; Xia, Yun-fei; Liu, Meng-zhong; Sun, Ying

    2017-03-01

    To meet the special demands in China and the particular needs of the radiotherapy department, a MOSAIQ Integration Platform CHN (MIP) based on the workflow of radiation therapy (RT) has been developed as a supplementary system to the Elekta MOSAIQ. The MIP adopts a client-server (C/S) architecture, and its database is built on the Treatment Planning System (TPS) and MOSAIQ SQL Server 2008, running on the hospital's local network. Five network servers form the core hardware, supplying data storage and network services on a cloud basis. The core software is written in C# and developed on the Microsoft Visual Studio platform. The MIP server offers network services, including data entry, query, statistics, and printing, for about 200 workstations simultaneously. The MIP has been implemented over the past one and a half years, practical patient-oriented functions have been developed, and the platform now covers almost the whole radiation therapy workflow. There are 15 function modules, including Notice, Appointment, Billing, Document Management (application/execution), and System Management. By June 2016, the MIP had recorded 13,546 patients, 13,533 plan applications, 15,475 RT records, 14,656 RT summaries, 567,048 billing records, and 506,612 workload records. The MIP based on the RT workflow has been successfully developed and clinically implemented with real-time performance, data security, and stable operation. It has proven user-friendly and significantly improves the efficiency of the department, and it is key to facilitating information sharing and department management. More functions can be added or modified to further enhance its potential in research and clinical practice.
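
    A minimal sketch of the statistics side of such a client-server setup, with SQLite standing in for the production SQL Server database. The table, columns, and query are hypothetical and only illustrate how a workload report might be derived from centrally stored RT records.

        import sqlite3

        # SQLite stands in for the production SQL Server database; the schema is hypothetical.
        conn = sqlite3.connect(":memory:")
        conn.execute("""CREATE TABLE rt_records (
                            patient_id INTEGER,
                            treatment_date TEXT,
                            workstation TEXT)""")
        conn.executemany(
            "INSERT INTO rt_records VALUES (?, ?, ?)",
            [(1, "2016-06-01", "WS-01"), (1, "2016-06-02", "WS-01"), (2, "2016-06-01", "WS-07")],
        )

        def workload_by_workstation(connection):
            """A simple statistics query of the kind a department report might need."""
            cursor = connection.execute(
                "SELECT workstation, COUNT(*) FROM rt_records GROUP BY workstation")
            return dict(cursor.fetchall())

        print(workload_by_workstation(conn))   # e.g. {'WS-01': 2, 'WS-07': 1}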

  4. Electronic health records and patient safety: co-occurrence of early EHR implementation with patient safety practices in primary care settings.

    PubMed

    Tanner, C; Gans, D; White, J; Nath, R; Pohl, J

    2015-01-01

    The role of electronic health records (EHR) in enhancing patient safety, while substantiated in many studies, is still debated. This paper examines early EHR adopters in primary care to understand the extent to which EHR implementation is associated with the workflows, policies and practices that promote patient safety, as compared to practices with paper records. Early adoption is defined as use of EHR prior to implementation of the Meaningful Use program. We utilized the Physician Practice Patient Safety Assessment (PPPSA) to compare primary care practices with fully implemented EHR to those utilizing paper records. The PPPSA measures the extent of adoption of patient safety practices in the domains of medication management, handoffs and transitions, personnel qualifications and competencies, practice management and culture, and patient communication. Data from 209 primary care practices responding between 2006 and 2010 were included in the analysis: 117 practices used paper medical records and 92 used an EHR. Results showed that, within all domains, EHR settings showed significantly higher rates of having workflows, policies and practices that promote patient safety than paper record settings. While these results were expected in the area of medication management, EHR use was also associated with adoption of patient safety practices in areas in which the researchers had no a priori expectations of association. Sociotechnical models of EHR use point to complex interactions between technology and other aspects of the environment related to human resources, workflow, policy, and culture, among others. This study identifies that, among primary care practices in the national PPPSA database, having an EHR was strongly empirically associated with the workflow, policy, communication and cultural practices recommended for safe patient care in ambulatory settings.

  5. Decentralization and primary health care: some negative implications in developing countries.

    PubMed

    Collins, C; Green, A

    1994-01-01

    Decentralization is a highly popular concept, being a key element of Primary Health Care policies. There are, however, certain negative implications of decentralization that must be taken into account. These are analyzed in this article with particular reference to developing countries. The authors criticize the tendency for decentralization to be associated with state limitations, and discuss the dilemma of relating decentralization, which is the enhancement of the different, to equity, which is the promotion of equivalence. Those situations in which decentralization can strengthen political domination are described. The authors conclude by setting out a checklist of warning questions and issues to be taken into account to ensure that decentralization genuinely facilitates the Primary Health Care orientation of health policy.

  6. Evaluating Benefits of LID Practices at Multiple Spatial Scales Using SUSTAIN

    EPA Science Inventory

    Low impact development (LID) is a storm water management approach that essentially mimics the way nature works: infiltrate, filter, store, evaporate, and detain runoff close to its source. LID practices are distributed in nature, and they work on decentralized micro-scales and m...

  7. The Micropolitics of School District Decentralization

    ERIC Educational Resources Information Center

    Bjork, Lars G.; Blase, Joseph

    2009-01-01

    This case study of school district educational reform in the United States adds to the knowledge base on the macropolitics of federal, state, and local governing bodies and private-sector agencies in formulating educational policies; it also contributes to our understanding of the micropolitics of policy implementation. Middle managers' political…

  8. Defense Management in the 1980s: The Role of the Service Secretaries,

    DTIC Science & Technology

    1980-10-01

    to be managed fully from the top. Even without decentralization, a Service Secretary’s responsibilities could be staggering. He alone is responsible...in military organizations those difficulties are particularly acute. In his classic study of innovation and the military, Edward Katzenbach

  9. NATIONAL RESEARCH NEEDS CONFERENCE PROCEEDINGS: RISK-BASED DECISION MAKING FOR ONSITE WASTEWATER TREATMENT

    EPA Science Inventory

    The Research Needs Conference Proceedings consist of a description of the background for the project and a series of white papers on the topics of integrated risk assessment/management for decentralized wastewater systems, design and performance of onsite soil adsorption systems,...

  10. Incorporating Brokers within Collaboration Environments

    NASA Astrophysics Data System (ADS)

    Rajasekar, A.; Moore, R.; de Torcy, A.

    2013-12-01

    A collaboration environment, such as the integrated Rule Oriented Data System (iRODS - http://irods.diceresearch.org), provides interoperability mechanisms for accessing storage systems, authentication systems, messaging systems, information catalogs, networks, and policy engines from a wide variety of clients. The interoperability mechanisms function as brokers, translating actions requested by clients to the protocol required by a specific technology. The iRODS data grid is used to enable collaborative research within hydrology, seismology, earth science, climate, oceanography, plant biology, astronomy, physics, and genomics disciplines. Although each domain has unique resources, data formats, semantics, and protocols, the iRODS system provides a generic framework that is capable of managing collaborative research initiatives that span multiple disciplines. Each interoperability mechanism (broker) is linked to a name space that enables unified access across the heterogeneous systems. The collaboration environment provides not only support for brokers, but also support for virtualization of name spaces for users, files, collections, storage systems, metadata, and policies. The broker enables access to data or information in a remote system using the appropriate protocol, while the collaboration environment provides a uniform naming convention for accessing and manipulating each object. Within the NSF DataNet Federation Consortium project (http://www.datafed.org), three basic types of interoperability mechanisms have been identified and applied: 1) drivers for managing manipulation at the remote resource (such as data subsetting), 2) micro-services that execute the protocol required by the remote resource, and 3) policies for controlling the execution. For example, drivers have been written for manipulating NetCDF and HDF formatted files within THREDDS servers. Micro-services have been written that manage interactions with the CUAHSI data repository, the DataONE information catalog, and the GeoBrain broker. Policies have been written that manage transfer of messages between an iRODS message queue and the Advanced Message Queuing Protocol. Examples of these brokering mechanisms will be presented. The DFC collaboration environment serves as the intermediary between community resources and compute grids, enabling reproducible data-driven research. It is possible to create an analysis workflow that retrieves data subsets from a remote server, assembles the required input files, automates the execution of the workflow, automatically tracks its provenance, and shares the input files, workflow, and output files. A collaborator can re-execute a shared workflow, compare results, change input files, and re-execute an analysis.
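
    A minimal sketch of the brokering idea, assuming hypothetical handler functions rather than the iRODS driver/micro-service interfaces: a broker registers one protocol handler per remote resource and translates a generic client request into the matching call.

        from typing import Callable, Dict

        # Protocol handlers (micro-service-like functions); all names are hypothetical.
        def fetch_thredds_subset(request: dict) -> str:
            return f"subset of {request['dataset']} via an OPeNDAP-style protocol"

        def query_dataone_catalog(request: dict) -> str:
            return f"metadata for {request['dataset']} from an information catalog"

        class Broker:
            """Translate a generic client action into the protocol of a remote resource."""
            def __init__(self) -> None:
                self.handlers: Dict[str, Callable[[dict], str]] = {}

            def register(self, resource: str, handler: Callable[[dict], str]) -> None:
                self.handlers[resource] = handler

            def request(self, resource: str, action: dict) -> str:
                if resource not in self.handlers:
                    raise KeyError(f"no broker registered for {resource}")
                return self.handlers[resource](action)

        broker = Broker()
        broker.register("thredds", fetch_thredds_subset)
        broker.register("dataone", query_dataone_catalog)
        print(broker.request("thredds", {"dataset": "ocean_temperature.nc"}))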

  11. Leadership characteristics and business management in modern academic surgery.

    PubMed

    Büchler, Peter; Martin, David; Knaebel, Hanns-Peter; Büchler, Markus W

    2006-04-01

    Management skills are necessary to successfully lead a surgical department in the future. This article focuses on practical aspects of surgical management, leadership and training. It demonstrates how the implementation of business management concepts changes workflow management and surgical training. A systematic Medline search was performed and business management publications were analysed. Neither management nor leadership skills are inborn; both are acquired. Management is about planning, controlling and putting appropriate structures in place. Leadership is anticipating and coping with change and people, and adopting a visionary stance. More change requires more leadership. Changes in surgery occur with unprecedented speed because of a growing demand for surgical procedures with limited financial resources. Modern leadership and management theories have to be tailored to surgery. It is clear that not all of them are applicable, but some of them are essential for surgeons. In business management, common traits of successful leaders include team orientation and communication skills. The most important trait, however, appears to be emotional intelligence. Novel training concepts for surgeons include on-the-job training and the introduction of improved workflow management systems, e.g. central case management. The need for surgeons with advanced skills in business, finance and organisational management is evident and will require systematic and tailored training.

  12. DietPal: A Web-Based Dietary Menu-Generating and Management System

    PubMed Central

    Abdullah, Siti Norulhuda; Shahar, Suzana; Abdul-Hamid, Helmi; Khairudin, Nurkahirizan; Yusoff, Mohamed; Ghazali, Rafidah; Mohd-Yusoff, Nooraini; Shafii, Nik Shanita; Abdul-Manaf, Zaharah

    2004-01-01

    Background Attempts in current health care practice to make health care more accessible, effective, and efficient through the use of information technology could include implementation of computer-based dietary menu generation. While several such systems already exist, their focus is mainly to assist healthy individuals in calculating their calorie intake and to help monitor the selection of menus based upon a prespecified calorie value. Although these prove to be helpful in some ways, they are not suitable for monitoring, planning, and managing patients' dietary needs and requirements. This paper presents a Web-based application that simulates the process of menu suggestions according to a standard practice employed by dietitians. Objective To model the workflow of dietitians and to develop, based on this workflow, a Web-based system for dietary menu generation and management. The system is intended for use by dietitians or by medical professionals at health centers in rural areas where there are no designated qualified dietitians. Methods First, a user-needs study was conducted among dietitians in Malaysia. The first survey of 93 dietitians (with 52 responding) was an assessment of information needed for dietary management and evaluation of compliance towards a dietary regime. The second study consisted of ethnographic observation and semi-structured interviews with 14 dietitians in order to identify the workflow of a menu-suggestion process. We subsequently designed and developed a Web-based dietary menu generation and management system called DietPal. DietPal has the capability of automatically calculating the nutrient and calorie intake of each patient based on the dietary recall as well as generating suitable diet and menu plans according to the calorie and nutrient requirement of the patient, calculated from anthropometric measurements. The system also allows reusing stored or predefined menus for other patients with similar health and nutrient requirements. Results We modeled the workflow of menu-suggestion activity currently adhered to by dietitians in Malaysia. Based on this workflow, a Web-based system was developed. Initial post-implementation evaluation among 10 dietitians indicates that they are comfortable with the organization of the modules and information. Conclusions The system has the potential of enhancing the quality of services with the provision of standard and healthy menu plans and at the same time increasing outreach, particularly to rural areas. With its potential capability of optimizing the time spent by dietitians to plan suitable menus, more quality time could be spent delivering nutrition education to the patients. PMID:15111270

  13. DietPal: a Web-based dietary menu-generating and management system.

    PubMed

    Noah, Shahrul A; Abdullah, Siti Norulhuda; Shahar, Suzana; Abdul-Hamid, Helmi; Khairudin, Nurkahirizan; Yusoff, Mohamed; Ghazali, Rafidah; Mohd-Yusoff, Nooraini; Shafii, Nik Shanita; Abdul-Manaf, Zaharah

    2004-01-30

    Attempts in current health care practice to make health care more accessible, effective, and efficient through the use of information technology could include implementation of computer-based dietary menu generation. While several such systems already exist, their focus is mainly to assist healthy individuals in calculating their calorie intake and to help monitor the selection of menus based upon a prespecified calorie value. Although these prove to be helpful in some ways, they are not suitable for monitoring, planning, and managing patients' dietary needs and requirements. This paper presents a Web-based application that simulates the process of menu suggestions according to a standard practice employed by dietitians. To model the workflow of dietitians and to develop, based on this workflow, a Web-based system for dietary menu generation and management. The system is intended for use by dietitians or by medical professionals at health centers in rural areas where there are no designated qualified dietitians. First, a user-needs study was conducted among dietitians in Malaysia. The first survey of 93 dietitians (with 52 responding) was an assessment of information needed for dietary management and evaluation of compliance towards a dietary regime. The second study consisted of ethnographic observation and semi-structured interviews with 14 dietitians in order to identify the workflow of a menu-suggestion process. We subsequently designed and developed a Web-based dietary menu generation and management system called DietPal. DietPal has the capability of automatically calculating the nutrient and calorie intake of each patient based on the dietary recall as well as generating suitable diet and menu plans according to the calorie and nutrient requirement of the patient, calculated from anthropometric measurements. The system also allows reusing stored or predefined menus for other patients with similar health and nutrient requirements. We modeled the workflow of menu-suggestion activity currently adhered to by dietitians in Malaysia. Based on this workflow, a Web-based system was developed. Initial post-implementation evaluation among 10 dietitians indicates that they are comfortable with the organization of the modules and information. The system has the potential of enhancing the quality of services with the provision of standard and healthy menu plans and at the same time increasing outreach, particularly to rural areas. With its potential capability of optimizing the time spent by dietitians to plan suitable menus, more quality time could be spent delivering nutrition education to the patients.
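
    Both DietPal records describe deriving calorie and nutrient requirements from anthropometric measurements and reusing stored menus that fit them. The abstracts do not state which equations are used, so the sketch below assumes the Mifflin-St Jeor estimate and a simple activity factor purely for illustration; the menu names and tolerance are hypothetical.

        def estimated_daily_calories(weight_kg, height_cm, age_y, sex, activity_factor=1.4):
            """Resting energy via the Mifflin-St Jeor equation (an assumption here),
            scaled by a simple activity factor."""
            ree = 10 * weight_kg + 6.25 * height_cm - 5 * age_y + (5 if sex == "male" else -161)
            return ree * activity_factor

        def pick_menu(menus, target_kcal, tolerance=0.1):
            """Return stored menus within +/-10% of the target, mimicking menu reuse."""
            return [name for name, kcal in menus.items()
                    if abs(kcal - target_kcal) <= tolerance * target_kcal]

        if __name__ == "__main__":
            target = estimated_daily_calories(70, 165, 45, "female")
            stored_menus = {"menu_A": 1500, "menu_B": 1800, "menu_C": 2100}  # hypothetical
            print(round(target), pick_menu(stored_menus, target))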

  14. Natural Resource Dependency and Decentralized Conservation Within Kanchenjunga Conservation Area Project, Nepal

    NASA Astrophysics Data System (ADS)

    Parker, Pete; Thapa, Brijesh

    2012-02-01

    Kanchenjunga Conservation Area Project (KCAP) in Nepal is among the first protected areas in the world to institute a completely decentralized system of conservation and development. Proponents of decentralized conservation claim that it increases management efficiency, enhances the responsiveness to local needs, and promotes greater equity among local residents. This study assessed local equity by evaluating the levels of dependencies on natural resources among households and the factors affecting that dependency. Data were collected via detailed surveys among 205 randomly selected households within the KCAP. Natural resource dependency was evaluated by comparing the ratio of total household income to income derived from access to natural resources. Economic, social, and access-related variables were employed to determine potential significant predictors of dependency. Overall, households were heavily dependent on natural resources for their income, especially households at higher elevations and those with more adult members. The households that received remittances were most able to supplement their income and, therefore, drastically reduced their reliance on the access to natural resources. Socio-economic variables, such as land holdings, education, caste, and ethnicity, failed to predict dependency. Household participation in KCAP-sponsored training programs also failed to affect household dependency; however, fewer than 20% of the households had any form of direct contact with KCAP personnel within the past year. The success of the KCAP as a decentralized conservation program is contingent on project capacity-building via social mobilization, training programs, and participatory inclusion in decision making to help alleviate the dependency on natural resources.

  15. Impacts of a Large Decentralized Telepathology Network in Canada.

    PubMed

    Pare, Guy; Meyer, Julien; Trudel, Marie-Claude; Tetu, Bernard

    2016-03-01

    Telepathology is a fast-growing segment of the telemedicine field. To date, no research has investigated the impacts of large decentralized telepathology projects on patients, clinicians, and healthcare systems. This study aims to fill this gap. We report a benefits evaluation study of a large decentralized telepathology project deployed in Eastern Quebec, Canada, whose main objective is to provide continuous coverage of intraoperative consultations in remote hospitals without pathologists on-site. The project involves 18 hospitals, making it one of the largest telepathology networks in the world. We conducted 43 semistructured interviews with several telepathology users and hospital managers. Archival data on the impacts of the telepathology project (e.g., number of service disruptions, average time between initial diagnosis and surgery) were also extracted and analyzed. Our findings show that no service disruptions were recorded in hospitals without pathologists following the deployment of telepathology. Surgeons noted that the use of intraoperative consultations enabled by telepathology helped avoid second surgeries and improved accessibility to care services. Telepathology was also perceived by our respondents as having positive impacts on the remote hospitals' ability to retain and recruit surgeons. The observed benefits should not leave the impression that implementing telepathology is a trivial matter. Indeed, many technical, human, and organizational challenges may be encountered. Telepathology can be highly useful in regional hospitals that do not have a pathologist on-site. More research is needed to investigate the challenges and benefits associated with large decentralized telepathology networks.

  16. Natural resource dependency and decentralized conservation within Kanchenjunga Conservation Area Project, Nepal.

    PubMed

    Parker, Pete; Thapa, Brijesh

    2012-02-01

    Kanchenjunga Conservation Area Project (KCAP) in Nepal is among the first protected areas in the world to institute a completely decentralized system of conservation and development. Proponents of decentralized conservation claim that it increases management efficiency, enhances the responsiveness to local needs, and promotes greater equity among local residents. This study assessed local equity by evaluating the levels of dependencies on natural resources among households and the factors affecting that dependency. Data were collected via detailed surveys among 205 randomly selected households within the KCAP. Natural resource dependency was evaluated by comparing the ratio of total household income to income derived from access to natural resources. Economic, social, and access-related variables were employed to determine potential significant predictors of dependency. Overall, households were heavily dependent on natural resources for their income, especially households at higher elevations and those with more adult members. The households that received remittances were most able to supplement their income and, therefore, drastically reduced their reliance on the access to natural resources. Socio-economic variables, such as land holdings, education, caste, and ethnicity, failed to predict dependency. Household participation in KCAP-sponsored training programs also failed to affect household dependency; however, fewer than 20% of the households had any form of direct contact with KCAP personnel within the past year. The success of the KCAP as a decentralized conservation program is contingent on project capacity-building via social mobilization, training programs, and participatory inclusion in decision making to help alleviate the dependency on natural resources.

  17. Patient Experiences of Decentralized HIV Treatment and Care in Plateau State, North Central Nigeria: A Qualitative Study

    PubMed Central

    Kolawole, Grace O.; Gilbert, Hannah N.; Dadem, Nancin Y.; Genberg, Becky L.; Agbaji, Oche O.

    2017-01-01

    Background. Decentralization of care and treatment for HIV infection in Africa makes services available in local health facilities. Decentralization has been associated with improved retention and comparable or superior treatment outcomes, but patient experiences are not well understood. Methods. We conducted a qualitative study of patient experiences in decentralized HIV care in Plateau State, north central Nigeria. Five decentralized care sites in the Plateau State Decentralization Initiative were purposefully selected. Ninety-three patients and 16 providers at these sites participated in individual interviews and focus groups. Data collection activities were audio-recorded and transcribed. Transcripts were inductively content analyzed to derive descriptive categories representing patient experiences of decentralized care. Results. Patient participants in this study experienced the transition to decentralized care as a series of “trade-offs.” Advantages cited included saving time and money on travel to clinic visits, avoiding dangers on the road, and the “family-like atmosphere” found in some decentralized clinics. Disadvantages were loss of access to ancillary services, reduced opportunities for interaction with providers, and increased risk of disclosure. Participants preferred decentralized services overall. Conclusion. Difficulty and cost of travel remain a fundamental barrier to accessing HIV care outside urban centers, suggesting increased availability of community-based services will be enthusiastically received. PMID:28331636

  18. Patient Experiences of Decentralized HIV Treatment and Care in Plateau State, North Central Nigeria: A Qualitative Study.

    PubMed

    Kolawole, Grace O; Gilbert, Hannah N; Dadem, Nancin Y; Genberg, Becky L; Agaba, Patricia A; Okonkwo, Prosper; Agbaji, Oche O; Ware, Norma C

    2017-01-01

    Background. Decentralization of care and treatment for HIV infection in Africa makes services available in local health facilities. Decentralization has been associated with improved retention and comparable or superior treatment outcomes, but patient experiences are not well understood. Methods. We conducted a qualitative study of patient experiences in decentralized HIV care in Plateau State, north central Nigeria. Five decentralized care sites in the Plateau State Decentralization Initiative were purposefully selected. Ninety-three patients and 16 providers at these sites participated in individual interviews and focus groups. Data collection activities were audio-recorded and transcribed. Transcripts were inductively content analyzed to derive descriptive categories representing patient experiences of decentralized care. Results. Patient participants in this study experienced the transition to decentralized care as a series of "trade-offs." Advantages cited included saving time and money on travel to clinic visits, avoiding dangers on the road, and the "family-like atmosphere" found in some decentralized clinics. Disadvantages were loss of access to ancillary services, reduced opportunities for interaction with providers, and increased risk of disclosure. Participants preferred decentralized services overall. Conclusion. Difficulty and cost of travel remain a fundamental barrier to accessing HIV care outside urban centers, suggesting increased availability of community-based services will be enthusiastically received.

  19. Cost-effectiveness of two operational models at industrial wastewater treatment plants in China: a case study in Shengze town, Suzhou City.

    PubMed

    Yuan, Zengwei; Jiang, Weili; Bi, Jun

    2010-10-01

    The widespread illegal discharge of industrial wastewater in China has posed significant challenges to the effective management of industrial wastewater treatment plants (IWTPs) and has caused or exacerbated critical social issues such as trans-boundary environmental pollution. This study examines two operational strategies, a decentralized model and an innovative integrated model, that have been used in the industrial town of Shengze (located in Suzhou City) over the past two decades at IWTPs handling wastewater from the city's dyeing industry. Our cost-effectiveness analysis shows that, although the operational cost of IWTPs under the integrated model is higher than under the original decentralized model, the integrated model has significantly improved IWTP performance and effectively reduced illegal discharge of industrial wastewater. As a result, the number of reported incidents of unacceptable pollution in local receiving water bodies declined from 13 in 2000 to 1 in 2008. Key factors contributing to the success of the innovative integrated model are strong support from municipal and provincial leaders, mandatory ownership transfer of IWTPs to a centralized management body, strong financial incentives for proper plant management, and geographically clustered IWTPs. Copyright (c) 2010 Elsevier Ltd. All rights reserved.

  20. Applications of process improvement techniques to improve workflow in abdominal imaging.

    PubMed

    Tamm, Eric Peter

    2016-03-01

    Major changes in the management and funding of healthcare are underway that will markedly change the way radiology studies will be reimbursed. The result will be the need to deliver radiology services in a highly efficient manner while maintaining quality. The science of process improvement provides a practical approach to improve the processes utilized in radiology. This article will address in a step-by-step manner how to implement process improvement techniques to improve workflow in abdominal imaging.

  1. Automation in an addiction treatment research clinic: computerised contingency management, ecological momentary assessment and a protocol workflow system.

    PubMed

    Vahabzadeh, Massoud; Lin, Jia-Ling; Mezghanni, Mustapha; Epstein, David H; Preston, Kenzie L

    2009-01-01

    A challenge in treatment research is the necessity of adhering to protocol and regulatory strictures while maintaining flexibility to meet patients' treatment needs and to accommodate variations among protocols. Another challenge is the acquisition of large amounts of data in an occasionally hectic environment, along with the provision of seamless methods for exporting, mining and querying the data. We have automated several major functions of our outpatient treatment research clinic for studies in drug abuse and dependence. Here we describe three such specialised applications: the Automated Contingency Management (ACM) system for the delivery of behavioural interventions, the transactional electronic diary (TED) system for the management of behavioural assessments and the Protocol Workflow System (PWS) for computerised workflow automation and guidance of each participant's daily clinic activities. These modules are integrated into our larger information system to enable data sharing in real time among authorised staff. ACM and the TED have each permitted us to conduct research that was not previously possible. In addition, the time to data analysis at the end of each study is substantially shorter. With the implementation of the PWS, we have been able to manage a research clinic with an 80 patient capacity, having an annual average of 18,000 patient visits and 7300 urine collections with a research staff of five. Finally, automated data management has considerably enhanced our ability to monitor and summarise participant safety data for research oversight. When developed in consultation with end users, automation in treatment research clinics can enable more efficient operations, better communication among staff and expansions in research methods.

  2. Strategic Planning for Electronic Resources Management: A Case Study at Gustavus Adolphus College

    ERIC Educational Resources Information Center

    Hulseberg, Anna; Monson, Sarah

    2009-01-01

    Electronic resources, the tools we use to manage them, and the needs and expectations of our users are constantly evolving; at the same time, the roles, responsibilities, and workflow of the library staff who manage e-resources are also in flux. Recognizing a need to be more intentional and proactive about how we manage e-resources, the…

  3. Contribution of Optical Zone Decentration and Pupil Dilation on the Change of Optical Quality After Myopic Photorefractive Keratectomy in a Cat Model

    PubMed Central

    Bühren, Jens; Yoon, Geunyoung; MacRae, Scott; Huxlin, Krystel

    2010-01-01

    PURPOSE To simulate the simultaneous contribution of optical zone decentration and pupil dilation on retinal image quality using wavefront error data from a myopic photorefractive keratectomy (PRK) cat model. METHODS Wavefront error differences were obtained from five cat eyes 19±7 weeks (range: 12 to 24 weeks) after spherical myopic PRK for −6.00 diopters (D) (three eyes) and −10.00 D (two eyes). A computer model was used to simulate decentration of a 6-mm sub-aperture relative to the measured wavefront error difference. Changes in image quality (visual Strehl ratio based on the optical transfer function [VSOTF]) were computed for simulated decentrations from 0 to 1500 μm over pupil diameters of 3.5 to 6.0 mm in 0.5-mm steps. For each eye, a bivariate regression model was applied to calculate the simultaneous contribution of pupil dilation and decentration on the pre- to postoperative change of the log VSOTF. RESULTS Pupil diameter and decentration explained up to 95% of the variance of VSOTF change (adjusted R2=0.95). Pupil diameter had a higher impact on VSOTF (median β=−0.88, P<.001) than decentration (median β= −0.45, P<.001). If decentration-induced lower order aberrations were corrected, the impact of decentration further decreased (β= −0.26) compared to the influence of pupil dilation (β= −0.95). CONCLUSIONS Both pupil dilation and decentration of the optical zone affected the change of retinal image quality (VSOTF) after myopic PRK with decentration exerting a lower impact on VSOTF change. Thus, under physiological conditions pupil dilation is likely to have more effect on VSOTF change after PRK than optical zone decentration. PMID:20229950

  4. Contribution of optical zone decentration and pupil dilation on the change of optical quality after myopic photorefractive keratectomy in a cat model.

    PubMed

    Bühren, Jens; Yoon, Geunyoung; MacRae, Scott; Huxlin, Krystel

    2010-03-01

    To simulate the simultaneous contribution of optical zone decentration and pupil dilation on retinal image quality using wavefront error data from a myopic photorefractive keratectomy (PRK) cat model. Wavefront error differences were obtained from five cat eyes 19±7 weeks (range: 12 to 24 weeks) after spherical myopic PRK for -6.00 diopters (D) (three eyes) and -10.00 D (two eyes). A computer model was used to simulate decentration of a 6-mm sub-aperture relative to the measured wavefront error difference. Changes in image quality (visual Strehl ratio based on the optical transfer function [VSOTF]) were computed for simulated decentrations from 0 to 1500 μm over pupil diameters of 3.5 to 6.0 mm in 0.5-mm steps. For each eye, a bivariate regression model was applied to calculate the simultaneous contribution of pupil dilation and decentration on the pre- to postoperative change of the log VSOTF. Pupil diameter and decentration explained up to 95% of the variance of VSOTF change (adjusted R2=0.95). Pupil diameter had a higher impact on VSOTF (median β=-0.88, P<.001) than decentration (median β=-0.45, P<.001). If decentration-induced lower order aberrations were corrected, the impact of decentration further decreased (β=-0.26) compared to the influence of pupil dilation (β=-0.95). Both pupil dilation and decentration of the optical zone affected the change of retinal image quality (VSOTF) after myopic PRK with decentration exerting a lower impact on VSOTF change. Thus, under physiological conditions pupil dilation is likely to have more effect on VSOTF change after PRK than optical zone decentration. Copyright 2010, SLACK Incorporated.
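
    A minimal sketch of estimating standardized coefficients for such a bivariate regression of the change in log VSOTF on pupil diameter and decentration. The data are synthetic and the effect sizes are chosen only to echo the reported direction and relative strength of the two predictors, not to reproduce the study's values.

        import numpy as np

        rng = np.random.default_rng(0)
        n = 200

        # Synthetic predictors: pupil diameter (mm) and decentration (micrometres).
        pupil = rng.uniform(3.5, 6.0, n)
        decentration = rng.uniform(0.0, 1500.0, n)

        # Synthetic response built so both predictors reduce log VSOTF, with pupil
        # dilation the stronger effect, mirroring the reported direction of results.
        log_vsotf_change = (
            -0.9 * (pupil - pupil.mean()) / pupil.std()
            - 0.4 * (decentration - decentration.mean()) / decentration.std()
            + rng.normal(0.0, 0.2, n)
        )

        def standardized_betas(y, *predictors):
            """Least-squares fit on z-scored variables gives standardized coefficients."""
            z = lambda v: (v - v.mean()) / v.std()
            X = np.column_stack([z(p) for p in predictors] + [np.ones(len(y))])
            coef, *_ = np.linalg.lstsq(X, z(y), rcond=None)
            return coef[:-1]

        beta_pupil, beta_dec = standardized_betas(log_vsotf_change, pupil, decentration)
        print(f"beta(pupil) = {beta_pupil:.2f}, beta(decentration) = {beta_dec:.2f}")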

  5. Big data analytics workflow management for eScience

    NASA Astrophysics Data System (ADS)

    Fiore, Sandro; D'Anca, Alessandro; Palazzo, Cosimo; Elia, Donatello; Mariello, Andrea; Nassisi, Paola; Aloisio, Giovanni

    2015-04-01

    In many domains such as climate and astrophysics, scientific data is often n-dimensional and requires tools that support specialized data types and primitives if it is to be properly stored, accessed, analysed and visualized. Currently, scientific data analytics relies on domain-specific software and libraries providing a huge set of operators and functionalities. However, most of these software tools fail at large scale since they: (i) are desktop based, rely on local computing capabilities and need the data locally; (ii) cannot benefit from available multicore/parallel machines since they are based on sequential codes; (iii) do not provide declarative languages to express scientific data analysis tasks, and (iv) do not provide newer or more scalable storage models to better support the data multidimensionality. Additionally, most of them: (v) are domain-specific, which also means they support a limited set of data formats, and (vi) do not provide workflow support to enable the construction, execution and monitoring of more complex "experiments". The Ophidia project aims to address most of the challenges highlighted above by providing a big data analytics framework for eScience. Ophidia provides several parallel operators to manipulate large datasets. Some relevant examples include: (i) data sub-setting (slicing and dicing), (ii) data aggregation, (iii) array-based primitives (the same operator applies to all the implemented UDF extensions), (iv) data cube duplication, (v) data cube pivoting, (vi) NetCDF import and export. Metadata operators are available too. Additionally, the Ophidia framework provides array-based primitives to perform data sub-setting, data aggregation (i.e. max, min, avg), array concatenation, algebraic expressions and predicate evaluation on large arrays of scientific data. Bit-oriented plugins have also been implemented to manage binary data cubes. Defining processing chains and workflows with tens or hundreds of data analytics operators is the real challenge in many practical scientific use cases. This talk will specifically address the main needs, requirements and challenges regarding data analytics workflow management applied to large scientific datasets. Three real use cases concerning analytics workflows for sea situational awareness, fire danger prevention, climate change and biodiversity will be discussed in detail.
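
    A minimal sketch of two of the primitives listed above, sub-setting and aggregation, applied to a toy time-latitude-longitude cube with NumPy. It is not the Ophidia operator API, and the dimensions and statistics are illustrative.

        import numpy as np

        # Toy 3-D data cube: (time, latitude, longitude), e.g. monthly temperature fields.
        rng = np.random.default_rng(1)
        cube = rng.normal(15.0, 5.0, size=(12, 18, 36))

        def subset(data, time_slice, lat_slice, lon_slice):
            """'Slicing and dicing': restrict the cube along each dimension."""
            return data[time_slice, lat_slice, lon_slice]

        def aggregate(data, how="avg", axis=0):
            """Reduce along one dimension with max, min, or avg."""
            ops = {"max": np.max, "min": np.min, "avg": np.mean}
            return ops[how](data, axis=axis)

        summer = subset(cube, slice(5, 8), slice(None), slice(None))   # three months
        climatology = aggregate(summer, how="avg", axis=0)             # mean over time
        print(summer.shape, climatology.shape)                         # (3, 18, 36) (18, 36)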

  6. SMITH: a LIMS for handling next-generation sequencing workflows

    PubMed Central

    2014-01-01

    Background Life-science laboratories make increasing use of Next Generation Sequencing (NGS) for studying bio-macromolecules and their interactions. Array-based methods for measuring gene expression or protein-DNA interactions are being replaced by RNA-Seq and ChIP-Seq. Sequencing is generally performed by specialized facilities that have to keep track of sequencing requests, trace samples, ensure quality and make data available according to predefined privileges. An integrated tool helps to troubleshoot problems, maintain a high quality standard, and reduce time and costs. Commercial and non-commercial tools called LIMS (Laboratory Information Management Systems) are available for this purpose. However, they often come at prohibitive cost and/or lack the flexibility and scalability needed to adjust seamlessly to the frequently changing protocols employed. In order to manage the flow of sequencing data produced at the Genomic Unit of the Italian Institute of Technology (IIT), we developed SMITH (Sequencing Machine Information Tracking and Handling). Methods SMITH is a web application with a MySQL server at the backend. SMITH was developed by wet-lab scientists of the Centre for Genomic Science and database experts from the Politecnico of Milan in the context of a Genomic Data Model Project. The database schema stores all the information of an NGS experiment, including the descriptions of all protocols and algorithms used in the process. Notably, an attribute-value table allows associating an unconstrained textual description with each sample and all the data produced afterwards. This method permits the creation of metadata that can be used to search the database for specific files as well as for statistical analyses. Results SMITH runs automatically and limits direct human interaction mainly to administrative tasks. SMITH data-delivery procedures were standardized, making it easier for biologists and analysts to navigate the data. Automation also helps save time. The workflows are available through an API provided by the workflow management system. The parameters and input data are passed to the workflow engine, which performs de-multiplexing, quality control, alignments, etc. Conclusions SMITH standardizes, automates, and speeds up sequencing workflows. Annotation of data with key-value pairs facilitates meta-analysis. PMID:25471934

  7. SMITH: a LIMS for handling next-generation sequencing workflows.

    PubMed

    Venco, Francesco; Vaskin, Yuriy; Ceol, Arnaud; Muller, Heiko

    2014-01-01

    Life-science laboratories make increasing use of Next Generation Sequencing (NGS) for studying bio-macromolecules and their interactions. Array-based methods for measuring gene expression or protein-DNA interactions are being replaced by RNA-Seq and ChIP-Seq. Sequencing is generally performed by specialized facilities that have to keep track of sequencing requests, trace samples, ensure quality and make data available according to predefined privileges. An integrated tool helps to troubleshoot problems, maintain a high quality standard, and reduce time and costs. Commercial and non-commercial tools called LIMS (Laboratory Information Management Systems) are available for this purpose. However, they often come at prohibitive cost and/or lack the flexibility and scalability needed to adjust seamlessly to the frequently changing protocols employed. In order to manage the flow of sequencing data produced at the Genomic Unit of the Italian Institute of Technology (IIT), we developed SMITH (Sequencing Machine Information Tracking and Handling). SMITH is a web application with a MySQL server at the backend. SMITH was developed by wet-lab scientists of the Centre for Genomic Science and database experts from the Politecnico of Milan in the context of a Genomic Data Model Project. The database schema stores all the information of an NGS experiment, including the descriptions of all protocols and algorithms used in the process. Notably, an attribute-value table allows associating an unconstrained textual description with each sample and all the data produced afterwards. This method permits the creation of metadata that can be used to search the database for specific files as well as for statistical analyses. SMITH runs automatically and limits direct human interaction mainly to administrative tasks. SMITH data-delivery procedures were standardized, making it easier for biologists and analysts to navigate the data. Automation also helps save time. The workflows are available through an API provided by the workflow management system. The parameters and input data are passed to the workflow engine, which performs de-multiplexing, quality control, alignments, etc. SMITH standardizes, automates, and speeds up sequencing workflows. Annotation of data with key-value pairs facilitates meta-analysis.
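
    A minimal sketch of the attribute-value idea both SMITH records describe, with SQLite standing in for SMITH's MySQL backend. The schema, sample identifiers, and attributes are assumptions chosen only to show how free-form metadata can be queried.

        import sqlite3

        conn = sqlite3.connect(":memory:")  # SQLite stands in for SMITH's MySQL backend
        conn.execute("""CREATE TABLE sample_attributes (
                            sample_id TEXT,
                            attribute TEXT,
                            value     TEXT)""")
        conn.executemany(
            "INSERT INTO sample_attributes VALUES (?, ?, ?)",
            [("S1", "library_type", "ChIP-Seq"),
             ("S1", "antibody",     "H3K4me3"),
             ("S2", "library_type", "RNA-Seq")],
        )

        def samples_with(connection, attribute, value):
            """Find samples whose free-form metadata matches an attribute-value pair."""
            rows = connection.execute(
                "SELECT sample_id FROM sample_attributes WHERE attribute = ? AND value = ?",
                (attribute, value))
            return [r[0] for r in rows]

        print(samples_with(conn, "library_type", "ChIP-Seq"))   # ['S1']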

  8. Toward a geoinformatics framework for understanding the social and biophysical influences on urban nutrient pollution due to residential impervious service connectivity

    NASA Astrophysics Data System (ADS)

    Miles, B.; Band, L. E.

    2012-12-01

    Water sustainability has been recognized as a fundamental problem of science whose solution relies in part on high-performance computing. Stormwater management is a major concern of urban sustainability. Understanding interactions between urban landcover and stormwater nutrient pollution requires consideration of fine-scale residential stormwater management, which in turn requires high-resolution LIDAR and landcover data not provided through national spatial data infrastructure, as well as field observation at the household scale. The objectives of my research are twofold: (1) advance understanding of the relationship between residential stormwater management practices and the export of nutrient pollution from stormwater in urbanized ecosystems; and (2) improve the informatics workflows used in community ecohydrology modeling as applied to heterogeneous urbanized ecosystems. In support of these objectives, I present preliminary results from initial work to: (1) develop an ecohydrology workflow platform that automates data preparation while maintaining data provenance and model metadata to yield reproducible workflows and support model benchmarking; (2) perform field observation of existing patterns of residential rooftop impervious surface connectivity to stormwater networks; and (3) develop Regional Hydro-Ecological Simulation System (RHESSys) models for watersheds in Baltimore, MD (as part of the Baltimore Ecosystem Study (BES) NSF Long-Term Ecological Research (LTER) site) and Durham, NC (as part of the NSF Urban Long-Term Research Area (ULTRA) program); these models will be used to simulate nitrogen loading resulting from both baseline residential rooftop impervious connectivity and for disconnection scenarios (e.g. roof drainage to lawn v. engineered rain garden, upslope v. riparian). This research builds on work done as part of the NSF EarthCube Layered Architecture Concept Award where a RHESSys workflow is being implemented in an iRODS (integrated Rule-Oriented Data System) environment. Modeling the ecohydrology of urban ecosystems in a reliable and reproducible manner requires a flexible scientific workflow platform that allows rapid prototyping with large-scale spatial datasets and model refinement integrating expert knowledge with local datasets and household surveys.

  9. A data management and publication workflow for a large-scale, heterogeneous sensor network.

    PubMed

    Jones, Amber Spackman; Horsburgh, Jeffery S; Reeder, Stephanie L; Ramírez, Maurier; Caraballo, Juan

    2015-06-01

    It is common for hydrology researchers to collect data using in situ sensors at high frequencies, for extended durations, and with spatial distributions that produce data volumes requiring infrastructure for data storage, management, and sharing. The availability and utility of these data in addressing scientific questions related to water availability, water quality, and natural disasters relies on effective cyberinfrastructure that facilitates transformation of raw sensor data into usable data products. It also depends on the ability of researchers to share and access the data in useable formats. In this paper, we describe a data management and publication workflow and software tools for research groups and sites conducting long-term monitoring using in situ sensors. Functionality includes the ability to track monitoring equipment inventory and events related to field maintenance. Linking this information to the observational data is imperative in ensuring the quality of sensor-based data products. We present these tools in the context of a case study for the innovative Urban Transitions and Aridregion Hydrosustainability (iUTAH) sensor network. The iUTAH monitoring network includes sensors at aquatic and terrestrial sites for continuous monitoring of common meteorological variables, snow accumulation and melt, soil moisture, surface water flow, and surface water quality. We present the overall workflow we have developed for effectively transferring data from field monitoring sites to ultimate end-users and describe the software tools we have deployed for storing, managing, and sharing the sensor data. These tools are all open source and available for others to use.
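
    A minimal sketch of linking field-maintenance events to observations so that affected values can be flagged, in the spirit of the workflow described. The data structures, site and variable names, and the flagging rule are hypothetical.

        from datetime import datetime

        # Hypothetical maintenance log: (site, sensor, start, end) of field visits.
        maintenance_events = [
            ("LR_A", "turbidity", datetime(2014, 7, 1, 9, 0), datetime(2014, 7, 1, 11, 30)),
        ]

        def flag_observations(observations, events):
            """Mark observations collected during a maintenance visit as suspect."""
            flagged = []
            for site, sensor, timestamp, value in observations:
                suspect = any(site == s and sensor == v and start <= timestamp <= end
                              for s, v, start, end in events)
                flagged.append((site, sensor, timestamp, value, "suspect" if suspect else "ok"))
            return flagged

        observations = [
            ("LR_A", "turbidity", datetime(2014, 7, 1, 10, 0), 4.2),
            ("LR_A", "turbidity", datetime(2014, 7, 1, 14, 0), 3.9),
        ]
        for row in flag_observations(observations, maintenance_events):
            print(row)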

  10. Purdue ionomics information management system. An integrated functional genomics platform.

    PubMed

    Baxter, Ivan; Ouzzani, Mourad; Orcun, Seza; Kennedy, Brad; Jandhyala, Shrinivas S; Salt, David E

    2007-02-01

    The advent of high-throughput phenotyping technologies has created a deluge of information that is difficult to deal with without the appropriate data management tools. These data management tools should integrate defined workflow controls for genomic-scale data acquisition and validation, data storage and retrieval, and data analysis, indexed around the genomic information of the organism of interest. To maximize the impact of these large datasets, it is critical that they are rapidly disseminated to the broader research community, allowing open access for data mining and discovery. We describe here a system that incorporates such functionalities developed around the Purdue University high-throughput ionomics phenotyping platform. The Purdue Ionomics Information Management System (PiiMS) provides integrated workflow control, data storage, and analysis to facilitate high-throughput data acquisition, along with integrated tools for data search, retrieval, and visualization for hypothesis development. PiiMS is deployed as a World Wide Web-enabled system, allowing for integration of distributed workflow processes and open access to raw data for analysis by numerous laboratories. PiiMS currently contains data on shoot concentrations of P, Ca, K, Mg, Cu, Fe, Zn, Mn, Co, Ni, B, Se, Mo, Na, As, and Cd in over 60,000 shoot tissue samples of Arabidopsis (Arabidopsis thaliana), including ethyl methanesulfonate, fast-neutron and defined T-DNA mutants, and natural accession and populations of recombinant inbred lines from over 800 separate experiments, representing over 1,000,000 fully quantitative elemental concentrations. PiiMS is accessible at www.purdue.edu/dp/ionomics.

  11. The Internet and World-Wide-Web: Potential Benefits to Rural Schools.

    ERIC Educational Resources Information Center

    Barker, Bruce O.

    The Internet is a decentralized collection of computer networks managed by separate groups using a common set of technical standards. The Internet has tremendous potential as an educational resource by providing access to networking through worldwide electronic mail, various databases, and electronic bulletin boards; collaborative investigation…

  12. The Traditional Centralized Model of Institutional Research: Its Derivation & Evolution at One College.

    ERIC Educational Resources Information Center

    Slark, Julie

    A description is provided of Rancho Santiago College's institutional research program, which uses a traditional centralized research model, augmented with alternative, decentralized approaches. First, background information is presented on the college and the role of the research office in management, decision-making, and educational support.…

  13. On Deming and School Quality: A Conversation with Enid Brown.

    ERIC Educational Resources Information Center

    Brandt, Ron

    1992-01-01

    A Deming expert explains that Deming's 14 principles are not a recipe but must be combined with his theory of profound knowledge, which poses essential questions and recognizes the importance of human variation, intrinsic motivation, and external rewards. She also debunks grading, formal teacher evaluation, tracking, and decentralized management. (MLH)

  14. Coping with Downsizing as a Writing and Editing Group.

    ERIC Educational Resources Information Center

    Steve, Mike; Bigelow, Tom

    1993-01-01

    Maintains that writers and editors are likely candidates for downsizing within an organization. Notes that centralization-decentralization factors are valuable in addressing downsizing, as is knowledge of corporate management's point of view toward its investment in writing and editing. Offers five self-assessment scenarios to help prepare for the…

  15. Educational Reform in England and the United States: The Significance of Contextual Differences.

    ERIC Educational Resources Information Center

    Swanson, Austin D.

    1995-01-01

    Explores British and American policy analysts' disparate perceptions about utility of major educational reforms (school-based management, school choice, and national standards). The Tories got carried away with their ideological agenda and have alienated teachers and the public. American educators are debating centralization/decentralization, but…

  16. Incorporating Social and Human Capital into an Experimental Approach to Urban Water Resources Management

    EPA Science Inventory

    To test the benefits of decentralized Green Infrastructure (GI) in an urban setting, we aimed to install GI in the Shepherd Creek Watershed of Cincinnati. The primary stressor in Shepherd Creek is stormwater runoff. An assessment of the total impervious surface area in the waters...

  17. Embracing the Cloud: Six Ways to Look at the Shift to Cloud Computing

    ERIC Educational Resources Information Center

    Ullman, David F.; Haggerty, Blake

    2010-01-01

    Cloud computing is the latest paradigm shift for the delivery of IT services. Where previous paradigms (centralized, decentralized, distributed) were based on fairly straightforward approaches to technology and its management, cloud computing is radical in comparison. The literature on cloud computing, however, suffers from many divergent…

  18. 32 CFR 185.5 - Responsibilities.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... development of an MSCA data base and emergency reporting system, as described in paragraph (j) of this section... parameters of the DoD Resources Data Base (DODRDB) for MSCA, which is described in paragraph (n) of this section. Facilitate use of that data base to support decentralized management of MSCA in time of emergency...

  19. A new framework for modeling decentralized low impact developments using Soil and Water Assessment Tool

    USDA-ARS?s Scientific Manuscript database

    Assessing the performance of Low Impact Development (LID) practices at a catchment scale is important in managing urban watersheds. Few modeling tools exist that are capable of explicitly representing the hydrological mechanisms of LIDs while considering the diverse land uses of urban watersheds. ...

  20. Human-Guided Management of Collaborating Unmanned Vehicles in Degraded Communication Environments

    DTIC Science & Technology

    2010-05-01

    system operated by the U.S. Air Force exemplifies the utility of unmanned systems. Predator drones have been employed extensively in intelligence...Massachusetts, USA), 2005. [50] Flint, M., T. Khobanova, and M. Curry, "Decentralized control using global optimization," in Proceedings of the AIAA

  1. Distributed Leadership and Contract Relations: Evidence from Turkish High Schools

    ERIC Educational Resources Information Center

    Özdemir, Murat; Demircioglu, Ebru

    2015-01-01

    In recent years, educational organizations have begun to be administered according to principles of greater sharing, participation, and democracy. The school-based management approach, which accelerated during the period of decentralization in education, is also seen as a cause of the spread of leadership throughout the school. This trend is reflected in the educational…

  2. Restructuring American Schools: The Promise and the Pitfalls. Conference Paper No. 10.

    ERIC Educational Resources Information Center

    McDonnell, Lorraine M.

    Poor educational performance and the changing nature of work and workers have prompted calls for a major restructuring of American schools. The following broad categories of restructuring options are discussed and supporting research is reviewed: (1) decentralizing authority over schooling through school-based management, more professional…

  3. Ten Years of Experience with a Performance-Based Promotional Selection and Career Development System within State Government.

    ERIC Educational Resources Information Center

    Baugher, Dan; And Others

    1994-01-01

    The New York State Division of Budget uses a decentralized system to assess promotion candidates by comparing their training, experience, and recent performance to the proposed position. Managers and candidates find the system more effective than traditional written/oral exams. (SK)

  4. Is "Closing the Gap" Enough? Ngarrindjeri Ontologies, Reconciliation and Caring for Country

    ERIC Educational Resources Information Center

    Rigney, Daryle; Hemming, Steve

    2014-01-01

    This article is concerned with Ngarrindjeri nation building in the "contact zone" with the Australian settler state by decentring the colonizer within a range of bureaucratic regimes. Ngarrindjeri engagement with natural resource and cultural heritage management will be used to illustrate the relationship between globalization, community…

  5. A Coordinated Decentralized Approach to Online Project Development

    ERIC Educational Resources Information Center

    Mykota, David

    2013-01-01

    With the growth rate of online learning outpacing traditional face-to-face instruction, universities are beginning to recognize the importance of strategic planning in its development. Making the case for online learning requires sound project management practices and an understanding of the business models on which it is predicated. The objective…

  6. Friction Point Rating: A Blueprint for Selective Decentralization in School Systems

    ERIC Educational Resources Information Center

    Ponder, A. A.; Bulcock, J. W.

    1976-01-01

    Describes a survey of elementary teachers in Newfoundland that investigated the extent to which teachers saw themselves as participants in educational decision-making compared to their desired degree of involvement. (Available from Canadian Administrator Business Manager, Department of Educational Administration, The University of Alberta,…

  7. Who Benefits from School-Based Management in Mexico?

    ERIC Educational Resources Information Center

    Reimers, Fernando; Cardenas, Sergio

    2007-01-01

    In this article the authors examine evidence pertaining to the implementation of a national programme of school-based decentralization, the Quality Schools Programme ("Programa de Escuelas de Calidad"). The main argument of this article is that high levels of inequality in the institutional capacity of different schools and in the…

  8. An Example of Decentralized Management in Education: Provincial Directory Model

    ERIC Educational Resources Information Center

    Ada, Sefer; Baysal, Z. Nurdan; Erkan, Senem Seda Sahenk

    2016-01-01

    In Turkey, two types of administrative structures existed in the fields of National Education: "the central" and "provincial" institutions. However, between 1926-1931, the Locality model was implemented. Locality can be considered as a local administration formed in the provincial organization of the Ministry of Education by…

  9. Decentralisation and Regionalisation in Educational Administration: Comparisons of Venezuela, Colombia and Spain.

    ERIC Educational Resources Information Center

    Hanson, E. Mark

    1989-01-01

    Compares administrative reforms and decentralization in the public educational systems of 3 Hispanic countries 10 years after transition to democracy. Discusses the effects of collaboration and compromise among political parties, incremental approaches to change, continuity of policies, costs and resource management, and formalization of…

  10. Rethinking Partnerships on a Decentralized Campus

    ERIC Educational Resources Information Center

    Dufault, Katie H.

    2017-01-01

    Decentralization is an effective approach for structuring campus learning and success centers. McShane & Von Glinow (2007) describe decentralization as "an organizational model where decision authority and power are dispersed among units rather than held by a single small group of administrators" (p. 237). A decentralized structure…

  11. Grid-based platform for training in Earth Observation

    NASA Astrophysics Data System (ADS)

    Petcu, Dana; Zaharie, Daniela; Panica, Silviu; Frincu, Marc; Neagul, Marian; Gorgan, Dorian; Stefanut, Teodor

    2010-05-01

    The GiSHEO platform [1], which provides on-demand services for training and higher education in Earth Observation, is being developed in the frame of an ESA-funded project through its PECS programme to respond to the need for powerful education resources in the remote sensing field. It is intended as a Grid-based platform whose potential for experimentation and extensibility are the key benefits compared with a desktop software solution. Near-real-time applications requiring multiple simultaneous short-response-time, data-intensive tasks, as in the case of a short training event, have proved to be ideal for this platform. The platform is based on Globus Toolkit 4 facilities for security and process management, and on the clusters of four academic institutions involved in the project. The authorization uses a VOMS service. The main public services are the following: the EO processing services (represented through special WSRF-type services); the workflow service exposing a particular workflow engine; the data indexing and discovery service for accessing the data management mechanisms; and the processing services, a collection allowing easy access to the processing platform. The WSRF-type services for basic satellite image processing reuse free image processing tools, OpenCV and GDAL. New algorithms and workflows were developed to tackle challenging problems such as detecting the underground remains of old fortifications, walls, or houses. More details can be found in [2]. Composed services can be specified through workflows and are easy to deploy. The workflow engine, OSyRIS (Orchestration System using a Rule based Inference Solution), is based on DROOLS, and a new rule-based workflow language, SILK (SImple Language for worKflow), has been built. Workflow creation in SILK can be done with or without visual design tools. The basic elements of SILK are tasks and the relations (rules) between them. It is similar to the SCUFL language, but does not rely on XML, in order to allow the introduction of more workflow-specific features. Moreover, an event-condition-action (ECA) approach allows greater flexibility when expressing data and task dependencies, as well as the creation of adaptive workflows which can react to changes in the configuration of the Grid or in the workflow itself. Changes inside the Grid are handled by creating specific rules which allow resource selection based on various task scheduling criteria. Modifications of the workflow are usually accomplished either by inserting or retracting rules belonging to it at runtime, or by modifying the executor of a task in case a better one is found. The former implies changes in the workflow's structure, while the latter does not necessarily mean a change of resource but, more precisely, a change of the algorithm used to solve the task. More details can be found in [3]. Another important platform component is the data indexing and storage service, GDIS, which provides features for data storage, indexing data using a specialized RDBMS, finding data by various conditions, querying external services, and keeping track of temporary data generated by other components. The data storage component of GDIS is responsible for storing the data using available storage backends such as local disk file systems (ext3), local cluster storage (GFS), or distributed file systems (HDFS).
A front-end GridFTP service is capable of interacting with the storage domains on behalf of the clients in a uniform way, and also enforces the security restrictions related to data access that are provided by other specialized services. The data indexing is performed by PostGIS. An advanced and flexible interface for searching the project's geographical repository is built around a custom query language (LLQL - Lisp Like Query Language) designed to provide fine-grained access to the data in the repository and to query external services (e.g. for exploiting the connection with the GENESI-DR catalog). More details can be found in [4]. The Workload Management System (WMS) provides two types of resource managers. The first one will be based on Condor HTC and use Condor as a job manager for task dispatching and working nodes (for development purposes), while the second one will use GT4 GRAM (for production purposes). The WMS main component, the Grid Task Dispatcher (GTD), is responsible for the interaction with other internal services, such as the composition engine, in order to facilitate access to the processing platform. Its main responsibilities are to receive tasks from the workflow engine or directly from the user interface, to use a task description language (the ClassAd meta-language in the case of Condor HTC) for job units, to submit and check the status of jobs inside the workload management system, and to retrieve job logs for debugging purposes. More details can be found in [4]. A particular component of the platform is eGLE, the eLearning environment. It provides the functionality necessary to create the visual appearance of the lessons through the use of visual containers such as tools, patterns, and templates. The teacher uses the platform for testing already created lessons, as well as for developing new lesson resources, such as new images and workflows describing graph-based processing. The students execute the lessons or describe and experiment with new workflows or different data. The eGLE database includes several workflow-based lesson descriptions, teaching materials and lesson resources, and selected satellite and spatial data. More details can be found in [5]. A first training event using the platform was organized in September 2009 during the 11th SYNASC symposium (links to the demos, testing interface, and exercises are available on the project site [1]). The eGLE component was presented at the 4th GPC conference in May 2009. Moreover, the functionality of the platform will be presented as a demo in April 2010 at the 5th EGEE User Forum. References: [1] GiSHEO consortium, Project site, http://gisheo.info.uvt.ro [2] D. Petcu, D. Zaharie, M. Neagul, S. Panica, M. Frincu, D. Gorgan, T. Stefanut, V. Bacu, Remote Sensed Image Processing on Grids for Training in Earth Observation. In Image Processing, V. Kordic (ed.), In-Tech, January 2010. [3] M. Neagul, S. Panica, D. Petcu, D. Zaharie, D. Gorgan, Web and Grid Services for Training in Earth Observation, IDAACS 2009, IEEE Computer Press, 241-246. [4] M. Frincu, S. Panica, M. Neagul, D. Petcu, GiSHEO: On Demand Grid Service Based Platform for EO Data Processing, HiperGrid 2009, Politehnica Press, 415-422. [5] D. Gorgan, T. Stefanut, V. Bacu, Grid Based Training Environment for Earth Observation, GPC 2009, LNCS 5529, 98-109.
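    As a rough illustration of the event-condition-action idea used for adaptive workflows in the abstract above, the following Python sketch (generic, with invented task and resource names; not OSyRIS/SILK syntax) shows a rule that reacts to a change in the Grid configuration by re-assigning the executor of a pending task.

```python
# Minimal event-condition-action (ECA) rule engine sketch for adaptive workflows.
# Hypothetical task and resource names; not the OSyRIS/SILK implementation.

workflow = {"classify_image": {"executor": "cluster-A", "state": "pending"}}

def condition_resource_down(event, wf):
    """Condition: a resource went down and some pending task still targets it."""
    return event["type"] == "resource_down" and any(
        t["executor"] == event["resource"] and t["state"] == "pending"
        for t in wf.values())

def action_reassign(event, wf):
    """Action: move affected pending tasks to the fallback resource."""
    for name, task in wf.items():
        if task["executor"] == event["resource"] and task["state"] == "pending":
            task["executor"] = event["fallback"]
            print(f"re-assigned {name} to {task['executor']}")

rules = [(condition_resource_down, action_reassign)]

def handle(event, wf):
    for cond, act in rules:
        if cond(event, wf):   # evaluate the condition part of each ECA rule
            act(event, wf)    # fire the action part

handle({"type": "resource_down", "resource": "cluster-A", "fallback": "cluster-B"},
       workflow)
```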

  12. A Foundation for Enterprise Imaging: HIMSS-SIIM Collaborative White Paper.

    PubMed

    Roth, Christopher J; Lannum, Louis M; Persons, Kenneth R

    2016-10-01

    Care providers today routinely obtain valuable clinical multimedia with mobile devices, scope cameras, ultrasound, and many other modalities at the point of care. Image capture and storage workflows may be heterogeneous across an enterprise, and as a result, they often are not well incorporated in the electronic health record. Enterprise Imaging refers to a set of strategies, initiatives, and workflows implemented across a healthcare enterprise to consistently and optimally capture, index, manage, store, distribute, view, exchange, and analyze all clinical imaging and multimedia content to enhance the electronic health record. This paper is intended to introduce Enterprise Imaging as an important initiative to clinical and informatics leadership, and outline its key elements of governance, strategy, infrastructure, common multimedia content, acquisition workflows, enterprise image viewers, and image exchange services.

  13. Systematic Redaction for Neuroimage Data

    PubMed Central

    Matlock, Matt; Schimke, Nakeisha; Kong, Liang; Macke, Stephen; Hale, John

    2013-01-01

    In neuroscience, collaboration and data sharing are undermined by concerns over the management of protected health information (PHI) and personal identifying information (PII) in neuroimage datasets. The HIPAA Privacy Rule mandates measures for the preservation of subject privacy in neuroimaging studies. Unfortunately for the researcher, the management of information privacy is a burdensome task. Wide scale data sharing of neuroimages is challenging for three primary reasons: (i) A dearth of tools to systematically expunge PHI/PII from neuroimage data sets, (ii) a facility for tracking patient identities in redacted datasets has not been produced, and (iii) a sanitization workflow remains conspicuously absent. This article describes the XNAT Redaction Toolkit—an integrated redaction workflow which extends a popular neuroimage data management toolkit to remove PHI/PII from neuroimages. Quickshear defacing is also presented as a complementary technique for deidentifying the image data itself. Together, these tools improve subject privacy through systematic removal of PII/PHI. PMID:24179597
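    A toy illustration of the metadata side of such a redaction step follows (hypothetical field names; the actual XNAT Redaction Toolkit and Quickshear defacing operate on real imaging data and headers): PHI/PII keys are removed and the subject identity is replaced with a salted hash so that redacted records can still be tracked internally.

```python
import hashlib

# Hypothetical neuroimage session metadata containing PHI/PII.
session = {
    "subject_name": "Jane Doe",
    "birth_date": "1980-04-02",
    "scan_date": "2012-11-05",
    "scanner": "3T-Siemens",
    "voxel_size_mm": [1.0, 1.0, 1.0],
}

PHI_FIELDS = {"subject_name", "birth_date"}

def redact(record, salt="local-secret"):
    """Remove PHI fields and replace the identity with a salted hash code."""
    redacted = {k: v for k, v in record.items() if k not in PHI_FIELDS}
    identity = record.get("subject_name", "")
    redacted["subject_code"] = hashlib.sha256(
        (salt + identity).encode()).hexdigest()[:12]
    return redacted

print(redact(session))
```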

  14. Software Project Management and Measurement on the World-Wide-Web (WWW)

    NASA Technical Reports Server (NTRS)

    Callahan, John; Ramakrishnan, Sudhaka

    1996-01-01

    We briefly describe a system for forms-based, work-flow management that helps members of a software development team overcome geographical barriers to collaboration. Our system, called the Web Integrated Software Environment (WISE), is implemented as a World-Wide-Web service that allows for management and measurement of software development projects based on dynamic analysis of change activity in the workflow. WISE tracks issues in a software development process, provides informal communication between the users with different roles, supports to-do lists, and helps in software process improvement. WISE minimizes the time devoted to metrics collection and analysis by providing implicit delivery of messages between users based on the content of project documents. The use of a database in WISE is hidden from the users who view WISE as maintaining a personal 'to-do list' of tasks related to the many projects on which they may play different roles.

  15. [Implementation of modern operating room management -- experiences made at an university hospital].

    PubMed

    Hensel, M; Wauer, H; Bloch, A; Volk, T; Kox, W J; Spies, C

    2005-07-01

    Owing to structural changes in health care, the general need for cost control is evident for all hospitals. As the operating room is one of the most cost-intensive sectors in a hospital, optimisation of workflow processes in this area is of particular interest for health care providers. While modern operating room management is already established in several clinics, others are less prepared for economic challenges. Therefore, the operating room statute of the Charité university hospital, which may be useful for other hospitals developing their own concepts, is presented. In addition, experiences made with the implementation of new management structures are described, and results obtained over the last 5 years are reported. Whereas the total number of operating procedures increased by 15%, operating room utilization increased more markedly in terms of both time and cases. Summarizing the results, central operating room management has proved to be an effective tool for increasing the efficiency of workflow processes in the operating room.

  16. Comparative Perspectives on Educational Decentralization: An Exercise in Contradiction?

    ERIC Educational Resources Information Center

    Weiler, Hans N.

    1990-01-01

    It is argued that policies decentralizing the governance of educational systems, although appealing in the abstract, tend to be fundamentally ambivalent and in conflict with powerful forces favoring centralization. Tensions surrounding the issue of decentralization are discussed, with emphasis on the relationship between decentralization and…

  17. Does Decentralization Improve Health System Performance and Outcomes in Low- and Middle-Income Countries? A Systematic Review of Evidence From Quantitative Studies.

    PubMed

    Dwicaksono, Adenantera; Fox, Ashley M

    2018-06-01

    Policy Points: For more than 3 decades, international development agencies have advocated health system decentralization to improve health system performance in low- and middle-income countries. We found little rigorous evidence documenting the impact of decentralization processes on health system performance or outcomes in part due to challenges in measuring such far-reaching and multifaceted system-level changes. We propose a renewed research agenda that focuses on discrete definitions of decentralization and how institutional factors and mechanisms affect health system performance and outcomes within the general context of decentralized governance structures. Despite the widespread adoption of decentralization reforms as a means to improve public service delivery in developing countries since the 1980s, empirical evidence of the role of decentralization on health system improvement is still limited and inconclusive. This study reviewed studies published from 2000 to 2016 with adequate research designs to identify evidence on whether and how decentralization processes have impacted health systems. We conducted a systematic review of peer-reviewed journal articles from the public health and social science literature. We searched for articles within 9 databases using predefined search terms reflecting decentralization and health system constructs. Inclusion criteria were original research articles, low- and middle-income country settings, quantifiable outcome measures, and study designs that use comparisons or statistical adjustments. We excluded studies in high-income country settings and/or published in a non-English language. Sixteen studies met our prespecified inclusion and exclusion criteria and were grouped based on outcomes measured: health system inputs (n = 3), performance (n = 7), and health outcomes (n = 7). Numerous studies addressing conceptual issues related to decentralization but without any attempt at empirical estimation were excluded. Overall, we found mixed results regarding the effects of decentralization on health system indicators with seemingly beneficial effects on health system performance and health outcomes. Only 10 studies were considered to have relatively low risks of bias. This study reveals the limited empirical knowledge of the impact of decentralization on health system performance. Mixed empirical findings on the role of decentralization on health system performance and outcomes highlight the complexity of decentralization processes and their systemwide effects. Thus, we propose a renewed research agenda that focuses on discrete definitions of decentralization and how institutional factors and mechanisms affect health system performance and outcomes within the general context of decentralized governance structures. © 2018 Milbank Memorial Fund.

  18. Theory and applications survey of decentralized control methods

    NASA Technical Reports Server (NTRS)

    Athans, M.

    1975-01-01

    A nonmathematical overview is presented of trends in the general area of decentralized control strategies which are suitable for hierarchical systems. Advances in decentralized system theory are closely related to advances in the so-called stochastic control problem with nonclassical information pattern. The basic assumptions and mathematical tools pertaining to the classical stochastic control problem are outlined. Particular attention is devoted to pitfalls in the mathematical problem formulation for decentralized control. Major conclusions are that any purely deterministic approach to multilevel hierarchical dynamic systems is unlikely to lead to realistic theories or designs, that the flow of measurements and decisions in a decentralized system should not be instantaneous and error-free, and that delays in information exchange in a decentralized system lead to reasonable approaches to decentralized control. A mathematically precise notion of aggregating information is not yet available.

  19. Task–Technology Fit of Video Telehealth for Nurses in an Outpatient Clinic Setting

    PubMed Central

    Finkelstein, Stanley M.

    2014-01-01

    Abstract Background: Incorporating telehealth into outpatient care delivery supports management of consumer health between clinic visits. Task–technology fit is a framework for understanding how technology helps and/or hinders a person during work processes. Evaluating the task–technology fit of video telehealth for personnel working in a pediatric outpatient clinic and providing care between clinic visits ensures the information provided matches the information needed to support work processes. Materials and Methods: The workflow of advanced practice registered nurse (APRN) care coordination provided via telephone and video telehealth was described and measured using a mixed-methods workflow analysis protocol that incorporated cognitive ethnography and time–motion study. Qualitative and quantitative results were merged and analyzed within the task–technology fit framework to determine the workflow fit of video telehealth for APRN care coordination. Results: Incorporating video telehealth into APRN care coordination workflow provided visual information unavailable during telephone interactions. Despite additional tasks and interactions needed to obtain the visual information, APRN workflow efficiency, as measured by time, was not significantly changed. Analyzed within the task–technology fit framework, the increased visual information afforded by video telehealth supported the assessment and diagnostic information needs of the APRN. Conclusions: Telehealth must provide the right information to the right clinician at the right time. Evaluating task–technology fit using a mixed-methods protocol ensured rigorous analysis of fit within work processes and identified workflows that benefit most from the technology. PMID:24841219

  20. A Mixed-Methods Research Framework for Healthcare Process Improvement.

    PubMed

    Bastian, Nathaniel D; Munoz, David; Ventura, Marta

    2016-01-01

    The healthcare system in the United States is spiraling out of control due to ever-increasing costs without significant improvements in quality, access to care, satisfaction, and efficiency. Efficient workflow is paramount to improving healthcare value while maintaining the utmost standards of patient care and provider satisfaction in high stress environments. This article provides healthcare managers and quality engineers with a practical healthcare process improvement framework to assess, measure and improve clinical workflow processes. The proposed mixed-methods research framework integrates qualitative and quantitative tools to foster the improvement of processes and workflow in a systematic way. The framework consists of three distinct phases: 1) stakeholder analysis, 2a) survey design, 2b) time-motion study, and 3) process improvement. The proposed framework is applied to the pediatric intensive care unit of the Penn State Hershey Children's Hospital. The implementation of this methodology led to identification and categorization of different workflow tasks and activities into both value-added and non-value added in an effort to provide more valuable and higher quality patient care. Based upon the lessons learned from the case study, the three-phase methodology provides a better, broader, leaner, and holistic assessment of clinical workflow. The proposed framework can be implemented in various healthcare settings to support continuous improvement efforts in which complexity is a daily element that impacts workflow. We proffer a general methodology for process improvement in a healthcare setting, providing decision makers and stakeholders with a useful framework to help their organizations improve efficiency. Published by Elsevier Inc.

  1. Semi-automated analysis of high-resolution aerial images to quantify docks in Upper Midwest glacial lakes

    USGS Publications Warehouse

    Beck, Marcus W.; Vondracek, Bruce C.; Hatch, Lorin K.; Vinje, Jason

    2013-01-01

    Lake resources can be negatively affected by environmental stressors originating from multiple sources and different spatial scales. Shoreline development, in particular, can negatively affect lake resources through decline in habitat quality, physical disturbance, and impacts on fisheries. The development of remote sensing techniques that efficiently characterize shoreline development in a regional context could greatly improve management approaches for protecting and restoring lake resources. The goal of this study was to develop an approach using high-resolution aerial photographs to quantify and assess docks as indicators of shoreline development. First, we describe a dock analysis workflow that can be used to quantify the spatial extent of docks using aerial images. Our approach incorporates pixel-based classifiers with object-based techniques to effectively analyze high-resolution digital imagery. Second, we apply the analysis workflow to quantify docks for 4261 lakes managed by the Minnesota Department of Natural Resources. Overall accuracy of the analysis results was 98.4% (87.7% based on ) after manual post-processing. The analysis workflow was also 74% more efficient than the time required for manual digitization of docks. These analyses have immediate relevance for resource planning in Minnesota, whereas the dock analysis workflow could be used to quantify shoreline development in other regions with comparable imagery. These data can also be used to better understand the effects of shoreline development on aquatic resources and to evaluate the effects of shoreline development relative to other stressors.
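    A highly simplified sketch of the two-stage idea, pixel-based classification followed by object-based grouping, is shown below; the image values, brightness threshold, and minimum object size are invented for illustration, and numpy/scipy are assumed to be available.

```python
import numpy as np
from scipy import ndimage

# Hypothetical single-band aerial image chip (bright pixels ~ dock material).
image = np.array([
    [10, 12, 11, 10, 10],
    [11, 95, 96, 12, 10],
    [10, 94, 97, 11, 10],
    [10, 11, 10, 88, 90],
    [10, 10, 10, 89, 91],
], dtype=float)

# Stage 1: pixel-based classification with a simple brightness threshold.
dock_mask = image > 50

# Stage 2: object-based step -- group connected pixels into candidate docks
# and drop objects that are too small to be real structures.
labels, n_objects = ndimage.label(dock_mask)
sizes = ndimage.sum(dock_mask, labels, index=range(1, n_objects + 1))
docks = [i + 1 for i, s in enumerate(sizes) if s >= 3]

print(f"candidate objects: {n_objects}, retained as docks: {len(docks)}")
```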

  2. [Applications of the hospital statistics management system].

    PubMed

    Zhai, Hong; Ren, Yong; Liu, Jing; Li, You-Zhang; Ma, Xiao-Long; Jiao, Tao-Tao

    2008-01-01

    The Hospital Statistics Management System is built on an office automation platform of the Shandong provincial hospital system. Its workflow, role, and permission-management technologies are used to standardize and optimize the statistics management program within the total quality control of hospital statistics. The system's applications combine the office automation platform with statistics management in a hospital, providing a practical example of a modern hospital statistics management model.

  3. A virtual data language and system for scientific workflow management in data grid environments

    NASA Astrophysics Data System (ADS)

    Zhao, Yong

    With advances in scientific instrumentation and simulation, scientific data is growing fast in both size and analysis complexity. So-called Data Grids aim to provide high performance, distributed data analysis infrastructure for data-intensive sciences, where scientists distributed worldwide need to extract information from large collections of data, and to share both data products and the resources needed to produce and store them. However, the description, composition, and execution of even logically simple scientific workflows are often complicated by the need to deal with "messy" issues like heterogeneous storage formats and ad-hoc file system structures. We show how these difficulties can be overcome via a typed workflow notation called virtual data language, within which issues of physical representation are cleanly separated from logical typing, and by the implementation of this notation within the context of a powerful virtual data system that supports distributed execution. The resulting language and system are capable of expressing complex workflows in a simple compact form, enacting those workflows in distributed environments, monitoring and recording the execution processes, and tracing the derivation history of data products. We describe the motivation, design, implementation, and evaluation of the virtual data language and system, and the application of the virtual data paradigm in various science disciplines, including astronomy and cognitive neuroscience.
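    The separation of logical typing from physical representation can be illustrated with a small Python sketch (hypothetical dataset types and file layouts; this is not the virtual data language itself): the workflow is written against a logical dataset type, while a mapper resolves each logical member to whatever files happen to hold the data.

```python
from dataclasses import dataclass
from typing import List

# Logical type: what the workflow reasons about.
@dataclass
class ImageCollection:
    subject: str
    members: List[str]          # logical member names, not file paths

# Physical mapping: where the data actually lives (ad-hoc layouts hidden here).
PHYSICAL_LAYOUT = {
    ("subj01", "anat"): ["/archive/s01/anat_001.img", "/archive/s01/anat_001.hdr"],
    ("subj01", "func"): ["/scratch/runs/s01_func.nii.gz"],
}

def resolve(dataset: ImageCollection):
    """Map a logical dataset to the physical files backing each member."""
    return {m: PHYSICAL_LAYOUT[(dataset.subject, m)] for m in dataset.members}

ds = ImageCollection(subject="subj01", members=["anat", "func"])
print(resolve(ds))
```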

  4. A knowledge-based decision support system in bioinformatics: an application to protein complex extraction

    PubMed Central

    2013-01-01

    Background We introduce a Knowledge-based Decision Support System (KDSS) in order to face the Protein Complex Extraction issue. Using a Knowledge Base (KB) coding the expertise about the proposed scenario, our KDSS is able to suggest both strategies and tools, according to the features of input dataset. Our system provides a navigable workflow for the current experiment and furthermore it offers support in the configuration and running of every processing component of that workflow. This last feature makes our system a crossover between classical DSS and Workflow Management Systems. Results We briefly present the KDSS' architecture and basic concepts used in the design of the knowledge base and the reasoning component. The system is then tested using a subset of Saccharomyces cerevisiae Protein-Protein interaction dataset. We used this subset because it has been well studied in literature by several research groups in the field of complex extraction: in this way we could easily compare the results obtained through our KDSS with theirs. Our system suggests both a preprocessing and a clustering strategy, and for each of them it proposes and eventually runs suited algorithms. Our system's final results are then composed of a workflow of tasks, that can be reused for other experiments, and the specific numerical results for that particular trial. Conclusions The proposed approach, using the KDSS' knowledge base, provides a novel workflow that gives the best results with regard to the other workflows produced by the system. This workflow and its numeric results have been compared with other approaches about PPI network analysis found in literature, offering similar results. PMID:23368995

  5. Decentralized Hypothesis Testing in Energy Harvesting Wireless Sensor Networks

    NASA Astrophysics Data System (ADS)

    Tarighati, Alla; Gross, James; Jalden, Joakim

    2017-09-01

    We consider the problem of decentralized hypothesis testing in a network of energy harvesting sensors, where sensors make noisy observations of a phenomenon and send quantized information about the phenomenon towards a fusion center. The fusion center makes a decision about the present hypothesis using the aggregate received data during a time interval. We explicitly consider a scenario under which the messages are sent through parallel access channels towards the fusion center. To avoid limited lifetime issues, we assume each sensor is capable of harvesting all the energy it needs for the communication from the environment. Each sensor has an energy buffer (battery) to save its harvested energy for use in other time intervals. Our key contribution is to formulate the problem of decentralized detection in a sensor network with energy harvesting devices. Our analysis is based on a queuing-theoretic model for the battery and we propose a sensor decision design method by considering long term energy management at the sensors. We show how the performance of the system changes for different battery capacities. We then numerically show how our findings can be used in the design of sensor networks with energy harvesting sensors.
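    A toy simulation of the queuing view of the sensor battery follows (invented harvesting probability, battery capacity, and transmission cost; not the authors' model): energy arrivals fill a finite buffer, and a quantized message is sent toward the fusion center only when enough energy is stored.

```python
import random

random.seed(0)

CAPACITY = 5        # battery capacity in energy units (hypothetical)
TX_COST = 2         # energy needed to transmit one quantized message
P_HARVEST = 0.6     # probability of harvesting one unit per time slot

battery, sent, silent = 0, 0, 0
for slot in range(1000):
    # Energy arrival: the battery behaves like a finite queue of energy units.
    if random.random() < P_HARVEST:
        battery = min(CAPACITY, battery + 1)
    # Transmission decision: report the local quantized observation if affordable.
    if battery >= TX_COST:
        battery -= TX_COST
        sent += 1
    else:
        silent += 1     # observation not reported this slot

print(f"reported in {sent} slots, stayed silent in {silent} slots")
```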

  6. Leveraging workflow control patterns in the domain of clinical practice guidelines.

    PubMed

    Kaiser, Katharina; Marcos, Mar

    2016-02-10

    Clinical practice guidelines (CPGs) include recommendations describing appropriate care for the management of patients with a specific clinical condition. A number of representation languages have been developed to support executable CPGs, with associated authoring/editing tools. Even with tool assistance, authoring of CPG models is a labor-intensive task. We aim at facilitating the early stages of the CPG modeling task. In this context, we propose to support the authoring of CPG models based on a set of suitable procedural patterns described in an implementation-independent notation that can then be semi-automatically transformed into one of the alternative executable CPG languages. We have started with the workflow control patterns which have been identified in the fields of workflow systems and business process management. We have analyzed the suitability of these patterns by means of a qualitative analysis of CPG texts. Following our analysis, we have implemented a selection of workflow patterns in the Asbru and PROforma CPG languages. As the implementation-independent notation for the description of patterns, we have chosen BPMN 2.0. Finally, we have developed XSLT transformations to convert the BPMN 2.0 version of the patterns into the Asbru and PROforma languages. We showed that although a significant number of workflow control patterns are suitable to describe CPG procedural knowledge, not all of them are applicable in the context of CPGs due to their focus on single-patient care. Moreover, CPGs may require additional patterns not included in the set of workflow control patterns. We also showed that nearly all the CPG-suitable patterns can be conveniently implemented in the Asbru and PROforma languages. Finally, we demonstrated that individual patterns can be semi-automatically transformed from a process specification in BPMN 2.0 to executable implementations in these languages. We propose a pattern- and transformation-based approach for the development of CPG models. Such an approach can form the basis of a valid framework for the authoring of CPG models. The identification of adequate patterns and the implementation of transformations to convert patterns from a process specification into different executable implementations are the first necessary steps for our approach.
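    A minimal sketch of the transformation step is given below, assuming the lxml library is available and using a made-up XML fragment; the element names are invented placeholders rather than real BPMN 2.0, Asbru, or PROforma markup.

```python
from lxml import etree

# Invented source fragment standing in for a BPMN-like pattern description.
source = etree.XML("<pattern><task name='assess'/><task name='treat'/></pattern>")

# Invented stylesheet standing in for a BPMN-to-target-language transformation.
stylesheet = etree.XML("""
<xsl:stylesheet version="1.0"
                xmlns:xsl="http://www.w3.org/1999/XSL/Transform">
  <xsl:template match="/pattern">
    <plan>
      <xsl:for-each select="task">
        <step><xsl:value-of select="@name"/></step>
      </xsl:for-each>
    </plan>
  </xsl:template>
</xsl:stylesheet>
""")

# Apply the XSLT transformation and print the resulting target-language skeleton.
transform = etree.XSLT(stylesheet)
print(etree.tostring(transform(source), pretty_print=True).decode())
```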

  7. Basic principles of information technology organization in health care institutions.

    PubMed

    Mitchell, J A

    1997-01-01

    This paper focuses on the basic principles of information technology (IT) organization within health sciences centers. The paper considers the placement of the leader of the IT effort within the health sciences administrative structure and the organization of the IT unit. A case study of the University of Missouri-Columbia Health Sciences Center demonstrates how a role-based organizational model for IT support can be effective for determining the boundary between centralized and decentralized organizations. The conclusions are that the IT leader needs to be positioned with other institutional leaders who are making strategic decisions, and that the internal IT structure needs to be a role-based hybrid of centralized and decentralized units. The IT leader needs to understand the mission of the organization and actively use change-management techniques.

  8. Model medication management process in Australian nursing homes using business process modeling.

    PubMed

    Qian, Siyu; Yu, Ping

    2013-01-01

    One of the reasons end users avoid or reject health information systems is poor alignment of the system with healthcare workflow, likely caused by system designers' lack of thorough understanding of the healthcare process. Therefore, understanding the healthcare workflow is the essential first step in the design of optimal technologies that will enable care staff to complete the intended tasks faster and better. The frequent use of multiple or "high-risk" medicines by older people in nursing homes has the potential to increase the medication error rate. To facilitate the design of information systems with the most potential to improve patient safety, this study aims to understand the medication management process in nursing homes using a business process modeling method. The paper presents the study design and preliminary findings from interviews with two registered nurses, who were team leaders in two nursing homes. Although there were subtle differences in medication management between the two homes, the major medication management activities were similar. Further field observation will be conducted. Based on the data collected from observations, an as-is process model for medication management will be developed.

  9. ASCEM Data Browser (ASCEMDB) v0.8

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    ROMOSAN, ALEXANDRU

    A data management tool designed for the Advanced Simulation Capability for Environmental Management (ASCEM) framework. Distinguishing features of this gateway include: (1) handling of complex geometry data, (2) an advanced selection mechanism, (3) state-of-the-art rendering of spatiotemporal data records, and (4) seamless integration with a distributed workflow engine.

  10. Electronic Resource Management 2.0: Using Web 2.0 Technologies as Cost-Effective Alternatives to an Electronic Resource Management System

    ERIC Educational Resources Information Center

    Murray, Adam

    2008-01-01

    Designed to assist with the management of e-resources, electronic resource management (ERM) systems are time- and fund-consuming to purchase and maintain. Questions of system compatibility, data population, and workflow design/redesign can be difficult to answer; sometimes those answers are not what we'd prefer to hear. The two primary functions…

  11. Distributed resource allocation under communication constraints

    NASA Astrophysics Data System (ADS)

    Dodin, Pierre; Nimier, Vincent

    2001-03-01

    This paper deals with a study of the multi-sensor management problem for multi-target tracking. The collaboration between many sensors observing the same target means that they are able to fuse their data during the information process. One must therefore take this possibility into account when computing the optimal sensor-target association at each time step. To solve this problem for a real large-scale system, one must consider both the information aspect and the control aspect of the problem. To unify these problems, one possibility is to use a decentralized filtering algorithm locally driven by an assignment algorithm. The decentralized filtering algorithm we use in our model is the filtering algorithm of Grime, which relaxes the usual fully connected hypothesis. By fully connected, one means that the information in a fully connected system is totally distributed everywhere at the same moment, which is unacceptable for a real large-scale system. We model the distributed assignment decision with the help of a greedy algorithm. Each sensor performs a global optimization in order to estimate the other sensors' information sets. A consequence of relaxing the fully connected hypothesis is that the sensors' information sets are not the same at each time step, producing an information dissymmetry in the system. The assignment algorithm uses local knowledge of this dissymmetry. By testing the reactions and the coherence of the local assignment decisions of our system against maneuvering targets, we show that it is still possible to manage with decentralized assignment control even though the system is not fully connected.
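    A bare-bones illustration of a greedy assignment step of the kind mentioned above follows (invented information-gain scores; the paper's actual algorithm additionally accounts for the information dissymmetry between sensors): each sensor is greedily given the target pairing with the highest remaining local score.

```python
# Hypothetical scores: estimated information gain of assigning sensor i to target j.
scores = {
    ("sensor1", "targetA"): 0.9, ("sensor1", "targetB"): 0.4,
    ("sensor2", "targetA"): 0.7, ("sensor2", "targetB"): 0.8,
    ("sensor3", "targetA"): 0.2, ("sensor3", "targetB"): 0.6,
}

def greedy_assign(scores):
    """Greedily pick the best remaining sensor-target pair until every sensor is used."""
    assignment, used_sensors = {}, set()
    for (sensor, target), _ in sorted(scores.items(),
                                      key=lambda kv: kv[1], reverse=True):
        if sensor not in used_sensors:
            assignment[sensor] = target
            used_sensors.add(sensor)
    return assignment

print(greedy_assign(scores))
```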

  12. Using Economic Experiments to Test Electricity Policy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kiesling, Lynne

    2005-11-01

    The industry's history of central generation, coordination, and regulation breeds a natural suspicion of whether or not decentralized coordination and a more market-based, decentralized regulatory approach can work. To see how people will behave in a decentralized environment with decentralized institutions, one must test the environment and institutions experimentally, with real people.

  13. Educational Decentralization, Public Spending, and Social Justice in Nigeria

    ERIC Educational Resources Information Center

    Geo-Jaja, Macleans A.

    2006-01-01

    This study situates the process of educational decentralization in the narrower context of social justice. Its main object, however, is to analyze the implications of decentralization for strategies of equity and social justice in Nigeria. It starts from the premise that the early optimism that supported decentralization as an efficient and…

  14. Autonomic Management of Application Workflows on Hybrid Computing Infrastructure

    DOE PAGES

    Kim, Hyunjoo; el-Khamra, Yaakoub; Rodero, Ivan; ...

    2011-01-01

    In this paper, we present a programming and runtime framework that enables the autonomic management of complex application workflows on hybrid computing infrastructures. The framework is designed to address system and application heterogeneity and dynamics to ensure that application objectives and constraints are satisfied. The need for such autonomic system and application management is becoming critical as computing infrastructures become increasingly heterogeneous, integrating different classes of resources from high-end HPC systems to commodity clusters and clouds. For example, the framework presented in this paper can be used to provision the appropriate mix of resources based on application requirements and constraints. The framework also monitors the system/application state and adapts the application and/or resources to respond to changing requirements or environment. To demonstrate the operation of the framework and to evaluate its ability, we employ a workflow used to characterize an oil reservoir executing on a hybrid infrastructure composed of TeraGrid nodes and Amazon EC2 instances of various types. Specifically, we show how different application objectives such as acceleration, conservation, and resilience can be effectively achieved while satisfying deadline and budget constraints, using an appropriate mix of dynamically provisioned resources. Our evaluations also demonstrate that public clouds can be used to complement and reinforce the scheduling and usage of traditional high performance computing infrastructure.
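    As a rough sketch of the provisioning idea (invented resource rates, prices, and limits; not the authors' framework), the following Python snippet enumerates mixes of HPC and cloud nodes and keeps the cheapest one that still meets a deadline and budget.

```python
from itertools import product

# Hypothetical resource classes: (tasks per hour per node, dollars per node-hour).
RESOURCES = {"hpc_node": (8, 0.0), "ec2_small": (2, 0.10), "ec2_large": (6, 0.40)}

TASKS, DEADLINE_H, BUDGET = 400, 10, 30.0
MAX_HPC = 4   # assume a fixed allocation limit on the HPC system

def best_mix():
    """Enumerate node mixes and keep the cheapest one meeting deadline and budget."""
    best = None
    for hpc, small, large in product(range(MAX_HPC + 1), range(20), range(20)):
        rate = hpc * RESOURCES["hpc_node"][0] + small * RESOURCES["ec2_small"][0] \
             + large * RESOURCES["ec2_large"][0]
        if rate == 0:
            continue
        hours = TASKS / rate
        cost = hours * (small * RESOURCES["ec2_small"][1]
                        + large * RESOURCES["ec2_large"][1])
        if hours <= DEADLINE_H and cost <= BUDGET:
            if best is None or cost < best[0]:
                best = (cost, hours, {"hpc": hpc, "small": small, "large": large})
    return best

print(best_mix())
```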

  15. Potential of Laboratory Execution Systems (LESs) to Simplify the Application of Business Process Management Systems (BPMSs) in Laboratory Automation.

    PubMed

    Neubert, Sebastian; Göde, Bernd; Gu, Xiangyu; Stoll, Norbert; Thurow, Kerstin

    2017-04-01

    Modern business process management (BPM) is increasingly interesting for laboratory automation. End-to-end workflow automation and improved top-level systems integration for information technology (IT) and automation systems are especially prominent objectives. With the ISO Standard Business Process Model and Notation (BPMN) 2.X, a system-independent and interdisciplinarily accepted graphical process control notation is provided, allowing process analysis while also being executable. The transfer of BPM solutions to structured laboratory automation places novel demands, for example, concerning real-time-critical process and systems integration. The article discusses the potential of laboratory execution systems (LESs) for an easier implementation of the business process management system (BPMS) in hierarchical laboratory automation. In particular, complex application scenarios, including long process chains based on, for example, several distributed automation islands and mobile laboratory robots for material transport, are difficult to handle in BPMSs. The presented approach deals with the displacement of workflow control tasks into life-science-specialized LESs, the reduction of the numerous different interfaces between BPMSs and subsystems, and the simplification of complex process models. Thus, the integration effort for complex laboratory workflows can be significantly reduced for strictly structured automation solutions. An example application, consisting of a mixture of manual and automated subprocesses, is demonstrated by the presented BPMS-LES approach.

  16. Integrating the Allen Brain Institute Cell Types Database into Automated Neuroscience Workflow.

    PubMed

    Stockton, David B; Santamaria, Fidel

    2017-10-01

    We developed software tools to download, extract features, and organize the Cell Types Database from the Allen Brain Institute (ABI) in order to integrate its whole cell patch clamp characterization data into the automated modeling/data analysis cycle. To expand the potential user base we employed both Python and MATLAB. The basic set of tools downloads selected raw data and extracts cell, sweep, and spike features, using ABI's feature extraction code. To facilitate data manipulation we added a tool to build a local specialized database of raw data plus extracted features. Finally, to maximize automation, we extended our NeuroManager workflow automation suite to include these tools plus a separate investigation database. The extended suite allows the user to integrate ABI experimental and modeling data into an automated workflow deployed on heterogeneous computer infrastructures, from local servers, to high performance computing environments, to the cloud. Since our approach is focused on workflow procedures our tools can be modified to interact with the increasing number of neuroscience databases being developed to cover all scales and properties of the nervous system.
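    A schematic version of the download/extract/store pattern described above is sketched below, using a hypothetical fetch function and an in-memory SQLite database in place of the actual ABI feature-extraction calls.

```python
import sqlite3

def fetch_cell_features(cell_id):
    """Hypothetical stand-in for downloading and feature-extracting one ABI cell."""
    return {"cell_id": cell_id, "resting_potential_mv": -65.0 - cell_id * 0.1,
            "spike_threshold_mv": -40.0}

conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE cell_features (
                    cell_id INTEGER PRIMARY KEY,
                    resting_potential_mv REAL,
                    spike_threshold_mv REAL)""")

for cell_id in (101, 102, 103):          # hypothetical cell identifiers
    f = fetch_cell_features(cell_id)
    conn.execute("INSERT INTO cell_features VALUES (?, ?, ?)",
                 (f["cell_id"], f["resting_potential_mv"], f["spike_threshold_mv"]))

# The local feature database can now feed an automated modeling workflow.
for row in conn.execute("SELECT * FROM cell_features"):
    print(row)
```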

  17. Efficient Workflows for Curation of Heterogeneous Data Supporting Modeling of U-Nb Alloy Aging

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ward, Logan Timothy; Hackenberg, Robert Errol

    These are slides from a presentation summarizing a graduate research associate's summer project. The following topics are covered in these slides: data challenges in materials, aging in U-Nb Alloys, Building an Aging Model, Different Phase Trans. in U-Nb, the Challenge, Storing Materials Data, Example Data Source, Organizing Data: What is a Schema?, What does an "XML Schema" look like?, Our Data Schema: Nice and Simple, Storing Data: Materials Data Curation System (MDCS), Problem with MDCS: Slow Data Entry, Getting Literature into MDCS, Staging Data in Excel Document, Final Result: MDCS Records, Analyzing Image Data, Process for Making TTT Diagram, Bottleneck Number 1: Image Analysis, Fitting a TTP Boundary, Fitting a TTP Curve: Comparable Results, How Does it Compare to Our Data?, Image Analysis Workflow, Curating Hardness Records, Hardness Data: Two Key Decisions, Before Peak Age? - Automation, Interactive Viz, Which Transformation?, Microstructure-Informed Model, Tracking the Entire Process, General Problem with Property Models, Pinyon: Toolkit for Managing Model Creation, Tracking Individual Decisions, Jupyter: Docs and Code in One File, Hardness Analysis Workflow, Workflow for Aging Models, and conclusions.

  18. Towards a Unified Architecture for Data-Intensive Seismology in VERCE

    NASA Astrophysics Data System (ADS)

    Klampanos, I.; Spinuso, A.; Trani, L.; Krause, A.; Garcia, C. R.; Atkinson, M.

    2013-12-01

    Modern seismology involves managing, storing and processing large datasets, typically geographically distributed across organisations. Performing computational experiments using these data generates more data, which in turn have to be managed, further analysed and frequently be made available within or outside the scientific community. As part of the EU-funded project VERCE (http://verce.eu), we research and develop a number of use-cases, interfacing technologies to satisfy the data-intensive requirements of modern seismology. Our solution seeks to support: (1) familiar programming environments to develop and execute experiments, in particular via Python/ObsPy, (2) a unified view of heterogeneous computing resources, public or private, through the adoption of workflows, (3) monitoring the experiments and validating the data products at varying granularities, via a comprehensive provenance system, (4) reproducibility of experiments and consistency in collaboration, via a shared registry of processing units and contextual metadata (computing resources, data, etc.) Here, we provide a brief account of these components and their roles in the proposed architecture. Our design integrates heterogeneous distributed systems, while allowing researchers to retain current practices and control data handling and execution via higher-level abstractions. At the core of our solution lies the workflow language Dispel. While Dispel can be used to express workflows at fine detail, it may also be used as part of meta- or job-submission workflows. User interaction can be provided through a visual editor or through custom applications on top of parameterisable workflows, which is the approach VERCE follows. According to our design, the scientist may use versions of Dispel/workflow processing elements offered by the VERCE library or override them introducing custom scientific code, using ObsPy. This approach has the advantage that, while the scientist uses a familiar tool, the resulting workflow can be executed on a number of underlying stream-processing engines, such as STORM or OGSA-DAI, transparently. While making efficient use of arbitrarily distributed resources and large data-sets is of priority, such processing requires adequate provenance tracking and monitoring. Hiding computation and orchestration details via a workflow system, allows us to embed provenance harvesting where appropriate without impeding the user's regular working patterns. Our provenance model is based on the W3C PROV standard and can provide information of varying granularity regarding execution, systems and data consumption/production. A video demonstrating a prototype provenance exploration tool can be found at http://bit.ly/15t0Fz0. Keeping experimental methodology and results open and accessible, as well as encouraging reproducibility and collaboration, is of central importance to modern science. As our users are expected to be based at different geographical locations, to have access to different computing resources and to employ customised scientific codes, the use of a shared registry of workflow components, implementations, data and computing resources is critical.
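    A minimal, dictionary-based stand-in for such a provenance record is sketched below (simplified; a real implementation would follow the W3C PROV data model and vocabulary), recording which activity produced which data product from which inputs and with which parameters; the trace names and filter parameters are invented.

```python
import json
from datetime import datetime, timezone

# Simplified PROV-like record: entities (data), activities (processing steps),
# and the used / wasGeneratedBy relations between them.
provenance = {
    "entities": {"raw_trace": {"format": "miniSEED"},
                 "filtered_trace": {"format": "miniSEED"}},
    "activities": {"bandpass_filter": {
        "startedAtTime": datetime.now(timezone.utc).isoformat(),
        "parameters": {"freqmin": 0.1, "freqmax": 1.0}}},   # hypothetical values
    "used": [("bandpass_filter", "raw_trace")],
    "wasGeneratedBy": [("filtered_trace", "bandpass_filter")],
}

print(json.dumps(provenance, indent=2))
```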

  19. Overview of the state of the art of constructed wetlands for decentralized wastewater management in Brazil.

    PubMed

    Machado, A I; Beretta, M; Fragoso, R; Duarte, E

    2017-02-01

    Conventional wastewater treatment plants (WWTPs) commonly require large capital investments as well as operation and maintenance costs. Constructed wetlands (CWs) appear to be a cost-effective treatment, since they can remove a broad range of contaminants through a combination of physical, chemical, and biological processes at low cost. Therefore, CWs can be successfully applied for decentralized wastewater treatment in regions with low population density and/or large land availability, such as Brazil. The present work provides a review of thirty-nine studies on CWs implemented in Brazil to remove wastewater contaminants. Brazil's current sanitation data are also considered to evaluate the potential role of CWs in decentralized wastewater treatment. Performance of CWs was evaluated according to (i) type of wetland system, (ii) different support matrices, (iii) vegetation species, and (iv) removal efficiency of chemical oxygen demand (COD), biological oxygen demand (BOD5), nitrogen (N), and phosphorus (P). The reviewed CWs overall presented good efficiencies: H-CWs achieved the highest removals for P, while the higher results for N were attained in VF-CWs and for COD and BOD5 in HF-CWs. Therefore, it was concluded that CWs are an interesting solution for decentralized wastewater treatment in Brazil, since it has warm temperatures, extensive radiation hours, and available land. Additionally, the low percentage of the population with access to the sewage network in the North and Northeast regions makes these systems especially suitable there. Hence, further implementation of CWs is encouraged by the authors in regions with characteristics similar to those of Brazil. Copyright © 2016 Elsevier Ltd. All rights reserved.

  20. Privatization of Public Universities: How the Budget System Affects the Decision-Making Strategy of Deans

    ERIC Educational Resources Information Center

    Volpatti, Mark Christopher

    2013-01-01

    In response to lower funding commitments, many public colleges and universities have elected to incorporate decentralized budgeting systems, one of which is Responsibility Center Management (RCM). As public institutions are becoming more dependent on tuition dollars because state appropriations are declining, deans have an increased responsibility…
