Sample records for simplify process management

  1. Simplifying the complexity surrounding ICU work processes--identifying the scope for information management in ICU settings.

    PubMed

    Munir, Samina K; Kay, Stephen

    2005-08-01

    A multi-site study, conducted in two English and two Danish intensive care units, investigates the complexity of work processes in intensive care, and the implications of this complexity for information management with regard to clinical information systems. Data were collected via observations, shadowing of clinical staff, interviews and questionnaires. The construction of role activity diagrams enabled the capture of critical care work processes. Upon analysing these diagrams, it was found that intensive care work processes consist of 'simplified-complexity'; these processes are changed by the introduction of information systems for the everyday use and management of all clinical information. The prevailing notion of complexity surrounding critical care clinical work processes was refuted and found to be misleading; in reality, it is not the work processes that cause the complexity, but rather the way in which clinical information is used and managed. This study emphasises that the potential for clinical information systems that consider integrating all clinical information requirements is not only immense but also very plausible.

  2. Operation Windshield and the simplification of emergency management.

    PubMed

    Andrews, Michael

    2016-01-01

    Large, complex, multi-stakeholder exercises are the culmination of years of gradual progression through a comprehensive training and exercise programme. Exercises intended to validate training, refine procedures and test processes initially tested in isolation are combined to ensure seamless response and coordination during actual crises. The challenges of integrating timely and accurate situational awareness from an array of sources, including response agencies, municipal departments, partner agencies and the public, on an ever-growing range of media platforms, increase information management complexity in emergencies. Considering that many municipal emergency operations centre roles are filled by staff whose day jobs have little to do with crisis management, there is a need to simplify emergency management and make it more intuitive. North Shore Emergency Management has accepted the challenge of making emergency management less onerous to occasional practitioners through a series of initiatives aimed at building competence and confidence by making processes easier to use as well as by introducing technical tools that can simplify processes and enhance efficiencies. These efforts culminated in the full-scale earthquake exercise, Operation Windshield, which preceded the 2015 Emergency Preparedness and Business Continuity Conference in Vancouver, British Columbia.

  3. A Contextual Model for Identity Management (IdM) Interfaces

    ERIC Educational Resources Information Center

    Fuller, Nathaniel J.

    2014-01-01

    The usability of Identity Management (IdM) systems is highly dependent upon design that simplifies the processes of identification, authentication, and authorization. Recent findings reveal two critical problems that degrade IdM usability: (1) unfeasible techniques for managing various digital identifiers, and (2) ambiguous security interfaces.…

  4. Use of simplifier scenarios for CRM training

    NASA Technical Reports Server (NTRS)

    Weatherly, D.

    1984-01-01

    Cockpit resource management (CRM) at Metro Airlines is discussed. The process by which the program of CRM training was initiated is mentioned. Management aspects of various flying scenarios are considered. The transfer of training from the classroom to the field is assessed.

  5. Managing Uncertainty in Runoff Estimation with the U.S. Environmental Protection Agency National Stormwater Calculator.

    EPA Science Inventory

    The U.S. Environmental Protection Agency National Stormwater Calculator (NSWC) simplifies the task of estimating runoff through a straightforward simulation process based on the EPA Stormwater Management Model. The NSWC accesses localized climate and soil hydrology data, and opti...

  6. Forest processes from stands to landscapes: exploring model forecast uncertainties using cross-scale model comparison

    Treesearch

    Michael J. Papaik; Andrew Fall; Brian Sturtevant; Daniel Kneeshaw; Christian Messier; Marie-Josee Fortin; Neal Simon

    2010-01-01

    Forest management practices conducted primarily at the stand scale result in simplified forests with regeneration problems and low structural and biological diversity. Landscape models have been used to help design management strategies to address these problems. However, there remains a great deal of uncertainty that the actual management practices result in the...

  7. Managing the construction bidding process : a move to simpler construction plan sets

    DOT National Transportation Integrated Search

    2001-01-31

    This project was conducted to determine whether construction plan sets could be significantly simplified to speed the process of moving projects to construction. The work steps included a literature review, a telephone survey of highway agencies in s...

  8. Simplified Processing Method for Meter Data Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fowler, Kimberly M.; Colotelo, Alison H. A.; Downs, Janelle L.

    2015-11-01

    A simple, quick metered-data processing method that can be used for Army Metered Data Management System (MDMS) and Logistics Innovation Agency data, but may also be useful for other large data sets. It is intended for large data sets when the analyst has little information about the buildings.

  9. Total Quality Management Simplified.

    ERIC Educational Resources Information Center

    Arias, Pam

    1995-01-01

    Maintains that Total Quality Management (TQM) is one method that helps to monitor and improve the quality of child care. Lists four steps for a child-care center to design and implement its own TQM program. Suggests that quality assurance in child-care settings is an ongoing process, and that TQM programs help in providing consistent, high-quality…

  10. Simplifying and speeding the management of intra-node cache coherence

    DOEpatents

    Blumrich, Matthias A [Ridgefield, CT; Chen, Dong [Croton on Hudson, NY; Coteus, Paul W [Yorktown Heights, NY; Gara, Alan G [Mount Kisco, NY; Giampapa, Mark E [Irvington, NY; Heidelberger, Phillip [Cortlandt Manor, NY; Hoenicke, Dirk [Ossining, NY; Ohmacht, Martin [Yorktown Heights, NY

    2012-04-17

    A method and apparatus for managing coherence between two processors of a two processor node of a multi-processor computer system. Generally the present invention relates to a software algorithm that simplifies and significantly speeds the management of cache coherence in a message passing parallel computer, and to hardware apparatus that assists this cache coherence algorithm. The software algorithm uses the opening and closing of put/get windows to coordinate the activities required to achieve cache coherence. The hardware apparatus may be an extension to the hardware address decoder that creates, in the physical memory address space of the node, an area of virtual memory that (a) does not actually exist, and (b) is therefore able to respond instantly to read and write requests from the processing elements.
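
    The put/get-window idea above can be pictured with a small, purely conceptual sketch. The Python below is an assumption-laden illustration, not the patented software or hardware: it only shows coherence work (invalidate on window open, flush on window close) being batched at window boundaries instead of being performed on every access.

    ```python
    # Conceptual sketch (hypothetical, not the patented implementation):
    # coherence actions happen only when a put/get window opens or closes.

    class PutGetWindow:
        """Coordinates cache invalidate/flush at window boundaries."""

        def __init__(self, cache, shared_memory):
            self.cache = cache            # dict: address -> value (local cache)
            self.shared = shared_memory   # dict: address -> value (node memory)
            self.dirty = set()            # addresses written while the window is open

        def open(self):
            # Entering a communication phase: discard possibly stale cached
            # copies so subsequent reads observe remote puts.
            self.cache.clear()
            self.dirty.clear()

        def write(self, addr, value):
            self.cache[addr] = value
            self.dirty.add(addr)

        def read(self, addr):
            return self.cache.get(addr, self.shared[addr])

        def close(self):
            # Leaving the communication phase: flush local writes so the other
            # processor sees a coherent view.
            for addr in self.dirty:
                self.shared[addr] = self.cache[addr]
            self.dirty.clear()


    if __name__ == "__main__":
        shared = {0x10: 0}
        win = PutGetWindow(cache={}, shared_memory=shared)
        win.open()
        win.write(0x10, 42)
        win.close()                 # the single coherence action happens here
        print(shared[0x10])         # -> 42
    ```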

  11. If you take a stand, how can you manage an ecosystem? The complex art of raising a forest.

    Treesearch

    Sally Duncan

    2000-01-01

    Managing whole ecosystems is a concept gaining considerable acceptance among forest managers throughout the Northwest, but it does not have a clear or simple definition. Terminology and definitions can be confusing. Forests are complex places, formed by complex processes, and the moment we try to simplify, we are likely to damage the healthy functioning of...

  12. Developing a Mobile Application "Educational Process Remote Management System" on the Android Operating System

    ERIC Educational Resources Information Center

    Abildinova, Gulmira M.; Alzhanov, Aitugan K.; Ospanova, Nazira N.; Taybaldieva, Zhymatay; Baigojanova, Dametken S.; Pashovkin, Nikita O.

    2016-01-01

    Nowadays, when there is a need to introduce various innovations into the educational process, most efforts are aimed at simplifying the learning process. To that end, electronic textbooks, testing systems and other software are being developed. Most of them are intended to run on personal computers with limited mobility. Smart education is…

  13. A Decentralized Compositional Framework for Dependable Decision Process in Self-Managed Cyber Physical Systems

    PubMed Central

    Hou, Kun-Mean; Zhang, Zhan

    2017-01-01

    Cyber Physical Systems (CPSs) need to interact with the changeable environment under various interferences. To provide continuous and high quality services, a self-managed CPS should automatically reconstruct itself to adapt to these changes and recover from failures. Such dynamic adaptation behavior introduces systemic challenges for CPS design, advice evaluation and decision process arrangement. In this paper, a formal compositional framework is proposed to systematically improve the dependability of the decision process. To guarantee the consistent observation of event orders for causal reasoning, this work first proposes a relative time-based method to improve the composability and compositionality of the timing property of events. Based on the relative time solution, a formal reference framework is introduced for self-managed CPSs, which includes a compositional FSM-based actor model (subsystems of CPS), actor-based advice and runtime decomposable decisions. To simplify self-management, a self-similar recursive actor interface is proposed for decision (actor) composition. We provide constraints and seven patterns for the composition of reliability and process time requirements. Further, two decentralized decision process strategies are proposed based on our framework, and we compare the reliability with the static strategy and the centralized processing strategy. The simulation results show that the one-order feedback strategy has high reliability, scalability and stability against the complexity of decision and random failure. This paper also shows a way to simplify the evaluation for dynamic systems by improving the composability and compositionality of the subsystem. PMID:29120357

  14. A Decentralized Compositional Framework for Dependable Decision Process in Self-Managed Cyber Physical Systems.

    PubMed

    Zhou, Peng; Zuo, Decheng; Hou, Kun-Mean; Zhang, Zhan

    2017-11-09

    Cyber Physical Systems (CPSs) need to interact with the changeable environment under various interferences. To provide continuous and high quality services, a self-managed CPS should automatically reconstruct itself to adapt to these changes and recover from failures. Such dynamic adaptation behavior introduces systemic challenges for CPS design, advice evaluation and decision process arrangement. In this paper, a formal compositional framework is proposed to systematically improve the dependability of the decision process. To guarantee the consistent observation of event orders for causal reasoning, this work first proposes a relative time-based method to improve the composability and compositionality of the timing property of events. Based on the relative time solution, a formal reference framework is introduced for self-managed CPSs, which includes a compositional FSM-based actor model (subsystems of CPS), actor-based advice and runtime decomposable decisions. To simplify self-management, a self-similar recursive actor interface is proposed for decision (actor) composition. We provide constraints and seven patterns for the composition of reliability and process time requirements. Further, two decentralized decision process strategies are proposed based on our framework, and we compare the reliability with the static strategy and the centralized processing strategy. The simulation results show that the one-order feedback strategy has high reliability, scalability and stability against the complexity of decision and random failure. This paper also shows a way to simplify the evaluation for dynamic systems by improving the composability and compositionality of the subsystem.

  15. Research on the Intensive Material Management System of Biomass Power Plant

    NASA Astrophysics Data System (ADS)

    Zhang, Ruosi; Hao, Tianyi; Li, Yunxiao; Zhang, Fangqing; Ding, Sheng

    2017-05-01

    In view of the widespread problems of loose material management and the lack of standardization and real-time interaction in biomass power plants, a system based on intensive management methods is proposed in this paper to control the whole material process of a power plant. By analysing the whole material management process of the power plant and applying the Internet of Things, the method can simplify the management process. By maximizing the use of resources and applying data mining, material utilization, circulation rate and quality control can be improved. The system has been applied in Gaotang power plant, where it has greatly raised the level of materials management and economic effectiveness. It has important significance for the safe, cost-effective and highly efficient operation of the plant.

  16. Costing and Data Management. Development of a Simplified System for Smaller Colleges and Universities

    ERIC Educational Resources Information Center

    Ames, Michael D.

    1976-01-01

    A participatory process by which a useful costing and data management system was developed at Chapman College is described. The system summarizes information on instructional workloads, class sizes, and the costs per student credit hour for academic programs. Costs incurred in other areas to support each program are included. (Editor/LBH)

  17. Operating tool for a distributed data and information management system

    NASA Astrophysics Data System (ADS)

    Reck, C.; Mikusch, E.; Kiemle, S.; Wolfmüller, M.; Böttcher, M.

    2002-07-01

    The German Remote Sensing Data Center has developed the Data Information and Management System DIMS, which provides multi-mission ground system services for earth observation product processing, archiving, ordering and delivery. DIMS successfully uses the newest technologies within its services. This paper presents the solution taken to simplify operation tasks for this large and distributed system.

  18. Following the Yellow Brick Road to Simplified Link Management

    ERIC Educational Resources Information Center

    Engard, Nicole C.

    2005-01-01

    Jenkins Law Library is the oldest law library in America, and has a reputation for offering great content not only to local attorneys, but also to the entire legal research community. In this article, the author, who is Web manager at Jenkins, describes the development of an automated process by which research links can be added to the database so…

  19. An Introduction to Simplified Performance Management Approaches. Report No. 98.

    ERIC Educational Resources Information Center

    Goddu, Roland

    This guide to simplified performance management approaches contains five sections. The first section, entitled "Simple Techniques for Managing an Innovation," is written from the viewpoint of a principal as manager. It describes how to manage an innovation, develop an objective, allocate resources for the innovation, keep organized…

  20. Evaluation of the user requirements processes for NASA terrestrial applications programs

    NASA Technical Reports Server (NTRS)

    1982-01-01

    To support the evolution of increasingly sound user requirements definition processes that would meet the broad range of NASA's terrestrial applications planning and management needs during the 1980's, the user requirements processes as they function in the real world at the senior and middle management levels were evaluated. Special attention was given to geologic mapping and domestic crop reporting to provide insight into problems associated with the development and management of user established conventional practices and data sources. An attempt was made to identify alternative NASA user interfaces that sustain strengths, alleviate weaknesses, maximize application to multiple problems, and simplify management cognizance. Some of the alternatives are outlined and evaluated. It is recommended that NASA have an identified organizational point of focus for consolidation and oversight of the user processes.

  1. AMP: A platform for managing and mining data in the treatment of Autism Spectrum Disorder.

    PubMed

    Linstead, Erik; Burns, Ryan; Duy Nguyen; Tyler, David

    2016-08-01

    We introduce AMP (Autism Management Platform), an integrated health care information system for capturing, analyzing, and managing data associated with the diagnosis and treatment of Autism Spectrum Disorder in children. AMP's mobile application simplifies the means by which parents, guardians, and clinicians can collect and share multimedia data with one another, facilitating communication and reducing data redundancy, while simplifying retrieval. Additionally, AMP provides an intelligent web interface and analytics platform which allow physicians and specialists to aggregate and mine patient data in real-time, as well as give relevant feedback to automatically learn data filtering preferences over time. Together AMP's mobile app, web client, and analytics engine implement a rich set of features that streamline the data collection and analysis process in the context of a secure and easy-to-use system so that data may be more effectively leveraged to guide treatment.

  2. 48 CFR 46.202-2 - Government reliance on inspection by contractor.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... ACQUISITION REGULATION CONTRACT MANAGEMENT QUALITY ASSURANCE Contract Quality Requirements 46.202-2 Government... acquired at or below the simplified acquisition threshold conform to contract quality requirements before... the contractor's internal work processes. In making the determination, the contracting officer shall...

  3. 48 CFR 46.202-2 - Government reliance on inspection by contractor.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... ACQUISITION REGULATION CONTRACT MANAGEMENT QUALITY ASSURANCE Contract Quality Requirements 46.202-2 Government... acquired at or below the simplified acquisition threshold conform to contract quality requirements before... the contractor's internal work processes. In making the determination, the contracting officer shall...

  4. 48 CFR 46.202-2 - Government reliance on inspection by contractor.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... ACQUISITION REGULATION CONTRACT MANAGEMENT QUALITY ASSURANCE Contract Quality Requirements 46.202-2 Government... acquired at or below the simplified acquisition threshold conform to contract quality requirements before... the contractor's internal work processes. In making the determination, the contracting officer shall...

  5. Electronic Handbooks Simplify Process Management

    NASA Technical Reports Server (NTRS)

    2012-01-01

    Getting a multitude of people to work together to manage processes across many organizations (for example, flight projects, research, technologies, or data centers) is not an easy task. Just ask Dr. Barry E. Jacobs, a research computer scientist at Goddard Space Flight Center. He helped NASA develop a process management solution that provided documenting tools for process developers and participants to help them quickly learn, adapt, test, and teach their views. Some of these tools included editable files for subprocess descriptions, document descriptions, role guidelines, manager worksheets, and references. First utilized for NASA's Headquarters Directives Management process, the approach led to the invention of a concept called the Electronic Handbook (EHB). This EHB concept was successfully applied to NASA's Small Business Innovation Research (SBIR) and Small Business Technology Transfer (STTR) programs, among other NASA programs. Several Federal agencies showed interest in the concept, so Jacobs and his team visited these agencies to show them how their specific processes could be managed by the methodology, as well as to create mockup versions of the EHBs.

  6. Managed care for Medicare: some considerations in designing effective information provision programs.

    PubMed

    Jayanti, R K

    2001-01-01

    Consumer information-processing theory provides a useful framework for policy makers concerned with regulating information provided by managed care organizations. The assumption that consumers are rational information processors and providing more information is better is questioned in this paper. Consumer research demonstrates that when faced with an uncertain decision, consumers adopt simplifying strategies leading to sub-optimal choices. A discussion on how consumers process risk information and the effects of various informational formats on decision outcomes is provided. Categorization theory is used to propose guidelines with regard to providing effective information to consumers choosing among competing managed care plans. Public policy implications borne out of consumer information-processing theory conclude the article.

  7. A LOTUS NOTES APPLICATION FOR PREPARING, REVIEWING, AND STORING NHEERL RESEARCH PROTOCOLS

    EPA Science Inventory

    Upon becoming QA Manager of the Health Effects Research Laboratory (HERL) in 1990, Ron became aware of the need to simplify and streamline the research planning process that Laboratory Principal Investigators (Pls) faced. Appropriately, animal studies require close scrutiny, both...

  8. Simplified aerosol modeling for variational data assimilation

    NASA Astrophysics Data System (ADS)

    Huneeus, N.; Boucher, O.; Chevallier, F.

    2009-11-01

    We have developed a simplified aerosol model together with its tangent linear and adjoint versions for the ultimate aim of optimizing global aerosol and aerosol precursor emission using variational data assimilation. The model was derived from the general circulation model LMDz; it groups together the 24 aerosol species simulated in LMDz into 4 species, namely gaseous precursors, fine mode aerosols, coarse mode desert dust and coarse mode sea salt. The emissions have been kept as in the original model. Modifications, however, were introduced in the computation of aerosol optical depth and in the processes of sedimentation, dry and wet deposition and sulphur chemistry to ensure consistency with the new set of species and their composition. The simplified model successfully manages to reproduce the main features of the aerosol distribution in LMDz. The largest differences in aerosol load are observed for fine mode aerosols and gaseous precursors. Differences between the original and simplified models are mainly associated with the new deposition and sedimentation velocities consistent with the definition of species in the simplified model and the simplification of the sulphur chemistry. Furthermore, simulated aerosol optical depth remains within the variability of monthly AERONET observations for all aerosol types and all sites throughout most of the year. The largest differences are observed over sites with strong desert dust influence. In terms of daily aerosol variability, the model is less able to reproduce the observed variability from the AERONET data, with larger discrepancies at stations affected by industrial aerosols. The simplified model, however, closely follows the daily simulation from LMDz. Sensitivity analyses with the tangent linear version show that the simplified sulphur chemistry is the dominant process responsible for the strong non-linearity of the model.

  9. Development of a Simplified Sustainable Facilities Guide

    DTIC Science & Technology

    2003-04-18

    Government Through Efficient Energy Management, June 3, 1999; EO 13148, Greening the Government Through Leadership in Environmental Management; ... architects, engineers, and project managers. The United States Green Building Council (USGBC) has created the "Leadership in Energy and... SIMPLIFIED SUSTAINABLE FACILITIES GUIDE. Thesis presented to the Faculty, Department of Systems and Engineering Management

  10. Using stable isotopes and models to explore estuarine linkages at multiple scales

    EPA Science Inventory

    Estuarine managers need tools to respond to dynamic stressors that occur in three linked environments – coastal ocean, estuaries and watersheds. Models have been the tool of choice for examining these dynamic systems because they simplify processes and integrate over multiple sc...

  11. E-facts: business process management in clinical data repositories.

    PubMed

    Wattanasin, Nich; Peng, Zhaoping; Raine, Christine; Mitchell, Mariah; Wang, Charles; Murphy, Shawn N

    2008-11-06

    The Partners Healthcare Research Patient Data Registry (RPDR) is a centralized data repository that gathers clinical data from various hospital systems. The RPDR allows clinical investigators to obtain aggregate numbers of patients with user-defined characteristics such as diagnoses, procedures, medications, and laboratory values. They may then obtain patient identifiers and electronic medical records with prior IRB approval. Moreover, the accurate identification and efficient population of worthwhile and quantifiable facts from doctors' reports into the RPDR is a significant process. As part of our ongoing e-Fact project, this work describes a new business process management technology that helps coordinate and simplify this procedure.

  12. Single-Scale Fusion: An Effective Approach to Merging Images.

    PubMed

    Ancuti, Codruta O; Ancuti, Cosmin; De Vleeschouwer, Christophe; Bovik, Alan C

    2017-01-01

    Due to its robustness and effectiveness, multi-scale fusion (MSF) based on the Laplacian pyramid decomposition has emerged as a popular technique that has shown utility in many applications. Guided by several intuitive measures (weight maps), the MSF process is versatile and straightforward to implement. However, the number of pyramid levels increases with the image size, which implies sophisticated data management and memory accesses, as well as additional computations. Here, we introduce a simplified formulation that reduces MSF to only a single-level process. Starting from the MSF decomposition, we explain both mathematically and intuitively (visually) a way to simplify the classical MSF approach with minimal loss of information. The resulting single-scale fusion (SSF) solution is a close approximation of the MSF process that eliminates important redundant computations. It also provides insights regarding why MSF is so effective. While our simplified expression is derived in the context of high dynamic range imaging, we show its generality on several well-known fusion-based applications, such as image compositing, extended depth of field, medical imaging, and blending thermal (infrared) images with visible light. Besides visual validation, quantitative evaluations demonstrate that our SSF strategy is able to yield results that are highly competitive with traditional MSF approaches.
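
    As a rough illustration of the kind of fusion the abstract describes, the Python sketch below implements classical weight-map-guided Laplacian-pyramid fusion alongside a naive single-level blend using smoothed, normalized weights. It is a minimal sketch under assumed pyramid depth and smoothing parameters, not the authors' exact SSF formulation.

    ```python
    # Weight-map-guided fusion: multi-scale (Laplacian pyramid) vs. a naive
    # single-level surrogate. Parameters and weights are illustrative only.
    import numpy as np
    from scipy.ndimage import gaussian_filter

    def _down(img):
        # Blur then decimate by two (one Gaussian-pyramid step)
        return gaussian_filter(img, 1.0)[::2, ::2]

    def _up(img, shape):
        # Zero-insert upsampling followed by blurring (approximate expand step)
        out = np.zeros(shape)
        out[::2, ::2] = img
        return gaussian_filter(out, 1.0) * 4.0

    def multiscale_fusion(images, weights, levels=4):
        """Blend Laplacian pyramids of inputs, weighted by Gaussian pyramids of weights."""
        total = sum(weights) + 1e-12
        weights = [w / total for w in weights]      # per-pixel normalisation
        fused = None
        for img, w in zip(images, weights):
            gp_i, gp_w = [img], [w]
            for _ in range(levels - 1):
                gp_i.append(_down(gp_i[-1]))
                gp_w.append(_down(gp_w[-1]))
            lap = [gp_i[k] - _up(gp_i[k + 1], gp_i[k].shape) for k in range(levels - 1)]
            lap.append(gp_i[-1])
            contrib = [l * wk for l, wk in zip(lap, gp_w)]
            fused = contrib if fused is None else [a + b for a, b in zip(fused, contrib)]
        out = fused[-1]
        for k in range(levels - 2, -1, -1):         # collapse the fused pyramid
            out = fused[k] + _up(out, fused[k].shape)
        return out

    def single_level_fusion(images, weights, sigma=8.0):
        """Naive single-level surrogate: blend inputs with smoothed, normalised weights."""
        smoothed = [gaussian_filter(w, sigma) for w in weights]
        norm = sum(smoothed) + 1e-12
        return sum(img * w / norm for img, w in zip(images, smoothed))

    # Toy usage: fuse a dark and a bright rendition of a random "scene"
    scene = np.random.rand(64, 64)
    imgs = [scene * 0.4, np.clip(scene * 1.6, 0, 1)]
    wts = [np.abs(i - 0.5) for i in imgs]           # crude exposedness-style weight
    print(float(np.abs(multiscale_fusion(imgs, wts) - single_level_fusion(imgs, wts)).mean()))
    ```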

  13. Developing a Three Processes Framework to Analyze Hydrologic Performance of Urban Stormwater Management in a Watershed Scale

    NASA Astrophysics Data System (ADS)

    Lyu, H.; Ni, G.; Sun, T.

    2016-12-01

    Urban stormwater management contributes to restoring the water cycle to a nearly natural state. Analyzing its hydrologic performance at the watershed scale is a challenge, since the measures are of various sorts and scales and act in different processes. A three-process framework is developed to simplify the urban surface hydrologic process and evaluate urban stormwater management. The three processes are source utilization, transfer regulation and terminal detention, by which stormwater is controlled in order or discharged. Performance is analyzed from the proportions of water controlled by each process, which are calculated using the USEPA Stormwater Management Model. A case study from Beijing is used to illustrate how the performance varies under a set of designed events of different return periods. This framework provides a method to assess urban stormwater management as a whole system, considering the interaction between measures, and to examine whether there is any weak process in an urban watershed to be improved. The results help to develop better solutions to urban water crises.

  14. Simplified method for numerical modeling of fiber lasers.

    PubMed

    Shtyrina, O V; Yarutkina, I A; Fedoruk, M P

    2014-12-29

    A simplified numerical approach to modeling of dissipative dispersion-managed fiber lasers is examined. We present a new numerical iteration algorithm for finding the periodic solutions of the system of nonlinear ordinary differential equations describing the intra-cavity dynamics of the dissipative soliton characteristics in dispersion-managed fiber lasers. We demonstrate that results obtained using the simplified model are in good agreement with full numerical modeling based on the corresponding partial differential equations.
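
    The core numerical idea, iterating a round-trip map until the intra-cavity parameters repeat, can be sketched generically. The Python below is a hedged toy: the ODEs, gain saturation law and coupling values are invented placeholders rather than the paper's equations, and it only illustrates finding a periodic solution as a fixed point of the round-trip map.

    ```python
    # Toy fixed-point iteration for a periodic (steady-state) cavity solution.
    # All model parameters and equations are illustrative assumptions.
    import numpy as np
    from scipy.integrate import solve_ivp

    def intracavity_odes(z, y, g0, esat, loss):
        """Toy ODEs for pulse energy E and width T along the fibre (assumed form)."""
        E, T = y
        gain = g0 / (1.0 + E / esat)            # saturated gain
        dE = (gain - loss) * E
        dT = 0.2 * (1.0 - T)                    # crude relaxation of the width
        return [dE, dT]

    def round_trip(state, length=1.0, g0=2.0, esat=1.0, loss=0.5, out_coupling=0.1):
        """Propagate (E, T) through one cavity round trip, then apply the output coupler."""
        sol = solve_ivp(intracavity_odes, (0.0, length), state,
                        args=(g0, esat, loss), rtol=1e-9, atol=1e-12)
        E, T = sol.y[:, -1]
        return np.array([E * (1.0 - out_coupling), T])

    def periodic_solution(state0, tol=1e-9, max_iter=1000):
        """Iterate the round-trip map until it reaches a fixed point (periodic solution)."""
        state = np.asarray(state0, dtype=float)
        for _ in range(max_iter):
            new = round_trip(state)
            if np.linalg.norm(new - state) < tol:
                return new
            state = new
        raise RuntimeError("round-trip map did not converge")

    print(periodic_solution([0.1, 1.0]))        # steady-state (E, T) of the toy cavity
    ```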

  15. The Tuition/Financial Aid Equation and Its Impact on Access.

    ERIC Educational Resources Information Center

    Bowen, Raymond C.

    The complex rules and regulations of the student financial aid industry have alienated and confused both students and parents, especially those from lower income families. Unless simplified, the financial aid application process will continue to act as a deterrent to participation in the U.S. educational system. Families managing to overcome the…

  16. Painting a picture across the landscape with ModelMap

    Treesearch

    Brian Cooke; Elizabeth Freeman; Gretchen Moisen; Tracey Frescino

    2017-01-01

    Scientists and statisticians working for the Rocky Mountain Research Station have created a software package that simplifies and automates many of the processes needed for converting models into maps. This software package, called ModelMap, has helped a variety of specialists and land managers to quickly convert data into easily understood graphical images. The...

  17. Collaborative Learning through Formative Peer Review: Pedagogy, Programs and Potential

    ERIC Educational Resources Information Center

    Sondergaard, Harald; Mulder, Raoul A.

    2012-01-01

    We examine student peer review, with an emphasis on formative practice and collaborative learning, rather than peer grading. Opportunities to engage students in such formative peer assessment are growing, as a range of online tools become available to manage and simplify the process of administering student peer review. We consider whether…

  18. Analysis of Decisions Made Using the Analytic Hierarchy Process

    DTIC Science & Technology

    2013-09-01

    country petroleum pipelines (Dey, 2003), deciding how best to manage U.S. watersheds (De Steiguer, Duberstein, and Lopes, 2003), and the U. S. Army...many benefits to its use. Primarily these fall under the heading of managing chaos. Specifically, the AHP is a tool that can be used to simplify and...originally. The commonly used scenario is this: the waiter asks if you want chicken or fish, and you reply fish. The waiter then remembers that steak is

  19. Information Architecture for Quality Management Support in Hospitals.

    PubMed

    Rocha, Álvaro; Freixo, Jorge

    2015-10-01

    Quality Management occupies a strategic role in organizations, and the adoption of computer tools within an aligned information architecture facilitates the challenge of making more with less, promoting the development of a competitive edge and sustainability. A formal Information Architecture (IA) lends organizations an enhanced knowledge but, above all, favours management. This simplifies the reinvention of processes, the reformulation of procedures, bridging and the cooperation amongst the multiple actors of an organization. In the present investigation we planned the IA for the Quality Management System (QMS) of a Hospital, which allowed us to develop and implement the QUALITUS (QUALITUS, name of the computer application developed to support Quality Management in a Hospital Unit) computer application. This solution translated into significant gains for the Hospital Unit under study, accelerating the quality management process and reducing the tasks, the number of documents, the information to be filled in and information errors, amongst others.

  20. Gbm.auto: A software tool to simplify spatial modelling and Marine Protected Area planning

    PubMed Central

    Officer, Rick; Clarke, Maurice; Reid, David G.; Brophy, Deirdre

    2017-01-01

    Boosted Regression Trees: excellent for data-poor spatial management but hard to use. Marine resource managers and scientists often advocate spatial approaches to manage data-poor species. Existing spatial prediction and management techniques are either insufficiently robust, struggle with sparse input data, or make suboptimal use of multiple explanatory variables. Boosted Regression Trees feature excellent performance and are well suited to modelling the distribution of data-limited species, but are extremely complicated and time-consuming to learn and use, hindering access for a wide potential user base and therefore limiting uptake and usage. BRTs automated and simplified for accessible general use with a rich feature set: We have built a software suite in R which integrates pre-existing functions with new tailor-made functions to automate the processing and predictive mapping of species abundance data: by automating and greatly simplifying Boosted Regression Tree spatial modelling, the gbm.auto R package suite makes this powerful statistical modelling technique more accessible to potential users in the ecological and modelling communities. The package and its documentation allow the user to generate maps of predicted abundance, visualise the representativeness of those abundance maps and to plot the relative influence of explanatory variables and their relationship to the response variables. Databases of the processed model objects and a report explaining all the steps taken within the model are also generated. The package includes a previously unavailable Decision Support Tool which combines estimated escapement biomass (the percentage of an exploited population which must be retained each year to conserve it) with the predicted abundance maps to generate maps showing the location and size of habitat that should be protected to conserve the target stocks (candidate MPAs), based on stakeholder priorities, such as the minimisation of fishing effort displacement. Gbm.auto for management in various settings: By bridging the gap between advanced statistical methods for species distribution modelling and conservation science, management and policy, these tools can allow improved spatial abundance predictions, and therefore better management, decision-making, and conservation. Although this package was built to support spatial management of a data-limited marine elasmobranch fishery, it should be equally applicable to spatial abundance modelling, area protection, and stakeholder engagement in various scenarios. PMID:29216310
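
    gbm.auto itself is an R package, but the workflow it automates (fit boosted regression trees to sparse abundance records, predict over a grid, then rank cells against a conservation target) can be outlined in a few lines. The Python sketch below uses scikit-learn and entirely synthetic data; the column names, covariates and the 20% escapement-style target are assumptions for illustration only.

    ```python
    # Illustrative Python analogue of a BRT spatial-abundance workflow
    # (not the gbm.auto package itself); data and thresholds are synthetic.
    import numpy as np
    import pandas as pd
    from sklearn.ensemble import GradientBoostingRegressor

    rng = np.random.default_rng(0)

    # Hypothetical survey records: coordinates, two covariates, observed abundance
    n = 500
    survey = pd.DataFrame({
        "lon":   rng.uniform(-11, -5, n),
        "lat":   rng.uniform(51, 55, n),
        "depth": rng.uniform(20, 200, n),
        "temp":  rng.uniform(8, 14, n),
    })
    survey["abundance"] = (np.exp(-((survey.depth - 80) / 40) ** 2)
                           * (survey.temp - 7) + rng.normal(0, 0.1, n)).clip(0)

    features = ["lon", "lat", "depth", "temp"]
    brt = GradientBoostingRegressor(n_estimators=500, learning_rate=0.01,
                                    max_depth=3, subsample=0.75)
    brt.fit(survey[features], survey["abundance"])

    # Predict abundance over a regular grid (grid covariates are placeholders)
    lon, lat = np.meshgrid(np.linspace(-11, -5, 60), np.linspace(51, 55, 40))
    grid = pd.DataFrame({"lon": lon.ravel(), "lat": lat.ravel(),
                         "depth": rng.uniform(20, 200, lon.size),
                         "temp": rng.uniform(8, 14, lon.size)})
    grid["pred"] = brt.predict(grid[features])

    # Toy decision-support step: flag the highest-abundance cells until they
    # hold 20% of total predicted abundance (stand-in for escapement biomass).
    grid = grid.sort_values("pred", ascending=False)
    grid["protect"] = grid["pred"].cumsum() <= 0.20 * grid["pred"].sum()
    print(grid["protect"].sum(), "of", len(grid), "cells flagged as a candidate MPA")
    ```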

  1. ePORT, NASA's Computer Database Program for System Safety Risk Management Oversight (Electronic Project Online Risk Tool)

    NASA Technical Reports Server (NTRS)

    Johnson, Paul W.

    2008-01-01

    ePORT (electronic Project Online Risk Tool) provides a systematic approach to using an electronic database program to manage program/project risk management processes. This presentation will briefly cover the standard risk management procedures, then thoroughly cover NASA's Risk Management tool called ePORT. This electronic Project Online Risk Tool (ePORT) is a web-based risk management program that provides a common framework to capture and manage risks, independent of a program's/project's size and budget. It thoroughly covers the risk management paradigm, providing standardized evaluation criteria for common management reporting. ePORT improves Product Line, Center and Corporate Management insight, simplifies program/project manager reporting, and maintains an archive of data for historical reference.

  2. 75 FR 71376 - Simplified Network Application Processing System, On-Line Registration and Account Maintenance

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-11-23

    ...-02] RIN 0694-AE98 Simplified Network Application Processing System, On-Line Registration and Account...'') electronically via BIS's Simplified Network Application Processing (SNAP-R) system. Currently, parties must... Network Applications Processing System (SNAP-R) in October 2006. The SNAP-R system provides a Web based...

  3. A cluster-randomized controlled trial to evaluate the effects of a simplified cardiovascular management program in Tibet, China and Haryana, India: study design and rationale.

    PubMed

    Ajay, Vamadevan S; Tian, Maoyi; Chen, Hao; Wu, Yangfeng; Li, Xian; Dunzhu, Danzeng; Ali, Mohammed K; Tandon, Nikhil; Krishnan, Anand; Prabhakaran, Dorairaj; Yan, Lijing L

    2014-09-06

    In resource-poor areas of China and India, the cardiovascular disease burden is high, but availability of and access to quality healthcare is limited. Establishing a management scheme that utilizes the local infrastructure and builds healthcare capacity is essential for cardiovascular disease prevention and management. The study aims to develop, implement, and evaluate the feasibility and effectiveness of a simplified, evidence-based cardiovascular management program delivered by community healthcare workers in resource-constrained areas in Tibet, China and Haryana, India. This yearlong cluster-randomized controlled trial will be conducted in 20 villages in Tibet and 20 villages in Haryana. Randomization of villages to usual care or intervention will be stratified by country. High cardiovascular disease risk individuals (aged 40 years or older, history of heart disease, stroke, diabetes, or measured systolic blood pressure of 160 mmHg or higher) will be screened at baseline. Community health workers in the intervention villages will be trained to manage and follow up high-risk patients on a monthly basis following a simplified '2+2' intervention model involving two lifestyle recommendations and the appropriate prescription of two medications. A customized electronic decision support system based on the intervention strategy will be developed to assist the community health workers with patient management. Baseline and follow-up surveys will be conducted in a standardized fashion in all villages. The primary outcome will be the net difference between-group in the proportion of high-risk patients taking antihypertensive medication pre- and post-intervention. Secondary outcomes will include the proportion of patients taking aspirin and changes in blood pressure. Process and economic evaluations will also be conducted. To our knowledge, this will be the first study to evaluate the effect of a simplified management program delivered by community health workers with the help of electronic decision support system on improving the health of high cardiovascular disease risk patients. If effective, this intervention strategy can serve as a model that can be implemented, where applicable, in rural China, India, and other resource-constrained areas. The trial was registered in the clinicaltrials.gov database on 30 December, 2011 and the registration number is NCT01503814.

  4. [Medical record management and risk management processes. State of the art and new normative guidelines about the organization and the management of the sanitary documentation in the National Health System or Hospital Trusts].

    PubMed

    Spolaore, P; Murolo, G; Sommavilla, M

    2003-01-01

    Recent health care reforms, the start of accreditation processes for health institutions, the introduction into the health system of risk management concepts and instruments borrowed from enterprise culture, and the emphasis placed on the protection of privacy make evident the need and urgency to define and implement processes that improve the organization and management of medical documentation in hospitals, with the aim of facilitating fulfilment of regional and local health authority policies on the protection of safety and the improvement of quality of care. Currently, the regulatory context that governs the management of medical records inside the hospital appears somewhat fragmentary, incomplete and unable to clearly orient health operators toward a correct application of the norms in force, respecting the interests of the user and of the local health authority. In this work we identify the critical steps in the various phases of the clinical record management process and propose a new model of regulations, with the purpose of improving and simplifying the management processes and the modalities of compilation, conservation and release of all clinical documentation to entitled persons.

  5. Simplifying the construction of domain-specific automatic programming systems: The NASA automated software development workstation project

    NASA Technical Reports Server (NTRS)

    Allen, Bradley P.; Holtzman, Peter L.

    1987-01-01

    An overview is presented of the Automated Software Development Workstation Project, an effort to explore knowledge-based approaches to increasing software productivity. The project focuses on applying the concept of domain specific automatic programming systems (D-SAPSs) to application domains at NASA's Johnson Space Center. A version of a D-SAPS developed in Phase 1 of the project for the domain of space station momentum management is described. How problems encountered during its implementation led researchers to concentrate on simplifying the process of building and extending such systems is discussed. Researchers propose to do this by attacking three observed bottlenecks in the D-SAPS development process through the increased automation of the acquisition of programming knowledge and the use of an object oriented development methodology at all stages of the program design. How these ideas are being implemented in the Bauhaus, a prototype workstation for D-SAPS development is discussed.

  6. Simplifying the construction of domain-specific automatic programming systems: The NASA automated software development workstation project

    NASA Technical Reports Server (NTRS)

    Allen, Bradley P.; Holtzman, Peter L.

    1988-01-01

    An overview is presented of the Automated Software Development Workstation Project, an effort to explore knowledge-based approaches to increasing software productivity. The project focuses on applying the concept of domain specific automatic programming systems (D-SAPSs) to application domains at NASA's Johnson Space Center. A version of a D-SAPS developed in Phase 1 of the project for the domain of space station momentum management is described. How problems encountered during its implementation led researchers to concentrate on simplifying the process of building and extending such systems is discussed. Researchers propose to do this by attacking three observed bottlenecks in the D-SAPS development process through the increased automation of the acquisition of programming knowledge and the use of an object oriented development methodology at all stages of the program design. How these ideas are being implemented in the Bauhaus, a prototype workstation for D-SAPS development is discussed.

  7. Simplified lipid guidelines: Prevention and management of cardiovascular disease in primary care.

    PubMed

    Allan, G Michael; Lindblad, Adrienne J; Comeau, Ann; Coppola, John; Hudson, Brianne; Mannarino, Marco; McMinis, Cindy; Padwal, Raj; Schelstraete, Christine; Zarnke, Kelly; Garrison, Scott; Cotton, Candra; Korownyk, Christina; McCormack, James; Nickel, Sharon; Kolber, Michael R

    2015-10-01

    To develop clinical practice guidelines for a simplified approach to primary prevention of cardiovascular disease (CVD), concentrating on CVD risk estimation and lipid management for primary care clinicians and their teams; we sought increased contribution from primary care professionals with little or no conflict of interest and focused on the highest level of evidence available. Nine health professionals (4 family physicians, 2 internal medicine specialists, 1 nurse practitioner, 1 registered nurse, and 1 pharmacist) and 1 nonvoting member (pharmacist project manager) comprised the overarching Lipid Pathway Committee (LPC). Member selection was based on profession, practice setting, and location, and members disclosed any actual or potential conflicts of interest. The guideline process was iterative through online posting, detailed evidence review, and telephone and online meetings. The LPC identified 12 priority questions to be addressed. The Evidence Review Group answered these questions. After review of the answers, key recommendations were derived through consensus of the LPC. The guidelines were drafted, refined, and distributed to a group of clinicians (family physicians, other specialists, pharmacists, nurses, and nurse practitioners) and patients for feedback, then refined again and finalized by the LPC. Recommendations are provided on screening and testing, risk assessments, interventions, follow-up, and the role of acetylsalicylic acid in primary prevention. These simplified lipid guidelines provide practical recommendations for prevention and treatment of CVD for primary care practitioners. All recommendations are intended to assist with, not dictate, decision making in conjunction with patients. Copyright© the College of Family Physicians of Canada.

  8. 48 CFR 636.602-5 - Short selection processes for contracts not to exceed the simplified acquisition threshold.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... for contracts not to exceed the simplified acquisition threshold. 636.602-5 Section 636.602-5 Federal... not to exceed the simplified acquisition threshold. The short selection process described in FAR 36.602-5 is authorized for use for contracts not expected to exceed the simplified acquisition threshold...

  9. 48 CFR 1336.602-5 - Short selection process for contracts not to exceed the simplified acquisition threshold.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... for contracts not to exceed the simplified acquisition threshold. 1336.602-5 Section 1336.602-5... for contracts not to exceed the simplified acquisition threshold. (a) In contracts not expected to exceed the simplified acquisition threshold, either or both of the short selection processes set out at...

  10. "Simplify, Simplify."

    ERIC Educational Resources Information Center

    Stump, William P.

    1983-01-01

    An integrated electronic system combines individual monitoring and control functions into one economical unit that earns a rapid payback by automatically managing and controlling energy usage, building systems, and security and maintenance tasks. (MLF)

  11. A transfer function type of simplified electrochemical model with modified boundary conditions and Padé approximation for Li-ion battery: Part 1. lithium concentration estimation

    NASA Astrophysics Data System (ADS)

    Yuan, Shifei; Jiang, Lei; Yin, Chengliang; Wu, Hongjie; Zhang, Xi

    2017-06-01

    To guarantee the safety, high efficiency and long lifetime of a lithium-ion battery, an advanced battery management system requires a physics-meaningful yet computationally efficient battery model. The pseudo-two-dimensional (P2D) electrochemical model can provide physical information about the lithium concentration and potential distributions across the cell dimension. However, the extensive computation burden caused by the temporal and spatial discretization limits its real-time application. In this research, we propose a new simplified electrochemical model (SEM) by modifying the boundary conditions for the electrolyte diffusion equations, which significantly facilitates the analytical solving process. Then, to obtain a reduced-order transfer function, the Padé approximation method is adopted to simplify the derived transcendental impedance solution. The proposed model with the reduced-order transfer function is compact to compute and preserves physical meaning through the presence of parameters such as the solid/electrolyte diffusion coefficients (Ds & De) and the particle radius. The simulation illustrates that the proposed simplified model maintains high accuracy for electrolyte phase concentration (Ce) predictions, with 0.8% and 0.24% modeling error respectively, when compared to the rigorous model under 1C-rate pulse charge/discharge and urban dynamometer driving schedule (UDDS) profiles. Meanwhile, this simplified model yields a significantly reduced computational burden, which benefits its real-time application.
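
    The reduction step named in the abstract, replacing a transcendental transfer function by a low-order rational approximant, can be sketched with a stand-in function. In the Python below, G(s) = tanh(sqrt(s))/sqrt(s) is an assumed, generic diffusion-type transfer function used only for illustration, not the paper's impedance expression.

    ```python
    # Padé reduction of a transcendental, diffusion-type transfer function.
    # The function G(s) and the approximant order are illustrative choices.
    import numpy as np
    from scipy.interpolate import pade

    def G(s):
        rs = np.sqrt(s)
        return np.tanh(rs) / rs

    # Taylor coefficients of tanh(x)/x with x = sqrt(s): a series in integer powers of s
    coeffs = [1.0, -1/3, 2/15, -17/315, 62/2835, -1382/155925]

    # Padé approximant with denominator degree 2 (numerator degree 3)
    p, q = pade(coeffs, 2)

    # Compare the transcendental function and its rational approximation
    for s in [0.01, 0.1, 0.5, 1.0, 2.0]:
        print(f"s={s:4.2f}  exact={G(s):.6f}  pade={p(s) / q(s):.6f}")
    ```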

  12. Comparison of monoplex and duplex RT-PCR assays for the detection of measles virus.

    PubMed

    Binkhamis, Khalifa; Gillis, Hayley; Lafreniere, Joseph Daniel; Hiebert, Joanne; Mendoza, Lillian; Pettipas, Janice; Severini, Alberto; Hatchette, Todd F; LeBlanc, Jason J

    2017-01-01

    Rapid and accurate detection of measles virus is important for case diagnosis and public health management. This study compared the performance of two monoplex RT-PCR reactions targeting the H and N genes to a duplex RT-PCR targeting both genes simultaneously. The duplex simplified processing without compromising assay performance characteristics. Copyright © 2016 Elsevier B.V. All rights reserved.

  13. 48 CFR 736.602-5 - Short selection process for procurements not to exceed the simplified acquisition threshold.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 48 Federal Acquisition Regulations System 5 2010-10-01 2010-10-01 false Short selection process for procurements not to exceed the simplified acquisition threshold. 736.602-5 Section 736.602-5... selection process for procurements not to exceed the simplified acquisition threshold. References to FAR 36...

  14. CÆLIS: software for assimilation, management and processing data of an atmospheric measurement network

    NASA Astrophysics Data System (ADS)

    Fuertes, David; Toledano, Carlos; González, Ramiro; Berjón, Alberto; Torres, Benjamín; Cachorro, Victoria E.; de Frutos, Ángel M.

    2018-02-01

    Given the importance of the atmospheric aerosol, the number of instruments and measurement networks which focus on its characterization is growing. Many challenges derive from standardization of protocols, monitoring of instrument status to evaluate network data quality, and manipulation and distribution of large volumes of data (raw and processed). CÆLIS is a software system which aims at simplifying the management of a network, providing tools for monitoring the instruments, processing the data in real time and offering the scientific community a new tool to work with the data. Since 2008 CÆLIS has been successfully applied to the photometer calibration facility managed by the University of Valladolid, Spain, in the framework of the Aerosol Robotic Network (AERONET). Thanks to the use of advanced tools, this facility has been able to analyze a growing number of stations and data in real time, which greatly benefits the network management and data quality control. The present work describes the system architecture of CÆLIS and some examples of applications and data processing.

  15. Understanding the Data Complexity continuum to reduce data management costs and increase data usability through partnerships with the National Centers for Environmental Information

    NASA Astrophysics Data System (ADS)

    Mesick, S.; Weathers, K. W.

    2017-12-01

    Data complexity can be seen as a continuum from complex to simple. The term data complexity refers to data collections that are disorganized, poorly documented, and generally do not follow best data management practices. Complex data collections are challenging and expensive to manage. Simplified collections readily support automated archival processes, enhanced discovery and data access, as well as production of services that make data easier to reuse. In this session, NOAA NCEI scientific data stewards will discuss the data complexity continuum. This talk will explore data simplification concepts, methods, and tools that data managers can employ which may offer more control over data management costs and processes, while achieving policy goals for open data access and ready reuse. Topics will include guidance for data managers on best allocation of limited data management resources; models for partnering with NCEI to accomplish shared data management goals; and will demonstrate through case studies the benefits of investing in documentation, accessibility, and services to increase data value and return on investment.

  16. CELLFS: TAKING THE "DMA" OUT OF CELL PROGRAMMING

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    IONKOV, LATCHESAR A.; MIRTCHOVSKI, ANDREY A.; NYRHINEN, AKI M.

    In this paper we present a new programming model for the Cell BE architecture of scalar multiprocessors. We call this programming model CellFS. CellFS aims at simplifying the task of managing I/O between the local store of the processing units and main memory. The CellFS support library provides the means for transferring data via simple file I/O operations between the PPU and the SPU.

  17. Physiotherapists' experiences of the management of anterior cruciate ligament injuries.

    PubMed

    von Aesch, Arlene V; Perry, Meredith; Sole, Gisela

    2016-05-01

    While extensive research has been reported for management of anterior cruciate ligament (ACL) injuries, variation in treatment by physiotherapists is evident. To explore physiotherapists' experiences regarding ACL injury rehabilitation and factors that influenced physiotherapists' decision making for ACL rehabilitation, and to elicit what research physiotherapists perceived would support their management of these patients. Qualitative study. Fifteen physiotherapists from six private clinics in New Zealand participated in semi-structured interviews. The interviews were recorded, transcribed verbatim and the general inductive approach was used to develop key themes. Participants' management strengths were evident in their intent and commitment to provide expert rehabilitation, using a biopsychosocial approach and evidence-informed practice. The lengthy management process (including prolonged rehabilitation and referral processes) and interprofessional disconnect concerned participants. Translational research was needed for clear directions for exercise prescription and milestones for return to sports and occupation following ACL injury. Participants provided a biopsychosocial and evidence-based approach to ACL injury management. Potential areas of improvement include simplifying the referral process and enhancing communication between physiotherapists and other health professionals. Future research should focus on clarifying areas of ACL rehabilitation uncertainty, or collating results in an accessible and usable format for clinical practice. Copyright © 2015 Elsevier Ltd. All rights reserved.

  18. Transportable Applications Environment (TAE) Plus: A NASA tool for building and managing graphical user interfaces

    NASA Technical Reports Server (NTRS)

    Szczur, Martha R.

    1991-01-01

    The Transportable Applications Environment (TAE) Plus, developed at GSFC, is an advanced portable user interface development environment which simplifies the process of creating and managing complex application graphical user interfaces (GUI's), supports prototyping, allows applications to be ported easily between different platforms and encourages appropriate levels of user interface consistency between applications. The following topics are discussed: the capabilities of the TAE Plus tool; how the implementation has utilized state-of-the-art technologies within graphic workstations; and how it has been used both within and outside of NASA.

  19. Transfer of Perceptual Expertise: The Case of Simplified and Traditional Chinese Character Recognition

    ERIC Educational Resources Information Center

    Liu, Tianyin; Chuk, Tin Yim; Yeh, Su-Ling; Hsiao, Janet H.

    2016-01-01

    Expertise in Chinese character recognition is marked by reduced holistic processing (HP), which depends mainly on writing rather than reading experience. Here we show that, while simplified and traditional Chinese readers demonstrated a similar level of HP when processing characters shared between the simplified and traditional scripts, simplified…

  20. The determination of measures of software reliability

    NASA Technical Reports Server (NTRS)

    Maxwell, F. D.; Corn, B. C.

    1978-01-01

    Measurement of software reliability was carried out during the development of data base software for a multi-sensor tracking system. The failure ratio and failure rate were found to be consistent measures. Trend lines could be established from these measurements that provide good visualization of the progress on the job as a whole as well as on individual modules. Over one-half of the observed failures were due to factors associated with the individual run submission rather than with the code proper. Possible application of these findings for line management, project managers, functional management, and regulatory agencies is discussed. Steps for simplifying the measurement process and for use of these data in predicting operational software reliability are outlined.

  1. Reliability measurement during software development. [for a multisensor tracking system

    NASA Technical Reports Server (NTRS)

    Hecht, H.; Sturm, W. A.; Trattner, S.

    1977-01-01

    During the development of data base software for a multi-sensor tracking system, reliability was measured. The failure ratio and failure rate were found to be consistent measures. Trend lines were established from these measurements that provided good visualization of the progress on the job as a whole as well as on individual modules. Over one-half of the observed failures were due to factors associated with the individual run submission rather than with the code proper. Possible application of these findings for line management, project managers, functional management, and regulatory agencies is discussed. Steps for simplifying the measurement process and for use of these data in predicting operational software reliability are outlined.

  2. A Manpower Model for U.S. Navy Operational Contracting

    DTIC Science & Technology

    2012-06-01

    Accomplishment Time RFP Request For Proposal SAF/FM Air Force Financial Management SAP Simplified Acquisition Procedures SAT Simplified...conformance and seller’s release of claim (Garrett, 2007). 2. Contract Size and its Effect on Workload Simplified acquisition procedures ( SAP ) were...the SAP dollar threshold. 14 The drastic reduction in KO workload through the use of SAP is unmatched by any federal authorization that came

  3. Simplified pancreatoduodenectomy for complex blunt pancreaticoduodenal injury.

    PubMed

    Feng, Xin-Fu; Fan, Wei; Shi, Cheng-Xian; Li, Jun-Hua; Liu, Jun; Liu, Zhen-Hua

    2013-01-01

    A 34-year-old man was admitted to our department with a complex blunt pancreaticoduodenal injury after a car accident. The wall of the first, second, and third portions of the duodenum was extensively lacerated, and the pancreas was longitudinally transected along the superior mesenteric vein-portal vein trunk. The pancreatic head and the uncinate process were devitalized, and the distal common bile duct and the proximal main pancreatic duct were completely detached from the ampulla of Vater. The stump of the distal common bile duct at the cut surface of the remnant pancreas was approximately 0.6 cm long. A simplified Kausch-Whipple procedure was performed after debridement of the devitalized pancreatic head and resection of the damaged duodenum, in which the stump of the distal common bile duct and the pancreatic remnant were embedded into the jejunal loop. A postoperative wound abscess developed, which eventually resolved with conservative treatment. During 16 months of follow-up the patient has remained stable and healthy. A simplified pancreaticoduodenectomy is a safe alternative to the Whipple procedure for managing complex pancreaticoduodenal injury in a hemodynamically stable patient.

  4. 48 CFR 836.602-5 - Short selection process for contracts not to exceed the simplified acquisition threshold.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... for contracts not to exceed the simplified acquisition threshold. 836.602-5 Section 836.602-5 Federal... contracts not to exceed the simplified acquisition threshold. Either of the procedures provided in FAR 36... simplified acquisition threshold. ...

  5. Service management at CERN with Service-Now

    NASA Astrophysics Data System (ADS)

    Toteva, Z.; Alvarez Alonso, R.; Alvarez Granda, E.; Cheimariou, M.-E.; Fedorko, I.; Hefferman, J.; Lemaitre, S.; Clavo, D. Martin; Martinez Pedreira, P.; Pera Mira, O.

    2012-12-01

    The Information Technology (IT) and the General Services (GS) departments at CERN have decided to combine their extensive experience in support for IT and non-IT services towards a common goal - to bring the services closer to the end user based on Information Technology Infrastructure Library (ITIL) best practice. The collaborative efforts have so far produced definitions for the incident and the request fulfilment processes which are based on a unique two-dimensional service catalogue that combines both the user and the support team views of all services. After an extensive evaluation of the available industrial solutions, Service-now was selected as the tool to implement the CERN Service Management processes. The initial release of the tool provided an attractive web portal for the users and successfully implemented two basic ITIL processes: incident management and request fulfilment. It also integrated with the CERN personnel databases and the LHC GRID ticketing system. Subsequent releases continued to integrate with other third-party tools, such as the facility management systems of CERN, and to implement new processes such as change management. Independently of those new development activities, it was decided to simplify the request fulfilment process in order to achieve easier acceptance by the CERN user community. We believe that, due to the high modularity of the Service-now tool, the parallel design of ITIL processes (e.g., event management) and non-ITIL processes (e.g., computer centre hardware management) will be easily achieved. This presentation will describe the experience that we have acquired and the techniques that were followed to achieve the CERN customization of the Service-now tool.

  6. The Architecture Design of Detection and Calibration System for High-voltage Electrical Equipment

    NASA Astrophysics Data System (ADS)

    Ma, Y.; Lin, Y.; Yang, Y.; Gu, Ch; Yang, F.; Zou, L. D.

    2018-01-01

    With the construction of the Material Quality Inspection Center of the Shandong electric power company, the Electric Power Research Institute has taken on more quality analysis and laboratory calibration work for high-voltage electrical equipment, making informatization increasingly urgent. In this paper we design a consolidated system that implements electronic management and online process automation for material sampling, test apparatus detection and field testing. For these three jobs we use QR code scanning, online Word editing and electronic signatures. These techniques simplify the complex processes of warehouse management and test report transfer, and largely reduce manual procedures. The construction of the standardized detection information platform realizes the integrated management of high-voltage electrical equipment from networking and operation through to periodic detection. According to the system operation evaluation, the speed of report transfer has doubled, and querying data is also easier and faster.

  7. 48 CFR 436.602-5 - Short selection process for contracts not to exceed the simplified acquisition threshold.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 48 Federal Acquisition Regulations System 4 2012-10-01 2012-10-01 false Short selection process for contracts not to exceed the simplified acquisition threshold. 436.602-5 Section 436.602-5 Federal... to exceed the simplified acquisition threshold. The HCA may include either or both procedures in FAR...

  8. Heuristics in Managing Complex Clinical Decision Tasks in Experts’ Decision Making

    PubMed Central

    Islam, Roosan; Weir, Charlene; Del Fiol, Guilherme

    2016-01-01

    Background: Clinical decision support is a tool to help experts make optimal and efficient decisions. However, little is known about the high-level abstractions in experts' thinking processes. Objective: The objective of the study is to understand how clinicians manage complexity while dealing with complex clinical decision tasks. Method: After approval from the Institutional Review Board (IRB), three clinical experts were interviewed, and the transcripts from these interviews were analyzed. Results: We found five broad categories of strategies used by experts for managing complex clinical decision tasks: decision conflict, mental projection, decision trade-offs, managing uncertainty and generating rules of thumb. Conclusion: Complexity is created by decision conflicts, mental projection, limited options and treatment uncertainty. Experts cope with complexity in a variety of ways, including using efficient and fast decision strategies to simplify complex decision tasks, mentally simulating outcomes and focusing on only the most relevant information. Application: Understanding complex decision-making processes can help designers allocate support according to task complexity in clinical decision support design. PMID:27275019

  9. Heuristics in Managing Complex Clinical Decision Tasks in Experts' Decision Making.

    PubMed

    Islam, Roosan; Weir, Charlene; Del Fiol, Guilherme

    2014-09-01

    Clinical decision support is a tool to help experts make optimal and efficient decisions. However, little is known about the high-level abstractions in experts' thinking processes. The objective of the study is to understand how clinicians manage complexity while dealing with complex clinical decision tasks. After approval from the Institutional Review Board (IRB), three clinical experts were interviewed, and the transcripts from these interviews were analyzed. We found five broad categories of strategies used by experts for managing complex clinical decision tasks: decision conflict, mental projection, decision trade-offs, managing uncertainty and generating rules of thumb. Complexity is created by decision conflicts, mental projection, limited options and treatment uncertainty. Experts cope with complexity in a variety of ways, including using efficient and fast decision strategies to simplify complex decision tasks, mentally simulating outcomes and focusing on only the most relevant information. Understanding complex decision-making processes can help designers allocate support according to task complexity in clinical decision support design.

  10. The Protected Areas Visitor Impact Management (PAVIM) framework: A simplified process for making management decisions

    USGS Publications Warehouse

    Farrell, T.A.; Marion, J.L.

    2002-01-01

    Ecotourism and protected area visitation in Central and South America have resulted in ecological impacts, which some protected areas managers have addressed by employing visitor impact management frameworks. In this paper, we propose the Protected Area Visitor Impact Management (PAVIM) framework as an alternative to carrying capacity and other frameworks such as Limits of Acceptable Change. We use a set of evaluation criteria to compare the relative positive and negative attributes of carrying capacity, other decision-making frameworks and the new framework, within the context of their actual and potential use in Central and South America. Positive attributes of PAVIM include simplicity, flexibility, cost effectiveness, timeliness, and incorporating input from stakeholders and local residents. Negative attributes include diminished objectivity and cultural sensitivity issues. Further research and application of PAVIM are recommended.

  11. 48 CFR 1436.602-5 - Short selection processes for contracts not to exceed the simplified acquisition threshold.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... for contracts not to exceed the simplified acquisition threshold. 1436.602-5 Section 1436.602-5... for contracts not to exceed the simplified acquisition threshold. At each occurrence, CO approval...-engineer contracts not expected to exceed the simplified acquisition threshold. ...

  12. An evaluation of a risk-based environmental regulation in Brazil: Limitations to risk management of hazardous installations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Naime, Andre, E-mail: andre.naime.ibama@gmail.com

    The environmental regulation of hazardous projects with risk-based decision-making processes can lead to a deficient management of human exposure to technological hazards. Such an approach for regulation is criticized for simplifying the complexity of decisions involving the economic, social, and environmental aspects of the installation and operation of hazardous facilities in urban areas. Results of a Brazilian case study indicate that oil and gas transmission pipelines may represent a threat to diverse communities if the relationship between such linear projects and human populations is overlooked by regulatory bodies. Results also corroborate known challenges to the implementation of EIA processes and outline limitations to an effective environmental and risk management. Two preliminary topics are discussed to strengthen similar regulatory practices. Firstly, an effective integration between social impact assessment and risk assessment in EIA processes to have a more comprehensive understanding of the social fabric. Secondly, the advancement of traditional management practices for hazardous installations to pursue a strong transition from assessment and evaluation to management and control and to promote an effective interaction between land-use planning and environmental regulation.

  13. Simplified lipid guidelines

    PubMed Central

    Allan, G. Michael; Lindblad, Adrienne J.; Comeau, Ann; Coppola, John; Hudson, Brianne; Mannarino, Marco; McMinis, Cindy; Padwal, Raj; Schelstraete, Christine; Zarnke, Kelly; Garrison, Scott; Cotton, Candra; Korownyk, Christina; McCormack, James; Nickel, Sharon; Kolber, Michael R.

    2015-01-01

    Objective: To develop clinical practice guidelines for a simplified approach to primary prevention of cardiovascular disease (CVD), concentrating on CVD risk estimation and lipid management for primary care clinicians and their teams; we sought increased contribution from primary care professionals with little or no conflict of interest and focused on the highest level of evidence available. Methods: Nine health professionals (4 family physicians, 2 internal medicine specialists, 1 nurse practitioner, 1 registered nurse, and 1 pharmacist) and 1 nonvoting member (pharmacist project manager) comprised the overarching Lipid Pathway Committee (LPC). Member selection was based on profession, practice setting, and location, and members disclosed any actual or potential conflicts of interest. The guideline process was iterative through online posting, detailed evidence review, and telephone and online meetings. The LPC identified 12 priority questions to be addressed. The Evidence Review Group answered these questions. After review of the answers, key recommendations were derived through consensus of the LPC. The guidelines were drafted, refined, and distributed to a group of clinicians (family physicians, other specialists, pharmacists, nurses, and nurse practitioners) and patients for feedback, then refined again and finalized by the LPC. Recommendations: Recommendations are provided on screening and testing, risk assessments, interventions, follow-up, and the role of acetylsalicylic acid in primary prevention. Conclusion: These simplified lipid guidelines provide practical recommendations for prevention and treatment of CVD for primary care practitioners. All recommendations are intended to assist with, not dictate, decision making in conjunction with patients. PMID:26472792

  14. Cloud computing can simplify HIT infrastructure management.

    PubMed

    Glaser, John

    2011-08-01

    Software as a Service (SaaS), built on cloud computing technology, is emerging as the forerunner in IT infrastructure because it helps healthcare providers reduce capital investments. Cloud computing leads to predictable, monthly, fixed operating expenses for hospital IT staff. Outsourced cloud computing facilities are state-of-the-art data centers boasting some of the most sophisticated networking equipment on the market. The SaaS model helps hospitals safeguard against technology obsolescence, minimizes maintenance requirements, and simplifies management.

  15. A Framework for Categorizing Important Project Variables

    NASA Technical Reports Server (NTRS)

    Parsons, Vickie S.

    2003-01-01

    While substantial research has led to theories concerning the variables that affect project success, no universal set of such variables has been acknowledged as the standard. The identification of a specific set of controllable variables is needed to minimize project failure. Much has been hypothesized about the need to match project controls and management processes to individual projects in order to increase the chance for success. However, an accepted taxonomy for facilitating this matching process does not exist. This paper surveyed existing literature on classification of project variables. After an analysis of those proposals, a simplified categorization is offered to encourage further research.

  16. Managing coherence via put/get windows

    DOEpatents

    Blumrich, Matthias A [Ridgefield, CT; Chen, Dong [Croton on Hudson, NY; Coteus, Paul W [Yorktown Heights, NY; Gara, Alan G [Mount Kisco, NY; Giampapa, Mark E [Irvington, NY; Heidelberger, Philip [Cortlandt Manor, NY; Hoenicke, Dirk [Ossining, NY; Ohmacht, Martin [Yorktown Heights, NY

    2011-01-11

    A method and apparatus for managing coherence between two processors of a two processor node of a multi-processor computer system. Generally the present invention relates to a software algorithm that simplifies and significantly speeds the management of cache coherence in a message passing parallel computer, and to hardware apparatus that assists this cache coherence algorithm. The software algorithm uses the opening and closing of put/get windows to coordinate the activities required to achieve cache coherence. The hardware apparatus may be an extension to the hardware address decode, that creates, in the physical memory address space of the node, an area of virtual memory that (a) does not actually exist, and (b) is therefore able to respond instantly to read and write requests from the processing elements.
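
    To make the put/get-window idea above more concrete, the toy Python sketch below models one node whose local processor caches its memory while a remote node writes (put) and reads (get) through windows; the coherence work is concentrated at window open and close. All class and method names are illustrative assumptions, not the patented hardware or software interface.

      class NodeMemory:
          """One node: shared memory plus the local processor's software-managed cache."""

          def __init__(self):
              self.memory = {}        # node memory, visible to both processors
              self.cache = {}         # local processor's cached copies
              self.pending_puts = {}  # remote writes buffered while a put window is open

          # Local processor accesses go through the cache.
          def local_write(self, addr, value):
              self.cache[addr] = value

          def local_read(self, addr):
              return self.cache.get(addr, self.memory.get(addr))

          # Put window: a remote node writes into this node's memory.
          def open_put_window(self):
              self.pending_puts.clear()

          def remote_put(self, addr, value):
              self.pending_puts[addr] = value

          def close_put_window(self):
              self.memory.update(self.pending_puts)
              for addr in self.pending_puts:       # invalidate stale cached copies
                  self.cache.pop(addr, None)
              self.pending_puts.clear()

          # Get window: a remote node reads from this node's memory.
          def open_get_window(self):
              self.memory.update(self.cache)       # write back (flush) cached values

          def remote_get(self, addr):
              return self.memory.get(addr)

      node = NodeMemory()
      node.local_write("x", 1)
      node.open_get_window()                       # flush, so the remote get sees x == 1
      assert node.remote_get("x") == 1
      node.open_put_window(); node.remote_put("x", 2); node.close_put_window()
      assert node.local_read("x") == 2             # stale cached copy was invalidated

    The point of the sketch is only that flush and invalidate work is tied to the opening and closing of communication windows rather than to every memory access, which is what lets the software algorithm keep coherence management cheap.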

  17. Managing coherence via put/get windows

    DOEpatents

    Blumrich, Matthias A [Ridgefield, CT; Chen, Dong [Croton on Hudson, NY; Coteus, Paul W [Yorktown Heights, NY; Gara, Alan G [Mount Kisco, NY; Giampapa, Mark E [Irvington, NY; Heidelberger, Philip [Cortlandt Manor, NY; Hoenicke, Dirk [Ossining, NY; Ohmacht, Martin [Yorktown Heights, NY

    2012-02-21

    A method and apparatus for managing coherence between two processors of a two processor node of a multi-processor computer system. Generally the present invention relates to a software algorithm that simplifies and significantly speeds the management of cache coherence in a message passing parallel computer, and to hardware apparatus that assists this cache coherence algorithm. The software algorithm uses the opening and closing of put/get windows to coordinate the activities required to achieve cache coherence. The hardware apparatus may be an extension to the hardware address decode, that creates, in the physical memory address space of the node, an area of virtual memory that (a) does not actually exist, and (b) is therefore able to respond instantly to read and write requests from the processing elements.

  18. Transportable Applications Environment (TAE) Plus: A NASA tool for building and managing graphical user interfaces

    NASA Technical Reports Server (NTRS)

    Szczur, Martha R.

    1993-01-01

    The Transportable Applications Environment (TAE) Plus, developed at NASA's Goddard Space Flight Center, is an advanced portable user interface development environment which simplifies the process of creating and managing complex application graphical user interfaces (GUI's). TAE Plus supports the rapid prototyping of GUI's and allows applications to be ported easily between different platforms. This paper will discuss the capabilities of the TAE Plus tool, and how it makes the job of designing and developing GUI's easier for application developers. TAE Plus is being applied to many types of applications, and this paper discusses how it has been used both within and outside NASA.

  19. Use of software engineering techniques in the design of the ALEPH data acquisition system

    NASA Astrophysics Data System (ADS)

    Charity, T.; McClatchey, R.; Harvey, J.

    1987-08-01

    The SASD methodology is being used to provide a rigorous design framework for various components of the ALEPH data acquisition system. The Entity-Relationship data model is used to describe the layout and configuration of the control and acquisition systems and detector components. State Transition Diagrams are used to specify control applications such as run control and resource management and Data Flow Diagrams assist in decomposing software tasks and defining interfaces between processes. These techniques encourage rigorous software design leading to enhanced functionality and reliability. Improved documentation and communication ensures continuity over the system life-cycle and simplifies project management.

  20. Indiva: a middleware for managing distributed media environment

    NASA Astrophysics Data System (ADS)

    Ooi, Wei-Tsang; Pletcher, Peter; Rowe, Lawrence A.

    2003-12-01

    This paper presents a unified set of abstractions and operations for hardware devices, software processes, and media data in a distributed audio and video environment. These abstractions, which are provided through a middleware layer called Indiva, use a file system metaphor to access resources and high-level commands to simplify the development of Internet webcast and distributed collaboration control applications. The design and implementation of Indiva are described and examples are presented to illustrate the usefulness of the abstractions.

  1. Triple Play over Satellite, Ka-Band Making the Difference

    NASA Astrophysics Data System (ADS)

    Benoit, Guillaume; Fenech, Hector; Pezzana, Stefano

    Over recent years a number of operators have been deploying satellite-based consumer internet access services to reduce the digital divide and capture the market of households not covered by ADSL, cable or wireless broadband. These operators are proposing a step-change improvement in the economics of consumer service, with lower terminal costs, broadband access with monthly fees comparable to ADSL, and an integrated technology that simplifies the process of terminal installation, provisioning and management.

  2. Simplifying and upscaling water resources systems models that combine natural and engineered components

    NASA Astrophysics Data System (ADS)

    McIntyre, N.; Keir, G.

    2014-12-01

    Water supply systems typically encompass components of both natural systems (e.g. catchment runoff, aquifer interception) and engineered systems (e.g. process equipment, water storages and transfers). Many physical processes of varying spatial and temporal scales are contained within these hybrid systems models. The need to aggregate and simplify system components has been recognised for reasons of parsimony and comprehensibility; and the use of probabilistic methods for modelling water-related risks also prompts the need to seek computationally efficient up-scaled conceptualisations. How to manage the up-scaling errors in such hybrid systems models has not been well-explored, compared to research in the hydrological process domain. Particular challenges include the non-linearity introduced by decision thresholds and non-linear relations between water use, water quality, and discharge strategies. Using a case study of a mining region, we explore the nature of up-scaling errors in water use, water quality and discharge, and we illustrate an approach to identification of a scale-adjusted model including an error model. Ways forward for efficient modelling of such complex, hybrid systems are discussed, including interactions with human, energy and carbon systems models.
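
    As a minimal, purely illustrative example of the threshold non-linearity mentioned above (hypothetical numbers, not from the case study): applying a discharge threshold to daily inflows and then aggregating gives a different answer from applying it to the aggregated inflow, because max(x - c, 0) is convex.

      import numpy as np

      rng = np.random.default_rng(0)
      daily_inflow = rng.gamma(shape=2.0, scale=50.0, size=30)  # ML/day, synthetic
      threshold = 120.0                                         # ML/day discharge trigger

      # Fine-scale model: threshold applied day by day, then summed.
      fine_scale = np.maximum(daily_inflow - threshold, 0.0).sum()
      # Up-scaled model: threshold applied to the mean inflow over the period.
      up_scaled = max(daily_inflow.mean() - threshold, 0.0) * daily_inflow.size

      print(f"fine-scale total discharge: {fine_scale:8.1f} ML")
      print(f"up-scaled total discharge:  {up_scaled:8.1f} ML")
      # The up-scaled model under-estimates discharge whenever inflow fluctuates about
      # the threshold, which is one source of the up-scaling error discussed above.

    A scale-adjusted model with an explicit error term, as proposed in the abstract, is one way to correct for this kind of systematic bias.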

  3. 26 CFR 1.199-4 - Costs allocable to domestic production gross receipts.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... using the simplified deduction method. Paragraph (f) of this section provides a small business... taxpayer for internal management or other business purposes; whether the method is used for other Federal... than a taxpayer that uses the small business simplified overall method of paragraph (f) of this section...

  4. 77 FR 5228 - Summer Food Service Program; 2012 Reimbursement Rates

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-02-02

    ... of rates to highlight simplified cost accounting procedures. The 2012 rates are also presented... review by the Office of Management and Budget under Executive Order 12866. Definitions The terms used in... reimbursement rates are presented as a combined set of rates to highlight simplified cost accounting procedures...

  5. Experience of Data Handling with IPPM Payload

    NASA Astrophysics Data System (ADS)

    Errico, Walter; Tosi, Pietro; Ilstad, Jorgen; Jameux, David; Viviani, Riccardo; Collantoni, Daniele

    2010-08-01

    A simplified On-Board Data Handling system has been developed by CAEN AURELIA SPACE and ABSTRAQT as a PUS-over-SpaceWire demonstration platform for the Onboard Payload Data Processing laboratory at ESTEC. The system is composed of three Leon2-based IPPM (Integrated Payload Processing Module) computers that play the roles of Instrument, Payload Data Handling Unit and Satellite Management Unit. Two PCs complete the test set-up, simulating an external Memory Management Unit and the Ground Control Unit. Communication among units takes place primarily through SpaceWire links; the RMAP[2] protocol is used for configuration and housekeeping. A limited implementation of the ECSS-E-70-41B Packet Utilisation Standard (PUS)[1] over CANbus and MIL-STD-1553B has also been realized. The open-source RTEMS runs on the IPPM AT697E CPU as the real-time operating system.

  6. Managing Underground Storage Tank Data Using dBASE III PLUS.

    DTIC Science & Technology

    1987-06-01

    create a more user friendly environment that will simplify the process of using the UST data. For example, using the Assistant facility a user can delete... for the novice user. Approved for public release; distribution is unlimited. The contents of this report are not to be used for advertising... FOREWORD The programs which this user's manual documents were developed for the Office of the

  7. Managing coherence via put/get windows

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Blumrich, Matthias A; Chen, Dong; Coteus, Paul W

    A method and apparatus for managing coherence between two processors of a two processor node of a multi-processor computer system. Generally the present invention relates to a software algorithm that simplifies and significantly speeds the management of cache coherence in a message passing parallel computer, and to hardware apparatus that assists this cache coherence algorithm. The software algorithm uses the opening and closing of put/get windows to coordinate the activities required to achieve cache coherence. The hardware apparatus may be an extension to the hardware address decode, that creates, in the physical memory address space of the node, an area of virtual memory that (a) does not actually exist, and (b) is therefore able to respond instantly to read and write requests from the processing elements.

  8. Strategic Planning: A Practical Primer for the Healthcare Provider: Part I.

    PubMed

    Baum, Neil; Brockmann, Erich N; Lacho, Kenneth J

    2016-01-01

    Entrepreneurs are known for opportunity recognition--that is, "How can I start a business to make money from this opportunity?" However, once a commercial entity is formed to take advantage of an opportunity, the leadership priority shifts from entrepreneurial to strategic. A strategic perspective leverages limited resources to position a business for future success relative to rivals in a competitive environment. Often, the talents needed for one priority are not the same as those needed for the other. This article, the first part of a two-part article, intends to simplify the transition from an entrepreneurial to a strategic focus. It walks an entrepreneur through the strategic management planning process using a fictional business. The various tasks in the process (i.e., mission, vision, internal analysis, external analysis) are illustrated with examples from a typical primary physician's private practice. The examples show how the strategic management tasks are interrelated and ultimately lead to a philosophical approach to managing a business.

  9. Generalized Cartographic and Simultaneous Representation of Utility Networks for Decision-Support Systems and Crisis Management in Urban Environments

    NASA Astrophysics Data System (ADS)

    Becker, T.; König, G.

    2015-10-01

    Cartographic visualizations of crises are used to create a Common Operational Picture (COP) and enforce Situational Awareness by presenting relevant information to the involved actors. As nearly all crises affect geospatial entities, geo-data representations have to support location-specific analysis throughout the decision-making process. Meaningful cartographic presentation is needed for coordinating the activities of crisis managers in a highly dynamic situation, since operators' attention span and their spatial memories are limiting factors during the perception and interpretation process. Situational Awareness of operators, in conjunction with a COP, is a key aspect of the decision-making process and essential for making well thought-out and appropriate decisions. Although utility networks are among the most complex and most frequently required systems in urban environments, a meaningful cartographic presentation of multiple utility networks with respect to disaster management does not exist. Therefore, an optimized visualization of utility infrastructure for emergency response procedures is proposed. The article will describe a conceptual approach on how to simplify, aggregate, and visualize multiple utility networks and their components to meet the requirements of the decision-making process and to support Situational Awareness.

  10. Probabilistic framework for product design optimization and risk management

    NASA Astrophysics Data System (ADS)

    Keski-Rahkonen, J. K.

    2018-05-01

    Probabilistic methods have gradually gained ground within engineering practice, but it is currently still the industry standard to use deterministic safety-margin approaches for dimensioning components and qualitative methods to manage product risks. These methods are suitable for baseline design work, but quantitative risk management and product reliability optimization require more advanced predictive approaches. Ample research has been published on how to predict failure probabilities for mechanical components and, furthermore, on how to optimize reliability through life cycle cost analysis. This paper reviews the literature for existing methods and tries to harness their best features and simplify the process to be applicable in practical engineering work. The recommended process applies the Monte Carlo method on top of load-resistance models to estimate failure probabilities. Furthermore, it adds to the existing literature by introducing a practical framework for using probabilistic models in quantitative risk management and product life cycle cost optimization. The main focus is on mechanical failure modes because of the well-developed methods used to predict these types of failures. However, the same framework can be applied to any type of failure mode as long as predictive models can be developed.
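
    A minimal sketch of the load-resistance Monte Carlo step described above: the failure probability is estimated as the fraction of samples in which the sampled load exceeds the sampled resistance, and that probability then feeds a simple expected-cost figure. Distributions, parameters and costs are illustrative assumptions, not values from the paper.

      import numpy as np

      rng = np.random.default_rng(42)
      n = 1_000_000

      load = rng.normal(loc=300.0, scale=40.0, size=n)            # applied stress, MPa
      resistance = rng.lognormal(mean=6.0, sigma=0.08, size=n)    # component strength, MPa

      p_fail = np.mean(load > resistance)                         # P(load > resistance)
      expected_failure_cost = p_fail * 250_000                    # illustrative per-unit failure cost

      print(f"estimated failure probability: {p_fail:.4f}")
      print(f"expected failure cost per unit: {expected_failure_cost:,.0f}")

    Repeating the estimate for alternative designs (i.e. different resistance distributions) is the basis for the life cycle cost optimization the framework aims at.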

  11. Radioactive waste management treatments: A selection for the Italian scenario

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Locatelli, G.; Mancini, M.; Sardini, M.

    2012-07-01

    The increased attention to radioactive waste management is one of the most distinctive aspects of the nuclear sector, considering both reactor and non-power sources. The aim of this paper is to present the state of the art of treatments for radioactive waste management around the world in order to derive guidelines for radioactive waste management in the Italian scenario. Starting with an overview of the international situation, it analyses the different sources, amounts, treatments, and social and economic impacts, looking at countries with different industrial backgrounds, energy policies, geography and population. It lists all these treatments and selects the most reasonable according to technical, economic and social criteria. In particular, a double scenario is discussed (to be considered in the case of small quantities of nuclear waste): the use of regional, centralized, off-site processing facilities, which accept waste from many nuclear plants, and the use of mobile systems, which can be transported among multiple nuclear sites for processing campaigns. At the end, the treatments suitable for the Italian scenario are presented, providing simplified work-flows and guidelines. (authors)

  12. How to unlock the benefits of MRP (materiel requirements planning) II and Just-in-Time.

    PubMed

    Jacobi, M A

    1994-05-01

    Manufacturing companies need to use the best and most applicable parts of MRP II and JIT to run their businesses effectively. MRP II provides the methodology to plan and control the total resources of the company and focuses on the processes that add value to their customers' products. It is the cornerstone of total quality management, as it reduces the variability and costly activities in the communication and subsequent execution of the required steps from customer order to shipment. JIT focuses on simplifying the total business operation and execution of business processes. MRP II and JIT are the foundations for successful manufacturing businesses.

  13. Potential of Laboratory Execution Systems (LESs) to Simplify the Application of Business Process Management Systems (BPMSs) in Laboratory Automation.

    PubMed

    Neubert, Sebastian; Göde, Bernd; Gu, Xiangyu; Stoll, Norbert; Thurow, Kerstin

    2017-04-01

    Modern business process management (BPM) is increasingly interesting for laboratory automation. End-to-end workflow automation and improved top-level systems integration for information technology (IT) and automation systems are especially prominent objectives. With the ISO Standard Business Process Model and Notation (BPMN) 2.X, a system-independent graphical process control notation, accepted across disciplines, is provided, allowing process analysis while also being executable. The transfer of BPM solutions to structured laboratory automation places novel demands, for example concerning real-time-critical process and systems integration. The article discusses the potential of laboratory execution systems (LESs) for an easier implementation of a business process management system (BPMS) in hierarchical laboratory automation. In particular, complex application scenarios, including long process chains based on, for example, several distributed automation islands and mobile laboratory robots for material transport, are difficult to handle in BPMSs. The presented approach deals with the displacement of workflow control tasks into life-science-specialized LESs, the reduction of the numerous different interfaces between BPMSs and subsystems, and the simplification of complex process modeling. Thus, the integration effort for complex laboratory workflows can be significantly reduced for strictly structured automation solutions. An example application, consisting of a mixture of manual and automated subprocesses, is demonstrated by the presented BPMS-LES approach.
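
    The following Python sketch illustrates the division of labour argued for above: the BPMS keeps only a coarse end-to-end workflow, while device-level sequencing is displaced into an LES behind a single call. Every class, method and step name here is a hypothetical placeholder, not an API of any actual BPMS or LES product.

      class LaboratoryExecutionSystem:
          """Hides device-level sequencing (transport, dispensing, readout) behind one call."""

          def run_subprocess(self, name, sample_id):
              steps = {
                  "prepare_and_measure": ["move sample to automation island A",
                                          "dispense reagent",
                                          "incubate 10 min",
                                          "read absorbance"],
              }
              for step in steps[name]:
                  print(f"[LES]  {sample_id}: {step}")
              return {"sample": sample_id, "absorbance": 0.42}   # dummy measurement

      class BPMSWorkflow:
          """Coarse BPMN-style process: one task per LES subprocess, not per device step."""

          def __init__(self, les):
              self.les = les

          def execute(self, sample_id):
              print(f"[BPMS] start workflow for {sample_id}")
              result = self.les.run_subprocess("prepare_and_measure", sample_id)
              print(f"[BPMS] archive result {result}")

      BPMSWorkflow(LaboratoryExecutionSystem()).execute("S-0001")

    With this split, the BPMS-to-subsystem interface shrinks to one LES call per subprocess, which is the reduction of interfaces and modeling effort the article argues for.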

  14. 29 CFR 403.4 - Simplified annual reports for smaller labor organizations.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ..., DEPARTMENT OF LABOR LABOR-MANAGEMENT STANDARDS LABOR ORGANIZATION ANNUAL FINANCIAL REPORTS § 403.4 Simplified... revocation of the privileges as provided in section 208 of the Act, to file the annual financial report... the privileges as provided in section 208 of the Act, to file the annual financial report called for...

  15. 48 CFR 246.404 - Government contract quality assurance for acquisitions at or below the simplified acquisition...

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 48 Federal Acquisition Regulations System 3 2014-10-01 2014-10-01 false Government contract quality assurance for acquisitions at or below the simplified acquisition threshold. 246.404 Section 246.404 Federal Acquisition Regulations System DEFENSE ACQUISITION REGULATIONS SYSTEM, DEPARTMENT OF DEFENSE CONTRACT MANAGEMENT QUALITY ASSURANCE...

  16. 48 CFR 246.404 - Government contract quality assurance for acquisitions at or below the simplified acquisition...

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 48 Federal Acquisition Regulations System 3 2010-10-01 2010-10-01 false Government contract quality assurance for acquisitions at or below the simplified acquisition threshold. 246.404 Section 246.404 Federal Acquisition Regulations System DEFENSE ACQUISITION REGULATIONS SYSTEM, DEPARTMENT OF DEFENSE CONTRACT MANAGEMENT QUALITY ASSURANCE...

  17. 48 CFR 246.404 - Government contract quality assurance for acquisitions at or below the simplified acquisition...

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 48 Federal Acquisition Regulations System 3 2011-10-01 2011-10-01 false Government contract quality assurance for acquisitions at or below the simplified acquisition threshold. 246.404 Section 246.404 Federal Acquisition Regulations System DEFENSE ACQUISITION REGULATIONS SYSTEM, DEPARTMENT OF DEFENSE CONTRACT MANAGEMENT QUALITY ASSURANCE...

  18. 48 CFR 46.404 - Government contract quality assurance for acquisitions at or below the simplified acquisition...

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 48 Federal Acquisition Regulations System 1 2011-10-01 2011-10-01 false Government contract quality assurance for acquisitions at or below the simplified acquisition threshold. 46.404 Section 46.404 Federal Acquisition Regulations System FEDERAL ACQUISITION REGULATION CONTRACT MANAGEMENT QUALITY ASSURANCE Government Contract Quality Assurance...

  19. 48 CFR 46.404 - Government contract quality assurance for acquisitions at or below the simplified acquisition...

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 48 Federal Acquisition Regulations System 1 2010-10-01 2010-10-01 false Government contract quality assurance for acquisitions at or below the simplified acquisition threshold. 46.404 Section 46.404 Federal Acquisition Regulations System FEDERAL ACQUISITION REGULATION CONTRACT MANAGEMENT QUALITY ASSURANCE Government Contract Quality Assurance...

  20. 48 CFR 46.404 - Government contract quality assurance for acquisitions at or below the simplified acquisition...

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 48 Federal Acquisition Regulations System 1 2012-10-01 2012-10-01 false Government contract quality assurance for acquisitions at or below the simplified acquisition threshold. 46.404 Section 46.404 Federal Acquisition Regulations System FEDERAL ACQUISITION REGULATION CONTRACT MANAGEMENT QUALITY ASSURANCE Government Contract Quality Assurance...

  1. 48 CFR 46.404 - Government contract quality assurance for acquisitions at or below the simplified acquisition...

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 48 Federal Acquisition Regulations System 1 2014-10-01 2014-10-01 false Government contract quality assurance for acquisitions at or below the simplified acquisition threshold. 46.404 Section 46.404 Federal Acquisition Regulations System FEDERAL ACQUISITION REGULATION CONTRACT MANAGEMENT QUALITY ASSURANCE Government Contract Quality Assurance...

  2. 48 CFR 246.404 - Government contract quality assurance for acquisitions at or below the simplified acquisition...

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 48 Federal Acquisition Regulations System 3 2013-10-01 2013-10-01 false Government contract quality assurance for acquisitions at or below the simplified acquisition threshold. 246.404 Section 246.404 Federal Acquisition Regulations System DEFENSE ACQUISITION REGULATIONS SYSTEM, DEPARTMENT OF DEFENSE CONTRACT MANAGEMENT QUALITY ASSURANCE...

  3. 48 CFR 246.404 - Government contract quality assurance for acquisitions at or below the simplified acquisition...

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 48 Federal Acquisition Regulations System 3 2012-10-01 2012-10-01 false Government contract quality assurance for acquisitions at or below the simplified acquisition threshold. 246.404 Section 246.404 Federal Acquisition Regulations System DEFENSE ACQUISITION REGULATIONS SYSTEM, DEPARTMENT OF DEFENSE CONTRACT MANAGEMENT QUALITY ASSURANCE...

  4. 48 CFR 46.404 - Government contract quality assurance for acquisitions at or below the simplified acquisition...

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 48 Federal Acquisition Regulations System 1 2013-10-01 2013-10-01 false Government contract quality assurance for acquisitions at or below the simplified acquisition threshold. 46.404 Section 46.404 Federal Acquisition Regulations System FEDERAL ACQUISITION REGULATION CONTRACT MANAGEMENT QUALITY ASSURANCE Government Contract Quality Assurance...

  5. Present Practice And Perceived Needs-Managing Diagnostic Images

    NASA Astrophysics Data System (ADS)

    Vanden Brink, John A.

    1982-01-01

    With the advent of digital radiography, an installed base of CT, Nuclear Medicine and Ultrasound scanners numbering in the thousands, and the potential of NMR, the electronic management of digital images is perhaps one of the most exciting, fastest growing (and most ill defined) fields in medicine today. New technology in optical data storage, electronic transmission, image reproduction, microprocessing, automation and software development provides the promise of a whole new generation of products which will simplify and enhance the diagnostic process (thereby hopefully improving diagnostic accuracy), enable implementation of archival review in a practical sense, expand the availability of diagnostic data and lower the cost per case by at least an order of magnitude.

  6. The Design of Integrated Information System for High Voltage Metering Lab

    NASA Astrophysics Data System (ADS)

    Ma, Yan; Yang, Yi; Xu, Guangke; Gu, Chao; Zou, Lida; Yang, Feng

    2018-01-01

    With the development of the smart grid, intelligent, information-based management of the high-voltage metering lab has become increasingly urgent. In this paper we design an integrated information system that automates the entire workflow from instrument acceptance, testing, report generation and report signature through to instrument claims. By creating a database of all calibrated instruments, using two-dimensional codes, integrating report templates in advance, establishing bookmarks and transmitting electronic signatures online, manual procedures are greatly reduced. These techniques simplify the complex processes of account management and report transmission. After more than a year of operation, work efficiency has improved by about forty percent on average, and accuracy and data reliability are much higher as well.

  7. A simple bedside approach to therapeutic goals achievement during the management of deceased organ donors--An adapted version of the "VIP" approach.

    PubMed

    Westphal, Glauco Adrieno

    2016-02-01

    The disproportion between the supply and demand of transplant organs could be alleviated by improving the quality of clinical management of deceased potential donors. As a large number of donor losses by cardiac arrest occur due to hemodynamic instability, without instituting all essential maintenance measures, it is likely that the application of simplified potential donor maintenance protocols will help to decrease potential donor losses and increase the supply of organs for transplantation. The Ventilation, Infusion and Pumping (VIP) strategy is a mnemonic method that brings together key aspects of the restoration of oxygen delivery to tissues during hemodynamic instability: adequate mechanical Ventilation, volume Infusion and evaluation of heart Pump effectiveness. The inclusion of the additional initials, "P" and "S," refers to Pharmacological treatment and Specificities involved in the etiology of shock. The use of simplified care standards can assist in adhering to essential potential donor management measures. Therefore, using a simplified method as the adapted VIP approach can contribute to improving management standards of potential organ donors and increasing the supply of organs for transplantation. © 2015 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.

  8. [Consideration about chemistry, manufacture and control (CMC) key problems in simplified registration of classical traditional Chinese medicine excellent prescriptions].

    PubMed

    Wang, Zhi-Min; Liu, Ju-Yan; Liu, Xiao-Qian; Wang, De-Qin; Yan, Li-Hua; Zhu, Jin-Jin; Gao, Hui-Min; Li, Chun; Wang, Jin-Yu; Li, Chu-Yuan; Ni, Qing-Chun; Huang, Ji-Sheng; Lin, Juan

    2017-05-01

    As an outstanding representative of traditional Chinese medicine (TCM) prescriptions accumulated from famous TCM doctors' clinical experience in past dynasties, classical TCM excellent prescriptions (cTCMeP) are the most valuable part of the TCM system. To support the research and development of cTCMeP, a series of regulations and measures were issued to encourage its simplified registration. There is still a long way to go, because many key problems and puzzles about technology, registration and administration in the cTCMeP R&D process remain unresolved. Based on an analysis of the registration and management regulations for botanical drug products of the US FDA, Japan, and the European EMA, the possible key problems and countermeasures in chemistry, manufacture and control (CMC) for the simplified registration of cTCMeP were analyzed in view of its actual situation. The method of "reference decoction extract by traditional prescription" (RDETP) was first proposed as a standard to evaluate the quality and preparation uniformity between the new product under simplified registration and the traditional original usage of cTCMeP, instead of the Standard Decoction method used in Japan. The "totality of the evidence" approach, mass balance, and bioassay/biological assay of cTCMeP are emphatically suggested for introduction into the quality uniformity evaluation system for the raw drug material, drug substance and final product, between the modern product and the traditional decoction. Copyright© by the Chinese Pharmaceutical Association.

  9. A Simplified Finite Element Simulation for Straightening Process of Thin-Walled Tube

    NASA Astrophysics Data System (ADS)

    Zhang, Ziqian; Yang, Huilin

    2017-12-01

    Finite element simulation is an effective way to study thin-walled tube behaviour in the two-cross-roll straightening process. To determine the radius of curvature of the roll profile accurately and efficiently, a simplified finite element model, based on the technical parameters of an actual two-cross-roll straightening machine, was developed to simulate the complex straightening process. A dynamic simulation was then carried out using the ANSYS LS-DYNA program. The results imply that the simplified finite element model is reasonable for simulating the two-cross-roll straightening process, and that the radius of curvature of the roll profile can be obtained for a tube straightness of 2 mm/m.

  10. Simplified power processing for ion-thruster subsystems

    NASA Technical Reports Server (NTRS)

    Wessel, F. J.; Hancock, D. J.

    1983-01-01

    A design for a greatly simplified power-processing unit (SPPU) for the 8-cm diameter mercury-ion-thruster subsystem is discussed. This SPPU design will provide a tenfold reduction in parts count, a decrease in system mass and cost, and an increase in system reliability compared to the existing power-processing unit (PPU) used in the Hughes/NASA Lewis Research Center Ion Auxiliary Propulsion Subsystem. The simplifications achieved in this design will greatly increase the attractiveness of ion propulsion in near-term and future spacecraft propulsion applications. A description of a typical ion-thruster subsystem is given. An overview of the thruster/power-processor interface requirements is given. Simplified thruster power processing is discussed.

  11. Organic thin film transistor with a simplified planar structure

    NASA Astrophysics Data System (ADS)

    Zhang, Lei; Yu, Jungsheng; Zhong, Jian; Jiang, Yadong

    2009-05-01

    An organic thin film transistor (OTFT) with a simplified planar structure is described. The gate electrode and the source/drain electrodes of the OTFT are processed in one planar structure. These three electrodes are deposited on the glass substrate by DC sputtering using a Cr/Ni target, and electrode layouts with different width-to-length ratios are then patterned by photolithography in the same step. Only one deposition step and one photolithography step are needed, whereas the conventional process takes at least two deposition steps and two photolithography steps: metal is first prepared on the other side of the glass substrate and an electrode is formed by photolithography, then the source/drain electrodes are prepared by deposition and photolithography on the side with the insulation layer. Compared to the conventional OTFT process, the process in this work is simplified. After the three electrodes are prepared, the insulation layer is made by spin coating, using polyimide as the insulating material. A small-molecule material, pentacene, is evaporated onto the insulation layer by vacuum deposition as the active layer. The whole process needs only three steps. A semi-automatic probe stage is used to connect the three electrodes to the probes of the test instrument. A charge carrier mobility of 0.3 cm^2/(V·s) and an on/off current ratio of 10^5 are obtained from OTFTs on glass substrates. The OTFT with the planar structure made by this simplified process can simplify device processing and reduce fabrication cost.

  12. Simplifying Facility and Event Scheduling: Saving Time and Money.

    ERIC Educational Resources Information Center

    Raasch, Kevin

    2003-01-01

    Describes a product called the Event Management System (EMS), a computer software program to manage facility and event scheduling. Provides example of the school district and university uses of EMS. Describes steps in selecting a scheduling-management system. (PKP)

  13. School Food Service Financial Management Handbook for Uniform Accounting. Simplified System.

    ERIC Educational Resources Information Center

    Food and Nutrition Service (USDA), Washington, DC.

    This handbook is intended to assist the School Food Authority and those responsible for recording and reporting on the various financial activities of a school food service fund. It describes in a simplified form uniform accounting systems suitable for use by all school food authorities. The material, oriented toward the average-to-larger school…

  14. Using simplified Chaos Theory to manage nursing services.

    PubMed

    Haigh, Carol A

    2008-04-01

    The purpose of this study was to evaluate the part simplified chaos theory could play in the management of nursing services. As nursing care becomes more complex, practitioners need to become familiar with business planning and objective time management. There are many time-limited methods that facilitate this type of planning but few that can help practitioners to forecast the end-point outcome of the service they deliver. A growth model was applied to a specialist service to plot service trajectory. Components of chaos theory can play a role in forecasting service outcomes and consequently the impact upon the management of such services. The ability to (1) track the trajectory of a service and (2) manipulate that trajectory by introducing new variables can allow managers to forward plan for service development and to evaluate the effectiveness of a service by plotting its end-point state.
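
    The abstract does not specify the growth model, so the sketch below uses a discrete logistic (Verhulst-type) curve as a stand-in for a service trajectory; changing the growth parameter stands in for "introducing a new variable" to steer the end-point state. All numbers are hypothetical.

      def service_trajectory(r, start=5.0, capacity=100.0, periods=12):
          """Discrete logistic growth: caseload per period for growth rate r and capacity."""
          x = start
          path = [x]
          for _ in range(periods):
              x = x + r * x * (1 - x / capacity)
              path.append(x)
          return path

      baseline = service_trajectory(r=0.35)        # current referral/growth rate
      intervention = service_trajectory(r=0.55)    # e.g. after adding a clinic session

      print(f"baseline end-point caseload:     {baseline[-1]:.1f}")
      print(f"intervention end-point caseload: {intervention[-1]:.1f}")

    Plotting such trajectories over time is one way a manager could, as the abstract suggests, track a service's end-point state and test how a new variable changes it before committing resources.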

  15. 48 CFR 1352.213-70 - Evaluation utilizing simplified acquisition procedures.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... services, cost management, communications between contracting parties, proactive management and customer satisfaction. (4) Price. (End of clause) [75 FR 10570, Mar. 8, 2010; 75 FR 14496, Mar. 26, 2010] ...

  16. Information systems as a quality management tool in clinical laboratories

    NASA Astrophysics Data System (ADS)

    Schmitz, Vanessa; Rosecler Bez el Boukhari, Marta

    2007-11-01

    This article describes information systems as a quality management tool in clinical laboratories. The quality of laboratory analyses is of fundamental importance for health professionals in aiding appropriate diagnosis and treatment. Information systems allow the automation of internal quality management processes, using standard sample tests, Levey-Jennings charts and Westgard multirule analysis. This simplifies evaluation and interpretation of quality tests and reduces the possibility of human error. This study proposes the development of an information system with appropriate functions and costs for the automation of internal quality control in small and medium-sized clinical laboratories. To this end, it evaluates the functions and usability of two commercial software products designed for this purpose, identifying the positive features of each, so that these can be taken into account during the development of the proposed system.
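
    As a minimal sketch of what such automation can look like, the snippet below derives Levey-Jennings control limits from a control-sample history and checks two common Westgard rules (1-3s and 2-2s). The numeric values are illustrative, not taken from the article.

      import statistics

      history = [5.1, 4.9, 5.0, 5.2, 4.8, 5.1, 5.0, 4.9, 5.1, 5.0]   # control results, mmol/L
      mean = statistics.mean(history)
      sd = statistics.stdev(history)

      def z_score(x):
          return (x - mean) / sd

      def westgard_flags(run):
          """Return the Westgard rules violated by the latest control measurements."""
          z = [z_score(x) for x in run]
          flags = []
          if abs(z[-1]) > 3:                                  # 1-3s: one result beyond +/-3 SD
              flags.append("1-3s")
          if len(z) >= 2 and all(v > 2 for v in z[-2:]):      # 2-2s: two consecutive above +2 SD
              flags.append("2-2s high")
          if len(z) >= 2 and all(v < -2 for v in z[-2:]):     # 2-2s: two consecutive below -2 SD
              flags.append("2-2s low")
          return flags

      print(f"Levey-Jennings 2SD limits: ({mean - 2*sd:.2f}, {mean + 2*sd:.2f})")
      print("flags:", westgard_flags([5.0, 5.4]) or "in control")

    Evaluating every new control run against such rules automatically is what removes the manual chart reading and reduces the possibility of human error mentioned above.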

  17. Use of Structure as a Basis for Abstraction in Air Traffic Control

    NASA Technical Reports Server (NTRS)

    Davison, Hayley J.; Hansman, R. John

    2004-01-01

    The safety and efficiency of the air traffic control domain is highly dependent on the capabilities and limitations of its human controllers. Past research has indicated that structure provided by the airspace and procedures could aid in simplifying the controllers' cognitive tasks. In this paper, observations, interviews, voice command data analyses, and radar data analyses conducted at the Boston Terminal Radar Approach Control (TRACON) facility were used to determine whether there was evidence of controllers using structure to simplify their cognitive processes. The data suggest that controllers do use structure-based abstractions to simplify their cognitive processes, particularly the projection task. How structure simplifies the projection task, and the implications of understanding the benefits structure provides to it, are discussed.

  18. Hazardous waste management system--Environmental Protection Agency. Notice of regulatory reform actions; request for comments.

    PubMed

    1982-12-13

    In response to Executive Order 12291 and the President's Task Force on Regulatory Relief, the Environmental Protection Agency is reviewing and reassessing the hazardous waste regulations developed under the Resource Conservation and Recovery Act (RCRA). A variety of activities are underway that will simplify procedures and reduce paperwork, modify existing regulations to make them more workable and cost effective, and control new wastes and new processes. The purpose of this notice is to inform the public of these activities and invite comments on the general approaches being taken.

  19. KNIME4NGS: a comprehensive toolbox for next generation sequencing analysis.

    PubMed

    Hastreiter, Maximilian; Jeske, Tim; Hoser, Jonathan; Kluge, Michael; Ahomaa, Kaarin; Friedl, Marie-Sophie; Kopetzky, Sebastian J; Quell, Jan-Dominik; Mewes, H Werner; Küffner, Robert

    2017-05-15

    Analysis of Next Generation Sequencing (NGS) data requires the processing of large datasets by chaining various tools with complex input and output formats. In order to automate data analysis, we propose to standardize NGS tasks into modular workflows. This simplifies reliable handling and processing of NGS data, and corresponding solutions become substantially more reproducible and easier to maintain. Here, we present a documented, Linux-based toolbox of 42 processing modules that can be combined to construct workflows facilitating a variety of tasks such as DNAseq and RNAseq analysis. We also describe important technical extensions. The high-throughput executor (HTE) helps to increase reliability and to reduce manual interventions when processing complex datasets. We also provide a dedicated binary manager that assists users in obtaining the modules' executables and keeping them up to date. As the basis for this actively developed toolbox we use the workflow management software KNIME. See http://ibisngs.github.io/knime4ngs for nodes and user manual (GPLv3 license). Contact: robert.kueffner@helmholtz-muenchen.de. Supplementary data are available at Bioinformatics online.

  20. X-33/RLV System Health Management/ Vehicle Health Management

    NASA Technical Reports Server (NTRS)

    Garbos, Raymond J.; Mouyos, William

    1998-01-01

    To reduce operations cost, the RLV must include the following elements: highly reliable, robust subsystems designed for simple repair access, a simplified servicing infrastructure, and expedited decision making about faults and anomalies. A key component of the Single Stage to Orbit (SSTO) RLV system used to meet these objectives is System Health Management (SHM). SHM deals with the vehicle component (Vehicle Health Management, VHM), the ground processing associated with the fleet (GVHM), and the Ground Infrastructure Health Management (GIHM). The objective is to provide an automated collection and paperless health decision, maintenance, and logistics system. Many critical technologies are necessary to make SHM (and more specifically VHM) practical, reliable, and cost effective. Sanders is leading the design, development, and integration of the SHM system for the RLV and for X-33 SHM (a sub-scale, sub-orbital Advanced Technology Demonstrator). This paper presents the X-33 SHM design, which forms the baseline for RLV SHM, and discusses other applications of these technologies.

  1. Re-engineering NASA's space communications to remain viable in a constrained fiscal environment

    NASA Astrophysics Data System (ADS)

    Hornstein, Rhoda Shaller; Hei, Donald J., Jr.; Kelly, Angelita C.; Lightfoot, Patricia C.; Bell, Holland T.; Cureton-Snead, Izeller E.; Hurd, William J.; Scales, Charles H.

    1994-11-01

    Along with the Red and Blue Teams commissioned by the NASA Administrator in 1992, NASA's Associate Administrator for Space Communications commissioned a Blue Team to review the Office of Space Communications (Code O) Core Program and determine how the program could be conducted faster, better, and cheaper. Since there was no corresponding Red Team for the Code O Blue Team, the Blue Team assumed a Red Team independent attitude and challenged the status quo, including current work processes, functional distinctions, interfaces, and information flow, as well as traditional management and system development practices. The Blue Team's unconstrained, non-parochial, and imaginative look at NASA's space communications program produced a simplified representation of the space communications infrastructure that transcends organizational and functional boundaries, in addition to existing systems and facilities. Further, the Blue Team adapted the 'faster, better, cheaper' charter to be relevant to the multi-mission, continuous nature of the space communications program and to serve as a gauge for improving customer services concurrent with achieving more efficient operations and infrastructure life cycle economies. This simplified representation, together with the adapted metrics, offers a future view and process model for reengineering NASA's space communications to remain viable in a constrained fiscal environment. Code O remains firm in its commitment to improve productivity, effectiveness, and efficiency. In October 1992, the Associate Administrator reconstituted the Blue Team as the Code O Success Team (COST) to serve as a catalyst for change. In this paper, the COST presents the chronicle and significance of the simplified representation and adapted metrics, and their application during the FY 1993-1994 activities.

  2. Re-engineering NASA's space communications to remain viable in a constrained fiscal environment

    NASA Technical Reports Server (NTRS)

    Hornstein, Rhoda Shaller; Hei, Donald J., Jr.; Kelly, Angelita C.; Lightfoot, Patricia C.; Bell, Holland T.; Cureton-Snead, Izeller E.; Hurd, William J.; Scales, Charles H.

    1994-01-01

    Along with the Red and Blue Teams commissioned by the NASA Administrator in 1992, NASA's Associate Administrator for Space Communications commissioned a Blue Team to review the Office of Space Communications (Code O) Core Program and determine how the program could be conducted faster, better, and cheaper. Since there was no corresponding Red Team for the Code O Blue Team, the Blue Team assumed a Red Team independent attitude and challenged the status quo, including current work processes, functional distinctions, interfaces, and information flow, as well as traditional management and system development practices. The Blue Team's unconstrained, non-parochial, and imaginative look at NASA's space communications program produced a simplified representation of the space communications infrastructure that transcends organizational and functional boundaries, in addition to existing systems and facilities. Further, the Blue Team adapted the 'faster, better, cheaper' charter to be relevant to the multi-mission, continuous nature of the space communications program and to serve as a gauge for improving customer services concurrent with achieving more efficient operations and infrastructure life cycle economies. This simplified representation, together with the adapted metrics, offers a future view and process model for reengineering NASA's space communications to remain viable in a constrained fiscal environment. Code O remains firm in its commitment to improve productivity, effectiveness, and efficiency. In October 1992, the Associate Administrator reconstituted the Blue Team as the Code O Success Team (COST) to serve as a catalyst for change. In this paper, the COST presents the chronicle and significance of the simplified representation and adapted metrics, and their application during the FY 1993-1994 activities.

  3. Oracle Database 10g: a platform for BLAST search and Regular Expression pattern matching in life sciences.

    PubMed

    Stephens, Susie M; Chen, Jake Y; Davidson, Marcel G; Thomas, Shiby; Trute, Barry M

    2005-01-01

    As database management systems expand their array of analytical functionality, they become powerful research engines for biomedical data analysis and drug discovery. Databases can hold most of the data types commonly required in life sciences and consequently can be used as flexible platforms for the implementation of knowledgebases. Performing data analysis in the database simplifies data management by minimizing the movement of data from disks to memory, allowing pre-filtering and post-processing of datasets, and enabling data to remain in a secure, highly available environment. This article describes the Oracle Database 10g implementation of BLAST and Regular Expression Searches and provides case studies of their usage in bioinformatics. http://www.oracle.com/technology/software/index.html.
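
    As a hedged sketch of the in-database pattern matching described above, the snippet below issues an Oracle REGEXP_LIKE query from Python (REGEXP_LIKE is part of Oracle Database 10g SQL); the connection parameters, table, and column names are hypothetical placeholders, not taken from the article.

```python
# Hedged sketch: filter sequence records inside the database with a regular
# expression so that only matching rows leave the server. Credentials, DSN and
# schema names below are hypothetical; REGEXP_LIKE is standard Oracle SQL.
import oracledb  # python-oracledb driver

PATTERN = r"C.{2}C.{3}[LIVMFYWC].{8}H.{3,5}H"  # zinc-finger-like motif (illustrative)

with oracledb.connect(user="bio", password="secret", dsn="dbhost/orclpdb") as conn:
    with conn.cursor() as cur:
        cur.execute(
            """
            SELECT accession, sequence
              FROM protein_sequences
             WHERE REGEXP_LIKE(sequence, :pat)
            """,
            {"pat": PATTERN},
        )
        for accession, sequence in cur:
            print(accession, len(sequence))
```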

  4. A coated-wire ion-selective electrode for ionic calcium measurements

    NASA Technical Reports Server (NTRS)

    Hines, John W.; Arnaud, Sara; Madou, Marc; Joseph, Jose; Jina, Arvind

    1991-01-01

    A coated-wire ion-selective electrode for measuring ionic calcium was developed in collaboration with Teknektron Sensor Development Corporation (TSDC). This coated-wire electrode sensor makes use of advanced, ion-responsive polyvinyl chloride (PVC) membrane technology, whereby the electroactive agent is incorporated into a polymeric film. The technology greatly simplifies conventional ion-selective electrode measurement, and is envisioned for real-time measurement of physiological and environmental ionic constituents, initially calcium. A primary target biomedical application is the real-time measurement of urinary and blood calcium changes during extended exposure to microgravity, during prolonged hospital or fracture immobilization, and for osteoporosis research. Potential advanced life support applications include monitoring of calcium and other ions, heavy metals, and related parameters in closed-loop water processing and management systems. This technology provides a much simplified ionic calcium measurement capability, suitable for automated in-vitro, in-vivo, and in-situ measurement applications, and should be of interest to the medical, scientific, chemical, and space life sciences communities.

  5. 76 FR 7102 - Simplified Network Application Processing System, On-line Registration and Account Maintenance

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-02-09

    ... DEPARTMENT OF COMMERCE Bureau of Industry and Security 15 CFR Part 748 [Docket No. 100826397-1059-02] RIN 0694-AE98 Simplified Network Application Processing System, On-line Registration and Account Maintenance AGENCY: Bureau of Industry and Security, Commerce. ACTION: Final rule. SUMMARY: The Bureau of...

  6. Development of evaluation models of manpower needs for dismantling the dry conversion process-related equipment in uranium refining and conversion plant (URCP)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sari Izumo; Hideo Usui; Mitsuo Tachibana

    Evaluation models for determining the manpower needs for dismantling various types of equipment in uranium refining and conversion plant (URCP) have been developed. The models are widely applicable to other uranium handling facilities. Additionally, a simplified model was developed for easily and accurately calculating the manpower needs for dismantling dry conversion process-related equipment (DP equipment). It is important to evaluate beforehand project management data such as manpower needs to prepare an optimized decommissioning plan and implement effective dismantling activity. The Japan Atomic Energy Agency (JAEA) has developed the project management data evaluation system for dismantling activities (PRODIA code), which can generate project management data using evaluation models. For preparing an optimized decommissioning plan, these evaluation models should be established based on the type of nuclear facility and actual dismantling data. In URCP, the dry conversion process of reprocessed uranium and others was operated until 1999, and the equipment related to the main process was dismantled from 2008 to 2011. Actual data such as manpower for dismantling were collected during the dismantling activities, and evaluation models were developed using the collected actual data on the basis of equipment classification considering the characteristics of uranium handling facility. (authors)

  7. Scalable problems and memory bounded speedup

    NASA Technical Reports Server (NTRS)

    Sun, Xian-He; Ni, Lionel M.

    1992-01-01

    In this paper three models of parallel speedup are studied. They are fixed-size speedup, fixed-time speedup and memory-bounded speedup. The latter two consider the relationship between speedup and problem scalability. Two sets of speedup formulations are derived for these three models. One set considers uneven workload allocation and communication overhead and gives more accurate estimation. Another set considers a simplified case and provides a clear picture on the impact of the sequential portion of an application on the possible performance gain from parallel processing. The simplified fixed-size speedup is Amdahl's law. The simplified fixed-time speedup is Gustafson's scaled speedup. The simplified memory-bounded speedup contains both Amdahl's law and Gustafson's scaled speedup as special cases. This study leads to a better understanding of parallel processing.
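
    For reference, the three simplified formulations mentioned above can be written as follows (notation ours: p is the parallelizable fraction of the work, n the number of processors, and G(n) the factor by which the parallel workload grows when memory scales with n, as in Sun and Ni's memory-bounded model):

```latex
% Simplified speedup models (p = parallel fraction, n = processors)
\begin{align*}
  S_{\text{fixed-size}}(n)     &= \frac{1}{(1-p) + p/n}
      && \text{(Amdahl's law)}\\
  S_{\text{fixed-time}}(n)     &= (1-p) + p\,n
      && \text{(Gustafson's scaled speedup)}\\
  S_{\text{memory-bounded}}(n) &= \frac{(1-p) + p\,G(n)}{(1-p) + p\,G(n)/n}
      && \text{(reduces to Amdahl for } G(n)=1\text{, to Gustafson for } G(n)=n\text{)}
\end{align*}
```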

  8. A Computer Program for the Management of Prescription-Based Problems.

    ERIC Educational Resources Information Center

    Cotter, Patricia M.; Gumtow, Robert H.

    1991-01-01

    The Prescription Management Program, a software program using Apple's HyperCard on a Macintosh, was developed to simplify the creation, storage, modification, and general management of prescription-based problems. Pharmacy instructors may customize the program to serve their individual teaching needs. (Author/DB)

  9. Simplified dichromated gelatin hologram recording process

    NASA Technical Reports Server (NTRS)

    Georgekutty, Tharayil G.; Liu, Hua-Kuang

    1987-01-01

    A simplified method for making dichromated gelatin (DCG) holographic optical elements (HOE) has been discovered. The method is much less tedious and it requires a period of processing time comparable with that for processing a silver halide hologram. HOE characteristics including diffraction efficiency (DE), linearity, and spectral sensitivity have been quantitatively investigated. The quality of the holographic grating is very high. Ninety percent or higher diffraction efficiency has been achieved in simple plane gratings made by this process.

  10. A Cluster-Randomized, Controlled Trial of a Simplified Multifaceted Management Program for Individuals at High Cardiovascular Risk (SimCard Trial) in Rural Tibet, China, and Haryana, India.

    PubMed

    Tian, Maoyi; Ajay, Vamadevan S; Dunzhu, Danzeng; Hameed, Safraj S; Li, Xian; Liu, Zhong; Li, Cong; Chen, Hao; Cho, KaWing; Li, Ruilai; Zhao, Xingshan; Jindal, Devraj; Rawal, Ishita; Ali, Mohammed K; Peterson, Eric D; Ji, Jiachao; Amarchand, Ritvik; Krishnan, Anand; Tandon, Nikhil; Xu, Li-Qun; Wu, Yangfeng; Prabhakaran, Dorairaj; Yan, Lijing L

    2015-09-01

    In rural areas in China and India, the cardiovascular disease burden is high but economic and healthcare resources are limited. This study (the Simplified Cardiovascular Management Study [SimCard]) aims to develop and evaluate a simplified cardiovascular management program delivered by community health workers with the aid of a smartphone-based electronic decision support system. The SimCard study was a yearlong cluster-randomized, controlled trial conducted in 47 villages (27 in China and 20 in India). Recruited for the study were 2086 individuals with high cardiovascular risk (aged ≥40 years with self-reported history of coronary heart disease, stroke, diabetes mellitus, and/or measured systolic blood pressure ≥160 mm Hg). Participants in the intervention villages were managed by community health workers through an Android-powered app on a monthly basis focusing on 2 medication use and 2 lifestyle modifications. In comparison with the control group, the intervention group had a 25.5% (P<0.001) higher net increase in the primary outcome of the proportion of patient-reported antihypertensive medication use pre- and post-intervention. There were also significant differences in certain secondary outcomes: aspirin use (net difference: 17.1%; P<0.001) and systolic blood pressure (-2.7 mm Hg; P=0.04). However, no significant changes were observed in the lifestyle factors. The intervention was culturally tailored, and country-specific results revealed important differences between the regions. The results indicate that the simplified cardiovascular management program improved quality of primary care and clinical outcomes in resource-poor settings in China and India. Larger trials in more places are needed to ascertain the potential impacts on mortality and morbidity outcomes. URL: http://www.clinicaltrials.gov. Unique identifier: NCT01503814. © 2015 American Heart Association, Inc.

  11. Computer models for economic and silvicultural decisions

    Treesearch

    Rosalie J. Ingram

    1989-01-01

    Computer systems can help simplify decisionmaking to manage forest ecosystems. We now have computer models to help make forest management decisions by predicting changes associated with a particular management action. Models also help you evaluate alternatives. To be effective, the computer models must be reliable and appropriate for your situation.

  12. Computer multitasking with Desqview 386 in a family practice.

    PubMed Central

    Davis, A E

    1990-01-01

    Computers are now widely used in medical practice for accounting and secretarial tasks. However, it has been much more difficult to use computers in more physician-related activities of daily practice. I investigated the Desqview multitasking system on a 386 computer as a solution to this problem. Physician-directed tasks of management of patient charts, retrieval of reference information, word processing, appointment scheduling and office organization were each managed by separate programs. Desqview allowed instantaneous switching back and forth between the various programs. I compared the time and cost savings and the need for physician input between Desqview 386, a 386 computer alone and an older, XT computer. Desqview significantly simplified the use of computer programs for medical information management and minimized the necessity for physician intervention. The time saved was 15 minutes per day; the costs saved were estimated to be $5000 annually. PMID:2383848

  13. Integrity management of offshore structures and its implication on computation of structural action effects and resistance

    NASA Astrophysics Data System (ADS)

    Moan, T.

    2017-12-01

    An overview of integrity management of offshore structures, with emphasis on the oil and gas energy sector, is given. Based on relevant accident experiences and the means to control the associated risks, accidents are categorized from a technical-physical as well as a human and organizational point of view. Structural risk relates to extreme actions as well as structural degradation. Risk mitigation measures, including adequate design criteria, inspection, repair and maintenance, as well as quality assurance and control of engineering processes, are briefly outlined. The current status of risk and reliability methodology to aid decisions in integrity management is briefly reviewed. Finally, the need to balance the uncertainties in data, methods and computational effort, and to apply high-fidelity methods cautiously and under quality assurance and control so as to avoid human errors, is emphasized, together with a plea to develop both high-fidelity and efficient simplified methods for design.

  14. Real-time automated failure identification in the Control Center Complex (CCC)

    NASA Technical Reports Server (NTRS)

    Kirby, Sarah; Lauritsen, Janet; Pack, Ginger; Ha, Anhhoang; Jowers, Steven; Mcnenny, Robert; Truong, The; Dell, James

    1993-01-01

    A system which will provide real-time failure management support to the Space Station Freedom program is described. The system's use of a simplified form of model-based reasoning qualifies it as an advanced automation system. However, it differs from most such systems in that it was designed from the outset to meet two sets of requirements. First, it must provide a useful increment to the fault management capabilities of the Johnson Space Center (JSC) Control Center Complex (CCC) Fault Detection Management system. Second, it must satisfy CCC operational environment constraints such as cost, computer resource requirements, and verification and validation. The need to meet both requirement sets presents a much greater design challenge than would have been the case had functionality been the sole design consideration. The choice of technology, aspects of that choice, and the process for migrating the system into the control center are also discussed.

  15. 48 CFR 36.602-5 - Short selection process for contracts not to exceed the simplified acquisition threshold.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 48 Federal Acquisition Regulations System 1 2013-10-01 2013-10-01 false Short selection process for contracts not to exceed the simplified acquisition threshold. 36.602-5 Section 36.602-5 Federal Acquisition Regulations System FEDERAL ACQUISITION REGULATION SPECIAL CATEGORIES OF CONTRACTING CONSTRUCTION AND ARCHITECT-ENGINEER CONTRACTS...

  16. 48 CFR 36.602-5 - Short selection process for contracts not to exceed the simplified acquisition threshold.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 48 Federal Acquisition Regulations System 1 2014-10-01 2014-10-01 false Short selection process for contracts not to exceed the simplified acquisition threshold. 36.602-5 Section 36.602-5 Federal Acquisition Regulations System FEDERAL ACQUISITION REGULATION SPECIAL CATEGORIES OF CONTRACTING CONSTRUCTION AND ARCHITECT-ENGINEER CONTRACTS...

  17. 48 CFR 1336.602-5 - Short selection process for contracts not to exceed the simplified acquisition threshold.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 48 Federal Acquisition Regulations System 5 2011-10-01 2011-10-01 false Short selection process for contracts not to exceed the simplified acquisition threshold. 1336.602-5 Section 1336.602-5 Federal Acquisition Regulations System DEPARTMENT OF COMMERCE SPECIAL CATEGORIES OF CONTRACTING CONSTRUCTION AND ARCHITECT-ENGINEER CONTRACTS...

  18. 48 CFR 836.602-5 - Short selection process for contracts not to exceed the simplified acquisition threshold.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 48 Federal Acquisition Regulations System 5 2012-10-01 2012-10-01 false Short selection process for contracts not to exceed the simplified acquisition threshold. 836.602-5 Section 836.602-5 Federal Acquisition Regulations System DEPARTMENT OF VETERANS AFFAIRS SPECIAL CATEGORIES OF CONTRACTING CONSTRUCTION AND ARCHITECT-ENGINEER CONTRACTS...

  19. 48 CFR 636.602-5 - Short selection processes for contracts not to exceed the simplified acquisition threshold.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 48 Federal Acquisition Regulations System 4 2013-10-01 2013-10-01 false Short selection processes for contracts not to exceed the simplified acquisition threshold. 636.602-5 Section 636.602-5 Federal Acquisition Regulations System DEPARTMENT OF STATE SPECIAL CATEGORIES OF CONTRACTING CONSTRUCTION AND ARCHITECT-ENGINEER CONTRACTS...

  20. 48 CFR 436.602-5 - Short selection process for contracts not to exceed the simplified acquisition threshold.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 48 Federal Acquisition Regulations System 4 2013-10-01 2013-10-01 false Short selection process for contracts not to exceed the simplified acquisition threshold. 436.602-5 Section 436.602-5 Federal Acquisition Regulations System DEPARTMENT OF AGRICULTURE SPECIAL CATEGORIES OF CONTRACTING CONSTRUCTION AND ARCHITECT-ENGINEER CONTRACTS...

  1. 48 CFR 1436.602-5 - Short selection processes for contracts not to exceed the simplified acquisition threshold.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 48 Federal Acquisition Regulations System 5 2014-10-01 2014-10-01 false Short selection processes for contracts not to exceed the simplified acquisition threshold. 1436.602-5 Section 1436.602-5 Federal Acquisition Regulations System DEPARTMENT OF THE INTERIOR SPECIAL CATEGORIES OF CONTRACTING CONSTRUCTION AND ARCHITECT-ENGINEER CONTRACTS...

  2. 48 CFR 736.602-5 - Short selection process for procurements not to exceed the simplified acquisition threshold.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 48 Federal Acquisition Regulations System 5 2011-10-01 2011-10-01 false Short selection process for procurements not to exceed the simplified acquisition threshold. 736.602-5 Section 736.602-5 Federal Acquisition Regulations System AGENCY FOR INTERNATIONAL DEVELOPMENT SPECIAL CATEGORIES OF CONTRACTING CONSTRUCTION AND ARCHITECT-ENGINEER CONTRACT...

  3. 48 CFR 1336.602-5 - Short selection process for contracts not to exceed the simplified acquisition threshold.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 48 Federal Acquisition Regulations System 5 2014-10-01 2014-10-01 false Short selection process for contracts not to exceed the simplified acquisition threshold. 1336.602-5 Section 1336.602-5 Federal Acquisition Regulations System DEPARTMENT OF COMMERCE SPECIAL CATEGORIES OF CONTRACTING CONSTRUCTION AND ARCHITECT-ENGINEER CONTRACTS...

  4. 48 CFR 636.602-5 - Short selection processes for contracts not to exceed the simplified acquisition threshold.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 48 Federal Acquisition Regulations System 4 2010-10-01 2010-10-01 false Short selection processes for contracts not to exceed the simplified acquisition threshold. 636.602-5 Section 636.602-5 Federal Acquisition Regulations System DEPARTMENT OF STATE SPECIAL CATEGORIES OF CONTRACTING CONSTRUCTION AND ARCHITECT-ENGINEER CONTRACTS...

  5. 48 CFR 1036.602-5 - Short selection process for contracts not to exceed the simplified acquisition threshold.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 48 Federal Acquisition Regulations System 5 2011-10-01 2011-10-01 false Short selection process for contracts not to exceed the simplified acquisition threshold. 1036.602-5 Section 1036.602-5 Federal Acquisition Regulations System DEPARTMENT OF THE TREASURY SPECIAL CATEGORIES OF CONTRACTING CONSTRUCTION AND ARCHITECT-ENGINEER CONTRACTS...

  6. 48 CFR 436.602-5 - Short selection process for contracts not to exceed the simplified acquisition threshold.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 48 Federal Acquisition Regulations System 4 2014-10-01 2014-10-01 false Short selection process for contracts not to exceed the simplified acquisition threshold. 436.602-5 Section 436.602-5 Federal Acquisition Regulations System DEPARTMENT OF AGRICULTURE SPECIAL CATEGORIES OF CONTRACTING CONSTRUCTION AND ARCHITECT-ENGINEER CONTRACTS...

  7. 48 CFR 436.602-5 - Short selection process for contracts not to exceed the simplified acquisition threshold.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 48 Federal Acquisition Regulations System 4 2011-10-01 2011-10-01 false Short selection process for contracts not to exceed the simplified acquisition threshold. 436.602-5 Section 436.602-5 Federal Acquisition Regulations System DEPARTMENT OF AGRICULTURE SPECIAL CATEGORIES OF CONTRACTING CONSTRUCTION AND ARCHITECT-ENGINEER CONTRACTS...

  8. 48 CFR 1436.602-5 - Short selection processes for contracts not to exceed the simplified acquisition threshold.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 48 Federal Acquisition Regulations System 5 2011-10-01 2011-10-01 false Short selection processes for contracts not to exceed the simplified acquisition threshold. 1436.602-5 Section 1436.602-5 Federal Acquisition Regulations System DEPARTMENT OF THE INTERIOR SPECIAL CATEGORIES OF CONTRACTING CONSTRUCTION AND ARCHITECT-ENGINEER CONTRACTS...

  9. 48 CFR 736.602-5 - Short selection process for procurements not to exceed the simplified acquisition threshold.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 48 Federal Acquisition Regulations System 5 2012-10-01 2012-10-01 false Short selection process for procurements not to exceed the simplified acquisition threshold. 736.602-5 Section 736.602-5 Federal Acquisition Regulations System AGENCY FOR INTERNATIONAL DEVELOPMENT SPECIAL CATEGORIES OF CONTRACTING CONSTRUCTION AND ARCHITECT-ENGINEER CONTRACT...

  10. 48 CFR 436.602-5 - Short selection process for contracts not to exceed the simplified acquisition threshold.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 48 Federal Acquisition Regulations System 4 2010-10-01 2010-10-01 false Short selection process for contracts not to exceed the simplified acquisition threshold. 436.602-5 Section 436.602-5 Federal Acquisition Regulations System DEPARTMENT OF AGRICULTURE SPECIAL CATEGORIES OF CONTRACTING CONSTRUCTION AND ARCHITECT-ENGINEER CONTRACTS...

  11. 48 CFR 836.602-5 - Short selection process for contracts not to exceed the simplified acquisition threshold.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 48 Federal Acquisition Regulations System 5 2011-10-01 2011-10-01 false Short selection process for contracts not to exceed the simplified acquisition threshold. 836.602-5 Section 836.602-5 Federal Acquisition Regulations System DEPARTMENT OF VETERANS AFFAIRS SPECIAL CATEGORIES OF CONTRACTING CONSTRUCTION AND ARCHITECT-ENGINEER CONTRACTS...

  12. 48 CFR 1336.602-5 - Short selection process for contracts not to exceed the simplified acquisition threshold.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 48 Federal Acquisition Regulations System 5 2013-10-01 2013-10-01 false Short selection process for contracts not to exceed the simplified acquisition threshold. 1336.602-5 Section 1336.602-5 Federal Acquisition Regulations System DEPARTMENT OF COMMERCE SPECIAL CATEGORIES OF CONTRACTING CONSTRUCTION AND ARCHITECT-ENGINEER CONTRACTS...

  13. 48 CFR 36.602-5 - Short selection process for contracts not to exceed the simplified acquisition threshold.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 48 Federal Acquisition Regulations System 1 2012-10-01 2012-10-01 false Short selection process for contracts not to exceed the simplified acquisition threshold. 36.602-5 Section 36.602-5 Federal Acquisition Regulations System FEDERAL ACQUISITION REGULATION SPECIAL CATEGORIES OF CONTRACTING CONSTRUCTION AND ARCHITECT-ENGINEER CONTRACTS...

  14. 48 CFR 836.602-5 - Short selection process for contracts not to exceed the simplified acquisition threshold.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 48 Federal Acquisition Regulations System 5 2014-10-01 2014-10-01 false Short selection process for contracts not to exceed the simplified acquisition threshold. 836.602-5 Section 836.602-5 Federal Acquisition Regulations System DEPARTMENT OF VETERANS AFFAIRS SPECIAL CATEGORIES OF CONTRACTING CONSTRUCTION AND ARCHITECT-ENGINEER CONTRACTS...

  15. 48 CFR 1336.602-5 - Short selection process for contracts not to exceed the simplified acquisition threshold.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 48 Federal Acquisition Regulations System 5 2012-10-01 2012-10-01 false Short selection process for contracts not to exceed the simplified acquisition threshold. 1336.602-5 Section 1336.602-5 Federal Acquisition Regulations System DEPARTMENT OF COMMERCE SPECIAL CATEGORIES OF CONTRACTING CONSTRUCTION AND ARCHITECT-ENGINEER CONTRACTS...

  16. 48 CFR 1436.602-5 - Short selection processes for contracts not to exceed the simplified acquisition threshold.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 48 Federal Acquisition Regulations System 5 2012-10-01 2012-10-01 false Short selection processes for contracts not to exceed the simplified acquisition threshold. 1436.602-5 Section 1436.602-5 Federal Acquisition Regulations System DEPARTMENT OF THE INTERIOR SPECIAL CATEGORIES OF CONTRACTING CONSTRUCTION AND ARCHITECT-ENGINEER CONTRACTS...

  17. 48 CFR 636.602-5 - Short selection processes for contracts not to exceed the simplified acquisition threshold.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 48 Federal Acquisition Regulations System 4 2011-10-01 2011-10-01 false Short selection processes for contracts not to exceed the simplified acquisition threshold. 636.602-5 Section 636.602-5 Federal Acquisition Regulations System DEPARTMENT OF STATE SPECIAL CATEGORIES OF CONTRACTING CONSTRUCTION AND ARCHITECT-ENGINEER CONTRACTS...

  18. 48 CFR 36.602-5 - Short selection process for contracts not to exceed the simplified acquisition threshold.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 48 Federal Acquisition Regulations System 1 2010-10-01 2010-10-01 false Short selection process for contracts not to exceed the simplified acquisition threshold. 36.602-5 Section 36.602-5 Federal Acquisition Regulations System FEDERAL ACQUISITION REGULATION SPECIAL CATEGORIES OF CONTRACTING CONSTRUCTION AND ARCHITECT-ENGINEER CONTRACTS...

  19. 48 CFR 736.602-5 - Short selection process for procurements not to exceed the simplified acquisition threshold.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 48 Federal Acquisition Regulations System 5 2014-10-01 2014-10-01 false Short selection process for procurements not to exceed the simplified acquisition threshold. 736.602-5 Section 736.602-5 Federal Acquisition Regulations System AGENCY FOR INTERNATIONAL DEVELOPMENT SPECIAL CATEGORIES OF CONTRACTING CONSTRUCTION AND ARCHITECT-ENGINEER CONTRACT...

  20. 48 CFR 1036.602-5 - Short selection process for contracts not to exceed the simplified acquisition threshold.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 48 Federal Acquisition Regulations System 5 2013-10-01 2013-10-01 false Short selection process for contracts not to exceed the simplified acquisition threshold. 1036.602-5 Section 1036.602-5 Federal Acquisition Regulations System DEPARTMENT OF THE TREASURY SPECIAL CATEGORIES OF CONTRACTING CONSTRUCTION AND ARCHITECT-ENGINEER CONTRACTS...

  1. 48 CFR 36.602-5 - Short selection process for contracts not to exceed the simplified acquisition threshold.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 48 Federal Acquisition Regulations System 1 2011-10-01 2011-10-01 false Short selection process for contracts not to exceed the simplified acquisition threshold. 36.602-5 Section 36.602-5 Federal Acquisition Regulations System FEDERAL ACQUISITION REGULATION SPECIAL CATEGORIES OF CONTRACTING CONSTRUCTION AND ARCHITECT-ENGINEER CONTRACTS...

  2. 48 CFR 1036.602-5 - Short selection process for contracts not to exceed the simplified acquisition threshold.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 48 Federal Acquisition Regulations System 5 2012-10-01 2012-10-01 false Short selection process for contracts not to exceed the simplified acquisition threshold. 1036.602-5 Section 1036.602-5 Federal Acquisition Regulations System DEPARTMENT OF THE TREASURY SPECIAL CATEGORIES OF CONTRACTING CONSTRUCTION AND ARCHITECT-ENGINEER CONTRACTS...

  3. 48 CFR 1036.602-5 - Short selection process for contracts not to exceed the simplified acquisition threshold.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 48 Federal Acquisition Regulations System 5 2014-10-01 2014-10-01 false Short selection process for contracts not to exceed the simplified acquisition threshold. 1036.602-5 Section 1036.602-5 Federal Acquisition Regulations System DEPARTMENT OF THE TREASURY SPECIAL CATEGORIES OF CONTRACTING CONSTRUCTION AND ARCHITECT-ENGINEER CONTRACTS...

  4. 48 CFR 1436.602-5 - Short selection processes for contracts not to exceed the simplified acquisition threshold.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 48 Federal Acquisition Regulations System 5 2013-10-01 2013-10-01 false Short selection processes for contracts not to exceed the simplified acquisition threshold. 1436.602-5 Section 1436.602-5 Federal Acquisition Regulations System DEPARTMENT OF THE INTERIOR SPECIAL CATEGORIES OF CONTRACTING CONSTRUCTION AND ARCHITECT-ENGINEER CONTRACTS...

  5. 48 CFR 636.602-5 - Short selection processes for contracts not to exceed the simplified acquisition threshold.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 48 Federal Acquisition Regulations System 4 2014-10-01 2014-10-01 false Short selection processes for contracts not to exceed the simplified acquisition threshold. 636.602-5 Section 636.602-5 Federal Acquisition Regulations System DEPARTMENT OF STATE SPECIAL CATEGORIES OF CONTRACTING CONSTRUCTION AND ARCHITECT-ENGINEER CONTRACTS...

  6. Road to Grid Parity through Deployment of Low-Cost 21.5% N-Type Si Solar Cells

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Velundur, Vijay

    This project seeks to develop and deploy differentiated 21.5% efficient n-type Si solar cells while reaching the SunShot module cost goal of ≤ $0.50/W. This objective hinges on the development of enabling low-cost technologies that simplify the manufacturing process and reduce overall processing costs. These comprise (1) boron emitter formation and passivation; (2) simplified processing for the emitter and BSF layers; and (3) advanced metallization for the front and back contacts.

  7. Management information system of medical equipment using mobile devices

    NASA Astrophysics Data System (ADS)

    Núñez, C.; Castro, D.

    2011-09-01

    The large number of technologies currently incorporated into mobile devices makes them excellent tools for capturing and managing information, because their increasing computing power and storage allow many different applications to be added. To take advantage of these technologies in the biomedical engineering field, a mobile information system for medical equipment management was developed. The central platform of the system is a mobile phone which, through a connection to a web server, can send and receive information about any piece of medical equipment. By decoding a type of barcode known as QR codes, the management process is simplified and improved. These barcodes identify the medical equipment in a database; when the codes are photographed and decoded with the mobile device, the relevant information about the equipment in question can be accessed. In its current state the project is a basic support tool for the maintenance of medical equipment. It is also a modern, competitive, and economical alternative in the current market.
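
    A minimal sketch of the lookup flow described above is shown below, assuming the QR payload simply carries an equipment identifier and that the server-side registry can be represented as a key-value store; the payload format, field names, and sample record are hypothetical, not taken from the paper.

```python
# Minimal sketch of mapping a decoded QR payload to an equipment record.
# Assumptions: payload is a plain equipment ID; registry fields are illustrative.
EQUIPMENT_REGISTRY = {
    "EQ-0042": {"name": "Infusion pump", "location": "ICU bed 3",
                "last_maintenance": "2011-06-15", "status": "in service"},
}

def handle_scanned_code(qr_payload: str) -> dict:
    """Return the equipment record for a decoded QR payload (e.g. 'EQ-0042')."""
    record = EQUIPMENT_REGISTRY.get(qr_payload)
    if record is None:
        return {"error": f"unknown equipment id {qr_payload!r}"}
    return record

print(handle_scanned_code("EQ-0042"))
```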

  8. A simplified Excel tool for implementation of RUSLE2 in vineyards for stakeholders with limited dataset

    NASA Astrophysics Data System (ADS)

    Gomez, Jose Alfonso; Biddoccu, Marcella; Guzmán, Gema; Cavallo, Eugenio

    2016-04-01

    Analysis with simulation models is in many situations the only way to evaluate the impact of changes in soil management on soil erosion risk, and the Revised Universal Soil Loss Equation RUSLE (Renard et al. 1997, Dabney et al. 2012) remains the most widely used. Even with its relative simplicity compared to other, more process-based erosion models, proper RUSLE calibration for a given situation can be challenging outside the modelling community, especially in situations beyond those widely covered in the USA. An approach pursued by Gómez et al. (2003) to overcome these problems when calibrating RUSLE, especially the cover-management factor, C, was to build a summary model using the equations defined by the RUSLE manual (Renard et al. 1997), but considering that the basic information required to calibrate the subfactors, such as soil surface roughness, ground cover, soil moisture, … was calculated (or taken from available sources) elsewhere and added to the summary model instead of being calculated by the RUSLE software. This strategy simplified the calibration process, as well as the understanding and interpretation of the RUSLE parameters and model behavior by non-expert users, for its application in olive orchards under a broad range of management conditions. Gómez et al. (2003) built this summary model in Excel and demonstrated the ability to calibrate RUSLE for a broad range of management conditions. Later, several studies (Vanwalleghem et al., 2011; Marín, 2013) demonstrated that this summary model predicted soil losses at hillslope scale close to those determined experimentally. Vines are one of the most widespread tree crops, covering a wide range of environmental and management conditions, and in terms of soil conservation they present several conceptual analogies with olives, especially in relation to soil management (Gómez et al., 2011). In vine-growing areas, besides topographic and rainfall characteristics, the soil management practices adopted in vineyards can favor erosion. Cultivation with rows running up and down the slope on sloping vineyards, maintenance of bare soil, and compaction due to heavy machinery traffic are some of the vineyard management practices that expose soil to degradation, favoring runoff and soil erosion processes. On the other hand, the adoption of grass cover in vineyards plays a fundamental role in protecting soil against erosion, particularly under high rainfall intensity and erosivity. This communication presents a preliminary version of a summary model to calibrate RUSLE for vines under different soil management options, following an approach analogous to that used by Gómez et al. (2003) for olive orchards, in the simplified situation of a homogeneous hillslope and including the latest RUSLE conceptual updates (RUSLE2, Dabney et al., 2012). It also presents preliminary results for different values of the C factor under different soil management and environmental conditions, as well as their impact on predicted long-term soil losses in vineyards located in southern Spain and northern Italy. Keywords: vines, erosion, soil management, RUSLE, model. References: Dabney, S.M., Yoder, D.C., Vieira, D.A.N. 2012. The application of the Revised Universal Soil Loss Equation, Version 2, to evaluate the impacts of alternative climate change scenarios on runoff and sediment yield. Journal of Soil and Water Conservation 67: 343-353. Gómez, J.A., Battany, M., Renschler, C.S., Fereres, E. 2003. Evaluating the impact of soil management on soil loss in olive orchards. Soil Use and Management 19: 127-134. Gómez, J.A., Llewellyn, C., Basch, G., Sutton, P.B., Dyson, J.S., Jones, C.A. 2011. The effects of cover crops and conventional tillage on soil and runoff loss in vineyards and olive groves in several Mediterranean countries. Soil Use and Management 27: 502-514. Marín, V. 2013. Interfaz gráfica para la valoración de la pérdida de suelo en parcelas de olivar. Final Degree project, University of Cordoba. Vanwalleghem, T., Infante, J.A., González, M., Soto, D., Gómez, J.A. 2011. Quantifying the effect of historical soil management on soil erosion rates in Mediterranean olive orchards. Agriculture, Ecosystems & Environment 142: 341-351.
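
    As a minimal illustration of the summary-model idea (factor values supplied externally and combined outside the RUSLE software), the sketch below evaluates the multiplicative RUSLE relation A = R·K·LS·C·P for a bare inter-row versus a grass-covered scenario; all factor values are illustrative placeholders, not calibrated values from this study.

```python
# Minimal sketch of the summary-model idea: RUSLE factors come from literature
# or field data and are combined externally. Values below are placeholders.
def rusle_soil_loss(R, K, LS, C, P):
    """Annual soil loss A = R * K * LS * C * P (units follow the chosen factor system)."""
    return R * K * LS * C * P

bare_soil   = rusle_soil_loss(R=900.0, K=0.030, LS=1.8, C=0.45, P=1.0)
grass_cover = rusle_soil_loss(R=900.0, K=0.030, LS=1.8, C=0.10, P=1.0)
print(f"bare inter-row: {bare_soil:.1f} t/ha/yr")
print(f"grass cover:    {grass_cover:.1f} t/ha/yr")
```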

  9. Oracle Database 10g: a platform for BLAST search and Regular Expression pattern matching in life sciences

    PubMed Central

    Stephens, Susie M.; Chen, Jake Y.; Davidson, Marcel G.; Thomas, Shiby; Trute, Barry M.

    2005-01-01

    As database management systems expand their array of analytical functionality, they become powerful research engines for biomedical data analysis and drug discovery. Databases can hold most of the data types commonly required in life sciences and consequently can be used as flexible platforms for the implementation of knowledgebases. Performing data analysis in the database simplifies data management by minimizing the movement of data from disks to memory, allowing pre-filtering and post-processing of datasets, and enabling data to remain in a secure, highly available environment. This article describes the Oracle Database 10g implementation of BLAST and Regular Expression Searches and provides case studies of their usage in bioinformatics. http://www.oracle.com/technology/software/index.html PMID:15608287

  10. Multidimensional Simulation Applied to Water Resources Management

    NASA Astrophysics Data System (ADS)

    Camara, A. S.; Ferreira, F. C.; Loucks, D. P.; Seixas, M. J.

    1990-09-01

    A framework for an integrated decision aiding simulation (IDEAS) methodology using numerical, linguistic, and pictorial entities and operations is introduced. IDEAS relies upon traditional numerical formulations, logical rules to handle linguistic entities with linguistic values, and a set of pictorial operations. Pictorial entities are defined by their shape, size, color, and position. Pictorial operators include reproduction (copy of a pictorial entity), mutation (expansion, rotation, translation, change in color), fertile encounters (intersection, reunion), and sterile encounters (absorption). Interaction between numerical, linguistic, and pictorial entities is handled through logical rules or a simplified vector calculus operation. This approach is shown to be applicable to various environmental and water resources management analyses using a model to assess the impacts of an oil spill. Future developments, including IDEAS implementation on parallel processing machines, are also discussed.
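
    To make the pictorial-entity idea concrete, the sketch below encodes an entity by its shape, size, colour, and position and implements two of the operators named above (reproduction and mutation); the attribute names and the mutation behaviour are our own illustrative choices rather than the authors' specification.

```python
# Illustrative sketch of pictorial entities and two operators from the abstract.
from dataclasses import dataclass, replace

@dataclass(frozen=True)
class PictorialEntity:
    shape: str
    size: float
    color: str
    position: tuple  # (x, y)

def reproduce(e: PictorialEntity) -> PictorialEntity:
    """Reproduction: copy of a pictorial entity."""
    return replace(e)

def mutate(e: PictorialEntity, expand: float = 1.0, translate=(0.0, 0.0), color=None) -> PictorialEntity:
    """Mutation: expansion, translation and/or change in colour."""
    x, y = e.position
    dx, dy = translate
    return replace(e, size=e.size * expand, position=(x + dx, y + dy), color=color or e.color)

slick = PictorialEntity("ellipse", 2.5, "black", (10.0, 4.0))  # e.g. an oil slick
print(mutate(slick, expand=1.4, translate=(0.8, 0.2)))
```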

  11. BigDataScript: a scripting language for data pipelines.

    PubMed

    Cingolani, Pablo; Sladek, Rob; Blanchette, Mathieu

    2015-01-01

    The analysis of large biological datasets often requires complex processing pipelines that run for a long time on large computational infrastructures. We designed and implemented a simple script-like programming language with a clean and minimalist syntax to develop and manage pipeline execution and provide robustness to various types of software and hardware failures as well as portability. We introduce the BigDataScript (BDS) programming language for data processing pipelines, which improves abstraction from hardware resources and assists with robustness. Hardware abstraction allows BDS pipelines to run without modification on a wide range of computer architectures, from a small laptop to multi-core servers, server farms, clusters and clouds. BDS achieves robustness by incorporating the concepts of absolute serialization and lazy processing, thus allowing pipelines to recover from errors. By abstracting pipeline concepts at programming language level, BDS simplifies implementation, execution and management of complex bioinformatics pipelines, resulting in reduced development and debugging cycles as well as cleaner code. BigDataScript is available under open-source license at http://pcingola.github.io/BigDataScript. © The Author 2014. Published by Oxford University Press.
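
    The lazy-processing idea mentioned above (a step is re-executed only when its output is missing or out of date) can be illustrated with the following concept sketch; it is written in Python for brevity and is not BDS syntax.

```python
# Concept sketch of lazy processing: skip a pipeline step whose output already
# exists and is newer than its input. Filenames and the toy step are illustrative.
import os

def needs_run(inp, out):
    return (not os.path.exists(out)) or os.path.getmtime(out) < os.path.getmtime(inp)

def step(func, inp, out):
    if needs_run(inp, out):
        func(inp, out)
        print(f"ran step -> {out}")
    else:
        print(f"skip {out}: up to date")

def to_upper(inp, out):
    with open(inp) as f, open(out, "w") as g:
        g.write(f.read().upper())

with open("sample.txt", "w") as f:     # toy input standing in for NGS data
    f.write("acgt\n")

step(to_upper, "sample.txt", "sample.upper.txt")
step(to_upper, "sample.txt", "sample.upper.txt")   # second call is skipped
```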

  12. BigDataScript: a scripting language for data pipelines

    PubMed Central

    Cingolani, Pablo; Sladek, Rob; Blanchette, Mathieu

    2015-01-01

    Motivation: The analysis of large biological datasets often requires complex processing pipelines that run for a long time on large computational infrastructures. We designed and implemented a simple script-like programming language with a clean and minimalist syntax to develop and manage pipeline execution and provide robustness to various types of software and hardware failures as well as portability. Results: We introduce the BigDataScript (BDS) programming language for data processing pipelines, which improves abstraction from hardware resources and assists with robustness. Hardware abstraction allows BDS pipelines to run without modification on a wide range of computer architectures, from a small laptop to multi-core servers, server farms, clusters and clouds. BDS achieves robustness by incorporating the concepts of absolute serialization and lazy processing, thus allowing pipelines to recover from errors. By abstracting pipeline concepts at programming language level, BDS simplifies implementation, execution and management of complex bioinformatics pipelines, resulting in reduced development and debugging cycles as well as cleaner code. Availability and implementation: BigDataScript is available under open-source license at http://pcingola.github.io/BigDataScript. Contact: pablo.e.cingolani@gmail.com PMID:25189778

  13. Research of Environmental and Economic Interactions of Coke And By-Product Process

    NASA Astrophysics Data System (ADS)

    Mikhailov, Vladimir; Kiseleva, Tamara; Bugrova, Svetlana; Muromtseva, Alina; Mikhailova, Yana

    2017-11-01

    The article considers the relations between environmental and economic indicators (hereafter, environmental and economic interactions) of the coke and by-product process. The purpose of the study is to reveal the regularities in the functioning of the local environmental and economic system on the basis of the identified spectrum of environmental and economic interactions. A simplified scheme of the environmental and economic system "coke and by-product process - the environment" was developed. The forms of the investigated environmental-economic interactions were visualized, and a selective interpretation of the strength of the established connections was made. The main result of the work is a model of the system of environmental and economic interactions that makes it possible to increase the efficiency of local ecological and economic system management and to optimize the "interests" of an industrial enterprise that is a source of negative impact on the environment. The results of the survey can be recommended to government authorities and to industrial enterprises with a wide range of negative impact forms, to support the adoption of effective management decisions aimed at sustainable environmental and economic development of the region or of individual municipalities.

  14. A simplified CT-guided approach for greater occipital nerve infiltration in the management of occipital neuralgia.

    PubMed

    Kastler, Adrian; Onana, Yannick; Comte, Alexandre; Attyé, Arnaud; Lajoie, Jean-Louis; Kastler, Bruno

    2015-08-01

    To evaluate the efficacy of a simplified CT-guided greater occipital nerve (GON) infiltration approach in the management of occipital neuralgia (ON). Local IRB approval was obtained and written informed consent was waived. Thirty-three patients suffering from severe refractory ON who underwent a total of 37 CT-guided GON infiltrations were included between 2012 and 2014. GON infiltration was performed at the first bend of the GON, between the inferior obliquus capitis and semispinalis capitis muscles, with local anaesthetics and cortivazol. Pain was evaluated via VAS scores. Clinical success was defined as pain relief greater than or equal to 50% lasting for at least 3 months. The pre-procedure mean pain score was 8/10. Patients suffered from left GON neuralgia in 13 cases, right GON neuralgia in 16 cases and bilateral GON neuralgia in 4 cases. The clinical success rate was 86%. In cases of clinical success, the mean duration of pain relief following the procedure was 9.16 months. Simplified CT-guided infiltration appears to be effective in managing refractory ON. With this technique, infiltration of the GON appears to be faster, technically easier and, therefore, safer compared with other previously described techniques. • Occipital neuralgia is a very painful and debilitating condition • GON infiltrations have been successful in the treatment of occipital neuralgia • This simplified technique presents a high efficacy rate with long-lasting pain relief • This infiltration technique does not require contrast media injection for pre-planning • GON infiltration at the first bend appears easier and safer.

  15. What's so Simple about Simplified Texts? A Computational and Psycholinguistic Investigation of Text Comprehension and Text Processing

    ERIC Educational Resources Information Center

    Crossley, Scott A.; Yang, Hae Sung; McNamara, Danielle S.

    2014-01-01

    This study uses a moving windows self-paced reading task to assess both text comprehension and processing time of authentic texts and these same texts simplified to beginning and intermediate levels. Forty-eight second language learners each read 9 texts (3 different authentic, beginning, and intermediate level texts). Repeated measures ANOVAs…

  16. Implementing Project SIED: Special Education Teachers' Perceptions of a Simplified Technology Decision-Making Process for App Identification and Evaluation

    ERIC Educational Resources Information Center

    Schmidt, Matthew M.; Lin, Meng-Fen Grace; Paek, Seungoh; MacSuga-Gage, Ashley; Gage, Nicholas A.

    2017-01-01

    The worldwide explosion in popularity of mobile devices has created a dramatic increase in mobile software (apps) that are quick and easy to find and install, cheap, disposable, and usually single purpose. Hence, teachers need an equally streamlined and simplified decision-making process to help them identify educational apps--an approach that…

  17. 77 FR 19975 - VA Acquisition Regulation: Simplified Acquisition Procedures for Health-Care Resources (Section...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-04-03

    ... mail or hand-delivery to the Director, Regulations Management (02REG), Department of Veterans Affairs... received will be available for public inspection in the Office of Regulation Policy and Management, Room... be viewed online through the Federal Docket Management System (FDMS) at www.regulations.gov . FOR...

  18. [A simplified occupational health and safety management system designed for small enterprises. Initial validation results].

    PubMed

    Bacchi, Romana; Veneri, L; Ghini, P; Caso, Maria Alessandra; Baldassarri, Giovanna; Renzetti, F; Santarelli, R

    2009-01-01

    Occupational Health and Safety Management Systems (OHSMS) are known to be effective in improving safety at work. Unfortunately they are often too resource-heavy for small businesses. The aim of this project was to develop and test a simplified model of OHSMS suitable for small enterprises. The model consists of 7 procedures and various operating forms and check lists, that guide the enterprise in managing safety at work. The model was tested in 15 volunteer enterprises. In most of the enterprises two audits showed increased awareness and participation of workers; better definition and formalisation of responsibilities in 8 firms; election of Union Safety Representatives in over one quarter of the enterprises; improvement of safety equipment. The study also helped identify areas where the model could be improved by simplification of unnecessarily complex and redundant procedures.

  19. Practical, Real-Time, and Robust Watermarking on the Spatial Domain for High-Definition Video Contents

    NASA Astrophysics Data System (ADS)

    Kim, Kyung-Su; Lee, Hae-Yeoun; Im, Dong-Hyuck; Lee, Heung-Kyu

    Commercial markets employ digital rights management (DRM) systems to protect valuable high-definition (HD) quality videos. DRM systems use watermarking to provide copyright protection and ownership authentication of multimedia contents. We propose a real-time video watermarking scheme for HD video in the uncompressed domain. In particular, our approach takes a practical perspective, aiming to satisfy perceptual quality, real-time processing, and robustness requirements. We simplify and optimize the human visual system mask for real-time performance and also apply a dithering technique for invisibility. Extensive experiments are performed to prove that the proposed scheme satisfies the invisibility, real-time processing, and robustness requirements against video processing attacks. We concentrate on video processing attacks that commonly occur when HD-quality videos are prepared for display on portable devices. These attacks include not only scaling and low bit-rate encoding, but also malicious attacks such as format conversion and frame rate change.
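
    A minimal sketch of spatial-domain embedding and blind correlation detection is shown below; the crude gradient-based mask stands in for the paper's optimised human-visual-system mask, and all parameters are illustrative rather than the authors' exact scheme.

```python
# Illustrative spatial-domain watermarking sketch (not the authors' scheme).
import numpy as np

def embed(luma: np.ndarray, key: int, strength: float = 2.0) -> np.ndarray:
    rng = np.random.default_rng(key)
    pattern = rng.choice([-1.0, 1.0], size=luma.shape)       # keyed watermark pattern
    grad_y, grad_x = np.gradient(luma.astype(float))
    activity = np.abs(grad_x) + np.abs(grad_y)                # busy areas hide more
    mask = 0.5 + activity / (activity.max() + 1e-9)           # crude perceptual mask
    marked = luma + strength * mask * pattern
    return np.clip(marked, 0, 255).astype(np.uint8)

def detect(luma: np.ndarray, key: int) -> float:
    rng = np.random.default_rng(key)
    pattern = rng.choice([-1.0, 1.0], size=luma.shape)
    return float(np.mean((luma.astype(float) - luma.mean()) * pattern))  # > 0 suggests mark

frame = (np.random.default_rng(0).random((120, 160)) * 255).astype(np.uint8)
marked = embed(frame, key=42)
print(detect(marked, key=42), detect(frame, key=42))
```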

  20. Mathematical modeling of the heat transfer during pyrolysis process used for end-of-life tires treatment

    NASA Astrophysics Data System (ADS)

    Zheleva, I.; Georgiev, I.; Filipova, M.; Menseidov, D.

    2017-10-01

    Mathematical modeling of the heat transfer during the pyrolysis process used for the treatment of the End-of-Lifetires (EOLT) is presented in this paper. The pyrolysis process is 3D and non-stationary and because of this it is very complicated for modeling and studying. To simplify the modeling here a hierarchy of 2D models for the temperature which describe the non-stationary heat transfer in such a pyrolysis station is created. An algorithm for solving the model equations, based on MATLAB software is developed. The results for the temperature for some characteristic periods of operation of pyrolysis station are presented and commented in the paper. The results from this modeling can be used in the real pyrolysis station for more precise displacement of measurement devices and for designing of automated management of the process.
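The abstract refers to a hierarchy of 2D non-stationary temperature models solved with MATLAB-based algorithms, without giving the equations. As a minimal, hypothetical sketch of the same class of computation only, the following explicit finite-difference step advances a 2D heat-conduction field in time; the geometry, diffusivity, and boundary values are placeholders, not the paper's data.

```python
import numpy as np

def step_heat_2d(T, alpha, dx, dt):
    """One explicit finite-difference step of dT/dt = alpha * (T_xx + T_yy).

    T     : 2-D temperature field (K)
    alpha : thermal diffusivity (m^2/s), assumed value supplied by caller
    Stability requires dt <= dx**2 / (4 * alpha).
    """
    lap = (np.roll(T, 1, 0) + np.roll(T, -1, 0) +
           np.roll(T, 1, 1) + np.roll(T, -1, 1) - 4.0 * T) / dx**2
    T_new = T + alpha * dt * lap
    # Re-impose simple Dirichlet boundaries (e.g. a heated wall on one side).
    T_new[0, :], T_new[-1, :] = T[0, :], T[-1, :]
    T_new[:, 0], T_new[:, -1] = T[:, 0], T[:, -1]
    return T_new

# Example: 0.5 m x 0.5 m cross-section, heated left wall at 800 K, interior at 300 K.
n, dx, alpha = 50, 0.01, 1e-6
T = np.full((n, n), 300.0)
T[:, 0] = 800.0
dt = 0.2 * dx**2 / alpha
for _ in range(1000):
    T = step_heat_2d(T, alpha, dx, dt)
```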

  1. Study on Collision of Ship Side Structure by Simplified Plastic Analysis Method

    NASA Astrophysics Data System (ADS)

    Sun, C. J.; Zhou, J. H.; Wu, W.

    2017-10-01

During its lifetime, a ship may encounter collision or grounding and sustain permanent damage from these types of accidents. Crashworthiness assessment is based on two main methods: simplified plastic analysis and numerical simulation. A simplified plastic analysis method is presented in this paper. Numerical simulations using the non-linear finite-element software LS-DYNA are conducted to validate the method. The results show that the simplified plastic analysis is in good agreement with the finite-element simulation, which indicates that the simplified plastic analysis method can quickly and accurately estimate the crashworthiness of the side structure during the collision process and can be used as a reliable risk assessment method.

  2. Off the Shelf Fouling Management

    PubMed Central

    Rittschof, Daniel

    2017-01-01

This chapter tells the story of a research thread that identified and modified a pharmaceutical that could be a component of environmentally benign fouling management coatings. First, I present the background context of biofouling and how fouling is managed. The major target of the research is disrupting transduction of a complex process in all macrofouling organisms: metamorphosis. Using a bioassay-directed approach we first identified a pharmaceutical candidate. Then, based on structure-function studies coupled with laboratory and field bioassays, we simplified the molecule, eliminating halogens and aromatic rings, to a pharmacophore that could be readily broken down by bacteria. Next, we did further structure-function studies coupled to lab and field bioassays of modifications that enabled delivery of the molecule in a variety of coatings. The outcome is a different way of thinking about managing fouling and concepts in which molecules are designed to perform a function and then degrade. This work is discussed in the context of existing fouling management approaches and business models which use long-lived broad-spectrum biocides that have consequences for human and environmental health and for food security. PMID:28613232

  3. Cooperative Game-Based Energy Efficiency Management over Ultra-Dense Wireless Cellular Networks

    PubMed Central

    Li, Ming; Chen, Pengpeng; Gao, Shouwan

    2016-01-01

Ultra-dense wireless cellular networks have been envisioned as a promising technique for handling the explosive increase of wireless traffic volume. With the extensive deployment of small cells in wireless cellular networks, the network spectral efficiency (SE) is improved with the use of limited frequency. However, the mutual inter-tier and intra-tier interference between or among small cells and macro cells becomes serious. On the other hand, more chances for potential cooperation among different cells are introduced. Energy efficiency (EE) has become one of the most important problems for future wireless networks. This paper proposes a cooperative bargaining game-based method for comprehensive EE management in an ultra-dense wireless cellular network, which highlights the complicated influence of interference on energy-saving challenges and the power-coordination process among small cells and macro cells. In particular, a unified EE utility that accounts for interference mitigation is proposed to jointly address the SE, the deployment efficiency (DE), and the EE. Closed-form power-coordination solutions for the optimal EE are derived to show the convergence property of the algorithm. Moreover, a simplified algorithm is presented to reduce the complexity of the signaling overhead, which is significant for ultra-dense small cells. Finally, numerical simulations are provided to illustrate the efficiency of the proposed cooperative bargaining game-based and simplified schemes. PMID:27649170
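The unified EE utility proposed in the paper is not reproduced in the abstract. For orientation only, the standard textbook definitions below relate the quantities it combines, for a link with rate R, bandwidth B, transmit power P_tx, and circuit power P_c; these are generic definitions, not the authors' utility function.

```latex
\mathrm{SE} = \frac{R}{B} \ \left[\tfrac{\text{bit/s}}{\text{Hz}}\right],
\qquad
\mathrm{EE} = \frac{R}{P_{\mathrm{tx}} + P_{c}} \ \left[\tfrac{\text{bit}}{\text{J}}\right]
```

A utility of the kind described would trade these quantities off across cells while penalising the inter-tier and intra-tier interference terms.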

  4. Cooperative Game-Based Energy Efficiency Management over Ultra-Dense Wireless Cellular Networks.

    PubMed

    Li, Ming; Chen, Pengpeng; Gao, Shouwan

    2016-09-13

Ultra-dense wireless cellular networks have been envisioned as a promising technique for handling the explosive increase of wireless traffic volume. With the extensive deployment of small cells in wireless cellular networks, the network spectral efficiency (SE) is improved with the use of limited frequency. However, the mutual inter-tier and intra-tier interference between or among small cells and macro cells becomes serious. On the other hand, more chances for potential cooperation among different cells are introduced. Energy efficiency (EE) has become one of the most important problems for future wireless networks. This paper proposes a cooperative bargaining game-based method for comprehensive EE management in an ultra-dense wireless cellular network, which highlights the complicated influence of interference on energy-saving challenges and the power-coordination process among small cells and macro cells. In particular, a unified EE utility that accounts for interference mitigation is proposed to jointly address the SE, the deployment efficiency (DE), and the EE. Closed-form power-coordination solutions for the optimal EE are derived to show the convergence property of the algorithm. Moreover, a simplified algorithm is presented to reduce the complexity of the signaling overhead, which is significant for ultra-dense small cells. Finally, numerical simulations are provided to illustrate the efficiency of the proposed cooperative bargaining game-based and simplified schemes.

  5. Mobility, expansion and management of a multi-species scuba diving fishery in East Africa.

    PubMed

    Eriksson, Hampus; de la Torre-Castro, Maricela; Olsson, Per

    2012-01-01

Scuba diving fishing, predominantly targeting sea cucumbers, has been documented to occur in an uncontrolled manner in the Western Indian Ocean and in other tropical regions. Although this type of fishing generally indicates a destructive activity, little attention has been directed towards this category of fishery, a major knowledge gap and barrier to management. With the aim of capturing geographic scales, fishing processes and social aspects, the scuba diving fishery that operates out of Zanzibar was studied using interviews, discussions, participant observations and catch monitoring. The diving fishery was resilient to resource declines and had expanded to new species, new depths and new fishing grounds, sometimes operating approximately 250 km away from Zanzibar at depths down to 50 meters, as a result of depleted easy-access stocks. The diving operations were embedded in a regional and global trade network, and their actors operated in a roving manner on multiple spatial levels, taking advantage of unfair patron-client relationships and of the insufficient management in Zanzibar. This study illustrates that roving dynamics in fisheries, which have been predominantly addressed on a global scale, also take place at a considerably smaller spatial scale. Importantly, while proposed management of the sea cucumber fishery is often generic to a simplified fishery situation, this study illustrates a multifaceted fishery with diverse management requirements. The documented spatial scales and processes in the scuba diving fishery emphasize the need for increased regional governance partnerships to implement management that fits the spatial scales and processes of the operation.

  6. Mobility, Expansion and Management of a Multi-Species Scuba Diving Fishery in East Africa

    PubMed Central

    Eriksson, Hampus; de la Torre-Castro, Maricela; Olsson, Per

    2012-01-01

Background Scuba diving fishing, predominantly targeting sea cucumbers, has been documented to occur in an uncontrolled manner in the Western Indian Ocean and in other tropical regions. Although this type of fishing generally indicates a destructive activity, little attention has been directed towards this category of fishery, a major knowledge gap and barrier to management. Methodology and Principal Findings With the aim of capturing geographic scales, fishing processes and social aspects, the scuba diving fishery that operates out of Zanzibar was studied using interviews, discussions, participant observations and catch monitoring. The diving fishery was resilient to resource declines and had expanded to new species, new depths and new fishing grounds, sometimes operating approximately 250 km away from Zanzibar at depths down to 50 meters, as a result of depleted easy-access stocks. The diving operations were embedded in a regional and global trade network, and their actors operated in a roving manner on multiple spatial levels, taking advantage of unfair patron-client relationships and of the insufficient management in Zanzibar. Conclusions and Significance This study illustrates that roving dynamics in fisheries, which have been predominantly addressed on a global scale, also take place at a considerably smaller spatial scale. Importantly, while proposed management of the sea cucumber fishery is often generic to a simplified fishery situation, this study illustrates a multifaceted fishery with diverse management requirements. The documented spatial scales and processes in the scuba diving fishery emphasize the need for increased regional governance partnerships to implement management that fits the spatial scales and processes of the operation. PMID:22530034

  7. Successive membrane separation processes simplify concentration of lipases produced by Aspergillus niger by solid-state fermentation.

    PubMed

    Reinehr, Christian Oliveira; Treichel, Helen; Tres, Marcus Vinicius; Steffens, Juliana; Brião, Vandré Barbosa; Colla, Luciane Maria

    2017-06-01

In this study, we developed a simplified method for producing, separating, and concentrating lipases derived from solid-state fermentation of agro-industrial residues by filamentous fungi. First, we used Aspergillus niger to produce lipases with hydrolytic activity. We analyzed the separation and concentration of enzymes using membrane separation processes. The sequential use of microfiltration and ultrafiltration processes made it possible to obtain concentrates with enzymatic activities much higher than those in the initial extract. The permeate flux was higher than 60 L/m²·h during microfiltration using 20- and 0.45-µm membranes and during ultrafiltration using 100- and 50-kDa membranes, where fouling was reversible during the filtration steps, thereby indicating that the fouling may be removed by cleaning processes. These results demonstrate the feasibility of lipase production using A. niger by solid-state fermentation of agro-industrial residues, followed by successive tangential filtration with membranes, which simplifies the separation and concentration steps that are typically required in downstream processes.

  8. Mixed-species forest ecosystems in the Great Lakes region: A bibliography

    Treesearch

    John P. Gerlach; Daniel W. Gilmore; Klaus J. Puettmann; John C. Zasada

    2002-01-01

Most of the world's forests are dominated by mixed species stands but until recently, most forest management activities have focused on the development of single-species stands. To maximize fiber production, monoculture plantations were preferred because management and growth and yield prediction were simplified. This model of forest management developed because the...

  9. Sex and the single squirrel: a genetic view of forest management in the Pacific Northwest.

    Treesearch

    Sally Duncan

    2003-01-01

    Forest management throughout the world is producing simplified forests. There is growing concern that these forests maintain neither complete vertebrate communities nor conditions favorable to maintenance of genetic diversity of those vertebrate populations that do find habitat in simply structured stands. Genetics is increasingly being used as a basis for management...

  10. Formulation of detailed consumables management models for the development (preoperational) period of advanced space transportation system. Volume 3: Study of constraints/limitations for STS consumables management

    NASA Technical Reports Server (NTRS)

    Newman, C. M.

    1976-01-01

The constraints and limitations for STS Consumables Management are studied. Variables imposing constraints on the consumables-related subsystems are identified, and a method for determining constraint violations with the simplified consumables model in the Mission Planning Processor is presented.

  11. Toward an integrated classification of ecosystems: Defining opportunities for managing fish and forest health

    Treesearch

    Bruce E. Rieman; Danny C. Lee; Russell F. Thurow; Paul F. Hessburg; James R. Sedell

    2000-01-01

    Many of the aquatic and terrestrial ecosystems of the Pacific Northwest United States have been simplified and degraded in part through past land-management activities. Recent listings of fishes under the Endangered Species Act and major new initiatives for the restoration of forest health have precipitated contentious debate among managers and conservation interests...

  12. 48 CFR 242.7204 - Contract clause.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

... Management and Accounting System. 242.7204 Contract clause. Use the clause at 252.242-7004, Material Management and Accounting System, in all solicitations and contracts exceeding the simplified acquisition...

  13. A Practical Risk Stratification Approach for Implementing a Primary Care Chronic Disease Management Program in an Underserved Community.

    PubMed

    Xu, Junjun; Williams-Livingston, Arletha; Gaglioti, Anne; McAllister, Calvin; Rust, George

    2018-01-01

    The use of value metrics is often dependent on payer-initiated health care management incentives. There is a need for practices to define and manage their own patient panels regardless of payer to participate effectively in population health management. A key step is to define a panel of primary care patients with high comorbidity profiles. Our sample included all patients seen in an urban academic family medicine clinic over a two-year period. The simplified risk stratification was built using internal electronic health record and billing system data based on ICD-9 codes. There were 347 patients classified as high-risk out of the 5,364 patient panel. Average age was 59 years (SD 15). Hypertension (90%), hyperlipidemia (62%), and depression (55%) were the most common conditions among high-risk patients. Simplified risk stratification provides a feasible option for our team to understand and respond to the nuances of population health in our underserved community.
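The stratification described above was built from ICD-9 codes held in the practice's EHR and billing system; the exact classification rules are not given in the abstract. A minimal sketch of the general approach, assuming hypothetical ICD-9 code prefixes and a hypothetical comorbidity-count threshold:

```python
# Hypothetical chronic-condition groups keyed by ICD-9 code prefixes.
CONDITION_PREFIXES = {
    "hypertension":   ("401", "402", "403", "404", "405"),
    "hyperlipidemia": ("272",),
    "depression":     ("296", "311"),
    "diabetes":       ("250",),
}

def conditions_for(patient_codes):
    """Map a patient's ICD-9 codes to the chronic conditions they indicate."""
    found = set()
    for code in patient_codes:
        for condition, prefixes in CONDITION_PREFIXES.items():
            if code.startswith(prefixes):
                found.add(condition)
    return found

def is_high_risk(patient_codes, threshold=3):
    """Flag patients whose comorbidity count meets an assumed threshold."""
    return len(conditions_for(patient_codes)) >= threshold

panel = {"pt001": ["401.9", "272.4", "311"], "pt002": ["250.00"]}
high_risk = [pid for pid, codes in panel.items() if is_high_risk(codes)]
print(high_risk)
```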

  14. Development of the ICD-10 simplified version and field test.

    PubMed

    Paoin, Wansa; Yuenyongsuwan, Maliwan; Yokobori, Yukiko; Endo, Hiroyoshi; Kim, Sukil

    2018-05-01

    The International Statistical Classification of Diseases and Related Health Problems, 10th Revision (ICD-10) has been used in various Asia-Pacific countries for more than 20 years. Although ICD-10 is a powerful tool, clinical coding processes are complex; therefore, many developing countries have not been able to implement ICD-10-based health statistics (WHO-FIC APN, 2007). This study aimed to simplify ICD-10 clinical coding processes, to modify index terms to facilitate computer searching and to provide a simplified version of ICD-10 for use in developing countries. The World Health Organization Family of International Classifications Asia-Pacific Network (APN) developed a simplified version of the ICD-10 and conducted field testing in Cambodia during February and March 2016. Ten hospitals were selected to participate. Each hospital sent a team to join a training workshop before using the ICD-10 simplified version to code 100 cases. All hospitals subsequently sent their coded records to the researchers. Overall, there were 1038 coded records with a total of 1099 ICD clinical codes assigned. The average accuracy rate was calculated as 80.71% (66.67-93.41%). Three types of clinical coding errors were found. These related to errors relating to the coder (14.56%), those resulting from the physician documentation (1.27%) and those considered system errors (3.46%). The field trial results demonstrated that the APN ICD-10 simplified version is feasible for implementation as an effective tool to implement ICD-10 clinical coding for hospitals. Developing countries may consider adopting the APN ICD-10 simplified version for ICD-10 code assignment in hospitals and health care centres. The simplified version can be viewed as an introductory tool which leads to the implementation of the full ICD-10 and may support subsequent ICD-11 adoption.

  15. Doping-free white organic light-emitting diodes without blue molecular emitter: An unexplored approach to achieve high performance via exciplex emission

    NASA Astrophysics Data System (ADS)

    Luo, Dongxiang; Xiao, Ye; Hao, Mingming; Zhao, Yu; Yang, Yibin; Gao, Yuan; Liu, Baiquan

    2017-02-01

Doping-free white organic light-emitting diodes (DF-WOLEDs) are promising for low-cost commercialization because of their simplified device structures. However, DF-WOLEDs reported thus far in the literature are based on the use of blue single molecular emitters, whose processing can represent a crucial point in device manufacture. Herein, DF-WOLEDs without a blue single molecular emitter have been demonstrated by managing a blue exciplex system. For the single-molecular-emitter (orange or yellow emitter) DF-WOLEDs, (i) a color rendering index (CRI) of 81 at 1000 cd/m² can be obtained, which is one of the highest for single-molecular-emitter WOLEDs, or (ii) a high efficiency of 35.4 lm/W can be yielded. For the dual-molecular-emitter (yellow/red emitters) DF-WOLED, a high CRI of 85 and a low correlated color temperature of 2376 K at 1000 cd/m² have been simultaneously achieved, which had not been reported for previous DF-WOLEDs. These findings may unlock an alternative avenue to simplified but high-performance WOLEDs.

  16. A cumulative energy demand indicator (CED), life cycle based, for industrial waste management decision making.

    PubMed

    Puig, Rita; Fullana-I-Palmer, Pere; Baquero, Grau; Riba, Jordi-Roger; Bala, Alba

    2013-12-01

Life cycle thinking is a good approach for environmental decision support, although the complexity of Life Cycle Assessment (LCA) studies sometimes prevents their wide use. The purpose of this paper is to show how LCA methodology can be simplified to be more useful for certain applications. In order to improve waste management in Catalonia (Spain), a Cumulative Energy Demand indicator (LCA-based) has been used to obtain four mathematical models to help the government decide whether to prevent or allow a specific waste from being treated outside the region's borders. The conceptual equations and all the subsequent developments and assumptions made to obtain the simplified models are presented. One of the four models is discussed in detail, presenting the final simplified equation to be subsequently used by the government in decision making. The resulting model has been found to be scientifically robust, simple to implement and, above all, fulfilling its purpose: limiting waste transport out of Catalonia unless the waste recovery operations are significantly better and justify this transport. Copyright © 2013. Published by Elsevier Ltd.
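The paper's final simplified equations are not reproduced in the abstract. Purely as a hedged illustration of the decision logic it describes (allow export only when external recovery, including transport, is significantly better than local treatment), the sketch below compares cumulative energy demand per tonne; the function name, energy factors, and the 10 % margin are invented for illustration.

```python
def export_allowed(ced_local, ced_external, distance_km,
                   ced_transport_per_km=0.0008, margin=0.10):
    """Hypothetical screening rule based on cumulative energy demand (GJ/t).

    Export is allowed only if external treatment, including road transport,
    beats local treatment by at least `margin` (an assumed value).
    """
    ced_export = ced_external + ced_transport_per_km * distance_km
    return ced_export < (1.0 - margin) * ced_local

# Example: local disposal 12 GJ/t vs. external recovery 7 GJ/t located 600 km away.
print(export_allowed(ced_local=12.0, ced_external=7.0, distance_km=600))
```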

  17. Standard Versus Simplified Consent Materials for Biobank Participation: Differences in Patient Knowledge and Trial Accrual.

    PubMed

    Garrett, Sarah B; Murphy, Marie; Wiley, James; Dohan, Daniel

    2017-12-01

Replacing standard consent materials with simplified materials is a promising intervention to improve patient comprehension, but there is little evidence on its real-world implementation. We employed a sequential two-arm design to compare the effect of standard versus simplified consent materials on potential donors' understanding of biobank processes and their accrual to an active biobanking program. Participants were female patients of a California breast health clinic. Subjects from the simplified arm answered more items correctly (p = .064), reported "don't know" for fewer items (p = .077), and consented to donate to the biobank at higher rates (p = .025) than those from the standard arm. Replacing an extant consent form with a simplified version is feasible and may benefit patient comprehension and study accrual.

  18. SpectraFox: A free open-source data management and analysis tool for scanning probe microscopy and spectroscopy

    NASA Astrophysics Data System (ADS)

    Ruby, Michael

In the last decades scanning probe microscopy and spectroscopy have become well-established tools in nanotechnology and surface science. This opened the market for many commercial manufacturers, each with different hardware and software standards. Besides the advantage of a wide variety of available hardware, this diversity can complicate data exchange between scientists on the software side, as well as data analysis for groups working with hardware developed by different manufacturers. Not only does the file format differ between manufacturers, but the data also often require further numerical treatment before publication. SpectraFox is an open-source and independent tool which manages, processes, and evaluates scanning probe spectroscopy and microscopy data. It aims to simplify documentation in parallel with measurement, and it provides solid evaluation tools for large amounts of data.

  19. Access Control Management for SCADA Systems

    NASA Astrophysics Data System (ADS)

    Hong, Seng-Phil; Ahn, Gail-Joon; Xu, Wenjuan

The information technology revolution has transformed all aspects of our society including critical infrastructures and led to a significant shift from their old and disparate business models based on proprietary and legacy environments to more open and consolidated ones. Supervisory Control and Data Acquisition (SCADA) systems have been widely used not only for industrial processes but also for some experimental facilities. Due to the nature of open environments, managing SCADA systems should meet various security requirements since system administrators need to deal with a large number of entities and functions involved in critical infrastructures. In this paper, we identify necessary access control requirements in SCADA systems and articulate access control policies for the simulated SCADA systems. We also attempt to analyze and realize those requirements and policies in the context of role-based access control, which is suitable for simplifying administrative tasks in large-scale enterprises.
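The concrete policies in the paper are specific to its simulated SCADA systems; as a minimal, generic sketch of the role-based access control model it builds on, the snippet below uses entirely hypothetical roles, users, and permissions.

```python
# Hypothetical roles and permissions for a simulated SCADA environment.
ROLE_PERMISSIONS = {
    "operator": {"read_telemetry", "acknowledge_alarm"},
    "engineer": {"read_telemetry", "acknowledge_alarm", "change_setpoint"},
    "admin":    {"read_telemetry", "acknowledge_alarm", "change_setpoint",
                 "manage_users"},
}

USER_ROLES = {"alice": {"operator"}, "bob": {"engineer", "operator"}}

def is_permitted(user, permission):
    """RBAC check: a user may perform an action iff one of their roles grants it."""
    return any(permission in ROLE_PERMISSIONS.get(role, set())
               for role in USER_ROLES.get(user, set()))

assert is_permitted("bob", "change_setpoint")
assert not is_permitted("alice", "manage_users")
```

Administering the mapping from users to roles, rather than from users to individual permissions, is what simplifies the administrative task the abstract mentions.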

  20. Integrated Building Management System (IBMS)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Anita Lewis

This project provides a combination of software and services that more easily and cost-effectively help to achieve optimized building performance and energy efficiency. Featuring an open-platform, cloud-hosted application suite and an intuitive user experience, this solution simplifies a traditionally very complex process by collecting data from disparate building systems and creating a single, integrated view of building and system performance. The Fault Detection and Diagnostics algorithms developed within the IBMS have been designed and tested as an integrated component of the control algorithms running the equipment being monitored. The algorithms identify the normal control behaviors of the equipment without interfering with the equipment control sequences. The algorithms also work without interfering with any cooperative control sequences operating between different pieces of equipment or building systems. In this manner the FDD algorithms create an integrated building management system.

  1. Anovulatory and ovulatory infertility: results with simplified management.

    PubMed Central

    Hull, M G; Savage, P E; Bromham, D R

    1982-01-01

    A simplified scheme for the management of anovulatory and of ovulatory (usually called unexplained) infertility was evaluated in 244 women. Eighteen patients were excluded because of primary ovarian failure, 164 were treated for ovulatory failure, and 62 with ovulatory infertility remained untreated. Twenty-five patients had a properly validated negative postcoital test. In the remaining 201 patients the two-year conception rates were 96% in patients with amenorrhoea, 83% in those with oligomenorrhoea, 74% in those with luteal deficiency, and 88% in those with ovulatory infertility. Comparison with normal rates implied that amenorrhoea represents a pure form of ovulatory failure that is completely correctable whereas in other conditions unexplained factors also contribute to infertility though to a much smaller extent than was previously thought. PMID:6805656

  2. [The subject matters concerned with use of simplified analytical systems from the perspective of the Japanese Association of Medical Technologists].

    PubMed

    Morishita, Y

    2001-05-01

The subject matters concerning the use of so-called simplified analytical systems, with a view to their useful application, are discussed from the perspective of a laboratory technician. 1. Data from simplified analytical systems should agree with those of designated reference methods, so that discrepancies do not arise between data from different laboratories. 2. The accuracy of results measured with simplified analytical systems is difficult to scrutinize thoroughly and correctly using quality-control surveillance procedures based on stored pooled serum or partly processed blood. 3. It is necessary to present a guideline on the contents of the evaluation required to guarantee the quality of simplified analytical systems. 4. Maintenance and manual operation of simplified analytical systems have to be standardized by laboratory technicians and vendor technicians. 5. Attention is further called to the fact that the cost of simplified analytical systems is much higher than that of routine methods using liquid reagents. 6. It is also hoped that various substances in human serum, such as cytokines, hormones, tumor markers, and vitamins, can be measured by simplified analytical systems.

  3. Contamination concerns in the modular containerless processing facility

    NASA Technical Reports Server (NTRS)

    Seshan, P. K.; Trinh, E. H.

    1989-01-01

This paper describes the problems of the control and management of contamination in the Modular Containerless Processing Facility (MCPF), which is currently being developed at JPL for the Space Station, and in the MCPF's precursor version, called the Drop Physics Module (DPM), which will be carried aboard one or more Space Shuttle missions. Attention is given to the identification of contamination sources, their mode of transport to the sample positioned within the chamber, and the protection of the sample, as well as to the mathematical simulation of the contaminant transport. It is emphasized that, in order to choose and implement the most appropriate contamination control strategy for each investigator, a number of simplified mathematical simulations will have to be developed, and ground-based contamination experiments will have to be carried out with identical materials.

  4. Simplifying the complexity of resistance heterogeneity in metastasis

    PubMed Central

    Lavi, Orit; Greene, James M.; Levy, Doron; Gottesman, Michael M.

    2014-01-01

    The main goal of treatment regimens for metastasis is to control growth rates, not eradicate all cancer cells. Mathematical models offer methodologies that incorporate high-throughput data with dynamic effects on net growth. The ideal approach would simplify, but not over-simplify, a complex problem into meaningful and manageable estimators that predict a patient’s response to specific treatments. Here, we explore three fundamental approaches with different assumptions concerning resistance mechanisms, in which the cells are categorized into either discrete compartments or described by a continuous range of resistance levels. We argue in favor of modeling resistance as a continuum and demonstrate how integrating cellular growth rates, density-dependent versus exponential growth, and intratumoral heterogeneity improves predictions concerning the resistance heterogeneity of metastases. PMID:24491979

  5. Simplified method for creating a density-absorbed dose calibration curve for the low dose range from Gafchromic EBT3 film.

    PubMed

    Gotanda, Tatsuhiro; Katsuda, Toshizo; Gotanda, Rumi; Kuwano, Tadao; Akagawa, Takuya; Tanki, Nobuyoshi; Tabuchi, Akihiko; Shimono, Tetsunori; Kawaji, Yasuyuki

    2016-01-01

    Radiochromic film dosimeters have a disadvantage in comparison with an ionization chamber in that the dosimetry process is time-consuming for creating a density-absorbed dose calibration curve. The purpose of this study was the development of a simplified method of creating a density-absorbed dose calibration curve from radiochromic film within a short time. This simplified method was performed using Gafchromic EBT3 film with a low energy dependence and step-shaped Al filter. The simplified method was compared with the standard method. The density-absorbed dose calibration curves created using the simplified and standard methods exhibited approximately similar straight lines, and the gradients of the density-absorbed dose calibration curves were -32.336 and -33.746, respectively. The simplified method can obtain calibration curves within a much shorter time compared to the standard method. It is considered that the simplified method for EBT3 film offers a more time-efficient means of determining the density-absorbed dose calibration curve within a low absorbed dose range such as the diagnostic range.
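The simplified method yields a density-absorbed dose calibration curve that is approximately linear over the low-dose range, characterised by its fitted gradient. As a minimal sketch of how such a straight-line calibration could be fitted and then inverted, the snippet below uses made-up sample points, not the paper's measurements.

```python
import numpy as np

# Hypothetical calibration points: delivered absorbed dose (mGy) vs. net density change.
dose = np.array([0.0, 5.0, 10.0, 20.0, 40.0])
net_density = np.array([0.000, -0.0015, -0.0031, -0.0060, -0.0121])

# Straight-line fit of the density-absorbed dose calibration curve.
gradient, intercept = np.polyfit(dose, net_density, deg=1)

def density_to_dose(d):
    """Invert the linear calibration to read dose (mGy) from a measured density change."""
    return (d - intercept) / gradient

print(f"gradient = {gradient:.6f}")
print(f"dose at d = -0.0045: {density_to_dose(-0.0045):.1f} mGy")
```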

  6. Simplified method for creating a density-absorbed dose calibration curve for the low dose range from Gafchromic EBT3 film

    PubMed Central

    Gotanda, Tatsuhiro; Katsuda, Toshizo; Gotanda, Rumi; Kuwano, Tadao; Akagawa, Takuya; Tanki, Nobuyoshi; Tabuchi, Akihiko; Shimono, Tetsunori; Kawaji, Yasuyuki

    2016-01-01

    Radiochromic film dosimeters have a disadvantage in comparison with an ionization chamber in that the dosimetry process is time-consuming for creating a density-absorbed dose calibration curve. The purpose of this study was the development of a simplified method of creating a density-absorbed dose calibration curve from radiochromic film within a short time. This simplified method was performed using Gafchromic EBT3 film with a low energy dependence and step-shaped Al filter. The simplified method was compared with the standard method. The density-absorbed dose calibration curves created using the simplified and standard methods exhibited approximately similar straight lines, and the gradients of the density-absorbed dose calibration curves were −32.336 and −33.746, respectively. The simplified method can obtain calibration curves within a much shorter time compared to the standard method. It is considered that the simplified method for EBT3 film offers a more time-efficient means of determining the density-absorbed dose calibration curve within a low absorbed dose range such as the diagnostic range. PMID:28144120

  7. Simplified procedures for correlation of experimentally measured and predicted thrust chamber performance

    NASA Technical Reports Server (NTRS)

    Powell, W. B.

    1973-01-01

    Thrust chamber performance is evaluated in terms of an analytical model incorporating all the loss processes that occur in a real rocket motor. The important loss processes in the real thrust chamber were identified, and a methodology and recommended procedure for predicting real thrust chamber vacuum specific impulse were developed. Simplified equations for the calculation of vacuum specific impulse are developed to relate the delivered performance (both vacuum specific impulse and characteristic velocity) to the ideal performance as degraded by the losses corresponding to a specified list of loss processes. These simplified equations enable the various performance loss components, and the corresponding efficiencies, to be quantified separately (except that interaction effects are arbitrarily assigned in the process). The loss and efficiency expressions presented can be used to evaluate experimentally measured thrust chamber performance, to direct development effort into the areas most likely to yield improvements in performance, and as a basis to predict performance of related thrust chamber configurations.
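The report's simplified equations express delivered vacuum specific impulse as the ideal value degraded by separately quantified losses; the exact expressions are in the report, not the abstract. As a hedged sketch of this kind of decomposition, a common multiplicative-efficiency form (not necessarily the report's exact formulation) is:

```latex
I_{sp,\mathrm{vac}}^{\mathrm{del}}
  = \eta_{\mathrm{total}}\, I_{sp,\mathrm{vac}}^{\mathrm{ideal}},
\qquad
\eta_{\mathrm{total}} \approx
  \eta_{c^{*}}\,\eta_{\mathrm{div}}\,\eta_{\mathrm{BL}}\,\eta_{\mathrm{kin}}\,\eta_{\mathrm{2ph}}
```

where the factors stand for energy-release (characteristic-velocity), nozzle-divergence, boundary-layer, kinetic, and two-phase-flow losses; treating them as separable is the simplification that lets each loss be quantified and attacked individually.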

  8. Information retrieval and display system

    NASA Technical Reports Server (NTRS)

    Groover, J. L.; King, W. L.

    1977-01-01

    Versatile command-driven data management system offers users, through simplified command language, a means of storing and searching data files, sorting data files into specified orders, performing simple or complex computations, effecting file updates, and printing or displaying output data. Commands are simple to use and flexible enough to meet most data management requirements.

  9. Commonality in Military Equipment. A Framework to Improve Acquisition Decisions

    DTIC Science & Technology

    2008-01-01

Improving Acquisition Decisions. Chopra, Sunil, and Peter Meindl, Supply Chain Management: Strategy, Planning, Operation, Upper Saddle River, N.J. ... Personnel Costs in Managing Suppliers and Ordering Parts. The effort to perform these activities may be reduced and simplified through a smaller supply ... a Combined MOS on Mechanic Supply Variability

  10. A multiscale approach to modelling electrochemical processes occurring across the cell membrane with application to transmission of action potentials.

    PubMed

    Richardson, G

    2009-09-01

By application of matched asymptotic expansions, a simplified partial differential equation (PDE) model for the dynamic electrochemical processes occurring in the vicinity of a membrane, as ions selectively permeate across it, is formally derived from the Poisson-Nernst-Planck equations of electrochemistry. It is demonstrated that this simplified model reduces, in the limit of a long thin axon, to the cable equation used by Hodgkin and Huxley to describe the propagation of action potentials in the unmyelinated squid giant axon. The asymptotic reduction from the simplified PDE model to the cable equation leads to insights that are not otherwise apparent; these include an explanation of why the squid giant axon attains a diameter in the region of 1 mm. The simplified PDE model has more general application than the Hodgkin-Huxley cable equation and can, for example, be used to describe action potential propagation in myelinated axons and neuronal cell bodies.
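For reference, the long-thin-axon limit mentioned above is the standard Hodgkin-Huxley cable equation; in its usual form, for an axon of radius a, axoplasmic resistivity R_i, membrane capacitance per unit area C_m, membrane potential V(x, t), and ionic current density I_ion, it reads:

```latex
\frac{a}{2R_i}\,\frac{\partial^{2} V}{\partial x^{2}}
  \;=\; C_m\,\frac{\partial V}{\partial t} \;+\; I_{\mathrm{ion}}(V,t)
```

The simplified PDE model of the paper retains the near-membrane electrochemistry that this limit averages away, which is what makes it applicable to myelinated axons and cell bodies.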

  11. The research and realization of multi-platform real-time message-oriented middleware in large-scale air traffic control system

    NASA Astrophysics Data System (ADS)

    Liang, Haijun; Ren, Jialong; Song, Tao

    2017-05-01

To meet the operating requirements of an air traffic control system, a multi-platform real-time message-oriented middleware composed of CDCC and CDCS was studied and implemented. The former provides the application process interface, while the latter realizes data synchronism of CDCC and data exchange. MQM, an important part of the middleware, provides message queue management and encrypts and compresses data during transmission. Practical system application verifies that the middleware can simplify the development of air traffic control systems, enhance their stability, improve their systematic function and make them convenient for maintenance and reuse.
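The abstract states that the MQM component manages message queues and compresses and encrypts data in transit, but gives no interface details. A minimal, hypothetical sketch of a queue wrapper that compresses payloads on enqueue (encryption omitted; all names are invented and are not the CDCC/CDCS API):

```python
import queue
import zlib

class CompressingQueue:
    """Toy message queue that compresses payload bytes on put and restores them on get."""

    def __init__(self, maxsize=0):
        self._q = queue.Queue(maxsize)

    def put(self, payload: bytes) -> None:
        self._q.put(zlib.compress(payload))

    def get(self, timeout=None) -> bytes:
        return zlib.decompress(self._q.get(timeout=timeout))

mq = CompressingQueue()
mq.put(b'{"flight": "CCA1234", "level": 320}')
print(mq.get())
```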

  12. Simplifying the writing process for the novice writer.

    PubMed

    Redmond, Mary Connie

    2002-10-01

    Nurses take responsibility for reading information to update their professional knowledge and to meet relicensure requirements. However, nurses are less enthusiastic about writing for professional publication. This article explores the reluctance of nurses to write, the reasons why writing for publication is important to the nursing profession, the importance of mentoring to potential writers, and basic information about simplifying the writing process for novice writers. Copyright 2002 by American Society of PeriAnesthesia Nurses.

  13. 'Are you siding with a personality or the grant proposal?': observations on how peer review panels function.

    PubMed

    Coveney, John; Herbert, Danielle L; Hill, Kathy; Mow, Karen E; Graves, Nicholas; Barnett, Adrian

    2017-01-01

    In Australia, the peer review process for competitive funding is usually conducted by a peer review group in conjunction with prior assessment from external assessors. This process is quite mysterious to those outside it. The purpose of this research was to throw light on grant review panels (sometimes called the 'black box') through an examination of the impact of panel procedures, panel composition and panel dynamics on the decision-making in the grant review process. A further purpose was to compare experience of a simplified review process with more conventional processes used in assessing grant proposals in Australia. This project was one aspect of a larger study into the costs and benefits of a simplified peer review process. The Queensland University of Technology (QUT)-simplified process was compared with the National Health and Medical Research Council's (NHMRC) more complex process. Grant review panellists involved in both processes were interviewed about their experience of the decision-making process that assesses the excellence of an application. All interviews were recorded and transcribed. Each transcription was de-identified and returned to the respondent for review. Final transcripts were read repeatedly and coded, and similar codes were amalgamated into categories that were used to build themes. Final themes were shared with the research team for feedback. Two major themes arose from the research: (1) assessing grant proposals and (2) factors influencing the fairness, integrity and objectivity of review. Issues such as the quality of writing in a grant proposal, comparison of the two review methods, the purpose and use of the rebuttal, assessing the financial value of funded projects, the importance of the experience of the panel membership and the role of track record and the impact of group dynamics on the review process were all discussed. The research also examined the influence of research culture on decision-making in grant review panels. One of the aims of this study was to compare a simplified review process with more conventional processes. Generally, participants were supportive of the simplified process. Transparency in the grant review process will result in better appreciation of the outcome. Despite the provision of clear guidelines for peer review, reviewing processes are likely to be subjective to the extent that different reviewers apply different rules. The peer review process will come under more scrutiny as funding for research becomes even more competitive. There is justification for further research on the process, especially of a kind that taps more deeply into the 'black box' of peer review.

  14. Development of a multipurpose smart recorder for general aviation aircraft

    NASA Technical Reports Server (NTRS)

    White, J. H.; Finger, J. F.

    1988-01-01

    An intelligent flight recorder, called the Smart Recorder, was fabricated and installed on a King Air aircraft used in standard commercial charter service. This recorder was used for collection of data toward two objectives: (1) the characterization of the typical environment encountered by the aircraft; and (2) research in the area of trend monitoring. Data processing routines and data presentation formats were defined that are applicable to commuter size aircraft. The feasibility of a cost-effective, multipurpose recorder for general aviation aircraft was successfully demonstrated. Implementation of on-board environmental data processing increased the number of flight hours that could be stored on a single data cartridge and simplified the data management problem by reducing the volume of data to be processed in the laboratory. Trend monitoring algorithms showed less variability in the trend plots when compared against plots of manual data.

  15. Transitioning from Distributed and Traditional to Distributed and Agile: An Experience Report

    NASA Astrophysics Data System (ADS)

    Wildt, Daniel; Prikladnicki, Rafael

Global companies that experienced extensive waterfall phased plans are trying to improve their existing processes to expedite team engagement. Agile methodologies have become an acceptable path to follow because they comprise project management as part of their practices. Agile practices have been used with the objective of simplifying project control through simple processes, easy-to-update documentation and greater team interaction over exhaustive documentation, focusing rather on continuous team improvement and aiming to add value to business processes. The purpose of this chapter is to describe the experience of a global multinational company in transitioning from distributed and traditional to distributed and agile. This company has development centers across North America, South America and Asia. This chapter covers challenges faced by the project teams of two pilot projects, including strengths of using agile practices in a globally distributed environment and practical recommendations for similar endeavors.

  16. Simplified power processing for ion-thruster subsystems

    NASA Technical Reports Server (NTRS)

    Wessel, F. J.; Hancock, D. J.

    1983-01-01

Compared to chemical propulsion, ion propulsion offers distinct payload-mass increases for many future low-thrust earth-orbital and deep-space missions. Despite this advantage, the high initial cost and complexity of ion-propulsion subsystems reduce their attractiveness for most present and near-term spacecraft missions. Investigations have therefore been conducted with the objective of simplifying the power-processing unit (PPU), which is the single most complex and expensive component in the thruster subsystem. The present investigation is concerned with a program to simplify the design of the PPU employed in an 8-cm mercury-ion-thruster subsystem. In this program a dramatic simplification in the design of the PPU could be achieved, while retaining essential thruster control and subsystem operational flexibility.

  17. Toward improved simulation of river operations through integration with a hydrologic model

    USGS Publications Warehouse

    Morway, Eric D.; Niswonger, Richard G.; Triana, Enrique

    2016-01-01

Advanced modeling tools are needed for informed water resources planning and management. Two classes of modeling tools are often used to this end: (1) distributed-parameter hydrologic models for quantifying supply and (2) river-operation models for sorting out demands under rule-based systems such as the prior-appropriation doctrine. Within each of these two broad classes of models, there are many software tools that excel at simulating the processes specific to each discipline, but that have historically over-simplified, or at worst completely neglected, aspects of the other. As a result, water managers reliant on river-operation models for administering water resources need improved tools for representing spatially and temporally varying groundwater resources in conjunctive-use systems. A new tool is described that improves the representation of groundwater/surface-water (GW-SW) interaction within a river-operations modeling context and, in so doing, advances evaluation of system-wide hydrologic consequences of new or altered management regimes.

  18. Unimolecular decomposition reactions at low-pressure: A comparison of competitive methods

    NASA Technical Reports Server (NTRS)

    Adams, G. F.

    1980-01-01

    The lack of a simple rate coefficient expression to describe the pressure and temperature dependence hampers chemical modeling of flame systems. Recently developed simplified models to describe unimolecular processes include the calculation of rate constants for thermal unimolecular reactions and recombinations at the low pressure limit, at the high pressure limit and in the intermediate fall-off region. Comparison between two different applications of Troe's simplified model and a comparison between the simplified model and the classic RRKM theory are described.
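For orientation, the standard Lindemann-Hinshelwood fall-off expression that simplified treatments such as Troe's refine is shown below, with k_0 the low-pressure-limit rate coefficient, k_inf the high-pressure limit, [M] the bath-gas concentration, and F a broadening factor (F = 1 in the simple Lindemann case); this is the textbook form, not the specific parameterisations compared in the report.

```latex
k([M],T) \;=\; \frac{k_0[M]}{1 + k_0[M]/k_{\infty}}\,F,
\qquad
k \to k_0[M] \ \text{as } [M] \to 0,
\qquad
k \to k_{\infty} \ \text{as } [M] \to \infty
```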

  19. Introducing Risk Management Techniques Within Project Based Software Engineering Courses

    NASA Astrophysics Data System (ADS)

    Port, Daniel; Boehm, Barry

    2002-03-01

In 1996, USC switched its core two-semester software engineering course away from a hypothetical-project, homework-and-exam format based on the Bloom taxonomy of educational objectives (knowledge, comprehension, application, analysis, synthesis, and evaluation). The revised course is a real-client team-project course based on the CRESST model of learning objectives (content understanding, problem solving, collaboration, communication, and self-regulation). We used the CRESST cognitive demands analysis to determine the necessary student skills required for software risk management and the other major project activities, and have been refining the approach over the last 5 years of experience, including revised versions for one-semester undergraduate and graduate project courses at Columbia. This paper summarizes our experiences in evolving the risk management aspects of the project course. These have helped us mature more general techniques such as risk-driven specifications, domain-specific simplifier and complicator lists, and the schedule as an independent variable (SAIV) process model. The largely positive results in terms of pass/fail rates, client evaluations, product adoption rates, and hiring-manager feedback are summarized as well.

  20. The role of predictive uncertainty in the operational management of reservoirs

    NASA Astrophysics Data System (ADS)

    Todini, E.

    2014-09-01

    The present work deals with the operational management of multi-purpose reservoirs, whose optimisation-based rules are derived, in the planning phase, via deterministic (linear and nonlinear programming, dynamic programming, etc.) or via stochastic (generally stochastic dynamic programming) approaches. In operation, the resulting deterministic or stochastic optimised operating rules are then triggered based on inflow predictions. In order to fully benefit from predictions, one must avoid using them as direct inputs to the reservoirs, but rather assess the "predictive knowledge" in terms of a predictive probability density to be operationally used in the decision making process for the estimation of expected benefits and/or expected losses. Using a theoretical and extremely simplified case, it will be shown why directly using model forecasts instead of the full predictive density leads to less robust reservoir management decisions. Moreover, the effectiveness and the tangible benefits for using the entire predictive probability density instead of the model predicted values will be demonstrated on the basis of the Lake Como management system, operational since 1997, as well as on the basis of a case study on the lake of Aswan.
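The central argument above is that decisions should minimise expected loss taken over the full predictive density rather than the loss evaluated at a single forecast value. A minimal sketch of that difference under an assumed asymmetric loss; the distributions, numbers, and loss function are invented for illustration and are not the Lake Como or Aswan rules.

```python
import numpy as np

rng = np.random.default_rng(0)

def loss(release, inflow):
    """Asymmetric loss: unstorable spill is penalised five times more than over-release."""
    over_release = np.maximum(release - inflow, 0.0)
    spill = np.maximum(inflow - release, 0.0)
    return 1.0 * over_release + 5.0 * spill

# Predictive density of tomorrow's inflow (e.g. an ensemble forecast), in hm^3.
inflow_samples = rng.gamma(shape=4.0, scale=5.0, size=10_000)
point_forecast = inflow_samples.mean()

releases = np.linspace(0.0, 60.0, 241)
expected_loss = [loss(r, inflow_samples).mean() for r in releases]   # full density
plug_in_loss = [float(loss(r, point_forecast)) for r in releases]    # single forecast

best_with_density = releases[int(np.argmin(expected_loss))]
best_with_point = releases[int(np.argmin(plug_in_loss))]
print(best_with_density, best_with_point)  # the two decisions typically differ
```

Because the loss is asymmetric, the decision that minimises expected loss over the predictive density sits at a high quantile of the inflow distribution, whereas the plug-in decision simply tracks the point forecast, illustrating why the latter is less robust.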

  1. Improving profitability through slurry management: a look at the impact of slurry pH on various glass types

    NASA Astrophysics Data System (ADS)

    Hooper, Abigail R.; Boffa, Christopher C.; Sarkas, Harry W.; Cureton, Kevin

    2015-08-01

When building an optical system, optical fabricators and designers meticulously choose the glass types for their application, knowing that each one will have different chemical, thermal and mechanical properties. As the requirements for new optical systems have grown more demanding, the range of available glass types has vastly expanded and the specifications on the produced products have grown tighter. In an attempt to simplify processes and streamline consumable purchases, optical polishing houses often rely on one polishing slurry to manage this vast array of glass types. An unforeseen consequence of this practice can be a loss of productivity through reduced removal rates, poor yields and frequent rework, all translating into higher costs and reduced profitability. In this paper, the authors will examine the impact slurry pH has on glass types of different compositions and chemical, thermal and mechanical properties when using a double-sided polishing process. Experiments will use material removal rate, surface quality, and surface figure to provide insight into improving process control for differing glass types. Further guidance will be provided on how simple on-site monitoring and adjustment can deliver improved profitability on challenging substrates.

  2. Strategies Used by Families to Simplify Tasks for Individuals with Alzheimer's Disease and Related Disorders: Psychometric Analysis of the Task Management Strategy Index (TMSI)

    ERIC Educational Resources Information Center

    Gitlin, Laura N.; Winter, Laraine; Dennis, Marie P.; Corcoran, Mary; Schinfeld, Sandy; Hauck, Walter W.

    2002-01-01

    Purpose: Little is known about the specific behavioral strategies used by families to manage the physical dependency of persons with Alzheimer's disease and related disorders (ADRD). This study reports the psychometric properties of the Task Management Strategy Index (TMSI), a measure designed to identify actions taken by caregivers to simplify…

  3. Measuring Phantom Recollection in the Simplified Conjoint Recognition Paradigm

    ERIC Educational Resources Information Center

    Stahl, Christoph; Klauer, Karl Christoph

    2009-01-01

    False memories are sometimes strong enough to elicit recollective experiences. This phenomenon has been termed Phantom Recollection (PR). The Conjoint Recognition (CR) paradigm has been used to empirically separate PR from other memory processes. Recently, a simplification of the CR procedure has been proposed. We herein extend the simplified CR…

  4. Simplify Web Development for Faculty and Promote Instructional Design.

    ERIC Educational Resources Information Center

    Pedersen, David C.

    Faculty members are often overwhelmed with the prospect of implementing Web-based instruction. In an effort to simplify the process and incorporate some basic instructional design elements, the Educational Technology Team at Embry Riddle Aeronautical University created a course template for WebCT. Utilizing rapid prototyping, the template…

  5. Integrating RSS Feeds of New Books into the Campus Course Management System

    ERIC Educational Resources Information Center

    Corrado, Edward M.; Moulaison, Heather L.

    2006-01-01

    By integrating RSS feeds of new books into their campus' course management system, the authors, a systems librarian (Corrado) and a cataloging/modern languages librarian (Moulaison) at the The College of New Jersey (TCNJ), simplified initial research and spotlighted the library's collections. Faculty members are flocking to this award-winning…

  6. Competing Strategies to Balance the Budgets of Publicly Funded Higher Education Institutions

    ERIC Educational Resources Information Center

    Askari, Mahmoud Yousef

    2017-01-01

    This paper compares and contrasts different strategies to balance academic institutions' operating budgets. Some strategies use economic theory to recommend a budgeting technique, others use management methods to cut cost, and some strategies use a management accounting approach to reach a balanced budget. Through the use of a simplified numerical…

  7. Simplifying Education Management

    ERIC Educational Resources Information Center

    Wiley, Wayne C.

    2004-01-01

    Managing district information, web sites, and data security as well as supplying information on-demand are just a few of the tasks causing educational administrators to seek new solutions these days. The answer is to streamline the business of running schools by putting all information in one place so coordinated data and files can be accessed.…

  8. Experimental Evaluation of Processing Time for the Synchronization of XML-Based Business Objects

    NASA Astrophysics Data System (ADS)

    Ameling, Michael; Wolf, Bernhard; Springer, Thomas; Schill, Alexander

Business objects (BOs) are data containers for complex data structures used in business applications such as Supply Chain Management and Customer Relationship Management. Due to the replication of application logic, multiple copies of BOs are created which have to be synchronized and updated. This is a complex and time-consuming task because BOs vary considerably in their structure according to the distribution, number and size of elements. Since BOs are internally represented as XML documents, the parsing of XML is one major cost factor which has to be considered for minimizing the processing time during synchronization. The prediction of the parsing time for BOs is a significant property for the selection of an efficient synchronization mechanism. In this paper, we present a method to evaluate the influence of the structure of BOs on their parsing time. The results of our experimental evaluation, incorporating four different XML parsers, examine the dependencies between the distribution of elements and the parsing time. Finally, a general cost model is validated and simplified according to the results of the experimental setup.
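The study measures how a business object's XML structure drives parsing time across several parsers. A minimal sketch of such a measurement using only Python's standard xml.etree.ElementTree parser, varying the number of elements (the document shape and sizes are hypothetical, not the paper's benchmark):

```python
import time
import xml.etree.ElementTree as ET

def make_bo_xml(n_items):
    """Build a toy business-object document with n_items child elements."""
    items = "".join(f"<item id='{i}'><qty>1</qty><price>9.99</price></item>"
                    for i in range(n_items))
    return f"<businessObject><header/><items>{items}</items></businessObject>"

for n in (100, 1_000, 10_000, 100_000):
    doc = make_bo_xml(n)
    start = time.perf_counter()
    ET.fromstring(doc)
    elapsed = time.perf_counter() - start
    print(f"{n:>7} elements: {elapsed * 1000:.2f} ms")
```

Repeating the loop for several parsers and element distributions is the kind of experiment from which a general parsing-time cost model can be fitted and then simplified.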

  9. NeuroManager: a workflow analysis based simulation management engine for computational neuroscience

    PubMed Central

    Stockton, David B.; Santamaria, Fidel

    2015-01-01

    We developed NeuroManager, an object-oriented simulation management software engine for computational neuroscience. NeuroManager automates the workflow of simulation job submissions when using heterogeneous computational resources, simulators, and simulation tasks. The object-oriented approach (1) provides flexibility to adapt to a variety of neuroscience simulators, (2) simplifies the use of heterogeneous computational resources, from desktops to super computer clusters, and (3) improves tracking of simulator/simulation evolution. We implemented NeuroManager in MATLAB, a widely used engineering and scientific language, for its signal and image processing tools, prevalence in electrophysiology analysis, and increasing use in college Biology education. To design and develop NeuroManager we analyzed the workflow of simulation submission for a variety of simulators, operating systems, and computational resources, including the handling of input parameters, data, models, results, and analyses. This resulted in 22 stages of simulation submission workflow. The software incorporates progress notification, automatic organization, labeling, and time-stamping of data and results, and integrated access to MATLAB's analysis and visualization tools. NeuroManager provides users with the tools to automate daily tasks, and assists principal investigators in tracking and recreating the evolution of research projects performed by multiple people. Overall, NeuroManager provides the infrastructure needed to improve workflow, manage multiple simultaneous simulations, and maintain provenance of the potentially large amounts of data produced during the course of a research project. PMID:26528175

  10. NeuroManager: a workflow analysis based simulation management engine for computational neuroscience.

    PubMed

    Stockton, David B; Santamaria, Fidel

    2015-01-01

    We developed NeuroManager, an object-oriented simulation management software engine for computational neuroscience. NeuroManager automates the workflow of simulation job submissions when using heterogeneous computational resources, simulators, and simulation tasks. The object-oriented approach (1) provides flexibility to adapt to a variety of neuroscience simulators, (2) simplifies the use of heterogeneous computational resources, from desktops to super computer clusters, and (3) improves tracking of simulator/simulation evolution. We implemented NeuroManager in MATLAB, a widely used engineering and scientific language, for its signal and image processing tools, prevalence in electrophysiology analysis, and increasing use in college Biology education. To design and develop NeuroManager we analyzed the workflow of simulation submission for a variety of simulators, operating systems, and computational resources, including the handling of input parameters, data, models, results, and analyses. This resulted in 22 stages of simulation submission workflow. The software incorporates progress notification, automatic organization, labeling, and time-stamping of data and results, and integrated access to MATLAB's analysis and visualization tools. NeuroManager provides users with the tools to automate daily tasks, and assists principal investigators in tracking and recreating the evolution of research projects performed by multiple people. Overall, NeuroManager provides the infrastructure needed to improve workflow, manage multiple simultaneous simulations, and maintain provenance of the potentially large amounts of data produced during the course of a research project.
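
    NeuroManager itself is implemented in MATLAB; the Python fragment below only illustrates, under assumed stage names, the general idea of pushing a simulation job through an ordered list of workflow stages while labeling and time-stamping each step:

      # Minimal sketch of a staged simulation-submission workflow
      # (hypothetical stage names; not the 22-stage NeuroManager workflow).
      import datetime

      def stage_prepare_input(job):
          job["input_ready"] = True

      def stage_submit(job):
          job["submitted_to"] = job["resource"]

      def stage_collect_results(job):
          job["results"] = f"results_{job['label']}.dat"

      STAGES = [stage_prepare_input, stage_submit, stage_collect_results]

      def run_workflow(job):
          job["history"] = []
          for stage in STAGES:
              stage(job)
              job["history"].append((stage.__name__, datetime.datetime.now().isoformat()))
          return job

      job = {"label": "purkinje_cell_v1", "resource": "cluster-A"}
      print(run_workflow(job)["history"])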

  11. Why projects often fail even with high cost contingencies

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kujawski, Edouard

    2002-02-28

    In this note we assume that the individual risks have been adequately quantified and the total project cost contingency adequately computed to ensure an agreed-to probability or confidence level that the total project cost estimate will not be exceeded. But even projects that implement such a process are likely to result in significant cost overruns and/or project failure if the project manager allocates the contingencies to the individual subsystems. The intuitive and mathematically valid solution is to maintain a project-wide contingency and to distribute it to the individual risks on an as-needed basis. Such an approach ensures cost-efficient risk management, and projects that implement it are more likely to succeed and to cost less. We illustrate these ideas using a simplified project with two independent risks. The formulation can readily be extended to multiple risks.
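
    The effect can be illustrated with a toy Monte Carlo sketch (illustrative probability and cost figures only, not taken from the note): each of two independent risks receives its own 80th-percentile contingency, versus holding the same total amount as one project-wide pool.

      # Toy comparison: per-subsystem contingency allocation vs a pooled
      # project-wide contingency of the same total size.
      import random

      random.seed(1)
      N = 100_000

      def draw_risk():
          # each risk occurs with probability 0.5 and then costs Uniform(0, 100)
          return random.uniform(0, 100) if random.random() < 0.5 else 0.0

      def percentile(xs, q):
          xs = sorted(xs)
          return xs[int(q * (len(xs) - 1))]

      samples = [(draw_risk(), draw_risk()) for _ in range(N)]
      c1 = percentile([a for a, _ in samples], 0.80)   # per-subsystem contingencies
      c2 = percentile([b for _, b in samples], 0.80)
      pool = c1 + c2                                    # same total, held project-wide

      over_allocated = sum((a > c1) or (b > c2) for a, b in samples) / N
      over_pooled = sum((a + b) > pool for a, b in samples) / N
      print(f"overrun probability, allocated: {over_allocated:.3f}, pooled: {over_pooled:.3f}")

    With these assumed figures the pooled reserve is exceeded far less often than the per-subsystem allocation, which is the intuition behind maintaining a project-wide contingency.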

  12. Simplified Metadata Curation via the Metadata Management Tool

    NASA Astrophysics Data System (ADS)

    Shum, D.; Pilone, D.

    2015-12-01

    The Metadata Management Tool (MMT) is the newest capability developed as part of NASA Earth Observing System Data and Information System's (EOSDIS) efforts to simplify metadata creation and improve metadata quality. The MMT was developed via an agile methodology, taking into account inputs from GCMD's science coordinators and other end-users. In its initial release, the MMT uses the Unified Metadata Model for Collections (UMM-C) to allow metadata providers to easily create and update collection records in the ISO-19115 format. Through a simplified UI experience, metadata curators can create and edit collections without full knowledge of the NASA Best Practices implementation of ISO-19115 format, while still generating compliant metadata. More experienced users are also able to access raw metadata to build more complex records as needed. In future releases, the MMT will build upon recent work done in the community to assess metadata quality and compliance with a variety of standards through application of metadata rubrics. The tool will provide users with clear guidance as to how to easily change their metadata in order to improve their quality and compliance. Through these features, the MMT allows data providers to create and maintain compliant and high quality metadata in a short amount of time.

  13. Framework for Automation of Hazard Log Management on Large Critical Projects

    NASA Astrophysics Data System (ADS)

    Vinerbi, Lorenzo; Babu, Arun P.

    2016-08-01

    A hazard log is a database of all risk management activities in a project. Maintaining its correctness and consistency on large safety/mission critical projects involving multiple vendors, suppliers, and partners is critical and challenging. IBM DOORS is one of the popular tools used for hazard management in space applications. However, not all stakeholders are familiar with it. Also, it is not always feasible to expect all stakeholders to provide correct and consistent hazard data. The current work describes the process and tools to simplify hazard data collection on large projects. It demonstrates how the collected data from all stakeholders is merged to form the hazard log while ensuring data consistency and correctness. The data provided by all parties are collected using a template containing scripts. The scripts check for mistakes based on the internal standards of the company in charge of hazard management. The collected data is then merged in DOORS, which also contains scripts to check and import data to form the hazard log. The proposed tool has been applied to a mission critical project, and has been found to save time and reduce the number of mistakes while creating the hazard log. The use of automatic checks paves the way for correct tracking of risk and hazard analysis activities for large critical projects.

  14. Contracting Deployment Customer Guide.

    DTIC Science & Technology

    1996-12-01

    Functional managers from the major commands expressed the need to develop a Customer Guide for contingency deployments which would standardize, simplify, and streamline the support our Contingency Contracting Officers (CCOs) provide to our customers.

  15. Kennedy Space Center Florida Scrub-Jay Compensation Plan

    NASA Technical Reports Server (NTRS)

    Pitcock, Taylor Morgan (Compiler)

    2014-01-01

    Many organizations have interest in using NASA property on KSC. The purpose of this document is to consolidate the goals of ecosystem management associated with Florida Scrub-Jays and compliance with the Endangered Species Act (ESA) in order to streamline and reduce the costs of facility planning, impact assessment, and impact minimization. This will simplify the process and reduce regulatory uncertainty. However, the resulting process must be consistent with the Merritt Island National Wildlife Refuge (MINWR) Comprehensive Conservation Plan (CCP). In addition, this document considers anticipated construction impacts on KSC during the next 10 years and summarizes priorities in a spatially explicit manner. The document describes anticipated compensation requirements to facilitate restoration of degraded habitat in areas most important to the KSC Scrub-Jay population through resources provided to MINWR. The plan assumes that all construction on KSC is compensated on KSC.

  16. Simplified filtered Smith predictor for MIMO processes with multiple time delays.

    PubMed

    Santos, Tito L M; Torrico, Bismark C; Normey-Rico, Julio E

    2016-11-01

    This paper proposes a simplified tuning strategy for the multivariable filtered Smith predictor. It is shown that offset-free control can be achieved with step references and disturbances regardless of the poles of the primary controller, i.e., integral action is not explicitly required. This strategy reduces the number of design parameters and simplifies the tuning procedure because the implicit integrative poles are not considered for design purposes. The simplified approach can be used to design continuous-time or discrete-time controllers. Three case studies are used to illustrate the advantages of the proposed strategy compared with the standard approach, which is based on explicit integrative action. Copyright © 2016 ISA. Published by Elsevier Ltd. All rights reserved.

  17. Hypersonic Vehicle Propulsion System Simplified Model Development

    NASA Technical Reports Server (NTRS)

    Stueber, Thomas J.; Raitano, Paul; Le, Dzu K.; Ouzts, Peter

    2007-01-01

    This document addresses the modeling task plan for the hypersonic GN&C GRC team members. The overall propulsion system modeling task plan is a multi-step process, and the task plan identified in this document addresses the first steps (short-term modeling goals). The procedures and tools produced from this effort will be useful for creating simplified dynamic models applicable to a hypersonic vehicle propulsion system. The document continues with the GRC short-term modeling goal. Next, a general description of the desired simplified model is presented along with simulations that are available to varying degrees. The simulations may be available in electronic form (FORTRAN, CFD, MATLAB,...) or in paper form in published documents. Finally, roadmaps outlining possible avenues towards realizing a simplified model are presented.

  18. A novel methodology to model the cooling processes of packed horticultural produce using 3D shape models

    NASA Astrophysics Data System (ADS)

    Gruyters, Willem; Verboven, Pieter; Rogge, Seppe; Vanmaercke, Simon; Ramon, Herman; Nicolai, Bart

    2017-10-01

    Freshly harvested horticultural produce require proper temperature management to maintain their high economic value. Towards this end, low temperature storage is of crucial importance to maintain high product quality. Optimizing both the package design of packed produce and the different steps in the postharvest cold chain can be achieved by numerical modelling of the relevant transport phenomena. This work presents a novel methodology to accurately model both the random filling of produce in a package and the subsequent cooling process. First, a cultivar-specific database of more than 100 realistic CAD models of apple and pear fruit is built with a validated geometrical 3D shape model generator. To have an accurate representation of a realistic picking season, the model generator also takes into account the biological variability of the produce shape. Next, a discrete element model (DEM) randomly chooses surface-meshed bodies from the database to simulate the gravitational filling process of produce in a box or bin, using actual mechanical properties of the fruit. A computational fluid dynamics (CFD) model is then developed with the final stacking arrangement of the produce to study the cooling efficiency of packages under several conditions and configurations. Here, a typical precooling operation is simulated to demonstrate the large differences between using actual 3D shapes of the fruit and an equivalent-spheres approach that simplifies the problem drastically. From this study, it is concluded that using a simplified representation of the actual fruit shape may lead to a severe overestimation of the cooling behaviour.

  19. Transosseous fixation of pediatric displaced mandibular fractures with polyglactin resorbable suture--a simplified technique.

    PubMed

    Chandan, Sanjay; Halli, Rajshekhar; Joshi, Samir; Chhabaria, Gaurav; Setiya, Sneha

    2013-11-01

    Management of pediatric mandibular fractures presents a unique challenge to surgeons in terms of its numerous variations compared to adults. Both conservative and open methods have been advocated with their obvious limitations and complications. However, conservative modalities may not be possible in grossly displaced fractures, which necessitate the open method of fixation. We present a novel and simplified technique of transosseous fixation of displaced pediatric mandibular fractures with polyglactin resorbable suture, which provides adequate stability without any interference with tooth buds and which is easy to master.

  20. Distilling the Verification Process for Prognostics Algorithms

    NASA Technical Reports Server (NTRS)

    Roychoudhury, Indranil; Saxena, Abhinav; Celaya, Jose R.; Goebel, Kai

    2013-01-01

    The goal of prognostics and health management (PHM) systems is to ensure system safety, and reduce downtime and maintenance costs. It is important that a PHM system is verified and validated before it can be successfully deployed. Prognostics algorithms are integral parts of PHM systems. This paper investigates a systematic process of verification of such prognostics algorithms. To this end, first, this paper distinguishes between technology maturation and product development. Then, the paper describes the verification process for a prognostics algorithm as it moves up to higher maturity levels. This process is shown to be an iterative process where verification activities are interleaved with validation activities at each maturation level. In this work, we adopt the concept of technology readiness levels (TRLs) to represent the different maturity levels of a prognostics algorithm. It is shown that at each TRL, the verification of a prognostics algorithm depends on verifying the different components of the algorithm according to the requirements laid out by the PHM system that adopts this prognostics algorithm. Finally, using simplified examples, the systematic process for verifying a prognostics algorithm is demonstrated as the prognostics algorithm moves up TRLs.

  1. Marshall information retrieval and display system (MIRADS)

    NASA Technical Reports Server (NTRS)

    Groover, J. L.; Jones, S. C.; King, W. L.

    1974-01-01

    Program for data management system allows sophisticated inquiries while utilizing simplified language. Online system is composed of several programs. System is written primarily in COBOL with routines in ASSEMBLER and FORTRAN V.

  2. PIMS sequencing extension: a laboratory information management system for DNA sequencing facilities.

    PubMed

    Troshin, Peter V; Postis, Vincent Lg; Ashworth, Denise; Baldwin, Stephen A; McPherson, Michael J; Barton, Geoffrey J

    2011-03-07

    Facilities that provide a service for DNA sequencing typically support large numbers of users and experiment types. The cost of services is often reduced by the use of liquid handling robots but the efficiency of such facilities is hampered because the software for such robots does not usually integrate well with the systems that run the sequencing machines. Accordingly, there is a need for software systems capable of integrating different robotic systems and managing sample information for DNA sequencing services. In this paper, we describe an extension to the Protein Information Management System (PIMS) that is designed for DNA sequencing facilities. The new version of PIMS has a user-friendly web interface and integrates all aspects of the sequencing process, including sample submission, handling and tracking, together with capture and management of the data. The PIMS sequencing extension has been in production since July 2009 at the University of Leeds DNA Sequencing Facility. It has completely replaced manual data handling and simplified the tasks of data management and user communication. Samples from 45 groups have been processed with an average throughput of 10000 samples per month. The current version of the PIMS sequencing extension works with Applied Biosystems 3130XL 96-well plate sequencer and MWG 4204 or Aviso Theonyx liquid handling robots, but is readily adaptable for use with other combinations of robots. PIMS has been extended to provide a user-friendly and integrated data management solution for DNA sequencing facilities that is accessed through a normal web browser and allows simultaneous access by multiple users as well as facility managers. The system integrates sequencing and liquid handling robots, manages the data flow, and provides remote access to the sequencing results. The software is freely available, for academic users, from http://www.pims-lims.org/.

  3. Mining dynamic noteworthy functions in software execution sequences.

    PubMed

    Zhang, Bing; Huang, Guoyan; Wang, Yuqian; He, Haitao; Ren, Jiadong

    2017-01-01

    As the quality of crucial entities can directly affect that of software, their identification and protection become an important premise for effective software development, management, maintenance and testing, which thus contribute to improving the software quality and its attack-defending ability. Most existing analyses and evaluations of important entities, such as code-based static structure analysis, are detached from the actual running of the software. In this paper, from the perspective of the software execution process, we propose an approach to mine dynamic noteworthy functions (DNFM) in software execution sequences. First, according to software decompiling and tracking of stack changes, execution traces composed of a series of function addresses were acquired. Then these traces were modeled as execution sequences and simplified so as to obtain simplified sequences (SFS), followed by the extraction of patterns from the SFS through a pattern extraction (PE) algorithm. After that, the evaluation indicators inner-importance and inter-importance were designed to measure the noteworthiness of functions in the DNFM algorithm. Finally, these functions were sorted by their noteworthiness. Comparison and contrast were conducted against the results of two traditional complex network-based node mining methods, namely PageRank and DegreeRank. The results show that the DNFM method can mine noteworthy functions in software effectively and precisely.
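
    The fragment below is only a rough approximation of the pipeline sketched above; the paper's actual definitions of inner-importance and inter-importance differ, and the traces here are made up:

      # Rough sketch: simplify execution traces, then rank functions by how
      # often (inner) and how widely across traces (inter) they appear.
      from collections import Counter

      traces = [
          ["main", "parse", "parse", "eval", "log"],
          ["main", "parse", "eval", "eval", "log"],
          ["main", "init", "log"],
      ]

      def simplify(trace):
          # collapse immediate repetitions, e.g. a loop calling the same function
          out = []
          for f in trace:
              if not out or out[-1] != f:
                  out.append(f)
          return out

      simplified = [simplify(t) for t in traces]
      inner = Counter(f for t in simplified for f in t)        # frequency across all traces
      inter = Counter(f for t in simplified for f in set(t))   # number of traces containing f

      score = {f: inner[f] * inter[f] for f in inner}
      for f, s in sorted(score.items(), key=lambda kv: -kv[1]):
          print(f, s)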

  4. Open source integrated modeling environment Delta Shell

    NASA Astrophysics Data System (ADS)

    Donchyts, G.; Baart, F.; Jagers, B.; van Putten, H.

    2012-04-01

    In the last decade, integrated modelling has become a very popular topic in environmental modelling since it helps solve problems that are difficult to model using a single model. However, managing the complexity of integrated models and minimizing the time required for their setup remains a challenging task. The integrated modelling environment Delta Shell simplifies this task. The software components of Delta Shell are easy to reuse, both separately and as part of an integrated environment that can run in a command-line or a graphical user interface mode. Most components of Delta Shell are developed using the C# programming language and include libraries used to define, save and visualize various scientific data structures as well as coupled model configurations. Here we present two examples showing how Delta Shell simplifies the process of setting up integrated models from the end-user and developer perspectives. The first example shows the coupling of a rainfall-runoff, a river flow and a run-time control model. The second example shows how a coastal morphological database integrates with the coastal morphological model (XBeach) and a custom nourishment designer. Delta Shell is also available as open-source software released under the LGPL license and accessible via http://oss.deltares.nl.

  5. A hybrid approach to urine drug testing using high-resolution mass spectrometry and select immunoassays.

    PubMed

    McMillin, Gwendolyn A; Marin, Stephanie J; Johnson-Davis, Kamisha L; Lawlor, Bryan G; Strathmann, Frederick G

    2015-02-01

    The major objective of this research was to propose a simplified approach for the evaluation of medication adherence in chronic pain management patients, using liquid chromatography time-of-flight (TOF) mass spectrometry, performed in parallel with select homogeneous enzyme immunoassays (HEIAs). We called it a "hybrid" approach to urine drug testing. The hybrid approach was defined based on anticipated positivity rates, availability of commercial reagents for HEIAs, and assay performance, particularly analytical sensitivity and specificity for drug(s) of interest. Subsequent to implementation of the hybrid approach, time to result was compared with that observed with other urine drug testing approaches. Opioids, benzodiazepines, zolpidem, amphetamine-like stimulants, and methylphenidate metabolite were detected by TOF mass spectrometry to maximize specificity and sensitivity of these 37 drug analytes. Barbiturates, cannabinoid metabolite, carisoprodol, cocaine metabolite, ethyl glucuronide, methadone, phencyclidine, propoxyphene, and tramadol were detected by HEIAs that performed adequately and/or for which positivity rates were very low. Time to result was significantly reduced compared with the traditional approach. The hybrid approach to urine drug testing provides a simplified and analytically specific testing process that minimizes the need for secondary confirmation. Copyright© by the American Society for Clinical Pathology.

  6. Auditory Alterations in Children Infected by Human Immunodeficiency Virus Verified Through Auditory Processing Test

    PubMed Central

    Romero, Ana Carla Leite; Alfaya, Lívia Marangoni; Gonçales, Alina Sanches; Frizzo, Ana Claudia Figueiredo; Isaac, Myriam de Lima

    2016-01-01

    Introduction The auditory system of HIV-positive children may have deficits at various levels, such as the high incidence of problems in the middle ear that can cause hearing loss. Objective The objective of this study is to characterize the development of children infected by the Human Immunodeficiency Virus (HIV) in the Simplified Auditory Processing Test (SAPT) and the Staggered Spondaic Word Test. Methods We performed behavioral tests composed of the Simplified Auditory Processing Test and the Portuguese version of the Staggered Spondaic Word Test (SSW). The participants were 15 children infected by HIV, all using antiretroviral medication. Results The children had abnormal auditory processing verified by the Simplified Auditory Processing Test and the Portuguese version of the SSW. In the Simplified Auditory Processing Test, 60% of the children presented hearing impairment. In the SAPT, the memory test for verbal sounds showed more errors (53.33%); whereas in the SSW, 86.67% of the children showed deficiencies indicating deficits in figure-ground, attention, and auditory memory skills. Furthermore, there were more errors in conditions of background noise in both age groups, where most errors were in the left ear in the group of 8-year-olds, with similar results for the group aged 9 years. Conclusion The high incidence of hearing loss in children with HIV and comorbidity with several biological and environmental factors indicate the need for: 1) family and professional awareness of the impact of auditory alterations on the development and learning of children with HIV, and 2) access to educational plans and follow-up with multidisciplinary teams as early as possible to minimize the damage caused by auditory deficits. PMID:28050213

  7. Initial riparian down wood dynamics in relation to thinning and buffer width

    Treesearch

    Paul D. Anderson; Deanna H. Olson; Adrian Ares

    2013-01-01

    Down wood plays many functional roles in aquatic and riparian ecosystems. Simplification of forest structure and low abundance of down wood in stream channels and riparian areas is a common legacy of historical management in headwater forests west of the Cascade Range in the US northwest. Contemporary management practices emphasize the implementation of vegetation...

  8. Flexible solid-state supercapacitors based on carbon nanoparticles/MnO2 nanorods hybrid structure.

    PubMed

    Yuan, Longyan; Lu, Xi-Hong; Xiao, Xu; Zhai, Teng; Dai, Junjie; Zhang, Fengchao; Hu, Bin; Wang, Xue; Gong, Li; Chen, Jian; Hu, Chenguo; Tong, Yexiang; Zhou, Jun; Wang, Zhong Lin

    2012-01-24

    A highly flexible solid-state supercapacitor was fabricated through a simple flame synthesis method and electrochemical deposition process based on a carbon nanoparticles/MnO(2) nanorods hybrid structure using polyvinyl alcohol/H(3)PO(4) electrolyte. Carbon fabric is used as a current collector and electrode (mechanical support), leading to a simplified, highly flexible, and lightweight architecture. The device exhibited good electrochemical performance with an energy density of 4.8 Wh/kg at a power density of 14 kW/kg, and a demonstration of a practical device is also presented, highlighting the path for its enormous potential in energy management. © 2011 American Chemical Society

  9. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Costa, Mafalda T., E-mail: mafaldatcosta@gmail.com; Carolino, Elisabete, E-mail: lizcarolino@gmail.com; Oliveira, Teresa A., E-mail: teresa.oliveira@uab.pt

    In water supply systems with a distribution network, the most critical aspects of control and monitoring of water quality, which generate system crises, are the effects of cross-contamination originating from the network topology. Classical quality control systems based on the application of Shewhart charts are generally difficult to manage in real time due to the high number of charts that must be completed and evaluated. As an alternative to traditional control systems with Shewhart charts, this study aimed to apply a simplified methodology for monitoring quality parameters in a drinking water distribution system, by applying Hotelling's T² charts supplemented with Shewhart charts with Bonferroni limits whenever process instabilities were detected.
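
    As a minimal illustration of the multivariate alternative to one chart per parameter, the sketch below computes a Hotelling's T² statistic for a new observation against an assumed in-control sample (synthetic data; control limits and the Bonferroni-limited Shewhart charts are not shown):

      # Minimal Hotelling's T^2 statistic for an individual multivariate observation.
      import numpy as np

      rng = np.random.default_rng(0)
      # phase-I (in-control) observations of, say, pH, turbidity, chlorine
      phase1 = rng.normal([7.2, 0.5, 0.8], [0.1, 0.05, 0.1], size=(200, 3))
      mean = phase1.mean(axis=0)
      cov_inv = np.linalg.inv(np.cov(phase1, rowvar=False))

      def t2(x):
          d = x - mean
          return float(d @ cov_inv @ d)

      new_sample = np.array([7.5, 0.65, 0.6])
      print(f"T2 = {t2(new_sample):.2f}")  # compare against an upper control limit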

  10. KITTEN Lightweight Kernel 0.1 Beta

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pedretti, Kevin; Levenhagen, Michael; Kelly, Suzanne

    2007-12-12

    The Kitten Lightweight Kernel is a simplified OS (operating system) kernel that is intended to manage a compute node's hardware resources. It provides a set of mechanisms to user-level applications for utilizing hardware resources (e.g., allocating memory, creating processes, accessing the network). Kitten is much simpler than general-purpose OS kernels, such as Linux or Windows, but includes all of the essential functionality needed to support HPC (high-performance computing) MPI, PGAS and OpenMP applications. Kitten provides unique capabilities such as physically contiguous application memory, transparent large page support, and noise-free tick-less operation, which enable HPC applications to obtain greater efficiency and scalability than with general-purpose OS kernels.

  11. Integration, design, and construction of a CELSS breadboard facility for bioregenerative life support system research

    NASA Technical Reports Server (NTRS)

    Prince, R.; Knott, W.; Buchanan, Paul

    1987-01-01

    Design criteria for the Biomass Production Chamber (BPC), preliminary operating procedures, and requirements for the future development of the Controlled Ecological Life Support System (CELSS) are discussed. CELSS, which uses a bioregenerative system, includes the following three major units: (1) a biomass production component to grow plants under controlled conditions; (2) food processing components to derive maximum edible content from all plant parts; and (3) waste management components to recover and recycle all solids, liquids, and gases necessary to support life. The current status of the CELSS breadboard facility is reviewed; a block diagram of a simplified version of CELSS and schematic diagrams of the BPS are included.

  12. Managing malocclusion in the mixed dentition: six keys to success. Part 1.

    PubMed

    Fleming, Padhraig S; Johal, Ama; DiBiase, Andrew T

    2008-11-01

    Indications of developing malocclusion are often present in the mixed dentition. With judicious supervision and timely intervention their effects can be minimized. The general dental practitioner is ideally placed to recognize, manage and correct many such incipient problems. This first of two papers considers three keys to success, involving normal dental development, deviations from normal eruption patterns, crossbite correction and habit cessation. The appropriate management of developing malocclusion may simplify later orthodontic management or indeed make such intervention unnecessary.

  13. Simplified Parallel Domain Traversal

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Erickson III, David J

    2011-01-01

    Many data-intensive scientific analysis techniques require global domain traversal, which over the years has been a bottleneck for efficient parallelization across distributed-memory architectures. Inspired by MapReduce and other simplified parallel programming approaches, we have designed DStep, a flexible system that greatly simplifies efficient parallelization of domain traversal techniques at scale. In order to deliver both simplicity to users as well as scalability on HPC platforms, we introduce a novel two-tiered communication architecture for managing and exploiting asynchronous communication loads. We also integrate our design with advanced parallel I/O techniques that operate directly on native simulation output. We demonstrate DStep by performing teleconnection analysis across ensemble runs of terascale atmospheric CO2 and climate data, and we show scalability results on up to 65,536 IBM BlueGene/P cores.

  14. A Cluster-Randomized Controlled Trial of a Simplified Multifaceted Management Program for Individuals at High Cardiovascular Risk (SimCard Trial) in Rural Tibet, China and Haryana, India

    PubMed Central

    Tian, Maoyi; Ajay, Vamadevan S.; Dunzhu, Danzeng; Hameed, Safraj S.; Li, Xian; Liu, Zhong; Li, Cong; Chen, Hao; Cho, KaWing; Li, Ruilai; Zhao, Xingshan; Jindal, Devraj; Rawal, Ishita; Ali, Mohammed K.; Peterson, Eric D.; Ji, Jiachao; Amarchand, Ritvik; Krishnan, Anand; Tandon, Nikhil; Xu, Li-Qun; Wu, Yangfeng; Prabhakaran, Dorairaj; Yan, Lijing L.

    2015-01-01

    Background In rural areas in China and India, cardiovascular disease burden is high but economic and healthcare resources are limited. This study aims to develop and evaluate a simplified cardiovascular management program (SimCard) delivered by community health workers (CHWs) with the aid of a smartphone-based electronic decision support system. Methods and Results The SimCard study was a yearlong cluster-randomized controlled trial conducted in 47 villages (27 in China and 20 in India). 2,086 'high cardiovascular risk' individuals (aged 40 years or older with self-reported history of coronary heart disease, stroke, diabetes, and/or measured systolic blood pressure ≥160 mmHg) were recruited. Participants in the intervention villages were managed by CHWs through an Android-powered "app" on a monthly basis focusing on two medications and two lifestyle modifications. Compared with the control group, the intervention group had a 25.5% (P<0.001) higher net increase in the primary outcome of the proportion of patient-reported anti-hypertensive medication use pre- and post-intervention. There were also significant differences in certain secondary outcomes: aspirin use (net difference 17.1%, P<0.001) and systolic blood pressure (−2.7 mmHg, P=0.04). However, no significant changes were observed in the lifestyle factors. The intervention was culturally tailored, and country-specific results revealed important differences between the regions. Conclusions The results indicate that the simplified cardiovascular management program improved quality of primary care and clinical outcomes in resource-poor settings in China and India. Larger trials in more places are needed to ascertain potential impacts on mortality and morbidity outcomes. Clinical Trial Registration Information clinicaltrials.gov. Identifier: NCT01503814. PMID:26187183

  15. Simplified signal processing for impedance spectroscopy with spectrally sparse sequences

    NASA Astrophysics Data System (ADS)

    Annus, P.; Land, R.; Reidla, M.; Ojarand, J.; Mughal, Y.; Min, M.

    2013-04-01

    The classical method for measurement of electrical bio-impedance involves excitation with a sinusoidal waveform. Sinusoidal excitation at fixed frequency points enables a wide variety of signal processing options, the most general of them being the Fourier transform. Multiplication with two quadrature waveforms at the desired frequency can easily be accomplished both in the analogue and in the digital domain; even the simplest quadrature square waves can be considered, which reduces the signal processing task in the analogue domain to synchronous switching followed by a low-pass filter, and in the digital domain requires only additions. So-called spectrally sparse excitation sequences (SSS), which have recently been introduced into the bio-impedance measurement domain, are a very reasonable choice when simultaneous multifrequency excitation is required. They have many good properties, such as ease of generation and a good crest factor compared to similar multisinusoids. Typically, the use of the discrete or fast Fourier transform in the signal processing step has been considered so far. Use of simplified methods would nevertheless reduce the computational burden and enable simpler, less costly and less energy-hungry signal processing platforms. The accuracy of measurement with SSS excitation when using different waveforms for quadrature demodulation is compared in order to evaluate the feasibility of the simplified signal processing. A sigma-delta modulated sinusoid (binary signal) is considered to be a good alternative for synchronous demodulation.
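
    A minimal sketch of the square-wave quadrature demodulation idea follows (single-tone excitation with made-up amplitude and phase, no SSS sequence or noise); the mixing step needs only sign flips and additions, and the 2/π scale factor of square-wave mixing is corrected at the end:

      # Synchronous detection with square-wave quadrature references.
      import math

      f = 1000.0          # excitation frequency, Hz
      fs = 1_000_000.0    # sampling rate, Hz
      n = int(fs / f) * 50          # integer number of periods
      amp, phase = 0.7, math.radians(30)

      i_acc = q_acc = 0.0
      for k in range(n):
          t = k / fs
          x = amp * math.cos(2 * math.pi * f * t + phase)     # measured response
          ref_i = 1.0 if math.cos(2 * math.pi * f * t) >= 0 else -1.0
          ref_q = 1.0 if math.sin(2 * math.pi * f * t) >= 0 else -1.0
          i_acc += x * ref_i
          q_acc += x * ref_q

      i_avg, q_avg = i_acc / n, q_acc / n
      est_amp = (math.pi / 2) * math.hypot(i_avg, q_avg)      # undo the 2A/pi mixing gain
      est_phase = math.degrees(math.atan2(-q_avg, i_avg))
      print(f"estimated amplitude {est_amp:.3f}, phase {est_phase:.1f} deg")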

  16. Policy-Based Management Natural Language Parser

    NASA Technical Reports Server (NTRS)

    James, Mark

    2009-01-01

    The Policy-Based Management Natural Language Parser (PBEM) is a rules-based approach to enterprise management that can be used to automate certain management tasks. This parser simplifies the management of a given endeavor by establishing policies to deal with situations that are likely to occur. Policies are operating rules that can be referred to as a means of maintaining order, security, consistency, or other ways of successfully furthering a goal or mission. PBEM provides a way of managing configuration of network elements, applications, and processes via a set of high-level rules or business policies rather than managing individual elements, thus switching the control to a higher level. This software allows unique management rules (or commands) to be specified and applied to a cross-section of the Global Information Grid (GIG). This software embodies a parser that is capable of recognizing and understanding conversational English. Because all possible dialect variants cannot be anticipated, a unique capability was developed that parses based on conversational intent rather than the exact way the words are used. This software can increase productivity by enabling a user to converse with the system in conversational English to define network policies. PBEM can be used in both manned and unmanned science-gathering programs. Because policy statements can be domain-independent, this software can be applied equally to a wide variety of applications.

  17. Modified off-midline closure of pilonidal sinus disease.

    PubMed

    Saber, Aly

    2014-05-01

    Numerous surgical procedures have been described for pilonidal sinus disease, but treatment failure and disease recurrence are frequent. Conventional off-midline flap closures have relatively favorable surgical outcomes, but relatively unfavorable cosmetic outcomes. The author reports outcomes of a new simplified off-midline technique for closure of the defect after complete excision of the sinus tracts. Two hundred patients of both sexes were enrolled; modified D-shaped excisions were used to include all sinuses and their ramifications, with a simplified procedure to close the defect. The overall wound infection rate was 12% (12.2% for males and 11.1% for females). Wound disruption necessitated laying the whole wound open and managing it as an open wound. The overall wound disruption rate was 6% (6.1% for males and 5.5% for females) and the overall recurrence rate was 7%. Our simplified off-midline closure without flap appeared to be comparable to conventional off-midline closure with flap, in terms of wound infection, wound dehiscence, and recurrence. Advantages of the simplified procedure include potentially reduced surgery complexity, reduced surgery time, and improved cosmetic outcome.

  18. Baby Talk as a Simplified Register. Papers and Reports on Child Language Development, No. 9.

    ERIC Educational Resources Information Center

    Ferguson, Charles A.

    Every speech community has a baby talk register (BT) of phonological, grammatical, and lexical features regarded as primarily appropriate for addressing young children and also for other displaced or extended uses. Much BT is analyzable as derived from normal adult speech (AS) by such simplifying processes as reduction, substitution, assimilation,…

  19. Simplifier: a web tool to eliminate redundant NGS contigs.

    PubMed

    Ramos, Rommel Thiago Jucá; Carneiro, Adriana Ribeiro; Azevedo, Vasco; Schneider, Maria Paula; Barh, Debmalya; Silva, Artur

    2012-01-01

    Modern genomic sequencing technologies produce a large amount of data with reduced cost per base; however, this data consists of short reads. This reduction in the size of the reads, compared to those obtained with previous methodologies, presents new challenges, including a need for efficient algorithms for the assembly of genomes from short reads and for resolving repetitions. Additionally, after ab initio assembly, curation of the hundreds or thousands of contigs generated by assemblers demands considerable time and computational resources. We developed Simplifier, a stand-alone software that selectively eliminates redundant sequences from the collection of contigs generated by ab initio assembly of genomes. Application of Simplifier to data generated by assembly of the genome of Corynebacterium pseudotuberculosis strain 258 reduced the number of contigs generated by ab initio methods from 8,004 to 5,272, a reduction of 34.14%; in addition, N50 increased from 1 kb to 1.5 kb. Processing the contigs of Escherichia coli DH10B with Simplifier reduced the mate-paired library 17.47% and the fragment library 23.91%. Simplifier removed redundant sequences from datasets produced by assemblers, thereby reducing the effort required for finalization of genome assembly in tests with data from prokaryotic organisms. Simplifier is available at http://www.genoma.ufpa.br/rramos/softwares/simplifier.xhtml. It requires Sun JDK 6 or higher.

  20. A novel simplified model for torsional vibration analysis of a series-parallel hybrid electric vehicle

    NASA Astrophysics Data System (ADS)

    Tang, Xiaolin; Yang, Wei; Hu, Xiaosong; Zhang, Dejiu

    2017-02-01

    In this study, based on our previous work, a novel simplified torsional vibration dynamic model is established to study the torsional vibration characteristics of a compound planetary hybrid propulsion system. The main frequencies of the hybrid driveline are determined. In contrast to the vibration characteristics of the previous 16-degree-of-freedom model, the simplified model can be used to accurately describe the low-frequency vibration properties of this hybrid powertrain. This study provides a basis for further vibration control of the hybrid powertrain during the process of engine start/stop.

  1. Climate Leadership webinar on Greenhouse Gas Management Resources for Small Businesses

    EPA Pesticide Factsheets

    Small businesses can calculate their carbon footprint and construct a greenhouse gas inventory to help track progress towards reaching emissions reduction goals. One strategy for this is EPA's Simplified GHG Emissions Calculator.

  2. A simplified and powerful image processing methods to separate Thai jasmine rice and sticky rice varieties

    NASA Astrophysics Data System (ADS)

    Khondok, Piyoros; Sakulkalavek, Aparporn; Suwansukho, Kajpanya

    2018-03-01

    A simplified and powerful image processing procedure to separate paddy of KHAW DOK MALI 105, or Thai jasmine rice, from paddy of the sticky rice variety RD6 is proposed. The procedure consists of image thresholding, image chain coding and curve fitting using a polynomial function. From the fitting, three parameters of each variety were calculated: perimeter, area, and eccentricity. Finally, the overall parameters were determined using principal component analysis. The results show that these procedures can significantly separate the two varieties.
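
    The sketch below re-creates the feature extraction step with scikit-image rather than the chain-coding and polynomial-fitting procedure used in the paper (the file names and the area filter threshold are hypothetical):

      # Threshold, measure perimeter/area/eccentricity per grain, then project with PCA.
      import numpy as np
      from skimage import io, filters, measure

      def grain_features(path):
          gray = io.imread(path, as_gray=True)
          binary = gray > filters.threshold_otsu(gray)        # image thresholding
          labels = measure.label(binary)
          feats = [(r.perimeter, r.area, r.eccentricity)
                   for r in measure.regionprops(labels) if r.area > 50]
          return np.array(feats)

      def pca_scores(features, n_components=2):
          centered = features - features.mean(axis=0)
          _, _, vt = np.linalg.svd(centered, full_matrices=False)
          return centered @ vt[:n_components].T

      # feats = np.vstack([grain_features("jasmine.png"), grain_features("rd6.png")])
      # print(pca_scores(feats)[:5])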

  3. NLM's Medical Library Resource Improvement Grant for Consortia Development: a proposed outline to simplify the application process.

    PubMed

    Kabler, A W

    1980-01-01

    The National Library of Medicine's Resource Improvement Grant for Consortia is available to assist with developing hospital library consortia and to support the development of basic health information collections. In an effort to simplify the grant application process, this paper presents suggestions for writing the narrative section of the first budget-period application, using the outline in NLM's Application Instructions for Consortium Applicants. Suggestions for writing the narratives of the second budget-period application and the collection development application are also included.

  4. The development and evaluation of a nursing information system for caring clinical in-patient.

    PubMed

    Fang, Yu-Wen; Li, Chih-Ping; Wang, Mei-Hua

    2015-01-01

    The research aimed to develop a nursing information system to simplify the admission procedure for clinical in-patient care and enhance the efficiency of medical information documentation. By correctly delivering patients’ health records and providing continuous care, patient safety and care quality would be effectively improved. The study applied the Spiral Model development approach to compose a nursing information team, using strategies of data collection, working environment observation, use-case modeling, and Joint Application Design (JAD) conferences to complete the system requirement analysis and design. The Admission Care Management Information System (ACMIS) mainly included: (1) an admission nursing management information system, (2) an inter-shift meeting information management system, and (3) the linkage of the drug management system and the physical examination record system. The evaluation framework contained qualitative and quantitative components that provided both formative and summative elements. System evaluation applied the information success model, and a questionnaire covering nurses’ acceptance and satisfaction was developed. The questionnaire results showed that user satisfaction, perceived self-involvement, age and information quality were positively related to personal and organizational effectiveness. According to the results of this study, the Admission Care Management Information System was practical for simplifying the clinical working procedure and effective in communicating and documenting admission medical information.

  5. A risk assessment approach for fresh fruits.

    PubMed

    Bassett, J; McClure, P

    2008-04-01

    To describe the approach used in conducting a fit-for-purpose risk assessment of microbiological human pathogens associated with fresh fruit and the risk management recommendations made. A qualitative risk assessment for microbiological hazards in fresh fruit was carried out based on the Codex Alimentarius (Codex) framework, modified to consider multiple hazards and all fresh (whole) fruits. The assessment determines 14 significant bacterial, viral, protozoal and nematodal hazards associated with fresh produce, assesses the probable level of exposure from fresh fruit, concludes on the risk from each hazard, and considers and recommends risk management actions. A review of potential risk management options allowed the comparison of effectiveness with the potential exposure to each hazard. Washing to a recommended protocol is an appropriate risk management action for the vast majority of consumption events, particularly when good agricultural and hygienic practices are followed and with the addition of refrigerated storage for low acid fruit. Additional safeguards are recommended for aggregate fruits with respect to the risk from protozoa. The potentially complex process of assessing the risks of multiple hazards in multiple but similar commodities can be simplified in a qualitative assessment approach that employs the Codex methodology.

  6. Ergatis: a web interface and scalable software system for bioinformatics workflows

    PubMed Central

    Orvis, Joshua; Crabtree, Jonathan; Galens, Kevin; Gussman, Aaron; Inman, Jason M.; Lee, Eduardo; Nampally, Sreenath; Riley, David; Sundaram, Jaideep P.; Felix, Victor; Whitty, Brett; Mahurkar, Anup; Wortman, Jennifer; White, Owen; Angiuoli, Samuel V.

    2010-01-01

    Motivation: The growth of sequence data has been accompanied by an increasing need to analyze data on distributed computer clusters. The use of these systems for routine analysis requires scalable and robust software for data management of large datasets. Software is also needed to simplify data management and make large-scale bioinformatics analysis accessible and reproducible to a wide class of target users. Results: We have developed a workflow management system named Ergatis that enables users to build, execute and monitor pipelines for computational analysis of genomics data. Ergatis contains preconfigured components and template pipelines for a number of common bioinformatics tasks such as prokaryotic genome annotation and genome comparisons. Outputs from many of these components can be loaded into a Chado relational database. Ergatis was designed to be accessible to a broad class of users and provides a user friendly, web-based interface. Ergatis supports high-throughput batch processing on distributed compute clusters and has been used for data management in a number of genome annotation and comparative genomics projects. Availability: Ergatis is an open-source project and is freely available at http://ergatis.sourceforge.net Contact: jorvis@users.sourceforge.net PMID:20413634

  7. Failure mode and effects analysis: a comparison of two common risk prioritisation methods.

    PubMed

    McElroy, Lisa M; Khorzad, Rebeca; Nannicelli, Anna P; Brown, Alexandra R; Ladner, Daniela P; Holl, Jane L

    2016-05-01

    Failure mode and effects analysis (FMEA) is a method of risk assessment increasingly used in healthcare over the past decade. The traditional method, however, can require substantial time and training resources. The goal of this study is to compare a simplified scoring method with the traditional scoring method to determine the degree of congruence in identifying high-risk failures. An FMEA of the operating room (OR) to intensive care unit (ICU) handoff was conducted. Failures were scored and ranked using both the traditional risk priority number (RPN) and criticality-based method, and a simplified method, which designates failures as 'high', 'medium' or 'low' risk. The degree of congruence was determined by first identifying those failures determined to be critical by the traditional method (RPN≥300), and then calculating the per cent congruence with those failures designated critical by the simplified methods (high risk). In total, 79 process failures among 37 individual steps in the OR to ICU handoff process were identified. The traditional method yielded Criticality Indices (CIs) ranging from 18 to 72 and RPNs ranging from 80 to 504. The simplified method ranked 11 failures as 'low risk', 30 as medium risk and 22 as high risk. The traditional method yielded 24 failures with an RPN ≥300, of which 22 were identified as high risk by the simplified method (92% agreement). The top 20% of CI (≥60) included 12 failures, of which six were designated as high risk by the simplified method (50% agreement). These results suggest that the simplified method of scoring and ranking failures identified by an FMEA can be a useful tool for healthcare organisations with limited access to FMEA expertise. However, the simplified method does not result in the same degree of discrimination in the ranking of failures offered by the traditional method. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/
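
    For orientation, the fragment below contrasts the traditional RPN (severity × occurrence × detection, each scored 1-10) with a made-up high/medium/low rule; the actual simplified rubric used in the study is not reproduced here, and the failure modes and scores are invented:

      # Traditional RPN next to an illustrative simplified classification.
      failures = {
          "ventilator settings not communicated": (9, 6, 7),
          "lines mislabeled on arrival":           (7, 4, 3),
          "chart handed over late":                (4, 5, 2),
      }

      def rpn(sev, occ, det):
          return sev * occ * det

      def simplified(sev, occ, _det):
          # hypothetical rule of thumb, not the study's rubric
          if sev >= 8 and occ >= 5:
              return "high"
          if sev >= 5 or occ >= 5:
              return "medium"
          return "low"

      for name, (s, o, d) in failures.items():
          print(f"{name}: RPN={rpn(s, o, d)}, simplified={simplified(s, o, d)}")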

  8. High-Performance Data Analysis Tools for Sun-Earth Connection Missions

    NASA Technical Reports Server (NTRS)

    Messmer, Peter

    2011-01-01

    The data analysis tool of choice for many Sun-Earth Connection missions is the Interactive Data Language (IDL) by ITT VIS. The increasing amount of data produced by these missions and the increasing complexity of image processing algorithms requires access to higher computing power. Parallel computing is a cost-effective way to increase the speed of computation, but algorithms oftentimes have to be modified to take advantage of parallel systems. Enhancing IDL to work on clusters gives scientists access to increased performance in a familiar programming environment. The goal of this project was to enable IDL applications to benefit from both computing clusters as well as graphics processing units (GPUs) for accelerating data analysis tasks. The tool suite developed in this project enables scientists now to solve demanding data analysis problems in IDL that previously required specialized software, and it allows them to be solved orders of magnitude faster than on conventional PCs. The tool suite consists of three components: (1) TaskDL, a software tool that simplifies the creation and management of task farms, collections of tasks that can be processed independently and require only small amounts of data communication; (2) mpiDL, a tool that allows IDL developers to use the Message Passing Interface (MPI) inside IDL for problems that require large amounts of data to be exchanged among multiple processors; and (3) GPULib, a tool that simplifies the use of GPUs as mathematical coprocessors from within IDL. mpiDL is unique in its support for the full MPI standard and its support of a broad range of MPI implementations. GPULib is unique in enabling users to take advantage of an inexpensive piece of hardware, possibly already installed in their computer, and achieve orders of magnitude faster execution time for numerically complex algorithms. TaskDL enables the simple setup and management of task farms on compute clusters. The products developed in this project have the potential to interact, so one can build a cluster of PCs, each equipped with a GPU, and use mpiDL to communicate between the nodes and GPULib to accelerate the computations on each node.

  9. Economic, social and resource management factors influencing groundwater trade: Evidence from Victoria, Australia

    NASA Astrophysics Data System (ADS)

    Gill, Bruce; Webb, John; Stott, Kerry; Cheng, Xiang; Wilkinson, Roger; Cossens, Brendan

    2017-07-01

    In Victoria, Australia, most groundwater resources are now fully allocated and opportunities for new groundwater development can only occur through trading of license entitlements. Groundwater usage has rarely exceeded 50% of the available licensed volume, even in the 2008/9 drought year, and 50 to 70% of individual license holders use less than 5% of their allocation each year. However, little groundwater trading is occurring at present. Interviews were conducted with groundwater license holders and water brokers to investigate why the Victorian groundwater trade market is underdeveloped. Responses show there is a complex mix of social, economic, institutional and technical reasons. Barriers to trade are influenced by the circumstances of each groundwater user, administrative process and resource management rules. Water brokers deal with few trades at low margins and noted unrealistic selling prices and administrative difficulties. Irrigators who have successfully traded identify that there are few participants in trading, technical appraisals are expensive and administrative requirements and fees are burdensome, especially when compared to surface water trading. Opportunities to facilitate trade include groundwater management plan refinement and improved information provision. Simplifying transaction processes and costs, demonstrating good resource stewardship and preventing third party impacts from trade could address some concerns raised by market participants. There are, however, numerous individual circumstances that inhibit groundwater trading, so it is unlikely that policy and process changes alone could increase usage rates without greater demand for groundwater or more favourable farming economic circumstances.

  10. Battery Cell Balancing Optimisation for Battery Management System

    NASA Astrophysics Data System (ADS)

    Yusof, M. S.; Toha, S. F.; Kamisan, N. A.; Hashim, N. N. W. N.; Abdullah, M. A.

    2017-03-01

    Battery cell balancing in every electrical system, from home electronic equipment to electric vehicles, is very important for extending battery run time, commonly known as battery life. The underlying solution is to equalize the cell voltage and state of charge (SOC) between the cells when they are fully charged. In order to control and extend battery life, cell balancing is designed and managed in such a way as to shorten the charging process. Active and passive cell balancing strategies enable balancing of the battery in a well-performing configuration so that the charging process is faster. The experiments and simulation cover an analysis of how fast the battery can balance within a given time. The simulation-based analysis is conducted to assess the use of optimisation in active or passive cell balancing to extend battery life over long periods of time.
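
    A toy sketch of passive balancing behaviour (assumed thresholds and bleed step; not the optimisation studied here) is given below: any cell sitting more than a set window above the weakest cell is bled until the pack converges.

      # Illustrative passive balancing loop with made-up numbers.
      cells = [4.10, 4.05, 3.98, 4.12]   # cell voltages in volts
      THRESHOLD = 0.01                    # balance window, V
      BLEED_STEP = 0.005                  # voltage drop per bleed step, V

      step = 0
      while max(cells) - min(cells) > THRESHOLD:
          weakest = min(cells)
          cells = [v - BLEED_STEP if v - weakest > THRESHOLD else v for v in cells]
          step += 1

      print(f"balanced after {step} steps: {[round(v, 3) for v in cells]}")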

  11. Applying the vantage PDMS to jack-up drilling ships

    NASA Astrophysics Data System (ADS)

    Yin, Peng; Chen, Yuan-Ming; Cui, Tong-Kai; Wang, Zi-Shen; Gong, Li-Jiang; Yu, Xiang-Fen

    2009-09-01

    The plant design management system (PDMS) is an integrated application which includes a database and is useful when designing complex 3-D industrial projects. It could be used to simplify the most difficult part of a subsea oil extraction project: detailed pipeline design. It could also be used to integrate the design of equipment, structures, HVAC, E-ways as well as the detailed designs of other specialists. This article mainly examines the applicability of the Vantage PDMS database to pipeline projects involving jack-up drilling ships. It discusses the catalogue (CATA) of the pipeline, the spec-world (SPWL) of the pipeline, the bolt tables (BLTA) and so on. This article explains the main methods for CATA construction as well as problems in the process of construction. In this article, the authors point out matters needing attention when using the Vantage PDMS database in the design process and discuss partial solutions to these questions.

  12. Simultaneous Visualization of Different Utility Networks for Disaster Management

    NASA Astrophysics Data System (ADS)

    Semm, S.; Becker, T.; Kolbe, T. H.

    2012-07-01

    Cartographic visualizations of crises are used to create a Common Operational Picture (COP) and enforce Situational Awareness by presenting and representing relevant information. As nearly all crises affect geospatial entities, geo-data representations have to support location-specific decision-making throughout the crisis. Since operators' attention spans and working memory are limiting factors in acquiring and interpreting information, the cartographic presentation has to support individuals in coordinating their activities and in handling highly dynamic situations. The Situational Awareness of operators, in conjunction with a COP, is a key aspect of the decision-making process and essential for coming to appropriate decisions. Utility networks are among the most complex and most needed systems within a city. The visualization of utility infrastructure in crisis situations is addressed in this paper. The paper provides a conceptual approach on how to simplify, aggregate, and visualize multiple utility networks and their components to meet the requirements of the decision-making process and to support Situational Awareness.

  13. Integration of communications and tracking data processing simulation for space station

    NASA Technical Reports Server (NTRS)

    Lacovara, Robert C.

    1987-01-01

    A simplified model of the communications network for the Communications and Tracking Data Processing System (CTDP) was developed. It was simulated by use of programs running on several on-site computers. These programs communicate with one another by means of both local area networks and direct serial connections. The domain of the model and its simulation is from the Orbital Replaceable Unit (ORU) interface to the Data Management System (DMS). The simulation was designed to allow status queries from remote entities across the DMS networks to be propagated through the model to several simulated ORUs. The ORU response is then propagated back to the remote entity that originated the request. Response times at the various levels were investigated in a multi-tasking, multi-user operating system environment. Results indicate that the effective bandwidth of the system may be too low to support expected data volume requirements under conventional operating systems. Instead, some form of embedded process control program may be required on the node computers.

  14. A combined model to assess technical and economic consequences of changing conditions and management options for wastewater utilities.

    PubMed

    Giessler, Mathias; Tränckner, Jens

    2018-02-01

    The paper presents a simplified model that quantifies the economic and technical consequences of changing conditions in wastewater systems at the utility level. It has been developed based on data from stakeholders and ministries, collected by a survey that determined resulting effects and adaptation measures. The model comprises all substantial cost-relevant assets and activities of a typical German wastewater utility. It consists of three modules: i) Sewer, describing the state development of sewer systems; ii) WWTP, covering process parameters of wastewater treatment plants (WWTP); and iii) Cost Accounting, calculating expenses in the cost categories and the resulting charges. The validity and accuracy of this model were verified using historical data from an exemplary wastewater utility. Calculated process and economic parameters show high accuracy compared to measured parameters and actual expenses. Thus, the model is proposed to support strategic, process-oriented decision-making at the utility level. Copyright © 2017 Elsevier Ltd. All rights reserved.

  15. Manager's Role in Electromagnetic Interference (EMI) Control

    NASA Technical Reports Server (NTRS)

    Sargent, Noel B.; Lewis, Catherine C.

    2013-01-01

    This presentation captures the essence of electromagnetic compatibility (EMC) engineering from a project manager's perspective. It explains the basics of EMC and the benefits to the project of early incorporation of EMC best practices. The EMC requirement products during a project life cycle are identified, along with the requirement verification methods that should be utilized. The goal of the presentation is to raise awareness and dispel the mystique surrounding electromagnetic compatibility for managers who have little or no electromagnetics background.

  16. Simplified process model discovery based on role-oriented genetic mining.

    PubMed

    Zhao, Weidong; Liu, Xi; Dai, Weihui

    2014-01-01

    Process mining is the automated acquisition of process models from event logs. Although many process mining techniques have been developed, most of them are based on control flow. Meanwhile, existing role-oriented process mining methods focus on the correctness and integrity of roles while ignoring the role complexity of the process model, which directly impacts the understandability and quality of the model. To address these problems, we propose a genetic programming approach to mine a simplified process model. Using a new metric of process complexity in terms of roles as the fitness function, we can find simpler process models. The new role complexity metric is derived from role cohesion and coupling and is applied to discover roles in process models. Moreover, the higher fitness derived from the role complexity metric also provides a guideline for redesigning process models. Finally, we conduct a case study and experiments showing that, compared with related studies, the proposed method is more effective at streamlining the process.
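
    The abstract does not give the exact formula, but a minimal sketch of a fitness term built from role cohesion and coupling (hypothetical definitions, not the paper's metric) could look like the following, where handovers between activities of the same role raise cohesion and cross-role handovers raise coupling.

        # Hypothetical role-complexity fitness sketch; the paper's exact metric differs.
        def role_complexity_fitness(roles, handovers):
            """roles: dict role -> set of activities; handovers: list of (activity_a, activity_b)."""
            activity_to_role = {a: r for r, acts in roles.items() for a in acts}
            within = sum(1 for a, b in handovers
                         if activity_to_role[a] == activity_to_role[b])
            cohesion = within / len(handovers) if handovers else 1.0  # high is good
            coupling = 1.0 - cohesion                                 # high is bad
            return cohesion - coupling  # simpler role structure -> higher fitness

        roles = {"clerk": {"register", "check"}, "manager": {"approve"}}
        handovers = [("register", "check"), ("check", "approve")]
        print(role_complexity_fitness(roles, handovers))  # -> 0.0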

  17. Outline of cost-benefit analysis and a case study

    NASA Technical Reports Server (NTRS)

    Kellizy, A.

    1978-01-01

    The methodology of cost-benefit analysis is reviewed and a case study involving solar cell technology is presented. Emphasis is placed on simplifying the technique in order to permit a technical person not trained in economics to undertake a cost-benefit study comparing alternative approaches to a given problem. The role of economic analysis in management decision making is discussed. In simplifying the methodology it was necessary to restrict the scope and applicability of this report. Additional considerations and constraints are outlined. Examples are worked out to demonstrate the principles. A computer program which performs the computational aspects appears in the appendix.

  18. Simplified analytical model and balanced design approach for light-weight wood-based structural panel in bending

    Treesearch

    Jinghao Li; John F. Hunt; Shaoqin Gong; Zhiyong Cai

    2016-01-01

    This paper presents a simplified analytical model and balanced design approach for modeling lightweight wood-based structural panels in bending. Because many design parameters are required to input for the model of finite element analysis (FEA) during the preliminary design process and optimization, the equivalent method was developed to analyze the mechanical...

  19. Application of threshold concepts to ecological management problems: occupancy of Golden Eagles in Denali National Park, Alaska: Chapter 5

    USGS Publications Warehouse

    Eaton, Mitchell J.; Martin, Julien; Nichols, James D.; McIntyre, Carol; McCluskie, Maggie C.; Schmutz, Joel A.; Lubow, Bruce L.; Runge, Michael C.; Edited by Guntenspergen, Glenn R.

    2014-01-01

    In this chapter, we demonstrate the application of the various classes of thresholds, detailed in earlier chapters and elsewhere, via an actual but simplified natural resource management case study. We intend our example to provide the reader with the ability to recognize and apply the theoretical concepts of utility, ecological and decision thresholds to management problems through a formalized decision-analytic process. Our case study concerns the management of human recreational activities in Alaska’s Denali National Park, USA, and the possible impacts of such activities on nesting Golden Eagles, Aquila chrysaetos. Managers desire to allow visitors the greatest amount of access to park lands, provided that eagle nesting-site occupancy is maintained at a level determined to be acceptable by the managers themselves. As these two management objectives are potentially at odds, we treat minimum desired occupancy level as a utility threshold which, then, serves to guide the selection of annual management alternatives in the decision process. As human disturbance is not the only factor influencing eagle occupancy, we model nesting-site dynamics as a function of both disturbance and prey availability. We incorporate uncertainty in these dynamics by considering several hypotheses, including a hypothesis that site occupancy is affected only at a threshold level of prey abundance (i.e., an ecological threshold effect). By considering competing management objectives and accounting for two forms of thresholds in the decision process, we are able to determine the optimal number of annual nesting-site restrictions that will produce the greatest long-term benefits for both eagles and humans. Setting a utility threshold of 75 occupied sites, out of a total of 90 potential nesting sites, the optimization specified a decision threshold at approximately 80 occupied sites. At the point that current occupancy falls below 80 sites, the recommended decision is to begin restricting access to humans; above this level, it is recommended that all eagle territories be opened to human recreation. We evaluated the sensitivity of the decision threshold to uncertainty in system dynamics and to management objectives (i.e., to the utility threshold).
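
    A minimal sketch of the resulting decision rule is shown below; the occupancy numbers come from the abstract, while the underlying dynamic optimization is not reproduced.

        # Decision rule implied by the abstract: decision threshold of about 80 occupied
        # sites (out of 90 potential sites), against a utility threshold of 75 sites.
        DECISION_THRESHOLD = 80

        def management_action(occupied_sites):
            """Return the recommended annual action given current nesting-site occupancy."""
            if occupied_sites < DECISION_THRESHOLD:
                return "restrict human access near nesting sites"
            return "open all eagle territories to human recreation"

        print(management_action(78))  # restrict human access near nesting sites
        print(management_action(85))  # open all eagle territories to human recreation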

  20. Simplified Interval Observer Scheme: A New Approach for Fault Diagnosis in Instruments

    PubMed Central

    Martínez-Sibaja, Albino; Astorga-Zaragoza, Carlos M.; Alvarado-Lassman, Alejandro; Posada-Gómez, Rubén; Aguila-Rodríguez, Gerardo; Rodríguez-Jarquin, José P.; Adam-Medina, Manuel

    2011-01-01

    There are different observer-based schemes for detecting and isolating faults in dynamic processes. In the case of fault diagnosis in instruments (FDI), the diagnosis schemes differ in the number of observers: the Simplified Observer Scheme (SOS) requires only one observer, uses all the inputs and only one output, and detects faults in a single sensor; the Dedicated Observer Scheme (DOS) again uses all the inputs and just one output per observer, but employs a bank of observers capable of locating multiple sensor faults; and the Generalized Observer Scheme (GOS) involves a reduced bank of observers, where each observer uses all the inputs and m-1 outputs, and allows the localization of single faults. This work proposes a new scheme, the Simplified Interval Observer (SIOS-FDI), which does not require the measurement of any input and, with just one output, allows the detection of single sensor faults. Because it requires no inputs, it greatly simplifies fault diagnosis in processes in which it is difficult to measure all the inputs, as in the case of biological reactors. PMID:22346593
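
    As a generic illustration only (not the paper's interval observer equations), observer-based FDI schemes of this kind flag a sensor fault when the residual between the measured output and the observer's estimate leaves a tolerance band; the signals and threshold below are hypothetical.

        # Generic residual-based fault detection sketch; values are illustrative.
        def detect_sensor_fault(measured, estimated, threshold):
            """Flag time steps where |measurement - observer estimate| exceeds the threshold."""
            return [t for t, (y, y_hat) in enumerate(zip(measured, estimated))
                    if abs(y - y_hat) > threshold]

        measured  = [1.00, 1.02, 1.50, 1.01]
        estimated = [1.00, 1.01, 1.02, 1.00]
        print(detect_sensor_fault(measured, estimated, threshold=0.2))  # -> [2]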

  1. A Fast Procedure for Optimizing Thermal Protection Systems of Re-Entry Vehicles

    NASA Astrophysics Data System (ADS)

    Ferraiuolo, M.; Riccio, A.; Tescione, D.; Gigliotti, M.

    The aim of the present work is to introduce a fast procedure to optimize thermal protection systems for re-entry vehicles subjected to high thermal loads. A simplified one-dimensional optimization process, performed in order to find the optimum design variables (lengths, sections etc.), is the first step of the proposed design procedure. Simultaneously, the most suitable materials able to sustain high temperatures and meeting the weight requirements are selected and positioned within the design layout. In this stage of the design procedure, simplified (generalized plane strain) FEM models are used when boundary and geometrical conditions allow the reduction of the degrees of freedom. Those simplified local FEM models can be useful because they are time-saving and very simple to build; they are essentially one dimensional and can be used for optimization processes in order to determine the optimum configuration with regard to weight, temperature and stresses. A triple-layer and a double-layer body, subjected to the same aero-thermal loads, have been optimized to minimize the overall weight. Full two and three-dimensional analyses are performed in order to validate those simplified models. Thermal-structural analyses and optimizations are executed by adopting the Ansys FEM code.

  2. A New Strategy in Observer Modeling for Greenhouse Cucumber Seedling Growth

    PubMed Central

    Qiu, Quan; Zheng, Chenfei; Wang, Wenping; Qiao, Xiaojun; Bai, He; Yu, Jingquan; Shi, Kai

    2017-01-01

    State observer is an essential component in computerized control loops for greenhouse-crop systems. However, the current accomplishments of observer modeling for greenhouse-crop systems mainly focus on mass/energy balance, ignoring physiological responses of crops. As a result, state observers for crop physiological responses are rarely developed, and control operations are typically made based on experience rather than actual crop requirements. In addition, existing observer models require a large number of parameters, leading to heavy computational load and poor application feasibility. To address these problems, we present a new state observer modeling strategy that takes both environmental information and crop physiological responses into consideration during the observer modeling process. Using greenhouse cucumber seedlings as an example, we sample 10 physiological parameters of cucumber seedlings at different time points during the exponential growth stage, and employ them to build growth state observers together with 8 environmental parameters. Support vector machine (SVM) acts as the mathematical tool for observer modeling. Canonical correlation analysis (CCA) is used to select the dominant environmental and physiological parameters in the modeling process. With the dominant parameters, simplified observer models are built and tested. We conduct contrast experiments with different input parameter combinations on simplified and un-simplified observers. Experimental results indicate that physiological information can improve the prediction accuracies of the growth state observers. Furthermore, the simplified observer models can give equivalent or even better performance than the un-simplified ones, which verifies the feasibility of CCA. The current study can enable state observers to reflect crop requirements and make them feasible for applications with simplified shapes, which is significant for developing intelligent greenhouse control systems for modern greenhouse production. PMID:28848565

  3. A New Strategy in Observer Modeling for Greenhouse Cucumber Seedling Growth.

    PubMed

    Qiu, Quan; Zheng, Chenfei; Wang, Wenping; Qiao, Xiaojun; Bai, He; Yu, Jingquan; Shi, Kai

    2017-01-01

    State observer is an essential component in computerized control loops for greenhouse-crop systems. However, the current accomplishments of observer modeling for greenhouse-crop systems mainly focus on mass/energy balance, ignoring physiological responses of crops. As a result, state observers for crop physiological responses are rarely developed, and control operations are typically made based on experience rather than actual crop requirements. In addition, existing observer models require a large number of parameters, leading to heavy computational load and poor application feasibility. To address these problems, we present a new state observer modeling strategy that takes both environmental information and crop physiological responses into consideration during the observer modeling process. Using greenhouse cucumber seedlings as an example, we sample 10 physiological parameters of cucumber seedlings at different time points during the exponential growth stage, and employ them to build growth state observers together with 8 environmental parameters. Support vector machine (SVM) acts as the mathematical tool for observer modeling. Canonical correlation analysis (CCA) is used to select the dominant environmental and physiological parameters in the modeling process. With the dominant parameters, simplified observer models are built and tested. We conduct contrast experiments with different input parameter combinations on simplified and un-simplified observers. Experimental results indicate that physiological information can improve the prediction accuracies of the growth state observers. Furthermore, the simplified observer models can give equivalent or even better performance than the un-simplified ones, which verifies the feasibility of CCA. The current study can enable state observers to reflect crop requirements and make them feasible for applications with simplified shapes, which is significant for developing intelligent greenhouse control systems for modern greenhouse production.
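
    A minimal sketch of the modeling pipeline described above is shown below, using scikit-learn as a stand-in; the data are random placeholders, and the paper's kernel choices and CCA-based parameter selection are not reproduced.

        # Sketch: CCA to extract dominant variates, then an SVM regressor as the observer.
        import numpy as np
        from sklearn.cross_decomposition import CCA
        from sklearn.svm import SVR

        rng = np.random.default_rng(0)
        env = rng.normal(size=(100, 8))    # 8 environmental parameters (placeholder data)
        phys = rng.normal(size=(100, 10))  # 10 physiological parameters (placeholder data)
        growth = env[:, 0] + phys[:, 0] + 0.1 * rng.normal(size=100)  # synthetic growth state

        # Canonical correlation analysis between the environmental and physiological blocks.
        cca = CCA(n_components=2).fit(env, phys)
        env_c, phys_c = cca.transform(env, phys)

        # Simplified observer: SVM regression on the dominant canonical variates only.
        observer = SVR(kernel="rbf").fit(np.hstack([env_c, phys_c]), growth)
        print(observer.predict(np.hstack([env_c, phys_c]))[:3])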

  4. The lean service machine.

    PubMed

    Swank, Cynthia Karen

    2003-10-01

    Jefferson Pilot Financial (JPF), a life insurance and annuities firm, was, like many U.S. service companies at the end of the 1990s, looking for new ways to grow. Its top managers recognized that JPF needed to differentiate itself in the eyes of its customers, the independent life-insurance advisers who sell and service policies. To establish itself as these advisers' preferred partner, it set out to reduce the turnaround time on policy applications, simplify the submission process, and reduce errors. JPF's managers looked to the "lean production" practices that U.S. manufacturers adopted in response to competition from Japanese companies. Lean production is built around the concept of continuous-flow processing--a departure from traditional production systems, in which large batches are processed at each step. JPF appointed a "lean team" to reengineer its New Business unit's operations, beginning with the creation of a "model cell"--a fully functioning microcosm of JPF's entire process. This approach allowed managers to experiment and smooth out the kinks while working toward an optimal design. The team applied lean-manufacturing practices, including placing linked processes near one another, balancing employees' workloads, posting performance results, and measuring performance and productivity from the customer's perspective. Customer-focused metrics helped erode the employees' "My work is all that matters" mind-set. The results were so impressive that JPF is rolling out similar systems across many of its operations. To convince employees of the value of lean production, the lean team introduced a simulation in which teams compete to build the best paper airplane based on invented customer specifications. This game drives home lean production's basic principles, establishing a foundation for deep and far-reaching changes in the production system.

  5. 77 FR 20632 - Information Collection Being Submitted to the Office of Management and Budget for Review and...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-04-05

    ... Digital Devices. (a) The Declaration of Conformity equipment authorization procedure, 47 CFR 2.1071... simplified filing and reporting procedure for authorizing equipment for marketing. (d) Finally, testing and...

  6. 48 CFR 742.1170-2 - Applicability.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... Section 742.1170-2 Federal Acquisition Regulations System AGENCY FOR INTERNATIONAL DEVELOPMENT CONTRACT MANAGEMENT CONTRACT ADMINISTRATION Production, Surveillance, and Reporting 742.1170-2 Applicability. (a) This section applies to USAID non-personal, professional/technical services contracts exceeding the simplified...

  7. 48 CFR 1513.507 - Clauses.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... automatic data processing equipment, word processing, and similar types of commercially available equipment... CONTRACT TYPES SIMPLIFIED ACQUISITION PROCEDURES Purchase Orders 1513.507 Clauses. (a) It is the general...

  8. Adapting land management to emergence of novel site conditions on the continental lowlands of SE Europe

    NASA Astrophysics Data System (ADS)

    Mátyás, Csaba; Berki, Imre; Bidlo, Andras; Czimber, Kornel.; Gálos, Borbala; Gribovszki, Zoltan; Lakatos, Ferenc; Borovics, Attila; Csóka, György; Führer, Ernő; Illés, Gábor; Rasztovits, Ervin; Somogyi, Zoltán; Bartholy, Judit

    2017-04-01

    The rapid change in site potential, caused by the shift of climate zones, is a serious problem for lowland management in Southeast Europe. In forestry, the resilience potential of the main climate-dependent tree species (e.g. spruce, beech, sessile oak) and ecosystems is limited at their lower (xeric) limits of distribution. A conventional mitigation measure for adaptive forest management is the return to close-to-nature management. Severe drought and biotic impacts in forests, however, indicate the urgency of fundamental changes in forest policy. Providing assistance in selecting climate-tolerant provenances, species and adaptive technologies for future site conditions is therefore critical. A simplified Decision Support System has been developed for Hungary, keeping conventional elements of site potential assessment. Projections are specified for discrete site types. Processing forest inventory, land cover and geodata, the System provides GIS-supported site information and projections for individual forest compartments, options for tree species that better tolerate future climate scenarios, and their expected yield and risks. Data and projections are available for recent and current conditions and for future reference periods until 2100. Non-forest site conditions in the novel grassland (steppe) climate zone also appear in the projections. Experience with proper management of such sites is, however, scarce.

  9. A bi-level environmental impact assessment framework for comparing construction and demolition waste management strategies.

    PubMed

    Yazdanbakhsh, Ardavan

    2018-04-27

    Several pioneering life cycle assessment (LCA) studies have been conducted in the past to assess the environmental impact of specific methods for managing mineral construction and demolition waste (MCDW), such as recycling the waste for use in concrete. Those studies focus on comparing the use of recycled MCDW and that of virgin components to produce materials or systems that serve specified functions. Often, the approaches adopted by the studies do not account for the potential environmental consequence of avoiding the existing or alternative waste management practices. The present work focuses on how product systems need to be defined in recycling LCA studies and what processes need to be within the system boundaries. A bi-level LCA framework is presented for modelling alternative waste management approaches in which the impacts are measured and compared at two scales of strategy and decision-making. Different functional units are defined for each level, all of which correspond to the same flow of MCDW in a cascade of product systems. For the sole purpose of demonstrating how the framework is implemented an illustrative example is presented, based on real data and a number of simplifying assumptions, which compares the impacts of a number of potential MCDW management strategies in New York City. Copyright © 2018 Elsevier Ltd. All rights reserved.

  10. Natural-Language Parser for PBEM

    NASA Technical Reports Server (NTRS)

    James, Mark

    2010-01-01

    A computer program called "Hunter" accepts, as input, a colloquial-English description of a set of policy-based-management rules, and parses that description into a form useable by policy-based enterprise management (PBEM) software. PBEM is a rules-based approach suitable for automating some management tasks. PBEM simplifies the management of a given enterprise through establishment of policies addressing situations that are likely to occur. Hunter was developed to have a unique capability to extract the intended meaning instead of focusing on parsing the exact ways in which individual words are used.

  11. A simplified boron diffusion for preparing the silicon single crystal p-n junction as an educational device

    NASA Astrophysics Data System (ADS)

    Shiota, Koki; Kai, Kazuho; Nagaoka, Shiro; Tsuji, Takuto; Wakahara, Akihiro; Rusop, Mohamad

    2016-07-01

    An educational method that includes designing, making, and evaluating actual semiconductor devices while learning the theory is one of the best ways to gain a fundamental understanding of device physics and to cultivate the ability to generate original ideas using knowledge of semiconductor devices. In this paper, a simplified boron thermal diffusion process using a sol-gel material under a normal air environment is proposed based on a simple hypothesis, and its reproducibility and reliability were investigated in order to simplify the diffusion process for making educational devices such as p-n junctions, bipolar and pMOS devices. As a result, this method successfully produced a p+ region on the surface of n-type silicon substrates with good reproducibility, and good rectification properties of the p-n junctions were obtained. This result indicates that the process could be applied to making pMOS or bipolar transistors, and it suggests a variety of possible applications in the educational field to foster the imagination of new devices.

  12. Application of statistical mining in healthcare data management for allergic diseases

    NASA Astrophysics Data System (ADS)

    Wawrzyniak, Zbigniew M.; Martínez Santolaya, Sara

    2014-11-01

    The paper discusses data mining techniques based on statistical tools for medical data management in the case of long-term diseases. The data collected from a population survey are the source for reasoning, for identifying the disease processes responsible for a patient's illness and its symptoms, and for prescribing knowledge and decisions on the course of action to correct the patient's condition. The case considered, as an example of a constructive approach to data management, is the dependence of chronic allergic diseases on certain symptoms and environmental conditions. The knowledge summarized in a systematic way as accumulated experience constitutes an experiential, simplified model of the diseases, with a feature space constructed from a small set of indicators. We present a disease-symptom-opinion model with knowledge discovery for data management in healthcare. Notably, the model is purely data-driven, evaluating knowledge of disease processes and the probability dependence of future disease events on symptoms and other attributes. The example, based on the outcomes of a survey of long-term (chronic) disease, shows that a small set of core indicators, such as four or more symptoms and opinions, can be very helpful in reflecting changes in health status over the course of the disease. Furthermore, a data-driven understanding of disease mechanisms gives physicians a basis for treatment choices, which underlines the need for data governance in this research domain of knowledge discovered from surveys.
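
    As an illustration only (the survey data and the paper's model are not reproduced; the records and field names below are hypothetical), a data-driven estimate of the probability of a disease event given a small set of symptom indicators can be sketched as follows.

        # Illustrative sketch: estimate P(disease | symptom pattern) from survey records.
        records = [
            {"sneezing": 1, "itchy_eyes": 1, "pet_at_home": 0, "allergy": 1},
            {"sneezing": 1, "itchy_eyes": 0, "pet_at_home": 1, "allergy": 0},
            {"sneezing": 0, "itchy_eyes": 0, "pet_at_home": 0, "allergy": 0},
            {"sneezing": 1, "itchy_eyes": 1, "pet_at_home": 1, "allergy": 1},
        ]

        def p_disease_given(indicators, records, disease="allergy"):
            """Relative frequency of the disease among records matching the indicator pattern."""
            matching = [r for r in records if all(r[k] == v for k, v in indicators.items())]
            if not matching:
                return None  # no evidence for this symptom pattern
            return sum(r[disease] for r in matching) / len(matching)

        print(p_disease_given({"sneezing": 1, "itchy_eyes": 1}, records))  # -> 1.0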

  13. A simplified conjoint recognition paradigm for the measurement of gist and verbatim memory.

    PubMed

    Stahl, Christoph; Klauer, Karl Christoph

    2008-05-01

    The distinction between verbatim and gist memory traces has furthered the understanding of numerous phenomena in various fields, such as false memory research, research on reasoning and decision making, and cognitive development. To measure verbatim and gist memory empirically, an experimental paradigm and multinomial measurement model has been proposed but rarely applied. In the present article, a simplified conjoint recognition paradigm and multinomial model is introduced and validated as a measurement tool for the separate assessment of verbatim and gist memory processes. A Bayesian metacognitive framework is applied to validate guessing processes. Extensions of the model toward incorporating the processes of phantom recollection and erroneous recollection rejection are discussed.

  14. Review of Qualitative Approaches for the Construction Industry: Designing a Risk Management Toolbox

    PubMed Central

    Spee, Ton; Gillen, Matt; Lentz, Thomas J.; Garrod, Andrew; Evans, Paul; Swuste, Paul

    2011-01-01

    Objectives This paper presents the framework and protocol design for a construction industry risk management toolbox. The construction industry needs a comprehensive, systematic approach to assess and control occupational risks. These risks span several professional health and safety disciplines, emphasized by multiple international occupational research agenda projects including: falls, electrocution, noise, silica, welding fumes, and musculoskeletal disorders. Yet, the International Social Security Association says, "whereas progress has been made in safety and health, the construction industry is still a high risk sector." Methods Small- and medium-sized enterprises (SMEs) employ about 80% of the world's construction workers. In recent years a strategy for qualitative occupational risk management, known as Control Banding (CB) has gained international attention as a simplified approach for reducing work-related risks. CB groups hazards into stratified risk 'bands', identifying commensurate controls to reduce the level of risk and promote worker health and safety. We review these qualitative solutions-based approaches and identify strengths and weaknesses toward designing a simplified CB 'toolbox' approach for use by SMEs in construction trades. Results This toolbox design proposal includes international input on multidisciplinary approaches for performing a qualitative risk assessment determining a risk 'band' for a given project. Risk bands are used to identify the appropriate level of training to oversee construction work, leading to commensurate and appropriate control methods to perform the work safely. Conclusion The Construction Toolbox presents a review-generated format to harness multiple solutions-based national programs and publications for controlling construction-related risks with simplified approaches across the occupational safety, health and hygiene professions. PMID:22953194

  15. Review of qualitative approaches for the construction industry: designing a risk management toolbox.

    PubMed

    Zalk, David M; Spee, Ton; Gillen, Matt; Lentz, Thomas J; Garrod, Andrew; Evans, Paul; Swuste, Paul

    2011-06-01

    This paper presents the framework and protocol design for a construction industry risk management toolbox. The construction industry needs a comprehensive, systematic approach to assess and control occupational risks. These risks span several professional health and safety disciplines, emphasized by multiple international occupational research agenda projects including: falls, electrocution, noise, silica, welding fumes, and musculoskeletal disorders. Yet, the International Social Security Association says, "whereas progress has been made in safety and health, the construction industry is still a high risk sector." Small- and medium-sized enterprises (SMEs) employ about 80% of the world's construction workers. In recent years a strategy for qualitative occupational risk management, known as Control Banding (CB) has gained international attention as a simplified approach for reducing work-related risks. CB groups hazards into stratified risk 'bands', identifying commensurate controls to reduce the level of risk and promote worker health and safety. We review these qualitative solutions-based approaches and identify strengths and weaknesses toward designing a simplified CB 'toolbox' approach for use by SMEs in construction trades. This toolbox design proposal includes international input on multidisciplinary approaches for performing a qualitative risk assessment determining a risk 'band' for a given project. Risk bands are used to identify the appropriate level of training to oversee construction work, leading to commensurate and appropriate control methods to perform the work safely. The Construction Toolbox presents a review-generated format to harness multiple solutions-based national programs and publications for controlling construction-related risks with simplified approaches across the occupational safety, health and hygiene professions.

  16. Expanded Guidance for NASA Systems Engineering. Volume 2: Crosscutting Topics, Special Topics, and Appendices

    NASA Technical Reports Server (NTRS)

    Hirshorn, Steven R.

    2017-01-01

    Historically, most successful NASA projects have depended on effectively blending project management, systems engineering, and technical expertise among NASA, contractors, and third parties. Underlying these successes are a variety of agreements (e.g., contract, memorandum of understanding, grant, cooperative agreement) between NASA organizations or between NASA and other Government agencies, Government organizations, companies, universities, research laboratories, and so on. To simplify the discussions, the term "contract" is used to encompass these agreements. This section focuses on the NASA systems engineering activities pertinent to awarding a contract, managing contract performance, and completing a contract. In particular, NASA systems engineering interfaces to the procurement process are covered, since the NASA engineering technical team plays a key role in the development and evaluation of contract documentation. Contractors and third parties perform activities that supplement (or substitute for) the NASA project technical team accomplishment of the NASA common systems engineering technical process activities and requirements outlined in this guide. Since contractors might be involved in any part of the systems engineering life cycle, the NASA project technical team needs to know how to prepare for, allocate or perform, and implement surveillance of technical activities that are allocated to contractors.

  17. Multiple diseases impact survival of pine species planted in red spine stands harvested in spatially variable retention patterns

    Treesearch

    M.E. Ostry; M.J. Moore; C.C. Kern; R.C. Venette; B.J. Palik

    2012-01-01

    Increasing the diversity of species and structure of red pine (Pinus resinosa) is often a management goal in stands simplified by practices such as fire suppression and plantation management in many areas of the Great Lakes Region. One approach to diversification is to convert predominantly even-aged, pure red pine stands to multi-cohort, mixed-...

  18. Git Replacement for the

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Robinson, P.

    2014-09-23

    GRAPE is a tool for managing software project workflows for the Git version control system. It provides a suite of tools to simplify and configure branch based development, integration with a project's testing suite, and integration with the Atlassian Stash repository hosting tool.

  19. A Simplified Decision Support Approach for Evaluating Wetlands Ecosystem Services NABS11

    EPA Science Inventory

    State-level managers and environmental advocates often must justify their restoration actions in terms of tangible beneficial outcomes. Wetlands functional assessment tools (e.g, Wetland Evaluation Technique (WET), Habitat Evaluation Procedures (HEP), Hydrogeomorphic Method (HGM)...

  20. The people side of MRP (materiel requirements planning).

    PubMed

    Lunn, T

    1994-05-01

    A montage of ideas and concepts has been successfully used to train and motivate people to use MRP II systems more effectively. This is important today because many companies are striving to achieve World Class Manufacturing status. Closed-loop Materiel Requirements Planning (MRP) systems are an integral part of the process of continuous improvement. Successfully using a formal management planning system, such as MRP II, is a fundamental stepping stone on the path toward World Class Excellence. Included in this article are techniques that companies use to reduce lead time, simplify bills of materiel, and improve schedule adherence. These and other steps all depend on the people who use the system. The focus will be on how companies use the MRP tool more effectively.

  1. Transportable Applications Environment (TAE) Plus - A NASA productivity tool used to develop graphical user interfaces

    NASA Technical Reports Server (NTRS)

    Szczur, Martha R.

    1991-01-01

    The Transportable Applications Environment (TAE) Plus, developed at NASA's Goddard Space Flight Center, is an advanced portable user interface development environment which simplifies the process of creating and managing complex application graphical user interfaces (GUIs), supports prototyping, allows applications to be ported easily between different platforms, and encourages appropriate levels of user interface consistency between applications. This paper discusses the capabilities of the TAE Plus tool, and how it makes the job of designing and developing GUIs easier for the application developers. The paper also explains how tools like TAE Plus provide for reusability and ensure reliability of UI software components, as well as how they aid in the reduction of development and maintenance costs.

  2. A simplified computational memory model from information processing.

    PubMed

    Zhang, Lanhua; Zhang, Dongsheng; Deng, Yuqin; Ding, Xiaoqian; Wang, Yan; Tang, Yiyuan; Sun, Baoliang

    2016-11-23

    This paper proposes a computational model of memory from the viewpoint of information processing. The model, called the simplified memory information retrieval network (SMIRN), is a bi-modular hierarchical functional memory network obtained by abstracting memory function and simulating memory information processing. First, meta-memory is defined to represent neurons or brain cortices based on biology and graph theory; an intra-modular network is then developed with a modeling algorithm that maps nodes and edges, after which the bi-modular network is delineated with intra-modular and inter-modular connections. Finally, a polynomial retrieval algorithm is introduced. We simulate the memory phenomena and the functions of memorization and strengthening using information processing algorithms. The theoretical analysis and the simulation results show that the model is consistent with memory phenomena from an information processing viewpoint.

  3. A Methodology and a Web Platform for the Collaborative Development of Context-Aware Systems

    PubMed Central

    Martín, David; López-de-Ipiña, Diego; Alzua-Sorzabal, Aurkene; Lamsfus, Carlos; Torres-Manzanera, Emilio

    2013-01-01

    Information and services personalization is essential for an optimal user experience. Systems have to be able to acquire data about the user's context, process them in order to identify the user's situation and finally, adapt the functionality of the system to that situation, but the development of context-aware systems is complex. Data coming from distributed and heterogeneous sources have to be acquired, processed and managed. Several programming frameworks have been proposed in order to simplify the development of context-aware systems. These frameworks offer high-level application programming interfaces for programmers that complicate the involvement of domain experts in the development life-cycle. The participation of users that do not have programming skills but are experts in the application domain can speed up and improve the development process of these kinds of systems. Apart from that, there is a lack of methodologies to guide the development process. This article presents as main contributions, the implementation and evaluation of a web platform and a methodology to collaboratively develop context-aware systems by programmers and domain experts. PMID:23666131

  4. Modified Off-Midline Closure of Pilonidal Sinus Disease

    PubMed Central

    Saber, Aly

    2014-01-01

    Background: Numerous surgical procedures have been described for pilonidal sinus disease, but treatment failure and disease recurrence are frequent. Conventional off-midline flap closures have relatively favorable surgical outcomes, but relatively unfavorable cosmetic outcomes. Aim: The author reports outcomes of a new simplified off-midline technique for closure of the defect after complete excision of the sinus tracts. Patients and Methods: Two hundred patients of both sexes were enrolled; modified D-shaped excisions were used to include all sinuses and their ramifications, with a simplified procedure to close the defect. Results: The overall wound infection rate was 12% (12.2% for males and 11.1% for females). Wound disruption necessitated laying the whole wound open and managing it as an open wound. The overall wound disruption rate was 6% (6.1% for males and 5.5% for females), and the overall recurrence rate was 7%. Conclusion: Our simplified off-midline closure without a flap appeared to be comparable to conventional off-midline closure with a flap in terms of wound infection, wound dehiscence, and recurrence. Advantages of the simplified procedure include potentially reduced surgical complexity, reduced operating time, and improved cosmetic outcome. PMID:24926445

  5. Efficiency of energy recovery from municipal solid waste and the resultant effect on the greenhouse gas balance.

    PubMed

    Gohlke, Oliver

    2009-11-01

    Global warming is a focus of political interest and life-cycle assessment of waste management systems reveals that energy recovery from municipal solid waste is a key issue. This paper demonstrates how the greenhouse gas effects of waste treatment processes can be described in a simplified manner by considering energy efficiency indicators. For evaluation to be consistent, it is necessary to use reasonable system boundaries and to take the generation of electricity and the use of heat into account. The new European R1 efficiency criterion will lead to the development and implementation of optimized processes/systems with increased energy efficiency which, in turn, will exert an influence on the greenhouse gas effects of waste management in Europe. Promising technologies are: the increase of steam parameters, reduction of in-plant energy consumption, and the combined use of heat and power. Plants in Brescia and Amsterdam are current examples of good performance with highly efficient electricity generation. Other examples of particularly high heat recovery rates are the energy-from-waste (EfW) plants in Malmö and Gothenburg. To achieve the full potential of greenhouse gas reduction in waste management, it is necessary to avoid landfilling combustible wastes, for example, by means of landfill taxes and by putting incentives in place for increasing the efficiency of EfW systems.
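
    For orientation, the R1 criterion referred to above is defined in Annex II of the EU Waste Framework Directive (2008/98/EC) roughly as sketched below; the plant figures used here are purely illustrative.

        # Sketch of the EU R1 energy-efficiency formula for energy-from-waste plants.
        def r1_efficiency(ep_electricity, ep_heat, ef, ew, ei):
            """All energies in GJ/year.

            ep_electricity: electricity produced (weighted by 2.6)
            ep_heat:        heat produced for commercial use (weighted by 1.1)
            ef: energy input from fuels contributing to steam production
            ew: energy contained in the treated waste (net calorific value)
            ei: imported energy, excluding ew and ef
            """
            ep = 2.6 * ep_electricity + 1.1 * ep_heat
            return (ep - (ef + ei)) / (0.97 * (ew + ef))

        # Plants must typically reach R1 >= 0.60 (pre-2009 permits) or 0.65 (newer plants).
        print(round(r1_efficiency(ep_electricity=150_000, ep_heat=200_000,
                                  ef=20_000, ew=900_000, ei=10_000), 3))  # -> 0.65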

  6. Interpreting drinking water quality in the distribution system using Dempster-Shafer theory of evidence.

    PubMed

    Sadiq, Rehan; Rodriguez, Manuel J

    2005-04-01

    Interpreting water quality data routinely generated for control and monitoring purposes in water distribution systems is a complicated task for utility managers. In fact, data for diverse water quality indicators (physico-chemical and microbiological) are generated at different times and at different locations in the distribution system. To simplify and improve the understanding and the interpretation of water quality, methodologies for aggregation and fusion of data must be developed. In this paper, the Dempster-Shafer theory also called theory of evidence is introduced as a potential methodology for interpreting water quality data. The conceptual basis of this methodology and the process for its implementation are presented by two applications. The first application deals with the interpretation of spatial water quality data fusion, while the second application deals with the development of water quality index based on key monitored indicators. Based on the obtained results, the authors discuss the potential contribution of theory of evidence as a decision-making tool for water quality management.
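
    To make the aggregation step concrete, the sketch below applies Dempster's rule of combination to two mass functions over simple water-quality states; the indicator names and mass values are hypothetical, not taken from the paper.

        # Dempster's rule of combination for two basic probability assignments.
        from itertools import product

        def combine(m1, m2):
            """Combine two mass functions whose focal elements are frozensets of states."""
            combined, conflict = {}, 0.0
            for (a, ma), (b, mb) in product(m1.items(), m2.items()):
                inter = a & b
                if inter:
                    combined[inter] = combined.get(inter, 0.0) + ma * mb
                else:
                    conflict += ma * mb
            return {s: v / (1.0 - conflict) for s, v in combined.items()}

        GOOD, POOR = frozenset({"good"}), frozenset({"poor"})
        BOTH = GOOD | POOR
        m_turbidity = {GOOD: 0.6, POOR: 0.1, BOTH: 0.3}  # evidence from one indicator
        m_coliforms = {GOOD: 0.5, POOR: 0.2, BOTH: 0.3}  # evidence from another indicator
        print(combine(m_turbidity, m_coliforms))  # belief concentrates on "good"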

  7. Landowner Satisfaction with the Wetland Reserve Program in Texas: A Mixed-Methods Analysis

    NASA Astrophysics Data System (ADS)

    Stroman, Dianne; Kreuter, Urs P.

    2016-01-01

    Using mail survey data and telephone interviews, we report on landowner satisfaction with permanent easements held by the Natural Resources Conservation Service (NRCS) throughout Texas. This study found that landowners were dissatisfied with the NRCS Wetland Reserve Program (WRP), conflicting with results of previous studies. The objective of this study was to explore specific reasons for frustration expressed by landowners with the program. We found three predominant themes underpinning program dissatisfaction: (1) upfront restoration failures, (2) overly restrictive easement constraints, and (3) bureaucratic hurdles limiting landowners' ability to conduct adaptive management on their easement property. The implications of this study suggest that attitudes of landowners participating in the WRP may limit the long-term effectiveness of this program. Suggestions for improving the program include implementing timely, ecologically sound restoration procedures and streamlining and simplifying the approval process for management activity requests. In addition, the NRCS should consider revising WRP restriction guidelines in order to provide more balance between protection goals and landowner autonomy.

  8. Archive Management of NASA Earth Observation Data to Support Cloud Analysis

    NASA Technical Reports Server (NTRS)

    Lynnes, Christopher; Baynes, Kathleen; McInerney, Mark

    2017-01-01

    NASA collects, processes and distributes petabytes of Earth Observation (EO) data from satellites, aircraft, in situ instruments and model output, with an order of magnitude increase expected by 2024. Cloud-based web object storage (WOS) of these data can simplify the execution of such an increase. More importantly, it can also facilitate user analysis of those volumes by making the data available to the massively parallel computing power in the cloud. However, storing EO data in cloud WOS has a ripple effect throughout the NASA archive system with unexpected challenges and opportunities. One challenge is modifying data servicing software (such as Web Coverage Service servers) to access and subset data that are no longer on a directly accessible file system, but rather in cloud WOS. Opportunities include refactoring of the archive software to a cloud-native architecture; virtualizing data products by computing on demand; and reorganizing data to be more analysis-friendly. Reviewed by Mark McInerney ESDIS Deputy Project Manager.

  9. A Guided, Conservative Approach for the Management of Localized Mandibular Anterior Tooth Wear.

    PubMed

    Mehta, Shamir B; Francis, Selar; Banerji, Subir

    2016-03-01

    The successful management of the worn mandibular anterior dentition may present an awkward challenge to the dental operator. The purpose of this article is to describe a case report illustrating the use of a guided, three-dimensional protocol for the ultra-conservative and predictable restoration of the worn lower anterior dentition using direct resin composite. This technique utilizes information based on established biomechanical and occlusal principles to fabricate a diagnostic wax-up, which is duplicated in dental stone. This is used to prepare a vacuum-formed modified stent, assisting the clinician in placing directly bonded resin composite restorations to restore the worn lower anterior dentition. The technique, described in 2012 and referred to as 'injection moulding', has the potential to offer optimal form, function and an aesthetic outcome in an efficient manner. CPD/Clinical Relevance: This article aims to describe an alternative technique to simplify the processes involved with restoration of worn lower anterior teeth.

  10. Possibility-induced simplified neutrosophic aggregation operators and their application to multi-criteria group decision-making

    NASA Astrophysics Data System (ADS)

    Şahin, Rıdvan; Liu, Peide

    2017-07-01

    The simplified neutrosophic set (SNS) is an appropriate tool for expressing the incompleteness, indeterminacy and uncertainty of evaluation objects in the decision-making process. In this study, we define the concept of a possibility SNS, which includes two types of information: the neutrosophic performance provided by the evaluation objects and its possibility degree, expressed as a value ranging from zero to one. Then, since the existing neutrosophic aggregation models for SNSs cannot effectively fuse these two kinds of information, we propose two novel neutrosophic aggregation operators that consider possibility, named the possibility-induced simplified neutrosophic weighted arithmetic averaging operator and the possibility-induced simplified neutrosophic weighted geometric averaging operator, and discuss their properties. Moreover, we develop a method based on the proposed aggregation operators for solving multi-criteria group decision-making problems with possibility simplified neutrosophic information, in which the weights of decision-makers and decision criteria are calculated using an entropy measure. Finally, a practical example is used to show the practicality and effectiveness of the proposed method.
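
    For orientation, the conventional simplified neutrosophic weighted arithmetic averaging (SNWAA) operator, without the possibility extension introduced in the paper, aggregates values a_i = (T_i, I_i, F_i) with weights w_i as sketched below; the paper's possibility-induced operators additionally weight each value by its possibility degree.

        # Conventional SNWAA sketch; the paper's possibility-induced operator extends this.
        from math import prod

        def snwaa(values, weights):
            """values: (truth, indeterminacy, falsity) triples in [0, 1]; weights sum to 1."""
            t = 1.0 - prod((1.0 - ti) ** wi for (ti, _, _), wi in zip(values, weights))
            i = prod(ii ** wi for (_, ii, _), wi in zip(values, weights))
            f = prod(fi ** wi for (_, _, fi), wi in zip(values, weights))
            return (t, i, f)

        values = [(0.7, 0.2, 0.1), (0.5, 0.4, 0.3)]
        weights = [0.6, 0.4]
        print(tuple(round(x, 3) for x in snwaa(values, weights)))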

  11. A hip joint simulator study using simplified loading and motion cycles generating physiological wear paths and rates.

    PubMed

    Barbour, P S; Stone, M H; Fisher, J

    1999-01-01

    In some designs of hip joint simulator, the cost of building a highly complex machine has been offset by the requirement for a large number of test stations. The applicability of the wear results generated by these machines depends on their ability to reproduce physiological wear rates and processes. In this study a hip joint simulator has been shown to reproduce physiological wear using only one load vector and two degrees of motion with simplified input cycles. The actual paths of points on the femoral head relative to the acetabular cup were calculated and compared for physiological and simplified input cycles. The in vitro wear rates were found to be highly dependent on the shape of these paths, and similarities could be drawn between the shape of the physiological paths and the simplified elliptical paths.

  12. Simplified web-based decision support method for traffic management and work zone analysis.

    DOT National Transportation Integrated Search

    2017-01-01

    Traffic congestion mitigation is one of the key challenges that transportation planners and operations engineers face when planning for construction and maintenance activities. There is a wide variety of approaches and methods that address work zone ...

  13. Simplified web-based decision support method for traffic management and work zone analysis.

    DOT National Transportation Integrated Search

    2015-06-01

    Traffic congestion mitigation is one of the key challenges that transportation planners and operations engineers face when : planning for construction and maintenance activities. There is a wide variety of approaches and methods that address work : z...

  14. SIMPLIFYING EVALUATIONS OF GREEN CHEMISTRIES: HOW MUCH INFORMATION DO WE NEED?

    EPA Science Inventory

    Research within the U.S. EPA's National Risk Management Research Laboratory is developing a methodology for the evaluation of green chemistries. This methodology called GREENSCOPE (Gauging Reaction Effectiveness for the Environmental Sustainability of Chemistries with a multi-Ob...

  15. Treatment outcomes after implementation of an adapted WHO protocol for severe sepsis and septic shock in Haiti.

    PubMed

    Papali, Alfred; Eoin West, T; Verceles, Avelino C; Augustin, Marc E; Nathalie Colas, L; Jean-Francois, Carl H; Patel, Devang M; Todd, Nevins W; McCurdy, Michael T

    2017-10-01

    The World Health Organization (WHO) has developed a simplified algorithm specific to resource-limited settings for the treatment of severe sepsis emphasizing early fluids and antibiotics. However, this protocol's clinical effectiveness is unknown. We describe patient outcomes before and after implementation of an adapted WHO severe sepsis protocol at a community hospital in Haiti. Using a before-and-after study design, we retrospectively enrolled 99 adult Emergency Department patients with severe sepsis from January through March 2012. After protocol implementation in January 2014, we compared outcomes to 67 patients with severe sepsis retrospectively enrolled from February to April 2014. We defined sepsis according to the WHO's Integrated Management of Adult Illness guidelines and severe sepsis as sepsis plus organ dysfunction. After protocol implementation, quantity of fluid administered increased and the physician's differential diagnoses more often included sepsis. Patients were more likely to have follow-up vital signs taken sooner, a radiograph performed, and a lactic acid tested. There were no improvements in mortality, time to fluids or antimicrobials. Use of a simplified sepsis protocol based primarily on physiologic parameters allows for substantial improvements in process measures in the care of severely septic patients in a resource-constrained setting. Copyright © 2017 Elsevier Inc. All rights reserved.

  16. Mining dynamic noteworthy functions in software execution sequences

    PubMed Central

    Huang, Guoyan; Wang, Yuqian; He, Haitao; Ren, Jiadong

    2017-01-01

    As the quality of crucial entities can directly affect that of software, their identification and protection are an important premise for effective software development, management, maintenance and testing, and thus contribute to improving software quality and its ability to withstand attacks. Most analyses and evaluations of important entities, such as code-based static structure analysis, do not reflect the actual running of the software. In this paper, from the perspective of the software execution process, we propose an approach to mine dynamic noteworthy functions (DNFM) in software execution sequences. First, by decompiling the software and tracking stack changes, execution traces composed of a series of function addresses are acquired. These traces are modeled as execution sequences and then simplified to obtain simplified sequences (SFS), from which patterns are extracted with a pattern extraction (PE) algorithm. After that, the evaluation indicators inner-importance and inter-importance are designed to measure the noteworthiness of functions in the DNFM algorithm. Finally, the functions are sorted by their noteworthiness. The experimental results were compared and contrasted with two traditional complex-network-based node mining methods, PageRank and DegreeRank. The results show that the DNFM method can mine noteworthy functions in software effectively and precisely. PMID:28278276
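
    As a rough illustration only (the paper's SFS simplification and importance indicators are not reproduced; the traces and pattern length below are hypothetical), extracting recurring call patterns from execution traces can be sketched as follows.

        # Illustrative sketch: count recurring function subsequences in execution traces.
        from collections import Counter

        def extract_patterns(traces, length=2):
            """Count every contiguous subsequence of the given length across all traces."""
            counts = Counter()
            for trace in traces:
                for i in range(len(trace) - length + 1):
                    counts[tuple(trace[i:i + length])] += 1
            return counts

        traces = [["main", "parse", "eval", "log"],
                  ["main", "parse", "eval", "save"]]
        print(extract_patterns(traces).most_common(2))
        # -> [(('main', 'parse'), 2), (('parse', 'eval'), 2)]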

  17. An architecture for genomics analysis in a clinical setting using Galaxy and Docker

    PubMed Central

    Digan, W; Countouris, H; Barritault, M; Baudoin, D; Laurent-Puig, P; Blons, H; Burgun, A

    2017-01-01

    Abstract Next-generation sequencing is used on a daily basis to perform molecular analysis to determine subtypes of disease (e.g., in cancer) and to assist in the selection of the optimal treatment. Clinical bioinformatics handles the manipulation of the data generated by the sequencer, from the generation to the analysis and interpretation. Reproducibility and traceability are crucial issues in a clinical setting. We have designed an approach based on Docker container technology and Galaxy, the popular bioinformatics analysis support open-source software. Our solution simplifies the deployment of a small-size analytical platform and simplifies the process for the clinician. From the technical point of view, the tools embedded in the platform are isolated and versioned through Docker images. Along the Galaxy platform, we also introduce the AnalysisManager, a solution that allows single-click analysis for biologists and leverages standardized bioinformatics application programming interfaces. We added a Shiny/R interactive environment to ease the visualization of the outputs. The platform relies on containers and ensures the data traceability by recording analytical actions and by associating inputs and outputs of the tools to EDAM ontology through ReGaTe. The source code is freely available on Github at https://github.com/CARPEM/GalaxyDocker. PMID:29048555

  18. An architecture for genomics analysis in a clinical setting using Galaxy and Docker.

    PubMed

    Digan, W; Countouris, H; Barritault, M; Baudoin, D; Laurent-Puig, P; Blons, H; Burgun, A; Rance, B

    2017-11-01

    Next-generation sequencing is used on a daily basis to perform molecular analysis to determine subtypes of disease (e.g., in cancer) and to assist in the selection of the optimal treatment. Clinical bioinformatics handles the manipulation of the data generated by the sequencer, from the generation to the analysis and interpretation. Reproducibility and traceability are crucial issues in a clinical setting. We have designed an approach based on Docker container technology and Galaxy, the popular bioinformatics analysis support open-source software. Our solution simplifies the deployment of a small-size analytical platform and simplifies the process for the clinician. From the technical point of view, the tools embedded in the platform are isolated and versioned through Docker images. Along the Galaxy platform, we also introduce the AnalysisManager, a solution that allows single-click analysis for biologists and leverages standardized bioinformatics application programming interfaces. We added a Shiny/R interactive environment to ease the visualization of the outputs. The platform relies on containers and ensures the data traceability by recording analytical actions and by associating inputs and outputs of the tools to EDAM ontology through ReGaTe. The source code is freely available on Github at https://github.com/CARPEM/GalaxyDocker. © The Author 2017. Published by Oxford University Press.
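
    A minimal sketch of the isolation-and-versioning idea follows (not the platform's actual wrappers; the image tag and command are hypothetical placeholders): each analysis step invokes a pinned Docker image, so every sample is processed with exactly the same tool version.

        # Sketch: run one analysis step inside a pinned, versioned Docker image.
        import subprocess

        def run_containerized_step(image, command, workdir):
            """Run a containerized tool with the working directory mounted at /data."""
            return subprocess.run(
                ["docker", "run", "--rm", "-v", f"{workdir}:/data", image, *command],
                check=True, capture_output=True, text=True)

        # Hypothetical usage: the fixed tag ensures the same tool version on every run.
        # run_containerized_step("example/variant-caller:1.2.3",
        #                        ["call-variants", "/data/sample.bam"], "/tmp/run42")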

  19. Staff experiences within the implementation of computer-based nursing records in residential aged care facilities: a systematic review and synthesis of qualitative research.

    PubMed

    Meißner, Anne; Schnepp, Wilfried

    2014-06-20

    Since the introduction of electronic nursing documentation systems, their implementation has increased rapidly in Germany in recent years. The objectives of such systems are to save time, to improve information handling and to improve quality. The employee is the pivotal element in integrating IT into daily working processes, so it is important to understand nurses' experience with IT implementation. At present the literature shows a lack of understanding of staff experiences within the implementation process. A systematic review and meta-ethnographic synthesis of primary studies using qualitative methods was conducted in PubMed, CINAHL, and Cochrane. It adheres to the principles of the PRISMA statement. The studies were original, peer-reviewed articles from 2000 to 2013, focusing on computer-based nursing documentation in Residential Aged Care Facilities. The use of IT requires a different form of information processing. Some experience this new form of information processing as a benefit while others do not. The latter find it more difficult to enter data, and this results in poor clinical documentation. Improvement in the quality of residents' records leads to an overall improvement in the quality of care; however, if the quality of those records is poor, some residents do not receive the necessary care. Furthermore, the length of time necessary to complete the documentation is a prominent theme within that process. Those who are more efficient with the electronic documentation demonstrate improved time management, whereas for those who are less efficient the information processing is perceived as time consuming. In general, benefits are experienced when using IT, but this depends on promoting or hindering factors, e.g. ease of use and ability to use it, equipment availability and technical functionality, as well as attitude. In summary, the findings showed that members of staff experience IT as a benefit when it simplifies their daily working routines and as a burden when it complicates their working processes. Whether IT complicates or simplifies their routines depends on influencing factors. The line between benefit and burden is semipermeable. The experiences differ according to duties and responsibilities.

  20. The influence of a wind tunnel on helicopter rotational noise: Formulation of analysis

    NASA Technical Reports Server (NTRS)

    Mosher, M.

    1984-01-01

    An analytical model is discussed that can be used to examine the effects of wind tunnel walls on helicopter rotational noise. A complete physical model of an acoustic source in a wind tunnel is described and a simplified version is then developed. This simplified model retains the important physical processes involved, yet it is more amenable to analysis. The simplified physical model is then modeled as a mathematical problem. An inhomogeneous partial differential equation with mixed boundary conditions is set up and then transformed into an integral equation. Details of generating a suitable Green's function and integral equation are included and the equation is discussed and also given for a two-dimensional case.
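    The report's equations are not reproduced in the abstract. For orientation only, a boundary-value problem of the kind described (a harmonic acoustic source enclosed by walls with mixed, impedance-type conditions, recast as an integral equation through a Green's function) can be written generically as below; this is a generic statement of such a problem, not the report's actual formulation, which also accounts for the flow and the rotating source.

        \nabla^{2}\hat{p} + k^{2}\hat{p} = -s(\mathbf{x}) \quad \text{in the test section},
        \qquad
        \frac{\partial \hat{p}}{\partial n} + i k \beta\, \hat{p} = 0 \quad \text{on the walls},

        \hat{p}(\mathbf{x}) = \int_{V} G(\mathbf{x},\mathbf{y})\, s(\mathbf{y})\, dV(\mathbf{y}),

    where the Green's function G is constructed to satisfy the wall condition, so that the volume integral over the source alone reproduces the in-tunnel field.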

  1. Resource Effective Strategies to Prevent and Treat Cardiovascular Disease

    PubMed Central

    Schwalm, Jon-David; McKee, Martin; Huffman, Mark D.; Yusuf, Salim

    2016-01-01

    Cardiovascular disease (CVD) is the leading cause of global deaths, with the majority occurring in low- and middle-income countries (LMIC). The primary and secondary prevention of CVD is suboptimal throughout the world, but the evidence-practice gaps are much more pronounced in LMIC. Barriers at the patient, health-care provider, and health system level prevent the implementation of optimal primary and secondary prevention. Identification of the particular barriers that exist in resource-constrained settings is necessary to inform effective strategies to reduce the identified evidence-practice gaps. Furthermore, targeting modifiable factors that contribute most significantly to the global burden of CVD, including tobacco use, hypertension, and secondary prevention for CVD will lead to the biggest gains in mortality reduction. We review a select number of novel, resource-efficient strategies to reduce premature mortality from CVD, including: (1) effective measures for tobacco control; (2) implementation of simplified screening and management algorithms for those with or at risk of CVD, (3) increasing the availability and affordability of simplified and cost-effective treatment regimens including combination CVD preventive drug therapy, and (4) simplified delivery of health care through task-sharing (non-physician health workers) and optimizing self-management (treatment supporters). Developing and deploying systems of care that address barriers related to the above, will lead to substantial reductions in CVD and related mortality. PMID:26903017

  2. Simplifying the negotiating process with physicians: critical elements in negotiating from private practice to employed physician.

    PubMed

    Gallucci, Armen; Deutsch, Thomas; Youngquist, Jaymie

    2013-01-01

    The authors attempt to simplify the key elements of the process of negotiating successfully with private physicians. From their experience, the business elements that have resulted in the most discussion center on compensation, including the incentive plan. Secondarily, how the issue of malpractice is handled will also consume a fair amount of time. What the authors have also learned is that the intangible issues can often be the reason for an unexpectedly large amount of discussion and therefore add time to the negotiation process. To assist with this process, they have derived a negotiation checklist, which seeks to help hospital leaders and administrators set the proper framework to ensure successful negotiation conversations. More importantly, being organized and recognizing these broad issues upfront and remaining transparent throughout the process will help to ensure a successful negotiation.

  3. A simplified computational memory model from information processing

    PubMed Central

    Zhang, Lanhua; Zhang, Dongsheng; Deng, Yuqin; Ding, Xiaoqian; Wang, Yan; Tang, Yiyuan; Sun, Baoliang

    2016-01-01

    This paper proposes a computational model of memory from the viewpoint of information processing. The model, called the simplified memory information retrieval network (SMIRN), is a bi-modular hierarchical functional memory network obtained by abstracting memory function and simulating memory information processing. First, meta-memory is defined to represent neurons or brain cortices on the basis of biology and graph theory, and an intra-modular network is developed with the modeling algorithm by mapping nodes and edges; the bi-modular network is then delineated intra-modularly and inter-modularly. Finally, a polynomial retrieval algorithm is introduced. The memory phenomena and the functions of memorization and strengthening are simulated by information processing algorithms. The theoretical analysis and the simulation results show that the model is in accordance with memory phenomena from an information processing view. PMID:27876847

  4. Simplified Physics Based Models Research Topical Report on Task #2

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mishra, Srikanta; Ganesh, Priya

    We present a simplified-physics based approach, where only the most important physical processes are modeled, to develop and validate simplified predictive models of CO2 sequestration in deep saline formation. The system of interest is a single vertical well injecting supercritical CO2 into a 2-D layered reservoir-caprock system with variable layer permeabilities. We use a set of well-designed full-physics compositional simulations to understand key processes and parameters affecting pressure propagation and buoyant plume migration. Based on these simulations, we have developed correlations for dimensionless injectivity as a function of the slope of fractional-flow curve, variance of layer permeability values, and the nature of vertical permeability arrangement. The same variables, along with a modified gravity number, can be used to develop a correlation for the total storage efficiency within the CO2 plume footprint. Similar correlations are also developed to predict the average pressure within the injection reservoir, and the pressure buildup within the caprock.

  5. Simplified Perovskite Solar Cell with 4.1% Efficiency Employing Inorganic CsPbBr3 as Light Absorber.

    PubMed

    Duan, Jialong; Zhao, Yuanyuan; He, Benlin; Tang, Qunwei

    2018-05-01

    Perovskite solar cells with cost-effectiveness, high power conversion efficiency, and improved stability are promising solutions to the energy crisis and environmental pollution. However, a wide-bandgap inorganic-semiconductor electron-transporting layer such as TiO2 can harvest ultraviolet light to photodegrade perovskite halides, and the high cost of a state-of-the-art hole-transporting layer is an economic burden for commercialization. Here, the building of a simplified cesium lead bromide (CsPbBr3) perovskite solar cell with fluorine-doped tin oxide (FTO)/CsPbBr3/carbon architecture by a multistep solution-processed deposition technology is demonstrated, achieving an efficiency as high as 4.1% and improved stability upon interfacial modification by graphene quantum dots and CsPbBrI2 quantum dots. This work provides new opportunities of building next-generation solar cells with significantly simplified processes and reduced production costs. © 2018 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  6. A Simplified Decision Support Approach for Evaluating Wetlands Ecosystem Services

    EPA Science Inventory

    State-level managers and restoration advocates have expressed a desire for approaches that address wetlands services and benefits for two purposes: to demonstrate the benefits of money budgeted for restoration, and to compare proposals when awarding restoration funds for specific...

  7. 48 CFR 2906.301 - Policy.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... REQUIREMENTS Other Than Full and Open Competition 2906.301 Policy. (a) Department of Labor acquisitions must... for Administration and Management and, in the case of research and development contracts, also by the... for research and development, the contracting officer has the authority below the simplified...

  8. 48 CFR 2906.301 - Policy.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... REQUIREMENTS Other Than Full and Open Competition 2906.301 Policy. (a) Department of Labor acquisitions must... for Administration and Management and, in the case of research and development contracts, also by the... for research and development, the contracting officer has the authority below the simplified...

  9. Future launchers strategy : the ariane 2010 initiative

    NASA Astrophysics Data System (ADS)

    Bonnal, Ch.; Eymard, M.; Soccodato, C.

    2001-03-01

    With the new cryogenic upper stage ESC, the European heavy launcher Ariane 5+ is perfectly suited to the space market envisioned for the coming decade: flexible enough to cope with any payload and commercially attractive despite fierce competition. Current Arianespace projections for the years 2010-2020 indicate two major trends: satellites may still become larger and may require very different final orbits; and today's market, largely dominated by GEO, may well evolve, influenced by LEO operations such as those linked to the ISS or by constellations. To remain competitive, the launch cost has to be reduced. The future generation of the European heavy launcher therefore has to focus on ever-increased flexibility combined with a drastic cost reduction. Two strategies are possible to achieve this double goal: reusable launchers, either partial or total, may ease access to space by limiting costly expendable stages; the assessment of their technical feasibility and financial viability is under way in Europe under the Future Launchers Technology Program (FLTP). The second strategy is expendable launchers derived from the future Ariane 5+. This second approach, started by CNES at the end of 1999, is called the "Ariane 2010 initiative". The main objectives are simultaneously an increase of 25% in performance and a reduction of 30% in launch cost with respect to Ariane 5+. To achieve these very ambitious goals, numerous major modifications are studied: technical improvements, such as modifications of the Solid Rocket Boosters (filament-wound casing, increased loading, simplified casting, improved grain, simplified Thrust Vector Control, …), evolution of the Vulcain engine leading to higher efficiency despite a simplified design, a flow-separation-controlled nozzle extension, propellant management of the two cryogenic stages, a simplified electrical system, and increased standardization, for instance on flanged interfaces and manufacturing processes; operational improvements such as launch cycle simplification and standardization of the coupled analyses; and organizational improvements such as a redistribution of responsibilities for the developments. All these modifications will of course not be implemented together; the aim is to have a coherent catalogue of improvements in order to enable future choices depending on effective requirements. These basic elements will also be considered for the development of other launchers, in the small or medium size range.

  10. Estimating inelastic heavy-particle-hydrogen collision data. I. Simplified model and application to potassium-hydrogen collisions

    NASA Astrophysics Data System (ADS)

    Belyaev, Andrey K.; Yakovleva, Svetlana A.

    2017-10-01

    Aims: We derive a simplified model for estimating atomic data on inelastic processes in low-energy collisions of heavy-particles with hydrogen, in particular for the inelastic processes with high and moderate rate coefficients. It is known that these processes are important for non-LTE modeling of cool stellar atmospheres. Methods: Rate coefficients are evaluated using a derived method, which is a simplified version of a recently proposed approach based on the asymptotic method for electronic structure calculations and the Landau-Zener model for nonadiabatic transition probability determination. Results: The rate coefficients are found to be expressed via statistical probabilities and reduced rate coefficients. It turns out that the reduced rate coefficients for mutual neutralization and ion-pair formation processes depend on single electronic bound energies of an atom, while the reduced rate coefficients for excitation and de-excitation processes depend on two electronic bound energies. The reduced rate coefficients are calculated and tabulated as functions of electronic bound energies. The derived model is applied to potassium-hydrogen collisions. For the first time, rate coefficients are evaluated for inelastic processes in K+H and K++H- collisions for all transitions from ground states up to and including ionic states. Tables with calculated data are only available at the CDS via anonymous ftp to http://cdsarc.u-strasbg.fr (http://130.79.128.5) or via http://cdsarc.u-strasbg.fr/viz-bin/qcat?J/A+A/606/A147
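    The tabulated reduced rate coefficients themselves are only available from the CDS archive. As background, the Landau-Zener single-passage transition probability on which such simplified approaches are built has the standard textbook form (this is general background, not the paper's final formulas):

        p_{\mathrm{LZ}} \;=\; \exp\!\left( - \frac{2\pi H_{12}^{2}}{\hbar\, v_{R}\, |\Delta F|} \right),

    where H_{12} is the off-diagonal coupling at the avoided crossing, v_R the radial velocity of the colliding nuclei at the crossing point, and ΔF the difference of slopes of the crossing diabatic potentials.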

  11. Planning Staff and Space Capacity Requirements during Wartime.

    PubMed

    Kepner, Elisa B; Spencer, Rachel

    2016-01-01

    Determining staff and space requirements for military medical centers can be challenging. Changing patient populations change the caseload requirements. Deployment and assignment rotations change the experience and education of clinicians and support staff, thereby changing the caseload capacity of a facility. During wartime, planning becomes increasingly more complex. What will the patient mix and caseload volume be by location? What type of clinicians will be available and when? How many beds are needed at each facility to meet caseload demand and match clinician supply? As soon as these factors are known, operations are likely to change and planning factors quickly become inaccurate. Soon, more beds or staff are needed in certain locations to meet caseload demand while other locations retain underutilized staff, waiting for additional caseload fluctuations. This type of complexity challenges the best commanders. As in so many other industries, supply and demand principles apply to military health, but very little is stable about military health capacity planning. Planning analysts build complex statistical forecasting models to predict caseload based on historical patterns. These capacity planning techniques work best in stable repeatable processes where caseload and staffing resources remain constant over a long period of time. Variability must be simplified to predict complex operations. This is counterintuitive to the majority of capacity planners who believe more data drives better answers. When the best predictor of future needs is not historical patterns, traditional capacity planning does not work. Rather, simplified estimation techniques coupled with frequent calibration adjustments to account for environmental changes will create the most accurate and most useful capacity planning and management system. The method presented in this article outlines the capacity planning approach used to actively manage hospital staff and space during Operations Iraqi Freedom and Enduring Freedom.
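    The article argues for simple estimators that are recalibrated frequently rather than elaborate statistical forecasts. A minimal sketch of such an estimator (standard occupancy arithmetic with made-up inputs, not the authors' planning model) is shown below; in practice the inputs would be re-entered whenever the operational picture changes.

        def beds_required(admissions_per_day, avg_length_of_stay_days, target_occupancy=0.85):
            """Average daily census divided by a target occupancy; recalibrate as conditions change."""
            average_daily_census = admissions_per_day * avg_length_of_stay_days
            return average_daily_census / target_occupancy

        # Illustrative numbers only: 12 admissions/day, 4-day average stay, 85% occupancy target.
        print(round(beds_required(12, 4)))   # about 56 beds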

  12. Development and validation of a simplified titration method for monitoring volatile fatty acids in anaerobic digestion.

    PubMed

    Sun, Hao; Guo, Jianbin; Wu, Shubiao; Liu, Fang; Dong, Renjie

    2017-09-01

    The volatile fatty acids (VFAs) concentration has been considered as one of the most sensitive process performance indicators in anaerobic digestion (AD) process. However, the accurate determination of VFAs concentration in AD processes normally requires advanced equipment and complex pretreatment procedures. A simplified method with fewer sample pretreatment procedures and improved accuracy is greatly needed, particularly for on-site application. This report outlines improvements to the Nordmann method, one of the most popular titrations used for VFA monitoring. The influence of ion and solid interfering subsystems in titrated samples on results accuracy was discussed. The total solid content in titrated samples was the main factor affecting accuracy in VFA monitoring. Moreover, a high linear correlation was established between the total solids contents and VFA measurement differences between the traditional Nordmann equation and gas chromatography (GC). Accordingly, a simplified titration method was developed and validated using a semi-continuous experiment of chicken manure anaerobic digestion with various organic loading rates. The good fitting of the results obtained by this method in comparison with GC results strongly supported the potential application of this method to VFA monitoring. Copyright © 2017. Published by Elsevier Ltd.
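    The paper's regression coefficients are not given in the abstract. The sketch below only illustrates the form of a total-solids correction applied to a titration-based VFA estimate; the coefficients a and b are placeholders to be fitted against gas chromatography reference data, not the published values.

        def corrected_vfa(vfa_titration_mg_per_l, total_solids_percent, a=0.0, b=0.0):
            """Apply a linear total-solids correction to a Nordmann-type titration result.

            a and b are placeholder regression coefficients (to be fitted against GC data);
            with the defaults the uncorrected titration value is returned unchanged.
            """
            return vfa_titration_mg_per_l - (a * total_solids_percent + b)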

  13. A simplified boron diffusion for preparing the silicon single crystal p-n junction as an educational device

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shiota, Koki, E-mail: a14510@sr.kagawa-nct.ac.jp; Kai, Kazuho; Nagaoka, Shiro, E-mail: nagaoka@es.kagawa-nct.ac.jp

    An educational method that includes designing, making, and evaluating actual semiconductor devices while learning the theory is one of the best ways to obtain a fundamental understanding of device physics and to cultivate the ability to develop original ideas using knowledge of semiconductor devices. In this paper, a simplified boron thermal diffusion process using a sol-gel material under a normal air environment is proposed based on a simple hypothesis, and its reproducibility and reliability are investigated, in order to simplify the diffusion process for making educational devices such as p-n junctions, bipolar transistors and pMOS devices. As a result, this method successfully produced a p+ region on the surface of n-type silicon substrates with good reproducibility, and good rectification properties of the p-n junctions were obtained. This result indicates that the process could be applied to making pMOS or bipolar transistors, and suggests a variety of possible applications in the educational field to foster the imagination of new devices.

  14. Automated Derivation of Complex System Constraints from User Requirements

    NASA Technical Reports Server (NTRS)

    Foshee, Mark; Murey, Kim; Marsh, Angela

    2010-01-01

    The Payload Operations Integration Center (POIC) located at the Marshall Space Flight Center has the responsibility of integrating US payload science requirements for the International Space Station (ISS). All payload operations must request ISS system resources so that the resource usage will be included in the ISS on-board execution timelines. The scheduling of resources and building of the timeline is performed using the Consolidated Planning System (CPS). The ISS resources are quite complex due to the large number of components that must be accounted for. The planners at the POIC simplify the process for Payload Developers (PDs) by providing the PDs with an application that has the basic functionality PDs need as well as a list of simplified resources in the User Requirements Collection (URC) application. The planners maintain a mapping of the URC resources to the CPS resources. The process of manually converting PDs' science requirements from a simplified representation to a more complex CPS representation is a time-consuming and tedious process. The goal is to provide a software solution to allow the planners to build a mapping of the complex CPS constraints to the basic URC constraints and automatically convert the PDs' requirements into system requirements during export to CPS.
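    The POIC tools themselves are internal, so the sketch below is purely illustrative: a lookup table mapping simplified URC resource names to lists of hypothetical CPS constraints, and a converter that expands a payload requirement during export. All names are invented for illustration only.

        # Hypothetical mapping of simplified URC resources to detailed CPS constraints.
        URC_TO_CPS = {
            "crew_time": ["CPS.CREW.USOC", "CPS.CREW.SCHED_WINDOW"],
            "downlink":  ["CPS.COMM.KU_BAND", "CPS.COMM.TDRS_COVERAGE"],
            "power":     ["CPS.EPS.CHANNEL_A"],
        }

        def expand_requirement(urc_resource, amount):
            """Expand one simplified payload requirement into detailed constraints for export."""
            return [{"constraint": c, "amount": amount} for c in URC_TO_CPS.get(urc_resource, [])]

        print(expand_requirement("downlink", 2.5))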

  15. Application of a cave inventory system to stimulate development of management strategies: the case of west-central Florida, USA.

    PubMed

    Harley, Grant L; Polk, Jason S; North, Leslie A; Reeder, Philip P

    2011-10-01

    The active management of air-filled cave systems is virtually non-existent within the karst landscape of west-central Florida. As in every karst landscape, caves are important because they contain a wide variety of resources (e.g., biota, speleothems) and can act as direct connections between surface and subsurface hydrological processes, potentially exacerbating the pollution of groundwater. Before sound management policies can be drafted, implemented, and enforced, stakeholders must first have knowledge of the management requirements of each cave. However, there is an informational disconnect between researchers, stakeholders, and the recreational caving community. Here, we present a cave inventory system that simplifies the dissemination of resource knowledge to stakeholders so that cave management and protection policies can be drafted and implemented at the state and local level. We inventoried 36 caves in west-central Florida, located on both public and private land, and analyzed cave resource data to provide insights on cave sensitivity and disturbance using two standardized indices. The data revealed that both public and private caves exhibit a wide range of sensitivity and disturbance, and before management strategies can be drafted, the ownership of each cave must be considered. Our inventory geodatabase serves as a link between researchers, landowners, and the public. To ensure the conservation and protection of caves, support from county or state government, combined with cave inventory data, is crucial in developing sound management policy. Copyright © 2011 Elsevier Ltd. All rights reserved.

  16. Waste Management Improvement Initiatives at Atomic Energy of Canada Limited - 13091

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chan, Nicholas; Adams, Lynne; Wong, Pierre

    2013-07-01

    Atomic Energy of Canada Limited's (AECL) Chalk River Laboratories (CRL) has been in operation for over 60 years. Radioactive, mixed, hazardous and non-hazardous wastes have been and continue to be generated at CRL as a result of research and development, radioisotope production, reactor operation and facility decommissioning activities. AECL has implemented several improvement initiatives at CRL to simplify the interface between waste generators and waste receivers: - Introduction of trained Waste Officers representing their facilities or activities at CRL; - Establishment of a Waste Management Customer Support Service as a Single-Point of Contact to provide guidance to waste generators for all waste management processes; and - Implementation of a streamlined approach for waste identification with emphasis on early identification of waste types and potential disposition paths. As a result of implementing these improvement initiatives, improvements in waste management and waste transfer efficiencies have been realized at CRL. These included: 1) waste generators contacting the Customer Support Service for information or guidance instead of various waste receivers; 2) more clear and consistent guidance provided to waste generators for waste management through the Customer Support Service; 3) more consistent and correct waste information provided to waste receivers through Waste Officers, resulting in reduced time and resources required for waste management (i.e., overall cost); 4) improved waste minimization and segregation approaches, as identified by in-house Waste Officers; and 5) enhanced communication between waste generators and waste management groups. (authors)

  17. PIMS sequencing extension: a laboratory information management system for DNA sequencing facilities

    PubMed Central

    2011-01-01

    Background Facilities that provide a service for DNA sequencing typically support large numbers of users and experiment types. The cost of services is often reduced by the use of liquid handling robots but the efficiency of such facilities is hampered because the software for such robots does not usually integrate well with the systems that run the sequencing machines. Accordingly, there is a need for software systems capable of integrating different robotic systems and managing sample information for DNA sequencing services. In this paper, we describe an extension to the Protein Information Management System (PIMS) that is designed for DNA sequencing facilities. The new version of PIMS has a user-friendly web interface and integrates all aspects of the sequencing process, including sample submission, handling and tracking, together with capture and management of the data. Results The PIMS sequencing extension has been in production since July 2009 at the University of Leeds DNA Sequencing Facility. It has completely replaced manual data handling and simplified the tasks of data management and user communication. Samples from 45 groups have been processed with an average throughput of 10000 samples per month. The current version of the PIMS sequencing extension works with Applied Biosystems 3130XL 96-well plate sequencer and MWG 4204 or Aviso Theonyx liquid handling robots, but is readily adaptable for use with other combinations of robots. Conclusions PIMS has been extended to provide a user-friendly and integrated data management solution for DNA sequencing facilities that is accessed through a normal web browser and allows simultaneous access by multiple users as well as facility managers. The system integrates sequencing and liquid handling robots, manages the data flow, and provides remote access to the sequencing results. The software is freely available, for academic users, from http://www.pims-lims.org/. PMID:21385349

  18. NASA Work Breakdown Structure (WBS) Handbook

    NASA Technical Reports Server (NTRS)

    Fleming, Jon F.; Poole, Kenneth W.

    2016-01-01

    The purpose of this document is to provide program/project teams necessary instruction and guidance in the best practices for Work Breakdown Structure (WBS) and WBS dictionary development and use for project implementation and management control. This handbook can be used for all types of NASA projects and work activities including research, development, construction, test and evaluation, and operations. The products of these work efforts may be hardware, software, data, or service elements (alone or in combination). The aim of this document is to assist project teams in the development of effective work breakdown structures that provide a framework of common reference for all project elements. The WBS and WBS dictionary are effective management processes for planning, organizing, and administering NASA programs and projects. The guidance contained in this document is applicable to both in-house, NASA-led effort and contracted effort. It assists management teams from both entities in fulfilling necessary responsibilities for successful accomplishment of project cost, schedule, and technical goals. Benefits resulting from the use of an effective WBS include, but are not limited to: providing a basis for assigned project responsibilities, providing a basis for project schedule and budget development, simplifying a project by dividing the total work scope into manageable units, and providing a common reference for all project communication.

  19. Work Breakdown Structure (WBS) Handbook

    NASA Technical Reports Server (NTRS)

    2010-01-01

    The purpose of this document is to provide program/project teams necessary instruction and guidance in the best practices for Work Breakdown Structure (WBS) and WBS dictionary development and use for project implementation and management control. This handbook can be used for all types of NASA projects and work activities including research, development, construction, test and evaluation, and operations. The products of these work efforts may be hardware, software, data, or service elements (alone or in combination). The aim of this document is to assist project teams in the development of effective work breakdown structures that provide a framework of common reference for all project elements. The WBS and WBS dictionary are effective management processes for planning, organizing, and administering NASA programs and projects. The guidance contained in this document is applicable to both in-house, NASA-led effort and contracted effort. It assists management teams from both entities in fulfilling necessary responsibilities for successful accomplishment of project cost, schedule, and technical goals. Benefits resulting from the use of an effective WBS include, but are not limited to: providing a basis for assigned project responsibilities, providing a basis for project schedule development, simplifying a project by dividing the total work scope into manageable units, and providing a common reference for all project communication.

  20. Changing workforce demographics necessitates succession planning in health care.

    PubMed

    Collins, Sandra K; Collins, Kevin S

    2007-01-01

    Health care organizations continue to be plagued by labor shortage issues. Further complicating the already existing workforce challenges is an aging population poised to retire en masse within the next few years. With fewer cohorts in the age group of 25 to 44 years (Vital Speeches Day. 2004:71:23-27), a more mobile workforce (Grow Your Own Leaders: How to Identify, Develop, and Retain Leadership Talent, 2002), and an overall reduction in the number of individuals seeking employment in the health care field (J Healthc Manag. 2003:48:6-11), the industry could be faced with an unmanageable number of vacant positions throughout the organization. Bracing for the potential impact of these issues is crucial to the ongoing business continuity of health care organizations. Many health care organizations have embraced succession planning to combat the potential labor famine. However, the health care industry as a whole seems to lag behind other industries in terms of succession planning efforts (Healthc Financ Manage. 2005;59:64-67). This article seeks to provide health care managers with a framework for improving the systematic preparation of the next generation of managers by analyzing the succession planning process. The models proposed are intended to initiate and simplify the reduction of the gap between theoretical concepts and future organizational application.

  1. Transcriptomic responses of a simplified soil microcosm to a plant pathogen and its biocontrol agent reveal a complex reaction to harsh habitat.

    PubMed

    Perazzolli, Michele; Herrero, Noemí; Sterck, Lieven; Lenzi, Luisa; Pellegrini, Alberto; Puopolo, Gerardo; Van de Peer, Yves; Pertot, Ilaria

    2016-10-27

    Soil microorganisms are key determinants of soil fertility and plant health. Soil phytopathogenic fungi are one of the most important causes of crop losses worldwide. Microbial biocontrol agents have been extensively studied as alternatives for controlling phytopathogenic soil microorganisms, but molecular interactions between them have mainly been characterised in dual cultures, without taking into account the soil microbial community. We used an RNA sequencing approach to elucidate the molecular interplay of a soil microbial community in response to a plant pathogen and its biocontrol agent, in order to examine the molecular patterns activated by the microorganisms. A simplified soil microcosm containing 11 soil microorganisms was incubated with a plant root pathogen (Armillaria mellea) and its biocontrol agent (Trichoderma atroviride) for 24 h under controlled conditions. More than 46 million paired-end reads were obtained for each replicate and 28,309 differentially expressed genes were identified in total. Pathway analysis revealed complex adaptations of soil microorganisms to the harsh conditions of the soil matrix and to reciprocal microbial competition/cooperation relationships. Both the phytopathogen and its biocontrol agent were specifically recognised by the simplified soil microcosm: defence reaction mechanisms and neutral adaptation processes were activated in response to competitive (T. atroviride) or non-competitive (A. mellea) microorganisms, respectively. Moreover, activation of resistance mechanisms dominated in the simplified soil microcosm in the presence of both A. mellea and T. atroviride. Biocontrol processes of T. atroviride were already activated during incubation in the simplified soil microcosm, possibly to occupy niches in a competitive ecosystem, and they were not further enhanced by the introduction of A. mellea. This work represents an additional step towards understanding molecular interactions between plant pathogens and biocontrol agents within a soil ecosystem. Global transcriptional analysis of the simplified soil microcosm revealed complex metabolic adaptation in the soil environment and specific responses to antagonistic or neutral intruders.

  2. Analysis of simplified heat transfer models for thermal property determination of nano-film by TDTR method

    NASA Astrophysics Data System (ADS)

    Wang, Xinwei; Chen, Zhe; Sun, Fangyuan; Zhang, Hang; Jiang, Yuyan; Tang, Dawei

    2018-03-01

    Heat transfer in nanostructures is of critical importance for a wide range of applications such as functional materials and thermal management of electronics. Time-domain thermoreflectance (TDTR) has been proved to be a reliable measurement technique for the thermal property determinations of nanoscale structures. However, it is difficult to determine more than three thermal properties at the same time. Heat transfer model simplifications can reduce the fitting variables and provide an alternative way for thermal property determination. In this paper, two simplified models are investigated and analyzed by the transform matrix method and simulations. TDTR measurements are performed on Al-SiO2-Si samples with different SiO2 thickness. Both theoretical and experimental results show that the simplified tri-layer model (STM) is reliable and suitable for thin film samples with a wide range of thickness. Furthermore, the STM can also extract the intrinsic thermal conductivity and interfacial thermal resistance from serial samples with different thickness.
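    The abstract does not restate the transfer-matrix relations. For reference, the standard one-dimensional per-layer matrix used in such layered heat-conduction analyses relates the temperature oscillation θ and heat flux q at the two faces of a layer of thermal conductivity k, volumetric heat capacity C and thickness d at angular frequency ω as below; this is the generic textbook form, not the paper's simplified models themselves.

        \begin{pmatrix} \theta \\ q \end{pmatrix}_{\text{bottom}}
        =
        \begin{pmatrix}
        \cosh(ud) & -\dfrac{\sinh(ud)}{k u} \\
        -\,k u\,\sinh(ud) & \cosh(ud)
        \end{pmatrix}
        \begin{pmatrix} \theta \\ q \end{pmatrix}_{\text{top}},
        \qquad u = \sqrt{\frac{i\omega C}{k}} .

    Matrices for successive layers (and interfaces) are multiplied to relate the surface temperature response to the modulated heating, which is what the fitting procedure ultimately inverts.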

  3. Application of geosites assessment method in geopark context

    NASA Astrophysics Data System (ADS)

    Martin, Simon; Perret, Amandine; Renau, Pierre; Cartier-Moulin, Olivier; Regolini-Bissig, Géraldine

    2014-05-01

    The regional natural park of the Monts d'Ardèche (Ardèche and Haute-Loire departments, France) is a candidate for the European Geopark Network (EGN) in 2014. The area has a wide geodiversity - with rocks from Cambrian to Pleistocene (basalt flows) - and interesting features like phonolitic protrusions, maars and granite boulder fields. Around 115 sites were selected and documented through a geosites inventory carried out in the territory. This pre-selection was supervised by the Ardèche Geological Society and is therefore based on expert advice. In the context of EGN candidature, these potential geosites were assessed with a simplified method. It follows the spirit of the method from the University of Lausanne (Reynard et al., 2007) and its recent developments: assessment of the scientific (central) value and of a set of additional values (ecological and cultural). As this assessment aimed to offer a management tool to the future geopark's authorities, a special focus was given to management aspects. In particular, the opportunities to use the site for education (from schools to universities) and for tourism as well as the existence of protection and of interpretive facilities were documented and assessed. Several interesting conclusions may be drawn from this case study: (1) expert assessment is effective when it is based on a pre-existing inventory which is well structured and documented; (2) even simplified, an assessment method is a very useful framework for expert assessment as it focuses the discussions on the most important points and helps to balance the assessment; (3) whereas the inventory can be extensively detailed and partly academic, the assessment in the geopark context is objective-driven in order to answer management needs. The place of the geosites assessment among the three key players of a geopark construction process (i.e. territory's managers, local geoscientists and EGN) is also discussed. This place can be defined as the point of consensus of needs and wishes of all stakeholders. For instance, the local geoscientists are most interested in conservation and scientific interests whereas managers aim to develop and promote the tourist (and economic) dimension. The definition and application of the assessment method is the outcome of constant discussion with both local key players; it therefore reflects and moderates the - sometimes antagonistic - interests. All the discussions around geosites assessment can be considered as the prime mover at local scale of the geopark construction process. This example shows that geosites assessment cannot be considered only as an academic operation, but also as an essential step to initiate a local dynamic and consensus that help to achieve some of the objectives of a geopark defined by EGN like local involvement, sustainable development, or cooperation with local communities.

  4. [The use of an Opect optic system in neurosurgical practice].

    PubMed

    Kalinovskiy, A V; Rzaev, D A; Yoshimitsu, K

    2018-01-01

    Modern neurosurgical practice is impossible without access to various information sources. The use of MRI and MSCT data during surgery is an integral part of the neurosurgeon's daily practice. Devices capable of managing an image viewer system without direct contact with equipment simplify working in the operating room. The aim was to test operation of a non-contact MRI and MSCT image viewer system in the operating room and to evaluate its effectiveness. An Opect non-contact image management system developed at the Tokyo Women's Medical University was installed in one of the operating rooms of the Novosibirsk Federal Center of Neurosurgery in 2014. In 2015, the Opect system was used by operating surgeons in 73 surgeries performed in the same operating room. The system effectiveness was analyzed based on a survey of surgeons. The non-contact image viewer system proved easy for personnel to learn and operate, easy to manage, and convenient for presenting visual information during surgery. Application of the Opect system simplifies work with neuroimaging data during surgery. The surgeon can independently view series of relevant MRI and MSCT scans without any assistance.

  5. Development and technical basis of simplified guidelines for emergency triage assessment and treatment in developing countries. WHO Integrated Management of Childhood Illness (IMCI) Referral Care Project.

    PubMed

    Gove, S; Tamburlini, G; Molyneux, E; Whitesell, P; Campbell, H

    1999-12-01

    Simplified guidelines for the emergency care of children have been developed to improve the triage and rapid initiation of appropriate emergency treatments for children presenting to hospitals in developing countries. The guidelines are part of the effort to improve referral level paediatric care within the World Health Organisation/Unicef strategy integrated management of childhood illness (IMCI), based on evidence of significant deficiencies in triage and emergency care. Existing emergency guidelines have been modified according to resource limitations and significant differences in the epidemiology of severe paediatric illness and preventable death in developing countries with raised infant and child mortality rates. In these settings, it is important to address the emergency management of diarrhoea with severe dehydration, severe malaria, severe malnutrition, and severe bacterial pneumonia, and to focus attention on sick infants younger than 2 months of age. The triage assessment relies on a few clinical signs, which can be readily taught so that it can be used by health workers with limited clinical background. The assessment has been designed so that it can be carried out quickly if negative, making it functional for triaging children in queues.

  6. 48 CFR 46.202-2 - Government reliance on inspection by contractor.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... ACQUISITION REGULATION CONTRACT MANAGEMENT QUALITY ASSURANCE Contract Quality Requirements 46.202-2 Government... the contractor to accomplish all inspection and testing needed to ensure that supplies or services acquired at or below the simplified acquisition threshold conform to contract quality requirements before...

  7. Geophysical data base

    NASA Technical Reports Server (NTRS)

    Williamson, M. R.; Kirschner, L. R.

    1975-01-01

    A general data-management system that provides a random-access capability for large amounts of data is described. The system operates on a CDC 6400 computer using a combination of magnetic tape and disk storage. A FORTRAN subroutine package is provided to simplify the maintenance and use of the data.

  8. Design and implementation of a sigma delta technology based pulse oximeter's acquisition stage

    NASA Astrophysics Data System (ADS)

    Rossi, E. E.; Peñalva, A.; Schaumburg, F.

    2011-12-01

    Pulse oximetry is a widely used tool in medical practice for estimating a patient's fraction of hemoglobin bonded to oxygen. Conventional oximetry presents limitations when baseline changes or low signal amplitudes occur. The aim of this paper is to simultaneously address these constraints and to simplify the circuitry needed, by using ΣΔ technology. For this purpose, a board for the acquisition of the needed signals was developed, together with PC-based software that controls it and displays and processes the acquired information in real time. Laboratory and field tests were also designed and executed to verify the performance of this equipment in adverse situations. A simple, robust and economic instrument was achieved, capable of obtaining signals even in situations where conventional oximetry fails.
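    The board's own processing chain is not detailed in the abstract. As background, the conventional estimate that an oximeter front end ultimately feeds is the "ratio of ratios" of the pulsatile (AC) and baseline (DC) components at the two wavelengths; the linear calibration in the sketch below is a commonly quoted classroom approximation, not this instrument's calibration.

        def spo2_estimate(ac_red, dc_red, ac_ir, dc_ir):
            """Ratio-of-ratios SpO2 estimate with a generic linear calibration."""
            r = (ac_red / dc_red) / (ac_ir / dc_ir)
            return 110.0 - 25.0 * r   # generic textbook calibration; real devices use device-specific curves

        # Illustrative pulsatile (AC) and baseline (DC) amplitudes only.
        print(spo2_estimate(ac_red=0.02, dc_red=1.0, ac_ir=0.03, dc_ir=1.0))   # roughly 93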

  9. Multispectral imaging approach for simplified non-invasive in-vivo evaluation of gingival erythema

    NASA Astrophysics Data System (ADS)

    Eckhard, Timo; Valero, Eva M.; Nieves, Juan L.; Gallegos-Rueda, José M.; Mesa, Francisco

    2012-03-01

    Erythema is a common visual sign of gingivitis. In this work, a new and simple low-cost image capture and analysis method for erythema assessment is proposed. The method is based on digital still images of gingivae and applied on a pixel-by-pixel basis. Multispectral images are acquired with a conventional digital camera and multiplexed LED illumination panels at 460nm and 630nm peak wavelength. An automatic work-flow segments teeth from gingiva regions in the images and creates a map of local blood oxygenation levels, which relates to the presence of erythema. The map is computed from the ratio of the two spectral images. An advantage of the proposed approach is that the whole process is easy to manage by dental health care professionals in clinical environment.
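    The paper's segmentation workflow is not described in detail in the abstract. The sketch below only shows the core ratio-map computation from two registered spectral images (as NumPy arrays), with a naive threshold standing in for the automatic teeth/gingiva segmentation.

        import numpy as np

        def erythema_map(img_630nm, img_460nm, eps=1e-6):
            """Pixel-wise ratio of the two spectral bands, masked to gingiva pixels.

            img_630nm and img_460nm are registered single-channel images as float arrays.
            The threshold below is a naive stand-in for the paper's automatic segmentation.
            """
            ratio = img_630nm / (img_460nm + eps)
            gingiva_mask = img_460nm < 0.7 * img_460nm.max()   # teeth appear bright at 460 nm
            return np.where(gingiva_mask, ratio, np.nan)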

  10. The application of the unified modeling language in object-oriented analysis of healthcare information systems.

    PubMed

    Aggarwal, Vinod

    2002-10-01

    This paper concerns itself with the beneficial effects of the Unified Modeling Language (UML), a nonproprietary object modeling standard, in specifying, visualizing, constructing, documenting, and communicating the model of a healthcare information system from the user's perspective. The author outlines the process of object-oriented analysis (OOA) using the UML and illustrates this with healthcare examples to demonstrate the practicality of application of the UML by healthcare personnel to real-world information system problems. The UML will accelerate advanced uses of object-orientation such as reuse technology, resulting in significantly higher software productivity. The UML is also applicable in the context of a component paradigm that promises to enhance the capabilities of healthcare information systems and simplify their management and maintenance.

  11. Information security governance: a risk assessment approach to health information systems protection.

    PubMed

    Williams, Patricia A H

    2013-01-01

    It is no small task to manage the protection of healthcare data and healthcare information systems. In an environment that demands adaptation to change for all information collection, storage and retrieval systems, including those for e-health and information systems, it is imperative that good information security governance is in place. This includes understanding and meeting legislative and regulatory requirements. This chapter provides three models to educate and guide organisations in this complex area, and to simplify the process of information security governance and ensure appropriate and effective measures are put in place. The approach is risk based, adapted and contextualized for healthcare. In addition, specific considerations of the impact of cloud services, secondary use of data, big data and mobile health are discussed.

  12. Electrolytic decontamination of conductive materials for hazardous waste management

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wedman, D.E.; Martinez, H.E.; Nelson, T.O.

    1996-12-31

    Electrolytic removal of plutonium and americium from stainless steel and uranium surfaces has been demonstrated. Preliminary experiments were performed on the electrochemically based decontamination of type 304L stainless steel in sodium nitrate solutions to better understand the metal removal effects of varying current density, pH, and nitrate concentration parameters. Material removal rates and changes in surface morphology under these varying conditions are reported. Experimental results indicate that an electropolishing step before contamination removes surface roughness, thereby simplifying later electrolytic decontamination. Sodium nitrate based electrolytic decontamination produced the most uniform stripping of material at low to intermediate pH and at sodium nitrate concentrations of 200 g/L and higher. Stirring was also observed to increase the uniformity of the stripping process.

  13. Managing Data, Provenance and Chaos through Standardization and Automation at the Georgia Coastal Ecosystems LTER Site

    NASA Astrophysics Data System (ADS)

    Sheldon, W.

    2013-12-01

    Managing data for a large, multidisciplinary research program such as a Long Term Ecological Research (LTER) site is a significant challenge, but also presents unique opportunities for data stewardship. LTER research is conducted within multiple organizational frameworks (i.e. a specific LTER site as well as the broader LTER network), and addresses both specific goals defined in an NSF proposal and broader goals of the network; therefore, every LTER data set can be linked to rich contextual information to guide interpretation and comparison. The challenge is how to link the data to this wealth of contextual metadata. At the Georgia Coastal Ecosystems LTER we developed an integrated information management system (GCE-IMS) to manage, archive and distribute data, metadata and other research products as well as manage project logistics, administration and governance (figure 1). This system allows us to store all project information in one place, and provide dynamic links through web applications and services to ensure content is always up to date on the web as well as in data set metadata. The database model supports tracking changes over time in personnel roles, projects and governance decisions, allowing these databases to serve as canonical sources of project history. Storing project information in a central database has also allowed us to standardize both the formatting and content of critical project information, including personnel names, roles, keywords, place names, attribute names, units, and instrumentation, providing consistency and improving data and metadata comparability. Lookup services for these standard terms also simplify data entry in web and database interfaces. We have also coupled the GCE-IMS to our MATLAB- and Python-based data processing tools (i.e. through database connections) to automate metadata generation and packaging of tabular and GIS data products for distribution. Data processing history is automatically tracked throughout the data lifecycle, from initial import through quality control, revision and integration by our data processing system (GCE Data Toolbox for MATLAB), and included in metadata for versioned data products. This high level of automation and system integration has proven very effective in managing the chaos and scalability of our information management program.

  14. Induced simplified neutrosophic correlated aggregation operators for multi-criteria group decision-making

    NASA Astrophysics Data System (ADS)

    Şahin, Rıdvan; Zhang, Hong-yu

    2018-03-01

    The induced Choquet integral is a powerful tool for dealing with information of an imprecise or uncertain nature. This study proposes a combination process of the induced Choquet integral and neutrosophic information. We first give the operational properties of simplified neutrosophic numbers (SNNs). Then, we develop some new information aggregation operators, including an induced simplified neutrosophic correlated averaging (I-SNCA) operator and an induced simplified neutrosophic correlated geometric (I-SNCG) operator. These operators not only consider the importance of elements or their ordered positions, but also take into account the interaction phenomena among decision criteria or their ordered positions under multiple decision-makers. Moreover, we present a detailed analysis of the I-SNCA and I-SNCG operators, including the properties of idempotency, commutativity and monotonicity, and study the relationships among the proposed operators and existing simplified neutrosophic aggregation operators. In order to handle multi-criteria group decision-making (MCGDM) situations where the weights of criteria and decision-makers are usually correlated and the criterion values are considered as SNNs, an approach is established based on the I-SNCA operator. Finally, a numerical example is presented to demonstrate the proposed approach and to verify its effectiveness and practicality.
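    The I-SNCA and I-SNCG definitions themselves are not given in the abstract; they build on the discrete Choquet integral with respect to a fuzzy measure μ, whose generic form for real arguments (before the neutrosophic operational laws are substituted) is as below. This is the standard definition, not the paper's operators.

        C_{\mu}(x_{1},\ldots,x_{n}) \;=\; \sum_{j=1}^{n} \bigl( x_{\sigma(j)} - x_{\sigma(j-1)} \bigr)\, \mu\!\bigl( A_{\sigma(j)} \bigr),
        \qquad x_{\sigma(0)} = 0,\quad A_{\sigma(j)} = \{\sigma(j),\ldots,\sigma(n)\},

    where σ is a permutation ordering the arguments so that x_{σ(1)} ≤ … ≤ x_{σ(n)}. In the induced variants, the ordering is driven by a separate order-inducing variable rather than by the argument values themselves.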

  15. A pipeline for comprehensive and automated processing of electron diffraction data in IPLT.

    PubMed

    Schenk, Andreas D; Philippsen, Ansgar; Engel, Andreas; Walz, Thomas

    2013-05-01

    Electron crystallography of two-dimensional crystals allows the structural study of membrane proteins in their native environment, the lipid bilayer. Determining the structure of a membrane protein at near-atomic resolution by electron crystallography remains, however, a very labor-intense and time-consuming task. To simplify and accelerate the data processing aspect of electron crystallography, we implemented a pipeline for the processing of electron diffraction data using the Image Processing Library and Toolbox (IPLT), which provides a modular, flexible, integrated, and extendable cross-platform, open-source framework for image processing. The diffraction data processing pipeline is organized as several independent modules implemented in Python. The modules can be accessed either from a graphical user interface or through a command line interface, thus meeting the needs of both novice and expert users. The low-level image processing algorithms are implemented in C++ to achieve optimal processing performance, and their interface is exported to Python using a wrapper. For enhanced performance, the Python processing modules are complemented with a central data managing facility that provides a caching infrastructure. The validity of our data processing algorithms was verified by processing a set of aquaporin-0 diffraction patterns with the IPLT pipeline and comparing the resulting merged data set with that obtained by processing the same diffraction patterns with the classical set of MRC programs. Copyright © 2013 Elsevier Inc. All rights reserved.

  16. A pipeline for comprehensive and automated processing of electron diffraction data in IPLT

    PubMed Central

    Schenk, Andreas D.; Philippsen, Ansgar; Engel, Andreas; Walz, Thomas

    2013-01-01

    Electron crystallography of two-dimensional crystals allows the structural study of membrane proteins in their native environment, the lipid bilayer. Determining the structure of a membrane protein at near-atomic resolution by electron crystallography remains, however, a very labor-intense and time-consuming task. To simplify and accelerate the data processing aspect of electron crystallography, we implemented a pipeline for the processing of electron diffraction data using the Image Processing Library & Toolbox (IPLT), which provides a modular, flexible, integrated, and extendable cross-platform, open-source framework for image processing. The diffraction data processing pipeline is organized as several independent modules implemented in Python. The modules can be accessed either from a graphical user interface or through a command line interface, thus meeting the needs of both novice and expert users. The low-level image processing algorithms are implemented in C++ to achieve optimal processing performance, and their interface is exported to Python using a wrapper. For enhanced performance, the Python processing modules are complemented with a central data managing facility that provides a caching infrastructure. The validity of our data processing algorithms was verified by processing a set of aquaporin-0 diffraction patterns with the IPLT pipeline and comparing the resulting merged data set with that obtained by processing the same diffraction patterns with the classical set of MRC programs. PMID:23500887

  17. The Cassini Solstice Mission: Streamlining Operations by Sequencing with PIEs

    NASA Technical Reports Server (NTRS)

    Vandermey, Nancy; Alonge, Eleanor K.; Magee, Kari; Heventhal, William

    2014-01-01

    The Cassini Solstice Mission (CSM) is the second extended mission phase of the highly successful Cassini/Huygens mission to Saturn. Conducted at a much-reduced funding level, operations for the CSM have been streamlined and simplified significantly. Integration of the science timeline, which involves allocating observation time in a balanced manner to each of the five different science disciplines (with representatives from the twelve different science instruments), has long been a labor-intensive endeavor. Lessons learned from the prime mission (2004-2008) and first extended mission (Equinox mission, 2008-2010) were utilized to design a new process involving PIEs (Pre-Integrated Events) to ensure the highest priority observations for each discipline could be accomplished despite reduced work force and overall simplification of processes. Discipline-level PIE lists were managed by the Science Planning team and graphically mapped to aid timeline deconfliction meetings prior to assigning discrete segments of time to the various disciplines. Periapse segments are generally discipline-focused, with the exception of a handful of PIEs. In addition to all PIEs being documented in a spreadsheet, allocated out-of-discipline PIEs were entered into the Cassini Information Management System (CIMS) well in advance of timeline integration. The disciplines were then free to work the rest of the timeline internally, without the need for frequent interaction, debate, and negotiation with representatives from other disciplines. As a result, the number of integration meetings has been cut back extensively, freeing up workforce. The sequence implementation process was streamlined as well, combining two previous processes (and teams) into one. The new Sequence Implementation Process (SIP) schedules 22 weeks to build each 10-week-long sequence, and only 3 sequence processes overlap. This differs significantly from prime mission during which 5-week-long sequences were built in 24 weeks, with 6 overlapping processes.

  18. Simplified Method for Preparing Methylene-Blue-Sensitized Dichromated Gelatin

    NASA Astrophysics Data System (ADS)

    Kurokawa, Kazumasa; Koike, Satoshi; Namba, Sinji; Mizuno, Toru; Kubota, Toshihiro

    1998-05-01

    Methylene-blue-sensitized dichromated gelatin (MBDCG) is a suitable material for recording full-color holograms in a single layer. However, a drying process in an ammonia atmosphere is necessary to prepare the MBDCG plate. This process is time-consuming and unstable. A simplified method for preparing the MBDCG plate is presented in which the MBDCG can be dried without ammonia. Elimination of the drying process is possible when the methylene blue in MBDCG does not separate. This is achieved by a decrease in the concentration of dichromate in the photosensitized solution and the addition of an ammonia solution to the photosensitized solution. Last, the gelatin is allowed to gel. A Lippmann color hologram grating with a diffraction efficiency of more than 80% is obtained by use of this MBDCG.

  19. Low energy production processes in manufacturing of silicon solar cells

    NASA Technical Reports Server (NTRS)

    Kirkpatrick, A. R.

    1976-01-01

    Ion implantation and pulsed energy techniques are being combined for fabrication of silicon solar cells totally under vacuum and at room temperature. Simplified sequences allow very short processing times with small process energy consumption. Economic projections for fully automated production are excellent.

  20. Finding simplicity in complexity: modelling post-fire hydrogeomorphic processes and risks

    NASA Astrophysics Data System (ADS)

    Sheridan, Gary; Langhans, Christoph; Lane, Patrick; Nyman, Petter

    2017-04-01

    Post-fire runoff and erosion can shape landscapes, destroy infrastructure, and result in the loss of human life. However, even within seemingly similar geographic regions post-fire hydro-geomorphic responses vary from almost no response through to catastrophic flash floods and debris flows. Why is there so much variability, and how can we predict areas at risk? This presentation describes the research journey taken by the post-fire research group at The University of Melbourne to answer this question for the south-eastern Australian uplands. Key steps along the way have included identifying the dominant erosion processes (and their forcings), and the key system properties controlling the rates of these dominant processes. The high degree of complexity in the interactions between the forcings, the system properties, and the erosion processes, necessitated the development of a simplified conceptual representation of the post-fire hydrogeomorphic system that was conducive to modelling and simulation. Spatially mappable metrics (and proxies) for key system forcings and properties were then required to parameterize and drive the model. Each step in this journey has depended on new research, as well as ongoing feedback from land and water management agencies tasked with implementing these risk models and interpreting the results. These models are now embedded within agencies and used for strategic risk assessments, for tactical response during fires, and for post-fire remediation and risk planning. Reflecting on the successes and failures along the way provides for some more general insights into the process of developing research-based models for operational use by land and water management agencies.

  1. Enabling Big Geoscience Data Analytics with a Cloud-Based, MapReduce-Enabled and Service-Oriented Workflow Framework

    PubMed Central

    Li, Zhenlong; Yang, Chaowei; Jin, Baoxuan; Yu, Manzhu; Liu, Kai; Sun, Min; Zhan, Matthew

    2015-01-01

    Geoscience observations and model simulations are generating vast amounts of multi-dimensional data. Effectively analyzing these data is essential for geoscience studies. However, the tasks are challenging for geoscientists because processing the massive amount of data is both computing and data intensive in that data analytics requires complex procedures and multiple tools. To tackle these challenges, a scientific workflow framework is proposed for big geoscience data analytics. In this framework, techniques are proposed by leveraging cloud computing, MapReduce, and Service Oriented Architecture (SOA). Specifically, HBase is adopted for storing and managing big geoscience data across distributed computers. A MapReduce-based algorithm framework is developed to support parallel processing of geoscience data. A service-oriented workflow architecture is built for supporting on-demand complex data analytics in the cloud environment. A proof-of-concept prototype tests the performance of the framework. Results show that this innovative framework significantly improves the efficiency of big geoscience data analytics by reducing the data processing time as well as simplifying data analytical procedures for geoscientists. PMID:25742012
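
    The map/reduce idea behind the framework can be illustrated with plain Python. The sketch below computes a mean field over many data chunks in parallel; it is only a schematic stand-in for the HBase/Hadoop implementation described in the abstract, and the chunk data are synthetic.

    ```python
    # Map step per chunk, reduce step over partial results; plain-Python
    # illustration of the parallel-processing idea, not the authors' code.
    from multiprocessing import Pool

    import numpy as np


    def map_chunk(chunk):
        # Map: per-chunk sum over time and the number of time steps.
        return chunk.sum(axis=0), chunk.shape[0]


    def reduce_partials(partials):
        # Reduce: combine partial sums into a global mean field.
        total = sum(s for s, _ in partials)
        count = sum(n for _, n in partials)
        return total / count


    if __name__ == "__main__":
        # Fake multi-dimensional geoscience data: 8 chunks of (time, lat, lon).
        chunks = [np.random.rand(10, 4, 5) for _ in range(8)]
        with Pool(4) as pool:
            partials = pool.map(map_chunk, chunks)
        mean_field = reduce_partials(partials)
        print(mean_field.shape)  # (4, 5)
    ```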

  2. Supervised Classification Processes for the Characterization of Heritage Elements, Case Study: Cuenca-Ecuador

    NASA Astrophysics Data System (ADS)

    Briones, J. C.; Heras, V.; Abril, C.; Sinchi, E.

    2017-08-01

    The proper control of built heritage entails many challenges related to the complexity of heritage elements and the extent of the area to be managed, for which the available resources must be efficiently used. In this scenario, the preventive conservation approach, based on the concept that prevention is better than cure, emerges as a strategy to avoid the progressive and imminent loss of monuments and heritage sites. Regular monitoring appears as a key tool to identify timely changes in heritage assets. This research demonstrates that the supervised learning model (Support Vector Machines - SVM) is an ideal tool that supports the monitoring process by detecting visible elements in aerial images such as roof structures, vegetation and pavements. The linear, Gaussian and polynomial kernel functions were tested; the linear function provided better results than the other functions. It is important to mention that due to the high level of segmentation generated by the classification procedure, it was necessary to apply a generalization process through a morphological opening operation, which simplified the over-classification for the monitored elements.
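
    A minimal sketch of the two steps described above, assuming scikit-learn and scikit-image are available: train SVMs with linear, Gaussian (RBF) and polynomial kernels, then clean the resulting class mask with a morphological opening. The features, labels and mask here are synthetic placeholders, not the Cuenca aerial imagery.

    ```python
    # Step 1: SVM classification with three kernels on synthetic features.
    # Step 2: morphological opening to generalize an over-segmented mask.
    import numpy as np
    from sklearn.svm import SVC
    from skimage.morphology import binary_opening

    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 3))                      # e.g. colour features per segment
    y = (X[:, 0] + 0.5 * X[:, 1] > 0).astype(int)      # two classes, e.g. roof vs. not roof

    for kernel in ("linear", "rbf", "poly"):           # linear, Gaussian, polynomial
        clf = SVC(kernel=kernel).fit(X, y)
        print(kernel, round(clf.score(X, y), 3))

    # Opening removes small isolated patches from a noisy classification mask.
    mask = rng.integers(0, 2, size=(50, 50)).astype(bool)
    cleaned = binary_opening(mask, np.ones((3, 3), dtype=bool))
    print(int(cleaned.sum()))
    ```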

  3. Enabling big geoscience data analytics with a cloud-based, MapReduce-enabled and service-oriented workflow framework.

    PubMed

    Li, Zhenlong; Yang, Chaowei; Jin, Baoxuan; Yu, Manzhu; Liu, Kai; Sun, Min; Zhan, Matthew

    2015-01-01

    Geoscience observations and model simulations are generating vast amounts of multi-dimensional data. Effectively analyzing these data are essential for geoscience studies. However, the tasks are challenging for geoscientists because processing the massive amount of data is both computing and data intensive in that data analytics requires complex procedures and multiple tools. To tackle these challenges, a scientific workflow framework is proposed for big geoscience data analytics. In this framework techniques are proposed by leveraging cloud computing, MapReduce, and Service Oriented Architecture (SOA). Specifically, HBase is adopted for storing and managing big geoscience data across distributed computers. MapReduce-based algorithm framework is developed to support parallel processing of geoscience data. And service-oriented workflow architecture is built for supporting on-demand complex data analytics in the cloud environment. A proof-of-concept prototype tests the performance of the framework. Results show that this innovative framework significantly improves the efficiency of big geoscience data analytics by reducing the data processing time as well as simplifying data analytical procedures for geoscientists.

  4. Enrolling in Medicaid through the National School Lunch Program: outcome of a pilot project in California schools.

    PubMed

    Cousineau, Michael R; Wada, Eriko O; Hogan, Laura

    2007-01-01

    California has several health insurance programs for children. However, the system for enrolling into these programs is complex and difficult to manage for many families. Express Lane Eligibility is designed to streamline the Medicaid (called Medi-Cal in California) enrollment process by linking it to the National School Lunch Program. If a child is eligible for free lunch and the parents consent, the program provides two months of presumptive eligibility for Medi-Cal and a simplified application process for continuation in Medi-Cal. For those who are ineligible, it provides a referral to other programs. An evaluation of Express Lane shows that while many children were presumptively enrolled, nearly half of the applicants were already enrolled in Medi-Cal. Many Express Enrolled children failed to complete the full Medi-Cal enrollment process. Few were referred to the State Children's Health Insurance Program or county programs. Express Lane is less useful as a broad screening strategy, but can be one of many tools that communities use to enroll children in health insurance.

  5. The influence of wind-tunnel walls on discrete frequency noise

    NASA Technical Reports Server (NTRS)

    Mosher, M.

    1984-01-01

    This paper describes an analytical model that can be used to examine the effects of wind-tunnel walls on discrete frequency noise. First, a complete physical model of an acoustic source in a wind tunnel is described, and a simplified version is then developed. This simplified model retains the important physical processes involved, yet it is more amenable to analysis. Second, the simplified physical model is formulated as a mathematical problem. An inhomogeneous partial differential equation with mixed boundary conditions is set up and then transformed into an integral equation. The integral equation has been solved with a panel program on a computer. Preliminary results from a simple model problem will be shown and compared with the approximate analytic solution.

  6. Multidisciplinary Optimization Methods for Aircraft Preliminary Design

    NASA Technical Reports Server (NTRS)

    Kroo, Ilan; Altus, Steve; Braun, Robert; Gage, Peter; Sobieski, Ian

    1994-01-01

    This paper describes a research program aimed at improved methods for multidisciplinary design and optimization of large-scale aeronautical systems. The research involves new approaches to system decomposition, interdisciplinary communication, and methods of exploiting coarse-grained parallelism for analysis and optimization. A new architecture, that involves a tight coupling between optimization and analysis, is intended to improve efficiency while simplifying the structure of multidisciplinary, computation-intensive design problems involving many analysis disciplines and perhaps hundreds of design variables. Work in two areas is described here: system decomposition using compatibility constraints to simplify the analysis structure and take advantage of coarse-grained parallelism; and collaborative optimization, a decomposition of the optimization process to permit parallel design and to simplify interdisciplinary communication requirements.
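
    The idea of decomposition with compatibility constraints can be shown with a toy two-discipline problem in scipy: each discipline keeps its own copy of a shared design variable, and the optimizer forces the copies to agree. This is a schematic illustration under simplifying assumptions, not the collaborative optimization architecture developed in the paper.

    ```python
    # Toy decomposition with a compatibility constraint between two disciplines.
    import numpy as np
    from scipy.optimize import minimize


    def discipline_aero(x_aero):
        # e.g. a drag-like objective of the aerodynamic copy (toy quadratic)
        return (x_aero - 2.0) ** 2


    def discipline_struct(x_struct):
        # e.g. a weight-like objective of the structural copy
        return (x_struct + 1.0) ** 2


    def system_objective(z):
        x_aero, x_struct = z
        return discipline_aero(x_aero) + discipline_struct(x_struct)


    # Compatibility constraint: the two copies of the shared variable must match.
    constraints = [{"type": "eq", "fun": lambda z: z[0] - z[1]}]

    result = minimize(system_objective, x0=np.array([0.0, 0.0]), constraints=constraints)
    print(result.x)  # both copies converge to the compromise value 0.5
    ```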

  7. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Frederick, P.R.

    The purpose of this research was to provide a handbook on Project PETROL RAM and its ideas for enhancing the base-level fuels operations. This handbook is to serve as an introduction and reference guide to the components, characteristics, and capabilities of the proposed initiative. In preparing this handbook, available information on the structure, operations, and procedures of a typical base Fuels Management Branch was gathered, then condensed and simplified. For added depth and understanding, personal interviews with personnel involved in the design and development of this project were conducted. The research resulted in a handbook that describes, in simplified terminology, the different systems which are being developed under the Project PETROL RAM initiative.

  8. Managing Salary Equity. AIR Forum 1981 Paper.

    ERIC Educational Resources Information Center

    Prather, James E.; Posey, Ellen I.

    Technical considerations in the development of a salary equity model based upon regression analysis are reviewed, and a simplified salary prediction equation is examined. Application and communication of the results of the analysis within the existing operational context of a postsecondary institution are also addressed. The literature is…

  9. Controlling Inventory: Real-World Mathematical Modeling

    ERIC Educational Resources Information Center

    Edwards, Thomas G.; Özgün-Koca, S. Asli; Chelst, Kenneth R.

    2013-01-01

    Amazon, Walmart, and other large-scale retailers owe their success partly to efficient inventory management. For such firms, holding too little inventory risks losing sales, whereas holding idle inventory wastes money. Therefore profits hinge on the inventory level chosen. In this activity, students investigate a simplified inventory-control…

  10. 48 CFR 36.515 - Schedules for construction contracts.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... contemplated, the contract amount is expected to exceed the simplified acquisition threshold, and the period of... covering other management approaches for ensuring that a contractor makes adequate progress. [48 FR 42356... Schedules for construction contracts. The contracting officer may insert the clause at 52.236-15, Schedules...

  11. Synthesis of research on work zone delays and simplified application of QuickZone analysis tool.

    DOT National Transportation Integrated Search

    2010-03-01

    The objectives of this project were to synthesize the latest information on work zone safety and management and identify case studies in which FHWA's decision support tool QuickZone or other appropriate analysis tools could be applied. The results ...

  12. The Uncertainty of Mass Discharge Measurements Using Pumping Methods Under Simplified Conditions

    EPA Science Inventory

    Mass discharge measurements at contaminated sites have been used to assist with site management decisions, and can be divided into two broad categories: point-scale measurement techniques and pumping methods. Pumping methods can be sub-divided based on the pumping procedures use...

  13. A programmable computational image sensor for high-speed vision

    NASA Astrophysics Data System (ADS)

    Yang, Jie; Shi, Cong; Long, Xitian; Wu, Nanjian

    2013-08-01

    In this paper we present a programmable computational image sensor for high-speed vision. This computational image sensor contains four main blocks: an image pixel array, a massively parallel processing element (PE) array, a row processor (RP) array and a RISC core. The pixel-parallel PE array is responsible for transferring, storing and processing raw image data in a SIMD fashion with its own programming language. The RPs form a one-dimensional array of simplified RISC cores that can carry out complex arithmetic and logic operations. The PE array and RP array can complete a great amount of computation in few instruction cycles and therefore satisfy low- and middle-level high-speed image processing requirements. The RISC core controls the whole system operation and executes some high-level image processing algorithms. We utilize a simplified AHB bus as the system bus to connect our major components. A programming language and corresponding tool chain for this computational image sensor are also developed.

  14. Ultralow percolation threshold of single walled carbon nanotube-epoxy composites synthesized via an ionic liquid dispersant/initiator

    NASA Astrophysics Data System (ADS)

    Watters, Arianna L.; Palmese, Giuseppe R.

    2014-09-01

    Uniform dispersion of single walled carbon nanotubes (SWNTs) in an epoxy was achieved by a streamlined mechano-chemical processing method. SWNT-epoxy composites were synthesized using a room temperature ionic liquid (IL) with an imidazolium cation and dicyanamide anion. The novel approach of using an ionic liquid that behaves as both a dispersant for SWNTs and an initiator for epoxy polymerization greatly simplifies nanocomposite synthesis. The material was processed using simple and scalable three-roll milling. The SWNT dispersion of the resultant composite was evaluated by electron microscopy and electrical conductivity measurements in conjunction with percolation theory. Processing conditions were optimized to achieve the lowest possible percolation threshold, 4.29 × 10^-5 volume fraction SWNTs. This percolation threshold is among the best reported in the literature, yet it was obtained using a streamlined method that greatly simplifies processing.
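
    Percolation analysis of conductivity data typically fits a power law of the form sigma ~ (phi - phi_c)^t above the threshold. The sketch below fits that law (in log space) to synthetic data with scipy; the data and fitted values are illustrative only, and the quoted threshold of 4.29 × 10^-5 comes from the paper, not from this fit.

    ```python
    # Estimate a percolation threshold phi_c from conductivity-vs-filler data.
    import numpy as np
    from scipy.optimize import curve_fit


    def log_percolation(phi, log_sigma0, phi_c, t):
        # log(sigma) = log(sigma0) + t * log(phi - phi_c) above the threshold
        return log_sigma0 + t * np.log(np.clip(phi - phi_c, 1e-12, None))


    # Synthetic conductivity data (S/m) versus SWNT volume fraction.
    rng = np.random.default_rng(1)
    phi = np.array([1e-4, 3e-4, 1e-3, 3e-3, 1e-2])
    sigma = 1e3 * (phi - 5e-5) ** 2.0 * (1.0 + 0.05 * rng.normal(size=phi.size))

    popt, _ = curve_fit(log_percolation, phi, np.log(sigma), p0=[np.log(1e3), 1e-5, 2.0])
    print("estimated threshold (volume fraction):", popt[1])
    print("estimated exponent t:", popt[2])
    ```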

  15. International Conference on the Methods of Aerophysical Research 98 "ICMAR 98". Proceedings, Part 1

    DTIC Science & Technology

    1998-01-01

    pumping air through the device and air-drying due to vapour condensation on cooled surfaces. In this report, approximate estimates are presented... picture is used for the flow field between disks and for water vapor condensation on cooled moving surfaces. Shown in Fig. 1 is a simplified flow... frequency of disk rotation), thus breaking away from channel walls. Regarding the condensation process, a number of the usual simplifying assumptions is made

  16. An Image Understanding Environment for DARPA Supported Research and Applications, Second Annual Report

    DTIC Science & Technology

    1992-05-01

    relatively independent of the... [extraction fragment, list of GUI object types: Basic Objects; Support Objects; GUI Access Objects; Displays; Display Mapping; Menus; Pixel; Snapshot; Gizmos/Widgets] ...a user interactively or set from some gizmo/widget, or that a particular browser field is to be updated when some state occurs or a process completes... also want to distinguish tree graph browsers. 4.3.2 Simplified access to GUI objects: Gizmos and Widgets: The IUE should provide simplified

  17. ATLAS TDAQ System Administration: Master of Puppets

    NASA Astrophysics Data System (ADS)

    Ballestrero, S.; Brasolin, F.; Fazio, D.; Gament, C.; Lee, C. J.; Scannicchio, D. A.; Twomey, M. S.

    2017-10-01

    Within the ATLAS detector, the Trigger and Data Acquisition system is responsible for the online processing of data streamed from the detector during collisions at the Large Hadron Collider at CERN. The online farm is comprised of ∼4000 servers processing the data read out from ∼100 million detector channels through multiple trigger levels. The configuration of these servers is not an easy task, especially since the detector itself is made up of multiple different sub-detectors, each with their own particular requirements. The previous method of configuring these servers, using Quattor and a hierarchical script system, was cumbersome and restrictive. A better, unified system was therefore required to simplify the tasks of the TDAQ Systems Administrators, for both the local and net-booted systems, and to be able to fulfil the requirements of TDAQ, Detector Control Systems and the sub-detector groups. Various configuration management systems were evaluated, though in the end, Puppet was chosen as the application of choice and was the first such implementation at CERN.

  18. A Statistical Representation of Pyrotechnic Igniter Output

    NASA Astrophysics Data System (ADS)

    Guo, Shuyue; Cooper, Marcia

    2017-06-01

    The output of simplified pyrotechnic igniters for research investigations is statistically characterized by monitoring the post-ignition external flow field with Schlieren imaging. Unique to this work is a detailed quantification of all measurable manufacturing parameters (e.g., bridgewire length, charge cavity dimensions, powder bed density) and associated shock-motion variability in the tested igniters. To demonstrate experimental precision of the recorded Schlieren images and developed image processing methodologies, commercial exploding bridgewires using wires of different parameters were tested. Finally, a statistically-significant population of manufactured igniters were tested within the Schlieren arrangement resulting in a characterization of the nominal output. Comparisons between the variances measured throughout the manufacturing processes and the calculated output variance provide insight into the critical device phenomena that dominate performance. Sandia National Laboratories is a multi-mission laboratory managed and operated by Sandia Corporation, a wholly owned subsidiary of Lockheed Martin Corporation, for the U.S. Department of Energy's NNSA under contract DE-AC04-94AL85000.

  19. Application of indoor noise prediction in the real world

    NASA Astrophysics Data System (ADS)

    Lewis, David N.

    2002-11-01

    Predicting indoor noise in industrial workrooms is an important part of the process of designing industrial plants. Predicted levels are used in the design process to determine compliance with occupational-noise regulations, and to estimate levels inside the walls in order to predict community noise radiated from the building. Once predicted levels are known, noise-control strategies can be developed. In this paper an overview of over 20 years of experience is given with the use of various prediction approaches to manage noise in Unilever plants. This work has applied empirical and ray-tracing approaches separately, and in combination, to design various packaging and production plants and other facilities. The advantages of prediction methods in general, and of the various approaches in particular, will be discussed. A case-study application of prediction methods to the optimization of noise-control measures in a food-packaging plant will be presented. Plans to acquire a simplified prediction model for use as a company noise-screening tool will be discussed.

  20. Application of densification process in organic waste management.

    PubMed

    Zafari, Abedin; Kianmehr, Mohammad Hossein

    2013-07-01

    Densification of biomass material that usually has a low density is a good way of increasing density, reducing the cost of transportation, and simplifying the storage and distribution of this material. The current study was conducted to investigate the influence of raw material parameters (moisture content and particle size), and densification process parameters (piston speed and die length) on the density and durability of pellets from compost manure. A hydraulic press and a single pelleter were used to produce pellets in controlled conditions. Ground biomass samples were compressed with three levels of moisture content [35%, 40% and 45% (wet basis)], piston speed (2, 6 and 10 mm/s), die length (8, 10 and 12 mm) and particle size (0.3, 0.9 and 1.5 mm) to establish the density and durability of pellets. A response surface methodology based on the Box-Behnken design was used to study the response patterns and to understand the influence of the parameters. The results revealed that all independent variables have significant (P < 0.01) effects on the studied responses in this research.
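
    A response-surface analysis of this kind reduces to fitting a second-order polynomial in the coded factors to the measured response. The sketch below does this with a numpy least-squares fit on synthetic data (interaction terms omitted); the factor levels and coefficients are placeholders, not the study's Box-Behnken runs.

    ```python
    # Fit a quadratic response surface: density ~ intercept + linear + squared terms.
    import numpy as np

    rng = np.random.default_rng(0)
    n = 27
    # Coded levels (-1, 0, +1) for four factors: moisture, piston speed,
    # die length, particle size.
    X = rng.choice([-1.0, 0.0, 1.0], size=(n, 4))
    density = 1200 + 50 * X[:, 0] - 30 * X[:, 2] - 20 * X[:, 0] ** 2 + rng.normal(0, 5, n)

    # Design matrix with intercept, linear and squared terms (no interactions).
    A = np.column_stack([np.ones(n), X, X ** 2])
    coef, *_ = np.linalg.lstsq(A, density, rcond=None)
    print("linear effects:   ", coef[1:5])
    print("quadratic effects:", coef[5:9])
    ```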

  1. Advanced wastewater treatment simplified through research

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Souther, R.H.

    A waste water treatment plant was built based on results of a small-scale pilot plant study, conducted largely in a search for efficiency as well as economy. Results were that 98 percent of carbonaceous BOD (BOD-C) and nitrogenous BOD (BOD-N) were removed in a simplified, low-cost, single-stage advanced treatment process, surpassing even some of the most sophisticated advanced complex waste treatment methods. The single-stage process treats domestic waste alone or combined with very high amounts of textile, electroplating, chemical, food, and other processing industrial wastewater. The process removed 100 percent of the sulfides, above 98 percent of NH3-N, and over 90 percent of COD and phenols; chromium was converted from highly toxic hexavalent Cr(VI) to nearly nontoxic trivalent chrome (Cr(III)). A pH up to 12 may be tolerated if no free hydroxyl (OH) ions are present. Equalization ponds, primary settling tanks, trickling filters, extra nitrogen removal tanks, carbon columns, and chemical treatment are not required. Color removal is excellent, with clear effluent suitable for recycling after chlorination to water supply lakes. The construction cost of the single-stage advanced treatment plant is surprisingly low, about 1/2 to 1/6 as much as most conventional ineffective complex plants. This simplified, innovative process developed in independent research at Guilford College is considered by some a breakthrough in waste treatment efficiency and economy. (MU)

  2. Spontaneous Intracranial Hypotension: A Review and Introduction of an Algorithm For Management.

    PubMed

    Davidson, Benjamin; Nassiri, Farshad; Mansouri, Alireza; Badhiwala, Jetan H; Witiw, Christopher D; Shamji, Mohammed F; Peng, Philip W; Farb, Richard I; Bernstein, Mark

    2017-05-01

    Spontaneous intracranial hypotension (SIH) is a condition of low cerebrospinal fluid volume and pressure caused by a leak of cerebrospinal fluid through a dural defect. Diagnosis and management can be difficult, often requiring coordination between multiple disciplines for myelography, blood patching, and possible surgical repair. Patients should be monitored closely, because they can deteriorate into a coma or even death. There are no widely accepted guidelines for the management of SIH. We review the existing SIH literature, illustrate management challenges via a case review, and propose an algorithm developed by neurosurgeons, radiologists, and anesthesiologists intended to simplify and streamline the management of SIH. Copyright © 2017 Elsevier Inc. All rights reserved.

  3. CRAB3: Establishing a new generation of services for distributed analysis at CMS

    NASA Astrophysics Data System (ADS)

    Cinquilli, M.; Spiga, D.; Grandi, C.; Hernàndez, J. M.; Konstantinov, P.; Mascheroni, M.; Riahi, H.; Vaandering, E.

    2012-12-01

    In CMS Computing, the highest priorities for analysis tools are the improvement of the end users’ ability to produce and publish reliable samples and analysis results as well as a transition to a sustainable development and operations model. To achieve these goals, CMS decided to incorporate analysis processing into the same framework as data and simulation processing. This strategy foresees that all workload tools (Tier0, Tier1, production, analysis) share a common core with long term maintainability as well as the standardization of the operator interfaces. The re-engineered analysis workload manager, called CRAB3, makes use of newer technologies, such as RESTful web services and NoSQL databases, aiming to increase the scalability and reliability of the system. As opposed to CRAB2, in CRAB3 all work is centrally injected and managed in a global queue. A pool of agents, which can be geographically distributed, consumes work from the central services serving the user tasks. The new architecture of CRAB substantially changes the deployment model and operations activities. In this paper we present the implementation of CRAB3, emphasizing how the new architecture improves the workflow automation and simplifies maintainability. In particular, we will highlight the impact of the new design on daily operations.

  4. Using NERSC High-Performance Computing (HPC) systems for high-energy nuclear physics applications with ALICE

    NASA Astrophysics Data System (ADS)

    Fasel, Markus

    2016-10-01

    High-Performance Computing Systems are powerful tools tailored to support large-scale applications that rely on low-latency inter-process communications to run efficiently. By design, these systems often impose constraints on application workflows, such as limited external network connectivity and whole node scheduling, that make more general-purpose computing tasks, such as those commonly found in high-energy nuclear physics applications, more difficult to carry out. In this work, we present a tool designed to simplify access to such complicated environments by handling the common tasks of job submission, software management, and local data management, in a framework that is easily adaptable to the specific requirements of various computing systems. The tool, initially constructed to process stand-alone ALICE simulations for detector and software development, was successfully deployed on the NERSC computing systems, Carver, Hopper and Edison, and is being configured to provide access to the next generation NERSC system, Cori. In this report, we describe the tool and discuss our experience running ALICE applications on NERSC HPC systems. The discussion will include our initial benchmarks of Cori compared to other systems and our attempts to leverage the new capabilities offered with Cori to support data-intensive applications, with a future goal of full integration of such systems into ALICE grid operations.

  5. Managing seafood processing wastewater on the Oregon coast: A time of transition

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Anderson, M.D.; Miner, J.R.

    1997-12-01

    Seafood processors along the Oregon coast practice a wastewater management plan that is unique within the state. Most of these operations discharge wastewater under a General Permit issued by the Oregon Department of Environmental Quality (DEQ) that requires only that they screen the wastewater to remove particles that will not pass through a 40 mesh screen. The General Permit was issued in February of 1992 and was scheduled to expire at the end of December, 1996. It has been extended until a replacement is adopted. Alternatives are currently under consideration by the DEQ. A second issue is the increasing competition for water within the coastal communities that are experiencing a growing tourist industry and a static water supply. Tourism and seafood processing both have their peak water demands during the summer months when fresh water supplies are most limited. Disposal of solid wastes has been simplified for many of the processors along the Lower Columbia River by a Fisheries Enhancement Program which allows processors to grind the solid waste then to discharge it into the stream under appropriate tidal conditions. There is no data which indicates water quality damage from this practice nor is there clear evidence of enhanced fishery productivity.

  6. X-33/RLV System Health Management/Vehicle Health Management

    NASA Technical Reports Server (NTRS)

    Mouyos, William; Wangu, Srimal

    1998-01-01

    To reduce operations costs, Reusable Launch Vehicles (RLVs) must include highly reliable robust subsystems which are designed for simple repair access with a simplified servicing infrastructure, and which incorporate expedited decision-making about faults and anomalies. A key component for the Single Stage To Orbit (SSTO) RLV system used to meet these objectives is System Health Management (SHM). SHM incorporates Vehicle Health Management (VHM), ground processing associated with the vehicle fleet (GVHM), and Ground Infrastructure Health Management (GIHM). The primary objective of SHM is to provide an automated and paperless health decision, maintenance, and logistics system. Sanders, a Lockheed Martin Company, is leading the design, development, and integration of the SHM system for RLV and for X-33 (a sub-scale, sub-orbit Advanced Technology Demonstrator). Many critical technologies are necessary to make SHM (and more specifically VHM) practical, reliable, and cost effective. This paper will present the X-33 SHM design which forms the baseline for the RLV SHM, and it will discuss applications of advanced technologies to future RLVs. In addition, this paper will describe a Virtual Design Environment (VDE) which is being developed for RLV. This VDE will allow system design engineering teams, as well as program management teams, to accurately and efficiently evaluate system designs, analyze the behavior of current systems, and predict the feasibility of making smooth and cost-efficient transitions from older technologies to newer ones. The RLV SHM design methodology will reduce program costs, decrease total program life-cycle time, and ultimately increase mission success.

  7. Landscape moderation of biodiversity patterns and processes - eight hypotheses.

    PubMed

    Tscharntke, Teja; Tylianakis, Jason M; Rand, Tatyana A; Didham, Raphael K; Fahrig, Lenore; Batáry, Péter; Bengtsson, Janne; Clough, Yann; Crist, Thomas O; Dormann, Carsten F; Ewers, Robert M; Fründ, Jochen; Holt, Robert D; Holzschuh, Andrea; Klein, Alexandra M; Kleijn, David; Kremen, Claire; Landis, Doug A; Laurance, William; Lindenmayer, David; Scherber, Christoph; Sodhi, Navjot; Steffan-Dewenter, Ingolf; Thies, Carsten; van der Putten, Wim H; Westphal, Catrin

    2012-08-01

    Understanding how landscape characteristics affect biodiversity patterns and ecological processes at local and landscape scales is critical for mitigating effects of global environmental change. In this review, we use knowledge gained from human-modified landscapes to suggest eight hypotheses, which we hope will encourage more systematic research on the role of landscape composition and configuration in determining the structure of ecological communities, ecosystem functioning and services. We organize the eight hypotheses under four overarching themes. Section A: 'landscape moderation of biodiversity patterns' includes (1) the landscape species pool hypothesis-the size of the landscape-wide species pool moderates local (alpha) biodiversity, and (2) the dominance of beta diversity hypothesis-landscape-moderated dissimilarity of local communities determines landscape-wide biodiversity and overrides negative local effects of habitat fragmentation on biodiversity. Section B: 'landscape moderation of population dynamics' includes (3) the cross-habitat spillover hypothesis-landscape-moderated spillover of energy, resources and organisms across habitats, including between managed and natural ecosystems, influences landscape-wide community structure and associated processes and (4) the landscape-moderated concentration and dilution hypothesis-spatial and temporal changes in landscape composition can cause transient concentration or dilution of populations with functional consequences. Section C: 'landscape moderation of functional trait selection' includes (5) the landscape-moderated functional trait selection hypothesis-landscape moderation of species trait selection shapes the functional role and trajectory of community assembly, and (6) the landscape-moderated insurance hypothesis-landscape complexity provides spatial and temporal insurance, i.e. high resilience and stability of ecological processes in changing environments. Section D: 'landscape constraints on conservation management' includes (7) the intermediate landscape-complexity hypothesis-landscape-moderated effectiveness of local conservation management is highest in structurally simple, rather than in cleared (i.e. extremely simplified) or in complex landscapes, and (8) the landscape-moderated biodiversity versus ecosystem service management hypothesis-landscape-moderated biodiversity conservation to optimize functional diversity and related ecosystem services will not protect endangered species. Shifting our research focus from local to landscape-moderated effects on biodiversity will be critical to developing solutions for future biodiversity and ecosystem service management. © 2012 The Authors. Biological Reviews © 2012 Cambridge Philosophical Society.

  8. Challenges and potential improvements in the admission process of patients with spinal cord injury in a specialized rehabilitation clinic - an interview based qualitative study of an interdisciplinary team.

    PubMed

    Röthlisberger, Fabian; Boes, Stefan; Rubinelli, Sara; Schmitt, Klaus; Scheel-Sailer, Anke

    2017-06-26

    The admission process of patients to a hospital is the starting point for inpatient services. In order to optimize the quality of the health services provision, one needs a good understanding of the patient admission workflow in a clinic. The aim of this study was to identify challenges and potential improvements in the admission process of spinal cord injury patients at a specialized rehabilitation clinic from the perspective of an interdisciplinary team of health professionals. Semi-structured interviews with eight health professionals (medical doctors, physical therapists, occupational therapists, nurses) at the Swiss Paraplegic Centre (acute and rehabilitation clinic) were conducted based on a maximum variety purposive sampling strategy. The interviews were analyzed using a thematic analysis approach. The interviewees described the challenges and potential improvements in this admission process, focusing on five themes. First, the characteristics of the patient with his/her health condition and personality and his/her family influence different areas in the admission process. Improvements in the exchange of information between the hospital and the patient could speed up and simplify the admission process. In addition, challenges and potential improvements were found concerning the rehabilitation planning, the organization of the admission process and the interdisciplinary work. This study identified five themes of challenges and potential improvements in the admission process of spinal cord injury patients at a specialized rehabilitation clinic. When planning adaptations of process steps in one of the areas, awareness of effects in other fields is necessary. Improved pre-admission information would be a first important step to optimize the admission process. A common IT-system providing an interdisciplinary overview and possibilities for interdisciplinary exchange would support the management of the admission process. Managers of other hospitals can supplement the results of this study with their own process analyses, to improve their own patient admission processes.

  9. Development of a global aerosol model using a two-dimensional sectional method: 1. Model design

    NASA Astrophysics Data System (ADS)

    Matsui, H.

    2017-08-01

    This study develops an aerosol module, the Aerosol Two-dimensional bin module for foRmation and Aging Simulation version 2 (ATRAS2), and implements the module into a global climate model, Community Atmosphere Model. The ATRAS2 module uses a two-dimensional (2-D) sectional representation with 12 size bins for particles from 1 nm to 10 μm in dry diameter and 8 black carbon (BC) mixing state bins. The module can explicitly calculate the enhancement of absorption and cloud condensation nuclei activity of BC-containing particles by aging processes. The ATRAS2 module is an extension of a 2-D sectional aerosol module ATRAS used in our previous studies within a framework of a regional three-dimensional model. Compared with ATRAS, the computational cost of the aerosol module is reduced by more than a factor of 10 by simplifying the treatment of aerosol processes and 2-D sectional representation, while maintaining good accuracy of aerosol parameters in the simulations. Aerosol processes are simplified for condensation of sulfate, ammonium, and nitrate, organic aerosol formation, coagulation, and new particle formation processes, and box model simulations show that these simplifications do not substantially change the predicted aerosol number and mass concentrations and their mixing states. The 2-D sectional representation is simplified (the number of advected species is reduced) primarily by the treatment of chemical compositions using two interactive bin representations. The simplifications do not change the accuracy of global aerosol simulations. In part 2, comparisons with measurements and the results focused on aerosol processes such as BC aging processes are shown.
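
    The two-dimensional sectional representation can be pictured as a 12 × 8 array of number concentrations indexed by dry-size bin and black-carbon mixing-state bin. The numpy sketch below is only an illustration of that data layout with made-up values; it is not the ATRAS2/CAM implementation.

    ```python
    # Illustrative layout of a 2-D sectional aerosol representation:
    # 12 dry-size bins (1 nm to 10 um) by 8 BC mixing-state bins.
    import numpy as np

    n_size_bins, n_bc_bins = 12, 8

    # Logarithmically spaced dry-diameter bin edges from 1 nm to 10 um (in metres).
    size_edges = np.logspace(np.log10(1e-9), np.log10(10e-6), n_size_bins + 1)

    # BC mass-fraction bin edges from 0 (BC-free) to 1 (pure BC).
    bc_edges = np.linspace(0.0, 1.0, n_bc_bins + 1)

    # Number concentration per (size, mixing-state) bin, e.g. in cm^-3.
    number_conc = np.zeros((n_size_bins, n_bc_bins))
    number_conc[3, 0] = 500.0   # example: 500 cm^-3 of BC-free particles in size bin 4

    print(size_edges[0], size_edges[-1])   # ~1e-09 and ~1e-05 m
    print(number_conc.sum())
    ```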

  10. Modeling and characterization of supercapacitors for wireless sensor network applications

    NASA Astrophysics Data System (ADS)

    Zhang, Ying; Yang, Hengzhao

    A simple circuit model is developed to describe supercapacitor behavior, which uses two resistor-capacitor branches with different time constants to characterize the charging and redistribution processes, and a variable leakage resistance to characterize the self-discharge process. The parameter values of a supercapacitor can be determined by a charging-redistribution experiment and a self-discharge experiment. The modeling and characterization procedures are illustrated using a 22F supercapacitor. The accuracy of the model is compared with that of other models often used in power electronics applications. The results show that the proposed model has better accuracy in characterizing the self-discharge process while maintaining similar performance as other models during charging and redistribution processes. Additionally, the proposed model is evaluated in a simplified energy storage system for self-powered wireless sensors. The model performance is compared with that of a commonly used energy recursive equation (ERE) model. The results demonstrate that the proposed model can predict the evolution profile of voltage across the supercapacitor more accurately than the ERE model, and therefore provides a better alternative for supporting research on storage system design and power management for wireless sensor networks.
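
    The model structure described above (two resistor-capacitor branches plus a leakage path) can be sketched with a simple explicit time-stepping loop. The parameter values below are illustrative and are not the fitted values for the 22 F device.

    ```python
    # Two-branch RC model with leakage: a fast branch dominates charging, a slow
    # branch captures charge redistribution, and Rleak drives self-discharge.
    R1, C1 = 0.05, 20.0      # fast branch (ohm, farad)
    R2, C2 = 50.0, 2.0       # slow branch
    Rleak = 5000.0           # leakage resistance (ohm)
    V_source, R_source = 2.5, 0.1

    dt, t_end = 0.01, 600.0
    v1 = v2 = 0.0
    for step in range(int(t_end / dt)):
        t = step * dt
        charging = t < 60.0                      # charge for 60 s, then open circuit
        i_source = (V_source - v1) / (R_source + R1) if charging else 0.0
        i_redist = (v1 - v2) / R2                # redistribution between branches
        i_leak = v1 / Rleak                      # self-discharge
        v1 += dt * (i_source - i_redist - i_leak) / C1
        v2 += dt * i_redist / C2

    print(round(v1, 3), round(v2, 3))            # voltages after redistribution/self-discharge
    ```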

  11. Modular modeling system for building distributed hydrologic models with a user-friendly software package

    NASA Astrophysics Data System (ADS)

    Wi, S.; Ray, P. A.; Brown, C.

    2015-12-01

    A software package developed to facilitate building distributed hydrologic models in a modular modeling system is presented. The software package provides a user-friendly graphical user interface that eases its practical use in water resources-related research and practice. The modular modeling system organizes the options available to users when assembling models according to the stages of hydrological cycle, such as potential evapotranspiration, soil moisture accounting, and snow/glacier melting processes. The software is intended to be a comprehensive tool that simplifies the task of developing, calibrating, validating, and using hydrologic models through the inclusion of intelligent automation to minimize user effort, and reduce opportunities for error. Processes so far automated include the definition of system boundaries (i.e., watershed delineation), climate and geographical input generation, and parameter calibration. Built-in post-processing toolkits greatly improve the functionality of the software as a decision support tool for water resources system management and planning. Example post-processing toolkits enable streamflow simulation at ungauged sites with predefined model parameters, and perform climate change risk assessment by means of the decision scaling approach. The software is validated through application to watersheds representing a variety of hydrologic regimes.

  12. A systematic risk management approach employed on the CloudSat project

    NASA Technical Reports Server (NTRS)

    Basilio, R. R.; Plourde, K. S.; Lam, T.

    2000-01-01

    The CloudSat Project has developed a simplified approach for fault tree analysis and probabilistic risk assessment. A system-level fault tree has been constructed to identify credible fault scenarios and failure modes leading up to a potential failure to meet the nominal mission success criteria.
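
    Evaluating such a fault tree reduces to combining basic-event probabilities through AND and OR gates, assuming independent events. The sketch below shows the arithmetic with hypothetical events and numbers; it is not the CloudSat fault tree.

    ```python
    # Minimal fault-tree evaluation assuming independent basic events:
    # OR gates combine as 1 - prod(1 - p), AND gates as prod(p).
    from functools import reduce


    def p_or(*probs):
        return 1.0 - reduce(lambda acc, p: acc * (1.0 - p), probs, 1.0)


    def p_and(*probs):
        return reduce(lambda acc, p: acc * p, probs, 1.0)


    # Hypothetical basic-event probabilities over the mission.
    p_radar_electronics = 1e-3
    p_thermal_control = 5e-4
    p_primary_battery = 2e-3
    p_backup_battery = 2e-3

    # Power fails only if both batteries fail (AND); the top event occurs if any
    # top-level subsystem fails (OR).
    p_power = p_and(p_primary_battery, p_backup_battery)
    p_top_event = p_or(p_radar_electronics, p_thermal_control, p_power)
    print(f"{p_top_event:.2e}")
    ```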

  13. Simplified Techniques for Evaluation and Interpretation of Pavement Deflections for Network-level Analysis : Guide for Assessment of Pavement Structure Performance for PMS Applications

    DOT National Transportation Integrated Search

    2012-06-01

    The objective of this study was to develop an approach for incorporating techniques to interpret and evaluate deflection : data for network-level pavement management system (PMS) applications. The first part of this research focused on : identifying ...

  14. How Plain English Works for Business: Twelve Case Studies.

    ERIC Educational Resources Information Center

    Office of Consumer Affairs, Washington, DC.

    Detailing the false starts, uncertainty, and internal questioning that occur as companies organize and manage language simplification projects, the 12 case studies contained in the two sections of this book reveal how some business organizations have benefited by simplifying consumer documents. Descriptions of each case contain information on the…

  15. An Analysis of the Effect of Knowledge Management on the Execution of Simplified Acquisition Procedures

    DTIC Science & Technology

    2012-12-27

    [Extraction fragment, acronym list: ...of Work; UCC - Uniform Commercial Code; USD(AT&L) - Under Secretary of Defense for Acquisition, Technology, and Logistics; WBS - Work Breakdown Structure] ...intensive career field. The FAR, the DFARS, and other federal agency supplements of the FAR, the Uniform Commercial Code (UCC), installation guidelines

  16. A vegetation classification system for use in California: its conceptual basis

    Treesearch

    Timothy E. Paysen; Jeanine A. Derby; C. Eugene Conrad

    1982-01-01

    A taxonomic Vegetation Classification System proposed for use in California is designed to simplify interdisciplinary communication about vegetation. The system structure is an aggregative plant community hierarchy at four levels of precision--the Association, Series, Subformation, and Formation. A flexible Phase category links specific resource management concerns to...

  17. Groundwater withdrawals under drought: reconciling GRACE and land surface models in the United States High Plains Aquifer

    USDA-ARS?s Scientific Manuscript database

    Advanced Land Surface Models (LSM) offer a powerful tool for studying hydrological variability. Highly managed systems, however, present a challenge for these models, which typically have simplified or incomplete representations of human water use. Here we examine recent groundwater declines in the ...

  18. Molecular dynamics of conformational substates for a simplified protein model

    NASA Astrophysics Data System (ADS)

    Grubmüller, Helmut; Tavan, Paul

    1994-09-01

    Extended molecular dynamics simulations covering a total of 0.232 μs have been carried out on a simplified protein model. Despite its simplified structure, that model exhibits properties similar to those of more realistic protein models. In particular, the model was found to undergo transitions between conformational substates at a time scale of several hundred picoseconds. The computed trajectories turned out to be sufficiently long as to permit a statistical analysis of that conformational dynamics. To check whether effective descriptions neglecting memory effects can reproduce the observed conformational dynamics, two stochastic models were studied. A one-dimensional Langevin effective potential model derived by elimination of subpicosecond dynamical processes could not describe the observed conformational transition rates. In contrast, a simple Markov model describing the transitions between but neglecting dynamical processes within conformational substates reproduced the observed distribution of first passage times. These findings suggest, that protein dynamics generally does not exhibit memory effects at time scales above a few hundred picoseconds, but confirms the existence of memory effects at a picosecond time scale.
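
    The Markov description of substate transitions implies exponentially distributed first passage times. The sketch below simulates a memoryless two-state chain with an illustrative transition rate on the few-hundred-picosecond scale mentioned above and checks that the mean and standard deviation of the first passage times roughly coincide, as expected for an exponential distribution.

    ```python
    # Memoryless (Markov) transitions between two conformational substates.
    import numpy as np

    rng = np.random.default_rng(0)
    k_ab = 1.0 / 300.0          # transition rate A -> B, per picosecond (illustrative)
    dt = 1.0                    # time step (ps)
    n_runs = 2000

    first_passage = []
    for _ in range(n_runs):
        t = 0.0
        while rng.random() > k_ab * dt:   # stay in substate A this step
            t += dt
        first_passage.append(t)

    first_passage = np.array(first_passage)
    print("mean first passage time (ps):", first_passage.mean())   # ~300 ps
    print("std  first passage time (ps):", first_passage.std())    # ~mean for an exponential
    ```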

  19. A Simplified Program Needs Assessment Process.

    ERIC Educational Resources Information Center

    Clark, Larry

    A rationale, background information, and a discussion of methodology are presented for a needs assessment process intended for pilot implementation at Western Piedmont Community College (WPCC). This process was designed to assess the local need for paraprofessional programs in the Human Services area, i.e., Early Childhood Associate, Mental Health…

  20. The Complexity of Developmental Predictions from Dual Process Models

    ERIC Educational Resources Information Center

    Stanovich, Keith E.; West, Richard F.; Toplak, Maggie E.

    2011-01-01

    Drawing developmental predictions from dual-process theories is more complex than is commonly realized. Overly simplified predictions drawn from such models may lead to premature rejection of the dual process approach as one of many tools for understanding cognitive development. Misleading predictions can be avoided by paying attention to several…

  1. Applying the Theory of Constraints to a Base Civil Engineering Operations Branch

    DTIC Science & Technology

    1991-09-01

    [Extraction fragment, list of figures: Figure 1 - Typical Work Order Processing; Figure 2 - Typical Job Order Processing; Figure 3 - Typical Simplified In-Service Work Plan for...] [Figure 1 residue, elements of a typical work order process: Customer Request; Service Planning Unit; Production Control Center; Material Control; Scheduling; CE Shops]

  2. Simulation of Simple Controlled Processes with Dead-Time.

    ERIC Educational Resources Information Center

    Watson, Keith R.; And Others

    1985-01-01

    The determination of closed-loop response of processes containing dead-time is typically not covered in undergraduate process control, possibly because the solution by Laplace transforms requires the use of Pade approximation for dead-time, which makes the procedure lengthy and tedious. A computer-aided method is described which simplifies the…
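
    The computer-aided idea can be illustrated by approximating the dead time with a first-order Pade term, forming the closed-loop transfer function under proportional control, and computing the step response numerically. The process and controller parameters below are illustrative, and this scipy sketch is not the software referred to in the abstract.

    ```python
    # Closed-loop step response of a first-order-plus-dead-time process under
    # proportional control, with the dead time replaced by a first-order Pade term.
    import numpy as np
    from scipy import signal

    K, tau, theta = 1.0, 5.0, 2.0   # process gain, time constant, dead time
    Kc = 1.5                        # proportional controller gain

    # First-order Pade approximation: e^(-theta*s) ~ (1 - theta*s/2)/(1 + theta*s/2)
    pade_num = np.array([-theta / 2.0, 1.0])
    pade_den = np.array([theta / 2.0, 1.0])

    # Open loop: L(s) = Kc * K * pade_num / ((tau*s + 1) * pade_den)
    num_L = Kc * K * pade_num
    den_L = np.polymul(np.array([tau, 1.0]), pade_den)

    # Closed loop: T(s) = L / (1 + L)
    num_T = num_L
    den_T = np.polyadd(den_L, num_L)

    t, y = signal.step(signal.TransferFunction(num_T, den_T), N=500)
    print("final value:", y[-1])    # Kc*K/(1 + Kc*K) = 0.6 for these numbers
    ```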

  3. An isothermal based recombinase polymerase amplification assay for rapid, sensitive and robust indexing of citrus yellow mosaic virus.

    PubMed

    Kumar, P V; Sharma, S K; Rishi, N; Ghosh, D K; Baranwal, V K

    Management of viral diseases relies on definite and sensitive detection methods. Citrus yellow mosaic virus (CYMV), a double stranded DNA virus of the genus Badnavirus, causes yellow mosaic disease in citrus plants. CYMV is transmitted through budwood and requires a robust and simplified indexing protocol for budwood certification programme. The present study reports development and standardization of an isothermal based recombinase polymerase amplification (RPA) assay for a sensitive, rapid, easy, and cost-effective method for detection and diagnosis of CYMV. Two different oligonucleotide primer sets were designed from ORF III (coding for polyprotein) and ORF II (coding for virion associated protein) regions of CYMV to perform amplification assays. Comparative evaluation of RPA, PCR and immuno-capture recombinase polymerase amplification (IC-RPA) based assays were done using purified DNA and plant crude sap. CYMV infection was efficiently detected from the crude sap in RPA and IC-RPA assays. The primer set used in RPA was specific and did not show any cross-amplification with banana streak MY virus (BSMYV), another Badnavirus species. The results from the present study indicated that RPA assay can be used easily in routine indexing of citrus planting material. To the best of our knowledge, this is the first report on development of a rapid and simplified isothermal detection assay for CYMV and can be utilized as an effective technique in quarantine and budwood certification process.

  4. Emotional valence of stimuli modulates false recognition: Using a modified version of the simplified conjoint recognition paradigm.

    PubMed

    Gong, Xianmin; Xiao, Hongrui; Wang, Dahua

    2016-11-01

    False recognition results from the interplay of multiple cognitive processes, including verbatim memory, gist memory, phantom recollection, and response bias. In the current study, we modified the simplified Conjoint Recognition (CR) paradigm to investigate the way in which the valence of emotional stimuli affects the cognitive process and behavioral outcome of false recognition. In Study 1, we examined the applicability of the modification to the simplified CR paradigm and model. Twenty-six undergraduate students (13 females, aged 21.00 ± 2.30 years) learned and recognized both the large and small categories of photo objects. The applicability of the paradigm and model was confirmed by a fair goodness-of-fit of the model to the observational data and by their competence in detecting the memory differences between the large- and small-category conditions. In Study 2, we recruited another sample of 29 undergraduate students (14 females, aged 22.60 ± 2.74 years) to learn and recognize the categories of photo objects that were emotionally provocative. The results showed that negative valence increased false recognition, particularly the rate of false "remember" responses, by facilitating phantom recollection; positive valence did not influence false recognition significantly, though it enhanced gist processing. Copyright © 2016 Elsevier B.V. All rights reserved.

  5. AutoDrug: fully automated macromolecular crystallography workflows for fragment-based drug discovery

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tsai, Yingssu; Stanford University, 333 Campus Drive, Mudd Building, Stanford, CA 94305-5080; McPhillips, Scott E.

    New software has been developed for automating the experimental and data-processing stages of fragment-based drug discovery at a macromolecular crystallography beamline. A new workflow-automation framework orchestrates beamline-control and data-analysis software while organizing results from multiple samples. AutoDrug is software based upon the scientific workflow paradigm that integrates the Stanford Synchrotron Radiation Lightsource macromolecular crystallography beamlines and third-party processing software to automate the crystallography steps of the fragment-based drug-discovery process. AutoDrug screens a cassette of fragment-soaked crystals, selects crystals for data collection based on screening results and user-specified criteria and determines optimal data-collection strategies. It then collects and processes diffraction data, performs molecular replacement using provided models and detects electron density that is likely to arise from bound fragments. All processes are fully automated, i.e. are performed without user interaction or supervision. Samples can be screened in groups corresponding to particular proteins, crystal forms and/or soaking conditions. A single AutoDrug run is only limited by the capacity of the sample-storage dewar at the beamline: currently 288 samples. AutoDrug was developed in conjunction with RestFlow, a new scientific workflow-automation framework. RestFlow simplifies the design of AutoDrug by managing the flow of data and the organization of results and by orchestrating the execution of computational pipeline steps. It also simplifies the execution and interaction of third-party programs and the beamline-control system. Modeling AutoDrug as a scientific workflow enables multiple variants that meet the requirements of different user groups to be developed and supported. A workflow tailored to mimic the crystallography stages comprising the drug-discovery pipeline of CoCrystal Discovery Inc. has been deployed and successfully demonstrated. This workflow was run once on the same 96 samples that the group had examined manually and the workflow cycled successfully through all of the samples, collected data from the same samples that were selected manually and located the same peaks of unmodeled density in the resulting difference Fourier maps.

  6. Computational modeling of the pressurization process in a NASP vehicle propellant tank experimental simulation

    NASA Technical Reports Server (NTRS)

    Sasmal, G. P.; Hochstein, J. I.; Wendl, M. C.; Hardy, T. L.

    1991-01-01

    A multidimensional computational model of the pressurization process in a slush hydrogen propellant storage tank was developed and its accuracy evaluated by comparison to experimental data measured for a 5 ft diameter spherical tank. The fluid mechanic, thermodynamic, and heat transfer processes within the ullage are represented by a finite-volume model. The model was shown to be in reasonable agreement with the experimental data. A parameter study was undertaken to examine the dependence of the pressurization process on initial ullage temperature distribution and pressurant mass flow rate. It is shown that for a given heat flux rate at the ullage boundary, the pressurization process is nearly independent of initial temperature distribution. Significant differences were identified between the ullage temperature and velocity fields predicted for pressurization of slush and those predicted for pressurization of liquid hydrogen. A simplified model of the pressurization process was constructed in search of a dimensionless characterization of the pressurization process. It is shown that the relationship derived from this simplified model collapses all of the pressure history data generated during this study into a single curve.

  7. Practical modeling approaches for geological storage of carbon dioxide.

    PubMed

    Celia, Michael A; Nordbotten, Jan M

    2009-01-01

    The relentless increase of anthropogenic carbon dioxide emissions and the associated concerns about climate change have motivated new ideas about carbon-constrained energy production. One technological approach to control carbon dioxide emissions is carbon capture and storage, or CCS. The underlying idea of CCS is to capture the carbon before it is emitted to the atmosphere and store it somewhere other than the atmosphere. Currently, the most attractive option for large-scale storage is in deep geological formations, including deep saline aquifers. Many physical and chemical processes can affect the fate of the injected CO2, with the overall mathematical description of the complete system becoming very complex. Our approach to the problem has been to reduce complexity as much as possible, so that we can focus on the few truly important questions about the injected CO2, most of which involve leakage out of the injection formation. Toward this end, we have established a set of simplifying assumptions that allow us to derive simplified models, which can be solved numerically or, for the most simplified cases, analytically. These simplified models allow calculation of solutions to large-scale injection and leakage problems in ways that traditional multicomponent multiphase simulators cannot. Such simplified models provide important tools for system analysis, screening calculations, and overall risk-assessment calculations. We believe this is a practical and important approach to model geological storage of carbon dioxide. It also serves as an example of how complex systems can be simplified while retaining the essential physics of the problem.
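
    As a flavour of the screening-level calculations such simplified models enable, the sketch below gives a purely volumetric estimate of the CO2 plume footprint, assuming an idealized cylindrical plume that fills the full formation thickness. It ignores buoyancy, two-phase flow and leakage, which the paper's models treat; all parameter values are illustrative.

```python
# Screening-level volumetric estimate of a CO2 plume footprint, assuming an
# idealized cylindrical plume that fills the full formation thickness.
# Illustrative only: buoyancy, two-phase flow and leakage are ignored.
import numpy as np

def plume_radius_m(injection_mt_per_yr=1.0, years=30.0, thickness_m=50.0,
                   porosity=0.15, co2_density=700.0, co2_saturation=0.6):
    """Radius (m) of an idealized cylindrical plume after a given injection time."""
    injected_kg = injection_mt_per_yr * 1.0e9 * years
    plume_volume_m3 = injected_kg / co2_density
    storage_per_area = thickness_m * porosity * co2_saturation   # m3 of CO2 per m2
    return np.sqrt(plume_volume_m3 / (np.pi * storage_per_area))

for t in (5, 10, 30):
    print(f"{t:>2} yr of injection -> plume radius ~ {plume_radius_m(years=t)/1000:.1f} km")
```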

  8. Simplifying the interaction between cognitive models and task environments with the JSON Network Interface.

    PubMed

    Hope, Ryan M; Schoelles, Michael J; Gray, Wayne D

    2014-12-01

    Process models of cognition, written in architectures such as ACT-R and EPIC, should be able to interact with the same software with which human subjects interact. By eliminating the need to simulate the experiment, this approach would simplify the modeler's effort, while ensuring that all steps required of the human are also required by the model. In practice, the difficulties of allowing one software system to interact with another present a significant barrier to any modeler who is not also skilled at this type of programming. The barrier increases if the programming language used by the modeling software differs from that used by the experimental software. The JSON Network Interface simplifies this problem for ACT-R modelers, and potentially, modelers using other systems.
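
    The idea of exchanging structured messages between a model and a task environment can be illustrated in a few lines of Python. The sketch below sends newline-delimited JSON over a TCP socket; the port and the message fields are hypothetical and do not reproduce the actual JSON Network Interface schema.

```python
# Minimal sketch of exchanging newline-delimited JSON between a cognitive model
# and a task environment over TCP. The port and the message fields ("event",
# "key") are hypothetical; the real interface defines its own message schema.
import json
import socket

def send_message(sock: socket.socket, message: dict) -> None:
    sock.sendall((json.dumps(message) + "\n").encode("utf-8"))

def receive_message(reader) -> dict:
    line = reader.readline()
    return json.loads(line) if line else {}

if __name__ == "__main__":
    # Assumes a task environment is already listening on localhost:9000.
    with socket.create_connection(("localhost", 9000)) as sock:
        reader = sock.makefile("r", encoding="utf-8")
        send_message(sock, {"event": "model-ready"})
        stimulus = receive_message(reader)              # e.g. a display update
        send_message(sock, {"event": "keypress", "key": "f"})
```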

  9. Simplified analysis and optimization of space base and space shuttle heat rejection systems

    NASA Technical Reports Server (NTRS)

    Wulff, W.

    1972-01-01

    A simplified radiator system analysis was performed to predict steady state radiator system performance. The system performance was found to be describable in terms of five non-dimensional system parameters. The governing differential equations are integrated numerically to yield the enthalpy rejection for the coolant fluid. The simplified analysis was extended to produce the derivatives of the coolant exit temperature with respect to the governing system parameters. A procedure was developed to find the optimum set of system parameters which yields the lowest possible coolant exit temperature for either a given projected area or a given total mass. The process can be inverted to yield either the minimum area or the minimum mass, together with the optimum geometry, for a specified heat rejection rate.
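
    A toy version of this kind of analysis is sketched below: a nondimensional coolant-temperature equation is integrated along the radiator and a single geometry parameter is searched for the lowest exit temperature under a fixed mass budget. The equation, the fin-effectiveness expression and the coefficients are illustrative stand-ins, not the five-parameter model of the report.

```python
# Toy illustration of the kind of calculation described above: integrate a
# nondimensional coolant-temperature equation along the radiator and search one
# geometry parameter for the lowest exit temperature under a fixed mass budget.
# Equation, fin-effectiveness expression and coefficients are illustrative only.
import numpy as np
from scipy.integrate import solve_ivp
from scipy.optimize import minimize_scalar

def exit_temperature(fin_width: float, total_mass: float = 1.0) -> float:
    """Nondimensional coolant exit temperature after radiating to space."""
    length = total_mass / (1.0 + 2.0 * fin_width)   # wider fins leave less mass for tube length
    k = np.tanh(fin_width)                          # fin effectiveness saturates with width
    sol = solve_ivp(lambda s, T: -k * T**4, (0.0, length), [1.0], rtol=1e-8)
    return float(sol.y[0, -1])

best = minimize_scalar(exit_temperature, bounds=(0.05, 5.0), method="bounded")
print(f"optimum fin width ~ {best.x:.2f}, exit temperature ~ {best.fun:.3f}")
```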

  10. Hydrologic Connectivity for Understanding Watershed Processes: Brand-new Puzzle or Emerging Panacea?

    NASA Astrophysics Data System (ADS)

    Ali, G. A.; Roy, A. G.; Tetzlaff, D.; Soulsby, C.; McDonnell, J. J.

    2011-12-01

    As a way to develop a more holistic approach to watershed assessment and management, the concept of hydrologic connectivity (HC) is often put at the forefront. HC can be seen as the strength of the water-mediated linkages between discrete units of the landscape and as such, it facilitates our intuitive understanding of the mechanisms driving runoff initiation and cessation. Much of the excitement surrounding HC is attributable to its potential to enhance our ability to gain insights into multiple areas including process dynamics, numerical model building, the effects of human elements in our landscape conceptualization, and the development of simplified watershed management tools. However, before such potential can be fully demonstrated, many issues must be resolved with regards to the measure of HC. Here we provide examples highlighting how connectivity can be useful towards understanding water routing in river basins, ecohydrological systems coupling, and intermittent rainfall-runoff dynamics. First, the use of connectivity metrics to examine the relative influence of surface/subsurface topography and soil characteristics on runoff generation will be discussed. Second, the effectiveness of using geochemical tracers will be examined with respect to identifying non-point runoff sources and linking hillslope-to-channel connectivity with surface water-groundwater exchanges in the biologically sensitive hyporheic zone. Third, the identification of different hydrologic thresholds will be presented as a way to discriminate the establishment of connectivity across a range of contrasted catchments located in Canada, Scotland, the USA, and Sweden. These examples will show that current challenges with regards to HC revolve around the choice of an accurate methodological framework for an appropriate translation of experimental findings into effective watershed management approaches. Addressing these questions simultaneously will lead to the emergence of HC as a powerful tool for watershed process understanding.

  11. 7 CFR 4280.102 - General.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... Improvements Program § 4280.102 General. (a) Sections 4280.103 through 4280.106 discuss definitions, exception... evaluation process, and post-grant Federal requirements for both the simplified and full application processes. Sections 4280.115 through 4280.117 address project planning, development, and completion as...

  12. Study on low intensity aeration oxygenation model and optimization for shallow water

    NASA Astrophysics Data System (ADS)

    Chen, Xiao; Ding, Zhibin; Ding, Jian; Wang, Yi

    2018-02-01

    Aeration/oxygenation is an effective measure for improving the self-purification capacity of shallow water bodies, although high energy consumption, high noise and expensive management restrain the development and application of this process. Based on two-film theory, a theoretical model of aeration in shallow water is established as a three-dimensional partial differential equation. To simplify the equation, basic assumptions of gas-liquid mass transfer in the vertical direction and concentration diffusion in the horizontal direction are proposed from engineering practice and are tested against simulated gas holdup, obtained by modelling gas-liquid two-phase flow in an aeration tank under low-intensity conditions. Based on these assumptions and the theory of shallow permeability, the three-dimensional partial differential equations are simplified and a calculation model of low-intensity aeration oxygenation is obtained. The model is verified by comparison with aeration experiments. The conclusions are as follows: (1) the calculation model of gas-liquid mass transfer in the vertical direction and concentration diffusion in the horizontal direction reflects the aeration process well; (2) under low-intensity conditions, long-term aeration and oxygenation is theoretically feasible for enhancing the self-purification capacity of water bodies; (3) for the same total aeration intensity, multipoint distributed aeration clearly improves the horizontal diffusion of oxygen concentration; (4) in shallow water treatment, miniaturized, arrayed, low-intensity and mobile aeration equipment can reduce equipment volume and mitigate the problems of high energy consumption, large size and noise.
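
    For a well-mixed vertical column, two-film theory reduces the oxygen-transfer step to dC/dt = kLa (Cs − C). The sketch below evaluates its analytical solution for a low-intensity aeration scenario; the kLa value and the concentrations are illustrative, not the fitted values of the study.

```python
# Sketch of the gas-liquid mass-transfer step that two-film theory reduces to for
# a well-mixed column: dC/dt = kLa * (Cs - C). Parameter values are illustrative,
# not those fitted in the study.
import numpy as np

def dissolved_oxygen(t_hours, kla_per_h=0.15, c_sat=9.0, c0=2.0):
    """Analytical solution of dC/dt = kLa (Cs - C), concentrations in mg/L."""
    return c_sat - (c_sat - c0) * np.exp(-kla_per_h * np.asarray(t_hours))

t = np.linspace(0.0, 48.0, 7)                      # two days of low-intensity aeration
for hour, conc in zip(t, dissolved_oxygen(t)):
    print(f"t = {hour:4.0f} h  DO = {conc:4.2f} mg/L")
```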

  13. A Cloud Architecture for Teleradiology-as-a-Service.

    PubMed

    Melício Monteiro, Eriksson J; Costa, Carlos; Oliveira, José L

    2016-05-17

    Telemedicine has been promoted by healthcare professionals as an efficient way to obtain remote assistance from specialised centres, to get a second opinion about complex diagnoses or even to share knowledge among practitioners. The current economic restrictions in many countries are increasing the demand for these solutions even more, in order to optimize processes and reduce costs. However, despite some technological solutions already in place, their adoption has been hindered by a lack of usability, especially in the set-up process. In this article we propose a telemedicine platform that relies on a cloud computing infrastructure and social media principles to simplify the creation of dynamic user-based groups, opening up opportunities for the establishment of teleradiology trust domains. The collaborative platform is provided as a Software-as-a-Service solution, supporting real-time and asynchronous collaboration between users. To evaluate the solution, we have deployed the platform in a private cloud infrastructure. The system is made up of three main components - the collaborative framework, the Medical Management Information System (MMIS) and the HTML5 (Hyper Text Markup Language) Web client application - connected by a message-oriented middleware. The solution allows physicians to easily create dynamic network groups for synchronous or asynchronous cooperation. The resulting network improves dataflow between colleagues, as well as knowledge sharing and cooperation, through social media tools. The platform was implemented and has already been used in two distinct scenarios: teaching of radiology and tele-reporting. Collaborative systems can simplify the establishment of telemedicine expert groups with tools that enable physicians to improve their clinical practice. Streamlining the usage of this kind of system through the adoption of Web technologies that are common in social media will increase the quality of current solutions, facilitating the sharing of clinical information, medical imaging studies and patient diagnostics among collaborators.

  14. Biomedical waste management guidelines 2016: What's done and what needs to be done.

    PubMed

    Singhal, Lipika; Tuli, Arpandeep Kaur; Gautam, Vikas

    2017-01-01

    The latest biomedical waste (BMW) management guidelines, introduced in 2016, have been simplified so that they can be followed more easily by the various health agencies. The categories of BMW have been reduced from ten (in 1998) to four in the latest (2016) guidelines. The many changes made in these guidelines are summarised in the article below. Segregation of hospital waste plays a very important role: waste has to be sorted at the source of generation according to the category to which it belongs under the newer guidelines. Newer waste treatment facilities such as plasma pyrolysis, encapsulation and inertisation have been introduced, and older facilities such as incineration, which produce toxic fumes (dioxins and furans) harmful to both health and the environment, are to be phased out. Wastewater treatment plants could even be used to remove antimicrobial resistance genes during the processing of the waste generated by hospitals.

  15. An investigation of lithium-ion battery thermal management using paraffin/porous-graphite-matrix composite

    NASA Astrophysics Data System (ADS)

    Greco, Angelo; Jiang, Xi; Cao, Dongpu

    2015-03-01

    The thermal management of a cylindrical battery cell by a phase change material (PCM)/compressed expanded natural graphite (CENG) composite is investigated in this study. The transient thermal behaviour of both the battery and the PCM/CENG is described with a simplified one-dimensional model taking into account the physical and phase change properties of the PCM/CENG composite. The 1D analytical/computational model yielded nearly identical results to the three-dimensional simulation results for various cooling strategies. Therefore, the 1D model is sufficient to describe the transient behaviour of the battery cooled by a PCM/CENG composite. Moreover, the maximum temperature reached with the PCM/CENG cooling strategy is much lower than that reached with forced convection in the same configuration. In the test case studied, the PCM showed superior transient characteristics to forced convection cooling. The PCM cooling is able to maintain a lower maximum temperature during the melting process and to extend the transient time for temperature rise. Furthermore, the graphite-matrix bulk density is identified as an important parameter for optimising the PCM/CENG cooling strategy.
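
    The qualitative behaviour behind the PCM result can be shown with a lumped single-node sketch: while the PCM melts, heat is absorbed as latent heat and the temperature rise stalls. The property values below are illustrative, not the battery or CENG data used in the paper.

```python
# Lumped single-node sketch of PCM cooling: while the PCM melts, heat is absorbed
# as latent heat and the temperature rise stalls. Property values are illustrative,
# not the battery/CENG data of the paper.
import numpy as np

def temperature_history(q_watt=15.0, pcm_mass=0.3, cp=2000.0, latent_heat=1.8e5,
                        melt_temp=318.0, t_end=3600.0, dt=1.0):
    """Explicit time march of one thermal node with phase change (temperatures in K)."""
    temps, temp, melted = [], 298.0, 0.0
    for _ in np.arange(0.0, t_end, dt):
        if temp >= melt_temp and melted < pcm_mass:
            melted += q_watt * dt / latent_heat          # heat goes into melting
        else:
            temp += q_watt * dt / (pcm_mass * cp)        # sensible heating
        temps.append(temp)
    return np.array(temps)

history = temperature_history()
print(f"minutes spent pinned near the melting point: {np.sum(history >= 318.0) / 60:.0f}")
```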

  16. An integrated and pragmatic approach: Global plant safety management

    NASA Astrophysics Data System (ADS)

    McNutt, Jack; Gross, Andrew

    1989-05-01

    The Bhopal disaster in India in 1984 has compelled manufacturing companies to review their operations in order to minimize their risk exposure. Much study has been done on the subject of risk assessment and in refining safety reviews of plant operations. However, little work has been done to address the broader needs of decision makers in the multinational environment. The corporate headquarters of multinational organizations are concerned with identifying vulnerable areas to assure that appropriate risk-minimization measures are in force or will be taken. But the task of screening global business units for safety prowess is complicated and time consuming. This article takes a step towards simplifying this process by presenting the decisional model developed by the authors. Beginning with an overview of key issues affecting global safety management, the focus shifts to the multinational vulnerability model developed by the authors, which reflects an integration of approaches. The article concludes with a discussion of areas for further research. While the global chemical industry and major incidents therein are used for illustration, the procedures and solutions suggested here are applicable to all manufacturing operations.

  17. GEANT4 distributed computing for compact clusters

    NASA Astrophysics Data System (ADS)

    Harrawood, Brian P.; Agasthya, Greeshma A.; Lakshmanan, Manu N.; Raterman, Gretchen; Kapadia, Anuj J.

    2014-11-01

    A new technique for distribution of GEANT4 processes is introduced to simplify running a simulation in a parallel environment such as a tightly coupled computer cluster. Using a new C++ class derived from the GEANT4 toolkit, multiple runs forming a single simulation are managed across a local network of computers with a simple inter-node communication protocol. The class is integrated with the GEANT4 toolkit and is designed to scale from a single symmetric multiprocessing (SMP) machine to compact clusters ranging in size from tens to thousands of nodes. User-designed 'work tickets' are distributed to clients using a client-server workflow model to specify the parameters for each individual run of the simulation. The new g4DistributedRunManager class was developed and well tested in the course of our Neutron Stimulated Emission Computed Tomography (NSECT) experiments. It will be useful for anyone running GEANT4 for large discrete data sets such as covering a range of angles in computed tomography, calculating dose delivery with multiple fractions, or simply speeding up the throughput of a single model.
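
    The work-ticket pattern itself is independent of GEANT4 and can be sketched in a few lines. The example below uses a plain Python process pool to hand one ticket per run to the next free worker; it is a conceptual stand-in, not the C++ g4DistributedRunManager class or its network protocol.

```python
# Conceptual sketch of the "work ticket" idea: a queue of run parameters, with each
# worker pulling the next ticket until none remain. Plain Python stand-in, not the
# actual C++ g4DistributedRunManager class or its network protocol.
from multiprocessing import Pool

def run_simulation(ticket: dict) -> dict:
    """Stand-in for one GEANT4 run configured by a single work ticket."""
    return {"angle_deg": ticket["angle_deg"], "events": ticket["events"], "status": "done"}

if __name__ == "__main__":
    # One ticket per projection angle, e.g. for a tomography-style angular sweep.
    tickets = [{"angle_deg": a, "events": 100_000} for a in range(0, 180, 15)]
    with Pool(processes=4) as pool:
        for result in pool.imap_unordered(run_simulation, tickets):
            print(result)
```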

  18. ATGC transcriptomics: a web-based application to integrate, explore and analyze de novo transcriptomic data.

    PubMed

    Gonzalez, Sergio; Clavijo, Bernardo; Rivarola, Máximo; Moreno, Patricio; Fernandez, Paula; Dopazo, Joaquín; Paniego, Norma

    2017-02-22

    In recent years, applications based on massively parallelized RNA sequencing (RNA-seq) have become valuable approaches for studying non-model species, e.g., those without a fully sequenced genome. RNA-seq is a useful tool for detecting novel transcripts and genetic variations and for evaluating differential gene expression by digital measurements. The large and complex datasets resulting from functional genomic experiments represent a challenge in data processing, management, and analysis. This problem is especially significant for small research groups working with non-model species. We developed a web-based application, called ATGC transcriptomics, with a flexible and adaptable interface that allows users to work with new generation sequencing (NGS) transcriptomic analysis results using an ontology-driven database. This new application simplifies data exploration, visualization, and integration for a better comprehension of the results. ATGC transcriptomics provides non-expert computer users and small research groups with access to a scalable storage option and simple data integration, including database administration and management. The software is freely available under the terms of the GNU public license at http://atgcinta.sourceforge.net .

  19. Cost effective management of space venture risks

    NASA Technical Reports Server (NTRS)

    Giuntini, Ronald E.; Storm, Richard E.

    1986-01-01

    The development of a model for the cost-effective management of space venture risks is discussed. The risk assessment and control program of insurance companies is examined. A simplified system development cycle which consists of a conceptual design phase, a preliminary design phase, a final design phase, a construction phase, and a system operations and maintenance phase is described. The model incorporates insurance safety risk methods and reliability engineering, and testing practices used in the development of large aerospace and defense systems.

  20. Radiology education 2.0--on the cusp of change: part 2. eBooks; file sharing and synchronization tools; websites/teaching files; reference management tools and note taking applications.

    PubMed

    Bhargava, Puneet; Dhand, Sabeen; Lackey, Amanda E; Pandey, Tarun; Moshiri, Mariam; Jambhekar, Kedar

    2013-03-01

    Increasing use of smartphones and handheld computers is accompanied by rapid growth in related industries. Electronic books have revolutionized the centuries-old conventional book and magazine markets and have simplified publishing by reducing the cost and processing time required to create and distribute any given book. We are now able to read, review, store, and share various types of documents via several electronic tools, many of which are available free of charge. Additionally, this electronic revolution has resulted in an explosion of readily available Internet-based educational resources for residents and has paved the path for educators to reach out to a larger and more diverse student population. Published by Elsevier Inc.

  1. Making transboundary risks governable: reducing complexity, constructing spatial identity, and ascribing capabilities.

    PubMed

    Lidskog, Rolf; Uggla, Ylva; Soneryd, Linda

    2011-03-01

    Environmental problems that cross national borders are attracting increasing public and political attention; regulating them involves coordinating the goals and activities of various governments, which often presupposes simplifying and standardizing complex knowledge, and finding ways to manage uncertainty. This article explores how transboundary environmental problems are dealt with to render complex issues governable. By discussing oil pollution in the Baltic Sea and the gas pipeline between Russia and Germany, we elucidate how boundaries are negotiated to make issues governable. Three processes are found to be particularly relevant to how involved actors render complex issues governable: complexity reduction, construction of a spatial identity for an issue, and ascription of capabilities to new or old actor constellations. We conclude that such regulation is always provisional, implying that existing regulation is always open for negotiation and criticism.

  2. An ecohydrologic model for a shallow groundwater urban environment.

    PubMed

    Arden, Sam; Ma, Xin Cissy; Brown, Mark

    2014-01-01

    The urban environment is a patchwork of natural and artificial surfaces that results in complex interactions with and impacts to natural hydrologic cycles. Evapotranspiration is a major hydrologic flow that is often altered through urbanization, although the mechanisms of change are sometimes difficult to tease out due to difficulty in effectively simulating soil-plant-atmosphere interactions. This paper introduces a simplified yet realistic model that is a combination of existing surface runoff and ecohydrology models designed to increase the quantitative understanding of complex urban hydrologic processes. Results demonstrate that the model is capable of simulating the long-term variability of major hydrologic fluxes as a function of impervious surface, temperature, water table elevation, canopy interception, soil characteristics, precipitation and complex mechanisms of plant water uptake. These understandings have potential implications for holistic urban water system management.

  3. The CELSS breadboard project: Plant production

    NASA Technical Reports Server (NTRS)

    Knott, William M.

    1990-01-01

    NASA's Breadboard Project for the Controlled Ecological Life Support System (CELSS) program is described. The simplified schematic of a CELSS is given. A modular approach is taken to building the CELSS Breadboard. Each module is researched in order to develop a data set for each one prior to its integration into the complete system. The data being obtained from the Biomass Production Module or the Biomass Production Chamber is examined. The other primary modules, food processing and resource recovery or waste management, are discussed briefly. The crew habitat module is not discussed. The primary goal of the Breadboard Project is to scale-up research data to an integrated system capable of supporting one person in order to establish feasibility for the development and operation of a CELSS. Breadboard is NASA's first attempt at developing a large scale CELSS.

  4. Real-time control of combined surface water quantity and quality: polder flushing.

    PubMed

    Xu, M; van Overloop, P J; van de Giesen, N C; Stelling, G S

    2010-01-01

    In open water systems, keeping both water depths and water quality at specified values is critical for maintaining a 'healthy' water system. Many systems still require manual operation, at least for water quality management. When applying real-time control, both quantity and quality standards need to be met. In this paper, an artificial polder flushing case is studied. Model Predictive Control (MPC) is developed to control the system. In addition to MPC, a 'forward estimation' procedure is used to acquire water quality predictions for the simplified model used in MPC optimization. In order to illustrate the advantages of MPC, classical control [Proportional-Integral control (PI)] has been developed for comparison in the test case. The results show that both algorithms are able to control the polder flushing process, but MPC is more efficient in functionality and control flexibility.
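
    For contrast with MPC, the classical PI alternative mentioned above can be sketched as a feedback loop on a well-mixed polder: the flushing discharge is raised while the concentration sits above its target. The volumes, gains, pollutant load and mixing model below are all illustrative.

```python
# Toy feedback-control sketch of the classical alternative to MPC mentioned above:
# a PI controller raises the flushing discharge while the well-mixed polder
# concentration sits above its target. Volumes, gains, load and the mixing model
# are illustrative only.
import numpy as np

def simulate_pi(c0=2.0, c_target=0.5, c_inflow=0.1, volume=1.0e5, load=0.02,
                kp=20.0, ki=0.01, q_max=5.0, dt=600.0, steps=288):
    """Two days of 10-minute steps; returns the concentration history (kg/m3)."""
    c, integral, history = c0, 0.0, []
    for _ in range(steps):
        error = c - c_target
        q_raw = kp * error + ki * integral
        q = float(np.clip(q_raw, 0.0, q_max))      # flushing discharge, m3/s
        if q == q_raw:                             # simple anti-windup: hold integral when saturated
            integral += error * dt
        # Well-mixed mass balance: constant pollutant load, flushing dilutes towards c_inflow.
        c += dt / volume * (load - q * (c - c_inflow))
        history.append(c)
    return np.array(history)

print(simulate_pi()[::36].round(3))                # one value every six hours
```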

  5. [Organization of safe cost-effective blood transfusion: experience APHM-EFSAM].

    PubMed

    Ferrera-Tourenc, V; Dettori, I; Chiaroni, J; Lassale, B

    2013-03-01

    Blood transfusion safety depends on strict compliance with each step of a process beginning with the order for labile blood products and related immunohematologic testing and ending with administration and follow-up of the receiver. This process is governed by stringent regulatory texts and guidelines. Despite precautions, processing errors are still reported. Analysis of incident reports shows that the most common cause involves patient identification and that most errors occur at two levels, i.e. the entry of patient information and the management of multiple regulatory crosschecks and record-keeping using different systems. The purpose of this report is to describe the collaborative approach implemented by the Établissement français du Sang Alpes-Méditerranée (EFSAM) and the Assistance publique des Hôpitaux de Marseille (APHM) to secure the blood transfusion process and protect interfaces while simplifying and facilitating exchanges. Close cooperation has had a threefold impact: simplification of administration, improvement of experience feedback, and better management of test ordering. The organization implemented between the two institutions has minimized document redundancy and interfaces between immunohematologic testing and delivery. Collaboration based on experience feedback has improved the level of quality and cost control. In the domain of blood transfusion safety, the threshold of 10⁻⁵ has been reached with regard to the risk of ABO errors in the distribution of concentrated red cells (CRC). In addition, this collaborative organization has created further opportunity for improvement by deploying new methods to identify simplification measures and by controlling demand and usage. Copyright © 2013 Elsevier Masson SAS. All rights reserved.

  6. How models can support ecosystem-based management of coral reefs

    NASA Astrophysics Data System (ADS)

    Weijerman, Mariska; Fulton, Elizabeth A.; Janssen, Annette B. G.; Kuiper, Jan J.; Leemans, Rik; Robson, Barbara J.; van de Leemput, Ingrid A.; Mooij, Wolf M.

    2015-11-01

    Despite the importance of coral reef ecosystems to the social and economic welfare of coastal communities, the condition of these marine ecosystems has generally degraded over the past decades. With an increased knowledge of coral reef ecosystem processes and a rise in computer power, dynamic models are useful tools in assessing the synergistic effects of local and global stressors on ecosystem functions. We review representative approaches for dynamically modeling coral reef ecosystems and categorize them as minimal, intermediate and complex models. The categorization was based on the leading principle for model development and their level of realism and process detail. This review aims to improve the knowledge of concurrent approaches in coral reef ecosystem modeling and highlights the importance of choosing an appropriate approach based on the type of question(s) to be answered. We contend that minimal and intermediate models are generally valuable tools to assess the response of key states to main stressors and, hence, contribute to understanding ecological surprises. As has been shown in freshwater resources management, insight into these conceptual relations profoundly influences how natural resource managers perceive their systems and how they manage ecosystem recovery. We argue that adaptive resource management requires integrated thinking and decision support, which demands a diversity of modeling approaches. Integration can be achieved through complementary use of models or through integrated models that systemically combine all relevant aspects in one model. Such whole-of-system models can be useful tools for quantitatively evaluating scenarios. These models allow an assessment of the interactive effects of multiple stressors on various, potentially conflicting, management objectives. All models simplify reality and, as such, have their weaknesses. While minimal models lack multidimensionality, system models are likely difficult to interpret as they require considerable effort to decipher the numerous interactions and feedback loops. Given the breadth of questions to be tackled when dealing with coral reefs, the best practice approach uses multiple model types and thus benefits from the strengths of different model types.

  7. Using assemblage data in ecological indicators: A comparison and evaluation of commonly available statistical tools

    USGS Publications Warehouse

    Smith, Joseph M.; Mather, Martha E.

    2012-01-01

    Ecological indicators are science-based tools used to assess how human activities have impacted environmental resources. For monitoring and environmental assessment, existing species assemblage data can be used to make these comparisons through time or across sites. An impediment to using assemblage data, however, is that these data are complex and need to be simplified in an ecologically meaningful way. Because multivariate statistics are mathematical relationships, statistical groupings may not make ecological sense and will not have utility as indicators. Our goal was to define a process to select defensible and ecologically interpretable statistical simplifications of assemblage data in which researchers and managers can have confidence. For this, we chose a suite of statistical methods, compared the groupings that resulted from these analyses, identified convergence among groupings, then we interpreted the groupings using species and ecological guilds. When we tested this approach using a statewide stream fish dataset, not all statistical methods worked equally well. For our dataset, logistic regression (Log), detrended correspondence analysis (DCA), cluster analysis (CL), and non-metric multidimensional scaling (NMDS) provided consistent, simplified output. Specifically, the Log, DCA, CL-1, and NMDS-1 groupings were ≥60% similar to each other, overlapped with the fluvial-specialist ecological guild, and contained a common subset of species. Groupings based on number of species (e.g., Log, DCA, CL and NMDS) outperformed groupings based on abundance [e.g., principal components analysis (PCA) and Poisson regression]. Although the specific methods that worked on our test dataset have generality, here we are advocating a process (e.g., identifying convergent groupings with redundant species composition that are ecologically interpretable) rather than the automatic use of any single statistical tool. We summarize this process in step-by-step guidance for the future use of these commonly available ecological and statistical methods in preparing assemblage data for use in ecological indicators.
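
    The core idea of accepting only convergent, redundant groupings can be illustrated with synthetic data: run two common grouping methods on the same site-by-species matrix and check their agreement before trusting the groups. The sketch below uses k-means and hierarchical clustering with the adjusted Rand index as the agreement measure; the data are synthetic, and the paper's full workflow (ordination, guild interpretation) is not reproduced.

```python
# Sketch of the "convergent groupings" idea: run two common grouping methods on the
# same site-by-species matrix and check their agreement before using the groups as
# indicators. Data are synthetic; the full workflow (ordination, guild interpretation)
# is not reproduced here.
import numpy as np
from sklearn.cluster import AgglomerativeClustering, KMeans
from sklearn.metrics import adjusted_rand_score

rng = np.random.default_rng(0)
site_type = rng.integers(0, 2, size=60)                     # two loose site types
prob = np.where(site_type[:, None] == 0, 0.7, 0.2) * np.ones((60, 12))
prob[:, 6:] = 1.0 - prob[:, 6:]                             # second half of species flipped
assemblage = rng.binomial(1, prob)                          # presence/absence matrix

groups_km = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(assemblage)
groups_hc = AgglomerativeClustering(n_clusters=2).fit_predict(assemblage)

print("agreement between the two methods (ARI):", round(adjusted_rand_score(groups_km, groups_hc), 2))
print("agreement with the built-in site types :", round(adjusted_rand_score(groups_km, site_type), 2))
```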

  8. Analyzing Power Supply and Demand on the ISS

    NASA Technical Reports Server (NTRS)

    Thomas, Justin; Pham, Tho; Halyard, Raymond; Conwell, Steve

    2006-01-01

    Station Power and Energy Evaluation Determiner (SPEED) is a Java application program for analyzing the supply and demand aspects of the electrical power system of the International Space Station (ISS). SPEED can be executed on any computer that supports version 1.4 or a subsequent version of the Java Runtime Environment. SPEED includes an analysis module, denoted the Simplified Battery Solar Array Model, which is a simplified engineering model of the ISS primary power system. This simplified model makes it possible to perform analyses quickly. SPEED also includes a user-friendly graphical-interface module, an input file system, a parameter-configuration module, an analysis-configuration-management subsystem, and an output subsystem. SPEED responds to input information on trajectory, shadowing, attitude, and pointing in either a state-of-charge mode or a power-availability mode. In the state-of-charge mode, SPEED calculates battery state-of-charge profiles, given a time-varying power-load profile. In the power-availability mode, SPEED determines the time-varying total available solar array and/or battery power output, given a minimum allowable battery state of charge.
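
    The state-of-charge mode can be pictured as a simple time march: given a power-load profile and an insolation flag, the battery charges when the arrays cover the load and discharges otherwise. The capacities, efficiencies and orbit timeline in the sketch below are invented for illustration and are not ISS values or the SPEED model itself.

```python
# Illustrative time march for a "state-of-charge mode": given a power-load profile
# and an insolation flag, the battery charges when the arrays cover the load and
# discharges otherwise. Capacities, efficiencies and the orbit timeline are invented,
# not ISS values or the SPEED model itself.
import numpy as np

def soc_profile(load_kw, sunlit, array_kw=18.0, capacity_kwh=12.0,
                charge_eff=0.92, dt_h=0.05, soc0=0.9):
    soc, out = soc0, []
    for p_load, lit in zip(load_kw, sunlit):
        net_kw = (array_kw if lit else 0.0) - p_load         # positive charges, negative discharges
        delta = net_kw * dt_h / capacity_kwh
        soc += delta * (charge_eff if delta > 0 else 1.0)
        soc = min(max(soc, 0.0), 1.0)
        out.append(soc)
    return np.array(out)

# Roughly one 90-minute orbit in 3-minute steps: ~60 min sunlit, ~30 min eclipse.
steps = 30
sunlit = np.array([i % 30 < 20 for i in range(steps)])
print(soc_profile(np.full(steps, 10.0), sunlit).round(3))
```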

  9. Simulation of mercury capture by sorbent injection using a simplified model.

    PubMed

    Zhao, Bingtao; Zhang, Zhongxiao; Jin, Jing; Pan, Wei-Ping

    2009-10-30

    Mercury pollution from fossil fuel combustion or solid waste incineration is becoming a worldwide environmental concern. As an effective control technology, powdered sorbent injection (PSI) has been successfully used for mercury capture from flue gas with the advantages of low cost and easy operation. In order to predict the mercury capture efficiency for PSI more conveniently, a simplified model, based on the theory of mass transfer, isothermal adsorption and mass balance, is developed in this paper. The comparisons between theoretical results of this model and experimental results by Meserole et al. [F.B. Meserole, R. Chang, T.R. Carrey, J. Machac, C.F.J. Richardson, Modeling mercury removal by sorbent injection, J. Air Waste Manage. Assoc. 49 (1999) 694-704] demonstrate that the simplified model is able to provide good predictive accuracy. Moreover, the effects of key parameters, including the mass transfer coefficient, sorbent concentration, sorbent physical properties and sorbent adsorption capacity, on mercury adsorption efficiency are compared and evaluated. Finally, the sensitivity analysis of the impact factors indicates that the injected sorbent concentration plays the most important role in mercury capture efficiency.
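
    As a flavour of such a simplified prediction, the sketch below gives a film-mass-transfer-limited estimate of in-flight capture, 1 − exp(−k_m·a_s·t), where a_s is the external sorbent surface area per unit volume of flue gas. It ignores the adsorption-capacity limits treated in the full model, and every number is illustrative.

```python
# Back-of-envelope, film-mass-transfer-limited capture estimate in the spirit of a
# simplified in-flight model: efficiency = 1 - exp(-k_m * a_s * t), where a_s is the
# external sorbent surface area per unit volume of flue gas. Adsorption-capacity
# limits are ignored and every number is illustrative.
import numpy as np

def capture_efficiency(sorbent_mg_per_m3, k_m=2.0, particle_diam_m=20e-6,
                       particle_density=500.0, residence_s=2.0):
    """k_m: gas-film mass-transfer coefficient (m/s); particle density in kg/m3."""
    sorbent_kg_per_m3 = sorbent_mg_per_m3 * 1.0e-6
    area_per_m3 = 6.0 * sorbent_kg_per_m3 / (particle_density * particle_diam_m)
    return 1.0 - np.exp(-k_m * area_per_m3 * residence_s)

for dose in (16, 50, 100, 160):                    # mg of sorbent per m3 of flue gas
    print(f"{dose:>3} mg/m3 -> {capture_efficiency(dose):.0%} capture")
```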

  10. Against conventional wisdom: when the public, the media, and medical practice collide.

    PubMed

    Jensen, Jakob D; Krakow, Melinda; John, Kevin K; Liu, Miao

    2013-01-01

    In 2009, the U.S. Preventive Services Task Force released new mammography screening guidelines that sparked a torrent of criticism. The subsequent conflict was significant and pitted the Task Force against other health organizations, advocacy groups, the media, and the public at large. We argue that this controversy was driven by the systematic removal of uncertainty from science communication. To increase comprehension and adherence, health information communicators remove caveats, limitations, and hedging so science appears simple and more certain. This streamlining process is, in many instances, initiated by researchers as they engage in dissemination of their findings, and it is facilitated by public relations professionals, journalists, public health practitioners, and others whose tasks involve using the results from research for specific purposes. Uncertainty is removed from public communication because many communicators believe that it is difficult for people to process and/or that it is something the audience wants to avoid. Uncertainty management theory posits that people can find meaning and value in uncertainty. We define key terms relevant to uncertainty management, describe research on the processing of uncertainty, identify directions for future research, and offer recommendations for scientists, practitioners, and media professionals confronted with uncertain findings. Science is routinely simplified as it is prepared for public consumption. In line with the model of information overload, this practice may increase short-term adherence to recommendations at the expense of long-term message consistency and trust in science.

  11. Debates—Perspectives on socio-hydrology: Modeling flood risk as a public policy problem

    NASA Astrophysics Data System (ADS)

    Gober, Patricia; Wheater, Howard S.

    2015-06-01

    Socio-hydrology views human activities as endogenous to water system dynamics; it is the interaction between human and biophysical processes that threatens the viability of current water systems through positive feedbacks and unintended consequences. Di Baldassarre et al. implement socio-hydrology as a flood risk problem using the concept of social memory as a vehicle to link human perceptions to flood damage. Their mathematical model has heuristic value in comparing potential flood damages in green versus technological societies. It can also support communities in exploring the potential consequences of policy decisions and evaluating critical policy tradeoffs, for example, between flood protection and economic development. The concept of social memory does not, however, adequately capture the social processes whereby public perceptions are translated into policy action, including the pivotal role played by the media in intensifying or attenuating perceived flood risk, the success of policy entrepreneurs in keeping flood hazard on the public agenda during short windows of opportunity for policy action, and different societal approaches to managing flood risk that derive from cultural values and economic interests. We endorse the value of seeking to capture these dynamics in a simplified conceptual framework, but favor a broader conceptualization of socio-hydrology that includes a knowledge exchange component, including the way modeling insights and scientific results are communicated to floodplain managers. The social processes used to disseminate the products of socio-hydrological research are as important as the research results themselves in determining whether modeling is used for real-world decision making.

  12. Piloted simulation of an air-ground profile negotiation process in a time-based Air Traffic Control environment

    NASA Technical Reports Server (NTRS)

    Williams, David H.; Green, Steven M.

    1993-01-01

    Historically, development of airborne flight management systems (FMS) and ground-based air traffic control (ATC) systems has tended to focus on different objectives with little consideration for operational integration. A joint program, between NASA's Ames Research Center (Ames) and Langley Research Center (Langley), is underway to investigate the issues of, and develop systems for, the integration of ATC and airborne automation systems. A simulation study was conducted to evaluate a profile negotiation process (PNP) between the Center/TRACON Automation System (CTAS) and an aircraft equipped with a four-dimensional flight management system (4D FMS). Prototype procedures were developed to support the functional implementation of this process. The PNP was designed to provide an arrival trajectory solution which satisfies the separation requirements of ATC while remaining as close as possible to the aircraft's preferred trajectory. Results from the experiment indicate the potential for successful incorporation of aircraft-preferred arrival trajectories in the CTAS automation environment. Fuel savings on the order of 2 percent to 8 percent, compared to fuel required for the baseline CTAS arrival speed strategy, were achieved in the test scenarios. The data link procedures and clearances developed for this experiment, while providing the necessary functionality, were found to be operationally unacceptable to the pilots. In particular, additional pilot control and understanding of the proposed aircraft-preferred trajectory, and a simplified clearance procedure were cited as necessary for operational implementation of the concept.

  13. Modelling of bio-morphodynamics in braided rivers: applications to the Waitaki river (New Zealand)

    NASA Astrophysics Data System (ADS)

    Stecca, G.; Zolezzi, G.; Hicks, M.; Measures, R.; Bertoldi, W.

    2016-12-01

    The planform shape of rivers results from the complex interaction between flow, sediment transport and vegetation processes, and can evolve in time following a change in these controls. The braided planform of the lower Waitaki (New Zealand), for instance, is endangered by the action of artificially-introduced alien vegetation, which spread after the reduction in magnitude of floods following hydropower dam construction. These processes, by favouring the flow concentration into the main channel, would likely promote a shift towards single thread morphology if vegetation was not artificially removed within a central fairway. The purpose of this work is to address the future evolution of these river systems under different management scenarios through two-dimensional numerical modelling. The construction of a suitable model represents a task in itself, since a modelling framework coupling all the relevant processes is not straightforwardly available at present. Our starting point is the GIAMT2D numerical model, solving two-dimensional flow and bedload transport in wet/dry domains, and recently modified by the inclusion of a rule-based bank erosion model. We further develop this model by adding a vegetation module, which accounts in a simplified manner for time-evolving biomass density, and tweaks the local flow roughness, critical shear stress for sediment transport and bank erodibility accordingly. We plan to apply the model to address the decadal-scale evolution of one reach in the Waitaki river, comparing different management scenarios for vegetation control.
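
    The kind of coupling added by a vegetation module can be sketched as two small update rules: biomass grows where cells stay dry and is scoured where shear is high, and in turn raises flow roughness and the threshold for sediment motion. The functional forms and coefficients below are illustrative, not those of the GIAMT2D implementation.

```python
# Sketch of the coupling a vegetation module adds: biomass (0..1) grows where cells
# stay dry and is scoured where shear is high, and in turn raises flow roughness and
# the threshold of motion. Functional forms and coefficients are illustrative, not
# those of the GIAMT2D implementation.
import numpy as np

def update_vegetation(biomass, water_depth, shear_stress, dt_yr=0.1,
                      growth_rate=0.5, scour_shear=30.0):
    grows = water_depth <= 0.05                          # roughly dry bars (m)
    biomass = biomass + dt_yr * growth_rate * grows * (1.0 - biomass)
    return np.where(shear_stress > scour_shear, 0.0, np.clip(biomass, 0.0, 1.0))

def effective_parameters(biomass, n_bare=0.025, n_veg=0.10,
                         tau_crit_bare=3.0, veg_strength=2.0):
    manning_n = n_bare + (n_veg - n_bare) * biomass      # roughness rises with biomass
    tau_crit = tau_crit_bare * (1.0 + veg_strength * biomass)
    return manning_n, tau_crit

biomass = update_vegetation(np.zeros(5),
                            water_depth=np.array([0.0, 0.02, 0.2, 0.5, 0.0]),
                            shear_stress=np.array([1.0, 5.0, 20.0, 40.0, 2.0]))
print(effective_parameters(biomass))
```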

  14. Against conventional wisdom: when the public, the media, and medical practice collide

    PubMed Central

    2013-01-01

    Background: In 2009, the U.S. Preventive Services Task Force released new mammography screening guidelines that sparked a torrent of criticism. The subsequent conflict was significant and pitted the Task Force against other health organizations, advocacy groups, the media, and the public at large. We argue that this controversy was driven by the systematic removal of uncertainty from science communication. To increase comprehension and adherence, health information communicators remove caveats, limitations, and hedging so science appears simple and more certain. This streamlining process is, in many instances, initiated by researchers as they engage in dissemination of their findings, and it is facilitated by public relations professionals, journalists, public health practitioners, and others whose tasks involve using the results from research for specific purposes. Analysis: Uncertainty is removed from public communication because many communicators believe that it is difficult for people to process and/or that it is something the audience wants to avoid. Uncertainty management theory posits that people can find meaning and value in uncertainty. We define key terms relevant to uncertainty management, describe research on the processing of uncertainty, identify directions for future research, and offer recommendations for scientists, practitioners, and media professionals confronted with uncertain findings. Conclusions: Science is routinely simplified as it is prepared for public consumption. In line with the model of information overload, this practice may increase short-term adherence to recommendations at the expense of long-term message consistency and trust in science. PMID:24565173

  15. Insights into water managers' perception and handling of uncertainties - a study of the role of uncertainty in practitioners' planning and decision-making

    NASA Astrophysics Data System (ADS)

    Höllermann, Britta; Evers, Mariele

    2017-04-01

    Planning and decision-making under uncertainty are common in water management due to climate variability, simplified models, societal developments and planning restrictions, to name just a few factors. Dealing with uncertainty can be approached from two sides, each affecting the process and form of communication: either improve the knowledge base by reducing uncertainties, or apply risk-based approaches that acknowledge uncertainties throughout the management process. The current understanding is that science focusses more strongly on the former approach, while policy and practice more actively apply risk-based approaches to handle incomplete and/or ambiguous information. The focus of this study is on how water managers perceive and handle uncertainties at the knowledge/decision interface in their daily planning and decision-making routines, how they evaluate the role of uncertainties for their decisions, and how they integrate this information into the decision-making process. Expert interviews and questionnaires among practitioners and scientists provided insight into their perspectives on uncertainty handling, allowing a comparison of strategies between science and practice as well as between different types of practitioners. Our results confirmed that practitioners work bottom-up, from potential measures upwards, rather than top-down from impact assessment as is common in science-based approaches. This science-practice gap may hinder effective integration and acknowledgement of uncertainty in final decisions. Additionally, the implementation of an adaptive and flexible management approach acknowledging uncertainties is often stalled by rigid regulations favouring a predict-and-control attitude. The study also showed that a practitioner's level of uncertainty recognition varies with employer type and business unit, hence affecting the degree of the science-practice gap with respect to uncertainty recognition. Working experience was examined as a cross-cutting property of science and practice, with greater uncertainty awareness and integration among more experienced researchers and practitioners. In conclusion, our study of water managers' perception and handling of uncertainties provides valuable insights for establishing routines for uncertainty communication and integration into planning and decision-making processes that acknowledge the diverse perceptions among producers, users and receivers of uncertainty information. These results can contribute to more effective integration of hydrological forecasts and to improved decisions.

  16. MyGeoHub: A Collaborative Geospatial Research and Education Platform

    NASA Astrophysics Data System (ADS)

    Kalyanam, R.; Zhao, L.; Biehl, L. L.; Song, C. X.; Merwade, V.; Villoria, N.

    2017-12-01

    Scientific research is increasingly collaborative and globally distributed; research groups now rely on web-based scientific tools and data management systems to simplify their day-to-day collaborative workflows. However, such tools often lack seamless interfaces, requiring researchers to contend with manual data transfers, annotation and sharing. MyGeoHub is a web platform that supports out-of-the-box, seamless workflows involving data ingestion, metadata extraction, analysis, sharing and publication. MyGeoHub is built on the HUBzero cyberinfrastructure platform and adds general-purpose software building blocks (GABBs), for geospatial data management, visualization and analysis. A data management building block iData, processes geospatial files, extracting metadata for keyword and map-based search while enabling quick previews. iData is pervasive, allowing access through a web interface, scientific tools on MyGeoHub or even mobile field devices via a data service API. GABBs includes a Python map library as well as map widgets that in a few lines of code, generate complete geospatial visualization web interfaces for scientific tools. GABBs also includes powerful tools that can be used with no programming effort. The GeoBuilder tool provides an intuitive wizard for importing multi-variable, geo-located time series data (typical of sensor readings, GPS trackers) to build visualizations supporting data filtering and plotting. MyGeoHub has been used in tutorials at scientific conferences and educational activities for K-12 students. MyGeoHub is also constantly evolving; the recent addition of Jupyter and R Shiny notebook environments enable reproducible, richly interactive geospatial analyses and applications ranging from simple pre-processing to published tools. MyGeoHub is not a monolithic geospatial science gateway, instead it supports diverse needs ranging from just a feature-rich data management system, to complex scientific tools and workflows.

  17. On Anthologies.

    ERIC Educational Resources Information Center

    Jones, Nick

    1983-01-01

    Discusses the form and function of anthologies by distinguishing three "orders" of anthology, together with a fourth, or preliminary category, within a broadly simplified model of the anthological process. (HOD)

  18. WastePlan model implementation for New York State. Final report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Visalli, J.R.; Blackman, D.A.

    1995-07-01

    WastePlan is a computer software tool that models solid waste quantities, costs, and other parameters on a regional basis. The software was developed by the Tellus Institute, a nonprofit research and consulting firm. The project's objective was to provide local solid waste management planners in New York State, who are responsible for developing and implementing the comprehensive solid waste management plans authorized by the Solid Waste Management Act of 1988, with a WastePlan model specifically tailored to the demographic and other characteristics of New York State, and to provide training and technical support to the users. Two-day workshops were held in 1992 to introduce planners to the existing versions; subsequently, extensive changes were made to the model and a second set of two-day workshops was held in 1993 to introduce planners to the enhanced version of WastePlan. Following user evaluations, WastePlan was further modified to allow users to model systems using a simplified version, and to incorporate report forms required by New York State. A post-project survey of trainees revealed limited regular use of the software. Possible reasons include lack of synchronicity with the NYSDEC planning process; lack of computer literacy and aptitude among trainees; hardware limitations; software user-friendliness; and the work environment of the trainees. A number of recommendations are made to encourage use of WastePlan by local solid waste management planners.

  19. Hydrography for the non-Hydrographer: A Paradigm shift in Data Processing

    NASA Astrophysics Data System (ADS)

    Malzone, C.; Bruce, S.

    2017-12-01

    Advancements in technology have led to overall systematic improvements, including hardware design, software architecture, and data transmission/telepresence. Historically, utilization of this technology has required a high level of knowledge obtained through many years of experience, training and/or education, and high training costs are incurred to achieve and maintain an acceptable level of proficiency within an organization. Recently, engineers have developed off-the-shelf software technology called Qimera that simplifies the processing of hydrographic data. The core technology is centered on isolating tasks within the workflow, capitalizing on advances in computing technology to automate mundane, error-prone tasks so that human effort is reserved for the stages where it adds the most value. Key design features include: guided workflow, transcription automation, processing state management, real-time QA, dynamic workflow for validation, collaborative cleaning and production-line processing. Since Qimera is designed to guide the user, it allows expedition leaders to focus on science while providing an educational opportunity for students to quickly learn the hydrographic processing workflow, including ancillary data analysis, trouble-shooting, calibration and cleaning. This paper provides case studies on how Qimera is currently implemented in scientific expeditions, the benefits of implementation, and how it is directing the future of on-board research for the non-hydrographer.

  20. NASA Johnson Space Center: Total quality partnership

    NASA Technical Reports Server (NTRS)

    Harlan, Charlie; Boyd, Alfred A.

    1992-01-01

    The development of and benefits realized from a joint NASA, support contractor continuous improvement process at the Johnson Space Center (JSC) is traced. The joint effort described is the Safety, Reliability, and Quality Assurance Directorate relationship with its three support contractors which began in early 1990. The Continuous Improvement effort started in early 1990 with an initiative to document and simplify numerous engineering change evaluation processes. This effort quickly grew in scope and intensity to include process improvement teams, improvement methodologies, awareness, and training. By early 1991, the support contractor had teams in place and functioning, program goals established and a cultural change effort underway. In mid-1991 it became apparent that a major redirection was needed to counter a growing sense of frustration and dissatisfaction from teams and managers. Sources of frustration were isolated to insufficient joint participation on teams, and to a poorly defined vision. Over the next year, the effort was transformed to a truly joint process. The presentation covers the steps taken to define vision, values, goals, and priorities and to form a joint Steering Committee and joint process improvement teams. The most recent assessment against the President's award criteria is presented as a summary of progress. Small, but important improvement results have already demonstrated the value of the joint effort.

  1. Moist convection: a key to tropical wave-moisture interaction in Indian monsoon intraseasonal oscillation

    NASA Astrophysics Data System (ADS)

    Wu, Longtao; Wong, Sun; Wang, Tao; Huffman, George J.

    2018-01-01

    Simulation of moist convective processes is critical for accurately representing the interaction among tropical wave activities, atmospheric water vapor transport, and clouds associated with the Indian monsoon Intraseasonal Oscillation (ISO). In this study, we apply the Weather Research and Forecasting (WRF) model to simulate Indian monsoon ISO with three different treatments of moist convective processes: (1) the Betts-Miller-Janjić (BMJ) adjustment cumulus scheme without explicit simulation of moist convective processes; (2) the New Simplified Arakawa-Schubert (NSAS) mass-flux scheme with simplified moist convective processes; and (3) explicit simulation of moist convective processes at convection permitting scale (Nest). Results show that the BMJ experiment is unable to properly reproduce the equatorial Rossby wave activities and the corresponding phase relationship between moisture advection and dynamical convergence during the ISO. These features associated with the ISO are approximately captured in the NSAS experiment. The simulation with resolved moist convective processes significantly improves the representation of the ISO evolution, and has good agreements with the observations. This study features the first attempt to investigate the Indian monsoon at convection permitting scale.

  2. The Mathematics of High School Physics: Models, Symbols, Algorithmic Operations and Meaning

    ERIC Educational Resources Information Center

    Kanderakis, Nikos

    2016-01-01

    In the seventeenth and eighteenth centuries, mathematicians and physical philosophers managed to study, via mathematics, various physical systems of the sublunar world through idealized and simplified models of these systems, constructed with the help of geometry. By analyzing these models, they were able to formulate new concepts, laws and…

  3. Simplified Procedures for Eutrophication Assessment and Prediction: User Manual

    DTIC Science & Technology

    1996-09-01

    1975), for use in the Lake Erie Wastewater Management Study and is described by Verhoff, Yaksich, and Melfi (1980) and Westerdahl et al. (1981). This...manual," Technical Report E-81-9, U.S. Army Engineer Waterways Experiment Station, Vicksburg, MS. Westerdahl, H. E., Ford, W. B., Harris, J., and

  4. Forest thinning changes movement patterns and habitat use by Pacific marten

    Treesearch

    Katie M. Moriarty; Clinton W. Epps; William J. Zielinski

    2016-01-01

    Simplifying stand structure to reduce fuel density is a high priority for forest managers; however, the effects on Pacific marten (Martes caurina) movement and connectivity are unknown. We evaluated whether thinning forests to reduce fuels influenced movements of Pacific marten. We collected movement paths from 22 martens using global positioning system telemetry...

  5. Response surface models of subsoil K concentration for loess over till soils in Missouri

    USDA-ARS?s Scientific Manuscript database

    Crop uptake of potassium (K) has demonstrated sensitivity to subsoil variation in K content. This fact has not been sufficiently considered in K management strategies in part due to logistical difficulties in sampling spatially variable subsoil K. We propose a simplified soil factorial model, a resp...

  6. National Policy Agenda to Reduce the Burden of Student Debt

    ERIC Educational Resources Information Center

    Institute for College Access & Success, 2014

    2014-01-01

    Since 2005, "The Institute for College Access & Success" (TICAS) and its Project on Student Debt have worked to reduce the risks and burdens of student debt. TICAS helped create and improve income-based repayment plans to keep federal loan payments manageable; strengthen Pell Grants, which reduce the need to borrow; and simplify the…

  7. An Ecological Context for Regenerating Mult-cohort, Mixed-species Red Pine Forests

    Treesearch

    Brian Palik; John Zasada

    2003-01-01

    Human disturbances have simplified the structure and composition of red pine forests, relative to historical conditions. A greater understanding of natural disturbances and their role in generating complex stand structures, and their associated benefits, has increased interest in managing for mixed-species, multi-aged stands. We outline a conceptual approach for...

  8. Attaining Visual Literacy Using Simplified Graphics in Industry.

    ERIC Educational Resources Information Center

    Burton, Terry

    In the current milieu of ISO 9000 certification, just-in-time engineering (JIT), demand flow technology (DFT), and total quality management (TQM), industry is attempting to implement available technology for the creation, control, and delivery of documentation. In most cases, their efforts are in need of outside resources to analyze, develop,…

  9. Implementation of an Interorganizational System: The Case of Medical Insurance E-Clearance

    ERIC Educational Resources Information Center

    Bose, Indranil; Liu, Han; Ye, Alex

    2012-01-01

    The patients receiving treatment from a hospital need to interact with multiple entities when claiming reimbursements. The complexities of the medical service supply chain can be simplified with an electronic clearance management system that allows hospitals, medical insurance bureau, bank, and patients to interact in a seamless and cashless…

  10. Influence of mass transfer resistance on overall nitrate removal rate in upflow sludge bed reactors.

    PubMed

    Ting, Wen-Huei; Huang, Ju-Sheng

    2006-09-01

    A kinetic model with intrinsic reaction kinetics and a simplified model with apparent reaction kinetics for denitrification in upflow sludge bed (USB) reactors were proposed. USB-reactor performance data with and without sludge wasting were also obtained for model verification. An independent batch study showed that the apparent kinetic constants k' did not differ from the intrinsic k, but the apparent Ks' was significantly larger than the intrinsic Ks, suggesting that the intra-granule mass transfer resistance can be modeled by changes in Ks. Calculations of the overall effectiveness factor, Thiele modulus, and Biot number, combined with parametric sensitivity analysis, showed that the influence of internal mass transfer resistance on the overall nitrate removal rate in USB reactors is more significant than that of the external mass transfer resistance. The simulated residual nitrate concentrations using the simplified model were in good agreement with the experimental data; the simulated results using the simplified model were also close to those using the kinetic model. Accordingly, the simplified model adequately described the overall nitrate removal rate and can be used for process design.
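    The diagnostic quantities named in the abstract can be illustrated with a short sketch for a spherical granule with first-order kinetics. The radius, diffusivity, rate constant and mass-transfer coefficient below are placeholder values for illustration only, not figures from the study.

    ```python
    import math

    def thiele_modulus(R, k1, De):
        """Thiele modulus for a spherical granule with first-order kinetics."""
        return R * math.sqrt(k1 / De)

    def internal_effectiveness(phi):
        """Internal effectiveness factor for a sphere: (3/phi^2)(phi*coth(phi) - 1)."""
        return (3.0 / phi**2) * (phi / math.tanh(phi) - 1.0)

    def biot_number(kc, R, De):
        """Mass-transfer Biot number: external film transfer relative to internal diffusion."""
        return kc * R / De

    # Placeholder values (illustrative only, not from the study)
    R  = 1.0e-3    # granule radius, m
    k1 = 5.0e-3    # first-order rate constant, 1/s
    De = 1.0e-9    # effective nitrate diffusivity inside the granule, m^2/s
    kc = 1.0e-5    # external mass-transfer coefficient, m/s

    phi = thiele_modulus(R, k1, De)
    eta = internal_effectiveness(phi)
    Bi  = biot_number(kc, R, De)
    print(f"phi = {phi:.2f}, eta_internal = {eta:.2f}, Bi = {Bi:.1f}")
    # A large Bi together with eta well below 1 points to internal (intra-granule)
    # resistance dominating over external resistance, as the paper concludes.
    ```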

  11. Perspectives on transgenic, herbicide-resistant crops in the United States almost 20 years after introduction.

    PubMed

    Duke, Stephen O

    2015-05-01

    Herbicide-resistant crops have had a profound impact on weed management. Most of the impact has been by glyphosate-resistant maize, cotton, soybean and canola. Significant economic savings, yield increases and more efficacious and simplified weed management have resulted in widespread adoption of the technology. Initially, glyphosate-resistant crops enabled significantly reduced tillage and reduced the environmental impact of weed management. Continuous use of glyphosate with glyphosate-resistant crops over broad areas facilitated the evolution of glyphosate-resistant weeds, which have resulted in increases in the use of tillage and other herbicides with glyphosate, reducing some of the initial environmental benefits of glyphosate-resistant crops. Transgenic crops with resistance to auxinic herbicides, as well as to herbicides that inhibit acetolactate synthase, acetyl-CoA carboxylase and hydroxyphenylpyruvate dioxygenase, stacked with glyphosate and/or glufosinate resistance, will become available in the next few years. These technologies will provide additional weed management options for farmers, but will not have all of the positive effects (reduced cost, simplified weed management, lowered environmental impact and reduced tillage) that glyphosate-resistant crops had initially. In the more distant future, other herbicide-resistant crops (including non-transgenic ones), herbicides with new modes of action and technologies that are currently in their infancy (e.g. bioherbicides, sprayable herbicidal RNAi and/or robotic weeding) may affect the role of transgenic, herbicide-resistant crops in weed management. Published 2014. This article is a U.S. Government work and is in the public domain in the USA.

  12. Assessing the Risk of Aquifer Salinization in a Large-Scale Coastal Irrigation Scheme in Southern Italy

    NASA Astrophysics Data System (ADS)

    Zaccaria, Daniele; Passarella, Giuseppe; D'Agostino, Daniela; Giordano, Raffaele; Sandoval-Solis, Samuel; Maggi, Sabino; Bruno, Delia; Foglia, Laura

    2017-04-01

    A research study was conducted on a coastal irrigated agricultural area of southern Italy to assess the risks of aquifer degradation likely to result from intensive groundwater pumping from individual farm wells and reduced aquifer recharge. Information was collected from both farmers and delivery-system operators during a survey conducted in 2012, revealing that farmers depend mainly on groundwater in order to achieve flexible irrigation management, as opposed to the rigid rotational delivery service of surface water provided by the local water management agency. The study area is intensively farmed by small land-holding growers with high-value micro-irrigated horticultural crops. Our team appraised the soil and aquifer degradation hazards using a simplified procedure for environmental risk assessment that allowed us to identify the risk-generating processes, evaluate the magnitude of impacts, and estimate the overall significance of the risks. We also collected the stakeholders' perceptions of agricultural water management and use through field interviews, while parallel investigations revealed a significant increase in aquifer salinity in recent years. As a final step, some preliminary risk mitigation options were appraised by exploring the growers' response to possible changes in irrigation deliveries by the water management agency. The present study integrated multi-annual observations, data interpretation, and modelling efforts, which jointly enabled the analysis of complex water management scenarios and the development of informed decisions. Keywords: Environmental risk assessment, Fuzzy cognitive maps, Groundwater degradation, Seawater intrusion

  13. 77 FR 76588 - Request for Proposal Platform Pilot

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-12-28

    ...The Small Business Administration (SBA) is announcing a pilot where federal agencies will test a new request for proposal (RFP) platform (RFP-EZ) to streamline the process through which the government buys web design and related technology services from small businesses for acquisitions valued at or below the simplified acquisition threshold (SAT). RFP-EZ is one of five projects sponsored by the Office of Science and Technology Policy's Presidential Innovation Fellows Program, which leverages the ingenuity of leading problem solvers from across America together with federal innovators to tackle projects that aim to fuel job creation, save taxpayers money, and significantly improve how the federal government serves the American people. Under the RFP-EZ pilot, which will initially run from December 28, 2012 through May 1, 2013, agencies will identify individual procurements valued at or below the simplified acquisition threshold that can be set aside for small businesses to test a suite of functional tools for: (1) Simplifying the development of statements of work, (2) improving agency access to information about small businesses, (3) enabling small businesses to submit quotes, bids or proposals (collectively referred to as proposals) electronically in response to a solicitation posted on Federal Business Opportunities (FedBizOpps); (4) enhancing efficiencies for evaluating proposals, and (5) improving how information (including prices paid by federal agencies) is captured and stored. The pilot will be conducted in accordance with existing laws and regulations. Interested parties are encouraged to review and comment on the functionality of RFP-EZ, as described at www.sba.gov/rfpez and highlighted in this notice. Responses to this notice will be considered for possible refinements to the RFP-EZ platform during the pilot and as part of the evaluation of the benefits and costs of making RFP-EZ a permanent platform fully integrated with FedBizOpps, the System for Award Management and agency contract writing systems.

  14. Development of a Comprehensive Community Nitrogen Oxide Emissions Reduction Toolkit (CCNERT)

    NASA Astrophysics Data System (ADS)

    Sung, Yong Hoon

    The main objective of this study is to research and develop a simplified tool to estimate energy use in a community and its associated effects on air pollution. This tool is intended to predict the impacts of selected energy conservation options and efficiency programs on emission reduction. It is intended to help local government and their residents understand and manage information collection and the procedures to be used. This study presents a broad overview of the community-wide energy use and NOx emissions inventory process. It also presents various simplified procedures to estimate each sector's energy use. In an effort to better understand community-wide energy use and its associated NOx emissions, the City of College Station, Texas, was selected as a case study community for this research. While one community might successfully reduce the production of NOx emissions by adopting electricity efficiency programs in its buildings, another community might be equally successful by changing the mix of fuel sources used to generate electricity, which is consumed by the community. In yet a third community low NOx automobiles may be mandated. Unfortunately, the impact and cost of one strategy over another changes over time as major sources of pollution are reduced. Therefore, this research proposes to help community planners answer these questions and to assist local communities with their NOx emission reduction plans by developing a Comprehensive Community NOx Emissions Reduction Toolkit (CCNERT). The proposed simplified tool could have a substantial impact on reducing NOx emission by providing decision-makers with a preliminary understanding about the impacts of various energy efficiency programs on emissions reductions. To help decision makers, this study has addressed these issues by providing a general framework for examining how a community's non-renewable energy use leads to NOx emissions, by quantifying each end-user's energy usage and its associated NOx emissions, and by evaluating the environmental benefits of various types of energy saving options.

  15. A systems-based approach for integrated design of materials, products and design process chains

    NASA Astrophysics Data System (ADS)

    Panchal, Jitesh H.; Choi, Hae-Jin; Allen, Janet K.; McDowell, David L.; Mistree, Farrokh

    2007-12-01

    The concurrent design of materials and products provides designers with flexibility to achieve design objectives that were not previously accessible. However, the improved flexibility comes at a cost of increased complexity of the design process chains and the materials simulation models used for executing the design chains. Efforts to reduce the complexity generally result in increased uncertainty. We contend that a systems based approach is essential for managing both the complexity and the uncertainty in design process chains and simulation models in concurrent material and product design. Our approach is based on simplifying the design process chains systematically such that the resulting uncertainty does not significantly affect the overall system performance. Similarly, instead of striving for accurate models for multiscale systems (that are inherently complex), we rely on making design decisions that are robust to uncertainties in the models. Accordingly, we pursue hierarchical modeling in the context of design of multiscale systems. In this paper our focus is on design process chains. We present a systems based approach, premised on the assumption that complex systems can be designed efficiently by managing the complexity of design process chains. The approach relies on (a) the use of reusable interaction patterns to model design process chains, and (b) consideration of design process decisions using value-of-information based metrics. The approach is illustrated using a Multifunctional Energetic Structural Material (MESM) design example. Energetic materials store considerable energy which can be released through shock-induced detonation; conventionally, they are not engineered for strength properties. The design objectives for the MESM in this paper include both sufficient strength and energy release characteristics. The design is carried out by using models at different length and time scales that simulate different aspects of the system. Finally, by applying the method to the MESM design problem, we show that the integrated design of materials and products can be carried out more efficiently by explicitly accounting for design process decisions with the hierarchy of models.

  16. Process for making carbon foam

    DOEpatents

    Klett, James W.

    2000-01-01

    The process obviates the need for conventional oxidative stabilization. The process employs mesophase or isotropic pitch and a simplified process using a single mold. The foam has a relatively uniform distribution of pore sizes and a highly aligned graphitic structure in the struts. The foam material can be made into a composite which is useful in high temperature sandwich panels for both thermal and structural applications.

  17. A study of an alignment-less lithography method as an educational resource

    NASA Astrophysics Data System (ADS)

    Kai, Kazuho; Shiota, Koki; Nagaoka, Shiro; Mahmood, Mohamad Rusop Bin Haji; Kawai, Akira

    2016-07-01

    A simplification of the lithography process was studied. The simplified photolithography method, named "alignment-less lithography", omits the photomask alignment step: the photomask and substrate are aligned mechanically using a simple jig on which countersinks are formed. Photomasks made of glass and photomasks made of transparent plastic (OHP) sheets were prepared for the process. As a result, repetitive alignment accuracies of approximately 5 µm for the glass mask and 20 µm for the OHP mask were obtained, confirming that the alignment-less lithography method was successful. The possibility of applying the method with the OHP mask to an educational program, for example as a heuristic for problem solving, was suggested. The nMOS FET fabrication process was successfully demonstrated using this method, confirming its feasibility. It is expected that a fully simplified device fabrication process can be achieved when combined with other simplifications, such as simplified impurity diffusion processes that use PSG and BSG thin films, prepared from sol-gel materials under a normal air environment, as diffusion sources.

  18. GIS-based probability assessment of natural hazards in forested landscapes of Central and South-Eastern Europe.

    PubMed

    Lorz, C; Fürst, C; Galic, Z; Matijasic, D; Podrazky, V; Potocic, N; Simoncic, P; Strauch, M; Vacik, H; Makeschin, F

    2010-12-01

    We assessed the probability of three major natural hazards--windthrow, drought, and forest fire--for Central and South-Eastern European forests; these hazards are major threats to the provision of forest goods and ecosystem services. In addition, we analyzed spatial distribution and implications for a future-oriented management of forested landscapes. For estimating the probability of windthrow, we used rooting depth and average wind speed. Probabilities of drought and fire were calculated from the climatic and total water balance during the growing season. As an approximation to climate change scenarios, we used a simplified approach with a general increase of pET by 20%. Monitoring data from the pan-European forest crown condition program and observed burnt areas and hot spots from the European Forest Fire Information System were used to test the plausibility of the probability maps. Regions with high probabilities of natural hazards are identified, and management strategies to minimize the probability of natural hazards are discussed. We suggest future research should focus on (i) estimating probabilities using process-based models (including sensitivity analysis), (ii) defining probability in terms of economic loss, (iii) including biotic hazards, (iv) using more detailed data sets on natural hazards, forest inventories and climate change scenarios, and (v) developing a framework of adaptive risk management.
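    A minimal sketch of the kind of index-based hazard screening described above. The thresholds, weights and logistic form are illustrative assumptions, not the classification rules used in the study; only the inputs (rooting depth, wind speed, water balance, a 20% pET increase) follow the abstract.

    ```python
    import math

    def windthrow_probability(rooting_depth_cm, mean_wind_ms):
        # Deeper rooting lowers the score, stronger wind raises it (illustrative weights).
        score = mean_wind_ms / 10.0 - rooting_depth_cm / 100.0
        return 1.0 / (1.0 + math.exp(-4.0 * score))

    def drought_or_fire_probability(precip_mm, pet_mm):
        deficit = max(pet_mm - precip_mm, 0.0)      # growing-season water deficit
        return min(deficit / 300.0, 1.0)            # saturates at a 300 mm deficit

    # Simplified climate-change scenario used in the paper: pET increased by 20%.
    p_now    = drought_or_fire_probability(precip_mm=450.0, pet_mm=500.0)
    p_future = drought_or_fire_probability(precip_mm=450.0, pet_mm=500.0 * 1.2)
    print(p_now, p_future, windthrow_probability(rooting_depth_cm=40.0, mean_wind_ms=7.5))
    ```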

  19. Comparing probabilistic microbial risk assessments for drinking water against daily rather than annualised infection probability targets.

    PubMed

    Signor, R S; Ashbolt, N J

    2009-12-01

    Some national drinking water guidelines provide guidance on how to define 'safe' drinking water. Regarding microbial water quality, a common position is that the chance of an individual becoming infected by some reference waterborne pathogen (e.g. Cryptosporidium) present in the drinking water should be less than 10⁻⁴ in any year. However, the instantaneous levels of risk to a water consumer vary over the course of a year, and waterborne disease outbreaks have been associated with shorter-duration periods of heightened risk. Performing probabilistic microbial risk assessments is becoming commonplace to capture the impacts of temporal variability on overall infection risk levels. A case is presented here for adoption of a shorter-duration reference period (i.e. daily) infection probability target over which to assess, report and benchmark such risks. A daily infection probability benchmark may provide added incentive and guidance for exercising control over short-term adverse risk fluctuation events and their causes. Management planning could involve outlining measures so that the daily target is met under a variety of pre-identified event scenarios. Other benefits of a daily target could include providing a platform for managers to design and assess management initiatives, as well as simplifying the technical components of the risk assessment process.
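    The shift from an annualised to a daily benchmark can be made concrete with a small worked example. Assuming 365 independent days of exposure (an illustrative simplification, not the authors' derivation), an annual target of 10⁻⁴ translates into a constant daily target of roughly 2.7 × 10⁻⁷, and a single day of elevated risk can consume a large share of the annual budget.

    ```python
    # Converting an annual infection probability target into an equivalent constant
    # daily target, assuming 365 independent days (an illustrative simplification).
    annual_target = 1e-4
    daily_target = 1 - (1 - annual_target) ** (1 / 365)
    print(f"daily target ~ {daily_target:.3e}")   # about 2.74e-07 per day

    # A short event that pushes daily risk well above this level uses a
    # disproportionate share of the annual budget:
    event_day_risk = 5e-5                          # hypothetical one-day spike
    print(f"one such day uses ~{event_day_risk / annual_target:.0%} of the annual target")
    ```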

  20. [No exchange of information without technology : modern infrastructure in radiology].

    PubMed

    Hupperts, H; Hermann, K-G A

    2014-01-01

    Modern radiology cannot accomplish its daily volume of examinations without supportive technology. Even as technology becomes increasingly indispensable, business continuity management must ensure that operations can continue at any time, if necessary even with a limited technical infrastructure. An efficient information security management system forms the basis. Early radiology information systems were islands of information processing; a modern radiology department must be able to integrate modularly into the information network of a larger organization. The secondary use of stored data for clinical decision support poses new challenges for the integrity of data and systems, because medical knowledge is displayed and provided in the context of treatment. Image creation and distribution are fully digital, which is often not yet the case for radiology reports: legally secure electronic diagnostic reports require a complex technical infrastructure, so diagnostic findings often still need to be filed as paper documents. Internal exchange and improved dose management can be simplified by systems that continuously and automatically record doses and thus allow permanent analysis and reporting. Communication between patient and radiologist will continue to gain importance; intelligent use of technology will support the radiologist in this and help the patient understand the information.

  1. The visualisation of clinical leadership in the content of nursing education--a qualitative study of nursing students' experiences.

    PubMed

    Démeh, Waddah; Rosengren, Kristina

    2015-07-01

    The aim of this study was to describe nursing students' experiences of clinical leadership during their last year of education. Work as a nurse is complex, with several demands from stakeholders such as colleagues, managers, patients and relatives. Therefore, it is important to provide students with tools for a forthcoming professional life as a nurse. A qualitative descriptive study was carried out in Jordan. Narratives (n=20) written by nursing students in their last year before graduation as a registered nurse were collected. The data were analysed by a manifest content analysis. The results formed one category (clinical leadership: safety in being a nurse) and three subcategories (eye-opener, a role model and bridging the gap), which described the students' experiences of clinical leadership during their preparation for becoming nurses. Clinical leadership applies theory to practice by using a holistic view in nursing. Clinical leadership is a valuable tool for bridging the gap between theory and practice in nursing education. Skills within nursing management clarify and simplify nursing activities, which facilitates the transition from student to nurse. Focus on learning needs in nursing management is needed for stakeholders within education and health care organisations to facilitate the graduation of well-skilled nurses. Copyright © 2015 Elsevier Ltd. All rights reserved.

  2. BioSimplify: an open source sentence simplification engine to improve recall in automatic biomedical information extraction.

    PubMed

    Jonnalagadda, Siddhartha; Gonzalez, Graciela

    2010-11-13

    BioSimplify is an open source tool written in Java that introduces and facilitates the use of a novel model for sentence simplification tuned for automatic discourse analysis and information extraction (as opposed to sentence simplification for improving human readability). The model is based on a "shot-gun" approach that produces many different (simpler) versions of the original sentence by combining variants of its constituent elements. This tool is optimized for processing biomedical scientific literature such as the abstracts indexed in PubMed. We tested the tool's impact on the task of protein-protein interaction (PPI) extraction; it improved the F-score of the PPI extraction tool by around 7%, with an improvement in recall of around 20%. The BioSimplify tool and test corpus can be downloaded from https://biosimplify.sourceforge.net.

  3. Neutron residual stress measurement and numerical modeling in a curved thin-walled structure by laser powder bed fusion additive manufacturing

    DOE PAGES

    An, Ke; Yuan, Lang; Dial, Laura; ...

    2017-09-11

    Severe residual stresses in metal parts made by laser powder bed fusion additive manufacturing processes (LPBFAM) can cause both distortion and cracking during the fabrication processes. Limited data is currently available for both iterating through process conditions and design, and in particular, for validating numerical models to accelerate process certification. In this work, residual stresses of a curved thin-walled structure, made of Ni-based superalloy Inconel 625™ and fabricated by LPBFAM, were resolved by neutron diffraction without measuring the stress-free lattices along both the build and the transverse directions. The stresses of the entire part during fabrication and after cooling down were predicted by a simplified layer-by-layer finite element based numerical model. The simulated and measured stresses were found to be in good quantitative agreement. The validated simplified simulation methodology will make it possible to assess residual stresses in more complex structures and to significantly reduce manufacturing cycle time.

  4. Neutron residual stress measurement and numerical modeling in a curved thin-walled structure by laser powder bed fusion additive manufacturing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    An, Ke; Yuan, Lang; Dial, Laura

    Severe residual stresses in metal parts made by laser powder bed fusion additive manufacturing processes (LPBFAM) can cause both distortion and cracking during the fabrication processes. Limited data is currently available for both iterating through process conditions and design, and in particular, for validating numerical models to accelerate process certification. In this work, residual stresses of a curved thin-walled structure, made of Ni-based superalloy Inconel 625™ and fabricated by LPBFAM, were resolved by neutron diffraction without measuring the stress-free lattices along both the build and the transverse directions. The stresses of the entire part during fabrication and after cooling down were predicted by a simplified layer-by-layer finite element based numerical model. The simulated and measured stresses were found to be in good quantitative agreement. The validated simplified simulation methodology will make it possible to assess residual stresses in more complex structures and to significantly reduce manufacturing cycle time.

  5. Agile green process design for the intensified Kolbe-Schmitt synthesis by accompanying (simplified) life cycle assessment.

    PubMed

    Kressirer, Sabine; Kralisch, Dana; Stark, Annegret; Krtschil, Ulrich; Hessel, Volker

    2013-05-21

    In order to investigate the potential for process intensification, various reaction conditions were applied to the Kolbe-Schmitt synthesis starting from resorcinol. Different CO₂ precursors such as aqueous potassium hydrogencarbonate, hydrogencarbonate-based ionic liquids, DIMCARB, or sc-CO₂, the application of microwave irradiation for fast volumetric heating of the reaction mixture, and the effect of harsh reaction conditions were investigated. The experiments, carried out in conventional batch-wise as well as in continuously operated microstructured reactors, aimed at the development of an environmentally benign process for the preparation of 2,4-dihydroxybenzoic acid. To provide decision support toward a green process design, a research-accompanying simplified life cycle assessment (SLCA) was performed throughout the whole investigation. Following this approach, it was found that convective heating methods such as oil bath or electrical heating were more beneficial than the application of microwave irradiation. Furthermore, the consideration of workup procedures was crucial for a holistic view on the environmental burdens.

  6. Research methodology simplification for teaching purposes illustrated by clutch automatic control device testing

    NASA Astrophysics Data System (ADS)

    Wojs, J.

    2016-09-01

    The paper demonstrates that a simplified, shorter examination of an object, feasible in laboratory classes, can produce results similar to those reached in a scientific investigation of the device using extensive equipment. A thorough investigation of an object, in this case an automatic clutch device, made it possible to identify the magnitudes that most significantly affect its operation. Knowledge of these most sensitive magnitudes allows the teaching process to focus on simplified measurement of only the selected magnitudes and on a pass/fail verification of the given object.

  7. LOW-COST PERSONNEL DOSIMETER.

    DTIC Science & Technology

    specification was achieved by simplifying and improving the basic Bendix dosimeter design, using plastics for component parts, minimizing direct labor, and making the instrument suitable for automated processing and assembly. (Author)

  8. Development of Generation System of Simplified Digital Maps

    NASA Astrophysics Data System (ADS)

    Uchimura, Keiichi; Kawano, Masato; Tokitsu, Hiroki; Hu, Zhencheng

    In recent years, digital maps have been used in a variety of scenarios, including car navigation systems and map information services over the Internet. These digital maps are formed by multiple layers of maps of different scales; the map data most suitable for the specific situation are used. Currently, the production of map data of different scales is done by hand due to constraints related to processing time and accuracy. We conducted research concerning technologies for automatic generation of simplified map data from detailed map data. In the present paper, the authors propose the following: (1) a method to transform data for features with widths, such as streets and rivers, into line data, (2) a method to eliminate the component points of the data, and (3) a method to eliminate data that lie below a certain threshold. In addition, in order to evaluate the proposed method, a user survey was conducted; in this survey we compared maps generated using the proposed method with commercially available maps. From the viewpoint of the amount of data reduction and processing time, and on the basis of the results of the survey, we confirmed the effectiveness of the automatic generation of simplified maps using the proposed methods.
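    Point elimination of the kind described in method (2) is commonly done with tolerance-based polyline simplification. The sketch below uses the standard Ramer-Douglas-Peucker rule purely for illustration; the authors' own elimination criteria may differ, and the sample coordinates are invented.

    ```python
    import math

    def _point_line_distance(p, a, b):
        """Perpendicular distance from point p to the line through a and b."""
        if a == b:
            return math.dist(p, a)
        (px, py), (ax, ay), (bx, by) = p, a, b
        return abs((bx - ax) * (ay - py) - (ax - px) * (by - ay)) / math.dist(a, b)

    def simplify(points, tolerance):
        """Recursively drop points that deviate from the chord by less than `tolerance`."""
        if len(points) < 3:
            return points
        dmax, index = 0.0, 0
        for i in range(1, len(points) - 1):
            d = _point_line_distance(points[i], points[0], points[-1])
            if d > dmax:
                dmax, index = d, i
        if dmax <= tolerance:
            return [points[0], points[-1]]
        left = simplify(points[:index + 1], tolerance)
        right = simplify(points[index:], tolerance)
        return left[:-1] + right

    road = [(0, 0), (1, 0.1), (2, -0.1), (3, 5), (4, 6), (5, 7)]   # illustrative centerline
    print(simplify(road, tolerance=0.5))
    ```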

  9. Root Resorption: Simplifying Diagnosis and Improving Outcomes.

    PubMed

    Darcey, James; Qualtrough, Alison

    2016-05-01

    Root resorption is a condition resulting in the progressive loss of dental hard tissue. It may occur both within the root and upon the external aspect of the root. Diagnosis can be difficult and management challenging. Understanding the pathology is critical to understanding why and when this disease occurs and what the best management techniques involve. With such knowledge practitioners can confidently diagnose resorption, discuss prognoses and management strategies with the patient and either refer or begin treatment. Early intervention is paramount in improving outcomes. As such, if practitioners choose to refer patients they must be aware of what can be done immediately to mitigate risks until consultation and specialist treatment begins.

  10. Symptomatic Overlap and Therapeutic Opportunities in Primary Headache.

    PubMed

    Cady, Roger; Garas, Sandy Yacoub; Patel, Ketu; Peterson, Andrew; Wenzel, Richard

    2015-08-01

    Headache, a nearly universal experience, remains costly, disabling, and often suboptimally managed. The most common presentations in the United States are migraine, tension-type headache (TTH) and "sinus" headache, but their extensive symptomatic overlap suggests that these conditions can be approached as variations in the same underlying pathology and managed accordingly. We use case studies of patients with varying prior diagnoses (none, migraine, TTH, and sinus headache), as well as a 4-question diagnostic screening tool, to illustrate how pharmacists can use this conceptual framework to simplify identification, management, and referral of patients with primary headache conditions of uncertain etiology. © The Author(s) 2014.

  11. Short Shot Tower for Silicon

    NASA Technical Reports Server (NTRS)

    Bates, H. E.; Hill, D. M.; Jewett, D. N.

    1983-01-01

    Drop length necessary to convert molten silicon to shot reduced by proposed new process. Conversion of silicon from powder or chunks to shot often simplifies processing. Shot is more easily handled in most processing equipment. Drops of liquid silicon fall through protective cloud of argon, then through rapidly cooling bath of methanol, where they quickly turn into solid shot.

  12. Spinoff 2011

    NASA Technical Reports Server (NTRS)

    2012-01-01

    Topics include: Bioreactors Drive Advances in Tissue Engineering; Tooling Techniques Enhance Medical Imaging; Ventilator Technologies Sustain Critically Injured Patients; Protein Innovations Advance Drug Treatments, Skin Care; Mass Analyzers Facilitate Research on Addiction; Frameworks Coordinate Scientific Data Management; Cameras Improve Navigation for Pilots, Drivers; Integrated Design Tools Reduce Risk, Cost; Advisory Systems Save Time, Fuel for Airlines; Modeling Programs Increase Aircraft Design Safety; Fly-by-Wire Systems Enable Safer, More Efficient Flight; Modified Fittings Enhance Industrial Safety; Simulation Tools Model Icing for Aircraft Design; Information Systems Coordinate Emergency Management; Imaging Systems Provide Maps for U.S. Soldiers; High-Pressure Systems Suppress Fires in Seconds; Alloy-Enhanced Fans Maintain Fresh Air in Tunnels; Control Algorithms Charge Batteries Faster; Software Programs Derive Measurements from Photographs; Retrofits Convert Gas Vehicles into Hybrids; NASA Missions Inspire Online Video Games; Monitors Track Vital Signs for Fitness and Safety; Thermal Components Boost Performance of HVAC Systems; World Wind Tools Reveal Environmental Change; Analyzers Measure Greenhouse Gasses, Airborne Pollutants; Remediation Technologies Eliminate Contaminants; Receivers Gather Data for Climate, Weather Prediction; Coating Processes Boost Performance of Solar Cells; Analyzers Provide Water Security in Space and on Earth; Catalyst Substrates Remove Contaminants, Produce Fuel; Rocket Engine Innovations Advance Clean Energy; Technologies Render Views of Earth for Virtual Navigation; Content Platforms Meet Data Storage, Retrieval Needs; Tools Ensure Reliability of Critical Software; Electronic Handbooks Simplify Process Management; Software Innovations Speed Scientific Computing; Controller Chips Preserve Microprocessor Function; Nanotube Production Devices Expand Research Capabilities; Custom Machines Advance Composite Manufacturing; Polyimide Foams Offer Superior Insulation; Beam Steering Devices Reduce Payload Weight; Models Support Energy-Saving Microwave Technologies; Materials Advance Chemical Propulsion Technology; and High-Temperature Coatings Offer Energy Savings.

  13. The social process of escalation: a promising focus for crisis management research

    PubMed Central

    2012-01-01

    Background This study identifies a promising new focus for crisis management research in the health care domain. A review of the literature on health care crisis management suggests a knowledge gap regarding organisational change and adaptation, especially when health care situations go from normal, to non-normal, to pathological, and further into a state of emergency or crisis. Discussion Based on studies of escalating situations in obstetric care, it is suggested that two theoretical perspectives (contingency theory and the idea of failure as a result of incomplete interaction) tend to simplify the issue of escalation rather than attend to its complexities (including the various power relations among the stakeholders involved). However, studying the process of escalation as inherently complex and social allows us to see the definition of a situation as normal or non-normal as an exercise of power in itself, rather than as a putatively correct response to a particular emergency. Implications The concept of escalation, when treated this way, can help us further the analysis of clinical and institutional acts and competence. It can also turn our attention to some important elements in a class of social phenomena, crises and emergencies, that so far have not received the attention they deserve. Focusing on organisational choreography, the interplay of factors such as power, professional identity, organisational accountability, and experience, is a promising focus not only for future naturalistic research but also for developing more pragmatic strategies that can enhance organisational coordination and response in complex events. PMID:22704075

  14. The Clone Factory

    ERIC Educational Resources Information Center

    Stoddard, Beryl

    2005-01-01

    Have humans been cloned? Is it possible? Immediate interest is sparked when students are asked these questions. In response to their curiosity, the clone factory activity was developed to help them understand the process of cloning. In this activity, students reenact the cloning process, in a very simplified simulation. After completing the…

  15. A Scheduling Algorithm for Computational Grids that Minimizes Centralized Processing in Genome Assembly of Next-Generation Sequencing Data

    PubMed Central

    Lima, Jakelyne; Cerdeira, Louise Teixeira; Bol, Erick; Schneider, Maria Paula Cruz; Silva, Artur; Azevedo, Vasco; Abelém, Antônio Jorge Gomes

    2012-01-01

    Improvements in genome sequencing techniques have resulted in the generation of huge volumes of data. As a consequence of this progress, the genome assembly stage demands even more computational power, since the incoming sequence files contain large amounts of data. To speed up the process, it is often necessary to distribute the workload among a group of machines; however, this requires hardware and software solutions specially configured for the purpose. Grid computing tries to simplify this aggregation of resources, but does not always offer the best possible performance due to the heterogeneity and decentralized management of its resources. Thus, it is necessary to develop software that takes these peculiarities into account. To this end, we developed an algorithm aimed at optimizing the operation of the de novo assembly software ABySS in grids. We ran ABySS with and without the algorithm in the grid simulator SimGrid. Tests showed that our algorithm is viable, flexible, and scalable even in a heterogeneous environment, and that it improved genome assembly time in computational grids without changing assembly quality. PMID:22461785
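    The general idea of reducing centralized processing (keeping each assembly chunk on a node that already holds its data, so less has to flow through a central coordinator) can be sketched as follows. The node names, capacities and placement rule are illustrative assumptions, not the algorithm the authors implemented for ABySS.

    ```python
    # Toy sketch of data-aware scheduling: prefer running an assembly chunk on the
    # node that already stores its reads, falling back to the least-loaded node.
    from collections import defaultdict

    nodes = {"node-a": 4, "node-b": 2, "node-c": 2}        # usable CPU slots (hypothetical)
    data_location = {"chunk1": "node-a", "chunk2": "node-b",
                     "chunk3": "node-a", "chunk4": "node-c", "chunk5": "node-b"}

    load = defaultdict(int)

    def assign(chunk):
        preferred = data_location[chunk]
        if load[preferred] < nodes[preferred]:             # data locality first
            target = preferred
        else:                                              # otherwise least relative load
            target = min(nodes, key=lambda n: load[n] / nodes[n])
        load[target] += 1
        return target

    schedule = {chunk: assign(chunk) for chunk in data_location}
    print(schedule)
    ```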

  16. Lean six sigma methodologies improve clinical laboratory efficiency and reduce turnaround times.

    PubMed

    Inal, Tamer C; Goruroglu Ozturk, Ozlem; Kibar, Filiz; Cetiner, Salih; Matyar, Selcuk; Daglioglu, Gulcin; Yaman, Akgun

    2018-01-01

    Organizing work flow is a major task of laboratory management. Recently, clinical laboratories have started to adopt methodologies such as Lean Six Sigma and some successful implementations have been reported. This study used Lean Six Sigma to simplify the laboratory work process and decrease the turnaround time by eliminating non-value-adding steps. The five-stage Six Sigma system known as define, measure, analyze, improve, and control (DMAIC) is used to identify and solve problems. The laboratory turnaround time for individual tests, total delay time in the sample reception area, and percentage of steps involving risks of medical errors and biological hazards in the overall process are measured. The pre-analytical process in the reception area was improved by eliminating 3 h and 22.5 min of non-value-adding work. Turnaround time also improved for stat samples from 68 to 59 min after applying Lean. Steps prone to medical errors and posing potential biological hazards to receptionists were reduced from 30% to 3%. Successful implementation of Lean Six Sigma significantly improved all of the selected performance metrics. This quality-improvement methodology has the potential to significantly improve clinical laboratories. © 2017 Wiley Periodicals, Inc.

  17. Assessment selection in human-automation interaction studies: The Failure-GAM2E and review of assessment methods for highly automated driving.

    PubMed

    Grane, Camilla

    2018-01-01

    Highly automated driving will change drivers' behavioural patterns. Traditional methods used for assessing manual driving will only be applicable to the parts of human-automation interaction where the driver intervenes, such as hand-over and take-over situations. Therefore, driver behaviour assessment will need to adapt to the new driving scenarios. This paper aims at simplifying the process of selecting appropriate assessment methods. Thirty-five papers were reviewed to examine potential and relevant methods. The review showed that many studies still rely on traditional driving assessment methods. A new method, the Failure-GAM2E model, intended to aid assessment selection when planning a study, is proposed and exemplified in the paper. Failure-GAM2E comprises a systematic step-by-step procedure defining the situation, failures (Failure), goals (G), actions (A), subjective methods (M), objective methods (M) and equipment (E). The use of Failure-GAM2E in a study example resulted in a well-reasoned assessment plan, a new way of measuring trust through foot movements and a proposed Optimal Risk Management Model. Failure-GAM2E and the Optimal Risk Management Model are believed to support the planning process for research studies in the field of human-automation interaction. Copyright © 2017 Elsevier Ltd. All rights reserved.
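    As a rough illustration of the step-by-step structure named in the abstract, the following sketch records a Failure-GAM2E style assessment plan as a simple data structure. The field names follow the acronym; the concrete take-over scenario and its entries are hypothetical, not an example from the paper.

    ```python
    from dataclasses import dataclass, field
    from typing import List

    @dataclass
    class FailureGAM2EPlan:
        # Fields mirror the acronym: Failure, Goals, Actions, subjective/objective Methods, Equipment.
        situation: str
        failures: List[str] = field(default_factory=list)
        goals: List[str] = field(default_factory=list)
        actions: List[str] = field(default_factory=list)
        subjective_methods: List[str] = field(default_factory=list)
        objective_methods: List[str] = field(default_factory=list)
        equipment: List[str] = field(default_factory=list)

    # Hypothetical example plan for a take-over study.
    plan = FailureGAM2EPlan(
        situation="Highly automated motorway driving with a planned take-over request",
        failures=["driver fails to retake control within the time budget"],
        goals=["resume manual lane keeping within 10 s"],
        actions=["hands on wheel", "eyes on road", "disengage automation"],
        subjective_methods=["trust questionnaire"],
        objective_methods=["take-over time", "foot movement tracking"],
        equipment=["driving simulator", "eye tracker"],
    )
    print(plan.objective_methods)
    ```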

  18. A practical method of determining water current velocities and diffusion coefficients in coastal waters by remote sensing techniques

    NASA Technical Reports Server (NTRS)

    James, W. P.

    1971-01-01

    A simplified procedure is presented for determining water current velocities and diffusion coefficients. Dye drops which form dye patches in the receiving water are made from an aircraft. The changes in position and size of the patches are recorded from two flights over the area. The simplified data processing procedure requires only that the ground coordinates about the dye patches be determined at the time of each flight. With an automatic recording coordinatograph for measuring coordinates and a computer for processing the data, this technique provides a practical method of determining circulation patterns and mixing characteristics of large aquatic systems. This information is useful in assessing the environmental impact of waste water discharges and for industrial plant siting.
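    The two-flight procedure maps onto simple estimators: the mean current follows from the displacement of a patch centroid between flights, and an apparent horizontal diffusion coefficient from the growth of the patch variance. The sketch below uses illustrative numbers, not data from the report, and assumes the common approximation D ≈ Δσ²/(2Δt).

    ```python
    # Two-flight estimators for current velocity and apparent horizontal diffusion.
    # Coordinates, variances and the time interval are illustrative values only.
    centroid_1 = (1200.0, 3400.0)   # ground coordinates of the dye patch, flight 1 (m)
    centroid_2 = (1650.0, 3520.0)   # ground coordinates of the dye patch, flight 2 (m)
    sigma2_1 = 900.0                # patch variance at flight 1 (m^2)
    sigma2_2 = 2500.0               # patch variance at flight 2 (m^2)
    dt = 1800.0                     # time between flights (s)

    u = (centroid_2[0] - centroid_1[0]) / dt        # eastward current component
    v = (centroid_2[1] - centroid_1[1]) / dt        # northward current component
    D = (sigma2_2 - sigma2_1) / (2.0 * dt)          # apparent diffusion coefficient

    print(f"current: u = {u:.3f} m/s, v = {v:.3f} m/s; diffusion D = {D:.3f} m^2/s")
    ```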

  19. Navigation system and method

    NASA Technical Reports Server (NTRS)

    Taylor, R. E.; Sennott, J. W. (Inventor)

    1984-01-01

    In a global positioning system (GPS), such as the NAVSTAR/GPS system, wherein the position coordinates of user terminals are obtained by processing multiple signals transmitted by a constellation of orbiting satellites, an acquisition-aiding signal generated by an earth-based control station is relayed to user terminals via a geostationary satellite to simplify user equipment. The aiding signal is FSK modulated on a reference channel slightly offset from the standard GPS channel. The aiding signal identifies the satellites in view with the best geometry and includes Doppler prediction data as well as GPS satellite coordinates and identification data associated with user terminals within the area being served by the control station and relay satellite. The aiding signal significantly reduces user equipment requirements by simplifying spread spectrum signal demodulation and reducing data processing functions previously carried out at the user terminals.

  20. A computerized model for integrating the physical environmental factors into metropolitan landscape planning

    Treesearch

    Julius Gy Fabos; Kimball H. Ferris

    1977-01-01

    This paper justifies and illustrates (in simplified form) a landscape planning approach to the environmental management of the metropolitan landscape. The model utilizes a computerized assessment and mapping system, which exhibits a recent advancement in computer technology that allows for greater accuracy and the weighting of different values when mapping at the...

  1. Putting Organizational Unlearning into Practice: A Few Steps for the Practitioner

    ERIC Educational Resources Information Center

    Reese, Simon

    2017-01-01

    This summary piece aims to provide a clear connection for leaders, managers or even individuals as they endeavor to put organizational unlearning into practice inside their organization. The author attempts to simplify the details from the diverse articles within this issue only with the aim of helping build an easier method for practical…

  2. Actual evapotranspiration (water use) assessment of the Colorado River Basin at the Landsat resolution using the operational Simplified Surface Energy Balance Model

    USDA-ARS?s Scientific Manuscript database

    Accurately estimating consumptive water use in the Colorado River Basin (CRB) is important for assessing and managing limited water resources in the basin. Increasing water demand from various sectors may threaten long-term sustainability of the water supply in the arid southwestern United States. L...

  3. The Emerging Role of School Leadership in Israel: From External to Internal Locus of Control.

    ERIC Educational Resources Information Center

    Volansky, Ami; Habinski, Avi

    1998-01-01

    In 1997, all elementary schools in Jerusalem were transferred to a self-managing program over a three-year period. Research undertaken in schools entering the program in 1995 reveals high satisfaction among principals and teachers, enhanced school relations with local authorities, and simplified financial-reporting mechanisms. (10 references) (MLH)

  4. 48 CFR 52.213-4 - Terms and Conditions-Simplified Acquisitions (Other Than Commercial Items)

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ...-consuming products listed in the ENERGY STAR® Program or Federal Energy Management Program (FEMP) will be... exceeds the micro-purchase threshold and the acquisition— (A) Is set aside for small business concerns; or (B) Cannot be set aside for small business concerns (see 19.502-2), and does not exceed $25,000.) (x...

  5. Selected mesostructure properties in loblolly pine from Arkansas plantations

    Treesearch

    David E. Kretschmann; Steven M. Cramer; Roderic Lakes; Troy Schmidt

    2006-01-01

    Design properties of wood are currently established at the macroscale, assuming wood to be a homogeneous orthotropic material. The resulting variability from the use of such a simplified assumption has been handled by designing with lower percentile values and applying a number of factors to account for the wide statistical variation in properties. With managed...

  6. Environmentally Sound Small-Scale Livestock Projects. Guidelines for Planning Series Number 5.

    ERIC Educational Resources Information Center

    Jacobs, Linda

    This document was developed in response to the need for simplified technical information for planning environmentally sound small-scale projects in third world countries. It is aimed specifically at those who are planning or managing small-scale livestock projects in less-developed areas of the tropics and sub-tropics. The guidelines included in…

  7. PLYMAP : a computer simulation model of the rotary peeled softwood plywood manufacturing process

    Treesearch

    Henry Spelter

    1990-01-01

    This report documents a simulation model of the plywood manufacturing process. Its purpose is to enable a user to make quick estimates of the economic impact of a particular process change within a mill. The program was designed to simulate the processing of plywood within a relatively simplified mill design. Within that limitation, however, it allows a wide range of...

  8. Simulation of nitrous oxide emissions at field scale using the SPACSYS model

    PubMed Central

    Wu, L.; Rees, R.M.; Tarsitano, D.; Zhang, Xubo; Jones, S.K.; Whitmore, A.P.

    2015-01-01

    Nitrous oxide emitted to the atmosphere via the soil processes of nitrification and denitrification plays an important role in the greenhouse gas balance of the atmosphere and is involved in the destruction of stratospheric ozone. These processes are controlled by biological, physical and chemical factors such as growth and activity of microbes, nitrogen availability, soil temperature and water availability. A comprehensive understanding of these processes embodied in an appropriate model can help develop agricultural mitigation strategies to reduce greenhouse gas emissions, and help with estimating emissions at landscape and regional scales. A detailed module to describe the denitrification and nitrification processes and nitrogenous gas emissions was incorporated into the SPACSYS model to replace an earlier module that used a simplified first-order equation to estimate denitrification and was unable to distinguish the emissions of individual nitrogenous gases. A dataset derived from a Scottish grassland experiment in silage production was used to validate soil moisture in the top 10 cm soil, cut biomass, nitrogen offtake and N2O emissions. The comparison between the simulated and observed data suggested that the new module can provide a good representation of these processes and improve prediction of N2O emissions. The model provides an opportunity to estimate gaseous N emissions under a wide range of management scenarios in agriculture, and synthesises our understanding of the interaction and regulation of the processes. PMID:26026411
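    As a rough illustration of the difference the new module makes, the sketch below contrasts a first-order denitrification loss (the kind of equation the replaced module used) with a rate modulated by soil moisture and temperature, which is the idea behind the process-based replacement. The response functions and constants are illustrative assumptions, not the equations implemented in SPACSYS.

    ```python
    import math

    def first_order(no3, k=0.05):
        """Daily denitrification as a fixed fraction of soil nitrate (kg N/ha/day)."""
        return k * no3

    def process_based(no3, wfps, temp_c, k=0.05):
        """Same base rate, scaled by water-filled pore space and temperature (illustrative)."""
        f_water = 0.0 if wfps < 0.6 else (wfps - 0.6) / 0.4   # active only in wet soil
        f_temp = math.exp(0.07 * (temp_c - 20.0))             # Q10-like temperature response
        return k * no3 * f_water * f_temp

    print(first_order(no3=30.0))                               # same loss whatever the conditions
    print(process_based(no3=30.0, wfps=0.85, temp_c=12.0))     # lower under cool, moderately wet soil
    ```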

  9. Modernization of the USGS Hawaiian Volcano Observatory Seismic Processing Infrastructure

    NASA Astrophysics Data System (ADS)

    Antolik, L.; Shiro, B.; Friberg, P. A.

    2016-12-01

    The USGS Hawaiian Volcano Observatory (HVO) operates a Tier 1 Advanced National Seismic System (ANSS) seismic network to monitor, characterize, and report on volcanic and earthquake activity in the State of Hawaii. Upgrades at the observatory since 2009 have improved the digital telemetry network, computing resources, and seismic data processing with the adoption of the ANSS Quake Management System (AQMS). HVO aims to build on these efforts by further modernizing its seismic processing infrastructure and strengthening its ability to meet ANSS performance standards. Most notably, this will also allow HVO to support redundant systems, both onsite and offsite, in order to provide better continuity of operation during intermittent power and network outages. We are in the process of implementing a number of upgrades and improvements to HVO's seismic processing infrastructure, including: 1) Virtualization of AQMS physical servers; 2) Migration of server operating systems from Solaris to Linux; 3) Consolidation of AQMS real-time and post-processing services to a single server; 4) Upgrading the database from Oracle 10 to Oracle 12; and 5) Upgrading to the latest Earthworm and AQMS software. These improvements will make server administration more efficient, minimize the hardware resources required by AQMS, simplify the Oracle replication setup, and provide better integration with HVO's existing state-of-health monitoring tools and backup system. Ultimately, it will provide HVO with the latest and most secure software available while making the software easier to deploy and support.

  10. An integrated environmental risk assessment and management framework for enhancing the sustainability of marine protected areas: the Cape d'Aguilar Marine Reserve case study in Hong Kong.

    PubMed

    Xu, Elvis G B; Leung, Kenneth M Y; Morton, Brian; Lee, Joseph H W

    2015-02-01

    Marine protected areas (MPAs), such as marine parks and reserves, contain natural resources of immense value to the environment and mankind. Since MPAs may be situated in close proximity to urbanized areas and influenced by anthropogenic activities (e.g. continuous discharges of contaminated waters), the marine organisms contained in such waters are probably at risk. This study aimed at developing an integrated environmental risk assessment and management (IERAM) framework for enhancing the sustainability of such MPAs. The IERAM framework integrates conventional environmental risk assessment methods with a multi-layer-DPSIR (Driver-Pressure-State-Impact-Response) conceptual approach, which can simplify the complex issues embraced by environmental management strategies and provide logical and concise management information. The IERAM process can generate a useful database, offer timely update on the status of MPAs, and assist in the prioritization of management options. We use the Cape d'Aguilar Marine Reserve in Hong Kong as an example to illustrate the IERAM framework. A comprehensive set of indicators were selected, aggregated and analyzed using this framework. Effects of management practices and programs were also assessed by comparing the temporal distributions of these indicators over a certain timeframe. Based on the obtained results, we have identified the most significant components for safeguarding the integrity of the marine reserve, and indicated the existing information gaps concerned with the management of the reserve. Apart from assessing the MPA's present condition, a successful implementation of the IERAM framework as evocated here would also facilitate better-informed decision-making and, hence, indirectly enhance the protection and conservation of the MPA's marine biodiversity. Copyright © 2014 Elsevier B.V. All rights reserved.

  11. Data management in an object-oriented distributed aircraft conceptual design environment

    NASA Astrophysics Data System (ADS)

    Lu, Zhijie

    In the competitive global market place, aerospace companies are forced to deliver the right products to the right market, with the right cost, and at the right time. However, the rapid development of technologies and new business opportunities, such as mergers, acquisitions, supply chain management, etc., have dramatically increased the complexity of designing an aircraft. Therefore, the pressure to reduce design cycle time and cost is enormous. One way to solve such a dilemma is to develop and apply advanced engineering environments (AEEs), which are distributed collaborative virtual design environments linking researchers, technologists, designers, etc., together by incorporating application tools and advanced computational, communications, and networking facilities. Aircraft conceptual design, as the first design stage, provides major opportunity to compress design cycle time and is the cheapest place for making design changes. However, traditional aircraft conceptual design programs, which are monolithic programs, cannot provide satisfactory functionality to meet new design requirements due to the lack of domain flexibility and analysis scalability. Therefore, we are in need of the next generation aircraft conceptual design environment (NextADE). To build the NextADE, the framework and the data management problem are two major problems that need to be addressed at the forefront. Solving these two problems, particularly the data management problem, is the focus of this research. In this dissertation, in light of AEEs, a distributed object-oriented framework is firstly formulated and tested for the NextADE. In order to improve interoperability and simplify the integration of heterogeneous application tools, data management is one of the major problems that need to be tackled. To solve this problem, taking into account the characteristics of aircraft conceptual design data, a robust, extensible object-oriented data model is then proposed according to the distributed object-oriented framework. By overcoming the shortcomings of the traditional approach of modeling aircraft conceptual design data, this data model makes it possible to capture specific detailed information of aircraft conceptual design without sacrificing generality, which is one of the most desired features of a data model for aircraft conceptual design. Based upon this data model, a prototype of the data management system, which is one of the fundamental building blocks of the NextADE, is implemented utilizing the state of the art information technologies. Using a general-purpose integration software package to demonstrate the efficacy of the proposed framework and the data management system, the NextADE is initially implemented by integrating the prototype of the data management system with other building blocks of the design environment, such as disciplinary analyses programs and mission analyses programs. As experiments, two case studies are conducted in the integrated design environments. One is based upon a simplified conceptual design of a notional conventional aircraft; the other is a simplified conceptual design of an unconventional aircraft. As a result of the experiments, the proposed framework and the data management approach are shown to be feasible solutions to the research problems.
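    A minimal sketch of the kind of extensible, object-oriented data representation argued for above: generic design objects carry named, unit-tagged attributes, so discipline-specific detail can be captured without changing the base schema. The class and attribute names are hypothetical, not taken from the dissertation.

    ```python
    from dataclasses import dataclass, field
    from typing import Dict, List

    @dataclass
    class Attribute:
        value: float
        unit: str

    @dataclass
    class DesignObject:
        # Generic node in the design-data hierarchy; disciplines add attributes, not new schemas.
        name: str
        attributes: Dict[str, Attribute] = field(default_factory=dict)
        children: List["DesignObject"] = field(default_factory=list)

        def set(self, key, value, unit):
            self.attributes[key] = Attribute(value, unit)

    # Hypothetical notional concept with one child component.
    wing = DesignObject("wing")
    wing.set("area", 35.0, "m^2")
    wing.set("aspect_ratio", 9.5, "-")

    aircraft = DesignObject("notional_concept", children=[wing])
    aircraft.set("mtow", 7500.0, "kg")
    print(aircraft.children[0].attributes["area"])
    ```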

  12. Quality management system in the CIEMAT Radiation Dosimetry Service.

    PubMed

    Martín, R; Navarro, T; Romero, A M; López, M A

    2011-03-01

    This paper describes the activities carried out by the CIEMAT Radiation Dosimetry Service (SDR) to implement a quality management system (QMS) that complies with the requirements of ISO/IEC 17025 and to apply for accreditation for radiation dose testing measurements. SDR decided to seek accreditation for the service as a whole rather than for each of its component laboratories. This made it necessary to design a QMS common to all of them, ensuring alignment and compliance with the standard's requirements while simplifying routine work as far as possible.

  13. RILS: What are they, what are they good for, and do we have any?

    USDA-ARS?s Scientific Manuscript database

    RILs, or recombinant inbred lines, are a set of genetically related individuals that can simplify the gene discovery process. They are constructed using regular breeding processes rather than using tissue culture or other advanced biotechnology. Operationally, a hybrid is made, and this hybrid is se...
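
    As a rough illustration of why this regular breeding process converges on stable lines, the sketch below (an assumption for illustration, not part of the database record) simulates single-seed descent from an F1 hybrid: heterozygosity is expected to fall by about half with each generation of selfing, so the lines approach fixation.

    ```python
    # Minimal simulation of RIL construction by single-seed descent: cross two inbred
    # parents, then self for several generations. Marker count and generation number
    # are arbitrary choices for illustration.
    import random

    def make_f1(n_markers):
        # Each marker of the F1 is heterozygous: one allele from parent A, one from B.
        return [("A", "B") for _ in range(n_markers)]

    def self_once(individual):
        # Selfing: each offspring allele pair is drawn from the same plant's two alleles.
        # (Markers are treated as independent here; real RILs also involve linkage.)
        return [(random.choice(pair), random.choice(pair)) for pair in individual]

    def heterozygosity(individual):
        return sum(a != b for a, b in individual) / len(individual)

    if __name__ == "__main__":
        random.seed(1)
        plant = make_f1(n_markers=1000)
        for gen in range(1, 7):          # F2 .. F7
            plant = self_once(plant)
            print(f"F{gen + 1}: heterozygosity = {heterozygosity(plant):.3f}")
        # Expected to fall by roughly half each generation, approaching a fixed line.
    ```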

  14. 75 FR 77649 - Agency Information Collection Activities: Proposed Collection: Comment Request

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-12-13

    ... Division of Independent Review Grant Reviewer Recruitment Form (OMB No. 0915-0295)--[Extension] HRSA's... of all eligible applications submitted to HRSA. DIR ensures that the independent review process is... experience; and allows maximum use of drop-down menus to simplify the data collection process. The Web-based...

  15. Operational Control Procedures for the Activated Sludge Process, Part III-A: Calculation Procedures.

    ERIC Educational Resources Information Center

    West, Alfred W.

    This is the second in a series of documents developed by the National Training and Operational Technology Center describing operational control procedures for the activated sludge process used in wastewater treatment. This document deals exclusively with the calculation procedures, including simplified mixing formulas, aeration tank…
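
    The record above is only a partial summary of the document. As a hedged illustration of the kind of control calculation it covers, the sketch below works through two standard activated sludge numbers (food-to-microorganism ratio and mean cell residence time) using common textbook formulas and made-up plant values, not figures or worksheets from the document itself.

    ```python
    # Worked example of two routine activated sludge control calculations
    # (F/M ratio and mean cell residence time). All plant values are invented.

    def f_to_m(flow_mgd, bod_mg_l, aeration_volume_mg, mlvss_mg_l):
        """F/M = BOD load (lb/day) / mixed-liquor volatile solids under aeration (lb)."""
        bod_lb_day = flow_mgd * bod_mg_l * 8.34        # 8.34 lb per (mg/L * million gallons)
        mlvss_lb = aeration_volume_mg * mlvss_mg_l * 8.34
        return bod_lb_day / mlvss_lb

    def mcrt_days(aeration_volume_mg, mlss_mg_l, waste_flow_mgd, waste_ss_mg_l,
                  effluent_flow_mgd, effluent_ss_mg_l):
        """Mean cell residence time = solids inventory / solids leaving per day."""
        inventory_lb = aeration_volume_mg * mlss_mg_l * 8.34
        wasted_lb_day = waste_flow_mgd * waste_ss_mg_l * 8.34
        effluent_lb_day = effluent_flow_mgd * effluent_ss_mg_l * 8.34
        return inventory_lb / (wasted_lb_day + effluent_lb_day)

    if __name__ == "__main__":
        print(f"F/M  = {f_to_m(2.0, 180.0, 0.5, 2200.0):.2f} per day")
        print(f"MCRT = {mcrt_days(0.5, 2800.0, 0.02, 8000.0, 1.98, 15.0):.1f} days")
    ```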

  16. Evaluation of Models of the Reading Process.

    ERIC Educational Resources Information Center

    Balajthy, Ernest

    A variety of reading process models have been proposed and evaluated in reading research. Traditional approaches to model evaluation specify the workings of a system in a simplified fashion to enable organized, systematic study of the system's components. Following are several statistical methods of model evaluation: (1) empirical research on…

  17. Initial Crisis Reaction and Poliheuristic Theory

    ERIC Educational Resources Information Center

    DeRouen, Karl, Jr.; Sprecher, Christopher

    2004-01-01

    Poliheuristic (PH) theory models foreign policy decisions using a two-stage process. The first step eliminates alternatives on the basis of a simplifying heuristic. The second step involves a selection from among the remaining alternatives and can employ a more rational and compensatory means of processing information. The PH model posits that…
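
    A minimal sketch of that two-stage logic, with invented alternatives, dimensions, thresholds and weights (none of them drawn from the PH literature), might look like this:

    ```python
    # Two-stage poliheuristic choice: stage one discards any alternative that fails a
    # noncompensatory threshold on the critical dimension; stage two picks the best
    # survivor by a compensatory weighted score. All inputs are illustrative.

    def poliheuristic_choice(alternatives, critical_dim, threshold, weights):
        # Stage 1: noncompensatory elimination on the critical dimension.
        survivors = [a for a in alternatives if a["scores"][critical_dim] >= threshold]
        if not survivors:
            return None
        # Stage 2: compensatory weighted evaluation of the remaining alternatives.
        def utility(a):
            return sum(weights[d] * s for d, s in a["scores"].items())
        return max(survivors, key=utility)

    if __name__ == "__main__":
        options = [
            {"name": "do nothing",     "scores": {"political": 0.2, "military": 0.5, "economic": 0.9}},
            {"name": "sanctions",      "scores": {"political": 0.7, "military": 0.6, "economic": 0.4}},
            {"name": "limited strike", "scores": {"political": 0.6, "military": 0.8, "economic": 0.3}},
        ]
        weights = {"political": 0.5, "military": 0.3, "economic": 0.2}
        chosen = poliheuristic_choice(options, "political", 0.5, weights)
        print(chosen["name"])
    ```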

  18. A Guide to Program Planning Vol. II.

    ERIC Educational Resources Information Center

    Allen, Earl, Sr.

    This booklet is a simplified guide for program planning and is intended to complement a somewhat lengthier companion booklet on program evaluation. It spells out in outline fashion the basic elements and steps involved in the planning process. Brief sections focus in turn on different phases of the planning process, including problem…

  19. Polymer flammability

    DOT National Transportation Integrated Search

    2005-05-01

    This report provides an overview of polymer flammability from a material science perspective and describes currently accepted test methods to quantify burning behavior. Simplifying assumptions about the gas and condensed phase processes of flaming co...

  20. The Application of a Massively Parallel Computer to the Simulation of Electrical Wave Propagation Phenomena in the Heart Muscle Using Simplified Models

    NASA Technical Reports Server (NTRS)

    Karpoukhin, Mikhii G.; Kogan, Boris Y.; Karplus, Walter J.

    1995-01-01

    The simulation of heart arrhythmia and fibrillation is an important and challenging task. Solving these problems with sophisticated mathematical models is beyond the capabilities of modern supercomputers. To overcome these difficulties, it is proposed to break the whole simulation problem into two tightly coupled stages: generation of the action potential using sophisticated models, and propagation of the action potential using simplified models. The well-known simplified models are compared and modified to bring the rate of depolarization and the action potential duration restitution closer to reality. The modified method of lines is used to parallelize the computational process. The conditions for the appearance of 2D spiral waves after the application of a premature beat, and the subsequent travel of the spiral wave inside the simulated tissue, are studied.
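
    The paper's simplified models and parallel implementation are not reproduced here. As a generic sketch of the method-of-lines approach to a simplified excitable medium, the example below discretizes space, steps the resulting ODEs explicitly using FitzHugh-Nagumo-style kinetics, and applies an S1 plane wave followed by a premature S2 stimulus. All parameter values are illustrative, and whether a sustained re-entrant spiral actually forms depends on the stimulus timing and parameters chosen.

    ```python
    # Method-of-lines sketch of a simplified 2-D excitable medium (illustrative only).
    import numpy as np

    # Grid and model parameters (illustrative, not taken from the paper).
    N, dx, dt = 100, 1.0, 0.1
    D, a, eps = 1.0, 0.1, 0.005

    u = np.zeros((N, N))   # fast excitation variable
    v = np.zeros((N, N))   # slow recovery variable

    def laplacian(f):
        # 5-point stencil with no-flux boundaries via edge padding.
        p = np.pad(f, 1, mode="edge")
        return (p[:-2, 1:-1] + p[2:, 1:-1] + p[1:-1, :-2] + p[1:-1, 2:] - 4.0 * f) / dx**2

    u[:, :5] = 1.0                         # S1: plane wave launched from the left edge
    for step in range(3000):
        if step == 800:                    # S2: premature stimulus in one quadrant;
            u[: N // 2, : N // 2] = 1.0    # spiral formation depends on timing/parameters
        du = D * laplacian(u) + u * (1.0 - u) * (u - a) - v
        dv = eps * (u - v)
        u += dt * du
        v += dt * dv

    print("excited fraction at end:", float((u > 0.5).mean()))
    ```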
