Sample records for risk management computer

  1. Software And Systems Engineering Risk Management

    DTIC Science & Technology

    2010-04-01

    RSKM 2004 COSO Enterprise RSKM Framework 2006 ISO/IEC 16085 Risk Management Process 2008 ISO/IEC 12207 Software Lifecycle Processes 2009 ISO/IEC...1 Software And Systems Engineering Risk Management John Walz VP Technical and Conferences Activities, IEEE Computer Society Vice-Chair Planning...Software & Systems Engineering Standards Committee, IEEE Computer Society US TAG to ISO TMB Risk Management Working Group Systems and Software

  2. A Comprehensive Review of Existing Risk Assessment Models in Cloud Computing

    NASA Astrophysics Data System (ADS)

    Amini, Ahmad; Jamil, Norziana

    2018-05-01

    Cloud computing is a popular paradigm in information technology and computing as it offers numerous advantages in terms of economic savings and minimal management effort. Although elasticity and flexibility bring tremendous benefits, cloud computing still raises many information security issues due to its unique characteristics that allow ubiquitous computing. Therefore, the vulnerabilities and threats in cloud computing have to be identified, and a proper risk assessment mechanism has to be in place for better cloud computing management. Various quantitative and qualitative risk assessment models have been proposed but, to our knowledge, none of them is suitable for the cloud computing environment. In this paper, we compare and analyse the strengths and weaknesses of existing risk assessment models. We then propose a new risk assessment model that sufficiently addresses all the characteristics of cloud computing, which the existing models do not.

  3. Managing the Risks Associated with End-User Computing.

    ERIC Educational Resources Information Center

    Alavi, Maryam; Weiss, Ira R.

    1986-01-01

    Identifies organizational risks of end-user computing (EUC) associated with different stages of the end-user applications life cycle (analysis, design, implementation). Generic controls are identified that address each of the risks enumerated in a manner that allows EUC management to select those most appropriate to their EUC environment. (5…

  4. SYN-OP-SYS™: A Computerized Management Information System for Quality Assurance and Risk Management

    PubMed Central

    Thomas, David J.; Weiner, Jayne; Lippincott, Ronald C.

    1985-01-01

    SYN·OP·SYS™ is a computerized management information system for quality assurance and risk management. Computer software for the efficient collection and analysis of “occurrences” and the clinical data associated with these kinds of patient events is described. The system is evaluated according to certain computer design criteria, and the system's implementation is assessed.

  4. Computer-Aided Nodule Assessment and Risk Yield (CANARY) Risk Management of Adenocarcinoma: The Future of Imaging?

    PubMed

    Foley, Finbar; Rajagopalan, Srinivasan; Raghunath, Sushravya M; Boland, Jennifer M; Karwoski, Ronald A; Maldonado, Fabien; Bartholmai, Brian J; Peikert, Tobias

    2016-01-01

    Increased clinical use of chest high-resolution computed tomography results in increased identification of lung adenocarcinomas and persistent subsolid opacities. However, these lesions range from very indolent to extremely aggressive tumors. Clinically relevant diagnostic tools to noninvasively risk stratify and guide individualized management of these lesions are lacking. Research efforts investigating semiquantitative measures to decrease interrater and intrarater variability are emerging, and in some cases steps have been taken to automate this process. However, many such methods currently are still suboptimal, require validation and are not yet clinically applicable. The computer-aided nodule assessment and risk yield software application represents a validated tool for automated, quantitative, and noninvasive risk stratification of adenocarcinoma lung nodules. Computer-aided nodule assessment and risk yield correlates well with consensus histology and postsurgical patient outcomes, and therefore may help to guide individualized patient management, for example, in identification of nodules amenable to radiological surveillance, or in need of adjunctive therapy. Copyright © 2016 Elsevier Inc. All rights reserved.

  6. Multi-objective reverse logistics model for integrated computer waste management.

    PubMed

    Ahluwalia, Poonam Khanijo; Nema, Arvind K

    2006-12-01

    This study aimed to address the issues involved in the planning and design of a computer waste management system in an integrated manner. A decision-support tool is presented for selecting an optimum configuration of computer waste management facilities (segregation, storage, treatment/processing, reuse/recycle and disposal) and allocation of waste to these facilities. The model is based on an integer linear programming method with the objectives of minimizing environmental risk as well as cost. The issue of uncertainty in the estimated waste quantities from multiple sources is addressed using the Monte Carlo simulation technique. An illustrated example of computer waste management in Delhi, India is presented to demonstrate the usefulness of the proposed model and to study tradeoffs between cost and risk. The results of the example problem show that it is possible to reduce the environmental risk significantly by a marginal increase in the available cost. The proposed model can serve as a powerful tool to address the environmental problems associated with exponentially growing quantities of computer waste which are presently being managed using rudimentary methods of reuse, recovery and disposal by various small-scale vendors.
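
    The cost-risk tradeoff explored in this record can be illustrated with a small sketch. The code below is not the authors' integer linear program; it is a hedged stand-in that enumerates hypothetical facility configurations, samples uncertain waste quantities by Monte Carlo, and minimizes a weighted sum of cost and environmental risk. All facility names and numbers are invented for illustration.

    ```python
    # Toy stand-in for a cost-vs-risk facility configuration choice under
    # uncertain waste quantities (not the authors' model; all values invented).
    import itertools
    import random

    random.seed(42)

    # Candidate facilities: (name, fixed cost, unit processing cost, unit risk score)
    FACILITIES = [
        ("recycler_A", 100.0, 2.0, 0.5),
        ("recycler_B", 150.0, 1.5, 0.8),
        ("landfill_C",  40.0, 0.5, 3.0),
    ]

    def sample_waste():
        """Uncertain annual waste quantity (tonnes), sampled per Monte Carlo run."""
        return max(random.gauss(200.0, 30.0), 0.0)

    def evaluate(config, n_runs=1000, weight_risk=0.5):
        """Expected weighted cost+risk objective for a set of open facilities."""
        total = 0.0
        for _ in range(n_runs):
            share = sample_waste() / len(config)      # waste split evenly (simplification)
            fixed = sum(f[1] for f in config)
            variable = sum(f[2] * share for f in config)
            risk = sum(f[3] * share for f in config)
            total += (1 - weight_risk) * (fixed + variable) + weight_risk * risk
        return total / n_runs

    # Brute-force enumeration is feasible for a handful of candidate sites.
    best = min(
        (c for r in range(1, len(FACILITIES) + 1)
         for c in itertools.combinations(FACILITIES, r)),
        key=evaluate,
    )
    print("preferred configuration:", [f[0] for f in best])
    ```

    Increasing `weight_risk` in this sketch shifts the preferred configuration toward lower-risk (but costlier) facilities, which is the kind of tradeoff the abstract describes.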

  7. Guidelines for developing NASA (National Aeronautics and Space Administration) ADP security risk management plans

    NASA Technical Reports Server (NTRS)

    Tompkins, F. G.

    1983-01-01

    This report presents guidance to NASA Computer security officials for developing ADP security risk management plans. The six components of the risk management process are identified and discussed. Guidance is presented on how to manage security risks that have been identified during a risk analysis performed at a data processing facility or during the security evaluation of an application system.

  8. A randomised controlled trial testing a web-based, computer-tailored self-management intervention for people with or at risk for chronic obstructive pulmonary disease: a study protocol

    PubMed Central

    2013-01-01

    Background Chronic Obstructive Pulmonary Disease (COPD) is a major cause of morbidity and mortality. Effective self-management support interventions are needed to improve the health and functional status of people with COPD or at risk for COPD. Computer-tailored technology could be an effective way to provide this support. Methods/Design This paper presents the protocol of a randomised controlled trial testing the effectiveness of a web-based, computer-tailored self-management intervention to change health behaviours of people with or at risk for COPD. An intervention group will be compared to a usual care control group, in which the intervention group will receive a web-based, computer-tailored self-management intervention. Participants will be recruited from an online panel and through general practices. Outcomes will be measured at baseline and at 6 months. The primary outcomes will be smoking behaviour, measuring the 7-day point prevalence abstinence and physical activity, measured in minutes. Secondary outcomes will include dyspnoea score, quality of life, stages of change, intention to change behaviour and alternative smoking behaviour measures, including current smoking behaviour, 24-hour point prevalence abstinence, prolonged abstinence, continued abstinence and number of quit attempts. Discussion To the best of our knowledge, this will be the first randomised controlled trial to test the effectiveness of a web-based, computer-tailored self-management intervention for people with or at risk for COPD. The results will be important to explore the possible benefits of computer-tailored interventions for the self-management of people with or at risk for COPD and potentially other chronic health conditions. Dutch trial register NTR3421 PMID:23742208

  9. ePORT, NASA's Computer Database Program for System Safety Risk Management Oversight (Electronic Project Online Risk Tool)

    NASA Technical Reports Server (NTRS)

    Johnson, Paul W.

    2008-01-01

    ePORT (electronic Project Online Risk Tool) provides a systematic approach to using an electronic database program to manage a program's or project's risk management processes. This presentation will briefly cover the standard risk management procedures, then thoroughly cover NASA's risk management tool called ePORT. The electronic Project Online Risk Tool (ePORT) is a web-based risk management program that provides a common framework to capture and manage risks, independent of a program's or project's size and budget. In addition to covering the risk management paradigm and providing standardized evaluation criteria for common management reporting, ePORT improves Product Line, Center and Corporate Management insight, simplifies program/project manager reporting, and maintains an archive of data for historical reference.

  10. Capacity planning for electronic waste management facilities under uncertainty: multi-objective multi-time-step model development.

    PubMed

    Ahluwalia, Poonam Khanijo; Nema, Arvind K

    2011-07-01

    Selection of optimum locations for new facilities and decisions regarding capacities at the proposed facilities are a major concern for municipal authorities/managers. The decision as to whether a single facility is preferred over multiple facilities of smaller capacities would vary with the priorities given to cost and associated risks, such as environmental or health risks or the risk perceived by society. Currently, waste streams such as computer waste are managed using rudimentary practices and flourish as an unorganized sector, mainly as backyard workshops in many cities of developing nations such as India. Uncertainty in the quantification of computer waste generation is another major concern due to the informal setup of the present computer waste management scenario. Hence, there is a need to simultaneously address uncertainty in waste generation quantities while analyzing the tradeoffs between cost and associated risks. The present study aimed to address the above-mentioned issues in a multi-time-step, multi-objective decision-support model, which can address multiple objectives of cost, environmental risk, socially perceived risk and health risk, while selecting the optimum configuration of existing and proposed facilities (location and capacities).

  11. Computer Viruses. Legal and Policy Issues Facing Colleges and Universities.

    ERIC Educational Resources Information Center

    Johnson, David R.; And Others

    Compiled by various members of the higher educational community together with risk managers, computer center managers, and computer industry experts, this report recommends establishing policies on an institutional level to protect colleges and universities from computer viruses and the accompanying liability. Various aspects of the topic are…

  12. Proposal for a Security Management in Cloud Computing for Health Care

    PubMed Central

    Dzombeta, Srdan; Brandis, Knud

    2014-01-01

    Cloud computing is currently one of the most popular themes of information systems research. Considering the nature of the information processed, health care organizations in particular need to assess and treat the specific risks relating to cloud computing in their information security management system. Therefore, in this paper we propose a framework that includes the most important security processes regarding cloud computing in the health care sector. Starting with a framework of general information security management processes derived from standards of the ISO 27000 family, the most important information security processes for health care organizations using cloud computing will be identified, considering the main risks regarding cloud computing and the type of information processed. The identified processes will help a health care organization using cloud computing to focus on the most important ISMS processes and to establish and operate them at an appropriate level of maturity given limited resources. PMID:24701137

  13. Proposal for a security management in cloud computing for health care.

    PubMed

    Haufe, Knut; Dzombeta, Srdan; Brandis, Knud

    2014-01-01

    Cloud computing is currently one of the most popular themes of information systems research. Considering the nature of the information processed, health care organizations in particular need to assess and treat the specific risks relating to cloud computing in their information security management system. Therefore, in this paper we propose a framework that includes the most important security processes regarding cloud computing in the health care sector. Starting with a framework of general information security management processes derived from standards of the ISO 27000 family, the most important information security processes for health care organizations using cloud computing will be identified, considering the main risks regarding cloud computing and the type of information processed. The identified processes will help a health care organization using cloud computing to focus on the most important ISMS processes and to establish and operate them at an appropriate level of maturity given limited resources.

  14. Risks and crises for healthcare providers: the impact of cloud computing.

    PubMed

    Glasberg, Ronald; Hartmann, Michael; Draheim, Michael; Tamm, Gerrit; Hessel, Franz

    2014-01-01

    We analyze risks and crises for healthcare providers and discuss the impact of cloud computing in such scenarios. The analysis is conducted in a holistic way, taking into account organizational and human aspects, clinical, IT-related, and utilities-related risks as well as incorporating the view of the overall risk management.

  15. Improving the U.S. Navy’s Execution of Technical Authority through a Common Risk Management and Technical Assessment Process

    DTIC Science & Technology

    2008-09-01

    ITP . Assessment Indicators: • Has the risk management team (RMT) provided a risk management plan (RMP)? − Does the RMP provide an organized...processes. Diskettes, which contain the necessary programs for accessing BMP◊NET from IBM -compatible or Macintosh computers with a modem, and answers to

  16. Health-adjusted premium subsidies in the Netherlands.

    PubMed

    van de Ven, Wynand P M M; van Vliet, René C J A; Lamers, Leida M

    2004-01-01

    The Dutch government has decided to proceed with managed competition in health care. In this paper we report on progress made with health-based risk adjustment, a key issue in managed competition. In 2004 both Diagnostic Cost Groups (DCGs) computed from hospital diagnoses only and Pharmacy-based Cost Groups (PCGs) computed from out-patient prescription drugs are used to set the premium subsidies for competing risk-bearing sickness funds. These health-based risk adjusters appear to be effective and complementary. Risk selection is not a major problem in the Netherlands. Despite the progress made, we are still faced with a full research agenda for risk adjustment in the coming years.

  17. Risks and Crises for Healthcare Providers: The Impact of Cloud Computing

    PubMed Central

    Glasberg, Ronald; Hartmann, Michael; Tamm, Gerrit

    2014-01-01

    We analyze risks and crises for healthcare providers and discuss the impact of cloud computing in such scenarios. The analysis is conducted in a holistic way, taking into account organizational and human aspects, clinical, IT-related, and utilities-related risks as well as incorporating the view of the overall risk management. PMID:24707207

  18. Use of mechanistic simulations as a quantitative risk-ranking tool within the quality by design framework.

    PubMed

    Stocker, Elena; Toschkoff, Gregor; Sacher, Stephan; Khinast, Johannes G

    2014-11-20

    The purpose of this study is to evaluate the use of computer simulations for generating quantitative knowledge as a basis for risk ranking and mechanistic process understanding, as required by ICH Q9 on quality risk management systems. In this specific publication, the main focus is the demonstration of a risk assessment workflow, including a computer simulation for the generation of mechanistic understanding of active tablet coating in a pan coater. Process parameter screening studies are statistically planned under consideration of impacts on a potentially critical quality attribute, i.e., coating mass uniformity. Based on the computer simulation data, a process failure mode and effects analysis of the risk factors is performed. This results in a quantitative criticality assessment of process parameters and the risk priority evaluation of failure modes. The factor for a quantitative reassessment of the criticality and risk priority is the coefficient of variation, which represents the coating mass uniformity. The major conclusion drawn from this work is a successful demonstration of the integration of computer simulation in the risk management workflow, leading to an objective and quantitative risk assessment. Copyright © 2014. Published by Elsevier B.V.
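
    A minimal sketch of the kind of quantitative criterion described above, assuming invented coating-mass data: the coefficient of variation of the simulated coating mass is computed and fed into a toy FMEA-style risk priority number. Severity and detectability scores are hypothetical; the paper's actual simulation and scoring are not reproduced here.

    ```python
    # Illustrative only: coefficient of variation (CV) of coating mass as a
    # quantitative occurrence criterion in a toy FMEA-style ranking.
    import statistics

    def coefficient_of_variation(values):
        """CV = standard deviation / mean; here a proxy for coating mass uniformity."""
        return statistics.stdev(values) / statistics.mean(values)

    # Simulated coating mass per tablet (mg) under two hypothetical parameter settings.
    baseline_masses = [10.1, 9.8, 10.3, 9.9, 10.0, 10.2]
    worst_case_masses = [10.9, 8.7, 11.2, 9.1, 10.5, 9.4]

    def risk_priority(cv, severity=7, detectability=4):
        """Toy risk priority number: occurrence score scales with CV (capped at 10)."""
        occurrence = min(10, max(1, round(cv * 100)))
        return severity * occurrence * detectability

    for label, masses in [("baseline", baseline_masses), ("worst case", worst_case_masses)]:
        cv = coefficient_of_variation(masses)
        print(f"{label}: CV={cv:.3f}, RPN={risk_priority(cv)}")
    ```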

  19. Adapting risk management and computational intelligence network optimization techniques to improve traffic throughput and tail risk analysis.

    DOT National Transportation Integrated Search

    2014-04-01

    Risk management techniques are used to analyze fluctuations in uncontrollable variables and keep those fluctuations from impeding : the core function of a system or business. Examples of this are making sure that volatility in copper and aluminum pri...

  20. Appropriate test selection for single-photon emission computed tomography imaging: association with clinical risk, posttest management, and outcomes.

    PubMed

    Aldweib, Nael; Negishi, Kazuaki; Seicean, Sinziana; Jaber, Wael A; Hachamovitch, Rory; Cerqueira, Manuel; Marwick, Thomas H

    2013-09-01

    Appropriate use criteria (AUC) for stress single-photon emission computed tomography (SPECT) are only one step in appropriate use of imaging. Other steps include pretest clinical risk evaluation and optimal management responses. We sought to understand the link between AUC, risk evaluation, management, and outcome. We used AUC to classify 1,199 consecutive patients (63.8 ± 12.5 years, 56% male) undergoing SPECT as inappropriate, uncertain, and appropriate. Framingham score for asymptomatic patients and Bethesda angina score for symptomatic patients were used to classify patients into high (≥5%/y), intermediate, and low (≤1%/y) risk. Subsequent patient management was defined as appropriate or inappropriate based on the concordance between management decisions and the SPECT result. Patients were followed up for a median of 4.8 years, and cause of death was obtained from the social security death registry. Overall, 62% of SPECTs were appropriate, 18% inappropriate, and 20% uncertain (only 5 were unclassified). Of 324 low-risk studies, 108 (33%) were inappropriate, compared with 94 (15%) of 621 intermediate-risk and 1 (1%) of 160 high-risk studies (P < .001). There were 79 events, with outcomes of inappropriate patients better than uncertain and appropriate patients. Management was appropriate in 986 (89%), and appropriateness of patient management was unrelated to AUC (P = .65). Pretest clinical risk evaluation may be helpful in appropriateness assessment because very few high-risk patients are inappropriate, but almost half of low-risk patients are inappropriate or uncertain. Appropriate patient management is independent of appropriateness of testing. © 2013.
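
    The pretest risk strata quoted in the abstract (low risk ≤1%/y, high risk ≥5%/y, intermediate otherwise) translate directly into a simple classification rule; the sketch below uses hypothetical patients and is not the study's code.

    ```python
    # Toy illustration of the pretest risk strata described in the abstract.
    # Patients and annualized risk values are hypothetical.

    def pretest_risk_stratum(annual_risk_pct: float) -> str:
        """Classify by annualized event risk: low <=1%/y, high >=5%/y, else intermediate."""
        if annual_risk_pct <= 1.0:
            return "low"
        if annual_risk_pct >= 5.0:
            return "high"
        return "intermediate"

    patients = {"patient_1": 0.8, "patient_2": 2.5, "patient_3": 6.1}
    for pid, risk in patients.items():
        print(pid, pretest_risk_stratum(risk))
    ```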

  1. Integrating emerging earth science technologies into disaster risk management: an enterprise architecture approach

    NASA Astrophysics Data System (ADS)

    Evans, J. D.; Hao, W.; Chettri, S. R.

    2014-12-01

    Disaster risk management has grown to rely on earth observations, multi-source data analysis, numerical modeling, and interagency information sharing. The practice and outcomes of disaster risk management will likely undergo further change as several emerging earth science technologies come of age: mobile devices; location-based services; ubiquitous sensors; drones; small satellites; satellite direct readout; Big Data analytics; cloud computing; Web services for predictive modeling, semantic reconciliation, and collaboration; and many others. Integrating these new technologies well requires developing and adapting them to meet current needs; but also rethinking current practice to draw on new capabilities to reach additional objectives. This requires a holistic view of the disaster risk management enterprise and of the analytical or operational capabilities afforded by these technologies. One helpful tool for this assessment, the GEOSS Architecture for the Use of Remote Sensing Products in Disaster Management and Risk Assessment (Evans & Moe, 2013), considers all phases of the disaster risk management lifecycle for a comprehensive set of natural hazard types, and outlines common clusters of activities and their use of information and computation resources. We are using these architectural views, together with insights from current practice, to highlight effective, interrelated roles for emerging earth science technologies in disaster risk management. These roles may be helpful in creating roadmaps for research and development investment at national and international levels.

  2. 76 FR 76215 - Privacy Act; System of Records: State-78, Risk Analysis and Management Records

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-12-06

    ... network. Vetting requests, analyses, and results will be stored separately on a classified computer... DEPARTMENT OF STATE [Public Notice 7709] Privacy Act; System of Records: State-78, Risk Analysis... a system of records, Risk Analysis and Management Records, State-78, pursuant to the provisions of...

  3. A method for mapping fire hazard and risk across multiple scales and its application in fire management

    Treesearch

    Robert E. Keane; Stacy A. Drury; Eva C. Karau; Paul F. Hessburg; Keith M. Reynolds

    2010-01-01

    This paper presents modeling methods for mapping fire hazard and fire risk using a research model called FIREHARM (FIRE Hazard and Risk Model) that computes common measures of fire behavior, fire danger, and fire effects to spatially portray fire hazard over space. FIREHARM can compute a measure of risk associated with the distribution of these measures over time using...

  4. Introduction to Decision Support Systems for Risk Based Management of Contaminated Sites

    EPA Science Inventory

    A book on Decision Support Systems for Risk-based Management of contaminated sites is appealing for two reasons. First, it addresses the problem of contaminated sites, which has worldwide importance. Second, it presents Decision Support Systems (DSSs), which are powerful comput...

  5. Optimizing Security of Cloud Computing within the DoD

    DTIC Science & Technology

    2010-12-01

    information security governance and risk management; application security; cryptography; security architecture and design; operations security; business ...governance and risk management; application security; cryptography; security architecture and design; operations security; business continuity...20 7. Operational Security (OPSEC).........................................................20 8. Business Continuity Planning (BCP) and Disaster

  6. Integrity management of offshore structures and its implication on computation of structural action effects and resistance

    NASA Astrophysics Data System (ADS)

    Moan, T.

    2017-12-01

    An overview of integrity management of offshore structures, with emphasis on the oil and gas energy sector, is given. Based on relevant accident experiences and means to control the associated risks, accidents are categorized from a technical-physical as well as a human and organizational point of view. Structural risk relates to extreme actions as well as structural degradation. Risk mitigation measures, including adequate design criteria, inspection, repair and maintenance as well as quality assurance and control of engineering processes, are briefly outlined. The current status of risk and reliability methodology to aid decisions in integrity management is briefly reviewed. Finally, the need to balance uncertainties in data, methods and computational effort, and to apply high-fidelity methods cautiously and under quality assurance and control so as to avoid human errors, is emphasized, together with a plea to develop both high-fidelity and efficient, simplified methods for design.

  7. Intelligent instrumentation applied in environment management

    NASA Astrophysics Data System (ADS)

    Magheti, Mihnea I.; Walsh, Patrick; Delassus, Patrick

    2005-06-01

    The use of information and communications technology in environment management and research has witnessed a renaissance in recent years. From optical sensor technology, expert systems, GIS and communications technologies to computer-aided harvesting and yield prediction, these systems are increasingly used for applications in the management of natural resources and biodiversity. This paper presents an environmental decision support system used to monitor biodiversity and present a risk rating for the invasion of pests into the particular systems being examined. This system will utilise expert mobile technology coupled with artificial intelligence and predictive modelling, and will emphasize the potential for expansion into many areas of intelligent remote sensing and computer-aided decision-making for environment management or certification. Monitoring and prediction in natural systems, harnessing the potential of computing and communication technologies, is an emerging technology within the area of environmental management. This research will lead to the initiation of a hardware and software multi-tier decision support system for environment management, allowing an evaluation of areas for biodiversity or areas at risk from invasive species, based upon environmental factors/systems.

  8. Risk in Enterprise Cloud Computing: Re-Evaluated

    ERIC Educational Resources Information Center

    Funmilayo, Bolonduro, R.

    2016-01-01

    A quantitative study was conducted to get the perspectives of IT experts about risks in enterprise cloud computing. In businesses, these IT experts are often not in positions to prioritize business needs. The business experts commonly known as business managers mostly determine an organization's business needs. Even if an IT expert classified a…

  9. Guidelines for contingency planning NASA (National Aeronautics and Space Administration) ADP security risk reduction decision studies

    NASA Technical Reports Server (NTRS)

    Tompkins, F. G.

    1984-01-01

    Guidance is presented to NASA Computer Security Officials for determining the acceptability or unacceptability of ADP security risks based on the technical, operational and economic feasibility of potential safeguards. The risk management process is reviewed as a specialized application of the systems approach to problem solving and information systems analysis and design. Reporting the results of the risk reduction analysis to management is considered. Report formats for the risk reduction study are provided.

  10. [Computer-assisted cardiovascular disease management: better implementation of care but no improvement in clinical outcomes].

    PubMed

    de Wit, Niek J

    2012-01-01

    Computer support is considered by many to be a promising strategy for improving healthcare interventions, especially in the management of chronic diseases. So far, however, evidence of the effectiveness of ICT support in healthcare is limited. Recently, computer-supported cardiovascular disease management was compared with usual care during an RCT comprised of 1100 primary care patients. This trial demonstrated that neither the clinical outcome nor the cardiovascular morbidity rate improved, even though management of the risk factors improved over 1 year of follow-up. The pragmatic design of the RCT in daily general practice may have restricted implementing the computer support, and may also have hampered the evaluation of the cardiovascular effects. The results demonstrate that although computer support may help improve the performance of disease management, its impact on disease outcomes is questionable. ICT innovations in healthcare require rigorous investigative evaluation before their implementation in daily practice can be justified.

  11. 17 CFR 200.19a - Director of the Division of Trading and Markets.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... dealers that compute deductions for market and credit risk pursuant to § 240.15c3-1e of this chapter. This supervision includes the assessment of internal risk management controls and mathematical models used to... COMMISSION ORGANIZATION; CONDUCT AND ETHICS; AND INFORMATION AND REQUESTS Organization and Program Management...

  12. Securing PCs and Data in Libraries and Schools: A Handbook with Menuing, Anti-Virus, and Other Protective Software.

    ERIC Educational Resources Information Center

    Benson, Allen C.

    This handbook is designed to help readers identify and eliminate security risks, with sound recommendations and library-tested security software. Chapter 1 "Managing Your Facilities and Assessing Your Risks" addresses fundamental management responsibilities including planning for a secure system, organizing computer-related information, assessing…

  13. Computational methods in the pricing and risk management of modern financial derivatives

    NASA Astrophysics Data System (ADS)

    Deutsch, Hans-Peter

    1999-09-01

    In the last 20 years modern finance has developed into a complex, mathematically challenging field. Very complicated risks exist in financial markets which need very advanced methods to measure and/or model them. The financial instruments invented by market participants to trade these risks, the so-called derivatives, are usually even more complicated than the risks themselves and sometimes generate new risks. Topics like random walks, stochastic differential equations, martingale measures, time series analysis, implied correlations, etc. are in common use in the field. This is why more and more people with a science background, such as physicists, mathematicians, or computer scientists, are entering the field of finance. The measurement and management of all these risks is the key to the continuing success of banks. This talk gives insight into today's common methods of modern market risk management such as variance-covariance, historical simulation, Monte Carlo, “Greek” ratios, etc., including the statistical concepts on which they are based. Derivatives are at the same time the main reason for and the most effective means of conducting risk management. As such, they stand at the beginning and end of risk management. The valuation of derivatives and structured financial instruments is therefore the prerequisite, the condition sine qua non, for all risk management. This talk introduces some of the important valuation methods used in modern derivatives pricing such as present value, Black-Scholes, binomial trees, Monte Carlo, etc. In summary this talk highlights an area outside physics where there is a lot of interesting work to do, especially for physicists. Or as one of our consultants said: The fascinating thing about this job is that Arthur Andersen hired me not ALTHOUGH I am a physicist but BECAUSE I am a physicist.
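
    Two of the valuation methods named in the talk, the Black-Scholes formula and Monte Carlo simulation, can be sketched for a European call option as follows. The parameters are arbitrary examples, and the code is a generic textbook implementation rather than anything specific to the talk.

    ```python
    # Generic sketch: Black-Scholes price of a European call and a Monte Carlo
    # estimate of the same price under geometric Brownian motion.
    import math
    import random

    def norm_cdf(x):
        """Standard normal CDF via the error function."""
        return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

    def black_scholes_call(S, K, r, sigma, T):
        d1 = (math.log(S / K) + (r + 0.5 * sigma**2) * T) / (sigma * math.sqrt(T))
        d2 = d1 - sigma * math.sqrt(T)
        return S * norm_cdf(d1) - K * math.exp(-r * T) * norm_cdf(d2)

    def monte_carlo_call(S, K, r, sigma, T, n_paths=200_000, seed=1):
        rng = random.Random(seed)
        payoff_sum = 0.0
        for _ in range(n_paths):
            z = rng.gauss(0.0, 1.0)
            s_t = S * math.exp((r - 0.5 * sigma**2) * T + sigma * math.sqrt(T) * z)
            payoff_sum += max(s_t - K, 0.0)
        return math.exp(-r * T) * payoff_sum / n_paths

    print("analytic:   ", round(black_scholes_call(100, 105, 0.03, 0.2, 1.0), 4))
    print("monte carlo:", round(monte_carlo_call(100, 105, 0.03, 0.2, 1.0), 4))
    ```

    The two estimates agree to within Monte Carlo sampling error, which is the usual sanity check before the simulation approach is extended to payoffs with no closed-form price.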

  14. CANARY Risk Management of Adenocarcinoma: The Future of Imaging?

    PubMed Central

    Foley, Finbar; Rajagopalan, Srinivasan; Raghunath, Sushravya M; Boland, Jennifer M; Karwoski, Ronald A.; Maldonado, Fabien; Bartholmai, Brian J; Peikert, Tobias

    2016-01-01

    Increased clinical utilization of chest high resolution computed tomography results in increased identification of lung adenocarcinomas and persistent sub-solid opacities. However, these lesions range from very indolent to extremely aggressive tumors. Clinically relevant diagnostic tools to non-invasively risk stratify and guide individualized management of these lesions are lacking. Research efforts investigating semi-quantitative measures to decrease inter- and intra-rater variability are emerging, and in some cases steps have been taken to automate this process. However, many such methods currently are still sub-optimal, require validation and are not yet clinically applicable. The Computer-Aided Nodule Assessment and Risk Yield (CANARY) software application represents a validated tool for automated, quantitative, non-invasive risk stratification of adenocarcinoma lung nodules. CANARY correlates well with consensus histology and post-surgical patient outcomes and therefore may help to guide individualized patient management, e.g. in identification of nodules amenable to radiological surveillance, or in need of adjunctive therapy. PMID:27568149

  15. Highway rock slope management program.

    DOT National Transportation Integrated Search

    2001-06-30

    Development of a comprehensive geotechnical database for risk management of highway rock slope problems is described. Computer software selected to program the client/server application in windows environment, components and structure of the geote...

  16. Security Risks: Management and Mitigation in the Software Life Cycle

    NASA Technical Reports Server (NTRS)

    Gilliam, David P.

    2004-01-01

    A formal approach to managing and mitigating security risks in the software life cycle is requisite to developing software that has a higher degree of assurance that it is free of security defects which pose risk to the computing environment and the organization. Due to its criticality, security should be integrated as a formal approach in the software life cycle. Both a software security checklist and assessment tools should be incorporated into this life cycle process and integrated with a security risk assessment and mitigation tool. The current research at JPL addresses these areas through the development of a Software Security Assessment Instrument (SSAI) and integrating it with a Defect Detection and Prevention (DDP) risk management tool.

  17. Proceedings from the conference on high speed computing: High speed computing and national security

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hirons, K.P.; Vigil, M.; Carlson, R.

    1997-07-01

    This meeting covered the following topics: technologies/national needs/policies: past, present and future; information warfare; crisis management/massive data systems; risk assessment/vulnerabilities; Internet law/privacy and rights of society; challenges to effective ASCI programmatic use of 100 TFLOPs systems; and new computing technologies.

  18. A Risk-Analysis Approach to Implementing Web-Based Assessment

    ERIC Educational Resources Information Center

    Ricketts, Chris; Zakrzewski, Stan

    2005-01-01

    Computer-Based Assessment is a risky business. This paper proposes the use of a model for web-based assessment systems that identifies pedagogic, operational, technical (non web-based), web-based and financial risks. The strategies and procedures for risk elimination or reduction arise from risk analysis and management and are the means by which…

  19. Knowledge management: an application to wildfire prevention planning

    Treesearch

    Daniel L Schmoldt

    1989-01-01

    Residential encroachment into wildland areas places an additional burden on fire management activities. Prevention programs, fuel management efforts, and suppression strategies, previously employed in wildland areas, require modification for protection of increased values at risk in this interface area. Knowledge-based computer systems are being investigated as...

  20. APPLICATION OF THE US DECISION SUPPORT TOOL FOR MATERIALS AND WASTE MANAGEMENT

    EPA Science Inventory

    EPA's National Risk Management Research Laboratory has led the development of a municipal solid waste decision support tool (MSW-DST). The computer software can be used to calculate life-cycle environmental tradeoffs and full costs of different waste management plans or recycling...

  1. Contrast Media Extravasation of Computed Tomography and Magnetic Resonance Imaging: Management Guidelines for the Radiologist.

    PubMed

    Nicola, Refky; Shaqdan, Khalid Wael; Aran, Shima; Prabhakar, Anand M; Singh, Ajay Kumar; Abujudeh, Hani H

    2016-01-01

    Intravenous contrast administration has been of great importance in diagnostic radiology, but it is not without risks, whether due to local or systemic allergic reactions or due to subcutaneous extravasation of contrast media. Subcutaneous contrast medium extravasation is an infrequent, yet well-recognized complication. Most incidents are minor and can be managed conservatively, but a few cases require immediate surgical intervention. This article discusses the risk factors, clinical manifestations, and conservative and surgical approaches of subcutaneous contrast media extravasation for both computed tomography and magnetic resonance imaging. Copyright © 2015 Mosby, Inc. All rights reserved.

  2. Radiation dose reduction in computed tomography: techniques and future perspective

    PubMed Central

    Yu, Lifeng; Liu, Xin; Leng, Shuai; Kofler, James M; Ramirez-Giraldo, Juan C; Qu, Mingliang; Christner, Jodie; Fletcher, Joel G; McCollough, Cynthia H

    2011-01-01

    Despite universal consensus that computed tomography (CT) overwhelmingly benefits patients when used for appropriate indications, concerns have been raised regarding the potential risk of cancer induction from CT due to the exponentially increased use of CT in medicine. Keeping radiation dose as low as reasonably achievable, consistent with the diagnostic task, remains the most important strategy for decreasing this potential risk. This article summarizes the general technical strategies that are commonly used for radiation dose management in CT. Dose-management strategies for pediatric CT, cardiac CT, dual-energy CT, CT perfusion and interventional CT are specifically discussed, and future perspectives on CT dose reduction are presented. PMID:22308169

  3. ESR/ERS white paper on lung cancer screening

    PubMed Central

    Bonomo, Lorenzo; Gaga, Mina; Nackaerts, Kristiaan; Peled, Nir; Prokop, Mathias; Remy-Jardin, Martine; von Stackelberg, Oyunbileg; Sculier, Jean-Paul

    2015-01-01

    Lung cancer is the most frequently fatal cancer, with poor survival once the disease is advanced. Annual low dose computed tomography has shown a survival benefit in screening individuals at high risk for lung cancer. Based on the available evidence, the European Society of Radiology and the European Respiratory Society recommend lung cancer screening in comprehensive, quality-assured, longitudinal programmes within a clinical trial or in routine clinical practice at certified multidisciplinary medical centres. Minimum requirements include: standardised operating procedures for low dose image acquisition, computer-assisted nodule evaluation, and positive screening results and their management; inclusion/exclusion criteria; expectation management; and smoking cessation programmes. Further refinements are recommended to increase quality, outcome and cost-effectiveness of lung cancer screening: inclusion of risk models, reduction of effective radiation dose, computer-assisted volumetric measurements and assessment of comorbidities (chronic obstructive pulmonary disease and vascular calcification). All these requirements should be adjusted to the regional infrastructure and healthcare system, in order to exactly define eligibility using a risk model, nodule management and quality assurance plan. The establishment of a central registry, including biobank and image bank, and preferably on a European level, is strongly encouraged. PMID:25929956

  4. Integrated Geo Hazard Management System in Cloud Computing Technology

    NASA Astrophysics Data System (ADS)

    Hanifah, M. I. M.; Omar, R. C.; Khalid, N. H. N.; Ismail, A.; Mustapha, I. S.; Baharuddin, I. N. Z.; Roslan, R.; Zalam, W. M. Z.

    2016-11-01

    Geo-hazards can degrade environmental health and cause huge economic losses, especially in mountainous areas. In order to mitigate geo-hazards effectively, cloud computing technology is introduced for managing a geo-hazard database. Cloud computing technology and its services are capable of providing stakeholders with geo-hazard information in near real time for effective environmental management and decision-making. The UNITEN Integrated Geo Hazard Management System consists of the network management and operations used to monitor geo-hazard disasters, especially landslides, in our study area at the Kelantan River Basin and the boundary between Hulu Kelantan and Hulu Terengganu. The system will provide an easily managed, flexible measuring system whose data management operates autonomously and can be controlled by commands to collect data remotely using “cloud” computing. This paper aims to document the above relationship by identifying the special features and needs associated with effective geo-hazard database management using a “cloud system”. The system will later be used as part of the development activities, helping to minimize the frequency of geo-hazards and the risk in the research area.

  5. Cloudbus Toolkit for Market-Oriented Cloud Computing

    NASA Astrophysics Data System (ADS)

    Buyya, Rajkumar; Pandey, Suraj; Vecchiola, Christian

    This keynote paper: (1) presents the 21st century vision of computing and identifies various IT paradigms promising to deliver computing as a utility; (2) defines the architecture for creating market-oriented Clouds and computing atmosphere by leveraging technologies such as virtual machines; (3) provides thoughts on market-based resource management strategies that encompass both customer-driven service management and computational risk management to sustain SLA-oriented resource allocation; (4) presents the work carried out as part of our new Cloud Computing initiative, called Cloudbus: (i) Aneka, a Platform as a Service software system containing SDK (Software Development Kit) for construction of Cloud applications and deployment on private or public Clouds, in addition to supporting market-oriented resource management; (ii) internetworking of Clouds for dynamic creation of federated computing environments for scaling of elastic applications; (iii) creation of 3rd party Cloud brokering services for building content delivery networks and e-Science applications and their deployment on capabilities of IaaS providers such as Amazon along with Grid mashups; (iv) CloudSim supporting modelling and simulation of Clouds for performance studies; (v) Energy Efficient Resource Allocation Mechanisms and Techniques for creation and management of Green Clouds; and (vi) pathways for future research.

  6. All Hazards Risk Assessment Transition Project: Report on Capability Assessment Management System (CAMS) Automation

    DTIC Science & Technology

    2014-04-01

    All Hazards Risk Assessment Transition Project : Report on Capability Assessment Management System (CAMS) Automation...Prepared by: George Giroux Computer Applications Specialist Modis155 Queen Street, Suite 1206 Ottawa, ON K1P 6L1 Contract # THS 2335474-2 Project ...Under a Canadian Safety and Security Program (CSSP) targeted investigation (TI) project (CSSP-2012-TI- 1108), Defence Research and Development

  7. Risk analysis of computer system designs

    NASA Technical Reports Server (NTRS)

    Vallone, A.

    1981-01-01

    Adverse events during implementation can affect final capabilities, schedule and cost of a computer system even though the system was accurately designed and evaluated. Risk analysis enables the manager to forecast the impact of those events and to timely ask for design revisions or contingency plans before making any decision. This paper presents a structured procedure for an effective risk analysis. The procedure identifies the required activities, separates subjective assessments from objective evaluations, and defines a risk measure to determine the analysis results. The procedure is consistent with the system design evaluation and enables a meaningful comparison among alternative designs.

  8. Cloud computing in pharmaceutical R&D: business risks and mitigations.

    PubMed

    Geiger, Karl

    2010-05-01

    Cloud computing provides information processing power and business services, delivering these services over the Internet from centrally hosted locations. Major technology corporations aim to supply these services to every sector of the economy. Deploying business processes 'in the cloud' requires special attention to the regulatory and business risks assumed when running on both hardware and software that are outside the direct control of a company. The identification of risks at the correct service level allows a good mitigation strategy to be selected. The pharmaceutical industry can take advantage of existing risk management strategies that have already been tested in the finance and electronic commerce sectors. In this review, the business risks associated with the use of cloud computing are discussed, and mitigations achieved through knowledge from securing services for electronic commerce and from good IT practice are highlighted.

  9. A Quantitative Risk Analysis Framework for Evaluating and Monitoring Operational Reliability of Cloud Computing

    ERIC Educational Resources Information Center

    Islam, Muhammad Faysal

    2013-01-01

    Cloud computing offers the advantage of on-demand, reliable and cost efficient computing solutions without the capital investment and management resources to build and maintain in-house data centers and network infrastructures. Scalability of cloud solutions enable consumers to upgrade or downsize their services as needed. In a cloud environment,…

  10. A probabilistic method for computing quantitative risk indexes from medical injuries compensation claims.

    PubMed

    Dalle Carbonare, S; Folli, F; Patrini, E; Giudici, P; Bellazzi, R

    2013-01-01

    The increasing demand for health care services and the complexity of health care delivery require Health Care Organizations (HCOs) to approach clinical risk management through proper methods and tools. An important aspect of risk management is to exploit the analysis of medical injuries compensation claims in order to reduce adverse events and, at the same time, to optimize the costs of health insurance policies. This work provides a probabilistic method to estimate the risk level of an HCO by computing quantitative risk indexes from medical injury compensation claims. Our method is based on the estimate of a loss probability distribution from compensation claims data through parametric and non-parametric modeling and Monte Carlo simulations. The loss distribution can be estimated both on the whole dataset and, thanks to the application of a Bayesian hierarchical model, on stratified data. The approach allows quantitative assessment of the risk structure of the HCO by analyzing the loss distribution and deriving its expected value and percentiles. We applied the proposed method to 206 cases of injuries with compensation requests collected from 1999 to the first semester of 2007 by the HCO of Lodi, in the Northern part of Italy. We computed the risk indexes taking into account the different clinical departments and the different hospitals involved. The approach proved to be useful to understand the HCO risk structure in terms of frequency, severity, expected and unexpected loss related to adverse events.
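
    A rough sketch of the general approach described above, under assumed (not fitted) distributions: claim frequency is sampled from a Poisson distribution and claim severity from a lognormal, an aggregate annual loss distribution is built by Monte Carlo, and the expected value and upper percentiles are read off as risk indexes. The parameters are hypothetical and the paper's Bayesian stratification is not shown.

    ```python
    # Sketch: Monte Carlo aggregate annual loss from compensation claims
    # (Poisson frequency, lognormal severity; parameters are invented).
    import math
    import random

    random.seed(0)

    def simulate_annual_loss(freq_lambda=12.0, sev_mu=9.0, sev_sigma=1.2):
        """One simulated year: Poisson claim count, lognormal claim severities."""
        # Poisson sampling via Knuth's algorithm (standard library only).
        L, k, p = math.exp(-freq_lambda), 0, 1.0
        while True:
            p *= random.random()
            if p <= L:
                break
            k += 1
        return sum(random.lognormvariate(sev_mu, sev_sigma) for _ in range(k))

    losses = sorted(simulate_annual_loss() for _ in range(20_000))
    expected = sum(losses) / len(losses)
    p95 = losses[int(0.95 * len(losses))]
    p99 = losses[int(0.99 * len(losses))]
    print(f"expected annual loss: {expected:,.0f}")
    print(f"95th percentile:      {p95:,.0f}")
    print(f"99th percentile:      {p99:,.0f}")
    ```

    The gap between the expected loss and the upper percentiles is what the abstract refers to as expected versus unexpected loss.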

  11. Cybersecurity Capability Maturity Model for Information Technology Services (C2M2 for IT Services), Version 1.0

    DTIC Science & Technology

    2015-04-01

    Information and technology assets are a particular focus of the model. Information assets could be digital (e.g., stored in a computer system...which give context for the domain and introduce its practices and its abbreviation. (The abbreviation for the Risk Management domain, for example...Objectives and Practices 1. Manage Asset Inventory MIL1 a. There is an inventory of technology assets (e.g., computers and telecommunication equipment

  12. Managing multihazards risk in metropolitan USA

    NASA Astrophysics Data System (ADS)

    Aktan, A. Emin; Comfort, Louise K.; Shanis, Donald S.

    2003-07-01

    This proposal outlines an action plan for risk management in the Delaware Valley Metropolitan Region. This plan is consistent with the goals for strengthening homeland security announced by President Bush, and is designed to complement efforts currently under development by Pennsylvania Emergency Management Agency and Department of Health. This plan proposes the formation of a Delaware Valley Risk Management Consortium, representing the critical disciplines and organizations related to risk assessment and management. This group would have membership from academic institutions, government agencies, industry, and nonprofit organizations. This Consortium would develop a systemic scope of work with the appropriate recommendations for technology acquisition, development and integration with risk management policies and procedures. This scope of work would include the development of two related information systems for the Delaware Valley Region. The first would be a comprehensive 'health monitoring' system to assess the continuity of operations, which would use integrated remote sensing and imaging, information gathering, communication, computation, and, information processing and management over wide-area networks covering the entire metropolitan area. The second would use real-time information from the health monitoring system to support interactive communication, search and information exchange needed to coordinate action among the relevant agencies to mitigate risk, respond to hazards and manage its resources efficiently and effectively.

  13. Quantitative Microbial Risk Assessment Tutorial: Publishing a Microbial Density Time Series as a Txt File

    EPA Science Inventory

    A SARA Timeseries Utility supports analysis and management of time-varying environmental data including listing, graphing, computing statistics, computing meteorological data and saving in a WDM or text file. File formats supported include WDM, HSPF Binary (.hbn), USGS RDB, and T...

  14. Laptop Computers in the Elementary Classroom: Authentic Instruction with At-Risk Students

    ERIC Educational Resources Information Center

    Kemker, Kate; Barron, Ann E.; Harmes, J. Christine

    2007-01-01

    This case study investigated the integration of laptop computers into an elementary classroom in a low socioeconomic status (SES) school. Specifically, the research examined classroom management techniques and aspects of authentic learning relative to the student projects and activities. A mixed methods approach included classroom observations,…

  15. Using Computational Approaches to Improve Risk-Stratified Patient Management: Rationale and Methods

    PubMed Central

    Stone, Bryan L; Sakaguchi, Farrant; Sheng, Xiaoming; Murtaugh, Maureen A

    2015-01-01

    Background Chronic diseases affect 52% of Americans and consume 86% of health care costs. A small portion of patients consume most health care resources and costs. More intensive patient management strategies, such as case management, are usually more effective at improving health outcomes, but are also more expensive. To use limited resources efficiently, risk stratification is commonly used in managing patients with chronic diseases, such as asthma, chronic obstructive pulmonary disease, diabetes, and heart disease. Patients are stratified based on predicted risk with patients at higher risk given more intensive care. The current risk-stratified patient management approach has 3 limitations resulting in many patients not receiving the most appropriate care, unnecessarily increased costs, and suboptimal health outcomes. First, using predictive models for health outcomes and costs is currently the best method for forecasting individual patient’s risk. Yet, accuracy of predictive models remains poor causing many patients to be misstratified. If an existing model were used to identify candidate patients for case management, enrollment would miss more than half of those who would benefit most, but include others unlikely to benefit, wasting limited resources. Existing models have been developed under the assumption that patient characteristics primarily influence outcomes and costs, leaving physician characteristics out of the models. In reality, both characteristics have an impact. Second, existing models usually give neither an explanation why a particular patient is predicted to be at high risk nor suggestions on interventions tailored to the patient’s specific case. As a result, many high-risk patients miss some suitable interventions. Third, thresholds for risk strata are suboptimal and determined heuristically with no quality guarantee. Objective The purpose of this study is to improve risk-stratified patient management so that more patients will receive the most appropriate care. Methods This study will (1) combine patient, physician profile, and environmental variable features to improve prediction accuracy of individual patient health outcomes and costs; (2) develop the first algorithm to explain prediction results and suggest tailored interventions; (3) develop the first algorithm to compute optimal thresholds for risk strata; and (4) conduct simulations to estimate outcomes of risk-stratified patient management for various configurations. The proposed techniques will be demonstrated on a test case of asthma patients. Results We are currently in the process of extracting clinical and administrative data from an integrated health care system’s enterprise data warehouse. We plan to complete this study in approximately 5 years. Conclusions Methods developed in this study will help transform risk-stratified patient management for better clinical outcomes, higher patient satisfaction and quality of life, reduced health care use, and lower costs. PMID:26503357
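
    The enrollment problem described in the Background section can be illustrated with a toy simulation: rank patients by an imperfect predicted risk, enroll the top few percent in case management, and measure how many of the truly highest-cost patients are captured. The data and model below are randomly generated stand-ins, not the study's methods or data.

    ```python
    # Toy illustration of risk-stratified enrollment with an imperfect
    # predictive model (all data randomly generated for illustration).
    import random

    random.seed(7)

    N, ENROLL_FRACTION = 1000, 0.05
    # (patient_id, predicted_risk, actual_next_year_cost); the model is
    # deliberately only weakly correlated with true severity.
    patients = []
    for pid in range(N):
        true_severity = random.random()
        predicted = 0.5 * true_severity + 0.5 * random.random()   # imperfect model
        cost = 1000 + 50_000 * true_severity ** 3
        patients.append((pid, predicted, cost))

    k = int(N * ENROLL_FRACTION)
    enrolled = {p[0] for p in sorted(patients, key=lambda p: -p[1])[:k]}
    truly_high_cost = {p[0] for p in sorted(patients, key=lambda p: -p[2])[:k]}

    captured = len(enrolled & truly_high_cost) / k
    print(f"share of top-cost patients captured by the model: {captured:.0%}")
    ```

    With a weak model the captured share falls well below 100%, mirroring the abstract's point that enrollment based on existing models misses many patients who would benefit most.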

  16. Information Management Functional Economic Analysis for Finance Workstations to the Defense Information Technology Services Organization

    DTIC Science & Technology

    1993-03-01

    values themselves. The tools perform risk-adjusted present-value comparisons and compute the ROI using discount factors. The assessment of risk in a...developed X Window system, the de facto industry standard window system in the UNIX environment. An X-terminal's use is limited to display. It has no...2.1 IT HARDWARE The DOS-based PC used in this analysis costs $2,060. It includes an ASL 486DX-33 Industry Standard Architecture (ISA) computer with 8

  17. The Department of Defense and the Power of Cloud Computing: Weighing Acceptable Cost Versus Acceptable Risk

    DTIC Science & Technology

    2016-04-01

    the DOD will put DOD systems and data at a risk level comparable to that of their neighbors in the cloud. Just as a user browses a Web page on the...proxy servers for controlling user access to Web pages, and large-scale storage for data management. Each of these devices allows access to the...user to develop applications. Acunetics.com describes Web applications as “computer programs allowing Website visitors to submit and retrieve data

  18. Noninvasive Computed Tomography–based Risk Stratification of Lung Adenocarcinomas in the National Lung Screening Trial

    PubMed Central

    Maldonado, Fabien; Duan, Fenghai; Raghunath, Sushravya M.; Rajagopalan, Srinivasan; Karwoski, Ronald A.; Garg, Kavita; Greco, Erin; Nath, Hrudaya; Robb, Richard A.; Bartholmai, Brian J.

    2015-01-01

    Rationale: Screening for lung cancer using low-dose computed tomography (CT) reduces lung cancer mortality. However, in addition to a high rate of benign nodules, lung cancer screening detects a large number of indolent cancers that generally belong to the adenocarcinoma spectrum. Individualized management of screen-detected adenocarcinomas would be facilitated by noninvasive risk stratification. Objectives: To validate that Computer-Aided Nodule Assessment and Risk Yield (CANARY), a novel image analysis software, successfully risk stratifies screen-detected lung adenocarcinomas based on clinical disease outcomes. Methods: We retrospectively identified 294 eligible patients diagnosed with lung adenocarcinoma spectrum lesions in the low-dose CT arm of the National Lung Screening Trial. The last low-dose CT scan before the diagnosis of lung adenocarcinoma was analyzed using CANARY blinded to clinical data. Based on their parametric CANARY signatures, all the lung adenocarcinoma nodules were risk stratified into three groups. CANARY risk groups were compared using survival analysis for progression-free survival. Measurements and Main Results: A total of 294 patients were included in the analysis. Kaplan-Meier analysis of all the 294 adenocarcinoma nodules stratified into the Good, Intermediate, and Poor CANARY risk groups yielded distinct progression-free survival curves (P < 0.0001). This observation was confirmed in the unadjusted and adjusted (age, sex, race, and smoking status) progression-free survival analysis of all stage I cases. Conclusions: CANARY allows the noninvasive risk stratification of lung adenocarcinomas into three groups with distinct post-treatment progression-free survival. Our results suggest that CANARY could ultimately facilitate individualized management of incidentally or screen-detected lung adenocarcinomas. PMID:26052977

  19. Noninvasive Computed Tomography-based Risk Stratification of Lung Adenocarcinomas in the National Lung Screening Trial.

    PubMed

    Maldonado, Fabien; Duan, Fenghai; Raghunath, Sushravya M; Rajagopalan, Srinivasan; Karwoski, Ronald A; Garg, Kavita; Greco, Erin; Nath, Hrudaya; Robb, Richard A; Bartholmai, Brian J; Peikert, Tobias

    2015-09-15

    Screening for lung cancer using low-dose computed tomography (CT) reduces lung cancer mortality. However, in addition to a high rate of benign nodules, lung cancer screening detects a large number of indolent cancers that generally belong to the adenocarcinoma spectrum. Individualized management of screen-detected adenocarcinomas would be facilitated by noninvasive risk stratification. To validate that Computer-Aided Nodule Assessment and Risk Yield (CANARY), a novel image analysis software, successfully risk stratifies screen-detected lung adenocarcinomas based on clinical disease outcomes. We retrospectively identified 294 eligible patients diagnosed with lung adenocarcinoma spectrum lesions in the low-dose CT arm of the National Lung Screening Trial. The last low-dose CT scan before the diagnosis of lung adenocarcinoma was analyzed using CANARY blinded to clinical data. Based on their parametric CANARY signatures, all the lung adenocarcinoma nodules were risk stratified into three groups. CANARY risk groups were compared using survival analysis for progression-free survival. A total of 294 patients were included in the analysis. Kaplan-Meier analysis of all the 294 adenocarcinoma nodules stratified into the Good, Intermediate, and Poor CANARY risk groups yielded distinct progression-free survival curves (P < 0.0001). This observation was confirmed in the unadjusted and adjusted (age, sex, race, and smoking status) progression-free survival analysis of all stage I cases. CANARY allows the noninvasive risk stratification of lung adenocarcinomas into three groups with distinct post-treatment progression-free survival. Our results suggest that CANARY could ultimately facilitate individualized management of incidentally or screen-detected lung adenocarcinomas.

  20. Unclassified Computing Capability: User Responses to a Multiprogrammatic and Institutional Computing Questionnaire

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    McCoy, M; Kissel, L

    2002-01-29

    We are experimenting with a new computing model to be applied to a new computer dedicated to that model. Several LLNL science teams now have computational requirements, evidenced by the mature scientific applications that have been developed over the past five plus years, that far exceed the capability of the institution's computing resources. Thus, there is increased demand for dedicated, powerful parallel computational systems. Computation can, in the coming year, potentially field a capability system that is low cost because it will be based on a model that employs open source software and because it will use PC (IA32-P4) hardware. This incurs significant computer science risk regarding stability and system features but also presents great opportunity. We believe the risks can be managed, but the existence of risk cannot be ignored. In order to justify the budget for this system, we need to make the case that it serves science and, through serving science, serves the institution. That is the point of the meeting and the White Paper that we are proposing to prepare. The questions are listed and the responses received are in this report.

  1. Computers in medicine. Virtual rehabilitation: dream or reality?

    PubMed

    Berra, Kathy

    2006-08-01

    Coronary heart disease is the number one cause of death for men and women in the United States and internationally. Identification of persons at risk for cardiovascular disease and reduction of cardiovascular risk factors are key to managing this tremendous societal burden. The Internet holds great promise in helping to identify and manage persons at high risk of a cardiac or vascular event. The Internet has the capability to assess risk and provide support for cardiovascular risk reduction for large numbers of persons in a cost-effective and time-efficient manner. The purpose of this report is to describe important advances in the use of the Internet in identifying persons at risk for a cardiovascular event and in the Internet's ability to provide interventions designed to reduce this risk.

  2. Security Risks of Cloud Computing and Its Emergence as 5th Utility Service

    NASA Astrophysics Data System (ADS)

    Ahmad, Mushtaq

    Cloud computing is being promoted by the major cloud service provider IT companies, such as IBM, Google, Yahoo, Amazon and others, as a fifth utility, through which clients gain access to processing for applications and software projects that need very high processing speed for compute-intensive work and huge data capacity, spanning scientific and engineering research problems as well as e-business and data content network applications. These services are provided to different types of clients under DASM (Direct Access Service Management), based on virtualization of hardware and software and very high-bandwidth Internet (Web 2.0) communication. The paper reviews these developments in cloud computing and the hardware/software configuration of the cloud paradigm. It also examines the vital aspects of the security risks raised by IT industry experts and cloud clients, and highlights cloud providers' responses to these risks.

  3. Sonography in Fetal Birth Weight Estimation

    ERIC Educational Resources Information Center

    Akinola, R. A.; Akinola, O. I.; Oyekan, O. O.

    2009-01-01

    The estimation of fetal birth weight is an important factor in the management of high risk pregnancies. The information and knowledge gained through this study, comparing a combination of various fetal parameters using computer assisted analysis, will help the obstetrician to screen the high risk pregnancies, monitor the growth and development,…

  4. Computer network security for the radiology enterprise.

    PubMed

    Eng, J

    2001-08-01

    As computer networks become an integral part of the radiology practice, it is appropriate to raise concerns regarding their security. The purpose of this article is to present an overview of computer network security risks and preventive strategies as they pertain to the radiology enterprise. A number of technologies are available that provide strong deterrence against attacks on networks and networked computer systems in the radiology enterprise. While effective, these technologies must be supplemented with vigilant user and system management.

  5. Computer-Assisted versus Oral-and-Written History Taking for the Prevention and Management of Cardiovascular Disease: a Systematic Review of the Literature.

    PubMed

    Pappas, Yannis; Všetečková, Jitka; Poduval, Shoba; Tseng, Pei Ching; Car, Josip

    CVD is an important global healthcare issue; it is the leading cause of global mortality, with an increasing incidence identified in both developed and developing countries. It is also an extremely costly disease for healthcare systems unless managed effectively. In this review we aimed to: - Assess the effect of computer-assisted versus oral-and-written history taking on the quality of collected information for the prevention and management of CVD. - Assess the effect of computer-assisted versus oral-and-written history taking on the prevention and management of CVD. A systematic review of randomised controlled trials that included participants of 16 years or older at the beginning of the study, who were at risk of CVD (prevention) or had previously been diagnosed with CVD (management). We searched all major databases. We assessed risk of bias using the Cochrane Collaboration tool. Two studies met the inclusion criteria. One compared the two methods of history taking for the prevention of cardiovascular disease (n = 75). The study shows that, in general, the patients in the experimental group underwent more laboratory procedures, had more biomarker readings recorded and/or were given (or had reviewed) more dietary changes than the control group. The other study compared the two methods of history taking for the management of cardiovascular disease (n = 479) and showed that the computerized decision aid appears to increase the proportion of patients who responded to invitations to discuss CVD prevention with their doctor. The Computer-Assisted History Taking Systems (CAHTS) increased the proportion of patients who discussed CHD risk reduction with their doctor from 24% to 40% and increased the proportion who had a specific plan to reduce their risk from 24% to 37%. With only one study meeting the inclusion criteria for prevention of CVD and one study for management of CVD, we did not gather sufficient evidence to address all of the objectives of the review. We were unable to report on most of the secondary patient outcomes in our protocol. We tentatively conclude that CAHTS can provide individually-tailored information about CVD prevention. However, further primary studies are needed to confirm these findings. We cannot draw any conclusions in relation to any other clinical outcomes at this stage. There is a need to develop an evidence base to support the effective development and use of CAHTS in this area of practice. In the absence of evidence on effectiveness, the implementation of computer-assisted history taking may only rely on the clinicians' tacit knowledge, published monographs and viewpoint articles.

  6. Bioinformatics for Exploration

    NASA Technical Reports Server (NTRS)

    Johnson, Kathy A.

    2006-01-01

    For the purpose of this paper, bioinformatics is defined as the application of computer technology to the management of biological information. It can be thought of as the science of developing computer databases and algorithms to facilitate and expedite biological research. This is a crosscutting capability that supports nearly all human health areas ranging from computational modeling, to pharmacodynamics research projects, to decision support systems within autonomous medical care. Bioinformatics serves to increase the efficiency and effectiveness of the life sciences research program. It provides data, information, and knowledge capture which further supports management of the bioastronautics research roadmap - identifying gaps that still remain and enabling the determination of which risks have been addressed.

  7. Program risk analysis handbook

    NASA Technical Reports Server (NTRS)

    Batson, R. G.

    1987-01-01

    NASA regulations specify that formal risk analysis be performed on a program at each of several milestones. Program risk analysis is discussed as a systems analysis approach, an iterative process (identification, assessment, management), and a collection of techniques. These techniques, which range from extremely simple to complex network-based simulation, are described in this handbook in order to provide both analyst and manager with a guide for selection of the most appropriate technique. All program risk assessment techniques are shown to be based on elicitation and encoding of subjective probability estimates from the various area experts on a program. Techniques to encode the five most common distribution types are given. Then a total of twelve distinct approaches to risk assessment are described; the steps involved, good and bad points, time involved, and degree of computer support needed are listed for each. Why risk analysis should be used by all NASA program managers is discussed. Tools available at NASA-MSFC are identified, along with commercially available software. A bibliography (150 entries) and a program risk analysis checklist are provided.
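
    As an illustration of the elicitation-and-encoding idea described in this record (a sketch only, not taken from the handbook), expert three-point estimates can be encoded as triangular distributions and propagated through a simple cost roll-up by Monte Carlo simulation. The work packages and numbers below are hypothetical.

    # Hypothetical sketch: encode expert (low, most likely, high) estimates as
    # triangular distributions and Monte Carlo the total program cost.
    import numpy as np

    rng = np.random.default_rng(42)
    work_packages = {                 # (low, mode, high) cost estimates, $M
        "propulsion": (10, 14, 22),
        "avionics":   (5, 7, 12),
        "structures": (8, 9, 15),
    }

    n = 100_000
    total = np.zeros(n)
    for low, mode, high in work_packages.values():
        total += rng.triangular(low, mode, high, size=n)

    print(f"Mean total cost: {total.mean():.1f} $M")
    print(f"80th-percentile cost (risk reserve target): {np.percentile(total, 80):.1f} $M")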

  8. The FITS model office ergonomics program: a model for best practice.

    PubMed

    Chim, Justine M Y

    2014-01-01

    An effective office ergonomics program can produce positive results in reducing musculoskeletal injury rates, enhancing productivity, and improving staff well-being and job satisfaction. Its objective is to provide a systematic solution to manage the potential risk of musculoskeletal disorders among computer users in an office setting. The FITS Model Office Ergonomics Program has been developed, drawing on the legislative requirements for promoting the health and safety of workers using computers for extended periods as well as on previous research findings. The Model is developed according to practical industrial knowledge in ergonomics, occupational health and safety management, and human resources management in Hong Kong and overseas. This paper proposes a comprehensive office ergonomics program, the FITS Model, which considers (1) Furniture Evaluation and Selection; (2) Individual Workstation Assessment; (3) Training and Education; and (4) Stretching Exercises and Rest Breaks as elements of an effective program. An experienced ergonomics practitioner should be included in the program design and implementation. Through the FITS Model Office Ergonomics Program, the risk of musculoskeletal disorders among computer users can be eliminated or minimized, and workplace health and safety and employees' wellness enhanced.

  9. Total Probability of Collision as a Metric for Finite Conjunction Assessment and Collision Risk Management

    NASA Astrophysics Data System (ADS)

    Frigm, R.; Johnson, L.

    The Probability of Collision (Pc) has become a universal metric and statement of on-orbit collision risk. Although several flavors of the computation exist and are well documented in the literature, the basic calculation requires the same input: estimates for the position, position uncertainty, and sizes of the two objects involved. The Pc is used operationally to make decisions on whether a given conjunction poses significant collision risk to the primary object (or space asset of concern). It is also used to determine the necessity and degree of mitigative action (typically in the form of an orbital maneuver) to be performed. The predicted post-maneuver Pc also informs the maneuver planning process regarding the timing, direction, and magnitude of the maneuver needed to mitigate the collision risk. Although the data sources, techniques, decision calculus, and workflows vary for different agencies and organizations, they all have a common thread. The standard conjunction assessment and collision risk concept of operations (CONOPS) predicts conjunctions, assesses the collision risk (typically via the Pc), and plans and executes avoidance activities for conjunctions as discrete events. As the space debris environment continues to grow and improvements are made to remote sensing capabilities and sensitivities to detect, track, and predict smaller debris objects, the number of conjunctions will in turn continue to increase. The expected order-of-magnitude increase in the number of predicted conjunctions will challenge the paradigm of treating each conjunction as a discrete event. The challenge will not be limited to workload issues, such as manpower and computing performance, but will also extend to the ability of satellite owner/operators to successfully execute their mission while also managing on-orbit collision risk. Executing a propulsive maneuver occasionally can easily be absorbed into the mission planning and operations tempo; whereas continuously planning evasive maneuvers for multiple conjunction events is time-consuming and would disrupt mission and science operations beyond what is tolerable. At the point when the number of conjunctions is so large that it is no longer possible to consider each individually, some amalgamation of events and risk must be considered. This shift is to one where each conjunction cannot be treated individually and the effects of all conjunctions within a given period of time must be considered together. This new paradigm is called finite Conjunction Assessment (CA) risk management. This paper considers the use of the Total Probability of Collision (TPc) as an analogous collision risk metric in the finite CA paradigm. The TPc, expressed as TPc = 1 - ∏(1 - Pc,i) over all predicted conjunctions under consideration, provides the aggregate probability of colliding with any one of them. While the TPc computation is straightforward and its physical meaning is understandable, the implications of its operational usage require a change in mindset and approach to collision risk management. This paper explores the necessary changes to evolve the basic CA and collision risk management CONOPS from discrete to finite CA, including aspects of collision risk assessment and collision risk mitigation. It proposes numerical and graphical decision aids to understand both the “risk outlook” for a given primary and mitigation options for the total collision risk. Both concepts make use of the TPc as a metric for finite collision risk management.
Several operational scenarios are used to demonstrate the proposed concepts in practice.
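
    A minimal sketch (in Python) of the aggregation defined by the equation above, assuming the individual conjunction events are independent; the per-conjunction probabilities listed are hypothetical.

    # Hypothetical sketch: Total Probability of Collision over a set of
    # predicted conjunctions, TPc = 1 - prod_i(1 - Pc_i), assuming independence.
    from math import prod

    def total_pc(pc_values):
        return 1.0 - prod(1.0 - pc for pc in pc_values)

    pcs = [1e-4, 5e-5, 2e-4, 1e-5, 8e-5]   # made-up per-conjunction Pc values
    print(f"Total Pc over the look-ahead window: {total_pc(pcs):.2e}")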

  10. Technical Performance Measurement, Earned Value, and Risk Management: An Integrated Diagnostic Tool for Program Management

    DTIC Science & Technology

    2002-06-01

    time, the monkey would eventually produce the collected works of Shakespeare. Unfortunately for the analogist, systems, even live ones, do not work...limited his simulated computer monkey to producing, in a single random step, the sentence uttered by Polonius in the play Hamlet: “Methinks it is

  11. An optimization-based approach for facility energy management with uncertainties, and, Power portfolio optimization in deregulated electricity markets with risk management

    NASA Astrophysics Data System (ADS)

    Xu, Jun

    Topic 1. An Optimization-Based Approach for Facility Energy Management with Uncertainties. Effective energy management for facilities is becoming increasingly important in view of the rising energy costs, the government mandate on the reduction of energy consumption, and the human comfort requirements. This part of the dissertation presents a daily energy management formulation and the corresponding solution methodology for HVAC systems. The problem is to minimize the energy and demand costs through the control of HVAC units while satisfying human comfort, system dynamics, load limit constraints, and other requirements. The problem is difficult in view of the fact that the system is nonlinear, time-varying, building-dependent, and uncertain, and that directly controlling a large number of HVAC components is challenging. In this work, HVAC setpoints are the control variables developed on top of a Direct Digital Control (DDC) system. A method that combines Lagrangian relaxation, neural networks, stochastic dynamic programming, and heuristics is developed to predict the system dynamics and uncontrollable load, and to optimize the setpoints. Numerical testing and prototype implementation results show that our method can effectively reduce total costs, manage uncertainties, and shed load, and that it is computationally efficient. Furthermore, it is significantly better than existing methods. Topic 2. Power Portfolio Optimization in Deregulated Electricity Markets with Risk Management. In a deregulated electric power system, multiple markets of different time scales exist with various power supply instruments. A load serving entity (LSE) has multiple choices from these instruments to meet its load obligations. In view of the large amount of power involved, the complex market structure, risks in such volatile markets, stringent constraints to be satisfied, and the long time horizon, a power portfolio optimization problem is of critical importance, but also difficult, for an LSE seeking to serve the load, maximize its profit, and manage risks. In this topic, a mid-term power portfolio optimization problem with risk management is presented. Key instruments are considered, risk terms based on semi-variances of spot market transactions are introduced, and penalties on load obligation violations are added to the objective function to improve algorithm convergence and constraint satisfaction. To overcome the inseparability of the resulting problem, a surrogate optimization framework is developed enabling a decomposition and coordination approach. Numerical testing results show that our method effectively provides decisions for various instruments to maximize profit and manage risks, and is computationally efficient.

  12. Anesthesia patient risk: a quantitative approach to organizational factors and risk management options.

    PubMed

    Paté-Cornell, M E; Lakats, L M; Murphy, D M; Gaba, D M

    1997-08-01

    The risk of death or brain damage to anesthesia patients is relatively low, particularly for healthy patients in modern hospitals. When an accident does occur, its cause is usually an error made by the anesthesiologist, either in triggering the accident sequence, or failing to take timely corrective measures. This paper presents a pilot study which explores the feasibility of extending probabilistic risk analysis (PRA) of anesthesia accidents to assess the effects of human and management components on the patient risk. We develop first a classic PRA model for the patient risk per operation. We then link the probabilities of the different accident types to their root causes using a probabilistic analysis of the performance shaping factors. These factors are described here as the "state of the anesthesiologist" characterized both in terms of alertness and competence. We then analyze the effects of different management factors that affect the state of the anesthesiologist and we compute the risk reduction benefits of several risk management policies. Our data sources include the published version of the Australian Incident Monitoring Study as well as expert opinions. We conclude that patient risk could be reduced substantially by closer supervision of residents, the use of anesthesia simulators both in training and for periodic recertification, and regular medical examinations for all anesthesiologists.
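
    A toy numerical sketch (in Python, not the authors' model) of the kind of roll-up such a PRA performs: per-operation risk is the accident probability, conditioned on the anesthesiologist's state, multiplied by the probability of severe harm given an accident, and a management policy is evaluated by how it shifts the state distribution. All numbers below are hypothetical.

    # Hypothetical sketch of a PRA-style per-operation risk roll-up.
    state_probs = {"nominal": 0.90, "degraded": 0.10}            # anesthesiologist state
    p_accident_given_state = {"nominal": 1e-4, "degraded": 8e-4}  # made-up conditional rates
    p_harm_given_accident = 0.3                                   # severe outcome given accident

    def per_operation_risk(states):
        return sum(p_s * p_accident_given_state[s] * p_harm_given_accident
                   for s, p_s in states.items())

    baseline = per_operation_risk(state_probs)
    # A policy (e.g., closer supervision) is assumed to shift the state distribution.
    with_policy = per_operation_risk({"nominal": 0.96, "degraded": 0.04})
    print(f"Baseline risk: {baseline:.2e}; risk reduction benefit: {baseline - with_policy:.2e}")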

  13. Computer-based interventions to improve self-management in adults with type 2 diabetes: a systematic review and meta-analysis.

    PubMed

    Pal, Kingshuk; Eastwood, Sophie V; Michie, Susan; Farmer, Andrew; Barnard, Maria L; Peacock, Richard; Wood, Bindie; Edwards, Phil; Murray, Elizabeth

    2014-06-01

    Structured patient education programs can reduce the risk of diabetes-related complications. However, people appear to have difficulties attending face-to-face education and alternatives are needed. This review looked at the impact of computer-based diabetes self-management interventions on health status, cardiovascular risk factors, and quality of life of adults with type 2 diabetes. We searched The Cochrane Library, Medline, Embase, PsycINFO, Web of Science, and CINAHL for relevant trials from inception to November 2011. Reference lists from relevant published studies were screened and authors contacted for further information when required. Two authors independently extracted relevant data using standard data extraction templates. Sixteen randomized controlled trials with 3,578 participants met the inclusion criteria. Interventions were delivered via clinics, the Internet, and mobile phones. Computer-based diabetes self-management interventions appear to have small benefits on glycemic control: the pooled effect on HbA1c was -0.2% (-2.3 mmol/mol [95% CI -0.4 to -0.1%]). A subgroup analysis on mobile phone-based interventions showed a larger effect: the pooled effect on HbA1c from three studies was -0.50% (-5.46 mmol/mol [95% CI -0.7 to -0.3%]). There was no evidence of improvement in depression, quality of life, blood pressure, serum lipids, or weight. There was no evidence of significant adverse effects. Computer-based diabetes self-management interventions to manage type 2 diabetes appear to have a small beneficial effect on blood glucose control, and this effect was larger in the mobile phone subgroup. There was no evidence of benefit for other biological, cognitive, behavioral, or emotional outcomes. © 2014 by the American Diabetes Association.
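
    For readers unfamiliar with how such pooled effects are formed, the sketch below shows generic inverse-variance (fixed-effect) pooling of per-study mean differences. It is not the review's actual computation (Cochrane reviews typically use dedicated software and may apply random-effects models), and the study effects and standard errors are invented.

    # Hypothetical sketch: inverse-variance fixed-effect pooling of mean differences.
    import numpy as np

    effects = np.array([-0.30, -0.10, -0.25, -0.15])   # per-study HbA1c differences (%)
    std_err = np.array([0.12, 0.10, 0.15, 0.08])        # per-study standard errors

    weights = 1.0 / std_err**2
    pooled = np.sum(weights * effects) / np.sum(weights)
    pooled_se = np.sqrt(1.0 / np.sum(weights))
    print(f"Pooled effect: {pooled:.2f}% "
          f"(95% CI {pooled - 1.96 * pooled_se:.2f} to {pooled + 1.96 * pooled_se:.2f})")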

  14. Primary care physicians' perspectives on computer-based health risk assessment tools for chronic diseases: a mixed methods study.

    PubMed

    Voruganti, Teja R; O'Brien, Mary Ann; Straus, Sharon E; McLaughlin, John R; Grunfeld, Eva

    2015-09-24

    Health risk assessment tools compute an individual's risk of developing a disease. Routine use of such tools by primary care physicians (PCPs) is potentially useful in chronic disease prevention. We sought physicians' awareness and perceptions of the usefulness, usability and feasibility of performing assessments with computer-based risk assessment tools in primary care settings. Focus groups and usability testing with a computer-based risk assessment tool were conducted with PCPs from both university-affiliated and community-based practices. Analysis was derived from grounded theory methodology. PCPs (n = 30) were aware of several risk assessment tools although only select tools were used routinely. The decision to use a tool depended on how use impacted practice workflow and whether the tool had credibility. Participants felt that embedding tools in the electronic medical records (EMRs) system might allow for health information from the medical record to auto-populate into the tool. User comprehension of risk could also be improved with computer-based interfaces that present risk in different formats. In this study, PCPs chose to use certain tools more regularly because of usability and credibility. Despite there being differences in the particular tools a clinical practice used, there was general appreciation for the usefulness of tools for different clinical situations. Participants characterised particular features of an ideal tool, feeling strongly that embedding risk assessment tools in the EMR would maximise accessibility and use of the tool for chronic disease management. However, appropriate practice workflow integration and features that facilitate patient understanding at point-of-care are also essential.

  15. The myth of secure computing.

    PubMed

    Austin, Robert D; Darby, Christopher A

    2003-06-01

    Few senior executives pay a whole lot of attention to computer security. They either hand off responsibility to their technical people or bring in consultants. But given the stakes involved, an arm's-length approach is extremely unwise. According to industry estimates, security breaches affect 90% of all businesses every year and cost some $17 billion. Fortunately, the authors say, senior executives don't need to learn about the more arcane aspects of their company's IT systems in order to take a hands-on approach. Instead, they should focus on the familiar task of managing risk. Their role should be to assess the business value of their information assets, determine the likelihood that those assets will be compromised, and then tailor a set of risk abatement processes to their company's particular vulnerabilities. This approach, which views computer security as an operational rather than a technical challenge, is akin to a classic quality assurance program in that it attempts to avoid problems rather than fix them and involves all employees, not just IT staffers. The goal is not to make computer systems completely secure--that's impossible--but to reduce the business risk to an acceptable level. This article looks at the types of threats a company is apt to face. It also examines the processes a general manager should spearhead to lessen the likelihood of a successful attack. The authors recommend eight processes in all, ranging from deciding how much protection each digital asset deserves to insisting on secure software to rehearsing a response to a security breach. The important thing to realize, they emphasize, is that decisions about digital security are not much different from other cost-benefit decisions. The tools general managers bring to bear on other areas of the business are good models for what they need to do in this technical space.

  16. A job safety program for construction workers designed to reduce the potential for occupational injury using tool box training sessions and computer-assisted biofeedback stress management techniques.

    PubMed

    Johnson, Kenneth A; Ruppe, Joan

    2002-01-01

    This project was conducted with a multicultural construction company in Hawaii, USA. The job duties performed included drywall and carpentry work. The following objectives were selected for this project: (a) fire prevention training and inspection of first aid equipment; (b) blood-borne pathogen training and risk evaluation; (c) ergonomic and risk evaluation intervention program; (d) electrical safety training and inspection program; (e) slips, trips, and falls safety training; (f) stress assessment and Personal Profile System; (g) safety and health program survey; (h) improving employee relations and morale by emphasizing spirituality; and (i) computer-assisted biofeedback stress management training. Results of the project indicated that observed safety hazards, reported injuries, and levels of perceived stress were reduced for the majority of the population.

  17. Cyberbiosecurity: From Naive Trust to Risk Awareness.

    PubMed

    Peccoud, Jean; Gallegos, Jenna E; Murch, Randall; Buchholz, Wallace G; Raman, Sanjay

    2018-01-01

    The cyber-physical nature of biotechnology raises unprecedented security concerns. Computers can be compromised by encoding malware in DNA sequences, and biological threats can be synthesized using publicly available data. Trust within the biotechnology community creates vulnerabilities at the interface between cyberspace and biology. Awareness is a prerequisite to managing these risks. Copyright © 2017 Elsevier Ltd. All rights reserved.

  18. Dynamic Information Management and Exchange for Command and Control Applications: A Framework in Support of Emergency Management for Specified and Unspecified Emergencies

    DTIC Science & Technology

    2014-03-01

    64 selections, 128 aggregations and 510 join operators. [Figure residue omitted; vertical axis label: Time.] ...DC, USA, 2001, IEEE Computer Society, pp. 391–398. [66] European Network and Information Security Agency (ENISA), Inventory of risk management/risk assessment methods, Sept...

  19. The Careful Puppet Master: Reducing risk and fortifying acceptance testing with Jenkins CI

    NASA Astrophysics Data System (ADS)

    Smith, Jason A.; Richman, Gabriel; DeStefano, John; Pryor, James; Rao, Tejas; Strecker-Kellogg, William; Wong, Tony

    2015-12-01

    Centralized configuration management, including the use of automation tools such as Puppet, can greatly increase provisioning speed and efficiency when configuring new systems or making changes to existing systems, reduce duplication of work, and improve automated processes. However, centralized management also brings with it a level of inherent risk: a single change in just one file can quickly be pushed out to thousands of computers and, if that change is not properly and thoroughly tested and contains an error, could result in catastrophic damage to many services, potentially bringing an entire computer facility offline. Change management procedures can—and should—be formalized in order to prevent such accidents. However, like the configuration management process itself, if such procedures are not automated, they can be difficult to enforce strictly. Therefore, to reduce the risk of merging potentially harmful changes into our production Puppet environment, we have created an automated testing system, which includes the Jenkins CI tool, to manage our Puppet testing process. This system includes the proposed changes and runs Puppet on a pool of dozens of RedHat Enterprise Virtualization (RHEV) virtual machines (VMs) that replicate most of our important production services for the purpose of testing. This paper describes our automated test system and how it hooks into our production approval process for automatic acceptance testing. All pending changes that have been pushed to production must pass this validation process before they can be approved and merged into production.

  20. Development of a statewide landslide inventory program.

    DOT National Transportation Integrated Search

    2003-02-01

    Development of a comprehensive geotechnical database for risk management of highway landslide problems is described. Computer software selected to program the client/server application in a data window, components and structure of the geotechnical da...

  1. Building a Practical Framework for Enterprise-Wide Security Management

    DTIC Science & Technology

    2004-04-28

    management. They have found that current efforts to manage security vulnerabilities and security risks only take an enterprise so far, with results...analyzed reports to determine the cause of the increase. Attack...Nearly 1 in 5 of those surveyed reported that none of their IT staff have any formal security training. [A survey of 896 Computing Technology

  2. A toolbox to visualise benefits resulting from flood hazard mitigation

    NASA Astrophysics Data System (ADS)

    Fuchs, Sven; Thaler, Thomas; Heiser, Micha

    2017-04-01

    In order to visualize the benefits resulting from technical mitigation, a toolbox was developed within an open-source environment that allows for an assessment of gains and losses for buildings exposed to flood hazards. Starting with different scenarios showing the changes in flood magnitude with respect to the considered management options, the computation was based on the amount and value of buildings exposed as well as their vulnerability, following the general concept of risk assessment. As a result, beneficiaries of risk reduction may be identified and, more generally, different mitigation options may be strategically evaluated with respect to the degree of risk reduction for different elements exposed. As such, multiple management options can be ranked according to their costs and benefits, and in order of priority. A relational database composed of different modules was created in order to mirror the requirements of an open-source application and to allow for future dynamics in data availability as well as the spatiotemporal dynamics of these data (Fuchs et al. 2013). An economic module was used to compute the monetary value of buildings exposed using (a) the building footprint, (b) the information of the building cadaster such as building type, number of storeys and utilisation, and (c) regionally averaged construction costs. An exposition module was applied to connect the spatial GIS information (X and Y coordinates) of elements at risk to the hazard information in order to obtain information on exposure. An impact module linked this information to vulnerability functions (Totschnig and Fuchs 2013; Papathoma-Köhle et al. 2015) in order to obtain the monetary level of risk for every building exposed. These values were finally computed before and after the implementation of a mitigation measure in order to show gains and losses, and visualised. The results can be exported as spreadsheets for further computation. References Fuchs S, Keiler M, Sokratov SA, Shnyparkov A (2013) Spatiotemporal dynamics: the need for an innovative approach in mountain hazard risk management. Natural Hazards 68 (3):1217-1241 Papathoma-Köhle M, Zischg A, Fuchs S, Glade T, Keiler M (2015) Loss estimation for landslides in mountain areas - An integrated toolbox for vulnerability assessment and damage documentation. Environmental Modelling and Software 63:156-169 Totschnig R, Fuchs S (2013) Mountain torrents: quantifying vulnerability and assessing uncertainties. Engineering Geology 155:31-44
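
    A minimal sketch (in Python, not the authors' toolbox) of the economic/impact computation described above: expected loss per building is its reconstruction value multiplied by a degree of loss from a vulnerability function, and the benefit of a mitigation option is the risk before minus the risk after. The vulnerability curve, unit costs, and building data are hypothetical.

    # Hypothetical sketch: building-level flood loss and mitigation benefit.
    def building_value(footprint_m2, storeys, unit_cost_per_m2=1500.0):
        return footprint_m2 * storeys * unit_cost_per_m2

    def vulnerability(flood_depth_m):
        # Made-up degree-of-loss curve in [0, 1] as a function of inundation depth.
        return min(1.0, 0.25 * max(flood_depth_m, 0.0))

    def expected_loss(buildings, depths):
        return sum(building_value(b["footprint"], b["storeys"]) * vulnerability(d)
                   for b, d in zip(buildings, depths))

    buildings = [{"footprint": 120, "storeys": 2}, {"footprint": 200, "storeys": 1}]
    loss_before = expected_loss(buildings, depths=[1.2, 0.8])
    loss_after = expected_loss(buildings, depths=[0.4, 0.0])   # with mitigation in place
    print(f"Benefit of mitigation: {loss_before - loss_after:,.0f} monetary units")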

  3. Italian Chapter of the International Society of Cardiovascular Ultrasound expert consensus document on coronary computed tomography angiography: overview and new insights.

    PubMed

    Sozzi, Fabiola B; Maiello, Maria; Pelliccia, Francesco; Parato, Vito Maurizio; Canetta, Ciro; Savino, Ketty; Lombardi, Federico; Palmiero, Pasquale

    2016-09-01

    Coronary computed tomography angiography is a noninvasive heart imaging test currently undergoing rapid development and advancement. High-resolution three-dimensional images of the moving heart and great vessels are acquired during coronary computed tomography to identify coronary artery disease and classify patient risk for atherosclerotic cardiovascular disease. The technique provides useful information about the coronary tree and atherosclerotic plaques beyond simple luminal narrowing and plaque type defined by calcium content. This application will improve image-guided prevention, medical therapy, and coronary interventions. The ability to interpret coronary computed tomography images is of utmost importance as we develop personalized medical care to enable therapeutic interventions stratified on the basis of plaque characteristics. This overview provides available data and experts' recommendations on the utilization of coronary computed tomography findings. We focus on the use of coronary computed tomography to detect coronary artery disease and stratify patients at risk, illustrating the implications of this test for patient management. We describe its diagnostic power in identifying patients at higher risk of developing acute coronary syndrome and its prognostic significance. Finally, we highlight the features of the vulnerable plaques imaged by coronary computed tomography angiography. © 2016, Wiley Periodicals, Inc.

  4. Occupation and thyroid cancer risk in Sweden.

    PubMed

    Lope, Virginia; Pollán, Marina; Gustavsson, Per; Plato, Nils; Pérez-Gómez, Beatriz; Aragonés, Nuria; Suárez, Berta; Carrasco, José Miguel; Rodríguez, Silvia; Ramis, Rebeca; Boldo, Elena; López-Abente, Gonzalo

    2005-09-01

    The objective of this study was to identify occupations and industries with increased incidence of thyroid cancer in Swedish workers. Standardized incidence ratios were computed for each job and industry for the period 1971-1989 through record linkage with the Swedish National Cancer and Death Registers. Age-, period-, and geographically adjusted relative risks were calculated using Poisson models. Increased risks were found for teachers, construction carpenters, policemen, and prison/reformatory officials among men, and medical technicians, shop managers, tailors, and shoecutters among women. Industries with excess risk were manufacture of agricultural machinery, manufacture of computing/accessories, and public administration/police among men; and manufacture of prefabricated wooden buildings, electric installation work, and wholesale of live animals/fertilizers/oilseed/grain among women. Our results corroborate some previously reported increased risks. Further research is needed to assess the influence of specific chemical agents related to some of the highlighted work environments.

  5. NSI security task: Overview

    NASA Technical Reports Server (NTRS)

    Tencati, Ron

    1991-01-01

    An overview is presented of the NASA Science Internet (NSI) security task. The task includes the following: policies and security documentation; risk analysis and management; computer emergency response team; incident handling; toolkit development; user consulting; and working groups, conferences, and committees.

  6. 75 FR 32519 - Miracor Diagnostics, Inc., Monaco Finance, Inc., MPEL Holdings Corp. (f/k/a Computer Transceiver...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-06-08

    ... SECURITIES AND EXCHANGE COMMISSION [File No. 500-1] Miracor Diagnostics, Inc., Monaco Finance, Inc., MPEL Holdings Corp. (f/k/a Computer Transceiver Systems, Inc.), MR3 Systems, Inc., Mutual Risk Management, Ltd.; Order of Suspension of Trading June 4, 2010. It appears to the Securities and Exchange Commission that there is a lack of current and...

  7. Water Resources Management and Hydrologic Design Under Uncertain Climate Change Scenarios

    NASA Astrophysics Data System (ADS)

    Teegavarapu, R. S.

    2008-05-01

    The impact of climate change on hydrologic design and management of water resource systems could be one of the important challenges faced by future practicing hydrologists and water resources managers. Many water resources managers currently rely on historical hydrological data and adaptive real-time operations without considering the impact of climate change on the major inputs that influence the behavior of hydrologic systems and the operating rules. Issues such as risk, reliability, and robustness of water resources systems under different climate change scenarios have been addressed in the past. However, water resources management that incorporates the decision maker's preferences regarding climate change has never been dealt with. This presentation discusses issues related to the impacts of climate change on water resources management and the application of a soft-computing approach, fuzzy set theory, to climate-sensitive management of water resources systems. A real-life case study example is presented to illustrate the applicability of the soft-computing approach for handling the decision maker's preferences in accepting or rejecting the magnitude and direction of climate change.

  8. Big data and high-performance analytics in structural health monitoring for bridge management

    NASA Astrophysics Data System (ADS)

    Alampalli, Sharada; Alampalli, Sandeep; Ettouney, Mohammed

    2016-04-01

    Structural Health Monitoring (SHM) can be a vital tool for effective bridge management. Combining large data sets from multiple sources to create a data-driven decision-making framework is crucial for the success of SHM. This paper presents a big data analytics framework that combines multiple data sets correlated with functional relatedness to convert data into actionable information that empowers risk-based decision-making. The integrated data environment incorporates near real-time streams of semi-structured data from remote sensors, historical visual inspection data, and observations from structural analysis models to monitor, assess, and manage risks associated with aging bridge inventories. Accelerated processing of data sets is made possible by four technologies: cloud computing, relational database processing, support from a NoSQL database, and in-memory analytics. The framework is being validated on a railroad corridor that can be subjected to multiple hazards. The framework enables the computation of reliability indices for critical bridge components and individual bridge spans. In addition, the framework includes a risk-based decision-making process that enumerates the costs and consequences of poor bridge performance at span and network levels when rail networks are exposed to natural hazard events such as floods and earthquakes. Big data and high-performance analytics enable insights that assist bridge owners in addressing problems faster.
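
    As a rough illustration of the reliability-index idea mentioned above (a sketch in Python, not the paper's implementation): when component resistance and load effect can be approximated as independent normal variables, the Cornell reliability index is beta = (mu_R - mu_S) / sqrt(sigma_R^2 + sigma_S^2). The values below are hypothetical.

    # Hypothetical sketch: Cornell reliability index for a bridge component,
    # assuming independent, normally distributed resistance R and load effect S.
    from math import sqrt
    from statistics import NormalDist

    mu_R, sigma_R = 1200.0, 120.0   # made-up resistance statistics (kN)
    mu_S, sigma_S = 800.0, 160.0    # made-up load-effect statistics (kN)

    beta = (mu_R - mu_S) / sqrt(sigma_R**2 + sigma_S**2)
    p_failure = NormalDist().cdf(-beta)
    print(f"Reliability index beta = {beta:.2f}, implied P(failure) = {p_failure:.2e}")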

  9. Pneumoperitoneum after virtual colonoscopy: causes, risk factors, and management.

    PubMed

    Baccaro, Leopoldo M; Markelov, Alexey; Wilhelm, Jakub; Bloch, Robert

    2014-06-01

    Computed tomographic virtual colonoscopy (CTVC) is a safe and minimally invasive modality when compared with fiberoptic colonoscopy for evaluating the colon and rectum. We have reviewed the risks for colonic perforation by investigating the relevant literature. The objectives of this study were to assess the risk of colonic perforation during CTVC, describe risk factors, evaluate ways to reduce the incidence complications, and to review management and treatment options. A formal search of indexed publications was performed through PubMed. Search queries using keywords "CT colonography," "CT virtual colonoscopy," "virtual colonoscopy," and "perforation" yielded a total of 133 articles. A total of eight case reports and four review articles were selected. Combining case reports and review articles, a total of 25 cases of colonic perforation after CTVC have been reported. Causes include, but are not limited to, diverticular disease, irritable bowel diseases, obstructive processes, malignancies, and iatrogenic injury. Both operative and nonoperative management has been described. Nonoperative management has been proven safe and successful in minimally symptomatic and stable patients. Colonic perforation after CTVC is a rare complication and very few cases have been reported. Several risk factors are recurrent in the literature and must be acknowledged at the time of the study. Management options vary and should be tailored to each individual patient.

  10. Volunteered Cloud Computing for Disaster Management

    NASA Astrophysics Data System (ADS)

    Evans, J. D.; Hao, W.; Chettri, S. R.

    2014-12-01

    Disaster management relies increasingly on interpreting earth observations and running numerical models; which require significant computing capacity - usually on short notice and at irregular intervals. Peak computing demand during event detection, hazard assessment, or incident response may exceed agency budgets; however some of it can be met through volunteered computing, which distributes subtasks to participating computers via the Internet. This approach has enabled large projects in mathematics, basic science, and climate research to harness the slack computing capacity of thousands of desktop computers. This capacity is likely to diminish as desktops give way to battery-powered mobile devices (laptops, smartphones, tablets) in the consumer market; but as cloud computing becomes commonplace, it may offer significant slack capacity -- if its users are given an easy, trustworthy mechanism for participating. Such a "volunteered cloud computing" mechanism would also offer several advantages over traditional volunteered computing: tasks distributed within a cloud have fewer bandwidth limitations; granular billing mechanisms allow small slices of "interstitial" computing at no marginal cost; and virtual storage volumes allow in-depth, reversible machine reconfiguration. Volunteered cloud computing is especially suitable for "embarrassingly parallel" tasks, including ones requiring large data volumes: examples in disaster management include near-real-time image interpretation, pattern / trend detection, or large model ensembles. In the context of a major disaster, we estimate that cloud users (if suitably informed) might volunteer hundreds to thousands of CPU cores across a large provider such as Amazon Web Services. To explore this potential, we are building a volunteered cloud computing platform and targeting it to a disaster management context. Using a lightweight, fault-tolerant network protocol, this platform helps cloud users join parallel computing projects; automates reconfiguration of their virtual machines; ensures accountability for donated computing; and optimizes the use of "interstitial" computing. Initial applications include fire detection from multispectral satellite imagery and flood risk mapping through hydrological simulations.

  11. A red-flag-based approach to risk management of EHR-related safety concerns.

    PubMed

    Sittig, Dean F; Singh, Hardeep

    2013-01-01

    Although electronic health records (EHRs) have a significant potential to improve patient safety, EHR-related safety concerns have begun to emerge. Based on 369 responses to a survey sent to the memberships of the American Society for Healthcare Risk Management and the American Health Lawyers Association and supplemented by our previous work in EHR-related patient safety, we identified the following common EHR-related safety concerns: (1) incorrect patient identification; (2) extended EHR unavailability (either planned or unplanned); (3) failure to heed a computer-generated warning or alert; (4) system-to-system interface errors; (5) failure to identify, find, or use the most recent patient data; (6) misunderstandings about time; (7) incorrect item selected from a list of items; and (8) open or incomplete orders. In this article, we present a "red-flag"-based approach that can be used by risk managers to identify potential EHR safety concerns in their institutions. An organization that routinely conducts EHR-related surveillance activities, such as the ones proposed here, can significantly reduce risks associated with EHR implementation and use. © 2013 American Society for Healthcare Risk Management of the American Hospital Association.

  12. Electronic Risk Assessment System as an Appropriate Tool for the Prevention of Cancer: a Qualitative Study.

    PubMed

    Javan Amoli, Amir Hossein; Maserat, Elham; Safdari, Reza; Zali, Mohammad Reza

    2015-01-01

    Decision making modalities for screening for many cancer conditions and different stages have become increasingly complex. Computer-based risk assessment systems facilitate scheduling and decision making and support the delivery of cancer screening services. The aim of this article was to survey the electronic risk assessment system as an appropriate tool for the prevention of cancer. A qualitative design was used involving 21 face-to-face interviews. Interviewing involved asking questions and getting answers from exclusive managers of cancer screening. Of the participants 6 were female and 15 were male, and ages ranged from 32 to 78 years. The study was based on a grounded theory approach and the tool was a semi-structured interview. Researchers studied 5 dimensions, comprising electronic guideline standards of colorectal cancer screening, workflow of clinical and genetic activities, pathways of colorectal cancer screening, functionality of computer-based guidelines, and barriers. Electronic guideline standards of colorectal cancer screening were described in 3 categories: content standards, telecommunications and technical standards, and nomenclature and classification standards. According to the participants' views, workflow and genetic pathways of colorectal cancer screening were identified. The study demonstrated an effective role of computer-guided consultation for screening management. Electronic-based systems facilitate real-time decision making during a clinical interaction. Electronic pathways have been applied for clinical and genetic decision support, workflow management, update recommendations and resource estimates. A suitable technical and clinical infrastructure is an integral part of a clinical practice guideline for screening. In conclusion, it is recommended to consider the necessity of architecture assessment as well as integration standards.

  13. Computer-supported games and role plays in teaching water management

    NASA Astrophysics Data System (ADS)

    Hoekstra, A. Y.

    2012-08-01

    There is an increasing demand for an interdisciplinary approach in teaching water management. Computer-supported games and role plays offer the potential of creating an environment in which different disciplines come together and in which students are challenged to develop integrated understanding. Two examples are discussed. The River Basin Game is a common-pool resource game in which participants experience the risk of over-abstractions of water in a river basin and learn how this risk relates to the complexity of the system, the conflict between individual and group optimums and the difficulty in achieving good cooperation. The Globalization of Water Role Play makes participants familiar with the global dimension of water management by letting them experience how national governments can integrate considerations of water scarcity and domestic water productivities into decisions on international trade in commodities like food, cotton and bio-energy. The two examples illustrate that play sessions inspire participants to think about the functioning of systems as a whole and to develop good cooperative courses of action, whereby both uncertainties about the system and the presence of different values and perspectives among participants play a role.

  14. Comparative Analysis of Cervical Spine Management in a Subset of Severe Traumatic Brain Injury Cases Using Computer Simulation

    PubMed Central

    Carter, Kimbroe J.; Dunham, C. Michael; Castro, Frank; Erickson, Barbara

    2011-01-01

    Background No randomized control trial to date has studied the use of cervical spine management strategies in cases of severe traumatic brain injury (TBI) at risk for cervical spine instability solely due to damaged ligaments. A computer algorithm is used to decide between four cervical spine management strategies. A model assumption is that the emergency room evaluation shows no spinal deficit and a computerized tomogram of the cervical spine excludes the possibility of fracture of cervical vertebrae. The study's goal is to determine cervical spine management strategies that maximize brain injury functional survival while minimizing quadriplegia. Methods/Findings The severity of TBI is categorized as unstable, high risk and stable based on intracranial hypertension, hypoxemia, hypotension, early ventilator associated pneumonia, admission Glasgow Coma Scale (GCS) and age. Complications resulting from cervical spine management are simulated using three decision trees. Each case starts with an amount of primary and secondary brain injury and ends as a functional survivor, severely brain injured, quadriplegic or dead. Cervical spine instability is studied with one-way and two-way sensitivity analyses providing rankings of cervical spine management strategies for probabilities of management complications based on QALYs. Early collar removal received more QALYs than the alternative strategies in most arrangements of these comparisons. A limitation of the model is the absence of testing against an independent data set. Conclusions When clinical logic and components of cervical spine management are systematically altered, changes that improve health outcomes are identified. In the absence of controlled clinical studies, the results of this comparative computer assessment show that early collar removal is preferred over a wide range of realistic inputs for this subset of traumatic brain injury. Future research is needed on identifying factors in projecting awakening from coma and the role of delirium in these cases. PMID:21544239

  15. Towards a conceptual framework of OSH risk management in smart working environments based on smart PPE, ambient intelligence and the Internet of Things technologies.

    PubMed

    Podgórski, Daniel; Majchrzycka, Katarzyna; Dąbrowska, Anna; Gralewicz, Grzegorz; Okrasa, Małgorzata

    2017-03-01

    Recent developments in the domains of ambient intelligence (AmI), the Internet of Things, cyber-physical systems (CPS), ubiquitous/pervasive computing, etc., have led to numerous attempts to apply ICT solutions in the occupational safety and health (OSH) area. A literature review reveals a wide range of examples of smart materials, smart personal protective equipment and other AmI applications that have been developed to improve workers' safety and health. Because the use of these solutions modifies work methods, increases the complexity of production processes and introduces high dynamism into the resulting smart working environments (SWE), a new conceptual framework for dynamic OSH management in SWE is called for. The proposed framework is based on a new paradigm of OSH risk management consisting of real-time risk assessment and the capacity to monitor the risk level of each worker individually. A rationale for context-based reasoning in SWE and a corresponding model of the SWE-dedicated CPS are also proposed.

  16. Countering the biggest risk of all.

    PubMed

    Slywotzky, Adrian J; Drzik, John

    2005-04-01

    Corporate treasurers and chief financial officers have become adept at quantifying and managing a wide variety of risks: financial (for example, currency fluctuations), hazard (chemical spills), and operational (computer system failures). To defend themselves, they use tried-and-true tools such as hedging, insurance, and backup systems. Some companies have even adopted the concept of enterprise risk management, integrating available risk management techniques in a comprehensive, organization-wide approach. But most managers have not addressed in a systematic way the greatest threat of all--strategic risks, the array of external events and trends that can devastate a company's growth trajectory and shareholder value. Strategic risks go beyond such familiar challenges as the possible failure of an acquisition or a product launch. A new technology may overtake your product. Gradual shifts in the market may slowly erode one of your brands beyond the point of viability. Or rapidly shifting customer priorities may suddenly change your industry. The key to surviving these strategic risks, the authors say, is knowing how to assess and respond to them. In this article, they lay out a method for identifying and responding to strategic threats. They categorize the risks into seven major classes (industry, technology, brand, competitor, customer, project, and stagnation) and describe a particularly dangerous example within each category. The authors also offer countermeasures to take against these risks and describe how individual companies (American Express, Coach, and Air Liquide, among them) have deployed them to neutralize a threat and, in many cases, capitalize on it. Besides limiting the downside of risk, strategic-risk management forces executives to think more systematically about the future, thus helping them identify opportunities for growth.

  17. Downside Risk analysis applied to the Hedge Funds universe

    NASA Astrophysics Data System (ADS)

    Perelló, Josep

    2007-09-01

    Hedge Funds are considered one of the fastest-growing portfolio management sectors of the past decade. Optimal Hedge Fund management requires an appropriate risk metric. The classic CAPM theory and its Sharpe ratio fail to capture some crucial aspects due to the strongly non-Gaussian character of Hedge Fund statistics. A possible way out of this problem, while keeping the CAPM simplicity, is the so-called Downside Risk analysis. One important benefit lies in distinguishing between good and bad returns, that is, returns greater or lower than the investor's goal. We revisit the most popular Downside Risk indicators and provide new analytical results on them. We compute these measures using the Credit Suisse/Tremont Investable Hedge Fund Index data, with the Gaussian case as a benchmark. In this way, an unusual transversal reading of the existing Downside Risk measures is provided.
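
    The downside measures mentioned above are straightforward to compute once a target return (the investor's goal) is fixed. The sketch below evaluates a below-target semideviation and a Sortino-type ratio on a synthetic return series; the series and the target are assumptions for illustration, not the Credit Suisse/Tremont data used in the paper.

```python
# Minimal sketch of Downside Risk indicators for a return series, assuming a
# fixed investor's target (minimum acceptable return). Synthetic data only.
import numpy as np

rng = np.random.default_rng(0)
returns = rng.normal(loc=0.006, scale=0.02, size=240)   # synthetic monthly returns
target = 0.005                                          # assumed investor's goal

shortfall = np.minimum(returns - target, 0.0)           # only "bad" returns count
downside_variance = np.mean(shortfall ** 2)             # lower partial moment, order 2
downside_deviation = np.sqrt(downside_variance)

sortino_like = (returns.mean() - target) / downside_deviation
prob_shortfall = np.mean(returns < target)              # order-0 lower partial moment

print(f"mean excess over target : {returns.mean() - target:.4f}")
print(f"downside deviation      : {downside_deviation:.4f}")
print(f"Sortino-type ratio      : {sortino_like:.3f}")
print(f"P(return < target)      : {prob_shortfall:.2f}")
```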

  18. Why projects often fail even with high cost contingencies

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kujawski, Edouard

    2002-02-28

    In this note we assume that the individual risks have been adequately quantified and the total project cost contingency adequately computed to ensure an agreed-to probability or confidence level that the total project cost estimate will not be exceeded. But even projects that implement such a process are likely to result in significant cost overruns and/or project failure if the project manager allocates the contingencies to the individual subsystems. The intuitive and mathematically valid solution is to maintain a project-wide contingency and to distribute it to the individual risks on an as-needed basis. Such an approach ensures cost-efficient risk management, and projects that implement it are more likely to succeed and to cost less. We illustrate these ideas using a simplified project with two independent risks. The formulation can readily be extended to multiple risks.
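
    The note's central point, that a single project-wide contingency reaches a given confidence level with less money than the sum of subsystem-level contingencies, can be reproduced with a small Monte Carlo experiment on two independent risks. The cost distributions and the 80% confidence level below are invented for illustration.

```python
# Minimal Monte Carlo sketch: two independent cost risks, comparing
#   (a) contingency held at the project level at an 80% confidence level, vs
#   (b) the sum of 80%-confidence contingencies allocated to each subsystem.
# Distributions and the 80% level are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(42)
n = 100_000
conf = 0.80

# Two independent subsystem cost overruns (lognormal-shaped risks).
risk_a = rng.lognormal(mean=np.log(1.0), sigma=0.5, size=n)
risk_b = rng.lognormal(mean=np.log(2.0), sigma=0.4, size=n)

p80_a = np.quantile(risk_a, conf)
p80_b = np.quantile(risk_b, conf)
p80_total = np.quantile(risk_a + risk_b, conf)

print(f"80th percentile, risk A alone   : {p80_a:6.2f}")
print(f"80th percentile, risk B alone   : {p80_b:6.2f}")
print(f"sum of allocated contingencies  : {p80_a + p80_b:6.2f}")
print(f"project-wide 80th percentile    : {p80_total:6.2f}")
# The project-wide quantile is smaller: one pooled contingency reaches the
# same confidence level with less money than per-subsystem allocations.
```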

  19. Mobile computing in critical care.

    PubMed

    Lapinsky, Stephen E

    2007-03-01

    Handheld computing devices are increasingly used by health care workers, and offer a mobile platform for point-of-care information access. Improved technology, with larger memory capacity, higher screen resolution, faster processors, and wireless connectivity has broadened the potential roles for these devices in critical care. In addition to the personal information management functions, handheld computers have been used to access reference information, management guidelines and pharmacopoeias as well as to track the educational experience of trainees. They can act as an interface with a clinical information system, providing rapid access to patient information. Despite their popularity, these devices have limitations related to their small size, and acceptance by physicians has not been uniform. In the critical care environment, the risk of transmitting microorganisms by such a portable device should always be considered.

  20. Management of Risk and Uncertainty in Systems Acquisition: Proceedings of the 1983 Defense Risk and Uncertainty Workshop Held at Fort Belvoir, Virginia on 13-15 July 1983

    DTIC Science & Technology

    1983-07-15

    categories, however, represent the reality in major acquisition and are often overlooked. Although Figure 1 does not reflect the dynamics and interactions...networking and improved computer capabilities probabilistic network simulation became a reality. The Naval Sea Systems Command became involved in...reasons for using the WBS are plain: 1. Virtually all risk-prone activities are performed by the contractor, not Government. Government is responsible

  1. Integrated Hybrid System Architecture for Risk Analysis

    NASA Technical Reports Server (NTRS)

    Moynihan, Gary P.; Fonseca, Daniel J.; Ray, Paul S.

    2010-01-01

    A conceptual design of an expert-system computer program, intended for use as a project-management tool, has been announced, along with the development of a prototype of the program. The program integrates schedule and risk data to determine the effects of safety risks on schedules and, somewhat conversely, the effects of schedule changes on safety. It is noted that the design has been delivered to a NASA client and that it is planned to disclose the design in a conference presentation.

  2. Analysis of accident sequences and source terms at waste treatment and storage facilities for waste generated by U.S. Department of Energy Waste Management Operations, Volume 3: Appendixes C-H

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mueller, C.; Nabelssi, B.; Roglans-Ribas, J.

    1995-04-01

    This report contains the Appendices for the Analysis of Accident Sequences and Source Terms at Waste Treatment and Storage Facilities for Waste Generated by the U.S. Department of Energy Waste Management Operations. The main report documents the methodology, computational framework, and results of facility accident analyses performed as a part of the U.S. Department of Energy (DOE) Waste Management Programmatic Environmental Impact Statement (WM PEIS). The accident sequences potentially important to human health risk are specified, their frequencies are assessed, and the resultant radiological and chemical source terms are evaluated. A personal computer-based computational framework and database have been developed that provide these results as input to the WM PEIS for calculation of human health risk impacts. This report summarizes the accident analyses and aggregates the key results for each of the waste streams. Source terms are estimated and results are presented for each of the major DOE sites and facilities by WM PEIS alternative for each waste stream. The appendices identify the potential atmospheric release of each toxic chemical or radionuclide for each accident scenario studied. They also provide discussion of specific accident analysis data and guidance used or consulted in this report.

  3. Overcoming Learning Aversion in Evaluating and Managing Uncertain Risks.

    PubMed

    Cox, Louis Anthony Tony

    2015-10-01

    Decision biases can distort cost-benefit evaluations of uncertain risks, leading to risk management policy decisions with predictably high retrospective regret. We argue that well-documented decision biases encourage learning aversion, or predictably suboptimal learning and premature decision making in the face of high uncertainty about the costs, risks, and benefits of proposed changes. Biases such as narrow framing, overconfidence, confirmation bias, optimism bias, ambiguity aversion, and hyperbolic discounting of the immediate costs and delayed benefits of learning, contribute to deficient individual and group learning, avoidance of information seeking, underestimation of the value of further information, and hence needlessly inaccurate risk-cost-benefit estimates and suboptimal risk management decisions. In practice, such biases can create predictable regret in selection of potential risk-reducing regulations. Low-regret learning strategies based on computational reinforcement learning models can potentially overcome some of these suboptimal decision processes by replacing aversion to uncertain probabilities with actions calculated to balance exploration (deliberate experimentation and uncertainty reduction) and exploitation (taking actions to maximize the sum of expected immediate reward, expected discounted future reward, and value of information). We discuss the proposed framework for understanding and overcoming learning aversion and for implementing low-regret learning strategies using regulation of air pollutants with uncertain health effects as an example. © 2015 Society for Risk Analysis.
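
    The exploration-exploitation balance invoked above can be illustrated with a toy reinforcement-learning example. The sketch below uses a two-option epsilon-greedy bandit with invented payoff probabilities; it is only a schematic of the balancing idea, not the authors' computational model.

```python
# Toy sketch of balancing exploration and exploitation with epsilon-greedy
# learning on two uncertain "policy options". Payoff probabilities and the
# epsilon value are illustrative assumptions, not the article's framework.
import random

random.seed(1)
true_payoff = [0.45, 0.55]        # unknown to the learner
counts = [0, 0]
value_est = [0.0, 0.0]
epsilon = 0.10
total_reward = 0.0

for t in range(5_000):
    if random.random() < epsilon:                 # explore: try a random option
        arm = random.randrange(2)
    else:                                         # exploit: pick best estimate so far
        arm = max(range(2), key=lambda a: value_est[a])
    reward = 1.0 if random.random() < true_payoff[arm] else 0.0
    counts[arm] += 1
    # Incremental mean update of the estimated value of the chosen option.
    value_est[arm] += (reward - value_est[arm]) / counts[arm]
    total_reward += reward

print("estimated values:", [round(v, 3) for v in value_est])
print("pulls per option:", counts)
print("average reward  :", round(total_reward / 5_000, 3))
```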

  4. Online patient education and risk assessment: project OPERA from Cancerbackup. Putting inherited breast cancer risk information into context using argumentation theory.

    PubMed

    Mackay, James; Schulz, Peter; Rubinelli, Sara; Pithers, Andrea

    2007-08-01

    Many people are concerned about their family history of breast cancer, and are anxious about the possibility of developing breast cancer themselves. The majority of these people are likely not to be at significantly increased risk of developing inherited breast cancer. All women are at risk of developing sporadic breast cancer, and this risk increases with age. This project aims to aid people's understanding of these issues using an interactive online computer programme. The UK National Institute for Health and Clinical Excellence has published guidance for the National Health Service on the management of familial breast cancer. That guidance lays down clear criteria for categorising risk level and the appropriate management options. We have developed a user-friendly computer programme named OPERA (online patient education and risk assessment) which captures the individuality of the user's situation in a comprehensive way, and then produces personalised information packages, building on the theoretical framework of argumentation developed by Toulmin [Toulmin S. The uses of argument. Cambridge, MA: Cambridge University Press; 1958]. We will test this programme in a series of pilot studies commencing in 2007. This paper describes the progress of this project to date and focuses on the design of the programme. It is possible to construct a user-friendly programme which delivers a personalised information package to individuals who are concerned about their risk of developing breast cancer. This user-friendly programme needs to be tested within a series of carefully thought-out pilot studies before it is ready for general release and use by the public.

  5. Aquatic models, genomics and chemical risk management.

    PubMed

    Cheng, Keith C; Hinton, David E; Mattingly, Carolyn J; Planchart, Antonio

    2012-01-01

    The 5th Aquatic Animal Models for Human Disease meeting follows four previous meetings (Nairn et al., 2001; Schmale, 2004; Schmale et al., 2007; Hinton et al., 2009) in which advances in aquatic animal models for human disease research were reported, and community discussion of future direction was pursued. At this meeting, discussion at a workshop entitled Bioinformatics and Computational Biology with Web-based Resources (20 September 2010) led to an important conclusion: Aquatic model research using feral and experimental fish, in combination with web-based access to annotated anatomical atlases and toxicological databases, yields data that advance our understanding of human gene function, and can be used to facilitate environmental management and drug development. We propose here that the effects of genes and environment are best appreciated within an anatomical context - the specifically affected cells and organs in the whole animal. We envision the use of automated, whole-animal imaging at cellular resolution and computational morphometry facilitated by high-performance computing and automated entry into toxicological databases, as anchors for genetic and toxicological data, and as connectors between human and model system data. These principles should be applied to both laboratory and feral fish populations, which have been virtually irreplaceable sentinels for environmental contamination that results in human morbidity and mortality. We conclude that automation, database generation, and web-based accessibility, facilitated by genomic/transcriptomic data and high-performance and cloud computing, will potentiate the unique and potentially key roles that aquatic models play in advancing systems biology, drug development, and environmental risk management. Copyright © 2011 Elsevier Inc. All rights reserved.

  6. Competency Index. [Health Technology Cluster.

    ERIC Educational Resources Information Center

    Ohio State Univ., Columbus. Center on Education and Training for Employment.

    This competency index lists the competencies included in the 62 units of the Tech Prep Competency Profiles within the Health Technologies Cluster. The unit topics are as follows: employability skills; professionalism; teamwork; computer literacy; documentation; infection control and risk management; medical terminology; anatomy, physiology, and…

  7. Prediction of Pathologic Fracture Risk in Activities of Daily Living and Rehabilitation of Patients With Metastatic Breast Carcinoma of the Pelvis and Femur

    DTIC Science & Technology

    2002-08-01

    demonstrated in three-dimensional graphics and animations. This computer model will aid in planning of non-operative or operative management, rehabilitation regimens, nursing programs, and patient education .

  8. Multi-stakeholder decision analysis and comparative risk assessment for reuse-recycle oriented e-waste management strategies: a game theoretic approach.

    PubMed

    Kaushal, Rajendra Kumar; Nema, Arvind K

    2013-09-01

    This article deals with assessment of the potential health risk posed by carcinogenic and non-carcinogenic substances, namely lead (Pb), cadmium (Cd), copper, chromium (Cr(VI)), zinc, nickel and mercury, present in e-waste. A multi-objective, multi-stakeholder approach based on a strategic game theory model has been developed considering cost, as well as human health risk. The trade-off due to cost difference between a hazardous substances-free (HSF) and a hazardous substance (HS)-containing desktop computer, and the risk posed by them at the time of disposal, has been analyzed. The cancer risk due to dust inhalation for workers at a recycling site in Bangalore for Pb, Cr(VI) and Cd was found to be 4, 33 and 101 in 1 million respectively. Pb and Cr(VI) result in a very high risk owing to dust ingestion at slums near the recycling site: 175 and 81 in 1 million for children, and 24 and 11 in 1 million for adults respectively. The concentration of Pb at a battery workshop in Mayapuri, Delhi (hazard quotient = 3.178) was found to pose adverse health hazards. The government may impose an appropriate penalty on the land disposal of computer waste and/or may give an incentive to manufacturers for producing HSF computers through, for example, relaxing taxes, but there should be no such incentive for manufacturing HS-containing computers.
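
    Risk figures of the kind reported above typically come from standard screening formulas: an intake is estimated from concentration and exposure parameters, then multiplied by a slope factor for incremental cancer risk or divided by a reference dose for a hazard quotient. The sketch below shows that generic calculation with placeholder parameter values; it does not reproduce the study's inputs.

```python
# Generic screening-level sketch of incremental cancer risk and hazard quotient
# from dust ingestion, using the standard intake formulas. Every parameter value
# below is a placeholder for illustration, not an input from the cited study.

def intake(conc_mg_per_kg, ir_mg_per_day, ef_days_per_year, ed_years, bw_kg, at_days):
    """Dust-ingestion intake in mg per kg body weight per day."""
    return (conc_mg_per_kg * ir_mg_per_day * 1e-6 * ef_days_per_year * ed_years) / (bw_kg * at_days)

# Hypothetical child exposure to metal-contaminated dust near a recycling site.
conc, ir, ef, ed, bw = 500.0, 200.0, 350, 6, 15.0

cdi_cancer = intake(conc, ir, ef, ed, bw, at_days=70 * 365)   # lifetime averaging
cdi_hq     = intake(conc, ir, ef, ed, bw, at_days=ed * 365)   # exposure-period averaging

slope_factor   = 8.5e-3   # (mg/kg-day)^-1, placeholder carcinogenic potency
reference_dose = 3.5e-3   # mg/kg-day, placeholder non-carcinogenic threshold

cancer_risk     = cdi_cancer * slope_factor      # incremental lifetime cancer risk
hazard_quotient = cdi_hq / reference_dose        # >1 flags potential non-cancer effects

print(f"cancer-risk intake : {cdi_cancer:.2e} mg/kg-day")
print(f"incremental risk   : {cancer_risk:.2e}")
print(f"hazard quotient    : {hazard_quotient:.2f}")
```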

  9. Monte Carlo-based interval transformation analysis for multi-criteria decision analysis of groundwater management strategies under uncertain naphthalene concentrations and health risks

    NASA Astrophysics Data System (ADS)

    Ren, Lixia; He, Li; Lu, Hongwei; Chen, Yizhong

    2016-08-01

    A new Monte Carlo-based interval transformation analysis (MCITA) is used in this study for multi-criteria decision analysis (MCDA) of naphthalene-contaminated groundwater management strategies. The analysis can be conducted when input data such as total cost, contaminant concentration and health risk are represented as intervals. Compared to traditional MCDA methods, MCITA-MCDA has the advantages of (1) dealing with the inexactness of input data represented as intervals, (2) reducing computational time through the introduction of the Monte Carlo sampling method, and (3) identifying the most desirable management strategies under data uncertainty. A real-world case study is employed to demonstrate the performance of this method. A set of inexact management alternatives is considered for each planning duration on the basis of four criteria. Results indicated that the most desirable management strategy was action 15 for the 5-year, action 8 for the 10-year, action 12 for the 15-year, and action 2 for the 20-year management period.
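
    The general idea of sampling interval-valued criteria and ranking alternatives on each draw can be sketched compactly. The example below uses invented alternatives, intervals, and weights with a simple weighted-sum ranking; it illustrates the Monte Carlo treatment of interval inputs, not the authors' MCITA implementation.

```python
# Sketch of Monte Carlo sampling over interval-valued criteria for a weighted
# multi-criteria ranking. Alternatives, intervals, and weights are invented;
# this shows the general idea, not the authors' MCITA implementation.
import numpy as np

rng = np.random.default_rng(7)

# Each alternative: (cost interval, concentration interval, health-risk interval);
# all criteria are "lower is better" here for simplicity.
alternatives = {
    "action A": [(80, 120), (0.4, 0.9), (1e-6, 5e-6)],
    "action B": [(60, 150), (0.3, 1.2), (2e-6, 8e-6)],
    "action C": [(100, 110), (0.5, 0.7), (0.5e-6, 3e-6)],
}
weights = np.array([0.4, 0.3, 0.3])
n_draws = 10_000
wins = {name: 0 for name in alternatives}

for _ in range(n_draws):
    # Sample one realisation of every criterion from its interval.
    sampled = {name: np.array([rng.uniform(lo, hi) for lo, hi in crit])
               for name, crit in alternatives.items()}
    # Normalise each criterion across alternatives, then take a weighted sum.
    mat = np.array(list(sampled.values()))
    norm = (mat - mat.min(axis=0)) / (mat.max(axis=0) - mat.min(axis=0) + 1e-12)
    scores = {name: float(weights @ row) for name, row in zip(sampled, norm)}
    wins[min(scores, key=scores.get)] += 1        # lower weighted score is better

for name, w in sorted(wins.items(), key=lambda kv: -kv[1]):
    print(f"{name}: preferred in {100 * w / n_draws:.1f}% of draws")
```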

  10. Development of a GIS-based spill management information system.

    PubMed

    Martin, Paul H; LeBoeuf, Eugene J; Daniel, Edsel B; Dobbins, James P; Abkowitz, Mark D

    2004-08-30

    Spill Management Information System (SMIS) is a geographic information system (GIS)-based decision support system designed to effectively manage the risks associated with accidental or intentional releases of a hazardous material into an inland waterway. SMIS provides critical planning and impact information to emergency responders in anticipation of, or following such an incident. SMIS couples GIS and database management systems (DBMS) with the 2-D surface water model CE-QUAL-W2 Version 3.1 and the air contaminant model Computer-Aided Management of Emergency Operations (CAMEO) while retaining full GIS risk analysis and interpretive capabilities. Live 'real-time' data links are established within the spill management software to utilize current meteorological information and flowrates within the waterway. Capabilities include rapid modification of modeling conditions to allow for immediate scenario analysis and evaluation of 'what-if' scenarios. The functionality of the model is illustrated through a case study of the Cheatham Reach of the Cumberland River near Nashville, TN.

  11. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yates, K.R.; Schreiber, A.M.; Rudolph, A.W.

    The US Nuclear Regulatory Commission has initiated the Fuel Cycle Risk Assessment Program to provide risk assessment methods for assistance in the regulatory process for nuclear fuel cycle facilities other than reactors. Both the once-through cycle and plutonium recycle are being considered. A previous report generated by this program defines and describes fuel cycle facilities, or elements, considered in the program. This report, the second from the program, describes the survey and computer compilation of fuel cycle risk-related literature. Sources of available information on the design, safety, and risk associated with the defined set of fuel cycle elements were searched and documents obtained were catalogued and characterized with respect to fuel cycle elements and specific risk/safety information. Both US and foreign surveys were conducted. Battelle's computer-based BASIS information management system was used to facilitate the establishment of the literature compilation. A complete listing of the literature compilation and several useful indexes are included. Future updates of the literature compilation will be published periodically. 760 annotated citations are included.

  12. Public health risk management case concerning the city of Isfahan according to a hypothetical release of HF from a chemical plant.

    PubMed

    Azari, Mansour R; Sadighzadeh, Asghar; Bayatian, Majid

    2018-06-19

    Accidents have happened in the chemical industries all over the world, with serious consequences for adjacent heavily populated areas. In this study, the impact of a probable hypothetical event, releasing considerable amounts of hydrogen fluoride (HF) as a strong irritant into the atmosphere over the city of Isfahan from a strategic chemical plant, was simulated by computational fluid dynamics (CFD). In this model, the meteorological parameters were integrated over time and space, and dispersion of the pollutants was estimated based on a probable accidental release of HF. According to the hypothetical results of the simulation model in this study, HF clouds reached Isfahan in 20 min and exposed 80% of the general public to HF concentrations in the range of 0-34 ppm. The clouds dissipated 240 min after the time of the incident. Assuming a uniform population density in the vicinity of Isfahan, with a population of 1.75 million, 5% of the population (87,500 people) could be exposed for a few minutes to an HF concentration as high as 34 ppm. This concentration exceeds the level described as Immediately Dangerous to Life or Health (30 ppm). This hypothetical evaluation of environmental exposure to HF and its potential health risks is highly relevant to risk management for the general public of Isfahan. Similar studies based on probable accidental scenarios, along with the application of a simulation model for computation of dispersed pollutants, are recommended for risk evaluation and management of cities in developing countries with rapid urbanization around industrial sites.

  13. Paediatric mild head injury: is routine admission to a tertiary trauma hospital necessary?

    PubMed

    Tallapragada, Krishna; Peddada, Ratna Soundarya; Dexter, Mark

    2018-03-01

    Previous studies have shown that children with isolated linear skull fractures have excellent clinical outcomes and low risk of surgery. We wish to identify other injury patterns within the spectrum of paediatric mild head injury, which need only conservative management. Children with low risk of evolving neurosurgical lesions could be safely managed in primary hospitals. We retrospectively analysed all children with mild head injury (i.e. admission Glasgow coma score 13-15) and skull fracture or haematoma on a head computed tomography scan admitted to Westmead Children's Hospital, Sydney over the years 2009-2014. Data were collected regarding demographics, clinical findings, mechanism of injury, head computed tomography scan findings, neurosurgical intervention, outcome and length of admission. The Wilcoxon paired test was used, with a P value <0.05 considered significant. Four hundred and ten children were analysed. Three hundred and eighty-one (93%) children were managed conservatively, 18 (4%) underwent evacuation of extradural haematoma (TBI surgery) and 11 (3%) needed fracture repair surgery. Two children developed a surgical lesion 24 h after admission. Only 17 of 214 children transferred from peripheral hospitals needed neurosurgery. Overall outcomes: zero deaths, one needed brain injury rehabilitation and 63 needed child protection unit intervention. Seventy-five percent of children with non-surgical lesions were discharged within 2 days. Eighty-three percent of road transfers were discharged within 3 days. Children with small intracranial haematomas and/or skull fractures who need no surgery only require brief inpatient symptomatic treatment and could be safely managed in primary hospitals. Improved tertiary hospital transfer guidelines with protocols to manage clinical deterioration could have cost benefit without risking patient safety. © 2017 Royal Australasian College of Surgeons.

  14. ToxPredictor: a Toxicity Estimation Software Tool

    EPA Science Inventory

    The Computational Toxicology Team within the National Risk Management Research Laboratory has developed a software tool that will allow the user to estimate the toxicity for a variety of endpoints (such as acute aquatic toxicity). The software tool is coded in Java and can be ac...

  15. Recording pressure ulcer risk assessment and incidence.

    PubMed

    Plaskitt, Anne; Heywood, Nicola; Arrowsmith, Michaela

    2015-07-15

    This article reports on the introduction of an innovative computer-based system developed to record and report pressure ulcer risk and incidence at an acute NHS trust. The system was introduced to ensure that all patients have an early pressure ulcer risk assessment, which prompts staff to initiate appropriate management if a pressure ulcer is detected, thereby preventing further patient harm. Initial findings suggest that this electronic process has helped to improve the timeliness and accuracy of data on pressure ulcer risk and incidence. In addition, it has resulted in a reduced number of reported hospital-acquired pressure ulcers.

  16. Investigating Uncertainty and Sensitivity in Integrated, Multimedia Environmental Models: Tools for FRAMES-3MRA

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Babendreier, Justin E.; Castleton, Karl J.

    2005-08-01

    Elucidating uncertainty and sensitivity structures in environmental models can be a difficult task, even for low-order, single-medium constructs driven by a unique set of site-specific data. Quantitative assessment of integrated, multimedia models that simulate hundreds of sites, spanning multiple geographical and ecological regions, will ultimately require a comparative approach using several techniques, coupled with sufficient computational power. The Framework for Risk Analysis in Multimedia Environmental Systems - Multimedia, Multipathway, and Multireceptor Risk Assessment (FRAMES-3MRA) is an important software model being developed by the United States Environmental Protection Agency for use in risk assessment of hazardous waste management facilities. The 3MRA modeling system includes a set of 17 science modules that collectively simulate release, fate and transport, exposure, and risk associated with hazardous contaminants disposed of in land-based waste management units (WMU).

  17. Time Is Not Always the Matter: An Instance of Encapsulating Peritoneal Sclerosis Developing in a Patient on Peritoneal Dialysis for a Short Term.

    PubMed

    De Oleo, Radhames Ramos; Villanueva, Hugo; Lwin, Lin; Katikaneni, Madhavi; Yoo, Jinil

    Encapsulating peritoneal sclerosis (EPS) is an infrequent but serious complication that is observed mostly in patients on long-term peritoneal dialysis (PD). However, it can occur after short-term PD, in association with "second hit" risk factors such as peritonitis, acute cessation of PD, or kidney transplantation with the use of calcineurin inhibitors. In our case, a young woman with second-hit risk factors presented with clinical and abdominal computed tomography findings consistent with EPS after short-term PD. She was treated conservatively with nutritional support and was discharged in improved and stable clinical status. In general, the diagnosis of EPS requires clinical findings of bowel obstruction combined with typical computed tomography imaging features. However, the clinical manifestations can be very vague, and the diagnosis is often unclear. A recent study categorized EPS into 4 clinical stages, from pre-EPS to chronic ileus, with associated management from conservative treatment to surgical intervention. In association with second-hit risk factors, EPS can occur after short-term PD. Severity is variable, and the outcome is often devastating. Timely recognition and expert management of EPS can change the outcome very favorably.

  18. Role of risk stratification by SPECT, PET, and hybrid imaging in guiding management of stable patients with ischaemic heart disease: expert panel of the EANM cardiovascular committee and EACVI.

    PubMed

    Acampa, Wanda; Gaemperli, Oliver; Gimelli, Alessia; Knaapen, Paul; Schindler, Thomas H; Verberne, Hein J; Zellweger, Michael J

    2015-12-01

    Risk stratification has become increasingly important in the management of patients with suspected or known ischaemic heart disease (IHD). Recent guidelines recommend that these patients have their care driven by risk assessment. The purpose of this position statement is to summarize current evidence on the value of cardiac single-photon emission computed tomography, positron emission tomography, and hybrid imaging in risk stratifying asymptomatic or symptomatic patients with suspected IHD, patients with stable disease, patients after coronary revascularization, heart failure patients, and specific patient population. In addition, this position statement evaluates the impact of imaging results on clinical decision-making and thereby its role in patient management. The document represents the opinion of the European Association of Nuclear Medicine (EANM) Cardiovascular Committee and of the European Association of Cardiovascular Imaging (EACVI) and intends to stimulate future research in this field. Published on behalf of the European Society of Cardiology. All rights reserved. © The Author 2015. For permissions please email: journals.permissions@oup.com.

  19. Managing Counterparty Risk in an Unstable Financial System

    ERIC Educational Resources Information Center

    Belmont, David

    2012-01-01

    The recent flow of headlines excoriating bankers and financiers for malfeasance, fraud, and collusion has been almost biblical in proportion. Counterparties that appeared creditworthy based on financial statements and ratings have revealed that they are impaired either due to computer errors, control failures, malfeasance, or potential regulatory…

  20. Traceability System for Agricultural Products Based on RFID and Mobile Technology

    NASA Astrophysics Data System (ADS)

    Sugahara, Koji

    In agriculture, it is necessary to establish and integrate food traceability systems and risk management systems in order to improve food safety along the entire food chain. An integrated traceability system for agricultural products was developed, based on RFID and mobile computing technology. To identify individual products efficiently during distribution, small RFID tags with unique IDs and handheld RFID readers were applied. During distribution, the RFID tags are read with the readers, and transit records of the products are stored in the database via wireless LAN. Regarding agricultural production, recent issues of pesticide misuse have affected consumer confidence in food safety. The Navigation System for Appropriate Pesticide Use (Nouyaku-navi) was developed, which is accessible in the field via Internet-enabled cell phones. Based on it, agricultural risk management systems have been developed. These systems work together with traceability systems and can be applied for process control and risk management in agriculture.

  1. Optimal Bi-Objective Redundancy Allocation for Systems Reliability and Risk Management.

    PubMed

    Govindan, Kannan; Jafarian, Ahmad; Azbari, Mostafa E; Choi, Tsan-Ming

    2016-08-01

    In the big data era, systems reliability is critical to effective systems risk management. In this paper, a novel multiobjective approach, with hybridization of a known algorithm called NSGA-II and an adaptive population-based simulated annealing (APBSA) method, is developed to solve systems reliability optimization problems. In the first step, to create a good algorithm, we use a coevolutionary strategy. Since the proposed algorithm is very sensitive to parameter values, the response surface method is employed to estimate the appropriate parameters of the algorithm. Moreover, to examine the performance of our proposed approach, several test problems are generated, and the proposed hybrid algorithm and other commonly known approaches (i.e., MOGA, NRGA, and NSGA-II) are compared with respect to four performance measures: 1) mean ideal distance; 2) diversification metric; 3) percentage of domination; and 4) data envelopment analysis. The computational studies have shown that the proposed algorithm is an effective approach for systems reliability and risk management.
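
    The underlying redundancy allocation problem trades system reliability against cost across subsystems. As a small self-contained illustration of that bi-objective structure (not the paper's NSGA-II/APBSA hybrid), the sketch below enumerates redundancy levels for a series system of parallel subsystems and extracts the Pareto front of (reliability, cost) pairs.

```python
# Brute-force sketch of a small redundancy allocation problem: a series system
# of parallel subsystems, trading reliability against cost. This illustrates
# the bi-objective structure only; it is not the paper's hybrid metaheuristic.
from itertools import product

component_rel  = [0.90, 0.85, 0.95]    # reliability of one component per subsystem
component_cost = [3.0, 2.0, 5.0]       # cost of one component per subsystem
max_redundancy = 4                     # 1..4 parallel components per subsystem

def evaluate(allocation):
    """Return (system reliability, total cost) for a redundancy allocation."""
    reliability, cost = 1.0, 0.0
    for n, r, c in zip(allocation, component_rel, component_cost):
        reliability *= 1.0 - (1.0 - r) ** n     # parallel block fails only if all units fail
        cost += n * c
    return reliability, cost

candidates = [(alloc, *evaluate(alloc))
              for alloc in product(range(1, max_redundancy + 1), repeat=3)]

# Keep non-dominated solutions: higher reliability and lower cost are both better.
pareto = [a for a in candidates
          if not any(b[1] >= a[1] and b[2] <= a[2] and (b[1], b[2]) != (a[1], a[2])
                     for b in candidates)]

for alloc, rel, cost in sorted(pareto, key=lambda x: x[2]):
    print(f"allocation {alloc}: reliability={rel:.4f}, cost={cost:4.1f}")
```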

  2. Basic Microbiologic and Infection Control Information to Reduce the Potential Transmission of Pathogens to Patients via Computer Hardware

    PubMed Central

    Neely, Alice N.; Sittig, Dean F.

    2002-01-01

    Computer technology, from the management of individual patient medical records to the tracking of epidemiologic trends, has become an essential part of all aspects of modern medicine. Consequently, computers, including bedside components, point-of-care testing equipment, and handheld computer devices, are increasingly present in patients’ rooms. Recent articles have indicated that computer hardware, just like other medical equipment, may act as a reservoir for microorganisms and contribute to the transfer of pathogens to patients. This article presents basic microbiological concepts relative to infection, reviews the present literature concerning possible links between computer contamination and nosocomial colonizations and infections, discusses basic principles for the control of contamination, and provides guidelines for reducing the risk of transfer of microorganisms to susceptible patient populations. PMID:12223502

  3. Drought Risk Identification: Early Warning System of Seasonal Agrometeorological Drought

    NASA Astrophysics Data System (ADS)

    Dalecios, Nicolas; Spyropoulos, Nicos V.; Tarquis, Ana M.

    2014-05-01

    When drought is considered as a hazard, drought types are classified into three categories, namely meteorological or climatological, agrometeorological or agricultural, and hydrological drought; socioeconomic impacts can be considered a fourth class. This paper addresses agrometeorological drought affecting agriculture within the risk management framework. Risk management consists of risk assessment, as well as feedback on the adopted risk reduction measures. Risk assessment itself comprises three distinct steps, namely risk identification, risk estimation and risk evaluation. This paper deals with the quantification and monitoring of agrometeorological drought, which constitutes part of risk identification. For the quantitative assessment of agrometeorological or agricultural drought, as well as the computation of spatiotemporal features, one of the most reliable and widely used indices is applied, namely the Vegetation Health Index (VHI). The computation of VHI is based on satellite data of temperature and the Normalized Difference Vegetation Index (NDVI). The spatiotemporal features of drought extracted from VHI are areal extent, onset and end time, duration and severity. In this paper, a 20-year (1981-2001) time series of NOAA/AVHRR satellite data is used, from which monthly images of VHI are extracted. The application is implemented in Thessaly, the major agricultural region of Greece, characterized by vulnerable and drought-prone agriculture. The results show that every year there is a seasonal agrometeorological drought, with a gradual increase in areal extent and severity and peaks usually appearing during the summer. Drought monitoring is conducted with monthly remotely sensed VHI images. Drought early warning is developed using empirical relationships of severity and areal extent. In particular, two second-order polynomials are fitted, one for low- and the other for high-severity drought. The two fitted curves offer a seasonal forecasting tool on a monthly basis from April till October each year. The results of this drought risk identification effort are considered quite satisfactory, offering prognostic potential for seasonal agrometeorological drought. Key words: agrometeorological drought, risk identification, remote sensing.
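
    The Vegetation Health Index is conventionally built from the Vegetation Condition Index (NDVI scaled by its multi-year envelope) and the Temperature Condition Index (scaled brightness temperature). The sketch below shows that standard construction on synthetic monthly series; the equal weighting and the severity thresholds are assumptions, not values taken from this study.

```python
# Sketch of the standard VCI/TCI/VHI construction from NDVI and brightness-
# temperature series. The synthetic data, the 0.5 weighting, and the severity
# thresholds are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(3)
years, months = 20, 12
ndvi = 0.3 + 0.2 * np.sin(np.linspace(0, 2 * np.pi, months)) + rng.normal(0, 0.03, (years, months))
bt   = 290 + 15 * np.sin(np.linspace(0, 2 * np.pi, months)) + rng.normal(0, 1.0, (years, months))

# Multi-year min/max per calendar month define the scaling envelopes.
vci = 100 * (ndvi - ndvi.min(axis=0)) / (ndvi.max(axis=0) - ndvi.min(axis=0))
tci = 100 * (bt.max(axis=0) - bt) / (bt.max(axis=0) - bt.min(axis=0))
vhi = 0.5 * vci + 0.5 * tci                     # commonly used equal weighting

# Simple severity classes for one example year (thresholds are assumptions).
year = 7
for month, value in enumerate(vhi[year], start=1):
    status = "severe drought" if value < 20 else "moderate drought" if value < 40 else "no drought"
    print(f"month {month:2d}: VHI={value:5.1f}  {status}")
```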

  4. Predicting long-term performance of engineered geologic carbon dioxide storage systems to inform decisions amidst uncertainty

    NASA Astrophysics Data System (ADS)

    Pawar, R.

    2016-12-01

    Risk assessment and risk management of engineered geologic CO2 storage systems is an area of active investigation. The potential geologic CO2 storage systems currently under consideration are inherently heterogeneous and have limited to no characterization data. Effective risk management decisions to ensure safe, long-term CO2 storage require assessing and quantifying risks while taking into account the uncertainties in a storage site's characteristics. The key decisions are typically related to definition of the area of review, effective monitoring strategy and monitoring duration, potential of leakage and associated impacts, etc. A quantitative methodology for predicting a sequestration site's long-term performance is critical for making the key decisions necessary for successful deployment of commercial-scale geologic storage projects, which will require quantitative assessments of potential long-term liabilities. An integrated assessment modeling (IAM) paradigm which treats a geologic CO2 storage site as a system made up of various linked subsystems can be used to predict long-term performance. The subsystems include the storage reservoir, seals, potential leakage pathways (such as wellbores and natural fractures/faults) and receptors (such as shallow groundwater aquifers). CO2 movement within each of the subsystems and the resulting interactions are captured through reduced order models (ROMs). The ROMs capture the complex physical/chemical interactions resulting from CO2 movement but are computationally extremely efficient. The computational efficiency allows for performing the Monte Carlo simulations necessary for quantitative probabilistic risk assessment. We have used the IAM to predict long-term performance of geologic CO2 sequestration systems and to answer questions related to the probability of leakage of CO2 through wellbores, the impact of CO2/brine leakage into a shallow aquifer, etc. Answers to such questions are critical in making key risk management decisions. A systematic uncertainty quantification approach can be used to understand how uncertain parameters associated with different subsystems (e.g., reservoir permeability, wellbore cement permeability, wellbore density, etc.) impact the overall site performance predictions.
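
    The structure described above, reduced-order models for the reservoir, wellbores, and aquifer chained inside a Monte Carlo loop, can be mimicked with deliberately simple stand-in models. The sketch below uses invented functional forms and parameter ranges purely to show the workflow; it is not the authors' integrated assessment model.

```python
# Structural sketch of an integrated assessment Monte Carlo: simple stand-in
# "reduced order models" for reservoir pressure, wellbore leakage, and aquifer
# impact, chained together. All functional forms and ranges are invented.
import numpy as np

rng = np.random.default_rng(11)
n = 50_000

# Uncertain inputs (placeholder ranges).
reservoir_perm = rng.lognormal(mean=np.log(50e-15), sigma=0.8, size=n)   # m^2
cement_perm    = rng.lognormal(mean=np.log(1e-17),  sigma=1.5, size=n)   # m^2
n_wells        = rng.poisson(lam=3, size=n)                              # legacy wells

# Stand-in ROMs: monotone, dimensionless response surfaces.
overpressure = 1.0 / (1.0 + reservoir_perm / 50e-15)          # higher perm -> less buildup
leak_per_well = overpressure * cement_perm / (cement_perm + 1e-16)
total_leak_fraction = 1.0 - (1.0 - np.clip(leak_per_well, 0, 1)) ** n_wells
aquifer_impact = total_leak_fraction > 0.01                   # assumed impact threshold

print(f"P(any wellbore leakage)     : {np.mean(total_leak_fraction > 0):.3f}")
print(f"P(aquifer impact threshold) : {np.mean(aquifer_impact):.3f}")
print(f"mean leaked fraction        : {total_leak_fraction.mean():.4f}")
```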

  5. Developing and Assessing E-Learning Techniques for Teaching Forecasting

    ERIC Educational Resources Information Center

    Gel, Yulia R.; O'Hara Hines, R. Jeanette; Chen, He; Noguchi, Kimihiro; Schoner, Vivian

    2014-01-01

    In the modern business environment, managers are increasingly required to perform decision making and evaluate related risks based on quantitative information in the face of uncertainty, which in turn increases demand for business professionals with sound skills and hands-on experience with statistical data analysis. Computer-based training…

  6. Information Security in a World of Global Connectivity: A Case Study

    ERIC Educational Resources Information Center

    Lawrence, Cameron; Olson, Garrett; Douma, Bambi

    2015-01-01

    The widespread use of digital technologies such as smartphones, tablets, and notebook computers expose firms engaged in international business to risks that far exceed what most corporate technology users understand. This case study examines some of the technology-specific vulnerabilities managers face when engaged in international travel and…

  7. Host culling as an adaptive management tool for chronic wasting disease in white-tailed deer: a modelling study.

    PubMed

    Wasserberg, Gideon; Osnas, Erik E; Rolley, Robert E; Samuel, Michael D

    2009-04-01

    Emerging wildlife diseases pose a significant threat to natural and human systems. Because of real or perceived risks of delayed actions, disease management strategies such as culling are often implemented before thorough scientific knowledge of disease dynamics is available. Adaptive management is a valuable approach in addressing the uncertainty and complexity associated with wildlife disease problems and can be facilitated by using a formal model. We developed a multi-state computer simulation model using age, sex, infection stage, and seasonality as a tool for scientific learning and managing chronic wasting disease (CWD) in white-tailed deer Odocoileus virginianus. Our matrix model used disease transmission parameters based on data collected through disease management activities. We used this model to evaluate the effects of density-dependent (DD) and frequency-dependent (FD) transmission, time since disease introduction, and deer culling on the demographics, epizootiology, and management of CWD. Both DD and FD models fit the Wisconsin data for a harvested white-tailed deer population, but FD was slightly better. Time since disease introduction was estimated as 36 (95% CI, 24-50) and 188 (41 to >200) years for DD and FD transmission, respectively. Deer harvest using intermediate to high non-selective rates can be used to reduce uncertainty between DD and FD transmission and improve our prediction of long-term epidemic patterns and host population impacts. A higher harvest rate allows earlier detection of these differences, but substantially reduces deer abundance. Results showed that CWD has spread slowly within Wisconsin deer populations, and therefore, epidemics and disease management are expected to last for decades. Non-hunted deer populations can develop and sustain a high level of infection, generating a substantial risk of disease spread. In contrast, CWD prevalence remains lower in hunted deer populations, but at a higher prevalence the disease competes with recreational hunting to reduce deer abundance. Synthesis and applications. Uncertainty about density- or frequency-dependent transmission hinders predictions about the long-term impacts of chronic wasting disease on cervid populations and the development of appropriate management strategies. An adaptive management strategy using computer modelling coupled with experimental management and monitoring can be used to test model predictions, identify the likely mode of disease transmission, and evaluate the risks of alternative management responses.
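
    The contrast between density-dependent and frequency-dependent transmission that drives these conclusions can be seen in a very small susceptible-infected model with harvest. The sketch below is a generic two-class difference-equation model with invented parameters, not the authors' age- and sex-structured matrix model.

```python
# Generic discrete-time susceptible-infected sketch contrasting density-
# dependent (DD) and frequency-dependent (FD) transmission under harvest.
# Parameters are invented; this is not the authors' structured matrix model.

def simulate(mode, beta, years=100, harvest=0.15, K=2000.0):
    S, I = 990.0, 10.0
    birth, mortality, disease_mortality = 0.6, 0.10, 0.25
    for _ in range(years):
        N = S + I
        births = birth * N * max(1.0 - N / K, 0.0)           # logistic recruitment
        force = beta * I if mode == "DD" else beta * I / N    # DD vs FD transmission term
        new_inf = min(S, force * S)
        S = max(S + births - new_inf - (mortality + harvest) * S, 0.0)
        I = max(I + new_inf - (mortality + disease_mortality + harvest) * I, 0.0)
    N = S + I
    return N, (I / N if N > 0 else 0.0)

for mode, beta in [("DD", 0.0006), ("FD", 0.6)]:
    for harvest in (0.05, 0.15, 0.30):
        N, prev = simulate(mode, beta, harvest=harvest)
        print(f"{mode} transmission, harvest={harvest:.2f}: "
              f"abundance={N:7.1f}, prevalence={prev:.2f}")
```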

  8. Balancing Materiel Readiness Risks and Concurrency in Weapon System Acquisition: A Handbook for Program Managers

    DTIC Science & Technology

    1984-07-15

    [OCR-garbled excerpt; recoverable fragments include the exhibit titles "Exhibit 4-6b. Embedded Computer Hardware vs. Software" and "Exhibit 4-6c. DoD Embedded Computer Market", and text concerning the mix of stores carried by the vehicle and the anticipated combat tactics employed by the carrying or launching vehicle and its maneuvering.]

  9. Rational risk-based decision support for drinking water well managers by optimized monitoring designs

    NASA Astrophysics Data System (ADS)

    Enzenhöfer, R.; Geiges, A.; Nowak, W.

    2011-12-01

    Advection-based well-head protection zones are commonly used to manage the contamination risk of drinking water wells. Considering the insufficient knowledge about hazards and transport properties within the catchment, current Water Safety Plans recommend that catchment managers and stakeholders know, control and monitor all possible hazards within the catchments and make rational risk-based decisions. Our goal is to supply catchment managers with the required probabilistic risk information, and to generate tools that allow for optimal and rational allocation of resources between improved monitoring on the one hand and extended safety margins and risk mitigation measures on the other. To support risk managers with the indispensable information, we address the epistemic uncertainty of advective-dispersive solute transport and well vulnerability (Enzenhoefer et al., 2011) within a stochastic simulation framework. Our framework can separate uncertainty in contaminant location from actual dilution of peak concentrations by resolving heterogeneity with high-resolution Monte-Carlo simulation. To keep computational costs low, we solve the reverse temporal moment transport equation. Only in post-processing do we recover the time-dependent solute breakthrough curves and the deduced well vulnerability criteria from temporal moments by non-linear optimization. Our first step towards optimal risk management is optimal positioning of sampling locations and optimal choice of data types to best reduce the epistemic prediction uncertainty in well-head delineation, using the cross-bred Likelihood Uncertainty Estimator (CLUE, Leube et al., 2011) for optimal sampling design. Better monitoring leads to more reliable and realistic protection zones and thus helps catchment managers to better justify smaller, yet conservative safety margins. In order to allow an optimal choice of sampling strategies, we compare the trade-off between monitoring and delineation costs by accounting for ill-delineated fractions of protection zones. Within an illustrative simplified 2D synthetic test case, we demonstrate our concept, involving synthetic transmissivity and head measurements for conditioning. We demonstrate the worth of optimally collected data in the context of protection zone delineation by assessing the reduced areal demand of the delineated area at a user-specified risk acceptance level. Results indicate that, thanks to optimally collected data, risk-aware delineation can be made at low to moderate additional costs compared to conventional delineation strategies.
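
    The temporal-moment approach mentioned above rests on the relation m_k = ∫ t^k c(t) dt, with the mean arrival time given by the normalised first moment. The sketch below evaluates these moments on a synthetic breakthrough curve; it is unrelated to the authors' groundwater code and conditioning framework.

```python
# Sketch: temporal moments of a solute breakthrough curve and the arrival-time
# statistics they encode (m_k = integral of t^k * c(t) dt). The synthetic curve
# below is purely illustrative.
import numpy as np

t = np.linspace(0.01, 200.0, 4000)                     # days
dt = t[1] - t[0]
# Synthetic breakthrough curve: a lognormal-shaped concentration pulse.
c = np.exp(-(np.log(t) - np.log(50.0)) ** 2 / (2 * 0.3 ** 2)) / t

m0 = np.sum(c) * dt                                    # zeroth temporal moment
m1 = np.sum(t * c) * dt                                # first temporal moment
m2 = np.sum(t ** 2 * c) * dt                           # second temporal moment

mean_arrival = m1 / m0                                 # normalised first moment
spread = np.sqrt(m2 / m0 - mean_arrival ** 2)          # std dev of arrival times

print(f"mean arrival time    : {mean_arrival:6.2f} days")
print(f"arrival-time std dev : {spread:6.2f} days")
```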

  10. Information management and information technologies: keys to professional and business success.

    PubMed

    Otten, K W

    1984-01-01

    Personal computers, spreadsheets, decision support software, electronic mail and video disks are just a few of the innovations of information technology which attract the attention of information professionals and managers alike: they are all concerned with the rapidly changing face of information technology and how to cope with a changing competitive environment, personally, and for the benefit of their companies. This paper is the first in a monthly series which tries to illuminate some of the factors and changes which shape our future as professionals and managers. In so doing, it guides and motivates the reader to become "information literate," a prerequisite for personal advancement in an information-based economy. This first paper outlines the relationship between technological innovations, use of information tools and information management and what to consider in order to benefit from the information revolution. It explains the risks of becoming professionally obsolete and alerts the reader to get personally involved to remain or become "information and computer literate."

  11. Agent-Based Simulations for Project Management

    NASA Technical Reports Server (NTRS)

    White, J. Chris; Sholtes, Robert M.

    2011-01-01

    Currently, the most common approach used in project planning tools is the Critical Path Method (CPM). While this method was a great improvement over the basic Gantt chart technique being used at the time, it now suffers from three primary flaws: (1) task duration is an input, (2) productivity impacts are not considered, and (3) management corrective actions are not included. Today, computers have exceptional computational power to handle complex simulations of task execution and project management activities (e.g., dynamically changing the number of resources assigned to a task when it is behind schedule). Through research under a Department of Defense contract, the author and the ViaSim team have developed a project simulation tool that enables more realistic cost and schedule estimates by using a resource-based model that literally turns the current duration-based CPM approach "on its head." The approach represents a fundamental paradigm shift in estimating projects, managing schedules, and reducing risk through innovative predictive techniques.
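
    The shift described above, treating duration as an output of resources and productivity rather than an input, can be shown with a very small simulation. The sketch below assumes a simple work/(resources × productivity) relationship and a naive corrective-action rule; it illustrates the idea only and is not the ViaSim tool.

```python
# Minimal sketch of a resource-based schedule model: task duration emerges from
# remaining work, assigned resources, and (noisy) productivity, and a simple
# management rule adds resources when a task falls behind. Purely illustrative.
import random

random.seed(5)

def run_task(work_hours, resources, planned_weeks, add_if_late=True):
    """Simulate one task week by week; returns actual duration in weeks."""
    remaining, week = work_hours, 0
    while remaining > 0:
        week += 1
        productivity = random.uniform(0.7, 1.0)          # output per paid hour
        remaining -= resources * 40 * productivity        # 40-hour work weeks
        behind = week >= planned_weeks and remaining > 0
        if add_if_late and behind:
            resources += 1                                # corrective action: add staff
    return week

planned = 6
durations = [run_task(work_hours=2000, resources=8, planned_weeks=planned)
             for _ in range(1000)]
print(f"planned duration : {planned} weeks")
print(f"mean actual      : {sum(durations) / len(durations):.1f} weeks")
print(f"P(overrun)       : {sum(d > planned for d in durations) / len(durations):.2f}")
```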

  12. Natural hazard management high education: laboratory of hydrologic and hydraulic risk management and applied geomorphology

    NASA Astrophysics Data System (ADS)

    Giosa, L.; Margiotta, M. R.; Sdao, F.; Sole, A.; Albano, R.; Cappa, G.; Giammatteo, C.; Pagliuca, R.; Piccolo, G.; Statuto, D.

    2009-04-01

    The Environmental Engineering Faculty of the University of Basilicata offers a higher-level course for students in the field of natural hazards. The curriculum provides expertise in the prediction, prevention and management of earthquake risk, hydrologic-hydraulic risk, and geomorphological risk. These skills contribute to the training of specialists who, in addition to having a thorough knowledge of the genesis and phenomenology of natural risks, know how to interpret, evaluate and monitor the dynamics of the environment and the territory. In addition to basic training in mathematics and physics, the course of study provides specific lessons on the seismic and structural dynamics of land, environmental and computational hydraulics, hydrology and applied hydrogeology. In particular, the course includes two connected examination subjects: the Laboratory of hydrologic and hydraulic risk management and Applied geomorphology. These subjects involve the development and resolution of natural hazard problems through the study of a real natural disaster. In the last year, the project work concerned the collapse, on 19 July 1985, of two decantation basins of fluorspar mines in the Stava Valley, northern Italy. During the course, data and event information were collected, a guided tour of the disaster site was organized, and finally mathematical models were applied to simulate the disaster and the results were analysed. The student work was presented in a public workshop.

  13. Does albendazole affect seizure remission and computed tomography response in children with neurocysticercosis? A Systematic review and meta-analysis.

    PubMed

    Mazumdar, Maitreyi; Pandharipande, Pari; Poduri, Annapurna

    2007-02-01

    A recent trial suggested that albendazole reduces seizures in adults with neurocysticercosis. There is still no consensus regarding optimal management of neurocysticercosis in children. The authors conducted a systematic review and meta-analysis to assess the efficacy of albendazole in children with neurocysticercosis, by searching the Cochrane Databases, MEDLINE, EMBASE, and LILACS. Three reviewers extracted data using an intent-to-treat analysis. Random effects models were used to estimate relative risks. Four randomized trials were selected for meta-analysis, and 10 observational studies were selected for qualitative review. The relative risk of seizure remission in treatment versus control was 1.26 (1.09, 1.46). The relative risk of improvement in computed tomography in these trials was 1.15 (0.97, 1.36). Review of observational studies showed conflicting results, likely owing to preferential administration of albendazole to sicker children.

  14. A Novel Biobjective Risk-Based Model for Stochastic Air Traffic Network Flow Optimization Problem.

    PubMed

    Cai, Kaiquan; Jia, Yaoguang; Zhu, Yanbo; Xiao, Mingming

    2015-01-01

    Network-wide air traffic flow management (ATFM) is an effective way to alleviate demand-capacity imbalances globally and thereby reduce airspace congestion and flight delays. Conventional ATFM models assume the capacities of airports or airspace sectors are all predetermined. However, the capacity uncertainties due to the dynamics of convective weather may make deterministic ATFM measures impractical. This paper investigates the stochastic air traffic network flow optimization (SATNFO) problem, which is formulated as a weighted biobjective 0-1 integer programming model. In order to evaluate the effect of capacity uncertainties on ATFM, the operational risk is modeled via probabilistic risk assessment and introduced as an extra objective in the SATNFO problem. Computational experiments using real-world air traffic network data associated with simulated weather data show that the presented model has far fewer constraints than a stochastic model with nonanticipative constraints, meaning the proposed model reduces computational complexity.

  15. Global computer-assisted appraisal of osteoporosis risk in Asian women: an innovative study.

    PubMed

    Chang, Shu F; Hong, Chin M; Yang, Rong S

    2011-05-01

    To develop a computer-assisted appraisal system of osteoporosis that can predict osteoporosis health risk in community-dwelling women and to use it in an empirical analysis of the risk in Asian women. As the literature indicates, health risk assessment tools are generally applied in clinical practice for patient diagnosis. However, few studies have explored how to assist community-dwelling women to understand the risk of osteoporosis without invasive data. A longitudinal, evidence-based study. The first stage of this study is to establish a system that combines expertise in nursing, medicine and information technology. This part includes information from random samples (n = 700), including data on bone mineral density, osteoporosis risk factors, knowledge, beliefs and behaviour, which are used as the health risk appraisal system database. The second stage is to conduct an empirical study. The relative risks of osteoporosis of the participants (n = 300) were determined with the system. The participants who were classified as at-risk were randomly grouped into experimental and control groups. Each group was treated using different nursing intervention methods. The sensitivity and specificity of the analytical tools were 75%. In the empirical study, the results indicate that the prevalence of osteoporosis was 14.0%. Data indicate that strategic application of multiple nursing interventions can promote osteoporosis prevention knowledge in high-risk women and enhance the effectiveness of preventive action. The system can also provide people in remote areas or with insufficient medical resources a simple and effective means of managing health risk, and implements the idea of self-evaluation and self-care among community-dwelling women at home to achieve the final goal of early detection and early treatment of osteoporosis. This study developed a useful approach for providing Asian women with a reliable, valid, convenient and economical self-health management model. Health care professionals can explore the use of advanced information systems and nursing interventions to increase the effectiveness of osteoporosis prevention programmes for women. © 2011 Blackwell Publishing Ltd.

  16. Effect of a computer-guided, quality improvement program for cardiovascular disease risk management in primary health care: the treatment of cardiovascular risk using electronic decision support cluster-randomized trial.

    PubMed

    Peiris, David; Usherwood, Tim; Panaretto, Kathryn; Harris, Mark; Hunt, Jennifer; Redfern, Julie; Zwar, Nicholas; Colagiuri, Stephen; Hayman, Noel; Lo, Serigne; Patel, Bindu; Lyford, Marilyn; MacMahon, Stephen; Neal, Bruce; Sullivan, David; Cass, Alan; Jackson, Rod; Patel, Anushka

    2015-01-01

Despite effective treatments to reduce cardiovascular disease risk, their translation into practice is limited. Using a parallel-arm cluster-randomized controlled trial in 60 Australian primary healthcare centers, we tested whether a multifaceted quality improvement intervention comprising computerized decision support, audit/feedback tools, and staff training improved (1) guideline-indicated risk factor measurements and (2) guideline-indicated medications for those at high cardiovascular disease risk. Centers had to use a compatible software system, and eligible patients were regular attendees (Aboriginal and Torres Strait Islander people aged ≥ 35 years and others aged ≥ 45 years). Patient-level analyses were conducted using generalized estimating equations to account for clustering. Median follow-up for 38,725 patients (mean age, 61.0 years; 42% men) was 17.5 months. Mean monthly staff support was <1 hour/site. For the coprimary outcomes, the intervention was associated with improved overall risk factor measurements (62.8% versus 53.4%; risk ratio, 1.25; 95% confidence interval, 1.04-1.50; P=0.02), but there was no significant difference in recommended prescriptions for the high-risk cohort (n=10,308; 56.8% versus 51.2%; P=0.12). There were significant treatment escalations (new prescriptions or increased numbers of medicines) for antiplatelet (17.9% versus 2.7%; P<0.001), lipid-lowering (19.2% versus 4.8%; P<0.001), and blood pressure-lowering medications (23.3% versus 12.1%; P=0.02). In Australian primary healthcare settings, a computer-guided quality improvement intervention, requiring minimal support, improved cardiovascular disease risk measurement but did not increase prescription rates in the high-risk group. Computerized quality improvement tools offer an important, albeit partial, solution to improving primary healthcare system capacity for cardiovascular disease risk management. https://www.anzctr.org.au/Trial/Registration/TrialReview.aspx?id=336630. Australian New Zealand Clinical Trials Registry No. 12611000478910. © 2015 American Heart Association, Inc.
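
    As a rough illustration of the kind of patient-level clustered analysis described above (clustering by healthcare center handled with generalized estimating equations), the sketch below fits a GEE logistic model to synthetic data with statsmodels; the outcome, covariate, and cluster labels are invented and do not reproduce the trial's analysis.

      import numpy as np
      import statsmodels.api as sm

      rng = np.random.default_rng(0)

      # Synthetic clustered data: 60 centers, 50 patients each (all values invented).
      n_centers, n_per = 60, 50
      center = np.repeat(np.arange(n_centers), n_per)
      treated = np.repeat(rng.integers(0, 2, n_centers), n_per)      # cluster-level randomization
      center_effect = np.repeat(rng.normal(0, 0.5, n_centers), n_per)
      logit = -0.2 + 0.4 * treated + center_effect                   # assumed treatment effect
      y = rng.binomial(1, 1 / (1 + np.exp(-logit)))                  # binary outcome (e.g., risk factors measured)

      X = sm.add_constant(treated.astype(float))                     # intercept + treatment indicator
      model = sm.GEE(y, X, groups=center,
                     family=sm.families.Binomial(),
                     cov_struct=sm.cov_struct.Exchangeable())        # accounts for within-center correlation
      result = model.fit()
      print(result.summary())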

  17. Risk-cost-benefit analysis of atrazine in drinking water from agricultural activities and policy implications

    NASA Astrophysics Data System (ADS)

    Tesfamichael, Aklilu A.; Caplan, Arthur J.; Kaluarachchi, Jagath J.

    2005-05-01

This study provides an improved methodology for investigating the trade-offs between the health risks and economic benefits of using atrazine in the agricultural sector by incorporating public attitude to pesticide management in the analysis. Regression models are developed to predict finished water atrazine concentration in high-risk community water supplies in the United States. The predicted finished water atrazine concentrations are then used in a health risk assessment. The computed health risks are compared with the total economic surplus in the U.S. corn market for different atrazine application rates using estimated demand and supply functions developed in this work. Analysis of different scenarios with consumer price premiums for chemical-free and reduced-chemical corn indicates that if society is willing to pay a price premium, risks can be reduced without a large reduction in the total economic surplus, and net benefits may be higher. The results also show that this methodology provides an improved scientific framework for future decision making and policy evaluation in pesticide management.
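
    For readers unfamiliar with the surplus calculation underlying such trade-off analyses, the following sketch computes consumer, producer, and total economic surplus for hypothetical linear demand and supply curves; the coefficients are illustrative and are not the functions estimated in the study.

      # Hypothetical linear inverse demand P = a - b*Q and inverse supply P = c + d*Q (Q in units, P in $/unit).
      a, b = 10.0, 0.02   # demand intercept and slope (illustrative)
      c, d = 2.0, 0.03    # supply intercept and slope (illustrative)

      # Market equilibrium: a - b*Q = c + d*Q.
      q_eq = (a - c) / (b + d)
      p_eq = a - b * q_eq

      consumer_surplus = 0.5 * (a - p_eq) * q_eq     # area between the demand curve and the price
      producer_surplus = 0.5 * (p_eq - c) * q_eq     # area between the price and the supply curve
      total_surplus = consumer_surplus + producer_surplus

      print(f"equilibrium quantity={q_eq:.1f}, price={p_eq:.2f}")
      print(f"consumer surplus={consumer_surplus:.1f}, producer surplus={producer_surplus:.1f}, total={total_surplus:.1f}")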

  18. Computer algorithms and applications used to assist the evaluation and treatment of adolescent idiopathic scoliosis: a review of published articles 2000-2009.

    PubMed

    Phan, Philippe; Mezghani, Neila; Aubin, Carl-Éric; de Guise, Jacques A; Labelle, Hubert

    2011-07-01

Adolescent idiopathic scoliosis (AIS) is a complex spinal deformity whose assessment and treatment present many challenges. Computer applications have been developed to assist clinicians. A literature review on computer applications used in AIS evaluation and treatment has been undertaken. The algorithms used, their accuracy and their clinical usability were analyzed. Computer applications have been used to create new classifications for AIS based on 2D and 3D features, assess scoliosis severity or risk of progression, and assist bracing and surgical treatment. It was found that classification accuracy could be improved using computer algorithms, that AIS patient follow-up and screening could be done using surface topography (thereby limiting radiation exposure), and that bracing and surgical treatment could be optimized using simulations. Yet few computer applications are routinely used in clinics. With the development of 3D imaging and databases, huge amounts of clinical and geometrical data need to be taken into consideration when researching and managing AIS. Computer applications based on advanced algorithms will be able to handle tasks that could otherwise not be done, potentially improving the management of AIS patients. Clinically oriented applications, and evidence that they can improve current care, will be required for their integration into the clinical setting.

  19. Impact of computer-assisted data collection, evaluation and management on the cancer genetic counselor's time providing patient care.

    PubMed

    Cohen, Stephanie A; McIlvried, Dawn E

    2011-06-01

Cancer genetic counseling sessions traditionally encompass collecting medical and family history information, evaluating that information for the likelihood of a genetic predisposition for a hereditary cancer syndrome, conveying that information to the patient, offering genetic testing when appropriate, obtaining consent and subsequently documenting the encounter with a clinic note and pedigree. Software programs exist to collect family and medical history information electronically, intending to improve the efficiency and simplicity of collecting, managing and storing these data. This study compares the genetic counselor's time spent on cancer genetic counseling tasks in a traditional model and in one using computer-assisted data collection, which is then used to generate a pedigree, risk assessment and consult note. Genetic counselor time spent collecting family and medical history and providing face-to-face counseling for a new patient session decreased from an average of 85 minutes to 69 minutes when using computer-assisted data collection. However, there was no statistically significant change in overall genetic counselor time spent on all aspects of the genetic counseling process, due to an increased amount of time spent generating an electronic pedigree and consult note. Improvements in the computer program's technical design would potentially minimize data manipulation. Certain aspects of this program, such as electronic collection of family history and risk assessment, appear effective in improving cancer genetic counseling efficiency, while others, such as generating an electronic pedigree and consult note, do not.

  20. Position of document holder and work related risk factors for neck pain among computer users: a narrative review.

    PubMed

    Ambusam, S; Baharudin, O; Roslizawati, N; Leonard, J

    2015-01-01

A document holder is used as a remedy to address occupational neck pain among computer users. An understanding of the effects of the document holder, along with other work-related risk factors present at computer workstations, requires attention. This article reviews current knowledge on the optimal location of the document holder during computer use and the associated work-related factors that may contribute to neck pain. A literature search was conducted of articles published from January 1990 to January 2014 in both the Science Direct and PubMed databases. The Medical Subject Headings (MeSH) keywords for the search were neck muscle OR head posture OR muscle tension OR muscle activity OR work related disorders OR neck pain AND/OR document location OR document holder OR source document OR copy screen holder. A document holder placed lateral to the screen was most preferred for reducing neck discomfort among occupational typists. A document placed flat on the work surface without a holder was least preferred. Head posture deviation and muscle activity increase when the document is placed flat on the surface compared with placement on a document holder. Work-related factors such as static posture, repetitive movement, prolonged sitting and awkward positions were risk factors for chronic neck pain. This review highlights the optimal location of the document holder for computer users to reduce neck pain. The importance of work-related risk factors for neck pain in occupational typists is also emphasized for clinical management.

  1. 17 CFR 1.17 - Minimum financial requirements for futures commission merchants and introducing brokers.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... internal risk management control system of the futures commission merchant; a description of how the... section; and (ii)(A) The readily marketable collateral is in the possession or control of the applicant or... accepted accounting principles. For the purposes of computing “net capital”, the term “liabilities”: (i...

  2. Web-Based Real Time Earthquake Forecasting and Personal Risk Management

    NASA Astrophysics Data System (ADS)

    Rundle, J. B.; Holliday, J. R.; Graves, W. R.; Turcotte, D. L.; Donnellan, A.

    2012-12-01

Earthquake forecasts have been computed by a variety of countries and economies world-wide for over two decades. For the most part, forecasts have been computed for insurance, reinsurance and underwriters of catastrophe bonds. One example is the Working Group on California Earthquake Probabilities that has been responsible for the official California earthquake forecast since 1988. However, in a time of increasingly severe global financial constraints, we are now moving inexorably towards personal risk management, wherein mitigating risk is becoming the responsibility of individual members of the public. Under these circumstances, open access to a variety of web-based tools, utilities and information is a necessity. Here we describe a web-based system that has been operational since 2009 at www.openhazards.com and www.quakesim.org. Models for earthquake physics and forecasting require input data, along with model parameters. The models we consider are the Natural Time Weibull (NTW) model for regional earthquake forecasting, together with models for activation and quiescence. These models use small earthquakes ("seismicity-based models") to forecast the occurrence of large earthquakes, either through varying rates of small earthquake activity, or via an accumulation of this activity over time. These approaches use data-mining algorithms combined with the ANSS earthquake catalog. The basic idea is to compute large earthquake probabilities using the number of small earthquakes that have occurred in a region since the last large earthquake. Each of these approaches has computational challenges associated with computing forecast information in real time. Using 25 years of data from the ANSS California-Nevada catalog of earthquakes, we show that real-time forecasting is possible at a grid scale of 0.1°. We have analyzed the performance of these models using Reliability/Attributes and standard Receiver Operating Characteristic (ROC) tests. We show how the Reliability and ROC tests allow us to judge data completeness and estimate error. It is clear from much of the analysis that data quality is a major limitation on the accurate computation of earthquake probabilities. We discuss the challenges and pitfalls in serving up these datasets over the web.
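
    As a hedged illustration of the seismicity-based idea described above (computing a large-earthquake probability from the count of small earthquakes since the last large event), the sketch below evaluates a Weibull-type conditional probability in natural time; the parameter values and counts are hypothetical and do not correspond to the published NTW model calibration.

      import math

      def large_eq_probability(n_small, n_scale, shape):
          """Weibull-type cumulative probability in 'natural time' (small-earthquake count).

          n_small: small earthquakes counted since the last large event
          n_scale: characteristic count (hypothetical calibration parameter)
          shape:   Weibull shape parameter (hypothetical)
          """
          return 1.0 - math.exp(-((n_small / n_scale) ** shape))

      def conditional_probability(n_now, n_future, n_scale, shape):
          """Probability of a large event before n_future counts, given none by n_now."""
          p_now = large_eq_probability(n_now, n_scale, shape)
          p_future = large_eq_probability(n_future, n_scale, shape)
          return (p_future - p_now) / (1.0 - p_now)

      # Example with invented numbers: 800 small events so far, forecast window adds 200 more.
      print(f"P(large event in window) = {conditional_probability(800, 1000, 1200, 1.4):.3f}")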

  3. Public Risk Assessment Program

    NASA Technical Reports Server (NTRS)

    Mendeck, Gavin

    2010-01-01

The Public Entry Risk Assessment (PERA) program addresses risk to the public from shuttle or other spacecraft re-entry trajectories. Managing public risk to acceptable levels is a major component of safe spacecraft operation. PERA is given scenario inputs of vehicle trajectory, probability of failure along that trajectory, the resulting debris characteristics, and field size and distribution, and returns risk metrics that quantify the individual and collective risk posed by that scenario. Due to the large volume of data required to perform such a risk analysis, PERA was designed to streamline the analysis process by using innovative mathematical analysis of the risk assessment equations. Real-time analysis in the event of a shuttle contingency operation, such as damage to the Orbiter, is possible because PERA allows for a change to the probability of failure models, thereby providing a much quicker estimation of public risk. PERA also provides the ability to generate movie files showing how the entry risk changes as the entry develops. PERA was designed to streamline the computation of the enormous amounts of data needed for this type of risk assessment by using an average distribution of debris on the ground, rather than pinpointing the impact point of every piece of debris. This has reduced the amount of computational time significantly without reducing the accuracy of the results. PERA was written in MATLAB; a compiled version can run from a DOS or UNIX prompt.
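
    The averaged-debris-distribution approach described above can be illustrated with a standard expected-casualty style calculation: sum, over ground cells, the probability that debris lands in the cell times the population density and an effective casualty area. The sketch below uses entirely invented numbers and is not the PERA implementation.

      # Collective risk E_c ~ sum over ground cells of P(debris in cell) * population density * casualty area.
      # All numbers are invented for illustration.
      cells = [
          # (probability debris lands in this cell, population density [people per km^2])
          (0.10, 5.0),      # sparsely populated coastline
          (0.05, 250.0),    # suburban strip
          (0.01, 1500.0),   # small city
          (0.84, 0.0),      # open ocean
      ]
      casualty_area_km2 = 2.0e-4   # assumed effective casualty area of the averaged debris field

      expected_casualties = sum(p * rho * casualty_area_km2 for p, rho in cells)
      print(f"collective risk (expected casualties): {expected_casualties:.2e}")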

  4. A Systems Engineering Framework for Implementing a Security and Critical Patch Management Process in Diverse Environments (Academic Departments' Workstations)

    NASA Astrophysics Data System (ADS)

    Mohammadi, Hadi

    Use of the Patch Vulnerability Management (PVM) process should be seriously considered for any networked computing system. The PVM process prevents the operating system (OS) and software applications from being attacked due to security vulnerabilities, which lead to system failures and critical data leakage. The purpose of this research is to create and design a Security and Critical Patch Management Process (SCPMP) framework based on Systems Engineering (SE) principles. This framework will assist Information Technology Department Staff (ITDS) to reduce IT operating time and costs and mitigate the risk of security and vulnerability attacks. Further, this study evaluates implementation of the SCPMP in the networked computing systems of an academic environment in order to: 1. Meet patch management requirements by applying SE principles. 2. Reduce the cost of IT operations and PVM cycles. 3. Improve the current PVM methodologies to prevent networked computing systems from becoming the targets of security vulnerability attacks. 4. Embed a Maintenance Optimization Tool (MOT) in the proposed framework. The MOT allows IT managers to make the most practicable choice of methods for deploying and installing released patches and vulnerability remediation. In recent years, there has been a variety of frameworks for security practices in every networked computing system to protect computer workstations from becoming compromised or vulnerable to security attacks, which can expose important information and critical data. I have developed a new mechanism for implementing PVM for maximizing security-vulnerability maintenance, protecting OS and software packages, and minimizing SCPMP cost. To increase computing system security in any diverse environment, particularly in academia, one must apply SCPMP. I propose an optimal maintenance policy that will allow ITDS to measure and estimate the variation of PVM cycles based on their department's requirements. My results demonstrate that MOT optimizes the process of implementing SCPMP in academic workstations.

  5. Assistive technologies for self-managed pressure ulcer prevention in spinal cord injury: a scoping review.

    PubMed

    Tung, James Y; Stead, Brent; Mann, William; Ba'Pham; Popovic, Milos R

    2015-01-01

    Pressure ulcers (PUs) in individuals with spinal cord injury (SCI) present a persistent and costly problem. Continuing effort in developing new technologies that support self-managed care is an important prevention strategy. Specifically, the aims of this scoping review are to review the key concepts and factors related to self-managed prevention of PUs in individuals with SCI and appraise the technologies available to assist patients in self-management of PU prevention practices. There is broad consensus that sustaining long-term adherence to prevention regimens is a major concern. Recent literature highlights the interactions between behavioral and physiological risk factors. We identify four technology categories that support self-management: computer-based educational technologies demonstrated improved short-term gains in knowledge (2 studies), interface pressure mapping technologies demonstrated improved adherence to pressure-relief schedules up to 3 mo (5 studies), electrical stimulation confirmed improvements in tissue tolerance after 8 wk of training (3 studies), and telemedicine programs demonstrated improvements in independence and reduced hospital visits over 6 mo (2 studies). Overall, self-management technologies demonstrated low-to-moderate effectiveness in addressing a subset of risk factors. However, the effectiveness of technologies in preventing PUs is limited due to a lack of incidence reporting. In light of the key findings, we recommend developing integrated technologies that address multiple risk factors.

  6. Radiation dose management for pediatric cardiac computed tomography: a report from the Image Gently 'Have-A-Heart' campaign.

    PubMed

    Rigsby, Cynthia K; McKenney, Sarah E; Hill, Kevin D; Chelliah, Anjali; Einstein, Andrew J; Han, B Kelly; Robinson, Joshua D; Sammet, Christina L; Slesnick, Timothy C; Frush, Donald P

    2018-01-01

    Children with congenital or acquired heart disease can be exposed to relatively high lifetime cumulative doses of ionizing radiation from necessary medical imaging procedures including radiography, fluoroscopic procedures including diagnostic and interventional cardiac catheterizations, electrophysiology examinations, cardiac computed tomography (CT) studies, and nuclear cardiology examinations. Despite the clinical necessity of these imaging studies, the related ionizing radiation exposure could pose an increased lifetime attributable cancer risk. The Image Gently "Have-A-Heart" campaign is promoting the appropriate use of medical imaging studies in children with congenital or acquired heart disease while minimizing radiation exposure. The focus of this manuscript is to provide a comprehensive review of radiation dose management and CT performance in children with congenital or acquired heart disease.

  7. Participatory video-assisted evaluation of truck drivers' work outside cab: deliveries in two types of transport.

    PubMed

    Reiman, Arto; Pekkala, Janne; Väyrynen, Seppo; Putkonen, Ari; Forsman, Mikael

    2014-01-01

The aim of this study was to identify risks and ergonomic discomfort during the work of local and short-haul delivery truck drivers outside the cab. The study used a video- and computer-based method (VIDAR). VIDAR is a participatory method for identifying demanding work situations and their potential risks. The drivers' work was videoed and analysed by the subjects and by ergonomists. Delivery truck drivers should not be perceived as one group with equal risks, because there were significant differences between the 2 types of transportation and specific types of risks. VIDAR produces visual material for risk management processes. As a participatory approach, VIDAR stimulates active discussion about work-related risks and discomfort, and about possibilities for improvement. VIDAR may also be applied to work that involves different working environments.

  8. Evaluation of Enhanced Risk Monitors for Use on Advanced Reactors

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ramuhalli, Pradeep; Veeramany, Arun; Bonebrake, Christopher A.

This study provides an overview of the methodology for integrating time-dependent failure probabilities into nuclear power reactor risk monitors. This prototypic enhanced risk monitor (ERM) methodology was evaluated using a hypothetical probabilistic risk assessment (PRA) model, generated using a simplified design of a liquid-metal-cooled advanced reactor (AR). Component failure data from industry compilations of failures of components similar to those in the simplified AR model were used to initialize the PRA model. Core damage frequency (CDF) over time was computed and analyzed. In addition, a study on alternative risk metrics for ARs was conducted. Risk metrics that quantify the normalized cost of repairs, replacements, or other operations and management (O&M) actions were defined and used, along with an economic model, to compute the likely economic risk of future actions such as deferred maintenance, based on the anticipated change in CDF due to current component condition and future anticipated degradation. Such integration of conventional-risk metrics with alternate-risk metrics provides a convenient mechanism for assessing the impact of O&M decisions on the safety and economics of the plant. It is expected that, when integrated with supervisory control algorithms, such integrated-risk monitors will provide a mechanism for real-time control decision-making that ensures safety margins are maintained while operating the plant in an economically viable manner.
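
    To give a feel for how time-dependent failure probabilities and an economic model can be combined in an enhanced risk monitor, the sketch below updates a CDF-like metric from component failure rates and converts a postulated maintenance deferral into an expected cost; the fault logic, rates, and costs are all hypothetical and greatly simplified relative to a real PRA.

      import math

      def failure_probability(rate_per_year, years_in_service):
          """Time-dependent failure probability under an assumed exponential degradation model."""
          return 1.0 - math.exp(-rate_per_year * years_in_service)

      # Hypothetical two-train system: the undesired event requires both trains to fail (AND gate).
      rate_a, rate_b = 5e-3, 1e-2      # per-year failure rates (invented)
      consequence_cost = 5e8           # assumed cost of a core-damage-level event ($)
      repair_cost = 2e5                # assumed cost of doing the maintenance now ($)

      for t in [0, 2, 4, 6, 8, 10]:
          p_a = failure_probability(rate_a, t)
          p_b = failure_probability(rate_b, t)
          cdf_proxy = p_a * p_b        # simplified CDF-like metric for the AND gate
          # Economic risk of deferring maintenance: expected cost = P(failure) * consequence cost.
          expected_risk_cost = cdf_proxy * consequence_cost
          decision = "defer" if expected_risk_cost < repair_cost else "maintain now"
          print(f"year {t:2d}: CDF proxy {cdf_proxy:.2e}, expected risk cost ${expected_risk_cost:,.0f} -> {decision}")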

  9. The CEOS Global Observation Strategy for Disaster Risk Management: An Enterprise Architect's View

    NASA Astrophysics Data System (ADS)

    Moe, K.; Evans, J. D.; Frye, S.

    2013-12-01

    The Committee on Earth Observation Satellites (CEOS) Working Group on Information Systems and Services (WGISS), on behalf of the Global Earth Observation System of Systems (GEOSS), is defining an enterprise architecture (known as GA.4.D) for the use of satellite observations in international disaster management. This architecture defines the scope and structure of the disaster management enterprise (based on disaster types and phases); its processes (expressed via use cases / system functions); and its core values (in particular, free and open data sharing via standard interfaces). The architecture also details how a disaster management enterprise describes, obtains, and handles earth observations and data products for decision-support; and how it draws on distributed computational services for streamlined operational capability. We have begun to apply this architecture to a new CEOS initiative, the Global Observation Strategy for Disaster Risk Management (DRM). CEOS is defining this Strategy based on the outcomes of three pilot projects focused on seismic hazards, volcanoes, and floods. These pilots offer a unique opportunity to characterize and assess the impacts (benefits / costs) of the GA.4.D architecture in practice. In particular, the DRM Floods Pilot is applying satellite-based optical and radar data to flood mitigation, warning, and response, including monitoring and modeling at regional to global scales. It is focused on serving user needs and building local institutional / technical capacity in the Caribbean, Southern Africa, and Southeast Asia. In the context of these CEOS DRM Pilots, we are characterizing where and how the GA.4D architecture helps participants to: - Understand the scope and nature of hazard events quickly and accurately - Assure timely delivery of observations into analysis, modeling, and decision-making - Streamline user access to products - Lower barriers to entry for users or suppliers - Streamline or focus field operations in disaster reduction - Reduce redundancies and gaps in inter-organizational systems - Assist in planning / managing / prioritizing information and computing resources - Adapt computational resources to new technologies or evolving user needs - Sustain capability for the long term Insights from this exercise are helping us to abstract best practices applicable to other contexts, disaster types, and disaster phases, whereby local communities can improve their use of satellite data for greater preparedness. This effort is also helping to assess the likely impacts and roles of emerging technologies (such as cloud computing, "Big Data" analysis, location-based services, crowdsourcing, semantic services, small satellites, drones, direct broadcast, or model webs) in future disaster management activities.

  10. ICARUSS, the Integrated Care for the Reduction of Secondary Stroke trial: rationale and design of a randomized controlled trial of a multimodal intervention to prevent recurrent stroke in patients with a recent cerebrovascular event, ACTRN = 12611000264987.

    PubMed

    Joubert, J; Davis, S M; Hankey, G J; Levi, C; Olver, J; Gonzales, G; Donnan, G A

    2015-07-01

The majority of strokes, both ischaemic and haemorrhagic, are attributable to a relatively small number of risk factors which are readily manageable in the primary care setting. Implementation of best-practice recommendations for risk factor management is calculated to reduce stroke recurrence by around 80%. However, risk factor management in stroke survivors has generally been poor at the primary care level. A model of care that supports long-term effective risk factor management is needed. To determine whether the model of Integrated Care for the Reduction of Recurrent Stroke (ICARUSS) will, through promotion of implementation of best-practice recommendations for risk factor management, reduce the combined incidence of stroke, myocardial infarction and vascular death in patients with recent stroke or transient ischaemic attack (TIA) of the brain or eye. A prospective, Australian, multicentre, randomized controlled trial. Academic stroke units in Melbourne, Perth and the John Hunter Hospital, New South Wales. 1000 stroke survivors recruited from March 2007 with a recent (<3 months) stroke (ischaemic or haemorrhagic) or a TIA (brain or eye). Randomization and data collection are performed by means of a central computer-generated telephone system (IVRS). Exposure to the ICARUSS model of integrated care or usual care. The composite of stroke, MI or death from any vascular cause, whichever occurs first. Risk factor management in the community, depression, quality of life, disability and dementia. With 1000 patients followed up for a median of one year, and a recurrence rate of 7-10% per year in patients exposed to usual care, the study will have at least 80% power to detect a significant reduction in primary end-points. The ICARUSS study aims to recruit and follow up patients between 2007 and 2013 and demonstrate the effectiveness of exposure to the ICARUSS model in stroke survivors to reduce recurrent stroke or vascular events and promote the implementation of best-practice risk factor management at the primary care level. © 2015 World Stroke Organization.

  11. Short-term Forecasting Tools for Agricultural Nutrient Management.

    PubMed

    Easton, Zachary M; Kleinman, Peter J A; Buda, Anthony R; Goering, Dustin; Emberston, Nichole; Reed, Seann; Drohan, Patrick J; Walter, M Todd; Guinan, Pat; Lory, John A; Sommerlot, Andrew R; Sharpley, Andrew

    2017-11-01

    The advent of real-time, short-term farm management tools is motivated by the need to protect water quality above and beyond the general guidance offered by existing nutrient management plans. Advances in high-performance computing and hydrologic or climate modeling have enabled rapid dissemination of real-time information that can assist landowners and conservation personnel with short-term management planning. This paper reviews short-term decision support tools for agriculture that are under various stages of development and implementation in the United States: (i) Wisconsin's Runoff Risk Advisory Forecast (RRAF) System, (ii) New York's Hydrologically Sensitive Area Prediction Tool, (iii) Virginia's Saturated Area Forecast Model, (iv) Pennsylvania's Fertilizer Forecaster, (v) Washington's Application Risk Management (ARM) System, and (vi) Missouri's Design Storm Notification System. Although these decision support tools differ in their underlying model structure, the resolution at which they are applied, and the hydroclimates to which they are relevant, all provide forecasts (range 24-120 h) of runoff risk or soil moisture saturation derived from National Weather Service Forecast models. Although this review highlights the need for further development of robust and well-supported short-term nutrient management tools, their potential for adoption and ultimate utility requires an understanding of the appropriate context of application, the strategic and operational needs of managers, access to weather forecasts, scales of application (e.g., regional vs. field level), data requirements, and outreach communication structure. Copyright © by the American Society of Agronomy, Crop Science Society of America, and Soil Science Society of America, Inc.
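
    As a generic illustration of how a forecast-driven runoff-risk flag can be computed (not the internals of any of the state tools named above), the sketch below applies the widely used SCS curve-number relation to a forecast rainfall depth and flags high runoff risk when the predicted runoff exceeds a threshold; the curve number, rainfall, and threshold are assumptions.

      def scs_runoff_mm(rain_mm, curve_number):
          """SCS curve-number runoff estimate (metric form, standard 0.2*S initial abstraction)."""
          s = 25400.0 / curve_number - 254.0          # potential maximum retention (mm)
          ia = 0.2 * s                                # initial abstraction (mm)
          if rain_mm <= ia:
              return 0.0
          return (rain_mm - ia) ** 2 / (rain_mm - ia + s)

      # Hypothetical 48 h forecast for a field with an assumed curve number of 85 (wet antecedent conditions).
      forecast_rain_mm = 35.0
      runoff = scs_runoff_mm(forecast_rain_mm, curve_number=85)
      threshold_mm = 5.0                              # assumed decision threshold for nutrient application
      flag = "HIGH runoff risk - delay application" if runoff > threshold_mm else "low runoff risk"
      print(f"predicted runoff {runoff:.1f} mm -> {flag}")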

  12. Examining Behavioral Consultation plus Computer-Based Implementation Planning on Teachers' Intervention Implementation in an Alternative School

    ERIC Educational Resources Information Center

    Long, Anna C. J.; Sanetti, Lisa M. Hagermoser; Lark, Catherine R.; Connolly, Jennifer J. G.

    2018-01-01

    Students who demonstrate the most challenging behaviors are at risk of school failure and are often placed in alternative schools, in which a primary goal is remediating behavioral and academic concerns to facilitate students' return to their community school. Consistently implemented evidence-based classroom management is necessary toward this…

  13. A municipal forest report card: Results for California, USA

    Treesearch

    E.Gregory McPherson; Louren Kotow

    2013-01-01

    This study integrates two existing computer programs, the Pest Vulnerability Matrix and i-Tree Streets, into a decision-support tool for assessing municipal forest stability and recommending strategies to mitigate risk of loss. A report card concept was developed to communicate levels of performance in terms that managers and the public easily understand. Grades were...

  14. Significance of screening electrocardiogram before the initiation of amitriptyline therapy in children with functional abdominal pain.

    PubMed

    Patra, Kamakshya P; Sankararaman, Senthilkumar; Jackson, Robert; Hussain, Sunny Z

    2012-09-01

Amitriptyline (AMT) is commonly used in the management of children with irritable bowel syndrome. AMT is pro-arrhythmogenic and increases the risk of sudden cardiac death. However, there are insufficient data regarding cardiac toxicity at therapeutic doses of AMT in children and the need for a screening electrocardiogram (EKG). Errors in computer EKG interpretation are not uncommon. In a risk-prevention study, the authors sought to identify the true incidence of prolonged corrected QT (QTc) interval and other arrhythmias in children with irritable bowel syndrome before the initiation of AMT. Of the 760 EKGs screened, 3 EKGs demonstrated a truly prolonged QTc after careful manual reading by a pediatric cardiologist, and these were not detected by the computer-generated readings. The authors conclude that a screening EKG should always be performed on children before initiating AMT therapy. In addition, the computer-generated EKG interpretation needs to be verified by a pediatric cardiologist to avoid serious misinterpretations.
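
    For context on what such a screen computes, the sketch below derives the rate-corrected QT interval with Bazett's formula and applies a commonly cited prolongation cutoff; the measurements and the exact cutoff used in practice vary, so treat the numbers as illustrative assumptions rather than clinical guidance.

      import math

      def qtc_bazett_ms(qt_ms, heart_rate_bpm):
          """Rate-corrected QT (Bazett): QTc = QT / sqrt(RR), with RR in seconds."""
          rr_s = 60.0 / heart_rate_bpm
          return qt_ms / math.sqrt(rr_s)

      # Illustrative screening values (not real patient data).
      measurements = [
          {"patient": "A", "qt_ms": 380, "hr_bpm": 75},
          {"patient": "B", "qt_ms": 400, "hr_bpm": 110},
      ]
      cutoff_ms = 460  # assumed prolongation threshold; institutional cutoffs differ

      for m in measurements:
          qtc = qtc_bazett_ms(m["qt_ms"], m["hr_bpm"])
          status = "prolonged - cardiology review before AMT" if qtc > cutoff_ms else "within assumed limits"
          print(f"patient {m['patient']}: QTc {qtc:.0f} ms ({status})")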

  15. Risk Assessment Overview

    NASA Technical Reports Server (NTRS)

    Prassinos, Peter G.; Lyver, John W., IV; Bui, Chinh T.

    2011-01-01

Risk assessment is used in many industries to identify and manage risks. Initially developed for use on aeronautical and nuclear systems, risk assessment has been applied to transportation, chemical, computer, financial, and security systems, among others. It is used to gain an understanding of the weaknesses or vulnerabilities in a system so modifications can be made to increase operability, efficiency, and safety and to reduce failure and down-time. Risk assessment results are primary inputs to risk-informed decision making, in which risk information, including uncertainty, is used along with other pertinent information to assist management in the decision-making process. Therefore, to be useful, a risk assessment must be directed at specific objectives. As the world embraces the globalization of trade and manufacturing, understanding the associated risks becomes important to decision making. Applying risk assessment techniques to a global system of development, manufacturing, and transportation can provide insight into how the system can fail, the likelihood of system failure, and the consequences of system failure. The risk assessment can identify those elements that contribute most to risk and identify measures to prevent and mitigate failures, disruptions, and damaging outcomes. In addition, risks associated with public and environmental impact can be identified. The risk insights gained can be applied to making decisions concerning suitable development and manufacturing locations, supply chains, and transportation strategies. While risk assessment has mostly been applied to mechanical and electrical systems, the concepts and techniques can be applied across other systems and activities. This paper provides a basic overview of the development of a risk assessment.

  16. Flood Forecasting in Wales: Challenges and Solutions

    NASA Astrophysics Data System (ADS)

    How, Andrew; Williams, Christopher

    2015-04-01

With steep, fast-responding river catchments, exposed coastal reaches with large tidal ranges, and large population densities in some of the most at-risk areas, flood forecasting in Wales presents many varied challenges. Advances in computing power and learning from best practice within the United Kingdom and abroad have led to significant improvements in recent years; however, many challenges still remain. Developments in computing and increased processing power come with a significant price tag; greater numbers of data sources and ensemble feeds bring a better understanding of uncertainty, but the wealth of data needs careful management to ensure a clear message of risk is disseminated; new modelling techniques make use of better and faster computation, but lack the history of record and experience gained from the continued use of more established forecasting models. As a flood forecasting team we work to develop coastal and fluvial forecasting models, set them up for operational use and manage the duty role that runs the models in real time. An overview of our current operational flood forecasting system will be presented, along with a discussion of some of the solutions we have in place to address the challenges we face. These include: • real-time updating of fluvial models • rainfall forecast verification • ensemble forecast data • longer range forecast data • contingency models • offshore to nearshore wave transformation • calculation of wave overtopping
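
    One simple way ensemble forecast feeds can be turned into a clear statement of risk, conceptually related to (but much simpler than) an operational forecasting system, is to report the fraction of ensemble members whose peak level exceeds a flood threshold; the sketch below does this for invented river-level traces.

      # Fraction of ensemble members exceeding a flood threshold -> simple probabilistic risk message.
      # River-level traces (metres) for each ensemble member are invented for illustration.
      ensemble_levels = [
          [1.2, 1.8, 2.4, 2.1],
          [1.1, 1.6, 1.9, 1.7],
          [1.3, 2.0, 2.7, 2.5],
          [1.0, 1.4, 1.6, 1.5],
          [1.2, 1.9, 2.6, 2.2],
      ]
      flood_threshold_m = 2.3   # assumed property-flooding level at this gauge

      exceeding = sum(1 for member in ensemble_levels if max(member) >= flood_threshold_m)
      probability = exceeding / len(ensemble_levels)
      print(f"{exceeding}/{len(ensemble_levels)} members exceed {flood_threshold_m} m "
            f"-> forecast flood probability ~{probability:.0%}")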

  17. Medical imaging and computers in the diagnosis of breast cancer

    NASA Astrophysics Data System (ADS)

    Giger, Maryellen L.

    2014-09-01

Computer-aided diagnosis (CAD) and quantitative image analysis (QIA) methods (i.e., computerized methods of analyzing digital breast images: mammograms, ultrasound, and magnetic resonance images) can yield novel image-based tumor and parenchyma characteristics, i.e., signatures that may ultimately contribute to the design of patient-specific breast cancer management plans. The role of QIA/CAD has been expanding beyond screening programs towards applications in risk assessment, diagnosis, prognosis, and response to therapy, as well as in data mining to discover relationships of image-based lesion characteristics with genomics and other phenotypes as they apply to disease states. These various computer-based applications are demonstrated through research examples from the Giger Lab.
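
    To make the notion of image-based lesion characteristics concrete, the sketch below computes a few elementary features (size, mean intensity, contrast, heterogeneity) from a synthetic lesion mask with NumPy; real QIA/CAD pipelines use far richer, validated feature sets, so this is only a toy illustration.

      import numpy as np

      rng = np.random.default_rng(1)

      # Synthetic 64x64 "image" with a bright circular lesion (all values invented).
      yy, xx = np.mgrid[0:64, 0:64]
      mask = (yy - 32) ** 2 + (xx - 32) ** 2 <= 10 ** 2          # lesion mask
      image = rng.normal(100, 10, (64, 64)) + 60 * mask          # background noise + lesion signal

      lesion = image[mask]
      background = image[~mask]

      features = {
          "area_px": float(mask.sum()),
          "mean_intensity": float(lesion.mean()),
          "contrast": float(lesion.mean() - background.mean()),
          "intensity_heterogeneity": float(lesion.std()),
      }
      for name, value in features.items():
          print(f"{name}: {value:.2f}")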

  18. Integrated risk/cost planning models for the US Air Traffic system

    NASA Technical Reports Server (NTRS)

    Mulvey, J. M.; Zenios, S. A.

    1985-01-01

    A prototype network planning model for the U.S. Air Traffic control system is described. The model encompasses the dual objectives of managing collision risks and transportation costs where traffic flows can be related to these objectives. The underlying structure is a network graph with nonseparable convex costs; the model is solved efficiently by capitalizing on its intrinsic characteristics. Two specialized algorithms for solving the resulting problems are described: (1) truncated Newton, and (2) simplicial decomposition. The feasibility of the approach is demonstrated using data collected from a control center in the Midwest. Computational results with different computer systems are presented, including a vector supercomputer (CRAY-XMP). The risk/cost model has two primary uses: (1) as a strategic planning tool using aggregate flight information, and (2) as an integrated operational system for forecasting congestion and monitoring (controlling) flow throughout the U.S. In the latter case, access to a supercomputer is required due to the model's enormous size.
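
    As a much-reduced illustration of minimizing a convex cost-plus-risk objective over network flows (in the spirit of, though far simpler than, the simplicial decomposition and truncated Newton solvers mentioned above), the sketch below uses a Frank-Wolfe style loop to split a fixed demand between two routes whose delay cost and collision-risk proxy both grow convexly with flow; the cost functions and parameters are invented.

      # Split a fixed demand D between two routes to minimize a convex cost + risk objective.
      # Frank-Wolfe style iterations (a simple relative of simplicial decomposition); all parameters invented.
      D = 100.0  # flights per hour to be routed

      def marginal_cost(flow, free_cost, capacity, risk_weight):
          # Derivative of cost(f) = free_cost*f + f**2/capacity + risk_weight*f**3/(3*capacity**2)
          return free_cost + 2.0 * flow / capacity + risk_weight * (flow / capacity) ** 2

      routes = [
          {"free_cost": 1.0, "capacity": 60.0, "risk_weight": 5.0},   # shorter route, tighter sector
          {"free_cost": 1.5, "capacity": 90.0, "risk_weight": 2.0},   # longer route, more spare capacity
      ]
      flows = [D, 0.0]  # start with all demand on route 0

      for k in range(1, 200):
          mc = [marginal_cost(f, r["free_cost"], r["capacity"], r["risk_weight"])
                for f, r in zip(flows, routes)]
          best = mc.index(min(mc))                       # all-or-nothing assignment to the cheapest route
          target = [D if i == best else 0.0 for i in range(len(routes))]
          step = 2.0 / (k + 2.0)                         # standard Frank-Wolfe step size
          flows = [(1 - step) * f + step * t for f, t in zip(flows, target)]

      final_mc = [marginal_cost(f, r["free_cost"], r["capacity"], r["risk_weight"])
                  for f, r in zip(flows, routes)]
      print("route flows:", [round(f, 1) for f in flows])
      print("marginal costs (approximately equal at the optimum):", [round(m, 3) for m in final_mc])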

  19. Gas Migration Project: Risk Assessment Tool and Computational Analyses to Investigate Wellbore/Mine Interactions, Secretary's Potash Area, Southeastern New Mexico

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sobolik, Steven R.; Hadgu, Teklu; Rechard, Robert P.

The Bureau of Land Management (BLM), US Department of the Interior has asked Sandia National Laboratories (SNL) to perform scientific studies relevant to technical issues that arise in the development of co-located resources of potash and petroleum in southeastern New Mexico in the Secretary's Potash Area. The BLM manages resource development, issues permits and interacts with the State of New Mexico in the process of developing regulations, in an environment where many issues are disputed by industry stakeholders. The present report is a deliverable of the study of the potential for gas migration from a wellbore to a mine opening in the event of wellbore leakage, a risk scenario about which there is disagreement among stakeholders and little previous site-specific analysis. One goal of this study was to develop a framework that required collaboratively developed inputs and analytical approaches in order to encourage stakeholder participation and to employ ranges of data values and scenarios. SNL presents here a description of a basic risk assessment (RA) framework that will fulfill the initial steps of meeting that goal. SNL used the gas migration problem to set up example conceptual models, parameter sets and computer models, and as a foundation for future development of RA to support BLM resource development.

  20. Managing industrial risk--having a tested and proven system to prevent and assess risk.

    PubMed

    Heller, Stephen

    2006-03-17

Some relatively easy techniques exist to improve the risk picture/profile and aid in preventing losses. Today, with the advent of computer system resources, risk analysis that focuses on specific aspects of risk through systematic scoring and comparison can be relatively easy to achieve. Techniques like these demonstrate how working experience and common sense can be combined mathematically into a flexible risk management tool, or risk model, for analyzing risk. The risk assessment methodology provided by companies today is no longer the ideas and practices of one group or even one company; it reflects the practice of many companies, as well as the ideas and expertise of academia and government regulators. The use of multi-criteria decision making (MCDM) techniques for making critical decisions has been recognized for many years for a variety of purposes. In today's computer age, the easy access to and user-friendly nature of these techniques make them a favorable choice for use in the risk assessment environment. The new user of these methodologies should find many ideas directly applicable to his or her needs when approaching risk decision making, and should find them readily adapted, with slight modification, to accurately reflect a specific situation using MCDM techniques. This makes them an attractive option for use in assessment and risk modeling. The main advantage of decision-making techniques such as MCDM is that they can be applied in the early stages of a risk assessment, when accurate data on industrial risks and failures are lacking. In most cases, the available data are still insufficient to perform a thorough risk assessment using purely statistical concepts, and the practical advantages of deviating from a strictly data-driven protocol seem to outweigh the drawbacks. Industry failure data often come at a high cost, when a loss occurs. We can benefit from this unfortunate acquisition of data by continuously refining our decisions and incorporating the new information into our assessments. MCDM techniques offer the flexibility to compare broad data sets and reflect our best estimate of each factor's contribution to the risk picture. This allows the more probable and more consequential issues to be identified accurately; these can later be refined using more intensive risk techniques, while less critical issues are set aside.
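
    A minimal weighted-sum MCDM scoring example, with invented hazards, criteria, weights, and scores, is sketched below to show how systematic scoring and comparison can rank risk issues when hard failure data are scarce.

      # Weighted-sum multi-criteria scoring of hazards (all weights and scores are invented).
      criteria_weights = {"likelihood": 0.40, "consequence": 0.35, "detectability": 0.25}

      # Scores on a 1 (best) to 5 (worst) scale, elicited from experts in this hypothetical example.
      hazards = {
          "pump seal leak":       {"likelihood": 4, "consequence": 2, "detectability": 2},
          "reactor overpressure": {"likelihood": 2, "consequence": 5, "detectability": 3},
          "loading-dock spill":   {"likelihood": 3, "consequence": 3, "detectability": 1},
      }

      def weighted_score(scores, weights):
          return sum(weights[c] * scores[c] for c in weights)

      ranking = sorted(hazards, key=lambda h: weighted_score(hazards[h], criteria_weights), reverse=True)
      for hazard in ranking:
          print(f"{hazard:22s} score = {weighted_score(hazards[hazard], criteria_weights):.2f}")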

  1. Self managing experiment resources

    NASA Astrophysics Data System (ADS)

    Stagni, F.; Ubeda, M.; Tsaregorodtsev, A.; Romanovskiy, V.; Roiser, S.; Charpentier, P.; Graciani, R.

    2014-06-01

In this paper we present an autonomic computing resources management system, used by LHCb for assessing the status of their Grid resources. Virtual Organization Grids include heterogeneous resources. For example, LHC experiments very often use resources not provided by WLCG, and Cloud Computing resources will soon provide a non-negligible fraction of their computing power. The lack of standards and procedures across experiments and sites has led to the appearance of multiple information systems, monitoring tools, ticket portals, etc., which nowadays coexist and represent a very valuable source of information for the computing systems of running HEP experiments, as well as for sites. These two facts lead to many particular solutions for a general problem: managing the experiment resources. Here we present how LHCb, via the DIRAC interware, addressed these issues. With a renewed Central Information Schema hosting all resource metadata and a Status System (Resource Status System) delivering real-time information, the system controls the resource topology, independently of the resource types. The Resource Status System applies data-mining techniques to all available information sources and assesses status changes, which are then propagated to the topology description. Obviously, giving full control to such an automated system is not risk-free. Therefore, in order to minimise the probability of misbehaviour, a battery of tests has been developed to certify the correctness of its assessments. We demonstrate the performance and efficiency of such a system in terms of cost reduction and reliability.

  2. Work stress and risk factors for health management trainees in canakkale, Turkey.

    PubMed

    Tanışman, Beyhan; Cevizci, Sibel; Çelik, Merve; Sevim, Sezgin

    2014-10-01

This study aims to investigate the general mental health situation, work-related stress and risk factors of health management trainees. This cross-sectional study was conducted on Health Management Masters students (N=96) at Canakkale Onsekiz Mart University Health Sciences Institute in May-June 2014. A total of 58 students who voluntarily participated in the study were reached (60.42%). Participants completed a 22-question sociodemographic survey form and a 12-item General Health Questionnaire in a face-to-face interview. Data were analyzed using SPSS software version 20.0. The average age of participants was 36.4±6.2 (Min:24-Max:62) years. Thirty-five of the participants were female (60.3%) and 23 were male (39.7%). The numbers of people using cigarettes and alcohol were 23 (39.7%) and 9 (15.8%), respectively. According to the GHQ scale, 32 participants (55.2%) were at risk of depression. Eighty-six percent of participants reported experiencing work stress. The most frequently reported sources of stress were superiors (56.8%), the work itself (41.3%), and work colleagues (25.8%). There was no significant difference between those at risk of depression and those not at risk in terms of gender, marital status, educational level, age, work-related factors (daily work, computer use, duration of sitting at a desk), sleep duration, presence of chronic disease, substance use (cigarettes, alcohol), regular exercise, regular meals, fast-food consumption, sufficient family time and vacations (p>0.05). Our results indicate that the majority of participants reported experiencing work stress, with more than half at high risk of developing depression. The most frequently reported risk factors were superiors, the work itself and colleagues. Psychosocial risk factors in the work environment should be investigated in greater psychological, sociological and ergonomic detail to reduce the risk of health management trainees experiencing work stress and mental health problems.

  3. Work Stress and Risk Factors For Health Management Trainees in Canakkale, Turkey

    PubMed Central

    Tanışman, Beyhan; Cevizci, Sibel; Çelik, Merve; Sevim, Sezgin

    2014-01-01

Aim: This study aims to investigate the general mental health situation, work-related stress and risk factors of health management trainees. Methods: This cross-sectional study was conducted on Health Management Masters students (N=96) at Canakkale Onsekiz Mart University Health Sciences Institute in May-June 2014. A total of 58 students who voluntarily participated in the study were reached (60.42%). Participants completed a 22-question sociodemographic survey form and a 12-item General Health Questionnaire in a face-to-face interview. Data were analyzed using SPSS software version 20.0. Results: The average age of participants was 36.4±6.2 (Min:24-Max:62) years. Thirty-five of the participants were female (60.3%) and 23 were male (39.7%). The numbers of people using cigarettes and alcohol were 23 (39.7%) and 9 (15.8%), respectively. According to the GHQ scale, 32 participants (55.2%) were at risk of depression. Eighty-six percent of participants reported experiencing work stress. The most frequently reported sources of stress were superiors (56.8%), the work itself (41.3%), and work colleagues (25.8%). There was no significant difference between those at risk of depression and those not at risk in terms of gender, marital status, educational level, age, work-related factors (daily work, computer use, duration of sitting at a desk), sleep duration, presence of chronic disease, substance use (cigarettes, alcohol), regular exercise, regular meals, fast-food consumption, sufficient family time and vacations (p>0.05). Conclusions: Our results indicate that the majority of participants reported experiencing work stress, with more than half at high risk of developing depression. The most frequently reported risk factors were superiors, the work itself and colleagues. Psychosocial risk factors in the work environment should be investigated in greater psychological, sociological and ergonomic detail to reduce the risk of health management trainees experiencing work stress and mental health problems. PMID:25568633

  4. Computer model predicting breakthrough febrile urinary tract infection in children with primary vesicoureteral reflux.

    PubMed

    Arlen, Angela M; Alexander, Siobhan E; Wald, Moshe; Cooper, Christopher S

    2016-10-01

Factors influencing the decision to surgically correct vesicoureteral reflux (VUR) include the risk of breakthrough febrile urinary tract infection (fUTI) or renal scarring, and a decreased likelihood of spontaneous resolution. Improved identification of children at risk for recurrent fUTI may impact management decisions and allow for more individualized VUR management. We have developed and investigated the accuracy of a multivariable computational model to predict the probability of breakthrough fUTI in children with primary VUR. Children with primary VUR and detailed clinical and voiding cystourethrogram (VCUG) data were identified. Patient demographics, VCUG findings including grade, laterality, and bladder volume at onset of VUR, UTI history, presence of bladder-bowel dysfunction (BBD), and breakthrough fUTI were assessed. The VCUG dataset was randomized into a training set of 288 with a separate representational cross-validation set of 96. Various model types and architectures were investigated using neUROn++, a set of C++ programs. Two hundred fifty-five children (208 girls, 47 boys) diagnosed with primary VUR at a mean age of 3.1 years (±2.6) met all inclusion criteria. A total of 384 VCUGs were analyzed. Median follow-up was 24 months (interquartile range 12-52 months). Sixty-eight children (26.7%) experienced 90 breakthrough fUTI events. Dilating VUR, reflux occurring at low bladder volumes, BBD, and a history of multiple infections/fUTI were associated with breakthrough fUTI (Table). A 2-hidden-node neural network model had the best fit, with a receiver operating characteristic curve area of 0.755 for predicting breakthrough fUTI. The risk of recurrent febrile infections, renal parenchymal scarring, and the likelihood of spontaneous resolution, as well as parental preference, all influence management of primary VUR. The genesis of UTI is multifactorial, making precise prediction of an individual child's risk of breakthrough fUTI challenging. Demonstrated risk factors for UTI include age, gender, VUR grade, reflux at low bladder volume, BBD, and UTI history. We developed a prognostic calculator using a multivariable model with 76% accuracy that can be deployed on the Internet, allowing input variables to be entered to calculate the odds of an individual child developing a breakthrough fUTI. A computational model using multiple variables, including bladder volume at onset of VUR, provides individualized prediction of children at risk for breakthrough fUTI. A web-based prognostic calculator based on this model will provide a useful tool for assessing personalized risk of breakthrough fUTI in children with primary VUR. Copyright © 2016 Journal of Pediatric Urology Company. Published by Elsevier Ltd. All rights reserved.
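
    A toy version of a small neural-network risk model of this kind (two hidden nodes, binary outcome) can be sketched with scikit-learn as below; the synthetic features stand in for variables like VUR grade, bladder volume at reflux onset, BBD, and infection history, and none of the coefficients or performance figures relate to the published model.

      import numpy as np
      from sklearn.neural_network import MLPClassifier
      from sklearn.preprocessing import StandardScaler
      from sklearn.pipeline import make_pipeline
      from sklearn.model_selection import train_test_split
      from sklearn.metrics import roc_auc_score

      rng = np.random.default_rng(42)

      # Synthetic cohort: columns stand in for reflux grade, bladder volume at VUR onset (% of capacity),
      # a bladder-bowel dysfunction flag, and the number of prior febrile UTIs. All values are invented.
      n = 400
      X = np.column_stack([
          rng.integers(1, 6, n),
          rng.uniform(10, 100, n),
          rng.integers(0, 2, n),
          rng.poisson(1.0, n),
      ]).astype(float)
      # Assumed data-generating rule for the toy outcome (breakthrough fUTI yes/no).
      logit = -3.0 + 0.5 * X[:, 0] - 0.02 * X[:, 1] + 0.8 * X[:, 2] + 0.6 * X[:, 3]
      y = rng.binomial(1, 1.0 / (1.0 + np.exp(-logit)))

      X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25, random_state=0)

      # Two hidden nodes, as in the kind of small network described in the abstract.
      model = make_pipeline(StandardScaler(),
                            MLPClassifier(hidden_layer_sizes=(2,), max_iter=5000, random_state=0))
      model.fit(X_train, y_train)

      prob = model.predict_proba(X_test)[:, 1]   # individualized breakthrough-fUTI probabilities
      print(f"toy ROC AUC: {roc_auc_score(y_test, prob):.3f}")
      print(f"predicted risk for the first test child: {prob[0]:.1%}")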

  5. Design Analysis Kit for Optimization and Terascale Applications 6.0

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    2015-10-19

Sandia's Dakota software (available at http://dakota.sandia.gov) supports science and engineering transformation through advanced exploration of simulations. Specifically it manages and analyzes ensembles of simulations to provide broader and deeper perspective for analysts and decision makers. This enables them to: (1) enhance understanding of risk, (2) improve products, and (3) assess simulation credibility. In its simplest mode, Dakota can automate typical parameter variation studies through a generic interface to a computational model. However, Dakota also delivers advanced parametric analysis techniques enabling design exploration, optimization, model calibration, risk analysis, and quantification of margins and uncertainty with such models. It directly supports verification and validation activities. The algorithms implemented in Dakota aim to address challenges in performing these analyses with complex science and engineering models from desktop to high performance computers.

  6. Space shuttle low cost/risk avionics study

    NASA Technical Reports Server (NTRS)

    1971-01-01

    All work breakdown structure elements containing any avionics related effort were examined for pricing the life cycle costs. The analytical, testing, and integration efforts are included for the basic onboard avionics and electrical power systems. The design and procurement of special test equipment and maintenance and repair equipment are considered. Program management associated with these efforts is described. Flight test spares and labor and materials associated with the operations and maintenance of the avionics systems throughout the horizontal flight test are examined. It was determined that cost savings can be achieved by using existing hardware, maximizing orbiter-booster commonality, specifying new equipments to MIL quality standards, basing redundancy on cost effective analysis, minimizing software complexity and reducing cross strapping and computer-managed functions, utilizing compilers and floating point computers, and evolving the design as dictated by the horizontal flight test schedules.

  7. Study of space shuttle orbiter system management computer function. Volume 1: Analysis, baseline design

    NASA Technical Reports Server (NTRS)

    1975-01-01

    A system analysis of the shuttle orbiter baseline system management (SM) computer function is performed. This analysis results in an alternative SM design which is also described. The alternative design exhibits several improvements over the baseline, some of which are increased crew usability, improved flexibility, and improved growth potential. The analysis consists of two parts: an application assessment and an implementation assessment. The former is concerned with the SM user needs and design functional aspects. The latter is concerned with design flexibility, reliability, growth potential, and technical risk. The system analysis is supported by several topical investigations. These include: treatment of false alarms, treatment of off-line items, significant interface parameters, and a design evaluation checklist. An in-depth formulation of techniques, concepts, and guidelines for design of automated performance verification is discussed.

  8. A pilot audit of a protocol for ambulatory investigation of predicted low-risk patients with possible pulmonary embolism.

    PubMed

    McDonald, A H; Murphy, R

    2011-09-01

Patients with possible pulmonary embolism (PE) commonly present to acute medical services. Research has led to the identification of low-risk patients suitable for ambulatory management. We report on a protocol designed to select low-risk patients for ambulatory investigation if confirmatory imaging is not available that day. The protocol was piloted in the Emergency Department and Medical Assessment Area at the Royal Infirmary of Edinburgh. We retrospectively analysed electronic patient records in an open observational audit of all patients managed in the ambulatory arm over five months of use. We analysed 45 patients' records. Of these, 91.1% required imaging to confirm or refute PE, and 62.2% received a computed tomography pulmonary angiogram (CTPA). PE was confirmed in 25% of patients, with musculoskeletal pain (22.7%) and respiratory tract infection (15.9%) the next most prevalent diagnoses. An alternative diagnosis was provided by CTPA in 32% of cases. We identified no adverse events or readmissions, but individualised follow-up was not attempted. The data from this audit suggest that this protocol can be applied to select and manage low-risk patients suitable for ambulatory investigation of possible PE. A larger prospective comparative study would be required to accurately define the safety and effectiveness of this protocol.

  9. A Semantic Approach with Decision Support for Safety Service in Smart Home Management

    PubMed Central

    Huang, Xiaoci; Yi, Jianjun; Zhu, Xiaomin; Chen, Shaoli

    2016-01-01

Research on smart homes (SHs) has increased significantly in recent years because of the convenience provided by having an assisted living environment. Among the functions of SHs considered in previous studies, however, safety services are seldom discussed. Thus, this study proposes a semantic approach with decision support for safety service in SH management. The focus of this contribution is to explore a context awareness and reasoning approach for risk recognition in SH that enables the proper decision support for flexible safety service provision. The framework of SH based on a wireless sensor network is described from the perspective of neighbourhood management. This approach is based on the integration of semantic knowledge in which a reasoner can make decisions about risk recognition and safety service. We present a management ontology for an SH and relevant monitoring contextual information, which considers its suitability in a pervasive computing environment and is service-oriented. We also propose a rule-based reasoning method to provide decision support through reasoning techniques and context-awareness. A system prototype is developed to evaluate the feasibility, time response and extendibility of the approach. The evaluation of our approach shows that it is more effective in daily risk event recognition. The decisions for service provision are shown to be accurate. PMID:27527170

  10. A Semantic Approach with Decision Support for Safety Service in Smart Home Management.

    PubMed

    Huang, Xiaoci; Yi, Jianjun; Zhu, Xiaomin; Chen, Shaoli

    2016-08-03

    Research on smart homes (SHs) has increased significantly in recent years because of the convenience provided by having an assisted living environment. However, among the functions of SHs considered in previous studies, safety services in particular are seldom discussed. Thus, this study proposes a semantic approach with decision support for safety service in SH management. The focus of this contribution is to explore a context awareness and reasoning approach for risk recognition in SH that enables the proper decision support for flexible safety service provision. The framework of SH based on a wireless sensor network is described from the perspective of neighbourhood management. This approach is based on the integration of semantic knowledge in which a reasoner can make decisions about risk recognition and safety service. We present a management ontology for a SH and relevant monitoring contextual information, which considers its suitability in a pervasive computing environment and is service-oriented. We also propose a rule-based reasoning method to provide decision support through reasoning techniques and context-awareness. A system prototype is developed to evaluate the feasibility, response time and extendibility of the approach. The evaluation of our approach shows that it is more effective in daily risk event recognition. The decisions for service provision are shown to be accurate.

  11. BRICK v0.2, a simple, accessible, and transparent model framework for climate and regional sea-level projections

    NASA Astrophysics Data System (ADS)

    Wong, Tony E.; Bakker, Alexander M. R.; Ruckert, Kelsey; Applegate, Patrick; Slangen, Aimée B. A.; Keller, Klaus

    2017-07-01

    Simple models can play pivotal roles in the quantification and framing of uncertainties surrounding climate change and sea-level rise. They are computationally efficient, transparent, and easy to reproduce. These qualities also make simple models useful for the characterization of risk. Simple model codes are increasingly distributed as open source, as well as actively shared and guided. Alas, computer codes used in the geosciences can often be hard to access, run, modify (e.g., with regards to assumptions and model components), and review. Here, we describe the simple model framework BRICK (Building blocks for Relevant Ice and Climate Knowledge) v0.2 and its underlying design principles. The paper adds detail to an earlier published model setup and discusses the inclusion of a land water storage component. The framework largely builds on existing models and allows for projections of global mean temperature as well as regional sea levels and coastal flood risk. BRICK is written in R and Fortran. BRICK gives special attention to the model values of transparency, accessibility, and flexibility in order to mitigate the above-mentioned issues while maintaining a high degree of computational efficiency. We demonstrate the flexibility of this framework through simple model intercomparison experiments. Furthermore, we demonstrate that BRICK is suitable for risk assessment applications by using a didactic example in local flood risk management.
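
    The "building blocks" design can be illustrated schematically: total sea-level change is assembled by summing independently swappable component models. The sketch below is in Python for illustration only; BRICK itself is written in R and Fortran, and the toy component functions and coefficients here are not BRICK's model equations or calibrated parameters.

```python
# Schematic illustration of "building block" model composition in the spirit
# of BRICK: total sea-level change is assembled from independently swappable
# component models. The component functions below are toy placeholders, not
# BRICK's actual (R/Fortran) model equations or calibrated parameters.

def thermal_expansion(temp_anomaly, coef=0.2):
    return coef * temp_anomaly          # metres per degree, illustrative only

def glaciers(temp_anomaly, coef=0.1):
    return coef * temp_anomaly

def land_water_storage(year, rate=0.0003):
    return rate * (year - 2000)         # simple linear trend, illustrative

def total_sea_level(year, temp_anomaly, components):
    """Sum the contributions of whichever building blocks are plugged in."""
    return sum(f(year, temp_anomaly) for f in components)

if __name__ == "__main__":
    blocks = [
        lambda yr, t: thermal_expansion(t),
        lambda yr, t: glaciers(t),
        lambda yr, t: land_water_storage(yr),
    ]
    print(round(total_sea_level(2100, 3.0, blocks), 3), "m (toy example)")
```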

  12. The Contribution of Human Factors in Military System Development: Methodological Considerations

    DTIC Science & Technology

    1980-07-01

    Topics include risk/uncertainty analysis, project scoring, utility scales, relevance tree techniques (reverse factor analysis), and computer simulation. Cited work includes Souder, W.E., on the effectiveness of mathematical models for R&D project selection (Management Science, April 1973) and on a scoring methodology. Index terms include proficiency (written test scores), radiation (effects of radiation environments on aircrew performance), and reaction time.

  13. Quantifying the impact of seasonal and short-term manure application decisions on phosphorus loss in surface runoff

    USDA-ARS?s Scientific Manuscript database

    Agricultural phosphorus (P) management is a pressing research and policy issue due to concerns about P loss from fields and water quality degradation. Better information is especially needed on the risk of P loss from dairy manure applied to fields in winter. We used the SurPhos computer model to as...

  14. Using Mobile Health (mHealth) Technology in the Management of Diabetes Mellitus, Physical Inactivity, and Smoking.

    PubMed

    Rehman, Hasan; Kamal, Ayeesha K; Sayani, Saleem; Morris, Pamela B; Merchant, Anwar T; Virani, Salim S

    2017-04-01

    Cardiovascular mortality remains high due to insufficient progress made in managing cardiovascular risk factors such as diabetes mellitus, physical inactivity, and smoking. Healthy lifestyle choices play an important role in the management of these modifiable risk factors. Mobile health or mHealth is defined as the use of mobile computing and communication technologies (i.e., mobile phones, wearable sensors) for the delivery of health services and health-related information. In this review, we examine some recent studies that utilized mHealth tools to improve management of these risk factors, with examples from developing countries where available. The mHealth intervention used depends on the availability of resources. While developing countries are often restricted to text messages, more resourceful settings are shifting towards mobile phone applications and wearable technology. Diabetes mellitus has been extensively studied in different settings, and results have been encouraging. Tools utilized to increase physical activity are expensive, and studies have been limited to resource-abundant areas and have shown mixed results. Smoking cessation has had promising initial results with the use of technology, but mHealth's ability to recruit participants beyond those actively seeking to quit has not been established. mHealth interventions appear to be a potential tool in improving control of cardiovascular risk factors that rely on individuals making healthy lifestyle choices. Data related to clinical impact, if any, of commercially available tools is lacking. More studies are needed to assess interventions that target multiple cardiovascular risk factors and their impact on hard cardiovascular outcomes.

  15. Natural history of splenic vascular abnormalities after blunt injury: A Western Trauma Association multicenter trial.

    PubMed

    Zarzaur, Ben L; Dunn, Julie A; Leininger, Brian; Lauerman, Margaret; Shanmuganathan, Kathirkamanthan; Kaups, Krista; Zamary, Kirellos; Hartwell, Jennifer L; Bhakta, Ankur; Myers, John; Gordy, Stephanie; Todd, Samuel R; Claridge, Jeffrey A; Teicher, Erik; Sperry, Jason; Privette, Alicia; Allawi, Ahmed; Burlew, Clay Cothren; Maung, Adrian A; Davis, Kimberly A; Cogbill, Thomas; Bonne, Stephanie; Livingston, David H; Coimbra, Raul; Kozar, Rosemary A

    2017-12-01

    Following blunt splenic injury, there is conflicting evidence regarding the natural history and appropriate management of patients with vascular injuries of the spleen such as pseudoaneurysms or blushes. The purpose of this study was to describe the current management and outcomes of patients with pseudoaneurysm or blush. Data were collected on adult (aged ≥18 years) patients with blunt splenic injury and a splenic vascular injury from 17 trauma centers. Demographic, physiologic, radiographic, and injury characteristics were gathered. Management and outcomes were collected. Univariate and multivariable analyses were used to determine factors associated with splenectomy. Two hundred patients with a vascular abnormality on computed tomography scan were enrolled. Of those, 14.5% were managed with early splenectomy. Of the remaining patients, 59% underwent angiography and embolization (ANGIO), and 26.5% were observed. Of those who underwent ANGIO, 5.9% had a repeat ANGIO, and 6.8% had splenectomy. Of those observed, 9.4% had a delayed ANGIO, and 7.6% underwent splenectomy. There were no statistically significant differences between those observed and those who underwent ANGIO. There were 111 computed tomography scans with splenic vascular injuries available for review by an expert trauma radiologist. The concordance between the original classification of the type of vascular abnormality and the expert radiologist's interpretation was 56.3%. Based on expert review, the presence of an actively bleeding vascular injury was associated with a 40.9% risk of splenectomy. This was significantly higher than those with a nonbleeding vascular injury. In this series, the vast majority of patients are managed with ANGIO and usually embolization, whereas splenectomy remains a rare event. However, patients with a bleeding vascular injury of the spleen are at high risk of nonoperative failure, no matter the strategy used for management. This group may warrant closer observation or an alternative management strategy. Prognostic study, level III.

  16. Towards reducing thrombogenicity of LVAD therapy: optimizing surgical and patient management strategies

    NASA Astrophysics Data System (ADS)

    Chivukula, Venkat Keshav; Lafzi, Ali; Mokadam, Nahush; Beckman, Jennifer; Mahr, Claudius; Aliseda, Alberto

    2017-11-01

    Unfavourable hemodynamics in heart failure patients implanted with left ventricular assist devices (LVAD), due to non-optimal surgical configurations and patient management, strongly influence thrombogenicity. This is consistent with the increase in devastating thromboembolic complications (specifically thrombosis and stroke) in patients, even as the risk of thrombosis inside the device decreases with modern designs. Inflow cannula and outflow graft surgical configurations have been optimized via patient-specific modeling that computes the thrombogenic potential with a combination of Eulerian (endothelial) wall shear stress and Lagrangian (platelet shear history) tracking. Using this view of hemodynamics, the benefits of intermittent aortic valve opening (promoting washout and reducing stagnant flow in the aortic valve region) have been assessed in managing the patient's residual native cardiac output. The use of this methodology to understand the contribution of the hemodynamics in the flow surrounding the LVAD itself to thrombogenesis shows promise in developing holistic patient-specific management strategies to minimize stroke risk and enhance the efficacy of LVAD therapy. Funded in part by an AHA postdoctoral fellowship 16POST30520004.
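
    As a rough illustration of the Lagrangian part of such a metric, a common accumulation-type measure is the time-integral of scalar shear stress sampled along a platelet pathline; the sketch below assumes that formulation and synthetic data, and is not the authors' exact thrombogenic-potential computation.

```python
# Minimal sketch of a Lagrangian platelet "shear history" as the time-integral
# of scalar shear stress sampled along a particle pathline. This is a common
# accumulation-type metric shown only as an illustration; it is not necessarily
# the exact formulation used in the cited LVAD study.
import numpy as np

def shear_history(times, shear_stress):
    """Trapezoidal integral of shear stress [Pa] over time [s] along one path."""
    dt = np.diff(times)
    avg = 0.5 * (shear_stress[1:] + shear_stress[:-1])
    return float(np.sum(avg * dt))

if __name__ == "__main__":
    t = np.linspace(0.0, 0.5, 200)                 # 0.5 s transit, illustrative
    tau = 5.0 + 3.0 * np.sin(2 * np.pi * 4 * t)    # synthetic stress signal [Pa]
    print(f"accumulated shear history: {shear_history(t, tau):.2f} Pa*s")
```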

  17. Integrated approach for managing health risks at work--the role of occupational health nurses.

    PubMed

    Marinescu, Luiza G

    2007-02-01

    Currently, many organizations are using a department-centered approach to manage health risks at work. In such a model, segregated departments are providing employee benefits such as health insurance, workers' compensation, and short- and long-term disability or benefits addressing work-life issues. In recent years, a new model has emerged: health and productivity management (HPM). This is an employee-centered, integrated approach, designed to increase efficiency, reduce competition for scarce resources, and increase employee participation in prevention activities. Evidence suggests that corporations using integrated HPM programs achieve better health outcomes for their employees, with consequent increased productivity and decreased absenteeism. Occupational health nurses are well positioned to assume leadership roles in their organizations by coordinating efforts and programs across departments that offer health, wellness, and safety benefits. To assume their role as change agents to improve employees' health, nurses should start using the language of business more often by improving their communication skills, computer skills, and ability to quantify and articulate results of programs and services to senior management.

  18. A new DoD initiative: the Computational Research and Engineering Acquisition Tools and Environments (CREATE) program

    NASA Astrophysics Data System (ADS)

    Arevalo, S.; Atwood, C.; Bell, P.; Blacker, T. D.; Dey, S.; Fisher, D.; Fisher, D. A.; Genalis, P.; Gorski, J.; Harris, A.; Hill, K.; Hurwitz, M.; Kendall, R. P.; Meakin, R. L.; Morton, S.; Moyer, E. T.; Post, D. E.; Strawn, R.; Veldhuizen, D. v.; Votta, L. G.; Wynn, S.; Zelinski, G.

    2008-07-01

    In FY2008, the U.S. Department of Defense (DoD) initiated the Computational Research and Engineering Acquisition Tools and Environments (CREATE) program, a $360M program with a two-year planning phase and a ten-year execution phase. CREATE will develop and deploy three computational engineering tool sets for DoD acquisition programs to use to design aircraft, ships and radio-frequency antennas. The planning and execution of CREATE are based on the 'lessons learned' from case studies of large-scale computational science and engineering projects. The case studies stress the importance of a stable, close-knit development team; a focus on customer needs and requirements; verification and validation; flexible and agile planning, management, and development processes; risk management; realistic schedules and resource levels; balanced short- and long-term goals and deliverables; and stable, long-term support by the program sponsor. Since it began in FY2008, the CREATE program has built a team and project structure, developed requirements and begun validating them, identified candidate products, established initial connections with the acquisition programs, begun detailed project planning and development, and generated the initial collaboration infrastructure necessary for success by its multi-institutional, multidisciplinary teams.

  19. Prediction of Symptomatic Embolism in Filipinos With Infective Endocarditis Using the Embolic Risk French Calculator

    PubMed Central

    Aherrera, Jaime Alfonso M.; Abola, Maria Teresa B.; Balabagno, Maria Margarita O.; Abrahan, Lauro L.; Magno, Jose Donato A.; Reganit, Paul Ferdinand M.; Punzalan, Felix Eduardo R.

    2016-01-01

    Background Cardioembolic events are life-threatening complications of infective endocarditis (IE). The embolic risk French calculator estimates the embolic risk in IE computed on admission. Variables in this tool include age, diabetes, atrial fibrillation, prior embolism, vegetation length, and Staphylococcus aureus on culture. A computed risk of > 7% was considered high in the development of this tool. Knowledge of this risk applied in our local setting is important to guide clinicians in preventing such catastrophic complications. Among patients with IE, we aim to determine the efficacy of the embolic risk French calculator, using a computed score of > 7%, in predicting major embolic events. Methods All adults admitted from 2013 to 2016 with definite IE were included. The risk for embolic events was computed on admission. All were monitored for the duration of admission for the occurrence of the primary outcome (any major embolic event: arterial emboli, intracranial hemorrhage, pulmonary infarcts, or aneurysms). Secondary outcomes were: 1) composite of death and embolic events; and 2) death from any cause. Results Eighty-seven adults with definite IE were included. The majority had valvular heart disease and preserved ejection fraction (EF). The mitral valve was most commonly involved. Embolic events occurred in 25 (29%). Multivariate analysis identified a high embolic score > 7% (relative risk (RR): 15.12, P < 0.001), vegetation area ≥ 18 mm2 (RR: 6.39, P < 0.01), and a prior embolism (RR: 5.18, P = 0.018) to be independent predictors of embolic events. For the composite of embolic events and death, independent predictors were a high score of > 7% (RR: 13.56, P < 0.001) and a prior embolus (RR: 13.75, P = 0.002). Independent predictors of death were a high score > 7% (RR: 6.20, P = 0.003) and EF ≤ 45% (RR: 9.91, P = 0.004). Conclusion Cardioembolic events are more prevalent in our study compared to previous data. The embolic risk French calculator is a useful tool to estimate and predict risk for embolic events and in-hospital mortality. The risk of developing embolic events should be weighed against the risks of early preventive cardiac surgery, so as to institute timely and appropriate management. PMID:28197281
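
    For illustration, a calculator of this general form can be sketched from the six variables listed in the abstract (age, diabetes, atrial fibrillation, prior embolism, vegetation length, S. aureus); the intercept and weights below are hypothetical placeholders rather than the published French calculator coefficients, and only the >7% high-risk cutoff is taken from the text.

```python
# Illustrative sketch of a six-variable embolic risk score of the kind the
# abstract describes (age, diabetes, atrial fibrillation, prior embolism,
# vegetation length, Staphylococcus aureus). The coefficients below are
# hypothetical placeholders -- NOT the published French calculator weights;
# only the > 7% high-risk threshold is taken from the text.
import math

COEFS = {  # hypothetical log-odds weights for illustration only
    "age_per_decade": 0.10,
    "diabetes": 0.40,
    "atrial_fibrillation": 0.50,
    "prior_embolism": 0.80,
    "vegetation_mm": 0.05,
    "s_aureus": 0.60,
}
INTERCEPT = -4.0  # hypothetical

def embolic_risk(age, diabetes, afib, prior_embolism, vegetation_mm, s_aureus):
    """Return an illustrative predicted probability of a major embolic event."""
    z = (INTERCEPT
         + COEFS["age_per_decade"] * age / 10
         + COEFS["diabetes"] * diabetes
         + COEFS["atrial_fibrillation"] * afib
         + COEFS["prior_embolism"] * prior_embolism
         + COEFS["vegetation_mm"] * vegetation_mm
         + COEFS["s_aureus"] * s_aureus)
    return 1.0 / (1.0 + math.exp(-z))

if __name__ == "__main__":
    p = embolic_risk(age=62, diabetes=1, afib=0, prior_embolism=1,
                     vegetation_mm=15, s_aureus=1)
    print(f"predicted risk {p:.1%} -> {'high' if p > 0.07 else 'low'} (>7% cutoff)")
```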

  20. Computer-assisted versus oral-and-written dietary history taking for diabetes mellitus.

    PubMed

    Wei, Igor; Pappas, Yannis; Car, Josip; Sheikh, Aziz; Majeed, Azeem

    2011-12-07

    Diabetes is a chronic illness characterised by insulin resistance or deficiency, resulting in elevated glycosylated haemoglobin A1c (HbA1c) levels. Diet and adherence to dietary advice is associated with lower HbA1c levels and control of disease. Dietary history may be an effective clinical tool for diabetes management and has traditionally been taken by oral-and-written methods, although it can also be collected using computer-assisted history taking systems (CAHTS). Although CAHTS were first described in the 1960s, there remains uncertainty about the impact of these methods on dietary history collection, clinical care and patient outcomes such as quality of life. To assess the effects of computer-assisted versus oral-and-written dietary history taking on patient outcomes for diabetes mellitus. We searched The Cochrane Library (issue 6, 2011), MEDLINE (January 1985 to June 2011), EMBASE (January 1980 to June 2011) and CINAHL (January 1981 to June 2011). Reference lists of retrieved articles were also searched, and no limits were imposed on language or publication status. Randomised controlled trials of computer-assisted versus oral-and-written history taking in patients with diabetes mellitus. Two authors independently scanned the title and abstract of retrieved articles. Potentially relevant articles were investigated as full text. Studies that met the inclusion criteria were abstracted for relevant population and intervention characteristics, with any disagreements resolved by discussion or by a third party. Risk of bias was similarly assessed independently. Of the 2991 studies retrieved, only one study with 38 study participants compared the two methods of history taking over a total of eight weeks. The authors found that as patients became increasingly familiar with using CAHTS, the correlation between patients' food records and computer assessments improved. Reported fat intake decreased in the control group and increased when queried by the computer. The effect of the intervention on the management of diabetes mellitus and blood glucose levels was not reported. Risk of bias was considered moderate for this study. Based on one small study judged to be of moderate risk of bias, we tentatively conclude that CAHTS may be well received by study participants and potentially offer time savings in practice. However, more robust studies with larger sample sizes are needed to confirm these findings. We cannot draw any conclusions in relation to other clinical outcomes at this stage.

  1. [Risk and risk management in aviation].

    PubMed

    Müller, Manfred

    2004-10-01

    RISK MANAGEMENT: The large proportion of human error in aviation accidents suggested a solution that seemed brilliant at first sight: replace the fallible human being with an "infallible" digitally operating computer. However, even after the introduction of the so-called HITEC airplanes, human error still accounts for 75% of all accidents. Thus, if the computer is ruled out as the ultimate safety system, how else can complex operations involving quick and difficult decisions be controlled? OPTIMIZED TEAM INTERACTION/PARALLEL CONNECTION OF THOUGHT MACHINES: Since a single person is always "highly error-prone", support and control have to be guaranteed by a second person. The independent working of two minds results in a safety net that cushions human errors more effectively. NON-PUNITIVE ERROR MANAGEMENT: To be able to tackle the actual problems, the open discussion of errors that have occurred must not be endangered by the threat of punishment. It has been shown in the past that progress is primarily achieved by investigating and following up mistakes, failures and catastrophes shortly after they happen. HUMAN FACTOR RESEARCH PROJECT: A comprehensive survey showed the following result: by far the most frequent safety-critical situation (37.8% of all events) consists of the following combination of risk factors: 1. A complication develops. 2. In this situation of increased stress a human error occurs. 3. The negative effects of the error cannot be corrected or eased because there are deficiencies in team interaction on the flight deck. This means, for example, that a negative social climate acts like a "turbocharger" when a human error occurs. It needs to be pointed out that a negative social climate is not the same as an open dispute. In many cases the working climate is burdened without the responsible person even noticing it: a first negative impression, too much or too little respect, contempt, misunderstandings, failure to voice unclear concerns, etc. can considerably reduce the efficiency of a team.

  2. Potential economic value of drought information to support early warning in Africa

    NASA Astrophysics Data System (ADS)

    Quiroga, S.; Iglesias, A.; Diz, A.; Garrote, L.

    2012-04-01

    We present a methodology to estimate the economic value of advanced climate information for food production in Africa under climate change scenarios. The results aim to facilitate better choices in water resources management. The methodology includes four sequential steps. First, two contrasting management strategies (with and without early warning) are defined. Second, the associated impacts of the management actions are estimated by calculating the effect of drought on crop productivity under climate change scenarios. Third, the optimal management option is calculated as a function of the drought information and the risk aversion of potential information users. Finally, we use these optimal management simulations to compute the economic value of enhanced water allocation rules to support stable food production in Africa. Our results show how a timely response to climate variations can help reduce losses in food production. The proposed framework is developed within the Dewfora project (Early warning and forecasting systems to predict climate related drought vulnerability and risk in Africa), which aims to improve knowledge on drought forecasting, warning and mitigation, to advance the understanding of climate-related vulnerability to drought, and to develop a prototype operational forecasting system.
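
    The value-of-information idea in the final step can be sketched as the reduction in expected loss that the early warning makes possible; the probabilities and loss figures below are invented for illustration and are not Dewfora results.

```python
# Minimal sketch of the "economic value of information" idea in the abstract:
# compare expected food-production losses when management can react to a
# drought warning versus when it cannot. All probabilities and loss figures
# are invented for illustration; they are not Dewfora results.

def expected_loss(p_drought, loss_if_drought, loss_if_normal=0.0):
    return p_drought * loss_if_drought + (1 - p_drought) * loss_if_normal

def value_of_early_warning(p_drought, loss_no_warning, loss_with_warning):
    """Economic value = reduction in expected loss enabled by the warning."""
    return (expected_loss(p_drought, loss_no_warning)
            - expected_loss(p_drought, loss_with_warning))

if __name__ == "__main__":
    # Illustrative: 30% drought probability; adapted water allocation halves losses.
    print(value_of_early_warning(0.30, loss_no_warning=100.0, loss_with_warning=50.0),
          "(same monetary units as the loss figures)")
```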

  3. Management of occupational health risks in small-animal veterinary practices.

    PubMed

    D'Souza, Eva; Barraclough, Richard; Fishwick, David; Curran, Andrew

    2009-08-01

    Small-animal work is a major element of veterinary practice in the UK and may be hazardous, with high levels of work-related injuries and ill-health reported in Australia and the USA. There are no studies addressing the management of occupational health risks arising from small-animal work in the UK. To investigate the sources of health and safety information used and how health and safety and 12 specific occupational health risks are managed by practices. A cross-sectional postal survey of all small-animal veterinary practices in Hampshire. A response was mandatory as this was a Health & Safety Executive (HSE) inspection activity. A total of 118 (100%) practices responded, of which 93 were eligible for inclusion. Of these, 99% and 86%, respectively, were aware of the Royal College of Veterinary Surgeons (RCVS) practice standards and had British Small Animal Veterinary Association (BSAVA) staff members, while only 51% had previous contact with HSE (publications, advice and visits). Ninety per cent had health and safety policies, but only 31% had trained responsible staff in health and safety. Specific health hazards such as occupational allergens and computer use were relatively overlooked both by practices and by the RCVS/BSAVA guidance available in 2002. Failings in active health risk management systems could be due to a lack of training to ensure competence in those with responsibilities. Practices rely on guidance produced by their professional bodies. Current RCVS guidance, available since 2005, has remedied some previous omissions, but further improvements are recommended.

  4. Impact of concomitant trauma in the management of blunt splenic injuries.

    PubMed

    Lo, Albert; Matheson, Anne-Marie; Adams, Dave

    2004-09-10

    Conservative management of isolated blunt splenic injuries has become widely accepted for haemodynamically stable patients, but may be untenable in those with multiple injuries. A retrospective review was performed to evaluate our cumulative experience with non-operative management of splenic injuries, and to identify the risk factors for operative management. Eighty patients were identified. Demographics, mechanism of injury, injury severity score (ISS), clinical signs at presentation, utility of computed tomography scans and methods of treatment (operative management vs conservative management) were documented and statistically analysed to identify predictors for operative management. Initially, 45 patients (56%) were managed without operation, while 35 patients underwent urgent laparotomy, with 26 (74% of the operative group) having splenectomy performed. Two patients (out of 45) failed conservative management and required delayed splenectomy, a 96% success rate for intended conservative management. Thus, overall rates of 54% non-operative management and 65% splenic conservation were achieved. The mean ISS of the operative management group (ISS=30) was higher than that of the non-operative treatment group (ISS=13, p<0.05), reflecting not only the grade of the splenic injury but also the severity of concomitant trauma. Risk factors for patients with blunt splenic injuries requiring operative management include ISS > or =16, hypotension, GCS < or =13, and requirement for blood transfusion (p<0.05). Appropriate patient selection is the most important element of non-operative management. Patients with splenic injuries who are haemodynamically stable can be managed non-operatively with acceptable outcomes. However, in the presence of concomitant trauma, there is an increasing trend towards operative management.

  5. Cutaneous nocardiosis in two dogs receiving ciclosporin therapy for the management of canine atopic dermatitis.

    PubMed

    Siak, Meng K; Burrows, Amanda K

    2013-08-01

    Ciclosporin is a calcineurin inhibitor that is currently registered for the treatment of canine atopic dermatitis. The most common adverse effects include mild, transient gastrointestinal disturbances. Single case reports of opportunistic infections due to Nocardia spp., Neospora spp. and papillomaviruses have also been reported. This report describes cutaneous nocardiosis in two dogs receiving ciclosporin therapy for the management of canine atopic dermatitis, investigated by histopathology, PCR for Nocardia spp. and computed tomography. One dog developed disseminated nocardiosis due to Nocardia brasiliensis and a second dog developed localized cutaneous nocardiosis due to a novel Nocardia species subsequent to ciclosporin administration at the recommended dose rate for the management of canine atopic dermatitis. The second case was receiving a combination of ciclosporin and ketoconazole, and serum trough ciclosporin levels were elevated. Clinicians should be aware of the potential risk of systemic immunosuppression and subsequent infection with Nocardia spp. in dogs receiving ciclosporin. Measurement of serum ciclosporin levels may be useful in identifying those individuals at risk of opportunistic infections. © 2013 ESVD and ACVD.

  6. Designing an agricultural vegetative waste-management system under uncertain prices of treatment-technology output products.

    PubMed

    Broitman, D; Raviv, O; Ayalon, O; Kan, I

    2018-05-01

    Setting up a sustainable agricultural vegetative waste-management system is a challenging investment task, particularly when markets for output products of waste-treatment technologies are not well established. We conduct an economic analysis of possible investments in treatment technologies of agricultural vegetative waste, while accounting for fluctuating output prices. Under a risk-neutral approach, we find the range of output-product prices within which each considered technology becomes most profitable, using average final prices as the exclusive factor. Under a risk-averse perspective, we rank the treatment technologies based on their computed certainty-equivalent profits as functions of the coefficient of variation of the technologies' output prices. We find the ranking of treatment technologies based on average prices to be robust to output-price fluctuations provided that the coefficient of variation of the output prices is below about 0.4, that is, approximately twice as high as that of well-established recycled-material markets such as glass, paper and plastic. We discuss some policy implications that arise from our analysis regarding vegetative waste management and its associated risks. Copyright © 2018 Elsevier Ltd. All rights reserved.
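
    The certainty-equivalent ranking can be illustrated with a simple mean-variance approximation in which price risk enters through the coefficient of variation of output prices; this is a generic sketch of the idea, not the paper's model, and the technology names, prices, and risk-aversion value are invented.

```python
# Sketch of ranking waste-treatment technologies by certainty-equivalent (CE)
# profit when output prices fluctuate. A simple mean-variance approximation is
# used: CE = expected profit - 0.5 * risk_aversion * variance of revenue. This
# illustrates the general idea only; it is not the specific model in the paper.

def certainty_equivalent(mean_price, cv, quantity, cost, risk_aversion):
    """CE profit for one technology; cv is the coefficient of variation of price."""
    mean_profit = mean_price * quantity - cost
    std_revenue = cv * mean_price * quantity
    return mean_profit - 0.5 * risk_aversion * std_revenue ** 2

def rank_technologies(techs, risk_aversion):
    scored = {name: certainty_equivalent(*params, risk_aversion)
              for name, params in techs.items()}
    return sorted(scored.items(), key=lambda kv: kv[1], reverse=True)

if __name__ == "__main__":
    # name: (mean output price, price CV, annual output, annual cost) -- invented
    techs = {"composting": (30.0, 0.2, 1000, 20000),
             "anaerobic_digestion": (80.0, 0.5, 500, 25000)}
    for name, ce in rank_technologies(techs, risk_aversion=1e-4):
        print(f"{name}: CE profit = {ce:,.0f}")
```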

  7. The Adoption of Cloud Computing in the Field of Genomics Research: The Influence of Ethical and Legal Issues

    PubMed Central

    Charlebois, Kathleen; Palmour, Nicole; Knoppers, Bartha Maria

    2016-01-01

    This study aims to understand the influence of the ethical and legal issues on cloud computing adoption in the field of genomics research. To do so, we adapted Diffusion of Innovation (DoI) theory to enable understanding of how key stakeholders manage the various ethical and legal issues they encounter when adopting cloud computing. Twenty semi-structured interviews were conducted with genomics researchers, patient advocates and cloud service providers. Thematic analysis generated five major themes: 1) Getting comfortable with cloud computing; 2) Weighing the advantages and the risks of cloud computing; 3) Reconciling cloud computing with data privacy; 4) Maintaining trust and 5) Anticipating the cloud by creating the conditions for cloud adoption. Our analysis highlights the tendency among genomics researchers to gradually adopt cloud technology. Efforts made by cloud service providers to promote cloud computing adoption are confronted by researchers’ perpetual cost and security concerns, along with a lack of familiarity with the technology. Further underlying those fears are researchers’ legal responsibility with respect to the data that is stored on the cloud. Alternative consent mechanisms aimed at increasing patients’ control over the use of their data also provide a means to circumvent various institutional and jurisdictional hurdles that restrict access by creating siloed databases. However, the risk of creating new, cloud-based silos may run counter to the goal in genomics research to increase data sharing on a global scale. PMID:27755563

  8. The Adoption of Cloud Computing in the Field of Genomics Research: The Influence of Ethical and Legal Issues.

    PubMed

    Charlebois, Kathleen; Palmour, Nicole; Knoppers, Bartha Maria

    2016-01-01

    This study aims to understand the influence of the ethical and legal issues on cloud computing adoption in the field of genomics research. To do so, we adapted Diffusion of Innovation (DoI) theory to enable understanding of how key stakeholders manage the various ethical and legal issues they encounter when adopting cloud computing. Twenty semi-structured interviews were conducted with genomics researchers, patient advocates and cloud service providers. Thematic analysis generated five major themes: 1) Getting comfortable with cloud computing; 2) Weighing the advantages and the risks of cloud computing; 3) Reconciling cloud computing with data privacy; 4) Maintaining trust and 5) Anticipating the cloud by creating the conditions for cloud adoption. Our analysis highlights the tendency among genomics researchers to gradually adopt cloud technology. Efforts made by cloud service providers to promote cloud computing adoption are confronted by researchers' perpetual cost and security concerns, along with a lack of familiarity with the technology. Further underlying those fears are researchers' legal responsibility with respect to the data that is stored on the cloud. Alternative consent mechanisms aimed at increasing patients' control over the use of their data also provide a means to circumvent various institutional and jurisdictional hurdles that restrict access by creating siloed databases. However, the risk of creating new, cloud-based silos may run counter to the goal in genomics research to increase data sharing on a global scale.

  9. Splenic injury from colonoscopy: a review and management guidelines.

    PubMed

    Ghevariya, Vishal; Kevorkian, Noubar; Asarian, Armand; Anand, Sury; Krishnaiah, Mahesh

    2011-07-01

    Splenic injury is an uncommon complication of colonoscopy. Fewer than 100 cases are reported in the English language literature. The exact mechanism of injury to the spleen during colonoscopy is unknown; various authors propose several risk factors and possible mechanisms. Splenic injury can be graded or classified according to the extent of laceration and the severity of the resultant hematoma. The management options range from observation to emergency splenectomy. Computed tomography scan is the most important imaging modality to diagnose splenic injury. Early recognition and appropriate management are of paramount importance in the management of this condition. A high index of suspicion in a patient with persistent abdominal pain after colonoscopy is key, especially when a perforated viscus is ruled out. This article outlines the clinical presentation of splenic injury after colonoscopy and delineates a management algorithm.

  10. Transformation in the pharmaceutical industry--a systematic review of the literature.

    PubMed

    Shafiei, Nader; Ford, James L; Morecroft, Charles W; Lisboa, Paulo J; Taylor, Mark J; Mouzughi, Yusra

    2013-01-01

    The evolutionary development of pharmaceutical transformation was studied through systematic review of the literature. Fourteen triggers were identified that will affect the pharmaceutical business, regulatory science, and enabling technologies in future years. The relative importance ranking of the transformation triggers was computed based on their prevalence within the articles studied. The four main triggers with the strongest literature evidence were Fully Integrated Pharma Network, Personalized Medicine, Translational Research, and Pervasive Computing. The theoretical quality risks for each of the four main transformation triggers are examined, and the remaining ten triggers are described. The pharmaceutical industry is currently going through changes that affect the way it performs its research, manufacturing, and regulatory activities (this is termed pharmaceutical transformation). The impact of these changes on the approaches to quality risk management requires more understanding. In this paper, a comprehensive review of the academic, regulatory, and industry literature were used to identify 14 triggers that influence pharmaceutical transformation. The four main triggers, namely Fully Integrated Pharma Network, Personalized Medicine, Translational Research, and Pervasive Computing, were selected as the most important based on the strength of the evidence found during the literature review activity described in this paper. Theoretical quality risks for each of the four main transformation triggers are examined, and the remaining ten triggers are described.

  11. Software for pest-management science: computer models and databases from the United States Department of Agriculture-Agricultural Research Service.

    PubMed

    Wauchope, R Don; Ahuja, Lajpat R; Arnold, Jeffrey G; Bingner, Ron; Lowrance, Richard; van Genuchten, Martinus T; Adams, Larry D

    2003-01-01

    We present an overview of USDA Agricultural Research Service (ARS) computer models and databases related to pest-management science, emphasizing current developments in environmental risk assessment and management simulation models. The ARS has a unique national interdisciplinary team of researchers in surface and sub-surface hydrology, soil and plant science, systems analysis and pesticide science, who have networked to develop empirical and mechanistic computer models describing the behavior of pests, pest responses to controls and the environmental impact of pest-control methods. Historically, much of this work has been in support of production agriculture and in support of the conservation programs of our 'action agency' sister, the Natural Resources Conservation Service (formerly the Soil Conservation Service). Because we are a public agency, our software/database products are generally offered without cost, unless they are developed in cooperation with a private-sector cooperator. Because ARS is a basic and applied research organization, with development of new science as our highest priority, these products tend to be offered on an 'as-is' basis with limited user support, except within cooperative R&D relationships with other scientists. However, rapid changes in the technology for information analysis and communication continually challenge our way of doing business.

  12. System security in the space flight operations center

    NASA Technical Reports Server (NTRS)

    Wagner, David A.

    1988-01-01

    The Space Flight Operations Center is a networked system of workstation-class computers that will provide ground support for NASA's next generation of deep-space missions. The author recounts the development of the SFOC system security policy and discusses the various management and technology issues involved. Particular attention is given to risk assessment, security plan development, security implications of design requirements, automatic safeguards, and procedural safeguards.

  13. A Comparison of Computer-Assisted and Self-Management Programs for Reducing Alcohol Use among Students in First Year Experience Courses

    ERIC Educational Resources Information Center

    Lane, David J.; Lindemann, Dana F.; Schmidt, James A.

    2012-01-01

    The National Institute of Alcohol Abuse and Alcoholism has called for the use of evidence-based approaches to address high-risk drinking prevalent on many college campuses. In line with this recommendation, the present study evaluated the efficacy of two evidence-based approaches to reducing alcohol use. One hundred and three college students in…

  14. Online trust, trustworthiness, or assurance?

    PubMed

    Cheshire, Coye

    2011-01-01

    Every day, individuals around the world retrieve, share, and exchange information on the Internet. We interact online to share personal information, find answers to questions, make financial transactions, play social games, and maintain professional and personal relationships. Sometimes our online interactions take place between two or more humans. In other cases, we rely on computers to manage information on our behalf. In each scenario, risk and uncertainty are essential for determining possible actions and outcomes. This essay highlights common deficiencies in our understanding of key concepts such as trust, trustworthiness, cooperation, and assurance in online environments. Empirical evidence from experimental work in computer-mediated environments underscores the promises and perils of overreliance on security and assurance structures as replacements for interpersonal trust. These conceptual distinctions are critical because the future shape of the Internet will depend on whether we build assurance structures to limit and control ambiguity or allow trust to emerge in the presence of risk and uncertainty.

  15. Cardiac magnetic resonance imaging and computed tomography in ischemic cardiomyopathy: an update*

    PubMed Central

    Assunção, Fernanda Boldrini; de Oliveira, Diogo Costa Leandro; Souza, Vitor Frauches; Nacif, Marcelo Souto

    2016-01-01

    Ischemic cardiomyopathy is one of the major health problems worldwide, representing a significant part of mortality in the general population nowadays. Cardiac magnetic resonance imaging (CMRI) and cardiac computed tomography (CCT) are noninvasive imaging methods that serve as useful tools in the diagnosis of coronary artery disease and may also help in screening individuals with risk factors for developing this illness. Technological developments of CMRI and CCT have contributed to the rise of several clinical indications of these imaging methods complementarily to other investigation methods, particularly in cases where they are inconclusive. In terms of accuracy, CMRI and CCT are similar to the other imaging methods, with few absolute contraindications and minimal risks of adverse side-effects. This fact strengthens these methods as powerful and safe tools in the management of patients. The present study is aimed at describing the role played by CMRI and CCT in the diagnosis of ischemic cardiomyopathies. PMID:26929458

  16. Systems Engineering and Integration (SE and I)

    NASA Technical Reports Server (NTRS)

    Chevers, ED; Haley, Sam

    1990-01-01

    The issue of technology advancement and future space transportation vehicles is addressed. The challenge is to develop systems which can be evolved and improved in small incremental steps, where each increment reduces present cost, improves reliability, or does neither but sets the stage for a second incremental upgrade that does. Future requirements are interface standards for commercial off-the-shelf products to aid in the development of integrated facilities; enhanced automated code generation systems tightly coupled to specification and design documentation; modeling tools that support data flow analysis; and shared project databases consisting of technical characteristics, cost information, measurement parameters, and reusable software programs. Topics addressed include: advanced avionics development strategy; risk analysis and management; tool quality management; low cost avionics; cost estimation and benefits; computer aided software engineering; computer systems and software safety; system testability; and advanced avionics laboratories and rapid prototyping. This presentation is represented by viewgraphs only.

  17. Electrostatic Discharge Issues in International Space Station Program EVAs

    NASA Technical Reports Server (NTRS)

    Bacon, John B.

    2009-01-01

    EVA activity in the ISS program encounters several dangerous ESD conditions. The ISS program has worked aggressively for many years to find ways to mitigate or eliminate the associated risks. Investments have included: (1) major mods to EVA tools, suit connectors and analytical tools; (2) the Floating Potential Measurement Unit; (3) Plasma Contactor Units; (4) certification of new ISS flight attitudes; (5) teraflops of computation; (6) thousands of hours of work by scores of specialists; and (7) monthly management attention at the highest program levels. The risks are now mitigated to a level that is orders of magnitude safer than prior operations.

  18. [Management of the visual risk in VDT workers and the role of the occupational physician (Medico competente)].

    PubMed

    Signorelli, C; Lepratto, M; Summa, A

    2005-01-01

    The enormous increase in computer use in work activities has brought great progress and many other advantages, but it has also brought possible health problems for workers. The occupational risk in VDT workers involves the visual system, work-related musculoskeletal disorders and also mental health. This article concerns the major problems related to the obligations of the employer and to health surveillance, with special attention to the ophthalmological examination of fitness for work, the responsibilities and duties of occupational physicians (medici competenti), and the possible role of ophthalmologists.

  19. Shape optimization of pulsatile ventricular assist devices using FSI to minimize thrombotic risk

    NASA Astrophysics Data System (ADS)

    Long, C. C.; Marsden, A. L.; Bazilevs, Y.

    2014-10-01

    In this paper we perform shape optimization of a pediatric pulsatile ventricular assist device (PVAD). The device simulation is carried out using fluid-structure interaction (FSI) modeling techniques within a computational framework that combines FEM for fluid mechanics and isogeometric analysis for structural mechanics modeling. The PVAD FSI simulations are performed under realistic conditions (i.e., flow speeds, pressure levels, boundary conditions, etc.), and account for the interaction of air, blood, and a thin structural membrane separating the two fluid subdomains. The shape optimization study is designed to reduce thrombotic risk, a major clinical problem in PVADs. Thrombotic risk is quantified in terms of particle residence time in the device blood chamber. Methods to compute particle residence time in the context of moving spatial domains are presented in a companion paper published in the same issue (Comput Mech, doi: 10.1007/s00466-013-0931-y, 2013). The surrogate management framework, a derivative-free pattern search optimization method that relies on surrogates for increased efficiency, is employed in this work. For the optimization study shown here, particle residence time is used to define a suitable cost or objective function, while four adjustable design optimization parameters are used to define the device geometry. The FSI-based optimization framework is implemented in a parallel computing environment, and deployed with minimal user intervention. Using five SEARCH/POLL steps the optimization scheme identifies a PVAD design with significantly better throughput efficiency than the original device.
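
    The POLL idea at the core of such a pattern search can be sketched as follows: evaluate the objective at the current design and at +/- steps along each design parameter, accept any improvement, and otherwise shrink the step. The cost function below is a cheap stand-in for the expensive FSI-computed particle residence time, and the sketch omits the SEARCH (surrogate) stage of the actual surrogate management framework.

```python
# Stripped-down derivative-free pattern search (the POLL idea used inside the
# surrogate management framework): evaluate the cost at the current design and
# at +/- steps along each design parameter, move to any improvement, otherwise
# shrink the step. The cost function here is a cheap stand-in for "particle
# residence time from an FSI simulation", which the real study evaluates.
import numpy as np

def residence_time_surrogate(x):
    # Toy bowl-shaped cost standing in for the expensive FSI evaluation.
    target = np.array([1.0, 0.5, -0.3, 2.0])
    return float(np.sum((x - target) ** 2))

def pattern_search(cost, x0, step=0.5, tol=1e-3, max_iter=200):
    x = np.asarray(x0, dtype=float)
    fx = cost(x)
    for _ in range(max_iter):
        improved = False
        for i in range(x.size):                 # poll along each design parameter
            for sign in (+1.0, -1.0):
                trial = x.copy()
                trial[i] += sign * step
                f_trial = cost(trial)
                if f_trial < fx:
                    x, fx, improved = trial, f_trial, True
        if not improved:
            step *= 0.5                         # refine the mesh
            if step < tol:
                break
    return x, fx

if __name__ == "__main__":
    best_x, best_f = pattern_search(residence_time_surrogate, x0=[0, 0, 0, 0])
    print("best design parameters:", np.round(best_x, 3), "cost:", round(best_f, 5))
```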

  20. Verification of recursive probabilistic integration (RPI) method for fatigue life management using non-destructive inspections

    NASA Astrophysics Data System (ADS)

    Chen, Tzikang J.; Shiao, Michael

    2016-04-01

    This paper verified a generic and efficient assessment concept for probabilistic fatigue life management. The concept is developed based on an integration of damage tolerance methodology, simulation methods [1, 2], and a probabilistic algorithm, RPI (recursive probability integration) [3-9], considering maintenance for damage tolerance and risk-based fatigue life management. RPI is an efficient semi-analytical probabilistic method for risk assessment subject to various uncertainties such as the variability in material properties including crack growth rate, initial flaw size, repair quality, random process modeling of flight loads for failure analysis, and inspection reliability represented by probability of detection (POD). In addition, unlike traditional Monte Carlo simulations (MCS), which require a rerun of MCS when the maintenance plan is changed, RPI can repeatedly use a small set of baseline random crack growth histories, excluding maintenance-related parameters, from a single MCS for various maintenance plans. In order to fully appreciate the RPI method, a verification procedure was performed. In this study, MC simulations on the order of several hundred billion runs were conducted for various flight conditions, material properties, inspection schedules, POD curves, and repair/replacement strategies. Since MC simulations are time-consuming, the simulations were conducted in parallel on DoD High Performance Computers (HPC) using a specialized random number generator for parallel computing. The study has shown that the RPI method is several orders of magnitude more efficient than traditional Monte Carlo simulations.
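
    The baseline Monte Carlo side of such an analysis can be sketched as below: random initial flaw sizes grow at random rates, a scheduled inspection detects cracks according to a size-dependent POD curve, detected cracks are repaired, and failure is exceedance of a critical size. All distributions and parameters are invented for illustration, and the sketch does not implement the RPI algorithm itself.

```python
# Toy Monte Carlo illustration of the kind of baseline simulation described
# above: random initial flaw sizes grow at a random rate, a scheduled
# inspection detects cracks with a size-dependent probability of detection
# (POD), detected cracks are repaired, and failure is exceeding a critical
# size. This sketches only the MCS side; it is not the RPI algorithm itself.
import numpy as np

rng = np.random.default_rng(0)

def pod(a, a50=2.0, slope=2.0):
    """Log-logistic probability of detection for crack size a [mm] -- illustrative."""
    return 1.0 / (1.0 + (a50 / np.maximum(a, 1e-9)) ** slope)

def prob_failure(n=200_000, flights=10_000, inspect_at=5_000, a_crit=10.0):
    a0 = rng.lognormal(mean=np.log(0.2), sigma=0.3, size=n)        # initial flaw, mm
    growth = rng.lognormal(mean=np.log(5e-4), sigma=0.4, size=n)   # mm per flight
    a_insp = a0 + growth * inspect_at                              # size at inspection
    failed_early = a_insp > a_crit
    detected = (rng.random(n) < pod(a_insp)) & ~failed_early
    a_after = np.where(detected, 0.2, a_insp)                      # repair to baseline size
    a_end = a_after + growth * (flights - inspect_at)
    return float(np.mean(failed_early | (a_end > a_crit)))

if __name__ == "__main__":
    print(f"estimated probability of failure with one inspection: {prob_failure():.4f}")
```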

  1. Feasibility of Executing MIMS on Interdata 80.

    DTIC Science & Technology

    CDC 6500 computers; CDC 6600 computers; MIMS (Medical Information Management System); medical information management systems; file structures; computer storage management. The report examines the feasibility of implementing a large information management system on mini-computers. The Medical Information Management System and the Interdata 80 mini-computer were selected as being representative systems. The FORTRAN programs currently being used in MIMS

  2. Non-invasive Characterization of the Histopathologic Features of Pulmonary Nodules of the Lung Adenocarcinoma Spectrum using Computer Aided Nodule Assessment and Risk Yield (CANARY) – a Pilot Study

    PubMed Central

    Maldonado, Fabien; Boland, Jennifer M.; Raghunath, Sushravya; Aubry, Marie Christine; Bartholmai, Brian J.; deAndrade, Mariza; Hartman, Thomas E.; Karwoski, Ronald A.; Rajagopalan, Srinivasan; Sykes, Anne-Marie; Yang, Ping; Yi, Eunhee S.; Robb, Richard A.; Peikert, Tobias

    2013-01-01

    Introduction Pulmonary nodules of the adenocarcinoma spectrum are characterized by distinctive morphological and radiological features and variable prognosis. Non-invasive high-resolution computed tomography (HRCT)-based risk stratification tools are needed to individualize their management. Methods Radiological measurements of histopathologic tissue invasion were developed in a training set of 54 pulmonary nodules of the adenocarcinoma spectrum and validated in 86 consecutively resected nodules. Nodules were isolated and characterized by computer-aided analysis, and data were analyzed by Spearman correlation, sensitivity, specificity, and positive and negative predictive values. Results Computer Aided Nodule Assessment and Risk Yield (CANARY) can non-invasively characterize pulmonary nodules of the adenocarcinoma spectrum. Unsupervised clustering analysis of HRCT data identified 9 unique exemplars representing the basic radiologic building blocks of these lesions. The exemplar distribution within each nodule correlated well with the proportion of histologic tissue invasion: Spearman R = 0.87 (p < 0.0001) and R = 0.89 (p < 0.0001) for the training and validation sets, respectively. Clustering of the exemplars in three-dimensional space corresponding to tissue invasion and lepidic growth was used to develop a CANARY decision algorithm, which successfully categorized these pulmonary nodules as “aggressive” (invasive adenocarcinoma) or “indolent” (adenocarcinoma in situ and minimally invasive adenocarcinoma). Sensitivity, specificity, positive predictive value and negative predictive value of this approach for the detection of “aggressive” lesions were 95.4%, 96.8%, 95.4% and 96.8%, respectively, in the training set and 98.7%, 63.6%, 94.9% and 87.5%, respectively, in the validation set. Conclusion CANARY represents a promising tool to non-invasively risk stratify pulmonary nodules of the adenocarcinoma spectrum. PMID:23486265
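
    The performance figures quoted above follow the standard confusion-matrix definitions; the helper below simply makes those definitions explicit, with invented counts that do not reproduce the study's data.

```python
# The performance figures quoted above follow the standard confusion-matrix
# definitions; this helper just makes those definitions explicit. The example
# counts are invented and do not reproduce the study's data.

def classification_metrics(tp, fp, tn, fn):
    return {
        "sensitivity": tp / (tp + fn),
        "specificity": tn / (tn + fp),
        "ppv": tp / (tp + fp),
        "npv": tn / (tn + fn),
    }

if __name__ == "__main__":
    for name, value in classification_metrics(tp=42, fp=2, tn=30, fn=2).items():
        print(f"{name}: {value:.3f}")
```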

  3. Myocardial perfusion 320-row multidetector computed tomography-guided treatment strategy for the clinical management of patients with recent acute-onset chest pain: Design of the CArdiac cT in the treatment of acute CHest pain (CATCH)-2 randomized controlled trial.

    PubMed

    Sørgaard, Mathias; Linde, Jesper J; Hove, Jens D; Petersen, Jan R; Jørgensen, Tem B S; Abdulla, Jawdat; Heitmann, Merete; Kragelund, Charlotte; Hansen, Thomas Fritz; Udholm, Patricia M; Pihl, Christian; Kühl, J Tobias; Engstrøm, Thomas; Jensen, Jan Skov; Høfsten, Dan E; Kelbæk, Henning; Kofoed, Klaus F

    2016-09-01

    Patients admitted with chest pain are a diagnostic challenge because the majority does not have coronary artery disease (CAD). Assessment of CAD with coronary computed tomography angiography (CCTA) is safe, cost-effective, and accurate, albeit with a modest specificity. Stress myocardial computed tomography perfusion (CTP) has been shown to increase the specificity when added to CCTA, without lowering the sensitivity. This article describes the design of a randomized controlled trial, CATCH-2, comparing a clinical diagnostic management strategy of CCTA alone against CCTA in combination with CTP. Patients with acute-onset chest pain older than 50 years and with at least one cardiovascular risk factor for CAD are being prospectively enrolled to this study from 6 different clinical sites since October 2013. A total of 600 patients will be included. Patients are randomized 1:1 to clinical management based on CCTA or on CCTA in combination with CTP, determining the need for further testing with invasive coronary angiography including measurement of the fractional flow reserve in vessels with coronary artery lesions. Patients are scanned with a 320-row multidetector computed tomography scanner. Decisions to revascularize the patients are taken by the invasive cardiologist independently of the study allocation. The primary end point is the frequency of revascularization. Secondary end points of clinical outcome are also recorded. The CATCH-2 will determine whether CCTA in combination with CTP is diagnostically superior to CCTA alone in the management of patients with acute-onset chest pain. Copyright © 2015 Elsevier Inc. All rights reserved.

  4. Modified Sainsbury tool: an initial risk assessment tool for primary care mental health and learning disability services.

    PubMed

    Stein, W

    2005-10-01

    Risk assessments by health and social care professionals must encompass risk of suicide, of harm to others, and of neglect. The UK's National Confidential Inquiry into Homicide and Suicide paints a picture of failure to predict suicides and homicides, failure to identify opportunities for prevention and a failure to manage these opportunities. Assessing risk at 'first contact' with the mental health service assumes a special place in this regard. The initial opportunity to be alerted to, and thus to influence, risk, usually falls to the general psychiatric service (as opposed to forensic specialists) or to a joint health and local authority community mental health team. The Mental Health and Learning Disabilities Directorate of Renfrewshire & Inverclyde Primary Care NHS Trust, Scotland, determined to standardize their approach to risk assessment and selected a modified version of the Sainsbury Risk Assessment Tool. A year-long pilot revealed general support for its service-wide introduction but also some misgivings to address, including: (i) rejection of the tool by some medical staff; (ii) concerns about limited training; and (iii) a perceived failure on the part of the management to properly resource its use. The tool has the potential to fit well with the computer-networked needs assessment system used in joint-working with partner local authorities to allocate care resources.

  5. A randomized controlled trial of two primary school intervention strategies to prevent early onset tobacco smoking.

    PubMed

    Storr, Carla L; Ialongo, Nicholas S; Kellam, Sheppard G; Anthony, James C

    2002-03-01

    In this article, we examine the impact of two universal, grade 1 preventive interventions on the onset of tobacco smoking as assessed in early adolescence. The classroom-centered (CC) intervention was designed to reduce the risk for tobacco smoking by enhancing teachers' behavior management skills in first grade and, thereby, reducing child attention problems and aggressive and shy behavior-known risk behaviors for later substance use. The family-school partnership (FSP) intervention targeted these early risk behaviors via improvements in parent-teacher communication and parents' child behavior management strategies. A cohort of 678 urban, predominately African-American, public school students were randomly assigned to one of three Grade 1 classrooms at entrance to primary school (age 6). One classroom featured the CC intervention, a second the FSP intervention, and the third served as a control classroom. Six years later, 81% of the students completed audio computer-assisted self-interviews. Relative to controls, a modest attenuation in the risk of smoking initiation was found for students who had been assigned to either the CC or FSP intervention classrooms (26% versus 33%) (adjusted relative risk for CC/control contrast=0.57, 95% confidence interval (CI), 0.34-0.96; adjusted relative risk for FSP/control contrast=0.69, 95% CI, 0.50-0.97). Results lend support to targeting the early antecedent risk behaviors for tobacco smoking.

  6. Physical activity for the prevention and management of youth-onset type 2 diabetes mellitus: focus on cardiovascular complications.

    PubMed

    McGavock, Jonathan; Sellers, Elizabeth; Dean, Heather

    2007-12-01

    With the growing prevalence of childhood obesity and type 2 diabetes mellitus (T2DM) in youth, the challenge of cardiovascular disease risk management has entered the paediatric realm, affecting specialists, family physicians and allied healthcare professionals alike. Currently, there is little evidence to support optimal strategies for management of T2DM in youth and the associated cardiovascular complications. Physical activity plays a powerful role in the prevention and management of T2DM and cardiovascular disease in adults. This review will focus on the role of physical activity for the prevention of T2DM in youth and its associated cardiovascular complications. The first part describes the prevalence of cardiovascular risk factors in this cohort. The second part focuses on the role of physical activity in the prevention and management of T2DM in youth. Collectively, the limited intervention and observation studies published to date suggest that daily targets of 60-90 minutes of physical activity and less than 60 minutes of screen time (i.e. time spent in front of a television, computer or video games) are required for the prevention and management of T2DM in youth. Large-scale intervention studies are needed to determine the most effective physical activity strategies for the prevention and management of T2DM in youth.

  7. A risk management model for familial breast cancer: A new application using Fuzzy Cognitive Map method.

    PubMed

    Papageorgiou, Elpiniki I; Jayashree Subramanian; Karmegam, Akila; Papandrianos, Nikolaos

    2015-11-01

    Breast cancer is among the most deadly diseases affecting women, so it is natural for women aged 40-49 years who have a family history of breast cancer or other related cancers to assess their personal risk of developing familial breast cancer (FBC). Because each woman carries a different level of breast cancer risk depending on her family history, genetic predisposition and personal medical history, an individualized care-setting mechanism is needed so that health care professionals can determine appropriate risk assessment, counseling, screening, and prevention options. The presented work aims at developing a soft-computing-based medical decision support system using a Fuzzy Cognitive Map (FCM) that assists health care professionals in deciding the individualized care setting based on a woman's FBC risk level. The FCM-based FBC risk management system uses nonlinear Hebbian learning (NHL) to learn causal weights from 40 patient records and achieves 95% diagnostic accuracy. The results obtained from the proposed model concur with the comprehensive risk evaluation tool based on the Tyrer-Cuzick model for 38 of 40 patient cases (95%). The proposed model also identifies high-risk women with higher prediction accuracy than the standard Gail and NSABP models. The testing accuracy of the proposed model under 10-fold cross validation outperforms other standard machine-learning-based inference engines as well as previous FCM-based risk prediction methods for breast cancer. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.
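
    As a rough illustration of the FCM mechanics described above, the sketch below iterates concept activations through a sigmoid-squashed weight matrix until they settle. The concept names and causal weights are hypothetical placeholders; in the cited system the weights are learned from patient records with nonlinear Hebbian learning.

      # Minimal Fuzzy Cognitive Map inference sketch (illustrative only).
      # Concepts and causal weights are invented; the published model learns its
      # weights from 40 patient records via nonlinear Hebbian learning.
      import numpy as np

      def sigmoid(x, lam=1.0):
          return 1.0 / (1.0 + np.exp(-lam * x))

      def fcm_infer(weights, activations, n_iter=20, lam=1.0):
          """Iterate A <- f(A + A @ W), a common FCM update rule, until the map settles."""
          a = np.asarray(activations, dtype=float)
          for _ in range(n_iter):
              a = sigmoid(a + a @ weights, lam)
          return a

      concepts = ["family_history", "age_40_49", "genetic_testing", "FBC_risk"]
      # weights[i, j] = causal influence of concept i on concept j (hypothetical)
      W = np.array([
          [0.0, 0.0, 0.4, 0.7],
          [0.0, 0.0, 0.1, 0.3],
          [0.0, 0.0, 0.0, 0.5],
          [0.0, 0.0, 0.0, 0.0],
      ])
      initial = [1.0, 1.0, 0.0, 0.0]   # observed inputs for one woman
      final = fcm_infer(W, initial)
      print(dict(zip(concepts, np.round(final, 2))))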

  8. Protocol for the PINCER trial: a cluster randomised trial comparing the effectiveness of a pharmacist-led IT-based intervention with simple feedback in reducing rates of clinically important errors in medicines management in general practices

    PubMed Central

    Avery, Anthony J; Rodgers, Sarah; Cantrill, Judith A; Armstrong, Sarah; Elliott, Rachel; Howard, Rachel; Kendrick, Denise; Morris, Caroline J; Murray, Scott A; Prescott, Robin J; Cresswell, Kathrin; Sheikh, Aziz

    2009-01-01

    Background Medication errors are an important cause of morbidity and mortality in primary care. The aims of this study are to determine the effectiveness, cost effectiveness and acceptability of a pharmacist-led information-technology-based complex intervention compared with simple feedback in reducing proportions of patients at risk from potentially hazardous prescribing and medicines management in general (family) practice. Methods Research subject group: "At-risk" patients registered with computerised general practices in two geographical regions in England. Design: Parallel group pragmatic cluster randomised trial. Interventions: Practices will be randomised to either: (i) Computer-generated feedback; or (ii) Pharmacist-led intervention comprising computer-generated feedback, educational outreach and dedicated support. Primary outcome measures: The proportion of patients in each practice at six and 12 months post intervention: - with a computer-recorded history of peptic ulcer being prescribed non-selective non-steroidal anti-inflammatory drugs - with a computer-recorded diagnosis of asthma being prescribed beta-blockers - aged 75 years and older receiving long-term prescriptions for angiotensin converting enzyme inhibitors or loop diuretics without a recorded assessment of renal function and electrolytes in the preceding 15 months. Secondary outcome measures: These relate to a number of other examples of potentially hazardous prescribing and medicines management. Economic analysis: An economic evaluation will be done of the cost per error avoided, from the perspective of the UK National Health Service (NHS), comparing the pharmacist-led intervention with simple feedback. Qualitative analysis: A qualitative study will be conducted to explore the views and experiences of health care professionals and NHS managers concerning the interventions, and investigate possible reasons why the interventions prove effective, or conversely prove ineffective. Sample size: 34 practices in each of the two treatment arms would provide at least 80% power (two-tailed alpha of 0.05) to demonstrate a 50% reduction in error rates for each of the three primary outcome measures in the pharmacist-led intervention arm compared with an 11% reduction in the simple feedback arm. Discussion At the time of submission of this article, 72 general practices have been recruited (36 in each arm of the trial) and the interventions have been delivered. Analysis has not yet been undertaken. Trial registration Current controlled trials ISRCTN21785299 PMID:19409095
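
    The sample-size statement above can be approximated with a standard two-proportion calculation. The sketch below assumes a hypothetical baseline error rate and ignores the clustering by practice (intra-cluster correlation) that the trial's own calculation would account for, so the output is indicative only.

      # Indicative two-proportion sample-size sketch (per-patient, unclustered).
      # The baseline error rate is hypothetical; the actual PINCER calculation was
      # done at the cluster (practice) level and adjusted for intra-practice correlation.
      from math import ceil
      from statistics import NormalDist

      def n_per_group(p_control, p_intervention, alpha=0.05, power=0.80):
          z_a = NormalDist().inv_cdf(1 - alpha / 2)
          z_b = NormalDist().inv_cdf(power)
          var = p_control * (1 - p_control) + p_intervention * (1 - p_intervention)
          return ceil((z_a + z_b) ** 2 * var / (p_control - p_intervention) ** 2)

      baseline = 0.06                       # hypothetical pre-intervention error rate
      p_feedback = baseline * (1 - 0.11)    # 11% relative reduction (simple feedback)
      p_pharmacist = baseline * (1 - 0.50)  # 50% relative reduction (pharmacist-led)
      print(n_per_group(p_feedback, p_pharmacist), "patients per arm, before clustering adjustment")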

  9. Protocol for the PINCER trial: a cluster randomised trial comparing the effectiveness of a pharmacist-led IT-based intervention with simple feedback in reducing rates of clinically important errors in medicines management in general practices.

    PubMed

    Avery, Anthony J; Rodgers, Sarah; Cantrill, Judith A; Armstrong, Sarah; Elliott, Rachel; Howard, Rachel; Kendrick, Denise; Morris, Caroline J; Murray, Scott A; Prescott, Robin J; Cresswell, Kathrin; Sheikh, Aziz

    2009-05-01

    Medication errors are an important cause of morbidity and mortality in primary care. The aims of this study are to determine the effectiveness, cost effectiveness and acceptability of a pharmacist-led information-technology-based complex intervention compared with simple feedback in reducing proportions of patients at risk from potentially hazardous prescribing and medicines management in general (family) practice. RESEARCH SUBJECT GROUP: "At-risk" patients registered with computerised general practices in two geographical regions in England. Parallel group pragmatic cluster randomised trial. Practices will be randomised to either: (i) Computer-generated feedback; or (ii) Pharmacist-led intervention comprising computer-generated feedback, educational outreach and dedicated support. The proportion of patients in each practice at six and 12 months post intervention: - with a computer-recorded history of peptic ulcer being prescribed non-selective non-steroidal anti-inflammatory drugs; - with a computer-recorded diagnosis of asthma being prescribed beta-blockers; - aged 75 years and older receiving long-term prescriptions for angiotensin converting enzyme inhibitors or loop diuretics without a recorded assessment of renal function and electrolytes in the preceding 15 months. SECONDARY OUTCOME MEASURES: These relate to a number of other examples of potentially hazardous prescribing and medicines management. An economic evaluation will be done of the cost per error avoided, from the perspective of the UK National Health Service (NHS), comparing the pharmacist-led intervention with simple feedback. QUALITATIVE ANALYSIS: A qualitative study will be conducted to explore the views and experiences of health care professionals and NHS managers concerning the interventions, and investigate possible reasons why the interventions prove effective, or conversely prove ineffective. 34 practices in each of the two treatment arms would provide at least 80% power (two-tailed alpha of 0.05) to demonstrate a 50% reduction in error rates for each of the three primary outcome measures in the pharmacist-led intervention arm compared with an 11% reduction in the simple feedback arm. At the time of submission of this article, 72 general practices have been recruited (36 in each arm of the trial) and the interventions have been delivered. Analysis has not yet been undertaken.

  10. Screening and Management of Asymptomatic Renal Stones in Astronauts

    NASA Technical Reports Server (NTRS)

    Reyes, David; Locke, James; Sargsyan, Ashot; Garcia, Kathleen

    2017-01-01

    Management guidelines were created to screen and manage asymptomatic renal stones in U.S. astronauts. The true risk for renal stone formation in astronauts due to the space flight environment is unknown. Proper management of this condition is crucial to mitigate health and mission risks. The NASA Flight Medicine Clinic electronic medical record and the Lifetime Surveillance of Astronaut Health databases were reviewed. An extensive review of the literature and current aeromedical standards for the monitoring and management of renal stones was also done. This work was used to develop a screening and management protocol for renal stones in astronauts that is relevant to the spaceflight operational environment. In the proposed guidelines all astronauts receive a yearly screening and post-flight renal ultrasound using a novel ultrasound protocol. The ultrasound protocol uses a combination of factors, including: size, position, shadow, twinkle and dispersion properties to confirm the presence of a renal calcification. For mission-assigned astronauts, any positive ultrasound study is followed by a low-dose renal computed tomography scan and urologic consult. Other specific guidelines were also created. A small asymptomatic renal stone within the renal collecting system may become symptomatic at any time, and therefore affect launch and flight schedules, or cause incapacitation during a mission. Astronauts in need of definitive care can be evacuated from the International Space Station, but for deep space missions evacuation is impossible. The new screening and management algorithm has been implemented and the initial round of screening ultrasounds is under way. Data from these exams will better define the incidence of renal stones in U.S. astronauts, and will be used to inform risk mitigation for both short and long duration spaceflights.

  11. Internet: road to heaven or hell for the clinical laboratory?

    PubMed

    Chou, D

    1996-05-01

    The Internet started as a research project by the Department of Defense Advanced Research Projects Agency for networking computers. Ironically, the networking project now predominantly supports human rather than computer communications. The Internet's growth, estimated at 20% per month, has been fueled by commercial and public perception that it will become an important medium for merchandising, marketing, and advertising. For the clinical laboratory, the Internet provides high-speed communications through e-mail and allows the retrieval of important information held in repositories. All this capability comes at a price, including the need to manage a complex technology and the risk of intrusions on patient privacy.

  12. A study of computer graphics technology in application of communication resource management

    NASA Astrophysics Data System (ADS)

    Li, Jing; Zhou, Liang; Yang, Fei

    2017-08-01

    With the development of computer technology, computer graphics technology has come into wide use. In particular, the success of object-oriented and multimedia technologies has promoted the development of graphics technology within computer software systems. Computer graphics theory and application technology have therefore become an important topic in the computer field, and graphics technology is being applied in an ever wider range of domains. In recent years, with the development of the social economy and especially the rapid development of information technology, traditional approaches to communication resource management can no longer meet resource management needs effectively. Communication resource management still relies on the original tools and methods for managing and maintaining resources and equipment, which has created many problems: non-professionals find it very difficult to understand the equipment and the overall situation, resource utilization is relatively low, and managers cannot quickly and accurately grasp resource conditions. To address these problems, this paper proposes introducing computer graphics technology into communication resource management. The introduction of computer graphics not only makes communication resource management more vivid, but also reduces the cost of resource management and improves work efficiency.

  13. Risk identification of agricultural drought for sustainable agroecosystems

    NASA Astrophysics Data System (ADS)

    Dalezios, N. R.; Blanta, A.; Spyropoulos, N. V.; Tarquis, A. M.

    2014-04-01

    Drought is considered one of the major natural hazards, with significant impact on agriculture, the environment, society and the economy. Droughts affect the sustainability of agriculture and may result in environmental degradation of a region, which is one of the factors contributing to the vulnerability of agriculture. This paper addresses agrometeorological or agricultural drought within the risk management framework. Risk management consists of risk assessment as well as feedback on the adopted risk reduction measures; risk assessment, in turn, comprises three distinct steps, namely risk identification, risk estimation and risk evaluation. This paper deals with risk identification of agricultural drought, which involves drought quantification and monitoring as well as statistical inference. For the quantitative assessment of agricultural drought and the computation of its spatiotemporal features, one of the most reliable and widely used indices is applied, namely the Vegetation Health Index (VHI). The computation of VHI is based on satellite data of temperature and the Normalized Difference Vegetation Index (NDVI). The spatiotemporal features of drought extracted from VHI are areal extent, onset and end time, duration and severity. In this paper, a 20-year (1981-2001) time series of NOAA/AVHRR satellite data is used, from which monthly images of VHI are extracted. The application is implemented in Thessaly, the major agricultural drought-prone region of Greece, characterized by vulnerable agriculture. The results show that agricultural drought appears every year during the warm season in the region. The severity of drought increases from mild to extreme throughout the warm season, with peaks in the summer. Similarly, the areal extent of drought also increases during the warm season, whereas the number of extreme drought pixels is much smaller than that of mild to moderate drought throughout the warm season. Finally, areas with diachronic drought persistence can be located. Drought early warning is developed using empirical functional relationships between severity and areal extent. In particular, two second-order polynomials are fitted, one for low and one for high severity drought classes. The two fitted curves offer a forecasting tool on a monthly basis from May to October. The results of this drought risk identification effort are considered quite satisfactory, offering prognostic potential. The adopted remote sensing data and methods have proven very effective in delineating spatial variability and features in drought quantification and monitoring.
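
    A minimal sketch of the VHI computation described above is given below, following the commonly used VCI/TCI formulation with an equal-weight blend. The NDVI and brightness-temperature arrays and the climatological extremes are synthetic stand-ins for the NOAA/AVHRR monthly composites.

      # Sketch of the Vegetation Health Index from NDVI and brightness temperature,
      # using the usual VCI/TCI formulation (alpha = 0.5 is a common default).
      # All arrays and min/max values below are synthetic placeholders.
      import numpy as np

      def vegetation_health_index(ndvi, bt, ndvi_min, ndvi_max, bt_min, bt_max, alpha=0.5):
          vci = 100.0 * (ndvi - ndvi_min) / (ndvi_max - ndvi_min)   # Vegetation Condition Index
          tci = 100.0 * (bt_max - bt) / (bt_max - bt_min)           # Temperature Condition Index
          return alpha * vci + (1.0 - alpha) * tci

      # Synthetic monthly composites for a small pixel window
      ndvi = np.array([[0.21, 0.35], [0.42, 0.18]])
      bt   = np.array([[305.0, 298.0], [296.0, 309.0]])             # brightness temperature, K
      vhi = vegetation_health_index(ndvi, bt,
                                    ndvi_min=0.05, ndvi_max=0.65,
                                    bt_min=280.0, bt_max=315.0)
      drought_pixels = vhi < 40.0    # a commonly used drought threshold
      print(np.round(vhi, 1), drought_pixels, sep="\n")

    The fraction of pixels flagged in this way would correspond to the areal-extent feature used for the severity/areal-extent relationships described above.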

  14. Framework for Identifying Cybersecurity Risks in Manufacturing

    DOE PAGES

    Hutchins, Margot J.; Bhinge, Raunak; Micali, Maxwell K.; ...

    2015-10-21

    Increasing connectivity, use of digital computation, and off-site data storage provide potential for dramatic improvements in manufacturing productivity, quality, and cost. However, there are also risks associated with the increased volume and pervasiveness of data that are generated and potentially accessible to competitors or adversaries. Enterprises have experienced cyber attacks that exfiltrate confidential and/or proprietary data, alter information to cause an unexpected or unwanted effect, and destroy capital assets. Manufacturers need tools to incorporate these risks into their existing risk management processes. This article establishes a framework that considers the data flows within a manufacturing enterprise and throughout its supply chain. The framework provides several mechanisms for identifying generic and manufacturing-specific vulnerabilities and is illustrated with details pertinent to an automotive manufacturer. Finally, in addition to providing manufacturers with insights into their potential data risks, this framework addresses an outcome identified by the NIST Cybersecurity Framework.

  15. Framework for Identifying Cybersecurity Risks in Manufacturing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hutchins, Margot J.; Bhinge, Raunak; Micali, Maxwell K.

    Increasing connectivity, use of digital computation, and off-site data storage provide potential for dramatic improvements in manufacturing productivity, quality, and cost. However, there are also risks associated with the increased volume and pervasiveness of data that are generated and potentially accessible to competitors or adversaries. Enterprises have experienced cyber attacks that exfiltrate confidential and/or proprietary data, alter information to cause an unexpected or unwanted effect, and destroy capital assets. Manufacturers need tools to incorporate these risks into their existing risk management processes. This article establishes a framework that considers the data flows within a manufacturing enterprise and throughout its supply chain. The framework provides several mechanisms for identifying generic and manufacturing-specific vulnerabilities and is illustrated with details pertinent to an automotive manufacturer. Finally, in addition to providing manufacturers with insights into their potential data risks, this framework addresses an outcome identified by the NIST Cybersecurity Framework.

  16. Post traumatic inferior vena cava thrombosis: A case report and review of literature.

    PubMed

    Chakroun, Amine; Nakhli, Mohamed Said; Kahloul, Mohamed; Harrathi, Mohamed Amine; Naija, Walid

    2017-01-01

    Post traumatic inferior vena cava (IVC) thrombosis is a rare and not well described entity with a nonspecific clinical presentation. It remains a therapeutic challenge in the traumatic context because of the haemorrhagic risk associated with anticoagulation. We report a case of IVC thrombosis in an 18-year-old man who presented with a liver injury following a traffic crash. The thrombosis was incidentally diagnosed on admission by computed tomography. The patient was initially managed conservatively without anticoagulation, considering the increased haemorrhagic risk. IVC filter placement was not possible because of the unusual localization of the thrombus. Unfractionated heparin was started on the third day, after a control CT scan showed stability of the hepatic lesions and the occurrence of a pulmonary embolism. The final outcome was good. The management of post traumatic IVC thrombosis is not well described. The medical approach consists of conservative management with anticoagulation, which requires the absence of actively bleeding lesions. Surgical treatment is commonly based on thrombectomy under extracorporeal circulation. Interventional vascular techniques have become an important alternative approach for the treatment of many vessel lesions; their main advantages are the relative ease and speed with which they can be performed. Post traumatic IVC thrombosis is a rare condition and its management is not well defined. Early anticoagulation should be discussed on a case-by-case basis. Other alternatives such as an IVC filter or surgical thrombectomy may be used when the bleeding risk is increased. The most serious risk is pulmonary embolism. Outcome can be favorable even with non-surgical approaches. Copyright © 2017 The Authors. Published by Elsevier Ltd. All rights reserved.

  17. Food Waste to Energy: An Overview of Sustainable Approaches for Food Waste Management and Nutrient Recycling

    PubMed Central

    Paritosh, Kunwar; Kushwaha, Sandeep K.; Yadav, Monika; Pareek, Nidhi; Chawade, Aakash

    2017-01-01

    Food wastage and its accumulation are becoming a critical problem around the globe due to the continuous increase of the world population. The exponential growth in food waste is imposing serious threats to our society, such as environmental pollution, health risks, and scarcity of dumping land. There is an urgent need to take appropriate measures to reduce the food waste burden by adopting standard management practices. Currently, various kinds of approaches are being investigated in food waste processing and management for societal benefits and applications. Anaerobic digestion has emerged as one of the most ecofriendly and promising solutions for food waste management, energy, and nutrient production, and can contribute to the world's ever-increasing energy requirements. Here, we briefly describe and explore the different aspects of anaerobic biodegradation approaches for food waste, the effects of cosubstrates and environmental factors, the contribution of microbial populations, and the computational resources available for food waste management research. PMID:28293629

  18. Food Waste to Energy: An Overview of Sustainable Approaches for Food Waste Management and Nutrient Recycling.

    PubMed

    Paritosh, Kunwar; Kushwaha, Sandeep K; Yadav, Monika; Pareek, Nidhi; Chawade, Aakash; Vivekanand, Vivekanand

    2017-01-01

    Food wastage and its accumulation are becoming a critical problem around the globe due to the continuous increase of the world population. The exponential growth in food waste is imposing serious threats to our society, such as environmental pollution, health risks, and scarcity of dumping land. There is an urgent need to take appropriate measures to reduce the food waste burden by adopting standard management practices. Currently, various kinds of approaches are being investigated in food waste processing and management for societal benefits and applications. Anaerobic digestion has emerged as one of the most ecofriendly and promising solutions for food waste management, energy, and nutrient production, and can contribute to the world's ever-increasing energy requirements. Here, we briefly describe and explore the different aspects of anaerobic biodegradation approaches for food waste, the effects of cosubstrates and environmental factors, the contribution of microbial populations, and the computational resources available for food waste management research.

  19. Computer-Based Model Calibration and Uncertainty Analysis: Terms and Concepts

    DTIC Science & Technology

    2015-07-01

    uncertainty analyses throughout the lifecycle of planning, designing, and operating of Civil Works flood risk management projects as described in...value 95% of the time. In the frequentist approach to PE, model parameters are regarded as having true values, and their estimate is based on the...in catchment models. 1. Evaluating parameter uncertainty. Water Resources Research 19(5):1151–1172. Lee, P. M. 2012. Bayesian statistics: An

  20. Metal bioavailability in ecological risk assessment of freshwater ecosystems: From science to environmental management.

    PubMed

    Väänänen, Kristiina; Leppänen, Matti T; Chen, XuePing; Akkanen, Jarkko

    2018-01-01

    Metal contamination in freshwater ecosystems is a global issue and metal discharges to aquatic environments are monitored in order to protect aquatic life and human health. Bioavailability is an important factor determining metal toxicity. In aquatic systems, metal bioavailability depends on local water and sediment characteristics, and therefore, the risks are site-specific. Environmental quality standards (EQS) are used to manage the risks of metals in aquatic environments. In the simplest form of EQSs, total concentrations of metals in water or sediment are compared against pre-set acceptable threshold levels. Now, however, the environmental administration bodies have stated the need to incorporate metal bioavailability assessment tools into environmental regulation. Scientific advances have been made in metal bioavailability assessment, including passive samplers and computational models, such as biotic ligand models (BLM). However, the cutting-edge methods tend to be too elaborate or laborious for standard environmental monitoring. We review the commonly used metal bioavailability assessment methods and introduce the latest scientific advances that might be applied to environmental management in the future. We present the current practices in environmental management in North America, Europe and China, highlighting the good practices and the needs for improvement. Environmental management has met these new challenges with varying degrees of success: the USA has implemented site-specific environmental risk assessment for water and sediment phases, and they have already implemented metal mixture toxicity evaluation. The European Union is promoting the use of bioavailability and BLMs in ecological risk assessment (ERA), but metal mixture toxicity and sediment phase are still mostly neglected. China has regulation only for total concentrations of metals in surface water. We conclude that there is a need for (1) Advanced and up-to-date guidelines and legislation, (2) New and simple scientific methods for assessing metal bioavailability and (3) Improvement of knowledge and skills of administrators. Copyright © 2017 Elsevier Inc. All rights reserved.

  1. Assessment of adherence to the guidelines for the management of nausea and vomiting induced by chemotherapy

    PubMed Central

    França, Monique Sedlmaier; Usón, Pedro Luiz Serrano; Antunes, Yuri Philippe Pimentel Vieira; Prado, Bernard Lobato; Donnarumma, Carlos del Cistia; Mutão, Taciana Sousa; Rodrigues, Heloisa Veasey; del Giglio, Auro

    2015-01-01

    ABSTRACT Objective: To assess adherence of the prescribing physicians in a private cancer care center to the American Society of Clinical Oncology guideline for antiemetic prophylaxis in the first cycle of antineoplastic chemotherapy. Methods: A total of 139 chemotherapy regimens, from 105 patients, were evaluated retrospectively from 2011 to 2013. Results: We observed a non-adherence rate of 78%. The main disagreements with the guideline were the prescription of higher doses of dexamethasone and excessive use of 5-HT3 antagonists for chemotherapy regimens of low emetogenic risk. On univariate analysis, hematological malignancies (p=0.005), the use of two or more chemotherapy agents (p=0.05) and high emetogenic risk regimens (p=0.012) were factors statistically associated with greater adherence to the guideline. Treatment based on paclitaxel was the only significant risk factor for non-adherence (p=0.02). On multivariate analysis, high emetogenic risk chemotherapy was the factor most strongly correlated with guideline adherence (p=0.05). Conclusion: We concluded that adherence to the guideline is greater when the chemotherapy regimen has high emetogenic risk. Educational efforts should focus more intensely on the management of chemotherapy regimens with low and moderate emetogenic potential. Perhaps the development of a computer-generated reminder may improve adherence to guidelines. PMID:26154543

  2. Nonoperative management of spontaneous splenic rupture in infectious mononucleosis: a case report and review of the literature.

    PubMed

    Stephenson, Jacob T; DuBois, Jeffrey J

    2007-08-01

    Spontaneous rupture of the spleen is a rare complication of infectious mononucleosis with no clear consensus on appropriate management. Although management of traumatic splenic rupture has largely moved to nonoperative treatment, splenectomy is still frequently used in dealing with rupture of the diseased spleen. Here we report the case of a 16-year-old boy with splenic rupture secondary to laboratory-confirmed infectious mononucleosis in the absence of trauma. Nonoperative management including ICU admission, serial computed tomography scans, and activity limitation was used successfully. Our experience, along with a review of the literature, leads us to conclude that splenic preservation can be a safe alternative to splenectomy in hemodynamically stable patients with spontaneous splenic rupture. This is of particular importance in the pediatric population, which is at higher risk for postsplenectomy sepsis.

  3. Combining operational models and data into a dynamic vessel risk assessment tool for coastal regions

    NASA Astrophysics Data System (ADS)

    Fernandes, R.; Braunschweig, F.; Lourenço, F.; Neves, R.

    2015-07-01

    The technological evolution in terms of computational capacity, data acquisition systems, numerical modelling and operational oceanography is providing opportunities for designing and building holistic approaches and complex tools for newer and more efficient management (planning, prevention and response) of coastal water pollution risk events. A combined methodology has been developed to dynamically estimate time- and space-variable shoreline risk levels from ships, integrating numerical metocean forecasts and oil spill simulations with vessel tracking from automatic identification systems (AIS). The risk rating combines the likelihood of an oil spill occurring from a vessel navigating in the study area - the Portuguese continental shelf - with the assessed consequences to the shoreline. The spill likelihood is based on dynamic marine weather conditions and statistical information from previous accidents. The shoreline consequences reflect the virtual spilled oil amount reaching the shoreline and its environmental and socio-economic vulnerabilities. The oil reaching the shoreline is quantified with an oil spill fate and behaviour model running multiple virtual spills from vessels over time. Shoreline risks can be computed in real time or from previously obtained data. Results show the ability of the proposed methodology to estimate risk with proper sensitivity to dynamic metocean conditions and to oil transport behaviour. The integration of metocean and oil spill models with coastal vulnerability and AIS data in the quantification of risk enhances maritime situational awareness and decision support, providing a more realistic approach to the assessment of shoreline impacts. Risk assessment from historical data can help in finding typical risk patterns and "hot spots" or in developing sensitivity analyses for specific conditions, whereas real-time risk levels can be used in the prioritization of individual ships and geographical areas, strategic tug positioning and the implementation of dynamic risk-based vessel traffic monitoring.
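
    The risk rating logic described above (likelihood of a spill combined with shoreline consequence) can be sketched as follows. The vessel records, probabilities and vulnerability weights are hypothetical placeholders for the AIS feed and model outputs.

      # Minimal sketch of the risk rating logic: per vessel, combine a spill
      # likelihood (weather- and statistics-driven) with a shoreline consequence
      # (modelled oil ashore weighted by coastal vulnerability). All numbers and
      # field names are hypothetical placeholders.
      def shoreline_risk(spill_probability, oil_ashore_tonnes, vulnerability_index):
          consequence = oil_ashore_tonnes * vulnerability_index
          return spill_probability * consequence

      vessels = [
          {"mmsi": "111111111", "p_spill": 2e-4, "oil_ashore": 120.0, "vuln": 0.8},
          {"mmsi": "222222222", "p_spill": 5e-5, "oil_ashore": 400.0, "vuln": 0.3},
      ]
      # Rank vessels so that traffic monitoring can prioritize the riskiest ones
      ranked = sorted(vessels,
                      key=lambda v: shoreline_risk(v["p_spill"], v["oil_ashore"], v["vuln"]),
                      reverse=True)
      for v in ranked:
          print(v["mmsi"], shoreline_risk(v["p_spill"], v["oil_ashore"], v["vuln"]))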

  4. Identifying at-risk employees: A behavioral model for predicting potential insider threats

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Greitzer, Frank L.; Kangas, Lars J.; Noonan, Christine F.

    A psychosocial model was developed to assess an employee's behavior associated with an increased risk of insider abuse. The model is based on case studies and research literature on factors/correlates associated with precursor behavioral manifestations of individuals committing insider crimes. In many of these crimes, managers and other coworkers observed that the offenders had exhibited signs of stress, disgruntlement, or other issues, but no alarms were raised. Barriers to using such psychosocial indicators include the inability to recognize the signs and the failure to record the behaviors so that they could be assessed by a person experienced in psychosocial evaluations. We have developed a model using a Bayesian belief network with the help of human resources staff experienced in evaluating behaviors in staff. We conducted an experiment to assess its agreement with human resources and management professionals, with positive results. If implemented in an operational setting, the model would be part of a set of management tools for employee assessment that can raise an alarm about employees who pose higher insider threat risks. In separate work, we combine this psychosocial model's assessment with computer workstation behavior to raise the efficacy of recognizing an insider crime in the making.
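
    As a simplified stand-in for the Bayesian belief network described above, the sketch below combines observed behavioral indicators under a naive conditional-independence assumption. The indicator names, prior and likelihoods are invented for illustration and are not the published model.

      # Simplified stand-in for the psychosocial risk model: a naive-Bayes style
      # combination of observed behavioral indicators. The real model is a Bayesian
      # belief network elicited with HR experts; all numbers below are hypothetical.
      def posterior_risk(prior, indicators, likelihoods):
          """P(high risk | observed indicators), assuming conditional independence."""
          odds = prior / (1.0 - prior)
          for name, observed in indicators.items():
              if observed:
                  p_given_high, p_given_low = likelihoods[name]
                  odds *= p_given_high / p_given_low   # multiply in the likelihood ratio
          return odds / (1.0 + odds)

      likelihoods = {                      # (P(indicator | high risk), P(indicator | low risk))
          "disgruntlement":   (0.60, 0.10),
          "policy_violation": (0.40, 0.05),
          "stress":           (0.70, 0.30),
      }
      observed = {"disgruntlement": True, "policy_violation": False, "stress": True}
      print(round(posterior_risk(prior=0.02, indicators=observed, likelihoods=likelihoods), 3))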

  5. [The economics of preventing psycho-social risks].

    PubMed

    Golzio, Luigi

    2014-01-01

    The aim of this essay is to present the SHIELD methodology for helping firm management improve its risk prevention policy; it has been tested in the field with positive results. SHIELD is a cost-benefit analysis application that compares prevention and non-prevention costs arising from non-market risks. In the economic perspective, safety risks (which include psycho-social risks) are non-market risks because they cause injuries to workers on the job. SHIELD (Social Health Indicators for Economic Labour Decisions) is the original method proposed by the author: a cost-benefit analysis application that compares safety prevention and non-prevention costs. The comparison allows top management to evaluate the efficiency of the current safety prevention policy, as it helps top management answer the policy question: how much to invest in prevention? The cost comparison is obtained through the reclassification of safety costs into prevention and non-prevention costs (the latter comprising claim damages and penalty sanction costs). SHIELD has been tested empirically in four companies operating in the agribusiness sector during a research project financed by the Assessorato all'Agricoltura and INAI Regionale of the Emilia Romagna Region. Results are positive: the increase of prevention costs led to a reduction of non-prevention costs in all the companies examined, as assumed by high reliability organization theory. SHIELD can be applied to all companies that are required by law to keep an accounting system, regardless of the industry in which they operate. Its application has limited costs, as SHIELD does not require changes to the accounting system: safety costs sustained by the company are simply reclassified into prevention and non-prevention costs. The comparison of these two cost categories was appreciated by the top management of the companies investigated as a useful support for deciding the company's risk prevention policy. An original feature of SHIELD compared with other cost-benefit analysis applications is that it computes costs already registered in the company accounting system.
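
    The core of the SHIELD comparison, reclassifying safety costs into prevention versus non-prevention (claims plus penalties) and tracking both over time, can be sketched as below with invented figures.

      # Toy illustration of the SHIELD reclassification, with invented figures:
      # safety costs are split into prevention vs non-prevention (claims + penalties)
      # so the two categories can be compared year by year.
      years = [
          {"year": 2011, "prevention": 40_000, "claims": 95_000, "penalties": 12_000},
          {"year": 2012, "prevention": 65_000, "claims": 60_000, "penalties": 5_000},
          {"year": 2013, "prevention": 80_000, "claims": 35_000, "penalties": 2_000},
      ]
      print("year  prevention  non-prevention    total")
      for row in years:
          non_prevention = row["claims"] + row["penalties"]
          total = row["prevention"] + non_prevention
          print(f'{row["year"]}  {row["prevention"]:>10}  {non_prevention:>14}  {total:>7}')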

  6. Ischemic Stroke: Advances in Diagnosis and Management.

    PubMed

    Cassella, Courtney R; Jagoda, Andy

    2017-11-01

    Acute ischemic stroke carries the risk of morbidity and mortality. Since the advent of intravenous thrombolysis, there have been improvements in stroke care and functional outcomes. Studies of populations once excluded from thrombolysis have begun to elucidate candidates who might benefit and thus should be engaged in the process of shared decision-making. Imaging is evolving to better target the ischemic penumbra salvageable with prompt reperfusion. Availability and use of computed tomography angiography identifies large-vessel occlusions, and new-generation endovascular therapy devices are improving outcomes in these patients. With this progress in stroke treatment, risk stratification tools and shared decision-making are fundamental. Copyright © 2017 Elsevier Inc. All rights reserved.

  7. NeuPAT: an intranet database supporting translational research in neuroblastic tumors.

    PubMed

    Villamón, Eva; Piqueras, Marta; Meseguer, Javier; Blanquer, Ignacio; Berbegall, Ana P; Tadeo, Irene; Hernández, Vicente; Navarro, Samuel; Noguera, Rosa

    2013-03-01

    Translational research in oncology is directed mainly towards establishing a better risk stratification and searching for appropriate therapeutic targets. This research generates a tremendous amount of complex clinical and biological data needing speedy and effective management. The authors describe the design, implementation and early experiences of a computer-aided system for the integration and management of data for neuroblastoma patients. NeuPAT facilitates clinical and translational research, minimizes the workload in consolidating the information, reduces errors and increases correlation of data through extensive coding. This design can also be applied to other tumor types. Copyright © 2012 Elsevier Ltd. All rights reserved.

  8. Computer-Aided Facilities Management Systems (CAFM).

    ERIC Educational Resources Information Center

    Cyros, Kreon L.

    Computer-aided facilities management (CAFM) refers to a collection of software used with increasing frequency by facilities managers. The six major CAFM components are discussed with respect to their usefulness and popularity in facilities management applications: (1) computer-aided design; (2) computer-aided engineering; (3) decision support…

  9. Risk Based Reservoir Operations Using Ensemble Streamflow Predictions for Lake Mendocino in Mendocino County, California

    NASA Astrophysics Data System (ADS)

    Delaney, C.; Mendoza, J.; Whitin, B.; Hartman, R. K.

    2017-12-01

    Ensemble Forecast Operations (EFO) is a risk-based approach to reservoir flood operations that incorporates ensemble streamflow predictions (ESPs) made by NOAA's California-Nevada River Forecast Center (CNRFC). With the EFO approach, each member of an ESP is individually modeled to forecast system conditions and calculate the risk of reaching critical operational thresholds. Reservoir release decisions are computed which seek to manage forecasted risk to established risk tolerance levels. A water management model was developed for Lake Mendocino, a 111,000 acre-foot reservoir located near Ukiah, California, to evaluate whether the EFO alternative can improve water supply reliability without increasing downstream flood risk. Lake Mendocino is a dual use reservoir, which is owned and operated for flood control by the United States Army Corps of Engineers and is operated for water supply by the Sonoma County Water Agency. Due to recent changes in the operations of an upstream hydroelectric facility, this reservoir has suffered from water supply reliability issues since 2007. The EFO alternative was simulated using a 26-year (1985-2010) ESP hindcast generated by the CNRFC, which approximates flow forecasts for 61 ensemble members for a 15-day horizon. Model simulation results of the EFO alternative demonstrate a 36% increase in median end of water year (September 30) storage levels over existing operations. Additionally, model results show no increase in the occurrence of flows above flood stage for points downstream of Lake Mendocino. This investigation demonstrates that the EFO alternative may be a viable approach for managing Lake Mendocino for multiple purposes (water supply, flood mitigation, ecosystems) and warrants further investigation through additional modeling and analysis.
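
    The EFO decision logic described above can be sketched as follows: each ensemble member is propagated through a simple storage balance, and the forecasted risk is the fraction of members exceeding a critical threshold. All numbers and the storage model are synthetic simplifications of the operational setup.

      # Sketch of the EFO idea: run every ensemble streamflow member through a
      # storage balance, take the fraction of members exceeding a critical storage
      # as the forecasted risk, and flag when that risk exceeds the tolerance.
      # Inflows, release and thresholds are synthetic; the operational model uses
      # 61-member CNRFC forecasts over a 15-day horizon.
      import numpy as np

      rng = np.random.default_rng(7)
      n_members, horizon = 61, 15
      inflow = rng.gamma(shape=2.0, scale=800.0, size=(n_members, horizon))  # acre-ft/day

      storage0 = 100_000.0       # current storage, acre-ft
      release = 500.0            # candidate constant release, acre-ft/day
      flood_pool = 111_000.0     # critical storage threshold, acre-ft
      risk_tolerance = 0.10      # max acceptable fraction of members over threshold

      storage = storage0 + np.cumsum(inflow - release, axis=1)   # per-member trajectories
      risk_by_day = (storage > flood_pool).mean(axis=0)          # fraction of members over
      needs_larger_release = np.any(risk_by_day > risk_tolerance)
      print(np.round(risk_by_day, 2), needs_larger_release)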

  10. Application of expert systems in project management decision aiding

    NASA Technical Reports Server (NTRS)

    Harris, Regina; Shaffer, Steven; Stokes, James; Goldstein, David

    1987-01-01

    The feasibility of developing an expert systems-based project management decision aid to enhance the performance of NASA project managers was assessed. The research effort included extensive literature reviews in the areas of project management, project management decision aiding, expert systems technology, and human-computer interface engineering. Literature reviews were augmented by focused interviews with NASA managers. Time estimation for project scheduling was identified as the target activity for decision augmentation, and a design was developed for an Integrated NASA System for Intelligent Time Estimation (INSITE). The proposed INSITE design was judged feasible with a low level of risk. A partial proof-of-concept experiment was performed and was successful. Specific conclusions drawn from the research and analyses are included. The INSITE concept is potentially applicable in any management sphere, commercial or government, where time estimation is required for project scheduling. As project scheduling is a nearly universal management activity, the range of possibilities is considerable. The INSITE concept also holds potential for enhancing other management tasks, especially in areas such as cost estimation, where estimation-by-analogy is already a proven method.

  11. MO-E-9A-01: Risk Based Quality Management: TG100 In Action

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Huq, M; Palta, J; Dunscombe, P

    2014-06-15

    One of the goals of quality management in radiation therapy is to gain high confidence that patients will receive the prescribed treatment correctly. To accomplish this, professional societies such as the American Association of Physicists in Medicine (AAPM) have published many quality assurance (QA), quality control (QC), and quality management (QM) guidance documents. In general, the recommendations provided in these documents have emphasized performing device-specific QA at the expense of process flow and protection of the patient against catastrophic errors. Analyses of radiation therapy incidents find that they are most often caused by flaws in the overall therapy process, from initial consult through final treatment, rather than by isolated hardware or computer failures detectable by traditional physics QA. This challenge is shared by many intrinsically hazardous industries. Risk assessment tools and analysis techniques have been developed to define, identify, and eliminate known and/or potential failures, problems, or errors from a system, process and/or service before they reach the customer. These include, but are not limited to, process mapping, failure modes and effects analysis (FMEA), fault tree analysis (FTA), and the establishment of a quality management program that best avoids the faults and risks identified in the overall process. These tools can be easily adapted to radiation therapy practices because of their simplicity and effectiveness in providing efficient ways to enhance the safety and quality of treatment processes. Task Group 100 (TG100) of the AAPM has developed a risk-based quality management program that uses these tools. This session will be devoted to a discussion of these tools and how they can be used in a given radiotherapy clinic to develop a risk-based QM program. Learning Objectives: Learn how to design a process map for a radiotherapy process. Learn how to perform an FMEA analysis for a given process. Learn what fault tree analysis is all about. Learn how to design a quality management program based upon the information obtained from process mapping, FMEA and FTA.
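
    A minimal FMEA scoring sketch in the spirit of TG100 is shown below, ranking invented failure modes by the risk priority number (occurrence x severity x lack of detectability, each scored 1-10). The process steps and scores are illustrative, not taken from the report.

      # Minimal FMEA scoring sketch: rank failure modes by the risk priority number
      # RPN = occurrence x severity x lack-of-detectability (1 = always detected,
      # 10 = never detected). Failure modes and scores below are invented examples.
      failure_modes = [
          {"step": "contouring",    "mode": "wrong target volume",   "O": 4, "S": 9, "D": 6},
          {"step": "plan transfer", "mode": "stale plan exported",   "O": 3, "S": 8, "D": 4},
          {"step": "patient setup", "mode": "wrong isocenter shift", "O": 5, "S": 7, "D": 3},
      ]
      for fm in failure_modes:
          fm["RPN"] = fm["O"] * fm["S"] * fm["D"]
      for fm in sorted(failure_modes, key=lambda f: f["RPN"], reverse=True):
          print(f'{fm["RPN"]:>4}  {fm["step"]:<14} {fm["mode"]}')

    Failure modes with the largest RPN would then be the first candidates for added checks or process redesign in the resulting quality management program.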

  12. An integrated science-based methodology to assess potential ...

    EPA Pesticide Factsheets

    There is an urgent need for broad and integrated studies that address the risks of engineered nanomaterials (ENMs) along the different endpoints of the society, environment, and economy (SEE) complex adaptive system. This article presents an integrated science-based methodology to assess the potential risks of engineered nanomaterials. To achieve the study objective, two major tasks are accomplished, knowledge synthesis and algorithmic computational methodology. The knowledge synthesis task is designed to capture “what is known” and to outline the gaps in knowledge from ENMs risk perspective. The algorithmic computational methodology is geared toward the provision of decisions and an understanding of the risks of ENMs along different endpoints for the constituents of the SEE complex adaptive system. The approach presented herein allows for addressing the formidable task of assessing the implications and risks of exposure to ENMs, with the long term goal to build a decision-support system to guide key stakeholders in the SEE system towards building sustainable ENMs and nano-enabled products. The following specific aims are formulated to achieve the study objective: (1) to propose a system of systems (SoS) architecture that builds a network management among the different entities in the large SEE system to track the flow of ENMs emission, fate and transport from the source to the receptor; (2) to establish a staged approach for knowledge synthesis methodo

  13. Finite element probabilistic risk assessment of transmission line insulation flashovers caused by lightning strokes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bacvarov, D.C.

    1981-01-01

    A new method for probabilistic risk assessment of transmission line insulation flashovers caused by lightning strokes is presented. The approach of applying the finite element method to probabilistic risk assessment is demonstrated to be very powerful, for two reasons. First, the finite element method is inherently suitable for the analysis of three-dimensional spaces in which the parameters, such as the trivariate probability densities of the lightning currents, are non-uniformly distributed. Second, the finite element method permits non-uniform discretization of the three-dimensional probability space, thus yielding high accuracy in critical regions, such as the area of low-probability events, while maintaining coarse discretization in non-critical areas to keep the number of grid points and the size of the problem at a manageably low level. The finite element probabilistic risk assessment method presented here is based on a new multidimensional search algorithm. It utilizes an efficient iterative technique for finite element interpolation of the transmission line insulation flashover criteria computed with an electromagnetic transients program. Compared to other available methods, the new finite element probabilistic risk assessment method is significantly more accurate and approximately two orders of magnitude more computationally efficient. The method is especially suited for accurate assessment of rare, very-low-probability events.
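
    As a one-dimensional illustration of the underlying idea, the sketch below integrates a lightning peak-current density over the region where a flashover criterion is exceeded, using a grid refined toward the low-probability tail. The distribution parameters and critical current are illustrative; the actual method works on a trivariate density with finite element interpolation of flashover criteria computed by an electromagnetic transients program.

      # 1-D illustration only: estimate the flashover probability by integrating a
      # lognormal lightning peak-current density over currents above a critical
      # value, on a log-spaced (non-uniform) grid concentrated in the tail.
      # Parameters are illustrative, not taken from the cited work.
      import numpy as np
      from scipy import stats

      median_kA, sigma_ln = 31.0, 0.66          # commonly quoted first-stroke values
      current = stats.lognorm(s=sigma_ln, scale=median_kA)

      i_crit = 120.0                            # illustrative critical current, kA
      grid = np.logspace(np.log10(i_crit), np.log10(2000.0), 400)   # tail-refined grid
      flashover_prob = np.trapz(current.pdf(grid), grid)

      print(f"grid estimate  P(I > {i_crit} kA) = {flashover_prob:.3e}")
      print(f"exact          P(I > {i_crit} kA) = {current.sf(i_crit):.3e}")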

  14. 2008 Homeland Security S and T Stakeholders Conference West-Volume 3 Tuesday

    DTIC Science & Technology

    2008-01-16

    Architecture ( PNNL SRS) • Online data collection / entry • Data Warehouse • On Demand Analysis and Reporting Tools • Reports, Charts & Graphs • Visual / Data...Sustainability 2007– 2016 Our region wide investment include all PANYNJ business areas Computer Statistical Analysis COMPSTAT •NYPD 1990’s •Personnel Management...Coast Guard, and public health Expertise, Depth, Agility Staff Degrees 6 Our Value Added Capabilities • Risk Analysis • Operations Analysis

  15. Data systems and computer science: Software Engineering Program

    NASA Technical Reports Server (NTRS)

    Zygielbaum, Arthur I.

    1991-01-01

    An external review of the Integrated Technology Plan for the Civil Space Program is presented. This review is specifically concerned with the Software Engineering Program. The goals of the Software Engineering Program are as follows: (1) improve NASA's ability to manage development, operation, and maintenance of complex software systems; (2) decrease NASA's cost and risk in engineering complex software systems; and (3) provide technology to assure safety and reliability of software in mission critical applications.

  16. Research on Building Education & Workforce Capacity in Systems Engineering

    DTIC Science & Technology

    2011-10-31

    product or prototype that addresses a real DoD need. Implemented as pilot courses in eight civilian and six military universities affiliated with...Engineering 1 1.1 Computer Engineering 1 1.1 Operations Research 1 1.1 Product Architecture 1 1.1 Total 93 100.0 Table 7: Breakdown of Student... product specifications, inattention to budget limits and safety issues, inattention to product life cycle, poor implementation of risk management plans

  17. 78 FR 68058 - Next Generation Risk Assessment: Incorporation of Recent Advances in Molecular, Computational...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-11-13

    ... Generation Risk Assessment: Incorporation of Recent Advances in Molecular, Computational, and Systems Biology... Generation Risk Assessment: Incorporation of Recent Advances in Molecular, Computational, and Systems Biology..., computational, and systems biology data can better inform risk assessment. This draft document is available for...

  18. Computing for Finance

    ScienceCinema

    None

    2018-01-24

    The finance sector is one of the driving forces for the use of distributed or Grid computing for business purposes. The speakers will review the state of the art of high performance computing in the financial sector, and provide insight into how different types of Grid computing – from local clusters to global networks – are being applied to financial applications. They will also describe the use of software and techniques from physics, such as Monte Carlo simulations, in the financial world. There will be four talks of 20 minutes each. The talk abstracts and speaker bios are listed below. This will be followed by a Q&A panel session with the speakers. From 19:00 onwards there will be a networking cocktail for audience and speakers. This is an EGEE / CERN openlab event organized in collaboration with the regional business network rezonance.ch. A webcast of the event will be made available for subsequent viewing, along with the PowerPoint material presented by the speakers. Attendance is free and open to all. Registration is mandatory via www.rezonance.ch, including for CERN staff.

    1. Overview of High Performance Computing in the Financial Industry. Michael Yoo, Managing Director, Head of the Technical Council, UBS. The presentation will describe the key business challenges driving the need for HPC solutions, describe the means by which those challenges are being addressed within UBS (such as GRID) as well as the limitations of some of these solutions, and assess some of the newer HPC technologies which may also play a role in the financial industry in the future. Speaker Bio: Michael originally joined the former Swiss Bank Corporation in 1994 in New York as a developer on a large data warehouse project. In 1996 he left SBC and took a role with Fidelity Investments in Boston. Unable to stay away for long, he returned to SBC in 1997 while working for Perot Systems in Singapore. Finally, in 1998 he formally returned to UBS in Stamford following the merger with SBC and has remained with UBS for the past 9 years. During his tenure at UBS, he has had a number of leadership roles within IT in development, support and architecture. In 2006 Michael relocated to Switzerland to take up his current role as head of the UBS IB Technical Council, responsible for the overall technology strategy and vision of the Investment Bank. One of Michael's key responsibilities is to manage the UBS High Performance Computing Research Lab, and he has been involved in a number of initiatives in the HPC space.

    2. Grid in the Commercial World. Fred Gedling, Chief Technology Officer EMEA and Senior Vice President Global Services, DataSynapse. Grid computing gets mentions in the press for community programs, starting last decade with Seti@Home. Government, national and supranational initiatives in grid receive some press. One of the IT industry's best-kept secrets is the use of grid computing by commercial organizations, with spectacular results. The talk discusses Grid computing and its evolution into Application Virtualization, and how this is key to the next-generation data center. Speaker Bio: Fred Gedling holds the joint roles of Chief Technology Officer for EMEA and Senior Vice President of Global Services at DataSynapse, a global provider of application virtualisation software. Based in London and working closely with organisations seeking to optimise their IT infrastructures, Fred offers unique insights into the technology of virtualisation as well as the methodology of establishing ROI and rapid deployment to the immediate advantage of the business. Fred has more than fifteen years' experience of enterprise middleware and high-performance infrastructures. Prior to DataSynapse he worked in high-performance CRM middleware and was the CTO EMEA for New Era of Networks (NEON) during the rapid growth of Enterprise Application Integration. His 25-year career in technology also includes management positions at Goldman Sachs and Stratus Computer. Fred holds a First Class BSc (Hons) degree in Physics with Astrophysics from the University of Leeds and had the privilege of being a summer student at CERN.

    3. Opportunities for gLite in finance and related industries. Adam Vile, Head of Grid, HPC and Technical Computing, Excelian Ltd. gLite, the Grid software developed by the EGEE project, has been exceedingly successful as an enabling infrastructure, and has been a massive success in bringing together scientific and technical communities to provide the compute power to address previously incomputable problems. Not so in the finance industry. In its current form gLite would be a business disabler. There are other middleware tools that solve the finance community's compute problems much better. Things are moving on, however. There are moves afoot in the open source community to evolve the technology to address other, more sophisticated needs such as utility and interactive computing. In this talk, I will describe how Excelian is providing Grid consultancy services for the finance community and how, through its relationship with the EGEE project, Excelian is helping to identify and exploit opportunities as the research and business worlds converge. Because of the strong third-party presence in the finance industry, such opportunities are few and far between, but they are there, especially as we expand sideways into related verticals such as the smaller hedge funds and energy companies. This talk will give an overview of the barriers to adoption of gLite in the finance industry and highlight some of the opportunities offered in this and related industries as the ideas around Grid mature. Speaker Bio: Dr Adam Vile is a senior consultant and head of the Grid and HPC practice at Excelian, a consultancy that focuses on financial markets professional services. He has spent many years in investment banking, as a developer, project manager and architect in both front and back office. Before joining Excelian he was senior Grid and HPC architect at Barclays Capital. Prior to joining investment banking, Adam spent a number of years lecturing in IT and mathematics at a UK university, and maintains links with academia through lectures, research, and the validation and steering of postgraduate courses. He is a chartered mathematician and was the conference chair of the Institute of Mathematics and its Applications' first conference in computational finance.

    4. From Monte Carlo to Wall Street. Daniel Egloff, Head of Financial Engineering Computing Unit, Zürich Cantonal Bank. High performance computing techniques provide new means to solve computationally hard problems in the financial services industry. First I consider Monte Carlo simulation and illustrate how it can be used to implement a sophisticated credit risk management and economic capital framework. From an HPC perspective, basic Monte Carlo simulation is embarrassingly parallel and can be implemented efficiently on distributed memory clusters. Additional difficulties arise for adaptive variance reduction schemes, if the information content in a sample is very small, and if the amount of simulated data becomes so huge that incremental processing algorithms are indispensable. We discuss the business value of advanced credit risk quantification, which is particularly compelling these days. While Monte Carlo simulation is a very versatile tool, it is not always the preferred solution for pricing complex products like multi-asset options, structured products, or credit derivatives. As a second application I show how operator methods can be used to develop a pricing framework. The scalability of operator methods relies heavily on optimized dense matrix-matrix multiplications and requires high-performance BLAS level-3 implementations, such as those provided by specialized FPGA or GPU boards. Speaker Bio: Daniel Egloff studied mathematics, theoretical physics, and computer science at the University of Zurich and ETH Zurich. He holds a PhD in Mathematics from the University of Fribourg, Switzerland. After his PhD he started to work for a large Swiss insurance company in the area of asset and liability management. He continued his professional career in the consulting industry. At KPMG and Arthur Andersen he advised international clients and implemented quantitative risk management solutions for financial institutions and insurance companies. In 2002 he joined Zurich Cantonal Bank, where he was assigned to develop and implement credit portfolio risk and economic capital methodologies. He built up a competence center for high performance and cluster computing. Currently, Daniel Egloff is heading the Financial Computing unit in the ZKB Financial Engineering division. He and his team engineer and operate high-performance cluster applications for computationally intensive problems in financial risk management.
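
    The abstract above notes that basic Monte Carlo simulation is embarrassingly parallel. As a rough illustration only (not the speakers' code), the following Python sketch distributes independent scenario batches of a one-factor Gaussian copula credit portfolio model across local worker processes and reads off an economic capital figure as a loss quantile. The model, its parameters, and all names are assumptions chosen for the example; a production grid would farm the batches out to cluster nodes rather than local cores.

        import numpy as np
        from multiprocessing import Pool
        from statistics import NormalDist

        # Hypothetical portfolio and model parameters (assumptions for illustration only).
        N_OBLIGORS = 500                              # number of obligors in the portfolio
        PD, RHO, LGD, EAD = 0.02, 0.20, 0.45, 1.0     # default prob., correlation, LGD, exposure
        DEFAULT_THRESHOLD = NormalDist().inv_cdf(PD)  # asset-value barrier implied by the PD

        def simulate_batch(args):
            """One independent batch of loss scenarios; batches share no state, hence trivially parallel."""
            n_scenarios, seed = args
            rng = np.random.default_rng(seed)
            z = rng.standard_normal((n_scenarios, 1))             # systematic factor, one per scenario
            eps = rng.standard_normal((n_scenarios, N_OBLIGORS))  # idiosyncratic shocks
            assets = np.sqrt(RHO) * z + np.sqrt(1.0 - RHO) * eps  # one-factor Gaussian copula
            return (assets < DEFAULT_THRESHOLD).sum(axis=1) * LGD * EAD  # portfolio loss per scenario

        if __name__ == "__main__":
            batches = [(10_000, seed) for seed in range(8)]       # 8 independent work units
            with Pool() as pool:                                  # local cores here; a grid scheduler
                losses = np.concatenate(pool.map(simulate_batch, batches))  # would distribute the batches
            el = losses.mean()
            var_999 = np.quantile(losses, 0.999)
            print(f"expected loss            : {el:.1f}")
            print(f"99.9% loss quantile (VaR): {var_999:.1f}")
            print(f"economic capital (VaR-EL): {var_999 - el:.1f}")

    Because the batches are statistically independent and share nothing, adding more workers (or more cluster nodes) scales the scenario count almost linearly, which is the sense in which the problem is embarrassingly parallel.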

  19. Computing for Finance

    ScienceCinema

    None

    2018-06-20

    The finance sector is one of the driving forces for the use of distributed or Grid computing for business purposes. The speakers will review the state of the art of high performance computing in the financial sector, and provide insight into how different types of Grid computing – from local clusters to global networks – are being applied to financial applications. They will also describe the use of software and techniques from physics, such as Monte Carlo simulations, in the financial world. There will be four talks of 20 minutes each. The talk abstracts and speaker bios are listed below. This will be followed by a Q&A panel session with the speakers. From 19:00 onwards there will be a networking cocktail for audience and speakers. This is an EGEE / CERN openlab event organized in collaboration with the regional business network rezonance.ch. A webcast of the event will be made available for subsequent viewing, along with the PowerPoint material presented by the speakers. Attendance is free and open to all. Registration is mandatory via www.rezonance.ch, including for CERN staff.

    1. Overview of High Performance Computing in the Financial Industry. Michael Yoo, Managing Director, Head of the Technical Council, UBS. The presentation will describe the key business challenges driving the need for HPC solutions, describe the means by which those challenges are being addressed within UBS (such as GRID) as well as the limitations of some of these solutions, and assess some of the newer HPC technologies which may also play a role in the financial industry in the future. Speaker Bio: Michael originally joined the former Swiss Bank Corporation in 1994 in New York as a developer on a large data warehouse project. In 1996 he left SBC and took a role with Fidelity Investments in Boston. Unable to stay away for long, he returned to SBC in 1997 while working for Perot Systems in Singapore. Finally, in 1998 he formally returned to UBS in Stamford following the merger with SBC and has remained with UBS for the past 9 years. During his tenure at UBS, he has had a number of leadership roles within IT in development, support and architecture. In 2006 Michael relocated to Switzerland to take up his current role as head of the UBS IB Technical Council, responsible for the overall technology strategy and vision of the Investment Bank. One of Michael's key responsibilities is to manage the UBS High Performance Computing Research Lab, and he has been involved in a number of initiatives in the HPC space.

    2. Grid in the Commercial World. Fred Gedling, Chief Technology Officer EMEA and Senior Vice President Global Services, DataSynapse. Grid computing gets mentions in the press for community programs, starting last decade with "Seti@Home". Government, national and supranational initiatives in grid receive some press. One of the IT industry's best-kept secrets is the use of grid computing by commercial organizations, with spectacular results. The talk discusses Grid computing and its evolution into Application Virtualization, and how this is key to the next-generation data center. Speaker Bio: Fred Gedling holds the joint roles of Chief Technology Officer for EMEA and Senior Vice President of Global Services at DataSynapse, a global provider of application virtualisation software. Based in London and working closely with organisations seeking to optimise their IT infrastructures, Fred offers unique insights into the technology of virtualisation as well as the methodology of establishing ROI and rapid deployment to the immediate advantage of the business. Fred has more than fifteen years' experience of enterprise middleware and high-performance infrastructures. Prior to DataSynapse he worked in high-performance CRM middleware and was the CTO EMEA for New Era of Networks (NEON) during the rapid growth of Enterprise Application Integration. His 25-year career in technology also includes management positions at Goldman Sachs and Stratus Computer. Fred holds a First Class BSc (Hons) degree in Physics with Astrophysics from the University of Leeds and had the privilege of being a summer student at CERN.

    3. Opportunities for gLite in finance and related industries. Adam Vile, Head of Grid, HPC and Technical Computing, Excelian Ltd. gLite, the Grid software developed by the EGEE project, has been exceedingly successful as an enabling infrastructure, and has been a massive success in bringing together scientific and technical communities to provide the compute power to address previously incomputable problems. Not so in the finance industry. In its current form gLite would be a business disabler. There are other middleware tools that solve the finance community's compute problems much better. Things are moving on, however. There are moves afoot in the open source community to evolve the technology to address other, more sophisticated needs such as utility and interactive computing. In this talk, I will describe how Excelian is providing Grid consultancy services for the finance community and how, through its relationship with the EGEE project, Excelian is helping to identify and exploit opportunities as the research and business worlds converge. Because of the strong third-party presence in the finance industry, such opportunities are few and far between, but they are there, especially as we expand sideways into related verticals such as the smaller hedge funds and energy companies. This talk will give an overview of the barriers to adoption of gLite in the finance industry and highlight some of the opportunities offered in this and related industries as the ideas around Grid mature. Speaker Bio: Dr Adam Vile is a senior consultant and head of the Grid and HPC practice at Excelian, a consultancy that focuses on financial markets professional services. He has spent many years in investment banking, as a developer, project manager and architect in both front and back office. Before joining Excelian he was senior Grid and HPC architect at Barclays Capital. Prior to joining investment banking, Adam spent a number of years lecturing in IT and mathematics at a UK university, and maintains links with academia through lectures, research, and the validation and steering of postgraduate courses. He is a chartered mathematician and was the conference chair of the Institute of Mathematics and its Applications' first conference in computational finance.

    4. From Monte Carlo to Wall Street. Daniel Egloff, Head of Financial Engineering Computing Unit, Zürich Cantonal Bank. High performance computing techniques provide new means to solve computationally hard problems in the financial services industry. First I consider Monte Carlo simulation and illustrate how it can be used to implement a sophisticated credit risk management and economic capital framework. From an HPC perspective, basic Monte Carlo simulation is embarrassingly parallel and can be implemented efficiently on distributed memory clusters. Additional difficulties arise for adaptive variance reduction schemes, if the information content in a sample is very small, and if the amount of simulated data becomes so huge that incremental processing algorithms are indispensable. We discuss the business value of advanced credit risk quantification, which is particularly compelling these days. While Monte Carlo simulation is a very versatile tool, it is not always the preferred solution for pricing complex products like multi-asset options, structured products, or credit derivatives. As a second application I show how operator methods can be used to develop a pricing framework. The scalability of operator methods relies heavily on optimized dense matrix-matrix multiplications and requires high-performance BLAS level-3 implementations, such as those provided by specialized FPGA or GPU boards. Speaker Bio: Daniel Egloff studied mathematics, theoretical physics, and computer science at the University of Zurich and ETH Zurich. He holds a PhD in Mathematics from the University of Fribourg, Switzerland. After his PhD he started to work for a large Swiss insurance company in the area of asset and liability management. He continued his professional career in the consulting industry. At KPMG and Arthur Andersen he advised international clients and implemented quantitative risk management solutions for financial institutions and insurance companies. In 2002 he joined Zurich Cantonal Bank, where he was assigned to develop and implement credit portfolio risk and economic capital methodologies. He built up a competence center for high performance and cluster computing. Currently, Daniel Egloff is heading the Financial Computing unit in the ZKB Financial Engineering division. He and his team engineer and operate high-performance cluster applications for computationally intensive problems in financial risk management.
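
    The abstract mentions that once the amount of simulated data becomes huge, incremental processing algorithms are indispensable. The talk does not say which algorithms were used; as one standard example, the sketch below uses Welford's single-pass update to accumulate the mean and variance of a loss stream without ever holding the full sample in memory (the loss distribution feeding it is an arbitrary assumption).

        import numpy as np

        class RunningMoments:
            """Single-pass mean/variance (Welford), so scenario chunks can be discarded after processing."""
            def __init__(self):
                self.n = 0
                self.mean = 0.0
                self.m2 = 0.0   # sum of squared deviations from the running mean

            def update(self, x: float) -> None:
                self.n += 1
                delta = x - self.mean
                self.mean += delta / self.n
                self.m2 += delta * (x - self.mean)

            @property
            def variance(self) -> float:
                return self.m2 / (self.n - 1) if self.n > 1 else float("nan")

        if __name__ == "__main__":
            rng = np.random.default_rng(0)
            stats = RunningMoments()
            for _ in range(100):                            # pretend each chunk arrives from a worker node
                for loss in rng.lognormal(mean=0.0, sigma=1.0, size=10_000):
                    stats.update(float(loss))               # no need to hold the full sample in memory
            print(f"streaming mean = {stats.mean:.4f}, std = {stats.variance ** 0.5:.4f}")

    Running moments alone do not give the tail quantiles needed for economic capital; streaming quantile estimators would be the analogous incremental tool for that, but they are beyond this short sketch.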

  20. Computing for Finance

    ScienceCinema

    None

    2018-01-25

    The finance sector is one of the driving forces for the use of distributed or Grid computing for business purposes. The speakers will review the state of the art of high performance computing in the financial sector, and provide insight into how different types of Grid computing – from local clusters to global networks – are being applied to financial applications. They will also describe the use of software and techniques from physics, such as Monte Carlo simulations, in the financial world. There will be four talks of 20 minutes each. The talk abstracts and speaker bios are listed below. This will be followed by a Q&A panel session with the speakers. From 19:00 onwards there will be a networking cocktail for audience and speakers. This is an EGEE / CERN openlab event organized in collaboration with the regional business network rezonance.ch. A webcast of the event will be made available for subsequent viewing, along with the PowerPoint material presented by the speakers. Attendance is free and open to all. Registration is mandatory via www.rezonance.ch, including for CERN staff.

    1. Overview of High Performance Computing in the Financial Industry. Michael Yoo, Managing Director, Head of the Technical Council, UBS. The presentation will describe the key business challenges driving the need for HPC solutions, describe the means by which those challenges are being addressed within UBS (such as GRID) as well as the limitations of some of these solutions, and assess some of the newer HPC technologies which may also play a role in the financial industry in the future. Speaker Bio: Michael originally joined the former Swiss Bank Corporation in 1994 in New York as a developer on a large data warehouse project. In 1996 he left SBC and took a role with Fidelity Investments in Boston. Unable to stay away for long, he returned to SBC in 1997 while working for Perot Systems in Singapore. Finally, in 1998 he formally returned to UBS in Stamford following the merger with SBC and has remained with UBS for the past 9 years. During his tenure at UBS, he has had a number of leadership roles within IT in development, support and architecture. In 2006 Michael relocated to Switzerland to take up his current role as head of the UBS IB Technical Council, responsible for the overall technology strategy and vision of the Investment Bank. One of Michael's key responsibilities is to manage the UBS High Performance Computing Research Lab, and he has been involved in a number of initiatives in the HPC space.

    2. Grid in the Commercial World. Fred Gedling, Chief Technology Officer EMEA and Senior Vice President Global Services, DataSynapse. Grid computing gets mentions in the press for community programs, starting last decade with Seti@Home. Government, national and supranational initiatives in grid receive some press. One of the IT industry's best-kept secrets is the use of grid computing by commercial organizations, with spectacular results. The talk discusses Grid computing and its evolution into Application Virtualization, and how this is key to the next-generation data center. Speaker Bio: Fred Gedling holds the joint roles of Chief Technology Officer for EMEA and Senior Vice President of Global Services at DataSynapse, a global provider of application virtualisation software. Based in London and working closely with organisations seeking to optimise their IT infrastructures, Fred offers unique insights into the technology of virtualisation as well as the methodology of establishing ROI and rapid deployment to the immediate advantage of the business. Fred has more than fifteen years' experience of enterprise middleware and high-performance infrastructures. Prior to DataSynapse he worked in high-performance CRM middleware and was the CTO EMEA for New Era of Networks (NEON) during the rapid growth of Enterprise Application Integration. His 25-year career in technology also includes management positions at Goldman Sachs and Stratus Computer. Fred holds a First Class BSc (Hons) degree in Physics with Astrophysics from the University of Leeds and had the privilege of being a summer student at CERN.

    3. Opportunities for gLite in finance and related industries. Adam Vile, Head of Grid, HPC and Technical Computing, Excelian Ltd. gLite, the Grid software developed by the EGEE project, has been exceedingly successful as an enabling infrastructure, and has been a massive success in bringing together scientific and technical communities to provide the compute power to address previously incomputable problems. Not so in the finance industry. In its current form gLite would be a business disabler. There are other middleware tools that solve the finance community's compute problems much better. Things are moving on, however. There are moves afoot in the open source community to evolve the technology to address other, more sophisticated needs such as utility and interactive computing. In this talk, I will describe how Excelian is providing Grid consultancy services for the finance community and how, through its relationship with the EGEE project, Excelian is helping to identify and exploit opportunities as the research and business worlds converge. Because of the strong third-party presence in the finance industry, such opportunities are few and far between, but they are there, especially as we expand sideways into related verticals such as the smaller hedge funds and energy companies. This talk will give an overview of the barriers to adoption of gLite in the finance industry and highlight some of the opportunities offered in this and related industries as the ideas around Grid mature. Speaker Bio: Dr Adam Vile is a senior consultant and head of the Grid and HPC practice at Excelian, a consultancy that focuses on financial markets professional services. He has spent many years in investment banking, as a developer, project manager and architect in both front and back office. Before joining Excelian he was senior Grid and HPC architect at Barclays Capital. Prior to joining investment banking, Adam spent a number of years lecturing in IT and mathematics at a UK university, and maintains links with academia through lectures, research, and the validation and steering of postgraduate courses. He is a chartered mathematician and was the conference chair of the Institute of Mathematics and its Applications' first conference in computational finance.

    4. From Monte Carlo to Wall Street. Daniel Egloff, Head of Financial Engineering Computing Unit, Zürich Cantonal Bank. High performance computing techniques provide new means to solve computationally hard problems in the financial services industry. First I consider Monte Carlo simulation and illustrate how it can be used to implement a sophisticated credit risk management and economic capital framework. From an HPC perspective, basic Monte Carlo simulation is embarrassingly parallel and can be implemented efficiently on distributed memory clusters. Additional difficulties arise for adaptive variance reduction schemes, if the information content in a sample is very small, and if the amount of simulated data becomes so huge that incremental processing algorithms are indispensable. We discuss the business value of advanced credit risk quantification, which is particularly compelling these days. While Monte Carlo simulation is a very versatile tool, it is not always the preferred solution for pricing complex products like multi-asset options, structured products, or credit derivatives. As a second application I show how operator methods can be used to develop a pricing framework. The scalability of operator methods relies heavily on optimized dense matrix-matrix multiplications and requires high-performance BLAS level-3 implementations, such as those provided by specialized FPGA or GPU boards. Speaker Bio: Daniel Egloff studied mathematics, theoretical physics, and computer science at the University of Zurich and ETH Zurich. He holds a PhD in Mathematics from the University of Fribourg, Switzerland. After his PhD he started to work for a large Swiss insurance company in the area of asset and liability management. He continued his professional career in the consulting industry. At KPMG and Arthur Andersen he advised international clients and implemented quantitative risk management solutions for financial institutions and insurance companies. In 2002 he joined Zurich Cantonal Bank, where he was assigned to develop and implement credit portfolio risk and economic capital methodologies. He built up a competence center for high performance and cluster computing. Currently, Daniel Egloff is heading the Financial Computing unit in the ZKB Financial Engineering division. He and his team engineer and operate high-performance cluster applications for computationally intensive problems in financial risk management.
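
    The abstract refers to adaptive variance reduction schemes. Those are beyond a short sketch, but the basic motivation can be shown with a simple non-adaptive technique: antithetic variates for a Black-Scholes call. The sketch below is illustrative only; the instrument, parameters, and sample size are assumptions and it is not the pricing machinery described in the talk.

        import numpy as np

        # Assumed contract and market parameters (illustration only).
        S0, K, r, sigma, T = 100.0, 105.0, 0.02, 0.25, 1.0
        N = 200_000
        rng = np.random.default_rng(42)

        def discounted_payoff(z):
            """Discounted European call payoff for standard normal draws z."""
            st = S0 * np.exp((r - 0.5 * sigma ** 2) * T + sigma * np.sqrt(T) * z)
            return np.exp(-r * T) * np.maximum(st - K, 0.0)

        z = rng.standard_normal(N)
        plain = discounted_payoff(z)
        antithetic = 0.5 * (plain + discounted_payoff(-z))   # pair each draw with its mirror image

        print(f"plain MC      : {plain.mean():.4f} +/- {plain.std(ddof=1) / np.sqrt(N):.4f}")
        print(f"antithetic MC : {antithetic.mean():.4f} +/- {antithetic.std(ddof=1) / np.sqrt(N):.4f}")

    The antithetic estimator typically reports a noticeably smaller standard error for the same number of random draws; adaptive schemes pursue the same goal but adjust the sampling distribution on the fly, which is where the extra HPC difficulty mentioned above comes from.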

  1. Computing for Finance

    ScienceCinema

    None

    2018-02-02

    The finance sector is one of the driving forces for the use of distributed or Grid computing for business purposes. The speakers will review the state of the art of high performance computing in the financial sector, and provide insight into how different types of Grid computing – from local clusters to global networks – are being applied to financial applications. They will also describe the use of software and techniques from physics, such as Monte Carlo simulations, in the financial world. There will be four talks of 20 minutes each. The talk abstracts and speaker bios are listed below. This will be followed by a Q&A panel session with the speakers. From 19:00 onwards there will be a networking cocktail for audience and speakers. This is an EGEE / CERN openlab event organized in collaboration with the regional business network rezonance.ch. A webcast of the event will be made available for subsequent viewing, along with the PowerPoint material presented by the speakers. Attendance is free and open to all. Registration is mandatory via www.rezonance.ch, including for CERN staff.

    1. Overview of High Performance Computing in the Financial Industry. Michael Yoo, Managing Director, Head of the Technical Council, UBS. The presentation will describe the key business challenges driving the need for HPC solutions, describe the means by which those challenges are being addressed within UBS (such as GRID) as well as the limitations of some of these solutions, and assess some of the newer HPC technologies which may also play a role in the financial industry in the future. Speaker Bio: Michael originally joined the former Swiss Bank Corporation in 1994 in New York as a developer on a large data warehouse project. In 1996 he left SBC and took a role with Fidelity Investments in Boston. Unable to stay away for long, he returned to SBC in 1997 while working for Perot Systems in Singapore. Finally, in 1998 he formally returned to UBS in Stamford following the merger with SBC and has remained with UBS for the past 9 years. During his tenure at UBS, he has had a number of leadership roles within IT in development, support and architecture. In 2006 Michael relocated to Switzerland to take up his current role as head of the UBS IB Technical Council, responsible for the overall technology strategy and vision of the Investment Bank. One of Michael's key responsibilities is to manage the UBS High Performance Computing Research Lab, and he has been involved in a number of initiatives in the HPC space.

    2. Grid in the Commercial World. Fred Gedling, Chief Technology Officer EMEA and Senior Vice President Global Services, DataSynapse. Grid computing gets mentions in the press for community programs, starting last decade with Seti@Home. Government, national and supranational initiatives in grid receive some press. One of the IT industry's best-kept secrets is the use of grid computing by commercial organizations, with spectacular results. The talk discusses Grid computing and its evolution into Application Virtualization, and how this is key to the next-generation data center. Speaker Bio: Fred Gedling holds the joint roles of Chief Technology Officer for EMEA and Senior Vice President of Global Services at DataSynapse, a global provider of application virtualisation software. Based in London and working closely with organisations seeking to optimise their IT infrastructures, Fred offers unique insights into the technology of virtualisation as well as the methodology of establishing ROI and rapid deployment to the immediate advantage of the business. Fred has more than fifteen years' experience of enterprise middleware and high-performance infrastructures. Prior to DataSynapse he worked in high-performance CRM middleware and was the CTO EMEA for New Era of Networks (NEON) during the rapid growth of Enterprise Application Integration. His 25-year career in technology also includes management positions at Goldman Sachs and Stratus Computer. Fred holds a First Class BSc (Hons) degree in Physics with Astrophysics from the University of Leeds and had the privilege of being a summer student at CERN.

    3. Opportunities for gLite in finance and related industries. Adam Vile, Head of Grid, HPC and Technical Computing, Excelian Ltd. gLite, the Grid software developed by the EGEE project, has been exceedingly successful as an enabling infrastructure, and has been a massive success in bringing together scientific and technical communities to provide the compute power to address previously incomputable problems. Not so in the finance industry. In its current form gLite would be a business disabler. There are other middleware tools that solve the finance community's compute problems much better. Things are moving on, however. There are moves afoot in the open source community to evolve the technology to address other, more sophisticated needs such as utility and interactive computing. In this talk, I will describe how Excelian is providing Grid consultancy services for the finance community and how, through its relationship with the EGEE project, Excelian is helping to identify and exploit opportunities as the research and business worlds converge. Because of the strong third-party presence in the finance industry, such opportunities are few and far between, but they are there, especially as we expand sideways into related verticals such as the smaller hedge funds and energy companies. This talk will give an overview of the barriers to adoption of gLite in the finance industry and highlight some of the opportunities offered in this and related industries as the ideas around Grid mature. Speaker Bio: Dr Adam Vile is a senior consultant and head of the Grid and HPC practice at Excelian, a consultancy that focuses on financial markets professional services. He has spent many years in investment banking, as a developer, project manager and architect in both front and back office. Before joining Excelian he was senior Grid and HPC architect at Barclays Capital. Prior to joining investment banking, Adam spent a number of years lecturing in IT and mathematics at a UK university, and maintains links with academia through lectures, research, and the validation and steering of postgraduate courses. He is a chartered mathematician and was the conference chair of the Institute of Mathematics and its Applications' first conference in computational finance.

    4. From Monte Carlo to Wall Street. Daniel Egloff, Head of Financial Engineering Computing Unit, Zürich Cantonal Bank. High performance computing techniques provide new means to solve computationally hard problems in the financial services industry. First I consider Monte Carlo simulation and illustrate how it can be used to implement a sophisticated credit risk management and economic capital framework. From an HPC perspective, basic Monte Carlo simulation is embarrassingly parallel and can be implemented efficiently on distributed memory clusters. Additional difficulties arise for adaptive variance reduction schemes, if the information content in a sample is very small, and if the amount of simulated data becomes so huge that incremental processing algorithms are indispensable. We discuss the business value of advanced credit risk quantification, which is particularly compelling these days. While Monte Carlo simulation is a very versatile tool, it is not always the preferred solution for pricing complex products like multi-asset options, structured products, or credit derivatives. As a second application I show how operator methods can be used to develop a pricing framework. The scalability of operator methods relies heavily on optimized dense matrix-matrix multiplications and requires high-performance BLAS level-3 implementations, such as those provided by specialized FPGA or GPU boards. Speaker Bio: Daniel Egloff studied mathematics, theoretical physics, and computer science at the University of Zurich and ETH Zurich. He holds a PhD in Mathematics from the University of Fribourg, Switzerland. After his PhD he started to work for a large Swiss insurance company in the area of asset and liability management. He continued his professional career in the consulting industry. At KPMG and Arthur Andersen he advised international clients and implemented quantitative risk management solutions for financial institutions and insurance companies. In 2002 he joined Zurich Cantonal Bank, where he was assigned to develop and implement credit portfolio risk and economic capital methodologies. He built up a competence center for high performance and cluster computing. Currently, Daniel Egloff is heading the Financial Computing unit in the ZKB Financial Engineering division. He and his team engineer and operate high-performance cluster applications for computationally intensive problems in financial risk management.
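
    The abstract states that the scalability of operator methods rests on optimized dense matrix-matrix multiplications (BLAS level-3). The sketch below illustrates that point with a plain explicit finite-difference propagator for the Black-Scholes equation applied to a whole book of strikes at once, so every time step is a single GEMM. It is a simplified stand-in, not the operator method of the talk, and the grid, boundary handling, and parameters are assumptions.

        import numpy as np

        # Assumed market and grid parameters (illustration only).
        r, sigma, T = 0.02, 0.25, 1.0
        n_space, n_steps, s_max = 200, 5_000, 300.0
        ds, dt = s_max / (n_space - 1), T / n_steps
        s = np.linspace(0.0, s_max, n_space)

        # Discretised Black-Scholes generator L on the interior of the price grid (central differences).
        L = np.zeros((n_space, n_space))
        for i in range(1, n_space - 1):
            a = 0.5 * sigma ** 2 * s[i] ** 2 / ds ** 2
            b = 0.5 * r * s[i] / ds
            L[i, i - 1] = a - b
            L[i, i] = -2.0 * a - r
            L[i, i + 1] = a + b

        step = np.eye(n_space) + dt * L          # one explicit Euler step backwards in time

        # One column per strike: the whole book is propagated together, so each step is a dense GEMM.
        strikes = np.linspace(60.0, 140.0, 512)
        V = np.maximum(s[:, None] - strikes[None, :], 0.0)     # terminal call payoffs
        for n in range(1, n_steps + 1):
            V = step @ V                                       # BLAS level-3 call (dgemm) does the work
            tau = n * dt                                       # time to maturity after this step
            V[0, :] = 0.0                                      # call is worthless at S = 0
            V[-1, :] = s_max - strikes * np.exp(-r * tau)      # crude deep in-the-money boundary

        k = int(np.argmin(np.abs(strikes - 100.0)))
        print(f"call price near K=100 at S=100: {np.interp(100.0, s, V[:, k]):.2f}")

    Because the cost is concentrated in the repeated step @ V products, throughput is governed almost entirely by how fast the underlying GEMM runs, which is why tuned BLAS level-3 libraries, GPUs, or FPGA boards pay off for this class of method.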

  2. Computing for Finance

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    None

    The finance sector is one of the driving forces for the use of distributed or Grid computing for business purposes. The speakers will review the state of the art of high performance computing in the financial sector, and provide insight into how different types of Grid computing – from local clusters to global networks – are being applied to financial applications. They will also describe the use of software and techniques from physics, such as Monte Carlo simulations, in the financial world. There will be four talks of 20 minutes each. The talk abstracts and speaker bios are listed below. This will be followed by a Q&A panel session with the speakers. From 19:00 onwards there will be a networking cocktail for audience and speakers. This is an EGEE / CERN openlab event organized in collaboration with the regional business network rezonance.ch. A webcast of the event will be made available for subsequent viewing, along with the PowerPoint material presented by the speakers. Attendance is free and open to all. Registration is mandatory via www.rezonance.ch, including for CERN staff.

    1. Overview of High Performance Computing in the Financial Industry. Michael Yoo, Managing Director, Head of the Technical Council, UBS. The presentation will describe the key business challenges driving the need for HPC solutions, describe the means by which those challenges are being addressed within UBS (such as GRID) as well as the limitations of some of these solutions, and assess some of the newer HPC technologies which may also play a role in the financial industry in the future. Speaker Bio: Michael originally joined the former Swiss Bank Corporation in 1994 in New York as a developer on a large data warehouse project. In 1996 he left SBC and took a role with Fidelity Investments in Boston. Unable to stay away for long, he returned to SBC in 1997 while working for Perot Systems in Singapore. Finally, in 1998 he formally returned to UBS in Stamford following the merger with SBC and has remained with UBS for the past 9 years. During his tenure at UBS, he has had a number of leadership roles within IT in development, support and architecture. In 2006 Michael relocated to Switzerland to take up his current role as head of the UBS IB Technical Council, responsible for the overall technology strategy and vision of the Investment Bank. One of Michael's key responsibilities is to manage the UBS High Performance Computing Research Lab, and he has been involved in a number of initiatives in the HPC space.

    2. Grid in the Commercial World. Fred Gedling, Chief Technology Officer EMEA and Senior Vice President Global Services, DataSynapse. Grid computing gets mentions in the press for community programs, starting last decade with Seti@Home. Government, national and supranational initiatives in grid receive some press. One of the IT industry's best-kept secrets is the use of grid computing by commercial organizations, with spectacular results. The talk discusses Grid computing and its evolution into Application Virtualization, and how this is key to the next-generation data center. Speaker Bio: Fred Gedling holds the joint roles of Chief Technology Officer for EMEA and Senior Vice President of Global Services at DataSynapse, a global provider of application virtualisation software. Based in London and working closely with organisations seeking to optimise their IT infrastructures, Fred offers unique insights into the technology of virtualisation as well as the methodology of establishing ROI and rapid deployment to the immediate advantage of the business. Fred has more than fifteen years' experience of enterprise middleware and high-performance infrastructures. Prior to DataSynapse he worked in high-performance CRM middleware and was the CTO EMEA for New Era of Networks (NEON) during the rapid growth of Enterprise Application Integration. His 25-year career in technology also includes management positions at Goldman Sachs and Stratus Computer. Fred holds a First Class BSc (Hons) degree in Physics with Astrophysics from the University of Leeds and had the privilege of being a summer student at CERN.

    3. Opportunities for gLite in finance and related industries. Adam Vile, Head of Grid, HPC and Technical Computing, Excelian Ltd. gLite, the Grid software developed by the EGEE project, has been exceedingly successful as an enabling infrastructure, and has been a massive success in bringing together scientific and technical communities to provide the compute power to address previously incomputable problems. Not so in the finance industry. In its current form gLite would be a business disabler. There are other middleware tools that solve the finance community's compute problems much better. Things are moving on, however. There are moves afoot in the open source community to evolve the technology to address other, more sophisticated needs such as utility and interactive computing. In this talk, I will describe how Excelian is providing Grid consultancy services for the finance community and how, through its relationship with the EGEE project, Excelian is helping to identify and exploit opportunities as the research and business worlds converge. Because of the strong third-party presence in the finance industry, such opportunities are few and far between, but they are there, especially as we expand sideways into related verticals such as the smaller hedge funds and energy companies. This talk will give an overview of the barriers to adoption of gLite in the finance industry and highlight some of the opportunities offered in this and related industries as the ideas around Grid mature. Speaker Bio: Dr Adam Vile is a senior consultant and head of the Grid and HPC practice at Excelian, a consultancy that focuses on financial markets professional services. He has spent many years in investment banking, as a developer, project manager and architect in both front and back office. Before joining Excelian he was senior Grid and HPC architect at Barclays Capital. Prior to joining investment banking, Adam spent a number of years lecturing in IT and mathematics at a UK university, and maintains links with academia through lectures, research, and the validation and steering of postgraduate courses. He is a chartered mathematician and was the conference chair of the Institute of Mathematics and its Applications' first conference in computational finance.

    4. From Monte Carlo to Wall Street. Daniel Egloff, Head of Financial Engineering Computing Unit, Zürich Cantonal Bank. High performance computing techniques provide new means to solve computationally hard problems in the financial services industry. First I consider Monte Carlo simulation and illustrate how it can be used to implement a sophisticated credit risk management and economic capital framework. From an HPC perspective, basic Monte Carlo simulation is embarrassingly parallel and can be implemented efficiently on distributed memory clusters. Additional difficulties arise for adaptive variance reduction schemes, if the information content in a sample is very small, and if the amount of simulated data becomes so huge that incremental processing algorithms are indispensable. We discuss the business value of advanced credit risk quantification, which is particularly compelling these days. While Monte Carlo simulation is a very versatile tool, it is not always the preferred solution for pricing complex products like multi-asset options, structured products, or credit derivatives. As a second application I show how operator methods can be used to develop a pricing framework. The scalability of operator methods relies heavily on optimized dense matrix-matrix multiplications and requires high-performance BLAS level-3 implementations, such as those provided by specialized FPGA or GPU boards. Speaker Bio: Daniel Egloff studied mathematics, theoretical physics, and computer science at the University of Zurich and ETH Zurich. He holds a PhD in Mathematics from the University of Fribourg, Switzerland. After his PhD he started to work for a large Swiss insurance company in the area of asset and liability management. He continued his professional career in the consulting industry. At KPMG and Arthur Andersen he advised international clients and implemented quantitative risk management solutions for financial institutions and insurance companies. In 2002 he joined Zurich Cantonal Bank, where he was assigned to develop and implement credit portfolio risk and economic capital methodologies. He built up a competence center for high performance and cluster computing. Currently, Daniel Egloff is heading the Financial Computing unit in the ZKB Financial Engineering division. He and his team engineer and operate high-performance cluster applications for computationally intensive problems in financial risk management.

  3. Computing for Finance

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    None

    The finance sector is one of the driving forces for the use of distributed or Grid computing for business purposes. The speakers will review the state of the art of high performance computing in the financial sector, and provide insight into how different types of Grid computing – from local clusters to global networks – are being applied to financial applications. They will also describe the use of software and techniques from physics, such as Monte Carlo simulations, in the financial world. There will be four talks of 20 minutes each. The talk abstracts and speaker bios are listed below. This will be followed by a Q&A panel session with the speakers. From 19:00 onwards there will be a networking cocktail for audience and speakers. This is an EGEE / CERN openlab event organized in collaboration with the regional business network rezonance.ch. A webcast of the event will be made available for subsequent viewing, along with the PowerPoint material presented by the speakers. Attendance is free and open to all. Registration is mandatory via www.rezonance.ch, including for CERN staff.

    1. Overview of High Performance Computing in the Financial Industry. Michael Yoo, Managing Director, Head of the Technical Council, UBS. The presentation will describe the key business challenges driving the need for HPC solutions, describe the means by which those challenges are being addressed within UBS (such as GRID) as well as the limitations of some of these solutions, and assess some of the newer HPC technologies which may also play a role in the financial industry in the future. Speaker Bio: Michael originally joined the former Swiss Bank Corporation in 1994 in New York as a developer on a large data warehouse project. In 1996 he left SBC and took a role with Fidelity Investments in Boston. Unable to stay away for long, he returned to SBC in 1997 while working for Perot Systems in Singapore. Finally, in 1998 he formally returned to UBS in Stamford following the merger with SBC and has remained with UBS for the past 9 years. During his tenure at UBS, he has had a number of leadership roles within IT in development, support and architecture. In 2006 Michael relocated to Switzerland to take up his current role as head of the UBS IB Technical Council, responsible for the overall technology strategy and vision of the Investment Bank. One of Michael's key responsibilities is to manage the UBS High Performance Computing Research Lab, and he has been involved in a number of initiatives in the HPC space.

    2. Grid in the Commercial World. Fred Gedling, Chief Technology Officer EMEA and Senior Vice President Global Services, DataSynapse. Grid computing gets mentions in the press for community programs, starting last decade with "Seti@Home". Government, national and supranational initiatives in grid receive some press. One of the IT industry's best-kept secrets is the use of grid computing by commercial organizations, with spectacular results. The talk discusses Grid computing and its evolution into Application Virtualization, and how this is key to the next-generation data center. Speaker Bio: Fred Gedling holds the joint roles of Chief Technology Officer for EMEA and Senior Vice President of Global Services at DataSynapse, a global provider of application virtualisation software. Based in London and working closely with organisations seeking to optimise their IT infrastructures, Fred offers unique insights into the technology of virtualisation as well as the methodology of establishing ROI and rapid deployment to the immediate advantage of the business. Fred has more than fifteen years' experience of enterprise middleware and high-performance infrastructures. Prior to DataSynapse he worked in high-performance CRM middleware and was the CTO EMEA for New Era of Networks (NEON) during the rapid growth of Enterprise Application Integration. His 25-year career in technology also includes management positions at Goldman Sachs and Stratus Computer. Fred holds a First Class BSc (Hons) degree in Physics with Astrophysics from the University of Leeds and had the privilege of being a summer student at CERN.

    3. Opportunities for gLite in finance and related industries. Adam Vile, Head of Grid, HPC and Technical Computing, Excelian Ltd. gLite, the Grid software developed by the EGEE project, has been exceedingly successful as an enabling infrastructure, and has been a massive success in bringing together scientific and technical communities to provide the compute power to address previously incomputable problems. Not so in the finance industry. In its current form gLite would be a business disabler. There are other middleware tools that solve the finance community's compute problems much better. Things are moving on, however. There are moves afoot in the open source community to evolve the technology to address other, more sophisticated needs such as utility and interactive computing. In this talk, I will describe how Excelian is providing Grid consultancy services for the finance community and how, through its relationship with the EGEE project, Excelian is helping to identify and exploit opportunities as the research and business worlds converge. Because of the strong third-party presence in the finance industry, such opportunities are few and far between, but they are there, especially as we expand sideways into related verticals such as the smaller hedge funds and energy companies. This talk will give an overview of the barriers to adoption of gLite in the finance industry and highlight some of the opportunities offered in this and related industries as the ideas around Grid mature. Speaker Bio: Dr Adam Vile is a senior consultant and head of the Grid and HPC practice at Excelian, a consultancy that focuses on financial markets professional services. He has spent many years in investment banking, as a developer, project manager and architect in both front and back office. Before joining Excelian he was senior Grid and HPC architect at Barclays Capital. Prior to joining investment banking, Adam spent a number of years lecturing in IT and mathematics at a UK university, and maintains links with academia through lectures, research, and the validation and steering of postgraduate courses. He is a chartered mathematician and was the conference chair of the Institute of Mathematics and its Applications' first conference in computational finance.

    4. From Monte Carlo to Wall Street. Daniel Egloff, Head of Financial Engineering Computing Unit, Zürich Cantonal Bank. High performance computing techniques provide new means to solve computationally hard problems in the financial services industry. First I consider Monte Carlo simulation and illustrate how it can be used to implement a sophisticated credit risk management and economic capital framework. From an HPC perspective, basic Monte Carlo simulation is embarrassingly parallel and can be implemented efficiently on distributed memory clusters. Additional difficulties arise for adaptive variance reduction schemes, if the information content in a sample is very small, and if the amount of simulated data becomes so huge that incremental processing algorithms are indispensable. We discuss the business value of advanced credit risk quantification, which is particularly compelling these days. While Monte Carlo simulation is a very versatile tool, it is not always the preferred solution for pricing complex products like multi-asset options, structured products, or credit derivatives. As a second application I show how operator methods can be used to develop a pricing framework. The scalability of operator methods relies heavily on optimized dense matrix-matrix multiplications and requires high-performance BLAS level-3 implementations, such as those provided by specialized FPGA or GPU boards. Speaker Bio: Daniel Egloff studied mathematics, theoretical physics, and computer science at the University of Zurich and ETH Zurich. He holds a PhD in Mathematics from the University of Fribourg, Switzerland. After his PhD he started to work for a large Swiss insurance company in the area of asset and liability management. He continued his professional career in the consulting industry. At KPMG and Arthur Andersen he advised international clients and implemented quantitative risk management solutions for financial institutions and insurance companies. In 2002 he joined Zurich Cantonal Bank, where he was assigned to develop and implement credit portfolio risk and economic capital methodologies. He built up a competence center for high performance and cluster computing. Currently, Daniel Egloff is heading the Financial Computing unit in the ZKB Financial Engineering division. He and his team engineer and operate high-performance cluster applications for computationally intensive problems in financial risk management.

  4. Computing for Finance

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    None

    The finance sector is one of the driving forces for the use of distributed or Grid computing for business purposes. The speakers will review the state of the art of high performance computing in the financial sector, and provide insight into how different types of Grid computing – from local clusters to global networks – are being applied to financial applications. They will also describe the use of software and techniques from physics, such as Monte Carlo simulations, in the financial world. There will be four talks of 20 minutes each. The talk abstracts and speaker bios are listed below. This will be followed by a Q&A panel session with the speakers. From 19:00 onwards there will be a networking cocktail for audience and speakers. This is an EGEE / CERN openlab event organized in collaboration with the regional business network rezonance.ch. A webcast of the event will be made available for subsequent viewing, along with the PowerPoint material presented by the speakers. Attendance is free and open to all. Registration is mandatory via www.rezonance.ch, including for CERN staff.

    1. Overview of High Performance Computing in the Financial Industry. Michael Yoo, Managing Director, Head of the Technical Council, UBS. The presentation will describe the key business challenges driving the need for HPC solutions, describe the means by which those challenges are being addressed within UBS (such as GRID) as well as the limitations of some of these solutions, and assess some of the newer HPC technologies which may also play a role in the financial industry in the future. Speaker Bio: Michael originally joined the former Swiss Bank Corporation in 1994 in New York as a developer on a large data warehouse project. In 1996 he left SBC and took a role with Fidelity Investments in Boston. Unable to stay away for long, he returned to SBC in 1997 while working for Perot Systems in Singapore. Finally, in 1998 he formally returned to UBS in Stamford following the merger with SBC and has remained with UBS for the past 9 years. During his tenure at UBS, he has had a number of leadership roles within IT in development, support and architecture. In 2006 Michael relocated to Switzerland to take up his current role as head of the UBS IB Technical Council, responsible for the overall technology strategy and vision of the Investment Bank. One of Michael's key responsibilities is to manage the UBS High Performance Computing Research Lab, and he has been involved in a number of initiatives in the HPC space.

    2. Grid in the Commercial World. Fred Gedling, Chief Technology Officer EMEA and Senior Vice President Global Services, DataSynapse. Grid computing gets mentions in the press for community programs, starting last decade with Seti@Home. Government, national and supranational initiatives in grid receive some press. One of the IT industry's best-kept secrets is the use of grid computing by commercial organizations, with spectacular results. The talk discusses Grid computing and its evolution into Application Virtualization, and how this is key to the next-generation data center. Speaker Bio: Fred Gedling holds the joint roles of Chief Technology Officer for EMEA and Senior Vice President of Global Services at DataSynapse, a global provider of application virtualisation software. Based in London and working closely with organisations seeking to optimise their IT infrastructures, Fred offers unique insights into the technology of virtualisation as well as the methodology of establishing ROI and rapid deployment to the immediate advantage of the business. Fred has more than fifteen years' experience of enterprise middleware and high-performance infrastructures. Prior to DataSynapse he worked in high-performance CRM middleware and was the CTO EMEA for New Era of Networks (NEON) during the rapid growth of Enterprise Application Integration. His 25-year career in technology also includes management positions at Goldman Sachs and Stratus Computer. Fred holds a First Class BSc (Hons) degree in Physics with Astrophysics from the University of Leeds and had the privilege of being a summer student at CERN.

    3. Opportunities for gLite in finance and related industries. Adam Vile, Head of Grid, HPC and Technical Computing, Excelian Ltd. gLite, the Grid software developed by the EGEE project, has been exceedingly successful as an enabling infrastructure, and has been a massive success in bringing together scientific and technical communities to provide the compute power to address previously incomputable problems. Not so in the finance industry. In its current form gLite would be a business disabler. There are other middleware tools that solve the finance community's compute problems much better. Things are moving on, however. There are moves afoot in the open source community to evolve the technology to address other, more sophisticated needs such as utility and interactive computing. In this talk, I will describe how Excelian is providing Grid consultancy services for the finance community and how, through its relationship with the EGEE project, Excelian is helping to identify and exploit opportunities as the research and business worlds converge. Because of the strong third-party presence in the finance industry, such opportunities are few and far between, but they are there, especially as we expand sideways into related verticals such as the smaller hedge funds and energy companies. This talk will give an overview of the barriers to adoption of gLite in the finance industry and highlight some of the opportunities offered in this and related industries as the ideas around Grid mature. Speaker Bio: Dr Adam Vile is a senior consultant and head of the Grid and HPC practice at Excelian, a consultancy that focuses on financial markets professional services. He has spent many years in investment banking, as a developer, project manager and architect in both front and back office. Before joining Excelian he was senior Grid and HPC architect at Barclays Capital. Prior to joining investment banking, Adam spent a number of years lecturing in IT and mathematics at a UK university, and maintains links with academia through lectures, research, and the validation and steering of postgraduate courses. He is a chartered mathematician and was the conference chair of the Institute of Mathematics and its Applications' first conference in computational finance.

    4. From Monte Carlo to Wall Street. Daniel Egloff, Head of Financial Engineering Computing Unit, Zürich Cantonal Bank. High performance computing techniques provide new means to solve computationally hard problems in the financial services industry. First I consider Monte Carlo simulation and illustrate how it can be used to implement a sophisticated credit risk management and economic capital framework. From an HPC perspective, basic Monte Carlo simulation is embarrassingly parallel and can be implemented efficiently on distributed memory clusters. Additional difficulties arise for adaptive variance reduction schemes, if the information content in a sample is very small, and if the amount of simulated data becomes so huge that incremental processing algorithms are indispensable. We discuss the business value of advanced credit risk quantification, which is particularly compelling these days. While Monte Carlo simulation is a very versatile tool, it is not always the preferred solution for pricing complex products like multi-asset options, structured products, or credit derivatives. As a second application I show how operator methods can be used to develop a pricing framework. The scalability of operator methods relies heavily on optimized dense matrix-matrix multiplications and requires high-performance BLAS level-3 implementations, such as those provided by specialized FPGA or GPU boards. Speaker Bio: Daniel Egloff studied mathematics, theoretical physics, and computer science at the University of Zurich and ETH Zurich. He holds a PhD in Mathematics from the University of Fribourg, Switzerland. After his PhD he started to work for a large Swiss insurance company in the area of asset and liability management. He continued his professional career in the consulting industry. At KPMG and Arthur Andersen he advised international clients and implemented quantitative risk management solutions for financial institutions and insurance companies. In 2002 he joined Zurich Cantonal Bank, where he was assigned to develop and implement credit portfolio risk and economic capital methodologies. He built up a competence center for high performance and cluster computing. Currently, Daniel Egloff is heading the Financial Computing unit in the ZKB Financial Engineering division. He and his team engineer and operate high-performance cluster applications for computationally intensive problems in financial risk management.

  5. Computing for Finance

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    None

    The finance sector is one of the driving forces for the use of distributed or Grid computing for business purposes. The speakers will review the state of the art of high performance computing in the financial sector and provide insight into how different types of Grid computing - from local clusters to global networks - are being applied to financial applications. They will also describe the use of software and techniques from physics, such as Monte Carlo simulations, in the financial world. There will be four talks of 20 minutes each; the talk abstracts and speaker bios are listed below. This will be followed by a Q&A panel session with the speakers. From 19:00 onwards there will be a networking cocktail for audience and speakers. This is an EGEE / CERN openlab event organized in collaboration with the regional business network rezonance.ch. A webcast of the event will be made available for subsequent viewing, along with PowerPoint material presented by the speakers. Attendance is free and open to all. Registration is mandatory via www.rezonance.ch, including for CERN staff.

    1. Overview of High Performance Computing in the Financial Industry - Michael Yoo, Managing Director, Head of the Technical Council, UBS. The presentation will describe the key business challenges driving the need for HPC solutions, describe the means by which those challenges are being addressed within UBS (such as Grid) as well as the limitations of some of these solutions, and assess some of the newer HPC technologies which may also play a role in the financial industry in the future. Speaker Bio: Michael originally joined the former Swiss Bank Corporation in 1994 in New York as a developer on a large data warehouse project. In 1996 he left SBC and took a role with Fidelity Investments in Boston. Unable to stay away for long, he returned to SBC in 1997 while working for Perot Systems in Singapore. Finally, in 1998 he formally returned to UBS in Stamford following the merger with SBC and has remained with UBS for the past 9 years. During his tenure at UBS, he has had a number of leadership roles within IT in development, support and architecture. In 2006 Michael relocated to Switzerland to take up his current role as head of the UBS IB Technical Council, responsible for the overall technology strategy and vision of the Investment Bank. One of Michael's key responsibilities is to manage the UBS High Performance Computing Research Lab, and he has been involved in a number of initiatives in the HPC space.

    2. Grid in the Commercial World - Fred Gedling, Chief Technology Officer EMEA and Senior Vice President Global Services, DataSynapse. Grid computing gets mentions in the press for community programs, starting last decade with Seti@Home; government, national and supranational initiatives in grid also receive some press. One of the IT industry's best-kept secrets is the use of grid computing by commercial organizations, with spectacular results. Grid computing and its evolution into application virtualization are discussed, along with how this is key to the next-generation data center. Speaker Bio: Fred Gedling holds the joint roles of Chief Technology Officer for EMEA and Senior Vice President of Global Services at DataSynapse, a global provider of application virtualisation software. Based in London and working closely with organisations seeking to optimise their IT infrastructures, Fred offers unique insights into the technology of virtualisation as well as the methodology of establishing ROI and rapid deployment to the immediate advantage of the business. Fred has more than fifteen years' experience of enterprise middleware and high-performance infrastructures. Prior to DataSynapse he worked in high-performance CRM middleware and was the CTO EMEA for New Era of Networks (NEON) during the rapid growth of Enterprise Application Integration. His 25-year career in technology also includes management positions at Goldman Sachs and Stratus Computer. Fred holds a First Class BSc (Hons) degree in Physics with Astrophysics from the University of Leeds and had the privilege of being a summer student at CERN.

    3. Opportunities for gLite in Finance and Related Industries - Adam Vile, Head of Grid, HPC and Technical Computing, Excelian Ltd. gLite, the Grid software developed by the EGEE project, has been exceedingly successful as an enabling infrastructure and has been a massive success in bringing together scientific and technical communities to provide the compute power to address previously incomputable problems. Not so in the finance industry. In its current form gLite would be a business disabler; there are other middleware tools that solve the finance community's compute problems much better. Things are moving on, however: there are moves afoot in the open source community to evolve the technology to address other, more sophisticated needs such as utility and interactive computing. In this talk, I will describe how Excelian is providing Grid consultancy services for the finance community and how, through its relationship with the EGEE project, Excelian is helping to identify and exploit opportunities as the research and business worlds converge. Because of the strong third-party presence in the finance industry, such opportunities are few and far between, but they are there, especially as we expand sideways into related verticals such as the smaller hedge funds and energy companies. This talk will give an overview of the barriers to adoption of gLite in the finance industry and highlight some of the opportunities offered in this and related industries as the ideas around Grid mature. Speaker Bio: Dr Adam Vile is a senior consultant and head of the Grid and HPC practice at Excelian, a consultancy that focuses on financial markets professional services. He has spent many years in investment banking, as a developer, project manager and architect in both front and back office. Before joining Excelian he was senior Grid and HPC architect at Barclays Capital. Prior to joining investment banking, Adam spent a number of years lecturing in IT and mathematics at a UK university and maintains links with academia through lectures, research, and validation and steering of postgraduate courses. He is a chartered mathematician and was the conference chair of the Institute of Mathematics and its Applications' first conference in Computational Finance.

    4. From Monte Carlo to Wall Street - Daniel Egloff, Head of Financial Engineering Computing Unit, Zürich Cantonal Bank. High performance computing techniques provide new means to solve computationally hard problems in the financial service industry. First I consider Monte Carlo simulation and illustrate how it can be used to implement a sophisticated credit risk management and economic capital framework. From an HPC perspective, basic Monte Carlo simulation is embarrassingly parallel and can be implemented efficiently on distributed-memory clusters. Additional difficulties arise with adaptive variance reduction schemes, when the information content in a sample is very small, and when the amount of simulated data becomes so large that incremental processing algorithms are indispensable. We discuss the business value of advanced credit risk quantification, which is particularly compelling these days. While Monte Carlo simulation is a very versatile tool, it is not always the preferred solution for pricing complex products like multi-asset options, structured products, or credit derivatives. As a second application I show how operator methods can be used to develop a pricing framework. The scalability of operator methods relies heavily on optimized dense matrix-matrix multiplications and requires BLAS level-3 implementations tuned for specialized FPGA or GPU boards. Speaker Bio: Daniel Egloff studied mathematics, theoretical physics, and computer science at the University of Zurich and ETH Zurich. He holds a PhD in Mathematics from the University of Fribourg, Switzerland. After his PhD he started to work for a large Swiss insurance company in the area of asset and liability management, and continued his professional career in the consulting industry. At KPMG and Arthur Andersen he advised international clients and implemented quantitative risk management solutions for financial institutions and insurance companies. In 2002 he joined Zurich Cantonal Bank, where he was assigned to develop and implement credit portfolio risk and economic capital methodologies, and he built up a competence center for high performance and cluster computing. Currently, Daniel Egloff heads the Financial Computing unit in the ZKB Financial Engineering division. He and his team engineer and operate high-performance cluster applications for computationally intensive problems in financial risk management.
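    The fourth talk ties the scalability of operator-based pricing to dense matrix-matrix products (BLAS level 3). The sketch below only illustrates why: propagating a block of payoff vectors through time with a dense one-step operator reduces to repeated GEMM calls. The random operator, sizes and step count are placeholders, not the talk's pricing framework.

```python
# Sketch of why operator-based pricing stresses BLAS level-3: repeatedly
# applying a dense time-step operator to a block of state vectors is a
# matrix-matrix product (GEMM). The operator here is a random placeholder,
# not the pricing framework described in the talk.
import numpy as np

n_states = 2_000      # size of the discretised state space (assumed)
n_products = 512      # many payoffs propagated together -> level-3 BLAS
n_steps = 50          # time steps to maturity

rng = np.random.default_rng(0)
# Placeholder one-step operator; a real framework would discretise the
# pricing (e.g. transition or generator) operator of the chosen model.
A = rng.standard_normal((n_states, n_states)) / np.sqrt(n_states)
V = rng.standard_normal((n_states, n_products))   # terminal payoff vectors

for _ in range(n_steps):
    V = A @ V   # dispatches to the optimized GEMM of the linked BLAS

print(V.shape)
# On a GPU the same pattern could run via CuPy (cupy.asarray(A), etc.), which
# offloads the GEMM to cuBLAS -- consistent with the talk's point that
# performance hinges on specialized level-3 BLAS implementations.
```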

  10. A sustainability model based on cloud infrastructures for core and downstream Copernicus services

    NASA Astrophysics Data System (ADS)

    Manunta, Michele; Calò, Fabiana; De Luca, Claudio; Elefante, Stefano; Farres, Jordi; Guzzetti, Fausto; Imperatore, Pasquale; Lanari, Riccardo; Lengert, Wolfgang; Zinno, Ivana; Casu, Francesco

    2014-05-01

    The incoming Sentinel missions have been designed to be the first remote sensing satellite system devoted to operational services. In particular, the Synthetic Aperture Radar (SAR) Sentinel-1 sensor, dedicated to acquiring globally over land in interferometric mode, guarantees an unprecedented capability to investigate and monitor Earth surface deformations related to natural and man-made hazards. Thanks to the global coverage strategy and 12-day revisit time, together with the free and open access data policy, such a system will allow an extensive application of Differential Interferometric SAR (DInSAR) techniques. In this framework, the European Commission has been funding several projects through the GMES and Copernicus programs aimed at preparing the user community for the operational and extensive use of Sentinel-1 products for risk mitigation and management purposes. Among them, the FP7 DORIS project, an advanced GMES downstream service coordinated by the Italian National Research Council (CNR), is based on the full exploitation of advanced DInSAR products in landslide and subsidence contexts. In particular, the DORIS project (www.doris-project.eu) has developed innovative scientific techniques and methodologies to support Civil Protection Authorities (CPA) during the pre-event, event, and post-event phases of the risk management cycle. Nonetheless, the huge data stream expected from the Sentinel-1 satellite may jeopardize the effective use of such data in emergency response and security scenarios. This potential bottleneck can be overcome through the development of modern infrastructures able to efficiently provide computing resources as well as advanced services for big data management, processing and dissemination. In this framework, CNR and ESA have established a cooperation to foster the use of GRID and cloud computing platforms for remote sensing data processing, and to make advanced and innovative tools for DInSAR product generation and exploitation available to a large audience. In particular, CNR is porting the multi-temporal DInSAR technique referred to as Small Baseline Subset (SBAS) onto the ESA G-POD (Grid Processing On Demand) and CIOP (Cloud Computing Operational Pilot) platforms (Elefante et al., 2013) within the SuperSites Exploitation Platform (SSEP) project, whose aim is to contribute to the development of an ecosystem for big geo-data processing and dissemination. This work focuses on the main results achieved by the DORIS project concerning the use of advanced DInSAR products to support CPA during the risk management cycle. Furthermore, based on the DORIS experience, a sustainability model for Core and Downstream Copernicus services based on the effective exploitation of cloud platforms is proposed. In this framework, the remote sensing community, both service providers and users, can significantly benefit from the Helix Nebula - The Science Cloud initiative, created by European scientific institutions, agencies, SMEs and enterprises to pave the way for the development and exploitation of a cloud computing infrastructure for science. REFERENCES: Elefante, S., Imperatore, P., Zinno, I., Manunta, M., Mathot, E., Brito, F., Farres, J., Lengert, W., Lanari, R., Casu, F., 2013, "SBAS-DINSAR Time series generation on cloud computing platforms", IEEE IGARSS Conference, Melbourne (AU), July 2013.

  11. Regional scale landslide risk assessment with a dynamic physical model - development, application and uncertainty analysis

    NASA Astrophysics Data System (ADS)

    Luna, Byron Quan; Vidar Vangelsten, Bjørn; Liu, Zhongqiang; Eidsvig, Unni; Nadim, Farrokh

    2013-04-01

    Landslide risk must be assessed at the appropriate scale in order to allow effective risk management. At the moment, few deterministic models exist that can perform all the computations required for a complete landslide risk assessment at a regional scale. This arises from the difficulty of precisely defining the location and volume of the released mass, and from the inability of the models to compute the displacement for a large number of individual initiation areas, which is computationally prohibitive. This paper presents a medium-scale, dynamic physical model for rapid mass movements in mountainous and volcanic areas. The deterministic nature of the approach makes it possible to apply it to other sites, since it considers the frictional equilibrium conditions for the initiation process, the rheological resistance of the displaced flow for the run-out process, and a fragility curve that links intensity to economic loss for each building. The model takes into account the triggering effect of an earthquake, intense rainfall, and a combination of both (spatial and temporal). The run-out module of the model treats the flow as a 2-D continuum medium, solving the equations of mass balance and momentum conservation. The model is embedded in an open-source geographical information system (GIS) environment, is computationally efficient, and is transparent (understandable and comprehensible) for the end user. The model was applied to a virtual region, assessing landslide hazard, vulnerability and risk. A Monte Carlo simulation scheme was applied to quantify, propagate and communicate the effects of uncertainty in input parameters on the final results. In this technique, the input distributions are recreated through sampling and the failure criteria are calculated for each stochastic realisation of the site properties. The model is able to identify the released volumes of the critical slopes and the areas threatened by the run-out intensity. The final outcome is an estimate of individual building damage and total economic risk. The research leading to these results has received funding from the European Community's Seventh Framework Programme [FP7/2007-2013] under grant agreement No. 265138, New Multi-HAzard and MulTi-RIsK Assessment MethodS for Europe (MATRIX).
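    As a rough illustration of the Monte Carlo uncertainty-propagation step described above (sampling the input distributions and evaluating a failure criterion for each stochastic realisation), the sketch below uses a standard infinite-slope factor-of-safety formula with hypothetical parameter distributions; it is not the paper's model.

```python
# Minimal sketch of Monte Carlo uncertainty propagation: sample input
# distributions and evaluate a failure criterion (FS < 1) per realisation.
# The infinite-slope formula and all distributions are illustrative
# assumptions, not the paper's actual model.
import numpy as np

rng = np.random.default_rng(42)
n = 100_000                                   # stochastic realisations

# Sampled soil/slope properties (assumed distributions)
cohesion = rng.normal(8.0, 2.0, n)            # effective cohesion c' [kPa]
phi = np.radians(rng.normal(32.0, 3.0, n))    # friction angle [rad]
gamma = rng.normal(19.0, 1.0, n)              # unit weight [kN/m^3]
depth = rng.uniform(1.0, 3.0, n)              # failure-surface depth z [m]
m = rng.uniform(0.0, 1.0, n)                  # water-table ratio (0 dry, 1 saturated)
beta = np.radians(30.0)                       # slope angle (fixed here)
gamma_w = 9.81                                # unit weight of water [kN/m^3]

# Infinite-slope factor of safety for each realisation
pore_pressure = gamma_w * m * depth * np.cos(beta) ** 2
fs = (cohesion + (gamma * depth * np.cos(beta) ** 2 - pore_pressure) * np.tan(phi)) / (
    gamma * depth * np.sin(beta) * np.cos(beta)
)

p_failure = np.mean(fs < 1.0)                 # failure criterion FS < 1
print(f"probability of failure: {p_failure:.3f}")
```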

  12. NASA's Risk Management System

    NASA Technical Reports Server (NTRS)

    Perera, Jeevan S.

    2011-01-01

    Leadership is key to success. A phased approach to the implementation of risk management is necessary. The risk management system will be simple and accessible, and will promote communication of information to all relevant stakeholders for optimal resource allocation and risk mitigation. Risk management should be used by all team members to manage risks -- not just risk office personnel. Each group is assigned Risk Integrators, who are facilitators for effective risk management. Risks will be managed at the lowest level feasible; elevate only those risks that require coordination or management from above. Risk reporting and communication is an essential element of risk management and will combine both qualitative and quantitative elements. Risk-informed decision making should be introduced to all levels of management. Provide the necessary checks and balances to ensure that risks are caught/identified and dealt with in a timely manner. Many supporting tools, processes and training must be deployed for effective risk management implementation. Process improvement must be included in the risk processes.
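    To make the combination of qualitative/quantitative scoring and lowest-level management concrete, here is a toy sketch of a risk register with a likelihood-consequence score and an escalation threshold. The field names, scales and threshold are hypothetical and do not represent NASA's actual system.

```python
# Toy sketch of a risk register combining a qualitative 5x5
# likelihood/consequence score with an escalation rule: risks are managed at
# the lowest level and elevated only above a threshold. Field names and
# thresholds are hypothetical, not NASA's actual system.
from dataclasses import dataclass

@dataclass
class Risk:
    title: str
    owner: str            # lowest-level group responsible
    likelihood: int       # 1 (rare) .. 5 (near certain)
    consequence: int      # 1 (negligible) .. 5 (catastrophic)
    mitigation: str

    @property
    def score(self) -> int:
        return self.likelihood * self.consequence

    def needs_elevation(self, threshold: int = 12) -> bool:
        """Elevate only risks whose score reaches the coordination threshold."""
        return self.score >= threshold

register = [
    Risk("Late avionics delivery", "Avionics IPT", 4, 3, "Add second supplier"),
    Risk("Test-stand availability", "Propulsion IPT", 2, 2, "Reserve backup window"),
]

for r in register:
    level = "program board" if r.needs_elevation() else r.owner
    print(f"{r.title}: score {r.score}, managed by {level}")
```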

  13. Risk Management Issues - An Aerospace Perspective

    NASA Technical Reports Server (NTRS)

    Perera, Jeevan S.

    2011-01-01

    A phased approach to the implementation of risk management is necessary. The risk management system will be simple and accessible, and will promote communication of information to all relevant stakeholders for optimal resource allocation and risk mitigation. Risk management should be used by all team members to manage risks -- not just risk office personnel. Each group is assigned Risk Integrators, who are facilitators for effective risk management. Risks will be managed at the lowest level feasible; elevate only those risks that require coordination or management from above. Risk reporting and communication is an essential element of risk management and will combine both qualitative and quantitative elements. Risk-informed decision making should be introduced to all levels of management. Provide the necessary checks and balances to ensure that risks are caught/identified and dealt with in a timely manner. Many supporting tools, processes and training must be deployed for effective risk management implementation. Process improvement must be included in the risk processes.

  14. A resource management architecture based on complex network theory in cloud computing federation

    NASA Astrophysics Data System (ADS)

    Zhang, Zehua; Zhang, Xuejie

    2011-10-01

    Cloud Computing Federation is a main trend in Cloud Computing. Resource management has a significant effect on the design, realization, and efficiency of a Cloud Computing Federation. A Cloud Computing Federation has the typical characteristics of a complex system; therefore, we propose a resource management architecture based on complex network theory for Cloud Computing Federation (abbreviated as RMABC) in this paper, with a detailed design of the resource discovery and resource announcement mechanisms. Compared with existing resource management mechanisms in distributed computing systems, a Task Manager in RMABC can use historical information and current state data obtained from other Task Managers for the evolution of the complex network composed of Task Managers, and thus has advantages in resource discovery speed, fault tolerance and adaptive ability. The results of the model experiment confirmed the advantage of RMABC in resource discovery performance.
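    The abstract describes Task Managers that exploit historical information from peers to speed up resource discovery. The sketch below only illustrates that general idea (neighbour ranking by past discovery success in a peer network); the RMABC mechanisms themselves are not specified here, so the names and logic are assumptions.

```python
# Illustrative sketch (not the paper's RMABC implementation): Task Managers
# form a network, hold resource announcements, and rank neighbours by
# historical discovery success so the overlay evolves toward useful peers --
# the general idea behind using complex-network properties for discovery.
from collections import defaultdict, deque

class TaskManager:
    def __init__(self, name, resources):
        self.name = name
        self.resources = set(resources)      # e.g. {"gpu", "64GB"}
        self.neighbours = []                 # other TaskManager objects
        self.success = defaultdict(int)      # historical hits per neighbour

    def discover(self, wanted, max_hops=3):
        """Breadth-first search, preferring historically successful neighbours."""
        visited, queue = {self.name}, deque([(self, 0)])
        while queue:
            node, hops = queue.popleft()
            if wanted in node.resources:
                if node is not self:
                    self.success[node.name] += 1   # reinforce useful links
                return node.name
            if hops < max_hops:
                ranked = sorted(node.neighbours,
                                key=lambda n: self.success[n.name], reverse=True)
                for n in ranked:
                    if n.name not in visited:
                        visited.add(n.name)
                        queue.append((n, hops + 1))
        return None

a, b, c = TaskManager("A", {"cpu"}), TaskManager("B", {"cpu"}), TaskManager("C", {"gpu"})
a.neighbours, b.neighbours = [b, c], [c]
print(a.discover("gpu"))   # -> "C"
```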

  15. Management and Risk Reduction of Rheumatoid Arthritis in Individuals with Obstructive Sleep Apnea: A Nationwide Population-Based Study in Taiwan.

    PubMed

    Chen, Wei-Sheng; Chang, Yu-Sheng; Chang, Chi-Ching; Chang, Deh-Ming; Chen, Yi-Hsuan; Tsai, Chang-Youh; Chen, Jin-Hua

    2016-10-01

    To explore associations between obstructive sleep apnea (OSA) and autoimmune diseases and to evaluate whether OSA management reduces the incidence of autoimmune diseases. This was a retrospective cohort study using a nationwide database. Data were from 105,846 adult patients in whom OSA was diagnosed and recorded in the Taiwan National Health Insurance Research Database between 2002 and 2011; these patients were analyzed retrospectively. Patients with antecedent autoimmune diseases were excluded. A comparison cohort of 423,384 participants without OSA served as age- and sex-matched controls. Multivariable Cox regression analysis was performed on both cohorts to compute the risk of autoimmune diseases during follow-up. A time-dependent OSA treatment effect was analyzed among patients with OSA. There were no interventions. Among patients with OSA, the overall risk for incident autoimmune diseases was significantly higher than that in controls (adjusted hazard ratio [HR] = 1.95, 95% confidence interval [CI] = 1.66-2.27). Risk for individual autoimmune diseases, including rheumatoid arthritis (RA), Sjögren syndrome (SS), and Behçet disease, was significantly higher in patients with OSA than in controls (HRs [95% CI]: RA 1.33 [1.03-1.72], SS 3.45 [2.67-4.45], and Behçet disease 5.33 [2.45-12.66]). Increased risk for systemic lupus erythematosus (HR 1.00 [0.54-1.84]) and systemic sclerosis (HR 1.43 [0.51-3.96]) did not reach statistical significance. Patients with OSA receiving treatment had an overall reduced risk of RA and other autoimmune diseases (time-dependent HRs [95% CI]: 0.22 [0.05-0.94] and 0.51 [0.28-0.92], respectively). OSA is associated with higher risk for developing RA, SS, and Behçet disease. OSA management is associated with reduced risk of RA. © 2016 Associated Professional Sleep Societies, LLC.
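    For readers unfamiliar with the method, the sketch below shows the kind of multivariable Cox proportional-hazards fit used to obtain adjusted hazard ratios, using the lifelines library on synthetic data. The column names, covariates and data are hypothetical; this is not the study's dataset or code.

```python
# Hedged sketch of a multivariable Cox proportional-hazards analysis of the
# kind described above, run on synthetic data with the lifelines library.
# All column names and values are hypothetical, not the study's data.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(0)
n = 5_000
df = pd.DataFrame({
    "osa": rng.integers(0, 2, n),                    # exposure cohort indicator
    "age": rng.normal(50, 12, n),
    "male": rng.integers(0, 2, n),
})
# Synthetic follow-up: higher hazard of incident autoimmune disease with OSA.
hazard = 0.01 * np.exp(0.6 * df["osa"] + 0.01 * (df["age"] - 50))
time_to_event = rng.exponential(1.0 / hazard)
censor_time = rng.uniform(1, 10, n)
df["duration"] = np.minimum(time_to_event, censor_time)
df["event"] = (time_to_event <= censor_time).astype(int)

cph = CoxPHFitter()
cph.fit(df, duration_col="duration", event_col="event")
cph.print_summary()   # exp(coef) column gives adjusted hazard ratios with 95% CIs
```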

  16. Cardiac PET/CT for the Evaluation of Known or Suspected Coronary Artery Disease

    PubMed Central

    Murthy, Venkatesh L.

    2011-01-01

    Positron emission tomography (PET) is increasingly being applied in the evaluation of myocardial perfusion. Cardiac PET can be performed with an increasing variety of cyclotron- and generator-produced radiotracers. Compared with single photon emission computed tomography, PET offers lower radiation exposure, fewer artifacts, improved spatial resolution, and, most important, improved diagnostic performance. With its capacity to quantify rest–peak stress left ventricular systolic function as well as coronary flow reserve, PET is superior to other methods for the detection of multivessel coronary artery disease and, potentially, for risk stratification. Coronary artery calcium scoring may be included for further risk stratification in patients with normal perfusion imaging findings. Furthermore, PET allows quantification of absolute myocardial perfusion, which also carries substantial prognostic value. Hybrid PET–computed tomography scanners allow functional evaluation of myocardial perfusion combined with anatomic characterization of the epicardial coronary arteries, thereby offering great potential for both diagnosis and management. Additional studies to further validate the prognostic value and cost effectiveness of PET are warranted. © RSNA, 2011 PMID:21918042

  17. Web based collaborative decision making in flood risk management

    NASA Astrophysics Data System (ADS)

    Evers, Mariele; Almoradie, Adrian; Jonoski, Andreja

    2014-05-01

    Stakeholder participation in the development of flood risk management (FRM) plans is essential, since stakeholders often have a better understanding or knowledge of the potentials and limitations of their local area. Moreover, a participatory approach also creates trust amongst stakeholders, leading to a successful implementation of measures. Stakeholder participation, however, has its challenges and potential pitfalls that could lead to its premature termination. Such challenges and pitfalls include limited financial resources, stakeholders' spatial distribution and their interest in participating. Different types of participation in FRM may encounter different challenges. These types of participation in FRM can be classified into (1) information and knowledge sharing (IKS), (2) consultative participation (CP) or (3) collaborative decision making (CDM) - the most challenging type of participation. An innovative approach to addressing these challenges and potential pitfalls is a web-based mobile or computer-aided environment for stakeholder participation, which enhances remote interaction between participating entities such as stakeholders. This paper presents a framework and an implementation of a CDM web-based environment for the Alster catchment (Hamburg, Germany) and the Cranbrook catchment (London, UK). The CDM framework consists of two main stages: (1) collaborative modelling and (2) participatory decision making. This paper also highlights the stakeholder analyses, the modelling approach and the application of General Public License (GPL) technologies in developing the web-based environments. The environments were tested and evaluated through a series of stakeholder workshops. The overall results of the stakeholders' evaluation show that web-based environments can address the challenges and potential pitfalls of stakeholder participation and enhance participation in flood risk management. The web-based environment was developed within the DIANE-CM project (Decentralised Integrated Analysis and Enhancement of Awareness through Collaborative Modelling and Management of Flood Risk) of the 2nd ERANET CRUE funding initiative.

  18. Generalizable open source urban water portfolio simulation framework demonstrated using a multi-objective risk-based planning benchmark problem.

    NASA Astrophysics Data System (ADS)

    Trindade, B. C.; Reed, P. M.

    2017-12-01

    Growing access to, and the reduced cost of, computing power in recent years have promoted the rapid development and application of multi-objective water supply portfolio planning. As this trend continues, there is a pressing need for flexible risk-based simulation frameworks and improved algorithm benchmarking for emerging classes of water supply planning and management problems. This work contributes the Water Utilities Management and Planning (WUMP) model: a generalizable and open source simulation framework designed to capture how water utilities can minimize operational and financial risks by regionally coordinating planning and management choices, i.e., making more efficient and coordinated use of restrictions, water transfers, and financial hedging combined with possible construction of new infrastructure. We introduce the WUMP simulation framework as part of a new multi-objective benchmark problem for planning and management of regionally integrated water utility companies. In this problem, a group of fictitious water utilities seeks to balance the use of these reliability-driven actions (e.g., restrictions, water transfers, and infrastructure pathways) against their inherent financial risks. Several traits make this an ideal benchmark problem, namely the presence of (1) strong non-linearities and discontinuities in the Pareto front caused by the step-wise nature of the decision-making formulation and by the abrupt addition of storage through infrastructure construction, (2) noise due to the stochastic nature of the streamflows and water demands, and (3) non-separability resulting from the cooperative formulation of the problem, in which decisions made by one stakeholder may substantially affect the others. Both the open source WUMP simulation framework and its demonstration in a challenging benchmarking example hold value for promoting broader advances in urban water supply portfolio planning for regions confronting change.
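
    As a toy illustration of the kind of risk-based portfolio evaluation such a benchmark exercises, the sketch below scores a single drought-restriction policy on two of the objectives named above (supply reliability and restriction cost) under stochastic inflows. It is not the WUMP code; the model, trigger rule, and numbers are invented for the example.

    ```python
    # Minimal sketch (not the actual WUMP framework): evaluate reliability and
    # restriction cost for a hypothetical utility under stochastic weekly inflows.
    import numpy as np

    rng = np.random.default_rng(42)

    def evaluate_portfolio(restriction_trigger, n_weeks=52, n_realizations=1000):
        """Return (reliability, mean restriction cost) for one policy choice."""
        failures = 0
        costs = np.zeros(n_realizations)
        for r in range(n_realizations):
            storage = 0.6  # fraction of reservoir capacity at the start of the year
            for _ in range(n_weeks):
                inflow = max(rng.normal(0.02, 0.01), 0.0)   # stochastic weekly inflow
                demand = 0.025
                if storage < restriction_trigger:            # step-wise restriction rule
                    demand *= 0.8                             # 20 % use restriction
                    costs[r] += 1.0                           # political/economic penalty
                storage = min(storage + inflow - demand, 1.0)
                if storage <= 0.0:
                    failures += 1
                    storage = 0.0
        reliability = 1.0 - failures / (n_realizations * n_weeks)
        return reliability, costs.mean()

    for trigger in (0.2, 0.4, 0.6):
        rel, cost = evaluate_portfolio(trigger)
        print(f"trigger={trigger:.1f}  reliability={rel:.4f}  mean restriction cost={cost:.1f}")
    ```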

  19. NASA's Risk Management System

    NASA Technical Reports Server (NTRS)

    Perera, Jeevan S.

    2013-01-01

    A phased approach to implementing risk management is necessary. The risk management system will be simple and accessible and will promote communication of information to all relevant stakeholders for optimal resource allocation and risk mitigation. Risk management should be used by all team members to manage risks, not just by risk office personnel. Each group/department is assigned Risk Integrators, who are facilitators for effective risk management. Risks will be managed at the lowest level feasible; only those risks that require coordination or management from above are elevated. Risk-informed decision making should be introduced at all levels of management, providing the necessary checks and balances to ensure that risks are identified and dealt with in a timely manner. Many supporting tools, processes, and training must be deployed for effective risk management implementation, and process improvement must be included in the risk processes.

  20. Prognostic Value of Coronary Computed Tomography Imaging in Patients at High Risk Without Symptoms of Coronary Artery Disease.

    PubMed

    Dedic, Admir; Ten Kate, Gert-Jan R; Roos, Cornelis J; Neefjes, Lisan A; de Graaf, Michiel A; Spronk, Angela; Delgado, Victoria; van Lennep, Jeanine E Roeters; Moelker, Adriaan; Ouhlous, Mohamed; Scholte, Arthur J H A; Boersma, Eric; Sijbrands, Eric J G; Nieman, Koen; Bax, Jeroen J; de Feijter, Pim J

    2016-03-01

    At present, traditional risk factors are used to guide cardiovascular management of asymptomatic subjects. Intensified surveillance may be warranted in those identified as high risk of developing cardiovascular disease (CVD). This study aims to determine the prognostic value of coronary computed tomography (CT) angiography (CCTA) next to the coronary artery calcium score (CACS) in patients at high CVD risk without symptoms suspect for coronary artery disease (CAD). A total of 665 patients at high risk (mean age 56 ± 9 years, 417 men), having at least one important CVD risk factor (diabetes mellitus, familial hypercholesterolemia, peripheral artery disease, or severe hypertension) or a calculated European systematic coronary risk evaluation of >10% were included from outpatient clinics at 2 academic centers. Follow-up was performed for the occurrence of adverse events including all-cause mortality, nonfatal myocardial infarction, unstable angina, or coronary revascularization. During a median follow-up of 3.0 (interquartile range 1.3 to 4.1) years, adverse events occurred in 40 subjects (6.0%). By multivariate analysis, adjusted for age, gender, and CACS, obstructive CAD on CCTA (≥50% luminal stenosis) was a significant predictor of adverse events (hazard ratio 5.9 [CI 1.3 to 26.1]). Addition of CCTA to age, gender, plus CACS, increased the C statistic from 0.81 to 0.84 and resulted in a total net reclassification index of 0.19 (p <0.01). In conclusion, CCTA has incremental prognostic value and risk reclassification benefit beyond CACS in patients without CAD symptoms but with high risk of developing CVD. Copyright © 2016 Elsevier Inc. All rights reserved.
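
    The incremental value reported above rests on comparing discrimination (the C statistic) of a baseline model with and without the CCTA finding. The sketch below shows that comparison on purely synthetic data, assuming scikit-learn is available; the variables and effect sizes are invented and do not reproduce the study.

    ```python
    # Illustrative sketch (synthetic data, not the study cohort): assess whether a
    # binary CCTA finding adds discrimination to a baseline model via the C statistic.
    import numpy as np
    from sklearn.linear_model import LogisticRegression
    from sklearn.metrics import roc_auc_score

    rng = np.random.default_rng(0)
    n = 665
    age = rng.normal(56, 9, n)
    male = rng.integers(0, 2, n)
    cacs = rng.lognormal(3, 1.5, n)
    obstructive_cad = rng.integers(0, 2, n)
    # Synthetic event probability loosely tied to the predictors (purely illustrative).
    logit = -5 + 0.03 * (age - 56) + 0.3 * male + 0.002 * cacs + 1.8 * obstructive_cad
    events = rng.random(n) < 1 / (1 + np.exp(-logit))

    baseline = np.column_stack([age, male, cacs])
    extended = np.column_stack([age, male, cacs, obstructive_cad])

    for name, X in [("age+sex+CACS", baseline), ("age+sex+CACS+CCTA", extended)]:
        p = LogisticRegression(max_iter=1000).fit(X, events).predict_proba(X)[:, 1]
        print(f"{name}: C statistic = {roc_auc_score(events, p):.3f}")
    ```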

  1. Findings from an Organizational Network Analysis to Support Local Public Health Management

    PubMed Central

    Caldwell, Michael; Rockoff, Maxine L.; Gebbie, Kristine; Carley, Kathleen M.; Bakken, Suzanne

    2008-01-01

    We assessed the feasibility of using organizational network analysis in a local public health organization. The research setting was an urban/suburban county health department with 156 employees. The goal of the research was to study communication and information flow in the department and to assess the technique for public health management. Network data were derived from survey questionnaires. Computational analysis was performed with the Organizational Risk Analyzer. Analysis revealed centralized communication, limited interdependencies, potential knowledge loss through retirement, and possible informational silos. The findings suggested opportunities for more cross-program coordination but also suggested the presence of potentially efficient communication paths and potentially beneficial social connectedness. Managers found the findings useful to support decision making. Public health organizations must be effective in an increasingly complex environment. Network analysis can help build public health capacity for complex system management. PMID:18481183
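
    The record above used the Organizational Risk Analyzer; the sketch below shows the same style of survey-derived communication-network analysis using the networkx library instead, on a small hypothetical roster. High centrality flags potential single points of failure (e.g., knowledge loss through retirement), and sparse cross-program ties hint at informational silos.

    ```python
    # Small sketch on hypothetical survey data: who reports communicating with whom.
    import networkx as nx

    edges = [
        ("director", "epi_lead"), ("director", "nursing_lead"), ("director", "ehs_lead"),
        ("epi_lead", "epi_staff1"), ("epi_lead", "epi_staff2"),
        ("nursing_lead", "nurse1"), ("nursing_lead", "nurse2"),
        ("ehs_lead", "inspector1"),
    ]
    G = nx.Graph(edges)

    centrality = nx.degree_centrality(G)
    betweenness = nx.betweenness_centrality(G)

    # Staff with high centrality scores are potential communication bottlenecks.
    for node in sorted(G, key=centrality.get, reverse=True):
        print(f"{node:15s} degree={centrality[node]:.2f} betweenness={betweenness[node]:.2f}")
    ```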

  2. A data management system to enable urgent natural disaster computing

    NASA Astrophysics Data System (ADS)

    Leong, Siew Hoon; Kranzlmüller, Dieter; Frank, Anton

    2014-05-01

    Civil protection, in particular natural disaster management, is very important to most nations and civilians in the world. When disasters like flash floods, earthquakes and tsunamis are expected or have taken place, it is of utmost importance to make timely decisions for managing the affected areas and reducing casualties. Computer simulations can generate information and provide predictions to facilitate this decision-making process. Getting the data to the required resources is a critical requirement to enable the timely computation of the predictions. An urgent data management system to support natural disaster computing is thus necessary to effectively carry out data activities within a stipulated deadline. Since the trigger of a natural disaster is usually unpredictable, it is not always possible to prepare the required resources well in advance. As such, an urgent data management system for natural disaster computing has to be able to work with any type of resources. Additional requirements include the need to manage deadlines and huge volumes of data, fault tolerance, reliability, flexibility to change, ease of use, etc. The proposed data management platform includes a service manager to provide a uniform and extensible interface for the supported data protocols, a configuration manager to check and retrieve configurations of available resources, a scheduler manager to ensure that the deadlines can be met, a fault tolerance manager to increase the reliability of the platform, and a data manager to initiate and perform the data activities. These managers enable the selection of the most appropriate resource, transfer protocol, etc. such that the hard deadline of an urgent computation can be met for a particular urgent activity, e.g. data staging or computation. We associate two types of deadlines [2] with an urgent computing system. Soft/firm deadline: missing it renders the computation less useful, resulting in a cost that can have severe consequences. Hard deadline: missing it renders the computation useless and results in full catastrophic consequences. A prototype of this system has a REST-based service manager. The REST-based implementation provides a uniform interface that is easy to use. New and upcoming file transfer protocols can easily be added and accessed via the service manager. The service manager interacts with the other four managers to coordinate the data activities so that the fundamental natural disaster urgent computing requirement, i.e. the deadline, can be fulfilled in a reliable manner. A data activity can include data staging, data archiving and data storing. Reliability is ensured by the choice of a network-of-managers organisation model [1], the configuration manager, and the fault tolerance manager. With this proposed design, an easy-to-use, resource-independent data management system that can support and fulfill the computation of a natural disaster prediction within stipulated deadlines can thus be realised. References [1] H. G. Hegering, S. Abeck, and B. Neumair, Integrated management of networked systems - concepts, architectures, and their operational application, Morgan Kaufmann Publishers, 340 Pine Street, Sixth Floor, San Francisco, CA 94104-3205, USA, 1999. [2] H. Kopetz, Real-time systems design principles for distributed embedded applications, second edition, Springer, LLC, 233 Spring Street, New York, NY 10013, USA, 2011. [3] S. H. Leong, A. Frank, and D. Kranzlmüller, Leveraging e-infrastructures for urgent computing, Procedia Computer Science 18 (2013), no. 0, 2177-2186, 2013 International Conference on Computational Science. [4] N. Trebon, Enabling urgent computing within the existing distributed computing infrastructure, Ph.D. thesis, University of Chicago, August 2011, http://people.cs.uchicago.edu/~ntrebon/docs/dissertation.pdf.
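
    As an illustration of the deadline-aware selection idea described in the record above, the sketch below picks a transfer option whose estimated completion time satisfies the soft deadline where possible and the hard deadline otherwise. The option names, bandwidth figures, and selection rule are assumptions for the example, not the platform's actual API.

    ```python
    # Minimal sketch of deadline-aware resource/protocol selection for an urgent
    # data activity; values and names are hypothetical.
    from dataclasses import dataclass

    @dataclass
    class TransferOption:
        name: str            # e.g. a protocol/endpoint pair known to the service manager
        bandwidth_mb_s: float
        setup_s: float

    def estimated_time(option: TransferOption, size_mb: float) -> float:
        return option.setup_s + size_mb / option.bandwidth_mb_s

    def select_option(options, size_mb, soft_deadline_s, hard_deadline_s):
        """Prefer options meeting the soft deadline; fall back to the hard deadline."""
        feasible_soft = [o for o in options if estimated_time(o, size_mb) <= soft_deadline_s]
        feasible_hard = [o for o in options if estimated_time(o, size_mb) <= hard_deadline_s]
        pool = feasible_soft or feasible_hard
        if not pool:
            raise RuntimeError("no resource can meet the hard deadline; abort urgent run")
        return min(pool, key=lambda o: estimated_time(o, size_mb))

    options = [
        TransferOption("gridftp_site_a", bandwidth_mb_s=120, setup_s=30),
        TransferOption("https_site_b", bandwidth_mb_s=40, setup_s=5),
    ]
    chosen = select_option(options, size_mb=50_000, soft_deadline_s=600, hard_deadline_s=1800)
    print("selected:", chosen.name)
    ```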

  3. The Incidence, Classification, and Management of Acute Adverse Reactions to the Low-Osmolar Iodinated Contrast Media Isovue and Ultravist in Contrast-Enhanced Computed Tomography Scanning.

    PubMed

    Zhang, Bin; Dong, Yuhao; Liang, Long; Lian, Zhouyang; Liu, Jing; Luo, Xiaoning; Chen, Wenbo; Li, Xinyu; Liang, Changhong; Zhang, Shuixing

    2016-03-01

    Some epidemiologic surveillance studies have recorded adverse drug reactions to radiocontrast agents. We aimed to investigate the incidence and management of acute adverse reactions (AARs) to Ultravist-370 and Isovue-370 in patients who underwent contrast-enhanced computed tomography (CT) scanning.Data from 137,473 patients were analyzed. They had undergone enhanced CT scanning with intravenous injection of Ultravist-370 or Isovue-370 during the period of January 1, 2006 to December 31, 2012 in our hospital. We investigated and classified AARs according to the American College of Radiology and the Chinese Society of Radiology (CSR) guidelines for iodinated contrast media. We analyzed risk factors for AARs and compared the AARs induced by Ultravist-370 and Isovue-370.Four hundred and twenty-eight (0.31%) patients experienced AARs, which included 330 (0.24%) patients with mild AARs, 82 (0.06%) patients with moderate AARs, and 16 (0.01%) patients with severe AARs (including 3 cases of cardiac arrest and one case of death). The incidence of AARs was higher with Ultravist-370 than with Isovue-370 (0.38% vs 0.24%, P < 0.001), but only for mild AARs (0.32% vs 0.16%, P < 0.001). Analyses on risk factors indicated that female patients (n = 221, 0.43%, P < 0.001), emergency patients (n = 11, 0.51%, P < 0.001), elderly patients aged 50 to 60 years (n = 135, 0.43%, P < 0.001), and patients who underwent coronary computed tomography angiography (CTA) (n = 55, 0.51%, P < 0.001) had a higher risk of AARs. Cutaneous manifestations (50.52%)-especially rash (59.74%)-were the most frequent mild AARs. Cardiovascular manifestations accounted for most moderate and severe AARs (62.91% and 48.28%, respectively). After proper management, the symptoms and signs of 96.5% of the AARs resolved within 24 hours without sequelae.Ultravist-370 and Isovue-370 are safe for patients undergoing enhanced CT scanning. The incidence of AARs is higher with Ultravist-370 than with Isovue-370, but this difference is limited only to the mild AARs. The incidence of AARs could be affected by multiple factors.

  4. The Incidence, Classification, and Management of Acute Adverse Reactions to the Low-Osmolar Iodinated Contrast Media Isovue and Ultravist in Contrast-Enhanced Computed Tomography Scanning

    PubMed Central

    Zhang, Bin; Dong, Yuhao; Liang, Long; Lian, Zhouyang; Liu, Jing; Luo, Xiaoning; Chen, Wenbo; Li, Xinyu; Liang, Changhong; Zhang, Shuixing

    2016-01-01

    Abstract Some epidemiologic surveillance studies have recorded adverse drug reactions to radiocontrast agents. We aimed to investigate the incidence and management of acute adverse reactions (AARs) to Ultravist-370 and Isovue-370 in patients who underwent contrast-enhanced computed tomography (CT) scanning. Data from 137,473 patients were analyzed. They had undergone enhanced CT scanning with intravenous injection of Ultravist-370 or Isovue-370 during the period of January 1, 2006 to December 31, 2012 in our hospital. We investigated and classified AARs according to the American College of Radiology and the Chinese Society of Radiology (CSR) guidelines for iodinated contrast media. We analyzed risk factors for AARs and compared the AARs induced by Ultravist-370 and Isovue-370. Four hundred and twenty-eight (0.31%) patients experienced AARs, which included 330 (0.24%) patients with mild AARs, 82 (0.06%) patients with moderate AARs, and 16 (0.01%) patients with severe AARs (including 3 cases of cardiac arrest and one case of death). The incidence of AARs was higher with Ultravist-370 than with Isovue-370 (0.38% vs 0.24%, P < 0.001), but only for mild AARs (0.32% vs 0.16%, P < 0.001). Analyses on risk factors indicated that female patients (n = 221, 0.43%, P < 0.001), emergency patients (n = 11, 0.51%, P < 0.001), elderly patients aged 50 to 60 years (n = 135, 0.43%, P < 0.001), and patients who underwent coronary computed tomography angiography (CTA) (n = 55, 0.51%, P < 0.001) had a higher risk of AARs. Cutaneous manifestations (50.52%)—especially rash (59.74%)—were the most frequent mild AARs. Cardiovascular manifestations accounted for most moderate and severe AARs (62.91% and 48.28%, respectively). After proper management, the symptoms and signs of 96.5% of the AARs resolved within 24 hours without sequelae. Ultravist-370 and Isovue-370 are safe for patients undergoing enhanced CT scanning. The incidence of AARs is higher with Ultravist-370 than with Isovue-370, but this difference is limited only to the mild AARs. The incidence of AARs could be affected by multiple factors. PMID:27015204

  5. Probabilistic risk assessment for CO2 storage in geological formations: robust design and support for decision making under uncertainty

    NASA Astrophysics Data System (ADS)

    Oladyshkin, Sergey; Class, Holger; Helmig, Rainer; Nowak, Wolfgang

    2010-05-01

    CO2 storage in geological formations is currently being discussed intensively as a technology for mitigating CO2 emissions. However, any large-scale application requires a thorough analysis of the potential risks. Current numerical simulation models are too expensive for probabilistic risk analysis and for stochastic approaches based on brute-force repeated simulation. Even single deterministic simulations may require parallel high-performance computing. The multiphase flow processes involved are too non-linear for quasi-linear error propagation and other simplified stochastic tools. As an alternative approach, we propose a massive stochastic model reduction based on the probabilistic collocation method. The model response is projected onto an orthogonal basis of higher-order polynomials to approximate its dependence on uncertain parameters (porosity, permeability, etc.) and design parameters (injection rate, depth, etc.). This allows for a non-linear propagation of model uncertainty affecting the predicted risk, ensures fast computation, and provides a powerful tool for combining design variables and uncertain variables into one approach based on an integrative response surface. Thus, the design task of finding optimal injection regimes explicitly includes uncertainty, which leads to robust designs of the non-linear system that minimize failure probability and provide valuable support for risk-informed management decisions. We validate our proposed stochastic approach by Monte Carlo simulation using a common 3D benchmark problem (Class et al., Computational Geosciences 13, 2009). A reasonable compromise between computational effort and precision was already reached with second-order polynomials. In our case study, the proposed approach yields a significant computational speedup by a factor of 100 compared to Monte Carlo simulation. We demonstrate that, due to the non-linearity of the flow and transport processes during CO2 injection, including uncertainty in the analysis leads to a systematic and significant shift of predicted leakage rates towards higher values compared with deterministic simulations, affecting both risk estimates and the design of injection scenarios. This implies that neglecting uncertainty can be a strong simplification in modeling CO2 injection, with consequences that can be stronger than those of neglecting several physical phenomena (e.g. phase transition, convective mixing, capillary forces, etc.). The authors would like to thank the German Research Foundation (DFG) for financial support of the project within the Cluster of Excellence in Simulation Technology (EXC 310/1) at the University of Stuttgart. Keywords: polynomial chaos; CO2 storage; multiphase flow; porous media; risk assessment; uncertainty; integrative response surfaces
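
    The core idea, replacing an expensive simulator with a polynomial response surface in the uncertain and design parameters and then propagating uncertainty cheaply on the surrogate, can be sketched in a few lines. The toy model, parameter ranges, and second-order basis below are illustrative assumptions, not the study's simulator or its collocation scheme.

    ```python
    # Toy illustration of the response-surface idea: fit a second-order polynomial
    # surrogate to a few "expensive" model runs, then do cheap Monte Carlo on it.
    import numpy as np

    rng = np.random.default_rng(1)

    def expensive_model(log_permeability, porosity):
        """Stand-in for a full multiphase simulation returning a leakage-rate proxy."""
        return np.exp(0.8 * log_permeability) * (1.2 - porosity) ** 2

    def design_matrix(x1, x2):
        # Full second-order basis: 1, x1, x2, x1^2, x2^2, x1*x2
        return np.column_stack([np.ones_like(x1), x1, x2, x1**2, x2**2, x1 * x2])

    # A small number of "collocation" runs of the expensive model.
    x1_train = rng.normal(0.0, 0.5, 25)       # uncertain log-permeability
    x2_train = rng.uniform(0.1, 0.3, 25)      # uncertain porosity
    y_train = expensive_model(x1_train, x2_train)
    coeffs, *_ = np.linalg.lstsq(design_matrix(x1_train, x2_train), y_train, rcond=None)

    # Cheap uncertainty propagation on the surrogate (would be costly on the full model).
    x1_mc = rng.normal(0.0, 0.5, 100_000)
    x2_mc = rng.uniform(0.1, 0.3, 100_000)
    y_surrogate = design_matrix(x1_mc, x2_mc) @ coeffs

    print(f"mean leakage proxy: {y_surrogate.mean():.3f}")
    print(f"95th percentile   : {np.percentile(y_surrogate, 95):.3f}")
    ```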

  6. Research on elastic resource management for multi-queue under cloud computing environment

    NASA Astrophysics Data System (ADS)

    CHENG, Zhenjing; LI, Haibo; HUANG, Qiulan; Cheng, Yaodong; CHEN, Gang

    2017-10-01

    As a new approach to managing computing resources, virtualization technology is increasingly applied in the high-energy physics field. A virtual computing cluster based on OpenStack was built at IHEP, using HTCondor as the job queue management system. In a traditional static cluster, a fixed number of virtual machines is pre-allocated to the job queues of different experiments; however, this method cannot adapt well to the volatility of computing resource requirements. To solve this problem, an elastic computing resource management system for the cloud computing environment has been designed. This system performs unified management of virtual computing nodes on the basis of the HTCondor job queues, using dual resource thresholds as well as a quota service. A two-stage pool is designed to improve the efficiency of resource pool expansion. This paper presents several use cases of the elastic resource management system in IHEPCloud. In practice, virtual computing resources dynamically expand or shrink as computing requirements change, and the CPU utilization ratio is significantly higher than with traditional resource management. The system also performs well with multiple HTCondor schedulers and multiple job queues.
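
    A minimal sketch of a dual-threshold scaling rule of the kind described above follows; the thresholds, quota handling, and return convention are assumptions for illustration, not the IHEPCloud implementation.

    ```python
    # Sketch of a dual-threshold expand/shrink decision for a virtual worker pool.
    def scaling_decision(idle_jobs, idle_nodes, quota_remaining,
                         expand_threshold=10, shrink_threshold=2):
        """Return how many virtual worker nodes to add (positive) or remove (negative)."""
        if idle_jobs > expand_threshold and quota_remaining > 0:
            # Expand in proportion to the queue backlog, capped by the remaining quota.
            return min(idle_jobs // expand_threshold, quota_remaining)
        if idle_jobs == 0 and idle_nodes > shrink_threshold:
            # Shrink when nodes sit idle, keeping a small warm pool for new jobs.
            return -(idle_nodes - shrink_threshold)
        return 0

    print(scaling_decision(idle_jobs=45, idle_nodes=0, quota_remaining=3))   # -> +3
    print(scaling_decision(idle_jobs=0, idle_nodes=8, quota_remaining=10))   # -> -6
    ```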

  7. Tool for Ranking Research Options

    NASA Technical Reports Server (NTRS)

    Ortiz, James N.; Scott, Kelly; Smith, Harold

    2005-01-01

    Tool for Research Enhancement Decision Support (TREDS) is a computer program developed to assist managers in ranking options for research aboard the International Space Station (ISS). It could likely also be adapted to perform similar decision-support functions in industrial and academic settings. TREDS provides a ranking of the options, based on a quantifiable assessment of all the relevant programmatic decision factors of benefit, cost, and risk. The computation of the benefit for each option is based on a figure of merit (FOM) for ISS research capacity that incorporates both quantitative and qualitative inputs. Qualitative inputs are gathered and partly quantified by use of the time-tested analytical hierarchical process and used to set weighting factors in the FOM corresponding to priorities determined by the cognizant decision maker(s). Then by use of algorithms developed specifically for this application, TREDS adjusts the projected benefit for each option on the basis of levels of technical implementation, cost, and schedule risk. Based partly on Excel spreadsheets, TREDS provides screens for entering cost, benefit, and risk information. Drop-down boxes are provided for entry of qualitative information. TREDS produces graphical output in multiple formats that can be tailored by users.
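
    The ranking logic described above, a weighted figure of merit adjusted for cost and risk, can be illustrated with a small scoring sketch. The weights, attributes, and scoring formula below are placeholders, not the actual TREDS algorithms or its AHP elicitation.

    ```python
    # Hypothetical benefit/cost/risk scoring of research options, TREDS-style.
    options = {
        "experiment_A": {"benefit": 0.8, "cost": 0.4, "risk": 0.3},
        "experiment_B": {"benefit": 0.6, "cost": 0.2, "risk": 0.1},
        "experiment_C": {"benefit": 0.9, "cost": 0.7, "risk": 0.6},
    }
    # Priority weights as might be elicited with the analytical hierarchical process.
    weights = {"benefit": 0.5, "cost": 0.3, "risk": 0.2}

    def score(attrs):
        # Benefit counts positively; cost and risk discount it.
        return (weights["benefit"] * attrs["benefit"]
                - weights["cost"] * attrs["cost"]
                - weights["risk"] * attrs["risk"])

    for name, attrs in sorted(options.items(), key=lambda kv: score(kv[1]), reverse=True):
        print(f"{name}: score = {score(attrs):+.3f}")
    ```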

  8. Integrated Computer System of Management in Logistics

    NASA Astrophysics Data System (ADS)

    Chwesiuk, Krzysztof

    2011-06-01

    This paper aims at presenting a concept of an integrated computer system of management in logistics, particularly in supply and distribution chains. Consequently, the paper includes the basic idea of the concept of computer-based management in logistics and components of the system, such as CAM and CIM systems in production processes, and management systems for storage, materials flow, and for managing transport, forwarding and logistics companies. The platform which integrates computer-aided management systems is that of electronic data interchange.

  9. Quantile uncertainty and value-at-risk model risk.

    PubMed

    Alexander, Carol; Sarabia, José María

    2012-08-01

    This article develops a methodology for quantifying model risk in quantile risk estimates. The application of quantile estimates to risk assessment has become common practice in many disciplines, including hydrology, climate change, statistical process control, insurance and actuarial science, and the uncertainty surrounding these estimates has long been recognized. Our work is particularly important in finance, where quantile estimates (called Value-at-Risk) have been the cornerstone of banking risk management since the mid 1980s. A recent amendment to the Basel II Accord recommends additional market risk capital to cover all sources of "model risk" in the estimation of these quantiles. We provide a novel and elegant framework whereby quantile estimates are adjusted for model risk, relative to a benchmark which represents the state of knowledge of the authority that is responsible for model risk. A simulation experiment in which the degree of model risk is controlled illustrates how to quantify Value-at-Risk model risk and compute the required regulatory capital add-on for banks. An empirical example based on real data shows how the methodology can be put into practice, using only two time series (daily Value-at-Risk and daily profit and loss) from a large bank. We conclude with a discussion of potential applications to nonfinancial risks. © 2012 Society for Risk Analysis.
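
    As a rough illustration of the quantities involved, the sketch below computes a 99% one-day historical VaR from a synthetic P&L series and a simple non-negative add-on relative to a more conservative benchmark quantile. It is a sketch under stated assumptions, not the article's specific model-risk adjustment or regulatory calculation.

    ```python
    # Toy sketch: historical VaR from daily P&L plus a simple benchmark-relative add-on.
    import numpy as np

    rng = np.random.default_rng(7)
    pnl = rng.standard_t(df=4, size=1500) * 1e5          # synthetic daily P&L (heavy-tailed)

    alpha = 0.01
    var_model = -np.quantile(pnl, alpha)                  # reported (model) VaR
    # Stand-in for a more conservative benchmark quantile estimator; here simply the
    # lower order statistic around the same tail probability.
    var_benchmark = -np.quantile(pnl, alpha, method="lower")
    capital_add_on = max(var_benchmark - var_model, 0.0)

    print(f"model VaR(99%)      : {var_model:,.0f}")
    print(f"benchmark VaR(99%)  : {var_benchmark:,.0f}")
    print(f"model-risk add-on   : {capital_add_on:,.0f}")
    ```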

  10. Knowledge, attitudes, and practices regarding antiretroviral management, reproductive health, sexually transmitted infections, and sexual risk behavior among perinatally HIV-infected youth in Thailand.

    PubMed

    Lolekha, Rangsima; Boon-Yasidhi, Vitharon; Leowsrisook, Pimsiri; Naiwatanakul, Thananda; Durier, Yuitiang; Nuchanard, Wipada; Tarugsa, Jariya; Punpanich, Warunee; Pattanasin, Sarika; Chokephaibulkit, Kulkanya

    2015-01-01

    More than 30% of perinatally HIV-infected children in Thailand are 12 years and older. As these youth become sexually active, there is a risk that they will transmit HIV to their partners. Data on the knowledge, attitudes, and practices (KAP) of HIV-infected youth in Thailand are limited. Therefore, we assessed the KAP of perinatally HIV-infected youth and youth reporting sexual risk behaviors receiving care at two tertiary care hospitals in Bangkok, Thailand and living in an orphanage in Lopburi, Thailand. From October 2010 to July 2011, 197 HIV-infected youth completed an audio computer-assisted self-interview to assess their KAP regarding antiretroviral (ARV) management, reproductive health, sexual risk behaviors, and sexually transmitted infections (STIs). A majority of youth in this study correctly answered questions about HIV transmission and prevention and the importance of taking ARVs regularly. More than half of the youth in this study demonstrated a lack of family planning, reproductive health, and STI knowledge. Girls had more appropriate attitudes toward safe sex and risk behaviors than boys. Although only 5% of the youth reported that they had engaged in sexual intercourse, about a third reported sexual risk behaviors (e.g., having or kissing boy/girlfriend or consuming an alcoholic beverage). We found low condom use and other family planning practices, increasing the risk of HIV and/or STI transmission to sexual partners. Additional resources are needed to improve reproductive health knowledge and reduce risk behavior among HIV-infected youth in Thailand.

  12. Ileal Entrapment within a Paracaecal Hernia Mimicking Acute Appendicitis

    PubMed Central

    Birchley, David

    2009-01-01

    Presented is a case of incarcerated paracaecal hernia mimicking acute appendicitis. The clinical scenario highlights the need for a high index of suspicion in the management of patients with localised peritonism, even in the absence of obstructive symptoms and in the presence of normal laboratory markers of inflammation. Whilst computed tomography might offer a pre-operative diagnosis, in such a low-risk patient laparoscopy offers the combined advantages of immediate diagnosis and definitive treatment of acute pathology. PMID:19317924

  13. Cyber Ricochet: Risk Management and Cyberspace Operations

    DTIC Science & Technology

    2012-07-01

    Cox, U.S. Cyber Command Director of Intelligence Introduction Recent media reports of the ‘Duqu’, ‘Flame’, and ‘Stuxnet’ malware highlight...as the ‘Duqu,’ ‘Flame,’ and ‘Stuxnet’ malware, are just a few of the capabilities that can contribute to mission success and achieve strategic...rely on artificially intelligent agents to dredge up the deepest secrets.” The ‘Duqu’ and ‘Flame’ malware are excellent examples of computer

  14. Supply Chain Risk Management: An Introduction to the Credible Threat

    DTIC Science & Technology

    2016-08-01

    connected. All we have to do is pull out our phones, tablets, laptops or any other similar device and get the information we need virtually...with the supply chain, especially when it comes to the use of electronics, computers and other computerized components. The attempt to remove or...Trusted State-of-the-Art Microelectronics Strategy Study,” July 2015, Potomac Institute for Policy Studies report. Figure 2. An Organization’s

  15. Computation of Material Demand in the Risk Assessment and Mitigation Framework for Strategic Materials (RAMF-SM) Process

    DTIC Science & Technology

    2015-08-01

    Congress concerning requirements for the National Defense Stockpile (NDS) of strategic and critical non-fuel materials. RAMF-SM, which was...critical non-fuel materials. The NDS was established in the World War II era and has been managed by the Department of Defense (DOD) since 1988. By...Department of the Interior. An alternative algorithm is used for materials with intensive defense demands.

  16. Automated acute kidney injury alerts.

    PubMed

    Kashani, Kianoush B

    2018-05-02

    Acute kidney injury (AKI) is one of the most common and probably one of the more consequential complications of critical illnesses. Recent information indicates that it is at least partially preventable; however, progress in its prevention, management, and treatment has been hindered by the scarcity of knowledge for effective interventions, inconsistencies in clinical practices, late identification of patients at risk for or with AKI, and limitations of access to best practices for prevention and management of AKI. Growing use of electronic health records has provided a platform for computer science to engage in data mining and processing, not only for early detection of AKI but also for the development of risk-stratification strategies and computer clinical decision-support (CDS) systems. Despite promising perspectives, the literature regarding the impact of AKI electronic alerts and CDS systems has been conflicting. Some studies have reported improvement in care processes and patient outcomes, whereas others have shown no effect on clinical outcomes and yet demonstrated an increase in the use of resources. These discrepancies are thought to be due to multiple factors that may be related to technology, human factors, modes of delivery of information to clinical providers, and level of expectations regarding the impact on patient outcomes. This review appraises the current body of knowledge and provides some outlines regarding research into and clinical aspects of CDS systems for AKI. Copyright © 2018 International Society of Nephrology. Published by Elsevier Inc. All rights reserved.
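
    A creatinine-based e-alert of the kind discussed above can be sketched as a simple rule over a timestamped laboratory series. The thresholds follow the commonly used KDIGO-style criteria (a rise of at least 0.3 mg/dL within 48 hours, or at least 1.5 times the lowest recent value within 7 days); the data structure and function are illustrative, not a vendor CDS implementation.

    ```python
    # Minimal sketch of a KDIGO-style creatinine alert rule over a timestamped series.
    from datetime import datetime, timedelta

    def aki_alert(creatinine_series):
        """creatinine_series: list of (datetime, mg_dL) sorted by time. Returns True/False."""
        for i, (t_now, cr_now) in enumerate(creatinine_series):
            window_48h = [cr for t, cr in creatinine_series[:i] if t_now - t <= timedelta(hours=48)]
            window_7d = [cr for t, cr in creatinine_series[:i] if t_now - t <= timedelta(days=7)]
            if window_48h and cr_now - min(window_48h) >= 0.3:
                return True
            if window_7d and cr_now >= 1.5 * min(window_7d):
                return True
        return False

    series = [
        (datetime(2018, 5, 1, 8), 0.9),
        (datetime(2018, 5, 2, 8), 1.1),
        (datetime(2018, 5, 3, 8), 1.4),   # +0.5 mg/dL within 48 h -> alert
    ]
    print(aki_alert(series))  # True
    ```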

  17. Evaluation of Microcomputer-Based Operation and Maintenance Management Systems for Army Water/Wastewater Treatment Plant Operation.

    DTIC Science & Technology

    1986-07-01

    [Table-of-contents and figure-list excerpt from the report; recoverable headings include: Computer-Aided Operation Management System; Functions of an Off-Line Computer-Aided Operation Management System; Applications; System Comparisons; and figures on Hardware Components, Basic Functions of a Computer-Aided Operation Management System, Plant Visits, Computer-Aided Operation Management Systems Reviewed for Analysis of Basic Functions, and Progress of Software System Installation.]

  18. Design and Development of a Clinical Risk Management Tool Using Radio Frequency Identification (RFID)

    PubMed Central

    Pourasghar, Faramarz; Tabrizi, Jafar Sadegh; Yarifard, Khadijeh

    2016-01-01

    Background: Patient safety is one of the most important elements of the quality of healthcare. It means preventing any harm to patients during the medical care process. Objective: This paper introduces a cost-effective tool in which Radio Frequency Identification (RFID) technology is used to identify medical errors in hospital. Methods: The proposed clinical error management system (CEMS) consists of a reader device, a transfer/receiver device, a database, and managing software. The reader device is wireless and works using radio waves. The reader sends and receives data to/from the database via the transfer/receiver device, which is connected to the computer via a USB port. The database contains data about patients’ medication orders. Results: The CEMS has the ability to identify clinical errors before they occur and then warns the caregiver with voice and visual messages to prevent the error. This device reduces errors and thus improves patient safety. Conclusion: A new tool including software and hardware was developed in this study. Application of this tool in clinical settings can help nurses prevent medical errors. It can also be a useful tool for clinical risk management. Using this device can improve patient safety to a considerable extent and thus improve the quality of healthcare. PMID:27147802

  19. Design and Development of a Clinical Risk Management Tool Using Radio Frequency Identification (RFID).

    PubMed

    Pourasghar, Faramarz; Tabrizi, Jafar Sadegh; Yarifard, Khadijeh

    2016-04-01

    Patient safety is one of the most important elements of the quality of healthcare. It means preventing any harm to patients during the medical care process. This paper introduces a cost-effective tool in which Radio Frequency Identification (RFID) technology is used to identify medical errors in hospital. The proposed clinical error management system (CEMS) consists of a reader device, a transfer/receiver device, a database, and managing software. The reader device is wireless and works using radio waves. The reader sends and receives data to/from the database via the transfer/receiver device, which is connected to the computer via a USB port. The database contains data about patients' medication orders. The CEMS has the ability to identify clinical errors before they occur and then warns the caregiver with voice and visual messages to prevent the error. This device reduces errors and thus improves patient safety. A new tool including software and hardware was developed in this study. Application of this tool in clinical settings can help nurses prevent medical errors. It can also be a useful tool for clinical risk management. Using this device can improve patient safety to a considerable extent and thus improve the quality of healthcare.
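
    The error-prevention step described in both records above amounts to checking a scanned patient/medication pair against the stored medication orders before administration. The sketch below illustrates that check with an in-memory dictionary standing in for the CEMS database; the identifiers, fields, and alert text are hypothetical.

    ```python
    # Illustrative pre-administration check against stored medication orders.
    orders = {
        # patient_tag -> list of (drug, dose) orders currently active
        "patient-0451": [("amoxicillin", "500 mg"), ("paracetamol", "1 g")],
        "patient-0452": [("metformin", "850 mg")],
    }

    def check_administration(patient_tag, drug, dose):
        """Return an alert string if the scanned administration does not match an order."""
        patient_orders = orders.get(patient_tag)
        if patient_orders is None:
            return f"ALERT: unknown patient tag {patient_tag}"
        if (drug, dose) not in patient_orders:
            return f"ALERT: {drug} {dose} is not an active order for {patient_tag}"
        return "OK: administration matches an active order"

    print(check_administration("patient-0451", "paracetamol", "1 g"))     # OK
    print(check_administration("patient-0452", "amoxicillin", "500 mg"))  # ALERT
    ```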

  20. Risk identification of agricultural drought for sustainable Agroecosystems

    NASA Astrophysics Data System (ADS)

    Dalezios, N. R.; Blanta, A.; Spyropoulos, N. V.; Tarquis, A. M.

    2014-09-01

    Drought is considered as one of the major natural hazards with a significant impact on agriculture, environment, society and economy. Droughts affect sustainability of agriculture and may result in environmental degradation of a region, which is one of the factors contributing to the vulnerability of agriculture. This paper addresses agrometeorological or agricultural drought within the risk management framework. Risk management consists of risk assessment, as well as a feedback on the adopted risk reduction measures. And risk assessment comprises three distinct steps, namely risk identification, risk estimation and risk evaluation. This paper deals with risk identification of agricultural drought, which involves drought quantification and monitoring, as well as statistical inference. For the quantitative assessment of agricultural drought, as well as the computation of spatiotemporal features, one of the most reliable and widely used indices is applied, namely the vegetation health index (VHI). The computation of VHI is based on satellite data of temperature and the normalized difference vegetation index (NDVI). The spatiotemporal features of drought, which are extracted from VHI, are areal extent, onset and end time, duration and severity. In this paper, a 20-year (1981-2001) time series of the National Oceanic and Atmospheric Administration/advanced very high resolution radiometer (NOAA/AVHRR) satellite data is used, where monthly images of VHI are extracted. Application is implemented in Thessaly, which is the major agricultural drought-prone region of Greece, characterized by vulnerable agriculture. The results show that agricultural drought appears every year during the warm season in the region. The severity of drought is increasing from mild to extreme throughout the warm season, with peaks appearing in the summer. Similarly, the areal extent of drought is also increasing during the warm season, whereas the number of extreme drought pixels is much less than those of mild to moderate drought throughout the warm season. Finally, the areas with diachronic drought persistence can be located. Drought early warning is developed using empirical functional relationships of severity and areal extent. In particular, two second-order polynomials are fitted, one for low and the other for high severity drought classes, respectively. The two fitted curves offer a forecasting tool on a monthly basis from May to October. The results of this drought risk identification effort are considered quite satisfactory offering a prognostic potential. The adopted remote-sensing data and methods have proven very effective in delineating spatial variability and features in drought quantification and monitoring.
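
    The VHI used above is conventionally computed by blending a vegetation condition index (from NDVI) with a temperature condition index (from brightness temperature), each scaled by its multi-year extremes. The sketch below uses that standard Kogan-style formulation with equal weights on tiny synthetic arrays; the values and the drought-class threshold are illustrative.

    ```python
    # Sketch of the VHI computation (Kogan-style, alpha = 0.5) on toy arrays.
    import numpy as np

    def vhi(ndvi, bt, ndvi_min, ndvi_max, bt_min, bt_max, alpha=0.5):
        vci = 100.0 * (ndvi - ndvi_min) / (ndvi_max - ndvi_min)   # Vegetation Condition Index
        tci = 100.0 * (bt_max - bt) / (bt_max - bt_min)           # Temperature Condition Index
        return alpha * vci + (1.0 - alpha) * tci

    # Monthly NDVI and brightness temperature for a 2x2-pixel toy scene, plus the
    # multi-year climatological extremes for the same month and pixels.
    ndvi = np.array([[0.25, 0.40], [0.15, 0.55]])
    bt = np.array([[305.0, 298.0], [312.0, 295.0]])
    ndvi_min, ndvi_max = 0.10, 0.60
    bt_min, bt_max = 290.0, 315.0

    vhi_map = vhi(ndvi, bt, ndvi_min, ndvi_max, bt_min, bt_max)
    print(np.round(vhi_map, 1))
    # Values below ~40 are commonly classed as drought, with lower values more severe.
    ```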

  1. Management of Asymptomatic Renal Stones in Astronauts

    NASA Technical Reports Server (NTRS)

    Reyes, David; Locke, James

    2016-01-01

    Introduction: Management guidelines were created to screen and manage asymptomatic renal stones in U.S. astronauts. The risks for renal stone formation in astronauts due to bone loss and hypercalcuria are unknown. Astronauts have a stone risk which is about the same as commercial aviation pilots, which is about half that of the general population. However, proper management of this condition is still crucial to mitigate health and mission risks in the spaceflight environment. Methods: An extensive review of the literature and current aeromedical standards for the monitoring and management of renal stones was done. The NASA Flight Medicine Clinic's electronic medical record and Longitudinal Survey of Astronaut Health were also reviewed. Using this work, a screening and management algorithm was created that takes into consideration the unique operational environment of spaceflight. Results: Renal stone screening and management guidelines for astronauts were created based on accepted standards of care, with consideration to the environment of spaceflight. In the proposed algorithm, all astronauts will receive a yearly screening ultrasound for renal calcifications, or mineralized renal material (MRM). Any areas of MRM, 3 millimeters or larger, are considered a positive finding. Three millimeters approaches the detection limit of standard ultrasound, and several studies have shown that any stone that is 3 millimeters or less has an approximately 95 percent chance of spontaneous passage. For mission-assigned astronauts, any positive ultrasound study is followed by low-dose renal computed tomography (CT) scan, and flexible ureteroscopy if CT is positive. Other specific guidelines were also created. Discussion: The term "MRM" is used to account for small areas of calcification that may be outside the renal collecting system, and allows objectivity without otherwise constraining the diagnostic and treatment process for potentially very small calcifications of uncertain significance. However, a small asymptomatic MRM or stone within the renal collecting system may become symptomatic, and so affect launch and flight schedules, cause incapacitation during flight, and ultimately require medical evacuation. For exploration class missions, evacuation is unlikely. The new screening and management algorithm allows better management of mission risks, and will define the true incidence of renal stones in U.S. astronauts. This information will be used to refine future screening, countermeasures and treatment methods; and will also inform the needed capabilities to be flown on exploration-class missions.

  2. Risk and Vulnerability Assessment Using Cybernomic Computational Models: Tailored for Industrial Control Systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Abercrombie, Robert K; Sheldon, Federick T.; Schlicher, Bob G

    2015-01-01

    There are many influencing economic factors to weigh from the defender-practitioner stakeholder point of view that involve cost combined with development/deployment models. Some examples include the cost of the countermeasures themselves, the cost of training, and the cost of maintenance. Meanwhile, we must better anticipate the total cost of a compromise. The return on investment in countermeasures is essentially impact costs (i.e., the costs of violating availability, integrity, and confidentiality/privacy requirements). The natural question arises of choosing the main risks that must be mitigated/controlled and monitored when deciding where to focus security investments. To answer this question, we have investigated the costs/benefits to the attacker and defender to better estimate risk exposure. In doing so, it is important to develop a sound basis for estimating the factors that drive risk exposure, such as the likelihood that a threat will emerge and whether it will be thwarted. This impact assessment framework can provide key information for ranking cybersecurity threats and managing risk.

  3. Spatio-temporal assessment of food safety risks in Canadian food distribution systems using GIS.

    PubMed

    Hashemi Beni, Leila; Villeneuve, Sébastien; LeBlanc, Denyse I; Côté, Kevin; Fazil, Aamir; Otten, Ainsley; McKellar, Robin; Delaquis, Pascal

    2012-09-01

    While geographic information systems (GIS) are widely applied in public health, there have been comparatively few examples of applications that extend to the assessment of risks in food distribution systems. GIS can provide decision makers with strong computing platforms for spatial data management, integration, analysis, querying, and visualization. The present report addresses spatial analyses of a complex food distribution system and defines influence areas as travel-time zones generated through road network analysis on a national scale rather than on a community scale. In addition, a dynamic risk index is defined to translate a contamination event into a public health risk as time progresses. More specifically, in this research, GIS is used to map the Canadian produce distribution system, analyze consumers' accessibility to contaminated product, and estimate the level of risk associated with a contamination event over time, as illustrated in a scenario. Crown Copyright © 2012. Published by Elsevier Ltd. All rights reserved.

  4. Foundational Report Series: Advanced Distribution Management Systems for Grid Modernization, Implementation Strategy for a Distribution Management System

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Singh, Ravindra; Reilly, James T.; Wang, Jianhui

    Electric distribution utilities encounter many challenges to the successful deployment of Distribution Management Systems (DMSs). The key challenges are documented in this report, along with suggestions for overcoming them. This report offers a recommended list of activities for implementing a DMS and takes a strategic approach to implementation from a project management perspective. The project management strategy covers DMS planning, procurement, design, building, testing, installation, commissioning, and system integration issues and solutions. It identifies the risks associated with implementation and suggests strategies utilities can use to mitigate or avoid them altogether. Attention is given to common barriers to successful DMS implementation. The report begins with an overview of the implementation strategy for a DMS and proceeds to put forward a basic approach for procuring hardware and software for a DMS; designing the interfaces with external corporate computing systems such as EMS, GIS, OMS, and AMI; and implementing a complete solution.

  5. A new Web-based medical tool for assessment and prevention of comprehensive cardiovascular risk

    PubMed Central

    Franchi, Daniele; Cini, Davide; Iervasi, Giorgio

    2011-01-01

    Background: Multifactor cardiovascular disease is the leading cause of death; besides well-known cardiovascular risk factors, several emerging factors such as mental stress, diet type, and physical inactivity, have been associated to cardiovascular disease. To date, preventive strategies are based on the concept of absolute risk calculated by different algorithms and scoring systems. However, in general practice the patient’s data collection represents a critical issue. Design: A new multipurpose computer-based program has been developed in order to:1) easily calculate and compare the absolute cardiovascular risk by the Framingham, Procam, and Progetto Cuore algorithms; 2) to design a web-based computerized tool for prospective collection of structured data; 3) to support the doctor in the decision-making process for patients at risk according to recent international guidelines. Methods: During a medical consultation the doctor utilizes a common computer connected by Internet to a medical server where all the patient’s data and software reside. The program evaluates absolute and relative cardiovascular risk factors, personalized patient’s goals, and multiparametric trends, monitors critical parameter values, and generates an automated medical report. Results: In a pilot study on 294 patients (47% males; mean age 60 ± 12 years [±SD]) the global time to collect data at first consultation was 13 ± 11 minutes which declined to 8 ± 7 minutes at the subsequent consultation. In 48.2% of cases the program revealed 2 or more primary risk factor parameters outside guideline indications and gave specific clinical suggestions to return altered parameters to target values. Conclusion: The web-based system proposed here may represent a feasible and flexible tool for clinical management of patients at risk of cardiovascular disease and for epidemiological research. PMID:21445280
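
    One part of the workflow described above, monitoring critical parameter values against personalized guideline targets and flagging those out of range, can be illustrated with a small sketch. The parameter names, target ranges, and patient record below are hypothetical and are not the program's actual rules or the Framingham/Procam coefficients.

    ```python
    # Illustrative sketch: flag cardiovascular risk parameters outside target ranges.
    targets = {
        # parameter -> (lower bound, upper bound); None means unbounded on that side
        "systolic_bp_mmHg": (None, 140),
        "ldl_mg_dl": (None, 115),
        "fasting_glucose_mg_dl": (None, 100),
        "bmi": (18.5, 25),
    }

    patient = {
        "systolic_bp_mmHg": 152,
        "ldl_mg_dl": 98,
        "fasting_glucose_mg_dl": 108,
        "bmi": 27.4,
    }

    def out_of_target(patient_values, target_ranges):
        """Return a list of (parameter, value, target range) tuples needing attention."""
        flagged = []
        for name, value in patient_values.items():
            low, high = target_ranges[name]
            if (low is not None and value < low) or (high is not None and value > high):
                flagged.append((name, value, target_ranges[name]))
        return flagged

    for name, value, bounds in out_of_target(patient, targets):
        print(f"{name}: {value} outside target {bounds}")
    ```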

  6. AEGIS: a wildfire prevention and management information system

    NASA Astrophysics Data System (ADS)

    Kalabokidis, Kostas; Ager, Alan; Finney, Mark; Athanasis, Nikos; Palaiologou, Palaiologos; Vasilakos, Christos

    2016-03-01

    We describe a Web-GIS wildfire prevention and management platform (AEGIS) developed as an integrated and easy-to-use decision support tool to manage wildland fire hazards in Greece (http://aegis.aegean.gr). The AEGIS platform assists with early fire warning, fire planning, fire control and coordination of firefighting forces by providing online access to information that is essential for wildfire management. The system uses a number of spatial and non-spatial data sources to support key system functionalities. Land use/land cover maps were produced by combining field inventory data with high-resolution multispectral satellite images (RapidEye). These data support wildfire simulation tools that allow the users to examine potential fire behavior and hazard with the Minimum Travel Time fire spread algorithm. End-users provide a minimum number of inputs such as fire duration, ignition point and weather information to conduct a fire simulation. AEGIS offers three types of simulations, i.e., single-fire propagation, point-scale calculation of potential fire behavior, and burn probability analysis, similar to the FlamMap fire behavior modeling software. Artificial neural networks (ANNs) were utilized for wildfire ignition risk assessment based on various parameters, training methods, activation functions, pre-processing methods and network structures. The combination of ANNs and expected burned area maps are used to generate integrated output map of fire hazard prediction. The system also incorporates weather information obtained from remote automatic weather stations and weather forecast maps. The system and associated computation algorithms leverage parallel processing techniques (i.e., High Performance Computing and Cloud Computing) that ensure computational power required for real-time application. All AEGIS functionalities are accessible to authorized end-users through a web-based graphical user interface. An innovative smartphone application, AEGIS App, also provides mobile access to the web-based version of the system.

  7. Diagnostic Approach to Pulmonary Hypertension in Premature Neonates

    PubMed Central

    2017-01-01

    Bronchopulmonary dysplasia (BPD) is a form of chronic lung disease in premature infants following respiratory distress at birth. With increasing survival of extremely low birth weight infants, alveolar simplification is the defining lung characteristic of infants with BPD, and along with pulmonary hypertension, increasingly contributes to both respiratory morbidity and mortality in these infants. Growth restricted infants, infants born to mothers with oligohydramnios or following prolonged preterm rupture of membranes are at particular risk for early onset pulmonary hypertension. Altered vascular and alveolar growth particularly in canalicular and early saccular stages of lung development following mechanical ventilation and oxygen therapy, results in developmental lung arrest leading to BPD with pulmonary hypertension (PH). Early recognition of PH in infants with risk factors is important for optimal management of these infants. Screening tools for early diagnosis of PH are evolving; however, echocardiography is the mainstay for non-invasive diagnosis of PH in infants. Cardiac computed tomography (CT) and magnetic resonance are being used as imaging modalities, however their role in improving outcomes in these patients is uncertain. Follow-up of infants at risk for PH will help not only in early diagnosis, but also in appropriate management of these infants. Aggressive management of lung disease, avoidance of hypoxemic episodes, and optimal nutrition determine the progression of PH, as epigenetic factors may have significant effects, particularly in growth-restricted infants. Infants with diagnosis of PH are managed with pulmonary vasodilators and those resistant to therapy need to be worked up for the presence of cardio-vascular anomalies. The management of infants and toddlers with PH, especially following premature birth is an emerging field. Nonetheless, combination therapies in a multi-disciplinary setting improves outcomes for these infants. PMID:28837121

  8. Feasibility of Homomorphic Encryption for Sharing I2B2 Aggregate-Level Data in the Cloud

    PubMed Central

    Raisaro, Jean Louis; Klann, Jeffrey G; Wagholikar, Kavishwar B; Estiri, Hossein; Hubaux, Jean-Pierre; Murphy, Shawn N

    2018-01-01

    The biomedical community is lagging in the adoption of cloud computing for the management of medical data. The primary obstacles are concerns about privacy and security. In this paper, we explore the feasibility of using advanced privacy-enhancing technologies in order to enable the sharing of sensitive clinical data in a public cloud. Our goal is to facilitate sharing of clinical data in the cloud by minimizing the risk of unintended leakage of sensitive clinical information. In particular, we focus on homomorphic encryption, a specific type of encryption that offers the ability to run computation on the data while the data remains encrypted. This paper demonstrates that homomorphic encryption can be used efficiently to compute aggregating queries on the ciphertexts, along with providing end-to-end confidentiality of aggregate-level data from the i2b2 data model. PMID:29888067
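
    The aggregate-level queries described above reduce, in the simplest case, to summing encrypted counts. The sketch below illustrates that with the additively homomorphic Paillier scheme via the python-paillier (phe) package, used here as a stand-in; it is not the paper's scheme or its i2b2 integration.

    ```python
    # Summing encrypted cohort counts with an additively homomorphic scheme (Paillier).
    from phe import paillier

    public_key, private_key = paillier.generate_paillier_keypair(n_length=2048)

    # Each site encrypts its local patient count for a cohort query before upload.
    site_counts = [17, 42, 8]
    ciphertexts = [public_key.encrypt(c) for c in site_counts]

    # The cloud can sum the encrypted counts without ever seeing the plaintexts.
    encrypted_total = ciphertexts[0]
    for ct in ciphertexts[1:]:
        encrypted_total = encrypted_total + ct

    # Only the key holder (e.g. the querying investigator's site) can decrypt.
    print(private_key.decrypt(encrypted_total))  # 67
    ```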

  9. Ontology-based prediction of surgical events in laparoscopic surgery

    NASA Astrophysics Data System (ADS)

    Katić, Darko; Wekerle, Anna-Laura; Gärtner, Fabian; Kenngott, Hannes; Müller-Stich, Beat Peter; Dillmann, Rüdiger; Speidel, Stefanie

    2013-03-01

    Context-aware technologies have great potential to help surgeons during laparoscopic interventions. Their underlying idea is to create systems which can adapt their assistance functions automatically to the situation in the OR, thus relieving surgeons from the burden of managing computer assisted surgery devices manually. To this purpose, a certain kind of understanding of the current situation in the OR is essential. Beyond that, anticipatory knowledge of incoming events is beneficial, e.g. for early warnings of imminent risk situations. To achieve the goal of predicting surgical events based on previously observed ones, we developed a language to describe surgeries and surgical events using Description Logics and integrated it with methods from computational linguistics. Using n-Grams to compute probabilities of followup events, we are able to make sensible predictions of upcoming events in real-time. The system was evaluated on professionally recorded and labeled surgeries and showed an average prediction rate of 80%.
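
    The prediction step described above can be reduced, in its simplest bigram form, to counting which surgical event follows which in labeled recordings and returning the most probable successor. The event names and sequences below are invented for illustration; the actual system combines such n-gram statistics with an ontology in Description Logics.

    ```python
    # Bigram-based prediction of the next surgical event from labeled sequences.
    from collections import Counter, defaultdict

    recorded_surgeries = [
        ["trocar_insertion", "dissection", "clipping", "cutting", "irrigation", "closure"],
        ["trocar_insertion", "dissection", "coagulation", "clipping", "cutting", "closure"],
        ["trocar_insertion", "dissection", "clipping", "cutting", "closure"],
    ]

    bigram_counts = defaultdict(Counter)
    for surgery in recorded_surgeries:
        for current, nxt in zip(surgery, surgery[1:]):
            bigram_counts[current][nxt] += 1

    def predict_next(current_event):
        """Return (most likely next event, probability), or None if unseen."""
        counts = bigram_counts.get(current_event)
        if not counts:
            return None
        event, n = counts.most_common(1)[0]
        return event, n / sum(counts.values())

    print(predict_next("dissection"))  # ('clipping', 0.666...)
    print(predict_next("clipping"))    # ('cutting', 1.0)
    ```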

  10. Feasibility of Homomorphic Encryption for Sharing I2B2 Aggregate-Level Data in the Cloud.

    PubMed

    Raisaro, Jean Louis; Klann, Jeffrey G; Wagholikar, Kavishwar B; Estiri, Hossein; Hubaux, Jean-Pierre; Murphy, Shawn N

    2018-01-01

    The biomedical community is lagging in the adoption of cloud computing for the management of medical data. The primary obstacles are concerns about privacy and security. In this paper, we explore the feasibility of using advanced privacy-enhancing technologies in order to enable the sharing of sensitive clinical data in a public cloud. Our goal is to facilitate sharing of clinical data in the cloud by minimizing the risk of unintended leakage of sensitive clinical information. In particular, we focus on homomorphic encryption, a specific type of encryption that offers the ability to run computation on the data while the data remains encrypted. This paper demonstrates that homomorphic encryption can be used efficiently to compute aggregating queries on the ciphertexts, along with providing end-to-end confidentiality of aggregate-level data from the i2b2 data model.

  11. Continuous Risk Management: An Overview

    NASA Technical Reports Server (NTRS)

    Rosenberg, Linda; Hammer, Theodore F.

    1999-01-01

    Software risk management is important because it helps avoid disasters, rework, and overkill, but more importantly because it stimulates win-win situations. The objectives of software risk management are to identify, address, and eliminate software risk items before they become threats to success or major sources of rework. In general, good project managers are also good managers of risk. It makes good business sense for all software development projects to incorporate risk management as part of project management. The Software Assurance Technology Center (SATC) at NASA GSFC has been tasked with the responsibility for developing and teaching a systems-level course on risk management that provides information on how to implement risk management. The course was developed in conjunction with the Software Engineering Institute at Carnegie Mellon University, then tailored to the NASA systems community. This is an introductory tutorial to continuous risk management based on this course. The rationale for continuous risk management and how it is incorporated into project management are discussed. The risk management structure of six functions is discussed in sufficient depth for managers to understand what is involved in risk management and how it is implemented. These functions include: (1) Identify the risks in a specific format; (2) Analyze the risk probability, impact/severity, and timeframe; (3) Plan the approach; (4) Track the risk through data compilation and analysis; (5) Control and monitor the risk; (6) Communicate and document the process and decisions.
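
    A minimal data structure for the identify/analyze/plan/track cycle outlined above might look like the sketch below; the field names and the probability-times-impact exposure measure are common conventions assumed for illustration, not the SATC course materials.

    ```python
    # Illustrative risk-register entry following the six-function outline above.
    from dataclasses import dataclass, field
    from datetime import date

    @dataclass
    class Risk:
        statement: str                 # "Identify": condition plus consequence
        probability: float             # "Analyze": 0..1
        impact: int                    # 1 (minor) .. 5 (severe)
        timeframe: date
        mitigation: str = ""           # "Plan"
        status_notes: list = field(default_factory=list)   # "Track" / "Control"

        @property
        def exposure(self) -> float:
            return self.probability * self.impact

    register = [
        Risk("Flight software schedule slips; launch date at risk", 0.4, 4, date(2000, 6, 1),
             mitigation="Add independent build verification"),
        Risk("Key test facility unavailable; integration delayed", 0.2, 3, date(2000, 3, 15)),
    ]
    for r in sorted(register, key=lambda r: r.exposure, reverse=True):
        print(f"exposure={r.exposure:.1f}  {r.statement}")
    ```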

  12. Energy Consumption Management of Virtual Cloud Computing Platform

    NASA Astrophysics Data System (ADS)

    Li, Lin

    2017-11-01

    Research on energy consumption management for virtual cloud computing platforms requires a deeper understanding of how virtual machines and the underlying platform consume energy; only then can the associated management problems be solved. The central challenge lies in data centers with high energy consumption, which calls for new scientific approaches. Virtualization technology and cloud computing have become powerful tools in everyday life, work, and production because of their many advantages, including high resource utilization, and both continue to develop rapidly. Their presence is therefore essential in the constantly developing information age. This paper summarizes, explains, and further analyzes the energy consumption management questions raised by virtual cloud computing platforms, with the aim of giving readers a clearer understanding of energy consumption management on such platforms and of its benefits for many aspects of daily life and work.

  13. Probability concepts in quality risk management.

    PubMed

    Claycamp, H Gregg

    2012-01-01

    Essentially any concept of risk is built on fundamental concepts of chance, likelihood, or probability. Although "risk" generally describes a probability of loss of something of value, given that a risk-generating event will occur or has occurred, it is ironic that the quality risk management literature and guidelines on quality risk management methodologies and their respective tools focus on managing severity but are relatively silent on the in-depth meaning and uses of "probability." Pharmaceutical manufacturers are expanding their use of quality risk management to identify and manage risks to the patient that might occur in phases of the pharmaceutical life cycle from drug development to manufacture, and from marketing to product discontinuation. The probability concept is typically applied by risk managers as a combination of frequency-based (data-based) measures of probability and a subjective "degree of belief" meaning of probability. Probability as a concept that is crucial for understanding and managing risk is discussed through examples, from the most general scenario-defining and ranking tools that use probability implicitly to more specific probabilistic tools in risk management. A rich history of probability in risk management applied to other fields suggests that high-quality risk management decisions benefit from the implementation of more thoughtful probability concepts in both risk modeling and risk management.
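    One concrete way to combine a frequency-based estimate with a "degree of belief" is a Beta-Binomial update, in which a subjective prior is revised by observed counts. This is a generic illustration of the idea, not a method taken from the guidance documents the abstract discusses; the numbers are invented.

      # Prior "degree of belief": roughly a 1-in-100 defect rate, weakly held.
      alpha_prior, beta_prior = 1.0, 99.0

      # Frequency data: 3 defective units observed in 1,000 inspected batches.
      defects, trials = 3, 1000

      alpha_post = alpha_prior + defects
      beta_post = beta_prior + (trials - defects)

      posterior_mean = alpha_post / (alpha_post + beta_post)
      print(f"Posterior mean defect probability: {posterior_mean:.4f}")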

  14. Portfolio management under sudden changes in volatility and heterogeneous investment horizons

    NASA Astrophysics Data System (ADS)

    Fernandez, Viviana; Lucey, Brian M.

    2007-03-01

    We analyze the implications for portfolio management of accounting for conditional heteroskedasticity and sudden changes in volatility, based on a sample of weekly data of the Dow Jones Country Titans, the CBT-municipal bond, spot and futures prices of commodities for the period 1992-2005. To that end, we first proceed to utilize the ICSS algorithm to detect long-term volatility shifts, and incorporate that information into PGARCH models fitted to the returns series. At the next stage, we simulate returns series and compute a wavelet-based value at risk, which takes into consideration the investor's time horizon. We repeat the same procedure for artificial data generated from semi-parametric estimates of the distribution functions of returns, which account for fat tails. Our estimation results show that neglecting GARCH effects and volatility shifts may lead to an overestimation of financial risk at different time horizons. In addition, we conclude that investors benefit from holding commodities as their low or even negative correlation with stock and bond indices contribute to portfolio diversification.
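    A stripped-down version of the value-at-risk step (ignoring the wavelet decomposition, volatility-shift detection and PGARCH fitting used in the paper) can be sketched as an empirical quantile of simulated or historical returns; the toy return series below is only for illustration.

      import numpy as np

      def value_at_risk(returns: np.ndarray, confidence: float = 0.99) -> float:
          """Empirical VaR: the loss exceeded with probability (1 - confidence)."""
          return -np.quantile(returns, 1.0 - confidence)

      rng = np.random.default_rng(0)
      weekly_returns = rng.standard_t(df=4, size=5000) * 0.02   # fat-tailed toy returns
      print(f"99% weekly VaR: {value_at_risk(weekly_returns):.3%}")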

  15. Extensive Mobile Thrombus of the Internal Carotid Discovered After Intravenous Thrombolysis

    PubMed Central

    Fugate, Jennifer E.; Hocker, Sara E.

    2016-01-01

    This case report describes a rare presentation of ischemic stroke secondary to an extensive internal carotid artery thrombus, subsequent therapeutic dilemma, and clinical management. A 58-year-old man was administered intravenous (IV) thrombolysis for right middle cerebral artery territory ischemic stroke symptoms. A computed tomography angiogram of the head and neck following thrombolysis showed a longitudinally extensive internal carotid artery thrombus originating at the region of high-grade calcific stenosis. Mechanical embolectomy was deferred because of risk of clot dislodgement and mild neurological symptoms. Recumbency and hemodynamic augmentation were used acutely to support cerebral perfusion. Anticoagulation was started 24 hours after thrombolysis. Carotid endarterectomy was completed successfully within 1 week of presentation. Clinical outcome was satisfactory with discharge modified Rankin Scale score 0. A longitudinally extensive carotid artery thrombus poses a risk of dislodgement and hemispheric stroke. Optimal management in these cases is not known with certainty. In our case, IV thrombolysis, hemodynamic augmentation, delayed anticoagulation, and carotid endarterectomy resulted in a favorable clinical outcome. PMID:28400904

  16. Management of advanced NK/T-cell lymphoma.

    PubMed

    Tse, Eric; Kwong, Yok-Lam

    2014-09-01

    NK/T-cell lymphomas are aggressive malignancies, and the outlook is poor when conventional anthracycline-containing regimens designed for B-cell lymphomas are used. With the advent of L-asparaginase-containing regimens, treatment outcome has significantly improved. L-asparaginase-containing regimens are now considered the standard in the management of NK/T-cell lymphomas. In advanced diseases, however, outcome remains unsatisfactory, with durable remission achieved in only about 50% of cases. Stratification of patients with advanced NK/T-cell lymphomas is needed, so that poor-risk patients can be given additional therapy to improve outcome. Conventional presentation parameters are untested and appear inadequate for prognostication when L-asparaginase-containing regimens are used. Recent evidence suggests that dynamic factors during treatment and interim assessment, including Epstein-Barr virus (EBV) DNA quantification and positron emission tomography computed tomography findings, are more useful in patient stratification. The role of high-dose chemotherapy and haematopoietic stem cell transplantation requires evaluation in an overall risk-adapted treatment algorithm.

  17. Aqueduct: a methodology to measure and communicate global water risks

    NASA Astrophysics Data System (ADS)

    Gassert, Francis; Reig, Paul

    2013-04-01

    The Aqueduct Water Risk Atlas (Aqueduct) is a publicly available, global database and interactive tool that maps indicators of water related risks for decision makers worldwide. Aqueduct makes use of the latest geo-statistical modeling techniques to compute a composite index and translate the most recently available hydrological data into practical information on water related risks for companies, investors, and governments alike. Twelve global indicators are grouped into a Water Risk Framework designed in response to the growing concerns from private sector actors around water scarcity, water quality, climate change, and increasing demand for freshwater. The Aqueduct framework organizes indicators into three categories of risk that bring together multiple dimensions of water related risk into comprehensive aggregated scores and includes indicators of water stress, variability in supply, storage, flood, drought, groundwater, water quality and social conflict, addressing both spatial and temporal variation in water hazards. Indicators are selected based on relevance to water users, availability and robustness of global data sources, and expert consultation, and are collected from existing datasets or derived from a Global Land Data Assimilation System (GLDAS) based integrated water balance model. Indicators are normalized using a threshold approach, and composite scores are computed using a linear aggregation scheme that allows for dynamic weighting to capture users' unique exposure to water hazards. By providing consistent scores across the globe, the Aqueduct Water Risk Atlas enables rapid comparison across diverse aspects of water risk. Companies can use this information to prioritize actions, investors to leverage financial interest to improve water management, and governments to engage with the private sector to seek solutions for more equitable and sustainable water governance. The Aqueduct Water Risk Atlas enables practical applications of scientific data, helping non-expert audiences better understand and evaluate risks facing water users. This presentation will discuss the methodology used to combine the indicator values into aggregated risk scores and lessons learned from working with diverse audiences in academia, development institutions, and the public and private sectors.
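    The threshold normalization and dynamically weighted linear aggregation described above can be illustrated as follows; the indicator names, thresholds and weights are made-up placeholders, not the published Aqueduct values.

      def normalize(value, thresholds):
          """Map a raw indicator onto a 0-5 score using ordered category thresholds."""
          score = 0
          for t in thresholds:
              if value >= t:
                  score += 1
          return min(score, 5)

      # Hypothetical raw indicator values and thresholds for one catchment.
      indicators = {"baseline_water_stress": 0.62, "drought_severity": 0.35, "flood_occurrence": 11.0}
      thresholds = {"baseline_water_stress": [0.1, 0.2, 0.4, 0.8, 1.0],
                    "drought_severity": [0.1, 0.25, 0.5, 0.75, 0.9],
                    "flood_occurrence": [1, 3, 7, 15, 30]}

      # Dynamic user weights capturing a particular sector's exposure profile.
      weights = {"baseline_water_stress": 0.5, "drought_severity": 0.3, "flood_occurrence": 0.2}

      scores = {k: normalize(v, thresholds[k]) for k, v in indicators.items()}
      composite = sum(weights[k] * scores[k] for k in scores) / sum(weights.values())
      print(scores, round(composite, 2))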

  18. Combining operational models and data into a dynamic vessel risk assessment tool for coastal regions

    NASA Astrophysics Data System (ADS)

    Fernandes, R.; Braunschweig, F.; Lourenço, F.; Neves, R.

    2016-02-01

    The technological evolution in terms of computational capacity, data acquisition systems, numerical modelling and operational oceanography is providing opportunities for designing and building holistic approaches and complex tools for new, more efficient management (planning, prevention and response) of coastal water pollution risk events. A combined methodology to dynamically estimate time- and space-varying individual vessel accident risk levels and shoreline contamination risk from ships has been developed, integrating numerical metocean forecasts and oil spill simulations with vessel tracking from automatic identification systems (AIS). The risk rating combines the likelihood of an oil spill occurring from a vessel navigating in a study area - the Portuguese continental shelf - with the assessed consequences to the shoreline. The spill likelihood is based on dynamic marine weather conditions and statistical information from previous accidents. The shoreline consequences reflect the virtual amount of spilled oil reaching the shoreline and its environmental and socio-economic vulnerabilities. The oil reaching the shoreline is quantified with an oil spill fate and behaviour model running multiple virtual spills from vessels over time, or, as an alternative, with a correction factor based on vessel distance from the coast. Shoreline risks can be computed in real time or from previously obtained data. Results show the ability of the proposed methodology to estimate risk levels that are properly sensitive to dynamic metocean conditions and to oil transport behaviour. The integration of metocean and oil spill models with coastal vulnerability and AIS data in the quantification of risk enhances maritime situational awareness and decision support, providing a more realistic approach to the assessment of shoreline impacts. Risk assessment from historical data can help identify typical risk patterns ("hot spots") or support sensitivity analyses for specific conditions, whereas real-time risk levels can be used for the prioritization of individual ships, geographical areas, strategic tug positioning and the implementation of dynamic risk-based vessel traffic monitoring.
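    The rating logic described (spill likelihood combined with shoreline consequence) can be summarized in a few lines; the factors and scales below are illustrative placeholders rather than the authors' calibrated values.

      def spill_likelihood(base_accident_rate, weather_factor, traffic_factor):
          """Per-vessel likelihood, scaled by dynamic metocean and traffic conditions."""
          return min(1.0, base_accident_rate * weather_factor * traffic_factor)

      def shoreline_consequence(oil_reaching_shore_tonnes, vulnerability_index):
          """Consequence score combining virtual spilled oil ashore and coastal vulnerability."""
          return oil_reaching_shore_tonnes * vulnerability_index

      def vessel_risk(base_rate, weather, traffic, oil_ashore, vulnerability):
          return spill_likelihood(base_rate, weather, traffic) * shoreline_consequence(oil_ashore, vulnerability)

      # A tanker in rough weather near a highly vulnerable stretch of coast.
      print(vessel_risk(base_rate=1e-4, weather=3.0, traffic=1.5, oil_ashore=120.0, vulnerability=0.8))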

  19. Gender, Race, and Risk: Intersectional Risk Management in the Sale of Sex Online.

    PubMed

    Moorman, Jessica D; Harrison, Kristen

    2016-09-01

    Sex worker experience of risk (e.g., physical violence or rape) is shaped by race, gender, and context. For web-based sex workers, experience of risk is comparatively minimal; what is unclear is how web-based sex workers manage risk and if online advertising plays a role in risk management. Building on intersectionality theory and research exploring risk management in sex work, we content-analyzed 600 escort advertisements from Backpage.com ( http://www.backpage.com ) to explore risk management in web-based sex work. To guide our research we asked: Do advertisements contain risk management messages? Does the use of risk management messaging differ by sex worker race or gender? Which groups have the highest overall use of risk management messages? Through a multivariate analysis of covariance (MANCOVA) we found that advertisements contained risk management messages and that uses of these phrases varied by race and gender. Blacks, women, and transgender women drove the use of risk management messages. Black and White transgender women had the highest overall use of these phrases. We conclude that risk management is an intersectional practice and that the use of risk management messages is a venue-specific manifestation of broader risk management priorities found in all venues where sex is sold.

  20. [Lessons learned from a distribution incident at the Alps-Mediterranean Division of the French Blood Establishment].

    PubMed

    Legrand, D

    2008-11-01

    The Alps-Mediterranean division of the French blood establishment (EFS Alpes-Mediterranée) has implemented a risk management program. Within this framework, the labile blood product distribution process was assessed to identify critical steps. Subsequently, safety measures were instituted including computer-assisted decision support, detailed written instructions and control checks at each step. Failure of these measures to prevent an incident underlines the vulnerability of the process to the human factor. Indeed root cause analysis showed that the incident was due to underestimation of the danger by one individual. Elimination of this type of risk will require continuous training, testing and updating of personnel. Identification and reporting of nonconformities will allow personnel at all levels (local, regional, and national) to share lessons and implement appropriate risk mitigation strategies.

  1. Engineering models for catastrophe risk and their application to insurance

    NASA Astrophysics Data System (ADS)

    Dong, Weimin

    2002-06-01

    Internationally, earthquake insurance, like other lines of insurance (fire, auto), has traditionally adopted an actuarial approach, that is, setting insurance rates on the basis of historical loss experience. Because earthquakes are rare events with severe consequences, irrational premium rates and a poor understanding of the scale of potential losses left many insurance companies insolvent after the 1994 Northridge earthquake. Along with recent advances in earth science, computer science and engineering, computerized loss estimation methodologies based on first principles have been developed to the point that losses from destructive earthquakes can be quantified with reasonable accuracy using scientific modeling techniques. This paper introduces how engineering models can help quantify earthquake risk and how the insurance industry can use this information to manage its risk in the United States and abroad.

  2. Fracture risk assessment: improved evaluation of vertebral integrity among metastatic cancer patients to aid in surgical decision-making

    NASA Astrophysics Data System (ADS)

    Augustine, Kurt E.; Camp, Jon J.; Holmes, David R.; Huddleston, Paul M.; Lu, Lichun; Yaszemski, Michael J.; Robb, Richard A.

    2012-03-01

    Failure of the spine's structural integrity from metastatic disease can lead to both pain and neurologic deficit. Fractures that require treatment occur in over 30% of bony metastases. Our objective is to use computed tomography (CT) in conjunction with previously developed analytic techniques to predict fracture risk in cancer patients with metastatic disease to the spine. Current clinical practice for cancer patients with spine metastasis often requires an empirical decision regarding spinal reconstructive surgery. Early image-based software systems used for CT analysis are time-consuming and poorly suited for clinical application. The Biomedical Image Resource (BIR) at Mayo Clinic, Rochester, has developed an image analysis computer program that calculates, from CT scans, the residual load-bearing capacity of a vertebra with metastatic cancer. The Spine Cancer Assessment (SCA) program is built on a platform designed for clinical practice, with a workflow format that allows for rapid selection of patient CT exams, followed by guided image analysis tasks, resulting in a fracture risk report. The analysis features allow the surgeon to quickly isolate a single vertebra and obtain an immediate pre-surgical, multiple-parallel-section composite beam fracture risk analysis based on algorithms developed at Mayo Clinic. The analysis software is undergoing clinical validation studies. We expect this approach will facilitate patient management and the use of reliable guidelines for selecting among various treatment options based on fracture risk.

  3. Project Risk Management

    NASA Technical Reports Server (NTRS)

    Miles, R. F., Jr.

    1995-01-01

    Project risk management is primarily concerned with performance, reliability, cost, and schedule. Environmental risk management is primarily concerned with human health and ecological hazards and likelihoods. This paper discusses project risk management and compares it to environmental risk management, both with respect to goals and implementation. The approach of the Jet Propulsion Laboratory to risk management is presented as an example of a project risk management approach that is an extension to NASA NHB 7120.5: Management of Major System Programs and Projects.

  4. Computer models for economic and silvicultural decisions

    Treesearch

    Rosalie J. Ingram

    1989-01-01

    Computer systems can help simplify decisionmaking to manage forest ecosystems. We now have computer models to help make forest management decisions by predicting changes associated with a particular management action. Models also help you evaluate alternatives. To be effective, the computer models must be reliable and appropriate for your situation.

  5. Implications of Big Data Analytics on Population Health Management.

    PubMed

    Bradley, Paul S

    2013-09-01

    As healthcare providers transition to outcome-based reimbursements, it is imperative that they make the transition to population health management to stay viable. Providers already have big data assets in the form of electronic health records and financial billing systems. Integrating these disparate sources into patient-centered datasets provides the foundation for probabilistic modeling of their patient populations. These models are the core technology used to compute and track the health and financial risk status of the patient population being served. We show how the probabilistic formulation allows for straightforward, early identification of a change in health and risk status. Knowing when a patient is likely to shift to a less healthy, higher-risk category allows the provider to intervene to avert or delay the shift. These automated, proactive alerts are critical in maintaining and improving the health of a population of patients. We discuss results of leveraging these models with an urban healthcare provider to track and monitor type 2 diabetes patients. When intervention outcome data are available, data mining and predictive modeling technology are primed to recommend the type of intervention (prescriptions, physical therapy, discharge protocols, etc.) with the best likely outcome.
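    The "proactive alert" idea reduces to flagging a patient when the model's probability of moving to a higher-risk category crosses a threshold or jumps sharply between refreshes. The sketch below is a generic illustration with hypothetical fields and thresholds, not the provider's actual model.

      def shift_alert(prob_high_risk_now, prob_high_risk_prev, threshold=0.3, jump=0.1):
          """Flag a patient whose modeled probability of entering the high-risk
          category is above a threshold or has jumped sharply since the last refresh."""
          return prob_high_risk_now >= threshold or (prob_high_risk_now - prob_high_risk_prev) >= jump

      patients = [
          {"id": "A12", "p_now": 0.34, "p_prev": 0.28},   # crosses the absolute threshold
          {"id": "B07", "p_now": 0.22, "p_prev": 0.09},   # sharp upward jump
          {"id": "C55", "p_now": 0.12, "p_prev": 0.11},
      ]
      flagged = [p["id"] for p in patients if shift_alert(p["p_now"], p["p_prev"])]
      print(flagged)   # ['A12', 'B07']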

  6. Estimating earthquake-induced failure probability and downtime of critical facilities.

    PubMed

    Porter, Keith; Ramer, Kyle

    2012-01-01

    Fault trees have long been used to estimate failure risk in earthquakes, especially for nuclear power plants (NPPs). One interesting application is that one can assess and manage the probability that two facilities - a primary and backup - would be simultaneously rendered inoperative in a single earthquake. Another is that one can calculate the probabilistic time required to restore a facility to functionality, and the probability that, during any given planning period, the facility would be rendered inoperative for any specified duration. A large new peer-reviewed library of component damageability and repair-time data for the first time enables fault trees to be used to calculate the seismic risk of operational failure and downtime for a wide variety of buildings other than NPPs. With the new library, seismic risk of both the failure probability and probabilistic downtime can be assessed and managed, considering the facility's unique combination of structural and non-structural components, their seismic installation conditions, and the other systems on which the facility relies. An example is offered of real computer data centres operated by a California utility. The fault trees were created and tested in collaboration with utility operators, and the failure probability and downtime results validated in several ways.
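    The core fault-tree arithmetic, combining independent component failure probabilities through AND and OR gates, can be written compactly; the tree and probabilities below are a toy example, not the data-centre model from the study.

      from functools import reduce

      def and_gate(probs):
          """All inputs must fail: product of independent failure probabilities."""
          return reduce(lambda a, b: a * b, probs, 1.0)

      def or_gate(probs):
          """Any input failing causes failure: complement of all inputs surviving."""
          return 1.0 - reduce(lambda a, b: a * (1.0 - b), probs, 1.0)

      # Toy tree: the facility fails if utility power AND the backup generator fail,
      # OR if the cooling system fails.
      p_power, p_generator, p_cooling = 0.05, 0.10, 0.02
      p_facility = or_gate([and_gate([p_power, p_generator]), p_cooling])
      print(f"P(facility inoperative) = {p_facility:.4f}")   # ~0.0249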

  7. Anesthesia in patients with infectious disease caused by multi-drug resistant bacteria.

    PubMed

    Einav, Sharon; Wiener-Well, Yonit

    2017-06-01

    Up to 50% of specific bacterial strains in healthcare admission facilities are multi-drug resistant organisms (MDROs). Involvement of anesthesiologists in management of patients carrying/at risk of carrying MDROs may decrease transmission in the Operating Room (OR). Anesthesiologists, their work area and tools have all been implicated in MDRO outbreaks. Causes include contamination of external ventilation circuits and noncontribution of filters to prevention, inappropriate decontamination procedures for nondisposable equipment (e.g. laryngoscopes, bronchoscopes and stethoscopes) and the anesthesia workplace (e.g. external surfaces of cart and anesthesia machine, telephones and computer keyboards) during OR cleaning and lack of training in sterile drug management. Discussions regarding the management of potential MDRO carriers must include anesthesia providers to optimize infection control interventions as well as the anesthesia method, the location of surgery and recovery and the details of patient transport. Anesthesia staff must learn to identify patients at risk for MDRO infection. Antibiotic prophylaxis, although not evidence based, should adhere to known best practices. Adjuvant therapies (e.g. intranasal Mupirocin and bathing with antiseptics) should be considered. Addition of nonmanual OR cleaning methods such as ultraviolet irradiation or gaseous decontamination is encouraged. Anesthesiologists must undergo formal training in sterile drug preparation and administration.

  8. Predicting regime shifts in flow of the Colorado River

    USGS Publications Warehouse

    Gangopadhyay, Subhrendu; McCabe, Gregory J.

    2010-01-01

    The effects of continued global warming on water resources are a concern for water managers and stakeholders. In the western United States, where the combined climatic demand and consumptive use of water is equal to or greater than the natural supply of water for some locations, there is growing concern regarding the sustainability of future water supplies. In addition to the adverse effects of warming on water supply, another issue for water managers is accounting for, and managing, the effects of natural climatic variability, particularly persistently dry and wet periods. Analyses of paleo-reconstructions of Upper Colorado River basin (UCRB) flow demonstrate that severe sustained droughts, and persistent pluvial periods, are a recurring characteristic of hydroclimate in the Colorado River basin. Shifts between persistently dry and wet regimes (e.g., decadal to multi-decadal variability (D2M)) have important implications for water supply and water management. In this study, paleo-reconstructions of UCRB flow are used to compute the risks of shifts between persistently wet and dry regimes given the length of time in a specific regime. Results indicate that low-frequency variability of hydro-climatic conditions, and the statistics that describe this low-frequency variability, can be useful to water managers by providing information about the risk of shifting from one hydrologic regime to another. To manage water resources in the future, water managers will have to understand the joint hydrologic effects of natural climate variability and global warming. These joint effects may produce future hydrologic conditions that are unprecedented in both the instrumental and paleoclimatic records.
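    The risk of a regime shift given the length of time already spent in a regime can be estimated empirically as a conditional frequency over the reconstructed record. The sketch below assumes a binary wet/dry classification of annual flows has already been made; the toy record is invented, not the actual UCRB reconstruction.

      from collections import defaultdict

      def shift_risk_by_run_length(regimes):
          """Given a yearly sequence of 'wet'/'dry' labels, estimate P(shift next year | run length)."""
          counts = defaultdict(lambda: [0, 0])    # run_length -> [shifts, opportunities]
          run = 1
          for prev, cur in zip(regimes, regimes[1:]):
              counts[run][1] += 1
              if cur != prev:
                  counts[run][0] += 1
                  run = 1
              else:
                  run += 1
          return {k: s / n for k, (s, n) in sorted(counts.items()) if n > 0}

      # Toy classification of reconstructed annual flows.
      record = ["dry"] * 6 + ["wet"] * 4 + ["dry"] * 9 + ["wet"] * 7 + ["dry"] * 3
      print(shift_risk_by_run_length(record))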

  9. The Research on Safety Management Information System of Railway Passenger Based on Risk Management Theory

    NASA Astrophysics Data System (ADS)

    Zhu, Wenmin; Jia, Yuanhua

    2018-01-01

    Based on risk management theory and the PDCA cycle model, the requirements of railway passenger transport safety production are analyzed, and the establishment of a safety risk assessment team is proposed to manage risk using FTA combined with the Delphi method, from both qualitative and quantitative perspectives. A safety production committee is also established to carry out performance appraisal, further ensuring the correctness of risk management results, optimizing safety management business processes and improving risk management capabilities. The basic framework and risk information database of the railway passenger transport safety risk management information system are designed using Ajax, Web Services and SQL technologies. The system realizes functions for risk management, performance appraisal and data management, and provides an efficient and convenient information management platform for railway passenger safety managers.

  10. Development of a framework for resilience measurement: Suggestion of fuzzy Resilience Grade (RG) and fuzzy Resilience Early Warning Grade (REWG).

    PubMed

    Omidvar, Mohsen; Mazloumi, Adel; Mohammad Fam, Iraj; Nirumand, Fereshteh

    2017-01-01

    Resilience engineering (RE) can be an alternative to traditional risk assessment and management techniques for predicting and managing safety conditions in modern socio-technical organizations. While traditional risk management approaches are retrospective and emphasize error calculation and the computation of malfunction possibilities, resilience engineering seeks ways to improve capacity at all levels of an organization in order to build strong yet flexible processes. With the measurement of resilience potential being a concern in complex working systems, the aim of this study was to quantify resilience with the help of fuzzy sets and Multi-Criteria Decision-Making (MCDM) techniques. In this paper, we adopted the fuzzy analytic hierarchy process (FAHP) method to measure resilience in a gas refinery plant. A resilience assessment framework containing six indicators, each with its own sub-indicators, was constructed. Then, the fuzzy weights of the indicators and the sub-indicators were derived from pair-wise comparisons conducted by experts. The fuzzy evaluation vectors of the indicators and the sub-indicators were computed from the initial assessment data. Finally, the Comprehensive Resilience Index (CoRI), Resilience Grade (RG), and Resilience Early Warning Grade (REWG) were established. To demonstrate the applicability of the proposed method, an illustrative example in a gas refinery complex (an instance of a socio-technical system) was provided. The CoRI of the refinery ranked as "III". In addition, for the six main indicators, RG and REWG ranked as "III" and "NEWZ", respectively, except for C3, for which RG ranked as "II" and REWG ranked as "OEWZ". The results revealed the engineering practicability and usefulness of the proposed method in the resilience evaluation of socio-technical systems.
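    The FAHP aggregation step can be illustrated with triangular fuzzy numbers, where each indicator score and weight is a (low, mid, high) triple and the composite index is their weighted sum, defuzzified by the centroid. The weights and scores below are placeholders, not the refinery data, and the sketch omits the pair-wise comparison step.

      def tfn_mul(a, b):
          return tuple(x * y for x, y in zip(a, b))

      def tfn_add(a, b):
          return tuple(x + y for x, y in zip(a, b))

      def defuzzify(tfn):
          """Centroid of a triangular fuzzy number (l, m, u)."""
          return sum(tfn) / 3.0

      # Hypothetical fuzzy weights and indicator scores (l, m, u) for three indicators.
      weights = [(0.2, 0.3, 0.4), (0.3, 0.4, 0.5), (0.2, 0.3, 0.4)]
      scores  = [(2.0, 3.0, 4.0), (3.0, 4.0, 5.0), (1.0, 2.0, 3.0)]

      composite = (0.0, 0.0, 0.0)
      for w, s in zip(weights, scores):
          composite = tfn_add(composite, tfn_mul(w, s))

      print(composite, round(defuzzify(composite), 2))   # fuzzy index and its crisp value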

  11. 2001 Research Reports NASA/ASEE Summer Faculty Fellowship Program

    NASA Technical Reports Server (NTRS)

    2001-01-01

    This document is a collection of technical reports on research conducted by the participants in the 2001 NASA/ASEE Summer Faculty Fellowship Program at the Kennedy Space Center (KSC). Research areas are broad. Some of the topics addressed include: project management, space shuttle safety risks induced by human factor errors, body-wearable computers as a feasible delivery system for 'work authorization documents', gas leak detection using remote sensing technologies, a history of the Kennedy Space Center, and design concepts for collapsible cryogenic storage vessels.

  12. Analysis of the Federal Aviation Administration’s Host Computer Acquisition Process and Potential Application in Department of Defense Acquisitions

    DTIC Science & Technology

    1988-09-01

    defense programs lost far more to inefficient procedures than to fraud and dishonesty (President's Commission, 1986c:15). Based on the Commission...recommendations from current studies, lessons learned from a successful program, and DOD expert opinions to develop an acquisition management strategy that...established for the alternative(s) selected in the preceding phase. 5. In the concept demonstration/validation phase the technical risk and economic

  13. Management Needs for Computer Support.

    ERIC Educational Resources Information Center

    Irby, Alice J.

    University management has many and varied needs for effective computer services in support of their processing and information functions. The challenge for the computer center managers is to better understand these needs and assist in the development of effective and timely solutions. Management needs can range from accounting and payroll to…

  14. 75 FR 43579 - Privacy Act of 1974; Computer Matching Program Between the Office of Personnel Management and...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-07-26

    ... safeguards for disclosure of Social Security benefit information to OPM via direct computer link for the... OFFICE OF PERSONNEL MANAGEMENT Privacy Act of 1974; Computer Matching Program Between the Office of Personnel Management and Social Security Administration AGENCY: Office of Personnel Management...

  15. 78 FR 3474 - Privacy Act of 1974; Computer Matching Program Between the Office Of Personnel Management and...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-01-16

    ... Security benefit information to OPM via direct computer link for the administration of certain programs by... OFFICE OF PERSONNEL MANAGEMENT Privacy Act of 1974; Computer Matching Program Between the Office Of Personnel Management and Social Security Administration AGENCY: Office of Personnel Management...

  16. Computer Viruses and Related Threats: A Management Guide.

    ERIC Educational Resources Information Center

    Wack, John P.; Carnahan, Lisa J.

    This document contains guidance for managing the threats of computer viruses, Trojan horses, network worms, etc. and related software along with unauthorized use. It is geared towards managers of end-user groups, managers dealing with multi-user systems, personal computers, and networks. The guidance is general and addresses the vulnerabilities…

  17. A characterization of workflow management systems for extreme-scale applications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ferreira da Silva, Rafael; Filgueira, Rosa; Pietri, Ilia

    The automation of the execution of computational tasks is at the heart of improving scientific productivity. Over the last years, scientific workflows have been established as an important abstraction that captures the data processing and computation of large and complex scientific applications. By allowing scientists to model and express entire data processing steps and their dependencies, workflow management systems relieve scientists from the details of an application and manage its execution on a computational infrastructure. As the resource requirements of today’s computational and data science applications that process vast amounts of data keep increasing, there is a compelling case for a new generation of advances in high-performance computing, commonly termed extreme-scale computing, which will bring forth multiple challenges for the design of workflow applications and management systems. This paper presents a novel characterization of workflow management systems using features commonly associated with extreme-scale computing applications. We classify 15 popular workflow management systems in terms of workflow execution models, heterogeneous computing environments, and data access methods. Finally, the paper surveys workflow applications and identifies gaps for future research on the road to extreme-scale workflows and management systems.

  18. A characterization of workflow management systems for extreme-scale applications

    DOE PAGES

    Ferreira da Silva, Rafael; Filgueira, Rosa; Pietri, Ilia; ...

    2017-02-16

    The automation of the execution of computational tasks is at the heart of improving scientific productivity. Over the last years, scientific workflows have been established as an important abstraction that captures the data processing and computation of large and complex scientific applications. By allowing scientists to model and express entire data processing steps and their dependencies, workflow management systems relieve scientists from the details of an application and manage its execution on a computational infrastructure. As the resource requirements of today’s computational and data science applications that process vast amounts of data keep increasing, there is a compelling case for a new generation of advances in high-performance computing, commonly termed extreme-scale computing, which will bring forth multiple challenges for the design of workflow applications and management systems. This paper presents a novel characterization of workflow management systems using features commonly associated with extreme-scale computing applications. We classify 15 popular workflow management systems in terms of workflow execution models, heterogeneous computing environments, and data access methods. Finally, the paper surveys workflow applications and identifies gaps for future research on the road to extreme-scale workflows and management systems.

  19. Towards an Autonomic Cluster Management System (ACMS) with Reflex Autonomicity

    NASA Technical Reports Server (NTRS)

    Truszkowski, Walt; Hinchey, Mike; Sterritt, Roy

    2005-01-01

    Cluster computing, whereby a large number of simple processors or nodes are combined together to apparently function as a single powerful computer, has emerged as a research area in its own right. The approach offers a relatively inexpensive means of providing a fault-tolerant environment and achieving significant computational capabilities for high-performance computing applications. However, the task of manually managing and configuring a cluster quickly becomes daunting as the cluster grows in size. Autonomic computing, with its vision to provide self-management, can potentially solve many of the problems inherent in cluster management. We describe the development of a prototype Autonomic Cluster Management System (ACMS) that exploits autonomic properties in automating cluster management and its evolution to include reflex reactions via pulse monitoring.

  20. Application of the malaria management model to the analysis of costs and benefits of DDT versus non-DDT malaria control.

    PubMed

    Pedercini, Matteo; Movilla Blanco, Santiago; Kopainsky, Birgit

    2011-01-01

    DDT is considered to be the most cost-effective insecticide for combating malaria. However, it is also the most environmentally persistent and can pose risks to human health when sprayed indoors. Therefore, the use of DDT for vector control remains controversial. In this paper we develop a computer-based simulation model to assess some of the costs and benefits of the continued use of DDT for Indoor Residual Spraying (IRS) versus its rapid phase out. We apply the prototype model to the aggregated sub Saharan African region. For putting the question about the continued use of DDT for IRS versus its rapid phase out into perspective we calculate the same costs and benefits for alternative combinations of integrated vector management interventions. Our simulation results confirm that the current mix of integrated vector management interventions with DDT as the main insecticide is cheaper than the same mix with alternative insecticides when only direct costs are considered. However, combinations with a stronger focus on insecticide-treated bed nets and environmental management show higher levels of cost-effectiveness than interventions with a focus on IRS. Thus, this focus would also allow phasing out DDT in a cost-effective manner. Although a rapid phase out of DDT for IRS is the most expensive of the tested intervention combinations it can have important economic benefits in addition to health and environmental impacts that are difficult to assess in monetary terms. Those economic benefits captured by the model include the avoided risk of losses in agricultural exports. The prototype simulation model illustrates how a computer-based scenario analysis tool can inform debates on malaria control policies in general and on the continued use of DDT for IRS versus its rapid phase out in specific. Simulation models create systematic mechanisms for analyzing alternative interventions and making informed trade offs.

  1. Application of the Malaria Management Model to the Analysis of Costs and Benefits of DDT versus Non-DDT Malaria Control

    PubMed Central

    Pedercini, Matteo; Movilla Blanco, Santiago; Kopainsky, Birgit

    2011-01-01

    Introduction DDT is considered to be the most cost-effective insecticide for combating malaria. However, it is also the most environmentally persistent and can pose risks to human health when sprayed indoors. Therefore, the use of DDT for vector control remains controversial. Methods In this paper we develop a computer-based simulation model to assess some of the costs and benefits of the continued use of DDT for Indoor Residual Spraying (IRS) versus its rapid phase out. We apply the prototype model to the aggregated sub Saharan African region. For putting the question about the continued use of DDT for IRS versus its rapid phase out into perspective we calculate the same costs and benefits for alternative combinations of integrated vector management interventions. Results Our simulation results confirm that the current mix of integrated vector management interventions with DDT as the main insecticide is cheaper than the same mix with alternative insecticides when only direct costs are considered. However, combinations with a stronger focus on insecticide-treated bed nets and environmental management show higher levels of cost-effectiveness than interventions with a focus on IRS. Thus, this focus would also allow phasing out DDT in a cost-effective manner. Although a rapid phase out of DDT for IRS is the most expensive of the tested intervention combinations it can have important economic benefits in addition to health and environmental impacts that are difficult to assess in monetary terms. Those economic benefits captured by the model include the avoided risk of losses in agricultural exports. Conclusions The prototype simulation model illustrates how a computer-based scenario analysis tool can inform debates on malaria control policies in general and on the continued use of DDT for IRS versus its rapid phase out in specific. Simulation models create systematic mechanisms for analyzing alternative interventions and making informed trade offs. PMID:22140467

  2. Universal Skills and Competencies for Geoscientists

    NASA Astrophysics Data System (ADS)

    Mosher, S.

    2015-12-01

    Geoscience students worldwide face a changing future workforce, but all geoscience work has universal cross-cutting skills and competencies that are critical for success. A recent Geoscience Employers Workshop, and employers' input on the "Future of Undergraduate Geoscience Education" survey, identified three major areas. Geoscience work requires spatial and temporal (3D & 4D) thinking, understanding that the Earth is a system of interacting parts and processes, and geoscience reasoning and synthesis. Thus, students need to be able to solve problems in the context of an open and dynamic system, recognizing that most geoscience problems have no clear, unambiguous answers. Students must learn to manage uncertainty, work by analogy and inference, and make predictions with limited data. Being able to visualize and solve problems in 3D, incorporate the element of time, and understand scale is critical. Additionally, students must learn how to tackle problems using real data, including understanding the problem's context, identifying appropriate questions to ask, and determining how to proceed. Geoscience work requires integration of quantitative, technical, and computational skills and the ability to be intellectually flexible in applying skills to new situations. Students need experience using high-level math and computational methods to solve geoscience problems, including probability and statistics to understand risk. Increasingly important is the ability to use "Big Data", GIS, visualization and modeling tools. Employers also agree that a strong field component in geoscience education is important. Success as a geoscientist also requires non-technical skills. Because most work environments involve working on projects with a diverse team, students need experience with project management in team settings, including goal setting, conflict resolution, time management and being both leader and follower. Written and verbal scientific communication, as well as public speaking and listening skills, are important. Success also depends on interpersonal skills and professionalism, including business acumen, risk management, ethical conduct, and leadership. A global perspective is increasingly important, including cultural literacy and understanding societal relevance.

  3. An Extensible Information Grid for Risk Management

    NASA Technical Reports Server (NTRS)

    Maluf, David A.; Bell, David G.

    2003-01-01

    This paper describes recent work on developing an extensible information grid for risk management at NASA - a RISK INFORMATION GRID. This grid is being developed by integrating information grid technology with risk management processes for a variety of risk related applications. To date, RISK GRID applications are being developed for three main NASA processes: risk management - a closed-loop iterative process for explicit risk management, program/project management - a proactive process that includes risk management, and mishap management - a feedback loop for learning from historical risks that escaped other processes. This is enabled through an architecture involving an extensible database, structuring information with XML, schemaless mapping of XML, and secure server-mediated communication using standard protocols.
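    The XML-structured risk record idea can be pictured with a minimal document and schemaless handling via a generic element tree. The tag names and record content below are invented for illustration and are not the RISK GRID schema.

      import xml.etree.ElementTree as ET

      risk_xml = """
      <risk id="R-042">
        <title>Late delivery of star tracker</title>
        <likelihood>0.3</likelihood>
        <consequence>4</consequence>
        <status>open</status>
        <mitigation>Qualify a second supplier</mitigation>
      </risk>
      """

      # Schemaless mapping: walk the tree and build a flat dictionary without
      # requiring a fixed schema up front.
      root = ET.fromstring(risk_xml)
      record = {"id": root.attrib["id"], **{child.tag: child.text for child in root}}
      print(record)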

  4. Assessing and Managing Multiple Risks in a Changing World ...

    EPA Pesticide Factsheets

    Roskilde University hosted a November 2015 workshop on “Environmental Risk – Assessing and Managing Multiple Risks in a Changing World”. Thirty attendees from 9 countries developed consensus recommendations regarding: implementation of a common currency (ecosystem services) for holistic environmental risk assessment and management; improvements to risk assessment and management in a complex, human-modified, and changing world; appropriate development of protection goals in a 2-stage process involving both universal and site-, region-, or problem-specific protection goals; addressing societal issues; risk management information needs; conducting risk assessment of risk management; and development of adaptive and flexible regulatory systems. We encourage both cross- and inter-disciplinary approaches to address 10 recommendations: 1) adopt ecosystem services as a common currency for risk assessment and management; 2) consider cumulative stressors (chemical and non-chemical) and determine which dominate to best manage and restore ecosystem services; 3) fully integrate risk managers and communities of interest into the risk assessment process; 4) fully integrate risk assessors and communities of interest into the risk management process; 5) consider socio-economics and increase transparency in both risk assessment and risk management; 6) recognize the ethical rights of humans and ecosystems to an adequate level of protection; 7) determine relevant reference con

  5. Operational Implementation of a Pc Uncertainty Construct for Conjunction Assessment Risk Analysis

    NASA Technical Reports Server (NTRS)

    Newman, Lauri K.; Hejduk, Matthew D.; Johnson, Lauren C.

    2016-01-01

    Earlier this year the NASA Conjunction Assessment and Risk Analysis (CARA) project presented the theoretical and algorithmic aspects of a method to include the uncertainties in the calculation inputs when computing the probability of collision (Pc) between two space objects, principally uncertainties in the covariances and the hard-body radius. Rather than a single Pc value, the output of this calculation approach is an entire probability density function representing the range of possible Pc values given the uncertainties in the inputs, bringing conjunction assessment (CA) risk analysis methodologies more in line with modern risk management theory. The present study provides results from the exercise of this method against an extended dataset of satellite conjunctions in order to determine the effect of its use on the evaluation of CA event risk posture. The effects are found to be considerable: a good number of events are downgraded from or upgraded to a serious risk designation on the basis of consideration of the Pc uncertainty. The findings counsel the integration of the developed methods into NASA CA operations.
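    The idea of producing a distribution of Pc rather than a single value can be illustrated by Monte Carlo sampling of the uncertain inputs and recomputing a simplified Pc for each draw. The small-target 2-D approximation and the input uncertainties below are schematic stand-ins, not the CARA algorithms.

      import numpy as np

      rng = np.random.default_rng(1)

      def pc_small_target(miss_m, sigma_m, hbr_m):
          """Small-target 2-D approximation: combined-object area times the Gaussian
          density of the relative position evaluated at the nominal miss distance."""
          return (hbr_m**2 / (2.0 * sigma_m**2)) * np.exp(-miss_m**2 / (2.0 * sigma_m**2))

      # Nominal inputs plus assumed (illustrative) uncertainties.
      miss = 500.0                                                # nominal miss distance [m]
      sigma_samples = rng.lognormal(np.log(300.0), 0.3, 20000)    # uncertain covariance scale
      hbr_samples = rng.uniform(5.0, 20.0, 20000)                 # uncertain hard-body radius

      pc_samples = pc_small_target(miss, sigma_samples, hbr_samples)
      print(f"median Pc = {np.median(pc_samples):.2e}, "
            f"95th percentile Pc = {np.percentile(pc_samples, 95):.2e}")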

  6. Behavioral and psychophysiological responses to job demands and association with musculoskeletal symptoms in computer work.

    PubMed

    Griffiths, Karin Lindgren; Mackey, Martin G; Adamson, Barbara J

    2011-12-01

    The purpose of this study was to identify and compare individual behavioral and psychophysiological responses to workload demands and stressors associated with the reporting of musculoskeletal symptoms with computer work. Evidence is growing that the prevalence of musculoskeletal symptoms increases with longer hours of computer work and exposure to psychosocial stressors such as high workloads and unrealistic deadlines. Workstyle, or how an individual worker behaves in response to such work demands, may also be an important factor associated with musculoskeletal symptoms in computer operators. Approximately 8,000 employees of the Australian Public Service were invited to complete an on-line survey if they worked with a computer for 15 or more hours per week. The survey was a composite of three questionnaires: the ASSET to measure perceived organizational stressors, Nordic Musculoskeletal Questionnaire to measure reported prevalence of musculoskeletal symptoms and additional questions to measure individual work behaviors and responses. 934 completed surveys were accepted for analyses. Logistic regression was used to identify significant behavioral and work response predictors of musculoskeletal symptoms. Reporting of heightened muscle tension in response to workload pressure was more strongly associated, than other physical behavioral factors, with musculoskeletal symptoms for all body areas, particularly the neck (OR = 2.50, 95% CI: 2.09-2.99). Individual workstyles in response to workload demands and stressors, including working with heightened muscle tension and mental fatigue, were significantly associated with musculoskeletal symptoms. Future risk management strategies should have a greater focus on the identification and management of those organizational factors that are likely to encourage and exacerbate adverse workstyles.

  7. Differentiation of Lung Cancer, Empyema, and Abscess Through the Investigation of a Dry Cough.

    PubMed

    Urso, Brittany; Michaels, Scott

    2016-11-24

    An acute dry cough results commonly from bronchitis or pneumonia. When a patient presents with signs of infection, respiratory crackles, and a positive chest radiograph, the diagnosis of pneumonia is more common. Antibiotic failure in a patient being treated for community-acquired pneumonia requires further investigation through chest computed tomography. If a lung mass is found on chest computed tomography, lung empyema, abscess, and cancer need to be included on the differential and managed aggressively. This report describes a 55-year-old Caucasian male, with a history of obesity, recovered alcoholism, hypercholesterolemia, and hypertension, presenting with an acute dry cough in the primary care setting. The patient developed signs of infection and was found to have a lung mass on chest computed tomography. Treatment with piperacillin-tazobactam and chest tube placement did not resolve the mass, so treatment with thoracotomy and lobectomy was required. It was determined through surgical investigation that the patient, despite having no risk factors, developed a lung abscess. Lung abscesses rarely form in healthy middle-aged individuals making it an unlikely cause of the patient's presenting symptom, dry cough. The patient cleared his infection with proper management and only suffered minor complications of mild pneumoperitoneum and pneumothorax during his hospitalization.

  8. Extravasation Risk Using Ultrasound-guided Peripheral Intravenous Catheters for Computed Tomography Contrast Administration.

    PubMed

    Rupp, Jordan D; Ferre, Robinson M; Boyd, Jeremy S; Dearing, Elizabeth; McNaughton, Candace D; Liu, Dandan; Jarrell, Kelli L; McWade, Conor M; Self, Wesley H

    2016-08-01

    Ultrasound-guided intravenous catheter (USGIV) insertion is increasingly being used for administration of intravenous (IV) contrast for computed tomography (CT) scans. The goal of this investigation was to evaluate the risk of contrast extravasation among patients receiving contrast through USGIV catheters. A retrospective observational study of adult patients who underwent a contrast-enhanced CT scan at a tertiary care emergency department during a recent 64-month period was conducted. The unadjusted prevalence of contrast extravasation was compared between patients with an USGIV and those with a standard peripheral IV inserted without ultrasound. Then, a two-stage sampling design was used to select a subset of the population for a multivariable logistic regression model evaluating USGIVs as a risk factor for extravasation while adjusting for potential confounders. In total, 40,143 patients underwent a contrasted CT scan, including 364 (0.9%) who had contrast administered through an USGIV. Unadjusted prevalence of extravasation was 3.6% for contrast administration through USGIVs and 0.3% for standard IVs (relative risk = 13.9, 95% confidence interval [CI] = 7.9 to 24.6). After potential confounders were adjusted for, CT contrast administered through USGIVs was associated with extravasation (adjusted odds ratio = 8.6, 95% CI = 4.6 to 16.2). No patients required surgical management for contrast extravasation; one patient in the standard IV group was admitted for observation due to extravasation. Patients who received contrast for a CT scan through an USGIV had a higher risk of extravasation than those who received contrast through a standard peripheral IV. Clinicians should consider this extravasation risk when weighing the risks and benefits of a contrast-enhanced CT scan in a patient with USGIV vascular access. © 2016 by the Society for Academic Emergency Medicine.

  9. Extravasation Risk Using Ultrasound Guided Peripheral Intravenous Catheters for Computed Tomography Contrast Administration

    PubMed Central

    Rupp, Jordan D.; Ferre, Robinson M.; Boyd, Jeremy S.; Dearing, Elizabeth; McNaughton, Candace D.; Liu, Dandan; Jarrell, Kelli L.; McWade, Conor M.; Self, Wesley H.

    2016-01-01

    Objective Ultrasound guided intravenous catheter (USGIV) insertion is increasingly being used for administration of intravenous contrast for computed tomography (CT) scans. The goal of this investigation was to evaluate the risk of contrast extravasation among patients receiving contrast through USGIV catheters. Methods A retrospective observational study of adult patients who underwent a contrast-enhanced CT scan at a tertiary-care emergency department during a recent 64-month period was conducted. The unadjusted prevalence of contrast extravasation was compared between patients with an USGIV and those with a standard peripheral IV inserted without ultrasound. Then, a two-stage sampling design was used to select a subset of the population for a multivariable logistic regression model evaluating USGIVs as a risk factor for extravasation while adjusting for potential confounders. Results In total, 40,143 patients underwent a contrasted CT scan, including 364 (0.9%) who had contrast administered through an USGIV. Unadjusted prevalence of extravasation was 3.6% for contrast administration through USGIVs and 0.3% for standard IVs (relative risk: 13.9, 95% CI: 7.7 to 24.6). After adjustment for potential confounders, CT contrast administered through USGIVs was associated with extravasation (adjusted odds ratio: 8.6; 95% CI: 4.6, 16.2). No patients required surgical management for contrast extravasation; one patient in the standard IV group was admitted for observation due to extravasation. Conclusions Patients who received contrast for a CT scan through an USGIV had a higher risk of extravasation than those who received contrast through a standard peripheral IV. Clinicians should consider this extravasation risk when weighing the risks and benefits of a contrast-enhanced CT scan in a patient with USGIV vascular access. PMID:27151898

  10. Uncertainty in surface water flood risk modelling

    NASA Astrophysics Data System (ADS)

    Butler, J. B.; Martin, D. N.; Roberts, E.; Domuah, R.

    2009-04-01

    Two thirds of the flooding that occurred in the UK during summer 2007 was as a result of surface water (otherwise known as ‘pluvial') rather than river or coastal flooding. In response, the Environment Agency and Interim Pitt Reviews have highlighted the need for surface water risk mapping and warning tools to identify, and prepare for, flooding induced by heavy rainfall events. This need is compounded by the likely increase in rainfall intensities due to climate change. The Association of British Insurers has called for the Environment Agency to commission nationwide flood risk maps showing the relative risk of flooding from all sources. At the wider European scale, the recently-published EC Directive on the assessment and management of flood risks will require Member States to evaluate, map and model flood risk from a variety of sources. As such, there is now a clear and immediate requirement for the development of techniques for assessing and managing surface water flood risk across large areas. This paper describes an approach for integrating rainfall, drainage network and high-resolution topographic data using Flowroute™, a high-resolution flood mapping and modelling platform, to produce deterministic surface water flood risk maps. Information is provided from UK case studies to enable assessment and validation of modelled results using historical flood information and insurance claims data. Flowroute was co-developed with flood scientists at Cambridge University specifically to simulate river dynamics and floodplain inundation in complex, congested urban areas in a highly computationally efficient manner. It utilises high-resolution topographic information to route flows around individual buildings so as to enable the prediction of flood depths, extents, durations and velocities. As such, the model forms an ideal platform for the development of surface water flood risk modelling and mapping capabilities. The 2-dimensional component of Flowroute employs uniform flow formulae (Manning's Equation) to direct flow over the model domain, sourcing water from the channel or sea so as to provide a detailed representation of river and coastal flood risk. The initial development step was to include spatially-distributed rainfall as a new source term within the model domain. This required optimisation to improve computational efficiency, given the ubiquity of ‘wet' cells early on in the simulation. Collaboration with UK water companies has provided detailed drainage information, and from this a simplified representation of the drainage system has been included in the model via the inclusion of sinks and sources of water from the drainage network. This approach has clear advantages relative to a fully coupled method both in terms of reduced input data requirements and computational overhead. Further, given the difficulties associated with obtaining drainage information over large areas, tests were conducted to evaluate uncertainties associated with excluding drainage information and the impact that this has upon flood model predictions. This information can be used, for example, to inform insurance underwriting strategies and loss estimation as well as for emergency response and planning purposes. The Flowroute surface-water flood risk platform enables efficient mapping of areas sensitive to flooding from high-intensity rainfall events due to topography and drainage infrastructure. 
As such, the technology has widespread potential for use as a risk mapping tool by the UK Environment Agency, European Member States, water authorities, local governments and the insurance industry. Keywords: Surface water flooding, Model Uncertainty, Insurance Underwriting, Flood inundation modelling, Risk mapping.
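
    For readers unfamiliar with the uniform flow formulation mentioned above, the short sketch below shows how Manning's equation turns water depth, roughness and slope into a velocity and a cell-outflow discharge. It is purely illustrative: the function names and parameter values are assumptions for this example and are not taken from Flowroute.

        # Minimal sketch (not Flowroute code): Manning's equation for uniform flow,
        # v = (1/n) * R^(2/3) * S^(1/2), applied to a single shallow grid cell.
        # All names and numbers are illustrative assumptions.

        def manning_velocity(n, hydraulic_radius_m, slope):
            """Mean flow velocity in m/s from Manning's equation."""
            return (1.0 / n) * hydraulic_radius_m ** (2.0 / 3.0) * slope ** 0.5

        def cell_outflow(depth_m, cell_width_m, n, slope):
            """Discharge in m^3/s leaving a shallow cell, approximating R by the water depth."""
            velocity = manning_velocity(n, depth_m, slope)
            return velocity * depth_m * cell_width_m  # Q = v * A

        if __name__ == "__main__":
            # Illustrative urban surface flow: roughness 0.03, 10 cm depth, 5 m cell, 1% slope.
            q = cell_outflow(depth_m=0.10, cell_width_m=5.0, n=0.03, slope=0.01)
            print(f"Cell outflow: {q:.3f} m^3/s")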

  11. Assessing and managing multiple risks in a changing world ...

    EPA Pesticide Factsheets

    Roskilde University (Denmark) hosted a November 2015 workshop, Environmental Risk—Assessing and Managing Multiple Risks in a Changing World. This Focus article presents the consensus recommendations of 30 attendees from 9 countries regarding implementation of a common currency (ecosystem services) for holistic environmental risk assessment and management; improvements to risk assessment and management in a complex, human-modified, and changing world; appropriate development of protection goals in a 2-stage process; dealing with societal issues; risk-management information needs; conducting risk assessment of risk management; and development of adaptive and flexible regulatory systems. The authors encourage both cross-disciplinary and interdisciplinary approaches to address their 10 recommendations: 1) adopt ecosystem services as a common currency for risk assessment and management; 2) consider cumulative stressors (chemical and nonchemical) and determine which dominate to best manage and restore ecosystem services; 3) fully integrate risk managers and communities of interest into the risk-assessment process; 4) fully integrate risk assessors and communities of interest into the risk-management process; 5) consider socioeconomics and increased transparency in both risk assessment and risk management; 6) recognize the ethical rights of humans and ecosystems to an adequate level of protection; 7) determine relevant reference conditions and the proper ecological c

  12. High-throughput landslide modelling using computational grids

    NASA Astrophysics Data System (ADS)

    Wallace, M.; Metson, S.; Holcombe, L.; Anderson, M.; Newbold, D.; Brook, N.

    2012-04-01

    Landslides are an increasing problem in developing countries. Multiple landslides can be triggered by heavy rainfall resulting in loss of life, homes and critical infrastructure. Through computer simulation of individual slopes it is possible to predict the causes, timing and magnitude of landslides and estimate the potential physical impact. Geographical scientists at the University of Bristol have developed software that integrates a physically-based slope hydrology and stability model (CHASM) with an econometric model (QUESTA) in order to predict landslide risk over time. These models allow multiple scenarios to be evaluated for each slope, accounting for data uncertainties, different engineering interventions, risk management approaches and rainfall patterns. Individual scenarios can be computationally intensive; however, each scenario is independent and so multiple scenarios can be executed in parallel. As more simulations are carried out, the overhead involved in managing input and output data becomes significant. This is a greater problem if multiple slopes are considered concurrently, as is required both for landslide research and for effective disaster planning at national levels. There are two critical factors in this context: generated data volumes can be in the order of tens of terabytes, and greater numbers of simulations result in long total runtimes. Users of such models, in both the research community and in developing countries, need to develop a means for handling the generation and submission of landslide modelling experiments, and the storage and analysis of the resulting datasets. Additionally, governments in developing countries typically lack the necessary computing resources and infrastructure. Consequently, knowledge that could be gained by aggregating simulation results from many different scenarios across many different slopes remains hidden within the data. To address these data and workload management issues, University of Bristol particle physicists and geographical scientists are collaborating to develop methods for providing simple and effective access to landslide models and associated simulation data. Particle physicists have valuable experience in dealing with data complexity and management due to the scale of data generated by particle accelerators such as the Large Hadron Collider (LHC). The LHC generates tens of petabytes of data every year which is stored and analysed using the Worldwide LHC Computing Grid (WLCG). Tools and concepts from the WLCG are being used to drive the development of a Software-as-a-Service (SaaS) platform to provide access to hosted landslide simulation software and data. It contains advanced data management features and allows landslide simulations to be run on the WLCG, dramatically reducing simulation runtimes by parallel execution. The simulations are accessed using a web page through which users can enter and browse input data, submit jobs and visualise results. Replication of the data ensures a local copy can be accessed should a connection to the platform be unavailable. The platform does not know the details of the simulation software it runs, so it is possible to use it to run alternative models at similar scales. This creates the opportunity for activities such as model sensitivity analysis and performance comparison at scales that are impractical using standalone software.
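
    Because each scenario is independent, the scheduling problem reduces to farming out embarrassingly parallel jobs and aggregating their outputs. The sketch below illustrates that idea locally with Python's process pool; the run_scenario stub is hypothetical and stands in for a CHASM/QUESTA run, whereas the platform described above submits equivalent jobs to the WLCG.

        # Minimal sketch: independent landslide scenarios executed in parallel.
        # run_scenario() is a hypothetical stand-in for a CHASM/QUESTA simulation.
        from concurrent.futures import ProcessPoolExecutor
        import itertools

        def run_scenario(slope_id, rainfall_mm_per_day, drainage_installed):
            """Placeholder slope-stability run returning a notional factor of safety."""
            base = 1.2 + (0.1 if drainage_installed else 0.0)
            return {"slope": slope_id, "rain": rainfall_mm_per_day,
                    "drainage": drainage_installed,
                    "factor_of_safety": base - 0.002 * rainfall_mm_per_day}

        if __name__ == "__main__":
            # Scenario set = slopes x rainfall patterns x engineering interventions.
            scenarios = list(itertools.product(range(3), [50, 100, 150], [False, True]))
            with ProcessPoolExecutor() as pool:
                results = list(pool.map(run_scenario, *zip(*scenarios)))
            unstable = [r for r in results if r["factor_of_safety"] < 1.0]
            print(f"{len(unstable)} of {len(results)} scenarios predict instability")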

  13. Risk Management.

    ERIC Educational Resources Information Center

    Randal, L. Nathan

    This chapter of "Principles of School Business Management" presents an overview of risk management for school districts. The chapter first discusses four fundamental elements of risk management: (1) identifying and measuring risks; (2) reducing or eliminating risks; (3) transferring unassumable risks; and (4) assuming remaining risks.…

  14. Autonomic Cluster Management System (ACMS): A Demonstration of Autonomic Principles at Work

    NASA Technical Reports Server (NTRS)

    Baldassari, James D.; Kopec, Christopher L.; Leshay, Eric S.; Truszkowski, Walt; Finkel, David

    2005-01-01

    Cluster computing, whereby a large number of simple processors or nodes are combined together to apparently function as a single powerful computer, has emerged as a research area in its own right. The approach offers a relatively inexpensive means of achieving significant computational capabilities for high-performance computing applications, while simultaneously affording the ability to increase that capability simply by adding more (inexpensive) processors. However, the task of manually managing and configuring a cluster quickly becomes impossible as the cluster grows in size. Autonomic computing is a relatively new approach to managing complex systems that can potentially solve many of the problems inherent in cluster management. We describe the development of a prototype Autonomic Cluster Management System (ACMS) that exploits autonomic properties in automating cluster management.

  15. Tools for Analyzing Computing Resource Management Strategies and Algorithms for SDR Clouds

    NASA Astrophysics Data System (ADS)

    Marojevic, Vuk; Gomez-Miguelez, Ismael; Gelonch, Antoni

    2012-09-01

    Software defined radio (SDR) clouds centralize the computing resources of base stations. The computing resource pool is shared between radio operators and dynamically loads and unloads digital signal processing chains for providing wireless communications services on demand. In particular, each new user session request requires the allocation of computing resources for executing the corresponding SDR transceivers. The huge amount of computing resources of SDR cloud data centers and the numerous session requests at certain hours of a day require an efficient computing resource management. We propose a hierarchical approach, where the data center is divided into clusters that are managed in a distributed way. This paper presents a set of computing resource management tools for analyzing computing resource management strategies and algorithms for SDR clouds. We use the tools for evaluating different strategies and algorithms. The results show that more sophisticated algorithms can achieve higher resource occupations and that a tradeoff exists between cluster size and algorithm complexity.
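
    The hierarchical idea described above can be caricatured as clusters that admit transceiver chains until their capacity is exhausted. The sketch below is an illustrative toy, not the authors' simulator: cluster sizes, session loads and the first-fit rule are all assumptions chosen to show how resource occupation and blocking can be measured.

        # Minimal sketch (illustrative toy, not the authors' simulator): first-fit
        # allocation of SDR transceiver chains to clusters of computing resources.
        import random

        def allocate(session_load, clusters):
            """Place a session on the first cluster with spare capacity, or return None."""
            for cluster in clusters:
                if cluster["used"] + session_load <= cluster["capacity"]:
                    cluster["used"] += session_load
                    return cluster["name"]
            return None  # request blocked: no cluster can host the chain

        if __name__ == "__main__":
            random.seed(1)
            clusters = [{"name": f"cluster-{i}", "capacity": 100.0, "used": 0.0} for i in range(4)]
            blocked = 0
            for _ in range(60):                   # session requests during a busy hour
                load = random.uniform(3.0, 12.0)  # processing demand of one SDR chain
                if allocate(load, clusters) is None:
                    blocked += 1
            occupation = sum(c["used"] for c in clusters) / sum(c["capacity"] for c in clusters)
            print(f"Resource occupation: {occupation:.0%}, blocked sessions: {blocked}")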

  16. Communicating Risk to Program Managers

    NASA Technical Reports Server (NTRS)

    Shivers, C. Herbert

    2005-01-01

    Program Managers (PM) can protect program resources and improve chances of success by anticipating, understanding and managing risks. Understanding the range of potential risks helps one to avoid or manage the risks. A PM must choose which risks to accept to reduce fire fighting, must meet the expectations of stakeholders consistently, and avoid falling into costly "black holes" that may open. A good risk management process provides the PM more confidence to seize opportunities, save money, meet schedule, and even improve relationships with people important to the program. Evidence of managing risk and sound internal controls can mean better support from superiors for the program by building trust and a reputation for being on top of issues. Risk managers have an obligation to provide the PM with the best information possible to allow the benefits to be realized (Small Business Consortium, 2004). The Institute for Chartered Accountants in England and Wales sees very important benefits for companies in providing better information about what they do to assess and manage key business risks. Such information will: a) provide practical forward-looking information; b) reduce the cost of capital; c) encourage better risk management; and d) improve accountability for stewardship, investor protection and the usefulness of financial reporting. We are particularly convinced that enhanced risk reporting will help listed companies obtain capital at the lowest possible cost (The Institute of Chartered Accountants in England & Wales, June 2002). Risk managers can take a significant role in quantifying the success of their department and communicating those figures to executive (program) management levels while pushing for a broader risk management role. Overall, risk managers must show that risk management work matters in the most crucial place, the bottom line, as they prove risk management can be a profit center (Sullivan, 2004).

  17. Implementation of Risk Management in NASA's CEV Project- Ensuring Mission Success

    NASA Astrophysics Data System (ADS)

    Perera, Jeevan; Holsomback, Jerry D.

    2005-12-01

    Most project managers know that Risk Management (RM) is essential to good project management. At NASA, standards and procedures to manage risk through a tiered approach have been developed - from the global agency-wide requirements down to a program or project's implementation. The basic methodology for NASA's risk management strategy includes processes to identify, analyze, plan, track, control, communicate and document risks. The identification, characterization, mitigation plan, and mitigation responsibilities associated with specific risks are documented to help communicate, manage, and effectuate appropriate closure. This approach helps to ensure more consistent documentation and assessment and provides a means of archiving lessons learned for future identification or mitigation activities. A new risk database and management tool was developed by NASA in 2002 and since has been used successfully to communicate, document and manage a number of diverse risks for the International Space Station, Space Shuttle, and several other NASA projects and programs including at the Johnson Space Center. Organizations use this database application to effectively manage and track each risk and gain insight into impacts from other organizations' viewpoints to develop integrated solutions. Schedule, cost, technical and safety issues are tracked in detail through this system. Risks are tagged within the system to ensure proper review, coordination and management at the necessary management level. The database is intended as a day-to-day tool for organizations to manage their risks and elevate those issues that need coordination from above. Each risk is assigned to a managing organization and a specific risk owner who generates mitigation plans as appropriate. In essence, the risk owner is responsible for shepherding the risk through closure. The individual who identifies a new risk does not necessarily get assigned as the risk owner. Whoever is in the best position to effectuate comprehensive closure is assigned as the risk owner. Each mitigation plan includes the specific tasks that will be conducted to either decrease the likelihood of the risk occurring and/or lessen the severity of the consequences if they do occur. As each mitigation task is completed, the responsible managing organization records the completion of the task in the risk database and then re-scores the risk considering the task's results. By keeping scores updated, a managing organization's current top risks and risk posture can be readily identified, including the status of any risk in the system. A number of metrics measure risk process trends from data contained in the database. This allows for trend analysis to further identify improvements to the process and assist in the management of all risks. The metrics also scrutinize both the effectiveness of and compliance with risk management requirements. The risk database is an evolving tool and will be continuously improved with capabilities requested by the NASA project community. This paper presents the basic foundations of risk management, the elements necessary for effective risk management, and the capabilities of this new risk database and how it is implemented to support NASA's risk management needs.
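
    A minimal sketch of the kind of record such a database manages is shown below. The schema, field names and 5x5 scoring scale are hypothetical, not NASA's actual tool; the sketch only illustrates the workflow of assigning a risk owner, completing mitigation tasks and re-scoring the risk afterwards.

        # Minimal sketch of a hypothetical risk record (not NASA's schema): a 5x5
        # likelihood/consequence score, a named risk owner, and mitigation tasks
        # whose completion triggers a re-score.
        from dataclasses import dataclass, field

        @dataclass
        class MitigationTask:
            description: str
            done: bool = False

        @dataclass
        class Risk:
            title: str
            owner: str                  # organization or person shepherding the risk to closure
            likelihood: int             # 1 (rare) .. 5 (near certain)
            consequence: int            # 1 (negligible) .. 5 (catastrophic)
            tasks: list = field(default_factory=list)

            def score(self):
                return self.likelihood * self.consequence

            def complete_task(self, index, new_likelihood, new_consequence):
                """Record a completed mitigation task and re-score the risk."""
                self.tasks[index].done = True
                self.likelihood, self.consequence = new_likelihood, new_consequence

        if __name__ == "__main__":
            r = Risk("Late delivery of avionics unit", "Avionics team",
                     likelihood=4, consequence=3,
                     tasks=[MitigationTask("Qualify a second supplier")])
            print("Score before mitigation:", r.score())
            r.complete_task(0, new_likelihood=2, new_consequence=3)
            print("Score after mitigation:", r.score())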

  18. Risk Management Structured for Today's Environment

    NASA Technical Reports Server (NTRS)

    Greenfield, Michael A.

    1998-01-01

    In NPG (NASA Procedures and Guidelines) 7120.5A, we define risk management as "an organized, systematic decision-making process that efficiently identifies, analyzes, plans, tracks, controls, and communicates and documents risk in order to increase the likelihood of achieving program/project goals." Effective risk management depends upon a thorough understanding of the concept of risk, the principles of risk management and the formation of a disciplined risk management process. In human spaceflight programs, NASA has always maintained a rigorous and highly structured risk management effort. When lives are at stake, NASA's missions must be 100% safe; the risk management approach used in human spaceflight has always been comprehensive.

  19. PREFACE: International Conference on Applied Sciences 2015 (ICAS2015)

    NASA Astrophysics Data System (ADS)

    Lemle, Ludovic Dan; Jiang, Yiwen

    2016-02-01

    The International Conference on Applied Sciences ICAS2015 took place in Wuhan, China on June 3-5, 2015 at the Military Economics Academy of Wuhan. The conference is regularly organized, alternately in Romania and in P.R. China, by Politehnica University of Timişoara, Romania, and Military Economics Academy of Wuhan, P.R. China, with the joint aims to serve as a platform for exchange of information between various areas of applied sciences, and to promote the communication between the scientists of different nations, countries and continents. The topics of the conference cover a comprehensive spectrum of issues from Economical Sciences and Defense: Management Sciences, Business Management, Financial Management, Logistics, Human Resources, Crisis Management, Risk Management, Quality Control, Analysis and Prediction, Government Expenditure, Computational Methods in Economics, Military Sciences, National Security, and others; and from Fundamental Sciences and Engineering: Interdisciplinary applications of physics, Numerical approximation and analysis, Computational Methods in Engineering, Metallic Materials, Composite Materials, Metal Alloys, Metallurgy, Heat Transfer, Mechanical Engineering, Mechatronics, Reliability, Electrical Engineering, Circuits and Systems, Signal Processing, Software Engineering, Data Bases, Modeling and Simulation, and others. The conference gathered qualified researchers whose expertise can be used to develop new engineering knowledge that has applicability potential in Engineering, Economics, Defense, etc. The number of participants was 120 from 11 countries (China, Romania, Taiwan, Korea, Denmark, France, Italy, Spain, USA, Jamaica, and Bosnia and Herzegovina). During the three days of the conference four invited and 67 oral talks were delivered. Based on the work presented at the conference, 38 selected papers have been included in this volume of IOP Conference Series: Materials Science and Engineering. These papers present new research in the various fields of Materials Engineering, Mechanical Engineering, Computers Engineering, and Electrical Engineering. It's our great pleasure to present this volume of IOP Conference Series: Materials Science and Engineering to the scientific community to promote further research in these areas. We sincerely hope that the papers published in this volume will contribute to the advancement of knowledge in the respective fields.

  20. Computers and Library Management.

    ERIC Educational Resources Information Center

    Cooke, Deborah M.; And Others

    1985-01-01

    This five-article section discusses changes in the management of the school library resulting from use of the computer. Topics covered include data management programs (record keeping, word processing, and bibliographies); practical applications of a database; evaluation of "Circulation Plus" software; ergonomics and computers; and…

  1. Cognitive mapping tools: review and risk management needs.

    PubMed

    Wood, Matthew D; Bostrom, Ann; Bridges, Todd; Linkov, Igor

    2012-08-01

    Risk managers are increasingly interested in incorporating stakeholder beliefs and other human factors into the planning process. Effective risk assessment and management requires understanding perceptions and beliefs of involved stakeholders, and how these beliefs give rise to actions that influence risk management decisions. Formal analyses of risk manager and stakeholder cognitions represent an important first step. Techniques for diagramming stakeholder mental models provide one tool for risk managers to better understand stakeholder beliefs and perceptions concerning risk, and to leverage this new understanding in developing risk management strategies. This article reviews three methodologies for assessing and diagramming stakeholder mental models--decision-analysis-based mental modeling, concept mapping, and semantic web analysis--and assesses them with regard to their ability to address risk manager needs. © 2012 Society for Risk Analysis.

  2. A computer assisted tutorial for applications of computer spreadsheets in nursing financial management.

    PubMed

    Edwardson, S R; Pejsa, J

    1993-01-01

    A computer-based tutorial for teaching nursing financial management concepts was developed using the macro function of a commercially available spreadsheet program. The goals of the tutorial were to provide students with an experience with spreadsheets as a computer tool and to teach selected financial management concepts. Preliminary results show the tutorial was well received by students. Suggestions are made for overcoming the general lack of computer sophistication among students.

  3. A prioritization and analysis strategy for environmental surveillance results.

    PubMed

    Shyr, L J; Herrera, H; Haaker, R

    1997-11-01

    DOE facilities are required to conduct environmental surveillance to verify that facility operations remain within the approved risk envelope and have not caused undue risk to the public and the environment. Given a reduced budget, a strategy for analyzing environmental surveillance data was developed to set priorities for sampling needs. The radiological and metal data collected at Sandia National Laboratories, New Mexico, were used to demonstrate the analysis strategy. Sampling locations were prioritized for further investigation and for routine sampling needs. The process of data management, analysis, prioritization, and presentation has been automated through a custom-designed computer tool. Data collected over years can be analyzed and summarized in a short table format for prioritization and decision making.
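
    One simple way to express the prioritization step described above is to rank each sampling location by the fraction of its screening level that the measured result represents. The sketch below uses made-up locations, analytes and screening values; it is not the Sandia tool.

        # Minimal sketch (made-up numbers, not the Sandia tool): rank sampling
        # locations by the fraction of the screening level that each result represents.
        samples = [  # (location, analyte, measured value, screening level)
            ("NW fence line", "tritium (pCi/L)", 310.0, 20000.0),
            ("Arroyo outfall", "lead (mg/kg)", 380.0, 400.0),
            ("Tech Area soil", "cesium-137 (pCi/g)", 6.2, 5.0),
        ]

        ranked = sorted(samples, key=lambda s: s[2] / s[3], reverse=True)
        for location, analyte, value, screen in ranked:
            print(f"{location:15s} {analyte:20s} fraction of screening level = {value / screen:.2f}")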

  4. Computer Literacy for UK Shipping Management Ashore and Afloat. A Summary. FEU/PICKUP Project Report.

    ERIC Educational Resources Information Center

    Moreby, D. H.

    A study assessed the need of various levels of management in the shipping industry of the United Kingdom for computer literacy training. During the study, researchers interviewed managers in eight shipping companies identified as using computers, spoke with managers and consultants from five companies actively engaged in designing and installing…

  5. Enhanced Capabilities for Subcritical Experiments (ECSE) Risk Management Plan

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Urban, Mary Elizabeth

    Risk is a factor, element, constraint, or course of action that introduces an uncertainty of outcome that could impact project objectives. Risk is an inherent part of all activities, whether the activity is simple and small, or large and complex. Risk management is a process that identifies, evaluates, handles, and monitors risks that have the potential to affect project success. The risk management process spans the entire project, from its initiation to its successful completion and closeout, including both technical and programmatic (non-technical) risks. This Risk Management Plan (RMP) defines the process to be used for identifying, evaluating, handling, and monitoring risks as part of the overall management of the Enhanced Capabilities for Subcritical Experiments (ECSE) ‘Project’. Given the changing nature of the project environment, risk management is essentially an ongoing and iterative process, which applies the best efforts of a knowledgeable project staff to a suite of focused and prioritized concerns. The risk management process itself must be continually applied throughout the project life cycle. This document was prepared in accordance with DOE O 413.3B, Program and Project Management for the Acquisition of Capital Assets, its associated guide for risk management DOE G 413.3-7, Risk Management Guide, and LANL ADPM AP-350-204, Risk and Opportunity Management.

  6. 12 CFR 1710.19 - Compliance and risk management programs; compliance with other laws.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... management program. (1) An Enterprise shall establish and maintain a risk management program that is reasonably designed to manage the risks of the operations of the Enterprise. (2) The risk management program... executive officer of the Enterprise. The risk management officer shall report regularly to the board of...

  7. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Laughlin, Gary L.

    The International, Homeland, and Nuclear Security (IHNS) Program Management Unit (PMU) oversees a broad portfolio of Sandia’s programs in areas ranging from global nuclear security to critical asset protection. We use science and technology, innovative research, and global engagement to counter threats, reduce dangers, and respond to disasters. The PMU draws on the skills of scientists and engineers from across Sandia. Our programs focus on protecting US government installations, safeguarding nuclear weapons and materials, facilitating nonproliferation activities, securing infrastructures, countering chemical and biological dangers, and reducing the risk of terrorist threats. We conduct research in risk and threat analysis, monitoring and detection, decontamination and recovery, and situational awareness. We develop technologies for verifying arms control agreements, neutralizing dangerous materials, detecting intruders, and strengthening resiliency. Our programs use Sandia’s High-Performance Computing resources for predictive modeling and simulation of interdependent systems, for modeling dynamic threats and forecasting adaptive behavior, and for enabling decision support and processing large cyber data streams. In this report, we highlight four advanced computation projects that illustrate the breadth of the IHNS mission space.

  8. Using Computational Toxicology to Enable Risk-Based ...

    EPA Pesticide Factsheets

    Slide presentation at Drug Safety Gordon Research Conference 2016 on research efforts in NCCT to enable Computational Toxicology to support risk assessment.

  9. Frameworks and tools for risk assessment of manufactured nanomaterials.

    PubMed

    Hristozov, Danail; Gottardo, Stefania; Semenzin, Elena; Oomen, Agnes; Bos, Peter; Peijnenburg, Willie; van Tongeren, Martie; Nowack, Bernd; Hunt, Neil; Brunelli, Andrea; Scott-Fordsmand, Janeck J; Tran, Lang; Marcomini, Antonio

    2016-10-01

    Commercialization of nanotechnologies entails a regulatory requirement for understanding their environmental, health and safety (EHS) risks. Today we face challenges to assess these risks, which emerge from uncertainties around the interactions of manufactured nanomaterials (MNs) with humans and the environment. In order to reduce these uncertainties, it is necessary to generate sound scientific data on hazard and exposure by means of relevant frameworks and tools. The development of such approaches to facilitate the risk assessment (RA) of MNs has become a dynamic area of research. The aim of this paper was to review and critically analyse these approaches against a set of relevant criteria. The analysis concluded that none of the reviewed frameworks were able to fulfill all evaluation criteria. Many of the existing modelling tools are designed to provide screening-level assessments rather than to support regulatory RA and risk management. Nevertheless, there is a tendency towards developing more quantitative, higher-tier models, capable of incorporating uncertainty into their analyses. There is also a trend towards developing validated experimental protocols for material identification and hazard testing, reproducible across laboratories. These tools could enable a shift from a costly case-by-case RA of MNs towards a targeted, flexible and efficient process, based on grouping and read-across strategies and compliant with the 3R (Replacement, Reduction, Refinement) principles. In order to facilitate this process, it is important to transform the current efforts on developing databases and computational models into creating an integrated data and tools infrastructure to support the risk assessment and management of MNs. Copyright © 2016 Elsevier Ltd. All rights reserved.

  10. Physical security and IT convergence: Managing the cyber-related risks.

    PubMed

    McCreight, Tim; Leece, Doug

    The convergence of physical security devices into the corporate network is increasing, due to the perceived economic benefits and efficiencies gained from using one enterprise network. Bringing these two networks together is not without risk. Physical devices like closed circuit television cameras (CCTV), card access readers, and heating, ventilation and air conditioning controllers (HVAC) are typically not secured to the standards we expect for corporate computer networks. These devices can pose significant risks to the corporate network by creating new avenues to exploit vulnerabilities in less-than-secure implementations of physical systems. The ASIS Information Technology Security Council (ITSC) developed a white paper describing steps organisations can take to reduce the risks this convergence can pose, and presented these concepts at the 2015 ASIS/ISC2 Congress in Anaheim, California. 1 This paper expands upon the six characteristics described by ITSC, and provides business continuity planners with information on how to apply these recommendations to physical security devices that use the corporate network.

  11. The application of integrated knowledge-based systems for the Biomedical Risk Assessment Intelligent Network (BRAIN)

    NASA Technical Reports Server (NTRS)

    Loftin, Karin C.; Ly, Bebe; Webster, Laurie; Verlander, James; Taylor, Gerald R.; Riley, Gary; Culbert, Chris

    1992-01-01

    One of NASA's goals for long duration space flight is to maintain acceptable levels of crew health, safety, and performance. One way of meeting this goal is through BRAIN, an integrated network of both human and computer elements. BRAIN will function as an advisor to mission managers by assessing the risk of inflight biomedical problems and recommending appropriate countermeasures. Described here is a joint effort among various NASA elements to develop BRAIN and the Infectious Disease Risk Assessment (IDRA) prototype. The implementation of this effort addresses the technological aspects of knowledge acquisition, integration of IDRA components, the use of expert systems to automate the biomedical prediction process, development of a user friendly interface, and integration of the IDRA and ExerCISys systems. Because C language, CLIPS and the X-Window System are portable and easily integrated, they were chosen as the tools for the initial IDRA prototype.

  12. The utilisation of engineered invert traps in the management of near bed solids in sewer networks.

    PubMed

    Ashley, R M; Tait, S J; Stovin, V R; Burrows, R; Framer, A; Buxton, A P; Blackwood, D J; Saul, A J; Blanksby, J R

    2003-01-01

    Large existing sewers are considerable assets which wastewater utilities will require to operate for the foreseeable future to maintain health and the quality of life in cities. Despite their existence for more than a century there is surprisingly little guidance available to manage these systems to minimise problems associated with in-sewer solids. A joint study has been undertaken in the UK, to refine and utilise new knowledge gained from field data, laboratory results and Computational Fluid Dynamics (CFD) simulations to devise cost beneficial engineering tools for the application of small invert traps to localise the deposition of sediments in sewers at accessible points for collection. New guidance has been produced for trap siting and this has been linked to a risk-cost-effectiveness assessment procedure to enable system operators to approach in-sewer sediment management pro-actively rather than reactively as currently happens.

  13. A Group Contingency Plus Self-Management Intervention Targeting At-Risk Secondary Students’ Class-Work and Active Engagement

    PubMed Central

    Trevino-Maack, Sylvia I.; Kamps, Debra; Wills, Howard

    2015-01-01

    The purpose of the present study is to show that an independent group contingency (GC) combined with self-management strategies and randomized-reinforcer components can increase the amount of written work and active classroom responding in high school students. Three remedial reading classes and a total of 15 students participated in this study. Students used self-management strategies during independent reading time to increase the amount of writing in their reading logs. They used self-monitoring strategies to record whether or not they performed expected behaviors in class. A token economy using points and tickets was included in the GC to provide positive reinforcement for target responses. The results were analyzed through visual inspection of graphs and effect size computations and showed that the intervention increased the total amount of written words in the students’ reading logs and overall classroom and individual student academic engagement. PMID:26617432

  14. Prognostic Validation of SKY92 and Its Combination With ISS in an Independent Cohort of Patients With Multiple Myeloma.

    PubMed

    van Beers, Erik H; van Vliet, Martin H; Kuiper, Rowan; de Best, Leonie; Anderson, Kenneth C; Chari, Ajai; Jagannath, Sundar; Jakubowiak, Andrzej; Kumar, Shaji K; Levy, Joan B; Auclair, Daniel; Lonial, Sagar; Reece, Donna; Richardson, Paul; Siegel, David S; Stewart, A Keith; Trudel, Suzanne; Vij, Ravi; Zimmerman, Todd M; Fonseca, Rafael

    2017-09-01

    High-risk and low-risk multiple myeloma patients follow very different clinical courses, as reflected in their PFS and OS. To be clinically useful, methodologies used to identify high- and low-risk disease must be validated in representative independent clinical data and be available so that patients can be managed appropriately. A recent analysis has indicated that SKY92 combined with the International Staging System (ISS) identifies patients with different risk disease with high sensitivity. Here we computed the performance of eight gene expression-based classifiers (SKY92, UAMS70, UAMS80, IFM15, Proliferation Index, Centrosome Index, Cancer Testis Antigen and HM19) as well as the combination of SKY92/ISS in an independent cohort of 91 newly diagnosed MM patients. The classifiers identified between 9% and 21% of patients as high risk, with hazard ratios (HRs) between 1.9 and 8.2. Among the eight signatures, SKY92 identified the largest proportion of patients (21%), also with the highest HR (8.2). Our analysis also validated the combination SKY92/ISS for identification of three classes: low risk (42%), intermediate risk (37%) and high risk (21%). Between the low-risk and high-risk classes the HR is >10. Copyright © 2017 Elsevier Inc. All rights reserved.

  15. Assessing and managing multiple risks in a changing world-The Roskilde recommendations.

    PubMed

    Selck, Henriette; Adamsen, Peter B; Backhaus, Thomas; Banta, Gary T; Bruce, Peter K H; Burton, G Allen; Butts, Michael B; Boegh, Eva; Clague, John J; Dinh, Khuong V; Doorn, Neelke; Gunnarsson, Jonas S; Hauggaard-Nielsen, Henrik; Hazlerigg, Charles; Hunka, Agnieszka D; Jensen, John; Lin, Yan; Loureiro, Susana; Miraglia, Simona; Munns, Wayne R; Nadim, Farrokh; Palmqvist, Annemette; Rämö, Robert A; Seaby, Lauren P; Syberg, Kristian; Tangaa, Stine R; Thit, Amalie; Windfeld, Ronja; Zalewski, Maciej; Chapman, Peter M

    2017-01-01

    Roskilde University (Denmark) hosted a November 2015 workshop, Environmental Risk-Assessing and Managing Multiple Risks in a Changing World. This Focus article presents the consensus recommendations of 30 attendees from 9 countries regarding implementation of a common currency (ecosystem services) for holistic environmental risk assessment and management; improvements to risk assessment and management in a complex, human-modified, and changing world; appropriate development of protection goals in a 2-stage process; dealing with societal issues; risk-management information needs; conducting risk assessment of risk management; and development of adaptive and flexible regulatory systems. The authors encourage both cross-disciplinary and interdisciplinary approaches to address their 10 recommendations: 1) adopt ecosystem services as a common currency for risk assessment and management; 2) consider cumulative stressors (chemical and nonchemical) and determine which dominate to best manage and restore ecosystem services; 3) fully integrate risk managers and communities of interest into the risk-assessment process; 4) fully integrate risk assessors and communities of interest into the risk-management process; 5) consider socioeconomics and increased transparency in both risk assessment and risk management; 6) recognize the ethical rights of humans and ecosystems to an adequate level of protection; 7) determine relevant reference conditions and the proper ecological context for assessments in human-modified systems; 8) assess risks and benefits to humans and the ecosystem and consider unintended consequences of management actions; 9) avoid excessive conservatism or possible underprotection resulting from sole reliance on binary, numerical benchmarks; and 10) develop adaptive risk-management and regulatory goals based on ranges of uncertainty. Environ Toxicol Chem 2017;36:7-16. © 2016 SETAC.

  16. Adoption of Building Information Modelling in project planning risk management

    NASA Astrophysics Data System (ADS)

    Mering, M. M.; Aminudin, E.; Chai, C. S.; Zakaria, R.; Tan, C. S.; Lee, Y. Y.; Redzuan, A. A.

    2017-11-01

    An efficient and effective risk management requires a systematic and proper methodology besides knowledge and experience. However, if risk management is not discussed from the start of the project, this duty becomes notably complicated and no longer efficient. This paper presents the adoption of Building Information Modelling (BIM) in project planning risk management. The objectives are to identify the traditional risk management practices and their function, to determine the best function of BIM in risk management, and to investigate the efficiency of adopting BIM-based risk management during the project planning phase. In order to obtain data, a quantitative approach is adopted in this research. Based on the data analysis, the lack of compliance with project requirements and the failure to recognise risk and develop responses to opportunity are the risks that occur when traditional risk management is implemented. When BIM is used in project planning, the tracking of cost control and cash flow helps the project cycle to be completed on time. 5D cost estimation or cash flow modelling benefits risk management in planning, controlling and managing budget and cost reasonably. Two factors benefited most from BIM-based technology: the formwork plan with an integrated fall plan and the design-for-safety model check. By adopting risk management, potential risks linked with a project can be identified and responded to so as to reduce them to an acceptable extent. This means recognising potential risks and avoiding threats by reducing their negative effects. BIM-based risk management can enhance the planning process of construction projects and benefits the construction players in various aspects. It is important to know the application of BIM-based risk management, as it can be a lesson learnt for others implementing BIM and can increase the quality of the project.

  17. [Management of preterm labor].

    PubMed

    Kayem, G; Lorthe, E; Doret, M

    2016-12-01

    To define the management of preterm labor (PTL). The literature search was conducted using the computer databases Medline and the Cochrane Library for the period from 1969 to March 2016. Leukocytosis screening may be useful in case of hospitalization for preterm labor (PTL). Its use is not routine (professional consensus). Screening for urinary tract infection by urine culture should be systematic, and antibiotic treatment should be given in cases of bacterial colonization or urinary tract infection for a period of 7 days (grade A). The vaginal swab is useful to detect group B streptococcus; antibiotics should be prescribed during labor if it is positive (grade A). Routine antibiotic therapy is not recommended in case of PTL (grade A). Prolonged hospitalization does not reduce the risk of preterm delivery (NP3) and is not recommended (grade B). Bed rest does not reduce the risk of PTL (NP3), increases the risk of thromboembolism (NP3), and is not recommended (grade C). After hospitalization for PTL, a regular visit by a caregiver at home may be helpful when patients belong to a precarious environment or are psychologically vulnerable (professional consensus). The benefit of repeated home uterine activity monitoring after hospitalization for PTL has not been shown (NP3). It is not recommended to follow up uterine activity systematically after hospitalization for PTL (grade C). The management of PTL should be individualized, include the search for and treatment of infection, and avoid prolonged hospitalization or bed rest. Copyright © 2016 Elsevier Masson SAS. All rights reserved.

  18. Risk Management for Human Support Technology Development

    NASA Technical Reports Server (NTRS)

    jones, Harry

    2005-01-01

    NASA requires continuous risk management for all programs and projects. The risk management process identifies risks, analyzes their impact, prioritizes them, develops and carries out plans to mitigate or accept them, tracks risks and mitigation plans, and communicates and documents risk information. Project risk management is driven by the project goal and is performed by the entire team. Risk management begins early in the formulation phase with initial risk identification and development of a risk management plan and continues throughout the project life cycle. This paper describes the risk management approach that is suggested for use in NASA's Human Support Technology Development. The first step in risk management is to identify the detailed technical and programmatic risks specific to a project. Each individual risk should be described in detail. The identified risks are summarized in a complete risk list. Risk analysis provides estimates of the likelihood and the qualitative impact of a risk. The likelihood and impact of the risk are used to define its priority location in the risk matrix. The approaches for responding to risk are either to mitigate it by eliminating or reducing the effect or likelihood of a risk, to accept it with a documented rationale and contingency plan, or to research or monitor the risk. The Human Support Technology Development program includes many projects with independently achievable goals. Each project must do independent risk management, considering all its risks together and trading them against performance, budget, and schedule. Since the program can succeed even if some projects fail, the program risk has a complex dependence on the individual project risks.
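
    The likelihood-and-impact scoring described above can be sketched as a small function that maps a matrix position to a priority band and a suggested response. The 1-5 scales and band thresholds below are illustrative assumptions, not values taken from a NASA standard.

        # Minimal sketch (illustrative thresholds, not a NASA standard): map a
        # likelihood/impact pair on 1-5 scales to a priority band and suggested response.
        def priority_band(likelihood, impact):
            score = likelihood * impact
            if score >= 15 or impact == 5:
                return "red: mitigate"
            if score >= 6:
                return "yellow: research or monitor"
            return "green: accept with documented rationale"

        if __name__ == "__main__":
            for likelihood, impact in [(5, 4), (3, 3), (1, 2)]:
                print(f"L={likelihood}, I={impact} -> {priority_band(likelihood, impact)}")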

  19. Managing Risk for Cassini During Mission Operations and Data Analysis (MOandDA)

    NASA Technical Reports Server (NTRS)

    Witkowski, Mona M.

    2002-01-01

    A Risk Management Process has been tailored for Cassini that not only satisfies the requirements of NASA and JPL, but also allows the Program to proactively identify and assess risks that threaten mission objectives. Cassini Risk Management is a team effort that involves both management and engineering staff. The process is managed and facilitated by the Mission Assurance Manager (MAM), but requires regular interactions with Program Staff and team members to instill the risk management philosophy into the day-to-day mission operations. While Risk Management is well defined for projects in the development phase, it is a relatively new concept for Mission Operations. The Cassini team has embraced this process and has begun using it in an effective, proactive manner, to ensure mission success. It is hoped that the Cassini Risk Management Process will form the basis by which risk management is conducted during MO&DA on future projects. The team is proactive in identifying, assessing and mitigating risks before they become problems. Cost effectiveness is achieved by comprehensively identifying risks, rapidly assessing which risks require the expenditure of project resources, taking early actions to mitigate these risks, and iterating the process frequently to be responsive to the dynamic internal and external environments. The Cassini Program has successfully implemented a Risk Management Process for mission operations. The initial SRL has been developed and input into the online tool. The Risk Management web-based system has been rolled out for use by the flight team and risk owners. The benefits of the processes put into place will become visible and will be illustrated in future papers.

  20. Improving Multi-Objective Management of Water Quality Tipping Points: Revisiting the Classical Shallow Lake Problem

    NASA Astrophysics Data System (ADS)

    Quinn, J. D.; Reed, P. M.; Keller, K.

    2015-12-01

    Recent multi-objective extensions of the classical shallow lake problem are useful for exploring the conceptual and computational challenges that emerge when managing irreversible water quality tipping points. Building on this work, we explore a four objective version of the lake problem where a hypothetical town derives economic benefits from polluting a nearby lake, but at the risk of irreversibly tipping the lake into a permanently polluted state. The trophic state of the lake exhibits non-linear threshold dynamics; below some critical phosphorus (P) threshold it is healthy and oligotrophic, but above this threshold it is irreversibly eutrophic. The town must decide how much P to discharge each year, a decision complicated by uncertainty in the natural P inflow to the lake. The shallow lake problem provides a conceptually rich set of dynamics, low computational demands, and a high level of mathematical difficulty. These properties maximize its value for benchmarking the relative merits and limitations of emerging decision support frameworks, such as Direct Policy Search (DPS). Here, we explore the use of DPS as a formal means of developing robust environmental pollution control rules that effectively account for deeply uncertain system states and conflicting objectives. The DPS reformulation of the shallow lake problem shows promise in formalizing pollution control triggers and signposts, while dramatically reducing the computational complexity of the multi-objective pollution control problem. More broadly, the insights from the DPS variant of the shallow lake problem formulated in this study bridge emerging work related to socio-ecological systems management, tipping points, robust decision making, and robust control.
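
    For readers who want to see the threshold behaviour concretely, the sketch below simulates one common textbook formulation of the shallow lake dynamics: phosphorus decays at a fixed rate, is recycled from sediments through a sigmoid feedback, and accumulates from the town's controlled release plus lognormally distributed natural inflows. The parameter values are illustrative assumptions and are not the exact set used by the authors.

        # Minimal sketch of one common shallow-lake formulation (illustrative
        # parameters, not the authors' exact setup): phosphorus decays at rate b,
        # is recycled from sediments through a sigmoid term, and accumulates from
        # the controlled release plus uncertain natural inflows.
        import random

        def simulate_lake(annual_release, years=100, b=0.42, q=2.0, seed=0):
            random.seed(seed)
            x = 0.0                                          # lake phosphorus concentration
            for _ in range(years):
                natural = random.lognormvariate(-4.0, 0.5)   # uncertain natural inflow
                recycling = x ** q / (1.0 + x ** q)          # sediment feedback (tipping term)
                x = x * (1.0 - b) + recycling + annual_release + natural
            return x

        if __name__ == "__main__":
            for release in (0.01, 0.10):
                print(f"Annual release {release:.2f}: P after 100 years = {simulate_lake(release):.2f}")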

  1. Total mandibular subapical osteotomy and Le Fort I osteotomy using piezosurgery and computer-aided designed and manufactured surgical splints: a favorable combination of three techniques in the management of severe mouth asymmetry in Parry-Romberg syndrome.

    PubMed

    Scolozzi, Paolo; Herzog, Georges

    2014-05-01

    Although its pathogenesis remains obscure, Parry-Romberg syndrome (PRS) has been associated with the linear scleroderma en coup de sabre. PRS is characterized by unilateral facial atrophy of the skin, subcutaneous tissue, muscles, and bones with at least 1 dermatome supplied by the trigeminal nerve. Facial asymmetry represents the most common sequela and can involve the soft tissues, craniomaxillofacial skeleton, dentoalveolar area, and temporomandibular joint. Although orthognathic procedures have been reported for skeletal reconstruction, treatment of facial asymmetry has been directed to augmentation of the soft tissue volume on the atrophic side using different recontouring or volumetric augmentation techniques. Total mandibular subapical osteotomy has been used in the management of dentofacial deformities, such as open bite and mandibular dentoalveolar retrusion or protrusion associated with an imbalance between the lower lip and the chin. Management of orthognathic procedures has been improved by the recent introduction of stereolithographic surgical splints using computer-aided design (CAD) and computer-aided manufacturing (CAM) technology and piezosurgery. Piezosurgery has increased security during surgery, especially for delicate procedures associated with a high risk of nerve injury. The present report describes a combined total mandibular subapical osteotomy and Le Fort I osteotomy using piezosurgery and surgical splints fabricated using CAD and CAM for the correction of severe mouth asymmetry related to vertical dentoalveolar disharmony in a patient with PRS. Copyright © 2014 American Association of Oral and Maxillofacial Surgeons. Published by Elsevier Inc. All rights reserved.

  2. Computer-aided diagnostic strategy selection.

    PubMed

    Greenes, R A

    1986-03-01

    Determination of the optimal diagnostic work-up strategy for the patient is becoming a major concern for the practicing physician. Overlap of the indications for various diagnostic procedures, differences in their invasiveness or risk, and high costs have made physicians aware of the need to consider the choice of procedure carefully, as well as its relation to management actions available. In this article, the author discusses research approaches that aim toward development of formal decision analytic methods to allow the physician to determine optimal strategy; clinical algorithms or rules as guides to physician decisions; improved measures for characterizing the performance of diagnostic tests; educational tools for increasing the familiarity of physicians with the concepts underlying these measures and analytic procedures; and computer-based aids for facilitating the employment of these resources in actual clinical practice.

  3. Software Safety Risk in Legacy Safety-Critical Computer Systems

    NASA Technical Reports Server (NTRS)

    Hill, Janice L.; Baggs, Rhoda

    2007-01-01

    Safety Standards contain technical and process-oriented safety requirements. Technical requirements are those such as "must work" and "must not work" functions in the system. Process-oriented requirements are software engineering and safety management process requirements. Some standards address the system perspective and some cover just the software in the system; NASA-STD-8719.13B, Software Safety Standard, is the current standard of interest. NASA programs/projects will have their own set of safety requirements derived from the standard. A safety case is a documented demonstration that a system complies with the specified safety requirements, in which evidence is gathered on the integrity of the system and put forward as an argued case [Gardener (ed.)]. Problems occur when trying to meet safety standards, and thus make retrospective safety cases, in legacy safety-critical computer systems.

  4. Risk Management

    DTIC Science & Technology

    2011-06-02

    Why do risk management? “Identifying our risks provides opportunities to manage and improve our chances of success.” - Roger Vanscoy. “If you do not actively attack the risks, they will actively attack you.” - Tom Gilb. “The first step in the risk management process is to...”

  5. Risk management modeling and its application in maritime safety

    NASA Astrophysics Data System (ADS)

    Qin, Ting-Rong; Chen, Wei-Jiong; Zeng, Xiang-Kun

    2008-12-01

    Quantified risk assessment (QRA) needs a mathematization of risk theory. However, attention has been paid almost exclusively to applications of assessment methods, which has led to neglect of research into fundamental theories, such as the relationships among risk, safety, danger, and so on. In order to solve this problem, as a first step, fundamental theoretical relationships about risk and risk management were analyzed for this paper in the light of mathematics, and then illustrated with some charts. Second, man-machine-environment-management (MMEM) theory was introduced into risk theory to analyze some properties of risk. On the basis of this, a three-dimensional model of risk management was established that includes a goal dimension, a management dimension, and an operation dimension. This goal-management-operation (GMO) model was explained, and then emphasis was laid on the discussion of the risk flowchart (operation dimension), which lays the groundwork for further study of risk management and qualitative and quantitative assessment. Next, the relationship between Formal Safety Assessment (FSA) and risk management was researched. This revealed that the FSA method, which the International Maritime Organization (IMO) is actively spreading, comes from risk management theory. Finally, conclusions were drawn about how to apply this risk management method to concrete fields efficiently and conveniently, as well as areas where further research is required.

  6. A review of risk management process in construction projects of developing countries

    NASA Astrophysics Data System (ADS)

    Bahamid, R. A.; Doh, S. I.

    2017-11-01

    In the construction industry, the risk management concept is a less popular technique. There are three main stages in the systematic approach to risk management in the construction industry: a) risk response; b) risk analysis and evaluation; and c) risk identification. The high risk related to the construction business affects each of its participants, while operational analysis and management of construction-related risks remain an enormous task for practitioners of the industry. This paper reviews the existing literature on construction project risk management in developing countries, specifically on the risk management process. The literature lacks an ample risk management process approach capable of capturing risk impacts on diverse project objectives. This literature review aims at discovering the techniques frequently used in risk identification and analysis. It also attempts to identify risk responses, to clarify the different classifications of risk sources in the existing literature of developing countries, and to identify future research directions on project risks in the area of construction in developing countries.

  7. 78 FR 49663 - Enhanced Risk Management Standards for Systemically Important Derivatives Clearing Organizations

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-08-15

    ... objectives and principles for these risk management standards are to: (1) Promote risk management; (2... international risk management standards set by CPSS-IOSCO's Principles and Recommendations.\\64\\ The Board did... COMMODITY FUTURES TRADING COMMISSION 17 CFR Part 39 RIN 3038-AC98 Enhanced Risk Management...

  8. Sustaining a Mature Risk Management Process: Ensuring the International Space Station for a Vibrant Future

    NASA Technical Reports Server (NTRS)

    Raftery, Michael; Carter-Journet, Katrina

    2013-01-01

    The International Space Station (ISS) risk management methodology is an example of a mature and sustainable process. Risk management is a systematic approach used to proactively identify, analyze, plan, track, control, communicate, and document risks to help management make risk-informed decisions that increase the likelihood of achieving program objectives. The ISS has been operating in space for over 14 years and has been permanently crewed for over 12 years. It is the longest surviving habitable vehicle in low Earth orbit history. Without a mature and proven risk management plan, it would be increasingly difficult to achieve mission success throughout the life of the ISS Program. A successful risk management process must be able to adapt to a dynamic program. As ISS program-level decision processes have evolved, so too has the ISS risk management process continued to innovate, improve, and adapt. Constant adaptation of risk management tools and an ever-improving process is essential to the continued success of the ISS Program. Above all, sustained support from program management is vital to the continued effectiveness of risk management. Risk management is valued and stressed as an important process by the ISS Program.

  9. Identifying and managing the risks of medical ionizing radiation in endourology.

    PubMed

    Yecies, Todd; Averch, Timothy D; Semins, Michelle J

    2018-02-01

    The risks of exposure to medical ionizing radiation are of increasing concern both among medical professionals and the general public. Patients with nephrolithiasis are exposed to high levels of ionizing radiation through both diagnostic and therapeutic modalities. Endourologists who perform a high volume of fluoroscopy-guided procedures are also exposed to significant quantities of ionizing radiation. The combination of judicious use of radiation-based imaging modalities, application of new imaging techniques such as ultra-low-dose computed tomography (CT) scanning, and modified use of current technology, such as increasing ultrasound and pulsed fluoroscopy utilization, offers the possibility of significantly reducing radiation exposure. We present a review of the literature regarding the risks of medical ionizing radiation to patients and surgeons as it pertains to the field of endourology, and the interventions that can be performed to limit this exposure. A review of the current state of the literature was performed using MEDLINE and PubMed. Interventions designed to limit patient and surgeon radiation exposure were identified and analyzed. Summaries of the data were compiled and synthesized in the body of the text. While no level 1 evidence exists demonstrating the risk of secondary malignancy with radiation exposure, the preponderance of evidence suggests a dose- and age-dependent increase in malignancy risk from ionizing radiation. Patients with nephrolithiasis were exposed to an average effective dose of 37 mSv over a 2-year period. Multiple evidence-based interventions to limit patient and surgeon radiation exposure and the associated risk were identified. Current evidence suggests an age- and dose-dependent risk of secondary malignancy from ionizing radiation. Urologists must act in accordance with ALARA principles to safely manage nephrolithiasis while minimizing radiation exposure.

  10. Risk management.

    PubMed

    Chambers, David W

    2010-01-01

    Every plan contains risk. To proceed without planning some means of managing that risk is to court failure. The basic logic of risk is explained. It consists in identifying a threshold where some corrective action is necessary, the probability of exceeding that threshold, and the attendant cost should the undesired outcome occur. This is the probable cost of failure. Various risk categories in dentistry are identified, including lack of liquidity; poor quality; equipment or procedure failures; employee slips; competitive environments; new regulations; unreliable suppliers, partners, and patients; and threats to one's reputation. It is prudent to make investments in risk management to the extent that the cost of managing the risk is less than the probable loss due to risk failure and when risk management strategies can be matched to type of risk. Four risk management strategies are discussed: insurance, reducing the probability of failure, reducing the costs of failure, and learning. A risk management accounting of the financial meltdown of October 2008 is provided.
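
    The decision rule in this abstract (invest in risk management only when the cost of managing the risk is less than the probable cost of failure) can be made concrete with a few lines of arithmetic. The sketch below is a hypothetical illustration of that rule only; the dollar figures and probabilities are invented, not taken from the article.

```python
# Hypothetical illustration of the decision rule described above:
# invest in risk management only when its cost is below the probable
# cost of failure (probability of exceeding the threshold x cost of
# the undesired outcome). All numbers are made up for illustration.

def probable_cost_of_failure(p_exceed_threshold: float, cost_if_failure: float) -> float:
    """Expected loss from the undesired outcome."""
    return p_exceed_threshold * cost_if_failure

def worth_managing(mitigation_cost: float, p_exceed: float, cost_if_failure: float) -> bool:
    """True when the mitigation costs less than the probable loss it addresses."""
    return mitigation_cost < probable_cost_of_failure(p_exceed, cost_if_failure)

# Example: a practice faces a 5% annual chance of an equipment failure costing $40,000.
expected_loss = probable_cost_of_failure(0.05, 40_000)   # $2,000 per year
print(expected_loss)                                      # 2000.0
print(worth_managing(1_500, 0.05, 40_000))                # True: a $1,500 service contract is justified
print(worth_managing(3_000, 0.05, 40_000))                # False: $3,000 exceeds the probable loss
```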

  11. Research on Risk Management and Power Supplying Enterprise Control

    NASA Astrophysics Data System (ADS)

    Shen, Jianfei; Wang, Yige

    2017-09-01

    This paper is set against the background of electric power enterprises strengthening their risk management in response to government requirements. For the power industry, we explained risk management theory, analysed the current macro environment and the industry's basic situation, and then classified and interpreted the main risks. In a case study of a power bureau, we established a risk management system based on a deep understanding of the characteristics of its organizational structure and risk management function. We then focused on operational risks as well as integrity and anti-corruption work to give a more effective framework for the risk management system. Finally, we identified the remaining problems and specific countermeasures in risk management, providing a reference for other electric power enterprises.

  12. Calysto: Risk Management for Commercial Manned Spaceflight

    NASA Technical Reports Server (NTRS)

    Dillaman, Gary

    2012-01-01

    The Calysto: Risk Management for Commercial Manned Spaceflight study analyzes risk management in large enterprises and how to effectively communicate risks across organizations. The Calysto Risk Management tool developed by NASA Kennedy Space Center's SharePoint team is used and referenced throughout the study. Calysto is a web-based tool built on Microsoft's SharePoint platform. The risk management process at NASA is examined and incorporated into the study. Using risk management standards from industry and from specific organizations at the Kennedy Space Center, three methods of communicating and elevating risk are examined. Each method is described in terms of its effectiveness and the plausibility of using it in the Calysto Risk Management Tool. At the end of the study, suggestions are made for future versions of Calysto.

  13. Nine steps to risk-informed wellhead protection and management: Methods and application to the Burgberg Catchment

    NASA Astrophysics Data System (ADS)

    Nowak, W.; Enzenhoefer, R.; Bunk, T.

    2013-12-01

    Wellhead protection zones are commonly delineated via advective travel time analysis without considering any aspects of model uncertainty. In the past decade, research efforts produced quantifiable risk-based safety margins for protection zones. They are based on well vulnerability criteria (e.g., travel times, exposure times, peak concentrations) cast into a probabilistic setting, i.e., they consider model and parameter uncertainty. Practitioners still refrain from applying these new techniques for three main reasons: (1) they fear the possibly cost-intensive additional areal demand of probabilistic safety margins, (2) probabilistic approaches are allegedly complex, not readily available, and consume huge computing resources, and (3) uncertainty bounds are fuzzy, whereas final decisions are binary. The primary goal of this study is to show that these reservations are unjustified. We present a straightforward and computationally affordable framework based on a novel combination of well-known tools (e.g., MODFLOW, PEST, Monte Carlo). This framework provides risk-informed decision support for robust and transparent wellhead delineation under uncertainty. Thus, probabilistic risk-informed wellhead protection is possible with methods readily available to practitioners. As vivid proof of concept, we illustrate our key points on a pumped karstic well catchment located in Germany. In the case study, we show that reliability levels can be increased by re-allocating the existing delineated area, with no increase in its total size. This is achieved by simply swapping delineated low-risk areas against previously non-delineated high-risk areas. Also, we show that further improvements may often be available at only a small increase in delineated area. Depending on the context, increases or reductions of delineated area directly translate to costs and benefits if the land is priced, or if land owners need to be compensated for land use restrictions.
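
    As a rough illustration of the probabilistic well-vulnerability idea described above (not the authors' MODFLOW/PEST/Monte Carlo workflow), the sketch below samples an uncertain hydraulic conductivity, converts it into an advective travel time, and flags a location for delineation when the probability of arrival within a travel-time threshold exceeds a chosen risk tolerance. All parameter values (conductivity statistics, gradient, porosity, threshold, tolerance) are hypothetical.

```python
# Minimal Monte Carlo sketch of a probabilistic travel-time criterion for
# wellhead delineation. Assumes a karst-like aquifer with lognormal
# uncertainty in hydraulic conductivity K; all values are illustrative.
import numpy as np

rng = np.random.default_rng(42)

def prob_arrival_within(distance_m, threshold_days, n=20_000,
                        logK_mean=-3.0, logK_sd=0.5,      # log10 of K [m/s]
                        gradient=0.01, porosity=0.15):
    """P(advective travel time <= threshold) under uncertainty in K."""
    K = 10.0 ** rng.normal(logK_mean, logK_sd, size=n)     # m/s
    seepage_velocity = K * gradient / porosity              # m/s
    travel_time_days = distance_m / seepage_velocity / 86_400.0
    return float(np.mean(travel_time_days <= threshold_days))

risk_tolerance = 0.10            # accept at most a 10% chance of fast arrival
for distance in (100.0, 250.0, 500.0):
    p = prob_arrival_within(distance, threshold_days=50.0)
    delineate = p > risk_tolerance
    print(f"{distance:6.0f} m: P(arrival <= 50 d) = {p:.2f} -> delineate: {delineate}")
```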

  14. The utility of computed tomography in the management of fever and neutropenia in pediatric oncology.

    PubMed

    Rao, Avani D; Sugar, Elizabeth A; Barrett, Neil; Mahesh, Mahadevappa; Arceci, Robert J

    2015-10-01

    Despite the frequent use and radiation exposure of computed tomography (CT) scans, there is little information on patterns of CT use and their utility in the management of pediatric patients with fever and neutropenia (FN). We examined the contribution of either the commonly employed pan-CT (multiple anatomical locations) or targeted CT (single location) scanning to identify possible infectious etiologies in this challenging clinical scenario. Procedure: Pediatric patients with an underlying malignancy admitted for fever (temperature ≥ 38.3 °C) and an absolute neutrophil count <500 cells/μL from 2003-2009 were included. Risk factors associated with utilization, results, and effects on clinical management of CT scans were identified. Results: Charts for 635 admissions for FN from 263 patients were reviewed. Overall, 139 (22%) admissions (93 individuals) had at least one scan. Of 188 scans, 103 (55%) were pan-scans. Changes in management were most strongly associated with the identification of evidence consistent with infection (OR = 12.64, 95% CI: 5.05-31.60, P < 0.001). Seventy-eight (41%) of all CT scans led to a change in clinical management, most commonly relating to use of antibiotic (N = 41, 53%) or antifungal/antiviral medications (N = 33, 42%). The odds of a change in clinical management did not differ for those receiving a pan-scan compared to those receiving a targeted scan (OR = 1.23; 95% CI, 0.61-2.46; P = 0.57). Conclusions: When CT is clinically indicated, it is important for clinicians to strongly consider utilizing a targeted scan to reduce radiation exposure to patients as well as to decrease costs without compromising care. © 2015 Wiley Periodicals, Inc.

  15. Revisions of the Fish Invasiveness Screening Kit (FISK) for its application in warmer climatic zones, with particular reference to peninsular Florida.

    PubMed

    Lawson, Larry L; Hill, Jeffrey E; Vilizzi, Lorenzo; Hardin, Scott; Copp, Gordon H

    2013-08-01

    The initial version (v1) of the Fish Invasiveness Screening Kit (FISK) was adapted from the Weed Risk Assessment of Pheloung, Williams, and Halloy to assess the potential invasiveness of nonnative freshwater fishes in the United Kingdom. Published applications of FISK v1 have been primarily in temperate-zone countries (Belgium, Belarus, and Japan), so the specificity of this screening tool to that climatic zone was not noted until attempts were made to apply it in peninsular Florida. To remedy this shortcoming, the questions and guidance notes of FISK v1 were reviewed and revised to improve clarity and extend its applicability to broader climatic regions, resulting in changes to 36 of the 49 questions. In addition, upgrades were made to the software architecture of FISK to improve overall computational speed as well as graphical user interface flexibility and friendliness. We demonstrate the process of screening a fish species using FISK v2 in a realistic management scenario by assessing the Barcoo grunter Scortum barcoo (Terapontidae), a species whose management concerns are related to its potential use for aquaponics in Florida. The FISK v2 screening of Barcoo grunter placed the species into the lower range of medium risk (score = 5), suggesting it is a permissible species for use in Florida under current nonnative species regulations. Screening of the Barcoo grunter illustrates the usefulness of FISK v2 as a proactive tool serving to inform risk management decisions, but the low level of confidence associated with the assessment highlighted a dearth of critical information on this species. © 2012 Society for Risk Analysis.

  16. Enhancing Earth Observation and Modeling for Tsunami Disaster Response and Management

    NASA Astrophysics Data System (ADS)

    Koshimura, Shunichi; Post, Joachim

    2017-04-01

    In the aftermath of catastrophic natural disasters, such as earthquakes and tsunamis, our society has experienced significant difficulties in assessing disaster impact within a limited amount of time. In recent years, the quality of satellite sensors and access to and use of satellite imagery and services has greatly improved. More and more space agencies have embraced data-sharing policies that facilitate access to archived and up-to-date imagery. Tremendous progress has been achieved through the continuous development of powerful algorithms and software packages to manage and process geospatial data and to disseminate imagery and geospatial datasets in near-real time via geo-web-services, which can be used in disaster-risk management and emergency response efforts. Satellite Earth observations now offer consistent coverage and scope to provide a synoptic overview of large areas, repeated regularly. These can be used to compare risk across different countries, day and night, in all weather conditions, and in trans-boundary areas. At the same time, with the use of modern computing power and advanced sensor networks, great advances in real-time simulation have been achieved. The data and information derived from satellite Earth observations, integrated with in situ information and simulation modeling, provide unique value and the necessary complement to socio-economic data. Emphasis also needs to be placed on ensuring space-based data and information are used in existing and planned national and local disaster risk management systems, together with other data and information sources, as a way to strengthen the resilience of communities. Through case studies of the 2011 Great East Japan earthquake and tsunami disaster, we aim to discuss how Earth observations and modeling, in combination with local, in situ data and information sources, can support the decision-making process before, during and after a disaster strikes.

  17. 41 CFR 105-64.110 - When may GSA establish computer matching programs?

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... computer matching programs? 105-64.110 Section 105-64.110 Public Contracts and Property Management Federal... GSA establish computer matching programs? (a) System managers will establish computer matching... direction of the GSA Data Integrity Board that will be established when and if computer matching programs...

  18. 41 CFR 105-64.110 - When may GSA establish computer matching programs?

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... computer matching programs? 105-64.110 Section 105-64.110 Public Contracts and Property Management Federal... GSA establish computer matching programs? (a) System managers will establish computer matching... direction of the GSA Data Integrity Board that will be established when and if computer matching programs...

  19. 41 CFR 105-64.110 - When may GSA establish computer matching programs?

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... computer matching programs? 105-64.110 Section 105-64.110 Public Contracts and Property Management Federal... GSA establish computer matching programs? (a) System managers will establish computer matching... direction of the GSA Data Integrity Board that will be established when and if computer matching programs...

  20. 41 CFR 105-64.110 - When may GSA establish computer matching programs?

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... computer matching programs? 105-64.110 Section 105-64.110 Public Contracts and Property Management Federal... GSA establish computer matching programs? (a) System managers will establish computer matching... direction of the GSA Data Integrity Board that will be established when and if computer matching programs...

  1. 41 CFR 105-64.110 - When may GSA establish computer matching programs?

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... computer matching programs? 105-64.110 Section 105-64.110 Public Contracts and Property Management Federal... GSA establish computer matching programs? (a) System managers will establish computer matching... direction of the GSA Data Integrity Board that will be established when and if computer matching programs...

  2. Influence of green supply chain risk management on performance of Chinese manufacturing enterprises

    NASA Astrophysics Data System (ADS)

    Zhang, Dongying; Yuting, Duan; Junyi, Shen

    2017-12-01

    This paper briefly introduces the background of research on the impact of green supply chain risk management on corporate performance and reviews the relevant research literature at home and abroad. It then uses grey relational analysis to examine the impact of green supply chain risk management on enterprise performance, based on 26 sets of industry-related statistical data, across three dimensions: purchasing risk management performance, manufacturing risk management performance, and marketing risk management performance.
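
    Grey relational analysis, the method named in this abstract, reduces to a few array operations: normalize each series, measure its distance from a reference series, and average the resulting grey relational coefficients into a grade per factor. The sketch below illustrates this on made-up data; the figures, the three dimension labels, and the distinguishing coefficient of 0.5 are assumptions, not values from the paper.

```python
# Compact grey relational analysis (GRA) sketch on invented data: rows are
# firms/industries, columns are risk-management performance dimensions, and
# the reference series is overall enterprise performance.
import numpy as np

def grey_relational_grades(reference, factors, zeta=0.5):
    """Return one grey relational grade per factor column."""
    X = np.column_stack([reference, factors]).astype(float)
    # 1) normalise every series to [0, 1] (larger-is-better convention)
    X = (X - X.min(axis=0)) / (X.max(axis=0) - X.min(axis=0))
    ref, fac = X[:, 0], X[:, 1:]
    # 2) absolute differences from the reference series
    delta = np.abs(fac - ref[:, None])
    # 3) grey relational coefficients and 4) grades (column means)
    coeff = (delta.min() + zeta * delta.max()) / (delta + zeta * delta.max())
    return coeff.mean(axis=0)

performance = np.array([3.1, 2.8, 3.6, 4.0, 2.5, 3.3])          # reference series
risk_mgmt   = np.array([[3.0, 2.2, 2.9],
                        [2.6, 2.0, 2.7],
                        [3.5, 2.9, 3.1],
                        [3.9, 3.3, 3.0],
                        [2.4, 1.9, 2.6],
                        [3.2, 2.6, 3.4]])                         # purchasing, manufacturing, marketing
grades = grey_relational_grades(performance, risk_mgmt)
for name, g in zip(["purchasing", "manufacturing", "marketing"], grades):
    print(f"{name:13s} grey relational grade = {g:.3f}")
```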

  3. NeuroManager: a workflow analysis based simulation management engine for computational neuroscience

    PubMed Central

    Stockton, David B.; Santamaria, Fidel

    2015-01-01

    We developed NeuroManager, an object-oriented simulation management software engine for computational neuroscience. NeuroManager automates the workflow of simulation job submissions when using heterogeneous computational resources, simulators, and simulation tasks. The object-oriented approach (1) provides flexibility to adapt to a variety of neuroscience simulators, (2) simplifies the use of heterogeneous computational resources, from desktops to supercomputer clusters, and (3) improves tracking of simulator/simulation evolution. We implemented NeuroManager in MATLAB, a widely used engineering and scientific language, for its signal and image processing tools, prevalence in electrophysiology analysis, and increasing use in college biology education. To design and develop NeuroManager we analyzed the workflow of simulation submission for a variety of simulators, operating systems, and computational resources, including the handling of input parameters, data, models, results, and analyses. This resulted in 22 stages of simulation submission workflow. The software incorporates progress notification, automatic organization, labeling, and time-stamping of data and results, and integrated access to MATLAB's analysis and visualization tools. NeuroManager provides users with the tools to automate daily tasks, and assists principal investigators in tracking and recreating the evolution of research projects performed by multiple people. Overall, NeuroManager provides the infrastructure needed to improve workflow, manage multiple simultaneous simulations, and maintain provenance of the potentially large amounts of data produced during the course of a research project. PMID:26528175

  4. NeuroManager: a workflow analysis based simulation management engine for computational neuroscience.

    PubMed

    Stockton, David B; Santamaria, Fidel

    2015-01-01

    We developed NeuroManager, an object-oriented simulation management software engine for computational neuroscience. NeuroManager automates the workflow of simulation job submissions when using heterogeneous computational resources, simulators, and simulation tasks. The object-oriented approach (1) provides flexibility to adapt to a variety of neuroscience simulators, (2) simplifies the use of heterogeneous computational resources, from desktops to supercomputer clusters, and (3) improves tracking of simulator/simulation evolution. We implemented NeuroManager in MATLAB, a widely used engineering and scientific language, for its signal and image processing tools, prevalence in electrophysiology analysis, and increasing use in college biology education. To design and develop NeuroManager we analyzed the workflow of simulation submission for a variety of simulators, operating systems, and computational resources, including the handling of input parameters, data, models, results, and analyses. This resulted in 22 stages of simulation submission workflow. The software incorporates progress notification, automatic organization, labeling, and time-stamping of data and results, and integrated access to MATLAB's analysis and visualization tools. NeuroManager provides users with the tools to automate daily tasks, and assists principal investigators in tracking and recreating the evolution of research projects performed by multiple people. Overall, NeuroManager provides the infrastructure needed to improve workflow, manage multiple simultaneous simulations, and maintain provenance of the potentially large amounts of data produced during the course of a research project.
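
    The staged-workflow idea described in this abstract, in which every simulation submission passes through an ordered series of stages while the engine records provenance, can be sketched independently of NeuroManager's MATLAB implementation. The toy pipeline below is a hypothetical illustration of that pattern only; the stage names, job fields, and parameters are invented and are not NeuroManager's API.

```python
# Minimal, hypothetical sketch of a staged simulation-submission workflow with
# per-job provenance logging (not NeuroManager's actual design).
from dataclasses import dataclass, field
from datetime import datetime
from typing import Callable, Dict, List

@dataclass
class SimJob:
    name: str
    params: Dict[str, float]
    history: List[str] = field(default_factory=list)

    def log(self, msg: str) -> None:
        self.history.append(f"{datetime.now().isoformat(timespec='seconds')} {msg}")

def stage_validate(job: SimJob) -> None:
    assert all(v >= 0 for v in job.params.values()), "negative parameter"
    job.log("validated input parameters")

def stage_stage_files(job: SimJob) -> None:
    job.log("staged model and data files")        # placeholder for file transfer

def stage_submit(job: SimJob) -> None:
    job.log("submitted to compute resource")      # placeholder for scheduler call

def stage_collect(job: SimJob) -> None:
    job.log("collected and labeled results")      # placeholder for retrieval

PIPELINE: List[Callable[[SimJob], None]] = [
    stage_validate, stage_stage_files, stage_submit, stage_collect,
]

def run(job: SimJob) -> SimJob:
    for stage in PIPELINE:
        stage(job)                                 # each stage appends to the provenance log
    return job

job = run(SimJob("purkinje_cell_01", {"g_na": 0.12, "g_k": 0.036}))
print("\n".join(job.history))
```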

  5. Understanding growers' decisions to manage invasive pathogens at the farm level.

    PubMed

    Breukers, Annemarie; van Asseldonk, Marcel; Bremmer, Johan; Beekman, Volkert

    2012-06-01

    Globalization causes plant production systems to be increasingly threatened by invasive pests and pathogens. Much research is devoted to supporting the management of these risks. Yet, the role of growers' perceptions and behavior in risk management has remained insufficiently analyzed. This article aims to fill this gap by addressing risk management of invasive pathogens from a sociopsychological perspective. An analytical framework based on the Theory of Planned Behavior was used to explain growers' decisions on voluntary risk management measures. Survey information from 303 Dutch horticultural growers was statistically analyzed, including regression and cluster analysis. It appeared that growers were generally willing to apply risk management measures, and that poor risk management was mainly due to perceived barriers, such as high costs and doubts regarding efficacy of management measures. The management measures applied varied considerably among growers, depending on production sector and farm-specific circumstances. Growers' risk perception was found to play a role in their risk management, although the causal relation remained unclear. These results underscore the need to apply a holistic perspective to farm level management of invasive pathogen risk, considering the entire package of management measures and accounting for sector- and farm-specific circumstances. Moreover, they demonstrate that invasive pathogen risk management can benefit from a multidisciplinary approach that incorporates growers' perceptions and behavior.

  6. 77 FR 74518 - Privacy Act of 1974; Computer Matching Program

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-12-14

    ... OFFICE OF PERSONNEL MANAGEMENT Privacy Act of 1974; Computer Matching Program AGENCY: Office of Personnel Management. ACTION: Notice--computer matching between the Office of Personnel Management and the Social Security Administration. SUMMARY: In accordance with the Privacy Act of 1974 (5 U.S.C. 552a), as...

  7. 78 FR 35647 - Privacy Act of 1974; Computer Matching Program

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-06-13

    ... OFFICE OF PERSONNEL MANAGEMENT Privacy Act of 1974; Computer Matching Program AGENCY: Office of Personnel Management. ACTION: Notice of computer matching between the Office of Personnel Management and the Social Security Administration (CMA 1045). SUMMARY: In accordance with the Privacy Act of 1974 (5 U.S.C...

  8. 75 FR 17788 - Privacy Act of 1974; Computer Matching Program

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-04-07

    ... OFFICE OF PERSONNEL MANAGEMENT Privacy Act of 1974; Computer Matching Program AGENCY: Office of Personnel Management. ACTION: Notice--computer matching between the Office of Personnel Management and the Social Security Administration. SUMMARY: In accordance with the Privacy Act of 1974 (5 U.S.C. 552a), as...

  9. 75 FR 31819 - Privacy Act of 1974; Computer Matching Program

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-06-04

    ... OFFICE OF PERSONNEL MANAGEMENT Privacy Act of 1974; Computer Matching Program AGENCY: Office of Personnel Management. ACTION: Notice--computer matching between the Office of Personnel Management and the Social Security Administration. SUMMARY: In accordance with the Privacy Act of 1974 (5 U.S.C. 552a), as...

  10. Risk Management Implementation Tool

    NASA Technical Reports Server (NTRS)

    Wright, Shayla L.

    2004-01-01

    Continuous Risk Management (CRM) is a software engineering practice with processes, methods, and tools for managing risk in a project. It provides a controlled environment for practical decision making: continually assessing what could go wrong, determining which risks are important to deal with, implementing strategies to deal with those risks, and measuring the effectiveness of the implemented strategies. Continuous Risk Management provides many training workshops and courses to teach staff how to apply risk management to their various experiments and projects. The steps of the CRM process are identification, analysis, planning, tracking, and control. These steps, and the various methods and tools that go along with them, make identifying and dealing with risk clear-cut. The office that I worked in was the Risk Management Office (RMO). The RMO at NASA works hard to uphold NASA's mission of exploration and advancement of scientific knowledge and technology by defining and reducing program risk. The RMO is one of the divisions that fall under the Safety and Assurance Directorate (SAAD). I worked under Cynthia Calhoun, Flight Software Systems Engineer. My task was to develop a help screen for the Continuous Risk Management Implementation Tool (RMIT). The Risk Management Implementation Tool will be used by many NASA managers to identify, analyze, track, control, and communicate risks in their programs and projects. The RMIT will provide a means for NASA to continuously assess risks. The goals and purposes of this tool are to provide a simple means to manage risks, to be used by program and project managers throughout NASA for managing risk, and to take an aggressive approach to advertising and advocating the use of RMIT at each NASA center.
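
    A minimal way to picture the CRM steps listed above (identify, analyze, plan, track, control) is a small risk-register structure in which each risk carries a statement, likelihood and impact scores, a handling plan, and a status log. The sketch below is a hypothetical illustration of that structure, not RMIT's actual data model; the scoring scale and example risks are invented.

```python
# Hypothetical, minimal risk-register structure following the CRM steps
# named in the abstract (identify, analyze, plan, track, control).
from dataclasses import dataclass, field
from typing import List

@dataclass
class Risk:
    statement: str                 # identify: condition -> consequence
    likelihood: int                # analyze: 1 (low) .. 5 (high)
    impact: int                    # analyze: 1 (low) .. 5 (high)
    timeframe: str                 # analyze: near / mid / far
    plan: str = "watch"            # plan: research, accept, watch, or mitigate
    status_log: List[str] = field(default_factory=list)   # track
    closed: bool = False           # control

    @property
    def score(self) -> int:
        return self.likelihood * self.impact

register = [
    Risk("Late sensor delivery -> slip of integration test", 4, 3, "near", plan="mitigate"),
    Risk("Single-string avionics -> loss of science data", 2, 5, "mid", plan="research"),
]
register.sort(key=lambda r: r.score, reverse=True)         # communicate the top risks first
for r in register:
    r.status_log.append("reviewed at monthly risk board")
    print(f"{r.score:2d}  {r.plan:8s} {r.statement}")
```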

  11. An integrated simulation and optimization approach for managing human health risks of atmospheric pollutants by coal-fired power plants.

    PubMed

    Dai, C; Cai, X H; Cai, Y P; Guo, H C; Sun, W; Tan, Q; Huang, G H

    2014-06-01

    This research developed a simulation-aided nonlinear programming model (SNPM). This model incorporated the consideration of pollutant dispersion modeling, and the management of coal blending and the related human health risks, within a general modeling framework. In SNPM, the simulation effort (i.e., California puff [CALPUFF]) was used to forecast the fate of air pollutants for quantifying the health risk under various conditions, while the optimization studies were used to identify the optimal coal blending strategies from a number of alternatives. To solve the model, a surrogate-based indirect search approach was proposed, where support vector regression (SVR) was used to create a set of easy-to-use and rapid-response surrogates for identifying the functional relationships between coal-blending operating conditions and health risks. By replacing CALPUFF and the corresponding hazard quotient equation with the surrogates, computational efficiency could be improved. The developed SNPM was applied to minimize the human health risk associated with air pollutants discharged from the Gaojing and Shijingshan power plants in the west of Beijing. Solution results indicated that it could be used for reducing the health risk of the public in the vicinity of the two power plants, identifying desired coal blending strategies for decision makers, and striking a proper balance between coal purchase cost and human health risk. A simulation-aided nonlinear programming model (SNPM) is developed. It integrates the advantages of CALPUFF and a nonlinear programming model. To solve the model, a surrogate-based indirect search approach based on the combination of support vector regression and a genetic algorithm is proposed. SNPM is applied to reduce the health risk caused by air pollutants discharged from the Gaojing and Shijingshan power plants in the west of Beijing. Solution results indicate that it is useful for generating coal blending schemes, reducing the health risk of the public, and reflecting the trade-off between coal purchase cost and health risk.
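
    The surrogate-based indirect search described above can be sketched in a few lines: sample an "expensive" model, fit a support vector regression surrogate, and let a global optimizer query the surrogate instead of the full model. In the sketch below the dispersion/health-risk model is a stand-in toy function, the single blending variable, cost curve, and risk limit are invented, and the optimizer is SciPy's differential evolution rather than the genetic algorithm used in the paper.

```python
# Hedged sketch of surrogate-based indirect search: an SVR surrogate replaces
# the "expensive" risk model, and a global optimizer queries it cheaply.
import numpy as np
from sklearn.svm import SVR
from scipy.optimize import differential_evolution

rng = np.random.default_rng(0)

def expensive_risk_model(frac_low_sulfur):
    """Stand-in for CALPUFF + hazard quotient: risk falls as the cleaner coal share rises."""
    x = np.asarray(frac_low_sulfur, dtype=float)
    return 1.8 * np.exp(-2.5 * x) + 0.05 * rng.normal(size=x.shape)

# 1) sample the expensive model at a few design points and fit the surrogate
X_train = np.linspace(0.0, 1.0, 25).reshape(-1, 1)
y_train = expensive_risk_model(X_train.ravel())
surrogate = SVR(kernel="rbf", C=10.0, epsilon=0.01).fit(X_train, y_train)

# 2) optimize coal purchase cost with a penalty when surrogate risk exceeds the limit
RISK_LIMIT = 0.5                                    # acceptable hazard quotient (illustrative)
def objective(x):
    frac = x[0]
    cost = 40.0 + 25.0 * frac                       # $/t rises with the low-sulfur share
    risk = surrogate.predict([[frac]])[0]
    return cost + 1e3 * max(0.0, risk - RISK_LIMIT)  # penalty for violating the limit

result = differential_evolution(objective, bounds=[(0.0, 1.0)], seed=1)
print(f"suggested low-sulfur share: {result.x[0]:.2f}, objective: {result.fun:.1f}")
```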

  12. Information Risk Management and Resilience

    NASA Astrophysics Data System (ADS)

    Dynes, Scott

    Are the levels of information risk management efforts within and between firms correlated with the resilience of the firms to information disruptions? This paper examines the question by considering the results of field studies of information risk management practices at organizations and in supply chains. The organizations investigated differ greatly in the degree of coupling from a general and information risk management standpoint, as well as in the levels of internal awareness and activity regarding information risk management. The comparison of the levels of information risk management in the firms and their actual or inferred resilience indicates that a formal information risk management approach is not necessary for resilience in certain sectors.

  13. Research on Risk Manage of Power Construction Project Based on Bayesian Network

    NASA Astrophysics Data System (ADS)

    Jia, Zhengyuan; Fan, Zhou; Li, Yong

    With China's changing economic structure and increasingly fierce competition in the market, the uncertainty and risk factors in electric power construction projects are increasingly complex; such projects face huge risks, or may even fail, if these risk factors are ignored or not properly considered. Risk management therefore plays an important role in electric power construction projects. The paper elaborated the influence of cost risk in electric power projects through a study of overall risk management and of individual behavior in risk management, and introduced the Bayesian network into project risk management. The paper obtained the ordering of key risk factors from both scenario analysis and causal analysis, supporting effective risk management.
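
    The Bayesian-network idea in this abstract can be illustrated with a toy network in which two root risk factors feed a cost-overrun node, and observing one factor updates the overrun probability. The structure and all conditional probabilities below are invented for illustration; they are not taken from the paper, and a real application would use many more nodes and elicited probabilities.

```python
# Toy Bayesian network for project cost risk, solved by brute-force enumeration.
# All probabilities are invented for illustration.
from itertools import product

P_delay = {True: 0.30, False: 0.70}          # schedule delay of key equipment
P_change = {True: 0.20, False: 0.80}         # late design change
# P(cost overrun | delay, change)
P_overrun = {(True, True): 0.90, (True, False): 0.60,
             (False, True): 0.50, (False, False): 0.10}

def joint(delay, change, overrun):
    p = P_delay[delay] * P_change[change]
    p_o = P_overrun[(delay, change)]
    return p * (p_o if overrun else 1.0 - p_o)

def prob_overrun(evidence=None):
    """P(overrun=True | evidence) by enumerating all network states."""
    evidence = evidence or {}
    num = den = 0.0
    for delay, change, overrun in product([True, False], repeat=3):
        state = {"delay": delay, "change": change, "overrun": overrun}
        if any(state[k] != v for k, v in evidence.items()):
            continue
        p = joint(delay, change, overrun)
        den += p
        if overrun:
            num += p
    return num / den

print(f"prior P(overrun)            = {prob_overrun():.3f}")
print(f"P(overrun | delay observed) = {prob_overrun({'delay': True}):.3f}")
```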

  14. Review on pen-and-paper-based observational methods for assessing ergonomic risk factors of computer work.

    PubMed

    Rahman, Mohd Nasrull Abdol; Mohamad, Siti Shafika

    2017-01-01

    Computer work is associated with Musculoskeletal Disorders (MSDs). Several methods have been developed to assess computer-work risk factors related to MSDs. This review aims to give an overview of the current pen-and-paper-based observational methods available for assessing ergonomic risk factors of computer work. We searched an electronic database for materials from 1992 until 2015. The selected methods were focused on computer work, pen-and-paper observational methods, office risk factors, and musculoskeletal disorders. This review was developed to assess the risk factors, reliability, and validity of pen-and-paper observational methods associated with computer work. Two evaluators independently carried out this review. Seven observational methods used to assess exposure to office risk factors for work-related musculoskeletal disorders were identified. The risk factors covered by current pen-and-paper-based observational tools were postures, office components, force, and repetition. Of the seven methods, only five had been tested for reliability; they were proven to be reliable and were rated as moderate to good. For validity testing, only four of the seven methods were tested, and the results were moderate. Many observational tools already exist, but no single tool appears to cover all of the risk factors, including working posture, office components, force, repetition, and office environment, at office workstations and computer work. Although the most important factor in developing a tool is proper validation of exposure assessment techniques, several of the existing observational methods were not tested for reliability and validity. Furthermore, this review could provide researchers with ways to improve pen-and-paper-based observational methods for assessing ergonomic risk factors of computer work.

  15. Probabilistic/Fracture-Mechanics Model For Service Life

    NASA Technical Reports Server (NTRS)

    Watkins, T., Jr.; Annis, C. G., Jr.

    1991-01-01

    Computer program makes probabilistic estimates of lifetime of engine and components thereof. Developed to fill need for more accurate life-assessment technique that avoids errors in estimated lives and provides for statistical assessment of levels of risk created by engineering decisions in designing system. Implements mathematical model combining techniques of statistics, fatigue, fracture mechanics, nondestructive analysis, life-cycle cost analysis, and management of engine parts. Used to investigate effects of such engine-component life-controlling parameters as return-to-service intervals, stresses, capabilities for nondestructive evaluation, and qualities of materials.
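
    The combination of fracture mechanics, fatigue, and statistics mentioned in this abstract can be illustrated with a simple Monte Carlo over Paris-law crack growth: sample the initial flaw size (what inspection might miss) and the crack-growth coefficient, integrate the growth law up to a critical crack size, and estimate the probability that life falls short of an inspection interval. The sketch below is only a generic illustration of that approach; the material constants, stress range, geometry factor, and interval are assumed values, not those of the program described.

```python
# Generic probabilistic fracture-mechanics sketch: Monte Carlo over Paris-law
# crack growth. All material and loading values are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(7)
N_SAMPLES = 50_000

m = 3.0                          # Paris exponent
Y = 1.12                         # geometry factor
d_sigma = 100.0                  # stress range, MPa
K_IC = 60.0                      # fracture toughness, MPa*sqrt(m)
a_crit = (K_IC / (Y * d_sigma)) ** 2 / np.pi                          # critical crack size, m

a0 = rng.lognormal(mean=np.log(1.0e-3), sigma=0.4, size=N_SAMPLES)    # initial flaw size, m
C = rng.lognormal(mean=np.log(1.0e-11), sigma=0.3, size=N_SAMPLES)    # Paris coefficient

# Closed-form integration of da/dN = C*(Y*d_sigma*sqrt(pi*a))**m for m != 2:
#   N = (a0**(1-m/2) - a_crit**(1-m/2)) / ((m/2 - 1) * C * (Y*d_sigma*sqrt(pi))**m)
geom = (Y * d_sigma * np.sqrt(np.pi)) ** m
cycles_to_failure = (a0 ** (1 - m / 2) - a_crit ** (1 - m / 2)) / ((m / 2 - 1) * C * geom)

inspection_interval = 3.0e5      # cycles between inspections (illustrative)
p_fail = float(np.mean(cycles_to_failure < inspection_interval))
print(f"median life: {np.median(cycles_to_failure):.2e} cycles")
print(f"P(life < {inspection_interval:.0e} cycles) = {p_fail:.4f}")
```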

  16. Risk Management and Physical Modelling for Mountainous Natural Hazards

    NASA Astrophysics Data System (ADS)

    Lehning, Michael; Wilhelm, Christian

    Population growth and climate change cause rapid changes in mountainous regions, resulting in increased risks of floods, avalanches, debris flows and other natural hazards. Xevents are of particular concern, since attempts to protect against them result in exponentially growing costs. In this contribution, we suggest an integral risk management approach to dealing with natural hazards that occur in mountainous areas. Using the example of a mountain pass road, which can be protected from the danger of an avalanche by engineering (galleries) and/or organisational (road closure) measures, we show the advantage of an optimal combination of both versus the traditional approach, which is to rely solely on engineering structures. Organisational measures become especially important for Xevents because engineering structures cannot be designed for those events. However, organisational measures need a reliable and objective forecast of the hazard. Therefore, we further suggest that such forecasts should be developed using physical numerical modelling. We present the status of current approaches to using physical modelling to predict snow cover stability for avalanche warnings and peak runoff from mountain catchments for flood warnings. While detailed physical models can already predict peak runoff reliably, they are only used to support avalanche warnings. With increased process knowledge and computer power, current developments should lead to an enhanced role for detailed physical models in natural mountain hazard prediction.

  17. Scope Complexity Options Risks Excursions (SCORE) Factor Mathematical Description.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gearhart, Jared Lee; Samberson, Jonell Nicole; Shettigar, Subhasini

    The purpose of the Scope, Complexity, Options, Risks, Excursions (SCORE) model is to estimate the relative complexity of design variants of future warhead options, resulting in scores. SCORE factors extend this capability by providing estimates of complexity relative to a base system (i.e., all design options are normalized to one weapon system). First, a clearly defined set of scope elements for a warhead option is established. The complexity of each scope element is estimated by Subject Matter Experts (SMEs), including a level of uncertainty, relative to a specific reference system. When determining factors, complexity estimates for a scope element can be directly tied to the base system or chained together via comparable scope elements in a string of reference systems that ends with the base system. The SCORE analysis process is a growing multi-organizational Nuclear Security Enterprise (NSE) effort, under the management of the NA-12 led Enterprise Modeling and Analysis Consortium (EMAC). Historically, it has provided the data elicitation, integration, and computation needed to support the out-year Life Extension Program (LEP) cost estimates included in the Stockpile Stewardship Management Plan (SSMP).

  18. Scope Complexity Options Risks Excursions (SCORE) Version 3.0 Mathematical Description.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gearhart, Jared Lee; Samberson, Jonell Nicole; Shettigar, Subhasini

    The purpose of the Scope, Complexity, Options, Risks, Excursions (SCORE) model is to estimate the relative complexity of design variants of future warhead options. The results of this model allow those considering these options to understand the complexity tradeoffs between proposed warhead options. The core idea of SCORE is to divide a warhead option into a well-defined set of scope elements and then estimate the complexity of each scope element against a well understood reference system. The uncertainty associated with estimates can also be captured. A weighted summation of the relative complexity of each scope element is used to determine the total complexity of the proposed warhead option or portions of the warhead option (i.e., a National Work Breakdown Structure code). The SCORE analysis process is a growing multi-organizational Nuclear Security Enterprise (NSE) effort, under the management of the NA-12 led Enterprise Modeling and Analysis Consortium (EMAC), that has provided the data elicitation, integration and computation needed to support the out-year Life Extension Program (LEP) cost estimates included in the Stockpile Stewardship Management Plan (SSMP).
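
    The core calculation described above, a weighted summation of expert-estimated relative complexities that each carry uncertainty, can be illustrated numerically. The sketch below propagates triangular uncertainty ranges through the weighted sum by Monte Carlo; the element names, weights, and estimates are invented and do not reflect any actual SCORE data.

```python
# Hypothetical weighted-summation example: each scope element has a
# (weight, low, most-likely, high) relative-complexity estimate, and the
# option's total relative complexity is summarized with percentiles.
import numpy as np

rng = np.random.default_rng(3)

scope_elements = {
    "design engineering":   (0.35, 0.9, 1.1, 1.5),
    "component production": (0.25, 0.8, 1.0, 1.3),
    "integration":          (0.20, 1.0, 1.4, 2.0),
    "qualification testing":(0.20, 0.9, 1.2, 1.6),
}

def total_complexity_samples(elements, n=100_000):
    """Propagate triangular uncertainty through the weighted summation."""
    total = np.zeros(n)
    for weight, low, mode, high in elements.values():
        total += weight * rng.triangular(low, mode, high, size=n)
    return total

samples = total_complexity_samples(scope_elements)
lo, med, hi = np.percentile(samples, [10, 50, 90])
print(f"relative complexity vs. base system: {med:.2f} (10th-90th pct: {lo:.2f}-{hi:.2f})")
```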

  19. 12 CFR 917.3 - Risk management.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 12 Banks and Banking 8 2012-01-01 2012-01-01 false Risk management. 917.3 Section 917.3 Banks and Banking FEDERAL HOUSING FINANCE BOARD GOVERNANCE AND MANAGEMENT OF THE FEDERAL HOME LOAN BANKS POWERS AND RESPONSIBILITIES OF BANK BOARDS OF DIRECTORS AND SENIOR MANAGEMENT § 917.3 Risk management. (a) Risk management...

  20. 12 CFR 917.3 - Risk management.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 12 Banks and Banking 8 2013-01-01 2013-01-01 false Risk management. 917.3 Section 917.3 Banks and Banking FEDERAL HOUSING FINANCE BOARD GOVERNANCE AND MANAGEMENT OF THE FEDERAL HOME LOAN BANKS POWERS AND RESPONSIBILITIES OF BANK BOARDS OF DIRECTORS AND SENIOR MANAGEMENT § 917.3 Risk management. (a) Risk management...

  1. 12 CFR 917.3 - Risk management.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 12 Banks and Banking 7 2011-01-01 2011-01-01 false Risk management. 917.3 Section 917.3 Banks and Banking FEDERAL HOUSING FINANCE BOARD GOVERNANCE AND MANAGEMENT OF THE FEDERAL HOME LOAN BANKS POWERS AND RESPONSIBILITIES OF BANK BOARDS OF DIRECTORS AND SENIOR MANAGEMENT § 917.3 Risk management. (a) Risk management...

  2. 12 CFR 917.3 - Risk management.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 12 Banks and Banking 7 2010-01-01 2010-01-01 false Risk management. 917.3 Section 917.3 Banks and Banking FEDERAL HOUSING FINANCE BOARD GOVERNANCE AND MANAGEMENT OF THE FEDERAL HOME LOAN BANKS POWERS AND RESPONSIBILITIES OF BANK BOARDS OF DIRECTORS AND SENIOR MANAGEMENT § 917.3 Risk management. (a) Risk management...

  3. Systems Analysis Directorate Activities Summary August 1977

    DTIC Science & Technology

    1977-09-01

    are: a. Cataloging direction; b. Requirements computation; c. Procurement direction; d. Distribution management; e. Disposal direction; f. ... "Inventory management," as a responsibility of NICPs, includes cataloging, requirements computation, procurement direction, distribution management, maintenance ... functions are cataloging, major item management, secondary item management, procurement direction, distribution management, overhaul and rebuild

  4. The roles of 'subjective computer training' and management support in the use of computers in community health centres.

    PubMed

    Yaghmaie, Farideh; Jayasuriya, Rohan

    2004-01-01

    There have been many changes made to information systems in the last decade. Changes in information systems require users constantly to update their computer knowledge and skills. Computer training is a critical issue for any user because it offers them considerable new skills. The purpose of this study was to measure the effects of 'subjective computer training' and management support on attitudes to computers, computer anxiety and subjective norms to use computers. The data were collected from community health centre staff. The results of the study showed that health staff trained in computer use had more favourable attitudes to computers, less computer anxiety and more awareness of others' expectations about computer use than untrained users. However, there was no relationship between management support and computer attitude, computer anxiety or subjective norms. Lack of computer training for the majority of healthcare staff confirmed the need for more attention to this issue, particularly in health centres.

  5. Data Storage and Transfer | High-Performance Computing | NREL

    Science.gov Websites

    High-Performance Computing (HPC) systems. WinSCP for Windows file transfers: use to transfer files from a local computer to a remote computer. Robinhood for file management: use this tool to manage your data files on Peregrine. Best

  6. Risk Management in EVA

    NASA Technical Reports Server (NTRS)

    Hall, Jonathan; Lutomski, M.

    2006-01-01

    This viewgraph presentation reviews the use of risk management in Extravehicular Activities (EVA). The contents include: 1) EVA Office at NASA - JSC; 2) EVA Project Risk Management: Why and When; 3) EVA Office Risk Management: How; 4) Criteria for Closing a Risk; 5) Criteria for Accepting a Risk; 6) ISS IRMA Reference Card Data Entry Requirements; 7) XA/EVA Office Risk Activity Summary; 8) EVA Significant Change Summary; 9) Integrated Risk Management Application (XA) Matrix, March 31, 2004; 10) ISS Watch Item: 50XX Summary Report; and 11) EVA Project RM Usefulness

  7. Requirements for company-wide management

    NASA Technical Reports Server (NTRS)

    Southall, J. W.

    1980-01-01

    Computing system requirements were developed for company-wide management of information and computer programs in an engineering data processing environment. The requirements are essential to the successful implementation of a computer-based engineering data management system; they exceed the capabilities provided by the commercially available data base management systems. These requirements were derived from a study entitled The Design Process, which was prepared by design engineers experienced in development of aerospace products.

  8. Continuous Risk Management at NASA

    NASA Technical Reports Server (NTRS)

    Hammer, Theodore F.; Rosenberg, Linda

    1999-01-01

    NPG 7120.5A, "NASA Program and Project Management Processes and Requirements" enacted in April, 1998, requires that "The program or project manager shall apply risk management principles..." The Software Assurance Technology Center (SATC) at NASA GSFC has been tasked with the responsibility for developing and teaching a systems level course for risk management that provides information on how to comply with this edict. The course was developed in conjunction with the Software Engineering Institute at Carnegie Mellon University, then tailored to the NASA systems community. This presentation will briefly discuss the six functions for risk management: (1) Identify the risks in a specific format; (2) Analyze the risk probability, impact/severity, and timeframe; (3) Plan the approach; (4) Track the risk through data compilation and analysis; (5) Control and monitor the risk; (6) Communicate and document the process and decisions. This risk management structure of functions has been taught to projects at all NASA Centers and is being successfully implemented on many projects. This presentation will give project managers the information they need to understand if risk management is to be effectively implemented on their projects at a cost they can afford.

  9. Managing Laboratory Data Using Cloud Computing as an Organizational Tool

    ERIC Educational Resources Information Center

    Bennett, Jacqueline; Pence, Harry E.

    2011-01-01

    One of the most significant difficulties encountered when directing undergraduate research and developing new laboratory experiments is how to efficiently manage the data generated by a number of students. Cloud computing, where both software and computer files reside online, offers a solution to this data-management problem and allows researchers…

  10. Managing health and safety risks: Implications for tailoring health and safety management system practices.

    PubMed

    Willmer, D R; Haas, E J

    2016-01-01

    As national and international health and safety management system (HSMS) standards are voluntarily accepted or regulated into practice, organizations are making an effort to modify and integrate strategic elements of a connected management system into their daily risk management practices. In high-risk industries such as mining, that effort takes on added importance. The mining industry has long recognized the importance of a more integrated approach to recognizing and responding to site-specific risks, encouraging the adoption of a risk-based management framework. Recently, the U.S. National Mining Association led the development of an industry-specific HSMS built on the strategic frameworks of ANSI: Z10, OHSAS 18001, The American Chemistry Council's Responsible Care, and ILO-OSH 2001. All of these standards provide strategic guidance and focus on how to incorporate a plan-do-check-act cycle into the identification, management and evaluation of worksite risks. This paper details an exploratory study into whether practices associated with executing a risk-based management framework are visible through the actions of an organization's site-level management of health and safety risks. The results of this study show ways that site-level leaders manage day-to-day risk at their operations that can be characterized according to practices associated with a risk-based management framework. Having tangible operational examples of day-to-day risk management can serve as a starting point for evaluating field-level risk assessment efforts and their alignment to overall company efforts at effective risk mitigation through a HSMS or other processes.

  11. Risk management frameworks for human health and environmental risks.

    PubMed

    Jardine, Cindy; Hrudey, Steve; Shortreed, John; Craig, Lorraine; Krewski, Daniel; Furgal, Chris; McColl, Stephen

    2003-01-01

    A comprehensive analytical review of the risk assessment, risk management, and risk communication approaches currently being undertaken by key national, provincial/state, territorial, and international agencies was conducted. The information acquired for review was used to identify the differences, commonalities, strengths, and weaknesses among the various approaches, and to identify elements that should be included in an effective, current, and comprehensive approach applicable to environmental, human health and occupational health risks. More than 80 agencies, organizations, and advisory councils, encompassing more than 100 risk documents, were examined during the period from February 2000 until November 2002. An overview was made of the most important general frameworks for risk assessment, risk management, and risk communication for human health and ecological risk, and for occupational health risk. In addition, frameworks for specific applications were reviewed and summarized, including those for (1)contaminated sites; (2) northern contaminants; (3) priority substances; (4) standards development; (5) food safety; (6) medical devices; (7) prescription drug use; (8) emergency response; (9) transportation; (10) risk communication. Twelve frameworks were selected for more extensive review on the basis of representation of the areas of human health, ecological, and occupational health risk; relevance to Canadian risk management needs; representation of comprehensive and well-defined approaches; generalizability with their risk areas; representation of "state of the art" in Canada, the United States, and/or internationally; and extent of usage of potential usage within Canada. These 12 frameworks were: 1. Framework for Environmental Health Risk Management (US Presidential/Congressional Commission on Risk Assessment and Risk Management, 1997). 2. Health Risk Determination: The Challenge of Health Protection (Health and Welfare Canada, 1990). 3. Health Canada Decision-Making Framework for Identifying, Assessing and Managing Health Risks (Health Canada, 2000). 4. Canadian Environmental Protection Act: Human Health Risk Assessment of Priority Substances(Health Canada, 1994). 5. CSA-Q8550 Risk Management: Guidelines for Decision-Makers (Canada Standards Association, 1997). 6. Risk Assessment in the Federal Government: Managing the Process (US National Research Council, 1983). 7. Understanding Risk: Informing Decisions in a Democratic Society (US National Research Council, 1996). 8. Environmental Health Risk Assessment (enHealth Council of Australia, 2002). 9. A Framework for Ecological Risk Assessment (CCME, 1996). 10. Ecological Risk Assessments of Priority Substances Under the Canadian Environmental Protection Act (Environment Canada, 1996).11. Guidelines for Ecological Risk Assessment (US EPA, 1998b). 12. Proposed Model for Occupational Health Risk Assessment and Management (Rampal & Sadhra, 1999). Based on the extensive review of these frameworks, seven key elements that should be included in a comprehensive framework for human health, ecological, and occupational risk assessment and management were identified: 1. Problem formulation stage. 2. Stakeholder involvement. 3. Communication. 4. Quantitative risk assessment components. 5. Iteration and evaluation. 6. Informed decision making. 7. Flexibility. On the basis of this overarching approach to risk management, the following "checklist" to ensure a good risk management decision is proposed: - Make sure you're solving the right problem. 
- Consider the problem and the risk within the full context of the situation, using a broad perspective. - Acknowledge, incorporate, and balance the multiple dimensions of risk. - Ensure the highest degree of reliability for all components of the risk management process. - Involve interested and affected parties from the outset of the process. - Commit to honest and open communication between all parties. - Employ continuous evaluation throughout the process (formative, process, and outcome evaluation), and be prepared to change the decision if new information becomes available. Comprehensive and sound principles are critical to providing structure and integrity to risk management frameworks. Guiding principles are intended to provide an ethical grounding for considering the many factors involved in risk management decision making. Ten principles are proposed to guide risk management decision making. The first four principles were adapted and modified from Hattis (1996) along with the addition of two more principles by Hrudey (2000). These have been supplemented by another four principles to make the 10 presented. The principles are based in fundamental ethical principles and values. These principles are intended to be aspirational rather than prescriptive--their application requires flexibility and practical judgement. Risk management is inherently a process in search of balance among competing interests and concerns. Each risk management decision will be a "balancing act" of competing priorities, and trade-offs may sometimes have to be made between seemingly conflicting principles. The 10 decision-making principles, with the corresponding ethical principle in parentheses, are: 1. Do more good than harm (beneficence, nonmaleficence). - The ultimate goal of good risk management is to prevent or minimize risk, or to "do good" as much as possible. 2. Fair process of decision making (fairness, natural justice). - Risk management must be just, equitable, impartial, unbiased, dispassionate, and objective as far as possible given the circumstances of each situation. 3. Ensure an equitable distribution of risk (equity). - An equitable process of risk management would ensure fair outcomes and equal treatment of all concerned through an equal distribution of benefits and burdens (includes the concept of distributive justice, i.e., equal opportunities for all individuals). 4. Seek optimal use of limited risk management resources (utility). - Optimal risk management demands using limited resources where they will achieve the most risk reduction of overall benefit. 5. Promise no more risk management than can be delivered (honesty). - Unrealistic expectations of risk management can be avoided with honest and candid public accounting of what we know and don't know, and what we can and can't do using risk assessment and risk management. 6. Impose no more risk than you would tolerate yourself (the Golden Rule). - The Golden Rule is important in risk management because it forces decision makers to abandon complete detachment from their decisions so they may understand the perspectives of those affected. 7. Be cautious in the face of uncertainty ("better safe than sorry"). - Risk management must adopt a cautious approach when faced with a potentially serious risk, even if the evidence is uncertain. 8. Foster informed risk decision making for all stakeholders (autonomy). 
- Fostering autonomous decision making involves both providing people with the opportunity to participate, and full and honest disclosure of all the information required for informed decisions. 9. Risk management processes must be flexible and evolutionary to be open to new knowledge and understanding (evolution, evaluation, iterative process). - The incorporation of new evidence requires that risk management be a flexible, evolutionary, and iterative process, and that evaluation is employed at the beginning and throughout the process. 10. The complete elimination of risk is not possible (life is not risk free). - Risk is pervasive in our society, and cannot be totally eliminated despite an oft-expressed public desire for "zero risk". However, the level of risk that may be tolerable by any individual is dependent on values and beliefs, as well as scientific information. Each agency must continue to employ a process that meets the needs of their specific application of risk management. A single approach cannot satisfy the diverse areas to which risk decisions are being applied. However, with increasing experience in the application of the approaches, we are evolving to a common understanding of the essential elements and principles required for successful risk assessment, risk management, and risk communication. Risk management will continue to be a balancing act of competing priorities and needs. Flexibility and good judgement are ultimately the key to successfully making appropriate risk decisions.

  12. [Global risk management].

    PubMed

    Sghaier, W; Hergon, E; Desroches, A

    2015-08-01

    Risk management is a fundamental component of any successful company, whether in its economic, societal or environmental aspects. Risk management is an especially important activity for companies for which the challenge of ensuring optimal safety of products and services is great. This is the case especially for health sector institutions. Risk management is therefore a decision support tool and a means to ensure the sustainability of an organization. In this context, what methods and approaches are implemented to manage risks? Through this state of the art, we first consider the concept of risk and the risk management process. Then we focus on the different methods of risk management and the criteria for choosing among these methods. Finally, we highlight the need to supplement these methods with a systemic and global approach, including risk assessment through audits. Copyright © 2015 Elsevier Masson SAS. All rights reserved.

  13. 12 CFR 615.5182 - Interest rate risk management by associations and other Farm Credit System institutions other...

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 12 Banks and Banking 6 2011-01-01 2011-01-01 false Interest rate risk management by associations... OPERATIONS, AND FUNDING OPERATIONS Risk Assessment and Management § 615.5182 Interest rate risk management by... shall comply with the requirements of §§ 615.5180 and 615.5181. The interest rate risk management...

  14. Risk preferences in strategic wildfire decision making: A choice experiment with U.S. wildfire managers

    Treesearch

    Matthew J. Wibbenmeyer; Michael S. Hand; David E. Calkin; Tyron J. Venn; Matthew P. Thompson

    2013-01-01

    Federal policy has embraced risk management as an appropriate paradigm for wildfire management. Economic theory suggests that over repeated wildfire events, potential economic costs and risks of ecological damage are optimally balanced when management decisions are free from biases, risk aversion, and risk seeking. Of primary concern in this article is how managers...

  15. 12 CFR 615.5182 - Interest rate risk management by associations and other Farm Credit System institutions other...

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 12 Banks and Banking 6 2010-01-01 2010-01-01 false Interest rate risk management by associations... OPERATIONS, AND FUNDING OPERATIONS Risk Assessment and Management § 615.5182 Interest rate risk management by... shall comply with the requirements of §§ 615.5180 and 615.5181. The interest rate risk management...

  16. Rapidly assessing the probability of exceptionally high natural hazard losses

    NASA Astrophysics Data System (ADS)

    Gollini, Isabella; Rougier, Jonathan

    2014-05-01

    One of the objectives in catastrophe modeling is to assess the probability distribution of losses for a specified period, such as a year. From the point of view of an insurance company, the whole of the loss distribution is interesting, and valuable in determining insurance premiums. But the shape of the righthand tail is critical, because it impinges on the solvency of the company. A simple measure of the risk of insolvency is the probability that the annual loss will exceed the company's current operating capital. Imposing an upper limit on this probability is one of the objectives of the EU Solvency II directive. If a probabilistic model is supplied for the loss process, then this tail probability can be computed, either directly, or by simulation. This can be a lengthy calculation for complex losses. Given the inevitably subjective nature of quantifying loss distributions, computational resources might be better used in a sensitivity analysis. This requires either a quick approximation to the tail probability or an upper bound on the probability, ideally a tight one. We present several different bounds, all of which can be computed nearly instantly from a very general event loss table. We provide a numerical illustration, and discuss the conditions under which the bound is tight. Although we consider the perspective of insurance and reinsurance companies, exactly the same issues concern the risk manager, who is typically very sensitive to large losses.
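
    Two of the quantities discussed above can be illustrated with a small, self-contained example: a Monte Carlo estimate of the probability that annual loss exceeds operating capital, computed from an event loss table under an independent-Poisson assumption, and a bound that can be computed nearly instantly from the table's expected annual loss alone (Markov's inequality, which is valid but typically loose rather than tight). The event table and capital figure below are made up.

```python
# Tail probability of annual loss from a toy event loss table:
# (1) a quick Markov upper bound, (2) a Monte Carlo estimate assuming
# independent Poisson event occurrences. All numbers are illustrative.
import numpy as np

rng = np.random.default_rng(11)

# event loss table: annual occurrence rate and loss if the event occurs
rates  = np.array([0.20, 0.05, 0.02, 0.005])        # events per year
losses = np.array([5e6,  2e7,  8e7,  3e8])          # currency units

capital = 1.0e8                                      # current operating capital

# upper bound computed nearly instantly from the table alone:
expected_annual_loss = float(np.sum(rates * losses))
markov_bound = expected_annual_loss / capital        # P(L > c) <= E[L] / c

# Monte Carlo estimate of the same tail probability:
n_years = 200_000
counts = rng.poisson(rates, size=(n_years, rates.size))
annual_loss = counts @ losses
p_exceed = float(np.mean(annual_loss > capital))

print(f"E[annual loss]      = {expected_annual_loss:.3e}")
print(f"Markov upper bound  = {markov_bound:.3f}")
print(f"Monte Carlo P(L>c)  = {p_exceed:.4f}")
```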

  17. Impacts of Changing Climate, Hydrology and Land Use on the Stormwater Runoff of Urbanizing Central Florida

    NASA Astrophysics Data System (ADS)

    Huq, E.; Abdul-Aziz, O. I.

    2017-12-01

    We computed the historical and future storm runoff scenarios for the Shingle Creek Basin, including the growing urban centers of central Florida (e.g., City of Orlando). The US EPA Storm Water Management Model (SWMM 5.1) was used to develop a mechanistic hydrologic model for the basin by incorporating components of urban hydrology, hydroclimatological variables, and land use/cover features. The model was calibrated and validated with historical streamflow of 2004-2013 near the outlet of the Shingle Creek. The calibrated model was used to compute the sensitivities of the stormwater budget to reference changes in hydroclimatological variables (rainfall and evapotranspiration) and land use/cover features (imperviousness, roughness). Basin stormwater budgets for the historical (2010s = 2004-2013) and future periods (2050s = 2030-2059; 2080s = 2070-2099) were also computed based on downscaled climatic projections of 20 GCMs-RCMs representing the Coupled Model Intercomparison Project (CMIP5), and anticipated changes in land use/cover. The sensitivity analyses indicated the dominant drivers of urban runoff in the basin. Comparative assessment of the historical and future stormwater runoff scenarios helped to locate basin areas that would be at a higher risk of future stormwater flooding. The importance of the study lies in providing valuable guidelines for managing stormwater flooding in central Florida and similar growing urban centers around the world.

  18. Bridge-Scour Data Management System user's manual

    USGS Publications Warehouse

    Landers, Mark N.; Mueller, David S.; Martin, Gary R.

    1996-01-01

    The Bridge-Scour Data Management System (BSDMS) supports preparation, compilation, and analysis of bridge-scour data. The BSDMS provides interactive storage, retrieval, selection, editing, and display of bridge-scour data sets. Bridge-scour data sets include more than 200 site and measurement attributes of the channel geometry, flow hydraulics, hydrology, sediment, geomorphic-setting, location, and bridge specifications. This user's manual provides a general overview of the structure and organization of BSDMS data sets and detailed instructions to operate the program. Attributes stored by the BSDMS are described along with an illustration of the input screen where the attribute can be entered or edited. Measured scour depths can be compared with scour depths predicted by selected published equations using the BSDMS. The selected published equations available in the computational portion of the BSDMS are described. This manual is written for BSDMS, version 2.0. The data base will facilitate: (1) developing improved estimators of scour for specific regions or conditions; (2) describing scour processes; and (3) reducing risk from scour at bridges. BSDMS is available in DOS and UNIX versions. The program was written to be portable and, therefore, can be used on multiple computer platforms. Installation procedures depend on the computer platform, and specific installation instructions are distributed with the software. Sample data files and data sets of 384 pier-scour measurements from 56 bridges in 14 States are also distributed with the software.
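
    As an illustration of the "measured versus predicted" comparison described above, the sketch below evaluates one widely published pier-scour relation, the CSU/HEC-18 equation; whether this particular equation is among those implemented in BSDMS is an assumption here, and the depth, velocity, pier width and correction-factor values are made up for the example.

        def csu_pier_scour(y1, v1, a, k1=1.0, k2=1.0, k3=1.1, k4=1.0, g=9.81):
            """CSU/HEC-18 pier-scour estimate in SI units: approach depth y1 [m],
            approach velocity v1 [m/s], pier width a [m]; k1..k4 are the usual
            correction factors (nose shape, attack angle, bed condition, armouring)."""
            fr = v1 / (g * y1) ** 0.5                  # approach Froude number
            return 2.0 * y1 * k1 * k2 * k3 * k4 * (a / y1) ** 0.65 * fr ** 0.43

        # Compare a measured scour depth with the predicted value (made-up numbers).
        measured = 1.8                                  # m
        predicted = csu_pier_scour(y1=3.0, v1=1.2, a=1.5)
        print(f"predicted {predicted:.2f} m vs measured {measured:.2f} m")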

  19. Grid computing technology for hydrological applications

    NASA Astrophysics Data System (ADS)

    Lecca, G.; Petitdidier, M.; Hluchy, L.; Ivanovic, M.; Kussul, N.; Ray, N.; Thieron, V.

    2011-06-01

    Advances in e-Infrastructure promise to revolutionize sensing systems and the way in which data are collected and assimilated, and complex water systems are simulated and visualized. According to the EU Infrastructure 2010 work-programme, data and compute infrastructures and their underlying technologies, either oriented to tackle scientific challenges or complex problem solving in engineering, are expected to converge together into the so-called knowledge infrastructures, leading to more effective research, education and innovation in the next decade and beyond. Grid technology is recognized as a fundamental component of e-Infrastructures. Nevertheless, this emerging paradigm highlights several topics, including data management, algorithm optimization, security, performance (speed, throughput, bandwidth, etc.), and scientific cooperation and collaboration issues that require further examination to fully exploit it and to better inform future research policies. The paper illustrates the results of six different surface and subsurface hydrology applications that have been deployed on the Grid. All the applications aim to answer strong requirements from the Civil Society at large, relative to natural and anthropogenic risks. Grid technology has been successfully tested to improve flood prediction, groundwater resources management and Black Sea hydrological survey, by providing large computing resources. It is also shown that Grid technology facilitates e-cooperation among partners by means of services for authentication and authorization, seamless access to distributed data sources, data protection and access right, and standardization.

  20. 77 FR 9237 - Agency Information Collection Activities; Proposed Collection; Comment Request; Risk Management...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-02-16

    ... Activities; Proposed Collection; Comment Request; Risk Management Program Requirements and Petitions To..., non-chemical manufacturers, etc. Title: Risk Management Program Requirements and Petitions to Modify... regulated substance in a process develop and implement a risk management program and submit a risk...

  1. Potential impact of single-risk-factor versus total risk management for the prevention of cardiovascular events in Seychelles.

    PubMed

    Ndindjock, Roger; Gedeon, Jude; Mendis, Shanthi; Paccaud, Fred; Bovet, Pascal

    2011-04-01

    To assess the prevalence of cardiovascular (CV) risk factors in Seychelles, a middle-income African country, and compare the cost-effectiveness of single-risk-factor management (treating individuals with arterial blood pressure ≥ 140/90 mmHg and/or total serum cholesterol ≥ 6.2 mmol/l) with that of management based on total CV risk (treating individuals with a total CV risk ≥ 10% or ≥ 20%). CV risk factor prevalence and a CV risk prediction chart for Africa were used to estimate the 10-year risk of suffering a fatal or non-fatal CV event among individuals aged 40-64 years. These figures were used to compare single-risk-factor management with total risk management in terms of the number of people requiring treatment to avert one CV event and the number of events potentially averted over 10 years. Treatment for patients with high total CV risk (≥ 20%) was assumed to consist of a fixed-dose combination of several drugs (polypill). Cost analyses were limited to medication. A total CV risk of ≥ 10% and ≥ 20% was found among 10.8% and 5.1% of individuals, respectively. With single-risk-factor management, 60% of adults would need to be treated and 157 cardiovascular events per 100000 population would be averted per year, as opposed to 5% of adults and 92 events with total CV risk management. Management based on high total CV risk optimizes the balance between the number requiring treatment and the number of CV events averted. Total CV risk management is much more cost-effective than single-risk-factor management. These findings are relevant for all countries, but especially for those economically and demographically similar to Seychelles.
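
    The trade-off reported above can be reproduced with simple arithmetic. The sketch below uses only the figures quoted in the abstract (60% versus 5% of adults treated, 157 versus 92 events averted per 100,000 per year) to compare the two strategies in terms of persons treated per event averted; the population size is an illustrative assumption.

        # Back-of-the-envelope comparison of the two strategies, using only the
        # figures reported above, for an illustrative adult population of 100,000.
        population = 100_000

        strategies = {
            # name: (share of adults treated, CV events averted per 100,000 per year)
            "single-risk-factor": (0.60, 157),
            "total CV risk":      (0.05, 92),
        }

        for name, (treated_share, events_averted) in strategies.items():
            treated = treated_share * population
            ratio = treated / events_averted   # persons treated per event averted
            print(f"{name:>18}: {treated:>6.0f} treated, "
                  f"{events_averted} events averted/yr, "
                  f"{ratio:.0f} treated per event averted")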

  2. The effectiveness of risk management: an analysis of project risk planning across industries and countries.

    PubMed

    Zwikael, Ofer; Ahn, Mark

    2011-01-01

    This article examines the effectiveness of current risk management practices to reduce project risk using a multinational, multi-industry study across different scenarios and cultures. A survey was administered to 701 project managers, and their supervisors, in seven industries and three diverse countries (New Zealand, Israel, and Japan), in multiple languages during the 2002-2007 period. Results of this study show that project context--industry and country where a project is executed--significantly impacts perceived levels of project risk, and the intensity of risk management processes. Our findings also suggest that risk management moderates the relationship between risk level and project success. Specifically, we found that even moderate levels of risk management planning are sufficient to reduce the negative effect risk levels have on project success. © 2010 Society for Risk Analysis.

  3. The amplification of risk in experimental diffusion chains.

    PubMed

    Moussaïd, Mehdi; Brighton, Henry; Gaissmaier, Wolfgang

    2015-05-05

    Understanding how people form and revise their perception of risk is central to designing efficient risk communication methods, eliciting risk awareness, and avoiding unnecessary anxiety among the public. However, public responses to hazardous events such as climate change, contagious outbreaks, and terrorist threats are complex and difficult-to-anticipate phenomena. Although many psychological factors influencing risk perception have been identified in the past, it remains unclear how perceptions of risk change when propagated from one person to another and what impact the repeated social transmission of perceived risk has at the population scale. Here, we study the social dynamics of risk perception by analyzing how messages detailing the benefits and harms of a controversial antibacterial agent undergo change when passed from one person to the next in 10-subject experimental diffusion chains. Our analyses show that when messages are propagated through the diffusion chains, they tend to become shorter, gradually inaccurate, and increasingly dissimilar between chains. In contrast, the perception of risk is propagated with higher fidelity due to participants manipulating messages to fit their preconceptions, thereby influencing the judgments of subsequent participants. Computer simulations implementing this simple influence mechanism show that small judgment biases tend to become more extreme, even when the injected message contradicts preconceived risk judgments. Our results provide quantitative insights into the social amplification of risk perception, and can help policy makers better anticipate and manage the public response to emerging threats.
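
    The influence mechanism described above can be caricatured in a few lines of code. The following toy simulation is not the authors' published model: the chain length of 10 matches the experiments, but the update rule, the preconception distribution and the noise level are invented purely to show how repeated social transmission pulls an initially low-risk message toward a shared preconception.

        import random

        def run_chain(initial_message_risk, chain_length=10, pull=0.4, noise=0.05):
            """Pass a risk judgment (0 = harmless, 1 = very risky) down a chain of
            participants, each nudging it toward their own preconception."""
            judgment = initial_message_risk
            for _ in range(chain_length):
                preconception = random.gauss(0.8, 0.1)   # population biased toward "risky"
                judgment = (1 - pull) * judgment + pull * preconception
                judgment = min(max(judgment + random.gauss(0.0, noise), 0.0), 1.0)
            return judgment

        random.seed(0)
        # Even a message asserting low risk (0.2) drifts toward the shared preconception.
        final = [run_chain(0.2) for _ in range(1000)]
        print(f"mean judgment after 10 transmissions: {sum(final) / len(final):.2f}")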

  4. The amplification of risk in experimental diffusion chains

    PubMed Central

    Moussaïd, Mehdi; Brighton, Henry; Gaissmaier, Wolfgang

    2015-01-01

    Understanding how people form and revise their perception of risk is central to designing efficient risk communication methods, eliciting risk awareness, and avoiding unnecessary anxiety among the public. However, public responses to hazardous events such as climate change, contagious outbreaks, and terrorist threats are complex and difficult-to-anticipate phenomena. Although many psychological factors influencing risk perception have been identified in the past, it remains unclear how perceptions of risk change when propagated from one person to another and what impact the repeated social transmission of perceived risk has at the population scale. Here, we study the social dynamics of risk perception by analyzing how messages detailing the benefits and harms of a controversial antibacterial agent undergo change when passed from one person to the next in 10-subject experimental diffusion chains. Our analyses show that when messages are propagated through the diffusion chains, they tend to become shorter, gradually inaccurate, and increasingly dissimilar between chains. In contrast, the perception of risk is propagated with higher fidelity due to participants manipulating messages to fit their preconceptions, thereby influencing the judgments of subsequent participants. Computer simulations implementing this simple influence mechanism show that small judgment biases tend to become more extreme, even when the injected message contradicts preconceived risk judgments. Our results provide quantitative insights into the social amplification of risk perception, and can help policy makers better anticipate and manage the public response to emerging threats. PMID:25902519

  5. Decisionmaking under risk in invasive species management: risk management theory and applications

    Treesearch

    Shefali V. Mehta; Robert G. Haight; Frances R. Homans

    2010-01-01

    Invasive species management is closely entwined with the assessment and management of risk that arises from the inherently random nature of the invasion process. The theory and application of risk management for invasive species with an economic perspective is reviewed in this synthesis. Invasive species management can be delineated into three general categories:...

  6. AQUATOOL, a generalized decision-support system for water-resources planning and operational management

    NASA Astrophysics Data System (ADS)

    Andreu, J.; Capilla, J.; Sanchís, E.

    1996-04-01

    This paper describes a generic decision-support system (DSS) which was originally designed for the planning stage of decision-making associated with complex river basins. Subsequently, it was expanded to incorporate modules relating to the operational stage of decision-making. Computer-assisted design modules allow any complex water-resource system to be represented in graphical form, giving access to geographically referenced databases and knowledge bases. The modelling capability includes basin simulation and optimization modules, an aquifer flow modelling module and two modules for risk assessment. The Segura and Tagus river basins have been used as case studies in the development and validation phases. The value of this DSS is demonstrated by the fact that both River Basin Agencies currently use a version for the efficient management of their water resources.

  7. PET/CT in Radiation Therapy Planning.

    PubMed

    Specht, Lena; Berthelsen, Anne Kiil

    2018-01-01

    Radiation therapy (RT) is an important component of the management of lymphoma patients. Most lymphomas are metabolically active and accumulate 18F-fluorodeoxyglucose (FDG). Positron emission tomography with computed tomography (PET/CT) imaging using FDG is used routinely in staging and treatment evaluation. FDG-PET/CT imaging is now also used routinely for contouring the target for RT, and has been shown to change the irradiated volume significantly compared with CT imaging alone. Modern advanced imaging techniques with image fusion and motion management in combination with modern highly conformal RT techniques have increased the precision of RT, and have made it possible to reduce dramatically the risks of long-term side effects of treatment while maintaining the high cure rates for these diseases. Copyright © 2017 Elsevier Inc. All rights reserved.

  8. Case studies in Bayesian microbial risk assessments.

    PubMed

    Kennedy, Marc C; Clough, Helen E; Turner, Joanne

    2009-12-21

    The quantification of uncertainty and variability is a key component of quantitative risk analysis. Recent advances in Bayesian statistics make it ideal for integrating multiple sources of information, of different types and quality, and providing a realistic estimate of the combined uncertainty in the final risk estimates. We present two case studies related to foodborne microbial risks. In the first, we combine models to describe the sequence of events resulting in illness from consumption of milk contaminated with VTEC O157. We used Monte Carlo simulation to propagate uncertainty in some of the inputs to computer models describing the farm and pasteurisation process. Resulting simulated contamination levels were then assigned to consumption events from a dietary survey. Finally we accounted for uncertainty in the dose-response relationship and uncertainty due to limited incidence data to derive uncertainty about yearly incidences of illness in young children. Options for altering the risk were considered by running the model with different hypothetical policy-driven exposure scenarios. In the second case study we illustrate an efficient Bayesian sensitivity analysis for identifying the most important parameters of a complex computer code that simulated VTEC O157 prevalence within a managed dairy herd. This was carried out in 2 stages, first to screen out the unimportant inputs, then to perform a more detailed analysis on the remaining inputs. The method works by building a Bayesian statistical approximation to the computer code using a number of known code input/output pairs (training runs). We estimated that the expected total number of children aged 1.5-4.5 who become ill due to VTEC O157 in milk is 8.6 per year, with 95% uncertainty interval (0,11.5). The most extreme policy we considered was banning on-farm pasteurisation of milk, which reduced the estimate to 6.4 with 95% interval (0,11). In the second case study the effective number of inputs was reduced from 30 to 7 in the screening stage, and just 2 inputs were found to explain 82.8% of the output variance. A combined total of 500 runs of the computer code were used. These case studies illustrate the use of Bayesian statistics to perform detailed uncertainty and sensitivity analyses, integrating multiple information sources in a way that is both rigorous and efficient.
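
    As a flavour of the uncertainty propagation in the first case study, the sketch below pushes made-up uncertainty distributions for contamination, consumption volume and a dose-response parameter through a standard exponential dose-response model via Monte Carlo; none of the distributions or parameter values correspond to those fitted in the paper.

        import numpy as np

        rng = np.random.default_rng(42)
        n = 50_000                        # Monte Carlo samples over uncertain inputs

        # Illustrative uncertainty distributions (not the paper's fitted values):
        contamination = rng.lognormal(mean=-2.0, sigma=1.5, size=n)   # organisms per ml
        serving_ml = rng.normal(150, 30, size=n).clip(min=10)         # volume consumed
        r_param = rng.beta(2, 2000, size=n)                           # dose-response parameter

        dose = contamination * serving_ml
        p_ill = 1.0 - np.exp(-r_param * dose)   # exponential dose-response model

        lo, hi = np.quantile(p_ill, [0.025, 0.975])
        print(f"mean P(illness per serving): {p_ill.mean():.4f}")
        print(f"95% uncertainty interval:    ({lo:.4f}, {hi:.4f})")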

  9. Analysis of interactions among barriers in project risk management

    NASA Astrophysics Data System (ADS)

    Dandage, Rahul V.; Mantha, Shankar S.; Rane, Santosh B.; Bhoola, Vanita

    2018-03-01

    In the context of the scope, time, cost, and quality constraints, failure is not uncommon in project management. While small projects have a 70% chance of success, large projects virtually have no chance of meeting the quadruple constraints. While there is no dearth of research on project risk management, the barriers to project risk management are a far less explored topic. The success of project management is oftentimes based on the understanding of barriers to effective risk management, application of appropriate risk management methodology, proactive leadership to avoid barriers, workers' attitude, adequate resources, organizational culture, and involvement of top management. This paper presents various risk categories and barriers to risk management in domestic and international projects through a literature survey and feedback from project professionals. After analysing the various modelling methods used in the project risk management literature, interpretive structural modelling (ISM) and MICMAC analysis have been used to analyse interactions among the barriers and prioritize them. The analysis indicates that lack of top management support, lack of formal training, and lack of addressing cultural differences are the high priority barriers, among many others.
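
    The core of the ISM/MICMAC machinery is small enough to sketch: build the reachability matrix (the transitive closure of the barrier influence matrix) and derive MICMAC driving and dependence powers. The four barriers and the adjacency values below are invented for illustration; the paper's actual barrier set is larger.

        import numpy as np

        # Four hypothetical barriers and an invented influence matrix
        # (A[i, j] = 1 means "barrier i influences barrier j").
        barriers = ["top mgmt support", "formal training", "cultural differences", "resources"]
        A = np.array([[0, 1, 1, 1],
                      [0, 0, 0, 1],
                      [0, 0, 0, 1],
                      [0, 0, 0, 0]])
        n = len(barriers)

        # ISM reachability matrix: transitive closure of (A + I) via repeated squaring.
        R = ((A + np.eye(n, dtype=int)) > 0).astype(int)
        for _ in range(n):
            R = ((R @ R) > 0).astype(int)

        # MICMAC: driving power = row sums, dependence power = column sums.
        for name, drv, dep in zip(barriers, R.sum(axis=1), R.sum(axis=0)):
            print(f"{name:>22}: driving power = {drv}, dependence power = {dep}")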

  10. Managing geometric information with a data base management system

    NASA Technical Reports Server (NTRS)

    Dube, R. P.

    1984-01-01

    The strategies for managing computer-based geometry are described. The computer model of geometry is the basis for communication, manipulation, and analysis of shape information. The research on integrated programs for aerospace-vehicle design (IPAD) focuses on the use of data base management system (DBMS) technology to manage engineering/manufacturing data. The objective of IPAD is to develop a computer-based engineering complex which automates the storage, management, protection, and retrieval of engineering data. In particular, this facility must manage geometry information as well as associated data. The approach taken on the IPAD project to achieve this objective is discussed. Geometry management in current systems and the approach taken in the early IPAD prototypes are examined.

  11. An Architecture for Cross-Cloud System Management

    NASA Astrophysics Data System (ADS)

    Dodda, Ravi Teja; Smith, Chris; van Moorsel, Aad

    The emergence of the cloud computing paradigm promises flexibility and adaptability through on-demand provisioning of compute resources. As the utilization of cloud resources extends beyond a single provider, for business as well as technical reasons, the issue of effectively managing such resources comes to the fore. Different providers expose different interfaces to their compute resources utilizing varied architectures and implementation technologies. This heterogeneity poses a significant system management problem, and can limit the extent to which the benefits of cross-cloud resource utilization can be realized. We address this problem through the definition of an architecture to facilitate the management of compute resources from different cloud providers in a homogeneous manner. This preserves the flexibility and adaptability promised by the cloud computing paradigm, whilst enabling the benefits of cross-cloud resource utilization to be realized. The practical efficacy of the architecture is demonstrated through an implementation utilizing compute resources managed through different interfaces on the Amazon Elastic Compute Cloud (EC2) service. Additionally, we provide empirical results highlighting the performance differential of these different interfaces, and discuss the impact of this performance differential on efficiency and profitability.
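
    The architectural idea of placing one homogeneous management API in front of heterogeneous provider interfaces can be sketched as a simple adapter layer. The class and method names below are illustrative assumptions, not the interfaces defined by the authors or by the EC2 service.

        from abc import ABC, abstractmethod

        class CloudProvider(ABC):
            """One homogeneous management interface in front of provider-specific APIs."""
            @abstractmethod
            def provision(self, image: str, size: str) -> str: ...
            @abstractmethod
            def terminate(self, instance_id: str) -> None: ...

        class SoapAdapter(CloudProvider):
            def provision(self, image, size):
                # A real adapter would call the provider's SOAP interface here.
                return "i-soap-001"
            def terminate(self, instance_id):
                pass

        class QueryAdapter(CloudProvider):
            def provision(self, image, size):
                # A real adapter would call the provider's Query (REST-style) interface here.
                return "i-query-001"
            def terminate(self, instance_id):
                pass

        def scale_out(providers, image, size):
            """Provision one instance per provider through the common interface."""
            return [p.provision(image, size) for p in providers]

        print(scale_out([SoapAdapter(), QueryAdapter()], "demo-image", "small"))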

  12. Managing risks and hazards in industrial operations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Almaula, S.C.

    1996-12-31

    The main objective of this paper is to demonstrate that it makes good business sense to identify the risks and hazards of an operation and take appropriate steps to manage them effectively. Developing and implementing an effective risk and hazard management plan also contributes to other industry requirements and standards. Development of a risk management system, key elements of a risk management plan, and hazard and risk analysis methods are outlined. Comparing potential risk to the cost of prevention is also discussed. It is estimated that the cost of developing and preparing the first risk management plan varies between $50,000 and $200,000. 3 refs., 2 figs., 1 tab.

  13. [The relevance of clinical risk management].

    PubMed

    Gulino, Matteo; Vergallo, Gianluca Montanari; Frati, Paola

    2011-01-01

    Medical activity carries a risk of possible injury or complications for patients, which should drive health care institutions to introduce and/or improve clinical risk management instruments. Although Italy still lacks a national project on clinical risk management, a number of efforts have been made by different Italian regions to introduce risk management instruments. In addition, most national health care institutions now include a department specifically in charge of managing clinical risk. Despite the practical difficulties, the results obtained so far suggest that risk management may represent a useful instrument for reducing errors in clinical conduct. Indeed, the introduction of adequate instruments for the prevention and management of clinical risk may help to improve the quality of health care institutions' services.

  14. Automated systems to identify relevant documents in product risk management

    PubMed Central

    2012-01-01

    Background Product risk management involves critical assessment of the risks and benefits of health products circulating in the market. One of the important sources of safety information is the primary literature, especially for newer products with which regulatory authorities have relatively little experience. Although the primary literature provides vast and diverse information, only a small proportion of it is useful for product risk assessment work. Hence, the aim of this study is to explore the possibility of using text mining to automate the identification of useful articles, which will reduce the time taken for literature searches and hence improve work efficiency. In this study, term-frequency inverse document-frequency values were computed for predictors extracted from the titles and abstracts of articles related to three tumour necrosis factor-alpha blockers. A general automated system was developed using only general predictors and was tested for its generalizability using articles related to four other drug classes. Several specific automated systems were developed using both general and specific predictors and training sets of different sizes in order to determine the minimum number of articles required for developing such systems. Results The general automated system had an area under the curve value of 0.731 and was able to rank 34.6% and 46.2% of the total number of 'useful' articles among the first 10% and 20% of the articles presented to the evaluators when tested on the generalizability set. However, its use may be limited by the subjective definition of useful articles. For the specific automated system, it was found that only 20 articles were required to develop a specific automated system with a prediction performance (AUC 0.748) that was better than that of the general automated system. Conclusions Specific automated systems can be developed rapidly and avoid problems caused by the subjective definition of useful articles. Thus the efficiency of product risk management can be improved with the use of specific automated systems. PMID:22380483
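
    The modelling step lends itself to a compact illustration: compute TF-IDF features from titles or abstracts and train a classifier to rank new articles by predicted usefulness. The articles, labels and the choice of logistic regression below are toy assumptions for the sketch, not the study's actual pipeline or predictors.

        from sklearn.feature_extraction.text import TfidfVectorizer
        from sklearn.linear_model import LogisticRegression

        # Made-up titles and relevance labels (1 = useful for product risk assessment).
        titles = [
            "Serious infection risk with TNF-alpha blocker therapy",
            "Cost analysis of biologic prescribing patterns",
            "Case report: hepatic adverse event after anti-TNF treatment",
            "Molecular structure of a monoclonal antibody",
        ]
        useful = [1, 0, 1, 0]

        vectorizer = TfidfVectorizer(stop_words="english")
        X = vectorizer.fit_transform(titles)
        classifier = LogisticRegression().fit(X, useful)

        # Rank a new article by its predicted probability of being useful.
        new_title = ["Post-marketing safety signal for TNF-alpha inhibitors"]
        score = classifier.predict_proba(vectorizer.transform(new_title))[0, 1]
        print(f"predicted usefulness score: {score:.2f}")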

  15. Integration of Grid and Sensor Web for Flood Monitoring and Risk Assessment from Heterogeneous Data

    NASA Astrophysics Data System (ADS)

    Kussul, Nataliia; Skakun, Sergii; Shelestov, Andrii

    2013-04-01

    Over the last decades we have witnessed an upward global trend in natural disaster occurrence. Hydrological and meteorological disasters such as floods are the main contributors to this pattern. In recent years flood management has shifted from protection against floods to managing the risks of floods (the European Flood risk directive). In order to enable operational flood monitoring and assessment of flood risk, it is required to provide an infrastructure with standardized interfaces and services. Grid and Sensor Web can meet these requirements. In this paper we present a general approach to flood monitoring and risk assessment based on heterogeneous geospatial data acquired from multiple sources. To enable operational flood risk assessment, integration of Grid and Sensor Web approaches is proposed [1]. Grid represents a distributed environment that integrates heterogeneous computing and storage resources administrated by multiple organizations. Sensor Web is an emerging paradigm for integrating heterogeneous satellite and in situ sensors and data systems into a common informational infrastructure that produces products on demand. The basic Sensor Web functionality includes sensor discovery, triggering events by observed or predicted conditions, remote data access and processing capabilities to generate and deliver data products. Sensor Web is governed by a set of standards, called Sensor Web Enablement (SWE), developed by the Open Geospatial Consortium (OGC). Different practical issues regarding integration of Sensor Web with Grids are discussed in the study. We show how the Sensor Web can benefit from using Grids and vice versa. For example, Sensor Web services such as SOS, SPS and SAS can benefit from the integration with a Grid platform like the Globus Toolkit. The proposed approach is implemented within the Sensor Web framework for flood monitoring and risk assessment, and a case study of exploiting this framework, namely the Namibia SensorWeb Pilot Project, is described. The project was created as a testbed for evaluating and prototyping key technologies for rapid acquisition and distribution of data products for decision support systems to monitor floods and enable flood risk assessment. The system provides access to real-time products on rainfall estimates and flood potential forecasts derived from the Tropical Rainfall Measuring Mission (TRMM) with a lag time of 6 h, alerts from the Global Disaster Alert and Coordination System (GDACS) with a lag time of 4 h, and the Coupled Routing and Excess STorage (CREST) model to generate alerts. These alerts are used to trigger satellite observations. With the deployed SPS service for NASA's EO-1 satellite it is possible to automatically task the sensor with a re-imaging capability of less than 8 h. Therefore, with the computational and storage services provided by Grid and cloud infrastructure it was possible to generate flood maps within 24-48 h after a trigger was alerted. To enable interoperability between system components and services, OGC-compliant standards are utilized. [1] Hluchy L., Kussul N., Shelestov A., Skakun S., Kravchenko O., Gripich Y., Kopp P., Lupian E., "The Data Fusion Grid Infrastructure: Project Objectives and Achievements," Computing and Informatics, 2010, vol. 29, no. 2, pp. 319-334.

  16. Implementation and evaluation of an efficient secure computation system using ‘R’ for healthcare statistics

    PubMed Central

    Chida, Koji; Morohashi, Gembu; Fuji, Hitoshi; Magata, Fumihiko; Fujimura, Akiko; Hamada, Koki; Ikarashi, Dai; Yamamoto, Ryuichi

    2014-01-01

    Background and objective While the secondary use of medical data has gained attention, its adoption has been constrained due to protection of patient privacy. Making medical data secure by de-identification can be problematic, especially when the data concerns rare diseases. We require rigorous security management measures. Materials and methods Using secure computation, an approach from cryptography, our system can compute various statistics over encrypted medical records without decrypting them. An issue of secure computation is that the amount of processing time required is immense. We implemented a system that securely computes healthcare statistics from the statistical computing software ‘R’ by effectively combining secret-sharing-based secure computation with original computation. Results Testing confirmed that our system could correctly complete computation of average and unbiased variance of approximately 50 000 records of dummy insurance claim data in a little over a second. Computation including conditional expressions and/or comparison of values, for example, t test and median, could also be correctly completed in several tens of seconds to a few minutes. Discussion If medical records are simply encrypted, the risk of leaks exists because decryption is usually required during statistical analysis. Our system possesses high-level security because medical records remain in encrypted state even during statistical analysis. Also, our system can securely compute some basic statistics with conditional expressions using ‘R’ that works interactively while secure computation protocols generally require a significant amount of processing time. Conclusions We propose a secure statistical analysis system using ‘R’ for medical data that effectively integrates secret-sharing-based secure computation and original computation. PMID:24763677
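
    The secret-sharing idea underlying the system can be illustrated in miniature: each value is split into random additive shares held by different servers, and aggregate statistics are computed on the shares without any server ever seeing a raw record. The sketch below is a toy additive-sharing example with dummy claim amounts, not the protocol or the 'R' integration implemented by the authors.

        import random

        PRIME = 2**61 - 1          # all share arithmetic is done modulo a large prime

        def share(value, n_parties=3):
            """Split an integer into n additive shares that sum to it modulo PRIME."""
            parts = [random.randrange(PRIME) for _ in range(n_parties - 1)]
            parts.append((value - sum(parts)) % PRIME)
            return parts

        # Dummy claim amounts; each value is shared across three servers, and every
        # server sees only its own (individually meaningless) shares.
        claims = [1200, 860, 4310, 75]
        shares_by_server = list(zip(*(share(v) for v in claims)))

        # Each server sums its shares locally; combining the partial sums reveals
        # only the aggregate, never an individual record.
        partial_sums = [sum(s) % PRIME for s in shares_by_server]
        total = sum(partial_sums) % PRIME
        print(f"secure total: {total}, mean: {total / len(claims):.1f}")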

  17. Implementation and evaluation of an efficient secure computation system using 'R' for healthcare statistics.

    PubMed

    Chida, Koji; Morohashi, Gembu; Fuji, Hitoshi; Magata, Fumihiko; Fujimura, Akiko; Hamada, Koki; Ikarashi, Dai; Yamamoto, Ryuichi

    2014-10-01

    While the secondary use of medical data has gained attention, its adoption has been constrained due to protection of patient privacy. Making medical data secure by de-identification can be problematic, especially when the data concerns rare diseases. We require rigorous security management measures. Using secure computation, an approach from cryptography, our system can compute various statistics over encrypted medical records without decrypting them. An issue of secure computation is that the amount of processing time required is immense. We implemented a system that securely computes healthcare statistics from the statistical computing software 'R' by effectively combining secret-sharing-based secure computation with original computation. Testing confirmed that our system could correctly complete computation of average and unbiased variance of approximately 50,000 records of dummy insurance claim data in a little over a second. Computation including conditional expressions and/or comparison of values, for example, t test and median, could also be correctly completed in several tens of seconds to a few minutes. If medical records are simply encrypted, the risk of leaks exists because decryption is usually required during statistical analysis. Our system possesses high-level security because medical records remain in encrypted state even during statistical analysis. Also, our system can securely compute some basic statistics with conditional expressions using 'R' that works interactively while secure computation protocols generally require a significant amount of processing time. We propose a secure statistical analysis system using 'R' for medical data that effectively integrates secret-sharing-based secure computation and original computation. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://group.bmj.com/group/rights-licensing/permissions.

  18. 42 CFR 441.476 - Risk management.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 42 Public Health 4 2012-10-01 2012-10-01 false Risk management. 441.476 Section 441.476 Public... Self-Directed Personal Assistance Services Program § 441.476 Risk management. (a) The State must... plan for how identified risks will be mitigated. (d) The State must ensure that the risk management...

  19. 42 CFR 441.476 - Risk management.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 42 Public Health 4 2014-10-01 2014-10-01 false Risk management. 441.476 Section 441.476 Public... Self-Directed Personal Assistance Services Program § 441.476 Risk management. (a) The State must... plan for how identified risks will be mitigated. (d) The State must ensure that the risk management...

  20. 42 CFR 441.476 - Risk management.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 42 Public Health 4 2011-10-01 2011-10-01 false Risk management. 441.476 Section 441.476 Public... Self-Directed Personal Assistance Services Program § 441.476 Risk management. (a) The State must... plan for how identified risks will be mitigated. (d) The State must ensure that the risk management...

  1. 42 CFR 441.476 - Risk management.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 42 Public Health 4 2013-10-01 2013-10-01 false Risk management. 441.476 Section 441.476 Public... Self-Directed Personal Assistance Services Program § 441.476 Risk management. (a) The State must... plan for how identified risks will be mitigated. (d) The State must ensure that the risk management...

  2. 76 FR 76103 - Privacy Act; Notice of Proposed Rulemaking: State-78, Risk Analysis and Management Records

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-12-06

    ... Rulemaking: State-78, Risk Analysis and Management Records SUMMARY: Notice is hereby given that the... portions of the Risk Analysis and Management (RAM) Records, State-78, system of records contain criminal...) * * * (2) * * * Risk Analysis and Management Records, STATE-78. * * * * * (b) * * * (1) * * * Risk Analysis...

  3. 42 CFR 441.476 - Risk management.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 42 Public Health 4 2010-10-01 2010-10-01 false Risk management. 441.476 Section 441.476 Public... Self-Directed Personal Assistance Services Program § 441.476 Risk management. (a) The State must... plan for how identified risks will be mitigated. (d) The State must ensure that the risk management...

  4. 12 CFR 704.6 - Credit risk management.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 12 Banks and Banking 6 2010-01-01 2010-01-01 false Credit risk management. 704.6 Section 704.6... CREDIT UNIONS § 704.6 Credit risk management. (a) Policies. A corporate credit union must operate according to a credit risk management policy that is commensurate with the investment risks and activities...

  5. 12 CFR 563.176 - Interest-rate-risk-management procedures.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 12 Banks and Banking 5 2011-01-01 2011-01-01 false Interest-rate-risk-management procedures. 563... ASSOCIATIONS-OPERATIONS Financial Management Policies § 563.176 Interest-rate-risk-management procedures... association's management of that risk. (b) The board of directors shall formally adopt a policy for the...

  6. 12 CFR 563.176 - Interest-rate-risk-management procedures.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 12 Banks and Banking 5 2010-01-01 2010-01-01 false Interest-rate-risk-management procedures. 563... ASSOCIATIONS-OPERATIONS Financial Management Policies § 563.176 Interest-rate-risk-management procedures... association's management of that risk. (b) The board of directors shall formally adopt a policy for the...

  7. Marine and Hydrokinetic Technology Development Risk Management Framework

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Snowberg, David; Weber, Jochem

    2015-09-01

    Over the past decade, the global marine and hydrokinetic (MHK) industry has suffered a number of serious technological and commercial setbacks. To help reduce the risks of industry failures and advance the development of new technologies, the U.S. Department of Energy (DOE) and the National Renewable Energy Laboratory (NREL) developed an MHK Risk Management Framework. By addressing uncertainties, the MHK Risk Management Framework increases the likelihood of successful development of an MHK technology. It covers projects of any technical readiness level (TRL) or technical performance level (TPL) and all risk types (e.g. technological risk, regulatory risk, commercial risk) over the development cycle. This framework is intended for the development and deployment of a single MHK technology—not for multiple device deployments within a plant. This risk framework is intended to meet DOE's risk management expectations for the MHK technology research and development efforts of the Water Power Program (see Appendix A). It also provides an overview of other relevant risk management tools and documentation. This framework emphasizes design and risk reviews as formal gates to ensure risks are managed throughout the technology development cycle. Section 1 presents the recommended technology development cycle, and Sections 2 and 3 present tools to assess the TRL and TPL of the project, respectively. Section 4 presents a risk management process with design and risk reviews for actively managing risk within the project, and Section 5 presents a detailed description of a risk registry to collect the risk management information into one living document. Section 6 presents recommendations for collecting and using lessons learned throughout the development process.

  8. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Glantz, C.S.; DiMassa, F.V.; Pelto, P.J.

    The Western Area Power Administration (Western) views environmental protection and compliance as a top priority as it manages the construction, operation, and maintenance of its vast network of transmission lines, substations, and other facilities. A recent Department of Energy audit of Western's environmental management activities recommends that Western adopt a formal environmental risk program. To accomplish this goal, Western, in conjunction with Pacific Northwest Laboratory, is in the process of developing a centrally coordinated environmental risk program. This report presents the results of this design effort, and indicates the direction in which Western's environmental risk program is heading. Western's environmental risk program will consist of three main components: risk communication, risk assessment, and risk management/decision making. Risk communication is defined as an exchange of information on the potential for threats to human health, public safety, or the environment. This information exchange provides a mechanism for public involvement, and also for participation in the risk assessment and management process by diverse groups or offices within Western. The objective of risk assessment is to evaluate and rank the relative magnitude of risks associated with specific environmental issues that are facing Western. The evaluation and ranking is based on the best available scientific information and judgment and serves as input to the risk management process. Risk management takes risk information and combines it with relevant non-risk factors (e.g., legal mandates, public opinion, costs) to generate risk management options. A risk management tool, such as decision analysis, can be used to help make risk management choices.

  9. Health risk assessment and the practice of industrial hygiene.

    PubMed

    Paustenbach, D J

    1990-07-01

    It has been claimed that there may be as many as 2000 airborne chemicals to which persons could be exposed in the workplace and in the community. Of these, occupational exposure limits have been set for approximately 700 chemicals, and only about 30 chemicals have limits for the ambient air. It is likely that some type of health risk assessment methodology will be used to establish limits for the remainder. Although these methods have been used for over 10 yr to set environmental limits, each step of the process (hazard identification, dose-response assessment, exposure assessment, and risk characterization) contains a number of traps into which scientists and risk managers can fall. For example, regulatory approaches to the hazard identification step have allowed little discrimination between the various animal carcinogens, even though these chemicals can vary greatly in their potency and mechanisms of action. In general, epidemiology data have been given little weight compared to the results of rodent bioassays. The dose-response extrapolation process, as generally practiced, often does not present the range of equally plausible values. Procedures which acknowledge and quantitatively account for some or all of the different classes of chemical carcinogens have not been widely adopted. For example, physiologically based pharmacokinetic (PB-PK) and biologically based models need to become a part of future risk assessments. The exposure evaluation portion of risk assessments can now be significantly more valid because of better dispersion models, validated exposure parameters, and the use of computers to account for complex environmental factors. Using these procedures, industrial hygienists are now able to quantitatively estimate the risks caused not only by the inhalation of chemicals but also those caused by dermal contact and incidental ingestion. The appropriate use of risk assessment methods should allow scientists and risk managers to set scientifically valid environmental and occupational standards for air contaminants.

  10. A Novel College Network Resource Management Method using Cloud Computing

    NASA Astrophysics Data System (ADS)

    Lin, Chen

    At present, the informatization of colleges mainly comprises the construction of college networks and management information systems, and many problems arise during this process. Cloud computing is a development of distributed processing, parallel processing and grid computing: data are stored in the cloud, software and services are placed in the cloud and built on top of various standards and protocols, and they can be accessed through all kinds of equipment. This article introduces cloud computing and its functions, then analyzes the existing problems of college network resource management; finally, cloud computing technology and methods are applied to the construction of a college information sharing platform.

  11. The iPlant Collaborative: Cyberinfrastructure for Enabling Data to Discovery for the Life Sciences.

    PubMed

    Merchant, Nirav; Lyons, Eric; Goff, Stephen; Vaughn, Matthew; Ware, Doreen; Micklos, David; Antin, Parker

    2016-01-01

    The iPlant Collaborative provides life science research communities access to comprehensive, scalable, and cohesive computational infrastructure for data management; identity management; collaboration tools; and cloud, high-performance, high-throughput computing. iPlant provides training, learning material, and best practice resources to help all researchers make the best use of their data, expand their computational skill set, and effectively manage their data and computation when working as distributed teams. iPlant's platform permits researchers to easily deposit and share their data and deploy new computational tools and analysis workflows, allowing the broader community to easily use and reuse those data and computational analyses.

  12. A high-resolution physically-based global flood hazard map

    NASA Astrophysics Data System (ADS)

    Kaheil, Y.; Begnudelli, L.; McCollum, J.

    2016-12-01

    We present the results from a physically-based global flood hazard model. The model uses a physically-based hydrologic model to simulate river discharges, and a 2D hydrodynamic model to simulate inundation. The model is set up such that it allows the application to large-scale flood hazard mapping through efficient use of parallel computing. For hydrology, we use the Hillslope River Routing (HRR) model. HRR accounts for surface hydrology using Green-Ampt parameterization. The model is calibrated against observed discharge data from the Global Runoff Data Centre (GRDC) network, among other publicly-available datasets. The parallel-computing framework takes advantage of the river network structure to minimize cross-processor messages, and thus significantly increases computational efficiency. For inundation, we implemented a computationally-efficient 2D finite-volume model with wetting/drying. The approach consists of simulating floods along the river network by forcing the hydraulic model with the streamflow hydrographs simulated by HRR, and scaled up to certain return levels, e.g. 100 years. The model is distributed such that each available processor takes the next simulation. Given an approximate cost criterion, the simulations are ordered from most-demanding to least-demanding to ensure that all processors finalize almost simultaneously. Upon completing all simulations, the maximum envelope of flood depth is taken to generate the final map. The model is applied globally, with selected results shown from different continents and regions. The maps shown depict flood depth and extent at different return periods. These maps, which are currently available at 3 arc-sec resolution (approximately 90 m), can be made available at higher resolutions where high-resolution DEMs are available. The maps can be utilized by flood risk managers at the national, regional, and even local levels to further understand their flood risk exposure, exercise certain measures of mitigation, and/or transfer the residual risk financially through flood insurance programs.
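
    The load-balancing heuristic mentioned above (dispatching simulations from most to least demanding so that workers finish at roughly the same time) is essentially longest-processing-time-first scheduling. The sketch below illustrates it with invented per-simulation cost estimates and three workers; it is a generic scheduling example, not the authors' implementation.

        import heapq

        # Invented relative cost estimates for ten inundation simulations.
        sim_costs = [95, 80, 60, 42, 40, 33, 21, 12, 7, 5]
        n_workers = 3

        loads = [(0, w) for w in range(n_workers)]      # (current load, worker id)
        heapq.heapify(loads)
        assignment = {w: [] for w in range(n_workers)}

        # Dispatch the most demanding remaining simulation to the least-loaded worker.
        for cost in sorted(sim_costs, reverse=True):
            load, w = heapq.heappop(loads)
            assignment[w].append(cost)
            heapq.heappush(loads, (load + cost, w))

        for w, jobs in assignment.items():
            print(f"worker {w}: jobs = {jobs}, total cost = {sum(jobs)}")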

  13. Cross-Cutting Risk Framework: Mining Data for Common Risks Across the Portfolio

    NASA Technical Reports Server (NTRS)

    Klein, Gerald A., Jr.; Ruark, Valerie

    2017-01-01

    The National Aeronautics and Space Administration (NASA) defines risk management as an integrated framework, combining risk-informed decision making and continuous risk management to foster forward-thinking and decision making from an integrated risk perspective. Therefore, decision makers must have access to risks outside of their own project to gain the knowledge that provides the integrated risk perspective. Through the Goddard Space Flight Center (GSFC) Flight Projects Directorate (FPD) Business Change Initiative (BCI), risks were integrated into one repository to facilitate access to risk data between projects. With the centralized repository, communications between the FPD, project managers, and risk managers improved and GSFC created the cross-cutting risk framework (CCRF) team. The creation of the consolidated risk repository, in parallel with the initiation of monthly FPD risk managers and risk governance board meetings, is now providing a complete risk management picture spanning the entire directorate. This paper will describe the challenges, methodologies, tools, and techniques used to develop the CCRF, and the lessons learned as the team collectively worked to identify risks that FPD programs and projects had in common, both past and present.

  14. Application of a risk management system to improve drinking water safety.

    PubMed

    Jayaratne, Asoka

    2008-12-01

    The use of a comprehensive risk management framework is considered a very effective means of managing water quality risks. There are many risk-based systems available to water utilities such as ISO 9001 and Hazard Analysis and Critical Control Point (HACCP). In 2004, the World Health Organization's (WHO) Guidelines for Drinking Water Quality recommended the use of preventive risk management approaches to manage water quality risks. This paper describes the framework adopted by Yarra Valley Water for the development of its Drinking Water Quality Risk Management Plan incorporating HACCP and ISO 9001 systems and demonstrates benefits of Water Safety Plans such as HACCP. Copyright IWA Publishing 2008.

  15. Additional Security Considerations for Grid Management

    NASA Technical Reports Server (NTRS)

    Eidson, Thomas M.

    2003-01-01

    The use of Grid computing environments is growing in popularity. A Grid computing environment is primarily a wide area network that encompasses multiple local area networks, where some of the local area networks are managed by different organizations. A Grid computing environment also includes common interfaces for distributed computing software so that the heterogeneous set of machines that make up the Grid can be used more easily. The other key feature of a Grid is that the distributed computing software includes appropriate security technology. The focus of most Grid software is on the security involved with application execution, file transfers, and other remote computing procedures. However, there are other important security issues related to the management of a Grid and the users who use that Grid. This note discusses these additional security issues and makes several suggestions as to how they can be managed.

  16. Appraising longitudinal trends in the strategic risks cited by risk managers in the international water utility sector, 2005-2015.

    PubMed

    Chalker, Rosemary T C; Pollard, Simon J T; Leinster, Paul; Jude, Simon

    2018-03-15

    We report dynamic changes in the priorities for strategic risks faced by international water utilities over a 10-year period, as cited by the managers responsible for managing them. A content analysis of interviews with three cohorts of risk managers in the water sector was undertaken. Interviews probed the focus risk managers were giving to strategic risks within utilities, as well as specific questions on risk analysis tools (2005); risk management cultures (2011); and the integration of risk management with corporate decision-making (2015). The coding frequency of strategic (business, enterprise, corporate) risk terms from 18 structured interviews (2005) and 28 semi-structured interviews (12 in 2011; 16 in 2015) was used to appraise changes in the perceived importance of strategic risks within the sector. The aggregated coding frequency across the study period, and changes in the frequency of strategic risks cited at the three interview periods, identified infrastructure assets as the most significant risk over the period and suggest an emergence of extrinsic risk over time. Extended interviews with three utility risk managers (2016) from the UK, Canada and the US were then used to contextualise the findings. This research supports the ongoing focus on infrastructure resilience and the increasing prevalence of extrinsic risk within the water sector, as reported by the insurance sector and by water research organisations. The extended interviews provided insight into how strategic risks are now driving the implementation agenda within utilities, and into how utilities can secure tangible business value from proactive risk governance. Strategic external risks affecting the sector are on the rise, involve more players and are less controllable from within a utility's own organisational boundaries. Proportionate risk management processes and structures provide oversight and assurance, whilst allowing a focus on the tangible business value that comes from managing strategic risks well. Copyright © 2017 The Authors. Published by Elsevier B.V. All rights reserved.

  17. Predicting risk of trace element pollution from municipal roads using site-specific soil samples and remotely sensed data.

    PubMed

    Reeves, Mari Kathryn; Perdue, Margaret; Munk, Lee Ann; Hagedorn, Birgit

    2018-07-15

    Studies of environmental processes exhibit spatial variation within data sets. The ability to derive predictions of risk from field data is a critical path forward in understanding the data and applying the information to land and resource management. Thanks to recent advances in predictive modeling, open source software, and computing, the power to do this is within grasp. This article provides an example of how we predicted relative trace element pollution risk from roads across a region by combining site specific trace element data in soils with regional land cover and planning information in a predictive model framework. In the Kenai Peninsula of Alaska, we sampled 36 sites (191 soil samples) adjacent to roads for trace elements. We then combined this site specific data with freely-available land cover and urban planning data to derive a predictive model of landscape scale environmental risk. We used six different model algorithms to analyze the dataset, comparing these in terms of their predictive abilities and the variables identified as important. Based on comparable predictive abilities (mean R² from 30 to 35% and mean root mean square error from 65 to 68%), we averaged all six model outputs to predict relative levels of trace element deposition in soils, given the road surface, traffic volume, sample distance from the road, land cover category, and impervious surface percentage. Mapped predictions of environmental risk from toxic trace element pollution can show land managers and transportation planners where to prioritize road renewal or maintenance by each road segment's relative environmental and human health risk. Published by Elsevier B.V.
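
    The model-averaging step can be illustrated with a small sketch: fit several regression algorithms to the same site-level data and average their predictions into one ensemble estimate of relative risk. The synthetic data and the particular three algorithms below are assumptions for illustration, not the six models or predictors used in the study.

        import numpy as np
        from sklearn.linear_model import LinearRegression
        from sklearn.ensemble import GradientBoostingRegressor, RandomForestRegressor

        # Synthetic stand-in for the site-level data (191 samples, 5 predictors such
        # as traffic volume, distance from road, imperviousness, ...).
        rng = np.random.default_rng(0)
        X = rng.normal(size=(191, 5))
        y = X @ np.array([0.8, -0.3, 0.5, 0.0, 0.2]) + rng.normal(scale=0.5, size=191)

        models = [LinearRegression(),
                  RandomForestRegressor(random_state=0),
                  GradientBoostingRegressor(random_state=0)]

        # Fit every model and average their predictions into one ensemble estimate.
        predictions = np.column_stack([m.fit(X, y).predict(X) for m in models])
        ensemble = predictions.mean(axis=1)
        print(f"ensemble relative-risk prediction for the first site: {ensemble[0]:.2f}")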

  18. Evaluation of volcanic risk management in Merapi and Bromo Volcanoes

    NASA Astrophysics Data System (ADS)

    Bachri, S.; Stöetter, J.; Sartohadi, J.; Setiawan, M. A.

    2012-04-01

    Merapi (Central Java Province) and Bromo (East Java Province) volcanoes have human-environmental systems with unique characteristics, thus causing specific consequences for their risk management. Various efforts have been carried out by many parties (government institutions, scientists, and non-governmental organizations) to reduce the risk in these areas. However, it is likely that most of the actions have been done for temporary and partial purposes, leading to overlapping work and finally to a non-integrated scheme of volcanic risk management. This study, therefore, aims to identify and evaluate actions of risk and disaster reduction in the Merapi and Bromo volcanoes. To achieve this aim, a thorough literature review was carried out to identify earlier studies in both areas. Afterwards, the basic concept of the risk management cycle, consisting of risk assessment, risk reduction, event management and regeneration, is used to map those earlier studies and already implemented risk management actions in Merapi and Bromo. The results show that risk studies in Merapi have been developed predominantly on the physical aspects of volcanic eruptions, i.e. models of lahar flows, hazard maps, as well as other geophysical modelling. Furthermore, after the 2006 eruption of Merapi, research on risk communication, social vulnerability and cultural vulnerability has appeared on the social side of risk management research. Apart from that, disaster risk management activities in the Bromo area emphasized physical processes and historical religious aspects. This overview of both study areas provides information on how risk studies have been used for managing the volcano disaster. The result confirms that most of the earlier studies emphasize risk assessment and only few of them consider the risk reduction phase. Further fieldwork in the near future will complement these findings and contribute to formulating integrated volcanic risk management cycles for both Merapi and Bromo. Keywords: Risk management, volcanic hazard, Merapi and Bromo volcanoes, Indonesia

  19. A Novel Resource Management Method of Providing Operating System as a Service for Mobile Transparent Computing

    PubMed Central

    Huang, Suzhen; Wu, Min; Zhang, Yaoxue; She, Jinhua

    2014-01-01

    This paper presents a framework for mobile transparent computing. It extends PC transparent computing to mobile terminals. Since resources contain different kinds of operating systems and user data that are stored in a remote server, how to manage the network resources is essential. In this paper, we apply the technologies of quick emulator (QEMU) virtualization and mobile agent for mobile transparent computing (MTC) to devise a method of shared resource and service management (SRSM). It has three layers: a user layer, a management layer, and a resource layer. A mobile virtual terminal in the user layer and virtual resource management in the management layer cooperate to maintain the SRSM function accurately according to the user's requirements. An example of SRSM is used to validate this method. Experimental results show that the strategy is effective and stable. PMID:24883353

  20. A novel resource management method of providing operating system as a service for mobile transparent computing.

    PubMed

    Xiong, Yonghua; Huang, Suzhen; Wu, Min; Zhang, Yaoxue; She, Jinhua

    2014-01-01

    This paper presents a framework for mobile transparent computing. It extends PC transparent computing to mobile terminals. Since resources contain different kinds of operating systems and user data that are stored in a remote server, how to manage the network resources is essential. In this paper, we apply the technologies of quick emulator (QEMU) virtualization and mobile agent for mobile transparent computing (MTC) to devise a method of shared resource and service management (SRSM). It has three layers: a user layer, a management layer, and a resource layer. A mobile virtual terminal in the user layer and virtual resource management in the management layer cooperate to maintain the SRSM function accurately according to the user's requirements. An example of SRSM is used to validate this method. Experimental results show that the strategy is effective and stable.

  1. Continuous Risk Management Course. Revised

    NASA Technical Reports Server (NTRS)

    Hammer, Theodore F.

    1999-01-01

    This document includes a course plan for Continuous Risk Management taught by the Software Assurance Technology Center along with the Continuous Risk Management Guidebook of the Software Engineering Institute of Carnegie Mellon University and a description of Continuous Risk Management at NASA.

  2. Guidelines for development of NASA (National Aeronautics and Space Administration) computer security training programs

    NASA Technical Reports Server (NTRS)

    Tompkins, F. G.

    1983-01-01

    The report presents guidance for the NASA Computer Security Program Manager and the NASA Center Computer Security Officials as they develop training requirements and implement computer security training programs. NASA audiences are categorized based on the computer security knowledge required to accomplish identified job functions. Training requirements, in terms of training subject areas, are presented for both computer security program management personnel and computer resource providers and users. Sources of computer security training are identified.

  3. Rethinking 'risk' and self-management for chronic illness.

    PubMed

    Morden, Andrew; Jinks, Clare; Ong, Bie Nio

    2012-02-01

    Self-management for chronic illness is a current high-profile UK healthcare policy. Policy and clinical recommendations relating to chronic illnesses are framed within a language of lifestyle risk management. This article argues that the enactment of risk within current UK self-management policy is intimately related to neo-liberal ideology and is geared towards population governance. The approach that dominates policy perspectives to 'risk' management is critiqued for positioning people as rational subjects who calculate risk probabilities and act upon them. Furthermore, this perspective fails to understand the lay person's construction and enactment of risk, their agenda and contextual needs when living with chronic illness. Of everyday relevance to lay people is the management of risk and uncertainty relating to social roles and obligations, the emotions involved when encountering the risk and uncertainty in chronic illness, and the challenges posed by social structural factors and social environments that have to be managed. Thus, clinical enactments of self-management policy would benefit from taking a more holistic view of patient need and seeking to avoid solely communicating lifestyle risk factors to be self-managed.

  4. 5 CFR 930.301 - Information systems security awareness training program.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... training in system/application life cycle management, risk management, and contingency planning. (4) Chief... security management, system/application life cycle management, risk management, and contingency planning..., risk management, and contingency planning. (b) Provide the Federal information systems security...

  5. 5 CFR 930.301 - Information systems security awareness training program.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... training in system/application life cycle management, risk management, and contingency planning. (4) Chief... security management, system/application life cycle management, risk management, and contingency planning..., risk management, and contingency planning. (b) Provide the Federal information systems security...

  6. Taking Risk Assessment and Management to the Next Level: Program-Level Risk Analysis to Enable Solid Decision-Making on Priorities and Funding

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nelson, J. G.; Morton, R. L.; Castillo, C.

    2011-02-01

    A multi-level (facility and programmatic) risk assessment was conducted for the facilities in the Nevada National Security Site (NNSS) Readiness in Technical Base and Facilities (RTBF) Program and results were included in a new Risk Management Plan (RMP), which was incorporated into the fiscal year (FY) 2010 Integrated Plans. Risks, risk events, probability, consequence(s), and mitigation strategies were identified and captured for most scope areas (i.e., risk categories) during the facilitated risk workshops. Risk mitigations (i.e., efforts in addition to existing controls) were identified during the facilitated risk workshops when the risk event was identified. Risk mitigation strategies fell into two broad categories: threats or opportunities. Improvement projects were identified and linked to specific risks they mitigate, making the connection of risk reduction through investments for the annual Site Execution Plan. Due to the amount of data that was collected, analysis to be performed, and reports to be generated, a Risk Assessment/Management Tool (RAMtool) database was developed to analyze the risks in real-time, at multiple levels, which reinforced the site-level risk management process and procedures. The RAMtool database was developed and designed to assist in the capturing and analysis of the key elements of risk: probability, consequence, and impact. The RAMtool calculates the facility-level and programmatic-level risk factors to enable a side-by-side comparison to see where the facility manager and program manager should focus their risk reduction efforts and funding. This enables them to make solid decisions on priorities and funding to maximize the risk reduction. A more active risk management process was developed where risks and opportunities are actively managed, monitored, and controlled by each facility more aggressively and frequently. Risk owners have the responsibility and accountability to manage their assigned risk in real-time, using the RAMtool database.
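
    The central RAMtool calculation described above (a risk factor formed from probability and consequence, rolled up to facility and program level for side-by-side comparison) can be sketched as follows. The scoring scales and example risks are illustrative assumptions, not the actual RAMtool data model.

        # Minimal sketch of a probability-times-consequence risk factor and a
        # facility/program roll-up (illustrative scales; not the RAMtool schema).
        from dataclasses import dataclass

        @dataclass
        class Risk:
            name: str
            facility: str
            probability: float   # likelihood on a 0-1 scale (assumed)
            consequence: float   # consequence score on a 1-10 scale (assumed)

            @property
            def risk_factor(self) -> float:
                return self.probability * self.consequence

        risks = [
            Risk("Aging ventilation system", "Facility A", 0.4, 8.0),
            Risk("Funding shortfall", "Facility A", 0.6, 5.0),
            Risk("Key staff attrition", "Facility B", 0.3, 6.0),
        ]

        # Facility-level roll-up: sum of risk factors per facility.
        facility_totals = {}
        for r in risks:
            facility_totals[r.facility] = facility_totals.get(r.facility, 0.0) + r.risk_factor

        # Programmatic-level roll-up: total across all facilities.
        program_total = sum(facility_totals.values())

        for fac, total in sorted(facility_totals.items(), key=lambda kv: -kv[1]):
            print(f"{fac}: facility risk factor = {total:.1f}")
        print(f"Program-level risk factor = {program_total:.1f}")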

  7. Key Points to Facilitate the Adoption of Computer-Based Assessments.

    PubMed

    Burr, S A; Chatterjee, A; Gibson, S; Coombes, L; Wilkinson, S

    2016-01-01

    There are strong pedagogical arguments in favor of adopting computer-based assessment. The risks of technical failure can be managed and are offset by improvements in cost-effectiveness and quality assurance capability. Academic, administrative, and technical leads at an appropriately senior level within an institution need to be identified, so that they can act as effective advocates. All stakeholder groups need to be represented in undertaking a detailed appraisal of requirements and shortlisting software based on core functionality, summative assessment life cycle needs, external compatibility, security, and usability. Any software that is a candidate for adoption should be trialed under simulated summative conditions, with all stakeholders having a voice in agreeing the optimum solution. Transfer to a new system should be carefully planned and communicated, with a programme of training established to maximize the success of adoption.

  8. Key Points to Facilitate the Adoption of Computer-Based Assessments

    PubMed Central

    Burr, S.A.; Chatterjee, A.; Gibson, S.; Coombes, L.; Wilkinson, S.

    2016-01-01

    There are strong pedagogical arguments in favor of adopting computer-based assessment. The risks of technical failure can be managed and are offset by improvements in cost-effectiveness and quality assurance capability. Academic, administrative, and technical leads at an appropriately senior level within an institution need to be identified, so that they can act as effective advocates. All stakeholder groups need to be represented in undertaking a detailed appraisal of requirements and shortlisting software based on core functionality, summative assessment life cycle needs, external compatibility, security, and usability. Any software that is a candidate for adoption should be trialed under simulated summative conditions, with all stakeholders having a voice in agreeing the optimum solution. Transfer to a new system should be carefully planned and communicated, with a programme of training established to maximize the success of adoption. PMID:29349322

  9. Tutorial: Parallel Computing of Simulation Models for Risk Analysis.

    PubMed

    Reilly, Allison C; Staid, Andrea; Gao, Michael; Guikema, Seth D

    2016-10-01

    Simulation models are widely used in risk analysis to study the effects of uncertainties on outcomes of interest in complex problems. Often, these models are computationally complex and time consuming to run. This latter point may be at odds with time-sensitive evaluations or may limit the number of parameters that are considered. In this article, we give an introductory tutorial focused on parallelizing simulation code to better leverage modern computing hardware, enabling risk analysts to better utilize simulation-based methods for quantifying uncertainty in practice. This article is aimed primarily at risk analysts who use simulation methods but do not yet utilize parallelization to decrease the computational burden of these models. The discussion is focused on conceptual aspects of embarrassingly parallel computer code and software considerations. Two complementary examples are shown using the languages MATLAB and R. A brief discussion of hardware considerations is located in the Appendix. © 2016 Society for Risk Analysis.
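
    A minimal Python stand-in for the embarrassingly parallel pattern the tutorial describes is shown below (the article's own examples use MATLAB and R); the toy Monte Carlo replicate is an assumption for illustration.

        # Embarrassingly parallel Monte Carlo sketch (Python stand-in for the
        # article's MATLAB/R examples; the toy loss model is illustrative only).
        import numpy as np
        from multiprocessing import Pool

        def run_replicate(seed: int) -> float:
            """One independent simulation replicate: here, a toy loss model."""
            rng = np.random.default_rng(seed)
            losses = rng.lognormal(mean=1.0, sigma=0.5, size=10_000)
            return float(losses.mean())

        if __name__ == "__main__":
            seeds = range(100)                      # independent replicates
            with Pool(processes=4) as pool:         # run replicates in parallel
                results = pool.map(run_replicate, seeds)
            print("mean of replicate means:", np.mean(results))
            print("95% interval:", np.percentile(results, [2.5, 97.5]))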

  10. 78 FR 59927 - Next Generation Risk Assessment: Incorporation of Recent Advances in Molecular, Computational...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-09-30

    ... Generation Risk Assessment: Incorporation of Recent Advances in Molecular, Computational, and Systems Biology..., Computational, and Systems Biology [External Review Draft]'' (EPA/600/R-13/214A). EPA is also announcing that... Advances in Molecular, Computational, and Systems Biology [External Review Draft]'' is available primarily...

  11. Applying the Heuristic to the Risk Assessment within the Automotive Industry Supply Chain

    NASA Astrophysics Data System (ADS)

    Marasova, Daniela; Andrejiova, Miriam; Grincova, Anna

    2017-03-01

    Risk management facilitates risk identification, evaluation, control, and, by means of an appropriate set of measures, risk reduction or complete elimination. Therefore, risk management becomes a strategic factor for a company's success. A properly implemented risk management system is not a tool to avoid risk; it is used to understand the risk and provide the basis for strategic decision-making. Risk management represents a key factor for supply chain operations. Managing risks is crucial for achieving customer satisfaction and thus a company's success. The subject-matter of the article is the assessment of the supply chain in the automobile industry in terms of risks. The topicality of this problem is even higher, as after the economic crisis it is necessary to re-evaluate the readiness of the supply chain for prospective risk conditions. One advantage of this article is the use of the Saaty method as a tool for risk management within the supply chain.
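
    As a minimal sketch of the Saaty method (the analytic hierarchy process) referred to above, the code below derives priority weights from a pairwise comparison matrix via its principal eigenvector and checks the consistency ratio; the example matrix and the three risk criteria are assumed for illustration.

        # Sketch of Saaty's AHP: priority weights from a pairwise comparison matrix
        # plus a consistency ratio check (example matrix is illustrative only).
        import numpy as np

        # Pairwise comparisons of three hypothetical supply-chain risk criteria
        # (e.g., supplier failure, logistics disruption, demand volatility).
        A = np.array([
            [1.0, 3.0, 5.0],
            [1/3, 1.0, 2.0],
            [1/5, 1/2, 1.0],
        ])

        eigvals, eigvecs = np.linalg.eig(A)
        k = np.argmax(eigvals.real)                 # principal eigenvalue
        weights = np.abs(eigvecs[:, k].real)
        weights /= weights.sum()                    # normalised priority vector

        n = A.shape[0]
        lambda_max = eigvals.real[k]
        ci = (lambda_max - n) / (n - 1)             # consistency index
        ri = {3: 0.58, 4: 0.90, 5: 1.12}[n]         # Saaty's random index
        cr = ci / ri                                # consistency ratio (< 0.10 is acceptable)

        print("priority weights:", np.round(weights, 3))
        print("consistency ratio: %.3f" % cr)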

  12. Risk preferences, probability weighting, and strategy tradeoffs in wildfire management

    Treesearch

    Michael S. Hand; Matthew J. Wibbenmeyer; Dave Calkin; Matthew P. Thompson

    2015-01-01

    Wildfires present a complex applied risk management environment, but relatively little attention has been paid to behavioral and cognitive responses to risk among public agency wildfire managers. This study investigates responses to risk, including probability weighting and risk aversion, in a wildfire management context using a survey-based experiment administered to...

  13. 12 CFR 1710.19 - Compliance and risk management programs; compliance with other laws.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 12 Banks and Banking 7 2010-01-01 2010-01-01 false Compliance and risk management programs... Practices and Procedures § 1710.19 Compliance and risk management programs; compliance with other laws. (a... management program. (1) An Enterprise shall establish and maintain a risk management program that is...

  14. [Application of risk grading and classification for occupational hazards in risk management for a shipbuilding project].

    PubMed

    Zeng, Wenfeng; Tan, Qiang; Wu, Shihua; Deng, Yingcong; Liu, Lifen; Wang, Zhi; Liu, Yimin

    2015-12-01

    To investigate the application of risk grading and classification for occupational hazards in risk management for a shipbuilding project. The risk management for this shipbuilding project was performed by a comprehensive application of MES evaluation, quality assessment of occupational health management, and risk grading and classification for occupational hazards, through the methods of occupational health survey, occupational health testing, and occupational health examinations. The results of MES evaluation showed that the risk of occupational hazards in this project was grade 3, which was considered a significant risk; the Q value calculated by quality assessment of occupational health management was 0.52, which was considered to be unqualified; the comprehensive evaluation with these two methods showed that the integrated risk rating for this shipbuilding project was class D, and follow-up and rectification were needed with a focus on improving health management. The application of MES evaluation and quality assessment of occupational health management in risk management for occupational hazards can achieve objective and reasonable conclusions and has good applicability.

  15. Utilising handheld computers to monitor and support patients receiving chemotherapy: results of a UK-based feasibility study.

    PubMed

    Kearney, N; Kidd, L; Miller, M; Sage, M; Khorrami, J; McGee, M; Cassidy, J; Niven, K; Gray, P

    2006-07-01

    Recent changes in cancer service provision mean that many patients spend a limited time in hospital and therefore experience and must cope with and manage treatment-related side effects at home. Information technology can provide innovative solutions in promoting patient care through information provision, enhancing communication, monitoring treatment-related side effects and promoting self-care. The aim of this feasibility study was to evaluate the acceptability of using handheld computers as a symptom assessment and management tool for patients receiving chemotherapy for cancer. A convenience sample of patients (n = 18) and health professionals (n = 9) at one Scottish cancer centre was recruited. Patients used the handheld computer to record and send daily symptom reports to the cancer centre and receive instant, tailored symptom management advice during two treatment cycles. Both patients' and health professionals' perceptions of the handheld computer system were evaluated at baseline and at the end of the project. Patients believed the handheld computer had improved their symptom management and felt comfortable in using it. The health professionals also found the handheld computer to be helpful in assessing and managing patients' symptoms. This project suggests that a handheld-computer-based symptom management tool is feasible and acceptable to both patients and health professionals in complementing the care of patients receiving chemotherapy.

  16. Integrated Risk Management Within NASA Programs/Projects

    NASA Technical Reports Server (NTRS)

    Connley, Warren; Rad, Adrian; Botzum, Stephen

    2004-01-01

    As NASA Project Risk Management activities continue to evolve, the need to successfully integrate risk management processes across the life cycle, between functional disciplines, stakeholders, various management policies, and within cost, schedule and performance requirements/constraints becomes more evident and important. Today's programs and projects are complex undertakings that include a myriad of processes, tools, techniques, management arrangements and other variables all of which must function together in order to achieve mission success. The perception and impact of risk may vary significantly among stakeholders and may influence decisions that may have unintended consequences on the project during a future phase of the life cycle. In these cases, risks may be unintentionally and/or arbitrarily transferred to others without the benefit of a comprehensive systemic risk assessment. Integrating risk across people, processes, and project requirements/constraints serves to enhance decisions, strengthen communication pathways, and reinforce the ability of the project team to identify and manage risks across the broad spectrum of project management responsibilities. The ability to identify risks in all areas of project management increases the likelihood that a project will identify significant issues before they become problems and allows projects to make effective and efficient use of shrinking resources. A total-team integrated risk effort, a disciplined and rigorous process, and an understanding of project requirements/constraints together provide the opportunity for more effective risk management. Applying an integrated approach to risk management makes it possible to do a better job at balancing safety, cost, schedule, operational performance and other elements of risk. This paper will examine how people, processes, and project requirements/constraints can be integrated across the project lifecycle for better risk management and ultimately improve the chances for mission success.

  17. Risk Assessment in the UK Health and Safety System: Theory and Practice.

    PubMed

    Russ, Karen

    2010-09-01

    In the UK, a person or organisation that creates risk is required to manage and control that risk so that it is reduced 'So Far As Is Reasonably Practicable' (SFAIRP). How the risk is managed is to be determined by those who create the risk. They have a duty to demonstrate that they have taken action to ensure all risk is reduced SFAIRP and must have documentary evidence, for example a risk assessment or safety case, to prove that they manage the risks their activities create. The UK Health and Safety Executive (HSE) does not tell organisations how to manage the risks they create but does inspect the quality of risk identification and management. This paper gives a brief overview of where responsibility for occupational health and safety lies in the UK, and how risk should be managed through risk assessment. The focus of the paper is three recent major UK incidents, all involving fatalities, and all of which were wholly avoidable if risks had been properly assessed and managed. The paper concludes with an analysis of the common failings of risk assessments and key actions for improvement.

  18. The Use of Major Risk Factors for Computer-Based Distinction of Diabetic Patients with Ischemic Stroke and Without Stroke

    DTIC Science & Technology

    2001-10-25

    Sibel Oge Merey ... highlighting the major risk factors of diabetic patients with non-embolic stroke and without stroke by performing dependency analysis and decision making ...

  19. Development of a change management system

    NASA Technical Reports Server (NTRS)

    Parks, Cathy Bonifas

    1993-01-01

    The complexity and interdependence of software on a computer system can create a situation where a solution to one problem causes failures in dependent software. In the computer industry, software problems arise and are often solved with 'quick and dirty' solutions. But in implementing these solutions, documentation about the solution or user notification of changes is often overlooked, and new problems are frequently introduced because of insufficient review or testing. These problems increase when numerous heterogeneous systems are involved. Because of this situation, a change management system plays an integral part in the maintenance of any multisystem computing environment. At the NASA Ames Advanced Computational Facility (ACF), the Online Change Management System (OCMS) was designed and developed to manage the changes being applied to its multivendor computing environment. This paper documents the research, design, and modifications that went into the development of this change management system (CMS).

  20. A secure file manager for UNIX

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    DeVries, R.G.

    1990-12-31

    The development of a secure file management system for a UNIX-based computer facility with supercomputers and workstations is described. Specifically, UNIX in its usual form does not address: (1) Operation which would satisfy rigorous security requirements. (2) Online space management in an environment where total data demands would be many times the actual online capacity. (3) Making the file management system part of a computer network in which users of any computer in the local network could retrieve data generated on any other computer in the network. The characteristics of UNIX can be exploited to develop a portable, secure file manager which would operate on computer systems ranging from workstations to supercomputers. Implementation considerations making unusual use of UNIX features, rather than requiring extensive internal system changes, are described, and implementation using the Cray Research Inc. UNICOS operating system is outlined.

  1. Audio computer-assisted self interview compared to traditional interview in an HIV-related behavioral survey in Vietnam.

    PubMed

    Le, Linh Cu; Vu, Lan T H

    2012-10-01

    Globally, population surveys on HIV/AIDS and other sensitive topics have been using audio computer-assisted self interview for many years. This interview technique, however, is still new to Vietnam and little is known about its application and impact in general population surveys. One plausible hypothesis is that residents of Vietnam interviewed using this technique may provide a higher response rate and be more willing to reveal their true behaviors than if interviewed with traditional methods. This study aims to compare audio computer-assisted self interview with traditional face-to-face personal interview and self-administered interview with regard to rates of refusal and affirmative responses to questions on sensitive topics related to HIV/AIDS. In June 2010, a randomized study was conducted in three cities (Ha Noi, Da Nang and Can Tho), using a sample of 4049 residents aged 15 to 49 years. Respondents were randomly assigned to one of three interviewing methods: audio computer-assisted self interview, personal face-to-face interview, and self-administered paper interview. Instead of providing answers directly to interviewer questions as with traditional methods, audio computer-assisted self-interview respondents read the questions displayed on a laptop screen, while listening to the questions through audio headphones, then entered responses using a laptop keyboard. A MySQL database was used for data management and the SPSS statistical package version 18 was used for data analysis with bivariate and multivariate statistical techniques. Rates of high-risk behaviors and mean values of continuous variables were compared for the three data collection methods. Audio computer-assisted self interview showed advantages over comparison techniques, achieving lower refusal rates and reporting higher prevalence of some sensitive and risk behaviors (perhaps an indication of more truthful answers). Premarital sex was reported by 20.4% in the audio computer-assisted self-interview survey group, versus 11.4% in the face-to-face group and 11.1% in the self-administered paper questionnaire group. The pattern was consistent for both male and female respondents and in both urban and rural settings. Men in the audio computer-assisted self-interview group also reported higher levels of high-risk sexual behavior--such as sex with sex workers and a higher average number of sexual partners--than did women in the same group. Importantly, item refusal rates on sensitive topics tended to be lower with audio computer-assisted self interview than with the other two methods. Combined with existing data from other countries and previous studies in Vietnam, these findings suggest that researchers should consider using audio computer-assisted self interview for future studies of sensitive and stigmatized topics, especially for men.

  2. Rock Slide Risk Assessment: A Semi-Quantitative Approach

    NASA Astrophysics Data System (ADS)

    Duzgun, H. S. B.

    2009-04-01

    Rock slides can be better managed by systematic risk assessments. Any risk assessment methodology for rock slides involves identification of rock slide risk components, which are hazard, elements at risk and vulnerability. For a quantitative/semi-quantitative risk assessment for rock slides, a mathematical value of the risk has to be computed and evaluated. The quantitative evaluation of risk for rock slides enables comparison of the computed risk with the risk of other natural and/or human-made hazards and provides better decision support and easier communication for the decision makers. A quantitative/semi-quantitative risk assessment procedure involves: Danger Identification, Hazard Assessment, Elements at Risk Identification, Vulnerability Assessment, Risk Computation, and Risk Evaluation. On the other hand, the steps of this procedure require adaptation of existing or development of new implementation methods depending on the type of landslide, data availability, investigation scale and nature of consequences. In this study, a generic semi-quantitative risk assessment (SQRA) procedure for rock slides is proposed. The procedure has five consecutive stages: data collection and analyses, hazard assessment, analyses of elements at risk and vulnerability, and risk assessment. The implementation of the procedure for a single rock slide case is illustrated for a rock slope in Norway. Rock slides from mountain Ramnefjell to lake Loen are considered to be one of the major geohazards in Norway. Lake Loen is located in the inner part of Nordfjord in Western Norway. Ramnefjell Mountain is heavily jointed, leading to the formation of vertical rock slices with height between 400-450 m and width between 7-10 m. These slices threaten the settlements around Loen Valley and tourists visiting the fjord during the summer season, as released slides have the potential of creating a tsunami. In the past, several rock slides were recorded from Ramnefjell Mountain between 1905 and 1950. Among them, four of the slides caused formation of tsunami waves which washed up to 74 m above the lake level. Two of the slides resulted in many fatalities in the inner part of the Loen Valley as well as great damages. There are three predominant joint structures in Ramnefjell Mountain, which control failure and the geometry of the slides. The first joint set is a foliation plane striking northeast-southwest and dipping 35°-40° to the east-southeast. The second and the third joint sets are almost perpendicular and parallel to the mountain side and scarp, respectively. These three joint sets form slices of rock columns with width ranging between 7-10 m and height of 400-450 m. It is stated that the joints in set II are opened between 1-2 m, which may bring about collection of water during heavy rainfall or snow melt, causing the slices to be pressed out. It is estimated that water in the vertical joints both reduces the shear strength of the sliding plane and causes reduction of normal stress on the sliding plane due to the formation of an uplift force. Hence, rock slides on Ramnefjell Mountain occur in a plane failure mode. The quantitative evaluation of rock slide risk requires probabilistic analysis of rock slope stability and identification of consequences if the rock slide occurs. In this study, the failure probability of a rock slice is evaluated by the first-order reliability method (FORM). Then, in order to use the calculated probability of failure (Pf) in risk analyses, this Pf must be associated with frequency-based probabilities (i.e., Pf per year), since the computed failure probability is a measure of hazard and not a measure of risk unless it is associated with the consequences of the failure. This can be done by either considering the time-dependent behavior of the basic variables in the probabilistic models or associating the computed Pf with the frequency of failures in the region. In this study, the frequency of rock slides in Ramnefjell during the previous century is used to evaluate the frequency-based probability used in the risk assessment. The major consequence of a rock slide is the generation of a tsunami in Lake Loen, causing inundation of residential areas around the lake. Risk is assessed by adapting the damage probability matrix approach, which was originally developed for earthquake risk assessment of buildings.
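
    The failure-probability step described above can be illustrated with a simplified mean-value first-order approximation rather than a full Hasofer-Lind FORM analysis; the limit-state function, distributions, and parameter values below are assumptions for illustration and are not those of the Ramnefjell slices.

        # Simplified first-order (mean-value) estimate of the probability of plane
        # failure for a rock slice, paired with an assumed slide frequency to give
        # a frequency-based risk number. All values are illustrative assumptions.
        import numpy as np
        from scipy.stats import norm

        # Assumed deterministic geometry/loads for one slice: weight W (N),
        # sliding plane dip alpha (rad), sliding area (m^2), uplift force U (N).
        W, alpha, area, U = 5.0e8, np.radians(38.0), 4000.0, 5.0e7

        # Assumed random variables: cohesion c (Pa) and friction coefficient tan(phi).
        mu = np.array([5.0e4, np.tan(np.radians(35.0))])   # means of [c, tan_phi]
        sd = np.array([1.5e4, 0.08])                       # standard deviations

        def g(x):
            """Limit state: resisting force minus driving force (failure if g < 0)."""
            c, tan_phi = x
            return c * area + (W * np.cos(alpha) - U) * tan_phi - W * np.sin(alpha)

        # Mean-value first-order approximation: linearise g at the means.
        eps = mu * 1e-4
        grad = np.array([(g(mu + e) - g(mu - e)) / (2 * e[i])
                         for i, e in enumerate(np.diag(eps))])
        beta = g(mu) / np.sqrt(np.sum((grad * sd) ** 2))   # reliability index
        pf = norm.cdf(-beta)                               # probability of failure

        # Assumed triggering frequency (events per year) times Pf times consequence.
        annual_rate = 4 / 100.0
        consequence = 50.0                                 # assumed consequence units
        print(f"beta = {beta:.2f}, Pf = {pf:.3e}")
        print(f"frequency-based risk ~ {annual_rate * pf * consequence:.3e} per year")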

  3. Propulsion System Modeling and Simulation

    NASA Technical Reports Server (NTRS)

    Tai, Jimmy C. M.; McClure, Erin K.; Mavris, Dimitri N.; Burg, Cecile

    2002-01-01

    The Aerospace Systems Design Laboratory at the School of Aerospace Engineering at the Georgia Institute of Technology has developed a core competency that enables propulsion technology managers to make technology investment decisions substantiated by propulsion and airframe technology system studies. This method assists the designer/manager in selecting appropriate technology concepts while accounting for the presence of risk and uncertainty as well as interactions between disciplines. This capability is incorporated into a single design simulation system that is described in this paper. This propulsion system design environment is created with commercially available software called iSIGHT, which is a generic computational framework, and with analysis programs for engine cycle, engine flowpath, mission, and economic analyses. iSIGHT is used to integrate these analysis tools within a single computer platform and facilitate information transfer amongst the various codes. The resulting modeling and simulation (M&S) environment in conjunction with the response surface method provides the designer/decision-maker with an analytical means to examine the entire design space from either a subsystem and/or system perspective. The results of this paper will enable managers to analytically play what-if games to gain insight into the benefits (and/or degradation) of changing engine cycle design parameters. Furthermore, the propulsion design space will be explored probabilistically to show the feasibility and viability of the propulsion system integrated with a vehicle.

  4. Seismic risk management solution for nuclear power plants

    DOE PAGES

    Coleman, Justin; Sabharwall, Piyush

    2014-12-01

    Nuclear power plants should safely operate during normal operations and maintain core-cooling capabilities during off-normal events, including external hazards (such as flooding and earthquakes). Management of external hazards to acceptable levels of risk is critical to maintaining nuclear facility and nuclear power plant safety. Seismic risk is determined by convolving the seismic hazard with seismic fragilities (capacity of systems, structures, and components). Seismic isolation (SI) is one protective measure showing promise to minimize seismic risk. Current SI designs (used in commercial industry) reduce horizontal earthquake loads and protect critical infrastructure from the potentially destructive effects of large earthquakes. The benefit of SI application in the nuclear industry is being recognized and SI systems have been proposed in American Society of Civil Engineers Standard 4 (ASCE-4), to be released in the winter of 2014, for light water reactor facilities using commercially available technology. The intent of ASCE-4 is to provide criteria for seismic analysis of safety-related nuclear structures such that the responses to design basis seismic events, computed in accordance with this standard, will have a small likelihood of being exceeded. The U.S. nuclear industry has not implemented SI to date; a seismic isolation gap analysis meeting was convened on August 19, 2014, to determine progress on implementing SI in the U.S. nuclear industry. The meeting focused on the systems and components that could benefit from SI. As a result, this article highlights the gaps identified at this meeting.
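
    The convolution of a seismic hazard curve with component fragilities mentioned above can be sketched numerically as follows; the hazard and fragility parameters are assumed placeholder values, not plant-specific data.

        # Numerical sketch of seismic risk as the convolution of a hazard curve with
        # a fragility curve (all curve parameters are assumed placeholder values).
        import numpy as np
        from scipy.stats import lognorm

        # Ground-motion grid (peak ground acceleration, g).
        a = np.linspace(0.01, 3.0, 1000)

        # Assumed hazard curve: annual frequency of exceeding acceleration a.
        H = 1e-3 * (a / 0.1) ** -2.0

        # Assumed fragility: lognormal, median capacity 0.9 g, log-standard deviation 0.4.
        fragility = lognorm(s=0.4, scale=0.9).cdf(a)

        # Annual failure frequency = integral of fragility times |dH/da| da.
        dH_da = np.gradient(H, a)
        annual_failure_freq = np.trapz(fragility * np.abs(dH_da), a)
        print(f"annual failure frequency ~ {annual_failure_freq:.2e} per year")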

  5. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Crolley, R.; Thompson, M.

    There has been a need for a faster and cheaper deployment model for information technology (IT) solutions to address waste management needs at US Department of Energy (DOE) complex sites for years. Budget constraints, challenges in deploying new technologies, frequent travel, and increased job demands for existing employees have prevented IT organizations from staying abreast of new technologies or deploying them quickly. Despite such challenges, IT organizations have added significant value to waste management handling through better worker safety, tracking, characterization, and disposition at DOE complex sites. Systems developed for site-specific missions have broad applicability to waste management challenges and in many cases have been expanded to meet other waste missions. Radio frequency identification (RFID) and global positioning satellite (GPS)-enabled solutions have reduced the risk of radiation exposure and safety risks. New web-based and mobile applications have enabled precision characterization and control of nuclear materials. These solutions have also improved operational efficiencies and shortened schedules, reduced cost, and improved regulatory compliance. Collaboration between US Department of Energy (DOE) complex sites is improving time to delivery and cost efficiencies for waste management missions with new information technologies (IT) such as wireless computing, global positioning satellite (GPS), and radio frequency identification (RFID). Integrated solutions developed at separate DOE complex sites by new technology Centers of Excellence (CoE) have increased material control and accountability, worker safety, and environmental sustainability. CoEs offer other DOE sister sites significant cost and time savings by leveraging their technology expertise in project scoping, implementation, and ongoing operations.

  6. Reducing regional vulnerabilities and multi-city robustness conflicts using many-objective optimization under deep uncertainty

    NASA Astrophysics Data System (ADS)

    Reed, Patrick; Trindade, Bernardo; Jonathan, Herman; Harrison, Zeff; Gregory, Characklis

    2016-04-01

    Emerging water scarcity concerns in southeastern US are associated with several deeply uncertain factors, including rapid population growth, limited coordination across adjacent municipalities and the increasing risks for sustained regional droughts. Managing these uncertainties will require that regional water utilities identify regionally coordinated, scarcity-mitigating strategies that trigger the appropriate actions needed to avoid water shortages and financial instabilities. This research focuses on the Research Triangle area of North Carolina, seeking to engage the water utilities within Raleigh, Durham, Cary and Chapel Hill in cooperative and robust regional water portfolio planning. Prior analysis of this region through the year 2025 has identified significant regional vulnerabilities to volumetric shortfalls and financial losses. Moreover, efforts to maximize the individual robustness of any of the mentioned utilities also have the potential to strongly degrade the robustness of the others. This research advances a multi-stakeholder Many-Objective Robust Decision Making (MORDM) framework to better account for deeply uncertain factors when identifying cooperative management strategies. Results show that the sampling of deeply uncertain factors in the computational search phase of MORDM can aid in the discovery of management actions that substantially improve the robustness of individual utilities as well as the overall region to water scarcity. Cooperative water transfers, financial risk mitigation tools, and coordinated regional demand management must be explored jointly to decrease robustness conflicts between the utilities. The insights from this work have general merit for regions where adjacent municipalities can benefit from cooperative regional water portfolio planning.

  7. Reducing regional vulnerabilities and multi-city robustness conflicts using many-objective optimization under deep uncertainty

    NASA Astrophysics Data System (ADS)

    Trindade, B. C.; Reed, P. M.; Herman, J. D.; Zeff, H. B.; Characklis, G. W.

    2015-12-01

    Emerging water scarcity concerns in southeastern US are associated with several deeply uncertain factors, including rapid population growth, limited coordination across adjacent municipalities and the increasing risks for sustained regional droughts. Managing these uncertainties will require that regional water utilities identify regionally coordinated, scarcity-mitigating strategies that trigger the appropriate actions needed to avoid water shortages and financial instabilities. This research focuses on the Research Triangle area of North Carolina, seeking to engage the water utilities within Raleigh, Durham, Cary and Chapel Hill in cooperative and robust regional water portfolio planning. Prior analysis of this region through the year 2025 has identified significant regional vulnerabilities to volumetric shortfalls and financial losses. Moreover, efforts to maximize the individual robustness of any of the mentioned utilities also have the potential to strongly degrade the robustness of the others. This research advances a multi-stakeholder Many-Objective Robust Decision Making (MORDM) framework to better account for deeply uncertain factors when identifying cooperative management strategies. Results show that the sampling of deeply uncertain factors in the computational search phase of MORDM can aid in the discovery of management actions that substantially improve the robustness of individual utilities as well as of the overall region to water scarcity. Cooperative water transfers, financial risk mitigation tools, and coordinated regional demand management should be explored jointly to decrease robustness conflicts between the utilities. The insights from this work have general merit for regions where adjacent municipalities can benefit from cooperative regional water portfolio planning.

  8. The positive impact of radiologic imaging on high-stage cutaneous squamous cell carcinoma management.

    PubMed

    Ruiz, Emily Stamell; Karia, Pritesh S; Morgan, Frederick C; Schmults, Chrysalyne D

    2017-02-01

    There is limited evidence on the utility of radiologic imaging for prognostic staging of cutaneous squamous cell carcinoma (CSCC). We reviewed utilization of radiologic imaging of high-stage CSCCs to evaluate whether imaging impacted management and outcomes. Tumors classified as Brigham and Women's Hospital (BWH) tumor (T) stage T2B or T3 over a 13-year period were reviewed to identify whether imaging was performed and whether results affected treatment. Disease-related outcomes (DRO: local recurrence, nodal metastasis, death from disease) were compared between patients by type of imaging used. A total of 108 high-stage CSCCs in 98 patients were included. Imaging (mostly computed tomography, 79%) was utilized in 45 (46%) patients and management was altered in 16 (33%) patients who underwent imaging. Patients who received no imaging were at higher risk of developing nodal metastases (nonimaging, 30%; imaging, 13%; P = .041) and any DRO (nonimaging, 42%; imaging, 20%; P = .028) compared to the imaging group. Imaging was associated with a lower risk for DRO (subhazard ratio, 0.5; 95% CI 0.2-0.9; P = .046) adjusted for BWH T stage, sex, and location. Limitations include the single-institution retrospective design and changes in technology over time. Radiologic imaging of high-stage CSCC may influence management and appears to positively impact outcomes. Further prospective studies are needed to establish which patients benefit from imaging. Copyright © 2016 American Academy of Dermatology, Inc. Published by Elsevier Inc. All rights reserved.

  9. Domesticating the Personal Computer: The Mainstreaming of a New Technology and the Cultural Management of a Widespread Technophobia, 1964-.

    ERIC Educational Resources Information Center

    Reed, Lori

    2000-01-01

    Uses discourses on "computer-phobia" and "computer addiction" to describe the cultural work involved and marketing strategies used between 1960s-1990s regarding management of computer fear. Draws on popular discourses, advertisements, and advice literature to explore how the personal computer was successfully connected to middle-class family…

  10. Computers Help Technicians Become Managers.

    ERIC Educational Resources Information Center

    Instructional Innovator, 1984

    1984-01-01

    Briefly describes the Academy of Advanced Traffic's use of the Numerax electronic tariff library in financial management, business logistics management, and warehousing courses to familiarize future traffic managers with time saving computer-based information systems that will free them to become integral members of their company's decision-making…

  11. Managing Impression Formation in Computer-Mediated Communication.

    ERIC Educational Resources Information Center

    Liu, Yuliang; Ginther, Dean

    2001-01-01

    Offers suggestions for online instructors regarding verbal and nonverbal impression management. The recommendations should facilitate computer mediated teacher-student or manager-client interactions and help develop constructive relationships that promote learning and productivity. (EV)

  12. Searching for a business case for quality in Medicaid managed care.

    PubMed

    Greene, Sandra B; Reiter, Kristin L; Kilpatrick, Kerry E; Leatherman, Sheila; Somers, Stephen A; Hamblin, Allison

    2008-01-01

    Despite the prevalence of evidence-based interventions to improve quality in health care systems, there is a paucity of documented evidence of a financial return on investment (ROI) for these interventions from the perspective of the investing entity. To report on a demonstration project designed to measure the business case for selected quality interventions in high-risk, high-cost patient populations in 10 Medicaid managed care organizations across the United States. Using claims and enrollment data gathered over a 3-year period and data on the costs of designing, implementing, and operating the interventions, ROIs were computed for 11 discrete evidence-based quality-enhancing interventions. A complex case management program to treat adults with multiple comorbidities achieved the largest ROI of 12.21:1. This was followed by an ROI of 6.35:1 for a program that treated children with asthma who had a history of high emergency room (ER) use and/or inpatient admissions for their disease. An intervention for high-risk pregnant mothers produced a 1.26:1 ROI, and a program for adult patients with diabetes resulted in a 1.16:1 return. The remaining seven interventions failed to show positive returns, although four sites came close to realizing sufficient savings to offset investment costs. Evidence-based interventions designed to improve the quality of patient care may have the best opportunity to yield a positive financial return if they are focused on high-risk, high-cost populations and conditions associated with avoidable emergency and inpatient utilization. Developing the necessary tracking systems for the claims and financial investments is critical to perform accurate financial ROI analyses.
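
    The reported ratios follow from simple arithmetic: savings attributable to an intervention divided by the cost of designing, implementing, and operating it. A minimal sketch with made-up figures is shown below.

        # Return-on-investment ratio for a quality intervention: savings generated
        # divided by the investment cost (figures below are made up for illustration).
        def roi(savings: float, investment_cost: float) -> float:
            return savings / investment_cost

        # Hypothetical example: a case-management program that avoids $610,500 in
        # emergency and inpatient costs against a $50,000 program cost.
        print(f"ROI = {roi(610_500, 50_000):.2f}:1")   # -> 12.21:1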

  13. A new interactive computer simulation system for violence risk assessment of mentally disordered violent offenders.

    PubMed

    Arborelius, Lotta; Fors, Uno; Svensson, Anna-Karin; Sygel, Kristina; Kristiansson, Marianne

    2013-02-01

    Assessment of risk of future violence has developed from reliance on static indicators towards a more dynamic approach. In the latter context, however, the offender is seldom confronted with real life situations. The aim of this study is to evaluate a computer-based system--Reactions on Display, which presents human interactions based on real-life situations--for its effectiveness in distinguishing between potentially violent offenders with mental disorder and a healthy comparison group. Male offenders with autism spectrum disorders or psychosis were recruited from specialist forensic psychiatric units in Sweden and healthy participants from the local communities. Each consenting participant was presented with film clips of a man in neutral and violent situations, which at critical moments stopped the story to ask him to predict the thoughts, feelings and actions of the actor. Offender patients, irrespective of diagnosis, detected fewer emotional reactions in the actor in the non-violent sequence compared with controls. When asked to choose one of four violent actions, the offender patients chose more violent actions than did the controls. They also reported fewer physical reactions in the actors when actors were being violent. There were also some examples of incongruent or deviant responses by some individual patients. The use of interactive computer simulation techniques is not only generally acceptable to offender patients, but it also helps to differentiate their current response style to particular circumstances from that of healthy controls in a way that does not rely on their verbal abilities and may tap more effectively into their emotional reactions than standard verbal questions and answer approaches. This may pave the way for Reactions on Display providing a useful complement to traditional risk assessment, and a training route with respect to learning more empathic responding, thus having a role in aiding risk management. Copyright © 2013 John Wiley & Sons, Ltd.

  14. AEGIS: a wildfire prevention and management information system

    NASA Astrophysics Data System (ADS)

    Kalabokidis, K.; Ager, A.; Finney, M.; Athanasis, N.; Palaiologou, P.; Vasilakos, C.

    2015-10-01

    A Web-GIS wildfire prevention and management platform (AEGIS) was developed as an integrated and easy-to-use decision support tool (http://aegis.aegean.gr). The AEGIS platform assists with early fire warning, fire planning, fire control and coordination of firefighting forces by providing access to information that is essential for wildfire management. Databases were created with spatial and non-spatial data to support key system functionalities. Updated land use/land cover maps were produced by combining field inventory data with high-resolution multispectral satellite images (RapidEye) to be used as inputs in fire propagation modeling with the Minimum Travel Time algorithm. End users provide a minimum number of inputs such as fire duration, ignition point and weather information to conduct a fire simulation. AEGIS offers three types of simulations, i.e. single-fire propagation, conditional burn probabilities, and landscape-level simulations, similar to the FlamMap fire behavior modeling software. Artificial neural networks (ANN) were utilized for wildfire ignition risk assessment based on various parameters, training methods, activation functions, pre-processing methods and network structures. The combination of ANNs and expected burned area maps produced an integrated output map for fire danger prediction. The system also incorporates weather measurements from remote automatic weather stations and weather forecast maps. The structure of the algorithms relies on parallel processing techniques (i.e. High Performance Computing and Cloud Computing) that ensure computational power and speed. All AEGIS functionalities are accessible to authorized end users through a web-based graphical user interface. An innovative mobile application, AEGIS App, acts as a complementary tool to the web-based version of the system.
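
    The ANN-based ignition risk assessment mentioned above can be sketched with a small feed-forward classifier; the synthetic data, feature names, and network structure below are assumptions for illustration and do not reproduce the AEGIS models.

        # Sketch of a feed-forward neural network for fire ignition risk
        # (synthetic data and feature choices are illustrative assumptions).
        import numpy as np
        from sklearn.neural_network import MLPClassifier
        from sklearn.model_selection import train_test_split
        from sklearn.preprocessing import StandardScaler
        from sklearn.pipeline import make_pipeline

        rng = np.random.default_rng(1)
        n = 2000
        # Hypothetical predictors: temperature, relative humidity, wind speed, fuel moisture.
        X = np.column_stack([
            rng.normal(25, 8, n), rng.uniform(10, 90, n),
            rng.gamma(2, 3, n), rng.uniform(5, 40, n),
        ])
        logit = 0.08 * X[:, 0] - 0.05 * X[:, 1] + 0.15 * X[:, 2] - 0.10 * X[:, 3]
        y = (rng.random(n) < 1 / (1 + np.exp(-logit))).astype(int)   # 1 = ignition

        X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
        model = make_pipeline(StandardScaler(),
                              MLPClassifier(hidden_layer_sizes=(16, 8), max_iter=1000,
                                            random_state=0))
        model.fit(X_tr, y_tr)
        print("ignition risk (probability) for first 5 test cells:")
        print(np.round(model.predict_proba(X_te[:5])[:, 1], 3))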

  15. Prognocean Plus: the Science-Oriented Sea Level Prediction System as a Tool for Public Stakeholders

    NASA Astrophysics Data System (ADS)

    Świerczyńska, M. G.; Miziński, B.; Niedzielski, T.

    2015-12-01

    The novel real-time system for sea level prediction, known as Prognocean Plus, has been developed as a new-generation service available through the Polish supercomputing grid infrastructure. The researchers can access the service at https://prognocean.plgrid.pl/. Although the system is science-oriented, we wish to discuss herein its potential to enhance ocean management studies carried out routinely by public stakeholders. The system produces the short- and medium-term predictions of global altimetric gridded Sea Level Anomaly (SLA) time series, updated daily. The spatial resolution of the SLA forecasts is 1/4° x 1/4°, while the temporal resolution of prognoses is equal to 1 day. The system computes the predictions of time-variable ocean topography using five data-based models, which are not computationally demanding, enabling us to compare their skillfulness with respect to physically-based approaches commonly used by different sea level prediction systems. However, the aim of the system is not only to compute the predictions for science purposes, but primarily to build a user-oriented platform that serves the prognoses and their statistics to a broader community. Thus, we deliver the SLA forecasts as a rapid service available online. In order to provide potential users with access to the science results, the Web Map Service (WMS) for Prognocean Plus was designed. We regularly publish the forecasts, both in the interactive graphical WMS service, available from the browser, as well as through the Web Coverage Service (WCS) standard. The Prognocean Plus system, as an early-response system, may be interesting for public stakeholders. It may be used for marine navigation as well as for climate risk management (delineating areas vulnerable to local sea level rise), marine management (advice offered for offshore activities) and coastal management (early warnings against coastal flooding).

  16. A free and open source QGIS plugin for flood risk analysis: FloodRisk

    NASA Astrophysics Data System (ADS)

    Albano, Raffaele; Sole, Aurelia; Mancusi, Leonardo

    2016-04-01

    An analysis of global statistics shows a substantial increase in flood damage over the past few decades. Moreover, it is expected that flood risk will continue to rise due to the combined effect of increasing numbers of people and economic assets in risk-prone areas and the effects of climate change. In order to increase the resilience of European economies and societies, the improvement of risk assessment and management has been pursued in recent years. This results in a wide range of flood analysis models of different complexities with substantial differences in underlying components needed for their implementation, as geographical, hydrological and social differences demand specific approaches in the different countries. At present, there is an emerging need to promote the creation of open, transparent, reliable and extensible tools for a comprehensive, context-specific and applicable flood risk analysis. In this context, the free and open-source Quantum GIS (QGIS) plugin "FloodRisk" is a good starting point to address this objective. The vision of the developers of this free and open source software (FOSS) is to combine the main features of state-of-the-art science, collaboration, transparency and interoperability in an initiative to assess and communicate flood risk worldwide and to assist authorities to facilitate the quality and fairness of flood risk management at multiple scales. Among the scientific community, this type of activity can be labelled as "participatory research", intended as adopting a set of techniques that "are interactive and collaborative" and reproducible, "providing a meaningful research experience that both promotes learning and generates knowledge and research data through a process of guided discovery" (Albano et al., 2015). Moreover, this FOSS geospatial approach can lower the financial barriers to understanding risks at national and sub-national levels across a spatio-temporal domain, provide better and more complete information, and generate knowledge among stakeholders for improving flood risk management. In particular, FloodRisk comprises a set of calculators capable of computing human or economic losses for a collection of assets, caused by a given scenario event, explicitly covering mitigation and adaptation measures (Mancusi et al., 2015). It is important to mention that, although some models in the literature incorporate calculator philosophies identical to the ones implemented in the FloodRisk engine, their implementations may vary significantly in aspects such as a user-friendly and intuitive user interface, the capability of running the calculations on any platform (Windows, Mac, Linux, etc.), and the ability to promote extensibility, efficient testability, and scientific operability. FloodRisk has been designed as an initiative for implementing a standard and harmonized procedure to determine flood impacts. Albano, R.; Mancusi, L.; Sole, A.; Adamowski, J. Collaborative Strategies for Sustainable EU Flood Risk Management: FOSS and Geospatial Tools - Challenges and Opportunities for Operative Risk Analysis. ISPRS Int. J. Geo-Inf. 2015, 4, 2704-2727. Mancusi, L.; Albano, R.; Sole, A. FloodRisk: a QGIS plugin for flood consequences estimation. In: Geomatics Workbooks n°12 - FOSS4G Europe, Como, 2015.
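
    A loss calculator of the kind described above typically combines a flood depth at each asset with a depth-damage curve and the asset's value; the sketch below illustrates that idea with an assumed piecewise-linear curve and made-up assets, and is not the FloodRisk implementation.

        # Sketch of a depth-damage economic loss calculation for a set of assets
        # (assumed depth-damage curve and made-up assets; not the FloodRisk code).
        import numpy as np

        # Assumed depth-damage curve: flood depth (m) -> damage fraction of asset value.
        depths_m  = np.array([0.0, 0.5, 1.0, 2.0, 4.0])
        fractions = np.array([0.0, 0.15, 0.35, 0.65, 0.95])

        def damage_fraction(depth: float) -> float:
            return float(np.interp(depth, depths_m, fractions))

        # Made-up assets for one flood scenario: (name, value in EUR, flood depth in m).
        assets = [("house_1", 180_000, 0.8), ("shop_2", 350_000, 1.6), ("barn_3", 60_000, 0.2)]

        total_loss = 0.0
        for name, value, depth in assets:
            loss = value * damage_fraction(depth)
            total_loss += loss
            print(f"{name}: depth {depth:.1f} m -> loss {loss:,.0f} EUR")
        print(f"scenario total: {total_loss:,.0f} EUR")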

  17. Roadmap to risk evaluation and mitigation strategies (REMS) success

    PubMed Central

    Balian, John D.; Malhotra, Rachpal; Perentesis, Valerie

    2010-01-01

    Medical safety-related risk management is a rapidly evolving and increasingly important aspect of drug approval and market longevity. To effectively meet the challenges of this new era, we describe a risk management roadmap that proactively yet practically anticipates risk-management requirements, provides the foundation for enduring yet appropriately flexible risk-management practices, and leverages these techniques to efficiently and effectively utilize risk evaluation and mitigation strategies (REMS)/risk minimization programs as market access enablers. This fully integrated risk-management paradigm creates exciting opportunities for newer tools, techniques, and approaches to more successfully optimize product development, approval, and commercialization, with patients as the ultimate beneficiaries. PMID:25083193

  18. USING BIOASSAYS TO EVALUATE THE PERFORMANCE OF EDC RISK MANAGEMENT METHODS

    EPA Science Inventory

    In Superfund risk management research, the performance of risk management techniques is typically evaluated by measuring the concentrations of the chemicals of concern before and after risk management efforts. However, using bioassays and chemical data provides a more robust und...

  19. A computer-oriented system for assembling and displaying land management information

    Treesearch

    Elliot L. Amidon

    1964-01-01

    Maps contain information basic to land management planning. By transforming conventional map symbols into numbers which are punched into cards, the land manager can have a computer assemble and display information required for a specific job. He can let a computer select information from several maps, combine it with such nonmap data as treatment cost or benefit per...

  20. Impact of cardiac hybrid imaging-guided patient management on clinical long-term outcome.

    PubMed

    Benz, Dominik C; Gaemperli, Lara; Gräni, Christoph; von Felten, Elia; Giannopoulos, Andreas A; Messerli, Michael; Buechel, Ronny R; Gaemperli, Oliver; Pazhenkottil, Aju P; Kaufmann, Philipp A

    2018-06-15

    Although randomized trials have provided evidence for invasive fractional flow reserve to guide revascularization, evidence for non-invasive imaging is less well established. The present study investigated whether hybrid coronary computed tomography angiography (CCTA)/single photon emission computed tomography (SPECT) myocardial perfusion imaging (MPI) can identify patients who benefit from early revascularization compared to medical therapy. This retrospective study consists of 414 patients referred for evaluation of known or suspected coronary artery disease (CAD) with CCTA/SPECT hybrid imaging. CCTA categorized patients into no CAD, non-high-risk CAD and high-risk CAD. In patients with CAD (n = 329), a matched finding (n = 75) was defined as a reversible perfusion defect in a territory subtended by a coronary artery with CAD. All other combinations of pathologic findings were classified as unmatched (n = 254). Death, myocardial infarction, unstable angina requiring hospitalization, and late coronary revascularization were defined as major adverse cardiac events (MACE). Cox proportional hazards models included the covariates age, male gender, more than two risk factors, previous CABG, high-risk CAD and early revascularization. During median follow-up of 6.0 years, 112 patients experienced a MACE (27%). Early revascularization (n = 50) was independently associated with improved outcome among patients with a matched finding (p < 0.001). There was no benefit among patients with an unmatched finding (p = 0.787), irrespective of presence (p = 0.505) or absence of high-risk CAD (p = 0.631). Early revascularization is associated with an outcome benefit in CAD patients with a matched finding documented by cardiac hybrid imaging, while no benefit of revascularization was observed in patients with an unmatched finding. Copyright © 2018 Elsevier B.V. All rights reserved.
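
    The Cox analysis described above can be sketched with the lifelines library; the synthetic data frame, covariate names, and effect sizes below are illustrative assumptions, not the study data.

        # Sketch of a Cox proportional hazards model with covariates (synthetic data;
        # requires the 'lifelines' package; column names are illustrative assumptions).
        import numpy as np
        import pandas as pd
        from lifelines import CoxPHFitter

        rng = np.random.default_rng(2)
        n = 414
        df = pd.DataFrame({
            "age": rng.normal(62, 10, n),
            "male": rng.integers(0, 2, n),
            "high_risk_cad": rng.integers(0, 2, n),
            "early_revasc": rng.integers(0, 2, n),
        })
        # Synthetic event times with an assumed protective effect of early revascularisation.
        hazard = np.exp(0.03 * (df["age"] - 62) + 0.5 * df["high_risk_cad"]
                        - 0.7 * df["early_revasc"])
        df["time_years"] = rng.exponential(6.0 / hazard)
        df["event"] = (rng.random(n) < 0.4).astype(int)   # 1 = event observed

        cph = CoxPHFitter()
        cph.fit(df, duration_col="time_years", event_col="event")
        cph.print_summary(decimals=2)   # hazard ratios per covariate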
