2015-12-01
FINAL REPORT: Development and Validation of a Quantitative Framework and Management Expectation Tool for the Selection of Bioremediation...
...project ER-201129 was to develop and validate a framework used to make bioremediation decisions based on site-specific physical and biogeochemical...
ERIC Educational Resources Information Center
Trexler, Grant Lewis
2012-01-01
This dissertation set out to identify effective qualitative and quantitative management tools used by chief financial officers (CFOs) in carrying out their management functions of planning, decision making, organizing, staffing, communicating, motivating, leading and controlling at a public research university. In addition, impediments to the use of…
ERIC Educational Resources Information Center
Eshlaghy, Abbas Toloie; Kaveh, Haydeh
2009-01-01
The purpose of this study was to determine the most suitable ICT-based education and define the most suitable e-content creation tools for quantitative courses in the IT-management Masters program. ICT-based tools and technologies are divided into three categories: the creation of e-content, the offering of e-content, and access to e-content. In…
Simulating the Effects of Alternative Forest Management Strategies on Landscape Structure
Eric J. Gustafson; Thomas Crow
1996-01-01
Quantitative, spatial tools are needed to assess the long-term spatial consequences of alternative management strategies for land use planning and resource management. We constructed a timber harvest allocation model (HARVEST) that provides a visual and quantitative means to predict the spatial pattern of forest openings produced by alternative harvest strategies....
2013-06-30
QUANTITATIVE RISK ANALYSIS The use of quantitative cost risk analysis tools can be valuable in measuring numerical risk to the government (Galway, 2004)... assessment of the EVMS itself. Galway (2004) practically linked project quantitative risk assessment to EVM by focusing on cost, schedule, and... Galway, L. (2004, February). Quantitative risk analysis for project management: A critical review (RAND Working Paper WR-112-RC
A Model of Risk Analysis in Analytical Methodology for Biopharmaceutical Quality Control.
Andrade, Cleyton Lage; Herrera, Miguel Angel De La O; Lemes, Elezer Monte Blanco
2018-01-01
One key quality control parameter for biopharmaceutical products is the analysis of residual cellular DNA. To determine small amounts of DNA (around 100 pg) that may be in a biologically derived drug substance, an analytical method should be sensitive, robust, reliable, and accurate. In principle, three techniques have the ability to measure residual cellular DNA: radioactive dot-blot, a type of hybridization; threshold analysis; and quantitative polymerase chain reaction. Quality risk management is a systematic process for evaluating, controlling, and reporting risks that may affect method capabilities, and it supports a scientific and practical approach to decision making. This paper evaluates, by quality risk management, an alternative approach to assessing the performance risks associated with quality control methods used with biopharmaceuticals, using the hazard analysis and critical control points (HACCP) tool. This tool makes it possible to identify the steps in an analytical procedure with the highest impact on method performance. By applying these principles to DNA analysis methods, we conclude that the radioactive dot-blot assay has the largest number of critical control points, followed by quantitative polymerase chain reaction and threshold analysis. From the analysis of hazards (i.e., points of method failure) and the associated critical control points, we conclude that the analytical methodology with the lowest risk of performance failure for residual cellular DNA testing is quantitative polymerase chain reaction. LAY ABSTRACT: In order to mitigate the risk of adverse events caused by residual cellular DNA that is not completely cleared from downstream production processes, regulatory agencies have required the industry to guarantee a very low level of DNA in biologically derived pharmaceutical products. The technique historically used was radioactive blot hybridization.
However, the technique is challenging to implement in a quality control laboratory: it is laborious, time consuming, semi-quantitative, and requires a radioisotope. Along with dot-blot hybridization, two alternative techniques were evaluated: threshold analysis and quantitative polymerase chain reaction. Quality risk management tools were applied to compare the techniques, taking into account the uncertainties, the possibility of future events, and their effects upon method performance. By illustrating the application of these tools with DNA methods, we provide an example of how they can be used to support a scientific and practical approach to decision making and to assess and manage method performance risk. This paper discusses, under the principles of quality risk management, an additional approach to the development and selection of analytical quality control methods using the risk analysis tool hazard analysis and critical control points. This tool makes it possible to identify the procedural steps with the highest impact on method reliability (called critical control points). Our model concluded that the radioactive dot-blot assay has the largest number of critical control points, followed by quantitative polymerase chain reaction and threshold analysis. Quantitative polymerase chain reaction is shown to be the best alternative analytical methodology for residual cellular DNA analysis. © PDA, Inc. 2018.
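The HACCP comparison in this abstract lends itself to a small numeric illustration. The sketch below is hypothetical: the critical control points and their 1-5 severity/occurrence scores are invented for demonstration, with only the ordering of CCP counts and the final ranking taken from the abstract.

```python
# Hypothetical HACCP-style comparison of three residual-DNA assays.
# Each method gets a list of (severity, occurrence) pairs, one per
# critical control point (CCP); the numbers are invented placeholders.
ccps = {
    "radioactive dot-blot": [(5, 4), (4, 4), (4, 3), (3, 3), (3, 2)],
    "qPCR":                 [(3, 2), (2, 2), (2, 1)],
    "threshold analysis":   [(4, 3), (4, 3), (3, 3), (3, 2)],
}

def risk_score(points):
    """Aggregate risk: sum of severity x occurrence over a method's CCPs."""
    return sum(s * o for s, o in points)

# Rank methods from lowest to highest aggregate performance risk.
ranked = sorted(ccps, key=lambda m: risk_score(ccps[m]))
```

With these placeholder scores the ranking reproduces the abstract's conclusion: qPCR carries the lowest aggregate risk even though CCP counts alone would order the methods differently.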
There are a number of risk management decisions, which range from prioritization for testing to quantitative risk assessments. The utility of in vitro studies in these decisions depends on how well the results of such data can be qualitatively and quantitatively extrapolated to i...
Federal Register 2010, 2011, 2012, 2013, 2014
2011-10-19
... panel will be employed to collect this information, which serves the need for direct and quantitative measurement of our target population, and which, as a quantitative research tool, has some major benefits: To...
Quantitative Imaging In Pathology (QUIP) | Informatics Technology for Cancer Research (ITCR)
This site hosts web accessible applications, tools and data designed to support analysis, management, and exploration of whole slide tissue images for cancer research. The following tools are included: caMicroscope: A digital pathology data management and visualization platform that enables interactive viewing of whole slide tissue images and segmentation results. caMicroscope can also be used independently of QUIP. FeatureExplorer: An interactive tool to allow patient-level feature exploration across multiple dimensions.
Cardiac imaging: working towards fully-automated machine analysis & interpretation.
Slomka, Piotr J; Dey, Damini; Sitek, Arkadiusz; Motwani, Manish; Berman, Daniel S; Germano, Guido
2017-03-01
Non-invasive imaging plays a critical role in managing patients with cardiovascular disease. Although subjective visual interpretation remains the clinical mainstay, quantitative analysis facilitates objective, evidence-based management, and advances in clinical research. This has driven developments in computing and software tools aimed at achieving fully automated image processing and quantitative analysis. In parallel, machine learning techniques have been used to rapidly integrate large amounts of clinical and quantitative imaging data to provide highly personalized individual patient-based conclusions. Areas covered: This review summarizes recent advances in automated quantitative imaging in cardiology and describes the latest techniques which incorporate machine learning principles. The review focuses on the cardiac imaging techniques which are in wide clinical use. It also discusses key issues and obstacles for these tools to become utilized in mainstream clinical practice. Expert commentary: Fully-automated processing and high-level computer interpretation of cardiac imaging are becoming a reality. Application of machine learning to the vast amounts of quantitative data generated per scan and integration with clinical data also facilitates a move to more patient-specific interpretation. These developments are unlikely to replace interpreting physicians but will provide them with highly accurate tools to detect disease, risk-stratify, and optimize patient-specific treatment. However, with each technological advance, we move further from human dependence and closer to fully-automated machine interpretation.
Prioritising coastal zone management issues through fuzzy cognitive mapping approach.
Meliadou, Aleka; Santoro, Francesca; Nader, Manal R; Dagher, Manale Abou; Al Indary, Shadi; Salloum, Bachir Abi
2012-04-30
Effective public participation is an essential component of Integrated Coastal Zone Management implementation. To promote such participation, a shared understanding of stakeholders' objectives has to be built to ultimately result in common coastal management strategies. The application of quantitative and semi-quantitative methods involving tools such as Fuzzy Cognitive Mapping is presently proposed for reaching such understanding. In this paper we apply the Fuzzy Cognitive Mapping tool to elucidate the objectives and priorities of North Lebanon's coastal productive sectors, and to formalize their coastal zone perceptions and knowledge. Then, we investigate the potential of Fuzzy Cognitive Mapping as a tool to support coastal zone management. Five round table discussions were organized: one for the municipalities of the area and one for each of the main coastal productive sectors (tourism, industry, fisheries, agriculture), where the participants drew cognitive maps depicting their views. The analysis of the cognitive maps showed a large number of factors perceived as affecting the current situation of the North Lebanon coastal zone, which were classified into five major categories: governance, infrastructure, environment, intersectoral interactions and sectoral initiatives. Furthermore, common problems, expectations and management objectives for all sectors were exposed. Within this context, Fuzzy Cognitive Mapping proved to be an essential tool for revealing stakeholder knowledge and perception and for understanding complex relationships. Copyright © 2011 Elsevier Ltd. All rights reserved.
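A Fuzzy Cognitive Map is, at its core, a signed weighted digraph whose concept activations are iterated to a fixed point. The sketch below is a minimal generic FCM with invented concepts and edge weights, not the maps actually drawn in the North Lebanon workshops.

```python
import math

# Concepts and causal edge weights are hypothetical placeholders.
concepts = ["governance", "infrastructure", "environment", "tourism"]
W = [  # W[i][j]: causal influence of concept i on concept j, in [-1, 1]
    [0.0, 0.6, 0.3, 0.4],
    [0.0, 0.0, -0.4, 0.5],
    [0.2, 0.0, 0.0, 0.6],
    [0.0, 0.3, -0.5, 0.0],
]

def step(state, W):
    """One synchronous FCM update with a sigmoid squashing function
    (each concept keeps a self-feedback term plus weighted inputs)."""
    n = len(state)
    return [1.0 / (1.0 + math.exp(-(state[j] + sum(state[i] * W[i][j] for i in range(n)))))
            for j in range(n)]

state = [0.5] * len(concepts)   # neutral initial activations
for _ in range(50):             # iterate toward a fixed point
    state = step(state, W)
```

With these modest weights the update is a contraction, so the activations settle to a stable pattern that can then be compared across stakeholder groups.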
CANARY Risk Management of Adenocarcinoma: The Future of Imaging?
Foley, Finbar; Rajagopalan, Srinivasan; Raghunath, Sushravya M; Boland, Jennifer M; Karwoski, Ronald A.; Maldonado, Fabien; Bartholmai, Brian J; Peikert, Tobias
2016-01-01
Increased clinical utilization of chest high resolution computed tomography results in increased identification of lung adenocarcinomas and persistent sub-solid opacities. However, these lesions range from very indolent to extremely aggressive tumors. Clinically relevant diagnostic tools to non-invasively risk stratify and guide individualized management of these lesions are lacking. Research efforts investigating semi-quantitative measures to decrease inter- and intra-rater variability are emerging, and in some cases steps have been taken to automate this process. However, many such methods currently are still sub-optimal, require validation and are not yet clinically applicable. The Computer-Aided Nodule Assessment and Risk Yield (CANARY) software application represents a validated, automated, quantitative, and non-invasive tool for risk stratification of adenocarcinoma lung nodules. CANARY correlates well with consensus histology and post-surgical patient outcomes and therefore may help to guide individualized patient management, e.g. in the identification of nodules amenable to radiological surveillance or in need of adjunctive therapy. PMID:27568149
NASA Astrophysics Data System (ADS)
Wang, Ximing; Kim, Bokkyu; Park, Ji Hoon; Wang, Erik; Forsyth, Sydney; Lim, Cody; Ravi, Ragini; Karibyan, Sarkis; Sanchez, Alexander; Liu, Brent
2017-03-01
Quantitative imaging biomarkers are used widely in clinical trials for tracking and evaluation of medical interventions. Previously, we have presented a web-based informatics system utilizing quantitative imaging features for predicting outcomes in stroke rehabilitation clinical trials. The system integrates imaging feature extraction tools and a web-based statistical analysis tool. The tools include a generalized linear mixed model (GLMM) that can investigate potential significance and correlation based on features extracted from clinical data and quantitative biomarkers. The imaging feature extraction tools allow the user to collect imaging features, and the GLMM module allows the user to select clinical data and imaging features such as stroke lesion characteristics from the database as regressors and regressands. This paper discusses the application scenario and evaluation results of the system in a stroke rehabilitation clinical trial. The system was utilized to manage clinical data and extract imaging biomarkers including stroke lesion volume, location, and ventricle/brain ratio. The GLMM module was validated and the efficiency of data analysis was also evaluated.
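The regressor/regressand setup can be illustrated in a few lines. A real GLMM with per-patient random effects requires a statistics package (e.g. statsmodels); as a stand-in, this sketch fits a pure-Python ordinary least squares line relating a synthetic lesion-volume biomarker to a synthetic outcome score.

```python
# Synthetic data: lesion volume (regressor) vs. motor outcome (regressand).
# All numbers are invented for illustration only.
volumes  = [2.0, 5.0, 9.0, 14.0, 20.0]      # hypothetical lesion volumes (mL)
outcomes = [48.0, 43.0, 36.0, 27.0, 17.0]   # hypothetical motor scores

n = len(volumes)
mx = sum(volumes) / n
my = sum(outcomes) / n

# Closed-form simple linear regression: slope = cov(x, y) / var(x).
slope = sum((x - mx) * (y - my) for x, y in zip(volumes, outcomes)) / \
        sum((x - mx) ** 2 for x in volumes)
intercept = my - slope * mx
```

The negative fitted slope mirrors the intuition that larger lesion volumes associate with worse outcomes; a GLMM would add fixed clinical covariates and random effects on top of this same structure.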
Quantitative Decision Tools and Management Development Programs.
ERIC Educational Resources Information Center
BYARS, LLOYD L.; NUNN, GEOFFREY E.
This article outlined the current status of quantitative methods and operations research (OR), sketched the strengths of training efforts and isolated weaknesses, and formulated workable criteria for evaluating the success of operations research training programs. A survey of 105 companies revealed that PERT, inventory control theory and linear…
IT: An Effective Pedagogic Tool in the Teaching of Quantitative Methods in Management.
ERIC Educational Resources Information Center
Nadkami, Sanjay M.
1998-01-01
Examines the possibility of supplementing conventional pedagogic methods with information technology-based teaching aids in the instruction of quantitative methods to undergraduate students. Considers the case for a problem-based learning approach, and discusses the role of information technology. (Author/LRW)
There is a growing interest in the application of human-associated fecal sourceidentification quantitative real-time PCR (qPCR) technologies for water quality management. The transition from a research tool to a standardized protocol requires a high degree of confidence in data q...
NASA Astrophysics Data System (ADS)
Li, Shunhe; Rao, Jianhua; Gui, Lin; Zhang, Weimin; Liu, Degang
2017-11-01
The result of remanufacturing evaluation is the basis for judging whether a heavy duty machine tool can be remanufactured in the end-of-life (EOL) stage of the machine tool lifecycle. The objectivity and accuracy of the evaluation are the key to the evaluation method. In this paper, the catastrophe progression method is introduced into the quantitative evaluation of heavy duty machine tools' remanufacturing, and the results are modified by the comprehensive adjustment method, which makes the evaluation results accord with conventional human thinking. The catastrophe progression method is used to establish a quantitative evaluation model for heavy duty machine tools and to evaluate the remanufacturing of a retired TK6916 type CNC floor milling-boring machine. The evaluation process is simple and highly quantitative, and the result is objective.
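The cusp level of the catastrophe progression method can be sketched as follows. The indicator values and their ranking by importance are invented; the averaging step stands in for the paper's comprehensive adjustment, which relaxes the strict minimum rule.

```python
# Cusp-catastrophe normalization for a two-indicator subsystem.
# Both indicators are assumed already scaled to [0, 1]; values are invented.
precision_retention = 0.64   # hypothetical, more important control variable
structural_wear     = 0.27   # hypothetical, less important control variable

# Cusp membership functions: x_a = a^(1/2) for the primary control,
# x_b = b^(1/3) for the secondary control.
xa = precision_retention ** (1 / 2)
xb = structural_wear ** (1 / 3)

strict   = min(xa, xb)       # classic non-complementary aggregation rule
adjusted = (xa + xb) / 2     # averaging variant (comprehensive adjustment)
```

The averaged value is never below the strict minimum, which is why the adjustment pulls scores toward ranges that match conventional expert judgment.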
Isoperms: An Environmental Management Tool.
ERIC Educational Resources Information Center
Sebera, Donald K.
A quantitative tool, the isoperm method, is described; it quantifies the effect of environmental factors of temperature (T) and percent relative humidity (%RH) on the anticipated useful life expectancy of paper-based collections. The isoperm method provides answers to questions of the expected lifetime of the collection under various temperature…
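The isoperm idea reduces to Arrhenius-style arithmetic. The sketch below is a minimal illustration, assuming degradation rate proportional to relative humidity, and using a round assumed activation energy of 100 kJ/mol rather than Sebera's published figure.

```python
import math

E_A = 100_000.0   # J/mol, assumed activation energy for paper degradation
R   = 8.314       # J/(mol K), gas constant

def relative_permanence(t_c, rh, t0_c=20.0, rh0=50.0):
    """Life-expectancy multiplier of storage at (t_c degC, rh %RH)
    relative to a reference condition (t0_c, rh0), assuming
    rate ~ RH * exp(-E_A / (R * T))."""
    t, t0 = t_c + 273.15, t0_c + 273.15
    return (rh0 / rh) * math.exp((E_A / R) * (1.0 / t - 1.0 / t0))
```

Cooler and drier storage yields a multiplier above 1 (longer expected life); warmer and more humid storage yields one below 1, which is exactly the tradeoff an isoperm curve holds constant.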
ERIC Educational Resources Information Center
Kemp, Jeremy William
2011-01-01
This quantitative survey study examines the willingness of online students to adopt an immersive virtual environment as a classroom tool and compares this with their feelings about more traditional learning modes including our ANGEL learning management system and the Elluminate live Web conferencing tool. I surveyed 1,108 graduate students in…
Federal Register 2010, 2011, 2012, 2013, 2014
2013-12-02
... quantitative studies. Focus groups serve the narrowly defined need for direct and informal opinion on a specific topic and as a qualitative research tool have three major purposes: To obtain information that is useful for developing variables and measures for quantitative studies, To better understand people's...
Pateman, B; Jinks, A M
1999-01-01
The focus of this paper is a study designed to explore the validity of quantitative approaches of student evaluation in a pre-registration degree programme. As managers of the students' education we were concerned that the quantitative method, which used lecturer criteria, may not fully represent students' views. The approach taken is that of a process-type strategy for curriculum evaluation as described by Parlett and Hamilton (1972). The aim of the study is to produce illuminative data, or students' 'stories' of their educational experiences through use of semi-structured interviews. The results are then compared to the current quantitative measurement tools designed to obtain 'snapshots' of the educational effectiveness of the curriculum. The quantitative measurement tools use Likert scale measurements of teacher-devised criterion statements. The results of the study give a rich source of qualitative data which can be used to inform future curriculum development. However, complete validation of the current quantitative instruments used was not achieved in this study. Student and teacher agendas in respect of important issues pertaining to the course programme were found to differ. Limitations of the study are given. There is discussion of the options open to the management team with regard to future development of curriculum evaluation systems.
Qualitative and Quantitative Pedigree Analysis: Graph Theory, Computer Software, and Case Studies.
ERIC Educational Resources Information Center
Jungck, John R.; Soderberg, Patti
1995-01-01
Presents a series of elementary mathematical tools for re-representing pedigrees, pedigree generators, pedigree-driven database management systems, and case studies for exploring genetic relationships. (MKR)
Quantitative tools link portfolio management with use of technology
DOE Office of Scientific and Technical Information (OSTI.GOV)
Anderson, R.N.; Boulanger, A.; Amaefule, J.
1998-11-30
The exploration and production (E and P) business is in the midst of a major transformation from an emphasis on cost-cutting to more diverse portfolio management practices. The industry has found that it is not easy to simultaneously optimize net present value (NPV), return on investment (ROI), and long-term growth. The result has been the adaptation of quantitative business practices that rival their subsurface geological equivalents in sophistication and complexity. The computational tools assess the risk-reward tradeoffs inherent in the upstream linkages between (1) the application of advanced technologies to improve success in exploration and in exploitation (reservoir evaluation, drilling, producing, and delivery to market) and (2) the maximization of both short- and long-term profitability. Exploitation is a critical link to the industry's E and P profitability, as can be seen from the correlation between earnings growth of the international majors and production growth. The paper discusses the use of tools to optimize exploitation.
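The NPV/ROI tradeoff the paper refers to reduces to standard discounted cash flow arithmetic. All figures below are hypothetical.

```python
# Toy cash flows for a single E&P project (invented figures, $MM).
capex = 120.0
cash_flows = [30.0, 45.0, 50.0, 40.0, 25.0]   # net inflows, years 1..5
discount_rate = 0.10

# NPV: discounted inflows minus the upfront investment.
npv = -capex + sum(cf / (1 + discount_rate) ** t
                   for t, cf in enumerate(cash_flows, start=1))

# Simple undiscounted return on investment.
roi = (sum(cash_flows) - capex) / capex
```

A project can have a healthy ROI yet a modest NPV (or vice versa) depending on the timing of cash flows, which is why optimizing both alongside long-term growth requires portfolio-level tools rather than a single metric.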
NASA Astrophysics Data System (ADS)
Bandrowski, D.; Lai, Y.; Bradley, N.; Gaeuman, D. A.; Murauskas, J.; Som, N. A.; Martin, A.; Goodman, D.; Alvarez, J.
2014-12-01
In the field of river restoration sciences there is a growing need for analytical modeling tools and quantitative processes to help identify and prioritize project sites. 2D hydraulic models have become more common in recent years, and with the availability of robust data sets and computing technology, it is now possible to evaluate large river systems at the reach scale. The Trinity River Restoration Program is now analyzing a 40 mile segment of the Trinity River to determine priority and implementation sequencing for its Phase II rehabilitation projects. A comprehensive approach and quantitative tool, referred to as 2D-Hydrodynamic Based Logic Modeling (2D-HBLM), has recently been developed to analyze this complex river system. This tool utilizes various hydraulic output parameters combined with biological, ecological, and physical metrics at user-defined spatial scales. These metrics and their associated algorithms are the underpinnings of the 2D-HBLM habitat module used to evaluate geomorphic characteristics, riverine processes, and habitat complexity. The habitat metrics are further integrated into a comprehensive Logic Model framework to perform statistical analyses to assess project prioritization. The Logic Model will analyze various potential project sites by evaluating connectivity using principal component methods. The 2D-HBLM tool will help inform management and decision makers by using a quantitative process to optimize desired response variables while balancing important limiting factors, in order to determine the highest-priority locations within the river corridor for restoration projects. Effective river restoration prioritization starts with well-crafted goals that identify the biological objectives, address underlying causes of habitat change, and recognize that social, economic, and land use limiting factors may constrain restoration options (Bechie et al. 2008).
Applying natural resources management actions, like restoration prioritization, is essential for successful project implementation (Conroy and Peterson, 2013). Evaluating tradeoffs and examining alternatives to improve fish habitat through optimization modeling is not just a trend, but rather the scientific strategy that management should embrace and apply in its decision framework.
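A logic-model-style prioritization can be sketched as a weighted combination of normalized metrics. The metric names, weights, and site scores below are invented placeholders, not actual 2D-HBLM outputs.

```python
# Hypothetical weights over normalized habitat metrics (each in [0, 1]).
weights = {"habitat_complexity": 0.4, "connectivity": 0.35, "geomorphic_fit": 0.25}

# Hypothetical candidate restoration sites with metric scores.
sites = {
    "site_A": {"habitat_complexity": 0.8, "connectivity": 0.5, "geomorphic_fit": 0.6},
    "site_B": {"habitat_complexity": 0.4, "connectivity": 0.9, "geomorphic_fit": 0.7},
    "site_C": {"habitat_complexity": 0.6, "connectivity": 0.6, "geomorphic_fit": 0.3},
}

def priority(metrics):
    """Weighted-sum priority score for one site."""
    return sum(weights[k] * metrics[k] for k in weights)

# Highest-priority sites first.
ranking = sorted(sites, key=lambda s: priority(sites[s]), reverse=True)
```

In a real application the metrics would come from the 2D hydraulic model output and the weights from the program's management objectives; the ranking step is the same.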
A Quantitative Analysis of Open Source Software's Acceptability as Production-Quality Code
ERIC Educational Resources Information Center
Fischer, Michael
2011-01-01
The difficulty in writing defect-free software has been long acknowledged both by academia and industry. A constant battle occurs as developers seek to craft software that works within aggressive business schedules and deadlines. Many tools and techniques are used in attempt to manage these software projects. Software metrics are a tool that has…
Wu, Chang-Guang; Li, Sheng; Ren, Hua-Dong; Yao, Xiao-Hua; Huang, Zi-Jie
2012-06-01
Soil loss prediction models such as the universal soil loss equation (USLE) and the revised universal soil loss equation (RUSLE) are useful tools for risk assessment of soil erosion and planning of soil conservation at the regional scale. A rational estimation of the vegetation cover and management factor, among the most important parameters in USLE or RUSLE, is particularly important for the accurate prediction of soil erosion. The traditional estimation based on field survey and measurement is time-consuming, laborious, and costly, and cannot rapidly extract the vegetation cover and management factor at macro-scale. In recent years, the development of remote sensing technology has provided both data and methods for the estimation of vegetation cover and management factor over broad geographic areas. This paper summarizes the research findings on the quantitative estimation of the vegetation cover and management factor using remote sensing data, and analyzes the advantages and disadvantages of various methods, aiming to provide a reference for further research and for quantitative estimation of the vegetation cover and management factor at large scales.
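One published way to derive the cover-management factor C from remote sensing is the van der Knijff NDVI scaling. The sketch below uses its commonly quoted coefficients; the remaining RUSLE factors are invented site constants, not values from this paper.

```python
import math

# van der Knijff et al. NDVI-to-C approximation: C = exp(-a * NDVI / (b - NDVI)).
ALPHA, BETA = 2.0, 1.0   # commonly quoted coefficient values

def c_factor(ndvi):
    """Denser vegetation (higher NDVI) maps to a lower C factor."""
    return math.exp(-ALPHA * ndvi / (BETA - ndvi))

# RUSLE: A = R * K * LS * C * P (soil loss per unit area).
R, K, LS, P = 900.0, 0.3, 1.2, 1.0   # hypothetical site factor values
soil_loss = R * K * LS * c_factor(0.6) * P
```

This kind of per-pixel mapping is what lets remote sensing replace plot-by-plot field estimation of C over broad areas, at the cost of calibrating the coefficients to local vegetation types.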
Smartphone-based multispectral imaging: system development and potential for mobile skin diagnosis.
Kim, Sewoong; Cho, Dongrae; Kim, Jihun; Kim, Manjae; Youn, Sangyeon; Jang, Jae Eun; Je, Minkyu; Lee, Dong Hun; Lee, Boreom; Farkas, Daniel L; Hwang, Jae Youn
2016-12-01
We investigate the potential of mobile smartphone-based multispectral imaging for the quantitative diagnosis and management of skin lesions. Recently, various mobile devices such as a smartphone have emerged as healthcare tools. They have been applied for the early diagnosis of nonmalignant and malignant skin diseases. Particularly, when they are combined with an advanced optical imaging technique such as multispectral imaging and analysis, it would be beneficial for the early diagnosis of such skin diseases and for further quantitative prognosis monitoring after treatment at home. Thus, we demonstrate here the development of a smartphone-based multispectral imaging system with high portability and its potential for mobile skin diagnosis. The results suggest that smartphone-based multispectral imaging and analysis has great potential as a healthcare tool for quantitative mobile skin diagnosis.
Integrated national-scale assessment of wildfire risk to human and ecological values
Matthew P. Thompson; David E. Calkin; Mark A. Finney; Alan A. Ager; Julie W. Gilbertson-Day
2011-01-01
The spatial, temporal, and social dimensions of wildfire risk are challenging U.S. federal land management agencies to meet societal needs while maintaining the health of the lands they manage. In this paper we present a quantitative, geospatial wildfire risk assessment tool, developed in response to demands for improved risk-based decision frameworks. The methodology...
A Case Study of Resources Management Planning with Multiple Objectives and Projects
David L. Peterson; David G. Silsbee; Daniel L. Schmoldt
1995-01-01
Each National Park Service unit in the United States produces a resources management plan (RMP) every four years or less. The plans commit budgets and personnel to specific projects for four years, but they are prepared with little quantitative and analytical rigor and without formal decision-making tools. We have previously described a multiple objective planning...
Smoke management photographic guide: a visual aid for communicating impacts
Joshua C. Hyde; Jarod Blades; Troy Hall; Roger D. Ottmar; Alistair Smith
2016-01-01
Communicating emissions impacts to the public can sometimes be difficult because quantitatively conveying smoke concentrations is complicated. Regulators and land managers often refer to particulate-matter concentrations in micrograms per cubic meter, but this may not be intuitive or meaningful to everyone. The primary purpose of this guide is to serve as a tool for...
Upgrading Marine Ecosystem Restoration Using Ecological-Social Concepts.
Abelson, Avigdor; Halpern, Benjamin S; Reed, Daniel C; Orth, Robert J; Kendrick, Gary A; Beck, Michael W; Belmaker, Jonathan; Krause, Gesche; Edgar, Graham J; Airoldi, Laura; Brokovich, Eran; France, Robert; Shashar, Nadav; de Blaeij, Arianne; Stambler, Noga; Salameh, Pierre; Shechter, Mordechai; Nelson, Peter A
2016-02-01
Conservation and environmental management are principal countermeasures to the degradation of marine ecosystems and their services. However, in many cases, current practices are insufficient to reverse ecosystem declines. We suggest that restoration ecology, the science underlying the concepts and tools needed to restore ecosystems, must be recognized as an integral element for marine conservation and environmental management. Marine restoration ecology is a young scientific discipline, often with gaps between its application and the supporting science. Bridging these gaps is essential to using restoration as an effective management tool and reversing the decline of marine ecosystems and their services. Ecological restoration should address objectives that include improved ecosystem services, and it therefore should encompass social-ecological elements rather than focusing solely on ecological parameters. We recommend using existing management frameworks to identify clear restoration targets, to apply quantitative tools for assessment, and to make the re-establishment of ecosystem services a criterion for success.
Liu, Chun; Bridges, Melissa E; Kaundun, Shiv S; Glasgow, Les; Owen, Micheal Dk; Neve, Paul
2017-02-01
Simulation models are useful tools for predicting and comparing the risk of herbicide resistance in weed populations under different management strategies. Most existing models assume a monogenic mechanism governing herbicide resistance evolution. However, growing evidence suggests that herbicide resistance is often inherited in a polygenic or quantitative fashion. Therefore, we constructed a generalised modelling framework to simulate the evolution of quantitative herbicide resistance in summer annual weeds. Real-field management parameters based on Amaranthus tuberculatus (Moq.) Sauer (syn. rudis) control with glyphosate and mesotrione in Midwestern US maize-soybean agroecosystems demonstrated that the model can represent evolved herbicide resistance in realistic timescales. Sensitivity analyses showed that genetic and management parameters strongly influenced the rate of quantitative herbicide resistance evolution, whereas biological parameters such as emergence and seed bank mortality were less important. The simulation model provides a robust and widely applicable framework for predicting the evolution of quantitative herbicide resistance in summer annual weed populations. The sensitivity analyses identified weed characteristics that would favour herbicide resistance evolution, including high annual fecundity, large resistance phenotypic variance and pre-existing herbicide resistance. Implications for herbicide resistance management and potential use of the model are discussed. © 2016 Society of Chemical Industry.
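The polygenic dynamics the abstract describes can be illustrated with a minimal truncation-selection sketch built on the breeder's equation, R = h²S. This is a generic quantitative-genetics illustration with hypothetical parameter values, not the authors' model:

```python
import random

def simulate_resistance(generations=10, pop_size=10000, h2=0.3,
                        mean0=0.0, sd=1.0, kill_threshold=1.0, seed=42):
    """Polygenic resistance evolution under recurrent truncation selection.

    Each season the herbicide kills plants whose resistance trait falls
    below `kill_threshold`; survivors reproduce, and the population mean
    shifts by R = h2 * S (the breeder's equation), where S is the
    selection differential of the survivors. All parameter values here
    are hypothetical.
    """
    rng = random.Random(seed)
    mean = mean0
    means = [mean]
    for _ in range(generations):
        traits = [rng.gauss(mean, sd) for _ in range(pop_size)]
        survivors = [t for t in traits if t >= kill_threshold]
        if not survivors:              # herbicide eliminated the population
            break
        s_diff = sum(survivors) / len(survivors) - sum(traits) / len(traits)
        mean += h2 * s_diff            # response to selection
        means.append(mean)
    return means

trajectory = simulate_resistance()
# the mean resistance trait ratchets upward generation by generation
```

Because survivors always have a trait mean at least as high as the whole population's, the selection differential is non-negative every season, which is why resistance only ratchets up under continued use of the same herbicide.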
Breakthrough at the Missouri River Breaks: A quick tool for comparing burned and unburned sites
Rachael Clark; Theresa Jain
2009-01-01
A quantitative understanding of how forests work, both before and after (prescribed and wild) fire, is essential to management. Yet acquiring the kind of broad yet detailed information needed for many management decisions can be costly, tedious, and time-consuming. After two sweeping wildfires in the Missouri River Breaks area of eastern Montana - the Indian and...
NASA Astrophysics Data System (ADS)
Albano, Raffaele; Manfreda, Salvatore; Celano, Giuseppe
The paper introduces a minimalist water-driven crop model for sustainable irrigation management using an eco-hydrological approach. The model, called MY SIRR, uses a relatively small number of parameters and attempts to balance simplicity, accuracy, and robustness. MY SIRR is a quantitative tool to assess water requirements and agricultural production across different climates, soil types, crops, and irrigation strategies. The MY SIRR source code is published under a copyleft license. The free and open-source (FOSS) approach could lower the financial barriers that smallholders, especially in developing countries, face in using tools for better decision-making on short- and long-term water resource management strategies.
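The core of a minimalist water-driven irrigation model of this kind can be sketched as a single soil-water "bucket" with demand-based irrigation. The structure and all parameter names below are illustrative assumptions, not the MY SIRR formulation or code:

```python
def bucket_irrigation(rain, capacity=100.0, et_max=5.0,
                      trigger=0.4, refill=0.8, s0=0.6):
    """Minimal daily soil-water bucket with demand-based irrigation.

    `rain` is a sequence of daily rainfall depths (mm). Soil moisture is
    tracked as a fraction of `capacity`. When relative moisture falls
    below `trigger`, irrigation refills it to the `refill` fraction.
    Actual evapotranspiration declines linearly with relative moisture,
    a common eco-hydrological simplification (not MY SIRR's own scheme).
    Returns final storage (mm) and total irrigation water applied (mm).
    """
    s = s0 * capacity
    irrigation_total = 0.0
    for p in rain:
        s = min(s + p, capacity)        # rainfall; excess lost as runoff
        s -= et_max * (s / capacity)    # moisture-limited evapotranspiration
        if s / capacity < trigger:      # irrigate back to the refill level
            irrigation_total += refill * capacity - s
            s = refill * capacity
    return s, irrigation_total

final_moisture, water_applied = bucket_irrigation([0.0] * 30)  # 30-day dry spell
```

Running the same simulation under different trigger/refill settings is the kind of strategy comparison (deficit vs. full irrigation) such a tool enables.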
NASA Astrophysics Data System (ADS)
Erickson, A.; Martone, R. G.; Hazen, L.; Mease, L.; Gourlie, D.; Le Cornu, E.; Ourens, R.; Micheli, F.
2016-12-01
California's fisheries management law, the Marine Life Management Act (MLMA) of 1998, signaled a transformative shift from traditional single-species management to an ecosystem-based approach. In response, the fisheries management community in California is striving to integrate new science and management innovations while maximizing its limited capacity. However, data gaps, high compliance costs, capacity constraints, and limited access to the best available data and technologies persist. Here we present two decision support tools being developed to aid California fisheries managers as they continue to implement ecosystem-based management (EBM). First, to practice adaptive management, a key principle of EBM, managers must know whether and how their decisions are meeting their management objectives over time. Based on a cross-walk of MLMA goals with metrics and indicators from sustainable fishery certification programs, we present a flexible and practical tool for tracking fishery management performance in California. We showcase a draft series of decision trees and questionnaires managers can use to quantitatively or qualitatively measure both ecological and social outcomes, helping them to prioritize management options and limited resources. Second, state fisheries managers acknowledge the need for more effective stakeholder engagement to facilitate and inform decision-making and long-term outcomes, another key principle of EBM. Here, we present a pilot version of a decision-support tool to aid managers in choosing the most appropriate stakeholder engagement strategies in various types of decision contexts. This online tool will help staff identify their engagement goals, when they can strategically engage stakeholders based on their needs, and the fishery characteristics that will inform how engagement strategies are tailored to specific contexts. We also share opportunities to expand these EBM tools to other resource management contexts and scales.
Smartphone-based multispectral imaging: system development and potential for mobile skin diagnosis
Kim, Sewoong; Cho, Dongrae; Kim, Jihun; Kim, Manjae; Youn, Sangyeon; Jang, Jae Eun; Je, Minkyu; Lee, Dong Hun; Lee, Boreom; Farkas, Daniel L.; Hwang, Jae Youn
2016-01-01
We investigate the potential of mobile smartphone-based multispectral imaging for the quantitative diagnosis and management of skin lesions. Recently, various mobile devices such as a smartphone have emerged as healthcare tools. They have been applied for the early diagnosis of nonmalignant and malignant skin diseases. Particularly, when they are combined with an advanced optical imaging technique such as multispectral imaging and analysis, it would be beneficial for the early diagnosis of such skin diseases and for further quantitative prognosis monitoring after treatment at home. Thus, we demonstrate here the development of a smartphone-based multispectral imaging system with high portability and its potential for mobile skin diagnosis. The results suggest that smartphone-based multispectral imaging and analysis has great potential as a healthcare tool for quantitative mobile skin diagnosis. PMID:28018743
NASA Technical Reports Server (NTRS)
Mercer, Joey; Callantine, Todd; Martin, Lynne
2012-01-01
A recent human-in-the-loop simulation in the Airspace Operations Laboratory (AOL) at NASA's Ames Research Center investigated the robustness of Controller-Managed Spacing (CMS) operations. CMS refers to AOL-developed controller tools and procedures for enabling arrivals to conduct efficient Optimized Profile Descents with sustained high throughput. The simulation provided a rich data set for examining how a traffic management supervisor and terminal-area controller participants used the CMS tools and coordinated to respond to off-nominal events. This paper proposes quantitative measures for characterizing the participants' responses. Case studies of go-around events, replicated during the simulation, provide insights into the strategies employed and the role the CMS tools played in supporting them.
solGS: a web-based tool for genomic selection
USDA-ARS?s Scientific Manuscript database
Genomic selection (GS) promises to improve accuracy in estimating breeding values and genetic gain for quantitative traits compared to traditional breeding methods. Its reliance on high-throughput genome-wide markers and statistical complexity, however, is a serious challenge in data management, ana...
The scientific management of volcanic crises
NASA Astrophysics Data System (ADS)
Marzocchi, Warner; Newhall, Christopher; Woo, Gordon
2012-12-01
Sound scientific management of volcanic crises is the primary tool for significantly reducing volcanic risk in the short term. At present, a wide variety of qualitative and semi-quantitative strategies is in use, and no commonly accepted, general quantitative strategy yet exists. Pre-eruptive processes are extremely complicated, with many nonlinearly coupled and poorly known degrees of freedom, so scientists must quantify eruption forecasts through the use of probabilities. This, in turn, forces decision-makers to make decisions under uncertainty. We review the present state of the art in this field in order to identify the main gaps in existing procedures. We then put forward a general quantitative procedure that may overcome the present barriers, providing guidelines on how probabilities may be used to take rational mitigation actions. These procedures constitute a crucial link between science and society; they can be used to establish objective and transparent decision-making protocols and also to clarify the role and responsibility of each partner involved in managing a crisis.
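The link between probabilistic eruption forecasts and rational mitigation is often framed with the classic cost-loss model: act when the event probability exceeds the ratio of the mitigation cost to the loss averted. This is a standard textbook decision rule offered as an illustration, not the specific procedure the authors propose:

```python
def should_mitigate(p_event, cost_mitigation, loss_if_unmitigated):
    """Cost-loss decision rule for acting under uncertainty.

    Mitigate when the expected loss without action (p * L) exceeds the
    cost of acting (C), i.e. when p > C / L. A generic illustration of
    how probabilistic forecasts can feed a transparent decision protocol.
    """
    return p_event * loss_if_unmitigated > cost_mitigation

# An evacuation costing 10 (arbitrary units) against a potential loss of
# 1000 is justified once the eruption probability exceeds about 1%.
act = should_mitigate(0.02, 10, 1000)       # 2% probability -> act
wait = should_mitigate(0.005, 10, 1000)     # 0.5% probability -> wait
```

Making the threshold C/L explicit is precisely what allows the roles of scientists (who supply p) and decision-makers (who own C and L) to be separated transparently.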
Innovations for the future of pharmacovigilance.
Almenoff, June S
2007-01-01
Post-marketing pharmacovigilance involves the review and management of safety information from many sources. Among these sources, spontaneous adverse event reporting systems are some of the most challenging and resource-intensive to manage. Traditionally, efforts to monitor spontaneous adverse event reporting systems have focused on review of individual case reports. The science of pharmacovigilance could be enhanced with the availability of systems-based tools that facilitate analysis of aggregate data for purposes of signal detection, signal evaluation and knowledge management. GlaxoSmithKline (GSK) recently implemented Online Signal Management (OSM) as a data-driven framework for managing the pharmacovigilance of marketed products. This pioneering work builds upon the strong history GSK has of innovation in this area. OSM is a software application co-developed by GSK and Lincoln Technologies that integrates traditional pharmacovigilance methods with modern quantitative statistical methods and data visualisation tools. OSM enables the rapid identification of trends from the individual adverse event reports received by GSK. OSM also provides knowledge-management tools to ensure the successful tracking of emerging safety issues. GSK has developed standard procedures and 'best practices' around the use of OSM to ensure the systematic evaluation of complex safety datasets. In summary, the implementation of OSM provides new tools and efficient processes to advance the science of pharmacovigilance.
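One widely used quantitative signal-detection statistic for aggregate spontaneous-report data is the proportional reporting ratio (PRR). The abstract does not state which statistics OSM implements; the PRR is shown here only as a representative example of the class of methods:

```python
def proportional_reporting_ratio(a, b, c, d):
    """Proportional reporting ratio (PRR) from a 2x2 report table.

    a: reports of the event of interest for the drug of interest
    b: reports of all other events for the drug of interest
    c: reports of the event of interest for all other drugs
    d: reports of all other events for all other drugs

    PRR = [a / (a + b)] / [c / (c + d)]. Values well above 1 flag a
    disproportionality that merits clinical evaluation; the statistic is
    a screening aid, not proof of causality.
    """
    return (a / (a + b)) / (c / (c + d))

# The event makes up 10% of the drug's reports vs 1% elsewhere -> PRR = 10
prr = proportional_reporting_ratio(20, 180, 100, 9900)
```

In practice such a score would be computed for every drug-event pair in the database and combined with case-level review, which is the workflow integration the abstract describes.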
DOE Office of Scientific and Technical Information (OSTI.GOV)
Suresh, Niraj; Stephens, Sean A.; Adams, Lexor
Plant roots play a critical role in plant-soil-microbe interactions that occur in the rhizosphere, as well as in processes with important implications for climate change and forest management. Quantitative size information on roots in their native environment is invaluable for studying root growth and environmental processes involving the plant. X-ray computed tomography (XCT) has been demonstrated to be an effective tool for in situ root scanning and analysis. Our group at the Environmental Molecular Sciences Laboratory (EMSL) has developed an XCT-based tool to image and quantitatively analyze plant root structures in their native soil environment. XCT data collected on a Prairie dropseed (Sporobolus heterolepis) specimen were used to visualize its root structure. The open-source software packages RooTrak and DDV were employed to segment the root from the soil and to calculate its isosurface, respectively. Our own computer script, named 3DRoot-SV, was developed and used to calculate root volume and surface area from a triangular mesh. The process, utilizing a unique combination of tools from imaging to quantitative root analysis, including the 3DRoot-SV computer script, is described.
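Computing volume and surface area from a triangular mesh, as 3DRoot-SV does, has a standard construction: sum the triangle areas for surface area, and sum signed tetrahedron volumes (divergence theorem) for enclosed volume. The sketch below shows that standard method under the assumption of a closed, consistently oriented mesh; it is not the 3DRoot-SV code itself:

```python
def mesh_area_volume(vertices, triangles):
    """Surface area and enclosed volume of a closed triangular mesh.

    Area sums |(B - A) x (C - A)| / 2 over triangles; volume sums signed
    tetrahedron volumes A . (B x C) / 6, which is exact for a closed,
    consistently oriented mesh. A standard construction, not necessarily
    the one 3DRoot-SV implements.
    """
    def sub(u, v):
        return (u[0] - v[0], u[1] - v[1], u[2] - v[2])

    def cross(u, v):
        return (u[1] * v[2] - u[2] * v[1],
                u[2] * v[0] - u[0] * v[2],
                u[0] * v[1] - u[1] * v[0])

    def dot(u, v):
        return u[0] * v[0] + u[1] * v[1] + u[2] * v[2]

    area = volume = 0.0
    for i, j, k in triangles:
        a, b, c = vertices[i], vertices[j], vertices[k]
        n = cross(sub(b, a), sub(c, a))
        area += dot(n, n) ** 0.5 / 2.0       # triangle area
        volume += dot(a, cross(b, c)) / 6.0  # signed tetrahedron volume
    return area, abs(volume)

# Sanity check on a unit cube (12 triangles, outward orientation):
verts = [(0, 0, 0), (1, 0, 0), (1, 1, 0), (0, 1, 0),
         (0, 0, 1), (1, 0, 1), (1, 1, 1), (0, 1, 1)]
tris = [(0, 3, 2), (0, 2, 1), (4, 5, 6), (4, 6, 7),
        (0, 1, 5), (0, 5, 4), (2, 3, 7), (2, 7, 6),
        (0, 4, 7), (0, 7, 3), (1, 2, 6), (1, 6, 5)]
area, vol = mesh_area_volume(verts, tris)   # area = 6.0, vol = 1.0
```

The signed-volume trick is what makes the calculation robust for the irregular, branching isosurfaces produced by root segmentation, provided the mesh is watertight.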
PIQMIe: a web server for semi-quantitative proteomics data management and analysis
Kuzniar, Arnold; Kanaar, Roland
2014-01-01
We present the Proteomics Identifications and Quantitations Data Management and Integration Service or PIQMIe that aids in reliable and scalable data management, analysis and visualization of semi-quantitative mass spectrometry based proteomics experiments. PIQMIe readily integrates peptide and (non-redundant) protein identifications and quantitations from multiple experiments with additional biological information on the protein entries, and makes the linked data available in the form of a light-weight relational database, which enables dedicated data analyses (e.g. in R) and user-driven queries. Using the web interface, users are presented with a concise summary of their proteomics experiments in numerical and graphical forms, as well as with a searchable protein grid and interactive visualization tools to aid in the rapid assessment of the experiments and in the identification of proteins of interest. The web server not only provides data access through a web interface but also supports programmatic access through RESTful web service. The web server is available at http://piqmie.semiqprot-emc.cloudlet.sara.nl or http://www.bioinformatics.nl/piqmie. This website is free and open to all users and there is no login requirement. PMID:24861615
2013-01-01
The Chemical Events Working Group of the Global Health Security Initiative has developed a flexible screening tool for chemicals that present a risk when accidentally or deliberately released into the atmosphere. The tool is generic, semi-quantitative, independent of site, situation and scenario, encompasses all chemical hazards (toxicity, flammability and reactivity), and can be easily and quickly implemented by non-subject matter experts using freely available, authoritative information. Public health practitioners and planners can use the screening tool to assist them in directing their activities in each of the five stages of the disaster management cycle. PMID:23517410
Addison, Prue F E; Flander, Louisa B; Cook, Carly N
2017-08-01
Protected area management effectiveness (PAME) evaluation is increasingly undertaken to evaluate governance, assess conservation outcomes and inform evidence-based management of protected areas (PAs). Within PAME, quantitative approaches to assess biodiversity outcomes are now emerging, where biological monitoring data are directly assessed against quantitative (numerically defined) condition categories (termed quantitative condition assessments). However, more commonly qualitative condition assessments are employed in PAME, which use descriptive condition categories and are evaluated largely with expert judgement that can be subject to a range of biases, such as linguistic uncertainty and overconfidence. Despite the benefits of increased transparency and repeatability of evaluations, quantitative condition assessments are rarely used in PAME. To understand why, we interviewed practitioners from all Australian marine protected area (MPA) networks, which have access to long-term biological monitoring data and are developing or conducting PAME evaluations. Our research revealed that there is a desire within management agencies to implement quantitative condition assessment of biodiversity outcomes in Australian MPAs. However, practitioners report many challenges in transitioning from undertaking qualitative to quantitative condition assessments of biodiversity outcomes, which are hampering progress. Challenges include a lack of agency capacity (staff numbers and money), knowledge gaps, and diminishing public and political support for PAs. We point to opportunities to target strategies that will assist agencies overcome these challenges, including new decision support tools, approaches to better finance conservation efforts, and to promote more management relevant science. While a single solution is unlikely to achieve full evidence-based conservation, we suggest ways for agencies to target strategies and advance PAME evaluations toward best practice. 
Copyright © 2017 Elsevier Ltd. All rights reserved.
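The transparency and repeatability advantage of quantitative condition assessment comes from replacing expert judgement with numerically defined categories. A minimal sketch of that idea, with illustrative labels and bounds rather than any agency's actual scheme:

```python
def condition_category(value, thresholds):
    """Assign a quantitative condition category to a monitored indicator.

    `thresholds` is a descending list of (lower_bound, label) pairs; the
    first category whose lower bound the indicator value meets is
    returned. Numeric bounds make the evaluation repeatable and free of
    linguistic uncertainty, in contrast with descriptive qualitative
    categories. (Labels and bounds here are purely illustrative.)
    """
    for lower, label in thresholds:
        if value >= lower:
            return label
    return thresholds[-1][1]

# e.g. an indicator scaled 0-1, such as biomass relative to a reference site
cats = [(0.9, "very good"), (0.7, "good"), (0.5, "fair"), (0.0, "poor")]
category = condition_category(0.82, cats)   # -> "good"
```

The hard part the interviewed practitioners describe is not this mechanical step but agreeing on the reference values that define the bounds, which requires both monitoring data and agency capacity.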
Targeting Millennials: Social Media Strategies within Higher Education
ERIC Educational Resources Information Center
Sessa, Whitney L.
2015-01-01
Using a quantitative survey method with an online questionnaire as the data collection tool, the author surveyed 189 social media managers working at American Higher Education institutions to identify forms of social media in use, along with the most popular strategies that colleges and universities use with Facebook.
Modeling the Effects of Conservation Tillage on Water Quality at the Field Scale
USDA-ARS?s Scientific Manuscript database
The development and application of predictive tools to quantitatively assess the effects of tillage and related management activities should be carefully tested against high quality field data. This study reports on: 1) the calibration and validation of the Root Zone Water Quality Model (RZWQM) to a...
USDA-ARS?s Scientific Manuscript database
The western corn rootworm (WCR), Diabrotica virgifera virgifera, is an insect pest of corn, and population suppression with chemical insecticides is an important management tool. Traits conferring organophosphate insecticide resistance have increased in frequency among WCR populations, resulting in...
Developing user-friendly habitat suitability tools from regional stream fish survey data
Zorn, T.G.; Seelbach, P.; Wiley, M.J.
2011-01-01
We developed user-friendly fish habitat suitability tools (plots) for fishery managers in Michigan; these tools are based on driving habitat variables and fish population estimates for several hundred stream sites throughout the state. We generated contour plots to show patterns in fish biomass for over 60 common species (and for 120 species grouped at the family level) in relation to axes of catchment area and low-flow yield (90% exceedance flow divided by catchment area) and also in relation to axes of mean and weekly range of July temperatures. The plots showed distinct patterns in fish habitat suitability at each level of biological organization studied and were useful for quantitatively comparing river sites. We demonstrate how these plots can be used to support stream management, and we provide examples pertaining to resource assessment, trout stocking, angling regulations, chemical reclamation of marginal trout streams, indicator species, instream flow protection, and habitat restoration. These straightforward and effective tools are electronically available so that managers can easily access and incorporate them into decision protocols and presentations.
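The data summary behind contour plots of this kind is a two-dimensional binning of survey sites by habitat axes, with mean biomass per cell. The sketch below shows that generic summary step (not the authors' code; edge values and sites are hypothetical):

```python
import math
from collections import defaultdict

def suitability_grid(sites, x_edges, y_edges):
    """Mean fish biomass on a grid of two habitat axes.

    `sites` is a list of (catchment_km2, lowflow_yield, biomass) tuples.
    Sites are binned by log10 catchment area (x) and low-flow yield (y),
    and biomass is averaged per cell -- the kind of gridded summary that
    contour plots of habitat suitability are drawn from.
    """
    sums = defaultdict(float)
    counts = defaultdict(int)
    for area, yield_, biomass in sites:
        x = math.log10(area)
        xi = sum(1 for e in x_edges if x >= e) - 1   # bin index per axis;
        yi = sum(1 for e in y_edges if yield_ >= e) - 1  # -1 = below range
        if xi >= 0 and yi >= 0:
            sums[(xi, yi)] += biomass
            counts[(xi, yi)] += 1
    return {cell: sums[cell] / counts[cell] for cell in counts}

# Hypothetical sites: (catchment km^2, low-flow yield, species biomass)
sites = [(10, 0.001, 5.0), (12, 0.0012, 7.0), (900, 0.01, 2.0)]
grid = suitability_grid(sites, x_edges=[0, 1, 2, 3], y_edges=[0, 0.005, 0.02])
```

A manager can then read a candidate site's coordinates on the same axes and compare its cell mean against other sites, which is the comparison use the abstract describes.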
MO-C-BRCD-03: The Role of Informatics in Medical Physics and Vice Versa.
Andriole, K
2012-06-01
Like Medical Physics, Imaging Informatics encompasses concepts touching every aspect of the imaging chain, from image creation, acquisition, management and archival to image processing, analysis, display and interpretation. The two disciplines are in fact quite complementary, with similar goals: to improve the quality of care provided to patients using an evidence-based approach, to assure safety in the clinical and research environments, to facilitate efficiency in the workplace, and to accelerate knowledge discovery. Use-cases describing several areas of informatics activity will be given to illustrate current limitations that would benefit from medical physicist participation, and conversely areas in which informaticists may contribute to the solution. Topics to be discussed include radiation dose monitoring, process management and quality control, display technologies, business analytics techniques, and quantitative imaging. Quantitative imaging is increasingly becoming an essential part of biomedical research as well as being incorporated into clinical diagnostic activities. Referring clinicians are asking for more objective information to be gleaned from the imaging tests that they order so that they may make the best clinical management decisions for their patients. Medical physicists may be called upon to identify existing issues as well as to develop, validate and implement new approaches and technologies to help move the field further toward quantitative imaging methods for the future. Biomedical imaging informatics tools and techniques such as standards, integration, data mining, cloud computing and new systems architectures, ontologies and lexicons, data visualization and navigation tools, and business analytics applications can be used to overcome some of the existing limitations. Learning objectives: 1. Describe what is meant by Medical Imaging Informatics and understand why the medical physicist should care. 2. Identify existing limitations in information technologies with respect to Medical Physics, and conversely see how Informatics may assist the medical physicist in filling some of the current gaps in their activities. 3. Understand general informatics concepts and areas of investigation, including imaging and workflow standards, systems integration, computing architectures, ontologies, data mining and business analytics, data visualization and human-computer interface tools, and the importance of quantitative imaging for the future of Medical Physics and Imaging Informatics. 4. Become familiar with ongoing efforts to address current challenges facing future research into and clinical implementation of quantitative imaging applications. © 2012 American Association of Physicists in Medicine.
Foley, Finbar; Rajagopalan, Srinivasan; Raghunath, Sushravya M; Boland, Jennifer M; Karwoski, Ronald A; Maldonado, Fabien; Bartholmai, Brian J; Peikert, Tobias
2016-01-01
Increased clinical use of chest high-resolution computed tomography results in increased identification of lung adenocarcinomas and persistent subsolid opacities. However, these lesions range from very indolent to extremely aggressive tumors. Clinically relevant diagnostic tools to noninvasively risk stratify and guide individualized management of these lesions are lacking. Research efforts investigating semiquantitative measures to decrease interrater and intrarater variability are emerging, and in some cases steps have been taken to automate this process. However, many such methods are still suboptimal, require validation and are not yet clinically applicable. The computer-aided nodule assessment and risk yield software application represents a validated, automated, quantitative, and noninvasive tool for risk stratification of adenocarcinoma lung nodules. Computer-aided nodule assessment and risk yield correlates well with consensus histology and postsurgical patient outcomes, and therefore may help to guide individualized patient management, for example, in identification of nodules amenable to radiological surveillance, or in need of adjunctive therapy. Copyright © 2016 Elsevier Inc. All rights reserved.
Bardos, R Paul; Bone, Brian D; Boyle, Richard; Evans, Frank; Harries, Nicola D; Howard, Trevor; Smith, Jonathan W N
2016-09-01
The scale of land-contamination problems, and of the responses to them, makes achieving sustainability in contaminated land remediation an important objective. The Sustainable Remediation Forum in the UK (SuRF-UK) was established in 2007 to support more sustainable remediation practice in the UK. The current international interest in 'sustainable remediation' has achieved a fairly rapid consensus on concepts, descriptions and definitions for sustainable remediation, which are now being incorporated into an ISO standard. However, the sustainability assessment methods being used remain diverse, with a range of (mainly) semi-quantitative and quantitative approaches and tools developed, or in development. Sustainability assessment is site specific and subjective. It depends on the inclusion of a wide range of considerations across different stakeholder perspectives. Taking a tiered approach to sustainability assessment offers important advantages, starting from a qualitative assessment and moving through to semi-quantitative and quantitative assessments on an 'as required' basis only. It is also clear that there are a number of 'easy wins' that could improve performance against sustainability criteria right across the site management process. SuRF-UK has provided a checklist of 'sustainable management practices' that describes some of these. This paper provides the rationale for, and an outline of, recently published SuRF-UK guidance on preparing for and framing sustainability assessments; carrying out qualitative sustainability assessment; and simple good management practices to improve sustainability across contaminated land management activities. Copyright © 2015 The Authors. Published by Elsevier B.V. All rights reserved.
Thiessen, Lindsey D; Neill, Tara M; Mahaffee, Walter F
2018-01-01
Plant pathogen detection systems have been useful tools to monitor inoculum presence and initiate management schedules. More recently, a loop-mediated isothermal amplification (LAMP) assay was successfully designed for field use in the grape powdery mildew pathosystem; however, false negatives or false positives were prevalent in grower-conducted assays due to the difficulty of perceiving the magnesium pyrophosphate precipitate at low DNA concentrations. A quantitative LAMP (qLAMP) assay using a fluorescence resonance energy transfer-based probe was assessed by grape growers in the Willamette Valley of Oregon. Custom impaction spore samplers were placed at a research vineyard and six commercial vineyard locations, and were tested bi-weekly by the lab and by growers. Grower-conducted qLAMP assays used a beta version of the Smart-DART handheld LAMP reaction devices (Diagenetix, Inc., Honolulu, HI, USA), connected to Android 4.4-enabled, Bluetooth-capable Nexus 7 tablets for output. Quantification by a quantitative PCR assay was taken as the reference for comparing lab and grower qLAMP quantification. Growers were able to conduct and interpret qLAMP results; however, Erysiphe necator inoculum quantification was unreliable using the beta Smart-DART devices. The qLAMP assay was sensitive to a single spore in early testing, but its detection limit rose to more than 20 spores by the end of the trial. The qLAMP assay is not likely a suitable management tool for grape powdery mildew, given its loss of sensitivity and the decreasing cost and increasing portability of other, more reliable molecular tools.
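Quantification in assays like qLAMP or qPCR is typically done against a standard curve: amplification signal time is linear in the log of the starting copy number, so a dilution series calibrates the relationship and new signals are inverted through it. The sketch below shows that generic approach with hypothetical dilution data; it is not the study's calibration:

```python
import math

def fit_standard_curve(spore_counts, signal_times):
    """Least-squares fit of signal time vs log10(spore count).

    Fits t = slope * log10(n) + intercept over a dilution series and
    returns (slope, intercept). This is the usual standard-curve
    approach to quantification, not the specific calibration used in
    the study.
    """
    xs = [math.log10(n) for n in spore_counts]
    ys = list(signal_times)
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys)) /
             sum((x - mx) ** 2 for x in xs))
    intercept = my - slope * mx
    return slope, intercept

def estimate_spores(t, slope, intercept):
    """Invert the curve: spore count implied by a new signal time t."""
    return 10 ** ((t - intercept) / slope)

# Hypothetical dilution series: 1, 10, 100, 1000 spores
slope, intercept = fit_standard_curve([1, 10, 100, 1000],
                                      [30.0, 25.0, 20.0, 15.0])
count = estimate_spores(20.0, slope, intercept)   # -> 100 spores
```

The reported drift from single-spore sensitivity to a >20-spore detection limit corresponds to the low-concentration end of such a curve becoming unusable over the trial.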
Benchmarking of Decision-Support Tools Used for Tiered Sustainable Remediation Appraisal.
Smith, Jonathan W N; Kerrison, Gavin
2013-01-01
Sustainable remediation comprises soil and groundwater risk-management actions that are selected, designed, and operated to maximize net environmental, social, and economic benefit (while assuring protection of human health and safety). This paper describes a benchmarking exercise to comparatively assess potential differences in environmental management decision making resulting from application of different sustainability appraisal tools ranging from simple (qualitative) to more quantitative (multi-criteria and fully monetized cost-benefit analysis), as outlined in the SuRF-UK framework. The appraisal tools were used to rank remedial options for risk management of a subsurface petroleum release that occurred at a petrol filling station in central England. The remediation options were benchmarked using a consistent set of soil and groundwater data for each tier of sustainability appraisal. The ranking of remedial options was very similar in all three tiers, and an environmental management decision to select the most sustainable options at tier 1 would have been the same decision at tiers 2 and 3. The exercise showed that, for relatively simple remediation projects, a simple sustainability appraisal led to the same remediation option selection as more complex appraisal, and can be used to reliably inform environmental management decisions on other relatively simple land contamination projects.
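The tier-2 (multi-criteria) appraisal the benchmarking exercise describes has, at its simplest, the shape of a weighted-sum score across environmental, social and economic criteria. The option names, scores and weights below are hypothetical, and this is a generic multi-criteria sketch, not the SuRF-UK tooling:

```python
def rank_options(scores, weights):
    """Weighted-sum multi-criteria ranking of remediation options.

    `scores` maps option name -> {criterion: score}; `weights` maps
    criterion -> weight. Options are ranked (best first) by the weighted
    sum of their criterion scores -- the generic shape of a
    semi-quantitative tier-2 appraisal.
    """
    totals = {opt: sum(weights[c] * s for c, s in crit.items())
              for opt, crit in scores.items()}
    return sorted(totals, key=totals.get, reverse=True)

# Hypothetical options scored 1-5 against three criterion groups
options = {
    "monitored natural attenuation": {"env": 4, "soc": 3, "econ": 5},
    "excavate and dispose":          {"env": 2, "soc": 2, "econ": 1},
    "in-situ bioremediation":        {"env": 5, "soc": 4, "econ": 3},
}
ranking = rank_options(options, {"env": 0.5, "soc": 0.25, "econ": 0.25})
```

The benchmarking finding, that tier-1 qualitative ranking agreed with tiers 2 and 3, suggests the ordering produced by such a score is often insensitive to whether the inputs are judgements, semi-quantitative scores, or monetized values, at least for simple sites.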
Maturity Curve of Systems Engineering
2008-12-01
A qualitative approach was used to acquire and manage the data for this analysis (Hart, 1998), and a quantitative tool was used to examine and evaluate the data. The qualitative approach was intended to sort the acquired traits
Teaching Quantitative Management to Evening MBA Students.
ERIC Educational Resources Information Center
Libby, Barbara
1984-01-01
The author discusses the mathematics background of Masters of Business Administration (MBA) students and asks what math tools are necessary for an MBA. While she finds useful the ability to deal with linear and quadratic equations; interest, depreciation, and growth rates; and word problems, she concludes that calculus is of little use apart from…
IMO and Internal Branding Outcomes: An Employee Perspective in UK HE
ERIC Educational Resources Information Center
Yu, Qionglei; Asaad, Yousra; Yen, Dorothy A.; Gupta, Suraksha
2018-01-01
This study extends our knowledge of internal branding in the context of employees in the higher education sector. Employing a quantitative methodology in UK universities, a conceptual model is presented and tested on 235 employees. Internal market orientation (IMO) is examined as a management tool to drive employees' university brand commitment…
Urban vacant land typology: A tool for managing urban vacant land
Gunwoo Kim; Patrick A. Miller; David J. Nowak
2018-01-01
A typology of urban vacant land was developed, using Roanoke, Virginia, as the study area. A comprehensive literature review, field measurements and observations (including photographs), and a quantitative approach to assessing vacant-land forest structure and values (i-Tree Eco sampling) were utilized, along with aerial photo interpretation and ground-truthing...
USDA-ARS?s Scientific Manuscript database
Plant pathogen detection systems have been useful tools to monitor inoculum presence and initiate management schedules. More recently, a LAMP assay was successfully designed for field use in the grape powdery mildew pathosystem; however, false negatives or false positives were prevalent in grower-co...
Monitoring Urban Quality of Life: The Porto Experience
ERIC Educational Resources Information Center
Santos, Luis Delfim; Martins, Isabel
2007-01-01
This paper describes the monitoring system of the urban quality of life developed by the Porto City Council, a new tool being used to support urban planning and management. The two components of this system--a quantitative approach based on statistical indicators and a qualitative analysis based on the citizens' perceptions of the conditions of…
49 CFR Appendix C to Part 195 - Guidance for Implementation of an Integrity Management Program
Code of Federal Regulations, 2014 CFR
2014-10-01
... get this information from topographical maps such as U.S. Geological Survey quadrangle maps. (2... risk.)—Risk Value=3 Close interval survey: (yes/no)—no—Risk Value =5 Internal Inspection tool used..., including a summary of performance improvements, both qualitative and quantitative, to an operator's...
Improving management of type 2 diabetes - findings of the Type2Care clinical audit.
Barlow, John; Krassas, George
2013-01-01
Type 2 diabetes was responsible for 5.8% of the total disease burden in Australia in 2010. Despite advances in clinical management, many type 2 diabetes (T2D) patients have suboptimal glycaemic control. Using quantitative questionnaires, general practitioners prospectively evaluated their management of 761 T2D patients at two time points, 6 months apart. Following the first audit, GPs received feedback and a decision support tool. Patients were then re-audited to assess whether the intervention altered management. The use of annual cycle-of-care plans significantly increased, by 12%, during the audit. General practitioner performance improved across all measures, with the greatest gains in the use of care plans and in measuring and meeting targets for microalbumin. Glycaemic control was well managed in this cohort (mean HbA1c 6.9% for both audit cycles). The Type2Care clinical audit provided decision support tools and diabetes registers that improved the delivery of care to patients with T2D.
NASA Technical Reports Server (NTRS)
Long, Dou; Lee, David; Johnson, Jesse; Gaier, Eric; Kostiuk, Peter
1999-01-01
This report describes an integrated model of air traffic management (ATM) tools under development in two National Aeronautics and Space Administration (NASA) programs: Terminal Area Productivity (TAP) and Advanced Air Transport Technologies (AATT). The model is made by adjusting parameters of LMINET, a queuing network model of the National Airspace System (NAS), which the Logistics Management Institute (LMI) developed for NASA. Operating LMINET with models of various combinations of TAP and AATT tools will give quantitative information about the effects of the tools on operations of the NAS. The costs of delays under different scenarios are calculated. An extension of the Air Carrier Investment Model (ACIM) under ASAC, developed by the Institute for NASA, maps the technologies' impacts on NAS operations into cross-comparable benefits estimates for technologies and sets of technologies.
Stocker, Elena; Toschkoff, Gregor; Sacher, Stephan; Khinast, Johannes G
2014-11-20
The purpose of this study is to evaluate the use of computer simulations for generating quantitative knowledge as a basis for risk ranking and mechanistic process understanding, as required by ICH Q9 on quality risk management systems. The main focus of this publication is the demonstration of a risk assessment workflow that includes a computer simulation for generating mechanistic understanding of active tablet coating in a pan coater. Process parameter screening studies are statistically planned in consideration of their impacts on a potentially critical quality attribute, i.e., coating mass uniformity. Based on the computer simulation data, a process failure mode and effects analysis of the risk factors is performed. This results in a quantitative criticality assessment of process parameters and a risk priority evaluation of failure modes. The factor for quantitative reassessment of criticality and risk priority is the coefficient of variation, which represents the coating mass uniformity. The major conclusion drawn from this work is a successful demonstration of the integration of computer simulation into the risk management workflow, leading to an objective and quantitative risk assessment.
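The reassessment step described above, in which a coefficient of variation drives the criticality and risk priority scores, can be illustrated with a small sketch. The CV-to-severity thresholds and all numbers are hypothetical, not taken from the publication.

```python
def coefficient_of_variation(values):
    """CV = population standard deviation / mean, as a fraction."""
    n = len(values)
    mean = sum(values) / n
    var = sum((v - mean) ** 2 for v in values) / n
    return (var ** 0.5) / mean

def severity_from_cv(cv):
    """Map coating-mass CV to a 1..5 FMEA severity score (thresholds invented)."""
    if cv < 0.05:
        return 1
    if cv < 0.10:
        return 3
    return 5

# Simulated per-tablet coating masses (mg) for one parameter setting.
masses = [19.8, 20.1, 20.3, 19.9, 20.0, 20.2]
cv = coefficient_of_variation(masses)
severity = severity_from_cv(cv)
occurrence, detection = 2, 3             # hypothetical FMEA scores
rpn = severity * occurrence * detection  # risk priority number
```

In an FMEA-style workflow, failure modes would then be ranked by `rpn` and the highest-priority ones targeted for mitigation.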
A farm-level precision land management framework based on integer programming
Li, Qi; Hu, Guiping; Jubery, Talukder Zaki; Ganapathysubramanian, Baskar
2017-01-01
Farmland management involves several planning and decision-making tasks, including seed selection and irrigation management. A farm-level precision farmland management model based on mixed integer linear programming is proposed in this study. Optimal decisions are designed for pre-season planning of crops and irrigation water allocation. The model captures the effect of the size and shape of the decision scale as well as special irrigation patterns. The authors illustrate the model with a case study on a farm in the state of California in the U.S. and show that the model can capture the impact of precision farm management on profitability. The results show that farmers could achieve a threefold increase in annual net profit by carefully choosing irrigation and seed selection. Although farmers could increase profits by applying precision management to seed or irrigation alone, the profit increase is more significant if they apply precision management to seed and irrigation simultaneously. The proposed model can also serve as a risk analysis tool for farmers facing seasonal irrigation water limits, as well as a quantitative tool to explore the impact of precision agriculture. PMID:28346499
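As a rough illustration of the kind of decision the paper's mixed integer program makes, the toy sketch below chooses a seed variety and an irrigation level per plot to maximize net profit under a seasonal water limit. A real MILP would use a solver; this tiny instance is solved by brute-force enumeration, and all profit and water figures are invented.

```python
from itertools import product

SEEDS = ("drought_tolerant", "high_yield")
IRRIGATION = (0, 1, 2)  # acre-feet applied per plot

# PROFIT[(seed, irrigation)] = net profit per plot ($), invented for illustration.
PROFIT = {
    ("drought_tolerant", 0): 300, ("drought_tolerant", 1): 380, ("drought_tolerant", 2): 400,
    ("high_yield", 0): 150, ("high_yield", 1): 420, ("high_yield", 2): 520,
}

def best_plan(n_plots, water_limit):
    """Enumerate all (seed, irrigation) assignments; return (best profit, plan)."""
    best = (float("-inf"), None)
    for plan in product(product(SEEDS, IRRIGATION), repeat=n_plots):
        water = sum(irr for _, irr in plan)
        if water > water_limit:
            continue  # infeasible under the seasonal water constraint
        profit = sum(PROFIT[choice] for choice in plan)
        best = max(best, (profit, plan))
    return best

profit, plan = best_plan(n_plots=3, water_limit=4)
```

Varying `water_limit` and re-solving gives exactly the kind of risk analysis the paper describes for farmers facing seasonal irrigation limits.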
Systematic Sustainability Assessment (SSA) Tool for Hydroelectric Project in Malaysia
NASA Astrophysics Data System (ADS)
Turan, Faiz Mohd; Johan, Kartina
2017-08-01
Sustainably developed and managed hydropower has enormous potential to contribute to global sustainability goals. Hydroelectricity is known to contribute only small amounts to greenhouse gas emissions and other atmospheric pollutants. However, developing the remaining hydroelectric potential offers many challenges, and public pressure and expectations regarding the environmental and social performance of hydroelectric projects tend to increase over time. This paper aims to develop a Systematic Sustainability Assessment (SSA) Tool that promotes and guides more sustainable hydroelectric projects in the context of Malaysia. The proposed SSA tool not only provides a qualitative and quantitative report of sustainability performance but also acts as a Self-Assessment Report (SAR) that provides a roadmap for achieving a greater level of sustainability in project management for continuous improvement. It is expected to provide a common language that allows government, civil society, financial institutions and the hydroelectric sector to discuss and evaluate sustainability issues. An advantage of the SSA tool is that it can be used at any stage of hydroelectric development, from the earliest planning stages right through to operation.
Human judgment vs. quantitative models for the management of ecological resources.
Holden, Matthew H; Ellner, Stephen P
2016-07-01
Despite major advances in quantitative approaches to natural resource management, there has been resistance to using these tools in the actual practice of managing ecological populations. Given a managed system and a set of assumptions, translated into a model, optimization methods can be used to solve for the most cost-effective management actions. However, when the underlying assumptions are not met, such methods can potentially lead to decisions that harm the environment and economy. Managers who develop decisions based on past experience and judgment, without the aid of mathematical models, can potentially learn about the system and develop flexible management strategies. However, these strategies are often based on subjective criteria and equally invalid and often unstated assumptions. Given the drawbacks of both methods, it is unclear whether simple quantitative models improve environmental decision making over expert opinion. In this study, we explore how well students, using their experience and judgment, manage simulated fishery populations in an online computer game and compare their management outcomes to the performance of model-based decisions. We consider harvest decisions generated using four different quantitative models: (1) the model used to produce the simulated population dynamics observed in the game, with the values of all parameters known (as a control), (2) the same model, but with unknown parameter values that must be estimated during the game from observed data, (3) models that are structurally different from those used to simulate the population dynamics, and (4) a model that ignores age structure. Humans on average performed much worse than the models in cases 1-3, but in a small minority of scenarios, models produced worse outcomes than those resulting from students making decisions based on experience and judgment. 
When the models ignored age structure, they generated poorly performing management decisions, but still outperformed students using experience and judgment 66% of the time. © 2016 by the Ecological Society of America.
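The flavor of this human-versus-model comparison can be sketched with a minimal discrete-time logistic fishery, contrasting a model-derived fixed harvest fraction with an over-aggressive "judgment" caricature. This is not the game model used in the study; all parameters are illustrative.

```python
def simulate_fishery(policy, n0=0.5, r=0.8, k=1.0, years=50):
    """Discrete logistic stock dynamics with a harvest fraction chosen each year.

    policy(n) returns the fraction of the current stock harvested that year.
    Returns (total harvest, final stock). Parameters are illustrative.
    """
    n, total = n0, 0.0
    for _ in range(years):
        catch = policy(n) * n
        total += catch
        n -= catch
        n = max(0.0, n + r * n * (1 - n / k))  # logistic growth after harvest
    return total, n

# "Model-based" policy: harvest a fixed fraction near the sustainable rate.
msy_like = lambda n: 0.4
# Over-aggressive policy: harvest 90% of the stock every year.
greedy = lambda n: 0.9

h_model, n_model = simulate_fishery(msy_like)
h_greedy, n_greedy = simulate_fishery(greedy)
```

Over a 50-year horizon the moderate policy sustains the stock and yields more total harvest, while the aggressive policy collapses the population, the qualitative trade-off the game confronts students with.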
Tool for Human-Systems Integration Assessment: HSI Scorecard
NASA Technical Reports Server (NTRS)
Whitmore, Nihriban; Sandor, Aniko; McGuire, Kerry M.; Berdich, Debbie
2009-01-01
This paper describes the development and rationale for a human-systems integration (HSI) scorecard that can be used in reviews of vehicle specification and design. This tool can be used to assess whether specific HSI-related criteria have been met as part of a project milestone or critical event, such as technical reviews, crew station reviews, mockup evaluations, or even reviews of major plans or processes. Examples of HSI-related criteria include Human Performance Capabilities, Health Management, Human System Interfaces, Anthropometry and Biomechanics, and Natural and Induced Environments. The tool is not intended to evaluate requirements compliance and verification, but to review how well the human-related systems have been considered for the specific event and to identify gaps and vulnerabilities from an HSI perspective. The scorecard offers a common basis and criteria for discussions among system managers, evaluators, and design engineers. Furthermore, the scorecard items highlight the main areas of system development that need to be followed during the system lifecycle. The ratings provide a repeatable quantitative measure of what has often been seen as only subjective commentary. Thus, the scorecard is anticipated to be a useful HSI tool for communicating review results to institutional and project office management.
NASA Technical Reports Server (NTRS)
Karandikar, Harsh M.
1997-01-01
An approach for objective and quantitative technical and cost risk analysis during product development, which is applicable from the earliest stages, is discussed. The approach is supported by a software tool called the Analytical System for Uncertainty and Risk Estimation (ASURE). Details of ASURE, the underlying concepts and its application history, are provided.
Forage resource evaluation system for habitat—deer: an interactive deer habitat model
Thomas A. Hanley; Donald E. Spalinger; Kenrick J. Mock; Oran L. Weaver; Grant M. Harris
2012-01-01
We describe a food-based system for quantitatively evaluating habitat quality for deer called the Forage Resource Evaluation System for Habitat and provide its rationale and suggestions for use. The system was developed as a tool for wildlife biologists and other natural resource managers and planners interested in evaluating habitat quality and, especially, comparing...
Application of Simulation to Individualized Self-Paced Training. Final Report. TAEG Report No. 11-2.
ERIC Educational Resources Information Center
Lindahl, William H.; Gardner, James H.
Computer simulation is recognized as a valuable systems analysis research tool which enables the detailed examination, evaluation, and manipulation, under stated conditions, of a system without direct action on the system. This technique provides management with quantitative data on system performance and capabilities which can be used to compare…
Sorbello, Alfred; Ripple, Anna; Tonning, Joseph; Munoz, Monica; Hasan, Rashedul; Ly, Thomas; Francis, Henry; Bodenreider, Olivier
2017-03-22
We seek to develop a prototype software analytical tool to augment FDA regulatory reviewers' capacity to harness scientific literature reports in PubMed/MEDLINE for pharmacovigilance and adverse drug event (ADE) safety signal detection. We also aim to gather feedback through usability testing to assess design, performance, and user satisfaction with the tool. A prototype, open source, web-based, software analytical tool generated statistical disproportionality data mining signal scores and dynamic visual analytics for ADE safety signal detection and management. We leveraged Medical Subject Heading (MeSH) indexing terms assigned to published citations in PubMed/MEDLINE to generate candidate drug-adverse event pairs for quantitative data mining. Six FDA regulatory reviewers participated in usability testing by employing the tool as part of their ongoing real-life pharmacovigilance activities to provide subjective feedback on its practical impact, added value, and fitness for use. All usability test participants cited the tool's ease of learning, ease of use, and generation of quantitative ADE safety signals, some of which corresponded to known established adverse drug reactions. Potential concerns included the comparability of the tool's automated literature search relative to a manual 'all fields' PubMed search, missing drugs and adverse event terms, interpretation of signal scores, and integration with existing computer-based analytical tools. Usability testing demonstrated that this novel tool can automate the detection of ADE safety signals from published literature reports. Various mitigation strategies are described to foster improvements in design, productivity, and end user satisfaction.
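Disproportionality signal scores like those the tool computes are typically derived from a 2x2 contingency table of drug-event co-occurrence counts. A common example is the proportional reporting ratio (PRR); the counts and threshold below are hypothetical, and the tool's actual scoring method may differ.

```python
def proportional_reporting_ratio(a, b, c, d):
    """PRR for a drug-event pair from a 2x2 contingency table.

    a: reports mentioning the drug AND the event of interest
    b: reports mentioning the drug with other events
    c: reports of the event with other drugs
    d: reports mentioning neither
    PRR = (a / (a + b)) / (c / (c + d))
    """
    return (a / (a + b)) / (c / (c + d))

# Hypothetical literature-derived counts for one candidate drug-event pair.
prr = proportional_reporting_ratio(a=20, b=180, c=100, d=9700)
signal = prr > 2  # a common rule-of-thumb screening threshold
```

A reviewer would treat pairs with high PRR (plus minimum-count criteria) as candidate safety signals for manual follow-up, not as confirmed adverse reactions.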
Wildlife in the cloud: a new approach for engaging stakeholders in wildlife management.
Chapron, Guillaume
2015-11-01
Research in wildlife management increasingly relies on quantitative population models. However, a remaining challenge is to have end-users, who are often alienated by mathematics, benefiting from this research. I propose a new approach, 'wildlife in the cloud,' to enable active learning by practitioners from cloud-based ecological models whose complexity remains invisible to the user. I argue that this concept carries the potential to overcome limitations of desktop-based software and allows new understandings of human-wildlife systems. This concept is illustrated by presenting an online decision-support tool for moose management in areas with predators in Sweden. The tool takes the form of a user-friendly cloud-app through which users can compare the effects of alternative management decisions, and may feed into adjustment of their hunting strategy. I explain how the dynamic nature of cloud-apps opens the door to different ways of learning, informed by ecological models that can benefit both users and researchers.
NASA Astrophysics Data System (ADS)
Whittaker, Kara A.; McShane, Dan
2012-04-01
The objective of this study was to assess and compare the ability of two slope instability screening tools developed by the Washington State Department of Natural Resources (WDNR) to assess landslide risks associated with forestry activities. HAZONE is based on a semi-quantitative method that incorporates the landslide frequency rate and landslide area rate for delivery of mapped landforms. SLPSTAB is a GIS-based model of inherent landform characteristics that utilizes slope geometry derived from DEMs and climatic data. Utilization of slope instability screening tools by geologists, land managers, and regulatory agencies can reduce the frequency and magnitude of landslides. Aquatic habitats are negatively impacted by elevated rates and magnitudes of landslides associated with forest management practices due to high sediment loads and alteration of stream channels and morphology. In 2007 a large storm with heavy rainfall impacted southwestern Washington State, triggering over 2500 landslides. This storm event and the accompanying landslides provide an opportunity to assess the slope stability screening tools developed by WDNR. Landslide density (up to 6.5 landslides per km2) from the storm was highest in the areas designated by the screening tools as high hazard areas, and both screening tools were equal in their ability to predict landslide locations. Landslides that initiated in low hazard areas may have resulted from a variety of site-specific factors that deviated from assumed model values, from the inadequate identification of potentially unstable landforms due to low-resolution DEMs, or from inadequate implementation of the state Forest Practices Rules. We suggest that slope instability screening tools can be better utilized by forest management planners and regulators to meet policy goals of minimizing landslide rates and impacts to sensitive aquatic species.
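GIS-based screening models like SLPSTAB commonly build on an infinite-slope stability calculation driven by slope geometry and wetness. The sketch below implements the standard infinite-slope factor-of-safety formula with illustrative soil parameters; it is not WDNR's actual implementation.

```python
import math

def infinite_slope_fs(slope_deg, cohesion, friction_deg, depth,
                      unit_weight=18.0, water_ratio=0.0, gamma_w=9.81):
    """Infinite-slope factor of safety (FS < 1 suggests instability).

    FS = [c' + (gamma - m*gamma_w) * z * cos^2(beta) * tan(phi')]
         / [gamma * z * sin(beta) * cos(beta)]
    slope_deg: slope angle beta (degrees); cohesion: c' (kPa);
    friction_deg: friction angle phi' (degrees); depth: failure depth z (m);
    unit_weight: gamma (kN/m^3); water_ratio: saturated fraction m in [0, 1].
    """
    beta = math.radians(slope_deg)
    phi = math.radians(friction_deg)
    resisting = (cohesion + (unit_weight - water_ratio * gamma_w) * depth
                 * math.cos(beta) ** 2 * math.tan(phi))
    driving = unit_weight * depth * math.sin(beta) * math.cos(beta)
    return resisting / driving

dry = infinite_slope_fs(35.0, cohesion=5.0, friction_deg=33.0, depth=1.5)
storm = infinite_slope_fs(35.0, cohesion=5.0, friction_deg=33.0, depth=1.5,
                          water_ratio=1.0)  # fully saturated during heavy rain
```

Mapping where FS drops below 1 under a storm-saturation scenario is essentially how such tools delineate high-hazard areas across a DEM.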
NASA Technical Reports Server (NTRS)
Lo, Yunnhon; Johnson, Stephen B.; Breckenridge, Jonathan T.
2014-01-01
SHM/FM theory has been successfully applied to the selection of the baseline set of Abort Triggers for the NASA SLS, and quantitative assessment played a useful role in the decision process. M&FM, which is new within NASA MSFC, required the most new work, as this quantitative analysis had never been done before: it required development of the methodology and tool to mechanize the process, and it established new relationships to the other groups. The process is now an accepted part of the SLS design process and will likely be applied to similar programs in the future at NASA MSFC. Future improvements include improving technical accuracy (differentiating crew survivability due to an abort from survivability even when no immediate abort occurs, such as a small explosion with little debris; accounting for contingent dependence of secondary triggers on primary triggers; and allocating the incremental LOC benefit of each trigger when added to the previously selected triggers) and reducing future costs through the development of a specialized tool. The methodology can be applied to any manned or unmanned vehicle, in space or terrestrial.
Time-Out Parent Inventory for Clinical and Research Applications.
ERIC Educational Resources Information Center
Clark, Lynn
The purpose of the Time-Out Parent Inventory (TOPI) is to provide an objective and quantitative assessment of a parent's self-reported use of time-out procedures to manage a child's behavior. The TOPI is intended to be a tool for researchers as well as professionals who help parents and children. The professional asks the parent a series of 12…
Survival models for harvest management of mourning dove populations
Otis, D.L.
2002-01-01
Quantitative models of the relationship between annual survival and harvest rate of migratory game-bird populations are essential to science-based harvest management strategies. I used the best available band-recovery and harvest data for mourning doves (Zenaida macroura) to build a set of models based on different assumptions about compensatory harvest mortality. Although these models suffer from lack of contemporary data, they can be used in development of an initial set of population models that synthesize existing demographic data on a management-unit scale, and serve as a tool for prioritization of population demographic information needs. Credible harvest management plans for mourning dove populations will require a long-term commitment to population monitoring and iterative population analysis.
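The additive versus compensatory harvest mortality assumptions mentioned above are often written as two simple annual-survival models. The sketch below uses the common textbook forms; the survival rate and compensation threshold are illustrative, not the paper's fitted values.

```python
def additive_survival(s0, harvest_rate):
    """Fully additive model: harvest mortality adds to natural mortality.
    S = S0 * (1 - K), where K is the harvest (kill) rate."""
    return s0 * (1 - harvest_rate)

def compensatory_survival(s0, harvest_rate, threshold=0.15):
    """Compensatory model up to a threshold c: below c, other mortality
    compensates and survival is unchanged; above c, extra harvest is additive.
    S = S0 for K <= c; S = S0 * (1 - K) / (1 - c) for K > c."""
    if harvest_rate <= threshold:
        return s0
    return s0 * (1 - harvest_rate) / (1 - threshold)

s0 = 0.45  # illustrative annual survival with no harvest
low = (additive_survival(s0, 0.10), compensatory_survival(s0, 0.10))
high = (additive_survival(s0, 0.30), compensatory_survival(s0, 0.30))
```

Comparing model-predicted survival against band-recovery estimates across a range of harvest rates is what lets an analysis like this discriminate between the two hypotheses.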
Warner, Guy C; Blum, Jesse M; Jones, Simon B; Lambert, Paul S; Turner, Kenneth J; Tan, Larry; Dawson, Alison S F; Bell, David N F
2010-08-28
The last two decades have seen substantially increased potential for quantitative social science research. This has been made possible by the significant expansion of publicly available social science datasets, the development of new analytical methodologies, such as microsimulation, and increases in computing power. These rich resources do, however, bring with them substantial challenges associated with organizing and using data. These processes are often referred to as 'data management'. The Data Management through e-Social Science (DAMES) project is working to support activities of data management for social science research. This paper describes the DAMES infrastructure, focusing on the data-fusion process that is central to the project approach. It covers: the background and requirements for provision of resources by DAMES; the use of grid technologies to provide easy-to-use tools and user front-ends for several common social science data-management tasks such as data fusion; the approach taken to solve problems related to data resources and metadata relevant to social science applications; and the implementation of the architecture that has been designed to achieve this infrastructure.
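The data-fusion step at the core of the DAMES approach can be sketched as enriching records in a primary survey dataset with fields from a secondary dataset matched on a shared key. The datasets and field names below are invented; the real infrastructure performs this through grid-based tools and metadata services.

```python
def fuse(primary, secondary, key):
    """Simple one-to-one data fusion: enrich each primary record with fields
    from the secondary dataset matched on a shared key."""
    index = {rec[key]: rec for rec in secondary}
    fused = []
    for rec in primary:
        match = index.get(rec[key], {})
        merged = {**match, **rec}  # primary values win on field conflicts
        fused.append(merged)
    return fused

# Hypothetical survey microdata and an administrative register to fuse.
survey = [{"id": 1, "income": 28000}, {"id": 2, "income": 41000}]
register = [{"id": 1, "region": "North"}, {"id": 3, "region": "South"}]
combined = fuse(survey, register, key="id")
```

Real social science data fusion adds the hard parts this sketch omits: harmonizing variable codings, documenting provenance, and handling records that match imperfectly or not at all.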
Knowledge portal: a tool to capture university requirements
NASA Astrophysics Data System (ADS)
Mansourvar, Marjan; Binti Mohd Yasin, Norizan
2011-10-01
New technologies, especially the Internet, have made a huge impact on knowledge management and information dissemination in education. The web portal as a knowledge management system is a very popular topic in many organizations, including universities. Generally, a web portal is defined as a gateway to online network-accessible resources through an intranet, extranet or the Internet. This study develops a knowledge portal for the students in the Faculty of Computer Science and Information Technology (FCSIT), University of Malaya (UM). The goals of this portal are to provide information that helps students choose the right courses and major relevant to their intended future jobs or careers in IT. A quantitative approach was used as the method for this research, as it provides an easy and useful way to collect data from a large sample population.
Erwin, Kim; Martin, Molly A; Flippin, Tara; Norell, Sarah; Shadlyn, Ariana; Yang, Jie; Falco, Paula; Rivera, Jaime; Ignoffo, Stacy; Kumar, Rajesh; Margellos-Anast, Helen; McDermott, Michael; McMahon, Kate; Mosnaim, Giselle; Nyenhuis, Sharmilee M; Press, Valerie G; Ramsay, Jessica E; Soyemi, Kenneth; Thompson, Trevonne M; Krishnan, Jerry A
2016-01-01
To present the methods and outcomes of stakeholder engagement in the development of interventions for children presenting to the emergency department (ED) for uncontrolled asthma. We engaged stakeholders (caregivers, physicians, nurses, administrators) from six EDs in a three-phase process to: define design requirements; prototype and refine; and evaluate. Interviews among 28 stakeholders yielded themes regarding in-home asthma management practices and ED discharge experiences. Quantitative and qualitative evaluation showed strong preference for the new discharge tool over current tools. Engaging end-users in contextual inquiry resulted in CAPE (CHICAGO Action Plan after ED discharge), a new stakeholder-balanced discharge tool, which is being tested in a multicenter comparative effectiveness trial.
Back, David A; Behringer, Florian; Haberstroh, Nicole; Ehlers, Jan P; Sostmann, Kai; Peters, Harm
2016-08-20
To investigate medical students' utilization of and problems with a learning management system and its e-learning tools, as well as their expectations for future developments. A single-center online survey was carried out to investigate medical students' (n = 505) usage and perception of the learning management system Blackboard and the e-learning tools provided. Data were collected with a standardized questionnaire consisting of 70 items and analyzed by quantitative and qualitative methods. The participants valued lecture notes (73.7%) and Wikipedia (74%) as their most important online sources for knowledge acquisition. Missing integration of e-learning into teaching was seen as the major pitfall (58.7%). The learning management system was mostly used for study information (68.3%), preparation of exams (63.3%) and lessons (54.5%). Clarity (98.3%), teaching-related contexts (92.5%) and easy use of e-learning offers (92.5%) were rated highest. Interactivity was most important in free-text comments (n = 123). It is desired that the contents of a learning management system support efficient learning. Interactivity of tools and their conceptual integration into face-to-face teaching are important for students. The learning management system was especially important for organizational purposes and the provision of learning materials. Teachers should be aware that free online sources such as Wikipedia enjoy high approval as sources of knowledge acquisition. This study provides an empirical basis for medical schools and teachers to improve their offerings in the field of digital learning for their students.
Back, David A.; Behringer, Florian; Haberstroh, Nicole; Ehlers, Jan P.; Sostmann, Kai
2016-01-01
Objectives To investigate medical students' utilization of and problems with a learning management system and its e-learning tools, as well as their expectations for future developments. Methods A single-center online survey was carried out to investigate medical students' (n = 505) usage and perception of the learning management system Blackboard and the e-learning tools provided. Data were collected with a standardized questionnaire consisting of 70 items and analyzed by quantitative and qualitative methods. Results The participants valued lecture notes (73.7%) and Wikipedia (74%) as their most important online sources for knowledge acquisition. Missing integration of e-learning into teaching was seen as the major pitfall (58.7%). The learning management system was mostly used for study information (68.3%), preparation of exams (63.3%) and lessons (54.5%). Clarity (98.3%), teaching-related contexts (92.5%) and easy use of e-learning offers (92.5%) were rated highest. Interactivity was most important in free-text comments (n = 123). Conclusions It is desired that the contents of a learning management system support efficient learning. Interactivity of tools and their conceptual integration into face-to-face teaching are important for students. The learning management system was especially important for organizational purposes and the provision of learning materials. Teachers should be aware that free online sources such as Wikipedia enjoy high approval as sources of knowledge acquisition. This study provides an empirical basis for medical schools and teachers to improve their offerings in the field of digital learning for their students. PMID:27544782
Miller, Wayne L
2017-01-01
Volume overload and fluid congestion remain primary clinical challenges in the assessment and management of patients with chronic heart failure (HF). The pathophysiology of volume regulation is complex, and the simple concept of passive intravascular fluid accumulation is not adequate. The dynamics of interstitial and intravascular fluid compartment interactions, and fluid redistribution from venous splanchnic beds to the central pulmonary circulation, need to be taken into account in strategies of volume management. Clinical bedside evaluations and right heart hemodynamic assessments can signal changes in volume status, but only the quantitative measurement of total blood volume can help identify the heterogeneity in plasma volume and red blood cell mass that are features of volume overload in chronic HF. The quantitative assessment of intravascular volume is an effective tool to help guide individualized, appropriate therapy. Not all volume overload is the same, and the measurement of intravascular volume identifies heterogeneity to guide tailored therapy.
NASA Technical Reports Server (NTRS)
Perera, Jeevan S.
2011-01-01
Leadership is key to success. A phased approach to implementation of risk management is necessary. The risk management system will be simple and accessible, and will promote communication of information to all relevant stakeholders for optimal resource allocation and risk mitigation. Risk management should be used by all team members to manage risks, not just risk office personnel. Each group is assigned Risk Integrators, who are facilitators for effective risk management. Risks will be managed at the lowest level feasible; only those risks that require coordination or management from above are elevated. Risk reporting and communication is an essential element of risk management and will combine both qualitative and quantitative elements. Risk-informed decision making should be introduced to all levels of management. Necessary checks and balances are provided to ensure that risks are identified and dealt with in a timely manner. Many supporting tools, processes and training must be deployed for effective risk management implementation. Process improvement must be included in the risk processes.
Risk Management Issues - An Aerospace Perspective
NASA Technical Reports Server (NTRS)
Perera, Jeevan S.
2011-01-01
A phased approach to implementing risk management is necessary. The risk management system will be simple and accessible and will promote communication of information to all relevant stakeholders for optimal resource allocation and risk mitigation. Risk management should be used by all team members to manage risks, not just risk office personnel. Each group is assigned Risk Integrators, who are facilitators for effective risk management. Risks will be managed at the lowest level feasible; only those risks that require coordination or management from above will be elevated. Risk reporting and communication is an essential element of risk management and will combine both qualitative and quantitative elements. Risk-informed decision making should be introduced to all levels of management. Provide necessary checks and balances to ensure that risks are identified and dealt with in a timely manner. Many supporting tools, processes, and training must be deployed for effective risk management implementation. Process improvement must be included in the risk processes.
Innovative Stormwater Quality Tools by SARA for Holistic Watershed Master Planning
NASA Astrophysics Data System (ADS)
Thomas, S. M.; Su, Y. C.; Hummel, P. R.
2016-12-01
Stormwater management strategies such as Best Management Practices (BMP) and Low-Impact Development (LID) have increasingly gained attention in urban runoff control, becoming vital to holistic watershed master plans. These strategies can help address existing water quality impairments and support regulatory compliance, as well as guide planning and management of future development when substantial population growth and urbanization is projected to occur. However, past efforts have been limited to qualitative planning due to the lack of suitable tools to conduct quantitative assessment. The San Antonio River Authority (SARA), with the assistance of Lockwood, Andrews & Newnam, Inc. (LAN) and AQUA TERRA Consultants (a division of RESPEC), developed comprehensive hydrodynamic and water quality models using the Hydrological Simulation Program-FORTRAN (HSPF) for several urban watersheds in the San Antonio River Basin. These models enabled watershed managers to look at water quality issues on a more refined temporal and spatial scale than the limited monitoring data. They also provided a means to locate and quantify potential water quality impairments and evaluate the effects of mitigation measures. To support the models, a suite of software tools was developed, including: 1) the SARA Timeseries Utility Tool for managing and processing large model timeseries files, 2) the SARA Load Reduction Tool to determine load reductions needed to achieve screening levels for each modeled constituent on a sub-basin basis, and 3) the SARA Enhanced BMP Tool to determine the optimal combination of BMP types and units needed to achieve the required load reductions. Using these SARA models and tools, water quality agencies and stormwater professionals can determine the optimal combinations of BMP/LID to accomplish their goals and save substantial stormwater infrastructure and management costs.
The tools can also help regulators and permittees evaluate the feasibility of achieving compliance using BMP/LID. The project has gained national attention, being showcased in multiple newsletters, professional magazines, and conference presentations. The project also won the Texas American Council of Engineering Companies (ACEC) Gold Medal Award and the ACEC National Recognition Award in 2016.
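The abstract does not give the Load Reduction Tool's internal formula, but the stated purpose (load reductions needed to reach a screening level per sub-basin) can be sketched as a minimal, assumed calculation; the sub-basin names and load values below are invented for illustration:

```python
def load_reduction(modeled_load, screening_level):
    """Fraction by which a sub-basin's modeled constituent load must be
    reduced to meet its screening level (0.0 if already compliant)."""
    if modeled_load <= 0:
        raise ValueError("modeled load must be positive")
    return max(0.0, 1.0 - screening_level / modeled_load)

# Hypothetical sub-basins: (modeled load, screening level), e.g. kg/yr
subbasins = {"SB-1": (120.0, 90.0), "SB-2": (45.0, 60.0)}
reductions = {name: load_reduction(load, screen)
              for name, (load, screen) in subbasins.items()}
# SB-1 needs a 25% reduction; SB-2 is already below its screening level.
```

A BMP-optimization step (as in the Enhanced BMP Tool) would then search for the cheapest BMP combination whose removal efficiencies meet each required reduction.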
Gimbel, Sarah; Voss, Joachim; Mercer, Mary Anne; Zierler, Brenda; Gloyd, Stephen; Coutinho, Maria de Joana; Floriano, Florencia; Cuembelo, Maria de Fatima; Einberg, Jennifer; Sherr, Kenneth
2014-10-21
The objective of the prevention of Mother-to-Child Transmission (pMTCT) cascade analysis tool is to provide frontline health managers at the facility level with the means to rapidly, independently and quantitatively track patient flows through the pMTCT cascade, and readily identify priority areas for clinic-level improvement interventions. Over a period of six months, five experienced maternal-child health managers and researchers iteratively adapted and tested this systems analysis tool for pMTCT services. They prioritized components of the pMTCT cascade for inclusion, disseminated multiple versions to 27 health managers and piloted it in five facilities. Process mapping techniques were used to chart pMTCT cascade steps in these five facilities, to document antenatal care attendance, HIV testing and counseling, provision of prophylactic anti-retrovirals, safe delivery, safe infant feeding, infant follow-up including HIV testing, and family planning, in order to obtain site-specific knowledge of service delivery. Seven pMTCT cascade steps were included in the Excel-based final tool. Prevalence calculations were incorporated as sub-headings under relevant steps. Cells not requiring data inputs were locked, wording was simplified, and stepwise drop-off and maximization functions were included at key steps along the cascade. While the drop-off function allows health workers to rapidly assess how many patients were lost at each step, the maximization function details the additional people served if only one step improves to 100% capacity while the others stay constant. Our experience suggests that adaptation of a cascade analysis tool for facility-level pMTCT services is feasible and appropriate as a starting point for discussions of where to implement improvement strategies. The resulting tool facilitates the engagement of frontline health workers and managers who fill out, interpret, and apply the tool, and then follow up with quality improvement activities.
Research on adoption, interpretation, and sustainability of this pMTCT cascade analysis tool by frontline health managers is needed. ClinicalTrials.gov NCT02023658, December 9, 2013.
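The two spreadsheet functions described, stepwise drop-off and single-step maximization, can be sketched in a few lines (a pure-Python illustration of the logic, not the authors' Excel formulas; the patient counts are invented):

```python
def drop_offs(counts):
    """Patients lost between consecutive cascade steps."""
    return [counts[i] - counts[i + 1] for i in range(len(counts) - 1)]

def maximize_step(counts, step):
    """Additional patients reaching the end of the cascade if `step`
    (0-indexed) retains 100% of those arriving at it, while all
    downstream retention rates stay constant."""
    if step == 0 or counts[step - 1] == 0:
        return 0.0
    improved = counts[step - 1]  # everyone arriving at `step` is retained
    for i in range(step, len(counts) - 1):
        rate = counts[i + 1] / counts[i] if counts[i] else 0.0
        improved *= rate
    return improved - counts[-1]

# Hypothetical four-step cascade: 100 attend, 80 tested, 60 treated, 50 retained
cascade = [100, 80, 60, 50]
losses = drop_offs(cascade)            # 20, 20, 10 lost per transition
gain = maximize_step(cascade, 1)       # extra patients if testing hits 100%
```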
Free web-based modelling platform for managed aquifer recharge (MAR) applications
NASA Astrophysics Data System (ADS)
Stefan, Catalin; Junghanns, Ralf; Glaß, Jana; Sallwey, Jana; Fatkhutdinov, Aybulat; Fichtner, Thomas; Barquero, Felix; Moreno, Miguel; Bonilla, José; Kwoyiga, Lydia
2017-04-01
Managed aquifer recharge represents a valuable instrument for sustainable water resources management. The concept implies purposeful infiltration of surface water into the underground for later recovery or environmental benefits. Over decades, MAR schemes were successfully installed worldwide for a variety of reasons: to maximize the natural storage capacity of aquifers, physical aquifer management, water quality management, and ecological benefits. The INOWAS-DSS platform provides a collection of free web-based tools for planning, management and optimization of the main components of MAR schemes. The tools are grouped into 13 specific applications that cover the most relevant challenges encountered at MAR sites, from both quantitative and qualitative perspectives. The applications include, among others, the optimization of MAR site location, the assessment of saltwater intrusion, the restoration of groundwater levels in overexploited aquifers, the maximization of natural storage capacity of aquifers, the improvement of water quality, the design and operational optimization of MAR schemes, clogging development, and risk assessment. The platform contains a collection of about 35 web-based tools of various degrees of complexity, which are either included in application-specific workflows or used as standalone modelling instruments. Among them are simple tools derived from data mining and empirical equations, analytical groundwater-related equations, as well as complex numerical flow and transport models (MODFLOW, MT3DMS and SEAWAT). Up to now, the simulation core of the INOWAS-DSS, which is based on the finite differences groundwater flow model MODFLOW, is implemented and runs on the web. A scenario analyser helps to easily set up and evaluate new management options as well as future developments such as land-use and climate change, and compare them to previous scenarios. Additionally, simple tools such as analytical equations to assess saltwater intrusion are already running online.
Besides the simulation tools, a web-based database is under development where geospatial and time series data can be stored, managed, and processed. Furthermore, a web-based information system containing user guides for the various developed tools and applications, as well as basic information on MAR and related topics, is published and will be regularly expanded as new tools are implemented. The INOWAS-DSS, including its simulation tools, database and information system, provides an extensive framework to manage, plan and optimize MAR facilities. As the INOWAS-DSS is open-source software accessible via the internet using standard web browsers, it offers new ways for data sharing and collaboration among various partners and decision makers.
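The abstract does not name which analytical saltwater-intrusion equations are implemented; a classical candidate for such a simple tool is the Ghyben-Herzberg approximation, which relates the freshwater head above sea level to the depth of the fresh/saltwater interface below it (the sketch below is an assumption, not confirmed INOWAS functionality):

```python
def ghyben_herzberg_depth(head_m, rho_fresh=1000.0, rho_salt=1025.0):
    """Depth (m below sea level) of the fresh/saltwater interface for a
    freshwater head `head_m` (m above sea level), per Ghyben-Herzberg:
    z = rho_f / (rho_s - rho_f) * h  ~  40 * h for typical densities."""
    return rho_fresh / (rho_salt - rho_fresh) * head_m

# 1 m of freshwater head implies roughly 40 m of fresh water below sea level
interface_depth = ghyben_herzberg_depth(1.0)
```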
Bagstad, Kenneth J.; Semmens, Darius J.; Winthrop, Robert
2013-01-01
Although the number of ecosystem service modeling tools has grown in recent years, quantitative comparative studies of these tools have been lacking. In this study, we applied two leading open-source, spatially explicit ecosystem services modeling tools – Artificial Intelligence for Ecosystem Services (ARIES) and Integrated Valuation of Ecosystem Services and Tradeoffs (InVEST) – to the San Pedro River watershed in southeast Arizona, USA, and northern Sonora, Mexico. We modeled locally important services that both modeling systems could address – carbon, water, and scenic viewsheds. We then applied managerially relevant scenarios for urban growth and mesquite management to quantify ecosystem service changes. InVEST and ARIES use different modeling approaches and ecosystem services metrics; for carbon, metrics were more similar and results were more easily comparable than for viewsheds or water. However, findings demonstrate similar gains and losses of ecosystem services and conclusions when comparing effects across our scenarios. Results were more closely aligned for landscape-scale urban-growth scenarios and more divergent for a site-scale mesquite-management scenario. Follow-up studies, including testing in different geographic contexts, can improve our understanding of the strengths and weaknesses of these and other ecosystem services modeling tools as they move closer to readiness for supporting day-to-day resource management.
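A comparison of the kind described, percent change in each service between a baseline and a scenario, per model, can be sketched generically (the service names and values below are invented, not the San Pedro results):

```python
def service_changes(baseline, scenario):
    """Percent change in each ecosystem-service metric from a baseline
    run to a management/growth scenario run of the same model."""
    return {svc: 100.0 * (scenario[svc] - baseline[svc]) / baseline[svc]
            for svc in baseline}

# Hypothetical outputs from one model (arbitrary units)
base = {"carbon": 500.0, "water_yield": 80.0}
urban = {"carbon": 450.0, "water_yield": 88.0}
changes = service_changes(base, urban)
# Running the same scenarios through a second model and comparing the
# sign and rough magnitude of `changes` mirrors the ARIES-vs-InVEST test.
```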
Managing complex research datasets using electronic tools: A meta-analysis exemplar
Brown, Sharon A.; Martin, Ellen E.; Garcia, Theresa J.; Winter, Mary A.; García, Alexandra A.; Brown, Adama; Cuevas, Heather E.; Sumlin, Lisa L.
2013-01-01
Meta-analyses of broad scope and complexity require investigators to organize many study documents and manage communication among several research staff. Commercially available electronic tools, e.g., EndNote, Adobe Acrobat Pro, Blackboard, Excel, and IBM SPSS Statistics (SPSS), are useful for organizing and tracking the meta-analytic process, as well as enhancing communication among research team members. The purpose of this paper is to describe the electronic processes we designed, using commercially available software, for an extensive quantitative model-testing meta-analysis we are conducting. Specific electronic tools improved the efficiency of (a) locating and screening studies, (b) screening and organizing studies and other project documents, (c) extracting data from primary studies, (d) checking data accuracy and analyses, and (e) communication among team members. The major limitation in designing and implementing a fully electronic system for meta-analysis was the requisite upfront time to: decide on which electronic tools to use, determine how these tools would be employed, develop clear guidelines for their use, and train members of the research team. The electronic process described here has been useful in streamlining the process of conducting this complex meta-analysis and enhancing communication and sharing documents among research team members. PMID:23681256
Managing complex research datasets using electronic tools: a meta-analysis exemplar.
Brown, Sharon A; Martin, Ellen E; Garcia, Theresa J; Winter, Mary A; García, Alexandra A; Brown, Adama; Cuevas, Heather E; Sumlin, Lisa L
2013-06-01
Meta-analyses of broad scope and complexity require investigators to organize many study documents and manage communication among several research staff. Commercially available electronic tools, for example, EndNote, Adobe Acrobat Pro, Blackboard, Excel, and IBM SPSS Statistics (SPSS), are useful for organizing and tracking the meta-analytic process as well as enhancing communication among research team members. The purpose of this article is to describe the electronic processes designed, using commercially available software, for an extensive, quantitative model-testing meta-analysis. Specific electronic tools improved the efficiency of (a) locating and screening studies, (b) screening and organizing studies and other project documents, (c) extracting data from primary studies, (d) checking data accuracy and analyses, and (e) communication among team members. The major limitation in designing and implementing a fully electronic system for meta-analysis was the requisite upfront time to decide on which electronic tools to use, determine how these tools would be used, develop clear guidelines for their use, and train members of the research team. The electronic process described here has been useful in streamlining the process of conducting this complex meta-analysis and enhancing communication and sharing documents among research team members.
Quantitative Assessment of Arrhythmia Using Non-linear Approach: A Non-invasive Prognostic Tool
NASA Astrophysics Data System (ADS)
Chakraborty, Monisha; Ghosh, Dipak
2017-12-01
An accurate prognostic tool to identify the severity of Arrhythmia has yet to be established, owing to the complexity of the ECG signal. In this paper, we show that quantitative assessment of Arrhythmia is possible using a non-linear technique based on "Hurst Rescaled Range Analysis". Although the concept of applying "non-linearity" to the study of various cardiac dysfunctions is not entirely new, the novel objective of this paper is to identify the severity of the disease, to monitor different medicines and their doses, and to assess the efficacy of different medicines. The approach presented in this work is simple, which in turn will help doctors in efficient disease management. In this work, Arrhythmia ECG time series are collected from the MIT-BIH database. Normal ECG time series are acquired using the POLYPARA system. Both time series are analyzed in the light of the non-linear approach following the method of "Rescaled Range Analysis". The quantitative parameter, "Fractal Dimension" (D), is obtained from both types of time series. The major finding is that Arrhythmia ECG yields lower values of D compared to normal ECG. Further, this information can be used to assess the severity of Arrhythmia quantitatively, which is a new direction of prognosis, and suitable software may be developed for use in medical practice.
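The rescaled range (R/S) method the paper relies on is standard: estimate the Hurst exponent H as the log-log slope of the mean R/S statistic versus window size, then take D = 2 - H for a one-dimensional series. This minimal sketch follows the textbook algorithm, not the authors' implementation:

```python
import math
import random

def rescaled_range(series):
    """R/S statistic of one window: range of cumulative mean-deviations
    divided by the window's standard deviation."""
    n = len(series)
    mean = sum(series) / n
    cum, lo, hi = 0.0, 0.0, 0.0
    for x in series:
        cum += x - mean
        lo, hi = min(lo, cum), max(lo, cum)
    sd = math.sqrt(sum((x - mean) ** 2 for x in series) / n)
    return (hi - lo) / sd if sd else 0.0

def hurst_exponent(series, min_win=8):
    """Least-squares slope of log(mean R/S) vs log(window size) over
    dyadic window sizes; ~0.5 for white noise, ->1 for persistent series."""
    n, pts, win = len(series), [], min_win
    while win <= n:
        rs = [rescaled_range(series[i:i + win])
              for i in range(0, n - win + 1, win)]
        rs = [r for r in rs if r > 0]
        if rs:
            pts.append((math.log(win), math.log(sum(rs) / len(rs))))
        win *= 2
    m = len(pts)
    sx = sum(x for x, _ in pts); sy = sum(y for _, y in pts)
    sxx = sum(x * x for x, _ in pts); sxy = sum(x * y for x, y in pts)
    return (m * sxy - sx * sy) / (m * sxx - sx * sx)

# Fractal dimension of the trace: D = 2 - H; the paper's finding of lower
# D for arrhythmic ECG corresponds to H closer to 1 (stronger persistence).
```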
Medical Waste Management in Community Health Centers.
Tabrizi, Jafar Sadegh; Rezapour, Ramin; Saadati, Mohammad; Seifi, Samira; Amini, Behnam; Varmazyar, Farahnaz
2018-02-01
Non-standard management of medical waste leads to irreparable side effects. This issue is doubly important in community health centers, which constitute the most extensive system for providing Primary Health Care (PHC) across Iranian cities. This study investigated observance of medical waste management standards in Tabriz community health care centers, northwestern Iran. In this triangulated cross-sectional study (qualitative-quantitative), the data collection tool was a valid checklist of the waste management process developed based on Iranian medical waste management standards. The data were collected in 2015 through process observation and interviews with the health centers' staff. The average rate of waste management standards observance in Tabriz community health centers was 29.8%. This rate was 22.8% in the dimension of management and training, 27.3% in separation and collection, 31.2% in transport and temporary storage, and 42.9% in sterilization and disposal. Lack of proper separation of wastes, an inappropriate waste collection and disposal cycle, and disregard of safety measures (monitoring of sterilizer device performance, microbial cultures, and so on) were among the defects observed in the health care centers and supported by quantitative data. Medical waste management was not in a desirable situation in Tabriz community health centers. The expansion of community health centers across regions combined with non-observance of standards could increase the risks arising from medical waste. It is therefore necessary to adopt appropriate policies to improve the waste management situation.
Integrated modeling approach for optimal management of water, energy and food security nexus
NASA Astrophysics Data System (ADS)
Zhang, Xiaodong; Vesselinov, Velimir V.
2017-03-01
Water, energy and food (WEF) are inextricably interrelated. Effective planning and management of limited WEF resources to meet current and future socioeconomic demands for sustainable development is challenging. WEF production/delivery may also produce environmental impacts; as a result, greenhouse-gas emission control will impact WEF nexus management as well. Nexus management for WEF security necessitates integrated tools for predictive analysis that are capable of identifying the tradeoffs among various sectors and generating cost-effective planning and management strategies and policies. To address these needs, we have developed an integrated model analysis framework and tool called WEFO. WEFO provides a multi-period socioeconomic model for predicting how to satisfy WEF demands based on model inputs representing production costs, socioeconomic demands, and environmental controls. WEFO is applied to quantitatively analyze the interrelationships and trade-offs among system components including energy supply, electricity generation, water supply-demand, and food production, as well as mitigation of environmental impacts. WEFO is demonstrated on a hypothetical nexus management problem consistent with real-world management scenarios. Model parameters are analyzed using global sensitivity analysis and their effects on total system cost are quantified. The obtained results demonstrate how these types of analyses can help decision-makers and stakeholders make cost-effective decisions for optimal WEF management.
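WEFO itself is a multi-period optimization model; as a deliberately tiny, single-period illustration of the cost/water/emissions trade-off it quantifies, the sketch below brute-forces a generation mix under water and CO2 caps (all technology names and coefficients are invented):

```python
from itertools import product

# Hypothetical options: name -> (cost $/MWh, water m3/MWh, t CO2/MWh)
tech = {"gas": (50, 1.0, 0.4), "solar": (70, 0.1, 0.0), "coal": (40, 2.0, 0.9)}

def cheapest_mix(demand_mwh, water_cap, co2_cap, step=10):
    """Enumerate generation mixes in `step`-MWh increments that meet
    demand exactly and respect both caps; return (cost, mix) or None."""
    names = list(tech)
    best = None
    levels = range(0, demand_mwh + step, step)
    for alloc in product(levels, repeat=len(names)):
        if sum(alloc) != demand_mwh:
            continue
        cost = water = co2 = 0.0
        for q, name in zip(alloc, names):
            c, w, e = tech[name]
            cost += q * c; water += q * w; co2 += q * e
        if water <= water_cap and co2 <= co2_cap:
            if best is None or cost < best[0]:
                best = (cost, dict(zip(names, alloc)))
    return best
```

A real nexus model would replace the enumeration with linear programming over multiple periods and sectors, but the trade-off structure (cheap coal limited by its water and emission intensities) is the same.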
Risk Management for the International Space Station
NASA Technical Reports Server (NTRS)
Sebastian, J.; Brezovic, Philip
2002-01-01
The International Space Station (ISS) is an extremely complex system, both technically and programmatically. The Space Station must support a wide range of payloads and missions. It must be launched in numerous launch packages and be safely assembled and operated in the harsh environment of space. It is being designed and manufactured by many organizations, including the prime contractor, Boeing, the NASA institutions, and international partners and their contractors. Finally, the ISS has multiple customers (e.g., the Administration, Congress, users, the public, international partners, etc.) with contrasting needs and constraints. The ISS Risk Management Office strategy is to proactively and systematically manage risks to help ensure ISS Program success. The ISS program follows an integrated risk management process (both quantitative and qualitative) that is integrated into ISS project management. The process and tools are simple and seamless, permeate to the lowest levels (where effective management can be realized), and follow the continuous risk management methodology. The risk process continually assesses what could go wrong (risks), determines which risks need to be managed, implements strategies to deal with those risks, and measures the effectiveness of the implemented strategies. The process integrates all facets of risk, including cost, schedule, and technical aspects. Supporting risk analysis tools such as PRA are used to support programmatic decisions and assist in analyzing risks.
Process analytical technology in the pharmaceutical industry: a toolkit for continuous improvement.
Scott, Bradley; Wilcock, Anne
2006-01-01
Process analytical technology (PAT) refers to a series of tools used to ensure that quality is built into products while at the same time improving the understanding of processes, increasing efficiency, and decreasing costs. It has not been widely adopted by the pharmaceutical industry. As the setting for this paper, the current pharmaceutical manufacturing paradigm and PAT guidance to date are discussed prior to the review of PAT principles and tools, benefits, and challenges. The PAT toolkit contains process analyzers, multivariate analysis tools, process control tools, and continuous improvement/knowledge management/information technology systems. The integration and implementation of these tools is complex, and has resulted in uncertainty with respect to both regulation and validation. The paucity of staff knowledgeable in this area may complicate adoption. Studies to quantitate the benefits resulting from the adoption of PAT within the pharmaceutical industry would be a valuable addition to the qualitative studies that are currently available.
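Among the process control tools in the PAT toolkit, a Shewhart control chart is a canonical example of turning process analyzer readings into an accept/flag decision. The sketch below is generic (not from the paper), using 3-sigma limits estimated from an in-control baseline run:

```python
from statistics import mean, stdev

def shewhart_limits(baseline):
    """3-sigma control limits (lower, upper) estimated from an
    in-control baseline run of a process measurement."""
    m, s = mean(baseline), stdev(baseline)
    return m - 3 * s, m + 3 * s

def out_of_control(samples, limits):
    """Indices of samples breaching the control limits; each breach
    would trigger investigation or real-time process adjustment."""
    lo, hi = limits
    return [i for i, x in enumerate(samples) if not lo <= x <= hi]

# Hypothetical blend-uniformity readings from an in-line analyzer
limits = shewhart_limits([10.0, 10.1, 9.9, 10.05, 9.95])
flags = out_of_control([10.0, 12.0, 9.9], limits)  # second reading flagged
```

Multivariate PAT monitoring generalizes this idea (e.g., Hotelling's T2 on PCA scores), but the univariate chart shows the control loop in miniature.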
Fast quantitative optical detection of heat dissipation by surface plasmon polaritons.
Möller, Thomas B; Ganser, Andreas; Kratt, Martina; Dickreuter, Simon; Waitz, Reimar; Scheer, Elke; Boneberg, Johannes; Leiderer, Paul
2018-06-13
Heat management at the nanoscale is an issue of increasing importance. In optoelectronic devices the transport and decay of plasmons contribute to the dissipation of heat. By comparison of experimental data and simulations we demonstrate that it is possible to gain quantitative information about excitation, propagation and decay of surface plasmon polaritons (SPPs) in a thin gold stripe supported by a silicon membrane. The temperature-dependent optical transmissivity of the membrane is used to determine the temperature distribution around the metal stripe with high spatial and temporal resolution. This method is complementary to techniques where the propagation of SPPs is monitored optically, and provides additional information which is not readily accessible by other means. In particular, we demonstrate that the thermal conductivity of the membrane can also be derived from our analysis. The results presented here show the high potential of this tool for heat management studies in nanoscale devices.
Universal versus tailored solutions for alleviating disruptive behavior in hospitals.
Berman-Kishony, Talia; Shvarts, Shifra
2015-01-01
Disruptive behavior among hospital staff can negatively affect quality of care. Motivated by a standard on disruptive behavior issued by The Joint Commission (LD 3.10), as well as the desire to improve patient care, minimize liability, and improve staff retention, hospitals are setting policies to prevent and resolve disruptive behaviors. However, it is unknown whether uniform conflict management tools are equally effective among different hospital settings. We surveyed residents and nurses to identify similarities and differences among hospital departments in the antecedents, characteristics, and outcomes of disruptive behaviors, and in the effectiveness of conflict management tools. We used a quantitative questionnaire-based assessment to examine conflict perceptions in eight different hospital departments at Rambam Medical Center in Haifa, Israel. Most participants (89%) reported witnessing disruptive behavior either directly or in other parties; the most significant causes were identified as intense work, miscommunication, and problematic personalities. The forms of these behaviors, however, varied significantly between departments, with some more prone to expressed conflicts, while others were characterized by hidden disruptive behaviors. These outcomes were correlated with the antecedents of disruptive behavior, which in turn affected the effectiveness of alleviating strategies and tools. Some tools, such as processes for evaluating complaints, teamwork and conflict management courses, and introducing a behavioral mission statement, are effective across many antecedents. Other tools, however, are antecedent-specific, falling into two principal categories: tools directly removing a specific problem and tools that offer a way to circumvent the problem. Conflict resolution tools and strategies, based on resident and nurse perceptions, may be more effective if tailored to the specific situation, rather than using a "one-size-fits-all" approach.
Mataragas, M; Zwietering, M H; Skandamis, P N; Drosinos, E H
2010-07-31
The presence of Listeria monocytogenes in a sliced cooked, cured ham-like meat product was quantitatively assessed. Sliced cooked, cured meat products are considered high-risk products. These ready-to-eat (RTE) products (no special preparation, e.g. thermal treatment, is required before eating) support growth of pathogens (high initial pH=6.2-6.4 and water activity=0.98-0.99) and have a relatively long chilled storage period, with a shelf life of 60 days based on the manufacturer's instructions. Therefore, in case of post-process contamination, even with a low number of cells, the microorganism is able to reach unacceptable levels at the time of consumption. The aim of this study was to conduct a Quantitative Microbiological Risk Assessment (QMRA) of the risk posed by the presence of L. monocytogenes in RTE meat products. This may help risk managers make decisions and apply control measures with the ultimate objective of assuring food safety. Examples are given to illustrate the development of practical risk management strategies based on the results obtained from the QMRA model specifically developed for this pathogen/food product combination. Copyright 2010 Elsevier B.V. All rights reserved.
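QMRA exposure models for this scenario typically propagate a low post-process contamination level through a lag-then-exponential growth model over the shelf life. The sketch below uses that common structure with invented parameters (the study's actual model and values are not given in the abstract):

```python
def log10_count(n0_log, mu_max, t_lag, t):
    """log10 CFU/g after t days of chilled storage: no growth during the
    lag phase, then linear increase in log10 units at rate `mu_max`
    (log10/day), capped at a stationary maximum density."""
    N_MAX = 9.0  # illustrative maximum population density, log10 CFU/g
    if t <= t_lag:
        return n0_log
    return min(N_MAX, n0_log + mu_max * (t - t_lag))

# Hypothetical contamination of 1 CFU/g (log10 = 0) growing at
# 0.1 log10/day after a 5-day lag, evaluated at the 60-day shelf life:
level_at_consumption = log10_count(0.0, 0.1, 5.0, 60.0)
# Comparing this level against a regulatory limit (e.g. 2 log10 CFU/g)
# is the kind of check a risk manager would base control measures on.
```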
Using Risk Assessment Methodologies to Meet Management Objectives
NASA Technical Reports Server (NTRS)
DeMott, D. L.
2015-01-01
Current decision making involves numerous possible combinations of technology elements, safety and health issues, operational aspects, and process considerations to satisfy program goals. Identifying potential risk considerations as part of the management decision-making process provides additional tools to make more informed management decisions. Adapting and using risk assessment methodologies can generate new perspectives on various risk and safety concerns that are not immediately apparent. Safety and operational risks can be identified, and final decisions can balance these considerations with cost and schedule risks. Additional assessments can also show the likelihood of event occurrence and event consequence to provide a more informed basis for decision making, as well as cost-effective mitigation strategies. Methodologies available to perform risk assessments range from qualitative identification of risk potential to detailed assessments where quantitative probabilities are calculated. The methodology used should be based on factors that include: 1) type of industry and industry standards, 2) tasks, tools, and environment, 3) type and availability of data, and 4) industry views and requirements regarding risk and reliability. Risk assessments are a tool for decision makers to understand potential consequences and be in a position to reduce, mitigate, or eliminate costly mistakes or catastrophic failures.
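The qualitative end of the spectrum described here is commonly implemented as a likelihood-consequence matrix. The sketch below shows one plausible 5x5 scheme; the category names, score bands, and elevation rules are illustrative, not a specific NASA standard:

```python
LIKELIHOOD = {"rare": 1, "unlikely": 2, "possible": 3, "likely": 4, "frequent": 5}
CONSEQUENCE = {"negligible": 1, "marginal": 2, "serious": 3,
               "critical": 4, "catastrophic": 5}

def risk_score(likelihood, consequence):
    """Qualitative 5x5 matrix score with a band that could drive
    elevation decisions (thresholds here are illustrative)."""
    score = LIKELIHOOD[likelihood] * CONSEQUENCE[consequence]
    if score >= 15:
        band = "high"      # elevate to program-level management
    elif score >= 6:
        band = "medium"    # manage at project level, track mitigation
    else:
        band = "low"       # accept and monitor
    return score, band
```

A quantitative assessment (e.g., PRA) would replace the ordinal scales with estimated probabilities and modeled consequences, but the decision structure is the same.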
Erwin, Kim; Martin, Molly A; Flippin, Tara; Norell, Sarah; Shadlyn, Ariana; Yang, Jie; Falco, Paula; Rivera, Jaime; Ignoffo, Stacy; Kumar, Rajesh; Margellos-Anast, Helen; McDermott, Michael; McMahon, Kate; Mosnaim, Giselle; Nyenhuis, Sharmilee M; Press, Valerie G; Ramsay, Jessica E; Soyemi, Kenneth; Thompson, Trevonne M; Krishnan, Jerry A
2016-01-01
Aim: To present the methods and outcomes of stakeholder engagement in the development of interventions for children presenting to the emergency department (ED) for uncontrolled asthma. Methods: We engaged stakeholders (caregivers, physicians, nurses, administrators) from six EDs in a three-phase process to: define design requirements; prototype and refine; and evaluate. Results: Interviews among 28 stakeholders yielded themes regarding in-home asthma management practices and ED discharge experiences. Quantitative and qualitative evaluation showed strong preference for the new discharge tool over current tools. Conclusion: Engaging end-users in contextual inquiry resulted in CAPE (CHICAGO Action Plan after ED discharge), a new stakeholder-balanced discharge tool, which is being tested in a multicenter comparative effectiveness trial. PMID:26690579
On-line analysis capabilities developed to support the AFW wind-tunnel tests
NASA Technical Reports Server (NTRS)
Wieseman, Carol D.; Hoadley, Sherwood T.; Mcgraw, Sandra M.
1992-01-01
A variety of on-line analysis tools were developed to support two active flexible wing (AFW) wind-tunnel tests. These tools were developed to verify control law execution, to satisfy analysis requirements of the control law designers, to provide measures of system stability in a real-time environment, and to provide project managers with a quantitative measure of controller performance. Descriptions and purposes of the developed capabilities are presented along with examples. Procedures for saving and transferring data for near real-time analysis, and descriptions of the corresponding data interface programs are also presented. The on-line analysis tools worked well before, during, and after the wind tunnel test and proved to be a vital and important part of the entire test effort.
NASA Astrophysics Data System (ADS)
Osenga, E. C.; Cundiff, J.; Arnott, J. C.; Katzenberger, J.; Taylor, J. R.; Jack-Scott, E.
2015-12-01
An interactive tool called the Forest Health Index (FHI) has been developed for the Roaring Fork watershed of Colorado, with the purpose of improving public understanding of local forest management and ecosystem dynamics. The watershed contains large areas of White River National Forest, which plays a significant role in the local economy, particularly for recreation and tourism. Local interest in healthy forests is therefore strong, but public understanding of forest ecosystems is often simplified. This can pose challenges for land managers and researchers seeking a scientifically informed approach to forest restoration, management, and planning. Now in its second iteration, the FHI is a tool designed to help bridge that gap. The FHI uses a suite of indicators to create a numeric rating of forest functionality and change, based on the desired forest state in relation to four categories: Ecological Integrity, Public Health and Safety, Ecosystem Services, and Sustainable Use and Management. The rating is based on data derived from several sources including local weather stations, stream gauge data, SNOTEL sites, and National Forest Service archives. In addition to offering local outreach and education, this project offers broader insight into effective communication methods, as well as into the challenges of using quantitative analysis to rate ecosystem health. Goals of the FHI include its use in schools as a means of using local data and place-based learning to teach basic math and science concepts, improved public understanding of ecological complexity and need for ongoing forest management, and, in the future, its use as a model for outreach tools in other forested communities in the Intermountain West.
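The abstract does not give the FHI's actual scoring formula; one plausible normalize-and-average scheme for turning indicator data into a 0-100 category rating is sketched below (the ranges, weights, and example indicators are invented for illustration):

```python
def indicator_score(value, worst, desired):
    """Normalize one indicator to 0-100 against its worst-to-desired
    range, clamped so out-of-range data cannot exceed the bounds."""
    if desired == worst:
        raise ValueError("degenerate indicator range")
    frac = (value - worst) / (desired - worst)
    return 100.0 * min(1.0, max(0.0, frac))

def category_score(indicators):
    """Unweighted mean of indicator scores for one FHI category,
    e.g. Ecological Integrity."""
    scores = [indicator_score(v, worst, desired)
              for v, worst, desired in indicators]
    return sum(scores) / len(scores)

# Hypothetical inputs: (observed value, worst case, desired state)
eco = [(0.8, 0.0, 1.0),    # e.g. snowpack as a fraction of normal
       (30.0, 0.0, 40.0)]  # e.g. a streamflow percentile
rating = category_score(eco)
```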
Kilgore, Matthew D
The cardiology service line director at a health maintenance organization (HMO) in Washington State required a valid, reliable, and practical means for measuring workloads and other productivity factors for six heart failure (HF) registered nurse case managers located across three geographical regions. The Kilgore Heart Failure Case Management (KHFCM) Acuity Tool was systematically designed, developed, and validated to measure workload as a dependent function of the number of heart failure case management (HFCM) services rendered and the duration of time spent on various care duties. Research and development occurred at various HMO-affiliated internal medicine and cardiology offices throughout Western Washington. The concepts, methods, and principles used to develop the KHFCM Acuity Tool are applicable for any type of health care professional aiming to quantify workload using a high-quality objective tool. The content matter, scaling, and language on the KHFCM Acuity Tool are specific to HFCM settings. The content matter and numeric scales for the KHFCM Acuity Tool were developed and validated using a mixed-method participant action research method applied to a group of six outpatient HF case managers and their respective caseloads. The participant action research method was selected because the application of this method requires research participants to become directly involved in the diagnosis of research problems, the planning and execution of actions taken to address those problems, and the implementation of progressive strategies throughout the course of the study, as necessary, to produce the most credible and practical practice improvements. Heart failure case managers served clients with New York Heart Association Functional Class III-IV HF, and encounters were conducted primarily by telephone or in-office consultation.
A mix of qualitative and quantitative results demonstrated a variety of quality improvement outcomes achieved by the design and practice application of the KHFCM Acuity Tool. Quality improvement outcomes included a more valid reflection of encounter times and demonstration of the KHFCM Acuity Tool as a reliable, practical, credible, and satisfying tool for reflecting HF case manager workloads and HF disease severity. The KHFCM Acuity Tool defines workload simply as a function of the number of HFCM services performed and the duration of time spent on a client encounter. The design of the tool facilitates the measurement of workload, service utilization, and HF disease characteristics, independently of the overall measure of acuity, so that differences in individual case manager practice, as well as client characteristics within sites, across sites, and potentially throughout annual seasons, can be demonstrated. Data produced from long-term applications of the KHFCM Acuity Tool, across all regions, could serve as a driver for establishing systemwide HFCM productivity benchmarks or standards of practice for HF case managers. Data produced from localized applications could serve as a reference for coordinating staffing resources or developing HFCM productivity benchmarks within individual regions or sites.
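The workload definition described above, a function of the number of services rendered and the time spent per encounter, can be sketched as a small data model. The function names, tuple layout, and flat aggregation below are hypothetical illustrations, not taken from the KHFCM Acuity Tool itself.

```python
# Hypothetical sketch of an acuity-style workload measure: workload per
# encounter is driven by the count of services rendered and the minutes
# spent on each. Service names here are invented examples.

def encounter_workload(services):
    """services: list of (service_name, minutes) tuples for one encounter."""
    n_services = len(services)
    total_minutes = sum(minutes for _, minutes in services)
    return {"services": n_services, "minutes": total_minutes}

def caseload_workload(encounters):
    """Aggregate workload across all of a case manager's encounters."""
    totals = {"services": 0, "minutes": 0}
    for enc in encounters:
        w = encounter_workload(enc)
        totals["services"] += w["services"]
        totals["minutes"] += w["minutes"]
    return totals
```

Keeping service counts and durations separate, rather than collapsing them into a single score, mirrors the abstract's point that workload, utilization, and disease characteristics can be examined independently.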
Harnessing Scientific Literature Reports for Pharmacovigilance
Ripple, Anna; Tonning, Joseph; Munoz, Monica; Hasan, Rashedul; Ly, Thomas; Francis, Henry; Bodenreider, Olivier
2017-01-01
Objectives: We seek to develop a prototype software analytical tool to augment FDA regulatory reviewers’ capacity to harness scientific literature reports in PubMed/MEDLINE for pharmacovigilance and adverse drug event (ADE) safety signal detection. We also aim to gather feedback through usability testing to assess design, performance, and user satisfaction with the tool. Methods: A prototype, open source, web-based, software analytical tool generated statistical disproportionality data mining signal scores and dynamic visual analytics for ADE safety signal detection and management. We leveraged Medical Subject Heading (MeSH) indexing terms assigned to published citations in PubMed/MEDLINE to generate candidate drug-adverse event pairs for quantitative data mining. Six FDA regulatory reviewers participated in usability testing by employing the tool as part of their ongoing real-life pharmacovigilance activities to provide subjective feedback on its practical impact, added value, and fitness for use. Results: All usability test participants cited the tool’s ease of learning, ease of use, and generation of quantitative ADE safety signals, some of which corresponded to known established adverse drug reactions. Potential concerns included the comparability of the tool’s automated literature search relative to a manual ‘all fields’ PubMed search, missing drugs and adverse event terms, interpretation of signal scores, and integration with existing computer-based analytical tools. Conclusions: Usability testing demonstrated that this novel tool can automate the detection of ADE safety signals from published literature reports. Various mitigation strategies are described to foster improvements in design, productivity, and end user satisfaction. PMID:28326432
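The abstract does not specify which disproportionality statistic the prototype computes over its drug-adverse-event pairs. One standard choice in pharmacovigilance data mining is the proportional reporting ratio (PRR), sketched below from 2x2 contingency counts; treat this as an illustrative stand-in, not necessarily the score the tool uses.

```python
import math

# Proportional reporting ratio (PRR), a standard disproportionality
# score for drug-event pair mining. Counts form a 2x2 table:
#   a = reports of target drug with target event
#   b = reports of target drug with other events
#   c = reports of other drugs with target event
#   d = reports of other drugs with other events

def prr(a, b, c, d):
    """PRR = P(event | target drug) / P(event | other drugs)."""
    return (a / (a + b)) / (c / (c + d))

def prr_log_se(a, b, c, d):
    """Approximate standard error of ln(PRR), for signal thresholds."""
    return math.sqrt(1/a - 1/(a + b) + 1/c - 1/(c + d))
```

For example, 20 target-pair citations out of 100 for the drug, against 100 out of 9900 for all other drugs, gives a PRR near 19.8, well above the conventional signal thresholds.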
Rotorcraft Conceptual Design Environment
NASA Technical Reports Server (NTRS)
Johnson, Wayne; Sinsay, Jeffrey
2009-01-01
Requirements for a rotorcraft conceptual design environment are discussed, from the perspective of a government laboratory. Rotorcraft design work in a government laboratory must support research, by producing technology impact assessments and defining the context for research and development; and must support the acquisition process, including capability assessments and quantitative evaluation of designs, concepts, and alternatives. An information manager that will enable increased fidelity of analysis early in the design effort is described. This manager will be a framework to organize information that describes the aircraft, and enable movement of that information to and from analyses. Finally, a recently developed rotorcraft system analysis tool is described.
Knowledge management for efficient quantitative analyses during regulatory reviews.
Krudys, Kevin; Li, Fang; Florian, Jeffry; Tornoe, Christoffer; Chen, Ying; Bhattaram, Atul; Jadhav, Pravin; Neal, Lauren; Wang, Yaning; Gobburu, Joga; Lee, Peter I D
2011-11-01
Knowledge management comprises the strategies and methods employed to generate and leverage knowledge within an organization. This report outlines the activities within the Division of Pharmacometrics at the US FDA to effectively manage knowledge with the ultimate goal of improving drug development and advancing public health. The infrastructure required for pharmacometric knowledge management includes provisions for data standards, queryable databases, libraries of modeling tools, archiving of analysis results and reporting templates for effective communication. Two examples of knowledge management systems developed within the Division of Pharmacometrics are used to illustrate these principles. The benefits of sound knowledge management include increased productivity, allowing reviewers to focus on research questions spanning new drug applications, such as improved trial design and biomarker development. The future of knowledge management depends on the collaboration between the FDA and industry to implement data and model standards to enhance sharing and dissemination of knowledge.
Gewirtz, Henry; Dilsizian, Vasken
2016-05-31
In the >40 years since planar myocardial imaging with potassium-43 ((43)K) was introduced into clinical research and management of patients with coronary artery disease (CAD), diagnosis and treatment have undergone profound scientific and technological changes. One such innovation is the current state-of-the-art hardware and software for positron emission tomography myocardial perfusion imaging, which have advanced it from a strictly research-oriented modality to a clinically valuable tool. This review traces the evolving role of quantitative positron emission tomography measurements of myocardial blood flow in the evaluation and management of patients with CAD. It presents methodology, currently or soon to be available, that offers a paradigm shift in CAD management. Heretofore, radionuclide myocardial perfusion imaging has been primarily qualitative or at best semiquantitative in nature, assessing regional perfusion in relative terms. Thus, unlike so many facets of modern cardiovascular practice and CAD management, which depend, for example, on absolute values of key parameters such as arterial and left ventricular pressures, serum lipoprotein, and other biomarker levels, the absolute levels of rest and maximal myocardial blood flow have yet to be incorporated into routine clinical practice even in most positron emission tomography centers where the potential to do so exists. Accordingly, this review focuses on potential value added for improving clinical CAD practice by measuring the absolute level of rest and maximal myocardial blood flow. Physiological principles and imaging fundamentals necessary to understand how positron emission tomography makes robust, quantitative measurements of myocardial blood flow possible are highlighted. © 2016 American Heart Association, Inc.
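The absolute-flow quantities the review advocates reduce to a simple derived measure: myocardial flow reserve (MFR), the ratio of maximal (stress) to rest myocardial blood flow in mL/min/g. The sketch below computes it; the flag threshold is an assumed illustration, not a clinical cutoff.

```python
# Minimal sketch of absolute-flow quantities: rest and maximal (stress)
# myocardial blood flow (MBF, mL/min/g) and their ratio, the myocardial
# flow reserve (MFR). The threshold below is illustrative only.

def myocardial_flow_reserve(rest_mbf, stress_mbf):
    """MFR = maximal / rest myocardial blood flow."""
    return stress_mbf / rest_mbf

def flag_reduced_reserve(rest_mbf, stress_mbf, threshold=2.0):
    """Flag a reserve below an assumed threshold for further workup."""
    return myocardial_flow_reserve(rest_mbf, stress_mbf) < threshold
```

The point of the review is precisely that the absolute inputs to this ratio, not just relative regional comparisons, carry clinical information.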
NASA Technical Reports Server (NTRS)
Stohlgren, Tom; Schnase, John; Morisette, Jeffrey; Most, Neal; Sheffner, Ed; Hutchinson, Charles; Drake, Sam; Van Leeuwen, Willem; Kaupp, Verne
2005-01-01
The National Institute of Invasive Species Science (NIISS), through collaboration with NASA's Goddard Space Flight Center (GSFC), recently began incorporating NASA observations and predictive modeling tools to fulfill its mission. These enhancements, labeled collectively as the Invasive Species Forecasting System (ISFS), are now in place in the NIISS in their initial state (V1.0). The ISFS is the primary decision support tool of the NIISS for the management and control of invasive species on Department of Interior and adjacent lands. The ISFS is the backbone for a unique information services line-of-business for the NIISS, and it provides the means for delivering advanced decision support capabilities to a wide range of management applications. This report describes the operational characteristics of the ISFS, a decision support tool of the United States Geological Survey (USGS). Recent enhancements to the performance of the ISFS, attained through the integration of observations, models, and systems engineering from NASA, are benchmarked; i.e., described quantitatively and evaluated in relation to the performance of the USGS system before incorporation of the NASA enhancements. This report benchmarks Version 1.0 of the ISFS.
System Modeling and Diagnostics for Liquefying-Fuel Hybrid Rockets
NASA Technical Reports Server (NTRS)
Poll, Scott; Iverson, David; Ou, Jeremy; Sanderfer, Dwight; Patterson-Hine, Ann
2003-01-01
A Hybrid Combustion Facility (HCF) was recently built at NASA Ames Research Center to study the combustion properties of a new fuel formulation that burns approximately three times faster than conventional hybrid fuels. Researchers at Ames working in the area of Integrated Vehicle Health Management recognized a good opportunity to apply IVHM techniques to a candidate technology for next generation launch systems. Five tools were selected to examine various IVHM techniques for the HCF. Three of the tools, TEAMS (Testability Engineering and Maintenance System), L2 (Livingstone2), and RODON, are model-based reasoning (or diagnostic) systems. Two other tools in this study, ICS (Interval Constraint Simulator) and IMS (Inductive Monitoring System), do not attempt to isolate the cause of the failure but may be used for fault detection. Models of varying scope and completeness were created, both qualitative and quantitative. In each of the models, the structure and behavior of the physical system are captured. In the qualitative models, the temporal aspects of the system behavior and the abstraction of sensor data are handled outside of the model and require the development of additional code. The quantitative model also requires additional processing code, though it is less extensive. Examples of fault diagnoses are given.
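Of the tools named above, IMS detects faults by learning from nominal data rather than diagnosing causes. A deliberately simplified sketch of that idea follows, assuming a per-sensor min/max envelope learned from archived nominal samples; the real IMS clusters multidimensional data, so this single-envelope version is a stand-in, not the actual algorithm.

```python
# Toy envelope-based fault detection in the spirit of learning-from-
# nominal-data monitors such as IMS: learn per-sensor nominal ranges,
# then flag any reading outside the learned envelope.

def learn_envelope(nominal_samples):
    """nominal_samples: list of equal-length sensor vectors."""
    n = len(nominal_samples[0])
    lo = [min(s[i] for s in nominal_samples) for i in range(n)]
    hi = [max(s[i] for s in nominal_samples) for i in range(n)]
    return lo, hi

def detect_fault(envelope, sample, margin=0.0):
    """True if any sensor value leaves the nominal envelope."""
    lo, hi = envelope
    return any(x < l - margin or x > h + margin
               for x, l, h in zip(sample, lo, hi))
```

As the abstract notes for ICS and IMS, such a detector says only that something is off-nominal; isolating which component failed is left to the model-based reasoners.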
Space Shuttle Software Development and Certification
NASA Technical Reports Server (NTRS)
Orr, James K.; Henderson, Johnnie A
2000-01-01
Man-rated software, "software which is in control of systems and environments upon which human life is critically dependent," must be highly reliable. The Space Shuttle Primary Avionics Software System is an excellent example of such a software system. Lessons learned from more than 20 years of effort have identified basic elements that must be present to achieve this high degree of reliability. The elements include rigorous application of appropriate software development processes, use of trusted tools to support those processes, quantitative process management, and defect elimination and prevention. This presentation highlights methods used within the Space Shuttle project and raises questions that must be addressed to provide similar success in a cost-effective manner on future long-term projects where key application development tools are COTS rather than internally developed custom application development tools.
Integrated Modeling Approach for Optimal Management of Water, Energy and Food Security Nexus
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zhang, Xiaodong; Vesselinov, Velimir Valentinov
Water, energy and food (WEF) are inextricably interrelated. Effective planning and management of limited WEF resources to meet current and future socioeconomic demands for sustainable development is challenging. WEF production/delivery may also produce environmental impacts; as a result, greenhouse-gas emission control will impact WEF nexus management as well. Nexus management for WEF security necessitates integrated tools for predictive analysis that are capable of identifying the tradeoffs among various sectors, generating cost-effective planning and management strategies and policies. To address these needs, we have developed an integrated model analysis framework and tool called WEFO. WEFO provides a multi-period socioeconomic model for predicting how to satisfy WEF demands based on model inputs representing production costs, socioeconomic demands, and environmental controls. WEFO is applied to quantitatively analyze the interrelationships and trade-offs among system components including energy supply, electricity generation, water supply-demand, food production as well as mitigation of environmental impacts. WEFO is demonstrated to solve a hypothetical nexus management problem consistent with real-world management scenarios. Model parameters are analyzed using global sensitivity analysis and their effects on total system cost are quantified. Lastly, the obtained results demonstrate how these types of analyses can be helpful for decision-makers and stakeholders to make cost-effective decisions for optimal WEF management.
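WEFO itself is a multi-period socioeconomic optimization; as a toy stand-in for the kind of trade-off it resolves, the sketch below chooses a least-cost electricity mix that meets demand subject to water and greenhouse-gas caps, by brute-force search. The technologies, coefficients, and search method are all invented for illustration and bear no relation to WEFO's actual formulation.

```python
from itertools import product

# Hypothetical miniature of a nexus trade-off: pick production levels
# for two electricity technologies to meet demand at minimum cost while
# honoring a greenhouse-gas cap and a water budget. Numbers are invented.

TECHS = {
    # name: (cost per unit, water per unit, emissions per unit)
    "gas":   (30.0, 1.0, 0.5),
    "solar": (50.0, 0.1, 0.0),
}

def best_mix(demand, water_cap, ghg_cap, step=1):
    """Exhaustive search over integer production levels."""
    best = None
    levels = range(0, demand + 1, step)
    for gas, solar in product(levels, levels):
        if gas + solar < demand:
            continue  # demand not met
        cost = gas * TECHS["gas"][0] + solar * TECHS["solar"][0]
        water = gas * TECHS["gas"][1] + solar * TECHS["solar"][1]
        ghg = gas * TECHS["gas"][2] + solar * TECHS["solar"][2]
        if water > water_cap or ghg > ghg_cap:
            continue  # constraint violated
        if best is None or cost < best[0]:
            best = (cost, gas, solar)
    return best
```

Tightening the emissions cap shifts the optimum toward the costlier low-emission technology, which is the qualitative behavior a nexus tool exposes to decision-makers.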
Enhancing the Characterization of Epistemic Uncertainties in PM2.5 Risk Analyses.
Smith, Anne E; Gans, Will
2015-03-01
The Environmental Benefits Mapping and Analysis Program (BenMAP) is a software tool developed by the U.S. Environmental Protection Agency (EPA) that is widely used inside and outside of EPA to produce quantitative estimates of public health risks from fine particulate matter (PM2.5). This article discusses the purpose and appropriate role of a risk analysis tool to support risk management deliberations, and evaluates the functions of BenMAP in this context. It highlights the importance in quantitative risk analyses of characterization of epistemic uncertainty, or outright lack of knowledge, about the true risk relationships being quantified. This article describes and quantitatively illustrates sensitivities of PM2.5 risk estimates to several key forms of epistemic uncertainty that pervade those calculations: the risk coefficient, shape of the risk function, and the relative toxicity of individual PM2.5 constituents. It also summarizes findings from a review of U.S.-based epidemiological evidence regarding the PM2.5 risk coefficient for mortality from long-term exposure. That review shows that the set of risk coefficients embedded in BenMAP substantially understates the range in the literature. We conclude that BenMAP would more usefully fulfill its role as a risk analysis support tool if its functions were extended to better enable and prompt its users to characterize the epistemic uncertainties in their risk calculations. This requires expanded automatic sensitivity analysis functions and more recognition of the full range of uncertainty in risk coefficients. © 2014 Society for Risk Analysis.
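BenMAP-style PM2.5 analyses typically rest on a log-linear health impact function; the sketch below pairs that form with a one-way sensitivity sweep over the risk coefficient beta, the epistemic uncertainty the article emphasizes. The input values are illustrative, not drawn from any epidemiological study.

```python
import math

# Log-linear health impact function of the kind used in BenMAP-style
# PM2.5 risk estimation, plus a one-way sensitivity sweep over the
# risk coefficient. All inputs below are illustrative.

def excess_deaths(baseline_deaths, beta, delta_pm, population_scale=1.0):
    """delta_y = y0 * (1 - exp(-beta * delta_PM)) * scale."""
    return baseline_deaths * (1.0 - math.exp(-beta * delta_pm)) * population_scale

def beta_sensitivity(baseline_deaths, betas, delta_pm):
    """Range of risk estimates implied by alternative risk coefficients."""
    estimates = [excess_deaths(baseline_deaths, b, delta_pm) for b in betas]
    return min(estimates), max(estimates)
```

Sweeping beta from zero (no effect) through published values makes the article's point concrete: the estimate range is driven as much by the chosen coefficient as by the air-quality change itself.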
Food Web Designer: a flexible tool to visualize interaction networks.
Sint, Daniela; Traugott, Michael
Species are embedded in complex networks of ecological interactions and assessing these networks provides a powerful approach to understand what the consequences of these interactions are for ecosystem functioning and services. This is mandatory to develop and evaluate strategies for the management and control of pests. Graphical representations of networks can help recognize patterns that might be overlooked otherwise. However, there is a lack of software which allows visualizing these complex interaction networks. Food Web Designer is a stand-alone, highly flexible and user friendly software tool to quantitatively visualize trophic and other types of bipartite and tripartite interaction networks. It is offered free of charge for use on Microsoft Windows platforms. Food Web Designer is easy to use without the need to learn a specific syntax due to its graphical user interface. Up to three (trophic) levels can be connected using links cascading from or pointing towards the taxa within each level to illustrate top-down and bottom-up connections. Link width/strength and abundance of taxa can be quantified, allowing generating fully quantitative networks. Network datasets can be imported, saved for later adjustment and the interaction webs can be exported as pictures for graphical display in different file formats. We show how Food Web Designer can be used to draw predator-prey and host-parasitoid food webs, demonstrating that this software is a simple and straightforward tool to graphically display interaction networks for assessing pest control or any other type of interaction in both managed and natural ecosystems from an ecological network perspective.
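The kind of quantitative tripartite web the tool draws can be represented minimally as taxa with abundances on each level plus weighted links between adjacent levels. The taxa, abundances, and interaction strengths below are invented examples, and no assumption is made about Food Web Designer's own file format.

```python
# Minimal data structure for a quantitative tripartite interaction web:
# three levels with taxon abundances, and weighted links between
# adjacent levels. All values are invented.

web = {
    "levels": {
        "plants":     {"wheat": 50, "clover": 30},
        "herbivores": {"aphid": 40},
        "predators":  {"ladybird": 8, "spider": 5},
    },
    "links": [
        # (lower taxon, upper taxon, interaction strength)
        ("wheat", "aphid", 25),
        ("clover", "aphid", 10),
        ("aphid", "ladybird", 18),
        ("aphid", "spider", 6),
    ],
}

def link_strength_total(web, taxon):
    """Summed strength of all links touching a taxon."""
    return sum(w for lo, hi, w in web["links"] if taxon in (lo, hi))
```

Varying link widths in proportion to these strengths is what turns such a table into the fully quantitative top-down/bottom-up diagram the abstract describes.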
Miller, Brian W.; Morisette, Jeffrey T.
2014-01-01
Developing resource management strategies in the face of climate change is complicated by the considerable uncertainty associated with projections of climate and its impacts and by the complex interactions between social and ecological variables. The broad, interconnected nature of this challenge has resulted in calls for analytical frameworks that integrate research tools and can support natural resource management decision making in the face of uncertainty and complex interactions. We respond to this call by first reviewing three methods that have proven useful for climate change research, but whose application and development have been largely isolated: species distribution modeling, scenario planning, and simulation modeling. Species distribution models provide data-driven estimates of the future distributions of species of interest, but they face several limitations and their output alone is not sufficient to guide complex decisions for how best to manage resources given social and economic considerations along with dynamic and uncertain future conditions. Researchers and managers are increasingly exploring potential futures of social-ecological systems through scenario planning, but this process often lacks quantitative response modeling and validation procedures. Simulation models are well placed to provide added rigor to scenario planning because of their ability to reproduce complex system dynamics, but the scenarios and management options explored in simulations are often not developed by stakeholders, and there is not a clear consensus on how to include climate model outputs. We see these strengths and weaknesses as complementarities and offer an analytical framework for integrating these three tools. We then describe the ways in which this framework can help shift climate change research from useful to usable.
Measuring workplace social support for workers with disability.
Lysaght, Rosemary; Fabrigar, Leandre; Larmour-Trode, Sherrey; Stewart, Jeremy; Friesen, Margaret
2012-09-01
Social support in the workplace has been demonstrated to serve as a contributor to a worker's ability to manage work demands and stress. Research in the area of disability management indicates that interpersonal factors play an important role in the success of return-to-work interventions. The role of workplace support has received limited attention in rehabilitation, despite the salience of support to the disability management process. Prior to this study, there existed no validated quantitative measure of social support for workers who re-enter the workplace following injury or disability. A support measure prototype, the Support for Workers with Disability Scale, was tested with 152 workers in accommodated work situations. Four validation tools were used to assess criterion validity. Factor analysis was used to validate the content structure and reduce the total number of response items. Additional analysis was conducted to determine the ability of the measure to discriminate between groups, and to provide insight into how social support operates in workplaces. Based on analysis, a reduced measure consisting of 41 items and measuring supervisor, co-worker, and non-work supports was created. Secondary analysis disclosed information concerning the nature of supports in the workplace. Higher levels of support were identified for workers with fewer work role limitations and for those with one versus multiple injury claims. This tool provides a validated outcome measure for research examining the social aspects of workplace disability. It can also serve as a quality management tool for human resource professionals engaged in continuous improvement of disability management programs.
Kourgialas, Nektarios N; Karatzas, George P; Dokou, Zoi; Kokorogiannis, Andreas
2018-02-15
In many Mediterranean islands with limited surface water resources, the growth of agricultural and touristic sectors, which are the main water consumers, highly depends on the sustainable water resources management. This work highlights the crucial role of groundwater footprint (GF) as a tool for the sustainable management of water resources, especially in water scarce islands. The groundwater footprint represents the water budget between inflows and outflows in an aquifer system and is used as an index of the effect of groundwater use in natural resources and environmental flows. The case study presented in this paper is the island of Crete, which consists of 11 main aquifer systems. The data used for estimating the groundwater footprint in each system were groundwater recharges, abstractions through 412 wells, environmental flows (discharges) from 76 springs and 19 streams present in the area of study. The proposed methodology takes into consideration not only the water quantity but also the water quality of the aquifer systems and can be used as an integrated decision making tool for the sustainable management of groundwater resources. This methodology can be applied in any groundwater system. The results serve as a tool for assessing the potential of sustainable use and the optimal distribution of water needs under the current and future climatic conditions, considering both quantitative and qualitative factors. Adaptation measures and water policies that will effectively promote sustainable development are also proposed for the management of the aquifer systems that exhibit a large groundwater footprint. Copyright © 2017 Elsevier B.V. All rights reserved.
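The groundwater footprint index can be sketched with the commonly used definition GF = A * C / (R - E): aquifer area A scaled by abstraction C relative to recharge R net of the environmental flow requirement E, with GF exceeding A signalling unsustainable use. The paper's exact formulation and the Crete data are not reproduced here; the values below are invented for illustration.

```python
# Groundwater footprint sketch following the common definition
# GF = A * C / (R - E). A GF larger than the aquifer area indicates
# abstraction beyond sustainable limits. Inputs are invented.

def groundwater_footprint(area_km2, abstraction, recharge, env_flow):
    """All fluxes in the same volumetric units (e.g. hm3/yr)."""
    if recharge <= env_flow:
        raise ValueError("no recharge remains after environmental flows")
    return area_km2 * abstraction / (recharge - env_flow)

def is_sustainable(area_km2, abstraction, recharge, env_flow):
    """Sustainable when the footprint does not exceed the aquifer area."""
    return groundwater_footprint(area_km2, abstraction, recharge, env_flow) <= area_km2
```

Computing this per aquifer system, as the study does for Crete's 11 systems, immediately ranks which systems need adaptation measures.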
Using multi-species occupancy models in structured decision making on managed lands
Sauer, John R.; Blank, Peter J.; Zipkin, Elise F.; Fallon, Jane E.; Fallon, Frederick W.
2013-01-01
Land managers must balance the needs of a variety of species when manipulating habitats. Structured decision making provides a systematic means of defining choices and choosing among alternative management options; implementation of a structured decision requires quantitative approaches to predicting consequences of management on the relevant species. Multi-species occupancy models provide a convenient framework for making structured decisions when the management objective is focused on a collection of species. These models use replicate survey data that are often collected on managed lands. Occupancy can be modeled for each species as a function of habitat and other environmental features, and Bayesian methods allow for estimation and prediction of collective responses of groups of species to alternative scenarios of habitat management. We provide an example of this approach using data from breeding bird surveys conducted in 2008 at the Patuxent Research Refuge in Laurel, Maryland, evaluating the effects of eliminating meadow and wetland habitats on scrub-successional and woodland-breeding bird species using summed total occupancy of species as an objective function. Removal of meadows and wetlands decreased value of an objective function based on scrub-successional species by 23.3% (95% CI: 20.3–26.5), but caused only a 2% (0.5, 3.5) increase in value of an objective function based on woodland species, documenting differential effects of elimination of meadows and wetlands on these groups of breeding birds. This approach provides a useful quantitative tool for managers interested in structured decision making.
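The summed-occupancy objective function used above can be sketched directly: sum predicted occupancy probabilities over a species group under each management scenario and report the percent change relative to baseline. The species names and probabilities below are invented, not the Patuxent estimates.

```python
# Sketch of a summed-occupancy objective function for structured
# decision making: compare scenarios by the percent change in total
# predicted occupancy across a species group. Values are invented.

def summed_occupancy(occ_probs):
    """occ_probs: dict of species -> predicted occupancy probability."""
    return sum(occ_probs.values())

def percent_change(baseline, alternative):
    """Percent change in the objective under an alternative scenario."""
    base = summed_occupancy(baseline)
    alt = summed_occupancy(alternative)
    return 100.0 * (alt - base) / base

baseline = {"field_sparrow": 0.8, "yellowthroat": 0.6, "towhee": 0.6}
no_meadows = {"field_sparrow": 0.5, "yellowthroat": 0.4, "towhee": 0.6}
```

In the study itself, the occupancy probabilities feeding this sum come from Bayesian multi-species models fitted to replicate survey data, with uncertainty carried through to the objective.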
Chittawatanarat, K; Tosanguan, K; Chaikledkaew, U; Tejavanija, S; Teerawattananon, Y
2016-08-01
The objective of this study was to identify the differences in pattern, process, and management of nutrition care in government hospitals in Thailand (an Asian upper-middle income developing country). This is a combination of a quantitative nationwide questionnaire survey and focus group discussions. A total of 2300 questionnaires were sent to government hospitals across Thailand. The responders were divided by routine-nutrition screening/assessment unit vs. non-routine-nutrition screening/assessment unit (RSA vs. NRSA). The comparison between the groups was reported as percentages and cross-sectional odds ratios (CS-OR) with 95% confidence intervals (CI). Statistical significance was defined as p < 0.05. A total of 814 questionnaires (35.4%) were returned. The three most common tools of RSA were 42% Bhumibol Nutrition Triage (BNT), 21.2% Subjective Global Assessment (SGA) and 20.2% Nutrition Alert Form (NAF). RSA units reported significantly higher proportions for the role of the nurses (RSA vs. NRSA; CS-OR [95% CI]: 68.3% vs. 11.9%; 15.8 [11.1 to 22.7]; p < 0.01), the multidisciplinary team (90.1% vs. 0.4%; 2266 [558 to 1909]; p < 0.01), the nutrition management guidelines (60.6% vs. 2.8%; 53.6 [29.6 to 102.8]; p < 0.01), the nurse-driven enteral feeding protocols (31.7% vs. 17.5%; 2.2 [1.5 to 3.1]; p < 0.01) and preference for hospital formula enteral nutrition (91.4% vs. 69.7%; 4.6 [2.9 to 7.4]; p < 0.01). For focus group discussions, the main barriers to RSA implementation were the absence of a national recommendation for a screening/assessment tool, inconsistency of policy and reimbursement, and maintaining an acceptable professional workload. Nutrition screening/assessment tools were found to vary across Thailand. RSA affected the nutrition management working process and the types of nutrition support. The main barriers to RSA implementation were inconsistency of policy and reimbursement, maintaining an acceptable workload, and the lack of national guidance regarding screening/assessment tools.
Copyright © 2016 European Society for Clinical Nutrition and Metabolism. Published by Elsevier Ltd. All rights reserved.
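Cross-sectional odds ratios with 95% confidence intervals of the kind reported above can be recomputed from 2x2 counts with the standard log-odds formula. The counts in the example are illustrative, not the survey's actual data.

```python
import math

# Odds ratio with a Wald-type 95% CI from a 2x2 table:
#   a, b = outcome yes/no in group 1 (e.g. RSA units)
#   c, d = outcome yes/no in group 2 (e.g. NRSA units)
# Example counts are invented.

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Return (odds ratio, CI lower bound, CI upper bound)."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1/a + 1/b + 1/c + 1/d)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi
```

Reading a result such as "15.8 [11.1 to 22.7]" then simply means the point estimate and the bounds this kind of calculation produces.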
Ollerenshaw, Alison; Wong Shee, Anna; Yates, Mark
2018-04-01
To explore the awareness and usage of an online dementia pathways tool (including decision tree and region-specific dementia services) for primary health practitioners (GPs and nurses) in regional Victoria. Quantitative pilot study using surveys and Google Analytics. A large regional area (48 000 square kilometres, population 220 000) in Victoria. Two hundred and sixty-three GPs and 160 practice nurses were invited to participate, with 42 respondents (GPs, n = 21; practice nurses, n = 21). Primary care practitioners' awareness and usage of the dementia pathways tool. Survey respondents who had used the tool (n = 14) reported accessing information about diagnosis, management and referral. Practitioners reported improvements in knowledge, skills and confidence about core dementia topics. There were 9683 page views between August 2013 and February 2015 (monthly average: 509 page views). The average time spent on page was 2.03 min, with many visitors (68%) spending more than 4 min at the site. This research demonstrates that the tool has been well received by practitioners and has been consistently used since its launch. Health practitioners valued the content and the availability of local resources. Primary health practitioners reported that the dementia pathways tool provided access to region-specific referral and management resources for all stages of dementia. Such tools have broad transferability in other health areas with further research needed to determine their contribution to learning in the practice setting and over time. © 2017 National Rural Health Alliance Inc.
Rodriguez, Olga; Schaefer, Michele L.; Wester, Brock; Lee, Yi-Chien; Boggs, Nathan; Conner, Howard A.; Merkle, Andrew C.; Fricke, Stanley T.; Albanese, Chris; Koliatsos, Vassilis E.
2016-04-01
Traumatic brain injury (TBI) caused by explosive munitions, known as blast TBI, is the signature injury in recent military conflicts in Iraq and Afghanistan. Diagnostic evaluation of TBI, including blast TBI, is based on clinical history, symptoms, and neuropsychological testing, all of which can result in misdiagnosis or underdiagnosis of this condition, particularly in the case of TBI of mild-to-moderate severity. Prognosis is currently determined by TBI severity, recurrence, and type of pathology, and also may be influenced by promptness of clinical intervention when more effective treatments become available. An important task is prevention of repetitive TBI, particularly when the patient is still symptomatic. For these reasons, the establishment of quantitative biological markers can serve to improve diagnosis and preventative or therapeutic management. In this study, we used a shock-tube model of blast TBI to determine whether manganese-enhanced magnetic resonance imaging (MEMRI) can serve as a tool to accurately and quantitatively diagnose mild-to-moderate blast TBI. Mice were subjected to a 30 psig blast and administered a single dose of MnCl2 intraperitoneally. Longitudinal T1-magnetic resonance imaging (MRI) performed at 6, 24, 48, and 72 h and at 14 and 28 days revealed a marked signal enhancement in the brain of mice exposed to blast, compared with sham controls, at nearly all time-points. Interestingly, when mice were protected with a polycarbonate body shield during blast exposure, the marked increase in contrast was prevented. We conclude that manganese uptake can serve as a quantitative biomarker for TBI and that MEMRI is a minimally-invasive quantitative approach that can aid in the accurate diagnosis and management of blast TBI. In addition, the prevention of the increased uptake of manganese by body protection strongly suggests that the exposure of an individual to blast risk could benefit from the design of improved body armor.
PMID:26414591
Ramani, Subhash; Thakur, Meenkashi
2014-01-01
Gestational trophoblastic disease is a condition of uncertain etiology comprising hydatidiform mole (complete and partial), invasive mole, choriocarcinoma, and placental site trophoblastic tumor. It arises from abnormal proliferation of trophoblastic tissue. Early diagnosis of gestational trophoblastic disease and its potential complications is important for timely and successful management of the condition with preservation of fertility. Initial diagnosis is based on a multimodality approach encompassing clinical features, serial quantitative β-hCG titers, and pelvic ultrasonography. Pelvic magnetic resonance imaging (MRI) is sometimes used as a problem-solving tool to assess the depth of myometrial invasion and extrauterine disease spread in equivocal and complicated cases. Chest radiography, body computed tomography (CT), and brain MRI have been recommended as investigative tools for overall disease staging. Angiography has a role in the management of disease complications and metastases. The efficacy of positron emission tomography (PET) and PET/CT in the evaluation of recurrent or metastatic disease has not yet been adequately investigated. This paper discusses the imaging features of gestational trophoblastic disease on various imaging modalities and the role of different imaging techniques in the diagnosis and management of this entity. PMID:25126425
NASA Astrophysics Data System (ADS)
Aldowaisan, Tariq; Allahverdi, Ali
2016-07-01
This paper describes the process employed by the Industrial and Management Systems Engineering programme at Kuwait University to continuously improve the programme. Using a continuous improvement framework, the paper demonstrates how various qualitative and quantitative analysis methods, such as hypothesis testing and control charts, have been applied to the results of four assessment tools and other data sources to improve performance. Important improvements include the need to reconsider two student outcomes, as they were difficult to implement in courses. In addition, through benchmarking and the engagement of alumni and employers, key decisions were made to improve the curriculum and enhance employability.
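Control charts, one of the quantitative methods the abstract mentions, flag assessment results that fall outside statistically expected bounds. A minimal Shewhart-style sketch; the scores and the 3-sigma convention below are illustrative assumptions, not the programme's actual data or criteria:

```python
import statistics

def control_limits(samples, k=3):
    """Center line and k-sigma Shewhart-style control limits."""
    center = statistics.fmean(samples)
    sd = statistics.stdev(samples)          # sample standard deviation
    return center - k * sd, center, center + k * sd

# Hypothetical course-assessment scores over eight terms:
scores = [3.1, 3.4, 3.0, 3.3, 3.2, 3.5, 2.9, 3.2]
lcl, cl, ucl = control_limits(scores)
out_of_control = [s for s in scores if not lcl <= s <= ucl]
print(f"CL = {cl:.2f}, limits = [{lcl:.2f}, {ucl:.2f}]")  # → CL = 3.20, limits = [2.60, 3.80]
```

A point falling outside the limits would signal a term whose assessment results warrant investigation rather than routine variation.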
Alammari, M R; Smith, P W; de Josselin de Jong, E; Higham, S M
2013-02-01
This study reports the development and assessment of a novel method using quantitative light-induced fluorescence (QLF) to determine whether the QLF parameters ΔF and ΔQ are appropriate for aiding diagnosis and clinical decision making in early occlusal mineral loss, by comparing QLF analysis with actual restorative management. Following ethical approval, 46 subjects attending a dental teaching hospital were enrolled. White light digital (WL) and QLF images/analyses of 46 unrestored posterior teeth with suspected occlusal caries were made after a clinical decision had already been taken to explore fissures operatively. WL and QLF imaging/analysis were repeated after initial cavity preparation. The type of restorative treatment was determined by the supervising clinician independently of any imaging performed. Actual restorative management was recorded as fissure sealant/preventive resin restoration (F/P) or class I occlusal restoration (Rest.), reflecting the extent of intervention (the gold standard). All QLF images were analysed independently. The results showed statistically significant differences between the two treatment groups for ΔF (p = 0.002; mean 22.60 for F/P vs. 28.80 for Rest.) and ΔQ (p = 0.012; mean 230.49 for F/P vs. 348.30 for Rest.). ΔF and ΔQ values may be useful in aiding clinical diagnosis and decision making in relation to the management of early mineral loss and restorative intervention in occlusal caries. QLF has the potential to be a valuable tool for caries diagnosis in clinical practice. Copyright © 2012 Elsevier Ltd. All rights reserved.
An IHE-Like Approach Method for Quantitative Analysis of PACS Usage.
Calabrese, Raffaele; Beltrame, Marco; Accardo, Agostino
2016-12-01
Today, many hospitals have a running enterprise picture archiving and communication system (PACS), and their administrators should have tools to measure the system's activity and, in particular, how much it is used. This information would be valuable for decision-makers to address asset management and the development of policies for correct utilization, and eventually to start training initiatives to get the best resource utilization and operator satisfaction. On the economic side, a quantitative method to measure the usage of the workstations would be desirable to better redistribute existing resources and plan the purchase of new ones. The paper exploits, in an unconventional way, the potential of the IHE Audit Trail and Node Authentication (ATNA) profile: it repurposes the audit data generated to safeguard the security of patient data, using it to retrieve information about the workload of each PACS workstation. The method uses the traces recorded, according to the profile, for each access to image data to calculate how much each station is used. The results, constituted by measures of the frequency of PACS station usage, suitably classified and presented in a format convenient for decision-makers, are encouraging. In a time of spending reviews, the careful management of available resources is a top priority for a healthcare organization. Thanks to our work, a common medium such as the ATNA profile proves a very useful resource for purposes other than those for which it was born. This avoids additional investments in management tools and allows optimization of resources at no cost.
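The counting idea is straightforward once audit events are parsed: tally image-access events per workstation. The flat record layout below is a deliberately simplified stand-in; real ATNA audit messages are typically RFC 3881 / DICOM PS3.15 XML carried over syslog, and the event names here are invented:

```python
from collections import Counter

# Hypothetical, pre-parsed audit events (not the real ATNA schema):
audit_events = [
    {"event": "image-retrieve", "station": "WS-01"},
    {"event": "image-retrieve", "station": "WS-02"},
    {"event": "user-login",     "station": "WS-01"},
    {"event": "image-retrieve", "station": "WS-01"},
]

def station_workload(events):
    """Count image-access events per workstation, ignoring other event types."""
    return Counter(e["station"] for e in events if e["event"] == "image-retrieve")

print(station_workload(audit_events))  # e.g. Counter({'WS-01': 2, 'WS-02': 1})
```

Aggregating these counts by day or shift gives exactly the usage-frequency measures the paper presents to decision-makers.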
e-Learning for the elderly on drug utilization: A pilot study.
Throfast, Victoria; Hellström, Lina; Hovstadius, Bo; Petersson, Göran; Ericson, Lisa
2017-05-01
This study explores the attitudes of elderly people to the use of electronic educational technology (e-learning) on drug utilization, with particular emphasis on the layout, usability, content, and level of knowledge in the tool. e-Learning modules were evaluated by a group of elderly people (aged ⩾65 years, n = 16) via a questionnaire comprising closed and open-ended questions. Both qualitative and quantitative analyses of the responses showed mostly positive reviews. The results indicate that the e-learning modules are a suitable tool for distributing information and education and that they can be managed by elderly individuals who are familiar with computers, allowing them to learn more about medication use.
Yuan, Christina M; Prince, Lisa K; Zwettler, Amy J; Nee, Robert; Oliver, James D; Abbott, Kevin C
2014-11-01
Entrustable professional activities (EPAs) are complex tasks representing vital physician functions in multiple competencies, used to demonstrate trainee development along milestones. Managing a nephrology outpatient clinic has been proposed as an EPA for nephrology fellowship training. Retrospective cohort study of nephrology fellow outpatient clinic performance using a previously validated chart audit tool. Outpatient encounter chart audits for training years 2008-2009 through 2012-2013, corresponding to participation in the Nephrology In-Training Examination (ITE). A median of 7 auditors (attending nephrologists) audited a mean of 1,686±408 (SD) charts per year. 18 fellows were audited; 12, in both of their training years. Proportion of chart audit and quality indicator deficiencies. Longitudinal deficiency and ITE performance. Among fellows audited in both their training years, chart audit deficiencies were fewer in the second versus the first year (5.4%±2.0% vs 17.3%±7.0%; P<0.001) and declined between the first and second halves of the first year (22.2%±6.4% vs 12.3%±9.5%; P=0.002). Most deficiencies were omission errors, regardless of training year. Quality indicator deficiencies for hypertension and chronic kidney disease-associated anemia recognition and management were fewer during the second year (P<0.001). Yearly audit deficiencies ≥5% were associated with an ITE score less than the 25th percentile for second-year fellows (P=0.03), with no significant association for first-year fellows. Auditor-reported deficiencies declined between the first and second halves of the year (17.0% vs 11.1%; P<0.001), with a stable positive/neutral comment rate (17.3% vs 17.8%; P=0.6), suggesting that the decline was not due to auditor fatigue. Retrospective design and small trainee numbers. Managing a nephrology outpatient clinic is an EPA. The chart audit tool was used to assess longitudinal fellow performance in managing a nephrology outpatient clinic. 
Failure to progress may be quantitatively identified and remediated. The tool identifies deficiencies in all 6 competencies, not just medical knowledge, the primary focus of the ITE and the nephrology subspecialty board examination. Published by Elsevier Inc.
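The year-over-year drop in deficiency proportions reported above is the kind of comparison a two-proportion z-test formalizes. A sketch with hypothetical audit counts (not the study's raw data; the study itself reports means of per-auditor percentages):

```python
import math

def two_proportion_z(x1, n1, x2, n2):
    """z statistic for comparing two proportions with a pooled standard error."""
    p1, p2 = x1 / n1, x2 / n2
    p = (x1 + x2) / (n1 + n2)                    # pooled proportion
    se = math.sqrt(p * (1 - p) * (1/n1 + 1/n2))
    return (p1 - p2) / se

# Hypothetical deficiency counts out of ~1686 audited charts per year:
z = two_proportion_z(292, 1686, 91, 1686)        # ~17.3% vs ~5.4%
print(f"z = {z:.1f}")  # → z = 10.9
```

A |z| above 1.96 corresponds to p < 0.05 (two-sided), so counts of this size would comfortably support the reported p < 0.001.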
Davenport, Thomas H
2006-01-01
We all know the power of the killer app. It's not just a support tool; it's a strategic weapon. Companies questing for killer apps generally focus all their firepower on the one area that promises to create the greatest competitive advantage. But a new breed of organization has upped the stakes: Amazon, Harrah's, Capital One, and the Boston Red Sox have all dominated their fields by deploying industrial-strength analytics across a wide variety of activities. At a time when firms in many industries offer similar products and use comparable technologies, business processes are among the few remaining points of differentiation--and analytics competitors wring every last drop of value from those processes. Employees hired for their expertise with numbers or trained to recognize their importance are armed with the best evidence and the best quantitative tools. As a result, they make the best decisions. In companies that compete on analytics, senior executives make it clear--from the top down--that analytics is central to strategy. Such organizations launch multiple initiatives involving complex data and statistical analysis, and quantitative activity is managed at the enterprise (not departmental) level. In this article, Professor Thomas H. Davenport lays out the characteristics and practices of these statistical masters and describes some of the very substantial changes other companies must undergo in order to compete on quantitative turf. As one would expect, the transformation requires a significant investment in technology, the accumulation of massive stores of data, and the formulation of company-wide strategies for managing the data. But, at least as important, it also requires executives' vocal, unswerving commitment and willingness to change the way employees think, work, and are treated.
NASA Astrophysics Data System (ADS)
Spahr, K.; Hogue, T. S.
2016-12-01
Selecting the most appropriate green, gray, and/or hybrid system for stormwater treatment and conveyance can prove challenging to decision-makers across all scales, from site managers to large municipalities. To help streamline the selection process, a multi-disciplinary team of academics and professionals is developing an industry standard for selecting and evaluating the most appropriate stormwater management technology for different regions. To make the tool more robust and comprehensive, life-cycle cost assessment and optimization modules will be included to evaluate non-monetized and ecosystem benefits of selected technologies. Initial work includes surveying advisory board members based in cities that use existing decision support tools in their infrastructure planning process. These surveys will characterize the decisions currently being made and identify challenges within the current planning process across a range of hydroclimatic regions and city sizes. Analysis of social and other non-technical barriers to adoption of the existing tools is also being performed, with identification of regional differences and institutional challenges. Surveys will also gauge the regional appropriateness of certain stormwater technologies based on experiences in implementing stormwater treatment and conveyance plans. In addition to compiling qualitative data on existing decision support tools, a technical review of the components of the decision support tools used will be performed. Gaps in each tool's analysis, such as the lack of certain critical functionalities, will be identified, and ease of use will be evaluated. Conclusions drawn from both the qualitative and quantitative analyses will be used to inform the development of the new decision support tool and its eventual dissemination.
Khanassov, Vladimir; Vedel, Isabelle; Pluye, Pierre
2014-01-01
PURPOSE Results of case management designed for patients with dementia and their caregivers in community-based primary health care (CBPHC) were inconsistent. Our objective was to identify the relationships between key outcomes of case management and barriers to implementation. METHODS We conducted a systematic mixed studies review (including quantitative and qualitative studies). Literature search was performed in MEDLINE, PsycINFO, Embase, and Cochrane Library (1995 up to August 2012). Case management intervention studies were used to assess clinical outcomes for patients, service use, caregiver outcomes, satisfaction, and cost-effectiveness. Qualitative studies were used to examine barriers to case management implementation. Patterns in the relationships between barriers to implementation and outcomes were identified using the configurational comparative method. The quality of studies was assessed using the Mixed Methods Appraisal Tool. RESULTS Forty-three studies were selected (31 quantitative and 12 qualitative). Case management had a limited positive effect on behavioral symptoms of dementia and length of hospital stay for patients and on burden and depression for informal caregivers. Interventions that addressed a greater number of barriers to implementation resulted in increased number of positive outcomes. Results suggested that high-intensity case management was necessary and sufficient to produce positive clinical outcomes for patients and to optimize service use. Effective communication within the CBPHC team was necessary and sufficient for positive outcomes for caregivers. CONCLUSIONS Clinicians and managers who implement case management in CBPHC should take into account high-intensity case management (small caseload, regular proactive patient follow-up, regular contact between case managers and family physicians) and effective communication between case managers and other CBPHC professionals and services. PMID:25354410
Risk Acceptance Personality Paradigm: How We View What We Don't Know We Don't Know
NASA Technical Reports Server (NTRS)
Massie, Michael J.; Morris, A. Terry
2011-01-01
The purpose of integrated hazard analyses, probabilistic risk assessments, failure modes and effects analyses, fault trees and many other similar tools is to give managers of a program some idea of the risks associated with their program. All risk tools establish a set of undesired events and then try to evaluate the risk to the program by assessing the severity of the undesired event and the likelihood of that event occurring. Some tools provide qualitative results, some provide quantitative results and some do both. However, in the end the program manager and his/her team must decide which risks are acceptable and which are not. Even with a wide array of analysis tools available, risk acceptance is often a controversial and difficult decision making process. And yet, today's space exploration programs are moving toward more risk based design approaches. Thus, risk identification and good risk assessment is becoming even more vital to the engineering development process. This paper explores how known and unknown information influences risk-based decisions by looking at how the various parts of our personalities are affected by what they know and what they don't know. This paper then offers some criteria for consideration when making risk-based decisions.
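The severity-and-likelihood assessment described above is commonly operationalized as a risk matrix. A toy sketch; the 5-point scales and category thresholds are illustrative assumptions, not any program's official acceptance criteria:

```python
# A minimal 5x5 risk-matrix sketch: score an undesired event by the
# product of its severity and likelihood ratings, then bucket it.
def risk_level(severity, likelihood):
    """severity, likelihood: integers from 1 (lowest) to 5 (highest).
    Thresholds below are illustrative, not an official standard."""
    score = severity * likelihood
    if score >= 15:
        return "high"
    if score >= 6:
        return "medium"
    return "low"

print(risk_level(5, 4))  # → high
print(risk_level(2, 2))  # → low
```

The paper's point is precisely that such scores inform, but do not settle, the acceptance decision, which remains a human judgment shaped by what the team knows it doesn't know.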
NASA Astrophysics Data System (ADS)
Marti, Joan; Bartolini, Stefania; Becerril, Laura
2016-04-01
VeTOOLS is a project funded by the European Commission's Humanitarian Aid and Civil Protection department (ECHO), and aims at creating an integrated software platform specially designed to assess and manage volcanic risk. The project facilitates interaction and cooperation between scientists and Civil Protection agencies in order to share, unify, and exchange procedures, methodologies and technologies to effectively reduce the impacts of volcanic disasters. The project aims at 1) improving and developing volcanic risk assessment and management capacities in active volcanic regions; 2) developing universal methodologies, scenario definitions, response strategies and alert protocols to cope with the full range of volcanic threats; 3) improving quantitative methods and tools for vulnerability and risk assessment; and 4) defining thresholds and protocols for civil protection. With these objectives, the VeTOOLS project addresses two of the Sendai Framework resolutions: i) provide guidance on methodologies and standards for risk assessments, disaster risk modelling and the use of data; and ii) promote and support the availability and application of science and technology to decision-making. It thereby offers a good example of how close collaboration between science and civil protection can effectively contribute to DRR. European Commission ECHO Grant SI2.695524.
SedInConnect: a stand-alone, free and open source tool for the assessment of sediment connectivity
NASA Astrophysics Data System (ADS)
Crema, Stefano; Cavalli, Marco
2018-02-01
There is a growing call within the scientific community for solid theoretical frameworks and usable indices/models to assess sediment connectivity. Connectivity plays a significant role in characterizing structural properties of the landscape and, when considered in combination with forcing processes (e.g., rainfall-runoff modelling), can represent a valuable analysis for improved landscape management. In this work, the authors present the development and application of SedInConnect: a free, open source and stand-alone application for the computation of the Index of Connectivity (IC), as expressed in Cavalli et al. (2013), with the addition of specific innovative features. The tool is intended for a wide variety of users, both from the scientific community and from the authorities involved in environmental planning. Thanks to its open source nature, the tool can be adapted and/or integrated according to users' requirements. Furthermore, presenting an easy-to-use interface and being a stand-alone application, the tool can help management experts in the quantitative assessment of sediment connectivity in the context of hazard and risk assessment. An application to a sample dataset and an overview of up-to-date applications of the approach and of the tool show the development potential of such analyses. The modelled connectivity, in fact, appears suitable not only for characterizing sediment dynamics at the catchment scale but also for integration with prediction models and as an aid to geomorphological interpretation.
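For readers unfamiliar with the index, the IC of Cavalli et al. (2013), building on Borselli et al. (2008), is the log-ratio of an upslope and a downslope component. A compact sketch of that structure; all numbers below are illustrative, and a real computation runs cell-by-cell over a DEM with a roughness-based weighting factor:

```python
import math

def index_of_connectivity(w_mean, s_mean, area, downslope_path):
    """IC = log10(Dup / Ddn), following the Borselli/Cavalli formulation:
    Dup = mean weighting factor * mean slope * sqrt(upslope area);
    Ddn = sum over the downslope flow path of d_i / (W_i * S_i)."""
    d_up = w_mean * s_mean * math.sqrt(area)
    d_dn = sum(d / (w * s) for d, w, s in downslope_path)
    return math.log10(d_up / d_dn)

# Illustrative values only (weights 0-1, slopes m/m, lengths in m, area in m^2):
path = [(10.0, 0.8, 0.2), (15.0, 0.6, 0.1)]   # (d_i, W_i, S_i) per downslope cell
ic = index_of_connectivity(w_mean=0.7, s_mean=0.15, area=5000.0, downslope_path=path)
print(round(ic, 2))  # → -1.62
```

Higher (less negative) IC marks cells whose sediment is better connected to the chosen target, e.g. the channel network.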
Purdue ionomics information management system. An integrated functional genomics platform.
Baxter, Ivan; Ouzzani, Mourad; Orcun, Seza; Kennedy, Brad; Jandhyala, Shrinivas S; Salt, David E
2007-02-01
The advent of high-throughput phenotyping technologies has created a deluge of information that is difficult to deal with without the appropriate data management tools. These data management tools should integrate defined workflow controls for genomic-scale data acquisition and validation, data storage and retrieval, and data analysis, indexed around the genomic information of the organism of interest. To maximize the impact of these large datasets, it is critical that they are rapidly disseminated to the broader research community, allowing open access for data mining and discovery. We describe here a system that incorporates such functionalities developed around the Purdue University high-throughput ionomics phenotyping platform. The Purdue Ionomics Information Management System (PiiMS) provides integrated workflow control, data storage, and analysis to facilitate high-throughput data acquisition, along with integrated tools for data search, retrieval, and visualization for hypothesis development. PiiMS is deployed as a World Wide Web-enabled system, allowing for integration of distributed workflow processes and open access to raw data for analysis by numerous laboratories. PiiMS currently contains data on shoot concentrations of P, Ca, K, Mg, Cu, Fe, Zn, Mn, Co, Ni, B, Se, Mo, Na, As, and Cd in over 60,000 shoot tissue samples of Arabidopsis (Arabidopsis thaliana), including ethyl methanesulfonate, fast-neutron and defined T-DNA mutants, and natural accession and populations of recombinant inbred lines from over 800 separate experiments, representing over 1,000,000 fully quantitative elemental concentrations. PiiMS is accessible at www.purdue.edu/dp/ionomics.
NASA Astrophysics Data System (ADS)
Wollschläger, Ute; Helming, Katharina; Heinrich, Uwe; Bartke, Stephan; Kögel-Knabner, Ingrid; Russell, David; Eberhardt, Einar; Vogel, Hans-Jörg
2016-04-01
Fertile soils are central resources for the production of biomass and the provision of food and energy. A growing world population and the latest climate targets lead to an increasing demand for both food and bio-energy, which requires preserving and improving the long-term productivity of soils as a bio-economic resource. At the same time, other soil functions and ecosystem services need to be maintained. To render soil management sustainable, we need to establish a scientific knowledge base about complex soil system processes that allows for the development of model tools to quantitatively predict the impact of a multitude of management measures on soil functions. This, finally, will allow for the provision of site-specific options for sustainable soil management. To face this challenge, the German Federal Ministry of Education and Research recently launched the funding program "Soil as a Natural Resource for the Bio-Economy - BonaRes". In a joint effort, ten collaborative projects and the coordinating BonaRes Centre are engaged to close existing knowledge gaps for a profound and systemic understanding of soil functions and their sensitivity to soil management. This presentation provides an overview of the concept of the BonaRes Centre, which is responsible for i) setting up a comprehensive database for soil-related information, ii) developing model tools to estimate the impact of different management measures on soil functions, and iii) establishing a web-based portal providing decision support tools for sustainable soil management. A specific focus of the presentation is the so-called "knowledge portal", which provides the infrastructure for a community effort towards a comprehensive meta-analysis of soil functions as a basis for future model developments.
Sumner, John; Ross, Tom; Jenson, Ian; Pointon, Andrew
2005-11-25
A risk profile of microbial hazards across the supply continuum for the beef, sheep and goat meat industries was developed using both a qualitative tool and a semi-quantitative spreadsheet tool, Risk Ranger. The latter is useful for highlighting factors contributing to food safety risk and for ranking the risk of various product/pathogen combinations. In the present profile, the qualitative tool was used as a preliminary screen for a wide range of hazard-product pairings, while Risk Ranger was used to rank, in order of population health risk, pairings for which quantitative data were available and to assess the effect of hypothetical scenarios. 'High'-risk hazard-product pairings identified were meals contaminated with Clostridium perfringens provided by caterers which have not implemented HACCP; kebabs cross-contaminated by Salmonella present in drip trays or served undercooked; and meals served in the home cross-contaminated with Salmonella. 'Medium'-risk hazard-product pairings identified were ready-to-eat meats contaminated with Listeria monocytogenes with extended shelf life; uncooked comminuted fermented meat (UCFM)/salami contaminated with enterohaemorrhagic E. coli (EHEC) and Salmonella; undercooked hamburgers contaminated with EHEC; and kebabs contaminated by Salmonella under normal production or following final "flash" heating. Identified 'low'-risk hazard-product pairings included cooked, ready-to-eat sausages contaminated with Salmonella; UCFM/salami contaminated with L. monocytogenes; and well-cooked hamburgers contaminated with EHEC. The risk profile provides information of value to Australia's risk managers in the regulatory, processing and R&D sectors of the meat and meat processing industry for identifying food safety risks in the industry and for prioritising risk management actions.
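Risk Ranger itself combines answers to eleven questions in a spreadsheet; the sketch below is not its actual formula, just a toy illustration of the underlying idea of a semi-quantitative, log-scale ranking of hazard-product pairings (all inputs and pairing names are invented):

```python
import math

def log_risk_rank(prob_illness_per_serving, servings_per_year, severity_weight):
    """Toy semi-quantitative score: expected severity-weighted cases per
    year, compressed to a log10 scale. Not the actual Risk Ranger formula."""
    expected_cases = prob_illness_per_serving * servings_per_year * severity_weight
    return math.log10(max(expected_cases, 1e-12))  # guard against log(0)

pairings = {
    "undercooked kebab / Salmonella": log_risk_rank(1e-4, 5e6, 0.01),
    "cooked RTE sausage / Salmonella": log_risk_rank(1e-7, 1e6, 0.01),
}
ranked = sorted(pairings, key=pairings.get, reverse=True)
print(ranked[0])  # → undercooked kebab / Salmonella
```

The log scale is the key design choice: food-safety risks span many orders of magnitude, so a linear score would make most pairings indistinguishable from zero.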
Proposal of an environmental performance index to assess solid waste treatment technologies
DOE Office of Scientific and Technical Information (OSTI.GOV)
Goulart Coelho, Hosmanny Mauro, E-mail: hosmanny@hotmail.com; Lange, Lisete Celina; Coelho, Lineker Max Goulart
2012-07-15
Highlights: Black-Right-Pointing-Pointer Proposal of a new concept in waste management: Cleaner Treatment. Black-Right-Pointing-Pointer Development of an index to assess quantitatively waste treatment technologies. Black-Right-Pointing-Pointer Delphi Method was carried out so as to define environmental indicators. Black-Right-Pointing-Pointer Environmental performance evaluation of waste-to-energy plants. - Abstract: Although the concern with sustainable development and environment protection has considerably grown in the last years it is noted that the majority of decision making models and tools are still either excessively tied to economic aspects or geared to the production process. Moreover, existing models focus on the priority steps of solid waste management, beyond wastemore » energy recovery and disposal. So, in order to help the lack of models and tools aiming at the waste treatment and final disposal, a new concept is proposed: the Cleaner Treatment, which is based on the Cleaner Production principles. This paper focuses on the development and validation of the Cleaner Treatment Index (CTI), to assess environmental performance of waste treatment technologies based on the Cleaner Treatment concept. The index is formed by aggregation (summation or product) of several indicators that consists in operational parameters. The weights of the indicator were established by Delphi Method and Brazilian Environmental Laws. In addition, sensitivity analyses were carried out comparing both aggregation methods. Finally, index validation was carried out by applying the CTI to 10 waste-to-energy plants data. From sensitivity analysis and validation results it is possible to infer that summation model is the most suitable aggregation method. For summation method, CTI results were superior to 0.5 (in a scale from 0 to 1) for most facilities evaluated. 
This study thus demonstrates that the CTI is a simple and robust tool for assessing and comparing the environmental performance of different treatment plants, and an excellent quantitative tool to support Cleaner Treatment implementation.
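The index aggregation described above (weighted summation or weighted product of normalized indicators) can be sketched as follows; the indicator scores and weights here are illustrative placeholders, not the paper's actual Delphi-derived values:

```python
def cti(indicators, weights, method="sum"):
    """Aggregate normalized indicator scores (0-1) into a single index.

    method="sum" gives a weighted arithmetic mean; method="product" gives a
    weighted geometric mean. Weights are normalized to sum to 1.
    """
    total = sum(weights)
    w = [x / total for x in weights]
    if method == "sum":
        return sum(wi * si for wi, si in zip(w, indicators))
    elif method == "product":
        result = 1.0
        for wi, si in zip(w, indicators):
            result *= si ** wi
        return result
    raise ValueError("method must be 'sum' or 'product'")

# Illustrative indicator scores for a hypothetical waste-to-energy plant
scores = [0.8, 0.6, 0.9]
weights = [2, 1, 1]
print(round(cti(scores, weights, "sum"), 3))      # 0.775
print(round(cti(scores, weights, "product"), 3))  # 0.767
```

Note how the product form penalizes a single poor indicator more heavily than the summation form, which is one reason aggregation choice matters in a sensitivity analysis.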
Fisichelli, Nicholas A.; Schuurman, Gregor; Symstad, Amy J.; Ray, Andrea; Friedman, Jonathan M.; Miller, Brian; Rowland, Erika
2016-01-01
The Scaling Climate Change Adaptation in the Northern Great Plains through Regional Climate Summaries and Local Qualitative-Quantitative Scenario Planning Workshops project synthesizes climate data into 3-5 distinct but plausible climate summaries for the northern Great Plains region; crafts quantitative summaries of these climate futures for two focal areas; and applies these local summaries by developing climate-resource-management scenarios through participatory workshops and, where possible, simulation models. The two focal areas are central North Dakota and southwest South Dakota (Figure 1). The primary objective of this project is to help resource managers and scientists in a focal area use scenario planning to make management and planning decisions based on assessments of critical future uncertainties. This report summarizes project work for public and tribal lands in the central North Dakota focal area, with an emphasis on Knife River Indian Villages National Historic Site. The report explains scenario planning as an adaptation tool in general, then describes how it was applied to the central North Dakota focal area in three phases. Priority resource management and climate uncertainties were identified in the orientation phase. Local climate summaries for relevant, divergent, and challenging climate scenarios were developed in the second phase. In the final phase, a two-day scenario planning workshop held November 12-13, 2015, in Bismarck, ND, featured scenario development and implications, testing management decisions, and methods for operationalizing scenario planning outcomes.
Fisichelli, Nicholas A.; Schuurman, Gregor W.; Symstad, Amy J.; Ray, Andrea; Miller, Brian; Cross, Molly; Rowland, Erika
2016-01-01
The Scaling Climate Change Adaptation in the Northern Great Plains through Regional Climate Summaries and Local Qualitative-Quantitative Scenario Planning Workshops project synthesizes climate data into 3-5 distinct but plausible climate summaries for the northern Great Plains region; crafts quantitative summaries of these climate futures for two focal areas; and applies these local summaries by developing climate-resource-management scenarios through participatory workshops and, where possible, simulation models. The two focal areas are central North Dakota and southwest South Dakota (Figure 1). The primary objective of this project is to help resource managers and scientists in a focal area use scenario planning to make management and planning decisions based on assessments of critical future uncertainties. This report summarizes project work for public and tribal lands in the southwest South Dakota grasslands focal area, with an emphasis on Badlands National Park and Buffalo Gap National Grassland. The report explains scenario planning as an adaptation tool in general, then describes how it was applied to the focal area in three phases. Priority resource management and climate uncertainties were identified in the orientation phase. Local climate summaries for relevant, divergent, and challenging climate scenarios were developed in the second phase. In the final phase, a two-day scenario planning workshop held January 20-21, 2016, in Rapid City, South Dakota, featured scenario development and implications, testing management decisions, and methods for operationalizing scenario planning outcomes.
Quantitative Tools for the Logistics Manager.
1980-04-01
2015-03-19
SERDP & ESTCP Webinar Series (#11): abiotic degradation by magnetite (FeO·Fe2O3), which often occurs naturally in sediments formed by weathering of igneous or metamorphic rock; integration of the Monitored Natural Attenuation (MNA) decision-making framework into an easy-to-use Excel spreadsheet application that guides users in the selection of
Quantitation of HBV DNA in human serum using a branched DNA (bDNA) signal amplification assay.
Hendricks, D A; Stowe, B J; Hoo, B S; Kolberg, J; Irvine, B D; Neuwald, P D; Urdea, M S; Perrillo, R P
1995-11-01
The aim of this study was to establish the performance characteristics of a nonradioisotopic branched DNA (bDNA) signal amplification assay for quantitation of hepatitis B virus (HBV) DNA in human serum. Quantitation was determined from a standard curve and expressed as HBV DNA equivalents/mL (Eq/mL; 285,000 Eq = 1 pg of double-stranded HBV DNA). The bDNA assay exhibited a nearly four-log dynamic range of quantitation and an analytical detection limit of approximately 100,000 Eq/mL. To ensure a specificity of 99.7%, the quantitation limit was set at 700,000 Eq/mL. The interassay coefficient of variation for quantitation values ranged from 10% to 15% when the assay was performed by novice users with different sets of reagents. Using the bDNA assay, HBV DNA was detected in 94% to 100% of hepatitis B e antigen-positive specimens and 27% to 31% of hepatitis B e antigen-negative specimens from chronically HBV-infected patients. The bDNA assay may be useful as a prognostic and therapy-monitoring tool for the management of HBV-infected patients undergoing antiviral treatment.
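The conversion between assay equivalents and DNA mass follows directly from the calibration stated in the abstract (285,000 Eq = 1 pg of double-stranded HBV DNA); a minimal sketch:

```python
EQ_PER_PG = 285_000  # 285,000 HBV DNA equivalents = 1 pg double-stranded HBV DNA

def eq_to_pg(eq_per_ml):
    """Convert bDNA assay output (Eq/mL) to pg of HBV DNA per mL of serum."""
    return eq_per_ml / EQ_PER_PG

# The assay's quantitation limit of 700,000 Eq/mL corresponds to about 2.46 pg/mL
print(round(eq_to_pg(700_000), 2))  # 2.46
```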
Chang, Hui-Yin; Chen, Ching-Tai; Lih, T. Mamie; Lynn, Ke-Shiuan; Juo, Chiun-Gung; Hsu, Wen-Lian; Sung, Ting-Yi
2016-01-01
Efficient and accurate quantitation of metabolites from LC-MS data has become an important topic. Here we present an automated tool, called iMet-Q (intelligent Metabolomic Quantitation), for label-free metabolomics quantitation from high-throughput MS1 data. By performing peak detection and peak alignment, iMet-Q provides a summary of quantitation results and reports ion abundance at both replicate level and sample level. Furthermore, it gives the charge states and isotope ratios of detected metabolite peaks to facilitate metabolite identification. An in-house standard mixture and a public Arabidopsis metabolome data set were analyzed by iMet-Q. Three public quantitation tools, including XCMS, MetAlign, and MZmine 2, were used for performance comparison. From the mixture data set, seven standard metabolites were detected by the four quantitation tools, for which iMet-Q had a smaller quantitation error of 12% in both profile and centroid data sets. Our tool also correctly determined the charge states of seven standard metabolites. By searching the mass values for those standard metabolites against Human Metabolome Database, we obtained a total of 183 metabolite candidates. With the isotope ratios calculated by iMet-Q, 49% (89 out of 183) metabolite candidates were filtered out. From the public Arabidopsis data set reported with two internal standards and 167 elucidated metabolites, iMet-Q detected all of the peaks corresponding to the internal standards and 167 metabolites. Meanwhile, our tool had small abundance variation (≤0.19) when quantifying the two internal standards and had higher abundance correlation (≥0.92) when quantifying the 167 metabolites. iMet-Q provides user-friendly interfaces and is publicly available for download at http://ms.iis.sinica.edu.tw/comics/Software_iMet-Q.html. PMID:26784691
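The isotope-ratio filtering step described above can be illustrated with a simplified check; this is not iMet-Q's actual algorithm, only a sketch assuming the common approximation that the M+1/M intensity ratio grows by roughly 1.1% (natural 13C abundance) per carbon atom in the formula:

```python
def passes_isotope_filter(observed_ratio, n_carbons, tolerance=0.3):
    """Keep a candidate formula if its theoretical M+1/M isotope ratio
    (approx. 1.1% per carbon from natural 13C abundance) matches the
    observed ratio within a relative tolerance."""
    theoretical = 0.011 * n_carbons
    return abs(observed_ratio - theoretical) <= tolerance * theoretical

# A measured M+1/M ratio of 0.066 is consistent with a C6 candidate (0.011 * 6)
print(passes_isotope_filter(0.066, 6))   # True
print(passes_isotope_filter(0.066, 20))  # False
```

A filter of this kind is how roughly half of the 183 database candidates in the mixture experiment could be eliminated before manual inspection.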
Probabilistic cost-benefit analysis of disaster risk management in a development context.
Kull, Daniel; Mechler, Reinhard; Hochrainer-Stigler, Stefan
2013-07-01
Limited studies have shown that disaster risk management (DRM) can be cost-efficient in a development context. Cost-benefit analysis (CBA) is an evaluation tool to analyse economic efficiency. This research introduces quantitative, stochastic CBA frameworks and applies them in case studies of flood and drought risk reduction in India and Pakistan, while also incorporating projected climate change impacts. DRM interventions are shown to be economically efficient, with integrated approaches more cost-effective and robust than singular interventions. The paper highlights that CBA can be a useful tool if certain issues are considered properly, including: complexities in estimating risk; data dependency of results; negative effects of interventions; and distributional aspects. The design and process of CBA must take into account specific objectives, available information, resources, and the perceptions and needs of stakeholders as transparently as possible. Intervention design and uncertainties should be qualified through dialogue, indicating that process is as important as numerical results. © 2013 The Author(s). Journal compilation © Overseas Development Institute, 2013.
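A stochastic CBA of the kind described can be sketched with a simple Monte Carlo simulation; all parameter values below (intervention cost, flood probability, loss range, discount rate) are hypothetical, not taken from the India or Pakistan case studies:

```python
import random

def benefit_cost_ratio(n_sims=100_000, seed=42):
    """Median benefit-cost ratio for a hypothetical flood-protection
    intervention: each year a flood occurs with some probability, and the
    loss avoided (a random severity) is discounted back to the present."""
    random.seed(seed)
    cost = 1_000_000          # up-front intervention cost (hypothetical)
    p_flood = 0.1             # annual flood probability (hypothetical)
    horizon, rate = 30, 0.05  # planning horizon in years, discount rate
    ratios = []
    for _ in range(n_sims):
        pv_benefit = 0.0
        for t in range(1, horizon + 1):
            if random.random() < p_flood:
                avoided = random.uniform(200_000, 800_000)  # loss avoided
                pv_benefit += avoided / (1 + rate) ** t
        ratios.append(pv_benefit / cost)
    ratios.sort()
    return ratios[len(ratios) // 2]  # median of the simulated BCRs

print(f"median benefit-cost ratio ~ {benefit_cost_ratio(10_000):.2f}")
```

Reporting the full distribution of simulated ratios, rather than a single deterministic number, is what distinguishes this probabilistic framing from conventional CBA.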
[ABC supplies classification: a cost management tool in nursing].
Lourenço, Karina Gomes; Castilho, Valéria
2006-01-01
The implementation of cost management systems has been extremely helpful in the healthcare area owing to their efficacy in cutting expenditures as well as improving service quality. ABC classification is a strategy applied to inventory management and control. This exploratory, descriptive, quantitative study was carried out to identify the demand for supplies at the Universidade de Sao Paulo's Hospital over a one-year period. Of the 1,938 materials classified, 67 items were identified as Class A, corresponding to the materials with the highest costs for the hospital; 31.3% of these Class A catalogued items are nursing materials, those most used by the nursing team.
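ABC classification ranks items by annual cost and assigns classes by cumulative share of total spend; a minimal sketch with hypothetical hospital supply items and the conventional 80%/95% cutoffs (the study's own cutoffs are not stated in the abstract):

```python
def abc_classify(items, a_cut=0.8, b_cut=0.95):
    """Classify (name, annual_cost) pairs by cumulative share of total cost:
    Class A up to a_cut (default 80%), Class B up to b_cut (95%), rest C."""
    total = sum(cost for _, cost in items)
    ranked = sorted(items, key=lambda x: x[1], reverse=True)
    classes, cumulative = {}, 0.0
    for name, cost in ranked:
        cumulative += cost
        share = cumulative / total
        classes[name] = "A" if share <= a_cut else "B" if share <= b_cut else "C"
    return classes

# Hypothetical stock list (names and costs are illustrative only)
stock = [("gloves", 50_000), ("syringes", 30_000), ("gauze", 12_000),
         ("tape", 5_000), ("labels", 3_000)]
print(abc_classify(stock))
# {'gloves': 'A', 'syringes': 'A', 'gauze': 'B', 'tape': 'C', 'labels': 'C'}
```

The Pareto logic is visible even in this toy example: two of five items account for 80% of cost and therefore warrant the tightest stock control.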
Vision training methods for sports concussion mitigation and management.
Clark, Joseph F; Colosimo, Angelo; Ellis, James K; Mangine, Robert; Bixenmann, Benjamin; Hasselfeld, Kimberly; Graman, Patricia; Elgendy, Hagar; Myer, Gregory; Divine, Jon
2015-05-05
There is emerging evidence supporting the use of vision training, including light board training tools, as a concussion baseline and neuro-diagnostic tool and potentially as a supportive component of concussion prevention strategies. This paper focuses on providing detailed methods for select vision training tools and reporting normative data for comparison when vision training is part of a sports management program. The overall program includes standard vision training methods, including tachistoscope, Brock's string, and strobe glasses, as well as specialized light board training algorithms. Stereopsis is measured as a means to monitor vision training effects. In addition, quantitative results for vision training methods, as well as baseline and post-testing *A and Reaction Test measures with progressive scores, are reported. Collegiate athletes consistently improve their stereopsis, *A, and Reaction Test scores after six weeks of training. When vision training is initiated as a team-wide exercise, the incidence of concussion decreases in players who participate in training compared to players who do not receive the vision training. Vision training produces functional and performance changes that, when monitored, can be used to assess the success of the vision training, and it can be initiated as part of a sports medicine intervention for concussion prevention.
Quantitative Stratification of Diffuse Parenchymal Lung Diseases
Raghunath, Sushravya; Rajagopalan, Srinivasan; Karwoski, Ronald A.; Maldonado, Fabien; Peikert, Tobias; Moua, Teng; Ryu, Jay H.; Bartholmai, Brian J.; Robb, Richard A.
2014-01-01
Diffuse parenchymal lung diseases (DPLDs) are characterized by widespread pathological changes within the pulmonary tissue that impair the elasticity and gas exchange properties of the lungs. Clinical-radiological diagnosis of these diseases remains challenging and their clinical course is characterized by variable disease progression. These challenges have hindered the introduction of robust objective biomarkers for patient-specific prediction based on specific phenotypes in clinical practice for patients with DPLD. Therefore, strategies facilitating individualized clinical management, staging and identification of specific phenotypes linked to clinical disease outcomes or therapeutic responses are urgently needed. A classification schema consistently reflecting the radiological, clinical (lung function and clinical outcomes) and pathological features of a disease represents a critical need in modern pulmonary medicine. Herein, we report a quantitative stratification paradigm to identify subsets of DPLD patients with characteristic radiologic patterns in an unsupervised manner and demonstrate significant correlation of these self-organized disease groups with clinically accepted surrogate endpoints. The proposed consistent and reproducible technique could potentially transform diagnostic staging, clinical management and prognostication of DPLD patients as well as facilitate patient selection for clinical trials beyond the ability of current radiological tools. In addition, the sequential quantitative stratification of the type and extent of parenchymal process may allow standardized and objective monitoring of disease, early assessment of treatment response and mortality prediction for DPLD patients. PMID:24676019
NASA Astrophysics Data System (ADS)
Ciurean, R. L.; Glade, T.
2012-04-01
Decision under uncertainty is a constant of everyday life and an important component of risk management and governance. Recently, experts have emphasized the importance of quantifying uncertainty in all phases of landslide risk analysis. Due to its multi-dimensional and dynamic nature, (physical) vulnerability is inherently complex, and the "degree of loss" estimates are imprecise and to some extent even subjective. Uncertainty analysis introduces quantitative modeling approaches that allow for a more explicitly objective output, improving the risk management process as well as enhancing communication between various stakeholders for better risk governance. This study presents a review of concepts for uncertainty analysis in vulnerability of elements at risk to landslides. Different semi-quantitative and quantitative methods are compared based on their feasibility in real-world situations, hazard dependency, process stage in vulnerability assessment (i.e. input data, model, output), and applicability within an integrated landslide hazard and risk framework. The resulting observations will help to identify current gaps and future needs in vulnerability assessment, including estimation of uncertainty propagation, transferability of the methods, and development of visualization tools, but also address basic questions such as what uncertainty is and how it can be quantified or treated in a reliable and reproducible way.
Scoring the importance of tropical forest landscapes with local people: patterns and insights.
Sheil, Douglas; Liswanti, Nining
2006-07-01
Good natural resource management is scarce in many remote tropical regions. Improved management requires better local consultation, but accessing and understanding the preferences and concerns of stakeholders can be difficult. Scoring, where items are numerically rated in relation to each other, is simple and seems applicable even in situations where capacity and funds are limited, but managers rarely use such methods. Here we investigate scoring with seven indigenous communities threatened by forest loss in Kalimantan, Indonesia. We aimed to clarify the forest's multifaceted importance, using replication, cross-check exercises, and interviews. Results are sometimes surprising, but generally explained by additional investigation that sometimes provides new insights. The consistency of scoring results increases in line with community literacy and wealth. Various benefits and pitfalls are identified and examined. Aside from revealing and clarifying local preferences, scoring has unexplored potential as a quantitative technique. Scoring is an underappreciated management tool with wide potential.
Analyzing Human-Landscape Interactions: Tools That Integrate
NASA Astrophysics Data System (ADS)
Zvoleff, Alex; An, Li
2014-01-01
Humans have transformed much of Earth's land surface, giving rise to loss of biodiversity, climate change, and a host of other environmental issues that are affecting human and biophysical systems in unexpected ways. To confront these problems, environmental managers must consider human and landscape systems in integrated ways. This means making use of data obtained from a broad range of methods (e.g., sensors, surveys), while taking into account new findings from the social and biophysical science literatures. New integrative methods (including data fusion, simulation modeling, and participatory approaches) have emerged in recent years to address these challenges, and to allow analysts to provide information that links qualitative and quantitative elements for policymakers. This paper brings attention to these emergent tools while providing an overview of the tools currently in use for analysis of human-landscape interactions. Analysts are now faced with a staggering array of approaches in the human-landscape literature—in an attempt to bring increased clarity to the field, we identify the relative strengths of each tool, and provide guidance to analysts on the areas to which each tool is best applied. We discuss four broad categories of tools: statistical methods (including survival analysis, multi-level modeling, and Bayesian approaches), GIS and spatial analysis methods, simulation approaches (including cellular automata, agent-based modeling, and participatory modeling), and mixed-method techniques (such as alternative futures modeling and integrated assessment). For each tool, we offer an example from the literature of its application in human-landscape research. Among these tools, participatory approaches are gaining prominence for analysts to make the broadest possible array of information available to researchers, environmental managers, and policymakers. 
Further development of new approaches of data fusion and integration across sites or disciplines pose an important challenge for future work in integrating human and landscape components.
An Evolutionary Complex Systems Decision-Support Tool for the Management of Operations
NASA Astrophysics Data System (ADS)
Baldwin, J. S.; Allen, P. M.; Ridgway, K.
2011-12-01
This research aimed to add both to the development of complex systems thinking in the subject area of Operations and Production Management and to the limited number of applications of computational models and simulations from the science of complex systems. The latter potentially offer helpful decision-support tools for operations and production managers. A mechanical engineering firm was used as a case study where a combined qualitative and quantitative methodological approach was employed to extract the required data from four senior managers. Company performance measures as well as firm technologies, practices and policies, and their relation and interaction with one another, were elicited. The data were subjected to an evolutionary complex systems model resulting in a series of simulations. The findings included both reassuring and some unexpected results. The simulation based on the CEO's opinions led to the most cohesive and synergistic collection of practices describing the firm, closely followed by the Marketing and R&D Managers. The Manufacturing Manager's responses led to the most extreme evolutionary trajectory, where the integrity of the entire firm came into question, particularly when considering how employees were utilised. By drawing directly from the opinions and views of managers rather than from logical 'if-then' rules and averaged mathematical representations of agents that characterise agent-based and other self-organisational models, this work builds on previous applications by capturing a micro-level description of diversity and a learning effect that has been problematical not only in terms of theory but also in application.
This approach can be used as a decision-support tool for operations and other managers providing a forum with which to explore a) the strengths, weaknesses and consequences of different decision-making capacities within the firm; b) the introduction of new manufacturing technologies, practices and policies; and, c) the different evolutionary trajectories that a firm can take.
Rapid quantitation of neuraminidase inhibitor drug resistance in influenza virus quasispecies.
Lackenby, Angie; Democratis, Jane; Siqueira, Marilda M; Zambon, Maria C
2008-01-01
Emerging resistance of influenza viruses to neuraminidase inhibitors is a concern, both in surveillance of global circulating strains and in treatment of individual patients. Current methodologies to detect resistance rely on the use of cultured virus, thus taking time to complete or lacking the sensitivity to detect mutations in viral quasispecies. Methodology for rapid detection of clinically meaningful resistance is needed to assist individual patient management and to track the transmission of resistant viruses in the community. We have developed a pyrosequencing methodology to detect and quantitate influenza neuraminidase inhibitor resistance mutations in cultured virus and directly in clinical material. Our assays target polymorphisms associated with drug resistance in the neuraminidase genes of human influenza A H1N1 as well as human and avian H5N1 viruses. Quantitation can be achieved using viral RNA extracted directly from respiratory or tissue samples, thus eliminating the need for virus culture and allowing the assay of highly pathogenic viruses such as H5N1 without high containment laboratory facilities. Antiviral-resistant quasispecies are detected and quantitated accurately when present in the total virus population at levels as low as 10%. Pyrosequencing is a real-time assay; therefore, results can be obtained within a clinically relevant timeframe and provide information capable of informing individual patient or outbreak management. Pyrosequencing is ideally suited for early identification of emerging antiviral resistance in human and avian influenza infection and is a useful tool for laboratory surveillance and pandemic preparedness.
NASA Astrophysics Data System (ADS)
Emori, Seita; Takahashi, Kiyoshi; Yamagata, Yoshiki; Oki, Taikan; Mori, Shunsuke; Fujigaki, Yuko
2013-04-01
With the aim of proposing strategies of global climate risk management, we have launched a five-year research project called ICA-RUS (Integrated Climate Assessment - Risks, Uncertainties and Society). In this project with the phrase "risk management" in its title, we aspire to a comprehensive assessment of climate change risks, explicit consideration of uncertainties, utilization of the best available information, and consideration of all possible conditions and options. We also regard the problem as one of decision-making at the human level, which involves social value judgments and adapts to future changes in circumstances. The ICA-RUS project consists of the following five themes: 1) Synthesis of global climate risk management strategies, 2) Optimization of land, water and ecosystem uses for climate risk management, 3) Identification and analysis of critical climate risks, 4) Evaluation of climate risk management options under technological, social and economic uncertainties and 5) Interactions between scientific and social rationalities in climate risk management (see also: http://www.nies.go.jp/ica-rus/en/). For the integration of quantitative knowledge of climate change risks and responses, we apply a tool named AIM/Impact [Policy], which consists of an energy-economic model, a simplified climate model and impact projection modules. At the same time, in order to make use of qualitative knowledge as well, we hold monthly project meetings for the discussion of risk management strategies and publish annual reports based on the quantitative and qualitative information. To enhance the comprehensiveness of the analyses, we maintain an inventory of risks and risk management options. The inventory is revised iteratively through interactive meetings with stakeholders such as policymakers, government officials and industrial representatives.
Lieffers, Jessica R L; Arocha, Jose F; Grindrod, Kelly; Hanning, Rhona M
2018-02-01
Nutrition mobile apps have become accessible and popular weight-management tools available to the general public. To date, much of the research has focused on quantitative outcomes with these tools (eg, weight loss); little is known about user experiences and perceptions of these tools when used outside of a research trial environment. Our aim was to understand the experiences and perceptions of adult volunteers who have used publicly available mobile apps to support nutrition behavior change for weight management. We conducted one-on-one semi-structured interviews with individuals who reported using nutrition mobile apps for weight management outside of a research setting. Twenty-four healthy adults (n=19 females, n=5 males) who had used publicly available nutrition mobile apps for weight management for ≥1 week within the past 3 to 4 months were recruited from the community in southern Ontario and Edmonton, Canada, using different methods (eg, social media, posters, and word of mouth). Interviews were audiorecorded, transcribed verbatim, and transcripts were verified against recordings. Data were coded inductively and organized into categories using NVivo, version 10 (QSR International). Participants used nutrition apps for various amounts of time (mean=approximately 14 months). Varied nutrition apps were used; however, MyFitnessPal was the most common. In the interviews, the following four categories of experiences with nutrition apps became apparent: food data entry (database, data entry methods, portion size, and complex foods); accountability, feedback, and progress (goal setting, accountability, monitoring, and feedback); technical and app-related factors; and personal factors (self-motivation, privacy, knowledge, and obsession). Most participants used apps without professional or dietitian support. This work reveals that numerous factors affect use and ongoing adherence to use of nutrition mobile apps. 
These data are relevant to professionals looking to better assist individuals using these tools, as well as developers looking to develop new and improved apps. Copyright © 2018 Academy of Nutrition and Dietetics. Published by Elsevier Inc. All rights reserved.
NASA Astrophysics Data System (ADS)
Miller, B. W.; Schuurman, G. W.; Symstad, A.; Fisichelli, N. A.; Frid, L.
2017-12-01
Managing natural resources in this era of anthropogenic climate change is fraught with uncertainties around how ecosystems will respond to management actions and a changing climate. Scenario planning (oftentimes implemented as a qualitative, participatory exercise for exploring multiple possible futures) is a valuable tool for addressing this challenge. However, this approach may face limits in resolving responses of complex systems to altered climate and management conditions, and may not provide the scientific credibility that managers often require to support actions that depart from current practice. Quantitative information on projected climate changes and ecological responses is rapidly growing and evolving, but this information is often not at a scale or in a form that is 'actionable' for resource managers. We describe a project that sought to create usable information for resource managers in the northern Great Plains by combining qualitative and quantitative methods. In particular, researchers, resource managers, and climate adaptation specialists co-produced a simulation model in conjunction with scenario planning workshops to inform natural resource management in southwest South Dakota. Scenario planning for a wide range of resources facilitated open-minded thinking about a set of divergent and challenging, yet relevant and plausible, climate scenarios and management alternatives that could be implemented in the simulation. With stakeholder input throughout the process, we built a simulation of key vegetation types, grazing, exotic plants, fire, and the effects of climate and management on rangeland productivity and composition. By simulating multiple land management jurisdictions, climate scenarios, and management alternatives, the model highlighted important tradeoffs between herd sizes and vegetation composition, and between the short- versus long-term costs of invasive species management.
It also identified impactful uncertainties related to the effects of fire and grazing on vegetation. Ultimately, this integrative and iterative approach yielded counter-intuitive and surprising findings, and resulted in a more tractable set of possible futures for resource management planning.
Modeling with Young Students--Quantitative and Qualitative.
ERIC Educational Resources Information Center
Bliss, Joan; Ogborn, Jon; Boohan, Richard; Brosnan, Tim; Mellar, Harvey; Sakonidis, Babis
1999-01-01
A project created tasks and tools to investigate quality and nature of 11- to 14-year-old pupils' reasoning with quantitative and qualitative computer-based modeling tools. Tasks and tools were used in two innovative modes of learning: expressive, where pupils created their own models, and exploratory, where pupils investigated an expert's model.…
Wear-Induced Changes in FSW Tool Pin Profile: Effect of Process Parameters
NASA Astrophysics Data System (ADS)
Sahlot, Pankaj; Jha, Kaushal; Dey, G. K.; Arora, Amit
2018-06-01
Friction stir welding (FSW) of high melting point metallic (HMPM) materials has limited application due to tool wear and relatively short tool life. Tool wear changes the profile of the tool pin and adversely affects weld properties. A quantitative understanding of tool wear and tool pin profile is crucial to develop the process for joining of HMPM materials. Here we present a quantitative wear study of the H13 steel tool pin profile for FSW of CuCrZr alloy. The tool pin profile is analyzed at multiple traverse distances for welding with various tool rotational and traverse speeds. The results indicate that measured wear depth is small near the pin root and increases significantly towards the tip. Near the pin tip, wear depth increases with increasing tool rotational speed; however, the change in wear depth near the pin root is minimal. Wear depth also increases with decreasing tool traverse speed. Tool pin wear from the bottom results in pin length reduction, which is greater for higher tool rotational speeds and longer traverse distances. The pin profile changes due to wear and results in a root defect at long traverse distances. This quantitative understanding would be helpful for estimating tool wear, optimizing process parameters, and designing the tool pin shape during FSW of HMPM materials.
Wignall, Jessica A; Muratov, Eugene; Sedykh, Alexander; Guyton, Kathryn Z; Tropsha, Alexander; Rusyn, Ivan; Chiu, Weihsueh A
2018-05-01
Human health assessments synthesize human, animal, and mechanistic data to produce toxicity values that are key inputs to risk-based decision making. Traditional assessments are data-, time-, and resource-intensive, and they cannot be developed for most environmental chemicals owing to a lack of appropriate data. As recommended by the National Research Council, we propose a solution for predicting toxicity values for data-poor chemicals through development of quantitative structure-activity relationship (QSAR) models. We used a comprehensive database of chemicals with existing regulatory toxicity values from U.S. federal and state agencies to develop quantitative QSAR models. We compared QSAR-based model predictions to those based on high-throughput screening (HTS) assays. QSAR models for noncancer threshold-based values and cancer slope factors had cross-validation-based Q² of 0.25-0.45, mean model errors of 0.70-1.11 log10 units, and applicability domains covering >80% of environmental chemicals. Toxicity values predicted from QSAR models developed in this study were more accurate and precise than those based on HTS assays or mean-based predictions. A publicly accessible web interface to make predictions for any chemical of interest is available at http://toxvalue.org. An in silico tool that can predict toxicity values with an uncertainty of an order of magnitude or less can be used to quickly and quantitatively assess risks of environmental chemicals when traditional toxicity data or human health assessments are unavailable. This tool can fill a critical gap in the risk assessment and management of data-poor chemicals. https://doi.org/10.1289/EHP2998.
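The cross-validated Q² reported for such models is conventionally computed as 1 - PRESS/SS_tot over predictions made for held-out samples; a minimal sketch with hypothetical toxicity values (not data from the study):

```python
def q_squared(y_true, y_pred_cv):
    """Cross-validated Q^2: 1 - PRESS / total sum of squares, where
    y_pred_cv are predictions for samples held out during model fitting."""
    mean_y = sum(y_true) / len(y_true)
    press = sum((yt - yp) ** 2 for yt, yp in zip(y_true, y_pred_cv))
    ss_tot = sum((yt - mean_y) ** 2 for yt in y_true)
    return 1 - press / ss_tot

# Hypothetical log10 toxicity values vs. held-out QSAR predictions
y = [1.2, 2.5, 3.1, 0.8, 2.0]
y_hat = [1.5, 2.2, 2.8, 1.2, 2.1]
print(round(q_squared(y, y_hat), 2))  # 0.87
```

Unlike a fit-based R², Q² uses out-of-sample predictions, so values in the 0.25-0.45 range quoted above reflect genuine predictive power rather than overfitting.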
Belrhiti, Zakaria; Booth, Andrew; Marchal, Bruno; Verstraeten, Roosmarijn
2016-04-27
District health managers play a key role in the effectiveness of decentralized health systems in low- and middle-income countries. Inadequate management and leadership skills often hamper their ability to improve quality of care and effectiveness of health service delivery. Nevertheless, significant investments have been made in capacity-building programmes based on site-based training, mentoring, and operational research. This systematic review aims to assess the effectiveness of site-based training, mentoring, and operational research (or action research) on the improvement of district health system management and leadership. Our secondary objectives are to assess whether variations in composition or intensity of the intervention influence its effectiveness and to identify enabling and constraining contexts and underlying mechanisms. We will search the following databases: MEDLINE, PsycInfo, Cochrane Library, CRD database (DARE), Cochrane Effective Practice and Organisation of Care (EPOC) group, ISI Web of Science, Health Evidence.org, PDQ-Evidence, ERIC, EMBASE, and TRIP. Complementary searches will be performed (hand-searching journals and citation and reference tracking). Studies that meet the following PICO (Population, Intervention, Comparison, Outcome) criteria will be included: P: professionals working at district health management level; I: site-based training with or without mentoring, or operational research; C: normal institutional arrangements; and O: district health management functions. We will include cluster randomized controlled trials, controlled before-and-after studies, interrupted time series analysis, quasi-experimental designs, and cohort and longitudinal studies. Qualitative research will be included to contextualize findings and identify barriers and facilitators. Primary outcomes that will be reported are district health management and leadership functions.
We will assess risk of bias with the Cochrane Collaboration's tools for randomized controlled trials (RCTs) and non-RCT studies, and the Critical Appraisal Skills Programme checklists for qualitative studies. We will assess strength of recommendations with the GRADE tool for quantitative studies and the CERQual approach for qualitative studies. Synthesis of quantitative studies will be performed through meta-analysis when appropriate. Best-fit framework synthesis will be used to synthesize qualitative studies. This protocol paper describes a systematic review assessing the effectiveness of site-based training (with or without mentoring programmes or operational research) on the improvement of district health system management and leadership. PROSPERO CRD42015032351.
Integrated farm sustainability assessment for the environmental management of rural activities
DOE Office of Scientific and Technical Information (OSTI.GOV)
Stachetii Rodrigues, Geraldo, E-mail: stacheti@cnpma.embrapa.b; Aparecida Rodrigues, Izilda, E-mail: isis@cnpma.embrapa.b; Almeida Buschinelli, Claudio Cesar de, E-mail: buschi@cnpma.embrapa.b
2010-07-15
Farmers have been increasingly called upon to respond to an ongoing redefinition in consumers' demands, having as a converging theme the search for sustainable production practices. In order to satisfy this objective, instruments for the environmental management of agricultural activities have been sought out. Environmental impact assessment methods are appropriate tools to address the choice of technologies and management practices to minimize negative effects of agricultural development, while maximizing productive efficiency, sound usage of natural resources, conservation of ecological assets and equitable access to wealth generation means. The 'system for weighted environmental impact assessment of rural activities' (APOIA-NovoRural) presented in this paper is organized to provide integrated farm sustainability assessment according to quantitative environmental standards and defined socio-economic benchmarks. The system integrates sixty-two objective indicators in five sustainability dimensions - (i) Landscape ecology, (ii) Environmental quality (atmosphere, water and soil), (iii) Sociocultural values, (iv) Economic values, and (v) Management and administration. Impact indices are expressed in three integration levels: (i) specific indicators, that offer a diagnostic and managerial tool for farmers and rural administrators, by pointing out particular attributes of the rural activities that may be failing to comply with defined environmental performance objectives; (ii) integrated sustainability dimensions, that show decision-makers the major contributions of the rural activities toward local sustainable development, facilitating the definition of control actions and promotion measures; and (iii) aggregated sustainability index, that can be considered a yardstick for eco-certification purposes.
Nine fully documented case studies carried out with the APOIA-NovoRural system, focusing on different scales, diverse rural activities/farming systems, and contrasting spatial/territorial contexts, attest to the flexibility of the method and its applicability as an integrated farm environmental management tool.
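The three integration levels described above amount to a simple hierarchical aggregation, which can be sketched as follows. The dimension names match the abstract, but the indicator scores, the 0-1 scale, the unweighted means, and the performance baseline are all assumptions made for illustration, not the actual APOIA-NovoRural indicator set or weighting scheme:

```python
# Hypothetical indicator scores on an assumed 0-1 scale.
dimensions = {
    "landscape_ecology": [0.8, 0.6, 0.7],
    "environmental_quality": [0.9, 0.5, 0.6, 0.7],
    "sociocultural_values": [0.7, 0.8],
    "economic_values": [0.6, 0.9],
    "management_administration": [0.5, 0.8],
}

# Level (ii): one integrated index per sustainability dimension
dim_index = {d: sum(v) / len(v) for d, v in dimensions.items()}

# Level (iii): aggregated sustainability index (unweighted mean here)
aggregated = sum(dim_index.values()) / len(dim_index)

# Level (i): flag specific indicators failing a performance baseline,
# the diagnostic view intended for farmers and administrators
baseline = 0.6
failing = {d: [s for s in v if s < baseline] for d, v in dimensions.items()}

print(dim_index, round(aggregated, 3), failing)
```

The point of the three levels is that the same raw scores serve three audiences: failing indicators guide the farmer, dimension indices guide local decision-makers, and the single aggregated index serves as an eco-certification yardstick.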
Health impact assessment – A survey on quantifying tools
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fehr, Rainer, E-mail: rainer.fehr@uni-bielefeld.de; Mekel, Odile C.L., E-mail: odile.mekel@lzg.nrw.de; Fintan Hurley, J., E-mail: fintan.hurley@iom-world.org
Integrating human health into prospective impact assessments is known to be challenging. This is true for both approaches: dedicated health impact assessments (HIA) as well as inclusion of health into more general impact assessments. Acknowledging the full range of participatory, qualitative, and quantitative approaches, this study focuses on the latter, especially on computational tools for quantitative health modelling. We conducted a survey among tool developers concerning the status quo of development and availability of such tools; experiences made with model usage in real-life situations; and priorities for further development. Responding toolmaker groups described 17 such tools, most of them being maintained and reported as ready for use and covering a wide range of topics, including risk & protective factors, exposures, policies, and health outcomes. In recent years, existing models have been improved and were applied in new ways, and completely new models emerged. There was high agreement among respondents on the need to further develop methods for assessment of inequalities and uncertainty. The contribution of quantitative modeling to health foresight would benefit from building joint strategies of further tool development, improving the visibility of quantitative tools and methods, and engaging continuously with actual and potential users. - Highlights: • A survey investigated computational tools for health impact quantification. • Formal evaluation of such tools has been rare. • Handling inequalities and uncertainties are priority areas for further development. • Health foresight would benefit from tool developers and users forming a community. • Joint development strategies across computational tools are needed.
Martín-Campos, Trinidad; Mylonas, Roman; Masselot, Alexandre; Waridel, Patrice; Petricevic, Tanja; Xenarios, Ioannis; Quadroni, Manfredo
2017-08-04
Mass spectrometry (MS) has become the tool of choice for the large scale identification and quantitation of proteins and their post-translational modifications (PTMs). This development has been enabled by powerful software packages for the automated analysis of MS data. While data on PTMs of thousands of proteins can nowadays be readily obtained, fully deciphering the complexity and combinatorics of modification patterns even on a single protein often remains challenging. Moreover, functional investigation of PTMs on a protein of interest requires validation of the localization and the accurate quantitation of its changes across several conditions, tasks that often still require human evaluation. Software tools for large scale analyses are highly efficient but are rarely conceived for interactive, in-depth exploration of data on individual proteins. We here describe MsViz, a web-based and interactive software tool that supports manual validation of PTMs and their relative quantitation in small- and medium-size experiments. The tool displays sequence coverage information, peptide-spectrum matches, tandem MS spectra and extracted ion chromatograms through a single, highly intuitive interface. We found that MsViz greatly facilitates manual data inspection to validate PTM location and quantitate modified species across multiple samples.
Slade, Jeffrey W.; Adams, Jean V.; Christie, Gavin C.; Cuddy, Douglas W.; Fodale, Michael F.; Heinrich, John W.; Quinlan, Henry R.; Weise, Jerry G.; Weisser, John W.; Young, Robert J.
2003-01-01
Before 1995, Great Lakes streams were selected for lampricide treatment based primarily on qualitative measures of the relative abundance of larval sea lampreys, Petromyzon marinus. New integrated pest management approaches required standardized quantitative measures of sea lamprey. This paper evaluates historical larval assessment techniques and data and describes how new standardized methods for estimating abundance of larval and metamorphosed sea lampreys were developed and implemented. Since 1995, these new methods have been used to estimate larval and metamorphosed sea lamprey abundance in about 100 Great Lakes streams annually and to rank them for lampricide treatment. Implementation of these methods has provided a quantitative means of selecting streams for treatment based on treatment cost and estimated production of metamorphosed sea lampreys. It has also given managers a tool to estimate potential recruitment of sea lampreys to the Great Lakes and to measure the potential consequences of not treating streams, resulting in a more justifiable allocation of resources. The empirical data produced can also be used to simulate the impacts of various control scenarios.
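The ranking criterion described above — treatment cost weighed against estimated production of metamorphosed sea lampreys — can be sketched as a simple cost-effectiveness sort. The stream names, costs, and production estimates below are invented for illustration:

```python
# Rank candidate streams by treatment cost per estimated
# metamorphosed sea lamprey averted (lower ratio = higher priority).
# All figures are hypothetical.
streams = [
    {"name": "Stream A", "cost": 40_000, "transformers": 8_000},
    {"name": "Stream B", "cost": 25_000, "transformers": 2_500},
    {"name": "Stream C", "cost": 60_000, "transformers": 30_000},
]

ranked = sorted(streams, key=lambda s: s["cost"] / s["transformers"])
for s in ranked:
    print(s["name"], round(s["cost"] / s["transformers"], 2))
```

Sorting on the cost-per-transformer ratio makes the allocation rationale explicit: a stream with modest total cost but low production can still rank below an expensive stream that removes far more metamorphosing lampreys.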
Risk assessment techniques with applicability in marine engineering
NASA Astrophysics Data System (ADS)
Rudenko, E.; Panaitescu, F. V.; Panaitescu, M.
2015-11-01
Nowadays risk management is a carefully planned process, and its task is organically woven into the general problem of increasing business efficiency. A passive attitude to risk and mere awareness of its existence are being replaced by active management techniques. Risk assessment is one of the most important stages of risk management, since risk must first be analyzed and evaluated before it can be managed. There are many definitions of this notion, but in general risk assessment refers to the systematic process of identifying the factors and types of risk and assessing them quantitatively; risk analysis methodology thus combines mutually complementary quantitative and qualitative approaches. Purpose of the work: In this paper we consider Fault Tree Analysis (FTA) as a risk assessment technique. The objectives are to understand the purpose of FTA, understand and apply the rules of Boolean algebra, analyse a simple system using FTA, and weigh the advantages and disadvantages of FTA. Research and methodology: The main purpose is to help identify potential causes of system failures before the failures actually occur, and to evaluate the probability of the top event. The steps of the analysis are: examination of the system from the top down, the use of symbols to represent events, the use of mathematical tools for critical areas, and the use of fault tree logic diagrams to identify the cause of the top event. Results: The study yields the critical areas, the fault tree logic diagrams, and the probability of the top event. These results can be used for risk assessment analyses.
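The Boolean-algebra step — combining basic-event probabilities through AND/OR gates to obtain the probability of the top event — can be sketched as follows, assuming independent basic events. The gate structure and the failure probabilities are invented for illustration:

```python
from math import prod

# Minimal fault-tree evaluation assuming independent basic events.
def p_and(probs):
    """AND gate: all inputs must occur."""
    return prod(probs)

def p_or(probs):
    """OR gate: at least one input occurs."""
    return 1.0 - prod(1.0 - p for p in probs)

# Hypothetical top event: pump failure OR (sensor failure AND
# backup failure)
p_pump, p_sensor, p_backup = 0.01, 0.05, 0.02
p_top = p_or([p_pump, p_and([p_sensor, p_backup])])
print(round(p_top, 6))
```

Walking the tree bottom-up like this mirrors the top-down diagram: the AND branch contributes 0.05 × 0.02 = 0.001, and the OR gate combines it with the pump branch as 1 − (1 − 0.01)(1 − 0.001), so the dominant single-point failure (the pump) is immediately visible as the critical area.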
Purdue Ionomics Information Management System: An Integrated Functional Genomics Platform
Baxter, Ivan; Ouzzani, Mourad; Orcun, Seza; Kennedy, Brad; Jandhyala, Shrinivas S.; Salt, David E.
2007-01-01
The advent of high-throughput phenotyping technologies has created a deluge of information that is difficult to deal with without the appropriate data management tools. These data management tools should integrate defined workflow controls for genomic-scale data acquisition and validation, data storage and retrieval, and data analysis, indexed around the genomic information of the organism of interest. To maximize the impact of these large datasets, it is critical that they are rapidly disseminated to the broader research community, allowing open access for data mining and discovery. We describe here a system that incorporates such functionalities developed around the Purdue University high-throughput ionomics phenotyping platform. The Purdue Ionomics Information Management System (PiiMS) provides integrated workflow control, data storage, and analysis to facilitate high-throughput data acquisition, along with integrated tools for data search, retrieval, and visualization for hypothesis development. PiiMS is deployed as a World Wide Web-enabled system, allowing for integration of distributed workflow processes and open access to raw data for analysis by numerous laboratories. PiiMS currently contains data on shoot concentrations of P, Ca, K, Mg, Cu, Fe, Zn, Mn, Co, Ni, B, Se, Mo, Na, As, and Cd in over 60,000 shoot tissue samples of Arabidopsis (Arabidopsis thaliana), including ethyl methanesulfonate, fast-neutron, and defined T-DNA mutants, and natural accessions and populations of recombinant inbred lines from over 800 separate experiments, representing over 1,000,000 fully quantitative elemental concentrations. PiiMS is accessible at www.purdue.edu/dp/ionomics. PMID:17189337
2016-01-01
Political risk is identified as a dominant risk category of disaster risk management (DRM) which could negatively affect the success of measures implemented to reduce disaster risk. Key to political risk is the construct of national identity which, if poorly constructed, could greatly contribute to political risk. This article proposes a tool to measure the construct of national identity and provides recommendations to strengthen the construct in order to mitigate the exacerbating influence it may have on political risk and, ultimately, on DRM. The design of the measurement tool followed a mixed-methods approach employing both quantitative and qualitative data. The data collection instruments included a literature review (briefly summarized in the previous sections) and an empirical study that utilised data obtained through structured questionnaires. Although the results of the proposed measuring instrument did not include a representative sample of all the cultures in South Africa, they alluded to different levels of construction of national identity among black and white respondents, possibly because of different ideological expectations among these groups. The results of the study should be considered a validation of the measuring tool and not necessarily of the construct of national identity in South Africa. The measuring tool is thus promising for future studies to reduce political risk and ultimately disaster risk.
Suresh, Niraj; Stephens, Sean A; Adams, Lexor; Beck, Anthon N; McKinney, Adriana L; Varga, Tamas
2016-04-26
Plant roots play a critical role in plant-soil-microbe interactions that occur in the rhizosphere, as well as in processes with important implications for climate change and crop management. Quantitative size information on roots in their native environment is invaluable for studying root growth and environmental processes involving plants. X-ray computed tomography (XCT) has been demonstrated to be an effective tool for in situ root scanning and analysis. We aimed to develop a cost-free and efficient tool that approximates the surface area and volume of the root, regardless of its shape, from three-dimensional (3D) tomography data. The root structure of a Prairie dropseed (Sporobolus heterolepis) specimen was imaged using XCT. The root was reconstructed, and the primary root structure was extracted from the data using a combination of licensed and open-source software. An isosurface polygonal mesh was then created for ease of analysis. We have developed the standalone application imeshJ, written in MATLAB, to calculate root volume and surface area from the mesh. The outputs of imeshJ are surface area (in mm²) and volume (in mm³). The process, utilizing a unique combination of tools from imaging to quantitative root analysis, is described. A combination of XCT and open-source software proved to be a powerful combination to noninvasively image plant root samples, segment root data, and extract quantitative information from the 3D data. This methodology of processing 3D data should be applicable to other material/sample systems where there is connectivity between components of similar X-ray attenuation and difficulties arise with segmentation.
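The surface-area and volume computation on an isosurface polygonal mesh — the two quantities imeshJ reports — can be sketched directly: triangle areas come from cross products, and the enclosed volume from summing signed tetrahedron volumes (the divergence theorem). The unit-tetrahedron mesh below is a toy stand-in for real root data, and the code is an independent sketch of the standard method, not imeshJ itself:

```python
# Surface area and volume of a closed, consistently oriented
# triangular mesh (outward-facing normals assumed).
def sub(a, b):
    return (a[0] - b[0], a[1] - b[1], a[2] - b[2])

def cross(u, v):
    return (u[1]*v[2] - u[2]*v[1],
            u[2]*v[0] - u[0]*v[2],
            u[0]*v[1] - u[1]*v[0])

def dot(u, v):
    return u[0]*v[0] + u[1]*v[1] + u[2]*v[2]

def mesh_area_volume(vertices, faces):
    area = volume = 0.0
    for i, j, k in faces:
        a, b, c = vertices[i], vertices[j], vertices[k]
        n = cross(sub(b, a), sub(c, a))      # |n| = 2 * triangle area
        area += 0.5 * dot(n, n) ** 0.5
        volume += dot(a, cross(b, c)) / 6.0  # signed tetrahedron volume
    return area, volume

# toy mesh: unit tetrahedron with outward-oriented faces
verts = [(0, 0, 0), (1, 0, 0), (0, 1, 0), (0, 0, 1)]
faces = [(0, 2, 1), (0, 1, 3), (0, 3, 2), (1, 2, 3)]
area, volume = mesh_area_volume(verts, faces)
print(round(area, 4), round(volume, 4))
```

Because the signed-tetrahedron terms cancel outside the surface, the volume estimate works for any closed mesh with consistent winding, which is why a watertight isosurface is a prerequisite for this kind of root analysis.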
Possible ways for Public Health Surveillance practices evaluation.
Vilela, Maria Filomena de Gouveia; Santos, Dario Nunes Dos; Kemp, Brigina
2017-10-01
This is an evaluative and qualitative study that investigates self-assessment as a device to analyze Health Surveillance practices, using a questionnaire built by the researchers, adapted from the Self-Assessment of Improved Access and Primary Care Quality (AMAQ) and made available on the FORMSUS platform. Forty-one Health Surveillance workers and managers of a large municipality in São Paulo State evaluated the realms of "management" and "teamwork" and their respective sub-realms. Two categories were created to analyze the results, "Management" and "Team", in dialogue with references from Management, Evaluation and Health Surveillance. Most "management" and "teamwork" sub-realms were deemed satisfactory. Self-assessment through the applied evaluation tool proved to be a powerful resource for the analysis of Health Surveillance practices in combination with other devices adopted by the Unified Health System (SUS). Unlike usual evaluation processes guided by quantitative markers, this self-assessed evaluative process included the subjects themselves and opened the possibility of taking a new look at how Health Surveillance is carried out, supporting future management contracts between workers and managers.
A strategic management model for evaluation of health, safety and environmental performance.
Abbaspour, Majid; Toutounchian, Solmaz; Roayaei, Emad; Nassiri, Parvin
2012-05-01
Strategic health, safety, and environmental management system (HSE-MS) involves systematic and cooperative planning in each phase of the lifecycle of a project to ensure that interaction among the industry group, client, contractor, stakeholder, and host community exists with the highest level of health, safety, and environmental standard performances. Therefore, it seems necessary to assess the HSE-MS performance of contractor(s) by a comparative strategic management model with the aim of continuous improvement. The present Strategic Management Model (SMM) is illustrated by a case study, and the results show that the model is a suitable management tool for decision making in a contract environment, especially in oil and gas fields, based on accepted international standards and within the framework of the Deming management cycle. To develop this model, a data bank was created containing statistical data calculated by converting qualitative HSE performance data into quantitative values. On this basis, the structure of the model was formed by defining HSE performance indicators according to the HSE-MS model. Accordingly, 178 indicators were selected and grouped into four attributes. The model output provides quantitative measures of HSE-MS performance as a percentage of an ideal level, with a maximum possible score for each attribute. Identifying the strengths and weaknesses of the contractor(s) is another capability of this model. In addition, this model provides a ranking that could be used as the basis for decision making at the contractors' pre-qualification phase or during the execution of the project.
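The model's output format — each attribute's performance expressed as a percentage of an ideal maximum score — can be sketched as below. The attribute names, indicator scores, and 5-point ideal scale are assumptions made for illustration, not the 178 actual SMM indicators:

```python
# Hypothetical HSE-MS indicator scores grouped into four attributes;
# each attribute is reported as a percentage of its ideal score.
MAX_SCORE = 5  # assumed ideal score per indicator

attributes = {
    "leadership_and_commitment": [4, 3, 5, 4],
    "planning": [3, 3, 4],
    "implementation_and_monitoring": [5, 4, 4, 3, 5],
    "audit_and_review": [2, 4, 3],
}

performance = {
    name: 100.0 * sum(scores) / (MAX_SCORE * len(scores))
    for name, scores in attributes.items()
}

# lowest-scoring attribute flags a contractor weakness
weakest = min(performance, key=performance.get)
print(performance, weakest)
```

Normalizing each attribute against its own maximum is what makes contractors with different numbers of applicable indicators comparable, and sorting contractors by these percentages yields the pre-qualification ranking the abstract describes.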
Data Independent Acquisition analysis in ProHits 4.0.
Liu, Guomin; Knight, James D R; Zhang, Jian Ping; Tsou, Chih-Chiang; Wang, Jian; Lambert, Jean-Philippe; Larsen, Brett; Tyers, Mike; Raught, Brian; Bandeira, Nuno; Nesvizhskii, Alexey I; Choi, Hyungwon; Gingras, Anne-Claude
2016-10-21
Affinity purification coupled with mass spectrometry (AP-MS) is a powerful technique for the identification and quantification of physical interactions. AP-MS requires careful experimental design, appropriate control selection and quantitative workflows to successfully identify bona fide interactors amongst a large background of contaminants. We previously introduced ProHits, a Laboratory Information Management System for interaction proteomics, which tracks all samples in a mass spectrometry facility, initiates database searches and provides visualization tools for spectral counting-based AP-MS approaches. More recently, we implemented Significance Analysis of INTeractome (SAINT) within ProHits to provide scoring of interactions based on spectral counts. Here, we provide an update to ProHits to support Data Independent Acquisition (DIA) with identification software (DIA-Umpire and MSPLIT-DIA), quantification tools (through DIA-Umpire, or externally via targeted extraction), and assessment of quantitative enrichment (through mapDIA) and scoring of interactions (through SAINT-intensity). With additional improvements, notably support of the iProphet pipeline, facilitated deposition into ProteomeXchange repositories and enhanced export and viewing functions, ProHits 4.0 offers a comprehensive suite of tools to facilitate affinity proteomics studies. It remains challenging to score, annotate and analyze proteomics data in a transparent manner. ProHits was previously introduced as a LIMS to enable storing, tracking and analysis of standard AP-MS data. In this revised version, we expand ProHits to include integration with a number of identification and quantification tools based on Data-Independent Acquisition (DIA). ProHits 4.0 also facilitates data deposition into public repositories, and the transfer of data to new visualization tools.
Stevens, Patricia; Walters, Katie D.
2015-01-01
The Trust Species and Habitats Branch of the Fort Collins Science Center includes a diverse group of scientists encompassing both traditional and specialized expertise in wildlife biology, ecosystem ecology, quantitative ecology, disease ecology, molecular genetics, and stable isotope geochemistry. Using our expertise and collaborating with others around the world, our goal is to provide the information, tools, and technologies that our partners need to support conservation, management, and restoration of terrestrial vertebrate populations, habitats, and ecosystem function in a changing world.
Calibration of a COTS Integration Cost Model Using Local Project Data
NASA Technical Reports Server (NTRS)
Boland, Dillard; Coon, Richard; Byers, Kathryn; Levitt, David
1997-01-01
The software measures and estimation techniques appropriate to a Commercial Off-the-Shelf (COTS) integration project differ from those commonly used for custom software development. Labor and schedule estimation tools that model COTS integration are available. Like all estimation tools, they must be calibrated with the organization's local project data. This paper describes the calibration of a commercial model using data collected by the Flight Dynamics Division (FDD) of the NASA Goddard Space Flight Center (GSFC). The model calibrated is SLIM Release 4.0 from Quantitative Software Management (QSM). By adopting the SLIM reuse model and by treating configuration parameters as lines of code, we were able to establish a consistent calibration for COTS integration projects. The paper summarizes the metrics, the calibration process and results, and the validation of the calibration.
Thermal Characterization of Carbon Nanotubes by Photothermal Techniques
NASA Astrophysics Data System (ADS)
Leahu, G.; Li Voti, R.; Larciprete, M. C.; Sibilia, C.; Bertolotti, M.; Nefedov, I.; Anoshkin, I. V.
2015-06-01
Carbon nanotubes (CNTs) are multifunctional materials commonly used in a large number of applications in electronics, sensors, nanocomposites, thermal management, actuators, energy storage and conversion, and drug delivery. Despite recent important advances in the development of CNT purity assessment tools and atomic resolution imaging of individual nanotubes by scanning tunnelling microscopy and high-resolution transmission electron microscopy, the macroscale assessment of the overall surface qualities of commercial CNT materials remains a great challenge. The lack of quantitative measurement technology to characterize and compare the surface qualities of bulk manufactured and engineered CNT materials has negative impacts on the reliable and consistent nanomanufacturing of CNT products. In this paper it is shown how photoacoustic spectroscopy and photothermal radiometry represent useful non-destructive tools to study the optothermal properties of carbon nanotube thin films.
A Quantitative ADME-based Tool for Exploring Human ...
Exposure to a wide range of chemicals through our daily habits and routines is ubiquitous and largely unavoidable within modern society. The potential for human exposure, however, has not been quantified for the vast majority of chemicals with wide commercial use. Creative advances in exposure science are needed to support efficient and effective evaluation and management of chemical risks, particularly for chemicals in consumer products. The U.S. Environmental Protection Agency Office of Research and Development is developing, or collaborating in the development of, scientifically defensible methods for making quantitative or semi-quantitative exposure predictions. The Exposure Prioritization (Ex Priori) model is a simplified, quantitative visual dashboard that provides a rank-ordered internalized dose metric to simultaneously explore exposures across chemical space (not chemical by chemical). Diverse data streams are integrated within the interface such that different exposure scenarios for "individual," "population," or "professional" time-use profiles can be interchanged to tailor exposure and quantitatively explore multi-chemical signatures of exposure, internalized dose (uptake), body burden, and elimination. Ex Priori has been designed as an adaptable systems framework that synthesizes knowledge from various domains and is amenable to new knowledge/information. As such, it algorithmically captures the totality of exposure across pathways.
NASA Astrophysics Data System (ADS)
Rosli, A. Z.; Reba, M. N. M.; Roslan, N.; Room, M. H. M.
2014-02-01
In order to maintain the stability of natural ecosystems around urban areas, urban forestry is a key initiative to maintain and control green space in our country. Integration of remote sensing (RS) and geographic information systems (GIS) serves as an effective tool for monitoring environmental changes and for planning, managing and developing sustainable urbanization. This paper aims to assess the capability of integrated RS and GIS to identify potential urban forest sites based on qualitative and quantitative criteria, using priority parameter ranking in the new township of Nusajaya. A SPOT image was used to provide high spatial accuracy, while maps of topography, landuse, soil groups and hydrology, a Digital Elevation Model (DEM), and soil series data were applied to enhance the satellite image in detecting and locating present attributes and features on the ground. The Multi-Criteria Decision Making (MCDM) technique provides structured, pairwise quantification and comparison of elements and criteria for priority ranking for urban forestry purposes. Slope, soil texture, drainage, spatial area, availability of natural resources, and vicinity to urban areas are the criteria considered in this study. This study highlights that MCDM-based priority ranking is a cost-effective tool for decision-making in urban forestry planning and landscaping.
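A weighted-sum sketch of the MCDM priority ranking over the six criteria listed above follows. The criterion weights (which in practice would come from the pairwise comparisons the abstract mentions) and the normalized site scores are invented for illustration:

```python
# Weighted-sum MCDM ranking of candidate urban-forest sites.
# Weights and 0-1 site scores are hypothetical; weights sum to 1.
weights = {"slope": 0.25, "soil_texture": 0.15, "drainage": 0.20,
           "area": 0.10, "natural_resources": 0.20,
           "urban_vicinity": 0.10}

sites = {
    "Site 1": {"slope": 0.8, "soil_texture": 0.6, "drainage": 0.7,
               "area": 0.9, "natural_resources": 0.5,
               "urban_vicinity": 0.8},
    "Site 2": {"slope": 0.6, "soil_texture": 0.8, "drainage": 0.5,
               "area": 0.7, "natural_resources": 0.9,
               "urban_vicinity": 0.6},
}

# aggregate each site's criteria into one priority score
score = {name: sum(weights[c] * vals[c] for c in weights)
         for name, vals in sites.items()}
ranked = sorted(score, key=score.get, reverse=True)
print(ranked, score)
```

The weighted sum is the simplest MCDM aggregation; pairwise-comparison methods such as AHP mainly differ in how the weight vector is derived, with the final ranking step looking much like the sort above.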
chipPCR: an R package to pre-process raw data of amplification curves.
Rödiger, Stefan; Burdukiewicz, Michał; Schierack, Peter
2015-09-01
Both the quantitative real-time polymerase chain reaction (qPCR) and quantitative isothermal amplification (qIA) are standard methods for nucleic acid quantification. Numerous real-time read-out technologies have been developed. Despite the continuous interest in amplification-based techniques, there are only a few tools for pre-processing of amplification data. However, a transparent tool for precise control of raw data is indispensable in several scenarios, for example, during the development of new instruments. chipPCR is an R package for the pre-processing and quality analysis of raw data of amplification curves. The package takes advantage of R's S4 object model and offers an extensible environment. chipPCR contains tools for raw data exploration: normalization, baselining, imputation of missing values, a powerful wrapper for amplification curve smoothing and a function to detect the start and end of an amplification curve. The capabilities of the software are enhanced by the implementation of algorithms unavailable in R, such as a 5-point stencil for derivative interpolation. Simulation tools, statistical tests, plots for data quality management, amplification efficiency/quantification cycle calculation, and datasets from qPCR and qIA experiments are part of the package. Core functionalities are integrated in GUIs (web-based and standalone shiny applications), thus streamlining analysis and report generation. http://cran.r-project.org/web/packages/chipPCR. Source code: https://github.com/michbur/chipPCR. stefan.roediger@b-tu.de Supplementary data are available at Bioinformatics online.
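The 5-point stencil for derivative interpolation mentioned above has a standard central-difference form. This minimal sketch (in Python rather than R, and applied to an analytic function rather than real amplification-curve data) shows the formula:

```python
# 5-point central-difference stencil for the first derivative:
#   f'(x) ≈ (-f(x+2h) + 8 f(x+h) - 8 f(x-h) + f(x-2h)) / (12 h)
# Truncation error is O(h^4); the formula is exact for polynomials
# of degree <= 4.
def stencil5(f, x, h):
    return (-f(x + 2*h) + 8*f(x + h) - 8*f(x - h) + f(x - 2*h)) / (12*h)

# d/dx x^3 at x = 2 is 12; the stencil recovers it exactly
print(stencil5(lambda x: x**3, 2.0, 0.5))
```

On an amplification curve, this derivative locates the steepest point of the exponential phase, which is one common way to place a quantification-cycle marker.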
Vision Training Methods for Sports Concussion Mitigation and Management
Clark, Joseph F.; Colosimo, Angelo; Ellis, James K.; Mangine, Robert; Bixenmann, Benjamin; Hasselfeld, Kimberly; Graman, Patricia; Elgendy, Hagar; Myer, Gregory; Divine, Jon
2015-01-01
There is emerging evidence supporting the use of vision training, including light board training tools, as a concussion baseline and neuro-diagnostic tool and potentially as a supportive component of concussion prevention strategies. This paper focuses on providing detailed methods for select vision training tools and reporting normative data for comparison when vision training is part of a sports management program. The overall program includes standard vision training methods including tachistoscope, Brock's string, and strobe glasses, as well as specialized light board training algorithms. Stereopsis is measured as a means to monitor vision training effects. In addition, quantitative results for vision training methods as well as baseline and post-testing *A and Reaction Test measures with progressive scores are reported. Collegiate athletes consistently improve after six weeks of training in their stereopsis, *A, and Reaction Test scores. When vision training is initiated as a team-wide exercise, the incidence of concussion decreases in players who participate in training compared to players who do not. Vision training produces functional and performance changes that, when monitored, can be used to assess the success of the vision training and can be initiated as part of a sports medical intervention for concussion prevention. PMID:25992878
Strengthening immunization in a West African country: Mali.
Milstien, J B; Tapia, M; Sow, S O; Keita, L; Kotloff, K
2007-11-01
OBJECTIVES AND CONTEXT: This paper describes the preliminary outcomes of a collaborative capacity-building initiative performed in Mali to strengthen the immunization program. We conducted baseline assessments, training and post-training assessments in four programmatic areas: vaccine management, immunization safety, surveillance, and vaccine coverage, using adapted World Health Organization (WHO) tools. Impact assessment was done by evaluation of trainee performance, programmatic impact and sustainability. Qualitative and quantitative improvement of trainee performance was seen after the training interventions: some knowledge improvement, greater compliance with vaccine management practices and improved vaccine coverage. Deficiencies in information transfer to the periphery were identified. The program involves shared responsibility for planning, implementation and financing with national stakeholders while emphasizing the training of leaders and managers to ensure sustainability. Although short-term gains were measured, our initial assessments indicate that sustained impact will require improvements in staffing, financing and guidelines to ensure delivery of information and skills to the periphery.
Stenberg, Nicola; Furness, Penny J
2017-03-01
The outcomes of self-management interventions are commonly assessed using quantitative measurement tools, and few studies ask people with long-term conditions to explain, in their own words, what aspects of the intervention they valued. In this Grounded Theory study, a Health Trainers service in the north of England was evaluated based on interviews with eight service-users. Open, focused, and theoretical coding led to the development of a preliminary model explaining participants' experiences and perceived impact of the service. The model reflects the findings that living well with a long-term condition encompassed social connectedness, changed identities, acceptance, and self-care. Health trainers performed four related roles that were perceived to contribute to these outcomes: conceptualizer, connector, coach, and champion. The evaluation contributes a grounded theoretical understanding of a personalized self-management intervention that emphasizes the benefits of a holistic approach to enable cognitive, behavioral, emotional, and social adjustments.
Aquatic Systems Branch: transdisciplinary research to address water-related environmental problems
Dong, Quan; Walters, Katie D.
2015-01-01
The Aquatic Systems Branch at the Fort Collins Science Center is a group of scientists dedicated to advancing interdisciplinary science and providing science support to solve water-related environmental issues. Natural resource managers have an increasing need for scientific information and stakeholders face enormous challenges of increasing and competing demands for water. Our scientists are leaders in ecological flows, riparian ecology, hydroscape ecology, ecosystem management, and contaminant biology. The Aquatic Systems Branch employs and develops state-of-the-science approaches in field investigations, laboratory experiments, remote sensing, simulation and predictive modeling, and decision support tools. We use the aquatic experimental laboratory, the greenhouse, the botanical garden and other advanced facilities to conduct unique research. Our scientists pursue research on the ground, in the rivers, and in the skies, generating and testing hypotheses and collecting quantitative information to support planning and design in natural resource management and aquatic restoration.
Jadhav, Pravin R; Neal, Lauren; Florian, Jeff; Chen, Ying; Naeger, Lisa; Robertson, Sarah; Soon, Guoxing; Birnkrant, Debra
2010-09-01
This article presents a prototype for an operational innovation in knowledge management (KM). These operational innovations are geared toward managing knowledge efficiently and accessing all available information by embracing advances in bioinformatics and allied fields. The specific components of the proposed KM system are (1) a database to archive hepatitis C virus (HCV) treatment data in a structured format and retrieve information in a query-capable manner and (2) an automated analysis tool to inform trial design elements for HCV drug development. The proposed framework is intended to benefit drug development by increasing efficiency of dose selection and improving the consistency of advice from US Food and Drug Administration (FDA). It is also hoped that the framework will encourage collaboration among FDA, industry, and academic scientists to guide the HCV drug development process using model-based quantitative analysis techniques.
Haze Gray Paint and the U.S. Navy: A Procurement Process Review
2017-12-01
The research encompasses both qualitative and quantitative analytical tools, utilizing historical demand data for Silicone Alkyd and the inventory level of 1K Polysiloxane in support of the fleet. As discussed in the Summary section, this research used a qualitative and a quantitative approach to analyze the Polysiloxane...
How models can support ecosystem-based management of coral reefs
NASA Astrophysics Data System (ADS)
Weijerman, Mariska; Fulton, Elizabeth A.; Janssen, Annette B. G.; Kuiper, Jan J.; Leemans, Rik; Robson, Barbara J.; van de Leemput, Ingrid A.; Mooij, Wolf M.
2015-11-01
Despite the importance of coral reef ecosystems to the social and economic welfare of coastal communities, the condition of these marine ecosystems has generally degraded over the past decades. With increased knowledge of coral reef ecosystem processes and a rise in computer power, dynamic models have become useful tools for assessing the synergistic effects of local and global stressors on ecosystem functions. We review representative approaches for dynamically modeling coral reef ecosystems and categorize them as minimal, intermediate and complex models. The categorization was based on the leading principle for model development and the models' level of realism and process detail. This review aims to improve knowledge of concurrent approaches in coral reef ecosystem modeling and highlights the importance of choosing an appropriate approach based on the type of question(s) to be answered. We contend that minimal and intermediate models are generally valuable tools to assess the response of key states to main stressors and, hence, contribute to understanding ecological surprises. As has been shown in freshwater resources management, insight into these conceptual relations profoundly influences how natural resource managers perceive their systems and how they manage ecosystem recovery. We argue that adaptive resource management requires integrated thinking and decision support, which demands a diversity of modeling approaches. Integration can be achieved through complementary use of models or through integrated models that systemically combine all relevant aspects in one model. Such whole-of-system models can be useful tools for quantitatively evaluating scenarios, allowing assessment of the interactive effects of multiple stressors on various, potentially conflicting, management objectives. All models simplify reality and, as such, have their weaknesses. While minimal models lack multidimensionality, whole-of-system models can be difficult to interpret because deciphering their numerous interactions and feedback loops requires considerable effort. Given the breadth of questions to be tackled when dealing with coral reefs, the best-practice approach uses multiple model types, benefiting from the strengths of each.
Optimization of Statistical Methods Impact on Quantitative Proteomics Data.
Pursiheimo, Anna; Vehmas, Anni P; Afzal, Saira; Suomi, Tomi; Chand, Thaman; Strauss, Leena; Poutanen, Matti; Rokka, Anne; Corthals, Garry L; Elo, Laura L
2015-10-02
As tools for quantitative label-free mass spectrometry (MS) rapidly develop, a consensus about best practices is not yet apparent. In the work described here, we compared popular statistical methods for detecting differential protein expression from quantitative MS data, using both controlled experiments with known quantitative differences for specific proteins used as standards and "real" experiments where differences in protein abundance are not known a priori. Our results suggest that data-driven reproducibility-optimization can consistently produce reliable differential expression rankings for label-free proteomics data and is straightforward to apply.
TQM and lean strategy deployment in Italian hospitals.
Chiarini, Andrea; Baccarani, Claudio
2016-10-03
Purpose This paper aims to contribute to the debate concerning total quality management (TQM)-Lean strategy in public healthcare by analyzing the deployment path for implementation, the possible benefits that can be achieved and the encountered pitfalls. Design/methodology/approach Three case studies are drawn from three large Italian hospitals with more than 500 beds each and structured with many departments. The hospitals are located in Tuscany, Italy. These three hospitals have embraced TQM and Lean, starting from strategic objectives and their deployment. At the same time, they have also implemented many TQM-Lean tools. The case studies are based on interviews held with four managers in each of these three public hospitals. Findings Results from the interviews show that there is a specific deployment path for TQM-Lean implementation. The hospitals have also achieved benefits linked to patient satisfaction and improved organizational performances. Problems related to organizational and cultural issues, such as senior managers' commitment, staff management, manufacturing culture and tools adaptation, could affect the benefits. Research limitations/implications The research has been carried out in just three Italian public hospitals. Hence, similar investigations could be conducted in other countries. Researchers could also use a larger sample and investigate these issues by means of quantitative inquiry. Practical implications Practitioners could try to apply the deployment path revealed by these case studies in other public and private hospitals. Originality/value The results of this research show that there is a specific, new deployment path for implementing TQM-Lean strategy in some public hospitals.
Structured decision making for managing pneumonia epizootics in bighorn sheep
Sells, Sarah N.; Mitchell, Michael S.; Edwards, Victoria L.; Gude, Justin A.; Anderson, Neil J.
2016-01-01
Good decision-making is essential to conserving wildlife populations. Although there may be multiple ways to address a problem, perfect solutions rarely exist. Managers are therefore tasked with identifying decisions that will best achieve desired outcomes. Structured decision making (SDM) is a method of decision analysis used to identify the most effective, efficient, and realistic decisions while accounting for values and priorities of the decision maker. The stepwise process includes identifying the management problem, defining objectives for solving the problem, developing alternative approaches to achieve the objectives, and formally evaluating which alternative is most likely to accomplish the objectives. The SDM process can be more effective than informal decision-making because it provides a transparent way to quantitatively evaluate decisions for addressing multiple management objectives while incorporating science, uncertainty, and risk tolerance. To illustrate the application of this process to a management need, we present an SDM-based decision tool developed to identify optimal decisions for proactively managing risk of pneumonia epizootics in bighorn sheep (Ovis canadensis) in Montana. Pneumonia epizootics are a major challenge for managers due to long-term impacts to herds, epistemic uncertainty in timing and location of future epizootics, and consequent difficulty knowing how or when to manage risk. The decision tool facilitates analysis of alternative decisions for how to manage herds based on predictions from a risk model, herd-specific objectives, and predicted costs and benefits of each alternative. Decision analyses for 2 example herds revealed that meeting management objectives necessitates specific approaches unique to each herd. The analyses showed how and under what circumstances the alternatives are optimal compared to other approaches and current management. 
Managers can be confident that these decisions are effective, efficient, and realistic because they explicitly account for important considerations managers implicitly weigh when making decisions, including competing management objectives, uncertainty in potential outcomes, and risk tolerance.
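The formal evaluation step of SDM is often implemented as a weighted consequence table: each alternative is scored against each objective, and scores are combined with the decision maker's weights. A toy Python sketch under invented objectives, weights, and scores (not the Montana bighorn analysis):

```python
# Hypothetical consequence table: objective weights reflect the decision
# maker's priorities; scores are on a common 0-1 scale (1 = best).
weights = {"herd_health": 0.5, "cost": 0.2, "public_acceptance": 0.3}

alternatives = {
    "status_quo":     {"herd_health": 0.3, "cost": 1.0, "public_acceptance": 0.8},
    "reduce_contact": {"herd_health": 0.8, "cost": 0.5, "public_acceptance": 0.6},
    "augment_herd":   {"herd_health": 0.6, "cost": 0.3, "public_acceptance": 0.7},
}

def utility(scores, weights):
    """Weighted additive utility across objectives."""
    return sum(weights[k] * scores[k] for k in weights)

ranked = sorted(alternatives, key=lambda a: utility(alternatives[a], weights),
                reverse=True)
print(ranked[0])  # alternative with the highest weighted utility
```

Re-running the ranking under different weights is a cheap way to expose how sensitive the "optimal" decision is to the herd-specific objectives the abstract emphasizes.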
Hoon, Lim Siew; Hong-Gu, He; Mackey, Sandra
Paediatric pain management remains a challenge in clinical settings. Parents can contribute to effective and accurate pain assessment and management of their child. No systematic reviews regarding parental involvement in their child's post-operative pain management have been published. To determine the best available evidence regarding parental involvement in managing their children's post-operative pain in the hospital setting. The review considered studies that included parents of all ethnic groups with children aged between 6 and 12 years who were hospitalised and had undergone surgery of any kind with post-operative surgical or incision-site pain, where care was provided in acute hospital settings. The phenomena of interest were the experiences of parents in managing their children's post-operative pain. A three-step search strategy was utilised in each component of this review. Major databases searched included: MEDLINE, CINAHL, Scopus, ScienceDirect, the Cochrane Library, PubMed as well as Google Scholar. The search included published studies and papers in English from 1990 to 2009. Each included study was assessed by two independent reviewers using the appropriate appraisal checklists developed by the Joanna Briggs Institute (JBI). Quantitative and qualitative data were extracted from the included papers using standardised data extraction tools from the JBI: the JBI Meta-Analysis of Statistics Assessment and Review Instrument data extraction tool for descriptive/case series studies and the JBI Qualitative Assessment and Review Instrument data extraction tool for interpretive and critical research. The five quantitative studies included in this review were not suitable for meta-analysis due to clinical and methodological heterogeneity, and therefore the findings are presented in a narrative form. The two qualitative papers were from the same study, so meta-synthesis was not possible; the results of these studies were likewise presented in a narrative format.
Seven papers were included in this review. The evidence identified topics including: pharmacological and non-pharmacological interventions carried out by parents; the experience of concern, fear, helplessness, anxiety, depression, frustration and lack of support felt by parents during their child's hospitalisation; communication issues and knowledge deficits; need for information by parents to promote effective participation in managing their child's post-operative pain. This review revealed pharmacological and non-pharmacological interventions carried out by parents to alleviate their children's post-operative pain. Obstacles and promoting factors influencing parents' experiences as well as their needs in the process of caring were identified. Parents' roles in their child's surgical pain management should be clarified and their efforts acknowledged, which will encourage parents' active participation in their child's caring process. Nurses should provide guidance, education and support to parents. More studies are needed to examine parents' experiences in caring for their child, investigate the effectiveness of education and guidance provided to parents by the nurses and explore the influence of parents' cultural values and nurses' perceptions of parental participation in their child's care.
Frameworks and tools for risk assessment of manufactured nanomaterials.
Hristozov, Danail; Gottardo, Stefania; Semenzin, Elena; Oomen, Agnes; Bos, Peter; Peijnenburg, Willie; van Tongeren, Martie; Nowack, Bernd; Hunt, Neil; Brunelli, Andrea; Scott-Fordsmand, Janeck J; Tran, Lang; Marcomini, Antonio
2016-10-01
Commercialization of nanotechnologies entails a regulatory requirement for understanding their environmental, health and safety (EHS) risks. Today we face challenges to assess these risks, which emerge from uncertainties around the interactions of manufactured nanomaterials (MNs) with humans and the environment. In order to reduce these uncertainties, it is necessary to generate sound scientific data on hazard and exposure by means of relevant frameworks and tools. The development of such approaches to facilitate the risk assessment (RA) of MNs has become a dynamic area of research. The aim of this paper was to review and critically analyse these approaches against a set of relevant criteria. The analysis concluded that none of the reviewed frameworks were able to fulfill all evaluation criteria. Many of the existing modelling tools are designed to provide screening-level assessments rather than to support regulatory RA and risk management. Nevertheless, there is a tendency towards developing more quantitative, higher-tier models, capable of incorporating uncertainty into their analyses. There is also a trend towards developing validated experimental protocols for material identification and hazard testing, reproducible across laboratories. These tools could enable a shift from a costly case-by-case RA of MNs towards a targeted, flexible and efficient process, based on grouping and read-across strategies and compliant with the 3R (Replacement, Reduction, Refinement) principles. In order to facilitate this process, it is important to transform the current efforts on developing databases and computational models into creating an integrated data and tools infrastructure to support the risk assessment and management of MNs. Copyright © 2016 Elsevier Ltd. All rights reserved.
Sperotto, Anna; Molina, José-Luis; Torresan, Silvia; Critto, Andrea; Marcomini, Antonio
2017-11-01
The evaluation and management of climate change impacts on natural and human systems require the adoption of a multi-risk perspective in which the effects of multiple stressors, processes and interconnections are simultaneously modelled. Although Bayesian Networks (BNs) are popular integrated modelling tools for dealing with uncertain and complex domains, their application in the context of climate change remains a largely unexplored field. Drawing on a review of existing applications in the field of environmental management, the paper discusses the potential and limitations of applying BNs to improve current climate change risk assessment procedures. Main potentials include the ability to consider multiple stressors and endpoints in the same framework, flexibility in handling and communicating the uncertainty of climate projections, and the opportunity to perform scenario analysis. Some limitations (i.e. representation of temporal and spatial dynamics, quantitative validation), however, should be overcome to boost the use of BNs in climate change impact assessment and management. Copyright © 2017 Elsevier Ltd. All rights reserved.
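To make the BN idea concrete, here is a toy three-node chain (emissions scenario → drought → damage) evaluated by enumeration in plain Python. All structure and probabilities are invented for illustration and do not come from the reviewed studies:

```python
# Toy discrete Bayesian network for a climate multi-risk chain.
P_scenario = {"high": 0.5, "low": 0.5}   # P(scenario)
P_drought = {                            # P(drought | scenario)
    "high": {True: 0.6, False: 0.4},
    "low":  {True: 0.2, False: 0.8},
}
P_damage = {                             # P(damage | drought)
    True:  {True: 0.7, False: 0.3},
    False: {True: 0.1, False: 0.9},
}

def p_damage(scenario=None):
    """P(damage) by enumeration, optionally conditioned on the scenario."""
    scenarios = [scenario] if scenario else P_scenario
    total = prob = 0.0
    for s in scenarios:
        w = 1.0 if scenario else P_scenario[s]
        for d in (True, False):
            joint = w * P_drought[s][d]
            total += joint
            prob += joint * P_damage[d][True]
    return prob / total

print(p_damage("high"))  # 0.6*0.7 + 0.4*0.1 = 0.46
print(p_damage())        # marginal over both scenarios
```

Swapping the scenario priors and re-querying is exactly the scenario-analysis capability the abstract identifies as a strength of BNs.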
Understanding and planning ecological restoration of plant-pollinator networks.
Devoto, Mariano; Bailey, Sallie; Craze, Paul; Memmott, Jane
2012-04-01
Theory developed from studying changes in the structure and function of communities during natural or managed succession can guide the restoration of particular communities. We constructed 30 quantitative plant-flower visitor networks along a managed successional gradient to identify the main drivers of change in network structure. We then applied two alternative restoration strategies in silico (restoring for functional complementarity or redundancy) to data from our early successional plots to examine whether different strategies affected the restoration trajectories. Changes in network structure were explained by a combination of age, tree density and variation in tree diameter, even when variance explained by undergrowth structure was accounted for first. A combination of field data, a network approach and numerical simulations helped to identify which species should be given restoration priority in the context of different restoration targets. This combined approach provides a powerful tool for directing management decisions, particularly when management seeks to restore or conserve ecosystem function. © 2012 Blackwell Publishing Ltd/CNRS.
Yates, Katherine L; Schoeman, David S
2013-01-01
Spatial management tools, such as marine spatial planning and marine protected areas, are playing an increasingly important role in attempts to improve marine management and accommodate conflicting needs. Robust data are needed to inform decisions among different planning options, and early inclusion of stakeholder involvement is widely regarded as vital for success. One of the biggest stakeholder groups, and the most likely to be adversely impacted by spatial restrictions, is the fishing community. In order to take their priorities into account, planners need to understand spatial variation in their perceived value of the sea. Here a readily accessible, novel method for quantitatively mapping fishers' spatial access priorities is presented. Spatial access priority mapping, or SAPM, uses only basic functions of standard spreadsheet and GIS software. Unlike the use of remote-sensing data, SAPM actively engages fishers in participatory mapping, documenting rather than inferring their priorities. By so doing, SAPM also facilitates the gathering of other useful data, such as local ecological knowledge. The method was tested and validated in Northern Ireland, where over 100 fishers participated in a semi-structured questionnaire and mapping exercise. The response rate was excellent, 97%, demonstrating fishers' willingness to be involved. The resultant maps are easily accessible and instantly informative, providing a very clear visual indication of which areas are most important for the fishers. The maps also provide quantitative data, which can be used to analyse the relative impact of different management options on the fishing industry and can be incorporated into planning software, such as MARXAN, to ensure that conservation goals can be met at minimum negative impact to the industry. 
This research shows how spatial access priority mapping can facilitate the early engagement of fishers and the ready incorporation of their priorities into the decision-making process in a transparent, quantitative way.
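The aggregation at the heart of such priority mapping can be sketched in a few lines: each fisher distributes priority points among grid cells, per-fisher totals are normalised so every respondent carries equal weight, and the normalised scores are summed into one surface. Cell IDs and point allocations below are invented, and this is only a plausible reading of the method, not the published SAPM code:

```python
from collections import defaultdict

# One dict per respondent: grid cell -> priority points allocated.
responses = [
    {"A1": 60, "A2": 30, "B1": 10},
    {"A1": 20, "B1": 50, "B2": 30},
    {"A2": 70, "B2": 30},
]

def priority_surface(responses):
    """Sum each fisher's share-normalised scores into one surface."""
    surface = defaultdict(float)
    for scores in responses:
        total = sum(scores.values())
        for cell, score in scores.items():
            surface[cell] += score / total  # each fisher contributes 1.0 in total
    return dict(surface)

surface = priority_surface(responses)
print(max(surface, key=surface.get))  # cell with the highest pooled priority
```

A table like `surface` joins directly onto a grid layer in GIS software, which is why the method needs nothing beyond a spreadsheet and standard GIS functions.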
Symstad, Amy J.; Long, Andrew J.; Stamm, John; King, David A.; Bachelet, Dominque M.; Norton, Parker A.
2014-01-01
Wind Cave National Park (WICA) protects one of the world’s longest caves, has large amounts of high quality, native vegetation, and hosts a genetically important bison herd. The park’s relatively small size and unique purpose within its landscape requires hands-on management of these and other natural resources, all of which are interconnected. Anthropogenic climate change presents an added challenge to WICA natural resource management because it is characterized by large uncertainties, many of which are beyond the control of park and National Park Service (NPS) staff. When uncertainty is high and control of this uncertainty low, scenario planning is an appropriate tool for determining future actions. In 2009, members of the NPS obtained formal training in the use of scenario planning in order to evaluate it as a tool for incorporating climate change into NPS natural resource management planning. WICA served as one of two case studies used in this training exercise. Although participants in the training exercise agreed that the scenario planning process showed promise for its intended purpose, they were concerned that the process lacked the scientific rigor necessary to defend the management implications derived from it in the face of public scrutiny. This report addresses this concern and others by (1) providing a thorough description of the process of the 2009 scenario planning exercise, as well as its results and management implications for WICA; (2) presenting the results of a follow-up, scientific study that quantitatively simulated responses of WICA’s hydrological and ecological systems to specific climate projections; (3) placing these climate projections and the general climate scenarios used in the scenario planning exercise in the broader context of available climate projections; and (4) comparing the natural resource management implications derived from the two approaches. 
Review of Software Tools for Design and Analysis of Large scale MRM Proteomic Datasets
Colangelo, Christopher M.; Chung, Lisa; Bruce, Can; Cheung, Kei-Hoi
2013-01-01
Selective or Multiple Reaction Monitoring (SRM/MRM) is a liquid-chromatography (LC)/tandem-mass-spectrometry (MS/MS) method that enables the quantitation of specific proteins in a sample by analyzing precursor ions and the fragment ions of their selected tryptic peptides. Instrumentation software has advanced to the point that thousands of transitions (pairs of primary and secondary m/z values) can be measured in a triple quadrupole instrument coupled to an LC through well-designed scheduling and selection of m/z windows. The design of a good MRM assay relies on the availability of peptide spectra from previous discovery-phase LC-MS/MS studies. The tedious aspect of manually developing and processing MRM assays involving thousands of transitions has spurred the development of software tools to automate this process. Software packages have been developed for project management, assay development, assay validation, data export, peak integration, quality assessment, and biostatistical analysis. No single tool provides a complete end-to-end solution; this article therefore reviews the current state, and discusses future directions, of these software tools so that researchers can combine them into a comprehensive targeted proteomics workflow. PMID:23702368
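The scheduling point above can be illustrated with a small feasibility check: in scheduled MRM, each transition is acquired only within a retention-time window, and the effective dwell time per transition depends on how many windows overlap at once. Transition names, times, and the window width are invented for this sketch:

```python
# Hypothetical check of scheduled-MRM concurrency via a sweep over
# window open/close events.
def max_concurrent(transitions, window=2.0):
    """Max number of transitions monitored simultaneously.

    `transitions` maps a transition name to its expected retention time
    (minutes); each is acquired for +/- window/2 around that time.
    """
    events = []
    for name, rt in transitions.items():
        events.append((rt - window / 2, 1))   # window opens
        events.append((rt + window / 2, -1))  # window closes
    concurrent = peak = 0
    for _, delta in sorted(events):
        concurrent += delta
        peak = max(peak, concurrent)
    return peak

transitions = {"pepA_y7": 10.1, "pepA_y5": 10.1, "pepB_y6": 10.8, "pepC_y4": 14.0}
print(max_concurrent(transitions))  # peak concurrency with these windows
```

Keeping this peak low (by spreading transitions across the gradient or narrowing windows) is what lets a triple quadrupole handle thousands of transitions per run without starving each of dwell time.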
Abdulcadir, Jasmine; Say, Lale; Pallitto, Christina
2017-05-22
Improving healthcare providers' capacity for prevention and treatment of female genital mutilation (FGM) is important given that 200 million women and girls globally are living with FGM. However, training programs are lacking and often not evaluated, and validated, standardized tools to assess providers' knowledge, attitudes and practice (KAP) regarding FGM are lacking. Therefore, little evidence exists on the impact of training efforts on healthcare providers' KAP on FGM. The aim of our paper is to systematically review the available published and grey literature on existing quantitative tools (e.g. scales, questionnaires) measuring healthcare students' and providers' KAP on FGM. We systematically reviewed the published and grey literature on any quantitative assessment/measurement/evaluation of KAP of healthcare students and providers about FGM from January 1, 1995 to July 12, 2016. Twenty-nine papers met our inclusion criteria. We reviewed 18 full-text questionnaires implemented and administered to healthcare professionals (students, nurses, midwives and physicians) in high- and low-income countries. The questionnaires assessed basic KAP on FGM. Some included personal and cultural beliefs, past clinical experiences, personal awareness of available clinical guidelines and laws, previous training on FGM, training needs, caregivers' confidence in the management of women with FGM, communication and personal perceptions. Identified gaps included the medical, psychological or surgical treatments indicated to improve girls' and women's health; correct diagnosis, recording and reporting capacities; clitoral reconstruction and psychosexual care of circumcised women. Cultural and personal beliefs on FGM were investigated only in high-prevalence countries. Few questionnaires addressed care of children, child protection strategies, treatment of short-term complications, and prevention.
There is a need for implementation and testing of interventions aimed at improving healthcare professionals' and students' capacities of diagnosis, care and prevention of FGM. Designing tools for measuring the outcomes of such interventions is a critical aspect. A unique, reproducible and standardized questionnaire could be created to measure the effect of a particular training program. Such a tool would also allow comparisons between settings, countries and interventions. An ideal tool would test the clinical capacities of providers in managing complications and communicating with clients with FGM as well as changes in KAP.
Mathauer, Inke; Imhoff, Ingo
2006-08-29
There is a serious human resource crisis in the health sector in developing countries, particularly in Africa. One of the challenges is the low motivation of health workers. Experience and the evidence suggest that any comprehensive strategy to maximize health worker motivation in a developing country context has to involve a mix of financial and non-financial incentives. This study assesses the role of non-financial incentives for motivation in two cases, in Benin and Kenya. The study design entailed semi-structured qualitative interviews with doctors and nurses from public, private and NGO facilities in rural areas. The selection of health professionals was the result of a layered sampling process. In Benin 62 interviews with health professionals were carried out; in Kenya 37 were obtained. Results from individual interviews were backed up with information from focus group discussions. For further contextual information, interviews with civil servants in the Ministry of Health and at the district level were carried out. The interview material was coded and quantitative data was analysed with SPSS software. The study shows that health workers overall are strongly guided by their professional conscience and similar aspects related to professional ethos. In fact, many health workers are demotivated and frustrated precisely because they are unable to satisfy their professional conscience and impeded in pursuing their vocation due to lack of means and supplies and due to inadequate or inappropriately applied human resources management (HRM) tools. The paper also indicates that even some HRM tools that are applied may adversely affect the motivation of health workers. The findings confirm the starting hypothesis that non-financial incentives and HRM tools play an important role with respect to increasing motivation of health professionals. Adequate HRM tools can uphold and strengthen the professional ethos of doctors and nurses. 
This entails acknowledging their professionalism and addressing professional goals such as recognition, career development and further qualification. It must be the aim of human resources management/quality management (HRM/QM) to develop the work environment so that health workers are enabled to meet both their personal goals and the organizational goals.
Mathauer, Inke; Imhoff, Ingo
2006-01-01
Background There is a serious human resource crisis in the health sector in developing countries, particularly in Africa. One of the challenges is the low motivation of health workers. Experience and the evidence suggest that any comprehensive strategy to maximize health worker motivation in a developing country context has to involve a mix of financial and non-financial incentives. This study assesses the role of non-financial incentives for motivation in two cases, in Benin and Kenya. Methods The study design entailed semi-structured qualitative interviews with doctors and nurses from public, private and NGO facilities in rural areas. The selection of health professionals was the result of a layered sampling process. In Benin 62 interviews with health professionals were carried out; in Kenya 37 were obtained. Results from individual interviews were backed up with information from focus group discussions. For further contextual information, interviews with civil servants in the Ministry of Health and at the district level were carried out. The interview material was coded and quantitative data was analysed with SPSS software. Results and discussion The study shows that health workers overall are strongly guided by their professional conscience and similar aspects related to professional ethos. In fact, many health workers are demotivated and frustrated precisely because they are unable to satisfy their professional conscience and impeded in pursuing their vocation due to lack of means and supplies and due to inadequate or inappropriately applied human resources management (HRM) tools. The paper also indicates that even some HRM tools that are applied may adversely affect the motivation of health workers. Conclusion The findings confirm the starting hypothesis that non-financial incentives and HRM tools play an important role with respect to increasing motivation of health professionals. 
Adequate HRM tools can uphold and strengthen the professional ethos of doctors and nurses. This entails acknowledging their professionalism and addressing professional goals such as recognition, career development and further qualification. It must be the aim of human resources management/quality management (HRM/QM) to develop the work environment so that health workers are enabled to meet both their personal goals and the organizational goals. PMID:16939644
Peña, Adolfo; Estrada, Carlos A; Soniat, Debbie; Taylor, Benjamin; Burton, Michael
2012-01-01
Pain management in hospitalized patients remains a priority area for improvement; effective strategies for consensus development are needed to prioritize interventions. To identify challenges, barriers, and perspectives of healthcare providers in managing pain among hospitalized patients. Qualitative and quantitative group consensus using a brainstorming technique for quality improvement: the nominal group technique (NGT). One medical, 1 medical-surgical, and 1 surgical hospital unit at a large academic medical center. Nurses, resident physicians, patient care technicians, and unit clerks. Responses and ranking to the NGT question: "What causes uncontrolled pain in your unit?" Twenty-seven health workers generated a total of 94 ideas. The ideas perceived as contributing to suboptimal pain control were grouped as system factors (timeliness, n = 18 ideas; communication, n = 11; pain assessment, n = 8), human factors (knowledge and experience, n = 16; provider bias, n = 8; patient factors, n = 19), and interface of system and human factors (standardization, n = 14). Knowledge, timeliness, provider bias, and patient factors were the top-ranked themes. Knowledge and timeliness are considered the main priorities for improving pain control. NGT is an efficient tool for identifying general and context-specific priority areas for quality improvement; teams of healthcare providers should consider using NGT to address their own challenges and barriers. Copyright © 2011 Society of Hospital Medicine.
Sherriff, Sophie C; Rowan, John S; Fenton, Owen; Jordan, Philip; Melland, Alice R; Mellander, Per-Erik; hUallacháin, Daire Ó
2016-02-16
Within agricultural watersheds, suspended sediment-discharge hysteresis during storm events is commonly used to indicate dominant sediment sources and pathways. However, the limited availability of high-resolution data, the largely qualitative nature of hysteresis metrics, the short longevity of records, and the scarcity of simultaneous multi-watershed analyses have limited the efficacy of hysteresis as a sediment management tool. This two-year study utilizes a quantitative hysteresis index derived from high-resolution suspended sediment and discharge data to assess fluctuations in sediment source location, delivery mechanisms and export efficiency in three intensively farmed watersheds during events over time. Flow-weighted event sediment export was further considered using multivariate techniques to delineate rainfall, stream hydrology, and antecedent moisture controls on sediment origins. Watersheds with low permeability (moderately or poorly drained soils) had good surface hydrological connectivity and, therefore, contrasting hysteresis depending on source location (hillslope versus channel bank). The well-drained watershed with reduced connectivity exported less sediment but, when watershed connectivity was established, produced the largest event sediment load of all watersheds. Event sediment export was elevated in arable watersheds when low groundcover was coupled with high connectivity, whereas in the grassland watershed, elevated export was attributed to wetter weather only. Hysteresis analysis successfully indicated contrasting seasonality, connectivity and source availability and is a useful tool for identifying watershed-specific sediment management practices.
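A quantitative hysteresis index of the kind this study relies on can be sketched compactly. The sketch below is a minimal illustration under assumptions, not the authors' published index: it interpolates suspended-sediment concentration on the rising and falling limbs of a storm event at a few common discharge levels and averages the normalized rising-minus-falling difference, so positive values suggest clockwise loops (proximal or channel sources) and negative values anticlockwise loops (distal or hillslope sources). The function name and the choice of interpolation levels are assumptions.

```python
def hysteresis_index(q, c):
    """Illustrative event-scale hysteresis index from time-ordered lists of
    discharge q and suspended-sediment concentration c: positive = clockwise
    (rising-limb concentrations exceed falling-limb ones), negative =
    anticlockwise."""
    peak = q.index(max(q))                       # split the event at peak discharge
    rise = list(zip(q[:peak + 1], c[:peak + 1]))
    fall = list(zip(q[peak:], c[peak:]))

    def interp(limb, qx):
        # linear interpolation of concentration at discharge qx, clamped at ends
        limb = sorted(limb)
        for (q0, c0), (q1, c1) in zip(limb, limb[1:]):
            if q0 <= qx <= q1:
                return c0 if q1 == q0 else c0 + (c1 - c0) * (qx - q0) / (q1 - q0)
        return limb[0][1] if qx < limb[0][0] else limb[-1][1]

    qmin, qmax = min(q), max(q)
    levels = [qmin + f * (qmax - qmin) for f in (0.25, 0.5, 0.75)]
    cmax = max(c)
    # mean normalized rising-minus-falling concentration difference
    return sum((interp(rise, x) - interp(fall, x)) / cmax for x in levels) / len(levels)
```

On a synthetic clockwise event the index is positive; reversing the concentration pattern flips the sign, which is the property the study exploits to separate channel-bank from hillslope sources.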
An index-based robust decision making framework for watershed management in a changing climate.
Kim, Yeonjoo; Chung, Eun-Sung
2014-03-01
This study developed an index-based robust decision making framework for watershed management dealing with water quantity and quality issues in a changing climate. It consists of two parts: management alternative development and alternative analysis. The first part, alternative development, consists of six steps: 1) understand the watershed components and processes using the HSPF model, 2) identify the spatial vulnerability ranking using two indices, potential streamflow depletion (PSD) and potential water quality deterioration (PWQD), 3) quantify residents' preferences on water management demands and calculate the watershed evaluation index, a weighted combination of PSD and PWQD, 4) set quantitative targets for water quantity and quality, 5) develop a list of feasible alternatives, and 6) eliminate unacceptable alternatives. The second part, alternative analysis, has three steps: 7) analyze all selected alternatives with a hydrologic simulation model considering various climate change scenarios, 8) quantify the alternative evaluation index, which includes social and hydrologic criteria, using multi-criteria decision analysis methods, and 9) prioritize all options based on a minimax regret strategy for robust decisions. This framework accounts for the uncertainty inherent in climate models and climate change scenarios by utilizing the minimax regret strategy, a decision-making strategy for conditions of deep uncertainty; the procedure thus derives a robust prioritization based on the multiple utilities of alternatives across scenarios. In this study, the proposed procedure was applied to a Korean urban watershed that has suffered from streamflow depletion and water quality deterioration. Our application shows that the framework provides a useful watershed management tool for incorporating quantitative and qualitative information into the evaluation of various policies with regard to water resource planning and management.
Copyright © 2013 Elsevier B.V. All rights reserved.
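The final prioritization step, minimax regret over climate scenarios, is compact enough to sketch. The following is an illustrative re-implementation under assumed inputs: the alternative names and utility values are invented, while in the framework the utilities would be the alternative evaluation index computed per scenario.

```python
def minimax_regret_ranking(utilities):
    """Rank alternatives by minimax regret. `utilities` maps each
    alternative to a list of utility values, one per scenario
    (higher is better)."""
    scenarios = len(next(iter(utilities.values())))
    # best achievable utility in each scenario, across all alternatives
    best = [max(u[s] for u in utilities.values()) for s in range(scenarios)]
    # regret = shortfall from the scenario's best; keep each alternative's worst case
    max_regret = {a: max(best[s] - u[s] for s in range(scenarios))
                  for a, u in utilities.items()}
    # prefer the alternative whose worst-case regret is smallest
    return sorted(max_regret, key=max_regret.get)
```

With utilities {"A": [0.9, 0.2], "B": [0.6, 0.6], "C": [0.5, 0.5]}, alternative B ranks first: it is never far from the best outcome in any scenario, which is exactly the robustness the framework seeks.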
The SAM framework: modeling the effects of management factors on human behavior in risk analysis.
Murphy, D M; Paté-Cornell, M E
1996-08-01
Complex engineered systems, such as nuclear reactors and chemical plants, have the potential for catastrophic failure with disastrous consequences. In recent years, human and management factors have been recognized as frequent root causes of major failures in such systems. However, classical probabilistic risk analysis (PRA) techniques do not account for the underlying causes of these errors because they focus on the physical system and do not explicitly address the link between components' performance and organizational factors. This paper describes a general approach for addressing the human and management causes of system failure, called the SAM (System-Action-Management) framework. Beginning with a quantitative risk model of the physical system, SAM expands the scope of analysis to incorporate first the decisions and actions of individuals that affect the physical system. SAM then links management factors (incentives, training, policies and procedures, selection criteria, etc.) to those decisions and actions. The focus of this paper is on four quantitative models of action that describe this last relationship. These models address the formation of intentions for action and their execution as a function of the organizational environment. Intention formation is described by three alternative models: a rational model, a bounded rationality model, and a rule-based model. The execution of intentions is then modeled separately. These four models are designed to assess the probabilities of individual actions from the perspective of management, thus reflecting the uncertainties inherent to human behavior. The SAM framework is illustrated for a hypothetical case of hazardous materials transportation. This framework can be used as a tool to increase the safety and reliability of complex technical systems by modifying the organization, rather than, or in addition to, re-designing the physical system.
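The SAM decomposition described above, management factors shaping action probabilities and actions driving system failure, can be reduced to a minimal numeric sketch. The action names and probabilities below are invented for illustration; in the framework itself, P(action | management) would come from the rational, bounded-rationality, or rule-based intention models.

```python
def failure_probability(p_action_given_mgmt, p_failure_given_action):
    """Two-stage SAM-style decomposition:
    P(F) = sum over actions a of P(F | a) * P(a | management environment)."""
    assert abs(sum(p_action_given_mgmt.values()) - 1.0) < 1e-9  # proper distribution
    return sum(p_failure_given_action[a] * p
               for a, p in p_action_given_mgmt.items())
```

Shifting the action distribution, for example by training that makes procedure-following more likely, lowers the system failure probability without touching the physical model, which is the organizational lever SAM is built to evaluate.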
The modern role of transoesophageal echocardiography in the assessment of valvular pathologies
Bull, Sacha; Newton, James
2017-01-01
Despite significant advancements in the field of cardiovascular imaging, transoesophageal echocardiography remains the key imaging modality in the management of valvular pathologies. This paper provides echocardiographers with an overview of the modern role of TOE in the diagnosis and management of valvular disease. We describe how the introduction of 3D techniques has changed the detection and grading of valvular pathologies and concentrate on its role as a monitoring tool in interventional cardiology. In addition, we focus on the echocardiographic and Doppler techniques used in the assessment of prosthetic valves and provide guidance for the evaluation of prosthetic valves. Finally, we summarise quantitative methods used for the assessment of valvular stenosis and regurgitation and highlight the key areas where echocardiography remains superior to other novel imaging modalities. PMID:28096184
The modern role of transoesophageal echocardiography in the assessment of valvular pathologies.
Wamil, Malgorzata; Bull, Sacha; Newton, James
2017-01-17
Despite significant advancements in the field of cardiovascular imaging, transoesophageal echocardiography remains the key imaging modality in the management of valvular pathologies. This paper provides echocardiographers with an overview of the modern role of TOE in the diagnosis and management of valvular disease. We describe how the introduction of 3D techniques has changed the detection and grading of valvular pathologies and concentrate on its role as a monitoring tool in interventional cardiology. In addition, we focus on the echocardiographic and Doppler techniques used in the assessment of prosthetic valves, and provide guidance for the evaluation of prosthetic valves. Finally, we summarise quantitative methods used for the assessment of valvular stenosis and regurgitation and highlight the key areas where echocardiography remains superior to other novel imaging modalities. © 2017 The authors.
Remote sensing techniques for conservation and management of natural vegetation ecosystems
NASA Technical Reports Server (NTRS)
Parada, N. D. J. (Principal Investigator); Verdesio, J. J.; Dossantos, J. R.
1981-01-01
The importance of using remote sensing techniques, in the visible and near-infrared ranges, for mapping, inventory, conservation and management of natural ecosystems is discussed. Examples from Brazil and other countries are given to evaluate the products from orbital platforms (MSS and RBV imagery of LANDSAT) and at the aerial level (photography) for ecosystem studies. The maximum quantitative and qualitative information which can be obtained from each sensor, at different levels, is discussed. Based on the experiments conducted, it is concluded that remote sensing is a useful tool for mapping vegetation units, estimating biomass, forecasting and evaluating fire damage, detecting disease, mapping deforestation and detecting change in land use. In addition, remote sensing techniques can be used in controlling implantation and planning of natural/artificial regeneration.
Integrated Data & Analysis in Support of Informed and Transparent Decision Making
NASA Astrophysics Data System (ADS)
Guivetchi, K.
2012-12-01
The California Water Plan includes a framework for improving water reliability, environmental stewardship, and economic stability through two initiatives - integrated regional water management to make better use of local water sources by integrating multiple aspects of managing water and related resources; and maintaining and improving statewide water management systems. The Water Plan promotes ways to develop a common approach for data standards and for understanding, evaluating, and improving regional and statewide water management systems, and for common ways to evaluate and select from alternative management strategies and projects. The California Water Plan acknowledges that planning for the future is uncertain and that change will continue to occur. It is not possible to know for certain how population growth, land use decisions, water demand patterns, environmental conditions, the climate, and many other factors that affect water use and supply may change by 2050. To anticipate change, our approach to water management and planning for the future needs to consider and quantify uncertainty, risk, and sustainability. There is a critical need for information sharing and information management to support over-arching and long-term water policy decisions that cross-cut multiple programs across many organizations and provide a common and transparent understanding of water problems and solutions. Achieving integrated water management with multiple benefits requires a transparent description of dynamic linkages between water supply, flood management, water quality, land use, environmental water, and many other factors. Water Plan Update 2013 will include an analytical roadmap for improving data, analytical tools, and decision-support to advance integrated water management at statewide and regional scales. 
It will include recommendations for linking collaborative processes with technical enhancements, providing effective analytical tools, and improving and sharing data and information. Specifically, this includes achieving better integration and consistency with other planning activities; obtaining consensus on quantitative deliverables; building a common conceptual understanding of the water management system; developing common schematics of the water management system; establishing modeling protocols and standards; and improving transparency and exchange of Water Plan information.
A Systematic Approach for Quantitative Analysis of Multidisciplinary Design Optimization Framework
NASA Astrophysics Data System (ADS)
Kim, Sangho; Park, Jungkeun; Lee, Jeong-Oog; Lee, Jae-Woo
An efficient Multidisciplinary Design and Optimization (MDO) framework for an aerospace engineering system should use and integrate distributed resources such as various analysis codes, optimization codes, Computer Aided Design (CAD) tools, Data Base Management Systems (DBMS), etc. in a heterogeneous environment, and needs to provide user-friendly graphical user interfaces. In this paper, we propose a systematic approach for determining a reference MDO framework and for evaluating MDO frameworks. The proposed approach incorporates two well-known methods, the Analytic Hierarchy Process (AHP) and Quality Function Deployment (QFD), in order to provide a quantitative analysis of the qualitative criteria of MDO frameworks. The identification and hierarchy of the framework requirements and the corresponding solutions for the reference MDO frameworks, the general one and the aircraft-oriented one, were carefully investigated. The reference frameworks were also quantitatively identified using AHP and QFD. An assessment of three in-house frameworks was then performed. The results produced clear and useful guidelines for improvement of the in-house MDO frameworks and showed the feasibility of the proposed approach for evaluating an MDO framework without human interference.
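The AHP step of such an approach can be illustrated with the standard row-geometric-mean approximation to the priority vector, a common stand-in for the principal eigenvector of the pairwise-comparison matrix. This is a generic sketch: the matrix values, criteria, and function name are assumptions, not the paper's actual data.

```python
from math import prod  # Python 3.8+

def ahp_priorities(pairwise):
    """Approximate AHP priority weights from an n x n pairwise-comparison
    matrix (entry [i][j] = how much criterion i dominates criterion j)
    via the row geometric mean, then normalize to sum to 1."""
    n = len(pairwise)
    gm = [prod(row) ** (1.0 / n) for row in pairwise]  # row geometric means
    total = sum(gm)
    return [g / total for g in gm]
```

For a perfectly consistent matrix such as [[1, 2], [0.5, 1]] the weights are exactly [2/3, 1/3]; QFD would then combine such criterion weights with requirement-to-solution relationship scores to rate each candidate framework.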
Montesinos, Isabel; Brancart, Françoise; Schepers, Kinda; Jacobs, Frederique; Denis, Olivier; Delforge, Marie-Luce
2015-06-01
A total of 120 bronchoalveolar lavage specimens from HIV and non-HIV immunocompromised patients, positive for Pneumocystis jirovecii by an "in house" real-time polymerase chain reaction (PCR), were evaluated by the Bio-Evolution Pneumocystis real-time PCR, a commercial quantitative assay. Patients were classified into 2 categories based on clinical and radiological findings: definite and unlikely Pneumocystis pneumonia (PCP). For the "in house" PCR, a cycle threshold of 34 was established as the cut-off value to discriminate definite PCP from unlikely PCP, with 65% sensitivity and 85% specificity. For the Bio-Evolution quantitative PCR, a cut-off value of 2.8×10^5 copies/mL was defined, with 72% sensitivity and 82% specificity. Overlapping zones of results for definite and unlikely PCP were observed. Quantitative PCR is probably a useful tool for PCP diagnosis. However, for optimal management of PCP in non-HIV immunocompromised patients, operational thresholds should be assessed according to underlying diseases and other clinical and radiological parameters. Copyright © 2015 Elsevier Inc. All rights reserved.
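Evaluating a cycle-threshold cut-off like the one above is a direct sensitivity/specificity calculation. The sketch below uses invented Ct values and labels purely to show the mechanics; note that lower Ct means more target DNA, so Ct at or below the cut-off is called positive.

```python
def sens_spec(ct_values, labels, cutoff):
    """Sensitivity and specificity of a real-time PCR Ct cut-off.
    labels: True for definite PCP, False for unlikely PCP."""
    tp = sum(ct <= cutoff and y for ct, y in zip(ct_values, labels))
    fn = sum(ct > cutoff and y for ct, y in zip(ct_values, labels))
    tn = sum(ct > cutoff and not y for ct, y in zip(ct_values, labels))
    fp = sum(ct <= cutoff and not y for ct, y in zip(ct_values, labels))
    return tp / (tp + fn), tn / (tn + fp)
```

Sweeping `cutoff` over the observed Ct range and plotting the resulting pairs is the usual ROC-style way such operational thresholds are chosen per patient group.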
Informatics methods to enable sharing of quantitative imaging research data.
Levy, Mia A; Freymann, John B; Kirby, Justin S; Fedorov, Andriy; Fennessy, Fiona M; Eschrich, Steven A; Berglund, Anders E; Fenstermacher, David A; Tan, Yongqiang; Guo, Xiaotao; Casavant, Thomas L; Brown, Bartley J; Braun, Terry A; Dekker, Andre; Roelofs, Erik; Mountz, James M; Boada, Fernando; Laymon, Charles; Oborski, Matt; Rubin, Daniel L
2012-11-01
The National Cancer Institute Quantitative Imaging Network (QIN) is a collaborative research network whose goal is to share data, algorithms and research tools to accelerate quantitative imaging research. A challenge is the variability in tools and analysis platforms used in quantitative imaging. Our goal was to understand the extent of this variation and to develop an approach to enable data sharing and to promote reuse of quantitative imaging data in the community. We performed a survey of the current tools in use by the QIN member sites for representation and storage of their QIN research data, including images, image meta-data and clinical data. We identified existing systems and standards for data sharing and their gaps for the QIN use case. We then proposed a system architecture to enable data sharing and collaborative experimentation within the QIN. There are a variety of tools currently used by each QIN institution. We developed a general information system architecture to support the QIN goals. We also describe the remaining architecture gaps we are developing to enable members to share research images and image meta-data across the network. As a research network, the QIN will stimulate quantitative imaging research by pooling data, algorithms and research tools. However, there are gaps in current functional requirements that will need to be met by future informatics development. Special attention must be given to the technical requirements needed to translate these methods into the clinical research workflow to enable validation and qualification of these novel imaging biomarkers. Copyright © 2012 Elsevier Inc. All rights reserved.
Through ARIPAR-GIS the quantified area risk analysis supports land-use planning activities.
Spadoni, G; Egidi, D; Contini, S
2000-01-07
The paper first summarises the main aspects of the ARIPAR methodology, whose steps can be applied to quantify the impact on a territory of major accident risks due to processing, storing and transporting dangerous substances. Then the capabilities of the new decision support tool ARIPAR-GIS, implementing the mentioned procedure, are described, together with its main features and types of results. These are clearly shown through a short description of the updated ARIPAR study (reference year 1994), in which the impact of changes due to industrial and transportation dynamics on the Ravenna territory in Italy was evaluated. A brief explanation of how the results have been used by local administrations offers the opportunity to discuss the advantages of the quantitative area risk analysis tool in supporting risk management, risk control and land-use planning activities.
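The core area-risk idea, summing the contributions of many fixed and transport risk sources at each point of the territory, can be caricatured in a few lines. The sketch below uses a crude step lethality model and invented numbers; the real ARIPAR-GIS works on GIS layers with event frequencies and proper vulnerability (probit) models.

```python
def local_individual_risk(sources, point):
    """Individual risk (expected fatal events per year) at a map point,
    summing independent contributions from each source.
    sources: (x, y, event_frequency_per_year, lethal_radius) tuples."""
    total = 0.0
    for (x, y, freq, lethal_radius) in sources:
        d = ((point[0] - x) ** 2 + (point[1] - y) ** 2) ** 0.5
        # step lethality model: fatal inside the radius, harmless outside
        total += freq * (1.0 if d <= lethal_radius else 0.0)
    return total
```

Evaluating this on a grid and drawing iso-risk contours is what makes such a tool directly usable for land-use planning decisions.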
Le Guillou-Guillemette, Helene; Lunel-Fabiani, Francoise
2009-01-01
The treatment schedule (combination of compounds, doses, and duration) and the virological follow-up for management of antiviral treatment in patients chronically infected with HCV are now well standardized, but to ensure good monitoring of treated patients, physicians need rapid, reproducible, and sensitive molecular virological tools with a wide range of detection and quantification of HCV RNA in blood samples. Several assays for detection and/or quantification of HCV RNA are currently commercially available. Here, all these assays are detailed, and a brief description of each step of each assay is provided. They are divided into two categories by method: those based on signal amplification and those based on target amplification. These two categories are then divided into qualitative and quantitative detection assays. The real-time reverse-transcription polymerase chain reaction (RT-PCR)-based assays are the most promising strategy in the HCV virology field.
Concerns related to Safety Management of Engineered Nanomaterials in research environment
NASA Astrophysics Data System (ADS)
Groso, A.; Meyer, Th
2013-04-01
Since the rise of occupational safety and health research on nanomaterials, much progress has been made in generating health-effects and exposure data. However, when detailed quantitative risk analysis is in question, more research is needed, especially quantitative measures of worker exposure and standards to categorize toxicity/hazardousness data. In the absence of dose-response relationships and quantitative exposure measurements, control banding (CB) has been widely adopted by the OHS community as a pragmatic tool for implementing a risk management strategy based on a precautionary approach. Being in charge of health and safety in a Swiss university where nanomaterials are widely used and produced, we too face the challenges of nanomaterials' occupational safety. In this work, we discuss the field application of an in-house risk management methodology similar to CB, as well as some other methodologies. The challenges and issues related to the process will be discussed. Since exact data on nanomaterial hazardousness are missing for most situations, we deduce that the outcome of the analysis for a particular process is essentially the same with a simple methodology that determines only exposure potential as with one taking into account the hazardousness of ENPs. It is evident that when reliable data on hazardousness factors (such as surface chemistry, solubility, carcinogenicity, toxicity, etc.) become available, more differentiation will be possible in determining the risk for different materials. On the protective-measures side, all CB methodologies lean toward overprotection, although some suggest comprehensive protective/preventive measures while others offer only basic advice. The implementation and control of protective measures in a research environment will also be discussed.
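Mechanically, control banding reduces to a lookup from coarse hazard and exposure bands to a control level. The sketch below is in the spirit of published CB tools for nanomaterials (e.g. CB Nanotool) but with invented band boundaries, scores, and control names, purely to illustrate why such methods tend toward overprotection when hazard data are missing.

```python
CONTROLS = ["general ventilation", "fume hood / local exhaust ventilation",
            "containment", "seek specialist advice"]

def control_level(severity, probability):
    """Map 0-100 severity (hazardousness) and probability (exposure
    potential) scores to a recommended control. Band boundaries are
    illustrative assumptions, not the published ones."""
    sev_band = min(3, int(severity) // 25)   # 0-100 score -> band 0-3
    prob_band = min(3, int(probability) // 25)
    # precautionary: escalate to the worse of the two bands, so an unknown
    # (worst-case) hazard score drives the control upward on its own
    return CONTROLS[max(sev_band, prob_band)]
```

Assigning a default high severity band whenever toxicity data are absent reproduces the overprotective behavior the abstract describes: exposure potential alone then decides very little.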
DAnTE: a statistical tool for quantitative analysis of -omics data
DOE Office of Scientific and Technical Information (OSTI.GOV)
Polpitiya, Ashoka D.; Qian, Weijun; Jaitly, Navdeep
2008-05-03
DAnTE (Data Analysis Tool Extension) is a statistical tool designed to address challenges unique to quantitative bottom-up, shotgun proteomics data. This tool has also been demonstrated for microarray data and can easily be extended to other high-throughput data types. DAnTE features selected normalization methods, missing value imputation algorithms, peptide-to-protein rollup methods, an extensive array of plotting functions, and a comprehensive ANOVA scheme that can handle unbalanced data and random effects. The Graphical User Interface (GUI) is designed to be very intuitive and user-friendly.
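One of the normalization steps such a tool offers, median centering of log abundances across LC-MS runs, fits in a few lines. This is an illustrative Python re-implementation of the idea, not DAnTE code (DAnTE itself is a standalone application):

```python
def median_normalize(runs):
    """Median-center each run's log-abundance values to the grand median,
    removing run-to-run intensity drift. `runs` is a list of lists of
    log abundances; uses the upper median for even-length inputs."""
    medians = [sorted(r)[len(r) // 2] for r in runs]       # per-run median
    grand = sorted(medians)[len(medians) // 2]             # grand median
    return [[x - m + grand for x in r] for r, m in zip(runs, medians)]
```

After this step the runs share a common central tendency, so downstream ANOVA contrasts reflect biology rather than instrument drift.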
Investigating the Validity of Two Widely Used Quantitative Text Tools
ERIC Educational Resources Information Center
Cunningham, James W.; Hiebert, Elfrieda H.; Mesmer, Heidi Anne
2018-01-01
In recent years, readability formulas have gained new prominence as a basis for selecting texts for learning and assessment. Variables that quantitative tools count (e.g., word frequency, sentence length) provide valid measures of text complexity insofar as they accurately predict representative and high-quality criteria. The longstanding…
ERIC Educational Resources Information Center
Baldwin, Grover H.
The use of quantitative decision making tools provides the decision maker with a range of alternatives among which to decide, permits acceptance and use of the optimal solution, and decreases risk. Training line administrators in the use of these tools can help school business officials obtain reliable information upon which to base district…
Intelligent model-based diagnostics for vehicle health management
NASA Astrophysics Data System (ADS)
Luo, Jianhui; Tu, Fang; Azam, Mohammad S.; Pattipati, Krishna R.; Willett, Peter K.; Qiao, Liu; Kawamoto, Masayuki
2003-08-01
The recent advances in sensor technology, remote communication and computational capabilities, and standardized hardware/software interfaces are creating a dramatic shift in the way the health of vehicles is monitored and managed. These advances facilitate remote monitoring, diagnosis and condition-based maintenance of automotive systems. With the increased sophistication of electronic control systems in vehicles, there is a concomitant increased difficulty in the identification of the malfunction phenomena. Consequently, the current rule-based diagnostic systems are difficult to develop, validate and maintain. New intelligent model-based diagnostic methodologies that exploit the advances in sensor, telecommunications, computing and software technologies are needed. In this paper, we will investigate hybrid model-based techniques that seamlessly employ quantitative (analytical) models and graph-based dependency models for intelligent diagnosis. Automotive engineers have found quantitative simulation (e.g. MATLAB/SIMULINK) to be a vital tool in the development of advanced control systems. The hybrid method exploits this capability to improve the diagnostic system's accuracy and consistency, utilizes existing validated knowledge on rule-based methods, enables remote diagnosis, and responds to the challenges of increased system complexity. The solution is generic and has the potential for application in a wide range of systems.
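The graph-based dependency half of the hybrid diagnosis above can be sketched as single-fault isolation on a D-matrix, which records which tests each component's failure would trip. Component and test names below are invented; the quantitative (simulation-model) half would supply the test outcomes fed into this step.

```python
def isolate_faults(d_matrix, test_results):
    """Single-fault isolation on a dependency model.
    d_matrix: component -> list of tests that fail if the component fails.
    test_results: test -> True if the test passed, False if it failed.
    A component remains a suspect only if its failure signature explains
    every failed test and contradicts no passed test."""
    failed = {t for t, ok in test_results.items() if not ok}
    passed = {t for t, ok in test_results.items() if ok}
    return [c for c, tests in d_matrix.items()
            if failed <= set(tests) and not (set(tests) & passed)]
```

In a vehicle-health setting the surviving suspects would then be ranked or refined with the quantitative simulation models, which is what makes the combined approach more accurate than rules alone.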
Hein, L R
2001-10-01
A set of NIH Image macro programs was developed to make qualitative and quantitative analyses from digital stereo pictures produced by scanning electron microscopes. These tools were designed for image alignment, anaglyph representation, animation, reconstruction of true elevation surfaces, reconstruction of elevation profiles, true-scale elevation mapping and, for the quantitative approach, surface area and roughness calculations. Limitations on time processing, scanning techniques and programming concepts are also discussed.
Rüter, Anders; Vikstrom, Tore
2009-01-01
Good staff procedure skills in a management group during incidents and disasters are believed to be a prerequisite for good management of the situation. However, this has not been demonstrated scientifically. Templates for evaluating results from performance indicators during simulation exercises have previously been tested. The aim of this study was to demonstrate that these indicators can be used as a tool for studying the relationship between good management skills and good staff procedure skills. The hypothesis was that good and structured work (staff procedure skills) in a hospital management group during simulation exercises in disaster medicine is related to good and timely decisions (good management skills). Results from 29 consecutive simulation exercises in which staff procedure skills and management skills were evaluated using quantitative measurements were included. The statistical method used was simple linear regression with staff procedure skills as the response variable and management skills as the predictor variable. An overall significant relationship was identified between staff procedure skills and management skills (p < 0.05). This study suggests that there is a relationship between staff procedure skills and management skills in the educational setting used. Future studies are needed to demonstrate whether this can also be observed during actual incidents.
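The analysis here is ordinary least squares with a single predictor, which is small enough to sketch directly. The scores below are made up and the function name is a placeholder; the study's actual indicator data are not reproduced.

```python
def simple_linear_regression(x, y):
    """OLS fit y = intercept + slope * x for one predictor.
    Returns (slope, intercept, Pearson r)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)          # sum of squares of x
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    syy = sum((yi - my) ** 2 for yi in y)
    slope = sxy / sxx
    return slope, my - slope * mx, sxy / (sxx * syy) ** 0.5
```

With management-skill scores as `x` and staff-procedure scores as `y`, a significantly positive slope is the quantitative form of the relationship the study reports.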
Théron, Laëtitia; Centeno, Delphine; Coudy-Gandilhon, Cécile; Pujos-Guillot, Estelle; Astruc, Thierry; Rémond, Didier; Barthelemy, Jean-Claude; Roche, Frédéric; Feasson, Léonard; Hébraud, Michel; Béchet, Daniel; Chambon, Christophe
2016-10-26
Mass spectrometry imaging (MSI) is a powerful tool to visualize the spatial distribution of molecules on a tissue section. The main limitation of MALDI-MSI of proteins is the lack of direct identification. Therefore, this study focuses on a MSI~LC-MS/MS-LF workflow to link the results from MALDI-MSI with potential peak identification and label-free quantitation, using only one tissue section. At first, we studied the impact of matrix deposition and laser ablation on protein extraction from the tissue section. Then, we did a back-correlation of the m/z of the proteins detected by MALDI-MSI to those identified by label-free quantitation. This allowed us to compare the label-free quantitation of proteins obtained in LC-MS/MS with the peak intensities observed in MALDI-MSI. We managed to link identification to nine peaks observed by MALDI-MSI. The results showed that the MSI~LC-MS/MS-LF workflow (i) allowed us to study a representative muscle proteome compared to a classical bottom-up workflow; and (ii) was sparsely impacted by matrix deposition and laser ablation. This workflow, performed as a proof-of-concept, suggests that a single tissue section can be used to perform MALDI-MSI and protein extraction, identification, and relative quantitation.
Web portal on environmental sciences "ATMOS"
NASA Astrophysics Data System (ADS)
Gordov, E. P.; Lykosov, V. N.; Fazliev, A. Z.
2006-06-01
The web portal ATMOS (http://atmos.iao.ru and http://atmos.scert.ru), developed under an INTAS grant, makes available to the international research community, environmental managers, and the interested public a bilingual information source for the domain of Atmospheric Physics and Chemistry and the related application domain of air quality assessment and management. It offers access to integrated thematic information, experimental data, analytical tools and models, case studies, and related information and educational resources compiled, structured, and edited by the partners into a coherent and consistent thematic information resource. While offering the usual components of a thematic site, such as link collections, user group registration, a discussion forum, and a news section, the site is distinguished by its scientific information services and tools: on-line models and analytical tools, and data collections and case studies together with tutorial material. The portal is organized as a set of interrelated scientific sites that address the basic branches of Atmospheric Sciences and Climate Modeling as well as the applied domains of Air Quality Assessment and Management, Modeling, and Environmental Impact Assessment. Each scientific site is an information-computational system, realized by means of Internet technologies, that is open for external access. The main basic-science topics are devoted to Atmospheric Chemistry, Atmospheric Spectroscopy and Radiation, Atmospheric Aerosols, and Atmospheric Dynamics and Atmospheric Models, including climate models. The portal ATMOS reflects the current tendency of the Environmental Sciences to transform into exact (quantitative) sciences and is an effective example of the integration of modern Information Technologies with the Environmental Sciences. This makes the portal both an auxiliary instrument for supporting interdisciplinary projects on regional environments and an extensive educational resource in this important domain.
Mass spectrometry as a quantitative tool in plant metabolomics
Jorge, Tiago F.; Mata, Ana T.
2016-01-01
Metabolomics is a research field used to acquire comprehensive information on the composition of a metabolite pool to provide a functional screen of the cellular state. Studies of the plant metabolome include the analysis of a wide range of chemical species with very diverse physico-chemical properties, and therefore powerful analytical tools are required for the separation, characterization and quantification of this vast compound diversity present in plant matrices. In this review, challenges in the use of mass spectrometry (MS) as a quantitative tool in plant metabolomics experiments are discussed, and important criteria for the development and validation of MS-based analytical methods provided. This article is part of the themed issue ‘Quantitative mass spectrometry’. PMID:27644967
Blew, Robert M; Lee, Vinson R; Farr, Joshua N; Schiferl, Daniel J; Going, Scott B
2014-02-01
Peripheral quantitative computed tomography (pQCT) is an essential tool for assessing bone parameters of the limbs, but subject movement and its impact on image quality remains a challenge to manage. The current approach to determine image viability is by visual inspection, but pQCT lacks a quantitative evaluation. Therefore, the aims of this study were to (1) examine the reliability of a qualitative visual inspection scale and (2) establish a quantitative motion assessment methodology. Scans were performed on 506 healthy girls (9-13 years) at diaphyseal regions of the femur and tibia. Scans were rated for movement independently by three technicians using a linear, nominal scale. Quantitatively, a ratio of movement to limb size (%Move) provided a measure of movement artifact. A repeat-scan subsample (n = 46) was examined to determine %Move's impact on bone parameters. Agreement between measurers was strong (intraclass correlation coefficient = 0.732 for tibia, 0.812 for femur), but greater variability was observed in scans rated 3 or 4, the delineation between repeat and no repeat. The quantitative approach found ≥95% of subjects had %Move <25%. Comparison of initial and repeat scans by groups above and below 25% initial movement showed significant differences in the >25% grouping. A pQCT visual inspection scale can be a reliable metric of image quality, but technicians may periodically mischaracterize subject motion. The presented quantitative methodology yields more consistent movement assessment and could unify procedure across laboratories. Data suggest a delineation of 25% movement for determining whether a diaphyseal scan is viable or requires repeat.
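The quantitative metric above, movement artifact relative to limb size with a 25% cutoff for repeating a scan, can be sketched as follows; the function and argument names are illustrative, not fields from pQCT software:

```python
# %Move: movement-artifact extent relative to limb size, with the 25%
# cutoff suggested above for repeating a diaphyseal scan.
MOVE_CUTOFF_PCT = 25.0

def percent_move(movement_extent_mm, limb_diameter_mm):
    """Movement artifact as a percentage of limb size."""
    return 100.0 * movement_extent_mm / limb_diameter_mm

def needs_repeat(movement_extent_mm, limb_diameter_mm):
    """Flag a scan for repeat when %Move exceeds the cutoff."""
    return percent_move(movement_extent_mm, limb_diameter_mm) > MOVE_CUTOFF_PCT
```

For example, 10 mm of movement on a 30 mm limb gives %Move ≈ 33% and would be flagged, while 5 mm would not.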
Tackling Flood Risk from Watersheds using a Natural Flood Risk Management Toolkit
NASA Astrophysics Data System (ADS)
Reaney, S. M.; Pearson, C.; Barber, N.; Fraser, A.
2017-12-01
In the UK, flood risk management is moving beyond solely mitigating at the point of impact in towns and at key infrastructure towards tackling the problem at source through a range of landscape-based intervention measures. This natural flood risk management (NFM) approach has been trialled in a range of catchments in the UK and is moving towards adoption as a key part of flood risk management. The approach offers advantages including lower cost and co-benefits for water quality and habitat creation. However, for an agency or group wishing to implement NFM within a catchment, two key questions need to be addressed: where in the catchment should the measures be placed, and how many measures are needed to be effective? With this toolkit, these questions are assessed in a two-stage workflow. First, SCIMAP-Flood gives a risk-based mapping of the locations likely to contribute to the flood peak. This tool uses information on land cover, hydrological connectivity, flood-generating rainfall patterns, and hydrological travel-time distributions to impacted communities. The presented example applies the tool to the River Eden catchment, UK, at 5 m grid resolution and hence provides sub-field-scale information at the landscape extent. SCIMAP-Flood identifies sub-catchments where physically based catchment hydrological simulation models can be applied to test different NFM-based mitigation measures. In this example, the CRUM3 catchment hydrological model has been applied within an uncertainty framework to consider the effectiveness of soil-compaction reduction and large woody debris dams within a sub-catchment. It was found that large-scale soil aeration to reduce soil compaction levels throughout the catchment is probably the most useful natural flood management measure for this catchment. NFM has potential for widespread application, and these tools help to ensure that the measures are correctly designed and that scheme performance can be quantitatively assessed and predicted.
Elbers, Nieke A; Chase, Robin; Craig, Ashley; Guy, Lyn; Harris, Ian A; Middleton, James W; Nicholas, Michael K; Rebbeck, Trudy; Walsh, John; Willcock, Simon; Lockwood, Keri; Cameron, Ian D
2017-05-22
Problems may arise during the approval process for treatment after a compensable work injury, including excess paperwork, delays in approving services, disputes, and allegations of over-servicing. This is perceived as undesirable for injured people, health care professionals and claims managers, and costly to the health care system, compensation system, workplaces and society. Introducing an Evidence Based Medicine (EBM) decision tool in the workers' compensation system could provide a partial solution by reducing uncertainty about effective treatment. The aim of this study was to investigate the attitudes of health care professionals (HCPs) to the potential implementation of an EBM tool in the workers' compensation setting. The study has a mixed-methods design. The quantitative study consisted of an online questionnaire asking about self-reported knowledge of, attitudes towards and behaviour related to EBM in general. The qualitative study consisted of interviews about an EBM tool being applied in the workers' compensation process. Participants were health care practitioners from different clinical specialties, recruited through the investigators' clinical networks and the workers' compensation government regulator's website. Participants completing the questionnaire (n = 231) indicated they were knowledgeable about the evidence base in their field, but perceived some difficulties when applying EBM. General practitioners reported the greatest obstacles to applying EBM. Participants who were interviewed (n = 15) perceived that an EBM tool in the workers' compensation setting could have some advantages, such as reducing inappropriate treatment or over-servicing and providing guidance for clinicians. However, participants expressed substantial concerns that the EBM tool would not adequately reflect the impact of psychosocial factors on recovery.
They also highlighted a lack of timeliness in decision making and proper assessment, particularly in pain management. Overall, HCPs are supportive of EBM but have strong concerns about the implementation of EBM-based decision making in the workers' compensation setting. The participants felt that an EBM tool should not be applied rigidly and should take into account clinical judgement and patient variability and preferences. In general, the treatment approval process in the workers' compensation insurance system is a sensitive area in which the interaction between HCPs and claims managers can be improved.
Challenges in Higher Education Research: The Use of Quantitative Tools in Comparative Analyses
ERIC Educational Resources Information Center
Reale, Emanuela
2014-01-01
Although the value of the comparative perspective for the study of higher education is widely recognised, there is little consensus about specific methodological approaches. Quantitative tools are relevant for comparative analyses since they are supposed to reduce complexity and to identify and grade similarities…
Leadership Trust in Virtual Teams Using Communication Tools: A Quantitative Correlational Study
ERIC Educational Resources Information Center
Clark, Robert Lynn
2014-01-01
The purpose of this quantitative correlational study was to address leadership trust in virtual teams using communication tools in a small south-central, family-owned pharmaceutical organization, with multiple dispersed locations located in the United States. The results of the current research study could assist leaders to develop a communication…
Dalle Carbonare, S; Folli, F; Patrini, E; Giudici, P; Bellazzi, R
2013-01-01
The increasing demand for health care services and the complexity of health care delivery require Health Care Organizations (HCOs) to approach clinical risk management with proper methods and tools. An important aspect of risk management is to exploit the analysis of medical injury compensation claims in order to reduce adverse events and, at the same time, to optimize the costs of health insurance policies. This work provides a probabilistic method to estimate the risk level of an HCO by computing quantitative risk indexes from medical injury compensation claims. Our method is based on the estimation of a loss probability distribution from compensation claims data through parametric and non-parametric modeling and Monte Carlo simulations. The loss distribution can be estimated both on the whole dataset and, thanks to the application of a Bayesian hierarchical model, on stratified data. The approach allows quantitative assessment of the risk structure of the HCO by analyzing the loss distribution and deriving its expected value and percentiles. We applied the proposed method to 206 cases of injuries with compensation requests collected from 1999 to the first semester of 2007 by the HCO of Lodi, in the northern part of Italy. We computed the risk indexes taking into account the different clinical departments and the different hospitals involved. The approach proved useful for understanding the HCO risk structure in terms of frequency, severity, and expected and unexpected loss related to adverse events.
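The Monte Carlo step can be sketched as below. The Poisson claim frequency and lognormal claim severity are illustrative modelling assumptions, not the distributions fitted in the study:

```python
import random

def simulate_annual_loss(n_sims, mean_claims, sev_mu, sev_sigma, seed=0):
    """Monte Carlo sketch of an annual loss distribution from claims data:
    Poisson claim counts with lognormal severities (illustrative choices)."""
    rng = random.Random(seed)
    exp_neg_lambda = 2.718281828459045 ** -mean_claims
    losses = []
    for _ in range(n_sims):
        # Poisson draw via Knuth's multiplication method
        n, p = 0, 1.0
        while p > exp_neg_lambda:
            n += 1
            p *= rng.random()
        n -= 1
        # Annual loss: sum of n lognormal claim severities
        losses.append(sum(rng.lognormvariate(sev_mu, sev_sigma) for _ in range(n)))
    losses.sort()
    expected = sum(losses) / n_sims           # expected loss
    percentile_95 = losses[int(0.95 * n_sims)]  # tail ("unexpected") loss level
    return expected, percentile_95

expected_loss, loss_95 = simulate_annual_loss(2000, mean_claims=3.0,
                                              sev_mu=8.0, sev_sigma=1.0)
```

Expected value and percentiles of the simulated loss distribution then serve as the risk indexes the abstract describes.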
Spotsizer: High-throughput quantitative analysis of microbial growth.
Bischof, Leanne; Převorovský, Martin; Rallis, Charalampos; Jeffares, Daniel C; Arzhaeva, Yulia; Bähler, Jürg
2016-10-01
Microbial colony growth can serve as a useful readout in assays for studying complex genetic interactions or the effects of chemical compounds. Although computational tools for acquiring quantitative measurements of microbial colonies have been developed, their utility can be compromised by inflexible input image requirements, non-trivial installation procedures, or complicated operation. Here, we present the Spotsizer software tool for automated colony size measurements in images of robotically arrayed microbial colonies. Spotsizer features a convenient graphical user interface (GUI), has both single-image and batch-processing capabilities, and works with multiple input image formats and different colony grid types. We demonstrate how Spotsizer can be used for high-throughput quantitative analysis of fission yeast growth. The user-friendly Spotsizer tool provides rapid, accurate, and robust quantitative analyses of microbial growth in a high-throughput format. Spotsizer is freely available at https://data.csiro.au/dap/landingpage?pid=csiro:15330 under a proprietary CSIRO license.
Cox, Trevor F; Ranganath, Lakshminarayan
2011-12-01
Alkaptonuria (AKU) is caused by excessive accumulation of homogentisic acid (HGA) in body fluids, due to lack of the enzyme homogentisate dioxygenase, and leads to varied clinical manifestations, mainly through conversion of HGA to a polymeric melanin-like pigment in a process known as ochronosis. A potential treatment, a drug called nitisinone, is available to decrease the formation of HGA. However, successful demonstration of its efficacy in modifying the natural history of AKU requires an effective quantitative assessment tool. We describe two potential tools that could be used to quantitate disease burden in AKU. One tool scores clinical features, including clinical assessments, investigations and questionnaires, in 15 patients with AKU. The second tool is a scoring system that only includes items obtained from questionnaires used in 44 people with AKU. Statistical analyses were carried out on the two patient datasets to assess the AKU tools; these included the calculation of Cronbach's alpha, multidimensional scaling and simple linear regression analysis. The conclusion was that there is good evidence that the tools could be adopted as AKU assessment tools, perhaps with further refinement before use in the practical setting of a clinical trial.
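Cronbach's alpha, used above to assess the internal consistency of the scoring items, can be computed directly from its definition; the item scores below are invented for illustration:

```python
def cronbach_alpha(items):
    """Cronbach's alpha for internal consistency. `items` is a list of
    item-score lists, one per item, each scored across the same respondents.
    Population variances are used; the variance ratio is unaffected by
    the population/sample choice."""
    k = len(items)
    n = len(items[0])

    def variance(xs):
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / len(xs)

    sum_item_var = sum(variance(item) for item in items)
    totals = [sum(item[i] for item in items) for i in range(n)]
    return (k / (k - 1)) * (1 - sum_item_var / variance(totals))

# Invented scores: 3 questionnaire items rated by 5 respondents
scores = [[3, 4, 5, 2, 4], [3, 5, 5, 1, 4], [2, 4, 5, 2, 3]]
alpha = cronbach_alpha(scores)
```

Values near 1 indicate that the questionnaire items measure a common underlying construct.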
Elastographic techniques of thyroid gland: current status.
Andrioli, Massimiliano; Persani, Luca
2014-08-01
Thyroid nodules are very common, with malignancies accounting for about 5%. Fine-needle biopsy is the most accurate test for thyroid cancer diagnosis. Elastography, a new technology that directly evaluates the elastic properties of tissue, has recently been added to the diagnostic armamentarium of endocrinologists as a noninvasive predictor of thyroid malignancy. In this paper, we critically review the characteristics and applications of elastographic methods in the thyroid gland. Elastographic techniques can be classified on the basis of the source of tissue compression (free-hand, carotid vibration, ultrasound pulses), processing time (real-time, off-line), and stiffness expression (qualitative, semi-quantitative, or quantitative). Acoustic radiation force impulse and Aixplorer shear-wave imaging are the newest and most promising quantitative elastographic methods. The primary application of elastography is the detection of nodular lesions suspicious for malignancy. Published data show a high sensitivity and negative predictive value for the technique. Insufficient data are available on the possible application of elastography in the differential diagnosis of indeterminate lesions and in thyroiditis. Elastography represents a noninvasive tool able to increase the performance of ultrasound in the selection of thyroid nodules at higher risk of malignancy. Some technical improvements and the definition of more robust quantitative diagnostic criteria are required before elastography can be assigned a definite role in the management of thyroid nodules and thyroiditis.
Huang, Xunbing; Wu, Huihui; McNeill, Mark Richard; Qin, Xinghu; Ma, Jingchuan; Tu, Xiongbing; Cao, Guangchun; Wang, Guangjun; Nong, Xiangqun; Zhang, Zehua
2016-01-01
Studies of grasshopper diets have historically employed a range of methodologies, each with certain advantages and disadvantages. For example, some methodologies are qualitative rather than quantitative; others require long experimental periods or examine population-level effects only. In this study, we used real-time PCR to examine the diets of individual grasshoppers. The method has the advantage of being both fast and quantitative. Using two grasshopper species, Oedaleus asiaticus and Dasyhippus barbipes, we designed ITS primer sequences for their three main host plants, Stipa krylovii, Leymus chinensis and Cleistogenes squarrosa, and used real-time PCR to test diet structure both qualitatively and quantitatively. The lowest detection efficiency among the three grass species was ~80%, with a strong correlation between actual and PCR-measured food intake. We found that Oedaleus asiaticus maintained an unchanged diet structure across grasslands with different grass communities, whereas Dasyhippus barbipes changed its diet structure. These results help explain why O. asiaticus is mainly confined to Stipa-dominated grassland while D. barbipes is more widely distributed across Inner Mongolia. Overall, real-time PCR was shown to be a useful tool for investigating grasshopper diets, which in turn offers insight into grasshopper distributions and improved pest management. PMID:27562455
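Quantitation in real-time PCR is typically derived from cycle-threshold (Ct) values. A minimal sketch of the standard efficiency-based relative quantification follows; this is the generic E^-ΔCt form, not necessarily the exact calibration used in the study:

```python
def relative_quantity(ct_target, ct_reference, efficiency=2.0):
    """Relative template quantity from qPCR cycle-threshold (Ct) values:
    quantity is proportional to E^-(Ct_target - Ct_reference), where E is
    the amplification efficiency (2.0 = perfect doubling per cycle)."""
    return efficiency ** -(ct_target - ct_reference)
```

For example, a target crossing threshold three cycles earlier than the reference corresponds to roughly an eight-fold higher template quantity at perfect efficiency.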
Developing Web-based Tools for Collaborative Science and Public Outreach
NASA Astrophysics Data System (ADS)
Friedman, A.; Pizarro, O.; Williams, S. B.
2016-02-01
With the advances in high-bandwidth communications and the proliferation of social media tools, education & outreach activities have become commonplace on ocean-bound research cruises. In parallel, advances in underwater robotics & other data-collecting platforms have made it possible to collect copious amounts of oceanographic data. This data then typically undergoes laborious, manual processing to transform it into quantitative information, which normally occurs post-cruise, resulting in significant lags between collecting data and using it for scientific discovery. This presentation discusses how appropriately designed software systems can be used to fulfill multiple objectives and leverage public engagement to complement science goals. We will present two software platforms: the first is a web-browser-based tool developed for real-time tracking of multiple underwater robots and ships. It was designed to allow anyone on board to view or control it on any device with a web browser. It opens up the possibility of remote teleoperation & engagement and was easily adapted to enable live streaming over the internet for public outreach. While the tracking system provided context and engaged people in real-time, it also directed interested participants to Squidle, another online system. Developed for scientists, Squidle supports data management, exploration & analysis and enables direct access to survey data, reducing the lag in data processing. It provides a user-friendly, streamlined interface that integrates advanced data management & online annotation tools. This system was adapted to provide a simplified user interface, tutorial instructions and a gamified ranking system to encourage "citizen science" participation.
These examples show that through a flexible design approach, it is possible to leverage the development effort of creating science tools to facilitate outreach goals, opening up the possibility for acquiring large volumes of crowd-sourced data without compromising science objectives.
Zhang, Sheng; Huang, Jinsheng; Yang, Baigbing; Lin, Binjie; Xu, Xinyun; Chen, Jinru; Zhao, Zhuandi; Tu, Xiaozhi; Bin, Haihua
2014-04-01
To improve occupational health management in electroplating enterprises with quantitative classification measures, and to provide a scientific basis for the prevention and control of occupational hazards in electroplating enterprises and the protection of workers' health. A quantitative classification table was created for occupational health management in electroplating enterprises. The evaluation indicators included 6 items and 27 sub-items, with a total score of 100 points. Forty electroplating enterprises were selected and scored according to the quantitative classification table, and were classified into grades A, B, and C based on the scores. Among the 40 electroplating enterprises, 11 (27.5%) had scores of >85 points (grade A), 23 (57.5%) had scores of 60-85 points (grade B), and 6 (15.0%) had scores of <60 points (grade C). Quantitative classification management for electroplating enterprises is a valuable approach, which is helpful for supervision and management by the health department and provides an effective method for the self-management of enterprises.
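The grading scheme above maps a 100-point quantitative-classification score to a grade; a minimal sketch:

```python
def occupational_health_grade(score):
    """Map a 0-100 quantitative-classification score to a grade using
    the cut-offs reported above: >85 -> A, 60-85 -> B, <60 -> C."""
    if score > 85:
        return "A"
    if score >= 60:
        return "B"
    return "C"
```

The boundary scores of 60 and 85 fall in grade B, matching the 60-85 band reported for the 23 grade-B enterprises.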
McGowan, Conor P.; Lyons, James E.; Smith, David
2015-01-01
Structured decision making (SDM) is an increasingly utilized approach and set of tools for addressing complex decisions in environmental management. SDM is a value-focused thinking approach that places paramount importance on first establishing clear management objectives that reflect core values of stakeholders. To be useful for management, objectives must be transparently stated in unambiguous and measurable terms. We used these concepts to develop consensus objectives for the multiple stakeholders of horseshoe crab harvest in Delaware Bay. Participating stakeholders first agreed on a qualitative statement of fundamental objectives, and then worked to convert those objectives to specific and measurable quantities, so that management decisions could be assessed. We used a constraint-based approach where the conservation objectives for Red Knots, a species of migratory shorebird that relies on horseshoe crab eggs as a food resource during migration, constrained the utility of crab harvest. Developing utility functions to effectively reflect the management objectives allowed us to incorporate stakeholder risk aversion even though different stakeholder groups were averse to different or competing risks. While measurable objectives and quantitative utility functions seem scientific, developing these objectives was fundamentally driven by the values of the participating stakeholders.
Operational seasonal forecasting of crop performance.
Stone, Roger C; Meinke, Holger
2005-11-29
Integrated, interdisciplinary crop performance forecasting systems, linked with appropriate decision and discussion support tools, could substantially improve operational decision making in agricultural management. Recent developments in connecting numerical weather prediction models and general circulation models with quantitative crop growth models offer the potential for development of integrated systems that incorporate components of long-term climate change. However, operational seasonal forecasting systems have little or no value unless they are able to change key management decisions. Changed decision making through incorporation of seasonal forecasting ultimately has to demonstrate improved long-term performance of the cropping enterprise. Simulation analyses conducted on specific production scenarios are especially useful in improving decisions, particularly if this is done in conjunction with development of decision-support systems and associated facilitated discussion groups. Improved management of the overall crop production system requires an interdisciplinary approach, where climate scientists, agricultural scientists and extension specialists are intimately linked with crop production managers in the development of targeted seasonal forecast systems. The same principle applies in developing improved operational management systems for commodity trading organizations, milling companies and agricultural marketing organizations. Application of seasonal forecast systems across the whole value chain in agricultural production offers considerable benefits in improving overall operational management of agricultural production.
Development of quantitative risk acceptance criteria
DOE Office of Scientific and Technical Information (OSTI.GOV)
Griesmeyer, J. M.; Okrent, D.
Some of the major considerations for effective management of risk are discussed, with particular emphasis on risks due to nuclear power plant operations. Although there are impacts associated with the rest of the fuel cycle, they are not addressed here. Several previously published proposals for quantitative risk criteria are reviewed; they range from a simple acceptance criterion on individual risk of death to a quantitative risk management framework. The final section discusses some of the problems in the establishment of a framework for the quantitative management of risk.
Chang, Cheng; Xu, Kaikun; Guo, Chaoping; Wang, Jinxia; Yan, Qi; Zhang, Jian; He, Fuchu; Zhu, Yunping
2018-05-22
Compared with the numerous software tools developed for identification and quantification of -omics data, there remains a lack of suitable tools for both downstream analysis and data visualization. To help researchers better understand the biological meanings in their -omics data, we present an easy-to-use tool, named PANDA-view, for both statistical analysis and visualization of quantitative proteomics data and other -omics data. PANDA-view contains various kinds of analysis methods such as normalization, missing value imputation, statistical tests, clustering and principal component analysis, as well as the most commonly used data visualization methods including an interactive volcano plot. Additionally, it provides user-friendly interfaces for protein-peptide-spectrum representation of the quantitative proteomics data. PANDA-view is freely available at https://sourceforge.net/projects/panda-view/. Contact: 1987ccpacer@163.com and zhuyunping@gmail.com. Supplementary data are available at Bioinformatics online.
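As a rough illustration of the computation behind a volcano plot (not PANDA-view's actual code; hypothetical two-group intensity data, with a plain Welch t statistic standing in for the -log10 p-value real tools would plot):

```python
import math
from statistics import mean, stdev

def volcano_point(group_a, group_b):
    """Return (log2 fold change, Welch t statistic) for one protein:
    the two quantities a volcano plot places on its x and y axes."""
    log2fc = math.log2(mean(group_b) / mean(group_a))
    se = math.sqrt(stdev(group_a) ** 2 / len(group_a) +
                   stdev(group_b) ** 2 / len(group_b))
    t = (mean(group_b) - mean(group_a)) / se
    return log2fc, t

# hypothetical replicate intensities for one protein in two conditions
fc, t = volcano_point([10.0, 11.0, 10.5], [21.0, 20.0, 22.0])
print(round(fc, 2), round(t, 1))  # a two-fold change with a large t statistic
```

Proteins far from the origin on both axes (large fold change, strong statistic) are the candidates a volcano plot highlights.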
Integration of PKPD relationships into benefit–risk analysis
Bellanti, Francesco; van Wijk, Rob C; Danhof, Meindert; Della Pasqua, Oscar
2015-01-01
Aim Despite the continuous endeavour to achieve high standards in medical care through effectiveness measures, a quantitative framework for the assessment of the benefit–risk balance of new medicines is lacking prior to regulatory approval. The aim of this short review is to summarise the approaches currently available for benefit–risk assessment. In addition, we propose the use of pharmacokinetic–pharmacodynamic (PKPD) modelling as the pharmacological basis for evidence synthesis and evaluation of novel therapeutic agents. Methods A comprehensive literature search has been performed using MESH terms in PubMed, in which articles describing benefit–risk assessment and modelling and simulation were identified. In parallel, a critical review of multi-criteria decision analysis (MCDA) is presented as a tool for characterising a drug's safety and efficacy profile. Results A definition of benefits and risks has been proposed by the European Medicines Agency (EMA), in which qualitative and quantitative elements are included. However, in spite of the value of MCDA as a quantitative method, decisions about benefit–risk balance continue to rely on subjective expert opinion. By contrast, a model-informed approach offers the opportunity for a more comprehensive evaluation of benefit–risk balance before extensive evidence is generated in clinical practice. Conclusions Benefit–risk balance should be an integral part of the risk management plan and, as such, considered before marketing authorisation. Modelling and simulation can be incorporated into MCDA to support evidence synthesis as well as evidence generation, taking into account the underlying correlations between favourable and unfavourable effects. In addition, it represents a valuable tool for the optimization of protocol design in effectiveness trials. PMID:25940398
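A minimal sketch of the weighted-scoring arithmetic underlying MCDA. The criteria, weights, and scores below are entirely hypothetical; real benefit-risk MCDA elicits them from experts and stakeholders.

```python
def mcda_score(scores, weights):
    """Weighted-sum MCDA value: scores on a common 0-1 scale,
    weights summing to 1. Risks are entered as 1 - severity so
    that higher is always better."""
    assert abs(sum(weights.values()) - 1.0) < 1e-9
    return sum(weights[c] * scores[c] for c in weights)

# hypothetical drug profiles and criterion weights
drug_a  = {"efficacy": 0.8, "safety": 0.6, "convenience": 0.9}
drug_b  = {"efficacy": 0.9, "safety": 0.4, "convenience": 0.7}
weights = {"efficacy": 0.5, "safety": 0.4, "convenience": 0.1}

print(round(mcda_score(drug_a, weights), 2))  # -> 0.73
print(round(mcda_score(drug_b, weights), 2))  # -> 0.68
```

The review's point is that model-informed (PKPD) estimates could replace the subjective scores fed into such a scheme, while the correlations between favourable and unfavourable effects would need a joint simulation rather than independent criterion scores.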
O'Brien, Doireann; Harvey, Kate; Howse, Jessica; Reardon, Tessa; Creswell, Cathy
2016-10-01
Mental health problems are common and typically have an early onset. Effective treatments for mental health problems in childhood and adolescence are available, yet only a minority of children who are affected access them. This is of serious concern, considering the far-reaching and long-term negative consequences of such problems. Primary care is usually the first port of call for concerned parents so it is important to understand how primary care practitioners manage child and adolescent mental health problems and the barriers they face. To ascertain primary care practitioners' perceptions of the barriers that prevent effective management of child and adolescent mental health problems. A systematic review of qualitative and quantitative literature in a primary care setting. A database search of peer-reviewed articles using PsycINFO, MEDLINE(®), Embase, and Web of Science, from inception (earliest 1806) until October 2014, was conducted. Additional studies were identified through hand searches and forward-citation searches. Studies needed to have at least one search term in four categories: primary care, childhood/adolescence, mental health, and barriers. A total of 4151 articles were identified, of which 43 were included (30 quantitative studies and 13 qualitative studies). The majority of the barriers related to identification, management, and/or referral. Considerable barriers included a lack of providers and resources, extensive waiting lists, and financial restrictions. The identification of a broad range of significant barriers highlights the need to strengthen the ability to deal with these common difficulties in primary care. There is a particular need for tools and training to aid accurate identification and management, and for more efficient access to specialist services. © British Journal of General Practice 2016.
ERIC Educational Resources Information Center
Wright, Benjamin D.
2000-01-01
Summarizes the distinctions between qualitative and quantitative research and shows their complementary aspects. Shows there is no contradiction or conflict between the qualitative and the quantitative and discusses Rasch measurement as the construction tool of quantitative research. (SLD)
Gates, Allison; Shave, Kassi; Featherstone, Robin; Buckreus, Kelli; Ali, Samina; Scott, Shannon; Hartling, Lisa
2017-06-06
There exist many evidence-based interventions available to manage procedural pain in children and neonates, yet they are severely underutilized. Parents play an important role in the management of their child's pain; however, many do not possess adequate knowledge of how to effectively do so. The purpose of the planned study is to systematically review and synthesize current knowledge of the experiences and information needs of parents with regard to the management of their child's pain and distress related to medical procedures in the emergency department. We will conduct a systematic review using rigorous methods and reporting based on the PRISMA statement. We will conduct a comprehensive search of literature published between 2000 and 2016 reporting on parents' experiences and information needs with regard to helping their child manage procedural pain and distress. Ovid MEDLINE, Ovid PsycINFO, CINAHL, and PubMed will be searched. We will also search reference lists of key studies and gray literature sources. Two reviewers will screen the articles following inclusion criteria defined a priori. One reviewer will then extract the data from each article following a data extraction form developed by the study team. The second reviewer will check the data extraction for accuracy and completeness. Any disagreements with regard to study inclusion or data extraction will be resolved via discussion. Data from qualitative studies will be summarized thematically, while those from quantitative studies will be summarized narratively. The second reviewer will confirm the overarching themes resulting from the qualitative and quantitative data syntheses. The Critical Appraisal Skills Programme Qualitative Research Checklist and the Quality Assessment Tool for Quantitative Studies will be used to assess the quality of the evidence from each included study. 
To our knowledge, no published review exists that comprehensively reports on the experiences and information needs of parents related to the management of their child's procedural pain and distress. A systematic review of parents' experiences and information needs will help to inform strategies to empower them with the knowledge necessary to ensure their child's comfort during a painful procedure. PROSPERO CRD42016043698.
[Planning with NANDA, NOC, NIC taxonomies in neurologic rehabilitation: a clinical study].
Iori, Alessandra; Foracchia, Marco; Gradellini, Cinzia
2015-01-01
Nursing classifications identify a specific professional responsibility and increase nursing visibility, in line with the evolution of the profession in recent years. The aim was to evaluate care planning with the NANDA taxonomy in a neurologic rehabilitation setting. Care plans were managed with the NANDA taxonomy for the diagnoses of constipation and impaired skin integrity, using a computerized checklist tool for systematic observation. Data recorded with taxonomy-based planning were superior in both quantitative and qualitative terms. At least one diagnosis was opened for most patients (87%), and both diagnoses for 60% of them. Nursing care planning with the NANDA taxonomy can be considered a valid care methodology for neurologic patients: it requires a thorough and complete recording of the initial assessment and systematic recording of each monitoring, it increases the visibility of nursing work, and it highlights specific autonomy and responsibility in the prevention and management of problems.
Conducting field studies for testing pesticide leaching models
Smith, Charles N.; Parrish, Rudolph S.; Brown, David S.
1990-01-01
A variety of predictive models are being applied to evaluate the transport and transformation of pesticides in the environment. These include well-known models such as the Pesticide Root Zone Model (PRZM), the Risk of Unsaturated-Saturated Transport and Transformation Interactions for Chemical Concentrations Model (RUSTIC) and the Groundwater Loading Effects of Agricultural Management Systems Model (GLEAMS). The potentially large impacts of using these models as tools for developing pesticide management strategies and regulatory decisions necessitate development of sound model validation protocols. This paper offers guidance on many of the theoretical and practical problems encountered in the design and implementation of field-scale model validation studies. Recommendations are provided for site selection and characterization, test compound selection, data needs, measurement techniques, statistical design considerations and sampling techniques. A strategy is provided for quantitatively testing models using field measurements.
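Quantitative model testing of the kind the authors call for typically reduces to comparing paired predictions and field measurements with summary error statistics. A generic sketch, with hypothetical concentration data (units arbitrary):

```python
import math

def validation_stats(predicted, observed):
    """Mean bias and root-mean-square error between model
    predictions and paired field measurements."""
    n = len(predicted)
    errors = [p - o for p, o in zip(predicted, observed)]
    bias = sum(errors) / n
    rmse = math.sqrt(sum(e * e for e in errors) / n)
    return bias, rmse

pred = [1.2, 0.8, 2.5, 1.9]   # modelled pesticide concentrations
obs  = [1.0, 1.1, 2.0, 2.2]   # field-measured concentrations
bias, rmse = validation_stats(pred, obs)
print(round(bias, 3), round(rmse, 3))  # -> 0.025 0.343
```

Bias reveals systematic over- or under-prediction, while RMSE captures overall scatter; a rigorous protocol would additionally specify acceptance thresholds for both before a model is deemed adequate.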
NASA Astrophysics Data System (ADS)
Siirila-Woodburn, Erica R.; Steefel, Carl I.; Williams, Kenneth H.; Birkholzer, Jens T.
2018-03-01
The effects of land use and land cover (LULC) change on environmental systems across the land surface's "critical zone" are highly uncertain, often making prediction and risk management decision difficult. In a series of numerical experiments with an integrated hydrologic model, overland flow generation is quantified for both present day and forest thinning scenarios. A typhoon storm event in a watershed near the Fukushima Dai-ichi Nuclear Power Plant is used as an example application in which the interplay between LULC change and overland flow generation is important given that sediment-bound radionuclides may cause secondary contamination via surface water transport. Results illustrate the nonlinearity of the integrated system spanning from the deep groundwater to the atmosphere, and provide quantitative tools when determining the tradeoffs of different risk-mitigation strategies.
Drawert, Brian; Trogdon, Michael; Toor, Salman; Petzold, Linda; Hellander, Andreas
2016-01-01
Computational experiments using spatial stochastic simulations have led to important new biological insights, but they require specialized tools and a complex software stack, as well as large and scalable compute and data analysis resources due to the large computational cost associated with Monte Carlo computational workflows. The complexity of setting up and managing a large-scale distributed computation environment to support productive and reproducible modeling can be prohibitive for practitioners in systems biology. This results in a barrier to the adoption of spatial stochastic simulation tools, effectively limiting the type of biological questions addressed by quantitative modeling. In this paper, we present PyURDME, a new, user-friendly spatial modeling and simulation package, and MOLNs, a cloud computing appliance for distributed simulation of stochastic reaction-diffusion models. MOLNs is based on IPython and provides an interactive programming platform for development of sharable and reproducible distributed parallel computational experiments.
[Biomechanical modeling of pelvic organ mobility: towards personalized medicine].
Cosson, Michel; Rubod, Chrystèle; Vallet, Alexandra; Witz, Jean-François; Brieu, Mathias
2011-11-01
Female pelvic mobility is crucial for urinary, bowel and sexual function and for vaginal delivery. This mobility is ensured by a complex organ suspension system composed of ligaments, fascia and muscles. Impaired pelvic mobility affects one in three women of all ages and can be incapacitating. Surgical management has a high failure rate, largely owing to poor knowledge of the organ support system, including the barely discernible ligamentous system. We propose a 3D digital model of the pelvic cavity based on MRI images and quantitative tools, designed to locate the pelvic ligaments. We thus obtain a coherent anatomical and functional model which can be used to analyze pelvic pathophysiology. This work represents a first step towards creating a tool for localizing and characterizing the source of pelvic imbalance. We examine possible future applications of this model, in terms of personalized therapy and prevention.
Multi-factor energy price models and exotic derivatives pricing
NASA Astrophysics Data System (ADS)
Hikspoors, Samuel
The high pace at which many of the world's energy markets have gradually been opened to competition has generated a significant amount of new financial activity. Academics and practitioners alike have recently begun to develop the tools of energy derivatives pricing and hedging as a quantitative topic in its own right. The energy contract structures, as well as the properties of their underlying assets, set the energy risk management industry apart from its more standard equity and fixed income counterparts. This thesis contributes to these broad market developments by participating in the advancement of the mathematical tools aimed at a better theory of energy contingent claim pricing and hedging. We propose several realistic two-factor and three-factor models for spot and forward price processes that generalize some well-known standard modeling assumptions. We develop the associated pricing methodologies and propose stable calibration algorithms that motivate the application of the relevant modeling schemes.
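A common starting point for the two-factor spot models such a thesis generalizes is a Schwartz-style pair: the log spot price is the sum of a mean-reverting short-term factor and a slowly drifting long-term factor. A bare Euler-discretization sketch with illustrative parameters only (not the thesis's calibrated models):

```python
import math
import random

def simulate_spot(days=250, s0=50.0, kappa=5.0, sigma_x=0.4,
                  mu=0.02, sigma_l=0.15, rho=-0.3, seed=7):
    """Euler scheme for a two-factor log-spot model:
    ln S = x + l, where x mean-reverts to 0 (short-term shocks)
    and l is an arithmetic Brownian trend (long-term level)."""
    rng = random.Random(seed)
    dt = 1.0 / 250.0
    x, l = 0.0, math.log(s0)
    path = []
    for _ in range(days):
        z1 = rng.gauss(0, 1)
        z2 = rho * z1 + math.sqrt(1 - rho ** 2) * rng.gauss(0, 1)  # correlated shock
        x += -kappa * x * dt + sigma_x * math.sqrt(dt) * z1
        l += mu * dt + sigma_l * math.sqrt(dt) * z2
        path.append(math.exp(x + l))
    return path

path = simulate_spot()
print(f"final simulated price: {path[-1]:.2f}")
```

The strong mean reversion in x (large kappa) reproduces the short-lived spikes characteristic of energy spot markets, which is one reason equity-style single-factor models are a poor fit for these contracts.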
Southern California Disasters II
NASA Technical Reports Server (NTRS)
Nicholson, Heather; Todoroff, Amber L.; LeBoeuf, Madeline A.
2015-01-01
The USDA Forest Service (USFS) has multiple programs in place which primarily utilize Landsat imagery to produce burn severity indices for aiding wildfire damage assessment and mitigation. These indices provide widely used wildfire damage assessment tools to decision makers. When the Hyperspectral Infrared Imager (HyspIRI) is launched in 2022, the sensor's hyperspectral resolution will support new methods for assessing natural disaster impacts on ecosystems, including wildfire damage to forests. This project used simulated HyspIRI data to study three southern California fires: Aspen, French, and King. Burn severity indices were calculated from the data and the results were compared quantitatively with the corresponding USFS products currently in use. The final results from this project illustrate how HyspIRI data may be used in the future to enhance assessment of fire-damaged areas and provide additional monitoring tools for decision support to the USFS and other land management agencies.
Provencher, Louis; Frid, Leonardo; Czembor, Christina; Morisette, Jeffrey T.
2016-01-01
State-and-Transition Simulation Modeling (STSM) is a quantitative analysis method that can consolidate a wide array of resource management issues under a “what-if” scenario exercise. STSM can be seen as an ensemble of models, such as climate models, ecological models, and economic models that incorporate human dimensions and management options. This chapter presents STSM as a tool to help synthesize information on social–ecological systems and to investigate some of the management issues associated with exotic annual Bromus species, which have been described elsewhere in this book. Definitions, terminology, and perspectives on conceptual and computer-simulated stochastic state-and-transition models are given first, followed by a brief review of past STSM studies relevant to the management of Bromus species. A detailed case study illustrates the usefulness of STSM for land management. As a whole, this chapter is intended to demonstrate how STSM can help both managers and scientists: (a) determine efficient resource allocation for monitoring nonnative grasses; (b) evaluate sources of uncertainty in model simulation results involving expert opinion, and their consequences for management decisions; and (c) provide insight into the consequences of predicted local climate change effects on ecological systems invaded by exotic annual Bromus species.
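At its core, an STSM steps a landscape of cells through discrete states using transition probabilities. A toy sketch with hypothetical states and probabilities for a Bromus-invaded system (not the chapter's calibrated model):

```python
import random

# hypothetical annual transition probabilities between three states
TRANSITIONS = {
    "perennial": {"perennial": 0.90, "invaded": 0.10},
    "invaded":   {"invaded": 0.85, "burned": 0.15},
    "burned":    {"invaded": 0.70, "perennial": 0.30},
}

def step(state, rng):
    """Draw the next state for one cell from its transition row."""
    r, cum = rng.random(), 0.0
    for nxt, p in TRANSITIONS[state].items():
        cum += p
        if r < cum:
            return nxt
    return state

def simulate(n_cells=1000, years=20, seed=1):
    """Simulate a landscape of cells and tally final states."""
    rng = random.Random(seed)
    cells = ["perennial"] * n_cells
    for _ in range(years):
        cells = [step(c, rng) for c in cells]
    return {s: cells.count(s) for s in ("perennial", "invaded", "burned")}

print(simulate())  # state tallies after 20 simulated years
```

Running many replicates with perturbed probabilities is how an STSM exposes the sources of uncertainty, including expert-opinion parameters, that the chapter discusses; management actions enter as modified transition rates.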
Review of software tools for design and analysis of large scale MRM proteomic datasets.
Colangelo, Christopher M; Chung, Lisa; Bruce, Can; Cheung, Kei-Hoi
2013-06-15
Selected or multiple reaction monitoring (SRM/MRM) is a liquid-chromatography (LC)/tandem-mass spectrometry (MS/MS) method that enables the quantitation of specific proteins in a sample by analyzing precursor ions and the fragment ions of their selected tryptic peptides. Instrumentation software has advanced to the point that thousands of transitions (pairs of primary and secondary m/z values) can be measured in a triple quadrupole instrument coupled to an LC, by a well-designed scheduling and selection of m/z windows. The design of a good MRM assay relies on the availability of peptide spectra from previous discovery-phase LC-MS/MS studies. The tedious aspect of manually developing and processing MRM assays involving thousands of transitions has spurred the development of software tools to automate this process. Software packages have been developed for project management, assay development, assay validation, data export, peak integration, quality assessment, and biostatistical analysis. No single tool provides a complete end-to-end solution, thus this article reviews the current state and discusses future directions of these software tools in order to enable researchers to combine these tools for a comprehensive targeted proteomics workflow. Copyright © 2013 The Authors. Published by Elsevier Inc. All rights reserved.
Interactive visualization to advance earthquake simulation
Kellogg, L.H.; Bawden, G.W.; Bernardin, T.; Billen, M.; Cowgill, E.; Hamann, B.; Jadamec, M.; Kreylos, O.; Staadt, O.; Sumner, D.
2008-01-01
The geological sciences are challenged to manage and interpret increasing volumes of data as observations and simulations increase in size and complexity. For example, simulations of earthquake-related processes typically generate complex, time-varying data sets in two or more dimensions. To facilitate interpretation and analysis of these data sets, evaluate the underlying models, and drive future calculations, we have developed methods of interactive visualization with a special focus on using immersive virtual reality (VR) environments to interact with models of Earth's surface and interior. Virtual mapping tools allow virtual "field studies" in inaccessible regions. Interactive tools allow us to manipulate shapes in order to construct models of geological features for geodynamic models, while feature extraction tools support quantitative measurement of structures that emerge from numerical simulation or field observations, thereby enabling us to improve our interpretation of the dynamical processes that drive earthquakes. VR has traditionally been used primarily as a presentation tool, albeit with active navigation through data. Reaping the full intellectual benefits of immersive VR as a tool for scientific analysis requires building on the method's strengths, that is, using both 3D perception and interaction with observed or simulated data. This approach also takes advantage of the specialized skills of geological scientists who are trained to interpret the often limited geological and geophysical data available from field observations. © Birkhäuser 2008.
Evaluation of the Earth System CoG Infrastructure in Supporting a Model Intercomparison Project
NASA Astrophysics Data System (ADS)
Wallis, J. C.; Rood, R. B.; Murphy, S.; Cinquini, L.; DeLuca, C.
2013-12-01
Earth System CoG is a web-based collaboration environment that combines data services with metadata and project management services. The environment is particularly suited to support software development and model intercomparison projects. CoG was recently used to support the National Climate Predictions and Projections Platform (NCPP) Quantitative Evaluation of Downscaling (QED-2013) workshop. QED-2013 was a workshop with a community approach for the objective, quantitative evaluation of techniques to downscale climate model predictions and projections. This paper presents a brief introduction to CoG, QED-2013, and findings from an ethnographic evaluation of how CoG supported QED-2013. The QED-2013 workshop focused on real-world application problems drawn from several sectors and contributed to the informed use of downscaled data. This workshop is part of a larger effort by NCPP and partner organizations to develop a standardized evaluation framework for local and regional climate information. The main goals of QED-2013 were to a) coordinate efforts for quantitative evaluation, b) develop software infrastructure, c) develop a repository of information, d) develop translational and guidance information, e) identify and engage key user communities, and f) promote collaboration and interoperability. CoG was a key player in supporting QED-2013. NCPP was an early adopter of the CoG platform, providing valuable recommendations for overall development plus specific workshop-related requirements. New CoG features developed for QED-2013 included the ability to publish images and associated metadata contained within XML files to the associated data node and to combine both artifacts into an integrated display. The ability to modify data search facets into scientifically relevant groups and to display dynamic lists of workshop participants and their interests was also added to the interface.
During the workshop, the QED-2013 project page on CoG provided meeting logistics, meeting materials, shared spaces and resources, and data services. The evaluation of CoG tools was focused on the usability of products rather than metrics, such as number of independent hits to a web site. We wanted to know how well CoG tools supported the workshop participants and their tasks. For instance, what workshop tasks could be performed within the CoG environment? Were these tasks performed there or with alternative tools? And do participants plan to use the tools after the workshop for other projects? Ultimately, we wanted to know if CoG contributed to NCPP's need for a flexible and extensible evaluation platform, and did it support the integration of dispersed resources, quantitative evaluation of climate projections, and the generation and management of interpretive information. Evaluation of the workshop and activity occurred during, at the end of, and after the workshop. During the workshop, an ethnographer observed and participated in the workshop, and collected short, semi-structured interviews with a subset of the participants. At the end of the workshop, an exit survey was administered to all the participants. After the workshop, a variety of methods were used to capture the impact of the workshop.
NASA Technical Reports Server (NTRS)
Bubenheim, David L.; Schlick, Greg; Genovese, Vanessa; Wilson, Kenneth D.
2018-01-01
Management of aquatic weeds in complex watersheds and river systems presents many challenges to assessment, planning and implementation of management practices for floating and submerged aquatic invasive plants. The Delta Region Areawide Aquatic Weed Project (DRAAWP), a USDA-sponsored area-wide project, is working to enhance planning, decision-making and operational efficiency in the California Sacramento-San Joaquin Delta. Satellite and airborne remote sensing are used to map area coverage and biomass density, direct operations, and assess management impacts on plant communities. Archived satellite records enable review of results following previous climate and management events and aid in developing long-term strategies. Examples of remote sensing aiding the effectiveness of aquatic weed management will be discussed, as well as areas for potential technological improvement. Modeling at local and watershed scales using the SWAT modeling tool provides insight into land-use effects on water quality (described by Zhang in the same Symposium). Controlled environment growth studies have been conducted to quantify the growth response of invasive aquatic plants to water quality and other environmental factors. Environmental variability occurs across a range of time scales, from long-term climate and seasonal trends to short-term flow-mediated variations. Response times of invasive species are examined at time scales of weeks, days, and hours, using a combination of study durations and growth assessment techniques to assess the effects of water quality, temperature (air and water), nitrogen, phosphorus, and light. These studies provide response parameters for plant growth models and feed into the management and economic models associated with aquatic weed management.
Plant growth models are to be informed by remote sensing and applied spatially across the Delta to balance location and type of aquatic plant, growth response to altered environments and phenology. Initial utilization of remote sensing tools developed for mapping of aquatic invasive plants improved operational efficiency in management practices. These assessment methods provide a comprehensive and quantitative view of aquatic invasive plants communities in the California Delta.
The Multi-Sector Sustainability Browser (MSSB): A Tool for ...
The MSSB is the first and only decision support tool containing information from scientific literature and technical reports that can be used to develop and implement sustainability initiatives. The MSSB is designed to assist individuals and communities in understanding the impacts that the four key dimensions of sustainability - Land Use, Buildings and Infrastructure, Transportation, and Materials Management - can have on human health, the economy, the built environment and natural environments. The MSSB has the following capabilities: a. Displays and describes linkages between the four major sustainability concepts (Land Use, Buildings and Infrastructure, Transportation, and Materials Management) and their subordinate concepts. b. Displays and lists literature sources and references (including weblinks where applicable) providing information about each major sustainability concept and its associated subordinate concepts. c. Displays and lists quantitative data related to each major sustainability concept and its associated subordinate concepts, with weblinks where applicable.The MSSB serves as a ‘visual database’, allowing users to: investigate one or more of the four key sustainability dimensions; explore available scientific literature references, and; assess potential impacts of sustainability activities. The MSSB reduces the amount of time and effort required to assess the state of sustainability science and engineering research pertaining
Fernandez-Piquer, Judith; Bowman, John P; Ross, Tom; Estrada-Flores, Silvia; Tamplin, Mark L
2013-07-01
Vibrio parahaemolyticus can accumulate and grow in oysters stored without refrigeration, representing a potential food safety risk. High temperatures during oyster storage can lead to an increase in total viable bacteria counts, decreasing product shelf life. Therefore, a predictive tool that allows the estimation of both V. parahaemolyticus populations and total viable bacteria counts in parallel is needed. A stochastic model was developed to quantitatively assess the populations of V. parahaemolyticus and total viable bacteria in Pacific oysters for six different supply chain scenarios. The stochastic model encompassed operations from oyster farms through consumers and was built using risk analysis software. Probabilistic distributions and predictions for the percentage of Pacific oysters containing V. parahaemolyticus and high levels of viable bacteria at the point of consumption were generated for each simulated scenario. This tool can provide valuable information about V. parahaemolyticus exposure and potential control measures and can help oyster companies and regulatory agencies evaluate the impact of product quality and safety during cold chain management. If coupled with suitable monitoring systems, such models could enable preemptive action to be taken to counteract unfavorable supply chain conditions.
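A stochastic supply-chain model of this kind can be sketched as a Monte Carlo simulation in which storage temperatures and durations are drawn from distributions and bacterial growth follows a simple square-root-type (Ratkowsky-style) rate model. Everything below, including all parameters and distributions, is illustrative, not the published model:

```python
import random

def growth_log10(temp_c, hours, b=0.026, t_min=5.0):
    """Square-root-type growth model: log10 increase over the
    storage period; no growth below the minimum temperature."""
    if temp_c <= t_min:
        return 0.0
    rate = (b * (temp_c - t_min)) ** 2  # log10 CFU per hour
    return rate * hours

def simulate_chain(n=10_000, seed=42):
    """Fraction of simulated oysters whose final V. parahaemolyticus
    level exceeds a hypothetical 4 log10 CFU/g limit after two
    supply-chain stages with stochastic temperatures and durations."""
    rng = random.Random(seed)
    exceed = 0
    for _ in range(n):
        level = rng.gauss(1.0, 0.5)  # initial log10 CFU/g at harvest
        # unrefrigerated transport: warm, short
        level += growth_log10(rng.uniform(15, 30), rng.uniform(2, 10))
        # retail storage: cool, long
        level += growth_log10(rng.gauss(7, 2), rng.uniform(24, 72))
        if level > 4.0:
            exceed += 1
    return exceed / n

print(f"{simulate_chain():.1%} of simulated oysters exceed 4 log10 CFU/g")
```

Re-running the simulation with different stage parameters (e.g. refrigerated transport) is how such a model compares supply-chain scenarios and evaluates candidate control measures.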
Design and implementation of ergonomic performance measurement system at a steel plant in India.
Ray, Pradip Kumar; Tewari, V K
2012-01-01
Management of Tata Steel, the largest private-sector steel making company of India, felt the need to develop a framework for determining the levels of ergonomic performance at its different workplaces. The objectives of the study were manifold: to identify and characterize the ergonomic variables of a given worksystem with regard to work efficiency, operator safety, and working conditions, and to design a comprehensive Ergonomic Performance Indicator (EPI) for quantitative determination of the ergonomic status and maturity of a given worksystem. The IIT Kharagpur study team consisted of three faculty members, and the management of Tata Steel formed an eleven-member team for implementation of the EPI model. In order to design and develop the EPI model with the full participation and understanding of the personnel concerned at Tata Steel, a three-phase action plan for the project was prepared: preparation and data collection, detailed structuring, and validation of the EPI model. Identification of ergonomic performance factors, development of an interaction matrix, design of the assessment tool, and testing and validation of the assessment tool (EPI) in varied situations were the major steps in these phases. The case study discusses the EPI model and its applications in detail.
An overview of quantitative approaches in Gestalt perception.
Jäkel, Frank; Singh, Manish; Wichmann, Felix A; Herzog, Michael H
2016-09-01
Gestalt psychology is often criticized as lacking quantitative measurements and precise mathematical models. While this is true of the early Gestalt school, today there are many quantitative approaches in Gestalt perception, and the special issue of Vision Research "Quantitative Approaches in Gestalt Perception" showcases the current state of the art. In this article we give an overview of these current approaches. For example, ideal observer models are one of the standard quantitative tools in vision research, and there is a clear trend to apply this tool to Gestalt perception and thereby integrate Gestalt perception into mainstream vision research. More generally, Bayesian models, long popular in other areas of vision research, are increasingly being employed to model perceptual grouping as well. Thus, although experimental and theoretical approaches to Gestalt perception remain quite diverse, we are hopeful that these quantitative trends will pave the way for a unified theory. Copyright © 2016 Elsevier Ltd. All rights reserved.
Diagnosis and treatment of superficial esophageal cancer.
Barret, Maximilien; Prat, Frédéric
2018-01-01
Endoscopy allows for the screening, early diagnosis, treatment and follow up of superficial esophageal cancer. Endoscopic submucosal dissection has become the gold standard for the resection of superficial squamous cell neoplasia. Combinations of endoscopic mucosal resection and radiofrequency ablation are the mainstay of the management of Barrett's associated neoplasia. However, protruded, non-lifting or large lesions may be better managed by endoscopic submucosal dissection. Novel ablation tools, such as argon plasma coagulation with submucosal lifting and cryoablation balloons, are being developed for the treatment of residual Barrett's esophagus, since iatrogenic strictures still hamper the development of extensive circumferential resections in the esophagus. Optimal surveillance modalities after endoscopic resection are still to be determined. The assessment of the risk of lymph-node metastases, as well as of the need for additional treatments based on qualitative and quantitative histological criteria, balanced against the patient's condition, requires a dedicated multidisciplinary team decision process. The need for trained endoscopists, expert pathologists and surgeons, and specialized multidisciplinary meetings underlines the role of expert centers in the management of superficial esophageal cancer.
Parmagnani, Federica; Ranzi, Andrea; Ancona, Carla; Angelini, Paola; Chiusolo, Monica; Cadum, Ennio; Lauriola, Paolo; Forastiere, Francesco
2014-01-01
The Project Epidemiological Surveillance of Health Status of Resident Population Around the Waste Treatment Plants (SESPIR) included five Italian regions (Emilia-Romagna, Piedmont, Lazio, Campania, and Sicily) and the National Institute of Health in the period 2010-2013. SESPIR was funded by the Ministry of Health as part of the National centre for diseases prevention and control (CCM) programme of 2010, with the general objective of providing methods and operational tools for the implementation of surveillance systems for waste and health, aimed at assessing the impact of the municipal solid waste (MSW) treatment cycle on the health of the population. The specific objective was to assess the health impacts resulting from the presence of disposal facilities under different regional scenarios of waste management. Suitable tools for the integrated assessment of environmental and health impacts were developed and applied, using current demographic, environmental and health data. In this article, the methodology used for the quantitative estimation of the impact on the health of populations living near incinerators, landfills and mechanical biological treatment plants is shown, together with the analysis of three temporal scenarios: the first related to the plants existing in the period 2008-2009 (baseline), the second based on regional plans, and the third referring to a virtuous MSW management policy based on the reduction of produced waste and an intensive recovery policy.
de Rauville, Ingrid; Chetty, Sandhya; Pahl, Jenny
2006-01-01
Word finding difficulties frequently found in learners with language learning difficulties (Casby, 1992) are an integral part of Speech-Language Therapists' management role when working with learning disabled children. This study investigated current management for word finding difficulties by 70 Speech-Language Therapists in South African remedial schools. A descriptive survey design using a quantitative and qualitative approach was used. A questionnaire and follow-up focus group discussion were used to collect data. Results highlighted the use of the Renfrew Word Finding Scale (Renfrew, 1972, 1995) as the most frequently used formal assessment tool. Language sample analysis and discourse analysis were the most frequently used informal assessment procedures. Formal intervention programmes were generally not used. Phonetic, phonemic or phonological cueing were the most frequently used therapeutic strategies. The authors note strengths and raise concerns about current management for word finding difficulties in South African remedial schools, particularly in terms of bilingualism. Opportunities are highlighted regarding the development of assessment and intervention measures relevant to the diverse learning disabled population in South Africa.
Washington, Karla T; Wilkes, Chelsey M; Rakes, Christopher R; Otten, Sheila J; Parker Oliver, Debra; Demiris, George
2018-05-04
Family caregivers (FCGs) face numerous stressors and are at heightened risk of psychological distress. While theoretical explanations exist linking caregiving stressors with outcomes such as anxiety and depression, limited testing of these theories has occurred among FCGs of patients nearing the end of life. Researchers sought to evaluate mediational relationships among the burden experienced by hospice FCGs because of symptom management demands, caregivers' coping responses, and caregivers' psychological distress. Quantitative data for this descriptive exploratory study were collected through a survey. Hypothesized relationships among caregiver variables were examined with structural equation modeling. Respondents were FCGs (N = 228) of hospice patients receiving services from a large, non-profit community hospice in the Mid-Southern United States. Burden associated with managing hospice patients' psychological symptoms was shown to predict psychological distress for FCGs. Caregivers' use of escape-avoidance coping responses mediated this relationship. Results suggest that FCGs would benefit from additional tools to address patients' psychological symptoms at end of life. When faced with psychological symptom management burden, caregivers need a range of coping skills as alternatives to escape-avoidance coping.
[The balanced scorecard. "Tool or toy" in hospitals].
Brinkmann, A; Gebhard, F; Isenmann, R; Bothner, U; Mohl, U; Schwilk, B
2003-10-01
The change in hospital funding with diagnosis related groups (DRG), medical advances as well as demographic changes will impose new quantitative and qualitative standards on German hospitals. Increasing costs and competition in the health care sector require new and innovative strategies for resource management. Today's policy is mainly defined by rationing and an intensified workload. The introduction of DRGs will presumably further constrict management perspectives to purely financial aspects. However, to ensure future development, compassionate services and the continued existence of hospitals, a balance of seemingly conflicting perspectives, such as finance, customer, process, and learning and growth, is of utmost importance. Here, doctors and nurses in leading positions should play a key role in changing management practice. For several years the balanced scorecard has been successfully used as a strategic management concept in non-profit organizations, including in the health care sector. This concept complies with the multidimensional purposes of hospitals and focuses on policy deployment. Finally, it gives the opportunity to involve all employees in the original development, communication and execution of a balanced scorecard approach.
NASA Astrophysics Data System (ADS)
Bartholomeus, H.; Kooistra, L.
2012-04-01
For quantitative estimation of soil properties by means of remote sensing, hyperspectral data are often used. However, these data are scarce and expensive, which prohibits wider implementation of the developed techniques in agricultural management. For precision agriculture, observations at a high spatial resolution are required. Colour aerial photographs at this scale are widely available and can be acquired at no or very low cost. Therefore, we investigated whether publicly available aerial photographs can be used to a) automatically delineate management zones and b) estimate levels of organic carbon spatially. We selected three study areas within the Netherlands that cover a large variance in soil type (peat, sand, and clay). For the fields of interest, RGB aerial photographs with a spatial resolution of 50 cm were obtained from a publicly available data provider. Further pre-processing consisted of geo-referencing only. Since the images originate from different sources and were potentially acquired under unknown illumination conditions, the exact radiometric properties of the data are unknown. Therefore, we used spectral indices to emphasize the differences in reflectance and normalize for differences in radiometry. To delineate management zones we used image segmentation techniques, with the derived indices as input. Comparison with the management zone maps used by the farmers shows good correspondence. Regression analysis between a number of soil properties and the derived indices shows that organic carbon is the major explanatory variable for differences in index values within the fields. However, the relations do not hold across large regions, indicating that local models will have to be used, a problem that is also still relevant for hyperspectral remote sensing data. With this research, we show that low-cost aerial photographs can be a valuable tool for quantitative analysis of organic carbon and automatic delineation of management zones.
Since many of these data are publicly available, this offers great possibilities for implementing remote sensing techniques in agricultural management.
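The abstract does not name the specific RGB indices used, so as an illustration, one widely used choice is the excess-green index (ExG = 2g - r - b on chromaticity-normalized bands), which can then be thresholded into candidate zones. The pixel values and threshold below are hypothetical, and real segmentation would also enforce spatial contiguity:

```python
def excess_green(r, g, b):
    """Excess-green index ExG = 2g - r - b computed on
    chromaticity-normalized bands, which partly compensates for
    unknown illumination across image sources."""
    total = r + g + b
    if total == 0:
        return 0.0
    rn, gn, bn = r / total, g / total, b / total
    return 2 * gn - rn - bn

def delineate_zones(image, threshold):
    """Split pixels into two candidate management zones by thresholding
    the index. `image` is a list of rows of (r, g, b) tuples."""
    return [[1 if excess_green(*px) > threshold else 0 for px in row]
            for row in image]

# tiny hypothetical 2x2 image
img = [[(90, 120, 60), (100, 90, 80)],
       [(80, 140, 50), (110, 95, 85)]]
zones = delineate_zones(img, threshold=0.1)
print(zones)  # → [[1, 0], [1, 0]]
```

Normalizing by the band sum before differencing is what makes such an index usable on imagery with unknown radiometric calibration, which is the situation the study describes.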
The application of quantitative risk assessment to microbial food safety risks.
Jaykus, L A
1996-01-01
Regulatory programs and guidelines for the control of foodborne microbial agents have existed in the U.S. for nearly 100 years. However, increased awareness of the scope and magnitude of foodborne disease, as well as the emergence of previously unrecognized human pathogens transmitted via the foodborne route, have prompted regulatory officials to consider new and improved strategies to reduce the health risks associated with pathogenic microorganisms in foods. Implementation of these proposed strategies will involve definitive costs for a finite level of risk reduction. While regulatory decisions regarding the management of foodborne disease risk have traditionally been made with the aid of the scientific community, a formal conceptual framework for the evaluation of health risks from pathogenic microorganisms in foods is warranted. Quantitative risk assessment (QRA), which is formally defined as the technical assessment of the nature and magnitude of a risk caused by a hazard, provides such a framework. Reproducing microorganisms in foods present a particular challenge to QRA because both their introduction and numbers may be affected by numerous factors within the food chain, with all of these factors representing significant stages in food production, handling, and consumption, in a farm-to-table type of approach. The process of QRA entails four designated phases: (1) hazard identification, (2) exposure assessment, (3) dose-response assessment, and (4) risk characterization. Specific analytical tools are available to accomplish the analyses required for each phase of the QRA. The purpose of this paper is to provide a description of the conceptual framework for quantitative microbial risk assessment within the standard description provided by the National Academy of Sciences (NAS) paradigm.
Each of the sequential steps in QRA is discussed in detail, with information on current applications, tools for conducting the analyses, and methodological and/or data limitations to date. Conclusions include a brief discussion of subsequent uncertainty and risk analysis methodologies, and a commentary on present and future applications of QRA in the management of the public health risks associated with the presence of pathogenic microorganisms in the food supply.
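A toy numerical version of the four-phase structure (hazard identification, exposure assessment, dose-response assessment, risk characterization) can be sketched as a Monte Carlo calculation. The exponential dose-response form is a standard choice in microbial risk assessment, but every parameter value here is illustrative, not taken from the paper:

```python
import math
import random

def qra_sketch(n=20_000, r=1e-4, seed=1):
    """Toy quantitative microbial risk assessment for one pathogen
    (hazard identification is implicit). Exposure assessment: a
    lognormal dose per serving. Dose-response assessment: the
    exponential model P(ill) = 1 - exp(-r * dose). Risk
    characterization: the mean probability of illness per serving."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n):
        dose = rng.lognormvariate(2.0, 1.0)   # CFU ingested, illustrative
        total += 1.0 - math.exp(-r * dose)
    return total / n

risk = qra_sketch()
print(f"mean probability of illness per serving: {risk:.2e}")
```

In a full assessment, the dose distribution would itself be the output of a farm-to-table exposure model rather than a single assumed lognormal.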
Rigbolt, Kristoffer T G; Vanselow, Jens T; Blagoev, Blagoy
2011-08-01
Recent technological advances have made it possible to identify and quantify thousands of proteins in a single proteomics experiment. As a result of these developments, data analysis has become the bottleneck of proteomics experiments. To provide the proteomics community with a user-friendly platform for comprehensive analysis, inspection and visualization of quantitative proteomics data we developed the Graphical Proteomics Data Explorer (GProX). The program requires no special bioinformatics training, as all functions of GProX are accessible within its user-friendly graphical interface, which will be intuitive to most users. Basic features facilitate the uncomplicated management and organization of large data sets and complex experimental setups as well as the inspection and graphical plotting of quantitative data. These are complemented by readily available high-level analysis options such as database querying, clustering based on abundance ratios, feature enrichment tests for, e.g., GO terms, and pathway analysis tools. A number of plotting options for visualization of quantitative proteomics data are available, and most analysis functions in GProX create customizable high-quality graphical displays in both vector and bitmap formats. The generic import requirements allow data originating from essentially all mass spectrometry platforms, quantitation strategies and software to be analyzed in the program. GProX represents a powerful approach to proteomics data analysis, providing proteomics experimenters with a toolbox for bioinformatics analysis of quantitative proteomics data. The program is released as open source and can be freely downloaded from the project webpage at http://gprox.sourceforge.net.
Rigbolt, Kristoffer T. G.; Vanselow, Jens T.; Blagoev, Blagoy
2011-01-01
Recent technological advances have made it possible to identify and quantify thousands of proteins in a single proteomics experiment. As a result of these developments, data analysis has become the bottleneck of proteomics experiments. To provide the proteomics community with a user-friendly platform for comprehensive analysis, inspection and visualization of quantitative proteomics data we developed the Graphical Proteomics Data Explorer (GProX). The program requires no special bioinformatics training, as all functions of GProX are accessible within its user-friendly graphical interface, which will be intuitive to most users. Basic features facilitate the uncomplicated management and organization of large data sets and complex experimental setups as well as the inspection and graphical plotting of quantitative data. These are complemented by readily available high-level analysis options such as database querying, clustering based on abundance ratios, feature enrichment tests for, e.g., GO terms, and pathway analysis tools. A number of plotting options for visualization of quantitative proteomics data are available, and most analysis functions in GProX create customizable high-quality graphical displays in both vector and bitmap formats. The generic import requirements allow data originating from essentially all mass spectrometry platforms, quantitation strategies and software to be analyzed in the program. GProX represents a powerful approach to proteomics data analysis, providing proteomics experimenters with a toolbox for bioinformatics analysis of quantitative proteomics data. The program is released as open source and can be freely downloaded from the project webpage at http://gprox.sourceforge.net. PMID:21602510
Fernando, Irosh; Carter, Gregory
2016-02-01
There is a need for a simple and brief tool that can be used in routine clinical practice for the quantitative measurement of mental state across all diagnostic groups. The main utilities of such a tool would be to provide a global metric for the mental state examination and to monitor progression over time using this metric. We developed the mental state examination scale (MSES) and used it in an acute inpatient setting in routine clinical work to test its initial feasibility. Using a clinical case, the utility of the MSES is demonstrated in this paper. When managing the patient described, the MSES assisted the clinician in assessing the initial mental state, tracking the progress of recovery, and making timely treatment decisions by quantifying the components of the mental state examination. The MSES may enhance the quality of clinical practice for clinicians and potentially serve as an index of universal mental healthcare outcome that can be used in clinical practice, service evaluation, and healthcare economics. © The Royal Australian and New Zealand College of Psychiatrists 2015.
Kwei, Johnny; Halstead, Fenella D; Dretzke, Janine; Oppenheim, Beryl A; Moiemen, Naiem S
2015-11-06
Sepsis from burn injuries can result from colonisation of burn wounds, especially in large surface area burns. Reducing bacterial infection will reduce morbidity and mortality; mortality for severe burns can be as high as 15%. There are various quantitative and semi-quantitative techniques to monitor bacterial load on wounds. In the UK, burn wounds are typically monitored for the presence or absence of bacteria through the collection and culture of swabs, but no absolute count is obtained. Quantitative burn wound culture provides a measure of bacterial count and is gaining popularity in some countries. It is, however, more resource intensive, and evidence for its utility appears to be inconsistent. This systematic review therefore aims to assess the evidence on the utility and reliability of different quantitative microbiology techniques in terms of diagnosing or predicting clinical outcomes. Standard systematic review methods aimed at minimising bias will be employed for study identification, selection and data extraction. Bibliographic databases and ongoing trial registers will be searched and conference abstracts screened. Studies will be eligible if they are prospective studies or systematic reviews of burn patients (any age) for whom quantitative microbiology has been performed, whether or not it is compared to another method. Quality assessment will be based on quality assessment tools for diagnostic and prognostic studies and tailored to the review as necessary. Synthesis is likely to be primarily narrative, but meta-analysis may be considered where clinical and methodological homogeneity exists. Given the increasing use of quantitative methods, this is a timely systematic review, which will attempt to clarify the evidence base. As far as the authors are aware, it will be the first to address this topic. PROSPERO, CRD42015023903.
Quantitative evaluation of water quality in the coastal zone by remote sensing
NASA Technical Reports Server (NTRS)
James, W. P.
1971-01-01
Remote sensing as a tool in a waste management program is discussed. By monitoring both the pollution sources and the environmental quality, the interaction between the components of the estuarine system was observed. The need for in situ sampling is reduced with the development of improved calibrated, multichannel sensors. Remote sensing is used for: (1) pollution source determination, (2) mapping the influence zone of the waste source on water quality parameters, and (3) estimating the magnitude of the water quality parameters. Diffusion coefficients and circulation patterns can also be determined by remote sensing, along with subtle changes in vegetative patterns and density.
Translational benchmark risk analysis
Piegorsch, Walter W.
2010-01-01
Translational development – in the sense of translating a mature methodology from one area of application to another, evolving area – is discussed for the use of benchmark doses in quantitative risk assessment. Illustrations are presented with traditional applications of the benchmark paradigm in biology and toxicology, and also with risk endpoints that differ from traditional toxicological archetypes. It is seen that the benchmark approach can apply to a diverse spectrum of risk management settings. This suggests a promising future for this important risk-analytic tool. Extensions of the method to a wider variety of applications represent a significant opportunity for enhancing environmental, biomedical, industrial, and socio-economic risk assessments. PMID:20953283
MARZKE, MARY W.; MARZKE, R. F.
2000-01-01
The discovery of fossil hand bones from an early human ancestor at Olduvai Gorge in 1960, at the same level as primitive stone tools, generated a debate about the role of tools in the evolution of the human hand that has raged to the present day. Could the Olduvai hand have made the tools? Did the human hand evolve as an adaptation to tool making and tool use? The debate has been fueled by anatomical studies comparing living and fossil human and nonhuman primate hands, and by experimental observations. These have assessed the relative abilities of apes and humans to manufacture the Oldowan tools, but consensus has been hampered by disagreements about how to translate experimental data from living species into quantitative models for predicting the performance of fossil hands. Such models are now beginning to take shape as new techniques are applied to the capture, management and analysis of data on kinetic and kinematic variables ranging from hand joint structure, muscle mechanics, and the distribution and density of bone to joint movements and muscle recruitment during manipulative behaviour. The systematic comparative studies are highlighting a functional complex of features in the human hand facilitating a distinctive repertoire of grips that are apparently more effective for stone tool making than grips characterising various nonhuman primate species. The new techniques are identifying skeletal variables whose form may provide clues to the potential of fossil hominid hands for one-handed firm precision grips and fine precision manoeuvering movements, both of which are essential for habitual and effective tool making and tool use. PMID:10999274
New risk metrics and mathematical tools for risk analysis: Current and future challenges
DOE Office of Scientific and Technical Information (OSTI.GOV)
Skandamis, Panagiotis N., E-mail: pskan@aua.gr; Andritsos, Nikolaos, E-mail: pskan@aua.gr; Psomas, Antonios, E-mail: pskan@aua.gr
The current status of the food safety supply worldwide has led the Food and Agriculture Organization (FAO) and the World Health Organization (WHO) to establish Risk Analysis as the single framework for building food safety control programs. A series of guidelines and reports detailing the various steps in Risk Analysis, namely Risk Management, Risk Assessment and Risk Communication, is available. The Risk Analysis approach enables integration between operational food management systems, such as Hazard Analysis Critical Control Points, public health and governmental decisions. To that end, a series of new Risk Metrics has been established, as follows: i) the Appropriate Level of Protection (ALOP), which indicates the maximum number of illnesses in a population per annum, is defined by quantitative risk assessments and used to establish ii) the Food Safety Objective (FSO), which sets the maximum frequency and/or concentration of a hazard in a food at the time of consumption that provides or contributes to the ALOP. Given that ALOP is rather a metric of the tolerable public health burden (it addresses the total ‘failure’ that may be handled at a national level), it is difficult to interpret into control measures applied at the manufacturing level. Thus, a series of specific objectives and criteria for the performance of individual processes and products has been established, all of them assisting in the achievement of the FSO and hence the ALOP. In order to achieve the FSO, tools quantifying the effect of processes and intrinsic properties of foods on the survival and growth of pathogens are essential. In this context, predictive microbiology and risk assessment have offered important assistance to Food Safety Management.
Predictive modelling is the basis of exposure assessment, and the development of stochastic and kinetic models, which are also available in the form of Web-based applications (e.g., COMBASE and the Microbial Responses Viewer) or introduced into user-friendly software (e.g., the Seafood Spoilage Predictor), has advanced the use of information systems in food safety management. Such tools are updateable with new food-pathogen-specific models containing cardinal parameters and multiple dependent variables, including plate counts, concentrations of metabolic products, or even expression levels of certain genes. These tools may then further serve as decision-support tools to assist in product logistics, based on their scientifically based and "momentary" expressed spoilage and safety level.
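The FSO concept described above is commonly operationalized through the ICMSF relation H0 - sum(R) + sum(I) <= FSO, where H0 is the initial hazard level and R and I are the cumulative log reductions and increases along the chain. A minimal check of that inequality might look like the following; the numbers in the example are illustrative, not from the abstract:

```python
def meets_fso(h0, reductions, increases, fso):
    """Evaluate the ICMSF relation H0 - sum(R) + sum(I) <= FSO.
    All terms are in log10 CFU/g: h0 is the initial hazard level,
    reductions are log reductions from processing steps, increases
    are log increases from growth or recontamination. Returns the
    final level and whether the FSO is met."""
    final = h0 - sum(reductions) + sum(increases)
    return final, final <= fso

# illustrative: raw level 3 log, a 6-log cooking reduction,
# 1 log of growth during storage, FSO of -1 log CFU/g
level, ok = meets_fso(h0=3.0, reductions=[6.0], increases=[1.0], fso=-1.0)
print(level, ok)  # → -2.0 True
```

Performance objectives and performance criteria for individual steps are essentially budget lines within this same log-level accounting.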
New risk metrics and mathematical tools for risk analysis: Current and future challenges
NASA Astrophysics Data System (ADS)
Skandamis, Panagiotis N.; Andritsos, Nikolaos; Psomas, Antonios; Paramythiotis, Spyridon
2015-01-01
The current status of the food safety supply worldwide has led the Food and Agriculture Organization (FAO) and the World Health Organization (WHO) to establish Risk Analysis as the single framework for building food safety control programs. A series of guidelines and reports detailing the various steps in Risk Analysis, namely Risk Management, Risk Assessment and Risk Communication, is available. The Risk Analysis approach enables integration between operational food management systems, such as Hazard Analysis Critical Control Points, public health and governmental decisions. To that end, a series of new Risk Metrics has been established, as follows: i) the Appropriate Level of Protection (ALOP), which indicates the maximum number of illnesses in a population per annum, is defined by quantitative risk assessments and used to establish ii) the Food Safety Objective (FSO), which sets the maximum frequency and/or concentration of a hazard in a food at the time of consumption that provides or contributes to the ALOP. Given that ALOP is rather a metric of the tolerable public health burden (it addresses the total 'failure' that may be handled at a national level), it is difficult to interpret into control measures applied at the manufacturing level. Thus, a series of specific objectives and criteria for the performance of individual processes and products has been established, all of them assisting in the achievement of the FSO and hence the ALOP. In order to achieve the FSO, tools quantifying the effect of processes and intrinsic properties of foods on the survival and growth of pathogens are essential. In this context, predictive microbiology and risk assessment have offered important assistance to Food Safety Management.
Predictive modelling is the basis of exposure assessment, and the development of stochastic and kinetic models, which are also available in the form of Web-based applications (e.g., COMBASE and the Microbial Responses Viewer) or introduced into user-friendly software (e.g., the Seafood Spoilage Predictor), has advanced the use of information systems in food safety management. Such tools are updateable with new food-pathogen-specific models containing cardinal parameters and multiple dependent variables, including plate counts, concentrations of metabolic products, or even expression levels of certain genes. These tools may then further serve as decision-support tools to assist in product logistics, based on their scientifically based and "momentary" expressed spoilage and safety level.
Spatially explicit multi-criteria decision analysis for managing vector-borne diseases
2011-01-01
The complex epidemiology of vector-borne diseases creates significant challenges in the design and delivery of prevention and control strategies, especially in light of rapid social and environmental changes. Spatial models for predicting disease risk based on environmental factors such as climate and landscape have been developed for a number of important vector-borne diseases. The resulting risk maps have proven value for highlighting areas for targeting public health programs. However, these methods generally only offer technical information on the spatial distribution of disease risk itself, which may be incomplete for making decisions in a complex situation. In prioritizing surveillance and intervention strategies, decision-makers often also need to consider spatially explicit information on other important dimensions, such as the regional specificity of public acceptance, population vulnerability, resource availability, intervention effectiveness, and land use. There is a need for a unified strategy for supporting public health decision making that integrates available data for assessing spatially explicit disease risk, with other criteria, to implement effective prevention and control strategies. Multi-criteria decision analysis (MCDA) is a decision support tool that allows for the consideration of diverse quantitative and qualitative criteria using both data-driven and qualitative indicators for evaluating alternative strategies with transparency and stakeholder participation. Here we propose a MCDA-based approach to the development of geospatial models and spatially explicit decision support tools for the management of vector-borne diseases. We describe the conceptual framework that MCDA offers as well as technical considerations, approaches to implementation and expected outcomes. We conclude that MCDA is a powerful tool that offers tremendous potential for use in public health decision-making in general and vector-borne disease management in particular. 
PMID:22206355
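The weighted aggregation at the core of MCDA can be sketched as a simple weighted sum over alternatives. In this minimal sketch the zones, criteria, weights, and scores are all hypothetical, invented for illustration rather than taken from the study:

```python
# Minimal weighted-sum MCDA sketch (illustrative only; criteria, weights,
# and scores are hypothetical, not taken from the study).

def mcda_rank(alternatives, weights):
    """Score each alternative as the weighted sum of its normalized criteria."""
    assert abs(sum(weights.values()) - 1.0) < 1e-9, "weights must sum to 1"
    scores = {}
    for name, criteria in alternatives.items():
        scores[name] = sum(weights[c] * v for c, v in criteria.items())
    # Highest score first
    return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)

# Hypothetical intervention zones scored 0-1 on three criteria
zones = {
    "zone_A": {"disease_risk": 0.9, "public_acceptance": 0.4, "resources": 0.6},
    "zone_B": {"disease_risk": 0.5, "public_acceptance": 0.8, "resources": 0.7},
}
weights = {"disease_risk": 0.5, "public_acceptance": 0.3, "resources": 0.2}
ranking = mcda_rank(zones, weights)
```

In a spatially explicit setting the same aggregation would run per map cell; the transparency the abstract emphasizes comes from the weights being an explicit, inspectable input.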
Development of a Biological Science Quantitative Reasoning Exam (BioSQuaRE)
ERIC Educational Resources Information Center
Stanhope, Liz; Ziegler, Laura; Haque, Tabassum; Le, Laura; Vinces, Marcelo; Davis, Gregory K.; Zieffler, Andrew; Brodfuehrer, Peter; Preest, Marion; Belitsky, Jason M.; Umbanhowar, Charles, Jr.; Overvoorde, Paul J.
2017-01-01
Multiple reports highlight the increasingly quantitative nature of biological research and the need to innovate means to ensure that students acquire quantitative skills. We present a tool to support such innovation. The Biological Science Quantitative Reasoning Exam (BioSQuaRE) is an assessment instrument designed to measure the quantitative…
Shrader, Sarah; Farland, Michelle Z; Danielson, Jennifer; Sicat, Brigitte; Umland, Elena M
2017-08-01
Objective. To identify and describe the available quantitative tools that assess interprofessional education (IPE) relevant to pharmacy education. Methods. A systematic approach was used to identify quantitative IPE assessment tools relevant to pharmacy education. The search strategy included the National Center for Interprofessional Practice and Education Resource Exchange (Nexus) website, a systematic search of the literature, and a manual search of journals deemed likely to include relevant tools. Results. The search identified a total of 44 tools from the Nexus website, 158 abstracts from the systematic literature search, and 570 abstracts from the manual search. A total of 36 assessment tools met the criteria to be included in the summary, and their application to IPE relevant to pharmacy education was discussed. Conclusion. Each of the tools has advantages and disadvantages. No single comprehensive tool exists to fulfill assessment needs. However, numerous tools are available that can be mapped to IPE-related accreditation standards for pharmacy education.
Brockelman, Karin F; Scheyett, Anna M
2015-12-01
Universities across the country struggle with the legal and ethical dilemmas of how to respond when a student shows symptoms of serious mental illness. This mixed-method study provides information on faculty knowledge of mental health problems in students, their use of available accommodations and strategies, and their willingness to accept psychiatric advance directives (PADs) as helpful interventions for managing student crises. Participants were 168 faculty members at a large, public, Southern university. A web-based survey was used to collect quantitative self-report data as well as qualitative data in the form of open-ended questions. Quantitative data are presented with descriptive statistics. Qualitative data were analyzed using thematic analysis. The majority of faculty surveyed have an overall supportive stance and are willing to provide accommodations to students with a mental illness. The most common advantage faculty see in a PAD is support of student autonomy and choice, and the primary concern voiced about PADs is that students with mental illness will have poor judgment regarding the contents of the PADs they create. PADs may be effective recovery tools to help university students with mental illnesses manage crises and attain stability and academic success. For PADs to be effective, university faculty and administration will need to understand mental illnesses, the strategies students need to manage mental health crises, and how PADs can play a role in supporting students. (c) 2015 APA, all rights reserved.
NASA Astrophysics Data System (ADS)
Crema, Stefano; Schenato, Luca; Goldin, Beatrice; Marchi, Lorenzo; Cavalli, Marco
2014-05-01
The increased interest in sediment connectivity has brought the geomorphologists' community to focus on sediment fluxes as a key process (Cavalli et al., 2013; Heckmann and Schwanghart, 2013). The challenge of dealing with erosion-related processes in alpine catchments is of primary relevance for different fields of investigation and application, including, but not limited to, natural hazards, hydraulic structure design, ecology and stream restoration. The present work focuses on the development of a free tool for sediment connectivity assessment as described in Cavalli et al. (2013), introducing some novel improvements. The choice of free software is motivated by the need to widen access and improve participation beyond the restrictions on algorithm customization typical of commercial software. Two features further enhance the tool: because it is completely free and adopts a user-friendly interface, its target audience includes researchers as well as stakeholders (e.g., local managers and civil protection authorities in charge of setting intervention priorities in the territory); and because it is written in the Python programming language, it can exploit optimized algorithms for handling high-resolution DEMs (Digital Elevation Models) and for implementing propagation workflows. These two factors make the tool computationally competitive with the most recent commercial GIS products. The overall goal of this tool is to support the analysis of sediment connectivity, facing the challenge of widening, as much as possible, the users' community among scientists and stakeholders. This aspect is crucial, as future improvements of the tool will benefit from user feedback, refining the quantitative assessment of sediment connectivity as a major input for the optimal management of mountain areas. References: Cavalli, M., Trevisani, S., Comiti, F., Marchi, L., 2013.
Geomorphometric assessment of spatial sediment connectivity in small Alpine catchments. Geomorphology 188, 31-41. Heckmann, T., Schwanghart, W., 2013. Geomorphic coupling and sediment connectivity in an alpine catchment - Exploring sediment cascades using graph theory. Geomorphology 182, 89-103.
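The connectivity index referenced above is commonly written as IC = log10(D_up / D_dn), with an upslope component D_up = Wbar * Sbar * sqrt(A) and a downslope component summed along the flow path. As a hedged sketch of that arithmetic (the cell values, weights, and slopes below are invented, and this simplifies the published formulation rather than reproducing the tool's code):

```python
import math

# Sketch of the index of connectivity (IC) in the form popularized by
# Borselli et al. (2008) and adapted by Cavalli et al. (2013):
#   IC = log10(D_up / D_dn),  D_up = Wbar * Sbar * sqrt(A),
#   D_dn = sum(d_i / (W_i * S_i)) along the downslope path to the target.
# All numbers below are hypothetical, for illustration only.

def index_of_connectivity(w_mean, s_mean, area_m2, downslope_cells):
    d_up = w_mean * s_mean * math.sqrt(area_m2)          # upslope potential
    d_dn = sum(d / (w * s) for d, w, s in downslope_cells)  # path impedance
    return math.log10(d_up / d_dn)

# Hypothetical upslope area and a three-cell downslope path
# (cell length d in m, weighting factor W, slope S)
ic = index_of_connectivity(
    w_mean=0.6, s_mean=0.3, area_m2=1.0e4,
    downslope_cells=[(5.0, 0.6, 0.25), (5.0, 0.5, 0.30), (5.0, 0.7, 0.20)],
)
```

Higher (less negative) IC means a cell is better connected to the target; the tool computes this per DEM cell, which is where the optimized raster handling mentioned above matters.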
An Integrated Risk Management Model for Source Water Protection Areas
Chiueh, Pei-Te; Shang, Wei-Ting; Lo, Shang-Lien
2012-01-01
Watersheds are recognized as the most effective management unit for the protection of water resources. For surface water supplies that use water from upstream watersheds, evaluating threats to water quality and implementing a watershed management plan are crucial for maintaining a drinking water supply that is safe for humans. The aim of this article is to establish a risk assessment model that provides basic information for identifying critical pollutants and areas at high risk for degraded water quality. In this study, a quantitative risk model that uses hazard quotients for each water quality parameter was combined with a qualitative risk model that uses the relative risk level of potential pollution events in order to characterize the current condition and potential risk of watersheds providing drinking water. In a case study of the Taipei Source Water Area in northern Taiwan, total coliforms and total phosphorus were the top two pollutants of concern. Intensive tea-growing and recreational activities around the riparian zone may contribute the greatest pollution to the watershed. Our risk assessment tool may be enhanced by developing, recording, and updating information on pollution sources in the water supply watersheds. Moreover, management authorities could use the resultant information to create watershed risk management plans. PMID:23202770
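A hazard quotient divides a measured concentration by a benchmark value; HQ > 1 flags a parameter of concern. A minimal sketch of that screening step, with hypothetical parameters and numbers rather than the case-study values:

```python
# Hazard-quotient sketch: HQ = measured concentration / benchmark.
# HQ > 1 flags a parameter of concern. Parameter names and numbers are
# hypothetical, not the Taipei case-study values.

def hazard_quotients(measured, benchmarks):
    return {p: measured[p] / benchmarks[p] for p in measured}

measured = {"total_coliforms": 450.0, "total_phosphorus": 0.08, "nitrate": 2.0}
benchmarks = {"total_coliforms": 200.0, "total_phosphorus": 0.05, "nitrate": 10.0}

hq = hazard_quotients(measured, benchmarks)
# Parameters exceeding their benchmark, worst first
of_concern = sorted((p for p, q in hq.items() if q > 1.0),
                    key=lambda p: hq[p], reverse=True)
```

The study layers a qualitative relative-risk ranking of pollution events on top of this quantitative screen; the sketch covers only the quotient step.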
The clinical utility of posturography.
Visser, Jasper E; Carpenter, Mark G; van der Kooij, Herman; Bloem, Bastiaan R
2008-11-01
Postural instability and falls are common and devastating features of ageing and many neurological, visual, vestibular or orthopedic disorders. Current management of these problems is hampered by the subjective and variable nature of the available clinical balance measures. In this narrative review, we discuss the clinical utility of posturography as a more objective and quantitative measure of balance and postural instability, focusing on several areas where clinicians presently experience the greatest difficulties in managing their patients: (a) to make an appropriate differential diagnosis in patients presenting with falls or balance impairment; (b) to reliably identify those subjects who are at risk of falling; (c) to objectively and quantitatively document the outcome of therapeutic interventions; and (d) to gain a better pathophysiological understanding of postural instability and falls, as a basis for the development of improved treatment strategies to prevent falling. In each of these fields, posturography offers several theoretical advantages and, when applied correctly, provides a useful tool to gain a better understanding of pathophysiological mechanisms in patients with balance disorders, at the group level. However, based on the available evidence, none of the existing techniques is currently able to significantly influence the clinical decision making in individual patients. We critically review the shortcomings of posturography as it is presently used, and conclude with several recommendations for future research.
Robinson-Papp, Jessica; George, Mary Catherine; Wongmek, Arada; Nmashie, Alexandra; Merlin, Jessica S; Ali, Yousaf; Epstein, Lawrence; Green, Mark; Serban, Stelian; Sheth, Parag; Simpson, David M
2015-09-01
The extent to which patients take chronic pain medications as prescribed is not well studied, and there are no generally agreed-upon measures. The Quantitative Analgesic Questionnaire (QAQ) is a new instrument designed to comprehensively document patient-reported medication use, generate scores to quantify it (by individual drug, class, and/or overall), and compare it (qualitatively and/or quantitatively) to the regimen as prescribed. The aim of this study was to describe the development and preliminary validation of the QAQ. The QAQ was studied in a convenience sample of 149 HIV-infected participants. We found that the QAQ scores computed for participants' chronic pain medication regimens were valid based on their correlation with 1) patient-reported pain intensity (r = 0.38; P < 0.001) and 2) experienced pain management physicians' independent quantification of the regimens (r = 0.89; P < 0.001). The QAQ also demonstrated high interrater reliability (r = 0.957; P < 0.001). Detailed examination of the QAQ data in a subset of 34 participants demonstrated that the QAQ revealed suboptimal adherence in 44% of participants and contained information that would not have been gleaned from review of the medical record alone in 94%, including use of over-the-counter medications and quantification of "as needed" dosing. The QAQ also was found to be useful in quantifying change in the medication regimen over time, capturing a change in 50% of the participants from baseline to eight week follow-up. The QAQ is a simple tool that can facilitate understanding of patient-reported chronic pain medication regimens, including calculation of percent adherence and generation of quantitative scores suitable for estimating and tracking change in medication use over time. Copyright © 2015 American Academy of Hospice and Palliative Medicine. Published by Elsevier Inc. All rights reserved.
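One score the abstract mentions, percent adherence, reduces to comparing patient-reported daily doses against the prescribed regimen. A hedged sketch of that comparison (the QAQ's actual scoring rules are more elaborate; the drug names and doses here are invented):

```python
# Percent-adherence sketch in the spirit of the QAQ: compare patient-reported
# daily doses against the prescribed regimen, per drug and overall.
# Drug names and doses are hypothetical, not from the study.

def percent_adherence(reported_mg, prescribed_mg):
    """Reported daily dose as a percentage of the prescribed daily dose."""
    per_drug = {d: 100.0 * reported_mg[d] / prescribed_mg[d]
                for d in prescribed_mg}
    overall = sum(per_drug.values()) / len(per_drug)
    return per_drug, overall

reported = {"gabapentin": 900.0, "ibuprofen": 400.0}
prescribed = {"gabapentin": 1800.0, "ibuprofen": 400.0}
per_drug, overall = percent_adherence(reported, prescribed)
```

A score well below 100% corresponds to the suboptimal adherence the study detected in 44% of participants; tracking the score over time captures regimen change, as in the baseline-to-eight-week comparison.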
NASA Astrophysics Data System (ADS)
Necefer, Len Edward
Decision-making surrounding pathways of future energy resource management is complex and requires balancing tradeoffs among multiple environmental, social, economic, and technical outcomes. Technical decision aids can provide a framework for informed decision making, allowing individuals to better understand the tradeoffs between resources, technology, energy services, and prices. While technical decision aids have made significant advances in evaluating these quantitative aspects of energy planning and performance, they have not been designed to incorporate human factors, such as preferences and behavior that are informed by cultural values. Incorporating cultural values into decision tools can not only provide an improved decision framework for the Navajo Nation, but also generate new insights into how these perspectives can improve decision making on energy resources. Ensuring these aids are a cultural fit for each context has the potential to increase trust and promote understanding of the tradeoffs involved in energy resource management. In this dissertation I present the development of a technical tool that explicitly addresses cultural and spiritual values and experimentally assesses their influence on the preferences and decision making of Navajo citizens. Chapter 2 describes the results of a public elicitation effort to gather information about stakeholder views and concerns related to energy development in the Navajo Nation in order to develop a larger sample survey and a decision-support tool that links techno-economic energy models with sociocultural attributes. Chapter 3 details the methods of developing the energy decision aid and its underlying assumptions for alternative energy projects and their impacts. This tool also provides an alternative to economic valuation of cultural impacts, based upon an ordinal index tied to environmental impacts.
Chapter 4 details the influence of the cultural, environmental, and economic outcome information provided through the developed decision aid on beliefs and preferences related to the type and scale of energy development, trust in decision makers, and broader concern for environmental protection. Finally, chapter 5 presents concluding thoughts on future research and on how technical-social decision tools can provide a means of ensuring effective decision making in the Navajo Nation and other American Indian communities.
Vidueira, Pablo; Díaz-Puente, José M; Rivera, María
2014-08-01
Ex ante impact assessment has become a fundamental tool for effective program management, and thus a compulsory task when establishing a new program in the European Union (EU). This article aims to analyze the benefits of ex ante impact assessment, the methodologies followed, and the difficulties encountered. This is done through a case study of the rural development programs (RDPs) in the EU. Results regarding methodologies are then contrasted with the international context in order to provide solid insights to evaluators and program managing authorities facing ex ante impact assessment. All European RDPs from the period 2007 through 2013 (a total of 88) and their corresponding available ex ante evaluations (a total of 70) were analyzed, focusing on socioeconomic impact assessment. Only 46.6% of the regions provide quantified estimations of socioeconomic impacts, in spite of it being a compulsory task demanded by the European Commission (EC). Methods recommended by the EC are mostly used, but there is a lack of mixed-method approaches, since qualitative methods are used as substitutes for quantitative ones. The two main difficulties reported were the complexity of program impacts and the lack of needed program information. Qualitative approaches on their own have been found to be unsuitable for ex ante impact assessment, while quantitative approaches, such as microsimulation models, provide a good approximation of actual impacts. However, time and budgetary constraints mean that quantitative and mixed methods should be applied mainly to the impacts most relevant to program success. © The Author(s) 2014.
Caetano, Fabiana A; Dirk, Brennan S; Tam, Joshua H K; Cavanagh, P Craig; Goiko, Maria; Ferguson, Stephen S G; Pasternak, Stephen H; Dikeakos, Jimmy D; de Bruyn, John R; Heit, Bryan
2015-12-01
Our current understanding of the molecular mechanisms which regulate cellular processes such as vesicular trafficking has been enabled by conventional biochemical and microscopy techniques. However, these methods often obscure the heterogeneity of the cellular environment, thus precluding a quantitative assessment of the molecular interactions regulating these processes. Herein, we present Molecular Interactions in Super Resolution (MIiSR) software which provides quantitative analysis tools for use with super-resolution images. MIiSR combines multiple tools for analyzing intermolecular interactions, molecular clustering and image segmentation. These tools enable quantification, in the native environment of the cell, of molecular interactions and the formation of higher-order molecular complexes. The capabilities and limitations of these analytical tools are demonstrated using both modeled data and examples derived from the vesicular trafficking system, thereby providing an established and validated experimental workflow capable of quantitatively assessing molecular interactions and molecular complex formation within the heterogeneous environment of the cell.
A multi-center study benchmarks software tools for label-free proteome quantification
Gillet, Ludovic C; Bernhardt, Oliver M.; MacLean, Brendan; Röst, Hannes L.; Tate, Stephen A.; Tsou, Chih-Chiang; Reiter, Lukas; Distler, Ute; Rosenberger, George; Perez-Riverol, Yasset; Nesvizhskii, Alexey I.; Aebersold, Ruedi; Tenzer, Stefan
2016-01-01
The consistent and accurate quantification of proteins by mass spectrometry (MS)-based proteomics depends on the performance of instruments, acquisition methods and data analysis software. In collaboration with the software developers, we evaluated OpenSWATH, SWATH 2.0, Skyline, Spectronaut and DIA-Umpire, five of the most widely used software methods for processing data from SWATH-MS (sequential window acquisition of all theoretical fragment ion spectra), a method that uses data-independent acquisition (DIA) for label-free protein quantification. We analyzed high-complexity test datasets from hybrid proteome samples of defined quantitative composition acquired on two different MS instruments using different SWATH isolation-window setups. For consistent evaluation we developed LFQbench, an R-package to calculate metrics of precision and accuracy in label-free quantitative MS, and report the identification performance, robustness and specificity of each software tool. Our reference datasets enabled developers to improve their software tools. After optimization, all tools provided highly convergent identification and reliable quantification performance, underscoring their robustness for label-free quantitative proteomics. PMID:27701404
A multicenter study benchmarks software tools for label-free proteome quantification.
Navarro, Pedro; Kuharev, Jörg; Gillet, Ludovic C; Bernhardt, Oliver M; MacLean, Brendan; Röst, Hannes L; Tate, Stephen A; Tsou, Chih-Chiang; Reiter, Lukas; Distler, Ute; Rosenberger, George; Perez-Riverol, Yasset; Nesvizhskii, Alexey I; Aebersold, Ruedi; Tenzer, Stefan
2016-11-01
Consistent and accurate quantification of proteins by mass spectrometry (MS)-based proteomics depends on the performance of instruments, acquisition methods and data analysis software. In collaboration with the software developers, we evaluated OpenSWATH, SWATH 2.0, Skyline, Spectronaut and DIA-Umpire, five of the most widely used software methods for processing data from sequential window acquisition of all theoretical fragment-ion spectra (SWATH)-MS, which uses data-independent acquisition (DIA) for label-free protein quantification. We analyzed high-complexity test data sets from hybrid proteome samples of defined quantitative composition acquired on two different MS instruments using different SWATH isolation-window setups. For consistent evaluation, we developed LFQbench, an R package, to calculate metrics of precision and accuracy in label-free quantitative MS and report the identification performance, robustness and specificity of each software tool. Our reference data sets enabled developers to improve their software tools. After optimization, all tools provided highly convergent identification and reliable quantification performance, underscoring their robustness for label-free quantitative proteomics.
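The precision and accuracy metrics such a benchmark reports can be sketched in a few lines: in a hybrid-proteome sample each species has a known mixing ratio, so observed log2 ratios can be compared against the expected value. This is an illustrative simplification of the idea, not the LFQbench implementation, and the observed values are invented:

```python
import math
import statistics

# Benchmark-metric sketch: for a species spiked at a known ratio between
# samples A and B, accuracy is the median deviation of observed log2 ratios
# from the expected value (systematic shift), and precision is their spread.
# Observed ratios below are made up for illustration.

def accuracy_precision(observed_log2, expected_log2):
    deviations = [o - expected_log2 for o in observed_log2]
    accuracy = statistics.median(deviations)     # systematic shift
    precision = statistics.stdev(observed_log2)  # scatter across proteins
    return accuracy, precision

# Hypothetical observed log2(A/B) ratios for a species spiked at 2:1
observed = [1.05, 0.95, 1.10, 0.90, 1.00]
expected = math.log2(2.0)  # = 1.0
acc, prec = accuracy_precision(observed, expected)
```

A tool with near-zero median deviation and small spread is both accurate and precise, which is what "highly convergent identification and reliable quantification" amounts to numerically.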
DOE Office of Scientific and Technical Information (OSTI.GOV)
Quinn, N.W.T.; Ortega, R.; Rahilly, P.
2011-12-17
The project has provided science-based tools for the long-term management of salinity in drainage discharges from wetlands to the San Joaquin River. The results of the project are being used to develop best management practices (BMPs) and a decision support system to assist wetland managers in adjusting the timing of salt loads delivered to the San Joaquin River during spring drawdown. Adaptive drainage management scheduling has the potential to improve environmental compliance with salinity objectives in the Lower San Joaquin River by reducing the frequency of violation of Vernalis salinity standards, especially in dry and critically dry years. The paired approach to project implementation, whereby adaptively managed and traditional practices were monitored in a side-by-side fashion, has provided a quantitative measure of the impacts of the project on the timing of salt loading to the San Joaquin River. The most significant accomplishment of the project has been the technology transfer to wetland biologists, ditch tenders and water managers within the Grasslands Ecological Area. This “learning by doing” has built local community capacity within the Grassland Water District and the California Department of Fish and Game, providing these institutions with new capability to assess and effectively manage salinity within their wetlands while simultaneously providing benefits to salinity management of the San Joaquin River.
Digital Holography, a metrological tool for quantitative analysis: Trends and future applications
NASA Astrophysics Data System (ADS)
Paturzo, Melania; Pagliarulo, Vito; Bianco, Vittorio; Memmolo, Pasquale; Miccio, Lisa; Merola, Francesco; Ferraro, Pietro
2018-05-01
A review on the last achievements of Digital Holography is reported in this paper, showing that this powerful method can be a key metrological tool for the quantitative analysis and non-invasive inspection of a variety of materials, devices and processes. Nowadays, its range of applications has been greatly extended, including the study of live biological matter and biomedical applications. This paper overviews the main progresses and future perspectives of digital holography, showing new optical configurations and investigating the numerical issues to be tackled for the processing and display of quantitative data.
NASA Astrophysics Data System (ADS)
Cederman, L.-E.; Conte, R.; Helbing, D.; Nowak, A.; Schweitzer, F.; Vespignani, A.
2012-11-01
A huge flow of quantitative social, demographic and behavioral data is becoming available that traces the activities and interactions of individuals, social patterns, transportation infrastructures and travel fluxes. Together with innovative computational techniques and methods for modeling social actions in hybrid (natural and artificial) societies, this has caused a qualitative change in the ways we model socio-technical systems. For the first time, society can be studied in a comprehensive fashion that addresses social and behavioral complexity. In other words, we are in a position to envision the development of a large data and computational cyberinfrastructure defining an exploratory of society that provides quantitative anticipatory, explanatory and scenario analysis capabilities ranging from emerging infectious disease to conflict and crime surges. The goal of the exploratory of society is to provide the basic infrastructure embedding the framework of tools and knowledge needed for the design of forecast/anticipatory/crisis management approaches to socio-technical systems, supporting future decision making procedures by accelerating the scientific cycle that goes from data generation to predictions.
An index to assess the health and benefits of the global ocean.
Halpern, Benjamin S; Longo, Catherine; Hardy, Darren; McLeod, Karen L; Samhouri, Jameal F; Katona, Steven K; Kleisner, Kristin; Lester, Sarah E; O'Leary, Jennifer; Ranelletti, Marla; Rosenberg, Andrew A; Scarborough, Courtney; Selig, Elizabeth R; Best, Benjamin D; Brumbaugh, Daniel R; Chapin, F Stuart; Crowder, Larry B; Daly, Kendra L; Doney, Scott C; Elfes, Cristiane; Fogarty, Michael J; Gaines, Steven D; Jacobsen, Kelsey I; Karrer, Leah Bunce; Leslie, Heather M; Neeley, Elizabeth; Pauly, Daniel; Polasky, Stephen; Ris, Bud; St Martin, Kevin; Stone, Gregory S; Sumaila, U Rashid; Zeller, Dirk
2012-08-30
The ocean plays a critical role in supporting human well-being, from providing food, livelihoods and recreational opportunities to regulating the global climate. Sustainable management aimed at maintaining the flow of a broad range of benefits from the ocean requires a comprehensive and quantitative method to measure and monitor the health of coupled human–ocean systems. We created an index comprising ten diverse public goals for a healthy coupled human–ocean system and calculated the index for every coastal country. Globally, the overall index score was 60 out of 100 (range 36–86), with developed countries generally performing better than developing countries, but with notable exceptions. Only 5% of countries scored higher than 70, whereas 32% scored lower than 50. The index provides a powerful tool to raise public awareness, direct resource management, improve policy and prioritize scientific research.
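The aggregation behind such an index can be sketched as a per-country mean of goal scores on a 0-100 scale. The goal names and scores below are invented placeholders, not OHI data, and the published index also applies goal weights and reference points that this sketch omits:

```python
# Index-aggregation sketch: a country's score as the unweighted mean of its
# goal scores (0-100). Goal names and numbers are hypothetical, not OHI data;
# the published index also uses goal weights and reference points.

def country_index(goal_scores):
    return sum(goal_scores.values()) / len(goal_scores)

goals = {
    "food_provision": 55, "carbon_storage": 70, "coastal_protection": 65,
    "tourism": 50, "clean_waters": 60, "biodiversity": 80,
    "livelihoods": 58, "sense_of_place": 62, "natural_products": 45,
    "artisanal_fishing": 55,
}
score = country_index(goals)
```

Computing one number per country is what makes the index usable for the comparisons the abstract reports (e.g. the 36-86 range and the global mean of 60).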
DOE Office of Scientific and Technical Information (OSTI.GOV)
Babendreier, Justin E.; Castleton, Karl J.
2005-08-01
Elucidating uncertainty and sensitivity structures in environmental models can be a difficult task, even for low-order, single-medium constructs driven by a unique set of site-specific data. Quantitative assessment of integrated, multimedia models that simulate hundreds of sites, spanning multiple geographical and ecological regions, will ultimately require a comparative approach using several techniques, coupled with sufficient computational power. The Framework for Risk Analysis in Multimedia Environmental Systems - Multimedia, Multipathway, and Multireceptor Risk Assessment (FRAMES-3MRA) is an important software model being developed by the United States Environmental Protection Agency for use in risk assessment of hazardous waste management facilities. The 3MRA modeling system includes a set of 17 science modules that collectively simulate release, fate and transport, exposure, and risk associated with hazardous contaminants disposed of in land-based waste management units (WMUs).
NASA Technical Reports Server (NTRS)
Hihn, Jairus; Lewicki, Scott; Morgan, Scott
2011-01-01
The measurement techniques for organizations which have achieved the Software Engineering Institute's CMMI Maturity Levels 4 and 5 are well documented. On the other hand, how to measure effectively when an organization is at Maturity Level 3 is less well understood, especially when there is no consistency in tool use and there is extensive tailoring of the organizational software processes. Most organizations fail in their attempts to generate, collect, and analyze standard process improvement metrics under these conditions. But at JPL, NASA's prime center for deep space robotic exploration, we have a long history of proving there is always a solution: it just may not be what you expected. In this paper we describe the wide variety of qualitative and quantitative techniques we have been implementing over the last few years, including the various approaches used to communicate the results to both software technical managers and senior managers.
NASA Astrophysics Data System (ADS)
Yazdanfar, Siavash; Kulkarni, Manish D.; Wong, Richard C. K.; Sivak, Michael J., Jr.; Willis, Joseph; Barton, Jennifer K.; Welch, Ashley J.; Izatt, Joseph A.
1998-04-01
A recently developed modality for blood flow measurement holds high promise in the management of bleeding ulcers. Color Doppler optical coherence tomography (CDOCT) uses low- coherence interferometry and digital signal processing to obtain precise localization of tissue microstructure simultaneous with bi-directional quantitation of blood flow. We discuss CDOCT as a diagnostic tool in the management of bleeding gastrointestinal lesions. Common treatments for bleeding ulcers include local injection of a vasoconstrictor, coagulation of blood via thermal contact or laser treatment, and necrosis of surrounding tissue with a sclerosant. We implemented these procedures in a rat dorsal skin flap model, and acquired CDOCT images before and after treatment. In these studies, CDOCT succeeded in identifying cessation of flow before it could be determined visually. Hence, we demonstrate the diagnostic capabilities of CDOCT in the regulation of bleeding in micron-scale vessels.
NASA Astrophysics Data System (ADS)
Hubbart, J. A.; Kellner, R. E.; Zeiger, S. J.
2016-12-01
Advancements in watershed management are both a major challenge and an urgent need of this century. The experimental watershed study (EWS) approach provides critical baseline and long-term information that can improve decision-making and reduce misallocation of mitigation investments. Historically, the EWS approach was used in wildland watersheds to quantitatively characterize basic landscape alterations (e.g. forest harvest, road building). However, in recent years, EWS is being repurposed in contemporary multiple-land-use watersheds comprising a mosaic of land use practices such as urbanizing centers, industry, agriculture, and rural development. The EWS method provides scalable and transferable results that address the uncertainties of development, while providing a scientific basis for total maximum daily load (TMDL) targets in increasing numbers of Clean Water Act 303(d) listed waters. Collaborative adaptive management (CAM) programs, designed to consider the needs of many stakeholders, can also benefit from EWS-generated information, which can inform decision making and serve as a guidance tool throughout the CAM program duration. Of similar importance, long-term EWS monitoring programs create a model system that shows stakeholders how investing in rigorous scientific research initiatives improves decision-making, thereby increasing management efficiency through more focused investments. The evolution from classic wildland EWS designs to contemporary EWS designs in multiple-land-use watersheds will be presented, illustrating how such an approach can encourage innovation, cooperation, and trust among watershed stakeholders working to reach the common goal of improving and sustaining hydrologic regimes and water quality.
Creating an automated tool for measuring software cohesion
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tutton, J.M.; Zucconi, L.
1994-05-06
Program modules with high complexity tend to be more error-prone and more difficult to understand. These factors increase maintenance and enhancement costs. Hence, a tool that can help programmers determine a key factor in module complexity should be very useful. Our goal is to create a software tool that will automatically give a quantitative measure of the cohesiveness of a given module, and hence give us an estimate of the "maintainability" of that module. The tool will use a metric developed by Professors Linda M. Ott and James M. Bieman. The Ott/Bieman metric gives quantitative measures that indicate the degree of functional cohesion using abstract data slices.
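In the Ott/Bieman formulation, each output of a module induces a data slice; tokens shared by every slice are "super-glue," and strong functional cohesion is the fraction of tokens that are super-glue. A minimal sketch over hypothetical token sets (a real tool derives the slices from data-flow analysis, which this omits):

```python
# Data-slice cohesion sketch in the spirit of Ott/Bieman: model each output's
# data slice as the set of tokens that contribute to it. Tokens lying in all
# slices are "super-glue"; strong functional cohesion (SFC) is the fraction
# of the module's tokens that are super-glue. Token sets are hypothetical.

def strong_functional_cohesion(slices):
    all_tokens = set().union(*slices)
    super_glue = set.intersection(*slices)
    return len(super_glue) / len(all_tokens)

# A module with two output slices sharing three of five tokens
slice_sum = {"n", "i", "total", "data"}
slice_max = {"n", "i", "biggest", "data"}
sfc = strong_functional_cohesion([slice_sum, slice_max])
```

A module whose outputs share most of their computation scores near 1; a module bundling unrelated computations scores near 0, flagging it as a maintenance risk.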
Affinity for Quantitative Tools: Undergraduate Marketing Students Moving beyond Quantitative Anxiety
ERIC Educational Resources Information Center
Tarasi, Crina O.; Wilson, J. Holton; Puri, Cheenu; Divine, Richard L.
2013-01-01
Marketing students are known as less likely to have an affinity for the quantitative aspects of the marketing discipline. In this article, we study the reasons why this might be true and develop a parsimonious 20-item scale for measuring quantitative affinity in undergraduate marketing students. The scale was administered to a sample of business…
Infrared vibrational spectroscopy: a rapid and novel diagnostic and monitoring tool for cystinuria
Oliver, Katherine V.; Vilasi, Annalisa; Maréchal, Amandine; Moochhala, Shabbir H.; Unwin, Robert J.; Rich, Peter R.
2016-01-01
Cystinuria is the commonest inherited cause of nephrolithiasis (~1% in adults; ~6% in children) and is the result of impaired cystine reabsorption in the renal proximal tubule. Cystine is poorly soluble in urine, with a solubility of ~1 mM, and can readily form microcrystals that lead to cystine stone formation, especially at low urine pH. Diagnosis of cystinuria is made typically by ion-exchange chromatography (IEC) detection and quantitation, which is slow, labour-intensive and costly. More rapid and frequent monitoring of urinary cystine concentration would significantly improve the diagnosis and clinical management of cystinuria. We used attenuated total reflection Fourier-transform infrared spectroscopy (ATR-FTIR) to detect and quantitate insoluble cystine in 22 cystinuric and 5 healthy control urine samples. Creatinine concentration was also determined by ATR-FTIR to adjust for urinary concentration/dilution. Urine was centrifuged, the insoluble fraction re-suspended in 5 μL water and dried on the ATR prism. Cystine was quantitated using its 1296 cm−1 absorption band and levels matched with parallel measurements made using IEC. ATR-FTIR afforded a rapid and inexpensive method of detecting and quantitating insoluble urinary cystine. This proof-of-concept study provides a basis for developing a high-throughput, cost-effective diagnostic method for cystinuria, and for point-of-care clinical monitoring. PMID:27721432
Kisingo, Alex; Rollins, Rick; Murray, Grant; Dearden, Phil; Clarke, Marlea
2016-10-01
Protected areas (PAs) can provide important benefits to conservation and to communities. A key factor in the effective delivery of these benefits is the role of governance. There has been a growth in research developing frameworks to evaluate 'good' PA governance, usually drawing on a set of principles that are associated with groups of indicators. In contrast to dominant qualitative approaches, this paper describes the development of a quantitative method for measuring effectiveness of protected area governance, as perceived by stakeholders in the Greater Serengeti Ecosystem in Tanzania. The research developed a quantitative method for developing effectiveness measures of PA governance, using a set of 65 statements related to governance principles developed from a literature review. The instrument was administered to 389 individuals from communities located near PAs in the Greater Serengeti Ecosystem. The results of a factor analysis suggest that statements load onto 10 factors that demonstrate high psychometric validity as measured by factor loadings, explained variance, and Cronbach's alpha reliability. The ten common factors that were extracted were: 1) legitimacy, 2) transparency and accountability, 3) responsiveness, 4) fairness, 5) participation, 6) ecosystem based management (EBM) and connectivity, 7) resilience, 8) achievements, 9) consensus orientation, and 10) power. The paper concludes that quantitative surveys can be used to evaluate governance of protected areas from a community-level perspective. Copyright © 2016 Elsevier Ltd. All rights reserved.
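Cronbach's alpha, the reliability statistic cited above, can be computed directly from respondent-by-item data. A minimal sketch (the data are invented for illustration, not the Serengeti survey responses):

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for internal consistency.
    `items`: 2-D array-like, rows = respondents, columns = the statements
    loading on one factor. alpha = k/(k-1) * (1 - sum(item vars)/var(total))."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_var = items.var(axis=0, ddof=1).sum()   # sum of per-item variances
    total_var = items.sum(axis=1).var(ddof=1)    # variance of the summed scale
    return k / (k - 1) * (1 - item_var / total_var)

# Perfectly correlated items yield the maximum reliability of 1.0
perfect = cronbach_alpha([[1, 1], [2, 2], [3, 3]])   # -> 1.0
```

In a study like the one above, alpha would be computed once per extracted factor over the statements that load on it.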
Astronomical large projects managed with MANATEE: management tool for effective engineering
NASA Astrophysics Data System (ADS)
García-Vargas, M. L.; Mujica-Alvarez, E.; Pérez-Calpena, A.
2012-09-01
This paper describes MANATEE, the project management web tool developed by FRACTAL, specifically designed for managing large astronomical projects. MANATEE facilitates management by providing an overall view of the project and the capability to control the three main project parameters: scope, schedule and budget. MANATEE is one of the three tools of the FRACTAL System & Project Suite, which also comprises GECO (System Engineering Tool) and DOCMA (Documentation Management Tool). These tools are especially suited to Consortia and teams collaborating on a multi-discipline, complex project in a geographically distributed environment. Our management view has been applied successfully in several projects and is currently being used for managing MEGARA, the next instrument for the GTC 10m telescope.
Groene, Oliver; Brandt, Elimer; Schmidt, Werner; Moeller, Johannes
2009-08-01
Strategy development and implementation in acute care settings is often restricted by competing challenges, the pace of policy reform and the existence of parallel hierarchies. To describe a generic approach to strategy development, illustrate the use of the Balanced Scorecard as a tool to facilitate strategy implementation and demonstrate how to break down strategic goals into measurable elements. Multi-method approach using three different conceptual models: Health Promoting Hospitals Standards and Strategies, the European Foundation for Quality Management (EFQM) Model and the Balanced Scorecard. A bundle of qualitative and quantitative methods were used including in-depth interviews, standardized organization-wide surveys on organizational values, staff satisfaction and patient experience. Three acute care hospitals in four different locations belonging to a German holding group. Chief executive officer, senior medical officers, working group leaders and hospital staff. Development and implementation of the Balanced Scorecard. Twenty strategic objectives with corresponding Balanced Scorecard measures. A stepped approach from strategy development to implementation is presented to identify key themes for strategy development, drafting a strategy map and developing strategic objectives and measures. The Balanced Scorecard, in combination with the EFQM model, is a useful tool to guide strategy development and implementation in health care organizations. As for other quality improvement and management tools not specifically developed for health care organizations, some adaptations are required to improve acceptability among professionals. The step-wise approach of strategy development and implementation presented here may support similar processes in comparable organizations.
2013-06-01
measuring numerical risk to the government (Galway, 2004). However, quantitative risk analysis is rarely utilized in DoD acquisition programs because the...quantitative assessment of the EVMS itself. Galway (2004) practically linked project quantitative risk assessment to EVM by focusing on cost...[Kindle version]. Retrieved from Amazon.com. Galway, L. (2004, February). Quantitative risk analysis for project management: A critical review
Pritt, Jeremy J; Frimpong, Emmanuel A
2010-10-01
Conserving rare species and protecting biodiversity and ecosystem functioning depends on sound information on the nature of rarity. Rarity is multidimensional and has a variety of definitions, which presents the need for a quantitative classification scheme with which to categorize species as rare or common. We constructed such a classification for North American freshwater fishes to better describe rarity in fishes and provide researchers and managers with a tool to streamline conservation efforts. We used data on range extents, habitat specificities, and local population sizes of North American freshwater fishes and a variety of quantitative methods and statistical decision criteria, including quantile regression and a cost-function algorithm to determine thresholds for categorizing a species as rare or common. Species fell into eight groups that conform to an established framework for rarity. Fishes listed by the American Fisheries Society (AFS) as endangered, threatened, or vulnerable were most often rare because their local population sizes were low, ranges were small, and they had specific habitat needs, in that order, whereas unlisted species were most often considered common on the basis of these three factors. Species with large ranges generally had few specific habitat needs, whereas those with small ranges tended to have narrow habitat specificities. We identified 30 species not designated as imperiled by AFS that were rare along all dimensions of rarity and may warrant further study or protection, and we found three designated species that were common along all dimensions and may require a review of their imperilment status. Our approach could be applied to other taxa to aid conservation decisions and serve as a useful tool for future revisions of listings of fish species. © 2010 Society for Conservation Biology.
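The three-axis rare/common split described above can be sketched as a quantile-thresholding exercise. The study derived its thresholds with quantile regression and a cost-function algorithm, so the fixed quantile and variable names below are assumptions for illustration only:

```python
import numpy as np

def rarity_groups(range_extent, habitat_breadth, population_size, q=0.25):
    """Assign each species to one of 8 rarity groups (a Rabinowitz-style
    framework) by flagging it as "rare" on each of three dimensions when it
    falls at or below an assumed quantile q of that dimension. Returns an
    integer group code 0-7 per species (7 = rare on all three dimensions)."""
    dims = [np.asarray(range_extent, dtype=float),
            np.asarray(habitat_breadth, dtype=float),
            np.asarray(population_size, dtype=float)]
    flags = [d <= np.quantile(d, q) for d in dims]   # True = rare on that axis
    # Encode the three binary axes as a group number 0-7
    return flags[0].astype(int) * 4 + flags[1].astype(int) * 2 + flags[2].astype(int)

# Species 0 is low on all three dimensions -> group 7; the rest -> group 0
groups = rarity_groups([1, 10, 10, 10], [1, 10, 10, 10], [1, 10, 10, 10])
```

A species rare along all dimensions (group 7) would be a candidate for further study or protection, mirroring the paper's screening logic.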
Understanding ecosystem services adoption by natural resource managers and research ecologists
Engel, Daniel; Evans, Mary; Low, Bobbi S.; Schaeffer, Jeff
2017-01-01
The ecosystem services (ES) paradigm has gained much traction as a natural resource management approach due to its comprehensive nature and ability to provide quantitative tools to improve decision-making. However, it is still uncertain whether and how practitioners have adopted the ES paradigm into their work and how this aligns with resource management information needs. To address this, we surveyed natural resource managers within the Great Lakes region about their use of ES information in decision-making. We complemented our manager survey with in-depth interviews of a related population—research ecologists at the U.S. Geological Survey Great Lakes Science Center. In this study, managers and ecologists almost unanimously agreed that ES were appropriate to consider in resource management. We also found high congruence between managers and ecologists in the ES considered most relevant to their work, with provision of habitat, recreation and tourism, biological control, and primary production being the ES ranked highly by both groups. However, a disconnect arose when research ecologists deemed the information they provide regarding ES as adequate for management needs, but managers disagreed. Furthermore, managers reported that they would use economic information about ES if they had access to that information. We believe this data deficiency could represent a gap in scientific coverage by ecologists, but it may also simply reflect an underrepresentation of ecological economists who can translate ecological knowledge of ES providers into economic information that many managers desired.
NASA Astrophysics Data System (ADS)
Mongkolsawat, Darunee
The performance of energy management is usually judged by the resulting energy reduction; however, this is not sufficient for managing a facility's energy in the long term. Accordingly, this study investigates the relationship between the effectiveness of energy information management and energy management performance. The sector of interest is higher education institutions in Thailand, owing to the complexity of their organisation in both management and property terms. Rather than centring on quantitative energy reduction, the study seeks to establish a framework, or tool, for understanding this relationship qualitatively through an organisational resource- and process-based view. Additionally, energy management structure is treated as an initial factor. Within this framework, the performance of energy management is assessed on its primary results concerning data availability, analysis results, and energy actions. The investigation found various specific relationships between the factors of concern and primary performance. For example, some are direct connections, such as those between the energy management structure and implemented actions, and between investment in organisational resources and data availability, while others are flexible relations, such as that between data collection and the results of analysed data. Furthermore, the load of energy management was found to influence an organisation's motivation to invest in energy management. At the end of the paper, further applications of the study are also proposed.
Exploration Medical System Trade Study Tools Overview
NASA Technical Reports Server (NTRS)
Mindock, J.; Myers, J.; Latorella, K.; Cerro, J.; Hanson, A.; Hailey, M.; Middour, C.
2018-01-01
ExMC is creating an ecosystem of tools to enable well-informed medical system trade studies. The suite of tools addresses important system implementation aspects of the space medical capabilities trade space and is being built using knowledge from the medical community regarding the unique aspects of space flight. Two integrating models, a systems engineering model and a medical risk analysis model, tie the tools together to produce an integrated assessment of the medical system and its ability to achieve medical system target requirements. This presentation will provide an overview of the various tools that are part of the tool ecosystem. Initially, the presentation's focus will address the tools that supply the foundational information to the ecosystem. Specifically, the talk will describe how information on how medicine will be practiced is captured and categorized for efficient utilization in the tool suite. For example, this includes capturing which conditions will be planned for in-mission treatment, planned medical activities (e.g., a periodic physical exam), required medical capabilities (e.g., provide imaging), and options to implement the capabilities (e.g., an ultrasound device). Database storage and configuration management will also be discussed. The presentation will include an overview of how these information tools will be tied to parameters in a Systems Modeling Language (SysML) model, allowing traceability to system behavioral, structural, and requirements content. The discussion will also describe an HRP-led enhanced risk assessment model developed to provide quantitative insight into each capability's contribution to mission success. Key outputs from these various tools, to be shared with the space medical and exploration mission development communities, will be assessments of how well medical system implementation options satisfy requirements, along with per-capability contributions toward achieving requirements.
Quantitative risk assessment for skin sensitization: Success or failure?
Kimber, Ian; Gerberick, G Frank; Basketter, David A
2017-02-01
Skin sensitization is unique in the world of toxicology. There is a combination of reliable, validated predictive test methods for identification of skin sensitizing chemicals, a clearly documented and transparent approach to risk assessment, and effective feedback from dermatology clinics around the world delivering evidence of the success or failure of the hazard identification/risk assessment/management process. Recent epidemics of contact allergy, particularly to preservatives, have raised questions of whether the safety/risk assessment process is working in an optimal manner (or indeed is working at all!). This review has as its focus skin sensitization quantitative risk assessment (QRA). The core toxicological principles of QRA are reviewed, and evidence of use and misuse examined. What becomes clear is that skin sensitization QRA will only function adequately if two essential criteria are met. The first is that QRA is applied rigorously, and the second is that potential exposure to the sensitizing substance is assessed adequately. This conclusion will come as no surprise to any toxicologist who appreciates the basic premise that "risk = hazard × exposure". Accordingly, use of skin sensitization QRA is encouraged, not least because the essential feedback from dermatology clinics can be used as a tool to refine QRA in situations where this risk assessment tool has not been properly used. Copyright © 2016 Elsevier Inc. All rights reserved.
Watershed Planning within a Quantitative Scenario Analysis Framework.
Merriam, Eric R; Petty, J Todd; Strager, Michael P
2016-07-24
There is a critical need for tools and methodologies capable of managing aquatic systems within heavily impacted watersheds. Current efforts often fall short as a result of an inability to quantify and predict complex cumulative effects of current and future land use scenarios at relevant spatial scales. The goal of this manuscript is to provide methods for conducting a targeted watershed assessment that enables resource managers to produce landscape-based cumulative effects models for use within a scenario analysis management framework. Sites are first selected for inclusion within the watershed assessment by identifying sites that fall along independent gradients and combinations of known stressors. Field and laboratory techniques are then used to obtain data on the physical, chemical, and biological effects of multiple land use activities. Multiple linear regression analysis is then used to produce landscape-based cumulative effects models for predicting aquatic conditions. Lastly, methods for incorporating cumulative effects models within a scenario analysis framework for guiding management and regulatory decisions (e.g., permitting and mitigation) within actively developing watersheds are discussed and demonstrated for 2 sub-watersheds within the mountaintop mining region of central Appalachia. The watershed assessment and management approach provided herein enables resource managers to facilitate economic and development activity while protecting aquatic resources and producing opportunity for net ecological benefits through targeted remediation.
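The core statistical step above, a multiple linear regression of aquatic condition on landscape stressor gradients, can be sketched with ordinary least squares. The data and variable names below are synthetic placeholders, not the study's actual stressors or models:

```python
import numpy as np

def fit_cumulative_effects(stressors, condition):
    """Ordinary-least-squares fit of: condition ~ intercept + stressor gradients.
    `stressors`: (n_sites, n_stressors) array; `condition`: length-n_sites vector.
    Returns [intercept, slope_1, slope_2, ...]."""
    X = np.column_stack([np.ones(len(condition)), np.asarray(stressors, dtype=float)])
    coef, *_ = np.linalg.lstsq(X, np.asarray(condition, dtype=float), rcond=None)
    return coef

def predict_scenario(coef, stressors):
    """Score a hypothetical future land-use scenario with the fitted model."""
    X = np.column_stack([np.ones(len(stressors)), np.asarray(stressors, dtype=float)])
    return X @ coef

# Synthetic example: condition = 2 + 3*stressor_1 - 1*stressor_2 (noise-free)
rng = np.random.default_rng(0)
S = rng.random((50, 2))
y = 2 + 3 * S[:, 0] - 1 * S[:, 1]
coef = fit_cumulative_effects(S, y)
```

Within a scenario-analysis framework, `predict_scenario` would then be applied to projected land-use configurations to compare management alternatives before permitting decisions.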
[Evaluation of the family focus and community orientation in the Family Health Strategy].
Alencar, Monyk Neves de; Coimbra, Liberata Campos; Morais, Ana Patrícia Pereira; Silva, Antônio Augusto Moura da; Pinheiro, Siane Rocha de Almeida; Queiroz, Rejane Christine de Sousa
2014-02-01
The Family Health Strategy should be focused on the family unit and constructed operationally within the community sphere. The research assessed the family focus and community orientation as attributes of Primary Health Care, comparing if the responses differed among users, professionals and managers. It is an evaluative study of a population-based quantitative approach conducted between January 2010 and March 2011 in São Luís in the state of Maranhão. The study involved a population of 32 managers and 80 professionals with more than six months experience in the Family Health Strategy, and 883 users were selected by means of cluster sampling. Questionnaires validated in Brazil were used based on the components of the Primary Care Assessment Tool (PCATool). The composite index of the family focus was 2.7 for users, 4.9 for professionals and 5.3 for managers. In the posttest phase, differences were detected between users and professionals, and users and managers. The composite index of community orientation was 2.9 for users, 3.9 for professionals and 4.8 for managers (p < 0.001). Managers attributed higher percentages in all indicators, followed by professionals and lastly users. Both attributes were rated as being unsatisfactory in the perception of the users.
NASA Astrophysics Data System (ADS)
Gergel, D. R.; Watts, L. H.; Salathe, E. P.; Mankowski, J. D.
2017-12-01
Climate science, already a highly interdisciplinary field, is rapidly evolving, and natural resource managers are increasingly involved in policymaking and adaptation decisions to address climate change that need to be informed by state-of-the-art climate science. Consequently, there is a strong demand for unique organizations that engender collaboration and cooperation between government, non-profit, academic and for-profit sectors that are addressing issues relating to natural resources management and climate adaptation and resilience. These organizations are often referred to as boundary organizations. The Northwest Climate Science Center (NW CSC) and the North Pacific Landscape Conservation Cooperative (NP LCC) are two such boundary organizations operating in different contexts. Together, the NW CSC and the NP LCC fulfill the need for sites of co-production between researchers and managers working on climate-related issues, and a key component of this work is a monthly climate science newsletter that includes recent climate science journal articles, reports, and climate-related events. Our study evaluates the effectiveness of the climate science digest (CSD) through a three-pronged approach: a) in-depth interviews with natural resource managers who use the CSD, b) poll questions distributed to CSD subscribers, and c) quantitative analysis of CSD effectiveness using analytics from MailChimp distribution. We aim to a) map the reach of the CSD across the Northwest and at a national level; b) understand the efficacy of the CSD at communicating climate science to diverse audiences; c) evaluate the usefulness of CSD content for diverse constituencies of subscribers; d) glean transferrable knowledge for future evaluations of boundary management tools; and e) establish a protocol for designing climate science newsletters for other agencies disseminating climate science information. 
We will present results from all three steps of our evaluation process and describe their implications for future evaluations of climate science communications products and other boundary management tools in the field of natural resources management.
ERIC Educational Resources Information Center
Garner, Stuart
2009-01-01
This paper reports on the findings from a quantitative research study into the use of a software tool that was built to support a part-complete solution method (PCSM) for the learning of computer programming. The use of part-complete solutions to programming problems is one of the methods that can be used to reduce the cognitive load that students…
Implementing a Quantitative Analysis Design Tool for Future Generation Interfaces
2012-03-01
with Remotely Piloted Aircraft (RPA) has resulted in the need of a platform to evaluate interface design. The Vigilant Spirit Control Station (VSCS)...Spirit interface. A modified version of the HCI Index was successfully applied to perform a quantitative analysis of the baseline VSCS interface and...time of the original VSCS interface. These results revealed the effectiveness of the tool and demonstrated in the design of future generation
Mukherjee, Sudeshna Basu; Ray, Anjali
2009-01-01
Background: The present study was firstly aimed to find out the nature of stressful life events arising out of the innovative challenges in modernized organizations; and secondly, it tried to identify the relationship between innovative work behavior of managers and the levels of stress arising out of stressful events in modernized organizations (public and private) in West Bengal. Materials and Methods: Data was collected from a sample of 200 managers, by using 3 tools (General Information Schedule, Life Event Inventory and Innovative Work Behavior Scale) through a face-to-face interview. Responses were subjected to both quantitative and qualitative analyses. The data were statistically analysed with t-tests and ANOVA. Results: Data highlighted the fact that the qualitative profile of stressful events in the lives of managers expressed specificity in terms of their organizational type (public- and private-sector modernized organizations), and levels of stress from stressful life events were significantly higher among the modernized private-sector managers than those among public-sector managers. The prevalence of innovative work behavior was moderately higher among managers of private-sector modernized organizations than their counterparts in public-sector organizations. The trends of innovative work behavior of the managers indicated much variability due to interaction of their level of perceived stressful challenges for innovation and the global forces of change that have unleashed dynamic, systematic and higher expectation level from them. PMID:21180486
Mukherjee, Sudeshna Basu; Ray, Anjali
2009-07-01
The present study was firstly aimed to find out the nature of stressful life events arising out of the innovative challenges in modernized organizations; and secondly, it tried to identify the relationship between innovative work behavior of managers and the levels of stress arising out of stressful events in modernized organizations (public and private) in West Bengal. Data was collected from a sample of 200 managers, by using 3 tools (General Information Schedule, Life Event Inventory and Innovative Work Behavior Scale) through a face-to-face interview. Responses were subjected to both quantitative and qualitative analyses. The data were statistically analysed with t-tests and ANOVA. Data highlighted the fact that the qualitative profile of stressful events in the lives of managers expressed specificity in terms of their organizational type (public- and private-sector modernized organizations), and levels of stress from stressful life events were significantly higher among the modernized private-sector managers than those among public-sector managers. The prevalence of innovative work behavior was moderately higher among managers of private-sector modernized organizations than their counterparts in public-sector organizations. The trends of innovative work behavior of the managers indicated much variability due to interaction of their level of perceived stressful challenges for innovation and the global forces of change that have unleashed dynamic, systematic and higher expectation level from them.
SMART: A Propositional Logic-Based Trade Analysis and Risk Assessment Tool for a Complex Mission
NASA Technical Reports Server (NTRS)
Ono, Masahiro; Nicholas, Austin; Alibay, Farah; Parrish, Joseph
2015-01-01
This paper introduces a new trade analysis software tool called the Space Mission Architecture and Risk Analysis Tool (SMART). This tool supports a high-level system trade study on a complex mission, such as a potential Mars Sample Return (MSR) mission, in an intuitive and quantitative manner. In a complex mission, a common approach to increasing the probability of success is to have redundancy and prepare backups. Quantitatively evaluating the utility of adding redundancy to a system is important but not straightforward, particularly when the failures of parallel subsystems are correlated.
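The point about correlated failures can be made concrete with a two-subsystem example. This small calculation is a generic illustration of why independence assumptions overstate the value of redundancy; it is not SMART's propositional-logic model:

```python
import math

def dual_failure_probability(p_a, p_b, rho=0.0):
    """Probability that BOTH redundant subsystems fail, where rho is the
    correlation between the two Bernoulli failure events. For rho = 0 this
    reduces to the independence result p_a * p_b."""
    cov = rho * math.sqrt(p_a * (1 - p_a) * p_b * (1 - p_b))
    return p_a * p_b + cov

independent = dual_failure_probability(0.1, 0.1)            # -> 0.01
correlated = dual_failure_probability(0.1, 0.1, rho=0.5)    # -> 0.055
```

With a 0.5 correlation between the failure events, the joint failure probability rises from 1% to 5.5%: the redundant string buys far less reliability than an independence assumption would suggest, which is exactly why a quantitative tool must model the correlation.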
NASA Astrophysics Data System (ADS)
Bell, C.; Li, Y.; Lopez, E.; Hogue, T. S.
2017-12-01
Decision support tools that quantitatively estimate the cost and performance of infrastructure alternatives are valuable for urban planners. Such a tool is needed to aid in planning stormwater projects to meet diverse goals such as the regulation of stormwater runoff and its pollutants, minimization of economic costs, and maximization of environmental and social benefits in the communities served by the infrastructure. This work gives a brief overview of an integrated decision support tool, called i-DST, that is currently being developed to serve this need. This presentation focuses on the development of a default database for the i-DST that parameterizes water quality treatment efficiency of stormwater best management practices (BMPs) by region. Parameterizing the i-DST by region will allow the tool to perform accurate simulations in all parts of the United States. A national dataset of BMP performance is analyzed to determine which of a series of candidate regionalizations explains the most variance in the national dataset. The data used in the regionalization analysis comes from the International Stormwater BMP Database and data gleaned from an ongoing systematic review of peer-reviewed and gray literature. In addition to identifying a regionalization scheme for water quality performance parameters in the i-DST, our review process will also provide example methods and protocols for systematic reviews in the field of Earth Science.
Cornwell, MacIntosh; Vangala, Mahesh; Taing, Len; Herbert, Zachary; Köster, Johannes; Li, Bo; Sun, Hanfei; Li, Taiwen; Zhang, Jian; Qiu, Xintao; Pun, Matthew; Jeselsohn, Rinath; Brown, Myles; Liu, X Shirley; Long, Henry W
2018-04-12
RNA sequencing has become a ubiquitous technology used throughout life sciences as an effective method of measuring RNA abundance quantitatively in tissues and cells. The increase in use of RNA-seq technology has led to the continuous development of new tools for every step of analysis from alignment to downstream pathway analysis. However, effectively using these analysis tools in a scalable and reproducible way can be challenging, especially for non-experts. Using the workflow management system Snakemake we have developed a user-friendly, fast, efficient, and comprehensive pipeline for RNA-seq analysis. VIPER (Visualization Pipeline for RNA-seq analysis) is an analysis workflow that combines some of the most popular tools to take RNA-seq analysis from raw sequencing data, through alignment and quality control, into downstream differential expression and pathway analysis. VIPER has been created in a modular fashion to allow for the rapid incorporation of new tools to expand the capabilities. This capacity has already been exploited to include very recently developed tools that explore immune infiltrate and T-cell CDR (Complementarity-Determining Region) reconstruction abilities. The pipeline has been conveniently packaged such that minimal computational skills are required to download and install the dozens of software packages that VIPER uses. VIPER is a comprehensive solution that performs most standard RNA-seq analyses quickly and effectively with a built-in capacity for customization and expansion.
Interactive Visualization to Advance Earthquake Simulation
NASA Astrophysics Data System (ADS)
Kellogg, Louise H.; Bawden, Gerald W.; Bernardin, Tony; Billen, Magali; Cowgill, Eric; Hamann, Bernd; Jadamec, Margarete; Kreylos, Oliver; Staadt, Oliver; Sumner, Dawn
2008-04-01
The geological sciences are challenged to manage and interpret increasing volumes of data as observations and simulations increase in size and complexity. For example, simulations of earthquake-related processes typically generate complex, time-varying data sets in two or more dimensions. To facilitate interpretation and analysis of these data sets, evaluate the underlying models, and to drive future calculations, we have developed methods of interactive visualization with a special focus on using immersive virtual reality (VR) environments to interact with models of Earth’s surface and interior. Virtual mapping tools allow virtual “field studies” in inaccessible regions. Interactive tools allow us to manipulate shapes in order to construct models of geological features for geodynamic models, while feature extraction tools support quantitative measurement of structures that emerge from numerical simulation or field observations, thereby enabling us to improve our interpretation of the dynamical processes that drive earthquakes. VR has traditionally been used primarily as a presentation tool, albeit with active navigation through data. Reaping the full intellectual benefits of immersive VR as a tool for scientific analysis requires building on the method’s strengths, that is, using both 3D perception and interaction with observed or simulated data. This approach also takes advantage of the specialized skills of geological scientists who are trained to interpret, the often limited, geological and geophysical data available from field observations.
Brusniak, Mi-Youn; Bodenmiller, Bernd; Campbell, David; Cooke, Kelly; Eddes, James; Garbutt, Andrew; Lau, Hollis; Letarte, Simon; Mueller, Lukas N; Sharma, Vagisha; Vitek, Olga; Zhang, Ning; Aebersold, Ruedi; Watts, Julian D
2008-01-01
Background Quantitative proteomics holds great promise for identifying proteins that are differentially abundant between populations representing different physiological or disease states. A range of computational tools is now available for both isotopically labeled and label-free liquid chromatography mass spectrometry (LC-MS) based quantitative proteomics. However, they are generally not comparable to each other in terms of functionality, user interfaces, information input/output, and do not readily facilitate appropriate statistical data analysis. These limitations, along with the array of choices, present a daunting prospect for biologists, and other researchers not trained in bioinformatics, who wish to use LC-MS-based quantitative proteomics. Results We have developed Corra, a computational framework and tools for discovery-based LC-MS proteomics. Corra extends and adapts existing algorithms used for LC-MS-based proteomics, and statistical algorithms, originally developed for microarray data analyses, appropriate for LC-MS data analysis. Corra also adapts software engineering technologies (e.g. Google Web Toolkit, distributed processing) so that computationally intense data processing and statistical analyses can run on a remote server, while the user controls and manages the process from their own computer via a simple web interface. Corra also allows the user to output significantly differentially abundant LC-MS-detected peptide features in a form compatible with subsequent sequence identification via tandem mass spectrometry (MS/MS). We present two case studies to illustrate the application of Corra to commonly performed LC-MS-based biological workflows: a pilot biomarker discovery study of glycoproteins isolated from human plasma samples relevant to type 2 diabetes, and a study in yeast to identify in vivo targets of the protein kinase Ark1 via phosphopeptide profiling. 
Conclusion The Corra computational framework leverages computational innovation to enable biologists or other researchers to process, analyze and visualize LC-MS data with what would otherwise be a complex and not user-friendly suite of tools. Corra enables appropriate statistical analyses, with controlled false-discovery rates, ultimately to inform subsequent targeted identification of differentially abundant peptides by MS/MS. For the user not trained in bioinformatics, Corra represents a complete, customizable, free and open source computational platform enabling LC-MS-based proteomic workflows, and as such, addresses an unmet need in the LC-MS proteomics field. PMID:19087345
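The "controlled false-discovery rates" mentioned in the conclusion are typically achieved with the Benjamini-Hochberg procedure; a minimal sketch follows, assuming illustrative p-values (the function and data are not from the Corra paper itself):

```python
# Hedged sketch: Benjamini-Hochberg false-discovery-rate control, the kind of
# correction an LC-MS differential-abundance pipeline applies to per-feature
# p-values before selecting peptides for targeted MS/MS identification.

def benjamini_hochberg(pvalues, alpha=0.05):
    """Return indices of p-values declared significant at FDR level alpha."""
    m = len(pvalues)
    # Sort p-values ascending, remembering original positions.
    order = sorted(range(m), key=lambda i: pvalues[i])
    # Find the largest rank k with p_(k) <= (k/m) * alpha.
    threshold_rank = 0
    for rank, idx in enumerate(order, start=1):
        if pvalues[idx] <= rank / m * alpha:
            threshold_rank = rank
    # All hypotheses with rank <= threshold_rank are rejected.
    return sorted(order[:threshold_rank])

# Illustrative p-values for ten hypothetical peptide features.
pvals = [0.001, 0.008, 0.039, 0.041, 0.042, 0.06, 0.074, 0.205, 0.212, 0.36]
significant = benjamini_hochberg(pvals)
```

Note that only the two smallest p-values survive here, even though several others fall below the unadjusted 0.05 cutoff; that is the multiple-testing control at work.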
The Watershed Management Optimization Support Tool (WMOST) is a public-domain software application designed to aid decision makers with integrated water resources management. The tool allows water resource managers and planners to screen a wide range of management practices for c...
ERIC Educational Resources Information Center
Trejo, Arturo
2013-01-01
The present quantitative correlational research study explored relationships between Emotional Intelligence (EI) competencies, such as self-awareness, self-management, social awareness, and relationship management, and project management outcomes: scope creep, in-budget project cost, and project timeliness. The study was conducted within the…
Kim, Mincheol; Jang, Yong-Chul; Lee, Seunguk
2013-10-15
The management of waste electrical and electronic equipment (WEEE) or electronic waste (e-waste) has become a major issue of concern for solid waste communities due to the large volumes of waste being generated from the consumption of modern electrical and electronic products. In 2003, Korea introduced the extended producer responsibility (EPR) system to reduce the amount of electronic products to be disposed and to promote resource recovery from WEEE. The EPR currently regulates a total of 10 electrical and electronic products. This paper presents the results of the application of the Delphi method and analytical hierarchy process (AHP) modeling to the WEEE management tool in the policy-making process. Specifically, this paper focuses on the application of the Delphi-AHP technique to determine the WEEE priority to be included in the EPR system. Appropriate evaluation criteria were derived using the Delphi method to assess the potential selection and priority among electrical and electronic products that will be regulated by the EPR system. Quantitative weightings from the AHP model were calculated to identify the priorities of electrical and electronic products to be potentially regulated. After applying all the criteria using the AHP model, the results indicate that the top 10 target recycling products for the expansion of the WEEE list were found to be vacuum cleaners, electric fans, rice cookers, large freezers, microwave ovens, water purifiers, air purifiers, humidifiers, dryers, and telephones in order from the first to last. The proposed Delphi-AHP method can offer a more efficient means of selecting WEEE than subjective assessment methods that are often based on professional judgment or limited available data. By providing WEEE items to be regulated, the proposed Delphi-AHP method can eliminate uncertainty and subjective assessment and enable WEEE management policy-makers to identify the priority of potential WEEE. 
More generally, the work performed in this study is an example of how Delphi-AHP modeling can be used as a decision-making process tool in WEEE management. Crown Copyright © 2013. Published by Elsevier Ltd. All rights reserved.
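The AHP weighting step described above can be sketched in a few lines; the comparison matrix and criteria below are hypothetical illustrations, not the study's actual Delphi-derived criteria:

```python
# Hedged sketch of AHP priority derivation: the common column-normalisation
# approximation of the principal eigenvector of a pairwise comparison matrix.
# The 3x3 Saaty-scale judgements (e.g. comparing hypothetical criteria such as
# waste volume, toxicity, recovery value) are invented for illustration.

def ahp_weights(matrix):
    """Approximate AHP priority weights by averaging normalised columns."""
    n = len(matrix)
    col_sums = [sum(matrix[r][c] for r in range(n)) for c in range(n)]
    # Normalise each column so it sums to 1, then average across each row.
    return [sum(matrix[r][c] / col_sums[c] for c in range(n)) / n
            for r in range(n)]

# Criterion 0 is judged 3x as important as criterion 1, 5x as criterion 2, etc.
pairwise = [
    [1.0,     3.0, 5.0],
    [1 / 3.0, 1.0, 2.0],
    [1 / 5.0, 1 / 2.0, 1.0],
]
weights = ahp_weights(pairwise)  # largest weight -> highest priority
```

In a Delphi-AHP workflow the matrix entries would come from aggregated expert judgements, and a consistency-ratio check would normally follow.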
Computational medicinal chemistry in fragment-based drug discovery: what, how and when.
Rabal, Obdulia; Urbano-Cuadrado, Manuel; Oyarzabal, Julen
2011-01-01
The use of fragment-based drug discovery (FBDD) has increased in the last decade due to the encouraging results obtained to date. In this scenario, computational approaches, together with experimental information, play an important role to guide and speed up the process. By default, FBDD is generally considered as a constructive approach. However, such additive behavior is not always present, therefore, simple fragment maturation will not always deliver the expected results. In this review, computational approaches utilized in FBDD are reported together with real case studies, where applicability domains are exemplified, in order to analyze them, and then, maximize their performance and reliability. Thus, a proper use of these computational tools can minimize misleading conclusions, keeping the credit on FBDD strategy, as well as achieve higher impact in the drug-discovery process. FBDD goes one step beyond a simple constructive approach. A broad set of computational tools: docking, R group quantitative structure-activity relationship, fragmentation tools, fragments management tools, patents analysis and fragment-hopping, for example, can be utilized in FBDD, providing a clear positive impact if they are utilized in the proper scenario - what, how and when. An initial assessment of additive/non-additive behavior is a critical point to define the most convenient approach for fragments elaboration.
3D Slicer as an Image Computing Platform for the Quantitative Imaging Network
Fedorov, Andriy; Beichel, Reinhard; Kalpathy-Cramer, Jayashree; Finet, Julien; Fillion-Robin, Jean-Christophe; Pujol, Sonia; Bauer, Christian; Jennings, Dominique; Fennessy, Fiona; Sonka, Milan; Buatti, John; Aylward, Stephen; Miller, James V.; Pieper, Steve; Kikinis, Ron
2012-01-01
Quantitative analysis has tremendous but mostly unrealized potential in healthcare to support objective and accurate interpretation of the clinical imaging. In 2008, the National Cancer Institute began building the Quantitative Imaging Network (QIN) initiative with the goal of advancing quantitative imaging in the context of personalized therapy and evaluation of treatment response. Computerized analysis is an important component contributing to reproducibility and efficiency of the quantitative imaging techniques. The success of quantitative imaging is contingent on robust analysis methods and software tools to bring these methods from bench to bedside. 3D Slicer is a free open source software application for medical image computing. As a clinical research tool, 3D Slicer is similar to a radiology workstation that supports versatile visualizations but also provides advanced functionality such as automated segmentation and registration for a variety of application domains. Unlike a typical radiology workstation, 3D Slicer is free and is not tied to specific hardware. As a programming platform, 3D Slicer facilitates translation and evaluation of the new quantitative methods by allowing the biomedical researcher to focus on the implementation of the algorithm, and providing abstractions for the common tasks of data communication, visualization and user interface development. Compared to other tools that provide aspects of this functionality, 3D Slicer is fully open source and can be readily extended and redistributed. In addition, 3D Slicer is designed to facilitate the development of new functionality in the form of 3D Slicer extensions. In this paper, we present an overview of 3D Slicer as a platform for prototyping, development and evaluation of image analysis tools for clinical research applications. 
To illustrate the utility of the platform in the scope of QIN, we discuss several use cases of 3D Slicer by the existing QIN teams, and we elaborate on the future directions that can further facilitate development and validation of imaging biomarkers using 3D Slicer. PMID:22770690
NASA Astrophysics Data System (ADS)
Kemper, Björn; Lenz, Philipp; Bettenworth, Dominik; Krausewitz, Philipp; Domagk, Dirk; Ketelhut, Steffi
2015-05-01
Digital holographic microscopy (DHM) has been demonstrated to be a versatile tool for high resolution non-destructive quantitative phase imaging of surfaces and multi-modal minimally-invasive monitoring of living cell cultures in-vitro. DHM provides quantitative monitoring of physiological processes through functional imaging and structural analysis which, for example, gives new insight into signalling of cellular water permeability and cell morphology changes due to toxins and infections. Quantitative DHM phase contrast also opens prospective application fields in the analysis of dissected tissues, through stain-free imaging and the quantification of tissue density changes. We show that DHM allows imaging of different tissue layers with high contrast in unstained tissue sections. As the investigation of fixed samples represents a very important application field in pathology, we also analyzed the influence of the sample preparation. The retrieved data demonstrate that the quality of quantitative DHM phase images of dissected tissues depends strongly on the fixing method and common staining agents. As in DHM the reconstruction is performed numerically, multi-focus imaging is achieved from a single digital hologram. Thus, we evaluated the automated refocussing feature of DHM for application on different types of dissected tissues and found that highly reproducible holographic autofocussing can be achieved on moderately stained samples. Finally, it is demonstrated that alterations of the spatial refractive index distribution in murine and human tissue samples represent a reliable absolute parameter related to different degrees of inflammation in experimental colitis and Crohn's disease. This paves the way towards the usage of DHM in digital pathology for automated histological examinations and further studies to elucidate the translational potential of quantitative phase microscopy for the clinical management of patients, e.g., with inflammatory bowel disease.
Systematic mechanism-orientated approach to chronic pancreatitis pain.
Bouwense, Stefan A W; de Vries, Marjan; Schreuder, Luuk T W; Olesen, Søren S; Frøkjær, Jens B; Drewes, Asbjørn M; van Goor, Harry; Wilder-Smith, Oliver H G
2015-01-07
Pain in chronic pancreatitis (CP) shows similarities with other visceral pain syndromes (i.e., inflammatory bowel disease and esophagitis), which should thus be managed in a similar fashion. Typical causes of CP pain include increased intrapancreatic pressure, pancreatic inflammation and pancreatic/extrapancreatic complications. Unfortunately, CP pain continues to be a major clinical challenge. It is recognized that ongoing pain may induce altered central pain processing, e.g., central sensitization or pro-nociceptive pain modulation. When this is present conventional pain treatment targeting the nociceptive focus, e.g., opioid analgesia or surgical/endoscopic intervention, often fails even if technically successful. If central nervous system pain processing is altered, specific treatment targeting these changes should be instituted (e.g., gabapentinoids, ketamine or tricyclic antidepressants). Suitable tools are now available to make altered central processing visible, including quantitative sensory testing, electroencephalograpy and (functional) magnetic resonance imaging. These techniques are potentially clinically useful diagnostic tools to analyze central pain processing and thus define optimum management approaches for pain in CP and other visceral pain syndromes. The present review proposes a systematic mechanism-orientated approach to pain management in CP based on a holistic view of the mechanisms involved. Future research should address the circumstances under which central nervous system pain processing changes in CP, and how this is influenced by ongoing nociceptive input and therapies. Thus we hope to predict which patients are at risk for developing chronic pain or not responding to therapy, leading to improved treatment of chronic pain in CP and other visceral pain disorders.
ERIC Educational Resources Information Center
Small, Christine J.; Newtoff, Kiersten N.
2013-01-01
Undergraduate biology education is undergoing dramatic changes, emphasizing student training in the "tools and practices" of science, particularly quantitative and problem-solving skills. We redesigned a freshman ecology lab to emphasize the importance of scientific inquiry and quantitative reasoning in biology. This multi-week investigation uses…
Quantitative Phase Determination by Using a Michelson Interferometer
ERIC Educational Resources Information Center
Pomarico, Juan A.; Molina, Pablo F.; D'Angelo, Cristian
2007-01-01
The Michelson interferometer is one of the best established tools for quantitative interferometric measurements. It has been, and is still successfully used, not only for scientific purposes, but it is also introduced in undergraduate courses for qualitative demonstrations as well as for quantitative determination of several properties such as…
Comparison of hospital databases on antibiotic consumption in France, for a single management tool.
Henard, S; Boussat, S; Demoré, B; Clément, S; Lecompte, T; May, T; Rabaud, C
2014-07-01
The surveillance of antibiotic use in hospitals and of data on resistance is an essential measure for antibiotic stewardship. There are 3 national systems in France to collect data on antibiotic use: DREES, ICATB, and ATB RAISIN. We compared these databases and drafted recommendations for the creation of an optimized database of information on antibiotic use, available to all concerned personnel: healthcare authorities, healthcare facilities, and healthcare professionals. We processed and analyzed the 3 databases (2008 data), and surveyed users. The qualitative analysis demonstrated major discrepancies in terms of objectives, healthcare facilities, participation rate, units of consumption, conditions for collection, consolidation, and control of data, and delay before availability of results. The quantitative analysis revealed that the consumption data for a given healthcare facility differed from one database to another, challenging the reliability of data collection. We specified user expectations: to compare consumption and resistance data, to carry out benchmarking, to obtain data on the prescribing habits in healthcare units, or to help understand results. The study results demonstrated the need for a reliable, single, and automated tool to manage data on antibiotic consumption compared with resistance data on several levels (national, regional, healthcare facility, healthcare units), providing rapid local feedback and educational benchmarking. Copyright © 2014 Elsevier Masson SAS. All rights reserved.
Plant pathogen nanodiagnostic techniques: forthcoming changes?
Khiyami, Mohammad A.; Almoammar, Hassan; Awad, Yasser M.; Alghuthaymi, Mousa A.; Abd-Elsalam, Kamel A.
2014-01-01
Plant diseases are among the major factors limiting crop productivity. A first step towards managing a plant disease under greenhouse and field conditions is to correctly identify the pathogen. Current technologies, such as quantitative polymerase chain reaction (Q-PCR), require a relatively large amount of target tissue and rely on multiple assays to accurately identify distinct plant pathogens. The common disadvantage of the traditional diagnostic methods is that they are time consuming and lack high sensitivity. Consequently, developing low-cost methods to improve the accuracy and rapidity of plant pathogen diagnosis is needed. Nanotechnologies such as nanoparticles and quantum dots (QDs) have emerged as essential tools for fast detection of a particular biological marker with extreme accuracy. Biosensors, QDs, nanostructured platforms, nanoimaging and nanopore DNA sequencing tools have the potential to raise the sensitivity, specificity and speed of pathogen detection, facilitate high-throughput analysis, and be used for high-quality monitoring and crop protection. Furthermore, nanodiagnostic kit equipment can easily and quickly detect potential serious plant pathogens, allowing experts to help farmers in the prevention of epidemic diseases. The current review deals with the application of nanotechnology for quicker, more cost-effective and precise diagnostic procedures of plant diseases. Such an accurate technology may help to design a proper integrated disease management system which may modify crop environments to adversely affect crop pathogens. PMID:26740775
The tools of an evidence-based culture: implementing clinical-practice guidelines in an Israeli HMO.
Kahan, Natan R; Kahan, Ernesto; Waitman, Dan-Andrei; Kitai, Eliezer; Chintz, David P
2009-09-01
Although clinical-practice guidelines (CPGs) are implemented on the assumption that they will improve the quality, efficiency, and consistency of health care, they generally have limited effect in changing physicians' behavior. The purpose of this study was to design and implement an effective program for formulating, promulgating, and implementing CPGs to foster the development of an evidence-based culture in an Israeli HMO. The authors implemented a four-stage program of stepwise collaborative efforts with academic institutions composed of developing quantitative tools to evaluate prescribing patterns, updating CPGs, collecting MDs' input via focus groups and quantitative surveys, and conducting a randomized controlled trial of a two-stage, multipronged intervention. The test case for this study was the development, dissemination, and implementation of CPG for the treatment of acute uncomplicated cystitis in adult women. Interventions in the form of a lecture at a conference and a letter with personalized feedback were implemented, both individually and combined, to improve physicians' rates of prescribing the first-line drug, nitrofurantoin, and, in the absence of nitrofurantoin, adhering to the recommended duration of three days of treatment with ofloxacin. The tools and data-generating capabilities designed and constructed in Stage I of the project were integral components of all subsequent stages of the program. Personalized feedback alone was sufficient to improve the rate of adherence to the guidelines by 19.4% (95% CI = 16.7, 22.1). This study provides a template for introducing the component of experimentation essential for cultivating an evidence-based culture. This process, composed of collaborative efforts between academic institutions and a managed care organization, may be beneficial to other health care systems.
30 CFR 735.18 - Grant application procedures.
Code of Federal Regulations, 2010 CFR
2010-07-01
... Column 5A of Forms OSM-51A and OSM-51B which reports the quantitative Program Management information of... of Form OSM-51C which reports the quantitative Program Management information of the Small Operator... use the application forms and procedures specified by OSM in accordance with Office of Management and...
Network of TAMCNS: Identifying Influence Regions Within the GCSS-MC Database
2017-06-01
relationships between objects and provides tools to quantitatively determine objects whose influence impacts other objects or the system as a whole. This methodology identifies the most important TAMCN and provides a list of TAMCNs in order of importance. We also analyze the community and core structure of...
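One standard way to quantitatively rank objects by influence within a network, as described above, is eigenvector centrality; a minimal power-iteration sketch on a toy graph follows (node names are placeholders, not actual TAMCNs):

```python
# Hedged sketch: eigenvector centrality by power iteration on a small
# undirected graph. Nodes connected to other influential nodes score highest.

def eigenvector_centrality(adj, iterations=100):
    """Repeatedly apply x <- A x and renormalise; converges to the
    principal eigenvector of the adjacency matrix for connected graphs."""
    nodes = list(adj)
    x = {n: 1.0 for n in nodes}
    for _ in range(iterations):
        nxt = {n: sum(x[m] for m in adj[n]) for n in nodes}
        norm = sum(v * v for v in nxt.values()) ** 0.5
        x = {n: v / norm for n, v in nxt.items()}
    return x

# A hub ("A") connected to every other node should score highest.
graph = {
    "A": ["B", "C", "D"],
    "B": ["A", "C"],
    "C": ["A", "B"],
    "D": ["A"],
}
scores = eigenvector_centrality(graph)
ranking = sorted(scores, key=scores.get, reverse=True)
```

Community and core-structure analysis would build on the same adjacency representation with different algorithms (e.g. modularity maximisation, k-core decomposition).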
32 CFR 806b.54 - Information collections, records, and forms or information management tools (IMT).
Code of Federal Regulations, 2010 CFR
2010-07-01
... information management tools (IMT). 806b.54 Section 806b.54 National Defense Department of Defense (Continued..., records, and forms or information management tools (IMT). (a) Information Collections. No information.../pubfiles/af/37/afman37-139/afman37-139.pdf. (c) Forms or Information Management Tools (Adopted and...
Statistical, economic and other tools for assessing natural aggregate
Bliss, J.D.; Moyle, P.R.; Bolm, K.S.
2003-01-01
Quantitative aggregate resource assessment provides resource estimates useful for explorationists, land managers and those who make decisions about land allocation, which may have long-term implications concerning cost and the availability of aggregate resources. Aggregate assessment needs to be systematic and consistent, yet flexible enough to allow updating without invalidating other parts of the assessment. Evaluators need to use standard or consistent aggregate classifications and statistical distributions or, in other words, models with geological, geotechnical and economic variables or interrelationships between these variables. These models can be used with subjective estimates, if needed, to estimate how much aggregate may be present in a region or country using distributions generated by Monte Carlo computer simulations.
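A Monte Carlo tonnage simulation of the kind described can be sketched as follows; the lognormal parameters and density range are invented for illustration, not taken from the assessment models:

```python
# Hedged sketch: propagate uncertainty in deposit area, thickness, and bulk
# density through to an aggregate-tonnage distribution, then report the
# percentiles a quantitative assessment would typically quote.

import random

random.seed(42)  # reproducible illustration

def simulate_tonnage(trials=10_000):
    totals = []
    for _ in range(trials):
        area_m2 = random.lognormvariate(13.0, 0.5)     # deposit footprint
        thickness_m = random.lognormvariate(1.5, 0.3)  # gravel thickness
        density_t_m3 = random.uniform(1.8, 2.2)        # bulk density
        totals.append(area_m2 * thickness_m * density_t_m3)
    totals.sort()
    # 10th/50th/90th percentiles summarise the simulated distribution.
    return {p: totals[int(p / 100 * trials)] for p in (10, 50, 90)}

percentiles = simulate_tonnage()
```

In a real assessment the input distributions would be fitted to geological and geotechnical data rather than invented, but the propagation step is the same.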
Human and climate impact on global riverine water and sediment fluxes - a distributed analysis
NASA Astrophysics Data System (ADS)
Cohen, S.; Kettner, A.; Syvitski, J. P.
2013-05-01
Understanding riverine water and sediment dynamics is an important undertaking both for socially relevant issues such as agriculture, water security and infrastructure management, and for scientific analysis of climate, landscapes, river ecology, oceanography and other disciplines. Providing good quantitative and predictive tools is therefore timely, particularly in light of predicted climate and land-use changes. The intensity and dynamics of man-made and climatic factors vary widely across the globe and are hard to predict, warranting the use of sophisticated numerical models. Here we use a distributed global riverine sediment and water discharge model (WBMsed) to simulate human and climate effects on our planet's large rivers.
Improving Aquatic Plant Management in the California Sacramento-San Joaquin Delta
NASA Technical Reports Server (NTRS)
Bubenheim, David L.; Potter, Chris
2018-01-01
Management of aquatic weeds in complex watersheds and river systems presents many challenges for the assessment, planning and implementation of management practices for floating and submerged aquatic invasive plants. The Delta Region Areawide Aquatic Weed Project (DRAAWP), a USDA-sponsored area-wide project, is working to enhance planning, decision-making and operational efficiency in the California Sacramento-San Joaquin Delta. Satellite and airborne remote sensing are used to map area coverage and biomass, direct operations, and assess management impacts on plant communities. Archived satellite records are used to review results from previous climate and management events and to aid in developing long-term strategies. Modeling at local and watershed scales provides insight into land-use effects on water quality. Plant growth models informed by remote sensing are being applied spatially across the Delta to balance location and type of aquatic plant, growth response to altered environments, phenology, environmental regulations, and economics in the selection of management practices. Initial utilization of remote sensing tools developed for mapping aquatic invasive weeds improved operational efficiency by focusing limited chemical use on strategic areas with high plant-control impact and incorporating mechanical harvesting when chemical use is restricted. These assessment methods provide a comprehensive and quantitative view of aquatic invasive plant communities in the California Delta, both spatial and temporal, informed by ecological understanding, with the objective of improving management and assessment effectiveness.
Validating a tool to measure auxiliary nurse midwife and nurse motivation in rural Nepal.
Morrison, Joanna; Batura, Neha; Thapa, Rita; Basnyat, Regina; Skordis-Worrall, Jolene
2015-05-12
A global shortage of health workers in rural areas increases the salience of motivating and supporting existing health workers. Understandings of motivation may vary in different settings, and it is important to use measurement methods that are contextually appropriate. We identified a measurement tool, previously used in Kenya, and explored its validity and reliability to measure the motivation of auxiliary nurse midwives (ANM) and staff nurses (SN) in rural Nepal. Qualitative and quantitative methods were used to assess the content validity, the construct validity, the internal consistency and the reliability of the tool. We translated the tool into Nepali and it was administered to 137 ANMs and SNs in three districts. We collected qualitative data from 78 nursing personnel and district- and central-level stakeholders using interviews and focus group discussions. We calculated motivation scores for ANMs and SNs using the quantitative data and conducted statistical tests for validity and reliability. Motivation scores were compared with qualitative data. Descriptive exploratory analysis compared mean motivation scores by ANM and SN sociodemographic characteristics. The concept of self-efficacy was added to the tool before data collection. Motivation was revealed through conscientiousness. Teamwork and the exertion of extra effort were not adequately captured by the tool, but important in illustrating motivation. The statement on punctuality was problematic in quantitative analysis, and attendance was more expressive of motivation. The calculated motivation scores usually reflected ANM and SN interview data, with some variation in other stakeholder responses. The tool scored within acceptable limits in validity and reliability testing and was able to distinguish motivation of nursing personnel with different sociodemographic characteristics. 
We found that with minor modifications, the tool provided valid and internally consistent measures of motivation among ANMs and SNs in this context. We recommend the use of this tool in similar contexts, with the addition of statements about self-efficacy, teamwork and exertion of extra effort. Absenteeism should replace the punctuality statement, and statements should be worded both positively and negatively to mitigate positive response bias. Collection of qualitative data on motivation creates a more nuanced understanding of quantitative scores.
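Internal consistency of the kind tested in this study is conventionally measured with Cronbach's alpha; a minimal sketch with made-up Likert-type scores follows (not the study's data):

```python
# Hedged sketch: Cronbach's alpha for a multi-item motivation scale.
# Four items scored by five hypothetical respondents; values above ~0.7
# are conventionally taken as acceptable internal consistency.

def variance(xs):
    """Sample variance (n - 1 denominator)."""
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

def cronbach_alpha(items):
    """items: one list of scores per item, aligned by respondent."""
    k = len(items)
    item_vars = sum(variance(item) for item in items)
    totals = [sum(scores) for scores in zip(*items)]  # per-respondent totals
    return k / (k - 1) * (1 - item_vars / variance(totals))

items = [
    [4, 5, 3, 4, 4],  # item 1 scores across 5 respondents
    [4, 4, 3, 5, 4],
    [5, 5, 2, 4, 3],
    [4, 5, 3, 4, 4],
]
alpha = cronbach_alpha(items)
```

Validation work like that described above would pair this statistic with qualitative checks, since a high alpha alone does not show the items capture the intended construct.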
NASA Astrophysics Data System (ADS)
Iltis, G.; Caswell, T. A.; Dill, E.; Wilkins, S.; Lee, W. K.
2014-12-01
X-ray tomographic imaging of porous media has proven to be a valuable tool for investigating and characterizing the physical structure and state of both natural and synthetic porous materials, including glass bead packs, ceramics, soil and rock. Given that most synchrotron facilities have user programs which grant academic researchers access to facilities and x-ray imaging equipment free of charge, a key limitation or hindrance for small research groups interested in conducting x-ray imaging experiments is the financial cost associated with post-experiment data analysis. While the cost of high performance computing hardware continues to decrease, expenses associated with licensing commercial software packages for quantitative image analysis continue to increase, with current prices being as high as $24,000 USD, for a single user license. As construction of the Nation's newest synchrotron accelerator nears completion, a significant effort is being made here at the National Synchrotron Light Source II (NSLS-II), Brookhaven National Laboratory (BNL), to provide an open-source, experiment-to-publication toolbox that reduces the financial and technical 'activation energy' required for performing sophisticated quantitative analysis of multidimensional porous media data sets, collected using cutting-edge x-ray imaging techniques. Implementation focuses on leveraging existing open-source projects and developing additional tools for quantitative analysis. We will present an overview of the software suite that is in development here at BNL including major design decisions, a demonstration of several test cases illustrating currently available quantitative tools for analysis and characterization of multidimensional porous media image data sets and plans for their future development.
Drawert, Brian; Trogdon, Michael; Toor, Salman; Petzold, Linda; Hellander, Andreas
2017-01-01
Computational experiments using spatial stochastic simulations have led to important new biological insights, but they require specialized tools and a complex software stack, as well as large and scalable compute and data analysis resources due to the large computational cost associated with Monte Carlo computational workflows. The complexity of setting up and managing a large-scale distributed computation environment to support productive and reproducible modeling can be prohibitive for practitioners in systems biology. This results in a barrier to the adoption of spatial stochastic simulation tools, effectively limiting the type of biological questions addressed by quantitative modeling. In this paper, we present PyURDME, a new, user-friendly spatial modeling and simulation package, and MOLNs, a cloud computing appliance for distributed simulation of stochastic reaction-diffusion models. MOLNs is based on IPython and provides an interactive programming platform for development of sharable and reproducible distributed parallel computational experiments. PMID:28190948
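The well-mixed building block underlying spatial stochastic simulators such as PyURDME is Gillespie's stochastic simulation algorithm; a minimal sketch for a single decay reaction follows (rate constant and counts are illustrative, and PyURDME's actual API, which adds reaction-diffusion on a mesh, differs):

```python
# Hedged sketch: Gillespie's direct method for the reaction A -> B with
# propensity k * A. Each firing converts one A molecule to B after an
# exponentially distributed waiting time.

import random

random.seed(1)  # reproducible illustration

def gillespie_decay(a0=100, k=0.5, t_end=20.0):
    """Simulate A -> B until time t_end; return final counts (A, B)."""
    t, a, b = 0.0, a0, 0
    while a > 0:
        propensity = k * a
        # Waiting time to the next reaction event.
        t += random.expovariate(propensity)
        if t > t_end:
            break
        a, b = a - 1, b + 1
    return a, b

a_final, b_final = gillespie_decay()
```

Spatial stochastic tools extend this by subdividing the domain into voxels and treating diffusion between voxels as additional first-order events, which is where the Monte Carlo cost discussed in the abstract comes from.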
Geographic information systems, remote sensing, and spatial analysis activities in Texas, 2002-07
Pearson, D.K.; Gary, R.H.; Wilson, Z.D.
2007-01-01
Geographic information system (GIS) technology has become an important tool for scientific investigation, resource management, and environmental planning. A GIS is a computer-aided system capable of collecting, storing, analyzing, and displaying spatially referenced digital data. GIS technology is particularly useful when analyzing a wide variety of spatial data such as with remote sensing and spatial analysis. Remote sensing involves collecting remotely sensed data, such as satellite imagery, aerial photography, or radar images, and analyzing the data to gather information or investigate trends about the environment or the Earth's surface. Spatial analysis combines remotely sensed, thematic, statistical, quantitative, and geographical data through overlay, modeling, and other analytical techniques to investigate specific research questions. It is the combination of data formats and analysis techniques that has made GIS an essential tool in scientific investigations. This document presents information about the technical capabilities and project activities of the U.S. Geological Survey (USGS) Texas Water Science Center (TWSC) GIS Workgroup from 2002 through 2007.
Lanfranchi, Fiorella; Alaimo, Sara; Conway, P M
2014-01-01
In 2010, Italian regulatory guidelines were issued consisting of a stepwise procedure for the assessment and management of work-related stress. However, research that empirically examines whether this procedure is effective in accurately identifying critical psychosocial factors and informing risk management is scarce. The aim was to examine the differential sensitivity of two approaches to risk assessment: the first based on objective instruments only, the second an integrated approach combining different methods and theoretical perspectives. We examined a sample of 306 healthcare employees in a large hospital in northern Italy, using a series of tools, both quantitative (an observational checklist and the HSE-IT and MOHQ questionnaires) and qualitative (focus groups). Through instrument-specific reference values, we then compared risk profiles between different homogeneous groups within the institution. The psychosocial work environment appeared far more positive under the first approach to risk assessment than under the second. The latter approach was also more sensitive in detecting between-group differences in risk profiles. Furthermore, the focus groups returned a more context-specific picture of the psychosocial work environment. Finally, going beyond the emphasis on negative working conditions inherent in the other quantitative instruments, the MOHQ also allowed for identifying health-promoting factors in need of improvement. Although more research is needed to confirm our findings, the present study suggests that using an integrated approach to assess the psychosocial work environment may be the most effective way to accurately identify risk factors and support the management process.
Risk management in the North sea offshore industry: History, status and challenges
NASA Astrophysics Data System (ADS)
Smith, E. J.
1995-10-01
There have been major changes in the UK and Norwegian offshore safety regimes in the last decade. On the basis of accumulated experience (including some major accidents), there has been a move away from a rigid, prescriptive approach to setting safety standards; it is now recognised that a more flexible, "goal-setting" approach is better suited to achieving cost-effective solutions to offshore safety. To adapt to this approach, offshore operators are increasingly using Quantitative Risk Assessment (QRA) techniques as part of their risk management programmes. Structured risk assessment can be used at all stages of a project life-cycle. In the design stages (concept and detailed design), these techniques are valuable tools for ensuring that money is wisely spent on safety-related systems. In the operational stage, QRA can aid the development of procedures. High-quality Safety Management Systems (SMSs), covering issues such as training, inspection, and emergency planning, are crucial to maintaining "as-designed" levels of safety and reliability. Audits of SMSs should be carried out throughout the operational phase to ensure that risky conditions do not accumulate.
Health, Environment and Social Management in Enterprises programme in the Republic of Macedonia.
Karadzinska-Bislimovska, Jovanka; Baranski, Boguslaw; Risteska-Kuc, Snezana
2004-01-01
Macedonia is the first country in the region to launch implementation of the WHO Health, Environment and Social Management in Enterprises (HESME) Programme, following the WHO Ministerial Conference on Environment and Health held in London in 1999. The aim of this paper is to describe the efforts made to implement this programme. Methods are based on integrated management, with joint involvement of crucial partners at all levels of activity suggested by the WHO. These efforts include commitment to intersectoral and interagency collaboration at the national level, adoption of a final version of a National HESME Plan with basic principles, criteria and concrete activities, establishment of a National coordination center for the HESME Project, development of training curricula and specific educational tools for occupational health personnel, preparation of questionnaires and procedures for a national survey to detect high occupational risks, specific occupational hazards and health promotion needs of the working population, and, finally, the setting up of quantitative and qualitative indicators for national or provincial workplace health profiles. Building up the concept of cooperation, partnership and common work in HESME activities is a challenge for the new public health view in Europe.
Bridging the Engineering and Medicine Gap
NASA Technical Reports Server (NTRS)
Walton, M.; Antonsen, E.
2018-01-01
A primary challenge NASA faces is communication between the disparate entities of engineers and human system experts in life sciences. Clear communication is critical for exploration mission success from the perspective of both risk analysis and data handling. The engineering community uses probabilistic risk assessment (PRA) models to inform their own risk analysis and has extensive experience managing mission data, but does not always fully consider human systems integration (HSI). The medical community, as a part of HSI, has been working 1) to develop a suite of tools to express medical risk in quantitative terms that are relatable to the engineering approaches commonly in use, and 2) to manage and integrate HSI data with engineering data. This talk will review the development of the Integrated Medical Model as an early attempt to bridge the communication gap between the medical and engineering communities in the language of PRA. This will also address data communication between the two entities in the context of data management considerations of the Medical Data Architecture. Lessons learned from these processes will help identify important elements to consider in future communication and integration of these two groups.
The Portland Harbor Superfund Site Sustainability Project: Introduction.
Fitzpatrick, Anne G; Apitz, Sabine E; Harrison, David; Ruffle, Betsy; Edwards, Deborah A
2018-01-01
This article introduces the Portland Harbor Superfund Site Sustainability Project (PHSP) special series in this issue. The Portland Harbor Superfund Site is one of the "mega-sediment sites" in the United States, comprising about 10 miles of the Lower Willamette River, running through the heart of Portland, Oregon. The primary aim of the PHSP was to conduct a comprehensive sustainability assessment, integrating environmental, economic, and social considerations of a selection of the remedial alternatives laid out by the US Environmental Protection Agency. A range of tools were developed for this project to quantitatively address environmental, economic, and social costs and benefits based upon diverse stakeholder values. In parallel, a probabilistic risk assessment was carried out to evaluate the risk assumptions at the core of the remedial investigation and feasibility study process. Integr Environ Assess Manag 2018;14:17-21. © 2017 The Authors. Integrated Environmental Assessment and Management published by Wiley Periodicals, Inc. on behalf of Society of Environmental Toxicology & Chemistry (SETAC).
Diagnosis and treatment of superficial esophageal cancer
Barret, Maximilien; Prat, Frédéric
2018-01-01
Endoscopy allows for the screening, early diagnosis, treatment and follow up of superficial esophageal cancer. Endoscopic submucosal dissection has become the gold standard for the resection of superficial squamous cell neoplasia. Combinations of endoscopic mucosal resection and radiofrequency ablation are the mainstay of the management of Barrett’s associated neoplasia. However, protruded, non-lifting or large lesions may be better managed by endoscopic submucosal dissection. Novel ablation tools, such as argon plasma coagulation with submucosal lifting and cryoablation balloons, are being developed for the treatment of residual Barrett’s esophagus, since iatrogenic strictures still hamper the development of extensive circumferential resections in the esophagus. Optimal surveillance modalities after endoscopic resection are still to be determined. The assessment of the risk of lymph-node metastases, as well as of the need for additional treatments based on qualitative and quantitative histological criteria, weighed against the patient’s condition, requires a dedicated multidisciplinary team decision process. The need for trained endoscopists, expert pathologists and surgeons, and specialized multidisciplinary meetings underlines the role of expert centers in the management of superficial esophageal cancer. PMID:29720850
Understanding online health information: Evaluation, tools, and strategies.
Beaunoyer, Elisabeth; Arsenault, Marianne; Lomanowska, Anna M; Guitton, Matthieu J
2017-02-01
Considering the status of the Internet as a prominent source of health information, assessing online health material has become a central issue in patient education. We describe the strategies available to evaluate the characteristics of online health information, including readability, emotional content, understandability, and usability. Popular tools used in the assessment of readability, emotional content and comprehensibility of online health information were reviewed. Tools designed to evaluate both printed and online material were considered. Readability tools are widely used in online health material evaluation and are highly covariant. Assessment of emotional content of online health-related communications via sentiment analysis tools is becoming more popular. Understandability and usability tools have been developed specifically for health-related material, but each tool has important limitations and has been tested on a limited number of health issues. Despite the availability of numerous assessment tools, their overall reliability differs between readability (high) and understandability (low). Approaches combining multiple assessment tools and involving both quantitative and qualitative observations would optimize assessment strategies. Effective assessment of online health information should rely on mixed strategies combining quantitative and qualitative evaluations. Assessment tools should be selected according to their functional properties and compatibility with target material. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
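As an illustration of the readability metrics such tools compute, here is a minimal sketch of the classic Flesch Reading Ease formula with a deliberately naive vowel-group syllable counter; production readability tools use pronunciation dictionaries and better heuristics:

```python
import re

def count_syllables(word):
    """Naive syllable estimate: count groups of consecutive vowels."""
    groups = re.findall(r"[aeiouy]+", word.lower())
    return max(1, len(groups))

def flesch_reading_ease(text):
    """Flesch Reading Ease: higher scores indicate easier text.

    score = 206.835 - 1.015*(words/sentences) - 84.6*(syllables/words)
    """
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    words = re.findall(r"[A-Za-z']+", text)
    syllables = sum(count_syllables(w) for w in words)
    return (206.835
            - 1.015 * (len(words) / len(sentences))
            - 84.6 * (syllables / len(words)))

print(flesch_reading_ease("The cat sat on the mat."))
```

The high covariance among readability tools noted above follows from their shared structure: nearly all are linear combinations of sentence length and word length or syllable counts.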
Measurement Tools for the Immersive Visualization Environment: Steps Toward the Virtual Laboratory.
Hagedorn, John G; Dunkers, Joy P; Satterfield, Steven G; Peskin, Adele P; Kelso, John T; Terrill, Judith E
2007-01-01
This paper describes a set of tools for performing measurements of objects in a virtual reality based immersive visualization environment. These tools enable the use of the immersive environment as an instrument for extracting quantitative information from data representations that hitherto had been used solely for qualitative examination. We provide, within the virtual environment, ways for the user to analyze and interact with the quantitative data generated. We describe results generated by these methods to obtain dimensional descriptors of tissue engineered medical products. We regard this toolbox as our first step in the implementation of a virtual measurement laboratory within an immersive visualization environment.
Muir-Paulik, S A; Johnson, L E A; Kennedy, P; Aden, T; Villanueva, J; Reisdorf, E; Humes, R; Moen, A C
2016-01-01
The 2005 International Health Regulations (IHR 2005) emphasized the importance of laboratory capacity to detect emerging diseases including novel influenza viruses. To support IHR 2005 requirements and the need to enhance influenza laboratory surveillance capacity, the Association of Public Health Laboratories (APHL) and the Centers for Disease Control and Prevention (CDC) Influenza Division developed the International Influenza Laboratory Capacity Review (Tool). Data from 37 assessments were reviewed and analyzed to verify that the quantitative analysis results accurately depicted a laboratory's capacity and capabilities. Subject matter experts in influenza and laboratory practice used an iterative approach to develop the Tool incorporating feedback and lessons learnt through piloting and implementation. To systematically analyze assessment data, a quantitative framework for analysis was added to the Tool. The review indicated that changes in scores consistently reflected enhanced or decreased capacity. The review process also validated the utility of adding a quantitative analysis component to the assessments and the benefit of establishing a baseline from which to compare future assessments in a standardized way. Use of the Tool has provided APHL, CDC and each assessed laboratory with a standardized analysis of the laboratory's capacity. The information generated is used to improve laboratory systems for laboratory testing and enhance influenza surveillance globally. We describe the development of the Tool and lessons learnt. Copyright © 2015 The Authors. Published by Elsevier Ltd. All rights reserved.
Watershed Management Optimization Support Tool (WMOST) Workshop.
EPA's Watershed Management Optimization Support Tool (WMOST) version 2 is a decision support tool designed to facilitate integrated water management by communities at the small watershed scale. WMOST allows users to look across management options in stormwater (including green i...
30 CFR 735.18 - Grant application procedures.
Code of Federal Regulations, 2014 CFR
2014-07-01
... Column 5A of Forms OSM-51A and OSM-51B which reports the quantitative Program Management information of... of Form OSM-51C which reports the quantitative Program Management information of the Small Operator...
30 CFR 735.18 - Grant application procedures.
Code of Federal Regulations, 2011 CFR
2011-07-01
... Column 5A of Forms OSM-51A and OSM-51B which reports the quantitative Program Management information of... of Form OSM-51C which reports the quantitative Program Management information of the Small Operator...
30 CFR 735.18 - Grant application procedures.
Code of Federal Regulations, 2012 CFR
2012-07-01
... Column 5A of Forms OSM-51A and OSM-51B which reports the quantitative Program Management information of... of Form OSM-51C which reports the quantitative Program Management information of the Small Operator...
30 CFR 735.18 - Grant application procedures.
Code of Federal Regulations, 2013 CFR
2013-07-01
... Column 5A of Forms OSM-51A and OSM-51B which reports the quantitative Program Management information of... of Form OSM-51C which reports the quantitative Program Management information of the Small Operator...
Bayesian networks for maritime traffic accident prevention: benefits and challenges.
Hänninen, Maria
2014-12-01
Bayesian networks are quantitative modeling tools whose applications to the maritime traffic safety context are becoming more popular. This paper discusses the utilization of Bayesian networks in maritime safety modeling. Based on literature and the author's own experiences, the paper studies what Bayesian networks can offer to maritime accident prevention and safety modeling and discusses a few challenges in their application to this context. It is argued that the capability of representing rather complex, not necessarily causal but uncertain relationships makes Bayesian networks an attractive modeling tool for maritime safety and accident analysis. Furthermore, as maritime accident and safety data are still rather scarce and have some quality problems, the possibility to combine data with expert knowledge and the easy way of updating the model after acquiring more evidence further enhance their feasibility. However, eliciting the probabilities from the maritime experts might be challenging and the model validation can be tricky. It is concluded that with the utilization of several data sources, Bayesian updating, dynamic modeling, and hidden nodes for latent variables, Bayesian networks are rather well-suited tools for maritime safety management and decision-making. Copyright © 2014 Elsevier Ltd. All rights reserved.
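The Bayesian updating the abstract refers to reduces, at a single node, to Bayes' rule over a discrete hypothesis space. A minimal sketch — the probabilities below are invented for illustration, not taken from any maritime dataset:

```python
def posterior(prior, likelihoods):
    """Bayes' rule over a discrete hypothesis space.

    prior: {hypothesis: P(h)}; likelihoods: {hypothesis: P(evidence | h)}.
    Returns {hypothesis: P(h | evidence)}.
    """
    joint = {h: prior[h] * likelihoods[h] for h in prior}
    z = sum(joint.values())  # normalizing constant P(evidence)
    return {h: p / z for h, p in joint.items()}

# Toy question: how likely was heavy weather, given that an accident occurred?
prior = {"heavy_weather": 0.3, "calm_weather": 0.7}
p_accident = {"heavy_weather": 0.10, "calm_weather": 0.02}
post = posterior(prior, p_accident)
print(round(post["heavy_weather"], 3))  # → 0.682
```

A full Bayesian network chains many such conditional tables along a directed acyclic graph, which is what lets expert-elicited probabilities and scarce data be combined in one model.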
Prioritizing Seafloor Mapping for Washington’s Pacific Coast
Battista, Timothy; Buja, Ken; Christensen, John; Hennessey, Jennifer; Lassiter, Katrina
2017-01-01
Remote sensing systems are critical tools used for characterizing the geological and ecological composition of the seafloor. However, creating comprehensive and detailed maps of ocean and coastal environments has been hindered by the high cost of operating ship- and aircraft-based sensors. While a number of groups (e.g., academic research, government resource management, and private sector) are engaged in or would benefit from the collection of additional seafloor mapping data, disparate priorities, dauntingly large data gaps, and insufficient funding have confounded strategic planning efforts. In this study, we addressed these challenges by implementing a quantitative, spatial process to facilitate prioritizing seafloor mapping needs in Washington State. The Washington State Prioritization Tool (WASP), a custom web-based mapping tool, was developed to solicit and analyze mapping priorities from each participating group. The process resulted in the identification of several discrete, high priority mapping hotspots. As a result, several of the areas have been or will be subsequently mapped. Furthermore, information captured during the process about the intended application of the mapping data was paramount for identifying the optimum remote sensing sensors and acquisition parameters to use during subsequent mapping surveys. PMID:28350338
Assessing youth policies. A system of indicators for local government.
Planas, Anna; Soler, Pere; Vilà, Montserrat
2014-08-01
In the current European climate of economic, financial and political crisis and the questioning of the welfare state, assessing public policies assumes primary and strategic relevance in clarifying the results and contributions of policy actions. In this article, we aim to present the current situation in relation to youth policy assessment so as to formulate a system of assessment indicators in the sphere of Spanish local government youth policy. A review is conducted of some of the principal contributions in the field of constructing indicators for evaluating youth policies. We have found that most of these evaluation tools exist on a national or state level and that there is a dearth of local or municipal tools. The article concludes with a concrete proposal for an assessment tool: the SIAPJove (Sistema d'Indicadors d'Avaluació per a les Polítiques Municipals de Joventut or System of Assessment Indicators for Local Government Youth Policies) (web page: http://siapjove.udg.edu/). It provides both quantitative and qualitative indicators that allow local youth policy managers to obtain assessment reports with relative ease in 12 possible assessment areas within youth policy. Copyright © 2014 Elsevier Ltd. All rights reserved.
Vetter, Thomas R; Barman, Joydip; Boudreaux, Arthur M; Jones, Keith A
2016-03-22
Persistently variable success has been experienced in locally translating even well-grounded national clinical practice guidelines, including in the perioperative setting. We have sought greater applicability and acceptance of clinical practice guidelines and protocols with our novel Perioperative Risk Optimization and Management Planning Tool (PROMPT™). This study was undertaken to survey our institutional perioperative clinicians regarding (a) their qualitative recommendations for, and (b) their quantitative perceptions of the relative importance of, a series of clinical issues and patient medical conditions as potential topics for creating a PROMPT™. We applied a mixed methods research design that involved collecting, analyzing, and "mixing" both qualitative and quantitative methods and data in a single study to answer a research question. Survey One was qualitative in nature and asked the study participants to list as free text up to 12 patient medical conditions or clinical issues that they perceived to be high priority topics for development of a PROMPT™. Survey Two was quantitative in nature and asked the study participants to rate each of these 57 specific, pre-selected clinical issues and patient medical conditions on an 11-point Likert scale of perceived importance as a potential topic for a PROMPT™. The two electronic, online surveys were completed by participants who were recruited from the faculty in our Department of Anesthesiology and Perioperative Medicine and Department of Surgery, and the cohort of hospital-employed certified registered nurse anesthetists. A total of 57 possible topics for a PROMPT™ was created and prioritized by our stakeholders. A strong correlation (r = 0.82, 95% CI: 0.71, 0.89, P < 0.001) was observed between the quantitative clinician survey rating scores reported by the anesthesiologists/certified registered nurse anesthetists versus the surgeons.
The quantitative survey displayed strong inter-rater reliability (ICC = 0.92, P < 0.001). Our qualitative clinician stakeholder survey generated a comprehensive roster of clinical issues and patient medical conditions. Our subsequent quantitative clinician stakeholder survey indicated that there is generally strong agreement among anesthesiologists/certified registered nurse anesthetists and surgeons about the relative importance of these clinical issues and patient medical conditions as potential topics for perioperative optimization and risk management.
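The between-group correlation reported here is an ordinary Pearson r over topic-level ratings. A minimal sketch with hypothetical ratings (the study's actual data are not reproduced):

```python
import math

def pearson_r(xs, ys):
    """Pearson product-moment correlation between two rating vectors."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical mean importance ratings for five topics from two rater groups
anesthesia = [9.1, 7.4, 8.8, 5.2, 6.9]
surgery = [8.7, 7.0, 9.0, 4.8, 6.5]
print(round(pearson_r(anesthesia, surgery), 2))
```

In the study, r was computed across the 57 topics between the two clinician groups, with the intraclass correlation coefficient assessing agreement among raters within a group.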
PC tools for project management: Programs and the state-of-the-practice
NASA Technical Reports Server (NTRS)
Bishop, Peter C.; Freedman, Glenn B.; Dede, Christopher J.; Lidwell, William; Learned, David
1990-01-01
The use of microcomputer tools for NASA project management is addressed, including which features are the most useful, the impact of these tools on job performance and individual style, and the prospects for new features in project management tools and related tools. High-, mid-, and low-end PM tools are examined. The pros and cons of the tools are assessed relative to various tasks. The strengths and weaknesses of the tools are presented through cases and demonstrations.
NASA Astrophysics Data System (ADS)
Escuder-Bueno, I.; Castillo-Rodríguez, J. T.; Zechner, S.; Jöbstl, C.; Perales-Momparler, S.; Petaccia, G.
2012-09-01
Risk analysis has become a top priority for authorities and stakeholders in many European countries, with the aim of reducing flooding risk, considering the population's needs and improving risk awareness. Within this context, two methodological pieces have been developed in the period 2009-2011 within the SUFRI project (Sustainable Strategies of Urban Flood Risk Management with non-structural measures to cope with the residual risk, 2nd ERA-Net CRUE Funding Initiative). First, the "SUFRI Methodology for pluvial and river flooding risk assessment in urban areas to inform decision-making" provides a comprehensive and quantitative tool for flood risk analysis. Second, the "Methodology for investigation of risk awareness of the population concerned" presents the basis to estimate current risk from a social perspective and identify tendencies in the way floods are understood by citizens. Outcomes of both methods are integrated in this paper with the aim of informing decision making on non-structural protection measures. The results of two case studies are shown to illustrate practical applications of this developed approach. The main advantage of applying the methodology presented herein lies in providing a quantitative estimate of flooding risk before and after investing in non-structural risk mitigation measures. It can be of great interest to decision makers as it provides rational and solid information.
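A common quantitative core of flood risk methodologies of this kind is expected annual damage (EAD): the integral of damage over annual exceedance probability. The sketch below uses invented damage figures, not SUFRI's actual numbers, and a simple trapezoidal rule:

```python
def expected_annual_damage(points):
    """Trapezoidal integration of damage over annual exceedance probability.

    points: list of (return_period_years, damage) tuples in ascending
    order of return period.  Exceedance probability = 1 / return period.
    """
    # convert to (exceedance probability, damage), highest probability first
    pd = [(1.0 / t, d) for t, d in points]
    ead = 0.0
    for (p1, d1), (p2, d2) in zip(pd, pd[1:]):
        ead += 0.5 * (d1 + d2) * (p1 - p2)
    return ead

# Hypothetical damage estimates (million EUR) before and after
# non-structural mitigation measures
before = [(10, 2.0), (50, 10.0), (100, 25.0), (500, 60.0)]
after = [(10, 0.5), (50, 4.0), (100, 12.0), (500, 40.0)]
print(expected_annual_damage(before) - expected_annual_damage(after))
```

Comparing EAD before and after the measures is one way a "quantitative estimate of flooding risk before and after investing" can be expressed in a single figure for decision makers.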
ERIC Educational Resources Information Center
Walters, Charles David
2017-01-01
Quantitative reasoning (P. W. Thompson, 1990, 1994) is a powerful mathematical tool that enables students to engage in rich problem solving across the curriculum. One way to support students' quantitative reasoning is to develop prospective secondary teachers' (PSTs) mathematical knowledge for teaching (MKT; Ball, Thames, & Phelps, 2008)…
78 FR 68450 - Proposed Data Collections Submitted for Public Comment and Recommendations
Federal Register 2010, 2011, 2012, 2013, 2014
2013-11-14
... hours. Key informant interviews and the quantitative survey will be conducted by telephone. As telephone... qualitative and quantitative data in order to develop and refine the Tool, and assess feasibility and audience... collection will be used to help inform a quantitative stage of work to include a national sample of...
Mansfield, Theodore J; MacDonald Gibson, Jacqueline
2015-01-01
Recently, two quantitative tools have emerged for predicting the health impacts of projects that change population physical activity: the Health Economic Assessment Tool (HEAT) and Dynamic Modeling for Health Impact Assessment (DYNAMO-HIA). HEAT has been used to support health impact assessments of transportation infrastructure projects, but DYNAMO-HIA has not been previously employed for this purpose nor have the two tools been compared. To demonstrate the use of DYNAMO-HIA for supporting health impact assessments of transportation infrastructure projects, we employed the model in three communities (urban, suburban, and rural) in North Carolina. We also compared DYNAMO-HIA and HEAT predictions in the urban community. Using DYNAMO-HIA, we estimated benefit-cost ratios of 20.2 (95% C.I.: 8.7-30.6), 0.6 (0.3-0.9), and 4.7 (2.1-7.1) for the urban, suburban, and rural projects, respectively. For a 40-year time period, the HEAT predictions of deaths avoided by the urban infrastructure project were three times as high as DYNAMO-HIA's predictions due to HEAT's inability to account for changing population health characteristics over time. Quantitative health impact assessment coupled with economic valuation is a powerful tool for integrating health considerations into transportation decision-making. However, to avoid overestimating benefits, such quantitative HIAs should use dynamic, rather than static, approaches. PMID:26504832
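The static-versus-dynamic distinction can be made concrete with a toy model. All parameters below are invented, and the mortality-decline assumption is purely illustrative — this is not HEAT's or DYNAMO-HIA's actual machinery:

```python
def deaths_avoided_static(pop, mortality, risk_reduction, years):
    """Static model: population size and baseline mortality fixed every year."""
    return pop * mortality * risk_reduction * years

def deaths_avoided_dynamic(pop, mortality, risk_reduction, years,
                           mortality_decline=0.01):
    """Dynamic model: baseline mortality improves slightly each year as
    population health characteristics change (toy assumption)."""
    total = 0.0
    m = mortality
    for _ in range(years):
        total += pop * m * risk_reduction
        m *= 1.0 - mortality_decline
    return total

static = deaths_avoided_static(50_000, 0.008, 0.05, 40)
dynamic = deaths_avoided_dynamic(50_000, 0.008, 0.05, 40)
print(static, dynamic)  # the static model always predicts more deaths avoided
```

Even this toy version reproduces the qualitative finding above: over a long horizon, holding baseline health fixed inflates the predicted benefit relative to a model that lets population characteristics evolve.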
Knudsen, Anders Dahl; Bennike, Tue; Kjeldal, Henrik; Birkelund, Svend; Otzen, Daniel Erik; Stensballe, Allan
2014-05-30
We describe Condenser, a freely available, comprehensive open-source tool for merging multidimensional quantitative proteomics data from the Matrix Science Mascot Distiller Quantitation Toolbox into a common format ready for subsequent bioinformatic analysis. A number of different relative quantitation technologies, such as metabolic (15)N and amino acid stable isotope incorporation, label-free and chemical-label quantitation are supported. The program features multiple options for curative filtering of the quantified peptides, allowing the user to choose data quality thresholds appropriate for the current dataset, and ensure the quality of the calculated relative protein abundances. Condenser also features optional global normalization, peptide outlier removal, multiple testing and calculation of t-test statistics for highlighting and evaluating proteins with significantly altered relative protein abundances. Condenser provides an attractive addition to the gold-standard quantitative workflow of Mascot Distiller, allowing easy handling of larger multi-dimensional experiments. Source code, binaries, test data set and documentation are available at http://condenser.googlecode.com/. Copyright © 2014 Elsevier B.V. All rights reserved.
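One curative step that merging tools of this kind apply is global normalization of peptide quantitation ratios. The sketch below shows median normalization of log2 ratios as an illustration only, not Condenser's actual implementation:

```python
import math
from statistics import median

def normalize_log_ratios(peptide_ratios):
    """Global median normalization of peptide heavy/light ratios.

    Shifts log2 ratios so the peptide population centers on 0 (no change),
    correcting for systematic loading differences between channels.
    """
    logs = [math.log2(r) for r in peptide_ratios]
    shift = median(logs)
    return [x - shift for x in logs]

# Hypothetical peptide ratios from one quantitation run
ratios = [0.9, 1.1, 2.0, 1.05, 0.95, 4.2]
normalized = normalize_log_ratios(ratios)
print(normalized)
```

After such normalization, per-protein log ratios can be aggregated and tested (e.g., with a t-test across replicates), which is the role of the multiple-testing and t-statistic features described above.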
NASA Astrophysics Data System (ADS)
Forte, J.; Brilha, J.; Pereira, D.; Nolasco, M.
2012-04-01
Although geodiversity is considered the setting for biodiversity, there is still a huge gap in the social recognition of these two concepts. The concept of geodiversity, less developed, is now making its own way as a robust and fundamental idea concerning the abiotic component of nature. From a conservationist point of view, the lack of broader knowledge concerning the type and spatial variation of geodiversity, as well as its relationship with biodiversity, makes the protection and management of natural or semi-natural areas incomplete. There is a growing need to understand the patterns of geodiversity in different landscapes and to translate this knowledge into practical and effective territorial management. This kind of management can also be an important tool for the development of sustainable tourism, particularly geotourism, which can bring benefits not only to the environment but also to social and economic life. The quantification of geodiversity is an important step in this process, but few researchers are yet investing in the development of a proper methodology. The assessment methodologies published so far focus mainly on the evaluation of geomorphological elements, sometimes complemented with information about lithology, soils, hydrology, morphometric variables, climatic surfaces and geosites. This results in very dissimilar study areas at very different spatial scales, showing the complexity of the task and the need for further research. The current work aims to develop an effective methodology, based on GIS routines, for assessing as many elements of geodiversity as possible (rocks, minerals, fossils, landforms, soils). The main determinant factor for the quantitative assessment is scale, but other factors are also very important, such as the existence of suitable spatial data with a sufficient degree of detail. The expected outcome is a set of procedures for assessing geodiversity at different scales and for producing maps of the geodiversity index, which could be an invaluable contribution to land-use management.
Casian, Tibor; Iurian, Sonia; Bogdan, Catalina; Rus, Lucia; Moldovan, Mirela; Tomuta, Ioan
2017-12-01
This study proposed the development of oral lyophilisates in line with pediatric medicine development guidelines, by applying risk management strategies and DoE as an integrated QbD approach. Product critical quality attributes were reviewed by generating Ishikawa diagrams for risk assessment, considering process-, formulation- and methodology-related parameters. Failure Mode Effect Analysis was applied to highlight critical formulation and process parameters with an increased probability of occurrence and a high impact on product performance. To investigate the effect of qualitative and quantitative formulation variables, D-optimal designs were used for screening and optimization. Process parameters related to suspension preparation and lyophilization were classified as significant factors and were controlled by implementing risk mitigation strategies. Both quantitative and qualitative formulation variables introduced in the experimental design influenced the product's disintegration time, mechanical resistance and dissolution properties, selected as CQAs. The optimum formulation, selected through the Design Space, presented an ultra-fast disintegration time (5 seconds) and a good dissolution rate (above 90%) combined with high mechanical resistance (above 600 g load). Combining FMEA and DoE allowed the science-based development of a product meeting the defined quality target profile by providing better insight into the relevant parameters throughout the development process. The utility of risk management tools in pharmaceutical development was demonstrated.
Identifying persistent and characteristic features in firearm tool marks on cartridge cases
NASA Astrophysics Data System (ADS)
Ott, Daniel; Soons, Johannes; Thompson, Robert; Song, John
2017-12-01
Recent concerns about subjectivity in forensic firearm identification have motivated the development of algorithms to compare firearm tool marks that are imparted on ammunition and to generate quantitative measures of similarity. In this paper, we describe an algorithm that identifies impressed tool marks on a cartridge case that are both consistent between firings and contribute strongly to a surface similarity metric. The result is a representation of the tool mark topography that emphasizes both significant and persistent features across firings. This characteristic surface map is useful for understanding the variability and persistence of the tool marks created by a firearm and can provide improved discrimination between the comparison scores of samples fired from the same firearm and the scores of samples fired from different firearms. The algorithm also provides a convenient method for visualizing areas of similarity that may be useful in providing quantitative support for visual comparisons by trained examiners.
Gosselin, Emilie; Bourgault, Patricia; Lavoie, Stephan; Coleman, Robin-Marie; Méziat-Burdin, Anne
2014-12-01
Pain management in the intensive care unit is often inadequate, and no tool is available to assess nursing pain management practices. The aim of this study was to develop and validate a measuring tool to assess nursing pain management in the intensive care unit during standardized clinical simulation. A literature review was performed to identify relevant components demonstrating optimal pain management in adult intensive care units and to integrate them into an observation tool. This tool was submitted to an expert panel and pretested. It was then used to assess pain management practice during 26 discrete standardized clinical simulation sessions with intensive care nurses. The Nursing Observation Tool for Pain Management (NOTPaM) contains 28 statements grouped into 8 categories, which are in turn grouped into 4 dimensions: subjective assessment, objective assessment, interventions, and reassessment. The tool's internal consistency was calculated at a Cronbach's alpha of 0.436 for the whole tool; the alpha varies from 0.328 to 0.518 across dimensions. To evaluate inter-rater reliability, the intra-class correlation coefficient was used; it was calculated at 0.751 (p < .001) for the whole tool, with variations from 0.619 to 0.920 (p < .01) between dimensions. The expert panel was satisfied with the content and face validity of the tool. The psychometric qualities of the NOTPaM developed in this study are satisfactory, although the tool could be improved with slight modifications. Nevertheless, it was useful in assessing intensive care nurses' pain management in a standardized clinical simulation. The NOTPaM is the first tool created for this purpose. Copyright © 2014 American Society for Pain Management Nursing. Published by Elsevier Inc. All rights reserved.
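The internal-consistency figures quoted above follow the standard Cronbach's alpha formula, which can be reproduced from an observations-by-items score matrix. A minimal sketch (generic formula, not the study's data):

```python
import numpy as np

def cronbach_alpha(scores):
    """Cronbach's alpha for an (observations x items) score matrix.

    alpha = k/(k-1) * (1 - sum(item variances) / variance(total scores)),
    using sample variances (ddof=1).
    """
    scores = np.asarray(scores, dtype=float)
    k = scores.shape[1]                       # number of items
    item_vars = scores.var(axis=0, ddof=1)    # variance of each item
    total_var = scores.sum(axis=1).var(ddof=1)  # variance of summed scores
    return k / (k - 1) * (1 - item_vars.sum() / total_var)
```

Perfectly correlated items yield alpha = 1; a whole-tool alpha of 0.436, as reported here, signals weak internal consistency, consistent with the authors' call for slight modifications.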
Watershed Management Optimization Support Tool v3
The Watershed Management Optimization Support Tool (WMOST) is a decision support tool that facilitates integrated water management at the local or small watershed scale. WMOST models the environmental effects and costs of management decisions in a watershed context that is, accou...
Federal Register 2010, 2011, 2012, 2013, 2014
2012-12-06
... Change To Expand the Availability of Risk Management Tools November 30, 2012. Pursuant to Section 19(b)(1... Exchange's Risk Management Tool (the ``Tool'') to all Exchange Members.\\3\\ The Tool is currently available... Access Risk Management Tool.\\6\\ This optional service acts as a risk filter by causing the orders of...
The typological approach to submarine groundwater discharge (SGD)
Bokuniewicz, H.; Buddemeier, R.; Maxwell, B.; Smith, C.
2003-01-01
Coastal zone managers need to factor submarine groundwater discharge (SGD) into their integrated management. SGD provides a pathway for the transfer of freshwater, and its dissolved chemical burden, from the land to the coastal ocean. SGD reduces salinities and provides nutrients to specialized coastal habitats. It can also be a pollutant source, often undetected, causing eutrophication and triggering nuisance algal blooms. Despite its importance, SGD remains somewhat of a mystery in most places because it is usually unseen and difficult to measure; it has been directly measured at only about a hundred sites worldwide. A typology generated by the Land-Ocean Interaction in the Coastal Zone (LOICZ) Project, a core project of the International Geosphere-Biosphere Programme, is one of the few tools globally available to coastal resource managers for identifying areas in their jurisdiction where SGD may be a confounding process. From the hundreds of globally distributed parameters in the LOICZ typology, a subset of parameters potentially relevant to SGD can be culled. A quantitative combination of the relevant hydrological parameters can serve as a proxy for SGD conditions not directly measured. Web-LOICZ View, a geospatial software package, then provides an automated approach to clustering these data into groups of locations with similar characteristics. It permits selection of the variables, the number of clusters desired, and the clustering criteria, and provides a means of testing predictive results against independent variables. Information on the occurrence of a variety of SGD indicators can then be incorporated into regional clustering analysis. With such tools, coastal managers can focus attention on the most likely sites of SGD in their jurisdiction and design the measurement and modeling programs needed for integrated management.
An empowerment-based approach to developing innovative e-health tools for self-management.
Alpay, Laurence; van der Boog, Paul; Dumaij, Adrie
2011-12-01
E-health is seen as an important technological tool in achieving self-management; however, there is little evidence of how effective e-health is for self-management. Example tools remain experimental and there is limited knowledge yet about the design, use, and effects of this class of tools. By way of introducing a new view on the development of e-health tools dedicated to self-management we aim to contribute to the discussion for further research in this area. Our assumption is that patient empowerment is an important mechanism of e-health self-management and we suggest incorporating it within the development of self-management tools. Important components of empowerment selected from literature are: communication, education and health literacy, information, self-care, decision aids and contact with fellow patients. All components require skills of both patients and the physicians. In this discussion paper we propose how the required skills can be used to specify effective self-management tools.
Governance of Local Disaster Management Committees in line with SOD in Bangladesh
NASA Astrophysics Data System (ADS)
Siddiquee, S. A.
2016-12-01
Due to its geographical location, Bangladesh has always been prone to natural disasters such as tropical cyclones, floods, droughts, tidal surges, tornadoes, river-bank erosion and many more. The study was conducted using both qualitative and quantitative methods, with both open-ended and close-ended questions. Questionnaires, key informant interviews (KIIs) and district consultations were used to collect information from respondents in both government organizations and NGOs. A total of 51 Disaster Management Committees (DMCs) in five districts vulnerable to flood, river-bank erosion, drought and cyclone were taken as a sample to analyze the current situation of the disaster management committees. Surprisingly, the study found that only 38.9% of DMC members are informed about the Disaster Management Act and only 36.76% are aware of their roles and responsibilities under the Standing Orders on Disaster (SOD), even though the selected districts are extremely prone to disasters and the District Disaster Management Committees (DDMCs), Upazila Disaster Management Committees (UzDMCs) and Union Disaster Management Committees (UDMCs) hold regular meetings as per the SOD to mitigate the problems. The committees were found to be the pillars of exchange and coordination among the different departments, enabling them to act collaboratively. Only 43.80% of DMCs have a Risk Reduction Action Plan (RRAP). It was found that 23.3% of DMCs have developed volunteer groups and 26% of DMCs have arranged community awareness-building programs. The study also found that 34% of Union Parishads have incorporated Disaster Risk Reduction (DRR) into their Annual Development Plan (ADP).
It is alarming that even though Bangladesh is one of the prime victims of climate change, encountering severe and frequent disasters like Sidr, Aila and Mahasen, 66% of the sample Union Parishads did not have DRR integrated into their ADPs. The functionality of the DMCs needs to be improved through capacity building, training, and materials such as a guidebook that simplifies the SOD. Empowering DMC members by increasing their understanding of IT and national-level linkages will ultimately lead to an improved governance system.
NASA Astrophysics Data System (ADS)
Reid, Jackie; Wilkes, Janelle
2016-08-01
Mapping quantitative skills across the science, technology, engineering and mathematics (STEM) curricula will help educators identify gaps and duplication in the teaching, practice and assessment of the necessary skills. This paper describes the development and implementation of quantitative skills mapping tools for courses in STEM at a regional university that offers both on-campus and distance modes of study. Key elements of the mapping project included the identification of key graduate quantitative skills; the development of curriculum mapping tools to record in which unit(s), and at what level of attainment, each quantitative skill is taught, practised and assessed; and the identification of differences in the way quantitative skills are developed for on-campus and distance students. Particular attention is given to the differences associated with intensive schools, which consist of concentrated periods of face-to-face learning over a three- to four-day period and are available to distance education students enrolled in STEM units. The detailed quantitative skills mapping process has had an impact on the review of first-year mathematics units, resulted in crucial changes to the curriculum in a number of courses, and contributed to a more integrated approach, and a collective responsibility, to the development of students' quantitative skills for both face-to-face and online modes of learning.
Prioritizing the mHealth Design Space: A Mixed-Methods Analysis of Smokers' Perspectives.
Hartzler, Andrea Lisabeth; BlueSpruce, June; Catz, Sheryl L; McClure, Jennifer B
2016-08-05
Smoking remains the leading cause of preventable disease and death in the United States. Therefore, researchers are constantly exploring new ways to promote smoking cessation. Mobile health (mHealth) technologies could be effective cessation tools. Despite the availability of commercial quit-smoking apps, little research to date has examined smokers' preferred treatment intervention components (ie, design features). Honoring these preferences is important for designing programs that are appealing to smokers and may be more likely to be adopted and used. The aim of this study was to understand smokers' preferred design features of mHealth quit-smoking tools. We used a mixed-methods approach consisting of focus groups and written surveys to understand the design preferences of adult smokers who were interested in quitting smoking (N=40). Focus groups were stratified by age to allow differing perspectives to emerge between older (>40 years) and younger (<40 years) participants. Focus group discussion included a "blue-sky" brainstorming exercise followed by participant reactions to contrasting design options for communicating with smokers, providing social support, and incentivizing program use. Participants rated the importance of preselected design features on an exit survey. Qualitative analyses examined emergent discussion themes and quantitative analyses compared feature ratings to determine which were perceived as most important. Participants preferred a highly personalized and adaptive mHealth experience. Their ideal mHealth quit-smoking tool would allow personalized tracking of their progress, adaptively tailored feedback, and real-time peer support to help manage smoking cravings. Based on qualitative analysis of focus group discussion, participants preferred pull messages (ie, delivered upon request) over push messages (ie, sent automatically) and preferred interaction with other smokers through closed social networks. 
Preferences for entertaining games or other rewarding incentives to encourage program use differed by age group. Based on quantitative analysis of surveys, participants rated the importance of select design features significantly differently (P<.001). Design features rated as most important included personalized content, the ability to track one's progress, and features designed to help manage nicotine withdrawal and medication side effects. Design features rated least important were quit-smoking videos and posting on social media. Communicating with stop-smoking experts was rated more important than communicating with family and friends about quitting (P=.03). Perceived importance of various design features varied by age, experience with technology, and frequency of smoking. Future mHealth cessation aids should be designed with an understanding of smokers' needs and preferences for these tools. Doing so does not guarantee treatment effectiveness, but balancing user preferences with best-practice treatment considerations could enhance program adoption and improve treatment outcomes. Grounded in the perspectives of smokers, we identify several design considerations, which should be prioritized when designing future mHealth cessation tools and which warrant additional empirical validation.
Graeve, Catherine; McGovern, Patricia; Nachreiner, Nancy M; Ayers, Lynn
2014-01-01
Occupational health nurses use their knowledge and skills to improve the health and safety of the working population; however, companies increasingly face budget constraints and may eliminate health and safety programs. Occupational health nurses must therefore be prepared to document their services and outcomes, and to use quantitative tools to demonstrate their value to employers. The aim of this project was to create and pilot test a quantitative tool for occupational health nurses to track their activities and the potential cost savings of on-site occupational health nursing services. Tool development included a pilot test in which semi-structured interviews with occupational health and safety leaders were conducted to identify current issues and products used for estimating the value of occupational health nursing services. The outcome was a tool that estimates the economic value of occupational health nursing services. The feasibility and potential value of this tool are described.
Watershed Management Optimization Support Tool (WMOST) v3: User Guide
The Watershed Management Optimization Support Tool (WMOST) is a decision support tool that facilitates integrated water management at the local or small watershed scale. WMOST models the environmental effects and costs of management decisions in a watershed context that is, accou...
Watershed Management Optimization Support Tool (WMOST) v3: Theoretical Documentation
The Watershed Management Optimization Support Tool (WMOST) is a decision support tool that facilitates integrated water management at the local or small watershed scale. WMOST models the environmental effects and costs of management decisions in a watershed context, accounting fo...
Watershed Management Optimization Support Tool (WMOST) v2: Theoretical Documentation
The Watershed Management Optimization Support Tool (WMOST) is a decision support tool that evaluates the relative cost-effectiveness of management practices at the local or watershed scale. WMOST models the environmental effects and costs of management decisions in a watershed c...
Hunink, J E; Droogers, P; Kauffman, S; Mwaniki, B M; Bouma, J
2012-11-30
Upstream soil and water conservation measures in catchments can have a positive impact both upstream, in terms of less erosion and higher crop yields, and downstream, through less sediment flow into reservoirs and increased groundwater recharge. Green Water Credits (GWC) schemes are being developed to encourage upstream farmers to invest in soil and water conservation practices that will positively affect upstream and downstream water availability. Quantitative information on water and sediment fluxes is crucial as a basis for such financial schemes. A pilot design project in the large and strategically important Upper Tana Basin in Kenya has the objective of developing a methodological framework for this purpose. The essence of the methodology is the integration and use of a collection of public domain tools and datasets: the so-called Green water and Blue water Assessment Toolkit (GBAT). This toolkit was applied to study different options for implementing GWC on rainfed agricultural land in the pilot study. The impacts of vegetative contour strips, mulching, and tied ridges were determined for (i) three upstream key indicators: soil loss, crop transpiration and soil evaporation, and (ii) two downstream indicators: sediment inflow into reservoirs and groundwater recharge. All effects were compared with a baseline scenario of average conditions; thus, not only actual land management was considered but also the potential benefits of changed land use practices. Results of the simulations indicate that applying contour strips or tied ridges in particular significantly reduces soil losses and increases groundwater recharge in the catchment. The model was used to build spatial expressions of the proposed management practices in order to assess their effectiveness. The developed procedure allows exploring the effects of soil conservation measures in a catchment to support the implementation of GWC. Copyright © 2012 Elsevier Ltd. All rights reserved.
Kostova, Zlatina; Caiata-Zufferey, Maria; Schulz, Peter J
2015-01-01
There is strong empirical evidence that the support that chronic patients receive from their environment is fundamental for the way they cope with physical and psychological suffering. Nevertheless, in the case of rheumatoid arthritis (RA), providing the appropriate social support is still a challenge, and such support has often proven to be elusive and unreliable in helping patients to manage the disease. To explore whether and how social support for RA patients can be provided online, and to assess the conditions under which such support is effective. An online support tool was designed to provide patients with both tailored information and opportunities to interact online with health professionals and fellow sufferers. The general purpose was to identify where the support provided did - or did not - help patients, and to judge whether the determinants of success lay more within patients - their engagement and willingness to participate - or within the design of the website itself. The present study reports qualitative interviews with 19 users of the tool. A more specific purpose was to elaborate qualitatively on results from a quantitative survey of users, which indicated that any positive impact was confined to practical matters of pain management rather than extending to more fundamental psychological outcomes such as acceptance. Overall, online learning and interaction can do much to help patients with the everyday stresses of their disease; however, its potential for more durable positive impact depends on various individual characteristics such as personality traits, existing social networks, and the severity and longevity of the disease.
Cranmer, Alexana; Smetzer, Jennifer R; Welch, Linda; Baker, Erin
2017-05-15
Quantifying and managing the potential adverse wildlife impacts of offshore wind energy is critical for developing offshore wind energy in a sustainable and timely manner, but poses a significant challenge, particularly for small marine birds that are difficult to monitor. We developed a discrete-time Markov model of seabird movement around a colony site parameterized by automated radio telemetry data from common terns (Sterna hirundo) and Arctic terns (S. paradisaea), and derived impact functions that estimate the probability of collision fatality as a function of the distance and bearing of wind turbines from a colony. Our purpose was to develop and demonstrate a new, flexible tool that can be used for specific management and wind-energy planning applications when adequate data are available, rather than inform wind-energy development at this site. We demonstrate how the tool can be used 1) in marine spatial planning exercises to quantitatively identify setback distances under development scenarios given a risk threshold, 2) to examine the ecological and technical trade-offs of development alternatives to facilitate negotiation between objectives, and 3) in the U.S. National Environmental Policy Act (NEPA) process to estimate collision fatality under alternative scenarios. We discuss model limitations and data needs, and highlight opportunities for future model extension and development. We present a highly flexible tool for wind energy planning that can be easily extended to other central place foragers and data sources, and can be updated and improved as new monitoring data arises. Copyright © 2017 Elsevier Ltd. All rights reserved.
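The discrete-time Markov approach described above can be sketched as an absorbing Markov chain: movement states are transient, and "collision" and "safe departure" are absorbing. With hypothetical states and transition probabilities (illustrative stand-ins, not the paper's telemetry-fitted values), the eventual collision probability follows from the fundamental matrix:

```python
import numpy as np

# Transient states: 0 = colony, 1 = turbine zone (hypothetical two-zone layout).
# Q[i, j] = P(move transient i -> transient j per time step).
Q = np.array([[0.60, 0.30],
              [0.40, 0.40]])
# Absorbing states: columns are (collision, safe departure).
R = np.array([[0.00, 0.10],
              [0.05, 0.15]])

# Fundamental matrix N = (I - Q)^-1 gives expected visits to transient states;
# B[i, j] = P(eventually absorbed in state j | start in transient state i).
N = np.linalg.inv(np.eye(2) - Q)
B = N @ R
p_collision_from_colony = B[0, 0]  # 0.125 for these illustrative numbers
```

In a planning application, the transition probabilities would be parameterized from the telemetry data, and the collision column would be a function of turbine distance and bearing from the colony, which is what yields the impact functions the abstract describes.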
NASA Technical Reports Server (NTRS)
Lee, Katharine
2004-01-01
The Surface Management System (SMS) is a decision support tool that will help controllers, traffic managers, and NAS users manage the movements of aircraft on the surface of busy airports, improving capacity, efficiency, and flexibility. The Advanced Air Transportation Technologies (AATT) Project at NASA is developing SMS in cooperation with the FAA's Free Flight Phase 2 (FFP2) program. SMS consists of three parts: a traffic management tool, a controller tool, and a National Airspace System (NAS) information tool.
ERIC Educational Resources Information Center
Popovich, Karen
2012-01-01
This paper describes the process taken to develop a quantitative-based and Excel™-driven course that combines "BOTH" Management Information Systems (MIS) and Decision Science (DS) modeling outcomes and lays the foundation for upper level quantitative courses such as operations management, finance and strategic management. In addition,…
ERIC Educational Resources Information Center
Yarime, Masaru; Tanaka, Yuko
2012-01-01
Assessment tools influence incentives to higher education institutions by encouraging them to move towards sustainability. A review of 16 sustainability assessment tools was conducted to examine the recent trends in the issues and methodologies addressed in assessment tools quantitatively and qualitatively. The characteristics of the current…
Managing Epilepsy Well: Emerging e-Tools for epilepsy self-management.
Shegog, Ross; Bamps, Yvan A; Patel, Archna; Kakacek, Jody; Escoffery, Cam; Johnson, Erica K; Ilozumba, Ukwuoma O
2013-10-01
The Managing Epilepsy Well (MEW) Network was established in 2007 by the Centers for Disease Control and Prevention Epilepsy Program to expand epilepsy self-management research. The network has employed collaborative research strategies to develop, test, and disseminate evidence-based, community-based, and e-Health interventions (e-Tools) for epilepsy self-management for people with epilepsy, caregivers, and health-care providers. Since its inception, MEW Network collaborators have conducted formative studies (n=7) investigating the potential of e-Health to support epilepsy self-management and intervention studies evaluating e-Tools (n=5). The MEW e-Tools (the MEW website, WebEase, UPLIFT, MINDSET, and PEARLS online training) and affiliated e-Tools (Texting 4 Control) are designed to complement self-management practices in each phase of the epilepsy care continuum. These tools exemplify a concerted research agenda, shared methodological principles and models for epilepsy self-management, and a communal knowledge base for implementing e-Health to improve quality of life for people with epilepsy. © 2013.
MilQuant: a free, generic software tool for isobaric tagging-based quantitation.
Zou, Xiao; Zhao, Minzhi; Shen, Hongyan; Zhao, Xuyang; Tong, Yuanpeng; Wang, Qingsong; Wei, Shicheng; Ji, Jianguo
2012-09-18
Isobaric tagging techniques such as iTRAQ and TMT are widely used in quantitative proteomics and are especially useful for samples that demand in vitro labeling. Owing to the diversity of MS acquisition approaches, identification algorithms, and relative abundance deduction strategies, researchers face a plethora of possibilities when it comes to data analysis. However, the lack of a generic and flexible software tool often makes it cumbersome for researchers to perform the analysis entirely as desired. In this paper, we present MilQuant, an mzXML-based isobaric labeling quantitator: a pipeline of freely available programs that supports native acquisition files produced by all mass spectrometer types and collection approaches currently used in isobaric tagging based MS data collection. Moreover, aside from effective normalization and abundance ratio deduction algorithms, MilQuant exports various intermediate results along each step of the pipeline, making it easy for researchers to customize the analysis. The functionality of MilQuant was demonstrated on four distinct datasets from different laboratories. The compatibility and extendibility of MilQuant make it a generic and flexible tool that can serve as a full solution to data analysis of isobaric tagging-based quantitation. Copyright © 2012 Elsevier B.V. All rights reserved.
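To illustrate the normalization and abundance-ratio deduction steps such a pipeline performs, the sketch below applies a generic median-normalization scheme to reporter-ion channels and takes per-peptide ratios against a reference channel. This is a common, simple approach, not necessarily MilQuant's exact algorithm.

```python
import numpy as np

def normalized_ratios(reporter_intensities):
    """Median-normalize reporter-ion channels, then ratio against channel 0.

    reporter_intensities: (peptides x channels) array of reporter-ion
    intensities extracted from MS/MS spectra (hypothetical input format).
    Returns (peptides x channels-1) abundance ratios vs. the reference.
    """
    X = np.asarray(reporter_intensities, dtype=float)
    X = X / np.median(X, axis=0)    # equalize channel medians (loading correction)
    return X[:, 1:] / X[:, [0]]     # per-peptide ratios vs. reference channel
```

Protein-level ratios would then typically be summarized (e.g. by the median over a protein's peptides), which is where exporting intermediate results, as MilQuant does, lets users substitute their own summarization step.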
Understanding Pre-Quantitative Risk in Projects
NASA Technical Reports Server (NTRS)
Cooper, Lynne P.
2011-01-01
Standard approaches to risk management in projects depend on the ability of teams to identify risks and quantify the probabilities and consequences of these risks (e.g., the 5 x 5 risk matrix). However, long before quantification does - or even can - occur, and long after, teams make decisions based on their pre-quantitative understanding of risk. These decisions can have long-lasting impacts on the project. While significant research has looked at the process of how to quantify risk, our understanding of how teams conceive of and manage pre-quantitative risk is lacking. This paper introduces the concept of pre-quantitative risk and discusses the implications of addressing pre-quantitative risk in projects.
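The 5 x 5 risk matrix mentioned above scores each risk as the product of a probability rating and a consequence rating, each on a 1-5 scale. A minimal sketch, with illustrative band thresholds (conventions vary by organization):

```python
def risk_score(probability, consequence):
    """Score a risk on a standard 5x5 matrix (both ratings on a 1-5 scale)."""
    assert 1 <= probability <= 5 and 1 <= consequence <= 5
    return probability * consequence

def risk_band(score):
    """Map a 5x5 score to a color band; thresholds here are illustrative."""
    if score >= 15:
        return "red"      # high: mitigate actively
    if score >= 6:
        return "yellow"   # medium: watch and plan
    return "green"        # low: accept/monitor
```

The paper's point is precisely that such scoring presupposes ratings a team can defend, whereas pre-quantitative risk decisions are made before any such numbers exist.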
NASA Technical Reports Server (NTRS)
Johnson, Paul W.
2008-01-01
ePORT (electronic Project Online Risk Tool) provides a systematic approach to using an electronic database program to manage program/project risk management processes. This presentation briefly covers standard risk management procedures, then thoroughly covers NASA's risk management tool, ePORT. ePORT is a web-based risk management program that provides a common framework to capture and manage risks, independent of a program's/project's size and budget. Covering the full risk management paradigm and providing standardized evaluation criteria for common management reporting, ePORT improves Product Line, Center and Corporate Management insight, simplifies program/project manager reporting, and maintains an archive of data for historical reference.
Tool independence for the Web Accessibility Quantitative Metric.
Vigo, Markel; Brajnik, Giorgio; Arrue, Myriam; Abascal, Julio
2009-07-01
The Web Accessibility Quantitative Metric (WAQM) aims at accurately measuring the accessibility of web pages. One of its main features is that it is evaluation-tool independent in ranking and accessibility monitoring scenarios. This article proposes a method to attain evaluation-tool independence for all foreseeable scenarios. After demonstrating that homepages have a more similar error profile than any other page in a given web site, 15 homepages were measured with 10,000 different values of the WAQM parameters using EvalAccess and LIFT, two automatic accessibility evaluation tools. A similar procedure was followed with random pages and with several test files, obtaining several parameter tuples that minimise the difference between the two tools. One thousand four hundred forty-nine web pages from 15 web sites were then measured with these tuples, and the values that minimised the difference between the tools were selected. Once the WAQM was tuned, the accessibility of the 15 web sites was measured with two web-site metrics, concluding that even if similar values can be produced, obtaining the same scores is undesirable since evaluation tools behave differently.
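The tuning procedure described, selecting the parameter values that minimise the score gap between two evaluation tools over a page sample, can be sketched as a simple grid search. The scoring functions below are hypothetical stand-ins for tool-specific WAQM scorers such as those built on EvalAccess and LIFT output:

```python
def tune_parameters(score_a, score_b, pages, candidates):
    """Pick the parameter value minimising the mean score gap between tools.

    score_a, score_b: callables (page, params) -> metric score, one per tool.
    pages: sample of pages to evaluate; candidates: parameter values to try.
    """
    def mean_gap(params):
        return sum(abs(score_a(p, params) - score_b(p, params))
                   for p in pages) / len(pages)
    return min(candidates, key=mean_gap)
```

In the study the search ran over 10,000 parameter tuples rather than a single scalar, but the selection criterion, minimising the inter-tool difference on a page sample, is the same.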
A case study of resources management planning with multiple objectives and projects
NASA Astrophysics Data System (ADS)
Peterson, David L.; Silsbee, David G.; Schmoldt, Daniel L.
1994-09-01
Each National Park Service unit in the United States produces a resources management plan (RMP) every four years or less. The plans commit budgets and personnel to specific projects for four years, but they are prepared with little quantitative and analytical rigor and without formal decision-making tools. We have previously described a multiple objective planning process for inventory and monitoring programs (Schmoldt and others 1994). To test the applicability of that process to the more general needs of resources management planning, we conducted an exercise at Olympic National Park (NP) in Washington State, USA. Eight projects were selected as typical of those considered in RMPs, and five members of the Olympic NP staff used the analytic hierarchy process (AHP) to prioritize the eight projects with respect to their implicit management objectives. By altering management priorities for the park, three scenarios were generated. All three contained some similarities in rankings for the eight projects, as well as some differences. Mathematical allocations of money and people differed among these scenarios and differed substantially from what the actual 1990 Olympic NP RMP contains. Combining subjective priority measures with budget dollars and personnel time in an objective function creates a subjective economic metric for comparing different RMPs. By applying this planning procedure, actual expenditures of budget and personnel in Olympic NP can agree more closely with the staff's management objectives for the park.
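The AHP step in this exercise derives priority weights from a pairwise comparison matrix of projects or objectives. A minimal sketch using the common geometric-mean approximation to the principal eigenvector; the comparison values below are hypothetical, not the Olympic NP panel's judgments:

```python
import numpy as np

def ahp_priorities(pairwise):
    """Approximate AHP priority weights via the geometric-mean method.

    pairwise: reciprocal matrix where entry (i, j) says how much more
    important item i is than item j on Saaty's 1-9 scale.
    """
    A = np.asarray(pairwise, dtype=float)
    gm = A.prod(axis=1) ** (1.0 / A.shape[0])  # row geometric means
    return gm / gm.sum()                        # normalize to sum to 1

# Hypothetical 3-project comparison: project 0 judged 3x as important as
# project 1 and 5x as important as project 2, etc.
A = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 2.0],
              [1/5, 1/2, 1.0]])
weights = ahp_priorities(A)
```

The resulting weights, combined with each project's budget and personnel costs in an objective function, give the "subjective economic metric" the abstract uses to compare alternative RMPs.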
Pepin, K.M.; Spackman, E.; Brown, J.D.; Pabilonia, K.L.; Garber, L.P.; Weaver, J.T.; Kennedy, D.A.; Patyk, K.A.; Huyvaert, K.P.; Miller, R.S.; Franklin, A.B.; Pedersen, K.; Bogich, T.L.; Rohani, P.; Shriner, S.A.; Webb, C.T.; Riley, S.
2014-01-01
Wild birds are the primary source of genetic diversity for influenza A viruses that eventually emerge in poultry and humans. Much progress has been made in the descriptive ecology of avian influenza viruses (AIVs), but contributions are less evident from quantitative studies (e.g., those including disease dynamic models). Transmission between host species, individuals and flocks has not been measured with sufficient accuracy to allow robust quantitative evaluation of alternate control protocols. We focused on the United States of America (USA) as a case study for determining the state of our quantitative knowledge of potential AIV emergence processes from wild hosts to poultry. We identified priorities for quantitative research that would build on existing tools for responding to AIV in poultry and concluded that the following knowledge gaps can be addressed with current empirical data: (1) quantification of the spatio-temporal relationships between AIV prevalence in wild hosts and poultry populations, (2) understanding how the structure of different poultry sectors impacts within-flock transmission, (3) determining mechanisms and rates of between-farm spread, and (4) validating current policy-decision tools with data. The modeling studies we recommend will improve our mechanistic understanding of potential AIV transmission patterns in USA poultry, leading to improved measures of accuracy and reduced uncertainty when evaluating alternative control strategies. PMID:24462191
Asthma education: different viewpoints elicited by qualitative and quantitative methods.
Damon, Scott A; Tardif, Richard R
2015-04-01
This project began as a qualitative examination of how asthma education provided by health professionals could be improved. Unexpected qualitative findings regarding the use of Asthma Action Plans and the importance of insurance reimbursement for asthma education prompted further quantitative examination. Qualitative individual interviews were conducted with primary care physicians in private practice who routinely provide initial diagnoses of asthma, and focus groups were conducted with other clinicians in private primary care practices who routinely provide asthma education. Using the DocStyles quantitative tool, two questions regarding Asthma Action Plans and insurance reimbursement were asked of a representative sample of physicians and other clinicians. The utility of Asthma Action Plans was questioned in the 2012 qualitative study. Qualitative findings also raised questions about whether reimbursement is the barrier to clinician-delivered asthma education that it is thought to be. Quantitative findings from 2013 show that the majority of clinicians see Asthma Action Plans as useful. The question of whether reimbursement is a barrier to providing asthma education to patients was not resolved by the quantitative data. The majority of clinicians see Asthma Action Plans as a useful tool for patient education. Clinicians had less clear opinions on whether the lack of defined reimbursement codes acts as a barrier to asthma education. The study also provided useful audience data for the design of new asthma education tools developed by CDC.
ERIC Educational Resources Information Center
Reid, Jackie; Wilkes, Janelle
2016-01-01
Mapping quantitative skills across the science, technology, engineering and mathematics (STEM) curricula will help educators identify gaps and duplication in the teaching, practice and assessment of the necessary skills. This paper describes the development and implementation of quantitative skills mapping tools for courses in STEM at a regional…
TECHNOLOGY ASSESSMENT IN HOSPITALS: LESSONS LEARNED FROM AN EMPIRICAL EXPERIMENT.
Foglia, Emanuela; Lettieri, Emanuele; Ferrario, Lucrezia; Porazzi, Emanuele; Garagiola, Elisabetta; Pagani, Roberta; Bonfanti, Marzia; Lazzarotti, Valentina; Manzini, Raffaella; Masella, Cristina; Croce, Davide
2017-01-01
Hospital-Based Health Technology Assessment (HBHTA) practices to inform decision making at the hospital level have emerged as an urgent priority for policy makers, hospital managers, and professionals. The present study crystallized the results achieved by testing an original framework for HBHTA developed within the Lombardy Region: the IMPlementation of A Quick hospital-based HTA (IMPAQHTA). The study tested (i) the efficiency of the HBHTA framework, (ii) its feasibility, and (iii) the utility and completeness of the tool, considering dimensions and sub-dimensions. The IMPAQHTA framework deployed the regional HTA program, activated in 2008 in Lombardy, at the hospital level. The relevance and feasibility of the framework were tested over a 3-year period through a large-scale empirical experiment involving seventy-four healthcare professionals, organized in different HBHTA teams, who assessed thirty-two different technologies within twenty-two different hospitals. Semi-structured interviews and self-reported questionnaires were used to collect data regarding the relevance and feasibility of the IMPAQHTA framework. The proposed HBHTA framework proved suitable for application at the hospital level in the Italian context, permitting a quick assessment (11 working days) and providing hospital decision makers with relevant and quantitative information. Performance in terms of feasibility, utility, completeness, and ease of use proved satisfactory. The IMPAQHTA was considered a complete and feasible HBHTA framework that is replicable to different technologies in any hospital setting, demonstrating that a hospital can develop a complete HTA if supported by adequate, well-defined tools and quantitative metrics.
Watershed Management Optimization Support Tool (WMOST) v2: User Manual and Case Studies
The Watershed Management Optimization Support Tool (WMOST) is a decision support tool that evaluates the relative cost-effectiveness of management practices at the local or watershed scale. WMOST models the environmental effects and costs of management decisions in a watershed c...
Lynch, Abigail J.; Taylor, William W.; McCright, Aaron M.
2016-01-01
Decision support tools can aid decision making by systematically incorporating information, accounting for uncertainties, and facilitating evaluation between alternatives. Without user buy-in, however, decision support tools can fail to influence decision-making processes. We surveyed fishery researchers, managers, and fishers affiliated with the Lake Whitefish Coregonus clupeaformis fishery in the 1836 Treaty Waters of Lakes Huron, Michigan, and Superior to assess opinions of current and future management needs to identify barriers to, and opportunities for, developing a decision support tool based on Lake Whitefish recruitment projections with climate change. Approximately 64% of 39 respondents were satisfied with current management, and nearly 85% agreed that science was well integrated into management programs. Though decision support tools can facilitate science integration into management, respondents suggest that they face significant implementation barriers, including lack of political will to change management and perceived uncertainty in decision support outputs. Recommendations from this survey can inform development of decision support tools for fishery management in the Great Lakes and other regions.
Carrier, Emily; Reschovsky, James
2009-12-01
Use of care management tools--such as group visits or patient registries--varies widely among primary care physicians whose practices care for patients with four common chronic conditions--asthma, diabetes, congestive heart failure and depression--according to a new national study by the Center for Studying Health System Change (HSC). For example, less than a third of these primary care physicians in 2008 reported their practices use nurse managers to coordinate care, and only four in 10 were in practices using registries to keep track of patients with chronic conditions. Physicians also used care management tools for patients with some chronic conditions but not others. Practice size and setting were strongly related to the likelihood that physicians used care management tools, with solo and smaller group practices least likely to use care management tools. The findings suggest that, along with experimenting with financial incentives for primary care physicians to adopt care management tools, policy makers might consider developing community-level care management resources, such as nurse managers, that could be shared among smaller physician practices.
Modeling the Effect of Polychromatic Light in Quantitative Absorbance Spectroscopy
ERIC Educational Resources Information Center
Smith, Rachel; Cantrell, Kevin
2007-01-01
A laboratory experiment is conducted to give students practical experience with the principles of electronic absorbance spectroscopy. This straightforward approach creates a powerful tool for exploring many aspects of quantitative absorbance spectroscopy.
Thompson, Bryony A.; Greenblatt, Marc S.; Vallee, Maxime P.; Herkert, Johanna C.; Tessereau, Chloe; Young, Erin L.; Adzhubey, Ivan A.; Li, Biao; Bell, Russell; Feng, Bingjian; Mooney, Sean D.; Radivojac, Predrag; Sunyaev, Shamil R.; Frebourg, Thierry; Hofstra, Robert M.W.; Sijmons, Rolf H.; Boucher, Ken; Thomas, Alun; Goldgar, David E.; Spurdle, Amanda B.; Tavtigian, Sean V.
2015-01-01
Classification of rare missense substitutions observed during genetic testing for patient management is a considerable problem in clinical genetics. The Bayesian integrated evaluation of unclassified variants is a solution originally developed for BRCA1/2. Here, we take a step toward an analogous system for the mismatch repair (MMR) genes (MLH1, MSH2, MSH6, and PMS2) that confer colon cancer susceptibility in Lynch syndrome by calibrating in silico tools to estimate prior probabilities of pathogenicity for MMR gene missense substitutions. A qualitative five-class classification system was developed and applied to 143 MMR missense variants. This identified 74 missense substitutions suitable for calibration. These substitutions were scored using six different in silico tools (Align-GVGD [Align-Grantham Variation Grantham Deviation], multivariate analysis of protein polymorphisms [MAPP], MutPred, PolyPhen-2.1, Sorting Intolerant From Tolerant [SIFT], and Xvar), using curated MMR multiple sequence alignments where possible. The output from each tool was calibrated by regression against the classifications of the 74 missense substitutions; these calibrated outputs are interpretable as prior probabilities of pathogenicity. MAPP was the most accurate tool, and MAPP + PolyPhen-2.1 provided the best combined model (R2 = 0.62 and area under receiver operating characteristic = 0.93). The MAPP + PolyPhen-2.1 output is sufficiently predictive to feed as a continuous variable into the quantitative Bayesian integrated evaluation for clinical classification of MMR gene missense substitutions. PMID:22949387
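The calibration step this abstract describes, regressing a tool's continuous output against the classifications of already-classified variants so that new outputs can be read as prior probabilities, can be sketched roughly as follows. This is a minimal single-predictor least-squares sketch with made-up scores, not the paper's actual regression or data; the clamping to [0, 1] is our assumption for keeping the output interpretable as a probability.

```python
def fit_linear(scores, probs):
    """Ordinary least-squares fit of classification-derived probabilities
    against raw in silico tool scores (one predictor)."""
    n = len(scores)
    mx = sum(scores) / n
    my = sum(probs) / n
    sxx = sum((x - mx) ** 2 for x in scores)
    sxy = sum((x - mx) * (y - my) for x, y in zip(scores, probs))
    slope = sxy / sxx
    return slope, my - slope * mx

def calibrated_prior(score, slope, intercept):
    """Map a new variant's score to a prior probability of pathogenicity,
    clamped to [0, 1] so it stays interpretable as a probability."""
    return min(1.0, max(0.0, intercept + slope * score))

# Illustrative only: hypothetical tool scores for variants already
# classified as benign (prior ~0.0) or pathogenic (prior ~1.0).
scores = [0.1, 0.2, 0.7, 0.9]
probs = [0.0, 0.0, 1.0, 1.0]
slope, intercept = fit_linear(scores, probs)
```

In the study itself the calibrated outputs of several tools were compared and combined; the same fitted mapping would simply be applied to each tool's score for an unclassified variant.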
NASA Technical Reports Server (NTRS)
Lo, Yunnhon; Johnson, Stephen B.; Breckenridge, Jonathan T.
2014-01-01
This paper describes the quantitative application of the theory of System Health Management and its operational subset, Fault Management, to the selection of abort triggers for a human-rated launch vehicle, the United States' National Aeronautics and Space Administration's (NASA) Space Launch System (SLS). The results demonstrate the efficacy of the theory to assess the effectiveness of candidate failure detection and response mechanisms to protect humans from time-critical and severe hazards. The quantitative method was successfully used on the SLS to aid selection of its suite of abort triggers.
Social workers and workplace bullying: perceptions, responses and implications.
Whitaker, Tracy
2012-01-01
This non-experimental, cross-sectional study examined social workers' perceptions of bullying work relationships and their ability to construct effective coping responses to perceived workplace bullying. Quantitative data were gathered through the use of a mailed questionnaire, and qualitative data resulted from semi-structured individual interviews. The quantitative sample consisted of 111 social workers from the metropolitan, Washington, DC area, who were employed in organizations. Two self-identified targets of bullying participated in the interviews. Nearly three of five social workers (58%) in the sample reported being the targets of demeaning, rude, and hostile workplace interactions more than once in the previous year. Targets were more likely to work in government agencies/military and mental health outpatient organizations (19% and 18% respectively). More than a third of targets (35%) held a direct service role (clinical/direct practice), whereas almost a third (29%) identified their role as administration or management. The findings from this study suggest that workplace bullying may be a problem for social workers and that the social work profession may need to develop tools and guidelines to help practitioners identify, confront and extinguish these behaviors.
Clinical Utility of Quantitative Imaging
Rosenkrantz, Andrew B; Mendiratta-Lala, Mishal; Bartholmai, Brian J.; Ganeshan, Dhakshinamoorthy; Abramson, Richard G.; Burton, Kirsteen R.; Yu, John-Paul J.; Scalzetti, Ernest M.; Yankeelov, Thomas E.; Subramaniam, Rathan M.; Lenchik, Leon
2014-01-01
Quantitative imaging (QI) is increasingly applied in modern radiology practice, assisting in the clinical assessment of many patients and providing a source of biomarkers for a spectrum of diseases. QI is commonly used to inform patient diagnosis or prognosis, determine the choice of therapy, or monitor therapy response. Because most radiologists will likely implement some QI tools to meet the patient care needs of their referring clinicians, it is important for all radiologists to become familiar with the strengths and limitations of QI. The Association of University Radiologists Radiology Research Alliance Quantitative Imaging Task Force has explored the clinical application of QI and summarizes its work in this review. We provide an overview of the clinical use of QI by discussing QI tools that are currently employed in clinical practice, clinical applications of these tools, approaches to reporting of QI, and challenges to implementing QI. It is hoped that these insights will help radiologists recognize the tangible benefits of QI to their patients, their referring clinicians, and their own radiology practice. PMID:25442800
Bassett, Danielle S.; Mattar, Marcelo G.
2017-01-01
Humans adapt their behavior to their external environment in a process often facilitated by learning. Efforts to describe learning empirically can be complemented by quantitative theories that map changes in neurophysiology to changes in behavior. In this review we highlight recent advances in network science that offer a set of tools and a general perspective that may be particularly useful in understanding types of learning that are supported by distributed neural circuits. We describe recent applications of these tools to neuroimaging data that provide unique insights into adaptive neural processes, the attainment of knowledge, and the acquisition of new skills, forming a network neuroscience of human learning. While promising, the tools have yet to be linked to the well-formulated models of behavior that are commonly utilized in cognitive psychology. We argue that continued progress will require the explicit marriage of network approaches to neuroimaging data and quantitative models of behavior. PMID:28259554
Analyzing the texture changes in the quantitative phase maps of adipocytes
NASA Astrophysics Data System (ADS)
Roitshtain, Darina; Sharabani-Yosef, Orna; Gefen, Amit; Shaked, Natan T.
2016-03-01
We present a new analysis tool for studying texture changes in the quantitative phase maps of live cells acquired by wide-field interferometry. The sensitivity of wide-field interferometry systems to small changes in refractive index enables visualizing cells and inner cell organelles without fluorescent dyes or other cell-invasive approaches, which may affect the measurement and require external labeling. Our label-free texture-analysis tool is based directly on the optical path delay profile of the sample and does not necessitate decoupling refractive index and thickness in the cell quantitative phase profile; thus, relevant parameters can be calculated from a single-frame acquisition. Our experimental system includes a low-coherence wide-field interferometer combined with a simultaneous fluorescence microscopy system for validation. We used this system and analysis tool to study lipid droplet formation in adipocytes. The latter demonstration is relevant to various cellular functions, ranging from lipid metabolism and protein storage and degradation to viral replication. These processes are functionally linked to several physiological and pathological conditions, including obesity and metabolic diseases. Quantifying these biological phenomena from texture changes in the cell phase map has potential as a new cellular diagnostic tool.
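As a rough illustration of the kind of first-order texture quantification such a tool might compute over an optical-path-delay map: the abstract does not specify the actual feature set, so the statistics, bin count, and function name below are our assumptions, not the authors' method.

```python
import math

def texture_stats(phase_map, bins=8):
    """First-order texture statistics of a 2D optical-path-delay map,
    given as a list of rows of float values."""
    vals = [v for row in phase_map for v in row]
    n = len(vals)
    mean = sum(vals) / n
    variance = sum((v - mean) ** 2 for v in vals) / n
    # Histogram-based Shannon entropy (bits) as a simple texture measure.
    lo, hi = min(vals), max(vals)
    width = (hi - lo) / bins or 1.0  # avoid zero width for flat maps
    hist = [0] * bins
    for v in vals:
        hist[min(bins - 1, int((v - lo) / width))] += 1
    entropy = -sum((c / n) * math.log2(c / n) for c in hist if c)
    return {"mean": mean, "variance": variance, "entropy": entropy}
```

A perfectly flat map yields zero variance and zero entropy; increasing droplet formation would be expected to raise both, which is the sense in which texture tracks the biology.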
Schmid, Volker J; Cremer, Marion; Cremer, Thomas
2017-07-01
Recent advancements of super-resolved fluorescence microscopy have revolutionized microscopic studies of cells, including the exceedingly complex structural organization of cell nuclei in space and time. In this paper we describe and discuss tools for (semi-) automated, quantitative 3D analyses of the spatial nuclear organization. These tools allow the quantitative assessment of highly resolved different chromatin compaction levels in individual cell nuclei, which reflect functionally different regions or sub-compartments of the 3D nuclear landscape, and measurements of absolute distances between sites of different chromatin compaction. In addition, these tools allow 3D mapping of specific DNA/RNA sequences and nuclear proteins relative to the 3D chromatin compaction maps and comparisons of multiple cell nuclei. The tools are available in the free and open source R packages nucim and bioimagetools. We discuss the use of masks for the segmentation of nuclei and the use of DNA stains, such as DAPI, as a proxy for local differences in chromatin compaction. We further discuss the limitations of 3D maps of the nuclear landscape as well as problems of the biological interpretation of such data. Copyright © 2017 Elsevier Inc. All rights reserved.
Management Matters: The Library Media Specialist's Management Toolbox
ERIC Educational Resources Information Center
Pappas, Marjorie L.
2004-01-01
Library media specialists need tools to help them manage the school library media program. The Internet includes a vast array of tools that a library media specialist might find useful. The websites and electronic resources included in this article are only a representative sample and future columns may explore additional tools. All the tools are…
Vehicle Thermal Management Models and Tools
The National Renewable Energy Laboratory's (NREL's) vehicle thermal management modeling tools allow researchers to assess the trade-offs and calculate the potential benefits of thermal design options.
EMERGY METHODS: VALUABLE INTEGRATED ASSESSMENT TOOLS
NHEERL's Atlantic Ecology Division is investigating emergy methods as tools for integrated assessment in several projects evaluating environmental impacts, policies, and alternatives for remediation and intervention. Emergy accounting is a methodology that provides a quantitative...
Adoption of online health management tools among healthy older adults: An exploratory study.
Zettel-Watson, Laura; Tsukerman, Dmitry
2016-06-01
As the population ages and chronic diseases abound, overburdened healthcare systems will increasingly require individuals to manage their own health. Online health management tools, quickly increasing in popularity, have the potential to diminish or even replace in-person contact with health professionals, but overall efficacy and usage trends are unknown. The current study explored perceptions and usage patterns among users of online health management tools, and identified barriers and barrier-breakers among non-users. An online survey was completed by 169 computer users (aged 50+). Analyses revealed that a sizable minority (37%) of participants use online health management tools and most users (89%) are satisfied with these tools, but a limited range of tools are being used and usage occurs in relatively limited domains. Improved awareness and education for online health management tools could enhance people's abilities to remain at home as they age, reducing the financial burden on formal assistance programs. © The Author(s) 2014.
Teaching quantitative biology: goals, assessments, and resources
Aikens, Melissa L.; Dolan, Erin L.
2014-01-01
More than a decade has passed since the publication of BIO2010, calling for an increased emphasis on quantitative skills in the undergraduate biology curriculum. In that time, relatively few papers have been published that describe educational innovations in quantitative biology or provide evidence of their effects on students. Using a “backward design” framework, we lay out quantitative skill and attitude goals, assessment strategies, and teaching resources to help biologists teach more quantitatively. Collaborations between quantitative biologists and education researchers are necessary to develop a broader and more appropriate suite of assessment tools, and to provide much-needed evidence on how particular teaching strategies affect biology students' quantitative skill development and attitudes toward quantitative work. PMID:25368425
Nepolean, Thirunavukkarsau; Kaul, Jyoti; Mukri, Ganapati; Mittal, Shikha
2018-01-01
Breeding science has contributed immensely to global food security. Several varieties and hybrids of different food crops, including maize, have been released through conventional breeding. The ever-growing population, decreasing agricultural land, lowering water table, changing climate, and other variables pose a tremendous challenge to researchers seeking to improve the production and productivity of food crops. Drought is one of the major obstacles to sustaining and improving the productivity of food crops, including maize, in tropical and subtropical production systems. With the advent of novel genomics and breeding tools, the way breeding is done has changed tremendously in the last two decades. Drought tolerance is a combination of several component traits with a quantitative mode of inheritance. Rapid DNA and RNA sequencing tools and high-throughput SNP genotyping techniques, trait mapping, functional characterization, genomic selection, rapid generation advancement, and other tools are now available to understand the genetics of drought tolerance and to accelerate the breeding cycle. Informatics plays a complementary role by managing the big data generated from large-scale genomics and breeding experiments. Genome editing is the latest technique for altering specific genes to improve trait expression. Integration of novel genomics, next-generation breeding, and informatics tools will accelerate the stress-breeding process and increase the genetic gain under different production systems. PMID:29696027
Recent development in software and automation tools for high-throughput discovery bioanalysis.
Shou, Wilson Z; Zhang, Jun
2012-05-01
Bioanalysis with LC-MS/MS has been established as the method of choice for quantitative determination of drug candidates in biological matrices in drug discovery and development. The LC-MS/MS bioanalytical support for drug discovery, especially for early discovery, often requires high-throughput (HT) analysis of large numbers of samples (hundreds to thousands per day) generated from many structurally diverse compounds (tens to hundreds per day) with a very quick turnaround time, in order to provide important activity and liability data to move discovery projects forward. Another important consideration for discovery bioanalysis is its fit-for-purpose quality requirement depending on the particular experiments being conducted at this stage, and it is usually not as stringent as those required in bioanalysis supporting drug development. These aforementioned attributes of HT discovery bioanalysis made it an ideal candidate for using software and automation tools to eliminate manual steps, remove bottlenecks, improve efficiency and reduce turnaround time while maintaining adequate quality. In this article we will review various recent developments that facilitate automation of individual bioanalytical procedures, such as sample preparation, MS/MS method development, sample analysis and data review, as well as fully integrated software tools that manage the entire bioanalytical workflow in HT discovery bioanalysis. In addition, software tools supporting the emerging high-resolution accurate MS bioanalytical approach are also discussed.
Yu, Kun-Hsing; Fitzpatrick, Michael R; Pappas, Luke; Chan, Warren; Kung, Jessica; Snyder, Michael
2017-09-12
Precision oncology is an approach that accounts for individual differences to guide cancer management. Omics signatures have been shown to predict clinical traits for cancer patients. However, the vast amount of omics information poses an informatics challenge in systematically identifying patterns associated with health outcomes, and no general-purpose data-mining tool exists for physicians, medical researchers, and citizen scientists without significant training in programming and bioinformatics. To bridge this gap, we built the Omics AnalySIs System for PRecision Oncology (OASISPRO), a web-based system to mine the quantitative omics information from The Cancer Genome Atlas (TCGA). This system effectively visualizes patients' clinical profiles, executes machine-learning algorithms of choice on the omics data, and evaluates the prediction performance using held-out test sets. With this tool, we successfully identified genes strongly associated with tumor stage, and accurately predicted patients' survival outcomes in many cancer types, including mesothelioma and adrenocortical carcinoma. By identifying the links between omics and clinical phenotypes, this system will facilitate omics studies on precision cancer medicine and contribute to establishing personalized cancer treatment plans. This web-based tool is available at http://tinyurl.com/oasispro; source code is available at http://tinyurl.com/oasisproSourceCode. © The Author (2017). Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com
IT infrastructure in the era of imaging 3.0.
McGinty, Geraldine B; Allen, Bibb; Geis, J Raymond; Wald, Christoph
2014-12-01
Imaging 3.0 is a blueprint for the future of radiology modeled after the description of Web 3.0 as "more connected, more open, and more intelligent." Imaging 3.0 involves radiologists' using their expertise to manage all aspects of imaging care to improve patient safety and outcomes and to deliver high-value care. IT tools are critical elements and drivers of success as radiologists embrace the concepts of Imaging 3.0. Organized radiology, specifically the ACR, is the natural convener and resource for the development of this Imaging 3.0 toolkit. The ACR's new Imaging 3.0 Informatics Committee is actively working to develop the informatics tools radiologists need to improve efficiency, deliver more value, and provide quantitative ways to demonstrate their value in new health care delivery and payment systems. This article takes each step of the process of delivering high-value Imaging 3.0 care and outlines the tools available as well as additional resources available to support practicing radiologists. From the moment when imaging is considered through the delivery of a meaningful and actionable report that is communicated to the referring clinician and, when appropriate, to the patient, Imaging 3.0 IT tools will enable radiologists to position themselves as vital constituents in cost-effective, high-value health care. Copyright © 2014 American College of Radiology. Published by Elsevier Inc. All rights reserved.
Low Latency DESDynI Data Products for Disaster Response, Resource Management and Other Applications
NASA Technical Reports Server (NTRS)
Doubleday, Joshua R.; Chien, Steve A.; Lou, Yunling
2011-01-01
We are developing onboard processor technology targeted at the L-band SAR instrument on the planned DESDynI mission to enable formation of SAR images onboard, opening possibilities for near-real-time data products to augment full data streams. Several image processing and/or interpretation techniques are being explored as possible direct-broadcast products for agencies in need of low-latency data that are responsible for disaster mitigation and assessment, resource management, agricultural development, shipping, etc. Data collected through UAVSAR (L-band) serve as a surrogate for the future DESDynI instrument. We have explored surface water extent as a tool for flood response, and disturbance images of polarimetric backscatter from repeat-pass imagery, potentially useful for structural collapse (earthquake), mud/land/debris slides, etc. We have also explored building vegetation and snow/ice classifiers via support vector machines utilizing quad-pol backscatter, cross-pol phase, and a number of derivatives (radar vegetation index, dielectric estimates, etc.). We share our qualitative and quantitative results thus far.
Hybrid Modeling for Testing Intelligent Software for Lunar-Mars Closed Life Support
NASA Technical Reports Server (NTRS)
Malin, Jane T.; Nicholson, Leonard S. (Technical Monitor)
1999-01-01
Intelligent software is being developed for closed life support systems with biological components, for human exploration of the Moon and Mars. The intelligent software functions include planning/scheduling, reactive discrete control and sequencing, management of continuous control, and fault detection, diagnosis, and management of failures and errors. Four types of modeling information have been essential to system modeling and simulation to develop and test the software and to provide operational model-based what-if analyses: discrete component operational and failure modes; continuous dynamic performance within component modes, modeled qualitatively or quantitatively; configuration of flows and power among components in the system; and operations activities and scenarios. CONFIG, a multi-purpose discrete event simulation tool that integrates all four types of models for use throughout the engineering and operations life cycle, has been used to model components and systems involved in the production and transfer of oxygen and carbon dioxide in a plant-growth chamber and between that chamber and a habitation chamber with physicochemical systems for gas processing.
Krummen, David E; Patel, Mitul; Nguyen, Hong; Ho, Gordon; Kazi, Dhruv S; Clopton, Paul; Holland, Marian C; Greenberg, Scott L; Feld, Gregory K; Faddis, Mitchell N; Narayan, Sanjiv M
2010-11-01
Optimal atrial tachyarrhythmia management is facilitated by accurate electrocardiogram interpretation, yet typical atrial flutter (AFl) may present without sawtooth F-waves or RR regularity, and atrial fibrillation (AF) may be difficult to separate from atypical AFl or rapid focal atrial tachycardia (AT). We analyzed whether improved diagnostic accuracy using a validated analysis tool significantly impacts costs and patient care. We performed a prospective, blinded, multicenter study using a novel quantitative computerized algorithm to identify atrial tachyarrhythmia mechanism from the surface ECG in patients referred for electrophysiology study (EPS). In 122 consecutive patients (age 60 ± 12 years) referred for EPS, 91 sustained atrial tachyarrhythmias were studied. ECGs were also interpreted by 9 physicians from 3 specialties for comparison and to allow healthcare system modeling. Diagnostic accuracy was compared to the diagnosis at EPS. A Markov model was used to estimate the impact of improved arrhythmia diagnosis. We found that 13% of typical AFl ECGs had neither sawtooth flutter waves nor RR regularity and were misdiagnosed by the majority of clinicians (0/6 correctly diagnosed by consensus visual interpretation) but diagnosed correctly by quantitative analysis in 83% (5/6, P = 0.03). AF diagnosis was also improved by the algorithm (92%) versus visual interpretation (primary care: 76%, P < 0.01). Economically, we found that these improvements in diagnostic accuracy resulted in an average cost savings of $1,303 and 0.007 quality-adjusted life-years per patient. Typical AFl and AF are frequently misdiagnosed using visual criteria. Quantitative analysis improves diagnostic accuracy and results in improved healthcare costs and patient outcomes. © 2010 Wiley Periodicals, Inc.
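A Markov cohort model of the kind used here for cost and quality-adjusted-life-year estimates advances a cohort distribution through health states cycle by cycle, accumulating discounted costs and utilities. The engine below is a generic sketch; the states, transition probabilities, costs, and utilities are placeholders, not the paper's inputs.

```python
def markov_cohort(trans, cost, utility, cycles, start=0, discount=0.03):
    """Expected discounted cost and QALYs for a cohort over `cycles` steps.

    trans   -- row-stochastic transition matrix between health states
    cost    -- per-cycle cost of occupying each state
    utility -- per-cycle utility (QALY weight) of each state
    """
    n = len(trans)
    dist = [0.0] * n
    dist[start] = 1.0  # whole cohort starts in one state
    total_cost = total_qaly = 0.0
    for t in range(cycles):
        d = 1.0 / (1.0 + discount) ** t  # discount factor for cycle t
        total_cost += d * sum(p * c for p, c in zip(dist, cost))
        total_qaly += d * sum(p * u for p, u in zip(dist, utility))
        # Advance the cohort one cycle: dist' = dist @ trans.
        dist = [sum(dist[i] * trans[i][j] for i in range(n))
                for j in range(n)]
    return total_cost, total_qaly
```

Comparing diagnostic strategies then amounts to running the model twice with accuracy-dependent transition probabilities and differencing the resulting costs and QALYs.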
Attiyeh, Marc A; Chakraborty, Jayasree; Doussot, Alexandre; Langdon-Embry, Liana; Mainarich, Shiana; Gönen, Mithat; Balachandran, Vinod P; D'Angelica, Michael I; DeMatteo, Ronald P; Jarnagin, William R; Kingham, T Peter; Allen, Peter J; Simpson, Amber L; Do, Richard K
2018-04-01
Pancreatic cancer is a highly lethal cancer with no established a priori markers of survival. Existing nomograms rely mainly on post-resection data and are of limited utility in directing surgical management. This study investigated the use of quantitative computed tomography (CT) features to preoperatively assess survival for pancreatic ductal adenocarcinoma (PDAC) patients. A prospectively maintained database identified consecutive chemotherapy-naive patients with CT angiography and resected PDAC between 2009 and 2012. Variation in CT enhancement patterns was extracted from the tumor region using texture analysis, a quantitative image analysis tool previously described in the literature. Two continuous survival models were constructed, with 70% of the data (training set) using Cox regression, first based only on preoperative serum cancer antigen (CA) 19-9 levels and image features (model A), and then on CA19-9, image features, and the Brennan score (composite pathology score; model B). The remaining 30% of the data (test set) were reserved for independent validation. A total of 161 patients were included in the analysis. Training and test sets contained 113 and 48 patients, respectively. Quantitative image features combined with CA19-9 achieved a c-index of 0.69 [integrated Brier score (IBS) 0.224] on the test data, while combining CA19-9, imaging, and the Brennan score achieved a c-index of 0.74 (IBS 0.200) on the test data. We present two continuous survival prediction models for resected PDAC patients. Quantitative analysis of CT texture features is associated with overall survival. Further work includes applying the model to an external dataset to increase the sample size for training and to determine its applicability.
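The c-index reported above measures how well predicted risk orders observed survival. A minimal from-scratch sketch of Harrell's concordance index for right-censored data, on toy values (the study's data are not reproduced here):

```python
def concordance_index(times, events, risk_scores):
    """Harrell's c-index: fraction of comparable pairs in which the higher
    predicted risk corresponds to the earlier observed event.
    times: observed follow-up times; events: 1 = event observed, 0 = censored;
    risk_scores: higher score = predicted worse survival."""
    concordant, comparable = 0.0, 0
    n = len(times)
    for i in range(n):
        for j in range(n):
            # a pair is comparable if subject i had an observed event
            # strictly before subject j's observed/censoring time
            if events[i] == 1 and times[i] < times[j]:
                comparable += 1
                if risk_scores[i] > risk_scores[j]:
                    concordant += 1
                elif risk_scores[i] == risk_scores[j]:
                    concordant += 0.5  # ties count half
    return concordant / comparable

# toy example: risk scores perfectly ordered against event times
times = [2.0, 5.0, 8.0, 11.0]
events = [1, 1, 0, 1]          # third subject censored
scores = [0.9, 0.7, 0.5, 0.3]
ci = concordance_index(times, events, scores)  # 1.0 for a perfect ordering
```

A c-index of 0.5 corresponds to random ordering; the paper's 0.69 and 0.74 sit between chance and perfect concordance.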
Quantitative Evaluation of a Planetary Renderer for Terrain Relative Navigation
NASA Astrophysics Data System (ADS)
Amoroso, E.; Jones, H.; Otten, N.; Wettergreen, D.; Whittaker, W.
2016-11-01
A ray-tracing computer renderer tool is presented based on LOLA and LROC elevation models and is quantitatively compared to LRO WAC and NAC images for photometric accuracy. We investigated using rendered images for terrain relative navigation.
Systematic mechanism-orientated approach to chronic pancreatitis pain
Bouwense, Stefan AW; de Vries, Marjan; Schreuder, Luuk TW; Olesen, Søren S; Frøkjær, Jens B; Drewes, Asbjørn M; van Goor, Harry; Wilder-Smith, Oliver HG
2015-01-01
Pain in chronic pancreatitis (CP) shows similarities with other visceral pain syndromes (i.e., inflammatory bowel disease and esophagitis), which should thus be managed in a similar fashion. Typical causes of CP pain include increased intrapancreatic pressure, pancreatic inflammation and pancreatic/extrapancreatic complications. Unfortunately, CP pain continues to be a major clinical challenge. It is recognized that ongoing pain may induce altered central pain processing, e.g., central sensitization or pro-nociceptive pain modulation. When this is present conventional pain treatment targeting the nociceptive focus, e.g., opioid analgesia or surgical/endoscopic intervention, often fails even if technically successful. If central nervous system pain processing is altered, specific treatment targeting these changes should be instituted (e.g., gabapentinoids, ketamine or tricyclic antidepressants). Suitable tools are now available to make altered central processing visible, including quantitative sensory testing, electroencephalography and (functional) magnetic resonance imaging. These techniques are potentially clinically useful diagnostic tools to analyze central pain processing and thus define optimum management approaches for pain in CP and other visceral pain syndromes. The present review proposes a systematic mechanism-orientated approach to pain management in CP based on a holistic view of the mechanisms involved. Future research should address the circumstances under which central nervous system pain processing changes in CP, and how this is influenced by ongoing nociceptive input and therapies. Thus we hope to predict which patients are at risk for developing chronic pain or not responding to therapy, leading to improved treatment of chronic pain in CP and other visceral pain disorders. PMID:25574079

Narayanan, Pradeep; Moulasha, K; Wheeler, Tisha; Baer, James; Bharadwaj, Sowmyaa; Ramanathan, T V; Thomas, Tom
2012-10-01
In a participatory approach to health and development interventions, defining and measuring community mobilisation is important, but it is challenging to do this effectively, especially at scale. A cross-sectional, participatory monitoring tool was administered in 2008-2009 and 2009-2010 across a representative sample of 25 community-based groups (CBGs) formed under the Avahan India AIDS Initiative, to assess their progress in mobilisation, and to inform efforts to strengthen the groups and make them sustainable. The survey used a weighted index to capture both qualitative and quantitative data in numeric form. The index permitted broad, as well as highly detailed, analysis of community mobilisation, relevant at the level of individual groups, as well as state-wide and across the whole programme. The survey demonstrated that leadership and programme management were the strongest areas among the CBGs, confirming the programme's investment in these areas. Discussion of the Round 1 results led to efforts to strengthen governance and democratic decision making in the groups, and progress was reflected in the Round 2 survey results. CBG engagement with state authorities to gain rights and entitlements and securing the long-term financial stability of groups remain a challenge. The survey has proven useful for informing the managers of programmes about what is happening on the ground, and it has opened spaces for discussion within community groups about the nature of leadership, decision making and their goals, which is leading to accelerated progress. The tool provided useful data to manage community mobilisation in Avahan.
Renault, Ilana Zalcberg; Scholl, Vanesa; Hassan, Rocio; Capelleti, Paola; de Lima, Marcos; Cortes, Jorge
2011-01-01
Tyrosine kinase inhibitors have changed the management and outcomes of chronic myeloid leukemia patients. Quantitative polymerase chain reaction is used to monitor molecular responses to tyrosine kinase inhibitors. Molecular monitoring represents the most sensitive tool to judge chronic myeloid leukemia disease course and allows early detection of relapse. Evidence of achieving molecular response is important for several reasons: 1. early molecular response is associated with major molecular response rates at 18-24 months; 2. patients achieving major molecular response are less likely to lose their complete cytogenetic response; 3. a durable, stable major molecular response is associated with increased progression-free survival. However, standardization of molecular techniques is still challenging. PMID:23049363
NASA Astrophysics Data System (ADS)
Aditya, B. R.; Permadi, A.
2018-03-01
This paper describes the implementation of the Unified Theory of Acceptance and Use of Technology (UTAUT) model to assess the use of a virtual classroom in support of teaching and learning in higher education. The purpose of this research is to determine whether a virtual classroom that fulfils the basic design principles can be accepted and used positively by students. The methodology uses a quantitative, descriptive approach, with a questionnaire as the instrument for measuring the level of acceptance of the virtual classroom. The research uses a sample of 105 students in the D3 Informatics Management program at Telkom University. The result is that students in higher education perceive the use of the virtual classroom positively and find it relevant.
Reviewing the economic efficiency of disaster risk management
NASA Astrophysics Data System (ADS)
Mechler, Reinhard
2013-04-01
There is a lot of rhetoric suggesting that disaster risk management (DRM) pays, yet surprisingly little in the way of hard facts. Cost-benefit analysis (CBA) is one major tool that can provide quantitative information about the prioritization of DRM (and climate adaptation) based on economic principles. Yet, on a global scale, there has been surprisingly little robust evidence on the economic efficiency and benefits of risk management measures. This review shows that, for the limited evidence reported, the economic case for DRM across a range of hazards is strong: the benefits of investing in DRM outweigh the costs by a factor of about four, on average, in terms of avoided and reduced losses. Most studies using a CBA approach focus on structural DRM, and most information has been made available on physical flood prevention. There have been some limited studies on preparedness and risk financing. The global evidence base is limited and the estimates do not appear very robust; overall, in line with the conclusion of the recent IPCC SREX report, there is limited evidence and medium agreement across the literature. Some of the factors behind the limited robustness are inherent to CBA more widely: these challenges comprise the inability to price intangibles, the difficulty of evaluating strategies rather than single projects, difficulties in assessing softer rather than infrastructure-related options, choices regarding a proper discount rate, lack of accounting for the distribution of benefits and costs, and difficulties with assessing nonmarket values such as those related to health, the environment, or public goods. Although techniques exist to address some of these challenges, they are unlikely to go away easily. Other challenges associated specifically with DRM, such as the need for, and difficulty of, undertaking risk-based analysis, can be overcome, and manuals and reports provide a way forward.
In an age of austerity, cost-benefit analysis continues to be an important tool for prioritising efficient DRM measures, yet with a shifting emphasis from infrastructure-based options (hard resilience) to preparedness and systemic interventions (soft resilience), other tools such as cost-effectiveness analysis, multi-criteria analysis and robust decision-making approaches deserve more attention.
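As a sketch of the discounted benefit-cost arithmetic behind statements such as "benefits outweigh costs by about four times": the benefit-cost ratio is the present value of avoided and reduced losses divided by the present value of DRM costs. The cashflows and 5% discount rate below are illustrative assumptions, not figures from the review.

```python
def npv(cashflows, rate):
    """Net present value of a list of annual cashflows (year 0 first)."""
    return sum(cf / (1.0 + rate) ** t for t, cf in enumerate(cashflows))

def benefit_cost_ratio(avoided_losses, costs, rate):
    """Discounted BCR: PV of avoided/reduced losses over PV of DRM costs."""
    return npv(avoided_losses, rate) / npv(costs, rate)

# illustrative flood-defence example: one upfront cost, then a stream of
# expected avoided losses over four years, discounted at 5%
costs = [100.0, 0.0, 0.0, 0.0, 0.0]
avoided = [0.0, 110.0, 110.0, 110.0, 110.0]
bcr = benefit_cost_ratio(avoided, costs, 0.05)  # roughly 3.9 here
```

The review's caveats (intangibles, discount-rate choice, distributional effects) all enter through what is included in these two cashflow streams and the chosen rate.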
Extended statistical entropy analysis as a quantitative management tool for water resource systems
NASA Astrophysics Data System (ADS)
Sobantka, Alicja; Rechberger, Helmut
2010-05-01
The use of entropy in hydrology and water resources has been applied to various applications. As water resource systems are inherently spatial and complex, a stochastic description of these systems is needed, and entropy theory enables development of such a description by providing determination of the least-biased probability distributions with limited knowledge and data. Entropy can also serve as a basis for risk and reliability analysis. The relative entropy has been variously interpreted as a measure of freedom of choice, uncertainty and disorder, information content, missing information, or information gain or loss. In the analysis of empirical data, entropy is another measure of dispersion, an alternative to the variance. Also, as an evaluation tool, the statistical entropy analysis (SEA) has been developed by previous workers to quantify the power of a process to concentrate chemical elements. Within this research programme, the SEA is to be extended to chemical compounds and tested for its deficits and potential in systems where water resources play an important role. The extended SEA (eSEA) will be developed first for the nitrogen balance in waste water treatment plants (WWTP). Later applications on the emission of substances to water bodies such as groundwater (e.g. leachate from landfills) will also be possible. By applying eSEA to the nitrogen balance in a WWTP, all possible nitrogen compounds, which may occur during the water treatment process, are taken into account and are quantified in terms of their impact on the environment and human health. It has been shown that entropy-reducing processes are part of modern waste management. Generally, materials management should be performed in a way that significant entropy rise is avoided. The entropy metric might also be used to perform benchmarking on WWTPs. The outcome of this management tool would be a determination of the efficiency of WWTPs.
By improving and optimizing the efficiency of WWTPs with respect to the state of the art of technology, waste water treatment could become more resource-preserving.
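A minimal sketch of the core idea: the Shannon entropy of a substance's mass-fraction distribution across flows. An entropy-reducing process concentrates the substance into fewer flows; a diluting process spreads it out. Note that the actual SEA/eSEA metric is more elaborate (it weights flows by mass and concentration); the partitioning values below are hypothetical, not plant data.

```python
import math

def shannon_entropy(fractions):
    """Shannon entropy (bits) of a normalized mass-fraction distribution.
    Lower entropy = substance concentrated in fewer flows."""
    return -sum(p * math.log2(p) for p in fractions if p > 0)

# hypothetical nitrogen partitioning of a WWTP across three output flows
# (effluent, sludge, off-gas) -- illustrative mass fractions only
influent = [1.0]                     # all N arrives in one input flow
output_split = [0.15, 0.25, 0.60]    # N spread over three output flows

h_in = shannon_entropy(influent)     # 0.0 bits: fully concentrated
h_out = shannon_entropy(output_split)  # > 0: the process dilutes nitrogen
```

Comparing such entropy values before and after treatment, suitably extended to compounds and their environmental impact, is the kind of benchmarking the eSEA programme proposes for WWTPs.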
Watershed Management Optimization Support Tool (WMOST) ...
EPA's Watershed Management Optimization Support Tool (WMOST) version 2 is a decision support tool designed to facilitate integrated water management by communities at the small watershed scale. WMOST allows users to look across management options in stormwater (including green infrastructure), wastewater, drinking water, and land conservation programs to find the least-cost solutions. The pdf version of these presentations accompanies the recorded webinar with closed captions to be posted on the WMOST web page. The webinar was recorded at the time a training workshop took place for EPA's Watershed Management Optimization Support Tool (WMOST, v2).
RNA interference tools for the western flower thrips, Frankliniella occidentalis.
Badillo-Vargas, Ismael E; Rotenberg, Dorith; Schneweis, Brandi A; Whitfield, Anna E
2015-05-01
The insect order Thysanoptera is exclusively comprised of small insects commonly known as thrips. The western flower thrips, Frankliniella occidentalis, is an economically important pest amongst thysanopterans due to extensive feeding damage and tospovirus transmission to hundreds of plant species worldwide. Geographically-distinct populations of F. occidentalis have developed resistance against many types of traditional chemical insecticides, and as such, management of thrips and tospoviruses are a persistent challenge in agriculture. Molecular methods for defining the role(s) of specific genes in thrips-tospovirus interactions and for assessing their potential as gene targets in thrips management strategies is currently lacking. The goal of this work was to develop an RNA interference (RNAi) tool that enables functional genomic assays and to evaluate RNAi for its potential as a biologically-based approach for controlling F. occidentalis. Using a microinjection system, we delivered double-stranded RNA (dsRNA) directly to the hemocoel of female thrips to target the vacuolar ATP synthase subunit B (V-ATPase-B) gene of F. occidentalis. Gene expression analysis using real-time quantitative reverse transcriptase-PCR (qRT-PCR) revealed significant reductions of V-ATPase-B transcripts at 2 and 3 days post-injection (dpi) with dsRNA of V-ATPase-B compared to injection with dsRNA of GFP. Furthermore, the effect of knockdown of the V-ATPase-B gene in females at these two time points was mirrored by the decreased abundance of V-ATPase-B protein as determined by quantitative analysis of Western blots. Reduction in V-ATPase-B expression in thrips resulted in increased female mortality and reduced fertility, i.e., number of viable offspring produced. Survivorship decreased significantly by six dpi compared to the dsRNA-GFP control group, which continued decreasing significantly until the end of the bioassay. 
Surviving female thrips injected with dsRNA-V-ATPase-B produced significantly fewer offspring compared to those in the dsRNA-GFP control group. Our findings indicate that an RNAi-based strategy to study gene function in thrips is feasible, can result in quantifiable phenotypes, and provides a much-needed tool for investigating the molecular mechanisms of thrips-tospovirus interactions. To our knowledge, this represents the first report of RNAi for any member of the insect order Thysanoptera and demonstrates the potential for translational research in the area of thrips pest control. Copyright © 2015 Elsevier Ltd. All rights reserved.
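The qRT-PCR transcript quantification described above is commonly computed with the Livak 2^-ΔΔCt method, normalizing the target gene to a reference gene and then to the control group. A minimal sketch with hypothetical Ct values (the study's raw Ct data are not given in the abstract):

```python
def relative_expression(ct_target_treated, ct_ref_treated,
                        ct_target_control, ct_ref_control):
    """Livak 2^-ddCt relative expression, assuming ~100% primer efficiency.
    Values < 1 indicate knockdown relative to the control group."""
    d_ct_treated = ct_target_treated - ct_ref_treated   # normalize to reference
    d_ct_control = ct_target_control - ct_ref_control
    dd_ct = d_ct_treated - d_ct_control                 # normalize to control
    return 2.0 ** (-dd_ct)

# hypothetical Ct values: V-ATPase-B normalized to a reference gene,
# dsRNA-V-ATPase-B injected vs. dsRNA-GFP control
fold = relative_expression(24.0, 18.0, 22.0, 18.0)  # ddCt = 2 -> 4-fold knockdown
```

A fold value of 0.25 here would correspond to ~75% transcript reduction in the dsRNA-V-ATPase-B group relative to the dsRNA-GFP control.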
Open Source for Knowledge and Learning Management: Strategies beyond Tools
ERIC Educational Resources Information Center
Lytras, Miltiadis, Ed.; Naeve, Ambjorn, Ed.
2007-01-01
In the last years, knowledge and learning management have made a significant impact on the IT research community. "Open Source for Knowledge and Learning Management: Strategies Beyond Tools" presents learning and knowledge management from a point of view where the basic tools and applications are provided by open source technologies.…
Turn over folders: a proven tool in succession management planning.
Engells, Thomas E
2011-01-01
The dual challenges of succession management and succession management planning are considerable. A tool, the Turn over Folder, was introduced and described in detail as a useful first step in succession management planning. The adoption of that tool will not in itself produce a succession management plan, but it will orientate the organization and its members to the reality of succession management in all important leadership and critical positions. Succession management is an important consideration in all progressive organizations and well worth the effort.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
2011-11-01
NREL's new imaging tool could provide manufacturers with insight on their processes. Scientists at the National Renewable Energy Laboratory (NREL) have used capabilities within the Process Development and Integration Laboratory (PDIL) to generate quantitative minority-carrier lifetime maps of multicrystalline silicon (mc-Si) bricks. This feat has been accomplished by using the PDIL's photoluminescence (PL) imaging system in conjunction with transient lifetime measurements obtained using a custom NREL-designed resonance-coupled photoconductive decay (RCPCD) system. PL imaging can obtain rapid high-resolution images that provide a qualitative assessment of the material lifetime, with the lifetime proportional to the pixel intensity. In contrast, the RCPCD technique provides a fast quantitative measure of the lifetime with a lower resolution and penetrates millimeters into the mc-Si brick, providing information on bulk lifetimes and material quality. This technique contrasts with commercially available minority-carrier lifetime mapping systems that use microwave conductivity measurements. Such measurements are dominated by surface recombination and lack information on the material quality within the bulk of the brick. By combining these two complementary techniques, we obtain high-resolution lifetime maps at very fast data acquisition times, attributes necessary for a production-based diagnostic tool. These bulk lifetime measurements provide manufacturers with invaluable feedback on their silicon ingot casting processes. NREL has been applying the PL images of lifetime in mc-Si bricks in collaboration with a U.S. photovoltaic industry partner through Recovery Act Funded Project ARRA T24. NREL developed a new tool to quantitatively map minority-carrier lifetime of multicrystalline silicon bricks by using photoluminescence imaging in conjunction with resonance-coupled photoconductive decay measurements.
Researchers are not hindered by surface recombination and can look deeper into the material to map bulk lifetimes. The tool is being applied to silicon bricks in a project collaborating with a U.S. photovoltaic industry partner. Photovoltaic manufacturers can use the NREL tool to obtain valuable feedback on their silicon ingot casting processes.
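The abstract states that PL pixel intensity is proportional to lifetime and that RCPCD supplies quantitative point measurements. One plausible way to combine them, sketched below, is a least-squares scale factor through the origin fitted at the RCPCD anchor points; this is an illustrative assumption, the NREL calibration procedure itself is not specified, and the data are toy values.

```python
def calibrate_lifetime_map(pl_intensities, rcpcd_points):
    """Scale a qualitative PL-intensity map to quantitative lifetimes.
    Assumes lifetime ~ k * intensity and fits k by least squares
    through the origin at the RCPCD-measured pixels.
    rcpcd_points: list of (pixel_index, measured_lifetime_us)."""
    num = sum(pl_intensities[i] * tau for i, tau in rcpcd_points)
    den = sum(pl_intensities[i] ** 2 for i, _ in rcpcd_points)
    k = num / den
    return [k * v for v in pl_intensities]

# toy example: PL intensities in arbitrary units, two RCPCD anchor points
pl = [100.0, 200.0, 400.0]
anchors = [(0, 5.0), (2, 20.0)]  # pixel 0 -> 5 us, pixel 2 -> 20 us
lifetimes = calibrate_lifetime_map(pl, anchors)
```

With consistent anchors as above, the uncalibrated middle pixel is assigned the interpolated lifetime implied by the shared scale factor.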
Addison, P F E; Flander, L B; Cook, C N
2015-02-01
Protected area management agencies are increasingly using management effectiveness evaluation (MEE) to better understand, learn from and improve conservation efforts around the globe. Outcome assessment is the final stage of MEE, where conservation outcomes are measured to determine whether management objectives are being achieved. When quantitative monitoring data are available, best-practice examples of outcome assessments demonstrate that data should be assessed against quantitative condition categories. Such assessments enable more transparent and repeatable integration of monitoring data into MEE, which can promote evidence-based management and improve public accountability and reporting. We interviewed key informants from marine protected area (MPA) management agencies to investigate how scientific data sources, especially long-term biological monitoring data, are currently informing conservation management. Our study revealed that even when long-term monitoring results are available, management agencies are not using them for quantitative condition assessment in MEE. Instead, many agencies conduct qualitative condition assessments, where monitoring results are interpreted using expert judgment only. Whilst we found substantial evidence for the use of long-term monitoring data in the evidence-based management of MPAs, MEE is rarely the sole mechanism that facilitates the knowledge transfer of scientific evidence to management action. This suggests that the first goal of MEE (to enable environmental accountability and reporting) is being achieved, but the second and arguably more important goal of facilitating evidence-based management is not. Given that many MEE approaches are in their infancy, recommendations are made to assist management agencies realize the full potential of long-term quantitative monitoring data for protected area evaluation and evidence-based management. Copyright © 2014 Elsevier Ltd. All rights reserved.
An Instructional Feedback Technique for Teaching Project Management Tools Aligned with PMBOK
ERIC Educational Resources Information Center
Gonçalves, Rafael Queiroz; von Wangenheim, Christiane Gresse; Hauck, Jean Carlo Rossa; Petri, Giani
2017-01-01
The management of contemporary software projects is unfeasible without the support of a Project Management (PM) tool. In order to enable the adoption of PM tools in practice, teaching its usage is important as part of computer education. Aiming at teaching PM tools, several approaches have been proposed, such as the development of educational PM…
Intelligent Model Management in a Forest Ecosystem Management Decision Support System
Donald Nute; Walter D. Potter; Frederick Maier; Jin Wang; Mark Twery; H. Michael Rauscher; Peter Knopp; Scott Thomasma; Mayukh Dass; Hajime Uchiyama
2002-01-01
Decision making for forest ecosystem management can include the use of a wide variety of modeling tools. These tools include vegetation growth models, wildlife models, silvicultural models, GIS, and visualization tools. NED-2 is a robust, intelligent, goal-driven decision support system that integrates tools in each of these categories. NED-2 uses a blackboard...
Federal Register 2010, 2011, 2012, 2013, 2014
2012-12-06
... Rule Change To Expand the Availability of Risk Management Tools November 30, 2012. Pursuant to Section... availability of a Risk Management Tool (the ``Tool'') currently made available in connection with sponsored... Sponsored Access Risk Management Tool.\\6\\ This optional service acts as a risk filter by causing the orders...
NASA Astrophysics Data System (ADS)
Liu, Y.; Zhou, J.; Song, L.; Zou, Q.; Guo, J.; Wang, Y.
2014-02-01
In recent years, an important development in flood management has been the focal shift from flood protection towards flood risk management. This change greatly promoted the progress of flood control research in a multidisciplinary way. Moreover, given the growing complexity and uncertainty in many decision situations of flood risk management, traditional methods, e.g., tight-coupling integration of one or more quantitative models, are not enough to provide decision support for managers. Within this context, this paper presents a beneficial methodological framework to enhance the effectiveness of decision support systems, through the dynamic adaptation of support regarding the needs of the decision-maker. In addition, we illustrate a loose-coupling technical prototype for integrating heterogeneous elements, such as multi-source data, multidisciplinary models, GIS tools and existing systems. The main innovation is the application of model-driven concepts, which put the system in a state of continuous iterative optimization. We define the new system as a model-driven decision support system (MDSS). Two characteristics that differentiate the MDSS are as follows: (1) it is made accessible to non-technical specialists; and (2) it has a higher level of adaptability and compatibility. Furthermore, the MDSS was employed to manage the flood risk in the Jingjiang flood diversion area, located in central China near the Yangtze River. Compared with traditional solutions, we believe that this model-driven method is efficient, adaptable and flexible, and thus has bright prospects of application for comprehensive flood risk management.
Pucher, Katharina K; Candel, Math J J M; Krumeich, Anja; Boot, Nicole M W M; De Vries, Nanne K
2015-07-05
We report on the longitudinal quantitative and qualitative data resulting from a two-year trajectory (2008-2011) based on the DIagnosis of Sustainable Collaboration (DISC) model. This trajectory aimed to support regional coordinators of comprehensive school health promotion (CSHP) in systematically developing change management and project management to establish intersectoral collaboration. Multilevel analyses of quantitative data on the determinants of collaborations according to the DISC model were done, with 90 respondents (response 57 %) at pretest and 69 respondents (52 %) at posttest. Nvivo analyses of the qualitative data collected during the trajectory included minutes of monthly/bimonthly personal/telephone interviews (N = 65) with regional coordinators, and documents they produced about their activities. Quantitative data showed major improvements in change management and project management. There were also improvements in consensus development, commitment formation, formalization of the CSHP, and alignment of policies, although organizational problems within the collaboration increased. Content analyses of qualitative data identified five main management styles, including (1) facilitating active involvement of relevant parties; (2) informing collaborating parties; (3) controlling and (4) supporting their task accomplishment; and (5) coordinating the collaborative processes. We have contributed to the fundamental understanding of the development of intersectoral collaboration by combining qualitative and quantitative data. Our results support a systematic approach to intersectoral collaboration using the DISC model. They also suggest five main management styles to improve intersectoral collaboration in the initial stage. The outcomes are useful for health professionals involved in similar ventures.
Petterson, S R
2016-02-01
The aim of this study was to develop a modified quantitative microbial risk assessment (QMRA) framework that could be applied as a decision support tool to choose between alternative drinking water interventions in the developing context. The impact of different household water treatment (HWT) interventions on the overall incidence of diarrheal disease and disability adjusted life years (DALYs) was estimated, without relying on source water pathogen concentration as the starting point for the analysis. A framework was developed and a software tool constructed and then implemented for an illustrative case study for Nepal based on published scientific data. Coagulation combined with free chlorine disinfection provided the greatest estimated health gains in the short term; however, when long-term compliance was incorporated into the calculations, the preferred intervention was porous ceramic filtration. The model demonstrates how the QMRA framework can be used to integrate evidence from different studies to inform management decisions, and in particular to prioritize the next best intervention with respect to estimated reduction in diarrheal incidence. This study only considered HWT interventions; it is recognized that a systematic consideration of sanitation, recreation, and drinking water pathways is important for effective management of waterborne transmission of pathogens, and the approach could be expanded to consider the broader water-related context. © 2015 Society for Risk Analysis.
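A minimal sketch of the QMRA chain from treatment performance to annual infection risk, using the standard exponential dose-response model, P(infection) = 1 − exp(−r × dose). All parameter values below (raw dose, dose-response parameter r, log-reduction values) are illustrative assumptions, not the study's Nepal inputs, and the full framework would additionally weight outcomes into DALYs and account for long-term compliance.

```python
import math

def daily_infection_risk(dose, r):
    """Exponential dose-response model: P(inf) = 1 - exp(-r * dose)."""
    return 1.0 - math.exp(-r * dose)

def annual_risk(daily_risk, days=365):
    """Annual probability of at least one infection from daily exposures."""
    return 1.0 - (1.0 - daily_risk) ** days

def treated_dose(raw_dose, log10_reduction):
    """Pathogen dose remaining after treatment with a given log reduction."""
    return raw_dose * 10.0 ** (-log10_reduction)

# illustrative comparison of two hypothetical HWT options
raw = 1.0   # assumed organisms ingested per day without treatment
r = 0.5     # assumed dose-response parameter for the reference pathogen
p_chlor = annual_risk(daily_infection_risk(treated_dose(raw, 3.0), r))
p_filter = annual_risk(daily_infection_risk(treated_dose(raw, 2.0), r))
```

The better log reduction gives the lower annual risk in this sketch; the study's key point is that imperfect long-term compliance can reverse such short-term rankings between interventions.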
More Analytical Tools for Fluids Management in Space
NASA Astrophysics Data System (ADS)
Weislogel, Mark
Continued advances during the 2000-2010 decade in the analysis of a class of capillary-driven flows relevant to materials processing and fluids management aboard spacecraft have been made. The class of flows addressed concern combined forced and spontaneous capillary flows in complex containers with interior edges. Such flows are commonplace in space-based fluid systems and arise from the particular container geometry and wetting properties of the system. Important applications for this work include low-g liquid fill and/or purge operations and passive fluid phase separation operations, where the container (i.e. fuel tank, water processor, etc.) geometry possesses interior edges, and where quantitative information on fluid location, transients, flow rates, and stability is critical. Examples include the storage and handling of liquid propellants and cryogens, water conditioning for life support, fluid phase-change thermal systems, materials processing in the liquid state, and on-orbit biofluids processing, among others. For a growing number of important problems, closed-form expressions for transient three-dimensional flows are possible that, as design tools, replace difficult, time-consuming, and rarely performed numerical calculations. An overview of a selection of solutions in hand is presented with example problems solved. NASA drop tower, low-g aircraft, and ISS flight experiment results are employed where practical to buttress the theoretical findings. The current review builds on a similar review presented at COSPAR, 2002, for the approximate decade 1990-2000.
Federal Register 2010, 2011, 2012, 2013, 2014
2011-07-12
... serves the need for direct and quantitative measurement of our target population, and which, as a quantitative research tool has some major benefits: To focus on our target population of adults who use the...
Formalization of the engineering science discipline - knowledge engineering
NASA Astrophysics Data System (ADS)
Peng, Xiao
Knowledge is the most precious ingredient facilitating aerospace engineering research and product development activities. Currently, the most common knowledge retention methods are paper-based documents, such as reports, books and journals. However, those media have innate weaknesses. For example, four generations of flying wing aircraft (Horten, Northrop XB-35/YB-49, Boeing BWB and many others) were mostly developed in isolation. Subsequent engineers were not aware of the previous developments, because these projects were documented in a way that prevented the next generation of engineers from benefiting from the previous lessons learned. In this manner, inefficient knowledge retention methods have become a primary obstacle for knowledge transfer from the experienced to the next generation of engineers. In addition, the quality of knowledge itself is a vital criterion; thus, an accurate measure of the quality of 'knowledge' is required. Although qualitative knowledge evaluation criteria have been researched in other disciplines, such as the AAA criterion by Ernest Sosa stemming from the field of philosophy, a quantitative knowledge evaluation criterion needs to be developed that is capable of numerically determining the quality of knowledge for aerospace engineering research and product development activities. To provide engineers with a high-quality knowledge management tool, the engineering science discipline Knowledge Engineering has been formalized to systematically address knowledge retention issues. This research undertaking formalizes Knowledge Engineering as follows: 1. Categorize knowledge according to its formats and representations for the first time, which serves as the foundation for the subsequent knowledge management function development. 2. Develop an efficiency evaluation criterion for knowledge management by analyzing the characteristics of both knowledge and the parties involved in the knowledge management processes. 3. 
Propose and develop an innovative Knowledge-Based System (KBS), AVD KBS, forming a systematic approach facilitating knowledge management. 4. Demonstrate the efficiency advantages of AVDKBS over traditional knowledge management methods via selected design case studies. This research formalizes, for the first time, Knowledge Engineering as a distinct discipline by delivering a robust and high-quality knowledge management and process tool, AVDKBS. Formalizing knowledge proves to significantly impact the effectiveness of aerospace knowledge retention and utilization.
van der Meer, Victor; van Stel, Henk F; Detmar, Symone B; Otten, Wilma; Sterk, Peter J; Sont, Jacob K
2007-07-01
Internet and short message service are emerging tools for chronic disease management in adolescents, but few data exist on the barriers to and benefits of internet-based asthma self-management. Our objective was to reveal the barriers and benefits perceived by adolescents with well-controlled and poorly controlled asthma to current and internet-based asthma management. Ninety-seven adolescents with mild-to-moderate persistent asthma monitored their asthma control on a designated Web site. After 4 weeks, 35 adolescents participated in eight focus groups. Participants were stratified in terms of age, gender, and asthma control level. We used qualitative and quantitative methods to analyze the written focus group transcripts. Limited self-efficacy to control asthma was a significant barrier to current asthma management in adolescents with poor asthma control (65%) compared to adolescents with good asthma control (17%; p < 0.01). The former group reported several benefits of internet-based asthma self-management: feasible electronic monitoring; easily accessible information; e-mail communication; and use of an electronic action plan. Personal benefits included the ability to react to change and to optimize asthma control. Patients with poor asthma control were able and ready to incorporate internet-based asthma self-management for a long period of time (65%), whereas patients with good control were not (11%; p < 0.01). Our findings reveal a need for the support of self-management in adolescents with poorly controlled asthma that can be met by the application of novel information and communication technologies. Internet-based self-management should therefore target adolescents with poor asthma control.
Management of fish populations in large rivers: a review of tools and approaches
Petts, Geoffrey E.; Imhoff, Jack G.; Manny, Bruce A.; Maher, John F. B.; Weisberg, Stephen B.
1989-01-01
In common with most branches of science, the management of riverine fish populations is characterised by reductionist and isolationist philosophies. Traditional fish management focuses on stocking and controls on fishing. This paper presents a consensus of scientists involved in the LARS workshop on the management of fish populations in large rivers. A move towards a more holistic philosophy is advocated, with fish management forming an integral part of sustainable river development. Based upon a questionnaire survey of LARS members, with wide-ranging expertise and experience from all parts of the world, lists of management tools currently in use are presented. Four categories of tools are described: flow, water-quality, habitat, and biological. The potential applications of tools for fish management in large rivers are discussed and research needs are identified. The lack of scientific evaluations of the different tools remains the major constraint to their wider application.
Lawless, Craig; Hubbard, Simon J.; Fan, Jun; Bessant, Conrad; Hermjakob, Henning; Jones, Andrew R.
2012-01-01
New methods for performing quantitative proteome analyses based on differential labeling protocols or label-free techniques are reported in the literature on an almost monthly basis. In parallel, a correspondingly vast number of software tools for the analysis of quantitative proteomics data has also been described in the literature and produced by private companies. In this article we focus on the review of some of the most popular techniques in the field and present a critical appraisal of several software packages available to process and analyze the data produced. We also describe the importance of community standards to support the wide range of software, which may assist researchers in the analysis of data using different platforms and protocols. It is intended that this review will serve bench scientists both as a useful reference and a guide to the selection and use of different pipelines to perform quantitative proteomics data analysis. We have produced a web-based tool (http://www.proteosuite.org/?q=other_resources) to help researchers find appropriate software for their local instrumentation, available file formats, and quantitative methodology. PMID:22804616
Cost/Schedule Control Systems Criteria: A Reference Guide to C/SCSC information
1992-09-01
Smith, Larry A. "Mainframe ARTEMIS: More than a Project Management Tool -- Earned Value Analysis (PEVA)," Project Management Journal, 19:23-28 (April 1988).
Initial development of a practical safety audit tool to assess fleet safety management practices.
Mitchell, Rebecca; Friswell, Rena; Mooren, Lori
2012-07-01
Work-related vehicle crashes are a common cause of occupational injury. Yet, there are few studies that investigate management practices used for light vehicle fleets (i.e. vehicles less than 4.5 tonnes). One of the impediments to obtaining and sharing information on effective fleet safety management is the lack of an evidence-based, standardised measurement tool. This article describes the initial development of an audit tool to assess fleet safety management practices in light vehicle fleets. The audit tool was developed by triangulating information from a review of the literature on fleet safety management practices and from semi-structured interviews with 15 fleet managers and 21 fleet drivers. A preliminary useability assessment was conducted with 5 organisations. The audit tool assesses the management of fleet safety against five core categories: (1) management, systems and processes; (2) monitoring and assessment; (3) employee recruitment, training and education; (4) vehicle technology, selection and maintenance; and (5) vehicle journeys. Each of these core categories has between 1 and 3 sub-categories. Organisations are rated at one of 4 levels on each sub-category. The fleet safety management audit tool is designed to identify the extent to which fleet safety is managed in an organisation against best practice. It is intended that the audit tool be used to conduct audits within an organisation to provide an indicator of progress in managing fleet safety and to consistently benchmark performance against other organisations. Application of the tool by fleet safety researchers is now needed to inform its further development and refinement and to permit psychometric evaluation. Copyright © 2012 Elsevier Ltd. All rights reserved.
Development of an Interactive Social Media Tool for Parents with Concerns about Vaccines
ERIC Educational Resources Information Center
Shoup, Jo Ann; Wagner, Nicole M.; Kraus, Courtney R.; Narwaney, Komal J.; Goddard, Kristin S.; Glanz, Jason M.
2015-01-01
Objective: Describe a process for designing, building, and evaluating a theory-driven social media intervention tool to help reduce parental concerns about vaccination. Method: We developed an interactive web-based tool using quantitative and qualitative methods (e.g., survey, focus groups, individual interviews, and usability testing). Results:…
The QATSDD critical appraisal tool: comments and critiques.
Fenton, Lara; Lauckner, Heidi; Gilbert, Robert
2015-12-01
The aim of this research note is to reflect on the effectiveness of the QATSDD tool for its intended use in critical appraisals of synthesis work such as integrative reviews. A seven-member research team undertook a critical appraisal of qualitative and quantitative studies using the QATSDD. We believe that the tool can spur useful dialogue among researchers and increase in-depth understanding of reviewed papers, including the strengths and limitations of the literature. To increase the clarity of the process, we suggest further definition of the language in each indicator and inclusion of explicit examples for each criterion. We would also like to see the authors outline clear parameters around the use of the tool, essentially stating that the tool should be used in synthesis work for studies of mixed methods or work that includes qualitative and quantitative research informed by a positivist paradigm. In the context of an appropriate team composition, the tool can be a useful mechanism for guiding people who are coming together to discuss the merits of studies across multiple methodologies and disciplines. © 2015 John Wiley & Sons, Ltd.
ELISA-BASE: An Integrated Bioinformatics Tool for Analyzing and Tracking ELISA Microarray Data
DOE Office of Scientific and Technical Information (OSTI.GOV)
White, Amanda M.; Collett, James L.; Seurynck-Servoss, Shannon L.
ELISA-BASE is an open-source database for capturing, organizing and analyzing protein enzyme-linked immunosorbent assay (ELISA) microarray data. ELISA-BASE is an extension of the BioArray Software Environment (BASE) database system, which was developed for DNA microarrays. In order to make BASE suitable for protein microarray experiments, we developed several plugins for importing and analyzing quantitative ELISA microarray data. Most notably, our Protein Microarray Analysis Tool (ProMAT) for processing quantitative ELISA data is now available as a plugin to the database.
Probabilistic Risk Assessment Procedures Guide for NASA Managers and Practitioners (Second Edition)
NASA Technical Reports Server (NTRS)
Stamatelatos, Michael; Dezfuli, Homayoon; Apostolakis, George; Everline, Chester; Guarro, Sergio; Mathias, Donovan; Mosleh, Ali; Paulos, Todd; Riha, David; Smith, Curtis
2011-01-01
Probabilistic Risk Assessment (PRA) is a comprehensive, structured, and logical analysis method aimed at identifying and assessing risks in complex technological systems for the purpose of cost-effectively improving their safety and performance. NASA's objective is to better understand and effectively manage risk, and thus more effectively ensure mission and programmatic success, and to achieve and maintain high safety standards at NASA. NASA intends to use risk assessment in its programs and projects to support optimal management decision making for the improvement of safety and program performance. In addition to using quantitative/probabilistic risk assessment to improve safety and enhance the safety decision process, NASA has incorporated quantitative risk assessment into its system safety assessment process, which until now has relied primarily on a qualitative representation of risk. Also, NASA has recently adopted the Risk-Informed Decision Making (RIDM) process [1-1] as a valuable addition to supplement existing deterministic and experience-based engineering methods and tools. Over the years, NASA has been a leader in most of the technologies it has employed in its programs. One would think that PRA should be no exception. In fact, it would be natural for NASA to be a leader in PRA because, as a technology pioneer, NASA uses risk assessment and management implicitly or explicitly on a daily basis. NASA has probabilistic safety requirements (thresholds and goals) for crew transportation system missions to the International Space Station (ISS) [1-2]. NASA intends to have probabilistic requirements for any new human spaceflight transportation system acquisition. Methods to perform risk and reliability assessment in the early 1960s originated in U.S. aerospace and missile programs. Fault tree analysis (FTA) is an example. It would have been a reasonable extrapolation to expect that NASA would also become the world leader in the application of PRA. 
That was, however, not to happen. Early in the Apollo program, estimates of the probability for a successful roundtrip human mission to the moon yielded disappointingly low (and suspect) values and NASA became discouraged from further performing quantitative risk analyses until some two decades later when the methods were more refined, rigorous, and repeatable. Instead, NASA decided to rely primarily on the Hazard Analysis (HA) and Failure Modes and Effects Analysis (FMEA) methods for system safety assessment.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Duren, Mike; Aldridge, Hal; Abercrombie, Robert K
2013-01-01
Compromises attributable to the Advanced Persistent Threat (APT) highlight the necessity for constant vigilance. The APT provides a new perspective on security metrics (e.g., statistics-based cyber security) and quantitative risk assessments. We consider design principles and models/tools that provide high assurance for energy delivery systems (EDS) operations regardless of the state of compromise. Cryptographic keys must be securely exchanged, then held and protected on either end of a communications link. This is challenging for a utility with numerous substations that must secure the intelligent electronic devices (IEDs) that may comprise a complex control system of systems. For example, distribution and management of keys among the millions of intelligent meters within the Advanced Metering Infrastructure (AMI) is being implemented as part of the National Smart Grid initiative. Without a means for a secure cryptographic key management system (CKMS), no cryptographic solution can be widely deployed to protect the EDS infrastructure from cyber-attack. We consider 1) how security modeling is applied to key management and cyber security concerns on a continuous basis from design through operation, 2) how trusted models and key management architectures greatly impact failure scenarios, and 3) how hardware-enabled trust is a critical element to detecting, surviving, and recovering from attack.
Connecting the Dots: Responses of Coastal Ecosystems to Changing Nutrient Concentrations
2011-01-01
Empirical relationships between phytoplankton biomass and nutrient concentrations established across a wide range of different ecosystems constitute fundamental quantitative tools for predicting effects of nutrient management plans. Nutrient management plans based on such relationships, mostly established over trends of increasing rather than decreasing nutrient concentrations, assume full reversibility of coastal eutrophication. Monitoring data from 28 ecosystems located in four well-studied regions were analyzed to study the generality of chlorophyll a versus nutrient relationships and their applicability for ecosystem management. We demonstrate significant differences across regions as well as between specific coastal ecosystems within regions in the response of chlorophyll a to changing nitrogen concentrations. We also show that the chlorophyll a versus nitrogen relationships over time constitute convoluted trajectories rather than simple unique relationships. The ratio of chlorophyll a to total nitrogen almost doubled over the last 30-40 years across all regions. The uniformity of these trends, or shifting baselines, suggests that they may result from large-scale changes, possibly associated with global climate change and increasing human stress on coastal ecosystems. Ecosystem management must, therefore, develop adaptation strategies to face shifting baselines and maintain ecosystem services at a sustainable level rather than striving to restore an ecosystem state of the past. PMID:21958109
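The chlorophyll-nutrient relationships the study describes are typically fitted on log-log axes, and the shifting-baseline diagnostic is simply the chlorophyll a to total nitrogen ratio tracked through time. A minimal sketch of both steps, using hypothetical monitoring values rather than data from the 28 ecosystems, is:

```python
import math

def loglog_fit(tn, chl):
    """Ordinary least-squares fit of log10(chl a) = a + b * log10(TN)."""
    x = [math.log10(v) for v in tn]
    y = [math.log10(v) for v in chl]
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    b = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / \
        sum((xi - mx) ** 2 for xi in x)
    return my - b * mx, b

# Hypothetical time series: total nitrogen (umol/L) declining under a
# nutrient management plan, chlorophyll a (ug/L) responding sub-linearly.
tn = [60, 50, 42, 35, 30, 26]
chl = [9.0, 8.0, 7.2, 6.5, 6.0, 5.6]

a, b = loglog_fit(tn, chl)
print(f"log-log slope b = {b:.2f}")  # b < 1: chlorophyll falls less than proportionally

# Shifting-baseline diagnostic: the chl a : TN ratio rises over time
ratios = [c / t for c, t in zip(chl, tn)]
print(f"chl:TN ratio, first vs last year: {ratios[0]:.3f} vs {ratios[-1]:.3f}")
```

A slope below one combined with a rising chl:TN ratio is the signature of the non-reversible trajectories the abstract describes: biomass does not retrace the path it followed while nutrients were increasing.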
Management Tools in Engineering Education.
ERIC Educational Resources Information Center
Fehr, M.
1999-01-01
Describes a teaching model that applies management tools such as delegation, total quality management, time management, teamwork, and Deming rules. Promotes the advantages of efficiency, reporting, independent scheduling, and quality. (SK)
Towards the Development of an Intimate Partner Violence Screening Tool for Gay and Bisexual Men
Stephenson, Rob; Hall, Casey D.; Williams, Whitney; Sato, Kimi; Finneran, Catherine
2013-01-01
Introduction: Recent research suggests that gay and bisexual men experience intimate partner violence (IPV) at rates comparable to heterosexual women. However, current screening tools used to identify persons experiencing IPV were largely created for use with heterosexual women. Given the high prevalence of IPV among gay and bisexual men in the United States, the lack of IPV screening tools that reflect the lived realities of gay and bisexual men is problematic. This paper describes the development of a short-form IPV screening tool intended to be used with gay and bisexual men. Methods: A novel definition of IPV, informed by formative Focus Group Discussions, was derived from a quantitative survey of approximately 1,100 venue-recruited gay and bisexual men. From this new definition, a draft IPV screening tool was created. After expert review (n=13) and cognitive interviews with gay and bisexual men (n=47), a screening tool of six questions was finalized. A national, online-recruited sample (n=822) was used to compare rates of IPV identified by the novel tool and current standard tools. Results: The six-item, short-form tool created through the six-stage research process captured a significantly higher prevalence of recent experience of IPV compared to a current and commonly used screening tool (30.7% versus 7.5%, p<0.05). The novel short-form tool described additional domains of IPV not currently found in screening tools, including monitoring behaviors, controlling behaviors, and HIV-related IPV. The screener takes less than five minutes to complete and is written at a 6th-grade reading level. Conclusion: Gay and bisexual men experiencing IPV must first be identified before services can reach them. 
Given emergent literature that demonstrates the high prevalence of IPV among gay and bisexual men and the known adverse health sequelae of experiencing IPV, this novel screening tool may allow for the quick identification of men experiencing IPV and the opportunity for referrals for the synergistic management of IPV. Future work should focus on implementing this tool in primary or acute care settings in order to determine its acceptability and its feasibility of use more broadly. PMID:23997849
Towards the development of an intimate partner violence screening tool for gay and bisexual men.
Stephenson, Rob; Hall, Casey D; Williams, Whitney; Sato, Kimi; Finneran, Catherine
2013-08-01
Recent research suggests that gay and bisexual men experience intimate partner violence (IPV) at rates comparable to heterosexual women. However, current screening tools used to identify persons experiencing IPV were largely created for use with heterosexual women. Given the high prevalence of IPV among gay and bisexual men in the United States, the lack of IPV screening tools that reflect the lived realities of gay and bisexual men is problematic. This paper describes the development of a short-form IPV screening tool intended to be used with gay and bisexual men. A novel definition of IPV, informed by formative Focus Group Discussions, was derived from a quantitative survey of approximately 1,100 venue-recruited gay and bisexual men. From this new definition, a draft IPV screening tool was created. After expert review (n=13) and cognitive interviews with gay and bisexual men (n=47), a screening tool of six questions was finalized. A national, online-recruited sample (n=822) was used to compare rates of IPV identified by the novel tool and current standard tools. The six-item, short-form tool created through the six-stage research process captured a significantly higher prevalence of recent experience of IPV compared to a current and commonly used screening tool (30.7% versus 7.5%, p<0.05). The novel short-form tool described additional domains of IPV not currently found in screening tools, including monitoring behaviors, controlling behaviors, and HIV-related IPV. The screener takes less than five minutes to complete and is written at a 6th-grade reading level. Gay and bisexual men experiencing IPV must first be identified before services can reach them. Given emergent literature that demonstrates the high prevalence of IPV among gay and bisexual men and the known adverse health sequelae of experiencing IPV, this novel screening tool may allow for the quick identification of men experiencing IPV and the opportunity for referrals for the synergistic management of IPV. 
Future work should focus on implementing this tool in primary or acute care settings in order to determine its acceptability and its feasibility of use more broadly.
Whalen, Kimberly J; Buchholz, Susan W
The overall objective of this review is to quantitatively measure the psychometric properties and the feasibility of caregiver burden screening tools. The more specific objectives were to determine the reliability, validity, and feasibility of tools that are used to screen for caregiver burden and strain. This review considered international quantitative research papers that addressed the psychometric properties and feasibility of caregiver burden screening tools. The search strategy aimed to find both published and unpublished studies, published in English from 1980 to 2007. An initial limited search of MEDLINE and CINAHL was undertaken, followed by analysis of the text words contained in the title and abstract and the index terms used to describe the article. A second search identified keywords and index terms across major databases. Third, the reference lists of identified reports and articles were searched for additional studies. Each paper was assessed by two independent reviewers for methodological quality prior to inclusion in the review, using an appropriate critical appraisal instrument from the Joanna Briggs Institute's System for the Unified Management, Assessment and Review (SUMARI) package. Because burden is a multidimensional construct defined internationally with a multitude of other terms, only those studies whose title, abstract or keywords contained the search terminology developed for this review were identified for retrieval. The construct of caregiver burden is not standardized, and many terms are used to describe burden. A caregiver is also identified as a carer. Instruments exist in multiple languages and have been tested in multiple populations. A total of 112 papers, experimental and non-experimental in nature, were included in the review. The majority of papers were non-experimental studies that tested or used a caregiver burden screening tool. 
Because of the nature of these papers, a meta-analysis of the results was not possible. Instead a table is used to depict the 74 caregiver burden screening tools that meet the psychometric and feasibility standards of this review. The Zarit Burden Interview (ZBI), in particular the 22-item version, has been examined the most throughout the literature. In addition to its sound psychometric properties, the ZBI has been widely used across languages and cultures. The significant amount of research that has already been done on psychometric testing of caregiver burden tools has provided a solid foundation for additional research. Although some tools have been well tested, many tools have published limited psychometric properties and feasibility data. The clinician needs to be aware of this and may need to team up with a researcher to obtain additional research data on their specific population before using a minimally tested caregiver burden screening tool. Because caregiver burden is multidimensional and many different terms are used to describe burden, both the clinician and researcher need to be precise in their selection of the appropriate tool for their work.
Pepin, K M; Spackman, E; Brown, J D; Pabilonia, K L; Garber, L P; Weaver, J T; Kennedy, D A; Patyk, K A; Huyvaert, K P; Miller, R S; Franklin, A B; Pedersen, K; Bogich, T L; Rohani, P; Shriner, S A; Webb, C T; Riley, S
2014-03-01
Wild birds are the primary source of genetic diversity for influenza A viruses that eventually emerge in poultry and humans. Much progress has been made in the descriptive ecology of avian influenza viruses (AIVs), but contributions are less evident from quantitative studies (e.g., those including disease dynamic models). Transmission between host species, individuals and flocks has not been measured with sufficient accuracy to allow robust quantitative evaluation of alternate control protocols. We focused on the United States of America (USA) as a case study for determining the state of our quantitative knowledge of potential AIV emergence processes from wild hosts to poultry. We identified priorities for quantitative research that would build on existing tools for responding to AIV in poultry and concluded that the following knowledge gaps can be addressed with current empirical data: (1) quantification of the spatio-temporal relationships between AIV prevalence in wild hosts and poultry populations, (2) understanding how the structure of different poultry sectors impacts within-flock transmission, (3) determining mechanisms and rates of between-farm spread, and (4) validating current policy-decision tools with data. The modeling studies we recommend will improve our mechanistic understanding of potential AIV transmission patterns in USA poultry, leading to improved measures of accuracy and reduced uncertainty when evaluating alternative control strategies. Copyright © 2013 The Authors. Published by Elsevier B.V. All rights reserved.
NASA Astrophysics Data System (ADS)
Reilly, B. T.; Stoner, J. S.; Wiest, J.
2017-08-01
Computed tomography (CT) of sediment cores allows for high-resolution images, three-dimensional volumes, and down core profiles. These quantitative data are generated through the attenuation of X-rays, which are sensitive to sediment density and atomic number, and are stored in pixels as relative gray scale values or Hounsfield units (HU). We present a suite of MATLAB™ tools specifically designed for routine sediment core analysis as a means to standardize and better quantify the products of CT data collected on medical CT scanners. SedCT uses a graphical interface to process Digital Imaging and Communications in Medicine (DICOM) files, stitch overlapping scanned intervals, and create down core HU profiles in a manner robust to normal coring imperfections. Utilizing a random sampling technique, SedCT reduces data size and allows for quick processing on typical laptop computers. SedCTimage uses a graphical interface to create quality tiff files of CT slices that are scaled to a user-defined HU range, preserving the quantitative nature of CT images and easily allowing for comparison between sediment cores with different HU means and variance. These tools are presented along with examples from lacustrine and marine sediment cores to highlight the robustness and quantitative nature of this method.
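The core down-core profiling step that SedCT automates can be illustrated in outline: average the Hounsfield units of each slice after masking non-sediment pixels. The sketch below is Python rather than MATLAB, uses a made-up two-slice volume instead of DICOM input, and assumes a -500 HU air/liner threshold; none of these specifics come from SedCT itself.

```python
# Toy stand-in for a CT volume: a list of depth slices, each a 2D grid of
# Hounsfield units. SedCT reads real DICOM slices; the values here and the
# -500 HU masking threshold are illustrative assumptions.
volume = [
    [[-1000, 1200, 1210, -1000],
     [ 1190, 1205, 1215,  1200],
     [ 1195, 1210, 1220,  1205],
     [-1000, 1200, 1210, -1000]],
    [[-1000, 1100, 1110, -1000],
     [ 1090, 1105, 1115,  1100],
     [ 1095, 1110, 1120,  1105],
     [-1000, 1100, 1110, -1000]],
]

def downcore_profile(volume, air_hu=-500):
    """Mean HU per slice, ignoring air/liner pixels at or below the threshold."""
    profile = []
    for depth_slice in volume:
        sediment = [hu for row in depth_slice for hu in row if hu > air_hu]
        profile.append(sum(sediment) / len(sediment))
    return profile

print(downcore_profile(volume))  # one mean HU value per down-core depth
```

Masking before averaging is what keeps the profile "robust to normal coring imperfections": voids and liner pixels would otherwise drag the slice mean toward air values.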
Toxicity Estimation Software Tool (TEST)
The Toxicity Estimation Software Tool (TEST) was developed to allow users to easily estimate the toxicity of chemicals using Quantitative Structure Activity Relationships (QSARs) methodologies. QSARs are mathematical models used to predict measures of toxicity from the physical c...
Arenal-type pyroclastic flows: A probabilistic event tree risk analysis
NASA Astrophysics Data System (ADS)
Meloy, Anthony F.
2006-09-01
A quantitative hazard-specific scenario-modelling risk analysis is performed at Arenal volcano, Costa Rica for the newly recognised Arenal-type pyroclastic flow (ATPF) phenomenon using an event tree framework. These flows are generated by the sudden depressurisation and fragmentation of an active basaltic andesite lava pool as a result of a partial collapse of the crater wall. The deposits of this type of flow include angular blocks and juvenile clasts, which are rarely found in other types of pyroclastic flow. An event tree analysis (ETA) is a useful tool and framework in which to analyse and graphically present the probabilities of the occurrence of many possible events in a complex system. Four event trees are created in the analysis, three of which are extended to investigate the varying individual risk faced by three generic representatives of the surrounding community: a resident, a worker, and a tourist. The raw numerical risk estimates determined by the ETA are converted into a set of linguistic expressions (i.e. VERY HIGH, HIGH, MODERATE etc.) using an established risk classification scale. Three individually tailored semi-quantitative risk maps are then created from a set of risk conversion tables to show how the risk varies for each individual in different areas around the volcano. In some cases, by relocating from the north to the south, the level of risk can be reduced by up to three classes. While the individual risk maps may be broadly applicable, and therefore of interest to the general community, the risk maps and associated probability values generated in the ETA are intended to be used by trained professionals and government agencies to evaluate the risk and effectively manage the long-term development of infrastructure and habitation. 
With the addition of fresh monitoring data, the combination of both long- and short-term event trees would provide a comprehensive and consistent method of risk analysis (both during and pre-crisis), and as such, an ETA is considered to be a valuable quantitative decision support tool.
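The mechanics of an event tree reduce to multiplying conditional probabilities along each branch path and then mapping the product onto a linguistic scale. The sketch below shows only that mechanic; the branch structure, every probability, and the class cutoffs are illustrative assumptions, not values from the Arenal analysis.

```python
# An outcome's probability is the product of the conditional probabilities
# along its branch path. All numbers and thresholds here are illustrative.
def outcome_probability(branch_probs):
    p = 1.0
    for prob in branch_probs:
        p *= prob
    return p

def risk_class(p):
    """Map a raw probability onto a linguistic risk scale (assumed cutoffs)."""
    if p >= 1e-2:
        return "VERY HIGH"
    if p >= 1e-3:
        return "HIGH"
    if p >= 1e-4:
        return "MODERATE"
    return "LOW"

# Path: crater-wall collapse -> ATPF generated -> flow reaches sector -> exposure.
# The final branch (probability of being present) differs per individual.
paths = {
    "resident": [0.1, 0.5, 0.2, 0.9],
    "worker":   [0.1, 0.5, 0.2, 0.3],
    "tourist":  [0.1, 0.5, 0.2, 0.05],
}
for person, branches in paths.items():
    p = outcome_probability(branches)
    print(f"{person}: p = {p:.4f} -> {risk_class(p)}")
```

Because only the exposure branch differs between the three individuals, their risk classes diverge purely through time spent in the hazard zone, which is why relocating or limiting exposure can drop the class by several levels.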
The role of risk-based prioritization in total quality management
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bennett, C.T.
1994-10-01
The climate in which government managers must make decisions grows more complex and uncertain. All stakeholders - the public, industry, and Congress - are demanding greater consciousness, responsibility, and accountability of programs and their budgets. Yet, managerial decisions have become multifaceted, involve greater risk, and operate over much longer time periods. Over the last four or five decades, as policy analysis and decisions became more complex, scientists from psychology, operations research, systems science, and economics have developed a more or less coherent process called decision analysis to aid program management. The process of decision analysis - a systems theoretic approach - provides the backdrop for this paper. The Laboratory Integrated Prioritization System (LIPS) has been developed as a systems analytic and risk-based prioritization tool to aid the management of the Tri-Labs' (Lawrence Livermore, Los Alamos, and Sandia) operating resources. Preliminary analyses of the effects of LIPS have confirmed the practical benefits of decision and systems sciences - the systematic, quantitative reduction in uncertainty. To date, the use of LIPS - and, hence, its value - has been restricted to resource allocation within the Tri-Labs' operations budgets. This report extends the role of risk-based prioritization to the support of DOE Total Quality Management (TQM) programs. Furthermore, this paper will argue for the requirement to institutionalize an evolutionary, decision theoretic approach to the policy analysis of the Department of Energy's Program Budget.
Mitchell, M.S.; Rutzmoser, S.H.; Wigley, T.B.; Loehle, C.; Gerwin, J.A.; Keyser, P.D.; Lancia, R.A.; Perry, R.W.; Reynolds, C.J.; Thill, R.E.; Weih, R.; White, D.; Wood, P.B.
2006-01-01
Little is known about factors that structure biodiversity on landscape scales, yet current land management protocols, such as forest certification programs, place an increasing emphasis on managing for sustainable biodiversity at landscape scales. We used a replicated landscape study to evaluate relationships between forest structure and avian diversity at both the stand and landscape levels. We used data on bird communities collected under comparable sampling protocols on four managed forests located across the Southeastern US to develop logistic regression models describing relationships between habitat factors and the distribution of overall richness and richness of selected guilds. Landscape models generated for eight of nine guilds showed a strong relationship between richness and both availability and configuration of landscape features. Diversity of topographic features and heterogeneity of forest structure were primary determinants of avian species richness. Forest heterogeneity, in both age and forest type, was strongly and positively associated with overall avian richness and richness for most guilds. Road density was associated positively but weakly with avian richness. Landscape variables dominated all models generated, but no consistent patterns in metrics or scale were evident. Model fit was strong for neotropical migrants and relatively weak for short-distance migrants and resident species. Our models provide a tool that will allow managers to evaluate and demonstrate quantitatively how management practices affect avian diversity on landscapes.
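A logistic regression of the kind the study fits links landscape covariates to a probability of guild presence through the logistic function. The sketch below is a hypothetical illustration: the coefficient values are invented, chosen only to mirror the reported signs (strong positive heterogeneity effects, a weak positive road-density effect), not the fitted models.

```python
import math

def presence_probability(age_heterogeneity, type_heterogeneity, road_density,
                         b0=-2.0, b_age=1.5, b_type=1.2, b_road=0.1):
    """Logistic model: P(guild present) from standardized landscape metrics.
    All coefficients are hypothetical placeholders, not values from the study."""
    z = (b0
         + b_age * age_heterogeneity
         + b_type * type_heterogeneity
         + b_road * road_density)
    return 1.0 / (1.0 + math.exp(-z))

homogeneous = presence_probability(0.2, 0.1, 0.5)
heterogeneous = presence_probability(1.5, 1.2, 0.5)
print(f"P(present), homogeneous context:   {homogeneous:.2f}")
print(f"P(present), heterogeneous context: {heterogeneous:.2f}")
```

Evaluating such a model across alternative management scenarios is how a manager could "demonstrate quantitatively" the diversity consequences of a planned harvest layout before committing to it.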
Toolkit of Available EPA Green Infrastructure Modeling ...
This webinar will present a toolkit consisting of five EPA green infrastructure models and tools, along with communication material. This toolkit can be used as a teaching and quick reference resource for use by planners and developers when making green infrastructure implementation decisions. It can also be used for low impact development design competitions. Models and tools included: Green Infrastructure Wizard (GIWiz), Watershed Management Optimization Support Tool (WMOST), Visualizing Ecosystem Land Management Assessments (VELMA) Model, Storm Water Management Model (SWMM), and the National Stormwater Calculator (SWC).
Geographic Information System Tools for Conservation Planning: User's Manual
Fox, Timothy J.; Rohweder, Jason J.; Kenow, K.P.; Korschgen, C.E.; DeHaan, H.C.
2003-01-01
Public and private land managers desire better ways to incorporate landscape, species, and habitat relations into their conservation planning processes. We present three tools, developed for the Environmental Systems Research Institute's ArcView 3.x platform, applicable to many types of wildlife conservation management and planning efforts. These tools provide managers and planners with the ability to rapidly assess landscape attributes and link these attributes with species-habitat information. To use the tools, the user provides a detailed land cover spatial database and develops a matrix to identify species-habitat relations for the landscape of interest. The tools are applicable to any taxa or suite of taxa for which the required data are available. The user also has the ability to interactively make polygon-specific changes to the landscape and re-examine species-habitat relations. The development of these tools has given resource managers the means to evaluate the merits of proposed landscape management scenarios and to choose the scenario that best fits the goals of the managed area.
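The species–habitat matrix approach described here reduces to a lookup: given the cover types present in a polygon and a matrix mapping each species to its suitable cover types, predicted richness is the count of species with at least one suitable type present. A minimal sketch (species and cover classes below are hypothetical examples, not data from the manual):

```python
def predicted_richness(polygon_covers, habitat_matrix):
    # A species is counted as potentially present if any of its
    # suitable cover types occurs in the polygon's land cover.
    present = set(polygon_covers)
    return sum(1 for suitable in habitat_matrix.values()
               if present & suitable)

# Hypothetical species-habitat matrix: species -> suitable cover types
habitat_matrix = {
    "wood_duck": {"bottomland_forest", "open_water"},
    "meadowlark": {"grassland"},
    "prothonotary_warbler": {"bottomland_forest"},
}
n = predicted_richness(["bottomland_forest", "cropland"], habitat_matrix)
```

Interactively editing a polygon's cover list and re-running the count mirrors the "change the landscape and re-examine" workflow the abstract describes.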
Oster, Ryan J; Wijesinghe, Rasanthi U; Haack, Sheridan K; Fogarty, Lisa R; Tucker, Taaja R; Riley, Stephen C
2014-12-16
Quantitative assessment of bacterial pathogens, their geographic variability, and distribution in various matrices at Great Lakes beaches are limited. Quantitative PCR (qPCR) was used to test for genes from E. coli O157:H7 (eaeO157), shiga-toxin producing E. coli (stx2), Campylobacter jejuni (mapA), Shigella spp. (ipaH), and a Salmonella enterica-specific (SE) DNA sequence at seven Great Lakes beaches, in algae, water, and sediment. Overall, detection frequencies were mapA > stx2 > ipaH > SE > eaeO157. Results were highly variable among beaches and matrices; some correlations with environmental conditions were observed for mapA, stx2, and ipaH detections. Beach seasonal mean mapA abundance in water was correlated with beach seasonal mean log10 E. coli concentration. At one beach, stx2 gene abundance was positively correlated with concurrent daily E. coli concentrations. Concentration distributions for stx2, ipaH, and mapA within algae, sediment, and water were statistically different (using the Non-Detect and Data Analysis in R package). Assuming 10, 50, or 100% of gene copies represented viable and presumably infective cells, a quantitative microbial risk assessment tool developed by Michigan State University indicated a moderate probability of illness for Campylobacter jejuni at the study beaches, especially where recreational water quality criteria were exceeded. Pathogen gene quantification may be useful for beach water quality management.
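The risk screening described above follows the usual quantitative microbial risk assessment pattern: assume some fraction of detected gene copies represents viable organisms, convert that to an ingested dose, and apply a dose-response model. A minimal sketch using the approximate beta-Poisson model (the α and β defaults below are commonly cited for C. jejuni in the QMRA literature but are assumptions here, not parameters taken from this study or the Michigan State tool):

```python
def beta_poisson_risk(dose, alpha=0.145, beta=7.59):
    # Approximate beta-Poisson dose-response:
    # P(infection) = 1 - (1 + dose/beta) ** (-alpha)
    return 1.0 - (1.0 + dose / beta) ** (-alpha)

def risk_from_gene_copies(copies_ingested, viable_fraction):
    # Viability scenarios like the study's 10%, 50%, and 100% assumptions
    return beta_poisson_risk(copies_ingested * viable_fraction)

# Hypothetical exposure: 100 gene copies ingested, 50% assumed viable
risk = risk_from_gene_copies(copies_ingested=100, viable_fraction=0.5)
```

Because risk is monotone in the viable fraction, the three viability scenarios bracket the plausible probability of infection for a given gene-copy measurement.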
NASA Astrophysics Data System (ADS)
Choi, Woo June; Pepple, Kathryn L.; Zhi, Zhongwei; Wang, Ruikang K.
2015-01-01
Uveitis models in rodents are important in the investigation of pathogenesis in human uveitis and the development of appropriate therapeutic strategies for treatment. Quantitative monitoring of ocular inflammation in small animal models provides an objective metric to assess uveitis progression and/or therapeutic effects. We present a new application of optical coherence tomography (OCT) and OCT-based microangiography (OMAG) to a rat model of acute anterior uveitis induced by intravitreal injection of a killed mycobacterial extract. OCT/OMAG is used to provide noninvasive three-dimensional imaging of the anterior segment of the eyes prior to injection (baseline) and two days post-injection (peak inflammation) in rats with and without steroid treatments. OCT imaging identifies characteristic structural and vascular changes in the anterior segment of the inflamed animals when compared to baseline images. Characteristics of inflammation identified include anterior chamber cells, corneal edema, pupillary membranes, and iris vasodilation. In contrast, no significant difference from the control is observed for the steroid-treated eye. These findings are compared with the histology assessment of the same eyes. In addition, quantitative measurements of central corneal thickness and iris vessel diameter are determined. This pilot study demonstrates that OCT-based microangiography promises to be a useful tool for the assessment and management of uveitis in vivo.
Lighthall, Geoffrey K; Bahmani, Dona; Gaba, David
2016-02-01
Classroom lectures are the mainstay of imparting knowledge in a structured manner and have the additional goals of stimulating critical thinking, lifelong learning, and improvements in patient care. The impact of lectures on patient care is difficult to examine in critical care because of the heterogeneity in patient conditions and personnel as well as confounders such as time pressure, interruptions, fatigue, and nonstandardized observation methods. The critical care environment was therefore recreated in a simulation laboratory, where a high-fidelity mannequin simulator running a standardized septic shock script was presented to trainees. The reproducibility of this patient and associated conditions allowed the evaluation of "clinical performance" in the management of septic shock. In a previous study, we developed and validated tools for the quantitative analysis of house staff managing septic shock simulations. In the present analysis, we examined whether measures of clinical performance were improved when a lecture on the management of shock preceded a simulated exercise on the management of septic shock. The administration of the septic shock simulations allowed performance measurements to be calculated both for medical interns and for subsequent management by a larger resident-led team. The analysis revealed that receiving a lecture on shock before managing a simulated patient with septic shock did not produce scores higher than those of trainees who did not receive the lecture. This result was similar both for interns managing the patient and for subsequent management by a resident-led team. We failed to find an immediate impact on clinical performance in simulations of septic shock after a lecture on the management of this syndrome. Lectures are likely not a reliable sole method for improving clinical performance in the management of complex disease processes.
ERIC Educational Resources Information Center
European Training Foundation, Turin (Italy).
This document presents a management tool kit on training needs assessment and program design for countries in transition to a market economy. Chapter 1 describes the tool's development within the framework of the project called Strengthening of Partnership between Management Training Institutions and Companies, Ukraine-Kazakhstan-Kyrgyzstan.…
Barnett, Carolina; Merkies, Ingemar S J; Katzberg, Hans; Bril, Vera
2015-09-02
The Quantitative Myasthenia Gravis Score and the Myasthenia Gravis Composite are two commonly used outcome measures in myasthenia gravis. So far, their measurement properties have not been compared, so we aimed to study their psychometric properties using the Rasch model. 251 patients with stable myasthenia gravis were assessed with both scales, and 211 patients returned for a second assessment. We studied fit to the Rasch model at the first visit, and compared item fit, thresholds, differential item functioning, local dependence, person separation index, and tests for unidimensionality. We also assessed test-retest reliability and estimated the minimal detectable change. Neither scale fit the Rasch model (χ², p < 0.05). The Myasthenia Gravis Composite had lower discrimination properties than the Quantitative Myasthenia Gravis Score (Person Separation Index: 0.14 and 0.7, respectively). There was local dependence in both scales, as well as differential item functioning for ocular and generalized disease. Disordered thresholds were found in 6 (60%) items of the Myasthenia Gravis Composite and in 4 (31%) of the Quantitative Myasthenia Gravis Score. Both tools had adequate test-retest reliability (ICCs > 0.8). The minimal detectable change was 4.9 points for the Myasthenia Gravis Composite and 4.3 points for the Quantitative Myasthenia Gravis Score. Neither scale fulfilled Rasch model expectations. The Quantitative Myasthenia Gravis Score has higher discrimination than the Myasthenia Gravis Composite. Both tools have items with disordered thresholds, differential item functioning and local dependency. There was evidence of multidimensionality in the Quantitative Myasthenia Gravis Score. The minimal detectable change values are higher than the minimal significant change reported in previous studies. These findings might inform future modifications of these tools.
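Minimal detectable change values like those reported here are conventionally derived from test-retest reliability: the standard error of measurement is SEM = SD·√(1 − ICC), and MDC95 = 1.96·√2·SEM. A sketch of that calculation with hypothetical inputs (the baseline SD below is illustrative, not a value from this study):

```python
import math

def minimal_detectable_change(sd, icc, z=1.96):
    # SEM = SD * sqrt(1 - ICC); MDC95 = z * sqrt(2) * SEM
    sem = sd * math.sqrt(1.0 - icc)
    return z * math.sqrt(2.0) * sem

# Hypothetical inputs: baseline SD of 5.5 points, test-retest ICC of 0.85
mdc = minimal_detectable_change(sd=5.5, icc=0.85)  # roughly 5.9 points
```

The formula makes explicit why a highly reliable scale (ICC near 1) yields a small MDC: less measurement noise means smaller score changes can be distinguished from error.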
NASA Astrophysics Data System (ADS)
Loisel, J.; Harden, J. W.; Hugelius, G.
2017-12-01
What are the most important soil services valued by land stewards and planners? Which soil-data metrics can be used to quantify each soil service? What are the steps required to quantitatively index the baseline value of soil services and their vulnerability under different land-use and climate change scenarios? How do we simulate future soil service pathways (or trajectories) under changing management regimes using process-based ecosystem models? What is the potential cost (economic, social, and other) of soil degradation under these scenarios? How sensitive or resilient are soil services to prescribed management practices, and how does sensitivity vary over space and time? We are bringing together a group of scientists and conservation organizations to answer these questions by launching Soil Banker, an open and flexible tool to quantify soil services that can be used at any scale, and by any stakeholder. Our overarching goals are to develop metrics and indices to quantify peatland soil ecosystem services, monitor change of these services, and guide management. This paper describes our methodology applied to peatlands and presents two case studies (Indonesia and Patagonia) demonstrating how Peatland Soil Banker can be deployed as an accounting tool of peatland stocks, a quantitative measure of peatland health, and a projection of peatland degradation or enhancement under different land-use cases. Why peatlands? They store about 600 billion tons of carbon that account for ⅓ of the world's soil carbon. Peatlands have dynamic GHG exchanges of CO2, CH4, and NOx with the atmosphere, which play a role in regulating global climate; studies indicate that peatland degradation releases about 2-3 billion tons of CO2 to the atmosphere annually.
These ecosystems also provide local and regional ecosystem services: they constitute important components of the N and P cycles, store about 10% of the world's freshwater and buffer large fluxes of freshwater on an annual basis; they also support much biodiversity, including iconic species such as the orangutan in Indonesia and the guanaco in Chile. While these ecosystem services have been recognized in many sectors and a voluntary standard for a peatland carbon market is emerging, peatland services have not been systematically quantified, or accounted for, at the global level.
ERIC Educational Resources Information Center
Castillo, Enrico G.; Pincus, Harold A.; Wieland, Melissa; Roter, Debra; Larson, Susan; Houck, Patricia; Reynolds, Charles F.; Cruz, Mario
2012-01-01
Objective: The authors quantitatively examined differences in psychiatric residents' and attending physicians' communication profiles and voice tones. Methods: Audiotaped recordings of 49 resident-patient and 35 attending-patient medication-management appointments at four ambulatory sites were analyzed with the Roter Interaction Analysis System…
Implementation of Quality Management in Core Service Laboratories
Creavalle, T.; Haque, K.; Raley, C.; Subleski, M.; Smith, M.W.; Hicks, B.
2010-01-01
CF-28 The Genetics and Genomics group of the Advanced Technology Program of SAIC-Frederick exists to bring innovative genomic expertise, tools and analysis to NCI and the scientific community. The Sequencing Facility (SF) provides next generation short read (Illumina) sequencing capacity to investigators using a streamlined production approach. The Laboratory of Molecular Technology (LMT) offers a wide range of genomics core services including microarray expression analysis, miRNA analysis, array comparative genome hybridization, long read (Roche) next generation sequencing, quantitative real time PCR, transgenic genotyping, Sanger sequencing, and clinical mutation detection services to investigators from across the NIH. As the technology supporting this genomic research becomes more complex, the need for basic quality processes within all aspects of the core service groups becomes critical. The Quality Management group works alongside members of these labs to establish or improve processes supporting operations control (equipment, reagent and materials management), process improvement (reengineering/optimization, automation, acceptance criteria for new technologies and tech transfer), and quality assurance and customer support (controlled documentation/SOPs, training, service deficiencies and continual improvement efforts). Implementation and expansion of quality programs within unregulated environments demonstrates SAIC-Frederick's dedication to providing the highest quality products and services to the NIH community.
Simulating Heterogeneous Infiltration and Contaminant leaching Processes at Chalk River, Ontario
NASA Astrophysics Data System (ADS)
Ali, M. A.; Ireson, A. M.; Keim, D.
2015-12-01
A study is conducted at a waste management area in Chalk River, Ontario to characterize flow and contaminant transport with the aim of contributing to improved hydrogeological risk assessment in the context of waste management. Field monitoring has been performed to gain insights into the unsaturated zone characteristics, moisture dynamics, and contaminant transport rates. The objective is to provide quantitative estimates of surface fluxes (quantification of infiltration and evaporation) and investigations of unsaturated zone processes controlling water infiltration and spatial variability in head distributions and flow rates. One particular issue is to examine the effectiveness of the clayey soil cap installed to prevent infiltration of water into the waste repository and the top sand soil cover above the clayey layer installed to divert the infiltrated water laterally. The spatial variability in the unsaturated zone properties and associated effects on water flow and contaminant transport observed at the site have led to a concerted effort to develop an improved model of flow and transport based on stochastic concepts. Results obtained through the unsaturated zone model investigations are combined with the hydrogeological and geochemical components to develop predictive tools for assessing the long term fate of the contaminants at the waste management site.
Balancing water scarcity and quality for sustainable irrigated agriculture
NASA Astrophysics Data System (ADS)
Assouline, Shmuel; Russo, David; Silber, Avner; Or, Dani
2015-05-01
The challenge of meeting the projected doubling of global demand for food by 2050 is monumental. It is further exacerbated by the limited prospects for land expansion and rapidly dwindling water resources. A promising strategy for increasing crop yields per unit land requires the expansion of irrigated agriculture and the harnessing of water sources previously considered "marginal" (saline, treated effluent, and desalinated water). Such an expansion, however, must carefully consider potential long-term risks on soil hydroecological functioning. The study provides critical analyses of the use of marginal water and of management approaches to map out potential risks. Long-term application of treated effluent (TE) for irrigation has shown adverse impacts on soil transport properties, and introduces certain health risks due to the persistent exposure of soil biota to anthropogenic compounds (e.g., promoting antibiotic resistance). The availability of desalinated water (DS) for irrigation expands management options and improves yields while reducing irrigation amounts and salt loading into the soil. Quantitative models are used to delineate trends associated with long-term use of TE and DS considering agricultural, hydrological, and environmental aspects. The primary challenges to the sustainability of agroecosystems lie with the hazards of saline and sodic conditions, and the unintended consequences on soil hydroecological functioning. Multidisciplinary approaches that combine new scientific knowhow with legislative, economic, and societal tools are required to ensure safe and sustainable use of water resources of different qualities. The new scientific knowhow should provide quantitative models for integrating key biophysical processes with ecological interactions at appropriate spatial and temporal scales.
Fernandes, Francisco S; Godoy, Wesley A C; Ramalho, Francisco S; Garcia, Adriano G; Santos, Bárbara D B; Malaquias, José B
2018-01-01
Population dynamics of aphids have been studied in sole and intercropping systems. These studies have required the use of more precise analytical tools in order to better understand patterns in quantitative data. Mathematical models are among the most important tools to explain the dynamics of insect populations. This study investigated the population dynamics of the aphids Aphis gossypii and Aphis craccivora over time, using mathematical models composed of a set of differential equations as a helpful analytical tool to understand the population dynamics of aphids in arrangements of cotton and cowpea. The treatments were sole cotton, sole cowpea, and three arrangements of cotton intercropped with cowpea (t1, t2 and t3). The plants were infested with two aphid species and were evaluated at 7, 14, 28, 35, 42, and 49 days after the infestations. Mathematical models were used to fit the population dynamics of the two aphid species, and they provided good fits to the observed aphid dynamics over time. The highest population peak of both species A. gossypii and A. craccivora was found in the sole crops, and the lowest population peak was found in crop system t2. These results are important for integrated management programs of aphids in cotton and cowpea.
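Aphid models of the kind used in this study are built from differential equations; the simplest single-species building block is logistic growth, dN/dt = rN(1 − N/K). A minimal sketch integrating that equation with an explicit Euler scheme over the study's 49-day window (a deliberate simplification of the paper's multi-equation system; the parameters are illustrative, not fitted values):

```python
def simulate_logistic(n0, r, k, days, steps_per_day=10):
    """Explicit-Euler integration of dN/dt = r*N*(1 - N/K).

    Returns the population at the end of each day, keyed by day number.
    All parameter values used here are hypothetical.
    """
    dt = 1.0 / steps_per_day
    n = float(n0)
    series = {0: n}
    for day in range(1, days + 1):
        for _ in range(steps_per_day):
            n += dt * r * n * (1.0 - n / k)
        series[day] = n
    return series

# Hypothetical aphid cohort: 10 founders, intrinsic rate 0.35 per day,
# carrying capacity 500 individuals per plant, tracked for 49 days
pop = simulate_logistic(n0=10, r=0.35, k=500, days=49)
```

Fitting such a model to the field counts at the sampled days would then reduce to estimating r and K (and, for two interacting species, the coupling terms) that minimize the discrepancy between simulated and observed densities.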
A Decision Analysis Tool for Climate Impacts, Adaptations, and Vulnerabilities
DOE Office of Scientific and Technical Information (OSTI.GOV)
Omitaomu, Olufemi A; Parish, Esther S; Nugent, Philip J
Climate change related extreme events (such as flooding, storms, and drought) are already impacting millions of people globally at a cost of billions of dollars annually. Hence, there are urgent needs for urban areas to develop adaptation strategies that will alleviate the impacts of these extreme events. However, lack of appropriate decision support tools that match local applications is limiting local planning efforts. In this paper, we present a quantitative analysis and optimization system with customized decision support modules built on geographic information system (GIS) platform to bridge this gap. This platform is called Urban Climate Adaptation Tool (Urban-CAT). For all Urban-CAT models, we divide a city into a grid with tens of thousands of cells; then compute a list of metrics for each cell from the GIS data. These metrics are used as independent variables to predict climate impacts, compute vulnerability score, and evaluate adaptation options. Overall, the Urban-CAT system has three layers: data layer (that contains spatial data, socio-economic and environmental data, and analytic data), middle layer (that handles data processing, model management, and GIS operation), and application layer (that provides climate impacts forecast, adaptation optimization, and site evaluation). The Urban-CAT platform can guide city and county governments in identifying and planning for effective climate change adaptation strategies.
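The per-cell workflow described for Urban-CAT, computing metrics for each grid cell and combining them into a vulnerability score, can be sketched as a weighted sum of min-max-normalized metrics. The cell metrics, weights, and the "larger value = more vulnerable" convention below are all assumptions for illustration; the actual Urban-CAT scoring may differ:

```python
def normalize(values):
    # Min-max scale a list of metric values to [0, 1]
    lo, hi = min(values), max(values)
    return [0.0 if hi == lo else (v - lo) / (hi - lo) for v in values]

def vulnerability_scores(cells, weights):
    """Weighted sum of normalized metrics per grid cell.

    cells: {cell_id: {metric: value}}; weights: {metric: weight}.
    Assumed convention: larger metric value means more vulnerable.
    """
    ids = list(cells)
    cols = {m: normalize([cells[c][m] for c in ids]) for m in weights}
    return {cid: sum(weights[m] * cols[m][i] for m in weights)
            for i, cid in enumerate(ids)}

# Two hypothetical cells with flood-related metrics
cells = {"A": {"impervious": 0.8, "low_lying": 1.0},
         "B": {"impervious": 0.2, "low_lying": 0.2}}
scores = vulnerability_scores(cells, {"impervious": 0.6, "low_lying": 0.4})
```

In a real deployment the same normalized metrics would also feed the impact-prediction and adaptation-evaluation steps, so the normalization is computed once per metric across all cells.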
Bishai, David; Sherry, Melissa; Pereira, Claudia C; Chicumbe, Sergio; Mbofana, Francisco; Boore, Amy; Smith, Monica; Nhambi, Leonel; Borse, Nagesh N
2016-01-01
This study describes the development of a self-audit tool for public health and the associated methodology for implementing a district health system self-audit tool that can provide quantitative data on how district governments perceive their performance of the essential public health functions. Development began with a consensus-building process to engage Ministry of Health and provincial health officers in Mozambique and Botswana. We then worked with lists of relevant public health functions as determined by these stakeholders to adapt a self-audit tool describing essential public health functions to each country's health system. We then piloted the tool across districts in both countries and conducted interviews with district health personnel to determine health workers' perception of the usefulness of the approach. Country stakeholders were able to develop consensus around 11 essential public health functions that were relevant in each country. Pilots of the self-audit tool enabled the tool to be effectively shortened. Pilots also disclosed a tendency to upcode during self-audits that was checked by group deliberation. Convening sessions at the district enabled better attendance and representative deliberation. Instant feedback from the audit was a feature that 100% of pilot respondents found most useful. The development of metrics that provide feedback on public health performance can be used as an aid in the self-assessment of health system performance at the district level. Measurements of practice can open the door to future applications for practice improvement and research into the determinants and consequences of better public health practice. The current tool can be assessed for its usefulness to district health managers in improving their public health practice. The tool can also be used by the Ministry of Health or external donors in the African region for monitoring the district-level performance of the essential public health functions.
Bishai, David; Sherry, Melissa; Pereira, Claudia C.; Chicumbe, Sergio; Mbofana, Francisco; Boore, Amy; Smith, Monica; Nhambi, Leonel; Borse, Nagesh N.
2018-01-01
Introduction This study describes the development of a self-audit tool for public health and the associated methodology for implementing a district health system self-audit tool that can provide quantitative data on how district governments perceive their own performance of the essential public health functions. Methods Development began with a consensus-building process to engage Ministry of Health and provincial health officers in Mozambique and Botswana. We then worked with lists of relevant public health functions as determined by these stakeholders to adapt a self-audit tool describing essential public health functions to each country’s health system. We then piloted the tool across districts in both countries and conducted interviews with district health personnel to determine health workers’ perception of the usefulness of the approach. Results Country stakeholders were able to develop consensus around eleven essential public health functions that were relevant in each country. Pilots of the self-audit tool enabled the tool to be effectively shortened. Pilots also disclosed a tendency to upcode during self-audits that was checked by group deliberation. Convening sessions at the district enabled better attendance and representative deliberation. Instant feedback from the audit was a feature that 100% of pilot respondents found most useful. Conclusions The development of metrics that provide feedback on public health performance can be used as an aid in the self-assessment of health system performance at the district level. Measurements of practice can open the door to future applications for practice improvement and research into the determinants and consequences of better public health practice. The current tool can be assessed for its usefulness to district health managers in improving their public health practice.
The tool can also be used by ministry of health or external donors in the African region for monitoring the district level performance of the essential public health functions. PMID:27682727