Development and application of air quality models at the U.S. EPA
Overview of the development and application of air quality models at the U.S. EPA, particularly focused on the development and application of the Community Multiscale Air Quality (CMAQ) model developed within the Computational Exposure Division (CED) of the National Exposure Research Laboratory (NERL). This presentation will provide a simple overview of air quality model development and application geared toward a non-technical student audience. The National Exposure Research Laboratory (NERL) Computational Exposure Division (CED) develops and evaluates data, decision-support tools, and models to be applied to media-specific or receptor-specific problem areas. CED uses modeling-based approaches to characterize exposures, evaluate fate and transport, and support environmental diagnostics/forensics with input from multiple data sources. It also develops media- and receptor-specific models, process models, and decision support tools for use both within and outside of EPA.
Villamor Ordozgoiti, Alberto; Delgado Hito, Pilar; Guix Comellas, Eva María; Fernandez Sanchez, Carlos Manuel; Garcia Hernandez, Milagros; Lluch Canut, Teresa
2016-01-01
The use of Information and Communications Technologies (ICT) in healthcare has increased the need to consider quality criteria through standardised processes. The aim of this study was to analyse the software quality evaluation models applicable to healthcare from the perspective of ICT purchasers. Through a systematic literature review with the keywords software, product, quality, evaluation and health, we selected and analysed 20 original research papers published from 2005-2016 in health science and technology databases. The results showed four main topics: non-ISO models, software quality evaluation models based on ISO/IEC standards, studies analysing software quality evaluation models, and studies analysing ISO standards for software quality evaluation. The models provide cost-efficiency criteria for specific software and improve usage outcomes. The ISO/IEC 25000 standard emerges as the most suitable for evaluating the quality of ICTs for healthcare use from the perspective of institutional acquisition.
Uncertainty, ensembles and air quality dispersion modeling: applications and challenges
NASA Astrophysics Data System (ADS)
Dabberdt, Walter F.; Miller, Erik
The past two decades have seen significant advances in mesoscale meteorological modeling research and applications, such as the development of sophisticated and now widely used advanced mesoscale prognostic models, large eddy simulation models, four-dimensional data assimilation, adjoint models, adaptive and targeted observational strategies, and ensemble and probabilistic forecasts. Some of these advances are now being applied to urban air quality modeling and applications. Looking forward, it is anticipated that the high-priority air quality issues for the near-to-intermediate future will likely include: (1) routine operational forecasting of adverse air quality episodes; (2) real-time high-level support to emergency response activities; and (3) quantification of model uncertainty. Special attention is focused here on the quantification of model uncertainty through the use of ensemble simulations. Application to emergency-response dispersion modeling is illustrated using an actual event that involved the accidental release of the toxic chemical oleum. Both surface footprints of mass concentration and the associated probability distributions at individual receptors are seen to provide valuable quantitative indicators of the range of expected concentrations and their associated uncertainty.
Innovations in projecting emissions for air quality modeling
Air quality modeling is used in setting air quality standards and in evaluating their costs and benefits. Historically, modeling applications have projected emissions and the resulting air quality only 5 to 10 years into the future. Recognition that the choice of air quality mana...
[Service quality in health care: the application of the results of marketing research].
Verheggen, F W; Harteloh, P P
1993-01-01
This paper deals with quality assurance in health care and its relation to quality assurance in trade and industry. We present the service quality model--a model of quality from marketing research--and discuss how it can be applied to health care. Traditional quality assurance appears to have serious flaws: it lacks a general theory of the sources of hazards in the complex process of patient care and tends to stagnate, for no real improvement takes place. Departing from this criticism, modern quality assurance in health care is marked by: defining quality in a preferential sense as "fitness for use"; the use of theories and models from trade and industry (process control); an emphasis on analyzing the process, instead of merely inspecting it; use of the Deming problem-solving technique (plan, do, check, act); and improvement of the process of care by altering the perceptions of the parties involved. We describe our experience applying this method at the University Hospital Maastricht, The Netherlands. The successful application of this model requires a favorable corporate culture and motivation of the health care workers. This model provides a useful framework to improve upon the traditional approach to quality assurance in health care.
Real-time video quality monitoring
NASA Astrophysics Data System (ADS)
Liu, Tao; Narvekar, Niranjan; Wang, Beibei; Ding, Ran; Zou, Dekun; Cash, Glenn; Bhagavathy, Sitaram; Bloom, Jeffrey
2011-12-01
The ITU-T Recommendation G.1070 is a standardized opinion model for video telephony applications that uses video bitrate, frame rate, and packet-loss rate to measure video quality. However, this model was originally designed as an offline quality planning tool; it cannot be directly used for quality monitoring, since the above three input parameters are not readily available within a network or at the decoder, and there is considerable room for improving the performance of this quality metric. In this article, we present a real-time video quality monitoring solution based on this Recommendation. We first propose a scheme to efficiently estimate the three parameters from video bitstreams, so that the model can be used as a real-time video quality monitoring tool. Furthermore, an enhanced algorithm based on the G.1070 model that provides more accurate quality prediction is proposed. Finally, to use this metric in real-world applications, we present an emerging application of real-time quality measurement to the management of transmitted videos, especially those delivered to mobile devices.
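The parameter-estimation step described in this abstract lends itself to a compact illustration. The sketch below is not the Recommendation's calibrated model: it derives the three G.1070 inputs from lightweight packet records and maps them to a 1-5 score with a G.1070-like shape. The PacketRecord fields and all coefficients (sat_kbps, fr_exp, loss_robustness) are illustrative assumptions.

```python
# Illustrative sketch only (NOT the calibrated ITU-T G.1070 coefficients).
import math
from dataclasses import dataclass

@dataclass
class PacketRecord:
    size_bytes: int            # payload size of the packet
    is_frame_start: bool       # first packet of a coded video frame
    lost: bool                 # flagged as lost (e.g., from a sequence-number gap)

def estimate_parameters(packets, window_s: float):
    """Estimate bitrate (kbit/s), frame rate (fps) and packet-loss rate (%) over one window."""
    received = [p for p in packets if not p.lost]
    bitrate_kbps = sum(p.size_bytes * 8 for p in received) / window_s / 1000.0
    frame_rate = sum(1 for p in received if p.is_frame_start) / window_s
    loss_pct = 100.0 * sum(1 for p in packets if p.lost) / max(len(packets), 1)
    return bitrate_kbps, frame_rate, loss_pct

def quality_score(bitrate_kbps, frame_rate, loss_pct,
                  sat_kbps=500.0, fr_exp=0.1, loss_robustness=2.0):
    """Map the three parameters to a 1-5 score with a G.1070-like shape (toy coefficients)."""
    # Coding quality saturates with bitrate and is penalised for low frame rates.
    i_coding = 4.0 * (1.0 - math.exp(-bitrate_kbps / sat_kbps)) \
               * min(frame_rate / 25.0, 1.0) ** fr_exp
    # Packet loss degrades quality exponentially, mirroring the Recommendation's structure.
    vq = 1.0 + i_coding * math.exp(-loss_pct / loss_robustness)
    return max(1.0, min(5.0, vq))
```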
Characterization of spatial variability of air pollutants in an urban setting at fine scales is critical for improved air toxics exposure assessments, for model evaluation studies and also for air quality regulatory applications. For this study, we investigate an approach that su...
Meteorological Processes Affecting Air Quality – Research and Model Development Needs
Meteorology modeling is an important component of air quality modeling systems that defines the physical and dynamical environment for atmospheric chemistry. The meteorology models used for air quality applications are based on numerical weather prediction models that were devel...
ERIC Educational Resources Information Center
Nworji, Alexander O.
2013-01-01
Most organizations spend millions of dollars due to the impact of improperly implemented database application systems as evidenced by poor data quality problems. The purpose of this quantitative study was to use, and extend, the technology acceptance model (TAM) to assess the impact of information quality and technical quality factors on database…
Measuring the Perceived Quality of an AR-Based Learning Application: A Multidimensional Model
ERIC Educational Resources Information Center
Pribeanu, Costin; Balog, Alexandru; Iordache, Dragos Daniel
2017-01-01
Augmented reality (AR) technologies could enhance learning in several ways. The quality of an AR-based educational platform is a combination of key features that manifests in usability, usefulness, and enjoyment for the learner. In this paper, we present a multidimensional model to measure the quality of an AR-based application as perceived by…
ERIC Educational Resources Information Center
Saavedra, Pedro; Kuchak, JoAnn
An error-prone model (EPM) to predict financial aid applicants who are likely to misreport on Basic Educational Opportunity Grant (BEOG) applications was developed, based on interviews conducted with a quality control sample of 1,791 students during 1978-1979. The model was designed to identify corrective methods appropriate for different types of…
[Review on HSPF model for simulation of hydrology and water quality processes].
Li, Zhao-fu; Liu, Hong-Yu; Li, Yan
2012-07-01
Hydrological Simulation Program-FORTRAN (HSPF), written in FORTRAN, is one of the best semi-distributed hydrology and water quality models; it was first developed based on the Stanford Watershed Model, and many studies on HSPF model application have been conducted. It can represent the contributions of sediment, nutrients, pesticides, conservatives and fecal coliforms from agricultural areas, continuously simulate water quantity and quality processes, as well as the effects of climate change and land use change on water quantity and quality. HSPF consists of three basic application components: PERLND (Pervious Land Segment), IMPLND (Impervious Land Segment), and RCHRES (free-flowing reach or mixed reservoirs). In general, HSPF has been extensively applied in the modeling of hydrology and water quality processes and in the analysis of climate change and land use change; however, it has seen limited use in China. The main problems with HSPF include: (1) some algorithms and procedures still need to be revised, (2) because of the demanding input data requirements, the accuracy of the model is limited by the available spatial and attribute data, (3) the model is only applicable to the simulation of well-mixed rivers, reservoirs and one-dimensional water bodies, so it must be integrated with other models to solve more complex problems. At present, studies on HSPF model development are still ongoing, such as revision of the model platform, extension of model functions, method development for model calibration, and analysis of parameter sensitivity. With the accumulation of basic data and the improvement of data sharing, the HSPF model will be applied more extensively in China.
U.S. EPA MODELS-3/CMAQ - STATUS AND APPLICATIONS
An advanced third-generation air quality modeling system has been developed by the Atmospheric Modeling Division of the U.S. EPA. The air quality simulation model at the heart of the system is known as the Community Multiscale Air Quality (CMAQ) Model. It is comprehensive in ...
THE EMERGENCE OF NUMERICAL AIR QUALITY FORECASTING MODELS AND THEIR APPLICATION
In recent years the U.S. and other nations have begun programs for short-term local through regional air quality forecasting based upon numerical three-dimensional air quality grid models. These numerical air quality forecast (NAQF) models and systems have been developed and test...
[Watershed water environment pollution models and their applications: a review].
Zhu, Yao; Liang, Zhi-Wei; Li, Wei; Yang, Yi; Yang, Mu-Yi; Mao, Wei; Xu, Han-Li; Wu, Wei-Xiang
2013-10-01
Watershed water environment pollution models are important tools for studying watershed environmental problems. Through quantitative description of the complicated pollution processes of the whole watershed system and its parts, such models can identify the main sources and migration pathways of pollutants, estimate pollutant loadings, and evaluate their impacts on the water environment, providing a basis for watershed planning and management. This paper reviewed the watershed water environment models widely applied in China and abroad, with a focus on models of pollutant loading (GWLF and PLOAD), water quality of receiving water bodies (QUAL2E and WASP), and watershed models integrating pollutant loadings and water quality (HSPF, SWAT, AGNPS, AnnAGNPS, and SWMM), and introduced the structures, principles, and main characteristics of these models as well as their limitations in practical applications. Other water quality models (CE-QUAL-W2, EFDC, and AQUATOX) and watershed models (GLEAMS and MIKE SHE) were also briefly introduced. Through case analysis of the applications of single and integrated models, the development trend and application prospects of watershed water environment pollution models were discussed.
Land Surface Process and Air Quality Research and Applications at MSFC
NASA Technical Reports Server (NTRS)
Quattrochi, Dale; Khan, Maudood
2007-01-01
This viewgraph presentation provides an overview of land surface process and air quality research at MSFC, including atmospheric modeling, public health applications, and ongoing research whose objective is to undertake a comprehensive spatiotemporal analysis of the effects of accurate land surface characterization on atmospheric modeling results. Land use maps, 10 meter air temperature, surface wind, PBL mean difference heights, NOx, ozone, and O3+NO2 plots, as well as spatial growth model outputs, are included. Emissions and general air quality modeling are also discussed.
A YEAR-LONG MM5 EVALUATION USING A MODEL EVALUATION TOOLKIT
Air quality modeling has expanded in both sophistication and application over the past decade. Meteorological and air quality modeling tools are being used for research, forecasting, and regulatory related emission control strategies. Results from air quality simulations have far...
ERIC Educational Resources Information Center
Saavedra, Pedro; And Others
Parameters and procedures for developing an error-prone model (EPM) to predict financial aid applicants who are likely to misreport on Basic Educational Opportunity Grant (BEOG) applications are introduced. Specifications to adapt these general parameters to secondary data analysis of the Validation, Edits, and Applications Processing Systems…
Application of an IRT Polytomous Model for Measuring Health Related Quality of Life
ERIC Educational Resources Information Center
Tejada, Antonio J. Rojas; Rojas, Oscar M. Lozano
2005-01-01
Background: The Item Response Theory (IRT) has advantages for measuring Health Related Quality of Life (HRQOL) as opposed to the Classical Tests Theory (CTT). Objectives: To present the results of the application of a polytomous model based on IRT, specifically, the Rating Scale Model (RSM), to measure HRQOL with the EORTC QLQ-C30. Methods: 103…
Application and evaluation of high-resolution WRF-CMAQ with simple urban parameterization.
The 2-way coupled WRF-CMAQ meteorology and air quality modeling system is evaluated for high-resolution applications by comparing to a regional air quality field study (Discover-AQ). The model was modified to better account for the effects of urban environments. High-resolution...
NASA Astrophysics Data System (ADS)
Comyn-Wattiau, Isabelle; Thalheim, Bernhard
Quality assurance is a growing research domain within the Information Systems (IS) and Conceptual Modeling (CM) disciplines. Ongoing research on quality in IS and CM is highly diverse and encompasses theoretical aspects, including quality definitions and quality models, and practical/empirical aspects such as the development of methods, approaches and tools for quality measurement and improvement. Current research on quality also includes quality characteristics definitions, validation instruments, methodological and development approaches to quality assurance during software and information systems development, quality monitors, quality assurance during information systems development processes and practices, quality assurance both for data and (meta)schemata, quality support for information systems data import and export, quality of query answering, and cost/benefit analysis of quality assurance processes. Quality assurance also depends on the application area and the specific requirements of applications in, for example, the health sector, logistics, the public sector, the financial sector, manufacturing, services, e-commerce, and software. Furthermore, quality assurance must also be supported for data aggregation, ETL processes, web content management and other multi-layered applications. Quality assurance typically requires resources and therefore entails, besides its benefits, a computational and economic trade-off; it is thus also based on a compromise between the value of quality data and the cost of quality assurance.
Currently used dispersion models, such as the AMS/EPA Regulatory Model (AERMOD), process routinely available meteorological observations to construct model inputs. Thus, model estimates of concentrations depend on the availability and quality of meteorological observations, as we...
Modeling Applications and Tools
The U.S. EPA's Air Quality Modeling Group (AQMG) conducts modeling analyses to support policy and regulatory decisions in OAR and provides leadership and direction on the full range of air quality models and other mathematical simulation techniques used in
Atmospheric Boundary Layer Modeling for Combined Meteorology and Air Quality Systems
Atmospheric Eulerian grid models for mesoscale and larger applications require sub-grid models for turbulent vertical exchange processes, particularly within the Planetary Boundary Layer (PBL). In combined meteorology and air quality modeling systems consistent PBL modeling of wi...
ERIC Educational Resources Information Center
Downey, Thomas E.
Continuous quality improvement (CQI) models, which were first applied in business, are critical to making new technology-based learning paradigms and flexible learning environments a reality. The following are among the factors that have facilitated CQI's application in education: increased operating costs; increased competition from private…
40 CFR 93.159 - Procedures for conformity determinations of general Federal actions.
Code of Federal Regulations, 2010 CFR
2010-07-01
... based on the applicable air quality models, data bases, and other requirements specified in the most recent version of the... data are available, such as actual stack test data from stationary sources which are part of the...
Russell, Armistead G
2008-02-01
One objective of the U.S. Environmental Protection Agency's (EPA's) Supersite Program was to provide data that could be used to more thoroughly evaluate and improve air quality models, and then have those models used to address both scientific and policy-related issues dealing with air quality management. In this direction, modeling studies have used Supersites-related data and are reviewed here. Fine temporal resolution data have been used both to test model components (e.g., the inorganic thermodynamic routines) and air quality modeling systems (in particular, Community Multiscale Air Quality [CMAQ] and Comprehensive Air Quality Model with extensions [CAMx] applications). Such evaluations suggest that the inorganic thermodynamic approaches being used are accurate, as well as the description of sulfate production, although there are significant uncertainties in production of nitric acid, biogenic and ammonia emissions, secondary organic aerosol formation, and the ability to follow the formation and evolution of ultrafine particles. Model applications have investigated how PM levels will respond to various emissions controls, suggesting that nitrate will replace some of the reductions in sulfate particulate matter (PM), although the replacement is small in the summer. Although not part of the Supersite program, modeling being conducted by EPA, regional planning organizations, and states for policy purposes has benefited from the detailed data collected, and the PM models have advanced by their more widespread use.
Four-dimensional data assimilation applied to photochemical air quality modeling is used to suggest adjustments to the emissions inventory of the Atlanta, Georgia metropolitan area. In this approach, a three-dimensional air quality model, coupled with direct sensitivity analys...
The U.S. Environmental Protection Agency (U.S. EPA) is extending its Models-3/Community Multiscale Air Quality (CMAQ) Modeling System to provide detailed gridded air quality concentration fields and sub-grid variability characterization at neighborhood scales and in urban areas...
The community multiscale air quality (CMAQ) model of the U.S. Environmental Protection Agency is one of the most widely used air quality models worldwide; it is employed for both research and regulatory applications at major universities and government agencies for improving under...
AIR QUALITY MODELING OF AMMONIA: A REGIONAL MODELING PERSPECTIVE
The talk will address the status of modeling of ammonia from a regional modeling perspective, yet the observations and comments should have general applicability. The air quality modeling system components that are central to modeling ammonia will be noted and a perspective on ...
Application-Driven No-Reference Quality Assessment for Dermoscopy Images With Multiple Distortions.
Xie, Fengying; Lu, Yanan; Bovik, Alan C; Jiang, Zhiguo; Meng, Rusong
2016-06-01
Dermoscopy images often suffer from blur and uneven illumination distortions that occur during acquisition, which can adversely influence consequent automatic image analysis results on potential lesion objects. The purpose of this paper is to deploy an algorithm that can automatically assess the quality of dermoscopy images. Such an algorithm could be used to direct image recapture or correction. We describe an application-driven no-reference image quality assessment (IQA) model for dermoscopy images affected by possibly multiple distortions. For this purpose, we created a multiple distortion dataset of dermoscopy images impaired by varying degrees of blur and uneven illumination. The basis of this model is two single distortion IQA metrics that are sensitive to blur and uneven illumination, respectively. The outputs of these two metrics are combined to predict the quality of multiply distorted dermoscopy images using a fuzzy neural network. Unlike traditional IQA algorithms, which use human subjective score as ground truth, here ground truth is driven by the application, and generated according to the degree of influence of the distortions on lesion analysis. The experimental results reveal that the proposed model delivers accurate and stable quality prediction results for dermoscopy images impaired by multiple distortions. The proposed model is effective for quality assessment of multiple distorted dermoscopy images. An application-driven concept for IQA is introduced, and at the same time, a solution framework for the IQA of multiple distortions is proposed.
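As a rough illustration of the pipeline shape described above (two single-distortion metrics fused into one score), the following sketch computes a blur proxy and an illumination-unevenness proxy and fuses them with a simple weighted logistic function standing in for the paper's fuzzy neural network; the weights and scaling constants are assumptions, not trained values.

```python
# Hedged sketch of a two-metric fusion; not the authors' trained model.
import numpy as np
from scipy.ndimage import laplace

def blur_score(gray: np.ndarray) -> float:
    """Higher = sharper. Variance of the Laplacian is a common no-reference blur proxy."""
    return float(laplace(gray.astype(float)).var())

def illumination_score(gray: np.ndarray, blocks: int = 8) -> float:
    """Higher = more even. Negated spread of block-wise mean brightness across the image."""
    h, w = gray.shape
    bh, bw = h // blocks, w // blocks
    means = [gray[i*bh:(i+1)*bh, j*bw:(j+1)*bw].mean()
             for i in range(blocks) for j in range(blocks)]
    return float(-np.std(means))

def fused_quality(gray: np.ndarray, w_blur=0.6, w_illum=0.4) -> float:
    """Map the two scores to (0, 1); weights and scales are illustrative placeholders."""
    z = w_blur * np.tanh(blur_score(gray) / 100.0) \
        + w_illum * np.tanh(illumination_score(gray) / 20.0)
    return float(1.0 / (1.0 + np.exp(-z)))
```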
Measuring and modeling of radiofrequency dielectric properties of chicken breast meat
USDA-ARS's Scientific Manuscript database
Dielectric properties of chicken breast meat are important for both dielectric heating and quality sensing applications. In heating applications they allow optimization of energy transfer and uniformity of heating. In sensing applications, they can be used to predict quality attributes of the chicke...
Regional-scale air quality models are being used to demonstrate attainment of the ozone air quality standard. In current regulatory applications, a regional-scale air quality model is applied for a base year and a future year with reduced emissions using the same meteorological ...
Gu, Zhi-rong; Wang, Ya-li; Sun, Yu-jing; Dind, Jun-xia
2014-09-01
To investigate the establishment and application of an entropy-weight TOPSIS model for the synthetical quality evaluation of traditional Chinese medicine, with Angelica sinensis growing in Gansu Province as an example. The contents of ferulic acid, 3-butylphthalide, Z-butylidenephthalide, Z-ligustilide, linolic acid, volatile oil, and ethanol-soluble extractive were used as the evaluation index set. The weights of each evaluation index were determined by the information entropy method. The entropy-weight TOPSIS model was then established to synthetically evaluate the quality of Angelica sinensis growing in Gansu Province by the Euclid closeness degree. The results based on the established model were in line with the daodi meaning and the knowledge of clinical experience. The established model is simple in calculation, objective, and reliable, and can be applied to the synthetical quality evaluation of traditional Chinese medicine.
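A minimal numerical sketch of the procedure described in this abstract (entropy weights followed by a TOPSIS ranking on the Euclid closeness degree); the sample matrix below is random placeholder data, not the paper's measured contents.

```python
# Entropy-weight TOPSIS sketch: rows = samples, columns = benefit-type quality indices.
import numpy as np

def entropy_weights(X: np.ndarray) -> np.ndarray:
    """X: (n_samples, n_criteria) matrix of benefit criteria (larger is better)."""
    P = X / X.sum(axis=0)                          # column-wise proportions
    P = np.where(P == 0, 1e-12, P)                 # avoid log(0)
    n = X.shape[0]
    e = -(P * np.log(P)).sum(axis=0) / np.log(n)   # entropy of each criterion
    d = 1.0 - e                                    # degree of diversification
    return d / d.sum()

def topsis_closeness(X: np.ndarray) -> np.ndarray:
    w = entropy_weights(X)
    V = w * X / np.linalg.norm(X, axis=0)          # weighted, vector-normalised matrix
    ideal, anti = V.max(axis=0), V.min(axis=0)
    d_pos = np.linalg.norm(V - ideal, axis=1)
    d_neg = np.linalg.norm(V - anti, axis=1)
    return d_neg / (d_pos + d_neg)                 # Euclid closeness degree, higher = better

# Example with made-up data: 4 samples x 7 indices (ferulic acid, ..., extractive).
X = np.random.default_rng(0).uniform(0.5, 2.0, size=(4, 7))
print(np.argsort(-topsis_closeness(X)))            # ranking of the samples
```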
Development and Application of New Quality Model for Software Projects
Karnavel, K.; Dillibabu, R.
2014-01-01
The IT industry tries to employ a number of models to identify the defects in the construction of software projects. In this paper, we present COQUALMO and its limitations and aim to increase the quality without increasing the cost and time. The computation time, cost, and effort to predict the residual defects are very high; this was overcome by developing an appropriate new quality model named the software testing defect corrective model (STDCM). The STDCM was used to estimate the number of remaining residual defects in the software product; a few assumptions and the detailed steps of the STDCM are highlighted. The application of the STDCM is explored in software projects. The implementation of the model is validated using statistical inference, which shows there is a significant improvement in the quality of the software projects. PMID:25478594
EMISSION AND SURFACE EXCHANGE PROCESS
This task supports the development, evaluation, and application of emission and dry deposition algorithms in air quality simulation models, such as the Models-3/Community Multiscale Air Quality (CMAQ) modeling system. Emission estimates influence greatly the accuracy of air qual...
Recent Advances in WRF Modeling for Air Quality Applications
The USEPA uses WRF in conjunction with the Community Multiscale Air Quality (CMAQ) for air quality regulation and research. Over the years we have added physics options and geophysical datasets to the WRF system to enhance model capabilities especially for extended retrospective...
40 CFR 52.60 - Significant deterioration of air quality.
Code of Federal Regulations, 2014 CFR
2014-07-01
... Significant deterioration of air quality. (a) All applications and other information required pursuant to § 52.21 from... "Guideline on Air Quality Models (Revised)" or other models approved by EPA. [42 FR 22869, May 5, 1977, as...
A systematic literature review of open source software quality assessment models.
Adewumi, Adewole; Misra, Sanjay; Omoregbe, Nicholas; Crawford, Broderick; Soto, Ricardo
2016-01-01
Many open source software (OSS) quality assessment models are proposed and available in the literature. However, there is little or no adoption of these models in practice. In order to guide the formulation of newer models so they can be acceptable to practitioners, there is a need for clear discrimination of the existing models based on their specific properties. The aim of this study is therefore to perform a systematic literature review investigating the properties of existing OSS quality assessment models by classifying them with respect to their quality characteristics, the methodology they use for assessment, and their domain of application, so as to guide the formulation and development of newer models. Searches in IEEE Xplore, ACM, Science Direct, Springer and Google Search were performed to retrieve all relevant primary studies. Journal and conference papers between 2003 and 2015 were considered, since the first known OSS quality model emerged in 2003. A total of 19 OSS quality assessment model papers were selected; to select these models we developed assessment criteria to evaluate the quality of the existing studies. Quality assessment models are classified into five categories based on the quality characteristics they possess, namely: single-attribute, rounded category, community-only attribute, non-community attribute, and non-quality-in-use models. Our study reflects that software selection based on hierarchical structures is the most popular selection method in the existing OSS quality assessment models. Furthermore, we found that the majority (47%) of the existing models do not specify any domain of application. In conclusion, our study is a valuable contribution to the community and helps quality assessment model developers in formulating newer models, and practitioners (software evaluators) in selecting suitable OSS from among alternatives.
APPLICATION OF BIAS AND ADJUSTMENT TECHNIQUES TO THE ETA-CMAQ AIR QUALITY FORECAST
The current air quality forecast system, based on linking NOAA's Eta meteorological model with EPA's Community Multiscale Air Quality (CMAQ) model, consistently overpredicts surface ozone concentrations, but simulates its day-to-day variability quite well. The ability of bias cor...
NASA Astrophysics Data System (ADS)
Memmesheimer, M.; Friese, E.; Jakobs, H. J.; Feldmann, H.; Ebel, A.; Kerschgens, M. J.
During recent years the interest in long-term applications of air pollution modeling systems (AQMS) has strongly increased. Most of these models were developed over the last decade for application to photo-oxidant episodes. In this contribution a long-term application of the EURAD modeling system to the year 1997 is presented. Atmospheric particles are included using the Modal Aerosol Dynamics Model for Europe (MADE). Meteorological fields are simulated by the mesoscale meteorological model MM5, and gas-phase chemistry has been treated with the RACM mechanism. The nesting option is used to zoom in on areas of specific interest. Horizontal grid sizes are 125 km for the regional scale and 5 km for the local scale covering the area of North-Rhine-Westfalia (NRW). The results have been compared to observations from the air quality network of the environmental agency of NRW for the year 1997 and evaluated using the data quality objectives of EU directive 99/30. Further improvement for the application of regional-scale air quality models is needed with respect to emission data bases, coupling to global models to improve boundary values, interaction between aerosols and clouds, and multiphase modeling.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Moss, M.T.; Segal, H.M.
1994-06-01
A new complex source microcomputer model has been developed for use at civil airports and Air Force bases. This paper describes both the key features of this model and its application in evaluating the air quality impact of new construction projects at three airports: one in the United States and two in Canada. The single EDMS model replaces the numerous models previously required to assess the air quality impact of pollution sources at airports. EDMS also employs a commercial data base to reduce the time and manpower required to accurately assess and document the air quality impact of airfield operations. On July 20, 1993, the U.S. Environmental Protection Agency (EPA) issued the final rule (Federal Register, 7/20/93, page 38816) to add new models to the Guideline on Air Quality Models. At that time EDMS was incorporated into the Guideline as an Appendix A model. 12 refs., 4 figs., 1 tab.
USDA-ARS's Scientific Manuscript database
Information to support application of hydrologic and water quality (H/WQ) models abounds, yet modelers commonly use arbitrary, ad hoc methods to conduct, document, and report model calibration, validation, and evaluation. Consistent methods are needed to improve model calibration, validation, and e...
Regional air quality models are being used in a policy-setting to estimate the response of air pollutant concentrations to changes in emissions and meteorology. Dynamic evaluation entails examination of a retrospective case(s) to assess whether an air quality model has properly p...
A parsimonious dynamic model for river water quality assessment.
Mannina, Giorgio; Viviani, Gaspare
2010-01-01
Water quality modelling is of crucial importance for the assessment of physical, chemical, and biological changes in water bodies. Mathematical approaches to water modelling have become more prevalent over recent years. Different model types, ranging from detailed physical models to simplified conceptual models, are available. A possible middle ground between detailed and simplified models is the parsimonious model, which represents the simplest approach that fits the application. The appropriate modelling approach depends on the research goal as well as on the data available for correct model application. When data are inadequate, it is necessary to focus on a simple river water quality model rather than a detailed one. This study presents a parsimonious river water quality model to evaluate the propagation of pollutants in natural rivers. The model is made up of two sub-models: a quantity one and a quality one. The model employs a river schematisation that considers different stretches according to the geometric characteristics and to the gradient of the river bed. Each stretch is represented with a conceptual model of a series of linear channels and reservoirs: the channels determine the delay in the pollution wave and the reservoirs cause its dispersion. To assess river water quality, the model employs four state variables: DO, BOD, NH(4), and NO. The model was applied to the Savena River (Italy), which is the focus of a European-financed project in which quantity and quality data were gathered. A sensitivity analysis of the model output to the model inputs and parameters was performed based on the Generalised Likelihood Uncertainty Estimation methodology. The results demonstrate the suitability of such a model as a tool for river water quality management.
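The "series of linear channels and reservoirs" structure is simple enough to sketch. The code below is an illustrative toy along those lines, not the authors' model: a cascade of linear reservoirs routes the flow, and a first-order BOD/DO balance stands in for the quality sub-model; all rate constants are placeholder values.

```python
# Toy routing + quality sketch under assumed constants; not the published model.
import numpy as np

def route_linear_reservoirs(inflow, k, n_reservoirs, dt=1.0):
    """Route an inflow hydrograph (m3/s) through n identical linear reservoirs (S = k*Q)."""
    q = np.array(inflow, dtype=float)
    for _ in range(n_reservoirs):
        out = np.zeros_like(q)
        storage = 0.0
        for t in range(len(q)):
            storage += (q[t] - storage / k) * dt   # explicit Euler of dS/dt = Qin - S/k
            out[t] = storage / k                   # reservoir outflow
        q = out                                    # feed the next reservoir in the series
    return q

def bod_do_decay(bod0, do0, k_bod=0.3, k_rea=0.4, do_sat=9.0, hours=48, dt=1.0):
    """First-order BOD decay with simple reaeration toward saturation (Streeter-Phelps-like)."""
    bod, do, series = bod0, do0, []
    for _ in range(int(hours / dt)):
        decay = k_bod * bod * (dt / 24.0)                    # mg/L of BOD oxidised this step
        bod -= decay
        do += k_rea * (do_sat - do) * (dt / 24.0) - decay    # reaeration minus oxygen demand
        series.append((bod, do))
    return series
```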
The Community Multiscale Air Quality (CMAQ) modeling system is a "one atmosphere" chemical transport model that simulates the transport and fate of air pollutants from urban to continental scales and from daily to annual time intervals.
DEVELOPMENT OF A LAND-SURFACE MODEL PART I: APPLICATION IN A MESOSCALE METEOROLOGY MODEL
Parameterization of land-surface processes and consideration of surface inhomogeneities are very important to mesoscale meteorological modeling applications, especially those that provide information for air quality modeling. To provide crucial, reliable information on the diurn...
Air Pollution Data for Model Evaluation and Application
One objective of designing an air pollution monitoring network is to obtain data for evaluating air quality models that are used in the air quality management process and scientific discovery.1,2 A common use is to relate emissions to air quality, including assessing ...
Design guidelines for an umbilical cord blood stem cell therapy quality assessment model
NASA Astrophysics Data System (ADS)
Januszewski, Witold S.; Michałek, Krzysztof; Yagensky, Oleksandr; Wardzińska, Marta
The paper lists the pivotal guidelines for producing an empirical umbilical cord blood stem cell therapy quality assessment model. The methodology adopted was a single-equation linear model with domain knowledge derived from the MEDAFAR classification. The resulting model is ready for therapeutic application.
Presentation slides provide background on model evaluation techniques. Also included in the presentation is an operational evaluation of 2001 Community Multiscale Air Quality (CMAQ) annual simulation, and an evaluation of PM2.5 for the CMAQ air quality forecast (AQF) ...
Modeling the Effects of Conservation Tillage on Water Quality at the Field Scale
USDA-ARS's Scientific Manuscript database
The development and application of predictive tools to quantitatively assess the effects of tillage and related management activities should be carefully tested against high quality field data. This study reports on: 1) the calibration and validation of the Root Zone Water Quality Model (RZWQM) to a...
Like most air quality modeling systems, CMAQ divides the treatment of meteorological and chemical/transport processes into separate models run sequentially. A potential drawback to this approach is that it creates the illusion that these processes are minimally interdependent an...
USDA-ARS's Scientific Manuscript database
Previous publications have outlined recommended practices for hydrologic and water quality (H/WQ) modeling, but none have formulated comprehensive guidelines for the final stage of modeling applications, namely evaluation, interpretation, and communication of model results and the consideration of t...
A next generation air quality modeling system is being developed at the U.S. EPA to enable modeling of air quality from global to regional to (eventually) local scales. We envision that the system will have three configurations: 1. Global meteorology with seamless mesh refinemen...
A real-time air quality forecasting system (Eta-CMAQ model suite) has been developed by linking the NCEP Eta model to the U.S. EPA CMAQ model. This work presents results from the application of the Eta-CMAQ modeling system for forecasting O3 over the northeastern U.S d...
Persistence of initial conditions in continental scale air quality simulations
NASA Astrophysics Data System (ADS)
Hogrefe, Christian; Roselle, Shawn J.; Bash, Jesse O.
2017-07-01
This study investigates the effect of initial conditions (IC) for pollutant concentrations in the atmosphere and soil on simulated air quality for two continental-scale Community Multiscale Air Quality (CMAQ) model applications. One of these applications was performed for springtime and the second for summertime. Results show that a spin-up period of ten days commonly used in regional-scale applications may not be sufficient to reduce the effects of initial conditions to less than 1% of seasonally-averaged surface ozone concentrations everywhere, while 20 days were found to be sufficient for the entire domain for the spring case and almost the entire domain for the summer case. For the summer case, differences were found to persist longer aloft due to circulation of air masses, and even a spin-up period of 30 days was not sufficient to reduce the effects of ICs to less than 1% of seasonally-averaged layer 34 ozone concentrations over the southwestern portion of the modeling domain. Analysis of the effect of soil initial conditions for the CMAQ bidirectional NH3 exchange model shows that during springtime they can have an important effect on simulated inorganic aerosol concentrations for time periods of one month or longer. The effects are less pronounced during other seasons. The results, while specific to the modeling domain and time periods simulated here, suggest that modeling protocols need to be scrutinized for a given application and that it cannot be assumed that commonly-used spin-up periods are necessarily sufficient to reduce the effects of initial conditions on model results to an acceptable level. What constitutes an acceptable level of difference cannot be generalized and will depend on the particular application, time period and species of interest. Moreover, as the application of air quality models is being expanded to cover larger geographical domains and as these models are increasingly being coupled with other modeling systems to better represent air-surface-water exchanges, the effects of model initialization in such applications need to be studied in future work.
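A diagnostic of the kind described here is easy to express in code. The sketch below assumes that hourly surface-ozone arrays from two runs differing only in their initial conditions are already in memory (shape: time x lat x lon, an assumption for illustration rather than CMAQ I/O code) and reports the worst-case percent difference of the seasonal mean as a function of discarded spin-up days.

```python
# Hedged sketch of an initial-condition persistence diagnostic.
import numpy as np

def ic_influence(o3_run_a: np.ndarray, o3_run_b: np.ndarray, spinup_days, hours_per_day=24):
    """Return {spin-up days: max % difference of the seasonal-mean surface ozone field}."""
    results = {}
    for days in spinup_days:
        start = days * hours_per_day
        mean_a = o3_run_a[start:].mean(axis=0)               # seasonal mean after spin-up, run A
        mean_b = o3_run_b[start:].mean(axis=0)               # seasonal mean after spin-up, run B
        rel_diff = 100.0 * np.abs(mean_a - mean_b) / np.maximum(mean_b, 1e-9)
        results[days] = float(rel_diff.max())                # worst grid cell in the domain
    return results

# e.g. ic_influence(o3_a, o3_b, spinup_days=[0, 10, 20, 30]) -> check when the max diff < 1 %
```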
Sensor-Based Optimization Model for Air Quality Improvement in Home IoT.
Kim, Jonghyuk; Hwangbo, Hyunwoo
2018-03-23
We introduce current home Internet of Things (IoT) technology and present research on its various forms and applications in real life. In addition, we describe IoT marketing strategies as well as specific modeling techniques for improving air quality, a key home IoT service. To this end, we summarize the latest research on sensor-based home IoT, studies on indoor air quality, and technical studies on random data generation. In addition, we develop an air quality improvement model that can be readily applied to the market by acquiring initial analytical data and building infrastructures using spectrum/density analysis and the natural cubic spline method. Accordingly, we generate related data based on user behavioral values. We integrate the logic into the existing home IoT system to enable users to easily access the system through the Web or mobile applications. We expect that the present introduction of a practical marketing application method will contribute to enhancing the expansion of the home IoT market.
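As a small illustration of the natural-cubic-spline step mentioned above (the sensor values and timestamps are made up, and nothing else about the authors' pipeline is assumed):

```python
# Densify sparse indoor air-quality readings with a natural cubic spline (illustrative data).
import numpy as np
from scipy.interpolate import CubicSpline

t_obs = np.array([0, 7, 18, 31, 44, 60], dtype=float)        # minutes since start (sparse)
pm25_obs = np.array([12.0, 15.5, 22.0, 19.0, 14.0, 13.0])    # sensor readings (ug/m3)

spline = CubicSpline(t_obs, pm25_obs, bc_type='natural')     # natural boundary conditions
t_grid = np.arange(0, 61, 1.0)                               # regular 1-minute grid
pm25_dense = spline(t_grid)                                  # densified series for the model
```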
Context-aware workflow management of mobile health applications.
Salden, Alfons; Poortinga, Remco
2006-01-01
We propose a medical application management architecture that allows medical (IT) experts to readily design, develop and deploy context-aware mobile health (m-health) applications or services. In particular, we elaborate on how our application workflow management architecture enables chaining, coordinating, composing, and adapting context-sensitive medical application components so that critical Quality of Service (QoS) and Quality of Context (QoC) requirements typical of m-health applications or services can be met. This functional architectural support requires learning modules for distilling application-critical selections of attention and anticipation models. These models help medical experts construct and adjust on-the-fly m-health application workflows and workflow strategies. We illustrate our context-aware workflow management paradigm for an m-health data delivery problem, in which optimal communication network configurations have to be determined.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Brinkman, J.J.; Griffioen, P.S.; Groot, S.
1987-03-01
The Netherlands have a rather complex water-management system consisting of a number of major rivers, canals, lakes and ditches. Water-quantity management on a regional scale is necessary for an effective water-quality policy. To support water management, a computer model was developed that includes both water quality and water quantity, based on three submodels: ABOPOL for the water movement, DELWAQ for the calculation of water quality variables and BLOOM-II for the phytoplankton growth. The northern province of Friesland was chosen as a test case for the integrated model to be developed, where water quality is highly related to the water distribution and the main trade-off is minimizing the intake of (eutrophicated) alien water in order to minimize external nutrient load and maximizing the intake in order to flush channels and lakes. The results of the application of these models to this and to a number of hypothetical future situations are described.
Application of the PRECEDE model to understanding mental health promoting behaviors in Hong Kong.
Mo, Phoenix K H; Mak, Winnie W S
2008-08-01
The burdens related to mental illness have been increasingly recognized in many countries. Nevertheless, research in positive mental health behaviors remains scarce. This study utilizes the Predisposing, Reinforcing, and Enabling Causes in Education Diagnosis and Evaluation (PRECEDE) model to identify factors associated with mental health promoting behaviors and to examine the effects of these behaviors on mental well-being and quality of life among 941 adults in Hong Kong. Structural equation modeling shows that sense of coherence (predisposing factor), social support (reinforcing factor), and daily hassles (enabling factor) are significantly related to mental health promoting behaviors, which are associated with mental well-being and quality of life. Results of bootstrap analyses confirm the mediating role of mental health promoting behaviors on well-being and quality of life. The study supports the application of the PRECEDE model in understanding mental health promoting behaviors and demonstrates its relationships with well-being and quality of life.
DOT National Transportation Integrated Search
1978-02-01
Ride-quality models for city buses and intercity trains are presented and discussed in terms of their ability to predict passenger comfort and ride acceptability. The report, the last of three volumes, contains procedural guidelines to be employed by...
ERIC Educational Resources Information Center
Chen, Chung-Yang; Chang, Huiju; Hsu, Wen-Chin; Sheen, Gwo-Ji
2017-01-01
This paper proposes a training model for raters, with the goal to improve the intra- and inter-consistency of evaluation quality for higher education curricula. The model, termed the learning, behaviour and reaction (LBR) circular training model, is an interdisciplinary application from the business and organisational training domain. The…
The Models-3 Community Multi-scale Air Quality (CMAQ) model, first released by the USEPA in 1999 (Byun and Ching. 1999), continues to be developed and evaluated. The principal components of the CMAQ system include a comprehensive emission processor known as the Sparse Matrix O...
THE LAKE MICHIGAN MASS BALANCE PROJECT: QUALITY ASSURANCE PLAN FOR MATHEMATICAL MODELLING
This report documents the quality assurance process for the development and application of the Lake Michigan Mass Balance Models. The scope includes the overall modeling framework as well as the specific submodels that are linked to form a comprehensive synthesis of physical, che...
Extant process-based hydrologic and water quality models are indispensable to water resources planning and environmental management. However, models are only approximations of real systems and often calibrated with incomplete and uncertain data. Reliable estimates, or perhaps f...
Selection and Classification Using a Forecast Applicant Pool.
ERIC Educational Resources Information Center
Hendrix, William H.
The document presents a forecast model of the future Air Force applicant pool. By forecasting applicants' quality (means and standard deviations of aptitude scores) and quantity (total number of applicants), a potential enlistee could be compared to the forecasted pool. The data used to develop the model consisted of means, standard deviation, and…
Quality by control: Towards model predictive control of mammalian cell culture bioprocesses.
Sommeregger, Wolfgang; Sissolak, Bernhard; Kandra, Kulwant; von Stosch, Moritz; Mayer, Martin; Striedner, Gerald
2017-07-01
The industrial production of complex biopharmaceuticals using recombinant mammalian cell lines is still mainly built on a quality by testing approach, which is represented by fixed process conditions and extensive testing of the end-product. In 2004 the FDA launched the process analytical technology initiative, aiming to guide the industry towards advanced process monitoring and better understanding of how critical process parameters affect the critical quality attributes. Implementation of process analytical technology into the bio-production process enables moving from the quality by testing to a more flexible quality by design approach. The application of advanced sensor systems in combination with mathematical modelling techniques offers enhanced process understanding, allows on-line prediction of critical quality attributes and subsequently real-time product quality control. In this review opportunities and unsolved issues on the road to a successful quality by design and dynamic control implementation are discussed. A major focus is directed on the preconditions for the application of model predictive control for mammalian cell culture bioprocesses. Design of experiments providing information about the process dynamics upon parameter change, dynamic process models, on-line process state predictions and powerful software environments seem to be a prerequisite for quality by control realization. © 2017 The Authors. Biotechnology Journal published by WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
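To make the model-predictive-control idea concrete, here is a deliberately tiny receding-horizon sketch: a one-state toy process model, a quadratic setpoint-tracking cost, and a bounded optimisation of the feed sequence. The model, constants and setpoint are illustrative assumptions, not a mammalian cell-culture model.

```python
# Minimal receding-horizon (MPC) sketch under assumed toy dynamics.
import numpy as np
from scipy.optimize import minimize

def predict(q0, feeds, k_gain=0.8, k_decay=0.1):
    """Toy process model: quality attribute q responds to feed u with first-order dynamics."""
    q, traj = q0, []
    for u in feeds:
        q = q + k_gain * u - k_decay * q
        traj.append(q)
    return np.array(traj)

def mpc_step(q0, setpoint, horizon=5, u_max=1.0, control_penalty=0.05):
    """Return the first feed rate of the optimal sequence (receding-horizon control)."""
    def cost(u):
        traj = predict(q0, u)
        return np.sum((traj - setpoint) ** 2) + control_penalty * np.sum(u ** 2)
    res = minimize(cost, x0=np.full(horizon, 0.1),
                   bounds=[(0.0, u_max)] * horizon, method='L-BFGS-B')
    return res.x[0]

# At each sampling instant: estimate q0 (e.g. from a soft sensor), then apply mpc_step(q0, 5.0).
```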
Influence of Boundary Conditions on Simulated U.S. Air Quality
One of the key inputs to regional-scale photochemical models frequently used in air quality planning and forecasting applications are chemical boundary conditions representing background pollutant concentrations originating outside the regional modeling domain. A number of studie...
AQMEII Phase 2: Overview and WRF/CMAQ Application over North America
In this study, we provide an overview of the second phase of the Air Quality Model Evaluation International Initiative (AQMEII). Activities in this phase are focused on the application and evaluation of coupled meteorology-chemistry models. Participating modeling systems are being...
Storm Water Management Model Applications Manual
The EPA Storm Water Management Model (SWMM) is a dynamic rainfall-runoff simulation model that computes runoff quantity and quality from primarily urban areas. This manual is a practical application guide for new SWMM users who have already had some previous training in hydrolog...
The paper presents a hybrid air quality modeling approach and its application in NEXUS in order to provide spatial and temporally varying exposure estimates and identification of the mobile source contribution to the total pollutant exposure. Model-based exposure metrics, associa...
Space-Time Fusion Under Error in Computer Model Output: An Application to Modeling Air Quality
In the last two decades a considerable amount of research effort has been devoted to modeling air quality with public health objectives. These objectives include regulatory activities such as setting standards along with assessing the relationship between exposure to air pollutan...
The 4th workshop of the Air Quality Model Evaluation International Initiative (AQMEII) was held on May 8 in Utrecht, The Netherlands, in conjunction with the NATO/SPS International Technical Meeting on Air Pollution Modeling and Its Application. AQMEII was launched in 2009 as a l...
The Community Multiscale Air Quality (CMAQ) modeling system is extended to simulate ozone, particulate matter, and related precursor distributions throughout the Northern Hemisphere. Modeled processes were examined and enhanced to suitably represent the extended space and timesca...
USDA-ARS?s Scientific Manuscript database
This chapter presents the development and application of a three-dimensional water quality model for predicting the distributions of nutrients, phytoplankton, dissolved oxygen, etc., in natural lakes. In this model, the computational domain was divided into two parts: the water column and the bed se...
Production system with process quality control: modelling and application
NASA Astrophysics Data System (ADS)
Tsou, Jia-Chi
2010-07-01
Over the past decade, there has been a great deal of research dedicated to the study of quality and the economics of production. In this article, we develop a dynamic model based on the assumptions of a traditional economic production quantity (EPQ) model. Taguchi's cost of poor quality is used to evaluate the cost of poor quality in the dynamic production system. A practical case from the automotive industry, which uses the Six Sigma DMAIC methodology, is discussed to verify the proposed model. This study shows that there is an optimal value of quality investment that allows the production system to reach a reasonable quality level and minimise the production cost. Based on our model, management can adjust its investment in quality improvement to generate considerable financial return.
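A hedged numerical sketch of the trade-off the article describes: a classic economic production quantity cost is combined with a Taguchi quadratic loss whose variance shrinks with quality investment, and a one-dimensional search finds the investment that minimises total annual cost. All constants and the investment-to-variance curve are assumptions, not taken from the article.

```python
# Illustrative EPQ + Taguchi quality-investment trade-off (assumed constants).
import numpy as np
from scipy.optimize import minimize_scalar

D, K, h, P = 10000.0, 500.0, 2.0, 40000.0     # demand/yr, setup cost, holding cost/unit/yr, production rate
k_loss, sigma0, alpha = 5.0, 1.0, 0.2         # Taguchi loss coefficient, base std dev, assumed learning rate

def total_annual_cost(investment):
    q_star = np.sqrt(2 * D * K / (h * (1 - D / P)))          # classic EPQ batch size
    epq_cost = D * K / q_star + 0.5 * h * q_star * (1 - D / P)
    sigma = sigma0 * np.exp(-alpha * investment / 1000.0)    # assumed variance-reduction curve
    taguchi_loss = D * k_loss * sigma ** 2                   # expected quadratic loss, on-target process
    return epq_cost + taguchi_loss + investment

best = minimize_scalar(total_annual_cost, bounds=(0.0, 50000.0), method='bounded')
print(best.x, total_annual_cost(best.x))                     # optimal quality investment and its cost
```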
Revisiting the Procedures for the Vector Data Quality Assurance in Practice
NASA Astrophysics Data System (ADS)
Erdoğan, M.; Torun, A.; Boyacı, D.
2012-07-01
Immense use of topographical data in spatial data visualization, business GIS (Geographic Information Systems) solutions and applications, and mobile and location-based services has forced topographic data providers to create standard, up-to-date and complete data sets within a sustainable framework. Data quality has been studied and researched for more than two decades, producing countless references on its semantics, its conceptual and logical representations, and many applications in spatial databases and GIS. However, there is a gap between research and practice in spatial data quality that increases the cost and decreases the efficiency of data production. Spatial data quality is well known to both academia and industry, but usually in different contexts. Research on spatial data quality has identified several issues of practical use, such as descriptive information, metadata, fulfilment of spatial relationships among data, integrity measures and geometric constraints. Industry and data producers address them in three stages: pre-, co- and post-data capturing. The pre-data-capturing stage covers semantic modelling, data definition, cataloguing, data dictionary and schema creation processes. The co-data-capturing stage covers general rules of spatial relationships and data- and model-specific rules, such as topological and model-building relationships, geometric thresholds, data extraction guidelines, and object-object, object-to-belonging-class, object-to-non-belonging-class and class-class relationships to be taken into account during data capturing. The post-data-capturing stage covers specified QC (quality check) benchmarks and checks of compliance with the general and specific rules. Vector data quality criteria differ between the views of producers and users, but they are generally driven by the needs, expectations and feedback of the users. This paper presents a practical method that closes the gap between theory and practice. Turning spatial data quality concepts into development and application requires the conceptual, logical and, most importantly, physical existence of a data model, rules, and knowledge of their realization in the form of geo-spatial data. The applicable metrics and thresholds are determined on this concrete base. This study discusses the application of geo-spatial data quality issues and of QA (quality assurance) and QC procedures in topographic data production. First, we introduce the MGCP (Multinational Geospatial Co-production Program) data profile of the NATO (North Atlantic Treaty Organization) DFDD (DGIWG Feature Data Dictionary), the requirements of the data owner, the view of data producers on both data capturing and QC, and finally QA to fulfil user needs. Then our new, practical approach, which divides quality into three phases, is introduced. Finally, the implementation of our approach to realize the metrics, measures and thresholds of the quality definitions is discussed. In this paper, geometry and semantics quality in particular, and the quality control procedures that can be performed by the producers, are discussed. Some applicable best practices from our experience with quality control techniques and with regulations that define the objectives and data production procedures are given in the final remarks.
These quality control procedures should include visual checks of the source data, captured vector data and printouts, automatic checks that can be performed by software, and semi-automatic checks performed in interaction with quality control personnel. Finally, these quality control procedures should ensure the geometric, semantic, attribution and metadata quality of the vector data.
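To make the notion of "automatic checks that can be performed by software" concrete, here is a minimal sketch of a geometry QC pass over a few vector features; the use of the shapely library, the feature identifiers and the area threshold are assumptions for illustration, not part of the MGCP workflow described above.

```python
# Minimal sketch of automated vector-geometry QC checks of the kind described
# above; the library choice, feature IDs and thresholds are illustrative assumptions.
from shapely.geometry import Polygon
from shapely.validation import explain_validity

features = {
    "building_001": Polygon([(0, 0), (10, 0), (10, 8), (0, 8)]),
    "building_002": Polygon([(0, 0), (5, 5), (5, 0), (0, 5)]),  # self-intersecting "bowtie"
}

MIN_AREA = 1.0  # assumed geometric threshold from a data specification

for fid, geom in features.items():
    issues = []
    if not geom.is_valid:
        issues.append(explain_validity(geom))     # e.g. reports the self-intersection
    if geom.area < MIN_AREA:
        issues.append(f"area {geom.area:.2f} below threshold {MIN_AREA}")
    print(fid, "OK" if not issues else "; ".join(issues))
```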
Should we trust build-up/wash-off water quality models at the scale of urban catchments?
Bonhomme, Céline; Petrucci, Guido
2017-01-01
Models of runoff water quality at the scale of an urban catchment usually rely on build-up/wash-off formulations obtained through small-scale experiments. Often, the physical interpretation of the model parameters, valid at the small scale, is transposed to large-scale applications. Testing different levels of spatial variability, the parameter distributions of a water quality model are obtained in this paper through a Markov chain Monte Carlo algorithm and analyzed. The simulated variable is the total suspended solid concentration at the outlet of a periurban catchment in the Paris region (2.3 km²), for which high-frequency turbidity measurements are available. This application suggests that build-up/wash-off models applied at the catchment-scale do not maintain their physical meaning, but should be considered as "black-box" models. Copyright © 2016 Elsevier Ltd. All rights reserved.
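For readers unfamiliar with the calibration approach mentioned above, the following toy sketch shows a Metropolis-type Markov chain Monte Carlo fit of a generic power-law wash-off relation to synthetic TSS data; the model form, priors, step sizes and data are illustrative assumptions, not the study's actual setup.

```python
# Rough Metropolis sampling sketch for the parameters of a generic wash-off
# relation TSS = c * runoff**w, fitted to synthetic observations.
import numpy as np

rng = np.random.default_rng(0)
runoff = rng.gamma(2.0, 1.0, size=200)                     # synthetic runoff series (mm/h)
true_c, true_w = 5.0, 0.3
obs = true_c * runoff**true_w + rng.normal(0, 0.5, 200)    # synthetic TSS observations

def log_post(theta):
    c, w = theta
    if c <= 0 or not (0 < w < 2):                          # flat priors on a plausible range
        return -np.inf
    resid = obs - c * runoff**w
    return -0.5 * np.sum(resid**2) / 0.5**2                # Gaussian likelihood, sigma = 0.5

theta = np.array([1.0, 1.0])
samples = []
for _ in range(20_000):
    prop = theta + rng.normal(0, [0.1, 0.02])              # random-walk proposal
    if np.log(rng.uniform()) < log_post(prop) - log_post(theta):
        theta = prop
    samples.append(theta.copy())

burned = np.array(samples[5_000:])                         # drop burn-in
print("posterior means (c, w):", burned.mean(axis=0))
```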
NASA Astrophysics Data System (ADS)
Mirauda, D.; Ostoich, M.; Di Maria, F.; Benacchio, S.; Saccardo, I.
2018-03-01
In this paper, a mathematical model has been applied to a river in North-East Italy to describe vulnerability scenarios due to environmental pollution phenomena. This model, based on influence diagram theory, allowed the identification of the most critical factors affecting the water quality of the river, such as wastewater discharges, drainage of diffuse pollution from agriculture, and climate change. The results underline how water quality conditions have improved thanks to continuous controls on the territory following the application of Water Framework Directive 2000/60/EC. Nevertheless, some river stretches did not reach “good ecological status” by 2015, because of the growth of population in urban areas in recent years and the high numbers of tourists during the summer months, which have not been matched by upgrades to treatment plants.
Regional air quality models are frequently used for regulatory applications to predict changes in air quality due to changes in emissions or changes in meteorology. Dynamic model evaluation is thus an important step in establishing credibility in the model predicted pollutant re...
USDA-ARS?s Scientific Manuscript database
AgroEcoSystem-Watershed (AgES-W) is a modular, Java-based spatially distributed model which implements hydrologic/water quality simulation components. The AgES-W model was previously evaluated for streamflow and recently has been enhanced with the addition of nitrogen (N) and sediment modeling compo...
NASA Earth Observation Systems and Applications for Health and Air Quality
NASA Technical Reports Server (NTRS)
Omar, Ali H.
2015-01-01
There is a growing body of evidence that the environment can affect human health in ways that are both complex and global in scope. To address some of these complexities, NASA maintains a diverse constellation of Earth observing research satellites, and sponsors research in developing satellite data applications across a wide spectrum of areas. These include environmental health; infectious disease; air quality standards, policies, and regulations; and the impact of climate change on health and air quality in a number of interrelated efforts. The Health and Air Quality Applications program fosters the use of observations, modeling systems, forecast development, application integration, and the research-to-operations transition process to address environmental health effects. NASA has been a primary partner with Federal operational agencies over the past nine years in these areas. This talk presents the background of the Health and Air Quality Applications program, recent accomplishments, and a plan for the future.
Measuring health care process quality with software quality measures.
Yildiz, Ozkan; Demirörs, Onur
2012-01-01
Existing quality models focus on specific diseases, clinics or clinical areas. Although they contain structure, process, or output type measures, no model measures the quality of health care processes comprehensively. In addition, because overall process quality is not measured, hospitals cannot compare the quality of their processes internally or externally. To address these problems, a new model was developed from software quality measures. We adopted the ISO/IEC 9126 software quality standard for health care processes. Then, measurable elements of the JCIAS (Joint Commission International Accreditation Standards for Hospitals) were added to the model's scope to unify functional requirements. Measurement results for the assessment (diagnosing) process are provided in this paper. After the application, it was concluded that the model determines the weak and strong aspects of processes, gives a more detailed picture of process quality, and provides quantifiable information that hospitals can use to compare their processes with those of other organizations.
DEVELOPMENT AND APPLICATIONS OF CFD SIMULATIONS SUPPORTING URBAN AIR QUALITY AND HOMELAND SECURITY
Prior to September 11, 2001, development of Computational Fluid Dynamics (CFD) applications had begun in support of air quality assessments. CFD models are emerging as a promising technology for such assessments, in part due to the advancing power of computational hardware and software. CFD si...
There is a need to properly develop the application of Computational Fluid Dynamics (CFD) methods in support of air quality studies involving pollution sources near buildings at industrial sites. CFD models are emerging as a promising technology for such assessments, in part due ...
Evidence-based dentistry: a model for clinical practice.
Faggion, Clóvis M; Tu, Yu-Kang
2007-06-01
Decision making in dentistry should be based on the best evidence available. The objective of this study was to demonstrate a practical procedure and model that clinicians can use to apply the results of well-conducted studies to patient care by critically appraising the evidence with checklists and letter grade scales. To demonstrate application of this model for critically appraising the quality of research evidence, a hypothetical case involving an adult male with chronic periodontitis is used as an example. To determine the best clinical approach for this patient, a four-step, evidence-based model is demonstrated, consisting of the following: definition of a research question using the PICO format, search and selection of relevant literature, critical appraisal of identified research reports using checklists, and the application of evidence. In this model, the quality of research evidence was assessed quantitatively based on different levels of quality that are assigned letter grades of A, B, and C by evaluating the studies against the QUOROM (Quality of Reporting Meta-Analyses) and CONSORT (Consolidated Standards of Reporting Trials) checklists in a tabular format. For this hypothetical periodontics case, application of the model identified the best available evidence for clinical decision making, i.e., one randomized controlled trial and one systematic review of randomized controlled trials. Both studies gave similar answers to the research question. The use of a letter grade scale allowed an objective analysis of the quality of evidence. A checklist-driven model that assesses and applies evidence to dental practice may substantially improve dentists' decision-making skills.
“AQMEII Phase 2: Overview and WRF/CMAQ Application over North America”.
This presentation provides an overview of the second phase of the Air Quality Model Evaluation International Initiative (AQMEII). Activities in this phase are focused on the application and evaluation of coupled meteorology-chemistry models to assess how well these models can simu...
Regional, state, and local environmental regulatory agencies often use Eulerian meteorological and air quality models to investigate the potential impacts of climate, emissions, and land use changes on nutrient loading and air quality. The Noah land surface model in WRF could be...
USDA-ARS?s Scientific Manuscript database
In order to control algal blooms, stressor-response relationships between water quality metrics, environmental variables, and algal growth should be understood and modeled. Machine-learning methods were suggested to express stressor-response relationships found by application of mechanistic water qu...
DOT National Transportation Integrated Search
1986-01-01
This report describes an investigation of state-of-the-art models for predicting the impact on air quality of additions or changes to a highway system in an area identified by the U.S. Environmental Protection Agency as a "non-attainment area" for air quality s...
The theory and programming of statistical tests for evaluating the Real-Time Air-Quality Model (RAM) using the Regional Air Pollution Study (RAPS) data base are fully documented in four report volumes. Moreover, the tests are generally applicable to other model evaluation problem...
The Samurai or the Cowboy? Toward an American Model of Quality Management.
ERIC Educational Resources Information Center
Beck, Mark W.
1994-01-01
The Japanese model of business management and Total Quality Management principles being applied to higher education as well as businesses are often ineffective because of the application of packaged ideas without consideration of the subtleties of individual organizations. The cowboy model of teamwork stresses the individual's role and better fits…
Crop model application to soybean irrigation management in the mid-south USA
USDA-ARS?s Scientific Manuscript database
Since the mid-1990s, there has been rapid development and application of crop growth models such as APEX (the Agricultural Policy/Environmental eXtender) and RZWQM2 (Root Zone Water Quality Model). Such process-oriented models have been designed to study the interactions of genotypes, weather, soil, ...
This study demonstrates the value of a coupled chemical transport modeling system for investigating groundwater nitrate contamination responses associated with nitrogen (N) fertilizer application and increased corn production. The coupled Community Multiscale Air Quality Bidirect...
A FEDERATED PARTNERSHIP FOR URBAN METEOROLOGICAL AND AIR QUALITY MODELING
Recently, applications of urban meteorological and air quality models have been performed at resolutions on the order of km grid sizes. This necessitated development and incorporation of high resolution landcover data and additional boundary layer parameters that serve to descri...
Uncertainties in ozone concentrations predicted with a Lagrangian photochemical air quality model have been estimated using Bayesian Monte Carlo (BMC) analysis. Bayesian Monte Carlo analysis provides a means of combining subjective "prior" uncertainty estimates developed ...
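A schematic of the Bayesian Monte Carlo idea described above, under stated assumptions: model inputs are drawn from prior distributions, a stand-in "model" maps them to an ozone prediction, and each ensemble member is weighted by its likelihood against a hypothetical observation. The toy response function and all numbers are invented.

```python
# Schematic Bayesian Monte Carlo: prior sampling, a toy model, and likelihood weighting.
import numpy as np

rng = np.random.default_rng(1)
n = 5_000
emis_scale = rng.lognormal(mean=0.0, sigma=0.3, size=n)   # prior on an emission scaling factor
rate_scale = rng.lognormal(mean=0.0, sigma=0.2, size=n)   # prior on a rate-constant scaling factor

def toy_ozone(e, k):
    return 60.0 * e**0.6 * k**0.3        # stand-in for the photochemical model (ppb)

pred = toy_ozone(emis_scale, rate_scale)
obs, sigma_obs = 72.0, 8.0               # hypothetical observed ozone and its uncertainty
weights = np.exp(-0.5 * ((pred - obs) / sigma_obs) ** 2)
weights /= weights.sum()                 # posterior weights for each ensemble member

print(f"prior mean {pred.mean():.1f} ppb -> posterior mean {np.sum(weights * pred):.1f} ppb")
```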
A software quality model and metrics for risk assessment
NASA Technical Reports Server (NTRS)
Hyatt, L.; Rosenberg, L.
1996-01-01
A software quality model and its associated attributes are defined and used as the basis for a discussion of risk. Specific quality goals and attributes are selected based on their importance to a software development project and their ability to be quantified. Risks that can be determined by the model's metrics are identified. A core set of metrics relating to the software development process and its products is defined. Measurements for each metric and their usability and applicability are discussed.
NASA Astrophysics Data System (ADS)
Hong, Chaopeng; Zhang, Qiang; Zhang, Yang; Tang, Youhua; Tong, Daniel; He, Kebin
2017-06-01
In this study, a regional coupled climate-chemistry modeling system using the dynamical downscaling technique was established by linking the global Community Earth System Model (CESM) and the regional two-way coupled Weather Research and Forecasting - Community Multi-scale Air Quality (WRF-CMAQ) model for the purpose of comprehensive assessments of regional climate change and air quality and their interactions within one modeling framework. The modeling system was applied over east Asia for a multi-year climatological application during 2006-2010, driven with CESM downscaling data under Representative Concentration Pathways 4.5 (RCP4.5), along with a short-term air quality application in representative months in 2013 that was driven with a reanalysis dataset. A comprehensive model evaluation was conducted against observations from surface networks and satellite observations to assess the model's performance. This study presents the first application and evaluation of the two-way coupled WRF-CMAQ model for climatological simulations using the dynamical downscaling technique. The model was able to satisfactorily predict major meteorological variables. The improved statistical performance for the 2 m temperature (T2) in this study (with a mean bias of -0.6 °C) compared with the Coupled Model Intercomparison Project Phase 5 (CMIP5) multi-models might be related to the use of the regional model WRF and the bias-correction technique applied for CESM downscaling. The model showed good ability to predict PM2.5 in winter (with a normalized mean bias (NMB) of 6.4 % in 2013) and O3 in summer (with an NMB of 18.2 % in 2013) in terms of statistical performance and spatial distributions. Compared with global models that tend to underpredict PM2.5 concentrations in China, WRF-CMAQ was able to capture the high PM2.5 concentrations in urban areas. In general, the two-way coupled WRF-CMAQ model performed well for both climatological and air quality applications. The coupled modeling system with direct aerosol feedbacks predicted aerosol optical depth relatively well and significantly reduced the overprediction in downward shortwave radiation at the surface (SWDOWN) over polluted regions in China. The performance of cloud variables was not as good as other meteorological variables, and underpredictions of cloud fraction resulted in overpredictions of SWDOWN and underpredictions of shortwave and longwave cloud forcing. The importance of climate-chemistry interactions was demonstrated via the impacts of aerosol direct effects on climate and air quality. The aerosol effects on climate and air quality in east Asia (e.g., SWDOWN and T2 decreased by 21.8 W m-2 and 0.45 °C, respectively, and most pollutant concentrations increased by 4.8-9.5 % in January over China's major cities) were more significant than in other regions because of higher aerosol loadings that resulted from severe regional pollution, which indicates the need for applying online-coupled models over east Asia for regional climate and air quality modeling and to study the important climate-chemistry interactions. This work established a baseline for WRF-CMAQ simulations for a future period under the RCP4.5 climate scenario, which will be presented in a future paper.
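The mean bias and normalized mean bias quoted in the abstract follow the standard definitions used in air quality model evaluation; the short sketch below shows the computation on made-up paired model/observation values.

```python
# Standard evaluation statistics: MB = mean(model - obs); NMB = sum(model - obs) / sum(obs).
# The paired values below are invented purely to demonstrate the computation.
import numpy as np

obs = np.array([35.0, 50.0, 80.0, 120.0, 95.0])    # e.g. observed PM2.5 (ug/m3)
mod = np.array([40.0, 48.0, 90.0, 110.0, 101.0])   # co-located model values

mean_bias = np.mean(mod - obs)
nmb = 100.0 * np.sum(mod - obs) / np.sum(obs)
print(f"MB = {mean_bias:.2f} ug/m3, NMB = {nmb:.1f} %")
```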
ERIC Educational Resources Information Center
Cepeda-Cuervo, Edilberto; Núñez-Antón, Vicente
2013-01-01
In this article, a proposed Bayesian extension of the generalized beta spatial regression models is applied to the analysis of the quality of education in Colombia. We briefly review the beta distribution and describe the joint modeling approach for the mean and dispersion parameters in the spatial regression models' setting. Finally, we motivate…
NASA Astrophysics Data System (ADS)
Balasubramanian, S.; Nelson, A. J.; Koloutsou-Vakakis, S.; Lin, J.; Myles, L.; Rood, M. J.
2016-12-01
Biogeochemical models such as DeNitrification DeComposition (DNDC) are used to model greenhouse and other trace gas fluxes (e.g., ammonia (NH3)) from agricultural ecosystems. NH3 is of interest to air quality because it is a precursor to ambient particulate matter. NH3 fluxes from chemical fertilizer application are uncertain due to their dependence on local weather and soil properties, and on farm nitrogen management practices. DNDC can be advantageously implemented to model the underlying spatial and temporal trends to support air quality modeling. However, such implementation requires a detailed evaluation of model predictions and model behavior. This is the first study to assess DNDC predictions of NH3 fluxes to and from the atmosphere following chemical fertilizer application over an entire crop growing season in the United States. Relaxed eddy accumulation (REA) measurements over corn in Central Illinois in 2014 were used to evaluate the magnitude and trends in modeled NH3 fluxes. DNDC was able to replicate both the magnitude and trends in measured NH3 fluxes, with greater accuracy during the initial 33 days after application, when NH3 was mostly emitted to the atmosphere. However, poorer performance was observed when depositional fluxes were measured. Sensitivity analysis using Monte Carlo simulations indicated that modeled NH3 fluxes were most sensitive to input air temperature and precipitation, soil organic carbon, field capacity and pH, fertilizer loading rate, timing and application depth, and tilling date. By constraining these inputs for conditions in Central Illinois, uncertainty in annual NH3 fluxes was estimated to vary from -87% to 61%. Results from this study provide insight to further improve DNDC predictions and inform efforts to upscale site predictions to the regional scale for the development of emission inventories for air quality modeling.
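A minimal sketch of the kind of Monte Carlo sensitivity screening described above: uncertain inputs are perturbed, a toy NH3 flux response is evaluated, and inputs are ranked by Spearman correlation with the output. The response function, input ranges and names are assumptions for illustration, not DNDC itself.

```python
# Monte Carlo sensitivity screening with a toy NH3 flux response (not DNDC).
import numpy as np
from scipy.stats import spearmanr

rng = np.random.default_rng(2)
n = 2_000
inputs = {
    "air_temp_C": rng.normal(22, 4, n),
    "soil_pH": rng.normal(6.5, 0.5, n),
    "fertilizer_kgN_ha": rng.normal(180, 30, n),
    "precip_mm": rng.gamma(2.0, 3.0, n),
}

# Assumed response: warmer, more alkaline soil and heavier application -> more emission;
# more precipitation -> less emission (infiltration of applied N).
flux = (0.05 * inputs["fertilizer_kgN_ha"]
        * np.exp(0.06 * (inputs["air_temp_C"] - 20))
        * 10 ** (0.3 * (inputs["soil_pH"] - 7))
        / (1.0 + 0.05 * inputs["precip_mm"]))

for name, x in inputs.items():
    rho, _ = spearmanr(x, flux)
    print(f"{name:>18s}: Spearman rho = {rho:+.2f}")
```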
We present an application of the online coupled WRF-CMAQ modeling system to two annual simulations over North America performed under Phase 2 of the Air Quality Model Evaluation International Initiative (AQMEII). Operational evaluation shows that model performance is comparable t...
MODELING CONSISTENCY, MODEL QUALITY, AND FOSTERING CONTINUED IMPROVEMENT
We believe that most contributors to and participants of the International Conference, Marine Waste Water Discharges 2000, "MWWD 2000," could agree that the overarching dream of the conference might be to chart a path that will lead to the best, long-term, applicable water quality...
AQMEII: A New International Initiative on Air Quality Model Evaluation
We provide a conceptual view of the process of evaluating regional-scale three-dimensional numerical photochemical air quality modeling systems, based on an examination of existing approaches to the evaluation of such systems as they are currently used in a variety of application....
CHOOSING A CHEMICAL MECHANISM FOR REGULATORY AND RESEARCH AIR QUALITY MODELING APPLICATIONS
There are numerous, different chemical mechanisms currently available for use in air quality models, and new mechanisms and versions of mechanisms are continually being developed. The development of Morphecule-type mechanisms will add a near-infinite number of additional mecha...
Systems Engineering Metrics: Organizational Complexity and Product Quality Modeling
NASA Technical Reports Server (NTRS)
Mog, Robert A.
1997-01-01
Innovative organizational complexity and product quality models applicable to performance metrics for NASA-MSFC's Systems Analysis and Integration Laboratory (SAIL) missions and objectives are presented. An intensive research effort focuses on the synergistic combination of stochastic process modeling, nodal and spatial decomposition techniques, organizational and computational complexity, systems science and metrics, chaos, and proprietary statistical tools for accelerated risk assessment. This is followed by the development of a preliminary model, which is uniquely applicable and robust for quantitative purposes. Exercise of the preliminary model using a generic system hierarchy and the AXAF-I architectural hierarchy is provided. The Kendall test for positive dependence provides an initial verification and validation of the model. Finally, the research and development of the innovation is revisited, prior to peer review. This research and development effort results in near-term, measurable SAIL organizational and product quality methodologies, enhanced organizational risk assessment and evolutionary modeling results, and improved statistical quantification of SAIL productivity interests.
A Review of Surface Water Quality Models
Li, Shibei; Jia, Peng; Qi, Changjun; Ding, Feng
2013-01-01
Surface water quality models can be useful tools to simulate and predict the levels, distributions, and risks of chemical pollutants in a given water body. The modeling results from these models under different pollution scenarios are very important components of environmental impact assessment and can provide a basis and technical support for environmental management agencies to make sound decisions. Whether the model results are right or not can affect the reasonableness and scientific soundness of authorized construction projects and the availability of pollution control measures. We reviewed the development of surface water quality models in three stages and analyzed the suitability, precision, and methods of different models. Standardization of water quality models can help environmental management agencies guarantee consistency in the application of water quality models for regulatory purposes. We summarize the status of standardization of these models in developed countries and put forward practical measures for the standardization of surface water quality models, especially in developing countries. PMID:23853533
The theory and programming of statistical tests for evaluating the Real-Time Air-Quality Model (RAM) using the Regional Air Pollution Study (RAPS) data base are fully documented in four volumes. Moreover, the tests are generally applicable to other model evaluation problems. Volu...
APPLICATION OF A WATER QUALITY ASSESSMENT MODELING SYSTEM AT A SUPERFUND SITE
Water quality modeling and related exposure assessments at a Superfund site, Silver Bow Creek-Clark Fork River in Montana, demonstrate the capability to predict the fate of mining waste pollutants in the environment. A linked assessment system--consisting of hydrology and erosion, r...
Evaluating Predictive Models of Software Quality
NASA Astrophysics Data System (ADS)
Ciaschini, V.; Canaparo, M.; Ronchieri, E.; Salomoni, D.
2014-06-01
Applications from the High Energy Physics scientific community are constantly growing and are implemented by a large number of developers. This implies high churn on the code and an associated risk of faults, which is unavoidable as long as the software undergoes active evolution. However, the necessities of production systems run counter to this. Stability and predictability are of paramount importance; in addition, a short turn-around time for the defect discovery-correction-deployment cycle is required. A way to reconcile these opposing goals is to use a software quality model to obtain an approximation of the risk before releasing a program, so that only software with a risk lower than an agreed threshold is delivered. In this article we evaluated two quality predictive models to identify the operational risk and the quality of some software products. We applied these models to the development history of several EMI packages with the intent to discover the risk factor of each product and compare it with its real history. We attempted to determine whether the models reasonably map reality for the applications under evaluation, and we conclude by suggesting directions for further studies.
Service Quality and Customer Satisfaction: An Assessment and Future Directions.
ERIC Educational Resources Information Center
Hernon, Peter; Nitecki, Danuta A.; Altman, Ellen
1999-01-01
Reviews the literature of library and information science to examine issues related to service quality and customer satisfaction in academic libraries. Discusses assessment, the application of a business model to higher education, a multiple constituency approach, decision areas regarding service quality, resistance to service quality, and future…
Mathur, Rohit; Xing, Jia; Gilliam, Robert; Sarwar, Golam; Hogrefe, Christian; Pleim, Jonathan; Pouliot, George; Roselle, Shawn; Spero, Tanya L.; Wong, David C.; Young, Jeffrey
2018-01-01
The Community Multiscale Air Quality (CMAQ) modeling system is extended to simulate ozone, particulate matter, and related precursor distributions throughout the Northern Hemisphere. Modelled processes were examined and enhanced to suitably represent the extended space and time scales for such applications. Hemispheric scale simulations with CMAQ and the Weather Research and Forecasting (WRF) model are performed for multiple years. Model capabilities for a range of applications including episodic long-range pollutant transport, long-term trends in air pollution across the Northern Hemisphere, and air pollution-climate interactions are evaluated through detailed comparison with available surface, aloft, and remotely sensed observations. The expansion of CMAQ to simulate the hemispheric scales provides a framework to examine interactions between atmospheric processes occurring at various spatial and temporal scales with physical, chemical, and dynamical consistency. PMID:29681922
Three-Dimensional Visualization of Ozone Process Data.
1997-06-18
This abstract fragment describes visualization of ozone process data from the Multiscale Air Quality Simulation Platform (MAQSIP), a modular, comprehensive air quality modeling system developed with MCNC; in the photochemistry summarized, nitrogen dioxide is photolyzed back to nitric oxide, and oxides of nitrogen are ultimately removed through loss or combination into nitric acid and organic nitrates. Cited references include Odman, M.T. and Ingram, C.L. (1996), Multiscale Air Quality Simulation..., and IEEE Computer Graphics & Applications, 11 (May), 47-55.
AIR QUALITY MODELING OF HAZARDOUS POLLUTANTS: CURRENT STATUS AND FUTURE DIRECTIONS
The paper presents a review of current air toxics modeling applications and discusses possible advanced approaches. Many applications require the ability to predict hot spots from industrial sources or large roadways that are needed for community health and Environmental Justice...
Landsat - What is operational in water resources
NASA Technical Reports Server (NTRS)
Middleton, E. M.; Munday, J. C., Jr.
1981-01-01
Applications of Landsat data in hydrology and water quality measurement were examined to determine which applications are operational. In hydrology, the principal applications have been surface water inventory, and land cover analysis for (1) runoff modeling and (2) abatement planning for non-point pollution and erosion. In water quality measurement, the principal applications have been: (1) trophic state assessment, and (2) measurement of turbidity and suspended sediment. The following applications were found to be operational: mapping of surface water, snow cover, and land cover (USGS Level 1) for watershed applications; measurement of turbidity, Secchi disk depth, suspended sediment concentration, and water depth.
Measuring Student Course Evaluations: The Use of a Loglinear Model
ERIC Educational Resources Information Center
Ting, Ding Hooi; Abella, Mireya Sosa
2007-01-01
In this paper, the researchers attempt to incorporate marketing theory (specifically the service quality model) into the education system. The service quality measurements have been employed to investigate their applicability in the education environment. Most previous studies employ regression-based analysis to test the effectiveness of…
Counseling Pretreatment and the Elaboration Likelihood Model of Attitude Change.
ERIC Educational Resources Information Center
Heesacker, Martin
1986-01-01
Results of the application of the Elaboration Likelihood Model (ELM) to a counseling context revealed that more favorable attitudes toward counseling occurred as subjects' ego involvement increased and as intervention quality improved. Counselor credibility affected the degree to which subjects' attitudes reflected argument quality differences.…
The status of military specifications with regard to atmospheric turbulence
NASA Technical Reports Server (NTRS)
Moorhouse, David J.; Heffley, Robert K.
1987-01-01
The features of atmospheric disturbances that are significant to aircraft flying qualities are discussed. Next follows a survey of proposed models. Lastly, there is a discussion of the content and application of the model contained in the current flying qualities specification and the forthcoming MIL-Standard.
DOT National Transportation Integrated Search
2008-04-01
The objective of this study was to develop resilient modulus prediction models for possible application in the quality control/quality assurance (QC/QA) procedures during and after the construction of pavement layers. Field and laboratory testing pro...
The adaptation of the Community Multiscale Air Quality (CMAQ) modeling system to simulate O3, particulate matter, and related precursor distributions over the northern hemisphere is presented. Hemispheric simulations with CMAQ and the Weather Research and Forecasting (...
In recent years the applications of regional air quality models have continuously been extended to address atmospheric pollution phenomena from local to hemispheric spatial scales over time scales ranging from episodic to annual. The need to represent interactions between physic...
Measurement of Productivity and Quality in Non-Marketable Services: With Application to Schools
ERIC Educational Resources Information Center
Fare, R.; Grosskopf, S.; Forsund, F. R.; Hayes, K.; Heshmati, A.
2006-01-01
Purpose: This paper seeks to model and compute productivity, including a measure of quality, of a service which does not have marketable outputs--namely public education at the micro level. This application is a case study for Sweden public schools. Design/methodology/approach: A Malmquist productivity index is employed which allows for multiple…
Open Source Molecular Modeling
Pirhadi, Somayeh; Sunseri, Jocelyn; Koes, David Ryan
2016-01-01
The success of molecular modeling and computational chemistry efforts is, by definition, dependent on quality software applications. Open source software development provides many advantages to users of modeling applications, not the least of which is that the software is free and completely extendable. In this review we categorize, enumerate, and describe available open source software packages for molecular modeling and computational chemistry. PMID:27631126
The CMAQ modeling system has been used to simulate the CONUS using 12-km by 12-km horizontal grid spacing for the entire year of 2006 as part of the Air Quality Model Evaluation International Initiative (AQMEII). The operational model performance for O3 and PM2.5...
[Quality process control system of Chinese medicine preparation based on "holistic view"].
Wang, Ya-Qi; Jiao, Jiao-Jiao; Wu, Zhen-Feng; Zheng, Qin; Yang, Ming
2018-01-01
"High quality, safety and effectiveness" are the primary principles for the pharmaceutical research and development process in China. The quality of products relies not only on the inspection method, but also on the design and development, process control and standardized management. The quality depends on the process control level. In this paper, the history and current development of quality control of traditional Chinese medicine (TCM) preparations are reviewed systematically. Based on the development model of international drug quality control and the misunderstanding of quality control of TCM preparations, the reasons for impacting the homogeneity of TCM preparations are analyzed and summarized. According to TCM characteristics, efforts were made to control the diversity of TCM, make "unstable" TCM into "stable" Chinese patent medicines, put forward the concepts of "holistic view" and "QbD (quality by design)", so as to create the "holistic, modular, data, standardized" model as the core of TCM preparation quality process control model. Scientific studies shall conform to the actual production of TCM preparations, and be conducive to supporting advanced equipment and technology upgrade, thoroughly applying the scientific research achievements in Chinese patent medicines, and promoting the cluster application and transformation application of TCM pharmaceutical technology, so as to improve the quality and effectiveness of the TCM industry and realize the green development. Copyright© by the Chinese Pharmaceutical Association.
NASA Technical Reports Server (NTRS)
Duncan, Bryan Neal; Prados, Ana; Lamsal, Lok N.; Liu, Yang; Streets, David G.; Gupta, Pawan; Hilsenrath, Ernest; Kahn, Ralph A.; Nielsen, J. Eric; Beyersdorf, Andreas J.;
2014-01-01
Satellite data of atmospheric pollutants are becoming more widely used in the decision-making and environmental management activities of public, private sector and non-profit organizations. They are employed for estimating emissions, tracking pollutant plumes, supporting air quality forecasting activities, providing evidence for "exceptional event" declarations, monitoring regional long-term trends, and evaluating air quality model output. However, many air quality managers are not taking full advantage of the data for these applications nor has the full potential of satellite data for air quality applications been realized. A key barrier is the inherent difficulties associated with accessing, processing, and properly interpreting observational data. A degree of technical skill is required on the part of the data end-user, which is often problematic for air quality agencies with limited resources. Therefore, we 1) review the primary uses of satellite data for air quality applications, 2) provide some background information on satellite capabilities for measuring pollutants, 3) discuss the many resources available to the end-user for accessing, processing, and visualizing the data, and 4) provide answers to common questions in plain language.
Harlow C. Landphair
1979-01-01
This paper describes the evolution of an empirical model used to objectively predict public response to scenic quality. The text describes the methods used to develop the visual quality index model, explains the terms used in the equation and briefly illustrates how the model is applied and how it is tested. While the technical application of the model relies heavily on...
Air Quality Dispersion Modeling - Alternative Models
Models, not listed in Appendix W, that can be used in regulatory applications with case-by-case justification to the Reviewing Authority as noted in Section 3.2, Use of Alternative Models, in Appendix W.
Innovations in projecting emissions for air quality modeling ...
Air quality modeling is used in setting air quality standards and in evaluating their costs and benefits. Historically, modeling applications have projected emissions and the resulting air quality only 5 to 10 years into the future. Recognition that the choice of air quality management strategy has climate change implications is encouraging longer modeling time horizons. However, for multi-decadal time horizons, many questions about future conditions arise. For example, will current population, economic, and land use trends continue, or will we see shifts that may alter the spatial and temporal pattern of emissions? Similarly, will technologies such as building-integrated solar photovoltaics, battery storage, electric vehicles, and CO2 capture emerge as disruptive technologies - shifting how we produce and use energy - or will these technologies achieve only niche markets and have little impact? These are some of the questions that are being evaluated by researchers within the U.S. EPA’s Office of Research and Development. In this presentation, Dr. Loughlin will describe a range of analytical approaches that are being explored. These include: (i) the development of alternative scenarios of the future that can be used to evaluate candidate management strategies over wide-ranging conditions, (ii) the application of energy system models to project emissions decades into the future and to assess the environmental implications of new technologies, (iii) and methodo
Comprehensive model for predicting perceptual image quality of smart mobile devices.
Gong, Rui; Xu, Haisong; Luo, M R; Li, Haifeng
2015-01-01
An image quality model for smart mobile devices was proposed based on visual assessments of several image quality attributes. A series of psychophysical experiments was carried out on two kinds of smart mobile devices, i.e., smart phones and tablet computers, in which naturalness, colorfulness, brightness, contrast, sharpness, clearness, and overall image quality were visually evaluated under three lighting environments via the categorical judgment method for various application types of test images. On the basis of Pearson correlation coefficients and factor analysis, overall image quality was first predicted from its two constituent attributes with multiple linear regression functions for each type of image, and mathematical expressions were then built to link the constituent image quality attributes with the physical parameters of smart mobile devices and image appearance factors. The procedure and algorithms are applicable to various smart mobile devices, different lighting conditions, and multiple types of images, and their performance was verified against the visual data.
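To illustrate the regression step the abstract describes, the sketch below fits a multiple linear regression predicting overall image quality from two constituent attributes; the attribute choice (naturalness and sharpness) and the scores are invented example data, not the study's measurements.

```python
# Multiple linear regression of overall image quality on two constituent attributes
# (invented example scores on a categorical-judgment scale).
import numpy as np

# columns: naturalness, sharpness (mean judgment scores per test image)
X = np.array([[3.1, 4.0], [4.2, 4.5], [2.0, 2.8], [3.8, 3.2], [4.6, 4.8], [2.5, 3.5]])
y = np.array([3.4, 4.5, 2.2, 3.3, 4.8, 2.9])           # overall quality scores

A = np.column_stack([np.ones(len(X)), X])               # add intercept column
coef, *_ = np.linalg.lstsq(A, y, rcond=None)            # ordinary least squares fit
pred = A @ coef

print("intercept, b_naturalness, b_sharpness:", np.round(coef, 3))
print("R^2:", round(1 - np.sum((y - pred)**2) / np.sum((y - y.mean())**2), 3))
```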
Yan, Bin-Jun; Guo, Zheng-Tai; Qu, Hai-Bin; Zhao, Bu-Chang; Zhao, Tao
2013-06-01
In this work, a feedforward control strategy based on the concept of quality by design was established for the manufacturing of traditional Chinese medicine, in order to reduce the impact of quality variation in raw materials on the drug product. The ethanol precipitation process of Danhong injection was taken as an application case of the established method. A Box-Behnken design of experiments was conducted, and mathematical models relating the attributes of the concentrate, the process parameters and the quality of the supernatants produced were established. An optimization model for calculating the best process parameters based on the attributes of the concentrate was then built. The quality of the supernatants produced by ethanol precipitation with optimized and non-optimized process parameters was compared. The results showed that using the feedforward control strategy for process parameter optimization can control the quality of the supernatants effectively. The proposed feedforward control strategy can enhance the batch-to-batch consistency of the supernatants produced by ethanol precipitation.
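A minimal sketch of the feedforward idea, assuming a hypothetical quadratic response surface of the kind a Box-Behnken design might yield: given a measured attribute of the incoming concentrate, the process parameters are chosen to maximize the predicted quality index. The coefficients, variable names and bounds are illustrative assumptions, not the study's fitted model.

```python
# Feedforward parameter selection from a hypothetical fitted response surface.
import numpy as np
from scipy.optimize import minimize

def predicted_quality(x, density):
    """Hypothetical quadratic model: x = [ethanol_fraction, addition_rate_L_min]."""
    e, r = x
    return (2.0 + 3.5*e - 0.9*r - 2.1*e**2 - 0.05*r**2 + 0.4*e*r
            - 1.5*(density - 1.10)*e)           # assumed interaction with concentrate density

density_measured = 1.12                          # measured attribute of this batch's concentrate
res = minimize(lambda x: -predicted_quality(x, density_measured),
               x0=[0.7, 3.0], bounds=[(0.5, 0.9), (1.0, 6.0)])

print("recommended ethanol fraction, addition rate:", np.round(res.x, 2))
print("predicted quality index:", round(-res.fun, 3))
```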
Linking Air Quality and Human Health Effects Models: An Application to the Los Angeles Air Basin
Stewart, Devoun R; Saunders, Emily; Perea, Roberto A; Fitzgerald, Rosa; Campbell, David E; Stockwell, William R
2017-01-01
Proposed emission control strategies for reducing ozone and particulate matter are evaluated better when air quality and health effects models are used together. The Community Multiscale Air Quality (CMAQ) model is the US Environmental Protection Agency’s model for determining public policy and forecasting air quality. CMAQ was used to forecast air quality changes due to several emission control strategies that could be implemented between 2008 and 2030 for the South Coast Air Basin that includes Los Angeles. The Environmental Benefits Mapping and Analysis Program—Community Edition (BenMAP-CE) was used to estimate health and economic impacts of the different emission control strategies based on CMAQ simulations. BenMAP-CE is a computer program based on epidemiologic studies that link human health and air quality. This modeling approach is better for determining optimum public policy than approaches that only examine concentration changes. PMID:29162976
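Health impact tools of the BenMAP-CE type commonly apply a log-linear concentration-response function of the form delta_cases = incidence * (1 - exp(-beta * delta_C)) * population; the sketch below shows that calculation with placeholder inputs that are not results from this study.

```python
# Log-linear concentration-response calculation of the kind used in BenMAP-type
# analyses; beta, incidence and population values are placeholders, not study results.
import math

beta = 0.0004           # assumed concentration-response coefficient (per ug/m3)
delta_c = -3.0          # modeled change in annual PM2.5 (ug/m3); negative = reduction
incidence = 0.008       # assumed baseline annual incidence rate of the health endpoint
population = 2_000_000  # assumed exposed population

reduction = -delta_c    # size of the concentration reduction
avoided_cases = incidence * (1.0 - math.exp(-beta * reduction)) * population
print(f"estimated avoided cases per year: {avoided_cases:.0f}")
```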
Applications of MIDAS regression in analysing trends in water quality
NASA Astrophysics Data System (ADS)
Penev, Spiridon; Leonte, Daniela; Lazarov, Zdravetz; Mann, Rob A.
2014-04-01
We discuss novel statistical methods in analysing trends in water quality. Such analysis uses complex data sets of different classes of variables, including water quality, hydrological and meteorological. We analyse the effect of rainfall and flow on trends in water quality utilising a flexible model called Mixed Data Sampling (MIDAS). This model arises because of the mixed frequency in the data collection. Typically, water quality variables are sampled fortnightly, whereas the rain data is sampled daily. The advantage of using MIDAS regression is in the flexible and parsimonious modelling of the influence of the rain and flow on trends in water quality variables. We discuss the model and its implementation on a data set from the Shoalhaven Supply System and Catchments in the state of New South Wales, Australia. Information criteria indicate that MIDAS modelling improves upon simplistic approaches that do not utilise the mixed data sampling nature of the data.
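A toy sketch of the MIDAS idea discussed above: a fortnightly water quality variable is regressed on daily rainfall aggregated with exponential Almon lag weights, with the weight parameters fitted by nonlinear least squares. The data, lag length and single-regressor setup are invented for illustration and are not the Shoalhaven analysis.

```python
# MIDAS-style regression: low-frequency response on high-frequency regressor
# aggregated with exponential Almon lag weights (synthetic data).
import numpy as np
from scipy.optimize import least_squares

rng = np.random.default_rng(3)
n_low, lag = 60, 14                                  # 60 fortnightly samples, 14 daily lags
rain = rng.gamma(1.5, 4.0, size=n_low * lag)         # daily rainfall series
rain_lags = rain.reshape(n_low, lag)[:, ::-1]        # most recent day first in each row

def almon_weights(theta1, theta2, k=lag):
    j = np.arange(1, k + 1)
    w = np.exp(theta1 * j + theta2 * j**2)
    return w / w.sum()

true_w = almon_weights(0.2, -0.05)
y = 2.0 + 0.3 * rain_lags @ true_w + rng.normal(0, 0.2, n_low)   # synthetic water quality variable

def residuals(p):
    b0, b1, t1, t2 = p
    return y - (b0 + b1 * rain_lags @ almon_weights(t1, t2))

fit = least_squares(residuals, x0=[0.0, 0.1, 0.0, 0.0])
print("fitted (b0, b1, theta1, theta2):", np.round(fit.x, 3))
```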
High-quality, daily meteorological data at high spatial resolution are essential for a variety of hydrologic and ecological modeling applications that support environmental risk assessments and decision making. This paper describes the development, application, and assessment of ...
This chapter reviews the regulatory background and policy applications driving the use of various types of environmental fate and bioaccumulation models at US EPA (air quality, surface water and watersheds, contaminated sites). Comparing current research frontiers with contempora...
Close-range photogrammetry for aircraft quality control
NASA Astrophysics Data System (ADS)
Schwartz, D. S.
Close range photogrammetry is applicable to quality assurance inspections, design data acquisition, and test management support tasks, yielding significant cost avoidance and increased productivity. An understanding of mensuration parameters and their related accuracies is fundamental to the successful application of industrial close range photogrammetry. Attention is presently given to these parameters and to the use of computer modelling as an aid to the photogrammetric entrepreneur in industry. Suggested improvements to cameras and film readers for industrial applications are discussed.
Hanna, R. Blair; Campbell, Sharon G.
2000-01-01
This report describes the water quality model developed for the Klamath River System Impact Assessment Model (SIAM). The Klamath River SIAM is a decision support system developed by the authors and other US Geological Survey (USGS), Midcontinent Ecological Science Center staff to study the effects of basin-wide water management decisions on anadromous fish in the Klamath River. The Army Corps of Engineers' HEC5Q water quality modeling software was used to simulate water temperature, dissolved oxygen and conductivity in 100 miles of the Klamath River Basin in Oregon and California. The water quality model simulated three reservoirs and the mainstem Klamath River influenced by the Shasta and Scott River tributaries. Model development, calibration and two validation exercises are described as well as the integration of the water quality model into the SIAM decision support system software. Within SIAM, data are exchanged between the water quantity model (MODSIM), the water quality model (HEC5Q), the salmon population model (SALMOD) and methods for evaluating ecosystem health. The overall predictive ability of the water quality model is described in the context of calibration and validation error statistics. Applications of SIAM and the water quality model are described.
Users manual for a one-dimensional Lagrangian transport model
Schoellhamer, D.H.; Jobson, H.E.
1986-01-01
A Users Manual for the Lagrangian Transport Model (LTM) is presented. The LTM uses Lagrangian calculations that are based on a reference frame moving with the river flow. The Lagrangian reference frame eliminates the need to numerically solve the convective term of the convection-diffusion equation and provides significant numerical advantages over the more commonly used Eulerian reference frame. When properly applied, the LTM can simulate riverine transport and decay processes within the accuracy required by most water quality studies. The LTM is applicable to steady or unsteady one-dimensional unidirectional flows in fixed channels with tributary and lateral inflows. Application of the LTM is relatively simple, and optional capabilities improve the model's convenience. Appendices give file formats and three example LTM applications that include the incorporation of the QUAL II water quality model's reaction kinetics into the LTM. (Author's abstract)
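A highly simplified illustration of the Lagrangian viewpoint described above (not the LTM code itself): a single parcel is advected downstream with the flow and only first-order decay is evaluated in the moving frame, so no convective term has to be solved numerically. The velocity, decay rate and time step are arbitrary.

```python
# Simplified Lagrangian parcel tracking with first-order decay (illustrative only).
import numpy as np

velocity = 0.5        # river velocity (m/s), assumed steady and uniform
k_decay = 1e-5        # first-order decay rate (1/s)
dt = 600.0            # time step (s)
x, conc = 0.0, 10.0   # parcel position (m) and concentration (mg/L)

for step in range(12):
    x += velocity * dt                 # advect the parcel with the flow
    conc *= np.exp(-k_decay * dt)      # apply decay in the moving (Lagrangian) frame
    print(f"t = {(step + 1) * dt / 3600:4.1f} h  x = {x / 1000:6.2f} km  C = {conc:5.2f} mg/L")
```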
Quality Assurance in E-Learning: PDPP Evaluation Model and Its Application
ERIC Educational Resources Information Center
Zhang, Weiyuan; Cheng, Y. L.
2012-01-01
E-learning has become an increasingly important teaching and learning mode in educational institutions and corporate training. The evaluation of e-learning, however, is essential for the quality assurance of e-learning courses. This paper constructs a four-phase evaluation model for e-learning courses, which includes planning, development,…
Enhancing E-Learning Quality through the Application of the AKUE Procedure Model
ERIC Educational Resources Information Center
Bremer, C.
2012-01-01
The paper describes the procedure model AKUE, which aims at the improvement and assurance of quality and cost efficiency in the context of the introduction of e-learning and the development of digital learning material. AKUE divides the whole planning and implementation process into four different phases: analysis, conception, implementation, and…
Quality assessment for color reproduction using a blind metric
NASA Astrophysics Data System (ADS)
Bringier, B.; Quintard, L.; Larabi, M.-C.
2007-01-01
This paper deals with image quality assessment, a field that nowadays plays an important role in various image processing applications. A number of objective image quality metrics, which may or may not correlate with subjective quality, have been developed during the last decade. Two categories of metrics can be distinguished: full-reference and no-reference. A full-reference metric tries to evaluate the distortion introduced to an image with respect to a reference. A no-reference approach attempts to model the judgment of image quality in a blind way. Unfortunately, a universal image quality model is not on the horizon, and empirical models established through psychophysical experimentation are generally used. In this paper, we focus only on the second category to evaluate the quality of color reproduction, introducing a blind metric based on human visual system modeling. The objective results are validated by single-media and cross-media subjective tests.
[Quality assurance of the renal applications software].
del Real Núñez, R; Contreras Puertas, P I; Moreno Ortega, E; Mena Bares, L M; Maza Muret, F R; Latre Romero, J M
2007-01-01
The need for quality assurance of all technical aspects of nuclear medicine studies is widely recognised. However, little attention has been paid to the quality assurance of the applications software. Our work reported here aims at verifying the analysis software for the processing of renal nuclear medicine studies (renograms). The software tools were used to build a synthetic dynamic model of the renal system. The model consists of two phases: perfusion and function. The organs of interest (kidneys, bladder and aortic artery) were simple geometric forms. The uptake of the renal structures was described by mathematical functions. Curves corresponding to normal or pathological conditions were simulated for the kidneys, bladder and aortic artery by appropriate selection of parameters. There was no difference between the parameters of the mathematical curves and the quantitative data produced by the renal analysis program. Our test procedure is simple to apply, reliable, reproducible and rapid for verifying the renal applications software.
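As a sketch of how such a synthetic model can be built, the code below generates kidney time-activity curves from a simple uptake-times-washout function and derives two common renogram indices; the functional form, parameter values and the "obstructed" variant are illustrative assumptions, not the validated phantom described in the paper.

```python
# Synthetic kidney time-activity curves from a simple uptake-times-washout function.
import numpy as np

t = np.arange(0, 20 * 60, 10.0)                    # 20-minute study sampled every 10 s

def kidney_curve(t, k_in=1/120.0, k_out=1/400.0):
    """Uptake toward a plateau multiplied by exponential washout (arbitrary units)."""
    return (1.0 - np.exp(-k_in * t)) * np.exp(-k_out * t)

normal = kidney_curve(t)
obstructed = kidney_curve(t, k_out=0.0)            # no washout -> curve rises to a plateau

print("time-to-peak (normal):", round(t[np.argmax(normal)] / 60.0, 1), "min")
print("20-min/peak ratio: normal", round(normal[-1] / normal.max(), 2),
      "obstructed", round(obstructed[-1] / obstructed.max(), 2))
```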
NASA Astrophysics Data System (ADS)
Wang, S.
2014-12-01
Atmospheric ammonia (NH3) plays an important role in fine particle formation, and accurate estimates of ammonia can reduce uncertainties in air quality modeling. China is one of the largest ammonia-emitting countries, with the majority of NH3 emissions coming from agricultural practices such as fertilizer application and animal operations. Current ammonia emission estimates in China are mainly based on pre-defined emission factors, so there are considerable uncertainties in estimating NH3 emissions, especially in their distribution in time and space. For example, fertilizer applications vary in the date of application and amount by geographical region and crop type. In this study, the NH3 emission from agricultural fertilizer use in China in 2011 was estimated online by an agricultural fertilizer modeling system coupling a regional air quality model and an agro-ecosystem model, which contains three main components: 1) the Environmental Policy Integrated Climate (EPIC) model, 2) the meso-scale meteorology Weather Research and Forecasting (WRF) model, and 3) the CMAQ air quality model with bi-directional ammonia fluxes. EPIC output on daily fertilizer application and soil characteristics serves as input to the CMAQ model. In order to run the EPIC model, a large amount of local Chinese information was collected and processed. For example, crop land data are computed from MODIS land use data at 500-m resolution and crop categories at the Chinese county level, and the fertilizer use rates for different fertilizer types, crops and provinces are obtained from Chinese statistical materials. The system takes into consideration many factors influencing agricultural ammonia emissions, including weather and the fertilizer application method, timing, amount, and rate for specific pastures and crops. The simulated fertilizer data are compared with NH3 emissions and fertilizer application data from other sources. The results of the CMAQ modeling are also discussed and analyzed against field measurements. The estimated agricultural fertilizer NH3 emission in this study is about 3 Tg for 2011. The regions with the highest emission rates are located in the North China Plain. Monthly, the peak ammonia emissions occur from April to July.
Evaluative methodology for comprehensive water quality management planning
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dyer, H. L.
Computer-based evaluative methodologies have been developed to provide for the analysis of coupled phenomena associated with comprehensive natural resource planning requirements. Provisions for planner/computer interaction have been included. Each of the simulation models developed is described in terms of its coded procedures. An application of the models to water quality management planning is presented, and the data requirements for each of the models are noted.
ERIC Educational Resources Information Center
Gomez, Laura E.; Arias, Benito; Verdugo, Miguel Angel; Navas, Patricia
2012-01-01
Background: Most instruments that assess quality of life have been validated by means of the classical test theory (CTT). However, CTT limitations have resulted in the development of alternative models, such as the Rasch rating scale model (RSM). The main goal of this paper is testing and improving the psychometric properties of the INTEGRAL…
NASA Technical Reports Server (NTRS)
Engwirda, Darren
2017-01-01
An algorithm for the generation of non-uniform, locally orthogonal staggered unstructured spheroidal grids is described. This technique is designed to generate very high-quality staggered Voronoi-Delaunay meshes appropriate for general circulation modelling on the sphere, including applications to atmospheric simulation, ocean-modelling and numerical weather prediction. Using a recently developed Frontal-Delaunay refinement technique, a method for the construction of high-quality unstructured spheroidal Delaunay triangulations is introduced. A locally orthogonal polygonal grid, derived from the associated Voronoi diagram, is computed as the staggered dual. It is shown that use of the Frontal-Delaunay refinement technique allows for the generation of very high-quality unstructured triangulations, satisfying a priori bounds on element size and shape. Grid quality is further improved through the application of hill-climbing-type optimisation techniques. Overall, the algorithm is shown to produce grids with very high element quality and smooth grading characteristics, while imposing relatively low computational expense. A selection of uniform and non-uniform spheroidal grids appropriate for high-resolution, multi-scale general circulation modelling are presented. These grids are shown to satisfy the geometric constraints associated with contemporary unstructured C-grid-type finite-volume models, including the Model for Prediction Across Scales (MPAS-O). The use of user-defined mesh-spacing functions to generate smoothly graded, non-uniform grids for multi-resolution-type studies is discussed in detail.
NASA Astrophysics Data System (ADS)
Engwirda, Darren
2017-06-01
An algorithm for the generation of non-uniform, locally orthogonal staggered unstructured spheroidal grids is described. This technique is designed to generate very high-quality staggered Voronoi-Delaunay meshes appropriate for general circulation modelling on the sphere, including applications to atmospheric simulation, ocean-modelling and numerical weather prediction. Using a recently developed Frontal-Delaunay refinement technique, a method for the construction of high-quality unstructured spheroidal Delaunay triangulations is introduced. A locally orthogonal polygonal grid, derived from the associated Voronoi diagram, is computed as the staggered dual. It is shown that use of the Frontal-Delaunay refinement technique allows for the generation of very high-quality unstructured triangulations, satisfying a priori bounds on element size and shape. Grid quality is further improved through the application of hill-climbing-type optimisation techniques. Overall, the algorithm is shown to produce grids with very high element quality and smooth grading characteristics, while imposing relatively low computational expense. A selection of uniform and non-uniform spheroidal grids appropriate for high-resolution, multi-scale general circulation modelling are presented. These grids are shown to satisfy the geometric constraints associated with contemporary unstructured C-grid-type finite-volume models, including the Model for Prediction Across Scales (MPAS-O). The use of user-defined mesh-spacing functions to generate smoothly graded, non-uniform grids for multi-resolution-type studies is discussed in detail.
Statistical study of air pollutant concentrations via generalized gamma distribution
DOE Office of Scientific and Technical Information (OSTI.GOV)
Marani, A.; Lavagnini, I.; Buttazzoni, C.
1986-11-01
This paper deals with modeling observed frequency distributions of air quality data measured in the area of Venice, Italy. The paper discusses the application of the generalized gamma distribution (ggd), which has not been commonly applied to air quality data notwithstanding the fact that it embodies most distribution models used for air quality analyses. The approach yields important simplifications for statistical analyses. A comparison among the ggd and other relevant models (standard gamma, Weibull, lognormal), carried out on daily sulfur dioxide concentrations in the area of Venice, underlines the efficiency of ggd models in portraying experimental data.
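The paper does not reproduce its fitted parameters, but the general workflow it describes, fitting a generalized gamma distribution to daily concentration data and comparing it against simpler candidates, can be sketched as follows. The synthetic SO2 series and the use of scipy are illustrative assumptions, not the authors' code or data.

```python
# Sketch: fit a generalized gamma distribution (and simpler candidates) to
# daily pollutant concentrations, then compare fits by log-likelihood.
# The synthetic SO2 data below are purely illustrative.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
so2 = rng.lognormal(mean=np.log(40.0), sigma=0.6, size=365)   # stand-in daily SO2 (ug/m3)

candidates = {
    "generalized gamma": stats.gengamma,
    "gamma": stats.gamma,
    "weibull": stats.weibull_min,
    "lognormal": stats.lognorm,
}

for name, dist in candidates.items():
    params = dist.fit(so2, floc=0)               # fix location at zero for concentration data
    loglik = np.sum(dist.logpdf(so2, *params))   # higher is better
    print(f"{name:18s} log-likelihood = {loglik:8.1f}")
```

A higher log-likelihood (or a formal criterion such as AIC) would indicate which candidate distribution portrays the observed concentrations best.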
Useful measures and models for analytical quality management in medical laboratories.
Westgard, James O
2016-02-01
The 2014 Milan Conference "Defining analytical performance goals 15 years after the Stockholm Conference" initiated a new discussion of issues concerning goals for precision, trueness or bias, total analytical error (TAE), and measurement uncertainty (MU). Goal-setting models are critical for analytical quality management, along with error models, quality-assessment models, quality-planning models, as well as comprehensive models for quality management systems. There are also critical underlying issues, such as an emphasis on MU to the possible exclusion of TAE and a corresponding preference for separate precision and bias goals instead of a combined total error goal. This opinion recommends careful consideration of the differences in the concepts of accuracy and traceability and the appropriateness of different measures, particularly TAE as a measure of accuracy and MU as a measure of traceability. TAE is essential to manage quality within a medical laboratory and MU and trueness are essential to achieve comparability of results across laboratories. With this perspective, laboratory scientists can better understand the many measures and models needed for analytical quality management and assess their usefulness for practical applications in medical laboratories.
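As a rough numerical illustration of the distinction drawn above, total analytical error is often estimated by combining bias and imprecision, while measurement uncertainty is usually reported as an expanded standard uncertainty. The sketch below uses the common TAE = |bias| + 1.65·CV convention and a coverage factor of 2; all numbers are made up and are not from the paper.

```python
# Sketch: conventional point estimates of total analytical error (TAE) and
# expanded measurement uncertainty (MU). Numbers are illustrative only.
bias_pct = 1.2       # trueness component, % (assumed)
cv_pct = 2.0         # imprecision (CV), % (assumed)
u_standard = 2.3     # combined standard uncertainty, % (assumed)

tae = abs(bias_pct) + 1.65 * cv_pct      # one-sided 95% convention often used in QC planning
mu_expanded = 2.0 * u_standard           # coverage factor k = 2 (~95% coverage)

print(f"TAE ~= {tae:.1f}%   expanded MU ~= {mu_expanded:.1f}%")
```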
Development of a cloud-based application for the Fracture Liaison Service model of care.
Holzmueller, C G; Karp, S; Zeldow, D; Lee, D B; Thompson, D A
2016-02-01
The aims of this study are to develop a cloud-based application of the Fracture Liaison Service for practitioners to coordinate the care of osteoporotic patients after suffering primary fractures and provide a performance feedback portal for practitioners to determine quality of care. The application provides continuity of care, improved patient outcomes, and reduced medical costs. The purpose of this study is to describe the content development and functionality of a cloud-based application to broadly deploy the Fracture Liaison Service (FLS) to coordinate post-fracture care for osteoporotic patients. The Bone Health Collaborative developed the FLS application in 2013 to support practitioners' access to information and management of patients and provide a feedback portal for practitioners to track their performance in providing quality care. A five-step protocol (identify, inform, initiate, investigate, and iterate) organized osteoporotic post-fracture care-related tasks and timelines for the application. A range of descriptive data about the patient, their medical condition, therapies and care, and current providers can be collected. Seven quality of care measures from the National Quality Forum, The Joint Commission, and the Centers for Medicare and Medicaid Services can be tracked through the application. There are five functional areas including home, tasks, measures, improvement, and data. The home, tasks, and data pages are used to enter patient information and coordinate care using the five-step protocol. Measures and improvement pages are used to enter quality measures and provide practitioners with continuous performance feedback. The application resides within a portal, running on a multitenant, private cloud-based Avedis enterprise registry platform. All data are encrypted in transit and users access the application using a password from any common web browser. The application could spread the FLS model of care across the US health care system, provide continuity of care, effectively manage osteoporotic patients, improve outcomes, and reduce medical costs.
Value for money of changing healthcare services? Economic evaluation of quality improvement
Severens, J
2003-01-01
There are many instances of perceived or real inefficiencies in health service delivery. Both healthcare providers and policy makers need to know the impact and cost of applying strategies to change the behaviour of individuals or organisations. Quality improvement or implementation research is concerned with evaluating the methods of behavioural change. Addressing inefficiencies in healthcare services raises a series of issues, beginning with how inefficiency itself should be defined. The basic concepts of cost analysis and economic evaluations are explained and a model for working through the economic issues of quality improvement is discussed. This model combines the costs and benefits of corrected inefficiency with the costs and degree of behavioural change achieved by a quality improvement method in the policy maker's locality. It shows why it may not always be cost effective for policy makers to address suboptimal behaviour. Both the interpretation of quality improvement research findings and their local application need careful consideration. The limited availability of applicable quality improvement research may make it difficult to provide robust advice on the value for money of many behavioural quality improvement strategies. PMID:14532369
Preservation of protein clefts in comparative models.
Piedra, David; Lois, Sergi; de la Cruz, Xavier
2008-01-16
Comparative, or homology, modelling of protein structures is the most widely used prediction method when the target protein has homologues of known structure. Given that the quality of a model may vary greatly, several studies have been devoted to identifying the factors that influence modelling results. These studies usually consider the protein as a whole, and only a few provide a separate discussion of the behaviour of biologically relevant features of the protein. Given the value of the latter for many applications, here we extended previous work by analysing the preservation of native protein clefts in homology models. We chose to examine clefts because of their role in protein function/structure, as they are usually the locus of protein-protein interactions, host the enzymes' active site, or, in the case of protein domains, can also be the locus of domain-domain interactions that lead to the structure of the whole protein. We studied how the largest cleft of a protein varies in comparative models. To this end, we analysed a set of 53507 homology models that cover the whole sequence identity range, with a special emphasis on medium and low similarities. More precisely, we examined how cleft quality - measured using six complementary parameters related to both global shape and local atomic environment - depends on the sequence identity between target and template proteins. In addition to this general analysis, we also explored the impact of a number of factors on cleft quality, and found that the relationship between quality and sequence identity varies depending on cleft rank amongst the set of protein clefts (when ordered according to size) and on the number of aligned residues. We have examined cleft quality in homology models at a range of sequence identity levels. Our results provide a detailed view of how quality is affected by distinct parameters and thus may help the user of comparative modelling to determine the final quality and applicability of his/her cleft models. In addition, the large variability in model quality that we observed within each sequence bin, with good models present even at low sequence identities (between 20% and 30%), indicates that properly developed identification methods could be used to recover good cleft models in this sequence range.
Application of Six Sigma Model to Evaluate the Analytical Quality of Four HbA1c Analyzers.
Maesa, José M; Fernández-Riejos, Patricia; Sánchez-Mora, Catalina; Toro-Crespo, María de; González-Rodriguez, Concepción
2017-01-01
The Six Sigma Model is a global quality management system applicable to the determination of glycated hemoglobin (HbA1c). In addition, this model can account for the three characteristics influencing patient risk: the correct performance of the analytical method, with low inaccuracy and bias; the quality control strategy used by the laboratory; and the necessary quality of the analyte. The aim of this study is to use the Six Sigma Model to evaluate quality criteria in the determination of HbA1c and to apply it in assessing four different HbA1c analyzers. Four HbA1c analyzers were evaluated: HA-8180V®, D-100®, G8®, and Variant II Turbo®. For 20 consecutive days, two levels of quality control (high and low) provided by the manufacturers were measured on each of the instruments. Imprecision (CV), bias, and Sigma values (σ) were calculated from the data obtained, and a method decision chart was developed considering a range of quality requirements (allowable total error, TEa). For a TEa = 3%, HA-8180V = 1.54 σ, D-100 = 1.63 σ, G8 = 2.20 σ, and Variant II Turbo = -0.08 σ. For a TEa = 4%, HA-8180V = 2.34 σ, D-100 = 2.32 σ, G8 = 3.74 σ, and Variant II Turbo = 0.16 σ. For a TEa = 10%, HA-8180V = 7.12 σ, D-100 = 6.46 σ, G8 = 13.0 σ, and Variant II Turbo = 1.56 σ. Applying the Stockholm consensus and its subsequent Milan review to these results, the maximum quality requirement for HbA1c is an allowable total error (TEa) of 3%; at that level, G8 falls in the 2 σ region (2.20), which is a poor result, and HA-8180V and D-100 both fall in the 1 σ region (1.54 and 1.63, respectively), which represents unacceptable analytical performance.
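The sigma values reported above are consistent with the standard sigma-metric calculation, sigma = (TEa - |bias|) / CV, evaluated at each allowable total error. A minimal sketch is given below; the bias and CV values are illustrative placeholders, since the study's raw imprecision data are not reproduced here.

```python
# Sketch: Six Sigma metric for an HbA1c analyzer at several allowable total
# error (TEa) requirements. The bias and CV values below are illustrative,
# not the ones measured in the study.
def sigma_metric(tea_pct: float, bias_pct: float, cv_pct: float) -> float:
    """sigma = (TEa - |bias|) / CV, all expressed in percent."""
    return (tea_pct - abs(bias_pct)) / cv_pct

bias, cv = 0.8, 1.1   # assumed analyzer performance (%)
for tea in (3.0, 4.0, 10.0):
    print(f"TEa = {tea:4.1f}%  ->  sigma = {sigma_metric(tea, bias, cv):5.2f}")
```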
Brewer, Shannon K.; Worthington, Thomas; Mollenhauer, Robert; Stewart, David; McManamay, Ryan; Guertault, Lucie; Moore, Desiree
2018-01-01
Ecohydrology combines empiricism, data analytics, and the integration of models to characterize linkages between ecological and hydrological processes. A challenge for practitioners is determining which models best generalize heterogeneity in hydrological behaviour, including water fluxes across spatial and temporal scales, while integrating environmental and socio-economic activities to determine best watershed management practices and data requirements. We conducted a literature review and synthesis of hydrologic, hydraulic, water quality, and ecological models designed for solving interdisciplinary questions. We reviewed 1,275 papers and identified 178 models that have the capacity to answer an array of research questions about ecohydrology or ecohydraulics. Of these models, 43 were commonly applied due to their versatility, accessibility, user-friendliness, and excellent user support. Forty-one of the 43 reviewed models were linked to at least 1 other model, especially: Water Quality Analysis Simulation Program (linked to 21 other models), Soil and Water Assessment Tool (19), and Hydrologic Engineering Center's River Analysis System (15). However, model integration was still relatively infrequent. There was substantial variation in model applications, possibly an artefact of the regional focus of research questions, simplicity of use, quality of user-support efforts, or a limited understanding of model applicability. Simply increasing the interoperability of model platforms, transformation of models to user-friendly forms, increasing user support, defining the reliability and risk associated with model results, and increasing awareness of model applicability may promote increased use of models across subdisciplines. Nonetheless, the current availability of models allows an array of interdisciplinary questions to be addressed, and model choice relates to several factors including research objective, model complexity, ability to link to other models, and interface choice.
The Kubler-Ross model, physician distress, and performance reporting.
Smaldone, Marc C; Uzzo, Robert G
2013-07-01
Physician performance reporting has been proposed as an essential component of health-care reform, with the aim of improving quality by providing transparency and accountability. Despite strong evidence demonstrating regional variation in practice patterns and lack of evidence-based care, public outcomes reporting has been met with resistance from medical professionals. Application of the Kubler-Ross 'five stages of grief' model--a conceptual framework consisting of a series of emotional stages (denial, anger, bargaining, depression, and acceptance) inspired by work with terminally ill patients--could provide some insight into why physicians are reluctant to accept emerging quality-reporting mechanisms. Physician-led quality-improvement initiatives are vital to contemporary health-care reform efforts and applications in urology, as well as other medical disciplines, are currently being explored.
Air quality surfaces representing pollutant concentrations across space and time are needed for many applications, including tracking trends and relating air quality to human and ecosystem health. The spatial and temporal characteristics of these surfaces may reveal new informat...
ISO 9000 Quality Systems: Application to Higher Education.
ERIC Educational Resources Information Center
Clery, Roger G.
This paper describes and explains the 20 elements of the International Organization for Standardization (ISO) 9000 series, a model for quality assurance in the business processes of design/development, production, installation and servicing. The standards were designed in 1987 to provide a common denominator for business quality particularly to…
A Linear Algebra Measure of Cluster Quality.
ERIC Educational Resources Information Center
Mather, Laura A.
2000-01-01
Discussion of models for information retrieval focuses on an application of linear algebra to text clustering, namely, a metric for measuring cluster quality based on the theory that cluster quality is proportional to the number of terms that are disjoint across the clusters. Explains term-document matrices and clustering algorithms. (Author/LRW)
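The abstract describes the metric only at a high level (cluster quality proportional to the number of terms that are disjoint across clusters), so the following is one plausible reading rather than the author's exact formulation: given a binary term-document matrix and a cluster assignment, score the clustering by the fraction of terms that occur in documents of only one cluster.

```python
# Sketch: a simple cluster-quality score based on term disjointness.
# A term is "disjoint" if it appears in documents of exactly one cluster;
# quality is the fraction of used terms that are disjoint. This is an
# illustrative interpretation of the idea, not the paper's exact metric.
import numpy as np

def disjoint_term_quality(term_doc: np.ndarray, labels: np.ndarray) -> float:
    """term_doc: (n_terms, n_docs) binary matrix; labels: cluster id per document."""
    clusters = np.unique(labels)
    # For each term, count in how many distinct clusters it occurs.
    occurs_in = np.zeros(term_doc.shape[0], dtype=int)
    for c in clusters:
        occurs_in += (term_doc[:, labels == c].sum(axis=1) > 0).astype(int)
    used = occurs_in > 0                      # ignore terms absent from all documents
    return float(np.mean(occurs_in[used] == 1))

# Toy example: 5 terms x 4 documents, two clusters.
X = np.array([[1, 1, 0, 0],
              [0, 1, 0, 0],
              [0, 0, 1, 1],
              [1, 0, 0, 1],   # shared across both clusters -> not disjoint
              [0, 0, 0, 1]])
print(disjoint_term_quality(X, np.array([0, 0, 1, 1])))  # 4 of 5 terms disjoint -> 0.8
```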
Applications of artificial neural networks (ANNs) in food science.
Huang, Yiqun; Kangas, Lars J; Rasco, Barbara A
2007-01-01
Artificial neural networks (ANNs) have been applied in almost every aspect of food science over the past two decades, although most applications are in the development stage. ANNs are useful tools for food safety and quality analyses, which include modeling microbial growth and from this predicting food safety, interpreting spectroscopic data, and predicting the physical, chemical, functional and sensory properties of various food products during processing and distribution. ANNs hold a great deal of promise for modeling complex tasks in process control and simulation, and in applications of machine perception, including machine vision and electronic noses, for food safety and quality control. This review discusses the basic theory of ANN technology and its applications in food science, providing food scientists and the research community with an overview of current research and future trends in the application of ANN technology in the field.
Prediction of aircraft handling qualities using analytical models of the human pilot
NASA Technical Reports Server (NTRS)
Hess, R. A.
1982-01-01
The optimal control model (OCM) of the human pilot is applied to the study of aircraft handling qualities. Attention is focused primarily on longitudinal tasks. The modeling technique differs from previous applications of the OCM in that considerable effort is expended in simplifying the pilot/vehicle analysis. After briefly reviewing the OCM, a technique for modeling the pilot controlling higher order systems is introduced. Following this, a simple criterion for determining the susceptibility of an aircraft to pilot-induced oscillations (PIO) is formulated. Finally, a model-based metric for pilot rating prediction is discussed. The resulting modeling procedure provides a relatively simple, yet unified approach to the study of a variety of handling qualities problems.
Prediction of aircraft handling qualities using analytical models of the human pilot
NASA Technical Reports Server (NTRS)
Hess, R. A.
1982-01-01
The optimal control model (OCM) of the human pilot is applied to the study of aircraft handling qualities. Attention is focused primarily on longitudinal tasks. The modeling technique differs from previous applications of the OCM in that considerable effort is expended in simplifying the pilot/vehicle analysis. After briefly reviewing the OCM, a technique for modeling the pilot controlling higher order systems is introduced. Following this, a simple criterion for determining the susceptibility of an aircraft to pilot induced oscillations is formulated. Finally, a model based metric for pilot rating prediction is discussed. The resulting modeling procedure provides a relatively simple, yet unified approach to the study of a variety of handling qualities problems.
Stormwater quality modelling in combined sewers: calibration and uncertainty analysis.
Kanso, A; Chebbo, G; Tassin, B
2005-01-01
Estimating the level of uncertainty in urban stormwater quality models is vital for their utilization. This paper presents the results of applying a Markov chain Monte Carlo method based on Bayesian theory to the calibration and uncertainty analysis of a stormwater quality model commonly used in available software. The tested model uses a hydrologic/hydrodynamic scheme to estimate the accumulation, erosion and transport of pollutants on surfaces and in sewers. It was calibrated for four different initial conditions of in-sewer deposits. Calibration results showed large variability in the model's responses as a function of the initial conditions. They demonstrated that the model's predictive capacity is very low.
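The paper's likelihood and model equations are not reproduced in the abstract, so the sketch below only illustrates the general Metropolis-type calibration loop applied to a toy buildup/washoff formulation; the model form, priors, error model and synthetic data are all assumptions.

```python
# Sketch: Metropolis MCMC calibration of a toy buildup/washoff water quality
# model against observed event loads. Model form, priors and data are
# illustrative stand-ins, not the model or data used in the paper.
import numpy as np

rng = np.random.default_rng(1)

def washoff_load(params, dry_days, runoff_mm):
    """Toy model: mass builds up over dry days, a runoff-dependent fraction washes off."""
    accu, k_wash = params
    buildup = accu * (1.0 - np.exp(-0.5 * dry_days))        # kg/ha available
    return buildup * (1.0 - np.exp(-k_wash * runoff_mm))    # kg/ha exported

# Synthetic "observations" generated from known parameters plus noise.
dry_days = rng.uniform(1, 15, size=30)
runoff = rng.uniform(2, 40, size=30)
obs = washoff_load((3.0, 0.08), dry_days, runoff) * rng.lognormal(0, 0.15, size=30)

def log_posterior(params):
    accu, k_wash = params
    if not (0 < accu < 20 and 0 < k_wash < 1):               # flat priors on plausible ranges
        return -np.inf
    resid = np.log(obs) - np.log(washoff_load(params, dry_days, runoff))
    return -0.5 * np.sum((resid / 0.15) ** 2)                # lognormal error model

# Metropolis random-walk sampler.
chain, current = [], np.array([1.0, 0.3])
lp_current = log_posterior(current)
for _ in range(20000):
    proposal = current + rng.normal(0.0, [0.2, 0.02])
    lp_prop = log_posterior(proposal)
    if np.log(rng.uniform()) < lp_prop - lp_current:
        current, lp_current = proposal, lp_prop
    chain.append(current.copy())

samples = np.array(chain[5000:])                             # discard burn-in
print("posterior means:", samples.mean(axis=0))
print("posterior std:  ", samples.std(axis=0))
```

The spread of the retained samples is what quantifies parameter uncertainty; wide, poorly constrained posteriors are one way the low predictive capacity reported above would show up.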
USDA-ARS?s Scientific Manuscript database
Well-tested agricultural system models can improve our understanding of the water quality effects of management practices under different conditions. The Root Zone Water Quality Model (RZWQM) has been tested under a variety of conditions. However, the current model’s ability to simulate pesticide tr...
The ability to forecast local and regional air pollution events is challenging since the processes governing the production and sustenance of atmospheric pollutants are complex and often non-linear. Comprehensive atmospheric models, by representing in as much detail as possible t...
Twentieth Annual Conference on Manual Control, Volume 1
NASA Technical Reports Server (NTRS)
Hart, S. G. (Compiler); Hartzell, E. J. (Compiler)
1984-01-01
The 48 papers presented were devoted to human operator modeling, application of models to simulation and operational environments, aircraft handling qualities, teleoperators, fault diagnosis, and biodynamics.
Robson, Stanley G.
1978-01-01
This study investigated the use of a two-dimensional profile-oriented water-quality model for the simulation of head and water-quality changes through the saturated thickness of an aquifer. The profile model is able to simulate confined or unconfined aquifers with nonhomogeneous anisotropic hydraulic conductivity, nonhomogeneous specific storage and porosity, and nonuniform saturated thickness. An aquifer may be simulated under either steady or nonsteady flow conditions provided that the ground-water flow path along which the longitudinal axis of the model is oriented does not move in the aquifer during the simulation time period. The profile model parameters are more difficult to quantify than are the corresponding parameters for an areally oriented water-quality model. However, the sensitivity of the profile model to the parameters may be such that the normal error of parameter estimation will not preclude obtaining acceptable model results. Although the profile model has the advantage of being able to simulate vertical flow and water-quality changes in a single- or multiple-aquifer system, the types of problems to which it can be applied are limited by the requirements that (1) the ground-water flow path remain oriented along the longitudinal axis of the model and (2) any subsequent hydrologic factors to be evaluated using the model must be located along the land-surface trace of the model. Simulation of hypothetical ground-water management practices indicates that the profile model is applicable to problem-oriented studies and can provide quantitative results applicable to a variety of management practices. In particular, simulations of the movement and dissolved-solids concentration of a zone of degraded ground-water quality near Barstow, Calif., indicate that halting subsurface disposal of treated sewage effluent in conjunction with pumping a line of fully penetrating wells would be an effective means of controlling the movement of degraded ground water.
Sound quality indicators for urban places in Paris cross-validated by Milan data.
Ricciardi, Paola; Delaitre, Pauline; Lavandier, Catherine; Torchia, Francesca; Aumond, Pierre
2015-10-01
A specific smartphone application was developed to collect perceptive and acoustic data in Paris. About 3400 questionnaires were analyzed, regarding the global sound environment characterization, the perceived loudness of some emergent sources and the presence time ratio of sources that do not emerge from the background. Sound pressure level was recorded each second from the mobile phone's microphone during a 10-min period. The aim of this study is to propose indicators of urban sound quality based on linear regressions with perceptive variables. A cross-validation of the quality models extracted from the Paris data was carried out by conducting the same survey in Milan. The proposed general sound quality model is correlated with the real perceived sound quality (72%). Another model without visual amenity and familiarity is 58% correlated with perceived sound quality. In order to improve the sound quality indicator, a site classification was performed by Kohonen's Artificial Neural Network algorithm, and seven specific class models were developed. These specific models attribute more importance to source events and are slightly closer to the individual data than the global model. In general, the Parisian models underestimate the sound quality of Milan environments assessed by Italian people.
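The survey variables and fitted coefficients are not reproduced here, but the type of model described, an ordinary least-squares regression of perceived sound quality on perceptive variables, can be sketched with synthetic data as follows; all predictor names and values are illustrative.

```python
# Sketch: a linear sound quality model of the kind described above, fitted by
# ordinary least squares. Predictors and data are illustrative; the actual
# Paris/Milan survey variables and coefficients are not reproduced here.
import numpy as np

rng = np.random.default_rng(2)
n = 300
loudness_traffic = rng.uniform(0, 10, n)   # perceived loudness of traffic (0-10)
presence_birds = rng.uniform(0, 1, n)      # presence time ratio of bird sounds
visual_amenity = rng.uniform(0, 10, n)     # visual amenity rating

# Synthetic perceived quality generated from an assumed relation plus noise.
quality = (7 - 0.4 * loudness_traffic + 2.0 * presence_birds
           + 0.2 * visual_amenity + rng.normal(0, 0.8, n))

X = np.column_stack([np.ones(n), loudness_traffic, presence_birds, visual_amenity])
coef, *_ = np.linalg.lstsq(X, quality, rcond=None)
pred = X @ coef
r2 = 1 - np.sum((quality - pred) ** 2) / np.sum((quality - quality.mean()) ** 2)
print("coefficients:", np.round(coef, 2), " R^2 =", round(r2, 2))
```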
NASA Technical Reports Server (NTRS)
Estes, Sue; Haynes, John; Omar, Ali
2013-01-01
Health and Air Quality providers and researchers need environmental data to study and understand the geographic, environmental, and meteorological differences in disease. Satellite remote sensing of the environment offers a unique vantage point that can fill in the gaps of environmental, spatial, and temporal data for tracking disease. This presentation will demonstrate the need for collaborations between multi-disciplinary research groups to develop the full potential of utilizing Earth Observations in studying health. Satellite earth observations present a unique vantage point of the earth's environment from space, which offers a wealth of health applications for the imaginative investigator. The presentation is directly related to Earth Observing systems and Global Health Surveillance and will present research results of the remote sensing environmental observations of earth and health applications, which can contribute to public health and air quality research. As part of the NASA approach and methodology, they have used Earth Observation Systems and Applications for Public Health and Air Quality Models to provide a method for bridging gaps of environmental, spatial, and temporal data for tracking disease. This presentation will provide an overview of projects dealing with infectious diseases, water-borne diseases and air quality, and how many environmental variables affect human health. It will also provide a venue for presenting the results of both research and practice using satellite earth observations to study weather and its role in public health research.
NASA Technical Reports Server (NTRS)
Estes, Sue; Haynes, John; Omar, Ali
2012-01-01
Health and Air Quality providers and researchers need environmental data to study and understand the geographic, environmental, and meteorological differences in disease. Satellite remote sensing of the environment offers a unique vantage point that can fill in the gaps of environmental, spatial, and temporal data for tracking disease. This presentation will demonstrate the need for collaborations between multi-disciplinary research groups to develop the full potential of utilizing Earth Observations in studying health. Satellite earth observations present a unique vantage point of the earth's environment from space, which offers a wealth of health applications for the imaginative investigator. The presentation is directly related to Earth Observing systems and Global Health Surveillance and will present research results of the remote sensing environmental observations of earth and health applications, which can contribute to public health and air quality research. As part of the NASA approach and methodology, they have used Earth Observation Systems and Applications for Public Health and Air Quality Models to provide a method for bridging gaps of environmental, spatial, and temporal data for tracking disease. This presentation will provide an overview of projects dealing with infectious diseases, water-borne diseases and air quality, and how many environmental variables affect human health. It will also provide a venue for presenting the results of both research and practice using satellite earth observations to study weather and its role in public health research.
NASA Astrophysics Data System (ADS)
Cipriani, L.; Fantini, F.; Bertacchi, S.
2014-06-01
Image-based modelling tools based on SfM algorithms have gained great popularity since several software houses released applications able to produce 3D textured models easily and automatically. The aim of this paper is to point out the importance of controlling the model parameterization process, considering that the automatic solutions included in these modelling tools can produce poor results in terms of texture utilization. In order to achieve a better quality of textured models from image-based modelling applications, this research presents a series of practical strategies aimed at providing a better balance between the geometric resolution of models from passive sensors and their corresponding (u,v) map reference systems. This aspect is essential for achieving a high-quality 3D representation, since "apparent colour" is a fundamental aspect in the field of Cultural Heritage documentation. Complex meshes without native parameterization have to be "flattened" or "unwrapped" in the (u,v) parameter space, with the main objective of mapping them with a single image. This result can be obtained using two different strategies: the former automatic and faster, the latter manual and time-consuming. Reverse modelling applications provide automatic solutions based on splitting the models by means of different algorithms, producing a sort of "atlas" of the original model in the parameter space, which in many instances is not adequate and negatively affects the overall quality of the representation. By using different solutions in synergy, ranging from semantic-aware modelling techniques to quad-dominant meshes achieved using retopology tools, it is possible to obtain complete control of the parameterization process.
Persistence of initial conditions in continental scale air quality ...
This study investigates the effect of initial conditions (IC) for pollutant concentrations in the atmosphere and soil on simulated air quality for two continental-scale Community Multiscale Air Quality (CMAQ) model applications. One of these applications was performed for springtime and the second for summertime. Results show that a spin-up period of ten days commonly used in regional-scale applications may not be sufficient to reduce the effects of initial conditions to less than 1% of seasonally-averaged surface ozone concentrations everywhere while 20 days were found to be sufficient for the entire domain for the spring case and almost the entire domain for the summer case. For the summer case, differences were found to persist longer aloft due to circulation of air masses and even a spin-up period of 30 days was not sufficient to reduce the effects of ICs to less than 1% of seasonally-averaged layer 34 ozone concentrations over the southwestern portion of the modeling domain. Analysis of the effect of soil initial conditions for the CMAQ bidirectional NH3 exchange model shows that during springtime they can have an important effect on simulated inorganic aerosols concentrations for time periods of one month or longer. The effects are less pronounced during other seasons. The results, while specific to the modeling domain and time periods simulated here, suggest that modeling protocols need to be scrutinized for a given application and that it cannot be assum
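One simple diagnostic implied by this kind of study is to compare two runs that differ only in their initial conditions and find how long it takes for their difference to stay below 1% of the seasonal mean everywhere. The sketch below does this on synthetic arrays standing in for gridded daily surface ozone from two CMAQ runs; the decay time scale, noise level and grid size are assumptions.

```python
# Sketch: estimate how long initial-condition (IC) effects persist by comparing
# two simulations that differ only in their ICs. The arrays below are synthetic
# stand-ins for gridded daily-average surface ozone from two model runs.
import numpy as np

rng = np.random.default_rng(3)
ndays, ny, nx = 120, 40, 50
base = 40 + 5 * rng.standard_normal((ndays, ny, nx))              # run with default ICs (ppb)
ic_effect = 10 * np.exp(-np.arange(ndays) / 8.0)[:, None, None]   # decaying IC perturbation
perturbed = base + ic_effect * rng.uniform(0.5, 1.5, (ny, nx))    # run with perturbed ICs

rel_diff = np.abs(perturbed - base) / base.mean(axis=0)           # fraction of seasonal mean
exceeds = rel_diff >= 0.01                                        # above the 1% criterion?
last_exceed = np.where(exceeds.any(axis=0),
                       ndays - 1 - np.argmax(exceeds[::-1], axis=0),
                       -1)                                        # last day above 1%, or -1
print("spin-up days needed over the whole domain:", int(last_exceed.max()) + 1)
```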
Challenges in Soft Computing: Case Study with Louisville MSD CSO Modeling
NASA Astrophysics Data System (ADS)
Ormsbee, L.; Tufail, M.
2005-12-01
The principal constituents of soft computing include fuzzy logic, neural computing, evolutionary computation, machine learning, and probabilistic reasoning. There are numerous applications of these constituents (both individually and in combinations of two or more) in the area of water resources and environmental systems. These range from the development of data-driven models to optimal control strategies that assist in a more informed and intelligent decision-making process. Availability of data is critical to such applications, and having scarce data may lead to models that do not represent the response function over the entire domain. At the same time, too much data has a tendency to lead to over-constraining of the problem. This paper will describe the application of a subset of these soft computing techniques (neural computing and genetic algorithms) to the Beargrass Creek watershed in Louisville, Kentucky. The applications include the development of inductive models as substitutes for more complex process-based models to predict the water quality of key constituents (such as dissolved oxygen) and their use in an optimization framework for optimal load reductions. Such a process will facilitate the development of total maximum daily loads for the impaired water bodies in the watershed. Some of the challenges faced in this application include 1) uncertainty in data sets, 2) model application, and 3) development of cause-and-effect relationships between water quality constituents and watershed parameters through the use of inductive models. The paper will discuss these challenges and how they affect the desired goals of the project.
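A compact sketch of the optimization framework described above follows: a genetic algorithm searches for load reductions that keep dissolved oxygen above a target, with a simple analytic surrogate standing in for the trained inductive (neural network) model. The surrogate function, costs, bounds and penalty weight are all illustrative assumptions.

```python
# Sketch: genetic-algorithm search for load reductions that keep dissolved
# oxygen (DO) above a target, using a simple surrogate in place of a trained
# data-driven DO model. All numbers below are illustrative.
import numpy as np

rng = np.random.default_rng(4)
n_sources, do_target = 3, 5.0          # three pollutant sources, DO target in mg/L

def surrogate_do(reductions):
    """Stand-in for the inductive DO model: more reduction -> higher DO."""
    return 3.5 + np.array([1.2, 0.8, 0.5]) @ reductions   # reductions are fractions 0..1

def fitness(reductions):
    cost = np.array([10.0, 6.0, 4.0]) @ reductions        # relative cost of each reduction
    shortfall = max(0.0, do_target - surrogate_do(reductions))
    return -(cost + 100.0 * shortfall)                     # penalize DO violations heavily

pop = rng.uniform(0, 1, (60, n_sources))
for _ in range(200):
    scores = np.array([fitness(ind) for ind in pop])
    parents = pop[np.argsort(scores)][-30:]                # keep the better half
    kids = []
    for _ in range(30):
        a, b = parents[rng.integers(30, size=2)]
        child = np.where(rng.uniform(size=n_sources) < 0.5, a, b)      # uniform crossover
        child = np.clip(child + rng.normal(0, 0.05, n_sources), 0, 1)  # mutation
        kids.append(child)
    pop = np.vstack([parents, kids])

best = pop[np.argmax([fitness(ind) for ind in pop])]
print("best reductions:", np.round(best, 2), " DO ->", round(surrogate_do(best), 2))
```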
The Impact of ISO Quality Management Systems on Primary and Secondary Schools in Spain
ERIC Educational Resources Information Center
Arribas Díaz, Jorge Antonio; Martínez-Mediano, Catalina
2018-01-01
Purpose: The purpose of this study is to evaluate the application of quality management systems (QMS) based on international standards of quality in education (ISO 9001:2008) and ascertain the influence of this quality model on primary and secondary schools in Spain. Design/methodology/approach: The study was conducted in 26 publicly funded,…
Towards Automatic Validation and Healing of Citygml Models for Geometric and Semantic Consistency
NASA Astrophysics Data System (ADS)
Alam, N.; Wagner, D.; Wewetzer, M.; von Falkenhausen, J.; Coors, V.; Pries, M.
2013-09-01
A steadily growing number of application fields for large 3D city models have emerged in recent years. Like in many other domains, data quality is recognized as a key factor for successful business. Quality management is mandatory in the production chain nowadays. Automated domain-specific tools are widely used for validation of business-critical data but still common standards defining correct geometric modeling are not precise enough to define a sound base for data validation of 3D city models. Although the workflow for 3D city models is well-established from data acquisition to processing, analysis and visualization, quality management is not yet a standard during this workflow. Processing data sets with unclear specification leads to erroneous results and application defects. We show that this problem persists even if data are standard compliant. Validation results of real-world city models are presented to demonstrate the potential of the approach. A tool to repair the errors detected during the validation process is under development; first results are presented and discussed. The goal is to heal defects of the models automatically and export a corrected CityGML model.
Data-base development for water-quality modeling of the Patuxent River basin, Maryland
Fisher, G.T.; Summers, R.M.
1987-01-01
Procedures and rationale used to develop a data base and data management system for the Patuxent Watershed Nonpoint Source Water Quality Monitoring and Modeling Program of the Maryland Department of the Environment and the U.S. Geological Survey are described. A detailed data base and data management system has been developed to facilitate modeling of the watershed for water quality planning purposes; statistical analysis; plotting of meteorologic, hydrologic and water quality data; and geographic data analysis. The system is Maryland 's prototype for development of a basinwide water quality management program. A key step in the program is to build a calibrated and verified water quality model of the basin using the Hydrological Simulation Program--FORTRAN (HSPF) hydrologic model, which has been used extensively in large-scale basin modeling. The compilation of the substantial existing data base for preliminary calibration of the basin model, including meteorologic, hydrologic, and water quality data from federal and state data bases and a geographic information system containing digital land use and soils data is described. The data base development is significant in its application of an integrated, uniform approach to data base management and modeling. (Lantz-PTT)
NASA Astrophysics Data System (ADS)
Mathur, R.
2009-12-01
Emerging regional-scale atmospheric simulation models must address the increasing complexity arising from new model applications that treat multi-pollutant interactions. Sophisticated air quality modeling systems are needed to develop effective abatement strategies that focus on simultaneously controlling multiple criteria pollutants, as well as for use in providing short-term air quality forecasts. In recent years the applications of such models have been continuously extended to address atmospheric pollution phenomena from local to hemispheric spatial scales, over time scales ranging from episodic to annual. The need to represent interactions between physical and chemical atmospheric processes occurring at these disparate spatial and temporal scales requires the use of observational data beyond traditional in-situ networks so that the model simulations can be reasonably constrained. Preliminary applications of assimilating remote sensing and aloft observations within a comprehensive regional-scale atmospheric chemistry-transport modeling system will be presented: (1) A methodology is developed to assimilate MODIS aerosol optical depths in the model to represent the impacts of long-range transport associated with the summer 2004 Alaskan fires on surface-level regional fine particulate matter (PM2.5) concentrations across the eastern U.S. The episodic impact of this pollution transport event on PM2.5 concentrations over the eastern U.S. during mid-July 2004 is quantified through the complementary use of the model with remotely sensed, aloft, and surface measurements; (2) Simple nudging experiments with limited aloft measurements are performed to identify uncertainties in model representations of physical processes and to assess the potential use of such measurements in improving the predictive capability of atmospheric chemistry-transport models. The results from these early applications will be discussed in the context of uncertainties in the model and in the remote sensing data, and of the needs for defining a future optimum observing strategy.
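The assimilation machinery itself is not shown in the abstract, but the nudging experiments mentioned in item (2) are, in their simplest form, a Newtonian relaxation of the modelled concentration toward an observed value. The sketch below illustrates that idea only; the time constant, time step and concentrations are assumed.

```python
# Sketch: Newtonian relaxation ("nudging") of a modelled concentration toward
# an observation, the simplest form of the assimilation experiments described
# above. Time constant, time step and values are illustrative only.
dt = 300.0          # model time step (s)
tau = 3600.0        # nudging relaxation time scale (s)
c_obs = 80.0        # observed ozone aloft (ppb)
c = 55.0            # modelled ozone at the observation location (ppb)

for step in range(48):                     # 4 hours of model time
    tendency_chem = 0.0                    # placeholder for the model's own tendencies
    c += dt * (tendency_chem + (c_obs - c) / tau)

print("concentration after nudging for 4 h:", round(c, 1), "ppb")
```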
Human factors systems approach to healthcare quality and patient safety
Carayon, Pascale; Wetterneck, Tosha B.; Rivera-Rodriguez, A. Joy; Hundt, Ann Schoofs; Hoonakker, Peter; Holden, Richard; Gurses, Ayse P.
2013-01-01
Human factors systems approaches are critical for improving healthcare quality and patient safety. The SEIPS (Systems Engineering Initiative for Patient Safety) model of work system and patient safety is a human factors systems approach that has been successfully applied in healthcare research and practice. Several research and practical applications of the SEIPS model are described. Important implications of the SEIPS model for healthcare system and process redesign are highlighted. Principles for redesigning healthcare systems using the SEIPS model are described. Balancing the work system and encouraging the active and adaptive role of workers are key principles for improving healthcare quality and patient safety. PMID:23845724
ERIC Educational Resources Information Center
Azkiyah, Siti Nurul; Mukminin, Amirul
2017-01-01
This study was intended to investigate the teaching quality of student teachers when they conducted their teaching practicum. Teaching quality is conceptualised based on eight classroom factors (orientation, structuring, modelling, application, questioning, building classroom as a learning environment, assessment, and time management) of the…
USDA-ARS?s Scientific Manuscript database
A three-dimensional water quality model was developed for simulating temporal and spatial variations of phytoplankton, nutrients, and dissolved oxygen in freshwater bodies. Effects of suspended and bed sediment on the water quality processes were simulated. A formula was generated from field measure...
Spatial Data Quality Control Procedure applied to the Okavango Basin Information System
NASA Astrophysics Data System (ADS)
Butchart-Kuhlmann, Daniel
2014-05-01
Spatial data is a powerful form of information, capable of providing information of great interest and tremendous use to a variety of users. However, much like other data representing the 'real world', its precision and accuracy must be high for the results of data analysis to be deemed reliable and thus applicable to real-world projects and undertakings. The spatial data quality control (QC) procedure presented here was developed as the topic of a Master's thesis, within the framework of, and using data from, the Okavango Basin Information System (OBIS), itself a part of The Future Okavango (TFO) project. The aim of the QC procedure was to form the basis of a method for determining the quality of spatial data relevant to hydrological, solute, and erosion transport modelling using the Jena Adaptable Modelling System (JAMS). As such, the quality of all data present in OBIS classified under the topics of elevation, geoscientific information, or inland waters was evaluated. Now that the initial data quality has been evaluated, efforts are underway to correct the errors found, thus improving the quality of the dataset.
Supporting Collaborative Model and Data Service Development and Deployment with DevOps
NASA Astrophysics Data System (ADS)
David, O.
2016-12-01
Adopting DevOps practices for model service development and deployment enables a community to engage in service-oriented modeling and data management. The Cloud Services Integration Platform (CSIP), developed over the last 5 years at Colorado State University, provides for collaborative integration of environmental models into scalable model and data services as a micro-services platform with API and deployment infrastructure. Originally developed to support USDA natural resource applications, it proved suitable for a wider range of applications in the environmental modeling domain. While extending its scope and visibility, it became apparent that community integration and adequate workflow support through the full model development and application cycle drove successful outcomes. DevOps provides best practices, tools, and organizational structures to optimize the transition from model service development to deployment by minimizing (i) the operational burden and (ii) the turnaround time for modelers. We have developed and implemented a methodology to fully automate a suite of applications for application lifecycle management, version control, continuous integration, container management, and container scaling to enable model and data service developers in various institutions to collaboratively build, run, deploy, test, and scale services within minutes. To date more than 160 model and data services are available for applications in hydrology (PRMS, Hydrotools, CFA, ESP), water and wind erosion prediction (WEPP, WEPS, RUSLE2), soil quality trends (SCI, STIR), water quality analysis (SWAT-CP, WQM, CFA, AgES-W), stream degradation assessment (SWAT-DEG), hydraulics (cross-section), and grazing management (GRAS). In addition, supporting data services include soil (SSURGO), ecological site (ESIS), climate (CLIGEN, WINDGEN), land management and crop rotations (LMOD), and pesticides (WQM), developed using this workflow automation and decentralized governance.
Analysis of aircraft longitudinal handling qualities
NASA Technical Reports Server (NTRS)
Hess, R. A.
1981-01-01
The optimal control model (OCM) of the human pilot is applied to the study of aircraft handling qualities. Attention is focused primarily on longitudinal tasks. The modeling technique differs from previous applications of the OCM in that considerable effort is expended in simplifying the pilot/vehicle analysis. After briefly reviewing the OCM, a technique for modeling the pilot controlling higher order systems is introduced. Following this, a simple criterion for determining the susceptibility of an aircraft to pilot induced oscillations (PIO) is formulated. Finally, a model-based metric for pilot rating prediction is discussed. The resulting modeling procedure provides a relatively simple, yet unified approach to the study of a variety of handling qualities problems.
Quality Assurance Guidance for the Collection of Meteorological Data Using Passive Radiometers
This document augments the February 2000 guidance entitled Meteorological Monitoring Guidance for Regulatory Modeling Applications and the March 2008 guidance entitled Quality Assurance Handbook for Air Pollution Measurement Systems Volume IV: Meteorological Measurements Version ...
Heuristic Model Of The Composite Quality Index Of Environmental Assessment
NASA Astrophysics Data System (ADS)
Khabarov, A. N.; Knyaginin, A. A.; Bondarenko, D. V.; Shepet, I. P.; Korolkova, L. N.
2017-01-01
The goal of the paper is to present a heuristic model of a composite environmental quality index based on the integrated application of elements of utility theory, multidimensional scaling, expert evaluation and decision-making. The composite index is synthesized in linear-quadratic form, which more adequately represents the assessment preferences of experts and decision-makers.
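The paper's actual weights are not reproduced here, but a composite index in linear-quadratic form can generally be written as Q = w·x + x·A·x over normalized sub-indicators. The sketch below uses made-up weights and interaction terms standing in for expert-derived values.

```python
# Sketch: a composite environmental quality index in linear-quadratic form,
# Q = w.x + x.A.x, over normalized sub-indicators x in [0, 1]. Weights and the
# interaction matrix are hypothetical stand-ins for expert-derived values.
import numpy as np

x = np.array([0.7, 0.5, 0.9])          # e.g. normalized air, water, soil quality scores
w = np.array([0.4, 0.35, 0.25])        # linear weights (sum to 1, assumed)
A = np.array([[0.00, 0.05, 0.00],      # symmetric interaction terms (assumed)
              [0.05, 0.00, 0.02],
              [0.00, 0.02, 0.00]])

Q = w @ x + x @ A @ x
print("composite quality index Q =", round(float(Q), 3))
```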
USDA-ARS?s Scientific Manuscript database
The Ensemble Kalman Filter (EnKF), a popular data assimilation technique for non-linear systems, was applied to the Root Zone Water Quality Model. Measured soil moisture data at four different depths (5 cm, 20 cm, 40 cm and 60 cm) from two agricultural fields (AS1 and AS2) in northeastern Indiana were us...
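The RZWQM coupling itself is not shown in this summary, but the core EnKF analysis step that such a study applies at each observation time can be sketched generically: the state ensemble (soil moisture at the four depths) is corrected with a gain computed from the ensemble covariance. The ensemble, observations and error statistics below are synthetic.

```python
# Sketch: one ensemble Kalman filter (EnKF) analysis step for a soil-moisture
# state vector (four depths). The forecast ensemble, observation operator and
# error statistics are synthetic stand-ins, not RZWQM output.
import numpy as np

rng = np.random.default_rng(5)
n_ens, n_state = 40, 4                                       # ensemble size, state size

Xf = 0.25 + 0.03 * rng.standard_normal((n_state, n_ens))     # forecast ensemble (vol. water content)
H = np.array([[1, 0, 0, 0],                                  # observe the 5 cm and 20 cm depths
              [0, 1, 0, 0]], dtype=float)
R = np.diag([0.02**2, 0.02**2])                              # observation error covariance
y = np.array([0.30, 0.28])                                   # measured soil moisture

# Ensemble (sample) covariance of the forecast.
Xp = Xf - Xf.mean(axis=1, keepdims=True)
Pf = Xp @ Xp.T / (n_ens - 1)

# Kalman gain and perturbed-observation update of each member.
K = Pf @ H.T @ np.linalg.inv(H @ Pf @ H.T + R)
Y = y[:, None] + 0.02 * rng.standard_normal((2, n_ens))      # perturbed observations
Xa = Xf + K @ (Y - H @ Xf)

print("forecast mean:", np.round(Xf.mean(axis=1), 3))
print("analysis mean:", np.round(Xa.mean(axis=1), 3))
```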
Predicting indoor pollutant concentrations, and applications to air quality management
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lorenzetti, David M.
Because most people spend more than 90% of their time indoors, predicting exposure to airborne pollutants requires models that incorporate the effect of buildings. Buildings affect the exposure of their occupants in a number of ways, both by design (for example, filters in ventilation systems remove particles) and incidentally (for example, sorption on walls can reduce peak concentrations, but prolong exposure to semivolatile organic compounds). Furthermore, building materials and occupant activities can generate pollutants. Indoor air quality depends not only on outdoor air quality, but also on the design, maintenance, and use of the building. For example, "sick building" symptoms such as respiratory problems and headaches have been related to the presence of air-conditioning systems, to carpeting, to low ventilation rates, and to high occupant density (1). The physical processes of interest apply even in simple structures such as homes. Indoor air quality models simulate the processes, such as ventilation and filtration, that control pollutant concentrations in a building. Section 2 describes the modeling approach, and the important transport processes in buildings. Because advection usually dominates among the transport processes, Sections 3 and 4 describe methods for predicting airflows. The concluding section summarizes the application of these models.
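The simplest of the indoor air quality models referred to above is a single-zone, well-mixed mass balance combining ventilation, penetration, deposition and indoor sources. The sketch below integrates that balance for an illustrative particle episode; every parameter value is assumed.

```python
# Sketch: single-zone, well-mixed indoor air quality mass balance,
#   dC/dt = P*lam*C_out + S/V - (lam + k_dep)*C,
# where P is penetration efficiency, lam the air exchange rate, k_dep the
# deposition loss rate and S an indoor source. All parameter values are assumed.
lam = 0.5          # air exchange rate (1/h)
P = 0.8            # penetration efficiency of outdoor particles (-)
k_dep = 0.2        # deposition loss rate (1/h)
V = 250.0          # house volume (m3)
S = 0.0            # indoor source strength (ug/h); cooking, for example, would be > 0
c_out = 35.0       # outdoor PM2.5 (ug/m3)

dt, hours = 0.01, 24.0
c = 5.0                                        # initial indoor concentration (ug/m3)
for _ in range(int(hours / dt)):               # simple explicit Euler integration
    dcdt = P * lam * c_out + S / V - (lam + k_dep) * c
    c += dt * dcdt

steady = (P * lam * c_out + S / V) / (lam + k_dep)
print(f"indoor PM2.5 after 24 h: {c:.1f} ug/m3 (steady state {steady:.1f})")
```

The steady-state indoor concentration is (P·lam·C_out + S/V)/(lam + k_dep), which the time-stepping converges to within a few air-exchange time constants.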
Water quality modelling of an impacted semi-arid catchment using flow data from the WEAP model
NASA Astrophysics Data System (ADS)
Slaughter, Andrew R.; Mantel, Sukhmani K.
2018-04-01
The continuous decline in water quality in many regions is forcing a shift from quantity-based water resources management to a greater emphasis on water quality management. Water quality models can act as invaluable tools, as they facilitate a conceptual understanding of the processes affecting water quality and can be used to investigate the water quality consequences of management scenarios. In South Africa, the Water Quality Systems Assessment Model (WQSAM) was developed as a management-focussed water quality model that is kept relatively simple so that it can make use of the small amount of observed data available. Importantly, WQSAM explicitly links to the systems (yield) models routinely used in water resources management in South Africa by using their flow output to drive water quality simulations. Although WQSAM has been shown to be able to represent the variability of water quality in South African rivers, its focus on management from a South African perspective limits its use to those southern African regions for which specific systems model setups exist. Facilitating the use of WQSAM within catchments outside of southern Africa, and within catchments for which these systems model setups do not exist, would require WQSAM to be able to link to a simple-to-use and internationally applied systems model. One such systems model is the Water Evaluation and Planning (WEAP) model, which incorporates a rainfall-runoff component (natural hydrology) and reservoir storage, return flows and abstractions (systems modelling), but within which water quality modelling facilities are rudimentary. The aims of the current study were therefore to: (1) adapt the WQSAM model to be able to use as input the flow outputs of the WEAP model and (2) provide an initial assessment of how successful this linkage was through an application of the WEAP and WQSAM models, under historical conditions, to the Buffalo River, a small, semi-arid and impacted catchment in the Eastern Cape of South Africa. The simulations of the two models were compared to the available observed data, with the initial focus within WQSAM on simulating instream total dissolved solids (TDS) and nutrient concentrations. The WEAP model was able to adequately simulate flow in the Buffalo River catchment, with consideration of human inputs and outputs. WQSAM was adapted to successfully take as input the flow output of the WEAP model, and the simulations of nutrients by WQSAM provided a good representation of the variability of observed nutrient concentrations in the catchment. This study showed that the WQSAM model is able to accept flow inputs from the WEAP model, and that this approach is able to provide satisfactory estimates of both flow and water quality for a small, semi-arid and impacted catchment. It is hoped that this research will encourage the application of WQSAM to an increased number of catchments within southern Africa and beyond.
Data envelopment analysis in service quality evaluation: an empirical study
NASA Astrophysics Data System (ADS)
Najafi, Seyedvahid; Saati, Saber; Tavana, Madjid
2015-09-01
Service quality is often conceptualized as the comparison between service expectations and actual performance perceptions. It enhances customer satisfaction, decreases customer defection, and promotes customer loyalty. A substantial literature has examined the concept of service quality, its dimensions, and measurement methods. We introduce the perceived service quality index (PSQI) as a single measure for evaluating the multiple-item service quality construct based on the SERVQUAL model. A slack-based measure (SBM) of efficiency with constant inputs is used to calculate the PSQI. In addition, a non-linear programming model based on the SBM is proposed to delineate an improvement guideline and improve service quality. An empirical study is conducted to assess the applicability of the proposed method. A large number of studies have used DEA as a benchmarking tool to measure service quality, but these models do not propose a coherent performance evaluation construct and consequently fail to deliver guidelines for improving service quality. The DEA models proposed in this study are designed to evaluate and improve service quality within a comprehensive framework and without any dependency on external data.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Irene Farnham
This Quality Assurance Project Plan (QAPP) provides the overall quality assurance (QA) program requirements and general quality practices to be applied to the U.S. Department of Energy (DOE), National Nuclear Security Administration Nevada Site Office (NNSA/NSO) Underground Test Area (UGTA) Sub-Project (hereafter the Sub-Project) activities. The requirements in this QAPP are consistent with DOE Order 414.1C, Quality Assurance (DOE, 2005); U.S. Environmental Protection Agency (EPA) Guidance for Quality Assurance Project Plans for Modeling (EPA, 2002); and EPA Guidance on the Development, Evaluation, and Application of Environmental Models (EPA, 2009). The QAPP Revision 0 supersedes DOE--341, Underground Test Area Quality Assurance Project Plan, Nevada Test Site, Nevada, Revision 4.
Unthank, Michael D.; Newson, Jeremy K.; Williamson, Tanja N.; Nelson, Hugh L.
2012-01-01
Flow- and load-duration curves were constructed from the model outputs of the U.S. Geological Survey's Water Availability Tool for Environmental Resources (WATER) application for streams in Kentucky. The WATER application was designed to access multiple geospatial datasets to generate more than 60 years of statistically based streamflow data for Kentucky. The WATER application enables a user to graphically select a site on a stream and generate an estimated hydrograph and flow-duration curve for the watershed upstream of that point. The flow-duration curves are constructed by calculating the exceedance probability of the modeled daily streamflows. User-defined water-quality criteria and (or) sampling results can be loaded into the WATER application to construct load-duration curves that are based on the modeled streamflow results. Estimates of flow and streamflow statistics were derived from TOPographically Based Hydrological MODEL (TOPMODEL) simulations in the WATER application. A modified TOPMODEL code, SDP-TOPMODEL (Sinkhole Drainage Process-TOPMODEL), was used to simulate daily mean discharges over the period of record for 5 karst and 5 non-karst watersheds in Kentucky in order to verify the calibrated model. A statistical evaluation of the model's verification simulations shows that calibration criteria, established by previous WATER application reports, were met, thus ensuring the model's ability to provide acceptably accurate estimates of discharge at gaged and ungaged sites throughout Kentucky. Flow-duration curves are constructed in the WATER application by calculating the exceedance probability of the modeled daily flow values. The flow-duration intervals are expressed as a percentage, with zero corresponding to the highest stream discharge in the streamflow record. Load-duration curves are constructed by applying the loading equation (Load = Flow * Water-quality criterion) at each flow interval.
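The duration-curve construction described above can be sketched in a few lines of Python; the flows, criterion value and unit conversion below are illustrative assumptions, not WATER/TOPMODEL outputs.

import numpy as np

# Hypothetical daily flows; in practice these would come from the streamflow model.
rng = np.random.default_rng(0)
flows = rng.lognormal(mean=2.0, sigma=1.0, size=60 * 365)   # daily mean flow, cfs

# Flow-duration curve: exceedance probability of each modeled daily flow,
# with 0% corresponding to the highest discharge in the record.
flows_sorted = np.sort(flows)[::-1]
n = flows_sorted.size
exceedance_pct = 100.0 * np.arange(1, n + 1) / (n + 1)

# Load-duration curve: Load = Flow * water-quality criterion, with a unit
# conversion appropriate to the chosen units.
criterion = 0.35          # mg/L for some constituent (illustrative)
unit_factor = 5.39        # cfs * mg/L -> lb/day (approximate)
loads = flows_sorted * criterion * unit_factor

# Allowable load at the 50% flow-duration interval (median flow conditions).
idx = np.argmin(np.abs(exceedance_pct - 50.0))
print(f"Median-flow allowable load: {loads[idx]:.1f} lb/day")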
NASA Astrophysics Data System (ADS)
Sondkar, Pravin B.
The severity of combined aerodynamic and power transmission response in high-speed, high power density systems such as a rotorcraft is still a major cause of annoyance in spite of recent advances in passive, semi-active and active control. With further increases in the capacity and power of this class of machinery systems, acoustic noise levels are expected to increase even more. To achieve further improvements in sound quality, a more refined understanding of the factors and attributes controlling human perception is needed. In the case of rotorcraft systems, the perceived quality of the interior sound field is a major determining factor of passenger comfort. Traditionally, this sound quality factor is determined by measuring the response of a chosen set of juries who are asked to compare their qualitative reactions to two or more sounds based on their subjective impressions. This type of testing is very time-consuming, costly, often inconsistent, and not useful for practical design purposes. Furthermore, there is no known universal model for sound quality. The primary aim of this research is to achieve significant improvements in quantifying the sound quality of combined aerodynamic and power transmission response in high-speed, high power density machinery systems such as a rotorcraft by applying relevant objective measures related to the spectral characteristics of the sound field. Two models have been proposed in this dissertation research. First, a classical multivariate regression analysis model based on currently known sound quality metrics, as well as some new metrics derived in this study, is presented. Even though the analysis resulted in the best possible multivariate model as a measure of acoustic noise quality, it lacks incorporation of the human judgment mechanism. The regression model can change depending on the specific application, the nature of the sounds, and the types of juries used in the study. Also, it predicts only the averaged preference scores and does not explain why two jury members differ in their judgment. To address the above shortcomings of applying regression analysis, a new human judgment model is proposed to further improve the ability to predict the degree of subjective annoyance. The human judgment model involves extraction of subjective attributes and their values using a proposed artificial jury processor. In this approach, a set of ear transfer functions is employed to compute the characteristics of sound pressure waves as perceived subjectively by humans. The resulting basilar membrane displacement data from this proposed model are then applied to analyze the attribute values. Using this proposed human judgment model, the human judgment mechanism, which is highly sophisticated, will be examined. Since the human judgment model is essentially based on jury attributes that are not expected to change significantly with the application or nature of the sound field, it gives a more common basis for evaluating sound quality. This model also attempts to explain inter-juror differences in opinion, which is critical in understanding the variability in human response.
Development and application of computational fluid dynamics (CFD) simulations are being advanced through case studies for simulating air pollutant concentrations from sources within open fields and within complex urban building environments. CFD applications have been under deve...
Zhang, Lei; Zou, Zhihong; Shan, Wei
2017-06-01
Water quality forecasting is an essential part of water resource management. Spatiotemporal variations of water quality and their inherent constraints make it very complex. This study explored a data-based method for short-term water quality forecasting. Predictions of water quality indicators, including dissolved oxygen, chemical oxygen demand by KMnO4, and ammonia nitrogen, obtained with a support vector machine were used as inputs to a particle swarm optimization-based wavelet neural network that forecasts the overall water quality status index. The Gubeikou monitoring section of Miyun Reservoir in Beijing, China was taken as the study case to examine the effectiveness of this approach. The experimental results also revealed that the proposed model has advantages in stability and reduced computation time in comparison with other data-driven models, including a traditional BP neural network model, a wavelet neural network model and a Gradient Boosting Decision Tree model. It can be used as an effective approach for short-term comprehensive water quality prediction. Copyright © 2016. Published by Elsevier B.V.
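A schematic Python sketch of the chained structure only, using scikit-learn: SVR predictions of the individual indicators feed a second regressor that forecasts the overall index. An MLP stands in for the particle swarm optimization-based wavelet neural network, and all data are synthetic placeholders.

import numpy as np
from sklearn.svm import SVR
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(1)
X = rng.normal(size=(300, 6))                       # past observations / covariates
DO = 8 + X[:, 0] + 0.1 * rng.normal(size=300)       # dissolved oxygen (synthetic)
CODMn = 4 - 0.5 * X[:, 1] + 0.1 * rng.normal(size=300)
NH3N = 0.5 + 0.3 * X[:, 2] + 0.05 * rng.normal(size=300)
WQI = 0.4 * CODMn + 0.5 * NH3N - 0.2 * DO + 5       # synthetic overall status index

# Stage 1: one SVR per indicator.
svr_models = [SVR(C=10.0).fit(X, y) for y in (DO, CODMn, NH3N)]
indicator_preds = np.column_stack([m.predict(X) for m in svr_models])

# Stage 2: indicator predictions drive the overall-index model
# (MLP as a stand-in for the optimised wavelet neural network).
index_model = MLPRegressor(hidden_layer_sizes=(16,), max_iter=5000,
                           random_state=0).fit(indicator_preds, WQI)
print("Forecast WQI for first sample:", index_model.predict(indicator_preds[:1]))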
Quality assessment concept of the World Data Center for Climate and its application to CMIP5 data
NASA Astrophysics Data System (ADS)
Stockhause, M.; Höck, H.; Toussaint, F.; Lautenschlager, M.
2012-08-01
The preservation of data in a high state of quality that is suitable for interdisciplinary use is one of the most pressing and challenging current issues in long-term archiving. For high-volume data such as climate model data, the data and data replicas are no longer stored centrally but are distributed over several local data repositories, e.g. the data of the Climate Model Intercomparison Project Phase 5 (CMIP5). The most important part of the data is to be archived, assigned a DOI, and published according to the World Data Center for Climate's (WDCC) application of the DataCite regulations. The data quality assessment, an integral part of WDCC's data publication process, was adapted to the requirements of a federated data infrastructure. A concept for a distributed and federated quality assessment procedure was developed, in which the workload and responsibility for quality control are shared between the three primary CMIP5 data centers: the Program for Climate Model Diagnosis and Intercomparison (PCMDI), the British Atmospheric Data Centre (BADC), and WDCC. This distributed quality control concept, its pilot implementation for CMIP5, and first experiences are presented. The distributed quality control approach is capable of identifying data inconsistencies and of making quality results immediately available to data creators, data users and data infrastructure managers. Continuous publication of new data versions and slow data replication prevent the quality control checks from being completed. This, together with ongoing developments of the data and metadata infrastructure, requires adaptations in the code and concept of the distributed quality control approach.
Validation, Edits, and Application Processing Phase II and Error-Prone Model Report.
ERIC Educational Resources Information Center
Gray, Susan; And Others
The impact of quality assurance procedures on the correct award of Basic Educational Opportunity Grants (BEOGs) for 1979-1980 was assessed, and a model for detecting error-prone applications early in processing was developed. The Bureau of Student Financial Aid introduced new comments into the edit system in 1979 and expanded the pre-established…
“FEST-C 1.0 for CMAQ Bi-directional NH3 Modeling and Spatial Allocator 4.1”
Accurate estimation of ammonia emissions in space and time has been a challenge in meso-scale air quality modeling. For instance, fertilizer applications vary in the date of application and amount by crop type and geographical area. With the support of the U.S. EPA, we have devel...
MODELS-3/CMAQ APPLICATIONS WHICH ILLUSTRATE CAPABILITY AND FUNCTIONALITY
The Models-3/CMAQ developed by the U.S. Environmental Protection Agency (USEPA) is a third-generation multiscale, multi-pollutant air quality modeling system within a high-level, object-oriented computer framework (Models-3). It has been available to the scientific community ...
Utility of distributed hydrologic and water quality models for watershed management and sustainability studies should be accompanied by rigorous model uncertainty analysis. However, the use of complex watershed models primarily follows the traditional {calibrate/validate/predict}...
NASA Astrophysics Data System (ADS)
Vanrolleghem, Peter A.; Mannina, Giorgio; Cosenza, Alida; Neumann, Marc B.
2015-03-01
Sensitivity analysis represents an important step in improving the understanding and use of environmental models. Indeed, by means of global sensitivity analysis (GSA), modellers may identify both important (factor prioritisation) and non-influential (factor fixing) model factors. No general rule has yet been defined for verifying the convergence of the GSA methods. In order to fill this gap this paper presents a convergence analysis of three widely used GSA methods (SRC, Extended FAST and Morris screening) for an urban drainage stormwater quality-quantity model. After the convergence was achieved the results of each method were compared. In particular, a discussion on peculiarities, applicability, and reliability of the three methods is presented. Moreover, a graphical Venn diagram based classification scheme and a precise terminology for better identifying important, interacting and non-influential factors for each method is proposed. In terms of convergence, it was shown that sensitivity indices related to factors of the quantity model achieve convergence faster. Results for the Morris screening method deviated considerably from the other methods. Factors related to the quality model require a much higher number of simulations than the number suggested in literature for achieving convergence with this method. In fact, the results have shown that the term "screening" is improperly used as the method may exclude important factors from further analysis. Moreover, for the presented application the convergence analysis shows more stable sensitivity coefficients for the Extended-FAST method compared to SRC and Morris screening. Substantial agreement in terms of factor fixing was found between the Morris screening and Extended FAST methods. In general, the water quality related factors exhibited more important interactions than factors related to water quantity. Furthermore, in contrast to water quantity model outputs, water quality model outputs were found to be characterised by high non-linearity.
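For readers unfamiliar with standardised regression coefficients (SRC), the self-contained Python sketch below shows one simple way to monitor their convergence as the sample size grows; the three-factor model is a toy stand-in, not the stormwater model used in the study, and libraries such as SALib provide Morris and extended FAST implementations for the other two methods.

import numpy as np

# Toy model standing in for the quality-quantity model; three hypothetical factors.
def model(x):
    return 2.0 * x[:, 0] + 0.5 * x[:, 1] ** 2 + 0.1 * x[:, 2]

def src(x, y):
    # SRC_i = beta_i * std(x_i) / std(y) from an ordinary least-squares fit
    A = np.column_stack([np.ones(len(y)), x])
    beta = np.linalg.lstsq(A, y, rcond=None)[0][1:]
    return beta * x.std(axis=0) / y.std()

rng = np.random.default_rng(2)
for n in (50, 200, 800, 3200):
    x = rng.uniform(0.0, 1.0, size=(n, 3))
    y = model(x)
    print(n, np.round(src(x, y), 3))
# The SRCs are considered converged once they stabilise (e.g., change by less
# than some tolerance) as the sample size increases.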
Development of a Next Generation Air Quality Modeling System
In the presentation we will describe our modifications to MPAS to improve its suitability for retrospective air quality applications and show evaluations of global and regional meterological simulations. Our modifications include addition of physics schemes that we developed for...
Increasing the Use of Earth Science Data and Models in Air Quality Management.
Milford, Jana B; Knight, Daniel
2017-04-01
In 2010, the U.S. National Aeronautics and Space Administration (NASA) initiated the Air Quality Applied Science Team (AQAST) as a 5-year, $17.5-million award with 19 principal investigators. AQAST aims to increase the use of Earth science products in air quality-related research and to help meet air quality managers' information needs. We conducted a Web-based survey and a limited number of follow-up interviews to investigate federal, state, tribal, and local air quality managers' perspectives on usefulness of Earth science data and models, and on the impact AQAST has had. The air quality managers we surveyed identified meeting the National Ambient Air Quality Standards for ozone and particulate matter, emissions from mobile sources, and interstate air pollution transport as top challenges in need of improved information. Most survey respondents viewed inadequate coverage or frequency of satellite observations, data uncertainty, and lack of staff time or resources as barriers to increased use of satellite data by their organizations. Managers who have been involved with AQAST indicated that the program has helped build awareness of NASA Earth science products, and assisted their organizations with retrieval and interpretation of satellite data and with application of global chemistry and climate models. AQAST has also helped build a network between researchers and air quality managers with potential for further collaborations. NASA's Air Quality Applied Science Team (AQAST) aims to increase the use of satellite data and global chemistry and climate models for air quality management purposes, by supporting research and tool development projects of interest to both groups. Our survey and interviews of air quality managers indicate they found value in many AQAST projects and particularly appreciated the connections to the research community that the program facilitated. Managers expressed interest in receiving continued support for their organizations' use of satellite data, including assistance in retrieving and interpreting data from future geostationary platforms meant to provide more frequent coverage for air quality and other applications.
Although BASINS has been in use for the past 10 years, there has been limited modeling guidance on its applications for complex environmental problems, such as modeling impacts of hydro modification on water quantity and quality.
Effects of Meteorological Data Quality on Snowpack Modeling
NASA Astrophysics Data System (ADS)
Havens, S.; Marks, D. G.; Robertson, M.; Hedrick, A. R.; Johnson, M.
2017-12-01
Detailed quality control of meteorological inputs is the most time-intensive component of running the distributed, physically-based iSnobal snow model, and the effect of data quality of the inputs on the model is unknown. The iSnobal model has been run operationally since WY2013, and is currently run in several basins in Idaho and California. The largest amount of user input during modeling is for the quality control of precipitation, temperature, relative humidity, solar radiation, wind speed and wind direction inputs. Precipitation inputs require detailed user input and are crucial to correctly model the snowpack mass. This research applies a range of quality control methods to meteorological input, from raw input with minimal cleaning, to complete user-applied quality control. The meteorological input cleaning generally falls into two categories. The first is global minimum/maximum and missing value correction that could be corrected and/or interpolated with automated processing. The second category is quality control for inputs that are not globally erroneous, yet are still unreasonable and generally indicate malfunctioning measurement equipment, such as temperature or relative humidity that remains constant, or does not correlate with daily trends observed at nearby stations. This research will determine how sensitive model outputs are to different levels of quality control and guide future operational applications.
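Two of the quality-control levels mentioned above (global bound/missing-value correction and detection of stuck sensors) can be illustrated with a short pandas sketch; the thresholds, window lengths and synthetic temperature series are assumptions, not iSnobal defaults.

import numpy as np
import pandas as pd

def qc_basic(series, lo=-40.0, hi=45.0):
    """Global min/max screening plus interpolation of missing or removed values."""
    s = series.where((series >= lo) & (series <= hi))   # out-of-range -> NaN
    return s.interpolate(limit=6).ffill().bfill()

def qc_flatline(series, window=12, tol=1e-3):
    """Flag stretches where a sensor reports an (almost) constant value."""
    return series.rolling(window).std() < tol

idx = pd.date_range("2017-01-01", periods=72, freq="H")
ta = pd.Series(np.sin(np.arange(72) / 12.0) * 8.0 - 2.0, index=idx)  # air temp, C
ta.iloc[10:14] = np.nan          # data gap
ta.iloc[30:46] = -5.0            # stuck sensor reporting a constant value
ta.iloc[50] = 999.0              # spike outside physical bounds

clean = qc_basic(ta)
print("Flat-line hours flagged:", int(qc_flatline(clean).sum()))

The first function corresponds to the automated, minimal-cleaning level; the second to the kind of check that normally needs user judgment before data are discarded.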
NASA Astrophysics Data System (ADS)
Zhang, J.; Lin, L. F.; Bras, R. L.
2017-12-01
Hydrological applications rely on the availability and quality of precipitation products, especially model- and satellite-based products for use in areas without ground measurements. It is known that the quality of model- and satellite-based precipitation products is complementary: model-based products exhibit high quality during winter, while satellite-based products tend to be better during summer. To explore that behavior, this study uses 2-m air temperature as auxiliary information to evaluate high-resolution (0.1°×0.1° every hour) precipitation products from Weather Research and Forecasting (WRF) simulations and from version-4 Integrated Multi-satellite Retrievals for GPM (IMERG) early and final runs. The products are evaluated relative to the reference NCEP Stage IV precipitation estimates over the central United States in 2016. The results show that the WRF and IMERG final-run estimates are nearly unbiased, while the IMERG early-run estimates are positively biased. The results also show that the WRF estimates exhibit high correlations with the reference data when the temperature falls below 280 K, and the IMERG estimates (i.e., both early and final runs) do so when the temperature exceeds 280 K. Moreover, the temperature threshold of 280 K, which distinguishes the quality of the WRF and IMERG products, does not vary significantly with either season or location. This study not only adds insight into current research on the quality of precipitation products but also suggests a simple way of choosing either a model- or satellite-based product, or a hybrid model/satellite product, for applications.
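The temperature-conditioned evaluation can be sketched as follows in Python; the arrays are synthetic placeholders for the Stage IV, WRF and IMERG fields, chosen only to mimic the reported cold/warm contrast.

import numpy as np

rng = np.random.default_rng(3)
t2m = rng.uniform(260.0, 300.0, size=5000)                  # 2-m air temperature, K
stage4 = rng.gamma(shape=0.5, scale=2.0, size=5000)         # reference precipitation
wrf = stage4 + rng.normal(0.0, np.where(t2m < 280.0, 0.3, 1.5))    # better when cold
imerg = stage4 + rng.normal(0.0, np.where(t2m < 280.0, 1.5, 0.3))  # better when warm

def skill(est, ref, mask):
    bias = est[mask].mean() - ref[mask].mean()
    corr = np.corrcoef(est[mask], ref[mask])[0, 1]
    return bias, corr

cold = t2m < 280.0
for name, est in (("WRF", wrf), ("IMERG", imerg)):
    for label, mask in (("T < 280 K", cold), ("T >= 280 K", ~cold)):
        b, r = skill(est, stage4, mask)
        print(f"{name:5s} {label}: bias={b:+.2f}, r={r:.2f}")

Splitting skill statistics on the auxiliary temperature field is the basis of the hybrid product selection suggested in the study.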
Parameterization guidelines and considerations for hydrologic models
R. W. Malone; G. Yagow; C. Baffaut; M.W Gitau; Z. Qi; Devendra Amatya; P.B. Parajuli; J.V. Bonta; T.R. Green
2015-01-01
 Imparting knowledge of the physical processes of a system to a model and determining a set of parameter values for a hydrologic or water quality model application (i.e., parameterization) are important and difficult tasks. An exponential...
XAL Application Framework and Bricks GUI Builder
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pelaia II, Tom
2007-01-01
The XAL [1] Application Framework is a framework for rapidly developing document based Java applications with a common look and feel along with many built-in user interface behaviors. The Bricks GUI builder consists of a modern application and framework for rapidly building user interfaces in support of true Model-View-Controller (MVC) compliant Java applications. Bricks and the XAL Application Framework allow developers to rapidly create quality applications.
Evaluating Air-Quality Models: Review and Outlook.
NASA Astrophysics Data System (ADS)
Weil, J. C.; Sykes, R. I.; Venkatram, A.
1992-10-01
Over the past decade, much attention has been devoted to the evaluation of air-quality models, with emphasis on model performance in predicting the high concentrations that are important in air-quality regulations. This paper stems from our belief that this practice needs to be expanded to 1) evaluate model physics and 2) deal with the large natural or stochastic variability in concentration. The variability is represented by the root-mean-square fluctuating concentration (σc) about the mean concentration (C) over an ensemble, that is, a given set of meteorological, source, and other conditions. Most air-quality models used in applications predict C, whereas observations are individual realizations drawn from an ensemble. When σc is comparable to or larger than C, large residuals exist between predicted and observed concentrations, which confound model evaluations. This paper addresses ways of evaluating model physics in light of the large σc; the focus is on elevated point-source models. Evaluation of model physics requires the separation of the mean model error, the difference between the predicted and observed C, from the natural variability. A residual analysis is shown to be an effective way of doing this. Several examples demonstrate the usefulness of residuals as well as correlation analyses and laboratory data in judging model physics. In general, σc models and predictions of the probability distribution of the fluctuating concentration, p(c), are in the developmental stage, with laboratory data playing an important role. Laboratory data from point-source plumes in a convection tank show that p(c) approximates a self-similar distribution along the plume center plane, a useful result in a residual analysis. At present, there is one model, ARAP, that predicts C, σc, and p(c) for point-source plumes. This model is more computationally demanding than other dispersion models (for C only) and must be demonstrated as a practical tool. However, it predicts an important quantity for applications: the uncertainty in the very high and infrequent concentrations. The uncertainty is large and is needed in evaluating operational performance and in predicting the attainment of air-quality standards.
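A minimal numpy sketch of the residual-analysis idea, separating the mean model error from the stochastic scatter associated with σc; the numbers are invented for illustration.

import numpy as np

rng = np.random.default_rng(4)
C_pred = 50.0                      # ensemble-mean prediction for one set of conditions
sigma_c = 30.0                     # assumed RMS fluctuating concentration
C_obs = rng.normal(60.0, sigma_c, size=200)   # observed realisations from the ensemble

residuals = C_obs - C_pred
mean_error = residuals.mean()                 # model-physics (bias) component
scatter = residuals.std(ddof=1)               # stochastic component, of order sigma_c

print(f"mean model error: {mean_error:+.1f}")
print(f"residual scatter: {scatter:.1f} (compare with sigma_c = {sigma_c})")

Only the mean error reflects on the model physics; a scatter comparable to σc is expected even from a perfect ensemble-mean model.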
10 CFR 503.34 - Inability to comply with applicable environmental requirements.
Code of Federal Regulations, 2014 CFR
2014-01-01
... environmental compliance of the facility, including an analysis of its ability to meet applicable standards and... will be based solely on an analysis of the petitioner's capacity to physically achieve applicable... exemption. All such analysis must be based on accepted analytical techniques, such as air quality modeling...
10 CFR 503.34 - Inability to comply with applicable environmental requirements.
Code of Federal Regulations, 2013 CFR
2013-01-01
... environmental compliance of the facility, including an analysis of its ability to meet applicable standards and... will be based solely on an analysis of the petitioner's capacity to physically achieve applicable... exemption. All such analysis must be based on accepted analytical techniques, such as air quality modeling...
Modeling of facade leaching in urban catchments
NASA Astrophysics Data System (ADS)
Coutu, S.; Del Giudice, D.; Rossi, L.; Barry, D. A.
2012-12-01
Building facades are protected from microbial attack by incorporation of biocides within them. Flow over facades leaches these biocides and transports them to the urban environment. A parsimonious water quantity/quality model applicable to engineered urban watersheds was developed to compute biocide release from facades and their transport at the urban basin scale. The model couples two lumped submodels applicable at the basin scale and a local model of biocide leaching at the facade scale. For the facade leaching, an existing model applicable at the individual wall scale was utilized. The two lumped models describe urban hydrodynamics and leachate transport. The integrated model allows prediction of biocide concentrations in urban rivers. It was applied to a 15 km² urban hydrosystem in western Switzerland, the Vuachère river basin, to study three facade biocides (terbutryn, carbendazim, diuron). The water quality simulated by the model matched most of the pollutographs at the outlet of the Vuachère watershed well. The model was then used to estimate possible ecotoxicological impacts of facade leachates. To this end, exceedance probabilities and cumulative pollutant loads from the catchment were estimated. Results showed that the considered biocides rarely exceeded the relevant predicted no-effect concentrations for the riverine system. Despite the heterogeneities and complexity of (engineered) urban catchments, the model application demonstrated that a computationally "light" model can be employed to simulate the hydrograph and pollutograph response within them. It thus allows catchment-scale assessment of the potential ecotoxicological impact of biocides on receiving waters.
LINKING THE CMAQ AND HYSPLIT MODELING SYSTEM INTERFACE PROGRAM AND EXAMPLE APPLICATION
A new software tool has been developed to link the Eulerian-based Community Multiscale Air Quality (CMAQ) modeling system with the Lagrangian-based HYSPLIT (HYbrid Single-Particle Lagrangian Integrated Trajectory) model. Both models require many of the same hourly meteorological...
Quality Evaluation of Raw Moutan Cortex Using the AHP and Gray Correlation-TOPSIS Method
Zhou, Sujuan; Liu, Bo; Meng, Jiang
2017-01-01
Background: Raw Moutan cortex (RMC) is an important Chinese herbal medicine. Comprehensive and objective quality evaluation of Chinese herbal medicine has been one of the most important issues in the modern herbs development. Objective: To evaluate and compare the quality of RMC using the weighted gray correlation- Technique for Order Preference by Similarity to an Ideal Solution (TOPSIS) method. Materials and Methods: The percentage composition of gallic acid, catechin, oxypaeoniflorin, paeoniflorin, quercetin, benzoylpaeoniflorin, paeonol in different batches of RMC was determined, and then adopting MATLAB programming to construct the gray correlation-TOPSIS assessment model for quality evaluation of RMC. Results: The quality evaluation results of model evaluation and objective evaluation were consistent, reliable, and stable. Conclusion: The model of gray correlation-TOPSIS can be well applied to the quality evaluation of traditional Chinese medicine with multiple components and has broad prospect in application. SUMMARY The experiment tries to construct a model to evaluate the quality of RMC using the weighted gray correlation- Technique for Order Preference by Similarity to an Ideal Solution (TOPSIS) method. Results show the model is reliable and provide a feasible way in evaluating quality of traditional Chinese medicine with multiple components. PMID:28839384
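For reference, the TOPSIS step of such a scheme can be sketched in a few lines of numpy; the decision matrix and weights below are invented, and the paper's full method additionally blends grey relational grades into the score.

import numpy as np

# Rows: batches of the herbal material; columns: measured marker contents
# (values and weights are illustrative, all criteria treated as "benefit" type).
X = np.array([[1.2, 0.8, 3.1, 0.9],
              [1.0, 1.1, 2.8, 1.2],
              [0.7, 0.6, 3.5, 0.8]])
w = np.array([0.3, 0.2, 0.3, 0.2])      # criterion weights

R = X / np.sqrt((X ** 2).sum(axis=0))   # vector-normalised decision matrix
V = R * w                               # weighted normalised matrix
ideal, anti = V.max(axis=0), V.min(axis=0)

d_plus = np.sqrt(((V - ideal) ** 2).sum(axis=1))    # distance to ideal solution
d_minus = np.sqrt(((V - anti) ** 2).sum(axis=1))    # distance to anti-ideal solution
closeness = d_minus / (d_plus + d_minus)

print("Closeness to ideal solution:", np.round(closeness, 3))
print("Quality ranking (best first):", np.argsort(-closeness) + 1)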
Measuring Quality in Special Libraries: Lessons from Service Marketing.
ERIC Educational Resources Information Center
White, Marilyn Domas; Abels, Eileen G.
1995-01-01
Surveys the service marketing literature for models and data-gathering instruments measuring service quality, particularly the instruments SERVQUAL and SERVPERF, and assesses their applicability to special libraries and information centers. Topics include service characteristics and definitions of service; performance-minus-expectations and…
Case studies of severe pollution events due to forest fires/dust storms/industrial haze, from the integrated 2001 aerosol dataset, will be presented within the context of air quality and human health.
No-reference quality assessment based on visual perception
NASA Astrophysics Data System (ADS)
Li, Junshan; Yang, Yawei; Hu, Shuangyan; Zhang, Jiao
2014-11-01
The visual quality assessment of images/videos is an ongoing hot research topic, which has become more and more important for numerous image and video processing applications with the rapid development of digital imaging and communication technologies. The goal of image quality assessment (IQA) algorithms is to automatically assess the quality of images/videos in agreement with human quality judgments. Up to now, two kinds of models have been used for IQA, namely full-reference (FR) and no-reference (NR) models. For FR models, IQA algorithms interpret image quality as fidelity or similarity with a perfect image in some perceptual space. However, the reference image is not available in many practical applications, and a NR IQA approach is desired. Considering natural vision as optimized by millions of years of evolutionary pressure, many methods attempt to achieve consistency in quality prediction by modeling salient physiological and psychological features of the human visual system (HVS). To reach this goal, researchers try to simulate the HVS with image sparsity coding and supervised machine learning, which mirror two main features of the HVS: it captures scenes by sparse coding and uses experiential knowledge to apperceive objects. In this paper, we propose a novel IQA approach based on visual perception. Firstly, a standard model of the HVS is studied and analyzed, and the sparse representation of an image is accomplished with the model; then, the mapping correlation between sparse codes and subjective quality scores is trained with the regression technique of the least squares support vector machine (LS-SVM), which yields a regressor that can predict image quality; finally, the visual quality metric of an image is predicted with the trained regressor. We validate the performance of the proposed approach on the Laboratory for Image and Video Engineering (LIVE) database; the specific distortion types present in the database are: 227 images of JPEG2000, 233 images of JPEG, 174 images of White Noise, 174 images of Gaussian Blur, and 174 images of Fast Fading. The database includes a subjective differential mean opinion score (DMOS) for each image. The experimental results show that the proposed approach not only can assess the quality of many kinds of distorted images, but also exhibits superior accuracy and monotonicity.
NASA Astrophysics Data System (ADS)
Henderson, B. H.; Akhtar, F.; Pye, H. O. T.; Napelenok, S. L.; Hutzell, W. T.
2013-09-01
Transported air pollutants receive increasing attention as regulations tighten and global concentrations increase. The need to represent international transport in regional air quality assessments requires improved representation of boundary concentrations. Currently available observations are too sparse vertically to provide boundary information, particularly for ozone precursors, but global simulations can be used to generate spatially and temporally varying Lateral Boundary Conditions (LBC). This study presents a public database of global simulations designed and evaluated for use as LBC for air quality models (AQMs). The database covers the contiguous United States (CONUS) for the years 2000-2010 and contains hourly varying concentrations of ozone, aerosols, and their precursors. The database is complimented by a tool for configuring the global results as inputs to regional scale models (e.g., Community Multiscale Air Quality or Comprehensive Air quality Model with extensions). This study also presents an example application based on the CONUS domain, which is evaluated against satellite retrieved ozone vertical profiles. The results show performance is largely within uncertainty estimates for the Tropospheric Emission Spectrometer (TES) with some exceptions. The major difference shows a high bias in the upper troposphere along the southern boundary in January. This publication documents the global simulation database, the tool for conversion to LBC, and the fidelity of concentrations on the boundaries. This documentation is intended to support applications that require representation of long-range transport of air pollutants.
Development of a three dimensional numerical water quality model for continental shelf applications
NASA Technical Reports Server (NTRS)
Spaulding, M.; Hunter, D.
1975-01-01
A model to predict the distribution of water quality parameters in three dimensions was developed. The mass transport equation was solved using a non-dimensional vertical axis and an alternating-direction-implicit finite difference technique. The reaction kinetics of the constituents were incorporated into a matrix method which permits computation of the interactions of multiple constituents. Methods for the computation of dispersion coefficients and coliform bacteria decay rates were determined. Numerical investigations of dispersive and dissipative effects showed that the three-dimensional model performs as predicted by analysis of simpler cases. The model was then applied to a two dimensional vertically averaged tidal dynamics model for the Providence River. It was also extended to a steady state application by replacing the time step with an iteration sequence. This modification was verified by comparison to analytical solutions and applied to a river confluence situation.
NASA Astrophysics Data System (ADS)
Devendran, A. A.; Lakshmanan, G.
2014-11-01
Data quality for GIS processing and analysis is becoming an increasing concern due to the accelerated application of GIS technology in problem-solving and decision-making roles. Uncertainty in the geographic representation of the real world arises because these representations are incomplete. Identifying the sources of these uncertainties and the ways in which they operate in GIS-based representations becomes crucial in any spatial data representation and geospatial analysis, whatever the field of application. This paper reviews articles on the various components of spatial data quality and the uncertainties inherent in them, with special focus on two fields of application: Urban Simulation and Hydrological Modelling. Urban growth is a complicated process involving the spatio-temporal changes of all socio-economic and physical components at different scales. The Cellular Automata (CA) model is one such simulation model: it randomly selects potential cells for urbanisation, and transition rules evaluate the properties of each cell and its neighbours. Uncertainty arising from CA modelling is assessed mainly using sensitivity analysis, including the Monte Carlo simulation method. Likewise, the importance of hydrological uncertainty analysis has been emphasized in recent years, and there is an urgent need to incorporate uncertainty estimation into water resources assessment procedures. The Soil and Water Assessment Tool (SWAT) is a continuous-time watershed model used to evaluate various impacts of land use management and climate on hydrology and water quality. Uncertainties in hydrological modelling with SWAT are dealt with primarily by the Generalized Likelihood Uncertainty Estimation (GLUE) method.
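A minimal GLUE sketch in Python, with a toy storage-discharge model standing in for SWAT, a Nash-Sutcliffe likelihood, and an arbitrary 0.5 behavioural threshold; everything here is illustrative.

import numpy as np

rng = np.random.default_rng(5)
rain = rng.gamma(1.5, 2.0, size=365)            # synthetic daily rainfall forcing

def toy_model(rain, k, c):
    # Single linear store: fraction c of rain enters storage, fraction k drains.
    q, store = np.zeros_like(rain), 0.0
    for i, p in enumerate(rain):
        store += c * p
        q[i] = k * store
        store -= q[i]
    return q

# "Observed" flows: the toy model with known parameters plus noise.
q_obs = toy_model(rain, 0.3, 0.6) * (1 + 0.1 * rng.normal(size=365))

def nse(sim, obs):
    return 1 - np.sum((sim - obs) ** 2) / np.sum((obs - obs.mean()) ** 2)

# GLUE: sample parameter sets, keep those whose likelihood exceeds the threshold.
samples = rng.uniform([0.05, 0.2], [0.6, 0.9], size=(2000, 2))   # (k, c)
scores = np.array([nse(toy_model(rain, k, c), q_obs) for k, c in samples])
behavioural = samples[scores > 0.5]
print(f"{len(behavioural)} behavioural parameter sets; "
      f"k range {behavioural[:, 0].min():.2f}-{behavioural[:, 0].max():.2f}")

The spread of the behavioural sets (and of their simulated outputs) is what GLUE reports as predictive uncertainty.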
Application of a water quality model in the White Cart water catchment, Glasgow, UK.
Liu, S; Tucker, P; Mansell, M; Hursthouse, A
2003-03-01
Water quality models of urban systems have previously focused on point source (sewerage system) inputs. Little attention has been given to diffuse inputs, and research into diffuse pollution has been largely confined to agricultural sources. This paper reports on new research that is aimed at integrating diffuse inputs into an urban water quality model. An integrated model is introduced that is made up of four modules: hydrology, contaminant point sources, nutrient cycling and leaching. The hydrology module, T&T, consists of a TOPMODEL (a TOPography-based hydrological MODEL), which simulates runoff from pervious areas, and a two-tank model, which simulates runoff from impervious urban areas. Linked into the two-tank model, the contaminant point source module simulates the overflow from the sewerage system in heavy rain. The widely known SOILN (SOIL Nitrate model) is the basis of the nitrogen cycling module. Finally, the leaching module consists of two functions: the production function and the transfer function. The production function is based on SLIM (Solute Leaching Intermediate Model), while the transfer function is based on the 'flushing hypothesis', which postulates a relationship between contaminant concentrations in the receiving water course and the extent to which the catchment is saturated. This paper outlines the modelling methodology and the model structures that have been developed. An application of this model in the White Cart catchment (Glasgow) is also included.
Data Applicability of Heritage and New Hardware for Launch Vehicle System Reliability Models
NASA Technical Reports Server (NTRS)
Al Hassan Mohammad; Novack, Steven
2015-01-01
Many launch vehicle systems are designed and developed using heritage and new hardware. In most cases, the heritage hardware undergoes modifications to fit new functional system requirements, impacting the failure rates and, ultimately, the reliability data. New hardware, which lacks historical data, is often compared to like systems when estimating failure rates. Some qualification of applicability for the data source to the current system should be made. Accurately characterizing the reliability data applicability and quality under these circumstances is crucial to developing model estimations that support confident decisions on design changes and trade studies. This presentation will demonstrate a data-source classification method that ranks reliability data according to applicability and quality criteria to a new launch vehicle. This method accounts for similarities/dissimilarities in source and applicability, as well as operating environments like vibrations, acoustic regime, and shock. This classification approach will be followed by uncertainty-importance routines to assess the need for additional data to reduce uncertainty.
Application of Hierarchy Theory to Cross-Scale Hydrologic Modeling of Nutrient Loads
We describe a model called Regional Hydrologic Modeling for Environmental Evaluation 16 (RHyME2) for quantifying annual nutrient loads in stream networks and watersheds. RHyME2 is 17 a cross-scale statistical and process-based water-quality model. The model ...
NASA Technical Reports Server (NTRS)
Quattrochi, D. A.; Lapenta, W. M.; Crosson, W. L.; Estes, M. G., Jr.; Limaye, A.; Kahn, M.
2006-01-01
Local and state agencies are responsible for developing state implementation plans to meet National Ambient Air Quality Standards. Numerical models used for this purpose simulate the transport and transformation of criteria pollutants and their precursors. The specification of land use/land cover (LULC) plays an important role in controlling modeled surface meteorology and emissions. NASA researchers have worked with partners and Atlanta stakeholders to incorporate an improved high-resolution LULC dataset for the Atlanta area within their modeling system and to assess meteorological and air quality impacts of Urban Heat Island (UHI) mitigation strategies. The new LULC dataset provides a more accurate representation of land use, has the potential to improve model accuracy, and facilitates prediction of LULC changes. Use of the new LULC dataset for two summertime episodes improved meteorological forecasts, with an existing daytime cold bias of approximately 3 °C reduced by 30%. Model performance for ozone prediction did not show improvement. In addition, LULC changes due to Atlanta area urbanization were predicted through 2030, for which model simulations predict higher urban air temperatures. The incorporation of UHI mitigation strategies partially offset this warming trend. The data and modeling methods used are generally applicable to other U.S. cities.
Air Quality Response Modeling for Decision Support
Air quality management relies on photochemical models to predict the responses of pollutant concentrations to changes in emissions. Such modeling is especially important for secondary pollutants such as ozone and fine particulate matter, which vary nonlinearly with changes in emissions. Numerous techniques for probing pollutant-emission relationships within photochemical models have been developed and deployed for a variety of decision support applications. However, atmospheric response modeling remains complicated by the challenge of validating sensitivity results against observable data. This manuscript reviews the state of the science of atmospheric response modeling as well as efforts to characterize the accuracy and uncertainty of sensitivity results. The National Exposure Research Laboratory's (NERL's) Atmospheric Modeling and Analysis Division (AMAD) conducts research in support of EPA's mission to protect human health and the environment. AMAD's research program is engaged in developing and evaluating predictive atmospheric models on all spatial and temporal scales for forecasting the Nation's air quality and for assessing changes in air quality and air pollutant exposures, as affected by changes in ecosystem management and regulatory decisions. AMAD is responsible for providing a sound scientific and technical basis for regulatory policies based on air quality models to improve ambient air quality. The models developed by AMAD are being use
Review of nitrogen fate models applicable to forest landscapes in the Southern U.S.
D. M. Amatya; C. G. Rossi; A. Saleh; Z. Dai; M. A. Youssef; R. G. Williams; D. D. Bosch; G. M. Chescheir; G. Sun; R. W. Skaggs; C. C. Trettin; E. D. Vance; J. E. Nettles; S. Tian
2013-01-01
Assessing the environmental impacts of fertilizer nitrogen (N) used to increase productivity in managed forests is complex due to a wide range of abiotic and biotic factors affecting its forms and movement. Models developed to predict fertilizer N fate (e.g., cycling processes) and water quality impacts vary widely in their design, scope, and potential application. We...
Christopher Daly; Jonathan W. Smith; Joseph I. Smith; Robert B. McKane
2007-01-01
High-quality daily meteorological data at high spatial resolution are essential for a variety of hydrologic and ecological modeling applications that support environmental risk assessments and decision-making. This paper describes the development, application, and assessment of methods to construct daily high-resolution (~50-m cell size) meteorological grids for the...
Borgen, Nicolai T
2014-11-01
This paper addresses the recent discussion on confounding in the returns to college quality literature using the Norwegian case. The main advantage of studying Norway is the quality of the data. Norwegian administrative data provide information on college applications, family relations and a rich set of control variables for all Norwegian citizens applying to college between 1997 and 2004 (N = 141,319) and their succeeding wages between 2003 and 2010 (676,079 person-year observations). With these data, this paper uses a subset of the models that have rendered mixed findings in the literature in order to investigate to what extent confounding biases the returns to college quality. I compare estimates obtained using standard regression models to estimates obtained using the self-revelation model of Dale and Krueger (2002), a sibling fixed effects model and the instrumental variable model used by Long (2008). Using these methods, I consistently find increasing returns to college quality over the course of students' work careers, with positive returns only later in students' work careers. I conclude that the standard regression estimate provides a reasonable estimate of the returns to college quality. Copyright © 2014 Elsevier Inc. All rights reserved.
INVERSE MODEL ESTIMATION AND EVALUATION OF SEASONAL NH3 EMISSIONS
The presentation topic is inverse modeling for estimate and evaluation of emissions. The case study presented is the need for seasonal estimates of NH3 emissions for air quality modeling. The inverse modeling application approach is first described, and then the NH
Sentinel site data for model improvement – Definition and characterization
USDA-ARS?s Scientific Manuscript database
Crop models are increasingly being used to assess the impacts of future climate change on production and food security. High quality site-specific data on weather, soils, management, and cultivar are needed for those model applications. Also important, is that model development, evaluation, improvem...
APPLICATION OF FINE SCALE AIR TOXICS MODELING WITH CMAQ TO HAPEM5
This paper provides a preliminary demonstration of the EPA neighborhood scale modeling paradigm for air toxics by linking concentration from the Community Multiscale Air Quality (CMAQ) modeling system to the fifth version of the Hazardous Pollutant Exposure Model (HAPEM5). For t...
Application of Wavelet Filters in an Evaluation of Photochemical Model Performance
Air quality model evaluation can be enhanced with time-scale specific comparisons of outputs and observations. For example, high-frequency (hours to one day) time scale information in observed ozone is not well captured by deterministic models and its incorporation into model pe...
APPLICATION OF A FULLY DISTRIBUTED WASHOFF AND TRANSPORT MODEL FOR A GULF COAST WATERSHED
Advances in hydrologic modeling have been shown to improve the accuracy of rainfall runoff simulation and prediction. Building on the capabilities of distributed hydrologic modeling, a water quality model was developed to simulate buildup, washoff, and advective transport of a co...
APPLICATION OF THE HSPF MODEL TO THE SOUTH FORK OF THE BROAD RIVER WATERSHED IN NORTHEASTERN GEORGIA
The Hydrological Simulation Program-Fortran (HSPF) is a comprehensive watershed model which simulates hydrology and water quality at user-specified temporal and spatial scales. Well-established model calibration and validation procedures are followed when adjusting model paramete...
Open source molecular modeling.
Pirhadi, Somayeh; Sunseri, Jocelyn; Koes, David Ryan
2016-09-01
The success of molecular modeling and computational chemistry efforts is, by definition, dependent on quality software applications. Open source software development provides many advantages to users of modeling applications, not the least of which is that the software is free and completely extendable. In this review we categorize, enumerate, and describe available open source software packages for molecular modeling and computational chemistry. An updated online version of this catalog can be found at https://opensourcemolecularmodeling.github.io. Copyright © 2016 The Author(s). Published by Elsevier Inc. All rights reserved.
A Tentative Study on the Evaluation of Community Health Service Quality*
NASA Astrophysics Data System (ADS)
Ma, Zhi-qiang; Zhu, Yong-yue
Community health service is a key focus of health reform in China. Based on pertinent studies, this paper constructed an indicator system for community health service quality evaluation from five perspectives (visible image, reliability, responsiveness, assurance and sympathy), following the service quality evaluation scale designed by Parasuraman, Zeithaml and Berry. A multilevel fuzzy synthetic evaluation model was constructed to evaluate community health service using fuzzy mathematics theory. The applicability and operability of the evaluation indicator system and evaluation model were verified by empirical analysis.
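The two-level fuzzy synthetic evaluation can be illustrated with a short numpy sketch; the weights and membership degrees below are invented, and the weighted-average operator is only one of several possible fuzzy composition operators.

import numpy as np

# Grades: excellent, good, fair, poor.
# First level: aggregate the indicators within one perspective (e.g., reliability).
R_reliability = np.array([[0.3, 0.4, 0.2, 0.1],     # indicator 1 memberships
                          [0.2, 0.5, 0.2, 0.1]])    # indicator 2 memberships
w_reliability = np.array([0.6, 0.4])                # indicator weights (assumed)
B_reliability = w_reliability @ R_reliability       # weighted-average composition

# Assumed first-level results for the other four perspectives
# (visible image, responsiveness, assurance, sympathy).
B_others = np.array([[0.25, 0.45, 0.20, 0.10],
                     [0.20, 0.40, 0.30, 0.10],
                     [0.30, 0.40, 0.20, 0.10],
                     [0.15, 0.45, 0.30, 0.10]])

# Second level: combine the five perspective-level results.
R2 = np.vstack([B_reliability, B_others])
W2 = np.array([0.25, 0.20, 0.20, 0.20, 0.15])       # perspective weights (assumed)
B = W2 @ R2
print("Membership in each grade:", np.round(B, 3))
print("Overall grade:", ["excellent", "good", "fair", "poor"][int(B.argmax())])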
Reduced-form air quality modeling for community-scale applications
Transportation plays an important role in modern society, but its impact on air quality has been shown to have significant adverse effects on public health. Numerous reviews (HEI, CDC, WHO) summarizing findings of hundreds of studies conducted mainly in the last decade, conclude ...
Evaluation of NASA Data Products for RPO Applications
NASA Technical Reports Server (NTRS)
Frisbie, Troy; Knowlton, Kelly; Andrews, Jane
2005-01-01
This presentation summarizes preliminary investigations at SSC by NASA's ASD in Air Quality including decision support tools, partner plans, working groups, and committees. An overview of follow-on short-term and long-term objectives is also provided. A table of potential NASA sensors for use with air quality applications is included, along with specifications for MODIS 04 and 06 products. This presentation was originally given by Rich Piorot of the Vermont Department of Environmental Conservation - Air Quality as part of a round-table discussion during "Exploring Collaborative Opportunities in Air Quality Monitoring, Modelling and Communication Workshop" in Boulder, CO, on March 21-22, 2005; verbal consent for this presentation to be provided to Mr. Piorot was given by the NASA SSC ASD Air Quality Program Manager on March 14, 2005.
DataFed: A Federated Data System for Visualization and Analysis of Spatio-Temporal Air Quality Data
NASA Astrophysics Data System (ADS)
Husar, R. B.; Hoijarvi, K.
2017-12-01
DataFed is a distributed web-services-based computing environment for accessing, processing, and visualizing atmospheric data in support of air quality science and management. The flexible, adaptive environment facilitates the access and flow of atmospheric data from provider to users by enabling the creation of user-driven data processing/visualization applications. DataFed `wrapper' components, non-intrusively wrap heterogeneous, distributed datasets for access by standards-based GIS web services. The mediator components (also web services) map the heterogeneous data into a spatio-temporal data model. Chained web services provide homogeneous data views (e.g., geospatial, time views) using a global multi-dimensional data model. In addition to data access and rendering, the data processing component services can be programmed for filtering, aggregation, and fusion of multidimensional data. A complete application software is written in a custom made data flow language. Currently, the federated data pool consists of over 50 datasets originating from globally distributed data providers delivering surface-based air quality measurements, satellite observations, emissions data as well as regional and global-scale air quality models. The web browser-based user interface allows point and click navigation and browsing the XYZT multi-dimensional data space. The key applications of DataFed are for exploring spatial pattern of pollutants, seasonal, weekly, diurnal cycles and frequency distributions for exploratory air quality research. Since 2008, DataFed has been used to support EPA in the implementation of the Exceptional Event Rule. The data system is also used at universities in the US, Europe and Asia.
Frameworks for Assessing the Quality of Modeling and Simulation Capabilities
NASA Astrophysics Data System (ADS)
Rider, W. J.
2012-12-01
The importance of assuring quality in modeling and simulation has spawned several frameworks for structuring the examination of quality. The format and content of these frameworks provide an emphasis, completeness and flow to assessment activities. I will examine four frameworks that have been developed and describe how they can be improved and applied to a broader set of high-consequence applications. Perhaps the first of these frameworks was known as CSAU (code scaling, applicability and uncertainty) [Boyack], used for nuclear reactor safety and endorsed by the United States Nuclear Regulatory Commission (USNRC). This framework was shaped by nuclear safety practice and the practical structure needed after the Three Mile Island accident. It incorporated the dominant experimental program, the dominant analysis approach, and concerns about the quality of modeling. The USNRC gave it the force of law, which made the nuclear industry take it seriously. After the cessation of nuclear weapons testing, the United States began a program of examining the reliability of these weapons without testing. This program utilizes science including theory, modeling, simulation and experimentation to replace the underground testing. The emphasis on modeling and simulation necessitated attention to the quality of these simulations. Sandia developed the PCMM (predictive capability maturity model) to structure this attention [Oberkampf]. PCMM divides simulation into six core activities to be examined and graded relative to the needs of the modeling activity. NASA [NASA] has built yet another framework in response to the tragedy of the space shuttle accidents. Finally, Ben-Haim and Hemez focus upon modeling robustness and predictive fidelity in another approach. These frameworks are similar and are applied in a similar fashion. The adoption of these frameworks at Sandia and NASA has been slow and arduous because the force of law has not assisted acceptance. All existing frameworks are incomplete and need to be extended, incorporating elements from the others as well as new elements related to how models are solved and how the model will be applied. I will describe this merger of approaches and how it should be applied. The problems in adoption are related to basic human nature, in that no one likes to be graded or told they are not sufficiently quality-oriented. Rather than engage in an adversarial role, I suggest that the frameworks be viewed as collaborative tools and used to structure collaborations that assist modeling and simulation efforts in achieving high quality. The frameworks provide a comprehensive set of modeling and simulation themes that should be explored in pursuit of high quality. W. Oberkampf, M. Pilch, and T. Trucano, Predictive Capability Maturity Model for Computational Modeling and Simulation, SAND2007-5948, 2007. B. Boyack, Quantifying Reactor Safety Margins Part 1: An Overview of the Code Scaling, Applicability, and Uncertainty Evaluation Methodology, Nucl. Eng. Design, 119, pp. 1-15, 1990. National Aeronautics and Space Administration, Standard for Models and Simulations, NASA-STD-7009, 2008. Y. Ben-Haim and F. Hemez, Robustness, fidelity and prediction-looseness of models, Proc. R. Soc. A, 468, 227-244, 2012.
Booker, Kathy; Hilgenberg, Cheryl
2010-01-01
Nursing is often considered expensive in the cost analysis of academic programs. Yet nursing programs have the power to attract many students, and the national nursing shortage has resulted in a high demand for nurses. Methods to systematically assess programs across an entire university academic division are often dissimilar in technique and outcome. At a small, private, Midwestern university, a model for comprehensive program assessment, titled the Quality, Potential and Cost (QPC) model, was developed and applied to each major offered at the university through the collaborative effort of directors, chairs, deans, and the vice president for academic affairs. The QPC model provides a means of equalizing data so that single measures (such as cost) are not viewed in isolation. It also provides a common language to ensure that all academic leaders at an institution apply consistent methods for assessment of individual programs. The application of the QPC model allowed for consistent, fair assessments and the ability to allocate resources to programs according to strategic direction. In this article, the application of the QPC model to School of Nursing majors and other selected university majors will be illustrated. Copyright 2010 Elsevier Inc. All rights reserved.
AVAILABLE MICRO-ACTIVITY DATA AND THEIR APPLICABILITY TO AGGREGATE EXPOSURE MODELING
Several human exposure models have been developed in recent years to address children's aggregate and cumulative exposures to pesticides under the Food Quality Protection Act of 1996. These models estimate children's exposures via all significant routes and pathways including ...
Alava, Juan José; Ross, Peter S; Lachmuth, Cara; Ford, John K B; Hickie, Brendan E; Gobas, Frank A P C
2012-11-20
The development of an area-based polychlorinated biphenyl (PCB) food-web bioaccumulation model enabled a critical evaluation of the efficacy of sediment quality criteria and prey tissue residue guidelines in protecting fish-eating resident killer whales of British Columbia and adjacent waters. Model-predicted and observed PCB concentrations in resident killer whales and Chinook salmon were in good agreement, supporting the model's application for risk assessment and criteria development. Model application shows that PCB concentrations in the sediments from the resident killer whale's Critical Habitats and entire foraging range lead to PCB concentrations in most killer whales that exceed PCB toxicity threshold concentrations reported for marine mammals. Results further indicate that current PCB sediment quality and prey tissue residue criteria for fish-eating wildlife are not protective of killer whales and are not appropriate for assessing risks of PCB-contaminated sediments to high trophic level biota. We present a novel methodology for deriving sediment quality criteria and tissue residue guidelines that protect biota of high trophic levels under various PCB management scenarios. PCB concentrations in sediments and in prey that are deemed protective of resident killer whale health are much lower than current criteria values, underscoring the extreme vulnerability of high trophic level marine mammals to persistent and bioaccumulative contaminants.
Model-Driven Development of Interactive Multimedia Applications with MML
NASA Astrophysics Data System (ADS)
Pleuss, Andreas; Hussmann, Heinrich
There is an increasing demand for high-quality interactive applications which combine complex application logic with a sophisticated user interface, making use of individual media objects like graphics, animations, 3D graphics, audio or video. Their development is still challenging as it requires the integration of software design, user interface design, and media design.
A system framework of inter-enterprise machining quality control based on fractal theory
NASA Astrophysics Data System (ADS)
Zhao, Liping; Qin, Yongtao; Yao, Yiyong; Yan, Peng
2014-03-01
In order to meet the quality control requirements of dynamic and complicated product machining processes among enterprises, a system framework of inter-enterprise machining quality control based on fractal theory was proposed. In this system framework, the fractal-specific characteristic of the inter-enterprise machining quality control function was analysed, and the model of inter-enterprise machining quality control was constructed from the nature of fractal structures. Furthermore, the goal-driven strategy of inter-enterprise quality control and the dynamic organisation strategy of inter-enterprise quality improvement were constructed through characteristic analysis of this model. In addition, the architecture of inter-enterprise machining quality control based on fractal theory was established by means of Web services. Finally, a case study for application was presented. The result showed that the proposed method was effective and could provide guidance for quality control and support for product reliability in inter-enterprise machining processes.
Utilization of building information modeling in infrastructure’s design and construction
NASA Astrophysics Data System (ADS)
Zak, Josef; Macadam, Helen
2017-09-01
Building Information Modeling (BIM) is a concept that has gained its place in the design, construction, and maintenance of buildings in the Czech Republic during recent years. This paper describes the usage, applications, and potential benefits and disadvantages connected with the implementation of BIM principles in the preparation and construction of infrastructure projects. Part of the paper describes the status of BIM implementation in the Czech Republic, and there is a review of several virtual design and construction practices in the Czech Republic. Examples of best practice are presented from current infrastructure projects. The paper further summarizes experiences with new technologies gained from the application of BIM-related workflows. The focus is on BIM model utilization for machine control systems on site, quality assurance, quality management, and construction management.
NASA Astrophysics Data System (ADS)
Amicarelli, A.; Gariazzo, C.; Finardi, S.; Pelliccioni, A.; Silibello, C.
2008-05-01
Data assimilation techniques are methods to limit the growth of errors in a dynamical model by allowing observations distributed in space and time to force (nudge) model solutions. They have become common for meteorological model applications in recent years, especially to enhance weather forecasts and to support air-quality studies. In order to investigate the influence of different data assimilation techniques on the meteorological fields produced by the RAMS model, and to evaluate their effects on the ozone and PM10 concentrations predicted by the FARM model, several numerical experiments were conducted over the urban area of Rome, Italy, during a summer episode.
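To make the nudging idea concrete, the sketch below relaxes a model field toward observations with a relaxation strength G. It is a minimal illustration of Newtonian relaxation, not the RAMS/FARM implementation; the variable names, the value of G, and the time step are assumptions.

import numpy as np

def nudge(model_state, obs, obs_mask, G=1.0e-4, dt=60.0):
    """One nudging step: relax model values toward observations where
    observations exist. G [1/s] sets the relaxation strength, dt [s] is
    the model time step. Illustrative only; not the RAMS formulation."""
    increment = G * (obs - model_state) * dt
    return np.where(obs_mask, model_state + increment, model_state)

# Example: a 1-D temperature field nudged toward two observed points.
model_T = np.array([290.0, 291.0, 292.0, 293.0])
obs_T   = np.array([289.0,   0.0, 294.0,   0.0])
mask    = np.array([True, False, True, False])
print(nudge(model_T, obs_T, mask))

Repeating the step over many model time steps draws the simulated field toward the observations without imposing them exactly.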
Environmental Flow for Sungai Johor Estuary
NASA Astrophysics Data System (ADS)
Adilah, A. Kadir; Zulkifli, Yusop; Zainura, Z. Noor; Bakhiah, Baharim N.
2018-03-01
Sungai Johor estuary is a vital water body in the south of Johor and greatly affects the water quality in the Johor Straits. In the development of the hydrodynamic and water quality models for the Sungai Johor estuary, the Environmental Fluid Dynamics Code (EFDC) model was selected. In this application, the EFDC hydrodynamic model was configured to simulate time-varying surface elevation, velocity, salinity, and water temperature. The EFDC water quality model was configured to simulate dissolved oxygen (DO), dissolved organic carbon (DOC), chemical oxygen demand (COD), ammoniacal nitrogen (NH3-N), nitrate nitrogen (NO3-N), phosphate (PO4), and chlorophyll a. The hydrodynamic and water quality model calibration was performed using a set of site-specific data acquired in January 2008. The simulated water temperature, salinity, and DO showed good to fairly good agreement with observations. The calculated correlation coefficients between computed and observed temperature and salinity were lower than those for water level. Sensitivity analysis was performed on the hydrodynamic and water quality model input parameters to quantify their impact on modeling results such as water surface elevation, salinity, and dissolved oxygen concentration. It is anticipated and recommended that the development of this model be continued to synthesize additional field data into the modeling process.
Alamar, Priscila D; Caramês, Elem T S; Poppi, Ronei J; Pallone, Juliana A L
2016-07-01
The present study investigated the application of near infrared spectroscopy as a green, quick, and efficient alternative to analytical methods currently used to evaluate the quality (moisture, total sugars, acidity, soluble solids, pH and ascorbic acid) of frozen guava and passion fruit pulps. Fifty samples were analyzed by near infrared spectroscopy (NIR) and reference methods. Partial least squares regression (PLSR) was used to develop calibration models relating the NIR spectra to the reference values. Reference methods indicated adulteration by water addition in 58% of guava pulp samples and 44% of yellow passion fruit pulp samples. The PLS models produced low values of root mean square error of calibration (RMSEC) and root mean square error of prediction (RMSEP), and coefficients of determination above 0.7. Moisture and total sugars presented the best calibration models (RMSEP of 0.240 and 0.269, respectively, for guava pulp; RMSEP of 0.401 and 0.413, respectively, for passion fruit pulp), which enables the application of these models to determine adulteration of guava and yellow passion fruit pulp by water or sugar addition. The models constructed for calibration of quality parameters of frozen fruit pulps in this study indicate that NIR spectroscopy coupled with multivariate calibration could be applied to determine the quality of guava and yellow passion fruit pulp. Copyright © 2016 Elsevier Ltd. All rights reserved.
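For readers unfamiliar with this chemometric workflow, the sketch below shows how a PLS calibration relating NIR spectra to a reference quality parameter might be set up with scikit-learn. The spectra are synthetic and the number of latent variables is an arbitrary assumption; it is not the calibration used in the study.

import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_squared_error, r2_score

# Synthetic stand-in for 50 NIR spectra (700 wavelengths) and a reference
# quality value such as moisture; real work would use measured data.
rng = np.random.default_rng(0)
X = rng.normal(size=(50, 700))
y = X[:, :10].sum(axis=1) + rng.normal(scale=0.1, size=50)

X_cal, X_val, y_cal, y_val = train_test_split(X, y, test_size=0.3, random_state=0)

pls = PLSRegression(n_components=5)   # number of latent variables: an assumption
pls.fit(X_cal, y_cal)

y_pred = pls.predict(X_val).ravel()
rmsep = mean_squared_error(y_val, y_pred) ** 0.5
print(f"RMSEP = {rmsep:.3f}, R2 = {r2_score(y_val, y_pred):.3f}")

In practice the number of latent variables would be chosen by cross-validation, and RMSEC/RMSEP would be reported for each quality parameter as in the abstract.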
A simulation-based approach for estimating premining water quality: Red Mountain Creek, Colorado
Runkel, Robert L.; Kimball, Briant A; Walton-Day, Katherine; Verplanck, Philip L.
2007-01-01
Regulatory agencies are often charged with the task of setting site-specific numeric water quality standards for impaired streams. This task is particularly difficult for streams draining highly mineralized watersheds with past mining activity. Baseline water quality data obtained prior to mining are often non-existent and application of generic water quality standards developed for unmineralized watersheds is suspect given the geology of most watersheds affected by mining. Various approaches have been used to estimate premining conditions, but none of the existing approaches rigorously consider the physical and geochemical processes that ultimately determine instream water quality. An approach based on simulation modeling is therefore proposed herein. The approach utilizes synoptic data that provide spatially-detailed profiles of concentration, streamflow, and constituent load along the study reach. This field data set is used to calibrate a reactive stream transport model that considers the suite of physical and geochemical processes that affect constituent concentrations during instream transport. A key input to the model is the quality and quantity of waters entering the study reach. This input is based on chemical analyses available from synoptic sampling and observed increases in streamflow along the study reach. Given the calibrated model, additional simulations are conducted to estimate premining conditions. In these simulations, the chemistry of mining-affected sources is replaced with the chemistry of waters that are thought to be unaffected by mining (proximal, premining analogues). The resultant simulations provide estimates of premining water quality that reflect both the reduced loads that were present prior to mining and the processes that affect these loads as they are transported downstream. This simulation-based approach is demonstrated using data from Red Mountain Creek, Colorado, a small stream draining a heavily-mined watershed. Model application to the premining problem for Red Mountain Creek is based on limited field reconnaissance and chemical analyses; additional field work and analyses may be needed to develop definitive, quantitative estimates of premining water quality.
MODELS-3 INSTALLATION PROCEDURES FOR A PC WITH AN NT OPERATING SYSTEM (MODELS-3 VERSION 4.0)
Models-3 is a flexible software system designed to simplify the development and use of air quality models and other environmental decision support tools. It is designed for applications ranging from regulatory and policy analysis to understanding the complex interactions of at...
Accreditation and Continuous Quality Improvement in Athletic Training Education.
ERIC Educational Resources Information Center
Peer, Kimberly S.; Rakich, Jonathon S.
2000-01-01
Describes the application of the continuous quality improvement model, commonly associated with the business sector, to entry-level athletic training education programs accredited by the Commission on the Accreditation of Allied Health Education Programs. After discussing historical perspectives on athletic training education programs, the paper…
Lindner-Lunsford, J. B.; Ellis, S.R.
1987-01-01
Multievent, conceptually based models and a single-event, multiple linear-regression model for estimating storm-runoff quantity and quality from urban areas were calibrated and verified for four small (57 to 167 acres) basins in the Denver metropolitan area, Colorado. The basins represented different land-use types - light commercial, single-family housing, and multi-family housing. Both types of models were calibrated using the same data set for each basin. A comparison was made between the storm-runoff volume, peak flow, and storm-runoff loads of seven water quality constituents simulated by each of the models by use of identical verification data sets. The models studied were the U.S. Geological Survey's Distributed Routing Rainfall-Runoff Model-Version II (DR3M-II) (a runoff-quantity model designed for urban areas), and a multievent urban runoff quality model (DR3M-QUAL). Water quality constituents modeled were chemical oxygen demand, total suspended solids, total nitrogen, total phosphorus, total lead, total manganese, and total zinc. (USGS)
NASA Astrophysics Data System (ADS)
Maitra, Subrata; Banerjee, Debamalya
2010-10-01
The present article is based on an application of product quality and design improvement related to the nature of machinery failures and plant operational problems at an industrial blower fan company. The project aims at developing the product on the basis of standardized production parameters for selling its products in the market. Special attention is also paid to blower fans that have been ordered directly by the customer on the basis of the installed capacity of air to be provided by the fan. Application of quality function deployment (QFD) is primarily a customer-oriented approach. The proposed model integrates QFD with the analytic hierarchy process (AHP) to select and rank the decision criteria for commercial and technical factors and to measure the decision parameters for selecting the best product in a competitive environment. The present AHP-QFD model justifies the selection of a blower fan with the help of a group of experts' opinions through pairwise comparison of the customer's and ergonomics-based technical design requirements. The steps involved in implementing the QFD-AHP approach and selecting weighted criteria may be helpful for all similar-purpose industries balancing cost and utility for a competitive product.
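To illustrate the AHP step in general terms, the sketch below computes priority weights from a reciprocal pairwise comparison matrix using the principal eigenvector. The example matrix and criterion names are invented, not taken from the blower-fan study.

import numpy as np

def ahp_weights(pairwise):
    """Priority weights from a reciprocal pairwise comparison matrix,
    taken as the principal eigenvector normalized to sum to one."""
    eigvals, eigvecs = np.linalg.eig(pairwise)
    principal = np.real(eigvecs[:, np.argmax(np.real(eigvals))])
    weights = np.abs(principal)
    return weights / weights.sum()

# Hypothetical comparison of three criteria (cost, capacity, ergonomics).
A = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 2.0],
              [1/5, 1/2, 1.0]])
print(ahp_weights(A))

In a full AHP-QFD study the expert group would also check the consistency ratio of the matrix before using the weights to rank design requirements.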
NASA Technical Reports Server (NTRS)
Ross, A.; Richards, A.; Keith, K.; Frew, C.; Boseck, J.; Sutton, S.; Watts, C.; Rickman, D.
2007-01-01
This project focused on a comprehensive utilization of air quality model products as decision support tools (DST) needed for public health applications. A review of past and future air quality measurement methods and their uncertainty, along with the relationship of air quality to national and global public health, is vital. This project described current and future NASA satellite remote sensing and ground sensing capabilities and the potential for using these sensors to enhance the prediction, prevention, and control of public health effects that result from poor air quality. The qualitative uncertainty of current satellite remotely sensed air quality, ground-based remotely sensed air quality, the air quality/public health model, and the decision-making process is evaluated in this study. Current peer-reviewed literature suggests that remotely sensed air quality parameters correlate well with ground-based sensor data. A complement of satellite remote-sensed and ground-sensed data is needed to enhance the models and tools used by policy makers for the protection of national and global public health communities.
Computational intelligence in earth sciences and environmental applications: issues and challenges.
Cherkassky, V; Krasnopolsky, V; Solomatine, D P; Valdes, J
2006-03-01
This paper introduces a generic theoretical framework for predictive learning, and relates it to data-driven and learning applications in earth and environmental sciences. The issues of data quality, selection of the error function, incorporation of the predictive learning methods into the existing modeling frameworks, expert knowledge, model uncertainty, and other application-domain specific problems are discussed. A brief overview of the papers in the Special Issue is provided, followed by discussion of open issues and directions for future research.
Defining quality in radiology.
Blackmore, C Craig
2007-04-01
The introduction of pay for performance in medicine represents an opportunity for radiologists to define quality in radiology. Radiology quality can be defined on the basis of the production model that currently drives reimbursement, codifying the role of radiologists as being limited to the production of timely and accurate radiology reports produced in conditions of maximum patient safety and communicated in a timely manner. Alternately, quality in radiology can also encompass the professional role of radiologists as diagnostic imaging specialists responsible for the appropriate use, selection, interpretation, and application of imaging. Although potentially challenging to implement, the professional model for radiology quality is a comprehensive assessment of the ways in which radiologists add value to patient care. This essay is a discussion of the definition of radiology quality and the implications of that definition.
SPARROW MODELING - Enhancing Understanding of the Nation's Water Quality
Preston, Stephen D.; Alexander, Richard B.; Woodside, Michael D.; Hamilton, Pixie A.
2009-01-01
The information provided here is intended to assist water-resources managers with interpretation of the U.S. Geological Survey (USGS) SPARROW model and its products. SPARROW models can be used to explain spatial patterns in monitored stream-water quality in relation to human activities and natural processes as defined by detailed geospatial information. Previous SPARROW applications have identified the sources and transport of nutrients in the Mississippi River basin, Chesapeake Bay watershed, and other major drainages of the United States. New SPARROW models with improved accuracy and interpretability are now being developed by the USGS National Water Quality Assessment (NAWQA) Program for six major regions of the conterminous United States. These new SPARROW models are based on updated geospatial data and stream-monitoring records from local, State, and other federal agencies.
NASA Astrophysics Data System (ADS)
Wu, Huiquan; Khan, Mansoor
2012-08-01
As an emerging technology, THz spectroscopy has gained increasing attention in the pharmaceutical area during the last decade. This attention is due to the fact that (1) it provides a promising alternative approach for in-depth understanding of both intermolecular interactions among pharmaceutical molecules and pharmaceutical product quality attributes; (2) it provides a promising alternative approach for enhanced process understanding of certain pharmaceutical manufacturing processes; and (3) it aligns with the FDA's pharmaceutical quality initiatives, most notably the Process Analytical Technology (PAT) initiative. In this work, the current status and progress made so far on using THz spectroscopy for pharmaceutical development and pharmaceutical PAT applications are reviewed. In the spirit of demonstrating the utility of a first-principles modeling approach for addressing the model validation challenge and reducing unnecessary model validation "burden" to facilitate THz pharmaceutical PAT applications, two scientific case studies based on published THz spectroscopy measurement results are created and discussed. Furthermore, other technical challenges and opportunities associated with adopting THz spectroscopy as a pharmaceutical PAT tool are highlighted.
1987-03-01
[Garbled table-of-contents fragment; recoverable headings: PART V: Modeling of Reaeration Through a Vented Hydroturbine; Model Development; Model Application.]
Quality tracing in meat supply chains
Mack, Miriam; Dittmer, Patrick; Veigt, Marius; Kus, Mehmet; Nehmiz, Ulfert; Kreyenschmidt, Judith
2014-01-01
The aim of this study was the development of a quality tracing model for vacuum-packed lamb that is applicable in different meat supply chains. Based on the development of relevant sensory parameters, the predictive model was developed by combining a linear primary model and the Arrhenius model as the secondary model. Then a process analysis was conducted to define general requirements for the implementation of the temperature-based model into a meat supply chain. The required hardware and software for continuous temperature monitoring were developed in order to use the model under practical conditions. Further on a decision support tool was elaborated in order to use the model as an effective tool in combination with the temperature monitoring equipment for the improvement of quality and storage management within the meat logistics network. Over the long term, this overall procedure will support the reduction of food waste and will improve the resources efficiency of food production. PMID:24797136
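A minimal sketch of the kind of primary/secondary model combination described above: a sensory quality score declines linearly in time at a rate governed by an Arrhenius temperature dependence. The rate constant, activation energy, reference temperature, and initial score below are invented placeholders, not the calibrated lamb parameters from the study.

import numpy as np

R = 8.314  # J/(mol*K), universal gas constant

def arrhenius_rate(temp_c, k_ref=0.05, Ea=80_000.0, T_ref_c=4.0):
    """Secondary model: quality-loss rate [1/h] at temp_c, referenced to
    a rate k_ref at T_ref_c. Parameter values are illustrative only."""
    T = temp_c + 273.15
    T_ref = T_ref_c + 273.15
    return k_ref * np.exp(-Ea / R * (1.0 / T - 1.0 / T_ref))

def remaining_quality(temps_c, dt_h=1.0, q0=10.0):
    """Primary model: linear decline of a sensory score along a recorded
    temperature history (one reading per dt_h hours)."""
    q = q0
    for t in temps_c:
        q -= arrhenius_rate(t) * dt_h
    return max(q, 0.0)

# Example: 48 h at 4 degC followed by 6 h of temperature abuse at 12 degC.
history = [4.0] * 48 + [12.0] * 6
print(f"remaining quality score: {remaining_quality(history):.2f}")

Feeding the recorded temperature history from a logger into such a model is the essence of the decision support idea described in the abstract.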
Modeling the Urban Boundary and Canopy Layers
Today, we are confronted with increasingly more sophisticated application requirements for urban modeling. These include those that address emergency response to acute exposures from toxic releases, health exposure assessments from adverse air quality, energy usage, and character...
The NASA Lightning Nitrogen Oxides Model (LNOM): Application to Air Quality Modeling
NASA Technical Reports Server (NTRS)
Koshak, William; Peterson, Harold; Khan, Maudood; Biazar, Arastoo; Wang, Lihua
2011-01-01
Recent improvements to the NASA Marshall Space Flight Center Lightning Nitrogen Oxides Model (LNOM) and its application to the Community Multiscale Air Quality (CMAQ) modeling system are discussed. The LNOM analyzes Lightning Mapping Array (LMA) and National Lightning Detection Network (NLDN) data to estimate the raw (i.e., unmixed and otherwise environmentally unmodified) vertical profile of lightning NOx (= NO + NO2). The latest LNOM estimates of lightning channel length distributions, lightning 1-m segment altitude distributions, and the vertical profile of lightning NOx are presented. The primary improvement to the LNOM is the inclusion of non-return stroke lightning NOx production due to: (1) hot core stepped and dart leaders, (2) stepped leader corona sheath, K-changes, continuing currents, and M-components. The impact of including LNOM estimates of lightning NOx for an August 2006 run of CMAQ is discussed.
A general health policy model: update and applications.
Kaplan, R M; Anderson, J P
1988-01-01
This article describes the development of a General Health Policy Model that can be used for program evaluation, population monitoring, clinical research, and policy analysis. An important component of the model, the Quality of Well-being scale (QWB) combines preference-weighted measures of symptoms and functioning to provide a numerical point-in-time expression of well-being, ranging from 0 for death to 1.0 for asymptomatic optimum functioning. The level of wellness at particular points in time is governed by the prognosis (transition rates or probabilities) generated by the underlying disease or injury under different treatment (control) variables. Well-years result from integrating the level of wellness, or health-related quality of life, over the life expectancy. Several issues relevant to the application of the model are discussed. It is suggested that a quality of life measure need not have separate components for social and mental health. Social health has been difficult to define; social support may be a poor criterion for resource allocation; and some evidence suggests that aspects of mental health are captured by the general measure. Although it has been suggested that measures of child health should differ from those used for adults, we argue that a separate conceptualization of child health creates new problems for policy analysis. After offering several applications of the model for the evaluation of prevention programs, we conclude that many of the advantages of general measures have been overlooked and should be given serious consideration in future studies. PMID:3384669
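As a simple numeric illustration of the well-years concept (integrating the QWB level over time), the toy calculation below uses made-up wellness values, not the published QWB preference weights.

def well_years(qwb_by_year):
    """Sum of quality-of-well-being scores (0 = death, 1 = asymptomatic
    optimum functioning) over successive years of life; a discrete
    approximation of integrating wellness over the life expectancy."""
    return sum(qwb_by_year)

# Toy example: 5 years at 0.9 followed by 3 years at 0.6 gives 6.3 well-years.
print(well_years([0.9] * 5 + [0.6] * 3))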
Capturing, Harmonizing and Delivering Data and Quality Provenance
NASA Technical Reports Server (NTRS)
Leptoukh, Gregory; Lynnes, Christopher
2011-01-01
Satellite remote sensing data have proven to be vital for various scientific and applications needs. However, the usability of these data depends not only on the data values but also on the ability of data users to assess and understand the quality of these data for various applications and for comparison or inter-usage of data from different sensors and models. In this paper, we describe some aspects of capturing, harmonizing and delivering this information to users in the framework of distributed web-based data tools.
Big Data Analytic, Big Step for Patient Management and Care in Puerto Rico.
Borrero, Ernesto E
2018-01-01
This letter provides an overview of the application of big data in the health care system to improve quality of care, including predictive modelling for risk and resource use, precision medicine and clinical decision support, quality of care and performance measurement, public health, and research applications, among others. The author delineates the tremendous potential for big data analytics and discusses how it can be successfully implemented in clinical practice as an important component of a learning health-care system.
Modelling guidelines--terminology and guiding principles
NASA Astrophysics Data System (ADS)
Refsgaard, Jens Christian; Henriksen, Hans Jørgen
2004-01-01
Some scientists argue, with reference to Popper's scientific philosophical school, that models cannot be verified or validated. Other scientists and many practitioners nevertheless use these terms, but with very different meanings. As a result of an increasing number of examples of model malpractice and mistrust to the credibility of models, several modelling guidelines are being elaborated in recent years with the aim of improving the quality of modelling studies. This gap between the views and the lack of consensus experienced in the scientific community and the strongly perceived need for commonly agreed modelling guidelines is constraining the optimal use and benefits of models. This paper proposes a framework for quality assurance guidelines, including a consistent terminology and a foundation for a methodology bridging the gap between scientific philosophy and pragmatic modelling. A distinction is made between the conceptual model, the model code and the site-specific model. A conceptual model is subject to confirmation or falsification like scientific theories. A model code may be verified within given ranges of applicability and ranges of accuracy, but it can never be universally verified. Similarly, a model may be validated, but only with reference to site-specific applications and to pre-specified performance (accuracy) criteria. Thus, a model's validity will always be limited in terms of space, time, boundary conditions and types of application. This implies a continuous interaction between manager and modeller in order to establish suitable accuracy criteria and predictions associated with uncertainty analysis.
Aggregative Learning Method and Its Application for Communication Quality Evaluation
NASA Astrophysics Data System (ADS)
Akhmetov, Dauren F.; Kotaki, Minoru
2007-12-01
In this paper, the so-called Aggregative Learning Method (ALM) is proposed to improve and simplify the learning and classification abilities of different data processing systems. It provides a universal basis for the design and analysis of mathematical models of a wide class. A procedure was elaborated for time series model reconstruction and analysis for linear and nonlinear cases. Data approximation accuracy (during the learning phase) and data classification quality (during the recall phase) are estimated from the introduced statistical parameters. The validity and efficiency of the proposed approach have been demonstrated through its application to monitoring of wireless communication quality, namely, for a Fixed Wireless Access (FWA) system. Low memory and computation resources were shown to be needed for the procedure's realization, especially for the data classification (recall) stage. Characterized by high computational efficiency and a simple decision-making procedure, the derived approaches can be useful for simple and reliable real-time surveillance and control system design.
[Quality of life in dysphonia].
Rosanowski, F; Grässel, E; Hoppe, U; Köllner, V
2009-09-01
Quality of life is multidimensional and comprises physical, emotional, and social aspects. It has always been the implicit focus of medical work. However, since the 1980s it has been possible to measure it explicitly. Quality of life is impaired in dysphonic patients; this finding is supported by specific studies on self-reported physical, emotional, and social well-being. For practical application of these data, it is recommended to measure all three domains. From a therapeutic point of view, verbal intervention following the PLISSIT model (permission, limited information, special suggestions, intensive therapy) has been proven to enhance patient satisfaction. Therefore, this clinical procedure is recommended for routine application in dysphonic patients.
Peng, Ji-yu; Song, Xing-lin; Liu, Fei; Bao, Yi-dan; He, Yong
2016-03-01
The research achievements and trends of spectral technology for fast detection of Camellia sinensis growth process information and tea quality information are reviewed. Spectral technology is a fast, nondestructive, and efficient detection technology that mainly comprises infrared spectroscopy, fluorescence spectroscopy, Raman spectroscopy, and mass spectrometry. Rapid detection of Camellia sinensis growth process information and tea quality helps realize the informatization and automation of tea production and ensures tea quality and safety. This paper reviews its applications, including the detection of tea (Camellia sinensis) growing status (nitrogen, chlorophyll, diseases, and insect pests), the discrimination of tea varieties, the grade discrimination of tea, the detection of tea internal quality (catechins, total polyphenols, caffeine, amino acids, pesticide residues, and so on), the quality evaluation of tea beverages and tea by-products, and instrumentation for tea quality determination and discrimination. The paper briefly introduces trends in technology for determining tea growth process information, sensors, and industrial applications. In conclusion, spectral technology shows high potential to detect Camellia sinensis growth process information, to predict tea internal quality, and to classify tea varieties and grades. Suitable chemometrics and preprocessing methods help improve model performance and remove redundancy, which makes the development of portable instruments possible. Future work on developing portable instruments and on-line detection systems is recommended to advance further applications. The applications and research achievements of spectral technology concerning tea, covering Camellia sinensis growth, tea production, and the quality and safety of tea and tea by-products, as well as remaining problems and future applicability in the modern tea industry, are outlined in this paper for the first time.
Robertson, Dale M.; Schladow, S.G.
2008-01-01
Salton Sea, California, like many other lakes, has become eutrophic because of excessive nutrient loading, primarily phosphorus (P). A Total Maximum Daily Load (TMDL) is being prepared for P to reduce the input of P to the Sea. In order to better understand how P-load reductions should affect the average annual water quality of this terminal saline lake, three different eutrophication programs (BATHTUB, WiLMS, and the Seepage Lake Model) were applied. After verifying that specific empirical models within these programs were applicable to this saline lake, each model was calibrated using water-quality and nutrient-loading data for 1999 and then used to simulate the effects of specific P-load reductions. Model simulations indicate that a 50% decrease in external P loading would decrease near-surface total phosphorus concentrations (TP) by 25-50%. Application of other empirical models demonstrated that this decrease in loading should decrease near-surface chlorophyll a concentrations (Chl a) by 17-63% and increase Secchi depths (SD) by 38-97%. The wide range in estimated responses in Chl a and SD were primarily caused by uncertainty in how non-algal turbidity would respond to P-load reductions. If only the models most applicable to the Salton Sea are considered, a 70-90% P-load reduction is required for the Sea to be classified as moderately eutrophic (trophic state index of 55). These models simulate steady-state conditions in the Sea; therefore, it is difficult to ascertain how long it would take for the simulated changes to occur after load reductions. © 2008 Springer Science+Business Media B.V.
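The abstract does not state which trophic state index formulation was used; the sketch below shows the commonly cited Carlson-type relations between TP, Chl a, Secchi depth, and the index, so a value near 55 can be interpreted. The coefficients are as commonly quoted for Carlson (1977) and should be verified against the original source before quantitative use.

import math

def tsi_secchi(sd_m):
    """Carlson-type TSI from Secchi depth in meters (commonly cited form)."""
    return 60.0 - 14.41 * math.log(sd_m)

def tsi_chla(chl_ug_l):
    """Carlson-type TSI from chlorophyll a in micrograms per liter."""
    return 9.81 * math.log(chl_ug_l) + 30.6

def tsi_tp(tp_ug_l):
    """Carlson-type TSI from total phosphorus in micrograms per liter."""
    return 14.42 * math.log(tp_ug_l) + 4.15

# Values chosen so each index lands near 55, the "moderately eutrophic"
# threshold mentioned in the abstract (illustrative inputs only).
print(tsi_chla(12.0), tsi_tp(34.0), tsi_secchi(1.4))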
Quality Management in Career Services, a la Deming.
ERIC Educational Resources Information Center
Korschgen, Ann J.; Rounds, Dan
1992-01-01
Describes career services program at University of Wisconsin-La Crosse that adapted material from W.E. Deming's quality control manual to the needs of its office. Discusses Deming's work and its implications for career services professionals, then describes application of Deming's model to the career services program. (NB)
School Development Applications in Turkey
ERIC Educational Resources Information Center
Hosgörür, Vural
2014-01-01
This study aims to define and explain the establishment, functioning and problems of school development management teams (SDMTs), similar to quality circles used in total quality management practices, for the purposes of continuous development and improvement of schools on the basis of the planned school development model. This is a qualitative…
A New Admission System Model for Teacher Colleges
ERIC Educational Resources Information Center
Katz, Sara; Frish, Yehiel
2016-01-01
Purpose: Aspects of intellectual competence would not be sufficient for quality teaching that requires a mix of intellectual and personal qualities. The purpose of this paper was to elicit personal attributes of teachers' college applicants. Design/methodology/approach: This qualitative case study consisted of 99 participants aged 20-24 years of…
“Changes in US Regional Air Quality at 2030 Simulated Using RCP 6.0”
Session: Global/Regional Modeling Applications Recent improvements in air quality in the United States have been due to significant reductions in emissions of ozone and particulate matter (PM) precursors, and these downward emissions trends are expected to continue in the next...
A cropland farm management modeling system for regional air quality and field-scale applications of bi-directional ammonia exchange was presented at ITM XXI. The goal of this research is to improve estimates of nitrogen deposition to terrestrial and aquatic ecosystems and ambien...
Satellite Models for Global Environmental Change in the NASA Health and Air Quality Programs
NASA Astrophysics Data System (ADS)
Haynes, J.; Estes, S. M.
2015-12-01
Satellite remote sensing of the environment offers a unique vantage point that can fill gaps in the environmental, spatial, and temporal data needed for tracking disease. Health and air quality providers and researchers are affected by the global environmental changes that are occurring, and they need environmental data to study and understand the geographic, environmental, and meteorological differences in disease. NASA maintains a diverse constellation of Earth observing research satellites and sponsors research in developing satellite data applications across a wide spectrum of areas, including environmental health; infectious disease; air quality standards, policies, and regulations; and the impact of climate change on health and air quality. Successfully providing predictions with the accuracy and specificity required by decision makers will require advancements over current capabilities in a number of interrelated areas, including observations, modeling systems, forecast development, application integration, and the research-to-operations transition process. This presentation will highlight many projects in these areas over the past twelve years in which NASA satellites have been a primary partner with local, state, federal, and international operational agencies. Domestic and international officials have increasingly recognized links between environment and health, and health providers and researchers need environmental data to study and understand the geographic, environmental, and meteorological differences in disease. The presentation is directly related to Earth observing systems and global health surveillance and will present research results from remote sensing environmental observations of Earth and health applications, which can contribute to health research. As part of NASA's approach and methodology, Earth observation systems and applications for health models have been used to bridge gaps in environmental, spatial, and temporal data for tracking disease.
NASA Astrophysics Data System (ADS)
Roth, Christian; Vorderer, Peter; Klimmt, Christoph
A conceptual account of the quality of the user experience that interactive storytelling intends to facilitate is introduced. Building on social-scientific research from 'old' entertainment media, the experiential qualities of curiosity, suspense, aesthetic pleasantness, self-enhancement, and optimal task engagement ("flow") are proposed as key elements of a theory of user experience in interactive storytelling. Perspectives for the evolution of the model, research, and application are briefly discussed.
Morbidity and Mortality Conference: Its Purpose Reclaimed and Grounded in Theory.
Gregor, Alexander; Taylor, David
2016-01-01
The morbidity and mortality conference (MMC) remains a central activity within the departments of our academic healthcare institutions. It is deeply rooted in the premise that we can learn from our mistakes, thereby improving the care we provide. Recent advances in our understanding of medical error and quality improvement have challenged the value of traditional models of MMC. As a result the purpose of MMC has become clouded and ill-defined: Is it an educational conference that promotes mastery of clinical acumen, or is it a venue to drive quality improvement by addressing systems-based issues in delivering care? Or can it serve both purposes? Review of the history of MMC, the literature, and critical application of education theory demonstrates the source of the confusion and the challenges in viewing it through the exclusive lens of either education or quality improvement. Application of experiential learning theory helps resolve this discord showing how the conference facilitates the development of clinical mastery while informing quality improvement programs about important and relevant systems-based issues. Building on this, we present a model for MMC involving five essential elements: case-based involving an adverse patient event, anonymity for participants, expert guided critical analysis, reframing understanding of the case presentation and related systems-based factors, and projection to practice change. This model builds on previously described models, is grounded in the literature, and helps clarify its role from both the educational and the quality improvement perspectives.
NASA Astrophysics Data System (ADS)
Agarwal, Smriti; Singh, Dharmendra
2016-04-01
Millimeter wave (MMW) frequency has emerged as an efficient tool for different stand-off imaging applications. In this paper, we have dealt with a novel MMW imaging application, i.e., non-invasive packaged goods quality estimation for industrial quality monitoring applications. An active MMW imaging radar operating at 60 GHz has been ingeniously designed for concealed fault estimation. Ceramic tiles covered with commonly used packaging cardboard were used as concealed targets for undercover fault classification. A comparison of computer vision-based state-of-the-art feature extraction techniques, viz, discrete Fourier transform (DFT), wavelet transform (WT), principal component analysis (PCA), gray level co-occurrence texture (GLCM), and histogram of oriented gradient (HOG) has been done with respect to their efficient and differentiable feature vector generation capability for undercover target fault classification. An extensive number of experiments were performed with different ceramic tile fault configurations, viz., vertical crack, horizontal crack, random crack, diagonal crack along with the non-faulty tiles. Further, an independent algorithm validation was done demonstrating classification accuracy: 80, 86.67, 73.33, and 93.33 % for DFT, WT, PCA, GLCM, and HOG feature-based artificial neural network (ANN) classifier models, respectively. Classification results show good capability for HOG feature extraction technique towards non-destructive quality inspection with appreciably low false alarm as compared to other techniques. Thereby, a robust and optimal image feature-based neural network classification model has been proposed for non-invasive, automatic fault monitoring for a financially and commercially competent industrial growth.
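A condensed sketch of the HOG-plus-neural-network pipeline for which the abstract reports the best results, using scikit-image and scikit-learn on synthetic images. The image size, HOG parameters, network size, and the way a "crack" is simulated are assumptions, not the settings used in the study.

import numpy as np
from skimage.feature import hog
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(1)

def fake_tile(cracked, size=64):
    """Synthetic stand-in for an MMW image of a tile; a cracked tile gets
    a bright vertical line. Real data would come from the 60 GHz radar."""
    img = rng.normal(0.5, 0.05, (size, size))
    if cracked:
        img[:, size // 2] += 1.0
    return img

X, y = [], []
for label in (0, 1):                      # 0 = intact, 1 = cracked
    for _ in range(40):
        feats = hog(fake_tile(bool(label)), orientations=9,
                    pixels_per_cell=(8, 8), cells_per_block=(2, 2))
        X.append(feats)
        y.append(label)

clf = MLPClassifier(hidden_layer_sizes=(32,), max_iter=500, random_state=0)
clf.fit(np.array(X), np.array(y))
print("training accuracy:", clf.score(np.array(X), np.array(y)))

A real evaluation would, as in the paper, hold out an independent validation set and report per-class false alarms rather than training accuracy.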
Specification and Design of a Fault Recovery Model for the Reliable Multicast Protocol
NASA Technical Reports Server (NTRS)
Montgomery, Todd; Callahan, John R.; Whetten, Brian
1996-01-01
The Reliable Multicast Protocol (RMP) provides a unique, group-based model for distributed programs that need to handle reconfiguration events at the application layer. This model, called membership views, provides an abstraction in which events such as site failures, network partitions, and normal join-leave events are viewed as group reformations. RMP provides access to this model through an application programming interface (API) that notifies an application when a group is reformed as the result of some event. RMP provides applications with reliable delivery of messages, using an underlying IP Multicast medium, to other group members in a distributed environment even in the case of reformations. A distributed application can use various Quality of Service (QoS) levels provided by RMP to tolerate group reformations. This paper explores the implementation details of the mechanisms in RMP that provide distributed applications with membership view information and fault recovery capabilities.
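To illustrate the membership-view callback pattern the abstract describes, the sketch below shows one way such an API could look. The class and method names are hypothetical; they are not the actual RMP API.

# Illustrative sketch of a membership-view callback pattern; names are
# hypothetical and do not reproduce the real RMP interface.
from dataclasses import dataclass
from typing import Callable, List

@dataclass
class MembershipView:
    view_id: int
    members: List[str]

class GroupSession:
    """Delivers messages to the group and notifies the application whenever
    the group reforms (join, leave, failure, partition heal)."""
    def __init__(self, on_view_change: Callable[[MembershipView], None]):
        self._on_view_change = on_view_change
        self._view = MembershipView(0, [])

    def _reform(self, members: List[str]) -> None:
        # In a real protocol stack this would be driven by failure detection
        # and view agreement; here it is invoked directly for illustration.
        self._view = MembershipView(self._view.view_id + 1, members)
        self._on_view_change(self._view)

session = GroupSession(lambda v: print(f"view {v.view_id}: {v.members}"))
session._reform(["nodeA", "nodeB", "nodeC"])
session._reform(["nodeA", "nodeC"])        # nodeB failed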
von Stosch, Moritz; Davy, Steven; Francois, Kjell; Galvanauskas, Vytautas; Hamelink, Jan-Martijn; Luebbert, Andreas; Mayer, Martin; Oliveira, Rui; O'Kennedy, Ronan; Rice, Paul; Glassey, Jarka
2014-06-01
This report highlights the drivers, challenges, and enablers of hybrid modeling applications in the biopharmaceutical industry. It is a summary of an expert panel discussion of European academics and industrialists with relevant scientific and engineering backgrounds. Hybrid modeling is viewed in its broader sense, namely as the integration of different knowledge sources in the form of parametric and nonparametric models into a hybrid semi-parametric model, for instance the integration of fundamental and data-driven models. A brief description of the current state of the art and industrial uptake of the methodology is provided. The report concludes with a number of recommendations to facilitate further developments and a wider industrial application of this modeling approach. These recommendations are limited to further exploiting the benefits of this methodology within process analytical technology (PAT) applications in the biopharmaceutical industry. Copyright © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
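As a generic illustration of the hybrid semi-parametric idea (not the panel's recommended structure), the sketch below combines a first-principles Monod growth model with a data-driven correction fitted to its residuals. The data, parameter values, and choice of regressor are assumptions for illustration.

import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)

# Toy bioprocess data: substrate concentration S -> specific growth rate mu.
S = rng.uniform(0.1, 10.0, 200)
mu_true = 0.6 * S / (2.0 + S) + 0.05 * np.sin(S)   # the "real" process
mu_obs = mu_true + rng.normal(0, 0.01, S.size)

# Parametric (first-principles) part: a Monod model with assumed constants.
def monod(S, mu_max=0.6, Ks=2.0):
    return mu_max * S / (Ks + S)

# Nonparametric part: learn what the mechanistic model misses.
residual_model = RandomForestRegressor(n_estimators=100, random_state=0)
residual_model.fit(S.reshape(-1, 1), mu_obs - monod(S))

def hybrid_predict(S_new):
    return monod(S_new) + residual_model.predict(S_new.reshape(-1, 1))

print(hybrid_predict(np.array([1.0, 5.0])))

The appeal of this serial structure is that the mechanistic part supplies extrapolation behavior while the data-driven part captures unmodeled effects within the calibrated range.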
Improving Water Management Decision Support Tools Using NASA Satellite and Modeling Data
NASA Astrophysics Data System (ADS)
Toll, D. L.; Arsenault, K.; Nigro, J.; Pinheiro, A.; Engman, E. T.; Triggs, J.; Cosgrove, B.; Alonge, C.; Boyle, D.; Allen, R.; Townsend, P.; Ni-Meister, W.
2006-05-01
One of twelve applications of national priority within NASA's Applied Science Program, the Water Management Program Element addresses concerns and decision making related to water availability, water forecasting, and water quality. The goal of the Water Management Program Element is to encourage water management organizations to use NASA Earth science data, model products, technology, and other capabilities in their decision support tools for problem solving. The Water Management Program Element partners with federal agencies, academia, and private firms, and may include international organizations. This paper further describes the Water Management Program with the objective of informing the applications community of the potential opportunities for using NASA science products for problem solving. We illustrate some ongoing Water Management application projects evaluating and benchmarking NASA data with partnering federal agencies and their decision support tools: 1) the Environmental Protection Agency for water quality; 2) the Bureau of Reclamation for water supply, demand, and forecasting; and 3) the NOAA National Weather Service for improved weather prediction. Examples of the types of NASA contributions to these agency decision support tools include: 1) satellite observations within models that assist in estimating water storage, i.e., snow water equivalent, soil moisture, aquifer volumes, or reservoir storage; 2) model-derived products, i.e., evapotranspiration, precipitation, runoff, groundwater recharge, and other 4-dimensional data assimilation products; 3) improved water quality assessments using improved inputs from NASA models (precipitation, evaporation) and satellite observations (e.g., temperature, turbidity, land cover) to nonpoint source models; and 4) water (i.e., precipitation) and temperature predictions from days to decades over local, regional, and global scales.
The Impact of Spatial Correlation and Incommensurability on Model Evaluation
Standard evaluations of air quality models rely heavily on a direct comparison of monitoring data matched with the model output for the grid cell containing the monitor’s location. While such techniques may be adequate for some applications, conclusions are limited by such facto...
Environmental decision-making and the influences of various stressors, such as landscape and climate changes on water quantity and quality, requires the application of environmental modeling. Spatially explicit environmental and watershed-scale models using GIS as a base framewor...
“Overview and Evaluation of AQMEII Phase 2 Coupled Simulations over North America”
This presentation provides an overview of the second phase of the Air Quality Model Evaluation International Initiative (AQMEII). Activities in this phase are focused on the application and evaluation of coupled meteorology-chemistry models to assess how well these models can simu...
Case study applications of the BASINS climate assessment tool (CAT)
This EPA report will illustrate the application of different climate assessment capabilities within EPA’s BASINS modeling system for assessing a range of potential questions about the effects of climate change on streamflow and water quality in different watershed settings and us...
A one-dimensional water quality model, the Gulf of Mexico Dissolved Oxygen Model (GoMDOM-1D), was developed to simulate phytoplankton, carbon, nutrients, and dissolved oxygen in the Gulf of Mexico. The model was calibrated and corroborated against a comprehensive set of field observation...
Michael A. Larson; Frank R., III Thompson; Joshua J. Millspaugh; William D. Dijak; Stephen R. Shifley
2004-01-01
Methods for habitat modeling based on landscape simulations and population viability modeling based on habitat quality are well developed, but no published study of which we are aware has effectively joined them in a single, comprehensive analysis. We demonstrate the application of a population viability model for ovenbirds (Seiurus aurocapillus)...
The practice of quality-associated costing: application to transfusion manufacturing processes.
Trenchard, P M; Dixon, R
1997-01-01
This article applies the new method of quality-associated costing (QAC) to the mixture of processes that create red cell and plasma products from whole blood donations. The article compares QAC with two commonly encountered but arbitrary models and illustrates the invalidity of clinical cost-benefit analysis based on these models. The first, an "isolated" cost model, seeks to allocate each whole process cost to only one product class. The other is a "shared" cost model, and it seeks to allocate an approximately equal share of all process costs to all associated products.
A new method based on fuzzy logic to evaluate the contract service provider performance.
Miguel, C A; Barr, C; Moreno, M J L
2008-01-01
This paper puts forward a fuzzy inference system for evaluating the service quality performance of service contract providers. An application service provider (ASP) model for computerized maintenance management was used in establishing common performance indicators of the quality of service. This model was implemented in 10 separate hospitals. As a result, inference produced a service cost/acquisition cost (SC/AC) ratio reduction from 16.14% to 6.09%, an increase of 20.9% in availability, with a maintained repair quality (NRR) in the period of December 2001 to January 2003.
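To give a flavor of how a fuzzy inference system can turn indicators such as the SC/AC ratio and availability into a single performance score, the sketch below uses hand-rolled triangular membership functions and two rules. The membership breakpoints, rules, and defuzzification are invented illustrations, not the rule base used in the hospital study.

import numpy as np

def tri(x, a, b, c):
    """Triangular membership function rising from a to b, falling to c."""
    return np.maximum(np.minimum((x - a) / (b - a), (c - x) / (c - b)), 0.0)

def provider_score(sc_ac_percent, availability_percent):
    """Toy Mamdani-style inference with weighted-average defuzzification."""
    low_cost   = tri(sc_ac_percent, 0.0, 5.0, 12.0)
    high_cost  = tri(sc_ac_percent, 8.0, 16.0, 30.0)
    high_avail = tri(availability_percent, 85.0, 100.0, 115.0)
    # Rule 1: low cost AND high availability -> good performance (1.0)
    # Rule 2: high cost -> poor performance (0.0)
    w_good = min(low_cost, high_avail)
    w_poor = high_cost
    return (w_good * 1.0 + w_poor * 0.0) / max(w_good + w_poor, 1e-9)

# Using the figures reported in the abstract as example inputs.
print(f"performance score: {provider_score(6.09, 98.0):.2f}")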
NASA Astrophysics Data System (ADS)
Wang, Zu-liang; Zhang, Ting; Xie, Shi-yang
2017-01-01
In order to improve agricultural tracing efficiency and reduce tracking and monitoring cost, agricultural product quality tracking and tracing based on Radio-Frequency Identification (RFID) technology is studied, and a tracing and tracking model is set up. A three-layer structure model is established to realize high-quality traceability and tracking of agricultural products. To solve the collision problems between multiple RFID tags and improve identification efficiency, a new reservation slot allocation mechanism is proposed. We then analyze and optimize the parameters by numerical simulation.
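For background on the tag-collision problem this mechanism addresses, the sketch below simulates a generic framed slotted ALOHA inventory round, in which only slots chosen by exactly one tag yield a successful read. It is a standard baseline for comparison, not the reservation mechanism proposed in the paper.

import numpy as np

def framed_slotted_aloha(n_tags, frame_size, rng):
    """One inventory frame: each tag picks a slot uniformly at random;
    slots holding exactly one tag are successful reads."""
    slots = rng.integers(0, frame_size, size=n_tags)
    counts = np.bincount(slots, minlength=frame_size)
    return int(np.sum(counts == 1))

rng = np.random.default_rng(0)
reads = [framed_slotted_aloha(64, 64, rng) for _ in range(1000)]
print("mean tags identified per frame:", np.mean(reads))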
Remote measurements of water pollution with a lidar polarimeter
NASA Technical Reports Server (NTRS)
Sheives, T. C.; Rouse, J. W., Jr.; Mayo, W. T., Jr.
1974-01-01
This paper examines a dual polarization laser backscatter system as a method for remote measurements of certain water quality parameters. Analytical models for describing the backscatter from turbid water and oil on turbid water are presented and compared with experimental data. Laser backscatter field measurements from natural waterways are presented and compared with simultaneous ground observations of the water quality parameters: turbidity, suspended solids, and transmittance. The results of this study show that the analytical models appear valid and that the sensor investigated is applicable to remote measurements of these water quality parameters and oil spills on water.
MO-C-18A-01: Advances in Model-Based 3D Image Reconstruction
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chen, G; Pan, X; Stayman, J
2014-06-15
Recent years have seen the emergence of CT image reconstruction techniques that exploit physical models of the imaging system, photon statistics, and even the patient to achieve improved 3D image quality and/or reduction of radiation dose. With numerous advantages in comparison to conventional 3D filtered backprojection, such techniques bring a variety of challenges as well, including: a demanding computational load associated with sophisticated forward models and iterative optimization methods; nonlinearity and nonstationarity in image quality characteristics; a complex dependency on multiple free parameters; and the need to understand how best to incorporate prior information (including patient-specific prior images) within the reconstruction process. The advantages, however, are even greater – for example: improved image quality; reduced dose; robustness to noise and artifacts; task-specific reconstruction protocols; suitability to novel CT imaging platforms and noncircular orbits; and incorporation of known characteristics of the imager and patient that are conventionally discarded. This symposium features experts in 3D image reconstruction, image quality assessment, and the translation of such methods to emerging clinical applications. Dr. Chen will address novel methods for the incorporation of prior information in 3D and 4D CT reconstruction techniques. Dr. Pan will show recent advances in optimization-based reconstruction that enable potential reduction of dose and sampling requirements. Dr. Stayman will describe a “task-based imaging” approach that leverages models of the imaging system and patient in combination with a specification of the imaging task to optimize both the acquisition and reconstruction process. Dr. Samei will describe the development of methods for image quality assessment in such nonlinear reconstruction techniques and the use of these methods to characterize and optimize image quality and dose in a spectrum of clinical applications. Learning Objectives: Learn the general methodologies associated with model-based 3D image reconstruction. Learn the potential advantages in image quality and dose associated with model-based image reconstruction. Learn the challenges associated with computational load and image quality assessment for such reconstruction methods. Learn how imaging task can be incorporated as a means to drive optimal image acquisition and reconstruction techniques. Learn how model-based reconstruction methods can incorporate prior information to improve image quality, ease sampling requirements, and reduce dose.
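To convey the basic shape of model-based iterative reconstruction at a toy scale, the sketch below minimizes a penalized least-squares objective by gradient descent with a random matrix standing in for the CT forward projector. It is a generic illustration, not any of the speakers' algorithms; the problem size, penalty, and step rule are assumptions.

import numpy as np

rng = np.random.default_rng(0)

# Toy linear forward model y = A x + noise; real model-based reconstruction
# uses a physics-based projector rather than a random matrix.
n_pix, n_meas = 64, 96
A = rng.normal(size=(n_meas, n_pix))
x_true = np.zeros(n_pix)
x_true[20:30] = 1.0                      # a simple "object"
y = A @ x_true + rng.normal(scale=0.1, size=n_meas)

def reconstruct(A, y, beta=0.5, n_iter=500, step=None):
    """Gradient descent on 0.5*||A x - y||^2 + 0.5*beta*||x||^2; the
    quadratic penalty stands in for more elaborate priors."""
    if step is None:
        step = 1.0 / np.linalg.norm(A, 2) ** 2   # conservative step size
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        grad = A.T @ (A @ x - y) + beta * x
        x -= step * grad
    return x

x_hat = reconstruct(A, y)
print("reconstruction error:", np.linalg.norm(x_hat - x_true))

Swapping the quadratic penalty for an edge-preserving or prior-image term, and the dense matrix for a projector, is where the methods discussed in the symposium depart from this toy.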
Adopting software quality measures for healthcare processes.
Yildiz, Ozkan; Demirörs, Onur
2009-01-01
In this study, we investigated the adoptability of software quality measures for healthcare process measurement. Quality measures of ISO/IEC 9126 are redefined from a process perspective to build a generic healthcare process quality measurement model. A case study research method is used, and the model is applied to a public hospital's Entry to Care process. After the application, weak and strong aspects of the process can be easily observed. Access auditability, fault removal, completeness of documentation, and machine utilization are weak aspects, and these aspects are candidates for process improvement. On the other hand, functional completeness, fault ratio, input validity checking, response time, and throughput time are the strong aspects of the process.
Application of Wavelet Filters in an Evaluation of ...
Air quality model evaluation can be enhanced with time-scale specific comparisons of outputs and observations. For example, high-frequency (hours to one day) time scale information in observed ozone is not well captured by deterministic models, and its incorporation into model performance metrics leads one to devote resources to stochastic variations in model outputs. In this analysis, observations are compared with model outputs at seasonal, weekly, diurnal and intra-day time scales. Filters provide frequency specific information that can be used to compare the strength (amplitude) and timing (phase) of observations and model estimates. The National Exposure Research Laboratory's (NERL's) Atmospheric Modeling and Analysis Division (AMAD) conducts research in support of EPA's mission to protect human health and the environment. AMAD's research program is engaged in developing and evaluating predictive atmospheric models on all spatial and temporal scales for forecasting the Nation's air quality and for assessing changes in air quality and air pollutant exposures, as affected by changes in ecosystem management and regulatory decisions. AMAD is responsible for providing a sound scientific and technical basis for regulatory policies based on air quality models to improve ambient air quality. The models developed by AMAD are being used by EPA, NOAA, and the air pollution community in understanding and forecasting not only the magnitude of the air pollu
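As an illustration of the time-scale-specific comparison described above, the following Python sketch decomposes synthetic observed and modeled ozone series into wavelet bands and compares amplitude and timing per band. It is a minimal sketch, not the AMAD analysis itself: it assumes the PyWavelets package, and the series, wavelet choice, and decomposition level are illustrative.

import numpy as np
import pywt

def band_components(series, wavelet="db4", level=5):
    # return per-level reconstructions of a 1-D series (coarse approximation first)
    coeffs = pywt.wavedec(series, wavelet, level=level)
    bands = []
    for i in range(len(coeffs)):
        keep = [c if j == i else np.zeros_like(c) for j, c in enumerate(coeffs)]
        bands.append(pywt.waverec(keep, wavelet)[: len(series)])
    return bands  # bands[0] ~ seasonal baseline, last band ~ intra-day variation

rng = np.random.default_rng(0)
t = np.arange(24 * 90)                         # 90 days of hourly values
obs = 40 + 15 * np.sin(2 * np.pi * t / 24) + rng.normal(0, 5, t.size)
mod = 38 + 12 * np.sin(2 * np.pi * (t - 2) / 24) + rng.normal(0, 3, t.size)

for k, (bo, bm) in enumerate(zip(band_components(obs), band_components(mod))):
    amp_ratio = bm.std() / bo.std()            # strength (amplitude) comparison
    r = np.corrcoef(bo, bm)[0, 1]              # timing (phase) comparison
    print(f"band {k}: amplitude ratio {amp_ratio:.2f}, correlation {r:.2f}")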
THz identification and Bayes modeling
NASA Astrophysics Data System (ADS)
Sokolnikov, Andre
2017-05-01
THz identification is a developing technology. Sensing in the THz range offers opportunities for short-range radar because THz waves penetrate an obscured atmosphere, such as fog, better than visible light. The lower scattering of THz radiation compared with visible light also yields significantly better imaging than in the IR spectrum. Much higher contrast can be achieved in medical trans-illumination applications than with X-rays or visible light. The same qualities of THz radiation produce better tomographic images of hard surfaces, e.g. ceramics; this effect derives from timing the detection of reflected THz pulses. For specialized and commercial applications alike, industrial quality control of defects is facilitated at lower cost. The effectiveness of THz wave measurements is increased with computational methods, one of which is Bayes modeling. Examples of this kind of mathematical modeling are considered.
A methodology model for quality management in a general hospital.
Stern, Z; Naveh, E
1997-01-01
A reappraisal is made of the relevance of industrial modes of quality management to the issues of medical care. Analysis of the nature of medical care, which differentiates it from the supplier-client relationships of industry, presents the main intrinsic characteristics, which create problems in application of the industrial quality management approaches to medical care. Several examples are the complexity of the relationship between the medical action and the result obtained, the client's nonacceptance of economic profitability as a value in his medical care, and customer satisfaction biased by variable standards of knowledge. The real problems unique to hospitals are addressed, and a methodology model for their quality management is offered. Included is a sample of indicator vectors, measurements of quality care, cost of medical care, quality of service, and human resources. These are based on the trilogy of planning quality, quality control, and improving quality. The conclusions confirm the inadequacy of industrial quality management approaches for medical institutions and recommend investment in formulation of appropriate concepts.
Fuzzy intelligent quality monitoring model for X-ray image processing.
Khalatbari, Azadeh; Jenab, Kouroush
2009-01-01
Today's imaging diagnosis needs to adapt modern techniques of quality engineering to maintain and improve its accuracy and reliability in the health care system. One of the main factors that influences the diagnostic accuracy of plain-film X-ray in detecting pathology is the level of film exposure. If the level of film exposure is not adequate, a normal body structure may be interpreted as pathology and vice versa. This not only influences patient management but also has an impact on health care cost and the patient's quality of life. Therefore, providing an accurate and high quality image is the first step toward excellent patient management in any health care system. In this paper, we study these techniques and also present a fuzzy intelligent quality monitoring model, which can be used to keep variables from degrading the image quality. The variables derived from chemical activity, cleaning procedures, maintenance, and monitoring may not be sensed, measured, or calculated precisely due to uncertain situations. Therefore, the gamma-level fuzzy Bayesian model for quality monitoring of image processing is proposed. In order to apply the Bayesian concept, the fuzzy quality characteristics are assumed as fuzzy random variables. Using the fuzzy quality characteristics, the newly developed model calculates the degradation risk for image processing. A numerical example is also presented to demonstrate the application of the model.
Objective Video Quality Assessment Based on Machine Learning for Underwater Scientific Applications
Moreno-Roldán, José-Miguel; Luque-Nieto, Miguel-Ángel; Poncela, Javier; Otero, Pablo
2017-01-01
Video services are meant to be a fundamental tool in the development of oceanic research. The current technology for underwater networks (UWNs) imposes strong constraints in the transmission capacity since only a severely limited bitrate is available. However, previous studies have shown that the quality of experience (QoE) is enough for ocean scientists to consider the service useful, although the perceived quality can change significantly for small ranges of variation of video parameters. In this context, objective video quality assessment (VQA) methods become essential in network planning and real time quality adaptation fields. This paper presents two specialized models for objective VQA, designed to match the special requirements of UWNs. The models are built upon machine learning techniques and trained with actual user data gathered from subjective tests. Our performance analysis shows how both of them can successfully estimate quality as a mean opinion score (MOS) value and, for the second model, even compute a distribution function for user scores. PMID:28333123
Cho, Jae Heon; Ha, Sung Ryong
2010-03-15
An influence coefficient algorithm and a genetic algorithm (GA) were introduced to develop an automatic calibration model for QUAL2K, the latest version of the QUAL2E river and stream water-quality model. The influence coefficient algorithm was used for the parameter optimization in unsteady state, open channel flow. The GA, used in solving the optimization problem, is very simple and comprehensible yet still applicable to any complicated mathematical problem, where it can find the global-optimum solution quickly and effectively. The previously established model QUAL2Kw was used for the automatic calibration of the QUAL2K. The parameter-optimization method using the influence coefficient and genetic algorithm (POMIG) developed in this study and QUAL2Kw were each applied to the Gangneung Namdaecheon River, which has multiple reaches, and the results of the two models were compared. In the modeling, the river reach was divided into two parts based on considerations of the water quality and hydraulic characteristics. The calibration results by POMIG showed a good correspondence between the calculated and observed values for most of the water-quality variables. In the application of POMIG and QUAL2Kw, relatively large errors were generated between the observed and predicted values in the case of the dissolved oxygen (DO) and chlorophyll-a (Chl-a) in the lowest part of the river; therefore, two weighting factors (1 and 5) were applied for DO and Chl-a in the lower river. The sums of the errors for DO and Chl-a with a weighting factor of 5 were slightly lower compared with the application of a factor of 1. However, with a weighting factor of 5, the sums of errors for other water-quality variables were slightly increased in comparison to the case with a factor of 1. Generally, the results of the POMIG were slightly better than those of the QUAL2Kw.
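To make the calibration idea concrete, the following Python sketch uses a small real-coded genetic algorithm to recover a single decay rate for a toy first-order stream model; it stands in for, and is much simpler than, the QUAL2K/QUAL2Kw calibration described above. The station travel times, concentrations, and GA settings are illustrative assumptions.

import numpy as np

rng = np.random.default_rng(1)
travel_time = np.array([0.2, 0.5, 1.0, 1.5, 2.0])          # days of travel to each station
c0, k_true = 12.0, 0.45                                     # mg/L and 1/day (toy values)
observed = c0 * np.exp(-k_true * travel_time) + rng.normal(0, 0.2, 5)

def rmse(k):
    return np.sqrt(np.mean((c0 * np.exp(-k * travel_time) - observed) ** 2))

pop = rng.uniform(0.01, 2.0, size=40)                       # candidate decay rates
for generation in range(60):
    fitness = np.array([rmse(k) for k in pop])
    elite = pop[np.argmin(fitness)]
    # tournament selection: of two random candidates, the lower RMSE wins
    parents = np.array([pop[min(rng.integers(0, 40, 2), key=lambda i: fitness[i])]
                        for _ in range(40)])
    alpha = rng.uniform(0, 1, 40)                           # blend crossover
    children = alpha * parents + (1 - alpha) * rng.permutation(parents)
    children += rng.normal(0, 0.02, 40)                     # Gaussian mutation
    children = np.clip(children, 0.01, 2.0)
    children[0] = elite                                     # elitism
    pop = children

best = pop[np.argmin([rmse(k) for k in pop])]
print(f"calibrated k = {best:.3f} 1/day (true value {k_true})")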
Managing the Quality of Experience in the Multimedia Internet of Things: A Layered-Based Approach †
Floris, Alessandro; Atzori, Luigi
2016-01-01
This paper addresses the issue of evaluating the Quality of Experience (QoE) for Internet of Things (IoT) applications, with particular attention to the case where multimedia content is involved. A layered IoT architecture is firstly analyzed to understand which QoE influence factors have to be considered in relevant application scenarios. We then introduce the concept of Multimedia IoT (MIoT) and define a layered QoE model aimed at evaluating and combining the contributions of each influence factor to estimate the overall QoE in MIoT applications. Finally, we present a use case related to the remote monitoring of vehicles during driving practices, which is used to validate the proposed layered model, and we discuss a second use case for smart surveillance, to emphasize the generality of the proposed framework. The effectiveness in evaluating classes of influence factors separately is demonstrated. PMID:27918437
A Hemispheric Version of the Community Multiscale Air Quality (CMAQ) Modeling System
This invited presentation will be given at the 4th Biannual Western Modeling Workshop in the Plenary session on Global model development, evaluation, and new source attribution tools. We describe the development and application of the hemispheric version of the CMAQ to examine th...
USER MANUAL FOR THE EPA THIRD-GENERATION AIR QUALITY MODELING SYSTEM (MODELS-3 VERSION 3.0)
Models-3 is a flexible software system designed to simplify the development and use of environmental assessment and other decision support tools. It is designed for applications ranging from regulatory and policy analysis to understanding the complex interactions of atmospheri...
Consideration of drainage ditches and sediment rating curve on SWAT model performance
USDA-ARS?s Scientific Manuscript database
Water quality models most often require a considerable amount of data to be properly configured and in some cases this requires additional procedural steps prior to model applications. We examined two different scenarios of such input issues in a small watershed using the Soil and Water Assessment ...
ERD’s Supercomputer for Model Uncertainty and Sensitivity Evaluation (SuperMUSE) is a key to enhancing quality assurance in environmental models and applications. Uncertainty analysis and sensitivity analysis remain critical, though often overlooked steps in the development and e...
Parameterization guidelines and considerations for hydrologic models
USDA-ARS?s Scientific Manuscript database
Imparting knowledge of the physical processes of a system to a model and determining a set of parameter values for a hydrologic or water quality model application (i.e., parameterization) is an important and difficult task. An exponential increase in literature has been devoted to the use and develo...
DOT National Transportation Integrated Search
2000-06-19
The Environmental Protection Agency (EPA) currently recommends the use of CALINE3 or CAL3QHC for modeling the dispersion of carbon monoxide (CO) near roadways. These models treat vehicles as part of a line source such that the emissions are homogeneo...
USDA-ARS?s Scientific Manuscript database
In recent years, large-scale watershed modeling has been implemented broadly in the field of water resources planning and management. Complex hydrological, sediment, and nutrient processes can be simulated by sophisticated watershed simulation models for important issues such as water resources all...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Segal, H.M.
1988-08-01
This is one of three reports describing the Emissions and Dispersion Modeling System (EDMS). All reports use the same main title--A MICROCOMPUTER MODEL FOR CIVILIAN AIRPORTS AND AIR FORCE BASES--but different subtitles. The subtitles are: (1) USER'S GUIDE - ISSUE 2 (FAA-EE-88-3/ESL-TR-88-54); (2) MODEL DESCRIPTION (FAA-EE-88-4/ESL-TR-88-53); (3) MODEL APPLICATION AND BACKGROUND (FAA-EE-88-5/ESL-TR-88-55). The first and second reports above describe the EDMS model and provide instructions for its use. This is the third report. It consists of an accumulation of five key documents describing the development and use of the EDMS model. This report is prepared in accordance with discussions with the EPA and requirements outlined in the March 27, 1980 Federal Register for submitting air-quality models to the EPA. Contents: Model Development and Use - Its Chronology and Reports; Monitoring Concorde Emissions; The Influence of Aircraft Operations on Air Quality at Airports; Simplex A - A Simplified Atmospheric Dispersion Model for Airport Use (User's Guide); Microcomputer Graphics in Atmospheric Dispersion Modeling; Pollution from Motor Vehicles and Aircraft at Stapleton International Airport (Abbreviated Report).
Simulation Modelling in Healthcare: An Umbrella Review of Systematic Literature Reviews.
Salleh, Syed; Thokala, Praveen; Brennan, Alan; Hughes, Ruby; Booth, Andrew
2017-09-01
Numerous studies examine simulation modelling in healthcare. These studies present a bewildering array of simulation techniques and applications, making it challenging to characterise the literature. The aim of this paper is to provide an overview of the level of activity of simulation modelling in healthcare and the key themes. We performed an umbrella review of systematic literature reviews of simulation modelling in healthcare. Searches were conducted of academic databases (JSTOR, Scopus, PubMed, IEEE, SAGE, ACM, Wiley Online Library, ScienceDirect) and grey literature sources, enhanced by citation searches. The articles were included if they performed a systematic review of simulation modelling techniques in healthcare. After quality assessment of all included articles, data were extracted on numbers of studies included in each review, types of applications, techniques used for simulation modelling, data sources and simulation software. The search strategy yielded a total of 117 potential articles. Following sifting, 37 heterogeneous reviews were included. Most reviews achieved moderate quality rating on a modified AMSTAR (A Measurement Tool used to Assess systematic Reviews) checklist. All the review articles described the types of applications used for simulation modelling; 15 reviews described techniques used for simulation modelling; three reviews described data sources used for simulation modelling; and six reviews described software used for simulation modelling. The remaining reviews either did not report or did not provide enough detail for the data to be extracted. Simulation modelling techniques have been used for a wide range of applications in healthcare, with a variety of software tools and data sources. The number of reviews published in recent years suggest an increased interest in simulation modelling in healthcare.
Burn injury models of care: A review of quality and cultural safety for care of Indigenous children.
Fraser, Sarah; Grant, Julian; Mackean, Tamara; Hunter, Kate; Holland, Andrew J A; Clapham, Kathleen; Teague, Warwick J; Ivers, Rebecca Q
2018-05-01
Safety and quality in the systematic management of burn care are important to ensure optimal outcomes. It is not clear if or how burn injury models of care uphold these qualities, or if they provide a space for culturally safe healthcare for Indigenous peoples, especially for children. This review is a critique of publicly available models of care, analysing their ability to facilitate safe, high-quality burn care for Indigenous children. Models of care were identified and mapped against cultural safety principles in healthcare, and against the National Health and Medical Research Council standard for clinical practice guidelines. An initial search and appraisal of tools was conducted to assess suitability of the tools in providing a mechanism to address quality and cultural safety. From the 53 documents found, 6 were eligible for review. Aspects of cultural safety were addressed in the models, but not explicitly, and were recorded very differently across all models. There was also limited or no cultural consultation documented in the models of care reviewed. Quality in the documents against National Health and Medical Research Council guidelines was evident; however, description or application of quality measures was inconsistent and incomplete. Gaps concerning safety and quality in the documented care pathways for Indigenous peoples who sustain a burn injury and require burn care highlight the need for investigation and reform of current practices. Copyright © 2017 Elsevier Ltd and ISBI. All rights reserved.
NASA Technical Reports Server (NTRS)
Hess, R. A.
1977-01-01
A brief review of some of the more pertinent applications of analytical pilot models to the prediction of aircraft handling qualities is undertaken. The relative ease with which multiloop piloting tasks can be modeled via the optimal control formulation makes the use of optimal pilot models particularly attractive for handling qualities research. To this end, a rating hypothesis is introduced which relates the numerical pilot opinion rating assigned to a particular vehicle and task to the numerical value of the index of performance resulting from an optimal pilot modeling procedure as applied to that vehicle and task. This hypothesis is tested using data from piloted simulations and is shown to be reasonable. An example concerning a helicopter landing approach is introduced to outline the predictive capability of the rating hypothesis in multiaxis piloting tasks.
Using climate models to estimate the quality of global observational data sets.
Massonnet, François; Bellprat, Omar; Guemas, Virginie; Doblas-Reyes, Francisco J
2016-10-28
Observational estimates of the climate system are essential to monitoring and understanding ongoing climate change and to assessing the quality of climate models used to produce near- and long-term climate information. This study poses the dual and unconventional question: Can climate models be used to assess the quality of observational references? We show that this question not only rests on solid theoretical grounds but also offers insightful applications in practice. By comparing four observational products of sea surface temperature with a large multimodel climate forecast ensemble, we find compelling evidence that models systematically score better against the most recent, advanced, but also most independent product. These results call for generalized procedures of model-observation comparison and provide guidance for a more objective observational data set selection. Copyright © 2016, American Association for the Advancement of Science.
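The core idea, scoring each observational product against a forecast ensemble, can be sketched in a few lines of Python. The synthetic sea surface temperatures, product names, and error levels below are illustrative assumptions, not the study's data.

import numpy as np

rng = np.random.default_rng(2)
truth = 18 + np.sin(np.linspace(0, 6.28, 120))              # unknown "true" SST series
ensemble = truth + rng.normal(0, 0.4, size=(25, 120))       # 25 forecast members

products = {
    "product_A (recent, independent)": truth + rng.normal(0, 0.15, 120),
    "product_B": truth + rng.normal(0, 0.35, 120) + 0.2,    # small warm bias
    "product_C": truth + rng.normal(0, 0.50, 120),
}

for name, obs in products.items():
    # mean RMSE of the ensemble members against this observational reference
    score = np.sqrt(np.mean((ensemble - obs) ** 2, axis=1)).mean()
    print(f"{name}: mean RMSE vs ensemble = {score:.3f}")
# the product against which the models score best is, on average, closest to the truth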
Guidelines for Calibration and Application of Storm.
1977-12-01
combination method uses the SCS method on pervious areas and the coefficient method on impervious areas of the watershed. Storm water quality is computed...stations, it should be accomplished according to procedures outlined in Reference 7. Adequate storm water quality data are the most difficult and costly...mass discharge of pollutants is negligible. The state-of-the-art in urban storm water quality modeling precludes highly accurate simulation of
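The combination method mentioned above can be illustrated with a short Python sketch: SCS curve-number runoff on the pervious fraction plus a runoff coefficient on the impervious fraction, area-weighted into a composite depth. The curve number, coefficient, and storm depth are illustrative values, not STORM defaults.

def scs_runoff(p_in, cn):
    # SCS runoff depth (inches) for rainfall p_in (inches) and curve number cn
    s = 1000.0 / cn - 10.0
    ia = 0.2 * s                        # initial abstraction
    return 0.0 if p_in <= ia else (p_in - ia) ** 2 / (p_in - ia + s)

def composite_runoff(p_in, imp_fraction, cn_pervious=74, c_impervious=0.9):
    q_perv = scs_runoff(p_in, cn_pervious)
    q_imp = c_impervious * p_in         # coefficient method on impervious area
    return imp_fraction * q_imp + (1 - imp_fraction) * q_perv

print(f"runoff for a 2 in storm, 35% impervious: {composite_runoff(2.0, 0.35):.2f} in")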
Computational Fluid Dynamics (CFD) techniques are increasingly being applied to air quality modeling of short-range dispersion, especially the flow and dispersion around buildings and other geometrically complex structures. The proper application and accuracy of such CFD techniqu...
A Systematic Process for Developing High Quality SaaS Cloud Services
NASA Astrophysics Data System (ADS)
La, Hyun Jung; Kim, Soo Dong
Software-as-a-Service (SaaS) is a type of cloud service which provides software functionality through Internet. Its benefits are well received in academia and industry. To fully utilize the benefits, there should be effective methodologies to support the development of SaaS services which provide high reusability and applicability. Conventional approaches such as object-oriented methods do not effectively support SaaS-specific engineering activities such as modeling common features, variability, and designing quality services. In this paper, we present a systematic process for developing high quality SaaS and highlight the essentiality of commonality and variability (C&V) modeling to maximize the reusability. We first define criteria for designing the process model and provide a theoretical foundation for SaaS; its meta-model and C&V model. We clarify the notion of commonality and variability in SaaS, and propose a SaaS development process which is accompanied with engineering instructions. Using the proposed process, SaaS services with high quality can be effectively developed.
NASA Astrophysics Data System (ADS)
Hong, E.; Park, Y.; Muirhead, R.; Jeong, J.; Pachepsky, Y. A.
2017-12-01
Pathogenic microorganisms in recreational and irrigation waters remain the subject of concern. Water quality models are used to estimate microbial quality of water sources, to evaluate microbial contamination-related risks, to guide the microbial water quality monitoring, and to evaluate the effect of agricultural management on the microbial water quality. The Agricultural Policy/Environmental eXtender (APEX) is the watershed-scale water quality model that includes highly detailed representation of agricultural management. The APEX currently does not have microbial fate and transport simulation capabilities. The objective of this work was to develop the first APEX microbial fate and transport module that could use the APEX conceptual model of manure removal together with recently introduced conceptualizations of the in-stream microbial fate and transport. The module utilizes manure erosion rates found in the APEX. Bacteria survival in soil-manure mixing layer was simulated with the two-stage survival model. Individual survival patterns were simulated for each manure application date. Simulated in-stream microbial fate and transport processes included the reach-scale passive release of bacteria with resuspended bottom sediment during high flow events, the transport of bacteria from bottom sediment due to the hyporheic exchange during low flow periods, the deposition with settling sediment, and the two-stage survival. Default parameter values were available from recently published databases. The APEX model with the newly developed microbial fate and transport module was applied to simulate seven years of monitoring data for the Toenepi watershed in New Zealand. Based on calibration and testing results, the APEX with the microbe module reproduced well the monitored pattern of E. coli concentrations at the watershed outlet. The APEX with the microbial fate and transport module will be utilized for predicting microbial quality of water under various agricultural practices, evaluating monitoring protocols, and supporting the selection of management practices based on regulations that rely on fecal indicator bacteria concentrations.
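The two-stage survival idea can be illustrated with a short Python sketch: a fast-declining subpopulation and a more persistent one, each decaying exponentially. The rate constants and split fraction are illustrative assumptions, not the module's calibrated values.

import numpy as np

def two_stage_survival(c0, t_days, fast_fraction=0.8, k_fast=1.2, k_slow=0.15):
    # E. coli concentration after t_days, starting from c0 (e.g., CFU per gram of soil-manure mix)
    return c0 * (fast_fraction * np.exp(-k_fast * t_days)
                 + (1 - fast_fraction) * np.exp(-k_slow * t_days))

t = np.array([0, 1, 3, 7, 14, 30], dtype=float)
for day, c in zip(t, two_stage_survival(1e6, t)):
    print(f"day {day:4.0f}: {c:12.0f} CFU/g")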
Highlights from AQMEII Phase 2 & Next Steps
We present highlights of the results obtained in the second phase of the Air Quality Model Evaluation International Initiative (AQMEII) that was completed in May 2014. Activities in this phase were focused on the application and evaluation of coupled meteorology-chemistry models ...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sondrup, Andrus Jeffrey
The Department of Energy Idaho Operations Office (DOE-ID) is applying for a synthetic minor, Sitewide, air quality permit to construct (PTC) with a facility emission cap (FEC) component from the Idaho Department of Environmental Quality (DEQ) for Idaho National Laboratory (INL) to limit its potential to emit to less than major facility limits for criteria air pollutants (CAPs) and hazardous air pollutants (HAPs) regulated under the Clean Air Act. This document is supplied as an appendix to the application, Idaho National Laboratory Application for a Synthetic Minor Sitewide Air Quality Permit to Construct with a Facility Emissions Cap Component, hereafter referred to as “permit application” (DOE-ID 2015). Air dispersion modeling was performed as part of the permit application process to demonstrate pollutant emissions from the INL will not cause a violation of any ambient air quality standards. This report documents the modeling methodology and results for the air dispersion impact analysis. All CAPs regulated under Section 109 of the Clean Air Act were modeled with the exception of lead (Pb) and ozone, which are not required to be modeled by DEQ. Modeling was not performed for toxic air pollutants (TAPs) as uncontrolled emissions did not exceed screening emission levels for carcinogenic and non-carcinogenic TAPs. Modeling for CAPs was performed with the EPA approved AERMOD dispersion modeling system (Version 14134) (EPA 2004a) and five years (2000-2004) of meteorological data. The meteorological data set was produced with the companion AERMET model (Version 14134) (EPA 2004b) using surface data from the Idaho Falls airport, and upper-air data from Boise International Airport supplied by DEQ. Onsite meteorological data from the Grid 3 Mesonet tower located near the center of the INL (north of INTEC) and supplied by the local National Oceanic and Atmospheric Administration (NOAA) office was used for surface wind directions and wind speeds. Surface data (i.e., land use data that defines roughness, albedo, Bowen ratio, and other parameters) were processed using the AERSURFACE utility (Version 13016) (EPA 2013). Emission sources were modeled as point sources using actual stack locations and dimensions. Emissions, flow rates and exit temperatures were based on the design operating capacity of each source. All structures close enough to produce an area of wake effect were included for all sources. For multi-tiered structures, the heights of the tiers were included or the entire building height was assumed to be equal to the height of the tallest tier. Concentrations were calculated at 1,352 receptor locations provided by DEQ. All receptors were considered for each pollutant and averaging period. Maximum modeled CAP concentrations summed with average background concentration values were presented and compared to National Ambient Air Quality Standards (NAAQS). The background concentration values used were obtained using the Washington State University’s Laboratory for Atmospheric Research North West Airquest web-based retrieval tool (http://lar.wsu.edu/nw airquest/lookup.html). The air dispersion modeling results show the maximum impacts for CAPs are less than applicable standards and demonstrate the INL will not cause a violation of any ambient air quality standards.
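AERMOD itself treats boundary-layer structure, terrain, and building wake effects in far more detail, but the kind of point-source calculation a dispersion model performs can be illustrated with a textbook Gaussian plume estimate in Python. The emission rate, stack height, wind speed, and Briggs-style dispersion curves below are illustrative assumptions, not values from the INL analysis.

import numpy as np

def gaussian_plume(q_gs, u_ms, h_m, x_m, y_m, z_m=0.0):
    # concentration (g/m^3) downwind of a point source, roughly neutral conditions
    sigma_y = 0.08 * x_m * (1 + 0.0001 * x_m) ** -0.5       # Briggs rural class D curves
    sigma_z = 0.06 * x_m * (1 + 0.0015 * x_m) ** -0.5
    lateral = np.exp(-y_m ** 2 / (2 * sigma_y ** 2))
    vertical = (np.exp(-(z_m - h_m) ** 2 / (2 * sigma_z ** 2))
                + np.exp(-(z_m + h_m) ** 2 / (2 * sigma_z ** 2)))
    return q_gs / (2 * np.pi * u_ms * sigma_y * sigma_z) * lateral * vertical

# 50 g/s from a 30 m stack in a 4 m/s wind, ground-level receptor 800 m downwind on the centerline
print(f"{gaussian_plume(50.0, 4.0, 30.0, 800.0, 0.0) * 1e6:.0f} ug/m3")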
USDA-ARS?s Scientific Manuscript database
Estimation of soil moisture has received considerable attention in the areas of hydrology, agriculture, meteorology and environmental studies because of its role in the partitioning water and energy at the land surface. In this study, the USDA, Agricultural Research Service, Root Zone Water Quality ...
Adolescent HIV Prevention: An Application of the Elaboration Likelihood Model.
ERIC Educational Resources Information Center
Metzler, April E.; Weiskotten, David; Morgen, Keith J.
Ninth grade students (n=298) participated in a study to examine the influence of source credibility, message quality, and personal relevance on HIV prevention message efficacy. A pilot study with adolescent focus groups created the high and low quality messages, as well as the high (HIV+) and low (worried parent) credibility sources. Participants…
USDA-ARS?s Scientific Manuscript database
Wind erosion of soil is a major concern of the agricultural community as it removes the most fertile part of the soil and thus degrades soil productivity. Furthermore, dust emissions due to wind erosion contribute to poor air quality, reduce visibility, and cause perturbations to regional radiation ...
Planning and assessment in land and water resource management are evolving from simple, local-scale problems toward complex, spatially explicit regional ones. Such problems have to be addressed with distributed models that can compute runoff and erosion at different spatial and t...
The DISCOVER-AQ project (Deriving Information on Surface conditions from Column and Vertically Resolved Observations Relevant to Air Quality), is a joint collaboration between NASA, U.S. EPA and a number of other local organizations with the goal of characterizing air quality in ...
A Quantitative Quality Control Model for Parallel and Distributed Crowdsourcing Tasks
ERIC Educational Resources Information Center
Zhu, Shaojian
2014-01-01
Crowdsourcing is an emerging research area that has experienced rapid growth in the past few years. Although crowdsourcing has demonstrated its potential in numerous domains, several key challenges continue to hinder its application. One of the major challenges is quality control. How can crowdsourcing requesters effectively control the quality…
ERIC Educational Resources Information Center
Faulkner, Jane B.
Institutional accreditation is a voluntary, non-governmental activity administered by the eight postsecondary accrediting institutions that are part of the six regional associations that serve colleges and universities in the United States. The author cites W. Edwards Deming's work on corporate quality improvement, and its applicability to…
Development of an Aura Chemical Reanalysis in support of Air Quality Applications
NASA Astrophysics Data System (ADS)
Pierce, R. B.; Lenzen, A.; Schaack, T.
2015-12-01
We present results of chemical data assimilation experiments utilizing the NOAA National Environmental Satellite, Data, and Information Service (NESDIS), University of Wisconsin Space Science and Engineering Center (SSEC) Real-time Air Quality Modeling System (RAQMS) in conjunction with the NOAA National Centers for Environmental Prediction (NCEP) Operational Gridpoint Statistical Interpolation (GSI) 3-dimensional variational data assimilation system. The impact of assimilating NASA Ozone Monitoring Instrument (OMI) total column ozone, OMI tropospheric nitrogen dioxide columns, and Microwave Limb Sounder (MLS) stratospheric ozone profiles on background ozone is assessed using measurements from the 2010 NSF High-performance Instrumented Airborne Platform for Environmental Research (HIAPER) Pole-to-Pole Observation (HIPPO) and NOAA California Nexus (CalNex) campaigns. Results show that the RAQMS/GSI Chemical Reanalysis is able to provide very good estimates of background ozone and large-scale ozone variability and is suitable for use in constraining regional air quality modeling activities. These experiments are being used to guide the development of a multi-year global chemical and aerosol reanalysis using NASA Aura and A-Train measurements to support air quality applications.
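The variational analysis step at the heart of a system like GSI can be sketched on a toy state vector: the analysis x_a minimizes J(x) = (x - x_b)^T B^-1 (x - x_b) + (y - Hx)^T R^-1 (y - Hx), which for a linear observation operator H has the closed form used below. The three-level ozone background, covariances, and observations are illustrative assumptions.

import numpy as np

x_b = np.array([50.0, 55.0, 60.0])              # background ozone (ppb) at three levels
B = np.array([[4.0, 2.0, 1.0],
              [2.0, 4.0, 2.0],
              [1.0, 2.0, 4.0]])                  # background error covariance
H = np.array([[0.5, 0.5, 0.0],                   # two observations of layer means
              [0.0, 0.5, 0.5]])
y = np.array([58.0, 63.0])                       # observed values (ppb)
R = np.diag([1.0, 1.0])                          # observation error covariance

# Kalman-gain form of the 3D-Var solution
K = B @ H.T @ np.linalg.inv(H @ B @ H.T + R)
x_a = x_b + K @ (y - H @ x_b)
print("analysis:", np.round(x_a, 2))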
van den Ban, Sander; Pitt, Kendal G; Whiteman, Marshall
2018-02-01
A scientific understanding of interaction of product, film coat, film coating process, and equipment is important to enable design and operation of industrial scale pharmaceutical film coating processes that are robust and provide the level of control required to consistently deliver quality film coated product. Thermodynamic film coating conditions provided in the tablet film coating process impact film coat formation and subsequent product quality. A thermodynamic film coating model was used to evaluate film coating process performance over a wide range of film coating equipment from pilot to industrial scale (2.5-400 kg). An approximate process-imposed transition boundary, from operating in a dry to a wet environment, was derived, for relative humidity and exhaust temperature, and used to understand the impact of the film coating process on product formulation and process control requirements. This approximate transition boundary may aid in an enhanced understanding of risk to product quality, application of modern Quality by Design (QbD) based product development, technology transfer and scale-up, and support the science-based justification of critical process parameters (CPPs).
NASA Astrophysics Data System (ADS)
Osterman, G. B.; Eldering, A.; Neu, J. L.; Tang, Y.; McQueen, J.; Pinder, R. W.
2011-12-01
To help protect human health and ecosystems, regional-scale atmospheric chemistry models are used to forecast high ozone events and to design emission control strategies to decrease the frequency and severity of ozone events. Despite the impact that nighttime aloft ozone can have on surface ozone, regional-scale atmospheric chemistry models often do not simulate the nighttime ozone concentrations well, nor do they sufficiently capture the ozone transport patterns. Fully characterizing the importance of the nighttime ozone has been hampered by limited measurements of the vertical distribution of ozone and ozone-precursors. The main focus of this work is to begin to utilize remote sensing data sets to characterize the impact of nighttime aloft ozone on air quality events. We will describe our plans to use NASA satellite data sets, transport models and air quality models to study ozone transport, focusing primarily on nighttime ozone and provide initial results. We will use satellite and ozonesonde data to help understand how well the air quality models are simulating ozone in the lower free troposphere and attempt to characterize the impact of nighttime ozone on air quality events. Our specific objectives are: 1) Characterize nighttime aloft ozone using remote sensing data and sondes. 2) Evaluate the ability of the Community Multi-scale Air Quality (CMAQ) model and the National Air Quality Forecast Capability (NAQFC) model to capture the nighttime aloft ozone and its relationship to air quality events. 3) Analyze a set of air quality events and determine the relationship of air quality events to the nighttime aloft ozone. We will achieve our objectives by utilizing the ozone profile data from the NASA Earth Observing System (EOS) Tropospheric Emission Spectrometer (TES) and other sensors, ozonesonde data collected during the Aura mission (IONS), EPA AirNow ground station ozone data, the CMAQ continental-scale air quality model, and the National Air Quality Forecast model.
Cross-layer Energy Optimization Under Image Quality Constraints for Wireless Image Transmissions.
Yang, Na; Demirkol, Ilker; Heinzelman, Wendi
2012-01-01
Wireless image transmission is critical in many applications, such as surveillance and environment monitoring. In order to make the best use of the limited energy of the battery-operated cameras, while satisfying the application-level image quality constraints, cross-layer design is critical. In this paper, we develop an image transmission model that allows the application layer (e.g., the user) to specify an image quality constraint, and optimizes the lower layer parameters of transmit power and packet length, to minimize the energy dissipation in image transmission over a given distance. The effectiveness of this approach is evaluated by applying the proposed energy optimization to a reference ZigBee system and a WiFi system, and also by comparing to an energy optimization study that does not consider any image quality constraint. Evaluations show that our scheme outperforms the default settings of the investigated commercial devices and saves a significant amount of energy at middle-to-large transmission distances.
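The cross-layer search can be illustrated with a short Python sketch under toy link and energy models: grid-search transmit power and packet length, keep only settings whose expected image delivery probability meets the application constraint, and choose the lowest energy per image. The path-loss model, link-budget constant, energy constants, and bitrate are illustrative assumptions, not the paper's ZigBee or WiFi parameters.

import math

IMAGE_BITS = 200_000
HEADER_BITS = 240
E_ELEC_PER_BIT = 50e-9            # J/bit for transceiver electronics (toy value)
QUALITY_TARGET = 0.95             # required probability of delivering the whole image

def bit_error_rate(tx_power_mw, distance_m):
    snr = tx_power_mw * 2e5 / distance_m ** 2.7             # toy link budget
    return 0.5 * math.exp(-snr / 2)

def evaluate(tx_power_mw, payload_bits, distance_m):
    ber = bit_error_rate(tx_power_mw, distance_m)
    per = 1 - (1 - ber) ** (payload_bits + HEADER_BITS)     # packet error rate
    n_packets = math.ceil(IMAGE_BITS / payload_bits)
    delivery = (1 - per) ** n_packets                       # all packets arrive intact
    bits_sent = n_packets * (payload_bits + HEADER_BITS)
    energy = bits_sent * (E_ELEC_PER_BIT + tx_power_mw * 1e-3 / 250_000)  # 250 kbps radio
    return delivery, energy

best = None
for power in [0.1, 0.3, 1.0, 3.0, 10.0]:                    # mW
    for payload in [256, 512, 1024, 2048, 4096]:            # bits
        delivery, energy = evaluate(power, payload, distance_m=60)
        if delivery >= QUALITY_TARGET and (best is None or energy < best[0]):
            best = (energy, power, payload)
print("lowest-energy feasible setting (J, mW, payload bits):", best)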
Hyperspectral imaging of water quality - past applications and future directions.
NASA Astrophysics Data System (ADS)
Ross, M. R. V.; Pavelsky, T.
2017-12-01
Inland waters control the delivery of sediment, carbon, and nutrients from land to ocean by transforming, depositing, and transporting constituents downstream. However, the dominant in situ conditions that control these processes are poorly constrained, especially at larger spatial scales. Hyperspectral imaging, a remote sensing technique that uses reflectance in hundreds of narrow spectral bands, can be used to estimate water quality parameters like sediment and carbon concentration over larger water bodies. Here, we review methods and applications for using hyperspectral imagery to generate near-surface two-dimensional models of water quality in lakes and rivers. Further, we show applications using newly available data from the National Ecological Observatory Network aerial observation platform in the Black Warrior and Tombigbee Rivers, Alabama. We demonstrate large spatial variation in chlorophyll, colored dissolved organic matter, and turbidity in each river and uneven mixing of water quality constituents for several kilometers. Finally, we demonstrate some novel techniques using hyperspectral imagery to deconvolve dissolved organic matter spectral signatures to specific organic matter components.
NASA Technical Reports Server (NTRS)
Wang, Nanbor; Parameswaran, Kirthika; Kircher, Michael; Schmidt, Douglas
2003-01-01
Although existing CORBA specifications, such as Real-time CORBA and CORBA Messaging, address many end-to-end quality-of-service (QoS) properties, they do not define strategies for configuring these properties into applications flexibly, transparently, and adaptively. Therefore, application developers must make these configuration decisions manually and explicitly, which is tedious, error-prone, and often sub-optimal. Although the recently adopted CORBA Component Model (CCM) does define a standard configuration framework for packaging and deploying software components, conventional CCM implementations focus on functionality rather than adaptive quality-of-service, which makes them unsuitable for next-generation applications with demanding QoS requirements. This paper presents three contributions to the study of middleware for QoS-enabled component-based applications. It outlines reflective middleware techniques designed to adaptively (1) select optimal communication mechanisms, (2) manage QoS properties of CORBA components in their containers, and (3) (re)configure selected component executors dynamically. Based on our ongoing research on CORBA and the CCM, we believe the application of reflective techniques to component middleware will provide a dynamically adaptive and (re)configurable framework for COTS software that is well-suited for the QoS demands of next-generation applications.
NASA Technical Reports Server (NTRS)
Wang, Nanbor; Kircher, Michael; Schmidt, Douglas C.
2000-01-01
Although existing CORBA specifications, such as Real-time CORBA and CORBA Messaging, address many end-to-end quality-of-service (QoS) properties, they do not define strategies for configuring these properties into applications flexibly, transparently, and adaptively. Therefore, application developers must make these configuration decisions manually and explicitly, which is tedious, error-prone, and often sub-optimal. Although the recently adopted CORBA Component Model (CCM) does define a standard configuration framework for packaging and deploying software components, conventional CCM implementations focus on functionality rather than adaptive quality-of-service, which makes them unsuitable for next-generation applications with demanding QoS requirements. This paper presents three contributions to the study of middleware for QoS-enabled component-based applications. It outlines reflective middleware techniques designed to adaptively: (1) select optimal communication mechanisms, (2) manage QoS properties of CORBA components in their containers, and (3) (re)configure selected component executors dynamically. Based on our ongoing research on CORBA and the CCM, we believe the application of reflective techniques to component middleware will provide a dynamically adaptive and (re)configurable framework for COTS software that is well-suited for the QoS demands of next-generation applications.
NASA Astrophysics Data System (ADS)
Aufdenkampe, A. K.; Mayorga, E.; Tarboton, D. G.; Sazib, N. S.; Horsburgh, J. S.; Cheetham, R.
2016-12-01
The Model My Watershed Web app (http://wikiwatershed.org/model/) was designed to enable citizens, conservation practitioners, municipal decision-makers, educators, and students to interactively select any area of interest anywhere in the continental USA to: (1) analyze real land use and soil data for that area; (2) model stormwater runoff and water-quality outcomes; and (3) compare how different conservation or development scenarios could modify runoff and water quality. The BiG CZ Data Portal is a web application for scientists for intuitive, high-performance map-based discovery, visualization, access and publication of diverse earth and environmental science data via a map-based interface that simultaneously performs geospatial analysis of selected GIS and satellite raster data for a selected area of interest. The two web applications share a common codebase (https://github.com/WikiWatershed and https://github.com/big-cz), high performance geospatial analysis engine (http://geotrellis.io/ and https://github.com/geotrellis) and deployment on the Amazon Web Services (AWS) cloud cyberinfrastructure. Users can use "on-the-fly" rapid watershed delineation over the national elevation model to select their watershed or catchment of interest. The two web applications also share the goal of enabling the scientists, resource managers and students alike to share data, analyses and model results. We will present these functioning web applications and their potential to substantially lower the bar for studying and understanding our water resources. We will also present work in progress, including a prototype system for enabling citizen-scientists to register open-source sensor stations (http://envirodiy.org/mayfly/) to stream data into these systems, so that they can be reshared using Water One Flow web services.
Space shuttle flying qualities and criteria assessment
NASA Technical Reports Server (NTRS)
Myers, T. T.; Johnston, D. E.; Mcruer, Duane T.
1987-01-01
Work accomplished under a series of study tasks for the Flying Qualities and Flight Control Systems Design Criteria Experiment (OFQ) of the Shuttle Orbiter Experiments Program (OEX) is summarized. The tasks involved review of the applicability of existing flying quality and flight control system specifications and criteria for the Shuttle; identification of potentially crucial flying quality deficiencies; dynamic modeling of the Shuttle Orbiter pilot/vehicle system in the terminal flight phases; devising a nonintrusive experimental program for extraction and identification of vehicle dynamics, pilot control strategy, and approach and landing performance metrics; and preparation of an OEX approach to produce a data archive and optimize use of the data to develop flying qualities for future space shuttle craft in general. Also covered are analytic modeling of the Orbiter's unconventional closed-loop dynamics in landing; modeling of pilot control strategies; verification of vehicle dynamics and pilot control strategy from flight data; a review of various existing or proposed aircraft flying quality parameters and criteria in comparison with the unique dynamic characteristics and control aspects of the Shuttle in landing; and finally a summary of conclusions and recommendations for developing flying quality criteria and design guides for future Shuttle craft.
Mesh quality oriented 3D geometric vascular modeling based on parallel transport frame.
Guo, Jixiang; Li, Shun; Chui, Yim Pan; Qin, Jing; Heng, Pheng Ann
2013-08-01
While a number of methods have been proposed to reconstruct geometrically and topologically accurate 3D vascular models from medical images, little attention has been paid to constantly maintain high mesh quality of these models during the reconstruction procedure, which is essential for many subsequent applications such as simulation-based surgical training and planning. We propose a set of methods to bridge this gap based on parallel transport frame. An improved bifurcation modeling method and two novel trifurcation modeling methods are developed based on 3D Bézier curve segments in order to ensure the continuous surface transition at furcations. In addition, a frame blending scheme is implemented to solve the twisting problem caused by frame mismatch of two successive furcations. A curvature based adaptive sampling scheme combined with a mesh quality guided frame tilting algorithm is developed to construct an evenly distributed, non-concave and self-intersection free surface mesh for vessels with distinct radius and high curvature. Extensive experiments demonstrate that our methodology can generate vascular models with better mesh quality than previous methods in terms of surface mesh quality criteria. Copyright © 2013 Elsevier Ltd. All rights reserved.
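The frame construction the paper builds on can be illustrated in Python: the normal vector is carried along a discretized centerline by the minimal rotation (Rodrigues' formula) that maps one tangent onto the next, which avoids the twisting of Frenet frames. The helix centerline below is illustrative; this is a sketch of the general technique, not the paper's implementation.

import numpy as np

def parallel_transport_frames(points):
    tangents = np.gradient(points, axis=0)
    tangents /= np.linalg.norm(tangents, axis=1, keepdims=True)
    ref = np.array([0.0, 0.0, 1.0])                         # any vector not parallel to the first tangent
    if abs(np.dot(ref, tangents[0])) > 0.9:
        ref = np.array([1.0, 0.0, 0.0])
    normal = np.cross(tangents[0], ref)
    normal /= np.linalg.norm(normal)
    frames = [(tangents[0], normal, np.cross(tangents[0], normal))]
    for t_prev, t_next in zip(tangents[:-1], tangents[1:]):
        axis = np.cross(t_prev, t_next)
        s = np.linalg.norm(axis)
        if s < 1e-12:                                       # tangents parallel: no rotation needed
            frames.append((t_next, normal, np.cross(t_next, normal)))
            continue
        axis /= s
        angle = np.arccos(np.clip(np.dot(t_prev, t_next), -1.0, 1.0))
        # rotate the normal about the axis by the angle between successive tangents
        normal = (normal * np.cos(angle)
                  + np.cross(axis, normal) * np.sin(angle)
                  + axis * np.dot(axis, normal) * (1 - np.cos(angle)))
        frames.append((t_next, normal, np.cross(t_next, normal)))
    return frames

s = np.linspace(0, 4 * np.pi, 200)
helix = np.column_stack([np.cos(s), np.sin(s), 0.3 * s])    # illustrative centerline
print("last frame normal:", np.round(parallel_transport_frames(helix)[-1][1], 3))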
NASA Astrophysics Data System (ADS)
Banerjee, Polash; Ghose, Mrinal Kanti; Pradhan, Ratika
2018-05-01
Spatial analysis of water quality impact assessment of highway projects in mountainous areas remains largely unexplored. A methodology is presented here for Spatial Water Quality Impact Assessment (SWQIA) due to highway-broadening-induced vehicular traffic change in the East district of Sikkim. Pollution load of the highway runoff was estimated using an Average Annual Daily Traffic-based empirical model in combination with a mass balance model to predict pollution in the rivers within the study area. Spatial interpolation and overlay analysis were used for impact mapping. An Analytic Hierarchy Process-based Water Quality Status Index was used to prepare a composite impact map. Model validation criteria, cross-validation criteria, and spatially explicit sensitivity analysis show that the SWQIA model is robust. The study shows that vehicular traffic is a significant contributor to water pollution in the study area. The model caters specifically to impact analysis of the project concerned and can serve as an aid to a decision support system for the project stakeholders. The applicability of the SWQIA model needs to be explored and validated in the context of a larger set of water quality parameters and project scenarios at a greater spatial scale.
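The Analytic Hierarchy Process step behind such a composite index can be sketched in Python: criterion weights come from the principal eigenvector of a pairwise comparison matrix, a consistency ratio checks the judgments, and the weighted sum of normalized sub-indices gives the composite score. The three criteria, their pairwise judgments, and the sub-index values are illustrative assumptions, not those of the SWQIA study.

import numpy as np

criteria = ["BOD", "turbidity", "heavy metals"]
# Saaty-style pairwise comparisons: entry (i, j) is the importance of criterion i relative to j
A = np.array([[1.0, 3.0, 0.5],
              [1/3, 1.0, 0.25],
              [2.0, 4.0, 1.0]])

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
weights = np.abs(eigvecs[:, k].real)
weights /= weights.sum()

n = A.shape[0]
ci = (eigvals[k].real - n) / (n - 1)             # consistency index
cr = ci / 0.58                                    # random index for n = 3
print(dict(zip(criteria, np.round(weights, 3))), f"consistency ratio = {cr:.3f}")

subindex = np.array([0.7, 0.5, 0.9])              # normalized sub-index scores (0-1)
print("composite water quality status index:", round(float(weights @ subindex), 3))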
Cost Models for MMC Manufacturing Processes
NASA Technical Reports Server (NTRS)
Elzey, Dana M.; Wadley, Haydn N. G.
1996-01-01
Processes for the manufacture of advanced metal matrix composites are rapidly approaching maturity in the research laboratory and there is growing interest in their transition to industrial production. However, research conducted to date has almost exclusively focused on overcoming the technical barriers to producing high-quality material, and little attention has been given to the economic feasibility of these laboratory approaches and process cost issues. A quantitative cost modeling (QCM) approach was developed to address these issues. QCMs are cost analysis tools based on predictive process models relating process conditions to the attributes of the final product. An important attribute of the QCM approach is the ability to predict the sensitivity of material production costs to product quality and to quantitatively explore trade-offs between cost and quality. Applications of the cost models allow more efficient direction of future MMC process technology development and a more accurate assessment of MMC market potential. Cost models were developed for two state-of-the-art metal matrix composite (MMC) manufacturing processes: tape casting and plasma spray deposition. Quality and cost models are presented for both processes and the resulting predicted quality-cost curves are presented and discussed.
Wang, Li; Cai, Xuejiao; Cheng, Ping
2018-05-30
The management of medical devices is crucial to safe, high-quality surgical care, but has received little attention in the medical literature. This study explored the effect of a sub-specialties management model in the Central Sterile Supply Department (CSSD). A traditional routine management model (control) was applied from September 2015 through April 2016, and a newly developed sub-specialties management model (observation) was applied from July 2016 through February 2017. Health personnel from various clinical departments were randomly selected to participate as the control (n = 86) and observation (n = 90) groups, respectively. The groups were compared for rates of personnel satisfaction, complaints regarding device errors, and damage of medical devices. The satisfaction score of the observation group (95.8 ± 1.2) was significantly higher than that of the control (90.2 ± 2.3; P = 0.000). The rate of complaints of the observation group (3.3%) was significantly lower than that of the control (11.6%; P = 0.035). The quality control regarding recycle and packing was significantly higher during the observation period than the control period, which favorably influenced the scores for satisfaction. The rate of damage to specialist medical devices during the observation period (0.40%) was lower than during the control period (0.61%; P = 0.003). The theoretical knowledge and practical skills of the CSSD professionals improved after application of the sub-specialties management model. A management model that considers the requirements of specialist medical devices can improve quality control in the CSSD.
Although the focus in the 1970s was primarily on urban air pollution models, it is well known that pollution problems such as acid rain, ozone, and fine particulate matter are regional in scope, requiring regional-scale multipollutant models. In North America and Europe, several ...
The purpose of this study is to evaluate the Urban Airshed Model (UAM), a three-dimensional photochemical urban air quality simulation model, using field observations from the Tokyo Metropolitan Area. mphasis was placed on the photochemical smog formation mechanism under stagnant...
Application of uncertainty and sensitivity analysis to the air quality SHERPA modelling tool
NASA Astrophysics Data System (ADS)
Pisoni, E.; Albrecht, D.; Mara, T. A.; Rosati, R.; Tarantola, S.; Thunis, P.
2018-06-01
Air quality has significantly improved in Europe over the past few decades. Nonetheless, we still find high concentrations in measurements mainly in specific regions or cities. This dimensional shift, from EU-wide to hot-spot exceedances, calls for a novel approach to regional air quality management (to complement EU-wide existing policies). The SHERPA (Screening for High Emission Reduction Potentials on Air quality) modelling tool was developed in this context. It provides an additional tool to be used in support of regional/local decision makers responsible for the design of air quality plans. It is therefore important to evaluate the quality of the SHERPA model, and its behavior in the face of various kinds of uncertainty. Uncertainty and sensitivity analysis techniques can be used for this purpose. They both reveal the links between assumptions and forecasts, help in model simplification and may highlight unexpected relationships between inputs and outputs. Thus, a policy-steered SHERPA module - predicting air quality improvement linked to emission reduction scenarios - was evaluated by means of (1) uncertainty analysis (UA) to quantify uncertainty in the model output, and (2) sensitivity analysis (SA) to identify the most influential input sources of this uncertainty. The results of this study provide relevant information about the key variables driving the SHERPA output uncertainty, and advise policy-makers and modellers where to place their efforts for an improved decision-making process.
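The UA/SA workflow can be illustrated on a stand-in source-receptor relationship (not the SHERPA module itself): Monte Carlo sampling of uncertain inputs propagates to the output for the uncertainty analysis, and squared standardized regression coefficients give a first-cut sensitivity ranking. The input names, ranges, and surrogate formula below are illustrative assumptions.

import numpy as np

rng = np.random.default_rng(3)
n = 5000
inputs = {
    "nox_reduction": rng.uniform(0.0, 0.4, n),
    "pm_reduction": rng.uniform(0.0, 0.4, n),
    "transfer_coeff": rng.normal(1.0, 0.15, n),
}

def concentration_change(x):
    # toy surrogate: PM2.5 improvement from precursor cuts scaled by a transfer term
    return x["transfer_coeff"] * (0.6 * x["pm_reduction"] + 0.25 * x["nox_reduction"])

y = concentration_change(inputs)
print(f"UA: mean {y.mean():.3f}, 5-95% range [{np.quantile(y, 0.05):.3f}, {np.quantile(y, 0.95):.3f}]")

X = np.column_stack(list(inputs.values()))
Xs = (X - X.mean(axis=0)) / X.std(axis=0)
ys = (y - y.mean()) / y.std()
src, *_ = np.linalg.lstsq(Xs, ys, rcond=None)     # standardized regression coefficients
for name, coef in zip(inputs, src):
    print(f"SA: {name:15s} SRC^2 = {coef ** 2:.3f}")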
Kircher, J.E.; Dinicola, Richard S.; Middelburg, R.F.
1984-01-01
Monthly values were computed for water-quality constituents at four streamflow gaging stations in the Upper Colorado River basin for the determination of trends. Seasonal regression and seasonal Kendall trend analysis techniques were applied to two monthly data sets at each station site for four different time periods. A recently developed method for determining optimal water-discharge data-collection frequency was also applied to the monthly water-quality data. Trend analysis results varied with each monthly load computational method, period of record, and trend detection model used. No conclusions could be reached regarding which computational method was best to use in trend analysis. Time-period selection for analysis was found to be important with regard to intended use of the results. Seasonal Kendall procedures were found to be applicable to most data sets. Seasonal regression models were more difficult to apply and were sometimes of questionable validity; however, those results were more informative than seasonal Kendall results. The best model to use depends upon the characteristics of the data and the amount of trend information needed. The measurement-frequency optimization method had potential for application to water-quality data, but refinements are needed. (USGS)
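The seasonal Kendall procedure referred to above can be sketched in Python: the Mann-Kendall S statistic is computed within each season (here, month) and summed across seasons, and the summed variance (ignoring tie corrections) gives an approximate Z score. The synthetic ten-year monthly concentration record is illustrative, not the basin data.

import numpy as np

def seasonal_kendall(values, seasons):
    s_total, var_total = 0.0, 0.0
    for season in np.unique(seasons):
        x = values[seasons == season]               # values for one season, in time order
        n = len(x)
        s = sum(np.sign(x[j] - x[i]) for i in range(n - 1) for j in range(i + 1, n))
        s_total += s
        var_total += n * (n - 1) * (2 * n + 5) / 18.0
    if s_total > 0:
        z = (s_total - 1) / np.sqrt(var_total)
    elif s_total < 0:
        z = (s_total + 1) / np.sqrt(var_total)
    else:
        z = 0.0
    return s_total, z

rng = np.random.default_rng(4)
years, months = np.meshgrid(np.arange(10), np.arange(12), indexing="ij")
conc = 5 + 0.2 * years + 2 * np.sin(2 * np.pi * months / 12) + rng.normal(0, 0.5, years.shape)
s, z = seasonal_kendall(conc.ravel(), months.ravel())
print(f"seasonal Kendall S = {s:.0f}, Z = {z:.2f} (|Z| > 1.96 suggests a monotonic trend)")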
Sequential Sampling Models in Cognitive Neuroscience: Advantages, Applications, and Extensions.
Forstmann, B U; Ratcliff, R; Wagenmakers, E-J
2016-01-01
Sequential sampling models assume that people make speeded decisions by gradually accumulating noisy information until a threshold of evidence is reached. In cognitive science, one such model--the diffusion decision model--is now regularly used to decompose task performance into underlying processes such as the quality of information processing, response caution, and a priori bias. In the cognitive neurosciences, the diffusion decision model has recently been adopted as a quantitative tool to study the neural basis of decision making under time pressure. We present a selective overview of several recent applications and extensions of the diffusion decision model in the cognitive neurosciences.
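A minimal simulation of the diffusion decision model makes the mechanism concrete: noisy evidence accumulates from a starting point toward one of two boundaries, and the crossing time plus a non-decision time gives the response time. The parameter values below are illustrative, not estimates from any study.

import numpy as np

def simulate_ddm(drift=0.25, boundary=1.0, start=0.5, noise=1.0,
                 non_decision=0.3, dt=0.001, n_trials=2000, seed=5):
    rng = np.random.default_rng(seed)
    rts, choices = [], []
    for _ in range(n_trials):
        x, t = start * boundary, 0.0                      # start as a fraction of the boundary
        while 0.0 < x < boundary:
            x += drift * dt + noise * np.sqrt(dt) * rng.standard_normal()
            t += dt
        rts.append(t + non_decision)
        choices.append(1 if x >= boundary else 0)         # 1 = upper boundary response
    return np.array(rts), np.array(choices)

rts, choices = simulate_ddm()
print(f"upper-boundary proportion {choices.mean():.2f}, mean RT {rts.mean():.2f} s")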
Representing urban terrain characteristics in mesoscale meteorological and dispersion models is critical to produce accurate predictions of wind flow and temperature fields, air quality, and contaminant transport. A key component of the urban terrain representation is the charac...
42 CFR § 512.460 - Compliance enforcement.
Code of Federal Regulations, 2010 CFR
2017-10-01
... (CONTINUED) HEALTH CARE INFRASTRUCTURE AND MODEL PROGRAMS EPISODE PAYMENT MODEL Quality Measures, Beneficiary... regulations under this part must not be construed to affect the applicable payment, coverage, program..., commonly referred to as a CAP. (iii) Reducing or eliminating the EPM participant's reconciliation payment...
Congdon, Peter
2016-01-01
Background: Enhanced quality of care and improved access are central to effective primary care management of long term conditions. However, research evidence is inconclusive in establishing a link between quality of primary care, or access, and adverse outcomes, such as unplanned hospitalisation. Methods: This paper proposes a structural equation model for quality and access as latent variables affecting adverse outcomes, such as unplanned hospitalisations. In a case study application, quality of care (QOC) is defined in relation to diabetes, and the aim is to assess impacts of care quality and access on unplanned hospital admissions for diabetes, while allowing also for socio-economic deprivation, diabetes morbidity, and supply effects. The study involves 90 general practitioner (GP) practices in two London Clinical Commissioning Groups, using clinical quality of care indicators, and patient survey data on perceived access. Results: As a single predictor, quality of care has a significant negative impact on emergency admissions, and this significant effect remains when socio-economic deprivation and morbidity are allowed for. In a full structural equation model including access, the probability that QOC negatively impacts on unplanned admissions exceeds 0.9. Furthermore, poor access is linked to deprivation, diminished QOC, and larger list sizes. Conclusions: Using a Bayesian inference methodology, the evidence from the analysis is weighted towards negative impacts of higher primary care quality and improved access on unplanned admissions. The methodology of the paper is potentially applicable to other long term conditions, and relevant when care quality and access cannot be measured directly and are better regarded as latent variables. PMID:27598184
Predicting Nitrogen in Streams: A Comparison of Two Estimates of Fertilizer Application
NASA Astrophysics Data System (ADS)
Mehaffey, M.; Neale, A.
2011-12-01
Decision makers frequently rely on water and air quality models to develop nutrient management strategies. The results of these models (e.g., SWAT, SPARROW, CMAQ) are only as good as the nutrient source input data, and the Nutrient Innovations Task Group has recently called for a better accounting of nonpoint nutrient sources. Currently, modelers frequently rely on county-level fertilizer sales records combined with crop acreage to estimate nitrogen sources from fertilizer for counties or watersheds. However, since fertilizer sales data are based on reported amounts, they do not necessarily reflect actual use on the fields. In addition, the quality of the reported sales data varies by state, resulting in differing accuracy between states. In this study we examine an alternative method that potentially provides a more uniform, spatially explicit estimate of fertilizer use. Our nitrogen application data are estimated at a 30 m pixel resolution, which allows for scalable inputs for use in water and air quality models. To develop this dataset we combined raster data from the National Cropland Data Layer (CDL) with the National Land Cover Data (NLCD). This process expanded the NLCD's 'cultivated crops' classes to include major grains, cover crops, and vegetables and fruits. The Agricultural Resource Management Survey chemical fertilizer application rate data were summarized by crop type and year for each state, encompassing the corn, soybean, spring wheat, and winter wheat crop types (ARMS, 2002-2005). The chemical fertilizer application rate data were then used to estimate annual application parameters for nitrogen, phosphate, potash, herbicide, pesticide, and total pesticide, all expressed on a mass-per-unit-crop-area basis for each state and crop type. By linking crop types to nitrogen application rates, we can better estimate where applied fertilizer is likely to exceed the amounts used by crops, or where conservation practices may improve retention and uptake, helping offset the impacts to water. To test the accuracy of our finer resolution nitrogen application data, we compare its ability to predict nitrogen concentrations in streams with that of the county sales data.
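The gridded approach described above, mapping a crop-type raster to per-crop application rates, can be illustrated with a small Python sketch. The crop codes and nitrogen rates below are hypothetical stand-ins, not the CDL legend or ARMS values used in the study.

import numpy as np

CROP_CODES = {0: "non-crop", 1: "corn", 2: "soybean", 3: "spring wheat", 4: "winter wheat"}
N_RATE = {"non-crop": 0.0, "corn": 150.0, "soybean": 20.0,
          "spring wheat": 80.0, "winter wheat": 90.0}      # kg N/ha, made-up values

rng = np.random.default_rng(2)
crop_raster = rng.integers(0, 5, size=(100, 100))           # stand-in for a 30 m crop-type tile

rate_lookup = np.array([N_RATE[CROP_CODES[c]] for c in range(5)])
n_applied = rate_lookup[crop_raster]                         # kg N/ha assigned to each pixel

pixel_ha = 30 * 30 / 10_000.0                                # 0.09 ha per 30 m pixel
print("total N applied on tile (kg):", round((n_applied * pixel_ha).sum(), 1))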
Uncertainty analyses of the calibrated parameter values of a water quality model
NASA Astrophysics Data System (ADS)
Rode, M.; Suhr, U.; Lindenschmidt, K.-E.
2003-04-01
For river basin management, water quality models are increasingly used for the analysis and evaluation of different management measures. However, substantial uncertainties exist in parameter values depending on the available calibration data. In this paper an uncertainty analysis for a water quality model is presented which considers the impact of available model calibration data and the variance of input variables. The investigation was conducted based on four extensive flow-time-related longitudinal surveys in the River Elbe in the years 1996 to 1999, with varying discharges and seasonal conditions. For the model calculations the deterministic model QSIM of the BfG (Germany) was used. QSIM is a one-dimensional water quality model and uses standard algorithms for hydrodynamics and phytoplankton dynamics in running waters, e.g. Michaelis-Menten/Monod kinetics, which are used in a wide range of models. The multi-objective calibration of the model was carried out with the nonlinear parameter estimator PEST. The results show that for individual flow-time-related measuring surveys, very good agreement between model calculations and measured values can be obtained. If these parameters are applied under deviating boundary conditions, substantial errors in the model calculation can occur. These uncertainties can be decreased with an enlarged calibration database: more reliable model parameters can be identified which supply reasonable results over broader boundary conditions. Extending the application of the parameter set to a wider range of water quality conditions leads to a slight reduction in model precision for a specific water quality situation. Moreover, the investigations show that highly variable water quality variables such as algal biomass always permit lower forecast accuracy than variables with lower coefficients of variation, such as nitrate.
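The Michaelis-Menten/Monod growth limitation referred to above has a simple form that can be integrated in a few lines; this is a generic nutrient-limited algal growth sketch with illustrative parameter values, not QSIM's formulation.

import numpy as np
from scipy.integrate import solve_ivp

MU_MAX, K_S, Y, LOSS = 1.2, 0.05, 0.5, 0.1   # illustrative: max growth, half-saturation, yield, loss

def rhs(t, y):
    algae, nutrient = y
    mu = MU_MAX * nutrient / (K_S + nutrient)   # Monod limitation term
    return [mu * algae - LOSS * algae,          # algal biomass growth minus losses
            -mu * algae / Y]                    # nutrient consumed in proportion to growth

sol = solve_ivp(rhs, (0.0, 20.0), [0.01, 1.0])
print("final algal biomass:", round(sol.y[0, -1], 3))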
NASA Astrophysics Data System (ADS)
Henderson, B. H.; Akhtar, F.; Pye, H. O. T.; Napelenok, S. L.; Hutzell, W. T.
2014-02-01
Transported air pollutants receive increasing attention as regulations tighten and global concentrations increase. The need to represent international transport in regional air quality assessments requires improved representation of boundary concentrations. Currently available observations are too sparse vertically to provide boundary information, particularly for ozone precursors, but global simulations can be used to generate spatially and temporally varying lateral boundary conditions (LBC). This study presents a public database of global simulations designed and evaluated for use as LBC for air quality models (AQMs). The database covers the contiguous United States (CONUS) for the years 2001-2010 and contains hourly varying concentrations of ozone, aerosols, and their precursors. The database is complemented by a tool for configuring the global results as inputs to regional scale models (e.g., Community Multiscale Air Quality or Comprehensive Air quality Model with extensions). This study also presents an example application based on the CONUS domain, which is evaluated against satellite retrieved ozone and carbon monoxide vertical profiles. The results show performance is largely within uncertainty estimates for ozone from the Ozone Monitoring Instrument and carbon monoxide from the Measurements Of Pollution In The Troposphere (MOPITT), but there were some notable biases compared with Tropospheric Emission Spectrometer (TES) ozone. Compared with TES, our ozone predictions are high-biased in the upper troposphere, particularly in the south during January. This publication documents the global simulation database, the tool for conversion to LBC, and the evaluation of concentrations on the boundaries. This documentation is intended to support applications that require representation of long-range transport of air pollutants.
An approach for quantitative image quality analysis for CT
NASA Astrophysics Data System (ADS)
Rahimi, Amir; Cochran, Joe; Mooney, Doug; Regensburger, Joe
2016-03-01
An objective and standardized approach to assess image quality of Computed Tomography (CT) systems is required in a wide variety of imaging processes to identify CT systems appropriate for a given application. We present an overview of the framework we have developed to help standardize and objectively assess CT image quality for different models of CT scanners used for security applications. Within this framework, we have developed methods to quantitatively measure metrics that should correlate with feature identification, detection accuracy and precision, and image registration capabilities of CT machines, and to identify strengths and weaknesses in different CT imaging technologies in transportation security. To that end we have designed, developed and constructed phantoms that allow for systematic and repeatable measurements of roughly 88 image quality metrics, representing modulation transfer function, noise equivalent quanta, noise power spectra, slice sensitivity profiles, streak artifacts, CT number uniformity, CT number consistency, object length accuracy, CT number path length consistency, and object registration. Furthermore, we have developed a sophisticated MATLAB-based image analysis tool kit to analyze CT-generated images of phantoms and report these metrics in a format that is standardized across the considered models of CT scanners, allowing for comparative image quality analysis within a CT model or between different CT models. In addition, we have developed a modified sparse principal component analysis (SPCA) method that generates principal components with sparse loadings, as compared to standard principal component analysis (PCA), and used it in conjunction with the Hotelling T2 statistical analysis method to compare, qualify, and detect faults in the tested systems.
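One of the metric families listed above, the noise power spectrum, can be estimated from uniform-region ROIs with a 2-D FFT. The Python sketch below is a simplified illustration (mean-only detrending, synthetic ROIs); it is not the MATLAB tool kit described in the abstract.

import numpy as np

def noise_power_spectrum(rois, pixel_size_mm=0.5):
    # rois: array of shape (n, ny, nx) with uniform-region ROIs from repeated scans.
    n, ny, nx = rois.shape
    nps = np.zeros((ny, nx))
    for roi in rois:
        noise = roi - roi.mean()                       # crude detrend: remove the ROI mean
        nps += np.abs(np.fft.fft2(noise)) ** 2
    return nps * pixel_size_mm ** 2 / (n * ny * nx)    # average and scale by pixel area

rng = np.random.default_rng(3)
fake_rois = rng.normal(0.0, 5.0, size=(16, 64, 64))    # stand-in for phantom ROIs
print("mean NPS value:", round(noise_power_spectrum(fake_rois).mean(), 3))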
NASA Astrophysics Data System (ADS)
van Loon, E. G. C. P.; Schüler, M.; Katsnelson, M. I.; Wehling, T. O.
2016-10-01
We investigate the Peierls-Feynman-Bogoliubov variational principle to map Hubbard models with nonlocal interactions to effective models with only local interactions. We study the renormalization of the local interaction induced by nearest-neighbor interaction and assess the quality of the effective Hubbard models in reproducing observables of the corresponding extended Hubbard models. We compare the renormalization of the local interactions as obtained from numerically exact determinant quantum Monte Carlo to approximate but more generally applicable calculations using dual boson, dynamical mean field theory, and the random phase approximation. These more approximate approaches are crucial for any application with real materials in mind. Furthermore, we use the dual boson method to calculate observables of the extended Hubbard models directly and benchmark these against determinant quantum Monte Carlo simulations of the effective Hubbard model.
NASA Astrophysics Data System (ADS)
Ziemba, Alexander; El Serafy, Ghada
2016-04-01
Ecological modeling and water quality investigations are complex processes which can require a high level of parameterization and a multitude of varying data sets in order to properly execute the model in question. Since models are generally complex, their calibration and validation can benefit from the application of data and information fusion techniques. The data applied to ecological models come from a wide range of sources, such as remote sensing, earth observation, and in-situ measurements, resulting in high variability in the temporal and spatial resolution of the various data sets available to water quality investigators. It is proposed that effective fusion into a comprehensive singular set will provide a more complete and robust data resource with which models can be calibrated, validated, and driven. Each individual product contains a unique valuation of error resulting from the method of measurement and the application of pre-processing techniques. The uncertainty and error are further compounded when the data being fused are of varying temporal and spatial resolution. In order to have a reliable fusion-based model and data set, the uncertainty of the results and the confidence interval of the data being reported must be effectively communicated to those who would utilize the data product or model outputs in a decision making process [2]. Here we review an array of data fusion techniques applied to various remote sensing, earth observation, and in-situ data sets whose domains vary in spatial and temporal resolution. The data sets examined are combined in a manner such that the various classifications of data - complementary, redundant, and cooperative - are all assessed to determine each classification's impact on the propagation and compounding of error. In order to assess the error of the fused data products, a comparison is conducted with data sets containing a known confidence interval and quality rating. We conclude with a quantification of the performance of the data fusion techniques and a recommendation on the feasibility of applying the fused products in operational forecast systems and modeling scenarios. The error bands and confidence intervals derived can be used to clarify the error and confidence of water quality variables produced by prediction and forecasting models. References: [1] F. Castanedo, "A Review of Data Fusion Techniques", The Scientific World Journal, vol. 2013, pp. 1-19, 2013. [2] T. Keenan, M. Carbone, M. Reichstein and A. Richardson, "The model-data fusion pitfall: assuming certainty in an uncertain world", Oecologia, vol. 167, no. 3, pp. 587-597, 2011.
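The simplest case of fusing redundant observations, inverse-variance weighting of two estimates of the same variable with a fused confidence interval, can be written in a few lines. This is a generic sketch of the idea, not one of the specific techniques reviewed above, and the chlorophyll values are hypothetical.

import numpy as np

def fuse(values, variances):
    # Inverse-variance weighted mean and its variance.
    w = 1.0 / np.asarray(variances, dtype=float)
    fused = np.sum(w * np.asarray(values, dtype=float)) / np.sum(w)
    return fused, 1.0 / np.sum(w)

satellite_chl, insitu_chl = 4.2, 3.6                 # mg/m^3, hypothetical chlorophyll estimates
fused, var = fuse([satellite_chl, insitu_chl], [1.0**2, 0.3**2])
ci = 1.96 * np.sqrt(var)
print(f"fused = {fused:.2f} +/- {ci:.2f} mg/m^3 (95% CI)")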
Method for the visualization of landform by mapping using low altitude UAV application
NASA Astrophysics Data System (ADS)
Sharan Kumar, N.; Ashraf Mohamad Ismail, Mohd; Sukor, Nur Sabahiah Abdul; Cheang, William
2018-05-01
Unmanned Aerial Vehicle (UAV) platforms and digital photogrammetry are evolving rapidly in mapping technology, and the significance of and need for digital landform mapping grow each year. In this study, a mapping workflow is applied to obtain two different data sets, an orthophoto and a DSM. Low Altitude Aerial Photography (LAAP) was captured with a low-altitude UAV (drone) carrying a fixed camera, while digital photogrammetric processing in Photo Scan was applied for cartographic data collection. Data processing through photogrammetric and orthomosaic processes constitutes the main application. High image quality is essential for the effectiveness and quality of typical mapping outputs such as the 3D model, Digital Elevation Model (DEM), Digital Surface Model (DSM) and orthoimages. The accuracy of the Ground Control Points (GCP), the flight altitude and the resolution of the camera are essential for a good quality DEM and orthophoto.
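Since flight altitude and camera resolution drive output quality, a quick ground sampling distance (GSD) check is often part of flight planning. The standard formula is sketched below with illustrative camera parameters; these are assumptions, not the camera or altitude used in the study.

def ground_sampling_distance(altitude_m, focal_mm, sensor_width_mm, image_width_px):
    # GSD in cm/pixel: footprint of one pixel on the ground at the given flying height.
    return (altitude_m * 100.0 * sensor_width_mm) / (focal_mm * image_width_px)

gsd = ground_sampling_distance(altitude_m=80, focal_mm=8.8,
                               sensor_width_mm=13.2, image_width_px=5472)
print(f"GSD at 80 m: {gsd:.2f} cm/pixel")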
Quality Improvement With Discrete Event Simulation: A Primer for Radiologists.
Booker, Michael T; O'Connell, Ryan J; Desai, Bhushan; Duddalwar, Vinay A
2016-04-01
The application of simulation software in health care has transformed quality and process improvement. Specifically, software based on discrete-event simulation (DES) has shown the ability to improve radiology workflows and systems. Nevertheless, despite the successful application of DES in the medical literature, the power and value of simulation remain underutilized. For this reason, the basics of DES modeling are introduced, with specific attention to medical imaging. In an effort to provide readers with the tools necessary to begin their own DES analyses, the practical steps of choosing a software package and building a basic radiology model are discussed. In addition, three radiology system examples are presented, with accompanying DES models that assist in analysis and decision making. Through these simulations, we provide readers with an understanding of the theory, requirements, and benefits of implementing DES in their own radiology practices. Copyright © 2016 American College of Radiology. All rights reserved.
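A bare-bones discrete-event simulation in the spirit described above, one CT scanner serving a FIFO patient queue, can be written in pure Python; the arrival and scan-time distributions are illustrative assumptions, not figures from the article.

import heapq, random

def simulate(n_patients=200, mean_interarrival=12.0, mean_scan=10.0, seed=4):
    # Single scanner, exponential interarrival and scan times (minutes), FIFO service.
    random.seed(seed)
    arrivals, t = [], 0.0
    for i in range(n_patients):
        t += random.expovariate(1.0 / mean_interarrival)
        heapq.heappush(arrivals, (t, i))
    scanner_free_at, waits = 0.0, []
    while arrivals:
        arrive_time, _pid = heapq.heappop(arrivals)
        start = max(arrive_time, scanner_free_at)            # wait if the scanner is busy
        waits.append(start - arrive_time)
        scanner_free_at = start + random.expovariate(1.0 / mean_scan)
    return sum(waits) / len(waits)

print("average patient wait (minutes):", round(simulate(), 1))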
Space Shuttle flying qualities and flight control system assessment study, phase 2
NASA Technical Reports Server (NTRS)
Myers, T. T.; Johnston, D. E.; Mcruer, D. T.
1983-01-01
A program of flying qualities experiments as part of the Orbiter Experiments Program (OEX) is defined. Phase 1, published as CR-170391, reviewed flying qualities criteria and shuttle data. The review of applicable experimental and shuttle data to further define the OEX plan is continued. An unconventional feature of this approach is the use of pilot strategy model identification to relate flight and simulator results. Instrumentation, software, and data analysis techniques for pilot model measurements are examined. The relationship between shuttle characteristics and superaugmented aircraft is established. STS flights 1 through 4 are reviewed from the point of view of flying qualities. A preliminary plan for a coordinated program of inflight and simulator research is presented.
Comparative Analyses of MIRT Models and Software (BMIRT and flexMIRT)
ERIC Educational Resources Information Center
Yavuz, Guler; Hambleton, Ronald K.
2017-01-01
Application of MIRT modeling procedures is dependent on the quality of parameter estimates provided by the estimation software and techniques used. This study investigated model parameter recovery of two popular MIRT packages, BMIRT and flexMIRT, under some common measurement conditions. These packages were specifically selected to investigate the…
The NOAA Atmospheric Sciences Modeling Division (ASMD) celebrated its Golden Jubilee in September 2005. The partnership between NOAA and EPA began when the Air Pollution Unit of the Public Health Service, which later became part of the EPA, requested the Weather Bureau provide ...
An Application of the Social Support Deterioration Deterrence Model to Rescue Workers
ERIC Educational Resources Information Center
Prati, Gabriele; Pietrantoni, Luca
2010-01-01
This study examined the role of social support in promoting quality of life in the aftermath of critical incidents involvement. Participants were a sample of 586 Italian rescue workers. Structural equation modelling was used to test the social support deterioration deterrence model. Results showed that the impact of critical incident involvement…
The Community Multiscale Air Quality (CMAQ) model is a state-of-the-science chemical transport model (CTM) capable of simulating the emission, transport and fate of numerous air pollutants. Similarly, the Weather Research and Forecasting (WRF) model is a state-of-the-science mete...
A review of AirQ Models and their applications for forecasting the air pollution health outcomes.
Oliveri Conti, Gea; Heibati, Behzad; Kloog, Itai; Fiore, Maria; Ferrante, Margherita
2017-03-01
Even though clean air is considered a basic requirement for the maintenance of human health, air pollution continues to pose a significant health threat in developed and developing countries alike. Monitoring and modeling of classic and emerging pollutants are vital to our knowledge of health outcomes in exposed subjects and to our ability to predict them. The ability to anticipate and manage changes in atmospheric pollutant concentrations relies on an accurate representation of the chemical state of the atmosphere. The task of providing the best possible analysis of air pollution thus requires efficient computational tools enabling efficient integration of observational data into models. A number of air quality models have been developed and play an important role in air quality management. Even though a large number of air quality models have been discussed or applied, their heterogeneity makes it difficult to select one approach above the others. This paper provides a brief review of air quality models with respect to several aspects, such as prediction of health effects.
The Role of Reliability, Vulnerability and Resilience in the Management of Water Quality Systems
NASA Astrophysics Data System (ADS)
Lence, B. J.; Maier, H. R.
2001-05-01
The risk-based performance indicators reliability, vulnerability and resilience provide measures of the frequency, magnitude and duration of the failure of water resources systems, respectively. They have been applied primarily to water supply problems, including the assessment of the performance of reservoirs and water distribution systems. Applications to water quality case studies have been limited, although the need to consider the length and magnitude of violations of a particular water quality standard has been recognized for some time. In this research, the role of reliability, vulnerability and resilience in water quality management applications is investigated by examining their significance as performance measures for water quality systems and assessing their potential for assisting in decision making processes. The importance of each performance indicator is discussed, and a framework for classifying such systems, based on the relative significance of each of these indicators, is introduced and illustrated qualitatively with various case studies. Quantitative examples drawn from both lake and river water quality modeling exercises are then provided.
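The three indicators named above have simple time-series definitions that can be computed directly from a record checked against a standard. The sketch below uses a common set of definitions and a synthetic daily dissolved-oxygen record with a 5 mg/L threshold; these choices are illustrative, not the formulation used in the cited work.

import numpy as np

def performance_indicators(series, standard=5.0):
    fail = series < standard                                  # True where the standard is violated
    reliability = 1.0 - fail.mean()                           # fraction of time in compliance
    vulnerability = (standard - series[fail]).max() if fail.any() else 0.0   # worst-case violation depth
    # resilience: probability that a failure step is followed by a non-failure step
    recoveries = (~fail[1:]) & fail[:-1]
    resilience = recoveries.sum() / fail[:-1].sum() if fail[:-1].any() else 1.0
    return reliability, vulnerability, resilience

rng = np.random.default_rng(5)
do_mg_l = rng.normal(6.5, 1.2, size=365)                      # hypothetical daily DO record (mg/L)
print(performance_indicators(do_mg_l))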
Loftus, Kelli; Tilley, Terry; Hoffman, Jason; Bradburn, Eric; Harvey, Ellen
2015-01-01
The creation of a consistent culture of safety and quality in an intensive care unit is challenging. We applied the Six Sigma Define-Measure-Analyze-Improve-Control (DMAIC) model for quality improvement (QI) to develop a long-term solution to improve outcomes in a high-risk neurotrauma intensive care unit. We sought to reduce central line utilization as a cornerstone in preventing central line-associated bloodstream infections (CLABSIs). This study describes the successful application of the DMAIC model in the creation and implementation of evidence-based quality improvement designed to reduce CLABSIs to below national benchmarks.
Towards the Next Generation Air Quality Modeling System ...
The community multiscale air quality (CMAQ) model of the U.S. Environmental Protection Agency is one of the most widely used air quality models worldwide; it is employed for both research and regulatory applications at major universities and government agencies for improving understanding of the formation and transport of air pollutants. It is noted, however, that air quality issues and climate change assessments need to be addressed globally, recognizing the linkages and interactions between meteorology and atmospheric chemistry across a wide range of scales. Therefore, an effort is currently underway to develop the next generation air quality modeling system (NGAQM) that will be based on a global integrated meteorology and chemistry system. The model for prediction across scales-atmosphere (MPAS-A), a global fully compressible non-hydrostatic model with seamlessly refined centroidal Voronoi grids, has been chosen as the meteorological driver of this modeling system. The initial step of adapting MPAS-A for the NGAQM was to implement and test the physics parameterizations and options that are preferred for retrospective air quality simulations (see the work presented by R. Gilliam, R. Bullock, and J. Herwehe at this workshop). The next step, presented herein, would be to link the chemistry from CMAQ to MPAS-A to build a prototype for the NGAQM. Furthermore, the techniques to harmonize transport processes between CMAQ and MPAS-A, methodologies to connect the chemis...
Environmental Impacts of Large Scale Biochar Application Through Spatial Modeling
NASA Astrophysics Data System (ADS)
Huber, I.; Archontoulis, S.
2017-12-01
In an effort to study the environmental (emissions, soil quality) and production (yield) impacts of biochar application at regional scales, we coupled the APSIM-Biochar model with the pSIMS parallel platform. So far, the majority of biochar research has concentrated on lab-to-field studies to advance scientific knowledge; regional scale assessments are highly needed to assist decision making. The overall objective of this simulation study was to identify areas in the USA that gain the most environmentally from biochar application, as well as areas where our model predicts a notable yield increase due to the addition of biochar. We present the modifications to both the APSIM biochar and pSIMS components that were necessary to facilitate these large scale model runs across several regions in the United States at a resolution of 5 arcminutes. This study uses the AgMERRA global climate data set (1980-2010) and the Global Soil Dataset for Earth Systems modeling as a basis for creating its simulations, as well as local management operations for maize and soybean cropping systems and different biochar application rates. The regional scale simulation analysis is in progress. Preliminary results show that the model predicts that high quality soils (particularly those common to Iowa cropping systems) do not receive much, if any, production benefit from biochar. However, soils with low soil organic matter (about 0.5%) do get a noteworthy yield increase of around 5-10% in the best cases. We also found N2O emissions to be spatially and temporally specific, increasing in some areas and decreasing in others due to biochar application. In contrast, we found increases in soil organic carbon and plant available water in all soils (top 30 cm) due to biochar application. The magnitude of these increases (% change from the control) was larger in soils with low organic matter (below 1.5%) and smaller in soils with high organic matter (above 3%), and also depended on the biochar application rate.
Combined effect of noise and vibration on passenger acceptance
NASA Technical Reports Server (NTRS)
Leatherwood, J. D.
1984-01-01
An extensive research program conducted at NASA Langley Research Center to develop a comprehensive model of passenger comfort response to combined noise and vibration environments has been completed. This model was developed for use in the prediction and/or assessment of vehicle ride quality and as a ride quality design tool. The model has the unique capability to transform individual elements of vehicle interior noise and vibration into subjective units and to combine the subjective units to produce a total subjective discomfort index, as well as other useful subjective indices. This paper summarizes the basic approach used in the development of the NASA ride comfort model, presents some of the more fundamental results obtained, describes several applications of the model to operational vehicles, and discusses a portable, self-contained ride quality meter system that is a direct hardware/software implementation of the NASA comfort algorithm.
NASA Astrophysics Data System (ADS)
Liu, Qiong; Wang, Wen-xi; Zhu, Ke-ren; Zhang, Chao-yong; Rao, Yun-qing
2014-11-01
Mixed-model assembly line sequencing is significant in reducing the production time and overall cost of production. To improve production efficiency, a mathematical model aiming simultaneously to minimize overtime, idle time and total set-up costs is developed. To obtain high-quality and stable solutions, an advanced scatter search approach is proposed. In the proposed algorithm, a new diversification generation method based on a genetic algorithm is presented to generate a set of potentially diverse and high-quality initial solutions. Many methods, including reference set update, subset generation, solution combination and improvement methods, are designed to maintain the diversification of populations and to obtain high-quality ideal solutions. The proposed model and algorithm are applied and validated in a case company. The results indicate that the proposed advanced scatter search approach is significant for mixed-model assembly line sequencing in this company.
Akbaş, Halil; Bilgen, Bilge; Turhan, Aykut Melih
2015-11-01
This study proposes an integrated prediction and optimization model using multi-layer perceptron neural network and particle swarm optimization techniques. Three different objective functions are formulated. The first is the maximization of methane percentage with a single output. The second is the maximization of biogas production with a single output. The last is the maximization of biogas quality and biogas production with two outputs. Methane percentage, carbon dioxide percentage, and the percentage of other contents are used as the biogas quality criteria. Based on the formulated models and data from a wastewater treatment facility, optimal values of the input variables and their corresponding maximum output values are found for each model. It is expected that the application of the integrated prediction and optimization models will increase biogas production and biogas quality, and contribute to the quantity of electricity production at the wastewater treatment facility. Copyright © 2015 Elsevier Ltd. All rights reserved.
Incorporating Handling Qualities Analysis into Rotorcraft Conceptual Design
NASA Technical Reports Server (NTRS)
Lawrence, Ben
2014-01-01
This paper describes the initial development of a framework to incorporate handling qualities analyses into a rotorcraft conceptual design process. In particular, the paper describes how rotorcraft conceptual design level data can be used to generate flight dynamics models for handling qualities analyses. Also, methods are described that couple a basic stability augmentation system to the rotorcraft flight dynamics model to extend the analysis beyond that of the bare airframe. A methodology for calculating the handling qualities characteristics of the flight dynamics models and for comparing the results to ADS-33E criteria is described. Preliminary results from the application of the handling qualities analysis for variations in key rotorcraft design parameters of main rotor radius, blade chord, hub stiffness and flap moment of inertia are shown. Varying relationships, with counteracting trends for different handling qualities criteria and different flight speeds, are exhibited, with the action of the control system playing a complex part in the outcomes. Overall, the paper demonstrates how a broad array of technical issues across flight dynamics stability and control, simulation and modeling, control law design, and handling qualities testing and evaluation had to be confronted to implement even a moderately comprehensive handling qualities analysis of relatively low fidelity models. A key outstanding issue is how to 'close the loop' with an overall design process, and options for exploring how to feed handling qualities results back into a conceptual design process are proposed for future work.
Pathak, Jyotishman; Bailey, Kent R; Beebe, Calvin E; Bethard, Steven; Carrell, David S; Chen, Pei J; Dligach, Dmitriy; Endle, Cory M; Hart, Lacey A; Haug, Peter J; Huff, Stanley M; Kaggal, Vinod C; Li, Dingcheng; Liu, Hongfang; Marchant, Kyle; Masanz, James; Miller, Timothy; Oniki, Thomas A; Palmer, Martha; Peterson, Kevin J; Rea, Susan; Savova, Guergana K; Stancl, Craig R; Sohn, Sunghwan; Solbrig, Harold R; Suesse, Dale B; Tao, Cui; Taylor, David P; Westberg, Les; Wu, Stephen; Zhuo, Ning; Chute, Christopher G
2013-01-01
Research objective: To develop scalable informatics infrastructure for normalization of both structured and unstructured electronic health record (EHR) data into a unified, concept-based model for high-throughput phenotype extraction. Materials and methods: Software tools and applications were developed to extract information from EHRs. Representative and convenience samples of both structured and unstructured data from two EHR systems - Mayo Clinic and Intermountain Healthcare - were used for development and validation. Extracted information was standardized and normalized to meaningful use (MU) conformant terminology and value set standards using Clinical Element Models (CEMs). These resources were used to demonstrate semi-automatic execution of MU clinical-quality measures modeled using the Quality Data Model (QDM) and an open-source rules engine. Results: Using CEMs and open-source natural language processing and terminology services engines - namely, Apache clinical Text Analysis and Knowledge Extraction System (cTAKES) and Common Terminology Services (CTS2) - we developed a data-normalization platform that ensures data security, end-to-end connectivity, and reliable data flow within and across institutions. We demonstrated the applicability of this platform by executing a QDM-based MU quality measure that determines the percentage of patients between 18 and 75 years with diabetes whose most recent low-density lipoprotein cholesterol test result during the measurement year was <100 mg/dL on a randomly selected cohort of 273 Mayo Clinic patients. The platform identified 21 and 18 patients for the denominator and numerator of the quality measure, respectively. Validation results indicate that all identified patients meet the QDM-based criteria. Conclusions: End-to-end automated systems for extracting clinical information from diverse EHR systems require extensive use of standardized vocabularies and terminologies, as well as robust information models for storing, discovering, and processing that information. This study demonstrates the application of modular and open-source resources for enabling secondary use of EHR data through normalization into standards-based, comparable, and consistent format for high-throughput phenotyping to identify patient cohorts. PMID:24190931
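The denominator/numerator logic of the quality measure described above can be illustrated with a toy cohort; the record structure and values below are hypothetical, not the Clinical Element Model or QDM representation used by the platform.

# Share of diabetic patients aged 18-75 whose most recent LDL result is below 100 mg/dL.
patients = [
    {"age": 64, "diabetes": True,  "ldl_results": [("2012-03-01", 130), ("2012-11-15", 92)]},
    {"age": 71, "diabetes": True,  "ldl_results": [("2012-07-20", 118)]},
    {"age": 45, "diabetes": False, "ldl_results": [("2012-05-02", 95)]},
]

denominator = [p for p in patients
               if p["diabetes"] and 18 <= p["age"] <= 75 and p["ldl_results"]]
numerator = [p for p in denominator
             if max(p["ldl_results"])[1] < 100]     # max over ISO dates picks the most recent result

print(f"denominator={len(denominator)}, numerator={len(numerator)}, "
      f"rate={len(numerator)/len(denominator):.0%}")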
An analytical approach for predicting pilot induced oscillations
NASA Technical Reports Server (NTRS)
Hess, R. A.
1981-01-01
The optimal control model (OCM) of the human pilot is applied to the study of aircraft handling qualities. Attention is focused primarily on longitudinal tasks. The modeling technique differs from previous applications of the OCM in that considerable effort is expended in simplifying the pilot/vehicle analysis. After briefly reviewing the OCM, a technique for modeling the pilot controlling higher order systems is introduced. Following this, a simple criterion for determining the susceptibility of an aircraft to pilot induced oscillations (PIO) is formulated. Finally, a model-based metric for pilot rating prediction is discussed. The resulting modeling procedure provides a relatively simple, yet unified approach to the study of a variety of handling qualities problems.
NASA Technical Reports Server (NTRS)
Diamante, J. M.; Englar, T. S., Jr.; Jazwinski, A. H.
1977-01-01
Estimation theory, which originated in guidance and control research, is applied to the analysis of air quality measurements and atmospheric dispersion models to provide reliable area-wide air quality estimates. A method for low dimensional modeling (in terms of the estimation state vector) of the instantaneous and time-average pollutant distributions is discussed. In particular, the fluctuating plume model of Gifford (1959) is extended to provide an expression for the instantaneous concentration due to an elevated point source. Individual models are also developed for all parameters in the instantaneous and the time-average plume equations, including the stochastic properties of the instantaneous fluctuating plume.
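As background to the plume modeling discussed above, the standard time-averaged Gaussian plume concentration from an elevated point source (with ground reflection) can be evaluated in a few lines. The dispersion coefficients and parameter values below are simple illustrative assumptions, not Gifford's fluctuating-plume formulation used in the study.

import numpy as np

def plume_concentration(x, y, z, Q=1.0, u=4.0, H=50.0):
    # Concentration (g/m^3) at (x, y, z) metres downwind of a source emitting Q g/s in wind u m/s.
    sigma_y = 0.08 * x ** 0.9              # illustrative power-law dispersion coefficients
    sigma_z = 0.06 * x ** 0.9
    lateral = np.exp(-y ** 2 / (2 * sigma_y ** 2))
    vertical = (np.exp(-(z - H) ** 2 / (2 * sigma_z ** 2)) +
                np.exp(-(z + H) ** 2 / (2 * sigma_z ** 2)))   # image source for ground reflection
    return Q / (2 * np.pi * u * sigma_y * sigma_z) * lateral * vertical

print("ground-level centreline concentration at 1 km:", plume_concentration(1000.0, 0.0, 0.0))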
Air Quality Modeling of Traffic-related Air Pollutants for the NEXUS Study
The paper presents the results of the model applications to estimate exposure metrics in support of an epidemiologic study in Detroit, Michigan. A major challenge in traffic-related air pollution exposure studies is the lack of information regarding pollutant exposure characteriz...
Examining Air Quality-Meteorology Interactions on Regional to Hemispheric Scales
This presentation provides motivation for coupling the atmospheric dynamics and chemistry calculations in air pollution modeling systems, provides an overview of how this coupling is achieved in the WRF-CMAQ 2-way coupled model, presents results from various applications of the m...
Data-driven modeling of background and mine-related acidity and metals in river basins
Friedel, Michael J
2013-01-01
A novel application of self-organizing map (SOM) and multivariate statistical techniques is used to model the nonlinear interaction among basin mineral resources, mining activity, and surface-water quality. First, the SOM is trained using sparse measurements from 228 sample sites in the Animas River Basin, Colorado. The model performance is validated by comparing stochastic predictions of basin-alteration assemblages and mining activity at 104 independent sites. The SOM correctly predicts (>98%) the predominant type of basin hydrothermal alteration and the presence (or absence) of mining activity. Second, application of the Davies-Bouldin criterion to k-means clustering of SOM neurons identified ten unique environmental groups. Median statistics of these groups define a nonlinear water-quality response along the spatiotemporal hydrothermal alteration-mining gradient. These results reveal that it is possible to differentiate along the continuum between inputs of background and mine-related acidity and metals, and they provide a basis for future research and empirical model development.
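The cluster-number selection step described above, k-means over candidate k values with the Davies-Bouldin index as the criterion, looks roughly like the sketch below. Random vectors stand in for the trained SOM neuron weights; the grid size and number of variables are assumptions, not the study's configuration.

import numpy as np
from sklearn.cluster import KMeans
from sklearn.metrics import davies_bouldin_score

rng = np.random.default_rng(6)
neuron_weights = rng.normal(size=(144, 8))        # e.g. a 12x12 SOM codebook with 8 input variables

best_k, best_db, best_labels = None, np.inf, None
for k in range(2, 15):
    labels = KMeans(n_clusters=k, n_init=10, random_state=0).fit_predict(neuron_weights)
    db = davies_bouldin_score(neuron_weights, labels)   # lower is better
    if db < best_db:
        best_k, best_db, best_labels = k, db, labels

print(f"selected k={best_k} (Davies-Bouldin index {best_db:.2f})")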
Can Deming's Concept of Total Quality Management Be Applied to Education?
ERIC Educational Resources Information Center
Sevick, Charles
This paper explores the meaning of Total Quality Management (TQM), examines the development of the concept, and assesses the application of TQM to education. In summary, TQM has the following points of relevance for education: (1) The interest and welfare of every student must be a primary concern; (2) the authoritarian management model does not…
USDA-ARS?s Scientific Manuscript database
Due to the diminishing availability of good quality water for irrigation, it is increasingly important that irrigation and salinity management tools be able to target submaximal crop yields and support the use of marginal quality waters. In this work, we present a steady-state irrigated systems mod...
Family Quality of Life: Moving from Measurement to Application
ERIC Educational Resources Information Center
Zuna, Nina I.; Turnbull, Ann; Summers, Jean Ann
2009-01-01
Noting the absence of sound theoretical underpinnings for family quality of life (FQoL) research and work, the authors note that, to guide FQoL practice, research findings must be schematically organized so as to enable practitioners to implement empirical findings effectively. One way to meet this goal is to introduce a theoretical model that…
Within the next several years NOAA and EPA will begin to issue PM2.5 air quality forecasts over the entire domain of the eastern United States, eventually extending to national coverage. These forecasts will provide continuous estimated values of particulate matter on ...
Multi-Resolution Unstructured Grid-Generation for Geophysical Applications on the Sphere
NASA Technical Reports Server (NTRS)
Engwirda, Darren
2015-01-01
An algorithm for the generation of non-uniform unstructured grids on ellipsoidal geometries is described. This technique is designed to generate high quality triangular and polygonal meshes appropriate for general circulation modelling on the sphere, including applications to atmospheric and ocean simulation and numerical weather prediction. Using a recently developed Frontal-Delaunay-refinement technique, a method for the construction of high-quality unstructured ellipsoidal Delaunay triangulations is introduced. A dual polygonal grid, derived from the associated Voronoi diagram, is also optionally generated as a by-product. Compared to existing techniques, it is shown that the Frontal-Delaunay approach typically produces grids with near-optimal element quality and smooth grading characteristics, while imposing relatively low computational expense. Initial results are presented for a selection of uniform and non-uniform ellipsoidal grids appropriate for large-scale geophysical applications. The use of user-defined mesh-sizing functions to generate smoothly graded, non-uniform grids is discussed.
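The dual polygonal (Voronoi) grid idea mentioned above can be previewed quickly with SciPy's spherical Voronoi routine; this is not the Frontal-Delaunay-refinement algorithm of the paper, just an illustration of deriving polygonal cells from a set of generating points on the unit sphere.

import numpy as np
from scipy.spatial import SphericalVoronoi

rng = np.random.default_rng(7)
pts = rng.normal(size=(500, 3))
pts /= np.linalg.norm(pts, axis=1, keepdims=True)        # project random points onto the unit sphere

sv = SphericalVoronoi(pts, radius=1.0, center=np.zeros(3))
sv.sort_vertices_of_regions()                            # order each cell's vertices for plotting/area
print("cells:", len(sv.regions),
      " average vertices per cell:", round(sum(len(r) for r in sv.regions) / len(sv.regions), 2))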
AQUATOX Model Validation Reports
AQUATOX has a myriad of potential applications to water management issues and programs, including water quality criteria and standards, TMDLs (Total Maximum Daily Loads), and ecological risk assessments of aquatic systems.
Turrini, Enrico; Carnevale, Claudio; Finzi, Giovanna; Volta, Marialuisa
2018-04-15
This paper introduces the MAQ (Multi-dimensional Air Quality) model, aimed at defining cost-effective air quality plans at different scales (urban to national) and assessing the co-benefits for GHG emissions. The model implements and solves a non-linear multi-objective, multi-pollutant decision problem where the decision variables are the application levels of emission abatement measures, comprising measures that reduce energy consumption, end-of-pipe technologies, and fuel switch options. The objectives of the decision problem are the minimization of tropospheric secondary pollution exposure and of internal costs. The model assesses CO2-equivalent emissions in order to support decision makers in the selection of win-win policies. The methodology is tested on the Lombardy region, a heavily polluted area in northern Italy. Copyright © 2017 Elsevier B.V. All rights reserved.
Demirci, Müşerref Duygu Saçar; Allmer, Jens
2017-07-28
MicroRNAs (miRNAs) are involved in the post-transcriptional regulation of protein abundance and thus have a great impact on the resulting phenotype. It is, therefore, no wonder that they have been implicated in many diseases ranging from virus infections to cancer. This impact on the phenotype leads to a great interest in establishing the miRNAs of an organism. Experimental methods are complicated, which has led to the development of computational methods for pre-miRNA detection. Such methods generally employ machine learning to establish models for the discrimination between miRNAs and other sequences. Positive training data for model establishment, for the most part, stem from miRBase, the miRNA registry. The quality of the entries in miRBase has been questioned, though. This unknown quality led to the development of filtering strategies in attempts to produce high quality positive datasets, which can lead to a scarcity of positive data. To analyze the quality of filtered data, we developed a machine learning model and found that it is well able to establish data quality based on intrinsic measures. Additionally, we analyzed which features describing pre-miRNAs can discriminate between low and high quality data. Both models are applicable to data from miRBase and can be used for establishing high quality positive data. This will facilitate the development of better miRNA detection tools, which will make the prediction of miRNAs in disease states more accurate. Finally, we applied both models to all miRBase data and provide the list of high quality hairpins.
Chen, Y; Mao, J; Lin, J; Yu, H; Peters, S; Shebley, M
2016-01-01
This subteam under the Drug Metabolism Leadership Group (Innovation and Quality Consortium) investigated the quantitative role of circulating inhibitory metabolites in drug–drug interactions using physiologically based pharmacokinetic (PBPK) modeling. Three drugs with major circulating inhibitory metabolites (amiodarone, gemfibrozil, and sertraline) were systematically evaluated in addition to the literature review of recent examples. The application of PBPK modeling in drug interactions by inhibitory parent–metabolite pairs is described and guidance on strategic application is provided. PMID:27642087
Schwartz, Carolyn E; Rapkin, Bruce D
2004-01-01
The increasing evidence for response shift phenomena in quality of life (QOL) assessment points to the necessity to reconsider both the measurement model and the application of psychometric analyses. The proposed psychometric model posits that the QOL true score is always contingent upon parameters of the appraisal process. This new model calls into question existing methods for establishing the reliability and validity of QOL assessment tools and suggests several new approaches for describing the psychometric properties of these scales. Recommendations for integrating the assessment of appraisal into QOL research and clinical practice are discussed. PMID:15038830
Fault recovery in the reliable multicast protocol
NASA Technical Reports Server (NTRS)
Callahan, John R.; Montgomery, Todd L.; Whetten, Brian
1995-01-01
The Reliable Multicast Protocol (RMP) provides a unique, group-based model for distributed programs that need to handle reconfiguration events at the application layer. This model, called membership views, provides an abstraction in which events such as site failures, network partitions, and normal join-leave events are viewed as group reformations. RMP provides access to this model through an application programming interface (API) that notifies an application when a group is reformed as the result of some event. RMP provides applications with reliable delivery of messages using an underlying IP Multicast (12, 5) media to other group members in a distributed environment, even in the case of reformations. A distributed application can use various Quality of Service (QoS) levels provided by RMP to tolerate group reformations. This paper explores the implementation details of the mechanisms in RMP that provide distributed applications with membership view information and fault recovery capabilities.
Multi-Sensory Aerosol Data and the NRL NAAPS model for Regulatory Exceptional Event Analysis
NASA Astrophysics Data System (ADS)
Husar, R. B.; Hoijarvi, K.; Westphal, D. L.; Haynes, J.; Omar, A. H.; Frank, N. H.
2013-12-01
Beyond scientific exploration and analysis, multi-sensory observations along with models are finding increasing applications in operational air quality management. EPA's Exceptional Event (EE) Rule allows the exclusion of data strongly influenced by impacts from "exceptional events," such as smoke from wildfires or dust from abnormally high winds. The EE Rule encourages the use of satellite observations and other non-standard data along with models as evidence for formal documentation of EE samples for exclusion. Thus, the implementation of the EE Rule is uniquely suited to the direct application of integrated multi-sensory observations and, indirectly, to their assimilation into an aerosol simulation model. Here we report the results of a project, NASA and NAAPS Products for Air Quality Decision Making. The project uses observations from multiple satellite sensors, surface-based aerosol measurements, and the NRL Aerosol Analysis and Prediction System (NAAPS) model, which assimilates key satellite observations. The satellite sensor data for detecting and documenting smoke and dust events include MODIS AOD and images; OMI Aerosol Index and tropospheric NO2; and AIRS CO. The surface observations include the EPA regulatory PM2.5 network, the IMPROVE/STN aerosol chemical network, the AIRNOW PM2.5 mass network, and surface meteorological data. Within this application, a crucial role is assigned to the NAAPS model for estimating the surface concentration of windblown dust and biomass smoke. The operational model assimilates quality-assured daily MODIS data using 2DVAR to adjust the model concentrations, and CALIOP-based climatology to adjust the vertical profiles at 6-hour intervals. The assimilation of satellite data from multiple satellites significantly contributes to the usefulness of NAAPS for EE analysis. The NAAPS smoke and dust simulations were evaluated using the IMPROVE/STN chemical data. The multi-sensory observations along with the model simulations are integrated into a web-based Exceptional Event Decision System (EE DSS) application program, designed to support air quality analysts at the federal and regional EPA offices and the EE-affected states. The EE DSS screening tool automatically identifies the EPA PM2.5 mass samples that are candidates for EE flagging, based mainly on the NAAPS-simulated surface concentration of dust and smoke. The AQ analysts at the states and the EPA can also use the EE DSS to gather further evidence from the examination of spatio-temporal patterns, the Absorbing Aerosol Index, CO and NO2 concentrations, backward and forward airmass trajectories, and other signatures. Since early 2013, the DSS has been used for the identification and analysis of dozens of events. Hence, integration of multi-sensory observations and modeling with data assimilation is maturing to support real-world operational AQ management applications. The remaining challenges can be resolved by seeking 'closure' of the system components, i.e. the systematic adjustments needed to reconcile the satellite and surface observations, the emissions, and their integration through a suitable AQ model.
Improving Factor Score Estimation Through the Use of Observed Background Characteristics
Curran, Patrick J.; Cole, Veronica; Bauer, Daniel J.; Hussong, Andrea M.; Gottfredson, Nisha
2016-01-01
A challenge facing nearly all studies in the psychological sciences is how to best combine multiple items into a valid and reliable score to be used in subsequent modelling. The most ubiquitous method is to compute a mean of items, but more contemporary approaches use various forms of latent score estimation. Regardless of approach, outside of large-scale testing applications, scoring models rarely include background characteristics to improve score quality. The current paper used a Monte Carlo simulation design to study score quality for different psychometric models that did and did not include covariates across levels of sample size, number of items, and degree of measurement invariance. The inclusion of covariates improved score quality for nearly all design factors, and in no case did the covariates degrade score quality relative to not considering the influences at all. Results suggest that the inclusion of observed covariates can improve factor score estimation. PMID:28757790
NASA Astrophysics Data System (ADS)
Tiebin, Wu; Yunlian, Liu; Xinjun, Li; Yi, Yu; Bin, Zhang
2018-06-01
To address the difficulty of quality prediction for sintered ores, a hybrid prediction model is established that combines mechanism models of sintering with time-weighted error compensation based on the extreme learning machine (ELM). First, mechanism models of the drum index, total iron, and alkalinity are constructed according to the chemical reaction mechanisms and conservation of matter in the sintering process. Because the process is simplified in the mechanism models, they cannot describe the strong nonlinearity, and errors are therefore inevitable. For this reason, a time-weighted ELM-based error compensation model is established. Simulation results verify that the hybrid model has high accuracy and can meet the requirements of industrial applications.
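The hybrid structure described above, a mechanism model plus a data-driven correction of its error, can be sketched with a minimal ELM (random hidden layer, least-squares output weights) and a simple time-weighted bias term. The data, features, and weighting scheme below are illustrative assumptions, not the paper's formulation.

import numpy as np

def train_elm(X, y, n_hidden=50, seed=8):
    rng = np.random.default_rng(seed)
    W = rng.normal(size=(X.shape[1], n_hidden))
    b = rng.normal(size=n_hidden)
    H = np.tanh(X @ W + b)                               # random hidden-layer features
    beta, *_ = np.linalg.lstsq(H, y, rcond=None)         # least-squares output weights
    return W, b, beta

def predict_elm(model, X):
    W, b, beta = model
    return np.tanh(X @ W + b) @ beta

rng = np.random.default_rng(9)
X = rng.normal(size=(300, 6))                            # stand-in sintering process variables
y_mech = 0.5 * X[:, 0] + 1.0                             # stand-in mechanism-model output
y_true = y_mech + 0.3 * np.sin(X[:, 1])                  # "true" quality index with unmodelled part

model = train_elm(X, y_true - y_mech)                    # learn the mechanism-model error
weights = 0.9 ** np.arange(len(X))[::-1]                 # heavier weight on recent samples
recent_bias = np.average(y_true - y_mech - predict_elm(model, X), weights=weights)
y_hybrid = y_mech + predict_elm(model, X) + recent_bias  # mechanism output plus compensated error
print("hybrid RMSE:", round(float(np.sqrt(np.mean((y_hybrid - y_true) ** 2))), 4))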
Gundersen, Kenneth; Kvaløy, Jan Terje; Eftestøl, Trygve; Kramer-Johansen, Jo
2015-10-15
For patients undergoing cardiopulmonary resuscitation (CPR) and being in a shockable rhythm, the coarseness of the electrocardiogram (ECG) signal is an indicator of the state of the patient. In the current work, we show how mixed effects stochastic differential equations (SDE) models, commonly used in pharmacokinetic and pharmacodynamic modelling, can be used to model the relationship between CPR quality measurements and ECG coarseness. This is a novel application of mixed effects SDE models to a setting quite different from previous applications of such models and where using such models nicely solves many of the challenges involved in analysing the available data. Copyright © 2015 John Wiley & Sons, Ltd.
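The forward part of such a model, an SDE whose drift parameter varies randomly between episodes, can be simulated with Euler-Maruyama in a few lines. This is only an illustrative Ornstein-Uhlenbeck stand-in for the latent "coarseness" state with made-up parameters; it is not the mixed-effects estimation procedure of the cited work.

import numpy as np

def simulate_episode(theta, mu=1.0, sigma=0.3, x0=0.2, dt=0.01, n=500, rng=None):
    # Euler-Maruyama for dX = theta*(mu - X) dt + sigma dW.
    if rng is None:
        rng = np.random.default_rng()
    x = np.empty(n)
    x[0] = x0
    for t in range(1, n):
        drift = theta * (mu - x[t - 1])                  # pull toward the episode mean
        x[t] = x[t - 1] + drift * dt + sigma * np.sqrt(dt) * rng.standard_normal()
    return x

rng = np.random.default_rng(10)
population_theta, between_episode_sd = 0.8, 0.2          # fixed effect and random-effect spread
episodes = [simulate_episode(max(0.05, rng.normal(population_theta, between_episode_sd)), rng=rng)
            for _ in range(5)]
print("episode means:", [round(e.mean(), 2) for e in episodes])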
Gong, Xing-Chu; Chen, Teng; Qu, Hai-Bin
2017-03-01
Quality by design (QbD) concept is an advanced pharmaceutical quality control concept. The application of QbD concept in the research and development of pharmaceutical processes of traditional Chinese medicines (TCM) mainly contains five parts, including the definition of critical processes and their evaluation criteria, the determination of critical process parameters and critical material attributes, the establishment of quantitative models, the development of design space, as well as the application and continuous improvement of control strategy. In this work, recent research advances in QbD concept implementation methods in the secondary development of Chinese patent medicines were reviewed, and five promising fields of the implementation of QbD concept were pointed out, including the research and development of TCM new drugs and Chinese medicine granules for formulation, modeling of pharmaceutical processes, development of control strategy based on industrial big data, strengthening the research of process amplification rules, and the development of new pharmaceutical equipment.. Copyright© by the Chinese Pharmaceutical Association.
Zhang, Ying-Ying; Zhou, Xiao-Bin; Wang, Qiu-Zhen; Zhu, Xiao-Yan
2017-05-01
Multivariable logistic regression (MLR) has been increasingly used in Chinese clinical medical research during the past few years. However, few evaluations of the quality of the reporting strategies in these studies are available. The aim of this study was to evaluate the reporting quality and model accuracy of MLR used in published work, and to offer related advice for authors, readers, reviewers, and editors. A total of 316 articles published in 5 leading Chinese clinical medical journals with high impact factors from January 2010 to July 2015 were selected for evaluation. Articles were evaluated according to 12 established criteria for proper use and reporting of MLR models. Among the articles, the highest quality score was 9, the lowest 1, and the median 5 (4-5). A total of 85.1% of the articles scored below 6. No significant differences were found among these journals with respect to quality score (χ2 = 6.706, P = .15). More than 50% of the articles met the following 5 criteria: complete identification of the statistical software application that was used (97.2%), calculation of the odds ratio and its confidence interval (86.4%), description of sufficient events (>10) per variable, selection of variables, and fitting procedure (78.2%, 69.3%, and 58.5%, respectively). Less than 35% of the articles reported the coding of variables (18.7%). The remaining 5 criteria were not satisfied by a sufficient number of articles: goodness-of-fit (10.1%), interactions (3.8%), checking for outliers (3.2%), collinearity (1.9%), and participation of statisticians and epidemiologists (0.3%). The criterion of conformity with linear gradients was applicable to 186 articles; however, only 7 (3.8%) mentioned or tested it. The reporting quality and model accuracy of MLR in the selected articles were not satisfactory; in fact, severe deficiencies were noted, and only 1 article scored 9. We recommend that authors, readers, reviewers, and editors consider MLR models more carefully and cooperate more closely with statisticians and epidemiologists. Journals should develop statistical reporting guidelines concerning MLR.
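Two of the reporting checks referred to above can be expressed directly in code: the events-per-variable rule of thumb, and an odds ratio with its 95% confidence interval derived from a fitted coefficient. The numbers below are illustrative, not values from the evaluated articles.

import math

def events_per_variable(n_events, n_candidate_predictors):
    # The ">10 events per variable" rule of thumb used as one of the 12 criteria.
    return n_events / n_candidate_predictors

def odds_ratio_ci(beta, se, z=1.96):
    # Exponentiate the coefficient and its Wald confidence limits.
    return math.exp(beta), (math.exp(beta - z * se), math.exp(beta + z * se))

print("EPV:", events_per_variable(n_events=84, n_candidate_predictors=7))
or_, ci = odds_ratio_ci(beta=0.62, se=0.21)
print(f"OR = {or_:.2f}, 95% CI ({ci[0]:.2f}, {ci[1]:.2f})")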
Big data analytics to improve cardiovascular care: promise and challenges.
Rumsfeld, John S; Joynt, Karen E; Maddox, Thomas M
2016-06-01
The potential for big data analytics to improve cardiovascular quality of care and patient outcomes is tremendous. However, the application of big data in health care is at a nascent stage, and the evidence to date demonstrating that big data analytics will improve care and outcomes is scant. This Review provides an overview of the data sources and methods that comprise big data analytics, and describes eight areas of application of big data analytics to improve cardiovascular care, including predictive modelling for risk and resource use, population management, drug and medical device safety surveillance, disease and treatment heterogeneity, precision medicine and clinical decision support, quality of care and performance measurement, and public health and research applications. We also delineate the important challenges for big data applications in cardiovascular care, including the need for evidence of effectiveness and safety, the methodological issues such as data quality and validation, and the critical importance of clinical integration and proof of clinical utility. If big data analytics are shown to improve quality of care and patient outcomes, and can be successfully implemented in cardiovascular practice, big data will fulfil its potential as an important component of a learning health-care system.
Jin, Haoyi; Yu, Yanqiu
2016-10-01
High-quality preclinical bioassay models are essential for drug research and development. We reviewed the emerging body-on-a-chip technology, which serves as a promising model to overcome the limitations of traditional bioassay models, and introduced existing models of body-on-a-chip, their constitutional details, application for drug testing, and individual features of these models. We put special emphasis on the latest trend in this field of incorporating barrier tissue into body-on-a-chip and discussed several remaining challenges of current body-on-a-chip. © 2015 Society for Laboratory Automation and Screening.
Cook, David J; Thompson, Jeffrey E; Suri, Rakesh; Prinsen, Sharon K
2014-01-01
The absence of standardization in the surgical care process, exemplified in a "solution shop" model, can lead to unwarranted variation, increased cost, and reduced quality. A comprehensive effort was undertaken to improve quality of care around indwelling bladder catheter use following surgery by creating a "focused factory" model within the cardiac surgical practice. Baseline compliance with Surgical Care Improvement Inf-9, removal of the urinary catheter by the end of surgical postoperative day 2, was determined. Comparison of baseline data with postintervention results showed clinically important reductions in the duration of indwelling bladder catheters as well as a marked reduction in practice variation. Following the intervention, Surgical Care Improvement Inf-9 guidelines were met in 97% of patients. Although the clinical quality improvement was notable, the process used to accomplish it (identification of patients suitable for standardized pathways, protocol application, and electronic systems to support the standardized practice model) has potentially greater relevance than the specific clinical results. © 2013 by the American College of Medical Quality.
Habitat Suitability Index Models: Black-shouldered kite
Faanes, Craig A.; Howard, Rebecca J.
1987-01-01
A review and synthesis of existing information were used to develop a model for evaluating black-shouldered kite habitat quality. The model is scaled to produce an index between 0 (unsuitable habitat) and 1.0 (optimal habitat). Habitat suitability index models are designed for use with the Habitat Evaluation Procedures previously developed by the U.S. Fish and Wildlife Service. Guidelines for model application are provided.
Towards A Complete Model Of Photopic Visual Threshold Performance
NASA Astrophysics Data System (ADS)
Overington, I.
1982-02-01
Based on a wide variety of fragmentary evidence taken from psycho-physics, neurophysiology and electron microscopy, it has been possible to put together a very widely applicable conceptual model of photopic visual threshold performance. Such a model is so complex that a single comprehensive mathematical version is excessively cumbersome. It is, however, possible to set up a suite of related mathematical models, each of limited application but strictly known envelope of usage. Such models may be used for assessment of a variety of facets of visual performance when using display imagery, including effects and interactions of image quality, random and discrete display noise, viewing distance, image motion, etc., both for foveal interrogation tasks and for visual search tasks. The specific model may be selected from the suite according to the assessment task in hand. The paper discusses in some depth the major facets of preperceptual visual processing and their interaction with instrumental image quality and noise. It then highlights the statistical nature of visual performance before going on to consider a number of specific mathematical models of partial visual function. Where appropriate, these are compared with widely popular empirical models of visual function.
An expert system for water quality modelling.
Booty, W G; Lam, D C; Bobba, A G; Wong, I; Kay, D; Kerby, J P; Bowen, G S
1992-12-01
The RAISON-micro (Regional Analysis by Intelligent System ON a micro-computer) expert system is being used to predict the effects of mine effluents on receiving waters in Ontario. The potential of this system to assist regulatory agencies and mining industries to define more acceptable effluent limits was shown in an initial study. This system has been further developed so that the expert system helps the model user choose the most appropriate model for a particular application from a hierarchy of models. The system currently contains seven models which range from steady state to time dependent models, for both conservative and nonconservative substances in rivers and lakes. The menu driven expert system prompts the model user for information such as the nature of the receiving water system, the type of effluent being considered, and the range of background data available for use as input to the models. The system can also be used to determine the nature of the environmental conditions at the site which are not available in the textual information database, such as the components of river flow. Applications of the water quality expert system are presented for representative mine sites in the Timmins area of Ontario.
Adapting water treatment design and operations to the impacts of global climate change
NASA Astrophysics Data System (ADS)
Clark, Robert M.; Li, Zhiwei; Buchberger, Steven G.
2011-12-01
It is anticipated that global climate change will adversely impact source water quality in many areas of the United States and will therefore, potentially, impact the design and operation of current and future water treatment systems. The USEPA has initiated an effort called the Water Resources Adaptation Program (WRAP), which is intended to develop tools and techniques that can assess the impact of global climate change on urban drinking water and wastewater infrastructure. A three-step approach for assessing climate change impacts on water treatment operation and design is being pursued in this effort. The first step is the stochastic characterization of source water quality, the second step is the application of the USEPA Water Treatment Plant (WTP) model, and the third step is the application of cost algorithms to provide a metric that can be used to assess the cost impact of climate change. A model has been validated using data collected from Cincinnati's Richard Miller Water Treatment Plant for the USEPA Information Collection Rule (ICR) database. An analysis of the water treatment processes in response to assumed perturbations in raw water quality identified TOC, pH, and bromide as the three most important parameters affecting performance of the Miller WTP. The Miller Plant was simulated using the EPA WTP model to examine the impact of these parameters on selected regulated water quality parameters. Uncertainty in influent water quality was analyzed to estimate the risk of violating drinking water maximum contaminant levels (MCLs). Water quality changes in the Ohio River were projected for 2050 using Monte Carlo simulation, and the WTP model was used to evaluate the effects of water quality changes on design and operation. Results indicate that the existing Miller WTP might not meet Safe Drinking Water Act MCL requirements under certain extreme future conditions. However, it was found that the risk of MCL violations under future conditions could be controlled by enhancing existing WTP design and operation or by process retrofitting and modification.
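The third step described above, translating influent uncertainty into a risk of MCL violation, can be illustrated with a simple Monte Carlo sketch. The distributions and the linear treatment-response function below are placeholders standing in for the stochastic source-water characterization and the EPA WTP model; only the 80 ug/L TTHM limit is a real regulatory value.

```python
import numpy as np

rng = np.random.default_rng(42)
n_draws = 100_000

# Step 1: stochastic characterization of source water quality (illustrative distributions).
toc = rng.lognormal(mean=np.log(2.5), sigma=0.3, size=n_draws)      # mg/L
bromide = rng.lognormal(mean=np.log(0.05), sigma=0.4, size=n_draws) # mg/L
ph = rng.normal(7.8, 0.2, size=n_draws)

# Step 2: a placeholder treatment-response function standing in for the EPA WTP model,
# predicting total trihalomethanes (TTHM) in the finished water.
tthm = 12.0 * toc + 250.0 * bromide + 4.0 * (ph - 7.0)  # ug/L, illustrative only

# Step 3: risk metric - probability of violating the TTHM maximum contaminant level.
mcl = 80.0  # ug/L
print(f"P(TTHM > {mcl} ug/L) = {np.mean(tthm > mcl):.3f}")
```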
Weykamp, Cas; Siebelder, Carla
2017-11-01
HbA1c is a key parameter in diabetes management. For years the test has been used exclusively for monitoring long-term diabetic control. However, owing to improved analytical performance, HbA1c is increasingly being considered for diagnosis and screening. With this new application, quality demands increase further. A task force of the International Federation of Clinical Chemistry and Laboratory Medicine developed a model to set and evaluate quality targets for HbA1c. The model is based on the concept of total error and takes into account the major sources of analytical error in the medical laboratory: bias and imprecision. Performance criteria are derived from sigma-metrics and biological variation. This review shows 2 examples of the application of the model: at the level of single laboratories, and at the level of a group of laboratories. In the first example, data from 125 individual laboratories in a recent external quality assessment program in the Netherlands are evaluated. Differences between laboratories as well as their relation to method principles are shown. The second example uses recent and 3-year-old data from the proficiency test of the College of American Pathologists. The differences in performance between 26 manufacturer-related groups of laboratories are shown. Over time these differences are quite consistent, although some manufacturers improved substantially, either by better standardization or by replacing a test. The IFCC model serves all who are involved in HbA1c testing in the ongoing process of better performance and better patient care.
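The total-error and sigma-metric arithmetic underlying such a model can be summarized in a few lines. The sketch below uses the conventional formulas TE = |bias| + 1.96 x CV and sigma = (TEa - |bias|) / CV; the bias, CV, and allowable-total-error figures are illustrative and not taken from the EQA programs discussed in the review.

```python
def total_error(bias_pct, cv_pct, z=1.96):
    """Total analytical error at ~95% coverage: TE = |bias| + z * CV."""
    return abs(bias_pct) + z * cv_pct

def sigma_metric(tea_pct, bias_pct, cv_pct):
    """Sigma metric: how many SDs of imprecision fit inside the allowable total error."""
    return (tea_pct - abs(bias_pct)) / cv_pct

# Illustrative laboratory performance figures (not taken from the review).
bias, cv = 1.5, 2.0          # percent
tea = 6.0                    # allowable total error target, percent (assumed quality goal)
print(f"TE    = {total_error(bias, cv):.1f} %")
print(f"sigma = {sigma_metric(tea, bias, cv):.1f}")
```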
Grey fuzzy optimization model for water quality management of a river system
NASA Astrophysics Data System (ADS)
Karmakar, Subhankar; Mujumdar, P. P.
2006-07-01
A grey fuzzy optimization model is developed for water quality management of a river system to address the uncertainty involved in fixing the membership functions for the different goals of the Pollution Control Agency (PCA) and the dischargers. The present model, the Grey Fuzzy Waste Load Allocation Model (GFWLAM), has the capability to incorporate the conflicting goals of the PCA and the dischargers in a deterministic framework. The imprecision associated with specifying the water quality criteria and fractional removal levels is modeled in a fuzzy mathematical framework. To address the imprecision in fixing the lower and upper bounds of the membership functions, the membership functions themselves are treated as fuzzy in the model and the membership parameters are expressed as interval grey numbers, closed and bounded intervals with known lower and upper bounds but unknown distribution information. The model provides flexibility for the PCA and dischargers to specify their aspirations independently, as the membership parameters for the different membership functions, specified for different imprecise goals, are interval grey numbers in place of deterministic real numbers. In the final solution, optimal fractional removal levels of the pollutants are obtained in the form of interval grey numbers. This enhances flexibility and applicability in decision-making, as the decision-maker obtains a range of optimal solutions for fixing the final decision scheme, considering the technical and economic feasibility of the pollutant treatment levels. Application of the GFWLAM is illustrated with a case study of the Tunga-Bhadra river system in India.
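To illustrate the interval grey numbers in which GFWLAM expresses its optimal fractional removal levels, the short sketch below implements elementary interval arithmetic: bounds are known, the internal distribution is not, and interval width conveys the decision-maker's flexibility. This is generic interval arithmetic for illustration, not the optimization model itself.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class GreyNumber:
    """Interval grey number [lo, hi]: bounds known, internal distribution unknown."""
    lo: float
    hi: float

    def __add__(self, other):
        return GreyNumber(self.lo + other.lo, self.hi + other.hi)

    def __mul__(self, k: float):            # scaling by a crisp (deterministic) number
        a, b = self.lo * k, self.hi * k
        return GreyNumber(min(a, b), max(a, b))

    def width(self):
        return self.hi - self.lo            # wider interval = greater flexibility/uncertainty

# Optimal fractional removal levels for two dischargers, expressed as grey numbers.
removal_d1 = GreyNumber(0.55, 0.70)
removal_d2 = GreyNumber(0.60, 0.80)
print("combined treatment burden (illustrative):", removal_d1 + removal_d2)
print("decision flexibility for discharger 2:", removal_d2.width())
```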
Application of BIM technology in green scientific research office building
NASA Astrophysics Data System (ADS)
Ni, Xin; Sun, Jianhua; Wang, Bo
2017-05-01
BIM technology, as a form of information technology, has gradually been applied in the domestic building industry along with the advancement of building industrialization. Based on a reasonably constructed BIM model and a BIM technology platform, collaborative design tools can effectively improve design efficiency and design quality. Taking the scientific research office building project of Vanda Northwest Engineering Design and Research Institute Co., Ltd. as a case, BIM technology was applied in combination with the practical conditions of the project: a building energy model (BEM) was formed from the BIM model and related information, the application of BIM technology in the construction management stage was explored, and the direct experience and achievements gained in the architectural design stage were summarized.
Automated workflows for data curation and standardization of chemical structures for QSAR modeling
Large collections of chemical structures and associated experimental data are publicly available, and can be used to build robust QSAR models for applications in different fields. One common concern is the quality of both the chemical structure information and associated experime...
Nutrient Dynamics in Flooded Wetlands. II: Model Application
In this paper we applied and evaluated the wetland nutrient model described in an earlier paper. Hydrologic and water quality data from a small restored wetland located on Kent Island, Maryland, which is part of the Delmarva Peninsula on the Eastern shores of the Chesapeake Bay...
The Utility of the OMI HCHO/NO2 in Air Quality Decision-Making Activities
NASA Technical Reports Server (NTRS)
Duncan, Bryan
2010-01-01
I will discuss a novel and practical application of the OMI HCHO and NO2 data products to the "weight of evidence" in the air quality decision-making process (e.g., State Implementation Plan (SIP)) for a city, region, or state to demonstrate that it is making progress toward attainment of the National Ambient Air Quality Standard (NAAQS) for ozone. Any trend, or lack thereof, in the observed OMI HCHO/NO2 may support a determination of whether an emission control strategy implemented to reduce ozone is or is not taking effect in a metropolitan area. In addition, the observed OMI HCHO/NO2 may be used to define new emission control strategies as the photochemical environments of urban areas evolve over time. I will demonstrate the utility of the OMI HCHO/NO2 over the U.S. for air quality applications with support from simulations with both a regional model and a photochemical box model. These results support mission planning of an OMI-like instrument for the proposed GEO-CAPE satellite that has as one of its objectives to study air quality from space. However, I'm attending the meeting as the Aura Deputy Project Scientist, so I don't technically need to present anything to justify the travel.
2010-01-01
Background The measurement of healthcare provider performance is becoming more widespread. Physicians have been guarded about performance measurement, in part because the methodology for comparative measurement of care quality is underdeveloped. Comprehensive quality improvement will require comprehensive measurement, implying the aggregation of multiple quality metrics into composite indicators. Objective To present a conceptual framework to develop comprehensive, robust, and transparent composite indicators of pediatric care quality, and to highlight aspects specific to quality measurement in children. Methods We reviewed the scientific literature on composite indicator development, health systems, and quality measurement in the pediatric healthcare setting. Frameworks were selected for explicitness and applicability to a hospital-based measurement system. Results We synthesized various frameworks into a comprehensive model for the development of composite indicators of quality of care. Among its key premises, the model proposes identifying structural, process, and outcome metrics for each of the Institute of Medicine's six domains of quality (safety, effectiveness, efficiency, patient-centeredness, timeliness, and equity) and presents a step-by-step framework for embedding the quality of care measurement model into composite indicator development. Conclusions The framework presented offers researchers an explicit path to composite indicator development. Without a scientifically robust and comprehensive approach to measurement of the quality of healthcare, performance measurement will ultimately fail to achieve its quality improvement goals. PMID:20181129
“Fine-Scale Application of the coupled WRF-CMAQ System to ...
The DISCOVER-AQ project (Deriving Information on Surface conditions from Column and Vertically Resolved Observations Relevant to Air Quality), is a joint collaboration between NASA, U.S. EPA and a number of other local organizations with the goal of characterizing air quality in urban areas using satellite, aircraft, vertical profiler and ground based measurements (http://discover-aq.larc.nasa.gov). In July 2011, the DISCOVER-AQ project conducted intensive air quality measurements in the Baltimore, MD and Washington, D.C. area in the eastern U.S. To take advantage of these unique data, the Community Multiscale Air Quality (CMAQ) model, coupled with the Weather Research and Forecasting (WRF) model is used to simulate the meteorology and air quality in the same region using 12-km, 4-km and 1-km horizontal grid spacings. The goal of the modeling exercise is to demonstrate the capability of the coupled WRF-CMAQ modeling system to simulate air quality at fine grid spacings in an urban area. Development of new data assimilation techniques and the use of higher resolution input data for the WRF model have been implemented to improve the meteorological results, particularly at the 4-km and 1-km grid resolutions. In addition, a number of updates to the CMAQ model were made to enhance the capability of the modeling system to accurately represent the magnitude and spatial distribution of pollutants at fine model resolutions. Data collected during the 2011 DISCOVER-AQ campa
“Application and evaluation of the two-way coupled WRF ...
The DISCOVER-AQ project (Deriving Information on Surface conditions from Column and Vertically Resolved Observations Relevant to Air Quality), is a joint collaboration between NASA, U.S. EPA and a number of other local organizations with the goal of characterizing air quality in urban areas using satellite, aircraft, vertical profiler and ground based measurements (http://discover-aq.larc.nasa.gov). In July 2011, the DISCOVER-AQ project conducted intensive air quality measurements in the Baltimore, MD and Washington, D.C. area in the eastern U.S. To take advantage of these unique data, the Community Multiscale Air Quality (CMAQ) model, coupled with the Weather Research and Forecasting (WRF) model is used to simulate the meteorology and air quality in the same region using 12-km, 4-km and 1-km horizontal grid spacings. The goal of the modeling exercise is to demonstrate the capability of the coupled WRF-CMAQ modeling system to simulate air quality at fine grid spacings in an urban area. Development of new data assimilation techniques and the use of higher resolution input data for the WRF model have been implemented to improve the meteorological results, particularly at the 4-km and 1-km grid resolutions. In addition, a number of updates to the CMAQ model were made to enhance the capability of the modeling system to accurately represent the magnitude and spatial distribution of pollutants at fine model resolutions. Data collected during the 2011 DISCOVER-AQ campa
Baron, Ronan; Saffell, John
2017-11-22
This review examines the use of amperometric electrochemical gas sensors for monitoring inorganic gases that affect urban air quality. First, we consider amperometric gas sensor technology including its development toward specifically designed air quality sensors. We then review recent academic and research organizations' studies where this technology has been trialed for air quality monitoring applications: early studies showed the potential of electrochemical gas sensors when colocated with reference Air Quality Monitoring (AQM) stations. Spatially dense networks with fast temporal resolution provide information not available from sparse AQMs with longer recording intervals. We review how this technology is being offered as commercial urban air quality networks and consider the remaining challenges. Sensors must be sensitive, selective, and stable; air quality monitors/nodes must be electronically and mechanically well designed. Data correction is required and models with differing levels of sophistication are being designed. Data analysis and validation is possibly the biggest remaining hurdle needed to deliver reliable concentration readings. Finally, this review also considers the roles of companies, urban infrastructure requirements, and public research in the development of this technology.
A real time quality control application for animal production by image processing.
Sungur, Cemil; Özkan, Halil
2015-11-01
Standards of hygiene and health are of major importance in food production, and quality control has become obligatory in this field. Thanks to rapidly developing technologies, it is now possible to carry out automatic and safe quality control of food production. For this purpose, image-processing-based quality control systems used in industrial applications are being employed to analyze the quality of food products. In this study, quality control of chicken (Gallus domesticus) eggs was achieved using a real time image-processing technique. In order to execute the quality control processes, a conveying mechanism was used. Eggs passing on a conveyor belt were continuously photographed in real time by cameras located above the belt. The images obtained were processed by various methods and techniques. Using digital instrumentation, the volume of the eggs was measured, broken/cracked eggs were separated, and dirty eggs were identified. In accordance with international standards for classifying the quality of eggs, the class of the separated eggs was determined through a fuzzy implication model. According to tests carried out on thousands of eggs, a quality control process with an accuracy of 98% was achieved. © 2014 Society of Chemical Industry.
Tavakoli, Ali; Nikoo, Mohammad Reza; Kerachian, Reza; Soltani, Maryam
2015-04-01
In this paper, a new fuzzy methodology is developed to optimize water and waste load allocation (WWLA) in rivers under uncertainty. An interactive two-stage stochastic fuzzy programming (ITSFP) method is utilized to handle parameter uncertainties, which are expressed as fuzzy boundary intervals. An iterative linear programming (ILP) is also used for solving the nonlinear optimization model. To accurately consider the impacts of the water and waste load allocation strategies on the river water quality, a calibrated QUAL2Kw model is linked with the WWLA optimization model. The soil, water, atmosphere, and plant (SWAP) simulation model is utilized to determine the quantity and quality of each agricultural return flow. To control pollution loads of agricultural networks, it is assumed that a part of each agricultural return flow can be diverted to an evaporation pond and also another part of it can be stored in a detention pond. In detention ponds, contaminated water is exposed to solar radiation for disinfecting pathogens. Results of applying the proposed methodology to the Dez River system in the southwestern region of Iran illustrate its effectiveness and applicability for water and waste load allocation in rivers. In the planning phase, this methodology can be used for estimating the capacities of return flow diversion system and evaporation and detention ponds.
Arons, Alexander M M; Krabbe, Paul F M
2013-02-01
Interest is rising in measuring subjective health outcomes, such as treatment outcomes that are not directly quantifiable (functional disability, symptoms, complaints, side effects and health-related quality of life). Health economists in particular have applied probabilistic choice models in the area of health evaluation. They increasingly use discrete choice models based on random utility theory to derive values for healthcare goods or services. Recent attempts have been made to use discrete choice models as an alternative method to derive values for health states. In this article, various probabilistic choice models are described according to their underlying theory. A historical overview traces their development and applications in diverse fields. The discussion highlights some theoretical and technical aspects of the choice models and their similarity and dissimilarity. The objective of the article is to elucidate the position of each model and their applications for health-state valuation.
New global fire emission estimates and evaluation of volatile organic compounds
C. Wiedinmyer; L. K. Emmons; S. K. Akagi; R. J. Yokelson; J. J. Orlando; J. A. Al-Saadi; A. J. Soja
2010-01-01
A daily, high-resolution, global fire emissions model has been built to estimate emissions from open burning for air quality modeling applications: The Fire INventory from NCAR (FINN version 1). The model framework uses daily fire detections from the MODIS instruments and updated emission factors, specifically for speciated non-methane organic compounds (NMOC). Global...
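Fire emission inventories of this kind typically follow a bottom-up calculation in which emitted mass is the product of burned area, fuel loading, combustion completeness, and a species-specific emission factor. The sketch below shows that calculation with made-up grassland numbers; it is not the FINN v1 parameterization.

```python
# Bottom-up fire emission estimate for one detected fire pixel:
# E_species = burned_area * fuel_load * combustion_completeness * emission_factor
def fire_emission(burned_area_m2, fuel_load_kg_m2, combustion_completeness, ef_g_per_kg):
    """Returns emitted mass of one species in kilograms."""
    dry_mass_burned_kg = burned_area_m2 * fuel_load_kg_m2 * combustion_completeness
    return dry_mass_burned_kg * ef_g_per_kg / 1000.0

# Illustrative numbers for a 1 km2 grassland fire pixel (not FINN values).
area = 1.0e6          # m2
fuel = 0.5            # kg dry matter per m2
cc = 0.9              # fraction of fuel consumed
ef_co = 60.0          # g CO emitted per kg dry matter burned
print(f"CO emitted: {fire_emission(area, fuel, cc, ef_co):.0f} kg")
```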
The RCS model allows us to estimate the distribution of population exposure to air pollutants in any city given only the outdoor measurements in that city. Since outdoor measurements are made in many cities, but personal exposures are measured in few, the model could conceivab...
ERIC Educational Resources Information Center
Butcher, Samuel S.; And Others
1985-01-01
Part I of this paper (SE 538 295) described a simple model for estimating laboratory concentrations of gas phase pollutants. In this part, the measurement of ventilation rates and applications of the model are discussed. The model can provide a useful starting point in planning for safer instructional laboratories. (JN)
Wandersman, Abraham; Alia, Kassandra Ann; Cook, Brittany; Ramaswamy, Rohit
2015-01-01
While the body of evidence-based healthcare interventions grows, the ability of health systems to deliver these interventions effectively and efficiently lags behind. Quality improvement approaches, such as the model for improvement, have demonstrated some success in healthcare but their impact has been lessened by implementation challenges. To help address these challenges, we describe the empowerment evaluation approach that has been developed by programme evaluators and a method for its application (Getting To Outcomes (GTO)). We then describe how GTO can be used to implement healthcare interventions. An illustrative healthcare quality improvement example that compares the model for improvement and the GTO method for reducing hospital admissions through improved diabetes care is described. We conclude with suggestions for integrating GTO and the model for improvement. PMID:26178332
Towards Application of NASA Standard for Models and Simulations in Aeronautical Design Process
NASA Astrophysics Data System (ADS)
Vincent, Luc; Dunyach, Jean-Claude; Huet, Sandrine; Pelissier, Guillaume; Merlet, Joseph
2012-08-01
Even powerful computational techniques such as simulation are subject to limitations in their validity domain. Consequently, using simulation models requires caution to avoid making biased design decisions for new aeronautical products on the basis of inadequate simulation results. The fidelity, accuracy, and validity of simulation models should therefore be monitored in context throughout the design phases to build confidence that the goals of modelling and simulation are being achieved. In the CRESCENDO project, we adapt the Credibility Assessment Scale method from the NASA standard for models and simulations, originally developed for the space programme, to aircraft design in order to assess the quality of simulations. The proposed eight quality assurance metrics aggregate information to indicate the level of confidence in results. They are displayed in a management dashboard and can secure design trade-off decisions at programme milestones. The application of this technique is illustrated in an aircraft design context with a specific thermal finite element analysis. This use case shows how to judge the fitness-for-purpose of simulation as a virtual testing means and then green-light the continuation of the Simulation Lifecycle Management (SLM) process.
NASA Astrophysics Data System (ADS)
Adams, R.; Quinn, P. F.; Bowes, M. J.
2014-09-01
A model for simulating runoff pathways and water quality fluxes has been developed using the Minimum Information Requirement (MIR) approach. The model, the Catchment Runoff Attenuation Tool (CRAFT), is applicable to meso-scale catchments and focuses primarily on the hydrological pathways that mobilise nutrients. Hence CRAFT can be used to investigate the impact of management intervention strategies designed to reduce nutrient loads to receiving watercourses. The model can help policy makers, for example in Europe, meet water quality targets and consider methods of achieving "good" ecological status. A case study of the 414 km2 Frome catchment, Dorset, UK, is described here as an application of the CRAFT model. The model was primarily calibrated on ten years of weekly data to reproduce the observed flows and nutrient (nitrate nitrogen, N, and phosphorus, P) concentrations. Data from two years of sub-daily, high-resolution monitoring at the same site were also analysed. These data highlighted some additional signals in the nutrient flux, particularly of soluble reactive phosphorus, which were not observable in the weekly data. This analysis prompted the choice of a daily timestep for this meso-scale modelling study as the minimum information requirement. A management intervention scenario was also run to show how the model can support catchment managers in investigating the effects of reducing the concentrations of N and P in the various flow pathways. This scale-appropriate modelling tool can help policy makers consider a range of strategies to meet European Union (EU) water quality targets for this type of catchment.
Spreadsheet WATERSHED modeling for nonpoint-source pollution management in a Wisconsin basin
Walker, J.F.; Pickard, S.A.; Sonzogni, W.C.
1989-01-01
Although several sophisticated nonpoint pollution models exist, few are available that are easy to use, cover a variety of conditions, and integrate a wide range of information to allow managers and planners to assess different control strategies. Here, a straightforward pollutant input accounting approach is presented in the form of an existing model (WATERSHED) that has been adapted to run on modern electronic spreadsheets. As an application, WATERSHED is used to assess options to improve the quality of highly eutrophic Delavan Lake in Wisconsin. WATERSHED is flexible in that several techniques, such as the Universal Soil Loss Equation or unit-area loadings, can be used to estimate nonpoint-source inputs. Once the model parameters are determined (and calibrated, if possible), the spreadsheet features can be used to conduct a sensitivity analysis of management options. In the case of Delavan Lake, it was concluded that, although some nonpoint controls were cost-effective, the overall reduction in phosphorus would be insufficient to measurably improve water quality.
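The unit-area (export coefficient) accounting mentioned above is straightforward to reproduce outside a spreadsheet. The sketch below tallies annual phosphorus loads by land use and tests one management option; the export coefficients and areas are invented for illustration and are not the Delavan Lake inputs.

```python
# Unit-area (export coefficient) accounting of annual phosphorus loads by land use.
export_coeff_kg_per_ha = {   # illustrative export coefficients, kg P / ha / yr
    "row crops": 1.0,
    "pasture": 0.5,
    "urban": 1.2,
    "forest": 0.1,
}
area_ha = {"row crops": 4000, "pasture": 2500, "urban": 1200, "forest": 800}

loads = {lu: export_coeff_kg_per_ha[lu] * area_ha[lu] for lu in area_ha}
total = sum(loads.values())
for lu, load in sorted(loads.items(), key=lambda kv: -kv[1]):
    print(f"{lu:>9}: {load:7.0f} kg P/yr ({100 * load / total:.0f}%)")
print(f"    total: {total:7.0f} kg P/yr")

# Sensitivity of a management option: 30% reduction of the row-crop coefficient.
reduced = total - 0.3 * loads["row crops"]
print(f"with row-crop controls: {reduced:.0f} kg P/yr "
      f"({100 * (total - reduced) / total:.0f}% reduction)")
```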
MPEG-21 in broadcasting: the novel digital broadcast item model
NASA Astrophysics Data System (ADS)
Lugmayr, Artur R.; Touimi, Abdellatif B.; Kaneko, Itaru; Kim, Jong-Nam; Alberti, Claudio; Yona, Sadigurschi; Kim, Jaejoon; Andrade, Maria Teresa; Kalli, Seppo
2004-05-01
The MPEG experts are currently developing the MPEG-21 set of standards and this includes a framework and specifications for digital rights management (DRM), delivery of quality of services (QoS) over heterogeneous networks and terminals, packaging of multimedia content and other things essential for the infrastructural aspects of multimedia content distribution. Considerable research effort is being applied to these new developments and the capabilities of MPEG-21 technologies to address specific application areas are being investigated. One such application area is broadcasting, in particular the development of digital TV and its services. In more practical terms, digital TV addresses networking, events, channels, services, programs, signaling, encoding, bandwidth, conditional access, subscription, advertisements and interactivity. MPEG-21 provides an excellent framework of standards to be applied in digital TV applications. Within the scope of this research work we describe a new model based on MPEG-21 and its relevance to digital TV: the digital broadcast item model (DBIM). The goal of the DBIM is to elaborate the potential of MPEG-21 for digital TV applications. Within this paper we focus on a general description of the DBIM, quality of service (QoS) management and metadata filtering, digital rights management and also present use-cases and scenarios where the DBIM's role is explored in detail.
The DISCOVER-AQ project (Deriving Information on Surface conditions from Column and Vertically Resolved Observations Relevant to Air Quality), is a joint collaboration between NASA, U.S. EPA and a number of other local organizations with the goal of characterizing air quality in ...
A prescribed fire emission factors database for land management and air quality applications
E. Lincoln; WeiMin Hao; S. Baker; R. J. Yokelson; I. R. Burling; Shawn Urbanski; W. Miller; D. R. Weise; T. J. Johnson
2010-01-01
Prescribed fire is a significant emissions source in the U.S. that needs to be adequately characterized in atmospheric transport/chemistry models. In addition, the Clean Air Act, its amendments, and air quality regulations require that prescribed fire managers estimate the quantity of emissions that a prescribed fire will produce. Several published papers contain a...
An Entropy-Based Measure for Assessing Fuzziness in Logistic Regression
Weiss, Brandi A.; Dardick, William
2015-01-01
This article introduces an entropy-based measure of data–model fit that can be used to assess the quality of logistic regression models. Entropy has previously been used in mixture-modeling to quantify how well individuals are classified into latent classes. The current study proposes the use of entropy for logistic regression models to quantify the quality of classification and separation of group membership. Entropy complements preexisting measures of data–model fit and provides unique information not contained in other measures. Hypothetical data scenarios, an applied example, and Monte Carlo simulation results are used to demonstrate the application of entropy in logistic regression. Entropy should be used in conjunction with other measures of data–model fit to assess how well logistic regression models classify cases into observed categories. PMID:29795897
An Entropy-Based Measure for Assessing Fuzziness in Logistic Regression.
Weiss, Brandi A; Dardick, William
2016-12-01
This article introduces an entropy-based measure of data-model fit that can be used to assess the quality of logistic regression models. Entropy has previously been used in mixture-modeling to quantify how well individuals are classified into latent classes. The current study proposes the use of entropy for logistic regression models to quantify the quality of classification and separation of group membership. Entropy complements preexisting measures of data-model fit and provides unique information not contained in other measures. Hypothetical data scenarios, an applied example, and Monte Carlo simulation results are used to demonstrate the application of entropy in logistic regression. Entropy should be used in conjunction with other measures of data-model fit to assess how well logistic regression models classify cases into observed categories.
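The entropy measure described in these two records can be computed directly from a fitted model's predicted class probabilities. The sketch below uses the normalized form E = 1 - sum_i sum_k (-p_ik ln p_ik) / (n ln K) on a synthetic logistic regression; the exact normalization used by the authors may differ.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.datasets import make_classification

def normalized_entropy(prob):
    """Entropy-based classification quality in [0, 1]; 1 = perfectly separated classes.
    E = 1 - sum_i sum_k (-p_ik * ln p_ik) / (n * ln K)."""
    n, k = prob.shape
    p = np.clip(prob, 1e-12, 1.0)                 # avoid log(0)
    return 1.0 - (-(p * np.log(p)).sum()) / (n * np.log(k))

X, y = make_classification(n_samples=1000, n_features=5, n_informative=3, random_state=0)
model = LogisticRegression(max_iter=1000).fit(X, y)
prob = model.predict_proba(X)                      # n x 2 matrix of class probabilities
print(f"entropy-based fuzziness measure: {normalized_entropy(prob):.3f}")
```

As the abstracts note, such a measure complements rather than replaces conventional fit statistics, so it would normally be reported alongside them.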
76 FR 15004 - Proposed Collection; Comment Request
Federal Register 2010, 2011, 2012, 2013, 2014
2011-03-18
... computer software-based models or applications, termed under the rule as ``interactive websites.'' These... information; (c) Ways to enhance the quality, utility, and clarity of the information collected; and (d) Ways...
Yuan, Tao; Zheng, Xinqi; Hu, Xuan; Zhou, Wei; Wang, Wei
2014-01-01
Objective and effective image quality assessment (IQA) is directly related to the application of optical remote sensing images (ORSI). In this study, a new IQA method that standardizes the target object recognition rate (ORR) is presented to reflect quality. First, several quality degradation treatments are applied to high-resolution ORSIs to model the ORSIs obtained under different imaging conditions; then, a machine learning algorithm is adopted for recognition experiments on a chosen target object to obtain ORRs; finally, a comparison with commonly used IQA indicators is performed to reveal their applicability and limitations. The results showed that the ORR of the original ORSI was calculated to be 81.95%, whereas the ORR ratios of the quality-degraded images to the original images were 65.52%, 64.58%, 71.21%, and 73.11%. These data reflect the advantages and disadvantages of different images for object identification and information extraction more accurately than conventional digital image assessment indices. By judging image quality from the perspective of application effect, using a machine learning algorithm to extract regional grayscale features of typical objects in the image for analysis, and quantitatively assessing ORSI quality according to the resulting differences, this method provides a new approach to objective ORSI assessment.
Leveraging OpenStudio's Application Programming Interfaces: Preprint
DOE Office of Scientific and Technical Information (OSTI.GOV)
Long, N.; Ball, B.; Goldwasser, D.
2013-11-01
OpenStudio development efforts have been focused on providing Application Programming Interfaces (APIs) where users are able to extend OpenStudio without the need to compile the open source libraries. This paper will discuss the basic purposes and functionalities of the core libraries that have been wrapped with APIs, including the Building Model, Results Processing, Advanced Analysis, Uncertainty Quantification, and Data Interoperability through Translators. Several building energy modeling applications have been produced using OpenStudio's API and Software Development Kits (SDK), including the United States Department of Energy's Asset Score Calculator, a mobile-based audit tool, an energy design assistance reporting protocol, and a portfolio-scale incentive optimization analysis methodology. Each of these software applications will be discussed briefly, describing how the APIs were leveraged for various uses including high-level modeling, data transformations from detailed building audits, error checking/quality assurance of models, and use of high-performance computing for mass simulations.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yamartino, R.J.; Smith, D.G.; Bremer, S.A.
1980-07-01
This report documents the results of the Federal Aviation Administration (FAA)/Environmental Protection Agency (EPA) air quality study which has been conducted to assess the impact of aircraft emissions of carbon monoxide (CO), hydrocarbons (HC), and oxides of nitrogen (NOx) in the vicinity of airports. This assessment includes the results of recent modeling and monitoring efforts at Washington National (DCA), Los Angeles International (LAX), Dulles International (IAD), and Lakeland, Florida airports and an updated modeling of aircraft generated pollution at LAX, John F. Kennedy (JFK) and Chicago O'Hare (ORD) airports. The Airport Vicinity Air Pollution (AVAP) model, which was designed for use at civil airports, was used in this assessment. In addition, the results of the application of the military version of the AVAP model, the Air Quality Assessment Model (AQAM), are summarized.
Large collections of chemical structures and associated experimental data are publicly available, and can be used to build robust QSAR models for applications in different fields. One common concern is the quality of both the chemical structure information and associated experime...
In most ecosystems, atmospheric deposition is the primary input of mercury. The total wet deposition of mercury in atmospheric chemistry models is sensitive to parameterization of the aqueous-phase reduction of divalent oxidized mercury (Hg2+). However, most atmospheric chemistry...
There is a need to develop modeling and data analysis tools to increase our understanding of human exposures to air pollutants beyond what can be explained by "limited" field data. Modeling simulations of complex distributions of pollutant concentrations within roadw...
REGIONAL MODELING OF THE ATMOSPHERIC TRANSPORT AND DEPOSITION OF ATRAZINE
A version of the Community Multiscale Air Quality (CMAQ) model has been developed by the U.S. EPA that is capable of addressing the atmospheric fate, transport and deposition of some common trace toxics. An initial, 36-km rectangular grid-cell application for atrazine has been...
Evaluation of satellite-based, modeled-derived daily solar radiation data for the continental U.S.
USDA-ARS?s Scientific Manuscript database
Many applications of simulation models and related decision support tools for agriculture and natural resource management require daily meteorological data as inputs. Availability and quality of such data, however, often constrain research and decision support activities that require use of these to...
Result from a new air pollution model were tested against data from the Southern California Air Quality Study (SCAQS) period of 26-29 August 1987. Gross errors for sulfate, sodium, light absorption, temperatures, surface solar radiation, sulfur dioxide gas, formaldehyde gas, and ...
A BAYESIAN STATISTICAL APPROACHES FOR THE EVALUATION OF CMAQ
This research focuses on the application of spatial statistical techniques for the evaluation of the Community Multiscale Air Quality (CMAQ) model. The upcoming release version of the CMAQ model was run for the calendar year 2001 and is in the process of being evaluated by EPA an...
Predicting Nitrogen in Streams : A Comparison of Two Estimates of Fertilizer Application
Decision makers frequently rely on water and air quality models to develop nutrient management strategies. Obviously, the results of these models (e.g., SWAT, SPARROW, CMAQ) are only as good as the nutrient source input data and recently the Nutrient Innovations Task Group has ca...
Weather Research and Forecasting (WRF) meteorological data are used for USEPA multimedia air and water quality modeling applications, within the CMAQ modeling system to estimate wet deposition and to evaluate future climate and land-use scenarios. While it is not expected that hi...
Spatial analysis studies have included application of land use regression models (LURs) for health and air quality assessments. Recent LUR studies have collected nitrogen dioxide (NO2) and volatile organic compounds (VOCs) using passive samplers at urban air monitoring networks ...
Effects of urbanization on the water quality of lakes in Eagan, Minnesota
Ayers, M.A.; Payne, G.A.; Have, Mark A.
1980-01-01
Three phosphorus-prediction models developed during the study are applicable to shallow (less than about 12 feet), nonstratifying lakes and ponds. The data base was not sufficient to select an appropriate model to predict the effects of future loading from continuing urbanization on the deeper lakes.
Increasing availability of large collections of chemical structures and associated experimental data provides an opportunity to build robust QSAR models for applications in different fields. One common concern is the quality of both the chemical structure information and associat...
Jeff Jenness; J. Judson Wynne
2005-01-01
In the field of spatially explicit modeling, well-developed accuracy assessment methodologies are often poorly applied. Deriving model accuracy metrics has been possible for decades, but these calculations were made by hand or with the use of a spreadsheet application. Accuracy assessments may be useful for: (1) ascertaining the quality of a model; (2) improving model...
Prediction of passenger ride quality in a multifactor environment
NASA Technical Reports Server (NTRS)
Dempsey, T. K.; Leatherwood, J. D.
1976-01-01
A model being developed permits the understanding and prediction of passenger discomfort in a multifactor environment, with particular emphasis upon combined noise and vibration. The model has general applicability to diverse transportation systems and provides a means of developing ride quality design criteria as well as a diagnostic tool for identifying the vibration and/or noise stimuli causing discomfort. Presented are: (1) a review of the basic theoretical and mathematical computations associated with the model, (2) a discussion of methodological and criteria investigations for both the vertical and roll axes of vibration, (3) a description of within-axis masking of discomfort responses for the vertical axis, thereby allowing prediction of the total discomfort due to any random vertical vibration, (4) a discussion of initial data on between-axis masking, and (5) discussion of a study directed towards extension of the vibration model to the more general case of predicting ride quality in combined noise and vibration environments.
A modeling analysis program for the JPL Table Mountain Io sodium cloud data
NASA Technical Reports Server (NTRS)
Smyth, W. H.; Goldberg, B. A.
1986-01-01
Progress and achievements in the second year are discussed in three main areas: (1) data quality review of the 1981 Region B/C images; (2) data processing activities; and (3) modeling activities. The data quality review revealed that almost all 1981 Region B/C images are of sufficient quality to be valuable in the analyses of the JPL data set. In the second area, the major milestone reached was the successful development and application of complex image-processing software required to render the original image data suitable for modeling analysis studies. In the third area, the lifetime description of sodium atoms in the planet magnetosphere was improved in the model to include the offset dipole nature of the magnetic field as well as an east-west electric field. These improvements are important in properly representing the basic morphology as well as the east-west asymmetries of the sodium cloud.
Yu, Lei; Kang, Jian
2009-09-01
This research aims to explore the feasibility of using computer-based models to predict the soundscape quality evaluation of potential users in urban open spaces at the design stage. With the data from large scale field surveys in 19 urban open spaces across Europe and China, the importance of various physical, behavioral, social, demographical, and psychological factors for the soundscape evaluation has been statistically analyzed. Artificial neural network (ANN) models have then been explored at three levels. It has been shown that for both subjective sound level and acoustic comfort evaluation, a general model for all the case study sites is less feasible due to the complex physical and social environments in urban open spaces; models based on individual case study sites perform well but the application range is limited; and specific models for certain types of location/function would be reliable and practical. The performance of acoustic comfort models is considerably better than that of sound level models. Based on the ANN models, soundscape quality maps can be produced and this has been demonstrated with an example.
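A minimal version of such an ANN soundscape model can be sketched with a small multilayer perceptron. The predictors, the synthetic comfort ratings, and the network size below are illustrative stand-ins for the survey variables and site-specific models described in the paper.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline
from sklearn.metrics import r2_score

rng = np.random.default_rng(7)
n = 2000
# Synthetic stand-ins for physical/behavioural/demographic predictors at a site.
sound_level_db = rng.normal(62, 6, n)
natural_sound_frac = rng.uniform(0, 1, n)
age = rng.integers(18, 75, n)
visit_purpose = rng.integers(0, 3, n)            # coded categorical factor

# Synthetic acoustic comfort rating (1-5): lower at high levels, higher with natural sounds.
comfort = np.clip(5 - 0.08 * (sound_level_db - 55) + 1.2 * natural_sound_frac
                  + rng.normal(0, 0.4, n), 1, 5)

X = np.column_stack([sound_level_db, natural_sound_frac, age, visit_purpose])
X_tr, X_te, y_tr, y_te = train_test_split(X, comfort, random_state=0)

ann = make_pipeline(StandardScaler(),
                    MLPRegressor(hidden_layer_sizes=(16, 8), max_iter=2000, random_state=0))
ann.fit(X_tr, y_tr)
print(f"R^2 on held-out responses: {r2_score(y_te, ann.predict(X_te)):.2f}")
```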
NASA Astrophysics Data System (ADS)
Curci, Gabriele; Falasca, Serena
2017-04-01
Deterministic air quality forecast is routinely carried out at many local Environmental Agencies in Europe and throughout the world by means of eulerian chemistry-transport models. The skill of these models in predicting the ground-level concentrations of relevant pollutants (ozone, nitrogen dioxide, particulate matter) a few days ahead has greatly improved in recent years, but it is not yet always compliant with the required quality level for decision making (e.g. the European Commission has set a maximum uncertainty of 50% on daily values of relevant pollutants). Post-processing of deterministic model output is thus still regarded as a useful tool to make the forecast more reliable. In this work, we test several bias correction techniques applied to a long-term dataset of air quality forecasts over Europe and Italy. We used the WRF-CHIMERE modelling system, which provides operational experimental chemical weather forecast at CETEMPS (http://pumpkin.aquila.infn.it/forechem/), to simulate the years 2008-2012 at low resolution over Europe (0.5° x 0.5°) and moderate resolution over Italy (0.15° x 0.15°). We compared the simulated dataset with available observation from the European Environmental Agency database (AirBase) and characterized model skill and compliance with EU legislation using the Delta tool from FAIRMODE project (http://fairmode.jrc.ec.europa.eu/). The bias correction techniques adopted are, in order of complexity: (1) application of multiplicative factors calculated as the ratio of model-to-observed concentrations averaged over the previous days; (2) correction of the statistical distribution of model forecasts, in order to make it similar to that of the observations; (3) development and application of Model Output Statistics (MOS) regression equations. We illustrate differences and advantages/disadvantages of the three approaches. All the methods are relatively easy to implement for other modelling systems.
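The first two bias correction techniques listed above can be sketched in a few lines: a running multiplicative factor built from the ratio of recent model-to-observed averages, and a quantile-mapping correction that reshapes the forecast distribution onto the observed one. The synthetic series and window lengths below are illustrative and do not reproduce the CETEMPS setup.

```python
import numpy as np

rng = np.random.default_rng(3)
n_days = 120
obs = rng.gamma(shape=4.0, scale=10.0, size=n_days)           # observed daily PM10, ug/m3
fcst = 1.3 * obs + rng.normal(0, 8, n_days)                   # biased raw model forecast

# Technique 1: multiplicative factor from the ratio of model to observed concentrations
# averaged over the previous k days, applied to the current day's forecast.
def ratio_correction(fcst, obs, k=7):
    corrected = fcst.copy()
    for t in range(k, len(fcst)):
        factor = fcst[t - k:t].mean() / obs[t - k:t].mean()
        corrected[t] = fcst[t] / factor
    return corrected

# Technique 2: quantile mapping - reshape the forecast distribution onto the observed one.
def quantile_mapping(fcst, obs_train, fcst_train):
    q = np.linspace(0, 100, 101)
    ranks = np.interp(fcst, np.percentile(fcst_train, q), q)   # forecast -> percentile rank
    return np.interp(ranks, q, np.percentile(obs_train, q))    # rank -> observed quantile

mae = lambda a, b: np.mean(np.abs(a - b))
print(f"raw MAE:             {mae(fcst, obs):.1f}")
print(f"ratio-corrected MAE: {mae(ratio_correction(fcst, obs)[7:], obs[7:]):.1f}")
print(f"quantile-mapped MAE: {mae(quantile_mapping(fcst, obs[:60], fcst[:60]), obs):.1f}")
```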
Advanced Water Quality Modelling in Marine Systems: Application to the Wadden Sea, the Netherlands
NASA Astrophysics Data System (ADS)
Boon, J.; Smits, J. G.
2006-12-01
There is an increasing demand for knowledge and models arising from water management in relation to water quality, sediment quality (ecology) and sediment accumulation (ecomorphology). Recently, models for sediment diagenesis and erosion developed or incorporated by Delft Hydraulics integrate the relevant physical, (bio)chemical and biological processes for the sediment-water exchange of substances. The aim of the diagenesis models is the prediction of both sediment quality and the return fluxes of substances such as nutrients and micropollutants to the overlying water. The resulting so-called DELWAQ-G model is a new, generic version of the water and sediment quality model of the DELFT3D framework. One set of generic water quality process formulations is used to calculate process rates in both water and sediment compartments. DELWAQ-G involves the explicit simulation of sediment layers in the water quality model with state-of-the-art process kinetics. The local conditions in a water layer or sediment layer, such as the dissolved oxygen concentration, determine if and how individual processes come to expression. New processes were added for sulphate, sulphide, methane and the distribution of the electron-acceptor demand over dissolved oxygen, nitrate, sulphate and carbon dioxide. DELWAQ-G also includes the dispersive and advective transport processes in the sediment and across the sediment-water interface. DELWAQ-G has been applied to the Wadden Sea, a very dynamic tidal and ecologically active estuary with complex hydrodynamic behaviour located in the north of the Netherlands. The predicted profiles in the sediment reflect the typical interactions of diagenesis processes.
Fan, Xin-Gang; Mi, Wen-Bao; Ma, Zhen-Ning
2015-02-01
To support deeper analysis of regional environmental-economic systems, the paper examines the mutual relations among regional economic development, environmental quality, and environmental pollution, and establishes a theoretical basis. A three-dimensional economy-pollution-environmental quality coupling evaluation model for a district is then constructed. It comprises an economic development level index, an environmental pollution index, and an environmental quality index. The model takes the form of a cube with spatialization and visualization characteristics; it contains 8 sub-cubes expressing 8 types of state, e.g. low pollution-inferior quality-low economic development. The model can be used to evaluate the status of a region, divide development phases, and analyze evolutionary trends. It supports two modes of evaluation: relative meaning evaluation (RME) and absolute meaning evaluation (AME). Based on the model, Yinchuan City in the Ningxia Hui Autonomous Region is used as an example for an empirical study. Under RME, with Guangzhou City as the reference, Yinchuan City remained in a high pollution-low quality-low economic development state for a long period during 1996-2010; after 2007 the state changed to high pollution-high quality-low economic development. At present the environmental quality of Yinchuan City is improving, but pollutant discharge pressure remains high, and the city tends toward the break point between high and low environmental quality. Under AME, using national standards, Yinchuan City remained in a high pollution-low quality-low economic development state during 1996-2010. The empirical research verifies that different reference areas and national standards yield different key parameters, so the evaluation result has a flexible range. The dimensionless data enhance the coupling of the indices, and the positioning of data within the model increases the visibility of results for environmental management decisions. The model also addresses shortcomings of earlier multi-target coupling models, such as mismatches in the scale of calculated data, temporal asymmetry of spatial data, and difficulties of verification.
Modeling of the laser beam shape for high-power applications
NASA Astrophysics Data System (ADS)
Jabczyński, Jan K.; Kaskow, Mateusz; Gorajek, Lukasz; Kopczyński, Krzysztof; Zendzian, Waldemar
2018-04-01
Aperture losses and thermo-optic effects (TOE) inside optics, as well as the effective beam width in the far field, should be taken into account in the analysis of the most appropriate laser beam profile for high-power applications. We have theoretically analyzed this problem for a group of super-Gaussian beams, first considering only diffraction limitations. Furthermore, we have investigated the influence of TOE on the far-field parameters of such beams to determine the effect of absorption in optical elements on beam quality degradation. The best compromise is given by the super-Gaussian profile of index p = 5, for which beam quality does not decrease noticeably and the higher-order thermo-optic aberrations are compensated. Simplified formulas were derived for the beam quality metrics (the M2 parameter and the Strehl ratio), which enable estimation of the influence of heat deposited in the optics on the degradation of beam quality. A method for dynamic compensation of this effect was proposed.
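The trade-off discussed above, flatter super-Gaussian profiles versus aperture clipping and effective far-field width, can be explored numerically. The sketch below uses one common super-Gaussian convention, I(x) = exp(-2(|x|/w)^(2p)), and reports second-moment widths and hard-aperture losses; the convention and all numbers are illustrative, not the paper's exact definitions.

```python
import numpy as np

def super_gaussian(x, w, p):
    """1-D super-Gaussian intensity profile (one common convention):
    I(x) = exp(-2 * (|x| / w) ** (2 * p));  p = 1 recovers the ordinary Gaussian."""
    return np.exp(-2.0 * (np.abs(x) / w) ** (2 * p))

def second_moment_width(x, intensity):
    """2-sigma width from the normalized second moment of the intensity profile."""
    dx = x[1] - x[0]
    norm = intensity.sum() * dx
    mean = (x * intensity).sum() * dx / norm
    var = ((x - mean) ** 2 * intensity).sum() * dx / norm
    return 2.0 * np.sqrt(var)

def aperture_loss(x, intensity, half_aperture):
    """Fraction of power clipped by a hard aperture of the given half-width."""
    inside = intensity[np.abs(x) <= half_aperture].sum()
    return 1.0 - inside / intensity.sum()

x = np.linspace(-4, 4, 8001)
for p in (1, 2, 5, 10):                      # increasing flat-top character
    profile = super_gaussian(x, w=1.0, p=p)
    print(f"p={p:>2}: 2-sigma width = {second_moment_width(x, profile):.3f}, "
          f"aperture loss at |x| <= 1.5 = {aperture_loss(x, profile, 1.5):.4f}")
```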
NASA Astrophysics Data System (ADS)
Cameron, Enrico; Pilla, Giorgio; Stella, Fabio A.
2018-06-01
The application of statistical classification methods is investigated—in comparison also to spatial interpolation methods—for predicting the acceptability of well-water quality in a situation where an effective quantitative model of the hydrogeological system under consideration cannot be developed. In the example area in northern Italy, in particular, the aquifer is locally affected by saline water and the concentration of chloride is the main indicator of both saltwater occurrence and groundwater quality. The goal is to predict if the chloride concentration in a water well will exceed the allowable concentration so that the water is unfit for the intended use. A statistical classification algorithm achieved the best predictive performances and the results of the study show that statistical classification methods provide further tools for dealing with groundwater quality problems concerning hydrogeological systems that are too difficult to describe analytically or to simulate effectively.
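A minimal version of the classification approach, predicting whether a well's chloride concentration will exceed the acceptability threshold from mappable well attributes, is sketched below on synthetic data. The predictors and the chloride model are invented; only the 250 mg/L guideline value is a real reference level.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(11)
n_wells = 400

# Synthetic well attributes standing in for the predictors available in such studies
# (location, depth, distance to the saline zone); values are illustrative only.
easting = rng.uniform(0, 20, n_wells)          # km
northing = rng.uniform(0, 20, n_wells)         # km
depth = rng.uniform(20, 150, n_wells)          # m
dist_saline = rng.uniform(0, 10, n_wells)      # km to mapped saline-water zone

# Synthetic chloride concentration rising near the saline zone and at depth.
chloride = 80 + 400 * np.exp(-dist_saline) + 0.8 * depth + rng.normal(0, 40, n_wells)
threshold = 250.0                               # mg/L, common drinking-water guideline
unfit = (chloride > threshold).astype(int)      # target: acceptability of well water

X = np.column_stack([easting, northing, depth, dist_saline])
clf = RandomForestClassifier(n_estimators=300, random_state=0)
scores = cross_val_score(clf, X, unfit, cv=5, scoring="balanced_accuracy")
print(f"cross-validated balanced accuracy: {scores.mean():.2f} +/- {scores.std():.2f}")
```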
A review of distributed parameter groundwater management modeling methods
Gorelick, Steven M.
1983-01-01
Models which solve the governing groundwater flow or solute transport equations in conjunction with optimization techniques, such as linear and quadratic programming, are powerful aquifer management tools. Groundwater management models fall into two general categories: hydraulics, and policy evaluation and water allocation. Groundwater hydraulic management models enable the determination of optimal locations and pumping rates of numerous wells under a variety of restrictions placed upon local drawdown, hydraulic gradients, and water production targets. Groundwater policy evaluation and allocation models can be used to study the influence upon regional groundwater use of institutional policies such as taxes and quotas. Furthermore, fairly complex groundwater-surface water allocation problems can be handled using system decomposition and multilevel optimization. Experience from the few real-world applications of groundwater optimization-management techniques is summarized. Classified separately are methods for groundwater quality management aimed at optimal waste disposal in the subsurface. This classification is composed of steady-state and transient management models that determine disposal patterns in such a way that water quality is protected at supply locations. Classes of research missing from the literature are groundwater quality management models involving nonlinear constraints, models which join groundwater hydraulic and quality simulations with political-economic management considerations, and management models that include parameter uncertainty.
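A hydraulic management model of the kind reviewed here can be posed, under linear-response assumptions, as a linear program: maximize total pumping subject to drawdown limits expressed through a response matrix. The sketch below is illustrative only; the wells, response coefficients, and limits are made up.

```python
# Illustrative groundwater hydraulic management LP (response-matrix form).
# Maximize total pumping Q subject to drawdown constraints A @ Q <= s_max and
# a minimum supply requirement. All numbers are hypothetical.
import numpy as np
from scipy.optimize import linprog

A = np.array([[0.020, 0.008, 0.005],    # drawdown (m) at control point 1 per unit pumping
              [0.006, 0.018, 0.007],    # ... at control point 2
              [0.004, 0.009, 0.021]])   # ... at control point 3
s_max = np.array([2.0, 2.5, 2.0])       # allowable drawdown (m)
demand = 150.0                          # minimum total supply (scaled units)

# linprog minimizes, so negate the objective to maximize total pumping.
res = linprog(c=-np.ones(3),
              A_ub=np.vstack([A, -np.ones((1, 3))]),
              b_ub=np.concatenate([s_max, [-demand]]),
              bounds=[(0, 200)] * 3)
print("optimal pumping rates:", res.x, "feasible:", res.success)
```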
NASA Technical Reports Server (NTRS)
Myers, Jerry G.; Young, M.; Goodenow, Debra A.; Keenan, A.; Walton, M.; Boley, L.
2015-01-01
Model and simulation (MS) credibility is defined as the quality to elicit belief or trust in MS results. NASA-STD-7009 [1] delineates eight components (Verification, Validation, Input Pedigree, Results Uncertainty, Results Robustness, Use History, MS Management, People Qualifications) that address quantifying model credibility, and provides guidance to model developers, analysts, and end users for assessing MS credibility. Of the eight components, input pedigree, or the quality of the data used to develop model input parameters, governing functions, or initial conditions, can vary significantly. These data quality differences have varying consequences across the range of MS applications. NASA-STD-7009 requires that the lowest input data quality be used to represent the entire set of input data when scoring the input pedigree credibility of the model. This requirement provides a conservative assessment of model inputs and maximizes the communication of the potential level of risk in using model outputs. Unfortunately, in practice, this may result in overly pessimistic communication of the MS output, undermining the credibility of simulation predictions to decision makers. This presentation proposes an alternative assessment mechanism, utilizing results parameter robustness, also known as model input sensitivity, to improve the credibility scoring process for specific simulations.
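A toy comparison of the two aggregation rules discussed above (the 0-4 scoring scale and the sensitivity weighting scheme are assumptions for illustration; NASA-STD-7009 defines its own detailed scoring rubric):

```python
# Hypothetical comparison of input-pedigree aggregation rules.
# scores: pedigree score (0-4, assumed scale) for each input data source.
# sensitivities: how strongly the model output responds to each input.
import numpy as np

scores = np.array([4, 3, 1, 4])            # one low-quality input drags the set down
sensitivities = np.array([0.6, 0.3, 0.05, 0.05])

conservative = scores.min()                 # rule in the standard: lowest quality governs
weights = sensitivities / sensitivities.sum()
sensitivity_weighted = float(weights @ scores)   # proposed alternative: weight by sensitivity

print("conservative (min):", conservative)                      # 1
print("sensitivity-weighted:", round(sensitivity_weighted, 2))  # ~3.55
```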
US EPA 2012 Air Quality Fused Surface for the Conterminous U.S. Map Service
This web service contains a polygon layer that depicts fused air quality predictions for 2012 for census tracts in the conterminous United States. Fused air quality predictions (for ozone and PM2.5) are modeled using a Bayesian space-time downscaling fusion model approach described in a series of three published journal papers: 1) Berrocal, V., Gelfand, A. E. and Holland, D. M. (2012). Space-time fusion under error in computer model output: an application to modeling air quality. Biometrics 68, 837-848; 2) Berrocal, V., Gelfand, A. E. and Holland, D. M. (2010). A bivariate space-time downscaler under space and time misalignment. The Annals of Applied Statistics 4, 1942-1975; and 3) Berrocal, V., Gelfand, A. E., and Holland, D. M. (2010). A spatio-temporal downscaler for output from numerical models. J. of Agricultural, Biological, and Environmental Statistics 15, 176-197. This approach is used to provide daily, predictive PM2.5 (daily average) and O3 (daily 8-hr maximum) surfaces for 2012; summer (O3) and annual (PM2.5) means are calculated and published. The downscaling fusion model uses both air quality monitoring data from the National Air Monitoring Stations/State and Local Air Monitoring Stations (NAMS/SLAMS) and numerical output from the Models-3/Community Multiscale Air Quality (CMAQ) model. Currently, predictions at the US census tract centroid locations within the 12 km CMAQ domain are archived. Predictions at the CMAQ grid cell centroids, or any desired set of locations co
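Stripped of its spatial and Bayesian machinery, the core calibration step in a downscaler of this kind regresses monitor observations on collocated model output. The sketch below is a deliberately simplified, non-spatial analogue (constant coefficients, ordinary least squares) of the published space-time model, shown only to convey the idea of fusing CMAQ output with monitor data:

```python
# Simplified, non-spatial analogue of a downscaler calibration step:
# observed = a + b * CMAQ + error, fit by ordinary least squares. The published
# model uses spatially and temporally varying coefficients estimated in a
# Bayesian framework; this is only the deterministic skeleton, on synthetic data.
import numpy as np

rng = np.random.default_rng(1)
cmaq_at_monitors = rng.uniform(5, 35, 300)                         # modeled PM2.5 (ug/m3)
observed = 2.0 + 0.85 * cmaq_at_monitors + rng.normal(0, 3, 300)   # monitor data

b, a = np.polyfit(cmaq_at_monitors, observed, 1)                   # slope, intercept
cmaq_at_tracts = rng.uniform(5, 35, 10)                            # CMAQ at tract centroids
fused_prediction = a + b * cmaq_at_tracts
print("calibration: a=%.2f b=%.2f" % (a, b))
print("fused predictions:", np.round(fused_prediction, 1))
```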
Perimal-Lewis, Lua; Teubner, David; Hakendorf, Paul; Horwood, Chris
2016-12-01
Effective and accurate use of routinely collected health data to produce Key Performance Indicator reporting is dependent on the underlying data quality. In this research, Process Mining methodology and tools were leveraged to assess the data quality of time-based Emergency Department data sourced from electronic health records. This research was done working closely with the domain experts to validate the process models. The hospital patient journey model was used to assess flow abnormalities which resulted from incorrect timestamp data used in time-based performance metrics. The research demonstrated process mining as a feasible methodology to assess data quality of time-based hospital performance metrics. The insight gained from this research enabled appropriate corrective actions to be put in place to address the data quality issues. © The Author(s) 2015.
Sequentially Executed Model Evaluation Framework
DOE Office of Scientific and Technical Information (OSTI.GOV)
2015-10-20
Provides a message passing framework between generic input, model, and output drivers, and specifies an API for developing such drivers. Also provides batch and real-time controllers which step the model and I/O through the time domain (or other discrete domain), and sample I/O drivers. This is a library framework and does not, itself, solve any problems or execute any modeling. The SeMe framework aids in the development of models which operate on sequential information, such as time series, where evaluation is based on prior results combined with new data for the current iteration. It has applications in quality monitoring and was developed as part of the CANARY-EDS software, where real-time water quality data are analyzed for anomalies.
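The description suggests a driver architecture along the following lines; this is a speculative Python sketch of the pattern (generic input/model/output drivers stepped by a controller), not the actual SeMe API, which the library defines itself:

```python
# Speculative sketch of a sequential model-evaluation pattern. Names and
# interfaces are illustrative only.
from abc import ABC, abstractmethod

class InputDriver(ABC):
    @abstractmethod
    def read(self, step):
        """Return the new observation for this step."""

class Model(ABC):
    @abstractmethod
    def update(self, step, observation):
        """Combine prior results with new data; return this step's result."""

class OutputDriver(ABC):
    @abstractmethod
    def write(self, step, result): ...

class BatchController:
    def __init__(self, source, model, sink):
        self.source, self.model, self.sink = source, model, sink

    def run(self, n_steps):
        for step in range(n_steps):
            obs = self.source.read(step)
            result = self.model.update(step, obs)   # evaluation uses prior state + new data
            self.sink.write(step, result)

# Tiny concrete demo: a running mean over a short series.
class ListInput(InputDriver):
    def __init__(self, values): self.values = values
    def read(self, step): return self.values[step]

class RunningMeanModel(Model):
    def __init__(self): self.total, self.count = 0.0, 0
    def update(self, step, observation):
        self.total += observation; self.count += 1
        return self.total / self.count

class PrintOutput(OutputDriver):
    def write(self, step, result): print(step, round(result, 3))

BatchController(ListInput([1.0, 2.0, 4.0]), RunningMeanModel(), PrintOutput()).run(3)
```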
A user-oriented and computerized model for estimating vehicle ride quality
NASA Technical Reports Server (NTRS)
Leatherwood, J. D.; Barker, L. M.
1984-01-01
A simplified empirical model and computer program for estimating passenger ride comfort within air and surface transportation systems are described. The model is based on subjective ratings from more than 3000 persons who were exposed to controlled combinations of noise and vibration in the passenger ride quality apparatus. The model has the capability of transforming individual elements of a vehicle's noise and vibration environment into subjective discomfort units and then combining the subjective units to produce a single discomfort index typifying passenger acceptance of the environment. The computational procedures required to obtain discomfort estimates are discussed, and a user-oriented ride comfort computer program is described. Examples illustrating application of the simplified model to helicopter and automobile ride environments are presented.
Infrared image enhancement using H(infinity) bounds for surveillance applications.
Qidwai, Uvais
2008-08-01
In this paper, two algorithms have been presented to enhance the infrared (IR) images. Using the autoregressive moving average model structure and H(infinity) optimal bounds, the image pixels are mapped from the IR pixel space into normal optical image space, thus enhancing the IR image for improved visual quality. Although H(infinity)-based system identification algorithms are very common now, they are not quite suitable for real-time applications owing to their complexity. However, many variants of such algorithms are possible that can overcome this constraint. Two such algorithms have been developed and implemented in this paper. Theoretical and algorithmic results show remarkable enhancement in the acquired images. This will help in enhancing the visual quality of IR images for surveillance applications.
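As a rough illustration of mapping IR pixels toward an optical-like intensity space, the sketch below replaces the paper's H-infinity-bounded ARMA identification with a plain least-squares fit of a local linear model; it is a simplified stand-in for the idea, not the authors' algorithm, and runs on synthetic images:

```python
# Simplified stand-in: learn a linear mapping from 3x3 IR neighborhoods to a
# reference optical image by ordinary least squares, then apply it to "enhance"
# the IR frame. The paper uses an ARMA structure with H-infinity optimal bounds.
import numpy as np

def patches_3x3(img):
    """Stack each interior pixel's 3x3 neighborhood into a feature row."""
    h, w = img.shape
    cols = [img[i:h - 2 + i, j:w - 2 + j].ravel() for i in range(3) for j in range(3)]
    return np.column_stack(cols)

rng = np.random.default_rng(0)
optical = rng.uniform(0, 1, (64, 64))                       # synthetic reference optical image
ir = 0.4 * optical + 0.1 * rng.normal(size=optical.shape)   # synthetic low-contrast, noisy IR image

X = np.column_stack([patches_3x3(ir), np.ones((62 * 62, 1))])   # 9 neighbors + bias
y = optical[1:-1, 1:-1].ravel()
coeffs, *_ = np.linalg.lstsq(X, y, rcond=None)

enhanced = (X @ coeffs).reshape(62, 62)
print("correlation with optical, IR vs enhanced:",
      round(np.corrcoef(ir[1:-1, 1:-1].ravel(), y)[0, 1], 3),
      round(np.corrcoef(enhanced.ravel(), y)[0, 1], 3))
```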
Using LiDAR datasets to improve HSPF water quality modeling in the Red River of the North Basin
NASA Astrophysics Data System (ADS)
Burke, M. P.; Foreman, C. S.
2013-12-01
The Red River of the North Basin (RRB), located in the lakebed of ancient glacial Lake Agassiz, comprises one of the flattest landscapes in North America. The topography of the basin, coupled with the Red River's direction of flow from south to north, results in a system that is highly susceptible to flooding. The magnitude and frequency of flood events in the RRB have prompted several multijurisdictional projects and mitigation efforts. In response to the devastating 1997 flood, a task force sponsored by the International Joint Commission established the need for accurate elevation data to help improve flood forecasting and better understand risks. This led to the International Water Institute's Red River Basin Mapping Initiative and the acquisition of LiDAR data for the entire US portion of the RRB. The resulting 1-meter bare-earth digital elevation models have been used to improve hydraulic and hydrologic modeling within the RRB, with a focus on flood prediction and mitigation. More recently, these LiDAR datasets have been incorporated into Hydrological Simulation Program-FORTRAN (HSPF) model applications to improve water quality predictions in the MN portion of the RRB. RESPEC is currently building HSPF model applications for five of MN's 8-digit HUC watersheds draining to the Red River: the Red Lake River, Clearwater River, Sandhill River, Two Rivers, and Tamarac River watersheds. This work is being conducted for the Minnesota Pollution Control Agency (MPCA) as part of MN's statewide watershed approach to restoring and protecting water. The HSPF model applications simulate hydrology (discharge, stage), as well as a number of water quality constituents (sediment, temperature, organic and inorganic nitrogen, total ammonia, organic and inorganic phosphorus, dissolved oxygen and biochemical oxygen demand, and algae), continuously for the period 1995-2009 and are formulated to provide predictions at points of interest within the watersheds, such as observation gages, management boundaries, compliance points, and impaired water body endpoints. Incorporation of the LiDAR datasets has been critical to representing the topographic characteristics that affect hydrologic and water quality processes in the extremely flat, heavily drained sub-basins of the RRB. Beyond providing more detailed elevation and slope measurements, the high-resolution LiDAR datasets have helped to identify drainage alterations due to agricultural practices, as well as improve the representation of channel geometry. Additionally, when available, LiDAR-based hydraulic models completed as part of the RRB flood mitigation efforts are incorporated to further improve flow routing. The MPCA will ultimately use these HSPF models to aid in Total Maximum Daily Load (TMDL) development, permit development/compliance, analysis of Best Management Practice (BMP) implementation scenarios, and other watershed planning and management objectives. LiDAR datasets are an essential component of the water quality models built for the watersheds within the RRB and would greatly benefit water quality modeling efforts in similarly characterized areas.
Process Improvements in Training Device Acceptance Testing: A Study in Total Quality Management
1990-12-12
Quality Management, a small group of Government and industry specialists examined the existing training device acceptance test process for potential improvements. The agreed-to mission of the Air Force/Industry partnership was to continuously identify and promote implementable approaches to minimize the cost and time required for acceptance testing while ensuring that validated performance supports the user training requirements. Application of a Total Quality process improvement model focused on the customers and their requirements, analyzed how work was accomplished, and
Developing a quality assurance program for online services.
Humphries, A W; Naisawald, G V
1991-01-01
A quality assurance (QA) program provides not only a mechanism for establishing training and competency standards, but also a method for continuously monitoring current service practices to correct shortcomings. The typical QA cycle includes these basic steps: select subject for review, establish measurable standards, evaluate existing services using the standards, identify problems, implement solutions, and reevaluate services. The Claude Moore Health Sciences Library (CMHSL) developed a quality assurance program for online services designed to evaluate services against specific criteria identified by research studies as being important to customer satisfaction. These criteria include reliability, responsiveness, approachability, communication, and physical factors. The application of these criteria to the library's existing online services in the quality review process is discussed with specific examples of the problems identified in each service area, as well as the solutions implemented to correct deficiencies. The application of the QA cycle to an online services program serves as a model of possible interventions. The use of QA principles to enhance online service quality can be extended to other library service areas. PMID:1909197
Modeling of liquid flow in surface discontinuities
NASA Astrophysics Data System (ADS)
Lobanova, I. S.; Meshcheryakov, V. A.; Kalinichenko, A. N.
2018-01-01
Polymer composite and metallic materials have found wide application in various industries such as aviation, rocketry, car manufacturing, and shipbuilding. Many design elements require ongoing quality control. Ensuring high quality and reliability of products is impossible without effective nondestructive testing methods. One such method is penetrant testing, which relies on the penetration of liquid into defect cavities. In this paper, we propose a model of liquid flow to determine the rates at which defect cavities are filled by various materials and, based on this, to choose optimal testing modes.
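A classical starting point for such filling models is the Lucas-Washburn relation for capillary penetration, L(t) = sqrt(gamma * r * cos(theta) * t / (2 * mu)). Whether the authors use this exact form is not stated, so the sketch below is only a generic illustration of how penetrant properties set the filling rate, with made-up parameter values:

```python
# Generic Lucas-Washburn estimate of penetration depth into a capillary-like
# defect; the model proposed in the paper may differ. Parameter values are
# illustrative only.
import math

def washburn_depth(surface_tension, contact_angle_deg, radius, viscosity, t):
    """Penetration depth L(t) = sqrt(gamma * r * cos(theta) * t / (2 * mu))."""
    cos_theta = math.cos(math.radians(contact_angle_deg))
    return math.sqrt(surface_tension * radius * cos_theta * t / (2.0 * viscosity))

# Rough dye-penetrant values: gamma ~ 0.03 N/m, mu ~ 0.005 Pa*s, crack radius ~ 1 um
for t in (1.0, 10.0, 60.0):  # dwell time in seconds
    depth_mm = 1e3 * washburn_depth(0.03, 10.0, 1e-6, 0.005, t)
    print(f"t = {t:5.1f} s  ->  depth ~ {depth_mm:.2f} mm")
```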
NASA Astrophysics Data System (ADS)
Jomaa, Seifeddine; Jiang, Sanyuan; Yang, Xiaoqiang; Rode, Michael
2016-04-01
Eutrophication is a serious environmental problem. Despite numerous experimental and modelling efforts, the effect of land use and agricultural practices on in-stream nitrogen fluxes is still not fully understood. This study combined intensive field monitoring and numerical modelling using 30 years of surface water quality data from a drinking water reservoir catchment in central Germany. The Weida catchment (99.5 km2) is part of the Elbe river basin and has a share of 67% agricultural land use, with significant changes in agricultural practices within the investigation period. The geology of the Weida catchment is characterized by clay schists and eruptive rocks of low permeability. The semi-distributed hydrological water quality model HYPE (Hydrological Predictions for the Environment) was used to reproduce the measured data. First, the model was calibrated for discharge and nitrate-N concentrations (NO3-N) during the period 1997-2000. Then, the HYPE model was validated successfully for three different periods, 1983-1987, 1989-1996 and 2000-2003, which are characterized by different fertilizer application rates (with a lowest discharge prediction performance of NSE = 0.78 and PBIAS = 3.74%, considering calibration and validation periods). Results showed that the measured as well as the simulated in-stream nitrate-N concentrations respond quickly to changes (increases/decreases) in fertilizer application. This rapid response can be explained by the short residence times of the interflow and baseflow runoff components due to the hard-rock geological properties of the catchment. Results revealed that surface runoff and interflow are the dominant runoff components. The HYPE model could reproduce the NO3-N daily loads reasonably well for varying fertilizer application when detailed input data on crop management (a field-specific survey) are considered.
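The two performance metrics quoted, NSE and PBIAS, have standard definitions that can be computed directly from paired observed and simulated series; a minimal sketch with synthetic data (sign conventions for PBIAS vary between authors, the common one is used here):

```python
# Goodness-of-fit metrics used above: Nash-Sutcliffe efficiency (NSE) and
# percent bias (PBIAS). NSE = 1 is a perfect fit; PBIAS near 0% means little
# systematic over- or under-estimation.
import numpy as np

def nse(obs, sim):
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

def pbias(obs, sim):
    # Positive values indicate underestimation with this sign convention.
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    return 100.0 * np.sum(obs - sim) / np.sum(obs)

observed  = [1.2, 1.5, 2.1, 3.0, 2.4, 1.8]   # e.g. daily discharge (m3/s), synthetic
simulated = [1.1, 1.6, 2.0, 2.8, 2.5, 1.9]
print(f"NSE = {nse(observed, simulated):.2f}, PBIAS = {pbias(observed, simulated):.2f}%")
```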
The Bobath concept - a model to illustrate clinical practice.
Michielsen, Marc; Vaughan-Graham, Julie; Holland, Ann; Magri, Alba; Suzuki, Mitsuo
2017-12-17
The model of Bobath clinical practice provides a framework identifying the unique aspects of the Bobath concept in terms of contemporary neurological rehabilitation. The utilisation of a framework to illustrate the clinical application of the Bobath concept provides the basis for a common understanding with respect to Bobath clinical practice, education, and research. The development process culminating in the model of Bobath clinical practice is described. The use of the model in clinical practice is illustrated using two cases: a client with a chronic incomplete spinal cord injury and a client with a stroke. This article describes the clinical application of the Bobath concept in terms of the integration of posture and movement with respect to the quality of task performance, applying the Model of Bobath Clinical Practice. Facilitation, a key aspect of Bobath clinical practice, was utilised to positively affect motor control and perception in two clients with impairment-related movement problems due to neurological pathology and associated activity limitations and participation restrictions - the outcome measures used to reflect the individual clinical presentation. Implications for Rehabilitation The model of Bobath clinical practice provides a framework identifying the unique aspects of the Bobath-concept. The model of Bobath clinical practice provides the basis for a common understanding with respect to Bobath clinical practice, education, and research. The clinical application of the Bobath-concept highlights the integration of posture and movement with respect to the quality of task performance. Facilitation, a key aspect of Bobath clinical practice, positively affects motor control, and perception.
A communication library for the parallelization of air quality models on structured grids
NASA Astrophysics Data System (ADS)
Miehe, Philipp; Sandu, Adrian; Carmichael, Gregory R.; Tang, Youhua; Dăescu, Dacian
PAQMSG is an MPI-based, Fortran 90 communication library for the parallelization of air quality models (AQMs) on structured grids. It consists of distribution, gathering and repartitioning routines for different domain decompositions implementing a master-worker strategy. The library is architecture and application independent and includes optimization strategies for different architectures. This paper presents the library from a user perspective. Results are shown from the parallelization of STEM-III on Beowulf clusters. The PAQMSG library is available on the web. The communication routines are easy to use, and should allow for an immediate parallelization of existing AQMs. PAQMSG can also be used for constructing new models.
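The master-worker pattern the library implements can be illustrated with mpi4py: the sketch below distributes column blocks of a 2-D grid from rank 0 and gathers the processed blocks back. It mimics the distribution/gathering pattern only and is not the PAQMSG interface, which is Fortran 90.

```python
# Illustrative master-worker domain decomposition with mpi4py
# (run with e.g. `mpiexec -n 4 python decomp.py`). Not the PAQMSG API.
from mpi4py import MPI
import numpy as np

comm = MPI.COMM_WORLD
rank, size = comm.Get_rank(), comm.Get_size()

if rank == 0:
    grid = np.arange(8 * 8, dtype=float).reshape(8, 8)   # master holds the full domain
    blocks = np.array_split(grid, size, axis=1)           # decompose along columns
else:
    blocks = None

local = comm.scatter(blocks, root=0)     # distribution: each worker gets a sub-domain
local = local * 2.0                      # stand-in for local chemistry/transport work
gathered = comm.gather(local, root=0)    # gathering: master reassembles the domain

if rank == 0:
    result = np.concatenate(gathered, axis=1)
    print("reassembled domain matches expected:", np.allclose(result, 2.0 * grid))
```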
Sultan, Torky; Khedr, Ayman E; Sayed, Mostafa
2013-01-01
Defect tracking systems play an important role in software development organizations, as they can store historical information about defects. There has been much research on defect tracking models and systems to enhance their tracking capabilities and adapt them to new technologies. Furthermore, different studies have classified bugs in a step-by-step manner to provide a clear perception of, and an applicable method for, detecting such bugs. This paper presents a new defect tracking model that classifies submitted defect reports in a step-by-step manner to further enhance software quality.
2009-04-08
The Food and Drug Administration (FDA) is announcing the availability of a guidance entitled "Q10 Pharmaceutical Quality System." The guidance was prepared under the auspices of the International Conference on Harmonisation of Technical Requirements for Registration of Pharmaceuticals for Human Use (ICH). The guidance describes a model for an effective quality management system for the pharmaceutical industry, referred to as the Pharmaceutical Quality System. The guidance is intended to provide a comprehensive approach to an effective pharmaceutical quality system that is based on International Organization for Standardization (ISO) concepts, includes applicable good manufacturing practice (GMP) regulations and complements ICH guidances on "Q8 Pharmaceutical Development" and "Q9 Quality Risk Management."
Wang, Xin; Su, Xia; Sun, Wentao; Xie, Yanming; Wang, Yongyan
2011-10-01
In post-marketing studies of traditional Chinese medicine (TCM), pharmacoeconomic evaluation has important applied significance. However, the economic literature on TCM has been unable to fully and accurately reflect the unique overall outcomes of treatment with TCM. Given the special nature of TCM, we recommend that the Markov model be introduced into the post-marketing pharmacoeconomic evaluation of TCM, and we also explore the feasibility of its application. The Markov model can extrapolate the study time horizon, is compatible with the effectiveness indicators of TCM, and provides a measurable comprehensive outcome. In addition, the Markov model can promote the development of TCM quality-of-life scales and the methodology of post-marketing pharmacoeconomic evaluation.
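A minimal sketch of the kind of Markov cohort model being recommended; the three health states, transition probabilities, cycle count, and utility values are purely illustrative:

```python
# Toy three-state Markov cohort model (well / ill / dead) of the sort used in
# pharmacoeconomic evaluation. All numbers are hypothetical.
import numpy as np

P = np.array([[0.85, 0.10, 0.05],    # transitions from "well"
              [0.20, 0.70, 0.10],    # transitions from "ill"
              [0.00, 0.00, 1.00]])   # "dead" is absorbing
utilities = np.array([0.90, 0.60, 0.0])   # quality-of-life weight per state per cycle

state = np.array([1.0, 0.0, 0.0])    # whole cohort starts in "well"
qalys = 0.0
for cycle in range(20):              # extrapolate beyond the trial horizon
    qalys += float(state @ utilities)
    state = state @ P                # advance the cohort one cycle

print(f"expected QALYs over 20 cycles: {qalys:.2f}")
```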
Face Processing: Models For Recognition
NASA Astrophysics Data System (ADS)
Turk, Matthew A.; Pentland, Alexander P.
1990-03-01
The human ability to process faces is remarkable. We can identify perhaps thousands of faces learned throughout our lifetime and read facial expression to understand such subtle qualities as emotion. These skills are quite robust, despite sometimes large changes in the visual stimulus due to expression, aging, and distractions such as glasses or changes in hairstyle or facial hair. Computers which model and recognize faces will be useful in a variety of applications, including criminal identification, human-computer interface, and animation. We discuss models for representing faces and their applicability to the task of recognition, and present techniques for identifying faces and detecting eye blinks.
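A well-known linear-subspace technique from this line of research represents each face as a weighted combination of "eigenfaces" obtained by principal component analysis of a training set. The sketch below illustrates that general idea on synthetic data; it is not claimed to reproduce the paper's exact procedure:

```python
# Eigenface-style sketch: PCA of a face training set, projection onto the
# leading components, and nearest-neighbour identification. Data are synthetic.
import numpy as np

rng = np.random.default_rng(0)
n_people, img_pixels, k = 10, 32 * 32, 8
faces = rng.normal(size=(n_people, img_pixels))          # stand-in for training images

mean_face = faces.mean(axis=0)
centered = faces - mean_face
_, _, vt = np.linalg.svd(centered, full_matrices=False)
eigenfaces = vt[:k]                                       # top-k principal directions

train_weights = centered @ eigenfaces.T                   # each face as k weights

probe = faces[3] + 0.1 * rng.normal(size=img_pixels)      # noisy new image of person 3
probe_weights = (probe - mean_face) @ eigenfaces.T
match = np.argmin(np.linalg.norm(train_weights - probe_weights, axis=1))
print("identified as person", match)                      # expected: 3
```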
“Summary of the Emission Inventories compiled for the ...
We present a summary of the emission inventories from the US, Canada, and Mexico developed for the second phase of the Air Quality Model Evaluation International Initiative (AQMEII). Activities in this second phase are focused on the application and evaluation of coupled meteorology-chemistry models over both North America and Europe using common emissions and boundary conditions for all modeling groups for the years 2006 and 2010. We will compare the emission inventories developed for these two years, focusing on the SO2 and NOx reductions over this period, and compare them with socio-economic data. In addition, we will highlight the differences between the inventories for the US and Canada and those used in phase 1 of this project. The National Exposure Research Laboratory (NERL) Atmospheric Modeling and Analysis Division (AMAD) conducts research in support of EPA's mission to protect human health and the environment. AMAD's research program is engaged in developing and evaluating predictive atmospheric models on all spatial and temporal scales for forecasting air quality and for assessing changes in air quality and air pollutant exposures, as affected by changes in ecosystem management and regulatory decisions. AMAD is responsible for providing a sound scientific and technical basis for regulatory policies based on air quality models to improve ambient air quality. The models developed by AMAD are being used by EPA, NOAA, and the air pollut
NASA Astrophysics Data System (ADS)
Ghodsi, Seyed Hamed; Kerachian, Reza; Estalaki, Siamak Malakpour; Nikoo, Mohammad Reza; Zahmatkesh, Zahra
2016-02-01
In this paper, two deterministic and stochastic multilateral, multi-issue, non-cooperative bargaining methodologies are proposed for urban runoff quality management. In the proposed methodologies, a calibrated Storm Water Management Model (SWMM) is used to simulate stormwater runoff quantity and quality for different urban stormwater runoff management scenarios, which have been defined considering several Low Impact Development (LID) techniques. In the deterministic methodology, the best management scenario, representing location and area of LID controls, is identified using the bargaining model. In the stochastic methodology, uncertainties of some key parameters of SWMM are analyzed using the info-gap theory. For each water quality management scenario, robustness and opportuneness criteria are determined based on utility functions of different stakeholders. Then, to find the best solution, the bargaining model is performed considering a combination of robustness and opportuneness criteria for each scenario based on utility function of each stakeholder. The results of applying the proposed methodology in the Velenjak urban watershed located in the northeastern part of Tehran, the capital city of Iran, illustrate its practical utility for conflict resolution in urban water quantity and quality management. It is shown that the solution obtained using the deterministic model cannot outperform the result of the stochastic model considering the robustness and opportuneness criteria. Therefore, it can be concluded that the stochastic model, which incorporates the main uncertainties, could provide more reliable results.
Developing a method for estimating AADT on all Louisiana roads.
DOT National Transportation Integrated Search
2015-07-01
Traffic flow volumes present key information needed for making transportation engineering and planning decisions. Accurate traffic volume counts have many applications including: roadway planning, design, air quality compliance, travel model valida...
Speed and Delay Prediction Models for Planning Applications
DOT National Transportation Integrated Search
1999-01-01
Estimation of vehicle speed and delay is fundamental to many forms of transportation planning analyses including air quality, long-range travel forecasting, major investment studies, and congestion management systems. However, existing planning...
Režek Jambrak, Anet; Šimunek, Marina; Grbeš, Franjo; Mandura, Ana; Djekic, Ilija
2018-04-01
The objective of this paper was to demonstrate the application of quality function deployment in analysing the effects of high-power ultrasound on quality properties of apple juices and nectars. In order to develop a quality function deployment model, combined with instrumental analysis of the treated samples, a field survey was performed to identify consumer preferences towards quality characteristics of juices/nectars. Based on the field research, the three most important characteristics were 'taste' and 'aroma', with 28.5% of relative absolute weight importance, followed by 'odour' (16.9%). The quality function deployment model showed that the top three 'quality scores' for apple juice were the treatments with amplitude 90 µm, 9 min treatment time and sample temperature 40 °C; 60 µm, 9 min, 60 °C; and 90 µm, 6 min, 40 °C. For nectars, the top three were the treatments 120 µm, 9 min, 20 °C; 60 µm, 9 min, 60 °C; and A2.16 60 µm, 9 min, 20 °C. This type of quality model enables a more comprehensive measure of a large set of different quality parameters. Its simplicity should be understood as a practical advantage and, as such, this tool can be a part of design quality when using novel preservation technologies. © 2017 Society of Chemical Industry.
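In quality function deployment, a treatment's "quality score" can be obtained by weighting its performance against each consumer characteristic by that characteristic's relative importance. A minimal sketch follows; the taste/aroma and odour weights loosely echo the percentages quoted above (read here as per-characteristic weights, which may not match the authors' exact scheme), and the fourth weight and all treatment ratings are illustrative assumptions:

```python
# Minimal QFD-style scoring sketch. Weights and ratings (1-9 scale) are
# illustrative and do not reproduce the paper's house of quality.
import numpy as np

characteristics = ["taste", "aroma", "odour", "other"]
importance = np.array([0.285, 0.285, 0.169, 0.261])   # normalized relative weights (sum to 1)

treatments = ["90 um, 9 min, 40 C", "60 um, 9 min, 60 C", "90 um, 6 min, 40 C"]
ratings = np.array([[8, 7, 6, 7],    # how well each treatment preserves each characteristic
                    [7, 8, 7, 6],
                    [7, 7, 6, 8]])

scores = ratings @ importance
for name, score in sorted(zip(treatments, scores), key=lambda t: -t[1]):
    print(f"{name:20s} quality score = {score:.2f}")
```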
Principles of continuous quality improvement applied to intravenous therapy.
Dunavin, M K; Lane, C; Parker, P E
1994-01-01
Documentation of the application of the principles of continuous quality improvement (CQI) to the health care setting is crucial for understanding the transition from traditional management models to CQI models. A CQI project was designed and implemented by the IV Therapy Department at Lawrence Memorial Hospital to test the application of these principles to intravenous therapy and as a learning tool for the entire organization. Through a prototype inventory project, significant savings in cost and time were demonstrated using check sheets, flow diagrams, control charts, and other statistical tools, as well as using the Plan-Do-Check-Act cycle. As a result, a primary goal, increased time for direct patient care, was achieved. Eight hours per week in nursing time was saved, relationships between two work areas were improved, and $6,000 in personnel costs, storage space, and inventory were saved.
Ying Ouyang; Prem B. Parajuli; Gary Feng; Theodor D. Leininger; Yongshan Wan; Padmanava Dash
2018-01-01
A vast amount of future climate scenario datasets, created by climate models such as general circulation models (GCMs), have been used in conjunction with watershed models to project future climate variability impact on hydrological processes and water quality. However, these low spatial-temporal resolution datasets are often difficult to downscale spatially and...
Diagnosing Alzheimer's disease: a systematic review of economic evaluations.
Handels, Ron L H; Wolfs, Claire A G; Aalten, Pauline; Joore, Manuela A; Verhey, Frans R J; Severens, Johan L
2014-03-01
The objective of this study is to systematically review the literature on economic evaluations of interventions for the early diagnosis of Alzheimer's disease (AD) and related disorders and to describe their general and methodological characteristics. We focused on the diagnostic aspects of the decision models to assess the applicability of existing decision models for the evaluation of the recently revised diagnostic research criteria for AD. PubMed and the National Institute for Health Research Economic Evaluation database were searched for English-language publications related to economic evaluations of diagnostic technologies. Trial-based economic evaluations were assessed using the Consensus on Health Economic Criteria list. Modeling studies were assessed using the framework for quality assessment of decision-analytic models. The search retrieved 2109 items, from which eight decision-analytic modeling studies and one trial-based economic evaluation met all eligibility criteria. Diversity among the study objectives and characteristics was considerable and, despite considerable methodological quality, several flaws were identified. Recommendations focused on diagnostic aspects and the applicability of existing models for the evaluation of the recently revised diagnostic research criteria for AD. Copyright © 2014 The Alzheimer's Association. Published by Elsevier Inc. All rights reserved.