[An object-oriented intelligent engineering design approach for lake pollution control].
Zou, Rui; Zhou, Jing; Liu, Yong; Zhu, Xiang; Zhao, Lei; Yang, Ping-Jian; Guo, Huai-Cheng
2013-03-01
To address the shortcomings of traditional lake pollution control engineering techniques, this study proposes a new engineering approach based on object-oriented intelligent design (OOID). It can provide a new methodology and framework for effectively controlling lake pollution and improving water quality. The differences between traditional engineering techniques and the OOID approach were compared. The key points of OOID were described as the object perspective, a cause-and-effect foundation, point-to-surface extension, and temporal and spatial optimization. Blue-green algae control in a lake was taken as an example. The effects of algae control and water quality improvement were analyzed in detail from the OOID perspective for two engineering techniques (vertical hydrodynamic mixer and pumping algaecide recharge). The modeling results showed that the traditional engineering design paradigm cannot provide scientific and effective guidance for engineering design and decision-making regarding lake pollution. In this case, the intelligent design approach is based on the object perspective and quantitative causal analysis. This approach identified that the efficiency of mixers was much higher than that of pumps in achieving low-to-moderate water quality improvement. However, when the water quality objective exceeded a certain value (such as a peak Chla control target above 100 microg x L(-1) in this experimental water), the mixers could not achieve the goal; the pump technique could, but at a higher cost. Combining the two techniques was more efficient than using either alone. Moreover, the quantitative scale control of the two engineering techniques has a significant impact on the actual project benefits and costs.
DATA QUALITY OBJECTIVES-FOUNDATION OF A SUCCESSFUL MONITORING PROGRAM
The data quality objectives (DQO) process is a fundamental site characterization tool and the foundation of a successful monitoring program. The DQO process is a systematic planning approach based on the scientific method of inquiry. The process identifies the goals of data col...
Intelligent Systems Approaches to Product Sound Quality Analysis
NASA Astrophysics Data System (ADS)
Pietila, Glenn M.
As a product market becomes more competitive, consumers become more discriminating in the way they differentiate between engineered products. The consumer often makes a purchasing decision based on the sound emitted from the product during operation, using the sound to judge quality or annoyance. Therefore, in recent years, many sound quality analysis tools have been developed to evaluate consumer preference as it relates to a product sound and to quantify this preference based on objective measurements. This understanding can be used to direct a product design process in order to help differentiate the product from competitive products or to establish an impression on consumers regarding a product's quality or robustness. The sound quality process is typically a statistical tool used to model subjective preference, or merit score, based on objective measurements, or metrics. In this way, new product developments can be evaluated in an objective manner without the laborious process of gathering a sample population of consumers for subjective studies each time. The most common model used today is Multiple Linear Regression (MLR), although non-linear Artificial Neural Network (ANN) approaches have recently gained popularity. This dissertation reviews the publicly available literature and presents additional intelligent systems approaches that can be used to improve on the current sound quality process. The focus of this work is to address shortcomings in the current paired-comparison approach to sound quality analysis. This research proposes a framework for an adaptive jury analysis approach as an alternative to the current Bradley-Terry model. The adaptive jury framework uses statistical hypothesis testing to focus on the sound pairings that are most interesting and is expected to relax some of the restrictions imposed by the Bradley-Terry model. It also provides a framework more amenable to an intelligent systems approach. Next, an unsupervised jury clustering algorithm is used to identify and classify subgroups within a jury who have conflicting preferences. In addition, a nested Artificial Neural Network (ANN) architecture is developed to predict subjective preference based on objective sound quality metrics in the presence of non-linear preferences. Finally, statistical decomposition and correlation algorithms are reviewed that can help an analyst establish a clear understanding of the variability of the product sounds used as inputs into the jury study and identify correlations between preference scores and sound quality metrics in the presence of non-linearities.
Towards the XML schema measurement based on mapping between XML and OO domain
NASA Astrophysics Data System (ADS)
Rakić, Gordana; Budimac, Zoran; Heričko, Marjan; Pušnik, Maja
2017-07-01
Measuring the quality of IT solutions is a priority in software engineering. Although numerous metrics for object-oriented code already exist, the measurement of quality in UML models and XML schemas is still developing. One research question in the broader effort behind this paper is whether established object-oriented design metrics can be applied to XML schemas through predefined mappings. In this paper, the basic ideas for such a mapping are presented. This mapping is a prerequisite for a future approach to measuring XML schema quality with object-oriented metrics.
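As a toy illustration of the mapping idea only (not the authors' metric definitions), one could read an XSD complexType as a class and count its contained element and attribute declarations, analogous to a methods-per-class metric:

```python
import xml.etree.ElementTree as ET

XS = "{http://www.w3.org/2001/XMLSchema}"  # XML Schema namespace

def complexity_per_type(xsd_path: str) -> dict:
    """Toy analogue of an OO 'members per class' metric on an XML Schema:
    complexType ~ class, contained element/attribute declarations ~ members."""
    root = ET.parse(xsd_path).getroot()
    scores = {}
    for ctype in root.iter(XS + "complexType"):
        name = ctype.get("name", "<anonymous>")
        members = list(ctype.iter(XS + "element")) + list(ctype.iter(XS + "attribute"))
        scores[name] = len(members)
    return scores
```

Any serious metric suite would also have to handle type references, inheritance (extension/restriction), and anonymous types, which this sketch ignores.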
The Educational Consequences of W. Edwards Deming.
ERIC Educational Resources Information Center
Holt, Maurice
1993-01-01
Taylorism (the rational-managerial model) still dominates U.S. education. Deming's quality and improvement concepts cut much deeper than "total quality management" externalities and differ markedly from management by objectives or outcome-based education approaches. The Deming approach is no quick fix but requires a fundamental change in…
An objective method for a video quality evaluation in a 3DTV service
NASA Astrophysics Data System (ADS)
Wilczewski, Grzegorz
2015-09-01
The following article describes a proposed objective method for 3DTV video quality evaluation, the Compressed Average Image Intensity (CAII) method. Identifying the nodes of the 3DTV service content chain enables the design of a versatile, objective video quality metric based on an advanced approach to stereoscopic videostream analysis. The mechanisms of the designed metric, as well as an evaluation of its performance under simulated environmental conditions, are discussed. The resulting CAII metric might be used effectively in a variety of service quality assessment applications.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Günther, Markus, E-mail: markus.guenther@tu-berlin.de; Geißler, Gesa; Köppel, Johann
As there is no single concept of how to precisely define and establish quality control (QC) or quality assurance (QA) in the making of environmental assessments (EA), this paper presents selected features of international approaches that address quality in the EA systems of the USA, the Netherlands, Canada, and the United Kingdom. Based on explanative case studies, we highlight the embedding of specific quality control features within the EA systems, the objectives and processes, and relevant transparency challenges. Such features of QC/QA approaches can be considered in cases where substantial quality control and assurance efforts are still missing. Yet further research needs to be conducted on the efficacy of these approaches, which remains beyond the scope of this study. - Highlights: • We present four tools for quality control and assurance from different EA systems. • Approaches vary in institutional setting, objectives, procedures, and transparency. • Highlighted features might provide guidance in cases where QC/QA is still lacking.
Quality Assurance in Higher Education: Proposals for Consultation.
ERIC Educational Resources Information Center
Higher Education Funding Council for England, Bristol.
This document sets out for consultation proposals for a revised method for quality assurance of teaching and learning in higher education. The proposals cover: (1) the objectives and principles of quality assurance; (2) an approach to quality assurance based on external audit principles; (3) the collection and publication of information; (4)…
Post Pareto optimization-A case
NASA Astrophysics Data System (ADS)
Popov, Stoyan; Baeva, Silvia; Marinova, Daniela
2017-12-01
Simulation performance may be evaluated according to multiple quality measures that are in competition, and whose simultaneous consideration poses a conflict. In the current study we propose a practical framework for investigating such simulation performance criteria, exploring the inherent conflicts amongst them and identifying the best available tradeoffs, based upon multi-objective Pareto optimization. This approach necessitates the rigorous derivation of performance criteria to serve as objective functions and undergo vector optimization. We demonstrate the effectiveness of the proposed approach by applying it to multiple stochastic quality measures. We formulate the performance criteria of this use case, pose an optimization problem, and solve it by means of a simulation-based Pareto approach. Upon attainment of the underlying Pareto frontier, we analyze it and prescribe preference-dependent configurations for optimal simulation training.
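As a concrete illustration of the core step, here is a minimal sketch (in Python, with synthetic data; not the authors' implementation) of extracting the Pareto frontier from a set of sampled configurations, assuming all quality measures are to be minimized:

```python
import numpy as np

def pareto_front(scores: np.ndarray) -> np.ndarray:
    """Return a boolean mask of non-dominated rows.

    scores: (n_points, n_objectives) array; all objectives are minimized.
    A point is dominated if another point is <= in every objective and
    strictly < in at least one.
    """
    n = scores.shape[0]
    mask = np.ones(n, dtype=bool)
    for i in range(n):
        # points that are at least as good everywhere and better somewhere
        dominates_i = (np.all(scores <= scores[i], axis=1)
                       & np.any(scores < scores[i], axis=1))
        if dominates_i.any():
            mask[i] = False
    return mask

# Toy usage: two competing quality measures over 200 random configurations
rng = np.random.default_rng(0)
scores = rng.random((200, 2))
front = scores[pareto_front(scores)]
```

The preference-dependent selection step described in the abstract would then pick one point from `front` according to the decision maker's weighting.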
An approach to detecting deliberately introduced defects and micro-defects in 3D printed objects
NASA Astrophysics Data System (ADS)
Straub, Jeremy
2017-05-01
In prior work, Zeltmann et al. demonstrated the negative impact that can be created by defects of various sizes in 3D printed objects. These defects may make the object unsuitable for its application or even present a hazard, if the object is being used for a safety-critical application. With the uses of 3D printing proliferating and consumer access to printers increasing, the desire of a nefarious individual or group to subvert the desired printing quality and safety attributes of a printer or printed object must be considered. Several different approaches to subversion may exist: attackers may physically impair the functionality of the printer or launch a cyber-attack. Detecting introduced defects, from either form of attack, is critical to maintaining public trust in 3D printed objects and the technology. This paper applies a quality assurance technology based on visible light sensing to this challenge and assesses its capability for detecting introduced defects of multiple sizes.
NASA Astrophysics Data System (ADS)
Dostal, P.; Krasula, L.; Klima, M.
2012-06-01
Various image processing techniques in multimedia technology are optimized using the visual attention feature of the human visual system (HVS). Owing to spatial non-uniformity, different locations in an image are of different importance for the perception of the image. In other words, the perceived image quality depends mainly on the quality of important locations known as regions of interest (ROI). The performance of such techniques is measured by subjective evaluation or objective image quality criteria. Many state-of-the-art objective metrics are based on HVS properties: SSIM and MS-SSIM are based on image structural information, VIF on the information that the human brain can ideally gain from the reference image, and FSIM utilizes low-level features to assign different importance to each location in the image. Still, none of these objective metrics utilizes the analysis of regions of interest. We address the question of whether these objective metrics can be used for effective evaluation of images reconstructed by processing techniques based on ROI analysis utilizing high-level features. In this paper, the authors show that the state-of-the-art objective metrics do not correlate well with subjective evaluation when demosaicing based on ROI analysis is used for reconstruction. The ROIs were computed from "ground truth" visual attention data. An algorithm combining two known demosaicing techniques on the basis of ROI location is proposed, reconstructing the ROIs in fine quality while the rest of the image is reconstructed with low quality. The color image reconstructed by this ROI approach was compared with selected demosaicing techniques by objective criteria and subjective testing. The qualitative comparison of the objective and subjective results indicates that the state-of-the-art objective metrics are still not suitable for evaluating image processing techniques based on ROI analysis, and new criteria are needed.
An Assistant for Loading Learning Object Metadata: An Ontology Based Approach
ERIC Educational Resources Information Center
Casali, Ana; Deco, Claudia; Romano, Agustín; Tomé, Guillermo
2013-01-01
In recent years, the development of repositories of learning objects has increased. Users can retrieve these resources for reuse and personalization through searches in web repositories. High quality metadata is key to successful retrieval. Learning Objects are described with metadata usually in the standard…
A Regression-Based Family of Measures for Full-Reference Image Quality Assessment
NASA Astrophysics Data System (ADS)
Oszust, Mariusz
2016-12-01
Advances in the development of imaging devices have created a need for automatic quality evaluation of displayed visual content in a way that is consistent with human visual perception. In this paper, an approach to full-reference image quality assessment (IQA) is proposed in which several IQA measures, representing different approaches to modelling human visual perception, are efficiently combined to produce an objective quality evaluation of examined images that is highly correlated with the evaluation provided by human subjects. The optimisation problem of selecting several IQA measures for creating a regression-based IQA hybrid measure, or multimeasure, is defined and solved using a genetic algorithm. Experimental evaluation on the four largest IQA benchmarks reveals that the multimeasures obtained using the proposed approach outperform state-of-the-art full-reference IQA techniques, including other recently developed fusion approaches.
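A minimal sketch of the fusion step, assuming per-image scores for several candidate measures and mean opinion scores (MOS) are available as CSV files (hypothetical file names; the paper's genetic-algorithm selection of which measures enter is omitted):

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from scipy.stats import spearmanr

# Hypothetical inputs: rows = images, columns = candidate IQA measures
# (e.g., PSNR, SSIM, FSIM scores); y = mean opinion scores from a benchmark.
X = np.loadtxt("iqa_scores.csv", delimiter=",")
y = np.loadtxt("mos.csv", delimiter=",")

model = LinearRegression().fit(X, y)   # the regression-based "multimeasure"
fused = model.predict(X)

# Rank correlation with human judgement is the usual figure of merit
rho, _ = spearmanr(fused, y)
print(f"SROCC of fused measure: {rho:.3f}")
```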
An Approach to Improve the Quality of Infrared Images of Vein-Patterns
Lin, Chih-Lung
2011-01-01
This study develops an approach to improve the quality of infrared (IR) images of vein-patterns, which usually have noise, low contrast, low brightness and small objects of interest, thus requiring preprocessing to improve their quality. The main characteristics of the proposed approach are that no prior knowledge about the IR image is necessary and no parameters must be preset. Two main goals are sought: impulse noise reduction and adaptive contrast enhancement. In our study, a fast median-based filter (FMBF) is developed as the noise reduction method. It is based on an IR imaging mechanism to detect the noisy pixels and on a modified median-based filter to remove them from IR images. FMBF has the advantage of a low computation load. In addition, FMBF retains reasonably good edge and texture information when the size of the filter window increases. Most importantly, the peak signal-to-noise ratio (PSNR) achieved by FMBF is higher than that achieved by the standard median filter. A hybrid cumulative histogram equalization (HCHE) is proposed for adaptive contrast enhancement. HCHE automatically generates a hybrid cumulative histogram (HCH) based on two different pieces of information about the image histogram, and improves the enhancement effect on hot objects rather than the background. The experimental results demonstrate that the proposed approach is feasible as an effective and adaptive process for enhancing the quality of IR vein-pattern images. PMID:22247674
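FMBF and HCHE are the paper's own algorithms; as a rough generic stand-in, a denoise-then-equalize pipeline (standard median filter plus global histogram equalization, assuming 8-bit grayscale input) looks like this:

```python
import numpy as np
from scipy.ndimage import median_filter

def equalize(img: np.ndarray) -> np.ndarray:
    """Global histogram equalization for an 8-bit grayscale image."""
    hist, _ = np.histogram(img.ravel(), bins=256, range=(0, 256))
    cdf = hist.cumsum().astype(float)
    cdf = (cdf - cdf.min()) / (cdf.max() - cdf.min())  # normalize to [0, 1]
    lut = (cdf * 255).astype(np.uint8)                 # intensity lookup table
    return lut[img]

def enhance_ir(img: np.ndarray, window: int = 3) -> np.ndarray:
    denoised = median_filter(img, size=window)  # impulse-noise suppression
    return equalize(denoised)
```

The paper's contribution is precisely where this sketch is naive: FMBF filters only pixels detected as noisy, and HCHE shapes the histogram to favor hot objects over background.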
Is a Quality Course a Worthy Course? Designing for Value and Worth in Online Courses
ERIC Educational Resources Information Center
Youger, Robin E.; Ahern, Terence C.
2015-01-01
There are many strategies for estimating the effectiveness of instruction. Typically, most methods are based on the student evaluation. Recently a more standardized approach, Quality Matters (QM), has been developed that uses an objectives-based strategy. QM, however, does not account for the learning process, nor for the value and worth of the…
High-quality slab-based intermixing method for fusion rendering of multiple medical objects.
Kim, Dong-Joon; Kim, Bohyoung; Lee, Jeongjin; Shin, Juneseuk; Kim, Kyoung Won; Shin, Yeong-Gil
2016-01-01
The visualization of multiple 3D objects has been increasingly required for recent applications in medical fields. Due to heterogeneity in data representation or configuration, it is difficult to efficiently render multiple medical objects in high quality. In this paper, we present a novel intermixing scheme for fusion rendering of multiple medical objects while preserving real-time performance. First, we present an in-slab visibility interpolation method for the representation of subdivided slabs. Second, we introduce the virtual zSlab, which extends an infinitely thin boundary (such as polygonal objects) into a slab with finite thickness. Finally, based on the virtual zSlab and in-slab visibility interpolation, we propose a slab-based visibility intermixing method with a newly proposed rendering pipeline. Experimental results demonstrate that the proposed method delivers more effective multiple-object renderings in terms of rendering quality, compared to conventional approaches. The proposed intermixing scheme also provides high-quality results for the visualization of intersecting and overlapping surfaces by resolving aliasing and z-fighting problems. Moreover, two case studies are presented that apply the proposed method to real clinical applications. These case studies demonstrate that the proposed method has the notable advantages of rendering independency and reusability. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.
Adaptive zooming in X-ray computed tomography.
Dabravolski, Andrei; Batenburg, Kees Joost; Sijbers, Jan
2014-01-01
In computed tomography (CT), the source-detector system commonly rotates around the object in a circular trajectory. Such a trajectory does not allow the detector to be fully exploited when scanning elongated objects. The aim is to increase the spatial resolution of the reconstructed image by zooming optimally during scanning. A new approach is proposed in which the full width of the detector is exploited for every projection angle. This approach is based on the use of prior information about the object's convex hull to move the source as close as possible to the object while avoiding truncation of the projections. Experiments show that the proposed approach can significantly improve reconstruction quality, producing reconstructions with smaller errors and revealing more details in the object. The proposed approach can lead to more accurate reconstructions and increased spatial resolution compared to the conventional circular trajectory.
Toward objective image quality metrics: the AIC Eval Program of the JPEG
NASA Astrophysics Data System (ADS)
Richter, Thomas; Larabi, Chaker
2008-08-01
Objective quality assessment of lossy image compression codecs is an important part of the recent call of the JPEG for Advanced Image Coding. The target of the AIC ad-hoc group is twofold: first, to receive state-of-the-art still image codecs and to propose suitable technology for standardization; and second, to study objective image quality metrics to evaluate the performance of such codecs. Even though the performance of an objective metric is defined by how well it predicts the outcome of a subjective assessment, one can also study the usefulness of a metric indirectly, in a non-traditional way, namely by measuring the subjective quality improvement of a codec that has been optimized for a specific objective metric. This approach is demonstrated here on the recently proposed HDPhoto format introduced by Microsoft and an SSIM-tuned version of it by one of the authors. We compare these two implementations with JPEG in two variations and a visually and PSNR optimal JPEG2000 implementation. To this end, we use subjective and objective tests based on the multiscale SSIM and a new DCT-based metric.
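For readers who want to reproduce this kind of scoring, a minimal sketch comparing a decoded image against its reference with SSIM and PSNR via scikit-image (hypothetical input files, assumed 8-bit grayscale arrays):

```python
import numpy as np
from skimage.metrics import structural_similarity, peak_signal_noise_ratio

ref = np.load("reference.npy")   # hypothetical reference frame
dec = np.load("decoded.npy")     # hypothetical codec output

ssim = structural_similarity(ref, dec, data_range=255)
psnr = peak_signal_noise_ratio(ref, dec, data_range=255)
print(f"SSIM={ssim:.4f}  PSNR={psnr:.2f} dB")
```

Tuning a codec "for SSIM", as described above, means using a score like this inside the encoder's rate-distortion decisions instead of mean squared error.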
No-reference video quality measurement: added value of machine learning
NASA Astrophysics Data System (ADS)
Mocanu, Decebal Constantin; Pokhrel, Jeevan; Garella, Juan Pablo; Seppänen, Janne; Liotou, Eirini; Narwaria, Manish
2015-11-01
Video quality measurement is an important component in the end-to-end video delivery chain. Video quality is, however, subjective, and thus there will always be interobserver differences in the subjective opinion about the visual quality of the same video. Despite this, most existing works on objective quality measurement typically focus only on predicting a single score and evaluate their prediction accuracy based on how close it is to the mean opinion score (or similar average-based ratings). Clearly, such an approach ignores the underlying diversity in the subjective scoring process and, as a result, does not allow further analysis of how reliable the objective prediction is in terms of subjective variability. Consequently, the aim of this paper is to analyze this issue and present a machine-learning based solution to address it. We demonstrate the utility of our ideas by considering the practical scenario of video broadcast transmissions, with focus on digital terrestrial television (DTT), and proposing a no-reference objective video quality estimator for this application. We conducted meaningful verification studies on different video content (including video clips recorded from real DTT broadcast transmissions) in order to verify the performance of the proposed solution.
Parks, Nathan A.; Gannon, Matthew A.; Long, Stephanie M.; Young, Madeleine E.
2016-01-01
Analysis of event-related potential (ERP) data includes several steps to ensure that ERPs meet an appropriate level of signal quality. One such step, subject exclusion, rejects subject data if ERP waveforms fail to meet an appropriate level of signal quality. Subject exclusion is an important quality control step in the ERP analysis pipeline as it ensures that statistical inference is based only upon those subjects exhibiting clear evoked brain responses. This critical quality control step is most often performed simply through visual inspection of subject-level ERPs by investigators. Such an approach is qualitative, subjective, and susceptible to investigator bias, as there are no standards as to what constitutes an ERP of sufficient signal quality. Here, we describe a standardized and objective method for quantifying waveform quality in individual subjects and establishing criteria for subject exclusion. The approach uses bootstrap resampling of ERP waveforms (from a pool of all available trials) to compute a signal-to-noise ratio confidence interval (SNR-CI) for individual subject waveforms. The lower bound of this SNR-CI (SNRLB) yields an effective and objective measure of signal quality as it ensures that ERP waveforms statistically exceed a desired signal-to-noise criterion. SNRLB provides a quantifiable metric of individual subject ERP quality and eliminates the need for subjective evaluation of waveform quality by the investigator. We detail the SNR-CI methodology, establish the efficacy of employing this approach with Monte Carlo simulations, and demonstrate its utility in practice when applied to ERP datasets. PMID:26903849
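A minimal sketch of the bootstrap SNR-CI idea in Python (the window positions, the RMS-based SNR definition, and the confidence level are illustrative assumptions; the paper's exact estimator may differ):

```python
import numpy as np

def snr_ci_lower(trials: np.ndarray, n_boot: int = 2000,
                 signal_win=slice(75, 150), noise_win=slice(0, 50),
                 alpha: float = 0.05, seed: int = 0) -> float:
    """Lower bound of a bootstrap SNR confidence interval for an ERP.

    trials: (n_trials, n_samples) single-trial epochs. SNR here is the RMS
    of the averaged waveform in a post-stimulus window divided by its RMS
    in a pre-stimulus (baseline) window.
    """
    rng = np.random.default_rng(seed)
    n = trials.shape[0]
    snrs = np.empty(n_boot)
    for b in range(n_boot):
        idx = rng.integers(0, n, size=n)    # resample trials with replacement
        erp = trials[idx].mean(axis=0)      # bootstrap ERP average
        signal = np.sqrt(np.mean(erp[signal_win] ** 2))
        noise = np.sqrt(np.mean(erp[noise_win] ** 2))
        snrs[b] = signal / noise
    return float(np.percentile(snrs, 100 * alpha / 2))   # SNR_LB

# Exclude a subject if snr_ci_lower(epochs) falls below a chosen criterion.
```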
NASA Astrophysics Data System (ADS)
Pua, Rizza; Park, Miran; Wi, Sunhee; Cho, Seungryong
2016-12-01
We propose a hybrid metal artifact reduction (MAR) approach for computed tomography (CT) that is computationally more efficient than a fully iterative reconstruction method while achieving image quality superior to interpolation-based in-painting techniques. Our proposed MAR method, an image-based artifact subtraction approach, utilizes an intermediate prior image reconstructed via PDART to recover the background information underlying the high-density objects. For comparison, prior images generated by the total-variation minimization (TVM) algorithm, as a realization of the fully iterative approach, were also utilized as intermediate images. The simulation and real experimental results show that PDART drastically accelerates reconstruction while producing prior images of acceptable quality. Incorporating PDART-reconstructed prior images in the proposed MAR scheme achieved higher quality images than a conventional in-painting method, and the results were comparable to fully iterative MAR using high-quality TVM prior images.
NASA Astrophysics Data System (ADS)
Maboudi, Mehdi; Amini, Jalal; Malihi, Shirin; Hahn, Michael
2018-04-01
An updated road network, as a crucial part of the transportation database, plays an important role in various applications. Thus, increasing the automation of road extraction from remote sensing images has been the subject of extensive research. In this paper, we propose an object-based road extraction approach for very high resolution (VHR) satellite images. Based on object-based image analysis, our approach incorporates various spatial, spectral, and textural object descriptors, the capabilities of a fuzzy logic system for handling uncertainties in road modelling, and the effectiveness and suitability of the ant colony algorithm for optimization of network-related problems. Four VHR optical satellite images acquired by the Worldview-2 and IKONOS satellites are used to evaluate the proposed approach. Evaluation of the extracted road networks shows that the average completeness, correctness, and quality of the results reach 89%, 93% and 83% respectively, indicating that the proposed approach is applicable for urban road extraction. We also analyzed the sensitivity of our algorithm to different ant colony optimization parameter values. Comparison of the achieved results with those of four state-of-the-art algorithms, and quantification of the robustness of the fuzzy rule set, demonstrate that the proposed approach is both efficient and transferable to other comparable images.
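The three evaluation figures quoted above are the standard matched-length ratios used in road extraction; a small sketch (illustrative helper, not the authors' evaluation code):

```python
def extraction_scores(tp: float, fp: float, fn: float) -> dict:
    """Standard road-extraction metrics from matched/unmatched lengths.

    tp: length of extracted road matched to the reference,
    fp: extracted but not in the reference, fn: reference but not extracted.
    """
    return {"completeness": tp / (tp + fn),
            "correctness": tp / (tp + fp),
            "quality": tp / (tp + fp + fn)}

# e.g. extraction_scores(tp=8.9, fp=0.7, fn=1.1)
# -> roughly the 89% / 93% / 83% regime reported above
```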
Deccache, A
1997-06-01
Health promotion and health education have often been limited to evaluation of the effectiveness of actions and programmes. However, since 1996 with the Third European Conference on Health Promotion and Education Effectiveness, many researchers have become interested in "quality assessment" and new ways of thinking have emerged. Quality assurance is a concept and activity developed in industry with the objective of increasing production efficiency. There are two distinct approaches: External Standard Inspection (ESI) and Continuous Quality Improvement (CQI). ESI involves establishing criteria of quality, evaluating them and improving whatever needs improvement. CQI views the activity or service as a process and includes the quality assessment as part of the process. This article attempts to answer the questions of whether these methods are sufficient and suitable for operationalising the concepts of evaluation, effectiveness and quality in health promotion and education, whether it is necessary to complement them with other methods, and whether the ESI approach is appropriate. The first section of the article explains that health promotion is based on various paradigms from epidemiology to psychology and anthropology. Many authors warn against the exclusive use of public health disciplines for understanding, implementing and evaluating health promotion. The author argues that in practice, health promotion: -integrates preventive actions with those aiming to maintain and improve health, a characteristic which widens the actions of health promotion from those of classic public health which include essentially an epidemiological or "risk" focus; -aims to replace vertical approaches to prevention with a global approach based on educational sciences; -involves a community approach which includes the individual in a "central position of power" as much in the definition of needs as in the evaluation of services; -includes the participation and socio-political actions which necessitate the use of varied and specific instruments for action and evaluation. With the choice of health promotion ideology, there exist corresponding theories, concepts of quality, and therefore methods and techniques that differ from those used until now. The educational sciences have led to a widening of the definition of process to include both "throughput and input", which has meant that the methods of needs analysis, objective and priority setting and project development in health promotion have become objects of quality assessment. Also, the modes of action and interaction among actors are included, which has led to evaluation of ethical and ideological aspects of projects. The second section of the article discusses quality assessment versus evaluation of effectiveness. Different paradigms of evaluation such as the public health approach based on the measurement of (epidemiological) effectiveness, social marketing and communication, and the anthropological approach are briefly discussed, pointing out that there are many approaches which can both complement and contradict one another. The author explains the difference between impact (the intermediate effects, direct or indirect, planned or not planned, changes in practical or theoretical knowledge, perceptions, and attitudes) and results (final effects of mid to long term changes such as changes in morbidity, mortality, or access to services or cost of health care). 
He argues that by being too concerned with the results of programmes, we have often ignored the issue of impact. Also, by limiting ourselves to evaluating effectiveness (i.e. that the expected effects were obtained), we ignore other possible unexpected, unplanned, positive and negative secondary effects. There are therefore many reasons to: -evaluate all possible effects rather than only those linked to objectives; -evaluate the entire process rather than only the resources, procedures and costs; -evaluate the impact rather than results; -evalu
A new approach in the development of quality management systems for (micro)electronics
NASA Astrophysics Data System (ADS)
Bacivarov, Ioan C.; Bacivarov, Angelica; Gherghina, Cǎtǎlina
2016-12-01
This paper presents a new approach to the analysis of the Quality Management Systems (QMS) of companies, based on the revised standard ISO 9001:2015. In the first part of the paper, QMS based on ISO 9001 certification are introduced, and the changes and updates proposed for the new version, ISO 9001:2015, are critically analyzed based on the documents elaborated by ISO/TC 176. The approach based on ISO 9001:2015 could be considered the "beginning of a new era in the development of quality management systems". A comparison between the "old" standard ISO 9001:2008 and the "new" standard ISO 9001:2015 is made. In the second part of the paper, the steps to be followed by a company implementing this new standard are presented. Particular attention is given to the new concept of risk-based thinking, intended to support and improve application of the process-based approach. The authors conclude that, by considering risk throughout the organization, the likelihood of achieving stated objectives is improved, output is more consistent, and customers can be confident that they will receive the expected results. Finally, the benefits of the new approach in the development of quality management systems are outlined, as well as how they are reflected in the management of companies in general and those in the electronics field in particular. As demonstrated in this paper, well understood and properly applied, the new approach based on the revised standard ISO 9001:2015 could offer better quality management for companies operating in electronics and beyond.
Australian Recognition Framework Arrangements. Australia's National Training Framework.
ERIC Educational Resources Information Center
Australian National Training Authority, Brisbane.
This document explains the objectives, principles, standards, and protocols of the Australian Recognition Framework (ARF), which is a comprehensive approach to national recognition of vocational education and training (VET) that is based on a quality-assured approach to the registration of training organizations seeking to deliver training, assess…
A new approach to the identification of Landscape Quality Objectives (LQOs) as a set of indicators.
Sowińska-Świerkosz, Barbara Natalia; Chmielewski, Tadeusz J
2016-12-15
The objective of the paper is threefold: (1) to introduce Landscape Quality Objectives (LQOs) as a set of indicators; (2) to present a method of linking social and expert opinion in the process of formulating landscape indicators; and (3) to present a methodological framework for the identification of LQOs. To implement these goals, a six-stage procedure based on landscape units was adopted: (1) GIS analysis; (2) classification; (3) social survey; (4) expert value judgement; (5) quality assessment; and (6) guidelines formulation. The essence of the research was to express the features that determine landscape quality, according to public opinion, as a set of indicators. The results showed that 80 such indicators were identified, of both a qualitative (49) and a quantitative (31) character. Among the analysed units, 60% (18 objects) featured socially expected (and expert-confirmed) levels of landscape quality, and 20% (6 objects) required overall quality improvement in terms of both public and expert opinion. The adopted procedure provides a new tool for integrating social responsibility into environmental management. The advantage of the presented method is the possibility of its application in the territories of various European countries; it is flexible enough to be based on the cartographic studies, landscape research methods, and environmental quality standards existing in a given country. Copyright © 2016 Elsevier Ltd. All rights reserved.
Multiview 3D sensing and analysis for high quality point cloud reconstruction
NASA Astrophysics Data System (ADS)
Satnik, Andrej; Izquierdo, Ebroul; Orjesek, Richard
2018-04-01
Multiview 3D reconstruction techniques enable digital reconstruction of 3D objects from the real world by fusing different viewpoints of the same object into a single 3D representation. This process is by no means trivial and the acquisition of high quality point cloud representations of dynamic 3D objects is still an open problem. In this paper, an approach for high fidelity 3D point cloud generation using low cost 3D sensing hardware is presented. The proposed approach runs in an efficient low-cost hardware setting based on several Kinect v2 scanners connected to a single PC. It performs autocalibration and runs in real-time exploiting an efficient composition of several filtering methods including Radius Outlier Removal (ROR), Weighted Median filter (WM) and Weighted Inter-Frame Average filtering (WIFA). The performance of the proposed method has been demonstrated through efficient acquisition of dense 3D point clouds of moving objects.
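Of the named filters, Radius Outlier Removal is the simplest to sketch with a k-d tree (the radius and neighbor threshold below are illustrative, not the paper's tuned values):

```python
import numpy as np
from scipy.spatial import cKDTree

def radius_outlier_removal(points: np.ndarray, radius: float = 0.05,
                           min_neighbors: int = 8) -> np.ndarray:
    """Keep points with at least `min_neighbors` other points within
    `radius` (metres, assuming Kinect-scale indoor scenes).

    points: (n, 3) array of XYZ coordinates from the fused point cloud.
    """
    tree = cKDTree(points)
    # query_ball_point returns, per point, the indices of all neighbours
    # within the radius; subtract 1 to exclude the point itself.
    counts = np.array([len(nb) - 1
                       for nb in tree.query_ball_point(points, r=radius)])
    return points[counts >= min_neighbors]
```

The weighted median and inter-frame averaging stages would then operate on the surviving points across consecutive frames.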
Fritscher, Karl; Grunerbl, Agnes; Hanni, Markus; Suhm, Norbert; Hengg, Clemens; Schubert, Rainer
2009-10-01
Currently, conventional X-ray and CT images, as well as invasive methods performed during the surgical intervention, are used to judge the local quality of a fractured proximal femur. However, these approaches are either dependent on the surgeon's experience or cannot assist diagnostic and planning tasks preoperatively. Therefore, in this work a method for the individual analysis of local bone quality in the proximal femur, based on model-based analysis of CT and X-ray images of femur specimens, is proposed. A combined representation of the shape and spatial intensity distribution of an object, together with different statistical approaches to dimensionality reduction, is used to create a statistical appearance model for assessing local bone quality in CT and X-ray images. The developed algorithms are tested and evaluated on 28 femur specimens. It is shown that the tools and algorithms presented herein are highly adequate for automatically and objectively predicting bone mineral density values as well as a biomechanical parameter of the bone that can be measured intraoperatively.
Approach to developing numeric water quality criteria for ...
Human activities on land increase nutrient loads to coastal waters, which can increase phytoplankton production and biomass and potentially cause harmful ecological effects. States can adopt numeric water quality criteria into their water quality standards to protect the designated uses of their coastal waters from eutrophication impacts. The first objective of this study was to provide an approach for developing numeric water quality criteria for coastal waters based on archived SeaWiFS ocean color satellite data. The second objective was to develop an approach for transferring water quality criteria assessments to newer ocean color satellites such as MODIS and MERIS. Spatial and temporal measures of SeaWiFS, MODIS, and MERIS chlorophyll-a (ChlRS-a, mg m-3) were resolved across Florida’s coastal waters between 1998 and 2009. Annual geometric means of SeaWiFS ChlRS-a were evaluated to determine a quantitative reference baseline from the 90th percentile of the annual geometric means. A method for transferring to multiple ocean color sensors was implemented with SeaWiFS as the reference instrument. The ChlRS-a annual geometric means for each coastal segment from MODIS and MERIS were regressed against SeaWiFS to provide a similar response among all three satellites. Standardization factors for each coastal segment were calculated based on differences between 90th percentiles from SeaWiFS to MODIS and SeaWiFS to MERIS. This transfer approach allowed for futu
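The baseline derivation described above reduces to a short computation; a sketch under the stated definitions (the per-segment data layout is an assumption):

```python
import numpy as np
from scipy.stats import gmean

def criterion_baseline(annual_chl: dict) -> float:
    """90th percentile of annual geometric means of satellite chlorophyll-a.

    annual_chl: {year: array of ChlRS-a values (mg m-3) observed over one
    coastal segment in that year}.
    """
    geo_means = [gmean(values) for values in annual_chl.values()]
    return float(np.percentile(geo_means, 90))
```

Transferring the criterion to another sensor, per the abstract, would then regress that sensor's annual geometric means against the SeaWiFS values segment by segment and adjust by the difference in 90th percentiles.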
Makeeva, I M; Moskalev, E E; Kuz'ko, E I
2010-01-01
A new method of color quality control based on spectrophotometry has been developed for dental restoration. A comparative analysis of the quality of subjective color control by trained and untrained observers was performed. Based on a comparison of the results of subjective color control and spectrophotometry, the maximum allowed color difference was set at dE = 2.8.
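The abstract does not state which colour-difference formula underlies dE; assuming the simplest CIELAB metric (CIE76), the tolerance test is a one-liner:

```python
import numpy as np

def delta_e_cie76(lab1, lab2) -> float:
    """Euclidean colour difference in CIELAB space (CIE76 dE*ab)."""
    return float(np.linalg.norm(np.asarray(lab1) - np.asarray(lab2)))

# Illustrative shade pair: a difference below the reported tolerance passes.
assert delta_e_cie76((62.0, 4.1, 12.3), (61.2, 4.9, 13.7)) < 2.8
```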
Video quality assesment using M-SVD
NASA Astrophysics Data System (ADS)
Tao, Peining; Eskicioglu, Ahmet M.
2007-01-01
Objective video quality measurement is a challenging problem in a variety of video processing applications ranging from lossy compression to printing. An ideal video quality measure should be able to mimic the human observer. We present a new video quality measure, M-SVD, to evaluate distorted video sequences based on singular value decomposition, and develop a computationally efficient approach to full-reference (FR) video quality assessment. This measure is tested on the Video Quality Experts Group (VQEG) phase I FR-TV test data set. Our experiments show that the graphical measure displays the amount of distortion as well as the distribution of error across all frames of the video sequence, while the numerical measure correlates well with perceived video quality and outperforms PSNR and other objective measures by a clear margin.
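A minimal per-frame sketch of the underlying SVD idea (the block size and the mean-based summary are simplifying assumptions; the published M-SVD uses a related but more elaborate pooling):

```python
import numpy as np

def svd_block_distance(ref: np.ndarray, dist: np.ndarray, k: int = 8):
    """Per-block singular-value distance between a reference and a
    distorted grayscale frame. Larger values indicate stronger local
    distortion; the map itself is the 'graphical' measure, and a summary
    statistic of it serves as the scalar score."""
    h, w = ref.shape
    rows, cols = h // k, w // k
    dmap = np.zeros((rows, cols))
    for i in range(rows):
        for j in range(cols):
            a = ref[i*k:(i+1)*k, j*k:(j+1)*k].astype(float)
            b = dist[i*k:(i+1)*k, j*k:(j+1)*k].astype(float)
            sa = np.linalg.svd(a, compute_uv=False)
            sb = np.linalg.svd(b, compute_uv=False)
            dmap[i, j] = np.sqrt(np.sum((sa - sb) ** 2))
    return dmap, dmap.mean()
```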
NASA Astrophysics Data System (ADS)
Wang, Y. S.; Shen, G. Q.; Xing, Y. F.
2014-03-01
Based on the artificial neural network (ANN) technique, an objective sound quality evaluation (SQE) model for the synthetical annoyance of vehicle interior noises is presented in this paper. Following the standard GB/T18697, the interior noises of a sample vehicle under different working conditions are first measured and saved in a noise database. Mathematical models for the loudness, sharpness and roughness of the measured vehicle noises are established and implemented in Matlab. Sound qualities of the vehicle interior noises are also estimated by jury tests following the anchored semantic differential (ASD) procedure. Using the objective and subjective evaluation results, an ANN-based model for synthetical annoyance evaluation of vehicle noises, the ANN-SAE model, is then developed. Finally, the ANN-SAE model is validated by verification tests using the leave-one-out algorithm. The results suggest that the proposed ANN-SAE model is accurate and effective and can be directly used to estimate the sound quality of vehicle interior noises, which is very helpful for vehicle acoustical design and improvement. The ANN-SAE approach may be extended to other sound-related fields for product quality evaluation in SQE engineering.
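In outline, the model maps psychoacoustic metrics to jury annoyance scores; a compact sketch with scikit-learn (file names, network size, and preprocessing are assumptions, not the paper's configuration):

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import LeaveOneOut, cross_val_predict

# Hypothetical data: psychoacoustic metrics per recording and jury scores
X = np.load("metrics.npy")     # columns: loudness, sharpness, roughness
y = np.load("annoyance.npy")   # ASD jury-test annoyance ratings

ann = MLPRegressor(hidden_layer_sizes=(8,), max_iter=5000, random_state=0)
# Leave-one-out verification, mirroring the paper's validation step
pred = cross_val_predict(ann, X, y, cv=LeaveOneOut())
print("correlation with jury scores:", np.corrcoef(pred, y)[0, 1])
```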
NASA Astrophysics Data System (ADS)
Sahoo, Madhumita; Sahoo, Satiprasad; Dhar, Anirban; Pradhan, Biswajeet
2016-10-01
Groundwater vulnerability assessment has been an accepted practice for identifying zones with relatively increased potential for groundwater contamination. DRASTIC is the most popular secondary-information-based vulnerability assessment approach. The original DRASTIC approach considers the relative importance of features/sub-features through subjective weighting/rating values; however, the variability of features at smaller scales is not reflected in this subjective assessment process. In contrast, objective weighting-based methods provide flexibility in weight assignment depending on the variation of the local system, but experts' opinion is not directly considered in them. Thus the effectiveness of both subjective and objective weighting-based approaches needs to be evaluated. In the present study, three methods - the entropy information method (E-DRASTIC), the fuzzy pattern recognition method (F-DRASTIC) and single-parameter sensitivity analysis (SA-DRASTIC) - were used to modify the weights of the original DRASTIC features to include local variability. Moreover, a grey incidence analysis was used to evaluate the relative performance of the subjective (DRASTIC and SA-DRASTIC) and objective (E-DRASTIC and F-DRASTIC) weighting-based methods. The performance of the developed methodology was tested on an urban area of Kanpur City, India. The relative performance of the subjective and objective methods varies with the choice of water quality parameters. The methodology can be applied with or without suitable modification, and these evaluations establish its potential applicability for general vulnerability assessment in urban contexts.
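Of the three methods, the entropy information method is the most self-contained to illustrate; a sketch of entropy-based feature weighting (generic formulation; the paper's exact normalization may differ):

```python
import numpy as np

def entropy_weights(X: np.ndarray) -> np.ndarray:
    """Entropy information method: objective feature weights from data spread.

    X: (n_cells, n_features) matrix of rated DRASTIC features, all positive.
    Features whose values vary more across the study area carry more
    information and therefore receive larger weights.
    """
    n = X.shape[0]
    p = X / X.sum(axis=0)                       # column-wise proportions
    with np.errstate(divide="ignore", invalid="ignore"):
        plogp = np.where(p > 0, p * np.log(p), 0.0)
    e = -plogp.sum(axis=0) / np.log(n)          # entropy per feature, in [0, 1]
    d = 1.0 - e                                 # degree of diversification
    return d / d.sum()                          # normalized weights
```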
Engberg, Lovisa; Forsgren, Anders; Eriksson, Kjell; Hårdemark, Björn
2017-06-01
To formulate convex planning objectives of treatment plan multicriteria optimization with explicit relationships to the dose-volume histogram (DVH) statistics used in plan quality evaluation. Conventional planning objectives are designed to minimize the violation of DVH statistics thresholds using penalty functions. Although successful in guiding the DVH curve towards these thresholds, conventional planning objectives offer limited control of the individual points on the DVH curve (doses-at-volume) used to evaluate plan quality. In this study, we abandon the usual penalty-function framework and propose planning objectives that more closely relate to DVH statistics. The proposed planning objectives are based on mean-tail-dose, resulting in convex optimization. We also demonstrate how to adapt a standard optimization method to the proposed formulation in order to obtain a substantial reduction in computational cost. We investigated the potential of the proposed planning objectives as tools for optimizing DVH statistics through juxtaposition with the conventional planning objectives on two patient cases. Sets of treatment plans with differently balanced planning objectives were generated using either the proposed or the conventional approach. Dominance in the sense of better distributed doses-at-volume was observed in plans optimized within the proposed framework. The initial computational study indicates that the DVH statistics are better optimized and more efficiently balanced using the proposed planning objectives than using the conventional approach. © 2017 American Association of Physicists in Medicine.
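Although the abstract gives no formulas, mean-tail-dose is conventionally the average dose in the hottest (or coldest) fraction v of a structure, the dose-domain analogue of conditional value-at-risk; under that reading it admits the convex formulation below (a sketch with assumed notation, not the paper's exact definitions):

```latex
% Upper mean-tail-dose of the hottest fraction v of a structure with
% voxel doses d_1, ..., d_n (Rockafellar--Uryasev-style formulation):
\mathrm{MTD}^{+}_{v}(d) \;=\; \min_{t \in \mathbb{R}}
\Bigl\{\, t + \frac{1}{v\,n} \sum_{i=1}^{n} \max\bigl(0,\; d_i - t\bigr) \Bigr\}
% This is convex in d and upper-bounds the dose-at-volume D_v,
\qquad \mathrm{MTD}^{+}_{v}(d) \;\ge\; D_{v}(d),
% so penalizing MTD_v^+ drives the corresponding DVH point below threshold.
```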
NASA Astrophysics Data System (ADS)
Hannachi, Ammar; Kohler, Sophie; Lallement, Alex; Hirsch, Ernest
2015-04-01
3D modeling of scene contents is taking on increasing importance for many computer vision based applications. In particular, industrial applications of computer vision require efficient tools for computing this 3D information. Stereo-vision is routinely used as a powerful technique to obtain the 3D outline of imaged objects from the corresponding 2D images; as a consequence, it provides only a poor and partial description of the scene contents. On the other hand, structured-light-based reconstruction techniques can often compute the 3D surfaces of imaged objects with high accuracy, but the resulting active range data fail to characterize the object edges. Thus, in order to benefit from the strengths of both acquisition techniques, we introduce in this paper promising approaches enabling complete 3D reconstruction based on the cooperation of two complementary acquisition and processing techniques, in our case stereoscopic and structured-light-based methods, providing two 3D data sets describing respectively the outlines and surfaces of the imaged objects. We present, accordingly, the principles of three fusion techniques and their comparison based on evaluation criteria related to the nature of the workpiece and the type of application tackled. The proposed fusion methods rely on geometric characteristics of the workpiece, which improve the quality of the registration. Further, the results obtained demonstrate that the developed approaches are well adapted for 3D modeling of manufactured parts including free-form surfaces and, consequently, for quality control applications using these 3D reconstructions.
Fast Object Motion Estimation Based on Dynamic Stixels.
Morales, Néstor; Morell, Antonio; Toledo, Jonay; Acosta, Leopoldo
2016-07-28
The stixel world is a simplification of the world in which obstacles are represented as vertical instances, called stixels, standing on a surface assumed to be planar. In this paper, previous approaches for stixel tracking are extended using a two-level scheme. In the first level, stixels are tracked by matching them between frames using a bipartite graph in which edges represent a matching cost function. Then, stixels are clustered into sets representing objects in the environment. These objects are matched based on the number of stixels paired inside them. Furthermore, a faster, but less accurate approach is proposed in which only the second level is used. Several configurations of our method are compared to an existing state-of-the-art approach to show how our methodology outperforms it in several areas, including an improvement in the quality of the depth reconstruction.
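The first-level association lends itself to a standard assignment solver; a sketch (the stixel attributes, cost terms, and weights here are illustrative assumptions, not the paper's exact formulation):

```python
import numpy as np
from scipy.optimize import linear_sum_assignment

def match_stixels(prev: np.ndarray, curr: np.ndarray,
                  w_pos: float = 1.0, w_height: float = 0.5,
                  w_depth: float = 2.0) -> list:
    """Frame-to-frame stixel association via minimum-cost bipartite matching.

    prev, curr: (n, 3) arrays of (column, height, depth) per stixel.
    """
    # Pairwise matching cost between every previous and current stixel
    cost = (w_pos * np.abs(prev[:, None, 0] - curr[None, :, 0])
            + w_height * np.abs(prev[:, None, 1] - curr[None, :, 1])
            + w_depth * np.abs(prev[:, None, 2] - curr[None, :, 2]))
    rows, cols = linear_sum_assignment(cost)   # Hungarian-style solver
    return list(zip(rows, cols))
```

The second level described above would then group matched stixels into objects and associate objects by the number of paired stixels they share.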
Li, Mingjie; Zhou, Ping; Wang, Hong; ...
2017-09-19
As one of the most important units in the papermaking industry, the high consistency (HC) refining system is confronted with challenges such as improving pulp quality, energy saving, and emissions reduction in its operation processes. In this correspondence, an optimal operation of the HC refining system is presented using nonlinear multiobjective model predictive control strategies that aim at a set-point tracking objective for pulp quality, an economic objective, and a specific energy (SE) consumption objective, respectively. First, sets of input and output data at different times are employed to construct the subprocess models of the state process model for the HC refining system; the Wiener-type model is then obtained by combining the mechanism model of Canadian Standard Freeness with the state process model, whose structures are determined using the Akaike information criterion. Second, a multiobjective optimization strategy that simultaneously optimizes the pulp-quality set-point tracking objective and SE consumption is proposed, using the NSGA-II approach to obtain the Pareto optimal set. Furthermore, targeting the set-point tracking objective of pulp quality, the economic objective, and the SE consumption objective, the sequential quadratic programming method is utilized to produce the optimal predictive controllers. The simulation results demonstrate that, with these predictive controllers, the proposed methods enable the HC refining system to achieve better set-point tracking performance for pulp quality. In addition, the optimal predictive controllers oriented toward the comprehensive economic objective and the SE consumption objective have been shown to significantly reduce energy consumption.
Using Clinical Data Standards to Measure Quality: A New Approach.
D'Amore, John D; Li, Chun; McCrary, Laura; Niloff, Jonathan M; Sittig, Dean F; McCoy, Allison B; Wright, Adam
2018-04-01
Value-based payment for care requires the consistent, objective calculation of care quality. Previous initiatives to calculate ambulatory quality measures have relied on billing data or individual electronic health records (EHRs) to calculate and report performance. New methods for quality measure calculation promoted by federal regulations allow qualified clinical data registries to report quality outcomes based on data aggregated across facilities and EHRs using interoperability standards. This research evaluates the use of clinical document interchange standards as the basis for quality measurement. Using data on 1,100 patients from 11 ambulatory care facilities and 5 different EHRs, challenges to quality measurement are identified and addressed for 17 certified quality measures. Iterative solutions were identified for 14 measures that improved patient inclusion and measure calculation accuracy. Findings validate this approach to improving measure accuracy while maintaining measure certification. Organizations that report care quality should be aware of how identified issues affect quality measure selection and calculation. Quality measure authors should consider increasing real-world validation and the consistency of measure logic in respect to issues identified in this research. Schattauer GmbH Stuttgart.
Regional Principal Color Based Saliency Detection
Lou, Jing; Ren, Mingwu; Wang, Huan
2014-01-01
Saliency detection is widely used in many visual applications like image segmentation, object recognition and classification. In this paper, we introduce a new method to detect salient objects in natural images. The approach is based on a regional principal color contrast model, which incorporates low-level and medium-level visual cues. The method combines a simple computation of color features with two categories of spatial relationships to form a saliency map, achieving higher F-measure rates. At the same time, we present an interpolation approach to evaluate the resulting curves and analyze parameter selection. Our method enables effective computation on images of arbitrary resolution. Experimental results on a saliency database show that our approach produces high quality saliency maps and performs favorably against ten saliency detection algorithms. PMID:25379960
Behmel, S; Damour, M; Ludwig, R; Rodriguez, M J
2018-07-15
Water quality monitoring programs (WQMPs) must be based on monitoring objectives originating from the real knowledge needs of all stakeholders in a watershed and users of the resource. This paper proposes a participative approach to elicit knowledge needs and preferred modes of communication from citizens and representatives of organized stakeholders (ROS) on water quality and quantity issues. The participative approach includes six steps and is adaptable and transferable to different types of watersheds. These steps are: (1) perform a stakeholder analysis; (2) conduct an adaptable survey accompanied by a user-friendly public participation geographical information system (PPGIS); (3) hold workshops to meet with ROS to inform them of the results of the survey and PPGIS; discuss attainment of past monitoring objectives; exchange views on new knowledge needs and concerns on water quality and quantity; (4) meet with citizens to obtain the same type of input (as from ROS); (5) analyze the data and information collected to identify new knowledge needs and modes of communication and (6) identify, in collaboration with the individuals in charge of the WQMPs, the short-, medium- and long-term monitoring objectives and communication strategies to be pursued. The participative approach was tested on two distinct watersheds in the province of Quebec, Canada. It resulted in a series of optimization objectives of the existing WQMPs, new monitoring objectives and recommendations regarding communication strategies of the WQMPs' results. The results of this study show that the proposed methodology is appreciated by all parties and that the outcomes and monitoring objectives are acceptable. We also conclude that successful integrated watershed management is a question of scale, and that every aspect of integrated watershed management needs to be adapted to the surface watershed, the groundwater watershed (aquifers) and the human catchment area. Copyright © 2018 Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Ayadi, Omar; Felfel, Houssem; Masmoudi, Faouzi
2017-07-01
The current manufacturing environment has changed from the traditional single plant to a multi-site supply chain in which multiple plants serve customer demands. In this article, a tactical multi-objective, multi-period, multi-product, multi-site supply-chain planning problem is proposed. A corresponding optimization model aiming to simultaneously minimize the total cost, maximize product quality and maximize the customer demand satisfaction level is developed. The proposed solution approach yields a front of Pareto-optimal solutions that represents the trade-offs among the different objectives. Subsequently, the analytic hierarchy process (AHP) method is applied to select the best Pareto-optimal solution according to the preferences of the decision maker. The robustness of the solutions and the proposed approach is discussed based on a sensitivity analysis and an application to a real case from the textile and apparel industry.
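The selection step described above combines Pareto filtering with AHP-derived preferences. A hedged sketch, with invented plan data and placeholder weights standing in for an AHP priority vector:

    import numpy as np

    # Candidate plans; columns: [total cost, product quality, satisfaction level]
    plans = np.array([[100., 0.80, 0.70],
                      [120., 0.90, 0.80],
                      [110., 0.70, 0.90],
                      [130., 0.85, 0.75]])
    # Negate the two maximized objectives so every column is minimized
    obj = plans * np.array([1.0, -1.0, -1.0])

    def pareto_mask(obj):
        """True where no other row is at least as good everywhere and better somewhere."""
        keep = np.ones(len(obj), dtype=bool)
        for i in range(len(obj)):
            dominated = (obj <= obj[i]).all(axis=1) & (obj < obj[i]).any(axis=1)
            keep[i] = not dominated.any()
        return keep

    mask = pareto_mask(obj)
    front = plans[mask]
    # Decision-maker weights, e.g. the priority vector an AHP comparison would yield
    weights = np.array([0.5, 0.3, 0.2])
    norm = obj[mask] / np.abs(obj[mask]).max(axis=0)   # scale-free comparison
    best_plan = front[(norm @ weights).argmin()]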
Depth map occlusion filling and scene reconstruction using modified exemplar-based inpainting
NASA Astrophysics Data System (ADS)
Voronin, V. V.; Marchuk, V. I.; Fisunov, A. V.; Tokareva, S. V.; Egiazarian, K. O.
2015-03-01
RGB-D sensors are relatively inexpensive and are commercially available off-the-shelf. However, owing to their low complexity, several artifacts are encountered in the depth map, such as holes, misalignment between the depth and color images, and a lack of sharp object boundaries. Depth maps generated by Kinect cameras also contain a significant number of missing pixels and strong noise, limiting their usability in many computer vision applications. In this paper, we present an efficient hole filling and damaged region restoration method that improves the quality of the depth maps obtained with the Microsoft Kinect device. The proposed approach is based on a modified exemplar-based inpainting and LPA-ICI filtering that exploits the correlation between color and depth values in local image neighborhoods. As a result, the edges of objects are sharpened and aligned with the objects in the color image. Several examples considered in this paper show the effectiveness of the proposed approach for the removal of large holes as well as the recovery of small regions on several test depth maps. We perform a comparative study and show that, statistically, the proposed algorithm delivers superior quality results compared to existing algorithms.
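The paper's pipeline (exemplar-based inpainting plus LPA-ICI filtering) is involved; as a minimal stand-in that exploits the same color-depth correlation, the sketch below fills holes with a joint-bilateral-style weighted average. The window radius and kernel widths are assumptions.

    import numpy as np

    def guided_depth_fill(depth, color, radius=5, sc=2.0, ss=0.1):
        """Fill depth==0 holes with an average of valid neighbors, weighted
        by spatial closeness and color similarity to the hole pixel."""
        out = depth.astype(float).copy()
        h, w = depth.shape
        for y, x in zip(*np.nonzero(depth == 0)):
            y0, y1 = max(0, y - radius), min(h, y + radius + 1)
            x0, x1 = max(0, x - radius), min(w, x + radius + 1)
            d = depth[y0:y1, x0:x1].astype(float)
            valid = d > 0
            if not valid.any():
                continue
            gy, gx = np.mgrid[y0:y1, x0:x1]
            w_sp = np.exp(-((gy - y)**2 + (gx - x)**2) / (2 * sc**2))
            dc = np.linalg.norm(color[y0:y1, x0:x1] - color[y, x], axis=2)
            w_cl = np.exp(-dc**2 / (2 * ss**2))
            wgt = w_sp * w_cl * valid
            if wgt.sum() > 0:
                out[y, x] = (wgt * d).sum() / wgt.sum()
        return out

    depth = np.random.rand(64, 64); depth[20:30, 20:30] = 0   # synthetic hole
    filled = guided_depth_fill(depth, np.random.rand(64, 64, 3))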
Object Detection Based on Template Matching through Use of Best-So-Far ABC
2014-01-01
Best-so-far ABC is a modified version of the artificial bee colony (ABC) algorithm used for optimization tasks. This algorithm is one of the swarm intelligence (SI) algorithms proposed in recent literature, where results demonstrated that best-so-far ABC can produce higher-quality solutions with faster convergence than either the ordinary ABC or the current state-of-the-art ABC-based algorithm. In this work, we apply the best-so-far ABC approach to object detection based on template matching, using the difference between the RGB-level histograms of the target object and the template object as the objective function. Results confirm that the proposed method was successful both in detecting objects and in optimizing the time used to reach the solution. PMID:24812556
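The objective function named above is simple to state. A sketch of the RGB-histogram difference, with random search standing in for the best-so-far ABC update loop (bin count and candidate count are invented):

    import numpy as np

    def hist_diff(window, template, bins=32):
        """Sum of per-channel L1 histogram distances (lower = better match)."""
        d = 0.0
        for c in range(3):
            hw, _ = np.histogram(window[..., c], bins=bins, range=(0, 1), density=True)
            ht, _ = np.histogram(template[..., c], bins=bins, range=(0, 1), density=True)
            d += np.abs(hw - ht).sum()
        return d

    rng = np.random.default_rng(0)
    image = rng.random((200, 200, 3))
    template = image[60:90, 100:130]          # the object we want to locate
    th, tw = template.shape[:2]
    # Random-search stand-in for the ABC food-source update loop
    cands = rng.integers(0, [200 - th, 200 - tw], size=(500, 2))
    best = min(cands, key=lambda p: hist_diff(image[p[0]:p[0]+th, p[1]:p[1]+tw], template))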
The scientific learning approach using multimedia-based maze game to improve learning outcomes
NASA Astrophysics Data System (ADS)
Setiawan, Wawan; Hafitriani, Sarah; Prabawa, Harsa Wara
2016-02-01
The objective of curriculum 2013 is to improve the quality of education in Indonesia, which means improving the quality of learning. A scientific approach supported by empowering media is one of the approaches promoted by curriculum 2013. This research aims to design a maze-game-based multimedia application and apply it within the scientific learning approach. The study was conducted at a vocational school, in the subject of computer networks, with two classes (experimental and control). The method used was Mixed Methods Research (MMR), which combines qualitative methods in the multimedia design with quantitative methods in the study of learning impact. A survey showed that vocational students generally like the network topology material (68%), like multimedia (74%), and in particular like interactive multimedia such as games and Flash (84%). The multimedia-based maze game developed showed good eligibility on both the media and material aspects, with scores of 84% and 82%, respectively. Student learning outcomes from the scientific learning approach with the multimedia-based maze game increased, with an average gain index of 0.58 (58%), higher than that of conventional multimedia with an average gain index of 0.41 (41%). Based on these results, the scientific learning approach using a multimedia-based maze game can improve the quality of learning and increase students' understanding. The maze-game-based learning multimedia received a positive response from the students, with a good qualification level (75%).
Models and Frameworks: A Synergistic Association for Developing Component-Based Applications
Alonso, Diego; Sánchez-Ledesma, Francisco; Sánchez, Pedro; Pastor, Juan A.; Álvarez, Bárbara
2014-01-01
The use of frameworks and components has been shown to be effective in improving software productivity and quality. However, the results in terms of reuse and standardization show a dearth of portability either of designs or of component-based implementations. This paper, which is based on the model driven software development paradigm, presents an approach that separates the description of component-based applications from their possible implementations for different platforms. This separation is supported by automatic integration of the code obtained from the input models into frameworks implemented using object-oriented technology. Thus, the approach combines the benefits of modeling applications from a higher level of abstraction than objects, with the higher levels of code reuse provided by frameworks. In order to illustrate the benefits of the proposed approach, two representative case studies that use both an existing framework and an ad hoc framework, are described. Finally, our approach is compared with other alternatives in terms of the cost of software development. PMID:25147858
Code of Federal Regulations, 2013 CFR
2013-07-01
... approach follows: 4.3A source conducts an initial series of at least three runs. The owner or operator may... Confidence Limit Approaches for Alternative Capture Efficiency Protocols and Test Methods A Appendix A to... to Subpart KK of Part 63—Data Quality Objective and Lower Confidence Limit Approaches for Alternative...
Code of Federal Regulations, 2010 CFR
2010-07-01
... approach follows: 4.3A source conducts an initial series of at least three runs. The owner or operator may... Confidence Limit Approaches for Alternative Capture Efficiency Protocols and Test Methods A Appendix A to... to Subpart KK of Part 63—Data Quality Objective and Lower Confidence Limit Approaches for Alternative...
Code of Federal Regulations, 2014 CFR
2014-07-01
... of the LCL approach follows: 4.3A source conducts an initial series of at least three runs. The owner... Confidence Limit Approaches for Alternative Capture Efficiency Protocols and Test Methods A Appendix A to... to Subpart KK of Part 63—Data Quality Objective and Lower Confidence Limit Approaches for Alternative...
Code of Federal Regulations, 2012 CFR
2012-07-01
... approach follows: 4.3A source conducts an initial series of at least three runs. The owner or operator may... Confidence Limit Approaches for Alternative Capture Efficiency Protocols and Test Methods A Appendix A to... to Subpart KK of Part 63—Data Quality Objective and Lower Confidence Limit Approaches for Alternative...
Quality assessment for color reproduction using a blind metric
NASA Astrophysics Data System (ADS)
Bringier, B.; Quintard, L.; Larabi, M.-C.
2007-01-01
This paper deals with image quality assessment, a field that nowadays plays an important role in various image processing applications. A number of objective image quality metrics, which may or may not correlate with subjective quality, have been developed during the last decade. Two categories of metrics can be distinguished: full-reference and no-reference. A full-reference metric evaluates the distortion introduced to an image with respect to a reference, whereas a no-reference approach attempts to model the judgment of image quality in a blind way. Unfortunately, a universal image quality model is not on the horizon, and empirical models established through psychophysical experimentation are generally used. In this paper, we focus only on the second category to evaluate the quality of color reproduction, introducing a blind metric based on human visual system modeling. The objective results are validated by single-media and cross-media subjective tests.
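To make the two metric categories concrete, here is a sketch contrasting a full-reference measure (PSNR) with a simple no-reference colorfulness statistic (the Hasler-Süsstrunk metric); neither is the paper's HVS-based blind metric, they are generic illustrations only.

    import numpy as np

    def psnr(ref, img):
        """Full-reference: requires the pristine original (images in [0, 1])."""
        mse = np.mean((ref.astype(float) - img.astype(float)) ** 2)
        return 10 * np.log10(1.0 / mse) if mse > 0 else np.inf

    def colorfulness(img):
        """No-reference: judges color rendition from the image alone."""
        r, g, b = img[..., 0], img[..., 1], img[..., 2]
        rg, yb = r - g, 0.5 * (r + g) - b
        return (np.sqrt(rg.std()**2 + yb.std()**2)
                + 0.3 * np.sqrt(rg.mean()**2 + yb.mean()**2))

    ref = np.random.rand(64, 64, 3)
    degraded = np.clip(ref + np.random.normal(0, 0.05, ref.shape), 0, 1)
    print(psnr(ref, degraded), colorfulness(degraded))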
Telemedicine-based system for quality management and peer review in radiology.
Morozov, Sergey; Guseva, Ekaterina; Ledikhova, Natalya; Vladzymyrskyy, Anton; Safronov, Dmitry
2018-06-01
Quality assurance is the key component of modern radiology. A telemedicine-based quality assurance system helps to overcome the "scoring" approach and makes quality control more accessible and objective. A concept for quality assurance in radiology is developed; its realization is a set of strategies, actions, and tools, the latter based on telemedicine-based peer review of 23,199 computed tomography (CT) and magnetic resonance imaging (MRI) images. The concept of the quality management system in radiology represents a chain of actions: "discrepancies evaluation - routine support - quality improvement activity - discrepancies evaluation", realized through an audit methodology, telemedicine, e-learning, and other technologies. After a year of systematic telemedicine-based peer reviews, the authors estimated that clinically significant discrepancies were detected in 6% of all cases, while clinically insignificant ones were found in 19% of cases. Problems appear most often in musculoskeletal records, where 80% of the examinations have diagnostic or technical imperfections. Routine telemedicine support and personalized e-learning improved diagnostic quality, and the level of discrepancies decreased significantly (p < 0.05). The telemedicine-based peer review system improves the effectiveness of a radiology department network. • The "scoring" approach to radiologists' performance assessment must be changed. • Telemedicine peer review and personalized e-learning significantly decrease the number of discrepancies. • Teleradiology allows linking all primary-level hospitals to a common peer review network.
Shape Optimization of Rubber Bushing Using Differential Evolution Algorithm
2014-01-01
The objective of this study is to design a rubber bushing with desired stiffness characteristics in order to achieve the desired ride quality of the vehicle. A differential evolution algorithm based approach is developed to optimize the rubber bushing by integrating a finite element code, running in batch mode, to compute the objective function values for each generation. Two case studies are given to illustrate the application of the proposed approach. Optimum shape parameters of a 2D bushing model were determined by shape optimization using the differential evolution algorithm. PMID:25276848
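A minimal sketch of the DE/rand/1/bin loop used in this kind of shape optimization; an invented analytic surrogate replaces the batch-mode finite element run here, and the targets, bounds, and DE constants are placeholders.

    import numpy as np

    rng = np.random.default_rng(1)
    target = np.array([800.0, 300.0])          # desired radial/axial stiffness (assumed)

    def stiffness(x):                          # stand-in for the batch FE solver
        r_out, height = x
        return np.array([r_out * 12.0 + height * 3.0,
                         r_out * 2.0 + height * 9.0])

    def objective(x):
        return np.sum((stiffness(x) - target) ** 2)

    lo, hi = np.array([10.0, 10.0]), np.array([80.0, 80.0])   # shape parameter bounds
    pop = lo + rng.random((20, 2)) * (hi - lo)                # NP=20 candidate shapes
    F, CR = 0.7, 0.9
    for _ in range(200):
        for i in range(len(pop)):
            a, b, c = pop[rng.choice(len(pop), 3, replace=False)]
            mutant = np.clip(a + F * (b - c), lo, hi)
            cross = rng.random(2) < CR
            cross[rng.integers(2)] = True      # guarantee at least one mutated gene
            trial = np.where(cross, mutant, pop[i])
            if objective(trial) <= objective(pop[i]):   # greedy selection
                pop[i] = trial
    best = pop[np.argmin([objective(x) for x in pop])]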
NASA Astrophysics Data System (ADS)
d'Oleire-Oltmanns, Sebastian; Marzolff, Irene; Tiede, Dirk; Blaschke, Thomas
2015-04-01
The need for area-wide landform mapping approaches, especially in terms of land degradation, can be ascribed to the fact that such approaches consider the (spatial) context of erosional landforms by providing additional information on the physiography neighboring the distinct landform. This study presents an approach for the detection of gully-affected areas by applying object-based image analysis in the region of Taroudannt, Morocco, which is highly affected by gully erosion while simultaneously representing a major region of agro-industry with a high demand for arable land. Various sensors provide readily available high-resolution optical satellite data with a much better temporal resolution than 3D terrain data, which led to the development of an area-wide mapping approach to extract gully-affected areas using only optical satellite imagery. The classification rule-set was developed with a clear focus on virtual spatial independence within the software environment of eCognition Developer, which allows the incorporation of knowledge about the target objects under investigation. Only optical QuickBird-2 satellite data and freely available OpenStreetMap (OSM) vector data were used as input; the OSM vector data were incorporated to mask out plantations and residential areas. Optical input data are more readily available to a broad range of users than terrain data, which is considered a major advantage. The methodology additionally incorporates expert knowledge and freely available vector data in a cyclic object-based image analysis approach, connecting the two fields of geomorphology and remote sensing. The classification results allow conclusions on the current distribution of gullies. The results were checked against manually delineated reference data incorporating expert knowledge from several field campaigns in the area, resulting in an overall classification accuracy of 62%, with an error of omission of 38% and an error of commission of 16%. Additionally, a manual assessment was carried out to assess the quality of the applied classification algorithm: the limited error of omission contributes 23% of the overall error of omission, and the limited error of commission contributes 98% of the overall error of commission. This assessment improves the results and confirms the high quality of the developed approach for area-wide mapping of gully-affected areas in larger regions. In the field of landform mapping, the overall quality of classification results is often assessed with more than one method in order to cover all aspects adequately.
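The reported rates follow from standard confusion counts. A small helper showing how overall accuracy and the errors of omission and commission are derived (the counts below are invented, not the study's):

    def extraction_accuracy(tp, fp, fn, tn):
        """Standard agreement measures between extracted and reference areas."""
        overall = (tp + tn) / (tp + fp + fn + tn)   # overall classification accuracy
        omission = fn / (tp + fn)                   # reference areas that were missed
        commission = fp / (tp + fp)                 # extracted areas that are wrong
        return overall, omission, commission

    # Invented counts, purely to show how the three rates are computed
    print(extraction_accuracy(tp=80, fp=15, fn=20, tn=885))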
Implementation of quality by design toward processing of food products.
Rathore, Anurag S; Kapoor, Gautam
2017-05-28
Quality by design (QbD) is a systematic approach that begins with predefined objectives and emphasizes product and process understanding and process control. It is an approach based on principles of sound science and quality risk management. As the food processing industry continues to embrace the idea of in-line, online, and/or at-line sensors and real-time characterization for process monitoring and control, the existing gaps in our ability to monitor the multiple parameters and variables associated with the manufacturing process will be alleviated over time. Investments made in the development of tools and approaches that facilitate high-throughput analytical and process development, process analytical technology, design of experiments, risk analysis, knowledge management, and enhancement of process/product understanding would pave the way for operational and economic benefits later in the commercialization process and across other product pipelines. This article aims to achieve two major objectives: first, to review the progress that has been made in recent years on QbD implementation in the processing of food products; and second, to present a case study that illustrates the benefits of such QbD implementation.
NASA Technical Reports Server (NTRS)
Strand, Albert A.; Jackson, Darryl J.
1992-01-01
As the nation redefines priorities to deal with a rapidly changing world order, both government and industry require new approaches for oversight of management systems, particularly for high technology products. Declining defense budgets will lead to significant reductions in government contract management personnel. Concurrently, defense contractors are reducing administrative and overhead staffing to control costs. These combined pressures require bold approaches for the oversight of management systems. In the Spring of 1991, the DPRO and TRW created a Process Action Team (PAT) to jointly prepare a Performance Based Management (PBM) system titled Teamwork for Oversight of Processes and Systems (TOPS). The primary goal is implementation of a performance based management system based on objective data to review critical TRW processes with an emphasis on continuous improvement. The processes are: Finance and Business Systems, Engineering and Manufacturing Systems, Quality Assurance, and Software Systems. The team established a number of goals: delivery of quality products to contractual terms and conditions; ensure that TRW management systems meet government guidance and good business practices; use of objective data to measure critical processes; elimination of wasteful/duplicative reviews and audits; emphasis on teamwork--all efforts must be perceived to add value by both sides and decisions are made by consensus; and synergy and the creation of a strong working trust between TRW and the DPRO. TOPS permits the adjustment of oversight resources when conditions change or when TRW systems performance indicates that either an increase or decrease in surveillance is appropriate. Monthly Contractor Performance Assessments (CPA) are derived from a summary of supporting system-level and process-level ratings obtained from objective process-level data. Tiered, objective, data-driven metrics are highly successful in achieving a cooperative and effective method of measuring performance. The teamwork-based culture developed by TOPS proved an unequaled success in removing adversarial relationships and creating an atmosphere of continuous improvement in quality processes at TRW. The new working relationship does not decrease the responsibility or authority of the DPRO to ensure contract compliance, and it permits both parties to work more effectively to improve total quality and reduce cost. By emphasizing teamwork in developing a stronger approach to efficient management of the defense industrial base, TOPS is a singular success.
Making Quality Health Websites a National Public Health Priority: Toward Quality Standards
Devine, Theresa; Broderick, Jordan; Harris, Linda M; Wu, Huijuan; Hilfiker, Sandra Williams
2016-01-01
Background Most US adults have limited health literacy skills. They struggle to understand complex health information and services and to make informed health decisions. The Internet has quickly become one of the most popular places for people to search for information about their health, thereby making access to quality information on the Web a priority. However, there are no standardized criteria for evaluating Web-based health information. Every 10 years, the US Department of Health and Human Services' Office of Disease Prevention and Health Promotion (ODPHP) develops a set of measurable objectives for improving the health of the nation over the coming decade, known as Healthy People. There are two objectives in Healthy People 2020 related to website quality. The first is objective Health Communication and Health Information Technology (HC/HIT) 8.1: increase the proportion of health-related websites that meet 3 or more evaluation criteria for disclosing information that can be used to assess information reliability. The second is objective HC/HIT-8.2: increase the proportion of health-related websites that follow established usability principles. Objective The ODPHP conducted a nationwide assessment of the quality of Web-based health information using the Healthy People 2020 objectives. The ODPHP aimed to establish (1) a standardized approach to defining and measuring the quality of health websites; (2) benchmarks for measurement; (3) baseline data points to capture the current status of website quality; and (4) targets to drive improvement. Methods The ODPHP developed the National Quality Health Website Survey instrument to assess the quality of health-related websites. The ODPHP used this survey to review 100 top-ranked health-related websites in order to set baseline data points for these two objectives. The ODPHP then set targets to drive improvement by 2020. Results This study reviewed 100 health-related websites. For objective HC/HIT-8.1, a total of 58 out of 100 (58.0%) websites met 3 or more out of 6 reliability criteria. For objective HC/HIT-8.2, a total of 42 out of 100 (42.0%) websites followed 10 or more out of 19 established usability principles. On the basis of these baseline data points, ODPHP set targets for the year 2020 that meet the minimal statistical significance—increasing objective HC/HIT-8.1 data point to 70.5% and objective HC/HIT-8.2 data point to 55.7%. Conclusions This research is a critical first step in evaluating the quality of Web-based health information. The criteria proposed by ODPHP provide methods to assess website quality for professionals designing, developing, and managing health-related websites. The criteria, baseline data, and targets are valuable tools for driving quality improvement. PMID:27485512
The Delphi Method: An Approach for Facilitating Evidence Based Practice in Athletic Training
ERIC Educational Resources Information Center
Sandrey, Michelle A.; Bulger, Sean M.
2008-01-01
Objective: The growing importance of evidence based practice in athletic training is necessitating academics and clinicians to be able to make judgments about the quality or lack of the body of research evidence and peer-reviewed standards pertaining to clinical questions. To assist in the judgment process, consensus methods, namely brainstorming,…
Spatial coding-based approach for partitioning big spatial data in Hadoop
NASA Astrophysics Data System (ADS)
Yao, Xiaochuang; Mokbel, Mohamed F.; Alarabi, Louai; Eldawy, Ahmed; Yang, Jianyu; Yun, Wenju; Li, Lin; Ye, Sijing; Zhu, Dehai
2017-09-01
Spatial data partitioning (SDP) plays a powerful role in distributed storage and parallel computing for spatial data. However, the skewed distribution of spatial data and the varying volume of spatial vector objects make it significantly challenging to ensure both optimal performance of spatial operations and data balance across the cluster. To tackle this problem, we propose a spatial coding-based approach for partitioning big spatial data in Hadoop. This approach first compresses the whole body of big spatial data based on a spatial coding matrix to create a sensing information set (SIS), including spatial code, size, count and other information. The SIS is then employed to build a spatial partitioning matrix, which is finally used to split all spatial objects into different partitions in the cluster. Based on our approach, neighbouring spatial objects can be partitioned into the same block while minimizing data skew in the Hadoop distributed file system (HDFS). The presented approach is compared, through a case study, against random-sampling-based partitioning on three measurement standards, namely, spatial index quality, data skew in HDFS, and range query performance. The experimental results show that our method based on the spatial coding technique can improve the query performance of big spatial data, as well as the data balance in HDFS. We implemented and deployed this approach in Hadoop, and it is also able to efficiently support any other distributed big spatial data systems.
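The paper's sensing-information-set construction is more elaborate, but the core spatial-coding idea can be illustrated with a Z-order (Morton) code: neighbouring objects receive nearby codes, so sorting by code and cutting into equal-count chunks yields spatially coherent, balanced partitions. A sketch (grid depth and partition count are assumptions):

    import numpy as np

    def morton_code(x, y, bits=16):
        """Interleave the bits of grid coords x and y into one Z-order code."""
        code = 0
        for i in range(bits):
            code |= ((x >> i) & 1) << (2 * i) | ((y >> i) & 1) << (2 * i + 1)
        return code

    rng = np.random.default_rng(2)
    pts = rng.random((10_000, 2))                 # spatial objects (skewed in practice)
    grid = (pts * (2**16 - 1)).astype(np.int64)   # map coordinates onto a 2^16 grid
    codes = np.array([morton_code(x, y) for x, y in grid])
    order = np.argsort(codes)
    partitions = np.array_split(pts[order], 8)    # 8 balanced, spatially coherent blocks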
Pitch-informed solo and accompaniment separation towards its use in music education applications
NASA Astrophysics Data System (ADS)
Cano, Estefanía; Schuller, Gerald; Dittmar, Christian
2014-12-01
We present a system for the automatic separation of solo instruments and music accompaniment in polyphonic music recordings. Our approach is based on a pitch detection front-end and a tone-based spectral estimation. We assess the plausibility of using sound separation technologies to create practice material in a music education context. To better understand the sound separation quality requirements in music education, a listening test was conducted to determine the most perceptually relevant signal distortions that need to be improved. Results from the listening test show that solo and accompaniment tracks pose different quality requirements and should be optimized differently. We propose and evaluate algorithm modifications to better understand their effects on objective perceptual quality measures. Finally, we outline possible ways of optimizing our separation approach to better suit the requirements of music education applications.
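As a toy illustration of the masking idea behind pitch-informed separation (not the authors' tone-based spectral estimator), the sketch below keeps STFT bins near the harmonics of a known pitch track; the fixed 440 Hz pitch, tolerance, and frame length are assumptions.

    import numpy as np
    from scipy.signal import stft, istft

    fs = 16000
    t = np.arange(fs) / fs
    # Synthetic mixture: a 440 Hz "solo" plus an inharmonic "accompaniment" tone
    mix = np.sin(2*np.pi*440*t) + 0.5*np.sin(2*np.pi*723*t)
    f, frames, Z = stft(mix, fs=fs, nperseg=1024)

    f0 = np.full(len(frames), 440.0)   # assumed output of the pitch-detection front-end
    tol = 25.0                         # Hz kept around each harmonic
    mask = np.zeros_like(Z, dtype=float)
    for k, pitch in enumerate(f0):
        harmonics = np.arange(1, 20) * pitch
        near = np.min(np.abs(f[:, None] - harmonics[None, :]), axis=1) < tol
        mask[near, k] = 1.0
    _, solo = istft(Z * mask, fs=fs, nperseg=1024)        # estimated solo track
    _, accomp = istft(Z * (1 - mask), fs=fs, nperseg=1024)  # estimated accompaniment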
Lehmann, Ronny; Thiessen, Christiane; Frick, Barbara; Bosse, Hans Martin; Nikendei, Christoph; Hoffmann, Georg Friedrich; Tönshoff, Burkhard; Huwendiek, Sören
2015-07-02
E-learning and blended learning approaches gain more and more popularity in emergency medicine curricula. So far, little data is available on the impact of such approaches on procedural learning and skill acquisition and their comparison with traditional approaches. This study investigated the impact of a blended learning approach, including Web-based virtual patients (VPs) and standard pediatric basic life support (PBLS) training, on procedural knowledge, objective performance, and self-assessment. A total of 57 medical students were randomly assigned to an intervention group (n=30) and a control group (n=27). Both groups received paper handouts in preparation of simulation-based PBLS training. The intervention group additionally completed two Web-based VPs with embedded video clips. Measurements were taken at randomization (t0), after the preparation period (t1), and after hands-on training (t2). Clinical decision-making skills and procedural knowledge were assessed at t0 and t1. PBLS performance was scored regarding adherence to the correct algorithm, conformance to temporal demands, and the quality of procedural steps at t1 and t2. Participants' self-assessments were recorded in all three measurements. Procedural knowledge of the intervention group was significantly superior to that of the control group at t1. At t2, the intervention group showed significantly better adherence to the algorithm and temporal demands, and better procedural quality of PBLS in objective measures than did the control group. These aspects differed between the groups even at t1 (after VPs, prior to practical training). Self-assessments differed significantly only at t1 in favor of the intervention group. Training with VPs combined with hands-on training improves PBLS performance as judged by objective measures.
Enabling task-based information prioritization via semantic web encodings
NASA Astrophysics Data System (ADS)
Michaelis, James R.
2016-05-01
Modern Soldiers rely upon accurate and actionable information technology to achieve mission objectives. While increasingly rich sensor networks for Areas of Operation (AO) can offer many directions for aiding Soldiers, limitations are imposed by current tactical edge systems on the rate that content can be transmitted. Furthermore, mission tasks will often require very specific sets of information which may easily be drowned out by other content sources. Prior research on Quality and Value of Information (QoI/VoI) has aimed to define ways to prioritize information objects based on their intrinsic attributes (QoI) and perceived value to a consumer (VoI). As part of this effort, established ranking approaches for obtaining Subject Matter Expert (SME) recommendations, such as the Analytic Hierarchy Process (AHP) have been considered. However, limited work has been done to tie Soldier context - such as descriptions of their mission and tasks - back to intrinsic attributes of information objects. As a first step toward addressing the above challenges, this work introduces an ontology-backed approach - rooted in Semantic Web publication practices - for expressing both AHP decision hierarchies and corresponding SME feedback. Following a short discussion on related QoI/VoI research, an ontology-based data structure is introduced for supporting evaluation of Information Objects, using AHP rankings designed to facilitate information object prioritization. Consistent with alternate AHP approaches, prioritization in this approach is based on pairwise comparisons between Information Objects with respect to established criteria, as well as on pairwise comparison of the criteria to assess their relative importance. The paper concludes with a discussion of both ongoing and future work.
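Since the prioritization rests on standard AHP machinery, here is a sketch of deriving a priority vector for three information objects from a pairwise comparison matrix, including Saaty's consistency check; the matrix entries are placeholders an SME would supply.

    import numpy as np

    # Pairwise judgments for 3 information objects (Saaty 1-9 scale, placeholder values)
    A = np.array([[1.0, 3.0, 5.0],
                  [1/3, 1.0, 2.0],
                  [1/5, 1/2, 1.0]])

    eigvals, eigvecs = np.linalg.eig(A)
    k = np.argmax(eigvals.real)
    w = np.abs(eigvecs[:, k].real)
    weights = w / w.sum()                        # priority vector for the objects

    n = A.shape[0]
    ci = (eigvals.real[k] - n) / (n - 1)         # consistency index
    ri = {1: 0.0, 2: 0.0, 3: 0.58, 4: 0.90}[n]   # Saaty's random index
    cr = ci / ri                                 # CR < 0.1 is conventionally acceptable
    print(weights, cr)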
Shortell, S M; O'Brien, J L; Carman, J M; Foster, R W; Hughes, E F; Boerstler, H; O'Connor, E J
1995-01-01
OBJECTIVE: This study examines the relationships among organizational culture, quality improvement processes and selected outcomes for a sample of up to 61 U. S. hospitals. DATA SOURCES AND STUDY SETTING: Primary data were collected from 61 U. S. hospitals (located primarily in the midwest and the west) on measures related to continuous quality improvement/total quality management (CQI/TQM), organizational culture, implementation approaches, and degree of quality improvement implementation based on the Baldrige Award criteria. These data were combined with independently collected data on perceived impact and objective measures of clinical efficiency (i.e., charges and length of stay) for six clinical conditions. STUDY DESIGN: The study involved cross-sectional examination of the named relationships. DATA COLLECTION/EXTRACTION METHODS: Reliable and valid scales for the organizational culture and quality improvement implementation measures were developed based on responses from over 7,000 individuals across the 61 hospitals with an overall completion rate of 72 percent. Independent data on perceived impact were collected from a national survey and independent data on clinical efficiency from a companion study of managed care. PRINCIPAL FINDINGS: A participative, flexible, risk-taking organizational culture was significantly related to quality improvement implementation. Quality improvement implementation, in turn, was positively associated with greater perceived patient outcomes and human resource development. Larger-size hospitals experienced lower clinical efficiency with regard to higher charges and higher length of stay, due in part to having more bureaucratic and hierarchical cultures that serve as a barrier to quality improvement implementation. CONCLUSIONS: What really matters is whether or not a hospital has a culture that supports quality improvement work and an approach that encourages flexible implementation. Larger-size hospitals face more difficult challenges in this regard. PMID:7782222
Improving Quality Using Architecture Fault Analysis with Confidence Arguments
2015-03-01
[Garbled front-matter snippet; recoverable content: the report discusses text-, diagram-, and table-based requirements documentation using Microsoft Word and Dynamic Object-Oriented Requirements tooling, and cites Van Lamsweerde & Letier, "From Object Orientation to Goal Orientation: A Paradigm Shift for Requirements Engineering," alongside sections on requirement specification and architecture design and AADL concepts supporting architecture fault analysis.]
Web-based rehabilitation interventions for people with rheumatoid arthritis: A systematic review.
Srikesavan, Cynthia; Bryer, Catherine; Ali, Usama; Williamson, Esther
2018-01-01
Background Rehabilitation approaches for people with rheumatoid arthritis include joint protection, exercises and self-management strategies. Health interventions delivered via the web have the potential to improve access to health services, overcoming time constraints, physical limitations, and socioeconomic and geographic barriers. The objective of this review is to determine the effects of web-based rehabilitation interventions in adults with rheumatoid arthritis. Methods Randomised controlled trials that compared web-based rehabilitation interventions with usual care, waiting list, no treatment or another web-based intervention in adults with rheumatoid arthritis were included. The outcomes were pain, function, quality of life, self-efficacy, rheumatoid arthritis knowledge, physical activity and adverse effects. Methodological quality was assessed using the Cochrane Risk of Bias tool and quality of evidence with the Grading of Recommendations Assessment, Development and Evaluation approach. Results Six source documents from four trials (n=567) focusing on self-management, health information or physical activity were identified. The effects of web-based rehabilitation interventions on pain, function, quality of life, self-efficacy, rheumatoid arthritis knowledge and physical activity are uncertain because of the very low quality of evidence, mostly from small single trials. Adverse effects were not reported. Conclusion Large, well-designed trials are needed to evaluate the clinical and cost-effectiveness of web-based rehabilitation interventions in rheumatoid arthritis.
Modeling Healthcare Processes Using Commitments: An Empirical Evaluation.
Telang, Pankaj R; Kalia, Anup K; Singh, Munindar P
2015-01-01
The two primary objectives of this paper are: (a) to demonstrate how Comma, a business modeling methodology based on commitments, can be applied in healthcare process modeling, and (b) to evaluate the effectiveness of such an approach in producing healthcare process models. We apply the Comma approach on a breast cancer diagnosis process adapted from an HHS committee report, and present the results of an empirical study that compares Comma with a traditional approach based on the HL7 Messaging Standard (Traditional-HL7). Our empirical study involved 47 subjects and two phases. In the first phase, we partitioned the subjects into two approximately equal groups. We gave each group the same requirements based on a process scenario for breast cancer diagnosis. Members of one group first applied Traditional-HL7 and then Comma, whereas members of the second group first applied Comma and then Traditional-HL7, each on the above-mentioned requirements. Thus, each subject produced two models, each model being a set of UML Sequence Diagrams. In the second phase, we repartitioned the subjects into two groups with approximately equal distributions from both original groups. We developed exemplar Traditional-HL7 and Comma models; we gave one repartitioned group our Traditional-HL7 model and the other repartitioned group our Comma model. We provided the same changed set of requirements to all subjects and asked them to modify the provided exemplar model to satisfy the new requirements. We assessed solutions produced by subjects in both phases with respect to measures of flexibility, time, difficulty, objective quality, and subjective quality. Our study found that Comma is superior to Traditional-HL7 in flexibility and objective quality as validated via Student's t-test to the 10% level of significance. Comma is a promising new approach for modeling healthcare processes. Further gains could be made through improved tooling and enhanced training of modeling personnel. PMID:26539985
Manipulation of Unknown Objects to Improve the Grasp Quality Using Tactile Information.
Montaño, Andrés; Suárez, Raúl
2018-05-03
This work presents a novel and simple approach in the area of manipulation of unknown objects considering both geometric and mechanical constraints of the robotic hand. Starting with an initial blind grasp, our method improves the grasp quality through manipulation considering the three common goals of the manipulation process: improving the hand configuration, the grasp quality and the object positioning, and, at the same time, prevents the object from falling. Tactile feedback is used to obtain local information of the contacts between the fingertips and the object, and no additional exteroceptive feedback sources are considered in the approach. The main novelty of this work lies in the fact that the grasp optimization is performed on-line as a reactive procedure using the tactile and kinematic information obtained during the manipulation. Experimental results are shown to illustrate the efficiency of the approach.
Multi-objective Optimization of Pulsed Gas Metal Arc Welding Process Using Neuro NSGA-II
NASA Astrophysics Data System (ADS)
Pal, Kamal; Pal, Surjya K.
2018-05-01
Weld quality is a critical issue in fabrication industries where products are custom-designed. Multi-objective optimization yields a number of solutions on the Pareto-optimal front. Mathematical regression model based optimization methods are often found to be inadequate for highly non-linear arc welding processes. Thus, various global evolutionary approaches, such as artificial neural networks and genetic algorithms (GA), have been developed. The present work applies the elitist non-dominated sorting GA (NSGA-II) to the optimization of the pulsed gas metal arc welding process, using back-propagation neural network (BPNN) based weld quality feature models. The primary objective, maintaining butt joint weld quality, is the maximization of tensile strength with minimum plate distortion. The BPNN computes the fitness of each solution after adequate training, whereas the NSGA-II algorithm generates the optimum solutions for the two conflicting objectives. Welding experiments were conducted on low carbon steel using response surface methodology. The Pareto-optimal front with three ranked solutions after 20 generations was considered the best without further improvement. The joint strength as well as the transverse shrinkage was found to be drastically improved over the design-of-experiments results, as per the validated Pareto-optimal solutions obtained.
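The ranking step inside NSGA-II can be sketched independently of the welding models: fast non-dominated sorting over BPNN-predicted objectives. Here the sample points are invented, with strength negated so both columns are minimized.

    import numpy as np

    def non_dominated_sort(F):
        """Return the front index (0 = best) of each row of F, all objectives minimized."""
        n = len(F)
        dominates = lambda a, b: (F[a] <= F[b]).all() and (F[a] < F[b]).any()
        rank = np.full(n, -1)
        remaining = set(range(n))
        front = 0
        while remaining:
            current = {i for i in remaining
                       if not any(dominates(j, i) for j in remaining if j != i)}
            for i in current:
                rank[i] = front
            remaining -= current
            front += 1
        return rank

    # Columns: [-tensile strength (maximized), transverse shrinkage (minimized)]
    F = np.array([[-410, 0.8], [-395, 0.5], [-420, 1.1], [-400, 0.9], [-415, 0.6]])
    print(non_dominated_sort(F))   # front 0 holds the Pareto-optimal schedules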
Caggiano, Michael D; Tinkham, Wade T; Hoffman, Chad; Cheng, Antony S; Hawbaker, Todd J
2016-10-01
The wildland-urban interface (WUI), the area where human development encroaches on undeveloped land, is expanding throughout the western United States, resulting in increased wildfire risk to homes and communities. Although census based mapping efforts have provided insights into the pattern of development and expansion of the WUI at regional and national scales, these approaches do not provide sufficient detail for fine-scale fire and emergency management planning, which requires maps of individual building locations. Although fine-scale maps of the WUI have been developed, they are often limited in their spatial extent, have unknown accuracies and biases, and are costly to update over time. In this paper we assess a semi-automated Object Based Image Analysis (OBIA) approach that utilizes 4-band multispectral National Aerial Image Program (NAIP) imagery for the detection of individual buildings within the WUI. We evaluate this approach by comparing the accuracy and overall quality of extracted buildings to a building footprint control dataset. In addition, we assessed the effects of buffer distance, topographic conditions, and building characteristics on the accuracy and quality of building extraction. The overall accuracy and quality of our approach was positively related to buffer distance, with accuracies ranging from 50 to 95% for buffer distances from 0 to 100 m. Our results also indicate that building detection was sensitive to building size, with smaller outbuildings (footprints less than 75 m2) having detection rates below 80% and larger residential buildings having detection rates above 90%. These findings demonstrate that this approach can successfully identify buildings in the WUI in diverse landscapes while achieving high accuracies at buffer distances appropriate for most fire management applications while overcoming cost and time constraints associated with traditional approaches. This study is unique in that it evaluates the ability of an OBIA approach to extract highly detailed data on building locations in a WUI setting.
NASA Astrophysics Data System (ADS)
Alizadeh, Mohammad Reza; Nikoo, Mohammad Reza; Rakhshandehroo, Gholam Reza
2017-08-01
Sustainable management of water resources necessitates close attention to social, economic and environmental aspects such as water quality and quantity concerns and potential conflicts. This study presents a new fuzzy-based multi-objective compromise methodology to determine the socio-optimal and sustainable policies for hydro-environmental management of groundwater resources, which simultaneously considers the conflicts and negotiation of involved stakeholders, uncertainties in decision makers' preferences, existing uncertainties in the groundwater parameters and groundwater quality and quantity issues. The fuzzy multi-objective simulation-optimization model is developed based on qualitative and quantitative groundwater simulation model (MODFLOW and MT3D), multi-objective optimization model (NSGA-II), Monte Carlo analysis and Fuzzy Transformation Method (FTM). Best compromise solutions (best management policies) on trade-off curves are determined using four different Fuzzy Social Choice (FSC) methods. Finally, a unanimity fallback bargaining method is utilized to suggest the most preferred FSC method. Kavar-Maharloo aquifer system in Fars, Iran, as a typical multi-stakeholder multi-objective real-world problem is considered to verify the proposed methodology. Results showed an effective performance of the framework for determining the most sustainable allocation policy in groundwater resource management.
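The fuzzy social choice step aggregates stakeholders' preference orderings over the Pareto-optimal policies. As a plain, non-fuzzy stand-in for the FSC methods named above, a Borda-count sketch of picking a compromise policy (the stakeholders and rankings are invented):

    import numpy as np

    # rankings[s] lists policy indices from most to least preferred for stakeholder s
    rankings = [[2, 0, 1, 3],   # e.g., agriculture
                [0, 2, 3, 1],   # e.g., domestic supply
                [2, 3, 0, 1]]   # e.g., environment
    n = 4
    borda = np.zeros(n)
    for order in rankings:
        for position, policy in enumerate(order):
            borda[policy] += n - 1 - position   # top rank earns the most points
    compromise = int(np.argmax(borda))          # policy 2 wins with these rankings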
A Quality Sorting of Fruit Using a New Automatic Image Processing Method
NASA Astrophysics Data System (ADS)
Amenomori, Michihiro; Yokomizu, Nobuyuki
This paper presents an innovative approach for the quality sorting of objects, such as apples in an agricultural factory, using an image processing algorithm. The objectives of our approach are, first, to sort the objects precisely by their colors and, second, to efficiently detect any irregularity in the colors on the surface of the apples. An experiment was conducted, and the results were compared with those obtained by a human sorting process and by color-sensor sorting devices. The results demonstrate that our approach is capable of sorting the objects rapidly, with a valid classification rate of 100%.
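A toy version of such color-based grading (not the paper's algorithm): classify by mean redness and flag image blocks whose color deviates from the fruit's mean; the thresholds and block size are invented.

    import numpy as np

    def grade_by_color(img, red_thresh=0.55, irregularity_thresh=0.15):
        """Grade a fruit image by mean redness; flag off-color 32x32 blocks."""
        r = img[..., 0] / (img.sum(axis=2) + 1e-9)     # per-pixel redness ratio
        grade = "A" if r.mean() > red_thresh else "B"
        h, w = r.shape
        blocks = r[:32*(h//32), :32*(w//32)]
        blocks = blocks.reshape(h//32, 32, w//32, 32).mean(axis=(1, 3))
        irregular = np.abs(blocks - r.mean()) > irregularity_thresh
        return grade, irregular

    print(grade_by_color(np.random.rand(128, 128, 3)))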
Yoshino, Hiroyuki; Hara, Yuko; Dohi, Masafumi; Yamashita, Kazunari; Hakomori, Tadashi; Kimura, Shin-Ichiro; Iwao, Yasunori; Itai, Shigeru
2018-04-01
Scale-up approaches for the film coating process have been established for each type of film coating equipment over several decades, based on thermodynamic and mechanical analyses. The objective of the present study was to establish a versatile scale-up approach for the film coating process, applicable to commercial production, that is based on a critical quality attribute (CQA) identified using the Quality by Design (QbD) approach and is independent of the equipment used. Experiments at pilot scale using the Design of Experiments (DoE) approach were performed to find a suitable CQA among surface roughness, contact angle, color difference, and coating film properties measured by terahertz spectroscopy. Surface roughness was determined to be a suitable CQA based on a quantitative appearance evaluation. With surface roughness fixed as the CQA, the water content of the film-coated tablets was determined to be the critical material attribute (CMA), a parameter that does not depend on scale or equipment. Finally, to verify the scale-up approach determined at the pilot scale, experiments at commercial scale were performed. The good correlation between surface roughness (CQA) and water content (CMA) identified at the pilot scale was also retained at the commercial scale, indicating that our proposed method should be useful as a scale-up approach for the film coating process.
Multi-Objective Approach for Energy-Aware Workflow Scheduling in Cloud Computing Environments
Yassa, Sonia; Chelouah, Rachid; Kadima, Hubert; Granado, Bertrand
2013-01-01
We address the problem of scheduling workflow applications on heterogeneous computing systems such as cloud computing infrastructures. In general, cloud workflow scheduling is a complex optimization problem that requires considering different criteria so as to meet a large number of QoS (Quality of Service) requirements. Traditional research in workflow scheduling mainly focuses on optimization constrained by time or cost, without paying attention to energy consumption. The main contribution of this study is to propose a new approach for multi-objective workflow scheduling in clouds and to present a hybrid PSO algorithm to optimize the scheduling performance. Our method is based on the Dynamic Voltage and Frequency Scaling (DVFS) technique to minimize energy consumption. This technique allows processors to operate at different voltage supply levels by sacrificing clock frequency; operating at multiple voltages involves a compromise between the quality of schedules and energy consumption. Simulation results on synthetic and real-world scientific applications highlight the robust performance of the proposed approach. PMID:24319361
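The DVFS compromise reduces to the usual dynamic power model P = C*V^2*f: dynamic energy per cycle scales with V^2, so a lower voltage level cuts energy while the accompanying lower frequency stretches runtime. A sketch with invented voltage/frequency levels, not values from the paper:

    # (voltage V, relative frequency f) pairs for one processor (invented levels)
    LEVELS = [(1.2, 1.0), (1.0, 0.8), (0.8, 0.6)]
    CAPACITANCE = 1.0              # lumped switching constant

    def task_cost(cycles, level):
        v, f = LEVELS[level]
        time = cycles / f                        # lower f -> longer runtime
        energy = CAPACITANCE * v**2 * f * time   # P = C * V^2 * f, integrated over time
        return time, energy                      # energy ~ V^2 per cycle: V drives energy

    for lvl in range(3):
        t, e = task_cost(cycles=100.0, level=lvl)
        print(f"level {lvl}: time={t:.1f}, energy={e:.1f}")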
An approach for quantitative image quality analysis for CT
NASA Astrophysics Data System (ADS)
Rahimi, Amir; Cochran, Joe; Mooney, Doug; Regensburger, Joe
2016-03-01
An objective and standardized approach to assessing the image quality of Computed Tomography (CT) systems is required in a wide variety of imaging processes to identify CT systems appropriate for a given application. We present an overview of the framework we have developed to help standardize and objectively assess CT image quality for different models of CT scanners used for security applications. Within this framework, we have developed methods to quantitatively measure metrics that should correlate with feature identification, detection accuracy and precision, and image registration capabilities of CT machines, and to identify strengths and weaknesses in different CT imaging technologies in transportation security. To that end we have designed, developed and constructed phantoms that allow for systematic and repeatable measurements of roughly 88 image quality metrics, representing modulation transfer function, noise equivalent quanta, noise power spectra, slice sensitivity profiles, streak artifacts, CT number uniformity, CT number consistency, object length accuracy, CT number path length consistency, and object registration. Furthermore, we have developed a sophisticated MATLAB-based image analysis tool kit to analyze CT-generated images of phantoms and report these metrics in a format that is standardized across the considered models of CT scanners, allowing for comparative image quality analysis within a CT model or between different CT models. In addition, we have developed a modified sparse principal component analysis (SPCA) method that generates PCA components with sparse loadings, used in conjunction with Hotelling's T2 statistical analysis to compare, qualify, and detect faults in the tested systems.
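As a pointer to how one of the listed metrics can be computed, here is a minimal sketch of estimating a modulation transfer function (MTF) from an edge profile: differentiate the edge spread function into a line spread function, then take the normalized FFT magnitude. The synthetic tanh edge and the MTF50 readout are assumptions for illustration; the paper's phantom-based MATLAB pipeline is far more elaborate.

```python
import numpy as np

x = np.arange(256, dtype=float)
esf = 0.5 * (1 + np.tanh((x - 128) / 3.0))  # synthetic blurred edge profile (ESF)
lsf = np.gradient(esf)                      # line spread function
lsf /= lsf.sum()                            # normalize so MTF(0) = 1
mtf = np.abs(np.fft.rfft(lsf))
freqs = np.fft.rfftfreq(lsf.size)           # spatial frequency in cycles/pixel
mtf50 = freqs[np.argmax(mtf < 0.5)]         # first frequency where modulation < 50%
print(f"MTF50 ≈ {mtf50:.3f} cycles/pixel")
```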
Objects Grouping for Segmentation of Roads Network in High Resolution Images of Urban Areas
NASA Astrophysics Data System (ADS)
Maboudi, M.; Amini, J.; Hahn, M.
2016-06-01
Updated road databases are required for many purposes such as urban planning, disaster management, car navigation, route planning, traffic management and emergency handling. In the last decade, the improvement in the spatial resolution of VHR civilian satellite sensors - the main source for large-scale mapping applications - was so considerable that the ground sample distance (GSD) has become finer than the size of common urban objects of interest such as buildings, trees and road parts. This technological advancement pushed the development of "Object-based Image Analysis (OBIA)" as an alternative to pixel-based image analysis methods. Segmentation, as one of the main stages of OBIA, provides the image objects on which most of the subsequent processing will be applied. Therefore, the success of an OBIA approach is strongly affected by the segmentation quality. In this paper, we propose a purpose-dependent refinement strategy to group road segments in urban areas using maximal-similarity-based region merging. For investigations with the proposed method, we use high-resolution images of several urban sites. The promising results suggest that the proposed approach is applicable to the grouping of road segments in urban areas.
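The merging rule at the heart of maximal-similarity-based region merging can be illustrated compactly: a segment is merged with the adjacent region of maximal histogram similarity, commonly measured with the Bhattacharyya coefficient. The toy histograms and adjacency below are invented; segmentation and the road-specific cues from the paper are assumed to exist upstream.

```python
import numpy as np

def bhattacharyya(h1, h2):
    # Both histograms normalized to sum to 1; higher value means more similar.
    return np.sum(np.sqrt(h1 * h2))

def merge_step(hists, adjacency):
    """hists: {region_id: normalized histogram}; adjacency: {region_id: set of ids}.
    Returns the most similar adjacent pair (a, b), the next candidate to merge."""
    best, best_sim = None, -1.0
    for a, nbrs in adjacency.items():
        for b in nbrs:
            sim = bhattacharyya(hists[a], hists[b])
            if sim > best_sim:
                best, best_sim = (a, b), sim
    return best, best_sim

# Toy example: three regions with 4-bin histograms; regions 0 and 1 are similar.
hists = {0: np.array([.4, .4, .1, .1]), 1: np.array([.35, .45, .1, .1]),
         2: np.array([.05, .05, .5, .4])}
adjacency = {0: {1, 2}, 1: {0, 2}, 2: {0, 1}}
print(merge_step(hists, adjacency))  # expect regions 0 and 1 to merge first
```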
3D shape measurement of moving object with FFT-based spatial matching
NASA Astrophysics Data System (ADS)
Guo, Qinghua; Ruan, Yuxi; Xi, Jiangtao; Song, Limei; Zhu, Xinjun; Yu, Yanguang; Tong, Jun
2018-03-01
This work presents a new technique for 3D shape measurement of a moving object in translational motion, which finds applications in online inspection, quality control, etc. A low-complexity 1D fast Fourier transform (FFT)-based spatial matching approach is devised to obtain accurate object displacement estimates, and it is combined with single-shot fringe pattern profilometry (FPP) techniques to achieve high measurement performance from multiple captured images through coherent combining. The proposed technique overcomes some limitations of existing ones. Specifically, the placement of marks on the object surface and synchronization between projector and camera are not needed, the velocity of the moving object is not required to be constant, and there is no restriction on the movement trajectory. Both simulation and experimental results demonstrate the effectiveness of the proposed technique.
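The FFT-based matching step can be illustrated with classic 2D phase correlation, a plausible stand-in for the paper's low-complexity 1D variant: the normalized cross-power spectrum of two shifted images has an inverse FFT that peaks at the displacement.

```python
import numpy as np

rng = np.random.default_rng(1)
img = rng.random((128, 128))
shifted = np.roll(img, shift=(7, -12), axis=(0, 1))   # known circular displacement

F1, F2 = np.fft.fft2(img), np.fft.fft2(shifted)
cross = np.conj(F1) * F2
cross /= np.abs(cross) + 1e-12                        # keep phase information only
corr = np.fft.ifft2(cross).real
dy, dx = np.unravel_index(corr.argmax(), corr.shape)
if dy > img.shape[0] // 2: dy -= img.shape[0]         # map wrap-around to signed shifts
if dx > img.shape[1] // 2: dx -= img.shape[1]
print(dy, dx)                                         # recovers (7, -12)
```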
Gordeev, Evgeniy G; Galushko, Alexey S; Ananikov, Valentine P
2018-01-01
Additive manufacturing with fused deposition modeling (FDM) is currently being optimized for a wide range of research and commercial applications. The major disadvantage of FDM-created products is their low quality and structural defects (porosity), which pose an obstacle to their use in functional prototyping and the direct digital manufacturing of objects intended to be in contact with gases and liquids. This article describes a simple and efficient approach for assessing the quality of 3D-printed objects. Using this approach it was shown that the wall permeability of a printed object depends on its geometric shape and is gradually reduced in the following series: cylinder > cube > pyramid > sphere > cone. Filament feed rate, wall geometry and G-code-defined wall structure were identified as the primary parameters that influence the quality of 3D-printed products. Optimization of these parameters led to an overall increase in quality and improvement of sealing properties. It was demonstrated that high quality of 3D-printed objects can be achieved using routinely available printers and standard filaments.
Álvarez-Romero, Jorge G.; Pressey, Robert L.; Ban, Natalie C.; Brodie, Jon
2015-01-01
Human-induced changes to river loads of nutrients and sediments pose a significant threat to marine ecosystems. Ongoing land-use change can further increase these loads, and amplify the impacts of land-based threats on vulnerable marine ecosystems. Consequently, there is a need to assess these threats and prioritise actions to mitigate their impacts. A key question regarding prioritisation is whether actions in catchments to maintain coastal-marine water quality can be spatially congruent with actions for other management objectives, such as conserving terrestrial biodiversity. In selected catchments draining into the Gulf of California, Mexico, we employed Land Change Modeller to assess the vulnerability of areas with native vegetation to conversion into crops, pasture, and urban areas. We then used SedNet, a catchment modelling tool, to map the sources and estimate pollutant loads delivered to the Gulf by these catchments. Following these analyses, we used modelled river plumes to identify marine areas likely influenced by land-based pollutants. Finally, we prioritised areas for catchment management based on objectives for conservation of terrestrial biodiversity and objectives for water quality that recognised links between pollutant sources and affected marine areas. Our objectives for coastal-marine water quality were to reduce sediment and nutrient discharges from anthropic areas, and minimise future increases in coastal sedimentation and eutrophication. Our objectives for protection of terrestrial biodiversity covered species of vertebrates. We used Marxan, a conservation planning tool, to prioritise interventions and explore spatial differences in priorities for both objectives. Notable differences in the distributions of land values for terrestrial biodiversity and coastal-marine water quality indicated the likely need for trade-offs between catchment management objectives. However, there were priority areas that contributed to both sets of objectives. Our study demonstrates a practical approach to integrating models of catchments, land-use change, and river plumes with conservation planning software to inform prioritisation of catchment management. PMID:26714166
NASA Astrophysics Data System (ADS)
Chen, Zhenzhong; Han, Junwei; Ngan, King Ngi
2005-10-01
MPEG-4 treats a scene as a composition of several objects, or so-called video object planes (VOPs), that are separately encoded and decoded. Such a flexible video coding framework makes it possible to code different video objects at different distortion scales. It is necessary to analyze the priority of the video objects according to their semantic importance, intrinsic properties and psycho-visual characteristics, such that the bit budget can be distributed properly among video objects to improve the perceptual quality of the compressed video. This paper aims to provide an automatic video object priority definition method based on an object-level visual attention model, and further proposes an optimization framework for video object bit allocation. One significant contribution of this work is that human visual system characteristics are incorporated into the video coding optimization process. Another advantage is that the priority of a video object can be obtained automatically, instead of fixing weighting factors before encoding or relying on user interactivity. To evaluate the performance of the proposed approach, we compare it with the traditional verification model bit allocation and optimal multiple video object bit allocation algorithms. Compared with traditional bit allocation algorithms, the objective quality of objects with higher priority is significantly improved under this framework. These results demonstrate the usefulness of this unsupervised subjective quality lifting framework.
Statistical approaches used to assess and redesign surface water-quality-monitoring networks.
Khalil, B; Ouarda, T B M J
2009-11-01
An up-to-date review of the statistical approaches utilized for the assessment and redesign of surface water quality monitoring (WQM) networks is presented. The main technical aspects of network design are covered in four sections, addressing monitoring objectives, water quality variables, sampling frequency and spatial distribution of sampling locations. This paper discusses various monitoring objectives and related procedures used for the assessment and redesign of long-term surface WQM networks. The appropriateness of each approach for the design, contraction or expansion of monitoring networks is also discussed. For each statistical approach, its advantages and disadvantages are examined from a network design perspective. Possible methods to overcome disadvantages and deficiencies in the statistical approaches that are currently in use are recommended.
Comparison of penalty functions on a penalty approach to mixed-integer optimization
NASA Astrophysics Data System (ADS)
Francisco, Rogério B.; Costa, M. Fernanda P.; Rocha, Ana Maria A. C.; Fernandes, Edite M. G. P.
2016-06-01
In this paper, we present a comparative study involving several penalty functions that can be used in a penalty approach for globally solving bound mixed-integer nonlinear programming (bMINLP) problems. The penalty approach relies on a continuous reformulation of the bMINLP problem by adding a particular penalty term to the objective function. A penalty function based on the 'erf' function is proposed. The continuous nonlinear optimization problems are sequentially solved by the population-based firefly algorithm. Preliminary numerical experiments are carried out in order to analyze the quality of the produced solutions, when compared with other penalty functions available in the literature.
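A hedged sketch of the penalty idea follows: the bMINLP is relaxed to a continuous problem, and a term that vanishes at integer values is added with a growing weight. The exact erf-based penalty is defined in the paper; erf(kappa times the distance to the nearest integer) and scipy's differential evolution (standing in for the firefly algorithm) are illustrative guesses only.

```python
import numpy as np
from scipy.special import erf
from scipy.optimize import differential_evolution

def integrality_penalty(y, kappa=20.0):
    d = np.abs(y - np.round(y))              # distance to nearest integer
    return erf(kappa * d).sum()              # zero at integers, near 1 elsewhere

def objective(z):                            # z = (x, y1, y2); y1, y2 must end integer
    x, y1, y2 = z
    return (x - 0.7) ** 2 + (y1 - 1.3) ** 2 + (y2 - 2.6) ** 2

mu = 0.1
for _ in range(6):                           # sequential penalty loop
    res = differential_evolution(
        lambda z: objective(z) + mu * integrality_penalty(z[1:]),
        bounds=[(-5, 5)] * 3, seed=0)
    mu *= 4.0                                # tighten integrality each round
print(np.round(res.x, 3))                    # expect roughly [0.7, 1.0, 3.0]
```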
Enhanced data validation strategy of air quality monitoring network.
Harkat, Mohamed-Faouzi; Mansouri, Majdi; Nounou, Mohamed; Nounou, Hazem
2018-01-01
Quick validation and detection of faults in measured air quality data is a crucial step towards achieving the objectives of air quality networks. Therefore, the objectives of this paper are threefold: (i) to develop a modeling technique that can be used to predict the normal behavior of air quality variables and help provide an accurate reference for monitoring purposes; (ii) to develop a fault detection method that can effectively and quickly detect any anomalies in measured air quality data. For this purpose, a new fault detection method based on the combination of the generalized likelihood ratio test (GLRT) and the exponentially weighted moving average (EWMA) is developed. GLRT is a well-known statistical fault detection method that relies on maximizing the detection probability for a given false alarm rate. In this paper, we propose a GLRT-based EWMA fault detection method that is able to detect changes in the values of certain air quality variables; (iii) to develop a fault isolation and identification method that allows defining the fault source(s) in order to properly apply appropriate corrective actions. To this end, a reconstruction approach based on a Midpoint-Radii Principal Component Analysis (MRPCA) model is developed to handle the types of data and models associated with air quality monitoring networks. All air quality modeling, fault detection, fault isolation and reconstruction methods developed in this paper are validated using real air quality data (such as particulate matter, ozone, and nitrogen and carbon oxide measurements).
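The EWMA side of the proposed detector can be sketched in a few lines: residuals between measured and model-predicted values are smoothed recursively and compared against control limits. The GLRT combination and the MRPCA model are beyond this toy; lambda and the 3-sigma limit are conventional textbook choices, and the injected mean shift is synthetic.

```python
import numpy as np

rng = np.random.default_rng(0)
resid = rng.normal(0, 1, 300)      # residuals of an air quality variable vs. model
resid[200:] += 1.5                 # simulated sensor fault: mean shift at t = 200

lam, L = 0.2, 3.0                  # smoothing constant and control-limit width
sigma = resid[:100].std()          # noise scale estimated from a fault-free window
z, alarms = 0.0, []
for t, r in enumerate(resid):
    z = lam * r + (1 - lam) * z    # EWMA recursion
    # Steady-state EWMA standard deviation is sigma * sqrt(lam / (2 - lam)).
    if abs(z) > L * sigma * np.sqrt(lam / (2 - lam)):
        alarms.append(t)
print("first alarm at t =", alarms[0] if alarms else None)  # shortly after t = 200
```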
Including robustness in multi-criteria optimization for intensity-modulated proton therapy
NASA Astrophysics Data System (ADS)
Chen, Wei; Unkelbach, Jan; Trofimov, Alexei; Madden, Thomas; Kooy, Hanne; Bortfeld, Thomas; Craft, David
2012-02-01
We present a method to include robustness in a multi-criteria optimization (MCO) framework for intensity-modulated proton therapy (IMPT). The approach allows one to simultaneously explore the trade-off between different objectives as well as the trade-off between robustness and nominal plan quality. In MCO, a database of plans, each emphasizing different treatment planning objectives, is pre-computed to approximate the Pareto surface. An IMPT treatment plan that strikes the best balance between the different objectives can be selected by navigating on the Pareto surface. In our approach, robustness is integrated into MCO by adding robustified objectives and constraints to the MCO problem. Uncertainties (or errors) of the robust problem are modeled by pre-calculated dose-influence matrices for a nominal scenario and a number of pre-defined error scenarios (shifted patient positions, proton beam undershoot and overshoot). Objectives and constraints can be defined for the nominal scenario, thus characterizing nominal plan quality. A robustified objective represents the worst objective function value that can be realized for any of the error scenarios and thus provides a measure of plan robustness. The optimization method is based on a linear projection solver and is capable of handling large problem sizes resulting from a fine dose grid resolution, many scenarios, and a large number of proton pencil beams. A base-of-skull case is used to demonstrate the robust optimization method. It is demonstrated that the robust optimization method reduces the sensitivity of the treatment plan to setup and range errors to a degree that is not achieved by a safety-margin approach. A chordoma case is analyzed in more detail to demonstrate the involved trade-offs between target underdose and brainstem sparing, as well as between robustness and nominal plan quality. The latter illustrates the advantage of MCO in the context of robust planning. For all cases examined, the robust optimization for each Pareto-optimal plan takes less than 5 min on a standard computer, making a computationally friendly interface to the planner possible. In conclusion, the uncertainty pertinent to the IMPT procedure can be reduced during treatment planning by optimizing plans that emphasize different treatment objectives, including robustness, and then interactively selecting the most-preferred one from the solution Pareto surface.
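The robustified objective described above has a standard epigraph formulation that is easy to sketch: minimize t subject to f_s(x) <= t for every error scenario s, so at the optimum t equals the worst-scenario objective. The random dose-influence matrices and the quadratic dose-deviation objective below are stand-ins, not the paper's clinical model or its linear projection solver.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(2)
n_beams, n_voxels = 8, 20
d_target = np.ones(n_voxels)                       # prescribed dose (normalized)
A = [rng.uniform(0.05, 0.15, (n_voxels, n_beams)) for _ in range(4)]  # scenario matrices

def f(x, M):                                       # quadratic dose-deviation objective
    return np.mean((M @ x - d_target) ** 2)

z0 = np.concatenate([np.ones(n_beams), [1.0]])     # variables: beam weights x, then t
cons = [{"type": "ineq", "fun": lambda z, M=M: z[-1] - f(z[:-1], M)} for M in A]
res = minimize(lambda z: z[-1], z0, method="SLSQP", constraints=cons,
               bounds=[(0, None)] * n_beams + [(0, None)])
x_opt, worst = res.x[:-1], res.x[-1]
print("worst-scenario objective:", round(worst, 4))
print("per-scenario objectives :", [round(f(x_opt, M), 4) for M in A])
```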
Quality risk management in pharmaceutical development.
Charoo, Naseem Ahmad; Ali, Areeg Anwer
2013-07-01
The objective of the ICH Q8, Q9 and Q10 documents is the application of a systematic, science-based approach to formulation development for building quality into the product. There is always some uncertainty in new product development, and good risk management practice is essential for the success of new product development in decreasing this uncertainty. In the quality-by-design paradigm, the product performance properties relevant to the patient are predefined in a target product profile (TPP). Together with prior knowledge and experience, the TPP helps in the identification of critical quality attributes (CQAs). An initial risk assessment that identifies risks to these CQAs provides the impetus for product development. Product and process are designed to gain knowledge about these risks, devise strategies to eliminate or mitigate them, and meet the objectives set in the TPP. By placing more emphasis on high-risk events, the level of patient protection is increased. A scientifically driven process improves the transparency and reliability of the manufacturer. The focus on risk to the patient, together with a flexible development approach, saves invaluable resources, increases confidence in quality and reduces compliance risk. The knowledge acquired in analysing risks to CQAs permits construction of a meaningful design space. Within the boundaries of the design space, variation in critical material characteristics and process parameters must be managed in order to yield a product having the desired characteristics. Specifications based on product and process understanding are established such that the product will meet the specifications if tested. In this way, the product is amenable to real-time release, since specifications only confirm quality; they do not serve as a means of effective process control.
Imaging through atmospheric turbulence for laser based C-RAM systems: an analytical approach
NASA Astrophysics Data System (ADS)
Buske, Ivo; Riede, Wolfgang; Zoz, Jürgen
2013-10-01
High Energy Laser (HEL) weapons have unique attributes which distinguish them from the limitations of kinetic energy weapons. The HEL weapon engagement process typically starts with identifying the target and selecting the aim point on the target through a high-magnification telescope. One scenario for such a HEL system is the countermeasure against rocket, artillery or mortar (RAM) objects to protect ships, camps or other infrastructure from terrorist attacks. For target identification, and especially to resolve the aim point, it is essential to ensure high-resolution imaging of RAM objects. Throughout the ballistic flight phase, knowledge of the expected imaging quality is important for estimating and evaluating the performance of the countermeasure system, whereby image quality is mainly influenced by unavoidable atmospheric turbulence. Analytical calculations have been performed to analyze and evaluate image quality parameters for an approaching RAM object. Kolmogorov turbulence theory was applied to determine the atmospheric coherence length and the isoplanatic angle. The image acquisition distinguishes between long and short exposure times to characterize tip/tilt image shift and the impact of higher-order turbulence fluctuations. Two different observer positions are considered to show the influence of the selected sensor site, and two different turbulence strengths are investigated to point out the effect of climate and weather conditions. It is well known that atmospheric turbulence degrades image sharpness and creates blurred images. Investigations are carried out to estimate the effectiveness of simple tip/tilt systems or low-order adaptive optics for laser-based C-RAM systems.
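For a horizontal path with constant Cn², the plane-wave Kolmogorov formulas the abstract relies on give concrete numbers: Fried parameter r0 = (0.423 k² Cn² L)^(-3/5) and isoplanatic angle θ0 = [2.914 k² Cn² (3/8) L^(8/3)]^(-3/5). The wavelength, path length and the two Cn² values below are representative assumptions for the weak versus strong turbulence comparison, not values from the paper.

```python
import numpy as np

lam = 1.064e-6                       # wavelength (m), a typical illuminator band
k = 2 * np.pi / lam                  # optical wavenumber
L = 2000.0                           # path length to the RAM object (m)
for Cn2 in (1e-15, 1e-13):           # weak vs strong turbulence (m^-2/3)
    r0 = (0.423 * k**2 * Cn2 * L) ** (-3 / 5)                       # Fried parameter
    theta0 = (2.914 * k**2 * Cn2 * (3 / 8) * L ** (8 / 3)) ** (-3 / 5)  # isoplanatic angle
    print(f"Cn2={Cn2:.0e}: r0={r0*100:.1f} cm, theta0={theta0*1e6:.1f} µrad, "
          f"seeing ~ {lam/r0*1e6:.1f} µrad")
```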
ERIC Educational Resources Information Center
Holt, Maurice
1995-01-01
The central idea in W. Edwards Deming's approach to quality management is the need to improve process. Outcome-based education's central defect is its failure to address process. Deming would reject OBE along with management-by-objectives. Education is not a product defined by specific output measures, but a process to develop the mind. (MLH)
Increasing Critical Thinking in Web-Based Graduate Management Courses
ERIC Educational Resources Information Center
Condon, Conna; Valverde, Raul
2014-01-01
A common approach for demonstrating learning in online classrooms is through submittal of research essays of a discussion topic followed by classroom participation. Issues arose at an online campus of a university regarding the originality and quality of critical thinking in the original submittals. Achievement of new course objectives oriented to…
Scanlon, Dennis P; Alexander, Jeffrey A; Beich, Jeff; Christianson, Jon B; Hasnain-Wynia, Romana; McHugh, Megan C; Mittler, Jessica N; Shi, Yunfeng; Bodenschatz, Laura J
2012-09-01
The Aligning Forces for Quality (AF4Q) initiative is the Robert Wood Johnson Foundation's (RWJF's) signature effort to increase the overall quality of healthcare in targeted communities throughout the country. In addition to sponsoring this 16-site, complex program, the RWJF funds an independent scientific evaluation to support objective research on the initiative's effectiveness and contributions to basic knowledge in 5 core programmatic areas. The research design, data, and challenges faced in the evaluation of this 10-year initiative are discussed. A descriptive overview of the evaluation research design for a multi-site, community based, healthcare quality improvement initiative is provided. The multiphase research design employed by the evaluation team is discussed. Evaluation provides formative feedback to the RWJF, participants, and other interested audiences in real time; develops approaches to assess innovative and under-studied interventions; furthers the analysis and understanding of effective community-based collaborative work in healthcare; and helps to differentiate the various facilitators, barriers, and contextual dimensions that affect the implementation and outcomes of community-based health interventions. The AF4Q initiative is arguably the largest community-level healthcare improvement demonstration in the United States to date; it is being implemented at a time of rapid change in national healthcare policy. The implementation of large-scale, multi-site initiatives is becoming an increasingly common approach for addressing problems in healthcare. The evaluation research design for the AF4Q initiative, and the lessons learned from its approach, may be valuable to others tasked with evaluating similar community-based initiatives.
Pope, Ronald; Wu, Jianguo
2014-06-01
In the United States, air pollution is primarily measured by Air Quality Monitoring Networks (AQMN). These AQMNs have multiple objectives, including characterizing pollution patterns, protecting the public health, and determining compliance with air quality standards. In 2006, the U.S. Environmental Protection Agency issued a directive that air pollution agencies assess the performance of their AQMNs. Although various methods to design and assess AQMNs exist, here we demonstrate a geographic information system (GIS)-based approach that combines environmental, economic, and social indicators through the assessment of the ozone (O3) and particulate matter (PM10) networks in Maricopa County, Arizona. The assessment was conducted in three phases: (1) to evaluate the performance of the existing networks, (2) to identify areas that would benefit from the addition of new monitoring stations, and (3) to recommend changes to the AQMN. A comprehensive set of indicators was created for evaluating differing aspects of the AQMNs' objectives, and weights were applied to emphasize important indicators. Indicators were also classified according to their sustainable development goal. Our results showed that O3 was well represented in the county with some redundancy in terms of the urban monitors. The addition of weights to the indicators only had a minimal effect on the results. For O3, urban monitors had greater social scores, while rural monitors had greater environmental scores. The results did not suggest a need for adding more O3 monitoring sites. For PM10, clustered urban monitors were redundant, and weights also had a minimal effect on the results. The clustered urban monitors had overall low scores; sites near point sources had high environmental scores. Several areas were identified as needing additional PM10 monitors. This study demonstrates the usefulness of a multi-indicator approach to assess AQMNs. Network managers and planners may use this method to assess the performance of air quality monitoring networks in urban regions. The U.S. Environmental Protection Agency issued a directive in 2006 that air pollution agencies assess the performance of their AQMNs; as a result, we developed a GIS-based, multi-objective assessment approach that integrates environmental, economic, and social indicators, and demonstrates its use through assessing the O3 and PM10 monitoring networks in the Phoenix metropolitan area. We exhibit a method of assessing network performance and identifying areas that would benefit from new monitoring stations; also, we demonstrate the effect of adding weights to the indicators. Our study shows that using a multi-indicator approach gave detailed assessment results for the Phoenix AQMN.
Townley, Greg; Kloos, Bret
2014-12-01
There is disagreement in place-based research regarding whether objective indicators or individual perceptions of environments are better predictors of well-being. This study assessed environmental influences on well-being for 373 individuals with psychiatric disabilities living independently in 66 neighborhoods in the southeastern United States. Three questions were examined utilizing random effects models: (1) How much variance in personal and neighborhood well-being can be explained by neighborhood membership? (2) What is the relationship between participant perceptions of neighborhood quality and researcher ratings of neighborhood quality? and (3) What is the relative influence of individual perceptions, perceptions aggregated by neighborhood, and researcher ratings of neighborhood quality in predicting personal and neighborhood well-being? Results indicate that individual perceptions of neighborhood quality were more closely related to well-being than either aggregated perceptions or researcher ratings. Thus, participants' perceptions of their neighborhoods were more important indicators of their well-being than objective ratings made by researchers. Findings have implications for measurement approaches and intervention design in place-based research.
Economic Benefits of Improved Water Quality: Public Perceptions of Option and Preservation Values
NASA Astrophysics Data System (ADS)
Bouwes, Nicolaas W., Sr.
The primary objective of this book is to report the authors' research approach to the estimation of benefits of water quality improvements in the South Platte River of northeastern Colorado. Benefits included a “consumer surplus” from enhanced enjoyment of water-based recreation, an “option value” of assured choice of future recreation use, and a “preservation value” of the ecosystem and its bequest to future generations. Concepts such as preservation and option value benefits have often been mentioned but seldom estimated in natural resources research. The authors have met their objective by providing the reader with a detailed description of their research without being tedious.
Using game theory for perceptual tuned rate control algorithm in video coding
NASA Astrophysics Data System (ADS)
Luo, Jiancong; Ahmad, Ishfaq
2005-03-01
This paper proposes a game-theoretic rate control technique for video compression. Using a cooperative gaming approach, which has been utilized in several branches of the natural and social sciences because of its enormous potential for solving constrained optimization problems, we propose a dual-level scheme to optimize perceptual quality while guaranteeing "fairness" in bit allocation among macroblocks. At the frame level, the algorithm allocates target bits to frames based on their coding complexity. At the macroblock level, the algorithm distributes bits to macroblocks by defining a bargaining game. Macroblocks play cooperatively to compete for shares of resources (bits) to optimize their quantization scales while considering the Human Visual System's perceptual properties. Since the whole frame is an entity perceived by viewers, macroblocks compete cooperatively under a global objective of achieving the best quality with the given bit constraint. The major advantage of the proposed approach is that the cooperative game leads to an optimal and fair bit allocation strategy based on the Nash Bargaining Solution. Another advantage is that it allows multi-objective optimization with multiple decision makers (macroblocks). The simulation results testify to the algorithm's ability to achieve an accurate bit rate with good perceptual quality, and to maintain a stable buffer level.
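The bargaining step can be made concrete with a weighted Nash Bargaining Solution: maximize the weighted sum of log utilities subject to the frame bit budget, with perceptual weights acting as bargaining powers. The log1p rate-quality model, the weights and the budget below are invented for illustration; the paper's HVS-based utilities differ.

```python
import numpy as np
from scipy.optimize import minimize

w = np.array([1.0, 2.0, 4.0])          # perceptual weights (salient macroblocks higher)
B = 300.0                              # frame bit budget

def utility(b):                        # concave rate-quality model per macroblock
    return np.log1p(b)

def neg_weighted_nash(b):              # weights act as bargaining powers
    return -np.sum(w * np.log(utility(b) + 1e-12))

res = minimize(neg_weighted_nash, x0=np.full(3, B / 3), method="SLSQP",
               bounds=[(1e-6, B)] * 3,
               constraints={"type": "eq", "fun": lambda b: b.sum() - B})
print(np.round(res.x, 1))              # more bits flow to higher-weighted macroblocks
```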
Imaging with a small number of photons
Morris, Peter A.; Aspden, Reuben S.; Bell, Jessica E. C.; Boyd, Robert W.; Padgett, Miles J.
2015-01-01
Low-light-level imaging techniques have application in many diverse fields, ranging from biological sciences to security. A high-quality digital camera based on a multi-megapixel array will typically record an image by collecting of order 10^5 photons per pixel, but by how much could this photon flux be reduced? In this work we demonstrate a single-photon imaging system based on a time-gated intensified camera from which the image of an object can be inferred from very few detected photons. We show that a ghost-imaging configuration, where the image is obtained from photons that have never interacted with the object, is a useful approach for obtaining images with high signal-to-noise ratios. The use of heralded single photons ensures that the background counts can be virtually eliminated from the recorded images. By applying principles of image compression and associated image reconstruction, we obtain high-quality images of objects from raw data formed from an average of fewer than one detected photon per image pixel. PMID:25557090
Van Lierde, Kristiane M; De Bodt, Marc; Dhaeseleer, Evelien; Wuyts, Floris; Claeys, Sofie
2010-05-01
The purpose of the present study is to measure the effectiveness of two treatment techniques--vocalization with abdominal breath support and manual circumlaryngeal therapy (MCT)--in patients with muscle tension dysphonia (MTD). The vocal quality before and after the two treatment techniques was measured by means of the dysphonia severity index (DSI), which is designed to establish an objective and quantitative correlate of the perceived vocal quality. The DSI is based on the weighted combination of the following set of voice measurements: maximum phonation time (MPT), highest frequency, lowest intensity, and jitter. The repeated-measures analysis of variance (ANOVA) revealed a significant difference between the objective overall vocal quality before and after MCT. No significant differences were measured between the objective overall vocal quality before and after vocalization with abdominal breath support. This study showed evidence that MCT is an effective treatment technique for patients with elevated laryngeal position, increased laryngeal muscle tension, and MTD. The precise way in which MCT has an effect on vocal quality has not been addressed in this experiment, but merits study. Further research into this topic could focus on electromyography (EMG) recordings in relation to vocal improvements with larger sample of subjects.
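The abstract names the four DSI ingredients; the weighting commonly cited in the voice literature (Wuyts et al., 2000) combines them as below. Treat the coefficients as quoted from that literature rather than from this study; DSI near +5 suggests normal voice quality and values toward -5 severe dysphonia.

```python
def dysphonia_severity_index(mpt_s, f0_high_hz, i_low_db, jitter_pct):
    """MPT in seconds, highest F0 in Hz, lowest intensity in dB, jitter in percent."""
    return (0.13 * mpt_s + 0.0053 * f0_high_hz
            - 0.26 * i_low_db - 1.18 * jitter_pct + 12.4)

# Example values typical of a normophonic speaker give a DSI in the normal range:
print(dysphonia_severity_index(mpt_s=25, f0_high_hz=600, i_low_db=55, jitter_pct=0.2))
```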
Investigations of image fusion
NASA Astrophysics Data System (ADS)
Zhang, Zhong
1999-12-01
The objective of image fusion is to combine information from multiple images of the same scene. The result of image fusion is a single image which is more suitable for the purpose of human visual perception or further image processing tasks. In this thesis, a region-based fusion algorithm using the wavelet transform is proposed. The identification of important features in each image, such as edges and regions of interest, is used to guide the fusion process. The idea of multiscale grouping is also introduced, and a generic image fusion framework based on multiscale decomposition is studied. The framework includes all of the existing multiscale-decomposition-based fusion approaches we found in the literature which did not assume a statistical model for the source images. Comparisons indicate that our framework includes some new approaches which outperform the existing approaches for the cases we consider. Registration must precede our fusion algorithms, so we propose a hybrid scheme which uses both feature-based and intensity-based methods. The idea of robust estimation of optical flow from time-varying images is employed with a coarse-to-fine multi-resolution approach and feature-based registration to overcome some of the limitations of the intensity-based schemes. Experiments show that this approach is robust and efficient. Assessing image fusion performance in a real application is a complicated issue. In this dissertation, a mixture probability density function model is used in conjunction with the Expectation-Maximization algorithm to model histograms of edge intensity. Some new techniques are proposed for estimating the quality of a noisy image of a natural scene. Such quality measures can be used to guide the fusion. Finally, we study fusion of images obtained from several copies of a new type of camera developed for video surveillance. Our techniques increase the capability and reliability of the surveillance system and provide an easy way to obtain 3-D information about objects in the space monitored by the system.
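A minimal sketch of the classic multiscale fusion rule underlying such frameworks: decompose both registered sources with a 2D DWT, average the approximation bands, keep the larger-magnitude detail coefficient per position, and invert. It assumes PyWavelets is installed; the thesis's region-based guidance and quality-driven weighting are not reproduced.

```python
import numpy as np
import pywt

rng = np.random.default_rng(3)
img_a = rng.random((64, 64))            # stand-ins for two registered source images
img_b = rng.random((64, 64))

cA_a, details_a = pywt.dwt2(img_a, "db2")
cA_b, details_b = pywt.dwt2(img_b, "db2")

cA_f = 0.5 * (cA_a + cA_b)              # average the low-pass (approximation) bands
details_f = tuple(np.where(np.abs(da) >= np.abs(db), da, db)
                  for da, db in zip(details_a, details_b))  # max-abs rule per band
fused = pywt.idwt2((cA_f, details_f), "db2")
print(fused.shape)
```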
NASA Astrophysics Data System (ADS)
Shrivastava, Prashant Kumar; Pandey, Arun Kumar
2018-06-01
Inconel-718 is in high demand in various industries due to its superior mechanical properties. Traditional cutting methods face difficulties in cutting this alloy due to its low thermal conductivity, low elasticity and high chemical reactivity at elevated temperatures. The challenges of machining and/or finishing unusual shapes and/or sizes in such materials also confront traditional machining. Laser beam cutting may be applied for miniaturization and ultra-precision cutting and/or finishing by appropriate control of the different process parameters. This paper presents multi-objective optimization of the kerf deviation, kerf width and kerf taper in the laser cutting of Inconel-718 sheet. Second-order regression models have been developed for the different quality characteristics using the data obtained through experimentation. The regression models have been used as objective functions for multi-objective optimization based on a hybrid approach of multiple regression analysis and genetic algorithm. The comparison of the optimization results to the experimental results shows improvements of 88%, 10.63% and 42.15% in kerf deviation, kerf width and kerf taper, respectively. Finally, the effects of the different process parameters on the quality characteristics are also discussed.
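The two-stage recipe (second-order regression models used as objective functions for an evolutionary optimizer) can be sketched with synthetic stand-ins for the DoE data; scipy's differential evolution substitutes for the paper's genetic algorithm, and the quadratic responses, bounds and equal weights are assumptions.

```python
import numpy as np
from scipy.optimize import differential_evolution

rng = np.random.default_rng(6)
# Pretend DoE data: inputs (power, speed) scaled to [0, 1]; three kerf responses.
X = rng.random((30, 2))

def quad_features(x):                  # second-order (quadratic) model terms
    p, s = x[..., 0], x[..., 1]
    return np.stack([np.ones_like(p), p, s, p * s, p**2, s**2], axis=-1)

Y = np.stack([0.2 + 0.5 * (X[:, 0] - 0.3)**2 + 0.3 * X[:, 1],    # "kerf deviation"
              0.4 + 0.2 * X[:, 0] + 0.4 * (X[:, 1] - 0.6)**2,    # "kerf width"
              0.3 + 0.3 * (X[:, 0] - 0.5)**2 + 0.2 * (X[:, 1] - 0.4)**2], axis=1)
Y += 0.01 * rng.normal(size=Y.shape)

coef, *_ = np.linalg.lstsq(quad_features(X), Y, rcond=None)      # one model per response

def weighted_cost(x, w=(1/3, 1/3, 1/3)):
    return float(quad_features(np.asarray(x)) @ coef @ np.asarray(w))

res = differential_evolution(weighted_cost, bounds=[(0, 1), (0, 1)], seed=0)
print("optimal scaled parameters:", np.round(res.x, 3))
```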
Objective assessment of MPEG-2 video quality
NASA Astrophysics Data System (ADS)
Gastaldo, Paolo; Zunino, Rodolfo; Rovetta, Stefano
2002-07-01
The increasing use of video compression standards in broadcast television systems has required, in recent years, the development of video quality measurements that take into account artifacts specifically caused by digital compression techniques. In this paper we present a methodology for the objective quality assessment of MPEG video streams using circular back-propagation feedforward neural networks. Mapping neural networks can render nonlinear relationships between objective features and subjective judgments, thus avoiding any simplifying assumption about the complexity of the model. The neural network processes an instantaneous set of input values and yields an associated estimate of perceived quality. Therefore, the neural-network approach turns objective quality assessment into adaptive modeling of subjective perception. The objective features used for the estimate are chosen according to their assessed relevance to perceived quality and are continuously extracted in real time from compressed video streams. The overall system mimics perception but does not require any analytical model of the underlying physical phenomenon. The capability to process compressed video streams represents an important advantage over existing approaches, since avoiding the stream-decoding process greatly enhances real-time performance. Experimental results confirm that the system provides satisfactory, continuous-time approximations of actual scoring curves for real test videos.
Machine vision system: a tool for quality inspection of food and agricultural products.
Patel, Krishna Kumar; Kar, A; Jha, S N; Khan, M A
2012-04-01
Quality inspection of food and agricultural produce is difficult and labor intensive. At the same time, with increased expectations for food products of high quality and safety standards, the need for accurate, fast and objective determination of these characteristics in food products continues to grow. However, in India these operations are generally manual, which is costly as well as unreliable, because human judgment in identifying quality factors such as appearance, flavor, nutrient content, texture, etc., is inconsistent, subjective and slow. Machine vision provides an alternative: an automated, non-destructive and cost-effective technique to meet these requirements. This inspection approach, based on image analysis and processing, has found a variety of applications in the food industry. Considerable research has highlighted its potential for the inspection and grading of fruits and vegetables, grain quality and characteristic examination, and quality evaluation of other food products such as bakery products, pizza, cheese and noodles. The objective of this paper is to provide an in-depth introduction to machine vision systems, their components, and recent work reported on food and agricultural produce.
Assessment of visual landscape quality using IKONOS imagery.
Ozkan, Ulas Yunus
2014-07-01
The assessment of visual landscape quality is of importance to the management of urban woodlands. Satellite remote sensing may be used for this purpose as a substitute for traditional survey techniques that are both labour-intensive and time-consuming. This study examines the association between the perceived visual landscape quality of urban woodlands and texture measures extracted from IKONOS satellite data, which feature 4-m spatial resolution and four spectral bands. The study was conducted in the woodlands of Istanbul (the most important element of the urban mosaic) lying along both shores of the Bosporus Strait. The visual quality assessment applied in this study is based on the perceptual approach and was performed via a survey of expressed preferences. For this purpose, representative photographs of real scenery were used to elicit observers' preferences. A slide show comprising 33 images was presented to a group of 153 volunteers (all undergraduate students), who were asked to rate the visual quality of each on a 10-point scale (1 for very low visual quality, 10 for very high). Average visual quality scores were calculated for each landscape. Texture measures were acquired using two methods: pixel-based and object-based. Pixel-based texture measures were extracted from the first principal component (PC1) image; object-based texture measures were extracted using the original four bands. The association between image texture measures and perceived visual landscape quality was tested via Pearson's correlation coefficient. The analysis found a strong linear association between image texture measures and visual quality. The highest correlation coefficient was calculated between the standard deviation of gray levels (SDGL), one of the pixel-based texture measures, and visual quality (r = 0.82, P < 0.05). The results showed that the perceived visual quality of urban woodland landscapes can be estimated using texture measures extracted from satellite data in combination with appropriate modelling techniques.
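Since the reported headline result is a Pearson correlation of r = 0.82 between SDGL and preference scores, the computation itself is tiny; the synthetic patches standing in for PC1 windows and the fabricated score model below are for illustration only.

```python
import numpy as np
from scipy.stats import pearsonr

rng = np.random.default_rng(4)
# 33 scenes: synthetic "PC1" patches with varying texture, plus noisy quality scores.
patches = [rng.normal(0, 1 + 0.1 * i, (32, 32)) for i in range(33)]
sdgl = np.array([p.std() for p in patches])            # SDGL texture measure per scene
visual_quality = 2.0 * sdgl + rng.normal(0, 0.3, 33)   # pretend survey mean scores
r, p = pearsonr(sdgl, visual_quality)
print(f"r = {r:.2f}, p = {p:.3g}")
```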
Smith, Wade P; Doctor, Jason; Meyer, Jürgen; Kalet, Ira J; Phillips, Mark H
2009-06-01
The prognosis of cancer patients treated with intensity-modulated radiation therapy (IMRT) is inherently uncertain, depends on many decision variables, and requires that a physician balance competing objectives: maximum tumor control with minimal treatment complications. In order to better deal with the complex, multi-objective nature of the problem, we have combined a prognostic probabilistic model with multi-attribute decision theory, which incorporates patient preferences for outcomes. The response to IMRT for prostate cancer was modeled. A Bayesian network was used for prognosis for each treatment plan. Prognoses included predicting local tumor control, regional spread, distant metastases, and normal tissue complications resulting from treatment. A Markov model was constructed and used to calculate a quality-adjusted life expectancy, which aids the multi-attribute decision process. Our method makes explicit the tradeoffs patients face between quality and quantity of life. This approach has advantages over current approaches because, with our approach, the risks of health outcomes and patient preferences determine treatment decisions.
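The quality-adjusted life expectancy (QALE) computation from the Markov model can be sketched as a cohort simulation: multiply state occupancy by per-state utilities each cycle and accumulate. All states, transition probabilities and utilities below are invented placeholders, not values from the study.

```python
import numpy as np

states = ["disease-free", "local recurrence", "metastatic", "dead"]
P = np.array([[0.92, 0.05, 0.02, 0.01],     # yearly transition matrix (rows sum to 1)
              [0.00, 0.70, 0.20, 0.10],
              [0.00, 0.00, 0.70, 0.30],
              [0.00, 0.00, 0.00, 1.00]])
utility = np.array([0.95, 0.75, 0.50, 0.0])  # quality weight of a year in each state

occupancy = np.array([1.0, 0.0, 0.0, 0.0])   # cohort starts disease-free
qale = 0.0
for _ in range(40):                          # 40 one-year cycles
    qale += occupancy @ utility              # quality-adjusted years gained this cycle
    occupancy = occupancy @ P                # advance the cohort one year
print(f"QALE ≈ {qale:.1f} quality-adjusted life-years")
```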
Comparing image quality of print-on-demand books and photobooks from web-based vendors
NASA Astrophysics Data System (ADS)
Phillips, Jonathan; Bajorski, Peter; Burns, Peter; Fredericks, Erin; Rosen, Mitchell
2010-01-01
Because of the emergence of e-commerce and developments in print engines designed for economical output of very short runs, there are increased business opportunities and consumer options for print-on-demand books and photobooks. The current state of these printing modes allows for direct uploading of book files via the web, printing on nonoffset printers, and distributing by standard parcel or mail delivery services. The goal of this research is to assess the image quality of print-on-demand books and photobooks produced by various Web-based vendors and to identify correlations between psychophysical results and objective metrics. Six vendors were identified for one-off (single-copy) print-on-demand books, and seven vendors were identified for photobooks. Participants rank ordered overall quality of a subset of individual pages from each book, where the pages included text, photographs, or a combination of the two. Observers also reported overall quality ratings and price estimates for the bound books. Objective metrics of color gamut, color accuracy, accuracy of International Color Consortium profile usage, eye-weighted root mean square L*, and cascaded modulation transfer acutance were obtained and compared to the observer responses. We introduce some new methods for normalizing data as well as for strengthening the statistical significance of the results. Our approach includes the use of latent mixed-effect models. We found statistically significant correlation with overall image quality and some of the spatial metrics, but correlations between psychophysical results and other objective metrics were weak or nonexistent. Strong correlation was found between psychophysical results of overall quality assessment and estimated price associated with quality. The photobook set of vendors reached higher image-quality ratings than the set of print-on-demand vendors. However, the photobook set had higher image-quality variability.
Bicycles, transportation sustainability, and quality of life.
DOT National Transportation Integrated Search
2014-01-01
The research presented in this report focuses on the exploration of a variety of objective and subjective quality of life indicators and approaches for bicycle transportation using a mixed methods approach. The authors have created a conceptual frame...
A no-reference video quality assessment metric based on ROI
NASA Astrophysics Data System (ADS)
Jia, Lixiu; Zhong, Xuefei; Tu, Yan; Niu, Wenjuan
2015-01-01
A no-reference video quality assessment metric based on the region of interest (ROI) is proposed in this paper. In the metric, objective video quality is evaluated by integrating the quality of two compression artifacts, i.e. blurring distortion and blocking distortion. A Gaussian kernel function was used to extract human density maps for the H.264-coded videos from subjective eye-tracking data. An objective bottom-up ROI extraction model was built based on the magnitude discrepancy of the discrete wavelet transform between two consecutive frames, a center-weighted color opponent model, a luminance contrast model, and a frequency saliency model based on spectral residual. Only the objective saliency maps were then used to compute the objective blurring and blocking quality. The results indicate that the objective ROI extraction metric has a higher area under the curve (AUC) value. Compared with conventional video quality assessment metrics, which measure all video frames, the metric proposed in this paper not only decreases the computational complexity but also improves the correlation between subjective mean opinion scores (MOS) and objective scores.
Semi-automated extraction of landslides in Taiwan based on SPOT imagery and DEMs
NASA Astrophysics Data System (ADS)
Eisank, Clemens; Hölbling, Daniel; Friedl, Barbara; Chen, Yi-Chin; Chang, Kang-Tsung
2014-05-01
The vast availability and improved quality of optical satellite data and digital elevation models (DEMs), as well as the need for complete and up-to-date landslide inventories at various spatial scales, have fostered the development of semi-automated landslide recognition systems. Among the tested approaches for designing such systems, object-based image analysis (OBIA) has stood out as a highly promising methodology. OBIA offers a flexible, spatially enabled framework for effective landslide mapping. Most object-based landslide mapping systems, however, have been tailored to specific, mainly small-scale study areas or even to single landslides only. Even though reported mapping accuracies tend to be higher than for pixel-based approaches, accuracy values are still relatively low and depend on the particular study. There is still room to improve the applicability and objectivity of object-based landslide mapping systems. The presented study aims at developing a knowledge-based landslide mapping system implemented in an OBIA environment, i.e. Trimble eCognition. In comparison to previous knowledge-based approaches, the classification of segmentation-derived multi-scale image objects relies on digital landslide signatures. These signatures hold the common operational knowledge on digital landslide mapping, as reported by 25 Taiwanese landslide experts during personal semi-structured interviews. Specifically, the signatures include information on commonly used data layers, spectral and spatial features, and feature thresholds. The signatures guide the selection and implementation of mapping rules that were finally encoded in Cognition Network Language (CNL). Multi-scale image segmentation is optimized by using the improved Estimation of Scale Parameter (ESP) tool. The approach described above is developed and tested for mapping landslides in a sub-region of the Baichi catchment in Northern Taiwan based on SPOT imagery and a high-resolution DEM. An object-based accuracy assessment is conducted by quantitatively comparing extracted landslide objects with landslide polygons that were visually interpreted by local experts. The applicability and transferability of the mapping system are evaluated by comparing initial accuracies with those achieved in the following two tests: first, using a SPOT image from the same year but for a different area within the Baichi catchment; second, using SPOT images from multiple years for the same region. The integration of common knowledge via digital landslide signatures is new in object-based landslide studies. In combination with strategies to optimize image segmentation, this may lead to a more objective, transferable and stable knowledge-based system for the mapping of landslides from optical satellite data and DEMs.
Automatic assessment of voice quality according to the GRBAS scale.
Sáenz-Lechón, Nicolás; Godino-Llorente, Juan I; Osma-Ruiz, Víctor; Blanco-Velasco, Manuel; Cruz-Roldán, Fernando
2006-01-01
Nowadays, the most widespread techniques for measuring voice quality are based on perceptual evaluation by well-trained professionals. The GRBAS scale is a widely used method for the perceptual evaluation of voice quality, common in Japan and attracting increasing interest in both Europe and the United States. However, this technique needs well-trained experts, is based on the evaluator's expertise, and depends heavily on the evaluator's own psycho-physical state; moreover, great variability is observed in the assessments from one evaluator to another. Therefore, an objective method to provide such a measurement of voice quality would be very valuable. In this paper, the automatic assessment of voice quality is addressed by means of short-term mel-frequency cepstral coefficients (MFCC) and learning vector quantization (LVQ) in a pattern recognition stage. Results show that this approach provides acceptable results for this purpose, with accuracy around 65% at best.
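The MFCC-plus-LVQ pipeline can be sketched end to end with synthetic tones standing in for normal versus dysphonic voices; librosa provides the MFCC front end and the LVQ1 update is written out by hand. The sampling rate, learning rate and class construction are assumptions, not the paper's corpus or settings.

```python
import numpy as np
import librosa

rng = np.random.default_rng(5)
sr = 16000

def mfcc_vector(freq, noise):
    t = np.arange(sr) / sr                         # one second of synthetic "voice"
    y = np.sin(2 * np.pi * freq * t) + noise * rng.normal(size=sr)
    return librosa.feature.mfcc(y=y.astype(np.float32), sr=sr, n_mfcc=13).mean(axis=1)

# Class 0: clean phonation; class 1: noisy (dysphonic-like) phonation.
X = np.array([mfcc_vector(150, 0.05) for _ in range(10)] +
             [mfcc_vector(150, 0.8) for _ in range(10)])
labels = np.array([0] * 10 + [1] * 10)

# LVQ1: one prototype per class, pulled toward correct samples, pushed from wrong ones.
protos = np.array([X[labels == c].mean(axis=0) for c in (0, 1)])
alpha = 0.05
for _ in range(30):
    for x, c in zip(X, labels):
        nearest = np.argmin(((protos - x) ** 2).sum(axis=1))
        protos[nearest] += alpha * (x - protos[nearest]) * (1 if nearest == c else -1)

pred = np.array([np.argmin(((protos - x) ** 2).sum(axis=1)) for x in X])
print("training accuracy:", (pred == labels).mean())
```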
NASA Technical Reports Server (NTRS)
Grantham, William D.
1989-01-01
The primary objective was to provide information to the flight controls/flying qualities engineer that will assist him in determining the incremental flying qualities and/or pilot-performance differences that may be expected between results obtained via ground-based simulation (in particular, the six-degree-of-freedom Langley Visual/Motion Simulator (VMS)) and flight tests. Pilot opinion and performance parameters derived from a ground-based simulator and an in-flight simulator are compared for a jet-transport airplane having 32 different longitudinal dynamic response characteristics. The primary pilot tasks were the approach and landing tasks, with emphasis on the landing-flare task. The results indicate that, in general, flying qualities results obtained from the ground-based simulator may be considered conservative, especially when the pilot task requires tight pilot control, as during the landing flare. The one exception, according to the present study, was that the pilots were more tolerant of large time delays in the airplane response on the ground-based simulator. The results also indicated that the ground-based simulator (particularly the Langley VMS) is not adequate for assessing pilot/vehicle performance capabilities (i.e., the sink-rate performance for the landing-flare task when the pilot has little depth/height perception from the outside scene presentation).
Super-resolution imaging applied to moving object tracking
NASA Astrophysics Data System (ADS)
Swalaganata, Galandaru; Ratna Sulistyaningrum, Dwi; Setiyono, Budi
2017-10-01
Moving object tracking in video is a method used to detect and analyze changes that occur in an object being observed. High visual quality and precise localization of the tracked target are desired in modern tracking systems, but the tracked object does not always appear clearly, which makes the tracking result less precise. The reasons include low-quality video, system noise, small object size, and other factors. In order to improve the precision of the tracked object, especially for small objects, we propose a two-step solution that integrates a super-resolution technique into the tracking approach. The first step is super-resolution imaging applied to the frame sequence, done by cropping several frames or all of the frames. The second step is tracking on the super-resolved images. Super-resolution imaging is a technique for obtaining high-resolution images from low-resolution images. In this research, a single-frame super-resolution technique is proposed for the tracking approach; single-frame super-resolution has the advantage of fast computation time. The method used for tracking is Camshift, whose advantages are its simple calculation based on the HSV color histogram and its robustness when the color of the object varies. The computational complexity and large memory requirements for implementing super-resolution and tracking were reduced, and the precision of the tracked target was good. Experiments showed that integrating super-resolution imaging into the tracking technique can track the object precisely with various backgrounds, shape changes of the object, and under good lighting conditions.
A Bio-Inspired Herbal Tea Flavour Assessment Technique
Zakaria, Nur Zawatil Isqi; Masnan, Maz Jamilah; Zakaria, Ammar; Shakaff, Ali Yeon Md
2014-01-01
Herbal-based products are becoming a widespread production trend among manufacturers for the domestic and international markets. As production increases to meet market demand, it is crucial for manufacturers to ensure that their products meet specific criteria and fulfil the intended quality determined by the quality controller. One famous herbal-based product is herbal tea. This paper investigates bio-inspired flavour assessments in a data fusion framework involving an e-nose and an e-tongue. The objectives are to attain good classification of different types and brands of herbal tea, classification of different flavour-masking effects, and finally classification of different concentrations of herbal tea. Two data fusion levels were employed in this research: low-level data fusion and intermediate-level data fusion. Four classification approaches (LDA, SVM, KNN and PNN) were examined in search of the best classifier to achieve the research objectives. In order to evaluate the classifiers' performance, error estimators based on k-fold cross-validation and leave-one-out were applied. Classification based on GC-MS TIC data was also included as a comparison to the classification performance of the fusion approaches. Generally, KNN outperformed the other classification techniques for the three flavour assessments in both low-level and intermediate-level data fusion; however, the classification results based on GC-MS TIC data varied. PMID:25010697
NASA Astrophysics Data System (ADS)
Castelletti, A.; Pianosi, F.; Soncini-Sessa, R.; Antenucci, J. P.
2010-06-01
Improved data collection techniques as well as increasing computing power are opening up new opportunities for the development of sophisticated models that can accurately reproduce the hydrodynamic and biochemical conditions of water bodies. While increasing model complexity is considered a virtue for scientific purposes, it is a definite disadvantage for management (engineering) purposes, as it limits the model's applicability to what-if analysis over a few a priori defined interventions. In the recent past, this has become a significant limitation, particularly considering recent advances in water quality rehabilitation technologies (e.g., mixers or oxygenators) for which many design parameters have to be decided. In this paper, a novel approach toward integrating science-oriented and engineering-oriented models and improving water quality planning is presented. It is based on the use of a few appropriately designed simulations of a complex process-based model to iteratively identify the multidimensional function (response surface) that maps the rehabilitation interventions into the objective function. On the basis of the response surface (RS), a greater number of interventions can be quickly evaluated and the corresponding Pareto front can be approximated. Interesting points on the front are then selected and the corresponding interventions are simulated using the original process-based model, thus obtaining new decision-objective samples to refine the RS approximation. The approach is demonstrated on Googong Reservoir (Australia), which is periodically affected by high concentrations of manganese and cyanobacteria. Results indicate that significant improvements could be obtained by simply changing the location of the two mixers installed in 2007; the analysis also suggests the best location for an additional pair of mixers.
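The iterative response-surface loop can be sketched as follows, with an invented toy objective standing in for the process-based model and a Gaussian-process surrogate standing in for the RS; a single objective is used here for brevity, whereas the paper approximates a Pareto front over several objectives.

    import numpy as np
    from sklearn.gaussian_process import GaussianProcessRegressor

    def expensive_model(x):
        # placeholder for a long-running hydrodynamic/biochemical simulation;
        # x encodes the intervention (e.g., mixer position along the reservoir)
        return float(np.sin(3 * x[0]) + 0.5 * x[0] ** 2)

    X = np.linspace(0.0, 2.0, 5).reshape(-1, 1)           # a few designed simulations
    y = np.array([expensive_model(x) for x in X])

    for _ in range(4):                                    # iterative RS refinement
        gp = GaussianProcessRegressor(normalize_y=True).fit(X, y)
        cand = np.linspace(0.0, 2.0, 500).reshape(-1, 1)  # cheap screening on the RS
        best = cand[np.argmin(gp.predict(cand))]
        X = np.vstack([X, best])                          # re-simulate a promising point
        y = np.append(y, expensive_model(best))

    print("best intervention found:", X[np.argmin(y)][0])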
Water quality monitoring strategies - A review and future perspectives.
Behmel, S; Damour, M; Ludwig, R; Rodriguez, M J
2016-11-15
The reliable assessment of water quality through water quality monitoring programs (WQMPs) is crucial for decision-makers to understand, interpret and use this information in support of management activities aimed at protecting the resource. The challenge of water quality monitoring has been widely addressed in the literature since the 1940s. However, there is still no generally accepted, holistic and practical strategy to support all phases of WQMPs. The purpose of this paper is to report on the use cases a watershed manager has to address to plan or optimize a WQMP, from identifying monitoring objectives, selecting sampling sites and water quality parameters, and identifying sampling frequencies, to considering logistics and resources and implementing actions based on the information acquired through the WQMP. An inventory and critique of the information, approaches and tools at the disposal of watershed managers is provided to evaluate how the existing information could be integrated into a holistic, user-friendly and evolvable solution. Given differences in regulatory requirements, water quality standards, geographical and geological conditions, land-use variations, and other site specificities, a one-size-fits-all solution is not possible. However, we argue that an intelligent decision support system (IDSS) based on expert knowledge that integrates existing approaches and past research can guide a watershed manager through the process according to his or her site-specific requirements. It is also necessary to tap into local knowledge and to identify the knowledge needs of all stakeholders through participative approaches based on geographical information systems and adaptive survey-based questionnaires. We believe that future research should focus on developing such participative approaches and further investigate the benefits of IDSSs that can be updated quickly and make it possible for a watershed manager to obtain a timely, holistic view of and support for every aspect of planning and optimizing a WQMP. Copyright © 2016 Elsevier B.V. All rights reserved.
Volumetric Medical Image Coding: An Object-based, Lossy-to-lossless and Fully Scalable Approach
Danyali, Habibiollah; Mertins, Alfred
2011-01-01
In this article, an object-based, highly scalable, lossy-to-lossless 3D wavelet coding approach for volumetric medical image data (e.g., magnetic resonance (MR) and computed tomography (CT)) is proposed. The new method, called 3DOBHS-SPIHT, is based on the well-known set partitioning in hierarchical trees (SPIHT) algorithm and supports both quality and resolution scalability. The 3D input data are grouped into groups of slices (GOS), and each GOS is encoded and decoded as a separate unit. The symmetric tree definition of the original 3DSPIHT is improved by introducing a new asymmetric tree structure. While preserving the compression efficiency, the new tree structure allows for a small size of each GOS, which not only reduces memory consumption during the encoding and decoding processes, but also facilitates more efficient random access to certain segments of slices. To achieve more compression efficiency, the algorithm encodes only the main object of interest in each 3D data set, which can have any arbitrary shape, and ignores the unnecessary background. The experimental results on some MR data sets show the good performance of the 3DOBHS-SPIHT algorithm for multi-resolution lossy-to-lossless coding. The compression efficiency, full scalability, and object-based features of the proposed approach, besides its lossy-to-lossless coding support, make it a very attractive candidate for volumetric medical image archiving and transmission applications. PMID:22606653
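A minimal sketch of the GOS idea (PyWavelets assumed available; the SPIHT bit-plane coder itself is omitted): the volume is split into small groups of slices and each group receives its own 3D wavelet decomposition, keeping memory bounded and enabling random access per group.

    import numpy as np
    import pywt

    volume = np.random.rand(64, 128, 128)    # assumed volume (slices, rows, cols)
    gos_size = 16                            # small GOS: low memory, easy random access

    for start in range(0, volume.shape[0], gos_size):
        gos = volume[start:start + gos_size]
        coeffs = pywt.wavedecn(gos, wavelet="bior4.4", level=3)  # 3D decomposition
        # ... pass `coeffs` to a SPIHT-style set-partitioning coder, one GOS at a time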
Exploring the feasibility of traditional image querying tasks for industrial radiographs
NASA Astrophysics Data System (ADS)
Bray, Iliana E.; Tsai, Stephany J.; Jimenez, Edward S.
2015-08-01
Although there have been great strides in object recognition with optical images (photographs), there has been comparatively little research into object recognition for X-ray radiographs. Our exploratory work contributes to this area by creating an object recognition system designed to recognize components from a related database of radiographs. Object recognition for radiographs must be approached differently than for optical images, because radiographs have much less color-based information to distinguish objects, and they exhibit transmission overlap that alters perceived object shapes. The dataset used in this work contained more than 55,000 intermixed radiographs and photographs, all in a compressed JPEG form and with multiple ways of describing pixel information. For this work, a robust and efficient system is needed to combat problems presented by properties of the X-ray imaging modality, the large size of the given database, and the quality of the images contained in said database. We have explored various pre-processing techniques to clean the cluttered and low-quality images in the database, and we have developed our object recognition system by combining multiple object detection and feature extraction methods. We present the preliminary results of the still-evolving hybrid object recognition system.
[Clinical autopsies. Practical approach, legal foundations and ethical considerations].
Friemann, J
2010-07-01
Only an autopsy can demonstrate topographical and morphological circumstances in detail and correlate the clinical and autopsy findings based on the examination of all organs. The practical approach to a fatality is described using the example of the Lüdenscheid Hospital. A uniform legal regulation for dealing with corpses does not exist in Germany. There are two approaches to the question of the circumstances under which a clinical autopsy is allowed: the extended permission solution and the objection solution. Whether a clinical autopsy can be carried out is decided by the medical specialist selected on application. Autopsies can be necessary on insurance or administrative legal grounds; in the case of an anatomical autopsy, the decision is made by the person concerned themselves. In order to guarantee the quality of an autopsy it is necessary to use a standardized approach with evaluation and assessment of the results, for example using a quality assurance protocol and the production of an autopsy report. Using this approach, important information can be gained not only on the accuracy of the main diagnosis and cause of death but also on additional diseases, response to therapy and the course of the disease, and in some circumstances it can lead to modifications of the clinical approach.
NASA Astrophysics Data System (ADS)
Dong, Jian; Kudo, Hiroyuki
2017-03-01
Compressed sensing (CS) is attracting growing attention in sparse-view computed tomography (CT) image reconstruction. The most standard CS approach is total variation (TV) minimization. However, images reconstructed by TV often suffer from distortions, especially in reconstructions of practical CT images, in the form of patchy artifacts, improperly serrated edges, and loss of image texture. Most existing CS approaches, including TV, improve image quality by applying linear transforms to the object image, but linear transforms usually fail to take discontinuities such as edges and textures into account, which is considered the key reason for these distortions. Discussion of nonlinear-filter-based image processing has a long history, and it clarifies that nonlinear filters yield better results than linear filters in tasks such as denoising. The median root prior was first utilized by Alenius as a nonlinear transform in CT image reconstruction, with significant gains. Subsequently, Zhang developed the application of nonlocal-means-based CS. It is gradually becoming clear that nonlinear-transform-based CS is superior to linear-transform-based CS in improving image quality; to the best of our knowledge, however, this has not been clearly concluded in any previous paper. In this work, we investigated the image quality differences between conventional TV minimization and nonlinear-sparsifying-transform-based CS, as well as the differences among different nonlinear-sparsifying-transform-based CS methods, in sparse-view CT image reconstruction. Additionally, we accelerated the implementation of the nonlinear-sparsifying-transform-based CS algorithm.
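For orientation, here is a minimal sketch of the conventional linear-transform baseline (TV-regularized reconstruction), not the paper's nonlinear method: alternate a gradient step on the data-fidelity term with a TV denoising step (Chambolle's algorithm from scikit-image); the system matrix A is assumed given, e.g. as a SciPy sparse matrix.

    import numpy as np
    from skimage.restoration import denoise_tv_chambolle

    def reconstruct(A, sino, shape, n_iter=50, step=1e-3, tv_weight=0.1):
        x = np.zeros(A.shape[1])
        for _ in range(n_iter):
            x = x - step * (A.T @ (A @ x - sino))   # gradient step on 0.5*||Ax - b||^2
            # TV step promotes sparsity of the image gradient (the linear transform)
            x = denoise_tv_chambolle(x.reshape(shape), weight=tv_weight).ravel()
            x = np.clip(x, 0.0, None)               # nonnegative attenuation values
        return x.reshape(shape)

A nonlinear-transform variant in the spirit discussed above would replace the TV step with, e.g., a nonlocal-means or median-root-prior filtering step.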
Bhidayasiri, Roongroj; Trenkwalder, Claudia
2018-05-01
When Parkinson's disease (PD) patients are asked about the quality of their sleep, their answers are dominated by difficulties associated with impaired mobility in bed, medically referred to as nocturnal hypokinesia. Nocturnal hypokinesia is symptomatic from the mid-stage of the disease, affects up to 70% of PD patients, and contributes to poor sleep quality and increased carer burden. Here we explore four areas of nocturnal hypokinesia that are relevant to clinical practice, namely: manifestations and definition; clinical assessment and objective monitoring; etiologies and contributing factors; and evidence-based therapeutic approaches. In addition, we provide an operational definition of what constitutes nocturnal hypokinesia and outline different methods of assessment, ranging from clinical interviews and rating scales to objective night-time monitoring with inertial sensors. Optimal management of nocturnal hypokinesia in PD begins with recognizing its manifestation by inquiring about cardinal symptoms and contributing factors from not only patients but also carers, followed by formal assessment and the application of individualized evidence-based treatment. Night-time dopaminergic treatment is the primary therapy; however, careful clinical judgment is required to balance the benefits with the potential adverse events related to nocturnal dopaminergic stimulation. Future studies are needed to explore the practicality of home-based objective assessment of nocturnal hypokinesia, new therapeutic options not limited to dopaminergic medications, and non-pharmacologic approaches, including training on compensatory strategies and bedroom adaptations. Copyright © 2018 Elsevier Ltd. All rights reserved.
An object programming based environment for protein secondary structure prediction.
Giacomini, M; Ruggiero, C; Sacile, R
1996-01-01
The most frequently used methods for protein secondary structure prediction are empirical statistical methods and rule based methods. A consensus system based on object-oriented programming is presented, which integrates the two approaches with the aim of improving the prediction quality. This system uses an object-oriented knowledge representation based on the concepts of conformation, residue and protein, where the conformation class is the basis, the residue class derives from it and the protein class derives from the residue class. The system has been tested with satisfactory results on several proteins of the Brookhaven Protein Data Bank. Its results have been compared with the results of the most widely used prediction methods, and they show a higher prediction capability and greater stability. Moreover, the system itself provides an index of the reliability of its current prediction. This system can also be regarded as a basis structure for programs of this kind.
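A minimal sketch of the stated class hierarchy (method bodies are illustrative placeholders, not the published system):

    from collections import Counter

    class Conformation:
        """Secondary-structure state (helix/sheet/coil) with a reliability index."""
        def __init__(self, state="coil", reliability=0.0):
            self.state = state
            self.reliability = reliability

    class Residue(Conformation):
        """A residue adds its amino-acid code to a predicted conformation."""
        def __init__(self, code, state="coil", reliability=0.0):
            super().__init__(state, reliability)
            self.code = code

    class Protein(Residue):
        """Per the paper's design, Protein derives from Residue; it aggregates a
        chain of residues and forms a consensus over several predictors."""
        def __init__(self, residues):
            super().__init__(code=None)
            self.residues = residues

        def consensus(self, predictions):
            # predictions: one per-residue state sequence per method;
            # simple majority vote as an illustrative consensus rule (assumption)
            return [Counter(states).most_common(1)[0][0]
                    for states in zip(*predictions)]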
Incoherent coincidence imaging of space objects
NASA Astrophysics Data System (ADS)
Mao, Tianyi; Chen, Qian; He, Weiji; Gu, Guohua
2016-10-01
Incoherent Coincidence Imaging (ICI), which is based on the second- or higher-order correlation of a fluctuating light field, offers great potential compared with standard conventional imaging. However, the need for a reference arm limits its practical application to the detection of space objects. In this article, an optical aperture synthesis with electronically connected single-pixel photo-detectors is proposed to remove the reference arm. The correlation in our proposed method is the second-order correlation between the intensity fluctuations observed by any two detectors. With appropriate locations of the single-pixel detectors, this second-order correlation simplifies to the squared magnitude of the Fourier transform of the source and the unknown object. We demonstrate image recovery with Gerchberg-Saxton-like algorithms and investigate the reconstruction quality of our approach. Numerical experiments show that both binary and gray-scale objects can be recovered. The proposed method provides an effective approach to the detection of space objects and perhaps even exo-planets.
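A minimal sketch, under simplifying assumptions (2D object, full-field Fourier modulus), of a Gerchberg-Saxton-style error-reduction loop: keep the measured Fourier modulus, and enforce support and nonnegativity in object space.

    import numpy as np

    def gs_retrieve(fourier_modulus, support, n_iter=200, seed=0):
        rng = np.random.default_rng(seed)
        G = fourier_modulus * np.exp(1j * 2 * np.pi * rng.random(fourier_modulus.shape))
        for _ in range(n_iter):
            g = np.fft.ifft2(G).real                        # back to object space
            g = np.where(support & (g > 0), g, 0.0)         # support + nonnegativity
            G = np.fft.fft2(g)
            G = fourier_modulus * np.exp(1j * np.angle(G))  # keep measured modulus
        return g

    obj = np.zeros((64, 64)); obj[24:40, 28:44] = 1.0       # toy binary object
    support = np.zeros((64, 64), bool); support[16:48, 20:52] = True
    rec = gs_retrieve(np.abs(np.fft.fft2(obj)), support)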
Analyzing compound and project progress through multi-objective-based compound quality assessment.
Nissink, J Willem M; Degorce, Sébastien
2013-05-01
Compound-quality scoring methods designed to evaluate multiple drug properties concurrently are useful for analyzing and prioritizing the output of drug-design efforts. However, formalized multiparameter optimization approaches are not widely used in drug design. We rank molecules synthesized in drug-discovery projects using simple and aggregated desirability functions reflecting medicinal chemistry 'rules'. Our quality score deals transparently with missing data, a key requirement in drug-hunting projects where data availability is often limited. We further estimate confidence in the interpretation of such a compound-quality measure. Scores and associated confidences provide systematic insight into the quality of emerging chemical equity. Tracking the quality of synthetic output over time yields valuable insight into the progress of drug-design teams, with potential applications in the risk and resource management of a drug portfolio.
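A minimal sketch, with invented property names and cut-offs, of a desirability-based score that treats missing data transparently: available properties map to [0, 1] desirabilities, the score is their geometric mean, and confidence reflects data coverage.

    import numpy as np

    def desirability(value, lo, hi):
        # linear ramp for a lower-is-better property: 1 below lo, 0 above hi
        if value is None:
            return None
        return float(np.clip((hi - value) / (hi - lo), 0.0, 1.0))

    def quality_score(props):
        # illustrative medicinal-chemistry 'rules' (assumed thresholds)
        d = [desirability(props.get("logD"), 1.0, 5.0),
             desirability(props.get("mol_weight"), 400.0, 550.0),
             desirability(props.get("herg_pic50"), 4.0, 6.0)]
        avail = [v for v in d if v is not None]
        score = float(np.exp(np.log(np.maximum(avail, 1e-6)).mean()))  # geometric mean
        confidence = len(avail) / len(d)      # fraction of properties actually measured
        return score, confidence

    print(quality_score({"logD": 2.8, "mol_weight": 470.0}))  # hERG value missing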
On the evaluation of segmentation editing tools
Heckel, Frank; Moltz, Jan H.; Meine, Hans; Geisler, Benjamin; Kießling, Andreas; D’Anastasi, Melvin; dos Santos, Daniel Pinto; Theruvath, Ashok Joseph; Hahn, Horst K.
2014-01-01
Efficient segmentation editing tools are important components in the segmentation process, as no automatic methods exist that always generate sufficient results. Evaluating segmentation editing algorithms is challenging, because their quality depends on the user’s subjective impression. So far, no established methods for an objective, comprehensive evaluation of such tools exist and, particularly, intermediate segmentation results are not taken into account. We discuss the evaluation of editing algorithms in the context of tumor segmentation in computed tomography. We propose a rating scheme to qualitatively measure the accuracy and efficiency of editing tools in user studies. In order to objectively summarize the overall quality, we propose two scores based on the subjective rating and the quantified segmentation quality over time. Finally, a simulation-based evaluation approach is discussed, which allows a more reproducible evaluation without the need for human input. This automated evaluation complements user studies, allowing a more convincing evaluation, particularly during development, where frequent user studies are not possible. The proposed methods have been used to evaluate two dedicated editing algorithms on 131 representative tumor segmentations. We show how the comparison of editing algorithms benefits from the proposed methods. Our results also show the correlation of the suggested quality score with the qualitative ratings. PMID:26158063
ERIC Educational Resources Information Center
Hsiao, Kuo-Lun; Huang, Tien-Chi; Chen, Mu-Yen; Chiang, Nien-Ting
2018-01-01
Although ubiquitous learning is a novel and creative teaching approach, two key issues inhibit its overall success: a lack of appropriate learning strategies regarding learning objectives, and ineffective learning tools for receiving knowledge of the chosen subjects. To address these issues, this study develops and designs a game-based educational…
Kable, Ashley K; Levett-Jones, Tracy L; Arthur, Carol; Reid-Searl, Kerry; Humphreys, Melanie; Morris, Sara; Walsh, Pauline; Witton, Nicola J
2018-01-01
The aim of this paper is to report the results of a cross-national study that evaluated a range of simulation sessions using an observation schedule developed from evidence-based quality indicators. Observational data were collected from 17 simulation sessions conducted for undergraduate nursing students at three universities in Australia and the United Kingdom. The observation schedule contained 27 questions that rated simulation quality. Data were collected by direct observation and from video recordings of the simulation sessions. Results indicated that the highest quality scores were for the provision of learning objectives prior to the simulation session (90%) and debriefing (72%). Student preparation and orientation (67%) and perceived realism and fidelity (67%) were scored lower than other components of the simulation sessions. This observational study proved to be an effective strategy to identify areas of strength and those needing further development to improve simulation sessions. Copyright © 2017 Elsevier Ltd. All rights reserved.
Models of formation and some algorithms of hyperspectral image processing
NASA Astrophysics Data System (ADS)
Achmetov, R. N.; Stratilatov, N. R.; Yudakov, A. A.; Vezenov, V. I.; Eremeev, V. V.
2014-12-01
Algorithms and information technologies for processing Earth hyperspectral imagery are presented. Several new approaches are discussed. Peculiar properties of processing the hyperspectral imagery, such as multifold signal-to-noise reduction, atmospheric distortions, access to spectral characteristics of every image point, and high dimensionality of data, were studied. Different measures of similarity between individual hyperspectral image points and the effect of additive uncorrelated noise on these measures were analyzed. It was shown that these measures are substantially affected by noise, and a new measure free of this disadvantage was proposed. The problem of detecting the observed scene object boundaries, based on comparing the spectral characteristics of image points, is considered. It was shown that contours are processed much better when spectral characteristics are used instead of energy brightness. A statistical approach to the correction of atmospheric distortions, which makes it possible to solve the stated problem based on analysis of a distorted image in contrast to analytical multiparametric models, was proposed. Several algorithms used to integrate spectral zonal images with data from other survey systems, which make it possible to image observed scene objects with a higher quality, are considered. Quality characteristics of hyperspectral data processing were proposed and studied.
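As one concrete example of the similarity analysis described above, the spectral angle between two pixel signatures can be computed and probed under additive uncorrelated noise (the paper's improved, noise-robust measure is not reproduced here):

    import numpy as np

    def spectral_angle(a, b):
        cos = np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b))
        return np.arccos(np.clip(cos, -1.0, 1.0))

    rng = np.random.default_rng(1)
    s = rng.random(200)                       # a pixel's signature over 200 bands
    for sigma in (0.0, 0.02, 0.1):
        noisy = s + rng.normal(scale=sigma, size=s.size)  # additive uncorrelated noise
        print(sigma, spectral_angle(s, noisy))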
Objective vocal quality in children using cochlear implants: a multiparameter approach.
Baudonck, Nele; D'haeseleer, Evelien; Dhooge, Ingeborg; Van Lierde, Kristiane
2011-11-01
The purpose of this study was to determine the objective vocal quality in 36 prelingually deaf children using cochlear implant (CI) with a mean age of 9 years. An additional purpose was to compare the objective vocal quality of these 36 CI users with 25 age-matched children with prelingual severe hearing loss using conventional hearing aids (HAs) and 25 normal hearing (NH) children. The design for this cross-sectional study was a multigroup posttest-only design. The objective vocal quality was measured by means of the dysphonia severity index (DSI). Moreover, perceptual voice assessment using the GRBASI scale was performed. CI children have a vocal quality by means of the DSI of +1.8, corresponding with a DSI% of 68%, indicating a borderline vocal quality situated 2% above the limit of normality. The voice was perceptually characterized by the presence of a very slight grade of hoarseness, roughness, strained phonation, and higher pitch and intensity levels. No significant objective vocal quality differences were measured between the voices of the CI children, HA users, and NH children. According to the results, one aspect of the vocal approach in children with CI and using HAs must be focused on the improvement of the strained vocal characteristic and the use of a lower pitch and intensity level. Copyright © 2011 The Voice Foundation. Published by Mosby, Inc. All rights reserved.
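For reference, the DSI as commonly published (Wuyts et al., 2000) combines four voice measures; the abstract does not restate the formula, so this is quoted from the general literature rather than from the paper:

    DSI = 0.13 x MPT + 0.0053 x F0-High - 0.26 x I-Low - 1.18 x Jitter(%) + 12.4

where MPT is the maximum phonation time (s), F0-High the highest attainable fundamental frequency (Hz), and I-Low the lowest attainable voice intensity (dB).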
Robust feedback zoom tracking for digital video surveillance.
Zou, Tengyue; Tang, Xiaoqi; Song, Bao; Wang, Jin; Chen, Jihong
2012-01-01
Zoom tracking is an important function in video surveillance, particularly in traffic management and security monitoring. It involves keeping an object of interest in focus during the zoom operation. Zoom tracking is typically achieved by moving the zoom and focus motors in lenses following the so-called "trace curve", which shows the in-focus focus-motor positions versus the zoom-motor positions for a specific object distance. The main task of a zoom tracking approach is to accurately estimate the trace curve for the specified object. Because a proportional-integral-derivative (PID) controller has historically been considered the best controller in the absence of knowledge of the underlying process, and because of its high-quality performance in motor control, we propose in this paper a novel feedback zoom tracking (FZT) approach based on geometric trace-curve estimation and a PID feedback controller. The performance of this approach is compared with existing zoom tracking methods in digital video surveillance. The real-time implementation results obtained on an actual digital video platform indicate that the developed FZT approach not only solves the traditional one-to-many mapping problem without pre-training but also improves robustness when tracking moving or switching objects, which is the key challenge in video surveillance.
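A minimal sketch, with invented gains and an invented trace curve, of the FZT feedback idea: a PID controller drives the focus motor toward the in-focus position given by the estimated trace curve while the zoom motor sweeps (in practice the error signal is derived from a sharpness measure).

    class PID:
        def __init__(self, kp, ki, kd):
            self.kp, self.ki, self.kd = kp, ki, kd
            self.integral = 0.0
            self.prev_err = 0.0

        def update(self, err, dt):
            self.integral += err * dt
            deriv = (err - self.prev_err) / dt
            self.prev_err = err
            return self.kp * err + self.ki * self.integral + self.kd * deriv

    def trace_curve(zoom):
        # illustrative trace curve: in-focus focus-motor position vs. zoom position
        return 50.0 + 20.0 * zoom - 3.0 * zoom ** 2

    pid = PID(kp=0.8, ki=0.1, kd=0.05)        # assumed gains
    focus, dt = 0.0, 0.02
    for k in range(200):
        zoom = 2.0 * k / 200.0                # zoom motor sweeping during the zoom
        err = trace_curve(zoom) - focus       # feedback error (sharpness-based in practice)
        focus += pid.update(err, dt) * dt     # actuate the focus motor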
[Risk-adjusted assessment: late-onset infection in neonates].
Gmyrek, Dieter; Koch, Rainer; Vogtmann, Christoph; Kaiser, Annette; Friedrich, Annette
2011-01-01
The weak point of countrywide perinatal/neonatal quality surveillance is that it ignores interhospital differences in the case mix of patients. As a result, this approach does not produce reliable benchmarking. The objective of this study was to adjust the late-onset infection incidence of different hospitals according to the risk profile of their patients by multivariate analysis. The perinatal/neonatal database of 41,055 newborns of the Saxonian quality surveillance from 1998 to 2004 was analysed. Based on 18 possible risk factors, a logistic regression model was used to develop a specific risk predictor for the quality indicator "late-onset infection". The developed risk predictor could be described by 4 of the 18 analysed risk factors, namely gestational age, admission from home, hypoxic-ischemic encephalopathy and B-streptococcal infection. The AUC(ROC) value of this quality indicator was 83.3%, which demonstrates its reliability. The hospital ranking based on the adjusted risk assessment was very different from the ranking before adjustment; the average correction of ranking position was 4.96 for 35 clinics. The application of the risk adjustment method proposed here allows a more objective comparison of the incidence of the quality indicator "late-onset infection" among different hospitals. Copyright © 2011. Published by Elsevier GmbH.
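A minimal sketch, on synthetic data with invented column names, of the risk-adjustment idea: fit a logistic regression of infection on the four risk factors, then rank hospitals by observed-to-expected (O/E) ratio rather than by raw incidence.

    import numpy as np
    import pandas as pd
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(0)
    n = 5000
    df = pd.DataFrame({
        "gestational_age": rng.normal(34, 4, n),
        "admitted_from_home": rng.integers(0, 2, n),
        "hie": rng.integers(0, 2, n),             # hypoxic-ischemic encephalopathy
        "strep_b": rng.integers(0, 2, n),
        "hospital": rng.integers(0, 10, n),
    })
    logit = -2 + 0.25 * (30 - df.gestational_age) + 0.8 * df.hie
    df["infection"] = (rng.random(n) < 1 / (1 + np.exp(-logit))).astype(int)

    X = df[["gestational_age", "admitted_from_home", "hie", "strep_b"]]
    model = LogisticRegression(max_iter=1000).fit(X, df.infection)
    df["expected"] = model.predict_proba(X)[:, 1]
    oe = (df.groupby("hospital").infection.mean()
          / df.groupby("hospital").expected.mean())
    print(oe.sort_values())                       # risk-adjusted hospital ranking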
Intercell scheduling: A negotiation approach using multi-agent coalitions
NASA Astrophysics Data System (ADS)
Tian, Yunna; Li, Dongni; Zheng, Dan; Jia, Yunde
2016-10-01
Intercell scheduling problems arise as a result of intercell transfers in cellular manufacturing systems. Flexible intercell routes are considered in this article, and a coalition-based scheduling (CBS) approach using distributed multi-agent negotiation is developed. Taking advantage of the extended vision of the coalition agents, the global optimization is improved and the communication cost is reduced. The objective of the addressed problem is to minimize mean tardiness. Computational results show that, compared with the widely used combinatorial rules, CBS provides better performance not only in minimizing the objective, i.e. mean tardiness, but also in minimizing auxiliary measures such as maximum completion time, mean flow time and the ratio of tardy parts. Moreover, CBS is better than the existing intercell scheduling approach for the same problem with respect to the solution quality and computational costs.
Line Segmentation of 2d Laser Scanner Point Clouds for Indoor Slam Based on a Range of Residuals
NASA Astrophysics Data System (ADS)
Peter, M.; Jafri, S. R. U. N.; Vosselman, G.
2017-09-01
Indoor mobile laser scanning (IMLS) based on the Simultaneous Localization and Mapping (SLAM) principle proves to be the preferred method to acquire data of indoor environments at a large scale. In previous work, we proposed a backpack IMLS system containing three 2D laser scanners and a corresponding SLAM approach. The feature-based SLAM approach solves all six degrees of freedom simultaneously and builds on the association of lines to planes. Because of the iterative character of the SLAM process, the quality and reliability of the segmentation of linear segments in the scanlines plays a crucial role in the quality of the derived poses and consequently the point clouds. The orientations of the lines resulting from the segmentation can be influenced negatively by narrow objects which are nearly coplanar with walls (such as doors), which will cause the line to be tilted if those objects are not detected as separate segments. State-of-the-art methods from the robotics domain, like Iterative End Point Fit and Line Tracking, were found not to handle such situations well. Thus, we describe a novel segmentation method based on the comparison of a range of residuals to a range of thresholds. For the definition of the thresholds we employ the fact that the standard deviation of the average of the residuals of n points with respect to the line is σ/√n. Our method, as shown by the experiments and the comparison to other methods, is able to deliver more accurate results than the two approaches it was tested against.
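A minimal sketch, with an invented noise level and thresholds, of the residual-range test: fit a line to the current scanline segment, then compare averages of the last n signed residuals against k·σ/√n for several n; a systematic offset (e.g., a door nearly coplanar with a wall) trips the test even when individual residuals stay small.

    import numpy as np

    def fit_line(pts):
        # total-least-squares line fit; returns unit normal and offset
        c = pts.mean(axis=0)
        _, _, vt = np.linalg.svd(pts - c)
        normal = vt[-1]
        return normal, normal @ c

    def should_split(pts, sigma=0.01, k=3.0, windows=(5, 10, 20)):
        normal, d = fit_line(pts)
        residuals = pts @ normal - d          # signed point-to-line distances
        for n in windows:
            if len(residuals) >= n and abs(residuals[-n:].mean()) > k * sigma / np.sqrt(n):
                return True                   # start a new segment here
        return False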
Influence of pansharpening techniques in obtaining accurate vegetation thematic maps
NASA Astrophysics Data System (ADS)
Ibarrola-Ulzurrun, Edurne; Gonzalo-Martin, Consuelo; Marcello-Ruiz, Javier
2016-10-01
In recent decades there has been a decline in natural resources, and it has become important to develop reliable methodologies for their management. The appearance of very high resolution sensors offers a practical and cost-effective means for good environmental management. In this context, improvements are needed in the quality of the available information in order to obtain reliably classified images. Pansharpening enhances the spatial resolution of the multispectral bands by incorporating information from the panchromatic image. The main goal of this study is to apply pixel- and object-based classification techniques to imagery fused with different pansharpening algorithms and to evaluate the generated thematic maps, which serve to obtain accurate information for the conservation of natural resources. A vulnerable, heterogeneous ecosystem from the Canary Islands (Spain) was chosen, Teide National Park, and WorldView-2 high resolution imagery was employed. The classes considered of interest were set by the National Park conservation managers. Seven pansharpening techniques (GS, FIHS, HCS, MTF-based, Wavelet 'à trous' and Weighted Wavelet 'à trous' through Fractal Dimension Maps) were chosen in order to improve the data quality with the goal of analyzing the vegetation classes. Next, different classification algorithms were applied at the pixel-based and object-based level, and an accuracy assessment of the different thematic maps obtained was performed. The highest classification accuracy was obtained by applying a Support Vector Machine classifier at the object-based level to the Weighted Wavelet 'à trous' through Fractal Dimension Maps fused image. Finally, we highlight the difficulty of classification in the Teide ecosystem due to its heterogeneity and the small size of the species. It is thus important to obtain accurate thematic maps for further studies in the management and conservation of natural resources.
Defining the performance gap: Conducting a self-assessment
NASA Technical Reports Server (NTRS)
Braymer, Susan A.; Stoner, David L.; Powell, William C.
1992-01-01
This paper presents two different approaches to performing self-assessments of continuous improvement activities. Case Study 1 describes the activities performed by JSC to assess the implementation of continuous improvement efforts at the NASA Center. The JSC approach included surveys administered to randomly selected NASA personnel and personal interviews with NASA and contractor management personnel. Case Study 2 describes the continuous improvement survey performed by the JSC Safety, Reliability, and Quality Assurance (SR&QA) organization. This survey consisted of a short questionnaire (50 questions) administered to all NASA and contractor SR&QA personnel. The questionnaire is based on the eight categories of the President's Award for Quality and Productivity Improvement. It is designed to objectively determine placement on the TQ benchmark and identify a roadmap for improvement.
Figure-ground segmentation based on class-independent shape priors
NASA Astrophysics Data System (ADS)
Li, Yang; Liu, Yang; Liu, Guojun; Guo, Maozu
2018-01-01
We propose a method to generate figure-ground segmentation by incorporating shape priors into the graph-cuts algorithm. Given an image, we first obtain a linear representation of an image and then apply directional chamfer matching to generate class-independent, nonparametric shape priors, which provide shape clues for the graph-cuts algorithm. We then enforce shape priors in a graph-cuts energy function to produce object segmentation. In contrast to previous segmentation methods, the proposed method shares shape knowledge for different semantic classes and does not require class-specific model training. Therefore, the approach obtains high-quality segmentation for objects. We experimentally validate that the proposed method outperforms previous approaches using the challenging PASCAL VOC 2010/2012 and Berkeley (BSD300) segmentation datasets.
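A minimal sketch (PyMaxflow assumed installed; all energy terms invented) of folding a shape prior into the graph-cuts energy: the unary foreground cost blends a color likelihood with a chamfer-distance-based shape cost, and a constant pairwise weight encourages smooth boundaries; see the PyMaxflow documentation for the exact label convention of the returned segments.

    import maxflow
    import numpy as np

    def segment(color_fg_cost, color_bg_cost, shape_prior, lam=0.5, smooth=1.0):
        # shape_prior: per-pixel foreground cost, low inside the matched shape and
        # growing with chamfer distance outside it (assumption)
        g = maxflow.Graph[float]()
        nodes = g.add_grid_nodes(color_fg_cost.shape)
        g.add_grid_edges(nodes, smooth)                  # pairwise smoothness term
        fg = color_fg_cost + lam * shape_prior           # unary term with shape prior
        g.add_grid_tedges(nodes, fg, color_bg_cost)      # terminal (unary) capacities
        g.maxflow()
        return g.get_grid_segments(nodes)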
Revisiting the Procedures for the Vector Data Quality Assurance in Practice
NASA Astrophysics Data System (ADS)
Erdoğan, M.; Torun, A.; Boyacı, D.
2012-07-01
Immense use of topographical data in spatial data visualization, business GIS (Geographic Information Systems) solutions and applications, and mobile and location-based services has forced topo-data providers to create standard, up-to-date and complete data sets in a sustainable frame. Data quality has been studied and researched for more than two decades; there are countless references on its semantics, its conceptual and logical representations, and many applications on spatial databases and GIS. However, there is a gap between research and practice in the sense of spatial data quality, which increases the costs and decreases the efficiency of data production. Spatial data quality is well known to academia and industry, but usually in different contexts. Research on spatial data quality has identified several practically useful issues such as descriptive information, metadata, fulfilment of spatial relationships among data, integrity measures, geometric constraints, etc. Industry and data producers realize them in three stages: pre-, co- and post-data capturing. The pre-data-capturing stage covers semantic modelling, data definition, cataloguing, modelling, data dictionary and schema creation processes. The co-data-capturing stage covers general rules of spatial relationships, and data- and model-specific rules such as topologic and model-building relationships, geometric thresholds, data extraction guidelines, and object-object, object-belonging-class, object-non-belonging-class and class-class relationships to be taken into account during data capturing. The post-data-capturing stage covers specified QC (quality check) benchmarks and checking compliance with general and specific rules. The vector data quality criteria differ between the views of producers and users, but these criteria are generally driven by the needs, expectations and feedback of the users. This paper presents a practical method which closes the gap between theory and practice. Developing spatial data quality concepts into applications requires the conceptual, logical and, most importantly, physical existence of a data model, rules and knowledge of realization in the form of geo-spatial data; the applicable metrics and thresholds are determined on this concrete base. This study discusses the application of geo-spatial data quality issues and QA (quality assurance) and QC procedures in topographic data production. We first introduce the MGCP (Multinational Geospatial Co-production Program) data profile of the NATO (North Atlantic Treaty Organization) DFDD (DGIWG Feature Data Dictionary), the requirements of the data owner, the view of data producers for both data capturing and QC, and finally QA to fulfil user needs. Then, our practical new approach, which divides quality into three phases, is introduced. Finally, the implementation of our approach to accomplish the metrics, measures and thresholds of the quality definitions is discussed. In particular, geometry and semantics quality and the quality control procedures that can be performed by the producers are discussed, and some applicable best practices that we experienced in quality control techniques and in defining regulations that set out the objectives and data production procedures are given in the final remarks.
These quality control procedures should include visual checks of the source data, the captured vector data and printouts; automatic checks that can be performed by software; and semi-automatic checks involving interaction with quality control personnel. Finally, these procedures should ensure the geometric, semantic, attribution and metadata quality of the vector data.
NASA Astrophysics Data System (ADS)
Uemura, Satoshi; Fukumoto, Norihiro; Yamada, Hideaki; Nakamura, Hajime
A feature of services provided in a Next Generation Network (NGN) is that end-to-end quality is guaranteed. This is quite a challenging issue, given the considerable fluctuation in network conditions within a Fixed Mobile Convergence (FMC) network. Therefore, a novel approach, whereby a network node and a mobile terminal such as a cellular phone cooperate with each other to control service quality, is essential. In order to achieve such cooperation, the mobile terminal needs to become more intelligent so it can estimate the service quality, including the user's perceptual quality, and report the measurement result to the network node. Subsequently, the network node implements some kind of service control function, such as a resource and admission control function, based on the notification from the mobile terminal. This paper focuses on the role of the mobile terminal in such a collaborative system. As part of a QoS/QoE measurement system, we describe an objective speech quality assessment with payload discrimination of lost packets to measure the user's perceptual quality of VoIP. The proposed assessment is simple enough to be implemented on a cellular phone, and we have done so as part of the QoS/QoE measurement system. By using the implemented system, we can measure the user's perceptual quality of VoIP as well as network QoS metrics such as packet loss rate, jitter and burstiness in real time.
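As an illustration of loss-driven objective speech quality estimation with payload discrimination (not the authors' exact model), the sketch below folds separately measured loss rates for speech-active and silence payloads into the ITU-T G.107 E-model's loss-impairment form; the 0.9/0.1 weighting and the Bpl value are assumptions.

    def r_factor(loss_active, loss_silence, ie=0.0, bpl=25.1, r0=93.2):
        # effective loss: losses of speech-active payloads dominate perceived quality
        ppl = 100.0 * (0.9 * loss_active + 0.1 * loss_silence)   # assumed weighting
        ie_eff = ie + (95.0 - ie) * ppl / (ppl + bpl)            # G.107 loss impairment
        return r0 - ie_eff                                       # higher R = better

    print(r_factor(loss_active=0.02, loss_silence=0.10))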
Vanniyasingam, Thuva; Daly, Caitlin; Jin, Xuejing; Zhang, Yuan; Foster, Gary; Cunningham, Charles; Thabane, Lehana
2018-06-01
This study reviews simulation studies of discrete choice experiments (DCEs) to determine (i) how survey design features affect statistical efficiency and (ii) the reporting quality of such studies. Statistical efficiency was measured using relative design (D-) efficiency, D-optimality, or D-error. For this systematic survey, we searched Journal Storage (JSTOR), Science Direct, PubMed, and OVID, the last of which included a search within EMBASE. Searches were conducted up to 2016 for simulation studies investigating the impact of DCE design features on statistical efficiency. Studies were screened, and data were extracted, independently and in duplicate. Results for each included study were summarized by design characteristic. Previously developed criteria for the reporting quality of simulation studies were also adapted and applied to each included study. Of 371 potentially relevant studies, 9 were found to be eligible, with several varying in study objectives. Statistical efficiency improved when increasing the number of choice tasks or alternatives; decreasing the number of attributes or attribute levels; using an unrestricted continuous "manipulator" attribute; using model-based approaches with covariates incorporating response behaviour; using sampling approaches that incorporate previous knowledge of response behaviour; incorporating heterogeneity in a model-based design; correctly specifying Bayesian priors; minimizing parameter prior variances; and using an appropriate method to create the DCE design for the research question. The simulation studies performed well in terms of reporting quality, although improvement is needed with regard to clearly specifying study objectives, numbers of failures, random number generators, starting seeds, and the software used. These results identify the best approaches to structuring a DCE: an investigator can manipulate design characteristics to help reduce response burden and increase statistical efficiency. Since the studies varied in their objectives, conclusions were drawn on several design characteristics, but the validity of each conclusion is limited. Further research should explore these conclusions in various design settings and scenarios; additional reviews covering other statistical efficiency outcomes and databases could also strengthen the conclusions identified in this review.
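For concreteness, here is a minimal sketch of the D-error criterion these simulation studies optimize: for a multinomial logit model, D-error = det(information matrix)^(-1/K) for K parameters (lower is better); zero priors give the utility-neutral case.

    import numpy as np

    def d_error(X, beta=None):
        # X: (n_choice_sets, n_alternatives, K) effects-coded design matrix
        S, J, K = X.shape
        beta = np.zeros(K) if beta is None else beta
        info = np.zeros((K, K))
        for s in range(S):
            u = X[s] @ beta
            p = np.exp(u) / np.exp(u).sum()           # MNL choice probabilities
            info += X[s].T @ (np.diag(p) - np.outer(p, p)) @ X[s]
        return np.linalg.det(np.linalg.inv(info)) ** (1.0 / K)

    rng = np.random.default_rng(0)
    design = rng.integers(0, 2, size=(10, 3, 4)).astype(float)  # toy random design
    print(d_error(design))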
[An object-oriented remote sensing image segmentation approach based on edge detection].
Tan, Yu-Min; Huai, Jian-Zhu; Tang, Zhong-Shi
2010-06-01
Advances in satellite sensor technology have enabled better discrimination of various landscape objects. Image segmentation approaches for extracting conceptual objects and patterns have hence been explored, and a wide variety of such algorithms abound. To this end, in order to effectively utilize edge and topological information in high resolution remote sensing imagery, an object-oriented algorithm combining edge detection and region merging is proposed. The SUSAN edge filter is first applied to the panchromatic band of QuickBird imagery with a spatial resolution of 0.61 m to obtain the edge map. Guided by the resulting edge map, a two-phase region-based segmentation method operates on the image fused from the panchromatic and multispectral QuickBird images to produce the final partition. In the first phase, a quadtree grid, consisting of squares with sides parallel to the image left and top borders, recursively agglomerates square subsets where a uniformity measure is satisfied, deriving image object primitives. Before the merging of the second phase, the contextual and spatial information (e.g., neighbor relationships, boundary coding) of the resulting squares is retrieved efficiently by means of the quadtree structure. A region merging operation is then performed on those primitives, with a merging criterion that integrates the edge map and region-based features. This approach has been tested on QuickBird images of a site in the Sanxia area, and the results are compared with those of ENVI Zoom and Definiens. In addition, a quantitative evaluation of the quality of the segmentation results is presented. Experimental results demonstrate stable convergence and efficiency.
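The first phase can be sketched as a recursive quadtree split with an assumed variance-based uniformity measure (the paper's actual measure and the edge-constrained merging are not reproduced):

    import numpy as np

    def quadtree_split(img, x, y, size, out, var_thresh=25.0, min_size=4):
        block = img[y:y+size, x:x+size]
        if size <= min_size or block.var() <= var_thresh:   # uniformity test
            out.append((x, y, size))                        # an object primitive
            return
        h = size // 2
        for dx, dy in ((0, 0), (h, 0), (0, h), (h, h)):
            quadtree_split(img, x + dx, y + dy, h, out, var_thresh, min_size)

    img = (np.random.rand(256, 256) * 255).astype(np.float32)
    primitives = []
    quadtree_split(img, 0, 0, 256, primitives)
    print(len(primitives), "primitives")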
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kaplan, E.; Nelson, K.; Meinhold, C.B.
1991-10-01
In January 1990, the Nuclear Regulatory Commission (NRC) proposed amendments to 10 CFR Part 35 that would require medical licensees using byproduct material to establish and implement a basic quality assurance program. A 60-day real-world trial of the proposed rules was initiated to obtain information beyond that generally found through standard public comment procedures. Volunteers from randomly selected institutions had opportunities to review the details of the proposed regulations and to implement these rules on a daily basis during the trial. The participating institutions were then asked to evaluate the proposed regulations based on their personal experiences. The pilot project sought to determine whether medical institutions could develop written quality assurance programs that would meet the eight performance-based objectives of proposed Section 35.35. In addition, the NRC wanted to learn from these volunteers whether they had any recommendations on how the rule could be revised to minimize its cost and to clarify its objectives without decreasing its effectiveness. It was found that licensees could develop acceptable QA programs under a performance-based approach, that most licensee programs did meet the proposed objectives, and that most written QA plans would require consultations with NRC or Agreement State personnel before they would fully meet all objectives of proposed Section 35.35. This report describes the overall pilot program. The methodology used to select and assemble the group of participating licensees is presented. The various workshops and evaluation questionnaires are discussed, and detailed findings are presented. 7 refs.
A new approach of objective quality evaluation on JPEG2000 lossy-compressed lung cancer CT images
NASA Astrophysics Data System (ADS)
Cai, Weihua; Tan, Yongqiang; Zhang, Jianguo
2007-03-01
Image compression has been used to increase communication efficiency and storage capacity. JPEG 2000 compression, based on the wavelet transform, has advantages compared with other compression methods, such as ROI coding, error resilience, adaptive binary arithmetic coding and an embedded bit-stream. However, it is still difficult to find an objective method to evaluate the image quality of lossy-compressed medical images. In this paper, we present an approach to evaluating image quality by using a computer-aided diagnosis (CAD) system. We selected 77 cases of CT images, bearing benign and malignant lung nodules with confirmed pathology, from our clinical Picture Archiving and Communication System (PACS). We developed a prototype CAD system to classify these images into benign and malignant cases, the performance of which was evaluated by receiver operating characteristic (ROC) curves. We first used JPEG 2000 to compress these images at different compression ratios, from lossless to lossy, used the CAD system to classify the cases at each compression ratio, and then compared the resulting ROC curves. Support vector machines (SVM) and neural networks (NN) were used to classify the malignancy of the input nodules. With each approach, we found that the area under the ROC curve (AUC) decreases with increasing compression ratio, with small fluctuations.
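A minimal sketch of the evaluation protocol (not the authors' CAD system; glymur is assumed available for JPEG 2000 encoding with explicit compression ratios, and the feature extractor is a trivial stand-in): encode at several ratios, classify the decoded images, and compare AUCs.

    import glymur
    import numpy as np
    from sklearn.metrics import roc_auc_score
    from sklearn.model_selection import cross_val_predict
    from sklearn.svm import SVC

    rng = np.random.default_rng(0)
    images = [rng.integers(0, 255, size=(64, 64), dtype=np.uint8) for _ in range(77)]
    labels = rng.integers(0, 2, size=77)      # stand-in benign/malignant labels

    def extract_features(img):
        # stand-in for the CAD feature stage: simple intensity statistics
        return [img.mean(), img.std(), np.percentile(img, 90)]

    def auc_at_ratio(ratio):
        feats = []
        for i, img in enumerate(images):
            path = f"/tmp/case{i}.jp2"
            glymur.Jp2k(path, data=img, cratios=[ratio])   # encode at this ratio
            feats.append(extract_features(glymur.Jp2k(path)[:]))
        scores = cross_val_predict(SVC(probability=True), np.array(feats),
                                   labels, cv=5, method="predict_proba")[:, 1]
        return roc_auc_score(labels, scores)

    for ratio in (1, 5, 10, 20, 40):
        print(ratio, auc_at_ratio(ratio))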
NASA Astrophysics Data System (ADS)
Diez, Matteo; Iemma, Umberto
2012-05-01
The article presents a novel approach to include community noise considerations based on sound quality in the Multidisciplinary Conceptual Design Optimization (MCDO) of civil transportation aircraft. The novelty stems from the use of an unconventional objective function, defined as a measure of the difference between the noise emission of the aircraft under analysis and a reference 'weakly annoying' noise, the target sound. The minimization of such a merit factor yields an aircraft concept with a noise signature as close as possible to the given target. The reference sound is one of the outcomes of the European Research Project SEFA (Sound Engineering For Aircraft, VI Framework Programme, 2004-2007), and used here as an external input. The aim of the present work is to address the definition and the inclusion of the sound-matching-based objective function in the MCDO of aircraft.
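In the simplest reading, the merit factor is a distance between the design's predicted noise signature and the target sound; a minimal sketch with assumed spectra and weights:

    import numpy as np

    def sound_match_objective(design_spectrum, target_spectrum, weights=None):
        # weighted L2 distance to the 'weakly annoying' target (assumed form)
        w = np.ones_like(target_spectrum) if weights is None else weights
        return float(np.sum(w * (design_spectrum - target_spectrum) ** 2))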
Kalali, Amir; West, Mark; Walling, David; Hilt, Dana; Engelhardt, Nina; Alphs, Larry; Loebel, Antony; Vanover, Kim; Atkinson, Sarah; Opler, Mark; Sachs, Gary; Nations, Kari; Brady, Chris
2016-01-01
This paper summarizes the results of the CNS Summit Data Quality Monitoring Workgroup analysis of current data quality monitoring techniques used in central nervous system (CNS) clinical trials. Based on audience polls conducted at the CNS Summit 2014, the panel determined that current techniques used to monitor data and quality in clinical trials are broad, uncontrolled, and lack independent verification. The majority of those polled endorse the value of monitoring data. Case examples of current data quality methodology are presented and discussed. Perspectives of pharmaceutical companies and trial sites regarding data quality monitoring are presented. Potential future developments in CNS data quality monitoring are described. Increased utilization of biomarkers as objective outcomes and for patient selection is considered to be the most impactful development in data quality monitoring over the next 10 years. Additional future outcome measures and patient selection approaches are discussed. PMID:27413584
Quality measurement and benchmarking of HPV vaccination services: a new approach.
Maurici, Massimo; Paulon, Luca; Campolongo, Alessandra; Meleleo, Cristina; Carlino, Cristiana; Giordani, Alessandro; Perrelli, Fabrizio; Sgricia, Stefano; Ferrante, Maurizio; Franco, Elisabetta
2014-01-01
A new measurement process based upon a well-defined mathematical model was applied to evaluate the quality of human papillomavirus (HPV) vaccination centers in 3 of 12 Local Health Units (ASLs) within the Lazio Region of Italy. The quality aspects considered for evaluation were communicational efficiency, organizational efficiency and comfort. The overall maximum achievable value was 86.10%, while the HPV vaccination quality scores for ASL1, ASL2 and ASL3 were 73.07%, 71.08%, and 67.21%, respectively. With this new approach it is possible to represent the probabilistic reasoning of a stakeholder who evaluates the quality of a healthcare provider. All ASLs had margins for improvements and optimal quality results can be assessed in terms of better performance conditions, confirming the relationship between the resulting quality scores and HPV vaccination coverage. The measurement process was structured into three steps and involved four stakeholder categories: doctors, nurses, parents and vaccinated women. In Step 1, questionnaires were administered to collect different stakeholders' points of view (i.e., subjective data) that were elaborated to obtain the best and worst performance conditions when delivering a healthcare service. Step 2 of the process involved the gathering of performance data during the service delivery (i.e., objective data collection). Step 3 of the process involved the elaboration of all data: subjective data from step 1 are used to define a "standard" to test objective data from step 2. This entire process led to the creation of a set of scorecards. Benchmarking is presented as a result of the probabilistic meaning of the evaluated scores.
An approach for using soil surveys to guide the placement of water quality buffers
M.G. Dosskey; M.J. Helmers; D.E. Eisenhauer
2006-01-01
Vegetative buffers may function better for filtering agricultural runoff in some locations than in others because of intrinsic characteristics of the land on which they are placed. The objective of this study was to develop a method based on soil survey attributes that can be used to compare soil map units for how effectively a buffer installed in them could remove...
ERIC Educational Resources Information Center
Holmes, George W., III; Seawell, William H.
This report presents (1) details of a program for educational administration by objectives and (2) the results of such a program developed by the Virginia State Department of Education to upgrade the quality of education in the public schools of that State. Administration by objectives is a systematic approach to education planning using…
An opposite view data replacement approach for reducing artifacts due to metallic dental objects
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yazdi, Mehran; Lari, Meghdad Asadi; Bernier, Gaston
Purpose: To present a conceptually new method for metal artifact reduction (MAR) that can be used on patients with multiple objects within the scan plane that are also small in size along the longitudinal (scanning) direction, such as dental fillings. Methods: The proposed algorithm, named opposite view replacement, achieves MAR by first detecting the projection data affected by metal objects and then replacing the affected projections with the corresponding opposite-view projections, which are not affected by the metal objects. The authors also applied a fading process to avoid producing discontinuities at the boundary of the affected projection areas in the sinogram. A skull phantom with and without a variety of dental metal inserts was made to extract the performance metrics of the algorithm. A head and neck case, typical of IMRT planning, was also tested. Results: The reconstructed CT images based on this new replacement scheme show a significant improvement in image quality for patients with metallic dental objects compared to MAR algorithms based on interpolation. For the phantom, the authors showed that the artifact reduction algorithm can efficiently recover the CT numbers in the area next to the metallic objects. Conclusions: The authors presented a new and efficient method for artifact reduction due to multiple small metallic objects. The results obtained from phantoms and clinical cases fully validate the proposed approach.
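A minimal sketch of the replacement step: bins flagged as metal-affected are overwritten by their conjugate (opposite-view) samples; the boundary fading is omitted. The simple (theta + 180 degrees, mirrored detector) conjugacy used here assumes an idealized parallel-beam layout with angles sampled uniformly over 360 degrees; in a helical scan the opposite view is acquired at a different table position, which is why it can be metal-free for objects that are short along the scanning direction.

    import numpy as np

    def opposite_view_replace(sino, metal_mask):
        # sino, metal_mask: (n_angles, n_detectors), angles uniform over 360 deg
        n_ang, n_det = sino.shape
        out = sino.copy()
        opp = (np.arange(n_ang) + n_ang // 2) % n_ang      # view at theta + 180 deg
        for a in range(n_ang):
            bad = np.flatnonzero(metal_mask[a])
            out[a, bad] = sino[opp[a], n_det - 1 - bad]    # mirrored detector bins
        return out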
Objective quality assessment for multiexposure multifocus image fusion.
Hassen, Rania; Wang, Zhou; Salama, Magdy M A
2015-09-01
There has been a growing interest in image fusion technologies, but how to objectively evaluate the quality of fused images has not been fully understood. Here, we propose a method for objective quality assessment of multiexposure multifocus image fusion based on the evaluation of three key factors of fused image quality: 1) contrast preservation; 2) sharpness; and 3) structure preservation. Subjective experiments are conducted to create an image fusion database, based on which, performance evaluation shows that the proposed fusion quality index correlates well with subjective scores, and gives a significant improvement over the existing fusion quality measures.
Creating targeted initial populations for genetic product searches in heterogeneous markets
NASA Astrophysics Data System (ADS)
Foster, Garrett; Turner, Callaway; Ferguson, Scott; Donndelinger, Joseph
2014-12-01
Genetic searches often use randomly generated initial populations to maximize diversity and enable a thorough sampling of the design space. While many of these initial configurations perform poorly, the trade-off between population diversity and solution quality is typically acceptable for small-scale problems. Navigating complex design spaces, however, often requires computationally intelligent approaches that improve solution quality. This article draws on research advances in market-based product design and heuristic optimization to strategically construct 'targeted' initial populations. Targeted initial designs are created using respondent-level part-worths estimated from discrete choice models. These designs are then integrated into a traditional genetic search. Two case study problems of differing complexity are presented to illustrate the benefits of this approach. In both problems, targeted populations lead to computational savings and product configurations with improved market share of preferences. Future research efforts to tailor this approach and extend it towards multiple objectives are also discussed.
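A minimal sketch, with an invented attribute structure, of targeted seeding: each respondent's part-worths select the utility-maximizing level of every attribute, yielding one seed design per respondent, and random members are kept to preserve diversity.

    import numpy as np

    rng = np.random.default_rng(0)
    n_resp, n_attr, n_lvl = 50, 6, 4
    partworths = rng.normal(size=(n_resp, n_attr, n_lvl))  # from a discrete choice model

    targeted = partworths.argmax(axis=2)        # best level per attribute per respondent
    random_seeds = rng.integers(0, n_lvl, size=(50, n_attr))
    population = np.vstack([targeted, random_seeds])       # mixed initial population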
A deterministic aggregate production planning model considering quality of products
NASA Astrophysics Data System (ADS)
Madadi, Najmeh; Yew Wong, Kuan
2013-06-01
Aggregate Production Planning (APP) is medium-term planning concerned with the lowest-cost method of production planning to meet customers' requirements and to satisfy fluctuating demand over a planning time horizon. The APP problem has been studied widely since it was introduced and formulated in the 1950s. However, most studies in the APP area have concentrated on common objectives such as minimization of cost, of fluctuation in the number of workers, and of inventory level. In particular, maintaining quality at a desirable level while minimizing cost has not been considered as an objective in previous studies. In this study, an attempt has been made to develop a multi-objective mixed integer linear programming model that serves companies aiming to incur the minimum operational cost while maintaining quality at an acceptable level. In order to solve the multi-objective model, the Fuzzy Goal Programming approach and the max-min operator of Bellman-Zadeh were applied. At the final step, IBM ILOG CPLEX Optimization Studio software was used to obtain experimental results based on data collected from an automotive parts manufacturing company. The results show that incorporating quality in the model imposes some costs; however, a trade-off should be made between the cost of producing products with higher quality and the cost the firm may incur due to customer dissatisfaction and sales losses.
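A minimal sketch of the Bellman-Zadeh max-min step on a toy two-objective plan (cost vs. quality), with invented cost and quality functions and membership bounds; PuLP is assumed available. Each objective gets a linear membership function mu in [0, 1], and the smallest membership lambda is maximized subject to the production constraints.

    import pulp

    m = pulp.LpProblem("fuzzy_app", pulp.LpMaximize)
    x = pulp.LpVariable("production_qty", lowBound=0)
    lam = pulp.LpVariable("lambda", lowBound=0, upBound=1)

    cost = 8 * x + 1000                  # illustrative cost function
    quality = 0.5 * x                    # illustrative quality score
    cost_min, cost_max = 1000, 9000      # assumed best/worst cost values
    q_min, q_max = 0, 500                # assumed worst/best quality values

    m += lam                                                        # maximize min membership
    m += (cost_max - cost) * (1.0 / (cost_max - cost_min)) >= lam   # cost membership
    m += (quality - q_min) * (1.0 / (q_max - q_min)) >= lam         # quality membership
    m += x <= 800                                                   # capacity constraint
    m.solve()
    print(x.value(), lam.value())        # trade-off point: x = 500, lambda = 0.5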
Evidence-based dentistry: a model for clinical practice.
Faggion, Clóvis M; Tu, Yu-Kang
2007-06-01
Making decisions in dentistry should be based on the best evidence available. The objective of this study was to demonstrate a practical procedure and model that clinicians can use to apply the results of well-conducted studies to patient care by critically appraising the evidence with checklists and letter grade scales. To demonstrate application of this model for critically appraising the quality of research evidence, a hypothetical case involving an adult male with chronic periodontitis is used as an example. To determine the best clinical approach for this patient, a four-step, evidence-based model is demonstrated, consisting of the following: definition of a research question using the PICO format, search and selection of relevant literature, critical appraisal of identified research reports using checklists, and the application of evidence. In this model, the quality of research evidence was assessed quantitatively based on different levels of quality that are assigned letter grades of A, B, and C by evaluating the studies against the QUOROM (Quality of Reporting Meta-Analyses) and CONSORT (Consolidated Standards of Reporting Trials) checklists in a tabular format. For this hypothetical periodontics case, application of the model identified the best available evidence for clinical decision making, i.e., one randomized controlled trial and one systematic review of randomized controlled trials. Both studies showed similar answers for the research question. The use of a letter grade scale allowed an objective analysis of the quality of evidence. A checklist-driven model that assesses and applies evidence to dental practice may substantially improve dentists' decision making skill.
Cuffney, Thomas F.; Gurtz, Martin E.; Meador, Michael R.
1993-01-01
Benthic invertebrate communities are evaluated as part of the ecological survey component of the U.S. Geological Survey's National Water-Quality Assessment Program. These biological data are collected along with physical and chemical data to assess water-quality conditions and to develop an understanding of the factors that affect water-quality conditions locally, regionally, and nationally. The objectives of benthic invertebrate community characterizations are to (1) develop for each site a list of taxa within the associated stream reach and (2) determine the structure of benthic invertebrate communities within selected habitats of that reach. A nationally consistent approach is used to achieve these objectives. This approach provides guidance on site, reach, and habitat selection and methods and equipment for qualitative multihabitat sampling and semi-quantitative single habitat sampling. Appropriate quality-assurance and quality-control guidelines are used to maximize the ability to analyze data within and among study units.
NASA Astrophysics Data System (ADS)
Zoraghi, Nima; Amiri, Maghsoud; Talebi, Golnaz; Zowghi, Mahdi
2013-12-01
This paper presents a fuzzy multi-criteria decision-making (FMCDM) model that integrates both subjective and objective weights for ranking and evaluating the service quality of hotels. The objective method selects weights of criteria through mathematical calculation, while the subjective method uses judgments of decision makers. In this paper, we use a combination of weights obtained by both approaches in evaluating service quality in the hotel industry. A real case study ranking five hotels is presented. Examples are shown to indicate the capabilities of the proposed method.
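One common way to realize such a weight combination is sketched below, with entropy-based objective weights and illustrative subjective weights; the blend parameter and all scores are assumptions, not the paper's data:

```python
# Combine objective (Shannon-entropy) and subjective (expert) criterion
# weights, then rank alternatives by weighted score.
import numpy as np

# rows = hotels, columns = criteria (e.g. staff, rooms, food, price, site)
scores = np.array([[7, 8, 6, 5, 9],
                   [6, 7, 8, 7, 6],
                   [9, 6, 7, 8, 7],
                   [5, 9, 6, 6, 8],
                   [8, 7, 9, 7, 5]], dtype=float)

p = scores / scores.sum(axis=0)                      # column-normalize
entropy = -(p * np.log(p)).sum(axis=0) / np.log(len(scores))
objective_w = (1 - entropy) / (1 - entropy).sum()    # entropy weights

subjective_w = np.array([0.30, 0.20, 0.20, 0.15, 0.15])  # expert judgment
lam = 0.5                                            # blend parameter
combined_w = lam * subjective_w + (1 - lam) * objective_w

ranking = (scores @ combined_w).argsort()[::-1]      # best hotel first
print(combined_w.round(3), ranking)
```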
Three-dimensional propagation in near-field tomographic X-ray phase retrieval
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ruhlandt, Aike, E-mail: aruhlan@gwdg.de; Salditt, Tim
An extension of phase retrieval algorithms for near-field X-ray (propagation) imaging to three dimensions is presented, enhancing the quality of the reconstruction by exploiting previously unused three-dimensional consistency constraints. The approach is based on a novel three-dimensional propagator and is derived for the case of optically weak objects. It can be easily implemented in current phase retrieval architectures, is computationally efficient and reduces the need for restrictive prior assumptions, resulting in superior reconstruction quality.
NASA Astrophysics Data System (ADS)
Tournaire, O.; Paparoditis, N.
Road detection has been a topic of great interest in the photogrammetric and remote sensing communities since the end of the 70s. Many approaches dealing with various sensor resolutions, the nature of the scene or the desired accuracy of the extracted objects have been presented. This topic remains challenging today as the need for accurate and up-to-date data is becoming more and more important. In this context, this paper studies the road network from a particular point of view, focusing on road marks, and in particular dashed lines. Indeed, they are very useful clues, both as evidence of a road and for higher-level tasks. For instance, they can be used to enhance quality and to improve road databases. It is also possible to delineate the different circulation lanes, their width and functionality (speed limit, special lanes for buses or bicycles...). In this paper, we propose a new robust and accurate top-down approach for dashed line detection based on stochastic geometry. Our approach is automatic in the sense that no intervention from a human operator is necessary to initialise the algorithm or to track errors during the process. The core of our approach relies on defining geometric, radiometric and relational models for dashed-line objects. The model also has to deal with the interactions between the different objects making up a line, meaning that it introduces external knowledge taken from specifications. Our strategy is based on a stochastic method, in particular marked point processes. Our goal is to find the object configuration minimising an energy function made up of a data attachment term, measuring the consistency of the image with respect to the objects, and a regularising term, managing the relationships between neighbouring objects. To sample the energy function, we use Green's algorithm coupled with simulated annealing to find its minimum. Results from aerial images at various resolutions are presented, showing that our approach is relevant and accurate as it can handle the most frequent layouts of dashed lines. Some issues, for instance the relative weighting of the two energy terms, are also discussed in the conclusion.
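The following toy sketch illustrates the general shape of such an energy minimization over a variable-size object configuration, with birth, death and perturbation moves accepted by a Metropolis rule under simulated annealing. The energy terms are placeholders, and the paper's actual sampler is Green's (reversible-jump MCMC) algorithm rather than this simplified scheme:

```python
import math, random

def data_term(obj, image):      # consistency of one object with the image (toy)
    x, y, theta = obj
    return -image.get((int(x), int(y)), 0.0)

def interaction(a, b):          # penalize overlapping/nearby objects (toy)
    d = math.hypot(a[0] - b[0], a[1] - b[1])
    return 1.0 if d < 2.0 else 0.0

def energy(cfg, image):
    u = sum(data_term(o, image) for o in cfg)
    u += sum(interaction(a, b) for i, a in enumerate(cfg) for b in cfg[i+1:])
    return u

def anneal(image, size=20, t0=1.0, cooling=0.999, steps=20000):
    cfg = [(random.uniform(0, size), random.uniform(0, size),
            random.uniform(0, math.pi)) for _ in range(5)]
    u, t = energy(cfg, image), t0
    for _ in range(steps):
        prop = list(cfg)
        move = random.choice(["birth", "death", "perturb"])
        if move == "birth":
            prop.append((random.uniform(0, size), random.uniform(0, size),
                         random.uniform(0, math.pi)))
        elif move == "death" and prop:
            prop.pop(random.randrange(len(prop)))
        elif prop:
            i = random.randrange(len(prop))
            x, y, th = prop[i]
            prop[i] = (x + random.gauss(0, 0.5), y + random.gauss(0, 0.5), th)
        v = energy(prop, image)
        if v < u or random.random() < math.exp((u - v) / t):  # Metropolis rule
            cfg, u = prop, v
        t *= cooling
    return cfg

image = {(5, 5): 3.0, (12, 9): 3.0}     # toy "data": two bright spots
print(len(anneal(image)))
```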
Polnaszek, Brock; Gilmore-Bykovskyi, Andrea; Hovanes, Melissa; Roiland, Rachel; Ferguson, Patrick; Brown, Roger; Kind, Amy JH
2014-01-01
Background Unstructured data encountered during retrospective electronic medical record (EMR) abstraction has routinely been identified as challenging to reliably abstract, as this data is often recorded as free text, without limitations to format or structure. There is increased interest in reliably abstracting this type of data given its prominent role in care coordination and communication, yet limited methodological guidance exists. Objective As standard abstraction approaches resulted in sub-standard data reliability for unstructured data elements collected as part of a multi-site, retrospective EMR study of hospital discharge communication quality, our goal was to develop, apply and examine the utility of a phase-based approach to reliably abstract unstructured data. This approach is examined using the specific example of discharge communication for warfarin management. Research Design We adopted a “fit-for-use” framework to guide the development and evaluation of abstraction methods using a four step, phase-based approach including (1) team building, (2) identification of challenges, (3) adaptation of abstraction methods, and (4) systematic data quality monitoring. Measures Unstructured data elements were the focus of this study, including elements communicating steps in warfarin management (e.g., warfarin initiation) and medical follow-up (e.g., timeframe for follow-up). Results After implementation of the phase-based approach, inter-rater reliability for all unstructured data elements demonstrated kappas of ≥ 0.89 -- an average increase of + 0.25 for each unstructured data element. Conclusions As compared to standard abstraction methodologies, this phase-based approach was more time intensive, but did markedly increase abstraction reliability for unstructured data elements within multi-site EMR documentation. PMID:27624585
A Treatment Stage Specific Approach to Improving Quality of Life for Women with Ovarian Cancer
2005-10-01
This study focuses on quality of life among women with ovarian cancer. The primary objective of the study is to identify the issues that are of...repeated measures design will be used to assess changes in problem areas and quality of life from diagnosis to recurrence among women newly diagnosed with...objectives of the study are as follows: (1) to assess changes in quality of life (as quantified by the FACT-O questionnaire) across the different
Eckermann, Simon; Coelli, Tim
2013-01-01
Evidence based medicine supports net benefit maximising therapies and strategies in processes of health technology assessment (HTA) for reimbursement and subsidy decisions internationally. However, translation of evidence based medicine to practice is impeded by efficiency measures such as cost per case-mix adjusted separation in hospitals, which ignore health effects of care. In this paper we identify a correspondence method that allows quality variables under control of providers to be incorporated in efficiency measures consistent with maximising net benefit. Including effects framed from a disutility bearing (utility reducing) perspective (e.g. mortality, morbidity or reduction in life years) as inputs and minimising quality inclusive costs on the cost-disutility plane is shown to enable efficiency measures consistent with maximising net benefit under a one to one correspondence. The method combines advantages of radial properties with an appropriate objective of maximising net benefit to overcome problems of inappropriate objectives implicit with alternative methods, whether specifying quality variables with utility bearing output (e.g. survival, reduction in morbidity or life years), hyperbolic or exogenous variables. This correspondence approach is illustrated in undertaking efficiency comparison at a clinical activity level for 45 Australian hospitals allowing for their costs and mortality rates per admission. Explicit coverage and comparability conditions of the underlying correspondence method are also shown to provide a robust framework for preventing cost-shifting and cream-skimming incentives, with appropriate qualification of analysis and support for data linkage and risk adjustment where these conditions are not satisfied. Comparison on the cost-disutility plane has previously been shown to have distinct advantages in comparing multiple strategies in HTA, which this paper naturally extends to a robust method and framework for comparing efficiency of health care providers in practice. Consequently, the proposed approach provides a missing link between HTA and practice, to allow active incentives for evidence based net benefit maximisation in practice.
Sparsity-based fast CGH generation using layer-based approach for 3D point cloud model
NASA Astrophysics Data System (ADS)
Kim, Hak Gu; Jeong, Hyunwook; Ro, Yong Man
2017-03-01
Computer generated hologram (CGH) is becoming increasingly important for 3-D displays in various applications including virtual reality. In CGH, holographic fringe patterns are generated by numerically calculating them on computer simulation systems. However, a heavy computational cost is required to calculate the complex amplitude on the CGH plane for all points of a 3D object. This paper proposes a new fast CGH generation method based on the sparsity of the CGH of a 3D point cloud model. The aim of the proposed method is to significantly reduce computational complexity while maintaining the quality of the holographic fringe patterns. To that end, we present a new layer-based approach for calculating the complex amplitude distribution on the CGH plane by using a sparse FFT (sFFT). We observe that the CGH of a layer of a 3D object is sparse, so that the dominant CGH components can be rapidly generated from a small set of signals by sFFT. Experimental results have shown that the proposed method is one order of magnitude faster than recently reported fast CGH generation methods.
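To make the layer-based idea concrete, here is a hedged NumPy sketch that propagates each depth layer to the hologram plane with an angular-spectrum transfer function and sums the contributions. A plain FFT stands in for the paper's sparse FFT, and all optical parameters and point positions are illustrative:

```python
import numpy as np

def propagate_layer(field, wavelength, pitch, z):
    """Angular-spectrum propagation of one depth layer to the CGH plane."""
    n = field.shape[0]
    fx = np.fft.fftfreq(n, d=pitch)
    FX, FY = np.meshgrid(fx, fx)
    arg = 1.0 - (wavelength * FX) ** 2 - (wavelength * FY) ** 2
    kz = 2 * np.pi / wavelength * np.sqrt(np.maximum(arg, 0.0))
    H = np.exp(1j * kz * z)                      # transfer function
    return np.fft.ifft2(np.fft.fft2(field) * H)

n, wavelength, pitch = 512, 532e-9, 8e-6
hologram = np.zeros((n, n), dtype=complex)
# Slice the point cloud into depth layers; each layer is a complex field
# with unit point sources at the (row, col) positions of its points.
layers = {0.10: [(256, 256)], 0.12: [(128, 300)]}   # depth -> points (toy)
for z, points in layers.items():
    field = np.zeros((n, n), dtype=complex)
    for r, c in points:
        field[r, c] = 1.0
    hologram += propagate_layer(field, wavelength, pitch, z)
fringe = np.angle(hologram)        # phase-only fringe pattern
print(fringe.shape)
```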
Perceptual video quality assessment in H.264 video coding standard using objective modeling.
Karthikeyan, Ramasamy; Sainarayanan, Gopalakrishnan; Deepa, Subramaniam Nachimuthu
2014-01-01
Since usage of digital video is widespread nowadays, quality considerations have become essential, and industry demand for video quality measurement is rising. This proposal provides a method of perceptual quality assessment for the H.264 standard encoder using objective modeling. For this purpose, quality impairments are calculated and a model is developed to compute the perceptual video quality metric based on a no-reference method. Because of the subtle differences between the original video and the encoded video, the quality of the encoded picture is degraded; this quality difference is introduced by encoding processes such as intra- and inter-prediction. The proposed model takes into account the artifacts introduced by these spatial and temporal activities in hybrid block-based coding methods, and an objective modeling of these artifacts into subjective quality estimation is proposed. The proposed model calculates the objective quality metric using subjective impairments (blockiness, blur and jerkiness), in contrast to the existing bitrate-only calculation defined in the ITU-T G.1070 model. The accuracy of the proposed perceptual video quality metrics is compared against popular full-reference objective methods as defined by VQEG.
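As a rough illustration of two of the named impairments, the sketch below computes textbook-style blockiness and blur features on a grayscale frame with NumPy; these generic definitions are assumptions, not the paper's exact formulas:

```python
import numpy as np

def blockiness(frame, block=8):
    """Mean luminance jump across 8x8 block boundaries."""
    f = frame.astype(float)
    dv = np.abs(np.diff(f, axis=1))[:, block-1::block].mean()  # vertical edges
    dh = np.abs(np.diff(f, axis=0))[block-1::block, :].mean()  # horizontal edges
    return (dv + dh) / 2.0

def blur(frame):
    """Variance of a 4-neighbour Laplacian; low values suggest blur."""
    f = frame.astype(float)
    lap = (-4 * f[1:-1, 1:-1] + f[:-2, 1:-1] + f[2:, 1:-1]
           + f[1:-1, :-2] + f[1:-1, 2:])
    return lap.var()

frame = (np.random.rand(288, 352) * 255).astype(np.uint8)
print(blockiness(frame), blur(frame))
```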
Quality Assurance of Cancer Study Common Data Elements Using A Post-Coordination Approach
Jiang, Guoqian; Solbrig, Harold R.; Prud’hommeaux, Eric; Tao, Cui; Weng, Chunhua; Chute, Christopher G.
2015-01-01
Domain-specific common data elements (CDEs) are emerging as an effective approach to standards-based clinical research data storage and retrieval. A limiting factor, however, is the lack of robust automated quality assurance (QA) tools for the CDEs in clinical study domains. The objectives of the present study are to prototype and evaluate a QA tool for the study of cancer CDEs using a post-coordination approach. The study starts by integrating the NCI caDSR CDEs and The Cancer Genome Atlas (TCGA) data dictionaries in a single Resource Description Framework (RDF) data store. We designed a compositional expression pattern based on the Data Element Concept model structure informed by ISO/IEC 11179, and developed a transformation tool that converts the pattern-based compositional expressions into the Web Ontology Language (OWL) syntax. Invoking reasoning and explanation services, we tested the system utilizing the CDEs extracted from two TCGA clinical cancer study domains. The system could automatically identify duplicate CDEs, and detect CDE modeling errors. In conclusion, compositional expressions not only enable reuse of existing ontology codes to define new domain concepts, but also provide an automated mechanism for QA of terminological annotations for CDEs. PMID:26958201
Jiang, Nanfeng; Song, Weiran; Wang, Hui; Guo, Gongde; Liu, Yuanyuan
2018-05-23
As the expectation for higher quality of life increases, consumers have higher demands for quality food. Food authentication is the technical means of ensuring food is what it says it is. A popular approach to food authentication is based on spectroscopy, which has been widely used for identifying and quantifying the chemical components of an object. This approach is non-destructive and effective but expensive. This paper presents a computer vision-based sensor system for food authentication, i.e., differentiating organic from non-organic apples. The sensor system consists of low-cost hardware and pattern recognition software. We use a flashlight to illuminate apples and capture their images through a diffraction grating. These diffraction images are then converted into a data matrix for classification by pattern recognition algorithms, including k-nearest neighbors (k-NN), support vector machine (SVM) and three partial least squares discriminant analysis (PLS-DA)-based methods. We carry out experiments on a reasonable collection of apple samples and employ appropriate pre-processing, achieving a highest classification accuracy of 94%. Our studies conclude that this sensor system has the potential to provide a viable solution to empower consumers in food authentication.
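A sketch of the classification stage under stated assumptions: the feature matrix is synthetic (standing in for flattened diffraction-image descriptors), and PLS-DA is emulated with PLSRegression on one-hot labels plus an argmax, a common construction that may differ from the paper's three PLS-DA variants:

```python
import numpy as np
from sklearn.base import BaseEstimator, ClassifierMixin
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

class PLSDA(BaseEstimator, ClassifierMixin):
    def __init__(self, n_components=5):
        self.n_components = n_components
    def fit(self, X, y):
        self.classes_ = np.unique(y)
        Y = (y[:, None] == self.classes_[None, :]).astype(float)  # one-hot
        self.pls_ = PLSRegression(n_components=self.n_components).fit(X, Y)
        return self
    def predict(self, X):
        return self.classes_[np.argmax(self.pls_.predict(X), axis=1)]

rng = np.random.default_rng(0)
X = rng.normal(size=(120, 300))          # 120 samples, 300 features
y = rng.integers(0, 2, size=120)         # 0 = non-organic, 1 = organic
for name, clf in [("k-NN", KNeighborsClassifier(5)),
                  ("SVM", SVC(kernel="rbf", C=1.0)),
                  ("PLS-DA", PLSDA(n_components=5))]:
    pipe = make_pipeline(StandardScaler(), clf)
    print(name, cross_val_score(pipe, X, y, cv=5).mean())
```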
Application of target costing in machining
NASA Astrophysics Data System (ADS)
Gopalakrishnan, Bhaskaran; Kokatnur, Ameet; Gupta, Deepak P.
2004-11-01
In today's intensely competitive and highly volatile business environment, consistent development of low-cost, high-quality products that meet functionality requirements is key to a company's survival. Companies continuously strive to reduce costs while still producing quality products to stay ahead of the competition. Many companies have turned to target costing to achieve this objective. Target costing is a structured approach to determine the cost at which a proposed product, meeting the quality and functionality requirements, must be produced in order to generate the desired profits. It subtracts the desired profit margin from the company's selling price to establish the manufacturing cost of the product. An extensive literature review revealed that companies in the automotive, electronics and process industries have reaped the benefits of target costing. However, the target costing approach has not been applied in the machining industry; instead, techniques based on Geometric Programming, Goal Programming, and Lagrange Multipliers have been proposed for this industry. These models follow a forward approach, first selecting a set of machining parameters and then determining the machining cost. Hence, in this study we developed an algorithm to apply the concepts of target costing, a backward approach that selects the machining parameters based on the required machining cost, and is therefore more suitable for practical applications in process improvement and cost reduction. A target costing model was developed for the turning operation and was successfully validated using practical data.
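The backward logic can be made concrete with a small sketch: fix the target cost from price and margin, then search for machining parameters whose modeled cost meets it. The cost model below is a standard textbook turning-cost form with Taylor tool life; all coefficients, the price and the margin are illustrative assumptions, not the paper's data:

```python
import itertools

price, margin = 4.0, 0.30
target_cost = price * (1 - margin)       # cost per part must not exceed this

def turning_cost(v, f, D=60.0, L=120.0, labor_rate=2.0,
                 tool_cost=15.0, C=350.0, n=0.25):
    """Cost per part: machining-time cost + tool-wear cost."""
    t_m = 3.1416 * D * L / (1000.0 * v * f)   # machining time (min)
    tool_life = (C / v) ** (1.0 / n)          # Taylor's equation: v * T^n = C
    return labor_rate * t_m + tool_cost * t_m / tool_life

# Backward search: keep only parameter sets whose cost meets the target,
# then pick the most productive feasible one.
speeds = range(100, 301, 10)              # cutting speed v (m/min)
feeds = [0.10, 0.15, 0.20, 0.25, 0.30]    # feed f (mm/rev)
feasible = [(v, f) for v, f in itertools.product(speeds, feeds)
            if turning_cost(v, f) <= target_cost]
best = max(feasible, key=lambda p: p[0] * p[1])   # material-removal proxy
print(target_cost, best, round(turning_cost(*best), 2))
```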
Dealing with office emergencies. Stepwise approach for family physicians.
Sempowski, Ian P.; Brison, Robert J.
2002-01-01
OBJECTIVE: To develop a simple stepwise approach to initial management of emergencies in family physicians' offices; to review how to prepare health care teams and equipment; and to illustrate a general approach to three of the most common office emergencies. QUALITY OF EVIDENCE: MEDLINE was searched from January 1980 to December 2001. Articles were selected based on their clinical relevance, quality of evidence, and date of publication. We reviewed American family medicine, pediatric, dental, and dermatologic articles, but found that the area has not been well studied from a Canadian family medicine perspective. Consensus statements by specialty professional groups were used to identify accepted emergency medical treatments. MAIN MESSAGE: Family medicine offices are frequently poorly equipped and inadequately prepared to deal with emergencies. Straightforward emergency response plans can be designed and tailored to an office's risk profile. A systematic team approach and effective use of skills, support staff, and equipment is important. The general approach can be modified for specific patients or conditions. CONCLUSION: Family physicians can plan ahead and use a team approach to develop a simple stepwise response to emergency situations in the office. PMID:12371305
SU-E-T-776: Use of Quality Metrics for a New Hypo-Fractionated Pre-Surgical Mesothelioma Protocol
DOE Office of Scientific and Technical Information (OSTI.GOV)
Richardson, S; Mehta, V
Purpose: The “SMART” (Surgery for Mesothelioma After Radiation Therapy) approach involves hypo-fractionated radiotherapy of the lung pleura to 25Gy over 5 days followed by surgical resection within 7 days. Early clinical results suggest that this approach is very promising, but also logistically challenging due to the multidisciplinary involvement. Due to the compressed schedule, high dose, and shortened planning time, the delivery of the planned doses was monitored for safety with quality metric software. Methods: Hypo-fractionated IMRT treatment plans were developed for all patients and exported to Quality Reports™ software. Plan quality metrics or PQMs™ were created to calculate an objective scoring function for each plan. This allows for an objective assessment of the quality of the plan and a benchmark for plan improvement for subsequent patients. The priorities of various components were incorporated based on similar hypo-fractionated protocols such as lung SBRT treatments. Results: Five patients have been treated at our institution using this approach. The plans were developed, QA performed, and ready within 5 days of simulation. Plan quality metrics utilized in scoring included doses to OAR and target coverage. All patients tolerated treatment well and proceeded to surgery as scheduled. Reported toxicity included grade 1 nausea (n=1), grade 1 esophagitis (n=1), and grade 2 fatigue (n=3). One patient had recurrent fluid accumulation following surgery. No patients experienced any pulmonary toxicity prior to surgery. Conclusion: An accelerated course of pre-operative high-dose radiation for mesothelioma is an innovative and promising new protocol. Without historical data, one must proceed cautiously and monitor the data carefully. The development of quality metrics and scoring functions for these treatments allows us to benchmark our plans and monitor improvement. If subsequent toxicities occur, these will be easy to investigate and incorporate into the metrics. This will improve the safe delivery of large doses for these patients.
Object view in spatial system dynamics: a grassland farming example
Neuwirth, Christian; Hofer, Barbara; Schaumberger, Andreas
2016-01-01
Spatial system dynamics (SSD) models are typically implemented by linking stock variables to raster grids while the use of object representations of human artefacts such as buildings or ownership has been limited. This limitation is addressed by this article, which demonstrates the use of object representations in SSD. The objects are parcels of land that are attributed to grassland farms. The model simulates structural change in agriculture, i.e., change in the size of farms. The aim of the model is to reveal relations between structural change, farmland fragmentation and variable farmland quality. Results show that fragmented farms tend to become consolidated by structural change, whereas consolidated initial conditions result in a significant increase of fragmentation. Consolidation is reinforced by a dynamic land market and high transportation costs. The example demonstrates the capabilities of the object-based approach for integrating object geometries (parcel shapes) and relations between objects (distances between parcels) dynamically in SSD. PMID:28190972
Competence and Quality in Real-Life Decision Making.
Geisler, Martin; Allwood, Carl Martin
2015-01-01
What distinguishes a competent decision maker, and how should the issue of decision quality be approached in a real-life context? These questions were explored in three studies. In Study 1, using a web-based questionnaire and targeting a community sample, we investigated the relationships between objective and subjective indicators of real-life decision-making success. In Studies 2 and 3, targeting two different samples of professionals, we explored whether the prevalent cognitively oriented definition of decision-making competence could be beneficially expanded by adding aspects of competence in terms of social skills and time-approach. The predictive power of each of these three aspects of decision-making competence was explored for different indicators of real-life decision-making success. Overall, our results suggest that research on decision-making competence would benefit from expanding the definition of competence to include decision-related abilities in terms of social skills and time-approach. Finally, the results also indicate that individual differences in real-life decision-making success can profitably be approached and measured by different criteria.
NASA Astrophysics Data System (ADS)
Jia, Zhao-hong; Pei, Ming-li; Leung, Joseph Y.-T.
2017-12-01
In this paper, we investigate the batch-scheduling problem with rejection on parallel machines with non-identical job sizes and arbitrary job rejection weights. If a job is rejected, the corresponding penalty has to be paid. Our objective is to minimise the makespan of the processed jobs and the total rejection cost of the rejected jobs. Based on the selected multi-objective optimisation approaches, two problems, P1 and P2, are considered. In P1, the two objectives are linearly combined into one single objective. In P2, the two objectives are simultaneously minimised and the Pareto non-dominated solution set is to be found. Based on ant colony optimisation (ACO), two algorithms, called LACO and PACO, are proposed to address the two problems, respectively. Two different objective-oriented pheromone matrices and heuristic information are designed. Additionally, a local optimisation algorithm is adopted to improve the solution quality. Finally, simulation experiments are conducted, and the comparative results verify the effectiveness and efficiency of the proposed algorithms, especially on large-scale instances.
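The two formulations can be illustrated on synthetic (makespan, rejection cost) points; in the paper these candidate schedules would come from the LACO/PACO ant-colony searches rather than a random generator:

```python
import random

random.seed(1)
candidates = [(random.uniform(50, 100), random.uniform(0, 40))
              for _ in range(200)]      # (makespan, rejection cost)

# P1: linear combination of the two objectives into a single objective.
w = 0.7
best_p1 = min(candidates, key=lambda s: w * s[0] + (1 - w) * s[1])

# P2: Pareto non-dominated filtering (both objectives minimized).
def dominates(a, b):
    return a[0] <= b[0] and a[1] <= b[1] and a != b

pareto = [s for s in candidates
          if not any(dominates(t, s) for t in candidates)]
print(best_p1, sorted(pareto))
```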
Pilot-optimal augmentation synthesis
NASA Technical Reports Server (NTRS)
Schmidt, D. K.
1978-01-01
An augmentation synthesis method usable in the absence of quantitative handling qualities specifications, and yet explicitly including design objectives based on pilot-rating concepts, is presented. The algorithm involves the unique approach of simultaneously solving for the stability augmentation system (SAS) gains, pilot equalization and pilot rating prediction via optimal control techniques. Simultaneous solution is required in this case since the pilot model (gains, etc.) depends upon the augmented plant dynamics, and the augmentation is obviously not a priori known. Another special feature is the use of the pilot's objective function (from which the pilot model evolves) to design the SAS.
Robust Feedback Zoom Tracking for Digital Video Surveillance
Zou, Tengyue; Tang, Xiaoqi; Song, Bao; Wang, Jin; Chen, Jihong
2012-01-01
Zoom tracking is an important function in video surveillance, particularly in traffic management and security monitoring. It involves keeping an object of interest in focus during the zoom operation. Zoom tracking is typically achieved by moving the zoom and focus motors in lenses following the so-called “trace curve”, which shows the in-focus motor positions versus the zoom motor positions for a specific object distance. The main task of a zoom tracking approach is to accurately estimate the trace curve for the specified object. Because a proportional integral derivative (PID) controller has historically been considered the best controller in the absence of knowledge of the underlying process, and because of its high-quality performance in motor control, we propose in this paper a novel feedback zoom tracking (FZT) approach based on geometric trace curve estimation and a PID feedback controller. The performance of this approach is compared with existing zoom tracking methods in digital video surveillance. The real-time implementation results obtained on an actual digital video platform indicate that the developed FZT approach not only solves the traditional one-to-many mapping problem without pre-training but also improves the robustness of tracking moving or switching objects, which is the key challenge in video surveillance. PMID:22969388
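For illustration, a minimal discrete PID loop that could drive a focus motor toward a trace-curve setpoint; the gains, the first-order motor model and the quadratic trace-curve fit are assumptions, not the paper's calibration:

```python
class PID:
    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral, self.prev_err = 0.0, None

    def step(self, setpoint, measured):
        err = setpoint - measured
        self.integral += err * self.dt
        deriv = 0.0 if self.prev_err is None else (err - self.prev_err) / self.dt
        self.prev_err = err
        return self.kp * err + self.ki * self.integral + self.kd * deriv

def trace_curve(zoom_pos):
    """In-focus focus-motor position for a given zoom position (toy fit)."""
    return 0.002 * zoom_pos ** 2 + 0.5 * zoom_pos + 100.0

dt = 0.01
pid, focus = PID(kp=5.0, ki=2.0, kd=0.1, dt=dt), 0.0
for step in range(500):
    zoom = step * 0.2                      # zoom motor sweeping outward
    target = trace_curve(zoom)             # estimated in-focus position
    focus += pid.step(target, focus) * dt  # velocity command to focus motor
print(round(target, 1), round(focus, 1))   # focus should track the curve
```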
Control design for future agile fighters
NASA Technical Reports Server (NTRS)
Murphy, Patrick C.; Davidson, John B.
1991-01-01
The CRAFT control design methodology is presented. CRAFT stands for the design objectives addressed, namely, Control power, Robustness, Agility, and Flying Qualities Tradeoffs. The approach combines eigenspace assignment, which allows for direct specification of eigenvalues and eigenvectors, and a graphical approach for representing control design metrics that captures numerous design goals in one composite illustration. The methodology makes use of control design metrics from four design objective areas, namely, control power, robustness, agility, and flying qualities. An example of the CRAFT methodology as well as associated design issues are presented.
NASA Astrophysics Data System (ADS)
Selker, J. S.; Kahsai, S. K.
2017-12-01
Green Infrastructure (GI), or Low Impact Development (LID), is a land use planning and design approach with the objective of mitigating the environmental impacts of land development, and it is increasingly looked to as a way to lessen runoff and pollutant loading to receiving water bodies. Broad-scale approaches for siting GI/LID have been developed for agricultural watersheds, but are rare for urban watersheds, largely due to greater land use complexity. The task is even more challenging in urban Africa due to the combination of poor data quality, rapid and unplanned development, and civic institutions unable to reliably carry out regular maintenance. We present a spatio-temporal, simulation-based approach to identify an optimal prioritization of sites for GI/LID based on DEM, land use and land cover data. The optimization couples a multi-objective optimization tool with an urban storm water management model (SWMM) to identify the most cost-effective combination of LID/GI. The approach was applied as a case study to a mixed-use urban watershed in NW Kampala, Uganda, the Lubigi Catchment, which is notorious for heavy flooding every year.
NASA Astrophysics Data System (ADS)
Gong, Changfei; Han, Ce; Gan, Guanghui; Deng, Zhenxiang; Zhou, Yongqiang; Yi, Jinling; Zheng, Xiaomin; Xie, Congying; Jin, Xiance
2017-04-01
Dynamic myocardial perfusion CT (DMP-CT) imaging provides quantitative functional information for diagnosis and risk stratification of coronary artery disease by calculating myocardial perfusion hemodynamic parameter (MPHP) maps. However, the level of radiation delivered by a dynamic sequential scan protocol can be potentially high. The purpose of this work is to develop a pre-contrast normal-dose scan induced structure tensor total variation regularization, based on the penalized weighted least-squares (PWLS) criterion, to improve the image quality of DMP-CT with a low-mAs CT acquisition. For simplicity, the present approach is termed 'PWLS-ndiSTV'. Specifically, the ndiSTV regularization takes into account the spatial-temporal structure information of DMP-CT data and further exploits the higher-order derivatives of the objective images to enhance denoising performance. Subsequently, an effective optimization algorithm based on the split-Bregman approach was adopted to minimize the associated objective function. Evaluations with a modified dynamic XCAT phantom and preclinical porcine datasets have demonstrated that the proposed PWLS-ndiSTV approach can achieve promising gains over existing approaches in terms of noise-induced artifact mitigation, edge-detail preservation, and accurate MPHP map calculation.
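For reference, a generic PWLS objective with an added regularizer takes the following form (a sketch only; the paper's exact statistical weighting and ndiSTV penalty may differ in detail), with ŷ the log-transformed projection data, A the system matrix, Σ a diagonal covariance of the measurements, and β the regularization strength:

```latex
% Generic PWLS objective with regularization (sketch)
\hat{\mu} = \arg\min_{\mu \ge 0}\;
  (\hat{y} - A\mu)^{\mathsf{T}} \,\Sigma^{-1}\, (\hat{y} - A\mu)
  \;+\; \beta \, R_{\mathrm{ndiSTV}}(\mu)
```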
ERIC Educational Resources Information Center
Chetley, Andrew
In 1977, the Bernard van Leer Foundation began supporting a project in Colombia that had the objective of improving the quality of early childhood care and education in a small village. The Costa Atlantica project offered an approach to development that was based on community organization, social management, participation, cooperation, popular…
Cooperative Fuzzy Games Approach to Setting Target Levels of ECs in Quality Function Deployment
Yang, Zhihui; Chen, Yizeng; Yin, Yunqiang
2014-01-01
Quality function deployment (QFD) can provide a means of translating customer requirements (CRs) into engineering characteristics (ECs) for each stage of product development and production. The main objective of QFD-based product planning is to determine the target levels of ECs for a new product or service. QFD is a breakthrough tool which can effectively reduce the gap between CRs and a new product/service. Even though there are conflicts among some ECs, the objective of developing new product is to maximize the overall customer satisfaction. Therefore, there may be room for cooperation among ECs. A cooperative game framework combined with fuzzy set theory is developed to determine the target levels of the ECs in QFD. The key to develop the model is the formulation of the bargaining function. In the proposed methodology, the players are viewed as the membership functions of ECs to formulate the bargaining function. The solution for the proposed model is Pareto-optimal. An illustrated example is cited to demonstrate the application and performance of the proposed approach. PMID:25097884
Sample size determination for bibliographic retrieval studies
Yao, Xiaomei; Wilczynski, Nancy L; Walter, Stephen D; Haynes, R Brian
2008-01-01
Background Research for developing search strategies to retrieve high-quality clinical journal articles from MEDLINE is expensive and time-consuming. The objective of this study was to determine the minimal number of high-quality articles in a journal subset that would need to be hand-searched to update or create new MEDLINE search strategies for treatment, diagnosis, and prognosis studies. Methods The desired width of the 95% confidence intervals (W) for the lowest sensitivity among existing search strategies was used to calculate the number of high-quality articles needed to reliably update search strategies. New search strategies were derived in journal subsets formed by 2 approaches: random sampling of journals and top journals (having the most high-quality articles). The new strategies were tested in both the original large journal database and in a low-yielding journal (having few high-quality articles) subset. Results For treatment studies, if W was 10% or less for the lowest sensitivity among our existing search strategies, a subset of 15 randomly selected journals or 2 top journals were adequate for updating search strategies, based on each approach having at least 99 high-quality articles. The new strategies derived in 15 randomly selected journals or 2 top journals performed well in the original large journal database. Nevertheless, the new search strategies developed using the random sampling approach performed better than those developed using the top journal approach in a low-yielding journal subset. For studies of diagnosis and prognosis, no journal subset had enough high-quality articles to achieve the expected W (10%). Conclusion The approach of randomly sampling a small subset of journals that includes sufficient high-quality articles is an efficient way to update or create search strategies for high-quality articles on therapy in MEDLINE. The concentrations of diagnosis and prognosis articles are too low for this approach. PMID:18823538
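The arithmetic behind the quoted article counts can be sketched with the usual normal-approximation confidence interval for a proportion (here, a search strategy's sensitivity); whether the authors used exactly this formula is an assumption:

```python
import math

def articles_needed(sensitivity, width, z=1.96):
    """Number of high-quality articles so the 95% CI for `sensitivity`
    has total width W: W = 2 * z * sqrt(p * (1 - p) / n)."""
    p = sensitivity
    return math.ceil(4 * z**2 * p * (1 - p) / width**2)

# A lowest sensitivity around 0.93 with W = 10% gives roughly the
# "at least 99 articles" figure quoted in the abstract.
print(articles_needed(0.93, 0.10))   # -> 101
```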
NASA Astrophysics Data System (ADS)
Wesemann, Johannes; Burgholzer, Reinhard; Herrnegger, Mathew; Schulz, Karsten
2017-04-01
In recent years, a lot of research effort in hydrological modelling has been invested in improving the automatic calibration of rainfall-runoff models. This includes, for example, (1) the implementation of new optimisation methods, (2) the incorporation of new and different objective criteria and signatures in the optimisation, and (3) the usage of auxiliary data sets apart from runoff. Nevertheless, in many applications manual calibration is still justifiable and frequently applied. The hydrologist performing the manual calibration can, with his or her expert knowledge, judge the hydrographs simultaneously in detail and in a holistic view. This integrated eyeball-verification procedure can be difficult to formulate as objective criteria, even when using a multi-criteria approach. Comparing the results of automatic and manual calibration is not straightforward. Automatic calibration often solely involves objective criteria such as the Nash-Sutcliffe efficiency or the Kling-Gupta efficiency as a benchmark during the calibration. Consequently, a comparison based on such measures is intrinsically biased towards automatic calibration. Additionally, objective criteria do not cover all aspects of a hydrograph, leaving open questions concerning the quality of a simulation. This contribution therefore seeks to examine the quality of manually and automatically calibrated hydrographs by interactively involving expert knowledge in the evaluation. Simulations have been performed for the Mur catchment in Austria with the rainfall-runoff model COSERO using two parameter sets evolved from a manual and an automatic calibration. A subset of resulting hydrographs for observation and simulation, representing the typical flow conditions and events, will be evaluated in this study. In an interactive crowdsourcing approach, experts attending the session can vote for their preferred simulated hydrograph without having information on the calibration method that produced the respective hydrograph. Therefore, the result of the poll can be seen as an additional quality criterion for the comparison of the two different approaches and can help in the evaluation of the automatic calibration method.
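The two criteria mentioned above have standard forms (Nash & Sutcliffe 1970; Gupta et al. 2009), sketched here for paired observed/simulated runoff series; the sample values are illustrative:

```python
import numpy as np

def nse(obs, sim):
    """Nash-Sutcliffe efficiency: 1 is a perfect fit."""
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

def kge(obs, sim):
    """Kling-Gupta efficiency: correlation, variability and bias terms."""
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    r = np.corrcoef(obs, sim)[0, 1]          # linear correlation
    alpha = sim.std() / obs.std()            # variability ratio
    beta = sim.mean() / obs.mean()           # bias ratio
    return 1.0 - np.sqrt((r - 1) ** 2 + (alpha - 1) ** 2 + (beta - 1) ** 2)

obs = np.array([3.1, 4.0, 9.5, 7.2, 5.0, 4.4])
sim = np.array([2.8, 4.2, 8.9, 7.9, 5.3, 4.1])
print(nse(obs, sim), kge(obs, sim))
```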
2013-01-01
Background The objective of screening programs is to discover life-threatening diseases in as many patients as early as possible and to increase the chance of survival. To be able to compare aspects of health care quality, methods are needed for benchmarking that allow comparisons on various health care levels (regional, national, and international). Objectives Applications and extensions of algorithms can be used to link the information on disease phases with relative survival rates and to consolidate them in composite measures. The application of the developed SAS-macros will give results for benchmarking of health care quality. Data examples for breast cancer care are given. Methods A reference scale (expected, E) must be defined at a time point at which all benchmark objects (observed, O) are measured. All indices are defined as O/E, whereby the extended standardized screening-index (eSSI), the standardized case-mix-index (SCI), the work-up-index (SWI), and the treatment-index (STI) address different health care aspects. The composite measures called overall-performance evaluation (OPE) and relative overall performance indices (ROPI) link the individual indices differently for cross-sectional or longitudinal analyses. Results Algorithms allow a time point and a time interval associated comparison of the benchmark objects in the indices eSSI, SCI, SWI, STI, OPE, and ROPI. Comparisons between countries, states and districts are possible. As an example, comparisons between two countries are made. The success of early detection and screening programs as well as clinical health care quality for breast cancer can be demonstrated while accounting for the population's background mortality. Conclusions If external quality assurance programs and benchmark objects are based on population-based and corresponding demographic data, information on disease phase and relative survival rates can be combined into indices which offer approaches for comparative analyses between benchmark objects. Conclusions on screening programs and health care quality are possible. The macros can be transferred to other diseases if a disease-specific phase scale of prognostic value (e.g. stage) exists. PMID:23316692
NASA Astrophysics Data System (ADS)
Bouter, Anton; Alderliesten, Tanja; Bosman, Peter A. N.
2017-02-01
Taking a multi-objective optimization approach to deformable image registration has recently gained attention, because such an approach removes the requirement of manually tuning the weights of all the involved objectives. Especially for problems that require large complex deformations, this is a non-trivial task. From the resulting Pareto set of solutions one can then much more insightfully select a registration outcome that is most suitable for the problem at hand. To serve as an internal optimization engine, currently used multi-objective algorithms are competent, but rather inefficient. In this paper we largely improve upon this by introducing a multi-objective real-valued adaptation of the recently introduced Gene-pool Optimal Mixing Evolutionary Algorithm (GOMEA) for discrete optimization. In this work, GOMEA is tailored specifically to the problem of deformable image registration to obtain substantially improved efficiency. This improvement is achieved by exploiting a key strength of GOMEA: iteratively improving small parts of solutions, allowing to faster exploit the impact of such updates on the objectives at hand through partial evaluations. We performed experiments on three registration problems. In particular, an artificial problem containing a disappearing structure, a pair of pre- and post-operative breast CT scans, and a pair of breast MRI scans acquired in prone and supine position were considered. Results show that compared to the previously used evolutionary algorithm, GOMEA obtains a speed-up of up to a factor of 1600 on the tested registration problems while achieving registration outcomes of similar quality.
The Quality Control Circle: Is It for Education?
ERIC Educational Resources Information Center
Land, Arthur J.
From its start in Japan after World War II, the Quality Control Circle (Q.C.) approach to management and organizational operation evolved into what it is today: people doing similar work meeting regularly to identify, objectively analyze, and develop solutions to problems. The Q.C. approach meets Maslow's theory of motivation by inviting…
Intelligent Control Approaches for Aircraft Applications
NASA Technical Reports Server (NTRS)
Gundy-Burlet, Karen; KrishnaKumar, K.; Soloway, Don; Kaneshige, John; Clancy, Daniel (Technical Monitor)
2001-01-01
This paper presents an overview of various intelligent control technologies currently being developed and studied under the Intelligent Flight Control (IFC) program at the NASA Ames Research Center. The main objective of the intelligent flight control program is to develop the next generation of flight controllers for the purpose of automatically compensating for a broad spectrum of damaged or malfunctioning aircraft components and to reduce control law development cost and time. The approaches being examined include: (a) direct adaptive dynamic inverse controller and (b) an adaptive critic-based dynamic inverse controller. These approaches can utilize, but do not require, fault detection and isolation information. Piloted simulation studies are performed to examine if the intelligent flight control techniques adequately: 1) Match flying qualities of modern fly-by-wire flight controllers under nominal conditions; 2) Improve performance under failure conditions when sufficient control authority is available; and 3) Achieve consistent handling qualities across the flight envelope and for different aircraft configurations. Results obtained so far demonstrate the potential for improving handling qualities and significantly increasing survivability rates under various simulated failure conditions.
On the performance of metrics to predict quality in point cloud representations
NASA Astrophysics Data System (ADS)
Alexiou, Evangelos; Ebrahimi, Touradj
2017-09-01
Point clouds are a promising alternative for immersive representation of visual contents. Recently, an increased interest has been observed in the acquisition, processing and rendering of this modality. Although subjective and objective evaluations are critical in order to assess the visual quality of media content, they still remain open problems for point cloud representation. In this paper we focus our efforts on subjective quality assessment of point cloud geometry, subject to typical types of impairments such as noise corruption and compression-like distortions. In particular, we propose a subjective methodology that is closer to real-life scenarios of point cloud visualization. The performance of the state-of-the-art objective metrics is assessed by considering the subjective scores as the ground truth. Moreover, we investigate the impact of adopting different test methodologies by comparing them. Advantages and drawbacks of every approach are reported, based on statistical analysis. The results and conclusions of this work provide useful insights that could be considered in future experimentation.
Designing train-speed trajectory with energy efficiency and service quality
NASA Astrophysics Data System (ADS)
Jia, Jiannan; Yang, Kai; Yang, Lixing; Gao, Yuan; Li, Shukai
2018-05-01
With the development of automatic train operations, optimal trajectory design is significant to the performance of train operations in railway transportation systems. Considering energy efficiency and service quality, this article formulates a bi-objective train-speed trajectory optimization model to simultaneously minimize the energy consumption and travel time in an inter-station section. This article is distinct from previous studies in that more sophisticated train driving strategies, characterized by the acceleration/deceleration gear, the cruising speed, and the speed-shift site, are specifically considered. To obtain an optimal train-speed trajectory with an equal satisfaction degree for both objectives, a fuzzy linear programming approach is applied to reformulate the objectives. In addition, a genetic algorithm is developed to solve the proposed train-speed trajectory optimization problem. Finally, a series of numerical experiments based on a real-world instance of the Beijing-Tianjin Intercity Railway are implemented to illustrate the practicability of the proposed model as well as the effectiveness of the solution methodology.
Quality assessment of color images based on the measure of just noticeable color difference
NASA Astrophysics Data System (ADS)
Chou, Chun-Hsien; Hsu, Yun-Hsiang
2014-01-01
Accurate assessment of the quality of color images is an important step in many image processing systems that convey visual information about the reproduced images. An accurate objective image quality assessment (IQA) method is expected to give results that agree closely with subjective assessment. To assess the quality of color images, many approaches simply apply a metric for assessing the quality of grayscale images to each of the three color channels of the color image, neglecting the correlation among the channels. In this paper, a metric for assessing the quality of color images is proposed, in which the model of variable just-noticeable color difference (VJNCD) is employed to estimate the visibility thresholds of distortion inherent in each color pixel. With the estimated visibility thresholds of distortion, the proposed metric measures the average perceptible distortion in terms of the quantized distortion according to a perceptual error map similar to that defined by the National Bureau of Standards (NBS) for converting the color difference enumerated by CIEDE2000 into an objective score of perceptual quality. The perceptual error map in this case is designed for each pixel according to the visibility threshold estimated by the VJNCD model. The performance of the proposed metric is verified by assessing the test images in the LIVE database, and is compared with those of many well-known IQA metrics. Experimental results indicate that the proposed metric is an effective IQA method that can accurately predict the quality of color images, in terms of the correlation between objective scores and subjective evaluations.
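The core idea of counting only perceptible color errors can be sketched with scikit-image's CIEDE2000 implementation; the uniform JND threshold below is a simplification of the paper's per-pixel variable-JNCD model, and the images are synthetic:

```python
import numpy as np
from skimage.color import rgb2lab, deltaE_ciede2000

def perceptible_distortion(ref_rgb, dist_rgb, jnd=2.3):
    """Mean CIEDE2000 difference over pixels that exceed the JND."""
    lab_ref, lab_dist = rgb2lab(ref_rgb), rgb2lab(dist_rgb)
    de = deltaE_ciede2000(lab_ref, lab_dist)
    visible = de > jnd                     # mask of perceptible errors
    return de[visible].mean() if visible.any() else 0.0

rng = np.random.default_rng(0)
ref = rng.random((64, 64, 3))                               # float RGB in [0, 1]
dist = np.clip(ref + rng.normal(0, 0.03, ref.shape), 0, 1)  # noisy copy
print(perceptible_distortion(ref, dist))
```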
A hybrid solution approach for a multi-objective closed-loop logistics network under uncertainty
NASA Astrophysics Data System (ADS)
Mehrbod, Mehrdad; Tu, Nan; Miao, Lixin
2015-06-01
The design of closed-loop logistics (forward and reverse logistics) has attracted growing attention with the stringent pressures of customer expectations, environmental concerns and economic factors. This paper considers a multi-product, multi-period and multi-objective closed-loop logistics network model with regard to facility expansion as a facility location-allocation problem, which more closely approximates real-world conditions. A multi-objective mixed integer nonlinear programming formulation is linearized by defining new variables and adding new constraints to the model. By considering the aforementioned model under uncertainty, this paper develops a hybrid solution approach by combining an interactive fuzzy goal programming approach and robust counterpart optimization based on three well-known robust counterpart optimization formulations. Finally, this paper compares the results of the three formulations using different test scenarios and parameter-sensitive analysis in terms of the quality of the final solution, CPU time, the level of conservatism, the degree of closeness to the ideal solution, the degree of balance involved in developing a compromise solution, and satisfaction degree.
Keefer, Matthew W; Wilson, Sara E; Dankowicz, Harry; Loui, Michael C
2014-03-01
Recent research in ethics education shows a potentially problematic variation in content, curricular materials, and instruction. While ethics instruction is now widespread, studies have identified significant variation in both the goals and methods of ethics education, leaving researchers to conclude that many approaches may be inappropriately paired with goals that are unachievable. This paper speaks to these concerns by demonstrating the importance of aligning classroom-based assessments to clear ethical learning objectives in order to help students and instructors track their progress toward meeting those objectives. Two studies at two different universities demonstrate the usefulness of classroom-based, formative assessments for improving the quality of students' case responses in computational modeling and research ethics.
No-reference quality assessment based on visual perception
NASA Astrophysics Data System (ADS)
Li, Junshan; Yang, Yawei; Hu, Shuangyan; Zhang, Jiao
2014-11-01
The visual quality assessment of images/videos is an ongoing hot research topic, which has become more and more important for numerous image and video processing applications with the rapid development of digital imaging and communication technologies. The goal of image quality assessment (IQA) algorithms is to automatically assess the quality of images/videos in agreement with human quality judgments. Up to now, two kinds of models have been used for IQA, namely full-reference (FR) and no-reference (NR) models. In FR models, IQA algorithms interpret image quality as fidelity or similarity to a perfect image in some perceptual space. However, the reference image is not available in many practical applications, so an NR IQA approach is desired. Considering natural vision as optimized by millions of years of evolutionary pressure, many methods attempt to achieve consistency in quality prediction by modeling salient physiological and psychological features of the human visual system (HVS). To reach this goal, researchers try to simulate the HVS with image sparsity coding and supervised machine learning, which are two main features of the HVS. A typical HVS captures scenes by sparsity coding and uses experiential knowledge to perceive objects. In this paper, we propose a novel IQA approach based on visual perception. First, a standard model of the HVS is studied and analyzed, and the sparse representation of an image is computed with the model; then, the mapping correlation between sparse codes and subjective quality scores is trained with the regression technique of the least squares support vector machine (LS-SVM), which yields a regressor that can predict image quality; finally, the visual quality of an image is predicted with the trained regressor. We validate the performance of the proposed approach on the Laboratory for Image and Video Engineering (LIVE) database; the specific distortion types present in the database are: 227 images of JPEG2000, 233 images of JPEG, 174 images of White Noise, 174 images of Gaussian Blur, and 174 images of Fast Fading. The database includes a subjective differential mean opinion score (DMOS) for each image. The experimental results show that the proposed approach not only can assess the quality of images with many kinds of distortions, but also exhibits superior accuracy and monotonicity.
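A conceptual sketch of this sparse-coding-plus-regression pipeline, with scikit-learn's kernel ridge regression standing in for LS-SVM (both are kernel least-squares methods) and synthetic data in place of the LIVE images:

```python
import numpy as np
from sklearn.decomposition import MiniBatchDictionaryLearning
from sklearn.kernel_ridge import KernelRidge

rng = np.random.default_rng(0)
patches = rng.normal(size=(500, 64))        # 8x8 patches from training images
dico = MiniBatchDictionaryLearning(n_components=32, alpha=1.0,
                                   transform_algorithm="lasso_lars")
dico.fit(patches)

def features(image_patches):
    """Image-level feature: mean absolute sparse code over its patches."""
    codes = dico.transform(image_patches)
    return np.abs(codes).mean(axis=0)

# One feature vector per training image, paired with its subjective DMOS.
X = np.vstack([features(rng.normal(size=(50, 64))) for _ in range(40)])
dmos = rng.uniform(20, 80, size=40)
reg = KernelRidge(kernel="rbf", alpha=0.1, gamma=0.5).fit(X, dmos)
print(reg.predict(X[:3]))                   # predicted quality scores
```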
Stochastic control approaches for sensor management in search and exploitation
NASA Astrophysics Data System (ADS)
Hitchings, Darin Chester
Recent improvements in the capabilities of autonomous vehicles have motivated their increased use in such applications as defense, homeland security, environmental monitoring, and surveillance. To enhance performance in these applications, new algorithms are required to control teams of robots autonomously and through limited interactions with human operators. In this dissertation we develop new algorithms for control of robots performing information-seeking missions in unknown environments. These missions require robots to control their sensors in order to discover the presence of objects, keep track of the objects, and learn what these objects are, given a fixed sensing budget. Initially, we investigate control of multiple sensors, with a finite set of sensing options and finite-valued measurements, to locate and classify objects given a limited resource budget. The control problem is formulated as a Partially Observed Markov Decision Problem (POMDP), but its exact solution requires excessive computation. Under the assumption that sensor error statistics are independent and time-invariant, we develop a class of algorithms using Lagrangian Relaxation techniques to obtain optimal mixed strategies using performance bounds developed in previous research. We investigate alternative Receding Horizon (RH) controllers to convert the mixed strategies to feasible adaptive-sensing strategies and evaluate the relative performance of these controllers in simulation. The resulting controllers provide superior performance to alternative algorithms proposed in the literature and obtain solutions to large-scale POMDP problems several orders of magnitude faster than optimal Dynamic Programming (DP) approaches with comparable performance quality. We extend our results for finite action, finite measurement sensor control to scenarios with moving objects. We use Hidden Markov Models (HMMs) for the evolution of objects, according to the dynamics of a birth-death process. We develop a new lower bound on the performance of adaptive controllers in these scenarios, develop algorithms for computing solutions to this lower bound, and use these algorithms as part of an RH controller for sensor allocation in the presence of moving objects. We also consider an adaptive search problem where sensing actions are continuous and the underlying measurement space is also continuous. We extend our previous hierarchical decomposition approach based on performance bounds to this problem and develop novel implementations of Stochastic Dynamic Programming (SDP) techniques to solve this problem. Our algorithms are nearly two orders of magnitude faster than previously proposed approaches and yield solutions of comparable quality. For supervisory control, we discuss how human operators can work with and augment robotic teams performing these tasks. Our focus is on how tasks are partitioned among teams of robots and how a human operator can make intelligent decisions for task partitioning. We explore these questions through the design of a game that involves robot automata controlled by our algorithms and a human supervisor that partitions tasks based on different levels of support information. This game can be used with human subject experiments to explore the effect of information on the quality of supervisory control.
Assessment of Multiresolution Segmentation for Extracting Greenhouses from WORLDVIEW-2 Imagery
NASA Astrophysics Data System (ADS)
Aguilar, M. A.; Aguilar, F. J.; García Lorca, A.; Guirado, E.; Betlej, M.; Cichon, P.; Nemmaoui, A.; Vallario, A.; Parente, C.
2016-06-01
The latest breed of very high resolution (VHR) commercial satellites opens new possibilities for cartographic and remote sensing applications. In this context, the object-based image analysis (OBIA) approach has proved to be the best option when working with VHR satellite imagery. OBIA considers spectral, geometric, textural and topological attributes associated with meaningful image objects. Thus, the first step of OBIA, referred to as segmentation, is to delineate objects of interest. Determination of an optimal segmentation is crucial for good performance of the second stage of OBIA, the classification process. The main goal of this work is to assess the multiresolution segmentation algorithm provided by eCognition software for delineating greenhouses from WorldView-2 multispectral orthoimages. Specifically, the focus is on finding the optimal parameters of the multiresolution segmentation approach (i.e., Scale, Shape and Compactness) for plastic greenhouses. The optimum Scale parameter estimation was based on the idea of local variance of object heterogeneity within a scene (ESP2 tool). Moreover, different segmentation results were attained by using different combinations of Shape and Compactness values. Assessment of segmentation quality, based on the discrepancy between reference polygons and corresponding image segments, was carried out to identify the optimal setting of the multiresolution segmentation parameters. Three discrepancy indices were used: Potential Segmentation Error (PSE), Number-of-Segments Ratio (NSR) and Euclidean Distance 2 (ED2).
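For reference, the three discrepancy indices can be computed directly from the reference polygons and their corresponding segments. A minimal sketch using shapely, under the common definitions of PSE, NSR and ED2; the any-overlap correspondence rule is a simplifying assumption, not necessarily the exact criterion used in the paper.

```python
from math import hypot
from shapely.ops import unary_union

def discrepancy_indices(reference, segments):
    """reference, segments: lists of shapely Polygons (greenhouses vs. image segments)."""
    corresponding = [s for s in segments if any(s.intersects(r) for r in reference)]
    ref_union = unary_union(reference)
    ref_area = sum(r.area for r in reference)
    # Potential Segmentation Error: corresponding-segment area spilling outside references
    pse = sum(s.difference(ref_union).area for s in corresponding) / ref_area
    # Number-of-Segments Ratio: mismatch between reference and corresponding segment counts
    nsr = abs(len(reference) - len(corresponding)) / len(reference)
    return pse, nsr, hypot(pse, nsr)     # ED2 combines both as a Euclidean distance
```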
Chen, Zhihuan; Yuan, Yanbin; Yuan, Xiaohui; Huang, Yuehua; Li, Xianshan; Li, Wenwu
2015-05-01
A hydraulic turbine regulating system (HTRS) is one of the most important components of a hydropower plant, playing a key role in maintaining the safe, stable and economical operation of hydro-electrical installations. At present, the conventional PID controller is widely applied in HTRS systems for its practicality and robustness, and the primary problem with this control law is how to optimally tune the parameters, i.e., how to determine the PID controller gains for satisfactory performance. In this paper, a multi-objective evolutionary algorithm, adaptive grid particle swarm optimization (AGPSO), is applied to the PID gain-tuning problem of the HTRS system. Unlike traditional single-objective optimization, the AGPSO method addresses settling time and overshoot level simultaneously, generating a set of non-inferior alternative solutions (i.e., a Pareto set). Furthermore, a fuzzy-based membership value assignment method is employed to choose the best compromise solution from the obtained Pareto set. An illustrative example of parameter tuning for the nonlinear HTRS system is introduced to verify the feasibility and effectiveness of the proposed AGPSO-based optimization approach, compared with two other prominent multi-objective algorithms, the Non-dominated Sorting Genetic Algorithm II (NSGA-II) and the Strength Pareto Evolutionary Algorithm II (SPEA-II), in terms of the quality and diversity of the obtained Pareto sets. Simulation results show that the AGPSO approach outperforms the compared methods in both efficiency and solution quality, whether the HTRS system operates under unload or load conditions.
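The fuzzy membership step the abstract mentions has a standard linear formulation, sketched below under the assumption that both objectives (settling time, overshoot) are minimized; the paper's exact variant may differ, and the numbers are illustrative.

```python
import numpy as np

def best_compromise(pareto_F):
    """Fuzzy-membership choice of the best compromise solution on a Pareto front.
    pareto_F: (n, m) array of m minimization objectives for n non-dominated solutions."""
    fmin, fmax = pareto_F.min(0), pareto_F.max(0)
    span = np.where(fmax > fmin, fmax - fmin, 1.0)
    mu = (fmax - pareto_F) / span          # per-objective satisfaction in [0, 1]
    score = mu.sum(1) / mu.sum()           # normalized membership of each solution
    return int(np.argmax(score))

# Three candidate PID tunings: (settling time [s], overshoot [%]) -- illustrative values.
front = np.array([[2.0, 12.0], [3.5, 6.0], [5.0, 5.5]])
print(best_compromise(front))              # -> 1, the balanced middle solution
```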
Multi Objective Optimization of Yarn Quality and Fibre Quality Using Evolutionary Algorithm
NASA Astrophysics Data System (ADS)
Ghosh, Anindya; Das, Subhasis; Banerjee, Debamalya
2013-03-01
The quality and cost of the resulting yarn play a significant role in determining its end application. The challenging task of any spinner lies in producing a good quality yarn with an added cost benefit. The present work performs a multi-objective optimization on two objectives: maximization of cotton yarn strength and minimization of raw material quality. The first objective function is formulated from the artificial neural network input-output relation between cotton fibre properties and yarn strength. The second objective function is formulated with the well-known regression equation of the spinning consistency index. These two objectives are conflicting in nature: no single combination of cotton fibre parameters exists that produces maximum yarn strength and minimum cotton fibre quality simultaneously. The problem therefore has several optimal solutions, from which a trade-off must be selected depending on the user's requirements. In this work, the optimal solutions are obtained with an elitist multi-objective evolutionary algorithm, the Non-dominated Sorting Genetic Algorithm II (NSGA-II). These optimum solutions may lead to the efficient exploitation of raw materials to produce better quality yarns at low cost.
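At the heart of NSGA-II is a dominance test and non-dominated sorting. A minimal sketch of that core step; mapping the yarn problem onto pure minimization by negating the strength objective is a common convention assumed here, and the numbers are illustrative.

```python
import numpy as np

def dominates(a, b):
    """True when a is no worse than b in every objective and strictly better in one
    (all objectives expressed as minimization)."""
    return bool(np.all(a <= b) and np.any(a < b))

def first_front(F):
    """Indices of the non-dominated solutions -- the first front in NSGA-II's sorting."""
    return [i for i, fi in enumerate(F)
            if not any(dominates(fj, fi) for j, fj in enumerate(F) if j != i)]

# Objectives per candidate fibre mix: (-yarn strength, spinning consistency index).
F = np.array([[-15.2, 120.0], [-14.8, 110.0], [-15.0, 130.0]])
print(first_front(F))   # -> [0, 1]; the third point is dominated by the first
```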
An ontology-based telemedicine tasks management system architecture.
Nageba, Ebrahim; Fayn, Jocelyne; Rubel, Paul
2008-01-01
The recent developments in ambient intelligence and ubiquitous computing offer new opportunities for the design of advanced telemedicine systems providing high-quality services, anywhere, anytime. In this paper we present an approach for building an ontology-based, task-driven telemedicine system. The architecture is composed of a task management server, a communication server and a knowledge base, enabling decision making that takes account of different telemedical concepts such as actors, resources, services and the Electronic Health Record. The final objective is to provide intelligent management of the different types of available human, material and communication resources.
Engels, Yvonne; van den Hombergh, Pieter; Mokkink, Henk; van den Hoogen, Henk; van den Bosch, Wil; Grol, Richard
2006-01-01
Aim: To study the effects of a team-based model for continuous quality improvement (CQI) on primary care practice management. Design of study: Randomised controlled trial. Setting: Twenty-six intervention and 23 control primary care practices in the Netherlands. Method: Practices interested in taking part in the CQI project were, after assessment of their practice organisation, randomly assigned to the intervention or control groups. During a total of five meetings, a facilitator helped the teams in the intervention group select suitable topics for quality improvement and follow a structured approach to achieve improvement objectives. Checklists completed by an outreach visitor and questionnaires for the GPs, staff and patients were used to assemble data on the number and quality of improvement activities undertaken and on practice management prior to the start of the intervention and 1 year later. Results: Pre-test and post-test data were compared for the 26 intervention and 23 control practices. A significant intervention effect was found for the number of improvement objectives actually defined (93 versus 54, P<0.001) and successfully completed (80 versus 69% of the projects, P<0.001). The intervention group also improved on more aspects of practice management, as measured by our practice visit method, than the control group, but none of these differences proved statistically significant. Conclusion: The intervention exerted a significant effect on the number and quality of improvement projects undertaken and self-defined objectives met. The failure of the intervention's effects on the other dimensions of practice management to reach significance may be due to the topics selected for some of the improvement projects being only partly covered by the assessment instrument. PMID:17007709
ERIC Educational Resources Information Center
Rakap, Salih
2015-01-01
Individualised education programmes (IEPs) are the road maps for individualising services for children with disabilities, specifically through the development of high-quality child goals/objectives. High-quality IEP goals/objectives that are developed based on a comprehensive assessment of child functioning and directly connected to intervention…
NASA Astrophysics Data System (ADS)
Hussein, I.; Wilkins, M.; Roscoe, C.; Faber, W.; Chakravorty, S.; Schumacher, P.
2016-09-01
Finite Set Statistics (FISST) is a rigorous Bayesian multi-hypothesis management tool for the joint detection, classification and tracking of multi-sensor, multi-object systems. Implicit within the approach are solutions to the data association and target label-tracking problems. The full FISST filtering equations, however, are intractable. While FISST-based methods such as the PHD and CPHD filters are tractable, they require heavy moment approximations to the full FISST equations that result in a significant loss of the information contained in the collected data. In this paper, we review Smart Sampling Markov Chain Monte Carlo (SSMCMC), which makes FISST tractable while avoiding moment approximations. We study the effect of tuning key SSMCMC parameters on tracking quality and computation time. The study is performed on a representative space object catalog with varying numbers of resident space objects (RSOs). The solution is implemented in the Scala computing language at the Maui High Performance Computing Center (MHPCC) facility.
NASA Astrophysics Data System (ADS)
Belkacemi, Mohamed; Stolz, Christophe; Mathieu, Alexandre; Lemaitre, Guillaume; Massich, Joan; Aubreton, Olivier
2015-11-01
Today, industries ensure the quality of their manufactured products through computer vision techniques and nonconventional imaging. Three-dimensional (3-D) scanners and nondestructive testing (NDT) systems are commonly used independently for such applications. Combined, these approaches constitute hybrid systems providing both 3-D reconstruction and NDT analysis. Such systems, however, suffer from drawbacks such as errors during data fusion and higher costs for manufacturers. In an attempt to solve these problems, a single active thermography system based on scanning-from-heating is proposed in this paper. In addition to 3-D digitization of the object, our contributions are twofold: (1) detection of non-through defects in a homogeneous metallic object and (2) assessment of fiber orientation in a long-fiber composite material. Experiments on steel and aluminum plates show that our method achieves the detection of non-through defects. Additionally, the estimation of fiber orientation is evaluated on a carbon-fiber composite material.
Shortell, S M; O'Brien, J L; Carman, J M; Foster, R W; Hughes, E F; Boerstler, H; O'Connor, E J
1995-06-01
This study examines the relationships among organizational culture, quality improvement processes and selected outcomes for a sample of up to 61 U.S. hospitals. Primary data were collected from 61 U.S. hospitals (located primarily in the midwest and the west) on measures related to continuous quality improvement/total quality management (CQI/TQM), organizational culture, implementation approaches, and degree of quality improvement implementation based on the Baldrige Award criteria. These data were combined with independently collected data on perceived impact and objective measures of clinical efficiency (i.e., charges and length of stay) for six clinical conditions. The study involved cross-sectional examination of the named relationships. Reliable and valid scales for the organizational culture and quality improvement implementation measures were developed based on responses from over 7,000 individuals across the 61 hospitals, with an overall completion rate of 72 percent. Independent data on perceived impact were collected from a national survey, and independent data on clinical efficiency from a companion study of managed care. A participative, flexible, risk-taking organizational culture was significantly related to quality improvement implementation. Quality improvement implementation, in turn, was positively associated with greater perceived patient outcomes and human resource development. Larger-size hospitals experienced lower clinical efficiency with regard to higher charges and higher length of stay, due in part to having more bureaucratic and hierarchical cultures that serve as a barrier to quality improvement implementation. What really matters is whether or not a hospital has a culture that supports quality improvement work and an approach that encourages flexible implementation. Larger-size hospitals face more difficult challenges in this regard.
ERIC Educational Resources Information Center
Hou, Angela Yung-Chi; Ince, Martin; Tsai, Sandy; Chiang, Chung Lin
2015-01-01
As quality guardians of higher education, quality assurance agencies are required to guarantee the credibility of the review process and to ensure the objectivity and transparency of their decisions and recommendations. These agencies are therefore expected to use a range of internal and external approaches to prove the quality of their review…
NASA Astrophysics Data System (ADS)
Ţîţu, M. A.; Pop, A. B.; Ţîţu, Ș
2017-06-01
This paper presents a study that uses the Taguchi Method to model and optimize certain variables in the process of pressing tappets into anchors, a process conducted in an organization that promotes knowledge-based management. The paper promotes practical concepts of the Taguchi Method and describes the way in which the objective functions are obtained and used during the modelling and optimization of the process of pressing tappets into the anchors.
Stevenson, Katherine; Busch, Angela; Scott, Darlene J.; Henry, Carol; Wall, Patricia A.
2009-01-01
Objectives: To develop and evaluate a classroom-based curriculum designed to promote interprofessional competencies by having undergraduate students from various health professions work together on system-based problems, using quality improvement (QI) methods and tools to improve patient-centered care. Design: Students from 4 health care programs (nursing, nutrition, pharmacy, and physical therapy) participated in an interprofessional QI activity. In groups of 6 or 7, students completed pre-intervention and post-intervention reflection tools on attitudes relating to interprofessional teams, and a tool designed to evaluate group process. Assessment: One hundred thirty-four students (76.6%) completed both self-reflection instruments, and 132 (74.2%) completed the post-course group evaluation instrument. Although already high prior to the activity, students' mean post-intervention reflection scores increased for 12 of 16 items. Post-intervention group evaluation scores reflected a high level of satisfaction with the experience. Conclusion: Use of a quality-based case study and QI methodology was an effective approach to enhancing interprofessional experiences among students. PMID:19657497
Blind image quality assessment based on aesthetic and statistical quality-aware features
NASA Astrophysics Data System (ADS)
Jenadeleh, Mohsen; Masaeli, Mohammad Masood; Moghaddam, Mohsen Ebrahimi
2017-07-01
The main goal of image quality assessment (IQA) methods is the emulation of human perceptual image quality judgments. Therefore, the correlation of these methods' objective scores with human perceptual scores is considered their performance metric. Human judgment of image quality implicitly includes many factors, such as aesthetics, semantics, context, and various types of visual distortion. The main idea of this paper is to use a host of features commonly employed in image aesthetics assessment in order to improve the accuracy of blind image quality assessment (BIQA) methods. We propose an approach that enriches the features of BIQA methods by integrating a host of aesthetics image features with features of natural image statistics derived from multiple domains. The proposed features have been used to augment five different state-of-the-art BIQA methods that use natural scene statistics features. Experiments were performed on seven benchmark image quality databases. The experimental results showed significant improvement in the accuracy of the methods.
NASA Astrophysics Data System (ADS)
Qiao, T.; Ren, J.; Craigie, C.; Zabalza, J.; Maltin, Ch.; Marshall, S.
2015-03-01
It is well known that the eating quality of beef has a significant influence on the repurchase behavior of consumers. There are several key factors that affect the perception of quality, including color, tenderness, juiciness, and flavor. To support consumer repurchase choices, there is a need for an objective measurement of quality that could be applied to meat prior to its sale. Objective approaches such as offered by spectral technologies may be useful, but the analytical algorithms used remain to be optimized. For visible and near infrared (VISNIR) spectroscopy, Partial Least Squares Regression (PLSR) is a widely used technique for meat related quality modeling and prediction. In this paper, a Support Vector Machine (SVM) based machine learning approach is presented to predict beef eating quality traits. Although SVM has been successfully used in various disciplines, it has not been applied extensively to the analysis of meat quality parameters. To this end, the performance of PLSR and SVM as tools for the analysis of meat tenderness is evaluated, using a large dataset acquired under industrial conditions. The spectral dataset was collected using VISNIR spectroscopy with the wavelength ranging from 350 to 1800 nm on 234 beef M. longissimus thoracis steaks from heifers, steers, and young bulls. As the dimensionality with the VISNIR data is very high (over 1600 spectral bands), the Principal Component Analysis (PCA) technique was applied for feature extraction and data reduction. The extracted principal components (less than 100) were then used for data modeling and prediction. The prediction results showed that SVM has a greater potential to predict beef eating quality than PLSR, especially for the prediction of tenderness. The influence of animal gender on beef quality prediction was also investigated, and it was found that beef quality traits were predicted most accurately in beef from young bulls.
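The modeling pipeline described above maps naturally onto standard tooling. A hedged sketch with scikit-learn on synthetic stand-in data (real inputs would be the 234 spectra and measured tenderness scores), comparing PLSR against PCA followed by an RBF-kernel SVM:

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.decomposition import PCA
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVR

rng = np.random.default_rng(0)
X = rng.normal(size=(234, 1600))    # 234 steaks x ~1600 spectral bands (placeholder)
y = X[:, :5].sum(axis=1) + rng.normal(scale=0.5, size=234)   # synthetic quality trait

models = {
    "PLSR": PLSRegression(n_components=20),
    "PCA+SVM": make_pipeline(StandardScaler(), PCA(n_components=50),
                             SVR(kernel="rbf", C=10.0)),
}
for name, model in models.items():
    r2 = cross_val_score(model, X, y, cv=5, scoring="r2").mean()
    print(f"{name}: mean cross-validated R^2 = {r2:.2f}")
```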
Xie, Bin; da Silva, Orlando; Zaric, Greg
2012-01-01
OBJECTIVE: To evaluate the incremental cost-effectiveness of a system-based approach for the management of neonatal jaundice and the prevention of kernicterus in term and late-preterm (≥35 weeks) infants, compared with the traditional practice based on visual inspection and selected bilirubin testing. STUDY DESIGN: Two hypothetical cohorts of 150,000 term and late-preterm neonates were used to compare the costs and outcomes associated with the use of a system-based or traditional practice approach. Data for the evaluation were obtained from the case costing centre at a large teaching hospital in Ontario, supplemented by data from the literature. RESULTS: The per child cost for the system-based approach cohort was $176, compared with $173 in the traditional practice cohort. The higher cost associated with the system-based cohort reflects increased costs for predischarge screening and treatment and increased postdischarge follow-up visits. These costs are partially offset by reduced costs from fewer emergency room visits, hospital readmissions and kernicterus cases. Compared with the traditional approach, the cost to prevent one kernicterus case using the system-based approach was $570,496, the cost per life year gained was $26,279, and the cost per quality-adjusted life year gained was $65,698. CONCLUSION: The cost to prevent one kernicterus case using the system-based approach is much lower than previously reported in the literature. PMID:23277747
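The headline ratios follow from simple arithmetic on the abstract's figures. A short sketch; the effect denominators are back-calculated from the reported ratios for illustration, not taken from the study data:

```python
COHORT = 150_000
incremental_cost = COHORT * (176 - 173)   # $450,000 more for the system-based cohort
print(incremental_cost)

# Implied effect denominators (illustrative back-calculation from the reported ratios):
print(incremental_cost / 570_496)   # ~0.79 kernicterus cases prevented
print(incremental_cost / 65_698)    # ~6.8 quality-adjusted life years gained
```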
Schwarzkopf, Daniel; Rüddel, Hendrik; Gründling, Matthias; Putensen, Christian; Reinhart, Konrad
2018-01-18
While sepsis-related mortality has decreased substantially in other developed countries, mortality of severe sepsis remained as high as 44% in Germany. A recent German cluster randomized trial was not able to improve guideline adherence and decrease sepsis-related mortality within the participating hospitals, partly owing to a lack of support from hospital management and a lack of resources for the documentation of prospective data. Thus, more pragmatic approaches are needed to improve the quality of sepsis care in Germany. The primary objective of the study is to decrease sepsis-related hospital mortality within a quality collaborative relying on claims data. The German Quality Network Sepsis (GQNS) is a quality collaborative involving 75 hospitals. This study protocol describes the conduct and evaluation of the start-up period of the GQNS, running from March 2016 to August 2018. Democratic structures assure participatory action, a study coordination bureau provides central support and resources, and local interdisciplinary quality improvement teams implement changes within the participating hospitals. Quarterly quality reports focusing on risk-adjusted hospital mortality in cases with sepsis, based on claims data, are provided. Hospitals committed to publishing their individual risk-adjusted mortality compared to the German average. A complex risk model is used to control for differences in patient-related risk factors. Hospitals are encouraged to implement a bundle of interventions, e.g., interdisciplinary case analyses, external peer reviews, hospital-wide staff education, and implementation of rapid response teams. The effectiveness of the GQNS is evaluated in a quasi-experimental difference-in-differences design by comparing the change in hospital mortality of cases with sepsis with organ dysfunction between a retrospective baseline period (January 2014 to December 2015) and the intervention period (April 2016 to March 2018), between the participating hospitals and all other German hospitals. Structural and process quality indicators of sepsis care, as well as efforts for quality improvement, are monitored regularly. The GQNS is a large-scale quality collaborative using a pragmatic approach based on claims data. A complex risk-adjustment model allows valid quality comparisons between hospitals and with the German average. If this study finds the approach to be useful for improving the quality of sepsis care, it may also be applied to other diseases. ClinicalTrials.gov NCT02820675.
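The difference-in-differences contrast used in the evaluation is a one-liner once the four mortality figures are in hand. A sketch with hypothetical rates, for illustration only:

```python
def did_estimate(pre_treat, post_treat, pre_ctrl, post_ctrl):
    """Difference-in-differences: change in the GQNS hospitals minus change elsewhere."""
    return (post_treat - pre_treat) - (post_ctrl - pre_ctrl)

# Hypothetical risk-adjusted sepsis mortality rates (fractions), not study results:
print(round(did_estimate(0.44, 0.40, 0.44, 0.43), 3))   # -> -0.03 attributable to the GQNS
```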
A framework for automatic information quality ranking of diabetes websites.
Belen Sağlam, Rahime; Taskaya Temizel, Tugba
2015-01-01
Objective: When searching for particular medical information on the internet, the challenge lies in distinguishing the websites that are relevant to the topic and contain accurate information. In this article, we propose a framework that automatically identifies and ranks diabetes websites according to their relevance and information quality, based on website content. Design: The proposed framework ranks diabetes websites according to their content quality, relevance and evidence-based medicine. The framework combines information retrieval techniques with a lexical resource based on SentiWordNet, making it possible to work with biased and untrusted websites while, at the same time, ensuring content relevance. Measurement: The evaluation measurements used were Pearson correlation, true positives, false positives and accuracy. We tested the framework with a benchmark data set consisting of 55 websites with varying degrees of information quality problems. Results: The proposed framework gives good results, comparable with the non-automated information quality measurement approaches in the literature. The correlation between the results of the proposed automated framework and the ground truth is 0.68 on average (p < 0.001), which is greater than that of the other automated methods proposed in the literature (average r score 0.33).
Solbrig, Harold R; Chute, Christopher G
2012-01-01
Objective: The objective of this study is to develop an approach to evaluate the quality of terminological annotations on the value set (i.e., enumerated value domain) components of the common data elements (CDEs) in the context of clinical research, using both Unified Medical Language System (UMLS) semantic types and groups. Materials and methods: The CDEs of the National Cancer Institute (NCI) Cancer Data Standards Repository, the NCI Thesaurus (NCIt) concepts and the UMLS semantic network were integrated using a semantic web-based framework for a SPARQL-enabled evaluation. First, the set of CDE-permissible values with corresponding meanings in external controlled terminologies were isolated. The corresponding value meanings were then evaluated against their NCI- or UMLS-generated semantic network mapping to determine whether all of the meanings fell within the same semantic group. Results: Of the enumerated CDEs in the Cancer Data Standards Repository, 3093 (26.2%) had elements drawn from more than one UMLS semantic group. A random sample (n=100) of this set of elements indicated that 17% of them were likely to have been misclassified. Discussion: The use of existing semantic web tools can support a high-throughput mechanism for evaluating the quality of large CDE collections. This study demonstrates that the involvement of multiple semantic groups in an enumerated value domain of a CDE is an effective anchor to trigger an auditing point for quality evaluation activities. Conclusion: This approach produces a useful quality assurance mechanism for a clinical study CDE repository. PMID:22511016
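The audit rule itself reduces to a set-cardinality check once each permissible-value meaning has been mapped to its UMLS semantic group. A minimal sketch with hypothetical mappings (the actual study ran this over SPARQL against the integrated framework):

```python
def mixed_groups(value_meanings, semantic_group):
    """True when a CDE's permissible-value meanings span more than one UMLS semantic
    group, which triggers a manual quality review."""
    groups = {semantic_group[v] for v in value_meanings if v in semantic_group}
    return len(groups) > 1

cdes = {"Tumor Site": ["Lung", "Liver", "Unknown"]}             # hypothetical CDE
semantic_group = {"Lung": "Anatomy", "Liver": "Anatomy",
                  "Unknown": "Concepts & Ideas"}                # hypothetical mapping
print([name for name, vals in cdes.items() if mixed_groups(vals, semantic_group)])
```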
50 CFR 84.22 - What needs to be included in grant proposals?
Code of Federal Regulations, 2010 CFR
2010-10-01
... need within the purposes of the Act; (2) Discrete, quantifiable, and verifiable objective(s) to be... waters, the hydrology, water quality, or fish and wildlife dependent on the wetlands; (4) The approach to... the proposal, and a map of the project site; (6) Estimated costs to attain the objective(s) (the...
50 CFR 84.22 - What needs to be included in grant proposals?
Code of Federal Regulations, 2011 CFR
2011-10-01
... need within the purposes of the Act; (2) Discrete, quantifiable, and verifiable objective(s) to be... waters, the hydrology, water quality, or fish and wildlife dependent on the wetlands; (4) The approach to... the proposal, and a map of the project site; (6) Estimated costs to attain the objective(s) (the...
NASA Astrophysics Data System (ADS)
Kamal, Muhammad; Johansen, Kasper
2017-10-01
Effective mangrove management requires spatially explicit information on mangrove tree crowns as a basis for ecosystem diversity studies and health assessment. Accuracy assessment is an integral part of any mapping activity, measuring the effectiveness of the classification approach. In geographic object-based image analysis (GEOBIA), assessment of the geometric accuracy (shape, symmetry and location) of the image objects created by image segmentation is required. In this study we used an explicit area-based accuracy assessment to measure the degree of similarity between the classification results and reference data from different aspects, including overall quality (OQ), user's accuracy (UA), producer's accuracy (PA) and overall accuracy (OA). We developed a rule set to delineate the mangrove tree crowns using a WorldView-2 pan-sharpened image. The reference map was obtained by visual delineation of the mangrove tree crown boundaries from a very high spatial resolution aerial photograph (7.5 cm pixel size). Ten random points, each with a 10 m radius circular buffer, were created to calculate the area-based accuracy assessment. The resulting circular polygons were used to clip both the classified image objects and the reference map for area comparisons. In this case, the area-based accuracy assessment yielded 64% and 68% for OQ and OA, respectively. The overall quality reflects the class-related area accuracy: the area correctly classified as tree crowns was 64% of the total tree crown area. The overall accuracy of 68%, in turn, was calculated as the percentage of all correctly classified classes (tree crowns and canopy gaps) relative to the total class area (the entire image). Overall, the area-based accuracy assessment was simple to implement and easy to interpret. It also shows explicitly, with colour-coded polygons, how omission and commission errors vary in object boundary delineation.
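Given the clipped areas inside each assessment circle, the four measures reduce to ratios of overlap areas. A hedged sketch, assuming the areas (in pixels or square metres) have already been extracted:

```python
def area_accuracy(tp, fp, fn):
    """tp: area that is tree crown in both maps; fp: classified-only; fn: reference-only."""
    ua = tp / (tp + fp)          # user's accuracy
    pa = tp / (tp + fn)          # producer's accuracy
    oq = tp / (tp + fp + fn)     # overall quality (area intersection over union)
    return ua, pa, oq

def overall_accuracy(correct_area, total_area):
    """Correctly classified area across all classes (crowns and gaps) over the image."""
    return correct_area / total_area
```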
Quality assurance and accreditation.
1997-01-01
In 1996, the Joint Commission International (JCI), which is a partnership between the Joint Commission on Accreditation of Healthcare Organizations and Quality Healthcare Resources, Inc., became one of the contractors of the Quality Assurance Project (QAP). JCI recognizes the link between accreditation and quality, and uses a collaborative approach to help a country develop national quality standards that will improve patient care, satisfy patient-centered objectives, and serve the interest of all affected parties. The implementation of good standards provides support for the good performance of professionals, introduces new ideas for improvement, enhances the quality of patient care, reduces costs, increases efficiency, strengthens public confidence, improves management, and enhances the involvement of the medical staff. Such good standards are objective and measurable; achievable with current resources; adaptable to different institutions and cultures; and demonstrate autonomy, flexibility, and creativity. The QAP offers the opportunity to approach accreditation through research efforts, training programs, and regulatory processes. QAP work in the area of accreditation has been targeted for Zambia, where the goal is to provide equal access to cost-effective, quality health care; Jordan, where a consensus process for the development of standards, guidelines, and policies has been initiated; and Ecuador, where JCI has been asked to help plan an approach to the evaluation and monitoring of the health care delivery system.
Wilfley, Denise E.; Staiano, Amanda E.; Altman, Myra; Lindros, Jeanne; Lima, Angela; Hassink, Sandra G.; Dietz, William H.; Cook, Stephen
2017-01-01
Objectives: To improve systems of care to advance implementation of the U.S. Preventive Services Task Force recommendations for childhood obesity treatment (i.e., clinicians offer/refer children with obesity to intensive, multicomponent behavioral interventions of >25 hours over 6–12 months to improve weight status) and to expand payment for these services. Methods: In July 2015, forty-three cross-sector stakeholders attended a conference supported by the Agency for Healthcare Research and Quality, the American Academy of Pediatrics Institute for Healthy Childhood Weight, and The Obesity Society. Plenary sessions presenting scientific evidence and clinical and payment practices were interspersed with breakout sessions to identify consensus recommendations. Results: Consensus recommendations for childhood obesity treatment included: family-based multicomponent behavioral therapy; an integrated care model; and a multi-disciplinary care team. The use of evidence-based protocols, a well-trained healthcare team, medical oversight, and treatment at or above the minimum dose (e.g., >25 hours) are critical components to ensure effective delivery of high-quality care and to achieve clinically meaningful weight loss. Approaches to secure reimbursement for evidence-based obesity treatment within payment models were recommended. Conclusion: Continued cross-sector collaboration is crucial to ensure a unified approach to increase payment and access for childhood obesity treatment and to scale up training to ensure quality of care. PMID:27925451
On local search for bi-objective knapsack problems.
Liefooghe, Arnaud; Paquete, Luís; Figueira, José Rui
2013-01-01
In this article, a local search approach is proposed for three variants of the bi-objective binary knapsack problem, with the aim of maximizing the total profit and minimizing the total weight. First, an experimental study of a structural property, the connectedness of the efficient set, is conducted. Based on this property, a local search algorithm is proposed and its performance is compared to exact algorithms in terms of runtime and quality metrics. The experimental results indicate that this simple local search algorithm is able to find a representative set of optimal solutions in most cases, and in much less time than exact algorithms.
The Italian Dementia National Plan. Commentary.
Di Fiandra, Teresa; Canevelli, Marco; Di Pucchio, Alessandra; Vanacore, Nicola
2015-01-01
The Italian Dementia National Plan was formulated in October 2014 by the Italian Ministry of Health in close cooperation with the regions, the National Institute of Health and the three major national associations of patients and carers. The main purpose of this strategy was to provide directive indications for promoting and improving interventions in the dementia field, not limited to specialist and therapeutic actions, but particularly focusing on the support of patients and families throughout the pathways of care. Four main objectives are indicated: 1) promote health- and social-care interventions and policies; 2) create/strengthen the integrated network of services for dementia based on an integrated approach; 3) implement strategies for promoting appropriateness and quality of care; and 4) improve the quality of life of persons with dementia and their families by supporting empowerment and stigma reduction. These objectives and the pertaining actions are described in the present paper.
NASA Astrophysics Data System (ADS)
Asadzadeh, M.; Maclean, A.; Tolson, B. A.; Burn, D. H.
2009-05-01
Hydrologic model calibration aims to find a set of parameters that adequately simulates observations of watershed behavior, such as streamflow, or a state variable, such as snow water equivalent (SWE). There are different metrics for evaluating calibration effectiveness that involve quantifying prediction errors, such as the Nash-Sutcliffe (NS) coefficient and bias evaluated for the entire calibration period, on a seasonal basis, for low flows, or for high flows. Many of these metrics are conflicting such that the set of parameters that maximizes the high flow NS differs from the set of parameters that maximizes the low flow NS. Conflicting objectives are very likely when different calibration objectives are based on different fluxes and/or state variables (e.g., NS based on streamflow versus SWE). One of the most popular ways to balance different metrics is to aggregate them based on their importance and find the set of parameters that optimizes a weighted sum of the efficiency metrics. Comparing alternative hydrologic models (e.g., assessing model improvement when a process or more detail is added to the model) based on the aggregated objective might be misleading since it represents one point on the tradeoff of desired error metrics. To derive a more comprehensive model comparison, we solved a bi-objective calibration problem to estimate the tradeoff between two error metrics for each model. Although this approach is computationally more expensive than the aggregation approach, it results in a better understanding of the effectiveness of selected models at each level of every error metric and therefore provides a better rationale for judging relative model quality. The two alternative models used in this study are two MESH hydrologic models (version 1.2) of the Wolf Creek Research basin that differ in their watershed spatial discretization (a single Grouped Response Unit, GRU, versus multiple GRUs). The MESH model, currently under development by Environment Canada, is a coupled land-surface and hydrologic model. Results will demonstrate the conclusions a modeller might make regarding the value of additional watershed spatial discretization under both an aggregated (single-objective) and multi-objective model comparison framework.
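For concreteness, the Nash-Sutcliffe coefficient at the center of these calibration objectives is a one-line computation; a bi-objective comparison then keeps the (discharge NS, SWE NS) pair rather than collapsing it to a weighted sum. A minimal sketch:

```python
import numpy as np

def nse(obs, sim):
    """Nash-Sutcliffe efficiency: 1 is perfect; 0 means no better than the mean of obs."""
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

# A model's position in objective space: (NS on streamflow, NS on snow water equivalent).
# A rival model is preferable only if it is no worse on both coordinates and better on one.
```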
Enhancing the quality and credibility of qualitative analysis.
Patton, M Q
1999-12-01
Varying philosophical and theoretical orientations to qualitative inquiry remind us that issues of quality and credibility intersect with audience and intended research purposes. This overview examines ways of enhancing the quality and credibility of qualitative analysis by dealing with three distinct but related inquiry concerns: rigorous techniques and methods for gathering and analyzing qualitative data, including attention to validity, reliability, and triangulation; the credibility, competence, and perceived trustworthiness of the qualitative researcher; and the philosophical beliefs of evaluation users about such paradigm-based preferences as objectivity versus subjectivity, truth versus perspective, and generalizations versus extrapolations. Although this overview examines some general approaches to issues of credibility and data quality in qualitative analysis, it is important to acknowledge that particular philosophical underpinnings, specific paradigms, and special purposes for qualitative inquiry will typically include additional or substitute criteria for assuring and judging quality, validity, and credibility. Moreover, the context for these considerations has evolved. In early literature on evaluation methods the debate between qualitative and quantitative methodologists was often strident. In recent years the debate has softened. A consensus has gradually emerged that the important challenge is to match appropriately the methods to empirical questions and issues, and not to universally advocate any single methodological approach for all problems.
NASA Astrophysics Data System (ADS)
Ghaffarian, S.; Ghaffarian, S.
2014-08-01
This paper presents a novel approach to building detection that automates the training-area collection stage of supervised classification. The method is based on the fact that a 3D building structure should cast a shadow under suitable imaging conditions. Therefore, the methodology begins with detecting and masking out the shadow areas, using the luminance component of the LAB color space, which indicates the lightness of the image, and a novel double-thresholding technique. Next, the training areas for supervised classification are selected by automatically determining a buffer zone on each building whose shadow is detected, using the shadow shape and the sun illumination direction. Thereafter, statistics calculated over each buffer zone collected from the building areas are used to execute an improved parallelepiped supervised classification to detect the buildings. Standard-deviation thresholding is applied to the parallelepiped classification method to improve its accuracy. Finally, simple morphological operations are conducted to remove noise and increase the accuracy of the results. The experiments were performed on a set of high-resolution Google Earth images. The performance of the proposed approach was assessed by comparing its results with reference data using well-known quality measurements (precision, recall and F1-score) to evaluate pixel-based and object-based performance. Evaluation of the results shows that buildings detected from dense and suburban districts with diverse characteristics and color combinations using our proposed method achieve overall pixel-based and object-based precision of 88.4% and 85.3%, respectively.
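The shadow-masking step can be approximated with a hysteresis-style double threshold on the LAB lightness channel; a hedged sketch (the file name and cut-offs are illustrative stand-ins, not the paper's values), together with the F1 measure used in the evaluation:

```python
import numpy as np
from skimage import color, filters, io

img = io.imread("scene.png")[:, :, :3]            # hypothetical input image
darkness = 100.0 - color.rgb2lab(img)[:, :, 0]    # invert lightness so shadows score high
# Double thresholding via hysteresis: keep moderately dark pixels only where they
# connect to very dark seed pixels.
shadow_mask = filters.apply_hysteresis_threshold(darkness, low=60.0, high=75.0)

def f1_score(precision, recall):
    """F1 quality measure reported alongside precision and recall."""
    return 2 * precision * recall / (precision + recall)

print(f1_score(0.884, 0.85))   # pixel-based precision paired with a hypothetical recall
```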
A multi-objective approach to improve SWAT model calibration in alpine catchments
NASA Astrophysics Data System (ADS)
Tuo, Ye; Marcolini, Giorgia; Disse, Markus; Chiogna, Gabriele
2018-04-01
Multi-objective hydrological model calibration can represent a valuable solution to reduce model equifinality and parameter uncertainty. The Soil and Water Assessment Tool (SWAT) model is widely applied to investigate water quality and water management issues in alpine catchments. However, the model calibration is generally based on discharge records only, and most of the previous studies have defined a unique set of snow parameters for an entire basin. Only a few studies have considered snow observations to validate model results or have taken into account the possible variability of snow parameters for different subbasins. This work presents and compares three possible calibration approaches. The first two procedures are single-objective calibration procedures, for which all parameters of the SWAT model were calibrated according to river discharge alone. Procedures I and II differ from each other by the assumption used to define snow parameters: The first approach assigned a unique set of snow parameters to the entire basin, whereas the second approach assigned different subbasin-specific sets of snow parameters to each subbasin. The third procedure is a multi-objective calibration, in which we considered snow water equivalent (SWE) information at two different spatial scales (i.e. subbasin and elevation band), in addition to discharge measurements. We tested these approaches in the Upper Adige river basin where a dense network of snow depth measurement stations is available. Only the set of parameters obtained with this multi-objective procedure provided an acceptable prediction of both river discharge and SWE. These findings offer the large community of SWAT users a strategy to improve SWAT modeling in alpine catchments.
Quality Control (QC) System Development for the Pell Grant Program: A Conceptual Framework.
ERIC Educational Resources Information Center
Advanced Technology, Inc., Reston, VA.
The objectives of the Pell Grant quality control (QC) system and the general definition of QC are considered. Attention is also directed to: the objectives of the Stage II Pell Grant QC system design and testing project, the approach used to develop the QC system, and the interface of the QC system and the Pell Grant delivery system. The…
Canada-wide standards and innovative transboundary air quality initiatives.
Barton, Jane
2008-01-01
Canada's approach to air quality management is one that has brought with it opportunities for the development of unique approaches to risk management. Even with Canada's relatively low levels of pollution, science has demonstrated clearly that air quality and ecosystem improvements are worthwhile. To achieve change and address air quality in Canada, Canadian governments work together since, under the constitution, they share responsibility for the environment. At the same time, because air pollution knows no boundaries, working with the governments of other nations is essential to get results. International cooperation at all levels provides opportunities with potential for real change. Cooperation within transboundary airsheds is proving a fruitful source of innovative opportunities to reduce cross-border barriers to air quality improvements. In relation to the NERAM Colloquium objective to establish principles for air quality management based on the identification of international best practice in air quality policy development and implementation, Canada has developed, both at home and with the United States, interesting air management strategies and initiatives from which certain lessons may be taken that could be useful in other countries with similar situations. In particular, the Canada-wide strategies for smog and acid rain were developed by Canadian governments, strategies that improve and protect air quality at home, while Canada-U.S. transboundary airshed projects provide examples of international initiatives to improve air quality.
Paquet, Catherine; St-Arnaud-Mckenzie, Danielle; Ferland, Guylaine; Dubé, Laurette
2003-03-01
Ensuring nutritionally adequate food intake in institutions is a complex and important challenge for dietitians. To tackle this problem, we argue that dietitians need to adopt a systematic, integrative, and patient-centered approach to more effectively identify and manage the organizational determinants of the quality of food intake under their control. In this study, we introduce such an approach, the blueprint-based case study, which we applied in the context of a midterm care facility for elderly patients. Data gathered through interviews and field observations were used to develop, from the perspective of key patient encounters, detailed representations of the food, nutrition, and nursing activities necessary to ensure adequate food intake. These service "blueprints" were developed to illustrate all activities that might potentially impact the nutritional, sensory, functional, and social quality of patients' meals. They were also used as roadmaps to develop a case study analysis in which critical areas were identified and opportunities for improvement put forth, while considering the services' resources and priorities. By providing a precise, objective, yet comprehensive mapping of the service operations and management, the blueprint-based case study approach represents a valuable tool to determine the optimal allocation of resources to ensure nutritionally adequate food intake for patients.
Robustness and cognition in stabilization problem of dynamical systems based on asymptotic methods
NASA Astrophysics Data System (ADS)
Dubovik, S. A.; Kabanov, A. A.
2017-01-01
The problem of synthesizing stabilizing systems based on principles of cognitive (logical-dynamic) control for mobile objects operating under uncertain conditions is considered. This direction in control theory builds on the principles of guaranteeing robust synthesis focused on worst-case scenarios of the controlled process. The guaranteeing approach can provide functioning of the system with the required quality and reliability only under sufficiently small disturbances and in the absence of large deviations from some regular features of the controlled process. The main tool for the analysis of large deviations and the prediction of critical states here is the action functional. Once the forecast is built, the choice of anti-crisis control is a supervisory control problem that optimizes the control system in normal mode and prevents escape of the controlled process into critical states. An essential aspect of the approach presented here is the presence of two-level (logical-dynamic) control: the input data are used not only to generate synthesized feedback (local robust synthesis) in advance (off-line), but also to make decisions about the current (on-line) quality of stabilization in the global sense. An example of using the presented approach for the development of a ship-tilting prediction system is considered.
Hofmann, Julia; Kien, Christina; Gartlehner, Gerald
2015-01-01
Evidence-based information materials about the pros and cons of cancer screening are important sources for men and women deciding for or against cancer screening. The aim of this paper was to compare recommendations from different cancer institutions in German-speaking countries (Austria, Germany, and Switzerland) regarding screening for breast, cervical, colon, and prostate cancer, and to assess the quality and development process of patient information materials. Relevant information material was identified through web searches and personal contact with cancer institutions. To achieve our objective, we employed a qualitative approach. The quality of 22 patient information materials was analysed based on established guidance by Bunge et al. In addition, we conducted guided interviews about the process of developing information materials with decision-makers of cancer institutes. Overall, major discrepancies in cancer screening recommendations exist among the Austrian, German, and Swiss cancer institutes. Process evaluation revealed that crucial steps of quality assurance, such as assembling a multi-disciplinary panel, assessing conflicts of interest, or transparency regarding funding sources, have frequently not been undertaken. All information materials had substantial quality deficits in multiple areas. Three out of four institutes issued information materials that met fewer than half of the quality criteria. Most patient information materials of cancer institutes in German-speaking countries are fraught with substantial deficits and do not provide an objective basis for patients to make an informed decision for or against cancer screening.
A knowledge-based patient assessment system: conceptual and technical design.
Reilly, C. A.; Zielstorff, R. D.; Fox, R. L.; O'Connell, E. M.; Carroll, D. L.; Conley, K. A.; Fitzgerald, P.; Eng, T. K.; Martin, A.; Zidik, C. M.; Segal, M.
2000-01-01
This paper describes the design of an inpatient patient assessment application that captures nursing assessment data using a wireless laptop computer. The primary aim of this system is to capture structured information for facilitating decision support and quality monitoring. The system also aims to improve efficiency of recording patient assessments, reduce costs, and improve discharge planning and early identification of patient learning needs. Object-oriented methods were used to elicit functional requirements and to model the proposed system. A tools-based development approach is being used to facilitate rapid development and easy modification of assessment items and rules for decision support. Criteria for evaluation include perceived utility by clinician users, validity of decision support rules, time spent recording assessments, and perceived utility of aggregate reports for quality monitoring. PMID:11079970
Baby-MONITOR: A Composite Indicator of NICU Quality
Kowalkowski, Marc A.; Zupancic, John A. F.; Pietz, Kenneth; Richardson, Peter; Draper, David; Hysong, Sylvia J.; Thomas, Eric J.; Petersen, Laura A.; Gould, Jeffrey B.
2014-01-01
BACKGROUND AND OBJECTIVES: NICUs vary in the quality of care delivered to very low birth weight (VLBW) infants. NICU performance on 1 measure of quality only modestly predicts performance on others. Composite measurement of quality of care delivery may provide a more comprehensive assessment of quality. The objective of our study was to develop a robust composite indicator of quality of NICU care provided to VLBW infants that accurately discriminates performance among NICUs. METHODS: We developed a composite indicator, Baby-MONITOR, based on 9 measures of quality chosen by a panel of experts. Measures were standardized, equally weighted, and averaged. We used the California Perinatal Quality Care Collaborative database to perform a cross-sectional analysis of care given to VLBW infants between 2004 and 2010. Performance on the Baby-MONITOR is not an absolute marker of quality but indicates overall performance relative to that of the other NICUs. We used sensitivity analyses to assess the robustness of the composite indicator, by varying assumptions and methods. RESULTS: Our sample included 9023 VLBW infants in 22 California regional NICUs. We found significant variations within and between NICUs on measured components of the Baby-MONITOR. Risk-adjusted composite scores discriminated performance among this sample of NICUs. Sensitivity analysis that included different approaches to normalization, weighting, and aggregation of individual measures showed the Baby-MONITOR to be robust (r = 0.89–0.99). CONCLUSIONS: The Baby-MONITOR may be a useful tool to comprehensively assess the quality of care delivered by NICUs. PMID:24918221
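The composite construction the abstract describes (standardize, orient, equally weight, average) is compact to express. A sketch, with the orientation flags supplied by the analyst since some component measures count adverse events:

```python
import numpy as np

def composite_score(measures, higher_is_better):
    """Baby-MONITOR-style composite across NICUs.
    measures: (n_nicus, n_measures) matrix; higher_is_better: one boolean per measure."""
    z = (measures - measures.mean(axis=0)) / measures.std(axis=0, ddof=1)
    z[:, ~np.asarray(higher_is_better)] *= -1.0    # orient so higher always means better
    return z.mean(axis=1)                          # equal weights, then average
```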
Multi-objective experimental design for (13)C-based metabolic flux analysis.
Bouvin, Jeroen; Cajot, Simon; D'Huys, Pieter-Jan; Ampofo-Asiama, Jerry; Anné, Jozef; Van Impe, Jan; Geeraerd, Annemie; Bernaerts, Kristel
2015-10-01
(13)C-based metabolic flux analysis is an excellent technique to resolve fluxes in the central carbon metabolism, but costs can be significant when using specialized tracers. This work presents a framework for cost-effective design of (13)C-tracer experiments, illustrated on two different networks. Linear and non-linear optimal input mixtures are computed for networks for Streptomyces lividans and a carcinoma cell line. If only glucose tracers are considered as labeled substrate for a carcinoma cell line or S. lividans, the best parameter estimation accuracy is obtained by mixtures containing high amounts of 1,2-(13)C2 glucose combined with uniformly labeled glucose. Experimental designs are evaluated based on a linear (D-criterion) and a non-linear approach (S-criterion). Both approaches generate almost the same input mixture; however, the linear approach is favored due to its low computational effort. The high amount of 1,2-(13)C2 glucose in the optimal designs coincides with a high experimental cost, which is further increased when labeling is introduced in glutamine and aspartate tracers. Multi-objective optimization makes it possible to assess experimental quality and cost at the same time and can reveal excellent compromise experiments. For example, the combination of 100% 1,2-(13)C2 glucose with 100% position-one-labeled glutamine and the combination of 100% 1,2-(13)C2 glucose with 100% uniformly labeled glutamine perform equally well for the carcinoma cell line, but the first mixture offers a decrease in cost of $120 per ml-scale cell culture experiment. We demonstrated the validity of a multi-objective linear approach for the non-linear problem of (13)C-metabolic flux analysis. Tools and a workflow are provided to perform multi-objective design. The effortless calculation of the D-criterion can be exploited to perform high-throughput screening of possible (13)C-tracers, while the illustrated benefit of multi-objective design should stimulate its application within the field of (13)C-based metabolic flux analysis.
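The D-criterion screening step amounts to ranking candidate mixtures by the log-determinant of their Fisher information. A minimal sketch, assuming the sensitivity (Jacobian) matrix of measured labeling patterns with respect to the free fluxes comes from a 13C labeling simulator:

```python
import numpy as np

def d_criterion(jacobian):
    """Log-determinant of J^T J; larger means tighter joint flux confidence regions."""
    J = np.asarray(jacobian, float)
    sign, logdet = np.linalg.slogdet(J.T @ J)
    return logdet if sign > 0 else -np.inf

# Rank candidate tracer mixtures by their Jacobians (cheap, high-throughput screening);
# simulate_jacobian is a hypothetical stand-in for a 13C labeling simulation tool:
# best = max(candidates, key=lambda mix: d_criterion(simulate_jacobian(mix)))
```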
TH-D-204-00: The Pursuit of Radiation Oncology Performance Excellence
DOE Office of Scientific and Technical Information (OSTI.GOV)
NONE
The Malcolm Baldrige National Quality Improvement Act was signed into law in 1987 to advance U.S. business competitiveness and economic growth. Administered by the National Institute of Standards and Technology (NIST), the Act created the Baldrige National Quality Program, now renamed the Baldrige Performance Excellence Program. The comprehensive analytical approaches, referred to as the Baldrige Healthcare Criteria, are very well suited to the evaluation and sustainable improvement of radiation oncology management and operations. A multidisciplinary self-assessment approach is used for radiotherapy program evaluation and development in order to generate a fact-based, knowledge-driven system for improving quality of care, increasing patient satisfaction, building employee engagement, and boosting organizational innovation. The methodology also provides a valuable framework for benchmarking an individual radiation oncology practice against guidelines defined by accreditation and professional organizations and regulatory agencies. Learning Objectives: To gain knowledge of the Baldrige Performance Excellence Program as it relates to Radiation Oncology. To appreciate the value of a multidisciplinary self-assessment approach in the pursuit of Radiation Oncology quality care, patient satisfaction, and workforce commitment. To acquire a set of useful measurement tools with which an individual Radiation Oncology practice can benchmark its performance against guidelines defined by accreditation and professional organizations and regulatory agencies.
TH-D-204-01: The Pursuit of Radiation Oncology Performance Excellence
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sternick, E.
The Malcolm Baldrige National Quality Improvement Act was signed into law in 1987 to advance U.S. business competitiveness and economic growth. Administered by the National Institute of Standards and Technology (NIST), the Act created the Baldrige National Quality Program, now renamed the Baldrige Performance Excellence Program. The comprehensive analytical approaches, referred to as the Baldrige Healthcare Criteria, are very well suited for the evaluation and sustainable improvement of radiation oncology management and operations. A multidisciplinary self-assessment approach is used for radiotherapy program evaluation and development in order to generate a fact-based, knowledge-driven system for improving quality of care, increasing patient satisfaction, building employee engagement, and boosting organizational innovation. The methodology also provides a valuable framework for benchmarking an individual radiation oncology practice against guidelines defined by accreditation and professional organizations and regulatory agencies. Learning Objectives: To gain knowledge of the Baldrige Performance Excellence Program as it relates to Radiation Oncology. To appreciate the value of a multidisciplinary self-assessment approach in the pursuit of Radiation Oncology quality care, patient satisfaction, and workforce commitment. To acquire a set of useful measurement tools with which an individual Radiation Oncology practice can benchmark its performance against guidelines defined by accreditation and professional organizations and regulatory agencies.
O'Donnell, Sean T; Caldwell, Michael D; Barlaz, Morton A; Morris, Jeremy W F
2018-05-01
Municipal solid waste (MSW) landfills in the USA are regulated under Subtitle D of the Resource Conservation and Recovery Act (RCRA), which includes the requirement to protect human health and the environment (HHE) during the post-closure care (PCC) period. Several approaches have been published for assessment of potential threats to HHE. These approaches can be broadly divided into organic stabilization, which establishes an inert waste mass as the ultimate objective, and functional stability, which considers long-term emissions in the context of minimizing threats to HHE in the absence of active controls. The objective of this research was to conduct a case study evaluation of a closed MSW landfill using long-term data on landfill gas (LFG) production, leachate quality, site geology, and solids decomposition. Evaluations based on both functional and organic stability criteria were compared. The results showed that longer periods of LFG and leachate management would be required using organic stability criteria relative to an approach based on functional stability. These findings highlight the somewhat arbitrary and overly stringent nature of assigning universal stability criteria without due consideration of the landfill's hydrogeologic setting and potential environmental receptors. This supports previous studies that advocated for transition to a passive or inactive control stage based on a performance-based functional stability framework as a defensible mechanism for optimizing and ending regulatory PCC. Copyright © 2018 Elsevier Ltd. All rights reserved.
Fast global image smoothing based on weighted least squares.
Min, Dongbo; Choi, Sunghwan; Lu, Jiangbo; Ham, Bumsub; Sohn, Kwanghoon; Do, Minh N
2014-12-01
This paper presents an efficient technique for performing spatially inhomogeneous edge-preserving image smoothing, called the fast global smoother. Focusing on sparse Laplacian matrices consisting of a data term and a prior term (typically defined using four or eight neighbors for a 2D image), our approach efficiently solves such global objective functions. In particular, we approximate the solution of the memory- and computation-intensive large linear system, defined over a d-dimensional spatial domain, by solving a sequence of 1D subsystems. Our separable implementation enables applying a linear-time tridiagonal matrix algorithm to solve d three-point Laplacian matrices iteratively. Our approach combines the best of two paradigms, i.e., efficient edge-preserving filters and optimization-based smoothing. Our method has a runtime comparable to the fast edge-preserving filters, but its global optimization formulation overcomes many limitations of the local filtering approaches. Our method also achieves quality comparable to the state-of-the-art optimization-based techniques, but runs ∼10-30 times faster. In addition, considering the flexibility in defining an objective function, we further propose generalized fast algorithms that perform Lγ norm smoothing (0 < γ < 2) and support an aggregated (robust) data term for handling imprecise data constraints. We demonstrate the effectiveness and efficiency of our techniques in a range of image processing and computer graphics applications.
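A minimal 1D sketch of the core idea, assuming the standard weighted-least-squares formulation (a fidelity term plus an edge-weighted smoothness term) solved in linear time with the Thomas tridiagonal algorithm; the edge-stopping weight function below is an illustrative choice, not the paper's exact one. The full method alternates such 1D passes along each of the d dimensions.

```python
import numpy as np

def wls_smooth_1d(f: np.ndarray, lam: float, w: np.ndarray) -> np.ndarray:
    """Solve (I + lam * L_w) u = f for one scanline, where L_w is the
    three-point Laplacian weighted by w[i] between samples i and i+1
    (small across edges, large in flat regions). The tridiagonal system
    is solved in linear time with the Thomas algorithm."""
    n = len(f)
    lower, diag, upper = np.zeros(n), np.ones(n), np.zeros(n)
    upper[:-1] = -lam * w
    lower[1:] = -lam * w
    diag[:-1] += lam * w
    diag[1:] += lam * w
    c, d = np.zeros(n), np.zeros(n)        # forward sweep
    c[0], d[0] = upper[0] / diag[0], f[0] / diag[0]
    for i in range(1, n):
        m = diag[i] - lower[i] * c[i - 1]
        c[i] = upper[i] / m
        d[i] = (f[i] - lower[i] * d[i - 1]) / m
    u = np.zeros(n)                        # back substitution
    u[-1] = d[-1]
    for i in range(n - 2, -1, -1):
        u[i] = d[i] - c[i] * u[i + 1]
    return u

rng = np.random.default_rng(1)
signal = np.r_[np.zeros(50), np.ones(50)] + 0.1 * rng.normal(size=100)
weights = np.exp(-np.abs(np.diff(signal)) / 0.2)  # edge-stopping weights
smoothed = wls_smooth_1d(signal, lam=5.0, w=weights)
print(smoothed[:3], smoothed[-3:])  # noise suppressed, step edge preserved
```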
Lau, Erica Y; Wong, Del P; Ransdell, Lynda
2011-01-01
Background: A growing body of research has employed information and communication technologies (ICTs) such as the Internet and mobile phones for disseminating physical activity (PA) interventions with young populations. Although several systematic reviews have documented the effects of ICT-based interventions on PA behavior, very few have focused on children and adolescents specifically. Objectives: The present review aimed to systematically evaluate the efficacy and methodological quality of ICT-based PA interventions for children and adolescents based on evidence from randomized controlled trials. Methods: Electronic databases Medline, PsycInfo, CINAHL, and Web of Science were searched to retrieve English language articles published in international academic peer-reviewed journals from January 1, 1997, through December 31, 2009. Included were articles that provided descriptions of interventions designed to improve PA-related cognitive, psychosocial, and behavioral outcomes and that used a randomized controlled trial design, included only children (6-12 years old) and adolescents (13-18 years old) in both intervention and control groups, and employed Internet, email, and/or short message services (SMS, also known as text messaging) as one or more major or assistive modes to deliver the intervention. Results: In total, 9 studies were analyzed in the present review. All studies were published after 2000 and conducted in Western countries. Of the 9 studies, 7 demonstrated positive and significant within-group differences in at least one psychosocial or behavioral PA outcome. In all, 3 studies reported positive and significant between-group differences favoring the ICT group. When between-group differences were compared across studies, effect sizes were small in 6 studies and large in 3 studies. With respect to methodological quality, 7 of the 9 studies were rated as good. Reporting of allocation concealment, blinding of outcome assessment, and long-term follow-up were the quality criteria met by the fewest studies. In addition, 5 studies measured the intervention exposure rate and only 1 study employed objective measures to record data. Conclusion: The present review provides evidence supporting the positive effects of ICTs in PA interventions for children and adolescents, especially when used with other delivery approaches (ie, face-to-face). Because ICT delivery approaches are often mixed with other approaches and these studies sometimes lack a comparable control group, additional research is needed to establish the true independent effects of ICT as an intervention delivery mode. Although two-thirds of the studies demonstrated satisfactory methodological quality, several quality criteria should be considered in future studies: clear descriptions of allocation concealment and blinding of outcome assessment, extension of intervention duration, and use of objective measures of intervention exposure. Due to the small number of studies that met inclusion criteria and the lack of consistent evidence, researchers should be cautious when interpreting the findings of the present review. PMID:21749967
Holographic display system for restoration of sight to the blind
Goetz, G A; Mandel, Y; Manivanh, R; Palanker, D V; Čižmár, T
2013-01-01
Objective: We present a holographic near-the-eye display system enabling optical approaches for sight restoration to the blind, such as photovoltaic retinal prosthesis, optogenetic and other photoactivation techniques. We compare it with conventional LCD or DLP-based displays in terms of image quality, field of view, optical efficiency and safety. Approach: We detail the optical configuration of the holographic display system and its characterization using a phase-only spatial light modulator. Main results: We describe approaches to controlling the zero diffraction order and speckle related issues in holographic display systems and assess the image quality of such systems. We show that holographic techniques offer significant advantages in terms of peak irradiance and power efficiency, and enable designs that are inherently safer than LCD or DLP-based systems. We demonstrate the performance of our holographic display system in the assessment of cortical response to alternating gratings projected onto the retinas of rats. Significance: We address the issues associated with the design of high brightness, near-the-eye display systems and propose solutions to the efficiency and safety challenges with an optical design which could be miniaturized and mounted onto goggles. PMID:24045579
Scanlon, Dennis P; Wolf, Laura J; Alexander, Jeffrey A; Christianson, Jon B; Greene, Jessica; Jean-Jacques, Muriel; McHugh, Megan; Shi, Yunfeng; Leitzell, Brigitt; Vanderbrink, Jocelyn M
2016-08-01
The Aligning Forces for Quality (AF4Q) initiative was the Robert Wood Johnson Foundation's (RWJF's) signature effort to increase the overall quality of healthcare in targeted communities throughout the country. In addition to sponsoring this 16-site complex program, RWJF funded an independent scientific evaluation to support objective research on the initiative's effectiveness and contributions to basic knowledge in 5 core programmatic areas. The research design, data, and challenges faced during the summative evaluation phase of this near decade-long program are discussed. A descriptive overview of the summative research design employed by the evaluation team, and of its development for a multi-site, community-based healthcare quality improvement initiative, is provided. The evaluation team's summative research design involved a data-driven assessment of the effectiveness of the AF4Q program at large, assessments of the impact of AF4Q in the specific programmatic areas, and an assessment of how the AF4Q alliances were positioned for the future at the end of the program. The AF4Q initiative was the largest privately funded community-based healthcare improvement initiative in the United States to date and was implemented at a time of rapid change in national healthcare policy. The implementation of large-scale, multi-site initiatives is becoming an increasingly common approach for addressing problems in healthcare. The summative evaluation research design for the AF4Q initiative, and the lessons learned from its approach, may be valuable to others tasked with evaluating similarly complex community-based initiatives.
CAN-Care: an innovative model of practice-based learning.
Raines, Deborah A
2006-01-01
The "Collaborative Approach to Nursing Care" (CAN-Care) Model of practice-based education is designed to meet the unique learning needs of the accelerated nursing program student. The model is based on a synergistic partnership between the academic and service settings, the vision of which is to create an innovative practice-based learning model, resulting in a positive experience for both the student and unit-based nurse. Thus, the objectives of quality outcomes for both the college and Health Care Organization are fulfilled. Specifically, the goal is the education of nurses ready to meet the challenges of caring for persons in the complex health care environment of the 21st century.
Fuzzy logic controller to improve powerline communication
NASA Astrophysics Data System (ADS)
Tirrito, Salvatore
2015-12-01
The Power Line Communications (PLC) technology allows the use of the power grid to ensure the exchange of data among devices. This work proposes an approach, based on fuzzy logic, that dynamically manages the amplitude of the signal with which each node transmits, by processing the measured master-slave link quality and the master-slave distance. The main objective is to reduce both the impact of induced communication interference and power consumption.
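A toy sketch of the kind of fuzzy inference described; the membership ranges, rule set, and output levels below are illustrative assumptions, not values from the paper.

```python
def tri(x: float, a: float, b: float, c: float) -> float:
    """Triangular membership function peaking at b, zero outside [a, c]."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def amplitude_controller(link_quality: float, distance: float) -> float:
    """Mamdani-style toy rules: poor link OR long distance -> transmit
    loudly; good link AND short distance -> save power."""
    poor = tri(link_quality, 0.0, 0.0, 0.5)   # link quality in [0, 1]
    good = tri(link_quality, 0.4, 1.0, 1.6)
    near = tri(distance, -100.0, 0.0, 150.0)  # distance in metres
    far = tri(distance, 100.0, 300.0, 500.0)
    high = max(poor, far)   # OR -> max
    low = min(good, near)   # AND -> min
    # Weighted-average defuzzification over two singleton amplitude levels
    return (high * 1.0 + low * 0.2) / (high + low + 1e-9)

print(amplitude_controller(link_quality=0.3, distance=250.0))  # ~1.0 (loud)
```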
Competence and Quality in Real-Life Decision Making
2015-01-01
What distinguishes a competent decision maker, and how should the issue of decision quality be approached in a real-life context? These questions were explored in three studies. In Study 1, using a web-based questionnaire and targeting a community sample, we investigated the relationships between objective and subjective indicators of real-life decision-making success. In Studies 2 and 3, targeting two different samples of professionals, we explored whether the prevalent cognitively oriented definition of decision-making competence could be beneficially expanded by adding aspects of competence in terms of social skills and time-approach. The predictive power of each of these three aspects of decision-making competence was explored for different indicators of real-life decision-making success. Overall, our results suggest that research on decision-making competence would benefit from expanding the definition of competence to include decision-related abilities in terms of social skills and time-approach. Finally, the results also indicate that individual differences in real-life decision-making success can profitably be approached and measured by different criteria. PMID:26545239
Bueno, Juan M; Skorsetz, Martin; Palacios, Raquel; Gualda, Emilio J; Artal, Pablo
2014-01-01
Despite the inherent confocality and optical sectioning capabilities of multiphoton microscopy, three-dimensional (3-D) imaging of thick samples is limited by the specimen-induced aberrations. The combination of immersion objectives and sensorless adaptive optics (AO) techniques has been suggested to overcome this difficulty. However, a complex plane-by-plane correction of aberrations is required, and its performance depends on a set of image-based merit functions. We propose here an alternative approach to increase penetration depth in 3-D multiphoton microscopy imaging. It is based on the manipulation of the spherical aberration (SA) of the incident beam with an AO device while performing fast tomographic multiphoton imaging. When inducing SA, the image quality at best focus is reduced; however, better quality images are obtained from deeper planes within the sample. This is a compromise that enables registration of improved 3-D multiphoton images using nonimmersion objectives. Examples on ocular tissues and nonbiological samples providing different types of nonlinear signal are presented. The implementation of this technique in a future clinical instrument might provide a better visualization of corneal structures in living eyes.
NASA Astrophysics Data System (ADS)
Ramlau, R.; Saxenhuber, D.; Yudytskiy, M.
2014-07-01
The problem of atmospheric tomography arises in ground-based telescope imaging with adaptive optics (AO), where one aims to compensate in real time for the rapidly changing optical distortions in the atmosphere. Many of these systems depend on a sufficient reconstruction of the turbulence profiles in order to obtain a good correction. Due to steadily growing telescope sizes, there is a strong increase in the computational load for atmospheric reconstruction with current methods, first and foremost the MVM. In this paper we present and compare three novel iterative reconstruction methods. The first iterative approach is the Finite Element-Wavelet Hybrid Algorithm (FEWHA), which combines wavelet-based techniques and conjugate gradient schemes to efficiently and accurately tackle the problem of atmospheric reconstruction. The method is extremely fast, highly flexible, and yields superior quality. Another novel iterative reconstruction algorithm is the three-step approach, which decouples the problem into the reconstruction of the incoming wavefronts, the reconstruction of the turbulent layers (atmospheric tomography), and the computation of the best mirror correction (fitting step). For the atmospheric tomography problem within the three-step approach, the Kaczmarz algorithm and a gradient-based method have been developed. We present a detailed comparison of our reconstructors, both in terms of quality and speed performance, in the context of a Multi-Object Adaptive Optics (MOAO) system for the E-ELT setting on OCTOPUS, the ESO end-to-end simulation tool.
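For reference, a minimal sketch of the classic Kaczmarz iteration used in the three-step approach, shown on a small dense system; a real atmospheric-tomography operator would be applied matrix-free rather than stored as a dense matrix.

```python
import numpy as np

def kaczmarz(A: np.ndarray, b: np.ndarray, sweeps: int = 50) -> np.ndarray:
    """Cyclic Kaczmarz for A x = b: project the current iterate onto the
    hyperplane defined by each row in turn."""
    m, n = A.shape
    x = np.zeros(n)
    row_norms = (A * A).sum(axis=1)
    for _ in range(sweeps):
        for i in range(m):
            if row_norms[i] > 0:
                x += (b[i] - A[i] @ x) / row_norms[i] * A[i]
    return x

rng = np.random.default_rng(2)
A = rng.normal(size=(40, 20))
x_true = rng.normal(size=20)
x_est = kaczmarz(A, A @ x_true, sweeps=200)
print("relative error:", np.linalg.norm(x_est - x_true) / np.linalg.norm(x_true))
```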
Yuan, Tao; Zheng, Xinqi; Hu, Xuan; Zhou, Wei; Wang, Wei
2014-01-01
Objective and effective image quality assessment (IQA) is directly related to the application of optical remote sensing images (ORSI). In this study, a new IQA method that uses the target object recognition rate (ORR) as a standardized measure of quality is presented. First, several quality degradation treatments are applied to high-resolution ORSIs to model ORSIs obtained under different imaging conditions; then, a machine learning algorithm is adopted for recognition experiments on a chosen target object to obtain ORRs; finally, a comparison with commonly used IQA indicators is performed to reveal their applicability and limitations. The results showed that the ORR of the original ORSI was up to 81.95%, whereas the ORR ratios of the quality-degraded images to the original images were 65.52%, 64.58%, 71.21%, and 73.11%. These data can more accurately reflect the advantages and disadvantages of different images in object identification and information extraction than conventional digital image assessment indexes. By judging image quality from the perspective of application effect, using a machine learning algorithm to extract regional gray-scale features of typical objects in the image, and quantitatively assessing ORSI quality according to the resulting differences, this method provides a new approach for objective ORSI assessment.
The Airline Quality Rating 1999
NASA Technical Reports Server (NTRS)
Bowen, Brent D.; Headley, Dean E.
1999-01-01
The Airline Quality Rating (AQR) was developed and first announced in early 1991 as an objective method of comparing airline performance on combined multiple criteria. This current report, Airline Quality Rating 1999, reflects an updated approach to calculating monthly Airline Quality Rating scores for 1998. AQR scores for the calendar year 1998 are based on 15 elements that focus on airline performance areas important to air travel consumers. The Airline Quality Rating is a summary of month-by-month quality ratings for the ten major U.S. airlines operating during 1998. Using the Airline Quality Rating system of weighted averages and monthly performance data in the areas of on-time arrivals, involuntary denied boardings, mishandled baggage, and a combination of 12 customer complaint categories, major airlines' comparative performance for the calendar year 1998 is reported. This research monograph contains a brief summary of the AQR methodology, detailed data and charts that track comparative quality for major airlines' domestic operations for the 12-month period of 1998, and industry average results. Also, comparative Airline Quality Rating data for 1997, using the updated criteria, are included to provide a reference point regarding quality in the industry.
Fit for purpose quality management system for military forensic exploitation.
Wilson, Lauren Elizabeth; Gahan, Michelle Elizabeth; Robertson, James; Lennard, Chris
2018-03-01
In a previous publication we described a systems approach to forensic science applied in the military domain. The forensic science 'system of systems' describes forensic science as a sub-system in the larger criminal justice, law enforcement, intelligence, and military systems, with quality management being an important supporting system. Quality management systems help to ensure that organisations achieve their objectives and continually improve their capability. Components of forensic science quality management systems can include standardisation of processes, accreditation of facilities to national/international standards, and certification of personnel. A fit-for-purpose quality management system should be balanced to allow organisations to meet objectives, provide continuous improvement, mitigate risk, and impart a positive quality culture. Considerable attention over the last decades has been given to the need for forensic science quality management systems to meet criminal justice and law enforcement objectives. More recently, the need for forensic quality management systems to meet forensic intelligence objectives has been considered. This paper, for the first time, discusses the need for a fit-for-purpose quality management system for military forensic exploitation. Crown Copyright © 2018. Published by Elsevier B.V. All rights reserved.
Lumb, Ashok; Halliwell, Doug; Sharma, Tribeni
2006-02-01
All six ecosystem initiatives evolved from many years of federal, provincial, First Nation, local government, and community attention to the stresses on sensitive habitats and species, air and water quality, and the consequent threats to community livability. This paper assesses the water quality aspect of the ecosystem initiatives and employs the newly developed Canadian Council of Ministers of the Environment Water Quality Index (CCME WQI), which provides a convenient means of summarizing complex water quality data that can be easily understood by the public, water distributors, planners, managers, and policy makers. The CCME WQI incorporates three elements: Scope - the number of water quality parameters (variables) not meeting water quality objectives (F1); Frequency - the number of times the objectives are not met (F2); and Amplitude - the extent to which the objectives are not met (F3). The index produces a number between 0 (worst) and 100 (best) to reflect the water quality. This study evaluates water quality of the Mackenzie-Great Bear sub-basin by employing two modes of objective functions (threshold values): one based on the CCME water quality guidelines and the other based on site-specific values determined by statistical analysis of the historical database. Results suggest that the water quality of the Mackenzie-Great Bear sub-basin is impacted by high turbidity and total (mostly particulate) trace metals due to high suspended sediment loads during the open water season. Comments are also provided on water quality and human health issues in the Mackenzie basin based on the findings and on the usefulness of CCME water quality guidelines and site-specific values.
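A compact sketch of how the three elements above combine in the standard CCME WQI 1.0 calculation, assuming exceed-maximum objectives for simplicity; the measurements and limits below are invented for illustration.

```python
import math

def ccme_wqi(tests: dict, objectives: dict) -> float:
    """CCME WQI 1.0 sketch. `tests` maps each variable to its measured
    values; `objectives` maps each variable to a maximum acceptable value."""
    variables = list(tests)
    failed_vars = [v for v in variables if any(x > objectives[v] for x in tests[v])]
    n_tests = sum(len(vals) for vals in tests.values())
    failed = [(v, x) for v in variables for x in tests[v] if x > objectives[v]]

    f1 = 100.0 * len(failed_vars) / len(variables)    # Scope
    f2 = 100.0 * len(failed) / n_tests                # Frequency
    excursions = [x / objectives[v] - 1.0 for v, x in failed]
    nse = sum(excursions) / n_tests                   # normalized sum of excursions
    f3 = nse / (0.01 * nse + 0.01)                    # Amplitude
    return 100.0 - math.sqrt(f1**2 + f2**2 + f3**2) / 1.732

# Illustrative two-variable record (values and limits are made up)
tests = {"turbidity_NTU": [4.0, 9.0, 15.0], "lead_ug_L": [1.0, 1.2, 0.8]}
objectives = {"turbidity_NTU": 8.0, "lead_ug_L": 5.0}
print(round(ccme_wqi(tests, objectives), 1))  # ~64: "marginal" water quality
```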
Wolff, Anthony H; Kellett, John
2011-12-01
Several approaches to measuring the quality of hospital care have been suggested. We propose the simple and objective approach of using the health-related data of patient administration systems and the laboratory results that have been collected and stored electronically in hospitals for years. Imaginative manipulation of these data can give new insights into the quality of patient care. Copyright © 2011 European Federation of Internal Medicine. All rights reserved.
Fast large-scale object retrieval with binary quantization
NASA Astrophysics Data System (ADS)
Zhou, Shifu; Zeng, Dan; Shen, Wei; Zhang, Zhijiang; Tian, Qi
2015-11-01
The objective of large-scale object retrieval systems is to search for images that contain the target object in an image database. Whereas state-of-the-art approaches rely on global image representations to conduct searches, we consider many boxes per image as candidates to search locally within a picture. In this paper, a feature quantization algorithm called binary quantization is proposed. In binary quantization, a scale-invariant feature transform (SIFT) feature is quantized into a descriptive and discriminative bit-vector, which adapts naturally to the classic inverted file structure for box indexing. The inverted file, which stores the bit-vector and the ID of the box containing the SIFT feature, is compact and can be loaded into main memory for efficient box indexing. We evaluate our approach on available object retrieval datasets. Experimental results demonstrate that the proposed approach is fast and achieves excellent search quality. Therefore, the proposed approach is an improvement over state-of-the-art approaches for object retrieval.
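A toy sketch of the bit-vector-plus-inverted-file idea, not the paper's exact quantizer: a median-threshold binarization stands in for binary quantization, and a brute-force Hamming scan stands in for the bucketed inverted-file lookup a real system would use.

```python
import numpy as np
from collections import defaultdict

def binarize(descriptor: np.ndarray) -> int:
    """Toy binarization: threshold each SIFT dimension against the
    descriptor's own median and pack the bits into a Python int,
    so similarity becomes a cheap Hamming distance."""
    bits = descriptor > np.median(descriptor)
    return int("".join("1" if b else "0" for b in bits), 2)

index = defaultdict(list)  # bit-vector -> (image_id, box_id) postings

def add_box(image_id: str, box_id: int, descriptors: np.ndarray) -> None:
    for d in descriptors:
        index[binarize(d)].append((image_id, box_id))

def query(descriptor: np.ndarray, max_hamming: int = 8) -> list:
    """Brute-force scan over index keys; a real inverted file probes only
    nearby buckets instead of scanning every posting list."""
    code = binarize(descriptor)
    return [box for key, boxes in index.items()
            if bin(code ^ key).count("1") <= max_hamming for box in boxes]

rng = np.random.default_rng(3)
descs = rng.normal(size=(5, 128))
add_box("img1", 0, descs)
print(query(descs[0]))  # exact re-query: Hamming distance 0, box found
```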
The effect of input data transformations on object-based image analysis
LIPPITT, CHRISTOPHER D.; COULTER, LLOYD L.; FREEMAN, MARY; LAMANTIA-BISHOP, JEFFREY; PANG, WYSON; STOW, DOUGLAS A.
2011-01-01
The effect of using spectral transform images as input data on segmentation quality and its potential effect on products generated by object-based image analysis are explored in the context of land cover classification in Accra, Ghana. Five image data transformations are compared to untransformed spectral bands in terms of their effect on segmentation quality and final product accuracy. The relationship between segmentation quality and product accuracy is also briefly explored. Results suggest that input data transformations can aid in the delineation of landscape objects by image segmentation, but the effect is idiosyncratic to the transformation and object of interest. PMID:21673829
Human Rights-Based Approaches to Mental Health: A Review of Programs.
Porsdam Mann, Sebastian; Bradley, Valerie J; Sahakian, Barbara J
2016-06-01
The incidence of human rights violations in mental health care across nations has been described as a "global emergency" and an "unresolved global crisis." The relationship between mental health and human rights is complex and bidirectional. Human rights violations can negatively impact mental health. Conversely, respecting human rights can improve mental health. This article reviews cases where an explicitly human rights-based approach was used in mental health care settings. Although the included studies did not exhibit a high level of methodological rigor, the qualitative information obtained was considered useful and informative for future studies. All studies reviewed suggest that human rights-based approaches can lead to clinical improvements at relatively low costs. Human rights-based approaches should be utilized for legal and moral reasons, since human rights are fundamental pillars of justice and civilization. The fact that such approaches can contribute to positive therapeutic outcomes and, potentially, cost savings, is additional reason for their implementation. However, the small sample size and lack of controlled, quantitative measures limit the strength of conclusions drawn from included studies. More objective, high quality research is needed to ascertain the true extent of benefits to service users and providers.
Opinion versus evidence for the need to move away from animal testing.
Hartung, Thomas
2017-01-01
Science is based on facts and their discourse. Willingly or unwillingly, facts are mixed with opinion, i.e., views or judgments formed, not necessarily based on fact or knowledge. This is often necessary where facts are controversial or definitive evidence is not yet available, because decisions must be taken and priorities set. Evidence-based approaches aim at identifying the facts and their quality objectively and transparently; they are now increasingly embraced in toxicology, especially by employing systematic reviews, meta-analyses, quality scoring, risk-of-bias tools, etc. These are core to Evidence-based Toxicology. Such approaches aim at minimizing opinion, the "eminence-based" part of science. Animal experiments are the basis of much of our textbook knowledge in the life sciences, have helped to develop desperately needed therapies, and have made this world a safer place. However, they represent only one of the many possible approaches to accomplish all these things. Like all approaches, they come with shortcomings, and their true contribution is often overrated. This article aims to summarize their limitations and challenges besides the ethical and economic concerns (i.e., costs and duration, as well as costs following wrong decisions in product development): they include reproducibility, inadequate reporting, statistical under-powering, lack of inter-species predictivity, and lack of reflection of human diversity and real-life exposure. Each and every one of these increasingly discussed aspects of animal experiments can be remedied, but this would require enormous additional resources. Together, they prompt a need to engineer a new paradigm to ensure the safety of patients and consumers, new products and therapies.
Comprehensive Aspectual UML approach to support AspectJ.
Magableh, Aws; Shukur, Zarina; Ali, Noorazean Mohd
2014-01-01
Unified Modeling Language is the most popular and widely used Object-Oriented modelling language in the IT industry. This study focuses on investigating the ability to expand UML to some extent to model crosscutting concerns (Aspects) to support AspectJ. Through a comprehensive literature review, we identify and extensively examine all the available Aspect-Oriented UML modelling approaches and find that the existing Aspect-Oriented Design Modelling approaches using UML cannot be considered to provide a framework for a comprehensive Aspectual UML modelling approach and also that there is a lack of adequate Aspect-Oriented tool support. This study also proposes a set of Aspectual UML semantic rules and attempts to generate AspectJ pseudocode from UML diagrams. The proposed Aspectual UML modelling approach is formally evaluated using a focus group to test six hypotheses regarding performance; a "good design" criteria-based evaluation to assess the quality of the design; and an AspectJ-based evaluation as a reference measurement-based evaluation. The results of the focus group evaluation confirm all the hypotheses put forward regarding the proposed approach. The proposed approach provides a comprehensive set of Aspectual UML structural and behavioral diagrams, which are designed and implemented based on a comprehensive and detailed set of AspectJ programming constructs.
Loe, Alan; Barman, Linda; O'Donoghue, John; Zary, Nabil
2015-01-01
Background: Preparing the future health care professional workforce in a changing world is a significant undertaking. Educators and other decision makers look to evidence-based knowledge to improve the quality of education. Analytics, the use of data to generate insights and support decisions, has been applied successfully across numerous application domains. Health care professional education is one area where great potential is yet to be realized. Previous research on academic and learning analytics has mainly focused on technical issues. The focus of this study relates to its practical implementation in the setting of health care education. Objective: The aim of this study is to create a conceptual model for a deeper understanding of the synthesizing process, transforming data into information to support educators' decision making. Methods: A deductive case study approach was applied to develop the conceptual model. Results: The analytics loop works both in theory and in practice. The conceptual model encompasses the underlying data, the quality indicators, and decision support for educators. Conclusions: The model illustrates how a theory can be applied to a traditional data-driven analytics approach, alongside the context- or need-driven analytics approach. PMID:27731840
Harlow C. Landphair
1979-01-01
This paper relates the evolution of an empirical model used to predict public response to scenic quality objectively. The text relates the methods used to develop the visual quality index model, explains the terms used in the equation and briefly illustrates how the model is applied and how it is tested. While the technical application of the model relies heavily on...
Mahjouri, Najmeh; Ardestani, Mojtaba
2011-01-01
In this paper, two methodologies, one cooperative and one non-cooperative, are developed for a large-scale water allocation problem in Southern Iran. The water shares of the water users and their net benefits are determined using optimization models with economic objectives, subject to the physical and environmental constraints of the system. The results of the two methodologies are compared based on the total economic benefit obtained, and the role of cooperation in utilizing a shared water resource is demonstrated. In both cases, the water quality in the rivers satisfies the standards. Comparing the results of the two approaches shows the importance of acting cooperatively to achieve maximum revenue in utilizing a surface water resource while river water quantity and quality issues are addressed.
[Interaction between clinical and research towards venture business].
Sumida, Iori
2014-01-01
As a medical physicist, the author has supported multiple institutions performing both conventional and advanced radiation therapies. As advanced radiation treatment techniques have spread rapidly, quality assurance (QA) has become more important and more complex, increasing the number of QA items. To keep radiation therapy as accurate as possible, an efficient and objective approach to performing QA is essential. The author has developed QA software, grounded in practical experience, that addresses these needs. This paper presents the radiation treatment situation at the supported institutions, describes the author's contributions to them through medical physics support, and considers how the developed software has spread to many institutions in Japan through a venture business.
Implementing the patient-centered medical home in residency education.
Doolittle, Benjamin R; Tobin, Daniel; Genao, Inginia; Ellman, Matthew; Ruser, Christopher; Brienza, Rebecca
2015-01-01
In recent years, physician groups, government agencies and third party payers in the United States of America have promoted a Patient-centered Medical Home (PCMH) model that fosters a team-based approach to primary care. Advocates highlight the model's collaborative approach where physicians, mid-level providers, nurses and other health care personnel coordinate their efforts with an aim for high-quality, efficient care. Early studies show improvement in quality measures, reduction in emergency room visits and cost savings. However, implementing the PCMH presents particular challenges to physician training programs, including institutional commitment, infrastructure expenditures and faculty training. Teaching programs must consider how the objectives of the PCMH model align with recent innovations in resident evaluation now required by the Accreditation Council of Graduate Medical Education (ACGME) in the US. This article addresses these challenges, assesses the preliminary success of a pilot project, and proposes a viable, realistic model for implementation at other institutions.
Development of framework for sustainable Lean implementation: an ISM approach
NASA Astrophysics Data System (ADS)
Jadhav, Jagdish Rajaram; Mantha, S. S.; Rane, Santosh B.
2014-07-01
The survival of any organization depends upon its competitive edge. Even though Lean is one of the most powerful quality improvement methodologies, nearly two-thirds of Lean implementations result in failures, and less than one-fifth of those implemented have sustained results. One of the most significant tasks of top management is to identify, understand, and deploy the significant Lean practices like quality circles, Kanban, just-in-time purchasing, etc. The term 'bundle' is used to make groups of inter-related and internally consistent Lean practices. Eight significant Lean practice bundles have been identified based on the literature reviewed and the opinion of experts. The order of execution of Lean practice bundles is very important. Lean practitioners must be able to understand the interrelationship between these practice bundles. The objective of this paper is to develop a framework for sustainable Lean implementation using an interpretive structural modelling (ISM) approach.
JPEG vs. JPEG 2000: an objective comparison of image encoding quality
NASA Astrophysics Data System (ADS)
Ebrahimi, Farzad; Chamik, Matthieu; Winkler, Stefan
2004-11-01
This paper describes an objective comparison of the image quality of different encoders. Our approach is based on estimating the visual impact of compression artifacts on perceived quality. We present a tool that measures these artifacts in an image and uses them to compute a prediction of the Mean Opinion Score (MOS) obtained in subjective experiments. We show that the MOS predictions by our proposed tool are a better indicator of perceived image quality than PSNR, especially for highly compressed images. For the encoder comparison, we compress a set of 29 test images with two JPEG encoders (Adobe Photoshop and IrfanView) and three JPEG2000 encoders (JasPer, Kakadu, and IrfanView) at various compression ratios. We compute blockiness, blur, and MOS predictions as well as PSNR of the compressed images. Our results show that the IrfanView JPEG encoder produces consistently better images than the Adobe Photoshop JPEG encoder at the same data rate. The differences between the JPEG2000 encoders in our test are less pronounced; JasPer comes out as the best codec, closely followed by IrfanView and Kakadu. Comparing the JPEG- and JPEG2000-encoding quality of IrfanView, we find that JPEG has a slight edge at low compression ratios, while JPEG2000 is the clear winner at medium and high compression ratios.
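For reference, a self-contained PSNR computation of the kind used as the baseline comparison above; the linear MOS-prediction form in the trailing comment is a generic placeholder, since the paper's exact artifact-to-MOS mapping is not reproduced here.

```python
import numpy as np

def psnr(reference: np.ndarray, test: np.ndarray, peak: float = 255.0) -> float:
    """Peak signal-to-noise ratio in dB between two equal-sized images."""
    mse = np.mean((reference.astype(np.float64) - test.astype(np.float64)) ** 2)
    return float("inf") if mse == 0 else 10.0 * np.log10(peak ** 2 / mse)

a = np.zeros((4, 4))
print(psnr(a, a + 5.0))  # 10*log10(255^2 / 25) ~ 34.15 dB

# A MOS predictor of the general form mos_hat = a0 - a1*blockiness - a2*blur
# would be fit to subjective scores; the coefficients are study-specific.
```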
TU-AB-BRD-04: Development of Quality Management Program
DOE Office of Scientific and Technical Information (OSTI.GOV)
Thomadsen, B.
2015-06-15
Current quality assurance and quality management guidelines provided by various professional organizations are prescriptive in nature, focusing principally on performance characteristics of planning and delivery devices. However, published analyses of events in radiation therapy show that most events are often caused by flaws in clinical processes rather than by device failures. This suggests the need for the development of a quality management program that is based on integrated approaches to process and equipment quality assurance. Industrial engineers have developed various risk assessment tools that are used to identify and eliminate potential failures from a system or a process before a failure impacts a customer. These tools include, but are not limited to, process mapping, failure modes and effects analysis, and fault tree analysis. Task Group 100 of the American Association of Physicists in Medicine has developed these tools and used them to formulate an example risk-based quality management program for intensity-modulated radiotherapy. This is a prospective risk assessment approach that analyzes potential error pathways inherent in a clinical process and then ranks them according to relative risk, typically before implementation, followed by the design of a new process or modification of the existing process. Appropriate controls are then put in place to ensure that failures are less likely to occur and, if they do, they will more likely be detected before they propagate through the process, compromising treatment outcome and causing harm to the patient. Such a prospective approach forms the basis of the work of Task Group 100, which has recently been approved by the AAPM. This session will be devoted to a discussion of these tools and practical examples of how they can be used in a given radiotherapy clinic to develop a risk-based quality management program. Learning Objectives: Learn how to design a process map for a radiotherapy process. Learn how to perform failure modes and effects analysis for a given process. Learn what fault trees are all about. Learn how to design a quality management program based upon the information obtained from process mapping, failure modes and effects analysis, and fault tree analysis. Dunscombe: Director, TreatSafely, LLC and Center for the Assessment of Radiological Sciences; Consultant to IAEA and Varian. Thomadsen: President, Center for the Assessment of Radiological Sciences. Palta: Vice President of the Center for the Assessment of Radiological Sciences.
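A minimal sketch of the FMEA ranking step described above, using the conventional risk priority number (occurrence x severity x detectability); the process steps and 1-10 scores are invented for illustration, not TG-100 values.

```python
from dataclasses import dataclass

@dataclass
class FailureMode:
    step: str
    occurrence: int     # 1-10: likelihood the failure happens
    severity: int       # 1-10: seriousness of the consequences
    detectability: int  # 1-10: 10 = very unlikely to be caught

    @property
    def rpn(self) -> int:
        """Risk priority number: the FMEA ranking score O x S x D."""
        return self.occurrence * self.severity * self.detectability

# Hypothetical process steps and scores, purely for illustration.
modes = [
    FailureMode("contour transfer", occurrence=4, severity=8, detectability=6),
    FailureMode("plan approval", occurrence=2, severity=9, detectability=3),
    FailureMode("couch setup", occurrence=6, severity=5, detectability=4),
]
for mode in sorted(modes, key=lambda m: m.rpn, reverse=True):
    print(f"{mode.step:17s} RPN={mode.rpn}")  # highest RPN gets controls first
```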
NASA Astrophysics Data System (ADS)
Hartung, Christine; Spraul, Raphael; Schuchert, Tobias
2017-10-01
Wide area motion imagery (WAMI) acquired by an airborne multicamera sensor enables continuous monitoring of large urban areas. Each image can cover regions of several square kilometers and contain thousands of vehicles. Reliable vehicle tracking in this imagery is an important prerequisite for surveillance tasks, but remains challenging due to low frame rate and small object size. Most WAMI tracking approaches rely on moving object detections generated by frame differencing or background subtraction. These detection methods fail when objects slow down or stop. Recent approaches for persistent tracking compensate for missing motion detections by combining a detection-based tracker with a second tracker based on appearance or local context. In order to avoid the additional complexity introduced by combining two trackers, we employ an alternative single-tracker framework that is based on multiple hypothesis tracking and recovers missing motion detections with a classifier-based detector. We integrate an appearance-based similarity measure, merge handling, vehicle-collision tests, and clutter handling to adapt the approach to the specific context of WAMI tracking. We apply the tracking framework on a region of interest of the publicly available WPAFB 2009 dataset for quantitative evaluation; a comparison to other persistent WAMI trackers demonstrates state-of-the-art performance of the proposed approach. Furthermore, we analyze in detail the impact of different object detection methods and detector settings on the quality of the output tracking results. For this purpose, we choose four different motion-based detection methods that vary in detection performance and computation time to generate the input detections. As detector parameters can be adjusted to achieve different precision and recall performance, we combine each detection method with different detector settings that yield (1) high precision and low recall, (2) high recall and low precision, and (3) best f-score. Comparing the tracking performance achieved with all generated sets of input detections allows us to quantify the sensitivity of the tracker to different types of detector errors and to derive recommendations for detector and parameter choice.
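The three detector operating points named above are conventionally summarized by precision, recall, and f-score; a minimal computation for concreteness (the counts are invented):

```python
def detector_scores(true_pos: int, false_pos: int, false_neg: int):
    """Precision, recall, and f-score characterizing a detector setting,
    e.g. the high-precision, high-recall, and best-f operating points."""
    precision = true_pos / (true_pos + false_pos)
    recall = true_pos / (true_pos + false_neg)
    f_score = 2 * precision * recall / (precision + recall)
    return precision, recall, f_score

# Illustrative counts only: 850 correct detections, 90 spurious, 210 missed.
print(detector_scores(true_pos=850, false_pos=90, false_neg=210))
```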
Quan, May Lynn; Wells, Bryan J; McCready, David; Wright, Frances C; Fraser, Novlette; Gagliardi, Anna R
2010-02-01
Sentinel lymph node biopsy (SLNB) has been adopted as the standard method of axillary staging for women with clinically node-negative early-stage breast cancer. The false negative rate is impractical as a quality indicator because calculating it requires a completion axillary dissection. The objective of this study was to develop practical quality indicators for SLNB using an expert consensus method and to determine whether they were feasible to measure. We used a modified Delphi consensus process to develop quality indicators for SLNB. A multidisciplinary expert panel reviewed potential indicators extracted from the medical literature to select quality indicators that were relevant and measurable. Feasibility was determined by abstracting the quality indicator variables from a retrospective chart review. The expert panel prioritized 11 quality indicators as benchmarks for assessing the quality of surgical care in SLNB. Nine of the indicators were measurable at the chart or institutional level. A systematic evidence- and consensus-based approach was used to develop measurable quality indicators that could be used by practicing surgeons and administrators to evaluate performance of SLNB in breast cancer.
The personal shopper – a pilot randomized trial of grocery store-based dietary advice
Lewis, K H; Roblin, D W; Leo, M; Block, J P
2015-01-01
The objective of this study was to test the feasibility and preliminary efficacy of a store-based dietary education intervention against traditional clinic-based advice. Patients with obesity (n = 55, mean [standard deviation, SD] age 44.3 [9.2] years, 64% women, 87% non-Hispanic Black) were randomized to receive dietary counselling either in a grocery store or a clinic. Change between groups (analysis of covariance) was assessed for outcomes including: dietary quality (Healthy Eating Index – 2005 [0–100 points]), and nutritional knowledge (0–65-point knowledge scale). Both groups reported improved diet quality at the end of the study. Grocery participants had greater increases in knowledge (mean [SD] change = 5.7 [6.1] points) than clinic participants (mean [SD] change = 3.2 [4.0] points) (P = 0.04). Participants enjoyed the store-based sessions. Grocery store-based visits offer a promising approach for dietary counselling. PMID:25873139
Harlander, Niklas; Rosenkranz, Tobias; Hohmann, Volker
2012-08-01
Single-channel noise reduction has been well investigated and seems to have reached its limits in terms of speech intelligibility improvement; however, the quality of such schemes can still be advanced. This study tests to what extent novel model-based processing schemes might improve performance, in particular for non-stationary noise conditions. Two prototype model-based algorithms, a speech-model-based and an auditory-model-based algorithm, were compared to a state-of-the-art non-parametric minimum statistics algorithm. A speech intelligibility test, preference rating, and listening effort scaling were performed. Additionally, three objective quality measures for the signal, background, and overall distortions were applied. For a better comparison of all algorithms, particular attention was given to the use of a similar Wiener-based gain rule. The perceptual investigation was performed with fourteen hearing-impaired subjects. The results revealed that the non-parametric algorithm and the auditory-model-based algorithm did not affect speech intelligibility, whereas the speech-model-based algorithm slightly decreased intelligibility. In terms of subjective quality, both model-based algorithms performed better than the unprocessed condition and the reference, in particular for highly non-stationary noise environments. The data support the hypothesis that model-based algorithms are promising for improving performance in non-stationary noise conditions.
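A minimal sketch of a Wiener-based gain rule of the kind referred to above, assuming a crude spectral-subtraction SNR estimate; the floor value and noise estimate are illustrative, not the study's.

```python
import numpy as np

def wiener_gain(noisy_psd: np.ndarray, noise_psd: np.ndarray,
                gain_floor: float = 0.05) -> np.ndarray:
    """Per-bin Wiener gain G = snr / (1 + snr), with the a priori SNR
    estimated by power spectral subtraction; the floor bounds the
    attenuation to limit musical-noise artifacts."""
    snr = np.maximum(noisy_psd - noise_psd, 0.0) / np.maximum(noise_psd, 1e-12)
    return np.maximum(snr / (1.0 + snr), gain_floor)

# Applied per STFT frame: enhanced = wiener_gain(|Y|**2, noise_est) * Y
noisy_psd = np.array([4.0, 1.0, 0.25])
noise_psd = np.array([1.0, 1.0, 1.0])
print(wiener_gain(noisy_psd, noise_psd))  # [0.75, 0.05, 0.05]
```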
Building distributed rule-based systems using the AI Bus
NASA Technical Reports Server (NTRS)
Schultz, Roger D.; Stobie, Iain C.
1990-01-01
The AI Bus software architecture was designed to support the construction of large-scale, production-quality applications in areas of high technology flux, running in heterogeneous distributed environments and utilizing a mix of knowledge-based and conventional components. These goals led to its current development as a layered, object-oriented library for cooperative systems. This paper describes the concepts and design of the AI Bus and its implementation status as a library of reusable and customizable objects, structured in layers from operating system interfaces up to high-level knowledge-based agents. Each agent is a semi-autonomous process with specialized expertise, and consists of a number of knowledge sources (a knowledge base and inference engine). Inter-agent communication mechanisms are based on blackboards and Actors-style acquaintances. As a conservative first implementation, we used C++ on top of Unix and wrapped an embedded CLIPS with methods for the knowledge source class. This involved designing standard protocols for communication and functions that use these protocols in rules. Embedding several CLIPS objects within a single process was an unexpected problem because of global variables; the solution involved constructing and recompiling a C++ version of CLIPS. We are currently working on a more radical approach to incorporating CLIPS, by separating out its pattern matcher, rule and fact representations, and other components as true object-oriented modules.
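A minimal sketch of the blackboard-style inter-agent communication described above, written in Python rather than the paper's C++/CLIPS stack; the class and method names are illustrative, not the AI Bus API.

```python
class Blackboard:
    """Shared store through which knowledge sources exchange facts."""
    def __init__(self):
        self.facts = {}
        self.subscribers = []

    def post(self, key, value):
        """Record a fact and notify every registered knowledge source."""
        self.facts[key] = value
        for agent in self.subscribers:
            agent.notify(key, value)

class KnowledgeSource:
    """Semi-autonomous agent that reacts only to facts in its expertise."""
    def __init__(self, name, board, triggers):
        self.name, self.triggers = name, triggers
        board.subscribers.append(self)

    def notify(self, key, value):
        if key in self.triggers:
            print(f"{self.name} reacting to {key}={value}")

board = Blackboard()
KnowledgeSource("diagnoser", board, triggers={"sensor_reading"})
board.post("sensor_reading", 42)  # prints: diagnoser reacting to sensor_reading=42
```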
NASA Technical Reports Server (NTRS)
Soule, Veronique
1989-01-01
This study was initiated to provide an approach to the development of a permanently manned Mars base. The objectives for a permanently manned Mars base are numerous. Primarily, human presence on Mars will allow utilization of new resources for the improvement of the quality of life on Earth, allowing for new discoveries in technology, the solar system, and human physiology. Such a mission would also encourage interaction between different countries, increasing international cooperation and leading to a stronger unification of mankind. Surface studies of Mars, scientific experiments in multiple fields, the search for new minerals, and natural resource production are more immediate goals of the Mars mission. Finally, in the future, colonization of Mars will ensure man's perpetual presence in the universe. Specific objectives of this study were: (1) to design a Mars habitat that minimizes the mass delivered to the Mars surface, provides long-stay capability for the base crew, and accommodates future expansion and modification; (2) to develop a scenario of the construction of a permanently manned Mars base; and (3) to incorporate new and envisioned technologies.
Chougrani, Saada; Ouhadj, Salah
2014-01-01
Quality of care is a strategic priority of any management approach that aims to meet users' expectations of health care systems. This study tried to define the role of patient satisfaction surveys and the place of the user in the quality-of-care project. The results of patient satisfaction surveys conducted between 2010 and 2012 and the draft quality-of-care project were analysed. Patient satisfaction surveys from 2010 to 2012 focused on logistic shortcomings. No comment was formulated about health care. Comments and suggestions did not provide any contribution in terms of patient involvement in the health care process. The multiple perspectives of quality of care include clinical care and other social objectives of respect for the individual and attention to the patient. User satisfaction as assessed by patient satisfaction surveys or patients' experiences reflects only the health professionals' representation. The objective, however, is to measure what the user perceives and feels and his/her representation of the attention provided. These approaches, conducted outside of the quality-of-care strategic plan, only provide a basis for actions with limited or no effectiveness.
Differentiated strategies for improving streaming service quality
NASA Astrophysics Data System (ADS)
An, Hui; Chen, Xin-Meng
2005-02-01
With the explosive growth of streaming services, users are becoming more and more sensitive to their quality of service. To handle these problems, the research community has focused on the application of caching and replication techniques. But most approaches try to find specific caching or replication strategies suited to streaming service characteristics, and to design some kind of universal policy to deal with all streaming objects. This paper explores the combination of caching and replication for improving streaming service quality and demonstrates that it makes sense to integrate the two technologies. It provides a system model and discusses related issues, such as how to determine whether a streaming object is refreshable and which refreshment policies a refreshable object should use.
Estimation of 3D reconstruction errors in a stereo-vision system
NASA Astrophysics Data System (ADS)
Belhaoua, A.; Kohler, S.; Hirsch, E.
2009-06-01
The paper presents an approach for error estimation for the various steps of an automated 3D vision-based reconstruction procedure of manufactured workpieces. The process is based on a priori planning of the task and built around a cognitive intelligent sensory system using so-called Situation Graph Trees (SGT) as a planning tool. Such an automated quality control system requires the coordination of a set of complex processes sequentially performing data acquisition, its quantitative evaluation and the comparison with a reference model (e.g., CAD object model) in order to quantitatively evaluate the object. To ensure efficient quality control, the aim is to be able to state whether reconstruction results fulfill tolerance rules or not. Thus, the goal is to evaluate independently the error for each step of the stereo-vision based 3D reconstruction (e.g., for calibration, contour segmentation, matching and reconstruction) and then to estimate the error for the whole system. In this contribution, we particularly analyze the segmentation error due to localization errors for extracted edge points supposed to belong to lines and curves composing the outline of the workpiece under evaluation. The fitting parameters describing these geometric features are used as a quality measure to determine confidence intervals and finally to estimate the segmentation errors. These errors are then propagated through the whole reconstruction procedure, making it possible to evaluate their effect on the final 3D reconstruction result, specifically on position uncertainties. Lastly, analysis of these error estimates enables evaluation of the quality of the 3D reconstruction, as illustrated by the experimental results shown.
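The segmentation-error idea (fit parameters as a quality measure, followed by first-order propagation) can be sketched as follows; the edge points and noise level are assumed purely for illustration.

```python
# Sketch: fit a line to extracted edge points and use the fit covariance
# as a quality measure, then propagate it to a position uncertainty.
import numpy as np

rng = np.random.default_rng(0)
x = np.linspace(0, 50, 80)                     # pixel coordinates along an edge
y = 0.5 * x + 2 + rng.normal(0, 0.3, x.size)   # noisy edge localization (assumed)

coeffs, cov = np.polyfit(x, y, 1, cov=True)    # slope/intercept + covariance
ci95 = 1.96 * np.sqrt(np.diag(cov))            # ~95% confidence intervals

# First-order propagation: variance of the reconstructed edge position at x0
# is var(b) + x0^2 var(a) + 2 x0 cov(a, b) for the line y = a x + b.
x0 = 25.0
var_y0 = cov[1, 1] + x0**2 * cov[0, 0] + 2 * x0 * cov[0, 1]
print("slope/intercept CI:", ci95, "position sigma at x0:", np.sqrt(var_y0))
```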
NASA Astrophysics Data System (ADS)
Samigulina, Galina A.; Shayakhmetova, Assem S.
2016-11-01
The research objective is the creation of intelligent innovative technology and an information Smart-system of distance learning for visually impaired people. The organization of an accessible environment for receiving quality education for visually impaired people and their social adaptation in society are important and topical issues of modern education. The proposed Smart-system of distance learning for visually impaired people can significantly improve the efficiency and quality of education for this category of people. The scientific novelty of the proposed Smart-system lies in using intelligent and statistical methods of processing multi-dimensional data, and in taking into account the psycho-physiological characteristics of perception and awareness of learning information by visually impaired people.
Response Ant Colony Optimization of End Milling Surface Roughness
Kadirgama, K.; Noor, M. M.; Abd Alla, Ahmed N.
2010-01-01
Metal cutting processes are important due to increased consumer demand for quality metal-cut products (more precise tolerances and better surface roughness), which has driven the metal cutting industry to continuously improve quality control of metal cutting processes. This paper presents optimization of surface roughness in the milling of mould aluminium alloy (AA6061-T6) with Response Ant Colony Optimization (RACO). The approach is based on the Response Surface Method (RSM) and Ant Colony Optimization (ACO). The main objectives are to find the optimized parameters and the most dominant variables (cutting speed, feedrate, axial depth and radial depth). The first order model indicates that the feedrate is the most significant factor affecting surface roughness. PMID:22294914
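A minimal sketch of the first-order response surface step: regress roughness on the four coded cutting variables and read the dominant factor off the largest coefficient. The design matrix and roughness values are illustrative assumptions, chosen so that feedrate dominates, mirroring the abstract's finding.

```python
# First-order RSM sketch: half-fraction design in coded units, one response.
import numpy as np

# columns: cutting speed, feedrate, axial depth, radial depth (coded -1/+1)
X = np.array([[-1, -1, -1, -1],
              [ 1, -1, -1,  1],
              [-1,  1, -1,  1],
              [ 1,  1, -1, -1],
              [-1, -1,  1,  1],
              [ 1, -1,  1, -1],
              [-1,  1,  1, -1],
              [ 1,  1,  1,  1]], dtype=float)
Ra = np.array([0.36, 0.34, 0.66, 0.64, 0.37, 0.35, 0.67, 0.65])  # assumed (um)

A = np.hstack([np.ones((X.shape[0], 1)), X])        # add intercept column
beta, *_ = np.linalg.lstsq(A, Ra, rcond=None)
print("intercept + effects:", beta.round(3))        # feedrate has the largest |effect|
```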
Adding Remote Sensing Data Products to the Nutrient Management Decision Support Toolbox
NASA Technical Reports Server (NTRS)
Lehrter, John; Schaeffer, Blake; Hagy, Jim; Spiering, Bruce; Blonski, Slawek; Underwood, Lauren; Ellis, Chris
2011-01-01
Some of the primary issues that manifest from nutrient enrichment and eutrophication (Figure 1) may be observed from satellites. For example, remotely sensed estimates of chlorophyll a (chla), total suspended solids (TSS), and light attenuation (Kd) or water clarity, which are often associated with elevated nutrient inputs, are data products collected daily and globally for coastal systems from satellites such as NASA's MODIS (Figure 2). The objective of this project is to inform water quality decision-making activities using remotely sensed water quality data. In particular, we seek to inform the development of numeric nutrient criteria. In this poster we demonstrate an approach for developing nutrient criteria based on remotely sensed chla.
Adaptive gamma correction-based expert system for nonuniform illumination face enhancement
NASA Astrophysics Data System (ADS)
Abdelhamid, Iratni; Mustapha, Aouache; Adel, Oulefki
2018-03-01
The image quality of a face recognition system suffers under severe lighting conditions. Thus, this study aims to develop an approach for nonuniform illumination adjustment based on an adaptive gamma correction (AdaptGC) filter that can solve the aforementioned issue. An approach for adaptive gain factor prediction was developed via a neural network model-based cross-validation (NN-CV). To achieve this objective, a gamma correction function and its effects on face image quality with different gain values were examined first. Second, an orientation histogram (OH) algorithm was assessed as a face feature descriptor. Subsequently, a density histogram module was developed for face label generation. During the NN-CV construction, the model was assessed to recognize the OH descriptor and predict the face label. The performance of the NN-CV model was evaluated by examining the statistical measures of root mean square error and coefficient of efficiency. Third, to evaluate the AdaptGC enhancement approach, an image quality metric was adopted using enhancement by entropy, contrast per pixel, second-derivative-like measure of enhancement, and sharpness, supported by visual inspection. The experimental results were examined using five face databases, namely, extended Yale-B; Carnegie Mellon University Pose, Illumination, and Expression; Mobio; FERET; and Oulu-CASIA-NIR-VIS. The final results show that, compared with state-of-the-art methods, the AdaptGC filter is the best choice in terms of contrast and nonuniform illumination adjustment. In summary, the benefits attained show that AdaptGC achieves a favourable enhancement rate, providing satisfactory features for high-rate face recognition systems.
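The gamma correction function itself is simple; the sketch below applies it with an adaptively chosen gain, using mean intensity as a crude stand-in for the gain predicted by the NN-CV model in the paper.

```python
import numpy as np

def gamma_correct(img, gamma):
    """Apply gamma correction to an image normalized to [0, 1]."""
    return np.clip(img, 0, 1) ** gamma

def adaptive_gamma(img):
    # Heuristic stand-in for the NN-predicted gain: choose gamma so the
    # mean intensity maps to 0.5 (m ** gamma = 0.5).
    m = img.mean()
    return gamma_correct(img, np.log(0.5) / np.log(m + 1e-6))

face = np.random.default_rng(1).uniform(0, 0.4, (64, 64))  # dark test image
print(face.mean(), adaptive_gamma(face).mean())            # mean rises to roughly 0.5
```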
NASA Astrophysics Data System (ADS)
Bialas, James; Oommen, Thomas; Rebbapragada, Umaa; Levin, Eugene
2016-07-01
Object-based approaches to the segmentation and classification of remotely sensed images yield more promising results than pixel-based approaches. However, the development of an object-based approach presents challenges in terms of algorithm selection and parameter tuning. Subjective methods are often used, but yield less than optimal results. Objective methods are warranted, especially for rapid deployment in time-sensitive applications, such as earthquake damage assessment. Herein, we used a systematic approach in evaluating object-based image segmentation and machine learning algorithms for the classification of earthquake damage in remotely sensed imagery. We tested a variety of algorithms and parameters on post-event aerial imagery for the 2011 earthquake in Christchurch, New Zealand. Results were compared against manually selected test cases representing different classes. In doing so, we can evaluate the effectiveness of the segmentation and classification of different classes and compare different levels of multistep image segmentations. Our classifier is compared against recent pixel-based and object-based classification studies for post-event imagery of earthquake damage. Our results show an improvement against both pixel-based and object-based methods for classifying earthquake damage in high-resolution, post-event imagery.
Dueñas, A.; González, M. A.; Muñoz, A.; Salvador, C. H.
1994-01-01
The objective of this proposal is to provide solutions for the needs of teleconsultation and telediagnosis among medical professionals, using workstations within the X-Windows environment, applicable over communication lines with a wide range of bandwidths, and independent of the operating system. Among the advantages sought are savings in transportation, improvement in the quality of the medical attention provided, and continued training for the medical professional. PMID:7949963
Peters, Adam; Simpson, Peter; Moccia, Alessandra
2014-01-01
Recent years have seen considerable improvement in water quality standards (QS) for metals by taking account of the effect of local water chemistry conditions on their bioavailability. We describe preliminary efforts to further refine water quality standards by taking account of the composition of the local ecological community (the ultimate protection objective) in addition to bioavailability. Relevance of QS to the local ecological community is critical: it is important to minimise instances where quality classification using QS does not reconcile with a quality classification based on an assessment of the composition of the local ecology (e.g. using benthic macroinvertebrate quality assessment metrics such as the River InVertebrate Prediction and Classification System (RIVPACS)), particularly where ecology is assessed to be at good or better status whilst chemical quality is determined to be failing relevant standards. The alternative approach outlined here describes a method to derive a site-specific species sensitivity distribution (SSD) based on the ecological community which is expected to be present at the site in the absence of anthropogenic pressures (reference conditions). The method combines a conventional laboratory ecotoxicity dataset normalised for bioavailability with field measurements of the response of benthic macroinvertebrate abundance to chemical exposure. Site-specific QSref are then derived from the 5th percentile of this SSD. Using this method, site QSref have been derived for zinc in an area impacted by historic mining activities. Application of QSref can result in greater agreement between chemical and ecological metrics of environmental quality compared with the use of either conventional (QScon) or bioavailability-based QS (QSbio). In addition to zinc, the approach is likely to be applicable to other metals and possibly other types of chemical stressors (e.g. pesticides). However, the methodology for deriving site-specific targets requires additional development and validation before it can be robustly applied during surface water classification.
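A minimal sketch of the SSD step under assumed data: fit a log-normal distribution to bioavailability-normalised toxicity values and take its 5th percentile as the candidate QSref.

```python
# Sketch of deriving a QS from a species sensitivity distribution (SSD):
# the toxicity values below are assumptions, not the study's zinc dataset.
import numpy as np
from scipy import stats

ec50_ugL = np.array([12., 18., 25., 40., 55., 80., 120., 200.])  # assumed data
mu, sigma = stats.norm.fit(np.log10(ec50_ugL))    # log-normal SSD fit
hc5 = 10 ** stats.norm.ppf(0.05, mu, sigma)       # 5th percentile (HC5)
print(f"candidate site QSref: {hc5:.1f} ug/L")
```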
Evidence-based approach for continuous improvement of occupational health.
Manzoli, Lamberto; Sotgiu, Giovanni; Magnavita, Nicola; Durando, Paolo
2015-01-01
It was recognized early on that an Evidence-Based Medicine (EBM) approach could be applied to Public Health (PH), including the area of Occupational Health (OH). The aim of Evidence-Based Occupational Health (EBOH) is to ensure safety, health, and well-being in the workplace. Currently, high-quality research is necessary in order to provide arguments and scientific evidence upon which effective, efficient, and sustainable preventive measures and policies are to be developed in the workplace in Western countries. Occupational physicians need to integrate available scientific evidence and existing recommendations with a framework of national employment laws and regulations. This paper addresses the state of the art of scientific evidence available in the field (i.e., efficacy of interventions, usefulness of education and training of workers, and need of a multidisciplinary strategy integrated within the national PH programs) and the main critical issues for their implementation. Promoting good health is a fundamental part of the smart, inclusive growth objectives of Europe 2020 - Europe's growth strategy: keeping people healthy and active for longer has a positive impact on productivity and competitiveness. It appears clear that health quality and safety in the workplace play a key role for smart, sustainable, and inclusive growth in Western countries.
Lietz, Henrike; Lingani, Moustapha; Sié, Ali; Sauerborn, Rainer; Souares, Aurelia; Tozan, Yesim
2015-01-01
Background There are more than 40 Health and Demographic Surveillance System (HDSS) sites in 19 different countries. The running costs of HDSS sites are high. The financing of HDSS activities is of major importance, and adding external health surveys to the HDSS is challenging. To investigate the ways of improving data quality and collection efficiency in the Nouna HDSS in Burkina Faso, the stand-alone data collection activities of the HDSS and the Household Morbidity Survey (HMS) were integrated, and the paper-based questionnaires were consolidated into a single tablet-based questionnaire, the Comprehensive Disease Assessment (CDA). Objective The aims of this study are to estimate and compare the implementation costs of the two different survey approaches for measuring population health. Design All financial costs of stand-alone (HDSS and HMS) and integrated (CDA) surveys were estimated from the perspective of the implementing agency. Fixed and variable costs of survey implementation and key cost drivers were identified. The costs per household visit were calculated for both survey approaches. Results While fixed costs of survey implementation were similar for the two survey approaches, there were considerable variations in variable costs, resulting in an estimated annual cost saving of about US$45,000 under the integrated survey approach. This was primarily because the costs of data management for the tablet-based CDA survey were considerably lower than for the paper-based stand-alone surveys. The cost per household visit from the integrated survey approach was US$21 compared with US$25 from the stand-alone surveys for collecting the same amount of information from 10,000 HDSS households. Conclusions The CDA tablet-based survey method appears to be feasible and efficient for collecting health and demographic data in the Nouna HDSS in rural Burkina Faso. The possibility of using the tablet-based data collection platform to improve the quality of population health data requires further exploration. PMID:26257048
Detection of Obstacles in Monocular Image Sequences
NASA Technical Reports Server (NTRS)
Kasturi, Rangachar; Camps, Octavia
1997-01-01
The ability to detect and locate runways/taxiways and obstacles in images captured using on-board sensors is an essential first step in the automation of low-altitude flight, landing, takeoff, and taxiing phase of aircraft navigation. Automation of these functions under different weather and lighting situations, can be facilitated by using sensors of different modalities. An aircraft-based Synthetic Vision System (SVS), with sensors of different modalities mounted on-board, complements the current ground-based systems in functions such as detection and prevention of potential runway collisions, airport surface navigation, and landing and takeoff in all weather conditions. In this report, we address the problem of detection of objects in monocular image sequences obtained from two types of sensors, a Passive Millimeter Wave (PMMW) sensor and a video camera mounted on-board a landing aircraft. Since the sensors differ in their spatial resolution, and the quality of the images obtained using these sensors is not the same, different approaches are used for detecting obstacles depending on the sensor type. These approaches are described separately in two parts of this report. The goal of the first part of the report is to develop a method for detecting runways/taxiways and objects on the runway in a sequence of images obtained from a moving PMMW sensor. Since the sensor resolution is low and the image quality is very poor, we propose a model-based approach for detecting runways/taxiways. We use the approximate runway model and the position information of the camera provided by the Global Positioning System (GPS) to define regions of interest in the image plane to search for the image features corresponding to the runway markers. Once the runway region is identified, we use histogram-based thresholding to detect obstacles on the runway and regions outside the runway. This algorithm is tested using image sequences simulated from a single real PMMW image.
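The histogram-based thresholding step might look like the following sketch on a synthetic runway image; the deviation rule used here is an assumption, not the report's exact algorithm.

```python
# Histogram-based obstacle detection sketch: within a runway region of
# interest, flag pixels far from the dominant intensity mode (synthetic data).
import numpy as np

rng = np.random.default_rng(2)
roi = rng.normal(100, 5, (120, 160))        # runway surface intensities
roi[40:50, 60:80] += 60                     # inject a bright obstacle

hist, edges = np.histogram(roi, bins=64)
mode = 0.5 * (edges[hist.argmax()] + edges[hist.argmax() + 1])
obstacle_mask = np.abs(roi - mode) > 4 * roi.std()   # assumed threshold rule
print("obstacle pixels flagged:", int(obstacle_mask.sum()))
```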
Two hybrid compaction algorithms for the layout optimization problem.
Xiao, Ren-Bin; Xu, Yi-Chun; Amos, Martyn
2007-01-01
In this paper we present two new algorithms for the layout optimization problem: this concerns the placement of circular, weighted objects inside a circular container, the two objectives being to minimize imbalance of mass and to minimize the radius of the container. This problem carries real practical significance in industrial applications (such as the design of satellites), as well as being of significant theoretical interest. The two nature-inspired algorithms are based on simulated annealing and particle swarm optimization, respectively. We compare our algorithms with the existing best-known algorithm and show that our approaches outperform it in terms of both solution quality and execution time.
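A compact simulated-annealing sketch for this layout problem follows, with illustrative radii, masses, and weighting of the two objectives (plus an overlap penalty); none of these values come from the paper.

```python
# Simulated annealing for circle layout: minimize a weighted sum of mass
# imbalance, container radius, and overlap penalty (all weights assumed).
import math, random

random.seed(3)
radii  = [1.0, 0.8, 0.6, 0.5]
masses = [1.0, 0.7, 0.4, 0.3]
pos = [[random.uniform(-2, 2), random.uniform(-2, 2)] for _ in radii]

def cost(p):
    # imbalance: distance of the centre of mass from the container centre
    cx = sum(m * x for m, (x, _) in zip(masses, p)) / sum(masses)
    cy = sum(m * y for m, (_, y) in zip(masses, p)) / sum(masses)
    # container radius: smallest origin-centred circle enclosing all objects
    container = max(math.hypot(x, y) + r for (x, y), r in zip(p, radii))
    overlap = sum(max(0.0, radii[i] + radii[j] - math.dist(p[i], p[j]))
                  for i in range(len(p)) for j in range(i + 1, len(p)))
    return math.hypot(cx, cy) + container + 10.0 * overlap

T, cur = 1.0, cost(pos)
while T > 1e-3:
    i = random.randrange(len(pos))
    old = pos[i][:]
    pos[i][0] += random.gauss(0, 0.2)
    pos[i][1] += random.gauss(0, 0.2)
    new = cost(pos)
    if new < cur or random.random() < math.exp((cur - new) / T):
        cur = new                      # accept (possibly worse) move
    else:
        pos[i] = old                   # reject and restore
    T *= 0.999                         # geometric cooling schedule
print("final cost:", round(cur, 3))
```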
An experimental investigation of the flow physics of high-lift systems
NASA Technical Reports Server (NTRS)
Thomas, Flint O.; Nelson, R. C.
1995-01-01
This progress report, a series of viewgraphs, outlines experiments on the flow physics of confluent boundary layers for high-lift systems. The design objective is high-lift systems with improved C(sub Lmax) for landing approach and improved take-off L/D, while simultaneously reducing acquisition and maintenance costs; in effect, achieving improved performance with simpler designs. The research objectives include: establishing the role of confluent boundary layer flow physics in high-lift production; contrasting confluent boundary layer structure for optimum and non-optimum C(sub L) cases; forming a high-quality, detailed archival database for CFD/modeling; and examining the role of relaminarization and streamline curvature.
NASA Astrophysics Data System (ADS)
Nash, A. E., III
2017-12-01
The most common approaches to identifying the most effective mission design to maximize science return from a potential set of competing alternative design approaches are often inefficient and inaccurate. Recently, Team-X at the Jet Propulsion Laboratory undertook an effort to improve both the speed and quality of science - measurement - mission design trade studies. We will report on the methodology & processes employed and their effectiveness in trade study speed and quality. Our results indicate that facilitated subject matter expert peers are the keys to speed and quality improvements in the effectiveness of science - measurement - mission design trade studies.
Soh, Harold; Demiris, Yiannis
2014-01-01
Human beings not only possess the remarkable ability to distinguish objects through tactile feedback but are further able to improve upon recognition competence through experience. In this work, we explore tactile-based object recognition with learners capable of incremental learning. Using the sparse online infinite Echo-State Gaussian process (OIESGP), we propose and compare two novel discriminative and generative tactile learners that produce probability distributions over objects during object grasping/palpation. To enable iterative improvement, our online methods incorporate training samples as they become available. We also describe incremental unsupervised learning mechanisms, based on novelty scores and extreme value theory, when teacher labels are not available. We present experimental results for both supervised and unsupervised learning tasks using the iCub humanoid, with tactile sensors on its five-fingered anthropomorphic hand, and 10 different object classes. Our classifiers perform comparably to state-of-the-art methods (C4.5 and SVM classifiers) and findings indicate that tactile signals are highly relevant for making accurate object classifications. We also show that accurate "early" classifications are possible using only 20-30 percent of the grasp sequence. For unsupervised learning, our methods generate high quality clusterings relative to the widely-used sequential k-means and self-organising map (SOM), and we present analyses into the differences between the approaches.
Chuang, Emmeline; Dill, Janette; Morgan, Jennifer Craft; Konrad, Thomas R
2012-01-01
Objective To identify high-performance work practices (HPWP) associated with high frontline health care worker (FLW) job satisfaction and perceived quality of care. Methods Cross-sectional survey data from 661 FLWs in 13 large health care employers were collected between 2007 and 2008 and analyzed using both regression and fuzzy-set qualitative comparative analysis. Principal Findings Supervisor support and team-based work practices were identified as necessary for high job satisfaction and high quality of care but not sufficient to achieve these outcomes unless implemented in tandem with other HPWP. Several configurations of HPWP were associated with either high job satisfaction or high quality of care. However, only one configuration of HPWP was sufficient for both: the combination of supervisor support, performance-based incentives, team-based work, and flexible work. These findings were consistent even after controlling for FLW demographics and employer type. Additional research is needed to clarify whether HPWP have differential effects on quality of care in direct care versus administrative workers. Conclusions High-performance work practices that integrate FLWs in health care teams and provide FLWs with opportunities for participative decision making can positively influence job satisfaction and perceived quality of care, but only when implemented as bundles of complementary policies and practices. PMID:22224858
Near-infrared fluorescence image quality test methods for standardized performance evaluation
NASA Astrophysics Data System (ADS)
Kanniyappan, Udayakumar; Wang, Bohan; Yang, Charles; Ghassemi, Pejhman; Wang, Quanzeng; Chen, Yu; Pfefer, Joshua
2017-03-01
Near-infrared fluorescence (NIRF) imaging has gained much attention as a clinical method for enhancing visualization of cancers, perfusion and biological structures in surgical applications where a fluorescent dye is monitored by an imaging system. In order to address the emerging need for standardization of this innovative technology, it is necessary to develop and validate test methods suitable for objective, quantitative assessment of device performance. Towards this goal, we develop target-based test methods and investigate best practices for key NIRF imaging system performance characteristics including spatial resolution, depth of field and sensitivity. Characterization of fluorescence properties was performed by generating excitation-emission matrix properties of indocyanine green and quantum dots in biological solutions and matrix materials. A turbid, fluorophore-doped target was used, along with a resolution target for assessing image sharpness. Multi-well plates filled with either liquid or solid targets were generated to explore best practices for evaluating detection sensitivity. Overall, our results demonstrate the utility of objective, quantitative, target-based testing approaches as well as the need to consider a wide range of factors in establishing standardized approaches for NIRF imaging system performance.
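Two of the named image quality metrics, entropy and contrast per pixel, can be computed as in the sketch below; the definitions follow common usage and are assumptions about the paper's exact formulas.

```python
# Target-based image quality metric sketch on an assumed 8-bit image.
import numpy as np

def entropy(img):
    """Shannon entropy of the grey-level histogram, in bits."""
    hist, _ = np.histogram(img, bins=256, range=(0, 255))
    p = hist / hist.sum()
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

def contrast_per_pixel(img):
    """Mean absolute difference between horizontal/vertical neighbours."""
    f = img.astype(float)
    dx = np.abs(np.diff(f, axis=1)).mean()
    dy = np.abs(np.diff(f, axis=0)).mean()
    return 0.5 * (dx + dy)

img = np.random.default_rng(4).integers(0, 256, (128, 128))
print(entropy(img), contrast_per_pixel(img))
```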
Narrative methods in quality improvement research
Greenhalgh, T; Russell, J; Swinglehurst, D
2005-01-01
This paper reviews and critiques the different approaches to the use of narrative in quality improvement research. The defining characteristics of narrative are chronology (unfolding over time); emplotment (the literary juxtaposing of actions and events in an implicitly causal sequence); trouble (that is, harm or the risk of harm); and embeddedness (the personal story nests within a particular social, historical and organisational context). Stories are about purposeful action unfolding in the face of trouble and, as such, have much to offer quality improvement researchers. But the quality improvement report (a story about efforts to implement change), which is common, must be distinguished carefully from narrative-based quality improvement research (focused systematic enquiry that uses narrative methods to generate new knowledge), which is currently rare. We distinguish four approaches to the use of narrative in quality improvement research (narrative interview; naturalistic story gathering; organisational case study; and collective sense-making) and offer a rationale, describe how data can be collected and analysed, and discuss the strengths and limitations of each using examples from the quality improvement literature. Narrative research raises epistemological questions about the nature of narrative truth (characterised by sense-making and emotional impact rather than scientific objectivity), which has implications for how rigour should be defined (and how it might be achieved) in this type of research. We offer some provisional guidance for distinguishing high-quality narrative research in a quality improvement setting from other forms of narrative account such as report, anecdote, and journalism. PMID:16326792
A multiple objective optimization approach to quality control
NASA Technical Reports Server (NTRS)
Seaman, Christopher Michael
1991-01-01
The use of product quality as the performance criterion for manufacturing system control is explored. The goal in manufacturing, for economic reasons, is to optimize product quality. The problem is that since quality is a rather nebulous product characteristic, there is seldom an analytic function that can be used as a measure. Therefore standard control approaches, such as optimal control, cannot readily be applied. A second problem with optimizing product quality is that it is typically measured along many dimensions: there are many aspects of quality which must be optimized simultaneously. Very often these different aspects are incommensurate and competing. The concept of optimality must now include accepting tradeoffs among the different quality characteristics. These problems are addressed using multiple objective optimization. It is shown that the quality control problem can be defined as a multiple objective optimization problem. A controller structure is defined using this as the basis. Then, an algorithm is presented which can be used by an operator to interactively find the best operating point. Essentially, the algorithm uses process data to provide the operator with two pieces of information: (1) if it is possible to simultaneously improve all quality criteria, then determine what changes to the process inputs or controller parameters should be made to do this; and (2) if it is not possible to improve all criteria, and the current operating point is not a desirable one, select a criterion in which a tradeoff should be made, and make input changes to improve all other criteria. The process is not operating at an optimal point in any sense if no tradeoff has to be made to move to a new operating point. This algorithm ensures that operating points are optimal in some sense and provides the operator with information about tradeoffs when seeking the best operating point. The multiobjective algorithm was implemented in two different injection molding scenarios: tuning of process controllers to meet specified performance objectives and tuning of process inputs to meet specified quality objectives. Five case studies are presented.
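The operator-guidance step can be illustrated with a toy one-input, two-criteria process: estimate gradients numerically and check whether a common improvement direction exists. All functions and values here are stand-ins, not the paper's injection molding models.

```python
# Sketch of the tradeoff test: do all quality criteria improve in some
# direction of the input, or must the operator choose a tradeoff?
import numpy as np

def quality(u):
    # two competing quality criteria of one process input u (toy stand-ins)
    return np.array([-(u - 2.0) ** 2, -(u - 3.0) ** 2])  # higher is better

u, h = 2.5, 1e-4
grad = (quality(u + h) - quality(u)) / h    # finite-difference gradients
if np.all(grad > 0) or np.all(grad < 0):
    step = 0.1 * np.sign(grad[0])
    print(f"move input by {step:+.1f}: improves all criteria")
else:
    print("no common improvement direction: a tradeoff must be chosen")
```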
DOE Office of Scientific and Technical Information (OSTI.GOV)
Matthews, Patrick
This Closure Report (CR) presents information supporting the clean closure of Corrective Action Unit (CAU) 412: Clean Slate I Plutonium Dispersion (TTR), located on the Tonopah Test Range, Nevada. CAU 412 consists of a release of radionuclides to the surrounding soil from a storage–transportation test conducted on May 25, 1963. Corrective action investigation (CAI) activities were performed in April and May 2015, as set forth in the Streamlined Approach for Environmental Restoration (SAFER) Plan for Corrective Action Unit 412: Clean Slate I Plutonium Dispersion (TTR), Tonopah Test Range, Nevada; and in accordance with the Soils Activity Quality Assurance Plan. The purpose of the CAI was to fulfill data needs as defined during the data quality objectives process. The CAU 412 dataset of investigation results was evaluated based on a data quality assessment. This assessment demonstrated the dataset is complete and acceptable for use in fulfilling the data needs identified by the data quality objectives process. This CR provides documentation and justification for the clean closure of CAU 412 under the FFACO without further corrective action. This justification is based on historical knowledge of the site, previous site investigations, implementation of the 1997 interim corrective action, and the results of the CAI. The corrective action of clean closure was confirmed as appropriate for closure of CAU 412 based on achievement of the following closure objectives: radiological contamination at the site is less than the final action level using the ground troops exposure scenario (i.e., the radiological dose is less than the final action level); removable alpha contamination is less than the high contamination area criterion; no potential source material is present at the site, and any impacted soil associated with potential source material has been removed so that remaining soil contains contaminants at concentrations less than the final action levels; and there is sufficient information to characterize investigation and remediation waste for disposal.
A DECADE OF HEALTH TECHNOLOGY ASSESSMENT IN POLAND.
Lipska, Iga; McAuslane, Neil; Leufkens, Hubert; Hövels, Anke
2017-01-01
The objective of this study is to illustrate and provide a better understanding of the role of health technology assessment (HTA) processes in decision making for drug reimbursement in Poland, and how this approach could be considered by other countries with limited resources. We analyzed the evolution of the HTA system and processes in Poland over the past decade and current developments, based on publicly available information. The role of HTA in the drug-reimbursement process in Poland has increased substantially over the recent decade, starting in 2005 with the formation of the Agency for Health Technology Assessment and Tariff System (AOTMiT). The key success factors in this development were effective capacity building based on the use of international expertise, the implementation of transparent criteria in the drug reimbursement processes, and the selective approach to the adoption of innovative medicines based on the cost-effectiveness threshold among other criteria. While Poland is regarded as a leader in Central and Eastern Europe, there is room for improvement, especially with regard to the quality of HTA processes and the consistency of HTA guidelines with reimbursement law. In the "pragmatic" HTA model used by AOTMiT, the pharmaceutical company is responsible for the preparation of a reimbursement dossier of good quality in line with HTA guidelines, while the assessment team in AOTMiT is responsible for critical review of that dossier. Adoption of this model may be considered by other countries with limited resources to balance differing priorities and ensure transparent and objective access to medicines for patients who need them.
Improving Evaluation to Address the Unintended Consequences of Health Information Technology:
Ammenwerth, E.; Hyppönen, H.; de Keizer, N.; Nykänen, P.; Rigby, M.; Scott, P.; Talmon, J.; Georgiou, A.
2016-01-01
Summary. Background and objectives: With growing use of IT by healthcare professionals and patients, the opportunity for any unintended effects of technology to disrupt health care processes and outcomes is intensified. The objectives of this position paper by the IMIA Working Group (WG) on Technology Assessment and Quality Development are to highlight how our ongoing initiatives to enhance evaluation are also addressing the unintended consequences of health IT. Methods: Review of WG initiatives. Results: We argue that an evidence-based approach underpinned by rigorous evaluation is fundamental to the safe and effective use of IT, and for detecting and addressing its unintended consequences in a timely manner. We provide an overview of our ongoing initiatives to strengthen study design, execution and reporting by using evaluation frameworks and guidelines which can enable better characterization and monitoring of unintended consequences, including the Good Evaluation Practice Guideline in Health Informatics (GEP-HI) and the Statement on Reporting of Evaluation Studies in Health Informatics (STARE-HI). Indicators to benchmark the adoption and impact of IT can similarly be used to monitor unintended effects on healthcare structures, processes and outcomes. We have also developed EvalDB, a web-based database of evaluation studies to promulgate evidence about unintended effects, and are developing the content for courses to improve training in health IT evaluation. Conclusion: Evaluation is an essential ingredient for the effective use of IT to improve healthcare quality and patient safety. WG resources and skills-development initiatives can facilitate a proactive and evidence-based approach to detecting and addressing the unintended effects of health IT. PMID:27830232
Diagnosis and Prognostic of Wastewater Treatment System Based on Bayesian Network
NASA Astrophysics Data System (ADS)
Li, Dan; Yang, Haizhen; Liang, XiaoFeng
2010-11-01
Wastewater treatment is a complicated and dynamic process. The treatment effect can be influenced by many microbial, chemical and physical variables, and these variables are always uncertain. Due to the complex biological reaction mechanisms and the highly time-varying, multivariable nature of the process, diagnosis and prognosis of wastewater treatment systems remain difficult in practice. The Bayesian network (BN) is one of the best methods for dealing with uncertainty in the artificial intelligence field. Because of its powerful inference ability and convenient decision mechanism, a BN can be employed in model description and influencing-factor analysis of wastewater treatment systems with great flexibility and applicability. In this paper, taking a modified sequencing batch reactor (MSBR) as the analysis object, a BN model was constructed from the influent water quality, operational condition and effluent data of the MSBR, and a novel BN-based approach is proposed to analyze the influencing factors of the wastewater treatment system. The presented approach provides an effective tool for diagnostic and prognostic analysis of the wastewater treatment system. On the basis of the influent water quality and operational conditions, the effluent quality can be predicted. Moreover, from the observed effluent quality, the influent water quality and operational conditions can also be deduced.
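The predict/diagnose pattern can be sketched with a two-node toy network; the paper's BN also conditions on operational variables, and the probabilities here are illustrative assumptions.

```python
# Minimal Bayesian-network sketch: effluent quality depends on influent
# quality (one parent kept for brevity); all probabilities are assumed.
P_influent = {"good": 0.7, "poor": 0.3}
P_ok_given = {"good": 0.95, "poor": 0.60}   # P(effluent ok | influent)

# Prediction: P(effluent ok) = sum_q P(ok | q) P(q)
p_ok = sum(P_ok_given[q] * P_influent[q] for q in P_influent)

# Diagnosis via Bayes' rule: P(influent poor | effluent not ok)
p_poor_bad = (1 - P_ok_given["poor"]) * P_influent["poor"] / (1 - p_ok)
print(f"P(effluent ok)={p_ok:.3f}  P(influent poor | bad effluent)={p_poor_bad:.3f}")
```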
A quantitative approach to measure road network information based on edge diversity
NASA Astrophysics Data System (ADS)
Wu, Xun; Zhang, Hong; Lan, Tian; Cao, Weiwei; He, Jing
2015-12-01
The measurement of map information has been one of the key issues in assessing cartographic quality and map generalization algorithms. It is also important for developing efficient approaches to transfer geospatial information. The road network is the most common linear object in the real world. An approximate description of road network information will benefit road map generalization, navigation map production and urban planning. Most current approaches have focused on node diversity and supposed that all edges are the same, which is inconsistent with real-life conditions and thus shows limitations in measuring network information. As real-life traffic flows are directed and of different quantities, the original undirected vector road map was first converted to a directed topographic connectivity map. Then, in consideration of preferential attachment in complex network studies and the rich-club phenomenon in social networks, the from and to weights of each edge are assigned. The from weight of a given edge is defined as the connectivity of its end node relative to the sum of the connectivities of all the neighbors of the from node of the edge. After obtaining the from and to weights of each edge, the edge information, node information and whole network structure information entropies can be obtained based on information theory. The approach has been applied to several 1-square-mile road network samples. Results show that information entropies based on edge diversity can successfully describe the structural differences of road networks. This approach complements current map information measurements and can be extended to measure other kinds of geographical objects.
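One possible reading of the edge-weight definition, sketched with networkx on a toy directed network; the exact normalisation is an assumption rather than the paper's formula.

```python
# Edge-diversity entropy sketch: weight each edge by its end node's
# connectivity, normalised over the from-node's neighbourhood (assumed form).
import math
import networkx as nx

G = nx.DiGraph([("a", "b"), ("a", "c"), ("b", "c"), ("c", "d")])
deg = dict(G.degree())                       # connectivity proxy (in + out)

weights = []
for u, v in G.edges():
    nbrs = set(G.successors(u)) | set(G.predecessors(u))
    weights.append(deg[v] / sum(deg[n] for n in nbrs))

p = [w / sum(weights) for w in weights]      # normalise to a distribution
H = -sum(q * math.log2(q) for q in p)        # Shannon entropy over edges
print(f"network information entropy: {H:.3f} bits")
```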
Enhancing the quality and credibility of qualitative analysis.
Patton, M Q
1999-01-01
Varying philosophical and theoretical orientations to qualitative inquiry remind us that issues of quality and credibility intersect with audience and intended research purposes. This overview examines ways of enhancing the quality and credibility of qualitative analysis by dealing with three distinct but related inquiry concerns: rigorous techniques and methods for gathering and analyzing qualitative data, including attention to validity, reliability, and triangulation; the credibility, competence, and perceived trustworthiness of the qualitative researcher; and the philosophical beliefs of evaluation users about such paradigm-based preferences as objectivity versus subjectivity, truth versus perspective, and generalizations versus extrapolations. Although this overview examines some general approaches to issues of credibility and data quality in qualitative analysis, it is important to acknowledge that particular philosophical underpinnings, specific paradigms, and special purposes for qualitative inquiry will typically include additional or substitute criteria for assuring and judging quality, validity, and credibility. Moreover, the context for these considerations has evolved. In early literature on evaluation methods the debate between qualitative and quantitative methodologists was often strident. In recent years the debate has softened. A consensus has gradually emerged that the important challenge is to match appropriately the methods to empirical questions and issues, and not to universally advocate any single methodological approach for all problems. PMID:10591279
High Throughput Multispectral Image Processing with Applications in Food Science.
Tsakanikas, Panagiotis; Pavlidis, Dimitris; Nychas, George-John
2015-01-01
Recently, machine vision has been gaining attention in food science as well as in the food industry concerning food quality assessment and monitoring. Within the framework of the implementation of Process Analytical Technology (PAT) in the food industry, image processing can be used not only in the estimation and even prediction of food quality but also in the detection of adulteration. Towards these applications in food science, we present here a novel methodology for automated image analysis of several kinds of food products, e.g. meat, vanilla crème and table olives, so as to increase objectivity, data reproducibility and low-cost information extraction, and to enable faster quality assessment without human intervention. The outcome of image processing is propagated to the downstream analysis. The developed multispectral image processing method is based on an unsupervised machine learning approach (Gaussian Mixture Models) and a novel unsupervised scheme of spectral band selection for segmentation process optimization. Through the evaluation we prove its efficiency and robustness against the currently available semi-manual software, showing that the developed method is a high-throughput approach appropriate for massive data extraction from food samples.
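The GMM segmentation step, minus the paper's band-selection scheme, reduces to clustering pixel spectra; a sketch with synthetic three-band data follows.

```python
# GMM-based pixel segmentation sketch (synthetic multispectral data;
# the paper's spectral band selection step is omitted).
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(5)
# two pixel populations in three spectral bands (e.g., sample vs background)
pixels = np.vstack([rng.normal([0.2, 0.5, 0.3], 0.05, (500, 3)),
                    rng.normal([0.6, 0.2, 0.7], 0.05, (500, 3))])

gmm = GaussianMixture(n_components=2, random_state=0).fit(pixels)
labels = gmm.predict(pixels)                 # unsupervised segmentation
print(np.bincount(labels))                   # pixels assigned per segment
```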
Automatic segmentation of colon glands using object-graphs.
Gunduz-Demir, Cigdem; Kandemir, Melih; Tosun, Akif Burak; Sokmensuer, Cenk
2010-02-01
Gland segmentation is an important step in automating the analysis of biopsies that contain glandular structures. However, this remains a challenging problem, as variation in staining, fixation, and sectioning procedures leads to a considerable amount of artifacts and variance in tissue sections, which may result in large variances in gland appearance. In this work, we report a new approach for gland segmentation. This approach decomposes the tissue image into a set of primitive objects and segments glands by making use of the organizational properties of these objects, which are quantified with the definition of object-graphs. As opposed to the previous literature, the proposed approach employs object-based information for the gland segmentation problem, instead of using pixel-based information alone. Working with images of colon tissues, our experiments demonstrate that the proposed object-graph approach yields high segmentation accuracies for the training and test sets and significantly improves the segmentation performance of its pixel-based counterparts. The experiments also show that the object-based structure of the proposed approach provides more tolerance to artifacts and variances in tissues.
ERIC Educational Resources Information Center
de Jager, H. J.; Nieuwenhuis, F. J.
2005-01-01
South Africa has embarked on a process of education renewal by adopting outcomes-based education (OBE). This paper focuses on the linkages between total quality management (TQM) and the outcomes-based approach in an education context. Quality assurance in academic programmes in higher education in South Africa is, in some instances, based on the…
DOE Office of Scientific and Technical Information (OSTI.GOV)
NONE
2015-06-15
Current quality assurance and quality management guidelines provided by various professional organizations are prescriptive in nature, focusing principally on performance characteristics of planning and delivery devices. However, published analyses of events in radiation therapy show that most events are caused by flaws in clinical processes rather than by device failures. This suggests the need for the development of a quality management program that is based on integrated approaches to process and equipment quality assurance. Industrial engineers have developed various risk assessment tools that are used to identify and eliminate potential failures from a system or a process before a failure impacts a customer. These tools include, but are not limited to, process mapping, failure modes and effects analysis, and fault tree analysis. Task Group 100 of the American Association of Physicists in Medicine has developed these tools and used them to formulate an example risk-based quality management program for intensity-modulated radiotherapy. This is a prospective risk assessment approach that analyzes potential error pathways inherent in a clinical process and then ranks them according to relative risk, typically before implementation, followed by the design of a new process or modification of the existing process. Appropriate controls are then put in place to ensure that failures are less likely to occur and, if they do, they will more likely be detected before they propagate through the process, compromising treatment outcome and causing harm to the patient. Such a prospective approach forms the basis of the work of Task Group 100, which has recently been approved by the AAPM. This session will be devoted to a discussion of these tools and practical examples of how they can be used in a given radiotherapy clinic to develop a risk-based quality management program. Learning Objectives: Learn how to design a process map for a radiotherapy process; learn how to perform failure modes and effects analysis for a given process; learn what fault trees are all about; learn how to design a quality management program based upon the information obtained from process mapping, failure modes and effects analysis and fault tree analysis. Dunscombe: Director, TreatSafely, LLC and Center for the Assessment of Radiological Sciences; Consultant to IAEA and Varian. Thomadsen: President, Center for the Assessment of Radiological Sciences. Palta: Vice President of the Center for the Assessment of Radiological Sciences.
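A minimal sketch of the FMEA ranking step: score each failure mode for occurrence, severity, and detectability, then rank by risk priority number. The failure modes and scores below are invented examples, not TG-100 data.

```python
# FMEA sketch: rank failure modes by risk priority number,
# RPN = occurrence x severity x detectability (1-10 scales; values assumed).
failure_modes = [
    ("wrong patient plan loaded",   2, 9, 6),
    ("MLC calibration drift",       4, 7, 3),
    ("transcription error in dose", 3, 9, 5),
]

ranked = sorted(failure_modes, key=lambda m: m[1] * m[2] * m[3], reverse=True)
for name, o, s, d in ranked:
    print(f"RPN={o * s * d:3d}  {name}")   # highest RPN gets controls first
```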
TU-AB-BRD-03: Fault Tree Analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dunscombe, P.
2015-06-15
DOE Office of Scientific and Technical Information (OSTI.GOV)
Palta, J.
2015-06-15
TU-AB-BRD-02: Failure Modes and Effects Analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Huq, M.
2015-06-15
Triangular Alignment (TAME). A Tensor-based Approach for Higher-order Network Alignment
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mohammadi, Shahin; Gleich, David F.; Kolda, Tamara G.
2015-11-01
Network alignment is an important tool with extensive applications in comparative interactomics. Traditional approaches aim to simultaneously maximize the number of conserved edges and the underlying similarity of aligned entities. We propose a novel formulation of the network alignment problem that extends topological similarity to higher-order structures, and provide a new objective function that maximizes the number of aligned substructures. This objective function corresponds to an integer programming problem, which is NP-hard. Consequently, we approximate this objective function as a surrogate function whose maximization results in a tensor eigenvalue problem. Based on this formulation, we present an algorithm called Triangular AlignMEnt (TAME), which attempts to maximize the number of aligned triangles across networks. We focus on the alignment of triangles because of their enrichment in complex networks; however, our formulation and resulting algorithms can be applied to general motifs. Using a case study on the NAPABench dataset, we show that TAME is capable of producing alignments with up to 99% accuracy in terms of aligned nodes. We further evaluate our method by aligning the yeast and human interactomes. Our results indicate that TAME outperforms state-of-the-art alignment methods in terms of both the biological and topological quality of the alignments.
NASA Astrophysics Data System (ADS)
Seoud, Ahmed; Kim, Juhwan; Ma, Yuansheng; Jayaram, Srividya; Hong, Le; Chae, Gyu-Yeol; Lee, Jeong-Woo; Park, Dae-Jin; Yune, Hyoung-Soon; Oh, Se-Young; Park, Chan-Ha
2018-03-01
Sub-resolution assist feature (SRAF) insertion techniques have been effectively used for a long time now to increase process latitude in the lithography patterning process. Rule-based SRAF and model-based SRAF are complementary solutions, and each has its own benefits, depending on the objectives of applications and the criticality of the impact on manufacturing yield, efficiency, and productivity. Rule-based SRAF provides superior geometric output consistency and faster runtime performance, but the associated recipe development time can be of concern. Model-based SRAF provides better coverage for more complicated pattern structures in terms of shapes and sizes, with considerably less time required for recipe development, although consistency and performance may be impacted. In this paper, we introduce a new model-assisted template extraction (MATE) SRAF solution, which employs decision tree learning in a model-based solution to provide the benefits of both rule-based and model-based SRAF insertion approaches. The MATE solution is designed to automate the creation of rules/templates for SRAF insertion, and is based on the SRAF placement predicted by model-based solutions. The MATE SRAF recipe provides optimum lithographic quality in relation to various manufacturing aspects in a very short time, compared to traditional methods of rule optimization. Experiments were done using memory device pattern layouts to compare the MATE solution to existing model-based SRAF and pixelated SRAF approaches, based on lithographic process window quality, runtime performance, and geometric output consistency.
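A toy version of the MATE idea: use model-based SRAF decisions as labels and fit a decision tree whose learned rules act as fast insertion templates. The features, data, and labeling rule are illustrative assumptions, not the paper's actual recipe.

```python
# Decision-tree sketch: learn rule-like SRAF insertion templates from
# model-based placements (all features and labels are synthetic).
import numpy as np
from sklearn.tree import DecisionTreeClassifier, export_text

rng = np.random.default_rng(6)
# features per candidate site: distance to main feature, local pattern density
X = np.column_stack([rng.uniform(40, 200, 400), rng.uniform(0.1, 0.9, 400)])
# model-based SRAF decision used as the training label (toy rule)
y = ((X[:, 0] > 80) & (X[:, 0] < 160) & (X[:, 1] < 0.5)).astype(int)

tree = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X, y)
print(export_text(tree, feature_names=["distance_nm", "density"]))  # readable rules
```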
SYMPOSIUM REPORT: An Evidence-Based Approach to IBS and CIC: Applying New Advances to Daily Practice
Chey, William D.
2017-01-01
Many nonpharmacologic and pharmacologic therapies are available to manage irritable bowel syndrome (IBS) and chronic idiopathic constipation (CIC). The American College of Gastroenterology (ACG) regularly publishes reviews on IBS and CIC therapies. The most recent of these reviews was published by the ACG Task Force on the Management of Functional Bowel Disorders in 2014. The key objective of this review was to evaluate the efficacy of therapies for IBS or CIC compared with placebo or no treatment in randomized controlled trials. Evidence-based approaches to managing diarrhea-predominant IBS include dietary measures, such as a diet low in gluten and fermentable oligo-, di-, and monosaccharides and polyols (FODMAPs); loperamide; antispasmodics; peppermint oil; probiotics; tricyclic antidepressants; alosetron; eluxadoline, and rifaximin. Evidence-based approaches to managing constipation-predominant IBS and CIC include fiber, stimulant laxatives, polyethylene glycol, selective serotonin reuptake inhibitors, lubiprostone, and guanylate cyclase agonists. With the growing evidence base for IBS and CIC therapies, it has become increasingly important for clinicians to assess the quality of evidence and understand how to apply it to the care of individual patients. PMID:28729815
Quality of clinical trials: A moving target
Bhatt, Arun
2011-01-01
Quality of clinical trials depends on data integrity and subject protection. Globalization, outsourcing and the increasing complexity of clinical trials have made the target of achieving global quality challenging. The quality, as judged by regulatory inspections of investigator sites, sponsors/contract research organizations and Institutional Review Boards, has been of concern to the US Food and Drug Administration, as there has been hardly any change in the frequency and nature of common deficiencies. To meet regulatory expectations, sponsors need to improve quality by developing systems with specific standards for each clinical trial process. The quality systems include: personnel roles and responsibilities, training, policies and procedures, quality assurance and auditing, document management, record retention, and reporting and corrective and preventive action. With the objective of improving quality, the FDA has planned new inspection approaches such as risk-based inspections, surveillance inspections, real-time oversight, and audits of sponsor quality systems. The FDA has partnered with Duke University for the Clinical Trials Transformation Initiative, which will conduct research projects on design principles, data quality and quantity including monitoring, study start-up, and adverse event reporting. These recent initiatives will go a long way in improving the quality of clinical trials. PMID:22145122
DOE Office of Scientific and Technical Information (OSTI.GOV)
NONE
Quality and safety in healthcare are inextricably linked. There are compelling data that link poor quality radiation therapy to inferior patient survival. Radiation Oncology clinical trial protocol deviations often involve incorrect target volume delineation or dosing, akin to radiotherapy incidents, which also often involve partial geometric miss or improper radiation dosing. When patients with radiation protocol variations are compared to those without significant protocol variations, clinical outcome is negatively impacted. Traditionally, quality assurance in radiation oncology has been driven largely by new technological advances, and safety improvement has been driven by reactive responses to past system failures and prescriptive mandates recommended by professional organizations and promulgated by regulators. Prescriptive approaches to quality and safety alone often do not address the huge variety of process and technique used in radiation oncology. Risk-based assessments of radiotherapy processes provide a mechanism to enhance quality and safety, both for new and for established techniques. It is imperative that we explore such a paradigm shift at this time, when expectations from patients as well as providers are rising while available resources are falling. There is much we can learn from our past experiences to be applied towards the new risk-based assessments. Learning Objectives: (1) understand the impact of clinical and technical quality on outcomes; (2) understand the importance of quality care in radiation oncology; (3) learn to assess the impact of quality on clinical outcomes. D. Followill, NIH Grant CA180803.
Hurtado-Chong, Anahí; Joeris, Alexander; Hess, Denise; Blauth, Michael
2017-07-12
A considerable number of clinical studies experience delays, which result in increased duration and costs. In multicentre studies, patient recruitment is among the leading causes of delays. Poor site selection can result in low recruitment and bad data quality. Site selection is therefore crucial for study quality and completion, but currently no specific guidelines are available. Selection of sites adequate to participate in a prospective multicentre cohort study was performed through an open call using a newly developed objective multistep approach. The method is based on use of a network, definition of objective criteria and a systematic screening process. Out of 266 interested sites, 24 were shortlisted and finally 12 sites were selected to participate in the study. The steps in the process included an open call through a network, use of selection questionnaires tailored to the study, evaluation of responses using objective criteria and scripted telephone interviews. At each step, the number of candidate sites was quickly reduced, leaving only the most promising candidates. Recruitment and quality of data went according to expectations, in spite of the contracting problems encountered with some sites. The results of our first experience with a standardised and objective method of site selection are encouraging. The site selection method described here can serve as a guideline for other researchers performing multicentre studies. ClinicalTrials.gov: NCT02297581. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2017. All rights reserved. No commercial use is permitted unless otherwise expressly granted.
NASA Astrophysics Data System (ADS)
Lafon, Jose J.
Foreign Object Debris/Damage (FOD) has been a costly issue for commercial and military aircraft manufacturers at their production lines every day. FOD can put the lives of pilots, passengers and crews at high risk. FOD refers to any type of foreign object, particle, debris or agent in the manufacturing environment that could contaminate or damage the product or otherwise undermine quality standards. FOD is currently addressed with prevention programs, elimination techniques, designation of FOD areas, controlled access to FOD areas, restrictions on personal items entering designated areas, tool accountability, etc. These efforts have not shown a significant reduction in FOD occurrence in the manufacturing processes. This research presents a Decision Making Model approach based on a logistic regression predictive model previously developed by other researchers. With a general idea of the FOD expected, elimination plans can be put in place to start eradicating the problem, minimizing the cost and time spent on the prediction, detection and/or removal of FOD.
Review of quality assessment tools for the evaluation of pharmacoepidemiological safety studies
Neyarapally, George A; Hammad, Tarek A; Pinheiro, Simone P; Iyasu, Solomon
2012-01-01
Objectives Pharmacoepidemiological studies are an important hypothesis-testing tool in the evaluation of postmarketing drug safety. Despite the potential to produce robust value-added data, interpretation of findings can be hindered due to well-recognised methodological limitations of these studies. Therefore, assessment of their quality is essential to evaluating their credibility. The objective of this review was to evaluate the suitability and relevance of available tools for the assessment of pharmacoepidemiological safety studies. Design We created an a priori assessment framework consisting of reporting elements (REs) and quality assessment attributes (QAAs). A comprehensive literature search identified distinct assessment tools and the prespecified elements and attributes were evaluated. Primary and secondary outcome measures The primary outcome measure was the percentage representation of each domain, RE and QAA for the quality assessment tools. Results A total of 61 tools were reviewed. Most tools were not designed to evaluate pharmacoepidemiological safety studies. More than 50% of the reviewed tools considered REs under the research aims, analytical approach, outcome definition and ascertainment, study population and exposure definition and ascertainment domains. REs under the discussion and interpretation, results and study team domains were considered in less than 40% of the tools. Except for the data source domain, quality attributes were considered in less than 50% of the tools. Conclusions Many tools failed to include critical assessment elements relevant to observational pharmacoepidemiological safety studies and did not distinguish between REs and QAAs. Further, there is a lack of considerations on the relative weights of different domains and elements. The development of a quality assessment tool would facilitate consistent, objective and evidence-based assessments of pharmacoepidemiological safety studies. PMID:23015600
Incorporating uncertainty and motion in Intensity Modulated Radiation Therapy treatment planning
NASA Astrophysics Data System (ADS)
Martin, Benjamin Charles
In radiation therapy, one seeks to destroy a tumor while minimizing the damage to surrounding healthy tissue. Intensity Modulated Radiation Therapy (IMRT) uses overlapping beams of x-rays that add up to a high dose within the target and a lower dose in the surrounding healthy tissue. IMRT relies on optimization techniques to create high quality treatments. Unfortunately, the possible conformality is limited by the need to ensure coverage even if there is organ movement or deformation. Currently, margins are added around the tumor to ensure coverage based on an assumed motion range. This approach does not ensure high quality treatments. In the standard IMRT optimization problem, an objective function measures the deviation of the dose from the clinical goals. The optimization then finds the beamlet intensities that minimize the objective function. When modeling uncertainty, the dose delivered from a given set of beamlet intensities is a random variable. Thus the objective function is also a random variable. In our stochastic formulation we minimize the expected value of this objective function. We developed a problem formulation that is both flexible and fast enough for use on real clinical cases. While working on accelerating the stochastic optimization, we developed a technique of voxel sampling. Voxel sampling is a randomized algorithms approach to a steepest descent problem based on estimating the gradient by only calculating the dose to a fraction of the voxels within the patient. When combined with an automatic sampling rate adaptation technique, voxel sampling produced an order of magnitude speed up in IMRT optimization. We also develop extensions of our results to Intensity Modulated Proton Therapy (IMPT). Due to the physics of proton beams the stochastic formulation yields visibly different and better plans than normal optimization. The results of our research have been incorporated into a software package OPT4D, which is an IMRT and IMPT optimization tool that we developed.
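A minimal sketch of the voxel-sampling idea, under our own assumptions rather than the OPT4D implementation: estimate the gradient of a quadratic dose objective from a random fraction of voxels, rescaled to stay unbiased, and run steepest descent on the beamlet intensities.

    import numpy as np

    rng = np.random.default_rng(1)
    n_voxels, n_beamlets = 2000, 50
    D = rng.random((n_voxels, n_beamlets)) * 0.1   # toy dose-influence matrix
    d_goal = np.full(n_voxels, 2.0)                # prescribed dose per voxel

    def sampled_gradient(x, frac=0.1):
        # Unbiased estimate of the gradient of mean squared dose error,
        # computed from a random fraction of the voxels and rescaled by 1/frac.
        idx = rng.choice(n_voxels, size=int(frac * n_voxels), replace=False)
        residual = D[idx] @ x - d_goal[idx]
        return (2.0 / frac) * (D[idx].T @ residual) / n_voxels

    x = np.zeros(n_beamlets)
    for _ in range(200):                           # plain steepest descent
        x = np.maximum(x - 0.5 * sampled_gradient(x), 0.0)   # keep intensities >= 0
    print("mean absolute dose error:", np.abs(D @ x - d_goal).mean())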
An adaptive framework to differentiate receiving water quality impacts on a multi-scale level.
Blumensaat, F; Tränckner, J; Helm, B; Kroll, S; Dirckx, G; Krebs, P
2013-01-01
The paradigm shift in recent years towards sustainable and coherent water resources management on a river basin scale has changed the subject of investigations to a multi-scale problem, representing a great challenge for all actors participating in the management process. In this regard, planning engineers often face an inherent conflict in providing reliable decision support for complex questions with a minimum of effort. This trend inevitably increases the risk of basing decisions upon uncertain and unverified conclusions. This paper proposes an adaptive framework for integral planning that combines several concepts (flow balancing, water quality monitoring, process modelling, multi-objective assessment) to systematically evaluate management strategies for water quality improvement. As a key element, an S/P matrix is introduced to structure the differentiation of relevant 'pressures' in affected regions, i.e. 'spatial units', which helps in handling complexity. The framework is applied to a small, but typical, catchment in Flanders, Belgium. The application to the real-life case shows: (1) the proposed approach is adaptive, covers problems of different spatial and temporal scale, efficiently reduces complexity and finally leads to a transparent solution; and (2) water quality and emission-based performance evaluation must be done jointly, as an emission-based performance improvement does not necessarily lead to an improved water quality status, and an assessment solely focusing on water quality criteria may mask non-compliance with emission-based standards. Recommendations derived from the theoretical analysis have been put into practice.
NASA Astrophysics Data System (ADS)
Boersma, Christiaan
We propose to quantitatively calibrate the PAH band strength ratios that have traditionally been used as qualitative proxies of PAH properties, linking PAH observables with local astrophysical conditions and thus developing PAHs into quantitative probes of astronomical environments. This will culminate in a toolbox (calibration charts) that can be used by PAH experts and non-PAH experts alike to unlock the information hidden in PAH emission sources that are part of the Spitzer and ISO archives. Furthermore, the proposed work is critical to mine the treasure trove of information JWST will return, as it will capture, for the first time, the complete mid-infrared (IR) PAH spectrum with fully resolved features, through a single aperture, and along single lines-of-sight; making it possible to fully extract the information contained in the PAH spectra. In short, the work proposed here represents a major step in enabling the astronomical PAH model to reach its full potential as a diagnostic of the physical and chemical conditions in objects spanning the Universe. Polycyclic aromatic hydrocarbons (PAHs), a common and important reservoir of accessible carbon across the Universe, play an intrinsic part in the formation of stars, planets and possibly even life itself. While most PAH spectra appear quite similar, they differ in detail and contain a wealth of untapped information. Thanks to recent advances in laboratory studies and computer-based calculations of PAH spectra, the majority of which have been made at NASA Ames, coupled with the astronomical modeling tools we have developed, we can interpret the spectral details at levels never before possible. This enables us to extract local physical conditions and track subtle changes in these conditions at levels previously impossible. Building upon the tools and paradigms developed as part of the publicly available NASA Ames PAH IR Spectroscopic Database (PAHdb; www.astrochem.org/pahdb/), the purpose of our proposed research is to extend and test the applicability of the PAH proxy (band strength ratio) calibrations we have developed, which are based on a single object, the reflection nebula (RN) NGC7023, to, and within, a variety of objects, each representing a different type of astronomical environment. Starting with the results for NGC7023, our initial focus will be placed on other RNe for which high-quality Spitzer spectral maps are available. After this, the focus will shift to Spitzer and ISO catalogs holding PAH spectra from different object types and extragalactic sources at different quality levels. We will first fit the astronomical spectra using the PAH spectra and tools in PAHdb, a database and toolset developed by the proposers and perfectly suited for dealing with large spectral data sets. This approach quantitatively breaks down the emission into the different subclasses of PAH size, charge, structure and composition. Following this, the data will be analyzed using the traditional, qualitative, proxy approach in which the PAH bands are isolated and their strengths measured. Combining the results of these two approaches enables us to test the validity of, and to quantitatively calibrate, the PAH proxies that have been traditionally used to probe astronomical environments, and make a quantitative link between PAH observables and local astrophysical conditions.
Previous work on NGC7023 demonstrated the potential of this approach, and applying it to different object types at varying quality levels will establish whether this approach holds in general or if adjustments must be made to tackle the full range of PAH-emitting astronomical environments. In parallel, we will perform stability analysis on the fits; establish quality requirements for spectral resolution, spectral range, and signal-to-noise; and make uncertainty estimates for the derived parameters. This is of particular importance for extragalactic sources, as it will establish a data quality threshold.
Danker-Hopfe, Heidi; Dorn, Hans; Bornkessel, Christian; Sauter, Cornelia
2010-01-01
The aim of the present double-blind, sham-controlled, balanced randomized cross-over study was to disentangle effects of electromagnetic fields (EMF) and non-EMF effects of mobile phone base stations on objective and subjective sleep quality. In total 397 residents aged 18-81 years (50.9% female) from 10 German sites, where no mobile phone service was available, were exposed to sham and GSM (Global System for Mobile Communications, 900 MHz and 1,800 MHz) base station signals by an experimental base station while their sleep was monitored at their homes during 12 nights. Participants were randomly exposed to real (GSM) or sham exposure for five nights each. Individual measurement of EMF exposure, questionnaires on sleep disorders, overall sleep quality, attitude towards mobile communication, and on subjective sleep quality (morning and evening protocols) as well as objective sleep data (frontal EEG and EOG recordings) were gathered. Analysis of the subjective and objective sleep data did not reveal any significant differences between the real and sham condition. During sham exposure nights, objective and subjective sleep efficiency, wake after sleep onset, and subjective sleep latency were significantly worse in participants with concerns about possible health risks resulting from base stations than in participants who were not concerned. The study did not provide any evidence for short-term physiological effects of EMF emitted by mobile phone base stations on objective and subjective sleep quality. However, the results indicate that mobile phone base stations as such (not the electromagnetic fields) may have a significant negative impact on sleep quality. (c) 2010 Wiley-Liss, Inc.
Rapid alignment of nanotomography data using joint iterative reconstruction and reprojection.
Gürsoy, Doğa; Hong, Young P; He, Kuan; Hujsak, Karl; Yoo, Seunghwan; Chen, Si; Li, Yue; Ge, Mingyuan; Miller, Lisa M; Chu, Yong S; De Andrade, Vincent; He, Kai; Cossairt, Oliver; Katsaggelos, Aggelos K; Jacobsen, Chris
2017-09-18
As x-ray and electron tomography is pushed further into the nanoscale, the limitations of rotation stages become more apparent, leading to challenges in the alignment of the acquired projection images. Here we present an approach for rapid post-acquisition alignment of these projections to obtain high quality three-dimensional images. Our approach is based on a joint estimation of alignment errors, and the object, using an iterative refinement procedure. With simulated data where we know the alignment error of each projection image, our approach shows a residual alignment error that is a factor of a thousand smaller, and it reaches the same error level in the reconstructed image in less than half the number of iterations. We then show its application to experimental data in x-ray and electron nanotomography.
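A translation-only 1D toy (our own sketch; the published method handles full tomographic geometry) captures the alternation: reconstruct from the currently aligned measurements, then re-register each measurement against the reprojection of the current estimate.

    import numpy as np

    rng = np.random.default_rng(2)
    n = 128
    truth = np.exp(-0.5 * ((np.arange(n) - 64) / 6.0) ** 2)   # toy 1D "object"
    true_shifts = rng.integers(-8, 9, size=20)                # unknown jitter
    meas = [np.roll(truth, s) + 0.02 * rng.standard_normal(n) for s in true_shifts]

    def register(sig, ref):
        # Integer shift of `sig` relative to `ref` via circular cross-correlation.
        xc = np.fft.ifft(np.fft.fft(sig) * np.conj(np.fft.fft(ref))).real
        k = int(np.argmax(xc))
        return k if k <= n // 2 else k - n

    shifts = np.zeros(len(meas), dtype=int)       # initial guess: no misalignment
    for _ in range(10):                           # joint iterative refinement
        recon = np.mean([np.roll(m, -s) for m, s in zip(meas, shifts)], axis=0)
        shifts = np.array([register(m, recon) for m in meas])
    err = shifts - true_shifts                    # a constant offset is unavoidable
    print("shift error spread (0 = perfect up to a global offset):",
          int(err.max() - err.min()))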
Blog and Podcast Watch: Pediatric Emergency Medicine.
Zaver, Fareen; Hansen, Michael; Leibner, Evan; Little, Andrew; Lin, Michelle
2016-09-01
By critically appraising open access, educational blogs and podcasts in emergency medicine (EM) using an objective scoring instrument, this installment of the ALiEM (Academic Life in Emergency Medicine) Blog and Podcast Watch series curated and scored relevant posts in the specific areas of pediatric EM. The Approved Instructional Resources - Professional (AIR-Pro) series is a continuously building curriculum covering a new subject area every two months. For each area, six EM chief residents identify 3-5 advanced clinical questions. Using FOAMsearch.net to search blogs and podcasts, relevant posts are scored by eight reviewers from the AIR-Pro Board, which is comprised of EM faculty and chief residents at various institutions. The scoring instrument contains five measurement outcomes based on 7-point Likert scales: recency, accuracy, educational utility, evidence based, and references. The AIR-Pro label is awarded to posts with a score of ≥26 (out of 35) points. An "Honorable Mention" label is awarded if Board members collectively felt that the posts were valuable and the scores were > 20. We included a total of 41 blog posts and podcasts. Key educational pearls from the 10 high quality AIR-Pro posts and four Honorable Mentions are summarized. The WestJEM ALiEM Blog and Podcast Watch series is based on the AIR and AIR-Pro series, which attempts to identify high quality educational content on open-access blogs and podcasts. Until more objective quality indicators are developed for learners and educators, this series provides an expert-based, crowdsourced approach towards critically appraising educational social media content for EM clinicians.
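A toy illustration of the scoring rule described above, assuming per-reviewer totals are averaged (the abstract does not state the exact aggregation, and Honorable Mentions additionally require a qualitative board decision):

    def air_pro_label(reviewer_totals):
        # reviewer_totals: one total per reviewer, each the sum of five 7-point
        # outcome scores (5..35); the post label follows the averaged total.
        avg = sum(reviewer_totals) / len(reviewer_totals)
        if avg >= 26:
            return "AIR-Pro"
        if avg > 20:
            return "Honorable Mention candidate (subject to board judgment)"
        return "not included"

    print(air_pro_label([28, 27, 30, 25, 26, 29, 27, 28]))   # -> AIR-Pro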
Comparison of methods for quantitative evaluation of endoscopic distortion
NASA Astrophysics Data System (ADS)
Wang, Quanzeng; Castro, Kurt; Desai, Viraj N.; Cheng, Wei-Chung; Pfefer, Joshua
2015-03-01
Endoscopy is a well-established paradigm in medical imaging, and emerging endoscopic technologies such as high resolution, capsule and disposable endoscopes promise significant improvements in effectiveness, as well as patient safety and acceptance of endoscopy. However, the field lacks practical standardized test methods to evaluate key optical performance characteristics (OPCs), in particular the geometric distortion caused by fisheye lens effects in clinical endoscopic systems. As a result, it has been difficult to evaluate an endoscope's image quality or assess its changes over time. The goal of this work was to identify optimal techniques for objective, quantitative characterization of distortion that are effective and not burdensome. Specifically, distortion measurements from a commercially available distortion evaluation/correction software package were compared with a custom algorithm based on a local magnification (ML) approach. Measurements were performed using a clinical gastroscope to image square grid targets. Recorded images were analyzed with the ML approach and the commercial software where the results were used to obtain corrected images. Corrected images based on the ML approach and the software were compared. The study showed that the ML method could assess distortion patterns more accurately than the commercial software. Overall, the development of standardized test methods for characterizing distortion and other OPCs will facilitate development, clinical translation, manufacturing quality and assurance of performance during clinical use of endoscopic technologies.
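A hedged sketch of a local-magnification (ML) style analysis, with the corner detection step omitted and a synthetic barrel distortion standing in for a real gastroscope image: compare the local grid spacing at each intersection with the spacing at the image center.

    import numpy as np

    def ml_map(grid_pts):
        # grid_pts: (rows, cols, 2) detected grid intersections in pixels.
        # Local spacing at each interior point = mean distance to its 4 neighbours,
        # normalized by the spacing at the (assumed distortion-free) centre.
        d_x = np.linalg.norm(np.diff(grid_pts, axis=1), axis=-1)   # (rows, cols-1)
        d_y = np.linalg.norm(np.diff(grid_pts, axis=0), axis=-1)   # (rows-1, cols)
        local = 0.25 * (d_x[1:-1, :-1] + d_x[1:-1, 1:]
                        + d_y[:-1, 1:-1] + d_y[1:, 1:-1])
        centre = local[local.shape[0] // 2, local.shape[1] // 2]
        return local / centre

    # Synthetic barrel-distorted image of a square grid (fisheye-like behaviour).
    R = C = 11
    u, v = np.meshgrid(np.linspace(-1, 1, C), np.linspace(-1, 1, R))
    r2 = u**2 + v**2
    k = -0.15                                   # barrel distortion coefficient
    pts = np.stack([u * (1 + k * r2), v * (1 + k * r2)], axis=-1) * 500
    ml = ml_map(pts)
    print("ML at corner vs centre:", round(ml[0, 0], 3), "vs 1.0")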
Do We Know Whether Researchers and Reviewers are Estimating Risk and Benefit Accurately?
Hey, Spencer Phillips; Kimmelman, Jonathan
2016-10-01
Accurate estimation of risk and benefit is integral to good clinical research planning, ethical review, and study implementation. Some commentators have argued that various actors in clinical research systems are prone to biased or arbitrary risk/benefit estimation. In this commentary, we suggest the evidence supporting such claims is very limited. Most prior work has imputed risk/benefit beliefs based on past behavior or goals, rather than directly measuring them. We describe an approach - forecast analysis - that would enable direct and effective measure of the quality of risk/benefit estimation. We then consider some objections and limitations to the forecasting approach. © 2016 John Wiley & Sons Ltd.
Improving oil classification quality from oil spill fingerprint beyond six sigma approach.
Juahir, Hafizan; Ismail, Azimah; Mohamed, Saiful Bahri; Toriman, Mohd Ekhwan; Kassim, Azlina Md; Zain, Sharifuddin Md; Ahmad, Wan Kamaruzaman Wan; Wah, Wong Kok; Zali, Munirah Abdul; Retnam, Ananthy; Taib, Mohd Zaki Mohd; Mokhtar, Mazlin
2017-07-15
This study involves the use of quality engineering in oil spill classification based on oil spill fingerprinting from GC-FID and GC-MS, employing the six-sigma approach. The oil spills were recovered from various water areas of Peninsular Malaysia and Sabah (East Malaysia). The study used six sigma methodologies that effectively serve as the problem-solving tool in classifying oils extracted from the complex mixtures of the spilled-oil dataset. Linking six sigma analysis with quality engineering improved organizational performance in achieving the objectives of environmental forensics. The study reveals that oil spills are discriminated into four groups, viz. diesel, hydrocarbon fuel oil (HFO), mixture of oil lubricant and fuel oil (MOLFO) and waste oil (WO), according to the similarity of their intrinsic chemical properties. Validation confirmed that the four discriminant components (diesel, HFO, MOLFO and WO) dominate the oil types with a total variance of 99.51%, with ANOVA giving F stat > F critical at the 95% confidence level and a Chi-square goodness-of-fit test value of 74.87. The results of this study reveal that by employing the six-sigma approach in a data-driven problem such as oil spill classification, good decision making can be expedited. Copyright © 2017. Published by Elsevier Ltd.
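As a hedged sketch of the discriminant-analysis step (the chromatographic features below are synthetic stand-ins for GC-FID/GC-MS peak data, not the study's dataset), one could separate the four oil groups with linear discriminant analysis and check accuracy by cross-validation:

    import numpy as np
    from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
    from sklearn.model_selection import cross_val_score

    rng = np.random.default_rng(3)
    groups = ["diesel", "HFO", "MOLFO", "WO"]
    # 40 samples per group, 6 synthetic peak-ratio features per sample.
    X = np.vstack([rng.normal(loc=i, scale=0.5, size=(40, 6)) for i in range(4)])
    y = np.repeat(groups, 40)

    lda = LinearDiscriminantAnalysis(n_components=3)   # at most n_classes - 1
    print("cross-validated accuracy:", cross_val_score(lda, X, y, cv=5).mean())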
A mass-density model can account for the size-weight illusion.
Wolf, Christian; Bergmann Tiest, Wouter M; Drewing, Knut
2018-01-01
When judging the heaviness of two objects with equal mass, people perceive the smaller and denser of the two as being heavier. Despite the large number of theories, covering bottom-up and top-down approaches, none of them can fully account for all aspects of this size-weight illusion and thus for human heaviness perception. Here we propose a new maximum-likelihood estimation model which describes the illusion as the weighted average of two heaviness estimates with correlated noise: One estimate derived from the object's mass, and the other from the object's density, with estimates' weights based on their relative reliabilities. While information about mass can directly be perceived, information about density will in some cases first have to be derived from mass and volume. However, according to our model at the crucial perceptual level, heaviness judgments will be biased by the objects' density, not by its size. In two magnitude estimation experiments, we tested model predictions for the visual and the haptic size-weight illusion. Participants lifted objects which varied in mass and density. We additionally varied the reliability of the density estimate by varying the quality of either visual (Experiment 1) or haptic (Experiment 2) volume information. As predicted, with increasing quality of volume information, heaviness judgments were increasingly biased towards the object's density: Objects of the same density were perceived as more similar and big objects were perceived as increasingly lighter than small (denser) objects of the same mass. This perceived difference increased with an increasing difference in density. In an additional two-alternative forced choice heaviness experiment, we replicated that the illusion strength increased with the quality of volume information (Experiment 3). Overall, the results highly corroborate our model, which seems promising as a starting point for a unifying framework for the size-weight illusion and human heaviness perception.
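The core of the model is standard reliability-weighted cue combination; a minimal sketch follows (the full model also handles correlated noise between the two estimates, omitted here, and the units are arbitrary normalized heaviness).

    def combined_heaviness(mass_est, dens_est, var_mass, var_dens):
        # Reliability r = 1 / variance; the more reliable cue gets more weight.
        r_m, r_d = 1.0 / var_mass, 1.0 / var_dens
        w = r_m / (r_m + r_d)
        return w * mass_est + (1.0 - w) * dens_est

    # Two objects of equal mass; the smaller one is denser (higher density estimate).
    # Poor volume information -> unreliable density cue -> both feel about the same;
    # good volume information -> the small, dense object feels heavier.
    for var_dens, label in [(10.0, "poor volume info"), (0.5, "good volume info")]:
        big = combined_heaviness(1.0, 0.6, 1.0, var_dens)
        small = combined_heaviness(1.0, 1.5, 1.0, var_dens)
        print(f"{label}: big={big:.2f}, small={small:.2f}")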
Tsai, Yu Hsin; Stow, Douglas; Weeks, John
2013-01-01
The goal of this study was to map and quantify the number of newly constructed buildings in Accra, Ghana between 2002 and 2010 based on high spatial resolution satellite image data. Two semi-automated feature detection approaches for detecting and mapping newly constructed buildings based on QuickBird very high spatial resolution satellite imagery were analyzed: (1) post-classification comparison; and (2) bi-temporal layerstack classification. Feature Analyst software based on a spatial contextual classifier and ENVI Feature Extraction that uses a true object-based image analysis approach of image segmentation and segment classification were evaluated. Final map products representing new building objects were compared and assessed for accuracy using two object-based accuracy measures, completeness and correctness. The bi-temporal layerstack method generated more accurate results compared to the post-classification comparison method due to less confusion with background objects. The spectral/spatial contextual approach (Feature Analyst) outperformed the true object-based feature delineation approach (ENVI Feature Extraction) due to its ability to more reliably delineate individual buildings of various sizes. Semi-automated, object-based detection followed by manual editing appears to be a reliable and efficient approach for detecting and enumerating new building objects. A bivariate regression analysis was performed using neighborhood-level estimates of new building density regressed on a census-derived measure of socio-economic status, yielding an inverse relationship with R2 = 0.31 (n = 27; p = 0.00). The primary utility of the new building delineation results is to support spatial analyses of land cover and land use and demographic change. PMID:24415810
Quantifying quality in DNA self-assembly
Wagenbauer, Klaus F.; Wachauf, Christian H.; Dietz, Hendrik
2014-01-01
Molecular self-assembly with DNA is an attractive route for building nanoscale devices. The development of sophisticated and precise objects with this technique requires detailed experimental feedback on the structure and composition of assembled objects. Here we report a sensitive assay for the quality of assembly. The method relies on measuring the content of unpaired DNA bases in self-assembled DNA objects using a fluorescent de-Bruijn probe for three-base ‘codons’, which enables a comparison with the designed content of unpaired DNA. We use the assay to measure the quality of assembly of several multilayer DNA origami objects and illustrate the use of the assay for the rational refinement of assembly protocols. Our data suggests that large and complex objects like multilayer DNA origami can be made with high strand integration quality up to 99%. Beyond DNA nanotechnology, we speculate that the ability to discriminate unpaired from paired nucleic acids in the same macromolecule may also be useful for analysing cellular nucleic acids. PMID:24751596
Total variation optimization for imaging through turbid media with transmission matrix
NASA Astrophysics Data System (ADS)
Gong, Changmei; Shao, Xiaopeng; Wu, Tengfei; Liu, Jietao; Zhang, Jianqi
2016-12-01
With the transmission matrix (TM) of the whole optical system measured, the image of an object behind a turbid medium can be recovered from its speckle field by means of an image reconstruction algorithm. Instead of the Tikhonov regularization algorithm (TRA), total variation minimization by augmented Lagrangian and alternating direction algorithms (TVAL3) is introduced to recover object images. As a total variation (TV)-based approach, TVAL3 damps noise more effectively and preserves edges better than TRA, thus providing superior image quality. Different levels of detector noise and TM-measurement noise were successively added to analyze the anti-noise performance of the two algorithms. Simulation results show that TVAL3 is able to recover more details and suppress more noise than TRA under different noise levels, thus providing much better image quality. Furthermore, whether for detector noise or TM-measurement noise, the reconstructed images obtained by TVAL3 at SNR=15 dB are far superior to those obtained by TRA at SNR=50 dB.
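A 1D toy comparison, not TVAL3 itself: recover a piecewise-constant object from measurements through a random stand-in transmission matrix, contrasting Tikhonov's closed form with a simple smoothed-TV gradient descent. TV preserves the edges that Tikhonov smears, which is the behaviour exploited above.

    import numpy as np

    rng = np.random.default_rng(4)
    n, m = 100, 150
    x_true = np.zeros(n); x_true[30:60] = 1.0      # piecewise-constant "object"
    T = rng.standard_normal((m, n)) / np.sqrt(m)   # stand-in transmission matrix
    y = T @ x_true + 0.05 * rng.standard_normal(m)

    lam = 0.05
    # Tikhonov (TRA): closed-form ridge solution.
    x_tik = np.linalg.solve(T.T @ T + lam * np.eye(n), T.T @ y)

    def tv_grad(x, eps=1e-3):
        # Gradient of the smoothed total variation sum_i sqrt((x_{i+1}-x_i)^2 + eps).
        d = np.diff(x)
        g = d / np.sqrt(d ** 2 + eps)
        out = np.zeros_like(x); out[:-1] -= g; out[1:] += g
        return out

    x_tv = x_tik.copy()
    for _ in range(2000):                          # plain gradient descent on TV model
        x_tv -= 0.1 * (2 * T.T @ (T @ x_tv - y) + lam * tv_grad(x_tv))
    print("Tikhonov error:", round(np.linalg.norm(x_tik - x_true), 3),
          "| TV error:", round(np.linalg.norm(x_tv - x_true), 3))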
Del Fante, Peter; Allan, Don; Babidge, Elizabeth
2006-01-01
The Practice Health Atlas (PHA) is a decision support tool for general practice, designed by the Adelaide Western Division of General Practice (AWDGP). This article describes the features of the PHA and its potential role in enhancing health care. In developing the PHA, the AWDGP utilises a range of software tools and consults with a practice to understand its clinical data management approach. The PHA comprises three sections: epidemiology; business and clinical modelling systems; and access to services. The objectives include developing a professional culture around quality health data and synthesis of aggregated de-identified general practice data at both practice and divisional level (and beyond) to assist with local health needs assessment, planning, and funding. Evaluation occurs through group feedback sessions and feedback from the general practitioners and staff. The PHA has demonstrated its potential to fulfill the objectives in outcome areas such as data quality and management, team-based care, pro-active practice population health care, and business systems development, thereby contributing to improved patient health outcomes.
[Clinical Management: Basics and organization].
Torres, Juan; Mingo, Carlos
2015-01-01
Many strategies have been proposed over recent years to ensure the sustainability of the health care system, particularly after the recent global economic crisis. One of the most attractive approaches is clinical management, a way of organizing health care units based on the active participation of professionals, who receive a transfer of responsibilities in order to achieve the unit's objectives, with the mission of ensuring proper patient-centered care while taking into consideration the rational use of resources (efficiency). To start up health care structures based on clinical management, a pre-existing management culture is necessary within the departments involved and the center's executive board. Furthermore, to achieve the proposed objectives, various tools must be used, such as evidence-based medicine, analysis of clinical practice variability, and process management, in addition to quality and safety strategies. The units involved have to propose a management plan that results in a management contract with the center's executive board. This agreement establishes activity, expenditure and quality objectives that are quantifiable through various indicators. The transfer of risk to the unit must include a certain budget allocation and the capacity to decide on incentives. Clinical management must not be employed as a cost-saving tool by macro- and meso-level management. There is no single clinical management structure that suits all health care organizations; great variability exists in the adoption of different organizational formulas, so every center must perform its own analysis and choose the most adequate model. In our country there are many clinical management experiences, although there is still a long way to go.
Methods for the guideline-based development of quality indicators--a systematic review
2012-01-01
Background Quality indicators (QIs) are used in many healthcare settings to measure, compare, and improve quality of care. For the efficient development of high-quality QIs, rigorous, approved, and evidence-based development methods are needed. Clinical practice guidelines are a suitable source to derive QIs from, but no gold standard for guideline-based QI development exists. This review aims to identify, describe, and compare methodological approaches to guideline-based QI development. Methods We systematically searched medical literature databases (Medline, EMBASE, and CINAHL) and grey literature. Two researchers selected publications reporting methodological approaches to guideline-based QI development. In order to describe and compare methodological approaches used in these publications, we extracted detailed information on common steps of guideline-based QI development (topic selection, guideline selection, extraction of recommendations, QI selection, practice test, and implementation) to predesigned extraction tables. Results From 8,697 hits in the database search and several grey literature documents, we selected 48 relevant references. The studies were of heterogeneous type and quality. We found no randomized controlled trial or other studies comparing the ability of different methodological approaches to guideline-based development to generate high-quality QIs. The relevant publications featured a wide variety of methodological approaches to guideline-based QI development, especially regarding guideline selection and extraction of recommendations. Only a few studies reported patient involvement. Conclusions Further research is needed to determine which elements of the methodological approaches identified, described, and compared in this review are best suited to constitute a gold standard for guideline-based QI development. For this research, we provide a comprehensive groundwork. PMID:22436067
Itzchakov, Guy; Kluger, Avraham N; Castro, Dotan R
2017-01-01
We examined how listeners characterized by empathy and a non-judgmental approach affect speakers' attitude structure. We hypothesized that high quality listening decreases speakers' social anxiety, which in turn reduces defensive processing. This reduction in defensive processing was hypothesized to result in an awareness of contradictions (increased objective-attitude ambivalence), and decreased attitude extremity. Moreover, we hypothesized that experiencing high quality listening would enable speakers to tolerate contradictory responses, such that listening would attenuate the association between objective- and subjective-attitude ambivalence. We obtained consistent support for our hypotheses across four laboratory experiments that manipulated listening experience in different ways on a range of attitude topics. The effects of listening on objective-attitude ambivalence were stronger for higher dispositional social anxiety and initial objective-attitude ambivalence (Study 4). Overall, the results suggest that speakers' attitude structure can be changed by a heretofore unexplored interpersonal variable: merely providing high quality listening.
Horwood, Christiane M; Youngleson, Michele S; Moses, Edward; Stern, Amy F; Barker, Pierre M
2015-07-01
Achieving long-term retention in HIV care is an important challenge for HIV management and achieving elimination of mother-to-child transmission. Sustainable, affordable strategies are required to achieve this, including strengthening of community-based interventions. Deployment of community-based health workers (CHWs) can improve health outcomes but there is a need to identify systems to support and maintain high-quality performance. Quality-improvement strategies have been successfully implemented to improve quality and coverage of healthcare in facilities and could provide a framework to support community-based interventions. Four community-based quality-improvement projects from South Africa, Malawi and Mozambique are described. Community-based improvement teams linked to the facility-based health system participated in learning networks (modified Breakthrough Series), and used quality-improvement methods to improve process performance. Teams were guided by trained quality mentors who used local data to help nurses and CHWs identify gaps in service provision and test solutions. Learning network participants gathered at intervals to share progress and identify successful strategies for improvement. CHWs demonstrated understanding of quality-improvement concepts, tools and methods, and implemented quality-improvement projects successfully. Challenges of using quality-improvement approaches in community settings included adapting processes, particularly data reporting, to the education level and first language of community members. Quality-improvement techniques can be implemented by CHWs to improve outcomes in community settings but these approaches require adaptation and additional mentoring support to be successful. More research is required to establish the effectiveness of this approach on processes and outcomes of care.
Real-time computer treatment of THz passive device images with the high image quality
NASA Astrophysics Data System (ADS)
Trofimov, Vyacheslav A.; Trofimov, Vladislav V.
2012-06-01
We demonstrate real-time computer code that significantly improves the quality of images captured by a passive THz imaging system. The code is not only designed for a THz passive device: it can be applied to any such device, and to active THz imaging systems as well. We applied our code to the computer processing of images captured by four passive THz imaging devices manufactured by different companies. It should be stressed that computer processing of images produced by different companies usually requires different spatial filters. The current version of the computer code processes more than one image per second for a THz image having more than 5000 pixels and 24-bit number representation. Processing a single THz image produces about 20 output images simultaneously, corresponding to various spatial filters. The computer code allows increasing the number of pixels of processed images without noticeable reduction of image quality. The performance of the computer code can be increased many times by using parallel algorithms for processing the image. We developed original spatial filters which allow one to see objects with sizes less than 2 cm. The imagery is produced by passive THz imaging devices which captured images of objects hidden under opaque clothes. For images with high noise, we developed an approach that suppresses the noise through computer processing and yields a good quality image. To illustrate the efficiency of the developed approach, we demonstrate the detection of liquid explosive, ordinary explosive, a knife, a pistol, a metal plate, a CD, ceramics, chocolate and other objects hidden under opaque clothes. The results demonstrate the high efficiency of our approach for the detection of hidden objects and are a very promising solution for the security problem.
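A sketch of the filtering step (the paper's own spatial filters are original and not published here; the kernels below are generic stand-ins): apply a small bank of spatial filters to a noisy passive-THz-like frame by FFT convolution, producing several candidate outputs per captured frame.

    import numpy as np

    def fft_filter(img, kernel):
        # Circular convolution via FFT, with the kernel zero-padded and centred.
        kh, kw = kernel.shape
        pad = np.zeros_like(img); pad[:kh, :kw] = kernel
        pad = np.roll(pad, (-(kh // 2), -(kw // 2)), axis=(0, 1))
        return np.fft.ifft2(np.fft.fft2(img) * np.fft.fft2(pad)).real

    rng = np.random.default_rng(5)
    frame = 0.5 * rng.standard_normal((128, 128))   # noisy passive-THz-like frame
    frame[40:90, 50:70] += 2.0                      # hidden-object silhouette

    bank = [np.full((3, 3), 1 / 9.0),                               # mean smoothing
            np.array([[0, -1, 0], [-1, 5, -1], [0, -1, 0]], float), # sharpening
            np.outer(np.hanning(9), np.hanning(9))]                 # wide low-pass
    outputs = [fft_filter(frame, k) for k in bank]
    print(len(outputs), "filtered variants produced from one captured frame")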
Modulated Modularity Clustering as an Exploratory Tool for Functional Genomic Inference
Stone, Eric A.; Ayroles, Julien F.
2009-01-01
In recent years, the advent of high-throughput assays, coupled with their diminishing cost, has facilitated a systems approach to biology. As a consequence, massive amounts of data are currently being generated, requiring efficient methodology aimed at the reduction of scale. Whole-genome transcriptional profiling is a standard component of systems-level analyses, and to reduce scale and improve inference clustering genes is common. Since clustering is often the first step toward generating hypotheses, cluster quality is critical. Conversely, because the validation of cluster-driven hypotheses is indirect, it is critical that quality clusters not be obtained by subjective means. In this paper, we present a new objective-based clustering method and demonstrate that it yields high-quality results. Our method, modulated modularity clustering (MMC), seeks community structure in graphical data. MMC modulates the connection strengths of edges in a weighted graph to maximize an objective function (called modularity) that quantifies community structure. The result of this maximization is a clustering through which tightly-connected groups of vertices emerge. Our application is to systems genetics, and we quantitatively compare MMC both to the hierarchical clustering method most commonly employed and to three popular spectral clustering approaches. We further validate MMC through analyses of human and Drosophila melanogaster expression data, demonstrating that the clusters we obtain are biologically meaningful. We show MMC to be effective and suitable to applications of large scale. In light of these features, we advocate MMC as a standard tool for exploration and hypothesis generation. PMID:19424432
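The modularity objective MMC maximizes can be stated compactly: Q = (1/2m) * sum_ij [A_ij - k_i*k_j/(2m)] * delta(c_i, c_j). A short sketch of evaluating Q for a weighted graph and a candidate clustering (the modulation of connection strengths and the optimization loop are omitted):

    import numpy as np

    def modularity(A, labels):
        # Q = (1/2m) * sum_ij [A_ij - k_i*k_j/(2m)] over pairs in the same cluster.
        k = A.sum(axis=1)                # weighted degrees
        two_m = A.sum()                  # equals 2m for a symmetric adjacency
        same = labels[:, None] == labels[None, :]
        return ((A - np.outer(k, k) / two_m) * same).sum() / two_m

    # Two obvious 3-node communities joined by one weak edge.
    A = np.zeros((6, 6))
    for i, j, w in [(0,1,1), (0,2,1), (1,2,1), (3,4,1), (3,5,1), (4,5,1), (2,3,0.1)]:
        A[i, j] = A[j, i] = w
    good = np.array([0, 0, 0, 1, 1, 1])
    bad = np.array([0, 1, 0, 1, 0, 1])
    print("Q(good) =", round(modularity(A, good), 3),
          "> Q(bad) =", round(modularity(A, bad), 3))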
Virtual Boutique: a 3D modeling and content-based management approach to e-commerce
NASA Astrophysics Data System (ADS)
Paquet, Eric; El-Hakim, Sabry F.
2000-12-01
The Virtual Boutique is made up of three modules: the decor, the market and the search engine. The decor is the physical space occupied by the Virtual Boutique. It can reproduce any existing boutique. For this purpose, photogrammetry is used. A set of pictures of a real boutique or space is taken, and a virtual 3D representation of this space is calculated from them. Calculations are performed with software developed at NRC. This representation consists of meshes and texture maps. The camera used in the acquisition process determines the resolution of the texture maps. Decorative elements are added, like paintings, computer generated objects and scanned objects. The objects are scanned with a laser scanner developed at NRC. This scanner allows simultaneous acquisition of range and color information based on white laser beam triangulation. The second module, the market, is made up of all the merchandise and the manipulators, which are used to manipulate and compare the objects. The third module, the search engine, can search the inventory based on an object shown by the customer in order to retrieve similar objects based on shape and color. The items of interest are displayed in the boutique by reconfiguring the market space, which means that the boutique can be continuously customized according to the customer's needs. The Virtual Boutique is entirely written in Java 3D, can run in mono and stereo mode, and has been optimized to allow high quality rendering.
Unsupervised motion-based object segmentation refined by color
NASA Astrophysics Data System (ADS)
Piek, Matthijs C.; Braspenning, Ralph; Varekamp, Chris
2003-06-01
For various applications, such as data compression, structure from motion, medical imaging and video enhancement, there is a need for an algorithm that divides video sequences into independently moving objects. Because our focus is on video enhancement and structure from motion for consumer electronics, we strive for a low complexity solution. For still images, several approaches exist based on colour, but these lack in both speed and segmentation quality. For instance, colour-based watershed algorithms produce a so-called oversegmentation with many segments covering each single physical object. Other colour segmentation approaches exist which somehow limit the number of segments to reduce this oversegmentation problem. However, this often results in inaccurate edges or even missed objects. Most likely, colour is an inherently insufficient cue for real world object segmentation, because real world objects can display complex combinations of colours. For video sequences, however, an additional cue is available, namely the motion of objects. When different objects in a scene have different motion, the motion cue alone is often enough to reliably distinguish objects from one another and the background. However, because of the lack of sufficient resolution of efficient motion estimators, like the 3DRS block matcher, the resulting segmentation is not at pixel resolution, but at block resolution. Existing pixel resolution motion estimators are more sensitive to noise, suffer more from aperture problems or have less correspondence to the true motion of objects when compared to block-based approaches, or are too computationally expensive. From its tendency to oversegmentation, it is apparent that colour segmentation is particularly effective near edges of homogeneously coloured areas. On the other hand, block-based true motion estimation is particularly effective in heterogeneous areas, because heterogeneous areas improve the chance a block is unique and thus decrease the chance of the wrong position producing a good match. Consequently, a number of methods exist which combine motion and colour segmentation. These methods use colour segmentation as a base for the motion segmentation and estimation, or perform an independent colour segmentation in parallel which is in some way combined with the motion segmentation. The presented method uses both techniques to complement each other by first segmenting on motion cues and then refining the segmentation with colour. To our knowledge, few methods exist which adopt this approach. One example is [meshrefine]. This method uses an irregular mesh, which hinders its efficient implementation in consumer electronics devices. Furthermore, the method produces a foreground/background segmentation, while our applications call for the segmentation of multiple objects.
NEW METHOD: As mentioned above, we start with motion segmentation and refine the edges of this segmentation with a pixel resolution colour segmentation method afterwards. There are several reasons for this approach:
+ Motion segmentation does not produce the oversegmentation which colour segmentation methods normally produce, because objects are more likely to have colour discontinuities than motion discontinuities. In this way, the colour segmentation only has to be done at the edges of segments, confining the colour segmentation to a smaller part of the image. In such a part, it is more likely that the colour of an object is homogeneous.
+ This approach restricts the computationally expensive pixel resolution colour segmentation to a subset of the image. Together with the very efficient 3DRS motion estimation algorithm, this helps to reduce the computational complexity.
+ The motion cue alone is often enough to reliably distinguish objects from one another and the background.
To obtain the motion vector fields, a variant of the 3DRS block-based motion estimator which analyses three frames of input was used. The 3DRS motion estimator is known for its ability to estimate motion vectors which closely resemble the true motion.
BLOCK-BASED MOTION SEGMENTATION: As mentioned above, we start with a block-resolution segmentation based on motion vectors. The presented method is inspired by the well-known K-means segmentation method [K-means]. Several other methods (e.g. [kmeansc]) adapt K-means for connectedness by adding a weighted shape-error. This adds the additional difficulty of finding the correct weights for the shape parameters. Also, these methods often bias one particular pre-defined shape. The presented method, which we call K-regions, encourages connectedness because only blocks at the edges of segments may be assigned to another segment. This constrains the segmentation method to such a degree that it allows the method to use least squares for the robust fitting of affine motion models for each segment. Contrary to [parmkm], the segmentation step still operates on vectors instead of model parameters. To make sure the segmentation is temporally consistent, the segmentation of the previous frame is used as initialisation for every new frame. We also present a scheme which makes the algorithm independent of the initially chosen number of segments. A simplified sketch of this step is given after the conclusions below.
COLOUR-BASED INTRA-BLOCK SEGMENTATION: The block resolution motion-based segmentation forms the starting point for the pixel resolution segmentation. The pixel resolution segmentation is obtained from the block resolution segmentation by reclassifying pixels only at the edges of clusters. We assume that an edge between two objects can be found in either one of two neighbouring blocks that belong to different clusters. This assumption allows us to do the pixel resolution segmentation on each pair of such neighbouring blocks separately. Because of the local nature of the segmentation, it largely avoids problems with heterogeneously coloured areas. Because no new segments are introduced in this step, it also does not suffer from oversegmentation problems. The presented method has no problems with bifurcations. For the pixel resolution segmentation itself, we reclassify pixels such that we optimize an error norm which favours similarly coloured regions and straight edges.
SEGMENTATION MEASURE: To assist in the evaluation of the proposed algorithm, we developed a quality metric. Because the problem does not have an exact specification, we decided to define a ground truth output which we find desirable for a given input. We define the measure for the segmentation quality as how different the segmentation is from the ground truth. Our measure enables us to evaluate oversegmentation and undersegmentation separately. Also, it allows us to evaluate which parts of a frame suffer from oversegmentation or undersegmentation. The proposed algorithm has been tested on several typical sequences.
CONCLUSIONS: In this abstract we presented a new video segmentation method which performs well in the segmentation of multiple independently moving foreground objects from each other and the background. It combines the strong points of both colour and motion segmentation in the way we expected. One of the weak points is that the segmentation method suffers from undersegmentation when adjacent objects display similar motion. In sequences with detailed backgrounds, the segmentation will sometimes display noisy edges. Apart from these results, we think that some of the techniques, and in particular the K-regions technique, may be useful for other two-dimensional data segmentation problems.
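A simplified sketch of the K-regions step (per-segment affine motion models and the 3DRS estimator are omitted; plain per-segment mean vectors are used instead): cluster a block-resolution motion field while letting only boundary blocks change label, which encourages spatially connected segments without a shape prior.

    import numpy as np

    rng = np.random.default_rng(6)
    H, W, K = 20, 30, 2
    # Synthetic block-resolution motion field: two objects with different motion.
    mv = np.zeros((H, W, 2)); mv[:, :15] = (4, 0); mv[:, 15:] = (-2, 1)
    mv += 0.3 * rng.standard_normal(mv.shape)

    labels = rng.integers(0, K, size=(H, W))     # random initial segmentation
    for _ in range(30):
        # Per-segment mean motion (the full method fits affine models instead).
        means = np.array([mv[labels == k].mean(axis=0) if (labels == k).any()
                          else np.full(2, np.inf) for k in range(K)])
        new = labels.copy()
        for i in range(H):
            for j in range(W):
                neigh = {labels[i2, j2] for i2, j2 in
                         [(i-1, j), (i+1, j), (i, j-1), (i, j+1)]
                         if 0 <= i2 < H and 0 <= j2 < W}
                if neigh - {labels[i, j]}:       # only boundary blocks may switch
                    cand = list(neigh | {labels[i, j]})
                    errs = [np.sum((mv[i, j] - means[k]) ** 2) for k in cand]
                    new[i, j] = cand[int(np.argmin(errs))]
        if np.array_equal(new, labels):
            break
        labels = new
    print("blocks per segment:", np.bincount(labels.ravel(), minlength=K))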
Training of trainers for community primary health care workers.
Cernada, G P
1983-01-01
Training community-based health care workers in "developing" countries is essential to improving the quality of life in both rural and urban areas. Two major obstacles to such training are the tremendous social distance gap between these community workers and their more highly-educated and upper-class trainers (often medical officers) and the didactic, formal educational system. Bridging this gap demands a participant-centered, field-oriented approach which actively involves the trainee in the design, implementation and evaluation of the training program. A description of a philosophic learning approach based on self-initiated change, educational objectives related to planning, organizing, conducting and evaluating training, and specific learning methodologies utilizing participatory learning, non-formal educational techniques, field experience, continuing feedback and learner participation are reviewed. Included are: role playing, story telling, case studies, self-learning and simulation exercises, visuals, and Portapak videotape.
Sun, Xincheng; Yang, Qingsong; Sun, Feng; Shi, Qinglu
2015-01-01
Objective This study aimed to compare the effectiveness and complications between the retropubic and transobturator approaches for the treatment of female stress urinary incontinence (SUI) by conducting a systematic review. Materials and Methods We selected all randomized controlled trials (RCTs) that compared retropubic and transobturator sling placements for treatment of SUI. We estimated pooled odds ratios and 95% confidence intervals for intraoperative and postoperative outcomes and complications. Results Six hundred twelve studies that compared retropubic and transobturator approaches to midurethral sling placement were identified, of which 16 were included in our research. Our study was based on results from 2646 women. We performed a subgroup analysis to compare outcomes and complications between the two approaches. The evidence to support the superior approach that leads to better objective/subjective cure rate was insufficient. The transobturator approach was associated with lower risks of bladder perforation (odds ratio (OR) 0.17, 95% confidence interval (CI) 0.09-0.32), retropubic/vaginal hematoma (OR 0.32, 95% CI 0.16-0.63), and long-term voiding dysfunction (OR 0.32, 95% CI 0.17-0.61). However, the risk of thigh/groin pain seemed higher in the transobturator group (OR 2.53, 95% CI 1.72-3.72). We found no statistically significant differences in the risks of other complications between the two approaches. Conclusions This meta-analysis shows comparable objective and subjective cure rates between the retropubic and transobturator approaches to midurethral sling placement. The transobturator approach was associated with lower risks of several complications. However, good-quality studies with long-term follow-ups are warranted for further research. PMID:26005962
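For readers unfamiliar with the statistics quoted above, the sketch below shows how an odds ratio and its 95% confidence interval are computed from a single 2x2 table; the review itself pools many such tables (e.g. by inverse-variance or Mantel-Haenszel weighting). All counts here are invented for illustration, not taken from the review.

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """OR and 95% CI for a 2x2 table:
       a = events in group 1, b = non-events in group 1,
       c = events in group 2, d = non-events in group 2."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1/a + 1/b + 1/c + 1/d)          # SE of log(OR)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, (lo, hi)

# Illustrative numbers only: perforations, transobturator vs retropubic.
print(odds_ratio_ci(8, 492, 40, 460))
```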
Open-Broadcast Radio: Three Strategies.
ERIC Educational Resources Information Center
Theroux, James; Gunter, Jock
Three effective strategies in quality open-broadcast programming for increasing educational radio's audience attraction are suggested as alternatives to the usual approach to such programming in the third world: (1) the advertising approach, which is suited to audience motivation for accomplishing concrete behavioral objectives; (2) the…
Link-Based Similarity Measures Using Reachability Vectors
Yoon, Seok-Ho; Kim, Ji-Soo; Ryu, Minsoo; Choi, Ho-Jin
2014-01-01
We present a novel approach for computing link-based similarities among objects accurately by utilizing the link information pertaining to the objects involved. We discuss the problems with previous link-based similarity measures and propose a novel approach for computing link-based similarities that does not suffer from these problems. In the proposed approach, each target object is represented by a vector. The elements of the vector correspond to the objects in the given data, and the value of each element denotes the weight for the corresponding object. As for this weight value, we propose to utilize the probability of reaching from the target object to the specific object, computed using the “Random Walk with Restart” strategy. Then, we define the similarity between two objects as the cosine similarity of the two vectors. In this paper, we provide examples to show that our approach does not suffer from the aforementioned problems. We also evaluate the performance of the proposed methods in comparison with existing link-based measures, qualitatively and quantitatively, with respect to two kinds of data sets, scientific papers and Web documents. Our experimental results indicate that the proposed methods significantly outperform the existing measures. PMID:24701188
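A minimal sketch of the reachability-vector idea: each object's vector holds Random Walk with Restart probabilities to every node, and similarity is the cosine of two such vectors. The restart probability, iteration count and toy graph are assumptions for illustration.

```python
import numpy as np

def rwr(P, seed, restart=0.15, iters=100):
    """Random Walk with Restart: visiting probabilities from a seed node,
    given a column-stochastic transition matrix P."""
    n = P.shape[0]
    e = np.zeros(n); e[seed] = 1.0
    r = e.copy()
    for _ in range(iters):
        r = (1 - restart) * P @ r + restart * e
    return r

def reachability_similarity(P, u, v):
    """Cosine similarity of the two nodes' reachability vectors."""
    ru, rv = rwr(P, u), rwr(P, v)
    return ru @ rv / (np.linalg.norm(ru) * np.linalg.norm(rv))

# Toy 4-node graph: column j holds transition probabilities out of node j.
A = np.array([[0, 1, 1, 0],
              [1, 0, 1, 0],
              [1, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
P = A / A.sum(axis=0)
print(reachability_similarity(P, 0, 1))
```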
NASA Technical Reports Server (NTRS)
Spiering, Bruce; Underwood, Lauren; Ellis, Chris; Lehrter, John; Hagy, Jim; Schaeffer, Blake
2010-01-01
The goals of the project are to provide information from satellite remote sensing to support numeric nutrient criteria development and to determine data processing methods and data quality requirements to support nutrient criteria development and implementation. The approach is to identify water quality indicators that are used by decision makers to assess water quality and that are related to optical properties of the water; to develop remotely sensed data products based on algorithms relating remote sensing imagery to field-based observations of indicator values; to develop methods to assess estuarine water quality, including trends, spatial and temporal variability, and seasonality; and to develop tools to assist in the development and implementation of estuarine and coastal nutrient criteria. Additional slides present process, criteria development, typical data sources and analyses for criteria process, the power of remote sensing data for the process, examples from Pensacola Bay, spatial and temporal variability, pixel matchups, remote sensing validation, remote sensing in coastal waters, requirements for remotely sensed data products, and needs assessment. An additional presentation examines group engagement and information collection. Topics include needs assessment purpose and objectives, understanding water quality decision making, determining information requirements, and next steps.
Rabotyagov, Sergey; Campbell, Todd; Valcu, Adriana; Gassman, Philip; Jha, Manoj; Schilling, Keith; Wolter, Calvin; Kling, Catherine
2012-12-09
Finding the cost-efficient (i.e., lowest-cost) ways of targeting conservation practice investments for the achievement of specific water quality goals across the landscape is of primary importance in watershed management. Traditional economics methods of finding the lowest-cost solution in the watershed context (e.g., (5,12,20)) assume that off-site impacts can be accurately described as a proportion of on-site pollution generated. Such approaches are unlikely to be representative of the actual pollution process in a watershed, where the impacts of polluting sources are often determined by complex biophysical processes. The use of modern physically-based, spatially distributed hydrologic simulation models allows for a greater degree of realism in terms of process representation but requires the development of a simulation-optimization framework in which the model becomes an integral part of the optimization. Evolutionary algorithms appear to be a particularly useful optimization tool, able to deal with the combinatorial nature of a watershed simulation-optimization problem and allowing the use of the full water quality model. Evolutionary algorithms treat a particular spatial allocation of conservation practices in a watershed as a candidate solution and utilize sets (populations) of candidate solutions, iteratively applying stochastic operators of selection, recombination, and mutation to find improvements with respect to the optimization objectives. The optimization objectives in this case are to minimize nonpoint-source pollution in the watershed while simultaneously minimizing the cost of conservation practices. A recent and expanding body of research attempts to use similar methods, integrating water quality models with broadly defined evolutionary optimization methods (3,4,9,10,13-15,17-19,22,23,25). In this application, we demonstrate a program which follows Rabotyagov et al.'s approach and integrates the modern and commonly used SWAT water quality model (7) with the multiobjective evolutionary algorithm SPEA2 (26) and a user-specified set of conservation practices and their costs to search for the complete tradeoff frontiers between costs of conservation practices and user-specified water quality objectives. The frontiers quantify the tradeoffs faced by watershed managers by presenting the full range of costs associated with various water quality improvement goals. The program allows for the selection of watershed configurations achieving specified water quality improvement goals and the production of maps of optimized placement of conservation practices.
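The simulation-optimization loop can be sketched in miniature. The toy below replaces SWAT with invented per-field costs and pollutant reductions, and uses a deliberately naive evolutionary loop (one-point crossover, bit-flip mutation, Pareto filter) rather than SPEA2; it only illustrates how candidate practice placements evolve toward a cost/water-quality tradeoff frontier.

```python
import random
random.seed(0)

# Toy surrogate: each of N fields either gets a conservation practice (1)
# or not (0). Costs and per-field reductions are invented numbers standing
# in for water quality model output.
N = 20
cost = [random.uniform(1, 5) for _ in range(N)]
reduction = [random.uniform(0.5, 3) for _ in range(N)]
baseline = 40.0

def objectives(x):                      # (total cost, remaining pollution)
    return (sum(c for c, xi in zip(cost, x) if xi),
            baseline - sum(r for r, xi in zip(reduction, x) if xi))

def dominates(a, b):
    return all(ai <= bi for ai, bi in zip(a, b)) and a != b

def evolve(pop_size=60, gens=200):
    pop = [[random.randint(0, 1) for _ in range(N)] for _ in range(pop_size)]
    for _ in range(gens):
        a, b = random.sample(pop, 2)
        cut = random.randrange(1, N)                 # one-point crossover
        child = a[:cut] + b[cut:]
        child[random.randrange(N)] ^= 1              # bit-flip mutation
        worst = max(pop, key=lambda x: sum(objectives(x)))
        pop[pop.index(worst)] = child                # naive replacement
    front = [x for x in pop
             if not any(dominates(objectives(y), objectives(x)) for y in pop)]
    return sorted({(round(objectives(x)[0], 2), round(objectives(x)[1], 2))
                   for x in front})

print(evolve())   # approximate cost vs. pollution tradeoff frontier
```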
ERIC Educational Resources Information Center
Itegi, Florence M.
2016-01-01
The aim of this paper is to explore the influence of strategic planning in improving the quality of education. The quality of education is directly linked to the effort expended in making arrangements or preparations of educational objectives and determining the requisite resources to facilitate the training, instruction or study that leads to the…
Stefănescu, Lucrina; Robu, Brînduşa Mihaela; Ozunu, Alexandru
2013-11-01
The environmental impact assessment of mining sites is nowadays a topic of great interest in Romania. Historical pollution in the Rosia Montana mining area of Romania caused extensive damage to environmental media. This paper has two goals: to investigate the environmental pollution induced by mining activities in the Rosia Montana area and to quantify the environmental impacts and associated risks by means of an integrated approach. Thus, a new method for quantifying the impact of mining activities, taking account of the quality of environmental media in the mining area, was developed and applied to the case study presented in this paper. The associated risks are a function of the environmental impacts and the probability of their occurrence. The results show that the environmental impacts and quantified risks, based on quality indicators characterizing the environmental quality, are of a high order, and thus measures for pollution remediation and control need to be considered in the investigated area. The conclusion drawn is that an integrated approach to the assessment of environmental impact and associated risks is a valuable and more objective method, and an important tool that can be applied in the decision-making process by national authorities in the prioritization of emergency action.
Covarrubias, Mario; Bordegoni, Monica; Cugini, Umberto
2013-01-01
In this article, we present an approach that uses both two force sensitive handles (FSH) and a flexible capacitive touch sensor (FCTS) to drive a haptic-based immersive system. The immersive system has been developed as part of a multimodal interface for product design. The haptic interface consists of a strip that can be used by product designers to evaluate the quality of a 3D virtual shape by using touch, vision and hearing and, also, to interactively change the shape of the virtual object. Specifically, the user interacts with the FSH to move the virtual object and to appropriately position the haptic interface for retrieving the six degrees of freedom required for both manipulation and modification modalities. The FCTS allows the system to track the movement and position of the user's fingers on the strip, which is used for rendering visual and sound feedback. Two evaluation experiments are described, which involve both the evaluation and the modification of a 3D shape. Results show that the use of the haptic strip for the evaluation of aesthetic shapes is effective and supports product designers in the appreciation of the aesthetic qualities of the shape. PMID:24113680
A framework for assessing the adequacy and effectiveness of software development methodologies
NASA Technical Reports Server (NTRS)
Arthur, James D.; Nance, Richard E.
1990-01-01
Tools, techniques, environments, and methodologies dominate the software engineering literature, but relatively little research in the evaluation of methodologies is evident. This work reports an initial attempt to develop a procedural approach to evaluating software development methodologies. Prominent in this approach are: (1) an explication of the role of a methodology in the software development process; (2) the development of a procedure based on linkages among objectives, principles, and attributes; and (3) the establishment of a basis for reduction of the subjective nature of the evaluation through the introduction of properties. An application of the evaluation procedure to two Navy methodologies has provided consistent results that demonstrate the utility and versatility of the evaluation procedure. Current research efforts focus on the continued refinement of the evaluation procedure through the identification and integration of product quality indicators reflective of attribute presence, and the validation of metrics supporting the measure of those indicators. The consequent refinement of the evaluation procedure offers promise of a flexible approach that admits to change as the field of knowledge matures. In conclusion, the procedural approach presented in this paper represents a promising path toward the end goal of objectively evaluating software engineering methodologies.
Mavrotas, George; Ziomas, Ioannis C; Diakouaki, Danae
2006-07-01
This article presents a methodological approach for the formulation of control strategies capable of reducing atmospheric pollution at the standards set by European legislation. The approach was implemented in the greater area of Thessaloniki and was part of a project aiming at the compliance with air quality standards in five major cities in Greece. The methodological approach comprises two stages: in the first stage, the availability of several measures contributing to a certain extent to reducing atmospheric pollution indicates a combinatorial problem and favors the use of Integer Programming. More specifically, Multiple Objective Integer Programming is used in order to generate alternative efficient combinations of the available policy measures on the basis of two conflicting objectives: public expenditure minimization and social acceptance maximization. In the second stage, these combinations of control measures (i.e., the control strategies) are then comparatively evaluated with respect to a wider set of criteria, using tools from Multiple Criteria Decision Analysis, namely, the well-known PROMETHEE method. The whole procedure is based on the active involvement of local and central authorities in order to incorporate their concerns and preferences, as well as to secure the adoption and implementation of the resulting solution.
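A miniature of the first stage: with a handful of candidate measures, the efficient cost/acceptance combinations subject to an emission-reduction target can be enumerated by brute force, standing in for the Multiple Objective Integer Programming model. All figures are invented, and the PROMETHEE ranking of the second stage is not shown.

```python
from itertools import product

# Illustrative control measures (invented): (cost, social acceptance score,
# emission reduction in tonnes).
measures = {
    "low-sulphur fuel":    (4.0, 2, 30),
    "bus fleet renewal":   (7.5, 5, 25),
    "industrial filters":  (3.0, 1, 40),
    "traffic restriction": (0.5, -3, 20),
}
TARGET = 60   # required total emission reduction

def efficient_strategies():
    names = list(measures)
    feasible = []
    for mask in product([0, 1], repeat=len(names)):
        chosen = [n for n, m in zip(names, mask) if m]
        c = sum(measures[n][0] for n in chosen)
        acc = sum(measures[n][1] for n in chosen)
        redux = sum(measures[n][2] for n in chosen)
        if redux >= TARGET:
            feasible.append((c, -acc, chosen))       # minimise both entries
    return [f for f in feasible
            if not any(g[0] <= f[0] and g[1] <= f[1] and g[:2] != f[:2]
                       for g in feasible)]

for c, neg_acc, chosen in sorted(efficient_strategies()):
    print(f"cost={c:.1f}  acceptance={-neg_acc}  {chosen}")
```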
A systematic review of the incidence and prevalence of comorbidity in multiple sclerosis: Overview
Cohen, Jeffrey; Stuve, Olaf; Trojano, Maria; Sørensen, Per Soelberg; Reingold, Stephen; Cutter, Gary; Reider, Nadia
2015-01-01
Background: Comorbidity is an area of increasing interest in multiple sclerosis (MS). Objective: The objective of this review is to estimate the incidence and prevalence of comorbidity in people with MS and assess the quality of included studies. Methods: We searched the PubMed, SCOPUS, EMBASE and Web of Knowledge databases, conference proceedings, and reference lists of retrieved articles. Two reviewers independently screened abstracts. One reviewer abstracted data using a standardized form and the abstraction was verified by a second reviewer. We assessed study quality using a standardized approach. We quantitatively assessed population-based studies using the I² statistic, and conducted random-effects meta-analyses. Results: We included 249 articles. Study designs were variable with respect to source populations, case definitions, methods of ascertainment and approaches to reporting findings. Prevalence was reported more frequently than incidence; estimates for prevalence and incidence varied substantially for all conditions. Heterogeneity was high. Conclusion: This review highlights substantial gaps in the epidemiological knowledge of comorbidity in MS worldwide. Little is known about comorbidity in Central or South America, Asia or Africa. Findings in North America and Europe are inconsistent. Future studies should report age-, sex- and ethnicity-specific estimates of incidence and prevalence, and standardize findings to a common population. PMID:25623244
Maiti, Saumen; Erram, V C; Gupta, Gautam; Tiwari, Ram Krishna; Kulkarni, U D; Sangpal, R R
2013-04-01
Deplorable quality of groundwater arising from saltwater intrusion, natural leaching and anthropogenic activities is one of the major concerns for society. Assessment of groundwater quality is, therefore, a primary objective of scientific research. Here, we propose an artificial neural network-based method set in a Bayesian neural network (BNN) framework and employ it to assess groundwater quality. The approach is based on analyzing 36 water samples and inverting up to 85 Schlumberger vertical electrical sounding data. We constructed the a priori model by suitably parameterizing geochemical and geophysical data collected from the western part of India. The posterior model (post-inversion) was estimated using the BNN learning procedure and a global hybrid Monte Carlo/Markov Chain Monte Carlo optimization scheme. By suitable parameterization of geochemical and geophysical parameters, we simulated 1,500 training samples, of which 50% were used for training and the remaining 50% for validation and testing. We show that the trained model is able to classify validation and test samples with 85% and 80% accuracy, respectively. Based on cross-correlation analysis and the Gibbs diagram of geochemical attributes, the groundwater qualities of the study area were classified into the following three categories: "Very good", "Good", and "Unsuitable". The BNN model-based results suggest that groundwater quality falls mostly in the range of "Good" to "Very good" except for some places near the Arabian Sea. The new modeling results, supported by uncertainty and statistical analyses, provide useful constraints that could be utilized in monitoring and assessment of groundwater quality.
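As a rough illustration of the classification setup (though not of the Bayesian learning and hybrid Monte Carlo scheme used in the paper), the sketch below trains a small neural network on synthetic attribute data with a 50/50 train/test split, mirroring the paper's protocol. Features, labels and network size are all invented.

```python
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split

# Stand-in for the simulated training set: 1,500 samples of geochemical/
# geophysical attributes with three quality classes (synthetic data).
rng = np.random.default_rng(0)
X = rng.normal(size=(1500, 6))                    # 6 illustrative attributes
y = (X[:, 0] + 0.5 * X[:, 1] > 0).astype(int) + (X[:, 2] > 1)  # classes 0-2

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.5, random_state=0)
clf = MLPClassifier(hidden_layer_sizes=(20,), max_iter=2000, random_state=0)
clf.fit(X_tr, y_tr)
print(f"held-out accuracy: {clf.score(X_te, y_te):.2f}")
```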
NASA Astrophysics Data System (ADS)
Govenor, Heather; Krometis, Leigh Anne H.; Hession, W. Cully
2017-10-01
Macroinvertebrate community assessment is used in most US states to evaluate stream health under the Clean Water Act. While water quality assessment and impairment determinations are reported to the US Environmental Protection Agency, there is no national summary of biological assessment findings. The objective of this work was to determine the national extent of invertebrate-based impairments and to identify the pollutants primarily responsible for those impairments. Evaluation of state data in the US Environmental Protection Agency's Assessment and Total Maximum Daily Load Tracking and Implementation System database revealed considerable differences in reporting approaches and terminologies, including differences in whether and how states report specific biological assessment findings. Only 15% of waters impaired for aquatic life could be identified as having impairments determined by biological assessments (e.g., invertebrates, fish, periphyton); approximately one-third of these were associated with macroinvertebrate bioassessment. Nearly 650 invertebrate-impaired waters were identified nationwide, and sediment was the most common pollutant, in bedded (63%) and suspended (9%) forms. This finding is not unexpected, given previous work on the negative impacts of sediment on aquatic life, and highlights the need to more specifically identify the mechanisms driving sediment impairments in order to design effective remediation plans. It also reinforces the importance of efforts to derive sediment-specific biological indices and numerical sediment quality guidelines. Standardization of state reporting approaches and terminology would significantly increase the potential application of water quality assessment data, reveal national trends, and encourage sharing of best practices to facilitate the attainment of water quality goals.
A robot control formalism based on an information quality concept
NASA Technical Reports Server (NTRS)
Ekman, A.; Torne, A.; Stromberg, D.
1994-01-01
A relevance measure based on Jaynes' maximum entropy principle is introduced. Information quality is the conjunction of accuracy and relevance. The formalism based on information quality is developed for one-agent applications. The robot requires a well-defined working environment in which the properties of each object must be accurately specified.
The Air Quality Model Evaluation International Initiative ...
This presentation provides an overview of the Air Quality Model Evaluation International Initiative (AQMEII). It contains a synopsis of the three phases of AQMEII, including objectives, logistics, and timelines. It also provides a number of examples of analyses conducted through AQMEII with a particular focus on past and future analyses of deposition. The National Exposure Research Laboratory (NERL) Computational Exposure Division (CED) develops and evaluates data, decision-support tools, and models to be applied to media-specific or receptor-specific problem areas. CED uses modeling-based approaches to characterize exposures, evaluate fate and transport, and support environmental diagnostics/forensics with input from multiple data sources. It also develops media- and receptor-specific models, process models, and decision support tools for use both within and outside of EPA.
NASA Astrophysics Data System (ADS)
Shahiri, Amirah Mohamed; Husain, Wahidah; Rashid, Nur'Aini Abd
2017-10-01
Huge amounts of data in educational datasets can make it difficult to produce quality data. Recently, data mining approaches have increasingly been used by educational data mining researchers to analyze data patterns. However, many research studies have concentrated on selecting suitable learning algorithms instead of performing a feature selection process. As a result, these data suffer from computational complexity and require longer computation time for classification. The main objective of this research is to provide an overview of the feature selection techniques that have been used to identify the most significant features. This research then proposes a framework to improve the quality of students' datasets. The proposed framework uses filter- and wrapper-based techniques to support the prediction process in a future study.
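A sketch of the filter-plus-wrapper pattern described above, using synthetic data as a stand-in for a students' dataset: mutual information ranks features first, then recursive feature elimination wraps a classifier around the survivors. The library calls are standard scikit-learn; the dataset and parameter choices are illustrative.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.feature_selection import SelectKBest, mutual_info_classif, RFE
from sklearn.linear_model import LogisticRegression

# Synthetic stand-in for a students' dataset: 500 records, 20 features.
X, y = make_classification(n_samples=500, n_features=20, n_informative=5,
                           random_state=0)

# Filter step: rank features by mutual information with the class label.
filt = SelectKBest(mutual_info_classif, k=10).fit(X, y)
X_f = filt.transform(X)

# Wrapper step: recursive feature elimination around a classifier.
rfe = RFE(LogisticRegression(max_iter=1000), n_features_to_select=5).fit(X_f, y)
print("kept after filter:", np.flatnonzero(filt.get_support()))
print("kept after wrapper (indices within filtered set):",
      np.flatnonzero(rfe.support_))
```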
Galuschka, Katharina; Rothe, Josefine; Schulte-Körne, Gerd
2015-09-01
This article looks at a means of objectively evaluating the quality of psychometric tests. This approach enables users to evaluate psychometric tests based on their methodological characteristics, in order to decide which instrument should be used. Reading and spelling assessment tools serve as examples. The paper also provides a review of German psychometric tests for the assessment of reading and spelling skills. This method facilitates the identification of psychometric tests of high methodological quality which can be used for the assessment of reading and spelling skills. Reading performance should ideally be assessed with the following instruments: ELFE 1-6, LGVT 6-12, LESEN 6-7, LESEN 8-9, or WLLP-R. The tests to be used for the evaluation of spelling skills are DERET 1-2+, DERET 3-4+, WRT 1+, WRT 2+, WRT 3+, WRT 4+ or HSP 1-10.
Changes in quality of life after elective surgery: an observational study comparing two measures.
Kronzer, Vanessa L; Jerry, Michelle R; Ben Abdallah, Arbi; Wildes, Troy S; McKinnon, Sherry L; Sharma, Anshuman; Avidan, Michael S
2017-08-01
Our main objective was to compare the change in a validated quality of life measure to a global assessment measure. The secondary objectives were to estimate the minimum clinically important difference (MCID) and to describe the change in quality of life by surgical specialty. This prospective cohort study included 7902 adult patients undergoing elective surgery. Changes in the Veterans RAND 12-Item Health Survey (VR-12), composed of a physical component summary (PCS) and a mental component summary (MCS), were calculated using preoperative and postoperative questionnaires. The latter also contained a global assessment question for quality of life. We compared PCS and MCS to the global assessment using descriptive statistics and weighted kappa. MCID was calculated using an anchor-based approach. Analyses were pre-specified and registered (NCT02771964). By the change in VR-12 scores, an equal proportion of patients experienced improvement and deterioration in quality of life (28% for PCS, 25% for MCS). In contrast, by the global assessment measure, 61% reported improvement, while only 10% reported deterioration. Agreement with the global assessment was slight for both PCS (kappa = 0.20, 57% matched) and MCS (kappa = 0.10, 54% matched). The MCID for the overall VR-12 score was approximately 2.5 points. Patients undergoing orthopedic surgery showed the most improvement in quality of life measures, while patients undergoing gastrointestinal/hepatobiliary or urologic surgery showed the most deterioration. Subjective global quality of life report does not agree well with a validated quality of life instrument, perhaps due to patient over-optimism.
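The anchor-based MCID idea can be made concrete with a toy computation: the MCID is estimated as the mean score change among patients whose global assessment reports slight improvement. The anchor coding and all data below are assumptions for illustration, not the study's records; the study itself reported an MCID of approximately 2.5 points.

```python
import numpy as np

def anchor_based_mcid(change, anchor):
    """Anchor-based MCID: mean score change among patients whose global
    assessment says 'slightly improved'. Assumed coding: -1 worse,
    0 unchanged, 1 slightly improved, 2 much improved."""
    change = np.asarray(change); anchor = np.asarray(anchor)
    return change[anchor == 1].mean()

# Invented toy data, not study data:
rng = np.random.default_rng(1)
anchor = rng.integers(-1, 3, size=200)
change = 2.5 * anchor + rng.normal(0, 4, size=200)   # VR-12 point changes
print(f"estimated MCID: {anchor_based_mcid(change, anchor):.1f} points")
```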
Quality Assurance Specifications for Planetary Protection Assays
NASA Astrophysics Data System (ADS)
Baker, Amy
As the European Space Agency planetary protection (PP) activities move forward to support the ExoMars and other planetary missions, it will become necessary to increase staffing of laboratories that provide analyses for these programs. Standardization of procedures, a comprehensive quality assurance program, and unilateral training of personnel will be necessary to ensure that the planetary protection goals and schedules are met. The PP Quality Assurance/Quality Control (QAQC) program is designed to regulate and monitor procedures performed by laboratory personnel to ensure that all work meets data quality objectives through the assembly and launch process. Because personnel time is at a premium and sampling schedules are often dependent on engineering schedules, it is necessary to have flexible staffing to support all sampling requirements. The most productive approach to having a competent and flexible work force is to establish well defined laboratory procedures and training programs that clearly address the needs of the program and the work force. The quality assurance specification for planetary protection assays has to ensure that laboratories and associated personnel can demonstrate the competence to perform assays according to the applicable standard AD4. Detailed subjects included in the presentation are as follows:
• field and laboratory control criteria
• data reporting
• personnel training requirements and certification
• laboratory audit criteria.
Based upon RD2 for primary and secondary validation and RD3 for data quality objectives, the QAQC will provide traceable quality assurance safeguards by providing structured laboratory requirements for guidelines and oversight including training and technical updates, standardized documentation, standardized QA/QC checks, data review and data archiving.
Salihu, Hamisu M; Adegoke, Korede K; Das, Rachita; Wilson, Ronee E; Mazza, Jessica; Okoh, Jennifer O; Naik, Eknath; Berry, Estrellita Lo
2016-08-01
Poor dietary exposure disproportionately affects African-Americans and contributes to the persistence of disparities in health outcomes. In this study, we hypothesized that a fortified dietary intervention (FDI) would improve measured dietary and related health outcomes and would be acceptable among low-income African-American women living in Tampa, FL. These objectives were tested in a prospective experimental pretest-posttest study with a control group, using a community-based participatory research approach. The intervention (FDI) was designed by the community through structural modification of a preexisting, diet-based program by the addition of a physical and mental health component. Paired sample t tests were used to examine preintervention and postintervention changes in study outcomes. A total of 49 women participated in the study, 26 in the FDI group and 23 controls. Two weeks postintervention, there were significant improvements in waist circumference and in health-related quality of life related to physical health (P < .0001), physical fitness subscores (P = .002), and nutritional subscores (P = .001) in the FDI group. Among overweight/obese women, improvement in health-related quality of life related to physical health, a significant decrease in depressive score, and a reduction in waist circumference were noted. In the control group, a decrease in waist circumference was observed. Implementation of the FDI through a community-based participatory research approach is feasible and effective among low-income African-American women in general and overweight/obese women in particular. Social reengineering of a nutritional intervention coupled with a community-based approach can enhance the health outcomes of low-income women.
Internal quality control: best practice.
Kinns, Helen; Pitkin, Sarah; Housley, David; Freedman, Danielle B
2013-12-01
There is a wide variation in laboratory practice with regard to implementation and review of internal quality control (IQC). A poor approach can lead to a spectrum of scenarios from validation of incorrect patient results to over investigation of falsely rejected analytical runs. This article will provide a practical approach for the routine clinical biochemistry laboratory to introduce an efficient quality control system that will optimise error detection and reduce the rate of false rejection. Each stage of the IQC system is considered, from selection of IQC material to selection of IQC rules, and finally the appropriate action to follow when a rejection signal has been obtained. The main objective of IQC is to ensure day-to-day consistency of an analytical process and thus help to determine whether patient results are reliable enough to be released. The required quality and assay performance varies between analytes as does the definition of a clinically significant error. Unfortunately many laboratories currently decide what is clinically significant at the troubleshooting stage. Assay-specific IQC systems will reduce the number of inappropriate sample-run rejections compared with the blanket use of one IQC rule. In practice, only three or four different IQC rules are required for the whole of the routine biochemistry repertoire as assays are assigned into groups based on performance. The tools to categorise performance and assign IQC rules based on that performance are presented. Although significant investment of time and education is required prior to implementation, laboratories have shown that such systems achieve considerable reductions in cost and labour.
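One common way to categorise assay performance and map it to IQC rules is the sigma metric, sigma = (TEa - |bias|) / CV. The banding below is a widely used convention, not necessarily the article's exact tool, and the analyte figures are illustrative.

```python
def sigma_metric(tea_pct, bias_pct, cv_pct):
    """Sigma metric from allowable total error (TEa), bias and CV, all %."""
    return (tea_pct - abs(bias_pct)) / cv_pct

def qc_rule(sigma):
    """Assumed banding: wide limits for high-sigma assays, multi-rule
    schemes for marginal ones (a common convention, not the article's)."""
    if sigma >= 6: return "1-3s, n=2"
    if sigma >= 4: return "1-2.5s, n=2"
    return "multi-rule 1-3s/2-2s/R-4s, n=4"

for analyte, (tea, bias, cv) in {
        "sodium": (2.0, 0.3, 0.7), "ALT": (20.0, 3.0, 4.0)}.items():
    s = sigma_metric(tea, bias, cv)
    print(f"{analyte}: sigma={s:.1f} -> {qc_rule(s)}")
```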
Quality and Certification of Electronic Health Records
Hoerbst, A.; Ammenwerth, E.
2010-01-01
Background Numerous projects, initiatives, and programs are dedicated to the development of Electronic Health Records (EHR) worldwide. Increasingly more of these plans have recently been brought from a scientific environment to real life applications. In this context, quality is a crucial factor with regard to the acceptance and utility of Electronic Health Records. However, the dissemination of the existing quality approaches is often rather limited. Objectives The present paper aims at the description and comparison of the current major quality certification approaches to EHRs. Methods A literature analysis was carried out in order to identify the relevant publications with regard to EHR quality certification. PubMed, ACM Digital Library, IEEExplore, CiteSeer, and Google (Scholar) were used to collect relevant sources. The documents that were obtained were analyzed using techniques of qualitative content analysis. Results The analysis discusses and compares the quality approaches of CCHIT, EuroRec, IHE, openEHR, and EN13606. These approaches differ with regard to their focus, support of service-oriented EHRs, process of (re-)certification and testing, number of systems certified and tested, supporting organizations, and regional relevance. Discussion The analyzed approaches show differences with regard to their structure and processes. System vendors can exploit these approaches in order to improve and certify their information systems. Health care organizations can use these approaches to support selection processes or to assess the quality of their own information systems. PMID:23616834
NASA Astrophysics Data System (ADS)
Kuo, Chung-Feng Jeffrey; Quang Vu, Huy; Gunawan, Dewantoro; Lan, Wei-Luen
2012-09-01
Laser scribing has been considered an effective approach for surface texturization of thin film solar cells. In this study, a systematic method for optimizing the multi-objective process parameters of a fiber laser system was proposed to achieve excellent quality characteristics, such as the minimum scribing line width, the flattest trough bottom, and the fewest processing edge surface bumps, for increasing the incident light absorption of thin film solar cells. First, the Taguchi method (TM) obtained useful statistical information through an orthogonal array with relatively few experiments. However, TM is only appropriate for optimizing single-objective problems and has to rely on engineering judgment for solving multi-objective problems, which can introduce some degree of uncertainty. The back-propagation neural network (BPNN) and data envelopment analysis (DEA) were therefore utilized to estimate the incomplete data and derive the optimal process parameters of the laser scribing system. In addition, the analysis of variance (ANOVA) method was applied to identify the significant factors which have the greatest effects on the quality of the scribing process; in other words, by putting more emphasis on these controllable and profound factors, the quality characteristics of the scribed thin film can be effectively enhanced. The experiments were carried out on ZnO:Al (AZO) transparent conductive thin film with a thickness of 500 nm, and the results showed that the proposed approach yields better overall improvements than the TM, which can only improve one quality characteristic at the expense of the others. Confirmation experiments demonstrated the reliability of the proposed method.
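The Taguchi step rests on signal-to-noise ratios; for smaller-the-better characteristics such as scribing line width, the standard formula is SN = -10 log10(mean(y^2)). A worked example with invented replicate measurements:

```python
import numpy as np

def sn_smaller_the_better(y):
    """Taguchi signal-to-noise ratio for smaller-the-better responses,
    SN = -10 * log10(mean(y^2)); here applied to scribing line width."""
    y = np.asarray(y, dtype=float)
    return -10 * np.log10(np.mean(y ** 2))

# Invented replicate line widths (um) for two parameter settings:
print(sn_smaller_the_better([38.2, 39.1, 37.6]))   # higher SN is better
print(sn_smaller_the_better([45.0, 47.3, 44.1]))
```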
NASA Astrophysics Data System (ADS)
Krauß, T.
2014-11-01
The focal plane assembly of most pushbroom scanner satellites is built in such a way that the different multispectral bands, or the multispectral and panchromatic bands, are not all acquired at exactly the same time. This effect is due to offsets of some millimetres between the CCD lines in the focal plane. Exploiting this special configuration allows the detection of objects moving during this small time span. In this paper we present a method for the automatic detection and extraction of moving objects, mainly traffic, from single very high resolution optical satellite images of different sensors. The sensors investigated are WorldView-2, RapidEye, Pléiades and also the new SkyBox satellites. Different sensors require different approaches for detecting moving objects. Since the objects are mapped at different positions only in different spectral bands, changes in spectral properties also have to be taken into account. Where the main offset in the focal plane lies between the multispectral and the panchromatic CCD lines, as for Pléiades, an approach using weighted integration to obtain largely identical images is investigated. Other approaches for RapidEye and WorldView-2 are also shown. From these intermediate bands, difference images are calculated, and a method for detecting the moving objects from these difference images is proposed. Based on the presented methods, images from different sensors are processed and the results are assessed for detection quality (how many moving objects are detected and how many are missed) and accuracy (how accurate the derived speed and size of the objects are). Finally, the results are discussed and an outlook on possible improvements towards operational processing is presented.
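Once an object's displacement between two co-registered bands is measured, its ground speed follows directly from the inter-band time offset and the ground sampling distance. A minimal sketch with assumed sensor values (0.5 m GSD, 0.2 s inter-band delay); the numbers are illustrative, not taken from any of the sensors above.

```python
import numpy as np

def moving_object_speed(disp_px, gsd_m, band_dt_s):
    """Ground speed of an object from its pixel displacement between two
    bands acquired band_dt_s apart, at gsd_m metres per pixel."""
    return np.hypot(*disp_px) * gsd_m / band_dt_s

dx, dy = 2.5, 1.8                                   # measured displacement
v = moving_object_speed((dx, dy), gsd_m=0.5, band_dt_s=0.2)
print(f"{v:.1f} m/s ({v * 3.6:.0f} km/h)")          # car-like speed
```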
Data-driven grasp synthesis using shape matching and task-based pruning.
Li, Ying; Fu, Jiaxin L; Pollard, Nancy S
2007-01-01
Human grasps, especially whole-hand grasps, are difficult to animate because of the high number of degrees of freedom of the hand and the need for the hand to conform naturally to the object surface. Captured human motion data provides us with a rich source of examples of natural grasps. However, for each new object, we are faced with the problem of selecting the best grasp from the database and adapting it to that object. This paper presents a data-driven approach to grasp synthesis. We begin with a database of captured human grasps. To identify candidate grasps for a new object, we introduce a novel shape matching algorithm that matches hand shape to object shape by identifying collections of features having similar relative placements and surface normals. This step returns many grasp candidates, which are clustered and pruned by choosing the grasp best suited for the intended task. For pruning undesirable grasps, we develop an anatomically-based grasp quality measure specific to the human hand. Examples of grasp synthesis are shown for a variety of objects not present in the original database. This algorithm should be useful both as an animator tool for posing the hand and for automatic grasp synthesis in virtual environments.
Resolving future fire management conflicts using multicriteria decision making.
Driscoll, Don A; Bode, Michael; Bradstock, Ross A; Keith, David A; Penman, Trent D; Price, Owen F
2016-02-01
Management strategies to reduce the risks to human life and property from wildfire commonly involve burning native vegetation. However, planned burning can conflict with other societal objectives such as human health and biodiversity conservation. These conflicts are likely to intensify as fire regimes change under future climates and as growing human populations encroach farther into fire-prone ecosystems. Decisions about managing fire risks are therefore complex and warrant more sophisticated approaches than are typically used. We applied a multicriteria decision making approach (MCDA) with the potential to improve fire management outcomes to the case of a highly populated, biodiverse, and flammable wildland-urban interface. We considered the effects of 22 planned burning options on 8 objectives: house protection, maximizing water quality, minimizing carbon emissions and impacts on human health, and minimizing declines of 5 distinct species types. The MCDA identified a small number of management options (burning forest adjacent to houses) that performed well for most objectives, but not for one species type (arboreal mammal) or for water quality. Although MCDA made the conflict between objectives explicit, resolution of the problem depended on the weighting assigned to each objective. Additive weighting of criteria traded off the arboreal mammal and water quality objectives for other objectives. Multiplicative weighting identified scenarios that avoided poor outcomes for any objective, which is important for avoiding potentially irreversible biodiversity losses. To distinguish reliably among management options, future work should focus on reducing uncertainty in outcomes across a range of objectives. Considering management actions that have more predictable outcomes than landscape fuel management will be important. We found that, where data were adequate, an MCDA can support decision making in the complex and often conflicted area of fire management.
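The contrast between additive and multiplicative weighting is easy to demonstrate: additive aggregation lets a strong score on one objective compensate a near-zero score on another, while multiplicative aggregation punishes any near-zero outcome. The options, scores and weights below are invented for illustration.

```python
import numpy as np

# 3 hypothetical burning options scored 0-1 against 4 objectives.
options = ["burn near houses", "landscape mosaic", "no planned burn"]
scores = np.array([[0.9, 0.4, 0.7, 0.2],    # houses, water, health, mammal
                   [0.6, 0.6, 0.5, 0.6],
                   [0.2, 0.9, 0.8, 0.9]])
w = np.array([0.4, 0.2, 0.2, 0.2])          # assumed objective weights

additive = scores @ w                          # trades objectives off freely
multiplicative = np.prod(scores ** w, axis=1)  # punishes any near-zero score

for name, a, m in zip(options, additive, multiplicative):
    print(f"{name:18s} additive={a:.2f}  multiplicative={m:.2f}")
```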
The Role of the Quality Enhancement Plan in Engendering a Culture of Assessment
ERIC Educational Resources Information Center
Loughman, Thomas P.; Hickson, Joyce; Sheeks, Gina L.; Hortman, J. William
2008-01-01
During the past two decades, colleges and universities have used best practices from corporate management such as total quality management, strategic planning, management by objectives, benchmarking, data warehousing, and performance indicators. Many institutions of higher learning now have adopted comprehensive and multifaceted approaches to…
Objectivity of the Subjective Quality: Convergence on Competencies Expected of Doctoral Graduates
ERIC Educational Resources Information Center
Kariyana, Israel; Sonn, Reynold A.; Marongwe, Newlin
2017-01-01
This study assessed the competencies expected of doctoral graduates. Twelve purposefully sampled education experts provided the data. A case study design within a qualitative approach was adopted. Data were gathered through interviews and thematically analysed. Member checking ensured data trustworthiness. Factors affecting the quality of a…
Chamberlain, David; Brook, Richard
2014-03-01
Health organisations are often driven by specific targets defined by mission statements, aims and objectives to improve patient care. Health libraries need to demonstrate that they contribute to organisational objectives, but it is not clear how nurses view that contribution. To investigate ward nursing staff motivations, their awareness of ward and organisational objectives, and their attitudes towards the contribution of health library services to improving patient care. Qualitative research using focus group data was combined with content analysis of literature evidence and library statistics (quantitative data). Data were analysed using thematic coding, divided into five group themes: understanding of Trust, ward and personal objectives; use of the library; use of other information sources; quality; and issues. Four basic social-psychological processes were then developed. Behaviour indicates low awareness of organisational objectives despite patient-centric motivation. High awareness of library services is shown, with some connection made by ward staff between improved knowledge and improved patient care. There was a two-tiered understanding of ward objectives and library services, based on level of seniority. However, an evidence-based culture needs to be intrinsic to the organisation before all staff benefit. Libraries can actively engage in this at ward and board level and improve patient care by supporting organisational objectives.
Appreciative Inquiry for quality improvement in primary care practices.
Ruhe, Mary C; Bobiak, Sarah N; Litaker, David; Carter, Caroline A; Wu, Laura; Schroeder, Casey; Zyzanski, Stephen J; Weyer, Sharon M; Werner, James J; Fry, Ronald E; Stange, Kurt C
2011-01-01
To test the effect of an Appreciative Inquiry (AI) quality improvement strategy on clinical quality management and practice development outcomes. Appreciative inquiry enables the discovery of shared motivations, envisioning a transformed future, and learning around the implementation of a change process. Thirty diverse primary care practices were randomly assigned to receive an AI-based intervention focused on a practice-chosen topic and on improving preventive service delivery (PSD) rates. Medical-record review assessed change in PSD rates. Ethnographic field notes and observational checklist analysis used editing and immersion/crystallization methods to identify factors affecting intervention implementation and practice development outcomes. The PSD rates did not change. Field note analysis suggested that the intervention elicited core motivations, facilitated development of a shared vision, defined change objectives, and fostered respectful interactions. Practices most likely to implement the intervention or develop new practice capacities exhibited 1 or more of the following: support from key leader(s), a sense of urgency for change, a mission focused on serving patients, health care system and practice flexibility, and a history of constructive practice change. An AI approach and enabling practice conditions can lead to intervention implementation and practice development by connecting individual and practice strengths and motivations to the change objective.
Methods for assessing the quality of mammalian embryos: How far we are from the gold standard?
Rocha, José C; Passalia, Felipe; Matos, Felipe D; Maserati, Marc P; Alves, Mayra F; Almeida, Tamie G de; Cardoso, Bruna L; Basso, Andrea C; Nogueira, Marcelo F G
2016-08-01
Morphological embryo classification is of great importance for many laboratory techniques, from basic research to those applied to assisted reproductive technology. However, the standard classification method for both human and cattle embryos is based on quality parameters that reflect the overall morphological quality of the embryo in cattle, or the quality of the individual embryonic structures, more relevant in human embryo classification. This assessment method is biased by the subjectivity of the evaluator and, even though several guidelines exist to standardize the classification, it is not a method capable of giving reliable and trustworthy results. The latest approaches for the improvement of quality assessment include the use of data from cellular metabolism, a new morphological grading system, development kinetics and cleavage symmetry, embryo cell biopsy followed by pre-implantation genetic diagnosis, zona pellucida birefringence, ion release by the embryo cells, and so forth. Nowadays there is a great need for evaluation methods that are practical and non-invasive while being accurate and objective. A method along these lines would be of great importance for embryo evaluation by embryologists, clinicians and other professionals who work with assisted reproductive technology. Several techniques show promising results in this sense, one being the use of digital images of the embryo as the basis for feature extraction and classification by means of artificial intelligence techniques (such as genetic algorithms and artificial neural networks). This process has the potential to become an accurate and objective standard for embryo quality assessment.
A Qualitative Analysis of Acute Skin Toxicity among Breast Cancer Radiotherapy Patients
Schnur, Julie B.; Ouellette, Suzanne C.; DiLorenzo, Terry A.; Green, Sheryl; Montgomery, Guy H.
2013-01-01
Objectives One of the most common acute side effects of breast cancer radiotherapy is treatment induced skin changes, referred to as skin toxicity. Yet no research to date has focused expressly on skin toxicity-related quality of life in breast cancer radiotherapy patients. Therefore, our aim was to use qualitative approaches to better understand the impact of skin toxicity on quality of life. Methods Semi-structured interviews were conducted with 20 women (Stage 0-III breast cancer), during their last week of external beam radiotherapy. Each interview was transcribed verbatim, and thematic analysis was performed. Results Three themes were identified based on the interview responses: First, skin changes affect multiple dimensions of quality of life. They cause physical discomfort, body image disturbance, emotional distress, and impair both day-to-day functioning and satisfaction with radiation treatment. Second, individual differences affect women’s experiences. Generally African-American women, younger women, women who are not currently in a relationship, women who are being treated during the summer, and women who are more invested in their appearance are more distressed by skin toxicity. Third, women use a variety of symptom management strategies including self-medication, complementary/alternative medicine approaches, and psychological strategies. Conclusions Implications of results are: 1) Skin toxicity affects numerous dimensions of quality of life, and assessment approaches and psychosocial interventions should address this; 2) individual differences may affect the experience of skin toxicity, and should be considered in treatment and education approaches; and 3) participants’ own creativity and problem-solving should be used to improve the treatment experience. PMID:20238306
Improving automated 3D reconstruction methods via vision metrology
NASA Astrophysics Data System (ADS)
Toschi, Isabella; Nocerino, Erica; Hess, Mona; Menna, Fabio; Sargeant, Ben; MacDonald, Lindsay; Remondino, Fabio; Robson, Stuart
2015-05-01
This paper aims to provide a procedure for improving automated 3D reconstruction methods via vision metrology. The 3D reconstruction problem is generally addressed using two different approaches. On the one hand, vision metrology (VM) systems try to accurately derive 3D coordinates of few sparse object points for industrial measurement and inspection applications; on the other, recent dense image matching (DIM) algorithms are designed to produce dense point clouds for surface representations and analyses. This paper strives to demonstrate a step towards narrowing the gap between traditional VM and DIM approaches. Efforts are therefore intended to (i) test the metric performance of the automated photogrammetric 3D reconstruction procedure, (ii) enhance the accuracy of the final results and (iii) obtain statistical indicators of the quality achieved in the orientation step. VM tools are exploited to integrate their main functionalities (centroid measurement, photogrammetric network adjustment, precision assessment, etc.) into the pipeline of 3D dense reconstruction. Finally, geometric analyses and accuracy evaluations are performed on the raw output of the matching (i.e. the point clouds) by adopting a metrological approach. The latter is based on the use of known geometric shapes and quality parameters derived from VDI/VDE guidelines. Tests are carried out by imaging the calibrated Portable Metric Test Object, designed and built at University College London (UCL), UK. It allows assessment of the performance of the image orientation and matching procedures within a typical industrial scenario, characterised by poor texture and known 3D/2D shapes.
A Risk-based Assessment And Management Framework For Multipollutant Air Quality
Frey, H. Christopher; Hubbell, Bryan
2010-01-01
The National Research Council recommended both a risk- and performance-based multipollutant approach to air quality management. Specifically, management decisions should be based on minimizing the exposure to, and risk of adverse effects from, multiple sources of air pollution and that the success of these decisions should be measured by how well they achieved this objective. We briefly describe risk analysis and its application within the current approach to air quality management. Recommendations are made as to how current practice could evolve to support a fully risk- and performance-based multipollutant air quality management system. The ability to implement a risk assessment framework in a credible and policy-relevant manner depends on the availability of component models and data which are scientifically sound and developed with an understanding of their application in integrated assessments. The same can be said about accountability assessments used to evaluate the outcomes of decisions made using such frameworks. The existing risk analysis framework, although typically applied to individual pollutants, is conceptually well suited for analyzing multipollutant management actions. Many elements of this framework, such as emissions and air quality modeling, already exist with multipollutant characteristics. However, the framework needs to be supported with information on exposure and concentration response relationships that result from multipollutant health studies. Because the causal chain that links management actions to emission reductions, air quality improvements, exposure reductions and health outcomes is parallel between prospective risk analyses and retrospective accountability assessments, both types of assessment should be placed within a single framework with common metrics and indicators where possible. Improvements in risk reductions can be obtained by adopting a multipollutant risk analysis framework within the current air quality management system, e.g. focused on standards for individual pollutants and with separate goals for air toxics and ambient pollutants. However, additional improvements may be possible if goals and actions are defined in terms of risk metrics that are comparable across criteria pollutants and air toxics (hazardous air pollutants), and that encompass both human health and ecological risks. PMID:21209847
A Human Capital Approach to Reduce Health Disparities
Glover, Saundra H.; Xirasagar, Sudha; Jeon, Yunho; Elder, Keith T.; Piper, Crystal N.; Pastides, Harris
2010-01-01
Objective To introduce a human capital approach to reducing health disparities in South Carolina by increasing the number and quality of trained minority professionals in public health practice and research. Methods The conceptual basis and elements of Project EXPORT in South Carolina are described. Project EXPORT is a community-based participatory research (CBPR) translational project designed to build human capital in public health practice and research. The project involves Claflin University (CU), a Historically Black College and University (HBCU), and the African American community of Orangeburg, South Carolina, in reducing health disparities, utilizing resources from the University of South Carolina (USC), a level 1 research institution, to build expertise at a minority-serving institution. The elements of Project EXPORT were created to advance the science base of disparities reduction, increase the number of trained minority researchers, and engage the African American community at all stages of research. Conclusion Building upon past collaborations between HBCUs in South Carolina and USC, this project holds promise for a public health human capital approach to reducing health disparities. PMID:21814634
Shin, YoungJu; Miller-Day, Michelle; Pettigrew, Jonathan; Hecht, Michael L.; Krieger, Janice L.
2014-01-01
Enhancing the delivery quality of school-based, evidence-based prevention programs is one key to ensuring uniform program effects on student outcomes. Program evaluations often focus on content dosage when implementing prevention curricula; however, less is known about the implementation quality of prevention content, especially among teachers who may or may not have a prevention background. The goal of the current study is to add to the scholarly literature on implementation quality for a school-based substance use prevention intervention. Twenty-five schools in Ohio and Pennsylvania implemented the original keepin' REAL (kiR) substance use prevention curriculum. Each of the ten 40-45 min lessons of the kiR curriculum was video recorded. Coders observed and rated a random sample of 276 videos reflecting 78 classes taught by 31 teachers. Codes included teachers' delivery techniques (e.g. lecture, discussion, demonstration and role play) and engagement with students (e.g. attentiveness, enthusiasm and positivity). Based on the video ratings, a latent profile analysis was run to identify a typology of delivery quality. Five profiles were identified: holistic approach, attentive teacher-orientated approach, enthusiastic lecture approach, engaged interactive learning approach and skill practice-only approach. This study provides a descriptive typology of delivery quality while implementing a school-based substance use prevention intervention. PMID:25274721
NASA Astrophysics Data System (ADS)
Shang, Ruibo; Archibald, Richard; Gelb, Anne; Luke, Geoffrey P.
2018-02-01
In photoacoustic (PA) imaging, the optical absorption can be acquired from the initial pressure distribution (IPD). An accurate reconstruction of the IPD is therefore very helpful for reconstructing the optical absorption. However, the image quality of PA imaging in scattering media is deteriorated by acoustic diffraction, imaging artifacts, and weak PA signals. In this paper, we propose a sparsity-based optimization approach that improves the reconstruction of the IPD in PA imaging. A linear imaging forward model was set up based on a time-and-delay method, with the assumption that the point spread function (PSF) is spatially invariant. An optimization problem was then posed, with a regularization term encoding the sparsity of the IPD in a certain domain, to solve this inverse problem. As a proof of principle, the approach was applied to reconstructing point objects and blood vessel phantoms. The resolution and signal-to-noise ratio (SNR) were compared between conventional back-projection and our proposed approach. Overall, these results show that computational imaging can leverage the sparsity of PA images to improve the estimation of the IPD.
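A minimal 1-D sketch of this kind of sparsity-regularized inversion is given below, using the iterative shrinkage-thresholding algorithm (ISTA) on a toy convolutional forward model; the PSF, problem size and regularization weight are invented for illustration, and the paper's actual solver and transform domain may differ.

```python
import numpy as np

def ista_deconvolve(b, psf, lam=0.05, n_iter=300):
    """Recover a sparse IPD estimate x from a blurred measurement
    b = conv(x, psf) by ISTA on 0.5*||Ax - b||^2 + lam*||x||_1."""
    n = b.size
    # Explicit convolution matrix for the (assumed spatially invariant) PSF:
    # column i of A is the PSF centred at sample i, so A @ x = conv(x, psf).
    A = np.array([np.convolve(np.eye(n)[i], psf, mode="same") for i in range(n)]).T
    step = 1.0 / np.linalg.norm(A, 2) ** 2      # 1/L, L = Lipschitz constant
    x = np.zeros(n)
    for _ in range(n_iter):
        z = x - step * (A.T @ (A @ x - b))      # gradient step on the data term
        x = np.sign(z) * np.maximum(np.abs(z) - step * lam, 0.0)  # soft threshold
    return x

# Toy demo: two point absorbers blurred by a Gaussian PSF plus noise.
rng = np.random.default_rng(0)
x_true = np.zeros(128); x_true[[40, 90]] = 1.0
psf = np.exp(-0.5 * (np.arange(-10, 11) / 3.0) ** 2)
b = np.convolve(x_true, psf, mode="same") + 0.01 * rng.normal(size=128)
x_hat = ista_deconvolve(b, psf)
```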
Food Quality Improvement of Soy Milk Made from Short-Time Germinated Soybeans
Jiang, Susu; Cai, Weixi; Xu, Baojun
2013-01-01
The objectives of this study were to develop soy milk with improved food quality and to enhance the functional attributes by incorporating short-time germination into the processing. Changes in trypsin inhibitor activity (TIA), phytic acid content and total phenolic content (TPC) in soy milk produced from soybeans germinated within 72 h were investigated to determine the optimum germination condition. Results from the present research showed significant (p < 0.05) improvement of TPC in cooked germinated soybean milk, while both the TIA and phytic acid content were decreased significantly (p < 0.05). In the subsequent evaluation on the quality attributes under the optimum germination condition, soy milk made from 28 h-germinated soybeans presented enhanced nutritional value and comparable physicochemical properties to conventional soy milk. The current approach provides a feasible and convenient way for soy-based product innovation in both household and industrial settings. PMID:28239109
A study of optimization techniques in HDR brachytherapy for the prostate
NASA Astrophysics Data System (ADS)
Pokharel, Ghana Shyam
Several studies carried out thus far are in favor of dose escalation to the prostate gland to achieve better local control of the disease. However, the optimal way of delivering higher doses of radiation therapy to the prostate without harming neighboring critical structures is still debated. In this study, we proposed that real-time high dose rate (HDR) brachytherapy with highly efficient and effective optimization could be an alternative means of precise delivery of such higher doses. This approach to delivery eliminates critical issues such as treatment setup uncertainties and target localization that arise in external beam radiation therapy. Likewise, dosimetry in HDR brachytherapy is not influenced by organ edema and potential source migration as in permanent interstitial implants. Moreover, recent reports of radiobiological parameters further strengthen the argument for using hypofractionated HDR brachytherapy in the management of prostate cancer. Firstly, we studied the essential features and requirements of a real-time HDR brachytherapy treatment planning system. Automated catheter reconstruction with fast editing tools, a fast yet accurate dose engine, and a robust and fast optimization and evaluation engine are some of the essential requirements for such procedures. Moreover, in most of the cases we performed, treatment plan optimization took a significant share of the overall procedure time. So, making treatment plan optimization automatic or semi-automatic, with sufficient speed and accuracy, was the goal of the remaining part of the project. Secondly, we studied the role of the optimization function and constraints in the overall quality of the optimized plan. We studied a gradient-based deterministic algorithm with dose volume histogram (DVH) and more conventional variance-based objective functions for optimization. In this optimization strategy, the relative weight of a particular objective in the aggregate objective function signifies its importance with respect to the other objectives. Based on our study, the DVH-based objective function performed better than the traditional variance-based objective function in creating a clinically acceptable plan when executed under identical conditions. Thirdly, we studied a multiobjective optimization strategy using both DVH- and variance-based objective functions. The optimization strategy was to create several Pareto optimal solutions by scanning the clinically relevant part of the Pareto front. This strategy was adopted to decouple optimization from decision making, such that the user could select the final solution from a pool of alternative solutions based on his/her clinical goals. The overall quality of the treatment plan improved using this approach compared to the traditional class solution approach. In fact, the final optimized plan selected using the decision engine with the DVH-based objective was comparable to a typical clinical plan created by an experienced physicist. Next, we studied a hybrid technique comprising both stochastic and deterministic algorithms to optimize both dwell positions and dwell times. The simulated annealing algorithm was used to find an optimal catheter distribution, and the DVH-based algorithm was used to optimize the 3D dose distribution for a given catheter distribution. This unique treatment planning and optimization tool was capable of producing clinically acceptable, highly reproducible treatment plans in a clinically reasonable time.
As this algorithm was able to create clinically acceptable plans within a clinically reasonable time automatically, it is appealing for real-time procedures. Next, we studied the feasibility of multiobjective optimization using an evolutionary algorithm for real-time HDR brachytherapy for the prostate. With properly tuned algorithm-specific parameters, the algorithm was able to create clinically acceptable plans within a clinically reasonable time. However, the algorithm was run for only a limited number of generations, fewer than is generally considered optimal for such algorithms. This was done to keep the time window suitable for real-time procedures. Therefore, further study under improved conditions is required to realize the full potential of the algorithm.
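As an illustration of the DVH-based objective idea discussed above, here is a minimal sketch that penalizes target underdosing together with a DVH-style constraint on the hottest fraction of an organ at risk, then optimizes dwell times with an off-the-shelf bounded optimizer. The dose-rate matrices, prescription levels and penalty form are invented stand-ins, not the dissertation's actual formulation.

```python
import numpy as np
from scipy.optimize import minimize

def dvh_penalty(t, D_target, D_oar, rx=100.0, oar_max=75.0, v_oar=0.1):
    """Toy DVH-style objective over dwell times t: quadratic penalties on
    target voxels below the prescription rx, and on the hottest v_oar
    fraction of OAR voxels above the dose limit oar_max."""
    d_t = D_target @ t
    d_o = np.sort(D_oar @ t)[::-1]              # OAR doses, descending
    k = max(1, int(v_oar * d_o.size))           # hottest v_oar fraction
    under = np.maximum(rx - d_t, 0.0)
    over = np.maximum(d_o[:k] - oar_max, 0.0)
    return np.mean(under ** 2) + np.mean(over ** 2)

# Dose-rate matrices (voxels x dwell positions) would come from a TG-43
# dose engine; random values stand in here.
rng = np.random.default_rng(0)
D_t = rng.uniform(1.0, 5.0, (200, 16))
D_o = rng.uniform(0.1, 1.0, (80, 16))
res = minimize(dvh_penalty, x0=np.full(16, 10.0), args=(D_t, D_o),
               bounds=[(0, None)] * 16, method="L-BFGS-B")
dwell_times = res.x                              # non-negative dwell times
```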
NASA Astrophysics Data System (ADS)
Pipaud, Isabel; Lehmkuhl, Frank
2017-09-01
In the field of geomorphology, automated extraction and classification of landforms is one of the most active research areas. Until the late 2000s, this task had primarily been tackled using pixel-based approaches. As these methods consider pixels and pixel neighborhoods as the sole basic entities for analysis, they cannot account for the irregular boundaries of real-world objects. Object-based analysis frameworks emerging from the field of remote sensing have been proposed as an alternative approach, and were successfully applied in case studies falling in the domains of both general and specific geomorphology. In this context, the a priori selection of scale parameters or bandwidths is crucial for the segmentation result, because inappropriate parametrization will result in either over-segmentation or insufficient segmentation. In this study, we describe a novel supervised method for the delineation and classification of alluvial fans, and assess its applicability using an SRTM 1″ DEM scene depicting a section of the north-eastern Mongolian Altai, located in northwest Mongolia. The approach is premised on the application of mean-shift segmentation and the use of a one-class support vector machine (SVM) for classification. To account for variability in alluvial fan dimension and shape, segmentation is performed repeatedly for different weightings of the incorporated morphometric parameters as well as different segmentation bandwidths. The final classification layer is obtained by selecting, for each real-world object, the most appropriate segmentation result according to fuzzy membership values derived from the SVM classification. Our results show that mean-shift segmentation and SVM-based classification provide an effective framework for the delineation and classification of a particular landform. Variable bandwidths and terrain parameter weightings were identified as crucial for capturing intra-class variability and, in turn, for consistently high segmentation quality. Our analysis further reveals that the incorporation of morphometric parameters quantifying specific morphological aspects of a landform is indispensable for developing an accurate classification scheme. Alluvial fans exhibiting accentuated composite morphologies were identified as a major challenge for automatic delineation, as they cannot be fully captured by a single segmentation run. There is, however, a high probability that this shortcoming can be overcome by enhancing the presented approach with a routine merging fan sub-entities based on their spatial relationships.
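A compact sketch of the mean-shift plus one-class SVM pairing described above, on synthetic feature-space data; the feature stack, bandwidth and training segments are invented, and the study's actual workflow operates on DEM rasters with repeated segmentation runs rather than a single feature-space clustering.

```python
import numpy as np
from sklearn.datasets import make_blobs
from sklearn.cluster import MeanShift
from sklearn.svm import OneClassSVM

# Toy stand-in for a stack of weighted morphometric parameters (e.g.
# elevation, slope, curvature) sampled at DEM cells.
features, _ = make_blobs(n_samples=600, centers=8, n_features=3, random_state=0)

# 1. Mean-shift "segmentation" in feature space; the bandwidth plays the
#    role of the segmentation bandwidth varied in the study.
labels = MeanShift(bandwidth=2.0).fit_predict(features)

# 2. Aggregate per-segment features (here simply the segment mean vector).
seg_ids = np.unique(labels)
seg_feats = np.array([features[labels == s].mean(axis=0) for s in seg_ids])

# 3. One-class SVM trained on segments known to be alluvial fans (the first
#    few segments stand in for digitized training fans); its decision values
#    act as the fuzzy membership used to pick the best segmentation run.
svm = OneClassSVM(nu=0.1, gamma="scale").fit(seg_feats[:4])
membership = svm.decision_function(seg_feats)
```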
An importance-performance analysis of hospital information system attributes: A nurses' perspective.
Cohen, Jason F; Coleman, Emma; Kangethe, Matheri J
2016-02-01
Health workers have numerous concerns about hospital information system (HIS) usage. Addressing these concerns requires understanding the system attributes most important to their satisfaction and productivity. Following a recent HIS implementation, our objective was to identify priorities for managerial intervention based on user evaluations of the performance of the HIS attributes, as well as the relative importance of these attributes to user satisfaction and productivity outcomes. We collected data along a set of attributes representing system quality, data quality, information quality, and service quality from 154 nurse users. Their quantitative responses were analysed using the partial least squares approach, followed by an importance-performance analysis. Qualitative responses were analysed using thematic analysis to triangulate and supplement the quantitative findings. Two system quality attributes (responsiveness and ease of learning), one information quality attribute (detail), one service quality attribute (sufficient support), and three data quality attributes (records complete, accurate and never missing) were identified as high priorities for intervention. Our application of importance-performance analysis is unique in HIS evaluation, and we have illustrated its utility for identifying those system attributes for which underperformance is not acceptable to users and which should therefore be high priorities for intervention. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.
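The importance-performance grid itself is easy to sketch; below, "performance" is each attribute's mean rating and "importance" a simple correlation with overall satisfaction, a rough stand-in for the partial least squares path weights used in the study. The attribute names, ratings and cut-offs are illustrative assumptions.

```python
import numpy as np

def importance_performance(ratings, satisfaction, names):
    """Toy IPA grid: 'performance' is the mean user rating of each attribute;
    'importance' is the attribute's correlation with overall satisfaction."""
    perf = ratings.mean(axis=0)
    imp = np.array([np.corrcoef(ratings[:, j], satisfaction)[0, 1]
                    for j in range(ratings.shape[1])])
    p_cut, i_cut = perf.mean(), imp.mean()   # quadrant boundaries
    for n, p, i in zip(names, perf, imp):
        quadrant = ("concentrate here" if i >= i_cut and p < p_cut else
                    "keep up the good work" if i >= i_cut else
                    "low priority" if p < p_cut else "possible overkill")
        print(f"{n}: importance={i:.2f}, performance={p:.2f} -> {quadrant}")

rng = np.random.default_rng(0)
R = rng.integers(1, 8, size=(154, 4)).astype(float)      # 1-7 ratings
sat = R @ np.array([0.5, 0.3, 0.1, 0.1]) + rng.normal(0, 1, 154)
importance_performance(R, sat, ["responsiveness", "ease of learning",
                                "detail", "support"])
```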
CognitionMaster: an object-based image analysis framework
2013-01-01
Background Automated image analysis methods are becoming more and more important for extracting and quantifying image features in microscopy-based biomedical studies, and several commercial or open-source tools are available. However, most of the approaches rely on pixel-wise operations, a concept that has limitations when high-level object features and relationships between objects are studied and when user interactivity on the object level is desired. Results In this paper we present an open-source software that facilitates the analysis of content features and object relationships by using objects, instead of individual pixels, as the basic processing unit. Our approach also enables users without programming knowledge to compose "analysis pipelines" that exploit the object-level approach. We demonstrate the design and use of example pipelines for immunohistochemistry-based cell proliferation quantification in breast cancer and two-photon fluorescence microscopy data on bone-osteoclast interaction, which underline the advantages of the object-based concept. Conclusions We introduce an open-source software system that offers object-based image analysis. The object-based concept allows for the straightforward development of object-related interactive or fully automated image analysis solutions. The presented software may therefore serve as a basis for various applications in the field of digital image analysis. PMID:23445542
An Integrative Approach to Cultural Competence in the Psychiatric Curriculum
ERIC Educational Resources Information Center
Fung, Kenneth; Andermann, Lisa; Zaretsky, Ari; Lo, Hung-Tat
2008-01-01
Objective: As it is increasingly recognized that cultural competence is an essential quality for any practicing psychiatrist, postgraduate psychiatry training programs need to incorporate cultural competence training into their curricula. This article documents the unique approach to resident cultural competence training being developed in the…
Space transfer vehicle concepts and requirements, volume 2, book 1
NASA Technical Reports Server (NTRS)
1991-01-01
The objective of the systems engineering task was to develop and implement an approach that would generate the required study products as defined by program directives. This product list included a set of system and subsystem requirements, a complete set of optimized trade studies and analyses resulting in a recommended system configuration, and the definition of an integrated system/technology and advanced development growth path. A primary ingredient in the approach was the TQM philosophy, stressing job quality from the inception. Included throughout the Systems Engineering, Programmatics, Concepts, Flight Design, and Technology sections are data supporting the original objectives as well as supplemental information resulting from program activities. The primary result of the analyses and studies was the recommendation of a single propulsion stage Lunar Transportation System (LTS) configuration that supports several different operations scenarios with minor element changes. This concept has the potential to support two additional scenarios with complex element changes. The space-based LTS concept consists of three primary configurations--Piloted, Reusable Cargo, and Expendable Cargo.
Standardizing Quality Assessment of Fused Remotely Sensed Images
NASA Astrophysics Data System (ADS)
Pohl, C.; Moellmann, J.; Fries, K.
2017-09-01
The multitude of available operational remote sensing satellites has led to the development of many image fusion techniques to provide high spatial, spectral and temporal resolution images. The comparison of different techniques is necessary to obtain an optimized image for the different applications of remote sensing. There are two approaches to assessing image quality: 1. qualitatively, by visual interpretation, and 2. quantitatively, using image quality indices. However, an objective comparison is difficult because a visual assessment is always subjective and a quantitative assessment depends on the criteria chosen. Depending on the criteria and indices, the result varies. It is therefore necessary to standardize both processes (qualitative and quantitative assessment) in order to allow an objective evaluation of image fusion quality. Various studies have been conducted at the University of Osnabrueck (UOS) to establish a standardized process for objectively comparing fused image quality. First, established image fusion quality assessment protocols, i.e. Quality with No Reference (QNR) and Khan's protocol, were compared on various fusion experiments. Second, the process of visual quality assessment was structured and standardized with the aim of providing an evaluation protocol. This manuscript reports on the results of the comparison and provides recommendations for future research.
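For concreteness, the sketch below computes ERGAS, one widely used quantitative fusion-quality index of the kind such protocols aggregate; the array layout and the 4:1 pansharpening ratio are assumptions for illustration, not details from the study.

```python
import numpy as np

def ergas(fused, reference, pixel_ratio=0.25):
    """ERGAS, a normalized band-averaged RMSE used as a fusion-quality index.
    'pixel_ratio' is the pan-to-MS pixel size ratio (0.25 for a typical 4:1
    pansharpening); arrays are (bands, rows, cols). Lower is better."""
    terms = []
    for f, r in zip(fused, reference):
        rmse = np.sqrt(np.mean((f.astype(float) - r.astype(float)) ** 2))
        terms.append((rmse / r.mean()) ** 2)   # RMSE relative to band mean
    return 100.0 * pixel_ratio * np.sqrt(np.mean(terms))
```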
Galbally, Javier; Marcel, Sébastien; Fierrez, Julian
2014-02-01
Ensuring the actual presence of a real, legitimate trait, in contrast to a fake, self-manufactured synthetic or reconstructed sample, is a significant problem in biometric authentication, which requires the development of new and efficient protection measures. In this paper, we present a novel software-based fake detection method that can be used in multiple biometric systems to detect different types of fraudulent access attempts. The objective of the proposed system is to enhance the security of biometric recognition frameworks by adding liveness assessment in a fast, user-friendly, and non-intrusive manner, through the use of image quality assessment. The proposed approach presents a very low degree of complexity, which makes it suitable for real-time applications, using 25 general image quality features extracted from one image (i.e., the same image acquired for authentication purposes) to distinguish between legitimate and impostor samples. The experimental results, obtained on publicly available data sets of fingerprint, iris, and 2D face, show that the proposed method is highly competitive compared with other state-of-the-art approaches and that the analysis of the general image quality of real biometric samples reveals highly valuable information that may be very efficiently used to discriminate them from fake traits.
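A toy sketch of the idea: extract a couple of general-purpose, no-reference image quality features (the paper uses 25) and feed them to a simple discriminant classifier, in keeping with the method's low-complexity design. The feature choices, synthetic "real"/"fake" images and classifier here are illustrative assumptions.

```python
import numpy as np
from scipy.ndimage import gaussian_filter, laplace
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

def iq_features(img):
    """Two examples of general-purpose image quality measures: a Laplacian
    sharpness score and a smoothing-residual noise estimate."""
    img = img.astype(float)
    sharpness = laplace(img).var()
    noise = np.mean((img - gaussian_filter(img, sigma=1.0)) ** 2)
    return [sharpness, noise]

# Synthetic stand-ins: 'fake' acquisitions are modelled as blurrier copies.
rng = np.random.default_rng(1)
real = [iq_features(rng.normal(128, 30, (64, 64))) for _ in range(20)]
fake = [iq_features(gaussian_filter(rng.normal(128, 30, (64, 64)), 2.0))
        for _ in range(20)]
clf = LinearDiscriminantAnalysis().fit(real + fake, [1] * 20 + [0] * 20)
```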
Desired Precision in Multi-Objective Optimization: Epsilon Archiving or Rounding Objectives?
NASA Astrophysics Data System (ADS)
Asadzadeh, M.; Sahraei, S.
2016-12-01
Multi-objective optimization (MO) aids in supporting the decision-making process in water resources engineering and design problems. One of the main goals of solving an MO problem is to archive a set of solutions that is well-distributed across a wide range of all the design objectives. Modern MO algorithms use the epsilon dominance concept to define a mesh with a pre-defined grid-cell size (often called epsilon) in the objective space and archive at most one solution in each grid cell. Epsilon can be set to the desired precision level of each objective function to make sure that the difference between each pair of archived solutions is meaningful. This epsilon archiving process is computationally expensive in problems that have quick-to-evaluate objective functions. This research explores the applicability of a similar but computationally more efficient approach to respecting the desired precision level of all objectives in the solution archiving process. In this alternative approach, each objective function is rounded to the desired precision level before comparing any new solution to the set of archived solutions, which already have rounded objective function values. This alternative solution archiving approach is compared to the epsilon archiving approach in terms of efficiency and quality of archived solutions when solving mathematical test problems and hydrologic model calibration problems.
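The two archiving strategies compared in the abstract can be sketched in a few lines (minimization assumed); this simplified version ignores the tie-breaking rules that production epsilon archives apply within a grid cell.

```python
import numpy as np

def dominates(a, b):
    """True if objective vector a Pareto-dominates b (minimization)."""
    return np.all(a <= b) and np.any(a < b)

def eps_archive(solutions, eps):
    """Epsilon archiving: index each solution by its grid cell and keep at
    most one non-dominated solution per cell."""
    archive = {}                                 # cell -> objective vector
    for s in map(np.asarray, solutions):
        cell = tuple(np.floor(s / eps).astype(int))
        if any(dominates(k, s) for k in archive.values()):
            continue
        archive = {c: k for c, k in archive.items() if not dominates(s, k)}
        if cell not in archive:
            archive[cell] = s
    return list(archive.values())

def rounded_archive(solutions, eps):
    """The cheaper alternative studied here: round objectives to the desired
    precision first, then run a plain non-dominated archive on the rounded
    values -- no grid bookkeeping needed."""
    archive = []
    for s in solutions:
        r = np.round(np.asarray(s) / eps) * eps
        if any(dominates(k, r) or np.array_equal(k, r) for k in archive):
            continue
        archive = [k for k in archive if not dominates(r, k)] + [r]
    return archive
```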
NASA Astrophysics Data System (ADS)
Tonbul, H.; Kavzoglu, T.
2016-12-01
In recent years, object-based image analysis (OBIA) has spread widely and become an accepted technique for the analysis of remotely sensed data. OBIA deals with grouping pixels into homogeneous objects based on spectral, spatial and textural features of contiguous pixels in an image. The first stage of OBIA, image segmentation, is the most prominent part of object recognition. In this study, multiresolution segmentation, which is a region-based approach, was employed to construct image objects. In the application of multiresolution segmentation, three parameters, namely shape, compactness and scale, must be set by the analyst. Segmentation quality strongly influences the fidelity of the thematic maps and, accordingly, the classification accuracy. Therefore, it is of great importance to search for and set optimal values for the segmentation parameters. In the literature, the main focus has been on the definition of the scale parameter, under the assumption that the effect of the shape and compactness parameters on achieved classification accuracy is limited. The aim of this study is to analyze in depth the influence of the shape/compactness parameters by varying their values while using the optimal scale parameter determined with the Estimation of Scale Parameter (ESP-2) approach. A pansharpened QuickBird-2 image covering Trabzon, Turkey, was employed to investigate the objectives of the study. For this purpose, six different combinations of shape/compactness were utilized to draw conclusions on the behavior of the shape and compactness parameters and on the optimal setting for all parameters as a whole. Objects were assigned to classes using the nearest neighbor classifier in all segmentation observations, and equal numbers of pixels were randomly selected to calculate accuracy metrics. The highest overall accuracy (92.3%) was achieved by setting the shape/compactness criteria to 0.3/0.3. The results of this study indicate that the shape/compactness parameters can have a significant effect on classification accuracy, with a 4% change in overall accuracy. The statistical significance of the differences in accuracy was also tested using McNemar's test, which showed that the difference between the poor and optimal settings of the shape/compactness parameters was statistically significant, suggesting a search for optimal parameterization instead of default settings.
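McNemar's test for comparing two classifications against common reference data takes one call in statsmodels; the counts below are invented for illustration and are not the study's.

```python
import numpy as np
from statsmodels.stats.contingency_tables import mcnemar

# 2x2 agreement table against common reference pixels:
# rows = setting A correct/incorrect, columns = setting B correct/incorrect.
table = np.array([[812, 61],
                  [ 24, 103]])
res = mcnemar(table, exact=False, correction=True)
print(res.statistic, res.pvalue)   # p < 0.05 -> settings differ significantly
```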
Facial motion parameter estimation and error criteria in model-based image coding
NASA Astrophysics Data System (ADS)
Liu, Yunhai; Yu, Lu; Yao, Qingdong
2000-04-01
Model-based image coding has received extensive attention due to its high subjective image quality and low bit-rates. But the estimation of object motion parameters is still a difficult problem, and there are no proper error criteria for quality assessment that are consistent with visual properties. This paper presents an algorithm for facial motion parameter estimation based on feature point correspondence and gives motion parameter error criteria. The facial motion model comprises three parts. The first part is the global 3-D rigid motion of the head, the second part is non-rigid translation motion in the jaw area, and the third part consists of local non-rigid expression motion in the eye and mouth areas. The feature points are automatically selected by a function of edges, brightness and end-nodes outside the eye and mouth blocks. The number of feature points is adjusted adaptively. The jaw translation motion is tracked by the changes in the positions of the jaw feature points. The areas of non-rigid expression motion can be rebuilt using a block-pasting method. An approach for estimating motion parameter error based on the quality of the reconstructed image is suggested, and an area error function and an error function of the contour transition-turn rate are used as quality criteria. The criteria properly reflect the geometric image distortion caused by errors in the estimated motion parameters.
ERIC Educational Resources Information Center
Kay, Robin H.; Knaack, Liesel
2009-01-01
Learning objects are interactive web-based tools that support the learning of specific concepts by enhancing, amplifying, and/or guiding the cognitive processes of learners. Research on the impact, effectiveness, and usefulness of learning objects is limited, partially because comprehensive, theoretically based, reliable, and valid evaluation…
Patterns of coordination and clinical outcomes: a study of surgical services.
Young, G J; Charns, M P; Desai, K; Khuri, S F; Forbes, M G; Henderson, W; Daley, J
1998-01-01
OBJECTIVE: To test the hypothesis that surgical services combining relatively high levels of feedback and programming approaches to the coordination of surgical staff would have better quality of care than surgical services using low levels of both coordination approaches, as well as those surgical services using a low level of either coordination approach. STUDY SETTING: A study sample of 44 academically affiliated surgical services that are part of the Department of Veterans Affairs. STUDY DESIGN: In a cross-sectional analysis, surgical services were assigned to one of three groups based on their scores on feedback and programming coordination measures: high on both measures; high on one measure, low on the other; and low on both. Univariate and multivariate analyses were used to assess differences among these groups with respect to three quality indicators: risk-adjusted mortality, risk-adjusted morbidity, and staff perceptions of quality. DATA COLLECTION/EXTRACTION METHODS: Risk-adjusted mortality and morbidity came from an outcomes reporting program within the Department of Veterans Affairs that entails the prospective collection of clinical data from patient charts. Data on coordination practices and perceived quality came from a survey of surgical staff at each of the 44 participating surgical services. PRINCIPAL FINDINGS: The group of surgical services using high feedback and high programming had the best perceived quality. This group also had the lowest morbidity, but the difference was statistically significant with respect to only one of the two other groups: the group with low feedback and low programming. No significant group differences were found for mortality. CONCLUSIONS: Study results provide partial support for the hypothesis that high levels of feedback and programming should be combined for optimal quality of care. Study results also suggest that staff coordination is more important for improving morbidity than mortality in surgical services. PMID:9865218
Comparison of outlier identification methods in hospital surgical quality improvement programs.
Bilimoria, Karl Y; Cohen, Mark E; Merkow, Ryan P; Wang, Xue; Bentrem, David J; Ingraham, Angela M; Richards, Karen; Hall, Bruce L; Ko, Clifford Y
2010-10-01
Surgeons and hospitals are being increasingly assessed by third parties regarding surgical quality and outcomes, and much of this information is reported publicly. Our objective was to compare the various methods used to classify hospitals as outliers in established surgical quality assessment programs by applying each approach to a single data set. Using American College of Surgeons National Surgical Quality Improvement Program data (7/2008-6/2009), hospital risk-adjusted 30-day morbidity and mortality were assessed for general surgery at 231 hospitals (cases = 217,630) and for colorectal surgery at 109 hospitals (cases = 17,251). The numbers of outliers (poor performers) identified using different methods and criteria were compared. The overall morbidity was 10.3% for general surgery and 25.3% for colorectal surgery. The mortality was 1.6% for general surgery and 4.0% for colorectal surgery. Programs used different methods (logistic regression, hierarchical modeling, partitioning) and criteria (P < 0.01, P < 0.05, P < 0.10) to identify outliers. Depending on the outlier identification methods and criteria employed, when each approach was applied to this single dataset, the number of outliers ranged from 7 to 57 hospitals for general surgery morbidity, 1 to 57 hospitals for general surgery mortality, 4 to 27 hospitals for colorectal morbidity, and 0 to 27 hospitals for colorectal mortality. There was considerable variation in the number of outliers identified using different detection approaches. Quality programs seem to be utilizing outlier identification methods contrary to what might be expected; thus, they should justify their methodology based on the intent of the program (i.e., quality improvement vs. reimbursement). Surgeons and hospitals should be aware of the variability in the methods used to assess their performance, as these outlier designations will likely have referral and reimbursement consequences.
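The sensitivity to the chosen criterion is easy to reproduce. The sketch below flags hospitals with a one-sided Poisson test on observed versus risk-adjusted expected event counts, one of several defensible choices, and shows how the outlier count moves with the P-value threshold; the counts are invented.

```python
import numpy as np
from scipy import stats

def flag_outliers(observed, expected, p_threshold=0.05):
    """Flag sites whose observed event count exceeds the risk-adjusted
    expectation under a one-sided Poisson test; one of several defensible
    choices, which is exactly why programs disagree on outlier counts."""
    p = np.array([stats.poisson.sf(o - 1, e)        # P(X >= o | mean e)
                  for o, e in zip(observed, expected)])
    return p < p_threshold

obs = np.array([30, 12, 45, 8])
exp = np.array([20.0, 11.5, 30.0, 9.2])             # invented counts
for thr in (0.01, 0.05, 0.10):
    print(f"P < {thr}: {flag_outliers(obs, exp, thr).sum()} outliers")
```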
Aksu, Buket; Paradkar, Anant; de Matas, Marcel; Ozer, Ozgen; Güneri, Tamer; York, Peter
2012-12-01
The publication of the International Conference on Harmonisation (ICH) Q8, Q9, and Q10 guidelines paved the way for the standardization of quality after the Food and Drug Administration issued its current Good Manufacturing Practices guidelines in 2003. "Quality by Design", described in the ICH Q8 guideline, offers a better scientific understanding of critical process and product qualities using knowledge obtained during the life cycle of a product. In this scope, the "knowledge space" is a summary of all process knowledge obtained during product development, and the "design space" is the region in which a product can be manufactured within acceptable limits. To create these spaces, artificial neural networks (ANNs) can be used to capture the multidimensional interactions of the input variables and to closely bind these variables to a design space. This helps guide the experimental design process to include interactions among the input variables, along with the modeling and optimization of pharmaceutical formulations. The objective of this study was to develop an integrated multivariate approach to obtaining a quality product, based on an understanding of the cause-effect relationships between formulation ingredients and product properties, using ANNs and genetic programming on ramipril tablets prepared by direct compression. In this study, the data were generated through the systematic application of design of experiments (DoE) principles and optimization studies using artificial neural networks and neurofuzzy logic programs.
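As a toy version of the ANN-based design-space idea, the sketch below fits a small neural network to invented DoE-style data and sweeps a grid over two formulation inputs to find where the predicted response stays within acceptance limits; all variables, limits and data are assumptions for illustration.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

# Invented DoE-style data: three formulation inputs (e.g. filler ratio,
# lubricant fraction, compression force) -> one quality response.
rng = np.random.default_rng(0)
X = rng.uniform(0, 1, (60, 3))
y = 2 * X[:, 0] - X[:, 1] ** 2 + 0.5 * X[:, 2] + rng.normal(0, 0.05, 60)

ann = MLPRegressor(hidden_layer_sizes=(8,), max_iter=5000,
                   random_state=0).fit(X, y)

# Sweep two inputs (third held mid-range) and mark where the predicted
# response stays inside assumed acceptance limits: a crude design-space map.
g1, g2 = np.meshgrid(np.linspace(0, 1, 50), np.linspace(0, 1, 50))
grid = np.column_stack([g1.ravel(), g2.ravel(), np.full(g1.size, 0.5)])
pred = ann.predict(grid)
inside = (pred > 0.8) & (pred < 1.6)
print(f"{inside.mean():.0%} of the swept region lies in the design space")
```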
Contribution to systematic education of quality management in Slovak health care.
Rusnakova, V; Bacharova, L
2001-01-01
The aim of the study was to contribute to quality improvement initiatives in Slovak health services through a systematic approach to education and training in quality management (QM). Consequently, the main objectives were to analyse the content of education in QM abroad, to conduct an audit of perceived training needs in Slovakia, and to propose the design of a QM training programme to be applied within the CME scheme based on the study results. A triangular method was implemented in the design of the study: a review of relevant information, questionnaire data, and semi-structured interviews in a sample of 67 Slovak trainees from the Health Management School and the School of Public Health were adopted in a complementary fashion. The survey highlighted positive attitudes to training in quality management, documented by median scores higher than 6 in all tested areas on a scale of 0-10. No significant differences were displayed among professional groups such as physicians, nurses and health care managers, or among the training institutions involved. However, potential obstacles were identified in a deeper study using interviews. An absence of knowledge and skills in management in general, and in quality management approaches especially, was observed. Typically, the role of strategic planning is undermined, and the broad scope of quality management approaches is reduced to problems of accreditation. Barriers to a participative culture, innovation and devolution of accountability, as well as resistance to change and to team-based management, are authentic findings as well. Recommendations drawn from the study related to: fostering managers--"transformational leaders"--for locally driven decision making in health care policy and practice; the need for training activities for continuing education in quality with respect to the interests of specific target groups and their level of knowledge in management; training content oriented towards a combination of rational utilization of information, critical analytical skills and planning for quality with human resource development and interpersonal skills, team building (soft skills), rather than a reduction of quality management tools to hard techniques (statistics, ISO norms); methods of education in which the use of experiential learning methods and participative training, including action learning, is highlighted; and team training complemented with individual professional development support, including a coaching and mentoring scheme. As implications, four types of CME training were proposed: Basic Module QM, Training for QM Teams, a Training Trainers Scheme, and Guiding through Accreditation and Quality Award. (Tab. 9, Ref. 38.)
Construction and assembly of the wire planes for the MicroBooNE Time Projection Chamber
Acciarri, R.; Adams, C.; Asaadi, J.; ...
2017-03-09
Development of a new approach to cumulative effects assessment: a northern river ecosystem example.
Dubé, Monique; Johnson, Brian; Dunn, Gary; Culp, Joseph; Cash, Kevin; Munkittrick, Kelly; Wong, Isaac; Hedley, Kathlene; Booty, William; Lam, David; Resler, Oskar; Storey, Alex
2006-02-01
If sustainable development of Canadian waters is to be achieved, a realistic and manageable framework is required for assessing cumulative effects. The objective of this paper is to describe an approach for aquatic cumulative effects assessment that was developed under the Northern Rivers Ecosystem Initiative. The approach is based on a review of existing monitoring practices in Canada and the presence of existing thresholds for aquatic ecosystem health assessments. It suggests that a sustainable framework is possible for cumulative effects assessment of Canadian waters that would result in integration of national indicators of aquatic health, integration of national initiatives (e.g., water quality index, environmental effects monitoring), and provide an avenue where long-term monitoring programs could be integrated with baseline and follow-up monitoring conducted under the environmental assessment process.
Rapid alignment of nanotomography data using joint iterative reconstruction and reprojection
Gürsoy, Doğa; Hong, Young P.; He, Kuan; ...
2017-09-18
As x-ray and electron tomography is pushed further into the nanoscale, the limitations of rotation stages become more apparent, leading to challenges in the alignment of the acquired projection images. Here we present an approach for rapid post-acquisition alignment of these projections to obtain high quality three-dimensional images. Our approach is based on a joint estimation of alignment errors and the object, using an iterative refinement procedure. With simulated data where we know the alignment error of each projection image, our approach shows a residual alignment error that is a factor of a thousand smaller, and it reaches the same error level in the reconstructed image in less than half the number of iterations. We then show its application to experimental data in x-ray and electron nanotomography.
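A toy 2-D version of the reprojection-alignment loop is sketched below using scikit-image's radon/iradon pair; the paper estimates the alignment errors and the object jointly within the reconstruction, whereas this sketch simply alternates the two steps, and the correlation-based shift estimate and iteration count are illustrative assumptions.

```python
import numpy as np
from scipy.ndimage import shift as subpixel_shift
from skimage.transform import radon, iradon
from skimage.registration import phase_cross_correlation

def align_sinogram(sino, angles, n_iter=5):
    """Reconstruct, reproject, estimate each projection's detector offset
    against its reprojection by cross-correlation, correct it, repeat.
    sino has shape (n_detector, n_angles); angles are in degrees."""
    sino = sino.copy()
    for _ in range(n_iter):
        recon = iradon(sino, theta=angles, filter_name="ramp")
        reproj = radon(recon, theta=angles)
        for j in range(sino.shape[1]):
            offset = phase_cross_correlation(reproj[:, j], sino[:, j],
                                             upsample_factor=10)[0][0]
            sino[:, j] = subpixel_shift(sino[:, j], offset)
    return iradon(sino, theta=angles, filter_name="ramp")
```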
Domain specific software architectures: Command and control
NASA Technical Reports Server (NTRS)
Braun, Christine; Hatch, William; Ruegsegger, Theodore; Balzer, Bob; Feather, Martin; Goldman, Neil; Wile, Dave
1992-01-01
GTE is the Command and Control contractor for the Domain Specific Software Architectures program. The objective of this program is to develop and demonstrate an architecture-driven, component-based capability for the automated generation of command and control (C2) applications. Such a capability will significantly reduce the cost of C2 applications development and will lead to improved system quality and reliability through the use of proven architectures and components. A major focus of GTE's approach is the automated generation of application components in particular subdomains. Our initial work in this area has concentrated in the message handling subdomain; we have defined and prototyped an approach that can automate one of the most software-intensive parts of C2 systems development. This paper provides an overview of the GTE team's DSSA approach and then presents our work on automated support for message processing.
A Wireless Sensor Network-Based Approach with Decision Support for Monitoring Lake Water Quality.
Huang, Xiaoci; Yi, Jianjun; Chen, Shaoli; Zhu, Xiaomin
2015-11-19
Online monitoring and water quality analysis of lakes are urgently needed. A feasible and effective approach is to use a Wireless Sensor Network (WSN). Lake water environments, like other real-world environments, present many changing and unpredictable situations. To ensure flexibility in such an environment, the WSN node has to be prepared to deal with varying situations. This paper presents a WSN self-configuration approach for lake water quality monitoring. The approach is based on the integration of a semantic framework, in which a reasoner can make decisions on the configuration of WSN services. We present a WSN ontology and the relevant water quality monitoring context information, considering their suitability in a pervasive computing environment. We also propose a rule-based reasoning engine that conducts decision support through reasoning techniques and context-awareness. To evaluate the approach, we conduct usability experiments and performance benchmarks.
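The flavor of such rule-based reconfiguration can be shown in a few lines; the context facts, thresholds and actions below are invented for illustration, and the paper expresses its rules over an ontology and reasoner rather than plain Python structures.

```python
# Context facts about the lake trigger changes to a node's sampling services.
rules = [
    (lambda c: c["battery"] < 0.2,
     {"sampling_interval_s": 3600}),                        # conserve energy
    (lambda c: c["chlorophyll_ug_L"] > 30,
     {"sampling_interval_s": 300, "enable": "turbidity"}),  # possible bloom
    (lambda c: c["season"] == "winter",
     {"disable": "algae_camera"}),
]

def reconfigure(context, config):
    """Apply every matching rule to the node configuration, in order."""
    for condition, action in rules:
        if condition(context):
            config.update(action)
    return config

print(reconfigure({"battery": 0.9, "chlorophyll_ug_L": 42, "season": "summer"},
                  {"sampling_interval_s": 900}))
```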
Cui, Ming; Xu, Lili; Wang, Huimin; Ju, Shaoqing; Xu, Shuizhu; Jing, Rongrong
2017-12-01
Measurement uncertainty (MU) is a metrological concept that can be used for objectively estimating the quality of test results in medical laboratories. The Nordtest guide recommends an approach that uses both internal quality control (IQC) and external quality assessment (EQA) data to evaluate the MU. Bootstrap resampling is employed to simulate the unknown distribution based on mathematical statistics using an existing small sample of data, where the aim is to transform the small sample into a large sample. However, there have been no reports of the utilization of this method in medical laboratories. Thus, this study applied the Nordtest guide approach based on bootstrap resampling for estimating the MU. We estimated the MU for the white blood cell (WBC) count, red blood cell (RBC) count, hemoglobin (Hb), and platelets (Plt). First, we used 6 months of IQC data and 12 months of EQA data to calculate the MU according to the Nordtest method. Second, we combined the Nordtest method and bootstrap resampling with the quality control data and calculated the MU using MATLAB software. We then compared the MU results obtained using the two approaches. The expanded uncertainty results determined for WBC, RBC, Hb, and Plt using the bootstrap resampling method were 4.39%, 2.43%, 3.04%, and 5.92%, respectively, and 4.38%, 2.42%, 3.02%, and 6.00% with the existing quality control data (U [k=2]). For WBC, RBC, Hb, and Plt, the differences between the results obtained using the two methods were lower than 1.33%. The expanded uncertainty values were all less than the target uncertainties. The bootstrap resampling method allows the statistical analysis of the MU. Combining the Nordtest method and bootstrap resampling is considered a suitable alternative method for estimating the MU. Copyright © 2017 The Canadian Society of Clinical Chemists. Published by Elsevier Inc. All rights reserved.
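A minimal sketch of the combined idea follows (in Python rather than the MATLAB used in the study): bootstrap the IQC and EQA samples, form the within-laboratory reproducibility and bias components, and combine them Nordtest-style. It omits the uncertainty of the EQA assigned values that the full Nordtest bias term includes, and the data and component definitions are illustrative assumptions.

```python
import numpy as np

def expanded_uncertainty(iqc, eqa_bias_pct, n_boot=2000, k=2, seed=0):
    """Nordtest-style MU: combine within-laboratory reproducibility (CV% of
    IQC results) with bias (RMS of EQA % deviations), each averaged over
    bootstrap resamples; returns the expanded uncertainty U in % (k=2)."""
    rng = np.random.default_rng(seed)
    u_rw, u_b = [], []
    for _ in range(n_boot):
        s_iqc = rng.choice(iqc, size=iqc.size, replace=True)
        s_eqa = rng.choice(eqa_bias_pct, size=eqa_bias_pct.size, replace=True)
        u_rw.append(100 * s_iqc.std(ddof=1) / s_iqc.mean())
        u_b.append(np.sqrt(np.mean(s_eqa ** 2)))
    return k * np.sqrt(np.mean(u_rw) ** 2 + np.mean(u_b) ** 2)

iqc = np.random.default_rng(1).normal(6.0, 0.12, 120)   # e.g. WBC IQC, 10^9/L
eqa = np.array([-1.2, 0.8, 1.5, -0.4, 0.9, -1.1])       # EQA deviations, %
print(f"U = {expanded_uncertainty(iqc, eqa):.2f}%")
```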
A decision-support system for the analysis of clinical practice patterns.
Balas, E A; Li, Z R; Mitchell, J A; Spencer, D C; Brent, E; Ewigman, B G
1994-01-01
Several studies have documented substantial variation in medical practice patterns, but physicians often do not have adequate information on the cumulative clinical and financial effects of their decisions. The purpose of developing an expert system for the analysis of clinical practice patterns was to assist providers in analyzing and improving the process and outcome of patient care. The QFES (Quality Feedback Expert System) helps users define and evaluate measurable quality improvement objectives. Based on objectives and actual clinical data, several measures can be calculated (utilization of procedures, annualized cost effect of using a particular procedure, and expected utilization based on peer comparison and case-mix adjustment). The quality management rules help to detect important discrepancies among members of the selected provider group and to compare performance with objectives. The system incorporates a variety of data and knowledge bases: (i) clinical data on actual practice patterns, (ii) frames of quality parameters derived from clinical practice guidelines, and (iii) rules of quality management for data analysis. An analysis of the practice patterns of 12 family physicians in the management of urinary tract infections illustrates the use of the system.
Can we prevent accidental injury to adolescents? A systematic review of the evidence.
Munro, J.; Coleman, P.; Nicholl, J.; Harper, R.; Kent, G.; Wild, D.
1995-01-01
OBJECTIVES: As part of the Department of Health strategy The Health of the Nation, a systematic review of published and unpublished literature relating to the effectiveness of interventions in reducing accidental injury in the population aged 15-24 years was carried out. METHODS: The literature was reviewed under the standard setting headings of road, work, home, and sports and leisure, and graded for quality of evidence and strength of recommendation using a scale published in the UK national epidemiologically based needs assessment programme. RESULTS: The most effective measures appear to be legislative and regulatory controls in road, sport, and workplace settings. Environmental engineering measures on the road and in sports have relatively low implementation costs and result in fewer injuries at all ages. There is little evidence that purely educational measures reduce injuries in the short term. Community-based approaches may be effective in all age groups, and incentives to encourage safer behaviour hold promise but require further evaluation. The potential of multifactorial approaches seems greater than that of narrowly based linear approaches. CONCLUSIONS: Few interventions to reduce injury in adolescents have been rigorously evaluated using good quality randomised controlled trials, and where such evidence is available, fewer still have been shown to be definitely worthwhile. Many studies relied on surrogate measures rather than actual injury rates, and substantial issues relating to the efficacy or implementation of preventive measures in adolescent and young adult populations remain unresolved. PMID:9346041
2013-01-01
Introduction In 2004, a community-based health insurance (CBI) scheme was introduced in Nouna health district, Burkina Faso, with the objective of improving financial access to high quality health services. We investigate the role of CBI enrollment in the quality of care provided at primary-care facilities in Nouna district, and measure differences in objective and perceived quality of care and patient satisfaction between enrolled and non-enrolled populations who visit the facilities. Methods We interviewed a systematic random sample of 398 patients after their visit to one of the thirteen primary-care facilities contracted with the scheme; 34% (n = 135) of the patients were currently enrolled in the CBI scheme. We assessed objective quality of care as consultation, diagnostic and counselling tasks performed by providers during outpatient visits, perceived quality of care as patient evaluations of the structures and processes of service delivery, and overall patient satisfaction. Two-sample t-tests were performed for group comparison and ordinal logistic regression (OLR) analysis was used to estimate the association between CBI enrollment and overall patient satisfaction. Results Objective quality of care evaluations show that CBI enrollees received substantially less comprehensive care for outpatient services than non-enrollees. In contrast, CBI enrollment was positively associated with overall patient satisfaction (aOR = 1.51, p = 0.014), controlling for potential confounders such as patient socio-economic status, illness symptoms, history of illness and characteristics of care received. Conclusions CBI patients perceived better quality of care, while objectively receiving worse quality of care, compared to patients who were not enrolled in CBI. Systematic differences in quality of care expectations between CBI enrollees and non-enrollees may explain this finding. One factor influencing quality of care may be the type of provider payment used by the CBI scheme, which has been identified as a leading factor in reducing provider motivation to deliver high quality care to CBI enrollees in previous studies. Based on this study, it is unlikely that perceived quality of care and patient satisfaction explain the low CBI enrollment rates in this community. PMID:23680066
Supervised classification of continental shelf sediment off western Donegal, Ireland
NASA Astrophysics Data System (ADS)
Monteys, X.; Craven, K.; McCarron, S. G.
2017-12-01
Managing human impacts on marine ecosystems requires natural regions to be identified and mapped over a range of hierarchically nested scales. In recent years (2000-present), the Irish National Seabed Survey (INSS) and the Integrated Mapping for the Sustainable Development of Ireland's Marine Resources programme (INFOMAR) (Geological Survey Ireland and Marine Institute collaborations) have provided unprecedented quantities of high quality data on Ireland's offshore territories. The increasing availability of large, detailed digital representations of these environments requires the application of objective and quantitative analyses. This study presents the results of a new approach to sea floor sediment mapping based on an integrated analysis of INFOMAR multibeam bathymetric data (including the derivatives of slope and relative position), backscatter data (including derivatives of angular response analysis) and sediment ground-truthing over the continental shelf, west of Donegal. It applies a Geographic Object-Based Image Analysis software package to provide a supervised classification of the surface sediment. This approach can provide a statistically robust, high resolution classification of the seafloor. Initial results display a differentiation of sediment classes and a reduction in artefacts compared with previously applied methodologies. These results indicate a methodology that could be used for physical habitat mapping and the classification of marine environments.
A survey of MRI-based medical image analysis for brain tumor studies
NASA Astrophysics Data System (ADS)
Bauer, Stefan; Wiest, Roland; Nolte, Lutz-P.; Reyes, Mauricio
2013-07-01
MRI-based medical image analysis for brain tumor studies is gaining attention due to an increased need for efficient and objective evaluation of large amounts of data. While the pioneering approaches applying automated methods for the analysis of brain tumor images date back almost two decades, the current methods are becoming more mature and are coming closer to routine clinical application. This review aims to provide a comprehensive overview, first giving a brief introduction to brain tumors and the imaging of brain tumors. We then review the state of the art in segmentation, registration and modeling related to tumor-bearing brain images, with a focus on gliomas. The objective of segmentation is to outline the tumor, including its sub-compartments and surrounding tissues, while the main challenge in registration and modeling is the handling of morphological changes caused by the tumor. The qualities of different approaches are discussed with a focus on methods that can be applied to standard clinical imaging protocols. Finally, a critical assessment of the current state is performed and future developments and trends are addressed, giving special attention to recent developments in radiological tumor assessment guidelines.
Ali, Syed Mustafa; Anjum, Naveed; Kamel Boulos, Maged N; Ishaq, Muhammad; Aamir, Javariya; Haider, Ghulam Rasool
2018-01-16
Data quality is a core theme of a programme's performance assessment, yet many organizations do not have any data quality improvement strategy, of which data quality dimensions and a data quality assessment framework are important constituents. As there is limited published research about the data quality specifics that are relevant to the context of Pakistan's tuberculosis (TB) control programme, this study aims at identifying the applicable data quality dimensions by using a 'fitness-for-purpose' perspective. Forty-two respondents pooled a total of 473 years of professional experience, of which 223 years (47%) were in TB control related programmes. Based on the responses to 11 practical cases, adopted from the routine recording and reporting system of Pakistan's TB control programme (real patient identities were masked), completeness, accuracy, consistency, vagueness, uniqueness and timeliness are the applicable data quality dimensions relevant to the programme's context, i.e. its work settings and field of practice. Taking a 'fitness-for-purpose' approach to data quality, this study used a test-based approach to measure management's perspective and identified the data quality dimensions pertinent to the programme and country specific requirements. Implementation of a data quality improvement strategy and achieving enhanced data quality would greatly help organizations in promoting data use for informed decision making.
Object-based class modelling for multi-scale riparian forest habitat mapping
NASA Astrophysics Data System (ADS)
Strasser, Thomas; Lang, Stefan
2015-05-01
Object-based class modelling allows for mapping complex, hierarchical habitat systems. The riparian zone, including its forests, represents such a complex ecosystem. Forests within riparian zones are biologically highly productive and characterized by a rich biodiversity; they are thus considered of high community interest, with an imperative to be protected and regularly monitored. Satellite earth observation (EO) provides tools for capturing the current state of forest habitats, such as forest composition including the intermixture of non-native tree species. Here we present a semi-automated object-based image analysis (OBIA) approach for the mapping of riparian forests by applying class modelling of habitats based on the European Nature Information System (EUNIS) habitat classifications and the European Habitats Directive (HabDir) Annex 1. A very high resolution (VHR) WorldView-2 satellite image provided the required spatial and spectral detail for multi-scale image segmentation and rule-base composition to generate a six-level hierarchical representation of riparian forest habitats. Habitats were thereby hierarchically represented within an image object hierarchy as forest stands, stands of homogeneous tree species, and single trees represented by sunlit tree crowns. 522 EUNIS level 3 (EUNIS-3) habitat patches with a mean patch size (MPS) of 12,349.64 m2 were modelled from 938 forest stand patches (MPS = 6868.20 m2) and 43,742 tree stand patches (MPS = 140.79 m2). The delineation quality of the modelled EUNIS-3 habitats (the focal level) was quantitatively assessed against an expert-based visual interpretation, showing a mean deviation of 11.71%.
Integration of Problem-Based Learning and Web-Based Multimedia to Enhance Soil Management Course
NASA Astrophysics Data System (ADS)
Strivelli, R.; Krzic, M.; Crowley, C.; Dyanatkar, S.; Bomke, A.; Simard, S.; Grand, S.
2012-04-01
In an attempt to address declining enrolment in soil science programs and the changing learning needs of 21st century students, several universities in North America and around the world have re-organized their soil science curriculum and adopted innovative educational approaches and web-based teaching resources. At the University of British Columbia, Canada, an interdisciplinary team set out to integrate teaching approaches to address this trend. The objective of this project was to develop an interactive web-based teaching resource, which combined a face-to-face problem-based learning (PBL) case study with multimedia to illustrate the impacts of three land-uses on soil transformation and quality. The Land Use Impacts (LUI) tool (http://soilweb.landfood.ubc.ca/luitool/) was a collaborative and concentrated effort to maximize the advantages of two educational approaches: (1) the web's interactivity, flexibility, adaptability and accessibility, and (2) PBL's ability to foster an authentic learning environment, encourage group work and promote the application of core concepts. The design of the LUI case study was guided by Herrington's development principles for web-based authentic learning. The LUI tool presented students with rich multimedia (streaming videos, text, data, photographs, maps, and weblinks) and real world tasks (site assessment and soil analysis) to encourage students to utilize knowledge of soil science in collaborative problem-solving. Preliminary student feedback indicated that the LUI tool effectively conveyed case study objectives and was appealing to students. The resource is intended primarily for students enrolled in an upper level undergraduate/graduate university course titled Sustainable Soil Management but it is flexible enough to be adapted to other natural resource courses. Project planning and an interactive overview of the tool will be given during the presentation.
ERIC Educational Resources Information Center
Bloxham, Kristy Taylor
2010-01-01
The objective of this study was to examine the use of frequent, anonymous student course surveys as a tool in supporting continuous quality improvement (CQI) principles in online instruction. The study used a qualitative, multiple-case design involving four separate online courses. Analysis methods included pattern matching/explanation building,…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Foy, J; Marsh, R; Owen, D
2015-06-15
Purpose: Creating high-quality SBRT treatment plans for the spine is often tedious and time-consuming. In addition, the quality of treatment plans can vary greatly between treatment facilities due to inconsistencies in planning methods. This study investigates the performance of knowledge-based planning (KBP) for spine SBRT. Methods: Treatment plans were created for 28 spine SBRT patients. Each case was planned to meet strict dose objectives and guidelines. After physician and physicist approval, the plans were added to a custom model in a KBP system (RapidPlan, Varian Eclipse v13.5). The model was then trained to predict estimated DVHs and provide starting objective functions for future patients based on both generated and manual objectives. To validate the model, ten additional spine SBRT cases were planned manually as well as using the model objectives. Plans were compared based on planning time and quality (ability to meet the plan objectives, including dose metrics and conformity). Results: The average dose to the spinal cord and the cord PRV differed between the validation and control plans by <0.25%, demonstrating iso-toxicity. Six out of 10 validation plans met all dose objectives without the need for modifications, and overall, target dose coverage was increased by about 4.8%. If the validation plans did not meet the dose requirements initially, only 1–2 iterations of modifying the planning parameters were required before an acceptable plan was achieved. While manually created plans usually required 30 minutes to 3 hours to create, KBP can be used to create plans of similar quality in 15–20 minutes. Conclusion: KBP for spinal tumors has been shown to greatly decrease the amount of time required to achieve high-quality treatment plans with minimal human intervention, and could feasibly be used to standardize plan quality between institutions. Supported by Varian Medical Systems.
Query Health: standards-based, cross-platform population health surveillance
Klann, Jeffrey G; Buck, Michael D; Brown, Jeffrey; Hadley, Marc; Elmore, Richard; Weber, Griffin M; Murphy, Shawn N
2014-01-01
Objective Understanding population-level health trends is essential to effectively monitor and improve public health. The Office of the National Coordinator for Health Information Technology (ONC) Query Health initiative is a collaboration to develop a national architecture for distributed, population-level health queries across diverse clinical systems with disparate data models. Here we review Query Health activities, including a standards-based methodology, an open-source reference implementation, and three pilot projects. Materials and methods Query Health defined a standards-based approach for distributed population health queries, using an ontology based on the Quality Data Model and Consolidated Clinical Document Architecture, Health Quality Measures Format (HQMF) as the query language, the Query Envelope as the secure transport layer, and the Quality Reporting Document Architecture as the result language. Results We implemented this approach using Informatics for Integrating Biology and the Bedside (i2b2) and hQuery for data analytics and PopMedNet for access control, secure query distribution, and response. We deployed the reference implementation at three pilot sites: two public health departments (New York City and Massachusetts) and one pilot designed to support Food and Drug Administration post-market safety surveillance activities. The pilots were successful, although improved cross-platform data normalization is needed. Discussion This initiative resulted in a standards-based methodology for population health queries, a reference implementation, and revision of the HQMF standard. It also informed future directions regarding interoperability and data access for ONC's Data Access Framework initiative. Conclusions Query Health was a test of the learning health system that supplied a functional methodology and reference implementation for distributed population health queries that has been validated at three sites. PMID:24699371
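As a rough illustration of the distributed-query architecture described above (generic Python with dictionary records; the actual initiative uses HQMF query documents, the Query Envelope transport, and PopMedNet rather than anything shown here), each site evaluates the query locally and returns only aggregate results:

    def run_query_at_site(site_db, criteria):
        # Patient-level data never leave the site; only a count is returned.
        return sum(1 for patient in site_db
                   if all(patient.get(k) == v for k, v in criteria.items()))

    sites = {
        "site_a": [{"diabetes": True, "smoker": False}, {"diabetes": True, "smoker": True}],
        "site_b": [{"diabetes": False, "smoker": True}],
    }
    criteria = {"diabetes": True}
    results = {name: run_query_at_site(db, criteria) for name, db in sites.items()}
    print(results, "total:", sum(results.values()))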
Photoacoustic image reconstruction via deep learning
NASA Astrophysics Data System (ADS)
Antholzer, Stephan; Haltmeier, Markus; Nuster, Robert; Schwab, Johannes
2018-02-01
Applying standard algorithms to sparse data problems in photoacoustic tomography (PAT) yields low-quality images containing severe under-sampling artifacts. To some extent, these artifacts can be reduced by iterative image reconstruction algorithms, which allow prior knowledge such as smoothness, total variation (TV) or sparsity constraints to be included. These algorithms tend to be time-consuming, as the forward and adjoint problems have to be solved repeatedly. Further, iterative algorithms have additional drawbacks. For example, the reconstruction quality strongly depends on a-priori model assumptions about the objects to be recovered, which are often not strictly satisfied in practical applications. To overcome these issues, in this paper we develop direct and efficient reconstruction algorithms based on deep learning. As opposed to iterative algorithms, we apply a convolutional neural network whose parameters are trained before the reconstruction process on a set of training data. For actual image reconstruction, a single evaluation of the trained network yields the desired result. Our numerical results (using two different network architectures) demonstrate that the proposed deep learning approach reconstructs images with a quality comparable to state-of-the-art iterative reconstruction methods.
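A minimal sketch of the direct reconstruction idea in PyTorch (an illustrative residual CNN; the paper evaluates two specific architectures that are not reproduced here):

    import torch
    import torch.nn as nn

    class ArtifactRemovalCNN(nn.Module):
        def __init__(self):
            super().__init__()
            self.net = nn.Sequential(
                nn.Conv2d(1, 32, kernel_size=3, padding=1), nn.ReLU(),
                nn.Conv2d(32, 32, kernel_size=3, padding=1), nn.ReLU(),
                nn.Conv2d(32, 1, kernel_size=3, padding=1),
            )

        def forward(self, x):
            # Residual formulation: the network estimates the artifact component.
            return x - self.net(x)

    model = ArtifactRemovalCNN()
    recon = torch.randn(1, 1, 128, 128)  # stand-in for a sparse-data reconstruction
    improved = model(recon)              # one forward pass replaces the iterative loop

After training on pairs of artifact-laden and artifact-free images, reconstruction amounts to a single network evaluation, which is the source of the speed advantage over iterative methods.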
Postgraduate students experience in research supervision
NASA Astrophysics Data System (ADS)
Mohamed, Hazura; Judi, Hairulliza Mohamad; Mohammad, Rofizah
2017-04-01
The success and quality of postgraduate education depend largely on the effective and efficient supervision of postgraduate students. The role of the supervisor becomes more challenging as supervisory expectations rise to produce high-quality graduates. The main objective of this study was to examine the experiences of postgraduate students with supervisory services over the duration of their studies. It also examines whether supervisory experience varies based on demographic variables such as level of study and nationality. This study uses a quantitative approach in the form of a survey. Questionnaires were distributed to 96 postgraduate students of the Faculty of Information Science and Technology, Universiti Kebangsaan Malaysia. Data collected were analyzed using the Statistical Package for the Social Sciences (SPSS 23.0) to obtain frequencies, means and standard deviations. A t-test was used to test for differences in supervisory experience across demographic variables. Overall, the findings showed that postgraduate students responded positively to the supervisory services. However, supervisory experiences differed based on level of study and nationality. It is hoped that the parties involved can use these results to provide better support and improve the quality of supervision.
Validity test and its consistency in the construction of patient loyalty model
NASA Astrophysics Data System (ADS)
Yanuar, Ferra
2016-04-01
The main objective of the present study is to demonstrate the estimation of validity values and their consistency based on a structural equation model. The estimation method was then applied to empirical data for the construction of a patient loyalty model. In the hypothesized model, service quality, patient satisfaction and patient loyalty were determined simultaneously, with each factor measured by several indicator variables. The respondents involved in this study were patients who had received healthcare at Puskesmas (community health centers) in Padang, West Sumatera. All 394 respondents with complete information were included in the analysis. This study found that each construct (service quality, patient satisfaction and patient loyalty) was valid, meaning that all hypothesized indicator variables were significant measures of their corresponding latent variable. Service quality was measured most strongly by tangibles, patient satisfaction by satisfaction with service, and patient loyalty by good service quality. In the structural equation, this study found that patient loyalty was affected positively and directly by patient satisfaction. Service quality affected patient loyalty indirectly, with patient satisfaction as a mediator variable between the two latent variables. Both structural equations were also valid. A simulation study using a bootstrap approach showed that the validity values obtained here were also consistent.
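A minimal sketch of the bootstrap consistency check (illustrative only: a first-principal-component loading stands in for the SEM loadings, and the data are synthetic):

    import numpy as np

    rng = np.random.default_rng(0)
    X = rng.normal(size=(394, 4))        # toy indicator data, n = 394 as in the study
    X[:, 1:] += X[:, [0]]                # induce a common latent factor

    def loadings(X):
        # Approximate loadings: correlations of indicators with the first PC.
        Xc = X - X.mean(axis=0)
        _, _, vt = np.linalg.svd(Xc, full_matrices=False)
        pc1 = Xc @ vt[0]
        l = np.array([np.corrcoef(pc1, Xc[:, j])[0, 1] for j in range(X.shape[1])])
        return l * np.sign(l[0])         # fix the sign indeterminacy of the PC

    point = loadings(X)
    boot = np.array([loadings(X[rng.integers(0, len(X), len(X))]) for _ in range(500)])
    print(point)                                 # point estimates
    print(boot.mean(axis=0), boot.std(axis=0))   # bootstrap means and spreads to compare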
Reimagining Teacher Development: Cultivating Spirit
ERIC Educational Resources Information Center
Dress, Amelia
2012-01-01
Although well-meaning, some training methods treat teaching as one-size-fits-all. Yet there are myriad techniques for teaching, and no one method works for all teachers or all students. Indeed, good teachers use a variety of techniques. Unfortunately, the search for objective standards by which to measure quality teaching has…
Resisting Technological Gravity: Using Guiding Principles for Instructional Design
ERIC Educational Resources Information Center
McDonald, Jason K.
2010-01-01
Instructional designers face tremendous pressure to abandon the essential characteristics of educational approaches, and settle instead for routine practices that do not preserve the level of quality those approaches originally expressed. Because this pressure can be strong enough to affect designers almost as gravity affects objects in the…
O'Brien, Rosaleen; Fitzpatrick, Bridie; Higgins, Maria; Guthrie, Bruce; Watt, Graham; Wyke, Sally
2016-01-01
Objectives To develop and optimise a primary care-based complex intervention (CARE Plus) to enhance the quality of life of patients with multimorbidity in deprived areas. Methods Six co-design discussion groups involving 32 participants were held separately with multimorbid patients from deprived areas, voluntary organisations, general practitioners and practice nurses working in deprived areas. This was followed by piloting in two practices and further optimisation based on interviews with 11 general practitioners, 2 practice nurses and 6 participating multimorbid patients. Results Participants endorsed the need for longer consultations, relational continuity and a holistic approach. All felt that training and support of the health care staff was important. Most participants welcomed the idea of additional self-management support, though some practitioners were dubious about whether patients would use it. The pilot study led to changes including a revised care plan, the inclusion of mindfulness-based stress reduction techniques in the support of practitioners and patients, and the streamlining of the written self-management support material for patients. Discussion We have co-designed and optimised an augmented primary care intervention involving a whole-system approach to enhance quality of life in multimorbid patients living in deprived areas. CARE Plus will next be tested in a phase 2 cluster randomised controlled trial. PMID:27068113
ERIC Educational Resources Information Center
Australian Government Tertiary Education Quality and Standards Agency, 2015
2015-01-01
The Australian Government Tertiary Education Quality and Standards Agency's (TEQSA's) role is to assure that quality standards are being met by all registered higher education providers. This paper explains how TEQSA's risk-based approach to assuring higher education standards is applied in broad terms to a diverse sector. This explanation is…
Paying physician group practices for quality: A statewide quasi-experiment.
Conrad, Douglas A; Grembowski, David; Perry, Lisa; Maynard, Charles; Rodriguez, Hector; Martin, Diane
2013-12-01
This article presents the results of a unique quasi-experiment on the effects of a large-scale pay-for-performance (P4P) program implemented by a leading health insurer in Washington state during 2001-2007. The authors received external funding to provide an objective impact evaluation of the program. The program was unique in several respects: (1) It was designed dynamically, with two discrete intervention periods: one in which payment incentives were based on relative performance (the "contest" period) and a second in which payment incentives were based on absolute performance compared to achievable benchmarks. (2) The program was designed in collaboration with large multispecialty group practices, with an explicit run-in period to test the quality metrics. Public reporting of the quality scorecard for all participating medical groups was introduced 1 year before the quality incentive payment program's inception and continued throughout 2002-2007. (3) The program was implemented in stages with distinct medical groups. A control group of comparable group practices was also assembled, and difference-in-differences methodology was applied to estimate program effects. Case-mix measures were included in all multivariate analyses. The regression design permitted a contrast of intervention effects between the "contest" approach in the 2003-2004 sub-period and the absolute-standard, "achievable benchmarks of care" approach in the 2005-2007 sub-period. Most of the statistically significant quality incentive program coefficients were small and negative (opposite to program intent). A consistent pattern of differential intervention impact in the sub-periods did not emerge. Cumulatively, the probit regression estimates indicate that neither the quality scorecard nor the quality incentive payment program had a significant positive effect on general clinical quality. Based on key informant interviews with medical leaders, practicing physicians, and administrators of the participating groups, the authors conclude that several factors likely combined to dampen program effects: (1) the modest size of the incentive; (2) the use of rewards only, rather than a balance of rewards and penalties; and (3) the targeting of incentive payments to the group, thus potentially weakening incentive effects at the individual level. Copyright © 2013 Elsevier Inc. All rights reserved.
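The difference-in-differences contrast can be written compactly with statsmodels (hypothetical column names and toy numbers; the study additionally used probit models and case-mix adjustment):

    import pandas as pd
    import statsmodels.formula.api as smf

    df = pd.DataFrame({
        "quality":  [0.70, 0.72, 0.68, 0.69, 0.71, 0.74, 0.67, 0.70],
        "treated":  [1, 1, 0, 0, 1, 1, 0, 0],   # P4P groups vs control practices
        "post":     [0, 1, 0, 1, 0, 1, 0, 1],   # before vs after program start
        "case_mix": [0.9, 1.0, 1.1, 1.0, 0.8, 0.9, 1.2, 1.1],
    })
    model = smf.ols("quality ~ treated * post + case_mix", data=df).fit()
    # The coefficient on the interaction is the program-effect estimate.
    print(model.params["treated:post"])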
48 CFR 1515.404-471 - EPA structured approach for developing profit or fee objectives.
Code of Federal Regulations, 2011 CFR
2011-10-01
... profit or fee objective. (5) The weight factors discussed in this section are designed for arriving at... involving creative design. (B) Consideration should be given to the managerial and technical efforts.../technical and general labor. Analysis of labor should include evaluation of the comparative quality and...
48 CFR 1515.404-471 - EPA structured approach for developing profit or fee objectives.
Code of Federal Regulations, 2013 CFR
2013-10-01
... profit or fee objective. (5) The weight factors discussed in this section are designed for arriving at... involving creative design. (B) Consideration should be given to the managerial and technical efforts.../technical and general labor. Analysis of labor should include evaluation of the comparative quality and...
48 CFR 1515.404-471 - EPA structured approach for developing profit or fee objectives.
Code of Federal Regulations, 2012 CFR
2012-10-01
... profit or fee objective. (5) The weight factors discussed in this section are designed for arriving at... involving creative design. (B) Consideration should be given to the managerial and technical efforts.../technical and general labor. Analysis of labor should include evaluation of the comparative quality and...
48 CFR 1515.404-471 - EPA structured approach for developing profit or fee objectives.
Code of Federal Regulations, 2014 CFR
2014-10-01
... profit or fee objective. (5) The weight factors discussed in this section are designed for arriving at... involving creative design. (B) Consideration should be given to the managerial and technical efforts.../technical and general labor. Analysis of labor should include evaluation of the comparative quality and...
48 CFR 1515.404-471 - EPA structured approach for developing profit or fee objectives.
Code of Federal Regulations, 2010 CFR
2010-10-01
... profit or fee objective. (5) The weight factors discussed in this section are designed for arriving at... involving creative design. (B) Consideration should be given to the managerial and technical efforts.../technical and general labor. Analysis of labor should include evaluation of the comparative quality and...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Eads, Damian Ryan; Rosten, Edward; Helmbold, David
The authors present BEAMER: a new spatially exploitative approach to learning object detectors which shows excellent results when applied to the task of detecting objects in greyscale aerial imagery in the presence of ambiguous and noisy data. There are four main contributions used to produce these results. First, they introduce a grammar-guided feature extraction system, enabling the exploration of a richer feature space while constraining the features to a useful subset. This is specified with a rule-based generative grammar crafted by a human expert. Second, they learn a classifier on this data using a newly proposed variant of AdaBoost which takes into account the spatially correlated nature of the data. Third, they perform another round of training to optimize the method of converting the pixel classifications generated by boosting into a high-quality set of (x,y) locations. Lastly, they carefully define three common problems in object detection and define two evaluation criteria that are tightly matched to these problems. Major strengths of this approach are: (1) a way of randomly searching a broad feature space, (2) its performance when evaluated on well-matched evaluation criteria, and (3) its use of the location prediction domain to learn object detectors as well as to generate detections that perform well on several tasks: object counting, tracking, and target detection. They demonstrate the efficacy of BEAMER with a comprehensive experimental evaluation on a challenging data set.
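A minimal sketch of the boosting stage using scikit-learn's standard AdaBoost (the paper's spatially aware AdaBoost variant and its grammar-guided features are not reproduced; random feature vectors stand in for them):

    import numpy as np
    from sklearn.ensemble import AdaBoostClassifier

    rng = np.random.default_rng(1)
    features = rng.normal(size=(5000, 12))                       # per-pixel feature vectors
    labels = (features[:, 0] + features[:, 1] > 0).astype(int)   # toy object/background labels

    clf = AdaBoostClassifier(n_estimators=100).fit(features, labels)
    pixel_scores = clf.decision_function(features)  # confidence map, later converted to (x, y) detections
    print(pixel_scores[:5])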
An efficient hybrid approach for multiobjective optimization of water distribution systems
NASA Astrophysics Data System (ADS)
Zheng, Feifei; Simpson, Angus R.; Zecchin, Aaron C.
2014-05-01
An efficient hybrid approach for the design of water distribution systems (WDSs) with multiple objectives is described in this paper. The objectives are the minimization of the network cost and maximization of the network resilience. A self-adaptive multiobjective differential evolution (SAMODE) algorithm has been developed, in which control parameters are automatically adapted by means of evolution instead of the presetting of fine-tuned parameter values. In the proposed method, a graph algorithm is first used to decompose a looped WDS into a shortest-distance tree (T) or forest, and chords (Ω). The original two-objective optimization problem is then approximated by a series of single-objective optimization problems of the T to be solved by nonlinear programming (NLP), thereby providing an approximate Pareto optimal front for the original whole network. Finally, the solutions at the approximate front are used to seed the SAMODE algorithm to find an improved front for the original entire network. The proposed approach is compared with two other conventional full-search optimization methods (the SAMODE algorithm and the NSGA-II) that seed the initial population with purely random solutions based on three case studies: a benchmark network and two real-world networks with multiple demand loading cases. Results show that (i) the proposed NLP-SAMODE method consistently generates better-quality Pareto fronts than the full-search methods with significantly improved efficiency; and (ii) the proposed SAMODE algorithm (no parameter tuning) exhibits better performance than the NSGA-II with calibrated parameter values in efficiently offering optimal fronts.
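The warm-start idea can be illustrated with SciPy's (single-objective) differential evolution, where NLP-derived solutions are passed in as the initial population instead of random starts (toy objective and bounds; SAMODE itself is multiobjective and self-adaptive):

    import numpy as np
    from scipy.optimize import differential_evolution

    def network_cost(d):
        # Stand-in for the WDS cost objective over pipe diameters d.
        return np.sum(d ** 2) + 10 * np.sin(d).sum()

    bounds = [(0.1, 1.0)] * 6                  # toy decision variables
    rng = np.random.default_rng(2)
    seeds = 0.1 + 0.9 * rng.random((15, 6))    # stand-in for NLP front solutions

    result = differential_evolution(network_cost, bounds, init=seeds, seed=2)
    print(result.x, result.fun)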
Tamjidy, Mehran; Baharudin, B. T. Hang Tuah; Paslar, Shahla; Matori, Khamirul Amin; Sulaiman, Shamsuddin; Fadaeifard, Firouz
2017-01-01
The development of Friction Stir Welding (FSW) has provided an alternative approach for producing high-quality welds, in a fast and reliable manner. This study focuses on the mechanical properties of the dissimilar friction stir welding of AA6061-T6 and AA7075-T6 aluminum alloys. The FSW process parameters such as tool rotational speed, tool traverse speed, tilt angle, and tool offset influence the mechanical properties of the friction stir welded joints significantly. A mathematical regression model is developed to determine the empirical relationship between the FSW process parameters and mechanical properties, and the results are validated. In order to obtain the optimal values of process parameters that simultaneously optimize the ultimate tensile strength, elongation, and minimum hardness in the heat affected zone (HAZ), a metaheuristic, multi objective algorithm based on biogeography based optimization is proposed. The Pareto optimal frontiers for triple and dual objective functions are obtained and the best optimal solution is selected through using two different decision making techniques, technique for order of preference by similarity to ideal solution (TOPSIS) and Shannon’s entropy. PMID:28772893
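A minimal TOPSIS sketch for selecting one compromise solution from a Pareto front (toy numbers and weights; the abstract's criteria are ultimate tensile strength, elongation, and minimum HAZ hardness, all treated here as benefit criteria):

    import numpy as np

    A = np.array([[210.0, 8.0, 60.0],   # candidate welds: [UTS, elongation, min HAZ hardness]
                  [225.0, 6.5, 58.0],
                  [200.0, 9.0, 63.0]])
    benefit = np.array([True, True, True])
    w = np.array([0.5, 0.25, 0.25])     # illustrative criterion weights

    R = A / np.linalg.norm(A, axis=0)   # vector-normalize each criterion
    V = R * w
    ideal = np.where(benefit, V.max(axis=0), V.min(axis=0))
    anti = np.where(benefit, V.min(axis=0), V.max(axis=0))
    d_pos = np.linalg.norm(V - ideal, axis=1)
    d_neg = np.linalg.norm(V - anti, axis=1)
    closeness = d_neg / (d_pos + d_neg)
    print(np.argmax(closeness))         # index of the preferred compromise weld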
Automated stent defect detection and classification with a high numerical aperture optical system
NASA Astrophysics Data System (ADS)
Bermudez, Carlos; Laguarta, Ferran; Cadevall, Cristina; Matilla, Aitor; Ibañez, Sergi; Artigas, Roger
2017-06-01
Stent quality control is a highly critical process. Cardiovascular stents have to be inspected 100% so that no defective stent is implanted in a human body. However, this visual control is currently performed manually, and a single stent can take tens of minutes to inspect. In this paper, a novel optical inspection system is presented. By combining a high numerical aperture (NA) optical system, a rotational stage and a line-scan camera, unrolled sections of the outer and inner surfaces of the stent are obtained and image-processed at high speed. Defects appearing on those surfaces, and also on the edges, are strongly contrasted due to the shadowing effect of the high-NA illumination and acquisition approach. Defects are therefore detected by means of morphological operations and a sensitivity parameter. Based on a trained defect library, a binary classifier sorts each kind of defect through a set of scoring vectors, providing the quality operator with all the information required to make a final decision. We expect this new approach to make defect detection completely objective and to dramatically reduce the time and cost of the stent quality control stage.
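A minimal sketch of the detection step with scikit-image (toy data; the sensitivity parameter is reduced here to a simple grey-level threshold, whereas the paper combines it with morphological operations on real high-NA images):

    import numpy as np
    from skimage import measure, morphology

    image = np.random.rand(256, 256)    # stand-in for an unrolled stent surface
    sensitivity = 0.98                  # illustrative sensitivity parameter
    candidates = image > sensitivity    # defects appear as strongly contrasted anomalies
    candidates = morphology.binary_opening(candidates, morphology.disk(1))  # drop 1-px noise
    labels = measure.label(candidates)
    regions = measure.regionprops(labels)  # per-defect features feeding the binary classifier
    print(len(regions))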
MUSQA: a CS method to build a multi-standard quality management system
NASA Astrophysics Data System (ADS)
Cros, Elizabeth; Sneed, Isabelle
2002-07-01
CS Communication & Systèmes, through its long experience in quality management, has built and evolved its Quality Management System according to client requirements, norms, standards and models (ISO, DO178, ECSS, CMM, ...), evolving norms (the transition from ISO 9001:1994 to ISO 9001:2000) and the TQM approach currently being deployed. The aim of this paper is to show how, from this enriching and instructive experience, CS has defined and formalised its method: MuSQA (Multi-Standard Quality Approach). The method makes it possible to build a new Quality Management System or to simplify and unify an existing one. MuSQA's objective is to provide any organisation with an open Quality Management System that can evolve easily and serves as a useful instrument for everyone, operational as well as non-operational staff.
NASA Astrophysics Data System (ADS)
Toussaint, F.; Hoeck, H.; Stockhause, M.; Lautenschlager, M.
2014-12-01
The classical goals of a quality assessment system in the data life cycle are (1) to encourage data creators to improve their quality assessment procedures to reach the next quality level and (2) to enable data consumers to decide whether a dataset has a quality that is sufficient for usage in the target application, i.e. to appraise the data's usability for their own purpose. As the data volumes of projects and the interdisciplinarity of data usage grow, the need for homogeneous structure and standardised notation of data and metadata increases. This third aspect is especially relevant for data repositories, as they manage data through machine agents, so checks for homogeneity and consistency in early parts of the workflow become essential to cope with today's data volumes. Selected parts of the workflow in the model intercomparison project CMIP5, the archival of the data for the interdisciplinary user community of the IPCC-DDC AR5, and the associated quality checks are reviewed. We compare data and metadata checks and relate different types of checks to their positions in the data life cycle. The project's data citation approach is included in the discussion, with a focus on the time necessary to comply with the project's requirements for formal data citations and the demand for the availability of such citations. In order to make the quality assessments of different projects comparable, WDCC developed a generic Quality Assessment System: based on the self-assessment approach of a maturity matrix, an objective and uniform quality level system for all data at WDCC is derived, consisting of five maturity quality levels.
Slic Superpixels for Object Delineation from Uav Data
NASA Astrophysics Data System (ADS)
Crommelinck, S.; Bennett, R.; Gerke, M.; Koeva, M. N.; Yang, M. Y.; Vosselman, G.
2017-08-01
Unmanned aerial vehicles (UAV) are increasingly investigated with regard to their potential to create and update (cadastral) maps. UAVs provide a flexible and low-cost platform for high-resolution data from which object outlines can be accurately delineated. This delineation could be automated with image analysis methods to improve existing mapping procedures that are cost-, time- and labor-intensive and of little reproducibility. This study investigates a superpixel approach, namely simple linear iterative clustering (SLIC), in terms of its applicability to high-resolution UAV orthoimages and its ability to delineate object outlines of roads and roofs. Results show that the approach is applicable to UAV orthoimages of 0.05 m GSD and extents of 100 million and 400 million pixels. Further, the approach delineates the objects with the high accuracy provided by the UAV orthoimages at completeness rates of up to 64%. The approach is not suitable as a standalone approach for object delineation. However, it shows high potential for combination with further methods that delineate objects at higher correctness rates in exchange for lower localization quality. This study provides a basis for future work that will focus on the incorporation of multiple methods for an interactive, comprehensive and accurate object delineation from UAV data. This aims to support numerous application fields such as topographic and cadastral mapping.
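Applying SLIC is essentially a one-liner in scikit-image; a minimal sketch on a stand-in RGB image (the study instead uses UAV orthoimages of 0.05 m GSD and tunes segment count and compactness):

    from skimage import data, segmentation

    image = data.astronaut()            # stand-in for a UAV orthoimage tile
    segments = segmentation.slic(image, n_segments=500, compactness=10, start_label=1)
    outlined = segmentation.mark_boundaries(image, segments)  # candidate object outlines
    print(segments.max(), "superpixels")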
Managing complex processing of medical image sequences by program supervision techniques
NASA Astrophysics Data System (ADS)
Crubezy, Monica; Aubry, Florent; Moisan, Sabine; Chameroy, Virginie; Thonnat, Monique; Di Paola, Robert
1997-05-01
Our objective is to offer clinicians wider access to evolving medical image processing (MIP) techniques, which are crucial for improving the assessment and quantification of physiological processes but difficult for non-specialists in MIP to handle. Based on artificial intelligence techniques, our approach consists in the development of a knowledge-based program supervision system automating the management of MIP libraries. It comprises a library of programs, a knowledge base capturing the expertise about programs and data, and a supervision engine. It selects, organizes and executes the appropriate MIP programs given a goal to achieve and a data set, with dynamic feedback based on the results obtained. It also advises users in the development of new procedures chaining MIP programs. We have tested the approach in an application of factor analysis of medical image sequences as a means of predicting the response of osteosarcoma to chemotherapy, with both MRI and NM dynamic image sequences. As a result, our program supervision system frees clinical end-users from performing tasks outside their competence, permitting them to concentrate on clinical issues. Our approach therefore enables a better exploitation of the possibilities offered by MIP and higher-quality results, both in terms of robustness and reliability.
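A minimal sketch of the supervision idea (hypothetical goal names and toy programs, not the authors' knowledge base): a rule base maps a processing goal to an ordered program chain, which is executed with a feedback check on intermediate results:

    def smooth(x):
        return [v * 0.9 for v in x]

    def enhance(x):
        return [v + 1.0 for v in x]

    knowledge_base = {
        "denoise-then-quantify": [smooth, enhance],  # goal -> ordered program chain
    }

    def supervise(goal, data, quality_ok):
        for program in knowledge_base[goal]:
            data = program(data)
            if not quality_ok(data):                 # dynamic feedback on the results
                raise RuntimeError(f"replanning needed after {program.__name__}")
        return data

    print(supervise("denoise-then-quantify", [1.0, 2.0], quality_ok=lambda d: True))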
Oh, Seungtaik; Jeong, Il Kwon
2015-11-16
We introduce a simple new analytic formula for the Fourier coefficient of the 3D field distribution of a point light source, used to generate a cylindrical angular spectrum that captures the object wave over 360° in the 3D Fourier space. Conceptually, the cylindrical angular spectrum can be understood as a cylindrical version of the omnidirectional spectral approach of Sando et al. Our Fourier coefficient formula is based on the intuitive observation that a point light source radiates uniformly in all directions. The formula is defined over all frequency vectors lying on the entire sphere in the 3D Fourier space and is more natural and computationally more efficient for all-around recording of the object wave than the previous omnidirectional spectral method. A generalized frequency-based occlusion culling method for arbitrary complex objects is also proposed to enhance the 3D quality of a hologram. As a practical application of the cylindrical angular spectrum, an interactive hologram example is presented together with implementation details.
ERIC Educational Resources Information Center
Valikhanova, Zarina
2015-01-01
This article considers the problems of defining quality in the educational sphere. Alternative approaches to the concept of quality of education and its evaluation are determined, given the different approaches of scientists and experts. The most important criteria for assessing quality are distinguished and formed into a matrix for…
Steps in Moving Evidence-Based Health Informatics from Theory to Practice.
Rigby, Michael; Magrabi, Farah; Scott, Philip; Doupi, Persephone; Hypponen, Hannele; Ammenwerth, Elske
2016-10-01
To demonstrate and promote the importance of applying a scientific process to health IT design and implementation, and of basing this on research principles and techniques. A review by international experts linked to the IMIA Working Group on Technology Assessment and Quality Development. Four approaches are presented, linking to the creation of national professional expectations, adherence to research-based standards, quality assurance approaches to ensure safety, and scientific measurement of impact. Solely marketing- and aspiration-based approaches to health informatics applications are no longer ethical or acceptable when scientifically grounded evidence-based approaches are available and in use.
Soper, Bryony; Buxton, Martin; Hanney, Stephen; Oortwijn, Wija; Scoggins, Amanda; Steel, Nick; Ling, Tom
2008-01-01
In 2004 a UK charity, The Health Foundation, established the 'Engaging with Quality Initiative' to explore and evaluate the benefits of engaging clinicians in quality improvement in healthcare. Eight projects run by professional bodies or specialist societies were commissioned in various areas of acute care. A developmental approach to the initiative was adopted, accompanied by a two level evaluation: eight project self-evaluations and a related external evaluation. This paper describes how the protocol for the external evaluation was developed. The challenges faced included large variation between and within the projects (in approach, scope and context, and in understanding of quality improvement), the need to support the project teams in their self-evaluations while retaining a necessary objectivity, and the difficulty of evaluating the moving target created by the developmental approach adopted in the initiative. An initial period to develop the evaluation protocol proved invaluable in helping us to explore these issues. PMID:18973650
Small satellite product assurance
NASA Astrophysics Data System (ADS)
Demontlivault, J.; Cadelec, Jacques
1993-01-01
In order to increase interest in small satellites, their cost must be reduced; reducing the product assurance costs induced by quality requirements is a major objective. For a logical approach, small satellites are classified into three main categories: satellites for experimental operations with a short lifetime, operational satellites manufactured in small series with long lifetime requirements, and operational satellites (long lifetime required) of which only a few models are produced. The various product assurance requirements are examined for each satellite category: general requirements for the space approach, reliability, electronic components, materials and processes, quality assurance, documentation, tests, and management. An ideal product assurance system integrates quality teams and engineering teams.
Marti, Alessandra; Cattaneo, Stefano; Benedetti, Simona; Buratti, Susanna; Abbasi Parizad, Parisa; Masotti, Fabio; Iametti, Stefania; Pagani, Maria Ambrogina
2017-11-01
The consumption of whole-grain foods, including pasta, has been increasing steadily. In the case of whole-grain pasta, given the many different producers, it seems important to have objective parameters to define its overall quality. In this study, commercial whole-grain pasta samples representative of the Italian market were characterized from both a molecular and an electronic-senses (electronic nose and electronic tongue) standpoint in order to provide a survey of the properties of different commercial samples. Only 1 pasta product showed very low levels of heat-damage markers (furosine and pyrraline), suggesting that this sample underwent a low-temperature drying treatment. In all samples, the furosine content was directly correlated with protein structural indices, since protein structure compactness increased with increasing levels of heat-damage markers. The electronic senses were able to discriminate among pasta samples according to the intensity of heat treatment during the drying step. The pasta sample with low furosine content was discriminated by umami taste and by sensors responding to aliphatic and inorganic compounds. Data obtained with this multidisciplinary approach are meant to provide hints for identifying useful indices of pasta quality. As observed for semolina pasta, objective parameters based on heat damage were best suited to defining the overall quality of whole-grain pasta, almost independently of compositional differences among commercial samples. Drying treatments of different intensity also had an impact on instrumental sensory traits, which may provide a reliable alternative to the analytical determination of chemical markers of heat damage whenever time-consuming procedures must be avoided. © 2017 Institute of Food Technologists®.
A denoising algorithm for CT image using low-rank sparse coding
NASA Astrophysics Data System (ADS)
Lei, Yang; Xu, Dong; Zhou, Zhengyang; Wang, Tonghe; Dong, Xue; Liu, Tian; Dhabaan, Anees; Curran, Walter J.; Yang, Xiaofeng
2018-03-01
We propose a denoising method for CT images based on low-rank sparse coding. The proposed method constructs an adaptive dictionary of image patches and estimates the sparse coding regularization parameters using a Bayesian interpretation. A low-rank approximation approach is used to simultaneously construct the dictionary and achieve sparse representation through clustering similar image patches. A variable-splitting scheme and a quadratic optimization are used to reconstruct the CT image based on the achieved sparse coefficients. We tested this denoising technology using phantom, brain and abdominal CT images. The experimental results showed that the proposed method delivers state-of-the-art denoising performance, both in terms of objective criteria and visual quality.
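A minimal sketch of the core clustering-plus-low-rank step (illustrative only; the paper couples this with Bayesian regularization-parameter estimation and a variable-splitting reconstruction):

    import numpy as np
    from sklearn.cluster import KMeans
    from sklearn.feature_extraction.image import extract_patches_2d

    noisy = np.random.rand(64, 64)      # stand-in for a noisy CT slice
    patches = extract_patches_2d(noisy, (8, 8), max_patches=2000, random_state=0)
    P = patches.reshape(len(patches), -1)

    labels = KMeans(n_clusters=10, n_init=10, random_state=0).fit_predict(P)
    denoised = np.empty_like(P)
    for k in range(10):
        idx = labels == k
        U, s, Vt = np.linalg.svd(P[idx], full_matrices=False)
        r = 4                            # keep a small rank per cluster of similar patches
        denoised[idx] = (U[:, :r] * s[:r]) @ Vt[:r]
    # The denoised patches would then be aggregated back into the image.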
Advanced Imaging Methods for Long-Baseline Optical Interferometry
NASA Astrophysics Data System (ADS)
Le Besnerais, G.; Lacour, S.; Mugnier, L. M.; Thiebaut, E.; Perrin, G.; Meimon, S.
2008-11-01
We address the data processing methods needed for imaging with a long-baseline optical interferometer. We first describe parametric reconstruction approaches and adopt a general formulation of nonparametric image reconstruction as the solution of a constrained optimization problem. Within this framework, we present two recent reconstruction methods, Mira and Wisard, representative of the two generic approaches for dealing with the missing phase information. Mira is based on an implicit approach and a direct optimization of a Bayesian criterion, while Wisard adopts a self-calibration approach and an alternate minimization scheme inspired by radio astronomy. Both methods can handle various regularization criteria. We review commonly used regularization terms and introduce an original quadratic regularization called "soft support constraint" that favors object compactness. It yields images of quality comparable to nonquadratic regularizations on the synthetic data we have processed. We then perform image reconstructions, both parametric and nonparametric, on astronomical data from the IOTA interferometer, and discuss the respective roles of parametric and nonparametric approaches in optical interferometric imaging.
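For concreteness, one plausible form of such a quadratic "soft support" term (our notation, inferred from the description above rather than quoted from the paper) adds a distance-weighted penalty to the data-fidelity criterion:

    J(x) \;=\; \chi^2(x) \;+\; \mu \sum_k w_k \, x_k^2 ,
    \qquad w_k \ \text{increasing with} \ \lVert \mathbf{r}_k - \mathbf{r}_0 \rVert ,

where $x_k$ is the flux of pixel $k$ at position $\mathbf{r}_k$ and $\mathbf{r}_0$ is the assumed object center. Because the penalty is quadratic it remains easy to optimize, while the growing weights $w_k$ softly discourage flux far from the center and thus favor compact objects.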
Spatial and Temporal Dynamics in Air Pollution Exposure Assessment
Dias, Daniela; Tchepel, Oxana
2018-01-01
Analyzing individual exposure in urban areas offers several challenges, as both the individual's activities and air pollution levels demonstrate a large degree of spatial and temporal dynamics. This review article discusses the concepts, key elements and current developments in assessing personal exposure to urban air pollution (seventy-two studies reviewed), together with their respective advantages and disadvantages. A new conceptual structure for organizing personal exposure assessment methods is proposed according to two classification criteria: (i) spatial-temporal variation of individuals' activities (point-fixed or trajectory-based) and (ii) characterization of air quality (variable or uniform). This review suggests that the spatial and temporal variability of urban air pollution levels, in combination with indoor exposures and individuals' time-activity patterns, are key elements of personal exposure assessment. In the literature reviewed, the majority of studies (44) indicate that the trajectory-based approach with variable air quality provides a promising framework for tackling the important question of inter- and intra-variability of individual exposure. However, future quantitative comparisons between the different approaches should be performed, and the selection of the most appropriate approach for exposure quantification should take into account the purpose of the health study. This review provides a structured basis for intercomparing different methodologies and for making their advantages and limitations more transparent in addressing specific research objectives. PMID:29558426
WE-A-BRC-01: Introduction to the Certificate Course
DOE Office of Scientific and Technical Information (OSTI.GOV)
Palta, J.
Quality and safety in healthcare are inextricably linked. There are compelling data that link poor-quality radiation therapy to inferior patient survival. Radiation oncology clinical trial protocol deviations often involve incorrect target volume delineation or dosing, akin to radiotherapy incidents, which also often involve partial geometric miss or improper radiation dosing. When patients with radiation protocol variations are compared to those without significant protocol variations, clinical outcome is negatively impacted. Traditionally, quality assurance in radiation oncology has been driven largely by new technological advances, and safety improvement has been driven by reactive responses to past system failures and by prescriptive mandates recommended by professional organizations and promulgated by regulators. Prescriptive approaches to quality and safety alone often do not address the huge variety of processes and techniques used in radiation oncology. Risk-based assessments of radiotherapy processes provide a mechanism to enhance quality and safety, both for new and for established techniques. It is imperative that we explore such a paradigm shift at this time, when expectations from patients as well as providers are rising while available resources are falling. There is much we can learn from our past experiences to be applied towards the new risk-based assessments. Learning Objectives: (1) Understand the impact of clinical and technical quality on outcomes; (2) understand the importance of quality care in radiation oncology; (3) learn to assess the impact of quality on clinical outcomes. D. Followill: NIH Grant CA180803.
WE-A-BRC-03: Lessons Learned: IROC Audits
DOE Office of Scientific and Technical Information (OSTI.GOV)
Followill, D.
Quality and safety in healthcare are inextricably linked. There are compelling data that link poor-quality radiation therapy to inferior patient survival. Radiation oncology clinical trial protocol deviations often involve incorrect target volume delineation or dosing, akin to radiotherapy incidents, which also often involve partial geometric miss or improper radiation dosing. When patients with radiation protocol variations are compared to those without significant protocol variations, clinical outcome is negatively impacted. Traditionally, quality assurance in radiation oncology has been driven largely by new technological advances, and safety improvement has been driven by reactive responses to past system failures and by prescriptive mandates recommended by professional organizations and promulgated by regulators. Prescriptive approaches to quality and safety alone often do not address the huge variety of processes and techniques used in radiation oncology. Risk-based assessments of radiotherapy processes provide a mechanism to enhance quality and safety, both for new and for established techniques. It is imperative that we explore such a paradigm shift at this time, when expectations from patients as well as providers are rising while available resources are falling. There is much we can learn from our past experiences to be applied towards the new risk-based assessments. Learning Objectives: (1) Understand the impact of clinical and technical quality on outcomes; (2) understand the importance of quality care in radiation oncology; (3) learn to assess the impact of quality on clinical outcomes. D. Followill: NIH Grant CA180803.
WE-A-BRC-02: Lessons Learned: Clinical Trials and Operations
DOE Office of Scientific and Technical Information (OSTI.GOV)
Evans, S.
Quality and safety in healthcare are inextricably linked. There are compelling data that link poor-quality radiation therapy to inferior patient survival. Radiation oncology clinical trial protocol deviations often involve incorrect target volume delineation or dosing, akin to radiotherapy incidents, which also often involve partial geometric miss or improper radiation dosing. When patients with radiation protocol variations are compared to those without significant protocol variations, clinical outcome is negatively impacted. Traditionally, quality assurance in radiation oncology has been driven largely by new technological advances, and safety improvement has been driven by reactive responses to past system failures and by prescriptive mandates recommended by professional organizations and promulgated by regulators. Prescriptive approaches to quality and safety alone often do not address the huge variety of processes and techniques used in radiation oncology. Risk-based assessments of radiotherapy processes provide a mechanism to enhance quality and safety, both for new and for established techniques. It is imperative that we explore such a paradigm shift at this time, when expectations from patients as well as providers are rising while available resources are falling. There is much we can learn from our past experiences to be applied towards the new risk-based assessments. Learning Objectives: (1) Understand the impact of clinical and technical quality on outcomes; (2) understand the importance of quality care in radiation oncology; (3) learn to assess the impact of quality on clinical outcomes. D. Followill: NIH Grant CA180803.
Automatic evidence quality prediction to support evidence-based decision making.
Sarker, Abeed; Mollá, Diego; Paris, Cécile
2015-06-01
Evidence-based medicine practice requires practitioners to obtain the best available medical evidence and to appraise its quality when making clinical decisions. Primarily due to the plethora of electronically available data from the medical literature, the manual appraisal of the quality of evidence is a time-consuming process. We present a fully automatic approach for predicting the quality of medical evidence in order to aid practitioners at point-of-care. Our approach extracts relevant information from medical article abstracts and utilises data from a specialised corpus to apply supervised machine learning for the prediction of quality grades. Following an in-depth analysis of their usefulness, features (e.g., publication types of articles) are extracted from the text via rule-based approaches and from the meta-data associated with the articles, and then applied in the supervised classification model. We propose a highly scalable and portable approach using a sequence of high-precision classifiers, and introduce a simple evaluation metric called average error distance (AED) that simplifies the comparison of systems. We also perform elaborate human evaluations to compare the performance of our system against human judgments. We test and evaluate our approaches on a publicly available, specialised, annotated corpus containing 1132 evidence-based recommendations. Our rule-based approach performs exceptionally well at the automatic extraction of publication types of articles, with F-scores of up to 0.99 for high-quality publication types. For evidence quality classification, our approach obtains an accuracy of 63.84% and an AED of 0.271. The human evaluations show that the performance of our system, in terms of AED and accuracy, is comparable to that of humans on the same data. The experiments suggest that our structured text classification framework achieves evaluation results comparable to human performance. Our overall classification approach and evaluation technique are also highly portable and can be used for various evidence grading scales. Copyright © 2015 Elsevier B.V. All rights reserved.
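A minimal sketch of an AED-style metric on ordinal grades (our reading of "average error distance" as the mean absolute distance between predicted and true grade indices; the paper's exact definition may differ):

    def average_error_distance(predicted, true):
        grades = {"A": 0, "B": 1, "C": 2}   # hypothetical ordinal grade scale
        return sum(abs(grades[p] - grades[t])
                   for p, t in zip(predicted, true)) / len(true)

    print(average_error_distance(["A", "B", "C", "B"], ["A", "C", "C", "A"]))  # 0.5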
Nicol, Sam; Wiederholt, Ruscena; Diffendorfer, James E.; Mattsson, Brady; Thogmartin, Wayne E.; Semmens, Darius J.; Laura Lopez-Hoffman,; Norris, Ryan
2016-01-01
Mobile species with complex spatial dynamics can be difficult to manage because their population distributions vary across space and time, and because the consequences of managing particular habitats are uncertain when evaluated at the level of the entire population. Metrics to assess the importance of habitats, and of the pathways connecting habitats in a network, are necessary to guide a variety of management decisions. Given the many metrics developed for spatially structured models, it can be challenging to select the most appropriate one for a particular decision. To guide the management of spatially structured populations, we define three classes of metrics describing habitat and pathway quality based on their data requirements (graph-based, occupancy-based, and demographic-based metrics) and synopsize the ecological literature relating to these classes. Applying the first steps of a formal decision-making approach (problem framing, objectives, and management actions), we assess the utility of metrics for particular types of management decisions. Our framework can help managers frame problems, choose metrics of habitat and pathway quality, and elucidate the data needs of a particular metric. Our goal is to help managers narrow the range of suitable metrics for a management project, and to aid decision-making that makes the best use of limited resources.
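For the least data-hungry class, graph-based metrics, a minimal networkx sketch (toy migratory network; edge weights are a stand-in for pathway quality):

    import networkx as nx

    G = nx.Graph()
    G.add_weighted_edges_from([
        ("breeding", "stopover1", 0.8),   # edges are pathways between habitats
        ("stopover1", "wintering", 0.6),
        ("breeding", "stopover2", 0.4),
        ("stopover2", "wintering", 0.9),
    ])
    print(nx.betweenness_centrality(G))       # habitat (node) importance
    print(nx.edge_betweenness_centrality(G))  # pathway (edge) importance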
A fully 3D approach for metal artifact reduction in computed tomography.
Kratz, Barbel; Weyers, Imke; Buzug, Thorsten M
2012-11-01
In computed tomography imaging, metal objects in the region of interest introduce inconsistencies during data acquisition. Reconstructing these data leads to an image in the spatial domain containing star-shaped or stripe-like artifacts. In order to enhance the quality of the resulting image, the influence of the metal objects can be reduced. Here, a metal artifact reduction (MAR) approach is proposed that is based on recomputing the inconsistent projection data using a fully three-dimensional Fourier-based interpolation. The success of the projection-space restoration depends sensitively on a sensible continuation of neighboring structures into the recomputed area. Fortunately, structural information about the entire data is inherently included in the Fourier space of the data, and this can be used for a reasonable recomputation of the inconsistent projection data. The key step of the proposed MAR strategy is the recomputation of the inconsistent projection data by means of an interpolation using nonequispaced fast Fourier transforms (NFFT). The NFFT interpolation can be applied in arbitrary dimensions; the approach overcomes the problem of adequate neighborhood definitions on irregular grids, since these are inherently given through the usage of higher-dimensional Fourier transforms. Here, applications up to the third interpolation dimension are presented and validated. Furthermore, prior knowledge may be included by an appropriate damping of the transform during the interpolation step. This MAR method is applicable to each angular view of a detector row, to two-dimensional projection data, and to three-dimensional projection data, e.g., a set of sequential acquisitions at different spatial positions, projection data of a spiral acquisition, or cone-beam projection data. Results of the novel MAR scheme based on one-, two-, and three-dimensional NFFT interpolations are presented. All results are compared, in projection data space and in the spatial domain, with the well-known one-dimensional linear interpolation strategy. In conclusion, it is recommended to include as much spatial information in the recomputation step as possible; this is realized by increasing the dimension of the NFFT. The resulting image quality can be enhanced considerably.
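For reference, the one-dimensional baseline the paper compares against is easy to state: per projection angle, the metal trace is replaced by linear interpolation from its clean neighbours (toy data; the paper's contribution replaces exactly this step by NFFT-based interpolation in up to three dimensions):

    import numpy as np

    sinogram = np.random.rand(180, 256)       # angles x detector bins (toy data)
    metal_mask = np.zeros_like(sinogram, dtype=bool)
    metal_mask[:, 120:136] = True             # stand-in for the projected metal trace

    repaired = sinogram.copy()
    bins = np.arange(sinogram.shape[1])
    for a in range(sinogram.shape[0]):
        bad = metal_mask[a]
        repaired[a, bad] = np.interp(bins[bad], bins[~bad], sinogram[a, ~bad])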
[Theoretical and conceptual contribution to evaluative research in health surveillance context].
Arreaza, Antônio Luis Vicente; de Moraes, José Cássio
2010-08-01
Initially, this article revises some of the conceptual and operational elements of evaluative research, bringing together the fields of knowledge and action in public health practices. These concepts are treated according to a wider conception of quality. The article then proposes the design of a theoretical model for the implementation of health surveillance actions, including an image-objective definition of the organization and integration of health policies and practices based on a hierarchical and local logic. Finally, the developments and challenges surrounding theory in the health evaluation field become the aim of our reflection, in order to enable the production of knowledge and approaches for constructing logic models that reveal the complexity of interventionist objects as well as their capacity to transform social practices.
A CNN based neurobiology inspired approach for retinal image quality assessment.
Mahapatra, Dwarikanath; Roy, Pallab K; Sedai, Suman; Garnavi, Rahil
2016-08-01
Retinal image quality assessment (IQA) algorithms use different hand-crafted features to train classifiers without considering the workings of the human visual system (HVS), which plays an important role in IQA. We propose a convolutional neural network (CNN) based approach that determines image quality using the underlying principles of the HVS. CNNs provide a principled approach to feature learning and hence higher accuracy in decision making. Experimental results demonstrate the superior performance of our proposed algorithm over competing methods.
Context-sensitive extraction of tree crown objects in urban areas using VHR satellite images
NASA Astrophysics Data System (ADS)
Ardila, Juan P.; Bijker, Wietske; Tolpekin, Valentyn A.; Stein, Alfred
2012-04-01
Municipalities need accurate and updated inventories of urban vegetation in order to manage green resources and estimate their return on investment in urban forestry activities. Earlier studies have shown that semi-automatic tree detection using remote sensing is a challenging task. This study aims to develop a reproducible geographic object-based image analysis (GEOBIA) methodology to locate and delineate tree crowns in urban areas using high-resolution imagery. We propose a GEOBIA approach that considers the spectral, spatial and contextual characteristics of tree objects in the urban space. The study presents classification rules that exploit object features at multiple segmentation scales, modifying the labeling and shape of image objects. The GEOBIA methodology was implemented on QuickBird images acquired over the cities of Enschede and Delft (The Netherlands), resulting in identification rates of 70% and 82%, respectively. False negative errors were concentrated on small trees, and false positive errors in private gardens. The quality of the crown boundaries was acceptable, with an overall delineation error <0.24 outside of gardens and backyards.
Mexico City Air Quality Research Initiative; Volume 5, Strategic evaluation
DOE Office of Scientific and Technical Information (OSTI.GOV)
NONE
1994-03-01
Members of the Task III (Strategic Evaluation) team were responsible for the development of a methodology to evaluate policies designed to alleviate air pollution in Mexico City. This methodology utilizes information from various reports that examined ways to reduce pollutant emissions, results from models that calculate the improvement in air quality due to a reduction in pollutant emissions, and the opinions of experts as to the requirements and trade-offs involved in developing a program to address the air pollution problem in Mexico City. The methodology combines these data to produce comparisons between different approaches to improving Mexico City's air quality. These comparisons take into account not only objective factors, such as the air quality improvement or the cost of the different approaches, but also subjective factors, such as public acceptance or the political attractiveness of the different approaches. The end result of the process is a ranking of the different approaches; more importantly, the process provides insights into the implications of implementing a particular approach or policy.
Lau, Patrick W C; Lau, Erica Y; Wong, Del P; Ransdell, Lynda
2011-07-13
A growing body of research has employed information and communication technologies (ICTs) such as the Internet and mobile phones for disseminating physical activity (PA) interventions with young populations. Although several systematic reviews have documented the effects of ICT-based interventions on PA behavior, very few have focused on children and adolescents specifically. The present review aimed to systematically evaluate the efficacy and methodological quality of ICT-based PA interventions for children and adolescents based on evidence from randomized controlled trials. Electronic databases Medline, PsycInfo, CINAHL, and Web of Science were searched to retrieve English language articles published in international academic peer-reviewed journals from January 1, 1997, through December 31, 2009. Included were articles that provided descriptions of interventions designed to improve PA-related cognitive, psychosocial, and behavioral outcomes and that used randomized controlled trial design, included only children (6-12 years old) and adolescents (13-18 years old) in both intervention and control groups, and employed Internet, email, and/or short message services (SMS, also known as text messaging) as one or more major or assistive modes to deliver the intervention. In total, 9 studies were analyzed in the present review. All studies were published after 2000 and conducted in Western countries. Of the 9 studies, 7 demonstrated positive and significant within-group differences in at least one psychosocial or behavioral PA outcome. In all, 3 studies reported positive and significant between-group differences favoring the ICT group. When between-group differences were compared across studies, effect sizes were small in 6 studies and large in 3 studies. With respect to methodological quality, 7 of the 9 studies had good methodological quality. Failure to report allocation concealment, blinding to outcome assessment, and lack of long-term follow-up were the criteria met by the fewest studies. In addition, 5 studies measured the intervention exposure rate and only 1 study employed objective measures to record data. The present review provides evidence supporting the positive effects of ICTs in PA interventions for children and adolescents, especially when used with other delivery approaches (ie, face-to-face). Because ICT delivery approaches are often mixed with other approaches and these studies sometimes lack a comparable control group, additional research is needed to establish the true independent effects of ICT as an intervention delivery mode. Although two-thirds of the studies demonstrated satisfactory methodological quality, several quality criteria should be considered in future studies: clear descriptions of allocation concealment and blinding of outcome assessment, extension of intervention duration, and employment of objective measures in intervention exposure rate. Due to the small number of studies that met inclusion criteria and the lack of consistent evidence, researchers should be cautious when interpreting the findings of the present review.
Balancing emotion and cognition: a case for decision aiding in conservation efforts.
Wilson, Robyn S
2008-12-01
Despite advances in the quality of participatory decision making for conservation, many current efforts still suffer from an inability to bridge the gap between science and policy. Judgment and decision-making research suggests this gap may result from a person's reliance on affect-based shortcuts in complex decision contexts. I examined the results from 3 experiments that demonstrate how affect (i.e., the instantaneous reaction one has to a stimulus) influences individual judgments in these contexts and identified techniques from the decision-aiding literature that help encourage a balance between affect-based emotion and cognition in complex decision processes. In the first study, subjects displayed a lack of focus on their stated conservation objectives and made decisions that reflected their initial affective impressions. Value-focused approaches may help individuals incorporate all the decision-relevant objectives by making the technical and value-based objectives more salient. In the second study, subjects displayed a lack of focus on statistical risk and again made affect-based decisions. Trade-off techniques may help individuals incorporate relevant technical data, even when it conflicts with their initial affective impressions or other value-based objectives. In the third study, subjects displayed a lack of trust in decision-making authorities when the decision involved a negatively affect-rich outcome (i.e., a loss). Identifying shared salient values and increasing procedural fairness may help build social trust in both decision-making authorities and the decision process.
QUALITY MANAGEMENT DURING SELECTION OF TECHNOLOGIES EXAMPLE SITE MARCH AIR FORCE BASE, USA
This paper describes the remedial approach, organizational structure and key elements facilitating effective and efficient remediation of contaminated sites at March Air Force Base (AFB), California. The U.S. implementation and quality assurance approach to site remediation for ...
An international virtual medical school (IVIMEDS): the future for medical education?
Harden, R M; Hart, I R
2002-05-01
The introduction of new learning technologies, the exponential growth of Internet usage and the advent of the World Wide Web have the potential of changing the face of higher education. There are also demands in medical education for greater globalization, for the development of a common core curriculum, for improving access to training, for more flexible and student-centred training programmes including programmes with multi-professional elements and for maintaining quality while increasing student numbers and working within financial constraints. An international virtual medical school (IVIMEDS) with a high-quality education programme embodying a hybrid model of a blended curriculum of innovative e-learning approaches and the best of traditional face-to-face teaching is one response to these challenges. Fifty leading international medical schools and institutions are participating in a feasibility study. This is exploring: innovative thinking and approaches to the new learning technologies including e-learning and virtual reality; new approaches to curriculum planning and mapping and advanced instructional design based on the use of 'reusable learning objects'; an international perspective on medical education which takes into account the trend to globalization; a flexible curriculum which meets the needs of different students and has the potential of increasing access to medicine.
Mobile-Based Applications and Functionalities for Self-Management of People Living with HIV.
Mehraeen, Esmaeil; Safdari, Reza; Mohammadzadeh, Niloofar; Seyedalinaghi, Seyed Ahmad; Forootan, Siavash; Mohraz, Minoo
2018-01-01
Due to the chronicity of HIV/AIDS and the increased number of people living with HIV (PLWH), these people need innovative and practical approaches to take advantage of high-quality healthcare services. The objectives of this scoping review were to identify the mobile-based applications and functionalities for self-management of people living with HIV. We conducted a comprehensive search of PubMed, Scopus, Science Direct, Web of Science and Embase databases for literature published from 2010 to 2017. Screening, data abstraction, and methodological quality assessment were done in duplicate. Our search identified 10 common mobile-based applications and 8 functionalities of these applications for self-management of people living with HIV. According to the findings, "text-messaging" and "reminder" applications were the most frequently addressed in the reviewed articles. Moreover, the results indicated that "medication adherence" was the most common functionality of mobile-based applications for PLWH. Collectively, the evidence supports the use of text messaging as a mobile-based functionality to improve medication adherence and motivational messaging. Future mobile-based applications in the healthcare industry should address additional practices such as online chatting, social conversations, physical activity intervention, and supply chain management.
Matrix Factorisation-based Calibration For Air Quality Crowd-sensing
NASA Astrophysics Data System (ADS)
Dorffer, Clement; Puigt, Matthieu; Delmaire, Gilles; Roussel, Gilles; Rouvoy, Romain; Sagnier, Isabelle
2017-04-01
Internet of Things (IoT) is extending internet to physical objects and places. The internet-enabled objects are thus able to communicate with each other and with their users. One main interest of IoT is the ease of production of huge masses of data (Big Data) using distributed networks of connected objects, thus making possible a fine-grained yet accurate analysis of physical phenomena. Mobile crowdsensing is a way to collect data using IoT. It basically consists of acquiring geolocalized data from the sensors (from or connected to the mobile devices, e.g., smartphones) of a crowd of volunteers. The sensed data are then collectively shared using wireless connection—such as GSM or WiFi—and stored on a dedicated server to be processed. One major application of mobile crowdsensing is environment monitoring. Indeed, with the proliferation of miniaturized yet sensitive sensors on one hand and, on the other hand, of low-cost microcontrollers/single-card PCs, it is easy to extend the sensing abilities of smartphones. Alongside the conventional, regulated, bulky and expensive instruments used in authoritative air quality stations, it is then possible to create a large-scale mobile sensor network providing insightful information about air quality. In particular, the finer spatial sampling rate due to such a dense network should allow air quality models to take into account local effects such as street canyons. However, one key issue with low-cost air quality sensors is the lack of trust in the sensed data. In most crowdsensing scenarios, the sensors (i) cannot be calibrated in a laboratory before or during their deployment and (ii) might be sparsely or continuously faulty (thus providing outliers in the data). Such issues should be automatically handled from the sensor readings. Indeed, due to the masses of generated data, solving the above issues cannot be performed by experts but requires specific data processing techniques. In this work, we assume that some mobile sensors share some information using the APISENSE® crowdsensing platform and we aim to calibrate the sensor responses from the data directly. For that purpose, we express the sensor readings as a low-rank matrix with missing entries and we revisit self-calibration as a Matrix Factorization (MF) problem. In our proposed framework, one factor matrix contains the calibration parameters while the other is structured by the calibration model and contains some values of the sensed phenomenon. The MF calibration approach also uses the precise measurements from ATMO—the French public institution—to drive the calibration of the mobile sensors. MF calibration can be improved using, e.g., the mean calibration parameters provided by the sensor manufacturers, or using sparse priors or a model of the physical phenomenon. All our approaches are shown to provide a better calibration accuracy than matrix-completion-based and robust-regression-based methods, even in difficult scenarios involving a lot of missing data and/or very few accurate references. When combined with a dictionary of air quality patterns, our experiments suggest that MF is not only able to perform sensor network calibration but also to provide detailed maps of air quality.
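The self-calibration-as-MF idea can be sketched in a few lines of NumPy: under an assumed affine sensor model, readings factor into per-sensor (gain, offset) parameters and shared phenomenon values, fitted on observed entries only by alternating least squares. Everything here (the affine model, the sizes, the missing-data rate) is an illustrative assumption, not the APISENSE implementation, and reference measurements would be needed to anchor the absolute scale.

```python
# Self-calibration as matrix factorization (illustrative sketch).
# Assumed affine model: reading[i, j] ~= gain[i] * phenom[j] + offset[i];
# only entries flagged in `mask` are observed. Reference (ATMO-like)
# measurements, omitted here, would anchor the absolute scale.
import numpy as np

rng = np.random.default_rng(0)
n_sensors, n_times = 5, 40
true_phenom = rng.uniform(10, 50, n_times)          # e.g., pollutant levels
X = (rng.uniform(0.8, 1.2, n_sensors)[:, None] * true_phenom[None, :]
     + rng.uniform(-3, 3, n_sensors)[:, None])      # synthetic readings
mask = rng.random(X.shape) < 0.6                    # ~40% entries missing

gain, off = np.ones(n_sensors), np.zeros(n_sensors)
for _ in range(50):                                 # alternating least squares
    # update phenomenon values (closed form, per time step)
    num = (mask * gain[:, None] * (X - off[:, None])).sum(axis=0)
    den = (mask * gain[:, None] ** 2).sum(axis=0)
    phenom = num / np.maximum(den, 1e-9)
    # update per-sensor affine calibration (2-parameter least squares)
    for i in range(n_sensors):
        obs = mask[i]
        A = np.stack([phenom[obs], np.ones(obs.sum())], axis=1)
        gain[i], off[i] = np.linalg.lstsq(A, X[i, obs], rcond=None)[0]
```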
Advani, Aneel; Jones, Neil; Shahar, Yuval; Goldstein, Mary K; Musen, Mark A
2004-01-01
We develop a method and algorithm for deciding the optimal approach to creating quality-auditing protocols for guideline-based clinical performance measures. An important element of the audit protocol design problem is deciding which guideline elements to audit. Specifically, the problem is how and when to aggregate individual patient case-specific guideline elements into population-based quality measures. The key statistical issue involved is the trade-off between increased reliability with more general population-based quality measures versus increased validity from individually case-adjusted but more restricted measures done at a greater audit cost. Our intelligent algorithm for auditing protocol design is based on hierarchically modeling incrementally case-adjusted quality constraints. We select quality constraints to measure using an optimization criterion based on statistical generalizability coefficients. We present results of the approach from a deployed decision support system for a hypertension guideline.
USDA-ARS?s Scientific Manuscript database
Intercropping of legumes with non-legumes is an ancient crop production method used to improve quality and dry matter (DM) yields of forage and grain, and to control weeds. However, there is little information regarding intercropping at high latitudes. The objectives of this field study were to eval...
USDA-ARS?s Scientific Manuscript database
Color and texture are among the key quality attributes for small fruit. Postharvest approaches such as modified atmosphere packaging (MAP) along with cold chain management have been shown to support retention of fruit quality during handling and distribution. The objective of this study was to inves...
The Assessment of Mangrove Sediment Quality in Mengkabong Lagoon: An Index Analysis Approach
ERIC Educational Resources Information Center
Praveena, Sarva M.; Radojevic, Miroslav; Abdullah, Mohd H.
2007-01-01
The objectives of this study are to use different types of indexes to assess the current pollution status in Mengkabong lagoon and select the best index to describe the Mengkabong sediment quality. The indexes used in this study were Enrichment Factor (EF), Geo-accumulation Index (Igeo), Pollution Load Index (PLI) and Marine Sediment Pollution…
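For reference, the standard formulations of three of these indices can be computed directly, as in the sketch below; the numeric inputs are invented, and background values are site-specific assumptions.

```python
# Standard sediment-pollution indices (Igeo uses the conventional factor
# of 1.5 on the background value); all inputs are illustrative.
import math

def enrichment_factor(c_metal, c_ref, bg_metal, bg_ref):
    """EF = (metal/reference)_sample / (metal/reference)_background."""
    return (c_metal / c_ref) / (bg_metal / bg_ref)

def igeo(c_metal, bg_metal):
    """Geo-accumulation index, Igeo = log2(Cn / (1.5 * Bn))."""
    return math.log2(c_metal / (1.5 * bg_metal))

def pli(contamination_factors):
    """Pollution Load Index = n-th root of the product of CFs."""
    prod = math.prod(contamination_factors)
    return prod ** (1.0 / len(contamination_factors))

print(igeo(c_metal=45.0, bg_metal=20.0))   # ~0.58
print(pli([2.25, 1.1, 0.8]))               # CF_i = C_i / background_i
```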
Quality and Equality: The Mask of Discursive Conflation in Education Policy Texts
ERIC Educational Resources Information Center
Gillies, Donald
2008-01-01
Two key themes of recent UK education policy texts have been a focus on "quality" in public sector performance, and on "equality" in the form of New Labour's stated commitment to equality of opportunity as a key policy objective. This twin approach can be seen at its most obvious in the concept of "excellence for…
ERIC Educational Resources Information Center
Ashraf, Mohammad A.; Osman, Abu Zafar Rashed; Ratan, Sarker Rafij Ahmed
2016-01-01
Purpose: The purpose of the present study is to identify the determinants that potentially influence quality education in private universities in Bangladesh. Design/methodology/approach: To attain this objective, 234 data were collected through face-to-face interviews on campus during February-March 2013 from Bachelor of Business Administration…
Coolbaugh, Crystal L; Raymond Jr, Stephen C
2015-01-01
Background Computer tailored, Web-based interventions have emerged as an effective approach to promote physical activity. Existing programs, however, do not adjust activities according to the participant’s compliance or physiologic adaptations, which may increase risk of injury and program attrition in sedentary adults. To address this limitation, objective activity monitor (AM) and heart rate data could be used to guide personalization of physical activity, but improved Web-based frameworks are needed to test such interventions. Objective The objective of this study is to (1) develop a personalized physical activity prescription (PPAP) app that combines dynamic Web-based guidance with multi-sensor AM data to promote physical activity and (2) to assess the feasibility of using this system in the field. Methods The PPAP app was constructed using an open-source software platform and a custom, multi-sensor AM capable of accurately measuring heart rate and physical activity. A novel algorithm was written to use a participant’s compliance and physiologic response to aerobic training (ie, changes in daily resting heart rate) recorded by the AM to create daily, personalized physical activity prescriptions. In addition, the PPAP app was designed to (1) manage the transfer of files from the AM to data processing software and a relational database, (2) provide interactive visualization features such as calendars and training tables to encourage physical activity, and (3) enable remote administrative monitoring of data quality and participant compliance. A 12-week feasibility study was performed to assess the utility and limitations of the PPAP app used by sedentary adults in the field. Changes in physical activity level and resting heart rate were monitored throughout the intervention. Results The PPAP app successfully created daily, personalized physical activity prescriptions and an interactive Web environment to guide and promote physical activity by the participants. The varied compliance of the participants enabled evaluation of administrative features of the app including the generation of automated email reminders, participation surveys, and daily AM file upload logs. Conclusions This study describes the development of the PPAP app, a closed-loop technology framework that enables personalized physical activity prescription and remote monitoring of an individual’s compliance and health response to the intervention. Data obtained during a 12-week feasibility study demonstrated the ability of the PPAP app to use objective AM data to create daily, personalized physical activity guidance, provide interactive feedback to users, and enable remote administrative monitoring of data quality and subject compliance. Using this approach, public health professionals, clinicians, and researchers can adapt the PPAP app to facilitate a range of personalized physical activity interventions to improve health outcomes, assess injury risk, and achieve fitness performance goals in diverse populations. PMID:26043793
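As a hedged sketch of the closed-loop idea (not the published PPAP algorithm), a daily prescription update driven by compliance and the resting-heart-rate trend might look like this; all thresholds and step sizes are invented for illustration.

```python
# Illustrative daily-target update: nudge tomorrow's activity target from
# today's compliance and the resting-heart-rate (RHR) trend. The rules
# below are hypothetical, not the PPAP app's algorithm.
def next_target(today_target_min, achieved_min, rhr_trend_bpm):
    compliance = achieved_min / today_target_min
    if rhr_trend_bpm > 2:        # resting HR rising: possible overreach
        return round(today_target_min * 0.9)
    if compliance >= 1.0:        # met target and adapting well
        return round(today_target_min * 1.05)
    if compliance < 0.5:         # far below target: back off
        return round(today_target_min * 0.85)
    return today_target_min      # hold steady otherwise

print(next_target(30, achieved_min=32, rhr_trend_bpm=-1))  # -> 32
```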
NutrientNet: An Internet-Based Approach to Teaching Market-Based Policy for Environmental Management
ERIC Educational Resources Information Center
Nguyen, To N.; Woodward, Richard T.
2009-01-01
NutrientNet is an Internet-based environment in which a class can simulate a market-based approach for improving water quality. In NutrientNet, each student receives a role as either a point source or a nonpoint source polluter, and then the participants are allowed to trade water quality credits to cost-effectively reduce pollution in a…
Hu, Xiao-Bing; Wang, Ming; Di Paolo, Ezequiel
2013-06-01
Searching the Pareto front for multiobjective optimization problems usually involves the use of a population-based search algorithm or of a deterministic method with a set of different single aggregate objective functions. The results are, in fact, only approximations of the real Pareto front. In this paper, we propose a new deterministic approach capable of fully determining the real Pareto front for those discrete problems for which it is possible to construct optimization algorithms to find the k best solutions to each of the single-objective problems. To this end, two theoretical conditions are given to guarantee the finding of the actual Pareto front rather than its approximation. Then, a general methodology for designing a deterministic search procedure is proposed. A case study is conducted, where by following the general methodology, a ripple-spreading algorithm is designed to calculate the complete exact Pareto front for multiobjective route optimization. When compared with traditional Pareto front search methods, the obvious advantage of the proposed approach is its unique capability of finding the complete Pareto front. This is illustrated by the simulation results in terms of both solution quality and computational efficiency.
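The filtering step at the heart of the approach is easy to state in code: given candidate solutions enumerated by k-best single-objective searches, keep only the nondominated points. In the sketch below, a small fixed candidate set stands in for the k-best oracle.

```python
# Nondominance filter for minimization objectives; the candidate list is
# a toy stand-in for solutions produced by k-best single-objective search.
def dominates(a, b):
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_front(points):
    return [p for p in points if not any(dominates(q, p) for q in points if q != p)]

# objective vectors (cost, time) of candidate routes
candidates = [(4, 9), (5, 7), (6, 6), (7, 8), (9, 3), (10, 4)]
print(sorted(pareto_front(candidates)))   # [(4, 9), (5, 7), (6, 6), (9, 3)]
```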
NASA Astrophysics Data System (ADS)
Setiawan, R.
2018-03-01
In this paper, the Economic Order Quantity (EOQ) of a probabilistic two-level supply-chain system for items with imperfect quality is analyzed under a service level constraint. A firm applies an active service level constraint to avoid unpredictable shortage terms in the objective function. Mathematical analysis of the optimal result is delivered using two equilibrium concepts from game theory: Stackelberg equilibrium for the cooperative strategy and Stackelberg equilibrium for the non-cooperative strategy. This is a new game-theoretic approach to inventory systems in which a service level constraint is applied to the firm's moves.
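For orientation, the classical deterministic EOQ that the probabilistic model generalizes is a one-liner; the demand and cost figures below are illustrative, and the paper's service-level and imperfect-quality terms are not reproduced here.

```python
# Classical EOQ reference point. Symbols: D = annual demand, S = ordering
# cost per order, h = holding cost per unit per year.
import math

def eoq(D, S, h):
    return math.sqrt(2 * D * S / h)

print(round(eoq(D=12000, S=50.0, h=2.5)))  # -> 693 units per order
```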
NHEERL scientists have developed an approach that could be used by the State of Oregon for development of nutrient and other water quality criteria for the Yaquina Estuary, Oregon. The principle objective in setting protective criteria is to prevent future degradation of estuari...
ERIC Educational Resources Information Center
Jurecki, Karenann; Wander, Matthew C. F.
2012-01-01
In this work, we present an approach for teaching students to evaluate scientific literature and other materials critically. We use four criteria divided into two tiers: original research, authority, objectivity, and validity. The first tier, originality and authority, assesses the quality of the source. The second tier, objectivity and validity,…
ERIC Educational Resources Information Center
Okay-Somerville, Belgin; Scholarios, Dora
2017-01-01
This article aims to understand predictors of objective (i.e. job offers, employment status and employment quality) and subjective (i.e. perceived) graduate employability during university-to-work transitions. Using survey data from two cohorts of graduates in the UK (N = 293), it contrasts three competing theoretical approaches to employability:…
Management approach recommendations. Earth Observatory Satellite system definition study (EOS)
NASA Technical Reports Server (NTRS)
1974-01-01
Management analyses and tradeoffs were performed to determine the most cost effective management approach for the Earth Observatory Satellite (EOS) Phase C/D. The basic objectives of the management approach are identified. Some of the subjects considered are as follows: (1) contract startup phase, (2) project management control system, (3) configuration management, (4) quality control and reliability engineering requirements, and (5) the parts procurement program.
NASA Astrophysics Data System (ADS)
Kostyuchenko, Yuriy V.; Sztoyka, Yulia; Kopachevsky, Ivan; Artemenko, Igor; Yuschenko, Maxim
2017-10-01
A multi-model approach for remote sensing data processing and interpretation is described. The problem of satellite data utilization in a multi-modeling approach for socio-ecological risk assessment is formally defined, and a method for utilizing observation, measurement and modeling data in the framework of the multi-model approach is described. Methodology and models for risk assessment within a decision support approach are defined and described. A method of water quality assessment using satellite observation data is presented. The method is based on the analysis of the spectral reflectance of aquifers. Spectral signatures of freshwater bodies and offshores are analyzed. Correlations between spectral reflectance, pollution and selected water quality parameters are analyzed and quantified. Data from the MODIS, MISR, AIRS and Landsat sensors received in 2002-2014 have been utilized, verified by in-field spectrometry and lab measurements. A fuzzy-logic-based approach for decision support in the field of water quality degradation risk is discussed. Decisions on the water quality category are made by a fuzzy algorithm using a limited set of uncertain parameters. Data from satellite observations, field measurements and modeling are utilized in the framework of the proposed approach. It is shown that this algorithm allows estimating the water quality degradation rate and pollution risks. Problems of constructing spatial and temporal distributions of the calculated parameters, as well as the problem of data regularization, are discussed. Using the proposed approach, maps of surface water pollution risk from point and diffuse sources are calculated and discussed.
Magrabi, F; Ammenwerth, E; Hyppönen, H; de Keizer, N; Nykänen, P; Rigby, M; Scott, P; Talmon, J; Georgiou, A
2016-11-10
With growing use of IT by healthcare professionals and patients, the opportunity for any unintended effects of technology to disrupt health care processes and outcomes is intensified. The objectives of this position paper by the IMIA Working Group (WG) on Technology Assessment and Quality Development are to highlight how our ongoing initiatives to enhance evaluation are also addressing the unintended consequences of health IT. Methods: Review of WG initiatives. Results: We argue that an evidence-based approach underpinned by rigorous evaluation is fundamental to the safe and effective use of IT, and for detecting and addressing its unintended consequences in a timely manner. We provide an overview of our ongoing initiatives to strengthen study design, execution and reporting by using evaluation frameworks and guidelines which can enable better characterization and monitoring of unintended consequences, including the Good Evaluation Practice Guideline in Health Informatics (GEP-HI) and the Statement on Reporting of Evaluation Studies in Health Informatics (STARE-HI). Indicators to benchmark the adoption and impact of IT can similarly be used to monitor unintended effects on healthcare structures, processes and outcomes. We have also developed EvalDB, a web-based database of evaluation studies to promulgate evidence about unintended effects, and are developing the content for courses to improve training in health IT evaluation. Conclusions: Evaluation is an essential ingredient for the effective use of IT to improve healthcare quality and patient safety. WG resources and skills development initiatives can facilitate a proactive and evidence-based approach to detecting and addressing the unintended effects of health IT.
An Assessment of the Quality of Life in the European Union Based on the Social Indicators Approach
ERIC Educational Resources Information Center
Grasso, Marco; Canova, Luciano
2008-01-01
This article carries out a multidimensional analysis of welfare based on the social indicators approach aimed at assessing the quality of life in the 25 member countries of the European Union. It begins with description of the social indicators approach and provides some specifications on its most controversial points. It then specifies the…
Shao, Jingyuan; Cao, Wen; Qu, Haibin; Pan, Jianyang; Gong, Xingchu
2018-01-01
The aim of this study was to present a novel analytical quality by design (AQbD) approach for developing an HPLC method to analyze herbal extracts. In this approach, critical method attributes (CMAs) and critical method parameters (CMPs) of the analytical method were determined using the same data collected from screening experiments. The HPLC-ELSD method for separation and quantification of sugars in Codonopsis Radix extract (CRE) samples and Astragali Radix extract (ARE) samples was developed as an example method with a novel AQbD approach. Potential CMAs and potential CMPs were identified from the Analytical Target Profile. After the screening experiments, the retention time of the D-glucose peak of CRE samples, the signal-to-noise ratio of the D-glucose peak of CRE samples, and the retention time of the sucrose peak in ARE samples were considered CMAs. The initial and final composition of the mobile phase, flow rate, and column temperature were found to be CMPs using a standard partial regression coefficient method. The probability-based design space was calculated using a Monte-Carlo simulation method and verified by experiments. The optimized method was validated to be accurate and precise, and then it was applied in the analysis of CRE and ARE samples. The present AQbD approach is efficient and suitable for analysis objects with complex compositions.
A tutorial for developing a topical cream formulation based on the Quality by Design approach.
Simões, Ana; Veiga, Francisco; Vitorino, Carla; Figueiras, Ana
2018-06-20
The pharmaceutical industry has entered in a new era, as there is a growing interest in increasing the quality standards of dosage forms, through the implementation of more structured development and manufacturing approaches. For many decades, the manufacturing of drug products was controlled by a regulatory framework to guarantee the quality of the final product through a fixed process and exhaustive testing. Limitations related to the Quality by Test (QbT) system have been widely acknowledged. The emergence of Quality by Design (QbD) as a systematic and risk-based approach introduced a new quality concept based on a good understanding of how raw materials and process parameters influence the final quality profile. Although the QbD system has been recognized as a revolutionary approach to product development and manufacturing, its full implementation in the pharmaceutical field is still limited. This is particularly evident in the case of semisolid complex formulation development. The present review aims at establishing a practical QbD framework to describe all stages comprised in the pharmaceutical development of a conventional cream in a comprehensible manner. Copyright © 2018. Published by Elsevier Inc.
Sunway Medical Laboratory Quality Control Plans Based on Six Sigma, Risk Management and Uncertainty.
Jairaman, Jamuna; Sakiman, Zarinah; Li, Lee Suan
2017-03-01
Sunway Medical Centre (SunMed) implemented Six Sigma, measurement uncertainty, and risk management after the CLSI EP23 Individualized Quality Control Plan approach. Despite the differences in all three approaches, each implementation was beneficial to the laboratory, and none was in conflict with another approach. A synthesis of these approaches, built on a solid foundation of quality control planning, can help build a strong quality management system for the entire laboratory. Copyright © 2016 Elsevier Inc. All rights reserved.
Simulation-based artifact correction (SBAC) for metrological computed tomography
NASA Astrophysics Data System (ADS)
Maier, Joscha; Leinweber, Carsten; Sawall, Stefan; Stoschus, Henning; Ballach, Frederic; Müller, Tobias; Hammer, Michael; Christoph, Ralf; Kachelrieß, Marc
2017-06-01
Computed tomography (CT) is a valuable tool for the metrological assessment of industrial components. However, the application of CT to the investigation of highly attenuating objects or multi-material components is often restricted by the presence of CT artifacts caused by beam hardening, x-ray scatter, off-focal radiation, partial volume effects or the cone-beam reconstruction itself. In order to overcome this limitation, this paper proposes an approach to calculate a correction term that compensates for the contribution of artifacts and thus enables an appropriate assessment of these components using CT. Therefore, we make use of computer simulations of the CT measurement process. Based on an appropriate model of the object, e.g. an initial reconstruction or a CAD model, two simulations are carried out. One simulation considers all physical effects that cause artifacts using dedicated analytic methods as well as Monte Carlo-based models. The other one represents an ideal CT measurement, i.e. a measurement in parallel beam geometry with a monochromatic, point-like x-ray source and no x-ray scattering. Thus, the difference between these simulations is an estimate for the present artifacts and can be used to correct the acquired projection data or the corresponding CT reconstruction, respectively. The performance of the proposed approach is evaluated using simulated as well as measured data of single and multi-material components. Our approach yields CT reconstructions that are nearly free of artifacts and thereby clearly outperforms commonly used artifact reduction algorithms in terms of image quality. A comparison against tactile reference measurements demonstrates the ability of the proposed approach to increase the accuracy of the metrological assessment significantly.
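The correction step itself reduces to a subtraction, sketched below with stub arrays standing in for projection data; the simulators (Monte Carlo scatter, beam-hardening models, and so on) are assumed to exist elsewhere.

```python
# Core simulation-based artifact correction (SBAC) identity: the difference
# between a physics-complete and an ideal simulation of the same object
# model estimates the artifact contribution in each projection bin.
import numpy as np

def sbac_correct(measured, sim_ideal, sim_realistic):
    """corrected = measured - (realistic - ideal), per projection bin."""
    artifact_estimate = sim_realistic - sim_ideal
    return measured - artifact_estimate

# stand-in projection values; sim_* would come from the CAD/initial model
p_meas = np.array([1.00, 1.35, 1.80])
p_ideal = np.array([1.00, 1.30, 1.70])        # monochromatic, scatter-free
p_real = np.array([1.02, 1.38, 1.85])         # beam hardening + scatter
print(sbac_correct(p_meas, p_ideal, p_real))  # -> [0.98 1.27 1.65]
```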
40 CFR Appendix A to Part 58 - Quality Assurance Requirements for SLAMS, SPMs and PSD Air Monitoring
Code of Federal Regulations, 2010 CFR
2010-07-01
... quality system in terms of the organizational structure, functional responsibilities of management and... more stringent requirements. Monitoring organizations may, based on their quality objectives, develop... infrequent work with EPA funds may combine the QMP with the QAPP based on negotiations with the funding...
Flight Simulator Visual-Display Delay Compensation
NASA Technical Reports Server (NTRS)
Crane, D. Francis
1981-01-01
A piloted aircraft can be viewed as a closed-loop man-machine control system. When a simulator pilot is performing a precision maneuver, a delay in the visual display of aircraft response to pilot-control input decreases the stability of the pilot-aircraft system. The less stable system is more difficult to control precisely. Pilot dynamic response and performance change as the pilot attempts to compensate for the decrease in system stability. The changes in pilot dynamic response and performance bias the simulation results by influencing the pilot's rating of the handling qualities of the simulated aircraft. The study reported here evaluated an approach to visual-display delay compensation. The objective of the compensation was to minimize delay-induced change in pilot performance and workload. The compensation was effective. Because the compensation design approach is based on well-established control-system design principles, prospects are favorable for successful application of the approach in other simulations.
NASA Astrophysics Data System (ADS)
Landhäusser, Simon
2017-04-01
Forest loss and degradation is occurring worldwide, but at the same time efforts in forest restoration are ever increasing. While approaches to restoration often follow specific stakeholder objectives, regional climates and the degree of site degradation also play an important role in the prioritization of restoration efforts. Often the restoration of degraded lands can satisfy only a few measurable objectives; however, to design and restore resistant and resilient ecosystems that can adapt to changing conditions, there is a need for new and adaptive management approaches. Mining and other resource extraction industries are affecting more and more forested areas worldwide. A priority in the reclamation and certification of forest lands disturbed by industrial activity is their expeditious redevelopment to functioning forests. To rehabilitate these heavily disturbed areas back to forest ecosystems, planting of trees remains one of the most effective strategies for the redevelopment of a continuous tree canopy on a site. It is well understood that access to good-quality seedling stock is essential to achieve establishment success and early growth of seedlings. However, most reclamation areas have challenging initial site conditions, and these conditions are often not a single factor but a combination of factors that can be additive or synergistic. Therefore, successful forest restoration on degraded lands needs to consider multiple objectives and approaches to minimize trade-offs in achieving these objectives. To meet these demands, new methods for the production and evaluation of seedling stock types are needed to ensure that seedlings are fit to grow on a wide range of site conditions or are particularly designed to grow in very specific conditions. Generally, defining seedling quality is difficult, as it is species-specific and results have been mixed, likely influenced by site conditions, further reiterating the need to carefully evaluate sites so that appropriate seedling qualities can be identified. In this presentation, I will show results from a range of studies that explored the role of seedling characteristics in response to challenging site conditions and explore the need for a balance between the recognition and improvement of limiting site conditions and the availability of quality seedling stock in forest restoration.
A practical approach to object based requirements analysis
NASA Technical Reports Server (NTRS)
Drew, Daniel W.; Bishop, Michael
1988-01-01
Presented here is an approach developed at the Unisys Houston Operation Division, which supports the early identification of objects. This domain-oriented analysis and development concept is based on entity-relationship modeling and object data flow diagrams. These modeling techniques, based on the GOOD methodology developed at the Goddard Space Flight Center, support the translation of requirements into objects which represent the real-world problem domain. The goal is to establish a solid foundation of understanding before design begins, thereby giving greater assurance that the system will do what is desired by the customer. The transition from requirements to object-oriented design is also promoted by having requirements described in terms of objects. Presented is a five-step process by which objects are identified from the requirements to create a problem definition model. This process involves establishing a baseline requirements list from which an object data flow diagram can be created. Entity-relationship modeling is used to facilitate the identification of objects from the requirements. An example is given of how semantic modeling may be used to improve the entity-relationship model, and a brief discussion is included on how this approach might be used in a large-scale development effort.
NASA Astrophysics Data System (ADS)
Liu, Yu-Che; Huang, Chung-Lin
2013-03-01
This paper proposes a multi-PTZ-camera control mechanism to acquire close-up imagery of human objects in a surveillance system. The control algorithm is based on the output of multi-camera, multi-target tracking. Three main concerns of the algorithm are (1) the imagery of the human object's face for biometric purposes, (2) the optimal video quality of the human objects, and (3) minimum hand-off time. Here, we define an objective function based on the expected capture conditions, such as the camera-subject distance, pan-tilt angles of capture, face visibility and others. Such an objective function serves to effectively balance the number of captures per subject and the quality of captures. In the experiments, we demonstrate the performance of the system, which operates in real time under real-world conditions on three PTZ cameras.
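An illustrative version of such a capture-quality objective is a weighted sum over capture conditions, as sketched below; the weights and normalizations are invented, not the paper's calibrated values.

```python
# Hypothetical weighted capture-quality objective over the conditions
# named in the abstract (distance, pan-tilt angles, face visibility).
import math

def capture_score(dist_m, pan_deg, tilt_deg, face_visibility):
    w_dist, w_angle, w_face = 0.4, 0.3, 0.3
    s_dist = math.exp(-abs(dist_m - 5.0) / 5.0)          # prefer ~5 m standoff
    s_angle = 1.0 - (abs(pan_deg) + abs(tilt_deg)) / 180.0
    return w_dist * s_dist + w_angle * s_angle + w_face * face_visibility

# pick the camera/subject assignment with the best expected capture
print(round(capture_score(dist_m=6.0, pan_deg=10, tilt_deg=5, face_visibility=0.9), 3))
```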
2010-01-01
Background The measurement of healthcare provider performance is becoming more widespread. Physicians have been guarded about performance measurement, in part because the methodology for comparative measurement of care quality is underdeveloped. Comprehensive quality improvement will require comprehensive measurement, implying the aggregation of multiple quality metrics into composite indicators. Objective To present a conceptual framework to develop comprehensive, robust, and transparent composite indicators of pediatric care quality, and to highlight aspects specific to quality measurement in children. Methods We reviewed the scientific literature on composite indicator development, health systems, and quality measurement in the pediatric healthcare setting. Frameworks were selected for explicitness and applicability to a hospital-based measurement system. Results We synthesized various frameworks into a comprehensive model for the development of composite indicators of quality of care. Among its key premises, the model proposes identifying structural, process, and outcome metrics for each of the Institute of Medicine's six domains of quality (safety, effectiveness, efficiency, patient-centeredness, timeliness, and equity) and presents a step-by-step framework for embedding the quality of care measurement model into composite indicator development. Conclusions The framework presented offers researchers an explicit path to composite indicator development. Without a scientifically robust and comprehensive approach to measurement of the quality of healthcare, performance measurement will ultimately fail to achieve its quality improvement goals. PMID:20181129
ERIC Educational Resources Information Center
Edwards, Fleur
2012-01-01
This paper explores the nascent field of risk management in higher education, which is of particular relevance in Australia currently, as the Commonwealth Government implements its plans for a risk-based approach to higher education regulation and quality assurance. The literature outlines the concept of risk management and risk-based approaches…
Wang, Lin; Qu, Hui; Liu, Shan; Dun, Cai-xia
2013-01-01
As a practical inventory and transportation problem, it is important to synthesize several objectives for the joint replenishment and delivery (JRD) decision. In this paper, a new multiobjective stochastic JRD (MSJRD) of the one-warehouse, n-retailer system, considering the balance of service level and total cost simultaneously, is proposed. The goal of this problem is to decide the reasonable replenishment interval, safety stock factor, and traveling routing. Secondly, two approaches are designed to handle this complex multi-objective optimization problem: a linear programming (LP) approach converts the multiple objectives into a single objective, while a multi-objective evolutionary algorithm (MOEA) solves the multi-objective problem directly. Thirdly, three intelligent optimization algorithms, the differential evolution algorithm (DE), hybrid DE (HDE), and genetic algorithm (GA), are utilized in the LP-based and MOEA-based approaches. Results of the MSJRD with the LP-based and MOEA-based approaches are compared by a contrastive numerical example. To analyze the nondominated solutions of the MOEA, a metric is also used to measure the distribution of the last-generation solutions. Results show that HDE outperforms DE and GA whenever LP or MOEA is adopted.
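The LP-style conversion mentioned above can be shown in miniature: a weighted sum turns the two objectives (total cost, service level) into one scalar to minimize. The weights and candidate values below are illustrative only.

```python
# Weighted-sum scalarization sketch: cost is normalized against a
# hypothetical worst case, and service level (to maximize) becomes a
# shortfall (to minimize). All numbers are invented.
def scalarize(total_cost, service_level, w_cost=0.6, w_service=0.4):
    return w_cost * (total_cost / 10000.0) + w_service * (1.0 - service_level)

candidates = [(8200, 0.95), (7600, 0.90), (9100, 0.99)]
best = min(candidates, key=lambda c: scalarize(*c))
print(best)   # -> (7600, 0.9) under these weights
```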
NASA Technical Reports Server (NTRS)
Murphy, Patrick C.; Davidson, John B.
1998-01-01
A multi-input, multi-output control law design methodology, named "CRAFT", is presented. CRAFT stands for the design objectives addressed, namely, Control power, Robustness, Agility, and Flying Qualities Tradeoffs. The methodology makes use of control law design metrics from each of the four design objective areas. It combines eigenspace assignment, which allows for direct specification of eigenvalues and eigenvectors, with a graphical approach for representing the metrics that captures numerous design goals in one composite illustration. Sensitivity of the metrics to eigenspace choice is clearly displayed, enabling the designer to assess the cost of design tradeoffs. This approach enhances the designer's ability to make informed design tradeoffs and to reach effective final designs. An example of the CRAFT methodology applied to an advanced experimental fighter and discussion of associated design issues are provided.
A simulation-based approach for estimating premining water quality: Red Mountain Creek, Colorado
Runkel, Robert L.; Kimball, Briant A; Walton-Day, Katherine; Verplanck, Philip L.
2007-01-01
Regulatory agencies are often charged with the task of setting site-specific numeric water quality standards for impaired streams. This task is particularly difficult for streams draining highly mineralized watersheds with past mining activity. Baseline water quality data obtained prior to mining are often non-existent and application of generic water quality standards developed for unmineralized watersheds is suspect given the geology of most watersheds affected by mining. Various approaches have been used to estimate premining conditions, but none of the existing approaches rigorously consider the physical and geochemical processes that ultimately determine instream water quality. An approach based on simulation modeling is therefore proposed herein. The approach utilizes synoptic data that provide spatially-detailed profiles of concentration, streamflow, and constituent load along the study reach. This field data set is used to calibrate a reactive stream transport model that considers the suite of physical and geochemical processes that affect constituent concentrations during instream transport. A key input to the model is the quality and quantity of waters entering the study reach. This input is based on chemical analyses available from synoptic sampling and observed increases in streamflow along the study reach. Given the calibrated model, additional simulations are conducted to estimate premining conditions. In these simulations, the chemistry of mining-affected sources is replaced with the chemistry of waters that are thought to be unaffected by mining (proximal, premining analogues). The resultant simulations provide estimates of premining water quality that reflect both the reduced loads that were present prior to mining and the processes that affect these loads as they are transported downstream. This simulation-based approach is demonstrated using data from Red Mountain Creek, Colorado, a small stream draining a heavily-mined watershed. Model application to the premining problem for Red Mountain Creek is based on limited field reconnaissance and chemical analyses; additional field work and analyses may be needed to develop definitive, quantitative estimates of premining water quality.
Salter, Robert S; Durbin, Gregory W; Conklin, Ernestine; Rosen, Jeff; Clancy, Jennifer
2010-12-01
Coliphages are microbial indicators specified in the Ground Water Rule that can be used to monitor for potential fecal contamination of drinking water. The Total Coliform Rule specifies coliform and Escherichia coli indicators for municipal water quality testing; thus, coliphage indicator use is less common and advances in detection methodology are less frequent. Coliphages are viral structures and, compared to bacterial indicators, are more resistant to disinfection and diffuse further distances from pollution sources. Therefore, coliphage presence may serve as a better predictor of groundwater quality. This study describes Fast Phage, a 16- to 24-h presence/absence modification of U.S. Environmental Protection Agency (EPA) Method 1601 for detection of coliphages in 100 ml water. The objective of the study is to demonstrate that the somatic and male-specific coliphage modifications provide results equivalent to those of Method 1601. Five laboratories compared the modifications, featuring same-day fluorescence-based prediction, to Method 1601 by using the performance-based measurement system (PBMS) criterion. This requires a minimum 50% positive response in 10 replicates of 100-ml water samples at coliphage contamination levels of 1.3 to 1.5 PFU/100 ml. The laboratories showed that Fast Phage meets PBMS criteria with 83.5 to 92.1% correlation of the same-day rapid fluorescence-based prediction with the next-day result. Somatic coliphage PBMS data are compared to manufacturer development data that followed the EPA alternative test protocol (ATP) validation approach. Statistical analysis of the data sets indicates that PBMS utilizes fewer samples than does the ATP approach but with similar conclusions. Results support testing the coliphage modifications by using an EPA-approved national PBMS approach with collaboratively shared samples.
Vadiati, M; Asghari-Moghaddam, A; Nakhaei, M; Adamowski, J; Akbarzadeh, A H
2016-12-15
Due to inherent uncertainties in measurement and analysis, groundwater quality assessment is a difficult task. Artificial intelligence techniques, specifically fuzzy inference systems, have proven useful in evaluating groundwater quality in uncertain and complex hydrogeological systems. In the present study, a Mamdani fuzzy-logic-based decision-making approach was developed to assess groundwater quality based on relevant indices. In an effort to develop a set of new hybrid fuzzy indices for groundwater quality assessment, a Mamdani fuzzy inference model was developed with widely-accepted groundwater quality indices: the Groundwater Quality Index (GQI), the Water Quality Index (WQI), and the Ground Water Quality Index (GWQI). To present generalized hybrid fuzzy indices, well-known groundwater quality index acceptability ranges were employed as fuzzy model output ranges, rather than expert knowledge, in the fuzzification of output parameters. The proposed approach was evaluated for its ability to assess the drinking water quality of 49 samples collected seasonally from groundwater resources in Iran's Sarab Plain during 2013-2014. Input membership functions were defined as "desirable", "acceptable" and "unacceptable" based on expert knowledge and the standard and permissible limits prescribed by the World Health Organization. Output data were categorized into multiple categories based on the GQI (5 categories), WQI (5 categories), and GWQI (3 categories). Given the potential of fuzzy models to minimize uncertainties, hybrid fuzzy-based indices produce significantly more accurate assessments of groundwater quality than traditional indices. The developed models' accuracy was assessed and a comparison of the performance indices demonstrated the Fuzzy Groundwater Quality Index model to be more accurate than both the Fuzzy Water Quality Index and Fuzzy Ground Water Quality Index models. This suggests that the new hybrid fuzzy indices developed in this research are reliable and flexible when used in groundwater quality assessment for drinking purposes. Copyright © 2016 Elsevier Ltd. All rights reserved.
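A minimal illustration of the fuzzification step for a single input follows; the parameter (nitrate) and the breakpoints are hypothetical, not the WHO-derived limits used in the study.

```python
# Fuzzification sketch for one input of a Mamdani-style system, mapping a
# crisp reading onto the "desirable"/"acceptable"/"unacceptable" classes.
def tri(x, a, b, c):
    """Triangular membership with feet a, c and peak b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def fuzzify_nitrate(mg_per_l):
    return {
        "desirable":    max(0.0, 1.0 - mg_per_l / 35.0),   # full at 0, zero at 35
        "acceptable":   tri(mg_per_l, 20.0, 35.0, 50.0),
        "unacceptable": min(1.0, max(0.0, (mg_per_l - 45.0) / 10.0)),
    }

print(fuzzify_nitrate(30.0))   # partial membership in two categories
```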
Meyerding, Stephan G H; Gentz, Maria; Altmann, Brianne; Meier-Dinkel, Lisa
2018-08-01
Consumer perspectives of beef quality are complex, leading to a market that is increasingly differentiating. Thus, ongoing monitoring and assessment of changes in consumer perspectives is essential to identify changing market conditions. Often only credence and search characteristics are evaluated in consumer studies; therefore, the objective of the present study is to examine consumer preferences and perceptions towards beef steaks, also including experience characteristics, using a mixed-methods approach. For this reason, 55 consumers participated in an experiment in Germany, including a sensory acceptance test, stated willingness to pay, and choice-based conjoint analysis (CBCA). Different quality characteristics were included, but a focus on the quality labels of 'dry aged beef', 'Block House beef', and 'Angus beef' was predominant throughout the experiment, with the results showing that quality labels significantly increased overall liking as well as the stated willingness to pay. Quality labels were also among the most important characteristics in the conjoint analysis, after origin and price. The results of all applied methods are comparable for the quality-label characteristic. The combination of the sensory acceptance test and CBCA was additionally able to evaluate all three kinds of beef quality characteristics, which could not be evaluated together using a single method. This suggests that a mixture of methods should be used to gain better knowledge of the true behavior of beef consumers. Experience and credence characteristics, including beef quality labels, present opportunities for future research as well as the potential for determining product and market differentiation. Copyright © 2018 Elsevier Ltd. All rights reserved.
Design-Based Implementation Research
ERIC Educational Resources Information Center
LeMahieu, Paul G.; Nordstrum, Lee E.; Potvin, Ashley Seidel
2017-01-01
Purpose: This paper is second of seven in this volume elaborating different approaches to quality improvement in education. It delineates a methodology called design-based implementation research (DBIR). The approach used in this paper is aimed at iteratively improving the quality of classroom teaching and learning practices in defined problem…
Moving object detection using dynamic motion modelling from UAV aerial images.
Saif, A F M Saifuddin; Prabuwono, Anton Satria; Mahayuddin, Zainal Rasyid
2014-01-01
Motion-analysis-based moving object detection from UAV aerial images is still an unsolved issue due to inconsideration of proper motion estimation. Existing moving object detection approaches for UAV aerial images did not deal with motion-based pixel intensity measurement to detect moving objects robustly. Besides, current research on moving object detection from UAV aerial images mostly depends on either the frame difference or the segmentation approach separately. There are two main purposes for this research: firstly, to develop a new motion model called DMM (dynamic motion model), and secondly, to apply the proposed segmentation approach SUED (segmentation using edge-based dilation) with frame difference embedded together with the DMM model. The proposed DMM model provides effective search windows based on the highest pixel intensity to segment only a specific area for the moving object, rather than searching the whole area of the frame using SUED. At each stage of the proposed scheme, experimental fusion of DMM and SUED extracts moving objects faithfully. Experimental results reveal that the proposed DMM and SUED successfully demonstrate the validity of the proposed methodology.
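A high-level OpenCV sketch of the frame-difference-plus-edge-dilation idea follows; the thresholds, kernel size, and area filter are placeholder values, and the DMM search-window logic from the paper is not reproduced.

```python
# Frame difference + edge-based dilation, in the spirit of SUED
# (parameters are illustrative, not the paper's).
import cv2

def detect_moving(prev_gray, curr_gray):
    diff = cv2.absdiff(curr_gray, prev_gray)                  # frame difference
    _, motion = cv2.threshold(diff, 25, 255, cv2.THRESH_BINARY)
    edges = cv2.Canny(curr_gray, 100, 200)                    # edge map
    kernel = cv2.getStructuringElement(cv2.MORPH_RECT, (5, 5))
    mask = cv2.dilate(cv2.bitwise_and(motion, edges), kernel)  # edge-based dilation
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    return [cv2.boundingRect(c) for c in contours if cv2.contourArea(c) > 50]

# prev_gray, curr_gray: consecutive aerial frames converted with
# cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
```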
Kim, James; Li, Li; Liu, Hui
2018-01-01
Background Vendors in the health care industry produce diagnostic systems that, through a secured connection, allow them to monitor performance almost in real time. However, challenges exist in analyzing and interpreting large volumes of noisy quality control (QC) data. As a result, some QC shifts may not be detected early enough by the vendor, but lead a customer to complain. Objective The aim of this study was to hypothesize that a more proactive response could be designed by utilizing the collected QC data more efficiently. Our aim is therefore to help prevent customer complaints by predicting them based on the QC data collected by in vitro diagnostic systems. Methods QC data from five select in vitro diagnostic assays were combined with the corresponding database of customer complaints over a period of 90 days. A subset of these data over the last 45 days was also analyzed to assess how the length of the training period affects predictions. We defined a set of features used to train two classifiers, one based on decision trees and the other based on adaptive boosting, and assessed model performance by cross-validation. Results The cross-validations showed classification error rates close to zero for some assays with adaptive boosting when predicting the potential cause of customer complaints. Performance was improved by shortening the training period when the volume of complaints increased. Denoising filters that reduced the number of categories to predict further improved performance, as their application simplified the prediction problem. Conclusions This novel approach to predicting customer complaints based on QC data may allow the diagnostic industry, the expected end user of our approach, to proactively identify potential product quality issues and fix these before receiving customer complaints. This represents a new step in the direction of using big data toward product quality improvement. PMID:29764796
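The classifier comparison translates directly to scikit-learn, as the hedged sketch below shows on synthetic data; the feature columns are invented placeholders for the QC-derived features, and the denoising and 45/90-day windowing described above are omitted.

```python
# Decision tree vs. AdaBoost for complaint prediction, evaluated by
# cross-validation on synthetic stand-in data (not the study's dataset).
import numpy as np
from sklearn.ensemble import AdaBoostClassifier
from sklearn.tree import DecisionTreeClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(1)
X = rng.normal(size=(500, 6))    # e.g., QC shift, drift, %CV per window
y = (X[:, 0] + 0.5 * X[:, 2] + rng.normal(scale=0.5, size=500) > 1).astype(int)

for name, clf in [("tree", DecisionTreeClassifier(max_depth=4)),
                  ("adaboost", AdaBoostClassifier(n_estimators=200))]:
    scores = cross_val_score(clf, X, y, cv=5)
    print(f"{name}: mean accuracy {scores.mean():.3f}")
```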
AN ECOEPIDEMIOLOGICAL APPROACH FOR DEVELOPING WATER QUALITY CRITERIA
The USEPA's Draft Framework for Developing Suspended and Bedded Sediments Water Quality Criteria is based on an ecoepidemiological approach that is potentially applicable to any chemical or non-chemical agent. An ecoepidemiological approach infers associations from the co-occurre...
Screen Space Ambient Occlusion Based Multiple Importance Sampling for Real-Time Rendering
NASA Astrophysics Data System (ADS)
Zerari, Abd El Mouméne; Babahenini, Mohamed Chaouki
2018-03-01
We propose a new approximation technique for accelerating the Global Illumination algorithm for real-time rendering. The proposed approach is based on the Screen-Space Ambient Occlusion (SSAO) method, which approximates the global illumination for large, fully dynamic scenes at interactive frame rates. Current algorithms that are based on the SSAO method suffer from difficulties due to the large number of samples that are required. In this paper, we propose an improvement to the SSAO technique by integrating it with a Multiple Importance Sampling technique that combines a stratified sampling method with an importance sampling method, with the objective of reducing the number of samples. Experimental evaluation demonstrates that our technique can produce high-quality images in real time and is significantly faster than traditional techniques.
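The combination rule underlying multiple importance sampling is the balance heuristic, w_i(x) = n_i p_i(x) / sum_k n_k p_k(x); the sketch below evaluates it for two stand-in sampling densities and is illustrative only, not the paper's SSAO integration.

```python
# Balance-heuristic weights from multiple importance sampling; the two
# densities stand in for a stratified (near-uniform) strategy and an
# importance (cosine-lobe) strategy over [0, 1].
def balance_weight(i, x, pdfs, counts):
    num = counts[i] * pdfs[i](x)
    den = sum(n * p(x) for n, p in zip(counts, pdfs))
    return num / den if den > 0 else 0.0

p_uniform = lambda x: 1.0
p_cosine = lambda x: 2.0 * x          # integrates to 1 on [0, 1]
print(balance_weight(1, 0.8, [p_uniform, p_cosine], counts=[16, 16]))
```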