Measuring Software Product Quality: The ISO 25000 Series and CMMI
2004-06-14
performance objectives” covers objectives and requirements for product quality, service quality, and process performance. Process performance objectives...such that product quality, service quality, and process performance attributes are measurable and controlled throughout the project (internal and
Redefining and expanding quality assurance.
Robins, J L
1992-12-01
To meet the current standards of excellence necessary for blood establishments, we have learned from industry that a movement toward organization-wide quality assurance/total quality management must be made. Everyone in the organization must accept responsibility for participating in providing the highest quality products and services. Quality must be built into processes and design systems to support these quality processes. Quality assurance has been redefined to include a quality planning function described as the most effective way of designing quality into processes. A formalized quality planning process must be part of quality assurance. Continuous quality improvement has been identified as the strategy every blood establishment must support while striving for error-free processing as the long-term objective. The auditing process has been realigned to support and facilitate this same objective. Implementing organization-wide quality assurance/total quality management is one proven plan for guaranteeing the quality of the 20 million products that are transfused into 4 million patients each year and for moving toward the new order.
DATA QUALITY OBJECTIVES-FOUNDATION OF A SUCCESSFUL MONITORING PROGRAM
The data quality objectives (DQO) process is a fundamental site characterization tool and the foundation of a successful monitoring program. The DQO process is a systematic planning approach based on the scientific method of inquiry. The process identifies the goals of data col...
Campos Andrade, Cláudia; Lima, Maria Luísa; Pereira, Cícero Roberto; Fornara, Ferdinando; Bonaiuto, Marino
2013-05-01
This study analyses the processes through which the physical environment of health care settings impacts on patients' well-being. Specifically, we investigate the mediating role of perceptions of the physical and social environments, and if this process is moderated by patients' status, that is, if the objective physical environment impacts inpatients' and outpatients' satisfaction by different social-psychological processes. Patients (N=206) evaluated the physical and social environments of the care unit where they were receiving treatment, and its objective physical conditions were independently evaluated by two architects. Results showed that the objective environmental quality affects satisfaction through perceptions of environmental quality, and that patients' status moderates this relationship. For inpatients, it is the perception of quality of the social environment that mediates the relationship between objective environmental quality and satisfaction, whereas for outpatients it is the perception of quality of the physical environment. This moderated mediation is discussed in terms of differences on patients' experiences of health care environments. Copyright © 2013 Elsevier Ltd. All rights reserved.
Objective assessment of MPEG-2 video quality
NASA Astrophysics Data System (ADS)
Gastaldo, Paolo; Zunino, Rodolfo; Rovetta, Stefano
2002-07-01
The increasing use of video compression standards in broadcasting television systems has required, in recent years, the development of video quality measurements that take into account artifacts specifically caused by digital compression techniques. In this paper we present a methodology for the objective quality assessment of MPEG video streams by using circular back-propagation feedforward neural networks. Mapping neural networks can render nonlinear relationships between objective features and subjective judgments, thus avoiding any simplifying assumption on the complexity of the model. The neural network processes an instantaneous set of input values and yields an associated estimate of perceived quality. Therefore, the neural-network approach turns objective quality assessment into adaptive modeling of subjective perception. The objective features used for the estimate are chosen according to their assessed relevance to perceived quality and are continuously extracted in real time from compressed video streams. The overall system mimics perception but does not require any analytical model of the underlying physical phenomenon. The capability to process compressed video streams represents an important advantage over existing approaches, since avoiding the stream-decoding process greatly enhances real-time performance. Experimental results confirm that the system provides satisfactory, continuous-time approximations of actual scoring curves for real test videos.
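The mapping the abstract describes — objective stream features in, perceived-quality estimate out — can be sketched with a toy feedforward network. This is an illustration only: the feature values, network size, and weights below are placeholders, not the authors' circular back-propagation architecture or their trained model.

```python
import math
import random

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

class QualityEstimator:
    """Toy one-hidden-layer feedforward network mapping objective stream
    features (e.g. bitrate, blockiness, motion activity) to a perceived-
    quality estimate in (0, 1). Weights are seeded random placeholders;
    in the paper's setting they would be trained on subjective scores."""

    def __init__(self, n_in, n_hidden, seed=0):
        rng = random.Random(seed)
        self.w1 = [[rng.uniform(-1, 1) for _ in range(n_in)]
                   for _ in range(n_hidden)]
        self.w2 = [rng.uniform(-1, 1) for _ in range(n_hidden)]

    def predict(self, features):
        # One forward pass: hidden layer, then scalar output, both sigmoid.
        hidden = [sigmoid(sum(w * f for w, f in zip(row, features)))
                  for row in self.w1]
        return sigmoid(sum(w * h for w, h in zip(self.w2, hidden)))

est = QualityEstimator(n_in=3, n_hidden=4)
score = est.predict([0.8, 0.2, 0.5])  # hypothetical normalized features
```

A real system would extract the input features continuously from the compressed stream and train the weights against subjective test scores.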
A Quality Sorting of Fruit Using a New Automatic Image Processing Method
NASA Astrophysics Data System (ADS)
Amenomori, Michihiro; Yokomizu, Nobuyuki
This paper presents an innovative approach for quality sorting of objects, such as apples in an agricultural factory, using an image processing algorithm. The objectives of our approach are, first, to sort the objects by their colors precisely and, second, to detect any irregularity of the colors on the surface of the apples efficiently. An experiment was conducted and its results were compared with those obtained by a human sorting process and by color-sensor sorting devices. The results demonstrate that our approach can sort the objects rapidly, with a valid classification rate of 100 %.
DATA QUALITY OBJECTIVES IN RESEARCH PLANNING AT MED: THREE CASE STUDIES
This course will give a quality assurance perspective to research planning by describing the Data Quality Objective Process....Written plans are mandatory for all EPA environmental data collection activities according to EPA Order 5360.1 CHG 1 and Federal Acquisition Regulations,...
Li, Mingjie; Zhou, Ping; Wang, Hong; ...
2017-09-19
As one of the most important units in the papermaking industry, the high consistency (HC) refining system is confronted with challenges such as improving pulp quality, saving energy, and reducing emissions in its operation processes. In this correspondence, an optimal operation of the HC refining system is presented using nonlinear multiobjective model predictive control strategies that aim at a set-point tracking objective for pulp quality, an economic objective, and a specific energy (SE) consumption objective, respectively. First, a set of input and output data at different times is employed to construct the subprocess models of the state process model for the HC refining system; the Wiener-type model is then obtained by combining the mechanism model of Canadian Standard Freeness with the state process model, whose structures are determined by the Akaike information criterion. Second, a multiobjective optimization strategy is proposed that optimizes the set-point tracking objective for pulp quality and SE consumption simultaneously, using the NSGA-II approach to obtain the Pareto optimal set. Furthermore, targeting the set-point tracking objective for pulp quality, the economic objective, and the SE consumption objective, the sequential quadratic programming method is utilized to produce the optimal predictive controllers. The simulation results demonstrate that the proposed methods give the HC refining system better set-point tracking performance for pulp quality when these predictive controllers are employed. In addition, when the optimal predictive controllers are oriented toward the comprehensive economic objective and the SE consumption objective, they significantly reduce energy consumption.
DOE Office of Scientific and Technical Information (OSTI.GOV)
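The NSGA-II step in the abstract above rests on Pareto dominance: a candidate controller setting survives only if no other candidate is at least as good in every objective and strictly better in one. A minimal sketch, with hypothetical (tracking-error, energy) objective values, both to be minimized:

```python
def dominates(a, b):
    """True if objective vector a dominates b (all objectives minimized):
    a is no worse in every objective and strictly better in at least one."""
    return (all(x <= y for x, y in zip(a, b))
            and any(x < y for x, y in zip(a, b)))

def pareto_front(points):
    """Return the non-dominated subset of a list of objective vectors."""
    return [p for p in points
            if not any(dominates(q, p) for q in points if q != p)]

# Hypothetical (tracking error, energy use) pairs for candidate settings.
candidates = [(0.10, 5.0), (0.20, 3.0), (0.15, 6.0), (0.05, 9.0)]
front = pareto_front(candidates)  # (0.15, 6.0) is dominated by (0.10, 5.0)
```

NSGA-II adds non-dominated sorting into ranked fronts plus crowding-distance selection on top of this dominance test; the sketch shows only the core relation.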
Tank 241-AY-101 Privatization Push Mode Core Sampling and Analysis Plan
DOE Office of Scientific and Technical Information (OSTI.GOV)
TEMPLETON, A.M.
2000-01-12
This sampling and analysis plan (SAP) identifies characterization objectives pertaining to sample collection, laboratory analytical evaluation, and reporting requirements for samples obtained from tank 241-AY-101. The purpose of this sampling event is to obtain information about the characteristics of the contents of 241-AY-101 required to satisfy Data Quality Objectives For RPP Privatization Phase I: Confirm Tank T Is An Appropriate Feed Source For High-Level Waste Feed Batch X (HLW DQO) (Nguyen 1999a), Data Quality Objectives For TWRS Privatization Phase I: Confirm Tank T Is An Appropriate Feed Source For Low-Activity Waste Feed Batch X (LAW DQO) (Nguyen 1999b), Low Activity Waste and High-Level Waste Feed Data Quality Objectives (L and H DQO) (Patello et al. 1999), and Characterization Data Needs for Development, Design, and Operation of Retrieval Equipment Developed through the Data Quality Objective Process (Equipment DQO) (Bloom 1996). Special instructions regarding support to the LAW and HLW DQOs are provided by Baldwin (1999). Push mode core samples will be obtained from risers 15G and 150 to provide sufficient material for the chemical analyses and tests required to satisfy these data quality objectives. The 222-S Laboratory will extrude core samples; composite the liquids and solids; perform chemical analyses on composite and segment samples; archive half-segment samples; and provide subsamples to the Process Chemistry Laboratory. The Process Chemistry Laboratory will prepare test plans and perform process tests to evaluate the behavior of the 241-AY-101 waste undergoing the retrieval and treatment scenarios defined in the applicable DQOs. Requirements for analyses of samples originating in the process tests will be documented in the corresponding test plans and are not within the scope of this SAP.
Tank 241-AY-101 Privatization Push Mode Core Sampling and Analysis Plan
DOE Office of Scientific and Technical Information (OSTI.GOV)
TEMPLETON, A.M.
2000-05-19
This sampling and analysis plan (SAP) identifies characterization objectives pertaining to sample collection, laboratory analytical evaluation, and reporting requirements for samples obtained from tank 241-AY-101. The purpose of this sampling event is to obtain information about the characteristics of the contents of 241-AY-101 required to satisfy "Data Quality Objectives For RPP Privatization Phase I: Confirm Tank T Is An Appropriate Feed Source For High-Level Waste Feed Batch X (HLW DQO)" (Nguyen 1999a), "Data Quality Objectives For TWRS Privatization Phase I: Confirm Tank T Is An Appropriate Feed Source For Low-Activity Waste Feed Batch X (LAW DQO)" (Nguyen 1999b), "Low Activity Waste and High-Level Waste Feed Data Quality Objectives (L&H DQO)" (Patello et al. 1999), and "Characterization Data Needs for Development, Design, and Operation of Retrieval Equipment Developed through the Data Quality Objective Process (Equipment DQO)" (Bloom 1996). Special instructions regarding support to the LAW and HLW DQOs are provided by Baldwin (1999). Push mode core samples will be obtained from risers 15G and 150 to provide sufficient material for the chemical analyses and tests required to satisfy these data quality objectives. The 222-S Laboratory will extrude core samples; composite the liquids and solids; perform chemical analyses on composite and segment samples; archive half-segment samples; and provide sub-samples to the Process Chemistry Laboratory. The Process Chemistry Laboratory will prepare test plans and perform process tests to evaluate the behavior of the 241-AY-101 waste undergoing the retrieval and treatment scenarios defined in the applicable DQOs. Requirements for analyses of samples originating in the process tests will be documented in the corresponding test plans and are not within the scope of this SAP.
Standardizing Quality Assessment of Fused Remotely Sensed Images
NASA Astrophysics Data System (ADS)
Pohl, C.; Moellmann, J.; Fries, K.
2017-09-01
The multitude of available operational remote sensing satellites has led to the development of many image fusion techniques to provide high spatial, spectral and temporal resolution images. The comparison of different techniques is necessary to obtain an optimized image for the different applications of remote sensing. There are two approaches to assessing image quality: 1. qualitatively, by visual interpretation, and 2. quantitatively, using image quality indices. However, an objective comparison is difficult because a visual assessment is always subjective and a quantitative assessment depends on the chosen criteria and indices; the result varies accordingly. Therefore it is necessary to standardize both processes (qualitative and quantitative assessment) in order to allow an objective evaluation of image fusion quality. Various studies have been conducted at the University of Osnabrueck (UOS) to establish a standardized process to objectively compare fused image quality. First, established image fusion quality assessment protocols, i.e. Quality with No Reference (QNR) and Khan's protocol, were compared on various fusion experiments. Second, the process of visual quality assessment was structured and standardized with the aim of providing an evaluation protocol. This manuscript reports on the results of the comparison and provides recommendations for future research.
A multiple objective optimization approach to quality control
NASA Technical Reports Server (NTRS)
Seaman, Christopher Michael
1991-01-01
The use of product quality as the performance criterion for manufacturing system control is explored. The goal in manufacturing, for economic reasons, is to optimize product quality. The problem is that since quality is a rather nebulous product characteristic, there is seldom an analytic function that can be used as a measure. Therefore standard control approaches, such as optimal control, cannot readily be applied. A second problem with optimizing product quality is that it is typically measured along many dimensions: there are many aspects of quality which must be optimized simultaneously. Very often these different aspects are incommensurate and competing. The concept of optimality must now include accepting tradeoffs among the different quality characteristics. These problems are addressed using multiple objective optimization. It is shown that the quality control problem can be defined as a multiple objective optimization problem, and a controller structure is defined on this basis. Then, an algorithm is presented which can be used by an operator to interactively find the best operating point. Essentially, the algorithm uses process data to provide the operator with two pieces of information: (1) if it is possible to simultaneously improve all quality criteria, determine what changes to the process inputs or controller parameters should be made to do this; and (2) if it is not possible to improve all criteria, and the current operating point is not a desirable one, select a criterion on which a tradeoff should be made, and make input changes to improve all other criteria. If no tradeoff has to be made to move to a new operating point, the process is not operating at an optimal point in any sense. This algorithm ensures that operating points are optimal in some sense and provides the operator with information about tradeoffs when seeking the best operating point.
The multiobjective algorithm was implemented in two different injection molding scenarios: tuning of process controllers to meet specified performance objectives and tuning of process inputs to meet specified quality objectives. Five case studies are presented.
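At each step, the operator algorithm described above reduces to a simple question: does any candidate input change improve every quality criterion at once, or is a tradeoff unavoidable? A minimal sketch, with hypothetical move names and predicted effects (not the thesis's actual algorithm):

```python
def find_joint_improvement(candidate_moves):
    """candidate_moves maps a move name to predicted changes in each
    quality criterion (positive = improvement). Returns a move that
    improves every criterion simultaneously, or None if every move
    involves a tradeoff -- the case the operator resolves by choosing
    which criterion to sacrifice."""
    for move, deltas in candidate_moves.items():
        if all(d > 0 for d in deltas):
            return move
    return None

# Hypothetical predicted effects on (part strength, surface finish).
moves = {
    "raise_temp":     (0.4, -0.1),
    "slow_feed":      (0.2, 0.3),
    "raise_pressure": (-0.3, 0.5),
}
best = find_joint_improvement(moves)  # "slow_feed" improves both criteria
```

When this returns None, the current point is Pareto optimal with respect to the candidates, and further movement requires the operator's tradeoff decision.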
Data Quality Objectives for Regulatory Requirements for Dangerous Waste Sampling and Analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
MULKEY, C.H.
1999-07-02
This document describes sampling and analytical requirements needed to meet state and federal regulations for dangerous waste (DW). The River Protection Project (RPP) is assigned to the task of storage and interim treatment of hazardous waste. Any final treatment or disposal operations, as well as requirements under the land disposal restrictions (LDRs), fall in the jurisdiction of another Hanford organization and are not part of this scope. The requirements for this Data Quality Objective (DQO) Process were developed using the RPP Data Quality Objective Procedure (Banning 1996), which is based on the U.S. Environmental Protection Agency's (EPA) Guidance for the Data Quality Objectives Process (EPA 1994). Hereafter, this document is referred to as the DW DQO. Federal and state laws and regulations pertaining to waste contain requirements that are dependent upon the composition of the waste stream. These regulatory drivers require that pertinent information be obtained. For many requirements, documented process knowledge of a waste composition can be used instead of analytical data to characterize or designate a waste. When process knowledge alone is used to characterize a waste, it is a best management practice to validate the information with analytical measurements.
Image processing system performance prediction and product quality evaluation
NASA Technical Reports Server (NTRS)
Stein, E. K.; Hammill, H. B. (Principal Investigator)
1976-01-01
The author has identified the following significant results. A new technique for image processing system performance prediction and product quality evaluation was developed. It was entirely objective, quantitative, and general, and should prove useful in system design and quality control. The technique and its application to determination of quality control procedures for the Earth Resources Technology Satellite NASA Data Processing Facility are described.
Twofold processing for denoising ultrasound medical images.
Kishore, P V V; Kumar, K V V; Kumar, D Anil; Prasad, M V D; Goutham, E N D; Rahul, R; Krishna, C B S Vamsi; Sandeep, Y
2015-01-01
Ultrasound (US) medical imaging non-invasively pictures the inside of the human body for disease diagnostics. Speckle noise attacks ultrasound images, degrading their visual quality. A twofold processing algorithm is proposed in this work to reduce this multiplicative speckle noise. The first fold applies block-based thresholding, both hard (BHT) and soft (BST), on pixels in the wavelet domain with 8, 16, 32 and 64 non-overlapping block sizes. This first fold is effective at reducing speckle but also blurs the object of interest. The second fold restores object boundaries and texture with adaptive wavelet fusion. The degraded object in the block-thresholded US image is restored through wavelet coefficient fusion of the object in the original US image and the block-thresholded US image. Fusion rules and wavelet decomposition levels are made adaptive for each block using gradient histograms with the normalized differential mean (NDF), to introduce the highest level of contrast between the denoised pixels and the object pixels in the resultant image. The proposed twofold methods are thus named adaptive NDF block fusion with hard and soft thresholding (ANBF-HT and ANBF-ST). The results indicate visual quality improvement to an interesting level with the proposed twofold processing, where the first fold removes noise and the second fold restores object properties. Peak signal to noise ratio (PSNR), normalized cross correlation coefficient (NCC), edge strength (ES), image quality index (IQI) and structural similarity index (SSIM) measure the quantitative quality of the twofold processing technique. Validation of the proposed method is done by comparison with anisotropic diffusion (AD), total variational filtering (TVF) and empirical mode decomposition (EMD) for enhancement of US images. The US images are provided by AMMA hospital radiology labs at Vijayawada, India.
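The hard and soft thresholding operations underlying BHT and BST can be sketched directly. The coefficients and threshold below are illustrative; a real implementation would operate on an actual wavelet decomposition of each image block (e.g. via a wavelet library) rather than a plain list:

```python
def hard_threshold(coeffs, t):
    """Hard thresholding: zero out coefficients with magnitude below t,
    keep the rest unchanged."""
    return [c if abs(c) >= t else 0.0 for c in coeffs]

def soft_threshold(coeffs, t):
    """Soft thresholding: zero out small coefficients and shrink the
    surviving ones toward zero by t (reduces noise but also contrast,
    hence the blurring the paper's second fold must undo)."""
    return [(abs(c) - t) * (1 if c > 0 else -1) if abs(c) >= t else 0.0
            for c in coeffs]

block = [2.5, -0.3, 0.8, -1.75, 0.1]  # one block of detail coefficients
hard = hard_threshold(block, t=1.0)   # [2.5, 0.0, 0.0, -1.75, 0.0]
soft = soft_threshold(block, t=1.0)   # [1.5, 0.0, 0.0, -0.75, 0.0]
```

The paper's contribution is in choosing these thresholds per block and fusing the result back with the original image; the operations themselves are the standard wavelet shrinkage pair.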
NASA Astrophysics Data System (ADS)
Dostal, P.; Krasula, L.; Klima, M.
2012-06-01
Various image processing techniques in multimedia technology are optimized using the visual attention feature of the human visual system. Because of spatial non-uniformity, different locations in an image are of different importance in terms of perception of the image. In other words, the perceived image quality depends mainly on the quality of important locations known as regions of interest (ROI). The performance of such techniques is measured by subjective evaluation or objective image quality criteria. Many state-of-the-art objective metrics are based on HVS properties: SSIM and MS-SSIM are based on image structural information, VIF on the information that the human brain can ideally gain from the reference image, and FSIM utilizes low-level features to assign a different importance to each location in the image. But still none of these objective metrics utilizes the analysis of regions of interest. We address the question of whether these objective metrics can be used for effective evaluation of images reconstructed by processing techniques based on ROI analysis utilizing high-level features. In this paper the authors show that the state-of-the-art objective metrics do not correlate well with subjective evaluation when demosaicing based on ROI analysis is used for reconstruction. The ROI were computed from "ground truth" visual attention data. An algorithm combining two known demosaicing techniques on the basis of ROI location is proposed, reconstructing the ROI in fine quality while the rest of the image is reconstructed with low quality. The color image reconstructed by this ROI approach was compared with selected demosaicing techniques by objective criteria and subjective testing. The qualitative comparison of the objective and subjective results indicates that the state-of-the-art objective metrics are still not suitable for evaluating image processing techniques based on ROI analysis, and new criteria are needed.
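A minimal sketch of the kind of ROI-aware criterion the paper argues is missing: weight the error inside the region of interest more heavily than the background. The mask, the 0.8 weight, and the use of plain MSE are illustrative assumptions, not a metric proposed by the paper:

```python
def mse(a, b):
    """Mean squared error between two equal-length pixel sequences."""
    return sum((x - y) ** 2 for x, y in zip(a, b)) / len(a)

def roi_weighted_mse(ref, test, roi_mask, w_roi=0.8):
    """Split pixel-wise error into ROI and background parts and combine
    them with a weight privileging the region of interest. w_roi=0.8 is
    an arbitrary illustrative choice, not a calibrated value."""
    roi_r, roi_t, bg_r, bg_t = [], [], [], []
    for r, t, m in zip(ref, test, roi_mask):
        (roi_r if m else bg_r).append(r)
        (roi_t if m else bg_t).append(t)
    return w_roi * mse(roi_r, roi_t) + (1 - w_roi) * mse(bg_r, bg_t)

ref  = [10, 10, 10, 10]
test = [10,  9, 10,  6]   # small error inside ROI, large error outside
mask = [1, 1, 0, 0]       # first two pixels form the ROI
score = roi_weighted_mse(ref, test, mask)
```

Under this weighting, heavy degradation confined to the background moves the score far less than the same degradation inside the ROI, which is exactly the behavior global metrics like SSIM or PSNR lack.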
NASA Astrophysics Data System (ADS)
Qi, Li; Wang, Shun; Zhang, Yixin; Sun, Yingying; Zhang, Xuping
2015-11-01
The quality inspection process is usually carried out after the first processing of the raw materials, such as cutting and milling, because the parts of the materials to be used cannot be identified until they have been trimmed. If the quality of the material is assessed before the laser process, the energy and effort wasted on defective materials can be saved. We propose a new production scheme that achieves quantitative quality inspection prior to primitive laser cutting by means of three-dimensional (3-D) vision measurement. First, the 3-D model of the object is reconstructed by the stereo cameras, from which the spatial cutting path is derived. Second, collaborating with another rear camera, the 3-D cutting path is reprojected to both the frontal and rear views of the object, generating the regions of interest (ROIs) for surface defect analysis. Accurate visually guided laser processing and reprojection-based ROI segmentation are enabled by a global-optimization-based trinocular calibration method. The prototype system was built and tested with the processing of raw duck feathers for high-quality badminton shuttle manufacture. Incorporating a two-dimensional wavelet-decomposition-based defect analysis algorithm, both the geometrical and appearance features of the raw feathers are quantified before they are cut into small patches, resulting in fully automatic feather cutting and sorting.
Perceptual video quality assessment in H.264 video coding standard using objective modeling.
Karthikeyan, Ramasamy; Sainarayanan, Gopalakrishnan; Deepa, Subramaniam Nachimuthu
2014-01-01
Since usage of digital video is widespread nowadays, quality considerations have become essential, and industry demand for video quality measurement is rising. This proposal provides a method of perceptual quality assessment for the H.264 standard encoder using objective modeling. For this purpose, quality impairments are calculated and a model is developed to compute the perceptual video quality metric based on a no-reference method. Because of the subtle differences between the original video and the encoded video, the quality of the encoded picture is degraded; this quality difference is introduced by encoding processes such as intra and inter prediction. The proposed model takes into account the artifacts introduced by these spatial and temporal activities in hybrid block-based coding methods, and an objective modeling of these artifacts into a subjective quality estimate is proposed. The proposed model calculates the objective quality metric using subjective impairments (blockiness, blur and jerkiness) rather than the bitrate-only calculation defined in the ITU-T G.1070 model. The accuracy of the proposed perceptual video quality metrics is compared against popular full-reference objective methods as defined by VQEG.
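The combination step — folding the no-reference impairment measures into one quality score — can be sketched as a weighted sum. The weights below are illustrative placeholders, not the model's fitted coefficients:

```python
def perceptual_quality(blockiness, blur, jerkiness,
                       weights=(0.4, 0.35, 0.25)):
    """Fold three no-reference impairment measures (each normalized to
    [0, 1], higher = worse) into a single quality score in [0, 1],
    higher = better. The weights are hypothetical, chosen only to
    illustrate the structure of such a model."""
    wb, wr, wj = weights
    impairment = wb * blockiness + wr * blur + wj * jerkiness
    return max(0.0, 1.0 - impairment)

# A lightly impaired stream scores high; a heavily blocky one scores low.
q_good = perceptual_quality(blockiness=0.2, blur=0.1, jerkiness=0.05)
q_bad  = perceptual_quality(blockiness=0.9, blur=0.6, jerkiness=0.4)
```

In practice the weights would be regressed against subjective test scores, which is also how such a model is validated against VQEG full-reference methods.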
The purpose of this report is to describe the outputs of the Data Quality Objectives (DQOs) Process and discussions about developing a statistical design that will be used to implement the research study of recreational beach waters.
Itzchakov, Guy; Kluger, Avraham N; Castro, Dotan R
2017-01-01
We examined how listeners characterized by empathy and a non-judgmental approach affect speakers' attitude structure. We hypothesized that high quality listening decreases speakers' social anxiety, which in turn reduces defensive processing. This reduction in defensive processing was hypothesized to result in an awareness of contradictions (increased objective-attitude ambivalence), and decreased attitude extremity. Moreover, we hypothesized that experiencing high quality listening would enable speakers to tolerate contradictory responses, such that listening would attenuate the association between objective- and subjective-attitude ambivalence. We obtained consistent support for our hypotheses across four laboratory experiments that manipulated listening experience in different ways on a range of attitude topics. The effects of listening on objective-attitude ambivalence were stronger for higher dispositional social anxiety and initial objective-attitude ambivalence (Study 4). Overall, the results suggest that speakers' attitude structure can be changed by a heretofore unexplored interpersonal variable: merely providing high quality listening.
NASA Astrophysics Data System (ADS)
Kuo, Chung-Feng Jeffrey; Quang Vu, Huy; Gunawan, Dewantoro; Lan, Wei-Luen
2012-09-01
The laser scribing process has been considered an effective approach for surface texturization of thin film solar cells. In this study, a systematic method for optimizing the multi-objective process parameters of a fiber laser system was proposed to achieve excellent quality characteristics, such as the minimum scribing line width, the flattest trough bottom, and the fewest processing edge surface bumps, in order to increase the incident light absorption of the thin film solar cell. First, the Taguchi method (TM) obtained useful statistical information through an orthogonal array with relatively few experiments. However, TM is only appropriate for optimizing single-objective problems and has to rely on engineering judgment for solving multi-objective problems, which can introduce some degree of uncertainty. The back-propagation neural network (BPNN) and data envelopment analysis (DEA) were therefore utilized to estimate the incomplete data and derive the optimal process parameters of the laser scribing system. In addition, the analysis of variance (ANOVA) method was applied to identify the significant factors which have the greatest effects on the quality of the scribing process; in other words, by putting more emphasis on these controllable and influential factors, the quality characteristics of the scribed thin film can be effectively enhanced. The experiments were carried out on ZnO:Al (AZO) transparent conductive thin film with a thickness of 500 nm, and the results showed that the proposed approach yields better improvements than the TM, which can only improve one quality characteristic while sacrificing the others. The results of confirmation experiments showed the reliability of the proposed method.
Data quality objectives for the initial fuel conditioning examinations
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lawrence, L.A.
The Data Quality Objectives (DQOs) were established for the examination of the first group of fuel samples shipped from the K West Basin to the Hanford 327 Building hot cells, to evaluate the proposed Path Forward conditioning process. Controlled temperature and atmosphere furnace testing will establish performance parameters using the conditioning process (drying, sludge drying, hydride decomposition passivation) proposed by the Independent Technical Assessment (ITA) Team as the baseline.
Federal Register 2010, 2011, 2012, 2013, 2014
2013-07-09
... the efficiency and effectiveness of FHA's quality assurance process (QAP). The objective of FHA's QAP... control plan (QCP). A copy of the plan must be submitted by the lender when applying for FHA lender... processes: post-endorsement technical reviews, Quality Assurance Division reviews and targeted lender...
Information processing during NREM sleep and sleep quality in insomnia.
Ceklic, Tijana; Bastien, Célyne H
2015-12-01
Insomnia sufferers (INS) are cortically hyperaroused during sleep, which seems to translate into altered information processing during the night. While information processing during wake, as measured by event-related potentials (ERPs), appears to be associated with the sleep quality of the preceding night, the existence of such an association during the night has never been investigated. This study aims to investigate nighttime information processing among good sleepers (GS) and INS while considering concomitant sleep quality. Following a multistep clinical evaluation, INS and GS participants underwent 4 consecutive nights of PSG recordings in the sleep laboratory. Thirty-nine GS (mean age 34.56±9.02) and twenty-nine INS (mean age 43.03±9.12) were included in the study. ERPs (N1, P2, N350) were recorded all night on Night 4 (oddball paradigm) during NREM sleep. Regardless of sleep quality, INS presented a larger N350 amplitude during SWS (p=0.042) while GS showed a larger N350 amplitude during late-night stage 2 sleep (p=0.004). Regardless of diagnosis, those who slept objectively well showed a smaller N350 amplitude (p=0.020) while those who slept subjectively well showed smaller P2 (p<0.001) and N350 amplitudes (p=0.006). Also, those who reported an objectively bad night as good showed smaller P2 (p<0.001) and N350 (p=0.010) amplitudes. Information processing seems to be associated with concomitant subjective and objective sleep quality for both GS and INS. However, INS show an alteration in information processing during sleep, especially for inhibition processes, regardless of their sleep quality. Copyright © 2015 Elsevier B.V. All rights reserved.
Double shell tanks (DST) chemistry control data quality objectives
DOE Office of Scientific and Technical Information (OSTI.GOV)
BANNING, D.L.
2001-10-09
One of the main functions of the River Protection Project is to store the Hanford Site tank waste until the Waste Treatment Plant (WTP) is ready to receive and process the waste. Waste from the older single-shell tanks is being transferred to the newer double-shell tanks (DSTs). Therefore, the integrity of the DSTs must be maintained until the waste from all tanks has been retrieved and transferred to the WTP. To help maintain the integrity of the DSTs over the life of the project, specific chemistry limits have been established to control corrosion of the DSTs. These waste chemistry limits are presented in the Technical Safety Requirements (TSR) document HNF-SD-WM-TSR-006, Sec. 5.15, Rev. 2B (CHG 2001). In order to control the chemistry in the DSTs, the Chemistry Control Program will require analyses of the tank waste. This document describes the Data Quality Objective (DQO) process undertaken to ensure appropriate data will be collected to control the waste chemistry in the DSTs. The DQO process was implemented in accordance with Data Quality Objectives for Sampling and Analyses, HNF-IP-0842, Rev. 1b, Vol. IV, Section 4.16 (Banning 2001), and the U.S. Environmental Protection Agency EPA QA/G4, Guidance for the Data Quality Objectives Process (EPA 1994), with some modifications to accommodate project- or tank-specific requirements and constraints.
ERIC Educational Resources Information Center
Shahzadi, Uzma; Shaheen, Gulnaz; Shah, Ashfaque Ahmed
2012-01-01
The study was intended to compare the quality of teaching learning process in the faculty of social science and science at University of Sargodha. This study was descriptive and quantitative in nature. The objectives of the study were to compare the quality of teaching learning process in the faculty of social science and science at University of…
Metrology: Measurement Assurance Program Guidelines
NASA Technical Reports Server (NTRS)
Eicke, W. G.; Riley, J. P.; Riley, K. J.
1995-01-01
The 5300.4 series of NASA Handbooks for Reliability and Quality Assurance Programs have provisions for the establishment and utilization of a documented metrology system to control measurement processes and to provide objective evidence of quality conformance. The intent of these provisions is to assure consistency and conformance to specifications and tolerances of equipment, systems, materials, and processes procured and/or used by NASA, its international partners, contractors, subcontractors, and suppliers. This Measurement Assurance Program (MAP) guideline has the specific objectives to: (1) ensure the quality of measurements made within NASA programs; (2) establish realistic measurement process uncertainties; (3) maintain continuous control over the measurement processes; and (4) ensure measurement compatibility among NASA facilities. The publication addresses MAP methods as applied within and among NASA installations and serves as a guide to: control measurement processes at the local level (one facility); conduct measurement assurance programs in which a number of field installations are joint participants; and conduct measurement integrity (round robin) experiments in which a number of field installations participate to assess the overall quality of particular measurement processes at a point in time.
DATA QUALITY OBJECTIVES FOR SELECTING WASTE SAMPLES FOR THE BENCH STEAM REFORMER TEST
DOE Office of Scientific and Technical Information (OSTI.GOV)
BANNING DL
2010-08-03
This document describes the data quality objectives to select archived samples located at the 222-S Laboratory for Fluid Bed Steam Reformer testing. The type, quantity, and quality of the data required to select the samples for Fluid Bed Steam Reformer testing are discussed. In order to maximize the efficiency and minimize the time to treat Hanford tank waste in the Waste Treatment and Immobilization Plant, additional treatment processes may be required. One of the potential treatment processes is the fluid bed steam reformer (FBSR). A determination of the adequacy of the FBSR process to treat Hanford tank waste is required. The initial step in determining the adequacy of the FBSR process is to select archived waste samples from the 222-S Laboratory that will be used to test the FBSR process. Analyses of the selected samples will be required to confirm the samples meet the testing criteria.
Development of a course review process.
Persky, Adam M; Joyner, Pamela U; Cox, Wendy C
2012-09-10
To describe and assess a course review process designed to enhance course quality. A course review process led by the curriculum and assessment committees was designed for all required courses in the doctor of pharmacy (PharmD) program at a school of pharmacy. A rubric was used by the review team to address 5 areas: course layout and integration, learning outcomes, assessment, resources and materials, and learner interaction. One hundred percent of targeted courses, or 97% of all required courses, were reviewed from January to August 2010 (n=30). Approximately 3.5 recommendations per course were made, resulting in improvement in course evaluation items related to learning outcomes. Ninety-five percent of reviewers and 85% of course directors agreed that the process was objective and the course review process was important. The course review process was objective and effective in improving course quality. Future work will explore the effectiveness of an integrated, continual course review process in improving the quality of pharmacy education.
Quality Assurance for Digital Learning Object Repositories: Issues for the Metadata Creation Process
ERIC Educational Resources Information Center
Currier, Sarah; Barton, Jane; O'Beirne, Ronan; Ryan, Ben
2004-01-01
Metadata enables users to find the resources they require; it is therefore an important component of any digital learning object repository. Much work has already been done within the learning technology community to assure metadata quality, focused on the development of metadata standards, specifications and vocabularies and their implementation…
Implementation of quality by design toward processing of food products.
Rathore, Anurag S; Kapoor, Gautam
2017-05-28
Quality by design (QbD) is a systematic approach that begins with predefined objectives and emphasizes product and process understanding and process control. It is an approach based on principles of sound science and quality risk management. As the food processing industry continues to embrace the idea of in-line, online, and/or at-line sensors and real-time characterization for process monitoring and control, the existing gaps with regard to our ability to monitor multiple parameters/variables associated with the manufacturing process will be alleviated over time. Investments made for development of tools and approaches that facilitate high-throughput analytical and process development, process analytical technology, design of experiments, risk analysis, knowledge management, and enhancement of process/product understanding would pave the way for operational and economic benefits later in the commercialization process and across other product pipelines. This article aims to achieve two major objectives: first, to review the progress that has been made in recent years on QbD implementation in processing of food products, and second, to present a case study that illustrates the benefits of such QbD implementation.
Aden, Bile; Allekotte, Silke; Mösges, Ralph
2016-12-01
For the long-term maintenance and improvement of quality within a clinical research institute, the implementation and certification of a quality management system is a suitable approach. Through a quality management system implemented according to the still-valid DIN EN ISO 9001:2008, the desired quality objectives are achieved effectively. The evaluation of quality scores and the appraisal of in-house quality indicators make an important contribution in this regard. To achieve this and to draw quality assurance conclusions, quality indicators that are as sensitive and meaningful as possible are developed. The basis for these indicators is formed by the institute's own key objectives, the retrospective evaluation of quality scores, prospective follow-up, and discussions. In the in-house clinical research institute, the measures introduced by the quality management system led to higher efficiency in work processes, improved staff skills, higher customer satisfaction, and overall to more successful outcomes in relation to the self-defined key objectives. Copyright © 2016. Published by Elsevier GmbH.
Real-time computer treatment of THz passive device images with the high image quality
NASA Astrophysics Data System (ADS)
Trofimov, Vyacheslav A.; Trofimov, Vladislav V.
2012-06-01
We demonstrate real-time computer code that significantly improves the quality of images captured by a passive THz imaging system. The code is not restricted to one passive THz device: it can be applied to any such device, and to active THz imaging systems as well. We applied the code to images captured by four passive THz imaging devices manufactured by different companies; images produced by different devices usually require different spatial filters. The current version of the code processes more than one image per second for THz images with more than 5000 pixels and 24-bit number representation. Processing a single THz image produces about 20 output images simultaneously, corresponding to the various spatial filters. The code allows the number of pixels in processed images to be increased without noticeable reduction of image quality, and its performance can be increased many times by using parallel processing algorithms. We developed original spatial filters that make it possible to see objects smaller than 2 cm in imagery produced by passive THz devices capturing objects hidden under opaque clothes. For images with high noise, we developed an approach that suppresses the noise during computer processing and yields a good-quality image. To illustrate the efficiency of the developed approach, we demonstrate the detection of a liquid explosive, an ordinary explosive, a knife, a pistol, a metal plate, a CD, ceramics, chocolate, and other objects hidden under opaque clothes. The results demonstrate the high efficiency of our approach for the detection of hidden objects and offer a very promising solution to this security problem.
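The spatial filtering described above can be illustrated with a minimal sketch. The abstract does not specify the filters used, so a 3x3 median filter, a standard choice for suppressing impulsive noise in imagery, stands in here under that assumption; the toy image is invented.

```python
# Median spatial filter: a common technique for suppressing noise in imagery
# such as THz frames. The 3x3 window and the toy "image" are illustrative;
# the paper's actual filters are not specified in the abstract.

def median_filter(img):
    """Apply a 3x3 median filter to a 2D grayscale image (list of lists).

    Border pixels are left unchanged for simplicity."""
    h, w = len(img), len(img[0])
    out = [row[:] for row in img]            # copy; borders stay as-is
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            window = sorted(img[y + dy][x + dx]
                            for dy in (-1, 0, 1) for dx in (-1, 0, 1))
            out[y][x] = window[4]            # median of the 9 values
    return out

# A flat scene with a single "hot" noise pixel:
noisy = [[10, 10, 10, 10],
         [10, 10, 99, 10],
         [10, 10, 10, 10],
         [10, 10, 10, 10]]
clean = median_filter(noisy)                 # the outlier is removed
```

A median filter preserves edges better than a mean filter, which is one reason it is popular for impulsive-noise suppression; in practice a library routine such as `scipy.ndimage.median_filter` would replace the hand-rolled loop.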
Process safety improvement--quality and target zero.
Van Scyoc, Karl
2008-11-15
Process safety practitioners have adopted quality management principles in the design of process safety management systems with positive effect, yet achieving safety objectives sometimes remains a distant target. Companies regularly apply tools and methods which have roots in quality and productivity improvement. The "plan, do, check, act" improvement loop, statistical analysis of incidents (non-conformities), and performance trending popularized by Dr. Deming are now commonly used in the context of process safety. Significant advancements in HSE performance are reported after applying methods viewed as fundamental for quality management. In pursuit of continual process safety improvement, the paper examines various quality improvement methods and explores how methods intended for product quality can additionally be applied to continual improvement of process safety. Methods such as Kaizen, Poka-yoke, and TRIZ, while long established for quality improvement, are quite unfamiliar in the process safety arena. These methods are discussed for application in improving both process safety leadership and field work team performance. Practical ways to advance process safety, based on the methods, are given.
NASA Astrophysics Data System (ADS)
Zschocke, Thomas; Beniest, Jan
The Consultative Group on International Agricultural Research (CGIAR) has established a digital repository to share its teaching and learning resources along with descriptive educational information based on the IEEE Learning Object Metadata (LOM) standard. As a key component of any digital repository, quality metadata are critical not only for enabling users to find the resources they require more easily, but also for the operation and interoperability of the repository itself. Studies show that repositories have difficulties in obtaining good-quality metadata from their contributors, especially when this process involves many different stakeholders, as is the case with the CGIAR as an international organization. To address this issue, the CGIAR began investigating the Open ECBCheck as well as the ISO/IEC 19796-1 standard to establish quality protocols for its training. The paper highlights the implications and challenges posed by strengthening the metadata creation workflow for disseminating learning objects of the CGIAR.
McFadden, Kathleen L; Stock, Gregory N; Gowen, Charles R
2014-10-01
Successful amelioration of medical errors represents a significant problem in the health care industry. There is a need for greater understanding of the factors that lead to improved process quality and patient safety outcomes in hospitals. We present a research model that shows how transformational leadership, safety climate, and continuous quality improvement (CQI) initiatives are related to objective quality and patient safety outcome measures. The proposed framework is tested using structural equation modeling, based on data collected for 204 hospitals, and supplemented with objective outcome data from the Centers for Medicare and Medicaid Services. The results provide empirical evidence that a safety climate, which is connected to the chief executive officer's transformational leadership style, is related to CQI initiatives, which are linked to improved process quality. A unique finding of this study is that, although CQI initiatives are positively associated with improved process quality, they are also associated with higher hospital-acquired condition rates, a measure of patient safety. Likewise, safety climate is directly related to improved patient safety outcomes. The notion that patient safety climate and CQI initiatives are not interchangeable or universally beneficial is an important contribution to the literature. The results confirm the importance of using CQI to effectively enhance process quality in hospitals, and patient safety climate to improve patient safety outcomes. The overall pattern of findings suggests that simultaneous implementation of CQI initiatives and patient safety climate produces greater combined benefits.
New opportunities for quality enhancing of images captured by passive THz camera
NASA Astrophysics Data System (ADS)
Trofimov, Vyacheslav A.; Trofimov, Vladislav V.
2014-10-01
As is well known, a passive THz camera allows concealed objects to be seen without contact with a person and poses no danger to the person. The efficiency of a passive THz camera depends on its temperature resolution, which determines the possibilities for detecting a concealed object: the minimal size of the object, the maximal detection distance, and the image quality. Computer processing of THz images can improve image quality many times over without any additional engineering effort, so developing modern computer code for application to THz images is a pressing problem. Using appropriate new methods, one may expect a temperature resolution that allows a banknote in a person's pocket to be seen without any physical contact. Modern algorithms for computer processing of THz images also make it possible to see an object inside the human body using its temperature trace on the human skin. This circumstance substantially enhances the opportunities for applying passive THz cameras to counterterrorism problems. We demonstrate the capabilities achieved to date for detecting both concealed objects and clothing components through computer processing of images captured by passive THz cameras manufactured by various companies. Another important result discussed in the paper is the observation both of THz radiation emitted by an incandescent lamp and of an image reflected from a ceramic floor plate. We consider images produced by passive THz cameras manufactured by Microsemi Corp., ThruVision Corp., and Capital Normal University (Beijing, China). All algorithms for computer processing of the THz images considered in this paper were developed by the Russian part of the author list. Keywords: THz wave, passive imaging camera, computer processing, security screening, concealed and forbidden objects, reflected image, hand seeing, banknote seeing, ceramic floor plate, incandescent lamp.
Multi-objective Optimization of Pulsed Gas Metal Arc Welding Process Using Neuro NSGA-II
NASA Astrophysics Data System (ADS)
Pal, Kamal; Pal, Surjya K.
2018-05-01
Weld quality is a critical issue in fabrication industries where products are custom-designed. Multi-objective optimization yields a number of solutions along the Pareto-optimal front. Mathematical regression model based optimization methods are often found to be inadequate for highly non-linear arc welding processes. Thus, various global evolutionary approaches, such as artificial neural networks and the genetic algorithm (GA), have been developed. The present work applies the elitist non-dominated sorting GA (NSGA-II) to the optimization of the pulsed gas metal arc welding process, using back-propagation neural network (BPNN) based weld quality feature models. The primary objective, to maintain butt joint weld quality, is the maximization of tensile strength with minimum plate distortion. The BPNN computes the fitness of each solution after adequate training, whereas the NSGA-II algorithm generates the optimum solutions for the two conflicting objectives. Welding experiments were conducted on low carbon steel using response surface methodology. The Pareto-optimal front with three ranked solutions after 20 generations was considered the best without further improvement. The joint strength as well as the transverse shrinkage was found to be drastically improved over the design-of-experiments results, as per the validated Pareto-optimal solutions obtained.
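The ranking step at the heart of NSGA-II, non-dominated sorting over the two conflicting objectives named above (maximize tensile strength, minimize distortion), can be sketched as follows. The candidate solutions are invented numbers for illustration, not the paper's experimental data, and only the rank-1 front is extracted.

```python
# Non-dominated sorting, the core ranking step of NSGA-II, sketched for two
# objectives: maximize tensile strength, minimize distortion.
# Candidate solutions below are illustrative, not the paper's data.

def dominates(a, b):
    """a dominates b if a is no worse in both objectives and strictly better
    in at least one. Each solution is (tensile_strength, distortion):
    the first is maximized, the second minimized."""
    no_worse = a[0] >= b[0] and a[1] <= b[1]
    better = a[0] > b[0] or a[1] < b[1]
    return no_worse and better

def pareto_front(solutions):
    """Return the rank-1 (non-dominated) solutions."""
    return [s for s in solutions
            if not any(dominates(o, s) for o in solutions if o is not s)]

# Hypothetical (strength in MPa, distortion in mm) candidates:
candidates = [(420, 1.8), (455, 2.4), (430, 1.5), (410, 2.9), (455, 1.9)]
front = pareto_front(candidates)   # the non-dominated trade-off solutions
```

The full algorithm repeats this ranking to assign fronts of rank 2, 3, … and adds crowding-distance sorting within each front; libraries such as DEAP or pymoo provide complete implementations.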
Not planning a sustainable transport system
DOE Office of Scientific and Technical Information (OSTI.GOV)
Finnveden, Göran, E-mail: goran.finnveden@abe.kth.se; Åkerman, Jonas
2014-04-01
The overall objective of the Swedish transport policy is to ensure the economically efficient and sustainable provision of transport services for people and business throughout the country. More specifically, the transport sector shall, among other things, contribute to the achievement of environmental quality objectives in which the development of the transport system plays an important role in the achievement of the objectives. The aim of this study is to analyse if current transport planning supports this policy. This is done by analysing two recent cases: the National Infrastructure Plan 2010–2021, and the planning of Bypass Stockholm, a major road investment. Our results show that the plans are in conflict with several of the environmental quality objectives. Another interesting aspect of the planning processes is that the long-term climate goals are not included in the planning processes, neither as a clear goal nor as a factor that will influence future transport systems. In this way, the long-term sustainability aspects are not present in the planning. We conclude that the two cases do not contribute to a sustainable transport system. Thus, several changes must be made in the processes, including putting up clear targets for emissions. Also, the methodology for the environmental assessments needs to be further developed and discussed. - Highlights: • Two cases are studied to analyse if current planning supports a sustainable transport system. • Results show that the plans are in conflict with several of the environmental quality objectives. • Long-term climate goals are not included in the planning processes. • Current practices do not contribute to sustainable planning processes. • Methodology and process for environmental assessments must be further developed and discussed.
Study on Handling Process and Quality Degradation of Oil Palm Fresh Fruit Bunches (FFB)
NASA Astrophysics Data System (ADS)
Mat Sharif, Zainon Binti; Taib, Norhasnina Binti Mohd; Yusof, Mohd Sallehuddin Bin; Rahim, Mohammad Zulafif Bin; Tobi, Abdul Latif Bin Mohd; Othman, Mohd Syafiq Bin
2017-05-01
The main objective of this study is to determine the relationship between the quality of oil palm fresh fruit bunches (FFB) and handling processes. The study employs an exploratory and descriptive design with a quantitative approach and purposive sampling; self-administered questionnaires were obtained from 30 smallholder respondents from the Southern Region, Peninsular Malaysia. The study reveals that there was a convincing relationship between the quality of oil palm fresh fruit bunches (FFB) and handling processes. The main handling process factors influencing the quality of FFB were harvesting activity and handling at the plantation area. As a result, it can be deduced that the handling process factors variable explains 82.80% of the variance in the quality of FFB. The overall findings reveal that handling process factors do play a significant role in the quality of oil palm fresh fruit bunches (FFB).
ERIC Educational Resources Information Center
Hou, Angela Yung-Chi; Ince, Martin; Tsai, Sandy; Chiang, Chung Lin
2015-01-01
As quality guardians of higher education, quality assurance agencies are required to guarantee the credibility of the review process and to ensure the objectivity and transparency of their decisions and recommendations. These agencies are therefore expected to use a range of internal and external approaches to prove the quality of their review…
Fit for purpose quality management system for military forensic exploitation.
Wilson, Lauren Elizabeth; Gahan, Michelle Elizabeth; Robertson, James; Lennard, Chris
2018-03-01
In a previous publication we described a systems approach to forensic science applied in the military domain. The forensic science 'system of systems' describes forensic science as a sub-system in the larger criminal justice, law enforcement, intelligence, and military systems, with quality management being an important supporting system. Quality management systems help to ensure that organisations achieve their objectives and continually improve their capability. Components of forensic science quality management systems can include standardisation of processes, accreditation of facilities to national/international standards, and certification of personnel. A fit-for-purpose quality management system should be balanced to allow organisations to meet objectives; provide continuous improvement; mitigate risk; and impart a positive quality culture. Considerable attention over the last decades has been given to the need for forensic science quality management systems to meet criminal justice and law enforcement objectives. More recently, the need for forensic quality management systems to meet forensic intelligence objectives has been considered. This paper, for the first time, discusses the need for a fit-for-purpose quality management system for military forensic exploitation. Crown Copyright © 2018. Published by Elsevier B.V. All rights reserved.
Tank 241-AZ-102 Privatization Push Mode Core Sampling and Analysis Plan
DOE Office of Scientific and Technical Information (OSTI.GOV)
RASMUSSEN, J.H.
1999-08-02
This sampling and analysis plan (SAP) identifies characterization objectives pertaining to sample collection, laboratory analytical evaluation, and reporting requirements for samples obtained from tank 241-AZ-102. The purpose of this sampling event is to obtain information about the characteristics of the contents of 241-AZ-102 required to satisfy the Data Quality Objectives for TWRS Privatization Phase I: Confirm Tank T Is an Appropriate Feed Source for High-Level Waste Feed Batch X (HLW DQO) (Nguyen 1999a), Data Quality Objectives for TWRS Privatization Phase I: Confirm Tank T Is an Appropriate Feed Source for Low-Activity Waste Feed Batch X (LAW DQO) (Nguyen 1999b), Low Activity Waste and High Level Waste Feed Data Quality Objectives (L&H DQO) (Patello et al. 1999), and Characterization Data Needs for Development, Design, and Operation of Retrieval Equipment Developed through the Data Quality Objective Process (Equipment DQO) (Bloom 1996). The Tank Characterization Technical Sampling Basis document (Brown et al. 1998) indicates that these issues, except the Equipment DQO, apply to tank 241-AZ-102 for this sampling event. The Equipment DQO is applied for shear strength measurements of the solids segments only. Poppiti (1999) requires additional americium-241 analyses of the sludge segments. Brown et al. (1998) also identify safety screening, regulatory issues, and provision of samples to the Privatization Contractor(s) as applicable issues for this tank. However, these issues will not be addressed via this sampling event. Reynolds et al. (1999) concluded that information from previous sampling events was sufficient to satisfy the safety screening requirements for tank 241-AZ-102. Push mode core samples will be obtained from risers 15C and 24A to provide sufficient material for the chemical analyses and tests required to satisfy these data quality objectives.
The 222-S Laboratory will extrude core samples, composite the liquids and solids, perform chemical analyses, and provide subsamples to the Process Chemistry Laboratory. The Process Chemistry Laboratory will prepare test plans and perform process tests to evaluate the behavior of the 241-AZ-102 waste undergoing the retrieval and treatment scenarios defined in the applicable DQOs. Requirements for analyses of samples originating in the process tests will be documented in the corresponding test plan.
ERIC Educational Resources Information Center
Kay, Robin H.; Knaack, Liesel
2009-01-01
Learning objects are interactive web-based tools that support the learning of specific concepts by enhancing, amplifying, and/or guiding the cognitive processes of learners. Research on the impact, effectiveness, and usefulness of learning objects is limited, partially because comprehensive, theoretically based, reliable, and valid evaluation…
NASA Astrophysics Data System (ADS)
Hess, M.; Robson, S.
2012-07-01
3D colour image data generated for the recording of small museum objects and archaeological finds are highly variable in quality and fitness for purpose. Whilst current technology is capable of extremely high quality outputs, there are currently no common standards or applicable guidelines in either the museum or engineering domain suited to scientific evaluation, understanding and tendering for 3D colour digital data. This paper firstly explains the rationale for, and requirements of, 3D digital documentation in museums. Secondly it describes the design process, development and use of a new portable test object suited to sensor evaluation and the provision of user acceptance metrics. The test object is specifically designed for museums and heritage institutions and includes known surface and geometric properties which support quantitative and comparative imaging on different systems. The development of a supporting protocol will allow object reference data to be included in the data processing workflow with specific reference to conservation and curation.
Carson, Matthew B; Lee, Young Ji; Benacka, Corrine; Mutharasan, R. Kannan; Ahmad, Faraz S; Kansal, Preeti; Yancy, Clyde W; Anderson, Allen S; Soulakis, Nicholas D
2017-01-01
Objective: Using Failure Mode and Effects Analysis (FMEA) as an example quality improvement approach, our objective was to evaluate whether secondary use of orders, forms, and notes recorded by the electronic health record (EHR) during daily practice can enhance the accuracy of process maps used to guide improvement. We examined discrepancies between expected and observed activities and individuals involved in a high-risk process and devised diagnostic measures for understanding discrepancies that may be used to inform quality improvement planning. Methods: Inpatient cardiology unit staff developed a process map of discharge from the unit. We matched activities and providers identified on the process map to EHR data. Using four diagnostic measures, we analyzed discrepancies between expectation and observation. Results: EHR data showed that 35% of activities were completed by unexpected providers, including providers from 12 categories not identified as part of the discharge workflow. The EHR also revealed sub-components of process activities not identified on the process map. Additional information from the EHR was used to revise the process map and show differences between expectation and observation. Conclusion: Findings suggest EHR data may reveal gaps in process maps used for quality improvement and identify characteristics about workflow activities that can identify perspectives for inclusion in an FMEA. Organizations with access to EHR data may be able to leverage clinical documentation to enhance process maps used for quality improvement. While focused on FMEA protocols, findings from this study may be applicable to other quality activities that require process maps. PMID:27589944
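The discrepancy analysis this abstract describes, comparing expected (activity, provider) pairs on a process map against pairs observed in EHR data, reduces to simple set operations. A minimal sketch follows; every activity and provider name below is hypothetical, invented for illustration.

```python
# Comparing an expected process map against observed EHR events.
# All activity and provider names are invented; the study's actual
# discharge workflow and diagnostic measures are richer than this sketch.

expected = {("med_reconciliation", "pharmacist"),
            ("discharge_summary", "attending"),
            ("patient_education", "nurse")}

observed = {("med_reconciliation", "pharmacist"),
            ("discharge_summary", "resident"),      # unexpected provider
            ("patient_education", "nurse"),
            ("follow_up_scheduling", "clerk")}      # activity not on the map

exp_by_activity = {a: p for a, p in expected}

# Activities completed by a provider category other than the expected one:
unexpected_provider = {(a, p) for a, p in observed
                       if a in exp_by_activity and exp_by_activity[a] != p}

# Observed activities missing from the process map entirely:
unmapped = {(a, p) for a, p in observed if a not in exp_by_activity}
```

In a real analysis each activity may legitimately map to several provider categories, so `exp_by_activity` would hold sets of providers rather than a single value; the set-difference logic stays the same.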
Hu, E; Liao, T. W.; Tiersch, T. R.
2013-01-01
Cryopreservation of fish sperm has been studied for decades at a laboratory (research) scale. However, high-throughput cryopreservation of fish sperm has recently been developed to enable industrial-scale production. This study treated blue catfish (Ictalurus furcatus) sperm high-throughput cryopreservation as a manufacturing production line and initiated quality assurance plan development. The main objectives were to identify: 1) the main production quality characteristics; 2) the process features for quality assurance; 3) the internal quality characteristics and their specification designs; 4) the quality control and process capability evaluation methods, and 5) the directions for further improvements and applications. The essential product quality characteristics were identified as fertility-related characteristics. Specification design, which established the tolerance levels according to demand and process constraints, was performed based on these quality characteristics. Meanwhile, to ensure integrity throughout the process, internal quality characteristics (characteristics at each quality control point within the process) that could affect fertility-related quality characteristics were defined with specifications. Due to the process feature of 100% inspection (quality inspection of every fish), a specific calculation method, the use of cumulative sum (CUSUM) control charts, was applied to monitor each quality characteristic. An index of overall process evaluation, process capability, was analyzed based on the in-control process and the designed specifications, which further integrates the quality assurance plan. With the established quality assurance plan, the process could operate stably and the quality of products would be reliable. PMID:23872356
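The CUSUM monitoring mentioned above can be sketched with the standard tabular form. The target, slack, decision interval, and "motility" values below are illustrative assumptions, not the study's parameters.

```python
# Tabular CUSUM for monitoring a quality characteristic under 100% inspection.
# Target, k, h, and the measurements are invented for illustration; in practice
# k is often 0.5*sigma and h around 4-5*sigma of the characteristic.

def cusum(samples, target, k, h):
    """Return (upper, lower) CUSUM paths and indices of out-of-control points.

    k: allowable slack per observation; h: decision interval (alarm limit)."""
    c_hi = c_lo = 0.0
    upper, lower, alarms = [], [], []
    for i, x in enumerate(samples):
        c_hi = max(0.0, c_hi + (x - target) - k)   # accumulates upward drift
        c_lo = max(0.0, c_lo + (target - x) - k)   # accumulates downward drift
        upper.append(c_hi)
        lower.append(c_lo)
        if c_hi > h or c_lo > h:
            alarms.append(i)
    return upper, lower, alarms

# Hypothetical post-thaw motility (%) for successive fish; a downward drift
# starts around the fifth observation:
data = [52, 50, 49, 51, 47, 45, 44, 43, 42, 41]
up, low, alarms = cusum(data, target=50, k=1.0, h=5.0)
```

Because every fish is inspected, the chart is updated per item rather than per subgroup; the first alarm here fires on the sixth observation, once the accumulated downward deviation exceeds the decision interval.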
Air Pollution Data for Model Evaluation and Application
One objective of designing an air pollution monitoring network is to obtain data for evaluating air quality models that are used in the air quality management process and scientific discovery. A common use is to relate emissions to air quality, including assessing ...
NASA Technical Reports Server (NTRS)
Hamazaki, Takashi
1992-01-01
This paper describes an architecture for realizing high quality production schedules. Although quality is one of the most important aspects of production scheduling, it is difficult, even for a user, to specify precisely. However, it is also true that the decision as to whether a scheduler is good or bad can only be made by the user. This paper proposes the following: (1) the quality of a schedule can be represented in the form of quality factors, i.e. constraints and objectives of the domain, and their structure; (2) quality factors and their structure can be used for decision making at local decision points during the scheduling process; and (3) that they can be defined via iteration of user specification processes.
Intelligent Systems Approaches to Product Sound Quality Analysis
NASA Astrophysics Data System (ADS)
Pietila, Glenn M.
As a product market becomes more competitive, consumers become more discriminating in the way in which they differentiate between engineered products. The consumer often makes a purchasing decision based on the sound emitted from the product during operation by using the sound to judge quality or annoyance. Therefore, in recent years, many sound quality analysis tools have been developed to evaluate the consumer preference as it relates to a product sound and to quantify this preference based on objective measurements. This understanding can be used to direct a product design process in order to help differentiate the product from competitive products or to establish an impression on consumers regarding a product's quality or robustness. The sound quality process is typically a statistical tool that is used to model subjective preference, or merit score, based on objective measurements, or metrics. In this way, new product developments can be evaluated in an objective manner without the laborious process of gathering a sample population of consumers for subjective studies each time. The most common model used today is the Multiple Linear Regression (MLR), although recently non-linear Artificial Neural Network (ANN) approaches are gaining popularity. This dissertation will review publicly available published literature and present additional intelligent systems approaches that can be used to improve on the current sound quality process. The focus of this work is to address shortcomings in the current paired comparison approach to sound quality analysis. This research will propose a framework for an adaptive jury analysis approach as an alternative to the current Bradley-Terry model. The adaptive jury framework uses statistical hypothesis testing to focus on sound pairings that are most interesting and is expected to address some of the restrictions required by the Bradley-Terry model. 
It will also provide a more amicable framework for an intelligent systems approach. Next, an unsupervised jury clustering algorithm is used to identify and classify subgroups within a jury who have conflicting preferences. In addition, a nested Artificial Neural Network (ANN) architecture is developed to predict subjective preference based on objective sound quality metrics, in the presence of non-linear preferences. Finally, statistical decomposition and correlation algorithms are reviewed that can help an analyst establish a clear understanding of the variability of the product sounds used as inputs into the jury study and to identify correlations between preference scores and sound quality metrics in the presence of non-linearities.
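The Bradley-Terry model that the adaptive jury framework builds on can be fit with the classic iterative (minorization-maximization) update. A minimal sketch; the pairwise win counts below are invented jury data, not results from the dissertation.

```python
import numpy as np

def bradley_terry(wins, iters=100):
    """Estimate Bradley-Terry strengths from a pairwise win matrix.
    wins[i, j] = number of times item i was preferred over item j."""
    n = wins.shape[0]
    p = np.ones(n)
    games = wins + wins.T          # total comparisons per pair
    for _ in range(iters):
        w = wins.sum(axis=1)       # total wins per item
        denom = np.array([
            sum(games[i, j] / (p[i] + p[j]) for j in range(n) if j != i)
            for i in range(n)
        ])
        p = w / denom
        p /= p.sum()               # fix the arbitrary scale
    return p

# Three product sounds: A is preferred over B and C, B over C
wins = np.array([[0, 8, 9],
                 [2, 0, 7],
                 [1, 3, 0]], float)
print(bradley_terry(wins))
```

The update converges whenever the comparison graph is strongly connected; the returned strengths give each sound's probability-scale merit, with `p[i] / (p[i] + p[j])` the modeled chance that sound `i` wins a pairing against sound `j`.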
The Minimum Data Set Depression Quality Indicator: Does It Reflect Differences in Care Processes?
ERIC Educational Resources Information Center
Simmons, S.F.; Cadogan, M.P.; Cabrera, G.R.; Al-Samarrai, N.R.; Jorge, J.S.; Levy-Storms, L.; Osterweil, D.; Schnelle, J.F.
2004-01-01
Purpose. The objective of this work was to determine if nursing homes that score differently on prevalence of depression, according to the Minimum Data Set (MDS) quality indicator, also provide different processes of care related to depression. Design and Methods. A cross-sectional study with 396 long-term residents in 14 skilled nursing…
USDA-ARS?s Scientific Manuscript database
The objective of this study was to evaluate the effects of moisture addition at the gin stand feeder conditioning hopper and/or the battery condenser slide on gin performance and Western cotton fiber quality and textile processing. The test treatments included no moisture addition, feeder hopper hum...
Quality-Oriented Management of Educational Innovation at Madrasah Ibtidaiyah
ERIC Educational Resources Information Center
Sofanudin, Aji; Rokhman, Fathur; Wasino; Rusdarti
2016-01-01
This study aims to explore the quality-oriented management of educational innovation at Madrasah Ibtidaiyah. Quality-Oriented Management of Educational Innovation is the process of managing new resources (ideas, practices, objects, methods) in the field of education to achieve educational goals or solve the problem of education. New ideas,…
Managing the travel model process : small and medium-sized MPOs. Instructor guide.
DOT National Transportation Integrated Search
2013-09-01
The learning objectives of this course were to: explain fundamental travel model concepts; describe the model development process; identify key inputs and describe the quality control process; and identify and manage resources.
Managing the travel model process : small and medium-sized MPOs. Participant handbook.
DOT National Transportation Integrated Search
2013-09-01
The learning objectives of this course were to: explain fundamental travel model concepts; describe the model development process; identify key inputs and describe the quality control process; and identify and manage resources.
Guidance on Systematic Planning Using the Data Quality Objectives Process, EPA QA/G-4
Provides a standard working tool for project managers and planners to develop DQO for determining the type, quantity, and quality of data needed to reach defensible decisions or make credible estimates.
Multi-criteria analysis for PM10 planning
NASA Astrophysics Data System (ADS)
Pisoni, Enrico; Carnevale, Claudio; Volta, Marialuisa
To implement sound air quality policies, Regulatory Agencies require tools to evaluate the outcomes and costs associated with different emission reduction strategies. These tools are even more useful when considering atmospheric PM10 concentrations, due to the complex nonlinear processes that affect production and accumulation of the secondary fraction of this pollutant. The approaches presented in the literature (Integrated Assessment Modeling) are mainly cost-benefit and cost-effectiveness analyses. In this work, the formulation of a multi-objective problem to control particulate matter is proposed. The methodology defines: (a) the control objectives (the air quality indicator and the emission reduction cost functions); (b) the decision variables (precursor emission reductions); (c) the problem constraints (maximum feasible technology reductions). The cause-effect relations between air quality indicators and decision variables are identified by tuning nonlinear source-receptor models. The multi-objective problem solution provides the decision maker with a set of non-dominated scenarios representing the efficient trade-off between the air quality benefit and the internal costs (emission reduction technology costs). The methodology has been implemented for Northern Italy, which is often affected by high long-term exposure to PM10. The source-receptor models used in the multi-objective analysis are identified by processing long-term simulations of the GAMES multiphase modeling system, performed in the framework of the CAFE-Citydelta project.
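The set of non-dominated scenarios that such a multi-objective solution delivers can be extracted with a simple Pareto filter. A minimal sketch, assuming both objectives (air quality indicator and emission reduction cost) are to be minimized; the scenario values are invented for illustration.

```python
import numpy as np

def non_dominated(points):
    """Return the indices of Pareto non-dominated points when every
    objective (column) is to be minimized."""
    pts = np.asarray(points, float)
    keep = []
    for i, p in enumerate(pts):
        dominated = any(
            np.all(q <= p) and np.any(q < p)   # q at least as good, strictly better somewhere
            for j, q in enumerate(pts) if j != i
        )
        if not dominated:
            keep.append(i)
    return keep

# (PM10 indicator, emission reduction cost) for six candidate scenarios
scenarios = [(42, 10), (38, 15), (35, 30), (40, 12), (44, 9), (36, 40)]
print(non_dominated(scenarios))
```

Scenario 5 is dropped because scenario 2 achieves a lower PM10 indicator at lower cost; the remaining scenarios form the efficient trade-off curve presented to the decision maker.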
The NCC project: A quality management perspective
NASA Technical Reports Server (NTRS)
Lee, Raymond H.
1993-01-01
The Network Control Center (NCC) Project introduced the concept of total quality management (TQM) in mid-1990. The CSC project team established a program which focused on continuous process improvement in software development methodology and consistent deliveries of high quality software products for the NCC. The vision of the TQM program was to produce error free software. Specific goals were established to allow continuing assessment of the progress toward meeting the overall quality objectives. The total quality environment, now a part of the NCC Project culture, has become the foundation for continuous process improvement and has resulted in the consistent delivery of quality software products over the last three years.
Hansen, J H; Nandkumar, S
1995-01-01
The formulation of reliable signal processing algorithms for speech coding and synthesis requires the selection of a prior criterion of performance. Though coding efficiency (bits/second) or computational requirements can be used, a final performance measure must always include speech quality. In this paper, three objective speech quality measures are considered with respect to quality assessment for American English, noisy American English, and noise-free versions of seven languages. The purpose is to determine whether objective quality measures can be used to quantify changes in quality for a given voice coding method, with a known subjective performance level, as background noise or language conditions are changed. The speech coding algorithm chosen is regular-pulse excitation with long-term prediction (RPE-LTP), which has been chosen as the standard voice compression algorithm for the European Digital Mobile Radio system. Three areas are considered for objective quality assessment: (i) vocoder performance for American English in a noise-free environment, (ii) speech quality variation for three additive background noise sources, and (iii) noise-free performance for seven languages: English, Japanese, Finnish, German, Hindi, Spanish, and French. It is suggested that although existing objective quality measures will never replace subjective testing, they can be a useful means of assessing changes in performance, identifying areas for improvement in algorithm design, and augmenting subjective quality tests for voice coding/compression algorithms in noise-free, noisy, and/or non-English applications.
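One widely used objective measure of the kind evaluated in such studies is the segmental SNR, which averages frame-level SNRs instead of computing one global ratio. A minimal sketch on synthetic signals; the frame length and clamping limits are conventional choices, not values from the paper.

```python
import numpy as np

def segmental_snr(clean, processed, frame=160, eps=1e-10):
    """Frame-averaged SNR in dB. Each frame's SNR is clamped to
    [-10, 35] dB, as is conventional, so silent frames do not
    dominate the mean."""
    snrs = []
    for start in range(0, len(clean) - frame + 1, frame):
        s = clean[start:start + frame]
        e = s - processed[start:start + frame]
        snr = 10 * np.log10((np.sum(s ** 2) + eps) / (np.sum(e ** 2) + eps))
        snrs.append(np.clip(snr, -10.0, 35.0))
    return float(np.mean(snrs))

rng = np.random.default_rng(1)
speech = rng.standard_normal(1600)            # stand-in for a clean utterance
noisy = speech + 0.1 * rng.standard_normal(1600)
print(round(segmental_snr(speech, noisy), 1))
```

Because the average is taken over per-frame log ratios, the measure tracks quality changes as background noise is added far better than a single long-term SNR, which is exactly the sensitivity these comparative studies rely on.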
The objective impact of clinical peer review on hospital quality and safety.
Edwards, Marc T
2011-01-01
Despite its importance, the objective impact of clinical peer review on the quality and safety of care has not been studied. Data from 296 acute care hospitals show that peer review program and related organizational factors can explain up to 18% of the variation in standardized measures of quality and patient safety. The majority of programs rely on an outmoded and dysfunctional process model. Adoption of best practices informed by the continuing study of peer review program effectiveness has the potential to significantly improve patient outcomes.
Quality measurement and benchmarking of HPV vaccination services: a new approach.
Maurici, Massimo; Paulon, Luca; Campolongo, Alessandra; Meleleo, Cristina; Carlino, Cristiana; Giordani, Alessandro; Perrelli, Fabrizio; Sgricia, Stefano; Ferrante, Maurizio; Franco, Elisabetta
2014-01-01
A new measurement process based upon a well-defined mathematical model was applied to evaluate the quality of human papillomavirus (HPV) vaccination centers in 3 of 12 Local Health Units (ASLs) within the Lazio Region of Italy. The quality aspects considered for evaluation were communicational efficiency, organizational efficiency and comfort. The overall maximum achievable value was 86.10%, while the HPV vaccination quality scores for ASL1, ASL2 and ASL3 were 73.07%, 71.08%, and 67.21%, respectively. With this new approach it is possible to represent the probabilistic reasoning of a stakeholder who evaluates the quality of a healthcare provider. All ASLs had margins for improvement, and optimal quality results can be assessed in terms of better performance conditions, confirming the relationship between the resulting quality scores and HPV vaccination coverage. The measurement process was structured into three steps and involved four stakeholder categories: doctors, nurses, parents and vaccinated women. In Step 1, questionnaires were administered to collect different stakeholders' points of view (i.e., subjective data) that were processed to obtain the best and worst performance conditions when delivering a healthcare service. Step 2 of the process involved the gathering of performance data during the service delivery (i.e., objective data collection). Step 3 of the process involved the processing of all data: subjective data from step 1 were used to define a "standard" against which to test objective data from step 2. This entire process led to the creation of a set of scorecards. Benchmarking is presented as a result of the probabilistic meaning of the evaluated scores.
Food fortification: issues on quality assurance and impact evaluation in developing countries.
Florentino, R
2003-01-01
Quality assurance and impact evaluation are essential components of a food fortification program and should be integrated into the fortification process. Quality assurance will ensure that the micronutrient meant to be delivered is indeed reaching the target population at the correct level. Impact evaluation will determine the effectiveness of food fortification as a strategy for controlling micronutrient deficiency and enable program planners to make decisions on the future of the program. In developing countries, both quality assurance and impact evaluation are often constrained not only by inadequate facilities and limited financial and manpower resources, but also by unclear definition of objectives and inappropriate design. It is therefore necessary to consider the target audience for quality assurance monitoring and impact evaluation in order to clearly define their objectives and, in turn, to suit the design to these objectives, while also taking the limitations in financial and manpower resources into account.
Development of an Optimization Methodology for the Aluminum Alloy Wheel Casting Process
NASA Astrophysics Data System (ADS)
Duan, Jianglan; Reilly, Carl; Maijer, Daan M.; Cockcroft, Steve L.; Phillion, Andre B.
2015-08-01
An optimization methodology has been developed for the aluminum alloy wheel casting process. The methodology is focused on improving the timing of cooling processes in a die to achieve improved casting quality. This methodology utilizes (1) a casting process model, which was developed within the commercial finite element package ABAQUS™ (ABAQUS is a trademark of Dassault Systèmes); (2) a Python-based results extraction procedure; and (3) a numerical optimization module from the open-source Python library SciPy. To achieve optimal casting quality, a set of constraints has been defined to ensure directional solidification, and an objective function, based on the solidification cooling rates, has been defined to either maximize, or target a specific, cooling rate. The methodology has been applied to a series of casting and die geometries with different cooling system configurations, including a 2-D axisymmetric wheel and die assembly generated from a full-scale prototype wheel. The results show that, with properly defined constraint and objective functions, solidification conditions can be improved and optimal cooling conditions can be achieved, leading to process productivity and product quality improvements.
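The constrained formulation described above can be sketched with `scipy.optimize.minimize`. Everything model-specific here is a stand-in: the real objective is evaluated by running an ABAQUS casting simulation, whereas `cooling_rate` below is an invented algebraic surrogate, and the target value, bounds, and constraint are illustrative.

```python
import numpy as np
from scipy.optimize import minimize

def cooling_rate(t):
    """Hypothetical surrogate mapping two cooling-channel switch-on
    times to a solidification cooling rate (stands in for the FE model)."""
    return 5.0 - 0.8 * t[0] + 0.3 * t[1] - 0.05 * t[0] * t[1]

target = 3.0  # desired cooling rate, illustrative

# Minimize squared deviation from the target cooling rate, subject to
# t[0] <= t[1] (bottom cooling switches on first, so solidification
# stays directional) and bounds on the switch-on times.
res = minimize(
    lambda t: (cooling_rate(t) - target) ** 2,
    x0=np.array([1.0, 2.0]),
    bounds=[(0.0, 10.0), (0.0, 10.0)],
    constraints=[{"type": "ineq", "fun": lambda t: t[1] - t[0]}],
    method="SLSQP",
)
print(res.x, cooling_rate(res.x))
```

Swapping the surrogate for a function that writes an ABAQUS input deck, runs the job, and parses the cooling rates reproduces the structure of the paper's workflow: the optimizer only ever sees an objective value and constraint values per candidate timing.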
Video quality assessment using M-SVD
NASA Astrophysics Data System (ADS)
Tao, Peining; Eskicioglu, Ahmet M.
2007-01-01
Objective video quality measurement is a challenging problem in a variety of video processing applications ranging from lossy compression to printing. An ideal video quality measure should be able to mimic the human observer. We present a new video quality measure, M-SVD, to evaluate distorted video sequences based on singular value decomposition. A computationally efficient approach is developed for full-reference (FR) video quality assessment. This measure is tested on the Video Quality Experts Group (VQEG) phase I FR-TV test data set. Our experiments show that the graphical measure displays the amount of distortion as well as the distribution of error in all frames of the video sequence, while the numerical measure correlates well with perceived video quality and outperforms PSNR and other objective measures by a clear margin.
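SVD-based quality measures of this family compare the singular-value spectra of corresponding blocks in the reference and distorted frames. The sketch below follows the common block-wise formulation (per-block spectral distance, then the dispersion of those distances as the numerical score); it illustrates the idea and is not claimed to be the exact M-SVD definition from the paper.

```python
import numpy as np

def svd_quality_map(ref, dist, block=8):
    """Per-block distance between the singular-value spectra of a
    reference and a distorted frame (larger = more distortion)."""
    h, w = ref.shape
    d = []
    for r in range(0, h - h % block, block):
        for c in range(0, w - w % block, block):
            s_ref = np.linalg.svd(ref[r:r+block, c:c+block], compute_uv=False)
            s_dst = np.linalg.svd(dist[r:r+block, c:c+block], compute_uv=False)
            d.append(np.sqrt(np.sum((s_ref - s_dst) ** 2)))
    return np.array(d)

def svd_quality_score(ref, dist, block=8):
    """Collapse the block map to one number: mean absolute deviation
    of block distances from their median."""
    d = svd_quality_map(ref, dist, block)
    return float(np.mean(np.abs(d - np.median(d))))

rng = np.random.default_rng(2)
frame = rng.uniform(0, 255, (64, 64))
noisy = frame + rng.normal(0, 20, frame.shape)
print(svd_quality_score(frame, frame), svd_quality_score(frame, noisy))
```

The block map itself is the "graphical measure": plotted as an image, it shows where in the frame the distortion concentrates, while the scalar score supports correlation against subjective ratings.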
Manipulation of Unknown Objects to Improve the Grasp Quality Using Tactile Information.
Montaño, Andrés; Suárez, Raúl
2018-05-03
This work presents a novel and simple approach in the area of manipulation of unknown objects considering both geometric and mechanical constraints of the robotic hand. Starting with an initial blind grasp, our method improves the grasp quality through manipulation considering the three common goals of the manipulation process: improving the hand configuration, the grasp quality and the object positioning, and, at the same time, prevents the object from falling. Tactile feedback is used to obtain local information of the contacts between the fingertips and the object, and no additional exteroceptive feedback sources are considered in the approach. The main novelty of this work lies in the fact that the grasp optimization is performed on-line as a reactive procedure using the tactile and kinematic information obtained during the manipulation. Experimental results are shown to illustrate the efficiency of the approach.
Schroeder, R.L.
2006-01-01
It is widely accepted that plans for restoration projects should contain specific, measurable, and science-based objectives to guide restoration efforts. The United States Fish and Wildlife Service (USFWS) is in the process of developing Comprehensive Conservation Plans (CCPs) for more than 500 units in the National Wildlife Refuge System (NWRS). These plans contain objectives for biological and ecosystem restoration efforts on the refuges. Based on USFWS policy, a system was developed to evaluate the scientific quality of such objectives based on three critical factors: (1) Is the objective specific, measurable, achievable, results-oriented, and time-fixed? (2) What is the extent of the rationale that explains the assumptions, logic, and reasoning for the objective? (3) How well was available science used in the development of the objective? The evaluation system scores each factor on a scale of 1 (poor) to 4 (excellent) according to detailed criteria. The biological and restoration objectives from CCPs published as of September 2004 (60 total) were evaluated. The overall average score for all biological and restoration objectives was 1.73. Average scores for each factor were: Factor 1: 1.97; Factor 2: 1.86; Factor 3: 1.38. The overall scores increased from 1997 to 2004. Future restoration efforts may benefit by using this evaluation system during the process of plan development, to ensure that biological and restoration objectives are of the highest scientific quality possible prior to the implementation of restoration plans, and to allow for improved monitoring and adaptive management.
Quality - Inexpensive if a way of life.
NASA Technical Reports Server (NTRS)
Grau, D.
1972-01-01
NASA major projects require phased planning. The participation of persons charged with maintaining the proper quality during the last two of four phases has become accepted practice. Current objectives are concerned with the application of quality assurance techniques during the second phase. It is pointed out that quality must be emphasized during the entire engineering process, starting with the selection of the components.
USDA-ARS?s Scientific Manuscript database
Objectives were to determine the effects of marketing group on quality and variability of belly and adipose tissue quality traits of pigs sourced from differing production focuses (lean vs. quality). Pigs (N = 8,042) raised in 8 barns representing 2 seasons (cold and hot) were used. Three groups wer...
USDA-ARS?s Scientific Manuscript database
The objective was: 1) to characterize the effect of marketing 30 group on fresh and cured ham quality, and 2) to determine which fresh ham traits correlated to cured ham quality traits. Pigs raised in 8 barns representing two seasons (hot and cold) and two production focuses (lean and quality) were ...
NASA Technical Reports Server (NTRS)
Raiman, Laura B.
1992-01-01
Total Quality Management (TQM) is a cooperative form of doing business that relies on the talents of everyone in an organization to continually improve quality and productivity, using teams and an assortment of statistical and measurement tools. The objective of the activities described in this paper was to implement effective improvement tools and techniques in order to build work processes which support good management and technical decisions and actions which are crucial to the success of the ACRV project. The objectives were met by applications in both the technical and management areas. The management applications involved initiating focused continuous improvement projects with widespread team membership. The technical applications involved applying proven statistical tools and techniques to the technical issues associated with the ACRV Project. Specific activities related to the objective included working with a support contractor team to improve support processes, examining processes involved in international activities, a series of tutorials presented to the New Initiatives Office and support contractors, a briefing to NIO managers, and work with the NIO Q+ Team. On the technical side, work included analyzing data from the large-scale W.A.T.E.R. test, landing mode trade analyses, and targeting probability calculations. The results of these efforts will help to develop a disciplined, ongoing process for producing fundamental decisions and actions that shape and guide the ACRV organization.
Internal Quality Assurance Benchmarking. ENQA Workshop Report 20
ERIC Educational Resources Information Center
Blackstock, Douglas; Burquel, Nadine; Comet, Nuria; Kajaste, Matti; dos Santos, Sergio Machado; Marcos, Sandra; Moser, Marion; Ponds, Henri; Scheuthle, Harald; Sixto, Luis Carlos Velon
2012-01-01
The Internal Quality Assurance group of ENQA (IQA Group) has been organising a yearly seminar for its members since 2007. The main objective is to share experiences concerning the internal quality assurance of work processes in the participating agencies. The overarching theme of the 2011 seminar was how to use benchmarking as a tool for…
Low-cost oblique illumination: an image quality assessment.
Ruiz-Santaquiteria, Jesus; Espinosa-Aranda, Jose Luis; Deniz, Oscar; Sanchez, Carlos; Borrego-Ramos, Maria; Blanco, Saul; Cristobal, Gabriel; Bueno, Gloria
2018-01-01
We study the effectiveness of several low-cost oblique illumination filters to improve overall image quality, in comparison with standard bright-field imaging. For this purpose, a dataset composed of 3360 diatom images belonging to 21 taxa was acquired. Subjective and objective image quality assessments were done. The subjective evaluation was performed by a group of diatom experts through a psychophysical test in which resolution, focus, and contrast were assessed. Moreover, some objective no-reference image quality metrics were applied to the same image dataset to complete the study, together with the calculation of several texture features to analyze the effect of these filters in terms of textural properties. Both image quality evaluation methods, subjective and objective, showed better results for images acquired using these illumination filters in comparison with the unfiltered image. These promising results confirm that this kind of illumination filter can be a practical way to improve image quality, thanks to the simplicity and low cost of the design and manufacturing process. (2018) COPYRIGHT Society of Photo-Optical Instrumentation Engineers (SPIE).
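No-reference metrics of the kind applied in such studies include simple focus measures. Below is a minimal sketch of one, the variance of the discrete Laplacian, which rewards strong second derivatives and so scores sharper images higher; the synthetic "sharp" and "blurred" arrays are stand-ins for real micrographs.

```python
import numpy as np

def laplacian_variance(img):
    """No-reference sharpness score: variance of the 5-point discrete
    Laplacian over the image interior."""
    lap = (-4.0 * img[1:-1, 1:-1]
           + img[:-2, 1:-1] + img[2:, 1:-1]
           + img[1:-1, :-2] + img[1:-1, 2:])
    return float(lap.var())

rng = np.random.default_rng(3)
sharp = rng.uniform(0.0, 1.0, (128, 128))
# Crude 3x3 mean blur built from shifted copies of the image
blurred = sum(np.roll(np.roll(sharp, i, axis=0), j, axis=1)
              for i in (-1, 0, 1) for j in (-1, 0, 1)) / 9.0
print(laplacian_variance(sharp) > laplacian_variance(blurred))
```

Since blurring removes high-frequency content, the blurred image scores lower; ranking images acquired under different illumination filters by such a score is one objective way to compare them without a reference image.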
Choice of mathematical models for technological process of glass rod drawing
NASA Astrophysics Data System (ADS)
Alekseeva, L. B.
2017-10-01
The technological process of drawing glass rods (light guides) is considered. Automated control of the drawing process reduces to making decisions that ensure a given quality. The drawing process is treated as a control system comprising the drawing device (the controlling device) and the optical fiber forming zone (the controlled object). To study the processes occurring in the forming zone, mathematical models based on the fundamentals of continuum mechanics are proposed. To assess the influence of disturbances, a transfer function is derived from the wave equation. The regression equation obtained also adequately describes the drawing process.
DATA QUALITY OBJECTIVE SUMMARY REPORT FOR THE 105 K EAST ION EXCHANGE COLUMN MONOLITH
DOE Office of Scientific and Technical Information (OSTI.GOV)
JOCHEN, R.M.
2007-08-02
The 105-K East (KE) Basin Ion Exchange Column (IXC) cells, lead caves, and the surrounding vault are to be removed as necessary components in implementing ''Hanford Federal Facility Agreement and Consent Order'' (Ecology et al. 2003) milestone M-034-32 (Complete Removal of the K East Basin Structure). The IXCs consist of six units located in the KE Basin, three in operating positions in cells and three stored in a lead cave. Methods to remove the IXCs from the KE Basin were evaluated in KBC-28343, ''Disposal of K East Basin Ion Exchange Column Evaluation''. The method selected for removal was grouting the six IXCs into a single monolith for disposal at the Environmental Restoration Disposal Facility (ERDF). Grout will be added to the IXC cells, the IXC lead caves containing spent IXCs, and the spaces between the lead cave walls and metal skin, to immobilize the contaminants, provide self-shielding, minimize void space, and provide a structurally stable waste form. The waste to be offered for disposal is the encapsulated monolith defined by the exterior surfaces of the vault and the lower surface of the underlying slab. This document presents a summary of the data quality objective (DQO) process establishing the decisions and data required to support decision-making activities for the disposition of the IXC monolith. The DQO process is completed in accordance with the seven-step planning process described in EPA QA/G-4, ''Guidance for the Data Quality Objectives Process'', which is used to clarify study objectives; define the appropriate type, quantity, and quality of data; and support defensible decision-making. The DQO process involves the following steps: (1) state the problem; (2) identify the decision; (3) identify the inputs to the decision; (4) define the boundaries of the study; (5) develop a decision rule (DR); (6) specify tolerable limits on decision errors; and (7) optimize the design for obtaining data.
Building quality into medical product software design.
Mallory, S R
1993-01-01
The software engineering and quality assurance disciplines are a requisite to the design of safe and effective software-based medical devices. It is in the areas of software methodology and process that the most beneficial application of these disciplines to software development can be made. Software is a product of complex operations and methodologies and is not amenable to the traditional electromechanical quality assurance processes. Software quality must be built in by the developers, with the software verification and validation engineers acting as the independent instruments for ensuring compliance with performance objectives and with development and maintenance standards. The implementation of a software quality assurance program is a complex process involving management support, organizational changes, and new skill sets, but the benefits are profound. Its rewards provide safe, reliable, cost-effective, maintainable, and manageable software, which may significantly speed the regulatory review process and therefore potentially shorten the overall time to market. The use of a trial project can greatly facilitate the learning process associated with the first-time application of a software quality assurance program.
A new approach to the identification of Landscape Quality Objectives (LQOs) as a set of indicators.
Sowińska-Świerkosz, Barbara Natalia; Chmielewski, Tadeusz J
2016-12-15
The objective of the paper is threefold: (1) to introduce Landscape Quality Objectives (LQOs) as a set of indicators; (2) to present a method of linking social and expert opinion in the process of the formulation of landscape indicators; and (3) to present a methodological framework for the identification of LQOs. The implementation of these goals adopted a six-stage procedure based on the use of landscape units: (1) GIS analysis; (2) classification; (3) social survey; (4) expert value judgement; (5) quality assessment; and (6) guidelines formulation. The essence of the research was the presentation of features that determine landscape quality according to public opinion as a set of indicators. The results showed that 80 such indicators were identified, of both a qualitative (49) and a quantitative character (31). Among the analysed units, 60% (18 objects) featured socially expected (and confirmed by experts) levels of landscape quality, and 20% (6 objects) required overall quality improvement in terms of both public and expert opinion. The adopted procedure provides a new tool for integrating social responsibility into environmental management. The advantage of the presented method is the possibility of its application in the territories of various European countries. It is flexible enough to be based on cartographic studies, landscape research methods, and environmental quality standards existing in a given country. Copyright © 2016 Elsevier Ltd. All rights reserved.
Astronomical Instrumentation Systems Quality Management Planning: AISQMP
NASA Astrophysics Data System (ADS)
Goldbaum, Jesse
2017-06-01
The capability of small aperture astronomical instrumentation systems (AIS) to make meaningful scientific contributions has never been better. The purpose of AIS quality management planning (AISQMP) is to ensure the quality of these contributions such that they are both valid and reliable. The first step involved with AISQMP is to specify objective quality measures not just for the AIS final product, but also for the instrumentation used in its production. The next step is to set up a process to track these measures and control for any unwanted variation. The final step is continual effort applied to reducing variation and obtaining measured values near optimal theoretical performance. This paper provides an overview of AISQMP while focusing on objective quality measures applied to astronomical imaging systems.
USDA-ARS?s Scientific Manuscript database
The objective was to quantify the effect of HCW on pork primal quality of 7,684 pigs with carcass weights ranging from 53.2 to 129.6 kg. Carcass composition, subjective loin quality, and ham face color were collected on all carcasses. In-plant instrumental loin color and belly quality analyses were ...
Quality Assurance for Postgraduate Programs: Design of a Model Applied on a University in Chile
ERIC Educational Resources Information Center
Careaga Butter, Marcelo; Meyer Aguilera, Eduardo; Badilla Quintana, María Graciela; Jiménez Pérez, Laura; Sepúlveda Valenzuela, Eileen
2017-01-01
The quality of Education in Chile is a controversial topic that has been in the public debate in the last several years. To ensure quality in graduate programs, accreditation is compulsory. The current article presents a model to improve the process of self-regulation. The main objective was to design a Model of Quality Assurance for Postgraduate…
Quality risk management in pharmaceutical development.
Charoo, Naseem Ahmad; Ali, Areeg Anwer
2013-07-01
The objective of the ICH Q8, Q9, and Q10 documents is the application of a systematic, science-based approach to formulation development for building quality into the product. There is always some uncertainty in new product development, and good risk management practice is essential to the success of new product development in reducing this uncertainty. In the quality-by-design paradigm, the product performance properties relevant to the patient are predefined in the target product profile (TPP). Together with prior knowledge and experience, the TPP helps in the identification of critical quality attributes (CQAs). An initial risk assessment that identifies risks to these CQAs provides the impetus for product development. The product and process are designed to gain knowledge about these risks, devise strategies to eliminate or mitigate them, and meet the objectives set in the TPP. By laying more emphasis on high-risk events, the level of protection of the patient is increased. Because the process is scientifically driven, the transparency and reliability of the manufacturer improve. The focus on risk to the patient, together with a flexible development approach, saves invaluable resources, increases confidence in quality, and reduces compliance risk. The knowledge acquired in analysing risks to CQAs permits the construction of a meaningful design space. Within the boundaries of the design space, variation in critical material characteristics and process parameters must be managed in order to yield a product having the desired characteristics. Specifications based on product and process understanding are established such that the product will meet the specifications if tested. In this way, the product is amenable to real-time release, since specifications only confirm quality; they do not serve as a means of effective process control.
DATA QUALITY OBJECTIVES FOR SELECTING WASTE SAMPLES FOR BENCH-SCALE REFORMER TREATABILITY STUDIES
DOE Office of Scientific and Technical Information (OSTI.GOV)
BANNING DL
2011-02-11
This document describes the data quality objectives used to select archived samples located at the 222-S Laboratory for bench-scale reforming testing. The type, quantity, and quality of the data required to select the samples for fluidized bed steam reformer testing are discussed. In order to maximize the efficiency and minimize the time to treat Hanford tank waste in the Waste Treatment and Immobilization Plant, additional treatment processes may be required. One of the potential treatment processes is the fluidized bed steam reformer, and a determination of the adequacy of this process to treat Hanford tank waste is required. The initial step is to select archived waste samples from the 222-S Laboratory that will be used in bench-scale tests. Analyses of the selected samples will be required to confirm that the samples meet the shipping requirements and for comparison with the bench-scale reformer (BSR) test sample selection requirements.
An introduction to statistical process control in research proteomics.
Bramwell, David
2013-12-16
Statistical process control is a well-established and respected method which provides a general-purpose, consistent framework for monitoring and improving the quality of a process. It is routinely used in many industries where the quality of final products is critical and is often required in clinical diagnostic laboratories [1,2]. To date, the methodology has been little utilised in research proteomics. It has been shown to be capable of delivering quantitative QC procedures for qualitative clinical assays [3], making it an ideal methodology to apply to this area of biological research. The aim is to introduce statistical process control as an objective strategy for quality control and to show how it could be used to benefit proteomics researchers and enhance the quality of the results they generate. We demonstrate that rules which provide basic quality control are easy to derive and implement and could have a major impact on data quality for many studies. Statistical process control is a powerful tool for investigating and improving proteomics research workflows. The process of characterising measurement systems and defining control rules forces the exploration of key questions that can lead to significant improvements in performance. This work asserts that QC is essential to proteomics discovery experiments. Every experimenter must know the current capabilities of their measurement system and have an objective means of tracking and ensuring that performance. Proteomic analysis workflows are complicated and multivariate. QC is critical for clinical chemistry measurements, and huge strides have been made in ensuring the quality and validity of results in clinical biochemistry labs. This work introduces some of these QC concepts and works to bridge their use from single-analyte QC to applications in multi-analyte systems. This article is part of a Special Issue entitled: Standardization and Quality Control in Proteomics. Copyright © 2013 The Author.
Published by Elsevier B.V. All rights reserved.
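The control-rule idea described above can be sketched as a minimal Shewhart individuals chart; the baseline QC values and new runs below are hypothetical, and only the simplest Western Electric rule (a single point beyond the 3-sigma limits) is implemented:

```python
# Minimal Shewhart individuals chart, assuming a hypothetical series of
# QC measurements (e.g. a spiked standard measured with each batch).

def control_limits(baseline):
    """Estimate mean +/- 3 sigma limits from a stable baseline run."""
    n = len(baseline)
    mean = sum(baseline) / n
    sd = (sum((x - mean) ** 2 for x in baseline) / (n - 1)) ** 0.5
    return mean - 3 * sd, mean, mean + 3 * sd

def out_of_control(values, lcl, ucl):
    """Indices of points violating Western Electric rule 1
    (a single point outside the 3-sigma limits)."""
    return [i for i, v in enumerate(values) if v < lcl or v > ucl]

baseline = [10.1, 9.8, 10.0, 10.2, 9.9, 10.1, 10.0, 9.7, 10.3, 10.0]
lcl, mean, ucl = control_limits(baseline)
print(out_of_control([10.1, 9.9, 11.9, 10.0], lcl, ucl))  # third run drifts high
```

The further Western Electric rules (runs of points on one side of the mean, trends) follow the same pattern, applied over windows of recent points.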
High-efficiency cell concepts on low-cost silicon sheets
NASA Technical Reports Server (NTRS)
Bell, R. O.; Ravi, K. V.
1985-01-01
The limitations of sheet-growth material in terms of defect structure and minority carrier lifetime are discussed, and the effect of various defects on performance is estimated. Given these limitations, designs for a sheet-growth cell that make the best of the material characteristics are proposed, along with the achievement of optimum synergy between base material quality and device processing variables. A strong coupling exists between material quality, the variables during crystal growth, and device processing variables. Two objectives are outlined: (1) optimization of this coupling for maximum performance at minimal cost; and (2) decoupling of materials from processing by improving base material quality to make it less sensitive to processing variables.
Sears, Jeanne M; Wickizer, Thomas M; Franklin, Gary M; Cheadle, Allen D; Berkowitz, Bobbie
2007-08-01
The objectives of this study were 1) to identify quality and process of care indicators available in administrative workers' compensation data and to document their association with work disability outcomes, and 2) to use these indicators to assess whether nurse practitioners (NPs), recently authorized to serve as attending providers for injured workers in Washington State, performed differently than did primary care physicians (PCPs). Quality and process of care indicators for NP and PCP back injury claims from Washington State were compared using direct standardization and logistic regression. This study found little evidence of differences between NP and PCP claims in case mix or quality of care. The process of care indicators that we identified were highly associated with the duration of work disability and have potential for further development to assess and promote quality improvement.
NASA Technical Reports Server (NTRS)
Strand, Albert A.; Jackson, Darryl J.
1992-01-01
As the nation redefines priorities to deal with a rapidly changing world order, both government and industry require new approaches for oversight of management systems, particularly for high technology products. Declining defense budgets will lead to significant reductions in government contract management personnel. Concurrently, defense contractors are reducing administrative and overhead staffing to control costs. These combined pressures require bold approaches for the oversight of management systems. In the spring of 1991, the DPRO and TRW created a Process Action Team (PAT) to jointly prepare a Performance Based Management (PBM) system titled Teamwork for Oversight of Processes and Systems (TOPS). The primary goal is implementation of a performance based management system based on objective data to review critical TRW processes with an emphasis on continuous improvement. The processes are: Finance and Business Systems, Engineering and Manufacturing Systems, Quality Assurance, and Software Systems. The team established a number of goals: delivery of quality products to contractual terms and conditions; assurance that TRW management systems meet government guidance and good business practices; use of objective data to measure critical processes; elimination of wasteful and duplicative reviews and audits; emphasis on teamwork, in which all efforts must be perceived to add value by both sides and decisions are made by consensus; and synergy and the creation of a strong working trust between TRW and the DPRO. TOPS permits the adjustment of oversight resources when conditions change or when TRW system performance indicates that either an increase or a decrease in surveillance is appropriate. Monthly Contractor Performance Assessments (CPA) are derived from a summary of supporting system-level and process-level ratings obtained from objective process-level data.
Tiered, objective, data-driven metrics are highly successful in achieving a cooperative and effective method of measuring performance. The teamwork-based culture developed by TOPS proved an unequaled success in removing adversarial relationships and creating an atmosphere of continuous improvement in quality processes at TRW. The new working relationship does not decrease the responsibility or authority of the DPRO to ensure contract compliance, and it permits both parties to work more effectively to improve total quality and reduce cost. By emphasizing teamwork in developing a stronger approach to efficient management of the defense industrial base, TOPS is a singular success.
Hunter, Linda; Myles, Joanne; Worthington, James R; Lebrun, Monique
2011-01-01
This article discusses the background and process for developing a multi-year corporate quality plan. The Ottawa Hospital's goal is to be a top 10% performer in quality and patient safety in North America. In order to create long-term measurable and sustainable changes in the quality of patient care, The Ottawa Hospital embarked on the development of a three-year strategic corporate quality plan. This was accomplished by engaging the organization at all levels and defining quality frameworks, aligning with internal and external expectations, prioritizing strategic goals, articulating performance measurements and reporting to stakeholders while maintaining a transparent communication process. The plan was developed through an iterative process that engaged a broad base of health professionals, physicians, support staff, administration and senior management. A literature review of quality frameworks was undertaken, a Quality Plan Working Group was established, 25 key stakeholder interviews were conducted and 48 clinical and support staff consultations were held. The intent was to gather information on current quality initiatives and challenges encountered and to prioritize corporate goals and then create the quality plan. Goals were created and then prioritized through an affinity exercise. Action plans were developed for each goal and included objectives, tasks and activities, performance measures (structure, process and outcome), accountabilities and timelines. This collaborative methodology resulted in the development of a three-year quality plan. Six corporate goals were outlined by the tenets of the quality framework for The Ottawa Hospital: access to care, appropriate care (effective and efficient), safe care and satisfaction with care. Each of the six corporate goals identified objectives and supporting action plans with accountabilities outlining what would be accomplished in years one, two and three. 
The three-year quality plan was approved by senior management and the board in April 2009. This process has supported The Ottawa Hospital's journey of excellence through the creation of a quality plan that will enable long-term, measurable, and sustainable changes in the quality of patient care. It also engaged healthcare providers who aim to deliver measurably better patient care, aligned goals and outcomes through practitioner collaboration, and allowed for greater commitment by those responsible for achieving quality goals.
DOT National Transportation Integrated Search
2010-02-01
This project developed a methodology to couple a new pollutant dispersion model with a traffic : assignment process to contain air pollution while maximizing mobility. The overall objective of the air : quality modeling part of the project is to deve...
2002-09-30
Physical Modeling for Processing Geosynchronous Imaging Fourier Transform Spectrometer-Indian Ocean METOC Imager (GIFTS-IOMI) Hyperspectral Data...water quality assessment. OBJECTIVES The objective of this DoD research effort is to develop and demonstrate a fully functional GIFTS-IOMI...environment once GIFTS-IOMI is stationed over the Indian Ocean. The system will provide specialized methods for the characterization of the atmospheric
Quality assessment for color reproduction using a blind metric
NASA Astrophysics Data System (ADS)
Bringier, B.; Quintard, L.; Larabi, M.-C.
2007-01-01
This paper deals with image quality assessment, a field that nowadays plays an important role in various image processing applications. A number of objective image quality metrics, which correlate to varying degrees with subjective quality, have been developed during the last decade. Two categories of metrics can be distinguished: full-reference and no-reference. A full-reference metric tries to evaluate the distortion introduced to an image with respect to a reference. A no-reference approach attempts to model the judgment of image quality in a blind way. Unfortunately, a universal image quality model is not on the horizon, and empirical models established through psychophysical experimentation are generally used. In this paper, we focus only on the second category to evaluate the quality of color reproduction, introducing a blind metric based on human visual system modeling. The objective results are validated by single-media and cross-media subjective tests.
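To make the no-reference idea concrete, one simple blind cue for colour reproduction is a colourfulness index built from opponent-channel statistics, in the spirit of Hasler and Süsstrunk's measure; the sketch below is an illustrative stand-in computed on hypothetical pixel data, not the metric of the paper:

```python
# Blind (no-reference) colourfulness cue from opponent-channel statistics.
# Pixels are (r, g, b) tuples in 0..255; no reference image is needed.

def colourfulness(pixels):
    rg = [r - g for r, g, b in pixels]            # red-green opponent channel
    yb = [(r + g) / 2 - b for r, g, b in pixels]  # yellow-blue opponent channel

    def mean_sd(xs):
        m = sum(xs) / len(xs)
        return m, (sum((x - m) ** 2 for x in xs) / len(xs)) ** 0.5

    m_rg, s_rg = mean_sd(rg)
    m_yb, s_yb = mean_sd(yb)
    # combined spread plus a fraction of the combined mean offset
    return (s_rg**2 + s_yb**2) ** 0.5 + 0.3 * (m_rg**2 + m_yb**2) ** 0.5

grey = [(128, 128, 128)] * 16
vivid = [(255, 0, 0), (0, 255, 0), (0, 0, 255), (255, 255, 0)] * 4
print(colourfulness(grey) < colourfulness(vivid))  # vivid patch scores higher
```

A full blind metric for colour reproduction would combine several such cues with a model of human visual sensitivity; this block only shows why no reference image is required.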
Land Surface Process and Air Quality Research and Applications at MSFC
NASA Technical Reports Server (NTRS)
Quattrochi, Dale; Khan, Maudood
2007-01-01
This viewgraph presentation provides an overview of land surface process and air quality research at MSFC including atmospheric modeling and ongoing research whose objective is to undertake a comprehensive spatiotemporal analysis of the effects of accurate land surface characterization on atmospheric modeling results, and public health applications. Land use maps as well as 10 meter air temperature, surface wind, PBL mean difference heights, NOx, ozone, and O3+NO2 plots as well as spatial growth model outputs are included. Emissions and general air quality modeling are also discussed.
NASA Astrophysics Data System (ADS)
Sato, Takashi; Honma, Michio; Itoh, Hiroyuki; Iriki, Nobuyuki; Kobayashi, Sachiko; Miyazaki, Norihiko; Onodera, Toshio; Suzuki, Hiroyuki; Yoshioka, Nobuyuki; Arima, Sumika; Kadota, Kazuya
2009-04-01
The category and objectives of DFM production management are presented. DFM is not limited to activity within a particular unit process in design or manufacturing; a new framework for DFM is required. DFM should be a total solution for the problems common to all processes, each linked organically to the others. After passing through every process on the manufacturing platform, the quality of the final products is guaranteed and the products are shipped to the market. The information platform is layered with DFM, APC, and AEC. Advanced DFM is not DFM for partial optimization of the lithography process and the design; it should be Organized DFM, managed with high-level organizational IQ. The interim quality between each step of the flow should be visualized. DFM becomes quality engineering when it is Organized DFM and common metrics of quality are provided, through effective implementation of common industrial metrics and standardized technology. DFM is a differentiating technology, but it can leverage standards for efficient development.
USDA-ARS?s Scientific Manuscript database
The objective was to determine the predictive abilities of HCW for loin, ham, and belly quality of 7,684 pigs with carcass weights ranging from 53.2 to 129.6 kg. Carcass composition, subjective loin quality, and ham face color were targeted on all carcasses, whereas in-plant instrumental loin color ...
ERIC Educational Resources Information Center
Everett, James; Gershwin, Mary; Hayes, Homer; Jacobs, James; Mundhenk, Robert
Although objectively measurable achievement of outcomes is an important guide to the quality of education, the process of defining and assuring the quality of technical education and training must include consideration for the context in which technical education and training occurs. It is also critical to remember that education has two sets of…
42 CFR 37.44 - Approval of radiographic facilities that use digital radiography systems.
Code of Federal Regulations, 2013 CFR
2013-10-01
... image acquisition, digitization, processing, compression, transmission, display, archiving, and... quality digital chest radiographs by submitting to NIOSH digital radiographic image files of a test object... digital radiographic image files from six or more sample chest radiographs that are of acceptable quality...
42 CFR 37.44 - Approval of radiographic facilities that use digital radiography systems.
Code of Federal Regulations, 2012 CFR
2012-10-01
... image acquisition, digitization, processing, compression, transmission, display, archiving, and... quality digital chest radiographs by submitting to NIOSH digital radiographic image files of a test object... digital radiographic image files from six or more sample chest radiographs that are of acceptable quality...
NASA Technical Reports Server (NTRS)
Basili, V. R.
1981-01-01
Work on metrics is discussed. Factors that affect software quality are reviewed. Metrics are discussed in terms of criteria achievement, reliability, and fault tolerance. Subjective and objective metrics are distinguished. Product/process and cost/quality metrics are characterized and discussed.
NASA Astrophysics Data System (ADS)
Rosyidi, C. N.; Jauhari, WA; Suhardi, B.; Hamada, K.
2016-02-01
Quality improvement must be performed in a company to maintain the competitiveness of its products in the market. The goal of such improvement is to increase customer satisfaction and the profitability of the company. In current practice, a company needs several suppliers to provide the components for the assembly of a final product, so quality improvement of the final product must involve the suppliers. In this paper, an optimization model to allocate variance reduction is developed. Variance reduction is an important element of quality improvement for both the manufacturer and the suppliers. To improve the quality of the suppliers' components, the manufacturer must invest part of its financial resources in the suppliers' learning processes. The objective function of the model minimizes the total cost, consisting of the investment cost and both internal and external quality costs. The learning curve determines how the suppliers' employees respond to the learning processes in reducing component variance.
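The trade-off described can be illustrated with a toy cost model (not the paper's actual formulation): the manufacturer picks a fractional variance reduction x for one supplier, balancing a Taguchi-style quality loss that falls with remaining variance against an investment cost that grows steeply as complete reduction is approached; all coefficients are hypothetical:

```python
# Toy trade-off: quality loss falls linearly with the variance reduction
# fraction x, while investment cost grows convexly as x -> 1.
# All parameters are hypothetical stand-ins for the paper's model.

def total_cost(x, sigma2=4.0, k_loss=50.0, k_inv=30.0):
    quality_loss = k_loss * sigma2 * (1 - x)  # loss ~ remaining variance
    investment = k_inv * x / (1 - x)          # diverges near full reduction
    return quality_loss + investment

# grid search over reduction levels 0.00 .. 0.99
best_cost, best_x = min((total_cost(x / 100), x / 100) for x in range(100))
```

The interior optimum (neither zero nor complete reduction) is the point of the exercise: partial variance reduction minimizes total cost once investment is priced in.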
Staccini, P; Joubert, M; Quaranta, J F; Fieschi, D; Fieschi, M
2000-01-01
Hospital information systems have to support quality improvement objectives. The design issues of health care information system can be classified into three categories: 1) time-oriented and event-labelled storage of patient data; 2) contextual support of decision-making; 3) capabilities for modular upgrading. The elicitation of the requirements has to meet users' needs in relation to both the quality (efficacy, safety) and the monitoring of all health care activities (traceability). Information analysts need methods to conceptualize clinical information systems that provide actors with individual benefits and guide behavioural changes. A methodology is proposed to elicit and structure users' requirements using a process-oriented analysis, and it is applied to the field of blood transfusion. An object-oriented data model of a process has been defined in order to identify its main components: activity, sub-process, resources, constraints, guidelines, parameters and indicators. Although some aspects of activity, such as "where", "what else", and "why" are poorly represented by the data model alone, this method of requirement elicitation fits the dynamic of data input for the process to be traced. A hierarchical representation of hospital activities has to be found for this approach to be generalised within the organisation, for the processes to be interrelated, and for their characteristics to be shared.
Staccini, P; Joubert, M; Quaranta, J F; Fieschi, D; Fieschi, M
2001-12-01
Healthcare institutions are looking at ways to increase their efficiency by reducing costs while providing care services with a high level of safety. Thus, hospital information systems have to support quality improvement objectives. The elicitation of the requirements has to meet users' needs in relation to both the quality (efficacy, safety) and the monitoring of all health care activities (traceability). Information analysts need methods to conceptualise clinical information systems that provide actors with individual benefits and guide behavioural changes. A methodology is proposed to elicit and structure users' requirements using a process-oriented analysis, and it is applied to the blood transfusion process. An object-oriented data model of a process has been defined in order to organise the data dictionary. Although some aspects of activity, such as 'where', 'what else', and 'why' are poorly represented by the data model alone, this method of requirement elicitation fits the dynamic of data input for the process to be traced. A hierarchical representation of hospital activities has to be found for the processes to be interrelated, and for their characteristics to be shared, in order to avoid data redundancy and to fit the gathering of data with the provision of care.
A perspective on the FAA approval process: Integrating rotorcraft displays, controls and workload
NASA Technical Reports Server (NTRS)
Green, David L.; Hart, Jake; Hwoschinsky, Peter
1993-01-01
The FAA is responsible for making the determination that a helicopter is safe for IFR operations in the National Airspace System (NAS). This involves objective and subjective evaluations of cockpit displays, flying qualities, procedures and human factors as they affect performance and workload. After all of the objective evaluations are completed, and all Federal Regulations have been met, FAA pilots make the final subjective judgement as to suitability for use by civil pilots in the NAS. The paper uses the flying qualities and pilot workload characteristics of a small helicopter to help examine the FAA pilot's involvement in this process. The result highlights the strengths of the process and its importance to the approval of new aircraft and equipment for civil IFR helicopter applications. The paper also identifies opportunities for improvement.
NASA Astrophysics Data System (ADS)
Chen, Zhenzhong; Han, Junwei; Ngan, King Ngi
2005-10-01
MPEG-4 treats a scene as a composition of several objects, or so-called video object planes (VOPs), that are separately encoded and decoded. Such a flexible video coding framework makes it possible to code different video objects at different distortion scales. It is necessary to analyze the priority of the video objects according to their semantic importance, intrinsic properties, and psycho-visual characteristics so that the bit budget can be distributed properly among video objects to improve the perceptual quality of the compressed video. This paper aims to provide an automatic video object priority definition method based on an object-level visual attention model and further proposes an optimization framework for video object bit allocation. One significant contribution of this work is that human visual system characteristics are incorporated into the video coding optimization process. Another advantage is that the priority of a video object can be obtained automatically instead of fixing weighting factors before encoding or relying on user interactivity. To evaluate the performance of the proposed approach, we compare it with traditional verification-model bit allocation and optimal multiple video object bit allocation algorithms. Compared with traditional bit allocation algorithms, the objective quality of objects with higher priority is significantly improved under this framework. These results demonstrate the usefulness of this unsupervised subjective quality lifting framework.
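The allocation step can be sketched in its simplest form: distribute a frame's bit budget across VOPs in proportion to priority weights. The weights here are fixed by hand for illustration, whereas in the paper they are derived automatically from the attention model:

```python
# Toy priority-weighted bit allocation across video object planes (VOPs).
# Integer weights stand in for attention-model priorities, which the
# paper derives automatically rather than fixing by hand.

def allocate_bits(budget, weights):
    total = sum(weights)
    return [budget * w // total for w in weights]

# e.g. a 10,000-bit frame budget over face, moving object, background
print(allocate_bits(10_000, [5, 3, 2]))  # -> [5000, 3000, 2000]
```

A rate-distortion-optimal scheme would instead weight each object's distortion by its priority inside the encoder's Lagrangian optimization; the proportional split above only illustrates how priority shapes the budget.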
Chopra, Vikram; Bairagi, Mukesh; Trivedi, P; Nagar, Mona
2012-01-01
Statistical process control is the application of statistical methods to the measurement and analysis of process variation. Various regulatory documents, such as the Validation Guidance for Industry (2011), International Conference on Harmonisation ICH Q10 (2009), the Health Canada guidelines (2009), the Health Science Authority (Singapore) Guidance for Product Quality Review (2008), and ISO 9000:2005, support the application of statistical process control for better process control and understanding. In this study, risk assessments, normal probability distributions, control charts, and capability charts are employed for the selection of critical quality attributes and the determination of normal probability distribution, statistical stability, and capability of production processes, respectively. The objective of this study is to determine tablet production process quality in the form of sigma process capability. By interpreting data and graph trends, forecasting of critical quality attributes, sigma process capability, and stability of the process were studied. The overall study contributes to an assessment of the process at the sigma level with respect to out-of-specification attributes produced. Finally, the study points to areas where applying quality improvement and quality risk assessment principles can achieve six sigma-capable processes. Statistical process control is the most advantageous tool for determining the quality of any production process, and it is new to the pharmaceutical tablet production process. In the case of pharmaceutical tablet production processes, the quality control parameters act as quality assessment parameters, and the application of risk assessment provides the selection of critical quality attributes from among them.
Sequential application of normality distributions, control charts, and capability analyses provides a valid statistical process control study of the process. Interpretation of such a study provides information about stability, process variability, changing trends, and quantification of process ability against defective production. Comparative evaluation of critical quality attributes by Pareto charts identifies the least capable and most variable process, which is the one most in need of improvement. Statistical process control thus proves to be an important tool for six sigma-capable process development and continuous quality improvement.
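The capability and sigma calculations this abstract describes can be sketched in a few lines; the tablet-weight data, specification limits, and the short-term convention sigma ≈ 3·Cpk below are illustrative assumptions, not values from the study:

```python
import statistics

def process_capability(samples, lsl, usl):
    """Estimate Cp, Cpk, and an approximate short-term sigma level.

    Assumes the measurements are approximately normally distributed,
    which is what the normal probability plots in such a study check.
    """
    mean = statistics.fmean(samples)
    sd = statistics.stdev(samples)                 # sample standard deviation
    cp = (usl - lsl) / (6 * sd)                    # potential capability
    cpk = min(usl - mean, mean - lsl) / (3 * sd)   # capability adjusted for centring
    sigma_level = 3 * cpk                          # common short-term convention
    return cp, cpk, sigma_level

# Hypothetical tablet weights (mg) against specification limits 95-105 mg
weights = [99.8, 100.2, 100.5, 99.6, 100.1, 100.4, 99.9, 100.0, 100.3, 99.7]
cp, cpk, sigma = process_capability(weights, lsl=95.0, usl=105.0)
```

Under this convention, a Cpk of about 2.0 or more corresponds to a six sigma-capable process.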
System of error detection in the manufacture of garments using artificial vision
NASA Astrophysics Data System (ADS)
Moreno, J. J.; Aguila, A.; Partida, E.; Martinez, C. L.; Morales, O.; Tejeida, R.
2017-12-01
A computer vision system is implemented to detect errors in the cutting stage of the garment manufacturing process in the textile industry. It provides a solution for errors within the process that cannot be easily detected by an employee, in addition to significantly increasing the speed of quality review. In the textile industry, as in many others, quality control of manufactured products is required, and over the years it has been carried out manually by visual inspection by employees. For this reason, the objective of this project is to design a quality control system using computer vision to identify errors in the cutting stage of the garment manufacturing process, increasing the productivity of textile processes while reducing costs.
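One simple way to flag a cutting error of the kind such a system targets is to compare a binarized camera image of the cut piece against the pattern template. The masks, threshold, and intersection-over-union criterion below are hypothetical illustrations, not the paper's actual algorithm:

```python
def mask_iou(template, cut):
    """Intersection-over-union between a pattern template mask and a cut-piece mask.

    Both masks are same-sized 2-D lists of 0/1 pixels from a binarized camera
    image; a low IoU indicates the cut deviates from the pattern.
    """
    inter = union = 0
    for t_row, c_row in zip(template, cut):
        for t, c in zip(t_row, c_row):
            inter += t & c
            union += t | c
    return inter / union if union else 1.0

TEMPLATE = [[0, 1, 1, 0],
            [1, 1, 1, 1],
            [0, 1, 1, 0]]
GOOD_CUT = [[0, 1, 1, 0],
            [1, 1, 1, 1],
            [0, 1, 1, 0]]
BAD_CUT  = [[0, 1, 0, 0],
            [1, 1, 0, 0],
            [0, 1, 0, 0]]   # material missing on the right side

THRESHOLD = 0.9  # hypothetical acceptance threshold
good_passes = mask_iou(TEMPLATE, GOOD_CUT) >= THRESHOLD
bad_passes = mask_iou(TEMPLATE, BAD_CUT) >= THRESHOLD
```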
QUALITY MANAGEMENT PLAN FOR THE NATIONAL CHILDREN'S STUDY
EPA has taken the lead, in concert with NIH, in developing the Quality Management Plan (QMP) for the National Children's Study (NCS); the QMP will delineate a systematic planning process for the implementation of the NCS. The QMP will state the goals and objectives of the NCS, th...
Case for Quality Assurance in ESP [English For Specific Purposes] Programmes.
ERIC Educational Resources Information Center
Tan San Yee, Christine
There is now a need, just as in industry, for quality assurance in education: for injecting systematically planned and formal processes, precise definitions, objectivity, and measurability into education. The demand for educational excellence in industry is "out there," and companies in more advanced countries are partnering educational…
Quality assurance and accreditation.
1997-01-01
In 1996, the Joint Commission International (JCI), which is a partnership between the Joint Commission on Accreditation of Healthcare Organizations and Quality Healthcare Resources, Inc., became one of the contractors of the Quality Assurance Project (QAP). JCI recognizes the link between accreditation and quality, and uses a collaborative approach to help a country develop national quality standards that will improve patient care, satisfy patient-centered objectives, and serve the interest of all affected parties. The implementation of good standards provides support for the good performance of professionals, introduces new ideas for improvement, enhances the quality of patient care, reduces costs, increases efficiency, strengthens public confidence, improves management, and enhances the involvement of the medical staff. Such good standards are objective and measurable; achievable with current resources; adaptable to different institutions and cultures; and demonstrate autonomy, flexibility, and creativity. The QAP offers the opportunity to approach accreditation through research efforts, training programs, and regulatory processes. QAP work in the area of accreditation has been targeted for Zambia, where the goal is to provide equal access to cost-effective, quality health care; Jordan, where a consensus process for the development of standards, guidelines, and policies has been initiated; and Ecuador, where JCI has been asked to help plan an approach to the evaluation and monitoring of the health care delivery system.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sailer, S.J.
This Quality Assurance Project Plan (QAPjP) specifies the quality of data necessary and the characterization techniques employed at the Idaho National Engineering Laboratory (INEL) to meet the objectives of the Department of Energy (DOE) Waste Isolation Pilot Plant (WIPP) Transuranic Waste Characterization Quality Assurance Program Plan (QAPP) requirements. This QAPjP is written to conform with the requirements and guidelines specified in the QAPP and the associated documents referenced in the QAPP. This QAPjP is one of a set of five interrelated QAPjPs that describe the INEL Transuranic Waste Characterization Program (TWCP). Each of the five facilities participating in the TWCP has a QAPjP that describes the activities applicable to that particular facility. This QAPjP describes the roles and responsibilities of the Idaho Chemical Processing Plant (ICPP) Analytical Chemistry Laboratory (ACL) in the TWCP. Data quality objectives and quality assurance objectives are explained. Sample analysis procedures and associated quality assurance measures are also addressed; these include sample chain of custody; data validation, usability, and reporting; documentation and records; audits and assessments; laboratory QC samples; and instrument testing, inspection, maintenance, and calibration. Finally, administrative quality control measures, such as document control, control of nonconformances and variances, and QA status reporting, are described.
Process perspective on image quality evaluation
NASA Astrophysics Data System (ADS)
Leisti, Tuomas; Halonen, Raisa; Kokkonen, Anna; Weckman, Hanna; Mettänen, Marja; Lensu, Lasse; Ritala, Risto; Oittinen, Pirkko; Nyman, Göte
2008-01-01
The psychological complexity of multivariate image quality evaluation makes it difficult to develop general image quality metrics. Quality evaluation includes several mental processes, and ignoring these processes and using only a few test images can lead to biased results. Using a qualitative/quantitative (Interpretation Based Quality, IBQ) methodology, we examined the process of pair-wise comparison in a setting where the quality of images printed by a laser printer on different paper grades was evaluated. The test image consisted of a picture of a table covered with several objects. Three other images were also used: photographs of a woman, a cityscape, and a countryside. In addition to the pair-wise comparisons, observers (N=10) were interviewed about the subjective quality attributes they used in making their quality decisions. An examination of the individual pair-wise comparisons revealed serious inconsistencies in observers' evaluations of the test image content, but not of the other contents. The qualitative analysis showed that this inconsistency was due to the observers' focus of attention. The lack of an easily recognizable context in the test image may have contributed to this inconsistency. To obtain reliable knowledge of the effect of image context or attention on subjective image quality, a qualitative methodology is needed.
STARS Proceedings (3-4 December 1991)
1991-12-04
PROJECT PROCESS OBJECTIVES & ASSOCIATED METRICS: Prioritize ECPs: complexity & error-history measures. Make vs Buy decisions: Effort & Quality (or...history measures, error-proneness and past histories of trouble with particular modules are very useful measures. Make vs Buy decisions: Does the...Effort offset the gain in Quality relative to buy ... Effort and Quality (or defect rate) histories give helpful indications of how to make this decision
Impact of sex on composition and quality of fresh loins, bellies, and fresh and processed hams
USDA-ARS?s Scientific Manuscript database
The objective was to characterize the effect of sex and selection focus on primal quality. Pigs (N=7,672) from a lean growth selection [n=1,468 barrows (LB); n=2,151 gilts (LG)] or superior meat quality selection [n=1,895 barrows (QB); n=2,158 gilts (QG)] focus were slaughtered in 3 marketing groups...
ERIC Educational Resources Information Center
Spaulding, Trent Joseph
2011-01-01
The objective of this research is to understand how a set of systems, as defined by the business process, creates value. The three studies contained in this work develop the model of process-based automation. The model states that complementarities among systems are specified by handoffs in the business process. The model also provides theory to…
Measuring, managing and maximizing refinery performance
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bascur, O.A.; Kennedy, J.P.
1996-01-01
Implementing continuous quality improvement is a confluence of total quality management, people empowerment, performance indicators and information engineering. Supporting information technologies allow a refiner to narrow the gap between management objectives and the process control level. Dynamic performance monitoring benefits come from production cost savings, improved communications and enhanced decision making. A refinery workgroup information flow model helps automate continuous improvement of processes, performance and the organization. The paper discusses the rethinking of refinery operations, dynamic performance monitoring, continuous process improvement, the knowledge coordinator and repository manager, an integrated plant operations workflow, and successful implementation.
Rizvi, Zainab; Usmani, Rabia Arshed; Rizvi, Amna; Wazir, Salim; Zahra, Taskeen; Rasool, Hafza
2017-01-01
Quality of any service is the most important aspect for the provider as well as the consumer. The primary objective of any nation's health system is to provide supreme-quality health care services to its patients. The objective of this study was to assess the quality of the diagnostic fine needle aspiration cytology service in a tertiary care hospital. As patients' perspectives provide valuable information on the quality of a process, patients' perceptions in terms of satisfaction with the service were measured. In this cross-sectional analytical study, 291 patients undergoing fine needle aspiration cytology in Mayo Hospital were selected by a systematic sampling technique. Information regarding patient satisfaction with four dimensions of the service quality process, namely procedure, sterilization, conduct, and competency of the doctor, was collected through interviews using a questionnaire. The questionnaire was developed on the SERVQUAL model, a measurement tool for quality assessment of services provided to patients. All items were assessed on a 2-point Likert scale (0 = dissatisfied, 1 = satisfied). Frequencies and percentages of satisfied and dissatisfied patients were recorded for each item, and all items in each dimension were scored. If the percentage of the sum of all item scores of a dimension was ≥60, the dimension was of 'good quality'; below 60%, it was a 'poor quality' dimension. Data were analysed using Epi Info 3.5.1. Fisher's test was applied to check statistical significance (p-value <0.05). Of the 4 dimensions of the service quality process, procedure (48.8%), sterilization (51.5%), and practitioner conduct (50.9%) were perceived as 'poor' by the patients. Only practitioner competency (67.4%) was perceived as 'good'. Comparison of the service quality dimension scores with the overall level of patient satisfaction revealed that all 4 dimensions were significantly related to patient dissatisfaction (p<.05).
The study suggests that the service quality of therapeutic and diagnostic procedures in public hospitals should be routinely monitored from the patients' point of view, as most aspects of service quality in public hospitals of Pakistan require improvement. In this manner, patient satisfaction with the use of services in public hospitals can be improved.
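The dimension-scoring rule the abstract describes (percentage of satisfied item responses, with ≥60% counted as 'good quality') can be sketched as follows; the response data are invented for illustration:

```python
def dimension_quality(responses):
    """Score one SERVQUAL dimension from binary satisfaction responses.

    `responses` pools the 0/1 item scores (0 = dissatisfied, 1 = satisfied)
    across patients and items; per the abstract's rule, >= 60% satisfied
    makes the dimension 'good quality', otherwise 'poor'.
    """
    pct = 100 * sum(responses) / len(responses)
    return pct, ("good" if pct >= 60 else "poor")

# Invented pooled responses for two of the four dimensions
procedure = [1, 0, 1, 0, 0, 1, 0, 1, 0, 0]
competency = [1, 1, 1, 0, 1, 1, 0, 1, 1, 1]
proc_pct, proc_label = dimension_quality(procedure)
comp_pct, comp_label = dimension_quality(competency)
```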
Navon, David
2011-03-01
Though figure-ground assignment has been shown to be likely affected by recognizability, it seems sensible that object recognition must follow at least the earlier process of figure-ground segregation. To examine whether rudimentary object recognition could, counterintuitively, start even before completion of the parsing stage in which figure-ground segregation is done, participants were asked to respond, in a go/no-go fashion, whenever any of 16 alternative connected patterns (which constituted familiar stimuli in the upright orientation) appeared. The white figure of the to-be-attended stimulus (target or foil) could be segregated from the white ambient ground only by means of a frame surrounding it. Such a frame was absent until the onset of the target display. Then, to manipulate organizational quality, the greyness of the frame was either gradually increased from zero (in Experiment 1) or changed abruptly to a stationary level whose greyness varied between trials (in Experiments 2 and 3). Stimulus recognizability was manipulated by orientation angle. In all three experiments the effect of recognizability was found to be considerably larger when organizational quality was minimal due to an extremely faint frame. This result is argued to be incompatible with any version of a serial thesis in which processing aimed at object recognition starts only with a good enough level of organizational quality. The experiments rather provide some support for the claim, termed here the "early interaction hypothesis", positing interaction between early recognition processing and preassignment parsing processes.
Quality of Care for Work-associated Carpal Tunnel Syndrome
Nuckols, Teryl; Conlon, Craig; Robbins, Michael; Dworsky, Michael; Lai, Julie; Roth, Carol P.; Levitan, Barbara; Seabury, Seth; Seelam, Rachana; Asch, Steven M.
2017-01-01
Objective: To evaluate the quality of care provided to individuals with workers' compensation claims related to CTS and identify patient characteristics associated with receiving better care. Methods: We recruited subjects with new claims for CTS from 30 occupational clinics affiliated with Kaiser Permanente Northern California. We applied 45 process-oriented quality measures to 477 subjects' medical records, and performed multivariate logistic regression to identify patient characteristics associated with quality. Results: Overall, 81.6% of care adhered to recommended standards. Certain tasks related to assessing and managing activity were underused. Patients with classic/probable Katz diagrams, positive electrodiagnostic tests, and higher incomes received better care; however, age, gender, and race/ethnicity were not associated with quality. Conclusions: Care processes for work-associated CTS frequently adhered to quality measures. Clinical factors were more strongly associated with quality than demographic and socioeconomic ones. PMID:28045797
Data-quality measures for stakeholder-implemented watershed-monitoring programs
Greve, Adrienne I.
2002-01-01
Community-based watershed groups, many of which collect environmental data, have steadily increased in number over the last decade. The data generated by these programs are often underutilized due to uncertainty in the quality of data produced. The incorporation of data-quality measures into stakeholder monitoring programs lends statistical validity to data. Data-quality measures are divided into three steps: quality assurance, quality control, and quality assessment. The quality-assurance step attempts to control sources of error that cannot be directly quantified. This step is part of the design phase of a monitoring program and includes clearly defined, quantifiable objectives, sampling sites that meet the objectives, standardized protocols for sample collection, and standardized laboratory methods. Quality control (QC) is the collection of samples to assess the magnitude of error in a data set due to sampling, processing, transport, and analysis. In order to design a QC sampling program, a series of issues needs to be considered: (1) potential sources of error, (2) the type of QC samples, (3) inference space, (4) the number of QC samples, and (5) the distribution of the QC samples. Quality assessment is the process of evaluating quality-assurance measures and analyzing the QC data in order to interpret the environmental data. Quality assessment has two parts: one that is conducted on an ongoing basis as the monitoring program is running, and one that is conducted during the analysis of environmental data. The discussion of the data-quality measures is followed by an example of their application to a monitoring program in the Big Thompson River watershed of northern Colorado.
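The precision measure mentioned at the end, the coefficient of variation of replicate QC samples, is straightforward to compute; the triplicate nitrate values below are hypothetical:

```python
import statistics

def triplicate_cv(values):
    """Coefficient of variation (%) for one set of replicate QC samples.

    Quantifies the combined error from sampling, processing, transport,
    and analysis during the quality-assessment step.
    """
    return 100 * statistics.stdev(values) / statistics.fmean(values)

# Hypothetical triplicate nitrate results (mg/L) from one site visit
cv = triplicate_cv([1.02, 0.98, 1.00])
```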
Peering into peer review: Galileo, ESP, Dr Scott Reuben, and advancing our professional evolution.
Biddle, Chuck
2011-10-01
The fundamental purpose of peer review is quality control that facilitates the introduction of information into our discipline; information that is essential to the care of patients who require anesthesia services. While the AANA Journal relies heavily on this process to maintain the overall quality of our scholarly literature, it may fail that objective under certain conditions. This editorial serves to inform readers of the nature and goals of the peer review process.
USDA-ARS?s Scientific Manuscript database
The objective was to quantify the effect of marketing group (MG) on the variability of primal quality. Pigs (N=7,684) were slaughtered in 3 MGs from 8 barns. Pigs were from genetic selection programs focused on lean growth (L; group 1 n=1,131; group 2 n=1,466; group 3 n=1,030) or superior meat qua...
Effects of image processing on the detective quantum efficiency
NASA Astrophysics Data System (ADS)
Park, Hye-Suk; Kim, Hee-Joung; Cho, Hyo-Min; Lee, Chang-Lae; Lee, Seung-Wan; Choi, Yu-Na
2010-04-01
Digital radiography has gained popularity in many areas of clinical practice. This transition brings interest in advancing methodologies for image quality characterization. However, because these methodologies have not been standardized, the results of such studies cannot be directly compared. The primary objective of this study was to standardize methodologies for image quality characterization. The secondary objective was to evaluate how the image processing algorithm affects the modulation transfer function (MTF), noise power spectrum (NPS), and detective quantum efficiency (DQE). These image performance parameters were evaluated using the RQA5 radiographic techniques defined by the international electro-technical commission standard (IEC 62220-1). Computed radiography (CR) images of a hand posterior-anterior (PA) view for measuring signal-to-noise ratio (SNR), a slit image for measuring MTF, and a uniform (white) image for measuring NPS were obtained, and various Multi-Scale Image Contrast Amplification (MUSICA) parameters were applied to each of the acquired images. All of the modifications considerably influenced the evaluation of SNR, MTF, NPS, and DQE. Images modified by post-processing had higher DQE than the MUSICA=0 image. This suggests that MUSICA values, as a post-processing step, affect the image when it is evaluated for image quality. In conclusion, the control parameters of image processing should be accounted for when characterizing image quality in a consistent way. The results of this study can serve as a baseline for evaluating imaging systems and their imaging characteristics by measuring MTF, NPS, and DQE.
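The three metrics are linked by the standard IEC 62220-1 relation DQE(f) = MTF(f)² / (q · NNPS(f)), where NNPS is the normalized NPS and q is the incident photon fluence. A minimal sketch follows; the MTF, NNPS, and fluence values are illustrative assumptions, not measurements from this study:

```python
def dqe(mtf, nnps, q):
    """Frequency-by-frequency DQE from measured MTF and normalized NPS.

    Implements DQE(f) = MTF(f)^2 / (q * NNPS(f)), where q is the incident
    photon fluence in photons/mm^2.  All numbers below are illustrative.
    """
    return [m * m / (q * n) for m, n in zip(mtf, nnps)]

# Assumed values at spatial frequencies 0.5, 1.0, and 2.0 cycles/mm
mtf_vals = [0.90, 0.75, 0.45]
nnps_vals = [2.5e-5, 2.0e-5, 1.5e-5]   # normalized NPS (mm^2)
q = 100000.0                           # photons/mm^2 (illustrative fluence)
dqe_vals = dqe(mtf_vals, nnps_vals, q)
```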
ERIC Educational Resources Information Center
Holt, Maurice
1995-01-01
The central idea in W. Edwards Deming's approach to quality management is the need to improve process. Outcome-based education's central defect is its failure to address process. Deming would reject OBE along with management-by-objectives. Education is not a product defined by specific output measures, but a process to develop the mind. (MLH)
NASA Astrophysics Data System (ADS)
Srinivasagupta, Deepak; Kardos, John L.
2004-05-01
Injected pultrusion (IP) is an environmentally benign continuous process for low-cost manufacture of prismatic polymer composites. IP has been of recent regulatory interest as an option to achieve significant vapour emissions reduction. This work describes the design of the IP process with multiple design objectives. In our previous work (Srinivasagupta D et al 2003 J. Compos. Mater. at press), an algorithm for economic design using a validated three-dimensional physical model of the IP process was developed, subject to controllability considerations. In this work, this algorithm was used in a multi-objective optimization approach to simultaneously meet economic, quality-related, and environmental objectives. The retrofit design of a bench-scale set-up was considered, and the concept of exergy loss in the process, as well as in vapour emission, was introduced. The multi-objective approach was able to determine the optimal values of the processing parameters, such as heating zone temperatures and resin injection pressure, as well as the equipment specifications (die dimensions, heater, puller, and pump ratings) that satisfy the various objectives in a weighted sense and result in enhanced throughput rates. The economic objective did not coincide with the environmental objective, and a compromise became necessary. It was seen that most of the exergy loss is in the conversion of electric power into process heating. Vapour exergy loss was observed to be negligible for the most part.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bell, L.; Castaldi, A.; Jones, C.
The ultimate goal of the project is to develop procedures, techniques, data, and other information that will aid in the design of cost-effective and energy-efficient drying processes that produce high-quality foods. This objective has been pursued by performing studies to determine the pertinent properties of food products, by developing models to describe the fundamental phenomena of food drying, and by testing the models at laboratory scale. Finally, this information is used to develop recommendations and strategies for improved dryer design and control. This volume emphasizes a detailed literature review and several extensive experimental studies. Since the basic principle of food dehydration is the removal of water from food, the process of removing water causes quality changes, which can be categorized as physical, chemical, and nutritional. These changes often have adverse effects on the quality of the resulting dehydrated food. In this work, the types of physical and chemical changes common in food drying and the important factors behind them were reviewed. Pertinent kinetic models and kinetic data reported in the literature were also collected and compiled as part of the review. The overall objectives of this study were to identify the major quality changes in foods caused by the drying process and to understand the relationship between these quality changes and the factors known to affect them. The quality parameters reviewed included browning, lipid oxidation, color loss, shrinkage, solubility, texture, aroma and flavor, vitamin and protein loss, and microbiological concerns. 54 refs., 74 figs., 49 tabs.
National Quality Measures for Child Mental Health Care: Background, Progress, and Next Steps
Murphy, J. Michael; Scholle, Sarah Hudson; Hoagwood, Kimberly Eaton; Sachdeva, Ramesh C.; Mangione-Smith, Rita; Woods, Donna; Kamin, Hayley S.; Jellinek, Michael
2013-01-01
OBJECTIVE: To review recent health policies related to measuring child health care quality, the selection processes of national child health quality measures, the nationally recommended quality measures for child mental health care and their evidence strength, the progress made toward developing new measures, and early lessons learned from these national efforts. METHODS: Methods used included description of the selection process of child health care quality measures from 2 independent national initiatives, the recommended quality measures for child mental health care, and the strength of scientific evidence supporting them. RESULTS: Of the child health quality measures recommended or endorsed during these national initiatives, only 9 unique measures were related to child mental health. CONCLUSIONS: The development of new child mental health quality measures poses methodologic challenges that will require a paradigm shift to align research with its accelerated pace. PMID:23457148
DOE Office of Scientific and Technical Information (OSTI.GOV)
Haney, Thomas Jay
This document describes the process used to develop data quality objectives for the Idaho National Laboratory (INL) Environmental Soil Monitoring Program in accordance with U.S. Environmental Protection Agency guidance. This document also develops and presents the logic that was used to determine the specific number of soil monitoring locations at the INL Site, at locations bordering the INL Site, and at locations in the surrounding regional area. The monitoring location logic follows the guidance from the U.S. Department of Energy for environmental surveillance of its facilities.
Tavakol, Mohsen; Dennick, Reg
2012-01-01
As great emphasis is rightly placed upon the importance of assessment in judging the quality of our future healthcare professionals, it is appropriate not only to choose the most appropriate assessment method, but also to continually monitor the quality of the tests themselves, in the hope that we may continually improve the process. This article stresses the importance of quality control mechanisms in the exam cycle and briefly outlines some of the key psychometric concepts, including reliability measures, factor analysis, generalisability theory, and item response theory. The importance of such analyses for standard-setting procedures is emphasised. This article also accompanies two new AMEE Guides in Medical Education (Tavakol M, Dennick R. Post-examination Analysis of Objective Tests: AMEE Guide No. 54; and Tavakol M, Dennick R. 2012. Post-examination analysis of objective test data: Monitoring and improving the quality of high-stakes examinations: AMEE Guide No. 66), which provide the reader with practical examples of analysis and interpretation in order to help develop valid and reliable tests.
A decision-support system for the analysis of clinical practice patterns.
Balas, E A; Li, Z R; Mitchell, J A; Spencer, D C; Brent, E; Ewigman, B G
1994-01-01
Several studies documented substantial variation in medical practice patterns, but physicians often do not have adequate information on the cumulative clinical and financial effects of their decisions. The purpose of developing an expert system for the analysis of clinical practice patterns was to assist providers in analyzing and improving the process and outcome of patient care. The developed QFES (Quality Feedback Expert System) helps users in the definition and evaluation of measurable quality improvement objectives. Based on objectives and actual clinical data, several measures can be calculated (utilization of procedures, annualized cost effect of using a particular procedure, and expected utilization based on peer-comparison and case-mix adjustment). The quality management rules help to detect important discrepancies among members of the selected provider group and compare performance with objectives. The system incorporates a variety of data and knowledge bases: (i) clinical data on actual practice patterns, (ii) frames of quality parameters derived from clinical practice guidelines, and (iii) rules of quality management for data analysis. An analysis of practice patterns of 12 family physicians in the management of urinary tract infections illustrates the use of the system.
Nuclear Technology. Course 28: Welding Inspection. Module 28-1, Welding Fundamentals and Processes.
ERIC Educational Resources Information Center
Espy, John
This first in a series of ten modules for a course titled Welding Inspection describes the role and responsibilities of the quality assurance/quality control technician in welding inspections. The module follows a typical format that includes the following sections: (1) introduction, (2) module prerequisites, (3) objectives, (4) notes to…
Learning Principal Component Analysis by Using Data from Air Quality Networks
ERIC Educational Resources Information Center
Perez-Arribas, Luis Vicente; Leon-González, María Eugenia; Rosales-Conrado, Noelia
2017-01-01
With the final objective of using computational and chemometric tools in chemistry studies, this paper shows the methodology and interpretation of Principal Component Analysis (PCA) using pollution data from different cities. This paper describes how students can obtain data on air quality and process such data for additional information…
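For two pollutant series, the PCA exercise the paper assigns can even be done by hand, since after standardization the 2×2 correlation matrix [[1, r], [r, 1]] has eigenvalues 1 ± r. The NO2 and CO readings below are invented classroom-style data:

```python
import statistics

def pca_two_pollutants(x, y):
    """Hand-rolled two-variable PCA on standardized pollutant series.

    After standardization the correlation matrix is [[1, r], [r, 1]],
    whose eigenvalues are 1 + r and 1 - r, so the fraction of variance
    captured by the first principal component is (1 + r) / 2.
    """
    mx, my = statistics.fmean(x), statistics.fmean(y)
    sx, sy = statistics.stdev(x), statistics.stdev(y)
    zx = [(v - mx) / sx for v in x]
    zy = [(v - my) / sy for v in y]
    r = sum(a * b for a, b in zip(zx, zy)) / (len(zx) - 1)  # Pearson correlation
    eigenvalues = [1 + r, 1 - r]
    explained = [e / 2 for e in eigenvalues]
    return r, eigenvalues, explained

# Invented daily NO2 (ug/m3) and CO (mg/m3) readings from one station
no2 = [42.0, 55.0, 38.0, 61.0, 47.0, 52.0]
co = [0.8, 1.1, 0.7, 1.2, 0.9, 1.0]
r, eigenvalues, explained = pca_two_pollutants(no2, co)
```

Because the two invented pollutant series move together, nearly all of the variance loads onto the first component, which is the effect students are meant to observe.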
Factors Influencing the Quality of EHR Performance: An Exploratory Qualitative Study
ERIC Educational Resources Information Center
Rhodes, Harry B.
2016-01-01
A significant amount of evidence existed in support of the positive effect on the quality of healthcare that resulted from transitioning to electronic health information systems; equally compelling evidence suggests that the development process for electronic health information systems falls short of achieving its potential. The objective of this…
USDA-ARS?s Scientific Manuscript database
The objective was to determine the effects of the wooden breast (WB) and white striping (WS) myopathies on meat quality and protein characteristics of broiler breast meat. Breast fillets (Pectoralis major) from a commercial processing plant were segregated into four groups: normal (neither WS nor W...
ERIC Educational Resources Information Center
Argon, Türkan; Sezen-Gültekin, Gözde
2016-01-01
Moral maturity, defined as the competence in moral emotions, thoughts, judgments, attitudes and behaviors, is one of the most important qualities that the would-be teachers at Faculties of Education must possess. Teachers with moral maturity will train students with the qualities of reliability, responsibility, fairness, objectivity, consistency…
Framework for Optimizing the Evaluation of Data From Contaminated Soil in Sweden
The Swedish guidelines for the evaluation of data for the purpose of a risk assessment at contaminated sites are of a qualitative character, as opposed to the USEPA’s Data Quality Objective Process. In Sweden, this can sometimes be a problem because the demands on data quality ar...
Beaulieu, Luc; Radford, Dee-Ann; Eduardo Villarreal-Barajas, J
2018-03-14
The Canadian Organization of Medical Physicists (COMP), in close partnership with the Canadian Partnership for Quality Radiotherapy (CPQR) has developed a series of Technical Quality Control (TQC) guidelines for radiation treatment equipment. These guidelines outline the performance objectives that equipment should meet in order to ensure an acceptable level of radiation treatment quality. The TQC guidelines have been rigorously reviewed and field tested in a variety of Canadian radiation treatment facilities. The development process enables rapid review and update to keep the guidelines current with changes in technology. This article contains detailed performance objectives and safety criteria for low-dose-rate (LDR) permanent seed brachytherapy. © 2018 The Authors. Journal of Applied Clinical Medical Physics published by Wiley Periodicals, Inc. on behalf of American Association of Physicists in Medicine.
NASA Astrophysics Data System (ADS)
Bubis, E. L.; Lozhrkarev, V. V.; Stepanov, A. N.; Smirnov, A. I.; Martynov, V. O.; Mal'shakova, O. A.; Silin, D. E.; Gusev, S. A.
2017-03-01
We describe the process of adaptive self-inversion of the image (nonlinear switching) of a small-scale opaque object when the amplitude-modulated laser beam that illuminates it is focused in a weakly absorbing medium. It is shown that, despite the nonlocal character of the process, which is due to thermal nonlinearity, the brightness-inverted image is characterized by acceptable quality and a high conversion coefficient. It is shown that the coefficient of conversion of the original image to the inverse one depends on the ratio of the object dimensions to the size of the illuminating beam, and decreases sharply for relatively large objects. The obtained experimental data agree with the numerical calculations. Inversion of the images of several model objects and of microdefects in a nonlinear KDP crystal is demonstrated.
Multi-objective decision-making under uncertainty: Fuzzy logic methods
NASA Technical Reports Server (NTRS)
Hardy, Terry L.
1994-01-01
Selecting the best option among alternatives is often a difficult process. This process becomes even more difficult when the evaluation criteria are vague or qualitative, and when the objectives vary in importance and scope. Fuzzy logic allows for quantitative representation of vague or fuzzy objectives, and therefore is well-suited for multi-objective decision-making. This paper presents methods employing fuzzy logic concepts to assist in the decision-making process. In addition, this paper describes software developed at NASA Lewis Research Center for assisting in the decision-making process. Two diverse examples are used to illustrate the use of fuzzy logic in choosing an alternative among many options and objectives. One example is the selection of a lunar lander ascent propulsion system, and the other example is the selection of an aeration system for improving the water quality of the Cuyahoga River in Cleveland, Ohio. The fuzzy logic techniques provided here are powerful tools which complement existing approaches, and therefore should be considered in future decision-making activities.
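The core idea in the abstract above, ranking alternatives against vague objectives, can be sketched with a minimal fuzzy scoring routine. This is an illustrative assumption, not the NASA Lewis software described in the paper: the triangular membership function and the weighted-minimum aggregation are stand-ins for whatever membership functions and aggregation operators the actual tool uses.

```python
# Minimal sketch of fuzzy multi-objective ranking (illustrative assumption,
# not the paper's software). Each objective maps a raw criterion value to a
# [0, 1] membership degree; alternatives are ranked by the weighted minimum
# (an alternative is only as good as its weakest weighted objective).

def triangular(x, a, b, c):
    """Triangular membership function: 0 outside [a, c], peaking at b."""
    if x <= a or x >= c:
        return 0.0
    if x <= b:
        return (x - a) / (b - a)
    return (c - x) / (c - b)

def rank(alternatives, objectives):
    """alternatives: {name: {criterion: value}};
    objectives: {criterion: (membership_fn, weight)}.
    Weights act as exponents (Yager-style importance weighting)."""
    scores = {}
    for name, values in alternatives.items():
        scores[name] = min(
            fn(values[crit]) ** weight
            for crit, (fn, weight) in objectives.items()
        )
    return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)
```

A caller would define one membership function per objective (e.g. "acceptable cost") and read off the top-ranked alternative.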
Lincoln, Tricia A.; Horan-Ross, Debra A.; McHale, Michael R.; Lawrence, Gregory B.
2006-01-01
The laboratory for analysis of low-ionic-strength water at the U.S. Geological Survey (USGS) Water Science Center in Troy, N.Y., analyzes samples collected by USGS projects throughout the Northeast. The laboratory's quality-assurance program is based on internal and interlaboratory quality-assurance samples and quality-control procedures that were developed to ensure proper sample collection, processing, and analysis. The quality-assurance/quality-control data for the time period addressed in this report were stored in the laboratory's SAS data-management system, which provides efficient review, compilation, and plotting of data. This report presents and discusses results of quality-assurance and quality-control samples analyzed from July 1997 through June 1999. Results for the quality-control samples for 18 analytical procedures were evaluated for bias and precision. Control charts indicate that data for eight of the analytical procedures were occasionally biased for either high-concentration and (or) low-concentration samples but were within control limits; these procedures were: acid-neutralizing capacity, total monomeric aluminum, total aluminum, ammonium, calcium, chloride, specific conductance, and sulfate. The data from the potassium and sodium analytical procedures are insufficient for evaluation. Results from the filter-blank and analytical-blank analyses indicate that the procedures for 11 of 13 analytes were within control limits, although the concentrations for blanks were occasionally outside the control limits. Blank analysis results for chloride showed that 22 percent of blanks did not meet data-quality objectives, and results for dissolved organic carbon showed that 31 percent of the blanks did not meet data-quality objectives. Sampling and analysis precision are evaluated herein in terms of the coefficient of variation obtained for triplicate samples in the procedures for 14 of the 18 analytes.
At least 90 percent of the samples met data-quality objectives for all analytes except total aluminum (70 percent of samples met objectives) and potassium (83 percent of samples met objectives). Results of the USGS interlaboratory Standard Reference Sample (SRS) Project indicated good data quality for most constituents over the time period. The P-sample (low-ionic-strength constituents) analysis had good ratings in two of these studies and a satisfactory rating in the third. The results of the T-sample (trace constituents) analysis indicated high data quality with good ratings in all three studies. The N-sample (nutrient constituents) studies had one each of excellent, good, and satisfactory ratings. Results of Environment Canada's National Water Research Institute (NWRI) program indicated that at least 80 percent of the samples met data-quality objectives for 9 of the 13 analytes; the exceptions were dissolved organic carbon, ammonium, chloride, and specific conductance. Data-quality objectives were not met for dissolved organic carbon in two NWRI studies, but all of the samples were within control limits for the last study. Data-quality objectives were not met in 41 percent of samples analyzed for ammonium, 25 percent of samples analyzed for chloride, and 30 percent of samples analyzed for specific conductance. Results from blind reference-sample analyses indicated that data-quality objectives were met by at least 84 percent of the samples analyzed for calcium, chloride, magnesium, pH, and potassium. Data-quality objectives were met by 73 percent of those analyzed for sulfate. The data-quality objective was not met for sodium. The data are insufficient for evaluation of the specific conductance results.
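As a rough illustration of how triplicate-sample precision can be screened against a data-quality objective, the sketch below computes a coefficient of variation per triplicate set and tallies the fraction meeting a limit. The 10% CV threshold is an assumed placeholder, not the USGS laboratory's published criterion.

```python
# Hedged sketch: screening triplicate-sample precision against a
# data-quality objective (DQO). The 10% CV limit is an illustrative
# assumption, not the laboratory's actual acceptance criterion.
import statistics

def coefficient_of_variation(replicates):
    """Sample standard deviation divided by the mean."""
    mean = statistics.mean(replicates)
    if mean == 0:
        raise ValueError("CV is undefined for a zero mean")
    return statistics.stdev(replicates) / mean

def fraction_meeting_dqo(triplicate_sets, cv_limit=0.10):
    """Fraction of triplicate sets whose CV is within the objective."""
    met = sum(1 for reps in triplicate_sets
              if coefficient_of_variation(reps) <= cv_limit)
    return met / len(triplicate_sets)
```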
Cryogenic Tank Technology Program (CTTP)
NASA Technical Reports Server (NTRS)
Vaughn, T. P.
2001-01-01
The objectives of the Cryogenic Tank Technology Program were to: (1) determine the feasibility and cost effectiveness of near-net-shape hardware; (2) demonstrate near-net-shape processes by fabricating large-scale, flight-quality hardware; and (3) advance the state of current weld-processing technologies for aluminum-lithium alloys.
ERIC Educational Resources Information Center
Green, Sharon; Grierson, Lawrence E. M.; Dubrowski, Adam; Carnahan, Heather
2010-01-01
It is well known that sensorimotor memories are built and updated through experience with objects. These representations are useful to anticipatory and feedforward control processes that preset grip and load forces during lifting. When individuals lift objects with qualities that are not congruent with their memory-derived expectations, feedback…
High-quality compressive ghost imaging
NASA Astrophysics Data System (ADS)
Huang, Heyan; Zhou, Cheng; Tian, Tian; Liu, Dongqi; Song, Lijun
2018-04-01
We propose a high-quality compressive ghost imaging method based on projected Landweber regularization and a guided filter, which effectively reduces undersampling noise and improves resolution. In our scheme, the original object is reconstructed by decomposing the compressive reconstruction into separate regularization and denoising steps instead of solving a single minimization problem. The simulation and experimental results show that our method can obtain high ghost-imaging quality in terms of PSNR and visual observation.
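The decomposition the abstract describes, alternating a gradient step with a denoising step, can be sketched as a projected Landweber iteration. This is a generic sketch under stated assumptions: the nonnegativity projection and the pluggable `denoise` hook stand in for the paper's guided filter, whose exact parameters the abstract does not give.

```python
# Sketch of a projected Landweber iteration with an optional denoising step
# (the paper's guided filter is replaced by a generic `denoise` callback;
# this is an illustrative assumption, not the authors' implementation).
import numpy as np

def projected_landweber(A, y, n_iter=200, step=None, denoise=None):
    """Reconstruct x from y ~= A @ x by alternating a gradient step,
    a projection onto x >= 0, and an optional denoising step."""
    if step is None:
        # Step below 1 / ||A||_2^2 guarantees convergence of Landweber.
        step = 1.0 / np.linalg.norm(A, 2) ** 2
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        x = x + step * A.T @ (y - A @ x)   # Landweber gradient step
        x = np.clip(x, 0.0, None)          # projection onto the feasible set
        if denoise is not None:
            x = denoise(x)                 # regularizing denoising step
    return x
```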
No-reference video quality measurement: added value of machine learning
NASA Astrophysics Data System (ADS)
Mocanu, Decebal Constantin; Pokhrel, Jeevan; Garella, Juan Pablo; Seppänen, Janne; Liotou, Eirini; Narwaria, Manish
2015-11-01
Video quality measurement is an important component in the end-to-end video delivery chain. Video quality is, however, subjective, and thus there will always be interobserver differences in the subjective opinion about the visual quality of the same video. Despite this, most existing works on objective quality measurement typically focus only on predicting a single score and evaluate their prediction accuracy based on how close the prediction is to the mean opinion score (or similar average-based ratings). Clearly, such an approach ignores the underlying diversity in the subjective scoring process and, as a result, does not allow further analysis of how reliable the objective prediction is in terms of subjective variability. Consequently, the aim of this paper is to analyze this issue and present a machine-learning-based solution to address it. We demonstrate the utility of our ideas by considering the practical scenario of video broadcast transmissions, with a focus on digital terrestrial television (DTT), and propose a no-reference objective video quality estimator for this application. We conducted verification studies on different video content (including video clips recorded from real DTT broadcast transmissions) in order to verify the performance of the proposed solution.
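The paper's central point, predicting the spread of subjective ratings alongside the mean, can be illustrated with a small multi-output regression. Everything here is a synthetic placeholder: the features, the data, and the random-forest model are assumptions, not the authors' estimator.

```python
# Illustrative sketch (assumed, not the paper's model): predict both the
# mean opinion score (MOS) and the inter-observer spread, so the
# reliability of the objective score can be judged. Data are synthetic.
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(1)
features = rng.normal(size=(100, 8))           # per-video quality features
mos = features[:, 0] * 2 + 3                   # mean opinion score
spread = np.abs(features[:, 1]) + 0.5          # inter-observer std dev

model = RandomForestRegressor(random_state=0)
model.fit(features[:80], np.column_stack([mos, spread])[:80])
pred_mos, pred_spread = model.predict(features[80:]).T
```

A predicted spread flags videos where the single objective score is likely to disagree with many individual viewers.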
A Systematic Process for Developing High Quality SaaS Cloud Services
NASA Astrophysics Data System (ADS)
La, Hyun Jung; Kim, Soo Dong
Software-as-a-Service (SaaS) is a type of cloud service which provides software functionality over the Internet. Its benefits are well received in academia and industry. To fully realize these benefits, there should be effective methodologies to support the development of SaaS services with high reusability and applicability. Conventional approaches such as object-oriented methods do not effectively support SaaS-specific engineering activities such as modeling common features, modeling variability, and designing quality services. In this paper, we present a systematic process for developing high-quality SaaS and highlight the essential role of commonality and variability (C&V) modeling in maximizing reusability. We first define criteria for designing the process model and provide a theoretical foundation for SaaS: its meta-model and C&V model. We clarify the notion of commonality and variability in SaaS, and propose a SaaS development process accompanied by engineering instructions. Using the proposed process, high-quality SaaS services can be effectively developed.
Ono, Daiki; Bamba, Takeshi; Oku, Yuichi; Yonetani, Tsutomu; Fukusaki, Eiichiro
2011-09-01
In this study, we constructed prediction models by metabolic fingerprinting of fresh green tea leaves using Fourier transform near-infrared (FT-NIR) spectroscopy and partial least squares (PLS) regression analysis to objectively optimize the steaming-process conditions in green tea manufacture. The steaming process is the most important step in manufacturing high-quality green tea products. However, the parameter setting of the steamer is currently determined subjectively by the manufacturer. Therefore, a simple and robust system that can be used to objectively set the steaming-process parameters is necessary. We focused on FT-NIR spectroscopy because of its simple operation, quick measurement, and low running costs. After removal of noise in the spectral data by principal component analysis (PCA), PLS regression analysis was performed using spectral information as independent variables and the steaming parameters set by experienced manufacturers as dependent variables. The prediction models were successfully constructed with satisfactory accuracy. Moreover, the results of the demonstration experiment suggested that the green tea steaming-process parameters could be predicted on a larger manufacturing scale. This technique will contribute to improvement of the quality and productivity of green tea because it can objectively optimize the complicated green tea steaming process and will be suitable for practical use in green tea manufacture. Copyright © 2011 The Society for Biotechnology, Japan. Published by Elsevier B.V. All rights reserved.
The use of process mapping in healthcare quality improvement projects.
Antonacci, Grazia; Reed, Julie E; Lennox, Laura; Barlow, James
2018-05-01
Introduction Process mapping provides insight into the systems and processes in which improvement interventions are introduced and is seen as useful in healthcare quality improvement projects. There is little empirical evidence on the use of process mapping in healthcare practice. This study advances understanding of the benefits and success factors of process mapping within quality improvement projects. Methods Eight quality improvement projects were purposively selected from different healthcare settings within the UK's National Health Service. Data were gathered from multiple sources, including interviews exploring participants' experience of using process mapping in their projects and their perceptions of benefits and challenges related to its use. These were analysed using inductive analysis. Results Participants reported eight key benefits of process mapping (gathering a shared understanding of the reality; identifying improvement opportunities; engaging stakeholders in the project; defining the project's objectives; monitoring project progress; learning; increased empathy; simplicity of the method) and five factors related to successful process mapping exercises (simple and appropriate visual representation; information gathered from multiple stakeholders; the facilitator's experience and soft skills; basic training; iterative use of process mapping throughout the project). Conclusions The findings highlight the benefits and versatility of process mapping and provide practical suggestions to improve its use in practice.
Evaluation of a System of Electronic Documentation for the Nursing Process
de Oliveira, Neurilene Batista; Peres, Heloisa Helena Ciqueto
2012-01-01
The objective of this study is to evaluate the functional performance and technical quality of an electronic documentation system designed to record the data of the Nursing Process. The quality model used is the one established by ISO/IEC 25010. This research will help spread knowledge of an emerging area, adding a further initiative to the growing efforts made in information technology for health and nursing. PMID:24199110
DOE Office of Scientific and Technical Information (OSTI.GOV)
Barnett, J. Matthew; Meier, Kirsten M.; Snyder, Sandra F.
2012-12-27
This Data Quality Objectives (DQO) document was prepared based on the U.S. Environmental Protection Agency (EPA) Guidance on Systematic Planning Using the Data Quality Objectives Process, EPA QA/G-4, 2/2006 (EPA 2006), as well as several other published DQOs. The intent of this report is to determine the steps required to ensure that radioactive emissions to the air from the Marine Sciences Laboratory (MSL), headquartered at the Pacific Northwest National Laboratory's Sequim Marine Research Operations (Sequim Site) on Washington State's Olympic Peninsula, are managed in accordance with regulatory requirements and best practices. The Sequim Site was transitioned in October 2012 from private operation under Battelle Memorial Institute to an exclusive-use contract with the U.S. Department of Energy, Office of Science, Pacific Northwest Site Office.
Expanded opportunities of THz passive camera for the detection of concealed objects
NASA Astrophysics Data System (ADS)
Trofimov, Vyacheslav A.; Trofimov, Vladislav V.; Kuchik, Igor E.
2013-10-01
Among security problems, the detection of an object implanted into a human or animal body is an urgent one. At present, the main tool for detecting such objects is X-ray imaging. However, X-rays are ionizing radiation and therefore cannot be used frequently. Another approach is passive THz imaging. In our opinion, a passive THz camera may help detect an object implanted in the human body under certain conditions. The physical basis of this possibility is the temperature trace on the human skin caused by the difference in temperature between the object and the surrounding parts of the body. Modern passive THz cameras do not have sufficient temperature resolution to see this difference. We therefore use computer processing to enhance the camera's effective resolution for this application. After computer processing of images captured by the passive THz camera TS4, developed by ThruVision Systems Ltd., we can see a pronounced temperature trace on the skin from water drunk, or other food eaten, by a person. Nevertheless, many difficulties remain before this problem is fully solved. We also illustrate an improvement in the quality of images captured by commercially available passive THz cameras using computer processing. In some cases, noise can be fully suppressed without loss of image quality. Computer processing of THz images of objects concealed on the human body can improve them many times over. Consequently, the instrumental resolution of such a device may be increased without any additional engineering effort.
Fast processing of microscopic images using object-based extended depth of field.
Intarapanich, Apichart; Kaewkamnerd, Saowaluck; Pannarut, Montri; Shaw, Philip J; Tongsima, Sissades
2016-12-22
Microscopic analysis requires that foreground objects of interest, e.g. cells, are in focus. In a typical microscopic specimen, the foreground objects may lie at different depths of field, necessitating capture of multiple images taken at different focal planes. The extended depth of field (EDoF) technique is a computational method for merging images from different depths of field into a composite image with all foreground objects in focus. Composite images generated by EDoF can be applied in automated image processing and pattern recognition systems. However, current algorithms for EDoF are computationally intensive and impractical, especially for applications such as medical diagnosis where rapid sample turnaround is important. Since foreground objects typically constitute a minor part of an image, the EDoF technique could be made to work much faster if only foreground regions are processed to make the composite image. We propose a novel algorithm called object-based extended depth of field (OEDoF) to address this issue. The OEDoF algorithm consists of four major modules: 1) color conversion, 2) object region identification, 3) good-contrast pixel identification and 4) detail merging. First, the algorithm employs color conversion to enhance contrast, followed by identification of foreground pixels. A composite image is constructed using only these foreground pixels, which dramatically reduces the computational time. We used 250 images obtained from 45 specimens of confirmed malaria infections to test our proposed algorithm. The resulting composite images with all in-focus objects were produced using the proposed OEDoF algorithm. We measured the performance of OEDoF in terms of image clarity (quality) and processing time. The features of interest selected by the OEDoF algorithm are comparable in quality with equivalent regions in images processed by the state-of-the-art complex wavelet EDoF algorithm; however, OEDoF required four times less processing time.
This work presents a modification of the extended depth of field approach for efficiently enhancing microscopic images. This selective object processing scheme used in OEDoF can significantly reduce the overall processing time while maintaining the clarity of important image features. The empirical results from parasite-infected red cell images revealed that our proposed method efficiently and effectively produced in-focus composite images. With the speed improvement of OEDoF, this proposed algorithm is suitable for processing large numbers of microscope images, e.g., as required for medical diagnosis.
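The object-based merging idea, restricting the depth-of-field merge to foreground pixels, can be sketched in a few lines. The focus measure (local variance) and the foreground threshold below are illustrative assumptions; they are not the paper's four modules, which include color conversion and a dedicated contrast test.

```python
# Hedged sketch of the OEDoF idea: merge a focal stack only at foreground
# pixels, keeping the background from a single frame. The local-variance
# focus measure and the threshold are assumptions, not the paper's method.
import numpy as np

def local_variance(img, k=3):
    """Simple per-pixel focus measure: variance in a k x k neighborhood."""
    pad = k // 2
    p = np.pad(img.astype(float), pad, mode="edge")
    win = np.lib.stride_tricks.sliding_window_view(p, (k, k))
    return win.var(axis=(-2, -1))

def oedof_merge(stack, fg_threshold=0.01):
    """stack: list of same-size grayscale frames at different focal planes."""
    focus = np.stack([local_variance(f) for f in stack])
    best = focus.argmax(axis=0)                  # sharpest frame per pixel
    foreground = focus.max(axis=0) > fg_threshold
    composite = stack[0].astype(float).copy()    # background from one frame
    rows, cols = np.nonzero(foreground)
    composite[rows, cols] = np.stack(stack)[best[rows, cols], rows, cols]
    return composite
```

The speedup comes from the same source the paper identifies: only the (typically small) foreground region is ever compared across frames.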
NASA Astrophysics Data System (ADS)
Barber, Jeffrey; Greca, Joseph; Yam, Kevin; Weatherall, James C.; Smith, Peter R.; Smith, Barry T.
2017-05-01
In 2016, the millimeter wave (MMW) imaging community initiated the formation of a standard for millimeter wave image quality metrics. This new standard, American National Standards Institute (ANSI) N42.59, will apply to active MMW systems for security screening of humans. The Electromagnetic Signatures of Explosives Laboratory at the Transportation Security Laboratory is supporting the ANSI standards process via the creation of initial prototypes for round-robin testing with MMW imaging system manufacturers and experts. Results obtained for these prototypes will be used to inform the community and lead to consensus objective standards amongst stakeholders. Images collected with laboratory systems are presented along with results of preliminary image analysis. Future directions for object design, data collection and image processing are discussed.
Quality metric for spherical panoramic video
NASA Astrophysics Data System (ADS)
Zakharchenko, Vladyslav; Choi, Kwang Pyo; Park, Jeong Hoon
2016-09-01
Virtual reality (VR)/augmented reality (AR) applications allow users to view artificial content of a surrounding space, simulating a presence effect with the help of special applications or devices. Synthetic content production is a well-known process from the computer graphics domain, and its pipeline is already established in the industry. However, emerging multimedia formats for immersive entertainment applications, such as free-viewpoint television (FTV) or spherical panoramic video, require different approaches to content management and quality assessment. International standardization of FTV has been promoted by MPEG. This paper discusses an immersive media distribution format and a quality estimation process. The accuracy and reliability of the proposed objective quality estimation method were verified with spherical panoramic images, demonstrating good correlation with subjective quality estimation conducted by a group of experts.
[The significance of meat quality in marketing].
Kallweit, E
1994-07-01
Food quality in general, and meat quality in particular, is evaluated not only by means of objective quality traits; the entire production process is also gaining more attention from the modern consumer. In response to this development, quality programs were developed to define the majority of the processes in all production and marketing steps, which are in turn linked by contracts. Not all of these items are quality-relevant; some are concessions to ethical principles (animal welfare etc.). This is demonstrated by the example of Scharrel pork production. Price differentiation in the pork market is still influenced predominantly by quantitative carcass traits. On the European market, quality programs are still of minor significance. Premiums paid for high quality standards are more or less offset by higher production costs and the lower lean-meat percentages that must be expected in stress-susceptible strains. The considerable effort required to establish quality programs nevertheless helps to improve the overall quality level and secures market shares for local producers.
USDA-ARS?s Scientific Manuscript database
Fresh-cut cantaloupes have been associated with outbreaks of Salmonellosis. Minimally processed fresh-cut fruits have a limited shelf life because of deterioration caused by spoilage microflora and physiological processes. The objectives of this study were to use a wet steam process to 1) reduce ind...
Pratap Singh, Anubhav; Singh, Anika; Ramaswamy, Hosahalli S
2017-06-01
Reciprocating agitation thermal processing (RA-TP) is a recent innovation in the field of canning for obtaining high-quality canned food. The objective of this study was to compare RA-TP processing with conventional non-agitated (still) processing with respect to the impact on quality (color, antioxidant capacity, total phenols, carotenoid and lycopene contents) of canned tomato (Solanum lycopersicum) puree. Owing to a 63-81% reduction in process times as compared with still processing, tomato puree with a brighter red color (closer to fresh) was obtained during RA-TP. At 3 Hz reciprocation frequency, the loss of antioxidant, lycopene and carotenoid contents could be reduced to 34, 8 and 8% respectively as compared with 96, 41 and 52% respectively during still processing. In fact, the phenolic content for RA-TP at 3 Hz was 5% higher than in fresh puree. Quality retention generally increased with an increase in frequency, although the differences were less significant at higher reciprocation frequencies (between 2 and 3 Hz). Research findings indicate that RA-TP can be effective to obtain thermally processed foods with high-quality attribute retention. It can also be concluded that a very high reciprocation frequency (>3 Hz) is not necessarily needed and significant quality improvement can be obtained at lower frequencies (∼2 Hz). © 2016 Society of Chemical Industry.
Optical scanning holography based on compressive sensing using a digital micro-mirror device
NASA Astrophysics Data System (ADS)
A-qian, Sun; Ding-fu, Zhou; Sheng, Yuan; You-jun, Hu; Peng, Zhang; Jian-ming, Yue; xin, Zhou
2017-02-01
Optical scanning holography (OSH) is a distinct digital holography technique that uses a single two-dimensional (2D) scanning process to record the hologram of a three-dimensional (3D) object. Usually, this 2D scanning is mechanical, and the quality of the recorded hologram may suffer from the limited accuracy of mechanical scanning and the unavoidable vibration of the stepper motor's start-stop motion. In this paper, we propose a new framework that replaces the 2D mechanical scanning mirrors with a digital micro-mirror device (DMD) to modulate the scanning light field; we call it OSH based on compressive sensing using a digital micro-mirror device (CS-OSH). CS-OSH reconstructs the hologram of an object through compressive sensing theory and then restores the image of the object itself. Numerical simulation results confirm that this new type of OSH can produce a reconstructed image with favorable visual quality even at a low sample rate.
PepsNMR for 1H NMR metabolomic data pre-processing.
Martin, Manon; Legat, Benoît; Leenders, Justine; Vanwinsberghe, Julien; Rousseau, Réjane; Boulanger, Bruno; Eilers, Paul H C; De Tullio, Pascal; Govaerts, Bernadette
2018-08-17
In the analysis of biological samples, control over experimental design and data acquisition procedures alone cannot ensure well-conditioned 1H NMR spectra with maximal information recovery for data analysis. A third major element affects the accuracy and robustness of results: the data pre-processing/pre-treatment, to which not enough attention is usually devoted, in particular in metabolomic studies. The usual approach is to use proprietary software provided by the analytical instruments' manufacturers to conduct the entire pre-processing strategy. This widespread practice has a number of advantages, such as a user-friendly interface with graphical facilities, but it involves non-negligible drawbacks: a lack of methodological information and automation, a dependency on subjective human choices, only standard processing possibilities, and an absence of objective quality criteria to evaluate pre-processing quality. This paper introduces PepsNMR to meet these needs, an R package dedicated to the whole processing chain prior to multivariate data analysis, including, among other tools, solvent signal suppression, internal calibration, phase, baseline and misalignment corrections, bucketing and normalisation. Methodological aspects are discussed and the package is compared to the gold-standard procedure with two metabolomic case studies. The use of PepsNMR on these data shows better information recovery and predictive power based on objective and quantitative quality criteria. Other key assets of the package are workflow processing speed, reproducibility, reporting and flexibility, graphical outputs and documented routines. Copyright © 2018 Elsevier B.V. All rights reserved.
USDA-ARS?s Scientific Manuscript database
The objective of this study was to investigate and evaluate the effects of high hydrostatic pressure (HHP) applied to cantaloupe puree (CP) on microbial loads and product quality during storage for 10 days at 4 degrees C. Freshly prepared, double sealed and double bagged CP (ca. 5 g) was pressure tr...
Joshi, Anuja; Gislason-Lee, Amber J; Keeble, Claire; Sivananthan, Uduvil M
2017-01-01
Objective: The aim of this research was to quantify the reduction in radiation dose facilitated by image processing alone for percutaneous coronary intervention (PCI) patient angiograms, without reducing the perceived image quality required to confidently make a diagnosis. Methods: Incremental amounts of image noise were added to five PCI angiograms, simulating the angiogram as having been acquired at corresponding lower dose levels (10–89% dose reduction). 16 observers with relevant experience scored the image quality of these angiograms in 3 states—with no image processing and with 2 different modern image processing algorithms applied. These algorithms are used on state-of-the-art and previous generation cardiac interventional X-ray systems. Ordinal regression allowing for random effects and the delta method were used to quantify the dose reduction possible by the processing algorithms, for equivalent image quality scores. Results: Observers rated the quality of the images processed with the state-of-the-art and previous generation image processing with a 24.9% and 15.6% dose reduction, respectively, as equivalent in quality to the unenhanced images. The dose reduction facilitated by the state-of-the-art image processing relative to previous generation processing was 10.3%. Conclusion: Results demonstrate that statistically significant dose reduction can be facilitated with no loss in perceived image quality using modern image enhancement; the most recent processing algorithm was more effective in preserving image quality at lower doses. Advances in knowledge: Image enhancement was shown to maintain perceived image quality in coronary angiography at a reduced level of radiation dose using computer software to produce synthetic images from real angiograms simulating a reduction in dose. PMID:28124572
Quality assurance and reliability sub-committee W88-0/Mk5 weapon assessment NSA lab test results (u)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Whitney, Earl M
2010-11-29
The purpose of this report is to gather an appropriate level of relevant stockpile surveillance data to assess trends in NEP quality, reliability, performance, and safety over the life of the system. The objectives are to gather relevant stockpile data to assess NEP quality and trends, and to develop metrics to assess the suitability of the surveillance sampling regime to meet assessment-process requirements.
2013-01-01
Introduction In 2004, a community-based health insurance (CBI) scheme was introduced in Nouna health district, Burkina Faso, with the objective of improving financial access to high quality health services. We investigate the role of CBI enrollment in the quality of care provided at primary-care facilities in Nouna district, and measure differences in objective and perceived quality of care and patient satisfaction between enrolled and non-enrolled populations who visit the facilities. Methods We interviewed a systematic random sample of 398 patients after their visit to one of the thirteen primary-care facilities contracted with the scheme; 34% (n = 135) of the patients were currently enrolled in the CBI scheme. We assessed objective quality of care as consultation, diagnostic and counselling tasks performed by providers during outpatient visits, perceived quality of care as patient evaluations of the structures and processes of service delivery, and overall patient satisfaction. Two-sample t-tests were performed for group comparison and ordinal logistic regression (OLR) analysis was used to estimate the association between CBI enrollment and overall patient satisfaction. Results Objective quality of care evaluations show that CBI enrollees received substantially less comprehensive care for outpatient services than non-enrollees. In contrast, CBI enrollment was positively associated with overall patient satisfaction (aOR = 1.51, p = 0.014), controlling for potential confounders such as patient socio-economic status, illness symptoms, history of illness and characteristics of care received. Conclusions CBI patients perceived better quality of care, while objectively receiving worse quality of care, compared to patients who were not enrolled in CBI. Systematic differences in quality of care expectations between CBI enrollees and non-enrollees may explain this finding. 
One factor influencing quality of care may be the type of provider payment used by the CBI scheme, which has been identified as a leading factor in reducing provider motivation to deliver high quality care to CBI enrollees in previous studies. Based on this study, it is unlikely that perceived quality of care and patient satisfaction explain the low CBI enrollment rates in this community. PMID:23680066
ERIC Educational Resources Information Center
Okay-Somerville, Belgin; Scholarios, Dora
2017-01-01
This article aims to understand predictors of objective (i.e. job offers, employment status and employment quality) and subjective (i.e. perceived) graduate employability during university-to-work transitions. Using survey data from two cohorts of graduates in the UK (N = 293), it contrasts three competing theoretical approaches to employability:…
ERIC Educational Resources Information Center
Agostinho, Shirley; Bennett, Sue; Lockyer, Lori; Harper, Barry
2004-01-01
This paper reports recent work in developing of structures and processes that support university teachers and instructional designers incorporating learning objects into higher education focused learning designs. The aim of the project is to develop a framework to guide the design and implementation of high quality learning experiences. This…
ERIC Educational Resources Information Center
Mentzer, Nathan
2011-01-01
The objective of this research was to explore the relationship between information access and design solution quality of high school students presented with an engineering design problem. This objective is encompassed in the research question driving this inquiry: How does information access impact the design process? This question has emerged in…
Reexamining competitive priorities: Empirical study in service sector
NASA Astrophysics Data System (ADS)
Idris, Fazli; Mohammad, Jihad
2015-02-01
The general objective of this study is to validate the multi-level concept of competitive priorities for service industries using a reflective-formative higher-order model. An empirical study of 228 firms from nine different service industries was conducted to meet this objective. Partial least squares analysis with SmartPLS 2.0 was used to perform the analysis. Findings revealed six priorities: cost, flexibility, delivery, quality talent management, quality tangibility, and innovativeness. Quality thus splits into two types: one related to managing talent for process improvement, and the other to the physical appearance and tangibility of the service. This study confirmed competitive priorities as a formative second-order hierarchical latent construct using rigorous empirical evidence. Implications, limitations, and suggestions for future research are discussed.
Quality assurance program requirements, Amendment 5 (9-26-79) to August 1973 issue
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
This standard sets forth general requirements for planning, managing, conducting, and evaluating quality assurance programs for reactor development and test facility projects and associated processes, structures, components, and systems. These quality assurance requirements are based on proven practices and provide the means of control and verification whereby those responsible for project management can assure that the quality required for safe, reliable, and economical operation will be achieved. The objective of the programs covered by this standard is to assure that structures, components, systems, and facilities are designed, developed, manufactured, constructed, operated, and maintained in compliance with established engineering criteria. To achieve this objective, controls are to be established and implemented at predetermined points, and necessary action taken to prevent, detect, and correct any deficiencies.
Single Layer Centrifugation Can Be Scaled-Up Further to Process up to 150 mL Semen
Morrell, J. M.; van Wienen, M.; Wallgren, M.
2011-01-01
Single-layer centrifugation (SLC) has been used to improve the quality of sperm samples in several species. However, where stallion or boar semen is to be used for AI, larger volumes of semen must be processed than for other species, limiting the effectiveness of the original technique. The objective of the present study was to scale up the SLC method for both stallion and boar semen. Stallion semen could be processed in 100 mL glass tubes without loss of sperm quality, and similarly, boar semen could be processed in 200 mL and 500 mL tubes without losing sperm quality. The results of these preliminary studies are encouraging, and larger trials are underway to evaluate these methods in the field. PMID:23738111
Thin film processing of photorefractive BaTiO3
NASA Technical Reports Server (NTRS)
Schuster, Paul R.; Potember, Richard S.
1991-01-01
The principal objectives of this ongoing research are the preparation and characterization of polycrystalline single-domain thin films of BaTiO3 for photorefractive applications. These films must be continuous, free of cracks, and of high optical quality. The two processing methods proposed are sputtering and sol-gel related processing.
Quality Measures for the Care of Patients with Narcolepsy
Krahn, Lois E.; Hershner, Shelley; Loeding, Lauren D.; Maski, Kiran P.; Rifkin, Daniel I.; Selim, Bernardo; Watson, Nathaniel F.
2015-01-01
The American Academy of Sleep Medicine (AASM) commissioned a Workgroup to develop quality measures for the care of patients with narcolepsy. Following a comprehensive literature search, 306 publications were found addressing quality care or measures. Strength of association was graded between proposed process measures and desired outcomes. Following the AASM process for quality measure development, we identified three outcomes (including one outcome measure) and seven process measures. The first desired outcome was to reduce excessive daytime sleepiness by employing two process measures: quantifying sleepiness and initiating treatment. The second outcome was to improve the accuracy of diagnosis by employing two process measures: completing both a comprehensive sleep history and an objective sleep assessment. The third outcome was to reduce adverse events through three steps: ensuring treatment follow-up, documenting medical comorbidities, and documenting safety measures counseling. All narcolepsy measures described in this report were developed by the Narcolepsy Quality Measures Workgroup and approved by the AASM Quality Measures Task Force and the AASM Board of Directors. The AASM recommends the use of these measures as part of quality improvement programs that will enhance the ability to improve care for patients with narcolepsy. Citation: Krahn LE, Hershner S, Loeding LD, Maski KP, Rifkin DI, Selim B, Watson NF. Quality measures for the care of patients with narcolepsy. J Clin Sleep Med 2015;11(3):335–355. PMID:25700880
The ventral visual pathway: an expanded neural framework for the processing of object quality.
Kravitz, Dwight J; Saleem, Kadharbatcha S; Baker, Chris I; Ungerleider, Leslie G; Mishkin, Mortimer
2013-01-01
Since the original characterization of the ventral visual pathway, our knowledge of its neuroanatomy, functional properties, and extrinsic targets has grown considerably. Here we synthesize this recent evidence and propose that the ventral pathway is best understood as a recurrent occipitotemporal network containing neural representations of object quality both utilized and constrained by at least six distinct cortical and subcortical systems. Each system serves its own specialized behavioral, cognitive, or affective function, collectively providing the raison d'être for the ventral visual pathway. This expanded framework contrasts with the depiction of the ventral visual pathway as a largely serial staged hierarchy culminating in singular object representations and more parsimoniously incorporates attentional, contextual, and feedback effects. Published by Elsevier Ltd.
Hofmann, Julia; Kien, Christina; Gartlehner, Gerald
2015-01-01
Evidence-based information materials about the pros and cons of cancer screening are important sources for men and women to decide for or against cancer screening. The aim of this paper was to compare recommendations from different cancer institutions in German-speaking countries (Austria, Germany, and Switzerland) regarding screening for breast, cervix, colon, and prostate cancer and to assess the quality and development process of patient information materials. Relevant information material was identified through web searches and personal contact with cancer institutions. To achieve our objective, we employed a qualitative approach. The quality of 22 patient information materials was analysed based on established guidance by Bunge et al. In addition, we conducted guided interviews about the process of developing information materials with decision-makers of cancer institutes. Overall, major discrepancies in cancer screening recommendations exist among the Austrian, German, and Swiss cancer institutes. Process evaluation revealed that crucial steps of quality assurance, such as assembling a multi-disciplinary panel, assessing conflicts of interest, or transparency regarding funding sources, have frequently not been undertaken. All information materials had substantial quality deficits in multiple areas. Three out of four institutes issued information materials that met fewer than half of the quality criteria. Most patient information materials of cancer institutes in German-speaking countries are fraught with substantial deficits and do not provide an objective source for patients to be able to make an informed decision for or against cancer screening. Copyright © 2015. Published by Elsevier GmbH.
Alfadeel, Mona A; Hamid, Yassin H M; El Fadeel, Ogail Ata; Salih, Karimeldin M A
2015-01-01
The objectives of this study were to identify the availability of service logistics in basic public schools (structure as a quality concept), to assess the steps of physical examination against the ministry of health guidelines (process as a quality concept), and to measure the satisfaction of service consumers (pupils) and service providers (teachers and doctors). The study covered seven localities in Sudan using questionnaires and observations. The structure, in the form of material and human resources, was not well maintained; equally, the process and procedure of medical examination did not fit well with the rules of quality. The satisfaction level, however, was within the accepted range. As far as structure, process, and outcome are concerned, we remain below the standards of developed countries for many reasons, but the level of satisfaction in the present study is broadly similar to that reported in other studies.
Statistical process management: An essential element of quality improvement
NASA Astrophysics Data System (ADS)
Buckner, M. R.
Successful quality improvement requires a balanced program involving the three elements that control quality: organization, people, and technology. The focus of the SPC/SPM User's Group is to advance the technology component of Total Quality by networking within the Group and by providing outreach within Westinghouse to foster the appropriate use of statistical techniques to achieve Total Quality. SPM encompasses the disciplines by which a process is measured against its intrinsic design capability, in the face of measurement noise and other obscuring variability. SPM tools facilitate decisions about the process that generated the data. SPM deals typically with manufacturing processes, but with some flexibility of definition and technique it accommodates many administrative processes as well. The techniques of SPM are those of Statistical Process Control, Statistical Quality Control, Measurement Control, and Experimental Design. In addition, techniques such as job and task analysis and concurrent engineering are important elements of the systematic planning and analysis needed early in the design process to ensure success. The SPC/SPM User's Group is endeavoring to achieve its objectives by sharing successes that have occurred within members' own Westinghouse departments as well as within other US and foreign industry. In addition, failures are reviewed to establish lessons learned and improve future applications. In broader terms, the Group is interested in making SPM the accepted way of doing business within Westinghouse.
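The Statistical Process Control techniques cited in this abstract center on control charts, which compare a measured process against limits derived from its own variability. A minimal sketch of an individuals chart under stated assumptions: the 3-sigma limits and the moving-range estimate sigma ≈ MR-bar / 1.128 are textbook defaults, and the data and function names are illustrative, not taken from the Westinghouse program.

```python
from statistics import mean

def control_limits(samples):
    """Center line and 3-sigma limits for an individuals (X) chart.

    Sigma is estimated from the average moving range of span 2,
    using the standard constant d2 = 1.128.
    """
    center = mean(samples)
    moving_ranges = [abs(b - a) for a, b in zip(samples, samples[1:])]
    sigma = mean(moving_ranges) / 1.128
    return center - 3 * sigma, center, center + 3 * sigma

# Hypothetical measurements of one process characteristic
data = [10.2, 9.8, 10.1, 10.4, 9.9, 10.0, 10.3, 9.7]
lcl, cl, ucl = control_limits(data)
in_control = all(lcl <= x <= ucl for x in data)
```

A point outside [lcl, ucl] would signal a special cause worth investigating, which is the basic decision SPM tools support.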
Lincoln, Tricia A.; Horan-Ross, Debra A.; McHale, Michael R.; Lawrence, Gregory B.
2009-01-01
The laboratory for analysis of low-ionic-strength water at the U.S. Geological Survey (USGS) Water Science Center in Troy, N.Y., analyzes samples collected by USGS projects throughout the Northeast. The laboratory's quality-assurance program is based on internal and interlaboratory quality-assurance samples and quality-control procedures that were developed to ensure proper sample collection, processing, and analysis. The quality-assurance and quality-control data were stored in the laboratory's Lab Master data-management system, which provides efficient review, compilation, and plotting of data. This report presents and discusses results of quality-assurance and quality control samples analyzed from July 2003 through June 2005. Results for the quality-control samples for 20 analytical procedures were evaluated for bias and precision. Control charts indicate that data for five of the analytical procedures were occasionally biased for either high-concentration or low-concentration samples but were within control limits; these procedures were: acid-neutralizing capacity, total monomeric aluminum, pH, silicon, and sodium. Seven of the analytical procedures were biased throughout the analysis period for the high-concentration sample, but were within control limits; these procedures were: dissolved organic carbon, chloride, nitrate (ion chromatograph), nitrite, silicon, sodium, and sulfate. The calcium and magnesium procedures were biased throughout the analysis period for the low-concentration sample, but were within control limits. The total aluminum and specific conductance procedures were biased for the high-concentration and low-concentration samples, but were within control limits. Results from the filter-blank and analytical-blank analyses indicate that the procedures for 17 of 18 analytes were within control limits, although the concentrations for blanks were occasionally outside the control limits. 
The data-quality objective was not met for dissolved organic carbon. Sampling and analysis precision are evaluated herein in terms of the coefficient of variation obtained for triplicate samples in the procedures for 18 of the 22 analytes. At least 85 percent of the samples met data-quality objectives for all analytes except total monomeric aluminum (82 percent of samples met objectives), total aluminum (77 percent), chloride (80 percent), fluoride (76 percent), and nitrate (ion chromatograph) (79 percent). The ammonium and total dissolved nitrogen procedures did not meet the data-quality objectives. Results of the USGS interlaboratory Standard Reference Sample (SRS) Project indicated good data quality over the time period, with ratings for each sample in the satisfactory, good, and excellent ranges or less than 10 percent error. The P-sample (low-ionic-strength constituents) analysis had one marginal and two unsatisfactory ratings for the chloride procedure. The T-sample (trace constituents) analysis had two unsatisfactory ratings and one high-range percent error for the aluminum procedure. The N-sample (nutrient constituents) analysis had one marginal rating for the nitrate procedure. Results of Environment Canada's National Water Research Institute (NWRI) program indicated that at least 84 percent of the samples met data-quality objectives for 11 of the 14 analytes; the exceptions were ammonium, total aluminum, and acid-neutralizing capacity. The ammonium procedure did not meet data-quality objectives in all studies. Data-quality objectives were not met in 23 percent of samples analyzed for total aluminum and 45 percent of samples analyzed for acid-neutralizing capacity.
Results from blind reference-sample analyses indicated that data-quality objectives were met by at least 86 percent of the samples analyzed for calcium, chloride, fluoride, magnesium, pH, potassium, sodium, and sulfate. Data-quality objectives were not met by samples analyzed for fluoride.
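The triplicate-sample precision checks reported in this abstract rest on the coefficient of variation (standard deviation divided by the mean). A minimal sketch, assuming a hypothetical 10 percent CV objective; the report's actual objectives vary by analyte, and all names and values below are illustrative.

```python
from statistics import mean, stdev

def coefficient_of_variation(replicates):
    """CV of replicate measurements, expressed as a fraction of the mean."""
    return stdev(replicates) / mean(replicates)

def meets_objective(replicates, max_cv=0.10):
    """True if replicate precision satisfies the (assumed) CV objective."""
    return coefficient_of_variation(replicates) <= max_cv

triplicate = [4.1, 4.3, 4.2]  # hypothetical concentrations, mg/L
ok = meets_objective(triplicate)
```

Tallying `meets_objective` over all triplicate sets for an analyte yields the "percent of samples met objectives" figures quoted above.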
Visalli, Antonino; Vallesi, Antonino
2018-01-01
Visual search tasks have often been used to investigate how cognitive processes change with expertise. Several studies have shown visual experts' advantages in detecting objects related to their expertise. Here, we tried to extend these findings by investigating whether professional search experience could boost top-down monitoring processes involved in visual search, independently of advantages specific to objects of expertise. To this aim, we recruited a group of quality-control workers employed in citrus farms. Given the specific features of this type of job, we expected that the extensive employment of monitoring mechanisms during orange selection could enhance these mechanisms even in search situations in which orange-related expertise is not suitable. To test this hypothesis, we compared performance of our experimental group and of a well-matched control group on a computerized visual search task. In one block the target was an orange (expertise target) while in the other block the target was a Smurfette doll (neutral target). The a priori hypothesis was to find an advantage for quality-controllers in those situations in which monitoring was especially involved, that is, when deciding the presence/absence of the target required a more extensive inspection of the search array. Results were consistent with our hypothesis. Quality-controllers were faster in those conditions that extensively required monitoring processes, specifically, the Smurfette-present and both target-absent conditions. No differences emerged in the orange-present condition, which turned out to rely mainly on bottom-up processes. These results suggest that top-down processes in visual search can be enhanced through immersive real-life experience beyond visual expertise advantages. PMID:29497392
Evaluation of the quality of the teaching-learning process in undergraduate courses in Nursing 1
González-Chordá, Víctor Manuel; Maciá-Soler, María Loreto
2015-01-01
Abstract Objective: to identify aspects for improvement in the quality of the teaching-learning process through the analysis of tools that evaluated the acquisition of skills by undergraduate students of Nursing. Method: prospective longitudinal study conducted in a population of 60 second-year Nursing students based on registration data, from which quality indicators that evaluate the acquisition of skills were obtained, with descriptive and inferential analysis. Results: nine items and nine learning activities included in the assessment tools were identified that did not reach the established quality indicators (p<0.05). There are statistically significant differences depending on the hospital and clinical practices unit (p<0.05). Conclusion: the analysis of the evaluation tools used in the subject "Nursing Care in Welfare Processes" of the analyzed undergraduate Nursing course enabled the detection of areas for improvement in the teaching-learning process. The challenge of education in nursing is to achieve the best clinical research and educational results, in order to improve the quality of education and health care. PMID:26444173
Software Engineering Guidebook
NASA Technical Reports Server (NTRS)
Connell, John; Wenneson, Greg
1993-01-01
The Software Engineering Guidebook describes SEPG (Software Engineering Process Group) supported processes and techniques for engineering quality software in NASA environments. Three process models are supported: structured, object-oriented, and evolutionary rapid-prototyping. The guidebook covers software life-cycles, engineering, assurance, and configuration management. The guidebook is written for managers and engineers who manage, develop, enhance, and/or maintain software under the Computer Software Services Contract.
Objective Quality Assessment for Color-to-Gray Image Conversion.
Ma, Kede; Zhao, Tiesong; Zeng, Kai; Wang, Zhou
2015-12-01
Color-to-gray (C2G) image conversion is the process of transforming a color image into a grayscale one. Despite its wide usage in real-world applications, little work has been dedicated to compare the performance of C2G conversion algorithms. Subjective evaluation is reliable but is also inconvenient and time consuming. Here, we make one of the first attempts to develop an objective quality model that automatically predicts the perceived quality of C2G converted images. Inspired by the philosophy of the structural similarity index, we propose a C2G structural similarity (C2G-SSIM) index, which evaluates the luminance, contrast, and structure similarities between the reference color image and the C2G converted image. The three components are then combined depending on image type to yield an overall quality measure. Experimental results show that the proposed C2G-SSIM index has close agreement with subjective rankings and significantly outperforms existing objective quality metrics for C2G conversion. To explore the potentials of C2G-SSIM, we further demonstrate its use in two applications: 1) automatic parameter tuning for C2G conversion algorithms and 2) adaptive fusion of C2G converted images.
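The luminance, contrast, and structure similarities named in this abstract follow the classic SSIM decomposition. Below is a minimal global (single-window) sketch for intuition only: the published C2G-SSIM works differently, using local windows and an image-type-dependent combination, and the constants and names here are illustrative assumptions.

```python
from statistics import mean

def ssim_global(x, y, c1=1e-4, c2=9e-4):
    """Global luminance * contrast * structure similarity between two
    grayscale images given as flat lists of intensities in [0, 1].

    Follows the classic SSIM decomposition with c3 = c2 / 2; a sketch,
    not the published C2G-SSIM algorithm.
    """
    mx, my = mean(x), mean(y)
    sx = mean((xi - mx) ** 2 for xi in x) ** 0.5   # std devs
    sy = mean((yi - my) ** 2 for yi in y) ** 0.5
    sxy = mean((xi - mx) * (yi - my) for xi, yi in zip(x, y))  # covariance
    c3 = c2 / 2
    luminance = (2 * mx * my + c1) / (mx * mx + my * my + c1)
    contrast = (2 * sx * sy + c2) / (sx * sx + sy * sy + c2)
    structure = (sxy + c3) / (sx * sy + c3)
    return luminance * contrast * structure

reference = [0.1, 0.4, 0.6, 0.9, 0.3, 0.7]          # toy "color-channel" image
score_same = ssim_global(reference, reference)       # identical images score ~1
score_flat = ssim_global(reference, [0.5] * 6)       # flat gray loses contrast
```

A conversion that collapses contrast (the flat-gray case) is penalized by the contrast term, which is the intuition behind ranking C2G algorithms with such an index.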
Hurtado-Chong, Anahí; Joeris, Alexander; Hess, Denise; Blauth, Michael
2017-07-12
A considerable number of clinical studies experience delays, which result in increased duration and costs. In multicentre studies, patient recruitment is among the leading causes of delays. Poor site selection can result in low recruitment and bad data quality. Site selection is therefore crucial for study quality and completion, but currently no specific guidelines are available. Selection of sites suitable to participate in a prospective multicentre cohort study was performed through an open call using a newly developed objective multistep approach. The method is based on the use of a network, the definition of objective criteria, and a systematic screening process. Out of 266 interested sites, 24 were shortlisted and 12 were finally selected to participate in the study. The steps in the process included an open call through a network, use of selection questionnaires tailored to the study, evaluation of responses using objective criteria, and scripted telephone interviews. At each step, the number of candidate sites was quickly reduced, leaving only the most promising candidates. Recruitment and data quality met expectations despite the contracting problems encountered with some sites. The results of our first experience with a standardised and objective method of site selection are encouraging. The site selection method described here can serve as a guideline for other researchers performing multicentre studies. ClinicalTrials.gov: NCT02297581. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2017. All rights reserved. No commercial use is permitted unless otherwise expressly granted.
Assessment and improvement of sound quality in cochlear implant users
Caldwell, Meredith T.; Jiam, Nicole T.
2017-01-01
Objectives Cochlear implants (CIs) have successfully provided speech perception to individuals with sensorineural hearing loss. Recent research has focused on more challenging acoustic stimuli such as music and voice emotion. The purpose of this review is to evaluate and describe sound quality in CI users with the purposes of summarizing novel findings and crucial information about how CI users experience complex sounds. Data Sources Here we review the existing literature on PubMed and Scopus to present what is known about perceptual sound quality in CI users, discuss existing measures of sound quality, explore how sound quality may be effectively studied, and examine potential strategies of improving sound quality in the CI population. Results Sound quality, defined here as the perceived richness of an auditory stimulus, is an attribute of implant‐mediated listening that remains poorly studied. Sound quality is distinct from appraisal, which is generally defined as the subjective likability or pleasantness of a sound. Existing studies suggest that sound quality perception in the CI population is limited by a range of factors, most notably pitch distortion and dynamic range compression. Although there are currently very few objective measures of sound quality, the CI‐MUSHRA has been used as a means of evaluating sound quality. There exist a number of promising strategies to improve sound quality perception in the CI population including apical cochlear stimulation, pitch tuning, and noise reduction processing strategies. Conclusions In the published literature, sound quality perception is severely limited among CI users. Future research should focus on developing systematic, objective, and quantitative sound quality metrics and designing therapies to mitigate poor sound quality perception in CI users. Level of Evidence NA PMID:28894831
Ade-Oshifogun, Jochebed Bosede; Dufelmeier, Thaddeus
2012-01-01
This article describes a quality improvement process for "do not return" (DNR) notices for healthcare supplemental staffing agencies and healthcare facilities that use them. It is imperative that supplemental staffing agencies partner with healthcare facilities in assuring the quality of supplemental staff. Although supplemental staffing agencies attempt to ensure quality staffing, supplemental staff are sometimes subjectively evaluated by healthcare facilities as "DNR." The objective of this article is to describe a quality improvement process to prevent and manage "DNR" within healthcare organizations. We developed a curriculum and accompanying evaluation tool by adapting Rampersad's problem-solving discipline approach: (a) definition of area(s) for improvement; (b) identification of all possible causes; (c) development of an action plan; (d) implementation of the action plan; (e) evaluation for program improvement; and (f) standardization of the process. Face and content validity of the evaluation tool was ascertained by input from a panel of experienced supplemental staff and nursing faculty. This curriculum and its evaluation tool will have practical implications for supplemental staffing agencies and healthcare facilities in reducing "DNR" rates and in meeting certification/accreditation requirements. Further work is needed to translate this process into future research. © 2012 Wiley Periodicals, Inc.
Interim Basis for PCB Sampling and Analyses
DOE Office of Scientific and Technical Information (OSTI.GOV)
BANNING, D.L.
2001-01-18
This document was developed as an interim basis for sampling and analysis of polychlorinated biphenyls (PCBs) and will be used until a formal data quality objective (DQO) document is prepared and approved. On August 31, 2000, the Framework Agreement for Management of Polychlorinated Biphenyls (PCBs) in Hanford Tank Waste was signed by the U.S. Department of Energy (DOE), the Environmental Protection Agency (EPA), and the Washington State Department of Ecology (Ecology) (Ecology et al. 2000). This agreement outlines the management of double shell tank (DST) waste as Toxic Substance Control Act (TSCA) PCB remediation waste based on a risk-based disposal approval option per Title 40 of the Code of Federal Regulations 761.61 (c). The agreement calls for ''Quantification of PCBs in DSTs, single shell tanks (SSTs), and incoming waste to ensure that the vitrification plant and other ancillary facilities PCB waste acceptance limits and the requirements of the anticipated risk-based disposal approval are met.'' Waste samples will be analyzed for PCBs to satisfy this requirement. This document describes the DQO process undertaken to assure appropriate data will be collected to support management of PCBs and is presented in a DQO format. The DQO process was implemented in accordance with the U.S. Environmental Protection Agency EPA QA/G4, Guidance for the Data Quality Objectives Process (EPA 1994) and the Data Quality Objectives for Sampling and Analyses, HNF-IP-0842, Rev. 1A, Vol. IV, Section 4.16 (Banning 1999).
Interim Basis for PCB Sampling and Analyses
DOE Office of Scientific and Technical Information (OSTI.GOV)
BANNING, D.L.
2001-03-20
This document was developed as an interim basis for sampling and analysis of polychlorinated biphenyls (PCBs) and will be used until a formal data quality objective (DQO) document is prepared and approved. On August 31, 2000, the Framework Agreement for Management of Polychlorinated Biphenyls (PCBs) in Hanford Tank Waste was signed by the U.S. Department of Energy (DOE), the Environmental Protection Agency (EPA), and the Washington State Department of Ecology (Ecology) (Ecology et al. 2000). This agreement outlines the management of double shell tank (DST) waste as Toxic Substance Control Act (TSCA) PCB remediation waste based on a risk-based disposal approval option per Title 40 of the Code of Federal Regulations 761.61 (c). The agreement calls for ''Quantification of PCBs in DSTs, single shell tanks (SSTs), and incoming waste to ensure that the vitrification plant and other ancillary facilities PCB waste acceptance limits and the requirements of the anticipated risk-based disposal approval are met.'' Waste samples will be analyzed for PCBs to satisfy this requirement. This document describes the DQO process undertaken to assure appropriate data will be collected to support management of PCBs and is presented in a DQO format. The DQO process was implemented in accordance with the U.S. Environmental Protection Agency EPA QA/G4, Guidance for the Data Quality Objectives Process (EPA 1994) and the Data Quality Objectives for Sampling and Analyses, HNF-IP-0842, Rev. 1A, Vol. IV, Section 4.16 (Banning 1999).
ERIC Educational Resources Information Center
Neely, Margery A.
This book has the following objectives: (1) to sharpen the skills of interviewers to increase the quality of the interview process used with diverse populations; (2) to describe the dynamics of the occupations that use a brief interview in the education and training service-producing industry; and (3) to provide measurement for statistical process…
Is a Quality Course a Worthy Course? Designing for Value and Worth in Online Courses
ERIC Educational Resources Information Center
Youger, Robin E.; Ahern, Terence C.
2015-01-01
There are many strategies for estimating the effectiveness of instruction. Typically, most methods are based on the student evaluation. Recently a more standardized approach, Quality Matters (QM), has been developed that uses an objectives-based strategy. QM, however, does not account for the learning process, nor for the value and worth of the…
USDA-ARS?s Scientific Manuscript database
BACKGROUND: Texture is a major quality parameter for the acceptability of canned whole beans. Prior knowledge of this quality trait before processing would be useful to guide variety development by bean breeders and optimize handling protocols by processors. The objective of this study was to evalua...
ERIC Educational Resources Information Center
Dinizulu, Sonya Mathies; Grant, Kathryn E.; Bryant, Fred B.; Boustani, Maya M.; Tyler, Donald; McIntosh, Jeanne M.
2014-01-01
Background: African American youth residing in urban poverty have been shown to be at increased risk for exposure to violence and for psychological symptoms, but there has been little investigation of mediating processes that might explain this association. Objectives: This study tested the quality of parent-adolescent relationships and adolescent…
Evaluating the Quality of Learning Environments and Teaching Practice in Special Schools
ERIC Educational Resources Information Center
Hedegaard-Soerensen, Lotte; Tetler, Susan
2016-01-01
This article reports on findings of a study whose objective is the development of an instrument for systematic evaluation and improvement of the quality of teaching in special schools. The article describes the research process which led to the construction of the instrument as well as the way teachers can use the instrument to improve the quality…
Digital radiography: optimization of image quality and dose using multi-frequency software.
Precht, H; Gerke, O; Rosendahl, K; Tingberg, A; Waaler, D
2012-09-01
New developments in processing of digital radiographs (DR), including multi-frequency processing (MFP), allow optimization of image quality and radiation dose. This is particularly promising in children, as they are believed to be more sensitive to ionizing radiation than adults. The objective was to examine whether the use of MFP software reduces the radiation dose without compromising quality at DR of the femur in 5-year-old-equivalent anthropomorphic and technical phantoms. A total of 110 images of an anthropomorphic phantom were acquired on a DR system (Canon DR with CXDI-50 C detector and MLT(S) software) and analyzed by three pediatric radiologists using Visual Grading Analysis. In addition, 3,500 images taken of a technical contrast-detail phantom (CDRAD 2.0) provided an objective image-quality assessment. Optimal image quality was maintained at a dose reduction of 61% with MLT(S)-optimized images. Even for images of diagnostic quality, MLT(S) provided a dose reduction of 88% as compared to the reference image. Software impact on image quality was found to be significant for dose (mAs), dynamic range dark region, and frequency band. By optimizing image processing parameters, a significant dose reduction is possible without significant loss of image quality.
NASA Astrophysics Data System (ADS)
Phillips, Jonathan B.; Coppola, Stephen M.; Jin, Elaine W.; Chen, Ying; Clark, James H.; Mauer, Timothy A.
2009-01-01
Texture appearance is an important component of photographic image quality as well as object recognition. Noise cleaning algorithms are used to decrease sensor noise of digital images, but can hinder texture elements in the process. The Camera Phone Image Quality (CPIQ) initiative of the International Imaging Industry Association (I3A) is developing metrics to quantify texture appearance. Objective and subjective experimental results of the texture metric development are presented in this paper. Eight levels of noise cleaning were applied to ten photographic scenes that included texture elements such as faces, landscapes, architecture, and foliage. Four companies (Aptina Imaging, LLC, Hewlett-Packard, Eastman Kodak Company, and Vista Point Technologies) have performed psychophysical evaluations of overall image quality using one of two methods of evaluation. Both methods presented paired comparisons of images on thin film transistor liquid crystal displays (TFT-LCD), but the display pixel pitch and viewing distance differed. CPIQ has also been developing objective texture metrics and targets that were used to analyze the same eight levels of noise cleaning. The correlation of the subjective and objective test results indicates that texture perception can be modeled with an objective metric. The two methods of psychophysical evaluation exhibited high correlation despite the differences in methodology.
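The correlation between subjective quality ratings and an objective texture metric, as described above, can be sketched with a simple Pearson coefficient; the rating and metric values below are hypothetical, not CPIQ data:

```python
import numpy as np

# Hypothetical data: mean observer quality ratings (subjective) and
# texture-metric values (objective) for eight noise-cleaning levels.
subjective = np.array([4.5, 4.2, 3.9, 3.4, 3.0, 2.5, 2.1, 1.8])
objective_metric = np.array([0.92, 0.88, 0.81, 0.70, 0.61, 0.50, 0.43, 0.36])

def pearson_r(x, y):
    """Pearson correlation coefficient between two 1-D arrays."""
    x = x - x.mean()
    y = y - y.mean()
    return float((x @ y) / np.sqrt((x @ x) * (y @ y)))

r = pearson_r(subjective, objective_metric)
print(f"correlation between subjective and objective scores: r = {r:.3f}")
```

A high r across noise-cleaning levels is what supports the paper's claim that texture perception can be modeled with an objective metric.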
NASA Astrophysics Data System (ADS)
Shrivastava, Prashant Kumar; Pandey, Arun Kumar
2018-06-01
Inconel-718 has found high demand in different industries due to its superior mechanical properties. Traditional cutting methods face difficulties in cutting these alloys due to their low thermal conductivity, lower elasticity and high chemical affinity at elevated temperatures. The challenges of machining and/or finishing unusual shapes and/or sizes in these materials are also faced by traditional machining. Laser beam cutting may be applied for miniaturization and ultra-precision cutting and/or finishing through appropriate control of the different process parameters. This paper presents multi-objective optimization of the kerf deviation, kerf width and kerf taper in the laser cutting of Inconel-718 sheet. Second-order regression models have been developed for the different quality characteristics using the data obtained through experimentation. The regression models have been used as objective functions for multi-objective optimization based on a hybrid approach of multiple regression analysis and a genetic algorithm. The comparison of the optimization results with the experimental results shows an improvement of 88%, 10.63% and 42.15% in kerf deviation, kerf width and kerf taper, respectively. Finally, the effects of the different process parameters on the quality characteristics are also discussed.
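The hybrid scheme described above, fitting second-order regression models to experimental data and then minimizing them with a genetic algorithm, can be sketched as follows; the parameter ranges, response surfaces, and weights are invented for illustration and do not reproduce the paper's experiments:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical experimental data: laser power (x1) and cutting speed (x2)
# versus two measured quality characteristics (illustrative values only).
X = rng.uniform([1.0, 10.0], [3.0, 30.0], size=(20, 2))
kerf_width = 0.2 + 0.05*X[:, 0]**2 - 0.002*X[:, 1] + rng.normal(0, 0.01, 20)
kerf_taper = 1.0 - 0.1*X[:, 0] + 0.001*X[:, 1]**2 + rng.normal(0, 0.02, 20)

def quad_features(P):
    """Second-order model terms: 1, x1, x2, x1^2, x2^2, x1*x2."""
    x1, x2 = P[:, 0], P[:, 1]
    return np.column_stack([np.ones_like(x1), x1, x2, x1**2, x2**2, x1*x2])

# Least-squares fit of the second-order regression models.
beta_w, *_ = np.linalg.lstsq(quad_features(X), kerf_width, rcond=None)
beta_t, *_ = np.linalg.lstsq(quad_features(X), kerf_taper, rcond=None)

def objective(P, w=(0.5, 0.5)):
    """Weighted sum of the two fitted responses (both minimized)."""
    F = quad_features(P)
    return w[0]*(F @ beta_w) + w[1]*(F @ beta_t)

# Minimal (mu+lambda) evolutionary loop standing in for a full GA.
lo, hi = np.array([1.0, 10.0]), np.array([3.0, 30.0])
pop = rng.uniform(lo, hi, size=(40, 2))
for _ in range(100):
    children = np.clip(pop + rng.normal(0, 0.1, pop.shape)*(hi - lo), lo, hi)
    both = np.vstack([pop, children])
    pop = both[np.argsort(objective(both))][:40]

best = pop[0]
print("best parameters:", best, "objective:", objective(best[None])[0])
```

A real implementation would use a proper multi-objective GA (e.g. NSGA-II) rather than a fixed weighted sum, but the fit-then-optimize structure is the same.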
NASA Astrophysics Data System (ADS)
Khalilpourazari, Soheyl; Khalilpourazary, Saman
2017-05-01
In this article a multi-objective mathematical model is developed to minimize total time and cost while maximizing the production rate and surface finish quality in the grinding process. The model aims to determine optimal values of the decision variables considering process constraints. A lexicographic weighted Tchebycheff approach is developed to obtain efficient Pareto-optimal solutions of the problem in both rough and finished conditions. Utilizing a polyhedral branch-and-cut algorithm, the lexicographic weighted Tchebycheff model of the proposed multi-objective model is solved using GAMS software. The Pareto-optimal solutions provide a proper trade-off between conflicting objective functions which helps the decision maker to select the best values for the decision variables. Sensitivity analyses are performed to determine the effect of change in the grain size, grinding ratio, feed rate, labour cost per hour, length of workpiece, wheel diameter and downfeed of grinding parameters on each value of the objective function.
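The weighted Tchebycheff scalarization at the heart of this approach minimizes the largest weighted deviation from an ideal point; a minimal sketch with hypothetical grinding objectives (time, cost, negated production rate, surface roughness):

```python
import numpy as np

def weighted_tchebycheff(f, weights, ideal):
    """Weighted Tchebycheff scalarization: max_i w_i * |f_i - z*_i|."""
    return np.max(np.asarray(weights) * np.abs(np.asarray(f) - np.asarray(ideal)))

# Hypothetical objective vectors for three candidate parameter settings
# (all components to be minimized), with an ideal point z*.
candidates = {
    "A": [12.0, 5.0, -8.0, 0.9],
    "B": [10.0, 5.5, -8.5, 0.7],
    "C": [15.0, 4.0, -9.5, 1.1],
}
ideal = [9.0, 3.5, -10.0, 0.5]
weights = [0.25, 0.25, 0.25, 0.25]

scores = {k: weighted_tchebycheff(v, weights, ideal) for k, v in candidates.items()}
best = min(scores, key=scores.get)
print("Tchebycheff scores:", scores, "-> pick", best)
```

Varying the weights and re-minimizing traces out different Pareto-optimal trade-offs, which is how the article generates its efficient solutions.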
DOE Office of Scientific and Technical Information (OSTI.GOV)
Günther, Markus, E-mail: markus.guenther@tu-berlin.de; Geißler, Gesa; Köppel, Johann
As there is no one-and-only concept of how to precisely define and establish quality control (QC) or quality assurance (QA) in the making of environmental assessments (EA), this paper presents selected features of international approaches that address quality in EA systems in the USA, the Netherlands, Canada, and the United Kingdom. Based on explanative case studies, we highlight the embedding of specific quality control features within the EA systems, the objectives and processes, and relevant transparency challenges. Such features of QC/QA approaches can be considered in cases where substantial quality control and assurance efforts are still missing. Yet further research needs to be conducted on the efficacy of these approaches, which remains beyond the scope of this study. - Highlights: • We present four tools for quality control and assurance from different EA systems. • Approaches vary in institutional setting, objectives, procedures, and transparency. • Highlighted features might provide guidance in cases where QC/QA is still lacking.
Machine vision system: a tool for quality inspection of food and agricultural products.
Patel, Krishna Kumar; Kar, A; Jha, S N; Khan, M A
2012-04-01
Quality inspection of food and agricultural produce is difficult and labor intensive. At the same time, with increased expectations for food products of high quality and safety standards, the need for accurate, fast and objective quality determination of these characteristics in food products continues to grow. In India, however, these operations are generally manual, which is costly as well as unreliable, because human judgment in identifying quality factors such as appearance, flavor, nutrients and texture is inconsistent, subjective and slow. Machine vision provides one alternative: an automated, non-destructive and cost-effective technique to accomplish these requirements. This inspection approach, based on image analysis and processing, has found a variety of different applications in the food industry. Considerable research has highlighted its potential for the inspection and grading of fruits and vegetables, grain quality and characteristic examination, and quality evaluation of other food products such as bakery products, pizza, cheese and noodles. The objective of this paper is to provide an in-depth introduction to machine vision systems, their components, and recent work reported on food and agricultural produce.
NASA Astrophysics Data System (ADS)
Junker, Berit; Buchecker, Mattias; Müller-Böker, Ulrike
2007-10-01
River restoration as a measure to improve both flood protection and ecological quality has become a common practice in river management. This new practice, however, has also become a source of conflicts arising from a neglect of the social aspects in river restoration projects. Therefore appropriate public involvement strategies have been recommended in recent years as a way of coping with these conflicts. However, an open question remains: Which stakeholders should be involved in the decision-making process? This, in turn, raises the question of the appropriate objectives of public participation. This study aims to answer these questions drawing on two case studies of Swiss river restoration projects and a related representative nationwide survey. Our findings suggest that public involvement should not be restricted to a small circle of influential stakeholder groups. As restoration projects have been found to have a substantial impact on the quality of life of the local population, avoiding conflicts is only one of several objectives of the involvement process. Including the wider public provides a special opportunity to promote social objectives, such as trust building and identification of people with their local environment.
What Does it Mean to Publish Data in Earth System Science Data Journal?
NASA Astrophysics Data System (ADS)
Carlson, D.; Pfeiffenberger, H.
2015-12-01
The availability of more than 120 data sets in ESSD represents an unprecedented effort by providers, data centers and ESSD. ESSD data sets and their accompanying data descriptions undergo rigorous review. The data sets reside at any of more than 20 cooperating data centers. The ESSD publication process depends on, but also challenges, the concepts of digital object identification and exacerbates the varied interpretations of the phrase 'data publication'. ESSD adopts the digital object identifier (DOI). Key questions apply to DOIs and other identifiers. How will persistent identifiers point accurately to distributed or replicated data? How should data centers and data publishers use identifier technologies to ensure authenticity and integrity? Should metadata associated with identifiers distinguish among raw, quality-controlled and derived data processing levels, or indicate license or copyright status? Data centers publish data sets according to internal metadata standards but without indicators of quality control. Publication in this sense indicates availability. National data portals compile, serve and publish data products as a service to national researchers and, often, to meet national requirements. Publication in this second case indicates availability in a national context; the data themselves may still reside at separate data centers. Data journals such as ESSD or Scientific Data publish peer-reviewed, quality-controlled data sets. These data sets almost always reside at a separate data center; the journal and the center maintain explicit identifier linkages. Data journals add quality to the feature of availability. A single data set processed through these layers will generate three independent DOIs, but the DOIs will provide little information about availability or quality. Could the data world learn from the URL world and consider additions? Suffixes? Could we use our experience with processing levels or data maturity to propose and agree on such extensions?
Quality Assessment of TPB-Based Questionnaires: A Systematic Review
Oluka, Obiageli Crystal; Nie, Shaofa; Sun, Yi
2014-01-01
Objective This review is aimed at assessing the quality of questionnaires and their development process based on the theory of planned behavior (TPB) change model. Methods A systematic literature search for studies with the primary aim of TPB-based questionnaire development was conducted in relevant databases between 2002 and 2012 using selected search terms. Ten of 1,034 screened abstracts met the inclusion criteria and were assessed for methodological quality using two different appraisal tools: one for the overall methodological quality of each study and the other developed for the appraisal of the questionnaire content and development process. Both appraisal tools consisted of items regarding the likelihood of bias in each study and were eventually combined to give the overall quality score for each included study. Results 8 of the 10 included studies showed low risk of bias in the overall quality assessment of each study, while 9 of the studies were of high quality based on the quality appraisal of questionnaire content and development process. Conclusion Quality appraisal of the questionnaires in the 10 reviewed studies was successfully conducted, highlighting the top problem areas (including: sample size estimation; inclusion of direct and indirect measures; and inclusion of questions on demographics) in the development of TPB-based questionnaires and the need for researchers to provide a more detailed account of their development process. PMID:24722323
Data Quality Objectives Process for Designation of K Basins Debris
DOE Office of Scientific and Technical Information (OSTI.GOV)
WESTCOTT, J.L.
2000-05-22
The U.S. Department of Energy has developed a schedule and approach for the removal of spent fuels, sludge, and debris from the K East (KE) and K West (KW) Basins, located in the 100 Area at the Hanford Site. The project that is the subject of this data quality objective (DQO) process is focused on the removal of debris from the K Basins and onsite disposal of the debris at the Environmental Restoration Disposal Facility (ERDF). This material previously has been dispositioned at the Hanford Low-Level Burial Grounds (LLBGs) or Central Waste Complex (CWC). The goal of this DQO process and the resulting Sampling and Analysis Plan (SAP) is to provide the strategy for characterizing and designating the K-Basin debris to determine if it meets the Environmental Restoration Disposal Facility Waste Acceptance Criteria (WAC), Revision 3 (BHI 1998). A critical part of the DQO process is to agree on regulatory and WAC interpretation, to support preparation of the DQO workbook and SAP.
An open system approach to process reengineering in a healthcare operational environment.
Czuchry, A J; Yasin, M M; Norris, J
2000-01-01
The objective of this study is to examine the applicability of process reengineering in a healthcare operational environment. The intake process of a mental healthcare service delivery system is analyzed systematically to identify process-related problems. A methodology which utilizes an open system orientation coupled with process reengineering is utilized to overcome operational and patient related problems associated with the pre-reengineered intake process. The systematic redesign of the intake process resulted in performance improvements in terms of cost, quality, service and timing.
Quality control and quality assurance plan for bridge channel-stability assessments in Massachusetts
Parker, Gene W.; Pinson, Harlow
1993-01-01
A quality control and quality assurance plan has been implemented as part of the Massachusetts bridge scour and channel-stability assessment program. This program is being conducted by the U.S. Geological Survey, Massachusetts-Rhode Island District, in cooperation with the Massachusetts Highway Department. Project personnel training, data-integrity verification, and new data-management technologies are being utilized in the channel-stability assessment process to improve current data-collection and management techniques. An automated data-collection procedure has been implemented to standardize channel-stability assessments on a regular basis within the State. An object-oriented data structure and new image management tools are used to produce a data base enabling management of multiple data object classes. Data will be reviewed by assessors and data base managers before being merged into a master bridge-scour data base, which includes automated data-verification routines.
NASA Astrophysics Data System (ADS)
Flores, Jorge L.; García-Torales, G.; Ponce Ávila, Cristina
2006-08-01
This paper describes an in situ image recognition system designed to inspect the quality standards of the chocolate pops during their production. The essence of the recognition system is the localization of the events (i.e., defects) in the input images that affect the quality standards of pops. To this end, processing modules, based on correlation filter, and segmentation of images are employed with the objective of measuring the quality standards. Therefore, we designed the correlation filter and defined a set of features from the correlation plane. The desired values for these parameters are obtained by exploiting information about objects to be rejected in order to find the optimal discrimination capability of the system. Regarding this set of features, the pop can be correctly classified. The efficacy of the system has been tested thoroughly under laboratory conditions using at least 50 images, containing 3 different types of possible defects.
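The correlation-filter stage of such a system can be sketched with an FFT-based cross-correlation whose peak locates a candidate object; the toy scene and the peak-to-energy feature below are illustrative, not the authors' actual filter design:

```python
import numpy as np

def correlate2d_fft(image, template):
    """Circular cross-correlation via FFT; template zero-padded to image size."""
    F1 = np.fft.fft2(image)
    F2 = np.fft.fft2(template, s=image.shape)
    return np.real(np.fft.ifft2(F1 * np.conj(F2)))

# Toy scene: a bright 8x8 square "defect" on a dark background.
image = np.zeros((64, 64))
image[20:28, 30:38] = 1.0
template = np.ones((8, 8))  # matched filter for the square defect

plane = correlate2d_fft(image, template)
peak = plane.max()
peak_pos = np.unravel_index(plane.argmax(), plane.shape)

# A simple correlation-plane feature: peak-to-energy ratio. In the spirit
# of the paper, thresholds on such features drive the accept/reject decision.
pce = peak**2 / plane.var()
print("peak at", peak_pos, "value", peak, "PCE", pce)
```

In a real inspection system the filter would be designed from training images of good and defective products, and several plane features (peak height, sharpness, sidelobes) would feed the classifier.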
Biotechnology Science Experiments on Mir
NASA Technical Reports Server (NTRS)
Kroes, Roger L.
1999-01-01
This paper describes the microgravity biotechnology experiments carried out on the Shuttle/Mir program. Four experiments investigated the growth of protein crystals, and three investigated cellular growth. Many hundreds of protein samples were processed using four different techniques. The objective of these experiments was to determine optimum conditions for the growth of very high quality single crystals to be used for structure determination. The Biotechnology System (BTS) was used to process the three cell growth investigations. The samples processed by these experiments were: bovine chondrocytes, human renal epithelial cells, and human breast cancer cells and endothelial cells. The objective was to determine the unique properties of cell aggregates produced in the microgravity environment.
The Timing of Visual Object Categorization
Mack, Michael L.; Palmeri, Thomas J.
2011-01-01
An object can be categorized at different levels of abstraction: as natural or man-made, animal or plant, bird or dog, or as a Northern Cardinal or Pyrrhuloxia. There has been growing interest in understanding how quickly categorizations at different levels are made and how the timing of those perceptual decisions changes with experience. We specifically contrast two perspectives on the timing of object categorization at different levels of abstraction. By one account, the relative timing implies a relative timing of stages of visual processing that are tied to particular levels of object categorization: Fast categorizations are fast because they precede other categorizations within the visual processing hierarchy. By another account, the relative timing reflects when perceptual features are available over time and the quality of perceptual evidence used to drive a perceptual decision process: Fast simply means fast, it does not mean first. Understanding the short-term and long-term temporal dynamics of object categorizations is key to developing computational models of visual object recognition. We briefly review a number of models of object categorization and outline how they explain the timing of visual object categorization at different levels of abstraction. PMID:21811480
Multiview 3D sensing and analysis for high quality point cloud reconstruction
NASA Astrophysics Data System (ADS)
Satnik, Andrej; Izquierdo, Ebroul; Orjesek, Richard
2018-04-01
Multiview 3D reconstruction techniques enable digital reconstruction of 3D objects from the real world by fusing different viewpoints of the same object into a single 3D representation. This process is by no means trivial and the acquisition of high quality point cloud representations of dynamic 3D objects is still an open problem. In this paper, an approach for high fidelity 3D point cloud generation using low cost 3D sensing hardware is presented. The proposed approach runs in an efficient low-cost hardware setting based on several Kinect v2 scanners connected to a single PC. It performs autocalibration and runs in real-time exploiting an efficient composition of several filtering methods including Radius Outlier Removal (ROR), Weighted Median filter (WM) and Weighted Inter-Frame Average filtering (WIFA). The performance of the proposed method has been demonstrated through efficient acquisition of dense 3D point clouds of moving objects.
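Of the filters named above, Radius Outlier Removal is the simplest to sketch: a point survives only if enough neighbors lie within a given radius. The brute-force version below is illustrative (a production pipeline would use a KD-tree), and the synthetic cloud is hypothetical:

```python
import numpy as np

def radius_outlier_removal(points, radius=0.05, min_neighbors=5):
    """Keep only points with at least `min_neighbors` other points within
    `radius` (brute-force O(n^2) distance computation)."""
    diff = points[:, None, :] - points[None, :, :]
    dist = np.linalg.norm(diff, axis=-1)
    neighbor_counts = (dist < radius).sum(axis=1) - 1  # exclude self
    return points[neighbor_counts >= min_neighbors]

rng = np.random.default_rng(1)
cluster = rng.normal(0.0, 0.01, size=(200, 3))   # dense surface patch
outliers = rng.uniform(-1.0, 1.0, size=(10, 3))  # sparse sensor noise
cloud = np.vstack([cluster, outliers])

filtered = radius_outlier_removal(cloud, radius=0.05, min_neighbors=5)
print(len(cloud), "->", len(filtered), "points after ROR")
```

The isolated noise points have almost no neighbors within the radius and are discarded, while the dense surface patch survives, which is exactly the role ROR plays before the median-based smoothing filters.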
Exploring the feasibility of traditional image querying tasks for industrial radiographs
NASA Astrophysics Data System (ADS)
Bray, Iliana E.; Tsai, Stephany J.; Jimenez, Edward S.
2015-08-01
Although there have been great strides in object recognition with optical images (photographs), there has been comparatively little research into object recognition for X-ray radiographs. Our exploratory work contributes to this area by creating an object recognition system designed to recognize components from a related database of radiographs. Object recognition for radiographs must be approached differently than for optical images, because radiographs have much less color-based information to distinguish objects, and they exhibit transmission overlap that alters perceived object shapes. The dataset used in this work contained more than 55,000 intermixed radiographs and photographs, all in a compressed JPEG form and with multiple ways of describing pixel information. For this work, a robust and efficient system is needed to combat problems presented by properties of the X-ray imaging modality, the large size of the given database, and the quality of the images contained in said database. We have explored various pre-processing techniques to clean the cluttered and low-quality images in the database, and we have developed our object recognition system by combining multiple object detection and feature extraction methods. We present the preliminary results of the still-evolving hybrid object recognition system.
Modeling Healthcare Processes Using Commitments: An Empirical Evaluation.
Telang, Pankaj R; Kalia, Anup K; Singh, Munindar P
2015-01-01
The two primary objectives of this paper are: (a) to demonstrate how Comma, a business modeling methodology based on commitments, can be applied in healthcare process modeling, and (b) to evaluate the effectiveness of such an approach in producing healthcare process models. We apply the Comma approach on a breast cancer diagnosis process adapted from an HHS committee report, and present the results of an empirical study that compares Comma with a traditional approach based on the HL7 Messaging Standard (Traditional-HL7). Our empirical study involved 47 subjects and two phases. In the first phase, we partitioned the subjects into two approximately equal groups. We gave each group the same requirements based on a process scenario for breast cancer diagnosis. Members of one group first applied Traditional-HL7 and then Comma, whereas members of the second group first applied Comma and then Traditional-HL7, each on the above-mentioned requirements. Thus, each subject produced two models, each model being a set of UML Sequence Diagrams. In the second phase, we repartitioned the subjects into two groups with approximately equal distributions from both original groups. We developed exemplar Traditional-HL7 and Comma models; we gave one repartitioned group our Traditional-HL7 model and the other repartitioned group our Comma model. We provided the same changed set of requirements to all subjects and asked them to modify the provided exemplar model to satisfy the new requirements. We assessed solutions produced by subjects in both phases with respect to measures of flexibility, time, difficulty, objective quality, and subjective quality. Our study found that Comma is superior to Traditional-HL7 in flexibility and objective quality, as validated via Student's t-test at the 10% level of significance. Comma is a promising new approach for modeling healthcare processes. Further gains could be made through improved tooling and enhanced training of modeling personnel.
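A paired Student's t-test of the kind used in this evaluation can be sketched as follows; the scores are simulated, and the two-sided critical value for df = 46 at the 10% level is hard-coded as an assumption rather than computed from a t distribution:

```python
import numpy as np

def paired_t_statistic(a, b):
    """t statistic for paired samples based on the differences a_i - b_i."""
    d = np.asarray(a, float) - np.asarray(b, float)
    return d.mean() / (d.std(ddof=1) / np.sqrt(len(d)))

rng = np.random.default_rng(7)
n = 47  # matching the study's 47 subjects
# Hypothetical quality scores (0-10) for models produced with each approach.
comma_scores = np.clip(rng.normal(7.5, 1.0, n), 0, 10)
hl7_scores = np.clip(comma_scores - rng.normal(0.8, 1.0, n), 0, 10)

t = paired_t_statistic(comma_scores, hl7_scores)
# Approximate two-sided critical value at the 10% level for df = 46.
T_CRIT = 1.679
print(f"t = {t:.2f}; significant at 10%: {abs(t) > T_CRIT}")
```

Each subject produced a model under both approaches, so a paired (within-subject) test is the appropriate design; `scipy.stats.ttest_rel` would give the exact p-value.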
SHERPA: an image segmentation and outline feature extraction tool for diatoms and other objects.
Kloster, Michael; Kauer, Gerhard; Beszteri, Bánk
2014-06-25
Background Light microscopic analysis of diatom frustules is widely used both in basic and applied research, notably taxonomy, morphometrics, water quality monitoring and paleo-environmental studies. In these applications, usually large numbers of frustules need to be identified and/or measured. Although there is a need for automation in these applications, and image processing and analysis methods supporting these tasks have previously been developed, they did not become widespread in diatom analysis. While methodological reports for a wide variety of methods for image segmentation, diatom identification and feature extraction are available, no single implementation combining a subset of these into a readily applicable workflow accessible to diatomists exists. Results The newly developed tool SHERPA offers a versatile image processing workflow focused on the identification and measurement of object outlines, handling all steps from image segmentation over object identification to feature extraction, and providing interactive functions for reviewing and revising results. Special attention was given to ease of use, applicability to a broad range of data and problems, and supporting high throughput analyses with minimal manual intervention. Conclusions Tested with several diatom datasets from different sources and of various compositions, SHERPA proved its ability to successfully analyze large amounts of diatom micrographs depicting a broad range of species.
SHERPA is unique in combining the following features: application of multiple segmentation methods and selection of the one giving the best result for each individual object; identification of shapes of interest based on outline matching against a template library; quality scoring and ranking of resulting outlines supporting quick quality checking; extraction of a wide range of outline shape descriptors widely used in diatom studies and elsewhere; minimizing the need for, but enabling manual quality control and corrections. Although primarily developed for analyzing images of diatom valves originating from automated microscopy, SHERPA can also be useful for other object detection, segmentation and outline-based identification problems. PMID:24964954
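SHERPA's strategy of applying multiple segmentation methods and keeping the best result per object can be sketched as below; the two candidate segmenters (a fixed threshold and Otsu's method) and the compactness-based quality score are illustrative stand-ins, not SHERPA's actual algorithms:

```python
import numpy as np

def threshold_segment(img, t):
    """Binary segmentation by a global threshold."""
    return img > t

def otsu_threshold(img):
    """Otsu's method on a 256-bin histogram (pixel values in [0, 1])."""
    hist, _ = np.histogram(img, bins=256, range=(0.0, 1.0))
    p = hist / hist.sum()
    bins = (np.arange(256) + 0.5) / 256
    best_t, best_var = 0.5, -1.0
    for k in range(1, 256):
        w0, w1 = p[:k].sum(), p[k:].sum()
        if w0 == 0 or w1 == 0:
            continue
        mu0 = (p[:k] * bins[:k]).sum() / w0
        mu1 = (p[k:] * bins[k:]).sum() / w1
        var = w0 * w1 * (mu0 - mu1) ** 2
        if var > best_var:
            best_var, best_t = var, bins[k]
    return best_t

def quality_score(mask):
    """Toy outline-quality score: compactness 4*pi*area/perimeter^2
    (1.0 for a perfect disc); real tools use richer outline criteria."""
    area = mask.sum()
    edges = (mask ^ np.roll(mask, 1, 0)).sum() + (mask ^ np.roll(mask, 1, 1)).sum()
    return 4 * np.pi * area / max(edges, 1) ** 2

# Synthetic "frustule": a bright disc on a noisy background.
yy, xx = np.mgrid[0:64, 0:64]
img = ((yy - 32) ** 2 + (xx - 32) ** 2 < 15 ** 2) * 0.8
img = img + np.random.default_rng(3).normal(0, 0.05, img.shape)

candidates = {
    "fixed-0.4": threshold_segment(img, 0.4),
    "otsu": threshold_segment(img, otsu_threshold(img)),
}
best = max(candidates, key=lambda k: quality_score(candidates[k]))
print("selected segmentation:", best)
```

The per-object scoring and ranking is what lets a tool like SHERPA flag dubious outlines for manual review while passing clean ones through untouched.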
Lincoln, Tricia A.; Horan-Ross, Debra A.; McHale, Michael R.; Lawrence, Gregory B.
2009-01-01
The laboratory for analysis of low-ionic-strength water at the U.S. Geological Survey (USGS) Water Science Center in Troy, N.Y., analyzes samples collected by USGS projects throughout the Northeast. The laboratory's quality-assurance program is based on internal and interlaboratory quality-assurance samples and quality-control procedures that were developed to ensure proper sample collection, processing, and analysis. The quality-assurance and quality-control data were stored in the laboratory's Lab Master data-management system, which provides efficient review, compilation, and plotting of data. This report presents and discusses results of quality-assurance and quality control samples analyzed from July 2005 through June 2007. Results for the quality-control samples for 19 analytical procedures were evaluated for bias and precision. Control charts indicate that data for eight of the analytical procedures were occasionally biased for either high-concentration or low-concentration samples but were within control limits; these procedures were: total aluminum, calcium, magnesium, nitrate (colorimetric method), potassium, silicon, sodium, and sulfate. Eight of the analytical procedures were biased throughout the analysis period for the high-concentration sample, but were within control limits; these procedures were: total aluminum, calcium, dissolved organic carbon, chloride, nitrate (ion chromatograph), potassium, silicon, and sulfate. The magnesium and pH procedures were biased throughout the analysis period for the low-concentration sample, but were within control limits. The acid-neutralizing capacity, total monomeric aluminum, nitrite, and specific conductance procedures were biased for the high-concentration and low-concentration samples, but were within control limits. 
Results from the filter-blank and analytical-blank analyses indicated that the procedures for 16 of 17 analytes were within control limits, although the concentrations for blanks were occasionally outside the control limits. The data-quality objective was not met for dissolved organic carbon. Sampling and analysis precision are evaluated herein in terms of the coefficient of variation obtained for triplicate samples in the procedures for 18 of the 21 analytes. At least 93 percent of the samples met data-quality objectives for all analytes except acid-neutralizing capacity (85 percent of samples met objectives), total monomeric aluminum (83 percent of samples met objectives), total aluminum (85 percent of samples met objectives), and chloride (85 percent of samples met objectives). The ammonium and total dissolved nitrogen did not meet the data-quality objectives. Results of the USGS interlaboratory Standard Reference Sample (SRS) Project met the Troy Laboratory data-quality objectives for 87 percent of the samples analyzed. The P-sample (low-ionic-strength constituents) analysis had two outliers each in two studies. The T-sample (trace constituents) analysis and the N-sample (nutrient constituents) analysis had one outlier each in two studies. Results of Environment Canada's National Water Research Institute (NWRI) program indicated that at least 85 percent of the samples met data-quality objectives for 11 of the 14 analytes; the exceptions were acid-neutralizing capacity, total aluminum and ammonium. Data-quality objectives were not met in 41 percent of samples analyzed for acid-neutralizing capacity, 50 percent of samples analyzed for total aluminum, and 44 percent of samples analyzed for ammonium. Results from blind reference-sample analyses indicated that data-quality objectives were met by at least 86 percent of the samples analyzed for calcium, magnesium, pH, potassium, and sodium. 
Data-quality objectives were met by 76 percent of the samples analyzed for chloride, 80 percent of the samples analyzed for specific conductance, and 77 percent of the samples analyzed for sulfate.
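The precision screening described in these reports — a coefficient of variation computed from triplicate samples and compared against a data-quality objective — can be sketched as follows (the replicate values and the 5 percent limit are illustrative assumptions, not figures from the report):

```python
import statistics

def triplicate_cv(values):
    """Coefficient of variation (percent) for one set of replicate analyses."""
    mean = statistics.mean(values)
    if mean == 0:
        raise ValueError("CV is undefined for a zero mean")
    return 100.0 * statistics.stdev(values) / mean

def meets_objective(values, cv_limit_percent):
    """True if the replicate set meets the data-quality objective."""
    return triplicate_cv(values) <= cv_limit_percent

# Hypothetical triplicate sulfate determinations (mg/L) and an assumed 5% DQO
reps = [2.51, 2.48, 2.53]
print(round(triplicate_cv(reps), 2))
print(meets_objective(reps, cv_limit_percent=5.0))
```

The fraction of triplicate sets passing this check, per analyte, is what the reports summarize as the percentage of samples meeting data-quality objectives.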
Lincoln, Tricia A.; Horan-Ross, Debra A.; McHale, Michael R.; Lawrence, Gregory B.
2009-01-01
The laboratory for analysis of low-ionic-strength water at the U.S. Geological Survey (USGS) Water Science Center in Troy, N.Y., analyzes samples collected by USGS projects throughout the Northeast. The laboratory's quality-assurance program is based on internal and interlaboratory quality-assurance samples and quality-control procedures that were developed to ensure proper sample collection, processing, and analysis. The quality-assurance and quality-control data were stored in the laboratory's Lab Master data-management system, which provides efficient review, compilation, and plotting of data. This report presents and discusses results of quality-assurance and quality control samples analyzed from July 2001 through June 2003. Results for the quality-control samples for 19 analytical procedures were evaluated for bias and precision. Control charts indicate that data for six of the analytical procedures were occasionally biased for either high-concentration or low-concentration samples but were within control limits; these procedures were: acid-neutralizing capacity, chloride, magnesium, nitrate (ion chromatography), potassium, and sodium. The calcium procedure was biased throughout the analysis period for the high-concentration sample, but was within control limits. The total monomeric aluminum and fluoride procedures were biased throughout the analysis period for the low-concentration sample, but were within control limits. The total aluminum, pH, specific conductance, and sulfate procedures were biased for the high-concentration and low-concentration samples, but were within control limits. Results from the filter-blank and analytical-blank analyses indicate that the procedures for 16 of 18 analytes were within control limits, although the concentrations for blanks were occasionally outside the control limits. The data-quality objective was not met for the dissolved organic carbon or specific conductance procedures. 
Sampling and analysis precision are evaluated herein in terms of the coefficient of variation obtained for triplicate samples in the procedures for 18 of the 21 analytes. At least 90 percent of the samples met data-quality objectives for all procedures except total monomeric aluminum (83 percent of samples met objectives), total aluminum (76 percent of samples met objectives), ammonium (73 percent of samples met objectives), dissolved organic carbon (86 percent of samples met objectives), and nitrate (81 percent of samples met objectives). The data-quality objective was not met for the nitrite procedure. Results of the USGS interlaboratory Standard Reference Sample (SRS) Project indicated satisfactory or above data quality over the time period, with most performance ratings for each sample in the good-to-excellent range. The N-sample (nutrient constituents) analysis had one unsatisfactory rating for the ammonium procedure in one study. The T-sample (trace constituents) analysis had one unsatisfactory rating for the magnesium procedure and one marginal rating for the potassium procedure in one study and one unsatisfactory rating for the sodium procedure in another. Results of Environment Canada's National Water Research Institute (NWRI) program indicated that at least 90 percent of the samples met data-quality objectives for 10 of the 14 analytes; the exceptions were acid-neutralizing capacity, ammonium, dissolved organic carbon, and sodium. Data-quality objectives were not met in 37 percent of samples analyzed for acid-neutralizing capacity, 28 percent of samples analyzed for dissolved organic carbon, and 30 percent of samples analyzed for sodium. Results indicate a positive bias for the ammonium procedure in one study and a negative bias in another. Results from blind reference-sample analyses indicated that data-quality objectives were met by at least 90 percent of the samples analyzed for calcium, chloride, magnesium, pH, potassium, and sodium. 
Data-quality objectives were met by 78 percent of
Water recovery by catalytic treatment of urine vapor
NASA Technical Reports Server (NTRS)
Budininkas, P.; Quattrone, P. D.; Leban, M. I.
1980-01-01
The objective of this investigation was to demonstrate the feasibility of water recovery on a man-rated scale by the catalytic processing of untreated urine vapor. For this purpose, two catalytic systems, one capable of processing an air stream containing low urine vapor concentrations and another to process streams with high urine vapor concentrations, were designed, constructed, and tested to establish the quality of the recovered water.
Lincoln, Tricia A.; Horan-Ross, Debra A.; McHale, Michael R.; Lawrence, Gregory B.
2001-01-01
A laboratory for analysis of low-ionic-strength water has been developed at the U.S. Geological Survey (USGS) office in Troy, N.Y., to analyze samples collected by USGS projects in the Northeast. The laboratory's quality-assurance program is based on internal and interlaboratory quality-assurance samples and quality-control procedures developed to ensure proper sample collection, processing, and analysis. The quality-assurance/quality-control data are stored in the laboratory's SAS data-management system, which provides efficient review, compilation, and plotting of quality-assurance/quality-control data. This report presents and discusses samples analyzed from July 1993 through June 1995. Quality-control results for 18 analytical procedures were evaluated for bias and precision. Control charts show that data from seven of the analytical procedures were biased throughout the analysis period for either high-concentration or low-concentration samples but were within control limits; these procedures were: acid-neutralizing capacity, dissolved inorganic carbon, dissolved organic carbon (soil expulsions), chloride, magnesium, nitrate (colorimetric method), and pH. Three of the analytical procedures were occasionally biased but were within control limits; they were: calcium (high for high-concentration samples for May 1995), dissolved organic carbon (high for high-concentration samples from January through September 1994), and fluoride (high in samples for April and June 1994). No quality-control sample has been developed for the organic monomeric aluminum procedure. Results from the filter-blank and analytical-blank analyses indicate that all analytical procedures in which blanks were run were within control limits, although values for a few blanks were outside the control limits. Blanks were not analyzed for acid-neutralizing capacity, dissolved inorganic carbon, fluoride, nitrate (colorimetric method), or pH.
Sampling and analysis precision are evaluated herein in terms of the coefficient of variation obtained for triplicate samples in 14 of the 18 procedures. Data-quality objectives were met by more than 90 percent of the samples analyzed in all procedures except total monomeric aluminum (85 percent of samples met objectives), total aluminum (70 percent of samples met objectives), and dissolved organic carbon (85 percent of samples met objectives). Triplicate samples were not analyzed for ammonium, fluoride, dissolved inorganic carbon, or nitrate (colorimetric method). Results of the USGS interlaboratory Standard Reference Sample Program indicated high data quality with a median result of 3.6 of a possible 4.0. Environment Canada's LRTAP interlaboratory study results indicated that more than 85 percent of the samples met data-quality objectives in 6 of the 12 analyses; exceptions were calcium, dissolved organic carbon, chloride, pH, potassium, and sodium. Data-quality objectives were not met for calcium samples in one LRTAP study, but 94 percent of samples analyzed were within control limits for the remaining studies. Data-quality objectives were not met by 35 percent of samples analyzed for dissolved organic carbon, but 94 percent of sample values were within 20 percent of the most probable value. Data-quality objectives were not met for 30 percent of samples analyzed for chloride, but 90 percent of sample values were within 20 percent of the most probable value. Measurements of samples with a pH above 6.0 were biased high in 54 percent of the samples, although 85 percent of the samples met data-quality objectives for pH measurements below 6.0. Data-quality objectives for potassium and sodium were not met in one study (only 33 percent of the samples analyzed met the objectives), although 85 percent of the sample values were within control limits for the other studies. Measured sodium values were above the upper control limit in all studies. 
Results from blind reference-sample analyses indicated that data
Practical quality control tools for curves and surfaces
NASA Technical Reports Server (NTRS)
Small, Scott G.
1992-01-01
Curves (geometry) and surfaces created by Computer Aided Geometric Design systems in the engineering environment must satisfy two basic quality criteria: the geometric shape must have the desired engineering properties; and the objects must be parameterized in a way which does not cause computational difficulty for geometric processing and engineering analysis. Interactive techniques are described which are in use at Boeing to evaluate the quality of aircraft geometry prior to Computational Fluid Dynamic analysis, including newly developed methods for examining surface parameterization and its effects.
The ventral visual pathway: An expanded neural framework for the processing of object quality
Kravitz, Dwight J.; Saleem, Kadharbatcha S.; Baker, Chris I.; Ungerleider, Leslie G.; Mishkin, Mortimer
2012-01-01
Since the original characterization of the ventral visual pathway, our knowledge of its neuroanatomy, functional properties, and extrinsic targets has grown considerably. Here we synthesize this recent evidence and propose that the ventral pathway is best understood as a recurrent occipitotemporal network containing neural representations of object quality both utilized and constrained by at least six distinct cortical and subcortical systems. Each system serves its own specialized behavioral, cognitive, or affective function, collectively providing the raison d'être for the ventral visual pathway. This expanded framework contrasts with the depiction of the ventral visual pathway as a largely serial staged hierarchy that culminates in singular object representations for utilization mainly by ventrolateral prefrontal cortex and, more parsimoniously than this account, incorporates attentional, contextual, and feedback effects. PMID:23265839
Chen, Xudong; Xu, Zhongwen; Yao, Liming; Ma, Ning
2018-03-05
This study considers the two factors of environmental protection and economic benefits to address municipal sewage treatment. Based on considerations regarding the sewage treatment plant construction site, processing technology, capital investment, operation costs, water pollutant emissions, water quality and other indicators, we establish a general multi-objective decision model for optimizing municipal sewage treatment plant construction. Using the construction of a sewage treatment plant in a suburb of Chengdu as an example, this paper tests the general model of multi-objective decision-making for the sewage treatment plant construction by implementing a genetic algorithm. The results show the applicability and effectiveness of the multi-objective decision model for the sewage treatment plant. This paper provides decision and technical support for the optimization of municipal sewage treatment.
Multi-objective decision-making under uncertainty: Fuzzy logic methods
NASA Technical Reports Server (NTRS)
Hardy, Terry L.
1995-01-01
Fuzzy logic allows for quantitative representation of vague or fuzzy objectives, and therefore is well-suited for multi-objective decision-making. This paper presents methods employing fuzzy logic concepts to assist in the decision-making process. In addition, this paper describes software developed at NASA Lewis Research Center for assisting in the decision-making process. Two diverse examples are used to illustrate the use of fuzzy logic in choosing an alternative among many options and objectives. One example is the selection of a lunar lander ascent propulsion system, and the other example is the selection of an aeration system for improving the water quality of the Cuyahoga River in Cleveland, Ohio. The fuzzy logic techniques provided here are powerful tools which complement existing approaches, and therefore should be considered in future decision-making activities.
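The NASA Lewis software itself is not shown in the abstract; as a hedged illustration of the general idea, a minimal fuzzy multi-objective ranking might map each raw criterion onto a [0, 1] membership degree and score each alternative by its worst-satisfied objective. All option names and numbers below are hypothetical:

```python
def ramp(x, low, high):
    """Fuzzy membership: 0 below `low`, 1 above `high`, linear in between."""
    if x <= low:
        return 0.0
    if x >= high:
        return 1.0
    return (x - low) / (high - low)

def rank_alternatives(alternatives, criteria):
    """Score each alternative by the minimum (most conservative) membership
    across all fuzzy objectives, then sort best-first."""
    scored = []
    for name, raw in alternatives.items():
        degrees = [ramp(raw[c], lo, hi) for c, (lo, hi) in criteria.items()]
        scored.append((min(degrees), name))
    return sorted(scored, reverse=True)

# Hypothetical aeration-system options scored on cost margin and
# dissolved-oxygen improvement (illustrative numbers only).
criteria = {"cost_margin": (0.0, 10.0), "do_gain": (0.0, 4.0)}
options = {
    "diffused_air": {"cost_margin": 6.0, "do_gain": 3.0},
    "surface_aerator": {"cost_margin": 9.0, "do_gain": 1.0},
}
print(rank_alternatives(options, criteria))
```

The min-aggregation used here is only one of several fuzzy operators; weighted or compensatory aggregations are common alternatives.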
Desired Precision in Multi-Objective Optimization: Epsilon Archiving or Rounding Objectives?
NASA Astrophysics Data System (ADS)
Asadzadeh, M.; Sahraei, S.
2016-12-01
Multi-objective optimization (MO) aids in supporting the decision-making process in water resources engineering and design problems. One of the main goals of solving an MO problem is to archive a set of solutions that is well-distributed across a wide range of all the design objectives. Modern MO algorithms use the epsilon dominance concept to define a mesh with pre-defined grid-cell size (often called epsilon) in the objective space and archive at most one solution in each grid cell. Epsilon can be set to the desired precision level of each objective function to make sure that the difference between each pair of archived solutions is meaningful. This epsilon archiving process is computationally expensive in problems that have quick-to-evaluate objective functions. This research explores the applicability of a similar but computationally more efficient approach to respect the desired precision level of all objectives in the solution archiving process. In this alternative approach, each objective function is rounded to the desired precision level before comparing any new solution to the set of archived solutions that already have rounded objective function values. This alternative solution archiving approach is compared to the epsilon archiving approach in terms of efficiency and quality of archived solutions for solving mathematical test problems and hydrologic model calibration problems.
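A minimal sketch of the rounding-based alternative described above might look like the following (the dominance rule, precision levels, and objective values are illustrative assumptions for a minimization problem):

```python
def dominates(a, b):
    """Pareto dominance for minimization: a is no worse in every objective
    and strictly better in at least one."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def archive_rounded(solutions, precision):
    """Round each objective to its desired precision first, then keep only
    the mutually non-dominated, distinct rounded vectors."""
    rounded = {tuple(round(v, p) for v, p in zip(s, precision)) for s in solutions}
    return {s for s in rounded
            if not any(dominates(o, s) for o in rounded if o != s)}

# Two solutions closer together than the desired precision collapse to one
# archive entry, mimicking the one-solution-per-grid-cell effect of epsilon
# archiving without pairwise epsilon-box comparisons.
sols = [(1.23, 4.56), (1.234, 4.558), (0.9, 5.1)]
archive = archive_rounded(sols, precision=(1, 1))
print(sorted(archive))
```

Note that rounding fixes the grid to the origin, whereas epsilon archiving can keep the representative solution anywhere inside its box; that is the main behavioral difference between the two approaches.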
Stable image acquisition for mobile image processing applications
NASA Astrophysics Data System (ADS)
Henning, Kai-Fabian; Fritze, Alexander; Gillich, Eugen; Mönks, Uwe; Lohweg, Volker
2015-02-01
Today, mobile devices (smartphones, tablets, etc.) are widespread and of high importance for their users. Their performance as well as versatility increases over time. This leads to the opportunity to use such devices for more specific tasks like image processing in an industrial context. For the analysis of images, requirements like image quality (blur, illumination, etc.) as well as a defined relative position of the object to be inspected are crucial. Since mobile devices are handheld and used in constantly changing environments, the challenge is to fulfill these requirements. We present an approach to overcome the obstacles and stabilize the image capturing process such that image analysis becomes significantly improved on mobile devices. Therefore, image processing methods are combined with sensor fusion concepts. The approach consists of three main parts. First, pose estimation methods are used to guide a user moving the device to a defined position. Second, the sensor data and the pose information are combined for relative motion estimation. Finally, the image capturing process is automated. It is triggered depending on the alignment of the device and the object as well as the image quality that can be achieved under consideration of motion and environmental effects.
Models of formation and some algorithms of hyperspectral image processing
NASA Astrophysics Data System (ADS)
Achmetov, R. N.; Stratilatov, N. R.; Yudakov, A. A.; Vezenov, V. I.; Eremeev, V. V.
2014-12-01
Algorithms and information technologies for processing Earth hyperspectral imagery are presented. Several new approaches are discussed. Peculiar properties of processing the hyperspectral imagery, such as multifold signal-to-noise reduction, atmospheric distortions, access to spectral characteristics of every image point, and high dimensionality of data, were studied. Different measures of similarity between individual hyperspectral image points and the effect of additive uncorrelated noise on these measures were analyzed. It was shown that these measures are substantially affected by noise, and a new measure free of this disadvantage was proposed. The problem of detecting the observed scene object boundaries, based on comparing the spectral characteristics of image points, is considered. It was shown that contours are processed much better when spectral characteristics are used instead of energy brightness. A statistical approach to the correction of atmospheric distortions, which makes it possible to solve the stated problem based on analysis of a distorted image in contrast to analytical multiparametric models, was proposed. Several algorithms used to integrate spectral zonal images with data from other survey systems, which make it possible to image observed scene objects with a higher quality, are considered. Quality characteristics of hyperspectral data processing were proposed and studied.
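The noise-robust measure proposed in the paper is not specified in this abstract; as a point of comparison, the widely used spectral angle between two pixel spectra — insensitive to multiplicative brightness changes but, as the abstract notes for such measures, still perturbed by additive noise — can be computed as follows (the band values are toy numbers):

```python
import math

def spectral_angle(u, v):
    """Spectral angle (radians) between two pixel spectra."""
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    # Clamp for floating-point safety before acos.
    return math.acos(max(-1.0, min(1.0, dot / (nu * nv))))

grass = [0.05, 0.08, 0.45, 0.50]        # toy 4-band spectrum
grass_bright = [2 * x for x in grass]   # same material, brighter pixel
soil = [0.20, 0.25, 0.30, 0.32]

print(spectral_angle(grass, grass_bright))  # ~0: scaling leaves the angle unchanged
print(spectral_angle(grass, soil))
```

Comparing spectral shape rather than energy brightness in this way is the kind of point-wise similarity the abstract describes for boundary detection between scene objects.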
NASA Astrophysics Data System (ADS)
Baillard, C.; Dissard, O.; Jamet, O.; Maître, H.
Above-ground analysis is a key step in the reconstruction of urban scenes, but it is a difficult task because of the diversity of the involved objects. We propose a new method for above-ground extraction from an aerial stereo pair, which does not require any assumption about object shape or nature. A Digital Surface Model is first produced by a stereoscopic matching stage preserving discontinuities, and then processed by a region-based Markovian classification algorithm. The produced above-ground areas are finally characterized as man-made or natural according to the grey level information. The quality of the results is assessed and discussed.
Least-squares luma-chroma demultiplexing algorithm for Bayer demosaicking.
Leung, Brian; Jeon, Gwanggil; Dubois, Eric
2011-07-01
This paper addresses the problem of interpolating missing color components at the output of a Bayer color filter array (CFA), a process known as demosaicking. A luma-chroma demultiplexing algorithm is presented in detail, using a least-squares design methodology for the required bandpass filters. A systematic study of objective demosaicking performance and system complexity is carried out, and several system configurations are recommended. The method is compared with other benchmark algorithms in terms of CPSNR and S-CIELAB ∆E∗ objective quality measures and demosaicking speed. It was found to provide excellent performance and the best quality-speed tradeoff among the methods studied.
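The CPSNR figure of merit used in such comparisons pools squared error over all pixels and all three color channels before converting to decibels; a minimal sketch (the 8-bit peak value and the sample pixels are assumptions):

```python
import math

def cpsnr(reference, test, peak=255.0):
    """Color PSNR: mean squared error pooled over every pixel and every
    RGB channel of two equally sized images, expressed in dB."""
    n = 0
    sse = 0.0
    for ref_px, test_px in zip(reference, test):
        for r, t in zip(ref_px, test_px):
            sse += (r - t) ** 2
            n += 1
    if sse == 0:
        return float("inf")  # identical images
    return 10.0 * math.log10(peak ** 2 / (sse / n))

# Two hypothetical 2-pixel RGB images (original vs. demosaicked output)
ref = [(100, 150, 200), (50, 60, 70)]
out = [(101, 150, 198), (50, 62, 70)]
print(round(cpsnr(ref, out), 2))
```

Higher CPSNR means the demosaicked image is closer to the reference; the S-CIELAB ∆E* measure mentioned in the abstract instead weights errors perceptually in an opponent color space.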
Improving plan quality for prostate volumetric-modulated arc therapy.
Wright, Katrina; Ferrari-Anderson, Janet; Barry, Tamara; Bernard, Anne; Brown, Elizabeth; Lehman, Margot; Pryor, David
2017-01-01
We critically evaluated the quality and consistency of volumetric-modulated arc therapy (VMAT) prostate planning at a single institution to quantify objective measures for plan quality and establish clear guidelines for plan evaluation and quality assurance. A retrospective analysis was conducted on 34 plans generated on the Pinnacle 3 version 9.4 and 9.8 treatment planning system to deliver 78 Gy in 39 fractions to the prostate only using VMAT. Data were collected on contoured structure volumes, overlaps and expansions, planning target volume (PTV) and organs at risk volumes and relationship, dose volume histogram, plan conformity, plan homogeneity, low-dose wash, and beam parameters. Standard descriptive statistics were used to describe the data. Despite a standardized planning protocol, we found variability was present in all steps of the planning process. Deviations from protocol contours by radiation oncologists and radiation therapists occurred in 12% and 50% of cases, respectively, and the number of optimization parameters ranged from 12 to 27 (median 17). This contributed to conflicts within the optimization process reflected by the mean composite objective value of 0.07 (range 0.01 to 0.44). Methods used to control low-intermediate dose wash were inconsistent. At the PTV rectum interface, the dose-gradient distance from the 74.1 Gy to 40 Gy isodose ranged from 0.6 cm to 2.0 cm (median 1.0 cm). Increasing collimator angle was associated with a decrease in monitor units and a single full 6 MV arc was sufficient for the majority of plans. A significant relationship was found between clinical target volume-rectum distance and rectal tolerances achieved. A linear relationship was determined between the PTV volume and volume of 40 Gy isodose. Objective values and composite objective values were useful in determining plan quality. 
Anatomic geometry and overlap of structures has a measurable impact on the plan quality achieved for prostate patients being treated with VMAT. By evaluating multiple planning variables, we have been able to determine important factors influencing plan quality and develop predictive models for quality metrics that have been incorporated into our new protocol and will be tested and refined in future studies. Crown Copyright © 2017. Published by Elsevier Inc. All rights reserved.
Object recognition through turbulence with a modified plenoptic camera
NASA Astrophysics Data System (ADS)
Wu, Chensheng; Ko, Jonathan; Davis, Christopher
2015-03-01
Atmospheric turbulence adds accumulated distortion to images obtained by cameras and surveillance systems. When the turbulence grows stronger or when the object is further away from the observer, increasing the recording device resolution helps little to improve the quality of the image. Many sophisticated methods to correct the distorted images have been invented, such as using a known feature on or near the target object to perform a deconvolution process, or use of adaptive optics. However, most of the methods depend heavily on the object's location, and optical ray propagation through the turbulence is not directly considered. Alternatively, selecting a lucky image over many frames provides a feasible solution, but at the cost of time. In our work, we propose an innovative approach to improving image quality through turbulence by making use of a modified plenoptic camera. This type of camera adds a micro-lens array to a traditional high-resolution camera to form a semi-camera array that records duplicate copies of the object as well as "superimposed" turbulence at slightly different angles. By performing several steps of image reconstruction, turbulence effects will be suppressed to reveal more details of the object independently (without finding references near the object). Meanwhile, the redundant information obtained by the plenoptic camera raises the possibility of performing lucky image algorithmic analysis with fewer frames, which is more efficient. The details of our modified plenoptic camera and image processing algorithms are introduced. The proposed method can be applied to coherently illuminated objects as well as incoherently illuminated objects. Our results show that the turbulence effect can be effectively suppressed by the plenoptic camera in the hardware layer and a reconstructed "lucky image" can help the viewer identify the object even when a "lucky image" by ordinary cameras is not achievable.
Faraji-Khiavi, F; Ghobadian, S; Moradi-Joo, E
2015-01-01
Background and Objective: Knowledge management is introduced as a key element of quality improvement in organizations. No such research had previously been conducted in the university hospitals of Ahvaz. This study aimed to determine the association between the effectiveness of knowledge management processes and the quality of health services from the managers' view in the educational hospitals of Ahvaz city. Materials and Methods: In this correlational study, the research population consisted of 120 managers from the hospitals of Ahvaz University of Medical Sciences. Because of the limited population, a census was conducted. Three questionnaires were used for data collection: demographic characteristics, the effectiveness of knowledge management processes, and the quality of medical services. To analyze the data, Spearman correlation analysis, the Kruskal-Wallis test, and the Mann–Whitney U test were used in SPSS. Results: The mean scores for the effectiveness of knowledge management processes and its components were relatively appropriate. The quality of medical services was also estimated as relatively appropriate. The quality of health services showed a moderate, positive correlation with the effectiveness of knowledge management processes (p < 0.001). Managers of different genders showed significant differences in knowledge development and transfer (P = 0.003). Conclusion: A significant, positive association was observed between the effectiveness of knowledge management processes and health care quality. To improve health care quality in university hospitals, managers should pay more attention to developing a culture of innovation, encouraging teamwork, and improving communication and creative thinking in the knowledge management context. PMID:28316735
Recursive search method for the image elements of functionally defined surfaces
NASA Astrophysics Data System (ADS)
Vyatkin, S. I.
2017-05-01
This paper touches upon the synthesis of high-quality images in real time and the technique for specifying three-dimensional objects on the basis of perturbation functions. The recursive search method for the image elements of functionally defined objects with the use of graphics processing units is proposed. The advantages of such an approach over the frame-buffer visualization method are shown.
Application of Oversampling to obtain the MTF of Digital Radiology Equipment.
NASA Astrophysics Data System (ADS)
Narváez, M.; Graffigna, J. P.; Gómez, M. E.; Romo, R.
2016-04-01
Within the objectives of the project Medical Image Processing for Quality Assessment of X-Ray Imaging, the present research work is aimed at developing a phantom X-ray image and its associated processing algorithms in order to evaluate the image quality rendered by digital X-ray equipment. These tools are used to measure various image parameters, among which spatial resolution is a fundamental property that can be characterized by the Modulation Transfer Function (MTF) of an imaging system [1]. After performing a thorough literature survey on imaging quality control in digital X-ray film in Argentine and international publications, it was decided to adopt for this work the norm IEC 62220-1:2003, which recommends using an image edge as a testing method. In order to obtain the characterizing MTF, a protocol was designed for unifying the conditions under which the images are acquired for later evaluation. The protocol implied acquiring a radiography image by means of a specific referential technique, i.e., referred either to voltage, current, time, focus-to-plate (film) distance, or other referential parameter, and interpreting the image through a system of computed radiology or direct digital radiology. The contribution of the work stems from the fact that, even though the traditional way of evaluating X-ray image quality has relied mostly on subjective methods, this work presents an objective evaluative tool for the images obtained with a given piece of equipment, followed by a contrastive analysis with the renderings from other X-ray imaging sets. Once the images were obtained, specific calculations were carried out. Though there exist some methods based on the subjective evaluation of image quality, this work offers an objective evaluation of the equipment under study. Finally, we present the results obtained on different equipment.
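IEC 62220-1 prescribes a slanted-edge, oversampled procedure with careful binning and windowing; the following deliberately simplified 1-D sketch only illustrates the core chain the edge method relies on — an edge-spread function, differentiated into a line-spread function, then Fourier-transformed and normalized to give an MTF:

```python
import cmath

def mtf_from_edge(esf):
    """Simplified edge method: differentiate the edge-spread function (ESF)
    to obtain the line-spread function (LSF), then take the normalized
    magnitude of its discrete Fourier transform as the MTF estimate."""
    lsf = [b - a for a, b in zip(esf, esf[1:])]
    n = len(lsf)
    mtf = []
    for k in range(n // 2 + 1):
        s = sum(v * cmath.exp(-2j * cmath.pi * k * i / n)
                for i, v in enumerate(lsf))
        mtf.append(abs(s))
    dc = mtf[0]
    return [m / dc for m in mtf]

# Ideal step edge: the LSF is a single impulse, so the MTF is flat at 1.0;
# any real detector blurs the edge and the MTF falls off with frequency.
edge = [0.0] * 8 + [1.0] * 8
print(mtf_from_edge(edge))
```

A real implementation following the standard would additionally project the slanted edge to build an oversampled ESF, smooth it, and apply a window before the transform.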
Baselining current road weather information : final report
DOT National Transportation Integrated Search
2009-06-10
This final report contains research findings on the characterization of the quality and value of road weather information resources used by members of the surface transportation community in their decision-making process. The objectives of the projec...
A mask quality control tool for the OSIRIS multi-object spectrograph
NASA Astrophysics Data System (ADS)
López-Ruiz, J. C.; Vaz Cedillo, Jacinto Javier; Ederoclite, Alessandro; Bongiovanni, Ángel; González Escalera, Víctor
2012-09-01
The OSIRIS multi-object spectrograph uses a set of user-customised masks, which are manufactured on demand. The manufacturing process consists of drilling the specified slits on the mask with the required accuracy. Ensuring that slits are in the right place when observing is of vital importance. We present a tool for checking the quality of the mask-manufacturing process which is based on analyzing the instrument images obtained with the manufactured masks in place. The tool extracts the slit information from these images, relates specifications to the extracted slit information, and finally reports to the operator whether the manufactured mask fulfills the expectations of the mask designer. The proposed tool has been built using scripting languages and standard libraries such as opencv, pyraf and scipy. The software architecture, advantages and limits of this tool in the lifecycle of a multi-object acquisition are presented.
Chougrani, Saada; Ouhadj, Salah
2014-01-01
Quality of care is a strategic priority of any management approach in order to meet users' expectations of health care systems. This study tried to define the role of patient satisfaction surveys and the place of user in the quality of care project. The results of patient satisfaction surveys conducted between 2010 and 2012 and the draft quality of care project were analysed. Patient satisfaction surveys from 2010 to 2012 focused on logistic shortcomings. No comment was formulated about health care. Comments and suggestions did not provide any contribution in terms of patient involvement in the health care process. The multiple perspectives of quality of care include clinical care and other social objectives of respect for the individual and attention to the patient. User satisfaction as assessed by patient satisfaction surveys or patients' experiences only reflect the health professionals' representation. However, the objective is to measure what the user perceives and feels and his/her representation of the attention provided. These approaches, conducted outside of the quality of care strategic plan, only provide a basis for actions with limited or no effectiveness.
Effects of developer depletion on image quality of Kodak Insight and Ektaspeed Plus films.
Casanova, M S; Casanova, M L S; Haiter-Neto, F
2004-03-01
To evaluate the effect of processing solution depletion on the image quality of F-speed dental X-ray film (Insight), compared with Ektaspeed Plus. The films were exposed with a phantom and developed in manual and automatic conditions, in fresh and progressively depleted solutions. The comparison was based on densitometric analysis and subjective appraisal. The processing solution depletion presented a different behaviour depending on whether manual or automatic technique was used. The films were distinctly affected by depleted processing solutions. The developer depletion was faster in automatic than manual conditions. Insight film was more resistant than Ektaspeed Plus to the effects of processing solution depletion. In the present study there was agreement between the objective and subjective appraisals.
[Quality control in anesthesiology].
Muñoz-Ramón, J M
1995-03-01
The process of quality control and auditing of anesthesiology allows us to evaluate care given by a service and solve problems that are detected. Quality control is a basic element of care giving and is only secondarily an area of academic research; it is therefore a meaningless effort if the information does not serve to improve departmental procedures. Quality assurance procedures assume certain infrastructural requirements and an initial period of implementation and adjustment. The main objectives of quality control are the reduction of morbidity and mortality due to anesthesia, assurance of the availability and proper management of resources and, finally, the well-being and safety of the patient.
Strength and stiffness assessment of standing trees using a nondestructive stress wave technique.
Xiping Wang; Robert J. Ross; Michael McClellan; R. James Barbour; John R. Erickson; John W. Forsman; Gary D. McGinnis
Nature's engineering of wood through genetics, stand conditions, and environment creates wide variability in wood as a material, which in turn introduces difficulties in wood processing and utilization. Manufacturers sometimes find it difficult to consistently process wood into quality products because of its wide range of properties. The primary objective of this...
Quality Measures for Hospice and Palliative Care: Piloting the PEACE Measures
Rokoske, Franziska S.; Durham, Danielle; Cagle, John G.; Hanson, Laura C.
2014-01-01
Abstract Background: The Carolinas Center for Medical Excellence launched the PEACE project in 2006, under contract with the Centers for Medicare & Medicaid Services (CMS), to identify, develop, and pilot test quality measures for hospice and palliative care programs. Objectives: The project collected pilot data to test the usability and feasibility of potential quality measures and data collection processes for hospice and palliative care programs. Settings/subjects: Twenty-two hospices participating in a national Quality Improvement Collaborative (QIC) submitted data from 367 chart reviews for pain care and 45 chart reviews for nausea care. Fourteen additional hospices completed a one-time data submission of 126 chart reviews on 60 potential patient-level quality measures across eight domains of care and an organizational assessment evaluating structure and processes of care. Design: Usability was assessed by examining the range, variability and size of the populations targeted by each quality measure. Feasibility was assessed during the second pilot study by surveying data abstractors about the abstraction process and examining the rates of missing data. The impact of data collection processes was assessed by comparing results obtained using different processes. Results: Measures shown to be both usable and feasible included: screening for physical symptoms on admission and documentation of treatment preferences. Methods of data collection and measure construction appear to influence observed rates of quality of care. Conclusions: We successfully identified quality measures with potential for use in hospices and palliative care programs. Future research is needed to understand whether these measures are sensitive to quality improvement interventions. PMID:24921162
Lincoln, Tricia A.; Horan-Ross, Debra A.; McHale, Michael R.; Lawrence, Gregory B.
2006-01-01
The laboratory for analysis of low-ionic-strength water at the U.S. Geological Survey (USGS) Water Science Center in Troy, N.Y., analyzes samples collected by USGS projects throughout the Northeast. The laboratory's quality-assurance program is based on internal and interlaboratory quality-assurance samples and quality-control procedures that were developed to ensure proper sample collection, processing, and analysis. The quality-assurance and quality-control data were stored in the laboratory's LabMaster data-management system, which provides efficient review, compilation, and plotting of data. This report presents and discusses results of quality-assurance and quality-control samples analyzed from July 1999 through June 2001. Results for the quality-control samples for 18 analytical procedures were evaluated for bias and precision. Control charts indicate that data for eight of the analytical procedures were occasionally biased for either high-concentration or low-concentration samples but were within control limits; these procedures were: acid-neutralizing capacity, total monomeric aluminum, total aluminum, calcium, chloride and nitrate (ion chromatography and colorimetric method) and sulfate. The total aluminum and dissolved organic carbon procedures were biased throughout the analysis period for the high-concentration sample, but were within control limits. The calcium and specific conductance procedures were biased throughout the analysis period for the low-concentration sample, but were within control limits. The magnesium procedure was biased for the high-concentration and low-concentration samples, but was within control limits. Results from the filter-blank and analytical-blank analyses indicate that the procedures for 14 of 15 analytes were within control limits, although the concentrations for blanks were occasionally outside the control limits. The data-quality objective was not met for dissolved organic carbon.
Sampling and analysis precision are evaluated herein in terms of the coefficient of variation obtained for triplicate samples in the procedures for 17 of the 18 analytes. At least 90 percent of the samples met data-quality objectives for all analytes except ammonium (81 percent of samples met objectives), chloride (75 percent of samples met objectives), and sodium (86 percent of samples met objectives). Results of the USGS interlaboratory Standard Reference Sample (SRS) Project indicated good data quality over the time period, with most ratings for each sample in the good to excellent range. The P-sample (low-ionic-strength constituents) analysis had one satisfactory rating for the specific conductance procedure in one study. The T-sample (trace constituents) analysis had one satisfactory rating for the aluminum procedure in one study and one unsatisfactory rating for the sodium procedure in another. The remainder of the samples had good or excellent ratings for each study. Results of Environment Canada's National Water Research Institute (NWRI) program indicated that at least 89 percent of the samples met data-quality objectives for 10 of the 14 analytes; the exceptions were ammonium, total aluminum, dissolved organic carbon, and sodium. Results indicate a positive bias for the ammonium procedure in all studies. Data-quality objectives were not met in 50 percent of samples analyzed for total aluminum, 38 percent of samples analyzed for dissolved organic carbon, and 27 percent of samples analyzed for sodium. Results from blind reference-sample analyses indicated that data-quality objectives were met by at least 91 percent of the samples analyzed for calcium, chloride, fluoride, magnesium, pH, potassium, and sulfate. Data-quality objectives were met by 75 percent of the samples analyzed for sodium and 58 percent of the samples analyzed for specific conductance.
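The precision criterion used above can be made concrete: the coefficient of variation of triplicate analyses is the sample standard deviation divided by the mean, compared against a data-quality objective. A minimal sketch follows; the 10% DQO threshold is an illustrative placeholder, not the report's actual limit for any analyte.

```python
import statistics

def cv_percent(replicates):
    """Coefficient of variation (%) for a set of replicate analyses."""
    mean = statistics.mean(replicates)
    return statistics.stdev(replicates) / mean * 100.0

def meets_dqo(replicates, max_cv=10.0):
    """True if replicate precision is within the data-quality objective.

    The 10% default is a hypothetical threshold for illustration.
    """
    return cv_percent(replicates) <= max_cv

# Triplicate chloride analyses (mg/L), hypothetical values
triplicate = [4.9, 5.0, 5.1]
print(round(cv_percent(triplicate), 1))  # → 2.0
print(meets_dqo(triplicate))             # → True
```

A laboratory would apply such a check per analyte and report the fraction of triplicate sets meeting the objective, as the figures above do.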
ISO 9000 and/or Systems Engineering Capability Maturity Model?
NASA Technical Reports Server (NTRS)
Gholston, Sampson E.
2002-01-01
For businesses and organizations to remain competitive today they must have processes and systems in place that allow them first to identify customer needs and then to develop products and processes that meet or exceed the customers' needs and expectations. Customer needs, once identified, are normally stated as requirements. Designers can then develop products and processes that will meet these requirements. Several functions, such as quality management and systems engineering management, are used to assist product development teams in the development process. Both functions exist in all organizations and both have a similar objective, which is to ensure that developed processes will meet customer requirements. Are efforts in these organizations being duplicated? Are both functions needed by organizations? What are the similarities and differences between the functions listed above? ISO 9000 is an international standard for goods and services. It sets broad requirements for the assurance of quality and for management's involvement. It requires organizations to document their processes and to follow these documented processes. ISO 9000 gives customers assurance that suppliers have control of the process for product development. Systems engineering can broadly be defined as a discipline that seeks to ensure that all requirements for a system are satisfied throughout the life of the system by preserving their interrelationship. The key activities of systems engineering include requirements analysis, functional analysis/allocation, design synthesis and verification, and system analysis and control. The systems engineering process, when followed properly, leads to higher-quality products, lower-cost products, and shorter development cycles. The Systems Engineering Capability Maturity Model (SE-CMM) allows companies to measure their systems engineering capability and continuously improve those capabilities.
ISO 9000 and SE-CMM seem to have a similar objective, which is to document the organization's processes and certify to potential customers the capability of a supplier to control the processes that determine the quality of the product or services being produced. The remaining sections of this report examine the differences and similarities between ISO 9000 and SE-CMM and make recommendations for implementation.
NASA Astrophysics Data System (ADS)
Zhan, Qi; Wang, Xin; Mu, Baozhong; Xu, Jie; Xie, Qing; Li, Yaran; Chen, Yifan; He, Yanan
2016-10-01
Dangerous-materials inspection is an important technique for confirming dangerous-materials crimes, with significant impact on prohibiting such crimes and on limiting the spread of dangerous materials. The Lobster-Eye Optical Imaging System is a detection device that mainly takes advantage of backscattered X-rays. The strength of the system is that it needs access to only one side of an object and can detect dangerous materials without disturbing the surroundings of the target material. The device uses Compton-scattered X-rays to create computerized outlines of suspected objects during the security screening process. Because of the grid structure of the bionic objective glass, which imitates the eye of a lobster, the grid contributes most of the image noise during imaging. In addition, when the system is used to inspect structured or dense materials, the image is plagued by superposition artifacts and limited by attenuation and noise. With the goal of achieving high-quality images usable for dangerous-materials detection and further analysis, we developed effective image-processing methods for the system. The first is denoising with edge-contrast enhancement, in which a deconvolution algorithm removes the grid pattern and other noise, yielding a high signal-to-noise-ratio image. The second is reconstruction of images acquired under low-dose X-ray exposure, for which we developed an interpolation method. The last is region-of-interest (ROI) extraction, which helps identify dangerous materials mixed with complex backgrounds. The methods demonstrated in the paper have the potential to improve the sensitivity and quality of X-ray backscatter imaging.
Phantom evaluation of the effect of film processing on mammographic screen-film combinations.
McLean, D; Rickard, M T
1994-08-01
Mammographic image quality should be optimal for diagnosis, and the film contrast can be manipulated by altering development parameters. In this study phantom test objects were radiographed and processed for a given range of developer temperatures and times for four film-screen systems. Radiologists scored the phantom test objects on the resultant films to evaluate the effect on diagnosis of varying image contrast. While for three film-screen systems processing led to appreciable contrast differences, for only one film system did maximum contrast correspond with optimal phantom test object scoring. The inability to show an effect on diagnosis in all cases is possibly due to the variation in radiologist responses found in this study and in normal clinical circumstances. Other technical factors such as changes in film fog, grain and mottle may contribute to the study findings.
Giuliani, Felice; D’Anselmo, Anita; Tommasi, Luca; Brancucci, Alfredo; Pietroni, Davide
2017-01-01
The Spatial Numerical Association of Response Codes (SNARC) effect has been associated with a wide range of magnitude processing. This effect is due to an implicit relationship between numbers and horizontal space, according to which weaker magnitudes and smaller numbers are represented on the left, whereas stronger magnitudes and larger numbers are represented on the right. However, for some particular type of magnitudes such as price, judgments may be also influenced by perceived quality and thus involving valence attribution biases driven by brain asymmetries. In the present study, a lateralized tachistoscopic presentation was used in a price estimation task, using a weight estimation task as a control, to assess differences in asymmetries between these two attributes. Results show a side bias in the former condition but not in the latter, thus indicating that other non-numerical mechanisms are involved in price estimation. Specifically, prices were estimated lower in the left visual field than in the right visual field. The proposed explanation is that price appraisal might involve a valence attribution mechanism leading to a better perceived quality (related to higher prices) when objects are processed primarily in the left hemisphere, and to a lower perceived quality (related to lower prices) when objects are processed primarily in the right hemisphere. PMID:29213252
A methodology for automatic intensity-modulated radiation treatment planning for lung cancer
NASA Astrophysics Data System (ADS)
Zhang, Xiaodong; Li, Xiaoqiang; Quan, Enzhuo M.; Pan, Xiaoning; Li, Yupeng
2011-07-01
In intensity-modulated radiotherapy (IMRT), the quality of the treatment plan, which is highly dependent upon the treatment planner's level of experience, greatly affects the potential benefits of the radiotherapy (RT). Furthermore, the planning process is complicated and requires a great deal of iteration, and is often the most time-consuming aspect of the RT process. In this paper, we describe a methodology to automate the IMRT planning process in lung cancer cases, the goal being to improve the quality and consistency of treatment planning. This methodology (1) automatically sets beam angles based on a beam angle automation algorithm, (2) judiciously designs the planning structures, which were shown to be effective for all the lung cancer cases we studied, and (3) automatically adjusts the objectives of the objective function based on a parameter automation algorithm. We compared treatment plans created in this system (mdaccAutoPlan) based on the overall methodology with plans from a clinical trial of IMRT for lung cancer run at our institution. The 'autoplans' were consistently better, or no worse, than the plans produced by experienced medical dosimetrists in terms of tumor coverage and normal tissue sparing. We conclude that the mdaccAutoPlan system can potentially improve the quality and consistency of treatment planning for lung cancer.
Single-exposure color digital holography
NASA Astrophysics Data System (ADS)
Feng, Shaotong; Wang, Yanhui; Zhu, Zhuqing; Nie, Shouping
2010-11-01
In this paper, we report a method for color image reconstruction that records only a single multi-wavelength hologram. In the recording process, three lasers of different wavelengths emitting in the red, green and blue regions illuminate the object, and the object diffraction fields arrive at the hologram plane simultaneously. Three reference beams at different spatial angles interfere with the corresponding object diffraction fields on the hologram plane. Finally, a series of sub-holograms incoherently overlap on the CCD and are recorded as a single multi-wavelength hologram. Angular division multiplexing is applied to the reference beams so that the spatial spectra of the multiple recordings are separated in the Fourier plane. In the reconstruction process, the multi-wavelength hologram is Fourier transformed into the Fourier plane, where the spatial spectra of the different wavelengths are separated and can easily be extracted by frequency filtering. The extracted spectra are used to reconstruct the corresponding monochromatic complex amplitudes, which are then synthesized to reconstruct the color image. Because it is a single-exposure recording technique, the method is convenient for real-time image-processing applications. However, the quality of the reconstructed images is affected by speckle noise; how to improve image quality needs further research.
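The frequency-filtering step described above, extracting one wavelength's spectrum from the Fourier plane of the multiplexed hologram, can be sketched with NumPy. The carrier position and filter width below are illustrative; in a real setup they are fixed by the reference-beam angles, and the test pattern is a synthetic one-carrier fringe rather than a recorded hologram.

```python
import numpy as np

def extract_channel(hologram, center, half_width):
    """Pull one wavelength's band out of a multiplexed hologram.

    `center` is the (row, col) carrier position of that channel in the
    shifted Fourier plane and `half_width` the square filter radius;
    both are assumptions standing in for the reference-beam geometry.
    """
    spectrum = np.fft.fftshift(np.fft.fft2(hologram))
    r, c = center
    rs = slice(r - half_width, r + half_width)
    cs = slice(c - half_width, c + half_width)
    window = np.zeros_like(spectrum)
    window[rs, cs] = spectrum[rs, cs]          # keep only this channel's band
    # Shift the band back to the spectral center (removes the carrier),
    # then invert to recover the complex field for this wavelength.
    centered = np.roll(window,
                       (hologram.shape[0] // 2 - r, hologram.shape[1] // 2 - c),
                       axis=(0, 1))
    return np.fft.ifft2(np.fft.ifftshift(centered))

# Synthetic fringe pattern with a single carrier at +8 cycles along x
n = 64
y, x = np.mgrid[0:n, 0:n]
holo = 1 + np.cos(2 * np.pi * 8 * x / n)
field = extract_channel(holo, center=(n // 2, n // 2 + 8), half_width=4)
print(np.allclose(np.abs(field), 0.5, atol=1e-6))  # → True
```

The recovered field is the flat 0.5-amplitude envelope of the cosine fringe, confirming that the carrier was stripped and only the chosen band survived; repeating this for each of the three carrier positions yields the three monochromatic amplitudes to be synthesized into the color image.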
NASA Technical Reports Server (NTRS)
Spiering, Bruce; Underwood, Lauren; Ellis, Chris; Lehrter, John; Hagy, Jim; Schaeffer, Blake
2010-01-01
The goals of the project are to provide information from satellite remote sensing to support numeric nutrient criteria development and to determine data processing methods and data quality requirements to support nutrient criteria development and implementation. The approach is to identify water quality indicators that are used by decision makers to assess water quality and that are related to optical properties of the water; to develop remotely sensed data products based on algorithms relating remote sensing imagery to field-based observations of indicator values; to develop methods to assess estuarine water quality, including trends, spatial and temporal variability, and seasonality; and to develop tools to assist in the development and implementation of estuarine and coastal nutrient criteria. Additional slides present process, criteria development, typical data sources and analyses for criteria process, the power of remote sensing data for the process, examples from Pensacola Bay, spatial and temporal variability, pixel matchups, remote sensing validation, remote sensing in coastal waters, requirements for remotely sensed data products, and needs assessment. An additional presentation examines group engagement and information collection. Topics include needs assessment purpose and objectives, understanding water quality decision making, determining information requirements, and next steps.
Design of penicillin fermentation process simulation system
NASA Astrophysics Data System (ADS)
Qi, Xiaoyu; Yuan, Zhonghu; Qi, Xiaoxuan; Zhang, Wenqi
2011-10-01
Real-time monitoring of batch processes is attracting increasing attention. It can ensure safety and provide products with consistent quality. The design of a simulation system for batch-process fault diagnosis is therefore of great significance. In this paper, penicillin fermentation, a typical non-linear, dynamic, multi-stage batch production process, is taken as the research object. A visual human-machine interactive simulation software system based on the Windows operating system is developed. The simulation system can provide an effective platform for research on batch-process fault diagnosis.
Measuring, managing and maximizing performance of mineral processing plants
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bascur, O.A.; Kennedy, J.P.
1995-12-31
The implementation of continuous quality improvement is the confluence of Total Quality Management, People Empowerment, Performance Indicators and Information Engineering. The supporting information technologies allow a mineral processor to narrow the gap between management business objectives and the process control level. One of the most important contributors is the user friendliness and flexibility of the personal computer in a client/server environment. This synergistic combination, when used for real-time performance monitoring, translates into production cost savings, improved communications and enhanced decision support. Other savings come from reduced time to collect data and perform tedious calculations, the ability to act quickly on fresh new data, and the ability to generate and validate data to be used by others. This paper presents an integrated view of plant management. The selection of the proper tools for continuous quality improvement is described. The process of selecting critical performance-monitoring indices for improved plant performance is discussed. The importance of a well-balanced combination of technological improvement, personnel empowerment, total quality management and organizational assets is stressed.
O’Suilleabhain, Padraig E.; Sanghera, Manjit; Patel, Neepa; Khemani, Pravin; Lacritz, Laura H.; Chitnis, Shilpa; Whitworth, Louis A.; Dewey, Richard B.
2016-01-01
Objective To develop a process to improve patient outcomes from deep brain stimulation (DBS) surgery for Parkinson disease (PD), essential tremor (ET), and dystonia. Methods We employed standard quality improvement methodology using the Plan-Do-Study-Act process to improve patient selection, surgical DBS lead implantation, postoperative programming, and ongoing assessment of patient outcomes. Results The result of this quality improvement process was the development of a neuromodulation network. The key aspect of this program is rigorous patient assessment of both motor and non-motor outcomes tracked longitudinally using a REDCap database. We describe how this information is used to identify problems and to initiate Plan-Do-Study-Act cycles to address them. Preliminary outcomes data is presented for the cohort of PD and ET patients who have received surgery since the creation of the neuromodulation network. Conclusions Careful outcomes tracking is essential to ensure quality in a complex therapeutic endeavor like DBS surgery for movement disorders. The REDCap database system is well suited to store outcomes data for the purpose of ongoing quality assurance monitoring. PMID:27711133
Water-quality assessment of the Smith River drainage basin, California and Oregon
Iwatsubo, Rick T.; Washabaugh, Donna S.
1982-01-01
A water-quality assessment of the Smith River drainage basin was made to provide a summary of the water-quality conditions including known or potential water-quality problems. Results of the study showed that the water quality of the Smith River is excellent and generally meets the water-quality objectives for the beneficial uses identified by the California Regional Water Quality Control Board, North Coast Region. Known and potential problems related to water quality include: Sedimentation resulting from both natural erosional processes and land-use activities such as timber harvest, road construction, and mining that accelerate the erosional processes; bacterial contamination of surface and ground waters from inundated septic tanks and drainfields, and grazing activities; industrial spills which have resulted in fish kills and oil residues; high concentrations of iron in ground water; log and debris jams creating fish migration barriers; and pesticide and trace-element contamination from timber-harvest and mining activities, respectively. Future studies are needed to establish: (1) a sustained long-term monitoring program to provide a broad coverage of water-quality conditions in order to define long-term water-quality trends; and (2) interpretive studies to determine the source of known and potential water-quality problems. (USGS)
NASA Astrophysics Data System (ADS)
Vuori, Tero; Olkkonen, Maria
2006-01-01
The aim of the study is to test both customer image quality rating (subjective image quality) and physical measurement of user behavior (eye movements tracking) to find customer satisfaction differences in imaging technologies. Methodological aim is to find out whether eye movements could be quantitatively used in image quality preference studies. In general, we want to map objective or physically measurable image quality to subjective evaluations and eye movement data. We conducted a series of image quality tests, in which the test subjects evaluated image quality while we recorded their eye movements. Results show that eye movement parameters consistently change according to the instructions given to the user, and according to physical image quality, e.g. saccade duration increased with increasing blur. Results indicate that eye movement tracking could be used to differentiate image quality evaluation strategies that the users have. Results also show that eye movements would help mapping between technological and subjective image quality. Furthermore, these results give some empirical emphasis to top-down perception processes in image quality perception and evaluation by showing differences between perceptual processes in situations when cognitive task varies.
Detailed Field Investigation of Vapor Intrusion Processes
2008-08-01
Abbreviations: DFA, difluoroethane; DQO, data quality objective; ESTCP, Environmental Security Technology Certification Program; HCl, hydrochloric acid; OU-5, Operable Unit...
...impacted by significant leakage of ambient air. Some leak tracer compounds such as difluoroethane (DFA) and isopropyl alcohol may cause elevated detection
Environmental Response Laboratory Network (ERLN) WebEDR Quick Reference Guide
The Web Electronic Data Review (WebEDR) is a web-based system that performs automated data processing on laboratory-submitted Electronic Data Deliverables (EDDs). It enables users to perform technical audits on the data and to check results against Measurement Quality Objectives (MQOs).
Objective Quality and Intelligibility Prediction for Users of Assistive Listening Devices
Falk, Tiago H.; Parsa, Vijay; Santos, João F.; Arehart, Kathryn; Hazrati, Oldooz; Huber, Rainer; Kates, James M.; Scollie, Susan
2015-01-01
This article presents an overview of twelve existing objective speech quality and intelligibility prediction tools. Two classes of algorithms are presented, namely intrusive and non-intrusive, with the former requiring the use of a reference signal, while the latter does not. Investigated metrics include both those developed for normal hearing listeners, as well as those tailored particularly for hearing impaired (HI) listeners who are users of assistive listening devices (i.e., hearing aids, HAs, and cochlear implants, CIs). Representative examples of those optimized for HI listeners include the speech-to-reverberation modulation energy ratio, tailored to hearing aids (SRMR-HA) and to cochlear implants (SRMR-CI); the modulation spectrum area (ModA); the hearing aid speech quality (HASQI) and perception indices (HASPI); and the PErception MOdel - hearing impairment quality (PEMO-Q-HI). The objective metrics are tested on three subjectively-rated speech datasets covering reverberation-alone, noise-alone, and reverberation-plus-noise degradation conditions, as well as degradations resultant from nonlinear frequency compression and different speech enhancement strategies. The advantages and limitations of each measure are highlighted and recommendations are given for suggested uses of the different tools under specific environmental and processing conditions. PMID:26052190
Stepwise drying of medicinal plants as alternative to reduce time and energy processing
NASA Astrophysics Data System (ADS)
Cuervo-Andrade, S. P.; Hensel, O.
2016-07-01
The objective of drying medicinal plants is to extend shelf life while conserving the fresh characteristics. This is achieved by reducing the water activity (aw) of the product to a value that inhibits the growth and development of pathogenic and spoilage microorganisms, significantly reducing enzyme activity and the rate at which undesirable chemical reactions occur. The technical drying process requires an enormous amount of thermal and electrical energy. An improvement in the quality of the product to be dried, together with a decrease in drying cost and time, can be achieved through a controlled conventional drying method based on good use of renewable energy, or by seeking other alternatives that achieve lower processing times without sacrificing final product quality. In this work the method of stepwise drying of medicinal plants is presented as an alternative to conventional drying, which uses a constant temperature throughout the process. The objective of stepwise drying is to decrease drying time and reduce energy consumption. Apart from observing the effects on effective drying time and energy, the influence of the different combinations of drying phases on several characteristics of the product is considered. The tests were carried out with Melissa officinalis L. variety citronella, grown in a greenhouse. For the stepwise drying process, different combinations of initial and final temperature (40/50°C) were evaluated, with transition points associated with different moisture contents (20, 30, 40% and 50%) of the product during the process. Final quality is another important issue in food drying, since the drying process affects the quality attributes of the dried product.
This study determined the color changes and essential oil losses, using measurements of the color and essential oil content of the fresh product as the reference. Drying curves were obtained to observe the dynamics of the process for different combinations of temperature and transition points, corresponding to different moisture contents of the product.
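The stepwise schedule described above can be illustrated with a stylized thin-layer (Lewis) drying model, in which moisture approaches equilibrium exponentially and the rate constant jumps when the temperature step occurs. The rate constants, equilibrium moisture and switch point below are hypothetical illustrations, not the study's measured values.

```python
def stepwise_drying(m0, m_eq, k_low, k_high, m_switch, dt=0.1, t_max=48.0):
    """Simulate stepwise thin-layer drying: dry at the low-temperature rate
    constant k_low until moisture falls to m_switch, then at k_high.
    Moisture follows the Lewis model dm/dt = -k * (m - m_eq)."""
    t, m = 0.0, m0
    history = [(t, m)]
    while m > m_eq + 1e-3 and t < t_max:
        k = k_low if m > m_switch else k_high   # temperature step at m_switch
        m += -k * (m - m_eq) * dt               # explicit Euler update
        t += dt
        history.append((t, m))
    return history

# Hypothetical values: 75% initial moisture, switch to the higher rate at 40%.
curve = stepwise_drying(m0=75.0, m_eq=8.0, k_low=0.15, k_high=0.30, m_switch=40.0)
```

Plotting moisture against time for different `m_switch` values reproduces the kind of drying curves the study compares across transition points.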
Rabotyagov, Sergey; Campbell, Todd; Valcu, Adriana; Gassman, Philip; Jha, Manoj; Schilling, Keith; Wolter, Calvin; Kling, Catherine
2012-12-09
Finding the cost-efficient (i.e., lowest-cost) ways of targeting conservation practice investments for the achievement of specific water quality goals across the landscape is of primary importance in watershed management. Traditional economics methods of finding the lowest-cost solution in the watershed context (e.g.,(5,12,20)) assume that off-site impacts can be accurately described as a proportion of on-site pollution generated. Such approaches are unlikely to be representative of the actual pollution process in a watershed, where the impacts of polluting sources are often determined by complex biophysical processes. The use of modern physically-based, spatially distributed hydrologic simulation models allows for a greater degree of realism in terms of process representation but requires the development of a simulation-optimization framework in which the model becomes an integral part of optimization. Evolutionary algorithms appear to be a particularly useful optimization tool, able to deal with the combinatorial nature of a watershed simulation-optimization problem and allowing the use of the full water quality model. Evolutionary algorithms treat a particular spatial allocation of conservation practices in a watershed as a candidate solution and utilize sets (populations) of candidate solutions, iteratively applying stochastic operators of selection, recombination, and mutation to find improvements with respect to the optimization objectives. The optimization objectives in this case are to minimize nonpoint-source pollution in the watershed while simultaneously minimizing the cost of conservation practices. A recent and expanding body of research attempts to use similar methods, integrating water quality models with broadly defined evolutionary optimization methods(3,4,9,10,13-15,17-19,22,23,25).
In this application, we demonstrate a program which follows Rabotyagov et al.'s approach and integrates a modern and commonly used SWAT water quality model(7) with the multiobjective evolutionary algorithm SPEA2(26) and a user-specified set of conservation practices and their costs to search for the complete tradeoff frontiers between the costs of conservation practices and user-specified water quality objectives. The frontiers quantify the tradeoffs faced by watershed managers by presenting the full range of costs associated with various water quality improvement goals. The program allows for the selection of watershed configurations achieving specified water quality improvement goals and the production of maps of optimized placement of conservation practices.
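At the core of multiobjective evolutionary algorithms such as SPEA2 is Pareto dominance between objective vectors — here, (cost, pollution load) pairs, both minimized. A minimal sketch of the dominance test and non-dominated filtering follows; the candidate evaluations are random placeholders, not SWAT outputs.

```python
import random

def dominates(a, b):
    """True if objective vector a Pareto-dominates b (both objectives minimized):
    a is no worse in every objective and strictly better in at least one."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_front(population):
    """Return the non-dominated subset of a population of objective vectors."""
    return [p for p in population
            if not any(dominates(q, p) for q in population if q is not p)]

# Hypothetical (cost, pollution-load) evaluations of candidate practice layouts.
random.seed(1)
pop = [(random.uniform(0, 100), random.uniform(0, 50)) for _ in range(50)]
front = pareto_front(pop)
```

A full SPEA2 implementation adds fitness assignment, an external archive, and density estimation on top of this dominance relation.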
Errázuriz, Paula; Constantino, Michael J; Calvo, Esteban
2015-09-01
This study examined the relationship between patients' object relations and interpersonal process in psychotherapy. Namely, we tested the hypothesis that the quality of patients' object relations is positively associated with both patient- and therapist-rated alliance quality. Psychotherapy was administered naturalistically, with quantitative data collection before and during treatment. Participants included 73 adult outpatients and 23 therapists at two mental health clinics. Using the Bell Object Relations and Reality Testing Inventory, we measured four dimensions of patients' object relations at baseline: alienation, insecure attachment, egocentricity, and social incompetence. Using the Working Alliance Inventory, we measured alliance from patient and therapist perspectives. Control variables included time, patient demographics, symptom severity, and clinic. We employed hierarchical linear modelling to analyse data with a nested structure, with 138 sessions at Level 1, 73 patients at Level 2, and 23 therapists at Level 3. Patient alienation and insecure attachment were associated with lower patient-rated alliance, while egocentricity was associated with higher patient-rated alliance. Patients' object relations were not significantly associated with therapist-rated alliance. On average, patients perceived the alliance more positively than their therapists, with a weak positive correlation between the alliance perspectives. The results suggest that object relation dimensions may be important patient characteristics for forecasting therapeutic relationship quality. They also call for more attention to differences between alliance rating perspectives. Treatment may benefit from more attention to the quality of patients' object relations. If patients present with high levels of alienation and insecure attachment, therapists may need to pay especially close attention to the therapeutic alliance, and prudently address any ruptures in its quality.
When monitoring the alliance quality, it is important to consider that patients and therapists may have different perspectives. Therapists relying solely on their own perceptions are at risk of missing alliance difficulties, and patients' object relations may be uniquely predictive of their own sense of the alliance. Therefore, it may be helpful to ask patients in session and through standardized measures for feedback on how they perceive the goals and tasks of treatment and the emotional bond with their therapist. Again, any alliance tensions could then be addressed directly as a means to maintaining engagement in the service of better outcome. © 2014 The British Psychological Society.
Deccache, A
1997-06-01
Health promotion and health education have often been limited to evaluation of the effectiveness of actions and programmes. However, since 1996 with the Third European Conference on Health Promotion and Education Effectiveness, many researchers have become interested in "quality assessment" and new ways of thinking have emerged. Quality assurance is a concept and activity developed in industry with the objective of increasing production efficiency. There are two distinct approaches: External Standard Inspection (ESI) and Continuous Quality Improvement (CQI). ESI involves establishing criteria of quality, evaluating them and improving whatever needs improvement. CQI views the activity or service as a process and includes the quality assessment as part of the process. This article attempts to answer the questions of whether these methods are sufficient and suitable for operationalising the concepts of evaluation, effectiveness and quality in health promotion and education, whether it is necessary to complement them with other methods, and whether the ESI approach is appropriate. The first section of the article explains that health promotion is based on various paradigms from epidemiology to psychology and anthropology. Many authors warn against the exclusive use of public health disciplines for understanding, implementing and evaluating health promotion. 
The author argues that in practice, health promotion: -integrates preventive actions with those aiming to maintain and improve health, a characteristic which widens the actions of health promotion from those of classic public health which include essentially an epidemiological or "risk" focus; -aims to replace vertical approaches to prevention with a global approach based on educational sciences; -involves a community approach which includes the individual in a "central position of power" as much in the definition of needs as in the evaluation of services; -includes the participation and socio-political actions which necessitate the use of varied and specific instruments for action and evaluation. With the choice of health promotion ideology, there exist corresponding theories, concepts of quality, and therefore methods and techniques that differ from those used until now. The educational sciences have led to a widening of the definition of process to include both "throughput and input", which has meant that the methods of needs analysis, objective and priority setting and project development in health promotion have become objects of quality assessment. Also, the modes of action and interaction among actors are included, which has led to evaluation of ethical and ideological aspects of projects. The second section of the article discusses quality assessment versus evaluation of effectiveness. Different paradigms of evaluation such as the public health approach based on the measurement of (epidemiological) effectiveness, social marketing and communication, and the anthropological approach are briefly discussed, pointing out that there are many approaches which can both complement and contradict one another. 
The author explains the difference between impact (the intermediate effects, direct or indirect, planned or not planned, changes in practical or theoretical knowledge, perceptions, and attitudes) and results (final effects of mid- to long-term changes such as changes in morbidity, mortality, or access to services or cost of health care). He argues that by being too concerned with results of programmes, we have often ignored the issue of impact. Also, by limiting ourselves to evaluating effectiveness (i.e. that the expected effects were obtained), we ignore other possible unexpected, unplanned and positive and negative secondary effects. There are therefore many reasons to: -evaluate all possible effects rather than only those linked to objectives; -evaluate the entire process rather than only the resources, procedures and costs; -evaluate the impact rather than results; -evalu
Low cost 3D scanning process using digital image processing
NASA Astrophysics Data System (ADS)
Aguilar, David; Romero, Carlos; Martínez, Fernando
2017-02-01
This paper shows the design and building of a low-cost 3D scanner able to digitize solid objects through contactless data acquisition, using active object reflection. 3D scanners are used in different applications such as science, engineering and entertainment; they are classified into contact and contactless scanners, the latter being the most widely used but also the most expensive. This low-cost prototype performs a vertical scan of the object using a fixed camera and a mobile horizontal laser line, which is deformed depending on the three-dimensional surface of the solid. Digital image processing is used to analyse the deformation detected by the camera, which allows the 3D coordinates to be determined by triangulation. The obtained information is processed by a Matlab script, which gives the user a point cloud corresponding to each horizontal scan performed. The obtained results show acceptable quality and significant detail of the digitized objects, making this prototype (built on a LEGO Mindstorms NXT kit) a versatile and cheap tool that can be used for many applications, mainly by engineering students.
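The triangulation step can be sketched as follows, assuming a simple geometry in which the camera sits at the origin looking along +z and the laser sheet is offset by a known baseline and tilted by a known angle toward the optical axis. All calibration values are illustrative, not the prototype's.

```python
import math

def triangulate_depth(u, f, baseline, theta):
    """Depth of a laser-lit point seen at horizontal pixel offset u.
    Camera at the origin looks along +z with focal length f (in pixels);
    the laser sits at x = baseline and its sheet tilts by theta toward the
    optical axis, so the laser ray satisfies x = baseline - z * tan(theta).
    The camera ray is x = z * u / f; intersecting the two rays gives z."""
    return baseline * f / (u + f * math.tan(theta))

# Illustrative calibration: f = 800 px, 10 cm baseline, 30-degree laser tilt.
z = triangulate_depth(u=50.0, f=800.0, baseline=0.10, theta=math.radians(30.0))
```

Sweeping the laser and repeating this computation for every lit pixel column yields the per-scan point cloud the abstract describes.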
Laser-induced acoustic imaging of underground objects
NASA Astrophysics Data System (ADS)
Li, Wen; DiMarzio, Charles A.; McKnight, Stephen W.; Sauermann, Gerhard O.; Miller, Eric L.
1999-02-01
This paper introduces a new demining technique based on the photo-acoustic interaction, together with results from photo-acoustic experiments. We have buried different types of targets (metal, rubber and plastic) in different media (sand, soil and water) and imaged them by measuring reflection of acoustic waves generated by irradiation with a CO2 laser. Research has been focused on the signal acquisition and signal processing. A deconvolution method using Wiener filters is utilized in data processing. Using a uniform spatial distribution of laser pulses at the ground's surface, we obtained 3D images of buried objects. The images give us a clear representation of the shapes of the underground objects. The quality of the images depends on the mismatch of acoustic impedance of the buried objects, the bandwidth and center frequency of the acoustic sensors and the selection of filter functions.
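Wiener-filter deconvolution of a blurred acoustic trace can be sketched with NumPy FFTs on a 1-D signal; the impulse response, reflector positions and noise level below are synthetic stand-ins for the sensor characteristics, not the paper's data.

```python
import numpy as np

def wiener_deconvolve(signal, kernel, snr):
    """Deconvolve `signal` by `kernel` with a Wiener filter in the
    frequency domain: X = Y * conj(H) / (|H|^2 + 1/snr)."""
    n = len(signal)
    H = np.fft.fft(kernel, n)
    Y = np.fft.fft(signal)
    G = np.conj(H) / (np.abs(H) ** 2 + 1.0 / snr)   # Wiener filter transfer function
    return np.real(np.fft.ifft(Y * G))

# Synthetic example: a sparse reflectivity series blurred by a short pulse.
rng = np.random.default_rng(0)
x = np.zeros(256); x[60] = 1.0; x[140] = -0.6        # true reflector positions
h = np.exp(-np.arange(16) / 4.0)                     # assumed impulse response
y = np.convolve(x, h)[:256] + 0.01 * rng.standard_normal(256)
x_hat = wiener_deconvolve(y, h, snr=1e4)             # recovers the two spikes
```

The `1/snr` term regularizes frequencies where the kernel response is weak, which is what makes the choice of filter function matter, as the abstract notes.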
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jonathan Helmus, Scott Collis
The Python-ARM Radar Toolkit (Py-ART) is a collection of radar quality control and retrieval codes which all work on two unifying Python objects: the PyRadar and PyGrid objects. By building ingests for several popular radar formats and then abstracting the interface, Py-ART greatly simplifies data processing compared with several other available utilities. In addition, Py-ART uses NumPy arrays as its primary storage mechanism, enabling the use of existing and extensive community software tools.
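The point about NumPy arrays as the storage mechanism can be illustrated with a simple quality-control step on a masked reflectivity field. This is a toy analogue of the threshold filters a radar QC toolkit applies before retrievals; the array layout and thresholds are hypothetical, not Py-ART's actual API.

```python
import numpy as np

def mask_weak_gates(reflectivity, min_dbz=-5.0):
    """Return a masked copy of a (rays, gates) reflectivity array with
    missing values (NaN) and gates below `min_dbz` masked out."""
    field = np.ma.masked_invalid(reflectivity)      # mask missing gates
    return np.ma.masked_less(field, min_dbz)        # mask weak echoes

# Hypothetical 3-ray x 4-gate sweep with one missing gate (NaN).
sweep = np.array([[12.0, -10.0, 35.5, np.nan],
                  [ 8.2,   0.0, -7.5, 22.1],
                  [-20.0, 14.4,  3.3,  5.0]])
qc = mask_weak_gates(sweep, min_dbz=-5.0)
```

Because the masked result is still an ordinary NumPy array underneath, downstream community tools (plotting, gridding, statistics) can consume it directly.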
Zineldin, Mosad
2006-01-01
To examine the major factors affecting patients' perception of cumulative satisfaction and to address the question of whether patients in Egypt and Jordan evaluate the quality of health care similarly or differently. A conceptual model including behavioural dimensions of patient-physician relationships and patient satisfaction has been developed. As the empirical research setting, this study concerns three hospitals in Egypt and Jordan. The survey instrument, in questionnaire form, was designed to achieve the research objectives. A total of 48 items (attributes) of the newly developed five quality dimensions were identified as the most relevant. A total of 224 complete and usable questionnaires were received from the in-patients. Hospital C has above-average total and dimensional qualities, and its patients are the most satisfied on all dimensions of service. Hospitals A and B have below-average total qualities, as the majority of patients are not satisfied with services. Comparing hospitals A and B, in the majority of dimensions (with the exception of Q5) the quality in hospital B is higher than in hospital A. Patients' satisfaction with different service quality dimensions is correlated with their willingness to recommend the hospital to others. One way to improve the quality of health-care services is to apply total relationship management and the 5Qs model together with a customer orientation strategy. The results can be used by the hospitals to creatively reengineer and redesign their quality management processes and the future direction of their more effective health-care quality strategies. This research describes a study involving a new instrument and a new method which assure a reasonable level of relevance, validity and reliability, while being explicitly change-oriented.
This study argues that a patient's satisfaction is a cumulative construct, summing satisfaction with five different qualities (5Qs) of the hospital: quality of object, processes, infrastructure, interaction, and atmosphere.
High Throughput Multispectral Image Processing with Applications in Food Science.
Tsakanikas, Panagiotis; Pavlidis, Dimitris; Nychas, George-John
2015-01-01
Recently, machine vision has been gaining attention in food science as well as in the food industry for food quality assessment and monitoring. Within the framework of implementing Process Analytical Technology (PAT) in the food industry, image processing can be used not only to estimate and even predict food quality but also to detect adulteration. Toward these applications in food science, we present here a novel methodology for automated image analysis of several kinds of food products, e.g. meat, vanilla crème and table olives, so as to increase objectivity and data reproducibility, lower the cost of information extraction and speed up quality assessment, without human intervention. The outcome of image processing is propagated to the downstream analysis. The developed multispectral image processing method is based on an unsupervised machine learning approach (Gaussian Mixture Models) and a novel unsupervised scheme of spectral band selection for segmentation process optimization. Through the evaluation we demonstrate its efficiency and robustness against currently available semi-manual software, showing that the developed method is a high-throughput approach appropriate for massive data extraction from food samples.
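The segmentation core of the approach — a Gaussian mixture fitted by expectation-maximization — can be sketched in a few lines for 1-D pixel intensities. The data are synthetic, and a production pipeline like the one described would use a full multivariate, multi-band implementation.

```python
import numpy as np

def fit_gmm_1d(x, iters=50):
    """Fit a 2-component 1-D Gaussian mixture with EM; return the means,
    standard deviations, weights, and responsibilities of component 1."""
    mu = np.array([x.min(), x.max()], dtype=float)   # spread-out initialization
    sigma = np.array([x.std(), x.std()]) + 1e-6
    pi = np.array([0.5, 0.5])
    for _ in range(iters):
        # E-step: responsibility of each component for each point
        d = (x[:, None] - mu) / sigma
        p = pi * np.exp(-0.5 * d ** 2) / (sigma * np.sqrt(2 * np.pi))
        r = p / p.sum(axis=1, keepdims=True)
        # M-step: re-estimate parameters from weighted data
        n = r.sum(axis=0)
        mu = (r * x[:, None]).sum(axis=0) / n
        sigma = np.sqrt((r * (x[:, None] - mu) ** 2).sum(axis=0) / n) + 1e-6
        pi = n / len(x)
    return mu, sigma, pi, r[:, 1]

# Synthetic "background vs. object" pixel intensities.
rng = np.random.default_rng(0)
pixels = np.concatenate([rng.normal(40, 5, 500), rng.normal(160, 10, 300)])
mu, sigma, pi, resp = fit_gmm_1d(pixels)
segmented = resp > 0.5        # pixels assigned to the brighter component
```

Thresholding the responsibilities at 0.5 yields the segmentation mask; band selection then amounts to choosing the spectral channel(s) on which this separation is cleanest.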
APPLICATION OF DATA QUALITY OBJECTIVES AND MEASUREMENT QUALITY OBJECTIVES TO RESEARCH PROJECTS
The paper assists systematic planning for research projects. It presents planning concepts in terms that have some utility for researchers. For example, measurement quality objectives are more familiar to researchers than data quality objectives because these quality criteria are...
South Asia transboundary water quality monitoring workshop summary report.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Betsill, Jeffrey David; Littlefield, Adriane C.; Luetters, Frederick O.
2003-04-01
The Cooperative Monitoring Center (CMC) promotes collaborations among scientists and researchers in several regions as a means of achieving common regional security objectives. To promote cooperation in South Asia on environmental research, an international working group made up of participants from Bangladesh, India, Nepal, Pakistan, and the United States convened in Kathmandu, Nepal, from February 17-23, 2002. The workshop was held to further develop the South Asia Transboundary Water Quality Monitoring (SATWQM) project. The project is sponsored in part by the CMC located at Sandia National Laboratories in Albuquerque, New Mexico, through funding provided by the U.S. Department of State, Regional Environmental Affairs Office, American Embassy, Kathmandu, Nepal, and the National Nuclear Security Administration's (NNSA) Office of Nonproliferation and National Security. This report summarizes the SATWQM project and the workshop objectives, process and results. The long-term interests of the participants are to develop systems for sharing regional environmental information as a means of building confidence and improving relations among South Asian countries. The more immediate interests of the group are focused on activities that foster regional sharing of water quality data in the Ganges and Indus River basins. Issues of concern to the SATWQM network participants include studying the impacts from untreated sewage and industrial effluents, agricultural run-off, salinity increases in fresh waters, the siltation and shifting of river channels, and the environmental degradation of critical habitats such as wetlands, protected forests, and endangered aquatic species conservation areas.
The workshop focused on five objectives: (1) a deepened understanding of the partner organizations involved; (2) garnering the support of additional regional and national government and non-government organizations in South Asia involved in river water quality monitoring; (3) identification of sites within the region at which water quality data are to be collected; (4) instituting a data and information collection and sharing process; and (5) training of partners in the use of water quality monitoring equipment.
Quality assessment of color images based on the measure of just noticeable color difference
NASA Astrophysics Data System (ADS)
Chou, Chun-Hsien; Hsu, Yun-Hsiang
2014-01-01
Accurate assessment of the quality of color images is an important step in many image processing systems that convey visual information of the reproduced images. An accurate objective image quality assessment (IQA) method is expected to give assessment results that agree closely with subjective assessment. To assess the quality of color images, many approaches simply apply a metric for assessing the quality of gray-scale images to each of the three color channels of the color image, neglecting the correlation among the three color channels. In this paper, a metric for assessing color image quality is proposed, in which the model of variable just-noticeable color difference (VJNCD) is employed to estimate the visibility thresholds of distortion inherent in each color pixel. With the estimated visibility thresholds of distortion, the proposed metric measures the average perceptible distortion in terms of the quantized distortion according to a perceptual error map similar to that defined by the National Bureau of Standards (NBS) for converting the color difference enumerated by CIEDE2000 to the objective score of perceptual quality assessment. The perceptual error map in this case is designed for each pixel according to the visibility threshold estimated by the VJNCD model. The performance of the proposed metric is verified by assessing the test images in the LIVE database, and is compared with those of many well-known IQA metrics. Experimental results indicate that the proposed metric is an effective IQA method that can accurately predict the image quality of color images in terms of the correlation between objective scores and subjective evaluation.
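The idea of counting only distortion that exceeds a per-pixel visibility threshold can be sketched with a simplified CIE76 color difference (Euclidean distance in CIELAB) standing in for CIEDE2000; the uniform threshold value below is illustrative, not the VJNCD model's output.

```python
import numpy as np

def perceptible_distortion(lab_ref, lab_test, jnd_map):
    """Per-pixel CIE76 color difference (Euclidean distance in CIELAB),
    counted only where it exceeds the local just-noticeable threshold."""
    delta_e = np.sqrt(((lab_ref - lab_test) ** 2).sum(axis=-1))
    visible = np.maximum(delta_e - jnd_map, 0.0)   # sub-threshold error ignored
    return visible.mean()

# Toy 2x2 CIELAB images with a uniform illustrative JND threshold of 2.3.
ref  = np.array([[[50.0, 10.0, 10.0], [60.0, 0.0, 0.0]],
                 [[70.0, -5.0, 5.0],  [40.0, 20.0, -10.0]]])
test = ref + np.array([1.0, 0.0, 0.0])             # small lightness shift
score = perceptible_distortion(ref, test, jnd_map=np.full((2, 2), 2.3))
```

With the 1-unit shift the distortion is everywhere below threshold, so the score is zero; a pixel-wise VJNCD model would replace the uniform `jnd_map` with locally varying thresholds.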
NASA Astrophysics Data System (ADS)
Shu, Hui; Zhou, Xideng
2014-05-01
The single-vendor single-buyer integrated production inventory system has long been an object of study, but little is known about the effect of investing in setup cost reduction and process-quality improvement for an integrated inventory system in which the products are sold with a free minimal repair warranty. The purpose of this article is to minimise the integrated cost by simultaneously optimising the number of shipments, the shipment quantity, the setup cost, and the process quality. An efficient algorithm is proposed for determining the optimal decision variables. A numerical example is presented to illustrate the results of the proposed models graphically. Sensitivity analysis of the model with respect to key parameters of the system is carried out. The paper shows that the proposed integrated model can result in significant savings in the integrated cost.
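The flavor of such joint optimisation can be illustrated with a stylized textbook-style model (not the paper's warranty model): investing in setup reduction lowers the setup cost exponentially, the lot size is then the EOQ for that reduced setup, and one searches over investment levels. All parameter values are hypothetical.

```python
import math

def annual_cost(invest, D=1000.0, h=2.0, S0=400.0, delta=500.0, a=0.1):
    """Stylized annual cost when investing `invest` in setup reduction:
    setup cost falls to S0*exp(-invest/delta); with the lot size set by
    the EOQ for that setup, ordering-plus-holding cost is sqrt(2*D*h*S),
    to which the amortized investment a*invest is added."""
    S = S0 * math.exp(-invest / delta)
    return math.sqrt(2.0 * D * h * S) + a * invest

# Grid search over investment levels (all parameter values hypothetical).
best_invest = min(range(0, 3001, 10), key=annual_cost)
best_cost = annual_cost(best_invest)
```

The interior optimum reflects the tradeoff the abstract describes: spending on setup reduction pays for itself until its marginal benefit falls below the amortization rate.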
NASA Astrophysics Data System (ADS)
Rocha, José Celso; Passalia, Felipe José; Matos, Felipe Delestro; Takahashi, Maria Beatriz; Maserati, Marc Peter, Jr.; Alves, Mayra Fernanda; de Almeida, Tamie Guibu; Cardoso, Bruna Lopes; Basso, Andrea Cristina; Nogueira, Marcelo Fábio Gouveia
2017-12-01
There is currently no objective, real-time and non-invasive method for evaluating the quality of mammalian embryos. In this study, we processed images of in vitro produced bovine blastocysts to obtain a deeper comprehension of the embryonic morphological aspects that are related to the standard evaluation of blastocysts. Information was extracted from 482 digital images of blastocysts. The resulting imaging data were individually evaluated by three experienced embryologists who graded their quality. To avoid evaluation bias, each image was related to the modal value of the evaluations. Automated image processing produced 36 quantitative variables for each image. The images, the modal and individual quality grades, and the variables extracted could potentially be used in the development of artificial intelligence techniques (e.g., evolutionary algorithms and artificial neural networks), multivariate modelling and the study of defined structures of the whole blastocyst.
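The kind of quantitative variables the automated processing extracts can be illustrated with a few basic descriptors computed from a grayscale image array; the features and the synthetic image below are toy stand-ins, not the study's 36 descriptors.

```python
import numpy as np

def basic_features(img):
    """Extract a few simple quantitative variables from a grayscale image:
    global intensity statistics plus a crude foreground segmentation."""
    mask = img > img.mean()                 # crude foreground threshold
    return {
        "mean_intensity": float(img.mean()),
        "intensity_std": float(img.std()),
        "foreground_fraction": float(mask.mean()),
        "centroid_row": float(np.argwhere(mask)[:, 0].mean()),
    }

# Synthetic 64x64 image with a bright central disc as the "embryo".
yy, xx = np.mgrid[:64, :64]
img = ((yy - 32) ** 2 + (xx - 32) ** 2 < 15 ** 2).astype(float)
feats = basic_features(img)
```

Feature vectors of this kind, computed per image, are what the described dataset pairs with the embryologists' modal quality grades for model training.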
Scientific and Regulatory Considerations in Solid Oral Modified Release Drug Product Development.
Li, Min; Sander, Sanna; Duan, John; Rosencrance, Susan; Miksinski, Sarah Pope; Yu, Lawrence; Seo, Paul; Rege, Bhagwant
2016-11-01
This review presents scientific and regulatory considerations for the development of solid oral modified release (MR) drug products. It includes a rationale for patient-focused development based on Quality-by-Design (QbD) principles. Product and process understanding of MR products includes identification and risk-based evaluation of critical material attributes (CMAs), critical process parameters (CPPs), and their impact on critical quality attributes (CQAs) that affect the clinical performance. The use of various biopharmaceutics tools that link the CQAs to a predictable and reproducible clinical performance for patient benefit is emphasized. Product and process understanding lead to a more comprehensive control strategy that can maintain product quality through the shelf life and the lifecycle of the drug product. The overall goal is to develop MR products that consistently meet the clinical objectives while mitigating the risks to patients by reducing the probability and increasing the detectability of CQA failures.
Research management peer exchange hosted by the Oregon Department of Transportation.
DOT National Transportation Integrated Search
1998-06-01
The objectives of the peer exchange process were to: : -Identify how ODOT can improve the quality of the research results. : -Examine how ODOT can better implement research findings. : -Identify methods to determine the value of research. : -Determine...
Creating an Overall Environmental Quality Index: Assessing Available Data
Background and Objectives: The interaction between environmental insults and human health is a complex process. Environmental exposures tend to cluster, and disamenities such as landfills or industrial plants are often located in neighborhoods with a high percentage of minority a...
Research management peer exchange hosted by the Ohio Department of Transportation, August 5-7, 2002.
DOT National Transportation Integrated Search
2002-08-01
The expressed objectives of the Peer Exchange were to: : Enhance the overall research process : Enhance implementation and tracking of research results : Improve the quality and accuracy of preliminary research cost estimates prepared : internally pr...
An Introduction to Database Structure and Database Machines.
ERIC Educational Resources Information Center
Detweiler, Karen
1984-01-01
Enumerates principal management objectives of database management systems (data independence, quality, security, multiuser access, central control) and criteria for comparison (response time, size, flexibility, other features). Conventional database management systems, relational databases, and database machines used for backend processing are…
Lucyk, Kelsey; Tang, Karen; Quan, Hude
2017-11-22
Administrative health data are increasingly used for research and surveillance to inform decision-making because of their large sample sizes, geographic coverage, comprehensiveness, and possibility for longitudinal follow-up. Within Canadian provinces, individuals are assigned unique personal health numbers that allow for linkage of administrative health records in that jurisdiction. It is therefore necessary to ensure that these data are of high quality, and that chart information is accurately coded to this end. Our objective is to explore the potential barriers that exist for high-quality data coding through qualitative inquiry into the roles and responsibilities of medical chart coders. We conducted semi-structured interviews with 28 medical chart coders from Alberta, Canada. We used thematic analysis and open-coded each transcript to understand the process of administrative health data generation and identify barriers to its quality. The process of generating administrative health data is highly complex and involves a diverse workforce. As such, there are multiple points in this process that introduce challenges for high-quality data. For coders, the main barriers to data quality occurred around chart documentation, variability in the interpretation of chart information, and high quota expectations. This study illustrates the complex nature of barriers to high-quality coding in the context of administrative data generation. The findings from this study may be of use to data users, researchers, and decision-makers who wish to better understand the limitations of their data or pursue interventions to improve data quality.
NASA Astrophysics Data System (ADS)
Murali, Swetha; Ponmalar, V.
2017-07-01
To make innovation and continuous improvement the norm, some traditional practices must be unlearnt. Change for growth and competitiveness is required for sustainability in any profitable business, such as the construction industry. Leading companies are willing to implement Total Quality Management (TQM) principles to realise potential advantages and improve growth and efficiency. Research has repeatedly identified quality as the most significant contributor to competitive advantage in industrial leadership. The two objectives of this paper are (1) to identify TQM effectiveness in residential projects and (2) to identify client satisfaction/dissatisfaction areas using the Analytical Hierarchy Process (AHP) and suggest effective mitigation measures. Using statistical survey techniques, such as a questionnaire survey, it is observed that total quality management was applied to some extent in leading successful organizations. The main attributes for quality achievement can be defined as teamwork and better communication, with a single agreed goal between client and contractor. On-site safety is a paramount attribute in identifying quality within residential projects. Process-based quality methods, such as safe on-site working conditions, safety management systems and modern engineering process safety controls, act as interlinked functions. Training and effective communication with all stakeholders on quality management principles is essential for effective quality work. Only through effective TQM principles can companies avoid contract litigation and increase the client satisfaction index.
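The AHP weighting used to rank satisfaction criteria reduces a reciprocal pairwise comparison matrix to a priority vector (its principal eigenvector) and checks the judgments for consistency. The three criteria and the comparison judgments below are illustrative, not the survey's.

```python
import numpy as np

# Saaty's Random Index values for the consistency ratio, by matrix size.
RI = {1: 0.0, 2: 0.0, 3: 0.58, 4: 0.90, 5: 1.12}

def ahp_weights(A):
    """Priority vector (normalized principal eigenvector) and consistency
    ratio of a reciprocal pairwise comparison matrix A."""
    vals, vecs = np.linalg.eig(A)
    k = np.argmax(vals.real)                    # principal eigenvalue index
    w = np.abs(vecs[:, k].real)
    w /= w.sum()                                # normalize to a weight vector
    n = A.shape[0]
    ci = (vals[k].real - n) / (n - 1)           # consistency index
    return w, ci / RI[n]                        # (weights, consistency ratio)

# Illustrative judgments over three criteria: safety, communication, finish.
A = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 2.0],
              [1/5, 1/2, 1.0]])
w, cr = ahp_weights(A)
```

A consistency ratio below 0.1 is conventionally taken to mean the pairwise judgments are acceptably consistent; otherwise the respondent's comparisons are revisited.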
Sobottka, Stephan B; Töpfer, Armin; Eberlein-Gonska, Maria; Schackert, Gabriele; Albrecht, D Michael
2010-01-01
Six Sigma is an innovative management approach for reaching practicable zero-defect quality in medical service processes. The Six Sigma principle utilizes strategies which are based on quantitative measurements and which seek to optimize processes and limit deviations or dispersion from the target process. Hence, Six Sigma aims to eliminate errors or quality problems of all kinds. A pilot project to optimize the preparation for neurosurgery has now shown that the Six Sigma method enhanced patient safety in medical care, while at the same time disturbances in hospital processes and failure costs could be avoided. All six defined safety-relevant quality indicators were significantly improved by changes in the workflow, using a standardized process- and patient-oriented approach. Certain defined quality standards, such as 100% complete surgical preparation at the start of surgery and the required initial contact of the surgeon with the patient/surgical record on the eve of surgery, could be fulfilled within the range of practical zero-defect quality. Likewise, the degree of completion of the surgical record by 4 p.m. on the eve of surgery and its quality could be improved by factors of 170 and 16, respectively, at sigma values of 4.43 and 4.38. The other two safety quality indicators, "non-communicated changes in the OR schedule" and "completeness of the OR schedule by 12:30 a.m. on the day before surgery", also show an impressive improvement, by factors of 2.8 and 7.7, respectively, corresponding to sigma values of 3.34 and 3.51. The results of this pilot project demonstrate that the Six Sigma method is eminently suitable for improving the quality of medical processes. In our experience this methodology is suitable even for complex clinical processes with a variety of stakeholders. In particular, in processes in which patient safety plays a key role, the objective of achieving zero-defect quality is reasonable and should definitely be aspired to.
Copyright © 2010. Published by Elsevier GmbH.
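Sigma values like those reported (e.g., 4.43) conventionally come from converting a defect rate to a normal-quantile "sigma level" with the customary 1.5-sigma shift; whether this study used exactly that convention is an assumption, but the standard computation is a one-liner with the Python standard library.

```python
from statistics import NormalDist

def sigma_level(defects, opportunities):
    """Convert a defect rate to a short-term sigma level using the
    conventional 1.5-sigma shift: z(process yield) + 1.5."""
    dpmo = defects / opportunities * 1_000_000      # defects per million opportunities
    return NormalDist().inv_cdf(1 - dpmo / 1_000_000) + 1.5

# Example: 50 defects in 10,000 opportunities -> DPMO of 5,000.
level = sigma_level(50, 10_000)
```

Under this convention the textbook "six sigma" target of 3.4 DPMO maps back to a level of 6.0, which is a useful sanity check on the formula.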
Ebadifar, Asghar; Baradaran Eftekhari, Monir; Owlia, Parviz; Habibi, Elham; Ghalenoee, Elham; Bagheri, Mohammad Reza; Falahat, Katayoun; Eltemasi, Masoumeh; Sobhani, Zahra; Akhondzadeh, Shahin
2017-11-01
Research evaluation is a systematic and objective process to measure the relevance, efficiency and effectiveness of research activities, and peer review is one of the most important tools for assessing the quality of research. The aim of this study was to introduce research evaluation indicators based on peer review. This study was implemented in 4 stages. A list of objective-oriented evaluation indicators was designed in 4 axes: governance and leadership, structure, knowledge production and research impact. The top 10% of medical sciences research centers (RCs) were evaluated based on peer review. Adequate equipment and laboratory instruments, high-quality research publication and national or international cooperation were the main strengths of the medical sciences RCs, and the most important weaknesses included failure to adhere to strategic plans, parallel actions in similar fields, problems in manpower recruitment, and knowledge translation and exchange (KTE) at the service provider and policy maker levels. Peer review evaluation can improve the quality of research.
Total quality in acute care hospitals: guidelines for hospital managers.
Holthof, B
1991-08-01
Quality improvement cannot focus exclusively on peer review and the scientific evaluation of medical care processes. These essential elements have to be complemented with a focus on individual patient needs and preferences. Only then will hospitals create the competitive advantage needed to survive in an increasingly market-driven hospital industry. Hospital managers can identify these patients' needs by 'living the patient experience' and should then set the hospital's quality objectives according to its target patients and their needs. Excellent quality program design, however, is not sufficient. Successful implementation of a quality improvement program further requires fundamental changes in pivotal jobholders' behavior and mindset and in the supporting organizational design elements.
Effect of storage conditions on sensory properties of Bierzo roasted pepper.
Casquero, Pedro A; Sanz, Miguel A; Guerra, Marcos
2011-01-15
Roasted pepper is marketed with the European recognition of Protected Geographical Indication 'Pimiento Asado del Bierzo'. The industry needs to prolong the period in which fresh pepper received from farmers is available to be processed, without deteriorating the sensory quality of roasted pepper. The objective of this study was to analyse how different storage conditions affect the sensory quality of roasted pepper. Differences in weight loss among storage conditions did not affect roast yield. The descriptors juice quality, bitterness and spiciness were not influenced by storage conditions in 2006 or 2007, whereas uniformity, skin surface, cohesiveness and smokiness were influenced by storage conditions in both years. Overall quality was better when pepper was stored for 5 days at 18 °C or for 10 days at 8 °C. The quality of roasted pepper was affected positively by storage conditions in terms of colour and uniformity, which were improved, and hardness, which was reduced. Newly roasted samples, on the other hand, obtained the lowest quality values. Therefore storage of pepper for up to 10 days was useful not only to extend the time of roasted pepper processing for companies but also to improve the sensory quality of roasted pepper without decreasing the roast yield of processed pepper. Copyright © 2010 Society of Chemical Industry.
NASA Technical Reports Server (NTRS)
Larson, David J.; Casagrande, Luis G.; DiMarzio, Don; Alexander, J. Iwan D.; Carlson, Fred; Lee, Taipo; Dudley, Michael; Raghathamachar, Balaji
1998-01-01
The Orbital Processing of High-Quality Doped and Alloyed CdTe Compound Semiconductors program was initiated to investigate, quantitatively, the influences of gravitationally dependent phenomena on the growth and quality of bulk compound semiconductors. The objective was to improve crystal quality (both structural and compositional) and to better understand and control the variables within the crystal growth production process. The empirical effort entailed the development of a terrestrial (one-g) experiment baseline for quantitative comparison with microgravity (mu-g) results. This effort was supported by the development of high-fidelity process models of heat transfer, fluid flow and solute redistribution, and thermo-mechanical stress occurring in the furnace, safety cartridge, ampoule, and crystal throughout the melting, seeding, crystal growth, and post-solidification processing. In addition, the sensitivity of the orbital experiments was analyzed with respect to the residual microgravity (mu-g) environment, both steady state and g-jitter. CdZnTe crystals were grown in one-g and in mu-g. Crystals processed terrestrially were grown at the NASA Ground Control Experiments Laboratory (GCEL) and at Grumman Aerospace Corporation (now Northrop Grumman Corporation). Two mu-g crystals were grown in the Crystal Growth Furnace (CGF) during the First United States Microgravity Laboratory Mission (USML-1), STS-50, June 24 - July 9, 1992.
NASA Technical Reports Server (NTRS)
1976-01-01
The onboard experiment data support facility (OEDSF) will provide data processing support to various experiment payloads on board the space shuttle. The OEDSF study will define the conceptual design and generate specifications for an OEDSF which will meet the following objectives: (1) provide a cost-effective approach to end-to-end processing requirements, (2) service multiple disciplines, (3) satisfy user needs, (4) reduce the amount and improve the quality of data collected, stored and processed, and (5) embody growth capacity.
MO-E-9A-01: Risk Based Quality Management: TG100 In Action
DOE Office of Scientific and Technical Information (OSTI.GOV)
Huq, M; Palta, J; Dunscombe, P
2014-06-15
One of the goals of quality management in radiation therapy is to gain high confidence that patients will receive the prescribed treatment correctly. To accomplish these goals, professional societies such as the American Association of Physicists in Medicine (AAPM) have published many quality assurance (QA), quality control (QC), and quality management (QM) guidance documents. In general, the recommendations provided in these documents have emphasized performing device-specific QA at the expense of process flow and protection of the patient against catastrophic errors. Analyses of radiation therapy incidents find that they are caused more often by flaws in the overall therapy process, from initial consult through final treatment, than by isolated hardware or computer failures detectable by traditional physics QA. This challenge is shared by many intrinsically hazardous industries. Risk assessment tools and analysis techniques have been developed to define, identify, and eliminate known and/or potential failures, problems, or errors from a system, process and/or service before they reach the customer. These include, but are not limited to, process mapping, failure modes and effects analysis (FMEA), fault tree analysis (FTA), and the establishment of a quality management program that best avoids the faults and risks that have been identified in the overall process. These tools can be easily adapted to radiation therapy practices because of their simplicity and effectiveness in providing efficient ways to enhance the safety and quality of treatment processes. Task group 100 (TG100) of the AAPM has developed a risk-based quality management program that uses these tools. This session will be devoted to a discussion of these tools and how they can be used in a given radiotherapy clinic to develop a risk-based QM program. Learning Objectives: Learn how to design a process map for a radiotherapy process. Learn how to perform an FMEA analysis for a given process.
Learn what fault tree analysis is all about. Learn how to design a quality management program based upon the information obtained from process mapping, FMEA and FTA.
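The FMEA scoring arithmetic underlying the approach described above can be sketched in a few lines. The failure modes and the severity/occurrence/detectability ratings below are hypothetical, not taken from TG100; they only illustrate how risk priority numbers rank failure modes.

```python
# FMEA sketch: risk priority number (RPN) = severity x occurrence x detectability,
# each rated on a 1-10 scale. Failure modes and ratings here are hypothetical.
failure_modes = [
    ("wrong patient chart loaded", 9, 3, 4),
    ("incorrect field size transcribed", 7, 4, 3),
    ("couch position not verified", 5, 5, 2),
]

# Rank failure modes by RPN, highest risk first
ranked = sorted(
    ((name, s * o * d) for name, s, o, d in failure_modes),
    key=lambda x: x[1],
    reverse=True,
)
for name, rpn in ranked:
    print(f"{rpn:4d}  {name}")
```

In a real TG100-style analysis the ratings come from a multidisciplinary team walking the process map, and the highest-RPN modes drive where QC effort is placed.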
ERIC Educational Resources Information Center
Scramlin, Stacy Maurine
2009-01-01
The objective of this research was to determine areas of improvement to bacon production. The first trial was conducted to determine differences in belly and bacon quality traits in pigs fed ractopamine (RAC) for various durations during finishing. A 2x3x2 factorial arrangement was used with barrows and gilts, fed RAC levels of 0.0, 5.0, or 7.4…
NASA Astrophysics Data System (ADS)
Kang, Chao; Shi, Yaoyao; He, Xiaodong; Yu, Tao; Deng, Bo; Zhang, Hongji; Sun, Pengcheng; Zhang, Wenbin
2017-09-01
This study investigates the multi-objective optimization of quality characteristics for a T300/epoxy prepreg tape-wound cylinder. The method integrates the Taguchi method, grey relational analysis (GRA) and response surface methodology, and is adopted to improve tensile strength and reduce residual stress. In the winding process, the main process parameters involving winding tension, pressure, temperature and speed are selected to evaluate the parametric influences on tensile strength and residual stress. Experiments are conducted using the Box-Behnken design. Based on principal component analysis, the grey relational grades are properly established to convert multi-responses into an individual objective problem. Then the response surface method is used to build a second-order model of grey relational grade and predict the optimum parameters. The predictive accuracy of the developed model is proved by two test experiments with a low prediction error of less than 7%. The following process parameters, namely winding tension 124.29 N, pressure 2000 N, temperature 40 °C and speed 10.65 rpm, have the highest grey relational grade and give better quality characteristics in terms of tensile strength and residual stress. The confirmation experiment shows that better results are obtained with GRA improved by the proposed method than with ordinary GRA. The proposed method is proved to be feasible and can be applied to optimize the multi-objective problem in the filament winding process.
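The grey relational grade computation at the core of the method above can be sketched roughly as follows. This is a simplified illustration with equal weights and hypothetical response data; the paper itself derives the weights from principal component analysis and couples the grades to a response surface model.

```python
import numpy as np

def grey_relational_grade(responses, larger_better, zeta=0.5, weights=None):
    """Grey relational grade for a matrix of experimental responses.

    responses: (n_runs, n_responses) array; larger_better: one bool per
    response column; zeta is the distinguishing coefficient (0.5 is the
    customary choice). Equal weights are used unless given.
    """
    X = np.asarray(responses, dtype=float)
    # Normalize each response to [0, 1] (larger- or smaller-the-better)
    norm = np.empty_like(X)
    for j, lb in enumerate(larger_better):
        lo, hi = X[:, j].min(), X[:, j].max()
        norm[:, j] = (X[:, j] - lo) / (hi - lo) if lb else (hi - X[:, j]) / (hi - lo)
    delta = 1.0 - norm                      # deviation from the ideal sequence
    gmin, gmax = delta.min(), delta.max()   # global min/max deviations
    coeff = (gmin + zeta * gmax) / (delta + zeta * gmax)
    w = np.full(X.shape[1], 1.0 / X.shape[1]) if weights is None else np.asarray(weights)
    return coeff @ w

# Hypothetical winding runs: (tensile strength in MPa, residual stress in MPa)
grades = grey_relational_grade([[850, 40], [900, 55], [820, 30]],
                               larger_better=[True, False])
```

The run with the highest grade is the best compromise across both responses, which is how the multi-response problem collapses to a single objective.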
Benchmark matrix and guide: Part III.
1992-01-01
The final article in the "Benchmark Matrix and Guide" series developed by Headquarters Air Force Logistics Command completes the discussion of the last three categories that are essential ingredients of a successful total quality management (TQM) program. Detailed behavioral objectives are listed in the areas of recognition, process improvement, and customer focus. These vertical categories are meant to be applied to the levels of the matrix that define the progressive stages of the TQM: business as usual, initiation, implementation, expansion, and integration. By charting the horizontal progress level and the vertical TQM category, the quality management professional can evaluate the current state of TQM in any given organization. As each category is completed, new goals can be defined in order to advance to a higher level. The benchmarking process is integral to quality improvement efforts because it focuses on the highest possible standards to evaluate quality programs.
NASA Astrophysics Data System (ADS)
Byun, D. W.; Rappenglueck, B.; Lefer, B.
2007-12-01
Accurate meteorological and photochemical modeling efforts are necessary to understand the measurements made during the Texas Air Quality Study (TexAQS-II). The main objective of the study is to understand the meteorological and chemical processes of high ozone and regional haze events in eastern Texas, including the Houston-Galveston metropolitan area. Real-time and retrospective meteorological and photochemical model simulations were performed to study key physical and chemical processes in the Houston Galveston Area. In particular, the Vertical Mixing Experiment (VME) at the University of Houston campus was performed on selected days during the TexAQS-II. Results of the MM5 meteorological model and CMAQ air quality model simulations were compared with the VME and other TexAQS-II measurements to understand the interaction of the boundary layer dynamics and photochemical evolution affecting Houston air quality.
2013-01-01
In 2003, the International Patient Decision Aid Standards (IPDAS) Collaboration was established to enhance the quality and effectiveness of patient decision aids by establishing an evidence-informed framework for improving their content, development, implementation, and evaluation. Over this 10 year period, the Collaboration has established: a) the background document on 12 core dimensions to inform the original modified Delphi process to establish the IPDAS checklist (74 items); b) the valid and reliable IPDAS instrument (47 items); and c) the IPDAS qualifying (6 items), certifying (6 items + 4 items for screening), and quality criteria (28 items). The objective of this paper is to describe the evolution of the IPDAS Collaboration and discuss the standardized process used to update the background documents on the theoretical rationales, evidence and emerging issues underlying the 12 core dimensions for assessing the quality of patient decision aids. PMID:24624947
Health authority commissioning for quality in contraception services
Newman, M.; Bardsley, M.; Morgan, D.; Jacobson, B.
1998-01-01
OBJECTIVE: To compare the commissioning of contraception services by London health authorities with accepted models of good practice. DESIGN: Combined interview and postal surveys of all health authorities and National Health Service (NHS) trusts responsible for running family planning clinics in the Greater London area. MAIN OUTCOME MEASURES: Health authority commissioning was assessed on the presence of four key elements of good practice--strategies, coordination, service specifications, and quality standards in contracts--by monitoring activity and quality. RESULTS: Less than half the health authorities surveyed had written strategies or service specifications for contraception services. Arrangements for coordination of services were limited and monitoring was underdeveloped. CONCLUSION: The process of commissioning services for contraception seems to be relatively underdeveloped despite the importance of health problems associated with unplanned pregnancy in London. These findings raise questions about the capacity of health authorities to improve the quality of these services through the commissioning process. PMID:10185140
Quality Assurance Systems in Education and Training in Europe
NASA Astrophysics Data System (ADS)
Voinia, Claudiu Sorin; Tuşa, Ana; Simion, Carmen
2014-11-01
Member States have a duty to compare and learn from one another's national education and professional training systems. The objectives of this paper were to identify specific characteristics and developments and to highlight key priorities in coordinating the development of specific quality assurance processes in the European Union. The aim of this work was to present the quality assurance systems for vocational education and training in the Member States of the European Union. The results identify the extent to which the national initiatives of EU Member States show interest in the quality of education. Data from this research can be useful in developing strategic sector development programs and in local schools.
Identification of Ways to Improve Military Construction for Energy-Efficient Facilities.
1987-12-01
in service. Thus, it is necessary to control techniques, materials, and equipment as part of the Military Construction, Army (MCA) process to ensure...Moreover, USACE often lacks proper test equipment and trained personnel at many construction sites. The result is that acceptance testing often is...on a few diagnostic procedures. USACE quality assurance inspectors would be trained to do the tests. Objectives: The overall objective of this
[IMPLEMENTATION OF A QUALITY MANAGEMENT SYSTEM IN A NUTRITION UNIT ACCORDING TO ISO 9001:2008].
Velasco Gimeno, Cristina; Cuerda Compés, Cristina; Alonso Puerta, Alba; Frías Soriano, Laura; Camblor Álvarez, Miguel; Bretón Lesmes, Irene; Plá Mestre, Rosa; Izquierdo Membrilla, Isabel; García-Peris, Pilar
2015-09-01
The implementation of quality management systems (QMS) in the health sector has made great progress in recent years and remains a key tool for the management and improvement of services provided to patients. The aim was to describe the process of implementing a quality management system (QMS) according to the standard ISO 9001:2008 in a Nutrition Unit. The implementation began in October 2012. The Nutrition Unit was supported by the Hospital Preventive Medicine and Quality Management Service (PMQM). Initially, training sessions on QMS and ISO standards were held for staff. A Quality Committee (QC) was established with representation of the medical and nursing staff. Every week, meetings took place among members of the QC and PMQM to define processes, procedures and quality indicators. A 2-month follow-up of these documents was carried out after their validation. A total of 4 processes were identified and documented (Nutritional status assessment, Nutritional treatment, Monitoring of nutritional treatment, and Planning and control of oral feeding), along with 13 operating procedures in which all the activity of the Unit was described. The interactions among them were defined in the process map. Each process has associated specific quality indicators for measuring the state of the QMS and identifying opportunities for improvement. All the documents associated with the requirements of ISO 9001:2008 were developed: quality policy, quality objectives, quality manual, document and record control, internal audit, nonconformities, and corrective and preventive actions. The Unit was certified by AENOR in April 2013. The implementation of a QMS causes a reorganization of the activities of the Unit in order to meet customers' expectations. Documenting these activities ensures a better understanding of the organization, defines the responsibilities of all staff and brings better management of time and resources. A QMS also improves internal communication and is a motivational element.
Exploring the satisfaction and expectations of patients makes it possible to include their views in the design of care processes. Copyright AULA MEDICA EDICIONES 2014. Published by AULA MEDICA. All rights reserved.
Hörster, A C; Kulla, M; Brammen, D; Lefering, R
2018-06-01
Emergency department processes are often key to successful treatment. Therefore, the collection of quality indicators is demanded. A basis for this collection is systematic, electronic documentation. The development of paper-based documentation into an electronic and interoperable national emergency registry is, besides the establishment of quality management for emergency departments, a target of the AKTIN project. The objective of this research is the identification of internationally applied quality indicators. To investigate the current status of quality management in emergency departments based on quality indicators, a systematic literature search of the PubMed database, the Cochrane Library and the internet was performed. Of the 170 internationally applied quality indicators, 25 with at least two references were identified. A total of 10 quality indicators are ascertainable from the data set. An enlargement of the data set will enable the collection of seven further quality indicators. The inclusion of care data from beyond the emergency department processes will provide eight additional quality indicators. This work was able to show that the potential of a national emergency registry for the establishment of quality indicators corresponds with the international systems taken into consideration and could provide a comparable collection of quality indicators.
Modeling Department of Defense controlled atmosphere transshipments for forward deployed forces
DOT National Transportation Integrated Search
1998-03-01
The objective of this thesis is to explore the cost savings, product quality improvement, and process efficiencies that can be realized by the integrated design and application of an innovative logistics system for the purchase and transshipment of f...
DOT National Transportation Integrated Search
2007-01-01
In order to be eligible for federal funds, urbanized areas are required to maintain a continuing, cooperative, and comprehensive (3C) transportation planning process that results in plans and programs consistent with the planning objectives of the me...
Peer exchange hosted by the Wyoming Department of Transportation, November 6-9, 2006.
DOT National Transportation Integrated Search
2006-11-01
The objective of the peer exchange program is to give State Departments of Transportation a means to improve the quality and effectiveness of their research management processes. The peer exchange provides an opportunity for a State to examine its ...
An approach for quantitative image quality analysis for CT
NASA Astrophysics Data System (ADS)
Rahimi, Amir; Cochran, Joe; Mooney, Doug; Regensburger, Joe
2016-03-01
An objective and standardized approach to assessing the image quality of Computed Tomography (CT) systems is required in a wide variety of imaging processes to identify CT systems appropriate for a given application. We present an overview of the framework we have developed to help standardize and objectively assess CT image quality for different models of CT scanners used for security applications. Within this framework, we have developed methods to quantitatively measure metrics that should correlate with the feature identification, detection accuracy and precision, and image registration capabilities of CT machines, and to identify strengths and weaknesses in different CT imaging technologies in transportation security. To that end we have designed, developed and constructed phantoms that allow for systematic and repeatable measurements of roughly 88 image quality metrics, representing modulation transfer function, noise equivalent quanta, noise power spectra, slice sensitivity profiles, streak artifacts, CT number uniformity, CT number consistency, object length accuracy, CT number path length consistency, and object registration. Furthermore, we have developed a sophisticated MATLAB-based image analysis toolkit to analyze CT-generated images of phantoms and report these metrics in a format that is standardized across the considered models of CT scanners, allowing for comparative image quality analysis within a CT model or between different CT models. In addition, we have developed a modified sparse principal component analysis (SPCA) method that generates, compared to standard principal component analysis (PCA), a modified set of components with sparse loadings; it is used in conjunction with the Hotelling T2 statistical analysis method to compare, qualify, and detect faults in the tested systems.
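The Hotelling T2 statistic used for fault detection in frameworks like the one above can be sketched on generic component scores. The function name and the demo data are hypothetical; the paper's modified sparse PCA step is not reproduced here, only the T2 statistic applied to its scores.

```python
import numpy as np

def hotelling_t2(scores):
    """Hotelling T^2 statistic for each observation's component scores.

    scores: (n_obs, k) matrix of PCA (or sparse PCA) scores. A fault
    threshold would normally come from an F-distribution; only the
    statistic itself is computed here.
    """
    centered = scores - scores.mean(axis=0)
    cov = np.atleast_2d(np.cov(centered, rowvar=False))
    inv_cov = np.linalg.inv(cov)
    # t2_i = c_i^T S^{-1} c_i for every row c_i
    return np.einsum('ij,jk,ik->i', centered, inv_cov, centered)

# Hypothetical scores for five scans on two components
scores = np.array([[0, 0], [1, 0], [0, 1], [-1, 0], [0, -1]], dtype=float)
t2 = hotelling_t2(scores)
```

Scans whose T2 exceeds the chosen control limit would be flagged as deviating from the reference population of image-quality measurements.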
Physical-chemical quality of onion analyzed under drying temperature
NASA Astrophysics Data System (ADS)
Djaeni, M.; Arifin, U. F.; Sasongko, S. B.
2017-03-01
Drying is one of the conventional processes used to enhance the shelf life of onion. However, active compounds such as vitamins and anthocyanin (reflected in the red color) degrade due to the introduction of heat during the process. The objective of this research was to evaluate the thiamine content as well as the color of onion dried at different temperatures. As indicators, the thiamine content and color were observed every 30 minutes for 2 hours. Results showed that thiamine content and color were sensitively influenced by the temperature change. For example, after a 2-hour drying process at 50°C the thiamine degradation was 55.37%, whereas at 60°C with the same drying time the degradation was 74.01%. The quality degradation also increased with prolonged drying time.
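If first-order degradation kinetics are assumed (an assumption; the abstract reports only the percentage losses), the reported thiamine losses imply rate constants along these lines:

```python
import math

def first_order_rate(loss_fraction, hours):
    """Rate constant k (per hour) from C/C0 = exp(-k t),
    assuming first-order degradation kinetics."""
    return -math.log(1.0 - loss_fraction) / hours

# Reported thiamine losses after 2 h of drying
k50 = first_order_rate(0.5537, 2)  # at 50 degC, roughly 0.40 per hour
k60 = first_order_rate(0.7401, 2)  # at 60 degC, roughly 0.67 per hour
```

The roughly 1.7-fold increase in the implied rate constant for a 10°C rise is consistent with the abstract's observation that degradation is sensitive to temperature.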
Temperature Field Simulation of Powder Sintering Process with ANSYS
NASA Astrophysics Data System (ADS)
He, Hongxiu; Wang, Jun; Li, Shuting; Chen, Zhilong; Sun, Jinfeng; You, Ying
2018-03-01
To address the "spheroidization phenomenon" in the laser sintering of metal powder and other quality problems of the formed parts due to thermal effects, a three-dimensional transient finite element model of the metal powder was established, using atomized iron powder as the research object. The simulation of the moving heat source was realized by means of parametric design. The distribution of the temperature field during the sintering process under different laser powers and different spot sizes was simulated with the ANSYS software, fully considering the influence of heat conduction, thermal convection, thermal radiation and thermophysical parameters. The influence of these factors on the actual sintering process was also analyzed, which provides an effective way to control forming quality.
Importance of implementing an analytical quality control system in a core laboratory.
Marques-Garcia, F; Garcia-Codesal, M F; Caro-Narros, M R; Contreras-SanFeliciano, T
2015-01-01
The aim of the clinical laboratory is to provide useful information for the screening, diagnosis and monitoring of disease. The laboratory should ensure the quality of the extra-analytical and analytical processes, based on set criteria. To do this, it develops and implements a system of internal quality control designed to detect errors, and compares its data with those of other laboratories through external quality control. In this way it has a tool to check that the objectives set are being fulfilled and, in case of errors, to apply corrective actions and ensure the reliability of the results. This article sets out to describe the design and implementation of an internal quality control protocol, as well as its periodic assessment at 6-month intervals to determine compliance with pre-determined specifications (Stockholm Consensus(1)). A total of 40 biochemical and 15 immunochemical methods were evaluated using three different control materials. Next, a standard operating procedure was planned to develop a system of internal quality control that included calculating the error of the analytical process, setting quality specifications, and verifying compliance. The quality control data were then statistically depicted as means, standard deviations, and coefficients of variation, as well as systematic, random, and total errors. The quality specifications were then fixed and the operational rules to apply in the analytical process were calculated. Finally, our data were compared with those of other laboratories through an external quality assurance program. The development of an analytical quality control system is a highly structured process. It should be designed to detect errors that compromise the stability of the analytical process. The laboratory should review its quality indicators (systematic, random and total error) at regular intervals in order to ensure that they meet pre-determined specifications and, if not, apply the appropriate corrective actions.
Copyright © 2015 SECA. Published by Elsevier Espana. All rights reserved.
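The imprecision, bias and total-error bookkeeping described in the abstract above can be sketched as follows. The TE% = |bias%| + 1.65 x CV% convention and the demo values are assumptions; the article does not state its exact formula or data.

```python
import statistics

def qc_summary(measurements, target, tea):
    """Summarize one internal QC level: imprecision (CV), bias and total error.

    TE% = |bias%| + 1.65 * CV% is one common convention, assumed here.
    tea is the allowable total error (TEa) specification in percent.
    """
    mean = statistics.mean(measurements)
    sd = statistics.stdev(measurements)
    cv = 100 * sd / mean                   # random error: coefficient of variation, %
    bias = 100 * (mean - target) / target  # systematic error, %
    te = abs(bias) + 1.65 * cv             # total error, %
    return {"mean": mean, "cv%": cv, "bias%": bias,
            "TE%": te, "meets_spec": te <= tea}

# Hypothetical control material with an assigned target of 100 mg/dL
# and an allowable total error (TEa) of 10%
result = qc_summary([98, 101, 99, 102, 100, 97, 103, 100], target=100, tea=10)
```

Reviewing such a summary at regular intervals against the chosen specification is the periodic assessment step the protocol describes.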
[Quality concept in health care. Methodology for its measurement].
Morera Guitart, J
2003-12-01
It is increasingly necessary for neurologists to acquire basic knowledge of clinical management and medical care quality. We review the concepts of medical care quality (MCQ). Of the definitions examined, we want to emphasize the following aspects: a) application of current scientific knowledge; b) the interpersonal relationship; c) the environment where care is dispensed; d) health outcomes; e) the cost of care; f) risks for the patient; and g) patient satisfaction. For the analysis of MCQ we can distinguish several components: the scientific-technical component, efficacy, effectiveness, efficiency, accessibility, continuity, equity, appropriateness, and the satisfaction of the patient and of the professional. One of the main objectives of measuring MCQ is to improve care itself. For its measurement we can employ diverse methods depending on our objective: to improve the process, to perform benchmarking, to assess patient satisfaction or to guarantee the quality of medical attention. The most commonly used tools for this measurement are: the establishment of criteria-indicator-standard sets for quality, satisfaction questionnaires, key informant interviews, analysis of complaints and claims from patients and professionals, and clinical audits. The role of the neurologist in achieving high-quality neurological care is fundamental. Therefore, specific training is necessary in scientific and technical matters, communication skills, teamwork, task management and organisation, and pharmaco-economic evaluation, together with a cultural change that makes every professional co-responsible for the continuous improvement of the processes and results of his or her work, advancing gradually towards excellence in medical care.
Cibachrome testing. [photographic processing and printing materials
NASA Technical Reports Server (NTRS)
Weinstein, M. S.
1974-01-01
The use of Cibachrome products as a solution to problems encountered when contact printing Kodak film type SO-397 onto Kodak Ektachrome color reversal paper type 1993 is investigated. A roll of aerial imagery consisting of Kodak film types SO-397 and 2443 was contact printed onto Cibachrome and Kodak materials and compared in terms of color quality, resolution, cost, and compatibility with existing equipment and techniques. Objective measurements are given in terms of resolution and sensitometric response. Comparison prints and transparencies were viewed and ranked according to overall quality and aesthetic appeal. It is recommended that Cibachrome Print material be used in place of Kodak Ektachrome paper because it is more easily processed, the cost is equivalent, and it provides improved resolution, color quality, and image fade resistance.
[Thinking on the design of sham acupuncture in clinical research].
Pan, Li-Jia; Chen, Bo; Zhao, Xue; Guo, Yi
2014-01-01
Randomized controlled trials (RCTs) are the source of the raw data of evidence-based medicine. Blinding is adopted in most high-quality RCTs, and sham acupuncture is the main form of blinding in acupuncture clinical trials. In order to improve the quality of acupuncture clinical trials, and based on the necessity of sham acupuncture in clinical research, the current situation and the existing problems of sham acupuncture are reviewed, and suggestions are put forward regarding new approaches and new design methods that can be adopted as references, as well as factors that have to be considered during implementation. The various subjective and objective factors involved in the trial process should be considered; current international standards should be used; quantification should be sought as far as possible; and strict quality monitoring should be carried out.
A Framework Incorporating Community Preferences in Use ...
The report is intended to assist water quality officials, watershed managers, members of stakeholder groups, and other interested individuals in fully evaluating ecological and socioeconomic objectives and the gains and losses that often are involved in use attainment decisions. In addition, this report enables local, state, and tribal managers to better understand the benefits, as well as the costs, of attaining high water quality, and to incorporate community preferences in decision-making. Specific objectives are (1) to provide an introduction to the CWA and WQS regulation and analyses related to setting or changing designated uses; (2) create a basis for understanding the relationship between use-attainment decisions and the effects on ecosystems, ecosystem services, and ecological benefits; (3) serve as a reference for methods that elicit or infer preferences for benefits and costs related to attaining uses; and (4) present a process for incorporating new approaches in water quality decisions.
Hassan, Ali H; Amer, Hala A; Maghrabi, Abdulhamaid A
2005-01-01
The objectives of this research were to assess the quality of dental services delivered at King Abdulaziz University and to highlight the necessary recommendations that would improve it. The methods used included live photographs illustrating the structure of the faculty's dental services, presented in the clinic buildings, waiting areas, equipment, instruments and supplies, as well as comfort and privacy, and a review of the faculty's official records for the number, qualifications and training of the dental staff and auxiliary personnel, as well as the process of care (from patient registration until completion of treatment). Records also demonstrated the access to and utilization of services delivered in the various departments, the quality of these services, and the infection control measures and procedures. The results, based on evaluating the structure and process of care in the university dental clinics, revealed the high quality of the services delivered. The dental services of King Abdulaziz University conform to high quality standards, with the implementation of some changes for improvement and development.
2011-01-01
Background A framework for high quality in postgraduate training has been defined by the World Federation for Medical Education (WFME). The objective of this paper is to perform a systematic review of reviews to find current evidence regarding aspects of quality of postgraduate training and to organise the results following the 9 areas of the WFME framework. Methods The systematic literature review was conducted in 2009 in the Medline Ovid, EMBASE, ERIC and RDRB databases from 1995 onward. The reviews were selected by two independent researchers, and a quality appraisal was based on the SIGN tool. Results 31 reviews met the inclusion criteria. The majority of the reviews provided information about the training process (WFME area 2), the assessment of trainees (WFME area 3) and the trainees (WFME area 4). One review covered area 8, 'governance and administration'. No review was found in relation to the mission and outcomes, the evaluation of the training process and the continuous renewal (areas 1, 7 and 9 of the WFME framework, respectively). Conclusions The majority of the reviews provided information about the training process, the assessment of trainees and the trainees. Indicators used for quality assessment of postgraduate training should be based on this evidence, but further research is needed for some areas, in particular to assess the quality of the training process. PMID:21977898
DATA QUALITY OBJECTIVES AND MEASUREMENT QUALITY OBJECTIVES FOR RESEARCH PROJECTS
The paper provides assistance with systematic planning using measurement quality objectives to those working on research projects. These performance criteria are more familiar to researchers than data quality objectives because they are more closely associated with the measuremen...
MacDonald, D.D.; Carr, R.S.; Eckenrod, D.; Greening, H.; Grabe, S.; Ingersoll, C.G.; Janicki, S.; Janicki, T.; Lindskoog, R.A.; Long, E.R.; Pribble, R.; Sloane, G.; Smorong, D.E.
2004-01-01
Tampa Bay is a large, urban estuary that is located in west central Florida. Although water quality conditions represent an important concern in this estuary, information from numerous sources indicates that sediment contamination also has the potential to adversely affect aquatic organisms, aquatic-dependent wildlife, and human health. As such, protecting relatively uncontaminated areas of the bay from contamination and reducing the amount of toxic chemicals in contaminated sediments have been identified as high-priority sediment management objectives for Tampa Bay. To address concerns related to sediment contamination in the bay, an ecosystem-based framework for assessing and managing sediment quality conditions was developed that included identification of sediment quality issues and concerns, development of ecosystem goals and objectives, selection of ecosystem health indicators, establishment of metrics and targets for key indicators, and incorporation of key indicators, metrics, and targets into watershed management plans and decision-making processes. This paper describes the process that was used to select and evaluate numerical sediment quality targets (SQTs) for assessing and managing contaminated sediments. These SQTs included measures of sediment chemistry, whole-sediment and pore-water toxicity, and benthic invertebrate community structure. In addition, the paper describes how the SQTs were used to develop site-specific concentration-response models that describe how the frequency of adverse biological effects changes with increasing concentrations of chemicals of potential concern. Finally, a key application of the SQTs for defining sediment management areas is discussed.
Investigation into the Use of the Concept Laser QM System as an In-Situ Research and Evaluation Tool
NASA Technical Reports Server (NTRS)
Bagg, Stacey
2014-01-01
The NASA Marshall Space Flight Center (MSFC) is using a Concept Laser Fusing (Cusing) M2 powder bed additive manufacturing system for the build of space flight prototypes and hardware. NASA MSFC is collecting and analyzing data from the M2 QM Meltpool and QM Coating systems for builds. These data are intended to aid in understanding the powder-bed additive manufacturing process and in developing a thermal model for it. The QM systems are marketed by Concept Laser GmbH as in-situ quality management modules. The QM Meltpool system uses both a high-speed near-IR camera and a photodiode to monitor the melt pool generated by the laser. The software determines from the camera images the size of the melt pool. The camera also measures the integrated intensity of the IR radiation, and the photodiode gives an intensity value based on the brightness of the melt pool. The QM Coating system uses a high resolution optical camera to image the surface after each layer has been formed. The objective of this investigation was to determine the adequacy of the QM Meltpool system as a research instrument for in-situ measurement of melt pool size and temperature and its applicability to NASA's objectives in (1) developing a process thermal model and (2) quantifying feedback measurements with the intent of meeting quality requirements or specifications. Note that Concept Laser markets the system only as capable of giving an indication of changes between builds, not as an in-situ research and evaluation tool. A secondary objective of the investigation is to determine the adequacy of the QM Coating system as an in-situ layer-wise geometry and layer quality evaluation tool.
Software Formal Inspections Standard
NASA Technical Reports Server (NTRS)
1993-01-01
This Software Formal Inspections Standard (hereinafter referred to as Standard) is applicable to NASA software. This Standard defines the requirements that shall be fulfilled by the software formal inspections process whenever this process is specified for NASA software. The objective of this Standard is to define the requirements for a process that inspects software products to detect and eliminate defects as early as possible in the software life cycle. The process also provides for the collection and analysis of inspection data to improve the inspection process as well as the quality of the software.
Quality By Design: Concept To Applications.
Swain, Suryakanta; Padhy, Rabinarayan; Jena, Bikash Ranjan; Babu, Sitty Manohar
2018-03-08
Quality by Design (QbD) is a modern, systematic, scientific approach concerned with predefined objectives that focus not only on product and process understanding but also on process control. It emphasizes design and improvement of the product and of the manufacturing process in order to fulfill the predefined quality characteristics of the final product. It is essential to identify the desired and required product performance profile, such as the Target Product Profile, the Quality Target Product Profile (QTPP) and the Critical Quality Attributes (CQAs). This review highlights the concept of the QbD design space for the critical material attributes (CMAs) and the critical process parameters that can affect the CQAs, within which the process shall remain unaffected and consistently manufacture the required product. Risk assessment tools and design of experiments are its prime components. This paper outlines the basic knowledge of QbD; its key elements, steps and various tools for QbD implementation in the pharmaceutics field are presented briefly. In addition, several applications of QbD in numerous pharmaceutical unit operations are discussed and summarized. This article provides complete data as well as a road map for the universal implementation and application of QbD for pharmaceutical products.
NASA Astrophysics Data System (ADS)
Gavarieva, K. N.; Simonova, L. A.; Pankratov, D. L.; Gavariev, R. V.
2017-09-01
The article considers the main component of an expert system for the high-pressure die-casting process, which consists of algorithms united into logical models. The characteristics of the system, which present data on the condition of the controlled object, are described. A number of logically interconnected steps is developed that makes it possible to increase the quality of the resulting castings.
ERIC Educational Resources Information Center
Cer, Erkan
2018-01-01
Purpose: The purpose of the current study is to reveal general qualities of the objectives in the mother-tongue curricula of Hong Kong and Shanghai-China, South Korea, Singapore, and Turkey in terms of higher-order thinking processes specified by PISA tests. Research Methods: In this study, the researcher used a qualitative research design.…
Automated assembly of camera modules using active alignment with up to six degrees of freedom
NASA Astrophysics Data System (ADS)
Bräuniger, K.; Stickler, D.; Winters, D.; Volmer, C.; Jahn, M.; Krey, S.
2014-03-01
With the upcoming Ultra High Definition (UHD) cameras, the accurate alignment of optical systems with respect to the UHD image sensor becomes increasingly important. Even with a perfect objective lens, the image quality will deteriorate when the lens is poorly aligned to the sensor. For evaluating imaging quality, the Modulation Transfer Function (MTF) is the most widely accepted test. The first part describes how the alignment errors that lead to low imaging quality can be measured. Collimators with crosshairs at defined field positions, or a test chart, are used as object generators for infinite-finite or finite-finite conjugation, respectively. The process of accurately aligning the image sensor to the optical system is then described. The focus position, shift, tilt and rotation of the image sensor are automatically corrected to obtain an optimized MTF for all field positions, including the center. The software algorithm to grab images, calculate the MTF and adjust the image sensor in six degrees of freedom within less than 30 seconds per UHD camera module is described. The resulting accuracy of the image sensor rotation is better than 2 arcmin, and the accuracy of position alignment in x, y and z is better than 2 μm. Finally, the process of gluing and UV-curing is described, together with how it is managed in the integrated process.
Data Quality Objectives for Tank Farms Waste Compatibility Program
DOE Office of Scientific and Technical Information (OSTI.GOV)
BANNING, D.L.
1999-07-02
There are 177 waste storage tanks containing over 210,000 m{sup 3} (55 million gal) of mixed waste at the Hanford Site. The River Protection Project (RPP) has adopted the data quality objective (DQO) process used by the U.S. Environmental Protection Agency (EPA) (EPA 1994a) and implemented by RPP internal procedure (Banning 1999a) to identify the information and data needed to address safety issues. This DQO document is based on several documents that provide the technical basis for inputs and decision/action levels used to develop the decision rules that evaluate the transfer of wastes. A number of these documents are presently in the process of being revised. This document will need to be revised if there are changes to the technical criteria in these supporting documents. This DQO process supports various documents, such as sampling and analysis plans and double-shell tank (DST) waste analysis plans. This document identifies the type, quality, and quantity of data needed to determine whether transfer of supernatant can be performed safely. The requirements in this document are designed to prevent the mixing of incompatible waste as defined in Washington Administrative Code (WAC) 173-303-040. Waste transfers which meet the requirements contained in this document and the Double-Shell Tank Waste Analysis Plan (Mulkey 1998) are considered to be compatible, and prevent the mixing of incompatible waste.
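The decision-rule step of the DQO process can be illustrated with a minimal sketch (the analyte names and action levels below are invented for the example, not actual Hanford compatibility criteria):

```python
# Hypothetical DQO-style decision rule: a waste transfer proceeds only if
# every measured attribute stays within its decision/action level.
# ACTION_LEVELS is an invented example table, not real criteria.

ACTION_LEVELS = {"ammonia_M": 0.05, "hydroxide_M": 8.0, "tank_temp_C": 90.0}

def transfer_permitted(sample):
    """Apply each decision rule; any exceedance blocks the transfer.

    A missing analyte is treated as an exceedance (conservative default).
    """
    exceedances = [name for name, limit in ACTION_LEVELS.items()
                   if sample.get(name, float("inf")) > limit]
    return len(exceedances) == 0, exceedances

# A compliant sample and one exceeding the ammonia action level.
ok, why = transfer_permitted({"ammonia_M": 0.01, "hydroxide_M": 2.0,
                              "tank_temp_C": 45.0})
bad, why_bad = transfer_permitted({"ammonia_M": 0.10, "hydroxide_M": 2.0,
                                   "tank_temp_C": 45.0})
```

Treating a missing measurement as an exceedance mirrors the DQO emphasis on specifying the quantity of data needed before a decision can be made safely.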
Qualities of dental chart recording and coding.
Chantravekin, Yosananda; Tasananutree, Munchulika; Santaphongse, Supitcha; Aittiwarapoj, Anchisa
2013-01-01
Chart recording and coding are important processes in the healthcare informatics system, but there have been only a few reports in the dentistry field. The objectives of this study were to assess the quality of dental chart recording and coding, as well as the effectiveness of a lecture/workshop on this topic. The study was performed by auditing patients' charts at the TU Dental Student Clinic from July 2011 to August 2012. The chart recording mean scores ranged from 51.0-55.7%, whereas errors in the coding process occurred more in the coder part than in the doctor part. The lecture/workshop improved the scores in only some topics.
An Overview of the Quality Function Deployment (QFD) Technique
NASA Technical Reports Server (NTRS)
Sherif, Josef S.; Tran, Tuyet-Lan
1995-01-01
QFD is a product planning tool and a process methodology that enables all organizations, departments, and individuals in a business or a project to systematically focus on the critical performance, functions, and/or characteristics of a product that are the most important to the customer. It is part of the Total Quality Management (TQM) concept. This presentation describes the objectives of QFD, the process for implementing the technique, the benefits derived from proper implementation, the kinds of systems where QFD is best utilized, what the success factors are, how QFD works, some guidelines for selection of a QFD team, and the functional roles of key team members.
Mukharya, Amit; Patel, Paresh U; Shenoy, Dinesh; Chaudhary, Shivang
2013-01-01
Introduction: Lacidipine (LCDP) is a very low soluble and highly biovariable calcium channel blocker used in the treatment of hypertension. To increase its apparent solubility and to reduce its biovariability, solid dispersion fluid bed processing technology was explored, as it produces highly dispersible granules with a characteristic porous structure that enhances dispersibility, wettability, blend uniformity (by dissolving and spraying a solution of actives), flow ability and compressibility of granules for tableting and reducing variability by uniform drug-binder solution distribution on carrier molecules. Materials and Methods: Main object of this quality risk management (QRM) study is to provide a sophisticated “robust and rugged” Fluidized Bed Process (FBP) for the preparation of LCDP tablets with desired quality (stability) and performance (dissolution) by quality by design (QbD) concept. Results and Conclusion: This study is principally focusing on thorough mechanistic understanding of the FBP by which it is developed and scaled up with a knowledge of the critical risks involved in manufacturing process analyzed by risk assessment tools like: Qualitative Initial Risk-based Matrix Analysis (IRMA) and Quantitative Failure Mode Effective Analysis (FMEA) to identify and rank parameters with potential to have an impact on In Process/Finished Product Critical Quality Attributes (IP/FP CQAs). These Critical Process Parameters (CPPs) were further refined by DoE and MVDA to develop design space with Real Time Release Testing (RTRT) that leads to implementation of a control strategy to achieve consistent finished product quality at lab scale itself to prevent possible product failure at larger manufacturing scale. PMID:23799202
Testing of an advanced thermochemical conversion reactor system
NASA Astrophysics Data System (ADS)
1990-01-01
This report presents the results of work conducted by MTCI to verify and confirm experimentally the ability of the MTCI gasification process to effectively generate a high-quality, medium-Btu gas from a wider variety of feedstock and waste than that attainable in air-blown, direct gasification systems. The system's overall simplicity, due to the compact nature of the pulse combustor, and the high heat transfer rates attainable within the pulsating flow resonance tubes, provide a decided and near-term potential economic advantage for the MTCI indirect gasification system. The primary objective was the design, construction, and testing of a Process Design Verification System for an indirectly heated, thermochemical fluid-bed reactor and a pulse combustor as an integrated system that can process alternative renewable sources of energy such as biomass, black liquor, municipal solid waste and waste hydrocarbons (including heavy oils) into a useful product gas. The test objectives for the biomass portion of this program were to establish definitive performance data on biomass feedstocks covering a wide range of feedstock qualities and characteristics. The test objectives for the black liquor portion of this program were to verify the operation of the indirect gasifier on commercial black liquor containing 65 percent solids at several temperature levels and to characterize the bed carbon content, bed solids particle size and sulfur distribution as a function of gasification conditions.
The role of hospital managers in quality and patient safety: a systematic review
Parand, Anam; Dopson, Sue; Renz, Anna; Vincent, Charles
2014-01-01
Objectives To review the empirical literature to identify the activities, time spent and engagement of hospital managers in quality of care. Design A systematic review of the literature. Methods A search was carried out on the databases MEDLINE, PSYCHINFO, EMBASE and HMIC. The search strategy covered three facets: management, quality of care and the hospital setting, comprising medical subject headings and key terms. Reviewers screened 15,447 titles/abstracts, and 423 full texts were checked against inclusion criteria. Data extraction and quality assessment were performed on 19 included articles. Results The majority of studies were set in the USA and investigated Board/senior-level management. The most common research designs were interviews and surveys on the perceptions of managerial quality and safety practices. Managerial activities comprised strategy, culture and data-centred activities, such as driving an improvement culture and promotion of quality, strategy/goal setting and providing feedback. Significant positive associations with quality included compensation attached to quality, using quality improvement measures and having a Board quality committee. However, these conditions and actions were employed inconsistently and inadequately across the sampled hospitals. Conclusions There is some evidence that managers' time spent and work can influence quality and safety clinical outcomes, processes and performance. However, there is a dearth of empirical studies, further weakened by a lack of objective outcome measures and little examination of actual actions undertaken. We present a model to summarise the conditions and activities that affect quality performance. PMID:25192876
Karim, Abdool Z
2009-01-01
The regional processing centre at Sunnybrook Health Sciences Centre recently faced the substantial challenge of increasing cleaning capacity to meet the current workload and anticipated future demand without increasing its operating budget. The solution, upgrading its cleaning and decontamination system to a highly automated system, met both objectives. An analysis of the impact of the change found that the new system provided additional benefits, including improved productivity and cleaning quality; decreased costs; reduced water, electricity and chemical use; improved worker safety and morale; and decreased overtime. Investing in innovative technology improved key departmental outcomes while meeting institutional environmental and cost savings objectives.
A Multivariate Quality Loss Function Approach for Optimization of Spinning Processes
NASA Astrophysics Data System (ADS)
Chakraborty, Shankar; Mitra, Ankan
2018-05-01
Recent advancements in the textile industry have given rise to several spinning techniques, such as ring spinning, rotor spinning etc., which can be used to produce a wide variety of textile apparel so as to fulfil the end requirements of the customers. To achieve the best out of these processes, they should be utilized at their optimal parametric settings. However, in the presence of multiple yarn characteristics which are often conflicting in nature, it becomes a challenging task for spinning industry personnel to identify the best parametric mix that would simultaneously optimize all the responses. Hence, in this paper, the applicability of a new systematic approach in the form of a multivariate quality loss function technique is explored for optimizing multiple quality characteristics of yarns while identifying the ideal settings of two spinning processes. It is observed that this approach performs well against other multi-objective optimization techniques, such as the desirability function, distance function and mean squared error methods. With slight modifications in the upper and lower specification limits of the considered quality characteristics, and in the constraints of the non-linear optimization problem, it can be successfully applied to other processes in the textile industry to determine their optimal parametric settings.
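A multivariate quality loss aggregation of the kind described above can be sketched as follows (a Taguchi-style quadratic loss; the response names, targets and tolerances are illustrative, not the authors' data or implementation):

```python
# Sketch of a multivariate quality loss function: each yarn response is
# scaled to a normalized quadratic loss, and the weighted losses are summed
# into one scalar objective to minimize across candidate parameter settings.

def quality_loss(value, target, tolerance):
    """Quadratic loss, normalized so that loss == 1.0 at the tolerance limit."""
    return ((value - target) / tolerance) ** 2

def total_loss(responses, weights=None):
    """Combine per-response losses into a single scalar objective.

    responses: list of (measured, target, tolerance) tuples.
    weights:   optional relative importance of each response.
    """
    weights = weights or [1.0] * len(responses)
    return sum(w * quality_loss(v, t, tol)
               for w, (v, t, tol) in zip(weights, responses))

# Hypothetical responses: strength (cN/tex), hairiness index, evenness (CV%).
settings_a = total_loss([(18.0, 20.0, 4.0), (5.5, 5.0, 1.0), (13.0, 12.0, 2.0)])
settings_b = total_loss([(19.5, 20.0, 4.0), (5.2, 5.0, 1.0), (12.5, 12.0, 2.0)])
best = min(("A", settings_a), ("B", settings_b), key=lambda p: p[1])
```

In a real optimization, `total_loss` would be the objective handed to a non-linear solver over the process parameters rather than compared at two fixed settings.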
Healthy participants in phase I clinical trials: the quality of their decision to take part.
Rabin, Cheryl; Tabak, Nili
2006-08-01
This study set out to test the quality of the decision-making process of healthy volunteers in clinical trials. Researchers fear that the decision to volunteer for clinical trials is taken inadequately and that the signature on the consent forms, meant to affirm that consent was 'informed', is actually insubstantial. The study design was quasi-experimental, using a convenience quota sample. Over a period of a year, candidates were approached during their screening process for a proposed clinical trial, after concluding the required 'Informed Consent' procedure. In all, 100 participants in phase I trials filled out questionnaires, based ultimately on the Janis and Mann model of vigilant information processing, during their stay in the research centre. Only 35% of the participants reached a 'quality decision'. There is a definite correlation between information processing and quality decision-making. However, many of the healthy research volunteers (58%) do not seek out information or check alternatives before making a decision. Full disclosure is essential to a valid informed consent procedure but not sufficient; emphasis must be put on having the information understood and assimilated. Research nurses play a central role in achieving this objective.
Operator agency in process intervention: tampering versus application of tacit knowledge
NASA Astrophysics Data System (ADS)
Van Gestel, P.; Pons, D. J.; Pulakanam, V.
2015-09-01
Statistical process control (SPC) theory takes a negative view of adjustment of process settings, which is termed tampering. In contrast, quality and lean programmes actively encourage operators toward acts of intervention and personal agency in the improvement of production outcomes. This creates a conflict that requires operator judgement: How does one differentiate between unnecessary tampering and needful intervention? A further difficulty is that operators apply tacit knowledge in making such judgements. There is a need to determine where in a given production process the operators are applying tacit knowledge, and whether this is hindering or aiding quality outcomes. The work involved the conjoint application of systems engineering, statistics, and knowledge management principles, in the context of a case study. Systems engineering was used to create a functional model of a real plant. Actual plant data were analysed with the statistical methods of ANOVA, feature selection, and link analysis. This identified the variables to which the output quality was most sensitive. These key variables were mapped back to the functional model. Fieldwork was then directed to those areas to prospect for operator judgement activities. A natural conversational approach was used to determine where and how operators were applying judgement. This contrasts with the interrogative approach of conventional knowledge management. Data are presented for a case study of a meat rendering plant. The results identify specific areas where operators' tacit knowledge and mental model contribute to quality outcomes and untangle the motivations behind their agency. Also evident is how novice and expert operators apply their knowledge differently. Novices were focussed on meeting throughput objectives, and their incomplete understanding of the plant characteristics led them to inadvertently sacrifice quality in the pursuit of productivity in certain situations.
Operators' responses to the plant are affected by their individual mental models of the plant, which differ between operators and have variable validity. Their behaviour is also affected by differing interpretations of how their personal agency should be applied to the achievement of production objectives. The methodology developed here is an integration of systems engineering, statistical analysis, and knowledge management. It shows how to determine where in a given production process the operator intervention is occurring, how it affects quality outcomes, and what tacit knowledge operators are using. It thereby assists the continuous quality improvement processes in a different way to SPC. A second contribution is the provision of a novel methodology for knowledge management, one that circumvents the usual codification barriers to knowledge management.
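The ANOVA step used to identify the variables to which output quality is most sensitive can be sketched from first principles (the variable names and measurements below are hypothetical, not the rendering-plant data):

```python
# Illustrative sketch: rank process variables by how strongly the output
# quality differs across their setting groups, using a one-way ANOVA F
# statistic (between-group vs within-group variance).

def f_statistic(groups):
    """One-way ANOVA F statistic for a list of measurement groups."""
    k = len(groups)                              # number of setting groups
    n = sum(len(g) for g in groups)              # total observations
    grand = sum(sum(g) for g in groups) / n
    ss_between = sum(len(g) * (sum(g) / len(g) - grand) ** 2 for g in groups)
    ss_within = sum(sum((x - sum(g) / len(g)) ** 2 for x in g) for g in groups)
    return (ss_between / (k - 1)) / (ss_within / (n - k))

# Quality measurements grouped by the setting of two hypothetical variables.
temperature_groups = [[7.1, 7.3, 7.0], [8.9, 9.1, 9.0]]   # strong effect
feed_rate_groups   = [[7.9, 8.2, 8.0], [8.1, 7.8, 8.3]]   # weak effect

ranking = sorted(
    {"temperature": f_statistic(temperature_groups),
     "feed_rate": f_statistic(feed_rate_groups)}.items(),
    key=lambda kv: kv[1], reverse=True)
```

Variables at the top of such a ranking are the natural places to direct fieldwork, as the study did when mapping key variables back to the functional model.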
Moreno-Martínez, Francisco Javier; Montoro, Pedro R
2012-01-01
This work presents a new set of 360 high-quality colour images belonging to 23 semantic subcategories. Two hundred and thirty-six Spanish speakers named the items and also provided data for seven relevant psycholinguistic variables: age of acquisition, familiarity, manipulability, name agreement, typicality and visual complexity. Furthermore, we also present lexical frequency data derived from Internet search hits. Apart from the high number of variables evaluated, all known to affect the processing of stimuli, this new set presents important advantages over other similar image corpora: (a) this corpus offers a broad number of subcategories and images, which will permit researchers to select stimuli of appropriate difficulty as required (e.g., to deal with problems derived from ceiling effects); (b) the use of coloured stimuli provides a more realistic, ecologically valid representation of real-life objects. In sum, this set of stimuli provides a useful tool for research on visual object and word processing, both in neurological patients and in healthy controls.
Johnson, Earl E; Light, Keri C
2015-09-01
To evaluate sound quality preferences of participants wearing hearing aids with different strengths of nonlinear frequency compression (NFC) processing versus no NFC processing. Two analysis methods, one without and one with a qualifier as to the magnitude of preferences, were compared for their percent agreement to differentiate a small difference in perceived sound quality as a result of applied NFC processing. A single-blind design was used with participants unaware of the presence or strength of NFC processing (independent variable). The National Acoustic Laboratories-Nonlinear 2 (NAL-NL2) prescription of amplification was chosen because audibility is intentionally not prescribed in the presence of larger sensorineural hearing loss thresholds. A lack of prescribed audibility, when present, was deemed an objective qualifier for NFC. NFC is known to improve the input bandwidth available to listeners when high-frequency audibility is not otherwise available and increasing strengths of NFC were examined. Experimental condition 3 (EC3) was stronger than the manufacturer default (EC2). More aggressive strengths (e.g., EC4 and EC5), however, were expected to include excessive distortion and even reduce the output bandwidth that had been prescribed as audible by NAL-NL2 (EC1). A total of 14 male Veterans with severe high-frequency sensorineural hearing loss. Participant sound quality preference ratings (dependent variable) without a qualifier as to the magnitude of preference were analyzed based on binomial probability theory, as is traditional with paired comparison data. The ratings with a qualifier as to the magnitude of preference were analyzed based on the nonparametric statistic of the Wilcoxon signed rank test. The binomial probability analysis method identified a sound quality preference as well as the nonparametric probability test method. As the strength of NFC increased, more participants preferred the EC with less NFC. 
Fourteen of 14 participants showed equal preference between EC1 and EC2 perhaps, in part, because EC2 showed no objective improvement in audibility for six of the 14 participants (42%). Thirteen of the 14 participants showed no preference between NAL-NL2 and EC3, but all participants had an objective improvement in audibility. With more NFC than EC3, more and more participants preferred the other EC with less NFC in the paired comparison. By referencing the recommended sensation levels of amplitude compression (e.g., NAL-NL2) in the ear canal of hearing aid wearers, the targeting of NFC parameters can likely be optimized with respect to improvements in effective audibility that may contribute to speech recognition without adversely impacting sound quality. After targeting of NFC parameters, providers can facilitate decisions about the use of NFC parameters (strengths of processing) via sound quality preference judgments using paired comparisons. American Academy of Audiology.
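The binomial probability analysis of the paired-comparison preference ratings can be sketched as an exact two-sided sign test under the null hypothesis of no preference (the participant counts below are illustrative, not the study's data):

```python
# Sketch of the binomial (sign-test) analysis of paired-comparison data:
# under the null hypothesis of no preference, each of n listeners picks
# either condition with probability 0.5.

from math import comb

def binomial_two_sided_p(preferred_a, n):
    """Exact two-sided binomial test p-value for k-of-n preferences at p=0.5."""
    k = max(preferred_a, n - preferred_a)        # count in the larger tail
    one_tail = sum(comb(n, i) for i in range(k, n + 1)) / 2 ** n
    return min(1.0, 2 * one_tail)

# Hypothetical outcomes for a panel of 14 participants:
p_strong = binomial_two_sided_p(12, 14)   # 12 of 14 prefer weaker NFC
p_even = binomial_two_sided_p(7, 14)      # evenly split, no preference
```

The qualified-magnitude ratings in the study call for a rank-based test such as the Wilcoxon signed-rank instead, since the sign test above discards the size of each preference.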
The Impact of the Condenser on Cytogenetic Image Quality in Digital Microscope System
Ren, Liqiang; Li, Zheng; Li, Yuhua; Zheng, Bin; Li, Shibo; Chen, Xiaodong; Liu, Hong
2013-01-01
Background: Optimizing operational parameters of the digital microscope system is an important technique to acquire high quality cytogenetic images and facilitate the process of karyotyping so that the efficiency and accuracy of diagnosis can be improved. Objective: This study investigated the impact of the condenser on cytogenetic image quality and system working performance using a prototype digital microscope image scanning system. Methods: Both theoretical analysis and experimental validations, through objectively evaluating a resolution test chart and subjectively observing large numbers of specimens, were conducted. Results: The results show that optimal image quality and a large depth of field (DOF) are simultaneously obtained when the numerical aperture of the condenser is set to 60%–70% of that of the corresponding objective. Under this condition, more analyzable chromosomes and diagnostic information are obtained. As a result, the system shows higher working stability and fewer restrictions for the implementation of algorithms such as autofocusing, especially when the system is designed to achieve high throughput continuous image scanning. Conclusions: Although the above quantitative results were obtained using a specific prototype system under the experimental conditions reported in this paper, the presented evaluation methodologies can provide valuable guidelines for optimizing operational parameters in cytogenetic imaging using high throughput continuous scanning microscopes in clinical practice. PMID:23676284
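The 60%–70% condenser rule and the depth-of-field gain it trades against resolution can be illustrated numerically (the objective NA, wavelength and the simplified wave-optical DOF formula below are assumptions for the example, not values or formulas from the study):

```python
# Illustrative calculation: recommended condenser NA window per the 60-70%
# rule, plus a common textbook approximation of the wave-optical depth of
# field, DOF ~ lambda * n / NA**2 (smaller NA -> larger DOF).

def condenser_na_range(objective_na, low=0.6, high=0.7):
    """Recommended condenser NA window per the 60-70% rule."""
    return objective_na * low, objective_na * high

def wave_optical_dof_um(wavelength_um, refractive_index, na):
    """Wave-optical depth-of-field term, in micrometres."""
    return wavelength_um * refractive_index / na ** 2

obj_na = 0.65                        # hypothetical 40x dry objective
lo, hi = condenser_na_range(obj_na)  # condenser NA window: 0.39 to 0.455
dof_full = wave_optical_dof_um(0.55, 1.0, obj_na)   # green light, in air
```

Partially closing the condenser lowers the effective system NA, which the approximation above shows enlarges the DOF, consistent with the paper's observation that the 60%–70% setting eases autofocusing during continuous scanning.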
Implementation of the qualities of radiodiagnostic: mammography
NASA Astrophysics Data System (ADS)
Pacífico, L. C.; Magalhães, L. A. G.; Peixoto, J. G. P.; Fernandes, E.
2018-03-01
The objective of the present study was to evaluate the expanded uncertainty of the mammographic calibration process and to present the results of the internal audit performed at the Laboratory of Radiological Sciences (LCR). The mammographic beam qualities used as references at the LCR comprise two irradiation conditions: a non-attenuated beam and an attenuated beam. Both had satisfactory results, with an expanded uncertainty of 2.1%. An internal audit was performed, and the degree of compliance with ISO/IEC 17025 was evaluated. The result of the internal audit was satisfactory. We conclude that the LCR can perform calibrations in mammography qualities for end users.
Gratacós, Jordi; Luelmo, Jesús; Rodríguez, Jesús; Notario, Jaume; Marco, Teresa Navío; de la Cueva, Pablo; Busquets, Manel Pujol; Font, Mercè García; Joven, Beatriz; Rivera, Raquel; Vega, Jose Luis Alvarez; Álvarez, Antonio Javier Chaves; Parera, Ricardo Sánchez; Carrascosa, Jose Carlos Ruiz; Martínez, Fernando José Rodríguez; Sánchez, José Pardo; Olmos, Carlos Feced; Pujol, Conrad; Galindez, Eva; Barrio, Silvia Pérez; Arana, Ana Urruticoechea; Hergueta, Mercedes; Coto, Pablo; Queiro, Rubén
2018-06-01
To define and prioritize standards of care and quality indicators of multidisciplinary care for patients with psoriatic arthritis (PsA), a systematic literature review on PsA standards of care and quality indicators was performed and an expert panel of rheumatologists and dermatologists who provide multidisciplinary care was established. In a consensus meeting, the expert group discussed and developed the standards of care and quality indicators and graded their priority and agreement, as well as the feasibility of the quality indicators, following qualitative methodology and a Delphi process. Afterwards, these results were discussed with two focus groups, one with patients and another with health managers. A descriptive analysis is presented. We obtained 25 standards of care (9 of structure, 9 of process, 7 of results) and 24 quality indicators (2 of structure, 5 of process, 17 of results). Standards of care include relevant aspects of the multidisciplinary care of PsA patients such as appropriate physical infrastructure and technical equipment; access to nursing care, laboratory tests and imaging techniques, other health professionals, and treatments; and the development of care plans. Regarding quality indicators, high priority was given to defining the objectives of the multidisciplinary care model and referral criteria, establishing responsibilities and coordination among professionals, and actively evaluating patients and collecting data. Patients considered all of them important. This set of standards of care and quality indicators for the multidisciplinary care of patients with PsA should help improve quality of care in these patients.
Diabetes and Hypertension Quality Measurement in Four Safety-Net Sites
Benkert, R.; Dennehy, P.; White, J.; Hamilton, A.; Tanner, C.
2014-01-01
Summary Background In this new era after the Health Information Technology for Economic and Clinical Health (HITECH) Act of 2009, the literature on lessons learned with electronic health record (EHR) implementation needs to be revisited. Objectives Our objective was to describe what implementation of a commercially available EHR with built-in quality query algorithms showed us about our care for diabetes and hypertension populations in four safety net clinics, specifically feasibility of data retrieval, measurements over time, quality of data, and how our teams used this data. Methods A cross-sectional study was conducted from October 2008 to October 2012 in four safety-net clinics located in the Midwest and Western United States. A data warehouse that stores data from across the U.S. was utilized for data extraction from patients with diabetes or hypertension diagnoses and at least two office visits per year. Standard quality measures were collected over a period of two to four years. All sites were engaged in a partnership model with the IT staff and a shared learning process to enhance the use of the quality metrics. Results While use of the algorithms was feasible across sites, challenges occurred when attempting to use the query results for research purposes. There was wide variation of both process and outcome results by individual centers. Composite calculations balanced out the differences seen in the individual measures. Despite using consistent quality definitions, the differences across centers had an impact on numerators and denominators. All sites agreed to a partnership model of EHR implementation, and each center utilized the available resources of the partnership for Center-specific quality initiatives. 
Conclusions Utilizing a shared EHR, a Regional Extension Center-like partnership model, and similar quality query algorithms allowed safety-net clinics to benchmark and improve the quality of care across differing patient populations and health care delivery models. PMID:25298815
Kato-Lin, Yi-Chin; Krishnamurti, Lakshmanan; Padman, Rema; Seltman, Howard J
2014-11-01
There is limited application and evaluation of health information systems in the management of vaso-occlusive pain crises in sickle cell disease (SCD) patients. This study evaluates the impact of digitization of paper-based individualized pain plans on process efficiency and care quality by examining both objective patient data and subjective clinician insights. Retrospective, before and after, mixed methods evaluation of digitization of paper documents in Children's Hospital of Pittsburgh of UPMC. Subjective perceptions are analyzed using surveys completed by 115 clinicians in emergency department (ED) and inpatient units (IP). Objective effects are evaluated using mixed models with data on 1089 ED visits collected via electronic chart review 28 months before and 22 months after the digitization. Surveys indicate that all clinicians perceived the digitization to improve the efficiency and quality of pain management. Physicians overwhelmingly preferred using the digitized plans, but only 44% of the nurses had the same response. Analysis of patient records indicates that adjusted time from analgesic order to administration was significantly reduced from 35.50 to 26.77 min (p<.05). However, time to first dose and some of the objective quality measures (time from administration to relief, relief rate, admission rate, and ED re-visit rate) were not significantly affected. The relatively simple intervention, high baseline performance, and limited accommodation of nurses' perspectives may account for the marginal improvements in process efficiency and quality outcomes. Additional efforts, particularly improved communication between physicians and nurses, are needed to further enhance quality of pain management. 
This study highlights the important role of health information technology (HIT) on vaso-occlusive pain management for pediatric patients with sickle cell disease and the critical challenges in accommodating human factor considerations in implementing and evaluating HIT effects.
Introductory Curriculum Materials, Project SCATE.
ERIC Educational Resources Information Center
Iowa State Dept. of Public Instruction, Des Moines. Div. of Curriculum.
The objective of Project SCATE (Students Concerned About Tomorrow's Environment) is for students to investigate environmental problems and the political processes involved in their solution. The four identified areas of concern are: (1) land use policy development; (2) air and water quality; (3) energy allocation and consumption; and (4) economic…
Long-term care information systems: an overview of the selection process.
Nahm, Eun-Shim; Mills, Mary Etta; Feege, Barbara
2006-06-01
Under the current Medicare Prospective Payment System and the ever-changing managed care environment, the long-term care information system is vital to providing quality care and to surviving in business. The system selection process should be an interdisciplinary effort involving all necessary stakeholders in the proposed system. It can be modeled on the Systems Development Life Cycle: identifying problems, opportunities, and objectives; determining information requirements; analyzing system needs; designing the recommended system; and developing and documenting software.
Developing evidence-based physical therapy clinical practice guidelines.
Kaplan, Sandra L; Coulter, Colleen; Fetters, Linda
2013-01-01
Recommended strategies for developing evidence-based clinical practice guidelines (CPGs) are provided. The intent is that future CPGs developed with the support of the Section on Pediatrics of the American Physical Therapy Association would consistently follow similar developmental processes to yield consistent quality and presentation. Steps in the process of developing CPGs are outlined and resources are provided to assist CPG developers in carrying out their task. These recommended processes may also be useful to CPG developers representing organizations with similar structures, objectives, and resources.
Quality and Certification of Electronic Health Records
Hoerbst, A.; Ammenwerth, E.
2010-01-01
Background Numerous projects, initiatives, and programs are dedicated to the development of Electronic Health Records (EHR) worldwide. Increasingly more of these plans have recently been brought from a scientific environment to real life applications. In this context, quality is a crucial factor with regard to the acceptance and utility of Electronic Health Records. However, the dissemination of the existing quality approaches is often rather limited. Objectives The present paper aims at the description and comparison of the current major quality certification approaches to EHRs. Methods A literature analysis was carried out in order to identify the relevant publications with regard to EHR quality certification. PubMed, ACM Digital Library, IEEE Xplore, CiteSeer, and Google (Scholar) were used to collect relevant sources. The documents that were obtained were analyzed using techniques of qualitative content analysis. Results The analysis discusses and compares the quality approaches of CCHIT, EuroRec, IHE, openEHR, and EN13606. These approaches differ with regard to their focus, support of service-oriented EHRs, process of (re-)certification and testing, number of systems certified and tested, supporting organizations, and regional relevance. Discussion The analyzed approaches show differences with regard to their structure and processes. System vendors can exploit these approaches in order to improve and certify their information systems. Health care organizations can use these approaches to support selection processes or to assess the quality of their own information systems. PMID:23616834
Deep learning methods to guide CT image reconstruction and reduce metal artifacts
NASA Astrophysics Data System (ADS)
Gjesteby, Lars; Yang, Qingsong; Xi, Yan; Zhou, Ye; Zhang, Junping; Wang, Ge
2017-03-01
The rapidly-rising field of machine learning, including deep learning, has inspired applications across many disciplines. In medical imaging, deep learning has been primarily used for image processing and analysis. In this paper, we integrate a convolutional neural network (CNN) into the computed tomography (CT) image reconstruction process. Our first task is to monitor the quality of CT images during iterative reconstruction and decide when to stop the process according to an intelligent numerical observer instead of using a traditional stopping rule, such as a fixed error threshold or a maximum number of iterations. After training on ground truth images, the CNN was successful in guiding an iterative reconstruction process to yield high-quality images. Our second task is to improve a sinogram to correct for artifacts caused by metal objects. A large number of interpolation and normalization-based schemes were introduced for metal artifact reduction (MAR) over the past four decades. The NMAR algorithm is considered a state-of-the-art method, although residual errors often remain in the reconstructed images, especially in cases of multiple metal objects. Here we merge NMAR with deep learning in the projection domain to achieve additional correction in critical image regions. Our results indicate that deep learning can be a viable tool to address CT reconstruction challenges.
NASA Astrophysics Data System (ADS)
Ţîţu, M. A.; Pop, A. B.; Ţîţu, Ș
2017-06-01
This paper presents a study on the modelling and optimization of certain variables using the Taguchi Method, with a view to modelling and optimizing the process of pressing tappets into anchors, a process conducted in an organization that promotes knowledge-based management. The paper promotes practical concepts of the Taguchi Method and describes the way in which the objective functions are obtained and used during the modelling and optimization of this pressing process.
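The abstract does not reproduce the objective functions themselves; as a hedged sketch, the standard Taguchi signal-to-noise (S/N) ratios that a press-fit study of this kind would typically maximize can be written as:

```python
import numpy as np

def sn_smaller_is_better(y):
    # For responses to minimize (e.g. press-force deviation): -10*log10(mean(y^2))
    y = np.asarray(y, dtype=float)
    return -10.0 * np.log10(np.mean(y ** 2))

def sn_larger_is_better(y):
    # For responses to maximize (e.g. pull-out strength): -10*log10(mean(1/y^2))
    y = np.asarray(y, dtype=float)
    return -10.0 * np.log10(np.mean(1.0 / y ** 2))

def sn_nominal_is_best(y):
    # For responses targeting a nominal value: 10*log10(mean^2 / variance)
    y = np.asarray(y, dtype=float)
    return 10.0 * np.log10(y.mean() ** 2 / y.var(ddof=1))

# Hypothetical repeated measurements of pressing force for one factor-level
# combination; the setting with the highest S/N ratio is preferred.
print(sn_nominal_is_best([9.0, 10.0, 11.0]))
```

In a Taguchi analysis these S/N values are computed per run of an orthogonal array and averaged per factor level to select the robust setting; which of the three ratios applies depends on the quality characteristic chosen by the authors.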
Optimization of Collision Detection in Surgical Simulations
NASA Astrophysics Data System (ADS)
Custură-Crăciun, Dan; Cochior, Daniel; Neagu, Corneliu
2014-11-01
Just as flight and spaceship simulators already represent a standard, we expect that soon enough surgical simulators will become a standard in medical applications. A simulation's quality is strongly related to image quality as well as to the degree of realism of the simulation. Increased quality requires increased resolution, increased representation speed and, more importantly, a larger volume of mathematical computation. To make this possible, we need not only more efficient computers but especially more optimization of the calculation processes. A simulator executes one of its most complex sets of calculations each time it detects a contact between virtual objects; optimization of collision detection is therefore vital to the working speed of a simulator and hence to its quality.
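A common way to cut the cost of contact detection, sketched here under the assumption of a standard broad-phase/narrow-phase split (the abstract does not detail the paper's specific optimizations), is to test cheap axis-aligned bounding boxes first and run expensive per-triangle tests only on pairs whose boxes overlap:

```python
from dataclasses import dataclass
from typing import Tuple

@dataclass
class AABB:
    lo: Tuple[float, float, float]  # minimum corner
    hi: Tuple[float, float, float]  # maximum corner

def overlap(a: AABB, b: AABB) -> bool:
    # Boxes intersect iff their intervals overlap on all three axes.
    return all(a.lo[i] <= b.hi[i] and b.lo[i] <= a.hi[i] for i in range(3))

def broad_phase(boxes):
    # O(n^2) pairwise pruning shown for clarity; production engines use
    # sweep-and-prune or bounding-volume hierarchies to go sub-quadratic.
    n = len(boxes)
    return [(i, j) for i in range(n) for j in range(i + 1, n)
            if overlap(boxes[i], boxes[j])]
```

Only the pairs surviving `broad_phase` would be handed to the exact (and far more expensive) mesh-level intersection test, which is where most of a simulator's collision budget is spent.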
Process mapping as a framework for performance improvement in emergency general surgery.
DeGirolamo, Kristin; D'Souza, Karan; Hall, William; Joos, Emilie; Garraway, Naisan; Sing, Chad Kim; McLaughlin, Patrick; Hameed, Morad
2017-12-01
Emergency general surgery conditions are often thought of as being too acute for the development of standardized approaches to quality improvement. However, process mapping, a concept that has been applied extensively in manufacturing quality improvement, is now being used in health care. The objective of this study was to create process maps for small bowel obstruction in an effort to identify potential areas for quality improvement. We used the American College of Surgeons Emergency General Surgery Quality Improvement Program pilot database to identify patients who received nonoperative or operative management of small bowel obstruction between March 2015 and March 2016. This database, patient charts and electronic health records were used to create process maps from the time of presentation to discharge. Eighty-eight patients with small bowel obstruction (33 operative; 55 nonoperative) were identified. Patients who received surgery had a complication rate of 32%. The processes of care from the time of presentation to the time of follow-up were highly elaborate and variable in terms of duration; however, the sequences of care were found to be consistent. We used data visualization strategies to identify bottlenecks in care, and they showed substantial variability in terms of operating room access. Variability in the operative care of small bowel obstruction is high and represents an important improvement opportunity in general surgery. Process mapping can identify common themes, even in acute care, and suggest specific performance improvement measures.
Shek, Daniel T L; Tam, Suet-yan
2009-01-01
To understand the implementation quality of the Tier 1 Program (Secondary 2 Curriculum) of the P.A.T.H.S. Project, process evaluation was carried out by co-walkers through classroom observation of 195 units in 131 schools. Results showed that the overall level of program adherence was generally high with an average of 84.55%, and different factors of the implementation process were evaluated as positive. Quality of program implementation and achievement of program objectives were predicted by students' participation and involvement, strategies to enhance students' motivation, opportunity for reflection, time management, and class preparation. Success in program implementation was predicted by students' participation and involvement, classroom control, interactive delivery method, strategies to enhance students' motivation, opportunity for reflection, and lesson preparation.
Influence of water quality on the embodied energy of drinking water treatment.
Santana, Mark V E; Zhang, Qiong; Mihelcic, James R
2014-01-01
Urban water treatment plants rely on energy intensive processes to provide safe, reliable water to users. Changes in influent water quality may alter the operation of a water treatment plant and its associated energy use or embodied energy. Therefore the objective of this study is to estimate the effect of influent water quality on the operational embodied energy of drinking water, using the city of Tampa, Florida as a case study. Water quality and water treatment data were obtained from the David L Tippin Water Treatment Facility (Tippin WTF). Life cycle energy analysis (LCEA) was conducted to calculate treatment chemical embodied energy values. Statistical methods including Pearson's correlation, linear regression, and relative importance were used to determine the influence of water quality on treatment plant operation and subsequently, embodied energy. Results showed that influent water quality was responsible for about 14.5% of the total operational embodied energy, mainly due to changes in treatment chemical dosages. The method used in this study can be applied to other urban drinking water contexts to determine if drinking water source quality control or modification of treatment processes will significantly minimize drinking water treatment embodied energy.
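The statistical pipeline described (Pearson's correlation followed by linear regression, with chemical dose linked to embodied energy through an LCA factor) can be sketched on synthetic data; all numbers and the energy factor below are hypothetical illustrations, not values from the Tippin WTF study:

```python
import numpy as np

# Hypothetical illustration (not the Tippin WTF data): relate an influent
# quality parameter (turbidity) to coagulant dose, then convert dose to
# embodied energy via an assumed LCA factor.
rng = np.random.default_rng(0)
turbidity = rng.uniform(2.0, 20.0, 50)             # NTU, synthetic
dose = 1.5 * turbidity + rng.normal(0.0, 2.0, 50)  # mg/L, synthetic

r = np.corrcoef(turbidity, dose)[0, 1]             # Pearson's correlation
slope, intercept = np.polyfit(turbidity, dose, 1)  # linear regression

MJ_PER_MG_L = 0.02  # assumed embodied energy of the chemical, MJ per (mg/L)

def embodied_energy(turb):
    """Predicted operational embodied-energy contribution per m^3 treated."""
    return (slope * turb + intercept) * MJ_PER_MG_L

print(f"r={r:.2f}, slope={slope:.2f}, E(10 NTU)={embodied_energy(10.0):.2f} MJ")
```

The study's relative-importance step would then apportion the regression's explained variance among several influent parameters; the single-parameter version above just shows how a quality-to-energy link is fitted.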
Patel, Hetal; Patel, Kishan; Tiwari, Sanjay; Pandey, Sonia; Shah, Shailesh; Gohel, Mukesh
2016-01-01
Microcrystalline cellulose (MCC) is an excellent excipient for the production of pellets by extrusion-spheronization. However, it causes a slow release rate of poorly water-soluble drugs from pellets. Co-processed excipients prepared by spray drying (US4744987; US5686107; WO2003051338) and by a coprecipitation technique (WO9517831) are patented. The objective of the present study was to develop co-processed MCC pellets (MOMLETS) by an extrusion-spheronization technique using the principles of Quality by Design (QbD). Co-processed excipient core pellets (MOMLETS) were developed by extrusion-spheronization using a QbD approach, and a BCS class II drug (telmisartan) was layered onto them in a fluidized bed processor. The Quality Target Product Profile (QTPP) and Critical Quality Attributes (CQAs) for the pellets were identified, and risk assessment was reported using an Ishikawa diagram. A Plackett-Burman design was used to screen the effect of seven independent variables (superdisintegrant, extruder speed, ethanol:water ratio, spheronizer speed, extruder screen, pore former, and MCC:lactose ratio) on percentage drug release at 30 min. A Pareto chart and a normal probability plot were constructed to identify the significant factors. A Box-Behnken design (BBD) using the three most significant factors (extruder screen size, type of superdisintegrant, and type of pore former) was used as the optimization design. The control space in which the desired quality of the pellets can be obtained was identified. Co-processed excipient core pellets (MOMLETS) were successfully developed by the QbD approach. Versatility, industrial scalability, and simplicity are the main features of the proposed research.
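A Plackett-Burman screen of seven two-level factors fits in eight runs; a sketch of the standard Hadamard-matrix construction (an assumption — the authors' exact design matrix and factor levels are not given in the abstract) looks like this:

```python
import numpy as np
from scipy.linalg import hadamard

# Standard 8-run Plackett-Burman construction for 7 two-level factors.
H = hadamard(8)      # 8x8 orthogonal matrix of +/-1
design = H[:, 1:]    # drop the all-ones column -> 8 runs x 7 factor columns

# Factor names follow the abstract; their +/-1 level assignments are assumed.
factors = ["superdisintegrant", "extruder_speed", "ethanol_water",
           "spheronizer_speed", "extruder_screen", "pore_former",
           "mcc_lactose"]

def main_effects(design, response):
    """Effect of each factor = mean(response at +1) - mean(response at -1)."""
    response = np.asarray(response, dtype=float)
    return {name: response[col == 1].mean() - response[col == -1].mean()
            for name, col in zip(factors, design.T)}
```

Because the columns are mutually orthogonal, each main effect is estimated independently; factors with the largest absolute effects are the ones a Pareto chart or normal probability plot would flag as significant.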
NASA Astrophysics Data System (ADS)
Saavedra, Juan Alejandro
Quality Control (QC) and Quality Assurance (QA) strategies vary significantly across industries in the manufacturing sector depending on the product being built. Such strategies range from simple statistical analysis and process controls to the decision-making process of reworking, repairing, or scrapping defective product. This study proposes an optimal QC methodology for including rework stations in the manufacturing process by identifying the number and location of these workstations. The factors considered in optimizing these stations are cost, cycle time, reworkability, and rework benefit. The goal is to minimize the cost and cycle time of the process while increasing reworkability and rework benefit. The specific objectives of this study are: (1) to propose a cost estimation model that includes energy consumption, and (2) to propose an optimal QC methodology that identifies the quantity and location of rework workstations. The cost estimation model includes energy consumption as part of the product direct cost and allows the user to recalculate product direct cost as the quality sigma level of the process changes. This provides a benefit because a complete cost estimation does not need to be performed every time the process yield changes. This cost estimation model is then used in the QC strategy optimization process. In order to propose a methodology that provides an optimal QC strategy, the possible factors that affect QC were evaluated. A screening Design of Experiments (DOE) was performed on seven initial factors and identified three significant factors; it also showed that one response variable was not required for the optimization process. A full factorial DOE was then performed to verify the significant factors obtained previously. The QC strategy optimization is performed through a Genetic Algorithm (GA), which allows the evaluation of many candidate solutions in order to obtain feasible optimal solutions. 
The GA evaluates possible solutions based on cost, cycle time, reworkability, and rework benefit. Finally, it provides several possible solutions because this is a multi-objective optimization problem. The solutions are presented as chromosomes that clearly state the number and location of the rework stations. The user analyzes these solutions and selects one by deciding which of the four factors is most important for the product being manufactured or the company's objectives. The major contribution of this study is to provide the user with a methodology for identifying an effective and optimal QC strategy that incorporates the number and location of rework substations in order to minimize direct product cost and cycle time and maximize reworkability and rework benefit.
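A minimal version of such a genetic algorithm can be sketched as follows; the five candidate stations, their cost/time/benefit figures, and the weighted-sum scalarization are invented for illustration and are not the study's actual model (the study treats this as a true multi-objective problem rather than a weighted sum):

```python
import random

random.seed(42)
N = 5  # candidate workstations; bit i = 1 places a rework substation after station i
cost    = [3.0, 2.0, 4.0, 1.5, 2.5]  # added direct cost per substation (synthetic)
time_   = [1.0, 0.8, 1.2, 0.5, 0.9]  # added cycle time (synthetic)
benefit = [4.0, 1.0, 5.0, 1.0, 3.5]  # rework benefit / recovered value (synthetic)

def fitness(ch, w=(1.0, 1.0, 1.0)):
    # Maximize benefit while penalizing cost and cycle time (weighted sum).
    return sum(b * g * w[2] - c * g * w[0] - t * g * w[1]
               for g, c, t, b in zip(ch, cost, time_, benefit))

def evolve(pop_size=20, gens=40, mut=0.1):
    pop = [[random.randint(0, 1) for _ in range(N)] for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=fitness, reverse=True)
        parents = pop[:pop_size // 2]            # elitist truncation selection
        children = []
        while len(children) < pop_size - len(parents):
            a, b = random.sample(parents, 2)
            cut = random.randrange(1, N)
            child = a[:cut] + b[cut:]            # one-point crossover
            child = [1 - g if random.random() < mut else g for g in child]
            children.append(child)
        pop = parents + children
    return max(pop, key=fitness)

best = evolve()
print(best, fitness(best))
```

In the study's setting the chromosome read-out is exactly this: which workstations get a rework substation. A genuine multi-objective variant would return a Pareto front (e.g. via NSGA-II) instead of collapsing the four factors into one weighted score.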
Causes of cine image quality deterioration in cardiac catheterization laboratories.
Levin, D C; Dunham, L R; Stueve, R
1983-10-01
Deterioration of cineangiographic image quality can result from malfunctions or technical errors at a number of points along the cine imaging chain: generator and automatic brightness control, x-ray tube, x-ray beam geometry, image intensifier, optics, cine camera, cine film, film processing, and cine projector. Such malfunctions or errors can result in loss of image contrast, loss of spatial resolution, improper control of film optical density (brightness), or some combination thereof. While the electronic and photographic technology involved is complex, physicians who perform cardiac catheterization should be conversant with the problems and what can be done to solve them. Catheterization laboratory personnel have control over a number of factors that directly affect image quality, including radiation dose rate per cine frame, kilovoltage or pulse width (depending on type of automatic brightness control), cine run time, selection of small or large focal spot, proper object-intensifier distance and beam collimation, aperture of the cine camera lens, selection of cine film, processing temperature, processing immersion time, and selection of developer.
Raab, Stephen S; Andrew-Jaja, Carey; Condel, Jennifer L; Dabbs, David J
2006-01-01
The objective of the study was to determine whether the Toyota production system process improves Papanicolaou test quality and patient safety. An 8-month nonconcurrent cohort study that included 464 case and 639 control women who had a Papanicolaou test was performed. Office workflow was redesigned using Toyota production system methods by introducing a 1-by-1 continuous flow process. We measured the frequency of Papanicolaou tests without a transformation zone component, follow-up and Bethesda System diagnostic frequency of atypical squamous cells of undetermined significance, and diagnostic error frequency. After the intervention, the percentage of Papanicolaou tests lacking a transformation zone component decreased from 9.9% to 4.7% (P = .001). The percentage of Papanicolaou tests with a diagnosis of atypical squamous cells of undetermined significance decreased from 7.8% to 3.9% (P = .007). The frequency of error per correlating cytologic-histologic specimen pair decreased from 9.52% to 7.84%. The introduction of the Toyota production system process resulted in improved Papanicolaou test quality.
Quality assurance in surgical practice through auditing.
Wong, W T
1980-05-01
An efficient auditing method is presented which involves objective criteria-based numerical screening of medical process and treatment outcome by paramedical staff and detailed analysis of deviated cases by surgeons. If properly performed it requires the study of no more than 50 cases in a diagnostic category to provide sufficient information about the quality of care. Encouraging points as well as problems are communicated to the surgeons to induce the maintenance or improvement of the standard of care. Graphic documentation of case performance is possible, allowing surgeons to compare results with their colleagues. The general performance level of several consecutive studies can be compared at a glance. In addition, logical education programs to improve the medical process can be designed on the basis of the problems identified. As all the cases with an unacceptable outcome are traceable to inadequate medical process, improvement in this area will decrease outcome defects. With the use of auditing and the follow-up technique described, the quality of care in surgery may be assured.
Stockdale, Susan E; Zuchowski, Jessica; Rubenstein, Lisa V; Sapir, Negar; Yano, Elizabeth M; Altman, Lisa; Fickel, Jacqueline J; McDougall, Skye; Dresselhaus, Timothy; Hamilton, Alison B
Although the patient-centered medical home endorses quality improvement principles, methods for supporting ongoing, systematic primary care quality improvement have not been evaluated. We introduced primary care quality councils at six Veterans Health Administration sites as an organizational intervention with three key design elements: (a) fostering interdisciplinary quality improvement leadership, (b) establishing a structured quality improvement process, and (c) facilitating organizationally aligned frontline quality improvement innovation. Our evaluation objectives were to (a) assess design element implementation, (b) describe implementation barriers and facilitators, and (c) assess successful quality improvement project completion and spread. We analyzed administrative records and conducted interviews with 85 organizational leaders. We developed and applied criteria for assessing design element implementation using hybrid deductive/inductive analytic techniques. All quality councils implemented interdisciplinary leadership and a structured quality improvement process, and all but one completed at least one quality improvement project and a toolkit for spreading improvements. Quality councils were perceived as most effective when service line leaders had well-functioning interdisciplinary communication. Matching positions within leadership hierarchies with appropriate supportive roles facilitated frontline quality improvement efforts. Two key resources were (a) a dedicated internal facilitator with project management, data collection, and presentation skills and (b) support for preparing customized data reports for identifying and addressing practice level quality issues. Overall, quality councils successfully cultivated interdisciplinary, multilevel primary care quality improvement leadership with accountability mechanisms and generated frontline innovations suitable for spread. 
Practice level performance data and quality improvement project management support were critical. In order to successfully facilitate systematic, sustainable primary care quality improvement, regional and executive health care system leaders should engage interdisciplinary practice level leadership in a priority-setting process that encourages frontline innovation and establish local structures such as quality councils to coordinate quality improvement initiatives, ensure accountability, and promote spread of best practices.
A deblocking algorithm based on color psychology for display quality enhancement
NASA Astrophysics Data System (ADS)
Yeh, Chia-Hung; Tseng, Wen-Yu; Huang, Kai-Lin
2012-12-01
This article proposes a post-processing deblocking filter to reduce blocking effects. The proposed algorithm detects blocking artifacts by fusing the results of a Sobel edge detector and a wavelet-based edge detector. The filtering stage provides four filter modes to eliminate blocking effects in different color regions according to human color vision and color psychology analysis. Experimental results show that the proposed algorithm achieves better subjective and objective quality for H.264/AVC reconstructed videos than several existing methods.
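A hedged sketch of the detection stage (the fusion rule, the thresholds, and the choice of a single-level Haar transform are assumptions on my part; the abstract does not specify them) might look like:

```python
import numpy as np
from scipy import ndimage

def fused_edge_map(img, sobel_thresh=0.2, wavelet_thresh=0.1):
    """Fuse a Sobel edge map with a single-level Haar wavelet detail map.

    Sketch only; assumes a grayscale image with even dimensions.
    """
    img = np.asarray(img, dtype=float)
    img = (img - img.min()) / (np.ptp(img) + 1e-12)  # normalize to [0, 1]

    # Sobel gradient magnitude (full resolution).
    gx = ndimage.sobel(img, axis=1)
    gy = ndimage.sobel(img, axis=0)
    mag = np.hypot(gx, gy)
    sobel_edges = mag > sobel_thresh * mag.max()

    # Single-level Haar detail coefficients computed on 2x2 blocks.
    a, b = img[0::2, 0::2], img[0::2, 1::2]
    c, d = img[1::2, 0::2], img[1::2, 1::2]
    detail = (np.abs(a + b - c - d)      # horizontal detail
              + np.abs(a - b + c - d)    # vertical detail
              + np.abs(a - b - c + d)) / 2.0  # diagonal detail
    detail_full = np.kron(detail, np.ones((2, 2)))  # upsample to full size
    wavelet_edges = detail_full > wavelet_thresh * (detail_full.max() + 1e-12)

    # Fusion: flag a pixel if either detector responds.
    return sobel_edges | wavelet_edges
```

A deblocking filter would then treat fused edge pixels near 8x8 block boundaries as artifact candidates and select one of its filter modes accordingly.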
Modal control theory and application to aircraft lateral handling qualities design
NASA Technical Reports Server (NTRS)
Srinathkumar, S.
1978-01-01
A multivariable synthesis procedure based on eigenvalue/eigenvector assignment is reviewed and is employed to develop a systematic design procedure to meet the lateral handling qualities design objectives of a fighter aircraft over a wide range of flight conditions. The closed loop modal characterization developed provides significant insight into the design process and plays a pivotal role in the synthesis of robust feedback systems. The simplicity of the synthesis algorithm yields an efficient computer aided interactive design tool for flight control system synthesis.
A new hyperspectral imaging based device for quality control in plastic recycling
NASA Astrophysics Data System (ADS)
Bonifazi, G.; D'Agostini, M.; Dall'Ava, A.; Serranti, S.; Turioni, F.
2013-05-01
The quality control of contamination levels in recycled plastics streams has been identified as a key factor for increasing the value of the recycled material by both the plastic recycling and compounder industries. Existing quality control methods for the detection of plastic and non-plastic contaminants in plastic waste streams at different stages of the industrial process (e.g. feed, intermediate and final products) are currently based on the manual collection of a sample from the stream and on subsequent off-line laboratory analyses. The results of such analyses are usually available hours, or sometimes even days, after the material has been processed, and the laboratory analyses are time-consuming and expensive (in terms of equipment cost and maintenance as well as labour cost). Therefore, a fast on-line assessment to monitor the plastic waste feed streams and to characterize the composition of the different plastic products is fundamental to increase the value of secondary plastics. This paper describes and evaluates the development of an HSI-based device and of the related software architectures and processing algorithms for quality assessment of plastics in recycling plants, with particular reference to polyolefins (PO). NIR-HSI sensing devices coupled with multivariate data analysis methods were demonstrated to constitute an objective, rapid and non-destructive technique that can be used for on-line quality and process control in the recycling of POs. In particular, the adoption of the previously mentioned HD&SW integrated architectures can provide a solution to one of the major problems of the recycling industry, which is the lack of an accurate quality certification of materials obtained by recycling processes. These results could therefore assist in developing strategies to certify the composition of recycled PO products.
NASA Astrophysics Data System (ADS)
Hanhart, Philippe; Ebrahimi, Touradj
2014-03-01
Crosstalk and vergence-accommodation rivalry negatively impact the quality of experience (QoE) provided by stereoscopic displays. However, exploiting visual attention and adapting the 3D rendering process on the fly can reduce these drawbacks. In this paper, we propose and evaluate two different approaches that exploit visual attention to improve 3D QoE on stereoscopic displays: an offline system, which uses a saliency map to predict gaze position, and an online system, which uses a remote eye tracking system to measure real time gaze positions. The gaze points were used in conjunction with the disparity map to extract the disparity of the object-of-interest. Horizontal image translation was performed to bring the fixated object on the screen plane. The user preference between standard 3D mode and the two proposed systems was evaluated through a subjective evaluation. Results show that exploiting visual attention significantly improves image quality and visual comfort, with a slight advantage for real time gaze determination. Depth quality is also improved, but the difference is not significant.
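The core geometric step, bringing the fixated object onto the screen plane by horizontal image translation, can be sketched as follows (a minimal NumPy sketch; the function name and the zero-filling of wrapped columns are illustrative assumptions, not the authors' implementation):

```python
import numpy as np

def rezero_disparity(left, right, disparity, gaze_xy):
    """Shift the right view so the object under the gaze point lands on the
    screen plane (zero disparity)."""
    gx, gy = gaze_xy
    d = int(round(disparity[gy, gx]))     # disparity of the fixated object
    shifted = np.roll(right, -d, axis=1)  # horizontal image translation
    # columns wrapped around by roll carry no valid data; blank them
    if d > 0:
        shifted[:, -d:] = 0
    elif d < 0:
        shifted[:, :-d] = 0
    return left, shifted
```

With the online system of the paper, `gaze_xy` would come from the eye tracker each frame; with the offline system, from the peak of a saliency map.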
[Business organization theory: its potential use in the organization of the operating room].
Bartz, H-J
2005-07-01
The paradigm of patient care in the German health system is changing. The introduction of German Diagnosis Related Groups (G-DRGs), a diagnosis-related coding system, has made process-oriented thinking increasingly important. The treatment process is viewed and managed as a whole from the admission to the discharge of the patient, and the interfaces between departments and sectors are diminished. A main objective of these measures is to render patient care more cost efficient. Within the hospital, the operating room (OR) is the most expensive factor, accounting for 25 - 50 % of the costs of a surgical patient, and is also a bottleneck in surgical patient care. Therefore, controlling the perioperative treatment process is becoming more and more important. Here, business organisation theory can be a very useful tool. Especially the concepts of process organisation and process management can be applied to hospitals. Process-oriented thinking uncovers and solves typical organisational problems. Competences, responsibilities and tasks are reorganised by process orientation and the enterprise is gradually transformed into a process-oriented system. Process management includes objective-oriented controlling of the value chain of an enterprise with regard to quality, time, costs and customer satisfaction. The quality of the process is continuously improved using process-management techniques. The main advantage of process management is consistent customer orientation, which means being aware of the customer's needs at any time during the daily routine. The performance is therefore always directed towards current market requirements. This paper presents the basics of business organisation theory and points out its potential use in the organisation of the OR.
An Overview of NADE Accreditation
ERIC Educational Resources Information Center
Ferguson, Jennifer; Ludman, Naomi
2018-01-01
Accreditation is a process by which programs demonstrate their academic quality; that is, they demonstrate that they are making decisions for programmatic changes based on: (1) a sound theoretical foundation; (2) clearly stated mission, goals, and objectives; (3) a comprehensive self-study and thoughtful use of best practices; and (4) consistent,…
Decorin content and near infrared spectroscopy analysis of dried collagenous biomaterial samples
USDA-ARS?s Scientific Manuscript database
The efficient removal of proteoglycans, such as decorin, from hide when processing it traditionally to leather is generally acceptable and beneficial for leather quality, especially for softness and flexibility. The objective of this research was to determine the residual decorin content of dried c...
USDA-ARS?s Scientific Manuscript database
Toxoplasma gondii is a common protozoan parasite, whose environmentally-resistant stage, the oocyst, can contaminate irrigation water and fresh edible produce. Current washing steps in produce processing may not be effective for eliminating T. gondii from at-risk varieties of produce. The objective ...
Scoping review of potential quality indicators for hip fracture patient care
Pitzul, Kristen B; Munce, Sarah E P; Perrier, Laure; Beaupre, Lauren; Morin, Suzanne N; McGlasson, Rhona; Jaglal, Susan B
2017-01-01
Objective The purpose of this study is to identify existing or potential quality of care indicators (ie, current indicators as well as process and outcome measures) in the acute or postacute period, or across the continuum of care for older adults with hip fracture. Design Scoping review. Setting All care settings. Search strategy English peer-reviewed studies published from January 2000 to January 2016 were included. Literature search strategies were developed, and the search was peer-reviewed. Two reviewers independently piloted all forms, and all articles were screened in duplicate. Results The search yielded 2729 unique articles, of which 302 articles were included (11.1%). When indicators (eg, in-hospital mortality, acute care length of stay) and potential indicators (eg, comorbidities developed in hospital, walking ability) were grouped by the outcome or process construct they were trying to measure, the most common constructs were measures of mortality (outcome), length of stay (process) and time-sensitive measures (process). There was heterogeneity in definitions within constructs between studies. There was also a paucity of indicators and potential indicators in the postacute period. Conclusions To improve quality of care for patients with hip fracture and create a more efficient healthcare system, mechanisms for the measurement of quality of care across the entire continuum, not just during the acute period, are required. Future research should focus on decreasing the heterogeneity in definitions of quality indicators and the development and implementation of quality indicators for the postacute period. PMID:28325859
Moore, Lynne; Lavoie, André; Bourgeois, Gilles; Lapointe, Jean
2015-06-01
According to Donabedian's health care quality model, improvements in the structure of care should lead to improvements in clinical processes that should in turn improve patient outcome. This model has been widely adopted by the trauma community but has not yet been validated in a trauma system. The objective of this study was to assess the performance of an integrated trauma system in terms of structure, process, and outcome and evaluate the correlation between quality domains. Quality of care was evaluated for patients treated in a Canadian provincial trauma system (2005-2010; 57 centers, n = 63,971) using quality indicators (QIs) developed and validated previously. Structural performance was measured by transposing on-site accreditation visit reports onto an evaluation grid according to American College of Surgeons criteria. The composite process QI was calculated as the average sum of proportions of conformity to 15 process QIs derived from literature review and expert opinion. Outcome performance was measured using risk-adjusted rates of mortality, complications, and readmission as well as hospital length of stay (LOS). Correlation was assessed with Pearson's correlation coefficients. Statistically significant correlations were observed between structure and process QIs (r = 0.33), and process and outcome QIs (r = -0.33 for readmission, r = -0.27 for LOS). Significant positive correlations were also observed between outcome QIs (r = 0.37 for mortality-readmission; r = 0.39 for mortality-LOS and readmission-LOS; r = 0.45 for mortality-complications; r = 0.34 for readmission-complications; r = 0.63 for complications-LOS). Significant correlations between quality domains observed in this study suggest that Donabedian's structure-process-outcome model is a valid model for evaluating trauma care. Trauma centers that perform well in terms of structure also tend to perform well in terms of clinical processes, which in turn has a favorable influence on patient outcomes.
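The correlation analysis itself is standard Pearson correlation over per-centre scores. A toy NumPy illustration with synthetic data (not the study's data; the study had 57 centres, and the slopes below are invented so the sign pattern mirrors the reported one) looks like this:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200  # synthetic centres, larger than the study's 57 for a stable demo
structure = rng.normal(size=n)                      # accreditation-grid score
process = 0.5 * structure + rng.normal(size=n)      # composite process QI
readmission = -0.5 * process + rng.normal(size=n)   # risk-adjusted outcome QI

# Pearson correlations between quality domains
r_sp = np.corrcoef(structure, process)[0, 1]        # structure-process
r_po = np.corrcoef(process, readmission)[0, 1]      # process-readmission
print(f"structure-process r = {r_sp:.2f}, process-readmission r = {r_po:.2f}")
```

A positive `r_sp` and a negative `r_po` correspond to the structure-process-outcome chain the abstract reports (r = 0.33 and r = -0.33 respectively in the study).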
Prognostic study, level III.
An elementary research on wireless transmission of holographic 3D moving pictures
NASA Astrophysics Data System (ADS)
Takano, Kunihiko; Sato, Koki; Endo, Takaya; Asano, Hiroaki; Fukuzawa, Atsuo; Asai, Kikuo
2009-05-01
In this paper, a process for transmitting a sequence of holograms describing 3D moving objects over a wireless communication network is presented. The sequence of holograms is transformed into a bit stream and then transmitted over wireless LAN and Bluetooth. It is shown that, by applying this technique, holographic data of 3D moving objects are transmitted in high quality and a relatively good reconstruction of the holographic images is achieved.
TU-AB-BRD-04: Development of Quality Management Program
DOE Office of Scientific and Technical Information (OSTI.GOV)
Thomadsen, B.
2015-06-15
Current quality assurance and quality management guidelines provided by various professional organizations are prescriptive in nature, focusing principally on performance characteristics of planning and delivery devices. However, published analyses of events in radiation therapy show that most events are often caused by flaws in clinical processes rather than by device failures. This suggests the need for the development of a quality management program that is based on integrated approaches to process and equipment quality assurance. Industrial engineers have developed various risk assessment tools that are used to identify and eliminate potential failures from a system or a process before a failure impacts a customer. These tools include, but are not limited to, process mapping, failure modes and effects analysis, and fault tree analysis. Task Group 100 of the American Association of Physicists in Medicine has developed these tools and used them to formulate an example risk-based quality management program for intensity-modulated radiotherapy. This is a prospective risk assessment approach that analyzes potential error pathways inherent in a clinical process and then ranks them according to relative risk, typically before implementation, followed by the design of a new process or modification of the existing process. Appropriate controls are then put in place to ensure that failures are less likely to occur and, if they do, they will more likely be detected before they propagate through the process, compromising treatment outcome and causing harm to the patient. Such a prospective approach forms the basis of the work of Task Group 100 that has recently been approved by the AAPM. This session will be devoted to a discussion of these tools and practical examples of how these tools can be used in a given radiotherapy clinic to develop a risk-based quality management program.
Learning Objectives: Learn how to design a process map for a radiotherapy process; learn how to perform failure modes and effects analysis for a given process; learn what fault trees are; learn how to design a quality management program based upon the information obtained from process mapping, failure modes and effects analysis, and fault tree analysis. Dunscombe: Director, TreatSafely, LLC and Center for the Assessment of Radiological Sciences; Consultant to IAEA and Varian. Thomadsen: President, Center for the Assessment of Radiological Sciences. Palta: Vice President of the Center for the Assessment of Radiological Sciences.
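The failure-modes-and-effects step discussed in this session boils down to scoring and ranking failure modes. A minimal sketch of that bookkeeping (the failure modes and scores below are invented for illustration, not TG-100 values):

```python
# Each potential failure mode gets severity (S), occurrence (O) and lack of
# detectability (D) scores; modes are ranked by risk priority number RPN = S*O*D.
failure_modes = [
    ("wrong CT dataset imported",   9, 2, 4),
    ("MLC leaf calibration drift",  7, 3, 3),
    ("plan parameters mistyped",    8, 4, 2),
    ("couch position not verified", 6, 5, 5),
]

ranked = sorted(failure_modes, key=lambda m: m[1] * m[2] * m[3], reverse=True)
for name, s, o, d in ranked:
    print(f"RPN {s*o*d:3d}  {name}")
```

The highest-RPN modes are where process controls (checklists, interlocks, independent checks) would be targeted first.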
Response Ant Colony Optimization of End Milling Surface Roughness
Kadirgama, K.; Noor, M. M.; Abd Alla, Ahmed N.
2010-01-01
Metal cutting processes are important due to increased consumer demand for quality metal cutting related products (more precise tolerances and better product surface roughness), which has driven the metal cutting industry to continuously improve quality control of its processes. This paper presents optimum surface roughness obtained when milling mould aluminium alloy (AA6061-T6) with Response Ant Colony Optimization (RACO). The approach is based on the Response Surface Method (RSM) and Ant Colony Optimization (ACO). The main objectives are to find the optimized parameters and the most dominant variables (cutting speed, feedrate, axial depth and radial depth). The first-order model indicates that the feedrate is the most significant factor affecting surface roughness. PMID:22294914
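A first-order response-surface fit of the kind described above is an ordinary least-squares regression on coded factors, and comparing coefficient magnitudes identifies the dominant variable. A sketch on synthetic data (the coefficients are assumptions chosen so that feedrate dominates, mirroring the reported finding; the paper fits measured roughness values):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 30
X = rng.uniform(-1, 1, size=(n, 4))           # speed, feedrate, axial, radial (coded)
true_beta = np.array([0.1, 0.9, 0.2, 0.15])   # feedrate dominant by construction
y = 1.0 + X @ true_beta + rng.normal(scale=0.05, size=n)  # simulated roughness

# first-order model: y = b0 + b1*x1 + ... + b4*x4, fitted by least squares
A = np.column_stack([np.ones(n), X])
beta, *_ = np.linalg.lstsq(A, y, rcond=None)
names = ["speed", "feedrate", "axial depth", "radial depth"]
dominant = names[int(np.argmax(np.abs(beta[1:])))]
print("dominant factor:", dominant)
```

In the paper, ACO then searches this fitted response surface for the parameter combination minimising roughness.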
Alemnji, George; Edghill, Lisa; Wallace-Sankarsingh, Sacha; Albalak, Rachel; Cognat, Sebastien; Nkengasong, John; Gabastou, Jean-Marc
2017-01-01
Background Implementing quality management systems and accrediting laboratories in the Caribbean has been a challenge. Objectives We report the development of a stepwise process for quality systems improvement in the Caribbean Region. Methods The Caribbean Laboratory Stakeholders met under a joint Pan American Health Organization/US Centers for Disease Control and Prevention initiative and developed a user-friendly framework called ‘Laboratory Quality Management System – Stepwise Improvement Process (LQMS-SIP) Towards Accreditation’ to support countries in strengthening laboratory services through a stepwise approach toward fulfilling the ISO 15189: 2012 requirements. Results This approach consists of a three-tiered framework. Tier 1 represents the minimum requirements corresponding to the mandatory criteria for obtaining a licence from the Ministry of Health of the participating country. The next two tiers are quality improvement milestones that are achieved through the implementation of specific quality management system requirements. Laboratories that meet the requirements of the three tiers will be encouraged to apply for accreditation. The Caribbean Regional Organisation for Standards and Quality hosts the LQMS-SIP Secretariat and will work with countries, including the Ministry of Health and stakeholders, including laboratory staff, to coordinate and implement LQMS-SIP activities. The Caribbean Public Health Agency will coordinate and advocate for the LQMS-SIP implementation. Conclusion This article presents the Caribbean LQMS-SIP framework and describes how it will be implemented among various countries in the region to achieve quality improvement. PMID:28879149
Digital holographic image fusion for a larger size object using compressive sensing
NASA Astrophysics Data System (ADS)
Tian, Qiuhong; Yan, Liping; Chen, Benyong; Yao, Jiabao; Zhang, Shihua
2017-05-01
Digital holographic image fusion for a larger-size object using compressive sensing is proposed. In this method, the high-frequency component of the digital hologram under discrete wavelet transform is represented sparsely by using compressive sensing, so that the data redundancy of digital holographic recording can be resolved validly; the low-frequency component is retained entirely to ensure image quality; and multiple reconstructed images, with different clear parts corresponding to the laser spot size, are fused to realize a high-quality reconstructed image of a larger-size object. In addition, a filter combining high-pass and low-pass filters is designed to remove the zero-order term from the digital hologram effectively. A digital holographic experimental setup based on off-axis Fresnel digital holography was constructed, and feasibility and comparative experiments were carried out. The fused image was evaluated using the Tamura texture features. The experimental results demonstrated that the proposed method can improve the processing efficiency and visual characteristics of the fused image and effectively enlarge the size of the measured object.
Assessing Program Learning Objectives to Improve Undergraduate Physics Education
NASA Astrophysics Data System (ADS)
Menke, Carrie
2014-03-01
Our physics undergraduate program has five program learning objectives (PLOs) focusing on (1) physical principles, (2) mathematical expertise, (3) experimental technique, (4) communication and teamwork, and (5) research proficiency. One PLO is assessed each year, with the results guiding modifications in our curriculum and future assessment practices; we have just completed our first cycle of assessing all PLOs. Our approach strives to maximize the ease and applicability of our assessment practices while maintaining faculty's flexibility in course design and delivery. Objectives are mapped onto our core curriculum with identified coursework collected as direct evidence. We've utilized mostly descriptive rubrics, applying them at the course and program levels as well as sharing them with the students. This has resulted in more efficient assessment that is also applicable to reaccreditation efforts, higher inter-rater reliability than with other rubric types, and higher quality capstone projects. We've also found that the varied quality of student writing can interfere with our assessment of other objectives. This poster outlines our processes, resources, and how we have used PLO assessment to strengthen our undergraduate program.
Real-time multiple objects tracking on Raspberry-Pi-based smart embedded camera
NASA Astrophysics Data System (ADS)
Dziri, Aziz; Duranton, Marc; Chapuis, Roland
2016-07-01
Multiple-object tracking constitutes a major step in several computer vision applications, such as surveillance, advanced driver assistance systems, and automatic traffic monitoring. Because of the number of cameras used to cover a large area, these applications are constrained by the cost of each node, the power consumption, the robustness of the tracking, the processing time, and the ease of deployment of the system. To meet these challenges, the use of low-power and low-cost embedded vision platforms to achieve reliable tracking becomes essential in networks of cameras. We propose a tracking pipeline that is designed for fixed smart cameras and which can handle occlusions between objects. We show that the proposed pipeline reaches real-time processing on a low-cost embedded smart camera composed of a Raspberry-Pi board and a RaspiCam camera. The tracking quality and the processing speed obtained with the proposed pipeline are evaluated on publicly available datasets and compared to the state-of-the-art methods.
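A common building block for such a tracking pipeline is data association between existing tracks and new detections. The sketch below is a generic greedy IoU matcher, not the authors' occlusion-handling method; the box format `(x1, y1, x2, y2)` and the threshold are assumptions:

```python
def iou(a, b):
    """Intersection-over-union of two axis-aligned boxes (x1, y1, x2, y2)."""
    x1, y1 = max(a[0], b[0]), max(a[1], b[1])
    x2, y2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0, x2 - x1) * max(0, y2 - y1)
    area = lambda r: (r[2] - r[0]) * (r[3] - r[1])
    return inter / (area(a) + area(b) - inter)

def associate(tracks, detections, thresh=0.3):
    """Greedy IoU association: repeatedly commit the best-overlapping
    (track, detection) pair; unmatched detections become new tracks."""
    pairs = sorted(((iou(t, d), ti, di)
                    for ti, t in enumerate(tracks)
                    for di, d in enumerate(detections)), reverse=True)
    used_t, used_d, matches = set(), set(), {}
    for score, ti, di in pairs:
        if score >= thresh and ti not in used_t and di not in used_d:
            matches[ti] = di
            used_t.add(ti); used_d.add(di)
    new = [di for di in range(len(detections)) if di not in used_d]
    return matches, new
```

On a Raspberry-Pi-class platform, the appeal of this kind of association is that it is O(T·D) with tiny constants and needs no external solver.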
Applying image quality in cell phone cameras: lens distortion
NASA Astrophysics Data System (ADS)
Baxter, Donald; Goma, Sergio R.; Aleksic, Milivoje
2009-01-01
This paper describes the framework used in one of the pilot studies run under the I3A CPIQ initiative to quantify overall image quality in cell-phone cameras. The framework is based on a multivariate formalism which tries to predict overall image quality from individual image quality attributes and was validated in a CPIQ pilot program. The pilot study focuses on image quality distortions introduced in the optical path of a cell-phone camera, which may or may not be corrected in the image processing path. The assumption is that the captured image used is JPEG compressed and the cell-phone camera is set to 'auto' mode. As the framework requires the individual attributes to be relatively perceptually orthogonal, the attributes used in the pilot study are lens geometric distortion (LGD) and lateral chromatic aberrations (LCA). The goal of this paper is to present the framework of this pilot project, starting with the definition of the individual attributes and ending with their quantification in JNDs of quality, a requirement of the multivariate formalism; therefore both objective and subjective evaluations were used. A major distinction from the 'DSC imaging world' on the objective side is that the LCA/LGD distortions found in cell-phone cameras rarely exhibit radial behavior; therefore a radial mapping/modeling cannot be used in this case.
NASA Astrophysics Data System (ADS)
Kumbhar, N. N.; Mulay, A. V.
2016-08-01
The Additive Manufacturing (AM) processes open the possibility to go directly from Computer-Aided Design (CAD) to a physical prototype. These prototypes are used as test models before a design is finalized and sometimes as final products. Additive Manufacturing has many advantages over the traditional processes used to develop a product, such as allowing early customer involvement in product development and complex shape generation, as well as saving time and money. Additive Manufacturing also poses some special challenges that are usually worth overcoming, such as poor surface quality, limited physical properties and the need for specific raw materials. To improve surface quality, several attempts have been made by controlling various process parameters of Additive Manufacturing and also by applying different post-processing techniques to components manufactured by Additive Manufacturing. The main objective of this work is to document an extensive literature review in the general area of post-processing techniques used in Additive Manufacturing.
Hulshof, C T; Verbeek, J H; van Dijk, F J; van der Weide, W E; Braam, I T
1999-06-01
To study the nature and extent of evaluation research in occupational health services (OHSs), a literature review of evaluation research in OHSs was performed. On the basis of a conceptual model of OHS evaluation, empirical studies are categorised into aspects of input, process, output, outcome, and OHS core activities. Many methods to evaluate OHSs or OHS activities exist, depending on the objective and object of evaluation. The number of empirical studies on the evaluation of OHSs or OHS activities that met the non-restrictive inclusion criteria was remarkably limited. Most of the 52 studies were more descriptive than evaluative, and the methodological quality of most studies was not high. A differentiated picture of the evidence of effectiveness of OHSs arises. Occupational health consultations and occupational rehabilitation are hardly studied, despite the large amount of time occupational physicians in most countries spend on consultations. The lack of effectiveness and efficiency of the pre-employment examination should lead to its abandonment as a means of selection of personnel by OHSs. Periodic health monitoring or surveillance, and education on occupational health hazards, can be carried out with reasonable process quality. Identification and evaluation of occupational health hazards by a workplace survey can be done with a high output quality, which, however, does not guarantee a favourable outcome. Although rigorous study designs are not always applicable or feasible in daily practice, much more effort should be directed at the scientific evaluation of OHSs and OHS instruments. To develop evidence-based occupational health care, the quality of evaluation studies should be improved. In particular, the process and outcome of consultation and rehabilitation activities of occupational physicians need to be studied more.
Mori, S
2014-05-01
To ensure accuracy in respiratory-gating treatment, X-ray fluoroscopic imaging is used to detect tumour position in real time. Detection accuracy is strongly dependent on image quality, particularly positional differences between the patient and treatment couch. We developed a new algorithm to improve the quality of images obtained in X-ray fluoroscopic imaging and report the preliminary results. Two oblique X-ray fluoroscopic images were acquired using a dynamic flat panel detector (DFPD) for two patients with lung cancer. The weighting factor was applied to the DFPD image in respective columns, because most anatomical structures, as well as the treatment couch and port cover edge, were aligned in the superior-inferior direction when the patient lay on the treatment couch. The weighting factors for the respective columns were varied until the standard deviation of the pixel values within the image region was minimized. Once the weighting factors were calculated, the quality of the DFPD image was improved by applying the factors to multiframe images. Applying the image-processing algorithm produced substantial improvement in the quality of images, and the image contrast was increased. The treatment couch and irradiation port edge, which were not related to a patient's position, were removed. The average image-processing time was 1.1 ms, showing that this fast image processing can be applied to real-time tumour-tracking systems. These findings indicate that this image-processing algorithm improves the image quality in patients with lung cancer and successfully removes objects not related to the patient. Our image-processing algorithm might be useful in improving gated-treatment accuracy.
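The per-column weighting described above can be sketched as a brute-force search over candidate weights, picking for each column the weight that minimises the standard deviation of the whole image (a simplified reading of the paper's algorithm; the candidate weight grid and greedy left-to-right column order are assumptions):

```python
import numpy as np

def _std_with(img, col, w):
    """Standard deviation of the image with one column scaled by w."""
    trial = img.copy()
    trial[:, col] *= w
    return trial.std()

def column_weights(img, candidates=np.linspace(0.2, 1.0, 17)):
    """Per-column weights that minimise the overall pixel-value spread,
    suppressing SI-aligned structures such as the couch and port edges."""
    out = img.astype(float).copy()
    weights = np.ones(img.shape[1])
    for c in range(img.shape[1]):
        best = min(candidates, key=lambda w: _std_with(out, c, w))
        weights[c] = best
        out[:, c] *= best
    return weights, out
```

Once computed on one frame, the same weight vector can be applied cheaply to subsequent fluoroscopy frames, which is consistent with the ~1 ms per-frame processing the abstract reports.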
Increasing reconstruction quality of diffractive optical elements displayed with LC SLM
NASA Astrophysics Data System (ADS)
Cheremkhin, Pavel A.; Evtikhiev, Nikolay N.; Krasnov, Vitaly V.; Rodin, Vladislav G.; Starikov, Sergey N.
2015-03-01
Phase liquid crystal (LC) spatial light modulators (SLM) are actively used in various applications. However, the majority of scientific applications require stable phase modulation, which might be hard to achieve with a commercially available SLM due to its consumer origin. The use of a digital voltage addressing scheme leads to temporal phase fluctuations, which result in lower diffraction efficiency and reduced reconstruction quality of displayed diffractive optical elements (DOE). Due to the high periodicity of the fluctuations, it should be possible to use knowledge of them during DOE synthesis to minimize the negative effect. We synthesized DOE using accurately measured phase fluctuations of the phase LC SLM "HoloEye PLUTO VIS" to minimize their negative impact on displayed DOE reconstruction. Synthesis was conducted with the versatile direct search with random trajectory (DSRT) method in the following way. Before DOE synthesis began, the two-dimensional dependency of the SLM phase shift on addressed signal level and time from frame start was obtained. Synthesis then proceeds as follows. First, an initial phase distribution is created. Second, a random trajectory for the consecutive processing of all DOE elements is generated. Then an iterative process begins: each DOE element in turn has its value changed to the one that provides a better value of the objective criterion, e.g. lower deviation of the reconstructed image from the original one. If the current element value already provides the best objective criterion value, it is left unchanged. After all elements are processed, the iteration repeats until stagnation is reached. It is demonstrated that applying knowledge of the SLM phase fluctuations in DOE synthesis with the DSRT method leads to a noticeable increase in DOE reconstruction quality.
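The DSRT loop described above condenses to a few lines: visit the elements along a random trajectory and keep, per element, the level that minimises the objective, repeating until a full pass changes nothing. The toy separable objective below is invented for illustration; a real DOE objective would evaluate the simulated reconstruction including the measured phase fluctuations:

```python
import numpy as np

def dsrt(n_elements, levels, cost, max_iter=50, seed=0):
    """Direct search with random trajectory (DSRT), simplified: for each
    element along a random order, keep the level minimising the objective;
    stop when a full pass leaves everything unchanged (stagnation)."""
    rng = np.random.default_rng(seed)
    x = rng.choice(levels, size=n_elements)   # initial phase distribution
    for _ in range(max_iter):
        changed = False
        for i in rng.permutation(n_elements):  # random trajectory
            def trial_cost(v):
                trial = x.copy()
                trial[i] = v
                return cost(trial)
            best = min(levels, key=trial_cost)
            if best != x[i]:
                x[i] = best
                changed = True
        if not changed:
            break
    return x

# toy objective: deviation of the candidate from a target phase profile
target = np.array([0, 2, 1, 3, 2, 0])
solution = dsrt(6, levels=[0, 1, 2, 3],
                cost=lambda p: int(np.abs(p - target).sum()))
```

Because the toy objective is separable, the search converges to the target exactly; real objectives couple elements, which is why the random trajectory and stagnation test matter.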
NASA Astrophysics Data System (ADS)
Dahlan, Muhammad Hatta; Saleh, Abdullah; Asip, Faisol; Makmun, Akbar; Defi
2017-11-01
Membrane technology based on a clay mixture with activated carbon from Bintaro seeds, zeolite, and bentonite was applied to process wastewater from the production of songket, a traditional Palembang cloth. This applied research falls within the priority field of industrial and household waste processing with ceramic membrane technology. The objective of this research is to design a better and simpler ceramic-membrane separation device for the liquid waste of jumputan cloth, so that Palembang songket artisans can process their waste in accordance with the environmental quality standard (BML) and Pergub Sumsel No. 16 of 2005. The specific target is to reduce the jumputan-cloth waste to within the applicable environmental quality standards. The method combines two processes: adsorption using activated carbon and separation using a ceramic membrane whose performance depends on the composition of the mixture. The activated carbon from Bintaro seeds is expected to decrease the concentration of the liquid waste; Bintaro is a non-edible fruit whose organic constituents can adsorb dyes and metal fillers. The membrane stage is expected to decrease the waste concentration further and to yield clear water that can be recycled for household use. With this composition of clay-based materials (zeolite, bentonite, and activated carbon from Bintaro seeds), the work aims to find a solution and to obtain novelty, in the form of a patent, from this research.
Bak, Jin Seop
2015-01-01
In order to address the limitations associated with the inefficient pasteurization platform used to make Makgeolli, such as the presence of turbid colloidal dispersions in suspension, commercially available Makgeolli was minimally processed using a low-pressure homogenization-based pasteurization (LHBP) process. This continuous process demonstrates that promptly reducing the exposure time to excessive heat using either large molecules or insoluble particles can dramatically improve internal quality and decrease irreversible damage. Specifically, optimal homogenization increased concomitantly with physical parameters such as colloidal stability (65.0% of maximum and below 25-μm particles) following two repetitions at 25.0 MPa. However, biochemical parameters such as microbial population, acidity, and the presence of fermentable sugars rarely affected Makgeolli quality. Remarkably, there was a 4.5-log reduction in the number of Saccharomyces cerevisiae target cells at 53.5°C for 70 sec in optimally homogenized Makgeolli. This value was higher than the 37.7% measured from traditionally pasteurized Makgeolli. In contrast to the analytical similarity among homogenized Makgeollis, our objective quality evaluation demonstrated significant differences between pasteurized (or unpasteurized) Makgeolli and LHBP-treated Makgeolli. Keywords: low-pressure homogenization-based pasteurization; Makgeolli; minimal processing-preservation; Saccharomyces cerevisiae; suspension stability.
NASA Astrophysics Data System (ADS)
Raju, B. S.; Sekhar, U. Chandra; Drakshayani, D. N.
2017-08-01
The paper investigates optimization of the stereolithography process for SL5530 epoxy resin to enhance part quality. The performance characteristics selected to evaluate the process are tensile strength, flexural strength, impact strength, and density, and the corresponding process parameters are layer thickness, orientation, and hatch spacing. Because the process intrinsically involves tuning multiple parameters, grey relational analysis, which uses the grey relational grade as a performance index, is adopted to determine the optimal combination of process parameters. Moreover, principal component analysis is applied to evaluate the weighting values of the various performance characteristics so that their relative importance is determined properly and objectively. The results of confirmation experiments reveal that grey relational analysis coupled with principal component analysis can effectively identify the optimal combination of process parameters. This confirms that the proposed approach can be a useful tool for improving process parameters in stereolithography, which is valuable information for machine designers as well as RP machine users.
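The grey relational analysis described in this abstract can be illustrated with a minimal sketch. All response values below are invented for demonstration, and equal weights stand in for the PCA-derived weights the paper uses; this is not the authors' implementation.

```python
# Hypothetical sketch of grey relational analysis (GRA) for multi-response
# process optimization. Responses are larger-is-better (e.g. tensile,
# flexural, impact strength); weights would come from PCA in the paper.

def grey_relational_grade(responses, weights, zeta=0.5):
    """Compute the grey relational grade for each experimental run."""
    cols = list(zip(*responses))
    # 1. Normalize each response to [0, 1] (larger-is-better).
    norm = [
        [(x - min(c)) / (max(c) - min(c)) if max(c) > min(c) else 1.0
         for x, c in zip(run, cols)]
        for run in responses
    ]
    # 2. Deviation from the ideal sequence (all ones).
    dev = [[1.0 - x for x in run] for run in norm]
    d_min = min(min(run) for run in dev)
    d_max = max(max(run) for run in dev)
    # 3. Grey relational coefficient per response, weighted grade per run.
    grades = []
    for run in dev:
        coeffs = [(d_min + zeta * d_max) / (d + zeta * d_max) for d in run]
        grades.append(sum(w * c for w, c in zip(weights, coeffs)))
    return grades

# Three hypothetical runs x three responses; equal weights assumed.
runs = [[52.0, 80.0, 3.1], [58.0, 86.0, 3.4], [55.0, 78.0, 3.0]]
grades = grey_relational_grade(runs, weights=[1/3, 1/3, 1/3])
best = grades.index(max(grades))  # run with the highest grade is "optimal"
```

The run that dominates every response attains the ideal grade of 1.0, which is why the grade serves as a single performance index for multi-response tuning.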
Structured light optical microscopy for three-dimensional reconstruction of technical surfaces
NASA Astrophysics Data System (ADS)
Kettel, Johannes; Reinecke, Holger; Müller, Claas
2016-04-01
In microsystems technology, quality control of microstructured surfaces with different surface properties plays an ever more important role. Quality control incorporates three-dimensional (3D) reconstruction of specularly and diffusely reflecting technical surfaces. Due to the demand for high measurement accuracy and data acquisition rates, structured light optical microscopy has become a valuable solution to this problem, providing high vertical and lateral resolution. However, 3D reconstruction of specularly reflecting technical surfaces still remains a challenge for optical measurement principles. In this paper we present a measurement principle based on structured light optical microscopy which enables 3D reconstruction of both specularly and diffusely reflecting technical surfaces. It is realized using the two light paths of a stereo microscope equipped with different magnification levels. The right optical path of the stereo microscope is used to project structured light onto the object surface. The left optical path is used to capture the structured-illuminated object surface with a camera. Structured light patterns are generated by a Digital Light Processing (DLP) device in combination with a high-power Light Emitting Diode (LED) and are realized as a matrix of discrete light spots that illuminate defined areas on the object surface. The introduced measurement principle is based on multiple, parallel-processed point measurements. Analysis of the measured Point Spread Function (PSF) by pattern recognition and model fitting algorithms enables the precise calculation of 3D coordinates. Using exemplary technical surfaces, we demonstrate the successful application of our measurement principle.
A Hybrid Interval-Robust Optimization Model for Water Quality Management.
Xu, Jieyu; Li, Yongping; Huang, Guohe
2013-05-01
In water quality management problems, uncertainties may exist in many system components and pollution-related processes (i.e., random nature of hydrodynamic conditions, variability in physicochemical processes, dynamic interactions between pollutant loading and receiving water bodies, and indeterminacy of available water and treated wastewater). These complexities lead to difficulties in formulating and solving the resulting nonlinear optimization problems. In this study, a hybrid interval-robust optimization (HIRO) method was developed through coupling stochastic robust optimization and interval linear programming. HIRO can effectively reflect the complex system features under uncertainty, where implications of water quality/quantity restrictions for achieving regional economic development objectives are studied. By delimiting the uncertain decision space through dimensional enlargement of the original chemical oxygen demand (COD) discharge constraints, HIRO enhances the robustness of the optimization processes and resulting solutions. This method was applied to planning of industry development in association with river-water pollution concern in New Binhai District of Tianjin, China. Results demonstrated that the proposed optimization model can effectively communicate uncertainties into the optimization process and generate a spectrum of potential inexact solutions supporting local decision makers in managing benefit-effective water quality management schemes. HIRO is helpful for analysis of policy scenarios related to different levels of economic penalties, while also providing insight into the tradeoff between system benefits and environmental requirements.
ISO9000 and the quality management system in the digital hospital.
Liu, Yalan; Yao, Bin; Zhang, Zigang
2002-01-01
The ISO9000 quality management system (ISO9000QMS) emphasizes customer orientation, managers' leadership, and the participation of all staff; adopts the process method and system management; promotes fact-based decision making and continual improvement; and establishes win-win relationships with suppliers. The digital hospital can therefore adopt the ISO9000QMS. In order to establish the ISO9000QMS, the digital hospital should: (1) design integrally, including analyzing the operation procedure, clarifying job duties, setting up the implementation team, and setting the quality policy and objectives; (2) learn the ISO9000 quality standards; (3) draw up the documents, including the quality manual, program files, and operation guiding files; (4) train according to the documents; (5) execute the quality standard, including service quality auditing, quality record auditing, and quality system auditing; (6) improve continually. With the establishment of the ISO9000QMS, the digital hospital can appraise more accurately, analyze quality problems statistically, and avoid the interference of artificial factors.
NASA Astrophysics Data System (ADS)
Nash, A. E., III
2017-12-01
The most common approaches to identifying the most effective mission design to maximize science return from a potential set of competing alternative design approaches are often inefficient and inaccurate. Recently, Team-X at the Jet Propulsion Laboratory undertook an effort to improve both the speed and quality of science - measurement - mission design trade studies. We will report on the methodology & processes employed and their effectiveness in trade study speed and quality. Our results indicate that facilitated subject matter expert peers are the keys to speed and quality improvements in the effectiveness of science - measurement - mission design trade studies.
Small satellite product assurance
NASA Astrophysics Data System (ADS)
Demontlivault, J.; Cadelec, Jacques
1993-01-01
In order to increase interest in small satellites, their cost must be reduced; lowering the product assurance costs induced by quality requirements is a major objective. For a logical approach, small satellites are classified into three main categories: satellites for experimental operations with a short lifetime; operational satellites manufactured in small series with long-lifetime requirements; and operational satellites (long lifetime required) of which only a few models are produced. The various requirements regarding product assurance are examined for each satellite category: general requirements for the space approach, reliability, electronic components, materials and processes, quality assurance, documentation, tests, and management. An ideal product assurance system integrates quality teams and engineering teams.
Enabling nutrient security and sustainability through systems research.
Kaput, Jim; Kussmann, Martin; Mendoza, Yery; Le Coutre, Ronit; Cooper, Karen; Roulin, Anne
2015-05-01
Human and companion animal health depends upon nutritional quality of foods. Seed varieties, seasonal and local growing conditions, transportation, food processing, and storage, and local food customs can influence the nutrient content of food. A new and intensive area of investigation is emerging that recognizes many factors in these agri-food systems that influence the maintenance of nutrient quality which is fundamental to ensure nutrient security for world populations. Modeling how these systems function requires data from different sectors including agricultural, environmental, social, and economic, but also must incorporate basic nutrition and other biomedical sciences. Improving the agri-food system through advances in pre- and post-harvest processing methods, biofortification, or fortifying processed foods will aid in targeting nutrition for populations and individuals. The challenge to maintain and improve nutrient quality is magnified by the need to produce food locally and globally in a sustainable and consumer-acceptable manner for current and future populations. An unmet requirement for assessing how to improve nutrient quality, however, is the basic knowledge of how to define health. That is, health cannot be maintained or improved by altering nutrient quality without an adequate definition of what health means for individuals and populations. Defining and measuring health therefore becomes a critical objective for basic nutritional and other biomedical sciences.
Martínez-Pardo, María Esther; Mariano-Magaña, David
2007-01-01
Tissue banking is a complex operation concerned with the organisation and coordination of all the steps, that is, from donor selection up to storage and distribution of the final products for therapeutic, diagnostic, instruction and research purposes. An appropriate quality framework should be established in order to cover all the specific methodology as well as the general aspects of quality management, such as research and development, design, instruction and training, specific documentation, traceability, corrective action, client satisfaction, and the like. Such a framework can be obtained by developing a quality management system (QMS) in accordance with a suitable international standard: ISO 9001:2000. This paper presents the implementation process of the tissue bank QMS at the Instituto Nacional de Investigaciones Nucleares in Mexico. The objective of the paper is to share the experience gained by the tissue bank personnel [radiosterilised tissue bank (BTR)] at the Instituto Nacional de Investigaciones Nucleares (ININ, National Institute of Nuclear Research), during implementation of the ISO 9001:2000 certification process. At present, the quality management system (QMS) of ININ also complies with the Mexican standard NMX-CC-9001:2000. The scope of this QMS is Research, Development and Processing of Biological Tissues Sterilised by Gamma Radiation, among others.
Hahlweg, Pola; Didi, Sarah; Kriston, Levente; Härter, Martin; Nestoriuc, Yvonne; Scholl, Isabelle
2017-11-17
The quality of decision-making in multidisciplinary team meetings (MDTMs) depends on the quality of information presented and the quality of team processes. Few studies have examined these factors using a standardized approach. The aim of this study was to objectively document the processes involved in decision-making in MDTMs, document the outcomes in terms of whether a treatment recommendation was given (none vs. singular vs. multiple), and to identify factors related to type of treatment recommendation. An adaptation of the observer rating scale Multidisciplinary Tumor Board Metric for the Observation of Decision-Making (MDT-MODe) was used to assess the quality of the presented information and team processes in MDTMs. Data was analyzed using descriptive statistics and mixed logistic regression analysis. N = 249 cases were observed in N = 29 MDTMs. While cancer-specific medical information was judged to be of high quality, psychosocial information and information regarding patient views were considered to be of low quality. In 25% of the cases no, in 64% one, and in 10% more than one treatment recommendations were given (1% missing data). Giving no treatment recommendation was associated with duration of case discussion, duration of the MDTM session, quality of case history, quality of radiological information, and specialization of the MDTM. Higher levels of medical and treatment uncertainty during discussions were found to be associated with a higher probability for more than one treatment recommendation. The quality of different aspects of information was observed to differ greatly. In general, we did not find MDTMs to be in line with the principles of patient-centered care. Recommendation outcome varied substantially between different specializations of MDTMs. The quality of certain information was associated with the recommendation outcome. Uncertainty during discussions was related to more than one recommendation being considered. 
Time constraints were found to play an important role. Some of those aspects seem modifiable, which offers possibilities for the reorganization of MDTMs.
On the performance of metrics to predict quality in point cloud representations
NASA Astrophysics Data System (ADS)
Alexiou, Evangelos; Ebrahimi, Touradj
2017-09-01
Point clouds are a promising alternative for immersive representation of visual contents. Recently, an increased interest has been observed in the acquisition, processing and rendering of this modality. Although subjective and objective evaluations are critical in order to assess the visual quality of media content, they still remain open problems for point cloud representation. In this paper we focus our efforts on subjective quality assessment of point cloud geometry, subject to typical types of impairments such as noise corruption and compression-like distortions. In particular, we propose a subjective methodology that is closer to real-life scenarios of point cloud visualization. The performance of the state-of-the-art objective metrics is assessed by considering the subjective scores as the ground truth. Moreover, we investigate the impact of adopting different test methodologies by comparing them. Advantages and drawbacks of every approach are reported, based on statistical analysis. The results and conclusions of this work provide useful insights that could be considered in future experimentation.
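One family of objective metrics benchmarked against subjective scores in this kind of study is the point-to-point geometry error. The sketch below is a generic illustration of that idea, not the specific metrics evaluated in the paper; the point sets are toy data.

```python
# Minimal sketch of a symmetric point-to-point geometry error between a
# reference point cloud and a degraded one: RMS of nearest-neighbor
# distances, taken in both directions, keeping the worse direction.
import math

def nn_rms(src, dst):
    """RMS of distances from each point in src to its nearest point in dst."""
    total = 0.0
    for p in src:
        d2 = min(sum((a - b) ** 2 for a, b in zip(p, q)) for q in dst)
        total += d2
    return math.sqrt(total / len(src))

def symmetric_p2p(ref, deg):
    """Symmetric point-to-point error: worst of the two directed RMS values."""
    return max(nn_rms(ref, deg), nn_rms(deg, ref))

ref = [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0), (0.0, 1.0, 0.0)]
noisy = [(0.0, 0.0, 0.1), (1.0, 0.0, 0.0), (0.0, 1.0, 0.0)]
err = symmetric_p2p(ref, noisy)  # small, nonzero geometric error
```

Such a metric captures noise corruption of geometry reasonably well, which is one reason studies like this one compare it against subjective ground-truth scores.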
Has the use of talc an effect on yield and extra virgin olive oil quality?
Caponio, Francesco; Squeo, Giacomo; Difonzo, Graziana; Pasqualone, Antonella; Summo, Carmine; Paradiso, Vito Michele
2016-08-01
The maximization of both extraction yield and extra virgin olive oil quality during olive processing are the main objectives of the olive oil industry. As regards extraction yield, it can be improved by both acting on time/temperature of malaxation and using physical coadjuvants. It is well known that, generally, increasing temperature of malaxation gives an increase in oil extraction yield due to a reduction in oily phase viscosity; however, high malaxation temperature can compromise the nutritional and health values of extra virgin olive oil, leading to undesirable effects such as accelerated oxidative process and loss of volatile compounds responsible for oil flavor and fragrance. The addition of physical coadjuvants in olive oil processing during the malaxation phase, not excluded by EC regulations owing to its exclusively physical action, is well known to promote the breakdown of oil/water emulsions and consequently make oil extraction easier, thus increasing the yield. Among physical coadjuvants, micronized natural talc is used for olive oil processing above all for Spanish and Italian olive cultivars. The quality of extra virgin olive oil depends on numerous variables such as olive cultivar, ripeness degree and quality, machines utilized for processing, oil storage conditions, etc. However, the coadjuvants utilized in olive processing can also influence virgin olive oil characteristics. The literature highlights an increase in oil yield by micronized natural talc addition during olive processing, whereas no clear trend was observed as regards the chemical, nutritional and sensory characteristics of extra virgin olive oil. Although an increase in oil stability was reported, no effect of talc was found on the evolution of virgin olive oil quality indices during storage. © 2016 Society of Chemical Industry.
Ayhan, Zehra; Eştürk, Okan
2009-06-01
Minimally processed ready-to-eat pomegranate arils have become popular due to their convenience, high value, unique sensory characteristics, and health benefits. The objective of this study was to monitor quality parameters and to extend the shelf life of ready-to-eat pomegranate arils packaged with modified atmospheres. Minimally processed pomegranate arils were packed in PP trays sealed with BOPP film under 4 atmospheres including low and super atmospheric oxygen. Packaged arils were stored at 5 degrees C for 18 d and monitored for internal atmosphere and quality attributes. Atmosphere equilibrium was reached for all MAP applications except for high oxygen. As a general trend, slight or no significant change was detected in chemical and physical attributes of pomegranate arils during cold storage. The aerobic mesophilic bacteria were in the range of 2.30 to 4.51 log CFU/g at the end of the storage, which did not affect the sensory quality. Overall, the pomegranate arils packed with air, nitrogen, and enriched oxygen kept quality attributes and were acceptable to sensory panelists on day 18; however, marketability period was limited to 15 d for the low oxygen atmosphere. PP trays sealed with BOPP film combined with either passive or active modified atmospheres and storage at 5 degrees C provided commercially acceptable arils for 18 d with high quality and convenience.
Faulkner, K; Järvinen, H; Butler, P; McLean, I D; Pentecost, M; Rickard, M; Abdullah, B
2010-01-01
The International Atomic Energy Agency (IAEA) has a mandate to assist member states in areas of human health and particularly in the use of radiation for diagnosis and treatment. Clinical audit is seen as an essential tool to assist in assuring the quality of radiation medicine, particularly in the instance of multidisciplinary audit of diagnostic radiology. Consequently, an external clinical audit programme has been developed by the IAEA to examine the structure and processes existent at a clinical site, with the basic objectives of: (1) improvement in the quality of patient care; (2) promotion of the effective use of resources; (3) enhancement of the provision and organisation of clinical services; (4) further professional education and training. These objectives apply in four general areas of service delivery, namely quality management and infrastructure, patient procedures, technical procedures and education, training and research. In the IAEA approach, the audit process is initiated by a request from the centre seeking the audit. A three-member team, comprising a radiologist, medical physicist and radiographer, subsequently undertakes a 5-d audit visit to the clinical site to perform the audit and write the formal audit report. Preparation for the audit visit is crucial and involves the local clinical centre completing a form, which provides the audit team with information on the clinical centre. While all main aspects of clinical structure and process are examined, particular attention is paid to radiation-related activities as described in the relevant documents such as the IAEA Basic Safety Standards, the Code of Practice for Dosimetry in Diagnostic Radiology and related equipment and quality assurance documentation. It should be stressed, however, that the clinical audit does not have any regulatory function. The main purpose of the IAEA approach to clinical audit is one of promoting quality improvement and learning. 
This paper describes the background to the clinical audit programme and the IAEA clinical audit protocol.
Small target detection using objectness and saliency
NASA Astrophysics Data System (ADS)
Zhang, Naiwen; Xiao, Yang; Fang, Zhiwen; Yang, Jian; Wang, Li; Li, Tao
2017-10-01
We are motivated by the need for a generic object detection algorithm that achieves high recall for small targets in complex scenes with acceptable computational efficiency. We propose a novel object detection algorithm with high localization quality at acceptable computational cost. First, we obtain the objectness map as in BING[1] and use non-maximum suppression (NMS) to get the top N points. Then, the k-means algorithm clusters them into K classes according to their location, and the center points of the K classes are taken as seed points. For each seed point, an object potential region is extracted. Finally, a fast salient object detection algorithm[2] is applied to the object potential regions to highlight object-like pixels, and a series of efficient post-processing operations locates the targets. Our method runs at 5 FPS on 1000×1000 images and significantly outperforms previous methods on small targets in cluttered backgrounds.
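The seed-point step in the pipeline above (clustering the top-N objectness locations into K seed points) can be sketched with a toy k-means over 2D coordinates. The coordinates are invented, and for reproducibility this sketch initializes centers from the first k points rather than at random; it is not the paper's implementation.

```python
# Sketch of the seed-point step: cluster top-N objectness point locations
# with k-means; the K centroids serve as seed points for object potential
# regions. Deterministic initialization, for illustration only.

def kmeans(points, k, iters=20):
    centers = list(points[:k])  # deterministic init (illustration only)
    for _ in range(iters):
        # assign each point to its nearest center
        clusters = [[] for _ in range(k)]
        for p in points:
            i = min(range(k),
                    key=lambda c: (p[0] - centers[c][0]) ** 2
                                + (p[1] - centers[c][1]) ** 2)
            clusters[i].append(p)
        # move each center to the mean of its cluster
        for i, cl in enumerate(clusters):
            if cl:
                centers[i] = (sum(p[0] for p in cl) / len(cl),
                              sum(p[1] for p in cl) / len(cl))
    return centers

# Two tight groups of "top-N" objectness points; with K=2 the seed points
# converge to one centroid per group.
pts = [(9, 10), (10, 9), (11, 11), (99, 100), (100, 99), (101, 101)]
seeds = kmeans(pts, k=2)
```

Each seed then anchors a local region in which the saliency detector highlights object-like pixels, which is what keeps the per-image cost low.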
NASA Technical Reports Server (NTRS)
1992-01-01
The George M. Low Trophy is awarded to current NASA contractors, subcontractors, and suppliers in the aerospace industry who have demonstrated sustained excellence and outstanding achievements in quality and productivity for three or more years. The objectives of the award are to increase public awareness of the importance of quality and productivity to the Nation's aerospace program and industry in general; encourage domestic business to continue efforts to enhance quality, increase productivity, and thereby strengthen competitiveness; and provide the means for sharing the successful methods and techniques used by the applicants with other American enterprises. Information is given on candidate eligibility for large businesses, the selection process, the nomination letter, and the application report.
Roy, Alexis T; Carver, Courtney; Jiradejvong, Patpong; Limb, Charles J
2015-01-01
Med-El cochlear implant (CI) patients are typically programmed with either the fine structure processing (FSP) or high-definition continuous interleaved sampling (HDCIS) strategy. FSP is the newer-generation strategy and aims to provide more direct encoding of fine structure information compared with HDCIS. Since fine structure information is extremely important in music listening, FSP may offer improvements in musical sound quality for CI users. Despite widespread clinical use of both strategies, few studies have assessed the possible benefits in music perception for the FSP strategy. The objective of this study is to measure the differences in musical sound quality discrimination between the FSP and HDCIS strategies. Musical sound quality discrimination was measured using a previously designed evaluation, called Cochlear Implant-MUltiple Stimulus with Hidden Reference and Anchor (CI-MUSHRA). In this evaluation, participants were required to detect sound quality differences between an unaltered real-world musical stimulus and versions of the stimulus in which various amounts of bass (low) frequency information were removed via a high-pass filter. Eight CI users, currently using the FSP strategy, were enrolled in this study. In the first session, participants completed the CI-MUSHRA evaluation with their FSP strategy. Patients were then programmed with the clinical-default HDCIS strategy, which they used for 2 months to allow for acclimatization. After acclimatization, each participant returned for the second session, during which they were retested with HDCIS, and then switched back to their original FSP strategy and tested acutely. Sixteen normal-hearing (NH) controls completed a CI-MUSHRA evaluation for comparison, in which NH controls listened to music samples under normal acoustic conditions, without CI stimulation. 
Sensitivity to high-pass filtering more closely resembled that of NH controls when CI users were programmed with the clinical-default FSP strategy compared with performance when programmed with HDCIS (mixed-design analysis of variance, p < 0.05). The clinical-default FSP strategy offers improvements in musical sound quality discrimination for CI users with respect to bass frequency perception. This improved bass frequency discrimination may in turn support enhanced musical sound quality. This is the first study that has demonstrated objective improvements in musical sound quality discrimination with the newer-generation FSP strategy. These positive results may help guide the selection of processing strategies for Med-El CI patients. In addition, CI-MUSHRA may also provide a novel method for assessing the benefits of newer processing strategies in the future.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Palta, J.
2015-06-15
Current quality assurance and quality management guidelines provided by various professional organizations are prescriptive in nature, focusing principally on performance characteristics of planning and delivery devices. However, published analyses of events in radiation therapy show that most events are often caused by flaws in clinical processes rather than by device failures. This suggests the need for the development of a quality management program that is based on integrated approaches to process and equipment quality assurance. Industrial engineers have developed various risk assessment tools that are used to identify and eliminate potential failures from a system or a process before a failure impacts a customer. These tools include, but are not limited to, process mapping, failure modes and effects analysis, and fault tree analysis. Task Group 100 of the American Association of Physicists in Medicine has developed these tools and used them to formulate an example risk-based quality management program for intensity-modulated radiotherapy. This is a prospective risk assessment approach that analyzes potential error pathways inherent in a clinical process and then ranks them according to relative risk, typically before implementation, followed by the design of a new process or modification of the existing process. Appropriate controls are then put in place to ensure that failures are less likely to occur and, if they do, they will more likely be detected before they propagate through the process, compromising treatment outcome and causing harm to the patient. Such a prospective approach forms the basis of the work of Task Group 100 that has recently been approved by the AAPM. This session will be devoted to a discussion of these tools and practical examples of how these tools can be used in a given radiotherapy clinic to develop a risk-based quality management program. 
Learning Objectives: (1) Learn how to design a process map for a radiotherapy process; (2) learn how to perform failure modes and effects analysis for a given process; (3) learn what fault trees are all about; (4) learn how to design a quality management program based upon the information obtained from process mapping, failure modes and effects analysis, and fault tree analysis. Disclosures: Dunscombe: Director, TreatSafely, LLC and Center for the Assessment of Radiological Sciences; consultant to IAEA and Varian. Thomadsen: President, Center for the Assessment of Radiological Sciences. Palta: Vice President of the Center for the Assessment of Radiological Sciences.
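The failure-modes-and-effects-analysis ranking step discussed in this session summary can be sketched generically: each failure mode is scored for severity, occurrence, and detectability, and the risk priority number (RPN = S × O × D) ranks where controls are needed first. The failure modes and scores below are invented examples, not from TG-100.

```python
# Hedged sketch of FMEA risk ranking: severity (S), occurrence (O) and
# detectability (D) scores (1-10 scales) multiply into a risk priority
# number; higher RPN means the failure mode needs controls sooner.

def rank_failure_modes(modes):
    """Return (name, RPN) pairs sorted by descending risk priority number."""
    scored = [(name, s * o * d) for name, s, o, d in modes]
    return sorted(scored, key=lambda pair: -pair[1])

modes = [
    # (failure mode, severity, occurrence, detectability) -- invented
    ("wrong patient plan loaded",        9, 2, 4),
    ("MLC leaf position miscalibrated",  7, 3, 3),
    ("image registration error",         8, 4, 5),
]
ranking = rank_failure_modes(modes)
# highest-risk mode first; quality controls target the top of the list
```

In a real program the ranked list feeds back into process redesign, so that the highest-RPN failure paths gain detection or prevention barriers before they reach the patient.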
[Infrastructure and contents of clinical data management plan].
Shen, Tong; Xu, Lie-dong; Fu, Hai-jun; Liu, Yan; He, Jia; Chen, Ping-yan; Song, Yu-fei
2015-11-01
Establishment of a quality management system (QMS) plays a critical role in clinical data management (CDM). The objectives of CDM are to ensure the quality and integrity of trial data. Thus, every stage or element that may impact the quality outcomes of clinical studies should be kept under control, across the full life cycle of CDM associated with the collection, handling, and statistical analysis of trial data. Based on the QMS, this paper provides consensus on how to develop a compliant clinical data management plan (CDMP). According to the essential requirements of CDM, the CDMP should encompass each process of data collection, data capture and cleaning, medical coding, data verification and reconciliation, database monitoring and management, external data transmission and integration, data documentation, and data quality assurance. Creating and following up the data management plan at each designed data management step, dynamically recording the systems used, actions taken, and parties involved, will build and confirm regulated data management processes, standard operating procedures, and effective quality metrics in all data management activities. The CDMP is one of the most important data management documents and is the solid foundation for clinical data quality.
Roldan, Stephanie M
2017-01-01
One of the fundamental goals of object recognition research is to understand how a cognitive representation produced from the output of filtered and transformed sensory information facilitates efficient viewer behavior. Given that mental imagery strongly resembles perceptual processes in both cortical regions and subjective visual qualities, it is reasonable to question whether mental imagery facilitates cognition in a manner similar to that of perceptual viewing: via the detection and recognition of distinguishing features. Categorizing the feature content of mental imagery holds potential as a reverse pathway by which to identify the components of a visual stimulus which are most critical for the creation and retrieval of a visual representation. This review will examine the likelihood that the information represented in visual mental imagery reflects distinctive object features thought to facilitate efficient object categorization and recognition during perceptual viewing. If it is the case that these representational features resemble their sensory counterparts in both spatial and semantic qualities, they may well be accessible through mental imagery as evaluated through current investigative techniques. In this review, methods applied to mental imagery research and their findings are reviewed and evaluated for their efficiency in accessing internal representations, and implications for identifying diagnostic features are discussed. An argument is made for the benefits of combining mental imagery assessment methods with diagnostic feature research to advance the understanding of visual perceptive processes, with suggestions for avenues of future investigation.
Business process modeling in healthcare.
Ruiz, Francisco; Garcia, Felix; Calahorra, Luis; Llorente, César; Gonçalves, Luis; Daniel, Christel; Blobel, Bernd
2012-01-01
The importance of the process point of view is not restricted to a specific enterprise sector. In the field of health, as a result of the nature of the service offered, health institutions' processes are also the basis for decision making, which is focused on achieving their objective of providing quality medical assistance. In this chapter, the application of business process modelling using the Business Process Modelling Notation (BPMN) standard is described. The main challenges of business process modelling in healthcare are the definition of healthcare processes, the multi-disciplinary nature of healthcare, the flexibility and variability of the activities involved in healthcare processes, the need for interoperability between multiple information systems, and the continuous updating of scientific knowledge in healthcare.
Harvesting geographic features from heterogeneous raster maps
NASA Astrophysics Data System (ADS)
Chiang, Yao-Yi
2010-11-01
Raster maps offer a great deal of geospatial information and are easily accessible compared to other geospatial data. However, harvesting geographic features locked in heterogeneous raster maps to obtain the geospatial information is challenging. This is because of the varying image quality of raster maps (e.g., scanned maps with poor image quality and computer-generated maps with good image quality), the overlapping geographic features in maps, and the typical lack of metadata (e.g., map geocoordinates, map source, and original vector data). Previous work on map processing is typically limited to a specific type of map and often relies on intensive manual work. In contrast, this thesis investigates a general approach that does not rely on any prior knowledge and requires minimal user effort to process heterogeneous raster maps. This approach includes automatic and supervised techniques to process raster maps for separating individual layers of geographic features from the maps and recognizing geographic features in the separated layers (i.e., detecting road intersections, generating and vectorizing road geometry, and recognizing text labels). The automatic technique eliminates user intervention by exploiting common map properties of how road lines and text labels are drawn in raster maps. For example, road lines are elongated linear objects and characters are small connected objects. The supervised technique utilizes labels of road and text areas to handle complex raster maps, or maps with poor image quality, and can process a variety of raster maps with minimal user input. The results show that the general approach can handle raster maps with varying map complexity, color usage, and image quality.
By matching extracted road intersections to another geospatial dataset, we can identify the geocoordinates of a raster map and further align the raster map, separated feature layers from the map, and recognized features from the layers with the geospatial dataset. The road vectorization and text recognition results outperform state-of-the-art commercial products, with considerably less user input. The approach in this thesis allows us to make use of the geospatial information of heterogeneous maps locked in raster format.
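The automatic layer-separation heuristic described above relies on simple geometric properties: road lines are elongated linear objects, while characters are small connected objects. A minimal sketch of that idea follows; the toy grid, size threshold, and elongation threshold are illustrative assumptions, not the thesis's actual algorithm.

```python
def connected_components(grid):
    """4-connected components of a binary raster, as lists of (y, x) pixels."""
    h, w = len(grid), len(grid[0])
    seen, comps = set(), []
    for sy in range(h):
        for sx in range(w):
            if grid[sy][sx] and (sy, sx) not in seen:
                stack, comp = [(sy, sx)], []
                seen.add((sy, sx))
                while stack:  # iterative flood fill
                    y, x = stack.pop()
                    comp.append((y, x))
                    for ny, nx in ((y - 1, x), (y + 1, x), (y, x - 1), (y, x + 1)):
                        if 0 <= ny < h and 0 <= nx < w \
                                and grid[ny][nx] and (ny, nx) not in seen:
                            seen.add((ny, nx))
                            stack.append((ny, nx))
                comps.append(comp)
    return comps

def classify(comp, max_text_size=8, min_road_elongation=4):
    """Label a component 'road' if elongated, 'text' if small, else 'unknown'."""
    ys = [y for y, _ in comp]
    xs = [x for _, x in comp]
    height = max(ys) - min(ys) + 1
    width = max(xs) - min(xs) + 1
    if max(height, width) >= min_road_elongation * min(height, width):
        return "road"
    return "text" if len(comp) <= max_text_size else "unknown"

# Toy raster: one long horizontal road line and one small character blob.
grid = [[0] * 12 for _ in range(6)]
for x in range(12):
    grid[2][x] = 1                      # road: a 1x12 line
for y, x in ((4, 1), (4, 2), (5, 1)):   # text: a 3-pixel blob
    grid[y][x] = 1
labels = [classify(c) for c in connected_components(grid)]
```

A real system would of course operate on color-separated layers of scanned maps rather than a hand-built binary grid, but the same size/elongation logic drives the separation.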
Cutting Zone Temperature Identification During Machining of Nickel Alloy Inconel 718
NASA Astrophysics Data System (ADS)
Czán, Andrej; Daniš, Igor; Holubják, Jozef; Zaušková, Lucia; Czánová, Tatiana; Mikloš, Matej; Martikáň, Pavol
2017-12-01
The quality of a machined surface is affected by the quality of the cutting process. There are many parameters which influence the quality of the cutting process. The cutting temperature is one of the most important parameters that influence tool life and the quality of machined surfaces. Its identification and determination is a key objective in specialized machining processes such as dry machining of hard-to-machine materials. It is well known that the maximum temperature is obtained on the tool rake face in the vicinity of the cutting edge. A moderate level of cutting edge temperature and a low thermal shock reduce tool wear phenomena, and a low temperature gradient in the machined sublayer reduces the risk of high tensile residual stresses. The thermocouple method was used to measure the temperature directly in the cutting zone. An original thermocouple was specially developed for measuring the temperature in the cutting zone and in the surface and subsurface layers of the machined surface. This paper deals with the identification of temperature and temperature gradients during dry peripheral milling of Inconel 718. The measurements were used to identify the temperature gradients and to reconstruct the thermal distribution in the cutting zone under various cutting conditions.
Physical analyses of compost from composting plants in Brazil.
Barreira, L P; Philippi Junior, A; Rodrigues, M S; Tenório, J A S
2008-01-01
Nowadays the composting process has shown itself to be an alternative in the treatment of municipal solid wastes by composting plants. However, although more than 50% of the waste generated by the Brazilian population is composed of matter susceptible to organic composting, this process is, still today, insufficiently developed in Brazil, due to low compost quality and lack of investments in the sector. The objective of this work was to use physical analyses to evaluate the quality of the compost produced at 14 operative composting plants in the Sao Paulo State in Brazil. For this purpose, size distribution and total inert content tests were done. The results were analyzed by grouping the plants according to their productive processes: plants with a rotating drum, plants with shredders or mills, and plants without treatment after the sorting conveyor belt. Compost quality was analyzed considering the limits imposed by the Brazilian Legislation and the European standards for inert contents. The size distribution tests showed the influence of the machinery after the sorting conveyor on the granule sizes as well as the inert content, which contributes to the presence of materials that reduce the quality of the final product.
Fields, Dail; Roman, Paul M; Blum, Terry C
2012-01-01
Objective: To examine the relationships among general management systems, patient-focused total quality management/continuous quality improvement (TQM/CQI) processes, resource availability, and multiple dimensions of substance use disorder (SUD) treatment. Data Sources/Study Setting: Data are from a nationally representative sample of 221 SUD treatment centers through the National Treatment Center Study (NTCS). Study Design: The design was a cross-sectional field study using latent variable structural equation models. The key variables are management practices, TQM/CQI practices, resource availability, and treatment center performance. Data Collection: Interviews and questionnaires provided data from treatment center administrative directors and clinical directors in 2007–2008. Principal Findings: Patient-focused TQM/CQI practices fully mediated the relationship between internal management practices and performance. The effects of TQM/CQI on performance are significantly larger for treatment centers with higher levels of staff per patient. Conclusions: Internal management practices may create a setting that supports implementation of specific patient-focused practices and protocols inherent to TQM/CQI processes. However, the positive effects of internal management practices on treatment center performance occur through use of specific patient-focused TQM/CQI practices and have more impact when greater amounts of supporting resources are present. PMID:22098342
Sewell, David K; Lilburn, Simon D; Smith, Philip L
2016-11-01
A central question in working memory research concerns the degree to which information in working memory is accessible to other cognitive processes (e.g., decision-making). Theories assuming that the focus of attention can only store a single object at a time require the focus to orient to a target representation before further processing can occur. The need to orient the focus of attention implies that single-object accounts typically predict response time costs associated with object selection even when working memory is not full (i.e., memory load is less than 4 items). For other theories that assume storage of multiple items in the focus of attention, predictions depend on specific assumptions about the way resources are allocated among items held in the focus, and how this affects the time course of retrieval of items from the focus. These broad theoretical accounts have been difficult to distinguish because conventional analyses fail to separate components of empirical response times related to decision-making from components related to selection and retrieval processes associated with accessing information in working memory. To better distinguish these response time components from one another, we analyze data from a probed visual working memory task using extensions of the diffusion decision model. Analysis of model parameters revealed that increases in memory load resulted in (a) reductions in the quality of the underlying stimulus representations in a manner consistent with a sample size model of visual working memory capacity and (b) systematic increases in the time needed to selectively access a probed representation in memory. The results are consistent with single-object theories of the focus of attention. The results are also consistent with a subset of theories that assume a multiobject focus of attention in which resource allocation diminishes both the quality and accessibility of the underlying representations. 
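The diffusion-model decomposition described above separates representation quality (drift rate) from other response-time components. A toy simulation, under assumed parameter values (the drift rates, boundary, and load manipulation below are illustrative, not fitted values from the study), shows how lowering the drift rate alone yields slower and less accurate responses:

```python
import random

def ddm_trial(drift, boundary=1.0, dt=0.001, noise=1.0, t0=0.3, rng=random):
    """Simulate one diffusion-decision trial: evidence starts at 0 and
    accumulates until it crosses +boundary (correct) or -boundary (error)."""
    x, t = 0.0, 0.0
    sd = noise * dt ** 0.5  # per-step noise for Euler discretization
    while abs(x) < boundary:
        x += drift * dt + rng.gauss(0.0, sd)
        t += dt
    return (x > 0), t0 + t  # (correct?, response time in seconds)

rng = random.Random(7)
# Higher memory load -> weaker stimulus representation -> lower drift rate.
low_load = [ddm_trial(2.0, rng=rng) for _ in range(300)]
high_load = [ddm_trial(0.8, rng=rng) for _ in range(300)]
acc_low = sum(c for c, _ in low_load) / 300
acc_high = sum(c for c, _ in high_load) / 300
```

In the study itself, load also affected the selection/retrieval (non-decision) component; modeling that would mean additionally varying `t0` with load.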
The Comprehensive Evaluation of Electronic Learning Tools and Educational Software (CEELTES)
ERIC Educational Resources Information Center
Karolcík, Štefan; Cipková, Elena; Hrušecký, Roman; Veselský, Milan
2015-01-01
Despite the fact that digital technologies are more and more used in the learning and education process, there is still a lack of professional evaluation tools capable of assessing the quality of the digital teaching aids used in a comprehensive and objective manner. Construction of the Comprehensive Evaluation of Electronic Learning Tools and…
The size and quality of soil organic matter (SOM) pool can vary between ecosystems and can affect many soil properties. The objective of this study was to examine the relationship between gross N transformation rates and microbial populations and to investigate the role that SOM...
USDA-ARS?s Scientific Manuscript database
Certain roasted peanut quality sensory attributes are important breeding objectives for peanut product manufacturers and consumers. Currently the only means of measuring these traits is the use of a trained sensory panel. This is a costly and time-consuming process. It is desirable, from a cost, ti...
78 FR 49337 - Direct Grant Programs and Definitions That Apply to Department Regulations
Federal Register 2010, 2011, 2012, 2013, 2014
2013-08-13
... Department's grant process with the Secretary's policy objectives and allow Department programs to design....210(c) (Quality of the Project Design) (Amended Sec. Sec. 75.209 and 75.210); 6. Authorize program... regulations. We group major issues according to subject. Analysis of Comments and Changes: An analysis of the...
USDA-ARS?s Scientific Manuscript database
The Agricultural Policy/Environmental eXtender (APEX) is a watershed-scale water quality model that includes detailed representation of agricultural management but currently does not have microbial fate and transport simulation capabilities. The objective of this work was to develop a process-based ...
ERIC Educational Resources Information Center
Trueman, Stephen; Borrell-Damian, Lidia; Smith, John H.
2014-01-01
The modernisation process of universities has historically highlighted the necessity of providing support structures to facilitate contacts and relationships between research groups and the outside environment, with the objective of increasing the quantity and improving the quality of collaborative research activity. The first steps in this…
42 CFR 37.44 - Approval of radiographic facilities that use digital radiography systems.
Code of Federal Regulations, 2014 CFR
2014-10-01
... effective management, safety, and proper performance of chest image acquisition, digitization, processing... digital chest radiographs by submitting to NIOSH digital radiographic image files of a test object (e.g... radiographic image files from six or more sample chest radiographs that are of acceptable quality to one or...
Overarching objectives for the development of the East Fork Watershed Test Bed in Southwestern Ohio include: 1) providing research infrastructure for integrating risk assessment and management research on the scale of a large multi-use watershed (1295 km2); 2) Focusing on process...
Health Information Exchange: The Determinants of Usage and the Impact on Utilization
ERIC Educational Resources Information Center
Vest, Joshua Ryan
2010-01-01
Health information exchange (HIE) is the process of electronically sharing patient-level information among different organizations with the objectives of quality and cost improvements. The adoption of HIE in the United States is not widespread, but numerous efforts at facilitating HIE exist and the incentives for electronic health record system…
In the process of adapting the Agency's Data Quality Objectives Workshop for presentation at an ORD Research Facility, ownership and consensus approval of the presentation by the Division's research staff was sought. Three groups of researchers, at various levels of responsibilit...
The Professional Doctorate: From Anglo-Saxon to European Challenges
ERIC Educational Resources Information Center
Huisman, Jeroen; Naidoo, Rajani
2006-01-01
This paper addresses the debate on the third cycle of European higher education. Currently, much attention is paid to improving the structure and quality of doctorate education in the European context of the Bologna process and the Lisbon objectives. However, alternatives to the traditional doctorate are hardly addressed in the policy documents of…
Liquefaction of torrefied wood using microwave irradiation
Mengchao Zhou; Thomas Eberhardt; Pingping Xin; Chung-Yun Hse; Hui Pan
2016-01-01
Torrefaction is an effective pretreatment method to improve the uniformity and quality of lignocellulosic biomass before further thermal processing (e.g., gasification, combustion). The objective of this study was to determine the impacts of torrefaction as a pretreatment before liquefaction. Wood chips were torrefied for 2 h at three different temperatures (230, 260,...
Chapter 6. Landscape Analysis for Habitat Monitoring
Samuel A. Cushman; Kevin McGarigal; Kevin S. McKelvey; Christina D. Vojta; Claudia M. Regan
2013-01-01
The primary objective of this chapter is to describe standardized methods for measuring and monitoring attributes of landscape pattern in support of habitat monitoring. This chapter describes the process of monitoring categorical landscape maps in which either selected habitat attributes or different classes of habitat quality are represented as different patch types...
Influence of Students' Feedback on the Quality of Adult Higher Distance Education Service Delivery
ERIC Educational Resources Information Center
Oduaran, Akpovire
2017-01-01
The evaluation of a program's compliance with the service delivery and features necessary for the attainment of the program's educational objectives, student outcomes and continuous improvement is an important element in program accreditation and the continuous improvement process. The study reported in this paper investigated the possible effects of…
Hydrological processes and model representation: impact of soft data on calibration
J.G. Arnold; M.A. Youssef; H. Yen; M.J. White; A.Y. Sheshukov; A.M. Sadeghi; D.N. Moriasi; J.L. Steiner; Devendra Amatya; R.W. Skaggs; E.B. Haney; J. Jeong; M. Arabi; P.H. Gowda
2015-01-01
Hydrologic and water quality models are increasingly used to determine the environmental impacts of climate variability and land management. Due to differing model objectives and differences in monitored data, there are currently no universally accepted procedures for model calibration and validation in the literature. In an effort to develop accepted model calibration...
Feasibility of Jujube peeling using novel infrared radiation heating technology
USDA-ARS?s Scientific Manuscript database
Infrared (IR) radiation heating has a promising potential to be used as a sustainable and effective method to eliminate the use of water and chemicals in the jujube-peeling process and enhance the quality of peeled products. The objective of this study was to investigate the feasibility of using IR he...
USDA-ARS?s Scientific Manuscript database
The objective of this research was to develop an integrated process to produce biogas and high-quality particleboard using saline creeping wild ryegrass (CWR), Leymus triticoides through anaerobic digestion (AD). Besides producing biogas, AD also serves as a pretreatment method to remove the wax la...
USDA-ARS?s Scientific Manuscript database
Phosphorus (P) recovery and re-use will become increasingly important for water quality protection and sustainable nutrient cycling as environmental regulations become stricter and global P reserves decline. The objective of this study was to examine and characterize several magnesium phosphates re...
Demonstration tests of infrared peeling system with electrical emitters for tomatoes
USDA-ARS?s Scientific Manuscript database
Infrared (IR) dry-peeling is an emerging technology that could avoid the drawbacks of steam and lye peeling of tomatoes. The objective of this research was to evaluate the performance of an IR peeling system at two tomato processing plants located in California and to compare product quality, peela...
Publicly disclosed information about the quality of health care: response of the US public
Schneider, E; Lieberman, T
2001-01-01
Public disclosure of information about the quality of health plans, hospitals, and doctors continues to be controversial. The US experience of the past decade suggests that sophisticated quality measures and reporting systems that disclose information on quality have improved the process and outcomes of care in limited ways in some settings, but these efforts have not led to the "consumer choice" market envisaged. Important reasons for this failure include limited salience of objective measures to consumers, the complexity of the task of interpretation, and insufficient use of quality results by organised purchasers and insurers to inform contracting and pricing decisions. Nevertheless, public disclosure may motivate quality managers and providers to undertake changes that improve the delivery of care. Efforts to measure and report information about quality should remain public, but may be most effective if they are targeted to the needs of institutional and individual providers of care. Key Words: public disclosure; quality of health care; quality improvement PMID:11389318
Information Filtering via Heterogeneous Diffusion in Online Bipartite Networks
Zhang, Fu-Guo; Zeng, An
2015-01-01
The rapid expansion of the Internet brings us an overwhelming amount of online information, more than any individual can go through. Therefore, recommender systems were created to help people dig through this abundance of information. In networks composed of users and objects, recommender algorithms based on diffusion have proven to be among the best performing methods. Previous works considered the diffusion process from user to object and from object to user to be equivalent. We show in this work that this is not the case, and we improve the quality of the recommendation by taking into account the asymmetrical nature of this process. We apply this idea to modify state-of-the-art recommendation methods. The simulation results show that the new methods can outperform the existing methods in both recommendation accuracy and diversity. Finally, this modification is shown to also improve recommendation in a realistic case. PMID:26125631
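The diffusion-based recommendation this abstract builds on is commonly implemented as mass diffusion (ProbS) on the user-object bipartite network: resource placed on a target user's collected objects spreads to users and back to objects. A minimal sketch of that symmetric baseline follows (the toy adjacency matrix is an assumption for illustration; this is the standard method the authors modify, not their asymmetric variant):

```python
import numpy as np

def probs_scores(A, user):
    """Mass-diffusion (ProbS) recommendation scores for one user.

    A: binary user-object adjacency matrix, shape (n_users, n_objects).
    Resource starts on the target user's collected objects, spreads to
    users, then back to objects; already-collected items are masked out.
    """
    k_obj = A.sum(axis=0)            # object degrees
    k_user = A.sum(axis=1)           # user degrees
    f0 = A[user].astype(float)       # initial resource on collected objects
    # object -> user: each object splits its resource among its users
    u = A @ (f0 / np.maximum(k_obj, 1))
    # user -> object: each user splits resource among collected objects
    f = A.T @ (u / np.maximum(k_user, 1))
    f[A[user] == 1] = -np.inf        # do not re-recommend collected items
    return f

A = np.array([[1, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 1, 1]])
scores = probs_scores(A, 0)
ranked = np.argsort(-scores)  # objects ordered by recommendation score
```

The asymmetry studied in the paper amounts to weighting the two spreading steps differently instead of treating them as equivalent.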
NASA Astrophysics Data System (ADS)
Ayadi, Omar; Felfel, Houssem; Masmoudi, Faouzi
2017-07-01
The current manufacturing environment has changed from the traditional single plant to multi-site supply chains where multiple plants serve customer demands. In this article, a tactical multi-objective, multi-period, multi-product, multi-site supply-chain planning problem is proposed. A corresponding optimization model aiming to simultaneously minimize the total cost, maximize product quality and maximize the customer demand satisfaction level is developed. The proposed solution approach yields a front of Pareto-optimal solutions that represents the trade-offs among the different objectives. Subsequently, the analytic hierarchy process method is applied to select the best Pareto-optimal solution according to the preferences of the decision maker. The robustness of the solutions and the proposed approach are discussed based on a sensitivity analysis and an application to a real case from the textile and apparel industry.
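The AHP selection step described above can be sketched as follows: priority weights for the three objectives come from the principal eigenvector of a pairwise comparison matrix, and each Pareto-optimal solution is scored by its weighted (normalized) criteria. The judgment matrix and solution values below are hypothetical, not figures from the article:

```python
import numpy as np

def ahp_weights(M):
    """Priority weights from a pairwise comparison matrix via the
    principal eigenvector (the standard AHP prioritization step)."""
    vals, vecs = np.linalg.eig(M)
    i = np.argmax(vals.real)           # Perron (principal) eigenvalue
    w = np.abs(vecs[:, i].real)
    return w / w.sum()

# Hypothetical judgments: cost vs. quality vs. customer satisfaction.
M = np.array([[1.0, 3.0, 5.0],
              [1 / 3, 1.0, 3.0],
              [1 / 5, 1 / 3, 1.0]])
w = ahp_weights(M)

# Normalized scores of three Pareto-optimal solutions (rows) on the
# three criteria (columns); higher is better on every criterion.
solutions = np.array([[0.9, 0.5, 0.4],
                      [0.6, 0.8, 0.7],
                      [0.3, 0.9, 0.9]])
best = int(np.argmax(solutions @ w))  # index of the preferred solution
```

With these judgments the decision maker weights cost most heavily, so the cost-dominant solution wins; a consistency-ratio check on `M` would normally accompany this step.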
Chung, Eun-Sung; Lee, Kil Seong
2009-03-01
The objective of this study is to develop an alternative evaluation index (AEI) in order to determine the priorities of a range of alternatives using both the hydrological simulation program in FORTRAN (HSPF) and multicriteria decision making (MCDM) techniques. In order to formulate the HSPF model, sensitivity analyses of water quantity (peak discharge and total volume) and quality (BOD peak concentrations and total loads) are conducted and a number of critical parameters are selected. To achieve a more precise simulation, the study watershed is divided into four regions for calibration and verification according to land use, location, slope, and climate data. All evaluation criteria were selected using the Driver-Pressure-State-Impact-Response (DPSIR) model, a sustainability evaluation concept. The Analytic Hierarchy Process is used to estimate the weights of the criteria, and the effects of water quantity and quality were quantified by HSPF simulation. In addition, AEIs that reflect residents' preferences for management objectives are proposed in order to induce stakeholders to participate in the decision making process.
Moreno-Martínez, Francisco Javier; Montoro, Pedro R.
2012-01-01
This work presents a new set of 360 high quality colour images belonging to 23 semantic subcategories. Two hundred and thirty-six Spanish speakers named the items and also provided data on seven relevant psycholinguistic variables known to affect stimulus processing: age of acquisition, familiarity, manipulability, name agreement, typicality and visual complexity. Furthermore, we also present lexical frequency data derived from Internet search hits. Apart from the high number of variables evaluated, this new set presents important advantages over other similar image corpora: (a) this corpus presents a broad number of subcategories and images, which will permit researchers to select stimuli of appropriate difficulty as required (e.g., to deal with problems derived from ceiling effects); (b) the use of coloured stimuli provides a more realistic, ecologically valid representation of real-life objects. In sum, this set of stimuli provides a useful tool for research on visual object- and word-processing, both in neurological patients and in healthy controls. PMID:22662166
Medical Image Processing Server applied to Quality Control of Nuclear Medicine.
NASA Astrophysics Data System (ADS)
Vergara, C.; Graffigna, J. P.; Marino, E.; Omati, S.; Holleywell, P.
2016-04-01
This paper is framed within the area of medical image processing and aims to present the process of installation, configuration and implementation of a medical image processing server (MIPS) at the Fundación Escuela de Medicina Nuclear (FUESMEN), located in Mendoza, Argentina. It has been developed in the Gabinete de Tecnologia Médica (GA.TE.ME), Facultad de Ingeniería-Universidad Nacional de San Juan. MIPS is a software system that, using the DICOM standard, can receive medical imaging studies from different modalities or viewing stations, execute algorithms, and finally return the results to other devices. To achieve the objectives previously mentioned, preliminary tests were conducted in the laboratory. Moreover, the tools were remotely installed in a clinical environment. The appropriate protocols for setting up and using them in different services were established once the suitable algorithms were defined. Finally, it is important to focus on the implementation and training provided at FUESMEN, using nuclear medicine quality control processes. Results of the implementation are presented in this work.
Evaluation of Distance Course Effectiveness - Exploring the Quality of Interactive Processes
NASA Astrophysics Data System (ADS)
Botelho, Francisco Villa Ulhôa; Vicari, Rosa Maria
Understanding the dynamics of learning processes implies an understanding of their components: individuals, environment or context, and mediation. It is known that distance learning (DL) has a distinctive characteristic in relation to the mediation component. Due to the need to overcome the barriers of distance and time, DL intensively uses information and communication technologies (ICT) to perform interactive processes. Construction of effective learning environments depends on human relationships. It also depends on the emotionality placed on such relationships. Therefore, knowing how to act in virtual environments so as to create the ambiance required to animate learning processes has a unique importance. This is the theme of this study. Its general objectives were achieved and can be summarized as follows: analyze indexes that are significant for evaluations of distance course effectiveness; investigate to what extent the effectiveness of DL courses is correlated with the quality of interactive processes; and search for characteristics of the conversations of individuals interacting in study groups formed in virtual environments which may contribute to the effectiveness of distance courses.
Restabilizing attachment to cultural objects. Aesthetics, emotions and biography.
Benzecry, Claudio E
2015-12-01
The scholarship on aesthetics and materiality has studied how objects help shape identity, social action and subjectivity. Objects, as 'equipment for living' (Luhmann 2000), become the 'obligatory passage points' humans have to contend with in order to pursue their projects (Latour 1991). They provide patterns onto which bodies can unconsciously latch, or help human agents work towards particular states of being (DeNora 2000, 2003). Objects are central in the long-term process of taste construction, as any attachment to an object is made out of a delicate equilibrium of mediators, bodies, situations and techniques (Hennion and Fouquet 2001; Hennion and Gomart 1999). In all of these accounts objects are the end result of long-term processes of stabilization, in which the actual material object (a musical piece, a sculpture, an art installation, a glass of wine, the oeuvre of Bach as we know it) is both a result and yet a key co-producer of its own generation. Whereas the literature has been generous and detailed in exploring the processes of assembling and sustaining object-centered attachments, it has not sufficiently engaged with what happens when the aesthetic elements of cultural artifacts that have produced emotional resonance are transformed: what do these artifacts morph into? What explains the transition (or not) of different cultural objects? And, relatedly, what happens to the key aesthetic qualities that were so central to how the objects had been defined, and to those who have emotionally attached to them? To answer these questions, this article uses as exemplars two different cases of attachment, predicated on the distinctive features of a cultural object (the transcendence of opera and the authenticity of a soccer jersey) that have undergone transformations. © London School of Economics and Political Science 2015.
Training in intensive care medicine. A challenge within reach.
Castellanos-Ortega, A; Rothen, H U; Franco, N; Rayo, L A; Martín-Loeches, I; Ramírez, P; Cuñat de la Hoz, J
2014-01-01
The medical training model is currently immersed in a process of change. The new paradigm is intended to be more effective, more integrated within the healthcare system, and strongly oriented towards the direct application of knowledge to clinical practice. Compared with the established training system based on certification of the completion of a series of rotations and stays in certain healthcare units, the new model proposes a more structured training process based on the gradual acquisition of specific competences, in which residents must play an active role in designing their own training program. Training based on competences guarantees learning that is more transparent, updated, homogeneous and of objective quality, and which can be homologated internationally. The tutors play a key role as the main directors of the process, and institutional commitment to their work is crucial. In this context, tutors should receive time and specific training to allow the evaluation of training as the cornerstone of the new model. New forms of objective summative and formative evaluation should be introduced to guarantee that the predefined competences and skills are effectively acquired. The free movement of specialists within Europe is very desirable and implies that training quality must be high and amenable to homologation among the different countries. The Competency-Based Training in Intensive Care Medicine in Europe program is our main reference for achieving this goal. Scientific societies in turn must promote and facilitate all those initiatives destined to improve healthcare quality and therefore specialist training. They have the mission of designing strategies and processes that favor training, accreditation and advisory activities with the government authorities. Copyright © 2013 Elsevier España, S.L. y SEMICYUC. All rights reserved.
DOE Office of Scientific and Technical Information (OSTI.GOV)
NONE
2015-06-15
Current quality assurance and quality management guidelines provided by various professional organizations are prescriptive in nature, focusing principally on performance characteristics of planning and delivery devices. However, published analyses of events in radiation therapy show that most events are often caused by flaws in clinical processes rather than by device failures. This suggests the need for the development of a quality management program that is based on integrated approaches to process and equipment quality assurance. Industrial engineers have developed various risk assessment tools that are used to identify and eliminate potential failures from a system or a process before a failure impacts a customer. These tools include, but are not limited to, process mapping, failure modes and effects analysis, fault tree analysis. Task Group 100 of the American Association of Physicists in Medicine has developed these tools and used them to formulate an example risk-based quality management program for intensity-modulated radiotherapy. This is a prospective risk assessment approach that analyzes potential error pathways inherent in a clinical process and then ranks them according to relative risk, typically before implementation, followed by the design of a new process or modification of the existing process. Appropriate controls are then put in place to ensure that failures are less likely to occur and, if they do, they will more likely be detected before they propagate through the process, compromising treatment outcome and causing harm to the patient. Such a prospective approach forms the basis of the work of Task Group 100 that has recently been approved by the AAPM. This session will be devoted to a discussion of these tools and practical examples of how these tools can be used in a given radiotherapy clinic to develop a risk-based quality management program.
Learning Objectives: Learn how to design a process map for a radiotherapy process; learn how to perform failure modes and effects analysis for a given process; learn what fault trees are about; learn how to design a quality management program based upon the information obtained from process mapping, failure modes and effects analysis, and fault tree analysis. Dunscombe: Director, TreatSafely, LLC and Center for the Assessment of Radiological Sciences; Consultant to IAEA and Varian. Thomadsen: President, Center for the Assessment of Radiological Sciences. Palta: Vice President of the Center for the Assessment of Radiological Sciences.
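The ranking step described above can be sketched numerically. In FMEA, each failure mode is commonly scored for occurrence, severity, and lack of detectability, and the product of the three gives a risk priority number (RPN) used to rank risks. The failure modes and scores below are invented for illustration, not taken from the TG-100 report.

```python
# Illustrative FMEA ranking: RPN = occurrence x severity x detectability,
# each scored on a 1-10 scale. All failure modes and scores are hypothetical.

def risk_priority_number(occurrence, severity, detectability):
    """Product of the three 1-10 scores; higher means riskier."""
    return occurrence * severity * detectability

failure_modes = [
    ("wrong patient plan loaded",   2, 9, 4),
    ("MLC leaf calibration drift",  5, 6, 3),
    ("incorrect CT density table",  3, 8, 6),
]

# Rank failure modes from highest to lowest RPN.
ranked = sorted(
    ((name, risk_priority_number(o, s, d)) for name, o, s, d in failure_modes),
    key=lambda item: item[1],
    reverse=True,
)
for name, rpn in ranked:
    print(f"{name}: RPN = {rpn}")
```

The highest-RPN modes are the ones a risk-based quality management program would target first with additional controls.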
TU-AB-BRD-03: Fault Tree Analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dunscombe, P.
2015-06-15
TU-AB-BRD-02: Failure Modes and Effects Analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Huq, M.
2015-06-15
The quality of instruments to assess the process of shared decision making: A systematic review
Bomhof-Roordink, Hanna; Smith, Ian P.; Scholl, Isabelle; Stiggelbout, Anne M.; Pieterse, Arwen H.
2018-01-01
Objective To inventory instruments assessing the process of shared decision making and appraise their measurement quality, taking into account the methodological quality of their validation studies. Methods In a systematic review we searched seven databases (PubMed, Embase, Emcare, Cochrane, PsycINFO, Web of Science, Academic Search Premier) for studies investigating instruments measuring the process of shared decision making. Per identified instrument, we assessed the level of evidence separately for 10 measurement properties following a three-step procedure: 1) appraisal of the methodological quality using the COnsensus-based Standards for the selection of health status Measurement INstruments (COSMIN) checklist, 2) appraisal of the psychometric quality of the measurement property using three possible quality scores, 3) best-evidence synthesis based on the number of studies, their methodological and psychometric quality, and the direction and consistency of the results. The study protocol was registered at PROSPERO: CRD42015023397. Results We included 51 articles describing the development and/or evaluation of 40 shared decision-making process instruments: 16 patient questionnaires, 4 provider questionnaires, 18 coding schemes and 2 instruments measuring multiple perspectives. There is an overall lack of evidence for their measurement quality, either because validation is missing or methods are poor. The best-evidence synthesis indicated positive results for a major part of instruments for content validity (50%) and structural validity (53%) if these were evaluated, but negative results for a major part of instruments when inter-rater reliability (47%) and hypotheses testing (59%) were evaluated. Conclusions Due to the lack of evidence on measurement quality, the choice of the most appropriate instrument can best be based on the instrument’s content and characteristics such as the perspective that they assess.
We recommend refinement and validation of existing instruments, and the use of COSMIN-guidelines to help guarantee high-quality evaluations. PMID:29447193
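The best-evidence synthesis step can be illustrated with a small sketch: per-study ratings of a measurement property ('+', '-', or '?') are combined into one overall verdict. The combination rule below is a simplified illustration of the idea, not the exact COSMIN synthesis algorithm.

```python
# Simplified best-evidence synthesis: combine per-study property ratings
# ('+' positive, '-' negative, '?' indeterminate) into an overall verdict.
# This rule is an illustration only, not the COSMIN-specified algorithm.

def synthesize(ratings):
    informative = [r for r in ratings if r in "+-"]
    if not informative:
        return "unknown"
    if all(r == "+" for r in informative):
        return "positive"
    if all(r == "-" for r in informative):
        return "negative"
    return "conflicting"

print(synthesize(["+", "+", "?"]))   # positive
print(synthesize(["+", "-"]))        # conflicting
```

A real synthesis would additionally weight each rating by the methodological quality of its study, as the three-step procedure above requires.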
DOE Office of Scientific and Technical Information (OSTI.GOV)
NONE
Quality and safety in healthcare are inextricably linked. There are compelling data that link poor quality radiation therapy to inferior patient survival. Radiation Oncology clinical trial protocol deviations often involve incorrect target volume delineation or dosing, akin to radiotherapy incidents which also often involve partial geometric miss or improper radiation dosing. When patients with radiation protocol variations are compared to those without significant protocol variations, clinical outcome is negatively impacted. Traditionally, quality assurance in radiation oncology has been driven largely by new technological advances, and safety improvement has been driven by reactive responses to past system failures and prescriptive mandatesmore » recommended by professional organizations and promulgated by regulators. Prescriptive approaches to quality and safety alone often do not address the huge variety of process and technique used in radiation oncology. Risk-based assessments of radiotherapy processes provide a mechanism to enhance quality and safety, both for new and for established techniques. It is imperative that we explore such a paradigm shift at this time, when expectations from patients as well as providers are rising while available resources are falling. There is much we can learn from our past experiences to be applied towards the new risk-based assessments. Learning Objectives: Understand the impact of clinical and technical quality on outcomes Understand the importance of quality care in radiation oncology Learn to assess the impact of quality on clinical outcomes D. Followill, NIH Grant CA180803.« less
Multiscale visual quality assessment for cluster analysis with self-organizing maps
NASA Astrophysics Data System (ADS)
Bernard, Jürgen; von Landesberger, Tatiana; Bremm, Sebastian; Schreck, Tobias
2011-01-01
Cluster analysis is an important data mining technique for analyzing large amounts of data, reducing many objects to a limited number of clusters. Cluster visualization techniques aim at supporting the user in better understanding the characteristics and relationships among the found clusters. While promising approaches to visual cluster analysis already exist, these usually fall short of incorporating the quality of the obtained clustering results. However, due to the nature of the clustering process, quality plays an important role, as for most practical data sets, typically many different clusterings are possible. Being aware of clustering quality is important to judge the expressiveness of a given cluster visualization, or to adjust the clustering process with refined parameters, among others. In this work, we present an encompassing suite of visual tools for quality assessment of an important visual cluster algorithm, namely, the Self-Organizing Map (SOM) technique. We define, measure, and visualize the notion of SOM cluster quality along a hierarchy of cluster abstractions. The quality abstractions range from simple scalar-valued quality scores up to the structural comparison of a given SOM clustering with output of additional supportive clustering methods. The suite of methods allows the user to assess the SOM quality on the appropriate abstraction level, and arrive at improved clustering results. We implement our tools in an integrated system, apply it on experimental data sets, and show its applicability.
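Two of the scalar-valued SOM quality scores mentioned above are standard in the literature: quantization error (mean distance from each sample to its best-matching unit) and topographic error (fraction of samples whose two best units are not map neighbours). The sketch below computes both for a toy 2x2 map; the grid, unit weights, and data are illustrative values, not from the paper.

```python
import numpy as np

# Two standard SOM quality scores, sketched for a small flat map.
# The 2x2 grid, unit weight vectors, and data are illustrative toy values.

def best_two_units(sample, weights):
    d = np.linalg.norm(weights - sample, axis=1)
    order = np.argsort(d)
    return order[0], order[1], d[order[0]]

def quantization_error(data, weights):
    # mean distance from each sample to its best-matching unit
    return float(np.mean([best_two_units(s, weights)[2] for s in data]))

def topographic_error(data, weights, grid):
    # fraction of samples whose two best units are not map neighbours
    errs = 0
    for s in data:
        first, second, _ = best_two_units(s, weights)
        if abs(grid[first][0] - grid[second][0]) + \
           abs(grid[first][1] - grid[second][1]) > 1:
            errs += 1
    return errs / len(data)

weights = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
grid = [(0, 0), (1, 0), (0, 1), (1, 1)]        # unit positions on the map
data = np.array([[0.1, 0.0], [0.9, 1.0]])
print(quantization_error(data, weights))        # ≈ 0.1
print(topographic_error(data, weights, grid))   # 0.0
```

Low scores on both measures indicate a map that both fits the data and preserves its topology, which is the kind of evidence the visual quality-assessment suite surfaces at higher abstraction levels.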
Formation of the image on the receiver of thermal radiation
NASA Astrophysics Data System (ADS)
Akimenko, Tatiana A.
2018-04-01
The formation of the thermal picture of the observed scene, with verification of the quality of the thermal images obtained, is one of the important stages of the technological process that determines the quality of a thermal imaging observation system. In this article we propose a model for the formation of the thermal picture of a scene which takes into account the features of the object of observation as the source of the signal, and signal transmission through the physical elements of the thermal imaging system, which perform signal processing at the optical, photoelectronic and electronic stages and thereby determine the final parameters of the signal and its compliance with the requirements for thermal information and measurement systems.
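A minimal linear version of such a formation model treats the optics as blurring the scene radiance with a point-spread function (PSF), the detector as applying a gain, and the electronics as adding noise. The PSF, gain, and scene values below are illustrative assumptions, not parameters from the article.

```python
import numpy as np

# Minimal linear sketch of thermal image formation: optics blur the scene
# radiance with a PSF, the detector applies a gain, electronics add noise.
# All parameter values here are illustrative, not from the article.

def form_image(scene, psf, gain=1.0, noise_sigma=0.0, rng=None):
    if rng is None:
        rng = np.random.default_rng(0)
    pad_r, pad_c = psf.shape[0] // 2, psf.shape[1] // 2
    padded = np.pad(scene, ((pad_r, pad_r), (pad_c, pad_c)), mode="edge")
    blurred = np.zeros_like(scene, dtype=float)
    for i in range(scene.shape[0]):
        for j in range(scene.shape[1]):
            region = padded[i:i + psf.shape[0], j:j + psf.shape[1]]
            blurred[i, j] = (region * psf).sum()
    return gain * blurred + rng.normal(0.0, noise_sigma, scene.shape)

scene = np.full((4, 4), 300.0)      # uniform scene radiance (arbitrary units)
psf = np.full((3, 3), 1.0 / 9.0)    # normalized uniform blur kernel
image = form_image(scene, psf)      # a uniform scene stays uniform
```

Checking that known inputs (here, a uniform scene) pass through the chain unchanged is one simple way to verify image quality at each processing stage, in the spirit of the model described.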
NASA Astrophysics Data System (ADS)
Shahiri, Amirah Mohamed; Husain, Wahidah; Rashid, Nur'Aini Abd
2017-10-01
Huge amounts of data in educational datasets may cause problems in producing quality data. Recently, data mining approaches have been increasingly used by educational data mining researchers for analyzing data patterns. However, many research studies have concentrated on selecting suitable learning algorithms instead of performing a feature selection process. As a result, these data suffer from computational complexity and require longer computation time for classification. The main objective of this research is to provide an overview of feature selection techniques that have been used to analyze the most significant features. This research then proposes a framework to improve the quality of students' datasets. The proposed framework uses filter- and wrapper-based techniques to support the prediction process in future study.
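A filter-based technique of the kind the framework uses can be sketched in a few lines: rank each feature by the absolute Pearson correlation between its values and the class labels, and keep the top k. The student features and labels below are invented toy data, not from the study.

```python
import math

# Minimal filter-style feature selection: score each feature by absolute
# Pearson correlation with the label and keep the top k. Toy data only.

def pearson(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)   # assumes neither feature nor label is constant

def select_top_k(features, labels, k):
    scored = sorted(features.items(),
                    key=lambda kv: abs(pearson(kv[1], labels)),
                    reverse=True)
    return [name for name, _ in scored[:k]]

features = {
    "attendance": [9, 8, 7, 3, 2],    # correlates with passing
    "shoe_size":  [40, 42, 38, 41, 39],  # irrelevant feature
}
labels = [1, 1, 1, 0, 0]              # pass/fail
print(select_top_k(features, labels, 1))   # ['attendance']
```

Wrapper-based selection would instead score candidate feature subsets by the accuracy of a trained classifier, which is more expensive but accounts for feature interactions.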
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sandoval, D; Mlady, G; Selwyn, R
Purpose: To bring together radiologists, technologists, and physicists to utilize post-processing techniques in digital radiography (DR) in order to optimize image acquisition and improve image quality. Methods: Sub-optimal images acquired on a new General Electric (GE) DR system were flagged for follow-up by radiologists and reviewed by technologists and medical physicists. Various exam types from adult musculoskeletal (n=35), adult chest (n=4), and pediatric (n=7) were chosen for review. 673 total images were reviewed. These images were processed using five customized algorithms provided by GE. An image score sheet was created allowing the radiologist to assign a numeric score to each of the processed images; this allowed for objective comparison with the original images. Each image was scored based on seven properties: 1) overall image look, 2) soft tissue contrast, 3) high contrast, 4) latitude, 5) tissue equalization, 6) edge enhancement, 7) visualization of structures. Additional space allowed for comments not captured in scoring categories. Radiologists scored the images from 1 – 10, with 1 being non-diagnostic quality and 10 being superior diagnostic quality. Scores for each custom algorithm for each image set were summed. The algorithm with the highest score for each image set was then set as the default processing. Results: Images placed into the PACS “QC folder” for image processing reasons decreased. Feedback from radiologists was, overall, that image quality for these studies had improved. All default processing for these image types was changed to the new algorithm. Conclusion: This work is an example of the collaboration between radiologists, technologists, and physicists at the University of New Mexico to add value to the radiology department.
The significant amount of work required to prepare the processing algorithms and to reprocess and score the images was eagerly taken on by all team members in order to produce better quality images and improve patient care.
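The score-sheet arithmetic described in the Methods is simple to sketch: sum the seven per-property scores (1-10 each) for every candidate algorithm and choose the highest total as the default. The algorithm names and scores below are made up for illustration.

```python
# Sketch of the score-sheet arithmetic: sum the seven property scores per
# processing algorithm and pick the highest total as the new default.
# Algorithm names and scores are hypothetical examples.

scores = {
    "algo_A": [7, 6, 8, 7, 7, 6, 8],   # seven scored image properties
    "algo_B": [8, 8, 7, 8, 9, 7, 8],
    "algo_C": [6, 5, 7, 6, 6, 6, 7],
}

totals = {name: sum(vals) for name, vals in scores.items()}
default = max(totals, key=totals.get)
print(default, totals[default])   # algo_B 55
```

In practice the totals would be accumulated over many images and readers per exam type before committing a default, as the study did across 673 reviewed images.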
Findings From a Nursing Care Audit Based on the Nursing Process: A Descriptive Study
Poortaghi, Sarieh; Salsali, Mahvash; Ebadi, Abbas; Rahnavard, Zahra; Maleki, Farzaneh
2015-01-01
Background: Although using the nursing process improves nursing care quality, few studies have evaluated nursing performance in accordance with nursing process steps either nationally or internationally. Objectives: This study aimed to audit nursing care based on a nursing process model. Patients and Methods: This was a cross-sectional descriptive study in which a nursing audit checklist was designed and validated for assessing nurses’ compliance with the nursing process. A total of 300 nurses from various clinical settings of Tehran University of Medical Sciences were selected. Data were analyzed using descriptive and inferential statistics, including frequencies, Pearson correlation coefficients and independent samples t-tests. Results: The compliance rate of nursing process indicators was 79.71 ± 0.87. Mean compliance scores did not significantly differ by education level and gender. However, overall compliance scores were correlated with nurses’ age (r = 0.26, P = 0.001) and work experience (r = 0.273, P = 0.001). Conclusions: Nursing process indicators can be used to audit nursing care. Such audits can be used as quality assurance tools. PMID:26576448
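The audit arithmetic behind a compliance rate like the one reported can be sketched directly: each checklist indicator is marked met or not met per audited nurse, a per-nurse percentage is computed, and the percentages are averaged. The checklist data below are invented, not the study's records.

```python
# Hedged sketch of audit-checklist scoring: 1 = indicator met, 0 = not met.
# A per-nurse compliance percentage is averaged across audits. Toy data.

def compliance_rate(checklist):
    return 100.0 * sum(checklist) / len(checklist)

audits = [
    [1, 1, 1, 0, 1],   # one nurse's five nursing-process indicators
    [1, 0, 1, 1, 1],
    [1, 1, 1, 1, 0],
]
rates = [compliance_rate(a) for a in audits]
mean_rate = sum(rates) / len(rates)
print(f"mean compliance: {mean_rate:.1f}%")   # mean compliance: 80.0%
```

With per-nurse rates in hand, the correlations with age and experience reported above are ordinary Pearson coefficients over the audited sample.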
Agents for Change: Nonphysician Medical Providers and Health Care Quality
Boucher, Nathan A; McMillen, Marvin A; Gould, James S
2015-01-01
Quality medical care is a clinical and public health imperative, but defining quality and achieving improved, measurable outcomes are extremely complex challenges. Adherence to best practice invariably improves outcomes. Nonphysician medical providers (NPMPs), such as physician assistants and advanced practice nurses (eg, nurse practitioners, advanced practice registered nurses, certified registered nurse anesthetists, and certified nurse midwives), may be the first caregivers to encounter the patient and can act as agents for change for an organization’s quality-improvement mandate. NPMPs are well positioned to both initiate and ensure optimal adherence to best practices and care processes from the moment of initial contact because they have robust clinical training and are integral to trainee/staff education and the timely delivery of care. The health care quality aspects that the practicing NPMP can affect are objective, appreciative, and perceptive. As bedside practitioners and participants in the administrative and team process, NPMPs can fine-tune care delivery, avoiding the problem areas defined by the Institute of Medicine: misuse, overuse, and underuse of care. This commentary explores how NPMPs can affect quality by 1) supporting best practices through the promotion of guidelines and protocols, and 2) playing active, if not leadership, roles in patient engagement and organizational quality-improvement efforts. PMID:25663213
Hassett, Brian; Singh, Ena; Mahgoub, Ehab; O'Brien, Julie; Vicik, Steven M; Fitzpatrick, Brian
2018-01-01
Etanercept (ETN) (Enbrel®) is a soluble protein that binds to, and specifically inhibits, tumor necrosis factor (TNF), a proinflammatory cytokine. ETN is synthesized in Chinese hamster ovary cells by recombinant DNA technology as a fusion protein, with a fully human TNFRII ectodomain linked to the Fc portion of human IgG1. Successful manufacture of biologics, such as ETN, requires sophisticated process and product understanding, as well as meticulous control of operations to maintain product consistency. The objective of this evaluation was to show that the product profile of ETN drug substance (DS) has been consistent over the course of production. Multiple orthogonal biochemical analyses, which included evaluation of attributes indicative of product purity, potency, and quality, were assessed on >2,000 batches of ETN from three sites of DS manufacture, during the period 1998-2015. Based on the key quality attributes of product purity (assessed by hydrophobic interaction chromatography HPLC), binding activity (to TNF by ELISA), potency (inhibition of TNF-induced apoptosis by cell-based bioassay) and quality (N-linked oligosaccharide map), we show that the integrity of ETN DS has remained consistent over time. This consistency was maintained through three major enhancements to the initial process of manufacturing that were supported by detailed comparability assessments, and approved by the European Medicines Agency. Examination of results for all major quality attributes for ETN DS indicates a highly consistent process for over 18 years and throughout changes to the manufacturing process, without affecting safety and efficacy, as demonstrated across a wide range of clinical trials of ETN in multiple inflammatory diseases.
Object positioning in storages of robotized workcells using LabVIEW Vision
NASA Astrophysics Data System (ADS)
Hryniewicz, P.; Banaś, W.; Sękala, A.; Gwiazda, A.; Foit, K.; Kost, G.
2015-11-01
During the manufacturing process, each task is previously developed and adapted to the conditions and possibilities of the manufacturing plant. The production process is supervised by a team of specialists, because any downtime causes a great loss of time and hence financial loss. Sensors used in industry for tracking and supervising the various stages of a production process make it much easier to keep the process continuous. One group of sensors used in industrial applications is non-contact sensors. This group includes light barriers, optical sensors, rangefinders, vision systems, and ultrasonic sensors. Thanks to the rapid development of electronics, vision systems have become widespread as the most flexible type of non-contact sensor. These systems consist of cameras, devices for data acquisition, devices for data analysis, and specialized software. Vision systems work well both as sensors that control the production process itself and as sensors that control the product quality level. The LabVIEW environment, together with LabVIEW Vision and LabVIEW Builder, makes it possible to program informatics systems for process and product quality control. The paper presents an application developed for positioning elements in a robotized workcell. Based on the geometric parameters of the manipulated object, or on a previously developed graphical pattern, it is possible to determine the position of particular manipulated elements. The application can work in automatic mode and in real time, cooperating with the robot control system, which makes the workcell more autonomous.
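The pattern-based positioning idea can be sketched outside LabVIEW: slide the graphical pattern over the camera image and report the offset with the best match. The sum-of-squared-differences matcher and the toy image below are illustrative stand-ins for the LabVIEW Vision machinery the paper actually uses.

```python
import numpy as np

# Template-matching sketch of object positioning (in Python, not LabVIEW):
# slide a pattern over the image and return the offset with the smallest
# sum of squared differences. Image and pattern are toy data.

def locate(image, pattern):
    ph, pw = pattern.shape
    best, best_pos = None, None
    for i in range(image.shape[0] - ph + 1):
        for j in range(image.shape[1] - pw + 1):
            ssd = ((image[i:i + ph, j:j + pw] - pattern) ** 2).sum()
            if best is None or ssd < best:
                best, best_pos = ssd, (i, j)
    return best_pos

image = np.zeros((6, 6))
image[2:4, 3:5] = 1.0           # object placed at row 2, column 3
pattern = np.ones((2, 2))       # previously developed graphical pattern
print(locate(image, pattern))   # (2, 3)
```

The returned pixel offset would then be converted to workcell coordinates and handed to the robot control system.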
Hoshin Kanri: a technique for strategic quality management.
Tennant, C; Roberts, P A
2000-01-01
This paper describes a technique for Strategic Quality Management (SQM), known as Hoshin Kanri, which has been operated as a management system in many Japanese companies since the 1960s. It represents a core aspect of Japanese companies' management systems, and is described as the means by which the overall control system and Total Quality Management (TQM) are deployed. Hoshin Kanri is not particularly unique in its concept of establishing and tracking individual goals and objectives, but the manner in which the objectives and the means to achieve them are developed and deployed is. The problem with applying the concept of SQM using Hoshin Kanri is that it can tend to challenge the traditional authoritarian strategic planning models, which have become the paradigms of modern business. Yet Hoshin Kanri provides an appropriate tool for declaration of the strategic vision for the business while integrating goals and targets in a single holistic model. There have been various adaptations of Hoshin Kanri to align the technique to Western thinking and management approaches, yet outside Japan its significance has gone largely unreported. It is proposed that Hoshin Kanri is an effective methodology for SQM, which has a number of benefits over the more conventional planning techniques. The benefits of Hoshin Kanri as a tool for SQM compared to conventional planning systems include: integration of strategic objectives with tactical daily management, the application of the plan-do-check-act cycle to business process management, parallel planning and execution methodology, a company-wide approach, improvements in communication, increased consensus and buy-in to goal setting, and cross-functional-management integration.
Tvedt, Christine; Sjetne, Ingeborg Strømseng; Helgeland, Jon; Bukholm, Geir
2012-01-01
Objectives The purpose of this study was to identify organisational processes and structures that are associated with nurse-reported patient safety and quality of nursing. Design This is an observational cross-sectional study using survey methods. Setting Respondents from 31 Norwegian hospitals with more than 85 beds were included in the survey. Participants All registered nurses working in direct patient care in a position of 20% or more were invited to answer the survey. In this study, 3618 nurses from surgical and medical wards responded (response rate 58.9%). Nurses' practice environment was defined as organisational processes and measured by the Nursing Work Index Revised and items from the Hospital Survey on Patient Safety Culture. Outcome measures Nurses' assessments of patient safety, quality of nursing, confidence in how their patients manage after discharge and frequency of adverse events were used as outcome measures. Results Quality system, nurse–physician relation, patient safety management and staff adequacy were process measures associated with nurse-reported work-related and patient-related outcomes, but we found no associations with nurse participation, education and career and ward leadership. Most organisational structures were non-significant in the multilevel model except for nurses’ affiliations to medical department and hospital type. Conclusions Organisational structures may have minor impact on how nurses perceive work-related and patient-related outcomes, but the findings in this study indicate that there is a considerable potential to address organisational design in improvement of patient safety and quality of care. PMID:23263021
Ranking Reputation and Quality in Online Rating Systems
Liao, Hao; Zeng, An; Xiao, Rui; Ren, Zhuo-Ming; Chen, Duan-Bing; Zhang, Yi-Cheng
2014-01-01
How to design an accurate and robust ranking algorithm is a fundamental problem with wide applications in many real systems. It is especially significant in online rating systems due to the existence of some spammers. In the literature, many well-performed iterative ranking methods have been proposed. These methods can effectively recognize the unreliable users and reduce their weight in judging the quality of objects, and finally lead to a more accurate evaluation of the online products. In this paper, we design an iterative ranking method with high performance in both accuracy and robustness. More specifically, a reputation redistribution process is introduced to enhance the influence of highly reputed users and two penalty factors enable the algorithm resistance to malicious behaviors. Validation of our method is performed in both artificial and real user-object bipartite networks. PMID:24819119
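The iterative idea described above can be sketched concretely: object quality is estimated as the reputation-weighted mean of its ratings, and user reputation is refreshed from how far a user's ratings sit from the current quality estimates, so spammers lose influence over successive rounds. The update rules below are simplified illustrations in the same spirit, not the paper's exact algorithm (which adds reputation redistribution and penalty factors).

```python
# Minimal iterative reputation-ranking sketch: alternate between a
# reputation-weighted quality estimate per object and a reputation update
# per user based on rating error. Simplified illustration, not the paper's
# exact algorithm. Ratings are (user, object, score) triples; toy data.

def iterate_ranking(ratings, n_users, n_objects, rounds=20):
    reputation = [1.0] * n_users
    quality = [0.0] * n_objects
    for _ in range(rounds):
        # quality: reputation-weighted average rating per object
        for o in range(n_objects):
            num = den = 0.0
            for u, obj, r in ratings:
                if obj == o:
                    num += reputation[u] * r
                    den += reputation[u]
            quality[o] = num / den if den else 0.0
        # reputation: inverse of a user's mean squared rating error
        for u in range(n_users):
            errs = [(r - quality[obj]) ** 2
                    for uu, obj, r in ratings if uu == u]
            reputation[u] = 1.0 / (sum(errs) / len(errs) + 1e-6)
    return quality, reputation

# users 0 and 1 agree; user 2 is a spammer pushing object 1
ratings = [(0, 0, 5), (1, 0, 5), (2, 0, 1),
           (0, 1, 2), (1, 1, 2), (2, 1, 5)]
quality, reputation = iterate_ranking(ratings, 3, 2)
```

After a few rounds the spammer's reputation collapses and the quality estimates converge toward the honest consensus, which is the robustness property the paper's penalty factors are designed to strengthen.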
Salorio-Corbetto, Marina; Baer, Thomas; Moore, Brian C. J.
2017-01-01
Abstract Objective: The objective was to assess the degradation of speech sound quality produced by frequency compression for listeners with extensive high-frequency dead regions (DRs). Design: Quality ratings were obtained using values of the starting frequency (Sf) of the frequency compression both below and above the estimated edge frequency, fe, of each DR. Thus, the value of Sf often fell below the lowest value currently used in clinical practice. Several compression ratios were used for each value of Sf. Stimuli were sentences processed via a prototype hearing aid based on Phonak Exélia Art P. Study sample: Five participants (eight ears) with extensive high-frequency DRs were tested. Results: Reductions of sound-quality produced by frequency compression were small to moderate. Ratings decreased significantly with decreasing Sf and increasing CR. The mean ratings were lowest for the lowest Sf and highest CR. Ratings varied across participants, with one participant rating frequency compression lower than no frequency compression even when Sf was above fe. Conclusions: Frequency compression degraded sound quality somewhat for this small group of participants with extensive high-frequency DRs. The degradation was greater for lower values of Sf relative to fe, and for greater values of CR. Results varied across participants. PMID:27724057
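The roles of Sf and CR can be illustrated with the generic textbook mapping for frequency compression: input frequencies below the start frequency Sf pass unchanged, and those above are compressed towards Sf by the compression ratio CR. This is a generic linear sketch for illustration, not Phonak's proprietary implementation (which operates nonlinearly).

```python
# Generic frequency-compression mapping: below the start frequency Sf the
# signal is unchanged; above it, the distance from Sf is divided by the
# compression ratio CR. Illustrative only, not the hearing aid's algorithm.

def compress_frequency(f_in, sf, cr):
    if f_in <= sf:
        return f_in
    return sf + (f_in - sf) / cr

print(compress_frequency(1000, 1500, 2.0))   # below Sf: unchanged
print(compress_frequency(4500, 1500, 2.0))   # 1500 + (4500-1500)/2 = 3000.0
```

Lowering Sf or raising CR moves more of the spectrum into the compressed region, which is exactly the regime where the study found the larger sound-quality degradations.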
Microbiological monitoring for the US Geological Survey National Water-Quality Assessment Program
Francy, Donna S.; Myers, Donna N.; Helsel, Dennis R.
2000-01-01
Data to characterize the microbiological quality of the Nation's fresh, marine, and estuarine waters are usually collected for local purposes, most often to judge compliance with standards for protection of public health in swimmable or drinkable waters. Methods and procedures vary with the objectives and practices of the parties collecting data and are continuously being developed or modified. Therefore, it is difficult to provide a nationally consistent picture of the microbial quality of the Nation's waters. Study objectives and guidelines for a national microbiological monitoring program are outlined in this report, using the framework of the U.S. Geological Survey (USGS) National Water-Quality Assessment (NAWQA) program. A national program is designed to provide long-term data on the presence of microbiological pathogens and indicators in ground water and surface water to support effective water policy and management. Three major groups of waterborne pathogens affect the public health acceptability of waters in the United States: bacteria, protozoa, and viruses. Microbiological monitoring in NAWQA would be designed to assess the occurrence, distribution, and trends of pathogenic organisms and indicators in surface waters and ground waters; relate the patterns discerned to factors that help explain them; and improve our understanding of the processes that control microbiological water quality.
Landon, M.K.; Delin, G.N.; Nelson, K.J.; Regan, C.P.; Lamb, J.A.; Larson, S.J.; Capel, P.D.; Anderson, J.L.; Dowdy, R.H.
1997-01-01
The Minnesota Management Systems Evaluation Area (MSEA) project was part of a multi-scale, inter-agency initiative to evaluate the effects of agricultural management systems on water quality in the midwest corn belt. The research area was located in the Anoka Sand Plain about 5 kilometers southwest of Princeton, Minnesota. The ground-water-quality monitoring network within and immediately surrounding the research area consisted of 73 observation wells and 25 multiport wells. The primary objectives of the ground-water monitoring program at the Minnesota MSEA were to: (1) determine the effects of three farming systems on ground-water quality, and (2) understand the processes and factors affecting the loading, transport, and fate of agricultural chemicals in ground water at the site. This report presents well construction, geologic, water-level, chemical application, water-quality, and quality-assurance data used to evaluate the effects of farming systems on ground-water quality during 1991-95.
NASA Astrophysics Data System (ADS)
Shope, C. L.; Maharjan, G. R.; Tenhunen, J.; Seo, B.; Kim, K.; Riley, J.; Arnhold, S.; Koellner, T.; Ok, Y. S.; Peiffer, S.; Kim, B.; Park, J.-H.; Huwe, B.
2014-02-01
Watershed-scale modeling can be a valuable tool to aid in quantification of water quality and yield; however, several challenges remain. In many watersheds, it is difficult to adequately quantify hydrologic partitioning. Data scarcity is prevalent, accuracy of spatially distributed meteorology is difficult to quantify, forest encroachment and land use issues are common, and surface water and groundwater abstractions substantially modify watershed-based processes. Our objective is to assess the capability of the Soil and Water Assessment Tool (SWAT) model to capture event-based and long-term monsoonal rainfall-runoff processes in complex mountainous terrain. To accomplish this, we developed a unique quality-control, gap-filling algorithm for interpolation of high-frequency meteorological data. We used a novel multi-location, multi-optimization calibration technique to improve estimations of catchment-wide hydrologic partitioning. The interdisciplinary model was calibrated to a unique combination of statistical, hydrologic, and plant growth metrics. Our results indicate scale-dependent sensitivity of hydrologic partitioning and substantial influence of engineered features. The addition of hydrologic and plant growth objective functions identified the importance of culverts in catchment-wide flow distribution. While this study shows the challenges of applying the SWAT model to complex terrain and extreme environments, incorporating anthropogenic features into modeling scenarios can enhance our understanding of the hydroecological impact.
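The simplest ingredient of a gap-filling algorithm like the one described is linear interpolation across short gaps in a high-frequency record: missing values are filled between the nearest valid neighbours. The sketch below shows only that interpolation step; the quality-control thresholds and cross-station checks of the actual algorithm are omitted, and the series values are invented.

```python
# Minimal gap-filling step for a meteorological time series: missing
# values (None) are linearly interpolated between the nearest valid
# neighbours. The real algorithm's quality-control logic is omitted.

def fill_gaps(series):
    filled = list(series)
    valid = [i for i, v in enumerate(filled) if v is not None]
    for a, b in zip(valid, valid[1:]):
        for i in range(a + 1, b):
            frac = (i - a) / (b - a)
            filled[i] = filled[a] + frac * (filled[b] - filled[a])
    return filled

print([round(v, 6) for v in fill_gaps([10.0, None, None, 16.0])])
# [10.0, 12.0, 14.0, 16.0]
```

Gaps at the start or end of the record have no bracketing neighbours and are left unfilled here; a production algorithm would fall back to a neighbouring station or climatology for those.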
NASA Astrophysics Data System (ADS)
Luo, Lin
2017-08-01
In the practical selection of Wushu athletes, the objective evaluation of athlete level lacks sufficient technical indicators and often relies on the coach's subjective judgment. Without a fully quantified indicator system it is difficult to reflect the overall quality of the athletes accurately and objectively, which in turn limits the level of Wushu competition. The analytic hierarchy process (AHP) is a systematic analysis method that combines quantitative and qualitative analysis. This paper uses the AHP to structure, hierarchize and quantify the decision-making process for evaluating broadsword, rod, sword and spear athletes. Based on the characteristics of the athletes, the analysis considers three aspects, i.e., the athlete's body shape, physical function and sports quality, and establishes 18 specific evaluation indicators. Combining expert advice and practical experience, pairwise comparison matrices are determined, from which the indicator weights and a comprehensive evaluation coefficient are obtained to establish the evaluation model for the athletes, thus providing a scientific theoretical basis for the selection of Wushu athletes. The proposed evaluation model realizes an evaluation system for broadsword, rod, sword and spear athletes, which has effectively improved the scientific level of Wushu athlete selection in practical application.
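As a hedged illustration of the AHP machinery the paper relies on, the sketch below derives priority weights from a pairwise comparison matrix via the principal eigenvector and checks Saaty's consistency ratio. The 3x3 matrix and its judgment values are hypothetical, not taken from the paper:

```python
import numpy as np

# Hypothetical pairwise comparisons (Saaty 1-9 scale) of the three
# top-level aspects: body shape, physical function, sports quality.
A = np.array([
    [1.0, 1/3, 1/5],   # body shape
    [3.0, 1.0, 1/2],   # physical function
    [5.0, 2.0, 1.0],   # sports quality
])

# Priority weights: principal eigenvector of A, normalized to sum to 1.
eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
w = eigvecs[:, k].real
w = w / w.sum()

# Consistency ratio: CR < 0.1 means the judgments are acceptably consistent.
n = A.shape[0]
lambda_max = eigvals.real[k]
ci = (lambda_max - n) / (n - 1)
cr = ci / 0.58          # Saaty's random index RI = 0.58 for n = 3
print(w, cr)
```

The same procedure is applied at the indicator level (here, the 18 specific indicators), and the weights are combined hierarchically into the comprehensive evaluation coefficient.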
DOE Office of Scientific and Technical Information (OSTI.GOV)
Matthews, Patrick
This Closure Report (CR) presents information supporting the clean closure of Corrective Action Unit (CAU) 412: Clean Slate I Plutonium Dispersion (TTR), located on the Tonopah Test Range, Nevada. CAU 412 consists of a release of radionuclides to the surrounding soil from a storage–transportation test conducted on May 25, 1963. Corrective action investigation (CAI) activities were performed in April and May 2015, as set forth in the Streamlined Approach for Environmental Restoration (SAFER) Plan for Corrective Action Unit 412: Clean Slate I Plutonium Dispersion (TTR), Tonopah Test Range, Nevada; and in accordance with the Soils Activity Quality Assurance Plan. The purpose of the CAI was to fulfill data needs as defined during the data quality objectives process. The CAU 412 dataset of investigation results was evaluated based on a data quality assessment. This assessment demonstrated the dataset is complete and acceptable for use in fulfilling the data needs identified by the data quality objectives process. This CR provides documentation and justification for the clean closure of CAU 412 under the FFACO without further corrective action. This justification is based on historical knowledge of the site, previous site investigations, implementation of the 1997 interim corrective action, and the results of the CAI.
The corrective action of clean closure was confirmed as appropriate for closure of CAU 412 based on achievement of the following closure objectives: radiological contamination at the site is less than the final action level under the ground troops exposure scenario (i.e., the radiological dose is less than the final action level); removable alpha contamination is less than the high contamination area criterion; no potential source material is present at the site, and any impacted soil associated with potential source material has been removed so that remaining soil contains contaminants at concentrations less than the final action levels; and there is sufficient information to characterize investigation and remediation waste for disposal.
Stocker, Elena; Toschkoff, Gregor; Sacher, Stephan; Khinast, Johannes G
2014-11-20
The purpose of this study is to evaluate the use of computer simulations for generating quantitative knowledge as a basis for risk ranking and mechanistic process understanding, as required by ICH Q9 on quality risk management systems. In this specific publication, the main focus is the demonstration of a risk assessment workflow, including a computer simulation for the generation of mechanistic understanding of active tablet coating in a pan coater. Process parameter screening studies are statistically planned under consideration of impacts on a potentially critical quality attribute, i.e., coating mass uniformity. Based on the computer simulation data, a process failure mode and effects analysis of the risk factors is performed. This results in a quantitative criticality assessment of process parameters and the risk priority evaluation of failure modes. The factor used for a quantitative reassessment of the criticality and risk priority is the coefficient of variation, which represents the coating mass uniformity. The major conclusion drawn from this work is a successful demonstration of the integration of computer simulation in the risk management workflow, leading to an objective and quantitative risk assessment.
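Since the coefficient of variation is the reassessment factor named above, a minimal sketch of its computation may help; the per-tablet coating masses below are invented for illustration, not data from the cited study:

```python
import numpy as np

# Hypothetical per-tablet coating masses (mg) from one simulated batch.
coating_mass = np.array([10.2, 9.8, 10.5, 10.1, 9.7, 10.3, 9.9, 10.4])

# Coefficient of variation = sample std / mean: the coating mass
# uniformity metric used to reassess criticality and risk priority.
cv = coating_mass.std(ddof=1) / coating_mass.mean()
print(f"CV = {cv:.3%}")
```

A lower CV means more uniform coating; in the workflow above, parameter settings whose simulations yield a high CV would be flagged as critical.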
2014-01-01
Background We recently demonstrated that the quality of spirometry in primary care could markedly improve with remote offline support from specialized professionals. It is hypothesized that implementation of automatic online assessment of spirometry quality using information and communication technologies may significantly enhance the potential for extensive deployment of a high-quality spirometry program in integrated care settings. Objective The objective of the study was to elaborate and validate a Clinical Decision Support System (CDSS) for automatic online quality assessment of spirometry. Methods The CDSS was developed through a three-step process: (1) identification of the optimal sampling frequency; (2) iterations to build up an initial version using the 24 standard spirometry curves recommended by the American Thoracic Society; and (3) iterations to refine the CDSS using 270 curves from 90 patients. In each of these steps the results were checked against one expert. Finally, 778 spirometry curves from 291 patients were analyzed for validation purposes. Results The CDSS generated appropriate online classification and certification in 685/778 (88.1%) of spirometry tests, with 96% sensitivity and 95% specificity. Conclusions Consequently, only 93/778 (11.9%) of spirometry tests required offline remote classification by an expert, indicating a potential positive role of the CDSS in the deployment of a high-quality spirometry program in an integrated care setting. PMID:25600957
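The validation metrics quoted above (96% sensitivity, 95% specificity) follow from standard confusion-matrix arithmetic. The sketch below shows the computation with hypothetical counts chosen only to reproduce similar figures; they are not the study's raw data:

```python
# Sensitivity = TP / (TP + FN); specificity = TN / (TN + FP).
def sens_spec(tp, fn, tn, fp):
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    return sensitivity, specificity

# Illustrative counts: 480 truly acceptable curves, 298 truly
# unacceptable ones (totals sum to the study's 778 curves).
se, sp = sens_spec(tp=461, fn=19, tn=283, fp=15)
print(se, sp)
```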
Characterization of the reference wave in a compact digital holographic camera.
Park, I S; Middleton, R J C; Coggrave, C R; Ruiz, P D; Coupland, J M
2018-01-01
A hologram is a recording of the interference between an unknown object wave and a coherent reference wave. Providing the object and reference waves are sufficiently separated in some region of space and the reference beam is known, a high-fidelity reconstruction of the object wave is possible. In traditional optical holography, high-quality reconstruction is achieved by careful reillumination of the holographic plate with the exact same reference wave that was used at the recording stage. To reconstruct high-quality digital holograms the exact parameters of the reference wave must be known mathematically. This paper discusses a technique that obtains the mathematical parameters that characterize a strongly divergent reference wave that originates from a fiber source in a new compact digital holographic camera. This is a lensless design that is similar in principle to a Fourier hologram, but because of the large numerical aperture, the usual paraxial approximations cannot be applied and the Fourier relationship is inexact. To characterize the reference wave, recordings of quasi-planar object waves are made at various angles of incidence using a Dammann grating. An optimization process is then used to find the reference wave that reconstructs a stigmatic image of the object wave regardless of the angle of incidence.
On the evaluation of segmentation editing tools
Heckel, Frank; Moltz, Jan H.; Meine, Hans; Geisler, Benjamin; Kießling, Andreas; D’Anastasi, Melvin; dos Santos, Daniel Pinto; Theruvath, Ashok Joseph; Hahn, Horst K.
2014-01-01
Efficient segmentation editing tools are important components in the segmentation process, as no automatic methods exist that always generate sufficient results. Evaluating segmentation editing algorithms is challenging, because their quality depends on the user’s subjective impression. So far, no established methods for an objective, comprehensive evaluation of such tools exist and, particularly, intermediate segmentation results are not taken into account. We discuss the evaluation of editing algorithms in the context of tumor segmentation in computed tomography. We propose a rating scheme to qualitatively measure the accuracy and efficiency of editing tools in user studies. In order to objectively summarize the overall quality, we propose two scores based on the subjective rating and the quantified segmentation quality over time. Finally, a simulation-based evaluation approach is discussed, which allows a more reproducible evaluation without the need for human input. This automated evaluation complements user studies, allowing a more convincing evaluation, particularly during development, where frequent user studies are not possible. The proposed methods have been used to evaluate two dedicated editing algorithms on 131 representative tumor segmentations. We show how the comparison of editing algorithms benefits from the proposed methods. Our results also show the correlation of the suggested quality score with the qualitative ratings. PMID:26158063
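The paper's exact quality-over-time score is not reproduced here, but a common building block for quantified segmentation quality is the Dice overlap between an edited segmentation and a reference. A minimal sketch with toy binary masks:

```python
import numpy as np

def dice(a, b):
    """Dice overlap between two binary masks (1.0 = identical)."""
    a, b = a.astype(bool), b.astype(bool)
    denom = a.sum() + b.sum()
    return 2.0 * np.logical_and(a, b).sum() / denom if denom else 1.0

# Toy reference mask (16 px) and an edited segmentation (12 px) that
# covers three of the reference's four columns.
ref = np.zeros((8, 8), dtype=int)
ref[2:6, 2:6] = 1
seg = np.zeros((8, 8), dtype=int)
seg[2:6, 2:5] = 1
print(dice(ref, seg))   # 2*12 / (16 + 12)
```

Tracking such an overlap score after each user interaction yields the "segmentation quality over time" curve that the proposed scores summarize.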
The impact of the condenser on cytogenetic image quality in digital microscope system.
Ren, Liqiang; Li, Zheng; Li, Yuhua; Zheng, Bin; Li, Shibo; Chen, Xiaodong; Liu, Hong
2013-01-01
Optimizing the operational parameters of a digital microscope system is an important technique for acquiring high-quality cytogenetic images and facilitating the process of karyotyping, so that the efficiency and accuracy of diagnosis can be improved. This study investigated the impact of the condenser on cytogenetic image quality and system working performance using a prototype digital microscope image scanning system. Both theoretical analysis and experimental validation, through objectively evaluating a resolution test chart and subjectively observing large numbers of specimens, were conducted. The results show that optimal image quality and a large depth of field (DOF) are simultaneously obtained when the numerical aperture of the condenser is set to 60%-70% of that of the corresponding objective. Under this condition, more analyzable chromosomes and diagnostic information are obtained. As a result, the system shows higher working stability and fewer restrictions for the implementation of algorithms such as autofocusing, especially when the system is designed to achieve high-throughput continuous image scanning. Although the above quantitative results were obtained using a specific prototype system under the experimental conditions reported in this paper, the presented evaluation methodologies can provide valuable guidelines for optimizing operational parameters in cytogenetic imaging using high-throughput continuous scanning microscopes in clinical practice.
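The resolution side of this trade-off can be sketched with the standard partially coherent imaging formula d = 1.22λ/(NA_obj + NA_cond); the wavelength and objective NA below are assumed values for illustration, not the prototype's specifications:

```python
# Minimum resolvable distance d (um) as the condenser NA is varied
# relative to the objective NA. Assumed: green light, 0.65 NA objective.
wavelength = 0.55        # um
na_obj = 0.65
d = {}
for fraction in (0.5, 0.65, 1.0):
    na_cond = fraction * na_obj
    d[fraction] = 1.22 * wavelength / (na_obj + na_cond)
    print(f"condenser at {fraction:.0%} of objective NA: d = {d[fraction]:.3f} um")
```

Raising the condenser NA improves resolution but reduces depth of field and contrast, which is why an intermediate setting around 60%-70% of the objective NA is favored in the study.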
NASA Technical Reports Server (NTRS)
Vajingortin, L. D.; Roisman, W. P.
1991-01-01
The problem of ensuring the required quality of products and/or technological processes is often complicated by the fact that there is no general theory for determining the optimal sets of values of the primary factors, i.e., of the output parameters of the parts and units comprising an object, that ensure the correspondence of the object's parameters to the quality requirements. This is the main reason for the amount of time taken to finish complex, critical articles. To create this theory, one has to overcome a number of difficulties and solve the following tasks: the creation of reliable and stable mathematical models showing the influence of the primary factors on the output parameters; finding a new technique for assigning tolerances to the primary factors with regard to economic, technological, and other criteria, the technique being based on the solution of the main problem; and the well-reasoned assignment of nominal values for the primary factors, which serve as the basis for creating tolerances. Each of the above tasks is of independent importance. An attempt is made to give solutions for this problem. The above problem of quality assurance, in its mathematically formalized aspect, is called the multiple inverse problem.
Shortell, S M; O'Brien, J L; Carman, J M; Foster, R W; Hughes, E F; Boerstler, H; O'Connor, E J
1995-01-01
OBJECTIVE: This study examines the relationships among organizational culture, quality improvement processes and selected outcomes for a sample of up to 61 U. S. hospitals. DATA SOURCES AND STUDY SETTING: Primary data were collected from 61 U. S. hospitals (located primarily in the midwest and the west) on measures related to continuous quality improvement/total quality management (CQI/TQM), organizational culture, implementation approaches, and degree of quality improvement implementation based on the Baldrige Award criteria. These data were combined with independently collected data on perceived impact and objective measures of clinical efficiency (i.e., charges and length of stay) for six clinical conditions. STUDY DESIGN: The study involved cross-sectional examination of the named relationships. DATA COLLECTION/EXTRACTION METHODS: Reliable and valid scales for the organizational culture and quality improvement implementation measures were developed based on responses from over 7,000 individuals across the 61 hospitals with an overall completion rate of 72 percent. Independent data on perceived impact were collected from a national survey and independent data on clinical efficiency from a companion study of managed care. PRINCIPAL FINDINGS: A participative, flexible, risk-taking organizational culture was significantly related to quality improvement implementation. Quality improvement implementation, in turn, was positively associated with greater perceived patient outcomes and human resource development. Larger-size hospitals experienced lower clinical efficiency with regard to higher charges and higher length of stay, due in part to having more bureaucratic and hierarchical cultures that serve as a barrier to quality improvement implementation. CONCLUSIONS: What really matters is whether or not a hospital has a culture that supports quality improvement work and an approach that encourages flexible implementation. 
Larger-size hospitals face more difficult challenges in this regard. PMID:7782222
Improving Vintage Seismic Data Quality through Implementation of Advance Processing Techniques
NASA Astrophysics Data System (ADS)
Latiff, A. H. Abdul; Boon Hong, P. G.; Jamaludin, S. N. F.
2017-10-01
It is essential in petroleum exploration to have high-resolution subsurface images, both vertically and horizontally, to uncover new geological and geophysical aspects of the subsurface. Past lack of success may have resulted from poor imaging quality, which led to inaccurate analysis and interpretation. In this work, we re-processed the existing seismic dataset with an emphasis on two objectives. Firstly, to produce better 3D seismic data quality with full retention of relative amplitudes and significantly reduced seismic and structural uncertainty. Secondly, to facilitate further prospect delineation through enhanced data resolution, fault definition and event continuity, particularly in the syn-rift section and basement cover contacts, and in turn to better understand the geology of the subsurface, especially with regard to the distribution of the fluvial and channel sands. By adding recent, state-of-the-art broadband processing techniques such as source and receiver de-ghosting, high-density velocity analysis and shallow-water de-multiple, the final results produced better overall reflection detail and frequency in specific target zones, particularly in the deeper section.
Aisopou, Angeliki; Stoianov, Ivan; Graham, Nigel J D
2012-01-01
Monitoring the quality of drinking water from the treatment plant to the consumer's tap is critical to ensure compliance with national standards and/or WHO guideline levels. There are a number of processes and factors affecting the water quality during transmission and distribution which are little understood. A significant obstacle to gaining a detailed knowledge of various physical and chemical processes and of the effect of the hydraulic conditions on water quality deterioration within water supply systems is the lack of reliable and low-cost (both capital and O & M) water quality sensors for continuous monitoring. This paper has two objectives. The first is to present a detailed evaluation of the performance of a novel in-pipe multi-parameter sensor probe for reagent- and membrane-free continuous water quality monitoring in water supply systems. The second is to describe the results from experimental research which was conducted to acquire continuous water quality and high-frequency hydraulic data for the quantitative assessment of the water quality changes occurring under steady and unsteady-state flow conditions. The laboratory and field evaluation of the multi-parameter sensor probe showed that the sensors have a rapid dynamic response, average repeatability and unreliable accuracy. The uncertainties in the sensor data present significant challenges for the analysis and interpretation of the acquired data and their use for water quality modelling, decision support and control in operational systems. Notwithstanding these uncertainties, the unique data sets acquired from transmission and distribution systems demonstrated the deleterious effect of unsteady-state flow conditions on various water quality parameters.
These studies demonstrate: (i) the significant impact of unsteady-state hydraulic conditions on the disinfectant residual, turbidity and colour, caused by the re-suspension of sediments, scouring of biofilms and tubercles from the pipe and increased mixing, and the need for further experimental research to investigate these interactions; and (ii) important advances in sensor technologies which provide unique opportunities to study both the dynamic hydraulic conditions and water quality changes in operational systems. Research in these two areas is critical to better understand and manage water quality deterioration in ageing water transmission and distribution systems.
NASA Astrophysics Data System (ADS)
Mitka, B.; Szelest, P.
2013-12-01
This paper presents issues related to the acquisition and processing of terrestrial photogrammetry and laser scanning data for building educational portals and virtual museums. It discusses the specific requirements of measurement technology and data processing for various kinds of objects, ranging from architecture through sculpture and architectural detail to fabric and individual museum exhibits. Educational portals and virtual museums require modern, high-quality visuals (3D models, virtual tours, animations, etc.) supplemented by descriptive content or audio commentary. The sources of such materials are mostly terrestrial laser scanning and photogrammetry, as technologies that provide complete geometric information about the presented objects. However, the performance requirements of web services impose severe restrictions on the presented content, so a geometry optimization process is necessary to streamline its presentation. An equally important problem concerns the selection of the appropriate measurement technology and data-processing workflow for each type of object. Only a skillful selection of measuring equipment and data-processing tools can effectively ensure a satisfactory end result. Both terrestrial laser scanning and digital close-range photogrammetry have strengths that should be exploited, but also limitations that must be taken into account in this kind of work. The key is choosing the right scanner and imaging parameters, such as pixel size, for both the measured object and the terrain.
Roudier, B; Davit, B; Schütz, H; Cardot, J-M
2015-01-01
The in vitro-in vivo correlation (IVIVC) (Food and Drug Administration 1997) aims to predict the in vivo performance of a pharmaceutical formulation based on its in vitro characteristics. It is a complex process that (i) incorporates, in a gradual and incremental way, a large amount of information and (ii) requires information from different properties (formulation, analytical, clinical) and associated dedicated treatments (statistics, modeling, simulation). This results in many studies that are initiated and integrated into the specifications (quality target product profile, QTPP). The latter defines the appropriate experimental designs (quality by design, QbD) (Food and Drug Administration 2011, 2012), whose main objectives are the determination (i) of key factors of development and manufacturing (critical process parameters, CPPs) and (ii) of critical points of a physicochemical nature relating to active pharmaceutical ingredients (APIs) and critical quality attributes (CQAs), which may have implications in terms of efficacy and safety for the patient if not included. These processes generate a very large amount of data that needs to be structured. In this context, the storage of information in a database (DB) and the management of this database (database management system, DBMS) become an important issue for the management of IVIVC projects and, more generally, for the development of new pharmaceutical forms. This article describes the implementation of a prototype object-oriented database (OODB), considered as a tool that is helpful for decision making, responding in a structured and consistent way to the issues of project management of IVIVC (including bioequivalence and bioavailability) (Food and Drug Administration 2003) necessary for the implementation of the QTPP.
Surfactant studies for bench-scale operation
NASA Technical Reports Server (NTRS)
Hickey, Gregory S.; Sharma, Pramod K.
1993-01-01
A phase 2 study has been initiated to investigate surfactant-assisted coal liquefaction, with the objective of quantifying the enhancement in liquid yields and product quality. This report covers the second quarter of work. The major accomplishments were: completion of coal liquefaction autoclave reactor runs with Illinois number 6 coal at processing temperatures of 300, 325, and 350 C, and pressures of 1800 psig; analysis of the filter cake and the filtrate obtained from the treated slurry in each run; and correlation of the coal conversions and the liquid yield quality to the surfactant concentration. An increase in coal conversions and upgrading of the liquid product quality due to surfactant addition was observed for all runs.
Revisiting the Procedures for the Vector Data Quality Assurance in Practice
NASA Astrophysics Data System (ADS)
Erdoğan, M.; Torun, A.; Boyacı, D.
2012-07-01
Immense use of topographical data in spatial data visualization, business GIS (Geographic Information Systems) solutions and applications, and mobile and location-based services has forced topo-data providers to create standard, up-to-date and complete data sets in a sustainable frame. Data quality has been studied and researched for more than two decades. There have been countless references on its semantics, its conceptual and logical representations, and many applications on spatial databases and GIS. However, there is a gap between research and practice in the sense of spatial data quality, which increases the costs and decreases the efficiency of data production. Spatial data quality is well known by academia and industry, but usually in different contexts. Research on spatial data quality has identified several issues of practical use, such as descriptive information, metadata, fulfillment of spatial relationships among data, integrity measures, geometric constraints, etc. The industry and data producers realize them in three stages: pre-, co- and post-data capturing. The pre-data capturing stage covers semantic modelling, data definition, cataloguing, modelling, data dictionary and schema creation processes. The co-data capturing stage covers general rules of spatial relationships, data- and model-specific rules such as topologic and model building relationships, geometric thresholds, data extraction guidelines, and object-object, object-belonging class, object-non-belonging class and class-class relationships to be taken into account during data capturing. The post-data capturing stage covers specified QC (quality check) benchmarks and checking compliance with general and specific rules. The vector data quality criteria differ between the views of producers and users, but these criteria are generally driven by the needs, expectations and feedback of the users. This paper presents a practical method which closes the gap between theory and practice.
Turning spatial data quality concepts into development and application requires the existence of conceptual, logical and, most importantly, physical data models, rules and knowledge of their realization in the form of geo-spatial data. The applicable metrics and thresholds are determined on this concrete base. This study discusses the application of geo-spatial data quality issues and QA (quality assurance) and QC procedures in topographic data production. First we introduce the MGCP (Multinational Geospatial Co-production Program) data profile of the NATO (North Atlantic Treaty Organization) DFDD (DGIWG Feature Data Dictionary), the requirements of the data owner, the view of data producers for both data capturing and QC, and finally QA to fulfil user needs. Then our practical new approach, which divides quality into three phases, is introduced. Finally, the implementation of our approach to accomplish the metrics, measures and thresholds of the quality definitions is discussed. In this paper, especially geometry and semantics quality and the quality control procedures that can be performed by the producers are discussed. Some applicable best practices that we have experienced, covering quality control techniques and regulations that define the objectives and data production procedures, are given in the final remarks. These quality control procedures should include visual checks of the source data, captured vector data and printouts, some automatic checks that can be performed by software, and some semi-automatic checks involving interaction with quality control personnel. Finally, these quality control procedures should ensure the geometric, semantic, attribution and metadata quality of vector data.
Agyeman-Duah, Josephine Nana Afrakoma; Theurer, Antje; Munthali, Charles; Alide, Noor; Neuhann, Florian
2014-01-02
Knowledge regarding the best approaches to improving the quality of healthcare, and their implementation, is lacking in many resource-limited settings. The Medical Department of Kamuzu Central Hospital in Malawi set out to improve the quality of care provided to its patients and to establish itself as a recognized centre for teaching, operations research and supervision of district hospitals. Past efforts to achieve these objectives were short-lived and largely unsuccessful. Against this background, a situational analysis was performed to help the Medical Department define and prioritize its quality improvement activities. A mix of quantitative and qualitative methods was applied, using checklists for observed practice, review of registers, key informant interviews and structured patient interviews. The mixed methods provided triangulation by including the perspectives of the clients, of healthcare providers from within and outside the department, and of the field researcher, by means of document review and participatory observation. Human resource shortages, staff attitudes and shortages of equipment were identified as major constraints on patient care and the running of the Medical Department. Processes, including documentation in registers and files and communication within and across cadres of staff, were also found to be insufficient, thus undermining the efforts of staff and management to establish a sustained high-quality culture. Depending on their past experience and knowledge, the stakeholder interviewees revealed different perspectives on and expectations of quality healthcare and the intended quality improvement process. Establishing a quality improvement process in resource-limited settings is an enormous task, considering the host of challenges these facilities face.
The steps towards changing the status quo for improved quality care require critical self-assessment, the willingness to change as well as determined commitment and contributions from clients, staff and management.
Tamjidy, Mehran; Baharudin, B. T. Hang Tuah; Paslar, Shahla; Matori, Khamirul Amin; Sulaiman, Shamsuddin; Fadaeifard, Firouz
2017-01-01
The development of Friction Stir Welding (FSW) has provided an alternative approach for producing high-quality welds in a fast and reliable manner. This study focuses on the mechanical properties of the dissimilar friction stir welding of AA6061-T6 and AA7075-T6 aluminum alloys. FSW process parameters such as tool rotational speed, tool traverse speed, tilt angle, and tool offset significantly influence the mechanical properties of the friction stir welded joints. A mathematical regression model is developed to determine the empirical relationship between the FSW process parameters and the mechanical properties, and the results are validated. In order to obtain the optimal values of the process parameters that simultaneously optimize the ultimate tensile strength, elongation, and minimum hardness in the heat affected zone (HAZ), a metaheuristic, multi-objective algorithm based on biogeography-based optimization is proposed. The Pareto optimal frontiers for triple and dual objective functions are obtained, and the best optimal solution is selected using two different decision-making techniques: the technique for order of preference by similarity to ideal solution (TOPSIS) and Shannon's entropy. PMID:28772893
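As a hedged sketch of the TOPSIS step used to pick a single solution from a Pareto set, the example below ranks three hypothetical FSW parameter outcomes on three benefit criteria; the candidate matrix and weights are invented for illustration, not the paper's data:

```python
import numpy as np

# Rows: three Pareto-optimal candidates. Columns: ultimate tensile
# strength (MPa), elongation (%), HAZ minimum hardness (HV); all are
# treated as benefit criteria (higher is better). Values are hypothetical.
X = np.array([
    [210.0, 8.5, 62.0],
    [225.0, 7.2, 58.0],
    [218.0, 8.0, 60.0],
])
w = np.array([0.5, 0.3, 0.2])          # assumed criterion weights (sum to 1)

# TOPSIS: vector-normalize, weight, measure distances to the ideal and
# anti-ideal points, then rank by the closeness coefficient.
V = w * X / np.linalg.norm(X, axis=0)
ideal, anti = V.max(axis=0), V.min(axis=0)
d_plus = np.linalg.norm(V - ideal, axis=1)
d_minus = np.linalg.norm(V - anti, axis=1)
closeness = d_minus / (d_plus + d_minus)
best = int(np.argmax(closeness))
print(closeness, best)
```

Cost criteria (to be minimized) would swap the roles of column max and min when forming the ideal and anti-ideal points; here all three criteria are benefits.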
Quality Indicators for Safe Medication Preparation and Administration: A Systematic Review
Maaskant, Jolanda M.; de Boer, Monica; Krediet, C. T. Paul; Nieveen van Dijkum, Els J. M.
2015-01-01
Background One-third of all medication errors causing harm to hospitalized patients occur in the medication preparation and administration phase, which is predominantly a nursing activity. To monitor, evaluate and improve the quality and safety of this process, evidence-based quality indicators can be used. Objectives The aim of this study was to identify evidence-based quality indicators (structure, process and outcome) for safe in-hospital medication preparation and administration. Methods MEDLINE, EMBASE and CINAHL were searched for relevant studies published up to January 2015. Additionally, nine databases were searched to identify relevant grey literature. Two reviewers independently selected studies if (1) the method for quality indicator development combined a literature search with expert panel opinion, (2) the study contained quality indicators on medication safety, and (3) any of the quality indicators were applicable to hospital medication preparation and administration. A multidisciplinary team appraised the studies independently using the AIRE instrument, which contains four domains and 20 items. Quality indicators applicable to in-hospital medication preparation and administration were extracted using a structured form. Results The search identified 1683 studies, of which 64 were reviewed in detail and five met the inclusion criteria. Overall, according to the AIRE domains, all studies were clear on purpose; most of them applied stakeholder involvement and used evidence reasonably; usage of the indicators in practice was scarcely described. A total of 21 quality indicators were identified: 5 structure indicators (e.g. safety management and high-alert medication), 11 process indicators (e.g. verification and protocols) and 5 outcome indicators (e.g. harm and death). These quality indicators partially cover the 7 rights.
Conclusion Despite the relatively small number of included studies, the identified quality indicators can serve as an excellent starting point for further development of nursing-specific quality indicators for medication safety. Especially on the right patient, right route, right time and right documentation, there is room for future development of quality indicators. PMID:25884623
NASA Technical Reports Server (NTRS)
1992-01-01
The George M. Low Trophy is awarded to current NASA contractors, subcontractors, and suppliers in the aerospace industry who have demonstrated sustained excellence and outstanding achievements in quality and productivity for three or more years. The objectives of the award are to increase public awareness of the importance of quality and productivity to the Nation's aerospace program and industry in general; encourage domestic business to continue efforts to enhance quality, increase productivity, and thereby strengthen competitiveness; and provide the means for sharing the successful methods and techniques used by the applicants with other American enterprises. Information is given on candidate eligibility for large businesses, the selection process, the nomination letter, and the application report. The 1992 highlights and recipients are included.
Multi Objective Optimization of Yarn Quality and Fibre Quality Using Evolutionary Algorithm
NASA Astrophysics Data System (ADS)
Ghosh, Anindya; Das, Subhasis; Banerjee, Debamalya
2013-03-01
The quality and cost of the resulting yarn play a significant role in determining its end application. The challenging task of any spinner lies in producing a good quality yarn with added cost benefit. The present work performs a multi-objective optimization on two objectives, viz. maximization of cotton yarn strength and minimization of raw material quality. The first objective function has been formulated based on the artificial neural network input-output relation between cotton fibre properties and yarn strength. The second objective function is formulated with the well-known regression equation of the spinning consistency index. These two objectives are clearly conflicting in nature, i.e. no single combination of cotton fibre parameters exists that produces maximum yarn strength and minimum cotton fibre quality simultaneously. Therefore, the problem has several optimal solutions, from which a trade-off is needed depending upon the requirements of the user. In this work, the optimal solutions are obtained with an elitist multi-objective evolutionary algorithm, the Non-dominated Sorting Genetic Algorithm II (NSGA-II). These optimum solutions may lead to the efficient exploitation of raw materials to produce better quality yarns at low cost.
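The trade-off described above rests on Pareto dominance, which NSGA-II uses for its non-dominated sorting step. A minimal sketch of extracting the first (non-dominated) front, under the assumption that both objectives are minimized (the sample objective vectors are hypothetical, with yarn strength negated so that minimizing it maximizes strength):

```python
def dominates(a, b):
    """a dominates b if it is no worse on every objective (minimization)
    and strictly better on at least one."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_front(points):
    """Return the non-dominated subset (first front) of objective vectors."""
    return [p for p in points if not any(dominates(q, p) for q in points if q != p)]

# Hypothetical points: (negated yarn strength, fibre-quality index), both minimized
pts = [(-20, 5.0), (-18, 4.2), (-21, 6.1), (-17, 4.0), (-19, 5.5)]
front = pareto_front(pts)
# (-19, 5.5) is dominated by (-20, 5.0): stronger yarn AND cheaper fibre.
```

Full NSGA-II repeats this sorting to assign ranks, then breaks ties within a front by crowding distance to preserve solution diversity.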
NASA Astrophysics Data System (ADS)
Belyaev, P. S.; Mishchenko, S. V.; Belyaev, V. P.; Belousov, O. A.; Frolov, V. A.
2018-04-01
The objects of this study are petroleum road bitumen and a polymer-bitumen binder for road surfaces obtained by modifying bitumen with polymer materials. The subject of the study is monitoring changes in polymer-bitumen binder quality as the bitumen modification process is varied. The purpose of the work is to identify the patterns of the modification process and build a mathematical model that supports the calculation and selection of technological equipment. It is shown that production of polymer-bitumen binder with specified quality parameters can be ensured in apparatuses with agitators operating in turbulent mode, without the use of colloidal mills. Limiting indicators for the bitumen mix and modifying additives, which can serve as constraints (inequalities) in the mathematical model, are defined. A mathematical model for polymer-bitumen binder preparation has been developed and its adequacy confirmed.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dickerson, Patricia O'Donnell; Summa, Deborah Ann; Liu, Cheng
The goals of this project were to demonstrate reliable, reproducible solid state bonding of aluminum 6061 alloy plates together to encapsulate DU-10 wt% Mo surrogate fuel foils. This was done as part of the CONVERT Fuel Fabrication Capability effort in Process Baseline Development. Bonding was done using Hot Isostatic Pressing (HIP) of evacuated stainless steel cans (a.k.a. HIP cans) containing fuel plate components and strongbacks. Gross macroscopic measurements of HIP cans prior to HIP and after HIP were used as part of this demonstration, and were used to determine the accuracy of a finite element model of the HIP bonding process. The quality of the bonding was measured by controlled miniature bulge testing for Al-Al, Al-Zr, and Zr-DU bonds. A special objective was to determine if the HIP process consistently produces good quality bonding and to determine the best characterization techniques for technology transfer.
Guideline for primary care management of headache in adults
Becker, Werner J.; Findlay, Ted; Moga, Carmen; Scott, N. Ann; Harstall, Christa; Taenzer, Paul
2015-01-01
Abstract Objective To increase the use of evidence-informed approaches to diagnosis, investigation, and treatment of headache for patients in primary care. Quality of evidence A comprehensive search was conducted for relevant guidelines and systematic reviews published between January 2000 and May 2011. The guidelines were critically appraised using the AGREE (Appraisal of Guidelines for Research and Evaluation) tool, and the 6 highest-quality guidelines were used as seed guidelines for the guideline adaptation process. Main message A multidisciplinary guideline development group of primary care providers and other specialists crafted 91 specific recommendations using a consensus process. The recommendations cover diagnosis, investigation, and management of migraine, tension-type, medication-overuse, and cluster headache. Conclusion A clinical practice guideline for the Canadian health care context was created using a guideline adaptation process to assist multidisciplinary primary care practitioners in providing evidence-informed care for patients with headache. PMID:26273080
Sherman, L A
1999-07-01
Outcomes data in medicine can be limited by subjective methodologic issues such as poor selection of end points and use of nonvalidated systems for quality adjustment. Blood transfusion analyses are further complicated by the fact that transfusion seldom is primary therapy but is usually supportive or adjunctive. Thus, much of the outcome data in transfusion medicine are either unavailable or in one of two areas. The first area is prevention of bad sequelae of various cytopenias or factor deficiencies. The second is decreasing adverse effects of transfusion itself. A different useful area for outcome and root cause approaches in individual institutions is examining their own preanalytical and postanalytical processes. Examples are sample labeling accuracy, quality and timeliness of blood suppliers, internal delivery processes and times, and product wastage. Use review can be shifted from retrospective to real time. By reducing complaints about service to objective data, realistic change can be made in internal and external processes.
Physical/chemical closed-loop water-recycling
NASA Technical Reports Server (NTRS)
Herrmann, Cal C.; Wydeven, Theodore
1991-01-01
Water needs, water sources, and means for recycling water are examined in terms appropriate to the water quality requirements of a small crew and spacecraft intended for long duration exploration missions. Inorganic, organic, and biological hazards are estimated for waste water sources. Sensitivities to these hazards for human uses are estimated. The water recycling processes considered are humidity condensation, carbon dioxide reduction, waste oxidation, distillation, reverse osmosis, pervaporation, electrodialysis, ion exchange, carbon sorption, and electrochemical oxidation. Limitations and applications of these processes are evaluated in terms of water quality objectives. Computerized simulation of some of these chemical processes is examined. Recommendations are made for development of new water recycling technology and improvement of existing technology for near term application to life support systems for humans in space. The technological developments are equally applicable to water needs on Earth, in regions where extensive water recycling is needed or where advanced water treatment is essential to meet EPA health standards.
The Delphi Method: An Approach for Facilitating Evidence Based Practice in Athletic Training
ERIC Educational Resources Information Center
Sandrey, Michelle A.; Bulger, Sean M.
2008-01-01
Objective: The growing importance of evidence based practice in athletic training is necessitating academics and clinicians to be able to make judgments about the quality or lack of the body of research evidence and peer-reviewed standards pertaining to clinical questions. To assist in the judgment process, consensus methods, namely brainstorming,…
ERIC Educational Resources Information Center
Trelle, Alexandra N.; Henson, Richard N.; Green, Deborah A. E.; Simons, Jon S.
2017-01-01
In a Yes/No object recognition memory test with similar lures, older adults typically exhibit elevated rates of false recognition. However, the contributions of impaired retrieval, relative to reduced availability of target details, are difficult to disentangle using such a test. The present investigation sought to decouple these factors by…
ERIC Educational Resources Information Center
Saied, Hala; James, Joemol; Singh, Evangelin Jeya; Al Humaied, Lulawah
2016-01-01
Clinical training is of paramount importance in nursing education and clinical evaluation is one of the most challenging responsibilities of nursing faculty. The use of objective tools and criteria and involvement of the students in the evaluation process are some techniques to facilitate quality learning in the clinical setting. Aim: The aim of…
USDA-ARS?s Scientific Manuscript database
Microbial contamination of waters is a critical public health issue. Watershed-scale, process-based modeling of bacteria fate and transport (F&T) has proven to be a useful tool for predicting microbial water quality and evaluating management practices. The objective of this work is...
Faculty Recommendations for Web Tools: Implications for Course Management Systems
ERIC Educational Resources Information Center
Oliver, Kevin; Moore, John
2008-01-01
A gap analysis of web tools in Engineering was undertaken as one part of the Digital Library Network for Engineering and Technology (DLNET) grant funded by NSF (DUE-0085849). DLNET represents a Web portal and an online review process to archive quality knowledge objects in Engineering and Technology disciplines. The gap analysis coincided with the…
ERIC Educational Resources Information Center
Zima, Bonnie T.; Bussing, Regina; Tang, Lingqi; Zhang, Lily; Ettner, Susan; Belin, Thomas R.; Wells, Kenneth B.
2010-01-01
Objective: To examine whether clinical severity is greater among children receiving attention-deficit/hyperactivity disorder (ADHD) care in primary care compared with those in specialty mental health clinics, and to examine how care processes and clinical outcomes vary by sector across three 6-month time intervals. Method: This was a longitudinal…
USDA-ARS?s Scientific Manuscript database
Certain roasted peanut quality sensory attributes are very important breeding objectives for peanut manufacturers and consumers. Currently the only means of measuring these traits is the use of a trained sensory panel. This is a costly and time-consuming process. It is desirable, from a cost, time an...
The role of the landscape architect in applied forest landscape management: a case study on process
Wayne Tlusty
1979-01-01
Land planning allocations are often multi-resource concepts, with visual quality objectives addressing the appropriate level of visual resource management. Current legislation and/or regulations often require interdisciplinary teams to implement planning decisions. A considerable amount of information is currently available on visual assessment techniques both for...
ERIC Educational Resources Information Center
Cardone, Kenneth; Paine, Mary
Activities for grades 4, 5, 6, and junior high acquaint students with consumer and economic problems, particularly how people spend money and methods used in advertising. The guide opens with a vocabulary list. Then, five objectives, using hypothetical situations, introduce the student to the decisions involved in spending money wisely. For…
NASA EEE Parts and Advanced Interconnect Program (AIP)
NASA Technical Reports Server (NTRS)
Gindorf, T.; Garrison, A.
1996-01-01
From the Program Objectives: I. Accelerate the readiness of new technologies through development of validation, assessment, and test methods/tools. II. Provide NASA projects with infusion paths for emerging technologies. III. Provide NASA projects with technology selection, application, and validation guidelines for hardware and processes. IV. Disseminate quality assurance, reliability, validation, tools, and availability information to the NASA community.
Laminated thermoplastic composite material from recycled high density polyethylene
NASA Technical Reports Server (NTRS)
Liu, Ping; Waskom, Tommy L.
1994-01-01
The design of a materials-science, educational experiment is presented. The student should understand the fundamentals of polymer processing and mechanical property testing of materials. The ability to use American Society for Testing and Materials (ASTM) standards is also necessary for designing material test specimens and testing procedures. The objectives of the experiment are (1) to understand the concept of laminated composite materials, processing, testing, and quality assurance of thermoplastic composites and (2) to observe an application example of recycled plastics.
A Hybrid Interval–Robust Optimization Model for Water Quality Management
Xu, Jieyu; Li, Yongping; Huang, Guohe
2013-01-01
Abstract In water quality management problems, uncertainties may exist in many system components and pollution-related processes (i.e., random nature of hydrodynamic conditions, variability in physicochemical processes, dynamic interactions between pollutant loading and receiving water bodies, and indeterminacy of available water and treated wastewater). These complexities lead to difficulties in formulating and solving the resulting nonlinear optimization problems. In this study, a hybrid interval–robust optimization (HIRO) method was developed through coupling stochastic robust optimization and interval linear programming. HIRO can effectively reflect the complex system features under uncertainty, where implications of water quality/quantity restrictions for achieving regional economic development objectives are studied. By delimiting the uncertain decision space through dimensional enlargement of the original chemical oxygen demand (COD) discharge constraints, HIRO enhances the robustness of the optimization processes and resulting solutions. This method was applied to planning of industry development in association with river-water pollution concern in New Binhai District of Tianjin, China. Results demonstrated that the proposed optimization model can effectively communicate uncertainties into the optimization process and generate a spectrum of potential inexact solutions supporting local decision makers in managing benefit-effective water quality management schemes. HIRO is helpful for analysis of policy scenarios related to different levels of economic penalties, while also providing insight into the tradeoff between system benefits and environmental requirements. PMID:23922495
DOE Office of Scientific and Technical Information (OSTI.GOV)
Johnson, Gary E.; Diefenderfer, Heida L.; Ebberts, Blaine D.
The purpose of this document is to describe research, monitoring, and evaluation (RME) for the Federal Columbia River Estuary Program. The intent of this RME effort is to provide data and information to evaluate progress toward meeting program goals and objectives and support decision-making in the Estuary Program. The goal of the Estuary Program is to understand, conserve, and restore the estuary ecosystem to improve the performance of listed salmonid populations. The Estuary Program has five general objectives, designed to fulfill the program goal, as follows. 1. Understand the primary stressors affecting ecosystem controlling factors, such as ocean conditions and invasive species. 2. Conserve and restore factors controlling ecosystem structures and processes, such as hydrodynamics and water quality. 3. Increase the quantity and quality of ecosystem structures, i.e., habitats, that juvenile salmonids use during migration through the estuary. 4. Maintain the food web to benefit salmonid performance. 5. Improve salmonid performance in terms of life history diversity, foraging success, growth, and survival. The goal of estuary RME is to provide pertinent and timely research and monitoring information to planners, implementers, and managers of the Estuary Program. In conclusion, the estuary RME effort is designed to meet the research and monitoring needs of the Estuary Program using an adaptive management process. Estuary RME's success and usefulness will depend on the actual conduct of adaptive management, as embodied in the objectives, implementation, data, reporting, synthesis, evaluation, and decision-making described herein.
The Validation by Measurement Theory of Proposed Object-Oriented Software Metrics
NASA Technical Reports Server (NTRS)
Neal, Ralph D.
1996-01-01
Moving software development into the engineering arena requires controllability, and to control a process, it must be measurable. Measuring the process does no good if the product is not also measured, i.e., being the best at producing an inferior product does not define a quality process. Also, not every number extracted from software development is a valid measurement. A valid measurement only results when we are able to verify that the number is representative of the attribute that we wish to measure. Many proposed software metrics are used by practitioners without these metrics ever having been validated, leading to costly but often useless calculations. Several researchers have bemoaned the lack of scientific precision in much of the published software measurement work and have called for validation of software metrics by measurement theory. This dissertation applies measurement theory to validate fifty proposed object-oriented software metrics.
Aksu, Buket; Paradkar, Anant; de Matas, Marcel; Ozer, Ozgen; Güneri, Tamer; York, Peter
2012-12-01
The publication of the International Conference of Harmonization (ICH) Q8, Q9, and Q10 guidelines paved the way for the standardization of quality after the Food and Drug Administration issued current Good Manufacturing Practices guidelines in 2003. "Quality by Design", mentioned in the ICH Q8 guideline, offers a better scientific understanding of critical process and product qualities using knowledge obtained during the life cycle of a product. In this scope, the "knowledge space" is a summary of all process knowledge obtained during product development, and the "design space" is the area in which a product can be manufactured within acceptable limits. To create the spaces, artificial neural networks (ANNs) can be used to emphasize the multidimensional interactions of input variables and to closely bind these variables to a design space. This helps guide the experimental design process to include interactions among the input variables, along with modeling and optimization of pharmaceutical formulations. The objective of this study was to develop an integrated multivariate approach to obtain a quality product based on an understanding of the cause-effect relationships between formulation ingredients and product properties with ANNs and genetic programming on the ramipril tablets prepared by the direct compression method. In this study, the data are generated through the systematic application of the design of experiments (DoE) principles and optimization studies using artificial neural networks and neurofuzzy logic programs.
Quality control in the recycling stream of PVC from window frames by hyperspectral imaging
NASA Astrophysics Data System (ADS)
Luciani, Valentina; Serranti, Silvia; Bonifazi, Giuseppe; Di Maio, Francesco; Rem, Peter
2013-05-01
Polyvinyl chloride (PVC) is one of the most commonly used thermoplastic materials with respect to worldwide polymer consumption. PVC is mainly used in the building and construction sector; products such as pipes, window frames, cable insulation, floors, coverings and roofing sheets are realised utilising this material. In recent years, the problem of PVC waste disposal has gained increasing importance in the public discussion. The quantity of used PVC items entering the waste stream has gradually increased as progressively greater numbers of PVC products approach the end of their useful economic lives. The quality of the recycled PVC depends on the characteristics of the recycling process and the quality of the input waste. Not all PVC-containing waste streams have the same economic value. A transparent relation between value and composition is required to decide if the recycling process is cost effective for a particular waste stream. An objective and reliable quality control technique is needed in the recycling industry for the monitoring of both recycled flow streams and final products in the plant. In this work, a hyperspectral imaging technique in the near infrared (NIR) range (1000-1700 nm) was applied to identify unwanted plastic contaminants and rubber present in PVC coming from window frame waste, in order to assess a quality control procedure during its recycling process. Results showed that PVC, PE and rubber can be identified adopting the NIR-HSI approach.
[Quality assessment in anesthesia].
Kupperwasser, B
1996-01-01
Quality assessment (assurance/improvement) is the set of methods used to measure and improve the delivered care and the department's performance against pre-established criteria or standards. The four stages of the self-maintained quality assessment cycle are: problem identification, problem analysis, problem correction and evaluation of corrective actions. Quality assessment is a measurable entity for which it is necessary to define and calibrate measurement parameters (indicators) from available data gathered from the hospital anaesthesia environment. Problem identification comes from the accumulation of indicators. There are four types of quality indicators: structure, process, outcome and sentinel indicators. The latter signal a quality defect, are independent of outcomes, are easier to analyse by statistical methods and are closely related to processes and the main targets of quality improvement. The three types of methods to analyse the problems (indicators) are: peer review, quantitative methods and risk management techniques. Peer review is performed by qualified anaesthesiologists. To improve its validity, the review process should be made explicit and conclusions based on standards of practice and literature references. The quantitative methods are statistical analyses applied to the collected data and presented in a graphic format (histogram, Pareto diagram, control charts). The risk management techniques include: a) critical incident analysis, establishing an objective relationship between a 'critical' event and the associated human behaviours; b) system accident analysis, based on the fact that accidents continue to occur despite safety systems and sophisticated technologies, which checks all the process components leading to the unpredictable outcome, not just the human factors; c) cause-effect diagrams, which facilitate problem analysis by reducing its causes to four fundamental components (persons, regulations, equipment, process).
Definition and implementation of corrective measures, based on the findings of the two previous stages, are the third step of the evaluation cycle. The Hawthorne effect is an outcome improvement, before the implementation of any corrective actions. Verification of the implemented actions is the final and mandatory step closing the evaluation cycle.
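Among the quantitative methods named in this abstract, the Pareto diagram ranks indicator categories by frequency to isolate the "vital few" causes that account for most occurrences. A minimal sketch of that tally (the incident categories and counts below are hypothetical, not from the paper):

```python
from collections import Counter

def pareto_vital_few(events, threshold=0.8):
    """Return the smallest set of categories whose cumulative share of
    occurrences reaches `threshold` (the classic 80/20 'vital few')."""
    counts = Counter(events)
    total = sum(counts.values())
    vital, cum = [], 0
    for category, n in counts.most_common():  # descending frequency
        vital.append(category)
        cum += n
        if cum / total >= threshold:
            break
    return vital

# Hypothetical anaesthesia incident log, one category label per event
log = ["label"] * 12 + ["delay"] * 6 + ["equipment"] * 2 + ["protocol"] * 1
vital = pareto_vital_few(log)  # the categories to target first
```

Plotting `counts.most_common()` as descending bars with a cumulative-percentage line gives the Pareto diagram itself; the function above only extracts the prioritized categories.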
Joffe, Hadine; White, David P.; Crawford, Sybil L.; McCurnin, Kristin E.; Economou, Nicole; Connors, Stephanie; Hall, Janet E.
2013-01-01
Objectives The impact of hot flashes on sleep is of great clinical interest, but results are inconsistent, especially when both hot flashes and sleep are measured objectively. Using objective and subjective measurements, we examined the impact of hot flashes on sleep by inducing hot flashes with a gonadotropin-releasing hormone agonist (GnRHa). Methods The GnRHa leuprolide was administered to 20 healthy premenopausal volunteers without hot flashes or sleep disturbances. Induced hot flashes were assessed objectively (skin-conductance monitor) and subjectively (daily diary) during one-month follow-up. Changes from baseline in objective (actigraphy) and subjective sleep quality (Pittsburgh Sleep Quality Index [PSQI]) were compared between women who did and did not develop objective hot flashes, and, in parallel analyses, subjective hot flashes. Results New-onset hot flashes were recorded in 14 (70%) and reported by 14 (70%) women (80% concordance). Estradiol was universally suppressed. Objective sleep efficiency worsened in women with objective hot flashes and improved in women without objective hot flashes (median decrease 2.6%, increase 4.2%, p=0.005). Subjective sleep quality worsened more in those with than without subjective hot flashes (median increase PSQI 2.5 vs. 1.0, p=0.03). Objective hot flashes were not associated with subjective sleep quality, nor were subjective symptoms linked to objective sleep measures. Conclusions This experimental model of induced hot flashes demonstrates a causal relationship between hot flashes and poor sleep quality. Objective hot flashes result in worse objective sleep efficiency, while subjective hot flashes worsen perceived sleep quality. PMID:23481119
A no-reference video quality assessment metric based on ROI
NASA Astrophysics Data System (ADS)
Jia, Lixiu; Zhong, Xuefei; Tu, Yan; Niu, Wenjuan
2015-01-01
A no-reference video quality assessment metric based on the region of interest (ROI) was proposed in this paper. In the metric, objective video quality was evaluated by integrating the quality of two compression artifacts, i.e. blurring distortion and blocking distortion. The Gaussian kernel function was used to extract the human density maps of the H.264-coded videos from the subjective eye-tracking data. An objective bottom-up ROI extraction model was built, based on the magnitude discrepancy of the discrete wavelet transform between two consecutive frames, a center-weighted color opponent model, a luminance contrast model and a frequency saliency model based on spectral residual. Then only the objective saliency maps were used to compute the objective blurring and blocking quality. The results indicate that the objective ROI extraction metric has a higher area under the curve (AUC) value. Compared with conventional video quality assessment metrics, which measure all video frames, the metric proposed in this paper not only decreased the computational complexity, but also improved the correlation between the subjective mean opinion score (MOS) and objective scores.
Estimation of 3D reconstruction errors in a stereo-vision system
NASA Astrophysics Data System (ADS)
Belhaoua, A.; Kohler, S.; Hirsch, E.
2009-06-01
The paper presents an approach for error estimation for the various steps of an automated 3D vision-based reconstruction procedure of manufactured workpieces. The process is based on a priori planning of the task and built around a cognitive intelligent sensory system using so-called Situation Graph Trees (SGT) as a planning tool. Such an automated quality control system requires the coordination of a set of complex processes performing sequentially data acquisition, its quantitative evaluation and the comparison with a reference model (e.g., CAD object model) in order to evaluate the object quantitatively. To ensure efficient quality control, the aim is to be able to state whether reconstruction results fulfill tolerance rules or not. Thus, the goal is to evaluate independently the error for each step of the stereo-vision based 3D reconstruction (e.g., for calibration, contour segmentation, matching and reconstruction) and then to estimate the error for the whole system. In this contribution, we analyze particularly the segmentation error due to localization errors for extracted edge points supposed to belong to lines and curves composing the outline of the workpiece under evaluation. The fitting parameters describing these geometric features are used as a quality measure to determine confidence intervals and finally to estimate the segmentation errors. These errors are then propagated through the whole reconstruction procedure, enabling evaluation of their effect on the final 3D reconstruction result, specifically on position uncertainties. Lastly, analysis of these error estimates enables evaluation of the quality of the 3D reconstruction, as illustrated by the experimental results shown.
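Propagating per-step errors to a final position uncertainty, as this abstract describes, is commonly done to first order. A hedged sketch of that calculation (the stereo depth formula Z = f·B/d and all numeric values below are illustrative assumptions, not taken from the paper):

```python
import math

def propagate_uncertainty(partials, sigmas):
    """First-order (Gaussian) propagation for independent inputs:
    sigma_f = sqrt( sum_i (df/dx_i * sigma_i)^2 )."""
    return math.sqrt(sum((p * s) ** 2 for p, s in zip(partials, sigmas)))

# Hypothetical example: depth Z from stereo disparity d, via Z = f * B / d
f, B, d = 1200.0, 0.1, 30.0   # focal length (px), baseline (m), disparity (px)
sigma_d = 0.5                 # segmentation/matching error on disparity (px)

Z = f * B / d                 # reconstructed depth (m)
dZ_dd = -f * B / d ** 2       # sensitivity of depth to a disparity error
sigma_Z = propagate_uncertainty([dZ_dd], [sigma_d])  # depth uncertainty (m)
```

Chaining such sensitivities through calibration, segmentation, matching and triangulation yields the system-level position uncertainty; correlated errors would additionally require covariance terms that this independent-input sketch omits.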
Mexico City Air Quality Research Initiative; Volume 5, Strategic evaluation
DOE Office of Scientific and Technical Information (OSTI.GOV)
NONE
1994-03-01
Members of the Task HI (Strategic Evaluation) team were responsible for the development of a methodology to evaluate policies designed to alleviate air pollution in Mexico City. This methodology utilizes information from various reports that examined ways to reduce pollutant emissions, results from models that calculate the improvement in air quality due to a reduction in pollutant emissions, and the opinions of experts as to the requirements and trade-offs that are involved in developing a program to address the air pollution problem in Mexico City. The methodology combines these data to produce comparisons between different approaches to improving Mexico City's air quality. These comparisons take into account not only objective factors such as the air quality improvement or cost of the different approaches, but also subjective factors such as public acceptance or political attractiveness of the different approaches. The end result of the process is a ranking of the different approaches and, more importantly, the process provides insights into the implications of implementing a particular approach or policy.
Chip Design Process Optimization Based on Design Quality Assessment
NASA Astrophysics Data System (ADS)
Häusler, Stefan; Blaschke, Jana; Sebeke, Christian; Rosenstiel, Wolfgang; Hahn, Axel
2010-06-01
Managing product development projects is increasingly challenging. In particular, the IC design of ASICs with both analog and digital components (mixed-signal design) is becoming more and more complex, while the time-to-market window narrows at the same time. Still, high quality standards must be fulfilled. Due to this complexity, projects and their status are becoming less transparent, which makes the planning and execution of projects rather difficult. There is therefore a need for efficient project control. A main challenge is the objective evaluation of the current development status: are all requirements successfully verified? Are all intermediate goals achieved? Companies often develop special-purpose solutions that are not reusable in other projects, which makes the quality measurement process itself less efficient and produces too much overhead. The method proposed in this paper is a contribution to solving these issues. It is applied at a German design house for analog mixed-signal IC design. This paper presents the results of a case study and introduces an optimized project scheduling based on quality assessment results.
The clinical nurse specialist as resuscitation process manager.
Schneiderhahn, Mary Elizabeth; Fish, Anne Folta
2014-01-01
The purpose of this article was to describe the history and leadership dimensions of the role of resuscitation process manager and provide specific examples of how this role is implemented at a Midwest medical center. In 1992, a medical center in the Midwest needed a nurse to manage resuscitation care. This role designation meant that this nurse became central to all quality improvement efforts in resuscitation care. The role expanded as clinical resuscitation guidelines were updated and as the medical center grew. The role became known as the critical care clinical nurse specialist as resuscitation process manager. This clinical nurse specialist was called a manager, but she had no direct line authority, so she accomplished her objectives by forming a multitude of collaborative networks. Based on a framework by Finkelman, the manager role incorporated specific leadership abilities in quality improvement: (1) coordination of medical center-wide resuscitation, (2) use of interprofessional teams, (3) integration of evidence into practice, and (4) staff coaching to develop leadership. The manager coordinates resuscitation care with the goals of prevention of arrests if possible, efficient and effective implementation of resuscitation protocols, high quality of patient and family support during and after the resuscitation event, and creation or revision of resuscitation policies for in-hospital and for ambulatory care areas. The manager designs a comprehensive set of meaningful and measurable process and outcome indicators with input from interprofessional teams. The manager engages staff in learning, reflecting on care given, and using the evidence base for resuscitation care. Finally, the manager role is a balance between leading quality improvement efforts and coaching staff to implement and sustain these quality improvement initiatives.
Revisions to clinical guidelines for resuscitation care since the 1990s have resulted in medical centers developing improved resuscitation processes that require management. The manager enhances collaborative quality improvement efforts that are in line with Institute of Medicine recommendations. The role of resuscitation process manager may be of interest to medical centers striving for excellence in evidence-based resuscitation care.
Saint-Joly, C; Desbois, S; Lotti, J P
2000-01-01
The performance of the anaerobic digestion process depends heavily on the quality of the waste to be treated. This has already been demonstrated at the lab scale. The objective of this study is to confirm this result at the industrial scale, over a very long representative period and with the same process, the Valorga process. Depending on the waste quality and the collection type, and even under the same fermentation conditions, the biogas yield can vary by a factor of 1.5 when expressed (under normal conditions of pressure and temperature) in m3 biogas/t fresh waste, and by a factor of 2 when expressed in m3 CH4/t volatile solids. Biogas performance therefore does not characterise a process, since it is deeply governed by waste composition. This biogas productivity becomes a pertinent parameter only with consistent and relevant hypotheses and/or analytical results on the waste composition, which depends on the collection procedure, the site characteristics and the season.
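The gap between the two yield units in the abstract comes down to waste composition. A small sketch (with entirely hypothetical waste figures, not values from the study) shows how the same measured biogas volume maps to very different numbers per tonne of volatile solids depending on solids content:

```python
# Sketch: converting m3 biogas/t fresh waste into m3 CH4/t volatile solids (VS).
# All waste characteristics below are invented for illustration.
def yield_per_t_vs(biogas_per_t_fresh, ch4_fraction, ts_fraction, vs_of_ts):
    """Convert a yield in m3 biogas/t fresh waste to m3 CH4/t VS.

    ts_fraction: total solids per tonne of fresh waste (t TS/t fresh)
    vs_of_ts:    volatile solids fraction of the total solids (t VS/t TS)
    """
    vs_per_t_fresh = ts_fraction * vs_of_ts   # tonnes VS per tonne fresh waste
    return biogas_per_t_fresh * ch4_fraction / vs_per_t_fresh

# Hypothetical waste: 100 m3 biogas/t fresh, 55% CH4 in the biogas,
# 40% total solids, of which 70% are volatile.
ch4_per_t_vs = yield_per_t_vs(100.0, 0.55, 0.40, 0.70)
```

Because the VS content varies with collection procedure, site, and season, two wastes with the same per-tonne biogas output can differ widely per tonne of VS, which is exactly why the abstract argues the yield only becomes meaningful alongside compositional data.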
Quality improvement and accreditation readiness in state public health agencies.
Madamala, Kusuma; Sellers, Katie; Beitsch, Leslie M; Pearsol, Jim; Jarris, Paul
2012-01-01
There were 3 specific objectives of this study. The first objective was to examine the progress of state/territorial health assessment, health improvement planning, performance management, and quality improvement (QI) activities at state/territorial health agencies and compare findings to the 2007 findings when available. A second objective was to examine respondent interest in and readiness for national voluntary accreditation. A final objective was to explore organizational factors (eg, leadership and capacity) that may influence QI or accreditation readiness. This was a cross-sectional study of state and territorial public health agencies; survey respondents were organizational leaders at those agencies. Sixty-seven percent of respondents reported having a formal performance management process in place. Approximately 77% of respondents reported a QI process in place. Seventy-three percent of respondents agreed or strongly agreed that they would seek accreditation and 36% agreed or strongly agreed that they would seek accreditation in the first 2 years of the program. In terms of accreditation prerequisites, a strategic plan was most frequently developed, followed by a state/territorial health assessment and health improvement plan, respectively. Advancements in the practice and applied research of QI in state public health agencies are necessary steps for improving performance. In particular, strengthening the measurement of the QI construct is essential for meaningfully assessing current practice patterns and informing future programming and policy decisions. Continued QI training and technical assistance to agency staff and leadership is also critical. Accreditation may be the pivotal factor to strengthen both QI practice and research. Respondent interest in seeking accreditation may indicate the perceived value of accreditation to the agency.
A framework for assessing Health Economic Evaluation (HEE) quality appraisal instruments.
Langer, Astrid
2012-08-16
Health economic evaluations support the health care decision-making process by providing information on costs and consequences of health interventions. The quality of such studies is assessed by health economic evaluation (HEE) quality appraisal instruments. At present, there is no instrument for measuring and improving the quality of such HEE quality appraisal instruments. Therefore, the objectives of this study are to establish a framework for assessing the quality of HEE quality appraisal instruments to support and improve their quality, and to apply this framework to those HEE quality appraisal instruments which have been subject to more scrutiny than others, in order to test the framework and to demonstrate the shortcomings of existing HEE quality appraisal instruments. To develop the quality assessment framework for HEE quality appraisal instruments, the experiences of using appraisal tools for clinical guidelines are used. Based on a deductive iterative process, clinical guideline appraisal instruments identified through literature search are reviewed, consolidated, and adapted to produce the final quality assessment framework for HEE quality appraisal instruments. The final quality assessment framework for HEE quality appraisal instruments consists of 36 items organized within 7 dimensions, each of which captures a specific domain of quality. Applying the quality assessment framework to four existing HEE quality appraisal instruments, it is found that these four quality appraisal instruments are of variable quality. The framework described in this study should be regarded as a starting point for appraising the quality of HEE quality appraisal instruments. This framework can be used by HEE quality appraisal instrument producers to support and improve the quality and acceptance of existing and future HEE quality appraisal instruments. 
By applying this framework, users of HEE quality appraisal instruments can become aware of methodological deficiencies inherent in existing HEE quality appraisal instruments. These shortcomings of existing HEE quality appraisal instruments are illustrated by the pilot test.
Danovitch, Judith H; Mills, Candice M
2017-09-01
This study examines the factors underlying young children's preference for products bearing a familiar character's image. Three-year-olds (N = 92) chose between low-quality objects with images on or near the objects and high-quality objects without images. Children showed stronger preferences for damaged objects bearing images of a preferred familiar character than for objects bearing images of a preferred colour star, and they showed weak preferences for damaged objects with the character near, but not on, the object. The results suggest that children's preference for low-quality products bearing character images is driven by prior exposure to characters, and not only by the act of identifying a favourite. Statement of contribution What is already known on this subject? Children are exposed to characters in the media and on products such as clothing and school supplies. Products featuring familiar characters appeal to preschool children, even if they are of low quality. What does this study add? Three-year-olds prefer damaged objects with an image of a favourite character over plain undamaged objects. Children's preference is not solely a function of having identified a favourite image or of attentional cues. © 2017 The British Psychological Society.