Sample records for quality objective process

  1. Objective assessment of MPEG-2 video quality

    NASA Astrophysics Data System (ADS)

    Gastaldo, Paolo; Zunino, Rodolfo; Rovetta, Stefano

    2002-07-01

    The increasing use of video compression standards in broadcasting television systems has required, in recent years, the development of video quality measurements that take into account artifacts specifically caused by digital compression techniques. In this paper we present a methodology for the objective quality assessment of MPEG video streams by using circular back-propagation feedforward neural networks. Mapping neural networks can render nonlinear relationships between objective features and subjective judgments, thus avoiding any simplifying assumption on the complexity of the model. The neural network processes an instantaneous set of input values, and yields an associated estimate of perceived quality. Therefore, the neural-network approach turns objective quality assessment into adaptive modeling of subjective perception. The objective features used for the estimate are chosen according to their assessed relevance to perceived quality and are continuously extracted in real time from compressed video streams. The overall system mimics perception but does not require any analytical model of the underlying physical phenomenon. The capability to process compressed video streams represents an important advantage over existing approaches, since avoiding the stream-decoding process greatly enhances real-time performance. Experimental results confirm that the system provides satisfactory, continuous-time approximations for actual scoring curves concerning real test videos.
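
    As a rough illustration of the mapping described above, the sketch below passes a vector of hypothetical per-frame objective features through a small feedforward network to produce a quality estimate in [0, 1]. It is plain NumPy with random weights, not the authors' circular back-propagation network; the feature names, layer sizes, and weights are assumptions, since real weights would come from training against subjective scores.

      import numpy as np

      def estimate_quality(features, W1, b1, W2, b2):
          """Map a feature vector (e.g. bit rate, quantizer scale, motion activity,
          blockiness) to a quality estimate in [0, 1]."""
          h = np.tanh(W1 @ features + b1)                       # hidden layer
          return float(1.0 / (1.0 + np.exp(-(W2 @ h + b2))))    # sigmoid output unit

      rng = np.random.default_rng(0)
      n_features, n_hidden = 4, 8
      W1, b1 = rng.normal(size=(n_hidden, n_features)), np.zeros(n_hidden)
      W2, b2 = rng.normal(size=n_hidden), 0.0             # in practice, weights come from training
      frame_features = np.array([4.5, 12.0, 0.3, 0.08])   # hypothetical per-frame features
      print(estimate_quality(frame_features, W1, b1, W2, b2))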

  2. DATA QUALITY OBJECTIVES AND MEASUREMENT QUALITY OBJECTIVES FOR RESEARCH PROJECTS

    EPA Science Inventory

    The paper provides assistance with systematic planning using measurement quality objectives to those working on research projects. These performance criteria are more familiar to researchers than data quality objectives because they are more closely associated with the measuremen...

  3. A multiple objective optimization approach to quality control

    NASA Technical Reports Server (NTRS)

    Seaman, Christopher Michael

    1991-01-01

    The use of product quality as the performance criterion for manufacturing system control is explored. The goal in manufacturing, for economic reasons, is to optimize product quality. The problem is that since quality is a rather nebulous product characteristic, there is seldom an analytic function that can be used as a measure. Therefore standard control approaches, such as optimal control, cannot readily be applied. A second problem with optimizing product quality is that it is typically measured along many dimensions: there are many aspects of quality which must be optimized simultaneously. Very often these different aspects are incommensurate and competing. The concept of optimality must now include accepting tradeoffs among the different quality characteristics. These problems are addressed using multiple objective optimization. It is shown that the quality control problem can be defined as a multiple objective optimization problem. A controller structure is defined using this as the basis. Then, an algorithm is presented which can be used by an operator to interactively find the best operating point. Essentially, the algorithm uses process data to provide the operator with two pieces of information: (1) if it is possible to simultaneously improve all quality criteria, then determine what changes to the process input or controller parameters should be made to do this; and (2) if it is not possible to improve all criteria, and the current operating point is not a desirable one, select a criterion on which a tradeoff should be made, and make input changes to improve all other criteria. The process is not operating at an optimal point in any sense if no tradeoff has to be made to move to a new operating point. This algorithm ensures that operating points are optimal in some sense and provides the operator with information about tradeoffs when seeking the best operating point. The multiobjective algorithm was implemented in two different injection molding scenarios.
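
    The sketch below illustrates, under stated assumptions, the first piece of information described above: whether some small change of the process inputs is predicted to improve all quality criteria at once. It assumes a local sensitivity matrix J (change in each quality criterion per unit change in each input) estimated from process data, and uses a simple random-direction search in place of the paper's interactive procedure; all numbers are hypothetical.

      import numpy as np

      def simultaneous_improvement(J, n_trials=2000, step=0.1, seed=1):
          """Return (direction, predicted gains) if some small input change is
          predicted to improve every quality criterion, else (None, None)."""
          rng = np.random.default_rng(seed)
          best_dir, best_min = None, 0.0
          for _ in range(n_trials):
              d = rng.normal(size=J.shape[1])
              d = step * d / np.linalg.norm(d)   # small, normalized input step
              gains = J @ d                      # first-order predicted changes
              if gains.min() > best_min:
                  best_dir, best_min = d, gains.min()
          if best_dir is None:
              return None, None                  # a trade-off among criteria is required
          return best_dir, J @ best_dir

      # Hypothetical sensitivities: 3 quality criteria vs. 2 process inputs.
      J = np.array([[ 0.8, -0.2],
                    [ 0.5,  0.4],
                    [-0.1,  0.9]])
      direction, gains = simultaneous_improvement(J)
      print(direction, gains)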

  4. APPLICATION OF DATA QUALITY OBJECTIVES AND MEASUREMENT QUALITY OBJECTIVES TO RESEARCH PROJECTS

    EPA Science Inventory

    The paper assists systematic planning for research projects. It presents planning concepts in terms that have some utility for researchers. For example, measurement quality objectives are more familiar to researchers than data quality objectives because these quality criteria are...

  5. Data Quality Objectives Process for Designation of K Basins Debris

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    WESTCOTT, J.L.

    2000-05-22

    The U.S. Department of Energy has developed a schedule and approach for the removal of spent fuels, sludge, and debris from the K East (KE) and K West (KW) Basins, located in the 100 Area at the Hanford Site. The project that is the subject of this data quality objective (DQO) process is focused on the removal of debris from the K Basins and onsite disposal of the debris at the Environmental Restoration Disposal Facility (ERDF). This material previously has been dispositioned at the Hanford Low-Level Burial Grounds (LLBGs) or Central Waste Complex (CWC). The goal of this DQO process and the resulting Sampling and Analysis Plan (SAP) is to provide the strategy for characterizing and designating the K-Basin debris to determine if it meets the Environmental Restoration Disposal Facility Waste Acceptance Criteria (WAC), Revision 3 (BHI 1998). A critical part of the DQO process is to agree on regulatory and WAC interpretation, to support preparation of the DQO workbook and SAP.

  6. Quality inspection guided laser processing of irregular shape objects by stereo vision measurement: application in badminton shuttle manufacturing

    NASA Astrophysics Data System (ADS)

    Qi, Li; Wang, Shun; Zhang, Yixin; Sun, Yingying; Zhang, Xuping

    2015-11-01

    The quality inspection process is usually carried out after first processing of the raw materials such as cutting and milling. This is because the parts of the materials to be used are unidentified until they have been trimmed. If the quality of the material is assessed before the laser process, then the energy and effort wasted on defective materials can be saved. We proposed a new production scheme that can achieve quantitative quality inspection prior to primitive laser cutting by means of three-dimensional (3-D) vision measurement. First, the 3-D model of the object is reconstructed by the stereo cameras, from which the spatial cutting path is derived. Second, collaborating with another rear camera, the 3-D cutting path is reprojected to both the frontal and rear views of the object and thus generates the regions-of-interest (ROIs) for surface defect analysis. An accurate visually guided laser process and reprojection-based ROI segmentation are enabled by a global-optimization-based trinocular calibration method. The prototype system was built and tested with the processing of raw duck feathers for high-quality badminton shuttle manufacture. Incorporating a two-dimensional wavelet-decomposition-based defect analysis algorithm, both the geometrical and appearance features of the raw feathers are quantified before they are cut into small patches, resulting in fully automatic feather cutting and sorting.
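
    As an illustration of the reprojection step described above, the sketch below projects a few 3-D cutting-path points into a camera view with a pinhole projection matrix and pads their bounding box to form a region of interest. The intrinsics, path coordinates, and margin are made-up placeholders, not calibration results from the paper.

      import numpy as np

      def project(points_3d, P):
          """Project Nx3 world points with a 3x4 projection matrix P = K[R|t]."""
          pts_h = np.hstack([points_3d, np.ones((len(points_3d), 1))])  # homogeneous coords
          uvw = (P @ pts_h.T).T
          return uvw[:, :2] / uvw[:, 2:3]                               # pixel coordinates

      def roi_from_path(pixels, margin=15):
          """Bounding box around the reprojected path, padded for defect analysis."""
          (u0, v0), (u1, v1) = pixels.min(axis=0), pixels.max(axis=0)
          return (u0 - margin, v0 - margin, u1 + margin, v1 + margin)

      K = np.array([[800.0, 0, 320], [0, 800.0, 240], [0, 0, 1]])   # toy intrinsics
      P_front = K @ np.hstack([np.eye(3), np.zeros((3, 1))])        # camera at the origin
      cutting_path = np.array([[0.01, 0.00, 0.50],
                               [0.02, 0.01, 0.51],
                               [0.03, 0.02, 0.52]])                 # metres, illustrative
      print(roi_from_path(project(cutting_path, P_front)))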

  7. DATA QUALITY OBJECTIVES-FOUNDATION OF A SUCCESSFUL MONITORING PROGRAM

    EPA Science Inventory

    The data quality objectives (DQO) process is a fundamental site characterization tool and the foundation of a successful monitoring program. The DQO process is a systematic planning approach based on the scientific method of inquiry. The process identifies the goals of data col...

  8. Assuring the Quality of Agricultural Learning Repositories: Issues for the Learning Object Metadata Creation Process of the CGIAR

    NASA Astrophysics Data System (ADS)

    Zschocke, Thomas; Beniest, Jan

    The Consultative Group on International Agricultural Research (CGIAR) has established a digital repository to share its teaching and learning resources along with descriptive educational information based on the IEEE Learning Object Metadata (LOM) standard. As a critical component of any digital repository, good-quality metadata are needed not only to enable users to find the resources they require more easily, but also for the operation and interoperability of the repository itself. Studies show that repositories have difficulties in obtaining good quality metadata from their contributors, especially when this process involves many different stakeholders, as is the case with the CGIAR as an international organization. To address this issue, the CGIAR began investigating the Open ECBCheck as well as the ISO/IEC 19796-1 standard to establish quality protocols for its training. The paper highlights the implications and challenges posed by strengthening the metadata creation workflow for disseminating learning objects of the CGIAR.

  9. Double shell tanks (DST) chemistry control data quality objectives

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    BANNING, D.L.

    2001-10-09

    One of the main functions of the River Protection Project is to store the Hanford Site tank waste until the Waste Treatment Plant (WTP) is ready to receive and process the waste. Waste from the older single-shell tanks is being transferred to the newer double-shell tanks (DSTs). Therefore, the integrity of the DSTs must be maintained until the waste from all tanks has been retrieved and transferred to the WTP. To help maintain the integrity of the DSTs over the life of the project, specific chemistry limits have been established to control corrosion of the DSTs. These waste chemistry limits are presented in the Technical Safety Requirements (TSR) document HNF-SD-WM-TSR-006, Section 5.15, Rev. 2B (CHG 2001). In order to control the chemistry in the DSTs, the Chemistry Control Program will require analyses of the tank waste. This document describes the Data Quality Objective (DQO) process undertaken to ensure appropriate data will be collected to control the waste chemistry in the DSTs. The DQO process was implemented in accordance with Data Quality Objectives for Sampling and Analyses, HNF-IP-0842, Rev. 1b, Vol. IV, Section 4.16 (Banning 2001) and the U.S. Environmental Protection Agency EPA QA/G-4, Guidance for the Data Quality Objectives Process (EPA 1994), with some modifications to accommodate project or tank specific requirements and constraints.

  10. Object-processing neural efficiency differentiates object from spatial visualizers.

    PubMed

    Motes, Michael A; Malach, Rafael; Kozhevnikov, Maria

    2008-11-19

    The visual system processes object properties and spatial properties in distinct subsystems, and we hypothesized that this distinction might extend to individual differences in visual processing. We conducted a functional MRI study investigating the neural underpinnings of individual differences in object versus spatial visual processing. Nine participants of high object-processing ability ('object' visualizers) and eight participants of high spatial-processing ability ('spatial' visualizers) were scanned, while they performed an object-processing task. Object visualizers showed lower bilateral neural activity in lateral occipital complex and lower right-lateralized neural activity in dorsolateral prefrontal cortex. The data indicate that high object-processing ability is associated with more efficient use of visual-object resources, resulting in less neural activity in the object-processing pathway.

  11. DATA QUALITY OBJECTIVES IN RESEARCH PLANNING AT MED: THREE CASE STUDIES

    EPA Science Inventory

    This course will give a quality assurance perspective to research planning by describing the Data Quality Objective Process....Written plans are mandatory for all EPA environmental data collection activities according to EPA Order 5360.1 CHG 1 and Federal Acquisition Regulations,...

  12. Data Quality Objectives for Regulatory Requirements for Dangerous Waste Sampling and Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    MULKEY, C.H.

    1999-07-02

    This document describes sampling and analytical requirements needed to meet state and federal regulations for dangerous waste (DW). The River Protection Project (RPP) is assigned to the task of storage and interim treatment of hazardous waste. Any final treatment or disposal operations, as well as requirements under the land disposal restrictions (LDRs), fall in the jurisdiction of another Hanford organization and are not part of this scope. The requirements for this Data Quality Objective (DQO) Process were developed using the RPP Data Quality Objective Procedure (Banning 1996), which is based on the U.S. Environmental Protection Agency's (EPA) Guidance for the Data Quality Objectives Process (EPA 1994). Hereafter, this document is referred to as the DW DQO. Federal and state laws and regulations pertaining to waste contain requirements that are dependent upon the composition of the waste stream. These regulatory drivers require that pertinent information be obtained. For many requirements, documented process knowledge of a waste composition can be used instead of analytical data to characterize or designate a waste. When process knowledge alone is used to characterize a waste, it is a best management practice to validate the information with analytical measurements.

  13. Data quality objectives for the initial fuel conditioning examinations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lawrence, L.A.

    The Data Quality Objectives (DQOs) were established for examining the response of the first group of fuel samples, shipped from the K West Basin to the Hanford 327 Building hot cells, to the proposed Path Forward conditioning process. Controlled-temperature and controlled-atmosphere furnace testing will establish performance parameters using the conditioning process (drying, sludge drying, hydride decomposition passivation) proposed by the Independent Technical Assessment (ITA) Team as the baseline.

  14. Quality Assurance for Digital Learning Object Repositories: Issues for the Metadata Creation Process

    ERIC Educational Resources Information Center

    Currier, Sarah; Barton, Jane; O'Beirne, Ronan; Ryan, Ben

    2004-01-01

    Metadata enables users to find the resources they require; therefore, it is an important component of any digital learning object repository. Much work has already been done within the learning technology community to assure metadata quality, focused on the development of metadata standards, specifications and vocabularies and their implementation…

  15. Perceptual video quality assessment in H.264 video coding standard using objective modeling.

    PubMed

    Karthikeyan, Ramasamy; Sainarayanan, Gopalakrishnan; Deepa, Subramaniam Nachimuthu

    2014-01-01

    Since usage of digital video is widespread nowadays, quality considerations have become essential, and industry demand for video quality measurement is rising. This proposal provides a method of perceptual quality assessment for the H.264 standard encoder using objective modeling. For this purpose, quality impairments are calculated and a model is developed to compute the perceptual video quality metric based on a no-reference method. Because of the subtle differences between the original video and the encoded video, the quality of the encoded picture is degraded; this quality difference is introduced by encoding processes such as intra- and inter-prediction. The proposed model takes into account the artifacts introduced by these spatial and temporal activities in hybrid block-based coding methods, and an objective modeling of these artifacts into a subjective quality estimate is proposed. The proposed model calculates the objective quality metric using subjective impairments (blockiness, blur, and jerkiness), compared to the existing bitrate-only calculation defined in the ITU-T G.1070 model. The accuracy of the proposed perceptual video quality metrics is compared against popular full-reference objective methods as defined by VQEG.
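
    A minimal sketch of a no-reference metric in this spirit is given below: crude blockiness, blur, and jerkiness measurements on decoded frames are pooled into a single MOS-like score. The impairment estimators and the pooling weights are illustrative assumptions only; they are not the fitted model from the paper or the ITU-T G.1070 formula.

      import numpy as np

      def blockiness(frame, block=8):
          """Mean luminance discontinuity across 8x8 block boundaries."""
          cols = np.arange(block, frame.shape[1], block)
          return float(np.mean(np.abs(frame[:, cols] - frame[:, cols - 1])))

      def blur(frame):
          """Inverse of average gradient energy (higher means blurrier)."""
          gy, gx = np.gradient(frame)
          return float(1.0 / (1.0 + np.mean(gx**2 + gy**2)))

      def jerkiness(frames):
          """Variance of frame-to-frame mean absolute difference."""
          diffs = [np.mean(np.abs(frames[i + 1] - frames[i])) for i in range(len(frames) - 1)]
          return float(np.var(diffs))

      def quality_score(frames, w=(0.02, 3.0, 0.5)):
          """Pool impairments into a 1..5 MOS-like score (weights are made up)."""
          b = np.mean([blockiness(f) for f in frames])
          s = np.mean([blur(f) for f in frames])
          j = jerkiness(frames)
          return float(np.clip(5.0 - (w[0] * b + w[1] * s + w[2] * j), 1.0, 5.0))

      frames = [np.random.default_rng(i).integers(0, 256, (48, 64)).astype(float) for i in range(5)]
      print(quality_score(frames))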

  16. Process safety improvement--quality and target zero.

    PubMed

    Van Scyoc, Karl

    2008-11-15

    Process safety practitioners have adopted quality management principles in the design of process safety management systems with positive effect, yet achieving safety objectives sometimes remains a distant target. Companies regularly apply tools and methods which have roots in quality and productivity improvement. The "plan, do, check, act" improvement loop, statistical analysis of incidents (non-conformities), and performance trending popularized by Dr. Deming are now commonly used in the context of process safety. Significant advancements in HSE performance are reported after applying methods viewed as fundamental for quality management. In pursuit of continual process safety improvement, the paper examines various quality improvement methods, and explores how methods intended for product quality can additionally be applied to continual improvement of process safety. Methods such as Kaizen, Poka-yoke, and TRIZ, while long established for quality improvement, are quite unfamiliar in the process safety arena. These methods are discussed for application in improving both process safety leadership and field work team performance. Practical ways to advance process safety, based on the methods, are given.

  17. Multi Objective Optimization of Yarn Quality and Fibre Quality Using Evolutionary Algorithm

    NASA Astrophysics Data System (ADS)

    Ghosh, Anindya; Das, Subhasis; Banerjee, Debamalya

    2013-03-01

    The quality and cost of the resulting yarn play a significant role in determining its end application. The challenging task of any spinner lies in producing a good quality yarn with added cost benefit. The present work performs a multi-objective optimization on two objectives, viz. maximization of cotton yarn strength and minimization of raw material quality. The first objective function has been formulated based on the artificial neural network input-output relation between cotton fibre properties and yarn strength. The second objective function is formulated with the well-known regression equation of the spinning consistency index. It is obvious that these two objectives are conflicting in nature, i.e. no single combination of cotton fibre parameters exists which produces maximum yarn strength and minimum cotton fibre quality simultaneously. Therefore, there are several optimal solutions, from which a trade-off is needed depending upon the requirements of the user. In this work, the optimal solutions are obtained with an elitist multi-objective evolutionary algorithm based on the Non-dominated Sorting Genetic Algorithm II (NSGA-II). These optimum solutions may lead to the efficient exploitation of raw materials to produce better quality yarns at low cost.
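
    The sketch below shows the non-dominated sorting idea at the core of NSGA-II, applied to hypothetical candidate solutions scored on the two conflicting objectives (yarn strength to maximize, spinning consistency index to minimize). The objective values are random placeholders; the paper's ANN strength model and SCI regression are not reproduced here.

      import numpy as np

      def dominates(a, b):
          """a dominates b when its strength is no lower, its fibre quality index is
          no higher, and at least one of the two is strictly better."""
          return a[0] >= b[0] and a[1] <= b[1] and (a[0] > b[0] or a[1] < b[1])

      def pareto_front(points):
          """Indices of the non-dominated (rank-1) solutions."""
          return [i for i, p in enumerate(points)
                  if not any(dominates(q, p) for j, q in enumerate(points) if j != i)]

      rng = np.random.default_rng(42)
      strength = rng.uniform(12, 22, 30)          # yarn strength (cN/tex), hypothetical
      quality_index = rng.uniform(120, 220, 30)   # spinning consistency index, hypothetical
      points = list(zip(strength, quality_index))
      front = pareto_front(points)
      print(sorted((round(s, 1), round(q, 1)) for s, q in (points[i] for i in front)))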

  18. Saving Educational Dollars through Quality Objectives.

    ERIC Educational Resources Information Center

    Alvir, Howard P.

    This document is a collection of working papers written to meet the specific needs of teachers who are starting to think about and write performance objectives. It emphasizes qualitative objectives as opposed to quantitative classroom goals. The author describes quality objectives as marked by their clarity, accessibility, accountability, and…

  19. Image processing system performance prediction and product quality evaluation

    NASA Technical Reports Server (NTRS)

    Stein, E. K.; Hammill, H. B. (Principal Investigator)

    1976-01-01

    The author has identified the following significant results. A new technique for image processing system performance prediction and product quality evaluation was developed. It was entirely objective, quantitative, and general, and should prove useful in system design and quality control. The technique and its application to determination of quality control procedures for the Earth Resources Technology Satellite NASA Data Processing Facility are described.

  20. Objective Quality Assessment for Color-to-Gray Image Conversion.

    PubMed

    Ma, Kede; Zhao, Tiesong; Zeng, Kai; Wang, Zhou

    2015-12-01

    Color-to-gray (C2G) image conversion is the process of transforming a color image into a grayscale one. Despite its wide usage in real-world applications, little work has been dedicated to comparing the performance of C2G conversion algorithms. Subjective evaluation is reliable but is also inconvenient and time consuming. Here, we make one of the first attempts to develop an objective quality model that automatically predicts the perceived quality of C2G converted images. Inspired by the philosophy of the structural similarity index, we propose a C2G structural similarity (C2G-SSIM) index, which evaluates the luminance, contrast, and structure similarities between the reference color image and the C2G converted image. The three components are then combined depending on image type to yield an overall quality measure. Experimental results show that the proposed C2G-SSIM index has close agreement with subjective rankings and significantly outperforms existing objective quality metrics for C2G conversion. To explore the potentials of C2G-SSIM, we further demonstrate its use in two applications: 1) automatic parameter tuning for C2G conversion algorithms and 2) adaptive fusion of C2G converted images.
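
    As a rough sketch of an SSIM-style comparison of this kind, the code below computes whole-image luminance, contrast, and structure terms between the luminance channel of a reference colour image and a grayscale conversion. The stabilizing constants follow the usual SSIM choices for 8-bit images; this is not the C2G-SSIM index itself, which also adapts the combination to image type.

      import numpy as np

      def luminance(rgb):
          return 0.299 * rgb[..., 0] + 0.587 * rgb[..., 1] + 0.114 * rgb[..., 2]

      def c2g_similarity(rgb_ref, gray, c1=6.5, c2=58.5):
          x, y = luminance(rgb_ref.astype(float)), gray.astype(float)
          mx, my, sx, sy = x.mean(), y.mean(), x.std(), y.std()
          sxy = ((x - mx) * (y - my)).mean()
          l = (2 * mx * my + c1) / (mx**2 + my**2 + c1)   # luminance comparison
          c = (2 * sx * sy + c2) / (sx**2 + sy**2 + c2)   # contrast comparison
          s = (sxy + c2 / 2) / (sx * sy + c2 / 2)         # structure comparison
          return float(l * c * s)

      rng = np.random.default_rng(0)
      color = rng.integers(0, 256, (64, 64, 3)).astype(float)
      gray = luminance(color) + rng.normal(0, 2, (64, 64))  # toy C2G output
      print(c2g_similarity(color, gray))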

  1. Objective quality assessment for multiexposure multifocus image fusion.

    PubMed

    Hassen, Rania; Wang, Zhou; Salama, Magdy M A

    2015-09-01

    There has been a growing interest in image fusion technologies, but how to objectively evaluate the quality of fused images has not been fully understood. Here, we propose a method for objective quality assessment of multiexposure multifocus image fusion based on the evaluation of three key factors of fused image quality: 1) contrast preservation; 2) sharpness; and 3) structure preservation. Subjective experiments are conducted to create an image fusion database, based on which, performance evaluation shows that the proposed fusion quality index correlates well with subjective scores, and gives a significant improvement over the existing fusion quality measures.

  2. Manipulation of Unknown Objects to Improve the Grasp Quality Using Tactile Information.

    PubMed

    Montaño, Andrés; Suárez, Raúl

    2018-05-03

    This work presents a novel and simple approach in the area of manipulation of unknown objects considering both geometric and mechanical constraints of the robotic hand. Starting with an initial blind grasp, our method improves the grasp quality through manipulation considering the three common goals of the manipulation process: improving the hand configuration, the grasp quality and the object positioning, and, at the same time, prevents the object from falling. Tactile feedback is used to obtain local information of the contacts between the fingertips and the object, and no additional exteroceptive feedback sources are considered in the approach. The main novelty of this work lies in the fact that the grasp optimization is performed on-line as a reactive procedure using the tactile and kinematic information obtained during the manipulation. Experimental results are shown to illustrate the efficiency of the approach.

  3. Process perspective on image quality evaluation

    NASA Astrophysics Data System (ADS)

    Leisti, Tuomas; Halonen, Raisa; Kokkonen, Anna; Weckman, Hanna; Mettänen, Marja; Lensu, Lasse; Ritala, Risto; Oittinen, Pirkko; Nyman, Göte

    2008-01-01

    The psychological complexity of multivariate image quality evaluation makes it difficult to develop general image quality metrics. Quality evaluation includes several mental processes, and ignoring these processes and using only a few test images can lead to biased results. By using a qualitative/quantitative (Interpretation Based Quality, IBQ) methodology, we examined the process of pair-wise comparison in a setting where the quality of images printed by a laser printer on different paper grades was evaluated. The test image consisted of a picture of a table covered with several objects. Three other images were also used: photographs of a woman, a cityscape, and a countryside. In addition to the pair-wise comparisons, observers (N=10) were interviewed about the subjective quality attributes they used in making their quality decisions. An examination of the individual pair-wise comparisons revealed serious inconsistencies in observers' evaluations on the test image content, but not on the other contents. The qualitative analysis showed that this inconsistency was due to the observers' focus of attention. The lack of easily recognizable context in the test image may have contributed to this inconsistency. To obtain reliable knowledge of the effect of image context or attention on subjective image quality, a qualitative methodology is needed.

  4. The ventral visual pathway: an expanded neural framework for the processing of object quality.

    PubMed

    Kravitz, Dwight J; Saleem, Kadharbatcha S; Baker, Chris I; Ungerleider, Leslie G; Mishkin, Mortimer

    2013-01-01

    Since the original characterization of the ventral visual pathway, our knowledge of its neuroanatomy, functional properties, and extrinsic targets has grown considerably. Here we synthesize this recent evidence and propose that the ventral pathway is best understood as a recurrent occipitotemporal network containing neural representations of object quality both utilized and constrained by at least six distinct cortical and subcortical systems. Each system serves its own specialized behavioral, cognitive, or affective function, collectively providing the raison d'être for the ventral visual pathway. This expanded framework contrasts with the depiction of the ventral visual pathway as a largely serial staged hierarchy culminating in singular object representations and more parsimoniously incorporates attentional, contextual, and feedback effects. Published by Elsevier Ltd.

  5. Leadership, safety climate, and continuous quality improvement: impact on process quality and patient safety.

    PubMed

    McFadden, Kathleen L; Stock, Gregory N; Gowen, Charles R

    2014-10-01

    Successful amelioration of medical errors represents a significant problem in the health care industry. There is a need for greater understanding of the factors that lead to improved process quality and patient safety outcomes in hospitals. We present a research model that shows how transformational leadership, safety climate, and continuous quality improvement (CQI) initiatives are related to objective quality and patient safety outcome measures. The proposed framework is tested using structural equation modeling, based on data collected for 204 hospitals, and supplemented with objective outcome data from the Centers for Medicare and Medicaid Services. The results provide empirical evidence that a safety climate, which is connected to the chief executive officer's transformational leadership style, is related to CQI initiatives, which are linked to improved process quality. A unique finding of this study is that, although CQI initiatives are positively associated with improved process quality, they are also associated with higher hospital-acquired condition rates, a measure of patient safety. Likewise, safety climate is directly related to improved patient safety outcomes. The notion that patient safety climate and CQI initiatives are not interchangeable or universally beneficial is an important contribution to the literature. The results confirm the importance of using CQI to effectively enhance process quality in hospitals, and patient safety climate to improve patient safety outcomes. The overall pattern of findings suggests that simultaneous implementation of CQI initiatives and patient safety climate produces greater combined benefits.

  6. Leadership, safety climate, and continuous quality improvement: impact on process quality and patient safety.

    PubMed

    McFadden, Kathleen L; Stock, Gregory N; Gowen, Charles R

    2015-01-01

    Successful amelioration of medical errors represents a significant problem in the health care industry. There is a need for greater understanding of the factors that lead to improved process quality and patient safety outcomes in hospitals. We present a research model that shows how transformational leadership, safety climate, and continuous quality improvement (CQI) initiatives are related to objective quality and patient safety outcome measures. The proposed framework is tested using structural equation modeling, based on data collected for 204 hospitals, and supplemented with objective outcome data from the Centers for Medicare and Medicaid Services. The results provide empirical evidence that a safety climate, which is connected to the chief executive officer's transformational leadership style, is related to CQI initiatives, which are linked to improved process quality. A unique finding of this study is that, although CQI initiatives are positively associated with improved process quality, they are also associated with higher hospital-acquired condition rates, a measure of patient safety. Likewise, safety climate is directly related to improved patient safety outcomes. The notion that patient safety climate and CQI initiatives are not interchangeable or universally beneficial is an important contribution to the literature. The results confirm the importance of using CQI to effectively enhance process quality in hospitals, and patient safety climate to improve patient safety outcomes. The overall pattern of findings suggests that simultaneous implementation of CQI initiatives and patient safety climate produces greater combined benefits.

  7. Fast processing of microscopic images using object-based extended depth of field.

    PubMed

    Intarapanich, Apichart; Kaewkamnerd, Saowaluck; Pannarut, Montri; Shaw, Philip J; Tongsima, Sissades

    2016-12-22

    Microscopic analysis requires that foreground objects of interest, e.g. cells, are in focus. In a typical microscopic specimen, the foreground objects may lie on different depths of field necessitating capture of multiple images taken at different focal planes. The extended depth of field (EDoF) technique is a computational method for merging images from different depths of field into a composite image with all foreground objects in focus. Composite images generated by EDoF can be applied in automated image processing and pattern recognition systems. However, current algorithms for EDoF are computationally intensive and impractical, especially for applications such as medical diagnosis where rapid sample turnaround is important. Since foreground objects typically constitute a minor part of an image, the EDoF technique could be made to work much faster if only foreground regions are processed to make the composite image. We propose a novel algorithm called object-based extended depths of field (OEDoF) to address this issue. The OEDoF algorithm consists of four major modules: 1) color conversion, 2) object region identification, 3) good contrast pixel identification and 4) detail merging. First, the algorithm employs color conversion to enhance contrast followed by identification of foreground pixels. A composite image is constructed using only these foreground pixels, which dramatically reduces the computational time. We used 250 images obtained from 45 specimens of confirmed malaria infections to test our proposed algorithm. The resulting composite images with all in-focus objects were produced using the proposed OEDoF algorithm. We measured the performance of OEDoF in terms of image clarity (quality) and processing time. The features of interest selected by the OEDoF algorithm are comparable in quality with equivalent regions in images processed by the state-of-the-art complex wavelet EDoF algorithm; however, OEDoF required four times less processing time. This
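
    The sketch below illustrates the object-based EDoF idea in a few lines: a focus measure is evaluated per focal plane, and only pixels inside a foreground mask are replaced by the value from their sharpest plane. The foreground test, focus measure, and threshold are simple placeholders, not the four OEDoF modules described in the paper.

      import numpy as np

      def focus_measure(img):
          gy, gx = np.gradient(img)               # local gradient energy as sharpness
          return gx**2 + gy**2

      def object_based_edof(stack, fg_threshold=0.15):
          stack = np.asarray(stack, dtype=float)  # shape (n_planes, H, W)
          reference = stack[0]
          foreground = np.abs(reference - reference.mean()) / 255.0 > fg_threshold
          focus = np.array([focus_measure(img) for img in stack])
          best_plane = focus.argmax(axis=0)       # sharpest plane for each pixel
          composite = reference.copy()
          rows, cols = np.nonzero(foreground)     # only foreground pixels are merged
          composite[rows, cols] = stack[best_plane[rows, cols], rows, cols]
          return composite

      stack = [np.random.default_rng(i).integers(0, 256, (64, 64)) for i in range(4)]
      print(object_based_edof(stack).shape)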

  8. Information processing during NREM sleep and sleep quality in insomnia.

    PubMed

    Ceklic, Tijana; Bastien, Célyne H

    2015-12-01

    Insomnia sufferers (INS) are cortically hyperaroused during sleep, which seems to translate into altered information processing during nighttime. While information processing, as measured by event-related potentials (ERPs), during wake appears to be associated with sleep quality of the preceding night, the existence of such an association during nighttime has never been investigated. This study aims to investigate nighttime information processing among good sleepers (GS) and INS while considering concomitant sleep quality. Following a multistep clinical evaluation, INS and GS participants underwent 4 consecutive nights of PSG recordings in the sleep laboratory. Thirty-nine GS (mean age 34.56±9.02) and twenty-nine INS (mean age 43.03±9.12) were included in the study. ERPs (N1, P2, N350) were recorded all night on Night 4 (oddball paradigm) during NREM sleep. Regardless of sleep quality, INS presented a larger N350 amplitude during SWS (p=0.042) while GS showed a larger N350 amplitude during late-night stage 2 sleep (p=0.004). Regardless of diagnosis, those who slept objectively well showed a smaller N350 amplitude (p=0.020) while those who slept subjectively well showed a smaller P2 (p<0.001) and N350 amplitude (p=0.006). Also, those who reported an objectively bad night as good showed smaller P2 (p<0.001) and N350 (p=0.010) amplitudes. Information processing seems to be associated with concomitant subjective and objective sleep quality for both GS and INS. However, INS show an alteration in information processing during sleep, especially for inhibition processes, regardless of their sleep quality. Copyright © 2015 Elsevier B.V. All rights reserved.

  9. The ventral visual pathway: An expanded neural framework for the processing of object quality

    PubMed Central

    Kravitz, Dwight J.; Saleem, Kadharbatcha S.; Baker, Chris I.; Ungerleider, Leslie G.; Mishkin, Mortimer

    2012-01-01

    Since the original characterization of the ventral visual pathway, our knowledge of its neuroanatomy, functional properties, and extrinsic targets has grown considerably. Here we synthesize this recent evidence and propose that the ventral pathway is best understood as a recurrent occipitotemporal network containing neural representations of object quality both utilized and constrained by at least six distinct cortical and subcortical systems. Each system serves its own specialized behavioral, cognitive, or affective function, collectively providing the raison d'être for the ventral visual pathway. This expanded framework contrasts with the depiction of the ventral visual pathway as a largely serial staged hierarchy that culminates in singular object representations for utilization mainly by ventrolateral prefrontal cortex and, more parsimoniously than this account, incorporates attentional, contextual, and feedback effects. PMID:23265839

  10. The objective impact of clinical peer review on hospital quality and safety.

    PubMed

    Edwards, Marc T

    2011-01-01

    Despite its importance, the objective impact of clinical peer review on the quality and safety of care has not been studied. Data from 296 acute care hospitals show that peer review program and related organizational factors can explain up to 18% of the variation in standardized measures of quality and patient safety. The majority of programs rely on an outmoded and dysfunctional process model. Adoption of best practices informed by the continuing study of peer review program effectiveness has the potential to significantly improve patient outcomes.

  11. Object-oriented models of cognitive processing.

    PubMed

    Mather, G

    2001-05-01

    Information-processing models of vision and cognition are inspired by procedural programming languages. Models that emphasize object-based representations are closely related to object-oriented programming languages. The concepts underlying object-oriented languages provide a theoretical framework for cognitive processing that differs markedly from that offered by procedural languages. This framework is well-suited to a system designed to deal flexibly with discrete objects and unpredictable events in the world.

  12. Objective speech quality evaluation of real-time speech coders

    NASA Astrophysics Data System (ADS)

    Viswanathan, V. R.; Russell, W. H.; Huggins, A. W. F.

    1984-02-01

    This report describes the work performed in two areas: subjective testing of a real-time 16 kbit/s adaptive predictive coder (APC) and objective speech quality evaluation of real-time coders. The speech intelligibility of the APC coder was tested using the Diagnostic Rhyme Test (DRT), and the speech quality was tested using the Diagnostic Acceptability Measure (DAM) test, under eight operating conditions involving channel error, acoustic background noise, and tandem link with two other coders. The test results showed that the DRT and DAM scores of the APC coder equalled or exceeded the corresponding test scores of the 32 kbit/s CVSD coder. In the area of objective speech quality evaluation, the report describes the development, testing, and validation of a procedure for automatically computing several objective speech quality measures, given only the tape-recordings of the input speech and the corresponding output speech of a real-time speech coder.

  13. Distinct cognitive mechanisms involved in the processing of single objects and object ensembles

    PubMed Central

    Cant, Jonathan S.; Sun, Sol Z.; Xu, Yaoda

    2015-01-01

    Behavioral research has demonstrated that the shape and texture of single objects can be processed independently. Similarly, neuroimaging results have shown that an object's shape and texture are processed in distinct brain regions with shape in the lateral occipital area and texture in parahippocampal cortex. Meanwhile, objects are not always seen in isolation and are often grouped together as an ensemble. We recently showed that the processing of ensembles also involves parahippocampal cortex and that the shape and texture of ensemble elements are processed together within this region. These neural data suggest that the independence seen between shape and texture in single-object perception would not be observed in object-ensemble perception. Here we tested this prediction by examining whether observers could attend to the shape of ensemble elements while ignoring changes in an unattended texture feature and vice versa. Across six behavioral experiments, we replicated previous findings of independence between shape and texture in single-object perception. In contrast, we observed that changes in an unattended ensemble feature negatively impacted the processing of an attended ensemble feature only when ensemble features were attended globally. When they were attended locally, thereby making ensemble processing similar to single-object processing, interference was abolished. Overall, these findings confirm previous neuroimaging results and suggest that distinct cognitive mechanisms may be involved in single-object and object-ensemble perception. Additionally, they show that the scope of visual attention plays a critical role in determining which type of object processing (ensemble or single object) is engaged by the visual system. PMID:26360156

  14. Realization of high quality production schedules: Structuring quality factors via iteration of user specification processes

    NASA Technical Reports Server (NTRS)

    Hamazaki, Takashi

    1992-01-01

    This paper describes an architecture for realizing high quality production schedules. Although quality is one of the most important aspects of production scheduling, it is difficult, even for a user, to specify precisely. However, it is also true that the decision as to whether a scheduler is good or bad can only be made by the user. This paper proposes the following: (1) the quality of a schedule can be represented in the form of quality factors, i.e. constraints and objectives of the domain, and their structure; (2) quality factors and their structure can be used for decision making at local decision points during the scheduling process; and (3) that they can be defined via iteration of user specification processes.

  15. Multi-objective Optimization of Pulsed Gas Metal Arc Welding Process Using Neuro NSGA-II

    NASA Astrophysics Data System (ADS)

    Pal, Kamal; Pal, Surjya K.

    2018-05-01

    Weld quality is a critical issue in fabrication industries where products are custom-designed. Multi-objective optimization results in a number of solutions on the Pareto-optimal front. Mathematical regression-model-based optimization methods are often found to be inadequate for highly non-linear arc welding processes. Thus, various global evolutionary approaches like artificial neural networks and genetic algorithms (GA) have been developed. The present work applies the elitist non-dominated sorting GA (NSGA-II) to the optimization of the pulsed gas metal arc welding process using back-propagation neural network (BPNN) based weld quality feature models. The primary objective, to maintain butt joint weld quality, is the maximization of tensile strength with minimum plate distortion. The BPNN has been used to compute the fitness of each solution after adequate training, whereas the NSGA-II algorithm generates the optimum solutions for the two conflicting objectives. Welding experiments have been conducted on low carbon steel using response surface methodology. The Pareto-optimal front with three ranked solutions after the 20th generation was considered the best without further improvement. The joint strength as well as the transverse shrinkage was found to be drastically improved over the design-of-experiments results, as per the validated Pareto-optimal solutions obtained.

  16. Objective Quality and Intelligibility Prediction for Users of Assistive Listening Devices

    PubMed Central

    Falk, Tiago H.; Parsa, Vijay; Santos, João F.; Arehart, Kathryn; Hazrati, Oldooz; Huber, Rainer; Kates, James M.; Scollie, Susan

    2015-01-01

    This article presents an overview of twelve existing objective speech quality and intelligibility prediction tools. Two classes of algorithms are presented, namely intrusive and non-intrusive, with the former requiring the use of a reference signal, while the latter does not. Investigated metrics include both those developed for normal hearing listeners, as well as those tailored particularly for hearing impaired (HI) listeners who are users of assistive listening devices (i.e., hearing aids, HAs, and cochlear implants, CIs). Representative examples of those optimized for HI listeners include the speech-to-reverberation modulation energy ratio, tailored to hearing aids (SRMR-HA) and to cochlear implants (SRMR-CI); the modulation spectrum area (ModA); the hearing aid speech quality (HASQI) and perception indices (HASPI); and the PErception MOdel - hearing impairment quality (PEMO-Q-HI). The objective metrics are tested on three subjectively-rated speech datasets covering reverberation-alone, noise-alone, and reverberation-plus-noise degradation conditions, as well as degradations resultant from nonlinear frequency compression and different speech enhancement strategies. The advantages and limitations of each measure are highlighted and recommendations are given for suggested uses of the different tools under specific environmental and processing conditions. PMID:26052190

  17. Object-based neglect in number processing

    PubMed Central

    2013-01-01

    Recent evidence suggests that neglect patients seem to have particular problems representing relatively smaller numbers corresponding to the left part of the mental number line. However, while this indicates space-based neglect for representational number space, little is known about whether and - if so - how object-based neglect influences number processing. To evaluate influences of object-based neglect in numerical cognition, a group of neglect patients and two control groups had to compare two-digit numbers to an internally represented standard. Conceptualizing two-digit numbers as objects of which the left part (i.e., the tens digit) should be specifically neglected, we were able to evaluate object-based neglect for number magnitude processing. Object-based neglect was indicated by a larger unit-decade compatibility effect, actually reflecting impaired processing of the leftward tens digits. Additionally, faster processing of within- as compared to between-decade items provided further evidence suggesting particular difficulties in integrating tens and units into the place-value structure of the Arabic number system. In summary, the present study indicates that, in addition to the spatial representation of number magnitude, the processing of place-value information of multi-digit numbers also seems specifically impaired in neglect patients. PMID:23343126

  18. Objective speech quality assessment and the RPE-LTP coding algorithm in different noise and language conditions.

    PubMed

    Hansen, J H; Nandkumar, S

    1995-01-01

    The formulation of reliable signal processing algorithms for speech coding and synthesis requires the selection of a prior criterion of performance. Though coding efficiency (bits/second) or computational requirements can be used, a final performance measure must always include speech quality. In this paper, three objective speech quality measures are considered with respect to quality assessment for American English, noisy American English, and noise-free versions of seven languages. The purpose is to determine whether objective quality measures can be used to quantify changes in quality for a given voice coding method, with a known subjective performance level, as background noise or language conditions are changed. The speech coding algorithm chosen is regular-pulse excitation with long-term prediction (RPE-LTP), which has been chosen as the standard voice compression algorithm for the European Digital Mobile Radio system. Three areas are considered for objective quality assessment which include: (i) vocoder performance for American English in a noise-free environment, (ii) speech quality variation for three additive background noise sources, and (iii) noise-free performance for seven languages which include English, Japanese, Finnish, German, Hindi, Spanish, and French. It is suggested that although existing objective quality measures will never replace subjective testing, they can be a useful means of assessing changes in performance, identifying areas for improvement in algorithm design, and augmenting subjective quality tests for voice coding/compression algorithms in noise-free, noisy, and/or non-English applications.
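
    For illustration, the sketch below computes one classical objective measure of the kind such studies employ: segmental SNR between a coder's input and its time-aligned output. The frame length and clamping range are conventional choices, not values taken from the paper, and the signals are synthetic stand-ins.

      import numpy as np

      def segmental_snr(reference, degraded, frame_len=256, floor=-10.0, ceil=35.0):
          n = min(len(reference), len(degraded)) // frame_len * frame_len
          ref = reference[:n].reshape(-1, frame_len)
          err = ref - degraded[:n].reshape(-1, frame_len)
          eps = 1e-12
          snr = 10 * np.log10((np.sum(ref**2, axis=1) + eps) / (np.sum(err**2, axis=1) + eps))
          return float(np.mean(np.clip(snr, floor, ceil)))  # clamp silent/very noisy frames

      rng = np.random.default_rng(0)
      clean = rng.normal(0.0, 1.0, 8000)            # stand-in for the coder input
      coded = clean + rng.normal(0.0, 0.1, 8000)    # stand-in for the coder output
      print(segmental_snr(clean, coded))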

  19. DATA QUALITY OBJECTIVES FOR SELECTING WASTE SAMPLES FOR THE BENCH STEAM REFORMER TEST

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    BANNING DL

    2010-08-03

    This document describes the data quality objectives to select archived samples located at the 222-S Laboratory for Fluid Bed Steam Reformer testing. The type, quantity and quality of the data required to select the samples for Fluid Bed Steam Reformer testing are discussed. In order to maximize the efficiency and minimize the time to treat Hanford tank waste in the Waste Treatment and Immobilization Plant, additional treatment processes may be required. One of the potential treatment processes is the fluid bed steam reformer (FBSR). A determination of the adequacy of the FBSR process to treat Hanford tank waste is required. The initial step in determining the adequacy of the FBSR process is to select archived waste samples from the 222-S Laboratory that will be used to test the FBSR process. Analyses of the selected samples will be required to confirm the samples meet the testing criteria.

  20. Ecologically and economically conscious design of the injected pultrusion process via multi-objective optimization

    NASA Astrophysics Data System (ADS)

    Srinivasagupta, Deepak; Kardos, John L.

    2004-05-01

    Injected pultrusion (IP) is an environmentally benign continuous process for low-cost manufacture of prismatic polymer composites. IP has been of recent regulatory interest as an option to achieve significant vapour emissions reduction. This work describes the design of the IP process with multiple design objectives. In our previous work (Srinivasagupta D et al 2003 J. Compos. Mater. at press), an algorithm for economic design using a validated three-dimensional physical model of the IP process was developed, subject to controllability considerations. In this work, this algorithm was used in a multi-objective optimization approach to simultaneously meet economic, quality related, and environmental objectives. The retrofit design of a bench-scale set-up was considered, and the concept of exergy loss in the process, as well as in vapour emission, was introduced. The multi-objective approach was able to determine the optimal values of the processing parameters such as heating zone temperatures and resin injection pressure, as well as the equipment specifications (die dimensions, heater, puller and pump ratings) that satisfy the various objectives in a weighted sense, and result in enhanced throughput rates. The economic objective did not coincide with the environmental objective, and a compromise became necessary. It was seen that most of the exergy loss is in the conversion of electric power into process heating. Vapour exergy loss was observed to be negligible for the most part.

  1. Implementation of quality by design toward processing of food products.

    PubMed

    Rathore, Anurag S; Kapoor, Gautam

    2017-05-28

    Quality by design (QbD) is a systematic approach that begins with predefined objectives and emphasizes product and process understanding and process control. It is an approach based on principles of sound science and quality risk management. As the food processing industry continues to embrace the idea of in-line, online, and/or at-line sensors and real-time characterization for process monitoring and control, the existing gaps with regard to our ability to monitor multiple parameters/variables associated with the manufacturing process will be alleviated over time. Investments made for development of tools and approaches that facilitate high-throughput analytical and process development, process analytical technology, design of experiments, risk analysis, knowledge management, and enhancement of process/product understanding would pave the way for operational and economic benefits later in the commercialization process and across other product pipelines. This article aims to achieve two major objectives: first, to review the progress that has been made in recent years on the topic of QbD implementation in the processing of food products; and second, to present a case study that illustrates the benefits of such QbD implementation.

  2. A new approach to the identification of Landscape Quality Objectives (LQOs) as a set of indicators.

    PubMed

    Sowińska-Świerkosz, Barbara Natalia; Chmielewski, Tadeusz J

    2016-12-15

    The objective of the paper is threefold: (1) to introduce Landscape Quality Objectives (LQOs) as a set of indicators; (2) to present a method of linking social and expert opinion in the process of the formulation of landscape indicators; and (3) to present a methodological framework for the identification of LQOs. The implementation of these goals adopted a six-stage procedure based on the use of landscape units: (1) GIS analysis; (2) classification; (3) social survey; (4) expert value judgement; (5) quality assessment; and (6) guidelines formulation. The essence of the research was the presentation of features that determine landscape quality according to public opinion as a set of indicators. The results showed that 80 such indicators were identified, of both a qualitative (49) and a quantitative character (31). Among the analysed units, 60% (18 objects) featured socially expected (and confirmed by experts) levels of landscape quality, and 20% (6 objects) required overall quality improvement in terms of both public and expert opinion. The adopted procedure provides a new tool for integrating social responsibility into environmental management. The advantage of the presented method is the possibility of its application in the territories of various European countries. It is flexible enough to be based on cartographic studies, landscape research methods, and environmental quality standards existing in a given country. Copyright © 2016 Elsevier Ltd. All rights reserved.

  3. An object-oriented description method of EPMM process

    NASA Astrophysics Data System (ADS)

    Jiang, Zuo; Yang, Fan

    2017-06-01

    In order to use mature object-oriented tools and languages in software process modelling, and to make the software process model accord better with industrial standards, it is necessary to study object-oriented modelling of the software process. Based on the formal process definition in EPMM, and considering that the Petri net is mainly a formal modelling tool, this paper combines Petri net modelling with the object-oriented modelling idea and provides an implementation method to convert Petri-net-based EPMM into object models based on an object-oriented description.

  4. DATA QUALITY OBJECTIVE SUMMARY REPORT FOR THE 105 K EAST ION EXCHANGE COLUMN MONOLITH

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    JOCHEN, R.M.

    2007-08-02

    The 105-K East (KE) Basin Ion Exchange Column (IXC) cells, lead caves, and the surrounding vault are to be removed as necessary components in implementing ''Hanford Federal Facility Agreement and Consent Order'' (Ecology et al. 2003) milestone M-034-32 (Complete Removal of the K East Basin Structure). The IXCs consist of six units located in the KE Basin, three in operating positions in cells and three stored in a lead cave. Methods to remove the IXCs from the KE Basin were evaluated in KBC-28343, ''Disposal of K East Basin Ion Exchange Column Evaluation''. The method selected for removal was grouting the six IXCs into a single monolith for disposal at the Environmental Restoration Disposal Facility (ERDF). Grout will be added to the IXC cells, the IXC lead caves containing spent IXCs, and the spaces between the lead cave walls and metal skin, to immobilize the contaminants, provide self-shielding, minimize void space, and provide a structurally stable waste form. The waste to be offered for disposal is the encapsulated monolith defined by the exterior surfaces of the vault and the lower surface of the underlying slab. This document presents a summary of the data quality objective (DQO) process establishing the decisions and data required to support decision-making activities for the disposition of the IXC monolith. The DQO process is completed in accordance with the seven-step planning process described in EPA QA/G-4, ''Guidance for the Data Quality Objectives Process'', which is used to clarify study objectives; define the appropriate type, quantity, and quality of data; and support defensible decision-making. The DQO process involves the following steps: (1) state the problem; (2) identify the decision; (3) identify the inputs to the decision; (4) define the boundaries of the study; (5) develop a decision rule (DR); (6) specify tolerable limits on decision errors; and (7) optimize the design for obtaining data.

  5. Data Quality Objectives for Tank Farms Waste Compatibility Program

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    BANNING, D.L.

    1999-07-02

    There are 177 waste storage tanks containing over 210,000 m³ (55 million gal) of mixed waste at the Hanford Site. The River Protection Project (RPP) has adopted the data quality objective (DQO) process used by the U.S. Environmental Protection Agency (EPA) (EPA 1994a) and implemented by RPP internal procedure (Banning 1999a) to identify the information and data needed to address safety issues. This DQO document is based on several documents that provide the technical basis for inputs and decision/action levels used to develop the decision rules that evaluate the transfer of wastes. A number of these documents are presently in the process of being revised. This document will need to be revised if there are changes to the technical criteria in these supporting documents. This DQO process supports various documents, such as sampling and analysis plans and double-shell tank (DST) waste analysis plans. This document identifies the type, quality, and quantity of data needed to determine whether transfer of supernatant can be performed safely. The requirements in this document are designed to prevent the mixing of incompatible waste as defined in Washington Administrative Code (WAC) 173-303-040. Waste transfers which meet the requirements contained in this document and the Double-Shell Tank Waste Analysis Plan (Mulkey 1998) are considered to be compatible, and prevent the mixing of incompatible waste.

  6. Measuring health care process quality with software quality measures.

    PubMed

    Yildiz, Ozkan; Demirörs, Onur

    2012-01-01

    Existing quality models focus on specific diseases, clinics, or clinical areas. Although they contain structure, process, or output type measures, there is no model that measures the quality of health care processes comprehensively. In addition, because overall process quality is not measured, hospitals cannot compare the quality of their processes internally and externally. To address these problems, a new model was developed from software quality measures. We adopted the ISO/IEC 9126 software quality standard for health care processes. Then, JCIAS (Joint Commission International Accreditation Standards for Hospitals) measurable elements were added to the model scope to unify functional requirements. Measurement results for the assessment (diagnosing) process are provided in this paper. After the application, it was concluded that the model determines weak and strong aspects of the processes, gives a more detailed picture of process quality, and provides quantifiable information that hospitals can use to compare their processes with those of other organizations.

  7. Statistical process management: An essential element of quality improvement

    NASA Astrophysics Data System (ADS)

    Buckner, M. R.

    Successful quality improvement requires a balanced program involving the three elements that control quality: organization, people and technology. The focus of the SPC/SPM User's Group is to advance the technology component of Total Quality by networking within the Group and by providing an outreach within Westinghouse to foster the appropriate use of statistical techniques to achieve Total Quality. SPM encompasses the disciplines by which a process is measured against its intrinsic design capability, in the face of measurement noise and other obscuring variability. SPM tools facilitate decisions about the process that generated the data. SPM deals typically with manufacturing processes, but with some flexibility of definition and technique it accommodates many administrative processes as well. The techniques of SPM are those of Statistical Process Control, Statistical Quality Control, Measurement Control, and Experimental Design. In addition, techniques such as job and task analysis, and concurrent engineering, are important elements of systematic planning and analysis that are needed early in the design process to ensure success. The SPC/SPM User's Group is endeavoring to achieve its objectives by sharing successes that have occurred within members' own Westinghouse departments as well as within other US and foreign industries. In addition, failures are reviewed to establish lessons learned in order to improve future applications. In broader terms, the Group is interested in making SPM the accepted way of doing business within Westinghouse.

  8. Can state-of-the-art HVS-based objective image quality criteria be used for image reconstruction techniques based on ROI analysis?

    NASA Astrophysics Data System (ADS)

    Dostal, P.; Krasula, L.; Klima, M.

    2012-06-01

    Various image processing techniques in multimedia technology are optimized using the visual attention feature of the human visual system. Spatial non-uniformity means that different locations in an image are of different importance for the perception of the image. In other words, the perceived image quality depends mainly on the quality of important locations known as regions of interest. The performance of such techniques is measured by subjective evaluation or by objective image quality criteria. Many state-of-the-art objective metrics are based on HVS properties: SSIM and MS-SSIM based on image structural information, VIF based on the information that the human brain can ideally gain from the reference image, or FSIM utilizing low-level features to assign a different importance to each location in the image. Still, none of these objective metrics utilizes the analysis of regions of interest. We address the question of whether these objective metrics can be used for effective evaluation of images reconstructed by processing techniques based on ROI analysis utilizing high-level features. In this paper the authors show that the state-of-the-art objective metrics do not correlate well with subjective evaluation when demosaicing based on ROI analysis is used for reconstruction. The ROIs were computed from "ground truth" visual attention data. An algorithm combining two known demosaicing techniques on the basis of ROI location is proposed, which reconstructs the ROI at fine quality while the rest of the image is reconstructed at low quality. The color image reconstructed by this ROI approach was compared with selected demosaicing techniques by objective criteria and by subjective testing. The qualitative comparison of the objective and subjective results indicates that the state-of-the-art objective metrics are still not suitable for evaluating image processing techniques based on ROI analysis and that new criteria are needed.
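
    As an illustration of the kind of ROI-aware evaluation argued for above, the sketch below weights a per-pixel SSIM map by a region-of-interest mask before pooling. It is a minimal example rather than the authors' method; the ROI mask, the weight value, and the stand-in images are all assumptions made for illustration.

```python
# Minimal sketch: pool an SSIM map with extra weight inside an assumed ROI mask,
# so that quality inside the region of interest dominates the overall score.
import numpy as np
from skimage.metrics import structural_similarity

def roi_weighted_ssim(reference, test, roi_mask, roi_weight=4.0):
    """reference, test: float images in [0, 1]; roi_mask: boolean array."""
    _, ssim_map = structural_similarity(reference, test,
                                        data_range=1.0, full=True)
    weights = np.where(roi_mask, roi_weight, 1.0)   # assumed ROI importance
    return float((ssim_map * weights).sum() / weights.sum())

# Usage with random stand-in images (illustration only).
rng = np.random.default_rng(0)
ref = rng.random((64, 64))
deg = np.clip(ref + 0.05 * rng.standard_normal((64, 64)), 0.0, 1.0)
mask = np.zeros((64, 64), dtype=bool)
mask[16:48, 16:48] = True                           # hypothetical ROI
print(roi_weighted_ssim(ref, deg, mask))
```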

  9. DATA QUALITY OBJECTIVES AND STATISTICAL DESIGN SUPPORT FOR DEVELOPMENT OF A MONITORING PROTOCOL FOR RECREATIONAL WATERS

    EPA Science Inventory

    The purpose of this report is to describe the outputs of the Data Quality Objectives (DQOs) Process and discussions about developing a statistical design that will be used to implement the research study of recreational beach waters.

  10. Dissociating verbal and nonverbal audiovisual object processing.

    PubMed

    Hocking, Julia; Price, Cathy J

    2009-02-01

    This fMRI study investigates how audiovisual integration differs for verbal stimuli that can be matched at a phonological level and nonverbal stimuli that can be matched at a semantic level. Subjects were presented simultaneously with one visual and one auditory stimulus and were instructed to decide whether these stimuli referred to the same object or not. Verbal stimuli were simultaneously presented spoken and written object names, and nonverbal stimuli were photographs of objects simultaneously presented with naturally occurring object sounds. Stimulus differences were controlled by including two further conditions that paired photographs of objects with spoken words and object sounds with written words. Verbal matching, relative to all other conditions, increased activation in a region of the left superior temporal sulcus that has previously been associated with phonological processing. Nonverbal matching, relative to all other conditions, increased activation in a right fusiform region that has previously been associated with structural and conceptual object processing. Thus, we demonstrate how brain activation for audiovisual integration depends on the verbal content of the stimuli, even when stimulus and task processing differences are controlled.

  11. Objects and categories: feature statistics and object processing in the ventral stream.

    PubMed

    Tyler, Lorraine K; Chiu, Shannon; Zhuang, Jie; Randall, Billi; Devereux, Barry J; Wright, Paul; Clarke, Alex; Taylor, Kirsten I

    2013-10-01

    Recognizing an object involves more than just visual analyses; its meaning must also be decoded. Extensive research has shown that processing the visual properties of objects relies on a hierarchically organized stream in ventral occipitotemporal cortex, with increasingly more complex visual features being coded from posterior to anterior sites culminating in the perirhinal cortex (PRC) in the anteromedial temporal lobe (aMTL). The neurobiological principles of the conceptual analysis of objects remain more controversial. Much research has focused on two neural regions-the fusiform gyrus and aMTL, both of which show semantic category differences, but of different types. fMRI studies show category differentiation in the fusiform gyrus, based on clusters of semantically similar objects, whereas category-specific deficits, specifically for living things, are associated with damage to the aMTL. These category-specific deficits for living things have been attributed to problems in differentiating between highly similar objects, a process that involves the PRC. To determine whether the PRC and the fusiform gyri contribute to different aspects of an object's meaning, with differentiation between confusable objects in the PRC and categorization based on object similarity in the fusiform, we carried out an fMRI study of object processing based on a feature-based model that characterizes the degree of semantic similarity and difference between objects and object categories. Participants saw 388 objects for which feature statistic information was available and named the objects at the basic level while undergoing fMRI scanning. After controlling for the effects of visual information, we found that feature statistics that capture similarity between objects formed category clusters in fusiform gyri, such that objects with many shared features (typical of living things) were associated with activity in the lateral fusiform gyri whereas objects with fewer shared features (typical

  12. DATA QUALITY OBJECTIVES FOR SELECTING WASTE SAMPLES FOR BENCH-SCALE REFORMER TREATABILITY STUDIES

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    BANNING DL

    2011-02-11

    This document describes the data quality objectives used to select archived samples located at the 222-S Laboratory for Bench-Scale Reforming testing. The type, quantity, and quality of the data required to select the samples for Fluid Bed Steam Reformer testing are discussed. In order to maximize the efficiency and minimize the time to treat Hanford tank waste in the Waste Treatment and Immobilization Plant, additional treatment processes may be required. One of the potential treatment processes is the fluidized bed steam reformer. A determination of the adequacy of the fluidized bed steam reformer process to treat Hanford tank waste is required. The initial step in determining the adequacy of the fluidized bed steam reformer process is to select archived waste samples from the 222-S Laboratory that will be used in bench-scale tests. Analyses of the selected samples will be required to confirm that the samples meet the shipping requirements and for comparison with the bench-scale reformer (BSR) test sample selection requirements.

  13. Objective vocal quality in children using cochlear implants: a multiparameter approach.

    PubMed

    Baudonck, Nele; D'haeseleer, Evelien; Dhooge, Ingeborg; Van Lierde, Kristiane

    2011-11-01

    The purpose of this study was to determine the objective vocal quality of 36 prelingually deaf children using cochlear implants (CIs), with a mean age of 9 years. An additional purpose was to compare the objective vocal quality of these 36 CI users with that of 25 age-matched children with prelingual severe hearing loss using conventional hearing aids (HAs) and 25 normal-hearing (NH) children. The design for this cross-sectional study was a multigroup posttest-only design. Objective vocal quality was measured by means of the dysphonia severity index (DSI). Moreover, perceptual voice assessment using the GRBASI scale was performed. The CI children had a vocal quality, measured by the DSI, of +1.8, corresponding to a DSI% of 68%, indicating a borderline vocal quality situated 2% above the limit of normality. The voice was perceptually characterized by the presence of a very slight grade of hoarseness, roughness, strained phonation, and higher pitch and intensity levels. No significant objective vocal quality differences were measured between the voices of the CI children, HA users, and NH children. According to the results, the vocal approach in children with CIs and HAs should focus on improving the strained vocal characteristic and on the use of a lower pitch and intensity level. Copyright © 2011 The Voice Foundation. Published by Mosby, Inc. All rights reserved.

  14. A Quality Sorting of Fruit Using a New Automatic Image Processing Method

    NASA Astrophysics Data System (ADS)

    Amenomori, Michihiro; Yokomizu, Nobuyuki

    This paper presents an innovative approach for quality sorting of objects, such as apple sorting in an agricultural factory, using an image processing algorithm. The objectives of our approach are, first, to sort the objects by their colors precisely and, second, to detect any irregularity of the colors on the surface of the apples efficiently. An experiment was conducted and the results were obtained and compared with those of a human sorting process and of color sensor sorting devices. The results demonstrate that our approach is capable of sorting the objects rapidly and that the valid classification rate was 100%.
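
    To make the colour-sorting idea concrete, the sketch below grades a fruit image from the mean hue inside the frame using OpenCV. The hue thresholds, the grade labels, and the synthetic test image are assumptions for illustration; they are not the algorithm used in the paper.

```python
# Minimal sketch: classify a fruit image into a colour grade from its mean hue.
import cv2
import numpy as np

def colour_grade(bgr_image):
    hsv = cv2.cvtColor(bgr_image, cv2.COLOR_BGR2HSV)
    mean_hue = float(hsv[:, :, 0].mean())           # OpenCV hue range is 0-179
    if mean_hue < 15 or mean_hue > 160:             # assumed "red" band
        return "grade A (red)"
    if mean_hue < 35:                               # assumed "yellow" band
        return "grade B (yellow)"
    return "grade C (green/other)"

# Synthetic solid-colour stand-in for an apple image (BGR order).
red_apple = np.zeros((64, 64, 3), dtype=np.uint8)
red_apple[:] = (20, 30, 200)
print(colour_grade(red_apple))
```

    A real system would first segment the fruit from the background and could flag colour irregularities by inspecting the hue variance within the segmented region.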

  15. Corporate objectives and the planning process.

    PubMed

    White, S

    1990-02-01

    The embodiment of corporate objectives in a workable planning process enables all employees to develop an identity larger than themselves. This results in a more cohesive body and makes it easier to implement the organization's strategy and mission. The senior executives at University Hospital have a long history with the organization and therefore know it well. Whether the new process makes planning more coordinated and comprehensive will be measured by both the subjective and the objective assessment of these executives.

  16. Methodology for Evaluating Quality and Reusability of Learning Objects

    ERIC Educational Resources Information Center

    Kurilovas, Eugenijus; Bireniene, Virginija; Serikoviene, Silvija

    2011-01-01

    The aim of the paper is to present the scientific model and several methods for the expert evaluation of quality of learning objects (LOs) paying especial attention to LOs reusability level. The activities of eQNet Quality Network for a European Learning Resource Exchange (LRE) aimed to improve reusability of LOs of European Schoolnet's LRE…

  17. Emotion and Object Processing in Parkinson's Disease

    ERIC Educational Resources Information Center

    Cohen, Henri; Gagne, Marie-Helene; Hess, Ursula; Pourcher, Emmanuelle

    2010-01-01

    The neuropsychological literature on the processing of emotions in Parkinson's disease (PD) reveals conflicting evidence about the role of the basal ganglia in the recognition of facial emotions. Hence, the present study had two objectives. One was to determine the extent to which the visual processing of emotions and objects differs in PD. The…

  18. Quality and Content of Individualized Habilitation Plan Objectives in Residential Settings.

    ERIC Educational Resources Information Center

    Stancliffe, Roger J.; Hayden, Mary F.; Lakin, K. Charlie

    2000-01-01

    The quality, number, and content of residential Individualized Habilitation Plans (IHP) objectives were evaluated for 155 adult institution and community residents. Over 90 percent of objectives were functional and age appropriate. Community residents had significantly more IHP objectives and also had objectives from a wider variety of content…

  19. Objective quality assessment of tone-mapped images.

    PubMed

    Yeganeh, Hojatollah; Wang, Zhou

    2013-02-01

    Tone-mapping operators (TMOs) that convert high dynamic range (HDR) to low dynamic range (LDR) images provide practically useful tools for the visualization of HDR images on standard LDR displays. Different TMOs create different tone-mapped images, and a natural question is which one has the best quality. Without an appropriate quality measure, different TMOs cannot be compared, and further improvement is directionless. Subjective rating may be a reliable evaluation method, but it is expensive and time consuming, and, more importantly, is difficult to embed into optimization frameworks. Here we propose an objective quality assessment algorithm for tone-mapped images by combining: 1) a multiscale signal fidelity measure based on a modified structural similarity index and 2) a naturalness measure based on the intensity statistics of natural images. Validations using independent subject-rated image databases show good correlations between subjective ranking scores and the proposed tone-mapped image quality index (TMQI). Furthermore, we demonstrate extended applications of TMQI using two examples: parameter tuning for TMOs and adaptive fusion of multiple tone-mapped images.
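
    The pooling described above combines the structural fidelity term S and the naturalness term N into a single score. In the published TMQI formulation this takes roughly the following form, where the mixing weight a and the exponents α and β are constants fitted to subjective data (specific values are not reproduced here); this is an indicative restatement rather than the definitive formula:

```latex
\mathrm{TMQI}(X, Y) = a \, S(X, Y)^{\alpha} + (1 - a) \, N(Y)^{\beta},
\qquad 0 \le a \le 1, \quad S, N \in [0, 1]
```

    Here X denotes the HDR reference and Y the tone-mapped LDR image; S is the multiscale structural fidelity measure and N the statistical naturalness measure.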

  20. The probability of object-scene co-occurrence influences object identification processes.

    PubMed

    Sauvé, Geneviève; Harmand, Mariane; Vanni, Léa; Brodeur, Mathieu B

    2017-07-01

    Contextual information allows the human brain to make predictions about the identity of objects that might be seen and irregularities between an object and its background slow down perception and identification processes. Bar and colleagues modeled the mechanisms underlying this beneficial effect suggesting that the brain stocks information about the statistical regularities of object and scene co-occurrence. Their model suggests that these recurring regularities could be conceptualized along a continuum in which the probability of seeing an object within a given scene can be high (probable condition), moderate (improbable condition) or null (impossible condition). In the present experiment, we propose to disentangle the electrophysiological correlates of these context effects by directly comparing object-scene pairs found along this continuum. We recorded the event-related potentials of 30 healthy participants (18-34 years old) and analyzed their brain activity in three time windows associated with context effects. We observed anterior negativities between 250 and 500 ms after object onset for the improbable and impossible conditions (improbable more negative than impossible) compared to the probable condition as well as a parieto-occipital positivity (improbable more positive than impossible). The brain may use different processing pathways to identify objects depending on whether the probability of co-occurrence with the scene is moderate (rely more on top-down effects) or null (rely more on bottom-up influences). The posterior positivity could index error monitoring aimed to ensure that no false information is integrated into mental representations of the world.

  1. Environmental flows and water quality objectives for the River Murray.

    PubMed

    Gippel, C; Jacobs, T; McLeod, T

    2002-01-01

    Over the past decade, there has been intense consideration of managing flows in the River Murray to provide environmental benefits. In 1990 the Murray-Darling Basin Ministerial Council adopted a water quality policy: to maintain and, where necessary, improve existing water quality in the rivers of the Murray-Darling Basin for all beneficial uses - agricultural, environmental, urban, industrial and recreational; and in 1994 a flow policy: to maintain and, where necessary, improve existing flow regimes in the waterways of the Murray-Darling Basin to protect and enhance the riverine environment. The Audit of Water Use followed in 1995, culminating in the decision of the Ministerial Council to implement an interim cap on new diversions for consumptive use (the "Cap") in a bid to halt declining river health. In March 1999 the Environmental Flows and Water Quality Objectives for the River Murray Project (the Project) was set up, primarily to enable objectives to be developed that aim to achieve a sustainable river environment and water quality, in accordance with community needs, and including an adaptive approach to management and operation of the River. It will lead to objectives for water quality and environmental flows that are feasible, appropriate, have the support of the scientific, management and stakeholder communities, and carry acceptable levels of risk. This paper describes four key aspects of the process being undertaken to determine the objectives, and to design the flow options that will meet those objectives: establishment of an appropriate technical, advisory and administrative framework; establishing clear evidence for regulation impacts; undertaking assessment of environmental flow needs; and filling knowledge gaps. A review of the impacts of flow regulation on the health of the River Murray revealed evidence for decline, but the case for flow regulation as the main cause is circumstantial or uncertain. This is to be expected, because the decline of the River Murray results

  2. A system to program projects to meet visual quality objectives

    Treesearch

    Fred L. Henley; Frank L. Hunsaker

    1979-01-01

    The U. S. Forest Service has established Visual Quality Objectives for National Forest lands and determined a method to ascertain the Visual Absorption Capability of those lands. Combining the two mapping inventories has allowed the Forest Service to retain the visual quality while managing natural resources.

  3. Projector-Based Augmented Reality for Quality Inspection of Scanned Objects

    NASA Astrophysics Data System (ADS)

    Kern, J.; Weinmann, M.; Wursthorn, S.

    2017-09-01

    After scanning or reconstructing the geometry of objects, we need to inspect the result of our work. Are there any parts missing? Is every detail covered in the desired quality? We typically do this by looking at the resulting point clouds or meshes of our objects on-screen. What if we could see the information visualized directly on the object itself? Augmented reality is the generic term for bringing virtual information into our real environment. In our paper, we show how we can project any 3D information, such as thematic visualizations or specific monitoring information with reference to our object, onto the object's surface itself, thus augmenting it with additional information. For small objects that could, for instance, be scanned in a laboratory, we propose a low-cost method involving a projector-camera system to solve this task. The user only needs a calibration board with coded fiducial markers to calibrate the system and to estimate the projector's pose later on for projecting textures with information onto the object's surface. Changes within the projected 3D information or of the projector's pose are applied in real time. Our results clearly show that such a simple setup delivers good quality of the augmented information.

  4. The use of process mapping in healthcare quality improvement projects.

    PubMed

    Antonacci, Grazia; Reed, Julie E; Lennox, Laura; Barlow, James

    2018-05-01

    Introduction Process mapping provides insight into systems and processes in which improvement interventions are introduced and is seen as useful in healthcare quality improvement projects. There is little empirical evidence on the use of process mapping in healthcare practice. This study advances understanding of the benefits and success factors of process mapping within quality improvement projects. Methods Eight quality improvement projects were purposively selected from different healthcare settings within the UK's National Health Service. Data were gathered from multiple data-sources, including interviews exploring participants' experience of using process mapping in their projects and perceptions of benefits and challenges related to its use. These were analysed using inductive analysis. Results Eight key benefits related to process mapping use were reported by participants (gathering a shared understanding of the reality; identifying improvement opportunities; engaging stakeholders in the project; defining project's objectives; monitoring project progress; learning; increased empathy; simplicity of the method) and five factors related to successful process mapping exercises (simple and appropriate visual representation, information gathered from multiple stakeholders, facilitator's experience and soft skills, basic training, iterative use of process mapping throughout the project). Conclusions Findings highlight benefits and versatility of process mapping and provide practical suggestions to improve its use in practice.

  5. Data Quality Objectives Supporting Radiological Air Emissions Monitoring for the Marine Sciences Laboratory, Sequim Site

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Barnett, J. Matthew; Meier, Kirsten M.; Snyder, Sandra F.

    2012-12-27

    This document of Data Quality Objectives (DQOs) was prepared based on the U.S. Environmental Protection Agency (EPA) Guidance on Systematic Planning Using the Data Quality Objectives Process, EPA, QA/G4, 2/2006 (EPA 2006), as well as several other published DQOs. The intent of this report is to determine the necessary steps required to ensure that radioactive emissions to the air from the Marine Sciences Laboratory (MSL) headquartered at the Pacific Northwest National Laboratory's Sequim Marine Research Operations (Sequim Site) on Washington State's Olympic Peninsula are managed in accordance with regulatory requirements and best practices. The Sequim Site was transitioned in October 2012 from private operation under Battelle Memorial Institute to an exclusive use contract with the U.S. Department of Energy, Office of Science, Pacific Northwest Site Office.

  6. Parallel Processing of Objects in a Naming Task

    ERIC Educational Resources Information Center

    Meyer, Antje S.; Ouellet, Marc; Hacker, Christine

    2008-01-01

    The authors investigated whether speakers who named several objects processed them sequentially or in parallel. Speakers named object triplets, arranged in a triangle, in the order left, right, and bottom object. The left object was easy or difficult to identify and name. During the saccade from the left to the right object, the right object shown…

  7. Teacher Trainees as Learning Object Designers: Problems and Issues in Learning Object Development Process

    ERIC Educational Resources Information Center

    Guler, Cetin; Altun, Arif

    2010-01-01

    Learning objects (LOs) can be defined as resources that are reusable, digital with the aim of fulfilling learning objectives (or expectations). Educators, both at the individual and institutional levels, are cautioned about the fact that LOs are to be processed through a proper development process. Who should be involved in the LO development…

  8. Study on Handing Process and Quality Degradation of Oil Palm Fresh Fruit Bunches (FFB)

    NASA Astrophysics Data System (ADS)

    Mat Sharif, Zainon Binti; Taib, Norhasnina Binti Mohd; Yusof, Mohd Sallehuddin Bin; Rahim, Mohammad Zulafif Bin; Tobi, Abdul Latif Bin Mohd; Othman, Mohd Syafiq Bin

    2017-05-01

    The main objective of this study is to determine the relationship between the quality of oil palm fresh fruit bunches (FFB) and handling processes. The study employed an exploratory and descriptive design with a quantitative approach; data were obtained through purposive sampling, using self-administered questionnaires, from 30 smallholder respondents in the Southern Region of Peninsular Malaysia. The study reveals a convincing relationship between the quality of oil palm FFB and handling processes. The main handling process factors influencing the quality of oil palm FFB were harvesting activity and handling at the plantation area. As a result, it can be deduced that the handling process factors explain 82.80% of the variance in the quality of oil palm FFB. The overall findings reveal that handling process factors play a significant role in the quality of oil palm fresh fruit bunches (FFB).

  9. Object processing in the infant: lessons from neuroscience.

    PubMed

    Wilcox, Teresa; Biondi, Marisa

    2015-07-01

    Object identification is a fundamental cognitive capacity that forms the basis for complex thought and behavior. The adult cortex is organized into functionally distinct visual object-processing pathways that mediate this ability. Insights into the origin of these pathways have begun to emerge through the use of neuroimaging techniques with infant populations. The outcome of this work supports the view that, from the early days of life, object-processing pathways are organized in a way that resembles that of the adult. At the same time, theoretically important changes in patterns of cortical activation are observed during the first year. These findings lead to a new understanding of the cognitive and neural architecture in infants that supports their emerging object-processing capacities. Copyright © 2015 Elsevier Ltd. All rights reserved.

  10. A Perceptually Weighted Rank Correlation Indicator for Objective Image Quality Assessment

    NASA Astrophysics Data System (ADS)

    Wu, Qingbo; Li, Hongliang; Meng, Fanman; Ngan, King N.

    2018-05-01

    In the field of objective image quality assessment (IQA), Spearman's ρ and Kendall's τ are the two most popular rank correlation indicators; they assign uniform weight to all quality levels and assume that each pair of images is sortable. They are successful for measuring the average accuracy of an IQA metric in ranking multiple processed images. However, two important perceptual properties are ignored by them. First, the sorting accuracy (SA) for high-quality images is usually more important than that for poor-quality ones in many real-world applications, where only the top-ranked images are pushed to the users. Second, due to the subjective uncertainty in making judgements, two perceptually similar images are usually hardly sortable, and their ranks do not contribute to the evaluation of an IQA metric. To compare different IQA algorithms more accurately, we explore a perceptually weighted rank correlation indicator in this paper, which rewards the capability of correctly ranking high-quality images and suppresses the attention given to insensitive rank mistakes. More specifically, we focus on activating 'valid' pairwise comparisons of image quality, whose difference exceeds a given sensory threshold (ST). Meanwhile, each image pair is assigned a unique weight, which is determined by both the quality level and the rank deviation. By modifying the perception threshold, we can illustrate the sorting accuracy with a more sophisticated SA-ST curve, rather than a single rank correlation coefficient. The proposed indicator offers a new insight for interpreting visual perception behaviors. Furthermore, the applicability of our indicator is validated by recommending robust IQA metrics for both degraded and enhanced image data.
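
    The sketch below contrasts the standard indicators named above with a simple perceptually weighted variant: pairs whose subjective difference falls below a sensory threshold are ignored, and correctly ordered pairs involving higher-quality images receive more weight. The threshold and the weighting rule are assumptions chosen for illustration, not the exact definition proposed in the paper.

```python
# Minimal sketch: plain rank correlations versus a threshold-and-weight variant.
import itertools
import numpy as np
from scipy.stats import kendalltau, spearmanr

def weighted_pairwise_agreement(subjective, objective, threshold=0.5):
    """Fraction of perceptually sortable pairs ranked correctly, weighted
    towards pairs that involve higher subjective scores (assumed rule)."""
    s = np.asarray(subjective, float)
    o = np.asarray(objective, float)
    num = den = 0.0
    for i, j in itertools.combinations(range(len(s)), 2):
        diff = s[i] - s[j]
        if abs(diff) < threshold:                  # perceptually unsortable pair
            continue
        weight = max(s[i], s[j])                   # favour high-quality images
        agree = float(np.sign(diff) == np.sign(o[i] - o[j]))
        num += weight * agree
        den += weight
    return num / den if den else float("nan")

mos = [4.5, 4.4, 3.0, 1.5]                         # subjective scores (MOS)
metric = [0.92, 0.95, 0.70, 0.30]                  # objective metric outputs
print(spearmanr(mos, metric)[0], kendalltau(mos, metric)[0])
print(weighted_pairwise_agreement(mos, metric))
```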

  11. Quality of Individualised Education Programme Goals and Objectives for Preschool Children with Disabilities

    ERIC Educational Resources Information Center

    Rakap, Salih

    2015-01-01

    Individualised education programmes (IEPs) are the road maps for individualising services for children with disabilities, specifically through the development of high-quality child goals/objectives. High-quality IEP goals/objectives that are developed based on a comprehensive assessment of child functioning and directly connected to intervention…

  12. The highs and lows of object impossibility: effects of spatial frequency on holistic processing of impossible objects.

    PubMed

    Freud, Erez; Avidan, Galia; Ganel, Tzvi

    2015-02-01

    Holistic processing, the decoding of a stimulus as a unified whole, is a basic characteristic of object perception. Recent research using Garner's speeded classification task has shown that this processing style is utilized even for impossible objects that contain an inherent spatial ambiguity. In particular, similar Garner interference effects were found for possible and impossible objects, indicating similar holistic processing styles for the two object categories. In the present study, we further investigated the perceptual mechanisms that mediate such holistic representation of impossible objects. We relied on the notion that, whereas information embedded in the high-spatial-frequency (HSF) content supports fine-detailed processing of object features, the information conveyed by low spatial frequencies (LSF) is more crucial for the emergence of a holistic shape representation. To test the effects of image frequency on the holistic processing of impossible objects, participants performed the Garner speeded classification task on images of possible and impossible cubes filtered for their LSF and HSF information. For images containing only LSF, similar interference effects were observed for possible and impossible objects, indicating that the two object categories were processed in a holistic manner. In contrast, for the HSF images, Garner interference was obtained only for possible, but not for impossible objects. Importantly, we provided evidence to show that this effect could not be attributed to a lack of sensitivity to object possibility in the LSF images. Particularly, even for full-spectrum images, Garner interference was still observed for both possible and impossible objects. Additionally, performance in an object classification task revealed high sensitivity to object possibility, even for LSF images. Taken together, these findings suggest that the visual system can tolerate the spatial ambiguity typical to impossible objects by relying on information

  13. A mask quality control tool for the OSIRIS multi-object spectrograph

    NASA Astrophysics Data System (ADS)

    López-Ruiz, J. C.; Vaz Cedillo, Jacinto Javier; Ederoclite, Alessandro; Bongiovanni, Ángel; González Escalera, Víctor

    2012-09-01

    The OSIRIS multi-object spectrograph uses a set of user-customised masks, which are manufactured on demand. The manufacturing process consists of drilling the specified slits on the mask with the required accuracy. Ensuring that the slits are in the right place when observing is of vital importance. We present a tool for checking the quality of the mask manufacturing process, based on analyzing instrument images obtained with the manufactured masks in place. The tool extracts the slit information from these images, relates the specifications to the extracted slit information, and finally tells the operator whether the manufactured mask fulfils the expectations of the mask designer. The proposed tool has been built using scripting languages and standard libraries such as opencv, pyraf and scipy. The software architecture, advantages and limits of this tool in the lifecycle of a multi-object acquisition are presented.

  14. (LMRG): Microscope Resolution, Objective Quality, Spectral Accuracy and Spectral Un-mixing

    PubMed Central

    Bayles, Carol J.; Cole, Richard W.; Eason, Brady; Girard, Anne-Marie; Jinadasa, Tushare; Martin, Karen; McNamara, George; Opansky, Cynthia; Schulz, Katherine; Thibault, Marc; Brown, Claire M.

    2012-01-01

    The second study by the LMRG focuses on measuring confocal laser scanning microscope (CLSM) resolution, objective lens quality, spectral imaging accuracy and spectral un-mixing. Affordable test samples for each aspect of the study were designed, prepared and sent to 116 labs from 23 countries across the globe. Detailed protocols were designed for the three tests and customized for most of the major confocal instruments being used by the study participants. One protocol developed for measuring resolution and objective quality was recently published in Nature Protocols (Cole, R. W., T. Jinadasa, et al. (2011). Nature Protocols 6(12): 1929–1941). The first study involved 3D imaging of sub-resolution fluorescent microspheres to determine the microscope point spread function. Results of the resolution studies as well as point spread function quality (i.e. objective lens quality) from 140 different objective lenses will be presented. The second study of spectral accuracy looked at the reflection of the laser excitation lines into the spectral detection in order to determine the accuracy of these systems to report back the accurate laser emission wavelengths. Results will be presented from 42 different spectral confocal systems. Finally, samples with double orange beads (orange core and orange coating) were imaged spectrally and the imaging software was used to un-mix fluorescence signals from the two orange dyes. Results from 26 different confocal systems will be summarized. Time will be left to discuss possibilities for the next LMRG study.

  15. The Influence of Scene Context on Parafoveal Processing of Objects.

    PubMed

    Castelhano, Monica S; Pereira, Effie J

    2017-04-21

    Many studies in reading have shown the enhancing effect of context on the processing of a word before it is directly fixated (parafoveal processing of words; Balota et al., 1985; Balota & Rayner, 1983; Ehrlich & Rayner, 1981). Here, we examined whether scene context influences the parafoveal processing of objects and enhances the extraction of object information. Using a modified boundary paradigm (Rayner, 1975), the Dot-Boundary paradigm, participants fixated on a suddenly-onsetting cue before the preview object would onset 4° away. The preview object could be identical to the target, visually similar, visually dissimilar, or a control (black rectangle). The preview changed to the target object once a saccade toward the object was made. Critically, the objects were presented on either a consistent or an inconsistent scene background. Results revealed that there was a greater processing benefit for consistent than inconsistent scene backgrounds and that identical and visually similar previews produced greater processing benefits than other previews. In the second experiment, we added an additional context condition in which the target location was inconsistent, but the scene semantics remained consistent. We found that changing the location of the target object disrupted the processing benefit derived from the consistent context. Most importantly, across both experiments, the effect of preview was not enhanced by scene context. Thus, preview information and scene context appear to independently boost the parafoveal processing of objects without any interaction from object-scene congruency.

  16. Toward objective image quality metrics: the AIC Eval Program of the JPEG

    NASA Astrophysics Data System (ADS)

    Richter, Thomas; Larabi, Chaker

    2008-08-01

    Objective quality assessment of lossy image compression codecs is an important part of the recent call of the JPEG for Advanced Image Coding. The target of the AIC ad-hoc group is twofold: first, to receive state-of-the-art still image codecs and to propose suitable technology for standardization; and second, to study objective image quality metrics to evaluate the performance of such codecs. Even though the performance of an objective metric is defined by how well it predicts the outcome of a subjective assessment, one can also study the usefulness of a metric indirectly, in a non-traditional way, namely by measuring the subjective quality improvement of a codec that has been optimized for a specific objective metric. This approach is demonstrated here on the recently proposed HDPhoto format introduced by Microsoft and an SSIM-tuned version of it by one of the authors. We compare these two implementations with JPEG in two variations and with a visually and PSNR-optimal JPEG2000 implementation. To this end, we use subjective and objective tests based on the multiscale SSIM and a new DCT-based metric.

  17. An objective method for a video quality evaluation in a 3DTV service

    NASA Astrophysics Data System (ADS)

    Wilczewski, Grzegorz

    2015-09-01

    The following article describes a proposed objective method for 3DTV video quality evaluation, the Compressed Average Image Intensity (CAII) method. Identification of the 3DTV service's content chain nodes enables the design of a versatile, objective video quality metric based on an advanced approach to stereoscopic videostream analysis. Insights into the designed metric's mechanisms, as well as an evaluation of its performance under simulated environmental conditions, are discussed herein. As a result, the CAII metric might be effectively used in a variety of service quality assessment applications.

  18. Production and processing studies on calpain-system gene markers for tenderness in Brahman cattle: 2. Objective meat quality.

    PubMed

    Cafe, L M; McIntyre, B L; Robinson, D L; Geesink, G H; Barendse, W; Pethick, D W; Thompson, J M; Greenwood, P L

    2010-09-01

    Effects and interactions of calpain-system tenderness gene markers on objective meat quality traits of Brahman (Bos indicus) cattle were quantified within 2 concurrent experiments at different locations. Cattle were selected for study from commercial and research herds at weaning based on their genotype for calpastatin (CAST) and calpain 3 (CAPN3) gene markers for beef tenderness. Gene marker status for mu-calpain (CAPN1-4751 and CAPN1-316) was also determined for inclusion in statistical analyses. Eighty-two heifer and 82 castrated male cattle with 0 or 2 favorable alleles for CAST and CAPN3 were studied in New South Wales (NSW), and 143 castrated male cattle with 0, 1, or 2 favorable alleles for CAST and CAPN3 were studied in Western Australia (WA). The cattle were backgrounded for 6 to 8 mo and grain-fed for 117 d (NSW) or 80 d (WA) before slaughter. One-half the cattle in each experiment were implanted with a hormonal growth promotant during feedlotting. One side of each carcass was suspended from the Achilles tendon (AT) and the other from the pelvis (tenderstretch). The M. longissimus lumborum from both sides and the M. semitendinosus from the AT side were collected; then samples of each were aged at 1 degrees C for 1 or 7 d. Favorable alleles for one or more markers reduced shear force, with little effect on other meat quality traits. The size of effects of individual markers varied with site, muscle, method of carcass suspension, and aging period. Individual marker effects were additive as evident in cattle with 4 favorable alleles for CAST and CAPN3 markers, which had shear force reductions of 12.2 N (P < 0.001, NSW) and 9.3 N (P = 0.002, WA) in AT 7 d aged M. longissimus lumborum compared with those with no favorable alleles. There was no evidence (all P > 0.05) of interactions between the gene markers, or between the hormonal growth promotant and gene markers for any meat quality traits. This study provides further evidence that selection based on the

  19. A Systematic Process for Developing High Quality SaaS Cloud Services

    NASA Astrophysics Data System (ADS)

    La, Hyun Jung; Kim, Soo Dong

    Software-as-a-Service (SaaS) is a type of cloud service which provides software functionality through Internet. Its benefits are well received in academia and industry. To fully utilize the benefits, there should be effective methodologies to support the development of SaaS services which provide high reusability and applicability. Conventional approaches such as object-oriented methods do not effectively support SaaS-specific engineering activities such as modeling common features, variability, and designing quality services. In this paper, we present a systematic process for developing high quality SaaS and highlight the essentiality of commonality and variability (C&V) modeling to maximize the reusability. We first define criteria for designing the process model and provide a theoretical foundation for SaaS; its meta-model and C&V model. We clarify the notion of commonality and variability in SaaS, and propose a SaaS development process which is accompanied with engineering instructions. Using the proposed process, SaaS services with high quality can be effectively developed.

  20. A lexicographic weighted Tchebycheff approach for multi-constrained multi-objective optimization of the surface grinding process

    NASA Astrophysics Data System (ADS)

    Khalilpourazari, Soheyl; Khalilpourazary, Saman

    2017-05-01

    In this article a multi-objective mathematical model is developed to minimize total time and cost while maximizing the production rate and surface finish quality in the grinding process. The model aims to determine optimal values of the decision variables considering process constraints. A lexicographic weighted Tchebycheff approach is developed to obtain efficient Pareto-optimal solutions of the problem in both rough and finished conditions. Utilizing a polyhedral branch-and-cut algorithm, the lexicographic weighted Tchebycheff model of the proposed multi-objective model is solved using GAMS software. The Pareto-optimal solutions provide a proper trade-off between conflicting objective functions which helps the decision maker to select the best values for the decision variables. Sensitivity analyses are performed to determine the effect of change in the grain size, grinding ratio, feed rate, labour cost per hour, length of workpiece, wheel diameter and downfeed of grinding parameters on each value of the objective function.
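
    For readers unfamiliar with the scalarization named in the title, the sketch below applies a weighted Tchebycheff first stage and a lexicographic tie-break to two toy objective functions. The objective forms, weights, ideal point, and starting point are all assumptions for illustration; the article's grinding model and its polyhedral branch-and-cut solution in GAMS are not reproduced here.

```python
# Minimal sketch: weighted Tchebycheff scalarization with a lexicographic
# second stage that prefers the smallest total weighted deviation among
# (near-)minimisers of the Tchebycheff norm.
import numpy as np
from scipy.optimize import minimize

def objectives(x):
    f1 = (x[0] - 1.0) ** 2 + x[1] ** 2              # stand-in for total time
    f2 = x[0] ** 2 + (x[1] - 2.0) ** 2              # stand-in for total cost
    return np.array([f1, f2])

ideal = np.array([0.0, 0.0])                        # assumed utopia point
weights = np.array([0.6, 0.4])                      # assumed preference weights

def tchebycheff(x):
    return np.max(weights * np.abs(objectives(x) - ideal))

stage1 = minimize(tchebycheff, x0=np.array([0.5, 0.5]), method="Nelder-Mead")
t_star = stage1.fun

def stage2_obj(x):
    devs = weights * np.abs(objectives(x) - ideal)
    # Penalise any increase of the Tchebycheff norm beyond the stage-1 optimum.
    penalty = 1e3 * max(0.0, float(np.max(devs)) - t_star - 1e-6)
    return float(devs.sum()) + penalty

stage2 = minimize(stage2_obj, stage1.x, method="Nelder-Mead")
print(stage1.x, stage2.x)
```

    Sweeping the weight vector over a grid and repeating the two stages yields a set of Pareto-optimal candidates from which a decision maker can pick a preferred trade-off.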

  1. Objective Video Quality Assessment Based on Machine Learning for Underwater Scientific Applications

    PubMed Central

    Moreno-Roldán, José-Miguel; Luque-Nieto, Miguel-Ángel; Poncela, Javier; Otero, Pablo

    2017-01-01

    Video services are meant to be a fundamental tool in the development of oceanic research. The current technology for underwater networks (UWNs) imposes strong constraints in the transmission capacity since only a severely limited bitrate is available. However, previous studies have shown that the quality of experience (QoE) is enough for ocean scientists to consider the service useful, although the perceived quality can change significantly for small ranges of variation of video parameters. In this context, objective video quality assessment (VQA) methods become essential in network planning and real time quality adaptation fields. This paper presents two specialized models for objective VQA, designed to match the special requirements of UWNs. The models are built upon machine learning techniques and trained with actual user data gathered from subjective tests. Our performance analysis shows how both of them can successfully estimate quality as a mean opinion score (MOS) value and, for the second model, even compute a distribution function for user scores. PMID:28333123
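
    As a rough illustration of the modelling approach described above, the sketch below fits a regressor that maps a few video parameters to a mean opinion score. The feature set, the synthetic training data, and the regressor choice are assumptions for illustration; the paper's actual models were trained on subjective scores from underwater-video user tests.

```python
# Minimal sketch: estimate MOS from video parameters with a learned regressor.
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(1)
n = 200
# Assumed features: bitrate (kbit/s), frame rate (fps), vertical resolution.
X = np.column_stack([rng.uniform(20, 200, n),
                     rng.uniform(5, 25, n),
                     rng.choice([240, 360, 480], n)])
# Synthetic stand-in for subjective scores gathered in user tests (1-5 scale).
y = np.clip(1.0 + 0.02 * X[:, 0] + 0.05 * X[:, 1] + 0.002 * X[:, 2]
            + rng.normal(0.0, 0.3, n), 1.0, 5.0)

model = RandomForestRegressor(n_estimators=200, random_state=0).fit(X, y)
print(model.predict([[100.0, 15.0, 360.0]]))        # estimated MOS for one setting
```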

  2. 36 CFR 218.10 - Objection time periods and process.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 36 Parks, Forests, and Public Property 2 2012-07-01 2012-07-01 false Objection time periods and... Objection time periods and process. (a) Time to file an objection. Written objections, including any... of objectors to ensure that their objection is received in a timely manner. (b) Computation of time...

  3. 36 CFR 218.10 - Objection time periods and process.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... calendar day (11:59 p.m. in the time zone of the receiving office) for objections filed by electronic means... 36 Parks, Forests, and Public Property 2 2010-07-01 2010-07-01 false Objection time periods and... Objection time periods and process. (a) Time to file an objection. Written objections, including any...

  4. Process air quality data

    NASA Technical Reports Server (NTRS)

    Butler, C. M.; Hogge, J. E.

    1978-01-01

    Air quality sampling was conducted. Data for air quality parameters, recorded on written forms, punched cards or magnetic tape, are available for 1972 through 1975. Computer software was developed to (1) calculate several daily statistical measures of location, (2) plot time histories of data or the calculated daily statistics, (3) calculate simple correlation coefficients, and (4) plot scatter diagrams. Computer software was developed for processing air quality data to include time series analysis and goodness of fit tests. Computer software was developed to (1) calculate a larger number of daily statistical measures of location, and a number of daily monthly and yearly measures of location, dispersion, skewness and kurtosis, (2) decompose the extended time series model and (3) perform some goodness of fit tests. The computer program is described, documented and illustrated by examples. Recommendations are made for continuation of the development of research on processing air quality data.
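
    The daily statistics and simple correlations described above are straightforward to reproduce with modern tooling; the sketch below does so with pandas on synthetic hourly data. The column names, date range, and distributions are assumptions, not the original data set.

```python
# Minimal sketch: daily summary statistics and a simple correlation coefficient
# for hourly air quality measurements.
import numpy as np
import pandas as pd

rng = np.random.default_rng(2)
idx = pd.date_range("1974-01-01", periods=24 * 90, freq="h")    # hourly samples
df = pd.DataFrame({"ozone": rng.gamma(2.0, 15.0, len(idx)),
                   "no2": rng.gamma(2.0, 10.0, len(idx))}, index=idx)

daily = df.resample("D").agg(["mean", "median", "max", "std"])  # daily measures
print(daily.head())
print(df["ozone"].corr(df["no2"]))                  # simple correlation
```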

  5. Land Surface Process and Air Quality Research and Applications at MSFC

    NASA Technical Reports Server (NTRS)

    Quattrochi, Dale; Khan, Maudood

    2007-01-01

    This viewgraph presentation provides an overview of land surface process and air quality research at MSFC including atmospheric modeling and ongoing research whose objective is to undertake a comprehensive spatiotemporal analysis of the effects of accurate land surface characterization on atmospheric modeling results, and public health applications. Land use maps as well as 10 meter air temperature, surface wind, PBL mean difference heights, NOx, ozone, and O3+NO2 plots as well as spatial growth model outputs are included. Emissions and general air quality modeling are also discussed.

  6. Dynamic information processing states revealed through neurocognitive models of object semantics

    PubMed Central

    Clarke, Alex

    2015-01-01

    Recognising objects relies on highly dynamic, interactive brain networks to process multiple aspects of object information. To fully understand how different forms of information about objects are represented and processed in the brain requires a neurocognitive account of visual object recognition that combines a detailed cognitive model of semantic knowledge with a neurobiological model of visual object processing. Here we ask how specific cognitive factors are instantiated in our mental processes and how they dynamically evolve over time. We suggest that coarse semantic information, based on generic shared semantic knowledge, is rapidly extracted from visual inputs and is sufficient to drive rapid category decisions. Subsequent recurrent neural activity between the anterior temporal lobe and posterior fusiform supports the formation of object-specific semantic representations – a conjunctive process primarily driven by the perirhinal cortex. These object-specific representations require the integration of shared and distinguishing object properties and support the unique recognition of objects. We conclude that a valuable way of understanding the cognitive activity of the brain is through testing the relationship between specific cognitive measures and dynamic neural activity. This kind of approach allows us to move towards uncovering the information processing states of the brain and how they evolve over time. PMID:25745632

  7. Color Image Processing and Object Tracking System

    NASA Technical Reports Server (NTRS)

    Klimek, Robert B.; Wright, Ted W.; Sielken, Robert S.

    1996-01-01

    This report describes a personal computer based system for automatic and semiautomatic tracking of objects on film or video tape, developed to meet the needs of the Microgravity Combustion and Fluids Science Research Programs at the NASA Lewis Research Center. The system consists of individual hardware components working under computer control to achieve a high degree of automation. The most important hardware components include 16-mm and 35-mm film transports, a high resolution digital camera mounted on a x-y-z micro-positioning stage, an S-VHS tapedeck, an Hi8 tapedeck, video laserdisk, and a framegrabber. All of the image input devices are remotely controlled by a computer. Software was developed to integrate the overall operation of the system including device frame incrementation, grabbing of image frames, image processing of the object's neighborhood, locating the position of the object being tracked, and storing the coordinates in a file. This process is performed repeatedly until the last frame is reached. Several different tracking methods are supported. To illustrate the process, two representative applications of the system are described. These applications represent typical uses of the system and include tracking the propagation of a flame front and tracking the movement of a liquid-gas interface with extremely poor visibility.

  8. A quality-refinement process for medical imaging applications.

    PubMed

    Neuhaus, J; Maleike, D; Nolden, M; Kenngott, H-G; Meinzer, H-P; Wolf, I

    2009-01-01

    To introduce and evaluate a process for refinement of software quality that is suitable to research groups. In order to avoid constraining researchers too much, the quality improvement process has to be designed carefully. The scope of this paper is to present and evaluate a process to advance quality aspects of existing research prototypes in order to make them ready for initial clinical studies. The proposed process is tailored for research environments and therefore more lightweight than traditional quality management processes. Focus on quality criteria that are important at the given stage of the software life cycle. Usage of tools that automate aspects of the process is emphasized. To evaluate the additional effort that comes along with the process, it was exemplarily applied for eight prototypical software modules for medical image processing. The introduced process has been applied to improve the quality of all prototypes so that they could be successfully used in clinical studies. The quality refinement yielded an average of 13 person days of additional effort per project. Overall, 107 bugs were found and resolved by applying the process. Careful selection of quality criteria and the usage of automated process tools lead to a lightweight quality refinement process suitable for scientific research groups that can be applied to ensure a successful transfer of technical software prototypes into clinical research workflows.

  9. High-quality slab-based intermixing method for fusion rendering of multiple medical objects.

    PubMed

    Kim, Dong-Joon; Kim, Bohyoung; Lee, Jeongjin; Shin, Juneseuk; Kim, Kyoung Won; Shin, Yeong-Gil

    2016-01-01

    The visualization of multiple 3D objects has been increasingly required for recent applications in medical fields. Due to heterogeneity in data representation or data configuration, it is difficult to efficiently render multiple medical objects at high quality. In this paper, we present a novel intermixing scheme for fusion rendering of multiple medical objects that preserves real-time performance. First, we present an in-slab visibility interpolation method for the representation of subdivided slabs. Second, we introduce virtual zSlab, which extends an infinitely thin boundary (such as polygonal objects) into a slab with a finite thickness. Finally, based on virtual zSlab and in-slab visibility interpolation, we propose a slab-based visibility intermixing method with the newly proposed rendering pipeline. Experimental results demonstrate that the proposed method delivers more effective multiple-object renderings in terms of rendering quality, compared to conventional approaches. The proposed intermixing scheme provides high-quality intermixing results for the visualization of intersecting and overlapping surfaces by resolving aliasing and z-fighting problems. Moreover, two case studies are presented that apply the proposed method to real clinical applications. These case studies demonstrate that the proposed method has the advantages of rendering independency and reusability. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.

  10. Imaging-based optical caliper for objects in hot manufacturing processes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Huang, Howard

    OG Technologies, Inc. (OGT), in conjunction with its industrial and academic partners, proposes to develop an Imaging-Based Optical Caliper (hereafter referred to as OC) for Objects in Hot Manufacturing Processes. The goal is to develop and demonstrate the OC with the synergy of OGT's current technological pool and other innovations to provide a lightweight, robust, safe and accurate portable dimensional measurement device for hot objects, with integrated wireless communication capacity to enable real-time process control. The technical areas of interest in this project are the combination of advanced imaging, Sensor Fusion, and process control. OGT believes that the synergistic interactions between its current set of technologies and other innovations could deliver products that are viable and have high impact in hot manufacturing processes, such as steel making, steel rolling, open die forging, and glass manufacturing, resulting in a new energy-efficient control paradigm through improved yield, prolonged tool life and improved quality. In-line dimension measurement and control is of interest to steel makers, yet the current industry focus is on the final product dimension only, instead of the whole process, due to limited man power, system cost and operator safety concerns. As sensor technologies advance, the industry has started to see the need to enforce better dimensional control throughout the process, but lacks the proper tools to do so. OGT, along with its industrial partners, represents an indigenous effort of technological development to serve the US steel industry. The immediate market that can use and benefit from the proposed OC is the steel industry. The deployment of the OC has the potential to provide benefits in the reduction of energy waste, CO2 emissions, waste water, toxic waste, and so forth. The potential market, after further expanded functionality, includes the hot forging and freight industries. The OC prototypes were

  11. Fast and accurate edge orientation processing during object manipulation

    PubMed Central

    Flanagan, J Randall; Johansson, Roland S

    2018-01-01

    Quickly and accurately extracting information about a touched object’s orientation is a critical aspect of dexterous object manipulation. However, the speed and acuity of tactile edge orientation processing with respect to the fingertips as reported in previous perceptual studies appear inadequate in these respects. Here we directly establish the tactile system’s capacity to process edge-orientation information during dexterous manipulation. Participants extracted tactile information about edge orientation very quickly, using it within 200 ms of first touching the object. Participants were also strikingly accurate. With edges spanning the entire fingertip, edge-orientation resolution was better than 3° in our object manipulation task, which is several times better than reported in previous perceptual studies. Performance remained impressive even with edges as short as 2 mm, consistent with our ability to precisely manipulate very small objects. Taken together, our results radically redefine the spatial processing capacity of the tactile system. PMID:29611804

  12. Effect diffraction on a viewed object has on improvement of object optical image quality in a turbulent medium

    NASA Astrophysics Data System (ADS)

    Banakh, Viktor A.; Sazanovich, Valentina M.; Tsvik, Ruvim S.

    1997-09-01

    The influence of diffraction on the object, coherently illuminated and viewed through a random medium from the same point, on the image quality betterment caused by the counter wave correlation is studied experimentally. The measurements were carried out with the use of a setup modeling artificial convective turbulence. It is shown that, in the case of a spatially limited reflector with the Fresnel number of the reflector surface radius r ranging from 3 to 12, the contribution of the counter wave correlation to the image intensity distribution is maximal as compared with point objects (r << 1) and objects of large size (r >> 1).

  13. Processing Technology Selection for Municipal Sewage Treatment Based on a Multi-Objective Decision Model under Uncertainty.

    PubMed

    Chen, Xudong; Xu, Zhongwen; Yao, Liming; Ma, Ning

    2018-03-05

    This study considers the two factors of environmental protection and economic benefits to address municipal sewage treatment. Based on considerations regarding the sewage treatment plant construction site, processing technology, capital investment, operation costs, water pollutant emissions, water quality and other indicators, we establish a general multi-objective decision model for optimizing municipal sewage treatment plant construction. Using the construction of a sewage treatment plant in a suburb of Chengdu as an example, this paper tests the general multi-objective decision model by implementing a genetic algorithm. The results show the applicability and effectiveness of the multi-objective decision model for the sewage treatment plant. This paper provides decision and technical support for the optimization of municipal sewage treatment.
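
    The paper's decision model and genetic algorithm are not reproduced here; the sketch below only illustrates, under stated assumptions, how a tiny multi-objective genetic algorithm can trade off two such indicators. The decision variable x, both objective functions and all parameter values are invented placeholders.

    ```python
    # Illustrative sketch only: a tiny multi-objective genetic algorithm of the kind the
    # study applies to sewage-treatment planning.  The decision variable x (e.g. the share
    # of capacity assigned to one treatment technology) and both objective functions are
    # invented placeholders, not the paper's model.
    import random

    def objectives(x):
        cost = 10.0 * x + 2.0                  # hypothetical investment/operation cost
        emission = 5.0 / (0.1 + x)             # hypothetical residual pollutant load
        return cost, emission

    def dominates(a, b):
        return all(ai <= bi for ai, bi in zip(a, b)) and any(ai < bi for ai, bi in zip(a, b))

    def evolve(pop_size=30, generations=50):
        pop = [random.uniform(0.0, 1.0) for _ in range(pop_size)]
        for _ in range(generations):
            scored = [(x, objectives(x)) for x in pop]
            # rank each solution by how many others dominate it (lower is better)
            ranks = [sum(dominates(f2, f1) for _, f2 in scored) for _, f1 in scored]
            children = []
            for _ in range(pop_size):
                i, j = random.randrange(pop_size), random.randrange(pop_size)
                parent = scored[i][0] if ranks[i] <= ranks[j] else scored[j][0]
                children.append(min(1.0, max(0.0, parent + random.gauss(0.0, 0.05))))
            pop = children
        finals = [(x, objectives(x)) for x in pop]
        # return only the non-dominated front of the final population
        return [(x, f) for x, f in finals if not any(dominates(g, f) for _, g in finals)]

    for x, (cost, emission) in sorted(evolve())[:5]:
        print(f"x={x:.2f}  cost={cost:.2f}  emission={emission:.2f}")
    ```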

  14. A Multivariate Quality Loss Function Approach for Optimization of Spinning Processes

    NASA Astrophysics Data System (ADS)

    Chakraborty, Shankar; Mitra, Ankan

    2018-05-01

    Recent advancements in the textile industry have given rise to several spinning techniques, such as ring spinning, rotor spinning etc., which can be used to produce a wide variety of textile apparel so as to fulfil the end requirements of the customers. To achieve the best out of these processes, they should be utilized at their optimal parametric settings. However, in the presence of multiple yarn characteristics which are often conflicting in nature, it becomes a challenging task for the spinning industry personnel to identify the best parametric mix which would simultaneously optimize all the responses. Hence, in this paper, the applicability of a new systematic approach in the form of the multivariate quality loss function technique is explored for optimizing multiple quality characteristics of yarns while identifying the ideal settings of two spinning processes. It is observed that this approach performs well against other multi-objective optimization techniques, such as desirability function, distance function and mean squared error methods. With slight modifications in the upper and lower specification limits of the considered quality characteristics, and constraints of the non-linear optimization problem, it can be successfully applied to other processes in the textile industry to determine their optimal parametric settings.
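
    As an illustration of the general idea (not the study's actual formulation), a multivariate quality loss can be written as L = (y - t)^T C (y - t), where y is the vector of measured yarn responses, t the target vector and C a cost matrix encoding the relative penalty of deviations. The responses, targets and cost matrix in the sketch below are assumed values.

    ```python
    # Minimal sketch, not the study's actual model: a Taguchi-style multivariate quality
    # loss that aggregates several yarn responses into one scalar to be minimised.
    # Targets, the cost matrix C and the response values are assumptions.
    import numpy as np

    def multivariate_quality_loss(y, target, C):
        """L = (y - t)^T C (y - t); C encodes the relative cost of deviations
        and possible interactions between quality characteristics."""
        d = np.asarray(y, dtype=float) - np.asarray(target, dtype=float)
        return float(d @ np.asarray(C, dtype=float) @ d)

    # hypothetical responses (tenacity, unevenness, hairiness) at two parameter settings
    target = [18.0, 10.0, 4.0]
    C = np.diag([1.0, 0.5, 2.0])          # deviation in hairiness penalised most
    setting_a = [17.2, 11.0, 4.3]
    setting_b = [18.5, 10.4, 4.1]
    for name, y in [("A", setting_a), ("B", setting_b)]:
        print(name, round(multivariate_quality_loss(y, target, C), 3))
    ```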

  15. Implementation of quality improvement techniques for management and technical processes in the ACRV project

    NASA Technical Reports Server (NTRS)

    Raiman, Laura B.

    1992-01-01

    Total Quality Management (TQM) is a cooperative form of doing business that relies on the talents of everyone in an organization to continually improve quality and productivity, using teams and an assortment of statistical and measurement tools. The objective of the activities described in this paper was to implement effective improvement tools and techniques in order to build work processes which support good management and technical decisions and actions which are crucial to the success of the ACRV project. The objectives were met by applications in both the technical and management areas. The management applications involved initiating focused continuous improvement projects with widespread team membership. The technical applications involved applying proven statistical tools and techniques to the technical issues associated with the ACRV Project. Specific activities related to the objective included working with a support contractor team to improve support processes, examining processes involved in international activities, a series of tutorials presented to the New Initiatives Office and support contractors, a briefing to NIO managers, and work with the NIO Q+ Team. On the technical side, work included analyzing data from the large-scale W.A.T.E.R. test, landing mode trade analyses, and targeting probability calculations. The results of these efforts will help to develop a disciplined, ongoing process for producing fundamental decisions and actions that shape and guide the ACRV organization.

  16. Implementation of quality improvement techniques for management and technical processes in the ACRV project

    NASA Astrophysics Data System (ADS)

    Raiman, Laura B.

    1992-12-01

    Total Quality Management (TQM) is a cooperative form of doing business that relies on the talents of everyone in an organization to continually improve quality and productivity, using teams and an assortment of statistical and measurement tools. The objective of the activities described in this paper was to implement effective improvement tools and techniques in order to build work processes which support good management and technical decisions and actions which are crucial to the success of the ACRV project. The objectives were met by applications in both the technical and management areas. The management applications involved initiating focused continuous improvement projects with widespread team membership. The technical applications involved applying proven statistical tools and techniques to the technical issues associated with the ACRV Project. Specific activities related to the objective included working with a support contractor team to improve support processes, examining processes involved in international activities, a series of tutorials presented to the New Initiatives Office and support contractors, a briefing to NIO managers, and work with the NIO Q+ Team. On the technical side, work included analyzing data from the large-scale W.A.T.E.R. test, landing mode trade analyses, and targeting probability calculations. The results of these efforts will help to develop a disciplined, ongoing process for producing fundamental decisions and actions that shape and guide the ACRV organization.

  17. Holistic processing of impossible objects: evidence from Garner's speeded-classification task.

    PubMed

    Freud, Erez; Avidan, Galia; Ganel, Tzvi

    2013-12-18

    Holistic processing, the decoding of the global structure of a stimulus while the local parts are not explicitly represented, is a basic characteristic of object perception. The current study aimed to test whether such a representation could be created even for objects that violate fundamental principles of spatial organization, namely impossible objects. Previous studies argued that these objects cannot be represented holistically in long-term memory because they lack coherent 3D structure. Here, we utilized Garner's speeded classification task to test whether the perception of possible and impossible objects is mediated by similar holistic processing mechanisms. To this end, participants were asked to make speeded classifications of one object dimension while an irrelevant dimension was kept constant (baseline condition) or when this dimension varied (filtering condition). It is well accepted that ignoring the irrelevant dimension is impossible when holistic perception is mandatory; thus the extent of Garner interference in performance between the baseline and filtering conditions serves as an index of holistic processing. Critically, in Experiment 1, similar levels of Garner interference were found for possible and impossible objects, implying holistic perception of both object types. Experiment 2 extended these results and demonstrated that even when depth information was explicitly processed, participants were still unable to process one dimension (width/depth) while ignoring the irrelevant dimension (depth/width, respectively). The results of Experiment 3 replicated the basic pattern found in Experiments 1 and 2 using a novel set of object exemplars. In Experiment 4, we used possible and impossible versions of the Penrose triangles in which information about impossibility is embedded in the internal elements of the objects, which participants were explicitly asked to judge. As in Experiments 1-3, similar Garner interference was found for possible and

  18. Data Quality Objectives Supporting the Environmental Soil Monitoring Program for the Idaho National Laboratory Site

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Haney, Thomas Jay

    This document describes the process used to develop data quality objectives for the Idaho National Laboratory (INL) Environmental Soil Monitoring Program in accordance with U.S. Environmental Protection Agency guidance. This document also develops and presents the logic that was used to determine the specific number of soil monitoring locations at the INL Site, at locations bordering the INL Site, and at locations in the surrounding regional area. The monitoring location logic follows the guidance from the U.S. Department of Energy for environmental surveillance of its facilities.

  19. The influence of familiar characters and other appealing images on young children's preference for low-quality objects.

    PubMed

    Danovitch, Judith H; Mills, Candice M

    2017-09-01

    This study examines the factors underlying young children's preference for products bearing a familiar character's image. Three-year-olds (N = 92) chose between low-quality objects with images on or near the objects and high-quality objects without images. Children showed stronger preferences for damaged objects bearing images of a preferred familiar character than for objects bearing images of a preferred colour star, and they showed weak preferences for damaged objects with the character near, but not on, the object. The results suggest that children's preference for low-quality products bearing character images is driven by prior exposure to characters, and not only by the act of identifying a favourite. Statement of contribution What is already known on this subject? Children are exposed to characters in the media and on products such as clothing and school supplies. Products featuring familiar characters appeal to preschool children, even if they are of low quality. What does this study add? Three-year-olds prefer damaged objects with an image of a favourite character over plain undamaged objects. Children's preference is not solely a function of having identified a favourite image or of attentional cues. © 2017 The British Psychological Society.

  20. Verbal Labels Modulate Perceptual Object Processing in 1-Year-Old Children

    ERIC Educational Resources Information Center

    Gliga, Teodora; Volein, Agnes; Csibra, Gergely

    2010-01-01

    Whether verbal labels help infants visually process and categorize objects is a contentious issue. Using electroencephalography, we investigated whether possessing familiar or novel labels for objects directly enhances 1-year-old children's neural processes underlying the perception of those objects. We found enhanced gamma-band (20-60 Hz)…

  1. Air Quality Management Process Cycle

    EPA Pesticide Factsheets

    Air quality management comprises the activities a regulatory authority undertakes to protect human health and the environment from the harmful effects of air pollution. The process of managing air quality can be illustrated as a cycle of inter-related elements.

  2. Adopting software quality measures for healthcare processes.

    PubMed

    Yildiz, Ozkan; Demirörs, Onur

    2009-01-01

    In this study, we investigated the adoptability of software quality measures for healthcare process measurement. Quality measures of ISO/IEC 9126 are redefined from a process perspective to build a generic healthcare process quality measurement model. The case study research method is used, and the model is applied to a public hospital's Entry to Care process. After the application, weak and strong aspects of the process can be easily observed. Access auditability, fault removal, completeness of documentation, and machine utilization are weak aspects, and these aspects are the candidates for process improvement. On the other hand, functional completeness, fault ratio, input validity checking, response time, and throughput time are the strong aspects of the process.
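
    As a rough illustration of how such product measures can be recast as process measures, the sketch below computes two of the mentioned indicators (fault removal and completeness of documentation) as simple ratios; the field names and sample counts are invented, not data from the hospital case study.

    ```python
    # Sketch under assumptions: two of the process measures mentioned in the abstract
    # expressed as simple ratios.  The record fields and numbers are illustrative only.
    def fault_removal(faults_corrected, faults_detected):
        return faults_corrected / faults_detected if faults_detected else 1.0

    def documentation_completeness(documented_steps, required_steps):
        return documented_steps / required_steps if required_steps else 1.0

    entry_to_care = {"faults_detected": 40, "faults_corrected": 22,
                     "documented_steps": 31, "required_steps": 50}
    print("fault removal:",
          fault_removal(entry_to_care["faults_corrected"], entry_to_care["faults_detected"]))
    print("documentation completeness:",
          documentation_completeness(entry_to_care["documented_steps"],
                                     entry_to_care["required_steps"]))
    ```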

  3. Selective visual attention in object detection processes

    NASA Astrophysics Data System (ADS)

    Paletta, Lucas; Goyal, Anurag; Greindl, Christian

    2003-03-01

    Object detection is an enabling technology that plays a key role in many application areas, such as content based media retrieval. Attentive cognitive vision systems are here proposed where the focus of attention is directed towards the most relevant target. The most promising information is interpreted in a sequential process that dynamically makes use of knowledge and that enables spatial reasoning on the local object information. The presented work proposes an innovative application of attention mechanisms for object detection which is most general in its understanding of information and action selection. The attentive detection system uses a cascade of increasingly complex classifiers for the stepwise identification of regions of interest (ROIs) and recursively refined object hypotheses. While the most coarse classifiers are used to determine first approximations on a region of interest in the input image, more complex classifiers are used for more refined ROIs to give more confident estimates. Objects are modelled by local appearance-based representations and in terms of posterior distributions of the object samples in eigenspace. The discrimination function to discern between objects is modelled by a radial basis functions (RBF) network that has been compared with alternative networks and proved consistent and superior to other artificial neural networks for appearance-based object recognition. The experiments were conducted for the automatic detection of brand objects in Formula One broadcasts within the European Commission's cognitive vision project DETECT.
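
    The DETECT system's classifiers are not described in code in the abstract; the following is a generic sketch of an RBF network used as a discrimination function over appearance features, with centres drawn from the training samples and output weights fitted by least squares. All data, parameter values and the training procedure are assumptions for illustration only.

    ```python
    # Illustrative sketch (not the DETECT project's code): a radial basis function (RBF)
    # network used as the discrimination function over appearance features, with centres
    # picked from the training set and output weights fitted by least squares.
    import numpy as np

    def rbf_features(X, centres, gamma):
        # Gaussian activations of every sample w.r.t. every centre
        d2 = ((X[:, None, :] - centres[None, :, :]) ** 2).sum(axis=2)
        return np.exp(-gamma * d2)

    def train_rbf(X, y, n_centres=10, gamma=1.0, seed=0):
        rng = np.random.default_rng(seed)
        centres = X[rng.choice(len(X), size=min(n_centres, len(X)), replace=False)]
        Phi = rbf_features(X, centres, gamma)
        W, *_ = np.linalg.lstsq(Phi, y, rcond=None)   # one-hot targets -> linear readout
        return centres, W

    def predict(X, centres, W, gamma=1.0):
        return rbf_features(X, centres, gamma) @ W

    # toy appearance features (e.g. eigenspace coefficients) for two object classes
    rng = np.random.default_rng(1)
    X = np.vstack([rng.normal(0, 1, (20, 5)), rng.normal(3, 1, (20, 5))])
    y = np.vstack([np.tile([1, 0], (20, 1)), np.tile([0, 1], (20, 1))]).astype(float)
    centres, W = train_rbf(X, y, gamma=0.5)
    print((predict(X, centres, W, gamma=0.5).argmax(axis=1) == y.argmax(axis=1)).mean())
    ```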

  4. Using Multi-Objective Genetic Programming to Synthesize Stochastic Processes

    NASA Astrophysics Data System (ADS)

    Ross, Brian; Imada, Janine

    Genetic programming is used to automatically construct stochastic processes written in the stochastic π-calculus. Grammar-guided genetic programming constrains search to useful process algebra structures. The time-series behaviour of a target process is denoted with a suitable selection of statistical feature tests. Feature tests can permit complex process behaviours to be effectively evaluated. However, they must be selected with care, in order to accurately characterize the desired process behaviour. Multi-objective evaluation is shown to be appropriate for this application, since it permits heterogeneous statistical feature tests to reside as independent objectives. Multiple undominated solutions can be saved and evaluated after a run, for determination of those that are most appropriate. Since there can be a vast number of candidate solutions, however, strategies for filtering and analyzing this set are required.

  5. Figure-ground organization and object recognition processes: an interactive account.

    PubMed

    Vecera, S P; O'Reilly, R C

    1998-04-01

    Traditional bottom-up models of visual processing assume that figure-ground organization precedes object recognition. This assumption seems logically necessary: How can object recognition occur before a region is labeled as figure? However, some behavioral studies find that familiar regions are more likely to be labeled figure than less familiar regions, a problematic finding for bottom-up models. An interactive account is proposed in which figure-ground processes receive top-down input from object representations in a hierarchical system. A graded, interactive computational model is presented that accounts for behavioral results in which familiarity effects are found. The interactive model offers an alternative conception of visual processing to bottom-up models.

  6. The physician's quality of life: Relationship with ego defense mechanisms and object relations.

    PubMed

    Miranda, Benedito; Louzã, Mário Rodrigues

    2015-11-01

    To assess whether ego defense mechanisms and object relations (the way an individual subjectively experiences his/her relationships with others) are related to quality of life among physicians. In this cross-sectional mail survey, 602 physicians from Botucatu, SP, Brazil, were sent a socio-demographic questionnaire, the Bell Object Relations and Reality Testing Inventory-Form O (BORRTI-O), the Defense Style Questionnaire-40 (DSQ-40), and the World Health Organization Abbreviated Instrument for Quality of Life Assessment (WHOQOL-BREF). 198 questionnaires (33%) with valid responses were obtained. High BORRTI-O scores (indicative of pathology) on the alienation, egocentricity and insecure attachment subscales were associated with reduced WHOQOL-BREF scores for the psychological health and social relationship domains. Immature ego defense mechanisms were associated with lower WHOQOL-BREF scores for all domains. No significant associations of WHOQOL-BREF scores with working hours, workplace or monthly income were observed in the study population. WHOQOL-BREF scores correlated with mature defense mechanisms and normal object relations, suggesting an association between psychological maturity and quality of life among physicians. Copyright © 2015 Elsevier Inc. All rights reserved.

  7. Discourse accessibility constraints in children’s processing of object relative clauses

    PubMed Central

    Haendler, Yair; Kliegl, Reinhold; Adani, Flavia

    2015-01-01

    Children’s poor performance on object relative clauses has been explained in terms of intervention locality. This approach predicts that object relatives with a full DP head and an embedded pronominal subject are easier than object relatives in which both the head noun and the embedded subject are full DPs. This prediction is shared by other accounts formulated to explain processing mechanisms. We conducted a visual-world study designed to test the off-line comprehension and on-line processing of object relatives in German-speaking 5-year-olds. Children were tested on three types of object relatives, all having a full DP head noun and differing with respect to the type of nominal phrase that appeared in the embedded subject position: another full DP, a 1st- or a 3rd-person pronoun. Grammatical skills and memory capacity were also assessed in order to see whether and how they affect children’s performance. Most accurately processed were object relatives with 1st-person pronoun, independently of children’s language and memory skills. Performance on object relatives with two full DPs was overall more accurate than on object relatives with 3rd-person pronoun. In the former condition, children with stronger grammatical skills accurately processed the structure and their memory abilities determined how fast they were; in the latter condition, children only processed accurately the structure if they were strong both in their grammatical skills and in their memory capacity. The results are discussed in the light of accounts that predict different pronoun effects like the ones we find, which depend on the referential properties of the pronouns. We then discuss which role language and memory abilities might have in processing object relatives with various embedded nominal phrases. PMID:26157410

  8. Level of structural quality and process quality in rural preschool classrooms

    PubMed Central

    Hartman, Suzanne C.; Warash, Barbara G.; Curtis, Reagan; Hirst, Jessica Day

    2017-01-01

    Preschool classrooms with varying levels of structural quality requirements across the state of West Virginia were investigated for differences in measured structural and process quality. Quality was measured using group size, child-to-teacher/staff ratio, teacher education, and the Early Childhood Environmental Rating Scale-Revised (ECERS-R; Harms, T., Clifford, R. M., & Cryer, D. (2005). The early childhood environment rating scale-revised. New York, NY: Teachers College Press). Thirty-six classrooms with less structural quality requirements and 136 with more structural quality requirements were measured. There were significant differences between classroom type, with classrooms with more structural quality requirements having significantly higher teacher education levels and higher environmental rating scores on the ECERS-R subscales of Space and Furnishings, Activities, and Program Structure. Results support previous research that stricter structural state regulations are correlated with higher measured structural and process quality in preschool classrooms. Implications for preschool state quality standards are discussed. PMID:29056814

  9. Chip Design Process Optimization Based on Design Quality Assessment

    NASA Astrophysics Data System (ADS)

    Häusler, Stefan; Blaschke, Jana; Sebeke, Christian; Rosenstiel, Wolfgang; Hahn, Axel

    2010-06-01

    Nowadays, the managing of product development projects is increasingly challenging. Especially the IC design of ASICs with both analog and digital components (mixed-signal design) is becoming more and more complex, while the time-to-market window narrows at the same time. Still, high quality standards must be fulfilled. Projects and their status are becoming less transparent due to this complexity. This makes the planning and execution of projects rather difficult. Therefore, there is a need for efficient project control. A main challenge is the objective evaluation of the current development status. Are all requirements successfully verified? Are all intermediate goals achieved? Companies often develop special solutions that are not reusable in other projects. This makes the quality measurement process itself less efficient and produces too much overhead. The method proposed in this paper is a contribution to solve these issues. It is applied at a German design house for analog mixed-signal IC design. This paper presents the results of a case study and introduces an optimized project scheduling on the basis of quality assessment results.

  10. Effects of Case Manager Feedback on the Quality of Individual Habilitation Plan Objectives.

    ERIC Educational Resources Information Center

    Horner, Robert H.; And Others

    1990-01-01

    The functional relation between feedback and improved writing of Individual Habilitation Plan objectives in four adult service agencies serving the retarded was assessed. The agencies improved the quality of their written objectives after receiving feedback from the case manager and maintained those gains 18 months after feedback was terminated.…

  11. Minimum specific cost control of technological processes realized in a living objects-containing microenvironment.

    PubMed

    Amelkin, Alexander A; Blagoveschenskaya, Margarita M; Lobanov, Yury V; Amelkin, Anatoly K

    2003-01-01

    The purpose of the present work is to develop an approach for the development of software and the choice of hardware structures when designing subsystems for automatic control of technological processes realized in a limited space containing living objects (a microenvironment). The subsystems for automatic control of the microenvironment (SACME) under development use the Devices for Air Prophylactic Treatment, Aeroionization, and Purification (DAPTAP) as execution units for increasing the level of safety and quality of agricultural raw material and foodstuffs, for reducing the losses of agricultural produce during storage and cultivation, as well as for intensifying the processes of activation of agricultural produce and industrial microorganisms. A set of interconnected SACMEs works within the framework of a general microenvironmental system (MES). In this research, the population of baker's yeast is chosen as a basic object of control under industrial fed-batch cultivation in a bubbling bioreactor. This project is an example of a minimum-cost automation approach. The microenvironment optimal control problem for baker's yeast cultivation is reduced from profit maximization to the maximization of overall yield, because the material-flow-oriented specific cost correlates closely with the reciprocal of the overall yield. Implementation of the project partially solves a local sustainability problem and supports a balance of microeconomical, microecological and microsocial systems within a technological subsystem realized in a microenvironment, maintaining an optimal value of an economic criterion (e.g. minimum material-flow-oriented specific cost) and ensuring: (a) economic growth (profit increase, raw material saving); (b) high security, safety and quality of agricultural raw material during the storage process and of food produce during a technological process; elimination of the contact of gaseous harmful substances with a subproduct during various

  12. Assessing Learning, Quality and Engagement in Learning Objects: The Learning Object Evaluation Scale for Students (LOES-S)

    ERIC Educational Resources Information Center

    Kay, Robin H.; Knaack, Liesel

    2009-01-01

    Learning objects are interactive web-based tools that support the learning of specific concepts by enhancing, amplifying, and/or guiding the cognitive processes of learners. Research on the impact, effectiveness, and usefulness of learning objects is limited, partially because comprehensive, theoretically based, reliable, and valid evaluation…

  13. Desired Precision in Multi-Objective Optimization: Epsilon Archiving or Rounding Objectives?

    NASA Astrophysics Data System (ADS)

    Asadzadeh, M.; Sahraei, S.

    2016-12-01

    Multi-objective optimization (MO) supports the decision-making process in water resources engineering and design problems. One of the main goals of solving an MO problem is to archive a set of solutions that is well-distributed across a wide range of all the design objectives. Modern MO algorithms use the epsilon dominance concept to define a mesh with a pre-defined grid-cell size (often called epsilon) in the objective space and archive at most one solution in each grid-cell. Epsilon can be set to the desired precision level of each objective function to make sure that the difference between each pair of archived solutions is meaningful. This epsilon archiving process is computationally expensive in problems that have quick-to-evaluate objective functions. This research explores the applicability of a similar but computationally more efficient approach to respect the desired precision level of all objectives in the solution archiving process. In this alternative approach each objective function is rounded to the desired precision level before comparing any new solution to the set of archived solutions, which already have rounded objective function values. This alternative solution archiving approach is compared to the epsilon archiving approach in terms of efficiency and quality of archived solutions for solving mathematical test problems and hydrologic model calibration problems.
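
    A minimal sketch of the two archiving strategies being compared is given below; the dominance test, grid construction and example objective values are generic illustrations rather than the authors' implementation.

    ```python
    # Minimal sketch of the two archiving strategies discussed in the abstract; example
    # values and the simplified within-cell rule are assumptions, not the authors' code.
    def dominates(a, b):
        return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

    def eps_archive(solutions, eps):
        """Epsilon archiving: keep at most one solution per epsilon grid cell."""
        archive = {}
        for f in solutions:
            cell = tuple(int(fi // e) for fi, e in zip(f, eps))
            if cell not in archive or dominates(f, archive[cell]):
                archive[cell] = f
        return list(archive.values())

    def rounded_archive(solutions, eps):
        """Alternative: round objectives to the desired precision first, then keep
        the non-dominated set of the rounded values."""
        rounded = {tuple(round(fi / e) * e for fi, e in zip(f, eps)) for f in solutions}
        return [f for f in rounded if not any(dominates(g, f) for g in rounded if g != f)]

    sols = [(0.12, 0.93), (0.14, 0.91), (0.55, 0.40), (0.57, 0.38), (0.90, 0.10)]
    eps = (0.05, 0.05)
    print("epsilon archive:", sorted(eps_archive(sols, eps)))
    print("rounded archive:", sorted(rounded_archive(sols, eps)))
    ```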

  14. Remote Sensing Image Quality Assessment Experiment with Post-Processing

    NASA Astrophysics Data System (ADS)

    Jiang, W.; Chen, S.; Wang, X.; Huang, Q.; Shi, H.; Man, Y.

    2018-04-01

    This paper briefly describes a post-processing influence assessment experiment comprising three steps: physical simulation, image processing, and image quality assessment. The physical simulation models a sampled imaging system in the laboratory; the imaging system parameters are measured, and the digital images serving as image-processing input are produced by this imaging system with those same parameters. The gathered optically sampled images are then processed by three digital image processes: calibration pre-processing, lossy compression at different compression ratios, and image post-processing with different kernels. The image quality assessment method used is just-noticeable-difference (JND) subjective assessment based on ISO 20462; through subjective assessment of the gathered and processed images, the influence of different imaging parameters and of post-processing on image quality can be determined. The six JND subjective assessment experiments validate each other. Main conclusions include: image post-processing can improve image quality, even with lossy compression, although image quality at higher compression ratios improves less than at lower ratios; and with our image post-processing method, image quality is better when the camera MTF lies within a small range.

  15. 3D Imaging for Museum Artefacts: a Portable Test Object for Heritage and Museum Documentation of Small Objects

    NASA Astrophysics Data System (ADS)

    Hess, M.; Robson, S.

    2012-07-01

    3D colour image data generated for the recording of small museum objects and archaeological finds are highly variable in quality and fitness for purpose. Whilst current technology is capable of extremely high quality outputs, there are currently no common standards or applicable guidelines in either the museum or engineering domain suited to scientific evaluation, understanding and tendering for 3D colour digital data. This paper firstly explains the rationale towards and requirements for 3D digital documentation in museums. Secondly it describes the design process, development and use of a new portable test object suited to sensor evaluation and the provision of user acceptance metrics. The test object is specifically designed for museums and heritage institutions and includes known surface and geometric properties which support quantitative and comparative imaging on different systems. The development for a supporting protocol will allow object reference data to be included in the data processing workflow with specific reference to conservation and curation.

  16. A system to evaluate the scientific quality of biological and restoration objectives using National Wildlife Refuge Comprehensive Conservation Plans as a case study

    USGS Publications Warehouse

    Schroeder, R.L.

    2006-01-01

    It is widely accepted that plans for restoration projects should contain specific, measurable, and science-based objectives to guide restoration efforts. The United States Fish and Wildlife Service (USFWS) is in the process of developing Comprehensive Conservation Plans (CCPs) for more than 500 units in the National Wildlife Refuge System (NWRS). These plans contain objectives for biological and ecosystem restoration efforts on the refuges. Based on USFWS policy, a system was developed to evaluate the scientific quality of such objectives based on three critical factors: (1) Is the objective specific, measurable, achievable, results-oriented, and time-fixed? (2) What is the extent of the rationale that explains the assumptions, logic, and reasoning for the objective? (3) How well was available science used in the development of the objective? The evaluation system scores each factor on a scale of 1 (poor) to 4 (excellent) according to detailed criteria. The biological and restoration objectives from CCPs published as of September 2004 (60 total) were evaluated. The overall average score for all biological and restoration objectives was 1.73. Average scores for each factor were: Factor 1-1.97; Factor 2-1.86; Factor 3-1.38. The overall scores increased from 1997 to 2004. Future restoration efforts may benefit by using this evaluation system during the process of plan development, to ensure that biological and restoration objectives are of the highest scientific quality possible prior to the implementation of restoration plans, and to allow for improved monitoring and adaptive management.

  17. Redefining and expanding quality assurance.

    PubMed

    Robins, J L

    1992-12-01

    To meet the current standards of excellence necessary for blood establishments, we have learned from industry that a movement toward organization-wide quality assurance/total quality management must be made. Everyone in the organization must accept responsibility for participating in providing the highest quality products and services. Quality must be built into processes and design systems to support these quality processes. Quality assurance has been redefined to include a quality planning function described as the most effective way of designing quality into processes. A formalized quality planning process must be part of quality assurance. Continuous quality improvement has been identified as the strategy every blood establishment must support while striving for error-free processing as the long-term objective. The auditing process has been realigned to support and facilitate this same objective. Implementing organization-wide quality assurance/total quality management is one proven plan for guaranteeing the quality of the 20 million products that are transfused into 4 million patients each year and for moving toward the new order.

  18. [Practice report: the process-based indicator dashboard. Visualising quality assurance results in standardised processes].

    PubMed

    Petzold, Thomas; Hertzschuch, Diana; Elchlep, Frank; Eberlein-Gonska, Maria

    2014-01-01

    Process management (PM) is a valuable method for the systematic analysis and structural optimisation of the quality and safety of clinical treatment. PM requires a high motivation and willingness to implement changes on the part of both employees and management. The definition of quality indicators is required to systematically measure the quality of the specified processes. One way to represent comparable quality results is the use of quality indicators of the external quality assurance in accordance with Sect. 137 SGB V—a method which the Federal Joint Committee (GBA) and the institutions commissioned by the GBA have employed and consistently enhanced for more than ten years. Information on the quality of inpatient treatment is available for 30 defined subjects throughout Germany. The combination of specified processes with quality indicators is beneficial for informing employees. A process-based indicator dashboard provides essential information about the treatment process, which can be used for process analysis. Continuous review of these indicator results allows deviations to be detected and errors to be remedied quickly. If due consideration is given to these indicators, they can be used for benchmarking to identify potential process improvements. Copyright © 2014. Published by Elsevier GmbH.

  19. Support system, excavation arrangement, and process of supporting an object

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Arnold, Bill W.

    2017-08-01

    A support system, an excavation arrangement, and a process of supporting an object are disclosed. The support system includes a weight-bearing device and a camming mechanism positioned below the weight-bearing device. A downward force on the weight-bearing device at least partially secures the camming mechanism to opposing surfaces. The excavation arrangement includes a borehole, a support system positioned within and secured to the borehole, and an object positioned on and supported by the support system. The process includes positioning and securing the support system and positioning the object on the weight-bearing device.

  20. A novel no-reference objective stereoscopic video quality assessment method based on visual saliency analysis

    NASA Astrophysics Data System (ADS)

    Yang, Xinyan; Zhao, Wei; Ye, Long; Zhang, Qin

    2017-07-01

    This paper proposes a no-reference objective stereoscopic video quality assessment method, with the motivation of making the results of objective experiments agree more closely with subjective judgments. We believe that image regions with different degrees of visual saliency should not have the same weights when designing an assessment metric. Therefore, we first apply the GBVS algorithm to each frame pair and separate both the left and right viewing images into regions with strong, general and weak saliency. In addition, local features such as blockiness, zero-crossing and depth are extracted and combined in a mathematical model to calculate a quality assessment score. Regions with different degrees of saliency are assigned different weights in the mathematical model. Experimental results demonstrate the superiority of our method compared with existing state-of-the-art no-reference objective stereoscopic video quality assessment methods.
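
    A simplified sketch of saliency-weighted pooling in the spirit of the proposed metric is shown below; the saliency map is taken as given (e.g. from GBVS), the blockiness map is a stand-in for the extracted local features, and the region weights are invented placeholders rather than the paper's model parameters.

    ```python
    # Sketch under assumptions: saliency-weighted pooling of a local quality feature.
    # Saliency thresholds, region weights and the feature itself are illustrative only.
    import numpy as np

    def pool_quality(feature_map, saliency, weights=(0.6, 0.3, 0.1)):
        """Split pixels into strong/general/weak saliency regions and pool a local
        quality feature with a different weight per region."""
        t_hi, t_lo = np.quantile(saliency, [0.9, 0.5])
        regions = [saliency >= t_hi,                       # strong saliency
                   (saliency < t_hi) & (saliency >= t_lo), # general saliency
                   saliency < t_lo]                        # weak saliency
        score = 0.0
        for w, mask in zip(weights, regions):
            if mask.any():
                score += w * float(feature_map[mask].mean())
        return score

    rng = np.random.default_rng(0)
    blockiness = rng.uniform(0.0, 1.0, (64, 64))     # stand-in local distortion feature
    saliency = rng.uniform(0.0, 1.0, (64, 64))       # stand-in GBVS saliency map
    print(round(pool_quality(blockiness, saliency), 4))
    ```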

  1. Measuring Software Product Quality: The ISO 25000 Series and CMMI

    DTIC Science & Technology

    2004-06-14

    performance objectives” covers objectives and requirements for product quality, service quality, and process performance. Process performance objectives...such that product quality, service quality, and process performance attributes are measurable and controlled throughout the project (internal and

  2. The Minimum Data Set Depression Quality Indicator: Does It Reflect Differences in Care Processes?

    ERIC Educational Resources Information Center

    Simmons, S.F.; Cadogan, M.P.; Cabrera, G.R.; Al-Samarrai, N.R.; Jorge, J.S.; Levy-Storms, L.; Osterweil, D.; Schnelle, J.F.

    2004-01-01

    Purpose. The objective of this work was to determine if nursing homes that score differently on prevalence of depression, according to the Minimum Data Set (MDS) quality indicator, also provide different processes of care related to depression. Design and Methods. A cross-sectional study with 396 long-term residents in 14 skilled nursing…

  3. A neuroanatomical model of space-based and object-centered processing in spatial neglect.

    PubMed

    Pedrazzini, Elena; Schnider, Armin; Ptak, Radek

    2017-11-01

    Visual attention can be deployed in space-based or object-centered reference frames. Right-hemisphere damage may lead to distinct deficits of space- or object-based processing, and such dissociations are thought to underlie the heterogeneous nature of spatial neglect. Previous studies have suggested that object-centered processing deficits (such as in copying, reading or line bisection) result from damage to retro-rolandic regions while impaired spatial exploration reflects damage to more anterior regions. However, this evidence is based on small samples and heterogeneous tasks. Here, we tested a theoretical model of neglect that takes into account space-based and object-based processing and relates them to neuroanatomical predictors. One hundred and one right-hemisphere-damaged patients were examined with classic neuropsychological tests and structural brain imaging. Relations between neglect measures and damage to the temporal-parietal junction, intraparietal cortex, insula and middle frontal gyrus were examined with two structural equation models by assuming that object-centered processing (involved in line bisection and single-word reading) and space-based processing (involved in cancelation tasks) either represented a unique latent variable or two distinct variables. Of these two models the latter had better explanatory power. Damage to the intraparietal sulcus was a significant predictor of object-centered, but not space-based processing, while damage to the temporal-parietal junction predicted space-based, but not object-centered processing. Space-based processing and object-centered processing were strongly intercorrelated, indicating that they rely on similar, albeit partly dissociated processes. These findings indicate that object-centered and space-based deficits in neglect are partly independent and result from superior parietal and inferior parietal damage, respectively.

  4. Systematic procedure for designing processes with multiple environmental objectives.

    PubMed

    Kim, Ki-Joo; Smith, Raymond L

    2005-04-01

    Evaluation of multiple objectives is very important in designing environmentally benign processes. It requires a systematic procedure for solving multiobjective decision-making problems due to the complex nature of the problems, the need for complex assessments, and the complicated analysis of multidimensional results. In this paper, a novel systematic procedure is presented for designing processes with multiple environmental objectives. This procedure has four steps: initialization, screening, evaluation, and visualization. The first two steps are used for systematic problem formulation based on mass and energy estimation and order of magnitude analysis. In the third step, an efficient parallel multiobjective steady-state genetic algorithm is applied to design environmentally benign and economically viable processes and to provide more accurate and uniform Pareto optimal solutions. In the last step a new visualization technique for illustrating multiple objectives and their design parameters on the same diagram is developed. Through these integrated steps the decision-maker can easily determine design alternatives with respect to his or her preferences. Most importantly, this technique is independent of the number of objectives and design parameters. As a case study, acetic acid recovery from aqueous waste mixtures is investigated by minimizing eight potential environmental impacts and maximizing total profit. After applying the systematic procedure, the most preferred design alternatives and their design parameters are easily identified.
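
    The evaluation step relies on Pareto optimality over mixed minimisation and maximisation objectives; the sketch below shows such a non-dominated filter under assumed design alternatives and impact values, which are placeholders rather than results from the acetic acid case study.

    ```python
    # Illustrative only: Pareto filtering over mixed objectives (minimise environmental
    # impacts, maximise profit).  Design names and values are invented placeholders.
    def non_dominated(designs, senses):
        """designs: {name: objective vector}; senses: 'min' or 'max' per objective."""
        def key(v):  # convert everything to minimisation
            return tuple(x if s == "min" else -x for x, s in zip(v, senses))
        def dominates(a, b):
            return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))
        items = {n: key(v) for n, v in designs.items()}
        return [n for n, v in items.items()
                if not any(dominates(w, v) for m, w in items.items() if m != n)]

    designs = {  # (global warming potential, acidification, total profit)
        "A": (120.0, 0.8, 1.5),
        "B": (100.0, 0.9, 1.4),
        "C": (130.0, 0.7, 1.2),
        "D": (125.0, 0.9, 1.1),   # dominated by A
    }
    print(non_dominated(designs, senses=("min", "min", "max")))
    ```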

  5. Effects of transference work in the context of therapeutic alliance and quality of object relations.

    PubMed

    Høglend, Per; Hersoug, Anne Grete; Bøgwald, Kjell-Petter; Amlo, Svein; Marble, Alice; Sørbye, Øystein; Røssberg, Jan Ivar; Ulberg, Randi; Gabbard, Glen O; Crits-Christoph, Paul

    2011-10-01

    Transference interpretation is considered as a core active ingredient in dynamic psychotherapy. In common clinical theory, it is maintained that more mature relationships, as well as a strong therapeutic alliance, may be prerequisites for successful transference work. In this study, the interaction between quality of object relations, transference interpretation, and alliance is estimated. One hundred outpatients seeking psychotherapy for depression, anxiety, and personality disorders were randomly assigned to 1 year of weekly sessions of dynamic psychotherapy with transference interpretation or to the same type and duration of treatment, but without the use of transference interpretation. Quality of Object Relations (QOR)-lifelong pattern was evaluated before treatment (P. Høglend, 1994). The Working Alliance Inventory (A. O. Horvath & L. S. Greenberg, 1989; T. J. Tracey & A. M. Kokotovic, 1989) was rated in Session 7. The primary outcome variable was the Psychodynamic Functioning Scales (P. Høglend et al., 2000), measured at pretreatment, posttreatment, and 1 year after treatment termination. A significant Treatment Group × Quality of Object Relations × Alliance interaction was present, indicating that alliance had a significantly different impact on effects of transference interpretation, depending on the level of QOR. The impact of transference interpretation on psychodynamic functioning was more positive within the context of a weak therapeutic alliance for patients with low quality of object relations. For patients with more mature object relations and high alliance, the authors observed a negative effect of transference work. The specific effects of transference work were influenced by the interaction of object relations and alliance, but in the direct opposite direction of what is generally maintained in mainstream clinical theory.

  6. Task and spatial frequency modulations of object processing: an EEG study.

    PubMed

    Craddock, Matt; Martinovic, Jasna; Müller, Matthias M

    2013-01-01

    Visual object processing may follow a coarse-to-fine sequence imposed by fast processing of low spatial frequencies (LSF) and slow processing of high spatial frequencies (HSF). Objects can be categorized at varying levels of specificity: the superordinate (e.g. animal), the basic (e.g. dog), or the subordinate (e.g. Border Collie). We tested whether superordinate and more specific categorization depend on different spatial frequency ranges, and whether any such dependencies might be revealed by or influence signals recorded using EEG. We used event-related potentials (ERPs) and time-frequency (TF) analysis to examine the time course of object processing while participants performed either a grammatical gender-classification task (which generally forces basic-level categorization) or a living/non-living judgement (superordinate categorization) on everyday, real-life objects. Objects were filtered to contain only HSF or LSF. We found a greater positivity and greater negativity for HSF than for LSF pictures in the P1 and N1 respectively, but no effects of task on either component. A later, fronto-central negativity (N350) was more negative in the gender-classification task than the superordinate categorization task, which may indicate that this component relates to semantic or syntactic processing. We found no significant effects of task or spatial frequency on evoked or total gamma band responses. Our results demonstrate early differences in processing of HSF and LSF content that were not modulated by categorization task, with later responses reflecting such higher-level cognitive factors.

  7. Effect of reciprocating agitation thermal processing (RA-TP) on quality of canned tomato (Solanum lycopersicum) puree.

    PubMed

    Pratap Singh, Anubhav; Singh, Anika; Ramaswamy, Hosahalli S

    2017-06-01

    Reciprocating agitation thermal processing (RA-TP) is a recent innovation in the field of canning for obtaining high-quality canned food. The objective of this study was to compare RA-TP with conventional non-agitated (still) processing with respect to the impact on quality (color, antioxidant capacity, total phenols, carotenoid and lycopene contents) of canned tomato (Solanum lycopersicum) puree. Owing to a 63-81% reduction in process times as compared with still processing, tomato puree with a brighter red color (closer to fresh) was obtained during RA-TP. At 3 Hz reciprocation frequency, the loss of antioxidant, lycopene and carotenoid contents could be reduced to 34, 8 and 8% respectively, as compared with 96, 41 and 52% respectively during still processing. In fact, the phenolic content for RA-TP at 3 Hz was 5% higher than in fresh puree. Quality retention generally increased with an increase in frequency, although the differences were less significant at higher reciprocation frequencies (between 2 and 3 Hz). Research findings indicate that RA-TP can be effective for obtaining thermally processed foods with high retention of quality attributes. It can also be concluded that a very high reciprocation frequency (>3 Hz) is not necessarily needed and significant quality improvement can be obtained at lower frequencies (∼2 Hz). © 2016 Society of Chemical Industry.

  8. Flight Dynamics Mission Support and Quality Assurance Process

    NASA Technical Reports Server (NTRS)

    Oh, InHwan

    1996-01-01

    This paper summarizes the Computer Sciences Corporation Flight Dynamics Operation (FDO) quality assurance approach used to support the National Aeronautics and Space Administration Goddard Space Flight Center Flight Dynamics Support Branch. Historically, a strong need has existed for developing systematic quality assurance using methods that account for the unique nature and environment of satellite Flight Dynamics mission support. Over the past few years FDO has developed and implemented proactive quality assurance processes applied to each of the six phases of the Flight Dynamics mission support life cycle: systems and operations concept, system requirements and specifications, software development support, operations planning and training, launch support, and on-orbit mission operations. Rather than performing quality assurance as a final step after work is completed, quality assurance has been built in as work progresses in the form of process assurance. Process assurance activities occur throughout the Flight Dynamics mission support life cycle. The FDO Product Assurance Office developed process checklists for prephase process reviews, mission team orientations, in-progress reviews, and end-of-phase audits. This paper will outline the evolving history of FDO quality assurance approaches, discuss the tailoring of Computer Sciences Corporation's process assurance cycle procedures, describe some of the quality assurance approaches that have been or are being developed, and present some of the successful results.

  9. A Validation of Object-Oriented Design Metrics as Quality Indicators

    NASA Technical Reports Server (NTRS)

    Basili, Victor R.; Briand, Lionel C.; Melo, Walcelio

    1997-01-01

    This paper presents the results of a study in which we empirically investigated the suite of object-oriented (OO) design metrics introduced in another work. More specifically, our goal is to assess these metrics as predictors of fault-prone classes and, therefore, determine whether they can be used as early quality indicators. This study is complementary to the work described previously, where the same suite of metrics had been used to assess frequencies of maintenance changes to classes. To perform our validation accurately, we collected data on the development of eight medium-sized information management systems based on identical requirements. All eight projects were developed using a sequential life cycle model, a well-known OO analysis/design method and the C++ programming language. Based on empirical and quantitative analysis, the advantages and drawbacks of these OO metrics are discussed. Several of Chidamber and Kemerer's OO metrics appear to be useful for predicting class fault-proneness during the early phases of the life-cycle. Also, on our data set, they are better predictors than 'traditional' code metrics, which can only be collected at a later phase of the software development process.
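
    The study's data set and model fitting are not reproduced here; the sketch below only illustrates the general approach of predicting class fault-proneness from OO metrics, using two Chidamber and Kemerer metrics (WMC, CBO) and a small logistic regression on invented values.

    ```python
    # Sketch under assumptions: two Chidamber & Kemerer metrics (WMC, CBO) used as
    # predictors of class fault-proneness via a small logistic regression.  The metric
    # values and fault labels are invented; the study's actual data are not reproduced.
    import numpy as np

    def fit_logistic(X, y, lr=0.1, steps=2000):
        X1 = np.hstack([np.ones((len(X), 1)), X])           # add intercept
        w = np.zeros(X1.shape[1])
        for _ in range(steps):
            p = 1.0 / (1.0 + np.exp(-X1 @ w))
            w -= lr * X1.T @ (p - y) / len(y)                # gradient of the log-loss
        return w

    def predict_proba(X, w):
        X1 = np.hstack([np.ones((len(X), 1)), X])
        return 1.0 / (1.0 + np.exp(-X1 @ w))

    # hypothetical (WMC, CBO) per class and whether a fault was later found in it
    X = np.array([[5, 2], [30, 9], [12, 4], [45, 14], [8, 3], [25, 11]], dtype=float)
    y = np.array([0, 1, 0, 1, 0, 1], dtype=float)
    Z = (X - X.mean(0)) / X.std(0)                           # standardise metrics first
    w = fit_logistic(Z, y)
    print(np.round(predict_proba(Z, w), 2))
    ```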

  10. The COGs (context, object, and goals) in multisensory processing.

    PubMed

    ten Oever, Sanne; Romei, Vincenzo; van Atteveldt, Nienke; Soto-Faraco, Salvador; Murray, Micah M; Matusz, Pawel J

    2016-05-01

    Our understanding of how perception operates in real-world environments has been substantially advanced by studying both multisensory processes and "top-down" control processes influencing sensory processing via activity from higher-order brain areas, such as attention, memory, and expectations. As the two topics have been traditionally studied separately, the mechanisms orchestrating real-world multisensory processing remain unclear. Past work has revealed that the observer's goals gate the influence of many multisensory processes on brain and behavioural responses, whereas some other multisensory processes might occur independently of these goals. Consequently, other forms of top-down control beyond goal dependence are necessary to explain the full range of multisensory effects currently reported at the brain and the cognitive level. These forms of control include sensitivity to stimulus context as well as the detection of matches (or lack thereof) between a multisensory stimulus and categorical attributes of naturalistic objects (e.g. tools, animals). In this review we discuss and integrate the existing findings that demonstrate the importance of such goal-, object- and context-based top-down control over multisensory processing. We then put forward a few principles emerging from this literature review with respect to the mechanisms underlying multisensory processing and discuss their possible broader implications.

  11. Evaluation of the quality of the teaching-learning process in undergraduate courses in Nursing 1

    PubMed Central

    González-Chordá, Víctor Manuel; Maciá-Soler, María Loreto

    2015-01-01

    Objective: to identify aspects of improvement of the quality of the teaching-learning process through the analysis of tools that evaluated the acquisition of skills by undergraduate students of Nursing. Method: prospective longitudinal study conducted in a population of 60 second-year Nursing students based on registration data, from which quality indicators that evaluate the acquisition of skills were obtained, with descriptive and inferential analysis. Results: nine items were identified and nine learning activities included in the assessment tools that did not reach the established quality indicators (p<0.05). There are statistically significant differences depending on the hospital and clinical practices unit (p<0.05). Conclusion: the analysis of the evaluation tools used in the subject "Nursing Care in Welfare Processes" of the analyzed university undergraduate course enabled the detection of the areas for improvement in the teaching-learning process. The challenge of education in nursing is to reach the best clinical research and educational results, in order to provide improvements to the quality of education and health care. PMID:26444173

  12. [Improvement of medical processes with Six Sigma - practicable zero-defect quality in preparation for surgery].

    PubMed

    Sobottka, Stephan B; Töpfer, Armin; Eberlein-Gonska, Maria; Schackert, Gabriele; Albrecht, D Michael

    2010-01-01

    Six Sigma is an innovative management approach to reach practicable zero-defect quality in medical service processes. The Six Sigma principle utilizes strategies which are based on quantitative measurements and which seek to optimize processes, limiting deviations or dispersion from the target process. Hence, Six Sigma aims to eliminate errors or quality problems of all kinds. A pilot project to optimize the preparation for neurosurgery could now show that the Six Sigma method enhanced patient safety in medical care, while at the same time disturbances in the hospital processes and failure costs could be avoided. All six defined safety-relevant quality indicators were significantly improved by changes in the workflow using a standardized process- and patient-oriented approach. Certain defined quality standards, such as a 100% complete surgical preparation at the start of surgery and the required initial contact of the surgeon with the patient/surgical record on the eve of surgery, could be fulfilled within the range of practical zero-defect quality. Likewise, the degree of completion of the surgical record by 4 p.m. on the eve of surgery and its quality could be improved by a factor of 170 and 16, respectively, at sigma values of 4.43 and 4.38. The other two safety quality indicators, "non-communicated changes in the OR schedule" and the "completeness of the OR schedule by 12:30 a.m. on the day before surgery", also show an impressive improvement by a factor of 2.8 and 7.7, respectively, corresponding with sigma values of 3.34 and 3.51. The results of this pilot project demonstrate that the Six Sigma method is eminently suitable for improving the quality of medical processes. In our experience this methodology is suitable even for complex clinical processes with a variety of stakeholders. In particular, in processes in which patient safety plays a key role, the objective of achieving zero-defect quality is reasonable and should definitely be aspired to. Copyright
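
    The abstract reports sigma values such as 4.43 and 4.38; sigma levels of this kind are conventionally derived from an observed defect rate with a 1.5-sigma long-term shift, as sketched below. Whether the authors used exactly this convention, and the defect counts in the example, are assumptions.

    ```python
    # Illustrative conversion only: deriving a process sigma level from a defect rate
    # using the customary 1.5-sigma long-term shift.  The counts below are hypothetical.
    from statistics import NormalDist

    def sigma_level(defects, opportunities, shift=1.5):
        dpmo = 1e6 * defects / opportunities          # defects per million opportunities
        yield_fraction = 1.0 - dpmo / 1e6
        return NormalDist().inv_cdf(yield_fraction) + shift

    # hypothetical counts: 3 incomplete surgical preparations in 2000 operations
    print(round(sigma_level(3, 2000), 2))
    ```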

  13. Auditory-visual object recognition time suggests specific processing for animal sounds.

    PubMed

    Suied, Clara; Viaud-Delmon, Isabelle

    2009-01-01

    Recognizing an object requires binding together several cues, which may be distributed across different sensory modalities, and ignoring competing information originating from other objects. In addition, knowledge of the semantic category of an object is fundamental to determine how we should react to it. Here we investigate the role of semantic categories in the processing of auditory-visual objects. We used an auditory-visual object-recognition task (go/no-go paradigm). We compared recognition times for two categories: a biologically relevant one (animals) and a non-biologically relevant one (means of transport). Participants were asked to react as fast as possible to target objects, presented in the visual and/or the auditory modality, and to withhold their response for distractor objects. A first main finding was that, when participants were presented with unimodal or bimodal congruent stimuli (an image and a sound from the same object), similar reaction times were observed for all object categories. Thus, there was no advantage in the speed of recognition for biologically relevant compared to non-biologically relevant objects. A second finding was that, in the presence of a biologically relevant auditory distractor, the processing of a target object was slowed down, whether or not it was itself biologically relevant. It seems impossible to effectively ignore an animal sound, even when it is irrelevant to the task. These results suggest a specific and mandatory processing of animal sounds, possibly due to phylogenetic memory and consistent with the idea that hearing is particularly efficient as an alerting sense. They also highlight the importance of taking into account the auditory modality when investigating the way object concepts of biologically relevant categories are stored and retrieved.

  14. Action and object processing in brain-injured speakers of Chinese.

    PubMed

    Arévalo, Analia L; Lu, Ching-Ching; Huang, Lydia B-Y; Bates, Elizabeth A; Dronkers, Nina F

    2011-11-01

    To see whether action and object processing across different tasks and modalities differs in brain-injured speakers of Chinese with varying fluency and lesion locations within the left hemisphere. Words and pictures representing actions and objects were presented to a group of 33 participants whose native and/or dominant language was Mandarin Chinese: 23 patients with left-hemisphere lesions due to stroke and 10 language-, age- and education-matched healthy control participants. A set of 120 stimulus items was presented to each participant in three different forms: as black and white line drawings (for picture-naming), as written words (for reading) and as aurally presented words (for word repetition). Patients were divided into groups for two separate analyses: Analysis 1 divided and compared patients based on fluency (Fluent vs. Nonfluent) and Analysis 2 compared patients based on lesion location (Anterior vs. Posterior). Both analyses yielded similar results: Fluent, Nonfluent, Anterior, and Posterior patients all produced significantly more errors when processing action (M = 0.73, SD = 0.45) relative to object (M = 0.79, SD = 0.41) stimuli, and this effect was strongest in the picture-naming task. As in our previous study with English-speaking participants using the same experimental design (Arévalo et al., 2007, Arévalo, Moineau, Saygin, Ludy, & Bates, 2005), we did not find evidence for a double-dissociation in action and object processing between groups with different lesion and fluency profiles. These combined data bring us closer to a more informed view of action/object processing in the brain in both healthy and brain-injured individuals.

  15. Post-examination interpretation of objective test data: monitoring and improving the quality of high-stakes examinations--a commentary on two AMEE Guides.

    PubMed

    Tavakol, Mohsen; Dennick, Reg

    2012-01-01

    As great emphasis is rightly placed upon the importance of assessment to judge the quality of our future healthcare professionals, it is appropriate not only to choose the most appropriate assessment method, but to continually monitor the quality of the tests themselves, in a hope that we may continually improve the process. This article stresses the importance of quality control mechanisms in the exam cycle and briefly outlines some of the key psychometric concepts including reliability measures, factor analysis, generalisability theory and item response theory. The importance of such analyses for the standard setting procedures is emphasised. This article also accompanies two new AMEE Guides in Medical Education (Tavakol M, Dennick R. Post-examination Analysis of Objective Tests: AMEE Guide No. 54 and Tavakol M, Dennick R. 2012. Post examination analysis of objective test data: Monitoring and improving the quality of high stakes examinations: AMEE Guide No. 66) which provide the reader with practical examples of analysis and interpretation, in order to help develop valid and reliable tests.

  16. A SYSTEMATIC PROCEDURE FOR DESIGNING PROCESSES WITH MULTIPLE ENVIRONMENTAL OBJECTIVES

    EPA Science Inventory

    Evaluation and analysis of multiple objectives are very important in designing environmentally benign processes. They require a systematic procedure for solving multi-objective decision-making problems due to the complex nature of the problems and the need for complex assessment....

  17. Neural correlates of the object-recall process in semantic memory.

    PubMed

    Assaf, Michal; Calhoun, Vince D; Kuzu, Cheedem H; Kraut, Michael A; Rivkin, Paul R; Hart, John; Pearlson, Godfrey D

    2006-10-30

    The recall of an object from features is a specific operation in semantic memory in which the thalamus and pre-supplementary motor area (pre-SMA) are integrally involved. Other higher-order semantic cortices are also likely to be involved. We used the object-recall-from-features paradigm, with more sensitive scanning techniques and a larger sample size, to replicate and extend our previous results. Eighteen right-handed healthy participants performed an object-recall task and an association semantic task while undergoing functional magnetic resonance imaging. During object recall, subjects determined whether word pairs describing object features combined to recall an object; during the association task they decided if two words were related. Of brain areas specifically involved in object recall, in addition to the thalamus and pre-SMA, other regions included the left dorsolateral prefrontal cortex, inferior parietal lobule, and middle temporal gyrus, and bilateral rostral anterior cingulate and inferior frontal gyri. These regions are involved in semantic processing, verbal working memory, and response-conflict detection and monitoring. The thalamus likely helps to coordinate activity of these different brain areas. Understanding the circuit that normally mediates this process is relevant for schizophrenia, where many regions in this circuit are functionally abnormal and semantic memory is impaired.

  18. Donabedian's structure-process-outcome quality of care model: Validation in an integrated trauma system.

    PubMed

    Moore, Lynne; Lavoie, André; Bourgeois, Gilles; Lapointe, Jean

    2015-06-01

    According to Donabedian's health care quality model, improvements in the structure of care should lead to improvements in clinical processes that should in turn improve patient outcome. This model has been widely adopted by the trauma community but has not yet been validated in a trauma system. The objective of this study was to assess the performance of an integrated trauma system in terms of structure, process, and outcome and evaluate the correlation between quality domains. Quality of care was evaluated for patients treated in a Canadian provincial trauma system (2005-2010; 57 centers, n = 63,971) using quality indicators (QIs) developed and validated previously. Structural performance was measured by transposing on-site accreditation visit reports onto an evaluation grid according to American College of Surgeons criteria. The composite process QI was calculated as the average of the proportions of conformity to 15 process QIs derived from literature review and expert opinion. Outcome performance was measured using risk-adjusted rates of mortality, complications, and readmission as well as hospital length of stay (LOS). Correlation was assessed with Pearson's correlation coefficients. Statistically significant correlations were observed between structure and process QIs (r = 0.33), and process and outcome QIs (r = -0.33 for readmission, r = -0.27 for LOS). Significant positive correlations were also observed between outcome QIs (r = 0.37 for mortality-readmission; r = 0.39 for mortality-LOS and readmission-LOS; r = 0.45 for mortality-complications; r = 0.34 for readmission-complications; r = 0.63 for complications-LOS). Significant correlations between quality domains observed in this study suggest that Donabedian's structure-process-outcome model is a valid model for evaluating trauma care. Trauma centers that perform well in terms of structure also tend to perform well in terms of clinical processes, which in turn has a favorable influence on patient outcomes.
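    The composite process QI described above is simply an average of per-indicator conformity proportions, and the domain correlations are plain Pearson coefficients. The sketch below illustrates both computations on invented per-centre data; the variable names, the number of centres, and all values are hypothetical, and the study's risk adjustment of outcomes is not reproduced.

```python
import numpy as np
from scipy.stats import pearsonr

# Hypothetical per-centre data: rows = trauma centres, columns = 15 process QIs,
# each value a proportion of conformity (0-1). Structure and outcome scores are
# likewise invented for illustration.
rng = np.random.default_rng(0)
conformity = rng.uniform(0.5, 1.0, size=(57, 15))

composite_process_qi = conformity.mean(axis=1)        # average conformity per centre
structure_score = rng.uniform(0.4, 1.0, size=57)
readmission_rate = rng.uniform(0.02, 0.10, size=57)

r_sp, p_sp = pearsonr(structure_score, composite_process_qi)
r_po, p_po = pearsonr(composite_process_qi, readmission_rate)
print(f"structure vs process: r={r_sp:.2f} (p={p_sp:.3f})")
print(f"process vs readmission: r={r_po:.2f} (p={p_po:.3f})")
```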

  19. Color image processing and object tracking workstation

    NASA Technical Reports Server (NTRS)

    Klimek, Robert B.; Paulick, Michael J.

    1992-01-01

    A system is described for automatic and semiautomatic tracking of objects on film or video tape, which was developed to meet the needs of the microgravity combustion and fluid science experiments at NASA Lewis. The system consists of individual hardware parts working under computer control to achieve a high degree of automation. The most important hardware parts include a 16 mm film projector, a lens system, a video camera, an S-VHS tapedeck, a frame grabber, and some storage and output devices. Both the projector and the tapedeck have a computer interface enabling remote control. Tracking software was developed to control the overall operation. In the automatic mode, the main tracking program controls the projector or tapedeck frame incrementation, grabs a frame, processes it, locates the edge of the objects being tracked, and stores the coordinates in a file. This process is performed repeatedly until the last frame is reached. Three representative applications are described. These applications represent typical uses and include tracking the propagation of a flame front, tracking the movement of a liquid-gas interface with extremely poor visibility, and characterizing a diffusion flame according to color and shape.
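    A minimal sketch of the per-frame grab-process-locate-record loop described above, using simple thresholding and a centroid estimate; the real system also controlled the film and tape hardware and used edge detection, neither of which is reproduced here, and the threshold value and synthetic frames are assumptions.

```python
import numpy as np

def track_centroid(frame: np.ndarray, threshold: int):
    """Locate the bright object in one grayscale frame by thresholding and
    return the centroid of the above-threshold pixels as (row, column)."""
    mask = frame >= threshold
    if not mask.any():
        return None                      # object not visible in this frame
    rows, cols = np.nonzero(mask)
    return rows.mean(), cols.mean()

def track_sequence(frames, threshold=128):
    """Process frames one at a time, mimicking the grab-process-record loop."""
    return [track_centroid(f, threshold) for f in frames]

# Hypothetical usage with synthetic frames
frames = [np.zeros((64, 64), dtype=np.uint8) for _ in range(3)]
for i, f in enumerate(frames):
    f[20 + i, 30 + 2 * i] = 255          # a single bright pixel moving across frames
print(track_sequence(frames, threshold=200))
```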

  20. Hierarchical Processing of Auditory Objects in Humans

    PubMed Central

    Kumar, Sukhbinder; Stephan, Klaas E; Warren, Jason D; Friston, Karl J; Griffiths, Timothy D

    2007-01-01

    This work examines the computational architecture used by the brain during the analysis of the spectral envelope of sounds, an important acoustic feature for defining auditory objects. Dynamic causal modelling and Bayesian model selection were used to evaluate a family of 16 network models explaining functional magnetic resonance imaging responses in the right temporal lobe during spectral envelope analysis. The models encode different hypotheses about the effective connectivity between Heschl's Gyrus (HG), containing the primary auditory cortex, planum temporale (PT), and superior temporal sulcus (STS), and the modulation of that coupling during spectral envelope analysis. In particular, we aimed to determine whether information processing during spectral envelope analysis takes place in a serial or parallel fashion. The analysis provides strong support for a serial architecture with connections from HG to PT and from PT to STS and an increase of the HG to PT connection during spectral envelope analysis. The work supports a computational model of auditory object processing, based on the abstraction of spectro-temporal “templates” in the PT before further analysis of the abstracted form in anterior temporal lobe areas. PMID:17542641

  1. Process for coating an object with silicon carbide

    NASA Technical Reports Server (NTRS)

    Levin, Harry (Inventor)

    1989-01-01

    A process for coating a carbon or graphite object with silicon carbide by contacting it with silicon liquid and vapor over various lengths of contact time. In the process, a stream of silicon-containing precursor material in gaseous phase below the decomposition temperature of said gas and a co-reactant, carrier or diluent gas such as hydrogen is passed through a hole within a high emissivity, thin, insulating septum into a reaction chamber above the melting point of silicon. The thin septum has one face below the decomposition temperature of the gas and an opposite face exposed to the reaction chamber. The precursor gas is decomposed directly to silicon in the reaction chamber. A stream of any decomposition gas and any unreacted precursor gas from said reaction chamber is removed. The object within the reaction chamber is then contacted with silicon, and recovered after it has been coated with silicon carbide.

  2. [Application of quality by design in granulation process for ginkgo leaf tablet (Ⅱ): identification of critical quality attributes].

    PubMed

    Xu, Bing; Cui, Xiang-Long; Yang, Chan; Wang, Xin; Shi, Xin-Yuan; Qiao, Yan-Jiang

    2017-03-01

    Quality by design (QbD) highlights the concept of "begin with the end", which means to thoroughly understand the target product quality first, and then guide pharmaceutical process development and quality control throughout the whole manufacturing process. In this paper, the Ginkgo biloba granule intermediates were taken as the research object, and the requirements on the tensile strength of tablets were treated as the goal, in order to establish methods for identifying the granules' critical quality attributes (CQAs) and for setting the CQAs' limits. Firstly, an orthogonal partial least squares (OPLS) model was adopted to build the relationship between the micromeritic properties of 29 batches of granules and the tensile strength of ginkgo leaf tablets, and the potential critical quality attributes (pCQAs) were thereby screened by variable importance in projection (VIP) indexes. Then, a series of OPLS models were rebuilt by reducing the pCQA variables one by one, following the rule of VIP values from low to high. The model performance results demonstrated that the calibration and predictive performance of the model showed no decreasing trend after variable reduction. In consideration of the results of variable selection, as well as the collinearity test and the testability of the pCQAs, the median particle size (D₅₀) and the bulk density (Da) were identified as critical quality attributes (CQAs). The design space of the CQAs was developed based on a multiple linear regression model established between the CQAs (D₅₀ and Da) and the tensile strength. The control constraints of the CQAs were determined as 170 μm < D₅₀ < 500 μm and 0.30 g·cm⁻³ < Da < … for the granulation process of the ginkgo leaf tablet. Copyright© by the Chinese Pharmaceutical Association.
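    A common way to compute VIP indexes for variable screening is shown below for a plain PLS model fitted with scikit-learn (which has no OPLS implementation, so ordinary PLS stands in for the OPLS model used in the paper). The batch data are synthetic, and the VIP > 1 cut-off is the usual rule of thumb, not necessarily the threshold used by the authors.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression

def vip_scores(pls: PLSRegression, X: np.ndarray) -> np.ndarray:
    """Variable importance in projection (VIP) for a fitted single-response PLS model."""
    t = pls.transform(X)         # X scores,   shape (n_samples, n_components)
    w = pls.x_weights_           # X weights,  shape (n_features, n_components)
    q = pls.y_loadings_          # Y loadings, shape (1, n_components)
    p, a = w.shape
    # variance of y explained by each latent component
    ss = np.array([(t[:, i] ** 2).sum() * q[0, i] ** 2 for i in range(a)])
    w_norm = w / np.linalg.norm(w, axis=0)
    return np.sqrt(p * (w_norm ** 2 @ ss) / ss.sum())

# Hypothetical data: 29 granule batches x 8 micromeritic properties vs tensile strength
rng = np.random.default_rng(1)
X = rng.normal(size=(29, 8))
y = X[:, 0] * 0.8 + X[:, 3] * 0.5 + rng.normal(scale=0.2, size=29)

pls = PLSRegression(n_components=2).fit(X, y)
vip = vip_scores(pls, X)
print(np.where(vip > 1.0)[0])    # candidate critical quality attributes (VIP > 1 rule of thumb)
```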

  3. Reducing the complexity of the software design process with object-oriented design

    NASA Technical Reports Server (NTRS)

    Schuler, M. P.

    1991-01-01

    Designing software is a complex process. How object-oriented design (OOD), coupled with formalized documentation and tailored object diagraming techniques, can reduce the complexity of the software design process is described and illustrated. The described OOD methodology uses a hierarchical decomposition approach in which parent objects are decomposed into layers of lower level child objects. A method of tracking the assignment of requirements to design components is also included. Increases in the reusability, portability, and maintainability of the resulting products are also discussed. This method was built on a combination of existing technology, teaching experience, consulting experience, and feedback from design method users. The discussed concepts are applicable to hierarchal OOD processes in general. Emphasis is placed on improving the design process by documenting the details of the procedures involved and incorporating improvements into those procedures as they are developed.

  4. A Comparative Study of the Quality of Teaching Learning Process at Post Graduate Level in the Faculty of Science and Social Science

    ERIC Educational Resources Information Center

    Shahzadi, Uzma; Shaheen, Gulnaz; Shah, Ashfaque Ahmed

    2012-01-01

    The study was intended to compare the quality of teaching learning process in the faculty of social science and science at University of Sargodha. This study was descriptive and quantitative in nature. The objectives of the study were to compare the quality of teaching learning process in the faculty of social science and science at University of…

  5. Improvement of quality of 3D printed objects by elimination of microscopic structural defects in fused deposition modeling.

    PubMed

    Gordeev, Evgeniy G; Galushko, Alexey S; Ananikov, Valentine P

    2018-01-01

    Additive manufacturing with fused deposition modeling (FDM) is currently being optimized for a wide range of research and commercial applications. The major disadvantage of FDM-created products is their low quality and structural defects (porosity), which pose an obstacle to using them in functional prototyping and in direct digital manufacturing of objects intended to be in contact with gases and liquids. This article describes a simple and efficient approach for assessing the quality of 3D printed objects. Using this approach it was shown that the wall permeability of a printed object depends on its geometric shape and decreases in the following order: cylinder > cube > pyramid > sphere > cone. Filament feed rate, wall geometry and G-code-defined wall structure were found to be the primary parameters that influence the quality of 3D-printed products. Optimization of these parameters led to an overall increase in quality and an improvement of sealing properties. It was demonstrated that high quality of 3D printed objects can be achieved using routinely available printers and standard filaments.

  6. Invariant visual object recognition and shape processing in rats

    PubMed Central

    Zoccolan, Davide

    2015-01-01

    Invariant visual object recognition is the ability to recognize visual objects despite the vastly different images that each object can project onto the retina during natural vision, depending on its position and size within the visual field, its orientation relative to the viewer, etc. Achieving invariant recognition represents such a formidable computational challenge that it is often assumed to be a unique hallmark of primate vision. Historically, this has limited the invasive investigation of its neuronal underpinnings to monkey studies, in spite of the narrow range of experimental approaches that these animal models allow. Meanwhile, rodents have been largely neglected as models of object vision, because of the widespread belief that they are incapable of advanced visual processing. However, the powerful array of experimental tools that have been developed to dissect neuronal circuits in rodents has made these species very attractive to vision scientists too, promoting a new tide of studies that have started to systematically explore visual functions in rats and mice. Rats, in particular, have been the subjects of several behavioral studies aimed at assessing how advanced object recognition and shape processing are in this species. Here, I review these recent investigations, as well as earlier studies of rat pattern vision, to provide a historical overview and a critical summary of the state of knowledge about rat object vision. The picture emerging from this survey is very encouraging with regard to the possibility of using rats as complementary models to monkeys in the study of higher-level vision. PMID:25561421

  7. [Electrophysiological bases of semantic processing of objects].

    PubMed

    Kahlaoui, Karima; Baccino, Thierry; Joanette, Yves; Magnié, Marie-Noële

    2007-02-01

    How pictures and words are stored and processed in the human brain constitutes a long-standing question in cognitive psychology. Behavioral studies have yielded a large amount of data addressing this issue. Generally speaking, these data show that there are some interactions between the semantic processing of pictures and words. However, behavioral methods can provide only limited insight into certain findings. Fortunately, event-related potentials (ERPs) provide on-line cues about the temporal nature of cognitive processes and contribute to the exploration of their neural substrates. ERPs have been used in order to better understand the semantic processing of words and pictures. The main objective of this article is to offer an overview of the electrophysiological bases of the semantic processing of words and pictures. The studies presented in this article showed that the processing of words is associated with an N400 component, whereas pictures elicited both N300 and N400 components. Topographical analysis of the N400 distribution over the scalp is compatible with the idea that both image-mediated concrete words and pictures access an amodal semantic system. However, given the distinctive N300 patterns, observed only during picture processing, it appears that picture and word processing rely upon distinct neuronal networks, even if they end up activating more or less similar semantic representations.

  8. ERPs Differentially Reflect Automatic and Deliberate Processing of the Functional Manipulability of Objects

    PubMed Central

    Madan, Christopher R.; Chen, Yvonne Y.; Singhal, Anthony

    2016-01-01

    It is known that the functional properties of an object can interact with perceptual, cognitive, and motor processes. Previously we have found that a between-subjects manipulation of judgment instructions resulted in different manipulability-related memory biases in an incidental memory test. To better understand this effect we recorded electroencephalography (EEG) while participants made judgments about images of objects that were either high or low in functional manipulability (e.g., hammer vs. ladder). Using a between-subjects design, participants judged whether they had seen the object recently (Personal Experience), or could manipulate the object using their hand (Functionality). We focused on the P300 and slow-wave event-related potentials (ERPs) as reflections of attentional allocation. In both groups, we observed higher P300 and slow wave amplitudes for high-manipulability objects at electrodes Pz and C3. As P300 is thought to reflect bottom-up attentional processes, this may suggest that the processing of high-manipulability objects recruited more attentional resources. Additionally, the P300 effect was greater in the Functionality group. A more complex pattern was observed at electrode C3 during slow wave: processing the high-manipulability objects in the Functionality instruction evoked a more positive slow wave than in the other three conditions, likely related to motor simulation processes. These data provide neural evidence that effects of manipulability on stimulus processing are further mediated by automatic vs. deliberate motor-related processing. PMID:27536224

  9. Improving Vintage Seismic Data Quality through Implementation of Advance Processing Techniques

    NASA Astrophysics Data System (ADS)

    Latiff, A. H. Abdul; Boon Hong, P. G.; Jamaludin, S. N. F.

    2017-10-01

    It is essential in petroleum exploration to have high-resolution subsurface images, both vertically and horizontally, in order to uncover new geological and geophysical aspects of the subsurface. A lack of exploration success may stem from poor imaging quality, which leads to inaccurate analysis and interpretation. In this work, we re-processed an existing seismic dataset with an emphasis on two objectives. The first was to produce better 3D seismic data quality, with full retention of relative amplitudes and significantly reduced seismic and structural uncertainty. The second was to facilitate further prospect delineation through enhanced data resolution, fault definition and event continuity, particularly in the syn-rift section and at basement-cover contacts, and in turn to better understand the geology of the subsurface, especially with regard to the distribution of the fluvial and channel sands. By adding recent, state-of-the-art broadband processing techniques such as source and receiver de-ghosting, high-density velocity analysis and shallow-water de-multiple, the re-processing produced better overall reflection detail and frequency content in specific target zones, particularly in the deeper section.

  10. Objective measurement of the optical image quality in the human eye

    NASA Astrophysics Data System (ADS)

    Navarro, Rafael M.

    2001-05-01

    This communication reviews some recent studies on the optical performance of the human eye. Although the retinal image cannot be recorded directly, different objective methods have been developed which make it possible to determine optical quality parameters, such as the Point Spread Function (PSF), the Modulation Transfer Function (MTF), the geometrical ray aberrations or the wavefront distortions, in the living human eye. These methods have been applied in both basic and applied research. This includes the measurement of the optical performance of the eye across the visual field, the optical quality of eyes with intraocular lens implants, the aberrations induced by LASIK refractive surgery, and the manufacture of customized phase plates to compensate the wavefront aberration of the eye.
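    The quantities named above are directly related: the MTF is the normalised magnitude of the Fourier transform of the PSF. A minimal sketch of that relation, assuming a synthetic Gaussian PSF rather than measured ocular data:

```python
import numpy as np

def mtf_from_psf(psf: np.ndarray) -> np.ndarray:
    """Modulation transfer function as the normalised magnitude of the
    2-D Fourier transform of a sampled, centred point spread function."""
    otf = np.fft.fftshift(np.fft.fft2(np.fft.ifftshift(psf)))
    mtf = np.abs(otf)
    return mtf / mtf.max()       # normalise so that MTF(0) = 1

# Hypothetical PSF: a small Gaussian blur spot sampled on a 128x128 grid
n = 128
x = np.arange(n) - n // 2
xx, yy = np.meshgrid(x, x)
psf = np.exp(-(xx ** 2 + yy ** 2) / (2 * 3.0 ** 2))
psf /= psf.sum()

mtf = mtf_from_psf(psf)
print(mtf[n // 2, n // 2])       # 1.0 at zero spatial frequency
```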

  11. Simulating complex intracellular processes using object-oriented computational modelling.

    PubMed

    Johnson, Colin G; Goldman, Jacki P; Gullick, William J

    2004-11-01

    The aim of this paper is to give an overview of computer modelling and simulation in cellular biology, in particular as applied to complex biochemical processes within the cell. This is illustrated by the use of the techniques of object-oriented modelling, where the computer is used to construct abstractions of objects in the domain being modelled, and these objects then interact within the computer to simulate the system and allow emergent properties to be observed. The paper also discusses the role of computer simulation in understanding complexity in biological systems, and the kinds of information which can be obtained about biology via simulation.
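    As a toy illustration of the object-oriented style described above (domain entities as classes whose interactions are simulated step by step so that emergent behaviour can be observed), the sketch below models ligand binding to a population of receptor objects; the class names, rate constants, and update rule are invented for illustration and do not correspond to any model in the paper.

```python
import random

class Receptor:
    """Toy cell-surface receptor that can bind a ligand."""
    def __init__(self):
        self.bound = False

    def step(self, ligand_level: float, k_on: float = 0.1, k_off: float = 0.05):
        """One stochastic update: bind with probability ~ k_on*ligand, unbind with k_off."""
        if not self.bound and random.random() < k_on * ligand_level:
            self.bound = True
        elif self.bound and random.random() < k_off:
            self.bound = False

class Cell:
    """Container object holding many receptors; emergent occupancy is observed."""
    def __init__(self, n_receptors: int = 100):
        self.receptors = [Receptor() for _ in range(n_receptors)]

    def simulate(self, steps: int, ligand_level: float) -> float:
        for _ in range(steps):
            for r in self.receptors:
                r.step(ligand_level)
        return sum(r.bound for r in self.receptors) / len(self.receptors)

random.seed(0)
print(Cell().simulate(steps=200, ligand_level=0.5))   # fraction of bound receptors
```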

  12. Teaching Quality Object-Oriented Programming

    ERIC Educational Resources Information Center

    Feldman, Yishai A.

    2005-01-01

    Computer science students need to learn how to write high-quality software. An important methodology for achieving quality is design-by-contract, in which code is developed together with its specification, which is given as class invariants and method pre- and postconditions. This paper describes practical experience in teaching design-by-contract…

  13. Self-Inversion of the Image of a Small-Scale Opaque Object in the Process of Focusing of the Illuminating Beam in an Absorbing Medium

    NASA Astrophysics Data System (ADS)

    Bubis, E. L.; Lozhrkarev, V. V.; Stepanov, A. N.; Smirnov, A. I.; Martynov, V. O.; Mal'shakova, O. A.; Silin, D. E.; Gusev, S. A.

    2017-03-01

    We describe the process of adaptive self-inversion of the image (nonlinear switching) of a small-scale opaque object when the amplitude-modulated laser beam that illuminates it is focused in a weakly absorbing medium. It is shown that, despite the nonlocal character of the process, which is due to thermal nonlinearity, the brightness-inverted image is characterized by acceptable quality and a high conversion coefficient. It is also shown that the coefficient of conversion of the original image to the inverted one depends on the ratio of the object dimensions to the size of the illuminating beam, and decreases sharply for relatively large objects. The obtained experimental data agree with the numerical calculations. Inversion of the images of several model objects and of microdefects in a nonlinear KDP crystal is demonstrated.

  14. Operation room tool handling and miscommunication scenarios: an object-process methodology conceptual model.

    PubMed

    Wachs, Juan P; Frenkel, Boaz; Dori, Dov

    2014-11-01

    Errors in the delivery of medical care are the principal cause of inpatient mortality and morbidity, accounting for around 98,000 deaths in the United States of America (USA) annually. Ineffective team communication, especially in the operation room (OR), is a major root cause of these errors. This miscommunication can be reduced by analyzing and constructing a conceptual model of communication and miscommunication in the OR. We introduce the principles underlying Object-Process Methodology (OPM)-based modeling of the intricate interactions between the surgeon and the surgical technician while handling surgical instruments in the OR. This model is a software- and hardware-independent description of the agents engaged in communication events, their physical activities, and their interactions. The model enables assessing whether the task-related objectives of the surgical procedure were achieved and completed successfully and what errors can occur during the communication. The facts used to construct the model were gathered from observations of various types of miscommunication in the operating room and their outcomes. The model takes advantage of the compact ontology of OPM, which comprises stateful objects - things that exist physically or informatically - and processes - things that transform objects by creating them, consuming them or changing their state. The modeled communication modalities are verbal and non-verbal, and errors are modeled as processes that deviate from the "sunny day" scenario. Using the OPM refinement mechanism of in-zooming, key processes are drilled into and elaborated, along with the objects that are required as agents or instruments, or objects that these processes transform. The model was developed through an iterative process of observation, modeling, group discussions, and simplification. The model faithfully represents the processes related to tool handling that take place in an OR during an operation. The specification is at

  15. Effects of Transference Work in the Context of Therapeutic Alliance and Quality of Object Relations

    ERIC Educational Resources Information Center

    Hoglend, Per; Hersoug, Anne Grete; Bogwald, Kjell-Petter; Amlo, Svein; Marble, Alice; Sorbye, Oystein; Rossberg, Jan Ivar; Ulberg, Randi; Gabbard, Glen O.; Crits-Christoph, Paul

    2011-01-01

    Objective: Transference interpretation is considered as a core active ingredient in dynamic psychotherapy. In common clinical theory, it is maintained that more mature relationships, as well as a strong therapeutic alliance, may be prerequisites for successful transference work. In this study, the interaction between quality of object relations,…

  16. Fuel quality processing study, volume 1

    NASA Astrophysics Data System (ADS)

    Ohara, J. B.; Bela, A.; Jentz, N. E.; Syverson, H. T.; Klumpe, H. W.; Kessler, R. E.; Kotzot, H. T.; Loran, B. L.

    1981-04-01

    A fuel quality processing study is presented that provides a data base for an intelligent tradeoff between advanced turbine technology and liquid fuel quality, and that also guides the development of specifications for future synthetic fuels anticipated for use in the period 1985 to 2000. Four technical performance tests are discussed: on-site pretreating, existing refineries to upgrade fuels, new refineries to upgrade fuels, and data evaluation. The base case refinery is a modern Midwest refinery processing 200,000 BPD of a 60/40 domestic/import petroleum crude mix. The synthetic crudes used for upgrading to marketable products and turbine fuel are shale oil and coal liquids. Of these syncrudes, 50,000 BPD are processed in the existing petroleum refinery, requiring additional process units and reducing petroleum feed, and in a new refinery designed for processing each syncrude to produce gasoline, distillate fuels, resid fuels, turbine fuel, LPGs and coke. An extensive collection of synfuel properties and upgrading data was prepared for the application of a linear program model to investigate the most economical production slate meeting petroleum product specifications and turbine fuels of various quality grades. Technical and economic projections were developed for 36 scenarios, based on 4 different crude feeds to either modified existing or new refineries operated in 2 different modes to produce 7 differing grades of turbine fuels. A required product selling price of turbine fuel for each processing route was calculated. Procedures and projected economics were developed for on-site treatment of turbine fuel to meet limitations on impurities and emission of pollutants.
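    A linear program for choosing a production slate, as mentioned above, can be sketched in a few lines with scipy; the products, margins, crude-consumption factors and constraints below are invented toy numbers and bear no relation to the study's refinery model.

```python
import numpy as np
from scipy.optimize import linprog

# Hypothetical mini production-slate LP: choose barrels/day of gasoline,
# distillate and turbine fuel to maximise margin, subject to crude availability
# and a contracted minimum of turbine fuel. All figures are illustrative only.
margin = np.array([12.0, 9.0, 10.0])            # $/bbl for each product
crude_per_bbl = np.array([1.20, 1.10, 1.15])    # bbl of crude consumed per product bbl

A_ub = [crude_per_bbl,            # total crude use <= 200,000 BPD
        [0.0, 0.0, -1.0]]         # -turbine_fuel <= -20,000  (i.e. >= 20,000 BPD)
b_ub = [200_000.0, -20_000.0]

res = linprog(c=-margin, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None)] * 3)
print(res.x.round(0), -res.fun)   # optimal slate (bbl/day) and daily margin ($)
```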

  17. Fuel quality processing study, volume 1

    NASA Technical Reports Server (NTRS)

    Ohara, J. B.; Bela, A.; Jentz, N. E.; Syverson, H. T.; Klumpe, H. W.; Kessler, R. E.; Kotzot, H. T.; Loran, B. L.

    1981-01-01

    A fuel quality processing study is presented that provides a data base for an intelligent tradeoff between advanced turbine technology and liquid fuel quality, and that also guides the development of specifications for future synthetic fuels anticipated for use in the period 1985 to 2000. Four technical performance tests are discussed: on-site pretreating, existing refineries to upgrade fuels, new refineries to upgrade fuels, and data evaluation. The base case refinery is a modern Midwest refinery processing 200,000 BPD of a 60/40 domestic/import petroleum crude mix. The synthetic crudes used for upgrading to marketable products and turbine fuel are shale oil and coal liquids. Of these syncrudes, 50,000 BPD are processed in the existing petroleum refinery, requiring additional process units and reducing petroleum feed, and in a new refinery designed for processing each syncrude to produce gasoline, distillate fuels, resid fuels, turbine fuel, LPGs and coke. An extensive collection of synfuel properties and upgrading data was prepared for the application of a linear program model to investigate the most economical production slate meeting petroleum product specifications and turbine fuels of various quality grades. Technical and economic projections were developed for 36 scenarios, based on 4 different crude feeds to either modified existing or new refineries operated in 2 different modes to produce 7 differing grades of turbine fuels. A required product selling price of turbine fuel for each processing route was calculated. Procedures and projected economics were developed for on-site treatment of turbine fuel to meet limitations on impurities and emission of pollutants.

  18. Sensory Processing Relates to Attachment to Childhood Comfort Objects of College Students

    ERIC Educational Resources Information Center

    Kalpidou, Maria

    2012-01-01

    The author tested the hypothesis that attachment to comfort objects is based on the sensory processing characteristics of the individual. Fifty-two undergraduate students with and without a childhood comfort object reported sensory responses and performed a tactile threshold task. Those with a comfort object described their object and rated their…

  19. Monitoring Processes in Visual Search Enhanced by Professional Experience: The Case of Orange Quality-Control Workers

    PubMed Central

    Visalli, Antonino; Vallesi, Antonino

    2018-01-01

    Visual search tasks have often been used to investigate how cognitive processes change with expertise. Several studies have shown visual experts' advantages in detecting objects related to their expertise. Here, we tried to extend these findings by investigating whether professional search experience could boost top-down monitoring processes involved in visual search, independently of advantages specific to objects of expertise. To this aim, we recruited a group of quality-control workers employed in citrus farms. Given the specific features of this type of job, we expected that the extensive employment of monitoring mechanisms during orange selection could enhance these mechanisms even in search situations in which orange-related expertise is not suitable. To test this hypothesis, we compared performance of our experimental group and of a well-matched control group on a computerized visual search task. In one block the target was an orange (expertise target) while in the other block the target was a Smurfette doll (neutral target). The a priori hypothesis was to find an advantage for quality-controllers in those situations in which monitoring was especially involved, that is, when deciding the presence/absence of the target required a more extensive inspection of the search array. Results were consistent with our hypothesis. Quality-controllers were faster in those conditions that extensively required monitoring processes, specifically, the Smurfette-present and both target-absent conditions. No differences emerged in the orange-present condition, which appeared to rely mainly on bottom-up processes. These results suggest that top-down processes in visual search can be enhanced through immersive real-life experience beyond visual expertise advantages. PMID:29497392

  20. Systems and processes that ensure high quality care.

    PubMed

    Bassett, Sally; Westmore, Kathryn

    2012-10-01

    This is the second in a series of articles examining the components of good corporate governance. It considers how the structures and processes for quality governance can affect an organisation's ability to be assured about the quality of care. Complex information systems and procedures can lead to poor quality care, but sound structures and processes alone are insufficient to ensure good governance, and behavioural factors play a significant part in making sure that staff are enabled to provide good quality care. The next article in this series looks at how the information reporting of an organisation can affect its governance.

  1. Guidance on Systematic Planning Using the Data Quality Objectives Process, EPA QA/G-4

    EPA Pesticide Factsheets

    Provides a standard working tool for project managers and planners to develop DQOs for determining the type, quantity, and quality of data needed to reach defensible decisions or make credible estimates.

  2. A Quality Process Approach to Electronic System Reliability: Supplier Quality Assessment Procedure. Volume 2

    DTIC Science & Technology

    1993-11-01

    Contents (fragments): Business Process and Support Service Quality; Supplier Quality; Results; Product and Service Quality Results.

  3. A Case for Inhibition: Visual Attention Suppresses the Processing of Irrelevant Objects

    ERIC Educational Resources Information Center

    Wuhr, Peter; Frings, Christian

    2008-01-01

    The present study investigated the ability to inhibit the processing of an irrelevant visual object while processing a relevant one. Participants were presented with 2 overlapping shapes (e.g., circle and square) in different colors. The task was to name the color of the relevant object designated by shape. Congruent or incongruent color words…

  4. A fuzzy MCDM model with objective and subjective weights for evaluating service quality in hotel industries

    NASA Astrophysics Data System (ADS)

    Zoraghi, Nima; Amiri, Maghsoud; Talebi, Golnaz; Zowghi, Mahdi

    2013-12-01

    This paper presents a fuzzy multi-criteria decision-making (FMCDM) model that integrates both subjective and objective weights for ranking and evaluating the service quality of hotels. The objective method selects the weights of criteria through mathematical calculation, while the subjective method uses the judgments of decision makers. In this paper, we use a combination of the weights obtained by both approaches to evaluate service quality in the hotel industry. A real case study ranking five hotels is presented, and examples are shown to illustrate the capabilities of the proposed method.
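    One common choice for the "mathematical calculation" of objective weights is Shannon entropy, which can then be blended with judgment-based subjective weights. The crisp (non-fuzzy) sketch below only illustrates that idea; the entropy method, the 50/50 blend, and the hotel ratings are assumptions, not the paper's exact FMCDM procedure.

```python
import numpy as np

def entropy_weights(decision_matrix: np.ndarray) -> np.ndarray:
    """Objective criterion weights from Shannon entropy of a benefit-type
    decision matrix with alternatives in rows and criteria in columns."""
    p = decision_matrix / decision_matrix.sum(axis=0)
    m = decision_matrix.shape[0]
    entropy = -(p * np.log(p)).sum(axis=0) / np.log(m)
    divergence = 1.0 - entropy
    return divergence / divergence.sum()

# Hypothetical ratings of 5 hotels on 4 service-quality criteria (higher = better)
scores = np.array([[7, 8, 6, 9],
                   [6, 7, 8, 7],
                   [9, 6, 7, 8],
                   [5, 9, 6, 6],
                   [8, 7, 9, 7]], dtype=float)

w_obj = entropy_weights(scores)                 # data-driven weights
w_subj = np.array([0.4, 0.3, 0.2, 0.1])         # judgment-based weights (assumed)
w = 0.5 * w_obj + 0.5 * w_subj                  # simple convex combination
w /= w.sum()

ranking = np.argsort(scores @ w)[::-1]
print("hotel ranking (best first):", ranking)
```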

  5. Prevention and management of "do not return" notices: a quality improvement process for supplemental staffing nursing agencies.

    PubMed

    Ade-Oshifogun, Jochebed Bosede; Dufelmeier, Thaddeus

    2012-01-01

    This article describes a quality improvement process for "do not return" (DNR) notices for healthcare supplemental staffing agencies and healthcare facilities that use them. It is imperative that supplemental staffing agencies partner with healthcare facilities in assuring the quality of supplemental staff. Although supplemental staffing agencies attempt to ensure quality staffing, supplemental staff are sometimes subjectively evaluated by healthcare facilities as "DNR." The objective of this article is to describe a quality improvement process to prevent and manage "DNR" within healthcare organizations. We developed a curriculum and accompanying evaluation tool by adapting Rampersad's problem-solving discipline approach: (a) definition of area(s) for improvement; (b) identification of all possible causes; (c) development of an action plan; (d) implementation of the action plan; (e) evaluation for program improvement; and (f) standardization of the process. Face and content validity of the evaluation tool was ascertained by input from a panel of experienced supplemental staff and nursing faculty. This curriculum and its evaluation tool will have practical implications for supplemental staffing agencies and healthcare facilities in reducing "DNR" rates and in meeting certification/accreditation requirements. Further work is needed to translate this process into future research. © 2012 Wiley Periodicals, Inc.

  6. SYSTEMATIC PROCEDURE FOR DESIGNING PROCESSES WITH MULTIPLE ENVIRONMENTAL OBJECTIVES

    EPA Science Inventory

    Evaluation of multiple objectives is very important in designing environmentally benign processes. It requires a systematic procedure for solving multiobjective decision-making problems, due to the complex nature of the problems, the need for complex assessments, and complicated ...

  7. Quality Assessment of College Admissions Processes.

    ERIC Educational Resources Information Center

    Fisher, Caroline; Weymann, Elizabeth; Todd, Amy

    2000-01-01

    This study evaluated the admissions process for a Master's in Business Administration Program using such quality improvement techniques as customer surveys, benchmarking, and gap analysis. Analysis revealed that student dissatisfaction with the admissions process may be a factor influencing declining enrollment. Cycle time and number of student…

  8. Object shape and orientation do not routinely influence performance during language processing.

    PubMed

    Rommers, Joost; Meyer, Antje S; Huettig, Falk

    2013-11-01

    The role of visual representations during language processing remains unclear: They could be activated as a necessary part of the comprehension process, or they could be less crucial and influence performance in a task-dependent manner. In the present experiments, participants read sentences about an object. The sentences implied that the object had a specific shape or orientation. They then either named a picture of that object (Experiments 1 and 3) or decided whether the object had been mentioned in the sentence (Experiment 2). Orientation information did not reliably influence performance in any of the experiments. Shape representations influenced performance most strongly when participants were asked to compare a sentence with a picture or when they were explicitly asked to use mental imagery while reading the sentences. Thus, in contrast to previous claims, implied visual information often does not contribute substantially to the comprehension process during normal reading.

  9. The Timing of Visual Object Categorization

    PubMed Central

    Mack, Michael L.; Palmeri, Thomas J.

    2011-01-01

    An object can be categorized at different levels of abstraction: as natural or man-made, animal or plant, bird or dog, or as a Northern Cardinal or Pyrrhuloxia. There has been growing interest in understanding how quickly categorizations at different levels are made and how the timing of those perceptual decisions changes with experience. We specifically contrast two perspectives on the timing of object categorization at different levels of abstraction. By one account, the relative timing implies a relative timing of stages of visual processing that are tied to particular levels of object categorization: Fast categorizations are fast because they precede other categorizations within the visual processing hierarchy. By another account, the relative timing reflects when perceptual features are available over time and the quality of perceptual evidence used to drive a perceptual decision process: Fast simply means fast, it does not mean first. Understanding the short-term and long-term temporal dynamics of object categorizations is key to developing computational models of visual object recognition. We briefly review a number of models of object categorization and outline how they explain the timing of visual object categorization at different levels of abstraction. PMID:21811480

  10. E-Learning Quality Assurance: A Process-Oriented Lifecycle Model

    ERIC Educational Resources Information Center

    Abdous, M'hammed

    2009-01-01

    Purpose: The purpose of this paper is to propose a process-oriented lifecycle model for ensuring quality in e-learning development and delivery. As a dynamic and iterative process, quality assurance (QA) is intertwined with the e-learning development process. Design/methodology/approach: After reviewing the existing literature, particularly…

  11. Software quality: Process or people

    NASA Technical Reports Server (NTRS)

    Palmer, Regina; Labaugh, Modenna

    1993-01-01

    This paper will present data related to software development processes and personnel involvement from the perspective of software quality assurance. We examine eight years of data collected from six projects. Data collected varied by project but usually included defect and fault density, with limited use of code metrics, schedule adherence, and budget growth information. The data are a blend of AFSCP 800-14 and suggested productivity measures in Software Metrics: A Practitioner's Guide to Improved Product Development. A software quality assurance database tool, SQUID, was used to store and tabulate the data.

  12. Providing leadership to a decentralized total quality process.

    PubMed

    Diederich, J J; Eisenberg, M

    1993-01-01

    Integrating total quality management into the culture of an organization and the daily work of employees requires a decentralized leadership structure that encourages all employees to become involved. This article, based upon the experience of the University of Michigan Hospitals Professional Services Divisional Lead Team, outlines a process for decentralizing the total quality management process.

  13. TQM (Total Quality Management) SPARC (Special Process Action Review Committees) Handbook

    DTIC Science & Technology

    1989-08-01

    This document describes the techniques used to support and guide the Special Process Action Review Committees for accomplishing their goals for Total Quality Management (TQM). It includes concepts and definitions, checklists, sample formats, and assessment criteria. Keywords: Continuous process improvement; Logistics information; Process analysis; Quality control; Quality assurance; Total Quality Management; Statistical processes; Management planning and control; Management training; Management information systems.

  14. Executing Quality: A Grounded Theory of Child Care Quality Improvement Engagement Process in Pennsylvania

    ERIC Educational Resources Information Center

    Critchosin, Heather

    2014-01-01

    Executing Quality describes the perceived process experienced by participants while engaging in Keystone Standards, Training, Assistance, Resources, and Support (Keystone STARS) quality rating improvement system (QRIS). The purpose of this qualitative inquiry was to understand the process of Keystone STARS engagement in order to generate a…

  15. Pharmaceutical quality by design: product and process development, understanding, and control.

    PubMed

    Yu, Lawrence X

    2008-04-01

    The purpose of this paper is to discuss pharmaceutical Quality by Design (QbD) and describe how it can be used to ensure pharmaceutical quality. QbD is described and some of its elements identified. Process parameters and quality attributes were identified for each unit operation during the manufacture of solid oral dosage forms. The use of QbD was contrasted with the evaluation of product quality by testing alone. QbD is a systematic approach to pharmaceutical development. It means designing and developing formulations and manufacturing processes to ensure predefined product quality. Some of the QbD elements include: defining the target product quality profile; designing the product and manufacturing processes; identifying critical quality attributes, process parameters, and sources of variability; and controlling manufacturing processes to produce consistent quality over time. Using QbD, pharmaceutical quality is assured by understanding and controlling formulation and manufacturing variables. Product testing confirms the product quality. Implementation of QbD will enable transformation of the chemistry, manufacturing, and controls (CMC) review of abbreviated new drug applications (ANDAs) into a science-based pharmaceutical quality assessment.

  16. Process quality of decision-making in multidisciplinary cancer team meetings: a structured observational study.

    PubMed

    Hahlweg, Pola; Didi, Sarah; Kriston, Levente; Härter, Martin; Nestoriuc, Yvonne; Scholl, Isabelle

    2017-11-17

    The quality of decision-making in multidisciplinary team meetings (MDTMs) depends on the quality of information presented and the quality of team processes. Few studies have examined these factors using a standardized approach. The aim of this study was to objectively document the processes involved in decision-making in MDTMs, document the outcomes in terms of whether a treatment recommendation was given (none vs. singular vs. multiple), and to identify factors related to type of treatment recommendation. An adaptation of the observer rating scale Multidisciplinary Tumor Board Metric for the Observation of Decision-Making (MDT-MODe) was used to assess the quality of the presented information and team processes in MDTMs. Data was analyzed using descriptive statistics and mixed logistic regression analysis. N = 249 cases were observed in N = 29 MDTMs. While cancer-specific medical information was judged to be of high quality, psychosocial information and information regarding patient views were considered to be of low quality. In 25% of the cases no, in 64% one, and in 10% more than one treatment recommendations were given (1% missing data). Giving no treatment recommendation was associated with duration of case discussion, duration of the MDTM session, quality of case history, quality of radiological information, and specialization of the MDTM. Higher levels of medical and treatment uncertainty during discussions were found to be associated with a higher probability for more than one treatment recommendation. The quality of different aspects of information was observed to differ greatly. In general, we did not find MDTMs to be in line with the principles of patient-centered care. Recommendation outcome varied substantially between different specializations of MDTMs. The quality of certain information was associated with the recommendation outcome. Uncertainty during discussions was related to more than one recommendation being considered. Time constraints
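    As a simplified stand-in for the mixed logistic regression mentioned above, the sketch below fits a plain logistic regression relating discussion duration and case-history quality to whether no recommendation was given; the data are synthetic, the predictors are only two of those reported, and the clustering of cases within MDTMs (the "mixed" part of the model) is ignored.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Hypothetical data: predict whether a case discussion ends with no
# recommendation (1) versus at least one recommendation (0).
rng = np.random.default_rng(2)
n = 249
duration_min = rng.gamma(shape=3.0, scale=2.0, size=n)   # case discussion length (min)
history_quality = rng.uniform(1, 5, size=n)               # rated quality of case history
logits = 0.8 - 0.15 * duration_min - 0.4 * history_quality
no_recommendation = rng.binomial(1, 1 / (1 + np.exp(-logits)))

X = np.column_stack([duration_min, history_quality])
model = LogisticRegression().fit(X, no_recommendation)
print(model.coef_, model.intercept_)   # direction of association with each predictor
```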

  17. Tracker: Image-Processing and Object-Tracking System Developed

    NASA Technical Reports Server (NTRS)

    Klimek, Robert B.; Wright, Theodore W.

    1999-01-01

    Tracker is an object-tracking and image-processing program designed and developed at the NASA Lewis Research Center to help with the analysis of images generated by microgravity combustion and fluid physics experiments. Experiments are often recorded on film or videotape for analysis later. Tracker automates the process of examining each frame of the recorded experiment, performing image-processing operations to bring out the desired detail, and recording the positions of the objects of interest. It can load sequences of images from disk files or acquire images (via a frame grabber) from film transports, videotape, laser disks, or a live camera. Tracker controls the image source to automatically advance to the next frame. It can employ a large array of image-processing operations to enhance the detail of the acquired images and can analyze an arbitrarily large number of objects simultaneously. Several different tracking algorithms are available, including conventional threshold and correlation-based techniques, and more esoteric procedures such as "snake" tracking and automated recognition of character data in the image. The Tracker software was written to be operated by researchers, thus every attempt was made to make the software as user friendly and self-explanatory as possible. Tracker is used by most of the microgravity combustion and fluid physics experiments performed by Lewis, and by visiting researchers. This includes experiments performed on the space shuttles, Mir, sounding rockets, zero-g research airplanes, drop towers, and ground-based laboratories. This software automates the analysis of the flame's or liquid's physical parameters such as position, velocity, acceleration, size, shape, intensity characteristics, color, and centroid, as well as a number of other measurements. It can perform these operations on multiple objects simultaneously. Another key feature of Tracker is that it performs optical character recognition (OCR). This feature is useful in
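    Of the tracking algorithms listed above, correlation-based tracking is the easiest to sketch: slide a template over the frame and keep the position with the highest normalised cross-correlation. This brute-force version is an illustration only, not Tracker's actual implementation, and the synthetic frame and template are assumptions.

```python
import numpy as np

def match_template(frame: np.ndarray, template: np.ndarray):
    """Return the top-left (row, col) position where the template correlates
    best with the frame, using plain normalised cross-correlation."""
    th, tw = template.shape
    t = template - template.mean()
    t_norm = np.linalg.norm(t)
    best, best_pos = -np.inf, (0, 0)
    for r in range(frame.shape[0] - th + 1):
        for c in range(frame.shape[1] - tw + 1):
            patch = frame[r:r + th, c:c + tw]
            patch = patch - patch.mean()
            denom = np.linalg.norm(patch) * t_norm
            score = (patch * t).sum() / denom if denom > 0 else -np.inf
            if score > best:
                best, best_pos = score, (r, c)
    return best_pos

# Hypothetical usage: find a 5x5 bright blob in a synthetic 40x40 frame
frame = np.random.default_rng(3).normal(size=(40, 40))
frame[12:17, 20:25] += 4.0
template = frame[12:17, 20:25].copy()
print(match_template(frame, template))   # expected (12, 20)
```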

  18. Nurse practitioners as attending providers for workers with uncomplicated back injuries: using administrative data to evaluate quality and process of care.

    PubMed

    Sears, Jeanne M; Wickizer, Thomas M; Franklin, Gary M; Cheadle, Allen D; Berkowitz, Bobbie

    2007-08-01

    The objectives of this study were 1) to identify quality and process of care indicators available in administrative workers' compensation data and to document their association with work disability outcomes, and 2) to use these indicators to assess whether nurse practitioners (NPs), recently authorized to serve as attending providers for injured workers in Washington State, performed differently than did primary care physicians (PCPs). Quality and process of care indicators for NP and PCP back injury claims from Washington State were compared using direct standardization and logistic regression. This study found little evidence of differences between NP and PCP claims in case mix or quality of care. The process of care indicators that we identified were highly associated with the duration of work disability and have potential for further development to assess and promote quality improvement.

  19. Abnormalities of Object Visual Processing in Body Dysmorphic Disorder

    PubMed Central

    Feusner, Jamie D.; Hembacher, Emily; Moller, Hayley; Moody, Teena D.

    2013-01-01

    Background Individuals with body dysmorphic disorder may have perceptual distortions for their appearance. Previous studies suggest imbalances in detailed relative to configural/holistic visual processing when viewing faces. No study has investigated the neural correlates of processing non-symptom-related stimuli. The objective of this study was to determine whether individuals with body dysmorphic disorder have abnormal patterns of brain activation when viewing non-face/non-body object stimuli. Methods Fourteen medication-free participants with DSM-IV body dysmorphic disorder and 14 healthy controls participated. We performed functional magnetic resonance imaging while participants matched photographs of houses that were unaltered, contained only high spatial frequency (high detail) information, or only low spatial frequency (low detail) information. The primary outcome was group differences in blood oxygen level-dependent signal changes. Results The body dysmorphic disorder group showed lesser activity in the parahippocampal gyrus, lingual gyrus, and precuneus for low spatial frequency images. There were greater activations in medial prefrontal regions for high spatial frequency images, although no significant differences when compared to a low-level baseline. Greater symptom severity was associated with lesser activity in dorsal occipital cortex and ventrolateral prefrontal cortex for normal and high spatial frequency images. Conclusions Individuals with body dysmorphic disorder have abnormal brain activation patterns when viewing objects. Hypoactivity in visual association areas for configural and holistic (low detail) elements and abnormal allocation of prefrontal systems for details is consistent with a model of imbalances in global vs. local processing. This may occur not only for appearance but also for general stimuli unrelated to their symptoms. PMID:21557897

  20. Manufacturing history of etanercept (Enbrel®): Consistency of product quality through major process revisions.

    PubMed

    Hassett, Brian; Singh, Ena; Mahgoub, Ehab; O'Brien, Julie; Vicik, Steven M; Fitzpatrick, Brian

    2018-01-01

    Etanercept (ETN) (Enbrel®) is a soluble protein that binds to, and specifically inhibits, tumor necrosis factor (TNF), a proinflammatory cytokine. ETN is synthesized in Chinese hamster ovary cells by recombinant DNA technology as a fusion protein, with a fully human TNFRII ectodomain linked to the Fc portion of human IgG1. Successful manufacture of biologics, such as ETN, requires sophisticated process and product understanding, as well as meticulous control of operations to maintain product consistency. The objective of this evaluation was to show that the product profile of ETN drug substance (DS) has been consistent over the course of production. Multiple orthogonal biochemical analyses, which included evaluation of attributes indicative of product purity, potency, and quality, were assessed on >2,000 batches of ETN from three sites of DS manufacture, during the period 1998-2015. Based on the key quality attributes of product purity (assessed by hydrophobic interaction chromatography HPLC), binding activity (to TNF by ELISA), potency (inhibition of TNF-induced apoptosis by cell-based bioassay) and quality (N-linked oligosaccharide map), we show that the integrity of ETN DS has remained consistent over time. This consistency was maintained through three major enhancements to the initial process of manufacturing that were supported by detailed comparability assessments, and approved by the European Medicines Agency. Examination of results for all major quality attributes for ETN DS indicates a highly consistent process for over 18 years and throughout changes to the manufacturing process, without affecting safety and efficacy, as demonstrated across a wide range of clinical trials of ETN in multiple inflammatory diseases.

  1. The quality of instruments to assess the process of shared decision making: A systematic review

    PubMed Central

    Bomhof-Roordink, Hanna; Smith, Ian P.; Scholl, Isabelle; Stiggelbout, Anne M.; Pieterse, Arwen H.

    2018-01-01

    Objective To inventory instruments assessing the process of shared decision making and appraise their measurement quality, taking into account the methodological quality of their validation studies. Methods In a systematic review we searched seven databases (PubMed, Embase, Emcare, Cochrane, PsycINFO, Web of Science, Academic Search Premier) for studies investigating instruments measuring the process of shared decision making. Per identified instrument, we assessed the level of evidence separately for 10 measurement properties following a three-step procedure: 1) appraisal of the methodological quality using the COnsensus-based Standards for the selection of health status Measurement INstruments (COSMIN) checklist, 2) appraisal of the psychometric quality of the measurement property using three possible quality scores, 3) best-evidence synthesis based on the number of studies, their methodological and psychometrical quality, and the direction and consistency of the results. The study protocol was registered at PROSPERO: CRD42015023397. Results We included 51 articles describing the development and/or evaluation of 40 shared decision-making process instruments: 16 patient questionnaires, 4 provider questionnaires, 18 coding schemes and 2 instruments measuring multiple perspectives. There is an overall lack of evidence for their measurement quality, either because validation is missing or methods are poor. The best-evidence synthesis indicated positive results for a major part of instruments for content validity (50%) and structural validity (53%) if these were evaluated, but negative results for a major part of instruments when inter-rater reliability (47%) and hypotheses testing (59%) were evaluated. Conclusions Due to the lack of evidence on measurement quality, the choice for the most appropriate instrument can best be based on the instrument’s content and characteristics such as the perspective that they assess. We recommend refinement and validation of

  2. The Balanced Scorecard of acute settings: development process, definition of 20 strategic objectives and implementation.

    PubMed

    Groene, Oliver; Brandt, Elimer; Schmidt, Werner; Moeller, Johannes

    2009-08-01

    Strategy development and implementation in acute care settings is often restricted by competing challenges, the pace of policy reform and the existence of parallel hierarchies. To describe a generic approach to strategy development, illustrate the use of the Balanced Scorecard as a tool to facilitate strategy implementation and demonstrate how to break down strategic goals into measurable elements. Multi-method approach using three different conceptual models: Health Promoting Hospitals Standards and Strategies, the European Foundation for Quality Management (EFQM) Model and the Balanced Scorecard. A bundle of qualitative and quantitative methods were used including in-depth interviews, standardized organization-wide surveys on organizational values, staff satisfaction and patient experience. Three acute care hospitals in four different locations belonging to a German holding group. Chief executive officer, senior medical officers, working group leaders and hospital staff. Development and implementation of the Balanced Scorecard. Twenty strategic objectives with corresponding Balanced Scorecard measures. A stepped approach from strategy development to implementation is presented to identify key themes for strategy development, drafting a strategy map and developing strategic objectives and measures. The Balanced Scorecard, in combination with the EFQM model, is a useful tool to guide strategy development and implementation in health care organizations. As for other quality improvement and management tools not specifically developed for health care organizations, some adaptations are required to improve acceptability among professionals. The step-wise approach of strategy development and implementation presented here may support similar processes in comparable organizations.

  3. Evaluation of image quality in terahertz pulsed imaging using test objects.

    PubMed

    Fitzgerald, A J; Berry, E; Miles, R E; Zinovev, N N; Smith, M A; Chamberlain, J M

    2002-11-07

    As with other imaging modalities, the performance of terahertz (THz) imaging systems is limited by factors of spatial resolution, contrast and noise. The purpose of this paper is to introduce test objects and image analysis methods to evaluate and compare THz image quality in a quantitative and objective way, so that alternative terahertz imaging system configurations and acquisition techniques can be compared, and the range of image parameters can be assessed. Two test objects were designed and manufactured, one to determine the modulation transfer functions (MTF) and the other to derive image signal to noise ratio (SNR) at a range of contrasts. As expected the higher THz frequencies had larger MTFs, and better spatial resolution as determined by the spatial frequency at which the MTF dropped below the 20% threshold. Image SNR was compared for time domain and frequency domain image parameters and time delay based images consistently demonstrated higher SNR than intensity based parameters such as relative transmittance because the latter are more strongly affected by the sources of noise in the THz system such as laser fluctuations and detector shot noise.
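
    As a rough illustration of the signal-to-noise figures such test objects support, the sketch below computes an SNR from signal and background regions of a test-object image. The region masks, the (mean difference)/(background standard deviation) definition, and the function name are assumptions for illustration; the paper's exact definition may differ.

```python
import numpy as np

def region_snr(image, signal_mask, background_mask):
    """Contrast-to-noise style SNR for one feature of a test object.

    SNR is taken here as (mean signal - mean background) / std(background),
    computed over boolean masks selecting the two regions.
    """
    signal = image[signal_mask].astype(float)
    background = image[background_mask].astype(float)
    return (signal.mean() - background.mean()) / background.std(ddof=1)
```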

  4. Beyond Faces and Expertise: Facelike Holistic Processing of Nonface Objects in the Absence of Expertise.

    PubMed

    Zhao, Mintao; Bülthoff, Heinrich H; Bülthoff, Isabelle

    2016-02-01

    Holistic processing-the tendency to perceive objects as indecomposable wholes-has long been viewed as a process specific to faces or objects of expertise. Although current theories differ in what causes holistic processing, they share a fundamental constraint for its generalization: Nonface objects cannot elicit facelike holistic processing in the absence of expertise. Contrary to this prevailing view, here we show that line patterns with salient Gestalt information (i.e., connectedness, closure, and continuity between parts) can be processed as holistically as faces without any training. Moreover, weakening the saliency of Gestalt information in these patterns reduced holistic processing of them, which indicates that Gestalt information plays a crucial role in holistic processing. Therefore, holistic processing can be achieved not only via a top-down route based on expertise, but also via a bottom-up route relying merely on object-based information. The finding that facelike holistic processing can extend beyond the domains of faces and objects of expertise poses a challenge to current dominant theories. © The Author(s) 2015.

  5. Chicago Residents’ Perceptions of Air Quality: Objective Pollution, the Built Environment, and Neighborhood Stigma Theory

    PubMed Central

    King, Katherine E.

    2014-01-01

    Substantial research documents higher pollution levels in minority neighborhoods, but little research evaluates how residents perceive their own communities’ pollution risks. According to “Neighborhood stigma” theory, survey respondents share a cultural bias that minorities cause social dysfunction, leading to over-reports of dysfunction in minority communities. This study investigates perceptions of residential outdoor air quality by linking objective data on built and social environments with multiple measures of pollution and a representative survey of Chicago residents. Consistent with the scholarly narrative, results show air quality is rated worse where minorities and poverty are concentrated, even after extensive adjustment for objective pollution and built environment measures. Perceptions of air pollution may thus be driven by neighborhood socioeconomic position far more than by respondents’ ability to perceive pollution. The finding that 63.5% of the sample reported excellent or good air quality helps to explain current challenges in promoting environmental action. PMID:26527847

  6. [Refractive precision and objective quality of vision after toric lens implantation in cataract surgery].

    PubMed

    Debois, A; Nochez, Y; Bezo, C; Bellicaud, D; Pisella, P-J

    2012-10-01

    To study efficacy and predictability of toric IOL implantation for correction of preoperative corneal astigmatism by analysing spherocylindrical refractive precision and objective quality of vision. Prospective study of 13 eyes undergoing micro-incisional cataract surgery through a 1.8mm corneal incision with toric IOL implantation (Lentis L313T(®), Oculentis) to treat over one D of preoperative corneal astigmatism. Preoperative evaluation included keratometry, subjective refraction, and total and corneal aberrometry (KR-1(®), Topcon). Six months postoperatively, measurements included slit lamp photography, documenting IOL rotation, tilt or decentration, uncorrected visual acuity, best-corrected visual acuity and objective quality of vision measurement (OQAS(®) Visiometrics, Spain). Postoperatively, mean uncorrected distance visual acuity was 8.33/10 ± 1.91 (0.09 ± 0.11 LogMar). Mean postoperative refractive sphere was 0.13 ± 0.73 diopters. Mean refractive astigmatism was -0.66 ± 0.56 diopters with corneal astigmatism of 2.17 ± 0.68 diopters. Mean IOL rotation was 4.4° ± 3.6° (range 0° to 10°). Mean rotation of this IOL at 6 months was less than 5°, demonstrating stability of the optic within the capsular bag. Objective quality of vision measurements were consistent with subjective uncorrected visual acuity. Implantation of the L313T(®) IOL is safe and effective for correction of corneal astigmatism in 1.8mm micro-incisional cataract surgery. Copyright © 2012 Elsevier Masson SAS. All rights reserved.

  7. Mirror-Image Confusions: Implications for Representation and Processing of Object Orientation

    ERIC Educational Resources Information Center

    Gregory, Emma; McCloskey, Michael

    2010-01-01

    Perceiving the orientation of objects is important for interacting with the world, yet little is known about the mental representation or processing of object orientation information. The tendency of humans and other species to confuse mirror images provides a potential clue. However, the appropriate characterization of this phenomenon is not…

  8. The Effects of Directional Processing on Objective and Subjective Listening Effort

    ERIC Educational Resources Information Center

    Picou, Erin M.; Moore, Travis M.; Ricketts, Todd A.

    2017-01-01

    Purpose: The purposes of this investigation were (a) to evaluate the effects of hearing aid directional processing on subjective and objective listening effort and (b) to investigate the potential relationships between subjective and objective measures of effort. Method: Sixteen adults with mild to severe hearing loss were tested with study…

  9. A Total Quality Leadership Process Improvement Model

    DTIC Science & Technology

    1993-12-01

    A Total Quality Leadership Process Improvement Model, by Archester Houston, Ph.D., and Steven L. Dockstader, Ph.D. Total Quality Leadership Office; final report, 12/93.

  10. Medical Image Processing Server applied to Quality Control of Nuclear Medicine.

    NASA Astrophysics Data System (ADS)

    Vergara, C.; Graffigna, J. P.; Marino, E.; Omati, S.; Holleywell, P.

    2016-04-01

    This paper is framed within the area of medical image processing and presents the installation, configuration and implementation of a medical image processing server (MIPS) at the Fundación Escuela de Medicina Nuclear (FUESMEN) located in Mendoza, Argentina. The server was developed in the Gabinete de Tecnología Médica (GA.TE.ME), Facultad de Ingeniería, Universidad Nacional de San Juan. MIPS is software that, using the DICOM standard, can receive medical imaging studies from different modalities or viewing stations, execute algorithms on them, and return the results to other devices. To achieve these objectives, preliminary tests were conducted in the laboratory and the tools were then installed remotely in the clinical environment. Once the suitable algorithms had been defined, the appropriate protocols for setting up and using them in the different services were established. Finally, the paper focuses on the implementation and training provided at FUESMEN, using nuclear medicine quality control processes. Implementation results are reported in this work.

  11. Evaluation of Distance Course Effectiveness - Exploring the Quality of Interactive Processes

    NASA Astrophysics Data System (ADS)

    Botelho, Francisco Villa Ulhôa; Vicari, Rosa Maria

    Understanding the dynamics of learning processes implies an understanding of their components: individuals, environment or context and mediation. It is known that distance learning (DL) has a distinctive characteristic in relation to the mediation component. Due to the need of overcoming the barriers of distance and time, DL intensively uses information and communication technologies (ICT) to perform interactive processes. Construction of effective learning environments depends on human relationships. It also depends on the emotionality placed on such relationships. Therefore, knowing how to act in virtual environments in the sense of creating the required ambiance for animation of learning processes has a unique importance. This is the theme of this study. Its general objectives were achieved and can be summarized as follows: analyze indexes that are significant for evaluations of distance course effectiveness; investigate to which extent effectiveness of DL courses is correlated with quality of interactive processes; search characteristics of the conversations by individuals interacting in study groups that are formed in virtual environments, which may contribute to effectiveness of distance courses.

  12. The process of managerial control in quality improvement initiatives.

    PubMed

    Slovensky, D J; Fottler, M D

    1994-11-01

    The fundamental intent of strategic management is to position an organization within its market to exploit organizational competencies and strengths to gain competitive advantage. Competitive advantage may be achieved through such strategies as low cost, high quality, or unique services or products. For health care organizations accredited by the Joint Commission on Accreditation of Healthcare Organizations, continually improving both processes and outcomes of organizational performance--quality improvement--in all operational areas of the organization is a mandated strategy. Defining and measuring quality and controlling the quality improvement strategy remain problematic. The article discusses the nature and processes of managerial control, some potential measures of quality, and related information needs.

  13. Service Quality Of Diagnostic Fine Needle Aspiration Cytology In A Tertiary Care Hospital Of Lahore (Process Measure As Patient's Perspective).

    PubMed

    Rizvi, Zainab; Usmani, Rabia Arshed; Rizvi, Amna; Wazir, Salim; Zahra, Taskeen; Rasool, Hafza

    2017-01-01

    Quality of any service is the most important aspect for the manufacturer as well as the consumer. The primary objective of any nation's health system is to provide supreme quality health care services to its patients. The objective of this study was to assess the quality of the diagnostic fine needle aspiration cytology service in a tertiary care hospital. As patients' perspectives provide valuable information on the quality of a process, patients' perception in terms of satisfaction with the service was measured. In this cross-sectional analytical study, 291 patients undergoing fine needle aspiration cytology in Mayo Hospital were selected by a systematic sampling technique. Information regarding patients' satisfaction with four dimensions of the service quality process, namely "procedure, sterilization, conduct and competency of doctor", was collected through interviews using a questionnaire. The questionnaire was developed on the SERVQUAL model, a measurement tool for quality assessment of services provided to patients. All items were assessed on a 2-point Likert scale (0=dissatisfied, 1=satisfied). Frequencies and percentages of satisfied and dissatisfied patients were recorded for each item, and all items in each dimension were scored. If the percentage of the sum of all item scores in a dimension was ≥60%, the dimension was rated 'good quality', whereas <60% was rated 'poor quality'. Data were analysed using Epi Info 3.5.1. The Fisher test was applied to check statistical significance (p-value <0.05). Out of the 4 dimensions of the service quality process, procedure (48.8%), sterilization (51.5%) and practitioner conduct (50.9%) were perceived as 'poor' by the patients. Only practitioner competency (67.4%) was perceived as 'good'. Comparison of the dimension scores with the overall level of patient satisfaction revealed that all 4 dimensions were significantly related to patient dissatisfaction (p<.05). The study suggests that service quality of therapeutic and diagnostic

  14. More than a feeling: The bidirectional convergence of semantic visual object and somatosensory processing.

    PubMed

    Ekstrand, Chelsea; Neudorf, Josh; Lorentz, Eric; Gould, Layla; Mickleborough, Marla; Borowsky, Ron

    2017-11-01

    Prevalent theories of semantic processing assert that the sensorimotor system plays a functional role in the semantic processing of manipulable objects. While motor execution has been shown to impact object processing, involvement of the somatosensory system has remained relatively unexplored. Therefore, we developed two novel priming paradigms. In Experiment 1, participants received a vibratory hand prime (on half the trials) prior to viewing a picture of either an object interacted primarily with the hand (e.g., a cup) or the foot (e.g., a soccer ball) and reported how they would interact with it. In Experiment 2, the same objects became the prime and participants were required to identify whether the vibratory stimulation occurred to their hand or foot. In both experiments, somatosensory priming effects arose for the hand objects, while foot objects showed no priming benefits. These results suggest that object semantic knowledge bidirectionally converges with the somatosensory system. Copyright © 2017 Elsevier B.V. All rights reserved.

  15. Assessing the quality of radiographic processing in general dental practice.

    PubMed

    Thornley, P H; Stewardson, D A; Rout, P G J; Burke, F J T

    2006-05-13

    To determine if a commercial device (Vischeck) for monitoring film processing quality was a practical option in general dental practice, and to assess processing quality among a group of GDPs in the West Midlands with this device. Clinical evaluation. General dental practice, UK, 2004. Ten GDP volunteers from a practice based research group processed Vischeck strips (a) when chemicals were changed, (b) one week later, and (c) immediately before the next change of chemicals. These were compared with strips processed under ideal conditions. Additionally, a series of duplicate radiographs were produced and processed together with Vischeck strips in progressively more dilute developer solutions to compare the change in radiograph quality assessed clinically with that derived from the Vischeck. The Vischeck strips suggested that at the time chosen for change of processing chemicals, eight dentists had been processing films well beyond the point indicated for replacement. Solutions were changed after a wide range of time periods and number of films processed. The calibration of the Vischeck strip correlated closely to a clinical assessment of acceptable film quality. Vischeck strips are a useful aid to monitoring processing quality in automatic developers in general dental practice. Most of this group of GDPs were using chemicals beyond the point at which diagnostic yield would be affected.

  16. A Taxonomy of Object-Oriented Measures Modeling the Object-Oriented Space

    NASA Technical Reports Server (NTRS)

    Neal, Ralph D.; Weistroffer, H. Roland; Coppins, Richard J.

    1997-01-01

    In order to control the quality of software and the software development process, it is important to understand the measurement of software. A first step toward a better comprehension of software measurement is the categorization of software measures by some meaningful taxonomy. The most worthwhile taxonomy would capture the fundamental nature of the object-oriented (O-O) space. The principal characteristics of object-oriented software offer a starting point for such a categorization of measures. This paper introduces a taxonomy of measures based upon fourteen characteristics of object-oriented software gathered from the literature. This taxonomy allows us to easily see gaps or redundancies in the existing O-O measures. The taxonomy also clearly differentiates among taxa so that there is no ambiguity as to the taxon to which a measure belongs. The taxonomy has been populated with measures taken from the literature.

  18. Discrete State Change Model of Manufacturing Quality to Aid Assembly Process Design

    NASA Astrophysics Data System (ADS)

    Koga, Tsuyoshi; Aoyama, Kazuhiro

    This paper proposes a representation model of the quality state change in an assembly process that can be used in a computer-aided process design system. In order to formalize the state change of the manufacturing quality in the assembly process, the functions, operations, and quality changes in the assembly process are represented as a network model that can simulate discrete events. This paper also develops a design method for the assembly process. The design method calculates the space of quality state change and outputs a better assembly process (better operations and better sequences) that can be used to obtain the intended quality state of the final product. A computational redesigning algorithm of the assembly process that considers the manufacturing quality is developed. The proposed method can be used to design an improved manufacturing process by simulating the quality state change. A prototype system for planning an assembly process is implemented and applied to the design of an auto-breaker assembly process. The result of the design example indicates that the proposed assembly process planning method outputs a better manufacturing scenario based on the simulation of the quality state change.

  19. The Sensitivity of Derived Estimates to the Measurement Quality Objectives for Independent Variables

    Treesearch

    Francis A. Roesch

    2005-01-01

    The effect of varying the allowed measurement error for individual tree variables upon county estimates of gross cubic-foot volume was examined. Measurement Quality Objectives (MQOs) for three forest tree variables (biological identity, diameter, and height) used in individual tree gross cubic-foot volume equations were varied from the current USDA Forest Service...

  20. An assembly process model based on object-oriented hierarchical time Petri Nets

    NASA Astrophysics Data System (ADS)

    Wang, Jiapeng; Liu, Shaoli; Liu, Jianhua; Du, Zenghui

    2017-04-01

    In order to improve the versatility, accuracy and integrity of the assembly process model of complex products, an assembly process model based on object-oriented hierarchical time Petri Nets is presented. A complete assembly process information model including assembly resources, assembly inspection, time, structure and flexible parts is established, and this model describes the static and dynamic data involved in the assembly process. Through the analysis of three-dimensional assembly process information, the assembly information is hierarchically divided from the whole, the local to the details and the subnet model of different levels of object-oriented Petri Nets is established. The communication problem between Petri subnets is solved by using message database, and it reduces the complexity of system modeling effectively. Finally, the modeling process is presented, and a five layer Petri Nets model is established based on the hoisting process of the engine compartment of a wheeled armored vehicle.
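
    A minimal sketch of the kind of timed Petri net such a model builds on is shown below. It is not the paper's object-oriented hierarchical model (the subnet layering and the message database used for inter-subnet communication are omitted); the place names, durations, and toy assembly fragment are invented for illustration.

```python
from dataclasses import dataclass, field

@dataclass
class Transition:
    name: str
    inputs: list          # places whose tokens are consumed
    outputs: list         # places that receive tokens
    duration: float = 0.0 # firing time, the "time" in a time Petri net

@dataclass
class PetriNet:
    marking: dict                                   # place name -> token count
    transitions: list = field(default_factory=list)
    clock: float = 0.0

    def enabled(self, t):
        return all(self.marking.get(p, 0) > 0 for p in t.inputs)

    def fire(self, t):
        if not self.enabled(t):
            raise ValueError(f"{t.name} is not enabled")
        for p in t.inputs:
            self.marking[p] -= 1
        for p in t.outputs:
            self.marking[p] = self.marking.get(p, 0) + 1
        self.clock += t.duration

# Toy assembly fragment: a part and a free fixture are consumed, a mounted
# subassembly is produced after 3 time units.
net = PetriNet(marking={"part_ready": 1, "fixture_free": 1})
mount = Transition("mount_part", ["part_ready", "fixture_free"], ["mounted"], 3.0)
net.fire(mount)
print(net.marking, net.clock)   # {'part_ready': 0, 'fixture_free': 0, 'mounted': 1} 3.0
```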

  1. Comparison of the performance of intraoral X-ray sensors using objective image quality assessment.

    PubMed

    Hellén-Halme, Kristina; Johansson, Curt; Nilsson, Mats

    2016-05-01

    The main aim of this study was to evaluate the performance of 10 individual sensors of the same make, using objective measures of key image quality parameters. A further aim was to compare 8 brands of sensors. Ten new sensors of 8 different models from 6 manufacturers (i.e., 80 sensors) were included in the study. All sensors were exposed in a standardized way using an X-ray tube voltage of 60 kVp and different exposure times. Sensor response, noise, low-contrast resolution, spatial resolution and uniformity were measured. Individual differences between sensors of the same brand were surprisingly large in some cases. There were clear differences in the characteristics of the different brands of sensors. The largest variations were found for individual sensor response for some of the brands studied. Also, noise level and low contrast resolution showed large variations between brands. Sensors, even of the same brand, vary significantly in their quality. It is thus valuable to establish action levels for the acceptance of newly delivered sensors and to use objective image quality control for commissioning purposes and periodic checks to ensure high performance of individual digital sensors. Copyright © 2016 Elsevier Inc. All rights reserved.
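
    The objective measures listed above (sensor response, noise, uniformity) can be sketched from a uniformly exposed flat-field frame as below; the 4x4 tiling and this particular uniformity definition are assumptions for illustration, not the protocol used in the study.

```python
import numpy as np

def flat_field_metrics(image):
    """Simple response, noise and uniformity figures from a flat-field frame."""
    img = image.astype(float)
    mean_response = img.mean()
    noise = img.std(ddof=1)                    # global noise estimate
    # Uniformity: relative spread of local means over a 4x4 grid of tiles
    tiles = [t for row in np.array_split(img, 4, axis=0)
               for t in np.array_split(row, 4, axis=1)]
    block_means = [t.mean() for t in tiles]
    uniformity = (max(block_means) - min(block_means)) / mean_response
    return mean_response, noise, uniformity
```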

  2. Fox Valley Technical College Quality First Process Model.

    ERIC Educational Resources Information Center

    Fox Valley Technical Coll., Appleton, WI.

    An overview is provided of the Quality First Process Model developed by Fox Valley Technical College (FVTC), Wisconsin, to provide guidelines for quality instruction and service consistent with the highest educational standards. The 16-step model involves activities that should be adaptable to any organization. The steps of the quality model are…

  3. JPEG vs. JPEG 2000: an objective comparison of image encoding quality

    NASA Astrophysics Data System (ADS)

    Ebrahimi, Farzad; Chamik, Matthieu; Winkler, Stefan

    2004-11-01

    This paper describes an objective comparison of the image quality of different encoders. Our approach is based on estimating the visual impact of compression artifacts on perceived quality. We present a tool that measures these artifacts in an image and uses them to compute a prediction of the Mean Opinion Score (MOS) obtained in subjective experiments. We show that the MOS predictions by our proposed tool are a better indicator of perceived image quality than PSNR, especially for highly compressed images. For the encoder comparison, we compress a set of 29 test images with two JPEG encoders (Adobe Photoshop and IrfanView) and three JPEG2000 encoders (JasPer, Kakadu, and IrfanView) at various compression ratios. We compute blockiness, blur, and MOS predictions as well as PSNR of the compressed images. Our results show that the IrfanView JPEG encoder produces consistently better images than the Adobe Photoshop JPEG encoder at the same data rate. The differences between the JPEG2000 encoders in our test are less pronounced; JasPer comes out as the best codec, closely followed by IrfanView and Kakadu. Comparing the JPEG- and JPEG2000-encoding quality of IrfanView, we find that JPEG has a slight edge at low compression ratios, while JPEG2000 is the clear winner at medium and high compression ratios.
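
    PSNR, used above alongside the blockiness and blur measures, is straightforward to compute; a minimal sketch follows. The artifact measures and the MOS predictor built on them are not reproduced here.

```python
import numpy as np

def psnr(reference, test, peak=255.0):
    """Peak signal-to-noise ratio, in dB, between a reference image and its
    compressed version (arrays of the same shape)."""
    mse = np.mean((reference.astype(float) - test.astype(float)) ** 2)
    if mse == 0:
        return float("inf")        # images are identical
    return 10.0 * np.log10(peak ** 2 / mse)
```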

  4. Effect of geometrical features various objects on the data quality obtained with measured by TLS

    NASA Astrophysics Data System (ADS)

    Pawłowicz, J. A.

    2017-08-01

    Collecting data on different building structures using Terrestrial Laser Scanning (TLS) has become very popular in recent years because it greatly reduces the time required to complete the task compared with traditional methods. The technical parameters of 3D scanning devices (digitizers) are steadily improving, and the accuracy of the collected data makes it possible not only to reproduce the geometry of an existing object in a digital image but also to assess its condition. This is achieved by digitizing existing objects, e.g. with a 3D laser scanner, which yields a digital database in the form of a point cloud that can be exploited through reverse engineering. Measurements with laser scanners depend to a large extent on the quality of the beam reflected from the target surface back towards the receiver, and the geometric features of the object strongly affect the strength and quality of that returning beam. These properties can give rise to errors, sometimes serious ones, when scanning various shapes. The study quantified how laser beam distortion, when measuring objects of the same material but with different geometric features, affects the three-dimensional imaging obtained from TLS measurements. We illustrate the data quality problem, which depends on the deflection of the beam intensity and on the shape of the object, with selected examples. Knowledge of these problems makes it possible to obtain the data needed to digitize and visualize virtually any building structure made of any material. The studies showed that increasing the scanning density does not affect the mean square error values. Increasing the angle of incidence of the beam onto a flat surface, however, decreases the intensity of scattered radiation reaching the receiver. The article presents an analysis of the laser beam reflected from broken at

  5. Modelling health care processes for eliciting user requirements: a way to link a quality paradigm and clinical information system design.

    PubMed

    Staccini, P; Joubert, M; Quaranta, J F; Fieschi, D; Fieschi, M

    2001-12-01

    Healthcare institutions are looking at ways to increase their efficiency by reducing costs while providing care services with a high level of safety. Thus, hospital information systems have to support quality improvement objectives. The elicitation of the requirements has to meet users' needs in relation to both the quality (efficacy, safety) and the monitoring of all health care activities (traceability). Information analysts need methods to conceptualise clinical information systems that provide actors with individual benefits and guide behavioural changes. A methodology is proposed to elicit and structure users' requirements using a process-oriented analysis, and it is applied to the blood transfusion process. An object-oriented data model of a process has been defined in order to organise the data dictionary. Although some aspects of activity, such as 'where', 'what else', and 'why' are poorly represented by the data model alone, this method of requirement elicitation fits the dynamic of data input for the process to be traced. A hierarchical representation of hospital activities has to be found for the processes to be interrelated, and for their characteristics to be shared, in order to avoid data redundancy and to fit the gathering of data with the provision of care.
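
    The idea of an object-oriented data model of a process, used here to organise the data dictionary, can be sketched as below. The class and attribute names and the toy transfusion steps are illustrative assumptions, not the authors' actual model.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class Activity:
    """One traceable step of a care process."""
    name: str
    actor: str                                          # who performs the step
    inputs: List[str] = field(default_factory=list)     # data consumed
    outputs: List[str] = field(default_factory=list)    # data produced (traceability)

@dataclass
class CareProcess:
    name: str
    activities: List[Activity] = field(default_factory=list)

    def data_dictionary(self):
        """Collect every data item the process consumes or produces."""
        items = []
        for a in self.activities:
            items.extend(a.inputs + a.outputs)
        return sorted(set(items))

transfusion = CareProcess("blood transfusion", [
    Activity("prescribe unit", "physician", outputs=["prescription"]),
    Activity("administer unit", "nurse",
             inputs=["prescription", "unit id"], outputs=["transfusion record"]),
])
print(transfusion.data_dictionary())
```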

  6. Distinct Visual Processing of Real Objects and Pictures of Those Objects in 7- to 9-month-old Infants

    PubMed Central

    Gerhard, Theresa M.; Culham, Jody C.; Schwarzer, Gudrun

    2016-01-01

    The present study examined 7- and 9-month-old infants’ visual habituation to real objects and pictures of the same objects and their preferences between real and pictorial versions of the same objects following habituation. Different hypotheses would predict that infants may habituate faster to pictures than real objects (based on proposed theoretical links between behavioral habituation in infants and neuroimaging adaptation in adults) or to real objects vs. pictures (based on past infant electrophysiology data). Sixty-one 7-month-old infants and fifty-nine 9-month-old infants were habituated to either a real object or a picture of the same object and afterward preference tested with the habituation object paired with either the novel real object or its picture counterpart. Infants of both age groups showed basic information-processing advantages for real objects. Specifically, during the initial presentations, 9-month-old infants looked longer at stimuli in both formats than the 7-month-olds but, more importantly, both age groups looked longer at real objects than pictures, though with repeated presentations, they habituated faster for real objects such that at the end of habituation, they looked equally at both types of stimuli. Surprisingly, even after habituation, infants preferred to look at the real objects, regardless of whether they had habituated to photos or real objects. Our findings suggest that from as early as 7 months of age, infants show strong preferences for real objects, perhaps because real objects are visually richer and/or enable the potential for genuine interactions. PMID:27378962

  7. Stuck on semantics: Processing of irrelevant object-scene inconsistencies modulates ongoing gaze behavior.

    PubMed

    Cornelissen, Tim H W; Võ, Melissa L-H

    2017-01-01

    People have an amazing ability to identify objects and scenes with only a glimpse. How automatic is this scene and object identification? Are scene and object semantics-let alone their semantic congruity-processed to a degree that modulates ongoing gaze behavior even if they are irrelevant to the task at hand? Objects that do not fit the semantics of the scene (e.g., a toothbrush in an office) are typically fixated longer and more often than objects that are congruent with the scene context. In this study, we overlaid a letter T onto photographs of indoor scenes and instructed participants to search for it. Some of these background images contained scene-incongruent objects. Despite their lack of relevance to the search, we found that participants spent more time in total looking at semantically incongruent compared to congruent objects in the same position of the scene. Subsequent tests of explicit and implicit memory showed that participants did not remember many of the inconsistent objects and no more of the consistent objects. We argue that when we view natural environments, scene and object relationships are processed obligatorily, such that irrelevant semantic mismatches between scene and object identity can modulate ongoing eye-movement behavior.

  8. Image-guided radiotherapy quality control: Statistical process control using image similarity metrics.

    PubMed

    Shiraishi, Satomi; Grams, Michael P; Fong de Los Santos, Luis E

    2018-05-01

    The purpose of this study was to demonstrate an objective quality control framework for the image review process. A total of 927 cone-beam computed tomography (CBCT) registrations were retrospectively analyzed for 33 bilateral head and neck cancer patients who received definitive radiotherapy. Two registration tracking volumes (RTVs) - cervical spine (C-spine) and mandible - were defined, within which a similarity metric was calculated and used as a registration quality tracking metric over the course of treatment. First, sensitivity to large misregistrations was analyzed for normalized cross-correlation (NCC) and mutual information (MI) in the context of statistical analysis. The distribution of metrics was obtained for displacements that varied according to a normal distribution with standard deviation of σ = 2 mm, and the detectability of displacements greater than 5 mm was investigated. Then, similarity metric control charts were created using a statistical process control (SPC) framework to objectively monitor the image registration and review process. Patient-specific control charts were created using NCC values from the first five fractions to set a patient-specific process capability limit. Population control charts were created using the average of the first five NCC values for all patients in the study. For each patient, the similarity metrics were calculated as a function of unidirectional translation, referred to as the effective displacement. Patient-specific action limits corresponding to 5 mm effective displacements were defined. Furthermore, effective displacements of the ten registrations with the lowest similarity metrics were compared with a three dimensional (3DoF) couch displacement required to align the anatomical landmarks. Normalized cross-correlation identified suboptimal registrations more effectively than MI within the framework of SPC. Deviations greater than 5 mm were detected at 2.8σ and 2.1σ from the mean for NCC and MI
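
    A minimal sketch of the two ingredients described above, a normalized cross-correlation similarity metric and a patient-specific control limit built from the first five fractions, is given below. The array handling, the n-sigma rule, and the function names are assumptions for illustration; the study's exact SPC formulation may differ.

```python
import numpy as np

def normalized_cross_correlation(a, b):
    """NCC between the planning-CT tracking volume and the registered CBCT
    volume (arrays of equal shape); values near 1 indicate good agreement."""
    a = a.astype(float).ravel()
    b = b.astype(float).ravel()
    a -= a.mean()
    b -= b.mean()
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def patient_control_limit(first_five_ncc, n_sigma=3.0):
    """Patient-specific lower control limit set from the first five fractions
    (an assumed n-sigma rule); later fractions falling below it are flagged."""
    mu = float(np.mean(first_five_ncc))
    sigma = float(np.std(first_five_ncc, ddof=1))
    return mu - n_sigma * sigma
```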

  9. Preliminary evaluation of feeder and lint slide moisture addition on ginning, fiber quality, and textile processing of western cotton

    USDA-ARS?s Scientific Manuscript database

    The objective of this study was to evaluate the effects of moisture addition at the gin stand feeder conditioning hopper and/or the battery condenser slide on gin performance and Western cotton fiber quality and textile processing. The test treatments included no moisture addition, feeder hopper hum...

  10. The Effects of Visual Degradation on Attended Objects and the Ability to Process Unattended Objects within the Visual Array

    DTIC Science & Technology

    2010-09-01

    field at once (e.g., Biederman, Blickle, Teitelbaum, & Klatsky, 1988), and objects of interest typically receive the attention required to recognize them...field (Biederman & Cooper, 1991) and image size changes (Biederman & Cooper, 1992). Yet, only attended objects are recognized when mirror images (left-right reversals) occur (Biederman & Cooper, 1991). Due to these results, Hummel (2001) proposed that attended images are processed by both

  11. Quality control process improvement of flexible printed circuit board by FMEA

    NASA Astrophysics Data System (ADS)

    Krasaephol, Siwaporn; Chutima, Parames

    2018-02-01

    This research focuses on improving the quality control process for Flexible Printed Circuit Boards (FPCB), centred on model 7-Flex, by using the Failure Mode and Effect Analysis (FMEA) method to decrease the proportion of defective finished goods found at the final inspection process. Because a large number of defective units are found only at the final inspection, defective products may escape to customers. The problem stems from a quality control process that is not efficient enough to filter out defective products in-process, because there is no In-Process Quality Control (IPQC) or sampling inspection in the process. Therefore, the quality control process has to be improved by setting inspection gates and IPQCs at critical processes in order to filter out defective products. The critical processes are analysed by the FMEA method. IPQC is used to detect defective products and reduce the chance of defective finished goods escaping to customers. Reducing the proportion of defective finished goods also decreases scrap cost, because finished goods incur higher scrap cost than work in-process. Moreover, defective products found during the process can reveal abnormal processes, so engineers and operators can solve the problems in a timely manner. The improved quality control was implemented on the 7-Flex production lines from July 2017 to September 2017. The results show that the average proportion of defective finished goods and the average Customer Manufacturers Lot Reject Rate (%LRR of CMs) decreased, to 4.5% and 4.1% respectively. Furthermore, the cost saving from this quality control process equals 100K Baht.
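
    The FMEA prioritisation step can be illustrated with a short sketch: each failure mode receives a Risk Priority Number (RPN = severity x occurrence x detection, each rated 1-10), and the highest-RPN processes are treated as critical and receive IPQC gates first. The process names and ratings below are hypothetical, not the study's data.

```python
# Hypothetical failure modes for a flexible-PCB line; ratings are illustrative only.
failure_modes = [
    {"process": "lamination",       "severity": 7, "occurrence": 5, "detection": 6},
    {"process": "etching",          "severity": 8, "occurrence": 3, "detection": 4},
    {"process": "final inspection", "severity": 6, "occurrence": 4, "detection": 2},
]

for fm in failure_modes:
    # Risk Priority Number: severity x occurrence x detection
    fm["rpn"] = fm["severity"] * fm["occurrence"] * fm["detection"]

# Processes with the highest RPN are the critical ones to gate with IPQC.
for fm in sorted(failure_modes, key=lambda f: f["rpn"], reverse=True):
    print(f'{fm["process"]:16s} RPN = {fm["rpn"]}')
```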

  12. [Quality management in a clinical research facility: Evaluation of changes in quality in-house figures and the appraisal of in-house quality indicators].

    PubMed

    Aden, Bile; Allekotte, Silke; Mösges, Ralph

    2016-12-01

    For the long-term maintenance and improvement of quality within a clinical research institute, the implementation and certification of a quality management system is suitable. With a quality management system implemented according to the still-valid DIN EN ISO 9001:2008, the desired quality objectives are achieved effectively. The evaluation of quality scores and the appraisal of in-house quality indicators make an important contribution in this regard. To achieve this and to draw quality assurance conclusions, quality indicators that are as meaningful and sensitive as possible are developed. The basis for this is formed by the institute's own key objectives, the retrospective evaluation of quality scores, a prospective follow-up, and discussions. In the in-house clinical research institute, the measures introduced by the quality management system led to higher efficiency in work processes, improved staff skills, higher customer satisfaction and, overall, to more successful outcomes in relation to the self-defined key objectives. Copyright © 2016. Published by Elsevier GmbH.

  13. Processing graspable object images and their nouns is impaired in Parkinson's disease patients.

    PubMed

    Buccino, Giovanni; Dalla Volta, Riccardo; Arabia, Gennarina; Morelli, Maurizio; Chiriaco, Carmelina; Lupo, Angela; Silipo, Franco; Quattrone, Aldo

    2018-03-01

    According to embodiment, the recruitment of the motor system is necessary to process language material expressing a motor content. Coherently, an impairment of the motor system should affect the capacity to process language items with a motor content. The aim of the present study was to assess the capacity to process graspable objects and their nouns in Parkinson's disease (PD) patients and healthy controls. Participants saw photos and nouns depicting graspable and non-graspable objects. Scrambled images and pseudo-words served as control stimuli. At 150 msec after stimulus presentation, they had to respond when the stimulus referred to a real object, and refrain from responding when it was meaningless (go-no go paradigm). In the control group, participants gave slower motor responses for stimuli (both photos and nouns) related to graspable objects as compared to non-graspable ones. This is in keeping with data obtained in a previous study with young healthy participants. In the PD group, motor responses were similar for both graspable and non-graspable items. Moreover, the number of errors was significantly greater than in controls. These findings support the notion that when the motor circuits are lesioned, as in PD, patients do not show the typical modulation of motor responses and have trouble processing graspable objects and their nouns. Copyright © 2017. Published by Elsevier Ltd.

  14. When a Dog Has a Pen for a Tail: The Time Course of Creative Object Processing

    ERIC Educational Resources Information Center

    Wang, Botao; Duan, Haijun; Qi, Senqing; Hu, Weiping; Zhang, Huan

    2017-01-01

    Creative objects differ from ordinary objects in that they are created by human beings to contain novel, creative information. Previous research has demonstrated that ordinary object processing involves both a perceptual process for analyzing different features of the visual input and a higher-order process for evaluating the relevance of this…

  15. Process quality planning of quality function deployment for carrot syrup

    NASA Astrophysics Data System (ADS)

    Ekawati, Yurida; Noya, Sunday; Widjaja, Filemon

    2017-06-01

    Carrot products are rarely available in the market. Based on previous research that had been done using QFD to generate product design of carrots products, the research to produce the process quality planning had been carried out. The carrot product studied was carrot syrup. The research resulted in a process planning matrix for carrot syrup. The matrix gives information about critical process plan and the priority of the critical process plan. The critical process plan on the production process of carrot syrup consists of carrots sorting, carrots peeling, carrots washing, blanching process, carrots cutting, the making of pureed carrots, filtering carrot juice, the addition of sugar in carrot juice, the addition of food additives in carrot juice, syrup boiling, syrup filtering, syrup filling into the bottle, the bottle closure and cooling. The information will help the design of the production process of carrot syrup.

  16. Quality data collection and management technology of aerospace complex product assembly process

    NASA Astrophysics Data System (ADS)

    Weng, Gang; Liu, Jianhua; He, Yongxi; Zhuang, Cunbo

    2017-04-01

    To address the difficult management and poor traceability of discrete assembly process quality data, a data collection and management method is proposed that takes the assembly process and the BOM as its core. The method includes a data collection approach based on workflow technology, a data model based on the BOM, and quality traceability for the assembly process. Finally, an assembly process quality data management system is developed, and effective control and management of quality information for the complex product assembly process is realized.

  17. Close interpersonal proximity modulates visuomotor processing of object affordances in shared, social space.

    PubMed

    Saccone, Elizabeth J; Szpak, Ancret; Churches, Owen; Nicholls, Michael E R

    2018-01-01

    Research suggests that the human brain codes manipulable objects as possibilities for action, or affordances, particularly objects close to the body. Near-body space is not only a zone for body-environment interaction but also is socially relevant, as we are driven to preserve our near-body, personal space from others. The current, novel study investigated how close proximity of a stranger modulates visuomotor processing of object affordances in shared, social space. Participants performed a behavioural object recognition task both alone and with a human confederate. All object images were in participants' reachable space but appeared relatively closer to the participant or the confederate. Results revealed when participants were alone, objects in both locations produced an affordance congruency effect but when the confederate was present, only objects nearer the participant elicited the effect. Findings suggest space is divided between strangers to preserve independent near-body space boundaries, and in turn this process influences motor coding for stimuli within that social space. To demonstrate that this visuomotor modulation represents a social phenomenon, rather than a general, attentional effect, two subsequent experiments employed nonhuman joint conditions. Neither a small, Japanese, waving cat statue (Experiment 2) nor a metronome (Experiment 3) modulated the affordance effect as in Experiment 1. These findings suggest a truly social explanation of the key interaction from Experiment 1. This study represents an important step toward understanding object affordance processing in real-world, social contexts and has implications broadly across fields of social action and cognition, and body space representation.

  18. Quality management of manufacturing process based on manufacturing execution system

    NASA Astrophysics Data System (ADS)

    Zhang, Jian; Jiang, Yang; Jiang, Weizhuo

    2017-04-01

    Quality control elements in the manufacturing process are elaborated, and an approach to quality management of the manufacturing process based on a manufacturing execution system (MES) is discussed. Finally, the functions of the MES for a microcircuit production line are introduced.

  19. Process-based quality for thermal spray via feedback control

    NASA Astrophysics Data System (ADS)

    Dykhuizen, R. C.; Neiser, R. A.

    2006-09-01

    Quality control of a thermal spray system manufacturing process is difficult due to the many input variables that need to be controlled. Great care must be taken to ensure that the process remains constant to obtain a consistent quality of the parts. Control is greatly complicated by the fact that measurement of particle velocities and temperatures is a noisy stochastic process. This article illustrates the application of quality control concepts to a wire flame spray process. A central feature of the real-time control system is an automatic feedback control scheme that provides fine adjustments to ensure that uncontrolled variations are accommodated. It is shown how the control vectors can be constructed from simple process maps to independently control particle velocity and temperature. This control scheme is shown to perform well in a real production environment. We also demonstrate that slight variations in the feed wire curvature can greatly influence the process. Finally, the geometry of the spray system and sensor must remain constant for the best reproducibility.
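
    The use of simple process maps to decouple control of particle velocity and temperature can be sketched as follows: a locally linear map from the two inputs to the two particle states is inverted, so a measured velocity/temperature error yields independent corrections to the inputs. The numbers in the map, the gain, and the choice of inputs are assumptions for illustration, not the article's calibration.

```python
import numpy as np

# Locally linear process map near the operating point (illustrative numbers):
# rows = [particle velocity, particle temperature],
# columns = [input 1, input 2] (e.g., atomizing air pressure, fuel flow - hypothetical).
J = np.array([[  8.0,  -1.0],
              [-20.0,  35.0]])
J_inv = np.linalg.inv(J)

def control_step(setpoint, measurement, gain=0.5):
    """One integral-style feedback update: map the velocity/temperature error
    back through the inverted process map to adjustments of the two inputs."""
    error = np.asarray(setpoint, float) - np.asarray(measurement, float)
    return gain * (J_inv @ error)

# Example: measured velocity 5 m/s low and temperature 30 degrees high.
print(control_step(setpoint=[120.0, 2100.0], measurement=[115.0, 2130.0]))
```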

  20. Objective Assessment of Image Quality VI: Imaging in Radiation Therapy

    PubMed Central

    Barrett, Harrison H.; Kupinski, Matthew A.; Müeller, Stefan; Halpern, Howard J.; Morris, John C.; Dwyer, Roisin

    2015-01-01

    Earlier work on Objective Assessment of Image Quality (OAIQ) focused largely on estimation or classification tasks in which the desired outcome of imaging is accurate diagnosis. This paper develops a general framework for assessing imaging quality on the basis of therapeutic outcomes rather than diagnostic performance. By analogy to Receiver Operating Characteristic (ROC) curves and their variants as used in diagnostic OAIQ, the method proposed here utilizes the Therapy Operating Characteristic or TOC curves, which are plots of the probability of tumor control vs. the probability of normal-tissue complications as the overall dose level of a radiotherapy treatment is varied. The proposed figure of merit is the area under the TOC curve, denoted AUTOC. This paper reviews an earlier exposition of the theory of TOC and AUTOC, which was specific to the assessment of image-segmentation algorithms, and extends it to other applications of imaging in external-beam radiation treatment as well as in treatment with internal radioactive sources. For each application, a methodology for computing the TOC is presented. A key difference between ROC and TOC is that the latter can be defined for a single patient rather than a population of patients. PMID:24200954
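
    The proposed figure of merit, AUTOC, is the area under the TOC curve; a minimal sketch of its computation by the trapezoid rule is shown below. The dose-sweep values are invented for illustration.

```python
import numpy as np

def autoc(ntcp, tcp):
    """Area under the Therapy Operating Characteristic (TOC) curve.

    `ntcp` and `tcp` are the probabilities of normal-tissue complication and
    of tumor control evaluated over a sweep of overall dose levels.
    """
    order = np.argsort(ntcp)                  # ensure an increasing x-axis
    x = np.asarray(ntcp, float)[order]
    y = np.asarray(tcp, float)[order]
    return float(np.trapz(y, x))

# Illustrative dose sweep (values assumed, not taken from the paper):
ntcp = [0.00, 0.02, 0.08, 0.20, 0.45, 0.80]
tcp  = [0.05, 0.30, 0.60, 0.80, 0.92, 0.98]
print(autoc(ntcp, tcp))                       # a larger area is better
```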

  1. Performance evaluation of objective quality metrics for HDR image compression

    NASA Astrophysics Data System (ADS)

    Valenzise, Giuseppe; De Simone, Francesca; Lauga, Paul; Dufaux, Frederic

    2014-09-01

    Due to the much larger luminance and contrast characteristics of high dynamic range (HDR) images, well-known objective quality metrics, widely used for the assessment of low dynamic range (LDR) content, cannot be directly applied to HDR images in order to predict their perceptual fidelity. To overcome this limitation, advanced fidelity metrics, such as the HDR-VDP, have been proposed to accurately predict visually significant differences. However, their complex calibration may make them difficult to use in practice. A simpler approach consists in computing arithmetic or structural fidelity metrics, such as PSNR and SSIM, on perceptually encoded luminance values but the performance of quality prediction in this case has not been clearly studied. In this paper, we aim at providing a better comprehension of the limits and the potentialities of this approach, by means of a subjective study. We compare the performance of HDR-VDP to that of PSNR and SSIM computed on perceptually encoded luminance values, when considering compressed HDR images. Our results show that these simpler metrics can be effectively employed to assess image fidelity for applications such as HDR image compression.

  2. Material quality development during the automated tow placement process

    NASA Astrophysics Data System (ADS)

    Tierney, John Joseph

    Automated tow placement (ATP) of thermoplastic composites builds on the existing industrial base for equipment, robotics and kinematic placement of material with the aim of further cost reduction by eliminating the autoclave entirely. During ATP processing, thermoplastic composite tows are deposited on a preconsolidated substrate at rates ranging from 10--100mm/s and consolidated using the localized application of heat and pressure by a tow placement head mounted on a robot. The process is highly non-isothermal subjecting the material to multiple heating and cooling rates approaching 1000°C/sec. The requirement for the ATP process is to achieve the same quality in seconds (low void content, full translation of mechanical properties and degree of bonding and minimal warpage) as the autoclave process achieves in hours. The scientific challenge was to first understand and then model the relationships between processing, material response, microstructure and quality. The important phenomena affecting quality investigated in this study include a steady state heat transfer simulation, consolidation and deconsolidation (void dynamics), intimate contact and polymer interdiffusion (degree of bonding/mechanical properties) and residual stress and warpage (crystallization and viscoelastic response). A fundamental understanding of the role of materials related to these mechanisms and their relationship to final quality is developed and applied towards a method of process control and optimization.

  3. Process Architecture for Managing Digital Object Identifiers

    NASA Astrophysics Data System (ADS)

    Wanchoo, L.; James, N.; Stolte, E.

    2014-12-01

    In 2010, NASA's Earth Science Data and Information System (ESDIS) Project implemented a process for registering Digital Object Identifiers (DOIs) for data products distributed by the Earth Observing System Data and Information System (EOSDIS). For the first 3 years, ESDIS evolved the process, involving the data provider community in the development of processes for creating and assigning DOIs and of guidelines for the landing page. To accomplish this, ESDIS established two DOI User Working Groups: one for reviewing the DOI process, whose recommendations were submitted to ESDIS in February 2014; and the other recently tasked to review and further develop DOI landing page guidelines for ESDIS approval by the end of 2014. ESDIS has recently upgraded the DOI system from a manually driven system to one that largely automates the DOI process. The new automated features include: a) review of the DOI metadata, b) assignment of an opaque DOI name if the data provider chooses, and c) reserving, registering, and updating the DOIs. The flexibility of reserving the DOI allows data providers to embed and test the DOI in the data product metadata before formally registering with EZID. The DOI update process allows any DOI metadata to be changed except the DOI name, which can only be changed if it has not yet been registered. Currently, ESDIS has processed a total of 557 DOIs, of which 379 are registered with EZID and 178 are reserved with ESDIS. The DOI incorporates several metadata elements that effectively identify the data product and the source of availability. Of these elements, the Uniform Resource Locator (URL) attribute has the very important function of identifying the landing page which describes the data product. ESDIS, in consultation with data providers in the Earth Science community, is currently developing landing page guidelines that specify the key data product descriptive elements to be included on each data product's landing page. This poster will describe in detail the unique automated process and
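
    A hypothetical sketch of the reserve-then-register lifecycle described above; the record fields, state names, example DOI, and the `DoiRecord` class are illustrative assumptions and do not represent the actual ESDIS or EZID interfaces.

```python
from dataclasses import dataclass, field

@dataclass
class DoiRecord:
    """Illustrative DOI record: reserved first, registered later (hypothetical model)."""
    name: str                       # e.g. an opaque DOI name assigned by the system
    metadata: dict = field(default_factory=dict)
    state: str = "reserved"         # reserved -> registered

    def register(self):
        # Once registered, the DOI name itself can no longer be changed.
        self.state = "registered"

    def update(self, **changes):
        if "name" in changes and self.state == "registered":
            raise ValueError("DOI name cannot be changed after registration")
        self.metadata.update({k: v for k, v in changes.items() if k != "name"})
        if "name" in changes:
            self.name = changes["name"]

# Reserve a DOI, embed and test it in the product metadata, then register it.
doi = DoiRecord("10.1234/EXAMPLE", {"title": "Sample data product",
                                    "url": "https://example.org/landing-page"})
doi.update(url="https://example.org/landing-page-v2")   # allowed while reserved
doi.register()
```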

  4. IEEE Std 730 Software Quality Assurance: Supporting CMMI-DEV v1.3, Product and Process Quality Assurance

    DTIC Science & Technology

    2011-05-27

    Briefing excerpt (Walz, 2011) relating IEEE Std 730 software quality assurance to quality frameworks including CMMI-DEV and the IEEE/ISO/IEC 15288 and 12207 life cycle processes, with reference to the U.S. TAG to ISO TC 176 (Quality Management). CMMI-DEV Product and Process Quality Assurance and IEEE 730 SQA need to align; the P730 IEEE standards working group has expanded the scope of the SQA process standard to align with ISO/IEC 12207.

  5. Barriers to data quality resulting from the process of coding health information to administrative data: a qualitative study.

    PubMed

    Lucyk, Kelsey; Tang, Karen; Quan, Hude

    2017-11-22

    Administrative health data are increasingly used for research and surveillance to inform decision-making because of their large sample sizes, geographic coverage, comprehensiveness, and possibility of longitudinal follow-up. Within Canadian provinces, individuals are assigned unique personal health numbers that allow for linkage of administrative health records in that jurisdiction. It is therefore necessary to ensure that these data are of high quality, and that chart information is accurately coded to meet this end. Our objective is to explore the potential barriers to high-quality data coding through qualitative inquiry into the roles and responsibilities of medical chart coders. We conducted semi-structured interviews with 28 medical chart coders from Alberta, Canada. We used thematic analysis and open-coded each transcript to understand the process of administrative health data generation and identify barriers to its quality. The process of generating administrative health data is highly complex and involves a diverse workforce. As such, there are multiple points in this process that introduce challenges for high-quality data. For coders, the main barriers to data quality occurred around chart documentation, variability in the interpretation of chart information, and high quota expectations. This study illustrates the complex nature of barriers to high-quality coding in the context of administrative data generation. The findings from this study may be of use to data users, researchers, and decision-makers who wish to better understand the limitations of their data or pursue interventions to improve data quality.

  6. Functional Dissociations within the Ventral Object Processing Pathway: Cognitive Modules or a Hierarchical Continuum?

    ERIC Educational Resources Information Center

    Cowell, Rosemary A.; Bussey, Timothy J.; Saksida, Lisa M.

    2010-01-01

    We examined the organization and function of the ventral object processing pathway. The prevailing theoretical approach in this field holds that the ventral object processing stream has a modular organization, in which visual perception is carried out in posterior regions and visual memory is carried out, independently, in the anterior temporal…

  7. Quality assessment of baby food made of different pre-processed organic raw materials under industrial processing conditions.

    PubMed

    Seidel, Kathrin; Kahl, Johannes; Paoletti, Flavio; Birlouez, Ines; Busscher, Nicolaas; Kretzschmar, Ursula; Särkkä-Tirkkonen, Marjo; Seljåsen, Randi; Sinesio, Fiorella; Torp, Torfinn; Baiamonte, Irene

    2015-02-01

    The market for processed food is rapidly growing. The industry needs methods for "processing with care" leading to high quality products in order to meet consumers' expectations. Processing influences the quality of the finished product through various factors. In carrot baby food, these are the raw material, the pre-processing and storage treatments as well as the processing conditions. In this study, a quality assessment was performed on baby food made from different pre-processed raw materials. The experiments were carried out under industrial conditions using fresh, frozen and stored organic carrots as raw material. Statistically significant differences were found for sensory attributes among the three autoclaved puree samples (e.g. overall odour F = 90.72, p < 0.001). Samples processed from frozen carrots show increased moisture content and decrease of several chemical constituents. Biocrystallization identified changes between replications of the cooking. Pre-treatment of raw material has a significant influence on the final quality of the baby food.

  8. Bioactive compounds and quality parameters of avocado oil obtained by different processes.

    PubMed

    Krumreich, Fernanda D; Borges, Caroline D; Mendonça, Carla Rosane B; Jansen-Alves, Cristina; Zambiazi, Rui C

    2018-08-15

    The objective of this study was to evaluate the quality of avocado oil obtained from pulp processed through different drying and oil extraction methods. The physicochemical characteristics of avocados cv. Breda were determined after drying the pulp in an oven under ventilation (40 °C and 60 °C) or in a vacuum oven (60 °C), followed by oil extraction by mechanical pressing or the Soxhlet method. Of the approximately 72% of the fruit that is pulp, about 16% is lipids. The quality indices evaluated in the avocado oil showed better results when the pulp was dried at 60 °C under vacuum and oil extraction was done by the Soxhlet method with petroleum ether, whereas the bioactive compounds were better preserved when the avocado pulp was dried at 60 °C under ventilation and mechanical pressing was used for the oil extraction. Among the fatty acids found, oleic acid was the predominant one. Copyright © 2018 Elsevier Ltd. All rights reserved.

  9. Modelling health care processes for eliciting user requirements: a way to link a quality paradigm and clinical information system design.

    PubMed

    Staccini, P; Joubert, M; Quaranta, J F; Fieschi, D; Fieschi, M

    2000-01-01

    Hospital information systems have to support quality improvement objectives. The design issues of a health care information system can be classified into three categories: 1) time-oriented and event-labelled storage of patient data; 2) contextual support of decision-making; 3) capabilities for modular upgrading. The elicitation of the requirements has to meet users' needs in relation to both the quality (efficacy, safety) and the monitoring of all health care activities (traceability). Information analysts need methods to conceptualize clinical information systems that provide actors with individual benefits and guide behavioural changes. A methodology is proposed to elicit and structure users' requirements using a process-oriented analysis, and it is applied to the field of blood transfusion. An object-oriented data model of a process has been defined in order to identify its main components: activity, sub-process, resources, constraints, guidelines, parameters and indicators. Although some aspects of activity, such as "where", "what else", and "why", are poorly represented by the data model alone, this method of requirement elicitation fits the dynamic of data input for the process to be traced. A hierarchical representation of hospital activities has to be found for this approach to be generalised within the organisation, for the processes to be interrelated, and for their characteristics to be shared.
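
    A minimal sketch, under stated assumptions, of the kind of object-oriented process model described above; the class and attribute names mirror the components listed in the abstract (activity, sub-process, resources, constraints, guidelines, parameters, indicators) but the example content is illustrative.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class Activity:
    """One step of a care process, e.g. 'issue blood product'."""
    name: str
    resources: List[str] = field(default_factory=list)    # who/what performs it
    constraints: List[str] = field(default_factory=list)  # e.g. regulatory rules
    guidelines: List[str] = field(default_factory=list)   # applicable guidelines
    parameters: List[str] = field(default_factory=list)   # data captured for traceability
    indicators: List[str] = field(default_factory=list)   # quality indicators

@dataclass
class Process:
    """A process is composed of activities and, recursively, of sub-processes."""
    name: str
    activities: List[Activity] = field(default_factory=list)
    sub_processes: List["Process"] = field(default_factory=list)

# Illustrative fragment of a blood-transfusion process.
transfusion = Process(
    "Blood transfusion",
    activities=[Activity("Order blood product",
                         resources=["physician"],
                         indicators=["delay between order and delivery"])],
)
```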

  10. A process for developing standards to promote quality in general practice.

    PubMed

    Khoury, Julie; Krejany, Catherine J; Versteeg, Roald W; Lodewyckx, Michaela A; Pike, Simone R; Civil, Michael S; Jiwa, Moyez

    2018-06-02

    Since 1991, the Royal Australian College of General Practitioners' (RACGP) Standards for General Practices (the Standards) have provided a framework for quality care, risk management and best practice in the operation of Australian general practices. The Standards are also linked to incentives for general practice remuneration. These Standards were revised in 2017. The objective of this study is to describe the process undertaken to develop the fifth edition Standards published in 2017 to inform future standards development both nationally and internationally. A modified Delphi process was deployed to develop the fifth edition Standards. Development was directed by the RACGP and led by an expert panel of GPs and representatives of stakeholder groups who were assisted and facilitated by a team from RACGP. Each draft was released for stakeholder feedback and tested twice before the final version was submitted for approval by the RACGP board. Four rounds of consultation and two rounds of piloting were carried out over 32 months. The Standards were redrafted after each round. One hundred and fifty-two individuals and 225 stakeholder groups participated in the development of the Standards. Twenty-three new indicators were recommended and grouped into three sections in a new modular structure that was different from the previous edition. The Standards represent the consensus view of national stakeholders on the indicators of quality and safety in Australian general practice and primary care.

  11. Product quality considerations for mammalian cell culture process development and manufacturing.

    PubMed

    Gramer, Michael J

    2014-01-01

    The manufacturing of a biologic drug from mammalian cells results in not a single substance, but an array of product isoforms, also known as variants. These isoforms arise due to intracellular or extracellular events as a result of biological or chemical modification. The most common examples related to biomanufacturing include amino acid modifications (glycosylation, isomerization, oxidation, adduct formation, pyroglutamate formation, phosphorylation, sulfation, amidation), amino acid sequence variants (genetic mutations, amino acid misincorporation, N- and C-terminal heterogeneity, clipping), and higher-order structure modifications (misfolding, aggregation, disulfide pairing). Process-related impurities (HCP, DNA, media components, viral particles) are also important quality attributes related to product safety. The observed ranges associated with each quality attribute define the product quality profile. A biologic drug must have a correct and consistent quality profile throughout clinical development and scale-up to commercial production to ensure product safety and efficacy. In general, the upstream process (cell culture) defines the quality of product-related substances, whereas the downstream process (purification) defines the residual level of process- and product-related impurities. The purpose of this chapter is to review the impact of the cell culture process on product quality. Emphasis is placed on studies with industrial significance and where the direct mechanism of product quality impact was determined. Where possible, recommendations for maintaining consistent or improved quality are provided.

  12. On numerical model of time-dependent processes in three-dimensional porous heat-releasing objects

    NASA Astrophysics Data System (ADS)

    Lutsenko, Nickolay A.

    2016-10-01

    The gas flows in the gravity field through porous objects with heat-releasing sources are investigated when the self-regulation of the flow rate of the gas passing through the porous object takes place. Such objects can appear after various natural or man-made disasters (like the exploded unit of the Chernobyl NPP). The mathematical model and the original numerical method, based on a combination of explicit and implicit finite difference schemes, are developed for investigating the time-dependent processes in 3D porous energy-releasing objects. The advantage of the numerical model is its ability to describe unsteady processes under both natural convection and forced filtration. The gas cooling of 3D porous objects with different distribution of heat sources is studied using computational experiment.
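
    A minimal one-dimensional sketch of a finite-difference treatment of the kind mentioned above, using only an explicit update for a column of porous material with distributed heat release (the paper combines explicit and implicit schemes and works in 3D); the geometry, coefficients, and boundary conditions are illustrative assumptions.

```python
import numpy as np

# Illustrative 1D explicit finite-difference march for a porous column with
# internal heat-releasing sources; all parameters are assumptions.
nx, dx, dt = 50, 0.1, 0.05           # grid points, spacing (m), time step (s)
alpha = 1e-3                          # effective thermal diffusivity (m^2/s)
q = np.zeros(nx); q[20:30] = 0.5      # volumetric heat-release rate (K/s) in the object
T = np.full(nx, 300.0)                # initial temperature (K)

for _ in range(2000):                 # march 100 s forward in time
    lap = (np.roll(T, -1) - 2 * T + np.roll(T, 1)) / dx ** 2
    T = T + dt * (alpha * lap + q)
    T[0] = T[-1] = 300.0              # boundaries held at ambient temperature

print(f"Peak temperature after 100 s: {T.max():.1f} K")
```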

  13. A Controlled Agitation Process for Improving Quality of Canned Green Beans during Agitation Thermal Processing.

    PubMed

    Singh, Anika; Pratap Singh, Anubhav; Ramaswamy, Hosahalli S

    2016-06-01

    This work introduces the concept of a controlled agitation thermal process to reduce quality damage in liquid-particulate products during agitation thermal processing. Reciprocating agitation thermal processing (RA-TP) was used as the agitation thermal process. In order to reduce the impact of agitation, a new concept of "stopping agitations after sufficient development of cold-spot temperature" was proposed. Green beans were processed in No. 2 (307×409) cans filled with liquids of various consistency (0% to 2% CMC) at various frequencies (1 to 3 Hz) of RA-TP using a full-factorial design, and heat penetration results were collected. The corresponding operator's process time to impart a 10-min process lethality (F0) and the agitation time (AT) were calculated using the heat penetration results. Accordingly, products were processed again by stopping agitations as per three agitation regimes, namely: full-time agitation, equilibration-time agitation, and partial-time agitation. Processed products were photographed and tested for visual quality, color, texture, breakage of green beans, turbidity, and percentage of insoluble solids in the can liquid. Results showed that stopping agitations after sufficient development of cold-spot temperatures is an effective way of reducing product damage caused by agitation (for example, breakage of beans and leaching into the liquid). Agitation until a one-log temperature difference gave the best color, texture and visual product quality for the low-viscosity liquid-particulate mixtures, and extended agitation until equilibration time was best for high-viscosity products. Thus, it was shown that a controlled agitation thermal process is more effective in obtaining high product quality as compared to a regular agitation thermal process. © 2016 Institute of Food Technologists®
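
    For reference, process lethality of the kind targeted here (F0) is conventionally obtained by integrating the lethal rate over the cold-spot temperature history (the General Method); the sketch below assumes the standard reference temperature of 121.1 °C and a z-value of 10 °C, with made-up heat-penetration data rather than values from this study.

```python
import numpy as np

def process_lethality(times_min, temps_c, t_ref=121.1, z=10.0):
    """F0 (min) from a cold-spot time-temperature history:
    F0 = integral of 10**((T - Tref)/z) dt."""
    lethal_rate = 10.0 ** ((np.asarray(temps_c) - t_ref) / z)
    return np.trapz(lethal_rate, np.asarray(times_min))

# Made-up cold-spot heat-penetration data (minutes, degrees C).
t = [0, 5, 10, 15, 20, 25, 30]
T = [70, 95, 110, 118, 121, 121, 100]
print(f"F0 = {process_lethality(t, T):.1f} min")
```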

  14. A Subjective and Objective Process for Athletic Training Student Selection

    ERIC Educational Resources Information Center

    Hawkins, Jeremy R.; McLoda, Todd A.; Stanek, Justin M.

    2015-01-01

    Context: Admission decisions are made annually concerning whom to accept into athletic training programs. Objective: To present an approach used to make admissions decisions at an undergraduate athletic training program and to corroborate this information by comparing each aspect to nursing program admission processes. Background: Annually,…

  15. Improvement of image quality of coherently illuminated objects in a turbulent atmosphere

    NASA Astrophysics Data System (ADS)

    Banakh, Viktor A.; Chen, Ben-Nam

    1994-06-01

    It is shown that the phenomenon of correlation of opposing waves may lead to improvement of image quality of coherently illuminated objects in a turbulent atmosphere in the case of strong intensity fluctuations. The extent of this improvement depends on the relation between sizes of the output and receiving apertures. The betterment of visibility in a turbulent atmosphere becomes maximal in the case of their proximity and vanishes if the sizes of illuminating and receiving apertures are distinguished from each other significantly.

  16. Total Quality Management (TQM). Process Action Team Course

    DTIC Science & Technology

    1990-05-30

    Department of Defense Total Quality Management (TQM) Process Action Team Course, May 30, 1990 (Booz Allen & Hamilton Inc.). The course materials address the organization's TQM infrastructure and refer readers needing additional information to the student manual, Total Quality Management (TQM) Awareness Seminar; related efforts were identified in Appendix A of Booz Allen's training manual, Total Quality Management Awareness Seminar, Revision 5, November 15.

  17. Fuel quality/processing study. Volume 4: On site processing studies

    NASA Technical Reports Server (NTRS)

    Jones, G. E., Jr.; Cutrone, M.; Doering, H.; Hickey, J.

    1981-01-01

    Fuel treatment at the turbine site and processing of the turbine exhaust gas at the site are studied. Fuel treatments protect the turbine from contaminants or impurities either present in the fuel as produced or picked up by the fuel during normal transportation. Exhaust gas treatments provide for the reduction of NOx and SOx to environmentally acceptable levels. The impact of fuel quality upon turbine maintenance and deterioration is considered. On-site costs include not only the fuel treatment costs as such, but also incremental costs incurred by the turbine operator if a turbine fuel of low quality is not acceptably upgraded.

  18. [Development of whole process quality control and management system of traditional Chinese medicine decoction pieces based on traditional Chinese medicine quality tree].

    PubMed

    Yu, Wen-Kang; Dong, Ling; Pei, Wen-Xuan; Sun, Zhi-Rong; Dai, Jun-Dong; Wang, Yun

    2017-12-01

    The whole-process quality control and management of traditional Chinese medicine (TCM) decoction pieces is a systems-engineering task, involving the base environment, seeds and seedlings, harvesting, processing and other steps, so accurate identification of the factors in the TCM production process that may induce quality risk, together with reasonable quality control measures, is very important. At present, the concept of quality risk is mainly discussed at the level of management and regulations; there is no comprehensive analysis of the possible risks in the quality control process of TCM decoction pieces, nor a summary of effective quality control schemes. A whole-process quality control and management system for TCM decoction pieces based on the TCM quality tree was proposed in this study. This system effectively combines the process-analysis method of the TCM quality tree with quality risk management, and can help managers to make real-time decisions while realizing whole-process quality control of TCM. By providing a personalized web interface, the system realizes user-oriented information feedback and makes it convenient for users to predict, evaluate and control the quality of TCM. In application, the whole-process quality control and management system for TCM decoction pieces can identify related quality factors such as base environment, cultivation and pieces processing, extend and modify the existing scientific workflow according to an enterprise's own production conditions, and provide different enterprises with their own quality systems, to achieve personalized service. As a new quality management model, it can provide a reference for improving the quality of Chinese medicine production and quality standardization. Copyright© by the Chinese Pharmaceutical Association.

  19. A single-rate context-dependent learning process underlies rapid adaptation to familiar object dynamics.

    PubMed

    Ingram, James N; Howard, Ian S; Flanagan, J Randall; Wolpert, Daniel M

    2011-09-01

    Motor learning has been extensively studied using dynamic (force-field) perturbations. These induce movement errors that result in adaptive changes to the motor commands. Several state-space models have been developed to explain how trial-by-trial errors drive the progressive adaptation observed in such studies. These models have been applied to adaptation involving novel dynamics, which typically occurs over tens to hundreds of trials, and which appears to be mediated by a dual-rate adaptation process. In contrast, when manipulating objects with familiar dynamics, subjects adapt rapidly within a few trials. Here, we apply state-space models to familiar dynamics, asking whether adaptation is mediated by a single-rate or dual-rate process. Previously, we reported a task in which subjects rotate an object with known dynamics. By presenting the object at different visual orientations, adaptation was shown to be context-specific, with limited generalization to novel orientations. Here we show that a multiple-context state-space model, with a generalization function tuned to visual object orientation, can reproduce the time-course of adaptation and de-adaptation as well as the observed context-dependent behavior. In contrast to the dual-rate process associated with novel dynamics, we show that a single-rate process mediates adaptation to familiar object dynamics. The model predicts that during exposure to the object across multiple orientations, there will be a degree of independence for adaptation and de-adaptation within each context, and that the states associated with all contexts will slowly de-adapt during exposure in one particular context. We confirm these predictions in two new experiments. Results of the current study thus highlight similarities and differences in the processes engaged during exposure to novel versus familiar dynamics. In both cases, adaptation is mediated by multiple context-specific representations. In the case of familiar object dynamics
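
    A minimal sketch of a single-rate, context-dependent state-space learner of the kind discussed above, assuming one adaptation state per visual object orientation, an error-driven update, and Gaussian generalization between orientations; the retention and learning rates and the tuning width are illustrative, not the fitted values from the study.

```python
import numpy as np

# Single-rate, context-dependent state-space model (illustrative parameters).
orientations = np.arange(0, 360, 45)            # contexts = visual object orientations
x = np.zeros(orientations.shape, dtype=float)   # adaptation state per context
A, B, sigma = 0.99, 0.2, 30.0                   # retention, learning rate, tuning width (deg)
perturbation = 1.0                               # known object dynamics (normalized)

def generalization(current_deg):
    d = np.abs(orientations - current_deg)
    d = np.minimum(d, 360 - d)                   # circular distance between orientations
    return np.exp(-0.5 * (d / sigma) ** 2)

for trial in range(30):                          # repeated exposure at 0 degrees
    g = generalization(0.0)
    output = float(g @ x)                        # net compensation from all contexts
    error = perturbation - output                # movement error on this trial
    x = A * x + B * g * error                    # single-rate, error-driven update

print("Compensation at 0 deg:", round(float(generalization(0.0) @ x), 2))
print("Generalization to 90 deg:", round(float(generalization(90.0) @ x), 2))
```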

  20. 7 CFR 457.144 - Northern potato crop insurance-processing quality endorsement.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 7 Agriculture 6 2011-01-01 2011-01-01 false Northern potato crop insurance-processing quality... Northern potato crop insurance—processing quality endorsement. The Northern Potato Crop Insurance.... Definitions Broker. Any business enterprise regularly engaged in the buying and selling of processing potatoes...

  1. 7 CFR 457.144 - Northern potato crop insurance-processing quality endorsement.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 7 Agriculture 6 2014-01-01 2014-01-01 false Northern potato crop insurance-processing quality... Northern potato crop insurance—processing quality endorsement. The Northern Potato Crop Insurance.... Definitions Broker. Any business enterprise regularly engaged in the buying and selling of processing potatoes...

  2. 7 CFR 457.144 - Northern potato crop insurance-processing quality endorsement.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 7 Agriculture 6 2013-01-01 2013-01-01 false Northern potato crop insurance-processing quality... Northern potato crop insurance—processing quality endorsement. The Northern Potato Crop Insurance.... Definitions Broker. Any business enterprise regularly engaged in the buying and selling of processing potatoes...

  3. 7 CFR 457.144 - Northern potato crop insurance-processing quality endorsement.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 7 Agriculture 6 2012-01-01 2012-01-01 false Northern potato crop insurance-processing quality... Northern potato crop insurance—processing quality endorsement. The Northern Potato Crop Insurance.... Definitions Broker. Any business enterprise regularly engaged in the buying and selling of processing potatoes...

  4. Achieving performance breakthroughs in an HMO business process through quality planning.

    PubMed

    Hanan, K B

    1993-01-01

    Kaiser Permanente's Georgia Region commissioned a quality planning team to design a new process to improve payments to its suppliers and vendors. The result of the team's effort was a 73 percent reduction in cycle time. This team's experiences point to the advantages of process redesign as a quality planning model, as well as some general guidelines for its most effective use in teams. If quality planning project teams are carefully configured, sufficiently expert in the existing process, and properly supported by management, organizations can achieve potentially dramatic improvements in process performance using this approach.

  5. Objective assessment of image quality. IV. Application to adaptive optics

    PubMed Central

    Barrett, Harrison H.; Myers, Kyle J.; Devaney, Nicholas; Dainty, Christopher

    2008-01-01

    The methodology of objective assessment, which defines image quality in terms of the performance of specific observers on specific tasks of interest, is extended to temporal sequences of images with random point spread functions and applied to adaptive imaging in astronomy. The tasks considered include both detection and estimation, and the observers are the optimal linear discriminant (Hotelling observer) and the optimal linear estimator (Wiener). A general theory of first- and second-order spatiotemporal statistics in adaptive optics is developed. It is shown that the covariance matrix can be rigorously decomposed into three terms representing the effect of measurement noise, random point spread function, and random nature of the astronomical scene. Figures of merit are developed, and computational methods are discussed. PMID:17106464
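
    A minimal sketch of the optimal linear discriminant (Hotelling observer) used as a figure of merit in this line of work, assuming signal-present and signal-absent image samples are available; the template is the covariance-weighted mean difference, detectability is summarized by the Hotelling SNR, and the data below are synthetic.

```python
import numpy as np

def hotelling_observer(imgs_absent, imgs_present, reg=1e-3):
    """Hotelling template w = S^-1 (mean_present - mean_absent) and its SNR.
    imgs_* are (n_samples, n_pixels) arrays of vectorized images."""
    d_mean = imgs_present.mean(axis=0) - imgs_absent.mean(axis=0)
    cov = 0.5 * (np.cov(imgs_absent, rowvar=False) + np.cov(imgs_present, rowvar=False))
    cov += reg * np.eye(cov.shape[0])            # regularize for invertibility
    w = np.linalg.solve(cov, d_mean)             # Hotelling template
    return w, np.sqrt(float(d_mean @ w))         # template and Hotelling SNR

# Illustrative data: a faint point source added to noisy 8x8 backgrounds.
rng = np.random.default_rng(1)
n, npix = 500, 8 * 8
signal = np.zeros(npix); signal[npix // 2 + 4] = 2.0
absent = rng.normal(0.0, 1.0, (n, npix))
present = rng.normal(0.0, 1.0, (n, npix)) + signal
w, snr = hotelling_observer(absent, present)
print(f"Hotelling SNR ~ {snr:.2f}")
```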

  6. Spacelab Data Processing Facility (SLDPF) quality assurance expert systems development

    NASA Technical Reports Server (NTRS)

    Basile, Lisa R.; Kelly, Angelita C.

    1987-01-01

    The Spacelab Data Processing Facility (SLDPF) is an integral part of the Space Shuttle data network for missions that involve attached scientific payloads. Expert system prototypes were developed to aid in the performance of the quality assurance function of the Spacelab and/or Attached Shuttle Payloads processed telemetry data. The Spacelab Input Processing System (SIPS) and the Spacelab Output Processing System (SOPS), two expert systems, were developed to determine their feasibility and potential in the quality assurance of processed telemetry data. The capabilities and performance of these systems are discussed.

  7. The impact of working memory and the "process of process modelling" on model quality: Investigating experienced versus inexperienced modellers.

    PubMed

    Martini, Markus; Pinggera, Jakob; Neurauter, Manuel; Sachse, Pierre; Furtner, Marco R; Weber, Barbara

    2016-05-09

    A process model (PM) represents the graphical depiction of a business process, for instance, the entire process from online ordering a book until the parcel is delivered to the customer. Knowledge about relevant factors for creating PMs of high quality is lacking. The present study investigated the role of cognitive processes as well as modelling processes in creating a PM in experienced and inexperienced modellers. Specifically, two working memory (WM) functions (holding and processing of information and relational integration) and three process of process modelling phases (comprehension, modelling, and reconciliation) were related to PM quality. Our results show that the WM function of relational integration was positively related to PM quality in both modelling groups. The ratio of comprehension phases was negatively related to PM quality in inexperienced modellers and the ratio of reconciliation phases was positively related to PM quality in experienced modellers. Our research reveals central cognitive mechanisms in process modelling and has potential practical implications for the development of modelling software and teaching the craft of process modelling.

  8. A real time quality control application for animal production by image processing.

    PubMed

    Sungur, Cemil; Özkan, Halil

    2015-11-01

    Standards of hygiene and health are of major importance in food production, and quality control has become obligatory in this field. Thanks to rapidly developing technologies, it is now possible to carry out automatic and safe quality control of food production. For this purpose, image-processing-based quality control systems used in industrial applications are being employed to analyze the quality of food products. In this study, quality control of chicken (Gallus domesticus) eggs was achieved using a real-time image-processing technique. In order to execute the quality control processes, a conveying mechanism was used. Eggs passing on a conveyor belt were continuously photographed in real time by cameras located above the belt. The images obtained were processed by various methods and techniques. Using digital instrumentation, the volume of the eggs was measured, broken/cracked eggs were separated and dirty eggs were identified. In accordance with international standards for classifying the quality of eggs, the class of separated eggs was determined through a fuzzy implication model. According to tests carried out on thousands of eggs, a quality control process with an accuracy of 98% was possible. © 2014 Society of Chemical Industry.
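
    A minimal sketch of the kind of per-frame check described above, assuming a grayscale camera frame in which the egg appears bright against a dark conveyor belt; the thresholds, size-to-grade mapping, and dirt test are illustrative assumptions rather than the authors' algorithm.

```python
import numpy as np

def inspect_egg(frame, bright_thresh=120, dirt_thresh=60, min_area=2000):
    """Very small quality check on one grayscale frame (uint8, 0-255).
    Returns (area_px, grade, dirty) for the segmented egg region."""
    egg_mask = frame > bright_thresh                  # egg is brighter than the belt
    area = int(egg_mask.sum())
    if area < min_area:
        return area, "reject (no egg / broken)", False
    # Dark pixels inside the egg region are treated as dirt spots.
    dirty = np.mean(frame[egg_mask] < dirt_thresh) > 0.02
    grade = "L" if area > 6000 else "M" if area > 4000 else "S"
    return area, grade, dirty

# Illustrative synthetic frame: a bright disc on a dark background.
yy, xx = np.mgrid[0:120, 0:160]
frame = np.where((yy - 60) ** 2 + (xx - 80) ** 2 < 45 ** 2, 200, 20).astype(np.uint8)
print(inspect_egg(frame))
```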

  9. A Guide to Pathways through the Pre-Five Quality Process.

    ERIC Educational Resources Information Center

    Strathclyde Regional Council, Glasgow (Scotland).

    This guide describes a quality process for external and internal evaluation of the elementary school education department. The term "pathway" is used to define routes through the quality process that describe any school administrative activity in terms of the indicators and examples of good practice. There are five pathways: process…

  10. A cross-sectional study to identify organisational processes associated with nurse-reported quality and patient safety

    PubMed Central

    Tvedt, Christine; Sjetne, Ingeborg Strømseng; Helgeland, Jon; Bukholm, Geir

    2012-01-01

    Objectives The purpose of this study was to identify organisational processes and structures that are associated with nurse-reported patient safety and quality of nursing. Design This is an observational cross-sectional study using survey methods. Setting Respondents from 31 Norwegian hospitals with more than 85 beds were included in the survey. Participants All registered nurses working in direct patient care in a position of 20% or more were invited to answer the survey. In this study, 3618 nurses from surgical and medical wards responded (response rate 58.9). Nurses' practice environment was defined as organisational processes and measured by the Nursing Work Index Revised and items from Hospital Survey on Patient Safety Culture. Outcome measures Nurses' assessments of patient safety, quality of nursing, confidence in how their patients manage after discharge and frequency of adverse events were used as outcome measures. Results Quality system, nurse–physician relation, patient safety management and staff adequacy were process measures associated with nurse-reported work-related and patient-related outcomes, but we found no associations with nurse participation, education and career and ward leadership. Most organisational structures were non-significant in the multilevel model except for nurses’ affiliations to medical department and hospital type. Conclusions Organisational structures may have minor impact on how nurses perceive work-related and patient-related outcomes, but the findings in this study indicate that there is a considerable potential to address organisational design in improvement of patient safety and quality of care. PMID:23263021

  11. Measuring the Process and Quality of Informed Consent for Clinical Research: Development and Testing

    PubMed Central

    Cohn, Elizabeth Gross; Jia, Haomiao; Smith, Winifred Chapman; Erwin, Katherine; Larson, Elaine L.

    2013-01-01

    Purpose/Objectives To develop and assess the reliability and validity of an observational instrument, the Process and Quality of Informed Consent (P-QIC). Design A pilot study of the psychometrics of a tool designed to measure the quality and process of the informed consent encounter in clinical research. The study used professionally filmed, simulated consent encounters designed to vary in process and quality. Setting A major urban teaching hospital in the northeastern region of the United States. Sample 63 students enrolled in health-related programs participated in psychometric testing, 16 students participated in test-retest reliability, and 5 investigator-participant dyads were observed for the actual consent encounters. Methods For reliability and validity testing, students watched and rated videotaped simulations of four consent encounters intentionally varied in process and content and rated them with the proposed instrument. Test-retest reliability was established by raters watching the videotaped simulations twice. Inter-rater reliability was demonstrated by two simultaneous but independent raters observing an actual consent encounter. Main Research Variables The essential elements of information and communication for informed consent. Findings The initial testing of the P-QIC demonstrated reliable and valid psychometric properties in both the simulated standardized consent encounters and actual consent encounters in the hospital setting. Conclusions The P-QIC is an easy-to-use observational tool that provides a quick assessment of the areas of strength and areas that need improvement in a consent encounter. It can be used in the initial trainings of new investigators or consent administrators and in ongoing programs of improvement for informed consent. Implications for Nursing The development of a validated observational instrument will allow investigators to assess the consent process more accurately and evaluate strategies designed to improve it. PMID

  12. Multi-pollutant surface objective analyses and mapping of air quality health index over North America.

    PubMed

    Robichaud, Alain; Ménard, Richard; Zaïtseva, Yulia; Anselmo, David

    2016-01-01

    Air quality, like weather, can affect everyone, but responses differ depending on the sensitivity and health condition of a given individual. To help protect exposed populations, many countries have put in place real-time air quality nowcasting and forecasting capabilities. We present in this paper an optimal combination of air quality measurements and model outputs and show that it leads to significant improvements in the spatial representativeness of air quality. The product is referred to as multi-pollutant surface objective analyses (MPSOAs). Moreover, based on MPSOA, a geographical mapping of the Canadian Air Quality Health Index (AQHI) is also presented which provides users (policy makers, public, air quality forecasters, and epidemiologists) with a more accurate picture of the health risk anytime and anywhere in Canada and the USA. Since pollutants can also behave as passive atmospheric tracers, they provide information about transport and dispersion and, hence, reveal synoptic and regional meteorological phenomena. MPSOA could also be used to build air pollution climatology, compute local and national trends in air quality, and detect systematic biases in numerical air quality (AQ) models. Finally, initializing AQ models at regular time intervals with MPSOA can produce more accurate air quality forecasts. It is for these reasons that the Canadian Meteorological Centre (CMC) in collaboration with the Air Quality Research Division (AQRD) of Environment Canada has recently implemented MPSOA in their daily operations.
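
    As an illustration of the "optimal combination of air quality measurements and model outputs" referred to above, the sketch below applies a textbook optimal-interpolation (BLUE) update at a few grid points; the covariance choices and numbers are assumptions for illustration, not the operational MPSOA configuration.

```python
import numpy as np

# Background (model) field at 4 grid points and 2 surface observations.
xb = np.array([35.0, 42.0, 50.0, 61.0])         # e.g. PM2.5 from the AQ model (ug/m3)
y = np.array([30.0, 70.0])                      # station measurements
H = np.array([[1, 0, 0, 0],                     # stations co-located with grid points 0 and 3
              [0, 0, 0, 1]], dtype=float)

# Assumed error covariances: spatially correlated background, uncorrelated observations.
B = 16.0 * np.exp(-np.abs(np.subtract.outer(range(4), range(4))) / 2.0)
R = 4.0 * np.eye(2)

# Optimal interpolation / BLUE analysis: xa = xb + K (y - H xb)
K = B @ H.T @ np.linalg.inv(H @ B @ H.T + R)
xa = xb + K @ (y - H @ xb)
print("Analysis:", np.round(xa, 1))
```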

  13. Effect of processing conditions on quality of green beans subjected to reciprocating agitation thermal processing.

    PubMed

    Singh, Anika; Singh, Anubhav Pratap; Ramaswamy, Hosahalli S

    2015-12-01

    The effect of reciprocating agitation thermal processing (RA-TP) on quality of canned beans was evaluated in a lab-scale reciprocating retort. Green beans were selected due to their soft texture and sensitive color. Green beans (2.5 cm length × 0.8 cm diameter) were filled into 307×409 cans with carboxymethylcellulose (0-2%) solutions and processed at different temperatures (110-130 °C) and reciprocation frequencies (1-3 Hz) for predetermined heating times to achieve a process lethality (F0) of 10 min. Products processed at higher temperatures and higher reciprocation frequencies resulted in better retention of chlorophyll and antioxidant activity. However, high reciprocation frequency also resulted in texture losses, with higher breakage of beans, increased turbidity and higher leaching. There was total loss of product quality at the highest agitation speed, especially with low-viscosity covering solutions. Results suggest that reciprocating agitation frequency needs to be adequately moderated to get the best quality. For the best quality, particularly for canned liquid-particulate foods with soft particulates and those susceptible to high-impact agitation, a gentle reciprocating motion (~1 Hz) would be a good compromise. Copyright © 2015 Elsevier Ltd. All rights reserved.

  14. [Analysis and countermeasure for quality risk in process of traditional Chinese medicine preparations].

    PubMed

    Yang, Ming; Yang, Yuan-Zhen; Wang, Ya-Qi; Wu, Zhen-Feng; Wang, Xue-Cheng; Luo, Jing

    2017-03-01

    Product quality relies not only on testing methods, but also on design and development, production control, and all aspects of manufacturing and logistics management. Quality comes from the level of process control. Therefore, it is very important to accurately identify the factors that may induce quality risk in the production process and to define corresponding quality control measures. This article systematically analyzes the sources of quality risk in all aspects of the production process of traditional Chinese medicine preparations. It discusses ways and methods of quality risk identification for traditional Chinese medicine preparations and provides a reference for perfecting the whole-process quality management of traditional Chinese medicine preparations. Copyright© by the Chinese Pharmaceutical Association.

  15. Laser-induced acoustic imaging of underground objects

    NASA Astrophysics Data System (ADS)

    Li, Wen; DiMarzio, Charles A.; McKnight, Stephen W.; Sauermann, Gerhard O.; Miller, Eric L.

    1999-02-01

    This paper introduces a new demining technique based on the photo-acoustic interaction, together with results from photo-acoustic experiments. We have buried different types of targets (metal, rubber and plastic) in different media (sand, soil and water) and imaged them by measuring the reflection of acoustic waves generated by irradiation with a CO2 laser. Research has focused on signal acquisition and signal processing. A deconvolution method using Wiener filters is utilized in data processing. Using a uniform spatial distribution of laser pulses at the ground's surface, we obtained 3D images of buried objects. The images give us a clear representation of the shapes of the underground objects. The quality of the images depends on the mismatch of acoustic impedance of the buried objects, the bandwidth and center frequency of the acoustic sensors, and the selection of filter functions.
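
    A minimal sketch of the Wiener deconvolution step mentioned above, applied to a single 1D acoustic trace, assuming the source wavelet (system response) is known and using a constant noise-to-signal ratio; all signals below are synthetic.

```python
import numpy as np

def wiener_deconvolve(trace, wavelet, nsr=0.05):
    """Frequency-domain Wiener deconvolution: F = G * conj(H) / (|H|^2 + NSR)."""
    n = len(trace)
    H = np.fft.rfft(wavelet, n)
    G = np.fft.rfft(trace, n)
    F = G * np.conj(H) / (np.abs(H) ** 2 + nsr)
    return np.fft.irfft(F, n)

# Synthetic example: two reflectors convolved with a short wavelet, plus noise.
rng = np.random.default_rng(0)
reflectivity = np.zeros(256); reflectivity[[60, 150]] = [1.0, -0.6]
wavelet = np.exp(-0.5 * ((np.arange(25) - 12) / 3.0) ** 2)
trace = np.convolve(reflectivity, wavelet)[:256] + 0.02 * rng.normal(size=256)

recovered = wiener_deconvolve(trace, wavelet)
print("Strongest recovered reflector at sample",
      int(np.argmax(np.abs(recovered))))        # should be near sample 60
```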

  16. Development and implementation of the Caribbean Laboratory Quality Management Systems Stepwise Improvement Process (LQMS-SIP) Towards Accreditation

    PubMed Central

    Alemnji, George; Edghill, Lisa; Wallace-Sankarsingh, Sacha; Albalak, Rachel; Cognat, Sebastien; Nkengasong, John; Gabastou, Jean-Marc

    2017-01-01

    Background Implementing quality management systems and accrediting laboratories in the Caribbean has been a challenge. Objectives We report the development of a stepwise process for quality systems improvement in the Caribbean Region. Methods The Caribbean Laboratory Stakeholders met under a joint Pan American Health Organization/US Centers for Disease Control and Prevention initiative and developed a user-friendly framework called ‘Laboratory Quality Management System – Stepwise Improvement Process (LQMS-SIP) Towards Accreditation’ to support countries in strengthening laboratory services through a stepwise approach toward fulfilling the ISO 15189: 2012 requirements. Results This approach consists of a three-tiered framework. Tier 1 represents the minimum requirements corresponding to the mandatory criteria for obtaining a licence from the Ministry of Health of the participating country. The next two tiers are quality improvement milestones that are achieved through the implementation of specific quality management system requirements. Laboratories that meet the requirements of the three tiers will be encouraged to apply for accreditation. The Caribbean Regional Organisation for Standards and Quality hosts the LQMS-SIP Secretariat and will work with countries, including the Ministry of Health and stakeholders, including laboratory staff, to coordinate and implement LQMS-SIP activities. The Caribbean Public Health Agency will coordinate and advocate for the LQMS-SIP implementation. Conclusion This article presents the Caribbean LQMS-SIP framework and describes how it will be implemented among various countries in the region to achieve quality improvement. PMID:28879149

  17. Quality Assurance of Quality Assurance Agencies from an Asian Perspective: Regulation, Autonomy and Accountability

    ERIC Educational Resources Information Center

    Hou, Angela Yung-Chi; Ince, Martin; Tsai, Sandy; Chiang, Chung Lin

    2015-01-01

    As quality guardians of higher education, quality assurance agencies are required to guarantee the credibility of the review process and to ensure the objectivity and transparency of their decisions and recommendations. These agencies are therefore expected to use a range of internal and external approaches to prove the quality of their review…

  18. Spacelab Data Processing Facility (SLDPF) quality assurance expert systems development

    NASA Technical Reports Server (NTRS)

    Kelly, Angelita C.; Basile, Lisa; Ames, Troy; Watson, Janice; Dallam, William

    1987-01-01

    Spacelab Data Processing Facility (SLDPF) expert system prototypes were developed to assist in the quality assurance of Spacelab and/or Attached Shuttle Payload (ASP) processed telemetry data. The SLDPF functions include the capturing, quality monitoring, processing, accounting, and forwarding of mission data to various user facilities. Prototypes for the two SLDPF functional elements, the Spacelab Output Processing System and the Spacelab Input Processing Element, are described. The prototypes have produced beneficial results including an increase in analyst productivity, a decrease in the burden of tedious analyses, the consistent evaluation of data, and the providing of concise historical records.

  20. A sound quality model for objective synthesis evaluation of vehicle interior noise based on artificial neural network

    NASA Astrophysics Data System (ADS)

    Wang, Y. S.; Shen, G. Q.; Xing, Y. F.

    2014-03-01

    Based on the artificial neural network (ANN) technique, an objective sound quality evaluation (SQE) model for the synthetical annoyance of vehicle interior noise is presented in this paper. Following the GB/T 18697 standard, the interior noises of a sample vehicle under different working conditions are first measured and saved in a noise database. Mathematical models for the loudness, sharpness and roughness of the measured vehicle noises are established and implemented in Matlab. Sound quality of the vehicle interior noises is also estimated by jury tests following the anchored semantic differential (ASD) procedure. Using the objective and subjective evaluation results, an ANN-based model for synthetical annoyance evaluation of vehicle noises, the so-called ANN-SAE model, is then developed. Finally, the ANN-SAE model is validated by verification tests using the leave-one-out algorithm. The results suggest that the proposed ANN-SAE model is accurate and effective and can be used directly to estimate the sound quality of vehicle interior noises, which is very helpful for vehicle acoustic design and improvement. The ANN-SAE approach may be extended to other sound-related fields for product quality evaluation in SQE engineering.
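
    A minimal sketch of the ANN-SAE idea described above, assuming three psychoacoustic inputs (loudness, sharpness, roughness) and a single annoyance output, trained on made-up jury scores with a small scikit-learn network; the architecture and data are illustrative, not the authors' calibrated model.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

# Made-up training data: [loudness (sone), sharpness (acum), roughness (asper)]
# versus subjective annoyance scores from jury tests (arbitrary scale).
rng = np.random.default_rng(0)
X = rng.uniform([10, 0.8, 0.1], [40, 2.5, 1.0], size=(60, 3))
y = 0.2 * X[:, 0] + 1.5 * X[:, 1] + 3.0 * X[:, 2] + rng.normal(0, 0.3, 60)

model = MLPRegressor(hidden_layer_sizes=(8,), max_iter=5000, random_state=0)
model.fit(X, y)

new_noise = [[25.0, 1.6, 0.4]]      # metrics extracted from a new interior noise
print(f"Predicted annoyance: {model.predict(new_noise)[0]:.1f}")
```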

  1. Quality by Design (QbD) Approach for Development of Co-Processed Excipient Pellets (MOMLETS) By Extrusion-Spheronization Technique.

    PubMed

    Patel, Hetal; Patel, Kishan; Tiwari, Sanjay; Pandey, Sonia; Shah, Shailesh; Gohel, Mukesh

    2016-01-01

    Microcrystalline cellulose (MCC) is an excellent excipient for the production of pellets by extrusion-spheronization. However, it causes a slow release rate of poorly water-soluble drugs from pellets. Co-processed excipients prepared by spray drying (US4744987; US5686107; WO2003051338) and by a coprecipitation technique (WO9517831) are patented. The objective of the present study was to develop co-processed MCC pellets (MOMLETS) by the extrusion-spheronization technique using the principle of Quality by Design (QbD). Co-processed excipient core pellets (MOMLETS) were developed by the extrusion-spheronization technique using a Quality by Design (QbD) approach. A BCS class II drug (telmisartan) was layered onto them in a fluidized bed processor. The Quality Target Product Profile (QTPP) and Critical Quality Attributes (CQAs) for the pellets were identified. Risk assessment was reported using an Ishikawa diagram. A Plackett-Burman design was used to check the effect of seven independent variables (superdisintegrant, extruder speed, ethanol:water ratio, spheronizer speed, extruder screen, pore former, and MCC:lactose ratio) on percentage drug release at 30 min. A Pareto chart and a normal probability plot were constructed to identify the significant factors. A Box-Behnken design (BBD) using the three most significant factors (extruder screen size, type of superdisintegrant and type of pore former) was used as the optimization design. The control space was identified in which the desired quality of the pellets can be obtained. Co-processed excipient core pellets (MOMLETS) were successfully developed by the QbD approach. Versatility, industrial scalability and simplicity are the main features of the proposed research. Copyright© Bentham Science Publishers; For any queries, please email at epub@benthamscience.org.
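
    A minimal sketch of how main effects from a seven-factor, eight-run Plackett-Burman screening design can be estimated, assuming coded ±1 factor levels and made-up response values (% drug release at 30 min); the design generator and the numbers are illustrative, not the study's data.

```python
import numpy as np

# Eight-run Plackett-Burman design for 7 two-level factors (coded -1/+1),
# built here by cyclic shifts of a standard generating row plus a row of -1s.
gen = np.array([1, 1, 1, -1, 1, -1, -1])
design = np.array([np.roll(gen, i) for i in range(7)] + [[-1] * 7])

factors = ["superdisintegrant", "extruder speed", "ethanol:water",
           "spheronizer speed", "extruder screen", "pore former", "MCC:lactose"]
# Made-up responses: % drug release at 30 min for each of the 8 runs.
release = np.array([62, 48, 55, 71, 40, 66, 58, 35], dtype=float)

# Main effect of each factor = mean(high-level runs) - mean(low-level runs).
for name, col in zip(factors, design.T):
    effect = release[col == 1].mean() - release[col == -1].mean()
    print(f"{name:18s} effect = {effect:+.1f}")
```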

  2. Data quality and processing for decision making: divergence between corporate strategy and manufacturing processes

    NASA Astrophysics Data System (ADS)

    McNeil, Ronald D.; Miele, Renato; Shaul, Dennis

    2000-10-01

    Information technology is driving improvements in manufacturing systems, resulting in higher productivity and quality. However, corporate strategy is driven by a number of factors and includes data and pressure from multiple stakeholders, including employees, managers, executives, stockholders, boards, suppliers and customers. It is also driven by information about competitors and emerging technology. Much of this information is based on the processing of data and the resulting biases of the processors. Thus, stakeholders can base inputs on faulty perceptions that are not grounded in reality. Even prior to processing, the data used may be inaccurate. Sources of data and information may include demographic reports, statistical analyses, intelligence reports (e.g., marketing data), technology and primary data collection. The reliability and validity of data, as well as the management of sources and information, are critical elements of strategy formulation. The paper explores data collection, processing and analysis from secondary and primary sources, information generation and report presentation for strategy formulation, and contrasts this with the data and information used to drive internal processes such as manufacturing. The hypothesis is that internal processes, such as manufacturing, are subordinate to corporate strategies. The impact of a possible divergence between the quality of decisions at the corporate level and IT-driven, quality-focused manufacturing processes, as reflected in measurable outcomes, is significant. Recommendations for IT improvements at the corporate strategy level are given.

  3. Demystifying process mapping: a key step in neurosurgical quality improvement initiatives.

    PubMed

    McLaughlin, Nancy; Rodstein, Jennifer; Burke, Michael A; Martin, Neil A

    2014-08-01

    Reliable delivery of optimal care can be challenging for care providers. Health care leaders have integrated various business tools to assist them and their teams in ensuring consistent delivery of safe and top-quality care. The cornerstone to all quality improvement strategies is the detailed understanding of the current state of a process, captured by process mapping. Process mapping empowers caregivers to audit how they are currently delivering care to subsequently strategically plan improvement initiatives. As a community, neurosurgery has clearly shown dedication to enhancing patient safety and delivering quality care. A care redesign strategy named NERVS (Neurosurgery Enhanced Recovery after surgery, Value, and Safety) is currently being developed and piloted within our department. Through this initiative, a multidisciplinary team led by a clinician neurosurgeon has process mapped the way care is currently being delivered throughout the entire episode of care. Neurosurgeons are becoming leaders in quality programs, and their education on the quality improvement strategies and tools is essential. The authors present a comprehensive review of process mapping, demystifying its planning, its building, and its analysis. The particularities of using process maps, initially a business tool, in the health care arena are discussed, and their specific use in an academic neurosurgical department is presented.

  4. Bootstrap Signal-to-Noise Confidence Intervals: An Objective Method for Subject Exclusion and Quality Control in ERP Studies

    PubMed Central

    Parks, Nathan A.; Gannon, Matthew A.; Long, Stephanie M.; Young, Madeleine E.

    2016-01-01

    Analysis of event-related potential (ERP) data includes several steps to ensure that ERPs meet an appropriate level of signal quality. One such step, subject exclusion, rejects subject data if ERP waveforms fail to meet an appropriate level of signal quality. Subject exclusion is an important quality control step in the ERP analysis pipeline as it ensures that statistical inference is based only upon those subjects exhibiting clear evoked brain responses. This critical quality control step is most often performed simply through visual inspection of subject-level ERPs by investigators. Such an approach is qualitative, subjective, and susceptible to investigator bias, as there are no standards as to what constitutes an ERP of sufficient signal quality. Here, we describe a standardized and objective method for quantifying waveform quality in individual subjects and establishing criteria for subject exclusion. The approach uses bootstrap resampling of ERP waveforms (from a pool of all available trials) to compute a signal-to-noise ratio confidence interval (SNR-CI) for individual subject waveforms. The lower bound of this SNR-CI (SNRLB) yields an effective and objective measure of signal quality as it ensures that ERP waveforms statistically exceed a desired signal-to-noise criterion. SNRLB provides a quantifiable metric of individual subject ERP quality and eliminates the need for subjective evaluation of waveform quality by the investigator. We detail the SNR-CI methodology, establish the efficacy of employing this approach with Monte Carlo simulations, and demonstrate its utility in practice when applied to ERP datasets. PMID:26903849
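
    A minimal sketch of the bootstrap SNR confidence-interval idea described above, assuming single-trial ERP epochs are available as a trials × timepoints array; the SNR definition (mean amplitude in a component window over baseline noise) and the inclusion criterion are simplified illustrations of the approach, and the data are synthetic.

```python
import numpy as np

def snr_lower_bound(epochs, window, baseline, n_boot=2000, alpha=0.05, seed=0):
    """Lower bound of the bootstrap SNR confidence interval for one subject.
    epochs: (n_trials, n_timepoints); window/baseline: index slices."""
    rng = np.random.default_rng(seed)
    n_trials = epochs.shape[0]
    snrs = np.empty(n_boot)
    for b in range(n_boot):
        resampled = epochs[rng.integers(0, n_trials, n_trials)]   # resample trials
        erp = resampled.mean(axis=0)                               # bootstrap ERP waveform
        signal = np.abs(erp[window]).mean()
        noise = erp[baseline].std()
        snrs[b] = signal / noise
    return np.percentile(snrs, 100 * alpha / 2)                    # SNR lower bound

# Illustrative synthetic data: 40 trials with a small "component" at samples 100-150.
rng = np.random.default_rng(1)
epochs = rng.normal(0, 5, (40, 300))
epochs[:, 100:150] += 3.0
snr_lb = snr_lower_bound(epochs, slice(100, 150), slice(0, 80))
print(f"SNR_LB = {snr_lb:.2f} -> {'include' if snr_lb > 1.0 else 'exclude'} subject")
```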

  5. Image processing strategies based on saliency segmentation for object recognition under simulated prosthetic vision.

    PubMed

    Li, Heng; Su, Xiaofan; Wang, Jing; Kan, Han; Han, Tingting; Zeng, Yajie; Chai, Xinyu

    2018-01-01

    Current retinal prostheses can only generate low-resolution visual percepts composed of a limited number of phosphenes elicited by an electrode array, with uncontrollable color and restricted grayscale. With this level of visual perception, prosthetic recipients can complete only simple visual tasks, while more complex tasks such as face identification and object recognition are extremely difficult. Therefore, it is necessary to investigate and apply image processing strategies for optimizing the visual perception of the recipients. This study focuses on recognition of the object of interest under simulated prosthetic vision. We used a saliency segmentation method based on a biologically plausible graph-based visual saliency model and a grabCut-based self-adaptive iterative optimization framework to automatically extract foreground objects. Based on this, two image processing strategies, Addition of Separate Pixelization and Background Pixel Shrink, were further utilized to enhance the extracted foreground objects. i) Psychophysical experiments showed that, under simulated prosthetic vision, both strategies had marked advantages over Direct Pixelization in terms of recognition accuracy and efficiency. ii) We also found that recognition performance under the two strategies was tied to the segmentation results and was positively affected by paired, interrelated objects in the scene. The use of the saliency segmentation method and image processing strategies can automatically extract and enhance foreground objects and significantly improve object recognition performance for recipients implanted with a high-density implant. Copyright © 2017 Elsevier B.V. All rights reserved.
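
    A generic sketch of foreground-emphasizing pixelization under simulated prosthetic vision, assuming the foreground mask from segmentation is already available; the `foreground_emphasized` rendering below is an illustrative stand-in and is not a reimplementation of the authors' Addition of Separate Pixelization or Background Pixel Shrink strategies.

```python
import numpy as np

def pixelize(img, block):
    """Down-sample then up-sample to mimic a low-resolution phosphene grid."""
    h, w = img.shape
    small = img[:h - h % block, :w - w % block].reshape(
        h // block, block, w // block, block).mean(axis=(1, 3))
    return np.kron(small, np.ones((block, block)))

def foreground_emphasized(img, mask, fg_block=4, bg_block=16):
    """Illustrative rendering: the segmented object is pixelized finely,
    the background coarsely (hypothetical, not the paper's exact strategies)."""
    fg = pixelize(img, fg_block)
    bg = pixelize(img, bg_block)
    out = bg.copy()
    out[mask] = fg[mask]
    return out

# Illustrative use with a synthetic image and foreground mask.
img = np.random.default_rng(0).uniform(0, 1, (128, 128))
mask = np.zeros((128, 128), dtype=bool); mask[40:90, 40:90] = True
rendered = foreground_emphasized(img, mask)
print(rendered.shape)
```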

  6. [Global immunization policies and recommendations: objectives and process].

    PubMed

    Duclos, Philippe; Okwo-Bele, Jean-Marie

    2007-04-01

    The World Health Organization (WHO) has a dual mandate of providing global policies, standards and norms, as well as support for member countries in applying such policies and standards to national programmes, with the aim of improving health. The vaccine world is changing, and with it the demands and expectations of global and national policy makers, donors, and other interested parties. The changes pertain to new vaccine and technology developments, vaccine safety issues, regulation and approval of vaccines, and increased funding flowing through new financing mechanisms. This places a special responsibility on WHO to respond effectively. WHO has recently reviewed and optimized its policy-making structure for vaccines and immunization and adjusted it to the new Global Immunization Vision and Strategy, which broadens the scope of immunization efforts to all age groups and vaccines, with emphasis on integrating immunization delivery with other health interventions. This includes an extended consultation process to promptly generate evidence-based recommendations, ensuring transparency of the decision-making process, and added communication efforts. This article presents the objectives and impact of the process set up to develop global immunization policies, norms, standards and recommendations. The landscape of key advisory committees contributing to this process is described, including the Strategic Advisory Group of Experts, the Global Advisory Committee on Vaccine Safety and the Expert Committee on Biological Standardization. The elaboration of WHO vaccine position papers is also described.

  7. Image quality and dose differences caused by vendor-specific image processing of neonatal radiographs.

    PubMed

    Sensakovic, William F; O'Dell, M Cody; Letter, Haley; Kohler, Nathan; Rop, Baiywo; Cook, Jane; Logsdon, Gregory; Varich, Laura

    2016-10-01

    Image processing plays an important role in optimizing image quality and radiation dose in projection radiography. Unfortunately commercial algorithms are black boxes that are often left at or near vendor default settings rather than being optimized. We hypothesize that different commercial image-processing systems, when left at or near default settings, create significant differences in image quality. We further hypothesize that image-quality differences can be exploited to produce images of equivalent quality but lower radiation dose. We used a portable radiography system to acquire images on a neonatal chest phantom and recorded the entrance surface air kerma (ESAK). We applied two image-processing systems (Optima XR220amx, by GE Healthcare, Waukesha, WI; and MUSICA(2) by Agfa HealthCare, Mortsel, Belgium) to the images. Seven observers (attending pediatric radiologists and radiology residents) independently assessed image quality using two methods: rating and matching. Image-quality ratings were independently assessed by each observer on a 10-point scale. Matching consisted of each observer matching GE-processed images and Agfa-processed images with equivalent image quality. A total of 210 rating tasks and 42 matching tasks were performed and effective dose was estimated. Median Agfa-processed image-quality ratings were higher than GE-processed ratings. Non-diagnostic ratings were seen over a wider range of doses for GE-processed images than for Agfa-processed images. During matching tasks, observers matched image quality between GE-processed images and Agfa-processed images acquired at a lower effective dose (11 ± 9 μSv; P < 0.0001). Image-processing methods significantly impact perceived image quality. These image-quality differences can be exploited to alter protocols and produce images of equivalent image quality but lower doses. Those purchasing projection radiography systems or third-party image-processing software should be aware that image

  8. Shuttle Mission STS-50: Orbital Processing of High-Quality CdTe Compound Semiconductors Experiment: Final Flight Sample Characterization Report

    NASA Technical Reports Server (NTRS)

    Larson, David J.; Casagrande, Luis G.; DiMarzio, Don; Alexander, J. Iwan D.; Carlson, Fred; Lee, Taipo; Dudley, Michael; Raghathamachar, Balaji

    1998-01-01

    The Orbital Processing of High-Quality Doped and Alloyed CdTe Compound Semiconductors program was initiated to investigate, quantitatively, the influences of gravitationally dependent phenomena on the growth and quality of bulk compound semiconductors. The objective was to improve crystal quality (both structural and compositional) and to better understand and control the variables within the crystal growth production process. The empirical effort entailed the development of a terrestrial (one-g) experiment baseline for quantitative comparison with microgravity (mu-g) results. This effort was supported by the development of high-fidelity process models of heat transfer, fluid flow and solute redistribution, and thermo-mechanical stress occurring in the furnace, safety cartridge, ampoule, and crystal throughout the melting, seeding, crystal growth, and post-solidification processing. In addition, the sensitivity of the orbital experiments was analyzed with respect to the residual microgravity (mu-g) environment, both steady state and g-jitter. CdZnTe crystals were grown in one-g and in mu-g. Crystals processed terrestrially were grown at the NASA Ground Control Experiments Laboratory (GCEL) and at Grumman Aerospace Corporation (now Northrop Grumman Corporation). Two mu-g crystals were grown in the Crystal Growth Furnace (CGF) during the First United States Microgravity Laboratory Mission (USML-1), STS-50, June 24 - July 9, 1992.

  9. Poor sleep quality predicts deficient emotion information processing over time in early adolescence.

    PubMed

    Soffer-Dudek, Nirit; Sadeh, Avi; Dahl, Ronald E; Rosenblat-Stein, Shiran

    2011-11-01

    There is deepening understanding of the effects of sleep on emotional information processing. Emotion information processing is a key aspect of social competence, which undergoes important maturational and developmental changes in adolescence; however, most research in this area has focused on adults. Our aim was to test the links between sleep and emotion information processing during early adolescence. Sleep and facial information processing were assessed objectively during 3 assessment waves, separated by 1-year lags. Data were obtained in natural environments-sleep was assessed in home settings, and facial information processing was assessed at school. 94 healthy children (53 girls, 41 boys), aged 10 years at Time 1. N/A. Facial information processing was tested under neutral (gender identification) and emotional (emotional expression identification) conditions. Sleep was assessed in home settings using actigraphy for 7 nights at each assessment wave. Waking > 5 min was considered a night awakening. Using multilevel modeling, elevated night awakenings and decreased sleep efficiency significantly predicted poor performance only in the emotional information processing condition (e.g., b = -1.79, SD = 0.52, confidence interval: lower boundary = -2.82, upper boundary = -0.076, t(416.94) = -3.42, P = 0.001). Poor sleep quality is associated with compromised emotional information processing during early adolescence, a sensitive period in socio-emotional development.
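
    The multilevel analysis described above could be approximated with a mixed-effects model. The sketch below assumes a long-format table with hypothetical column names (child_id, wave, night_wakings, sleep_eff, emo_accuracy) and is not the authors' exact specification.

```python
import pandas as pd
import statsmodels.formula.api as smf

# Minimal sketch of a multilevel (mixed-effects) model in the spirit of the
# abstract: repeated assessment waves nested within children, with sleep
# quality measures predicting accuracy in the emotional condition.
df = pd.read_csv("sleep_emotion_waves.csv")  # hypothetical file and columns

model = smf.mixedlm(
    "emo_accuracy ~ night_wakings + sleep_eff + wave",
    data=df,
    groups=df["child_id"],   # random intercept per child
)
result = model.fit()
print(result.summary())
```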

  10. Quality Control and Peer Review of Data Sets: Mapping Data Archiving Processes to Data Publication Requirements

    NASA Astrophysics Data System (ADS)

    Mayernik, M. S.; Daniels, M.; Eaker, C.; Strand, G.; Williams, S. F.; Worley, S. J.

    2012-12-01

    Data sets exist within scientific research and knowledge networks as both technical and non-technical entities. Establishing the quality of data sets is a multi-faceted task that encompasses many automated and manual processes. Data sets have always been essential for science research, but now need to be more visible as first-class scholarly objects at national, international, and local levels. Many initiatives are establishing procedures to publish and curate data sets, as well as to promote professional rewards for researchers that collect, create, manage, and preserve data sets. Traditionally, research quality has been assessed by peer review of textual publications, e.g. journal articles, conference proceedings, and books. Citation indices then provide standard measures of productivity used to reward individuals for their peer-reviewed work. Whether a similar peer review process is appropriate for assessing and ensuring the quality of data sets remains as an open question. How does the traditional process of peer review apply to data sets? This presentation will describe current work being done at the National Center for Atmospheric Research (NCAR) in the context of the Peer REview for Publication & Accreditation of Research Data in the Earth sciences (PREPARDE) project. PREPARDE is assessing practices and processes for data peer review, with the goal of developing recommendations. NCAR data management teams perform various kinds of quality assessment and review of data sets prior to making them publicly available. The poster will investigate how notions of peer review relate to the types of data review already in place at NCAR. We highlight the data set characteristics and management/archiving processes that challenge the traditional peer review processes by using a number of questions as probes, including: Who is qualified to review data sets? What formal and informal documentation is necessary to allow someone outside of a research team to review a data set

  11. Position, Possession or Process? Understanding Objective and Subjective Employability during University-to-Work Transitions

    ERIC Educational Resources Information Center

    Okay-Somerville, Belgin; Scholarios, Dora

    2017-01-01

    This article aims to understand predictors of objective (i.e. job offers, employment status and employment quality) and subjective (i.e. perceived) graduate employability during university-to-work transitions. Using survey data from two cohorts of graduates in the UK (N = 293), it contrasts three competing theoretical approaches to employability:…

  12. Multi-objective decision-making under uncertainty: Fuzzy logic methods

    NASA Technical Reports Server (NTRS)

    Hardy, Terry L.

    1994-01-01

    Selecting the best option among alternatives is often a difficult process. This process becomes even more difficult when the evaluation criteria are vague or qualitative, and when the objectives vary in importance and scope. Fuzzy logic allows for quantitative representation of vague or fuzzy objectives, and therefore is well-suited for multi-objective decision-making. This paper presents methods employing fuzzy logic concepts to assist in the decision-making process. In addition, this paper describes software developed at NASA Lewis Research Center for assisting in the decision-making process. Two diverse examples are used to illustrate the use of fuzzy logic in choosing an alternative among many options and objectives. One example is the selection of a lunar lander ascent propulsion system, and the other example is the selection of an aeration system for improving the water quality of the Cuyahoga River in Cleveland, Ohio. The fuzzy logic techniques provided here are powerful tools which complement existing approaches, and therefore should be considered in future decision-making activities.
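
    As a rough illustration of how fuzzy logic quantifies vague objectives for multi-objective selection, the sketch below scores hypothetical alternatives with triangular membership functions and aggregates them conservatively with the min operator; it is not the NASA Lewis software described in the abstract.

```python
import numpy as np

def tri_membership(x, a, b, c):
    """Triangular membership function: degree to which x satisfies a fuzzy goal."""
    return float(np.clip(min((x - a) / (b - a), (c - x) / (c - b)), 0.0, 1.0))

# Hypothetical alternatives scored on two vague objectives ("low cost", "high reliability").
alternatives = {
    "option_A": {"cost": 4.2, "reliability": 0.91},
    "option_B": {"cost": 3.1, "reliability": 0.80},
}

def overall_satisfaction(alt):
    mu_cost = tri_membership(alt["cost"], 0.0, 2.0, 6.0)        # prefer low cost
    mu_rel = tri_membership(alt["reliability"], 0.5, 1.0, 1.5)  # prefer high reliability
    return min(mu_cost, mu_rel)   # conservative (min) aggregation of fuzzy goals

best = max(alternatives, key=lambda k: overall_satisfaction(alternatives[k]))
print(best)
```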

  13. Processed and ultra-processed foods are associated with lower-quality nutrient profiles in children from Colombia.

    PubMed

    Cornwell, Brittany; Villamor, Eduardo; Mora-Plazas, Mercedes; Marin, Constanza; Monteiro, Carlos A; Baylin, Ana

    2018-01-01

    To determine if processed and ultra-processed foods consumed by children in Colombia are associated with lower-quality nutrition profiles than less processed foods. We obtained information on sociodemographic and anthropometric variables and dietary information through dietary records and 24 h recalls from a convenience sample of the Bogotá School Children Cohort. Foods were classified into three categories: (i) unprocessed and minimally processed foods, (ii) processed culinary ingredients and (iii) processed and ultra-processed foods. We also examined the combination of unprocessed foods and processed culinary ingredients. Representative sample of children from low- to middle-income families in Bogotá, Colombia. Children aged 5-12 years in 2011 Bogotá School Children Cohort. We found that processed and ultra-processed foods are of lower dietary quality in general. Nutrients that were lower in processed and ultra-processed foods following adjustment for total energy intake included: n-3 PUFA, vitamins A, B12, C and E, Ca and Zn. Nutrients that were higher in energy-adjusted processed and ultra-processed foods compared with unprocessed foods included: Na, sugar and trans-fatty acids, although we also found that some healthy nutrients, including folate and Fe, were higher in processed and ultra-processed foods compared with unprocessed and minimally processed foods. Processed and ultra-processed foods generally have unhealthy nutrition profiles. Our findings suggest the categorization of foods based on processing characteristics is promising for understanding the influence of food processing on children's dietary quality. More studies accounting for the type and degree of food processing are needed.

  14. The impact of working memory and the “process of process modelling” on model quality: Investigating experienced versus inexperienced modellers

    NASA Astrophysics Data System (ADS)

    Martini, Markus; Pinggera, Jakob; Neurauter, Manuel; Sachse, Pierre; Furtner, Marco R.; Weber, Barbara

    2016-05-01

    A process model (PM) represents the graphical depiction of a business process, for instance, the entire process from online ordering a book until the parcel is delivered to the customer. Knowledge about relevant factors for creating PMs of high quality is lacking. The present study investigated the role of cognitive processes as well as modelling processes in creating a PM in experienced and inexperienced modellers. Specifically, two working memory (WM) functions (holding and processing of information and relational integration) and three process of process modelling phases (comprehension, modelling, and reconciliation) were related to PM quality. Our results show that the WM function of relational integration was positively related to PM quality in both modelling groups. The ratio of comprehension phases was negatively related to PM quality in inexperienced modellers and the ratio of reconciliation phases was positively related to PM quality in experienced modellers. Our research reveals central cognitive mechanisms in process modelling and has potential practical implications for the development of modelling software and teaching the craft of process modelling.

  15. The impact of working memory and the “process of process modelling” on model quality: Investigating experienced versus inexperienced modellers

    PubMed Central

    Martini, Markus; Pinggera, Jakob; Neurauter, Manuel; Sachse, Pierre; Furtner, Marco R.; Weber, Barbara

    2016-01-01

    A process model (PM) represents the graphical depiction of a business process, for instance, the entire process from online ordering a book until the parcel is delivered to the customer. Knowledge about relevant factors for creating PMs of high quality is lacking. The present study investigated the role of cognitive processes as well as modelling processes in creating a PM in experienced and inexperienced modellers. Specifically, two working memory (WM) functions (holding and processing of information and relational integration) and three process of process modelling phases (comprehension, modelling, and reconciliation) were related to PM quality. Our results show that the WM function of relational integration was positively related to PM quality in both modelling groups. The ratio of comprehension phases was negatively related to PM quality in inexperienced modellers and the ratio of reconciliation phases was positively related to PM quality in experienced modellers. Our research reveals central cognitive mechanisms in process modelling and has potential practical implications for the development of modelling software and teaching the craft of process modelling. PMID:27157858

  16. Overall quality and shelf life of minimally processed and modified atmosphere packaged 'ready-to-eat' pomegranate arils.

    PubMed

    Ayhan, Zehra; Eştürk, Okan

    2009-06-01

    Minimally processed ready-to-eat pomegranate arils have become popular due to their convenience, high value, unique sensory characteristics, and health benefits. The objective of this study was to monitor quality parameters and to extend the shelf life of ready-to-eat pomegranate arils packaged with modified atmospheres. Minimally processed pomegranate arils were packed in PP trays sealed with BOPP film under 4 atmospheres including low and super atmospheric oxygen. Packaged arils were stored at 5 degrees C for 18 d and monitored for internal atmosphere and quality attributes. Atmosphere equilibrium was reached for all MAP applications except for high oxygen. As a general trend, slight or no significant change was detected in chemical and physical attributes of pomegranate arils during cold storage. The aerobic mesophilic bacteria were in the range of 2.30 to 4.51 log CFU/g at the end of the storage, which did not affect the sensory quality. Overall, the pomegranate arils packed with air, nitrogen, and enriched oxygen kept quality attributes and were acceptable to sensory panelists on day 18; however, marketability period was limited to 15 d for the low oxygen atmosphere. PP trays sealed with BOPP film combined with either passive or active modified atmospheres and storage at 5 degrees C provided commercially acceptable arils for 18 d with high quality and convenience.

  17. Hospital quality measures: are process indicators associated with hospital standardized mortality ratios in French acute care hospitals?

    PubMed

    Ngantcha, Marcus; Le-Pogam, Marie-Annick; Calmus, Sophie; Grenier, Catherine; Evrard, Isabelle; Lamarche-Vadel, Agathe; Rey, Grégoire

    2017-08-22

    Results of associations between process and mortality indicators, both used for the external assessment of hospital care quality or public reporting, differ strongly across studies. However, most of those studies were conducted in North America or the United Kingdom. Providing new evidence based on French data could fuel the international debate on quality of care indicators and help inform French policy-makers. The objective of our study was to explore whether optimal care delivery in French hospitals as assessed by their Hospital Process Indicators (HPIs) is associated with low Hospital Standardized Mortality Ratios (HSMRs). The French National Authority for Health (HAS) routinely collects, for each hospital located in France, a set of mandatory HPIs. Five HPIs were selected among the process indicators collected by the HAS in 2009. They were measured using random samples of 60 to 80 medical records from inpatients admitted between January 1st, 2009 and December 31, 2009, in accordance with predefined selection criteria. HSMRs were estimated at 30, 60 and 90 days post-admission (dpa) using administrative health data extracted from the national health insurance information system (SNIIR-AM), which covers 77% of the French population. Associations between HPIs and HSMRs were assessed by Poisson regression models corrected for measurement errors with a simulation-extrapolation (SIMEX) method. Most associations studied were not statistically significant. Only two process indicators were found to be associated with HSMRs. Completeness and quality of anesthetic records were negatively associated with the 30 dpa HSMR (0.72 [0.52-0.99]). Early detection of nutritional disorders was negatively associated with all HSMRs: 30 dpa HSMR (0.71 [0.54-0.95]), 60 dpa HSMR (0.51 [0.39-0.67]) and 90 dpa HSMR (0.52 [0.40-0.68]). In the absence of a gold standard for quality-of-care measurement, the limited number of associations suggested to drive in-depth improvements in order to better determine associations
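
    A simplified version of the modelling step described above might look like the following: a hospital-level Poisson regression of observed deaths on a process-indicator score, with log expected deaths as an offset so the coefficient acts on the standardized mortality ratio. The SIMEX measurement-error correction is omitted, and the file and column names are hypothetical.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

# Hypothetical columns: observed_deaths, expected_deaths, hpi_score (one row per hospital).
df = pd.read_csv("hospital_indicators.csv")

model = smf.glm(
    "observed_deaths ~ hpi_score",
    data=df,
    family=sm.families.Poisson(),
    offset=np.log(df["expected_deaths"]),   # log(expected) offset -> SMR-scale coefficients
)
print(model.fit().summary())
```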

  18. Standardizing Quality Assessment of Fused Remotely Sensed Images

    NASA Astrophysics Data System (ADS)

    Pohl, C.; Moellmann, J.; Fries, K.

    2017-09-01

    The multitude of available operational remote sensing satellites has led to the development of many image fusion techniques to provide images of high spatial, spectral and temporal resolution. The comparison of different techniques is necessary to obtain an optimized image for the different applications of remote sensing. There are two approaches to assessing image quality: 1. qualitatively, by visual interpretation, and 2. quantitatively, using image quality indices. However, an objective comparison is difficult because visual assessment is always subjective and quantitative assessment relies on differing criteria. Depending on the criteria and indices, the result varies. Therefore it is necessary to standardize both processes (qualitative and quantitative assessment) in order to allow an objective image fusion quality evaluation. Various studies have been conducted at the University of Osnabrueck (UOS) to establish a standardized process to objectively compare fused image quality. First, established image fusion quality assessment protocols, i.e. Quality with No Reference (QNR) and Khan's protocol, were compared on various fusion experiments. Second, the process of visual quality assessment was structured and standardized with the aim of providing an evaluation protocol. This manuscript reports on the results of the comparison and provides recommendations for future research.
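
    One of the quantitative building blocks behind fusion-quality protocols such as QNR is the Wang-Bovik universal image quality index; a minimal global-statistics version is sketched below. The full QNR spectral- and spatial-distortion terms are not implemented, and the band names in the usage note are illustrative.

```python
import numpy as np

def uiqi(x, y):
    """Wang-Bovik universal image quality index between two single-band images
    (global version, computed over the whole image rather than sliding windows)."""
    x = x.astype(float).ravel()
    y = y.astype(float).ravel()
    mx, my = x.mean(), y.mean()
    vx, vy = x.var(), y.var()
    cov = np.mean((x - mx) * (y - my))
    return 4 * cov * mx * my / ((vx + vy) * (mx**2 + my**2))

# Usage sketch: compare a fused band against the resampled original band.
# q = uiqi(fused_band, original_band_upsampled)
```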

  19. Experimental results supporting the determination of service quality objectives for DBS systems

    NASA Technical Reports Server (NTRS)

    Chouinard, G.; Whyte, W. A., Jr.; Goldberg, A. A.; Jones, B. L.

    1985-01-01

    A summary of the results of a joint United States and Canadian program on subjective measurements of the picture degradation caused by noise and interference on an NTSC encoded color television signal is given in this paper. The effects of system noise, cochannel and adjacent channel interference, and both single entry and aggregate as well as a combination of these types of interference were subjectively evaluated by expert and nonexpert viewers under reference conditions. These results were used to develop the rationale used at RARC '83 to establish the service quality objective for planning the DBS service for the American continents.

  20. Learning to Appraise the Quality of Qualitative Research Articles: A Contextualized Learning Object for Constructing Knowledge

    ERIC Educational Resources Information Center

    Chenail, Ronald J.

    2011-01-01

    Helping beginning qualitative researchers critically appraise qualitative research articles is a common learning objective for introductory methodology courses. To aid students in achieving competency in appraising the quality of qualitative research articles, a multi-part activity incorporating the Critical Appraisal Skills Programme's (CASP)…

  1. Reflections on the Quality Indicator Process

    ERIC Educational Resources Information Center

    Barrett, James R.; Taggart, Germaine

    2011-01-01

    The purpose of this paper is to share a description of the process used by Fort Hays State University (FHSU) as a self-study of the FHSU alternative certification program, known as Transition to Teaching. Team members used the Quality Indicators designed as a part of a Department of Education Transition to Teaching Grant called the KNOTtT Project.…

  2. Improvement of Selected Logistics Processes Using Quality Engineering Tools

    NASA Astrophysics Data System (ADS)

    Zasadzień, Michał; Žarnovský, Jozef

    2018-03-01

    Increase in the number of orders, the increasing quality requirements and the speed of order preparation require implementation of new solutions and improvement of logistics processes. Any disruption that occurs during execution of an order often leads to customer dissatisfaction, as well as loss of his/her confidence. The article presents a case study of the use of quality engineering methods and tools to improve the e-commerce logistic process. This made it possible to identify and prioritize key issues, identify their causes, and formulate improvement and prevention measures.

  3. Quality status display for a vibration welding process

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Spicer, John Patrick; Abell, Jeffrey A.; Wincek, Michael Anthony

    A method includes receiving, during a vibration welding process, a set of sensory signals from a collection of sensors positioned with respect to a work piece during formation of a weld on or within the work piece. The method also includes receiving control signals from a welding controller during the process, with the control signals causing the welding horn to vibrate at a calibrated frequency, and processing the received sensory and control signals using a host machine. Additionally, the method includes displaying a predicted weld quality status on a surface of the work piece using a status projector. The method may include identifying and displaying a quality status of a suspect weld. The laser projector may project a laser beam directly onto or immediately adjacent to the suspect welds, e.g., as a red, green, blue laser or a gas laser having a switched color filter.

  4. [Quality process control system of Chinese medicine preparation based on "holistic view"].

    PubMed

    Wang, Ya-Qi; Jiao, Jiao-Jiao; Wu, Zhen-Feng; Zheng, Qin; Yang, Ming

    2018-01-01

    "High quality, safety and effectiveness" are the primary principles for the pharmaceutical research and development process in China. The quality of products relies not only on the inspection method, but also on the design and development, process control and standardized management. The quality depends on the process control level. In this paper, the history and current development of quality control of traditional Chinese medicine (TCM) preparations are reviewed systematically. Based on the development model of international drug quality control and the misunderstanding of quality control of TCM preparations, the reasons for impacting the homogeneity of TCM preparations are analyzed and summarized. According to TCM characteristics, efforts were made to control the diversity of TCM, make "unstable" TCM into "stable" Chinese patent medicines, put forward the concepts of "holistic view" and "QbD (quality by design)", so as to create the "holistic, modular, data, standardized" model as the core of TCM preparation quality process control model. Scientific studies shall conform to the actual production of TCM preparations, and be conducive to supporting advanced equipment and technology upgrade, thoroughly applying the scientific research achievements in Chinese patent medicines, and promoting the cluster application and transformation application of TCM pharmaceutical technology, so as to improve the quality and effectiveness of the TCM industry and realize the green development. Copyright© by the Chinese Pharmaceutical Association.

  5. Low Quality Natural Gas Sulfur Removal and Recovery CNG Claus Sulfur Recovery Process

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Klint, V.W.; Dale, P.R.; Stephenson, C.

    1997-10-01

    Increased use of natural gas (methane) in the domestic energy market will force the development of large non-producing gas reserves now considered to be low quality. Large reserves of low quality natural gas (LQNG) contaminated with hydrogen sulfide (H₂S), carbon dioxide (CO₂) and nitrogen (N₂) are available but not suitable for treatment using current conventional gas treating methods due to economic and environmental constraints. A group of three technologies have been integrated to allow for processing of these LQNG reserves: the Controlled Freeze Zone (CFZ) process for hydrocarbon / acid gas separation; the Triple Point Crystallizer (TPC) process for H₂S / CO₂ separation; and the CNG Claus process for recovery of elemental sulfur from H₂S. The combined CFZ/TPC/CNG Claus group of processes is one program aimed at developing an alternative gas treating technology which is both economically and environmentally suitable for developing these low quality natural gas reserves. The CFZ/TPC/CNG Claus process is capable of treating low quality natural gas containing >10% CO₂ and measurable levels of H₂S and N₂ to pipeline specifications. The integrated CFZ / CNG Claus Process or the stand-alone CNG Claus Process has a number of attractive features for treating LQNG. The processes are capable of treating raw gas with a variety of trace contaminant components. The processes can also accommodate large changes in raw gas composition and flow rates. The combined processes are capable of achieving virtually undetectable levels of H₂S and significantly less than 2% CO₂ in the product methane. The separation processes operate at pressure and deliver a high pressure (ca. 100 psia) acid gas (H₂S) stream for processing in the CNG Claus unit. This allows for substantial reductions in plant vessel size as compared to conventional Claus / Tail gas treating technologies. A close integration of the components of the CNG

  6. Structure, Process, and Outcome Quality of Surgical Site Infection Surveillance in Switzerland.

    PubMed

    Kuster, Stefan P; Eisenring, Marie-Christine; Sax, Hugo; Troillet, Nicolas

    2017-10-01

    OBJECTIVE To assess the structure and quality of surveillance activities and to validate outcome detection in the Swiss national surgical site infection (SSI) surveillance program. DESIGN Countrywide survey of SSI surveillance quality. SETTING 147 hospitals or hospital units with surgical activities in Switzerland. METHODS Site visits were conducted with on-site structured interviews and review of a random sample of 15 patient records per hospital: 10 from the entire data set and 5 from a subset of patients with originally reported infection. Process and structure were rated in 9 domains with a weighted overall validation score, and sensitivity, specificity, positive predictive value, and negative predictive value were calculated for the identification of SSI. RESULTS Of 50 possible points, the median validation score was 35.5 (range, 16.25-48.5). Public hospitals (P<.001), hospitals in the Italian-speaking region of Switzerland (P=.021), and hospitals with longer participation in the surveillance (P=.018) had higher scores than others. Domains that contributed most to lower scores were quality of chart review and quality of data extraction. Of 49 infections, 15 (30.6%) had been overlooked in a random sample of 1,110 patient records, accounting for a sensitivity of 69.4% (95% confidence interval [CI], 54.6%-81.7%), a specificity of 99.9% (95% CI, 99.5%-100%), a positive predictive value of 97.1% (95% CI, 85.1%-99.9%), and a negative predictive value of 98.6% (95% CI, 97.7%-99.2%). CONCLUSIONS Irrespective of a well-defined surveillance methodology, there is a wide variation of SSI surveillance quality. The quality of chart review and the accuracy of data collection are the main areas for improvement. Infect Control Hosp Epidemiol 2017;38:1172-1181.

  7. A method for the evaluation of image quality according to the recognition effectiveness of objects in the optical remote sensing image using machine learning algorithm.

    PubMed

    Yuan, Tao; Zheng, Xinqi; Hu, Xuan; Zhou, Wei; Wang, Wei

    2014-01-01

    Objective and effective image quality assessment (IQA) is directly related to the application of optical remote sensing images (ORSI). In this study, a new IQA method that uses the target object recognition rate (ORR) as a standardized measure of quality is presented. First, several quality degradation treatments are applied to high-resolution ORSIs to model ORSIs obtained under different imaging conditions; then, a machine learning algorithm is adopted for recognition experiments on a chosen target object to obtain ORRs; finally, a comparison with commonly used IQA indicators is performed to reveal their applicability and limitations. The results showed that the ORR of the original ORSI was 81.95%, whereas the ORR ratios of the quality-degraded images to the original images were 65.52%, 64.58%, 71.21%, and 73.11%. These data reflect the advantages and disadvantages of different images for object identification and information extraction more accurately than conventional digital image assessment indexes. By recognizing differences in image quality from the perspective of application effect, using a machine learning algorithm to extract regional gray-scale features of typical objects in the image, and quantitatively assessing ORSI quality according to those differences, this method provides a new approach for objective ORSI assessment.
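
    A rough sketch of the recognition-rate idea, under the assumption that gray-scale feature vectors for target-object patches have already been extracted. The classifier choice and all variable names are illustrative, not the authors' exact pipeline.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

def orr(clf, X, y):
    """Object recognition rate: fraction of patches classified correctly."""
    return float(np.mean(clf.predict(X) == y))

def orr_ratios(X_orig, y_orig, degraded):
    """Train on original-quality patches, then express each degraded set's ORR
    as a ratio to the original ORR. `degraded` maps a degradation name to
    (features, labels) for held-out patches."""
    X_tr, X_te, y_tr, y_te = train_test_split(X_orig, y_orig, test_size=0.3, random_state=0)
    clf = SVC(kernel="rbf").fit(X_tr, y_tr)
    baseline = orr(clf, X_te, y_te)
    return {name: orr(clf, Xd, yd) / baseline for name, (Xd, yd) in degraded.items()}
```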

  8. Unaware Processing of Tools in the Neural System for Object-Directed Action Representation.

    PubMed

    Tettamanti, Marco; Conca, Francesca; Falini, Andrea; Perani, Daniela

    2017-11-01

    The hypothesis that the brain constitutively encodes observed manipulable objects for the actions they afford is still debated. Yet, crucial evidence demonstrating that, even in the absence of perceptual awareness, the mere visual appearance of a manipulable object triggers a visuomotor coding in the action representation system including the premotor cortex, has hitherto not been provided. In this fMRI study, we instantiated reliable unaware visual perception conditions by means of continuous flash suppression, and we tested in 24 healthy human participants (13 females) whether the visuomotor object-directed action representation system that includes left-hemispheric premotor, parietal, and posterior temporal cortices is activated even under subliminal perceptual conditions. We found consistent activation in the target visuomotor cortices, both with and without perceptual awareness, specifically for pictures of manipulable versus non-manipulable objects. By means of a multivariate searchlight analysis, we also found that the brain activation patterns in this visuomotor network enabled the decoding of manipulable versus non-manipulable object picture processing, both with and without awareness. These findings demonstrate the intimate neural coupling between visual perception and motor representation that underlies manipulable object processing: manipulable object stimuli specifically engage the visuomotor object-directed action representation system, in a constitutive manner that is independent from perceptual awareness. This perceptuo-motor coupling endows the brain with an efficient mechanism for monitoring and planning reactions to external stimuli in the absence of awareness. SIGNIFICANCE STATEMENT Our brain constantly encodes the visual information that hits the retina, leading to a stimulus-specific activation of sensory and semantic representations, even for objects that we do not consciously perceive. Do these unconscious representations encompass the motor

  9. A conceptual persistent healthcare quality improvement process for software development management.

    PubMed

    Lin, Jen-Chiun; Su, Mei-Ju; Cheng, Po-Hsun; Weng, Yung-Chien; Chen, Sao-Jie; Lai, Jin-Shin; Lai, Feipei

    2007-01-01

    This paper illustrates a sustained conceptual service quality improvement process for the management of software development within a healthcare enterprise. Our proposed process is revised from Niland's healthcare quality information system (HQIS). This process includes functions to survey the satisfaction of system functions, describe the operation bylaws on-line, and provide on-demand training. To achieve these goals, we integrate five information systems in National Taiwan University Hospital, including healthcare information systems, health quality information system, requirement management system, executive information system, and digital learning system, to form a full Deming cycle. A preliminary user satisfaction survey showed that our outpatient information system scored an average of 71.31 in 2006.

  10. The use of UV-visible reflectance spectroscopy as an objective tool to evaluate pearl quality.

    PubMed

    Agatonovic-Kustrin, Snezana; Morton, David W

    2012-07-01

    Assessing the quality of pearls involves the use of various tools and methods, which are mainly visual and often quite subjective. Pearls are normally classified by origin and are then graded by luster, nacre thickness, surface quality, size, color and shape. The aim of this study was to investigate the capacity of Artificial Neural Networks (ANNs) to classify and estimate the quality of 27 different pearls from their UV-Visible spectra. Due to the opaque nature of pearls, spectroscopy measurements were performed using the Diffuse Reflectance UV-Visible spectroscopy technique. The spectra were acquired at two different locations on each pearl sample in order to assess surface homogeneity. The spectral data (inputs) were smoothed to reduce the noise, fed into ANNs and correlated to the pearl's quality/grading criteria (outputs). The developed ANNs were successful in predicting pearl type, mollusk growing species, possible luster and color enhancing, donor condition/type, recipient/host color, donor color, pearl luster, pearl color, and origin. The results of this study show that the developed UV-Vis spectroscopy-ANN method could be used as a more objective method of assessing pearl quality (grading) and may become a valuable tool for the pearl grading industry.
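
    A minimal sketch of the spectra-to-grade workflow described above: smooth the diffuse-reflectance spectra, then fit a small feed-forward network to one grading attribute. The preprocessing choices, network size and array names are assumptions, not the study's actual model.

```python
import numpy as np
from scipy.signal import savgol_filter
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

def train_grader(spectra, labels):
    """spectra: (n_pearls, n_wavelengths) reflectance values;
    labels: one grading class per pearl (e.g. pearl color)."""
    smoothed = savgol_filter(spectra, window_length=11, polyorder=2, axis=1)  # noise reduction
    model = make_pipeline(
        StandardScaler(),
        MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000, random_state=0),
    )
    return model.fit(smoothed, labels)
```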

  11. Characterizing Objective Quality of Life and Normative Outcomes in Adults with Autism Spectrum Disorder: An Exploratory Latent Class Analysis

    ERIC Educational Resources Information Center

    Bishop-Fitzpatrick, Lauren; Hong, Jinkuk; Smith, Leann E.; Makuch, Renee A.; Greenberg, Jan S.; Mailick, Marsha R.

    2016-01-01

    This study aims to extend the definition of quality of life (QoL) for adults with autism spectrum disorder (ASD, n = 180, ages 23-60) by: (1) characterizing the heterogeneity of normative outcomes (employment, independent living, social engagement) and objective QoL (physical health, neighborhood quality, family contact, mental health issues); and…

  12. Conducting remote bioanalytical data monitoring and review based on scientific quality objectives.

    PubMed

    He, Ling

    2011-07-01

    For bioanalytical laboratories that follow GLP regulations and generate data for new drug filing, ensuring quality standards set by regulatory guidance is a fundamental expectation. Numerous guidelines and White Papers have been published by regulatory agencies, professional working groups and field experts in the past two decades, and have significantly improved the standards of good practices for bioanalysis. From a sponsor's perspective, continuous quality monitoring of the data generated by CRO laboratories, identifying adverse trends and taking corrective and preventative actions against issues encountered, are critical aspects of effective bioanalytical outsourcing management. This is especially important for clinical bioanalysis, where one validated assay is applied for analyzing a large number of samples of diverse demographics and disease states. This perspective article presents thoughts toward remote data monitoring and its merits for scientific quality oversight, and introduces a novel Bioanalytical Data Review software that was custom-developed and platform-neutral, to conduct remote data monitoring on raw or processed LC-MS/MS data from CROs. Flexible, adaptive and user-customizable queries are applied for conducting project-, batch- and sample-level data review based on scientific quality performance factors commonly assessed for good bioanalytical practice.

  13. Development of Tool Representations in the Dorsal and Ventral Visual Object Processing Pathways

    PubMed Central

    Kersey, Alyssa J.; Clark, Tyia S.; Lussier, Courtney A.; Mahon, Bradford Z.; Cantlon, Jessica F.

    2016-01-01

    Tools represent a special class of objects, because they are processed across both the dorsal and ventral visual object processing pathways. Three core regions are known to be involved in tool processing: the left posterior middle temporal gyrus, the medial fusiform gyrus (bilaterally), and the left inferior parietal lobule. A critical and relatively unexplored issue concerns whether, in development, tool preferences emerge at the same time and to a similar degree across all regions of the tool-processing network. To test this issue, we used functional magnetic resonance imaging to measure the neural amplitude, peak location, and the dispersion of tool-related neural responses in the youngest sample of children tested to date in this domain (ages 4–8 years). We show that children recruit overlapping regions of the adult tool-processing network and also exhibit similar patterns of co-activation across the network to adults. The amplitude and co-activation data show that the core components of the tool-processing network are established by age 4. Our findings on the distributions of peak location and dispersion of activation indicate that the tool network undergoes refinement between ages 4 and 8 years. PMID:26108614

  14. Object and technologies in the working process of an itinerant team in mental health.

    PubMed

    Eslabão, Adriane Domingues; Pinho, Leandro Barbosa de; Coimbra, Valéria Cristina Christello; Lima, Maria Alice Dias da Silva; Camatta, Marcio Wagner; Santos, Elitiele Ortiz Dos

    2017-01-01

    Objective To analyze the work object and the technologies in the working process of a Mental Health Itinerant Team in the care of drug users. Methods Qualitative case study, carried out in a municipality in the south of Brazil. The theoretical framework was the Healthcare Labor Process. Data were collected through participant observation and semi-structured interviews with the professionals of an itinerant team in 2015. Thematic Content Analysis was used for data analysis. Results In the first empirical category - work object - the user is taken as the focus, which brings new challenges to the team's relationship with the care network. In the second category - technologies of the work process - the potentialities and contradictions of the team's work tools are highlighted. Conclusions As an innovation in the mental health context, the itinerant team brings real possibilities to reinvent care for drug users, as well as new institutional challenges.

  15. Analytic hierarchy process-based approach for selecting a Pareto-optimal solution of a multi-objective, multi-site supply-chain planning problem

    NASA Astrophysics Data System (ADS)

    Ayadi, Omar; Felfel, Houssem; Masmoudi, Faouzi

    2017-07-01

    The current manufacturing environment has changed from traditional single-plant to multi-site supply chain where multiple plants are serving customer demands. In this article, a tactical multi-objective, multi-period, multi-product, multi-site supply-chain planning problem is proposed. A corresponding optimization model aiming to simultaneously minimize the total cost, maximize product quality and maximize the customer satisfaction demand level is developed. The proposed solution approach yields to a front of Pareto-optimal solutions that represents the trade-offs among the different objectives. Subsequently, the analytic hierarchy process method is applied to select the best Pareto-optimal solution according to the preferences of the decision maker. The robustness of the solutions and the proposed approach are discussed based on a sensitivity analysis and an application to a real case from the textile and apparel industry.
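
    A compact sketch of the AHP step described above: derive criterion weights from a pairwise-comparison matrix via the principal eigenvector and use them to rank the Pareto-optimal solutions. The judgment matrix and the normalized objective scores are invented for illustration.

```python
import numpy as np

def ahp_weights(pairwise):
    """Priority weights from an AHP pairwise-comparison matrix (principal eigenvector)."""
    vals, vecs = np.linalg.eig(np.asarray(pairwise, dtype=float))
    w = np.real(vecs[:, np.argmax(np.real(vals))])
    return w / w.sum()

# Hypothetical decision-maker judgments over (total cost, product quality, service level).
A = [[1,   3,   5],
     [1/3, 1,   2],
     [1/5, 1/2, 1]]
w = ahp_weights(A)

# Rank Pareto-optimal plans by their weighted, normalized objective scores.
pareto = np.array([[0.8, 0.6, 0.7],   # each row: one Pareto solution
                   [0.6, 0.9, 0.5]])
best = int(np.argmax(pareto @ w))
print(best, w)
```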

  16. Electrophysiological evidence for separation between human face and non-face object processing only in the right hemisphere.

    PubMed

    Niina, Megumi; Okamura, Jun-ya; Wang, Gang

    2015-10-01

    Scalp event-related potential (ERP) studies have demonstrated larger N170 amplitudes when subjects view faces compared to items from object categories. Extensive attempts have been made to clarify face selectivity and hemispheric dominance for face processing. The purpose of this study was to investigate hemispheric differences in N170s activated by human faces and non-face objects, as well as the extent of overlap of their sources. ERP was recorded from 20 subjects while they viewed human face and non-face images. N170s obtained during the presentation of human faces appeared earlier and with larger amplitude than for other category images. Further source analysis with a two-dipole model revealed that the locations of face and object processing largely overlapped in the left hemisphere. Conversely, the source for face processing in the right hemisphere was located more anteriorly than the source for object processing. The results suggest that the neuronal circuits for face and object processing are largely shared in the left hemisphere, with more distinct circuits in the right hemisphere. Copyright © 2015 Elsevier B.V. All rights reserved.

  17. Dissociable intrinsic functional networks support noun-object and verb-action processing.

    PubMed

    Yang, Huichao; Lin, Qixiang; Han, Zaizhu; Li, Hongyu; Song, Luping; Chen, Lingjuan; He, Yong; Bi, Yanchao

    2017-12-01

    The processing mechanism of verbs-actions and nouns-objects is a central topic of language research, with robust evidence for behavioral dissociation. The neural basis for these two major word and/or conceptual classes, however, remains controversial. Two experiments were conducted to study this question from the network perspective. Experiment 1 found that nodes of the same class, obtained through task-evoked brain imaging meta-analyses, were more strongly connected with each other than nodes of different classes during resting-state, forming segregated network modules. Experiment 2 examined the behavioral relevance of these intrinsic networks using data from 88 brain-damaged patients, finding that across patients the relative strength of functional connectivity of the two networks significantly correlated with the noun-object vs. verb-action relative behavioral performances. In summary, we found that verbs-actions and nouns-objects are supported by separable intrinsic functional networks and that the integrity of such networks accounts for the relative noun-object- and verb-action-selective deficits. Copyright © 2017 Elsevier Inc. All rights reserved.

  18. [Method for the quality assessment of data collection processes in epidemiological studies].

    PubMed

    Schöne, G; Damerow, S; Hölling, H; Houben, R; Gabrys, L

    2017-10-01

    For the quantitative evaluation of primary data collection processes in epidemiological surveys based on in-field accompaniment and observation, the relevant literature describes no test criteria or methodologies, and thus there is no known application in practice. Therefore, methods need to be developed and existing procedures adapted. The aim was to identify quality-relevant developments within quality dimensions by means of inspection points (quality indicators) during the process of data collection. As a result, we seek to implement and establish a methodology for assessing overall survey quality that is supplementary to standardized data analyses. Monitors detect deviations from the standard primary data collection procedure during site visits by applying standardized checklists. Quantitative results - overall and for each dimension - are obtained by numerical calculation of quality indicators. Score results are categorized and color coded. This visual prioritization indicates the need for intervention. The results obtained give clues regarding the current quality of data collection. This allows the identification of those sections where interventions for quality improvement are needed. In addition, the development of process quality can be tracked over time on a comparable basis. This methodology for evaluating data collection quality can identify deviations from norms, focus quality analyses, and help trace the causes of significant deviations.

  19. Radiographic Film Processing Quality Assurance: A Self-Teaching Workbook. Quality Assurance Series.

    ERIC Educational Resources Information Center

    Goldman, Lee W.

    This workbook has been designed for use in conjunction with the manual, "Photographic Quality Assurance in Diagnostic Radiology, Nuclear Medicine and Radiation Therapy." Presented are several typical problems arising from the existence of variability and fluctuations in the automatic processing of radiographs, which unless corrected, can…

  20. Performance bounds for matched field processing in subsurface object detection applications

    NASA Astrophysics Data System (ADS)

    Sahin, Adnan; Miller, Eric L.

    1998-09-01

    In recent years there has been considerable interest in the use of ground penetrating radar (GPR) for the non-invasive detection and localization of buried objects. In a previous work, we have considered the use of high resolution array processing methods for solving these problems for measurement geometries in which an array of electromagnetic receivers observes the fields scattered by the subsurface targets in response to a plane wave illumination. Our approach uses the MUSIC algorithm in a matched field processing (MFP) scheme to determine both the range and the bearing of the objects. In this paper we derive the Cramer-Rao bounds (CRB) for this MUSIC-based approach analytically. Analysis of the theoretical CRB has shown that there exists an optimum inter-element spacing of array elements for which the CRB is minimum. Furthermore, the optimum inter-element spacing minimizing CRB is smaller than the conventional half wavelength criterion. The theoretical bounds are then verified for two estimators using Monte-Carlo simulations. The first estimator is the MUSIC-based MFP and the second one is the maximum likelihood based MFP. The two approaches differ in the cost functions they optimize. We observe that Monte-Carlo simulated error variances always lie above the values established by CRB. Finally, we evaluate the performance of our MUSIC-based algorithm in the presence of model mismatches. Since the detection algorithm strongly depends on the model used, we have tested the performance of the algorithm when the object radius used in the model is different from the true radius. This analysis reveals that the algorithm is still capable of localizing the objects with a bias depending on the degree of mismatch.
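
    For orientation, a minimal narrowband MUSIC pseudospectrum is sketched below, with a plain uniform-linear-array steering vector standing in for the matched-field forward model of subsurface scattering used in the paper. The array geometry and parameters are illustrative only.

```python
import numpy as np

def ula_steering(theta, n_elems=8, d_over_lambda=0.5):
    """Steering vector of a uniform linear array for arrival angle theta (radians)."""
    phase = 2 * np.pi * d_over_lambda * np.arange(n_elems) * np.sin(theta)
    return np.exp(1j * phase)

def music_spectrum(R, n_sources, steering=ula_steering):
    """MUSIC pseudospectrum: R is the array covariance matrix; peaks of the
    returned spectrum indicate candidate bearings."""
    vals, vecs = np.linalg.eigh(R)                 # eigenvalues in ascending order
    En = vecs[:, : R.shape[0] - n_sources]         # noise subspace (smallest eigenvalues)
    thetas = np.linspace(-np.pi / 2, np.pi / 2, 361)
    p = []
    for th in thetas:
        a = steering(th)
        p.append(1.0 / np.real(a.conj() @ En @ En.conj().T @ a))
    return thetas, np.array(p)
```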

  1. Procedural justice and quality of life in compensation processes.

    PubMed

    Elbers, Nieke A; Akkermans, Arno J; Cuijpers, Pim; Bruinvels, David J

    2013-11-01

    There is considerable evidence that being involved in compensation processes has a negative impact on claimants' health. Previous studies suggested that this negative effect is caused by a stressful compensation process: claimants suffered from a lack of communication, a lack of information, and feelings of distrust. However, these rather qualitative findings have not been quantitatively investigated yet. This observational study aimed to fill this gap of knowledge, investigating the claimants' perceived fairness of the compensation process, the provided information, and the interaction with lawyers and insurance companies, in relation to the claimants' quality of life. Participants were individuals injured in traffic accidents, older than 18 years, who were involved in a compensation process in the Netherlands. They were recruited by three claims settlement offices. Outcome measures were procedural, interactional, and informational justice, and quality of life. Participants (n=176) perceived the interaction with lawyers to be fairer than the interaction with insurance companies (p<.001). The length of hospital stay was positively associated with procedural justice (β=.31, p<.001). Having trunk/back injury was negatively related to procedural justice (β=-.25, p=.001). Whiplash injury and length of time involved in the claim process were not associated with any of the justice scales. Finally, procedural justice was found to be positively correlated with quality of life (rs=.22, p=.004). The finding that the interaction with insurance companies was considered less fair than the interaction with lawyers may imply that insurers could improve their interaction with claimants, e.g. by communicating more directly. The result that claimants with mild injuries and with trunk/back injuries considered the compensation process to be less fair than those with respectively severe injuries and injuries to other body parts suggests that especially the former two require an

  2. Processing ser and estar to locate objects and events

    PubMed Central

    Dussias, Paola E.; Contemori, Carla; Román, Patricia

    2016-01-01

    In Spanish locative constructions, a different form of the copula is selected in relation to the semantic properties of the grammatical subject: sentences that locate objects require estar while those that locate events require ser (both translated in English as ‘to be’). In an ERP study, we examined whether second language (L2) speakers of Spanish are sensitive to the selectional restrictions that the different types of subjects impose on the choice of the two copulas. Twenty-four native speakers of Spanish and two groups of L2 Spanish speakers (24 beginners and 18 advanced speakers) were recruited to investigate the processing of ‘object/event + estar/ser’ permutations. Participants provided grammaticality judgments on correct (object + estar; event + ser) and incorrect (object + ser; event + estar) sentences while their brain activity was recorded. In line with previous studies (Leone-Fernández, Molinaro, Carreiras, & Barber, 2012; Sera, Gathje, & Pintado, 1999), the results of the grammaticality judgment for the native speakers showed that participants correctly accepted object + estar and event + ser constructions. In addition, while ‘object + ser’ constructions were considered grossly ungrammatical, ‘event + estar’ combinations were perceived as unacceptable to a lesser degree. For these same participants, ERP recording time-locked to the onset of the critical word ‘en’ showed a larger P600 for the ser predicates when the subject was an object than when it was an event (*La silla es en la cocina vs. La fiesta es en la cocina). This P600 effect is consistent with syntactic repair of the defining predicate when it does not fit with the adequate semantic properties of the subject. For estar predicates (La silla está en la cocina vs. *La fiesta está en la cocina), the findings showed a central-frontal negativity between 500–700 ms. Grammaticality judgment data for the L2 speakers of Spanish showed that beginners were significantly less

  3. Object recognition through turbulence with a modified plenoptic camera

    NASA Astrophysics Data System (ADS)

    Wu, Chensheng; Ko, Jonathan; Davis, Christopher

    2015-03-01

    Atmospheric turbulence adds accumulated distortion to images obtained by cameras and surveillance systems. When the turbulence grows stronger or when the object is further away from the observer, increasing the recording device resolution helps little to improve the quality of the image. Many sophisticated methods to correct the distorted images have been invented, such as using a known feature on or near the target object to perform a deconvolution process, or use of adaptive optics. However, most of the methods depend heavily on the object's location, and optical ray propagation through the turbulence is not directly considered. Alternatively, selecting a lucky image over many frames provides a feasible solution, but at the cost of time. In our work, we propose an innovative approach to improving image quality through turbulence by making use of a modified plenoptic camera. This type of camera adds a micro-lens array to a traditional high-resolution camera to form a semi-camera array that records duplicate copies of the object as well as "superimposed" turbulence at slightly different angles. By performing several steps of image reconstruction, turbulence effects will be suppressed to reveal more details of the object independently (without finding references near the object). Meanwhile, the redundant information obtained by the plenoptic camera raises the possibility of performing lucky image algorithmic analysis with fewer frames, which is more efficient. In our work, the details of our modified plenoptic cameras and image processing algorithms will be introduced. The proposed method can be applied to coherently illuminated objects as well as incoherently illuminated objects. Our result shows that the turbulence effect can be effectively suppressed by the plenoptic camera in the hardware layer and a reconstructed "lucky image" can help the viewer identify the object even when a "lucky image" by ordinary cameras is not achievable.
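
    The lucky-image selection that the abstract contrasts against can be sketched simply: rank frames by a sharpness score and average the sharpest fraction. The plenoptic reconstruction itself is not reproduced here, and the variance-of-Laplacian sharpness metric is one common choice, not necessarily the authors'.

```python
import cv2
import numpy as np

def lucky_frames(frames, keep=0.1):
    """Rank BGR frames by variance of the Laplacian (a sharpness proxy) and
    average the sharpest fraction of them."""
    scores = [cv2.Laplacian(cv2.cvtColor(f, cv2.COLOR_BGR2GRAY), cv2.CV_64F).var()
              for f in frames]
    order = np.argsort(scores)[::-1]                  # sharpest first
    n = max(1, int(keep * len(frames)))
    stack = [frames[i].astype(float) for i in order[:n]]
    return np.mean(stack, axis=0).astype(np.uint8)
```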

  4. Evaluating supplier quality performance using analytical hierarchy process

    NASA Astrophysics Data System (ADS)

    Kalimuthu Rajoo, Shanmugam Sundram; Kasim, Maznah Mat; Ahmad, Nazihah

    2013-09-01

    This paper elaborates the importance of evaluating supplier quality performance to an organization. Supplier quality performance evaluation reflects the actual performance of the supplier as exhibited at the customer's end. It is critical in enabling the organization to determine areas for improvement and thereafter work with the supplier to close the gaps. The customer's success partly depends on the supplier's quality performance. Key criteria such as quality, cost, delivery, technology support and customer service are categorized as the main factors contributing to supplier quality performance. Eighteen suppliers manufacturing automotive application parts were evaluated in 2010 using a weight point system. A few suppliers received common ratings, which led to tied rankings. The Analytical Hierarchy Process (AHP), a user-friendly decision-making tool for complex, multi-criteria problems, was used to evaluate the suppliers' quality performance, challenging the weight point system that had been used for the 18 suppliers. The consistency ratio was checked for criteria and sub-criteria. The final AHP results contained no overlapping ratings and therefore yielded a better decision-making methodology compared with the weight point rating system.
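
    The consistency check mentioned above can be sketched as follows, using Saaty's tabulated random-consistency indices; the example pairwise-comparison matrix over the five criteria is invented for illustration.

```python
import numpy as np

# Saaty's random-consistency index (RI) for n x n comparison matrices.
RI = {1: 0.0, 2: 0.0, 3: 0.58, 4: 0.90, 5: 1.12, 6: 1.24, 7: 1.32, 8: 1.41}

def consistency_ratio(pairwise):
    """CR = CI / RI, where CI = (lambda_max - n) / (n - 1); a CR below roughly
    0.10 is conventionally judged acceptably consistent."""
    A = np.asarray(pairwise, dtype=float)
    n = A.shape[0]
    lam_max = float(np.max(np.real(np.linalg.eigvals(A))))
    ci = (lam_max - n) / (n - 1)
    return ci / RI[n]

# Hypothetical judgments over (quality, cost, delivery, technology support, customer service).
A = [[1,   3,   5,   7, 7],
     [1/3, 1,   3,   5, 5],
     [1/5, 1/3, 1,   3, 3],
     [1/7, 1/5, 1/3, 1, 1],
     [1/7, 1/5, 1/3, 1, 1]]
print(round(consistency_ratio(A), 3))
```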

  5. Multi-objective decision-making under uncertainty: Fuzzy logic methods

    NASA Technical Reports Server (NTRS)

    Hardy, Terry L.

    1995-01-01

    Fuzzy logic allows for quantitative representation of vague or fuzzy objectives and is therefore well suited for multi-objective decision-making. This paper presents methods employing fuzzy logic concepts to assist in the decision-making process, and describes software developed at NASA Lewis Research Center to support that process. Two diverse examples illustrate the use of fuzzy logic in choosing an alternative among many options and objectives: the selection of a lunar lander ascent propulsion system, and the selection of an aeration system for improving the water quality of the Cuyahoga River in Cleveland, Ohio. The fuzzy logic techniques provided here are powerful tools which complement existing approaches, and therefore should be considered in future decision-making activities.
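
    A minimal sketch of the fuzzy multi-objective idea, assuming triangular membership functions and a conservative min (fuzzy AND) aggregation; the alternatives, criterion values, and cut-off points are invented and are not taken from the NASA examples.

      # Toy fuzzy multi-objective ranking: score each alternative on each criterion
      # with a triangular membership function, aggregate with min, pick the best.
      def tri(x, a, b, c):
          """Triangular membership: 0 at a and c, 1 at b."""
          if x <= a or x >= c:
              return 0.0
          return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

      # Each alternative scored on (cost in $M, performance index); values are invented.
      alternatives = {"option_A": (4.0, 0.80), "option_B": (5.5, 0.92), "option_C": (3.2, 0.65)}

      def desirability(cost, perf):
          low_cost = tri(cost, 2.0, 3.0, 6.0)        # prefer cost near $3M
          high_perf = tri(perf, 0.6, 1.0, 1.4)       # prefer performance near 1.0
          return min(low_cost, high_perf)            # conservative fuzzy AND

      ranked = sorted(alternatives.items(), key=lambda kv: desirability(*kv[1]), reverse=True)
      print(ranked)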

  6. Quality-by-design III: application of near-infrared spectroscopy to monitor roller compaction in-process and product quality attributes of immediate release tablets.

    PubMed

    Kona, Ravikanth; Fahmy, Raafat M; Claycamp, Gregg; Polli, James E; Martinez, Marilyn; Hoag, Stephen W

    2015-02-01

    The objective of this study is to use near-infrared spectroscopy (NIRS) coupled with multivariate chemometric models to monitor granule and tablet quality attributes in the formulation development and manufacturing of ciprofloxacin hydrochloride (CIP) immediate-release tablets. Critical roller compaction process parameters, compression force (CFt), and formulation variables identified in our earlier studies were evaluated in more detail. Multivariate principal component analysis (PCA) and partial least squares (PLS) models were developed during the development stage and used as a control tool to predict the quality of granules and tablets. Validated models were used to monitor and control batches manufactured at different sites to assess their robustness to change. The results showed that roll pressure (RP) and CFt played a critical role in the quality of the granules and the finished product within the range tested. Replacing the binder source did not statistically influence the quality attributes of the granules and tablets; however, lubricant type significantly impacted granule size. Blend uniformity, crushing force, and disintegration time during manufacturing were predicted using validated PLS regression models with acceptable standard error of prediction (SEP) values, whereas the models yielded higher SEP for batches obtained from a different manufacturing site. From this study, we were able to identify critical factors that could impact the quality attributes of the CIP IR tablets. In summary, we demonstrated the ability of near-infrared spectroscopy coupled with chemometrics to serve as a powerful tool for monitoring the critical quality attributes (CQAs) identified during formulation development.
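
    A hedged sketch of the chemometric step: a PLS model mapping NIR-like spectra to a tablet quality attribute, here built on synthetic data with scikit-learn's PLSRegression; the real study used validated models trained on measured spectra, and the variable names below are placeholders.

      # Sketch: PLS regression from synthetic "spectra" to a quality attribute.
      import numpy as np
      from sklearn.cross_decomposition import PLSRegression

      rng = np.random.default_rng(1)
      X = rng.normal(size=(40, 200))                              # 40 spectra, 200 wavelengths
      true_w = rng.normal(size=200)
      y = X @ true_w * 0.01 + rng.normal(scale=0.1, size=40)      # simulated attribute (e.g., crushing force)

      pls = PLSRegression(n_components=3)
      pls.fit(X, y)
      y_hat = pls.predict(X).ravel()
      sep = float(np.sqrt(np.mean((y - y_hat) ** 2)))             # standard error of prediction (on fit data)
      print("SEP (fit data):", round(sep, 3))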

  7. Refocusing-range and image-quality enhanced optical reconstruction of 3-D objects from integral images using a principal periodic δ-function array

    NASA Astrophysics Data System (ADS)

    Ai, Lingyu; Kim, Eun-Soo

    2018-03-01

    We propose a method for refocusing-range and image-quality enhanced optical reconstruction of three-dimensional (3-D) objects from integral images using only a 3 × 3 periodic δ-function array (PDFA), called a principal PDFA (P-PDFA). By directly convolving the elemental image array (EIA) captured from 3-D objects with P-PDFAs whose spatial periods correspond to each object's depth, a set of spatially filtered EIAs (SF-EIAs) is extracted, from which 3-D objects can be reconstructed refocused at their real depths. Because the convolution operations are performed directly on each minimal 3 × 3 set of EIs of the picked-up EIA, the capturing and refocused-depth ranges of the 3-D objects can be greatly enhanced, and 3-D objects with much improved image quality can be reconstructed without any preprocessing operations. Through ray-optical analysis and optical experiments with actual 3-D objects, the feasibility of the proposed method has been confirmed.

  8. Small target detection using objectness and saliency

    NASA Astrophysics Data System (ADS)

    Zhang, Naiwen; Xiao, Yang; Fang, Zhiwen; Yang, Jian; Wang, Li; Li, Tao

    2017-10-01

    We are motivated by the need for a generic object detection algorithm that achieves high recall for small targets in complex scenes with acceptable computational efficiency. We propose a novel object detection algorithm that has high localization quality at acceptable computational cost. First, we obtain the objectness map as in BING [1] and use non-maximum suppression (NMS) to get the top N points. Then, the k-means algorithm is used to cluster them into K classes according to their location, and the center points of the K classes are set as seed points. For each seed point, an object potential region is extracted. Finally, a fast salient object detection algorithm [2] is applied to the object potential regions to highlight object-like pixels, and a series of efficient post-processing operations are proposed to locate the targets. Our method runs at 5 FPS on 1000 × 1000 images and significantly outperforms previous methods on small targets in cluttered backgrounds.
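
    The seed-point stage can be sketched roughly as below (the BING objectness computation, NMS, and the saliency step are out of scope here): take the top-N highest-objectness locations and cluster them into K seed points with k-means. The random objectness map and parameter values are placeholders, not the paper's settings.

      # Sketch of the seed-point stage: top-N objectness locations -> k-means -> K seeds.
      import numpy as np
      from sklearn.cluster import KMeans

      def seed_points(objectness, n_top=200, k=5):
          flat = objectness.ravel()
          idx = np.argpartition(flat, -n_top)[-n_top:]           # indices of the top-N scores
          ys, xs = np.unravel_index(idx, objectness.shape)
          pts = np.column_stack([xs, ys]).astype(float)
          km = KMeans(n_clusters=k, n_init=10, random_state=0).fit(pts)
          return km.cluster_centers_                              # K seed points as (x, y)

      objectness = np.random.default_rng(2).random((256, 256))    # stand-in objectness map
      print(seed_points(objectness, k=4))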

  9. Bio-objects and the media: the role of communication in bio-objectification processes.

    PubMed

    Maeseele, Pieter; Allgaier, Joachim; Martinelli, Lucia

    2013-06-01

    The representation of biological innovations in and through communication and media practices is vital for understanding the nature of "bio-objects" and the process we call "bio-objectification." This paper discusses two ideal-typical analytical approaches based on different underlying communication models, i.e., the traditional (science- and media-centered) approach and the media-sociological approach (a multi-layered process involving various social actors in defining the meanings of scientific and technological developments). In this analysis, the latter is found not only to be the most promising approach for understanding the circulation, (re)production, and (re)configuration of meanings of bio-objects, but also for interpreting the relationship between media and science. On the basis of a few selected examples, this paper highlights how media function as a primary arena for the (re)production and (re)configuration of scientific and biomedical information with regard to bio-objects, in the public sphere in general and toward decision-makers, interest groups, and the public in particular.

  10. Development of a course review process.

    PubMed

    Persky, Adam M; Joyner, Pamela U; Cox, Wendy C

    2012-09-10

    To describe and assess a course review process designed to enhance course quality. A course review process led by the curriculum and assessment committees was designed for all required courses in the doctor of pharmacy (PharmD) program at a school of pharmacy. A rubric was used by the review team to address 5 areas: course layout and integration, learning outcomes, assessment, resources and materials, and learner interaction. One hundred percent of targeted courses, or 97% of all required courses, were reviewed from January to August 2010 (n=30). Approximately 3.5 recommendations per course were made, resulting in improvement in course evaluation items related to learning outcomes. Ninety-five percent of reviewers and 85% of course directors agreed that the process was objective and the course review process was important. The course review process was objective and effective in improving course quality. Future work will explore the effectiveness of an integrated, continual course review process in improving the quality of pharmacy education.

  11. Extracting Spatiotemporal Objects from Raster Data to Represent Physical Features and Analyze Related Processes

    NASA Astrophysics Data System (ADS)

    Zollweg, J. A.

    2017-10-01

    Numerous ground-based, airborne, and orbiting platforms provide remotely sensed data of remarkable spatial resolution at short time intervals. However, these spatiotemporal data are most valuable if they can be processed into information, thereby creating meaning. We live in a world of objects: cars, buildings, farms, etc. On a stormy day, we don't see millions of cubes of atmosphere; we see a thunderstorm `object'. Temporally, we don't see the properties of those individual cubes changing; we see the thunderstorm as a whole evolving and moving. There is a need to represent the bulky, raw spatiotemporal data from remote sensors as a small number of relevant spatiotemporal objects, thereby matching the human brain's perception of the world. This presentation reveals an efficient algorithm and system to extract the objects/features from raster-formatted remotely sensed data. The system makes use of the Python object-oriented programming language, SciPy/NumPy for matrix manipulation and scientific computation, and export/import to the GeoJSON standard geographic object data format. The example presented will show how thunderstorms can be identified and characterized in a spatiotemporal continuum using a Python program to process raster data from NOAA's High-Resolution Rapid Refresh v2 (HRRRv2) data stream.
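
    A minimal sketch in the spirit of the described system, for a single raster time slice: threshold a smoothed field, label connected regions with SciPy, and emit one GeoJSON-like feature per object. The synthetic field, the 0.55 threshold, and the grid-cell coordinates are assumptions standing in for real HRRRv2 data and georeferencing.

      # Sketch: connected-component "object" extraction from a raster slice -> GeoJSON features.
      import json
      import numpy as np
      from scipy import ndimage

      rng = np.random.default_rng(3)
      field = ndimage.gaussian_filter(rng.random((100, 100)), sigma=4)   # stand-in raster field

      mask = field > 0.55                                  # assumed "thunderstorm" threshold
      labels, n = ndimage.label(mask)                      # connected components
      features = []
      for i, sl in enumerate(ndimage.find_objects(labels), start=1):
          ys, xs = np.nonzero(labels[sl] == i)
          cy, cx = float(ys.mean() + sl[0].start), float(xs.mean() + sl[1].start)
          features.append({"type": "Feature",
                           "geometry": {"type": "Point", "coordinates": [cx, cy]},
                           "properties": {"object_id": i, "cells": int(ys.size)}})

      print(json.dumps({"type": "FeatureCollection", "features": features}, indent=2)[:300])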

  12. The Benefits of Sensorimotor Knowledge: Body-Object Interaction Facilitates Semantic Processing

    ERIC Educational Resources Information Center

    Siakaluk, Paul D.; Pexman, Penny M.; Sears, Christopher R.; Wilson, Kim; Locheed, Keri; Owen, William J.

    2008-01-01

    This article examined the effects of body-object interaction (BOI) on semantic processing. BOI measures perceptions of the ease with which a human body can physically interact with a word's referent. In Experiment 1, BOI effects were examined in 2 semantic categorization tasks (SCT) in which participants decided if words are easily imageable.…

  13. Fuzzy intelligent quality monitoring model for X-ray image processing.

    PubMed

    Khalatbari, Azadeh; Jenab, Kouroush

    2009-01-01

    Today's imaging diagnosis needs to adopt modern quality engineering techniques to maintain and improve its accuracy and reliability in the health care system. One of the main factors that influences the diagnostic accuracy of plain-film X-ray in detecting pathology is the level of film exposure. If the level of film exposure is not adequate, a normal body structure may be interpreted as pathology and vice versa. This not only influences patient management but also has an impact on health care cost and the patient's quality of life. Therefore, providing an accurate, high-quality image is the first step toward excellent patient management in any health care system. In this paper, we study these techniques and also present a fuzzy intelligent quality monitoring model, which can be used to keep variables from degrading image quality. The variables derived from chemical activity, cleaning procedures, maintenance, and monitoring may not be sensed, measured, or calculated precisely due to uncertain situations. Therefore, a gamma-level fuzzy Bayesian model for quality monitoring of image processing is proposed. In order to apply the Bayesian concept, the fuzzy quality characteristics are assumed to be fuzzy random variables. Using the fuzzy quality characteristics, the newly developed model calculates the degradation risk for image processing. A numerical example is also presented to demonstrate the application of the model.

  14. Process, cost, and clinical quality: the initial oral contraceptive visit.

    PubMed

    McMullen, Michael J; Woolford, Samuel W; Moore, Charles L; Berger, Barry M

    2013-01-01

    To demonstrate how the analysis of clinical process, cost, and outcomes can identify healthcare improvements that reduce cost without sacrificing quality, using the example of the initial visit associated with oral contraceptive pill use. Cross-sectional study using data collected by HealthMETRICS between 1996 and 2009. Using data collected from 106 sites in 24 states, the unintended pregnancy (UIP) rate, effectiveness of patient education, and unit visit cost were calculated. Staff type providing education and placement of education were recorded. Two-way analysis of variance models were created and tested for significance to identify differences between groups. Sites using nonclinical staff to provide education outside the exam were associated with lower cost, higher education scores, and a UIP rate no different from that of sites using clinical staff. Sites also providing patient education during the physical examination were associated with higher cost, lower education scores, and a UIP rate no lower than that of sites providing education outside of the exam. Through analyzing process, cost, and quality, lower-cost processes that did not reduce clinical quality were identified. This methodology is applicable to other clinical services for identifying low-cost processes that do not result in lower clinical quality. By using nonclinical staff educators to provide education outside of the physical examination, sites could save an average of 32% of the total cost of the visit.

  15. A laboratory evaluation of four quality control devices for radiographic processing.

    PubMed

    Rushton, V E; Horner, K

    1994-08-01

    Quality assurance programmes for radiographic processing traditionally employ expensive sensitometric and densitometric techniques. However, cheap and simple devices for monitoring radiographic processing are available. The aim of this study was to compare four such devices in terms of their ability to detect variations in radiographic density of clinical significance. Three of the devices are commercially available, while the fourth is easily manufactured from waste materials. Ideal bitewing exposure times were selected for four different kilovoltage/film-speed combinations. Phantom bitewing radiographs, exposed using these exposure times, were processed using a variety of times and developer temperatures to simulate variations in radiographic quality due to inadequate processing conditions. Test films, produced using the four monitoring devices, were exposed and processed under identical conditions. The phantom bitewings were judged to have 'acceptable' quality when the optical density of the part of the film not showing calcified structures was within +/- 0.5 of that of the film processed under optimal conditions. The efficacy of the monitoring devices in indicating the adequacy of processing was assessed by comparing their readings with those made from the phantom bitewings. None of the monitoring devices was ideal for all the kilovoltage/film-speed combinations tested, but the homemade device proved to be the most generally effective. We conclude that guidelines to dentists on radiographic quality assurance should include reference to and details of this simple device.

  16. Process performance and product quality in an integrated continuous antibody production process.

    PubMed

    Karst, Daniel J; Steinebach, Fabian; Soos, Miroslav; Morbidelli, Massimo

    2017-02-01

    Continuous manufacturing is currently being seriously considered in the biopharmaceutical industry as the possible new paradigm for producing therapeutic proteins, due to production cost and product quality related benefits. In this study, a monoclonal antibody producing CHO cell line was cultured in perfusion mode and connected to a continuous affinity capture step. The reliable and stable integration of the two systems was enabled by suitable control loops, regulating the continuous volumetric flow and adapting the operating conditions of the capture process. For the latter, an at-line HPLC measurement of the harvest concentration subsequent to the bioreactor was combined with a mechanistic model of the capture chromatographic unit. Thereby, optimal buffer consumption and productivity throughout the process was realized while always maintaining a yield above the target value of 99%. Stable operation was achieved at three consecutive viable cell density set points (20, 60, and 40 × 10⁶ cells/mL), together with consistent product quality in terms of aggregates, fragments, charge isoforms, and N-linked glycosylation. In addition, different values for these product quality attributes such as N-linked glycosylation, charge variants, and aggregate content were measured at the different steady states. As expected, the amount of released DNA and HCP was significantly reduced by the capture step for all considered upstream operating conditions. This study is exemplary for the potential of enhancing product quality control and modulation by integrated continuous manufacturing. Biotechnol. Bioeng. 2017;114: 298-307. © 2016 Wiley Periodicals, Inc.

  17. Biowaste home composting: experimental process monitoring and quality control.

    PubMed

    Tatàno, Fabio; Pagliaro, Giacomo; Di Giovanni, Paolo; Floriani, Enrico; Mangani, Filippo

    2015-04-01

    Because home composting is a prevention option in managing biowaste at local levels, the objective of the present study was to contribute to the knowledge of the process evolution and compost quality that can be expected and obtained, respectively, in this decentralized option. In this study, organized as the research portion of a provincial project on home composting in the territory of Pesaro-Urbino (Central Italy), four experimental composters were first initiated and temporally monitored. Second, two small sub-sets of selected provincial composters (directly operated by households involved in the project) underwent quality control on their compost products at two different temporal steps. The monitored experimental composters showed overall decreasing profiles versus composting time for moisture, organic carbon, and C/N, as well as overall increasing profiles for electrical conductivity and total nitrogen, which represented qualitative indications of progress in the process. Comparative evaluations of the monitored experimental composters also suggested some interactions in home composting, i.e., high C/N ratios limiting organic matter decomposition rates and final humification levels; high moisture contents restricting the internal temperature regime; nearly horizontal phosphorus and potassium evolutions contributing to limit the rates of increase in electrical conductivity; and prolonged biowaste additions contributing to limit the rate of decrease in moisture. The measures of parametric data variability in the two sub-sets of controlled provincial composters showed decreased variability in moisture, organic carbon, and C/N from the seventh to fifteenth month of home composting, as well as increased variability in electrical conductivity, total nitrogen, and humification rate, which could be considered compatible with the respective nature of decreasing and increasing parameters during composting. The modeled parametric kinetics in the monitored experimental

  18. Development of Tool Representations in the Dorsal and Ventral Visual Object Processing Pathways.

    PubMed

    Kersey, Alyssa J; Clark, Tyia S; Lussier, Courtney A; Mahon, Bradford Z; Cantlon, Jessica F

    2016-07-01

    Tools represent a special class of objects, because they are processed across both the dorsal and ventral visual object processing pathways. Three core regions are known to be involved in tool processing: the left posterior middle temporal gyrus, the medial fusiform gyrus (bilaterally), and the left inferior parietal lobule. A critical and relatively unexplored issue concerns whether, in development, tool preferences emerge at the same time and to a similar degree across all regions of the tool-processing network. To test this issue, we used functional magnetic resonance imaging to measure the neural amplitude, peak location, and the dispersion of tool-related neural responses in the youngest sample of children tested to date in this domain (ages 4-8 years). We show that children recruit overlapping regions of the adult tool-processing network and also exhibit similar patterns of co-activation across the network to adults. The amplitude and co-activation data show that the core components of the tool-processing network are established by age 4. Our findings on the distributions of peak location and dispersion of activation indicate that the tool network undergoes refinement between ages 4 and 8 years. © The Author 2015. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.

  19. The Use of UV-Visible Reflectance Spectroscopy as an Objective Tool to Evaluate Pearl Quality

    PubMed Central

    Agatonovic-Kustrin, Snezana; Morton, David W.

    2012-01-01

    Assessing the quality of pearls involves the use of various tools and methods, which are mainly visual and often quite subjective. Pearls are normally classified by origin and are then graded by luster, nacre thickness, surface quality, size, color, and shape. The aim of this study was to investigate the capacity of Artificial Neural Networks (ANNs) to classify and estimate the quality of 27 different pearls from their UV-Visible spectra. Due to the opaque nature of pearls, spectroscopic measurements were performed using the diffuse reflectance UV-Visible technique. The spectra were acquired at two different locations on each pearl sample in order to assess surface homogeneity. The spectral data (inputs) were smoothed to reduce noise, fed into ANNs, and correlated to the pearls' quality/grading criteria (outputs). The developed ANNs were successful in predicting pearl type, mollusk growing species, possible luster and color enhancement, donor condition/type, recipient/host color, donor color, pearl luster, pearl color, and origin. The results of this study show that the developed UV-Vis spectroscopy-ANN method could be used as a more objective method of assessing pearl quality (grading) and may become a valuable tool for the pearl grading industry. PMID:22851919
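
    A hedged sketch of the spectra-to-grade mapping: a small feed-forward network (scikit-learn's MLPClassifier) trained on synthetic reflectance curves with made-up grade labels; the actual study used smoothed diffuse-reflectance UV-Vis spectra of 27 pearls and several grading outputs.

      # Sketch: classify synthetic reflectance spectra into invented grades A/B/C with a small ANN.
      import numpy as np
      from sklearn.neural_network import MLPClassifier

      rng = np.random.default_rng(4)
      wavelengths = np.linspace(250, 800, 120)

      def fake_spectrum(grade):
          peak = {"A": 420.0, "B": 520.0, "C": 620.0}[grade]              # invented spectral peaks
          return np.exp(-((wavelengths - peak) / 80.0) ** 2) + rng.normal(scale=0.02, size=wavelengths.size)

      X = np.array([fake_spectrum(g) for g in "ABC" * 20])                # 60 synthetic spectra
      y = np.array(list("ABC" * 20))

      clf = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000, random_state=0).fit(X, y)
      print("training accuracy:", clf.score(X, y))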

  20. Use of osmotic dehydration to improve fruits and vegetables quality during processing.

    PubMed

    Maftoonazad, Neda

    2010-11-01

    Osmotic treatment is a preparation step prior to further processing of foods, involving simultaneous transient moisture loss and solids gain when the food is immersed in an osmotic solution, resulting in partial drying and improving the overall quality of the product. This review discusses the different aspects of osmotic dehydration (OD) technology, namely the solutes employed, the characteristics of the solutions used, the influence of process variables, and the quality characteristics of osmodehydrated products. Because the process is carried out at mild temperatures and moisture is removed by liquid diffusion, the phase change present in other drying processes is avoided, resulting in high-quality products and potentially substantial energy savings. To optimize the process, modeling of the mass transfer phenomenon can help achieve high product quality. Several techniques, such as microwave heating, vacuum, high pressure, and pulsed electric fields, may be employed during or after osmotic treatment to enhance the performance of osmotic dehydration. New technologies used in osmotic dehydration are also discussed, and patents on osmotic dehydration of fruits and vegetables are reviewed in this article.

  1. Object width modulates object-based attentional selection.

    PubMed

    Nah, Joseph C; Neppi-Modona, Marco; Strother, Lars; Behrmann, Marlene; Shomstein, Sarah

    2018-04-24

    Visual input typically includes a myriad of objects, some of which are selected for further processing. While these objects vary in shape and size, most evidence supporting object-based guidance of attention is drawn from paradigms employing two identical objects. Importantly, object size is a readily perceived stimulus dimension, and whether it modulates the distribution of attention remains an open question. Across four experiments, the size of the objects in the display was manipulated in a modified version of the two-rectangle paradigm. In Experiment 1, two identical parallel rectangles of two sizes (thin or thick) were presented. Experiments 2-4 employed identical trapezoids (each having a thin and thick end), inverted in orientation. In the experiments, one end of an object was cued and participants performed either a T/L discrimination or a simple target-detection task. Combined results show that, in addition to the standard object-based attentional advantage, there was a further attentional benefit for processing information contained in the thick versus thin end of objects. Additionally, eye-tracking measures demonstrated increased saccade precision towards thick object ends, suggesting that Fitts's Law may play a role in object-based attentional shifts. Taken together, these results suggest that object-based attentional selection is modulated by object width.

  2. Using Actigraphy and mHealth Systems for an Objective Analysis of Sleep Quality on Systemic Lupus Erythematosus Patients.

    PubMed

    Balderas-Díaz, Sara; Martínez, M Pilar; Guerrero-Contreras, Gabriel; Miró, Elena; Benghazi, Kawtar; Sánchez, Ana I; Garrido, José Luis; Prados, Germán

    2017-03-23

    Although sleep alterations can be an important factor contributing to the clinical state of Systemic Lupus Erythematosus (SLE), there are no studies that adequately assess sleep quality in this disease. The aim of this work is to analyse the sleep quality of SLE patients based on more objective information provided by actigraphy and mobile systems. The idea is to carry out a comprehensive study by analysing how environmental conditions and factors can affect sleep quality. In traditional methods, the information for assessing sleep quality is obtained through questionnaires. In this work, a novel method is proposed that combines these questionnaires, which provide valuable but subjective information, with actigraphy and a mobile system to collect more objective information about the patient and their environment. The method provides mechanisms to detect how sleep hygiene could be associated directly with the sleep quality of the subjects, in order to provide a custom intervention to SLE patients. Moreover, this alternative provides ease of use and non-intrusive ICT (Information and Communication Technology) through a wristband and an mHealth system. The mHealth system has been developed for sensing environmental conditions. It consists of a mobile device with built-in sensors providing input data about the bedroom environment during sleep, and a set of services of the Environmental Monitoring System for properly managing the configuration, registration, and fusion of those input data. In previous studies, this information has never been taken into account, although it could be relevant in the case of SLE patients. The sample is composed of 9 women with SLE and 11 matched controls with mean ages of 35.78 and 32.18 years, respectively. Demographic and clinical variables between SLE patients and healthy controls are compared using the Fisher exact test and the Mann-Whitney U test. Relationships between psychological variables

  3. Multi-Objective Optimization of Friction Stir Welding Process Parameters of AA6061-T6 and AA7075-T6 Using a Biogeography Based Optimization Algorithm

    PubMed Central

    Tamjidy, Mehran; Baharudin, B. T. Hang Tuah; Paslar, Shahla; Matori, Khamirul Amin; Sulaiman, Shamsuddin; Fadaeifard, Firouz

    2017-01-01

    The development of Friction Stir Welding (FSW) has provided an alternative approach for producing high-quality welds, in a fast and reliable manner. This study focuses on the mechanical properties of the dissimilar friction stir welding of AA6061-T6 and AA7075-T6 aluminum alloys. The FSW process parameters such as tool rotational speed, tool traverse speed, tilt angle, and tool offset influence the mechanical properties of the friction stir welded joints significantly. A mathematical regression model is developed to determine the empirical relationship between the FSW process parameters and mechanical properties, and the results are validated. In order to obtain the optimal values of process parameters that simultaneously optimize the ultimate tensile strength, elongation, and minimum hardness in the heat affected zone (HAZ), a metaheuristic, multi objective algorithm based on biogeography based optimization is proposed. The Pareto optimal frontiers for triple and dual objective functions are obtained and the best optimal solution is selected through using two different decision making techniques, technique for order of preference by similarity to ideal solution (TOPSIS) and Shannon’s entropy. PMID:28772893

  4. Multi-Objective Optimization of Friction Stir Welding Process Parameters of AA6061-T6 and AA7075-T6 Using a Biogeography Based Optimization Algorithm.

    PubMed

    Tamjidy, Mehran; Baharudin, B T Hang Tuah; Paslar, Shahla; Matori, Khamirul Amin; Sulaiman, Shamsuddin; Fadaeifard, Firouz

    2017-05-15

    The development of Friction Stir Welding (FSW) has provided an alternative approach for producing high-quality welds, in a fast and reliable manner. This study focuses on the mechanical properties of the dissimilar friction stir welding of AA6061-T6 and AA7075-T6 aluminum alloys. The FSW process parameters such as tool rotational speed, tool traverse speed, tilt angle, and tool offset influence the mechanical properties of the friction stir welded joints significantly. A mathematical regression model is developed to determine the empirical relationship between the FSW process parameters and mechanical properties, and the results are validated. In order to obtain the optimal values of process parameters that simultaneously optimize the ultimate tensile strength, elongation, and minimum hardness in the heat affected zone (HAZ), a metaheuristic, multi objective algorithm based on biogeography based optimization is proposed. The Pareto optimal frontiers for triple and dual objective functions are obtained and the best optimal solution is selected through using two different decision making techniques, technique for order of preference by similarity to ideal solution (TOPSIS) and Shannon's entropy.
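
    A generic TOPSIS sketch of the kind used above as one of the two decision-making techniques for selecting a single solution from the Pareto front; the three candidate solutions, the weights, and the benefit/cost flags below are invented for illustration and are not the study's results.

      # Sketch: TOPSIS ranking of Pareto-front candidates (minimization criteria flagged benefit=False).
      import numpy as np

      def topsis(matrix, weights, benefit):
          M = np.asarray(matrix, float)
          W = np.asarray(weights, float)
          norm = M / np.sqrt((M ** 2).sum(axis=0))             # vector normalization
          V = norm * W
          ideal = np.where(benefit, V.max(axis=0), V.min(axis=0))
          anti  = np.where(benefit, V.min(axis=0), V.max(axis=0))
          d_pos = np.sqrt(((V - ideal) ** 2).sum(axis=1))
          d_neg = np.sqrt(((V - anti) ** 2).sum(axis=1))
          return d_neg / (d_pos + d_neg)                        # relative closeness to the ideal

      # Columns (invented): tensile strength (max), elongation (max), HAZ hardness drop (min).
      front = [[220, 7.1, 18], [235, 6.4, 22], [210, 8.0, 16]]
      score = topsis(front, weights=[0.5, 0.3, 0.2], benefit=np.array([True, True, False]))
      print("closeness:", np.round(score, 3), "best index:", int(np.argmax(score)))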

  5. Multi-objective optimization of laser-scribed micro grooves on AZO conductive thin film using Data Envelopment Analysis

    NASA Astrophysics Data System (ADS)

    Kuo, Chung-Feng Jeffrey; Quang Vu, Huy; Gunawan, Dewantoro; Lan, Wei-Luen

    2012-09-01

    Laser scribing has been considered an effective approach for surface texturization of thin-film solar cells. In this study, a systematic method for optimizing multi-objective process parameters of a fiber laser system was proposed to achieve excellent quality characteristics, such as the minimum scribing line width, the flattest trough bottom, and the fewest processing edge surface bumps, in order to increase incident light absorption of the thin-film solar cell. First, the Taguchi method (TM) provided useful statistical information through an orthogonal array with relatively few experiments. However, TM is only appropriate for optimizing single-objective problems and has to rely on engineering judgment for multi-objective problems, which can introduce some degree of uncertainty. A back-propagation neural network (BPNN) and data envelopment analysis (DEA) were therefore utilized to estimate the incomplete data and derive the optimal process parameters of the laser scribing system. In addition, analysis of variance (ANOVA) was applied to identify the significant factors with the greatest effects on the quality of the scribing process; in other words, by putting more emphasis on these controllable and influential factors, the quality characteristics of the scribed thin film can be effectively enhanced. The experiments were carried out on ZnO:Al (AZO) transparent conductive thin film with a thickness of 500 nm, and the results showed that the proposed approach yields better improvements than TM alone, which improves one quality characteristic only at the expense of the others. Confirmation experiments demonstrated the reliability of the proposed method.

  6. A cultural side effect: learning to read interferes with identity processing of familiar objects

    PubMed Central

    Kolinsky, Régine; Fernandes, Tânia

    2014-01-01

    Based on the neuronal recycling hypothesis (Dehaene and Cohen, 2007), we examined whether reading acquisition has a cost for the recognition of non-linguistic visual materials. More specifically, we checked whether the ability to discriminate between mirror images, which develops through literacy acquisition, interferes with object identity judgments, and whether interference strength varies as a function of the nature of the non-linguistic material. To these aims we presented illiterate, late literate (who learned to read at adult age), and early literate adults with an orientation-independent, identity-based same-different comparison task in which they had to respond “same” to both physically identical and mirrored or plane-rotated images of pictures of familiar objects (Experiment 1) or of geometric shapes (Experiment 2). Interference from irrelevant orientation variations was stronger with plane rotations than with mirror images, and stronger with geometric shapes than with objects. Illiterates were the only participants almost immune to mirror variations, but only for familiar objects. Thus, the process of unlearning mirror-image generalization, necessary to acquire literacy in the Latin alphabet, has a cost for a basic function of the visual ventral object recognition stream, i.e., identification of familiar objects. This demonstrates that neural recycling is not just an adaptation to multi-use but a process of at least partial exaptation. PMID:25400605

  7. Instant noodles: processing, quality, and nutritional aspects.

    PubMed

    Gulia, Neelam; Dhaka, Vandana; Khatkar, B S

    2014-01-01

    Noodles are one of the staple foods consumed in many Asian countries. Instant noodles have become an internationally recognized food, and worldwide consumption is on the rise. Properties such as taste, nutrition, convenience, safety, longer shelf life, and reasonable price have made instant noodles popular. Quality factors important for instant noodles are color, flavor, texture, cooking quality, rehydration rate during final preparation, and the presence or absence of rancid taste after extended storage. The microstructure of dough and noodles has been studied by scanning electron microscopy to understand the influence of ingredients and processing variables on noodle quality. Applications of newer techniques such as confocal laser scanning microscopy and epifluorescence light microscopy, employed to understand microstructural changes in dough and noodles, are also discussed. Sincere efforts are underway to improve the formulation, extend the shelf life, and promote universal fortification of instant noodles. Accordingly, many researchers are exploring the potential of noodle fortification as an effective public health intervention and a means to improve nutritional properties. This review focuses on the functionality of ingredients, the unit operations involved, quality criteria for evaluation, recent trends in fortification, and current knowledge in relation to instant noodles.

  8. Twofold processing for denoising ultrasound medical images.

    PubMed

    Kishore, P V V; Kumar, K V V; Kumar, D Anil; Prasad, M V D; Goutham, E N D; Rahul, R; Krishna, C B S Vamsi; Sandeep, Y

    2015-01-01

    Ultrasound (US) medical imaging non-invasively pictures the inside of the human body for disease diagnostics. Speckle noise attacks ultrasound images, degrading their visual quality. A twofold processing algorithm is proposed in this work to reduce this multiplicative speckle noise. The first fold uses block-based thresholding, both hard (BHT) and soft (BST), on pixels in the wavelet domain with 8, 16, 32, and 64 non-overlapping block sizes. This first-fold process reduces speckle effectively but also blurs the object of interest. The second fold therefore restores object boundaries and texture with adaptive wavelet fusion: the degraded object in the block-thresholded US image is restored through wavelet-coefficient fusion of the object in the original US image and in the block-thresholded US image. Fusion rules and wavelet decomposition levels are made adaptive for each block using gradient histograms with a normalized differential mean (NDF) so as to introduce the highest level of contrast between the denoised pixels and the object pixels in the resultant image. The proposed twofold methods are thus named adaptive NDF block fusion with hard and soft thresholding (ANBF-HT and ANBF-ST). The results indicate a notable improvement in visual quality with the proposed twofold processing, where the first fold removes noise and the second fold restores object properties. Peak signal-to-noise ratio (PSNR), normalized cross-correlation coefficient (NCC), edge strength (ES), image quality index (IQI), and structural similarity index (SSIM) measure the quantitative quality of the twofold processing technique. Validation of the proposed method is done by comparing it with anisotropic diffusion (AD), total variational filtering (TVF), and empirical mode decomposition (EMD) for enhancement of US images. The US images were provided by the radiology labs of AMMA hospital in Vijayawada, India.
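
    A simplified sketch of the first fold only: soft thresholding of wavelet detail coefficients with PyWavelets, applied globally rather than in the 8/16/32/64-pixel blocks used by the authors, on a synthetic speckled image; the noise model and threshold rule are illustrative assumptions.

      # Sketch: one-level wavelet soft thresholding of a synthetic speckled image.
      import numpy as np
      import pywt

      rng = np.random.default_rng(5)
      clean = np.zeros((128, 128)); clean[40:90, 40:90] = 1.0                    # toy "object"
      speckled = clean * rng.gamma(shape=4.0, scale=0.25, size=clean.shape)      # multiplicative noise

      cA, (cH, cV, cD) = pywt.dwt2(speckled, "db2")
      sigma = np.median(np.abs(cD)) / 0.6745                                     # noise estimate from the HH band
      thr = sigma * np.sqrt(2 * np.log(speckled.size))                           # universal threshold
      details = tuple(pywt.threshold(c, thr, mode="soft") for c in (cH, cV, cD))
      denoised = pywt.idwt2((cA, details), "db2")
      print("noisy std:", round(float(speckled.std()), 3), "denoised std:", round(float(denoised.std()), 3))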

  9. [Approach to quality objectives in incidents of patients in peritoneal dialysis].

    PubMed

    Portolés, J; Ocaña, J; López-Sánchez, P; Gómez, M; Rivera, M T; Del Peso, G; Corchete, E; Bajo, M A; Rodríguez-Palomares, J R; Fernández-Perpen, A; López-Gómez, J M

    2010-01-01

    In 2007 the Scientific Quality-technical and Improvement of Quality in Peritoneal Dialysis was edited. It includes several quality indicators. As far as we know, only some groups of work had evaluated these indicators, with inconclusive results. To study the evolution and impact of guidelines in Peritoneal Dialysis. Prospective cohort study of each incident of patients in Peritoneal Dialysis, in a regional public health care system (2003-2006). We prospectively collected baseline clinical and analytical data, technical efficacy, cardiovascular risk, events and deaths, hospital admissions and also prescription data was collected every 6 months. Over a period of 3 years, 490 patients (53.58 years of age; 61.6% males.) Causes of ERC: glomerular 25.5%, diabetes 16%, vascular 12.4%, and interstitial 13.3%. 26.48% were on the list for transplant. Dialysis efficacy: Of the first available results, the residual renal function was 6.37 ml/min, achieving 67.6% of all the objectives K/DOQI. 38.6% remained within the range during the entire first year. Anaemia: 79.3% received erythropoietic stimulating agents and maintained an average Hb of 12.1 g/dl. The percentage of patients in the range (Hb: 11-13 g/dl) improved after a year (58.4% vs 56.3% keeping in the range during this time of 25.6%). Evolution: it has been estimated that per patient-year the risk of: 1) mortality is 0.06 IC 95% [0.04-0.08]; 2) admissions 0.65 [0.58-0.72]; 3) peritoneal infections 0.5 [0.44-0.56]. Diabetes Mellitus patients had a higher cardiovascular risk and prevalence of events. The degrees of control during the follow-up in many topics of peritoneal dialysis improve each year; however they are far from the recommended guidelines, especially if they are evaluated throughout the whole study.

  10. Video quality assessment using M-SVD

    NASA Astrophysics Data System (ADS)

    Tao, Peining; Eskicioglu, Ahmet M.

    2007-01-01

    Objective video quality measurement is a challenging problem in a variety of video processing applications, ranging from lossy compression to printing. An ideal video quality measure should be able to mimic the human observer. We present a new video quality measure, M-SVD, to evaluate distorted video sequences based on singular value decomposition. A computationally efficient approach is developed for full-reference (FR) video quality assessment. The measure is tested on the Video Quality Experts Group (VQEG) phase I FR-TV test data set. Our experiments show that the graphical measure displays the amount of distortion as well as the distribution of error in all frames of the video sequence, while the numerical measure correlates well with perceived video quality and outperforms PSNR and other objective measures by a clear margin.
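
    A rough sketch in the spirit of an SVD-based distortion measure: compare the singular values of corresponding 8 × 8 blocks of a reference and a distorted frame, then aggregate the block distances into a scalar score. The aggregation and data below are illustrative and differ in detail from the published M-SVD measure.

      # Sketch: block-wise singular-value distance map plus a simple global score.
      import numpy as np

      def svd_distortion_map(ref, dist, block=8):
          h, w = ref.shape[0] // block * block, ref.shape[1] // block * block
          rows, cols = h // block, w // block
          D = np.zeros((rows, cols))
          for i in range(rows):
              for j in range(cols):
                  r = ref[i*block:(i+1)*block, j*block:(j+1)*block]
                  d = dist[i*block:(i+1)*block, j*block:(j+1)*block]
                  sr = np.linalg.svd(r, compute_uv=False)
                  sd = np.linalg.svd(d, compute_uv=False)
                  D[i, j] = np.sqrt(((sr - sd) ** 2).sum())      # per-block distortion
          return D, float(np.abs(D - np.median(D)).mean())        # distortion map + scalar score

      rng = np.random.default_rng(6)
      ref = rng.random((64, 64))
      dist = ref + rng.normal(scale=0.05, size=ref.shape)
      D, score = svd_distortion_map(ref, dist)
      print("global SVD-based score:", round(score, 4))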

  11. Total Quality Management (TQM): Training Module on "Focus on Processes."

    ERIC Educational Resources Information Center

    Leigh, David

    This module for a 1-semester Total Quality Management (TQM) course for high school or community college students contains a brief overview of the definition of processes, a section on process flow diagrams, and a section on process management as well as a description of process variation. Examples are used throughout the module to make processes…

  12. Assessment of Rheumatoid Arthritis Quality Process Measures and Associated Costs.

    PubMed

    Brady, Brenna L; Tkacz, Joseph; Meyer, Roxanne; Bolge, Susan C; Ruetsch, Charles

    2017-02-01

    The objective was to examine the relationship between health care costs and quality in rheumatoid arthritis (RA). Administrative claims were used to calculate 8 process measures for the treatment of RA. Associated health care costs were calculated for members who achieved or did not achieve each of the measures. Medical, pharmacy, and laboratory claims for RA patients (International Classification of Diseases, Ninth Revision, Clinical Modification 714.x) were extracted from the Optum Clinformatics Datamart database for 2011. Individuals were predominately female and in their mid-fifties. Measure achievement ranged from 55.9% to 80.8%. The mean cost of care for members meeting the measure was $18,644; members who did not meet the measures had a mean cost of $14,973. Primary cost drivers were pharmacy and office expenses, accounting for 42.4% and 26.3% of total costs, respectively. Regression analyses revealed statistically significant associations between biologic usage, which was more prevalent in groups attaining measures, and total expenditure across all measures (Ps < 0.001). Pharmacy costs were similar between both groups. Individuals meeting the measures had a higher proportion of costs accounted for by office visits; those not meeting the measures had a higher proportion of costs from inpatient and outpatient visits. These findings suggest that increased quality may lead to lower inpatient and outpatient hospital costs. Yet, the overall cost of RA care is likely to remain high because of intensive pharmacotherapy regimens.

  13. Assessment of Rheumatoid Arthritis Quality Process Measures and Associated Costs

    PubMed Central

    Tkacz, Joseph; Meyer, Roxanne; Bolge, Susan C.; Ruetsch, Charles

    2017-01-01

    Abstract The objective was to examine the relationship between health care costs and quality in rheumatoid arthritis (RA). Administrative claims were used to calculate 8 process measures for the treatment of RA. Associated health care costs were calculated for members who achieved or did not achieve each of the measures. Medical, pharmacy, and laboratory claims for RA patients (International Classification of Diseases, Ninth Revision, Clinical Modification 714.x) were extracted from the Optum Clinformatics Datamart database for 2011. Individuals were predominately female and in their mid-fifties. Measure achievement ranged from 55.9% to 80.8%. The mean cost of care for members meeting the measure was $18,644; members who did not meet the measures had a mean cost of $14,973. Primary cost drivers were pharmacy and office expenses, accounting for 42.4% and 26.3% of total costs, respectively. Regression analyses revealed statistically significant associations between biologic usage, which was more prevalent in groups attaining measures, and total expenditure across all measures (Ps < 0.001). Pharmacy costs were similar between both groups. Individuals meeting the measures had a higher proportion of costs accounted for by office visits; those not meeting the measures had a higher proportion of costs from inpatient and outpatient visits. These findings suggest that increased quality may lead to lower inpatient and outpatient hospital costs. Yet, the overall cost of RA care is likely to remain high because of intensive pharmacotherapy regimens. PMID:27031517

  14. Object Processing in Visual Perception and Action in Children and Adults

    ERIC Educational Resources Information Center

    Schum, Nina; Franz, Volker H.; Jovanovic, Bianca; Schwarzer, Gudrun

    2012-01-01

    We investigated whether 6- and 7-year-olds and 9- and 10-year-olds, as well as adults, process object dimensions independent of or in interaction with one another in a perception and action task by adapting Ganel and Goodale's method for testing adults ("Nature", 2003, Vol. 426, pp. 664-667). In addition, we aimed to confirm Ganel and Goodale's…

  15. Object Management Group object transaction service based on an X/Open and International Organization for Standardization open systems interconnection transaction processing kernel

    NASA Astrophysics Data System (ADS)

    Liang, J.; Sédillot, S.; Traverson, B.

    1997-09-01

    This paper addresses the federation of a transactional object standard - the Object Management Group (OMG) object transaction service (OTS) - with the X/Open distributed transaction processing (DTP) model and the International Organization for Standardization (ISO) open systems interconnection (OSI) transaction processing (TP) communication protocol. The two-phase commit propagation rules within a distributed transaction tree are similar in the X/Open, ISO, and OMG models. Building an OTS on an OSI TP protocol machine is possible because the two specifications are complementary: OTS defines a set of external interfaces without a specific internal protocol machine, while OSI TP specifies an internal protocol machine without any application programming interface. Given these observations, and having already implemented an X/Open two-phase commit transaction toolkit based on an OSI TP protocol machine, we analyse the feasibility of using this implementation as a transaction service provider for OMG interfaces. Based on the favourable result of this feasibility study, we are implementing an OTS-compliant system which, by building on the extensibility and openness of OSI TP, is able to provide interoperability between the X/Open DTP and OMG OTS models.
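
    A toy illustration of the two-phase commit propagation rule shared by the X/Open, ISO, and OMG models: the coordinator commits only if every participant votes to prepare, otherwise all participants roll back. The class and method names below are invented for illustration and are not OTS or OSI TP interfaces.

      # Toy two-phase commit: phase 1 collects votes, phase 2 propagates the global decision.
      class Participant:
          def __init__(self, name, will_prepare=True):
              self.name, self.will_prepare, self.state = name, will_prepare, "active"
          def prepare(self):
              self.state = "prepared" if self.will_prepare else "rollback-only"
              return self.will_prepare
          def commit(self):   self.state = "committed"
          def rollback(self): self.state = "rolled-back"

      def two_phase_commit(participants):
          votes = [p.prepare() for p in participants]      # phase 1: voting
          if all(votes):
              for p in participants: p.commit()            # phase 2: global commit
              return "committed"
          for p in participants: p.rollback()              # phase 2: global rollback
          return "rolled back"

      tree = [Participant("resource-A"), Participant("resource-B", will_prepare=False)]
      print(two_phase_commit(tree), [(p.name, p.state) for p in tree])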

  16. [Life quality parameters in prenosologic evaluation of health state in residents of protective measures area near objects of storage and destruction of chemical weapons].

    PubMed

    Filippov, V L; Nechaeva, E N

    2014-01-01

    The article presents the results of a life quality assessment and subjective health evaluation data used for the prenosologic evaluation of the health state of residents of the protective measures area near objects of storage and destruction of chemical weapons. Considering the specific features of residence near potentially dangerous objects, the authors conducted a qualitative evaluation of satisfaction with various facets of life, taking into account the specificity of these objects, and established correlations between life quality, self-evaluation of health, and the factors influencing the public health state.

  17. Ethnographic process evaluation of a quality improvement project to improve transitions of care for older people

    PubMed Central

    Sutton, Elizabeth; Dixon-Woods, Mary; Tarrant, Carolyn

    2016-01-01

    Objectives Quality improvement projects to address transitions of care across care boundaries are increasingly common but meet with mixed success for reasons that are poorly understood. We aimed to characterise challenges in a project to improve transitions for older people between hospital and care homes. Design Independent process evaluation, using ethnographic observations and interviews, of a quality improvement project. Setting and participants An English hospital and two residential care homes for older people. Data 32 hours of non-participant observations and 12 semistructured interviews with project members, hospital and care home staff. Results A hospital-based improvement team sought to reduce unplanned readmissions from residential care homes using interventions including a community-based geriatric team that could be accessed directly by care homes and a communication tool intended to facilitate transfer of information between homes and hospital. Only very modest (if any) impacts of these interventions on readmission rates could be detected. The process evaluation identified multiple challenges in implementing interventions and securing improvement. Many of these arose because of lack of consensus on the nature of the problem and the proper solutions: while the hospital team was keen to reduce readmissions and saw the problems as lying in poor communication and lack of community-based support for care homes, the care home staff had different priorities. Care home staff were unconvinced that the improvement interventions were aligned with their needs or addressed their concerns, resulting in compromised implementation. Conclusions Process evaluations have a valuable role in quality improvement. Our study suggests that a key task for quality improvement projects aimed at transitions of care is that of developing a shared view of the problem to be addressed. A more participatory approach could help to surface assumptions, interpretations and interests

  18. Guidelines for the processing and quality assurance of benthic invertebrate samples collected as part of the National Water-Quality Assessment Program

    USGS Publications Warehouse

    Cuffney, T.F.; Gurtz, M.E.; Meador, M.R.

    1993-01-01

    Benthic invertebrate samples are collected as part of the U.S. Geological Survey's National Water-Quality Assessment Program. This is a perennial, multidisciplinary program that integrates biological, physical, and chemical indicators of water quality to evaluate status and trends and to develop an understanding of the factors controlling observed water quality. The Program examines water quality in 60 study units (coupled ground- and surface-water systems) that encompass most of the conterminous United States and parts of Alaska and Hawaii. Study-unit teams collect and process qualitative and semi-quantitative invertebrate samples according to standardized procedures. These samples are processed (elutriated and subsampled) in the field to produce as many as four sample components: large-rare, main-body, elutriate, and split. Each sample component is preserved in 10-percent formalin, and two components, large-rare and main-body, are sent to contract laboratories for further processing. The large-rare component is composed of large invertebrates that are removed from the sample matrix during field processing and placed in one or more containers. The main-body sample component consists of the remaining sample materials (sediment, detritus, and invertebrates) and is subsampled in the field to achieve a volume of 750 milliliters or less. The remaining two sample components, elutriate and split, are used for quality-assurance and quality-control purposes. Contract laboratories are used to identify and quantify invertebrates from the large-rare and main-body sample components according to the procedures and guidelines specified within this document. These guidelines allow the use of subsampling techniques to reduce the volume of sample material processed and to facilitate identifications. These processing procedures and techniques may be modified if the modifications provide equal or greater levels of accuracy and precision. The intent of sample processing is to

  19. Hydrodynamic pressure processing: Impact on the quality attributes of fresh and further-processed meat products

    USDA-ARS?s Scientific Manuscript database

    This book chapter reviews hydrodynamic pressure processing (HDP) as an innovative, postharvest technology for enhancing the quality attributes of fresh and further-processed meat products. A variety of meat products have been tested for their response to the high pressure shockwaves of HDP. The st...

  20. Contributions of Low and High Spatial Frequency Processing to Impaired Object Recognition Circuitry in Schizophrenia

    PubMed Central

    Calderone, Daniel J.; Hoptman, Matthew J.; Martínez, Antígona; Nair-Collins, Sangeeta; Mauro, Cristina J.; Bar, Moshe; Javitt, Daniel C.; Butler, Pamela D.

    2013-01-01

    Patients with schizophrenia exhibit cognitive and sensory impairment, and object recognition deficits have been linked to sensory deficits. The “frame and fill” model of object recognition posits that low spatial frequency (LSF) information rapidly reaches the prefrontal cortex (PFC) and creates a general shape of an object that feeds back to the ventral temporal cortex to assist object recognition. Visual dysfunction findings in schizophrenia suggest a preferential loss of LSF information. This study used functional magnetic resonance imaging (fMRI) and resting state functional connectivity (RSFC) to investigate the contribution of visual deficits to impaired object “framing” circuitry in schizophrenia. Participants were shown object stimuli that were intact or contained only LSF or high spatial frequency (HSF) information. For controls, fMRI revealed preferential activation to LSF information in precuneus, superior temporal, and medial and dorsolateral PFC areas, whereas patients showed a preference for HSF information or no preference. RSFC revealed a lack of connectivity between early visual areas and PFC for patients. These results demonstrate impaired processing of LSF information during object recognition in schizophrenia, with patients instead displaying increased processing of HSF information. This is consistent with findings of a preference for local over global visual information in schizophrenia. PMID:22735157

  1. Implications of process characteristics on quality-related event reporting in community pharmacy.

    PubMed

    Boyle, Todd A; Scobie, Andrea C; MacKinnon, Neil J; Mahaffey, Thomas

    2012-01-01

    The lack of a single pharmacy regulator in Canada has led to a wide variety of processes for reporting and learning from medication errors and near misses, collectively known as quality-related events (QREs). These processes range from completely informal processes, through to primarily manual processes that rely on paper forms and incident reports stored in a binder, all the way to fully computerized processes such as anonymous online reporting to a national database. The objective of the study was to develop and test a model of the influence of various QRE reporting process characteristics on levels of QRE reporting process support and QRE reporting in Canadian community pharmacies. A questionnaire was administered to 427 pharmacy managers, pharmacists, and technicians in Nova Scotia, Canada, in 2010, with 210 questionnaires returned. Partial least squares was performed on a subgroup of the data set (N=121) to test and refine the model. Content analysis of the open-ended data provided additional support for model variables. The final model retained all proposed variables except for anonymous reporting. The model highlights that process ease and learning capability both greatly influence the overall support for the QRE process; with these 2 variables explaining 62% of the variance in QRE process support and QRE process support explaining 34% of the variance in overall levels of QRE reporting. The findings have implications for the creation and implementation of successful QRE reporting processes in community pharmacies. Implementing effective QRE reporting tools is paramount to ensuring that pharmacies report and learn from QREs. Dynamic QRE reporting tools that are modern, up to date, integrated into workflow, easy to use, and quick have been shown to be the most effective. Copyright © 2012 Elsevier Inc. All rights reserved.

  2. Dyadic Processes in Early Marriage: Attributions, Behavior, and Marital Quality

    ERIC Educational Resources Information Center

    Durtschi, Jared A.; Fincham, Frank D.; Cui, Ming; Lorenz, Frederick O.; Conger, Rand D.

    2011-01-01

    Marital processes in early marriage are important for understanding couples' future marital quality. Spouses' attributions about a partner's behavior have been linked to marital quality, yet the mechanisms underlying this association remain largely unknown. When we used couple data from the Family Transitions Project (N = 280 couples) across the…

  3. Lifetime suicide attempt history, quality of life, and objective functioning among HIV/AIDS patients with alcohol and illicit substance use disorders.

    PubMed

    Walter, Kimberly N; Petry, Nancy M

    2016-05-01

    This cross-sectional study evaluated lifetime prevalence of suicide attempts in 170 HIV/AIDS patients with substance use disorders and the impact of suicide attempt history on subjective indices of quality of life and objective indices of cognitive and physical functioning. All patients met the diagnostic criteria for past-year cocaine or opioid use disorders and 27% of patients also had co-occurring alcohol use disorders. Compared to their counterparts without a history of a suicide attempt, patients with a history of a suicide attempt (n = 60, 35.3%) had significantly poorer emotional and cognitive quality of life scores (ps < .05), but not physical, social, or functional/global quality-of-life scores. Lifetime suicide attempt status was unrelated to objective indices of cognitive functioning, but there was a non-significant trend (p = .07) toward lower viral loads in those with a lifetime suicide attempt relative to those without. The findings indicate that suicide attempt histories are prevalent among HIV/AIDS patients with substance use disorders and relate to poorer perceived emotional and cognitive quality of life, but not objective functioning. HIV/AIDS patients with substance use disorders should be screened for lifetime histories of suicide attempts and offered assistance to improve perceived emotional and cognitive functioning. © The Author(s) 2016.

  4. Quality Improvement Process in a Large Intensive Care Unit: Structure and Outcomes.

    PubMed

    Reddy, Anita J; Guzman, Jorge A

    2016-11-01

    Quality improvement in the health care setting is a complex process, and even more so in the critical care environment. The development of intensive care unit process measures and quality improvement strategies is associated with improved outcomes, but these should be individualized to each medical center, as structure and culture can differ from institution to institution. The purpose of this report is to describe the structure of quality improvement processes within a large medical intensive care unit while using examples of the study institution's successes and challenges in the areas of stat antibiotic administration, reduction in blood product waste, central line-associated bloodstream infections, and medication errors. © The Author(s) 2015.

  5. Optimization of multi-objective integrated process planning and scheduling problem using a priority based optimization algorithm

    NASA Astrophysics Data System (ADS)

    Ausaf, Muhammad Farhan; Gao, Liang; Li, Xinyu

    2015-12-01

    To increase the overall performance of modern manufacturing systems, effective integration of process planning and scheduling functions has been an important area of consideration among researchers. Owing to the complexity of handling process planning and scheduling simultaneously, most research has been limited to solving the integrated process planning and scheduling (IPPS) problem for a single objective function. As there are many conflicting objectives when dealing with process planning and scheduling, real-world problems cannot be fully captured by considering only a single objective for optimization. Therefore, considering the multi-objective IPPS (MOIPPS) problem is inevitable. Unfortunately, only a handful of research papers are available on solving the MOIPPS problem. In this paper, an optimization algorithm for solving the MOIPPS problem is presented. The proposed algorithm uses a set of dispatching rules coupled with priority assignment to optimize the IPPS problem for various objectives such as makespan, total machine load, and total tardiness. A fixed-size external archive coupled with a crowding distance mechanism is used to store and maintain the non-dominated solutions. To compare the results with other algorithms, a C-metric based method has been used. Instances from four recent papers have been solved to demonstrate the effectiveness of the proposed algorithm. The experimental results show that the proposed method is an efficient approach for solving the MOIPPS problem.
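
    The archive mechanism mentioned in the abstract can be sketched generically. The snippet below is an illustration, not the authors' algorithm: it maintains a fixed-size set of non-dominated objective vectors (e.g., makespan, total machine load, total tardiness, all minimized) and trims it by crowding distance; the objective values are made up.

```python
# Minimal sketch (assumed, generic version): fixed-size external archive of
# non-dominated solutions, trimmed by crowding distance.
import numpy as np

def dominates(a, b):
    """True if minimization vector a Pareto-dominates b (all <=, at least one <)."""
    return np.all(a <= b) and np.any(a < b)

def trim_by_crowding(archive, max_size):
    """Keep the max_size entries with the largest crowding distance."""
    pts = np.asarray(archive, dtype=float)
    n, m = pts.shape
    dist = np.zeros(n)
    for j in range(m):
        order = np.argsort(pts[:, j])
        span = pts[order[-1], j] - pts[order[0], j] or 1.0
        dist[order[0]] = dist[order[-1]] = np.inf   # always keep boundary points
        for k in range(1, n - 1):
            dist[order[k]] += (pts[order[k + 1], j] - pts[order[k - 1], j]) / span
    keep = np.argsort(-dist)[:max_size]
    return [archive[i] for i in keep]

def update_archive(archive, candidate, max_size=20):
    """Insert candidate if non-dominated; drop entries it dominates; trim if full."""
    cand = np.asarray(candidate, dtype=float)
    if any(dominates(np.asarray(s), cand) for s in archive):
        return archive                              # candidate is dominated
    archive = [s for s in archive if not dominates(cand, np.asarray(s))]
    archive.append(list(candidate))
    if len(archive) > max_size:
        archive = trim_by_crowding(archive, max_size)
    return archive

# Usage: feed objective vectors of candidate schedules into the archive.
archive = []
for obj in [(120, 300, 15), (110, 320, 20), (130, 280, 10), (115, 310, 12)]:
    archive = update_archive(archive, obj)
print(archive)
```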

  6. Quality risk management of top spray fluidized bed process for antihypertensive drug formulation with control strategy engendered by Box-behnken experimental design space.

    PubMed

    Mukharya, Amit; Patel, Paresh U; Shenoy, Dinesh; Chaudhary, Shivang

    2013-01-01

    Lacidipine (LCDP) is a very poorly soluble and highly biovariable calcium channel blocker used in the treatment of hypertension. To increase its apparent solubility and to reduce its biovariability, solid dispersion fluid bed processing technology was explored, as it produces highly dispersible granules with a characteristic porous structure that enhances dispersibility, wettability, blend uniformity (by dissolving and spraying a solution of actives), flowability and compressibility of granules for tableting, and reduces variability through uniform drug-binder solution distribution on carrier molecules. The main objective of this quality risk management (QRM) study is to provide a sophisticated "robust and rugged" Fluidized Bed Process (FBP) for the preparation of LCDP tablets with desired quality (stability) and performance (dissolution) by the quality by design (QbD) concept. This study principally focuses on a thorough mechanistic understanding of the FBP by which it is developed and scaled up, with knowledge of the critical risks involved in the manufacturing process, analyzed by risk assessment tools such as Qualitative Initial Risk-based Matrix Analysis (IRMA) and Quantitative Failure Mode Effective Analysis (FMEA), to identify and rank parameters with the potential to have an impact on In Process/Finished Product Critical Quality Attributes (IP/FP CQAs). These Critical Process Parameters (CPPs) were further refined by DoE and MVDA to develop a design space with Real Time Release Testing (RTRT), leading to the implementation of a control strategy to achieve consistent finished product quality at lab scale itself and prevent possible product failure at larger manufacturing scale.

  7. Faculty' Technology Barriers Faced within the Framework of Quality Processes: SAU Sample

    ERIC Educational Resources Information Center

    Elmas, Muzaffer

    2012-01-01

    This research was carried out to determine technology barriers faced by the instructors within the framework of quality processes conducted at the University of Sakarya. Therefore, technology barriers encountered in the process of teaching while using web sites developed in order to manage quality operations from a single center were examined…

  8. Quality assurance and accreditation.

    PubMed

    1997-01-01

    In 1996, the Joint Commission International (JCI), which is a partnership between the Joint Commission on Accreditation of Healthcare Organizations and Quality Healthcare Resources, Inc., became one of the contractors of the Quality Assurance Project (QAP). JCI recognizes the link between accreditation and quality, and uses a collaborative approach to help a country develop national quality standards that will improve patient care, satisfy patient-centered objectives, and serve the interest of all affected parties. The implementation of good standards provides support for the good performance of professionals, introduces new ideas for improvement, enhances the quality of patient care, reduces costs, increases efficiency, strengthens public confidence, improves management, and enhances the involvement of the medical staff. Such good standards are objective and measurable; achievable with current resources; adaptable to different institutions and cultures; and demonstrate autonomy, flexibility, and creativity. The QAP offers the opportunity to approach accreditation through research efforts, training programs, and regulatory processes. QAP work in the area of accreditation has been targeted for Zambia, where the goal is to provide equal access to cost-effective, quality health care; Jordan, where a consensus process for the development of standards, guidelines, and policies has been initiated; and Ecuador, where JCI has been asked to help plan an approach to the evaluation and monitoring of the health care delivery system.

  9. Best Practices in Digital Object Development for Education: Promoting Excellence and Innovation in Instructional Quality and Assessment

    ERIC Educational Resources Information Center

    Reece, Amanda A.

    2016-01-01

    A program of development of online learning resources should provide content, resources, support and activities to promote excellence and innovation in instructional quality and assessment. This article provides details on five best practices in digital object development for teaching and learning. In addition, an evaluation of the learning object…

  10. Grey Relational Analysis Coupled with Principal Component Analysis for Optimization of Stereolithography Process to Enhance Part Quality

    NASA Astrophysics Data System (ADS)

    Raju, B. S.; Sekhar, U. Chandra; Drakshayani, D. N.

    2017-08-01

    The paper investigates optimization of the stereolithography process for SL5530 epoxy resin material to enhance part quality. The major performance characteristics selected to evaluate the process are tensile strength, flexural strength, impact strength and density, and the corresponding process parameters are layer thickness, orientation and hatch spacing. Because the process intrinsically involves tuning multiple parameters, grey relational analysis, which uses the grey relational grade as a performance index, is adopted to determine the optimal combination of process parameters. Moreover, principal component analysis is applied to evaluate the weighting values corresponding to the various performance characteristics so that their relative importance can be properly and objectively determined. The results of confirmation experiments reveal that grey relational analysis coupled with principal component analysis can effectively acquire the optimal combination of process parameters. Hence, this confirms that the proposed approach can be a useful tool for improving process parameters in the stereolithography process, which is valuable information for machine designers as well as RP machine users.
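
    As a rough illustration of the workflow described (grey relational coefficients combined with weights derived from principal-component loadings to yield one grade per experimental run), the sketch below uses made-up response values; it is not the paper's data or its exact weighting scheme.

```python
# Minimal sketch (illustrative data): grey relational grades weighted by
# first-principal-component loadings for larger-the-better responses.
import numpy as np

# Rows: experimental runs; columns: hypothetical responses
# (tensile, flexural, impact strength, density).
Y = np.array([[52.0, 78.0, 3.1, 1.18],
              [55.0, 74.0, 3.4, 1.20],
              [49.0, 80.0, 2.9, 1.17],
              [57.0, 82.0, 3.6, 1.21]])

# 1. Larger-the-better normalization to [0, 1].
norm = (Y - Y.min(axis=0)) / (Y.max(axis=0) - Y.min(axis=0))

# 2. Grey relational coefficients (distinguishing coefficient zeta = 0.5).
zeta = 0.5
delta = 1.0 - norm                      # deviation from the ideal sequence
grc = (delta.min() + zeta * delta.max()) / (delta + zeta * delta.max())

# 3. Weights from PCA: squared loadings of the first principal component
#    (one common weighting choice, assumed here).
cov = np.cov(norm, rowvar=False)
eigvals, eigvecs = np.linalg.eigh(cov)
first_pc = eigvecs[:, np.argmax(eigvals)]
weights = first_pc**2 / np.sum(first_pc**2)

# 4. Grey relational grade per run; the highest grade marks the best setting.
grade = grc @ weights
print("weights:", np.round(weights, 3))
print("grades: ", np.round(grade, 3))
print("best run:", int(np.argmax(grade)) + 1)
```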

  11. Quality Management and Enhancement Processes in UK Business Schools: A Review

    ERIC Educational Resources Information Center

    Hodgkinson, Myra; Kelly, Mike

    2007-01-01

    Purpose: The aim of this paper is to provide insights into the processes that can be and have been adopted by UK business schools as they attempt to meet the Quality Assurance Agency's concern with the standard of quality management and enhancement. Design/methodology/approach: A review of the literature provides interpretations of quality,…

  12. Influence of grid control and object detection on radiation exposure and image quality using mobile C-arms - first results.

    PubMed

    Gosch, D; Ratzmer, A; Berauer, P; Kahn, T

    2007-09-01

    The objective of this study was to examine the extent to which the image quality on mobile C-arms can be improved by an innovative exposure rate control system (grid control). In addition, the possible dose reduction in the pulsed fluoroscopy mode using 25 pulses/sec produced by automatic adjustment of the pulse rate through motion detection was to be determined. As opposed to conventional exposure rate control systems, which use a measuring circle in the center of the field of view, grid control is based on a fine mesh of square cells which are overlaid on the entire fluoroscopic image. The system uses only those cells for exposure control that are covered by the object to be visualized. This is intended to ensure optimally exposed images, regardless of the size, shape and position of the object to be visualized. The system also automatically detects any motion of the object. If a pulse rate of 25 pulses/sec is selected and no changes in the image are observed, the pulse rate used for pulsed fluoroscopy is gradually reduced. This may decrease the radiation exposure. The influence of grid control on image quality was examined using an anthropomorphic phantom. The dose reduction achieved with the help of object detection was determined by evaluating the examination data of 146 patients from 5 different countries. The image of the static phantom made with grid control was always optimally exposed, regardless of the position of the object to be visualized. The average dose reduction when using 25 pulses/sec resulting from object detection and automatic down-pulsing was 21 %, and the maximum dose reduction was 60 %. Grid control facilitates C-arm operation, since optimum image exposure can be obtained independently of object positioning. Object detection may lead to a reduction in radiation exposure for the patient and operating staff.

  13. Comparisons of Observed Process Quality in German and American Infant/Toddler Programs

    ERIC Educational Resources Information Center

    Tietze, Wolfgang; Cryer, Debby

    2004-01-01

    Observed process quality in infant/toddler classrooms was compared in Germany (n = 75) and the USA (n = 219). Process quality was assessed with the Infant/Toddler Environment Rating Scale(ITERS) and parent attitudes about ITERS content with the ITERS Parent Questionnaire (ITERSPQ). The ITERS had comparable reliabilities in the two countries and…

  14. Quality indicators and specifications for strategic and support processes in laboratory medicine.

    PubMed

    Ricós, Carmen; Biosca, Carme; Ibarz, Mercè; Minchinela, Joana; Llopis, Maantonia; Perich, Carmen; Alsina, Jesus; Alvarez, Virtudes; Doménech, Vicenta; Pastor, Rosa Ma; Sansalvador, Mireia; Isern, Gloria Trujillo; Navarro, Conrad Vilanova

    2008-01-01

    This work is the second part of a study regarding indicators and quality specifications for the non-analytical processes in laboratory medicine. Five primary care and five hospital laboratories agreed on the indicators for two strategic processes (quality planning and project development) and various support processes (client relationships, instrument and infrastructure maintenance, safety and risk prevention, purchases and storage, personnel training). In the majority of cases, the median values recorded over 1 year are considered to be the state of the art in our setting and are proposed as the quality specifications for the indicators stated. Values have been stratified by primary care versus hospital laboratory for referred tests, and by personnel group for training. In some cases, the specifications have been set equal to zero events, such as serious incidents in the infrastructure maintenance process and number of work accidents in the safety and risk prevention process. In light of this study, an effort is needed to optimize decisions regarding corrective actions and to move from a subjective individual criterion to systematic and comparative management. This preliminary study provides a comprehensive vision of a subject that could motivate further research and advances in the quality of laboratory services.

  15. [Quality indicators in the storage and dispensing process in a Hospital Pharmacy].

    PubMed

    Rabuñal-Álvarez, M T; Calvin-Lamas, M; Feal-Cortizas, B; Martínez-López, L M; Pedreira-Vázquez, I; Martín-Herranz, M I

    2014-01-01

    To establish indicators for evaluating the quality of the storage and dispensing processes related to semiautomatic vertical (SAVCS) and horizontal (SAHCS) carousel systems, a descriptive observational study was conducted between January and December 2012. Quality indicators were defined, a target value was established for each, and the value obtained in 2012 was calculated. Five quality indicators in the process of storage and dispensing of drugs were defined and calculated: indicator 1, error filling unidose trolleys: target (<1.67%), obtained (1.03%); indicator 2, filling accuracy of unidose trolleys using an SAVCS: target (<15%), obtained (11.5%); indicator 3, reliability of drug inventory in the process of drug entries using an SAHCS: target (<15%), obtained (6.53%); indicator 4, reliability of drug inventory in the picking process of orders for replacement stock of clinical units using an SAHCS: target (<10%), obtained (1.97%); indicator 5, accuracy of the picking process of drug orders using an SAHCS: target (<10%), obtained (10.41%). Establishing indicators made it possible to assess the quality, in terms of safety, precision and reliability, of semiautomatic systems for storing and dispensing drugs. Copyright © 2014 SECA. Published by Elsevier Espana. All rights reserved.
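
    The target-versus-obtained comparison reads naturally as a small set of checks. The sketch below reuses the figures quoted in the abstract, but the data structure and pass/fail logic are purely illustrative.

```python
# Minimal sketch: comparing obtained indicator rates against their targets
# (values taken from the abstract; structure is illustrative only).
indicators = {
    "1. error filling unidose trolleys":               {"target": 1.67, "obtained": 1.03},
    "2. filling accuracy, unidose trolleys (SAVCS)":   {"target": 15.0, "obtained": 11.5},
    "3. inventory reliability, drug entries (SAHCS)":  {"target": 15.0, "obtained": 6.53},
    "4. inventory reliability, stock picking (SAHCS)": {"target": 10.0, "obtained": 1.97},
    "5. accuracy of drug-order picking (SAHCS)":       {"target": 10.0, "obtained": 10.41},
}

for name, v in indicators.items():
    status = "meets target" if v["obtained"] < v["target"] else "exceeds target"
    print(f"{name}: obtained {v['obtained']}% vs target <{v['target']}% -> {status}")
```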

  16. The Aspects About of Objectively Appraisals of Modeling Gypsum Quality and Composites of Phonic-Absorbent and Orthopedic on Base of Gypsum

    NASA Astrophysics Data System (ADS)

    Pop, P. A.; Ungur, P. A.; Lazar, L.; Marcu, F.

    2009-11-01

    EU norms on protection of the environment, both outdoor and indoor, and on human health have led to the development of new materials based on lightweight, porous airborne raw material with high thermal and phonic-absorbent properties. The quality and light weight of α- and β-modeling gypsum plaster depend on many factors, such as the fabrication process, granulation, roasting temperature, working temperature, environment, additives used, breakage, etc. The objective appraisal of modeling gypsum quality also depends on the selection of proper test methods, which are laid down in norms, standards and recommendations. The Romanian standards SR EN 13279-1/2005 and SR EN 13279-2/2005, adapted from the EU norms EN 13279-1/2004 and EN 13279-2/2004, specify the characteristic tests for the gypsum family, such as granulometric analysis, determination of the water/plaster ratio, setting time, mechanical characteristics, adhesion and water retention. For plasters with special uses (phonic-absorbent and orthopedic materials, etc.) these determinations are not conclusive, and additional parameters must be determined, such as the elastic constant, phonic-absorption coefficient, porosity, workability, etc., which calls for the completion of the norms and standards with new determinations.

  17. Leading quality through the development of a multi-year corporate quality plan: sharing The Ottawa Hospital experience.

    PubMed

    Hunter, Linda; Myles, Joanne; Worthington, James R; Lebrun, Monique

    2011-01-01

    This article discusses the background and process for developing a multi-year corporate quality plan. The Ottawa Hospital's goal is to be a top 10% performer in quality and patient safety in North America. In order to create long-term measurable and sustainable changes in the quality of patient care, The Ottawa Hospital embarked on the development of a three-year strategic corporate quality plan. This was accomplished by engaging the organization at all levels and defining quality frameworks, aligning with internal and external expectations, prioritizing strategic goals, articulating performance measurements and reporting to stakeholders while maintaining a transparent communication process. The plan was developed through an iterative process that engaged a broad base of health professionals, physicians, support staff, administration and senior management. A literature review of quality frameworks was undertaken, a Quality Plan Working Group was established, 25 key stakeholder interviews were conducted and 48 clinical and support staff consultations were held. The intent was to gather information on current quality initiatives and challenges encountered and to prioritize corporate goals and then create the quality plan. Goals were created and then prioritized through an affinity exercise. Action plans were developed for each goal and included objectives, tasks and activities, performance measures (structure, process and outcome), accountabilities and timelines. This collaborative methodology resulted in the development of a three-year quality plan. Six corporate goals were outlined by the tenets of the quality framework for The Ottawa Hospital: access to care, appropriate care (effective and efficient), safe care and satisfaction with care. Each of the six corporate goals identified objectives and supporting action plans with accountabilities outlining what would be accomplished in years one, two and three. The three-year quality plan was approved by senior

  18. In-line quality control of moving objects by means of spectral-domain OCT

    NASA Astrophysics Data System (ADS)

    Markl, Daniel; Hannesschläger, Günther; Buchsbaum, Andreas; Sacher, Stephan; Khinast, Johannes G.; Leitner, Michael

    2014-08-01

    In-line quality control of intermediate and final products is essential in various industries. This may imply determining the thickness of a foil or evaluating the homogeneity of coating applied to a pharmaceutical tablet. Such qualitative and quantitative monitoring in a depth-resolved manner can be accomplished using optical coherence tomography (OCT). In-line quality control based on OCT requires additional consideration of motion effects for the system design as well as for data interpretation. This study focuses on transverse motion effects that can arise in spectral-domain (SD-) OCT systems. The impact of a transverse movement is analyzed for a constant relative speed difference of up to 0.7 m/s between sample and sensor head. In particular, transverse motion affects OCT system properties such as the beam displacement (distance between adjacent A-scans) and transverse resolution. These properties were evaluated theoretically and experimentally for OCT images of a resolution target and pharmaceutical film-coated tablets. Both theoretical and experimental analyses highlight the shift of the transverse resolution limiting factor from the optics to the beam displacement above a relative speed difference between sensor head and sample of 0.42 m/s (for the presented SD-OCT setup). Speeds above 0.4 m/s are often demanded when monitoring industrial processes, such as a coating process when producing film-coated tablets. This emphasizes the importance of fast data acquisition when using OCT as an in-line quality control tool.
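
    The displacement argument can be made concrete with a one-line formula: the distance between adjacent A-scans is the relative speed divided by the A-scan rate, and transverse resolution is limited by whichever is larger, the optical spot size or that displacement. The A-scan rate and spot size below are assumptions chosen so the crossover lands near the reported 0.42 m/s; they are not the study's system parameters.

```python
# Minimal sketch with assumed numbers (A-scan rate and spot size are
# illustrative, not the values of the SD-OCT system in the study).
def beam_displacement_um(speed_m_s, a_scan_rate_hz=30e3):
    """Distance the beam travels between adjacent A-scans, in micrometres."""
    return speed_m_s / a_scan_rate_hz * 1e6

SPOT_SIZE_UM = 14.0  # assumed optical spot size

for v in (0.1, 0.3, 0.42, 0.7):
    disp = beam_displacement_um(v)
    limit = max(SPOT_SIZE_UM, disp)
    factor = "beam displacement" if disp > SPOT_SIZE_UM else "optics"
    print(f"{v:4.2f} m/s: displacement {disp:5.1f} um, "
          f"transverse resolution ~{limit:5.1f} um (limited by {factor})")
```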

  19. Effects of processing conditions on mammographic image quality.

    PubMed

    Braeuning, M P; Cooper, H W; O'Brien, S; Burns, C B; Washburn, D B; Schell, M J; Pisano, E D

    1999-08-01

    Any given mammographic film will exhibit changes in sensitometric response and image resolution as processing variables are altered. Developer type, immersion time, and temperature have been shown to affect the contrast of the mammographic image and thus lesion visibility. The authors evaluated the effect of altering processing variables, including film type, developer type, and immersion time, on the visibility of masses, fibrils, and specks in a standard mammographic phantom. Images of a phantom obtained with two screen types (Kodak Min-R and Fuji) and five film types (Kodak Min-R M, Min-R E, Min-R H, Fuji UM-MA HC, and DuPont Microvision-C) were processed with five different developer chemicals (Autex SE, DuPont HSD, Kodak RP, Picker 3-7-90, and White Mountain) at four different immersion times (24, 30, 36, and 46 seconds). Processor chemical activity was monitored with sensitometric strips, and developer temperatures were continuously measured. The film images were reviewed by two board-certified radiologists and two physicists with expertise in mammography quality control and were scored based on the visibility of calcifications, masses, and fibrils. Although the differences in the absolute scores were not large, the Kodak Min-R M and Fuji films exhibited the highest scores, and images developed in White Mountain and Autex chemicals exhibited the highest scores. For any film, several processing chemicals may be used to produce images of similar quality. Extended processing may no longer be necessary.

  20. Activation of response force by self-splitting objects: where are the limits of feedforward Gestalt processing?

    PubMed

    Schmidt, Filipp; Weber, Andreas; Schmidt, Thomas

    2014-08-21

    Most objects can be recognized easily even when they are partly occluded. This also holds when several overlapping objects share the same surface features (self-splitting objects) which is an illustration of the grouping principle of Good Gestalt. We employed outline and filled contour stimuli in a primed flanker task to test whether the processing of self-splitting objects is in accordance with a simple feedforward model. We obtained priming effects in response time and response force for both types of stimuli, even when increasing the number of occluders up to three. The results for outline contours were in full accordance with a feedforward account. This was not the case for the results for filled contours (i.e., for self-splitting objects), especially under conditions of strong occlusion. We conclude that the implementation of the Good Gestalt principle is fast but still based on recurrent processing. © 2014 ARVO.

  1. A unified framework of unsupervised subjective optimized bit allocation for multiple video object coding

    NASA Astrophysics Data System (ADS)

    Chen, Zhenzhong; Han, Junwei; Ngan, King Ngi

    2005-10-01

    MPEG-4 treats a scene as a composition of several objects or so-called video object planes (VOPs) that are separately encoded and decoded. Such a flexible video coding framework makes it possible to code different video objects with different distortion scales. It is necessary to analyze the priority of the video objects according to their semantic importance, intrinsic properties and psycho-visual characteristics so that the bit budget can be distributed properly among video objects to improve the perceptual quality of the compressed video. This paper aims to provide an automatic video object priority definition method based on an object-level visual attention model and further proposes an optimization framework for video object bit allocation. One significant contribution of this work is that human visual system characteristics are incorporated into the video coding optimization process. Another advantage is that the priority of the video object can be obtained automatically instead of fixing weighting factors before encoding or relying on user interactivity. To evaluate the performance of the proposed approach, we compare it with traditional verification model bit allocation and the optimal multiple video object bit allocation algorithms. Compared with traditional bit allocation algorithms, the objective quality of the object with higher priority is significantly improved under this framework. These results demonstrate the usefulness of this unsupervised subjective quality lifting framework.
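
    A minimal sketch of the underlying idea of priority-driven allocation (not the paper's optimization framework): once per-object priorities are available from an attention model, a frame's bit budget can be split in proportion to them. The priority values and budget below are hypothetical.

```python
# Minimal sketch: proportional bit allocation across video object planes,
# so higher-priority objects receive more bits and hence lower distortion.
def allocate_bits(total_bits, priorities):
    """priorities: per-VOP saliency/importance scores (any positive scale)."""
    s = sum(priorities)
    return [round(total_bits * p / s) for p in priorities]

# Hypothetical scene: foreground speaker, secondary object, background.
priorities = [0.6, 0.3, 0.1]
print(allocate_bits(total_bits=200_000, priorities=priorities))
# -> [120000, 60000, 20000]
```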

  2. Assessing the influence of component processing and donor characteristics on quality of red cell concentrates using quality control data.

    PubMed

    Jordan, A; Chen, D; Yi, Q-L; Kanias, T; Gladwin, M T; Acker, J P

    2016-07-01

    Quality control (QC) data collected by blood services are used to monitor production and to ensure compliance with regulatory standards. We demonstrate how analysis of quality control data can be used to highlight the sources of variability within red cell concentrates (RCCs). We merged Canadian Blood Services QC data with manufacturing and donor records for 28 227 RCC between June 2011 and October 2014. Units were categorized based on processing method, bag manufacturer, donor age and donor sex, then assessed based on product characteristics: haemolysis and haemoglobin levels, unit volume, leucocyte count and haematocrit. Buffy-coat method (top/bottom)-processed units exhibited lower haemolysis than units processed using the whole-blood filtration method (top/top). Units from female donors exhibited lower haemolysis than male donations. Processing method influenced unit volume and the ratio of additive solution to residual plasma. Stored red blood cell characteristics are influenced by prestorage processing and donor factors. Understanding the relationship between processing, donors and RCC quality will help blood services to ensure the safety of transfused products. © 2016 International Society of Blood Transfusion.
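
    The kind of analysis described, merging QC measurements with production and donor attributes and then comparing groups, can be sketched with a small synthetic table; the column names and values below are illustrative, not Canadian Blood Services data.

```python
# Minimal sketch (synthetic records): merge QC results with production/donor
# attributes and compare mean haemolysis across processing method and donor sex.
import pandas as pd

qc = pd.DataFrame({
    "unit_id": [1, 2, 3, 4, 5, 6],
    "haemolysis_pct": [0.12, 0.25, 0.10, 0.31, 0.15, 0.22],
})
production = pd.DataFrame({
    "unit_id": [1, 2, 3, 4, 5, 6],
    "method": ["buffy-coat", "whole-blood filtration"] * 3,
    "donor_sex": ["F", "M", "F", "M", "M", "F"],
})

merged = qc.merge(production, on="unit_id")
print(merged.groupby(["method", "donor_sex"])["haemolysis_pct"].mean())
```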

  3. Quality Time: Temporal Constraints to Continual Process Development in the Air Force

    DTIC Science & Technology

    2017-06-01

    quality is baked into the process or quality must be obtained through testing and correction of deficiencies. Furthermore, the Air Force concluded...that if quality is baked in it comes “for free” but if quality must be inspected or tested in it comes at a cost. As a manager or a leader, it is

  4. A Model of Process-Based Automation: Cost and Quality Implications in the Medication Management Process

    ERIC Educational Resources Information Center

    Spaulding, Trent Joseph

    2011-01-01

    The objective of this research is to understand how a set of systems, as defined by the business process, creates value. The three studies contained in this work develop the model of process-based automation. The model states that complementarities among systems are specified by handoffs in the business process. The model also provides theory to…

  5. Expanded opportunities of THz passive camera for the detection of concealed objects

    NASA Astrophysics Data System (ADS)

    Trofimov, Vyacheslav A.; Trofimov, Vladislav V.; Kuchik, Igor E.

    2013-10-01

    Among security problems, the detection of objects implanted in the human or animal body is an urgent one. At present, the main tool for detecting such objects is X-ray imaging. However, X-rays are ionizing radiation and therefore cannot be used frequently. Another way of solving the problem is passive THz imaging. In our opinion, a passive THz camera may help to detect an object implanted in the human body under certain conditions. The physical basis for this possibility is the temperature trace on the human skin resulting from the difference in temperature between the object and the surrounding parts of the body. Modern passive THz cameras do not have sufficient temperature resolution to see this difference, which is why we use computer processing to enhance the camera's effective resolution for this application. After computer processing of images captured by the passive THz camera TS4, developed by ThruVision Systems Ltd., we can see a pronounced temperature trace on the skin from water drunk, or other food eaten, by a person. Nevertheless, many difficulties remain before this problem is fully solved. We also illustrate an improvement in the quality of images captured by commercially available passive THz cameras using computer processing. In some cases, noise in the image can be fully suppressed without loss of image quality. Computer processing of THz images of objects concealed on the human body can improve them many times over. Consequently, the instrumental resolution of such a device may be increased without any additional engineering effort.

  6. Integrating Microscopic Analysis into Existing Quality Assurance Processes

    NASA Astrophysics Data System (ADS)

    Frühberger, Peter; Stephan, Thomas; Beyerer, Jürgen

    When technical goods, like mainboards and other electronic components, are produced, quality assurance (QA) is very important. To achieve this goal, different optical microscopes can be used to analyze a variety of specimens to gain comprehensive information by combining the acquired sensor data. In many industrial processes, cameras are used to examine these technical goods. Those cameras can analyze complete boards at once and offer a high level of accuracy when used for completeness checks. When small defects, e.g. soldered points, need to be examined in detail, those wide-area cameras are limited. Microscopes with large magnification need to be used to analyze those critical areas. But microscopes alone cannot fulfill this task within a limited time schedule, because microscopic analysis of complete motherboards of a certain size is time-consuming. Microscopes are limited concerning their depth of field and depth of focus, which is why additional components like XY moving tables need to be used to examine the complete surface. Yet today's industrial production quality standards require 100% inspection of the soldered components within a given time schedule. This level of quality, while keeping inspection time low, can only be achieved by combining multiple inspection devices in an optimized manner. This paper presents results and methods for combining industrial cameras with microscopy using a classification-based approach, intended to keep already deployed QA processes in place while extending them to increase the quality level of the produced technical goods and maintain high throughput.

  7. Improving program documentation quality through the application of continuous improvement processes.

    PubMed

    Lovlien, Cheryl A; Johansen, Martha; Timm, Sandra; Eversman, Shari; Gusa, Dorothy; Twedell, Diane

    2007-01-01

    Maintaining the integrity of record keeping and retrievable information related to the provision of continuing education credit creates challenges for a large organization. Accurate educational program documentation is vital to support the knowledge and professional development of nursing staff. Quality review and accurate documentation of programs for nursing staff development occurred at one institution through the use of continuous improvement principles. Integration of the new process into the current system maintains the process of providing quality record keeping.

  8. Interrupted Time Series Versus Statistical Process Control in Quality Improvement Projects.

    PubMed

    Andersson Hagiwara, Magnus; Andersson Gäre, Boel; Elg, Mattias

    2016-01-01

    To measure the effect of quality improvement interventions, it is appropriate to use analysis methods that measure data over time. Examples of such methods include statistical process control analysis and interrupted time series with segmented regression analysis. This article compares the use of statistical process control analysis and interrupted time series with segmented regression analysis for evaluating the longitudinal effects of quality improvement interventions, using an example study on an evaluation of a computerized decision support system.
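
    The two analysis styles compared in the article can be sketched side by side on synthetic monthly data: an individuals control chart built from the baseline period, and a segmented regression with level-change and slope-change terms at the intervention. The data and effect sizes below are made up.

```python
# Minimal sketch (synthetic data): statistical process control vs interrupted
# time series with segmented regression around an intervention at month 12.
import numpy as np

rng = np.random.default_rng(1)
months = np.arange(24)
intervention = 12
# Outcome drops (improves) after the intervention month.
y = 20 - 0.1 * months - 4.0 * (months >= intervention) + rng.normal(0, 1, 24)

# --- Statistical process control: individuals chart from the baseline period ---
baseline = y[:intervention]
centre, sigma = baseline.mean(), baseline.std(ddof=1)
ucl, lcl = centre + 3 * sigma, centre - 3 * sigma
signals = np.where((y > ucl) | (y < lcl))[0]
print("months outside baseline control limits:", signals)

# --- Interrupted time series: segmented regression ---
post = (months >= intervention).astype(float)        # level-change term
time_since = np.clip(months - intervention, 0, None)  # slope-change term
X = np.column_stack([np.ones_like(months), months, post, time_since])
coef, *_ = np.linalg.lstsq(X, y, rcond=None)
print("baseline slope %.2f, level change %.2f, slope change %.2f"
      % (coef[1], coef[2], coef[3]))
```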

  9. Are hospital process quality indicators influenced by socio-demographic health determinants.

    PubMed

    Buja, Alessandra; Canavese, Daniel; Furlan, Patrizia; Lago, Laura; Saia, Mario; Baldo, Vincenzo

    2015-10-01

    This population-level health service study aimed to address whether hospitals assure the same quality of care to people in equal need, i.e. to see if any associations exist between social determinants and adherence to four hospital process indicators clearly identified as being linked to better health outcomes for patients. This was a retrospective cohort study based on administrative data collected in the Veneto Region (northeast Italy). We included residents of the Veneto Region hospitalized for ST-segment elevation myocardial infarction (STEMI) or acute myocardial infarction (AMI), hip fracture, or cholecystitis, and women giving birth, who were discharged from any hospital operating under the Veneto Regional Health Service between January 2012 and December 2012. The following quality indicator rates were calculated: patients with STEMI-AMI treated with percutaneous coronary intervention, elderly patients with hip fractures who underwent surgery within 48 h of admission, laparoscopic cholecystectomies and women who underwent cesarean section. Multilevel, multivariable logistic regression analyses were conducted to test the association between age, gender, formal education or citizenship and the quality of hospital care processes. All the inpatient hospital care process quality indicators measured were associated with an undesirable number of disparities concerning the social determinants. Monitoring the evidence-based hospital health care process indicators reveals undesirable disparities. Administrative data sets are of considerable practical value in broad-based quality assessments and as a screening tool, also in the health disparities domain. © The Author 2015. Published by Oxford University Press on behalf of the European Public Health Association. All rights reserved.

  10. Reentrant processing mediates object substitution masking: comment on Põder (2013).

    PubMed

    Di Lollo, Vincent

    2014-01-01

    Object-substitution masking (OSM) occurs when a target stimulus and a surrounding mask are displayed briefly together, and the display then continues with the mask alone. Target identification is accurate when the stimuli co-terminate but is progressively impaired as the duration of the trailing mask is increased. In reentrant accounts, OSM is said to arise from iterative exchanges between brain regions connected by two-way pathways. In an alternative account, OSM is explained on the basis of exclusively feed-forward processes, without recourse to reentry. Here I show that the feed-forward account runs afoul of the extant phenomenological, behavioral, brain-imaging, and electrophysiological evidence. Further, the feed-forward assumption that masking occurs when attention finds a degraded target is shown to be entirely ad hoc. In contrast, the evidence is uniformly consistent with a reentrant-processing account of OSM.

  11. The NCC project: A quality management perspective

    NASA Technical Reports Server (NTRS)

    Lee, Raymond H.

    1993-01-01

    The Network Control Center (NCC) Project introduced the concept of total quality management (TQM) in mid-1990. The CSC project team established a program which focused on continuous process improvement in software development methodology and consistent deliveries of high quality software products for the NCC. The vision of the TQM program was to produce error free software. Specific goals were established to allow continuing assessment of the progress toward meeting the overall quality objectives. The total quality environment, now a part of the NCC Project culture, has become the foundation for continuous process improvement and has resulted in the consistent delivery of quality software products over the last three years.

  12. A Scale-up Approach for Film Coating Process Based on Surface Roughness as the Critical Quality Attribute.

    PubMed

    Yoshino, Hiroyuki; Hara, Yuko; Dohi, Masafumi; Yamashita, Kazunari; Hakomori, Tadashi; Kimura, Shin-Ichiro; Iwao, Yasunori; Itai, Shigeru

    2018-04-01

    Scale-up approaches for film coating process have been established for each type of film coating equipment from thermodynamic and mechanical analyses for several decades. The objective of the present study was to establish a versatile scale-up approach for film coating process applicable to commercial production that is based on critical quality attribute (CQA) using the Quality by Design (QbD) approach and is independent of the equipment used. Experiments on a pilot scale using the Design of Experiment (DoE) approach were performed to find a suitable CQA from surface roughness, contact angle, color difference, and coating film properties by terahertz spectroscopy. Surface roughness was determined to be a suitable CQA from a quantitative appearance evaluation. When surface roughness was fixed as the CQA, the water content of the film-coated tablets was determined to be the critical material attribute (CMA), a parameter that does not depend on scale or equipment. Finally, to verify the scale-up approach determined from the pilot scale, experiments on a commercial scale were performed. The good correlation between the surface roughness (CQA) and the water content (CMA) identified at the pilot scale was also retained at the commercial scale, indicating that our proposed method should be useful as a scale-up approach for film coating process.

  13. Occipital Alpha Activity during Stimulus Processing Gates the Information Flow to Object-Selective Cortex

    PubMed Central

    Zumer, Johanna M.; Scheeringa, René; Schoffelen, Jan-Mathijs; Norris, David G.; Jensen, Ole

    2014-01-01

    Given the limited processing capabilities of the sensory system, it is essential that attended information is gated to downstream areas, whereas unattended information is blocked. While it has been proposed that alpha band (8–13 Hz) activity serves to route information to downstream regions by inhibiting neuronal processing in task-irrelevant regions, this hypothesis remains untested. Here we investigate how neuronal oscillations detected by electroencephalography in visual areas during working memory encoding serve to gate information reflected in the simultaneously recorded blood-oxygenation-level-dependent (BOLD) signals recorded by functional magnetic resonance imaging in downstream ventral regions. We used a paradigm in which 16 participants were presented with faces and landscapes in the right and left hemifields; one hemifield was attended and the other unattended. We observed that decreased alpha power contralateral to the attended object predicted the BOLD signal representing the attended object in ventral object-selective regions. Furthermore, increased alpha power ipsilateral to the attended object predicted a decrease in the BOLD signal representing the unattended object. We also found that the BOLD signal in the dorsal attention network inversely correlated with visual alpha power. This is the first demonstration, to our knowledge, that oscillations in the alpha band are implicated in the gating of information from the visual cortex to the ventral stream, as reflected in the representationally specific BOLD signal. This link of sensory alpha to downstream activity provides a neurophysiological substrate for the mechanism of selective attention during stimulus processing, which not only boosts the attended information but also suppresses distraction. Although previous studies have shown a relation between the BOLD signal from the dorsal attention network and the alpha band at rest, we demonstrate such a relation during a visuospatial task, indicating

  14. Effects of process parameters on the molding quality of the micro-needle array

    NASA Astrophysics Data System (ADS)

    Qiu, Z. J.; Ma, Z.; Gao, S.

    2016-07-01

    The micro-needle array, used in medical applications, is a typical injection-molded product with microstructures. Due to its tiny micro-feature sizes and high aspect ratios, it is prone to short-shot defects, leading to poor molding quality. The injection molding process of the micro-needle array was studied in this paper to find the effects of the process parameters on molding quality and to provide theoretical guidance for the practical production of high-quality products. With the shrinkage ratio and warpage of the micro needles as the evaluation indices of molding quality, an orthogonal experiment was conducted and analysis of variance was carried out. From the results, the contribution rates were calculated to determine the influence of the various process parameters on molding quality. The single-parameter method was used to analyse the main process parameter. It was found that the contribution rate of the holding pressure on shrinkage ratio and warpage reached 83.55% and 94.71% respectively, far higher than that of the other parameters. The study revealed that holding pressure is the main factor affecting the molding quality of the micro-needle array, so it should be the focus in practical production in order to obtain plastic parts of high quality.
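
    The contribution-rate calculation referred to above is a share-of-sum-of-squares computation over the orthogonal array. The sketch below uses a made-up L9 design and warpage values (not the paper's data) purely to show the arithmetic.

```python
# Minimal sketch (illustrative L9 orthogonal array): contribution rates as each
# factor's share of the total sum of squares of the response.
import numpy as np

# L9(3^3) design: columns = holding pressure, melt temperature, injection speed.
levels = np.array([[0, 0, 0], [0, 1, 1], [0, 2, 2],
                   [1, 0, 1], [1, 1, 2], [1, 2, 0],
                   [2, 0, 2], [2, 1, 0], [2, 2, 1]])
warpage = np.array([0.42, 0.40, 0.41, 0.33, 0.31, 0.34, 0.22, 0.21, 0.23])

grand_mean = warpage.mean()
total_ss = np.sum((warpage - grand_mean) ** 2)
for j, name in enumerate(["holding pressure", "melt temperature", "injection speed"]):
    factor_ss = sum(
        (warpage[levels[:, j] == lvl].mean() - grand_mean) ** 2 * np.sum(levels[:, j] == lvl)
        for lvl in range(3)
    )
    print(f"{name}: contribution {100 * factor_ss / total_ss:.1f}%")
```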

  15. Rethinking infant knowledge: toward an adaptive process account of successes and failures in object permanence tasks.

    PubMed

    Munakata, Y; McClelland, J L; Johnson, M H; Siegler, R S

    1997-10-01

    Infants seem sensitive to hidden objects in habituation tasks at 3.5 months but fail to retrieve hidden objects until 8 months. The authors first consider principle-based accounts of these successes and failures, in which early successes imply knowledge of principles and failures are attributed to ancillary deficits. One account is that infants younger than 8 months have the object permanence principle but lack means-ends abilities. To test this, 7-month-olds were trained on means-ends behaviors and were tested on retrieval of visible and occluded toys. Means-ends demands were the same, yet infants made more toy-guided retrievals in the visible case. The authors offer an adaptive process account in which knowledge is graded and embedded in specific behavioral processes. Simulation models that learn gradually to represent occluded objects show how this approach can account for success and failure in object permanence tasks without assuming principles and ancillary deficits.

  16. Educational Process Quality in Preschools at the Individual Child Level: Findings from a German Study

    ERIC Educational Resources Information Center

    Smidt, Wilfried; Rossbach, Hans-Günther

    2016-01-01

    A large body of research has examined the quality of educational processes in preschools, but it has usually been studied at the group level. Thus, there is a lack of research on the quality of educational processes as experienced by individual children. Therefore, this study investigated the quality of educational processes in preschools at the…

  17. Influence of raw milk quality on processed dairy products: How do raw milk quality test results relate to product quality and yield?

    PubMed

    Murphy, Steven C; Martin, Nicole H; Barbano, David M; Wiedmann, Martin

    2016-12-01

    This article provides an overview of the influence of raw milk quality on the quality of processed dairy products and offers a perspective on the merits of investing in quality. Dairy farmers are frequently offered monetary premium incentives to provide high-quality milk to processors. These incentives are most often based on raw milk somatic cell and bacteria count levels well below the regulatory public health-based limits. Justification for these incentive payments can be based on improved processed product quality and manufacturing efficiencies that provide the processor with a return on their investment for high-quality raw milk. In some cases, this return on investment is difficult to measure. Raw milks with high levels of somatic cells and bacteria are associated with increased enzyme activity that can result in product defects. Use of raw milk with somatic cell counts >100,000 cells/mL has been shown to reduce cheese yields, and higher levels, generally >400,000 cells/mL, have been associated with textural and flavor defects in cheese and other products. Although most research indicates that fairly high total bacteria counts (>1,000,000 cfu/mL) in raw milk are needed to cause defects in most processed dairy products, receiving high-quality milk from the farm allows some flexibility for handling raw milk, which can increase efficiencies and reduce the risk of raw milk reaching bacterial levels of concern. Monitoring total bacterial numbers in regard to raw milk quality is imperative, but determining levels of specific types of bacteria present has gained increasing importance. For example, spores of certain spore-forming bacteria present in raw milk at very low levels (e.g., <1/mL) can survive pasteurization and grow in milk and cheese products to levels that result in defects. With the exception of meeting product specifications often required for milk powders, testing for specific spore-forming groups is currently not used in quality incentive programs in

  18. Quality risk management of top spray fluidized bed process for antihypertensive drug formulation with control strategy engendered by Box-behnken experimental design space

    PubMed Central

    Mukharya, Amit; Patel, Paresh U; Shenoy, Dinesh; Chaudhary, Shivang

    2013-01-01

    Introduction: Lacidipine (LCDP) is a very poorly soluble and highly biovariable calcium channel blocker used in the treatment of hypertension. To increase its apparent solubility and to reduce its biovariability, solid dispersion fluid bed processing technology was explored, as it produces highly dispersible granules with a characteristic porous structure that enhances dispersibility, wettability, blend uniformity (by dissolving and spraying a solution of actives), flowability and compressibility of granules for tableting, and reduces variability through uniform drug-binder solution distribution on carrier molecules. Materials and Methods: The main objective of this quality risk management (QRM) study is to provide a sophisticated “robust and rugged” Fluidized Bed Process (FBP) for the preparation of LCDP tablets with desired quality (stability) and performance (dissolution) by the quality by design (QbD) concept. Results and Conclusion: This study principally focuses on a thorough mechanistic understanding of the FBP by which it is developed and scaled up, with knowledge of the critical risks involved in the manufacturing process, analyzed by risk assessment tools such as Qualitative Initial Risk-based Matrix Analysis (IRMA) and Quantitative Failure Mode Effective Analysis (FMEA), to identify and rank parameters with the potential to have an impact on In Process/Finished Product Critical Quality Attributes (IP/FP CQAs). These Critical Process Parameters (CPPs) were further refined by DoE and MVDA to develop a design space with Real Time Release Testing (RTRT), leading to the implementation of a control strategy to achieve consistent finished product quality at lab scale itself and prevent possible product failure at larger manufacturing scale. PMID:23799202

  19. An optimal policy for a single-vendor and a single-buyer integrated system with setup cost reduction and process-quality improvement

    NASA Astrophysics Data System (ADS)

    Shu, Hui; Zhou, Xideng

    2014-05-01

    The single-vendor single-buyer integrated production inventory system has been an object of study for a long time, but little is known about the effect of investing in setup cost reduction and process-quality improvement for an integrated inventory system in which the products are sold with a free minimal repair warranty. The purpose of this article is to minimise the integrated cost by simultaneously optimising the number of shipments, the shipment quantity, the setup cost, and the process quality. An efficient algorithm procedure is proposed for determining the optimal decision variables. A numerical example is presented to illustrate the results of the proposed models graphically. Sensitivity analysis of the model with respect to key parameters of the system is carried out. The paper shows that the proposed integrated model can result in significant savings in the integrated cost.

  20. Top-down modulation of visual processing and knowledge after 250 ms supports object constancy of category decisions

    PubMed Central

    Schendan, Haline E.; Ganis, Giorgio

    2015-01-01

    People categorize objects more slowly when visual input is highly impoverished instead of optimal. While bottom-up models may explain a decision with optimal input, perceptual hypothesis testing (PHT) theories implicate top-down processes with impoverished input. Brain mechanisms and the time course of PHT are largely unknown. This event-related potential study used a neuroimaging paradigm that implicated prefrontal cortex in top-down modulation of occipitotemporal cortex. Subjects categorized more impoverished and less impoverished real and pseudo objects. PHT theories predict larger impoverishment effects for real than pseudo objects because top-down processes modulate knowledge only for real objects, but different PHT variants predict different timing. Consistent with parietal-prefrontal PHT variants, around 250 ms, the earliest impoverished real object interaction started on an N3 complex, which reflects interactive cortical activity for object cognition. N3 impoverishment effects localized to both prefrontal and occipitotemporal cortex for real objects only. The N3 also showed knowledge effects by 230 ms that localized to occipitotemporal cortex. Later effects reflected (a) word meaning in temporal cortex during the N400, (b) internal evaluation of prior decision and memory processes and secondary higher-order memory involving anterotemporal parts of a default mode network during posterior positivity (P600), and (c) response related activity in posterior cingulate during an anterior slow wave (SW) after 700 ms. Finally, response activity in supplementary motor area during a posterior SW after 900 ms showed impoverishment effects that correlated with RTs. Convergent evidence from studies of vision, memory, and mental imagery which reflects purely top-down inputs, indicates that the N3 reflects the critical top-down processes of PHT. A hybrid multiple-state interactive, PHT and decision theory best explains the visual constancy of object cognition. PMID:26441701

  1. Voice quality after endoscopic laser surgery and radiotherapy for early glottic cancer: objective measurements emphasizing the Voice Handicap Index

    PubMed Central

    Caminero Cueva, Maria Jesús; Señaris González, Blanca; Llorente Pendás, José Luis; Gorriz Gil, Carmen; López Llames, Aurora; Alonso Pantiga, Ramón; Suárez Nieto, Carlos

    2007-01-01

    We analyzed the functional outcome and self-evaluation of the voice of patients with T1 glottic carcinoma treated with endoscopic laser surgery and radiotherapy. We performed an objective voice evaluation, as well as a physical, emotional and functional well being assessment of 19 patients treated with laser surgery and 18 patients treated with radiotherapy. Voice quality is affected both by surgery and radiotherapy. Voice parameters only show differences in the maximum phonation time between both treatments. Results in the Voice Handicap Index show that radiotherapy has less effect on patient voice quality perception. There is a reduced impact on the patient’s perception of voice quality after radiotherapy, despite there being no significant differences in vocal quality between radiotherapy and laser cordectomy. PMID:17999074

  2. Quality - Inexpensive if a way of life.

    NASA Technical Reports Server (NTRS)

    Grau, D.

    1972-01-01

    NASA major projects require phased planning. The participation of persons charged with maintaining the proper quality during the last two of four phases has become accepted practice. Current objectives are concerned with the application of quality assurance techniques during the second phase. It is pointed out that quality must be emphasized during the entire engineering process, starting with the selection of the components.

  3. Assessing Program Learning Objectives to Improve Undergraduate Physics Education

    NASA Astrophysics Data System (ADS)

    Menke, Carrie

    2014-03-01

    Our physics undergraduate program has five program learning objectives (PLOs) focusing on (1) physical principles, (2) mathematical expertise, (3) experimental technique, (4) communication and teamwork, and (5) research proficiency. One PLO is assessed each year, with the results guiding modifications in our curriculum and future assessment practices; we have just completed our first cycle of assessing all PLOs. Our approach strives to maximize the ease and applicability of our assessment practices while maintaining faculty's flexibility in course design and delivery. Objectives are mapped onto our core curriculum with identified coursework collected as direct evidence. We've utilized mostly descriptive rubrics, applying them at the course and program levels as well as sharing them with the students. This has resulted in more efficient assessment that is also applicable to reaccreditation efforts, higher inter-rater reliability than with other rubric types, and higher quality capstone projects. We've also found that the varied quality of student writing can interfere with our assessment of other objectives. This poster outlines our processes, resources, and how we have used PLO assessment to strengthen our undergraduate program.

  4. Quality and safety attributes of afghan raisins before and after processing

    PubMed Central

    McCoy, Stacy; Chang, Jun Won; McNamara, Kevin T; Oliver, Haley F; Deering, Amanda J

    2015-01-01

    Raisins are an important export commodity for Afghanistan; however, Afghan packers are unable to export to markets seeking high-quality products due to limited knowledge regarding the quality and safety of their raisins. To evaluate this, samples of pre-, semi-, and postprocessed raisins were obtained from a raisin packer in Kabul, Afghanistan. The raisins were analyzed and compared to U.S. standards for processed raisins. The samples tested did not meet U.S. industry standards for embedded sand and pieces of stem, total soluble solids, and titratable acidity. The Afghan raisins did meet or exceed U.S. Grade A standards for the number of cap-stems, percent damaged, crystallization levels, moisture content, and color. Following processing, the numbers of total aerobic bacteria, yeasts, molds, and total coliforms were within acceptable limits. Although quality issues are present in the Afghan raisins, the process used to clean the raisins is suitable for maintaining food safety standards. PMID:25650241

  5. Communication Barriers in Quality Process: Sakarya University Sample

    ERIC Educational Resources Information Center

    Yalcin, Mehmet Ali

    2012-01-01

    Communication has an important role in life and especially in education. Nowadays, lots of people generally use technology for communication. When technology uses in education and other activities, there may be some communication barriers. And also, quality process has an important role in higher education institutes. If a higher education…

  6. Reproducibility of image quality for moving objects using respiratory-gated computed tomography: a study using a phantom model

    PubMed Central

    Fukumitsu, Nobuyoshi; Ishida, Masaya; Terunuma, Toshiyuki; Mizumoto, Masashi; Hashimoto, Takayuki; Moritake, Takashi; Okumura, Toshiyuki; Sakae, Takeji; Tsuboi, Koji; Sakurai, Hideyuki

    2012-01-01

    Investigating the reproducibility of computed tomography (CT) image quality is essential in respiratory-gated radiation treatment planning for the radiotherapy of movable tumors. Seven series of regular and six series of irregular respiratory motions were performed using a thorax dynamic phantom. For the regular respiratory motions, the respiratory cycle was varied from 2.5 to 4 s and the amplitude from 4 to 10 mm. For the irregular respiratory motions, a cycle of 2.5 to 4 s or an amplitude of 4 to 10 mm was added to the base pattern (3.5-s cycle, 6-mm amplitude) every three cycles. Images of the object were acquired six times using respiratory-gated data acquisition. The volume of the object was calculated, and volume reproducibility was judged from its variability. The registered images of the object were superimposed, and shape reproducibility was judged from the degree of overlap of the objects. Volume and shape variability differed significantly as the respiratory cycle changed for the regular respiratory motions. For irregular respiratory motion, shape reproducibility was even poorer, and the percentage of overlap among the six images was 35.26% in the group mixing 2.5- and 3.5-s cycles. Amplitude changes did not produce significant differences in volume or shape variability. Respiratory cycle changes reduced the reproducibility of image quality in respiratory-gated CT. PMID:22966173
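
    The study quantifies reproducibility through the variability of the measured object volume and the degree of overlap between the six registered acquisitions. As a rough, hedged illustration of how such measures can be computed, the Python sketch below derives an object volume and an overlap fraction from binary voxel masks; the mask geometry, voxel size, and jitter model are assumptions made for the example, not values taken from the study.

      import numpy as np

      def volume_cm3(mask, voxel_volume_cm3):
          """Object volume from a binary voxel mask (True = object)."""
          return float(mask.sum()) * voxel_volume_cm3

      def overlap_fraction(masks):
          """Voxels covered by the object in every acquisition, relative to the
          union over all acquisitions (a Jaccard-style overlap index)."""
          intersection = np.logical_and.reduce(masks)
          union = np.logical_or.reduce(masks)
          return intersection.sum() / union.sum()

      # Toy example: six repeated, registered acquisitions of one object mask.
      rng = np.random.default_rng(0)
      base = np.zeros((32, 32, 32), dtype=bool)
      base[10:22, 10:22, 10:22] = True                          # nominal object
      masks = [np.roll(base, int(rng.integers(-1, 2)), axis=0)  # small positional jitter
               for _ in range(6)]

      volumes = [volume_cm3(m, voxel_volume_cm3=0.001) for m in masks]
      print("volume spread (cm^3):", max(volumes) - min(volumes))
      print("overlap fraction    :", round(overlap_fraction(masks), 3))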

  7. Inpatients' and outpatients' satisfaction: the mediating role of perceived quality of physical and social environment.

    PubMed

    Campos Andrade, Cláudia; Lima, Maria Luísa; Pereira, Cícero Roberto; Fornara, Ferdinando; Bonaiuto, Marino

    2013-05-01

    This study analyses the processes through which the physical environment of health care settings impacts on patients' well-being. Specifically, we investigate the mediating role of perceptions of the physical and social environments, and if this process is moderated by patients' status, that is, if the objective physical environment impacts inpatients' and outpatients' satisfaction by different social-psychological processes. Patients (N=206) evaluated the physical and social environments of the care unit where they were receiving treatment, and its objective physical conditions were independently evaluated by two architects. Results showed that the objective environmental quality affects satisfaction through perceptions of environmental quality, and that patients' status moderates this relationship. For inpatients, it is the perception of quality of the social environment that mediates the relationship between objective environmental quality and satisfaction, whereas for outpatients it is the perception of quality of the physical environment. This moderated mediation is discussed in terms of differences on patients' experiences of health care environments. Copyright © 2013 Elsevier Ltd. All rights reserved.

  8. Community College Management by Objectives: Process, Progress, Problems.

    ERIC Educational Resources Information Center

    Deegan, William L.; And Others

    The objectives of this book are: (1) to present a theoretical framework for management by objectives in community colleges, (2) to present information about alternative methods for conducting needs assessment and implementing management by objectives, (3) to present a framework for integrating academic and fiscal planning through management by…

  9. Defense Logistics Agency Can Improve Its Product Quality Deficiency Report Processing

    DTIC Science & Technology

    2015-07-01

    Contracts for M2 Machine Gun Spare Parts in Support of Operations in Southwest Asia,” January 11, 2010 ... personnel are adequately processing product quality deficiency reports and identifying the root cause for defective spare parts. This is the first ... quality deficiency report program and prevents meaningful analysis of the primary causes of spare-part quality deficiencies. In addition, the

  10. Development and Implementation of a Quality Improvement Process for Echocardiographic Laboratory Accreditation.

    PubMed

    Gilliland, Yvonne E; Lavie, Carl J; Ahmad, Homaa; Bernal, Jose A; Cash, Michael E; Dinshaw, Homeyar; Milani, Richard V; Shah, Sangeeta; Bienvenu, Lisa; White, Christopher J

    2016-03-01

    We describe our process for quality improvement (QI) for a 3-year accreditation cycle in echocardiography by the Intersocietal Accreditation Commission (IAC) for a large group practice. Echocardiographic laboratory accreditation by the IAC was introduced in 1996, which is not required but could impact reimbursement. To ensure high-quality patient care and community recognition as a facility committed to providing high-quality echocardiographic services, we applied for IAC accreditation in 2010. Currently, there is little published data regarding the IAC process to meet echocardiography standards. We describe our approach for developing a multicampus QI process for echocardiographic laboratory accreditation during the 3-year cycle of accreditation by the IAC. We developed a quarterly review assessing (1) the variability of the interpretations, (2) the quality of the examinations, (3) a correlation of echocardiographic studies with other imaging modalities, (4) the timely completion of reports, (5) procedure volume, (6) maintenance of Continuing Medical Education credits by faculty, and (7) meeting Appropriate Use Criteria. We developed and implemented a multicampus process for QI during the 3-year accreditation cycle by the IAC for Echocardiography. We documented both the process and the achievement of those metrics by the Echocardiography Laboratories at the Ochsner Medical Institutions. We found the QI process using IAC standards to be a continuous educational experience for our Echocardiography Laboratory physicians and staff. We offer our process as an example and guide for other echocardiography laboratories who wish to apply for such accreditation or reaccreditation. © 2016, Wiley Periodicals, Inc.

  11. Minimally processed vegetable salads: microbial quality evaluation.

    PubMed

    Fröder, Hans; Martins, Cecília Geraldes; De Souza, Katia Leani Oliveira; Landgraf, Mariza; Franco, Bernadette D G M; Destro, Maria Teresa

    2007-05-01

    The increasing demand for fresh fruits and vegetables and for convenience foods is causing an expansion of the market share for minimally processed vegetables. Among the more common pathogenic microorganisms that can be transmitted to humans by these products are Listeria monocytogenes, Escherichia coli O157:H7, and Salmonella. The aim of this study was to evaluate the microbial quality of a selection of minimally processed vegetables. A total of 181 samples of minimally processed leafy salads were collected from retailers in the city of Sao Paulo, Brazil. Counts of total coliforms, fecal coliforms, Enterobacteriaceae, psychrotrophic microorganisms, and Salmonella were conducted for 133 samples. L. monocytogenes was assessed in 181 samples using the BAX System and by plating the enrichment broth onto Palcam and Oxford agars. Suspected Listeria colonies were submitted to classical biochemical tests. Populations of psychrotrophic microorganisms >10^6 CFU/g were found in 51% of the 133 samples, and Enterobacteriaceae populations between 10^5 and 10^6 CFU/g were found in 42% of the samples. Fecal coliform concentrations higher than 10^2 CFU/g (Brazilian standard) were found in 97 (73%) of the samples, and Salmonella was detected in 4 (3%) of the samples. Two of the Salmonella-positive samples had <10^2 CFU/g concentrations of fecal coliforms. L. monocytogenes was detected in only 1 (0.6%) of the 181 samples examined. This positive sample was simultaneously detected by both methods. The other Listeria species identified by plating were L. welshimeri (one sample of curly lettuce) and L. innocua (2 samples of watercress). The results indicate that minimally processed vegetables had poor microbiological quality, and these products could be a vehicle for pathogens such as Salmonella and L. monocytogenes.

  12. Processes to Preserve Spice and Herb Quality and Sensory Integrity During Pathogen Inactivation

    PubMed Central

    Moberg, Kayla; Amin, Kemia N.; Wright, Melissa; Newkirk, Jordan J.; Ponder, Monica A.; Acuff, Gary R.; Dickson, James S.

    2017-01-01

    Selected processing methods, demonstrated to be effective at reducing Salmonella, were assessed to determine if spice and herb quality was affected. Black peppercorn, cumin seed, oregano, and onion powder were irradiated to a target dose of 8 kGy. Two additional processes were examined for whole black peppercorns and cumin seeds: ethylene oxide (EtO) fumigation and vacuum-assisted steam (82.22 °C, 7.5 psia). Treated and untreated spices/herbs were compared (visual, odor) using sensory similarity testing protocols (α = 0.20; β = 0.05; proportion of discriminators: 20%) to determine if processing altered sensory quality. Analytical assessment of quality (color, water activity, and volatile chemistry) was completed. Irradiation did not alter visual or odor sensory quality of black peppercorn, cumin seed, or oregano but created differences in onion powder, which was lighter (higher L*) and more red (higher a*) in color, and resulted in nearly complete loss of measured volatile compounds. EtO processing did not create detectable odor or appearance differences in black peppercorn; however visual and odor sensory quality differences, supported by changes in color (higher b*; lower L*) and increased concentrations of most volatiles, were detected for cumin seeds. Steam processing of black peppercorn resulted in perceptible odor differences, supported by increased concentration of monoterpene volatiles and loss of all sesquiterpenes; only visual differences were noted for cumin seed. An important step in process validation is the verification that no effect is detectable from a sensory perspective. PMID:28407236

  13. Quality assessment of Isfahan Medical Faculty web site electronic services and prioritizing solutions using analytic hierarchy process approach.

    PubMed

    Hajrahimi, Nafiseh; Dehaghani, Sayed Mehdi Hejazi; Hajrahimi, Nargess; Sarmadi, Sima

    2014-01-01

    Implementing information technology in the best possible way can bring many advantages, such as providing electronic services and facilitating tasks. Assessment of service-providing systems is therefore a way to improve the quality of and elevate these systems, including e-commerce, e-government, e-banking, and e-learning. This study aimed to evaluate the electronic services on the website of Isfahan University of Medical Sciences in order to propose solutions to improve them, and to rank the solutions based on the factors that enhance the quality of electronic services by using the analytic hierarchy process (AHP) method. A non-parametric test was used to assess the quality of the electronic services. The assessment of the propositions was based on the Aqual model, and they were prioritized using the AHP approach. The AHP approach was used because it directly incorporates experts' judgments in the model and leads to more objective results in analyzing and prioritizing the risks. After evaluating the quality of the electronic services, a multi-criteria decision-making framework was used to prioritize the proposed solutions; non-parametric tests and the AHP approach were applied using Expert Choice software. The results showed that students were satisfied with most of the indicators. Only a few indicators received low satisfaction from students, including design attractiveness, the amount of explanation and detail of information, honesty and responsiveness of authorities, and the role of e-services in the user's relationship with the university. After interviews with Information and Communications Technology (ICT) experts at the university, measurement criteria and solutions to improve quality were collected. The best solutions were selected using the Expert Choice software. According to the results, the solution "controlling and improving the process in handling users' complaints" is of the utmost importance, and the authorities have to have it on the website and place great importance on updating this process
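
    The ranking step above relies on the analytic hierarchy process. As a minimal sketch of the standard AHP arithmetic only (principal-eigenvector priorities plus Saaty's consistency ratio), the following Python code uses a hypothetical pairwise comparison matrix for three improvement solutions; it does not reproduce the study's actual expert judgments or the Expert Choice software.

      import numpy as np

      # Saaty's Random Index values by matrix size (n >= 3 needed for a meaningful CR).
      RI = {3: 0.58, 4: 0.90, 5: 1.12, 6: 1.24, 7: 1.32}

      def ahp_priorities(pairwise):
          """Return (priority vector, consistency ratio) for a pairwise comparison matrix."""
          eigvals, eigvecs = np.linalg.eig(pairwise)
          k = int(np.argmax(eigvals.real))
          w = np.abs(eigvecs[:, k].real)
          w = w / w.sum()                                # normalized priority vector
          n = pairwise.shape[0]
          ci = (eigvals[k].real - n) / (n - 1)           # consistency index
          return w, ci / RI[n]

      # Hypothetical comparisons of three solutions on Saaty's 1-9 scale.
      A = np.array([[1.0, 3.0, 5.0],
                    [1/3, 1.0, 2.0],
                    [1/5, 1/2, 1.0]])
      weights, cr = ahp_priorities(A)
      print("priorities:", weights.round(3), " consistency ratio:", round(cr, 3))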

  14. The association between effectiveness of the management processes and quality of health services from the perspective of the managers in the university hospitals of Ahvaz, Iran

    PubMed Central

    Faraji-Khiavi, F; Ghobadian, S; Moradi-Joo, E

    2015-01-01

    Background and Objective: Knowledge management is introduced as a key element of quality improvement in organizations, yet no such research had been conducted in the university hospitals of Ahvaz. This study aimed to determine the association between the effectiveness of knowledge management processes and the quality of health services from the managers' view in the educational hospitals of Ahvaz city. Materials and Methods: In this correlational study, the research population consisted of 120 managers from the hospitals of the University of Medical Sciences of Ahvaz. Because of the limited population, a census was conducted. Three questionnaires were used for data collection: demographic characteristics, the effectiveness of knowledge management processes, and the quality of medical services. To analyze the data, Spearman correlation analysis, the Kruskal-Wallis test, and the Mann–Whitney U test were used in SPSS. Results: The average scores for the effectiveness of knowledge management processes and its components were relatively appropriate. The quality of medical services was also estimated as relatively appropriate. The quality of health services showed a medium, positive correlation with the effectiveness of knowledge management processes (p < 0.001). Managers of different genders showed significant differences in knowledge development and transfer (P = 0.003). Conclusion: A significant and positive association was observed between the effectiveness of knowledge management processes and health care quality. To improve health care quality in university hospitals, managers should pay more attention to developing a culture of innovation, encouraging teamwork, and improving communication and creative thinking in the knowledge management context. PMID:28316735

  15. Quality Indicators for the Total Testing Process.

    PubMed

    Plebani, Mario; Sciacovelli, Laura; Aita, Ada

    2017-03-01

    ISO 15189:2012 requires the use of quality indicators (QIs) to monitor and evaluate all steps of the total testing process, but several difficulties dissuade laboratories from effective and continuous use of QIs in routine practice. An International Federation of Clinical Chemistry and Laboratory Medicine working group addressed this problem and implemented a project to develop a model of QIs to be used in clinical laboratories worldwide to monitor and evaluate all steps of the total testing process, and decrease error rates and improve patient services in laboratory testing. All laboratories are invited, at no cost, to enroll in the project and contribute to harmonized management at the international level. Copyright © 2016 Elsevier Inc. All rights reserved.

  16. A primer on the cost of quality for improvement of laboratory and pathology specimen processes.

    PubMed

    Carlson, Richard O; Amirahmadi, Fazlollaah; Hernandez, James S

    2012-09-01

    In today's environment, many laboratories and pathology practices are challenged to maintain or increase their quality while simultaneously lowering their overall costs. The cost of improving specimen processes is related to quality, and we demonstrate that actual costs can be reduced by designing "quality at the source" into the processes. Various costs are hidden along the total testing process, and we suggest ways to identify opportunities to reduce cost by improving quality in laboratories and pathology practices through the use of Lean, Six Sigma, and industrial engineering.

  17. Differential Processing of Isolated Object and Multi-item Pop-Out Displays in LIP and PFC.

    PubMed

    Meyers, Ethan M; Liang, Andy; Katsuki, Fumi; Constantinidis, Christos

    2017-10-11

    Objects that are highly distinct from their surroundings appear to visually "pop-out." This effect is present for displays in which: (1) a single cue object is shown on a blank background, and (2) a single cue object is highly distinct from surrounding objects; it is generally assumed that these 2 display types are processed in the same way. To examine this directly, we applied a decoding analysis to neural activity recorded from the lateral intraparietal (LIP) area and the dorsolateral prefrontal cortex (dlPFC). Our analyses showed that for the single-object displays, cue location information appeared earlier in LIP than in dlPFC. However, for the displays with distractors, location information was substantially delayed in both brain regions, and information first appeared in dlPFC. Additionally, we see that the pattern of neural activity is similar for both types of displays and across different color transformations of the stimuli, indicating that location information is coded in the same way regardless of display type. These results lead us to hypothesize that 2 different pathways are involved in processing these 2 types of pop-out displays. © The Author 2017. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.

  18. Development of an algorithm for improving quality and information processing capacity of MathSpeak synthetic speech renderings.

    PubMed

    Isaacson, M D; Srinivasan, S; Lloyd, L L

    2010-01-01

    MathSpeak is a set of rules for the non-ambiguous speaking of mathematical expressions. These rules have been incorporated into a computerised module that translates printed mathematics into the non-ambiguous MathSpeak form for synthetic speech rendering. Differences between individual utterances produced with the translator module are difficult to discern because of insufficient pausing between utterances; hence, the purpose of this study was to develop an algorithm for improving the synthetic speech rendering of MathSpeak. To improve synthetic speech renderings, an algorithm for inserting pauses was developed based upon recordings of middle and high school math teachers speaking mathematical expressions. Efficacy testing of this algorithm was conducted with college students without disabilities and with high school and college students with visual impairments. Parameters measured included reception accuracy, short-term memory retention, MathSpeak processing capacity, and various rankings concerning the quality of synthetic speech renderings. All parameters measured showed statistically significant improvements when the algorithm was used. The algorithm improves the quality and information processing capacity of synthetic speech renderings of MathSpeak. This increases the capacity of individuals with print disabilities to perform mathematical activities and to successfully fulfill science, technology, engineering and mathematics academic and career objectives.

  19. A Rotatable Quality Control Phantom for Evaluating the Performance of Flat Panel Detectors in Imaging Moving Objects.

    PubMed

    Haga, Yoshihiro; Chida, Koichi; Inaba, Yohei; Kaga, Yuji; Meguro, Taiichiro; Zuguchi, Masayuki

    2016-02-01

    As the use of diagnostic X-ray equipment with flat panel detectors (FPDs) has increased, so has the importance of proper management of FPD systems. To ensure quality control (QC) of FPD systems, an easy method for evaluating FPD imaging performance for both stationary and moving objects is required. Until now, simple rotatable QC phantoms have not been available for easy evaluation of the performance (spatial resolution and dynamic range) of FPDs in imaging moving objects. We developed a QC phantom for this purpose. It consists of three thicknesses of copper and a rotatable test pattern of piano wires of various diameters. Initial tests confirmed its stable performance. Our moving phantom is very useful for QC of FPD images of moving objects because it enables easy visual evaluation of imaging performance (spatial resolution and dynamic range).

  20. The benefits of sensorimotor knowledge: body-object interaction facilitates semantic processing.

    PubMed

    Siakaluk, Paul D; Pexman, Penny M; Sears, Christopher R; Wilson, Kim; Locheed, Keri; Owen, William J

    2008-04-05

    This article examined the effects of body-object interaction (BOI) on semantic processing. BOI measures perceptions of the ease with which a human body can physically interact with a word's referent. In Experiment 1, BOI effects were examined in 2 semantic categorization tasks (SCT) in which participants decided if words are easily imageable. Responses were faster and more accurate for high BOI words (e.g., mask) than for low BOI words (e.g., ship). In Experiment 2, BOI effects were examined in a semantic lexical decision task (SLDT), which taps both semantic feedback and semantic processing. The BOI effect was larger in the SLDT than in the SCT, suggesting that BOI facilitates both semantic feedback and semantic processing. The findings are consistent with the embodied cognition perspective (e.g., Barsalou's, 1999, Perceptual Symbols Theory), which proposes that sensorimotor interactions with the environment are incorporated in semantic knowledge. 2008 Cognitive Science Society, Inc.

  1. Ultrasonic Real-Time Quality Monitoring Of Aluminum Spot Weld Process

    NASA Astrophysics Data System (ADS)

    Perez Regalado, Waldo Josue

    The real-time ultrasonic spot weld monitoring system introduced by our research group has been designed for unsupervised quality characterization of the spot welding process. It comprises an ultrasonic transducer (probe) built into one of the welding electrodes and an electronics hardware unit that gathers information from the transducer, performs real-time weld quality characterization, and communicates with the robot programmable logic controller (PLC). The system has been fully developed for the inspection of spot welds manufactured in steel alloys and has been applied mainly in the automotive industry. In recent years, a variety of materials have been introduced to the automotive industry, including high-strength steels, magnesium alloys, and aluminum alloys. Aluminum alloys have been of particular interest due to their high strength-to-weight ratio. Resistance spot welding requirements for aluminum vary greatly from those of steel. Additionally, the oxide film formed on the aluminum surface increases the heat generation between the copper electrodes and the aluminum plates, leading to accelerated electrode deterioration. Preliminary studies showed that the real-time quality inspection system was not able to monitor spot welds manufactured in aluminum. The extensive experimental research, finite element modeling of the aluminum welding process, and finite difference modeling of the acoustic wave propagation through aluminum spot welds presented in this dissertation revealed that the thermodynamics, and hence the acoustic wave propagation, through an aluminum and a steel spot weld differ significantly. For this reason, the hardware requirements and the algorithms developed to determine weld quality from the ultrasonic data for steel no longer apply to aluminum spot welds. After updating the system and designing the required algorithms, parameters such as liquid nugget penetration and nugget diameter were available in the ultrasonic data

  2. Effective Application of a Quality System in the Donation Process at Hospital Level.

    PubMed

    Trujnara, M; Czerwiński, J; Osadzińska, J

    2016-06-01

    This article describes the application of a quality system at the hospital level at the Multidisciplinary Hospital in Warsaw-Międzylesie in Poland. A quality system of hospital procedures (in accordance with ISO 9001:2008) regarding the donation process, from the identification of a possible donor to the retrieval of organs, was applied there in 2014. Seven independent documents on hospital procedures were designed to cover the entire donation process. The number of donors identified increased after the application of the quality system. The reason for this increase is, above all, the cooperation of the well-trained team of specialists who have been engaged in the donation process for many years, but formal procedures certainly organize the process and make it easier. Copyright © 2016. Published by Elsevier Inc.

  3. Collinearity and Sample Coverage Issues in the Objective Measurement of Vocal Quality: The Case of Roughness and Breathiness

    ERIC Educational Resources Information Center

    Ferrer, Carlos A.; Haderlein, Tino; Maryn, Youri; de Bodt, Marc S.; Nöth, Elmar

    2018-01-01

    Purpose: The aim of the study was to address the reported inconsistencies in the relationship between objective acoustic measures and perceptual ratings of vocal quality. Method: This tutorial moves away from the more widely examined problems related to obtaining the perceptual ratings and the acoustic measures and centers in less scrutinized…

  4. How product trial changes quality perception of four new processed beef products.

    PubMed

    Saeed, Faiza; Grunert, Klaus G; Therkildsen, Margrethe

    2013-01-01

    The purpose of this paper is the quantitative analysis of the change in quality perception of four new processed beef products from the pre-trial to the post-trial phase. Based on the Total Food Quality Model, differences between the pre- and post-trial phases were measured using a repeated-measures technique for cue evaluation, quality evaluation, and purchase motive fulfillment. For two of the tested products, trial resulted in a decline in the evaluation of cues, quality, and purchase motive fulfillment compared to pre-trial expectations. For these products, positive expectations were created by giving information about ingredients and ways of processing, which were not confirmed during trial. For the other two products, evaluations on key sensory dimensions based on trial exceeded expectations, whereas the other evaluations remained unchanged. Several demographic factors influenced the pattern of results, notably age and gender, which may be due to underlying differences in previous experience. The study gives useful insights for the testing of new processed meat products before market introduction. Copyright © 2012 Elsevier Ltd. All rights reserved.

  5. Information integration and diagnosis analysis of equipment status and production quality for machining process

    NASA Astrophysics Data System (ADS)

    Zan, Tao; Wang, Min; Hu, Jianzhong

    2010-12-01

    Multi-sensor machining status monitoring can acquire and analyze machining process information to implement abnormality diagnosis and fault warning. Statistical quality control is normally used to distinguish abnormal fluctuations from normal ones through statistical methods. In this paper, by comparing the advantages and disadvantages of the two methods, the necessity and feasibility of their integration and fusion are discussed. An approach that integrates multi-sensor status monitoring and statistical process control, based on artificial intelligence, internet, and database techniques, is then put forward. Based on virtual instrument techniques, the authors developed the machining quality assurance system MoniSysOnline, which has been used to monitor the grinding process. By analyzing the quality data and the AE signal information of the wheel dressing process, the reason for machining quality fluctuation was obtained. The experimental results indicate that the approach is suitable for status monitoring and analysis of the machining process.

  6. A whole process quality control system for energy measuring instruments inspection based on IOT technology

    NASA Astrophysics Data System (ADS)

    Yin, Bo; Liu, Li; Wang, Jiahan; Li, Xiran; Liu, Zhenbo; Li, Dewei; Wang, Jun; Liu, Lu; Wu, Jun; Xu, Tingting; Cui, He

    2017-10-01

    Electric energy measurement is fundamental work: accurate measurement plays a vital role in protecting the economic interests of both parties to a power supply contract, and the standardized management of measurement laboratories at all levels is a factor that directly affects the fairness of measurement. Currently, the management of metering laboratories generally uses one-dimensional bar codes for identification and advances the testing process through manual management, and most test data require human input to generate reports. There are many problems and potential risks in this process: data cannot be saved completely, the status of an inspection cannot be traced, the inspection process is not fully controllable, and so on. To meet the provincial metrology center's requirements for whole-process management of performance tests of power-measuring appliances, and using large-capacity RF tags as the process-management information medium, we developed a general measurement experiment management system, formulated a standardized full performance-test process, improved the raw-data recording mode of the experimental process, developed an automatic warehouse inventory device, and established a strict system for sample transfer and storage. As a result, all raw inspection data can be traced back, full life-cycle control of the sample is achieved, and the level of quality control and the effectiveness of inspection work are significantly improved.
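
    The core of such a system is tracing every sample through its inspection life cycle via the RF tag. A minimal, hedged sketch of that idea is shown below; the state names, tag identifier, and transition rule are assumptions for illustration, since the abstract does not list the actual workflow stages.

      from dataclasses import dataclass, field
      from datetime import datetime

      # Assumed life-cycle states for a metering sample (illustrative only).
      WORKFLOW = ["registered", "stored", "under_test", "report_issued", "returned"]

      @dataclass
      class MeterSample:
          rfid_tag: str
          state: str = "registered"
          history: list = field(default_factory=list)

          def advance(self, new_state: str) -> None:
              """Allow only forward moves along the workflow and keep an audit trail."""
              if WORKFLOW.index(new_state) != WORKFLOW.index(self.state) + 1:
                  raise ValueError(f"illegal transition {self.state} -> {new_state}")
              self.history.append((self.state, new_state, datetime.now().isoformat()))
              self.state = new_state

      sample = MeterSample(rfid_tag="E2004711")          # hypothetical tag ID
      for step in ["stored", "under_test", "report_issued", "returned"]:
          sample.advance(step)
      print(sample.state, "-", len(sample.history), "recorded transitions")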

  7. Restabilizing attachment to cultural objects. Aesthetics, emotions and biography.

    PubMed

    Benzecry, Claudio E

    2015-12-01

    The scholarship on aesthetics and materiality has studied how objects help shape identity, social action and subjectivity. Objects, as 'equipment[s] for living' (Luhmann 2000), become the 'obligatory passage points' humans have to contend with in order to pursue their projects (Latour 1991). They provide patterns onto which bodies can unconsciously latch, or help human agents work towards particular states of being (DeNora 2000, 2003). Objects are central in the long-term process of taste construction, as any attachment to an object is made out of a delicate equilibrium of mediators, bodies, situations and techniques (Hennion and Fouquet 2001; Hennion and Gomart 1999). In all of these accounts objects are the end result of long-term processes of stabilization, in which the actual material object (a musical piece, a sculpture, an art installation, a glass of wine, the oeuvre of Bach as we know it) is both a result and yet a key co-producer of its own generation. Whereas the literature has been generous and detailed in exploring the processes of assembling and sustaining object-centered attachments, it has not sufficiently engaged with what happens when the aesthetic elements of cultural artifacts that have produced emotional resonance are transformed: what do these artifacts morph into? What explains the transition (or not) of different cultural objects? And relatedly, what happens to the key aesthetic qualities that were so central to how the objects had been defined, and to those who have emotionally attached to them? To answer these questions, this article uses as exemplars two different cases of attachment, predicated on the distinctive features of a cultural object--the transcendence of opera and the authenticity of a soccer jersey--that have undergone transformations. © London School of Economics and Political Science 2015.

  8. Bio-objects and the media: the role of communication in bio-objectification processes

    PubMed Central

    Maeseele, Pieter; Allgaier, Joachim; Martinelli, Lucia

    2013-01-01

    The representation of biological innovations in and through communication and media practices is vital for understanding the nature of “bio-objects” and the process we call “bio-objectification.” This paper discusses two ideal-typical analytical approaches based on different underlying communication models, ie, the traditional (science- and media-centered) and media sociological (a multi-layered process involving various social actors in defining the meanings of scientific and technological developments) approach. In this analysis, the latter is not only found to be the most promising approach for understanding the circulation, (re)production, and (re)configuration of meanings of bio-objects, but also to interpret the relationship between media and science. On the basis of a few selected examples, this paper highlights how media function as a primary arena for the (re)production and (re)configuration of scientific and biomedical information with regards to bio-objects in the public sphere in general, and toward decision-makers, interest groups, and the public in specific. PMID:23771763

  9. Sampling the food processing environment: taking up the cudgel for preventive quality management in food processing environments.

    PubMed

    Wagner, Martin; Stessl, Beatrix

    2014-01-01

    The Listeria monitoring program for Austrian cheese factories was established in 1988. The basic idea is to control the introduction of L. monocytogenes into the food processing environment, preventing the pathogen from contaminating the food under processing. The Austrian Listeria monitoring program comprises four levels of investigation, dealing with routine monitoring of samples and consequences of finding a positive sample. Preventive quality control concepts attempt to detect a foodborne hazard along the food processing chain, prior to food delivery, retailing, and consumption. The implementation of a preventive food safety concept provokes a deepened insight by the manufacturers into problems concerning food safety. The development of preventive quality assurance strategies contributes to the national food safety status and protects public health.

  10. Duration and quality of the peer review process: the author's perspective.

    PubMed

    Huisman, Janine; Smits, Jeroen

    2017-01-01

    To gain insight into the duration and quality of the scientific peer review process, we analyzed data from 3500 review experiences submitted by authors to the SciRev.sc website. Aspects studied are duration of the first review round, total review duration, immediate rejection time, the number, quality, and difficulty of referee reports, the time it takes authors to revise and resubmit their manuscript, and overall quality of the experience. We find clear differences in these aspects between scientific fields, with Medicine, Public health, and Natural sciences showing the shortest durations and Mathematics and Computer sciences, Social sciences, Economics and Business, and Humanities the longest. One-third of journals take more than 2 weeks for an immediate (desk) rejection and one sixth even more than 4 weeks. This suggests that besides the time reviewers take, inefficient editorial processes also play an important role. As might be expected, shorter peer review processes and those of accepted papers are rated more positively by authors. More surprising is that peer review processes in the fields linked to long processes are rated highest and those in the fields linked to short processes lowest. Hence authors' satisfaction is apparently influenced by their expectations regarding what is common in their field. Qualitative information provided by the authors indicates that editors can enhance author satisfaction by taking an independent position vis-à-vis reviewers and by communicating well with authors.

  11. Tacit Quality Leadership: Operationalized Quality Perceptions as a Source of Influence in the American Higher Education Accreditation Process

    ERIC Educational Resources Information Center

    Saurbier, Ann L.

    2013-01-01

    American post-secondary education faces unprecedented challenges in the dynamic 21st century environment. An appreciation of the higher education accreditation process, as a quality control mechanism, therefore may be seen as a significant priority. When American higher education is viewed systemically, the perceptions of quality held and…

  12. [Quality control in anesthesiology].

    PubMed

    Muñoz-Ramón, J M

    1995-03-01

    The process of quality control and auditing of anesthesiology allows us to evaluate care given by a service and solve problems that are detected. Quality control is a basic element of care giving and is only secondarily an area of academic research; it is therefore a meaningless effort if the information does not serve to improve departmental procedures. Quality assurance procedures assume certain infrastructural requirements and an initial period of implementation and adjustment. The main objectives of quality control are the reduction of morbidity and mortality due to anesthesia, assurance of the availability and proper management of resources and, finally, the well-being and safety of the patient.

  13. Objects of attention, objects of perception.

    PubMed

    Avrahami, J

    1999-11-01

    Four experiments were conducted, to explore the notion of objects in perception. Taking as a starting point the effects of display content on rapid attention transfer and manipulating curvature, closure, and processing time, a link between objects of attention and objects of perception is proposed. In Experiment 1, a number of parallel, equally spaced, straight lines facilitated attention transfer along the lines, relative to transfer across the lines. In Experiment 2, with curved, closed-contour shapes, no "same-object" facilitation was observed. However, when a longer time interval was provided, in Experiment 3, a same-object advantage started to emerge. In Experiment 4, using the same curved shapes but in a non-speeded distance estimation task, a strong effect of objects was observed. It is argued that attention transfer is facilitated by line tracing but that line tracing is encouraged by objects.

  14. Bi-Objective Flexible Job-Shop Scheduling Problem Considering Energy Consumption under Stochastic Processing Times.

    PubMed

    Yang, Xin; Zeng, Zhenxiang; Wang, Ruidong; Sun, Xueshan

    2016-01-01

    This paper presents a novel method for the optimization of the bi-objective Flexible Job-shop Scheduling Problem (FJSP) under stochastic processing times. A robust counterpart model and the Non-dominated Sorting Genetic Algorithm II (NSGA-II) are used to solve the bi-objective FJSP, considering the completion time and the total energy consumption under stochastic processing times. A case study on GM Corporation verifies that the NSGA-II used in this paper is effective and has advantages in solving the proposed model compared with HPSO and PSO+SA. The idea and method of the paper can be generalized widely in the manufacturing industry, because they can reduce the energy consumption of energy-intensive manufacturing enterprises with less investment when the new approach is applied to existing systems.

  15. Bi-Objective Flexible Job-Shop Scheduling Problem Considering Energy Consumption under Stochastic Processing Times

    PubMed Central

    Zeng, Zhenxiang; Wang, Ruidong; Sun, Xueshan

    2016-01-01

    This paper presents a novel method for the optimization of the bi-objective Flexible Job-shop Scheduling Problem (FJSP) under stochastic processing times. A robust counterpart model and the Non-dominated Sorting Genetic Algorithm II (NSGA-II) are used to solve the bi-objective FJSP, considering the completion time and the total energy consumption under stochastic processing times. A case study on GM Corporation verifies that the NSGA-II used in this paper is effective and has advantages in solving the proposed model compared with HPSO and PSO+SA. The idea and method of the paper can be generalized widely in the manufacturing industry, because they can reduce the energy consumption of energy-intensive manufacturing enterprises with less investment when the new approach is applied to existing systems. PMID:27907163
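
    Both records above solve the bi-objective FJSP with NSGA-II. The much-simplified Python sketch below illustrates only the core idea: evaluating candidate machine assignments on two objectives (a makespan proxy and total energy) and extracting the first non-dominated front. The job data are invented, operation sequencing is ignored, and the full NSGA-II machinery (crowding distance, genetic operators) as well as the paper's robust counterpart model are omitted.

      import random

      # Hypothetical flexible job-shop data: eligible machines per job with
      # (processing time in minutes, power draw in kW). Illustrative values only.
      JOBS = {
          "J1": {"M1": (4, 3.0), "M2": (6, 1.5)},
          "J2": {"M1": (5, 3.0), "M3": (7, 1.0)},
          "J3": {"M2": (3, 1.5), "M3": (4, 1.0)},
      }

      def evaluate(assignment):
          """Objectives of one assignment: (makespan proxy, total energy in kWh).
          Sequencing is ignored, so makespan is approximated by the busiest machine."""
          load, energy = {}, 0.0
          for job, machine in assignment.items():
              t, kw = JOBS[job][machine]
              load[machine] = load.get(machine, 0) + t
              energy += t / 60.0 * kw
          return max(load.values()), energy

      def dominates(a, b):
          return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

      def pareto_front(population):
          """First front of NSGA-II's non-dominated sorting (both objectives minimized)."""
          scored = [(ind, evaluate(ind)) for ind in population]
          return [(ind, f) for ind, f in scored
                  if not any(dominates(g, f) for _, g in scored)]

      random.seed(1)
      population = [{job: random.choice(list(machines)) for job, machines in JOBS.items()}
                    for _ in range(50)]
      for ind, (makespan, energy) in pareto_front(population):
          print(ind, "makespan:", makespan, "energy (kWh):", round(energy, 2))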

  16. Individual Differences in Study Processes and the Quality of Learning Outcomes.

    ERIC Educational Resources Information Center

    Biggs, John

    1979-01-01

    The relationship between students' study processes and the structural complexity of their learning is examined. Study processes are viewed in terms of three dimensions and are assessed by a questionnaire. Learning quality is expressed in levels of a taxonomy. A study that relates taxonomic levels and retention to study processes is reported.…

  17. Quality risk management in pharmaceutical development.

    PubMed

    Charoo, Naseem Ahmad; Ali, Areeg Anwer

    2013-07-01

    The objective of the ICH Q8, Q9 and Q10 documents is the application of a systematic, science-based approach to formulation development in order to build quality into the product. There is always some uncertainty in new product development, and good risk management practice is essential for the success of new product development in decreasing this uncertainty. In the quality-by-design paradigm, the product performance properties relevant to the patient are predefined in a target product profile (TPP). Together with prior knowledge and experience, the TPP helps in the identification of critical quality attributes (CQAs). An initial risk assessment, which identifies risks to these CQAs, provides the impetus for product development. The product and process are designed to gain knowledge about these risks, devise strategies to eliminate or mitigate them, and meet the objectives set in the TPP. By placing more emphasis on high-risk events, the level of patient protection is increased. Because the process is scientifically driven, it improves the transparency and reliability of the manufacturer. The focus on risk to the patient, together with a flexible development approach, saves invaluable resources, increases confidence in quality, and reduces compliance risk. The knowledge acquired in analysing risks to CQAs permits the construction of a meaningful design space. Within the boundaries of the design space, variation in critical material characteristics and process parameters must be managed in order to yield a product having the desired characteristics. Specifications based on product and process understanding are established such that the product will meet the specifications if tested. In this way, the product is amenable to real-time release, since specifications only confirm quality; they do not serve as a means of effective process control.
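
    The initial risk assessment of CQAs described above is often operationalized with an FMEA-style risk priority number (severity x occurrence x detectability), FMEA being one of the tools listed in ICH Q9. The Python sketch below uses hypothetical attributes, scores, and an assumed action threshold; it illustrates the arithmetic only, not the assessment from the article.

      from dataclasses import dataclass

      @dataclass
      class CQARisk:
          attribute: str      # critical quality attribute at risk
          severity: int       # impact on the patient-relevant TPP, 1 (low) .. 5 (high)
          occurrence: int     # likelihood of the failure mode, 1 .. 5
          detectability: int  # 1 = easily detected .. 5 = hard to detect

          @property
          def rpn(self) -> int:
              """Risk priority number: higher values get development attention first."""
              return self.severity * self.occurrence * self.detectability

      # Hypothetical initial assessment for a tablet formulation.
      risks = [
          CQARisk("dissolution", severity=5, occurrence=3, detectability=2),
          CQARisk("content uniformity", severity=4, occurrence=2, detectability=2),
          CQARisk("appearance", severity=1, occurrence=2, detectability=1),
      ]

      ACTION_THRESHOLD = 20   # assumed cut-off for further study or mitigation
      for r in sorted(risks, key=lambda r: r.rpn, reverse=True):
          flag = "investigate" if r.rpn >= ACTION_THRESHOLD else "accept/monitor"
          print(f"{r.attribute:20s} RPN={r.rpn:3d} -> {flag}")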

  18. Processes to Preserve Spice and Herb Quality and Sensory Integrity During Pathogen Inactivation.

    PubMed

    Duncan, Susan E; Moberg, Kayla; Amin, Kemia N; Wright, Melissa; Newkirk, Jordan J; Ponder, Monica A; Acuff, Gary R; Dickson, James S

    2017-05-01

    Selected processing methods, demonstrated to be effective at reducing Salmonella, were assessed to determine if spice and herb quality was affected. Black peppercorn, cumin seed, oregano, and onion powder were irradiated to a target dose of 8 kGy. Two additional processes were examined for whole black peppercorns and cumin seeds: ethylene oxide (EtO) fumigation and vacuum-assisted steam (82.22 °C, 7.5 psia). Treated and untreated spices/herbs were compared (visual, odor) using sensory similarity testing protocols (α = 0.20; β = 0.05; proportion of discriminators: 20%) to determine if processing altered sensory quality. Analytical assessment of quality (color, water activity, and volatile chemistry) was completed. Irradiation did not alter visual or odor sensory quality of black peppercorn, cumin seed, or oregano but created differences in onion powder, which was lighter (higher L*) and more red (higher a*) in color, and resulted in nearly complete loss of measured volatile compounds. EtO processing did not create detectable odor or appearance differences in black peppercorn; however visual and odor sensory quality differences, supported by changes in color (higher b*; lower L*) and increased concentrations of most volatiles, were detected for cumin seeds. Steam processing of black peppercorn resulted in perceptible odor differences, supported by increased concentration of monoterpene volatiles and loss of all sesquiterpenes; only visual differences were noted for cumin seed. An important step in process validation is the verification that no effect is detectable from a sensory perspective. © 2017 The Authors. Journal of Food Science published by Wiley Periodicals, Inc. on behalf of Institute of Food Technologists.

  19. Astronomical Instrumentation Systems Quality Management Planning: AISQMP

    NASA Astrophysics Data System (ADS)

    Goldbaum, Jesse

    2017-06-01

    The capability of small aperture astronomical instrumentation systems (AIS) to make meaningful scientific contributions has never been better. The purpose of AIS quality management planning (AISQMP) is to ensure the quality of these contributions such that they are both valid and reliable. The first step involved with AISQMP is to specify objective quality measures not just for the AIS final product, but also for the instrumentation used in its production. The next step is to set up a process to track these measures and control for any unwanted variation. The final step is continual effort applied to reducing variation and obtaining measured values near optimal theoretical performance. This paper provides an overview of AISQMP while focusing on objective quality measures applied to astronomical imaging systems.

  20. Objective Speech Quality Assessment Based on Payload Discrimination of Lost Packets for Cellular Phones in NGN Environment

    NASA Astrophysics Data System (ADS)

    Uemura, Satoshi; Fukumoto, Norihiro; Yamada, Hideaki; Nakamura, Hajime

    A feature of services provided in a Next Generation Network (NGN) is that the end-to-end quality is guaranteed. This is quite a challenging issue, given the considerable fluctuation in network conditions within a Fixed Mobile Convergence (FMC) network. Therefore, a novel approach whereby a network node and a mobile terminal such as a cellular phone cooperate with each other to control service quality is essential. In order to achieve such cooperation, the mobile terminal needs to become more intelligent so that it can estimate the service quality, including the user's perceptual quality, and notify the network node of the measurement result. Subsequently, the network node implements some kind of service control function, such as a resource and admission control function, based on the notification from the mobile terminal. In this paper, the role of the mobile terminal in such a collaborative system is the focus. As part of a QoS/QoE measurement system, we describe an objective speech quality assessment with payload discrimination of lost packets to measure the user's perceptual quality of VoIP. The proposed assessment is simple enough to be implemented on a cellular phone, and we did so as part of the QoS/QoE measurement system. By using the implemented system, we can measure the user's perceptual quality of VoIP as well as network QoS metrics, in terms of criteria such as packet loss rate, jitter and burstiness, in real time.
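
    On the network-QoS side, the terminal reports metrics such as packet loss rate and jitter. The simplified Python sketch below shows those two calculations (the jitter estimator follows the RFC 3550 interarrival formula); the toy trace is an assumption, and the paper's payload discrimination of lost packets is not reproduced here.

      def loss_rate(seq_numbers):
          """Packet loss rate from received RTP sequence numbers (no wrap handling)."""
          expected = max(seq_numbers) - min(seq_numbers) + 1
          return 1.0 - len(set(seq_numbers)) / expected

      def interarrival_jitter(send_ts, recv_ts):
          """RFC 3550 interarrival jitter estimate (both lists in the same time unit)."""
          j = 0.0
          for i in range(1, len(send_ts)):
              d = (recv_ts[i] - recv_ts[i - 1]) - (send_ts[i] - send_ts[i - 1])
              j += (abs(d) - j) / 16.0
          return j

      # Toy trace of the packets that actually arrived (packet 4 was lost),
      # assuming a 20 ms sending interval and variable network delay.
      seqs = [0, 1, 2, 3, 5, 6, 7, 8, 9]
      send = [s * 20 for s in seqs]
      delays = [50, 52, 49, 55, 51, 50, 53, 58, 50]
      recv = [s + d for s, d in zip(send, delays)]
      print("loss rate  :", round(loss_rate(seqs), 2))
      print("jitter (ms):", round(interarrival_jitter(send, recv), 2))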

  1. A case study: application of statistical process control tool for determining process capability and sigma level.

    PubMed

    Chopra, Vikram; Bairagi, Mukesh; Trivedi, P; Nagar, Mona

    2012-01-01

    Statistical process control is the application of statistical methods to the measurement and analysis of variation in a process. Various regulatory documents, such as the Validation Guidance for Industry (2011), International Conference on Harmonisation ICH Q10 (2009), the Health Canada guidelines (2009), Health Science Authority, Singapore: Guidance for Product Quality Review (2008), and International Organization for Standardization ISO-9000:2005, provide regulatory support for the application of statistical process control for better process control and understanding. In this study, risk assessment, normal probability distributions, control charts, and capability charts are employed for the selection of critical quality attributes, determination of the normal probability distribution, statistical stability, and capability of production processes, respectively. The objective of this study is to determine tablet production process quality in the form of sigma process capability. By interpreting data and graph trends, forecasting of critical quality attributes, sigma process capability, and process stability were studied. The overall study contributes to an assessment of the process at the sigma level with respect to out-of-specification attributes produced. Finally, the study points to an area where the application of quality improvement and quality risk assessment principles for the achievement of six-sigma-capable processes is possible. Statistical process control is the most advantageous tool for determining the quality of any production process, and it is new for the pharmaceutical tablet production process, in which the quality control parameters act as quality assessment parameters. Application of risk assessment provides selection of critical quality attributes among quality control parameters. Sequential application of normality distributions, control charts, and capability analyses provides a valid statistical
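
    The capability analysis referred to above reduces to a few standard formulas: Cp = (USL - LSL) / (6 * sigma), Cpk = min(USL - mean, mean - LSL) / (3 * sigma), and a short-term sigma level of 3 * Cpk. The Python sketch below applies them to hypothetical tablet-weight data and specification limits, which are assumptions for illustration rather than the study's data.

      import statistics

      def capability(data, lsl, usl):
          """Process capability indices and a short-term sigma level from sample data."""
          mu = statistics.mean(data)
          sigma = statistics.stdev(data)              # sample standard deviation
          cp = (usl - lsl) / (6 * sigma)
          cpk = min(usl - mu, mu - lsl) / (3 * sigma)
          return cp, cpk, 3 * cpk                     # sigma level == Z_min == 3 * Cpk

      # Hypothetical tablet weights (mg) against a 290-310 mg specification.
      weights = [298, 301, 299, 302, 300, 297, 303, 300, 299, 301]
      cp, cpk, sigma_level = capability(weights, lsl=290, usl=310)
      print(f"Cp={cp:.2f}  Cpk={cpk:.2f}  sigma level ~ {sigma_level:.1f}")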

  2. Software Quality Assurance and Verification for the MPACT Library Generation Process

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Liu, Yuxuan; Williams, Mark L.; Wiarda, Dorothea

    This report fulfills the requirements for the Consortium for the Advanced Simulation of Light-Water Reactors (CASL) milestone L2:RTM.P14.02, "SQA and Verification for MPACT Library Generation," by documenting the current status of the software quality, verification, and acceptance testing of nuclear data libraries for MPACT. It provides a brief overview of the library generation process, from general-purpose evaluated nuclear data files (ENDF/B) to a problem-dependent cross section library for modeling of light-water reactors (LWRs). The software quality assurance (SQA) programs associated with each of the software packages used to generate the nuclear data libraries are discussed; specific tests within the SCALE/AMPX and VERA/XSTools repositories are described. The methods and associated tests used to verify the quality of the library during the generation process are described in detail. The library generation process has been automated to a degree, to (1) ensure that it can be run without user intervention and (2) ensure that the library can be reproduced. Finally, the acceptance testing process that will be performed by representatives from the Radiation Transport Methods (RTM) Focus Area prior to the production library's release is described in detail.

  3. Optical derotator alignment using image-processing algorithm for tracking laser vibrometer measurements of rotating objects.

    PubMed

    Khalil, Hossam; Kim, Dongkyu; Jo, Youngjoon; Park, Kyihwan

    2017-06-01

    An optical component called a Dove prism is used to rotate the laser beam of a laser-scanning vibrometer (LSV). This is called a derotator and is used for measuring the vibration of rotating objects. The main advantage of a derotator is that it works independently from an LSV. However, this device requires very specific alignment, in which the axis of the Dove prism must coincide with the rotational axis of the object. If the derotator is misaligned with the rotating object, the results of the vibration measurement are imprecise, owing to the alteration of the laser beam on the surface of the rotating object. In this study, a method is proposed for aligning a derotator with a rotating object through an image-processing algorithm that obtains the trajectory of a landmark attached to the object. After the trajectory of the landmark is mathematically modeled, the amount of derotator misalignment with respect to the object is calculated. The accuracy of the proposed method for aligning the derotator with the rotating object is experimentally tested.
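
    The abstract states that the landmark trajectory is modeled mathematically to obtain the misalignment. One plausible way to do this, shown below as an assumption rather than the paper's actual algorithm, is to fit a circle to the tracked landmark positions (a Kasa least-squares fit) and read the misalignment as the offset of the fitted center from the derotator axis, taken here to be the image origin.

      import numpy as np

      def fit_circle(points):
          """Least-squares (Kasa) circle fit; points is an (N, 2) array of landmark positions."""
          x, y = points[:, 0], points[:, 1]
          A = np.column_stack([x, y, np.ones_like(x)])
          b = -(x**2 + y**2)
          (D, E, F), *_ = np.linalg.lstsq(A, b, rcond=None)
          center = np.array([-D / 2.0, -E / 2.0])
          radius = np.sqrt(center @ center - F)
          return center, radius

      # Synthetic trajectory: landmark orbiting a center offset by (0.8, -0.5) px
      # from the assumed derotator axis at the image origin, with tracking noise.
      theta = np.linspace(0, 2 * np.pi, 50, endpoint=False)
      true_center = np.array([0.8, -0.5])
      traj = true_center + 40.0 * np.column_stack([np.cos(theta), np.sin(theta)])
      traj += np.random.default_rng(0).normal(scale=0.1, size=traj.shape)

      center, radius = fit_circle(traj)
      print("estimated misalignment (px):", np.round(center, 3))
      print("landmark orbit radius (px):", round(float(radius), 2))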

  4. Risk-based Process Development of Biosimilars as Part of the Quality by Design Paradigm.

    PubMed

    Zalai, Dénes; Dietzsch, Christian; Herwig, Christoph

    2013-01-01

    In the last few years, several quality by design (QbD) studies demonstrated the benefit of systematic approaches for biopharmaceutical development. However, only very few studies identified biosimilars as a special case of product development. The targeted quality profile of biosimilars is strictly defined by the originator's product characteristic. Moreover, the major source of prior knowledge is the experience with the originator product itself. Processing this information in biosimilar development has a major effect on risk management and process development strategies. The main objective of this contribution is to demonstrate how risk management can facilitate the implementation of QbD in early-stage product development with special emphasis on fitting the reported approaches to biosimilars. Risk assessments were highlighted as important tools to integrate prior knowledge in biosimilar development. The risk assessment process as suggested by the International Conference on Harmonization (ICH Q9) was reviewed and three elements were identified to play a key role in targeted risk assessment approaches: proper understanding of target linkage, risk assessment tool compliance, and criticality threshold value. Adjusting these steps to biosimilar applications helped to address some unique challenges of these products such as a strictly defined quality profile or a lack of process knowledge. This contribution demonstrates the need for tailored risk management approaches for the risk-based development of biosimilars and provides novel tools for the integration of additional knowledge available for these products. The pharmaceutical industry is facing challenges such as profit loss and price competition. Companies are forced to rationalize business models and to cut costs in development as well as manufacturing. These trends recently hinder the implementation of any concepts that do not offer certain financial benefit or promise a long return of investment. Quality by

  5. Using IT to improve quality at NewYork-Presbyterian Hospital: a requirements-driven strategic planning process.

    PubMed

    Kuperman, Gilad J; Boyer, Aurelia; Cole, Curt; Forman, Bruce; Stetson, Peter D; Cooper, Mary

    2006-01-01

    At NewYork-Presbyterian Hospital, we are committed to the delivery of high quality care. We have implemented a strategic planning process to determine the information technology initiatives that will best help us improve quality. The process began with the creation of a Clinical Quality and IT Committee. The Committee identified 2 high priority goals that would enable demonstrably high quality care: 1) excellence at data warehousing, and 2) optimal use of automated clinical documentation to capture encounter-related quality and safety data. For each high priority goal, a working group was created to develop specific recommendations. The Data Warehousing subgroup has recommended the implementation of an architecture management process and an improved ability for users to get access to aggregate data. The Structured Documentation subgroup is establishing recommendations for a documentation template creation process. The strategic planning process at times is slow, but assures that the organization is focusing on the information technology activities most likely to lead to improved quality.

  6. Using IT to Improve Quality at NewYork-Presbyterian Hospital: A Requirements-Driven Strategic Planning Process

    PubMed Central

    Kuperman, Gilad J.; Boyer, Aurelia; Cole, Curt; Forman, Bruce; Stetson, Peter D.; Cooper, Mary

    2006-01-01

    At NewYork-Presbyterian Hospital, we are committed to the delivery of high quality care. We have implemented a strategic planning process to determine the information technology initiatives that will best help us improve quality. The process began with the creation of a Clinical Quality and IT Committee. The Committee identified 2 high priority goals that would enable demonstrably high quality care: 1) excellence at data warehousing, and 2) optimal use of automated clinical documentation to capture encounter-related quality and safety data. For each high priority goal, a working group was created to develop specific recommendations. The Data Warehousing subgroup has recommended the implementation of an architecture management process and an improved ability for users to get access to aggregate data. The Structured Documentation subgroup is establishing recommendations for a documentation template creation process. The strategic planning process at times is slow, but assures that the organization is focusing on the information technology activities most likely to lead to improved quality. PMID:17238381

  7. [Discussion on research thinking of traditional Chinese medicine standardization system based on whole process quality control].

    PubMed

    Dong, Ling; Sun, Yu; Pei, Wen-Xuan; Dai, Jun-Dong; Wang, Zi-Yu; Pan, Meng; Chen, Jiang-Peng; Wang, Yun

    2017-12-01

    The concept of "Quality by design" indicates that good design for the whole life cycle of pharmaceutical production enables the drug to meet the expected quality requirements. Aiming at the existing problems of the traditional Chinese medicine (TCM) industry, the TCM standardization system was put forward in this paper from the national strategic level, under the guidance by the idea of quality control in international manufacturing industry and with considerations of TCM industry's own characteristics and development status. The connotation of this strategy was to establish five interrelated systems: multi-indicators system based on tri-indicators system, quality standard and specification system of TCM herbal materials and decoction pieces, quality traceability system, data monitoring system based on whole-process quality control, and whole-process quality management system of TCM, and achieve the whole process systematic and scientific study in TCM industry through "top-level design-implement in steps-system integration" workflow. This article analyzed the correlation between the quality standards of all links, established standard operating procedures of each link and whole process, and constructed a high standard overall quality management system for TCM industry chains, in order to provide a demonstration for the establishment of TCM whole-process quality control system and provide systematic reference and basis for standardization strategy in TCM industry. Copyright© by the Chinese Pharmaceutical Association.

  8. Prediction processes during multiple object tracking (MOT): involvement of dorsal and ventral premotor cortices

    PubMed Central

    Atmaca, Silke; Stadler, Waltraud; Keitel, Anne; Ott, Derek V M; Lepsien, Jöran; Prinz, Wolfgang

    2013-01-01

    Background: The multiple object tracking (MOT) paradigm is a cognitive task that requires parallel tracking of several identical, moving objects following nongoal-directed, arbitrary motion trajectories. Aims: The current study aimed to investigate the employment of prediction processes during MOT. As an indicator of the involvement of prediction processes, we targeted the human premotor cortex (PM). The PM has repeatedly been implicated in the internal modeling of future actions and action effects, as well as of purely perceptual events, by means of predictive feedforward functions. Materials and methods: Using functional magnetic resonance imaging (fMRI), BOLD activations recorded during MOT were contrasted with those recorded during the execution of a cognitive control task that used an identical stimulus display and demanded a similar attentional load. A particular effort was made to identify and exclude previously found activation in the PM-adjacent frontal eye fields (FEF). Results: We replicated prior results, revealing occipitotemporal, parietal, and frontal areas to be engaged in MOT. Discussion: The activation in frontal areas is interpreted to originate from dorsal and ventral premotor cortices. The results are discussed in light of our assumption that MOT engages prediction processes. Conclusion: We propose that our results provide first clues that MOT involves not only visuospatial perception and attention processes but prediction processes as well. PMID:24363971

  9. Building quality into medical product software design.

    PubMed

    Mallory, S R

    1993-01-01

    The software engineering and quality assurance disciplines are a requisite to the design of safe and effective software-based medical devices. It is in the areas of software methodology and process that the most beneficial application of these disciplines to software development can be made. Software is a product of complex operations and methodologies and is not amenable to the traditional electromechanical quality assurance processes. Software quality must be built in by the developers, with the software verification and validation engineers acting as the independent instruments for ensuring compliance with performance objectives and with development and maintenance standards. The implementation of a software quality assurance program is a complex process involving management support, organizational changes, and new skill sets, but the benefits are profound. Its rewards provide safe, reliable, cost-effective, maintainable, and manageable software, which may significantly speed the regulatory review process and therefore potentially shorten the overall time to market. The use of a trial project can greatly facilitate the learning process associated with the first-time application of a software quality assurance program.

  10. A cross-national study to objectively evaluate the quality of diverse simulation approaches for undergraduate nursing students.

    PubMed

    Kable, Ashley K; Levett-Jones, Tracy L; Arthur, Carol; Reid-Searl, Kerry; Humphreys, Melanie; Morris, Sara; Walsh, Pauline; Witton, Nicola J

    2018-01-01

    The aim of this paper is to report the results of a cross-national study that evaluated a range of simulation sessions using an observation schedule developed from evidence-based quality indicators. Observational data were collected from 17 simulation sessions conducted for undergraduate nursing students at three universities in Australia and the United Kingdom. The observation schedule contained 27 questions that rated simulation quality. Data were collected by direct observation and from video recordings of the simulation sessions. Results indicated that the highest quality scores were for provision of learning objectives prior to the simulation session (90%) and debriefing (72%). Student preparation and orientation (67%) and perceived realism and fidelity (67%) were scored lower than other components of the simulation sessions. This observational study proved to be an effective strategy to identify areas of strength and those needing further development to improve simulation sessions. Copyright © 2017 Elsevier Ltd. All rights reserved.

  11. Optimization of a micro-scale, high throughput process development tool and the demonstration of comparable process performance and product quality with biopharmaceutical manufacturing processes.

    PubMed

    Evans, Steven T; Stewart, Kevin D; Afdahl, Chris; Patel, Rohan; Newell, Kelcy J

    2017-07-14

    In this paper, we discuss the optimization and implementation of a high throughput process development (HTPD) tool that utilizes commercially available micro-liter sized column technology for the purification of multiple clinically significant monoclonal antibodies. Chromatographic profiles generated using this optimized tool are shown to overlay with comparable profiles from the conventional bench-scale and clinical manufacturing scale. Further, all product quality attributes measured are comparable across scales for the mAb purifications. In addition to supporting chromatography process development efforts (e.g., optimization screening), the comparable product quality results at all scales make this tool an appropriate scale model for purification and product quality comparisons of HTPD bioreactor conditions. The ability to perform up to 8 chromatography purifications in parallel with reduced material requirements per run creates opportunities for gathering more process knowledge in less time. Copyright © 2017 The Authors. Published by Elsevier B.V. All rights reserved.

  12. The Processing of Subject-Object Ambiguities in Native and Near-Native Mexican Spanish

    ERIC Educational Resources Information Center

    Jegerski, Jill

    2012-01-01

    This self-paced reading study first tested the prediction that the garden path effect previously observed during the processing of subject-object ambiguities in native English would not obtain in a null subject language like Spanish. The investigation then further explored whether the effect would be evident among near-native readers of Spanish…

  13. [Feedforward control strategy and its application in quality improvement of ethanol precipitation process of danhong injection].

    PubMed

    Yan, Bin-Jun; Guo, Zheng-Tai; Qu, Hai-Bin; Zhao, Bu-Chang; Zhao, Tao

    2013-06-01

    In this work, a feedforward control strategy based on the concept of quality by design was established for the manufacturing process of traditional Chinese medicine, to reduce the impact of the quality variation of raw materials on drug quality. In the research, the ethanol precipitation process of Danhong injection was taken as an application case of the method established. A Box-Behnken design of experiments was conducted. Mathematical models relating the attributes of the concentrate, the process parameters, and the quality of the supernatants produced were established. Then an optimization model for calculating the best process parameters based on the attributes of the concentrate was built. The quality of the supernatants produced by ethanol precipitation with optimized and non-optimized process parameters was compared. The results showed that using the feedforward control strategy for process parameter optimization can control the quality of the supernatants effectively. The proposed feedforward control strategy can enhance the batch-to-batch consistency of the supernatants produced by ethanol precipitation.
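
    The abstract does not reproduce the model equations; as a hedged illustration of the general feedforward idea (all factor names, data, and coefficients below are hypothetical), the sketch fits a simple quadratic response-surface model to Box-Behnken-style data and then, given a newly measured concentrate attribute, searches for the remaining process parameters predicted to bring the supernatant quality closest to a target value.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)

# Hypothetical Box-Behnken-style data: factors are the concentrate density (x1),
# ethanol-to-concentrate ratio (x2), and addition rate (x3), coded in [-1, 1];
# y is a quality index of the supernatant. All values are synthetic.
X = rng.uniform(-1, 1, size=(15, 3))
y = 80 + 5 * X[:, 0] - 8 * X[:, 1] + 3 * X[:, 2] - 6 * X[:, 1] ** 2 + rng.normal(0, 0.5, 15)

def design_matrix(x):
    # intercept + linear + pure quadratic terms of a simple response-surface model
    x = np.atleast_2d(x)
    return np.column_stack([np.ones(len(x)), x, x ** 2])

beta, *_ = np.linalg.lstsq(design_matrix(X), y, rcond=None)

def predict(x):
    return design_matrix(x) @ beta

# Feedforward step: the concentrate attribute x1 is measured for the new batch
# and fixed; the remaining parameters are chosen to drive the predicted quality
# toward a target value.
x1_measured, target = 0.4, 85.0
res = minimize(lambda p: (predict([x1_measured, *p])[0] - target) ** 2,
               x0=[0.0, 0.0], bounds=[(-1, 1), (-1, 1)])
print("recommended (x2, x3):", np.round(res.x, 3))
```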

  14. 3-D Interpolation in Object Perception: Evidence from an Objective Performance Paradigm

    ERIC Educational Resources Information Center

    Kellman, Philip J.; Garrigan, Patrick; Shipley, Thomas F.; Yin, Carol; Machado, Liana

    2005-01-01

    Object perception requires interpolation processes that connect visible regions despite spatial gaps. Some research has suggested that interpolation may be a 3-D process, but objective performance data and evidence about the conditions leading to interpolation are needed. The authors developed an objective performance paradigm for testing 3-D…

  15. A variation reduction allocation model for quality improvement to minimize investment and quality costs by considering suppliers’ learning curve

    NASA Astrophysics Data System (ADS)

    Rosyidi, C. N.; Jauhari, WA; Suhardi, B.; Hamada, K.

    2016-02-01

    Quality improvement must be performed in a company to maintain its product competitiveness in the market. The goal of such improvement is to increase customer satisfaction and the profitability of the company. In current practice, a company needs several suppliers to provide the components used in the assembly process of a final product. Hence, quality improvement of the final product must involve the suppliers. In this paper, an optimization model to allocate the variance reduction is developed. Variation reduction is an important term in quality improvement for both the manufacturer and the suppliers. To improve the quality of the suppliers’ components, the manufacturer must invest an amount of its financial resources in the learning processes of the suppliers. The objective function of the model is to minimize the total cost, which consists of the investment cost and the quality costs, both internal and external. The learning curve determines how the suppliers’ employees will respond to the learning processes in reducing the variance of the components.
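
    The paper's actual formulation is not given in the abstract; the sketch below is a simplified, hypothetical reading of it in which each supplier's achievable variance reduction follows an exponential learning curve in the training investment, and the sum of the investment and a quadratic (Taguchi-style) quality-loss cost is minimized over the investment allocation.

```python
import numpy as np
from scipy.optimize import minimize

# Hypothetical data for three suppliers (all numbers illustrative).
sigma0 = np.array([0.40, 0.55, 0.30])      # current component standard deviations
k_loss = np.array([900.0, 700.0, 1200.0])  # quadratic quality-loss coefficients
learn = np.array([0.15, 0.10, 0.20])       # learning-rate parameters

def sigma_after(invest):
    # Learning curve: the standard deviation shrinks exponentially with investment.
    return sigma0 * np.exp(-learn * invest)

def total_cost(invest):
    s = sigma_after(invest)
    quality_cost = np.sum(k_loss * s ** 2)  # internal + external quality costs
    return np.sum(invest) + quality_cost    # plus the investment itself

res = minimize(total_cost, x0=np.ones(3), bounds=[(0, None)] * 3)
print("investment per supplier:", np.round(res.x, 2))
print("resulting std devs:", np.round(sigma_after(res.x), 3))
```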

  16. Fit for purpose quality management system for military forensic exploitation.

    PubMed

    Wilson, Lauren Elizabeth; Gahan, Michelle Elizabeth; Robertson, James; Lennard, Chris

    2018-03-01

    In a previous publication we described a systems approach to forensic science applied in the military domain. The forensic science 'system of systems' describes forensic science as a sub-system in the larger criminal justice, law enforcement, intelligence, and military systems, with quality management being an important supporting system. Quality management systems help to ensure that organisations achieve their objectives and continually improve their capability. Components of forensic science quality management systems can include standardisation of processes, accreditation of facilities to national/international standards, and certification of personnel. A fit for purpose quality management system should be balanced to allow organisations to meet objectives, provide continuous improvement, mitigate risk, and impart a positive quality culture. Considerable attention over the last decades has been given to the need for forensic science quality management systems to meet criminal justice and law enforcement objectives. More recently, the need for forensic quality management systems to meet forensic intelligence objectives has been considered. This paper, for the first time, discusses the need for a fit for purpose quality management system for military forensic exploitation. Crown Copyright © 2018. Published by Elsevier B.V. All rights reserved.

  17. Multi-objective optimization of a continuous bio-dissimilation process of glycerol to 1, 3-propanediol.

    PubMed

    Xu, Gongxian; Liu, Ying; Gao, Qunwang

    2016-02-10

    This paper deals with multi-objective optimization of continuous bio-dissimilation process of glycerol to 1, 3-propanediol. In order to maximize the production rate of 1, 3-propanediol, maximize the conversion rate of glycerol to 1, 3-propanediol, maximize the conversion rate of glycerol, and minimize the concentration of by-product ethanol, we first propose six new multi-objective optimization models that can simultaneously optimize any two of the four objectives above. Then these multi-objective optimization problems are solved by using the weighted-sum and normal-boundary intersection methods respectively. Both the Pareto filter algorithm and removal criteria are used to remove those non-Pareto optimal points obtained by the normal-boundary intersection method. The results show that the normal-boundary intersection method can successfully obtain the approximate Pareto optimal sets of all the proposed multi-objective optimization problems, while the weighted-sum approach cannot achieve the overall Pareto optimal solutions of some multi-objective problems. Copyright © 2015 Elsevier B.V. All rights reserved.
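
    As a hedged illustration of the scalarization approach discussed above (on a toy two-objective stand-in, not the kinetic model of the paper), the sketch below scans weighted-sum scalarizations of a "maximize productivity / minimize by-product" trade-off; on non-convex Pareto fronts this scan can miss solutions, which is the limitation that motivates the normal-boundary intersection method.

```python
import numpy as np
from scipy.optimize import minimize_scalar

# Toy stand-in problem (not the fermentation model of the paper): f1 is a
# "productivity" objective to maximize and f2 a "by-product" objective to
# minimize, both functions of a single dilution-rate-like variable d in [0, 1].
f1 = lambda d: d * (1.2 - d)
f2 = lambda d: 0.3 * d ** 2

def weighted_sum_front(n_weights=11):
    """Scan scalarization weights and collect the solutions found."""
    points = []
    for w in np.linspace(0.0, 1.0, n_weights):
        # minimize the scalarized objective  -w*f1(d) + (1-w)*f2(d)
        res = minimize_scalar(lambda d: -w * f1(d) + (1 - w) * f2(d),
                              bounds=(0.0, 1.0), method="bounded")
        points.append((f1(res.x), f2(res.x)))
    return np.array(points)

# On a non-convex front, distinct weights can collapse onto the same points and
# leave gaps; this toy front is convex, so the scan traces it adequately.
print(np.round(weighted_sum_front(), 3))
```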

  18. Statistical Process Control: Going to the Limit for Quality.

    ERIC Educational Resources Information Center

    Training, 1987

    1987-01-01

    Defines the concept of statistical process control, a quality control method used especially in manufacturing. Generally, concept users set specific standard levels that must be met. Makes the point that although employees work directly with the method, management is responsible for its success within the plant. (CH)
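
    As a generic illustration of the statistical process control idea summarized above (not drawn from the cited article), the sketch below computes simplified 3-sigma control limits for subgroup means and flags subgroups falling outside them; all data are synthetic.

```python
import numpy as np

# Generic Shewhart X-bar chart limits: subgroup means are flagged when they
# fall outside grand_mean +/- 3*sigma/sqrt(n).
rng = np.random.default_rng(1)
subgroups = rng.normal(loc=10.0, scale=0.2, size=(25, 5))  # 25 samples of n=5

xbar = subgroups.mean(axis=1)
grand_mean = xbar.mean()
sigma_est = subgroups.std(axis=1, ddof=1).mean()  # simplified sigma estimate
n = subgroups.shape[1]

ucl = grand_mean + 3 * sigma_est / np.sqrt(n)
lcl = grand_mean - 3 * sigma_est / np.sqrt(n)
out_of_control = np.where((xbar > ucl) | (xbar < lcl))[0]

print(f"UCL={ucl:.3f}, LCL={lcl:.3f}, out-of-control subgroups: {out_of_control}")
```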

  19. Intelligent Systems Approaches to Product Sound Quality Analysis

    NASA Astrophysics Data System (ADS)

    Pietila, Glenn M.

    As a product market becomes more competitive, consumers become more discriminating in the way in which they differentiate between engineered products. The consumer often makes a purchasing decision based on the sound emitted from the product during operation by using the sound to judge quality or annoyance. Therefore, in recent years, many sound quality analysis tools have been developed to evaluate the consumer preference as it relates to a product sound and to quantify this preference based on objective measurements. This understanding can be used to direct a product design process in order to help differentiate the product from competitive products or to establish an impression on consumers regarding a product's quality or robustness. The sound quality process is typically a statistical tool that is used to model subjective preference, or merit score, based on objective measurements, or metrics. In this way, new product developments can be evaluated in an objective manner without the laborious process of gathering a sample population of consumers for subjective studies each time. The most common model used today is the Multiple Linear Regression (MLR), although recently non-linear Artificial Neural Network (ANN) approaches are gaining popularity. This dissertation will review publicly available published literature and present additional intelligent systems approaches that can be used to improve on the current sound quality process. The focus of this work is to address shortcomings in the current paired comparison approach to sound quality analysis. This research will propose a framework for an adaptive jury analysis approach as an alternative to the current Bradley-Terry model. The adaptive jury framework uses statistical hypothesis testing to focus on sound pairings that are most interesting and is expected to address some of the restrictions required by the Bradley-Terry model. It will also provide a more amicable framework for an intelligent systems approach
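
    The dissertation itself is not reproduced here; as a minimal, hypothetical sketch of the conventional step it builds on, the code below fits a multiple linear regression of subjective merit scores on objective sound-quality metrics, the MLR model the abstract names as the most common approach (metric names and data are made up).

```python
import numpy as np

# Hypothetical jury data: objective metrics per sound sample (e.g., loudness,
# sharpness, roughness) and the subjective merit score assigned by the jury.
rng = np.random.default_rng(2)
metrics = rng.uniform(size=(30, 3))
merit = 10 - 4 * metrics[:, 0] - 2 * metrics[:, 1] + rng.normal(0, 0.3, 30)

# Ordinary least-squares multiple linear regression with an intercept column.
A = np.column_stack([np.ones(len(metrics)), metrics])
coef, *_ = np.linalg.lstsq(A, merit, rcond=None)
predicted = A @ coef

r2 = 1 - np.sum((merit - predicted) ** 2) / np.sum((merit - merit.mean()) ** 2)
print("intercept and metric weights:", np.round(coef, 2), " R^2:", round(r2, 3))
```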

  20. Distinct and overlapping fMRI activation networks for processing of novel identities and locations of objects.

    PubMed

    Pihlajamäki, Maija; Tanila, Heikki; Könönen, Mervi; Hänninen, Tuomo; Aronen, Hannu J; Soininen, Hilkka

    2005-10-01

    The ventral visual stream processes information about the identity of objects ('what'), whereas the dorsal stream processes the spatial locations of objects ('where'). There is a corresponding, although disputed, distinction for the ventrolateral and dorsolateral prefrontal areas. Furthermore, there seems to be a distinction between the anterior and posterior medial temporal lobe (MTL) structures in the processing of novel items and new spatial arrangements, respectively. Functional differentiation of the intermediary mid-line cortical and temporal neocortical structures that communicate with the occipitotemporal, occipitoparietal, prefrontal, and MTL structures, however, is unclear. Therefore, in the present functional magnetic resonance imaging (fMRI) study, we examined whether the distinction among the MTL structures extends to these closely connected cortical areas. The most striking difference in the fMRI responses during visual presentation of changes in either items or their locations was the bilateral activation of the temporal lobe and ventrolateral prefrontal cortical areas for novel object identification in contrast to wide parietal and dorsolateral prefrontal activation for the novel locations of objects. An anterior-posterior distinction of fMRI responses similar to the MTL was observed in the cingulate/retrosplenial, and superior and middle temporal cortices. In addition to the distinct areas of activation, certain frontal, parietal, and temporo-occipital areas responded to both object and spatial novelty, suggesting a common attentional network for both types of changes in the visual environment. These findings offer new insights to the functional roles and intrinsic specialization of the cingulate/retrosplenial, and lateral temporal cortical areas in visuospatial cognition.

  1. Processing and Quality Monitoring for the ATLAS Tile Hadronic Calorimeter Data

    NASA Astrophysics Data System (ADS)

    Burghgrave, Blake; ATLAS Collaboration

    2017-10-01

    An overview is presented of Data Processing and Data Quality (DQ) Monitoring for the ATLAS Tile Hadronic Calorimeter. Calibration runs are monitored from a data quality perspective and used as a cross-check for physics runs. Data quality in physics runs is monitored extensively and continuously. Any problems are reported and immediately investigated. The DQ efficiency achieved was 99.6% in 2012 and 100% in 2015, after the detector maintenance in 2013-2014. Changes to detector status or calibrations are entered into the conditions database (DB) during a brief calibration loop between the end of a run and the beginning of bulk processing of data collected in it. Bulk processed data are reviewed and certified for the ATLAS Good Run List if no problem is detected. Experts maintain the tools used by DQ shifters and the calibration teams during normal operation, and prepare new conditions for data reprocessing and Monte Carlo (MC) production campaigns. Conditions data are stored in 3 databases: Online DB, Offline DB for data and a special DB for Monte Carlo. Database updates can be performed through a custom-made web interface.

  2. Effect of pilot-scale aseptic processing on tomato soup quality parameters.

    PubMed

    Colle, Ines J P; Andrys, Anna; Grundelius, Andrea; Lemmens, Lien; Löfgren, Anders; Buggenhout, Sandy Van; Loey, Ann; Hendrickx, Marc Van

    2011-01-01

    Tomatoes are often processed into shelf-stable products. However, the different processing steps might have an impact on the product quality. In this study, a model tomato soup was prepared and the impact of pilot-scale aseptic processing, including heat treatment and high-pressure homogenization, on some selected quality parameters was evaluated. The vitamin C content, the lycopene isomer content, and the lycopene bioaccessibility were considered as health-promoting attributes. As a structural characteristic, the viscosity of the tomato soup was investigated. A tomato soup without oil as well as a tomato soup containing 5% olive oil were evaluated. Thermal processing had a negative effect on the vitamin C content, while lycopene degradation was limited. For both compounds, high-pressure homogenization caused additional losses. High-pressure homogenization also resulted in a higher viscosity that was accompanied by a decrease in lycopene bioaccessibility. The presence of lipids clearly enhanced the lycopene isomerization susceptibility and improved the bioaccessibility. The results obtained in this study are of relevance for product formulation and process design of tomato-based food products. © 2011 Institute of Food Technologists®

  3. Paraho environmental data. Part I. Process characterization. Par II. Air quality. Part III. Water quality

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Heistand, R.N.; Atwood, R.A.; Richardson, K.L.

    1980-06-01

    From 1973 to 1978, Development Engineering, Inc. (DEI), a subsidiary of Paraho Development Corporation, demonstrated the Paraho technology for surface oil shale retorting at Anvil Points, Colorado. A considerable amount of environmentally-related research was also conducted. This body of data represents the most comprehensive environmental data base relating to surface retorting that is currently available. In order to make this information available, the DOE Office of Environment has undertaken to compile, assemble, and publish this environmental data. The compilation has been prepared by DEI. This report includes the process characterization, air quality, and water quality categories.

  4. Modern Methods of Measuring and Modelling Architectural Objects in the Process of their Valorisation

    NASA Astrophysics Data System (ADS)

    Zagroba, Marek

    2017-10-01

    As well as being a cutting-edge technology, laser scanning is still developing rapidly. Laser scanners have an almost unlimited range of use in many disciplines of contemporary engineering, where precision and high quality of the tasks performed are of the utmost importance. Among these disciplines, special attention is drawn to architecture and urban space studies, that is, the fields of science which shape the space and surroundings occupied by people, thus having a direct impact on people’s lives. It is more complicated to take measurements with a laser scanner than with traditional methods, where laser target markers or a measuring tape are used. A specific procedure must be followed when measurements are taken with a laser scanner, and the aim is to obtain three-dimensional data about a building situated in a given space. Accuracy, low time consumption, safety and non-invasiveness are the primary advantages of this technology in civil engineering practice, when handling both historic and modern architecture. Using a laser scanner is especially important when taking measurements of vast engineering constructions, where an application of traditional techniques would be much more difficult and would require higher time and labour inputs, for example because of some less easily accessible nooks and crannies or due to the geometrical complexity of individual components of a building structure. In this article, the author addresses the problem of measuring and modelling architectural objects in the process of their valorisation, i.e. the enhancement of their functional, usable, spatial and aesthetic values. Above all, the laser scanning method, by generating results as a point cloud, enables the user to obtain a very detailed, three-dimensional computer image of measured objects, and to make a series of analyses and expert investigations, e.g. of the technical condition (deformation of construction elements) as well as the spatial management of the surrounding

  5. High-quality compressive ghost imaging

    NASA Astrophysics Data System (ADS)

    Huang, Heyan; Zhou, Cheng; Tian, Tian; Liu, Dongqi; Song, Lijun

    2018-04-01

    We propose a high-quality compressive ghost imaging method based on projected Landweber regularization and a guided filter, which effectively reduces the undersampling noise and improves the resolution. In our scheme, the original object is reconstructed by a decomposition into regularization and denoising steps, instead of by solving a minimization problem in the compressive reconstruction process. The simulation and experimental results show that our method can obtain high ghost imaging quality in terms of PSNR and visual observation.
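
    A hedged sketch of the reconstruction idea named above, with the guided-filter denoising step replaced by a plain nonnegativity projection and a made-up measurement model: a projected Landweber iteration alternates a gradient step on the data-fidelity term with a projection of the estimate.

```python
import numpy as np

rng = np.random.default_rng(3)

# Illustrative compressive ghost-imaging model: y = A @ x, where each row of A
# is one random speckle pattern and y are the bucket-detector values.
n_pixels, n_patterns = 64, 40            # undersampled: 40 measurements, 64 unknowns
x_true = np.zeros(n_pixels)
x_true[20:28] = 1.0                      # simple one-dimensional "object"
A = rng.standard_normal((n_patterns, n_pixels))
y = A @ x_true

# Projected Landweber iteration: gradient step on ||y - A x||^2 followed by a
# projection (here simply clipping to nonnegative values; the paper combines the
# regularization step with a guided filter, which is omitted in this sketch).
step = 1.0 / np.linalg.norm(A, 2) ** 2
x = np.zeros(n_pixels)
for _ in range(300):
    x = x + step * A.T @ (y - A @ x)
    x = np.clip(x, 0.0, None)

rel_err = np.linalg.norm(x - x_true) / np.linalg.norm(x_true)
print("relative reconstruction error:", round(rel_err, 3))
```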

  6. [Quality Management and Quality Specifications of Laboratory Tests in Clinical Studies--Challenges in Pre-Analytical Processes in Clinical Laboratories].

    PubMed

    Ishibashi, Midori

    2015-01-01

    The cost, speed, and quality are the three important factors recently indicated by the Ministry of Health, Labour and Welfare (MHLW) for the purpose of accelerating clinical studies. Based on this background, the importance of laboratory tests is increasing, especially in the evaluation of clinical study participants' entry and safety, and drug efficacy. To assure the quality of laboratory tests, providing high-quality laboratory tests is mandatory. For providing adequate quality assurance in laboratory tests, quality control in the three fields of pre-analytical, analytical, and post-analytical processes is extremely important. There are, however, no detailed written requirements concerning specimen collection, handling, preparation, storage, and shipping. Most laboratory tests for clinical studies are performed onsite in a local laboratory; however, a part of laboratory tests is done in offsite central laboratories after specimen shipping. As factors affecting laboratory tests, individual and inter-individual variations are well-known. Besides these factors, standardizing the factors of specimen collection, handling, preparation, storage, and shipping, may improve and maintain the high quality of clinical studies in general. Furthermore, the analytical method, units, and reference interval are also important factors. It is concluded that, to overcome the problems derived from pre-analytical processes, it is necessary to standardize specimen handling in a broad sense.

  7. Quality assessment of Isfahan Medical Faculty web site electronic services and prioritizing solutions using analytic hierarchy process approach

    PubMed Central

    Hajrahimi, Nafiseh; Dehaghani, Sayed Mehdi Hejazi; Hajrahimi, Nargess; Sarmadi, Sima

    2014-01-01

    Context: Implementing information technology in the best possible way can bring many advantages such as applying electronic services and facilitating tasks. Therefore, assessment of service-providing systems is a way to improve the quality of and elevate these systems, including e-commerce, e-government, e-banking, and e-learning. Aims: This study aimed to evaluate the electronic services on the website of Isfahan University of Medical Sciences in order to propose solutions to improve them. Furthermore, we aim to rank the solutions based on the factors that enhance the quality of electronic services by using the analytic hierarchy process (AHP) method. Materials and Methods: A non-parametric test was used to assess the quality of electronic services. The assessment of propositions was based on the Aqual model, and they were prioritized using the AHP approach. The AHP approach was used because it directly applies experts’ deductions in the model and leads to more objective results in the analysis and in prioritizing the risks. After evaluating the quality of the electronic services, a multi-criteria decision-making framework was used to prioritize the proposed solutions. Statistical Analysis Used: Non-parametric tests and the AHP approach using Expert Choice software. Results: The results showed that students were satisfied with most of the indicators. Only a few indicators received low satisfaction from students, including design attractiveness, the amount of explanation and details of information, honesty and responsiveness of authorities, and the role of e-services in the user's relationship with the university. After interviews with Information and Communications Technology (ICT) experts at the university, measurement criteria and solutions to improve the quality were collected. The best solutions were selected by EC software. According to the results, the solution “controlling and improving the process in handling users complaints” is of the utmost importance and authorities
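
    As a hedged illustration of the AHP computation used for prioritization (the comparison matrix below is hypothetical, not the study's data), the sketch derives priority weights from a pairwise comparison matrix via its principal eigenvector and checks Saaty's consistency ratio.

```python
import numpy as np

# Hypothetical pairwise comparison matrix for three candidate solutions,
# using Saaty's 1-9 scale (A[i, j] = how strongly solution i is preferred to j).
A = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 2.0],
              [1/5, 1/2, 1.0]])

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
weights = np.abs(eigvecs[:, k].real)
weights /= weights.sum()

# Consistency check: CI = (lambda_max - n) / (n - 1); random index RI = 0.58 for n = 3.
# A consistency ratio below about 0.1 is conventionally considered acceptable.
n = A.shape[0]
ci = (eigvals[k].real - n) / (n - 1)
cr = ci / 0.58
print("priority weights:", np.round(weights, 3), " consistency ratio:", round(cr, 3))
```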

  8. Road-safety education: spatial decentering and subjective or objective picture processing.

    PubMed

    Guercin, F

    2007-10-01

    The current study examined children's ability to analyse pictures of a risky situation, both in relation to the characteristics of the pictures and in relation to the centering/decentering process of cognitive development. Sixty children aged 6, 9 or 11 years were given an objective or subjective version of a story about a risky situation involving road crossing and were asked to reconstruct it by putting six pictures in chronological order. The type of picture series, objective or subjective, had a different effect on the children's understanding and performance, according to the age. The older children were better at ordering the pictures, but on the subjective version only. The picture-version effect on planning time decreased with age; only the younger children took more time to start touching the pictures. On one hand, it is concluded that for the youngest children, objective representations are essential to analysing pictures showing a risk, whereas the oldest children will profit more from a subjective view. On the other hand, subjective representations, which give a more realistic view, provide an excellent tool for testing children's abilities. Subjective representations can be used to detect potentially risky behaviour in virtual situations (static pictures, or multimedia tools), since it permits one to predict at-risk behaviour in the street and to assess the effectiveness of remedial measures.

  9. Software quality and process improvement in scientific simulation codes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ambrosiano, J.; Webster, R.

    1997-11-01

    This report contains viewgraphs on the quest to develop better simulation code quality through process modeling and improvement. This study is based on the experience of the authors and interviews with ten subjects chosen from simulation code development teams at LANL. This study is descriptive rather than scientific.

  10. [Study of continuous quality improvement for clinical laboratory processes via the platform of Hospital Group].

    PubMed

    Song, Wenqi; Shen, Ying; Peng, Xiaoxia; Tian, Jian; Wang, Hui; Xu, Lili; Nie, Xiaolu; Ni, Xin

    2015-05-26

    The program of continuous quality improvement in clinical laboratory processes for the complete blood count (CBC) was launched via the platform of the Beijing Children's Hospital Group in order to improve the quality of pediatric clinical laboratories. Fifteen children's hospitals of the Beijing Children's Hospital Group were investigated using a Chinese-adapted continuous quality improvement method with PDCA (Plan-Do-Check-Action). A questionnaire survey and an inter-laboratory comparison were conducted to find the existing problems, analyze their reasons, set quality targets, and put them into practice. Then, targeted training was provided to the 15 children's hospitals, and a second questionnaire survey and self-examinations by the clinical laboratories were performed. At the same time, the Group's online internal quality control platform was established. Overall effects of the program were evaluated so as to lay a foundation for the next stage of PDCA. Both the quality control system documents and the CBC internal quality control schemes of all the clinical laboratories were improved through this program. In addition, the standardization of performance verification was also improved, especially with the comparable verification rate of precision and internal laboratory results reaching 100%. In terms of instrument calibration and mandatory diagnostic rates, only three of the 15 hospitals (20%) failed to pass muster in 2014, down from 46.67% (seven of the 15 hospitals) in 2013. The abnormal intraday precision coefficients of variation of the five CBC indicator parameters (WBC, RBC, Hb, Plt and Hct) across all 15 laboratories accounted for 1.2% (2/165) in 2014, a marked decrease from 9.6% (14/145) in 2013, while the number of hospitals using only one level of quality control material for daily quality control dropped from five to three. The 15 hospitals organized a total of 263 training sessions in 2014, up 160% from 101 in 2013. The quality improvement program for

  11. Data Quality Objectives for Regulatory Requirements for Hazardous and Radioactive Air Emissions Sampling and Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    MULKEY, C.H.

    1999-07-06

    This document describes the results of the data quality objective (DQO) process undertaken to define data needs for state and federal requirements associated with toxic, hazardous, and/or radiological air emissions under the jurisdiction of the River Protection Project (RPP). Hereafter, this document is referred to as the Air DQO. The primary drivers for characterization under this DQO are the regulatory requirements pursuant to Washington State regulations, that may require sampling and analysis. The federal regulations concerning air emissions are incorporated into the Washington State regulations. Data needs exist for nonradioactive and radioactive waste constituents and characteristics as identified through the DQO process described in this document. The purpose is to identify current data needs for complying with regulatory drivers for the measurement of air emissions from RPP facilities in support of air permitting. These drivers include best management practices; similar analyses may have more than one regulatory driver. This document should not be used for determining overall compliance with regulations because the regulations are in constant change, and this document may not reflect the latest regulatory requirements. Regulatory requirements are also expected to change as various permits are issued. Data needs require samples for both radionuclides and nonradionuclide analytes of air emissions from tanks and stored waste containers. The collection of data is to support environmental permitting and compliance, not for health and safety issues. This document does not address health or safety regulations or requirements (those of the Occupational Safety and Health Administration or the National Institute of Occupational Safety and Health) or continuous emission monitoring systems. This DQO is applicable to all equipment, facilities, and operations under the jurisdiction of RPP that emit or have the potential to emit regulated air pollutants.

  12. Research on Objectives for High-School Biology

    ERIC Educational Resources Information Center

    Korgan, John J., Jr.; Wilson, John T.

    1973-01-01

    Describes procedures to develop instructional objectives for high school biology. Two kinds of objectives are identified as pre-objectives and performance objectives. Models to classify these in branches of biology and to ensure quality control are provided. (PS)

  13. [The significance of meat quality in marketing].

    PubMed

    Kallweit, E

    1994-07-01

    Food quality in general, and meat quality in particular, is not only evaluated by means of objective quality traits; the entire production process is also gaining more attention from the modern consumer. Due to this development, quality programs were developed to define the majority of the processes in all production and marketing steps, which are in turn linked by contracts. Not all of these items are quality-relevant; some are concessions to ethical principles (animal welfare, etc.). This is demonstrated by the example of Scharrel-pork production. Price differentiation in the pork market is still influenced predominantly by quantitative carcass traits. On the European market, quality programs are still of minor significance. Premiums paid for high quality standards are more or less offset by the higher production costs and lower lean meat percentages that must be expected in stress-susceptible strains. The high effort required to establish quality programs, however, helps to improve the quality level in general and secures market shares for local producers.

  14. EPA Region 7 and Four States Water Quality Standards Review Process Kaizen Event

    EPA Pesticide Factsheets

    The submittal, review and approval process of the EPA–State process for developing and revising Water Quality Standards (WQS) was the focus of a Lean business process improvement kaizen event in June 2007.

  15. Distinct brain activity in processing negative pictures of animals and objects --- the role of human contexts

    PubMed Central

    Cao, Zhijun; Zhao, Yanbing; Tan, Tengteng; Chen, Gang; Ning, Xueling; Zhan, Lexia; Yang, Jiongjiong

    2013-01-01

    Previous studies have shown that the amygdala is important in processing not only animate entities but also social information. It remains to be determined to what extent the factors of category and social context interact to modulate the activities of the amygdala and cortical regions. In this study, pictures depicting animals and inanimate objects in negative and neutral levels were presented. The contexts of the pictures differed in whether they included human/human parts. The factors of valence, arousal, familiarity and complexity of pictures were controlled across categories. The results showed that the amygdala activity was modulated by category and contextual information. Under the nonhuman context condition, the amygdala responded more to animals than objects for both negative and neutral pictures. In contrast, under the human context condition, the amygdala showed stronger activity for negative objects than animals. In addition to cortical regions related to object action, functional and effective connectivity analyses showed that the anterior prefrontal cortex interacted with the amygdala more for negative objects (vs. animals) in the human context condition, by a top-down modulation of the anterior prefrontal cortex to the amygdala. These results highlighted the effects of category and human contexts on modulating brain activity in emotional processing. PMID:24099847

  16. Correlating objective and subjective evaluation of texture appearance with applications to camera phone imaging

    NASA Astrophysics Data System (ADS)

    Phillips, Jonathan B.; Coppola, Stephen M.; Jin, Elaine W.; Chen, Ying; Clark, James H.; Mauer, Timothy A.

    2009-01-01

    Texture appearance is an important component of photographic image quality as well as object recognition. Noise cleaning algorithms are used to decrease sensor noise of digital images, but can hinder texture elements in the process. The Camera Phone Image Quality (CPIQ) initiative of the International Imaging Industry Association (I3A) is developing metrics to quantify texture appearance. Objective and subjective experimental results of the texture metric development are presented in this paper. Eight levels of noise cleaning were applied to ten photographic scenes that included texture elements such as faces, landscapes, architecture, and foliage. Four companies (Aptina Imaging, LLC, Hewlett-Packard, Eastman Kodak Company, and Vista Point Technologies) have performed psychophysical evaluations of overall image quality using one of two methods of evaluation. Both methods presented paired comparisons of images on thin film transistor liquid crystal displays (TFT-LCD), but the display pixel pitch and viewing distance differed. CPIQ has also been developing objective texture metrics and targets that were used to analyze the same eight levels of noise cleaning. The correlation of the subjective and objective test results indicates that texture perception can be modeled with an objective metric. The two methods of psychophysical evaluation exhibited high correlation despite the differences in methodology.

  17. Quality metric for spherical panoramic video

    NASA Astrophysics Data System (ADS)

    Zakharchenko, Vladyslav; Choi, Kwang Pyo; Park, Jeong Hoon

    2016-09-01

    Virtual reality (VR)/augmented reality (AR) applications allow users to view artificial content of a surrounding space, simulating a presence effect with the help of special applications or devices. Synthetic content production is a well-known process from the computer graphics domain, and its pipeline is already well established in the industry. However, emerging multimedia formats for immersive entertainment applications, such as free-viewpoint television (FTV) or spherical panoramic video, require different approaches to content management and quality assessment. The international standardization of FTV has been promoted by MPEG. This paper is dedicated to the discussion of an immersive media distribution format and the quality estimation process. The accuracy and reliability of the proposed objective quality estimation method were verified with spherical panoramic images, demonstrating good correlation with subjective quality estimation performed by a group of experts.

  18. Possible Overlapping Time Frames of Acquisition and Consolidation Phases in Object Memory Processes: A Pharmacological Approach

    ERIC Educational Resources Information Center

    Akkerman, Sven; Blokland, Arjan; Prickaerts, Jos

    2016-01-01

    In previous studies, we have shown that acetylcholinesterase inhibitors and phosphodiesterase inhibitors (PDE-Is) are able to improve object memory by enhancing acquisition processes. On the other hand, only PDE-Is improve consolidation processes. Here we show that the cholinesterase inhibitor donepezil also improves memory performance when…

  19. Astronomical Instrumentation Systems Quality Management Planning: AISQMP (Abstract)

    NASA Astrophysics Data System (ADS)

    Goldbaum, J.

    2017-12-01

    (Abstract only) The capability of small aperture astronomical instrumentation systems (AIS) to make meaningful scientific contributions has never been better. The purpose of AIS quality management planning (AISQMP) is to ensure the quality of these contributions such that they are both valid and reliable. The first step involved with AISQMP is to specify objective quality measures not just for the AIS final product, but also for the instrumentation used in its production. The next step is to set up a process to track these measures and control for any unwanted variation. The final step is continual effort applied to reducing variation and obtaining measured values near optimal theoretical performance. This paper provides an overview of AISQMP while focusing on objective quality measures applied to astronomical imaging systems.

  20. Utilization of Expert Knowledge in a Multi-Objective Hydrologic Model Automatic Calibration Process

    NASA Astrophysics Data System (ADS)

    Quebbeman, J.; Park, G. H.; Carney, S.; Day, G. N.; Micheletty, P. D.

    2016-12-01

    Spatially distributed continuous simulation hydrologic models have a large number of parameters available for potential adjustment during the calibration process. Traditional manual calibration of such a modeling system is extremely laborious, which has historically motivated the use of automatic calibration procedures. With a large selection of model parameters, high degrees of objective-space fitness - measured with typical metrics such as Nash-Sutcliffe, Kling-Gupta, RMSE, etc. - can easily be achieved using a range of evolutionary algorithms. A concern with this approach is the high degree of compensatory calibration, with many similarly performing solutions and yet grossly varying parameter-set solutions. To help alleviate this concern, and to mimic manual calibration processes, expert knowledge is proposed for inclusion within the multi-objective functions that evaluate the parameter decision space. As a result, Pareto solutions are identified with high degrees of fitness, but with parameter sets that maintain and utilize available expert knowledge, resulting in more realistic and consistent solutions. This process was tested using the joint SNOW-17 and Sacramento Soil Moisture Accounting method (SAC-SMA) within the Animas River basin in Colorado. Three different elevation zones, each with a range of parameters, resulted in over 35 model parameters being simultaneously calibrated. As a result, high degrees of fitness were achieved, in addition to the development of more realistic and consistent parameter sets such as those typically achieved during manual calibration procedures.
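
    The SNOW-17/SAC-SMA models themselves are beyond a short example; as a hypothetical sketch of the idea of folding expert knowledge into the objective vector, the code below pairs a conventional error metric (Nash-Sutcliffe efficiency) with a penalty measuring violation of an expert-preferred ordering of snowmelt parameters across elevation zones (parameter names and data are illustrative).

```python
import numpy as np

def nse(sim, obs):
    """Nash-Sutcliffe efficiency, one of the conventional fitness metrics."""
    return 1.0 - np.sum((sim - obs) ** 2) / np.sum((obs - obs.mean()) ** 2)

def expert_penalty(params):
    """Hypothetical expert-knowledge objective: penalize parameter sets in which
    the melt factor of a lower elevation zone exceeds that of a higher zone,
    a relationship an experienced calibrator would normally enforce by hand."""
    melt_low, melt_mid, melt_high = params[:3]
    return max(0.0, melt_low - melt_mid) + max(0.0, melt_mid - melt_high)

def objectives(params, sim, obs):
    """Objective vector handed to a multi-objective (Pareto) search:
    minimize both the model error and the expert-knowledge violation."""
    return np.array([1.0 - nse(sim, obs), expert_penalty(params)])

# Toy usage with made-up flows and a deliberately inconsistent parameter set.
obs = np.array([1.0, 2.0, 4.0, 3.0, 2.0])
sim = np.array([1.1, 1.8, 3.9, 3.2, 2.1])
print(objectives(np.array([3.0, 2.5, 4.0]), sim, obs))
```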

  1. Auditory object salience: human cortical processing of non-biological action sounds and their acoustic signal attributes

    PubMed Central

    Lewis, James W.; Talkington, William J.; Tallaksen, Katherine C.; Frum, Chris A.

    2012-01-01

    Whether viewed or heard, an object in action can be segmented as a distinct salient event based on a number of different sensory cues. In the visual system, several low-level attributes of an image are processed along parallel hierarchies, involving intermediate stages wherein gross-level object form and/or motion features are extracted prior to stages that show greater specificity for different object categories (e.g., people, buildings, or tools). In the auditory system, though relying on a rather different set of low-level signal attributes, meaningful real-world acoustic events and “auditory objects” can also be readily distinguished from background scenes. However, the nature of the acoustic signal attributes or gross-level perceptual features that may be explicitly processed along intermediate cortical processing stages remain poorly understood. Examining mechanical and environmental action sounds, representing two distinct non-biological categories of action sources, we had participants assess the degree to which each sound was perceived as object-like versus scene-like. We re-analyzed data from two of our earlier functional magnetic resonance imaging (fMRI) task paradigms (Engel et al., 2009) and found that scene-like action sounds preferentially led to activation along several midline cortical structures, but with strong dependence on listening task demands. In contrast, bilateral foci along the superior temporal gyri (STG) showed parametrically increasing activation to action sounds rated as more “object-like,” independent of sound category or task demands. Moreover, these STG regions also showed parametric sensitivity to spectral structure variations (SSVs) of the action sounds—a quantitative measure of change in entropy of the acoustic signals over time—and the right STG additionally showed parametric sensitivity to measures of mean entropy and harmonic content of the environmental sounds. Analogous to the visual system, intermediate stages

  2. [Investigation on production process quality control of traditional Chinese medicine--Banlangen granule as an example].

    PubMed

    Tan, Manrong; Yan, Dan; Qiu, Lingling; Chen, Longhu; Yan, Yan; Jin, Cheng; Li, Hanbing; Xiao, Xiaohe

    2012-04-01

    The quality management system for herbal medicines, intermediates, and finished products suffers from a "short board" (weakest-link) effect in its methodologies. Based on the concept of process control, new strategies and new methods for production process quality control were established, taking into account the actual production of traditional Chinese medicine and the characteristics of Chinese medicine. Banlangen granule, an effective and widely used product, was taken as a practical example: character identification, determination of index components, chemical fingerprinting, and biometric technology were used in sequence to assess the quality of Banlangen herbal materials, the intermediates (water extraction and alcohol precipitation), and the finished product. With the transfer rate of chemical information and biological potency as indicators, the effectiveness and transmission of the above assessment and control methods were investigated. Ultimately, process quality control methods for Banlangen granule, based on chemical composition analysis and biometric analysis, were established. They can not only effectively resolve the current situation in which the quality of Banlangen granule varies among its many manufacturers, but also ensure and enhance its clinical efficacy. Furthermore, this work provides a foundation for the construction of quality control for the traditional Chinese medicine production process.

  3. Effect of high pressure-high temperature process on meat product quality

    NASA Astrophysics Data System (ADS)

    Duranton, Frédérique; Marée, Elvire; Simonin, Hélène; Chéret, Romuald; de Lamballerie, Marie

    2011-03-01

    High pressure/high temperature (HPHT) processing is an innovative way to sterilize food and has been proposed as an alternative to conventional retorting. By using elevated temperatures and adiabatic compression, it allows the inactivation of vegetative microorganisms and pathogen spores. Even though the microbial inactivation has been widely studied, studies of the effect of such a process on the sensorial attributes of food products, especially meat products, remain rare. The aim of this study was to investigate the potential of using an HPHT process (500 MPa/115 °C) instead of conventional retorting to stabilize Toulouse sausages while retaining high organoleptic quality. Texture, color, water-holding capacity, and microbial stability were measured. It was possible to manufacture stable products at 500 MPa/115 °C/30 min. However, under these conditions, no improvement in quality was found compared with conventional retorting.

  4. Effects of the situational context and interactional process on the quality of family caregiving.

    PubMed

    Phillips, L R; Morrison, E; Steffl, B; Chae, Y M; Cromwell, S L; Russell, C K

    1995-06-01

    A staged theoretical model designed to explain the quality of elder caring by family members was tested. The model posits how the situational context, interactional process, and caregiving burden perceived by the caregiver affect the quality of elder caring. The purpose was to determine the amount of variance explained by the interactional process beyond that explained by the situational context and caregiving burden. Data were collected from 209 elder-caregiver dyads using interviews, observations, and caregiver self-reports. The strongest predictors of caregiving burden were the caregiver's stressful negative life events (situational context) and discrepancy between past and present image of elder (interactional process). The strongest predictors of quality of elder caring were the caregiver's perception of subjective burden and a monitoring role definition on the part of the caregiver (interactional process).

  5. The Taguchi Method Application to Improve the Quality of a Sustainable Process

    NASA Astrophysics Data System (ADS)

    Titu, A. M.; Sandu, A. V.; Pop, A. B.; Titu, S.; Ciungu, T. C.

    2018-06-01

    Taguchi’s method has long been used to improve the quality of the processes and products under analysis. This research addresses an unusual situation, namely the modeling of certain technical parameters in a process that is intended to be sustainable, improving process quality and ensuring quality through an experimental research method. Modern experimental techniques can be applied in any field, and this study reflects the benefits of the interaction between the principles of agricultural sustainability and the application of Taguchi’s method. The experimental method used in this practical study combines engineering techniques with experimental statistical modeling to achieve rapid improvement of quality costs, in effect seeking optimization of the existing processes and of the main technical parameters. The paper is a purely technical study that promotes a technical experiment using the Taguchi method, considered an effective method since it allows 70 to 90% of the desired optimization of the technical parameters to be achieved rapidly. The missing 10 to 30 percent can be obtained with one or two complementary experiments, limited to the 2 to 4 technical parameters considered to be the most influential. Applying the Taguchi method, in engineering and beyond, allowed the simultaneous study, in the same experiment, of the influence factors considered most important in different combinations and, at the same time, the determination of each factor’s contribution.
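
    As a generic illustration of the Taguchi machinery referred to above (the array and responses are synthetic, not the study's data), the sketch below computes larger-the-better signal-to-noise ratios over a standard L9 orthogonal array and picks, for each factor, the level with the highest mean S/N.

```python
import numpy as np

# Standard L9(3^4) orthogonal array: 9 runs, 4 factors at 3 levels (coded 0, 1, 2).
L9 = np.array([[0, 0, 0, 0], [0, 1, 1, 1], [0, 2, 2, 2],
               [1, 0, 1, 2], [1, 1, 2, 0], [1, 2, 0, 1],
               [2, 0, 2, 1], [2, 1, 0, 2], [2, 2, 1, 0]])

# Hypothetical replicated responses for each run (larger is better).
rng = np.random.default_rng(4)
y = rng.uniform(60, 95, size=(9, 3))

# Larger-the-better signal-to-noise ratio: SN = -10 * log10(mean(1 / y^2)).
sn = -10 * np.log10(np.mean(1.0 / y ** 2, axis=1))

# Average S/N per factor level; the level with the highest mean S/N is chosen.
for factor in range(L9.shape[1]):
    level_means = [sn[L9[:, factor] == lvl].mean() for lvl in range(3)]
    best = int(np.argmax(level_means))
    print(f"factor {factor + 1}: best level = {best}, mean S/N = {np.round(level_means, 2)}")
```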

  6. Parametric identification of the process of preparing ceramic mixture as an object of control

    NASA Astrophysics Data System (ADS)

    Galitskov, Stanislav; Nazarov, Maxim; Galitskov, Konstantin

    2017-10-01

    The manufacture of ceramic materials and products depends largely on the preparation of the clay raw material. The main step here is mixing, which in industrial production is mostly done in continuously operating cross-compound clay mixers with steam humidification. The authors identified the dynamic features of this technological stage, which is in itself a non-linear control object with distributed parameters. When solving practical automation tasks for a certain class of ceramic materials production, it is important to perform parametric identification of the moving clay. In this paper the task is solved using computational models approximated over particular sections of the clay mixer along its length. The research introduces a methodology of computational experiments applied to the designed computational model. Parametric identification of the dynamic links was carried out from transient characteristics. The experiments showed that the control object in question is, to a great extent, non-stationary. The results are oriented toward the synthesis of a multidimensional automatic control system for preparing ceramic mixture with specified humidity and temperature values when the technological process is exposed to major disturbances.
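
    A common way to carry out this kind of parametric identification from a transient characteristic is to fit a low-order transfer-function model, for example first order plus dead time, to a recorded step response. The sketch below does this by least squares over the gain, time constant, and delay; the data and parameter values are hypothetical, not the ones identified in the study.

```python
import numpy as np
from scipy.optimize import curve_fit

def fopdt_step(t, k, tau, theta):
    """Step response of a first-order-plus-dead-time model K*exp(-theta*s)/(tau*s + 1)."""
    y = k * (1.0 - np.exp(-(t - theta) / tau))
    return np.where(t >= theta, y, 0.0)

# Hypothetical recorded transient characteristic, e.g. mixture humidity after a step in steam flow
t = np.linspace(0.0, 60.0, 121)
rng = np.random.default_rng(0)
y_meas = fopdt_step(t, 2.0, 12.0, 4.0) + 0.02 * rng.standard_normal(t.size)

# Least-squares fit of gain, time constant and dead time to the measured step response
(k, tau, theta), _ = curve_fit(fopdt_step, t, y_meas, p0=[1.0, 5.0, 1.0])
print(f"identified gain {k:.2f}, time constant {tau:.1f} s, dead time {theta:.1f} s")
```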

  7. Clinical characteristics and objective living conditions in relation to quality of life among community-based individuals of schizophrenia in Hong Kong.

    PubMed

    Yeung, Frederick Ka Ching; Chan, Sunny Ho Wan

    2006-11-01

    Quality of life (QOL) has gained importance as an outcome measure for people with schizophrenia living in the community following deinstitutionalization. This study aimed to explore the effects of clinical characteristics and objective living conditions on QOL. In this study, 201 community-based individuals with schizophrenia were recruited from five different types of objective living conditions: long stay care home, halfway house, supported hostel/housing, living with family, and living alone. Clinical characteristics, including cognitive abilities, symptom levels, and community/social functioning, were assessed by the Allen Cognitive Level Screen, the Scales for the Assessment of Negative Symptoms and Positive Symptoms, and the Chinese version of the Multnomah Community Ability Scale, respectively. QOL was measured by the Chinese version of the WHO Quality of Life Measure. Analysis of covariance showed significant differences in community/social functioning, cognitive abilities, and negative symptoms, but not in QOL, across the different objective living conditions. Further simultaneous multiple regressions found that community/social functioning was the most robust significant predictor of QOL. Caution is needed, however, when drawing conclusions for the long stay care home condition, as it appears to provide a protective element for the preservation of QOL.

  8. Dual-energy CT in patients with abdominal malignant lymphoma: impact of noise-optimised virtual monoenergetic imaging on objective and subjective image quality.

    PubMed

    Lenga, L; Czwikla, R; Wichmann, J L; Leithner, D; Albrecht, M H; D'Angelo, T; Arendt, C T; Booz, C; Hammerstingl, R; Vogl, T J; Martin, S S

    2018-06-05

    To investigate the impact of noise-optimised virtual monoenergetic imaging (VMI+) reconstructions on quantitative and qualitative image parameters in patients with malignant lymphoma at dual-energy computed tomography (DECT) examinations of the abdomen. Thirty-five consecutive patients (mean age, 53.8±18.6 years; range, 21-82 years) with histologically proven malignant lymphoma of the abdomen were included retrospectively. Images were post-processed with standard linear blending (M_0.6), traditional VMI, and VMI+ technique at energy levels ranging from 40 to 100 keV in 10 keV increments. Signal-to-noise (SNR) and contrast-to-noise ratios (CNR) were objectively measured in lymphoma lesions. Image quality, lesion delineation, and image noise were rated subjectively by three blinded observers using five-point Likert scales. Quantitative image quality parameters peaked at 40-keV VMI+ (SNR, 15.77±7.74; CNR, 18.27±8.04) with significant differences compared to standard linearly blended M_0.6 (SNR, 7.96±3.26; CNR, 13.55±3.47) and all traditional VMI series (p<0.001). Qualitative image quality assessment revealed significantly superior ratings for image quality at 60-keV VMI+ (median, 5) in comparison with all other image series (p<0.001). Assessment of lesion delineation showed the highest rating scores for 40-keV VMI+ series (median, 5), while lowest subjective image noise was found for 100-keV VMI+ reconstructions (median, 5). Low-keV VMI+ reconstructions led to improved image quality and lesion delineation of malignant lymphoma lesions compared to standard image reconstruction and traditional VMI at abdominal DECT examinations. Copyright © 2018 The Royal College of Radiologists. Published by Elsevier Ltd. All rights reserved.
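
    For reference, the SNR and CNR values reported in studies like this one are typically computed from region-of-interest (ROI) statistics: SNR as mean lesion attenuation divided by image noise, and CNR as the attenuation difference between lesion and background divided by noise. The sketch below shows one such calculation; the ROI values are hypothetical, and the exact definitions used by the authors may differ slightly.

```python
import numpy as np

def snr_cnr(lesion_roi, background_roi):
    """SNR = lesion mean / noise; CNR = (lesion mean - background mean) / noise,
    with noise taken as the standard deviation of the background ROI."""
    noise = np.std(background_roi)
    snr = np.mean(lesion_roi) / noise
    cnr = (np.mean(lesion_roi) - np.mean(background_roi)) / noise
    return snr, cnr

# Hypothetical ROI attenuation values (HU) from a low-keV reconstruction
lesion = np.array([182.0, 175.0, 190.0, 178.0, 185.0])
background = np.array([95.0, 102.0, 98.0, 105.0, 100.0])
print("SNR %.1f, CNR %.1f" % snr_cnr(lesion, background))
```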

  9. Representational Similarity Analysis Reveals Commonalities and Differences in the Semantic Processing of Words and Objects

    PubMed Central

    Devereux, Barry J.; Clarke, Alex; Marouchos, Andreas; Tyler, Lorraine K.

    2013-01-01

    Understanding the meanings of words and objects requires the activation of underlying conceptual representations. Semantic representations are often assumed to be coded such that meaning is evoked regardless of the input modality. However, the extent to which meaning is coded in modality-independent or amodal systems remains controversial. We address this issue in a human fMRI study investigating the neural processing of concepts, presented separately as written words and pictures. Activation maps for each individual word and picture were used as input for searchlight-based multivoxel pattern analyses. Representational similarity analysis was used to identify regions correlating with low-level visual models of the words and objects and the semantic category structure common to both. Common semantic category effects for both modalities were found in a left-lateralized network, including left posterior middle temporal gyrus (LpMTG), left angular gyrus, and left intraparietal sulcus (LIPS), in addition to object- and word-specific semantic processing in ventral temporal cortex and more anterior MTG, respectively. To explore differences in representational content across regions and modalities, we developed novel data-driven analyses, based on k-means clustering of searchlight dissimilarity matrices and seeded correlation analysis. These revealed subtle differences in the representations in semantic-sensitive regions, with representations in LIPS being relatively invariant to stimulus modality and representations in LpMTG being uncorrelated across modality. These results suggest that, although both LpMTG and LIPS are involved in semantic processing, only the functional role of LIPS is the same regardless of the visual input, whereas the functional role of LpMTG differs for words and objects. PMID:24285896

  10. Representational similarity analysis reveals commonalities and differences in the semantic processing of words and objects.

    PubMed

    Devereux, Barry J; Clarke, Alex; Marouchos, Andreas; Tyler, Lorraine K

    2013-11-27

    Understanding the meanings of words and objects requires the activation of underlying conceptual representations. Semantic representations are often assumed to be coded such that meaning is evoked regardless of the input modality. However, the extent to which meaning is coded in modality-independent or amodal systems remains controversial. We address this issue in a human fMRI study investigating the neural processing of concepts, presented separately as written words and pictures. Activation maps for each individual word and picture were used as input for searchlight-based multivoxel pattern analyses. Representational similarity analysis was used to identify regions correlating with low-level visual models of the words and objects and the semantic category structure common to both. Common semantic category effects for both modalities were found in a left-lateralized network, including left posterior middle temporal gyrus (LpMTG), left angular gyrus, and left intraparietal sulcus (LIPS), in addition to object- and word-specific semantic processing in ventral temporal cortex and more anterior MTG, respectively. To explore differences in representational content across regions and modalities, we developed novel data-driven analyses, based on k-means clustering of searchlight dissimilarity matrices and seeded correlation analysis. These revealed subtle differences in the representations in semantic-sensitive regions, with representations in LIPS being relatively invariant to stimulus modality and representations in LpMTG being uncorrelated across modality. These results suggest that, although both LpMTG and LIPS are involved in semantic processing, only the functional role of LIPS is the same regardless of the visual input, whereas the functional role of LpMTG differs for words and objects.
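
    The core step of representational similarity analysis can be sketched in a few lines: build a dissimilarity matrix from condition-wise activation patterns (for example, 1 minus the Pearson correlation) and compare its upper triangle with a model dissimilarity matrix using Spearman correlation. The array shapes and the category model below are illustrative assumptions, not the searchlight pipeline used in the paper.

```python
import numpy as np
from scipy.spatial.distance import pdist, squareform
from scipy.stats import spearmanr

rng = np.random.default_rng(0)

# Hypothetical data: 20 stimuli (words or pictures) x 50 voxels in one searchlight
patterns = rng.standard_normal((20, 50))

# Neural representational dissimilarity matrix: 1 - Pearson correlation between stimuli
neural_rdm = squareform(pdist(patterns, metric="correlation"))

# Hypothetical semantic-category model RDM: 0 within category, 1 between categories
categories = np.repeat([0, 1, 2, 3], 5)
model_rdm = (categories[:, None] != categories[None, :]).astype(float)

# Compare the upper triangles of the two RDMs with Spearman correlation
iu = np.triu_indices(20, k=1)
rho, p = spearmanr(neural_rdm[iu], model_rdm[iu])
print(f"model fit: Spearman rho = {rho:.2f} (p = {p:.3f})")
```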

  11. 3-d interpolation in object perception: evidence from an objective performance paradigm.

    PubMed

    Kellman, Philip J; Garrigan, Patrick; Shipley, Thomas F; Yin, Carol; Machado, Liana

    2005-06-01

    Object perception requires interpolation processes that connect visible regions despite spatial gaps. Some research has suggested that interpolation may be a 3-D process, but objective performance data and evidence about the conditions leading to interpolation are needed. The authors developed an objective performance paradigm for testing 3-D interpolation and tested a new theory of 3-D contour interpolation, termed 3-D relatability. The theory indicates, for a given edge, which orientations and positions of other edges in space may be connected to it by interpolation. Results of 5 experiments showed that processing of orientation relations in 3-D relatable displays was superior to processing in 3-D nonrelatable displays and that these effects depended on object formation. 3-D interpolation and 3-D relatability are discussed in terms of their implications for computational and neural models of object perception, which have typically been based on 2-D-orientation-sensitive units. ((c) 2005 APA, all rights reserved).

  12. Recovery from Object Substitution Masking Induced by Transient Suppression of Visual Motion Processing: A Repetitive Transcranial Magnetic Stimulation Study

    ERIC Educational Resources Information Center

    Hirose, Nobuyuki; Kihara, Ken; Mima, Tatsuya; Ueki, Yoshino; Fukuyama, Hidenao; Osaka, Naoyuki

    2007-01-01

    Object substitution masking is a form of visual backward masking in which a briefly presented target is rendered invisible by a lingering mask that is too sparse to produce lower image-level interference. Recent studies suggested the importance of an updating process in a higher object-level representation, which should rely on the processing of…

  13. Conceptual Coherence Affects Phonological Activation of Context Objects during Object Naming

    ERIC Educational Resources Information Center

    Oppermann, Frank; Jescheniak, Jorg D.; Schriefers, Herbert

    2008-01-01

    In 4 picture-word interference experiments, speakers named a target object that was presented with a context object. Using auditory distractors that were phonologically related or unrelated either to the target object or the context object, the authors assessed whether phonological processing was confined to the target object or not. Phonological…

  14. Objective and Subjective Socioeconomic Gradients Exist for Sleep Quality, Sleep Latency, Sleep Duration, Weekend Oversleep, and Daytime Sleepiness in Adults

    PubMed Central

    Jarrin, Denise Christina; McGrath, Jennifer J.; Silverstein, Janice E.; Drake, Christopher

    2017-01-01

    Socioeconomic gradients exist for multiple health outcomes. Lower objective socioeconomic position (SEP), whether measured by income, education, or occupation, is associated with inadequate sleep. Less is known about whether one’s perceived ranking of their social status, or subjective SEP, affects sleep. This study examined whether a subjective socioeconomic gradient exists for sleep while controlling for objective SEP. Participants (N = 177; age, M = 45.3 years, SD = 6.3 years) completed the Pittsburgh Sleep Quality Index, Epworth Sleepiness Scale, MacArthur Ladder, and other self-report measures to assess sleep and objective SEP. Subjective SEP trumped objective SEP as a better predictor of sleep duration, daytime sleepiness, and weekend oversleep. These findings highlight the need to expand our framework to better understand the mechanisms underlying socioeconomic gradients and sleep. PMID:23136841

  15. Hydrochemical processes regulating groundwater quality in the coastal plain of Al Musanaah, Sultanate of Oman

    NASA Astrophysics Data System (ADS)

    Askri, Brahim

    2015-06-01

    The Al Batinah coastal aquifer is the principal source of water in northwestern Oman. The rainfall in the Jabal Al Akhdar mountain region recharges the plain with freshwater that allowed agricultural and industrial activities to develop. The over-exploitation of this aquifer since the 1970s for municipal, agricultural and industrial purposes, excessive use of fertilizers in agriculture and leakage from septic tanks led to the deterioration of groundwater quality. The objective of this study was to investigate the hydrochemical processes regulating the groundwater quality in the southwestern section of Al Batinah. From available data collected during the spring of 2010 from 58 wells located in Al Musanaah wilayat, it was determined that the groundwater salinity increased in the direction from the south to the north following the regional flow direction. In addition to salinisation, the groundwater in the upstream and intermediate regions was contaminated with nitrate, while groundwater in the downstream region was affected by fluoride. Calculations of ionic ratios and seawater fraction indicated that seawater intrusion was not dominant in the study area. The primary factors controlling the groundwater chemistry in Al Musanaah appear to be halite dissolution, reverse ion exchange with clay material and anthropogenic pollutants.

  16. Multi-objective optimization model of CNC machining to minimize processing time and environmental impact

    NASA Astrophysics Data System (ADS)

    Hamada, Aulia; Rosyidi, Cucuk Nur; Jauhari, Wakhid Ahmad

    2017-11-01

    Minimizing processing time in a production system can increase the efficiency of a manufacturing company. Processing time is influenced by the technology applied and by the machining parameters. One way to apply modern technology is CNC machining, and one machining operation that can be performed on a CNC machine is turning. However, the machining parameters affect not only the processing time but also the environmental impact. Hence, an optimization model is needed to select machining parameters that minimize both processing time and environmental impact. This research developed a multi-objective optimization model to minimize processing time and environmental impact in the CNC turning process, yielding optimal values of the decision variables cutting speed and feed rate. Environmental impact is converted from environmental burden through the use of eco-indicator 99. The model was solved using the OptQuest optimization software from Oracle Crystal Ball.
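
    To make the idea concrete, a bi-objective turning model of this kind can be explored with a simple weighted-sum scan over cutting speed and feed rate, trading machining time against an environmental-impact score. The formulas, coefficients, and bounds below are hypothetical placeholders, not the model or eco-indicator 99 data from the study.

```python
import numpy as np

def machining_time(v, f, d=40.0, length=100.0, passes=2):
    """Rough turning time (min) for a hypothetical part: time per pass = pi*d*L / (1000*v*f)."""
    return passes * np.pi * d * length / (1000.0 * v * f)

def environmental_impact(v, f):
    """Hypothetical eco-score: an energy-dominated term that grows with speed and drops with feed."""
    return 0.002 * v**1.2 / f

# Grid of candidate cutting speeds (m/min) and feed rates (mm/rev)
V, F = np.meshgrid(np.linspace(100, 300, 41), np.linspace(0.1, 0.4, 31))
t, e = machining_time(V, F), environmental_impact(V, F)

# Weighted-sum scalarization of the two (normalized) objectives
for w in (0.2, 0.5, 0.8):
    score = w * t / t.max() + (1 - w) * e / e.max()
    i = np.unravel_index(np.argmin(score), score.shape)
    print(f"w={w}: v={V[i]:.0f} m/min, f={F[i]:.2f} mm/rev, "
          f"time={t[i]:.2f} min, eco={e[i]:.2f}")
```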

  17. Examining the Relationship Between Nursing Informatics Competency and the Quality of Information Processing.

    PubMed

    Al-Hawamdih, Sajidah; Ahmad, Muayyad M

    2018-03-01

    The purpose of this study was to examine nursing informatics competency and the quality of information processing among nurses in Jordan. The study was conducted in a large hospital with 380 registered nurses. The hospital introduced the electronic health record in 2010. The measures used in this study were personal and job characteristics, self-efficacy, the Self-Assessment of Nursing Informatics Competencies, and the Health Information System Monitoring Questionnaire. The convenience sample consisted of 99 nurses who had used the electronic health record for at least 3 months. The analysis showed that nine predictors explained 22% of the variance in the quality of information processing; the statistically significant predictors were nursing informatics competency, clinical specialty, and years of nursing experience. There is a need for policies that advocate for every nurse to be educated in nursing informatics and the quality of information processing.

  18. Principles and Practices for Quality Assurance and Quality Control

    USGS Publications Warehouse

    Jones, Berwyn E.

    1999-01-01

    Quality assurance and quality control are vital parts of highway runoff water-quality monitoring projects. To be effective, project quality assurance must address all aspects of the project, including project management responsibilities and resources, data quality objectives, sampling and analysis plans, data-collection protocols, data quality-control plans, data-assessment procedures and requirements, and project outputs. Quality control ensures that the data quality objectives are achieved as planned. The historical development and current state of the art of quality assurance and quality control concepts described in this report can be applied to evaluation of data from prior projects.

  19. Nixtamalized flour from quality protein maize (Zea mays L). optimization of alkaline processing.

    PubMed

    Milán-Carrillo, J; Gutiérrez-Dorado, R; Cuevas-Rodríguez, E O; Garzón-Tiznado, J A; Reyes-Moreno, C

    2004-01-01

    The quality of maize proteins is poor; they are deficient in the essential amino acids lysine and tryptophan. Recently, 26 new nutritionally improved hybrids and cultivars called quality protein maize (QPM), which contain greater amounts of lysine and tryptophan, were successfully developed in Mexico. Alkaline cooking of maize with lime (nixtamalization) is the first step in producing several maize products (masa, tortillas, flours, snacks). Processors adjust nixtamalization variables based on experience. The objective of this work was to determine the best combination of nixtamalization process variables for producing nixtamalized maize flour (NMF) from the QPM V-537 variety. Nixtamalization conditions were selected from factorial combinations of the process variables nixtamalization time (NT, 20-85 min), lime concentration (LC, 3.3-6.7 g Ca(OH)2/l, in distilled water), and steep time (ST, 8-16 hours). Nixtamalization temperature and the ratio of grain to cooking medium were 85 degrees C and 1:3 (w/v), respectively. At the end of each cooking treatment, steeping proceeded for the required time and was finished by draining the cooking liquor (nejayote). The nixtamal (alkaline-cooked maize kernels) was washed with running tap water, dried (24 hours, 55 degrees C) and milled to pass through an 80-US mesh screen to obtain NMF. Response surface methodology (RSM) was applied as the optimization technique over four response variables: in vitro protein digestibility (PD), total color difference (deltaE), water absorption index (WAI), and pH. Predictive models for the response variables were developed as functions of the process variables. A conventional graphical method was applied to obtain maximum PD and WAI and minimum deltaE and pH. Contour plots of each response variable were superimposed to select the best combination of NT (31 min), LC (5.4 g Ca(OH)2/l), and ST (8.1 hours) for producing
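
    Response surface methodology of the type described here usually fits a second-order polynomial in the coded process variables and then locates the settings that optimize each response. The sketch below fits such a quadratic model by least squares and searches a grid for the best predicted setting; the design points and digestibility values are hypothetical, not the published data.

```python
import numpy as np

# Hypothetical coded design points for nixtamalization time (NT), lime concentration (LC), steep time (ST)
X = np.array([[-1, -1, -1], [1, -1, -1], [-1, 1, -1], [1, 1, -1],
              [-1, -1, 1], [1, -1, 1], [-1, 1, 1], [1, 1, 1],
              [0, 0, 0], [0, 0, 0]], dtype=float)
y = np.array([78.1, 80.2, 79.5, 81.0, 77.8, 79.9, 78.7, 80.5, 82.3, 82.1])  # in vitro protein digestibility, %

def quad_terms(X):
    """Second-order model terms: intercept, linear, squared, and two-way interaction terms."""
    nt, lc, st = X.T
    return np.column_stack([np.ones(len(X)), nt, lc, st,
                            nt**2, lc**2, st**2, nt*lc, nt*st, lc*st])

# Least-squares fit of the quadratic response surface
beta, *_ = np.linalg.lstsq(quad_terms(X), y, rcond=None)

# Evaluate the fitted surface on a grid and report the coded setting with the highest predicted PD
grid = np.array(np.meshgrid(*[np.linspace(-1, 1, 21)] * 3)).reshape(3, -1).T
pred = quad_terms(grid) @ beta
best = grid[np.argmax(pred)]
print("best coded (NT, LC, ST):", best, "predicted PD: %.1f%%" % pred.max())
```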

  20. SHERPA: an image segmentation and outline feature extraction tool for diatoms and other objects.

    PubMed

    Kloster, Michael; Kauer, Gerhard; Beszteri, Bánk

    2014-06-25

    Light microscopic analysis of diatom frustules is widely used both in basic and applied research, notably taxonomy, morphometrics, water quality monitoring and paleo-environmental studies. In these applications, usually large numbers of frustules need to be identified and/or measured. Although there is a need for automation in these applications, and image processing and analysis methods supporting these tasks have previously been developed, they did not become widespread in diatom analysis. While methodological reports for a wide variety of methods for image segmentation, diatom identification and feature extraction are available, no single implementation combining a subset of these into a readily applicable workflow accessible to diatomists exists. The newly developed tool SHERPA offers a versatile image processing workflow focused on the identification and measurement of object outlines, handling all steps from image segmentation over object identification to feature extraction, and providing interactive functions for reviewing and revising results. Special attention was given to ease of use, applicability to a broad range of data and problems, and supporting high throughput analyses with minimal manual intervention. Tested with several diatom datasets from different sources and of various compositions, SHERPA proved its ability to successfully analyze large amounts of diatom micrographs depicting a broad range of species. SHERPA is unique in combining the following features: application of multiple segmentation methods and selection of the one giving the best result for each individual object; identification of shapes of interest based on outline matching against a template library; quality scoring and ranking of resulting outlines supporting quick quality checking; extraction of a wide range of outline shape descriptors widely used in diatom studies and elsewhere; minimizing the need for, but enabling manual quality control and corrections. Although primarily

  1. Internal Quality Assurance Benchmarking. ENQA Workshop Report 20

    ERIC Educational Resources Information Center

    Blackstock, Douglas; Burquel, Nadine; Comet, Nuria; Kajaste, Matti; dos Santos, Sergio Machado; Marcos, Sandra; Moser, Marion; Ponds, Henri; Scheuthle, Harald; Sixto, Luis Carlos Velon

    2012-01-01

    The Internal Quality Assurance group of ENQA (IQA Group) has been organising a yearly seminar for its members since 2007. The main objective is to share experiences concerning the internal quality assurance of work processes in the participating agencies. The overarching theme of the 2011 seminar was how to use benchmarking as a tool for…

  2. The utilization of six sigma and statistical process control techniques in surgical quality improvement.

    PubMed

    Sedlack, Jeffrey D

    2010-01-01

    Surgeons have been slow to incorporate industrial reliability techniques. Process control methods were applied to surgeon waiting time between cases and to length of stay (LOS) after colon surgery. Waiting times between surgeries were evaluated by auditing the operating room records of a single hospital over a 1-month period. The medical records of 628 patients undergoing colon surgery over a 5-year period were reviewed. The average surgeon wait time between cases was 53 min, and the busiest surgeon spent 29.5 hr in 1 month waiting between surgeries. Process control charting demonstrated poor overall control of the room turnover process. Average LOS after colon resection also demonstrated very poor control. Mean LOS was 10 days. Weibull's conditional analysis revealed a conditional LOS of 9.83 days. Serious process management problems were identified in both analyses. These process issues are expensive and adversely affect the quality of service offered by the institution. Process control mechanisms were suggested or implemented to improve these surgical processes. Industrial reliability and quality management tools can easily and effectively identify process control problems that occur on surgical services. © 2010 National Association for Healthcare Quality.
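
    The control charting referred to here can be reproduced with an individuals (X) chart: plot each observation against a center line at the mean, with control limits at plus or minus three times a short-term variability estimate derived from the average moving range. The waiting-time values below are hypothetical, not the audited data.

```python
import numpy as np

# Hypothetical surgeon waiting times between cases (minutes)
x = np.array([48.0, 55.0, 62.0, 40.0, 71.0, 53.0, 90.0, 47.0, 58.0, 66.0, 44.0, 52.0])

# Individuals chart: sigma estimated from the average moving range (d2 = 1.128 for n = 2)
mr = np.abs(np.diff(x))
sigma = mr.mean() / 1.128
center = x.mean()
ucl, lcl = center + 3 * sigma, max(center - 3 * sigma, 0.0)

print(f"center line {center:.1f} min, UCL {ucl:.1f} min, LCL {lcl:.1f} min")
print("out-of-control points:", np.where((x > ucl) | (x < lcl))[0])
```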

  3. Low-cost oblique illumination: an image quality assessment.

    PubMed

    Ruiz-Santaquiteria, Jesus; Espinosa-Aranda, Jose Luis; Deniz, Oscar; Sanchez, Carlos; Borrego-Ramos, Maria; Blanco, Saul; Cristobal, Gabriel; Bueno, Gloria

    2018-01-01

    We study the effectiveness of several low-cost oblique illumination filters for improving overall image quality, in comparison with standard bright field imaging. For this purpose, a dataset composed of 3360 diatom images belonging to 21 taxa was acquired. Subjective and objective image quality assessments were performed. The subjective evaluation was carried out by a group of diatom experts using a psychophysical test in which resolution, focus, and contrast were assessed. Moreover, several objective no-reference image quality metrics were applied to the same image dataset to complete the study, together with the calculation of several texture features to analyze the effect of these filters on textural properties. Both image quality evaluation methods, subjective and objective, showed better results for images acquired using these illumination filters in comparison with the unfiltered image. These promising results confirm that this kind of illumination filter can be a practical way to improve image quality, thanks to the simplicity and low cost of the design and manufacturing process. (2018) COPYRIGHT Society of Photo-Optical Instrumentation Engineers (SPIE).
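
    One widely used no-reference metric of the kind applied in such studies is the variance of the Laplacian, which scores how much high-frequency (in-focus) detail an image contains. The sketch below computes it on a synthetic image; it only illustrates the idea and is not the specific metric set used by the authors.

```python
import numpy as np
from scipy.ndimage import convolve

def variance_of_laplacian(img):
    """No-reference focus measure: higher values indicate sharper (better focused) images."""
    laplacian_kernel = np.array([[0, 1, 0],
                                 [1, -4, 1],
                                 [0, 1, 0]], dtype=float)
    return convolve(img.astype(float), laplacian_kernel, mode="reflect").var()

# Hypothetical example: a sharp random texture versus a blurred copy of it
rng = np.random.default_rng(0)
sharp = rng.random((256, 256))
blurred = convolve(sharp, np.ones((5, 5)) / 25.0, mode="reflect")
print("sharp: %.4f  blurred: %.4f" % (variance_of_laplacian(sharp), variance_of_laplacian(blurred)))
```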

  4. Self-concept and quality of object relations as predictors of outcome in short- and long-term psychotherapy.

    PubMed

    Lindfors, Olavi; Knekt, Paul; Heinonen, Erkki; Virtala, Esa

    2014-01-01

    Quality of object relations and self-concept reflect clinically relevant aspects of personality functioning, but their prediction as suitability factors for psychotherapies of different lengths has not been compared. This study compared their prediction on psychiatric symptoms and work ability in short- and long-term psychotherapy. Altogether 326 patients, 20-46 years of age, with mood and/or anxiety disorder, were randomized to short-term (solution-focused or short-term psychodynamic) psychotherapy and long-term psychodynamic psychotherapy. The Quality of Object Relations Scale (QORS) and the Structural Analysis of Social Behavior (SASB) self-concept questionnaire were measured at baseline, and their prediction on outcome during the 3-year follow-up was assessed by the Symptom Check List Global Severity Index and the Anxiety Scale, the Beck Depression Inventory and by the Work Ability Index, Social Adjustment Scale work subscale and the Perceived Psychological Functioning scale. Negative self-concept strongly and self-controlling characteristics modestly predicted better 3-year outcomes in long-term therapy, after faster early gains in short-term therapy. Patients with a more positive or self-emancipating self-concept, or more mature object relations, experienced more extensive benefits after long-term psychotherapy. The importance of length vs. long-term therapy technique on the differences found is not known. Patients with mild to moderate personality pathology, indicated by poor self-concept, seem to benefit more from long-term than short-term psychotherapy, in reducing risk of depression. Long-term therapy may also be indicated for patients with relatively good psychological functioning. More research is needed on the relative importance of these characteristics in comparison with other patient-related factors. © 2013 Published by Elsevier B.V.

  5. DESCRIPTION OF ATMOSPHERIC TRANSPORT PROCESSES IN EULERIAN AIR QUALITY MODELS

    EPA Science Inventory

    Key differences among many types of air quality models are the way atmospheric advection and turbulent diffusion processes are treated. Gaussian models use analytical solutions of the advection-diffusion equations. Lagrangian models use a hypothetical air parcel concept effecti...

  6. Quality by Design approach for studying the impact of formulation and process variables on product quality of oral disintegrating films.

    PubMed

    Mazumder, Sonal; Pavurala, Naresh; Manda, Prashanth; Xu, Xiaoming; Cruz, Celia N; Krishnaiah, Yellela S R

    2017-07-15

    The present investigation was carried out to understand the impact of formulation and process variables on the quality of oral disintegrating films (ODF) using Quality by Design (QbD) approach. Lamotrigine (LMT) was used as a model drug. Formulation variable was plasticizer to film former ratio and process variables were drying temperature, air flow rate in the drying chamber, drying time and wet coat thickness of the film. A Definitive Screening Design of Experiments (DoE) was used to identify and classify the critical formulation and process variables impacting critical quality attributes (CQA). A total of 14 laboratory-scale DoE formulations were prepared and evaluated for mechanical properties (%elongation at break, yield stress, Young's modulus, folding endurance) and other CQA (dry thickness, disintegration time, dissolution rate, moisture content, moisture uptake, drug assay and drug content uniformity). The main factors affecting mechanical properties were plasticizer to film former ratio and drying temperature. Dissolution rate was found to be sensitive to air flow rate during drying and plasticizer to film former ratio. Data were analyzed for elucidating interactions between different variables, rank ordering the critical materials attributes (CMA) and critical process parameters (CPP), and for providing a predictive model for the process. Results suggested that plasticizer to film former ratio and process controls on drying are critical to manufacture LMT ODF with the desired CQA. Published by Elsevier B.V.

  7. Modelling and control for laser based welding processes: modern methods of process control to improve quality of laser-based joining methods

    NASA Astrophysics Data System (ADS)

    Zäh, Ralf-Kilian; Mosbach, Benedikt; Hollwich, Jan; Faupel, Benedikt

    2017-02-01

    To ensure the competitiveness of manufacturing companies, it is indispensable to optimize their manufacturing processes. Slight variations of process parameters and machine settings should have only marginal effects on product quality; therefore, the largest possible processing window is required. Such parameters include, for example, the movement of the laser beam across the component in laser keyhole welding. That is why it is necessary to keep the formation of weld seams within specified limits. The quality of laser welding processes is therefore ensured by post-process methods, such as ultrasonic inspection, or by special in-process methods. These in-process systems achieve only a simple evaluation that shows whether the weld seam is acceptable or not, and they provide no feedback for changing control variables such as laser speed or laser power. In this paper the research group presents current results from the research field of online monitoring, online control, and model predictive control in laser welding processes to increase product quality. To record the characteristics of the welding process, proven online methods are used during the process. Based on the measurement data, a state-space model is identified that includes all the control variables of the system. Using simulation tools, the model predictive controller (MPC) is designed for this model and integrated into an NI real-time system.
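
    A minimal way to picture the state-space / model predictive control setup described here is a discrete linear model x_{k+1} = A x_k + B u_k with a receding-horizon controller that, at every step, picks the input sequence minimizing a tracking-plus-effort cost and applies only its first element. The model matrices, horizon, weights, and state meanings below are illustrative assumptions, not the identified welding model.

```python
import numpy as np
from itertools import product

# Hypothetical identified model: x = [seam temperature deviation, melt-pool depth deviation]
A = np.array([[0.9, 0.1],
              [0.0, 0.8]])
B = np.array([[0.05],
              [0.10]])                  # input: laser power adjustment
Q, R, horizon = np.diag([1.0, 2.0]), 0.01, 4
candidates = np.linspace(-1.0, 1.0, 9)  # coarse grid of admissible power adjustments

def mpc_step(x):
    """Enumerate input sequences over the horizon and return the first input of the best one."""
    best_u, best_cost = 0.0, np.inf
    for seq in product(candidates, repeat=horizon):
        xk, cost = x.copy(), 0.0
        for u in seq:
            xk = A @ xk + B.flatten() * u
            cost += xk @ Q @ xk + R * u * u
        if cost < best_cost:
            best_cost, best_u = cost, seq[0]
    return best_u

x = np.array([1.0, -0.5])   # initial deviation from the desired weld state
for k in range(10):
    u = mpc_step(x)
    x = A @ x + B.flatten() * u
print("state after 10 controlled steps:", np.round(x, 3))
```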

  8. Image Processing Strategies Based on a Visual Saliency Model for Object Recognition Under Simulated Prosthetic Vision.

    PubMed

    Wang, Jing; Li, Heng; Fu, Weizhen; Chen, Yao; Li, Liming; Lyu, Qing; Han, Tingting; Chai, Xinyu

    2016-01-01

    Retinal prostheses have the potential to restore partial vision. Object recognition in scenes of daily life is one of the essential tasks for implant wearers. Because wearers are still limited by the low-resolution visual percepts provided by retinal prostheses, it is important to investigate and apply image processing methods that convey more useful visual information. We proposed two image processing strategies based on Itti's visual saliency map, region of interest (ROI) extraction, and image segmentation. Itti's saliency model generated a saliency map from the original image, in which salient regions were grouped into an ROI by fuzzy c-means clustering. Grabcut then generated a proto-object from the ROI-labeled image, which was recombined with the background and enhanced in two ways: 8-4 separated pixelization (8-4 SP) and background edge extraction (BEE). Results showed that both 8-4 SP and BEE had significantly higher recognition accuracy in comparison with direct pixelization (DP). Each saliency-based image processing strategy was subject to the performance of image segmentation. Under good and perfect segmentation conditions, BEE and 8-4 SP obtained noticeably higher recognition accuracy than DP, and under the bad segmentation condition, only BEE boosted performance. The application of saliency-based image processing strategies was verified to be beneficial to object recognition in daily scenes under simulated prosthetic vision. They are hoped to help the development of the image processing module for future retinal prostheses, and thus provide more benefit for the patients. Copyright © 2015 International Center for Artificial Organs and Transplantation and Wiley Periodicals, Inc.

  9. The objective vocal quality, vocal risk factors, vocal complaints, and corporal pain in Dutch female students training to be speech-language pathologists during the 4 years of study.

    PubMed

    Van Lierde, Kristiane M; D'haeseleer, Evelien; Wuyts, Floris L; De Ley, Sophia; Geldof, Ruben; De Vuyst, Julie; Sofie, Claeys

    2010-09-01

    The purpose of the present cross-sectional study was to determine the objective vocal quality and the vocal characteristics (vocal risk factors, vocal and corporal complaints) in 197 female students in speech-language pathology during the 4 years of study. The objective vocal quality was measured by means of the Dysphonia Severity Index (DSI). Perceptual voice assessment, the Voice Handicap Index (VHI), questionnaires addressing vocal risks, and vocal and corporal complaints during and/or after voice usage were performed. Speech-language pathology (SLP) students have a borderline vocal quality corresponding to a DSI% of 68. The analysis of variance revealed no significant change of the objective vocal quality between the first bachelor year and the master year. No psychosocial handicapping effect of the voice was observed by means of the VHI total, though there was an effect at the functional VHI level in addition to some vocal complaints. Ninety-three percent of the student SLPs reported the presence of corporal pain during and/or after speaking. In particular, sore throat and headache were mentioned as the prevalent corporal pain symptoms. A longitudinal study of the objective vocal quality of the same subjects during their career as an SLP might provide new insights. 2010 The Voice Foundation. Published by Mosby, Inc. All rights reserved.

  10. Head and Eye Movements Affect Object Processing in 4-Month-Old Infants More than an Artificial Orientation Cue

    ERIC Educational Resources Information Center

    Wahl, Sebastian; Michel, Christine; Pauen, Sabina; Hoehl, Stefanie

    2013-01-01

    This study investigates the effects of attention-guiding stimuli on 4-month-old infants' object processing. In the human head condition, infants saw a person turning her head and eye gaze towards or away from objects. When presented with the objects again, infants showed increased attention in terms of longer looking time measured by eye…

  11. Improving Multi-Objective Management of Water Quality Tipping Points: Revisiting the Classical Shallow Lake Problem

    NASA Astrophysics Data System (ADS)

    Quinn, J. D.; Reed, P. M.; Keller, K.

    2015-12-01

    Recent multi-objective extensions of the classical shallow lake problem are useful for exploring the conceptual and computational challenges that emerge when managing irreversible water quality tipping points. Building on this work, we explore a four objective version of the lake problem where a hypothetical town derives economic benefits from polluting a nearby lake, but at the risk of irreversibly tipping the lake into a permanently polluted state. The trophic state of the lake exhibits non-linear threshold dynamics; below some critical phosphorus (P) threshold it is healthy and oligotrophic, but above this threshold it is irreversibly eutrophic. The town must decide how much P to discharge each year, a decision complicated by uncertainty in the natural P inflow to the lake. The shallow lake problem provides a conceptually rich set of dynamics, low computational demands, and a high level of mathematical difficulty. These properties maximize its value for benchmarking the relative merits and limitations of emerging decision support frameworks, such as Direct Policy Search (DPS). Here, we explore the use of DPS as a formal means of developing robust environmental pollution control rules that effectively account for deeply uncertain system states and conflicting objectives. The DPS reformulation of the shallow lake problem shows promise in formalizing pollution control triggers and signposts, while dramatically reducing the computational complexity of the multi-objective pollution control problem. More broadly, the insights from the DPS variant of the shallow lake problem formulated in this study bridge emerging work related to socio-ecological systems management, tipping points, robust decision making, and robust control.
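
    The dynamics that make the lake problem a useful benchmark can be written in one recurrence: next year's phosphorus level equals this year's level, plus the town's discharge and an uncertain natural inflow, plus nonlinear internal recycling, minus first-order losses. The sketch below simulates that recurrence under a constant discharge policy and reports a reliability-style objective; the parameter values follow the commonly used textbook formulation and are assumptions here, not the exact settings of this study.

```python
import numpy as np
from scipy.optimize import brentq

def lake_reliability(discharge, years=100, b=0.42, q=2.0, n_samples=1000, seed=1):
    """Simulate X_{t+1} = X_t + a_t + X_t^q/(1+X_t^q) - b*X_t + natural inflow for a constant
    anthropogenic discharge a_t, and return the fraction of simulated lake-years kept below
    the critical phosphorus threshold (the unstable equilibrium of the natural dynamics)."""
    p_crit = brentq(lambda x: x**q / (1 + x**q) - b * x, 0.1, 1.5)
    rng = np.random.default_rng(seed)
    x = np.zeros(n_samples)
    below = 0
    for _ in range(years):
        inflow = rng.lognormal(mean=np.log(0.02), sigma=0.2, size=n_samples)
        x = x + discharge + x**q / (1 + x**q) - b * x + inflow
        below += np.count_nonzero(x < p_crit)
    return below / (years * n_samples)

for a in (0.02, 0.05, 0.10):
    print(f"constant discharge {a:.2f}: reliability {lake_reliability(a):.2f}")
```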

  12. Improving Learning Object Quality: Moodle HEODAR Implementation

    ERIC Educational Resources Information Center

    Munoz, Carlos; Garcia-Penalvo, Francisco J.; Morales, Erla Mariela; Conde, Miguel Angel; Seoane, Antonio M.

    2012-01-01

    Automation toward efficiency is the aim of most intelligent systems in an educational context in which results calculation automation that allows experts to spend most of their time on important tasks, not on retrieving, ordering, and interpreting information. In this paper, the authors provide a tool that easily evaluates Learning Objects quality…

  13. Effects of marketing group on the quality of fresh and cured hams sourced from a commercial processing facility

    USDA-ARS?s Scientific Manuscript database

    The objectives were: 1) to characterize the effect of marketing group on fresh and cured ham quality, and 2) to determine which fresh ham traits correlated with cured ham quality traits. Pigs raised in 8 barns representing two seasons (hot and cold) and two production focuses (lean and quality) were ...

  14. Recent developments in minimal processing: a tool to retain nutritional quality of food.

    PubMed

    Pasha, Imran; Saeed, Farhan; Sultan, M Tauseef; Khan, Moazzam Rafiq; Rohi, Madiha

    2014-01-01

    The modernization of the last century resulted in urbanization coupled with modifications in lifestyles and dietary habits. In the same era, industrial developments made it easier to meet the demand for processed foods. However, consumers are now interested in minimally processed foods owing to their increased awareness and desire for fruits and vegetables of superior quality and natural integrity, with fewer additives. Food products deteriorate as a consequence of physiological aging, biochemical changes, high respiration rate, and high ethylene production. These factors contribute substantially to discoloration, loss of firmness, development of off-flavors, acidification, and microbial spoilage. At the same time, food processors are using emerging approaches to process perishable commodities while maintaining their nutritional and sensory quality. The present review article surveys modern approaches that minimize processing and deterioration. The techniques discussed in this paper include chlorination, ozonation, irradiation, photosensitization, edible coating, use of natural preservatives, high-pressure processing, microwave heating, ohmic heating, and hurdle technology. The consequences of these techniques for shelf-life stability, microbial safety, preservation of organoleptic and nutritional quality, and residue avoidance are the highlights of the paper. Moreover, the feasibility and operability of these techniques in modern-day processing are discussed.

  15. Objects and events as determinants of parallel processing in dual tasks: evidence from the backward compatibility effect.

    PubMed

    Ellenbogen, Ravid; Meiran, Nachshon

    2011-02-01

    The backward-compatibility effect (BCE) is a major index of parallel processing in dual tasks and is related to the dependency of Task 1 performance on Task 2 response codes (Hommel, 1998). The results of four dual-task experiments showed that a BCE occurs when the stimuli of both tasks are included in the same visual object (Experiments 1 and 2) or belong to the same perceptual event (Experiments 3 and 4). Thus, the BCE may be modulated by factors that influence whether both task stimuli are included in the same perceptual event (objects, as studied in cognitive experiments, being special cases of events). As with objects, drawing attention to a (selected) event results in the processing of its irrelevant features and may interfere with task execution. (c) 2010 APA, all rights reserved.

  16. Effects of Processing Parameters on the Forming Quality of C-Shaped Thermosetting Composite Laminates in Hot Diaphragm Forming Process

    NASA Astrophysics Data System (ADS)

    Bian, X. X.; Gu, Y. Z.; Sun, J.; Li, M.; Liu, W. P.; Zhang, Z. G.

    2013-10-01

    In this study, the effects of processing temperature and vacuum applying rate on the forming quality of C-shaped carbon fiber reinforced epoxy resin matrix composite laminates during hot diaphragm forming process were investigated. C-shaped prepreg preforms were produced using a home-made hot diaphragm forming equipment. The thickness variations of the preforms and the manufacturing defects after diaphragm forming process, including fiber wrinkling and voids, were evaluated to understand the forming mechanism. Furthermore, both interlaminar slipping friction and compaction behavior of the prepreg stacks were experimentally analyzed for showing the importance of the processing parameters. In addition, autoclave processing was used to cure the C-shaped preforms to investigate the changes of the defects before and after cure process. The results show that the C-shaped prepreg preforms with good forming quality can be achieved through increasing processing temperature and reducing vacuum applying rate, which obviously promote prepreg interlaminar slipping process. The process temperature and forming rate in hot diaphragm forming process strongly influence prepreg interply frictional force, and the maximum interlaminar frictional force can be taken as a key parameter for processing parameter optimization. Autoclave process is effective in eliminating voids in the preforms and can alleviate fiber wrinkles to a certain extent.

  17. Using Group Projects to Teach Process Improvement in a Quality Class

    ERIC Educational Resources Information Center

    Neidigh, Robert O.

    2016-01-01

    This paper provides a description of a teaching approach that uses experiential learning to teach process improvement. The teaching approach uses student groups to perform and gather process data in a senior-level quality management class that focuses on Lean Six Sigma. A strategy to link the experiential learning in the group projects to the…

  18. Integration of a three-dimensional process-based hydrological model into the Object Modeling System

    USDA-ARS?s Scientific Manuscript database

    The integration of a spatial process model into an environmental modelling framework can enhance the model’s capabilities. We present the integration of the GEOtop model into the Object Modeling System (OMS) version 3.0 and illustrate its application in a small watershed. GEOtop is a physically base...

  19. Musical Sound Quality in Cochlear Implant Users: A Comparison in Bass Frequency Perception Between Fine Structure Processing and High-Definition Continuous Interleaved Sampling Strategies.

    PubMed

    Roy, Alexis T; Carver, Courtney; Jiradejvong, Patpong; Limb, Charles J

    2015-01-01

    Med-El cochlear implant (CI) patients are typically programmed with either the fine structure processing (FSP) or high-definition continuous interleaved sampling (HDCIS) strategy. FSP is the newer-generation strategy and aims to provide more direct encoding of fine structure information compared with HDCIS. Since fine structure information is extremely important in music listening, FSP may offer improvements in musical sound quality for CI users. Despite widespread clinical use of both strategies, few studies have assessed the possible benefits of the FSP strategy for music perception. The objective of this study is to measure the differences in musical sound quality discrimination between the FSP and HDCIS strategies. Musical sound quality discrimination was measured using a previously designed evaluation, called Cochlear Implant-MUltiple Stimulus with Hidden Reference and Anchor (CI-MUSHRA). In this evaluation, participants were required to detect sound quality differences between an unaltered real-world musical stimulus and versions of the stimulus in which various amounts of bass (low-frequency) information were removed via a high-pass filter. Eight CI users, currently using the FSP strategy, were enrolled in this study. In the first session, participants completed the CI-MUSHRA evaluation with their FSP strategy. Patients were then programmed with the clinical-default HDCIS strategy, which they used for 2 months to allow for acclimatization. After acclimatization, each participant returned for the second session, during which they were retested with HDCIS and then switched back to their original FSP strategy and tested acutely. Sixteen normal-hearing (NH) controls completed a CI-MUSHRA evaluation for comparison, in which they listened to the music samples under normal acoustic conditions, without CI stimulation. Sensitivity to high-pass filtering more closely resembled that of NH controls when CI users were programmed with the clinical-default FSP strategy

  20. Statistical Methods for Quality Control of Steel Coils Manufacturing Process using Generalized Linear Models

    NASA Astrophysics Data System (ADS)

    García-Díaz, J. Carlos

    2009-11-01

    Fault detection and diagnosis is an important problem in process engineering, as process equipment is subject to malfunctions during operation. Galvanized steel is a value-added product, furnishing effective performance by combining the corrosion resistance of zinc with the strength and formability of steel. Fault detection and diagnosis is an important problem in continuous hot dip galvanizing, and the increasingly stringent quality requirements of the automotive industry have also demanded ongoing efforts in process control to make the process more robust. When faults occur, they change the relationships among the observed process variables. This work compares different statistical regression models proposed in the literature for estimating the quality of galvanized steel coils on the basis of short time histories. Data for 26 batches were available. Five variables were selected for monitoring the process: the steel strip velocity, four bath temperatures, and the bath level. The entire data set of 48 galvanized steel coils was divided into two sets: the first (training) data set contained 25 conforming coils and the second contained 23 nonconforming coils. Logistic regression is a modeling tool in which the dependent variable is categorical; in most applications, the dependent variable is binary. The results show that logistic generalized linear models provide good estimates of coil quality and can be useful for quality control in the manufacturing process.
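
    A sketch of the kind of logistic model described here: a binary conforming/nonconforming label regressed on process variables such as strip velocity, bath temperatures, and bath level. The synthetic data and coefficient values below are illustrative assumptions; the authors' actual data, variable selection, and model differ in detail.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 48  # galvanized steel coils

# Hypothetical process variables: strip velocity, four bath temperatures, bath level
velocity = rng.normal(120, 10, n)
bath_temps = rng.normal(460, 5, (n, 4))
bath_level = rng.normal(50, 3, n)
X = np.column_stack([velocity, bath_temps, bath_level])

# Hypothetical quality label: nonconforming coils tend to come from hot, fast runs
logit = 0.08 * (velocity - 120) + 0.15 * (bath_temps.mean(axis=1) - 460) - 0.05 * (bath_level - 50)
y = (rng.random(n) < 1 / (1 + np.exp(-logit))).astype(int)   # 1 = nonconforming

# Fit a logistic regression classifier and check held-out accuracy
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)
model = LogisticRegression(max_iter=1000).fit(X_train, y_train)
print("held-out accuracy: %.2f" % model.score(X_test, y_test))
print("coefficients:", np.round(model.coef_.ravel(), 3))
```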