Sample records for process-based quality pbq

  1. Analysis of benzoquinone decomposition in solution plasma process

    NASA Astrophysics Data System (ADS)

    Bratescu, M. A.; Saito, N.

    2016-01-01

    The decomposition of p-benzoquinone (p-BQ) in Solution Plasma Processing (SPP) was analyzed by Coherent Anti-Stokes Raman Spectroscopy (CARS), by monitoring the change in the anti-Stokes signal intensity of the vibrational transitions of the molecule during and after SPP. At the very beginning of the SPP treatment, the CARS signal intensities of the ring vibrational molecular transitions increased under the influence of the electric field of the plasma. The results show that plasma influences the p-BQ molecules in two ways: (i) the plasma polarizes and orients the molecules in its local electric field, and (ii) the gas-phase plasma supplies hydrogen and hydroxyl radicals to the liquid phase, which reduce or oxidize the molecules, respectively, generating different carboxylic acids. The decomposition of p-BQ after SPP was confirmed by UV-visible absorption spectroscopy and liquid chromatography.

  2. The Japanese version of the Postpartum Bonding Questionnaire: Examination of the reliability, validity, and scale structure.

    PubMed

    Suetsugu, Yoshiko; Honjo, Shuji; Ikeda, Mari; Kamibeppu, Kiyoko

    2015-07-01

    The purpose of this study was to develop the Japanese version of the Postpartum Bonding Questionnaire (PBQ), to gather data on Japanese mothers for comparison with other cultures, and to examine the scale structure of the PBQ among Japanese mothers. We administered the PBQ to a cross-section of 244 mothers 4 weeks after delivery, and again 2 weeks later to 199 mothers as a retest to examine reliability. We used exploratory factor analysis to evaluate the factor structure of the PBQ. Correlations with the Mother-to-Infant Bonding Scale (MIBS), the Maternal Attachment Inventory (MAI), the Edinburgh Postnatal Depression Scale (EPDS), and sociodemographic variables were calculated for validation. The 14-item version of the PBQ extracted by exploratory analysis consisted of four factors: 'impaired bonding', 'rejection and anger', 'anxiety about care', and 'lack of affection'. Total scores of both the full PBQ and the 14-item version correlated significantly and positively with the MIBS and negatively with the MAI; moderate significant correlations of total scores were also found with the EPDS. Total scores for primiparous and depressed mothers were higher than those for multiparous mothers and mothers without depression. The results of this study demonstrate the reliability and validity of the PBQ and its 14-item version in Japanese mothers 4 weeks after delivery. Copyright © 2015. Published by Elsevier Inc.

  3. Assessment of psychometric properties of the Postpartum Bonding Questionnaire (PBQ) in Spanish mothers.

    PubMed

    Garcia-Esteve, Lluïsa; Torres, Anna; Lasheras, Gracia; Palacios-Hernández, Bruma; Farré-Sender, Borja; Subirà, Susana; Valdés, Manuel; Brockington, Ian Fraser

    2016-04-01

    The Postpartum Bonding Questionnaire (PBQ) was developed to assess mother-infant bonding disturbances in the postpartum period. The aim of this study was to examine the psychometric properties of the Spanish version of the PBQ in a sample of Spanish postpartum women. Eight hundred forty mothers were recruited in the postpartum visit (4-6 weeks after delivery): 513 from a gynecology unit (forming the general population sample) and 327 mothers from a perinatal psychiatry program (forming the clinical sample). All women were assessed by means of the Edinburgh Postnatal Depression Scale (EPDS) and the PBQ. Neither the original four-factor structure nor alternative structures (Reck et al. 2006; Wittkowski et al. 2010) were replicated by the confirmatory factor analyses. An exploratory factor analysis showed a four-factor solution. The Schmid-Leiman transformation found a general factor that accounted for 61% of the variance of the PBQ. Bonding impairment showed higher associations with depressive symptomatology in both samples. The Spanish version of the PBQ showed adequate psychometric properties for use with clinical and general populations of Spanish postpartum women. The results suggest that the PBQ could be summarized by a general factor and confirm the utility of the use of the total score for detecting bonding impairment.

  4. Estimate of radiocaesium derived from the FNPP1 accident in the North Pacific Ocean

    NASA Astrophysics Data System (ADS)

    Inomata, Yayoi; Aoyama, Michio; Tsubono, Takaki; Tsumune, Daisuke; Yamada, Masatoshi

    2017-04-01

    134Cs and 137Cs (radiocaesium) were released to the North Pacific Ocean by direct discharge and atmospheric deposition from the TEPCO Fukushima Dai-ichi Nuclear Power Plant (FNPP1) accident in 2011. After the accident, measurements of 134Cs and 137Cs were conducted by many researchers. However, those results are only snapshots; to interpret the distribution and transport of the released radiocaesium on a basin scale, and to assess the radioecological impacts of the release on the environment, an estimate of the total amount of released 134Cs and 137Cs is necessary. The inventory of 134Cs or 137Cs in the North Pacific Ocean after the FNPP1 accident was reported as 15.2-18.3 PBq based on observations (Aoyama et al., 2016a), 15.3±1.6 PBq by OI analysis (Inomata et al., 2016), and 16.1±1.64 PBq by a global ocean model (Tsubono et al., 2016). These estimates suggest that more than 75 % of the atmospheric-released radiocaesium (15.2-20.4 PBq; Aoyama et al., 2016a) was deposited on the North Pacific Ocean. The radiocaesium from atmospheric fallout and direct discharge was expected to mix and dilute near the coastal region and to be transported eastward across the North Pacific Ocean in the surface layer. Furthermore, radiocaesium was rapidly mixed down into the subsurface water of the North Pacific Ocean in winter. These radiocaesium signals were found in the Subtropical Mode Water (STMW; Aoyama et al., 2016b; Kaeriyama et al., 2016) and the Central Mode Water (CMW; Aoyama et al., 2016b), suggesting that mode-water formation and subduction are efficient pathways for transporting FNPP1-derived radiocaesium into the ocean interior within a 1-year timescale. Kaeriyama et al. (2016) estimated that the total amount of FNPP1-derived radiocaesium in the STMW was 4.2 ± 1.1 PBq in October-November 2012. However, there is no estimate of the amount of radiocaesium in the CMW.
Therefore, it is impossible to discuss the mass balance of radiocaesium injected into the North Pacific Ocean. In this study, we conducted an optimum interpolation (OI) analysis using the measured activities to estimate the inventory of radiocaesium in the ocean interior as well as in surface seawater. Furthermore, the transport speed of radiocaesium in the surface layer of the North Pacific Ocean was also estimated. The data used in this study comprise all of the available data reported by sources such as the Tokyo Electric Power Company (TEPCO), the Japanese Ministry of Education, Culture, Sports, Science and Technology (MEXT), and voluntary cargo ships. The analysis covers the period from the FNPP1 accident through December 2015. The OI analysis showed that radiocaesium crossing the North Pacific Ocean had reached 180˚ around 40˚ N latitude by July 2012, with an estimated transport speed of 8.5 cm s-1. It then reached the coastal waters of the American continent, where activities increased after 2014; the transport speed across 70˚ W (40˚ N latitude) decreased to 5.2 cm s-1. We estimated the inventory of radiocaesium in the surface seawater (depth 0-100 m) during August-December 2012 based on the OI analysis. The 134Cs inventory was estimated at 4.7 PBq decay-corrected to 1 October 2012 (7.9 PBq at the time of the accident, 11 March 2011). (For 137Cs, the inventory was estimated at 12.5 PBq decay-corrected to 1 October 2012, and 13 PBq at the time of the accident, which includes pre-Fukushima 137Cs derived from the atmospheric weapons tests conducted in the late 1950s and early 1960s.) These values correspond to 43-53% of the 134Cs injected into the North Pacific Ocean. It was reported that 4.2±1.1 PBq of 134Cs was distributed in the STMW (Kaeriyama et al., 2016), which corresponds to 22-28% of the injected 134Cs in the North Pacific Ocean.
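The decay corrections quoted above follow the standard law A(t) = A0·exp(-ln2·Δt/T½). A minimal sketch reproducing the surface-layer figures, assuming the standard half-lives of 2.065 y for 134Cs and 30.17 y for 137Cs (values not stated in the abstract):

```python
import math
from datetime import date

# Half-lives in years (standard nuclear data, not taken from the abstract).
HALF_LIFE_Y = {"Cs-134": 2.065, "Cs-137": 30.17}

def decay_correct(activity_pbq, nuclide, from_date, to_date):
    """Project an activity to another date: A(t) = A0 * exp(-ln2 * dt / T_half)."""
    dt_years = (to_date - from_date).days / 365.25
    lam = math.log(2) / HALF_LIFE_Y[nuclide]
    return activity_pbq * math.exp(-lam * dt_years)

accident = date(2011, 3, 11)
reference = date(2012, 10, 1)

# Surface-layer inventories quoted in the abstract, corrected back to the accident date.
cs134_at_accident = decay_correct(4.7, "Cs-134", reference, accident)
cs137_at_accident = decay_correct(12.5, "Cs-137", reference, accident)
print(f"134Cs at accident time: {cs134_at_accident:.1f} PBq")  # ~7.9 PBq
print(f"137Cs at accident time: {cs137_at_accident:.1f} PBq")  # ~13.0 PBq
```

Projecting backwards in time (negative dt) inflates the activity, which is why 4.7 PBq in October 2012 corresponds to 7.9 PBq of short-lived 134Cs at the time of the accident, while long-lived 137Cs barely changes.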
Taking these estimates into account, the FNPP1-derived radiocaesium residing in the CMW of the North Pacific Ocean would be about 3-6 PBq.
References: Aoyama, M., Hamajima, Y., Hult, M., Uematsu, M., Oka, E., 2016. 134Cs and 137Cs in the North Pacific Ocean derived from the March 2011 TEPCO Fukushima Dai-ichi Nuclear Power Plant accident, Japan. Part one: surface pathway and vertical distributions. J. Oceanogr. 72:53-65. Aoyama, M., Kajino, M., Tanaka, T. Y., Sekiyama, T. T., Tsumune, D., Tsubono, T., Hamajima, Y., Inomata, Y., Gamo, T., 2016. 134Cs and 137Cs in the North Pacific Ocean derived from the March 2011 TEPCO Fukushima Dai-ichi Nuclear Power Plant accident, Japan. Part two: estimation of 134Cs and 137Cs inventories in the North Pacific Ocean. J. Oceanogr. 72:53-65. Inomata, Y., Aoyama, M., Tsubono, T., Tsumune, D., Hirose, K., 2016. Spatial and temporal distributions of 134Cs and 137Cs derived from the TEPCO Fukushima Daiichi nuclear power plant accident in the North Pacific Ocean by using optimal interpolation analysis. Environ. Sci. Process. Impacts 18:126-36. Tsubono, T., Misumi, K., Tsumune, D., Bryan, F. O., Hirose, K., Aoyama, M., 2016. Evaluation of radioactive cesium impact from atmospheric deposition and direct release fluxes into the North Pacific from the Fukushima Daiichi nuclear power plant. Deep-Sea Res. I 115:10-21. Kaeriyama, H., Shimizu, Y., Setou, T., Kumamoto, Y., Okazaki, M., Ambe, D., Ono, T., 2016. Intrusion of Fukushima-derived radiocaesium into subsurface water due to formation of mode waters in the North Pacific. Sci. Rep. 6:22010, doi:10.1038/srep22010.

  5. Long term behavior of TEPCO FNPP1 derived radiocaesium in the North Pacific Ocean through the end of 2016: A review

    NASA Astrophysics Data System (ADS)

    Aoyama, Michio; Hamajima, Yasunori; Inomata, Yayoi; Kumamoto, Yuichiro; Oka, Eitarou; Tsubono, Takaki; Tsumune, Daisuke

    2017-04-01

    1. Two major source terms of radiocaesium to the ocean. There are two major sources of radionuclides released to the environment by the TEPCO Fukushima Dai-ichi Nuclear Power Plant (FNPP1) accident in 2011. The largest and earliest source of artificial radionuclides was the atmospheric release from the three melted-down cores of FNPP1, which led to atmospheric deposition on both land and ocean. The total atmospheric release of 137Cs was estimated to be 15.2-20.4 PBq (and the same amount of 134Cs) (Aoyama et al., 2016). About 20% of the released radiocaesium fell on land and 80% fell on the ocean; therefore 11.7-14.8 PBq of 137Cs was injected into the North Pacific as atmospheric deposition. The second largest source was the direct discharge of contaminated water to the ocean, which began on 26 March 2011 and peaked on 6 April 2011 (Tsumune et al., 2012). The total amount of directly released 137Cs was estimated to be 3.5 ± 0.7 PBq. The combined input to the North Pacific was therefore 15.2-18.3 PBq. 2. Three major pathways of FNPP1-derived radiocaesium in the North Pacific. The fastest pathway of radiocaesium might be the surface pathway: FNPP1-derived radiocaesium injected north of the Kuroshio front by atmospheric deposition and direct discharge spread eastward in surface water via the North Pacific Current across the mid-latitude North Pacific (Aoyama et al., 2016). A model simulation by Tsubono et al. (2016) also shows good agreement with the observed radiocaesium activities in the North Pacific. The second pathway is the formation of central mode water (CMW). A maximum of radiocaesium activity in June/July 2012 was observed at potential densities of 26.1-26.3 at 34-39 deg. N along 165 deg. E, corresponding to about 400 m depth. This density is within the range of CMW densities, and the radiocaesium activity was higher than in the surrounding waters, including STMW. In June-July 2015 and June 2016 at 36-44 deg. N, 165-170 deg. E, we observed a very weak signal of FNPP1 radiocaesium, which means that the subducted radiocaesium might have moved eastward out of this region. The third pathway is the formation of subtropical mode water (STMW). FNPP1-derived radiocaesium injected south of the Kuroshio front by atmospheric deposition was transported southward rapidly owing to the formation of STMW at potential densities of 25.1-25.3. By 2015, along 165 deg. E, FNPP1 radiocaesium corresponding to STMW had spread over the entire subtropical gyre; part of it reached 2 deg. N, recirculated in the subtropical gyre, and reached the Japanese coast. 3. Mass balance of FNPP1 radiocaesium in the North Pacific. The 134Cs inventory in the surface layer in summer 2012 was estimated to be 8 PBq (Inomata, unpublished). Kaeriyama et al. (2016) estimated that the 134Cs inventory in STMW in 2012 was about 4 PBq. We believe that the FNPP1-derived 134Cs injected into the North Pacific was 15.2-18.3 PBq. Therefore, based on this mass balance, the 134Cs inventory in CMW can currently be estimated at 3-6 PBq.
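The mass-balance argument in the last section reduces to simple subtraction; a sketch using only the rounded figures quoted above:

```python
# Mass-balance sketch for FNPP1-derived 134Cs in the North Pacific,
# using the rounded figures quoted in the abstract (all in PBq).
total_injected = (15.2, 18.3)   # atmospheric deposition + direct discharge
surface_layer = 8.0             # surface-layer inventory, summer 2012
stmw = 4.0                      # subtropical mode water (Kaeriyama et al., 2016)

# Whatever remains unaccounted for is attributed to central mode water (CMW).
cmw_estimate = (total_injected[0] - surface_layer - stmw,
                total_injected[1] - surface_layer - stmw)
print(f"134Cs in CMW: {cmw_estimate[0]:.1f}-{cmw_estimate[1]:.1f} PBq")  # 3.2-6.3 PBq
```

The 3.2-6.3 PBq residual matches the "3-6 PBq in CMW" conclusion; note that this ignores decay and any export out of the basin.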

  6. Principal-Centric Reasoning in Constructive Authorization Logic

    DTIC Science & Technology

    2009-04-14

    formulas of DTL0 to formulas of CS4m as follows: ⌜P⌝ = P; ⌜A ∧ B⌝ = ⌜A⌝ ∧ ⌜B⌝; ⌜A ∨ B⌝ = ⌜A⌝ ∨ ⌜B⌝; ⌜A ⊃ B⌝ = ⌜A⌝ ⊃ ⌜B⌝; ⌜⊤⌝ = ⊤; ⌜⊥⌝ = ⊥; ⌜K says A⌝ = K(K ⊃ ⌜A⌝). The important part of the translation is the mapping of K says A to K(K ⊃ ⌜A⌝). The formula K on the left of the implication acts as a "guard" on ⌜A⌝, and recovers the effect of the context associated with hypothetical judgments in DTL0: ⌜A⌝ can be obtained from K ⊃ ⌜A⌝ only if K is
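The compositional translation above can be sketched as a recursive function over formula syntax trees. This is purely illustrative: the tuple encoding and tag names below are invented, not part of DTL0 or CS4m.

```python
# Hypothetical AST sketch of the DTL0 -> CS4m translation described above.
# Formulas are tuples: ("atom", p), ("and", A, B), ("or", A, B), ("imp", A, B),
# ("top",), ("bot",), ("says", K, A); ("box", K, A) stands for the CS4m modality K(...).

def translate(f):
    """Map a DTL0 formula to CS4m; 'K says A' becomes Box_K(K -> |A|)."""
    tag = f[0]
    if tag in ("atom", "top", "bot"):
        return f                                  # atoms and constants are unchanged
    if tag in ("and", "or", "imp"):
        return (tag, translate(f[1]), translate(f[2]))
    if tag == "says":
        K, A = f[1], f[2]
        # The guard K on the left of the implication is the key step.
        return ("box", K, ("imp", ("atom", K), translate(A)))
    raise ValueError(f"unknown connective: {tag}")

example = ("says", "K", ("imp", ("atom", "p"), ("atom", "q")))
print(translate(example))
# ('box', 'K', ('imp', ('atom', 'K'), ('imp', ('atom', 'p'), ('atom', 'q'))))
```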

  7. Past-behavioural versus situational questions in a postgraduate admissions multiple mini-interview: a reliability and acceptability comparison.

    PubMed

    Yoshimura, Hiroshi; Kitazono, Hidetaka; Fujitani, Shigeki; Machi, Junji; Saiki, Takuya; Suzuki, Yasuyuki; Ponnamperuma, Gominda

    2015-04-14

    The Multiple Mini-Interview (MMI) mostly uses 'Situational' Questions (SQs) as the interview format within a station, rather than the 'Past-Behavioural' Questions (PBQs) most frequently adopted in traditional single-station personal interviews (SSPIs) for non-medical and medical selection. This study investigated the reliability and acceptability of a postgraduate admissions MMI with both PBQ and SQ interview formats within MMI stations. Twenty-six Japanese medical graduates, who had first completed the two-year national obligatory initial postgraduate clinical training programme, applied to three specialty training programmes - internal medicine, general surgery, and emergency medicine - in a Japanese teaching hospital, where they underwent an Accreditation Council for Graduate Medical Education (ACGME)-competency-based MMI. This MMI contained five stations, with two examiners per station. In each station, a PBQ and then an SQ were asked consecutively. Owing to a lack of space and experienced examiners, the PBQ and SQ formats were not separated into two different stations, nor was the order of questioning of PBQs and SQs varied across stations. Reliability was analysed for the scores of these two MMI question types, and candidates and examiners were surveyed on their experience. The PBQ and SQ formats had generalisability coefficients of 0.822 and 0.821, respectively. With one examiner per station, seven stations could produce a reliability of more than 0.80 in both the PBQ and SQ formats. More than 60% of both candidates and examiners felt positive about the MMI's assessment of candidates' overall ability. All participants liked the fairness of this MMI compared with the previously experienced SSPI. SQs were perceived as more favourable by candidates; in contrast, PBQs were perceived as more relevant by examiners. Both PBQs and SQs are equally reliable and acceptable as station interview formats in the postgraduate admissions MMI.
However, the use of the two formats within the same station, and in a fixed order, is not optimal for maximising the MMI's utility as an admission test. Future studies are required to evaluate how SQs and PBQs can best be combined as station interview formats to enhance the reliability, feasibility, acceptability and predictive validity of the MMI.
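Projections such as "seven stations would exceed 0.80" can be approximated with the Spearman-Brown prophecy formula. This is a simplified sketch that treats stations as the only facet (the full generalisability analysis in the study also models the examiner facet):

```python
def spearman_brown(rel_k, k_old, k_new):
    """Project the reliability of a k_old-station test to k_new parallel stations."""
    # Invert rel_k = k*r1 / (1 + (k-1)*r1) to recover single-station reliability r1.
    r1 = rel_k / (k_old - (k_old - 1) * rel_k)
    return k_new * r1 / (1 + (k_new - 1) * r1)

# Projecting the 5-station PBQ generalisability coefficient (0.822) to 7 stations.
print(f"{spearman_brown(0.822, 5, 7):.3f}")  # 0.866
```

Under this simplification, lengthening the MMI from five to seven stations raises the projected reliability from 0.822 to about 0.87, consistent with the abstract's claim that seven single-examiner stations can clear 0.80.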

  8. Inverse modeling of the Chernobyl source term using atmospheric concentration and deposition measurements

    NASA Astrophysics Data System (ADS)

    Evangeliou, Nikolaos; Hamburger, Thomas; Cozic, Anne; Balkanski, Yves; Stohl, Andreas

    2017-07-01

    This paper describes the results of an inverse modeling study for the determination of the source term of the radionuclides 134Cs, 137Cs and 131I released after the Chernobyl accident. The accident occurred on 26 April 1986 in the former Soviet Union and released about 10^19 Bq of radioactive materials that were transported as far away as the USA and Japan. Thereafter, several attempts to assess the magnitude of the emissions were made based on knowledge of the core inventory and the levels of the spent fuel. More recently, as modeling tools developed further, inverse modeling techniques were applied to the Chernobyl case for source term quantification. However, because radioactivity is a sensitive topic for the public and attracts a lot of attention, high-quality measurements, which are essential for inverse modeling, were not made available, except for a few sparse activity concentration measurements far from the source and far from the main direction of the radioactive fallout. For the first time, we apply Bayesian inversion to the Chernobyl source term using not only activity concentrations but also deposition measurements from the most recent public data set. These observations stem from a data rescue effort that started more than 10 years ago, with the final goal of making the available measurements accessible to anyone interested. Regarding our inverse modeling results, emissions of 134Cs were estimated to be 80 PBq, or 30-50 % higher than previously published values. Of the released 134Cs, about 70 PBq were deposited over Europe. Similarly, emissions of 137Cs were estimated as 86 PBq, on the same order as previously reported results. Finally, 131I emissions of 1365 PBq were found, about 10 % less than the prior total release.
The inversion pushes the injection heights of the three radionuclides to higher altitudes (up to about 3 km) than previously assumed (≈ 2.2 km) in order to better match both concentration and deposition observations over Europe. The results of the present inversion were confirmed using an independent Eulerian model, for which deposition patterns were also improved when using the estimated posterior releases. Although the independent model tends to underestimate deposition in countries that are not in the main direction of the plume, it reproduces country levels of deposition very efficiently. The results were also tested for robustness against different setups of the inversion through sensitivity runs. The source term data from this study are publicly available.
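A Bayesian source-term inversion of this kind combines model-derived source-receptor sensitivities, observations, and a prior release estimate. As a toy illustration only (this is not the authors' method, and every number below is invented), the scalar Gaussian case has a closed-form posterior mean:

```python
# Toy Bayesian source-term inversion: observations y_i = m_i * x + noise,
# with a Gaussian prior x ~ N(x_prior, sigma_prior^2). Scalar case only.

def bayes_scalar_inversion(m, y, sigma_obs, x_prior, sigma_prior):
    """Posterior mean of the release x under a Gaussian linear model."""
    precision = sum(mi * mi for mi in m) / sigma_obs**2 + 1 / sigma_prior**2
    weighted = (sum(mi * yi for mi, yi in zip(m, y)) / sigma_obs**2
                + x_prior / sigma_prior**2)
    return weighted / precision

m = [0.2, 0.5, 0.1]            # source-receptor sensitivities (model-derived)
true_x = 80.0                  # "true" release in PBq, for the sketch
y = [mi * true_x for mi in m]  # noise-free synthetic observations
x_post = bayes_scalar_inversion(m, y, sigma_obs=1.0, x_prior=60.0, sigma_prior=20.0)
print(f"posterior release: {x_post:.1f} PBq")  # 79.8 PBq
```

With consistent observations the posterior sits near the true 80 PBq, pulled only slightly toward the 60 PBq prior; real inversions solve the same trade-off for a vector of emission rates per height layer and time interval.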

  9. p-Benzoquinone-induced aggregation and perturbation of structure and chaperone function of α-crystallin is a causative factor of cigarette smoke-related cataractogenesis.

    PubMed

    Chowdhury, Aritra; Choudhury, Aparajita; Chakraborty, Shruti; Ghosh, Arunava; Banerjee, Victor; Ganguly, Shinjini; Bhaduri, Gautam; Banerjee, Rajat; Das, Kalipada; Chatterjee, Indu B

    2018-02-01

    Cigarette smoking is a significant risk factor for cataract. However, the mechanism by which cigarette smoke (CS) causes cataract remains poorly understood. We had earlier shown that in CS-exposed guinea pigs, p-benzoquinone (p-BQ) derived from CS in the lungs is carried by the circulatory system to distant organs and induces various smoke-related pathogeneses. Here, we observed that CS exposure caused accumulation of the p-BQ-protein adduct in the eye lens of guinea pigs. We also observed accumulation of the p-BQ-protein adduct in resected lenses from human smokers with cataract; no such accumulation was observed in the lenses of never smokers. p-BQ is a strong arylating agent that forms Michael adducts with serum albumin and haemoglobin, resulting in alterations of structure and function. A major protein in the mammalian eye lens is αA-crystallin, a potent molecular chaperone that plays a key role in maintaining the integrity and transparency of the lens. SDS-PAGE indicated that p-BQ induced aggregation of αA-crystallin. Various biophysical techniques, including UV-vis spectroscopy, fluorescence spectroscopy, FT-IR, and bis-ANS titration, suggested a perturbation of the structure and chaperone function of αA-crystallin upon p-BQ modification. Our results indicate that p-BQ is a causative agent involved in the modification of αA-crystallin and the pathogenesis of CS-induced cataract. Our findings should educate the public about the impact of smoking on eye health and help discourage smoking. The study might also help scientists develop new drugs for early intervention in CS-induced cataract. Copyright © 2017 Elsevier B.V. All rights reserved.

  10. [Maternal depressive symptoms and anxiety and interference in the mother/child relationship based on a prenatal cohort: an approach with structural equations modeling].

    PubMed

    Morais, Adriana Oliveira Dias de Sousa; Simões, Vanda Maria Ferreira; Rodrigues, Lívia Dos Santos; Batista, Rosângela Fernandes Lucena; Lamy, Zeni Carvalho; Carvalho, Carolina Abreu de; Silva, Antônio Augusto Moura da; Ribeiro, Marizélia Rodrigues Costa

    2017-07-13

    This study aimed to investigate the association between maternal depressive symptoms and anxiety and interference in the mother/child relationship, using structural equations modeling. Data were used from a prospective cohort study initiated during the prenatal period with 1,140 mothers in São Luís, Maranhão State, Brazil. Data were collected during prenatal care and when the children reached two years of age. Interference in the mother/child relationship was measured with the Postpartum Bonding Questionnaire - PBQ (N = 1,140). In the initial theoretical model, socioeconomic status determined the maternal demographic, psychosocial, and social support factors, which determined the outcome, i.e., the mother/child relationship. Adjustments were performed by structural equations modeling, using Mplus 7.0. The final model showed good fit (RMSEA = 0.047; CFI = 0.984; TLI = 0.981). Depressive symptoms in pregnancy and the postpartum were associated with higher PBQ scores, indicating interference in the mother/child relationship. The greatest effect was from depressive symptoms in pregnancy. Other factors associated with higher PBQ scores were lower social support, unfavorable socioeconomic status, and living without a partner, by indirect association. Anxiety symptoms and maternal age were not associated with the mother/child relationship. The results suggest that identifying and treating depression in pregnancy and postpartum can improve mother/child bonding in childhood.

  11. Personality-related core beliefs in patients diagnosed with fibromyalgia plus depression: A comparison with depressed and healthy control groups.

    PubMed

    Taymur, Ibrahim; Ozdel, Kadir; Gundogdu, Ibrahim; Efe, Canan; Tulaci, Riza Gokcer; Kervancioglu, Aysegul

    2015-07-01

    Personality has an important role in understanding both fibromyalgia syndrome (FMS) and major depressive disorder (MDD). This study examines whether specific personality features characterize depressed FMS patients. To this end, 125 individuals were included in the study: 40 diagnosed with FMS+MDD, 40 with MDD only, and 45 healthy controls. Individual Beck Depression Inventory (BDI) and Personality Belief Questionnaire-Short Form (PBQ-SF) scores were compared among the three groups. The mean scores for each personality domain of the PBQ-SF were highest in the MDD group, and the lowest mean scores appeared in the control group. Dependent personality and obsessive-compulsive personality scores were higher in the MDD group (t = 2.510, P = 0.014 and t = 2.240, P = 0.028, respectively) than in the FMS+MDD group. However, this difference disappeared when PBQ-SF scores were controlled for depression severity. Although some common personality features are evident in FMS patients, it seems that the differences identified are primarily related to depression symptom severity.

  12. Birth-related, psychosocial, and emotional correlates of positive maternal-infant bonding in a cohort of first-time mothers.

    PubMed

    Kinsey, Cara Bicking; Baptiste-Roberts, Kesha; Zhu, Junjia; Kjerulff, Kristen H

    2014-05-01

    To describe the development of a shortened 10-item version of the Postpartum Bonding Questionnaire (S-PBQ) and examine the relationship between birth-related, psychosocial, and emotional factors and maternal-infant bonding. Cross-sectional interview study of women having their first baby in Pennsylvania, USA. We interviewed 3005 women in their third trimester and at one month postpartum who were enrolled in the First Baby Study. For the S-PBQ, we completed factor analysis and examined instrument properties. We examined the relationship between birth-related, psychosocial, and emotional factors and maternal-infant bonding using adjusted linear regression models. The S-PBQ demonstrated acceptable internal reliability (Cronbach's α = 0.67). Analysis revealed a socio-economic bias such that women who were older, more educated, not living in poverty, and married reported lower bonding scores. Maternal-infant bonding was significantly negatively correlated with maternal stress, maternal pain, and postpartum depression, and positively correlated with partner support with the infant and social support. For researchers who wish to measure maternal-infant bonding but need a relatively short scale, the 10-item S-PBQ may be a useful alternative to the original version. However, it is important that researchers measuring maternal-infant bonding also investigate socio-economic bias in their studies and adjust for this effect as needed. Our results also indicate that clinicians should be aware of life stressors that may impact the maternal-infant relationship, so that intervention can be provided to improve health outcomes for mothers, infants, and families. Copyright © 2014 Elsevier Ltd. All rights reserved.
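The internal-reliability statistic reported here, Cronbach's α, is computed from item variances and total-score variance: α = k/(k-1) · (1 - Σ s²_item / s²_total). A sketch with made-up item responses (not study data):

```python
# Cronbach's alpha for a k-item scale, as used to assess the S-PBQ's
# internal reliability. The toy responses below are invented for illustration.

def cronbach_alpha(rows):
    """rows: list of respondents, each a list of k item scores."""
    k = len(rows[0])
    cols = list(zip(*rows))
    def var(xs):                      # sample variance
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)
    item_vars = sum(var(c) for c in cols)
    total_var = var([sum(r) for r in rows])
    return k / (k - 1) * (1 - item_vars / total_var)

scores = [[1, 2, 1], [2, 3, 2], [3, 3, 3], [0, 1, 1], [2, 2, 3]]
print(round(cronbach_alpha(scores), 2))  # 0.92
```

For the real S-PBQ, α = 0.67 indicates items that covary less strongly than in this toy example, which is why the authors call it only "acceptable".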

  13. AMS measurements of 14C and 129I in seawater around radioactive waste dump sites

    NASA Astrophysics Data System (ADS)

    Povinec, P. P.; Oregioni, B.; Jull, A. J. T.; Kieser, W. E.; Zhao, X.-L.

    2000-10-01

    According to a recent IAEA compilation of inventories of radioactive wastes dumped in the world ocean, a total of 85 PBq of radioactive wastes was dumped in the Atlantic (45 PBq), the Pacific (1.4 PBq) and the Arctic (38 PBq) Oceans and their marginal seas between 1946 and 1993, mostly in the form of low-level wastes. 3H and 14C formed an important part of the beta-activity of these dumped wastes. Because of its long half-life, 14C will be the main constituent of possible leakages from the wastes in the future. On the other hand, 14C and 129I are important radioactive tracers which have been artificially introduced into the oceans. Small amounts of 14C and 129I can easily be measured by accelerator mass spectrometry (AMS) on mg-size samples of carbon and iodine extracted from 500 ml seawater samples. The high analytical sensitivity therefore enables one to detect even trace amounts of 14C and 129I released from radioactive wastes, and to compare the measured levels with the global distribution of these radionuclides. The IAEA's Marine Environment Laboratory (IAEA-MEL) has been engaged in an assessment programme related to radioactive waste dumping in the oceans since 1992 and has participated in several expeditions to the Atlantic, Arctic, Indian and Pacific Oceans to sample seawater, biota and sediment for radiological assessment studies. In the present paper, we report on methods of 14C and 129I measurement in seawater by AMS and present data on the NE Atlantic, Arctic and NW Pacific Ocean dumping sites. A small increase of 14C was observed at the NE Atlantic dumping site.
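The claim that 14C will dominate future leakage follows directly from half-life arithmetic: the fraction of a nuclide remaining after t years is 0.5^(t/T½). A sketch comparing 3H (T½ ≈ 12.3 y) and 14C (T½ ≈ 5730 y), using standard half-life values not quoted in the abstract:

```python
# Why 14C outlasts 3H in dumped waste: remaining fraction after t years
# is 0.5 ** (t / T_half). Half-lives: 3H ~ 12.3 y, 14C ~ 5730 y.

def remaining_fraction(t_years, half_life_years):
    return 0.5 ** (t_years / half_life_years)

for t in (50, 100, 500):
    h3 = remaining_fraction(t, 12.3)
    c14 = remaining_fraction(t, 5730)
    print(f"after {t:>3} y: 3H {h3:.2e}, 14C {c14:.4f}")
```

After a century, less than 0.4% of the dumped tritium remains while about 99% of the 14C is still present, so any long-term leakage signal is carried by 14C.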

  14. BIRTH-RELATED, PSYCHOSOCIAL, AND EMOTIONAL CORRELATES OF POSITIVE MATERNAL-INFANT BONDING IN A COHORT OF FIRST-TIME MOTHERS

    PubMed Central

    Kinsey, Cara Bicking; Baptiste-Roberts, Kesha; Zhu, Junjia; Kjerulff, Kristen H.

    2014-01-01

    Objective To describe the development of a shortened 10-item version of the Postpartum Bonding Questionnaire (S-PBQ) and examine the relationship between birth-related, psychosocial, and emotional factors and maternal-infant bonding. Design Cross-sectional interview study. Setting Women having their first baby in Pennsylvania, USA. Participants We interviewed 3005 women in their third trimester and at 1 month postpartum who were enrolled in the First Baby Study. Measurements and Findings For the S-PBQ, we completed factor analysis and examined instrument properties. We examined the relationship between birth-related, psychosocial, and emotional factors and maternal-infant bonding using adjusted linear regression models. The S-PBQ demonstrated acceptable internal reliability (Cronbach's α = 0.67). Analysis revealed a socioeconomic bias such that women who were older, more educated, not living in poverty, and married reported lower bonding scores. Maternal-infant bonding was significantly negatively correlated with maternal stress, maternal pain, and postpartum depression, and positively correlated with partner support with the baby and social support. Key Conclusions and Implications for Practice For researchers who wish to measure maternal-infant bonding but need a relatively short scale, the 10-item S-PBQ may be a useful alternative to the original version. However, it is important that researchers measuring maternal-infant bonding also investigate socioeconomic bias in their studies and adjust for this effect as needed. Our results also indicate that clinicians should be aware of life stressors that may impact the maternal-infant relationship, so that intervention can be provided to improve health outcomes for mothers, infants, and families. PMID:24650812

  15. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chenna, A.; Singer, B.

    Benzene is a carcinogen in rodents and a cause of bone marrow toxicity and leukemia in humans. p-Benzoquinone (p-BQ) is one of the stable metabolites of benzene, as well as of a number of drugs and other chemicals. 2′-Deoxycytidine (dC) and 2′-deoxyadenosine (dA) were allowed to react with p-BQ in aqueous solution at pH 7.4 and 4.5. The yields were considerably higher at pH 4.5 than at pH 7.4, as indicated by HPLC analysis. The desired products were isolated by column chromatography on silica gel or cellulose, and identification was done by FAB-MS, 1H NMR, and UV spectroscopy. The reaction of p-BQ with dC and dA at pH 4.5 produced the exocyclic compounds 3-hydroxy-1,N4-benzetheno-2′-deoxycytidine (p-BQ-dC) and 9-hydroxy-1,N6-benzetheno-2′-deoxyadenosine (p-BQ-dA), respectively, on a large scale and in high yield. These adducts had previously been made on a microgram scale as the 3′-phosphates for 32P-postlabeling studies of their incidence in DNA. The p-BQ-dC and p-BQ-dA adducts have, in addition to the two hydroxyl groups of deoxyribose, one newly formed hydroxyl group at the C-3 or C-9 of the exocyclic base of each product, respectively. Incorporation of these adducts into oligonucleotides as phosphoramidites requires the protection of all three hydroxyl groups in these compounds. The mass spectroscopic analysis of the DNA oligomers was confirmed by electrospray MS. These oligomers are now under investigation for their biochemical properties. 41 refs., 4 figs.

  16. Dopant titrating ion mobility spectrometry for trace exhaled nitric oxide detection.

    PubMed

    Peng, Liying; Hua, Lei; Li, Enyou; Wang, Weiguo; Zhou, Qinghua; Wang, Xin; Wang, Changsong; Li, Jinghua; Li, Haiyang

    2015-01-05

    Ion mobility spectrometry (IMS) is a promising non-invasive tool for the analysis of exhaled gas and of exhaled nitric oxide (NO), a biomarker for the diagnosis of respiratory diseases. However, the high moisture content of exhaled gas always brings about extra overlapping ion peaks and results in poor identification ability. In this paper, p-benzoquinone (PBQ) was introduced into IMS to eliminate the interference of overlapping ion peaks and realize the selective identification of NO. The overlapping ions caused by moisture were titrated by PBQ and converted to hydrated PBQ anions (C6H4O2(-)(H2O)n). The NO concentration could then be determined by quantifying gas-phase hydrated nitrite anions (NO2(-)(H2O)n), the product ions of NO. Under optimized conditions, a limit of detection (LOD) of about 1.4 ppbv and a linear range of 10-200 ppbv were obtained for NO, even in 100% relative humidity (RH) purified air. Furthermore, the established method was applied to hourly measurements of the exhaled NO of eight healthy volunteers and to real-time monitoring of the exhaled NO of an esophageal carcinoma patient during radical surgery. These results revealed the potential of the dopant titrating IMS method for the measurement of exhaled NO in medical disease diagnosis.
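
Figures of merit like the linear range and a 3σ-based LOD are typically derived from an ordinary least-squares calibration. A minimal sketch, with invented signal values chosen so the result lands near the reported 1.4 ppbv (these are not the authors' data):

```python
# Sketch: linear calibration and 3-sigma limit of detection (LOD), as commonly
# used to quantify an IMS analyte. All numbers are invented for illustration.
import numpy as np

conc = np.array([10.0, 25.0, 50.0, 100.0, 150.0, 200.0])    # NO standards, ppbv
signal = np.array([0.21, 0.52, 1.01, 2.05, 3.02, 4.10])     # peak area (arb. units)

slope, intercept = np.polyfit(conc, signal, 1)  # first-order fit: [slope, intercept]
sigma_blank = 0.0095                            # std. dev. of repeated blank signals
lod = 3 * sigma_blank / slope                   # LOD = 3 * sigma_blank / sensitivity
print(f"slope={slope:.4f} per ppbv, LOD={lod:.2f} ppbv")
```

The LOD here scales inversely with the calibration slope, which is why dopant titration (removing the moisture interference) improves detectability.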

  17. Tracing Fukushima Radionuclides in the Northern Hemisphere -An Overview

    NASA Astrophysics Data System (ADS)

    Thakur, Punam; Ballard, Sally; Nelson, Roger

    2013-04-01

    A massive 9.0 earthquake and ensuing tsunami struck the northern coast of Honshu island, Japan on March 11, 2011 and severely damaged the electric systems of the Fukushima Daiichi Nuclear Power Plant (NPP). The structural damage to the plant disabled the reactors' cooling systems. Subsequent fires, a hydrogen explosion and possible partial core meltdowns released radioactive fission products into the atmosphere. The atmospheric release from the crippled Fukushima NPP started on March 12, 2011 with a maximum release phase from March 14 to 17. The radioactivity released was dominated by volatile fission products including isotopes of the noble gases xenon (Xe-133) and krypton (Kr-85); iodine (I-131, I-132); cesium (Cs-134, Cs-136, Cs-137); and tellurium (Te-132). Non-volatile radionuclides such as isotopes of strontium and plutonium are believed to have remained largely inside the reactor, although there is evidence of plutonium release into the environment. Global air monitoring across the northern hemisphere was increased following the first reports of atmospheric releases. According to the source term declared by the Nuclear and Industrial Safety Agency (NISA) of Japan, approximately 160 PBq (1 PBq (petabecquerel) = 10^15 Bq) of I-131 and 15 PBq of Cs-137 (or 770 PBq "iodine-131 equivalent") were released into the atmosphere. The 770 PBq figure is about 15% of the Chernobyl release of 5200 PBq of "iodine-131 equivalent". For the assessment of contamination after the accident, and to track the transport time of the contaminated air mass released from the Fukushima NPP across the globe, several model calculations were performed by various research groups. All model calculations suggested long-range transport of radionuclides from the damaged Fukushima NPP towards the North American continent, Europe, and Central Asia.
As a result, elevated levels of Fukushima radionuclides were detected in air, rain, milk, and vegetation samples across the northern hemisphere. Although the releases from the Fukushima NPP were substantial, the concentrations of radionuclides measured outside Japan were extremely low, owing to significant dilution of the radioactivity in the atmosphere as it was transported across the globe. The activities of I-131, Cs-134, and Cs-137 in air were estimated to have been diluted by a factor of 10^5 to 10^8 during trans-Pacific transport. This paper will present a compilation of the radionuclide concentrations measured across the northern hemisphere by various national and international monitoring networks. It will focus on the most prevalent cesium and iodine isotopes, but other secondary isotopes will be discussed. Spatial and temporal patterns and differences will be contrasted. The effects of this global radionuclide dispersal are reported and discussed. The activity ratios of ^131I/^137Cs and ^134Cs/^137Cs measured at several locations are evaluated to gain insight into the fuel burn-up and the inventory of radionuclides in the reactor, and thus into the isotopic signature of the accident. It is important to note that all of the radiation levels detected across the northern hemisphere have been very low and are well below any level of public and environmental concern.
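
The quoted figures are internally consistent if Cs-137 is weighted by the INES radiological-equivalence factor of 40 relative to I-131 (the factor is an assumption here; the abstract does not state it). A quick arithmetic check:

```python
# Sketch: reproducing the "iodine-131 equivalent" comparison above.
# Assumes the INES radiological-equivalence factor of 40 for Cs-137,
# which the abstract itself does not state.
I131_FACTOR = 1
CS137_FACTOR = 40

fukushima_eq = 160 * I131_FACTOR + 15 * CS137_FACTOR  # PBq, from the NISA source term
chernobyl_eq = 5200                                   # PBq iodine-131 equivalent

print(fukushima_eq)                      # 760 PBq, close to the declared 770 PBq
print(round(100 * 770 / chernobyl_eq))   # 15 (percent of the Chernobyl release)
```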

  18. Spectroscopic evidence for temperature dependent relative movement of light and heavy hole valence bands of PbQ (Q=Te,Se,S)

    NASA Astrophysics Data System (ADS)

    Chatterjee, Utpal; Zhao, Junjing; Kanatzidis, Mercouri; Malliakas, Christos

    We have conducted temperature dependent Angle Resolved Photoemission Spectroscopy (ARPES) studies of the electronic structures of PbTe, PbSe and PbS. Our ARPES measurements provide direct evidence for the light-hole upper valence bands (UVBs) and the so-called heavy-hole lower valence bands (LVBs), and for an unusual temperature dependent relative movement between their band maxima, leading to a monotonic decrease in the energy separation between LVBs and UVBs with increasing temperature. This enables convergence of these valence bands and consequently an effective increase in the valley degeneracy of PbQ at higher temperatures, which has long been believed to be the driving factor behind their extraordinary thermoelectric performance.

  19. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hang, Bo; Rodriguez, Ben; Yang, Yanu

    Benzene, a ubiquitous human carcinogen, forms DNA adducts through its metabolites such as p-benzoquinone (p-BQ) and hydroquinone (HQ). N(2)-(4-Hydroxyphenyl)-2'-deoxyguanosine (N(2)-4-HOPh-dG) is the principal adduct identified in vivo by (32)P-postlabeling in cells or animals treated with p-BQ or HQ. To study its effect on repair specificity and replication fidelity, we recently synthesized defined oligonucleotides containing a site-specific adduct using phosphoramidite chemistry. We here report the repair of this adduct by the Escherichia coli UvrABC complex, which performs the initial damage recognition and incision steps in the nucleotide excision repair (NER) pathway. We first showed that the p-BQ-treated plasmid was efficiently cleaved by the complex, indicating the formation of DNA lesions that are substrates for NER. Using a 40-mer substrate, we found that UvrABC incises the DNA strand containing N(2)-4-HOPh-dG in a dose- and time-dependent manner. The specificity of such repair was also compared with that of DNA glycosylases and damage-specific endonucleases of E. coli, both of which were found to have no detectable activity toward N(2)-4-HOPh-dG. To understand why this adduct is specifically recognized and processed by UvrABC, molecular modeling studies were performed. Analysis of molecular dynamics trajectories showed that stable G:C-like hydrogen bonding patterns of all three Watson-Crick hydrogen bonds are present within the N(2)-4-HOPh-G:C base pair, with the hydroxyphenyl ring at an almost planar position. In addition, N(2)-4-HOPh-dG has a tendency to form more stable stacking interactions than a normal G in B-type DNA. These conformational properties may be critical in the differential recognition of this adduct by specific repair enzymes.

  20. [The radiological situation before and after Chernobyl disaster].

    PubMed

    Leoniak, Marcin; Zonenberg, Anna; Zarzycki, Wiesław

    2006-01-01

    The nuclear reactor accident, which occurred on 26 April 1986 at Chernobyl, has been one of the greatest ecological disasters in human history. In our study we discussed the most recent data on the accident, and the natural and synthetic sources of radiation. According to the recent data, the air at Chernobyl had been contaminated with about 5300 PBq of radionuclide activity (excluding rare gases), including 1760 PBq of (131)I and 85 PBq of (137)Cs. The highest radiation doses (0.8-16 Gy) were received by the liquidators; lower doses were received by the population which was evacuated or inhabited the contaminated areas (in which the level of (137)Cs activity deposited in the earth was 37 kBq/m(2)). Among European countries, the highest mean whole-body radiation dose in the first year after the accident was in Bulgaria (760 microSv), Austria (670 microSv) and Greece (590 microSv), while the lowest was observed in Portugal (1.8 microSv) and Spain (4.2 microSv). In Poland the mean effective equivalent dose resulting from the Chernobyl accident was 932 microSv, which is close to the permitted dose limit in Poland of 1 mSv/year. The highest radiation dose to the thyroid was received by inhabitants of the provinces previously known as Bielskopodlaskie and Nowosadeckie and of the north-east region of Poland. The lowest dose was received by inhabitants of the areas previously known as Slupski and Rzeszowski.

  1. Spatial and temporal distributions of (134)Cs and (137)Cs derived from the TEPCO Fukushima Daiichi Nuclear Power Plant accident in the North Pacific Ocean by using optimal interpolation analysis.

    PubMed

    Inomata, Y; Aoyama, M; Tsubono, T; Tsumune, D; Hirose, K

    2016-01-01

    Optimal interpolation (OI) analysis was used to investigate the oceanic distributions of (134)Cs and (137)Cs released from the Tokyo Electric Power Company Fukushima Daiichi Nuclear Power Plant (FNPP1) accident. From the end of March to early April 2011, extremely high activities were observed in the coastal surface seawater near the FNPP1. The high activities spread to a region near 165°E in the western North Pacific Ocean, with a latitudinal center of 40°N. Atmospheric deposition also caused high activities in the region between 180° and 130°W in the North Pacific Ocean. The inventory of FNPP1-released (134)Cs in the North Pacific Ocean was estimated to be 15.3 ± 2.6 PBq. About half of this activity (8.4 ± 2.6 PBq) was found in the coastal region near the FNPP1. After 6 April 2011, when major direct releases ceased, the FNPP1-released (134)Cs in the coastal region decreased exponentially with an apparent half-time of about 4.2 ± 0.5 days and declined to about 2 ± 0.4 PBq by the middle of May 2011. Taking into account that the (134)Cs/(137)Cs activity ratio was about 1 just after release and was extremely uniform during the first month after the accident, the amount of (137)Cs released by the FNPP1 accident increased the North Pacific inventory of (137)Cs due to bomb testing during the 1950s and early 1960s by 20%.
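
The exponential decline described above follows the usual first-order form A(t) = A0 · 2^(−t/T), with the abstract's values A0 = 8.4 PBq and apparent half-time T = 4.2 days. A minimal sketch (the elapsed times below are illustrative, not fitted):

```python
# Sketch: first-order decline of the coastal 134Cs inventory with an apparent
# half-time, A(t) = A0 * 2**(-t / T). A0 and T are taken from the abstract;
# the evaluation times are illustrative.
def activity(t_days: float, a0_pbq: float = 8.4, half_time_days: float = 4.2) -> float:
    return a0_pbq * 2 ** (-t_days / half_time_days)

print(round(activity(4.2), 2))  # 4.2 PBq after one apparent half-time
print(round(activity(8.4), 2))  # 2.1 PBq after two apparent half-times
```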

  2. Consequences of the radiation accident at the Mayak production association in 1957 (the 'Kyshtym Accident').

    PubMed

    Akleyev, A V; Krestinina, L Yu; Degteva, M O; Tolstykh, E I

    2017-09-01

    This paper presents an overview of the nuclear accident that occurred at the Mayak Production Association (PA) in the Russian Federation on 29 September 1957, often referred to as the 'Kyshtym Accident', when 20 MCi (740 PBq) of radionuclides were released by a chemical explosion in a radioactive waste storage tank. 2 MCi (74 PBq) spread beyond the Mayak PA site to form the East Urals Radioactive Trace (EURT). The paper describes the accident and briefly characterizes the efficacy of the implemented protective measures, which made it possible to considerably reduce doses to the exposed population. The paper also provides retrospective dosimetry estimates for the members of the EURT Cohort (EURTC), which comprises approximately 21,400 people. During the first two years after the accident, a decrease in the group-average leukocyte (mainly neutrophil and lymphocyte) and thrombocyte counts was observed in the population. At later dates, an increased excess relative risk of solid cancer incidence and mortality was found in the EURTC.

  3. Estimation of the Cesium-137 Source Term from the Fukushima Daiichi Power Plant Using Air Concentration and Deposition Data

    NASA Astrophysics Data System (ADS)

    Winiarek, Victor; Bocquet, Marc; Duhanyan, Nora; Roustan, Yelva; Saunier, Olivier; Mathieu, Anne

    2013-04-01

    A major difficulty when inverting the source term of an atmospheric tracer dispersion problem is the estimation of the prior errors: those of the atmospheric transport model, those ascribed to the representativeness of the measurements, the instrumental errors, and those attached to the prior knowledge of the variables one seeks to retrieve. In the case of an accidental release of pollutant, and especially in a situation of sparse observability, the reconstructed source is sensitive to these assumptions. This sensitivity makes the quality of the retrieval dependent on the methods used to model and estimate the prior errors of the inverse modeling scheme. In Winiarek et al. (2012), we proposed an estimation method for the errors' amplitude based on the maximum likelihood principle. Under semi-Gaussian assumptions, it takes into account, without approximation, the positivity assumption on the source. We applied the method to the estimation of the Fukushima Daiichi cesium-137 and iodine-131 source terms using activity concentrations in the air. The results were compared to an L-curve estimation technique and to Desroziers's scheme. In addition to the estimates of released activities, we provided the related uncertainties (12 PBq with a standard deviation of 15-20% for cesium-137 and 190-380 PBq with a standard deviation of 5-10% for iodine-131). We also showed that, because of the low number of available observations (a few hundred), and even though orders of magnitude were consistent, the reconstructed activities depended significantly on the method used to estimate the prior errors. In order to use more data, we propose to extend the methods to the use of several data types, such as activity concentrations in the air and fallout measurements. The idea is to simultaneously estimate the prior errors related to each dataset, in order to fully exploit the information content of each one.
Using the activity concentration measurements, but also daily fallout data from prefectures and cumulated deposition data over a region lying approximately 150 km around the nuclear power plant, we can use a few thousand data points in our inverse modeling algorithm to reconstruct the cesium-137 source term. To improve the parameterization of removal processes, rainfall fields have also been corrected using outputs from the mesoscale meteorological model WRF and ground-station rainfall data. As expected, the different methods yield closer results as the number of data increases. Reference: Winiarek, V., M. Bocquet, O. Saunier, A. Mathieu (2012), Estimation of errors in the inverse modeling of accidental release of atmospheric pollutant: Application to the reconstruction of the cesium-137 and iodine-131 source terms from the Fukushima Daiichi power plant, J. Geophys. Res., 117, D05122, doi:10.1029/2011JD016932.
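
The core idea of combining observation types weighted by their own error levels can be sketched as a stacked weighted least-squares problem. This is a simplified stand-in for the maximum-likelihood scheme described above, not the authors' algorithm, and every matrix and noise level below is synthetic:

```python
# Sketch: jointly inverting a source term from two observation types
# (e.g. air concentrations and deposition), each normalized by its own
# estimated error std. dev. before stacking. All data are synthetic.
import numpy as np

rng = np.random.default_rng(42)
true_source = np.array([3.0, 9.0])   # emission rates for two release periods

H_air = rng.random((50, 2))          # source-receptor matrices that a transport
H_dep = rng.random((30, 2))          # model would provide (random placeholders)
sigma_air, sigma_dep = 0.1, 0.5      # estimated per-dataset error levels

y_air = H_air @ true_source + rng.normal(0, sigma_air, size=50)
y_dep = H_dep @ true_source + rng.normal(0, sigma_dep, size=30)

# Whiten each dataset by its error level, stack, and solve jointly.
H = np.vstack([H_air / sigma_air, H_dep / sigma_dep])
y = np.concatenate([y_air / sigma_air, y_dep / sigma_dep])
estimate, *_ = np.linalg.lstsq(H, y, rcond=None)
print(np.round(estimate, 1))
```

Mis-estimating either sigma changes how much each dataset pulls on the solution, which is the sensitivity the abstract describes.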

  4. Validation of a Multidimensional Assessment of Parenting Styles for Low-Income African-American Families with Preschool Children.

    ERIC Educational Resources Information Center

    Coolahan, Kathleen; McWayne, Christine; Fantuzzo, John; Grim, Suzanne

    2002-01-01

    Examined the construct and concurrent validity of the Parenting Behavior Questionnaire-Head Start (PBQ-HS) with low-income African-American families with preschoolers, and whether parenting styles differed by caregiver characteristics. Derived Active-Responsive, Active-Restrictive, and Passive-Permissive parenting dimensions; the last differed…

  5. Global radioxenon emission inventory based on nuclear power reactor reports.

    PubMed

    Kalinowski, Martin B; Tuma, Matthias P

    2009-01-01

    Atmospheric radioactivity is monitored for the verification of the Comprehensive Nuclear-Test-Ban Treaty, with the xenon isotopes 131mXe, 133Xe, 133mXe and 135Xe serving as important indicators of nuclear explosions. The treaty-relevant interpretation of atmospheric concentrations of radioxenon is enhanced by quantifying the radioxenon emissions released from civilian facilities. This paper presents the first global radioxenon emission inventory for nuclear power plants, based on North American and European emission reports for the years 1995-2005. Estimates were made for all power plant sites for which emission data were unavailable. According to this inventory, a total of 1.3 PBq of radioxenon isotopes is released by nuclear power plants as continuous or pulsed emissions in a generic year.

  6. An overview of current knowledge concerning the health and environmental consequences of the Fukushima Daiichi Nuclear Power Plant (FDNPP) accident.

    PubMed

    Aliyu, Abubakar Sadiq; Evangeliou, Nikolaos; Mousseau, Timothy Alexander; Wu, Junwen; Ramli, Ahmad Termizi

    2015-12-01

    Since 2011, the scientific community has worked to identify the exact transport and deposition patterns of radionuclides released from the accident at the Fukushima Daiichi Nuclear Power Plant (FDNPP) in Japan. Nevertheless, there still remain many unknowns concerning the health and environmental impacts of these radionuclides. The present paper reviews the current understanding of the FDNPP accident with respect to interactions of the released radionuclides with the environment and impacts on human and non-human biota. Here, we scrutinize existing literature and combine and interpret observations and modeling assessments derived after Fukushima. Finally, we discuss the behavior and applications of radionuclides that might be used as tracers of environmental processes. This review focuses on (137)Cs and (131)I releases derived from Fukushima. Published estimates suggest total release amounts of 12-36.7 PBq of (137)Cs and 150-160 PBq of (131)I. Maximum estimated human mortality due to the Fukushima nuclear accident is 10,000 (due to all causes) and the maximum estimates for lifetime cancer mortality and morbidity are 1500 and 1800, respectively. Studies of plants and animals in the forests of Fukushima have recorded a range of physiological, developmental, morphological, and behavioral consequences of exposure to radioactivity. Some of the effects observed in the exposed populations include the following: hematological aberrations in Fukushima monkeys; genetic, developmental and morphological aberrations in a butterfly; declines in abundances of birds, butterflies and cicadas; aberrant growth forms in trees; and morphological abnormalities in aphids. These findings are discussed from the perspective of conservation biology. Copyright © 2015 Elsevier Ltd. All rights reserved.

  7. CeLAND: search for a 4th light neutrino state with a 3 PBq 144Ce- 144Pr electron antineutrino generator in KamLAND

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gando, A; Gando, Y; Hayashida, S

    The reactor neutrino and gallium anomalies can be tested with a 3-4 PBq (75-100 kCi scale) 144Ce-144Pr antineutrino beta-source deployed at the center of, or next to, a large low-background liquid scintillator detector. The antineutrino generator will be produced by the Russian reprocessing plant PA Mayak as early as 2014, transported to Japan, and deployed in the Kamioka Liquid Scintillator Anti-Neutrino Detector (KamLAND) as early as 2015. KamLAND's 13 m diameter target volume provides a suitable environment to measure the energy and position dependence of the detected neutrino flux. A characteristic oscillation pattern would be visible for a baseline of about 10 m or less, providing a very clean signal of neutrino disappearance into a yet-unknown, sterile neutrino state. This will provide a comprehensive test of the electron neutrino disappearance anomalies and could lead to the discovery of a 4th neutrino state for Δm^2_new ≳ 0.1 eV^2 and sin^2(2θ_new) > 0.05.
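
The oscillation pattern searched for follows the standard two-flavor short-baseline survival probability, P(L, E) = 1 − sin^2(2θ) · sin^2(1.27 · Δm^2 · L/E), with Δm^2 in eV^2, L in metres and E in MeV. A quick sketch (the parameter values are illustrative, not CeLAND results):

```python
# Sketch: two-flavor short-baseline electron-antineutrino survival probability.
# dm2 in eV^2, baseline L in metres, energy E in MeV; parameters illustrative.
import math

def survival_prob(L_m: float, E_MeV: float, dm2_eV2: float, sin2_2theta: float) -> float:
    return 1.0 - sin2_2theta * math.sin(1.27 * dm2_eV2 * L_m / E_MeV) ** 2

# A ~3 MeV antineutrino at a 10 m baseline, for dm2 = 1 eV^2, sin^2(2θ) = 0.1:
print(round(survival_prob(10.0, 3.0, 1.0, 0.1), 3))
```

Because the phase depends on L/E, a detector like KamLAND that resolves both position and energy can map the oscillatory deficit directly, which is the "characteristic oscillation pattern" referred to above.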

  8. An experimental and theoretical investigation into the electronically excited states of para-benzoquinone

    NASA Astrophysics Data System (ADS)

    Jones, D. B.; Limão-Vieira, P.; Mendes, M.; Jones, N. C.; Hoffmann, S. V.; da Costa, R. F.; Varella, M. T. do N.; Bettega, M. H. F.; Blanco, F.; García, G.; Ingólfsson, O.; Lima, M. A. P.; Brunger, M. J.

    2017-05-01

    We report on a combination of experimental and theoretical investigations into the structure of electronically excited para-benzoquinone (pBQ). Here synchrotron photoabsorption measurements are reported over the 4.0-10.8 eV range. The higher resolution obtained reveals previously unresolved pBQ spectral features. Time-dependent density functional theory calculations are used to interpret the spectrum and resolve discrepancies relating to the interpretation of the Rydberg progressions. Electron-impact energy loss experiments are also reported. These are combined with elastic electron scattering cross section calculations performed within the framework of the independent atom model-screening corrected additivity rule plus interference (IAM-SCAR + I) method to derive differential cross sections for electronic excitation of key spectral bands. A generalized oscillator strength analysis is also performed, with the obtained results demonstrating that a cohesive and reliable quantum chemical structure and cross section framework has been established. Within this context, we also discuss some issues associated with the development of a minimal orbital basis for the single configuration interaction strategy to be used for our high-level low-energy electron scattering calculations that will be carried out as a subsequent step in this joint experimental and theoretical investigation.

  9. Vinpocetine Reduces Carrageenan-Induced Inflammatory Hyperalgesia in Mice by Inhibiting Oxidative Stress, Cytokine Production and NF-κB Activation in the Paw and Spinal Cord

    PubMed Central

    Ruiz-Miyazawa, Kenji W.; Zarpelon, Ana C.; Pinho-Ribeiro, Felipe A.; Pavão-de-Souza, Gabriela F.; Casagrande, Rubia; Verri, Waldiceu A.

    2015-01-01

    Vinpocetine is a safe nootropic agent used for neurological and cerebrovascular diseases. The anti-inflammatory activity of vinpocetine has been shown in cell based assays and animal models, leading to suggestions as to its utility in analgesia. However, the mechanisms regarding its efficacy in inflammatory pain treatment are still not completely understood. Herein, the analgesic effect of vinpocetine and its anti-inflammatory and antioxidant mechanisms were addressed in murine inflammatory pain models. Firstly, we investigated the protective effects of vinpocetine in overt pain-like behavior induced by acetic acid, phenyl-p-benzoquinone (PBQ) and formalin. The intraplantar injection of carrageenan was then used to induce inflammatory hyperalgesia. Mechanical and thermal hyperalgesia were evaluated using the electronic von Frey and the hot plate tests, respectively, with neutrophil recruitment to the paw assessed by a myeloperoxidase activity assay. A number of factors were assessed, both peripherally and in the spinal cord, including: antioxidant capacity, reduced glutathione (GSH) levels, superoxide anion, tumor necrosis factor alpha (TNF-α) and interleukin 1 beta (IL-1β) levels, as well as nuclear factor kappa B (NF-κB) activation. Vinpocetine inhibited the overt pain-like behavior induced by acetic acid, PBQ and formalin (at both phases), as well as the carrageenan-induced mechanical and thermal hyperalgesia and associated neutrophil recruitment. Both peripherally and in the spinal cord, vinpocetine also inhibited: antioxidant capacity and GSH depletion; increased superoxide anion; IL-1β and TNF-α levels; and NF-κB activation. As such, vinpocetine significantly reduces inflammatory pain by targeting oxidative stress, cytokine production and NF-κB activation at both peripheral and spinal cord levels. PMID:25822523

  11. Postpartum bonding: the role of perinatal depression, anxiety and maternal-fetal bonding during pregnancy.

    PubMed

    Dubber, S; Reck, C; Müller, M; Gawlik, S

    2015-04-01

    Adverse effects of perinatal depression on the mother-child interaction are well documented; however, the influence of maternal-fetal bonding during pregnancy on postpartum bonding has not been clearly identified. The aim of this study was to prospectively investigate the influence of maternal-fetal bonding and of perinatal symptoms of anxiety and depression on postpartum mother-infant bonding. Data from 80 women were analyzed for associations of symptoms of depression and anxiety, as well as of maternal bonding during pregnancy, with maternal bonding in the postpartum period, using the Edinburgh Postnatal Depression Scale (EPDS), the State-Trait Anxiety Inventory (STAI), the Pregnancy Related Anxiety Questionnaire (PRAQ-R), the Maternal-Fetal Attachment Scale (MFAS) and the Postpartum Bonding Questionnaire (PBQ-16). Maternal education, MFAS, PRAQ-R, EPDS and STAI-T scores significantly correlated with the PBQ-16. In the final regression model, MFAS and postpartum EPDS remained significant predictors of postpartum bonding and explained 20.8% of the variance. The results support the hypothesized negative relationship between maternal-fetal bonding and postpartum maternal bonding impairment, as well as the role of postpartum depressive symptoms. Early identification of bonding impairment during pregnancy and of postpartum depression in mothers plays an important role in the prevention of potential bonding impairment in the early postpartum period.

  12. Comparison of the Chernobyl and Fukushima nuclear accidents: a review of the environmental impacts.

    PubMed

    Steinhauser, Georg; Brandl, Alexander; Johnson, Thomas E

    2014-02-01

    The environmental impacts of the nuclear accidents of Chernobyl and Fukushima are compared. In almost every respect, the consequences of the Chernobyl accident clearly exceeded those of the Fukushima accident. In both accidents, most of the radioactivity released was due to volatile radionuclides (noble gases, iodine, cesium, tellurium). However, the amount of refractory elements (including actinides) emitted in the course of the Chernobyl accident was approximately four orders of magnitude higher than during the Fukushima accident. For Chernobyl, a total release of 5,300 PBq (excluding noble gases) has been established as the most cited source term. For Fukushima, we estimated a total source term of 520 (340-800) PBq. In the course of the Fukushima accident, the majority of the radionuclides (more than 80%) was transported offshore and deposited in the Pacific Ocean. Monitoring campaigns after both accidents reveal that the environmental impact of the Chernobyl accident was much greater than of the Fukushima accident. Both the highly contaminated areas and the evacuated areas are smaller around Fukushima and the projected health effects in Japan are significantly lower than after the Chernobyl accident. This is mainly due to the fact that food safety campaigns and evacuations worked quickly and efficiently after the Fukushima accident. In contrast to Chernobyl, no fatalities due to acute radiation effects occurred in Fukushima. © 2013.

  13. Wide Area Recovery and Resiliency Program (WARRP) Decon-13 Subject Matter Expert Meeting

    DTIC Science & Technology

    2012-08-14

    Japan, Chernobyl, Goiania Waste Screening Workshop, August 14, 2012. Edward A. Tupin, Center for Radiological Emergency Response, Radiation Protection... Total release: 10%-20% of the releases from Chernobyl (37 PBq = 1,000,000 curies)... Wide-area contamination; MEXT data as of September... and longer-term interim storage - disposal likely will take more time... On April 26, 1986, Unit 4 of the Chernobyl Nuclear Power Plant suffered

  14. Joint US/Russian Studies of Population Exposures Resulting from Nuclear Production Activities in the Southern Urals

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Napier, Bruce A.

    2014-01-01

    Beginning in 1948, the Soviet Union initiated a program for production of nuclear materials for a weapons program. The first facility for production of plutonium was constructed in the central portion of the country east of the southern Ural Mountains, about halfway between the major industrial cities of Ekaterinburg and Chelyabinsk. The facility now known as the Mayak Production Association and its associated town, now known as Ozersk, were built to irradiate uranium in reactors, separate the resulting plutonium in reprocessing plants, and prepare plutonium metal. The rush to production, coupled with inexperience in handling radioactive materials, led to large radiation exposures, not only to the workers in the facilities, but also to the surrounding public. Fuel processing started with no controls on releases, and fuel dissolution and accidents in reactors resulted in the release of about 37 PBq (1 PBq = 10^15 Bq) of 131I between 1948 and 1967. Designed disposals of low- and intermediate-level liquid radioactive wastes, and accidental releases via cooling water from tank farms of high-level liquid radioactive wastes, into the small Techa River caused significant contamination and exposures to residents of numerous small riverside villages downstream of the site. Discovery of the magnitude of the aquatic contamination in late 1951 caused revisions to the waste handling regimes, but not before over 200 PBq of radionuclides (with large contributions of 90Sr and 137Cs) were released. Liquid wastes were diverted to tiny Lake Karachay (which today holds over 4 EBq); cooling water was stopped in the tank farms. In 1957, one of the tanks in the tank farm overheated and exploded; over 70 PBq, disproportionately 90Sr, was blown over a large area to the northeast of the site; a large area was contaminated and many villages were evacuated. This area today is known as the East Urals Radioactive Trace (EURT).
Each of these releases was significant; together they have created a group of cohorts unrivaled in the world for their chronic, low-dose-rate radiation exposure. The 26,000 workers at Mayak were highly exposed to external gamma and inhaled plutonium. A cohort of individuals raised as children in Ozersk is under evaluation for their exposures to radioiodine. The Techa River Cohort consists of over 30,000 people who were born before the start of exposure in 1949 and lived along the Techa River. The Techa River Offspring Cohort consists of about 21,000 persons born to one or more exposed parents of this group - many of whom also lived along the contaminated river. The EURT Cohort consists of about 18,000 people who were evacuated from the EURT soon after the 1957 explosion and another 8000 who remained. These groups together are the focus of dose reconstruction and epidemiological studies funded by the US, Russia, and the European Union to address the question “Are doses delivered at low dose rates as effective in producing health effects as the same doses delivered at high dose rates?”« less

  15. Effect of miscarriage history on maternal-infant bonding during the first year postpartum in the first baby study: a longitudinal cohort study.

    PubMed

    Bicking Kinsey, Cara; Baptiste-Roberts, Kesha; Zhu, Junjia; Kjerulff, Kristen H

    2014-07-15

    Miscarriage, the unexpected loss of pregnancy before 20 weeks gestation, may have a negative effect on a mother's perception of herself as a capable woman and on her emotional health when she is pregnant again subsequent to the miscarriage. As such, a mother with a history of miscarriage may be at greater risk for difficulties navigating the process of becoming a mother and achieving positive maternal-infant bonding with an infant born subsequent to the loss. The aim of this study was to examine the effect of miscarriage history on maternal-infant bonding after the birth of a healthy infant to test the hypothesis that women with a history of miscarriage have decreased maternal-infant bonding compared to women without a history of miscarriage. We completed secondary analysis of the First Baby Study, a longitudinal cohort study, to examine the effect of a history of miscarriage on maternal-infant bonding at 1 month, 6 months, and 12 months after women experienced the birth of their first live-born baby. In a sample of 2798 women living in Pennsylvania, USA, we tested our hypothesis using linear regression analysis of Shortened Postpartum Bonding Questionnaire (S-PBQ) scores, followed by longitudinal analysis using a generalized estimating equations model with repeated measures. We found that women with a history of miscarriage had similar S-PBQ scores as women without a history of miscarriage at each of the three postpartum time points. Likewise, longitudinal analysis revealed no difference in the pattern of maternal-infant bonding scores between women with and without a history of miscarriage. Women in the First Baby Study with a history of miscarriage did not differ from women without a history of miscarriage in their reported level of bonding with their subsequently born infants. 
It is important for clinicians to recognize that even though some women may experience impaired bonding related to a history of miscarriage, the majority of women form a healthy bond with their infant despite this history.

  16. Effect of miscarriage history on maternal-infant bonding during the first year postpartum in the First Baby Study: a longitudinal cohort study

    PubMed Central

    2014-01-01

    Background Miscarriage, the unexpected loss of pregnancy before 20 weeks gestation, may have a negative effect on a mother’s perception of herself as a capable woman and on her emotional health when she is pregnant again subsequent to the miscarriage. As such, a mother with a history of miscarriage may be at greater risk for difficulties navigating the process of becoming a mother and achieving positive maternal-infant bonding with an infant born subsequent to the loss. The aim of this study was to examine the effect of miscarriage history on maternal-infant bonding after the birth of a healthy infant to test the hypothesis that women with a history of miscarriage have decreased maternal-infant bonding compared to women without a history of miscarriage. Methods We completed secondary analysis of the First Baby Study, a longitudinal cohort study, to examine the effect of a history of miscarriage on maternal-infant bonding at 1 month, 6 months, and 12 months after women experienced the birth of their first live-born baby. In a sample of 2798 women living in Pennsylvania, USA, we tested our hypothesis using linear regression analysis of Shortened Postpartum Bonding Questionnaire (S-PBQ) scores, followed by longitudinal analysis using a generalized estimating equations model with repeated measures. Results We found that women with a history of miscarriage had similar S-PBQ scores as women without a history of miscarriage at each of the three postpartum time points. Likewise, longitudinal analysis revealed no difference in the pattern of maternal-infant bonding scores between women with and without a history of miscarriage. Conclusions Women in the First Baby Study with a history of miscarriage did not differ from women without a history of miscarriage in their reported level of bonding with their subsequently born infants. 
It is important for clinicians to recognize that even though some women may experience impaired bonding related to a history of miscarriage, the majority of women form a healthy bond with their infant despite this history. PMID:25028056

  17. The search for sterile neutrinos with SOX-Borexino

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Altenmüller, K., E-mail: konrad.altenmueller@ph.tum.de; Agostini, M.; Appel, S.

    2016-12-15

The aim of the SOX-Borexino project is to verify or falsify the existence of eV-scale sterile neutrinos. The existence of sterile neutrinos is suspected because of several anomalies, which were observed in previous experiments. A ~3.7 PBq electron antineutrino source made of 144Ce will be installed below the Borexino detector at LNGS, Italy, to search for short-baseline oscillations of active-to-sterile neutrinos within the detector volume. Source delivery and the start of data acquisition are planned for the end of 2016; preliminary results are expected as early as 2017.

  18. The search for sterile neutrinos with SOX-Borexino

    NASA Astrophysics Data System (ADS)

    Altenmüller, K.; Agostini, M.; Appel, S.; Bellini, G.; Benziger, J.; Berton, N.; Bick, D.; Bonfini, G.; Bravo, D.; Caccianiga, B.; Calaprice, F.; Caminata, A.; Cavalcante, P.; Chepurnov, A.; Cribier, M.; D'Angelo, D.; Davini, S.; Derbin, A.; di Noto, L.; Drachnev, I.; Durero, M.; Empl, A.; Etenko, A.; Farinon, S.; Fischer, V.; Fomenko, K.; Franco, D.; Gabriele, F.; Gaffiot, J.; Galbiati, C.; Ghiano, C.; Giammarchi, M.; Göger-Neff, M.; Goretti, A.; Gromov, M.; Hagner, C.; Houdy, Th.; Hungerford, E.; Ianni, Aldo; Ianni, Andrea; Jonqures, N.; Kaiser, M.; Kobychev, V.; Korablev, D.; Korga, G.; Kryn, D.; Lachenmaier, T.; Lasserre, T.; Laubenstein, M.; Lehnert, B.; Link, J.; Litvinovich, E.; Lombardi, F.; Lombardi, P.; Ludhova, L.; Lukyanchenko, G.; Machulin, I.; Maneschg, W.; Marcocci, S.; Maricic, J.; Mention, G.; Meroni, E.; Meyer, M.; Miramonti, L.; Misiaszek, M.; Montuschi, M.; Muratova, V.; Musenich, R.; Neumair, B.; Oberauer, L.; Obolensky, M.; Ortica, F.; Pagani, L.; Pallavicini, M.; Papp, L.; Perasso, L.; Pocar, A.; Ranucci, G.; Razeto, A.; Re, A.; Roncin, R.; Romani, A.; Rossi, N.; Schönert, S.; Scola, L.; Semenov, D.; Simgen, H.; Skorokhvatov, M.; Smirnov, O.; Sotnikov, A.; Sukhotin, S.; Suvorov, Y.; Tartaglia, R.; Testera, G.; Toropova, M.; Unzhakov, E.; Veyssière, C.; Vivier, M.; Vogelaar, R. B.; von Feilitzsch, F.; Wang, H.; Winter, J.; Wojcik, M.; Wurm, M.; Yokley, Z.; Zaimidoroga, O.; Zavatarelli, S.; Zuber, K.; Zuzel, G.

    2016-12-01

The aim of the SOX-Borexino project is to verify or falsify the existence of eV-scale sterile neutrinos. The existence of sterile neutrinos is suspected because of several anomalies, which were observed in previous experiments. A 3.7 PBq electron antineutrino source made of 144Ce will be installed below the Borexino detector at LNGS, Italy, to search for short-baseline oscillations of active-to-sterile neutrinos within the detector volume. Source delivery and the start of data acquisition are planned for the end of 2016; preliminary results are expected as early as 2017.
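The disappearance search described in this abstract rests on the standard two-flavor oscillation survival probability. A minimal sketch follows; the formula is textbook neutrino physics rather than something stated in the abstract, and the parameter values are purely illustrative:

```python
import math

def survival_probability(sin2_2theta, dm2_eV2, L_m, E_MeV):
    """Two-flavor electron-antineutrino survival probability:
    P = 1 - sin^2(2*theta) * sin^2(1.27 * dm2 * L / E),
    with dm2 in eV^2, L in metres and E in MeV."""
    return 1.0 - sin2_2theta * math.sin(1.27 * dm2_eV2 * L_m / E_MeV) ** 2

# Illustrative values only: an eV^2-scale mass splitting probed over a
# few-metre baseline inside the detector volume, at a typical ~2 MeV
# antineutrino energy from the 144Ce-144Pr decay chain.
p = survival_probability(sin2_2theta=0.1, dm2_eV2=1.0, L_m=6.0, E_MeV=2.0)
```

An eV²-scale Δm² makes the oscillation length comparable to the metre-scale baselines inside Borexino, which is why a deficit pattern would be visible within a single detector.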

  19. The Early Mother-to-Child Bond and Its Unique Prospective Contribution to Child Behavior Evaluated by Mothers and Teachers.

    PubMed

    Fuchs, Anna; Möhler, Eva; Reck, Corinna; Resch, Franz; Kaess, Michael

    Maternal bonding has been described as the quality of the affective tie from a mother to her infant. This early bond's mental components and its longitudinal impact on child outcome have been markedly understudied. Although most researchers assume impaired maternal bonding to have a negative impact on child development, there is a lack of prospective studies evaluating this hypothesis. Since maternal mental health problems may negatively affect both bonding quality and child development, it is still to be determined whether there is a unique contribution of bonding quality to child behavior problems over and above maternal psychopathology. We examined a community sample of 101 mother-child dyads at the child's age of 2 weeks (t1) and 6 weeks (t2), 4 months (t3), 14 months (t4), and 5.5 years (t5). Maternal bonding and psychopathology were assessed at time points t1-t4 using the Postpartum Bonding Questionnaire (PBQ-16) and the Symptom Checklist Revised (SCL 90-R). Child behavior problems were rated in a multi-informant design by mothers and teachers at t5 using the Strengths and Difficulties Questionnaire (SDQ). In the case of maternal judgment of child behavior problems, bonding at 14 months (t4) proved to be a significant predictor (β = 0.30; p = 0.011). Teacher-rated child behavior problems were significantly predicted by maternal bonding at 2 weeks (t1; β = 0.48; p = 0.025). Our results indicate a prospective influence of the early mother-infant bond on child development and underline the unique contribution of bonding quality to child behavior problems over and above the impact of maternal psychopathology in a community sample. © 2016 S. Karger AG, Basel.

  20. Maximal design basis accident of fusion neutron source DEMO-TIN

    NASA Astrophysics Data System (ADS)

    Kolbasov, B. N.

    2015-12-01

    When analyzing the safety of nuclear (including fusion) facilities, the maximal design basis accident at which the largest release of activity is expected must certainly be considered. Such an accident is usually the failure of cooling systems of the most thermally stressed components of a reactor (for a fusion facility, it is the divertor or the first wall). The analysis of safety of the ITER reactor and fusion power facilities (including hybrid fission-fusion facilities) shows that the initial event of such a design basis accident is a large-scale break of a pipe in the cooling system of divertor or the first wall outside the vacuum vessel of the facility. The greatest concern is caused by the possibility of hydrogen formation and the inrush of air into the vacuum chamber (VC) with the formation of a detonating mixture and a subsequent detonation explosion. To prevent such an explosion, the emergency forced termination of the fusion reaction, the mounting of shutoff valves in the cooling systems of the divertor and the first wall or blanket for reducing to a minimum the amount of water and air rushing into the VC, the injection of nitrogen or inert gas into the VC for decreasing the hydrogen and oxygen concentration, and other measures are recommended. Owing to a continuous feed-out of the molten-salt fuel mixture from the DEMO-TIN blanket with the removal period of 10 days, the radioactivity release at the accident will mainly be determined by tritium (up to 360 PBq). The activity of fission products in the facility will be up to 50 PBq.

  1. Nitroxyl inhibits overt pain-like behavior in mice: role of cGMP/PKG/ATP-sensitive potassium channel signaling pathway

    PubMed Central

    Staurengo-Ferrari, Larissa; Zarpelon, Ana C.; Longhi-Balbinot, Daniela T.; Marchesi, Mario; Cunha, Thiago M.; Alves-Filho, José C.; Cunha, Fernando Q.; Ferreira, Sergio H.; Casagrande, Rubia; Miranda, Katrina M.; Verri, Waldiceu A.

    2014-01-01

Background Several lines of evidence have indicated that nitric oxide (NO) plays complex and diverse roles in modulation of pain/analgesia. However, the roles of charged and uncharged congeners of NO are less well understood. In the present study, the antinociceptive effect of the nitroxyl (HNO) donor Angeli's salt (Na2N2O3; AS) was investigated in models of overt pain-like behavior. Moreover, whether the antinociceptive effect of nitroxyl was dependent on the activation of cGMP (cyclic guanosine monophosphate)/PKG (protein kinase G)/ATP-sensitive potassium channels was addressed. Methods The antinociceptive effect of AS was evaluated on phenyl-p-benzoquinone (PBQ)- and acetic acid-induced writhings and via the formalin test. In addition, pharmacological treatments targeting guanylate cyclase (ODQ), PKG (KT5823) and ATP-sensitive potassium channels (glybenclamide) were used. Results PBQ and acetic acid induced significant writhing responses over 20 min. The nociceptive responses in these models were significantly reduced in a dose-dependent manner by subcutaneous pre-treatment with AS. Furthermore, AS also inhibited both phases of the formalin test. Subsequently, the inhibitory effects of AS on writhing and flinching responses were prevented by ODQ, KT5823 and glybenclamide, although these inhibitors alone did not alter the writhing score. Furthermore, pretreatment with L-cysteine, an HNO scavenger, confirmed that the antinociceptive effect of AS depends on HNO. Conclusion The present study demonstrates the efficacy of a nitroxyl donor and its analgesic mechanisms in overt pain-like behavior by activating the cGMP/PKG/ATP-sensitive potassium channel (K+) signaling pathway. PMID:24948073

  2. New cytotoxic diterpenylnaphthohydroquinone derivatives obtained from a natural diterpenoid.

    PubMed

Miguel Del Corral, José M; Castro, M Angeles; Lucena Rodríguez, M; Chamorro, Pablo; Cuevas, Carmen; San Feliciano, Arturo

    2007-09-01

Diterpenylquinone/hydroquinone derivatives were prepared through Diels-Alder cycloaddition between natural myrcecommunic acid or its methyl ester and p-benzoquinone (p-BQ), using BF3·Et2O as catalyst or under microwave (MW) irradiation. Acetyl, methyl and benzyl derivatives of several diterpenylnaphthohydroquinones were prepared from the cycloadducts following two basic synthetic strategies: either protection before aromatisation or vice versa. Some of them were further functionalised at the B-ring of the decalin core. Most of the new compounds were evaluated, and some proved cytotoxic against several tumour cell lines, with IC50 values below the micromolar level.

  3. 137Cs and 134Cs activity in the North Pacific Ocean water from 1945 to 2020 by eddy-resolving ROMS

    NASA Astrophysics Data System (ADS)

    Tsubono, Takaki; Misumi, Kazuo; Tsumune, Daisuke; Aoyama, Michio; Hirose, Katsumi

    2017-04-01

We conducted a simulation of 137Cs activity in the North Pacific Ocean (NPO) water from 1945 to 2020, before and after the Fukushima Dai-ichi Nuclear Power Plant (1F NPP) accident. Using the Regional Ocean Model System (ROMS) with high resolution (1/12°-1/4° in horizontal, 45 levels in vertical), whose domain was the NPO, we preliminarily estimated a factor multiplying the total 134Cs fluxes, which had been estimated for the atmospheric deposition and the direct discharge from the accident. Direct comparison of the observed and calculated 134Cs showed that the total 134Cs flux was 1.6 times greater than the previous estimates. We re-calculated the 134Cs activities in the NPO water using the flux multiplied by 1.6 and confirmed the improvement of the simulation by the multiplied flux, which suggested that the 134Cs and 137Cs inventories in the NPO each increased by about 16 PBq due to the accident. For the hindcast and forecast of the 137Cs activities in the NPO water, we calculated the 137Cs activity in the NPO water from 1945 to 2020 using the global fallout flux due to atmospheric nuclear weapons tests and the Chernobyl accident together with the estimated fluxes of the 1F NPP accident. For this calculation, five ensemble calculations of 137Cs activity were conducted by shifting the start period of the input flux by one year. The 137Cs activity in the surface water showed that the plume due to the 1F NPP accident, with activity higher than 5 Bq m-3 (though lower than that in 1985), was transported to the area west of 135°W in 2015. The peak year of the 137Cs activity can be estimated from the hindcast and forecast. The 137Cs activity in the surface water north of 30°N shows that the 137Cs peak in 2011 occurs up to 180°, but the peak from 2012 to 2017 is distributed from near 180° to 90°W. 
The total inventory of 137Cs in the NPO increased up to 77 PBq in 2011 and gradually decreased to 61 PBq in 2018 by transport outside of the domain, which is almost the same as that in Dec. 2010. The total amount of 137Cs in the subsurface layer (200-600 m depth) has been larger than that in the surface layer (0-200 m depth) since the 1F NPP accident, except in 2011.
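The decline from 77 PBq in 2011 to 61 PBq in 2018 can be sanity-checked against radioactive decay alone. A minimal sketch, assuming the standard half-lives of 137Cs (30.17 y) and 134Cs (2.065 y), which are nuclear-data values and not taken from the abstract:

```python
import math

# Standard half-lives in years (nuclear data, not from the abstract).
T_HALF_Y = {"Cs-137": 30.17, "Cs-134": 2.065}

def decayed(activity_pbq, nuclide, years):
    """Activity remaining after radioactive decay alone."""
    return activity_pbq * math.exp(-math.log(2.0) * years / T_HALF_Y[nuclide])

# Decay alone would take the ~77 PBq inventory of 2011 to roughly 65 PBq
# by 2018; the abstract's 61 PBq therefore also reflects transport out of
# the model domain, as the authors state.
remaining = decayed(77.0, "Cs-137", 7.0)
```

The gap between the decay-only value (~65 PBq) and the simulated 61 PBq is consistent with the abstract's attribution of the extra loss to transport outside the domain.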

  4. Exploring differences in pain beliefs within and between a large nonclinical (workplace) population and a clinical (chronic low back pain) population using the pain beliefs questionnaire.

    PubMed

    Baird, Andrew J; Haslam, Roger A

    2013-12-01

    Beliefs, cognitions, and behaviors relating to pain can be associated with a range of negative outcomes. In patients, certain beliefs are associated with increased levels of pain and related disability. There are few data, however, showing the extent to which beliefs of patients differ from those of the general population. This study explored pain beliefs in a large nonclinical population and a chronic low back pain (CLBP) sample using the Pain Beliefs Questionnaire (PBQ) to identify differences in scores and factor structures between and within the samples. This was a cross-sectional study. The samples comprised patients attending a rehabilitation program and respondents to a workplace survey. Pain beliefs were assessed using the PBQ, which incorporates 2 scales: organic and psychological. Exploratory factor analysis was used to explore variations in factor structure within and between samples. The relationship between the 2 scales also was examined. Patients reported higher organic scores and lower psychological scores than the nonclinical sample. Within the nonclinical sample, those who reported frequent pain scored higher on the organic scale than those who did not. Factor analysis showed variations in relation to the presence of pain. The relationship between scales was stronger in those not reporting frequent pain. This was a cross-sectional study; therefore, no causal inferences can be made. Patients experiencing CLBP adopt a more biomedical perspective on pain than nonpatients. The presence of pain is also associated with increased biomedical thinking in a nonclinical sample. However, the impact is not only on the strength of beliefs, but also on the relationship between elements of belief and the underlying belief structure.

  5. Impact of the Fukushima accident on tritium, radiocarbon and radiocesium levels in seawater of the western North Pacific Ocean: A comparison with pre-Fukushima situation.

    PubMed

    Povinec, P P; Liong Wee Kwong, L; Kaizer, J; Molnár, M; Nies, H; Palcsu, L; Papp, L; Pham, M K; Jean-Baptiste, P

    2017-01-01

Tritium, radiocarbon and radiocesium concentrations in water column samples in coastal waters offshore Fukushima and in the western North Pacific Ocean collected in 2011-2012 during the Ka'imikai-o-Kanaloa (KoK) cruise are compared with other published results. The highest levels in surface seawater were observed for 134Cs and 137Cs in seawater samples collected offshore Fukushima (up to 1.1 Bq L-1), which represent an increase by about three orders of magnitude when compared with the pre-Fukushima concentration. Tritium levels were much lower (up to 0.15 Bq L-1), representing an increase by about a factor of 6. The impact on the radiocarbon distribution was measurable, but the observed levels were only about 9% above the global fallout background. The 137Cs (and similarly 134Cs) inventory in the water column of the investigated western North Pacific region was (2.7 ± 0.4) PBq, while for 3H it was only (0.3 ± 0.2) PBq. Direct releases of highly contaminated water from the damaged Fukushima NPP, as well as dry and wet deposition of these radionuclides over the western North Pacific, considerably changed their distribution patterns in seawater. Presently we can distinguish Fukushima-labeled waters from the global fallout background thanks to the short-lived 134Cs. However, in the long term, once 134Cs has decayed, new distribution patterns of 3H, 14C and 137Cs should be established for future oceanographic and climate change studies in the Pacific Ocean. Copyright © 2016 Elsevier Ltd. All rights reserved.

  6. Anticholinesterase, antioxidant, analgesic and anti-inflammatory activity assessment of Xeranthemum annuum L. and isolation of two cyanogenic compounds.

    PubMed

    Orhan, Ilkay Erdogan; Gulyurdu, Fulya; Kupeli Akkol, Esra; Senol, Fatma Sezer; Arabaci Anul, Serap; Tatli, Iffet Irem

    2016-11-01

Xeranthemum annuum L. (Asteraceae) (XA) is an ornamental and medicinal species with limited bioactivity and phytochemical data. Identification of anticholinesterase, antioxidant, anti-inflammatory and analgesic effects of the flower and root-stem (R-S) extracts of XA. Anticholinesterase (at 100 μg mL-1) and antioxidant (at 1000 μg mL-1) effects of various extracts were evaluated via microtiter assays, while anti-inflammatory and analgesic effects of the R-S extracts were tested using carrageenan-induced hind paw oedema (100 and 200 mg kg-1) and p-benzoquinone (PBQ) writhing models (200 mg kg-1) in male Swiss albino mice. The R-S ethanol extract of XA was subjected to isolation studies using conventional chromatographic methods. Most of the extracts showed inhibition over 85% against butyrylcholinesterase and no inhibition towards acetylcholinesterase. The flower chloroform and the R-S ethyl acetate extracts were the most effective (97.85 ± 0.94% and 96.89 ± 1.09%, respectively). The R-S ethanol extract displayed a remarkable scavenging activity against DPPH (77.33 ± 1.99%) and in the FRAP assay, while the hexane extract of the R-S parts possessed the highest metal-chelating capacity (72.79 ± 0.33%). The chloroform extract of the R-S caused a significant analgesic effect (24.4%) in the PBQ writhing model. No anti-inflammatory effect was observed. Isolation of zierin and zierin xyloside, which were inactive in anticholinesterase assays, was achieved from the R-S ethanol extract. This is the first report of anticholinesterase, antioxidant, analgesic and anti-inflammatory activities and of the isolation of zierin and zierin xyloside from XA. Therefore, XA seems to contain antioxidant and BChE-inhibiting compounds.

  7. Maximal design basis accident of fusion neutron source DEMO-TIN

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kolbasov, B. N., E-mail: Kolbasov-BN@nrcki.ru

    2015-12-15

When analyzing the safety of nuclear (including fusion) facilities, the maximal design basis accident at which the largest release of activity is expected must certainly be considered. Such an accident is usually the failure of cooling systems of the most thermally stressed components of a reactor (for a fusion facility, it is the divertor or the first wall). The analysis of safety of the ITER reactor and fusion power facilities (including hybrid fission-fusion facilities) shows that the initial event of such a design basis accident is a large-scale break of a pipe in the cooling system of the divertor or the first wall outside the vacuum vessel of the facility. The greatest concern is caused by the possibility of hydrogen formation and the inrush of air into the vacuum chamber (VC) with the formation of a detonating mixture and a subsequent detonation explosion. To prevent such an explosion, the emergency forced termination of the fusion reaction, the mounting of shutoff valves in the cooling systems of the divertor and the first wall or blanket for reducing to a minimum the amount of water and air rushing into the VC, the injection of nitrogen or inert gas into the VC for decreasing the hydrogen and oxygen concentration, and other measures are recommended. Owing to a continuous feed-out of the molten-salt fuel mixture from the DEMO-TIN blanket with the removal period of 10 days, the radioactivity release at the accident will mainly be determined by tritium (up to 360 PBq). The activity of fission products in the facility will be up to 50 PBq.

  8. Spectroscopic evidence for temperature-dependent convergence of light- and heavy-hole valence bands of PbQ (Q = Te, Se, S)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhao, J.; Malliakas, C. D.; Wijayaratne, K.

    2017-01-01

We have conducted a temperature-dependent angle-resolved photoemission spectroscopy (ARPES) study of the electronic structures of PbTe, PbSe and PbS. Our ARPES data provide direct evidence for the light-hole upper valence bands (UVBs) and hitherto undetected heavy-hole lower valence bands (LVBs) in these materials. An unusual temperature-dependent relative movement between these bands leads to a monotonic decrease in the energy separation between their maxima with increasing temperature, which is known as band convergence and has long been believed to be the driving factor behind extraordinary thermoelectric performances of these compounds at elevated temperatures.

  9. Spectroscopic evidence for temperature-dependent convergence of light- and heavy-hole valence bands of PbQ (Q = Te, Se, S)

    NASA Astrophysics Data System (ADS)

    Zhao, J.; Malliakas, C. D.; Wijayaratne, K.; Karlapati, V.; Appathurai, N.; Chung, D. Y.; Rosenkranz, S.; Kanatzidis, M. G.; Chatterjee, U.

    2017-01-01

    We have conducted a temperature-dependent angle-resolved photoemission spectroscopy (ARPES) study of the electronic structures of PbTe, PbSe and PbS. Our ARPES data provide direct evidence for the light-hole upper valence bands (UVBs) and hitherto undetected heavy-hole lower valence bands (LVBs) in these materials. An unusual temperature-dependent relative movement between these bands leads to a monotonic decrease in the energy separation between their maxima with increasing temperature, which is known as band convergence and has long been believed to be the driving factor behind extraordinary thermoelectric performances of these compounds at elevated temperatures.

  10. Global Xenon-133 Emission Inventory Caused by Medical Isotope Production and Derived from the Worldwide Technetium-99m Demand

    NASA Astrophysics Data System (ADS)

    Kalinowski, Martin B.; Grosch, Martina; Hebel, Simon

    2014-03-01

Emissions from medical isotope production are the most important source of background for atmospheric radioxenon measurements, which are an essential part of nuclear explosion monitoring. This article presents a new approach for estimating the global annual radioxenon emission inventory caused by medical isotope production, using the amount of Tc-99m applications in hospitals as the basis. Tc-99m is the most commonly used isotope in radiology and dominates medical isotope production. This paper presents the first estimate of the global production of Tc-99m. Depending on the production and transport scenario, global xenon emissions of 11-45 PBq/year can be derived from the global isotope demand. The lower end of this estimate is in good agreement with other estimations that make use of reported releases and realistic process simulations, which supports the validity of the complementary assessment method proposed in this paper. It may be of relevance for future emission scenarios and for estimating the contribution to the global source term from countries and operators that do not make sufficient radioxenon release information available. The method depends on sound data on medical treatments with radiopharmaceuticals and on technical information on the production process of the supplier. This might help in understanding the apparent underestimation of the global emission inventory that has been found by atmospheric transport modelling.
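Part of the spread in the derived 11-45 PBq/year range comes from Mo-99 (the Tc-99m parent) decaying between production and hospital delivery: longer transport chains require more Mo-99, and hence more co-produced Xe-133, per unit of Tc-99m used. A hedged sketch of that scaling; the Mo-99 half-life is a standard nuclear-data value, while the transport delays below are illustrative assumptions, not figures from the article:

```python
# Hedged sketch: more transport time means more Mo-99 must exist at the
# end of processing for the same delivered activity. The transport
# delays here are illustrative assumptions only.
T_MO99_H = 65.9  # Mo-99 half-life in hours (standard value)

def mo99_needed_at_production(delivered_activity, transport_hours):
    """Mo-99 activity required at the end of processing so that
    `delivered_activity` survives decay during transport."""
    return delivered_activity * 2.0 ** (transport_hours / T_MO99_H)

# A one-day versus a 5.5-day delivery chain differs by roughly a factor 3
# in required production per unit delivered:
short_haul = mo99_needed_at_production(1.0, 24.0)   # ~1.3x
long_haul = mo99_needed_at_production(1.0, 132.0)   # ~4.0x
```

This is only the transport-scenario part of the paper's scaling; the ratio of co-produced Xe-133 to Mo-99 and the fraction of it actually released depend on the production process and are not sketched here.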

  11. Xenon-133 and caesium-137 releases into the atmosphere from the Fukushima Dai-ichi nuclear power plant: determination of the source term, atmospheric dispersion, and deposition

    NASA Astrophysics Data System (ADS)

    Stohl, A.; Seibert, P.; Wotawa, G.; Arnold, D.; Burkhart, J. F.; Eckhardt, S.; Tapia, C.; Vargas, A.; Yasunari, T. J.

    2012-04-01

This presentation will show the results of a paper currently under review in ACPD, together with additional new results including more data and an independent box-modeling approach that supports some of the findings of the ACPD paper. On 11 March 2011, an earthquake occurred about 130 km off the Pacific coast of Japan's main island Honshu, followed by a large tsunami. The resulting loss of electric power at the Fukushima Dai-ichi nuclear power plant (FD-NPP) developed into a disaster causing massive release of radioactivity into the atmosphere. In this study, we determine the emissions of two isotopes, the noble gas xenon-133 (133Xe) and the aerosol-bound caesium-137 (137Cs), which have very different release characteristics as well as behavior in the atmosphere. To determine radionuclide emissions as a function of height and time until 20 April, we made a first guess of release rates based on fuel inventories and documented accident events at the site. This first guess was subsequently improved by inverse modeling, which combined the first guess with the results of an atmospheric transport model, FLEXPART, and measurement data from several dozen stations in Japan, North America and other regions. We used atmospheric activity concentration measurements as well as, for 137Cs, measurements of bulk deposition. Regarding 133Xe, we find a total release of 16.7 (uncertainty range 13.4-20.0) EBq, which is the largest radioactive noble gas release in history not associated with nuclear bomb testing. There is strong evidence that the first strong 133Xe release started early, before active venting was performed. The entire noble gas inventory of reactor units 1-3 was set free into the atmosphere between 11 and 15 March 2011. For 137Cs, the inversion results give a total emission of 35.8 (23.3-50.1) PBq, or about 42% of the estimated Chernobyl emission. 
Our results indicate that 137Cs emissions peaked on 14-15 March but were generally high from 12 until 19 March, when they suddenly dropped by orders of magnitude exactly when spraying of water on the spent-fuel pool of unit 4 started. This indicates that emissions were not only coming from the damaged reactor cores, but also from the spent-fuel pool of unit 4 and confirms that the spraying was an effective countermeasure. We also explore the main dispersion and deposition patterns of the radioactive cloud, both regionally for Japan as well as for the entire Northern Hemisphere. While at first sight it seemed fortunate that westerly winds prevailed most of the time during the accident, a different picture emerges from our detailed analysis. Exactly during and following the period of the strongest 137Cs emissions on 14 and 15 March as well as after another period with strong emissions on 19 March, the radioactive plume was advected over Eastern Honshu Island, where precipitation deposited a large fraction of 137Cs on land surfaces. The plume was also dispersed quickly over the entire Northern Hemisphere, first reaching North America on 15 March and Europe on 22 March. In general, simulated and observed concentrations of 133Xe and 137Cs both at Japanese as well as at remote sites were in good quantitative agreement with each other. Altogether, we estimate that 6.4 PBq of 137Cs, or 19% of the total fallout until 20 April, were deposited over Japanese land areas, while most of the rest fell over the North Pacific Ocean. Only 0.7 PBq, or 2% of the total fallout were deposited on land areas other than Japan.
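The deposition percentages quoted above refer to the total fallout deposited by 20 April rather than to the total emission. A quick arithmetic check using only numbers stated in the abstract makes the partitioning explicit:

```python
# Numbers from the abstract: total Cs-137 emission 35.8 PBq; 6.4 PBq
# (19% of the fallout deposited by 20 April) fell on Japanese land and
# 0.7 PBq (2%) on other land areas.
total_emitted = 35.8   # PBq, inversion estimate
japan_land = 6.4       # PBq deposited on Japanese land
other_land = 0.7       # PBq deposited on non-Japanese land

# Back out the total deposited by 20 April from "6.4 PBq ... or 19%":
total_fallout = japan_land / 0.19                              # about 33.7 PBq
ocean_share = 1.0 - (japan_land + other_land) / total_fallout  # about 0.79
```

Roughly four fifths of the 137Cs deposited by 20 April thus fell over the ocean, consistent with the abstract's statement that most of the rest fell over the North Pacific.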

  12. Worldwide deposition of strontium-90 through 1990

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Monetti, M.A.

    1996-03-01

Strontium-90 results from the Environmental Measurements Laboratory's (EML) Global Fallout Program (GFP) are presented for the years 1987 through 1990. Quarterly 90Sr deposition results for the 66 sampling locations of EML's GFP were generally low, indicating that there was no significant release of fission products into the atmosphere during this period. The global 90Sr deposition during these 4 years was lower than it has been for any similar period since this program began in 1958. Since there was no major atmospheric source of 90Sr during this period, the global cumulative deposit of 90Sr continued to decrease by radioactive decay, reaching a 27-year low of 311.4 PBq.
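The decay-driven decline of the cumulative deposit described above follows the standard exponential decay law; a minimal sketch in Python, using the accepted 90Sr half-life of 28.8 years (the starting inventory below is illustrative, not a value from the report):

```python
import math

SR90_HALF_LIFE_YEARS = 28.8  # accepted half-life of strontium-90

def decayed_activity(initial_pbq: float, years: float) -> float:
    """Activity remaining after `years` of pure radioactive decay (no new input)."""
    decay_constant = math.log(2) / SR90_HALF_LIFE_YEARS
    return initial_pbq * math.exp(-decay_constant * years)

# Illustrative: with no fresh fallout, a 400 PBq cumulative deposit
# decays to about 363 PBq after 4 years.
print(round(decayed_activity(400.0, 4.0), 1))
```

With no atmospheric source term, this monotone decline is exactly the behavior the program observed over 1987-1990.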

  13. Chemotaxis and degradation of organophosphate compound by a novel moderately thermo-halo tolerant Pseudomonas sp. strain BUR11: evidence for possible existence of two pathways for degradation.

    PubMed

    Pailan, Santanu; Saha, Pradipta

    2015-01-01

An organophosphate (OP) degrading chemotactic bacterial strain BUR11, isolated from an agricultural field, was identified as a member of the genus Pseudomonas on the basis of its 16S rRNA gene sequence. The strain could utilize parathion, chlorpyrifos and their major hydrolytic intermediates as sole carbon sources for growth and exhibited a positive chemotactic response towards most of them. The optimum parathion concentration for growth was 200 ppm, 62% of which was degraded within 96 h at 37 °C. Growth studies indicated that the strain is moderately thermo-halo tolerant. Identification of the intermediates of parathion degradation by thin layer chromatography (TLC), high performance liquid chromatography (HPLC), gas chromatography (GC) and liquid chromatography mass spectrometry (LC-MS/MS) provided evidence for the possible existence of two pathways. The first pathway proceeds via 4-nitrophenol (4-NP), while the second proceeds through the formation of 4-aminoparathion (4-APar), 4-aminophenol (4-AP) and para-benzoquinone (PBQ). This is the first report of chemotaxis towards an organophosphate compound by a thermo-halo tolerant bacterium.
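From the quoted figures (62% of 200 ppm degraded within 96 h), a rough degradation rate can be estimated; a back-of-the-envelope sketch assuming first-order kinetics, which is an assumption for illustration and not a claim made by the authors:

```python
import math

initial_ppm = 200.0
fraction_remaining = 1.0 - 0.62   # 62% degraded within 96 h
hours = 96.0

# If degradation were first-order, C(t) = C0 * exp(-k t), so the rate
# constant and apparent half-life would be:
k = -math.log(fraction_remaining) / hours   # ~0.0101 per hour
half_life_h = math.log(2) / k               # ~68.8 h
print(round(k, 4), round(half_life_h, 1))
```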

  15. PSII as an in vivo molecular catalyst for the production of energy rich hydroquinones - A new approach in renewable energy.

    PubMed

    Das, Sai; Maiti, Soumen K

    2018-03-01

One of the pertinent issues in the field of energy science today is the quest for an abundant source of hydrogen or hydrogen equivalents. In this study, phenyl-p-benzoquinone (pPBQ) has been used to generate a molecular store of hydrogen equivalents (phenyl-p-hydroquinone; pPBQH2) from the in vivo splitting of water by photosystem II of the marine cyanobacterium Synechococcus elongatus BDU 70542. Using this technique, 10.8 μmol of pPBQH2 per mg chlorophyll a can be extracted per minute, an efficiency orders of magnitude higher than those reported in the current literature. Moreover, the photo-reduction process was stable when tested over longer periods of time. Intermittent addition of phenyl-p-benzoquinone resulted in the precipitation of phenyl-p-hydroquinone, obviating the need for costly downstream processing units for product recovery. The phenyl-p-hydroquinone so obtained is a molecular store of free energy preserved through the light-driven photolysis of water and can be used as a cheap and renewable source of hydrogen equivalents by employing transition metal catalysts or fuel cells, with the concomitant regeneration of phenyl-p-benzoquinone. The cyclic nature of this technique makes it an ideal candidate for mankind's transition from fossil fuels to solar fuels. Copyright © 2018 Elsevier B.V. All rights reserved.

  16. Cloud diagnosis impact on deposition modelling applied to the Fukushima accident

    NASA Astrophysics Data System (ADS)

    Quérel, Arnaud; Quélo, Denis; Roustan, Yelva; Mathieu, Anne

    2017-04-01

The accident at the Fukushima Daiichi Nuclear Power Plant in Japan in March 2011 resulted in the release of several hundred PBq of activity into the environment. Most of the radioactivity was released over a period of about 40 days. Radioactivity was dispersed in the atmosphere and the ocean, and traces of radionuclides were subsequently detected all over Japan. At the Fukushima airport, for instance, a deposit as large as 36 kBq/m2 of Cs-137 was measured, resulting from atmospheric deposition of the plume. Both dry and wet deposition were probably involved, since a rain event occurred on 15 March when the plume was passing nearby. The accident scenario has given rise to a number of scientific investigations. Atmospheric deposition, for example, was studied using atmospheric transport models. In such models, some parameters, such as the cloud diagnosis, are derived from meteorological data. This cloud diagnosis is a key issue for wet deposition modelling, since it distinguishes between two processes: in-cloud scavenging, the collection of radioactive particles within the cloud, and below-cloud scavenging, the removal of radioactive material by falling drops. Several parametrizations of cloud diagnosis exist in the literature, using different input data such as relative humidity and liquid water content. These diagnoses return a large range of cloud base and cloud top heights. In this study, computed cloud diagnostics are compared with observations at the Fukushima airport. Atmospheric dispersion simulations at the scale of Japan are then performed using the most reliable ones, and the impact on the results is discussed.
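Wet scavenging of the kind distinguished above (in-cloud versus below-cloud) is commonly parametrized as a scavenging coefficient Λ = A·I^B of the precipitation intensity I; a minimal sketch with illustrative coefficient values, which are placeholders and not the values used in this study:

```python
import math

def scavenging_coefficient(rain_mm_per_h: float, a: float, b: float) -> float:
    """Scavenging coefficient Lambda (1/s), using the common power law A * I**B."""
    return a * rain_mm_per_h ** b

def fraction_removed(rain_mm_per_h: float, duration_s: float, a: float, b: float) -> float:
    """Fraction of airborne activity removed after rain of the given duration."""
    lam = scavenging_coefficient(rain_mm_per_h, a, b)
    return 1.0 - math.exp(-lam * duration_s)

# Illustrative coefficient pairs (A, B) only; real schemes use different,
# model-tuned values for in-cloud and below-cloud scavenging.
IN_CLOUD = (3.5e-5, 0.79)
BELOW_CLOUD = (8.4e-5, 0.79)

# One hour of 2 mm/h rain with the illustrative in-cloud coefficients
# removes roughly 20% of the airborne activity.
print(round(fraction_removed(2.0, 3600.0, *IN_CLOUD), 3))
```

Because the cloud diagnosis decides which of the two coefficient sets applies at a given altitude, an error in the diagnosed cloud base or top height translates directly into an error in the deposited fraction.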

  17. Iodine Prophylaxis in the Case of Nuclear Accident.

    PubMed

    Zbigniew, Szybinski

    2017-01-01

On 26 April 1986 the greatest nuclear power plant accident occurred at Chernobyl, and isotopes with a high percentage of release were ejected: 133-Xe, 131-I, 132-Te, 134-Cs and 137-Cs. The radioactivity of these isotopes was very high, for instance 6500 PBq for 133-Xe and 1760 PBq for 131-I. The remaining 15 isotopes represented similar radioactivity with a lower percentage of release. The most exposed group were the 237 liquidators, and 11,600 people living in the surrounding area had to be evacuated when the limit dose for a person (5 mSv) was exceeded. Ionizing radiation at the molecular level produces high-energy radicals, water radiolysis and ionization of atoms, leading to damage of enzyme active centers, receptors, cell membranes, DNA, intracellular lysosomes and, especially important for ATP synthesis, mitochondria. This destruction leads to tissue and organ damage. The aim of this article is to present the protective properties of iodine administration in the case of a nuclear accident. In Poland at that time, effective iodine prophylaxis did not exist. In the face of such exposure, a special Government Commission was appointed. When the permitted maximal dose for children and adolescents (50 mSv) was reached in some areas of the country, the Commission decided on obligatory administration of a single pharmacological dose of potassium iodide to all children and adolescents up to age 16. No relevant recent patents were available for this WHO report. In this way, the 131-I dose to the thyroid for inhabitants of highly, moderately and weakly contaminated regions was reduced by about 45%. However, from 1987 to 1997 an increase in the prevalence of differentiated thyroid cancer was observed in the Polish adult population, especially in women over 40 years old in the southern part of Poland. Currently 185 nuclear power plants operate in European countries, and another 100 are planned by 2045. In 1999, WHO issued recommendations on iodine prophylaxis in the case of nuclear accident.
Copyright© Bentham Science Publishers; For any queries, please email at epub@benthamscience.org.

  18. [Development of whole process quality control and management system of traditional Chinese medicine decoction pieces based on traditional Chinese medicine quality tree].

    PubMed

    Yu, Wen-Kang; Dong, Ling; Pei, Wen-Xuan; Sun, Zhi-Rong; Dai, Jun-Dong; Wang, Yun

    2017-12-01

The whole-process quality control and management of traditional Chinese medicine (TCM) decoction pieces is a systems engineering task, involving the base environment, seeds and seedlings, harvesting, processing and other steps, so accurate identification of the factors in the TCM production process that may induce quality risk, together with reasonable quality control measures, is very important. At present, the concept of quality risk is mainly discussed at the level of management and regulations; there is no comprehensive analysis of the possible risks in the quality control process of TCM decoction pieces, nor a summary of effective quality control schemes. A whole-process quality control and management system for TCM decoction pieces based on the TCM quality tree was proposed in this study. This system effectively combines the process analysis method of the TCM quality tree with quality risk management, and can help managers make real-time decisions while realizing whole-process quality control of TCM. By providing a personalized web interface, the system realizes user-oriented information feedback and makes it convenient for users to predict, evaluate and control the quality of TCM. In application, the system can identify quality-related factors such as the base environment, cultivation and pieces processing, extend and modify the existing scientific workflow according to a producer's own conditions, and provide different enterprises with their own quality systems to achieve personalized service. As a new quality management model, this work can provide a reference for improving the quality of Chinese medicine production and quality standardization. Copyright© by the Chinese Pharmaceutical Association.

  19. Investigation of Main Radiation Source above Shield Plug of Unit 3 at Fukushima Daiichi Nuclear Power Station

    NASA Astrophysics Data System (ADS)

    Hiratama, Hideo; Kondo, Kenjiro; Suzuki, Seishiro; Tanimura, Yoshihiko; Iwanaga, Kohei; Nagata, Hiroshi

    2017-09-01

Pulse height distributions were measured using a CdZnTe detector inside a lead collimator to investigate the main source producing the high dose rates above the shield plug of Unit 3 at Fukushima Daiichi Nuclear Power Station. It was confirmed that low-energy photons are dominant. Concentrations of Cs-137 under the 60 cm of concrete of the shield plug were estimated to be between 8.1E+9 and 5.7E+10 Bq/cm2 from the measured peak count rate of 0.662 MeV photons. If Cs-137 were distributed on the surfaces of the gaps with a radius of 6 m at the average concentration of the 5 points, 2.6E+10 Bq/cm2, the total amount of Cs-137 is estimated to be 30 PBq.
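The quoted 30 PBq follows directly from the measured surface concentration and the gap geometry; a quick arithmetic check, assuming a single disk-shaped gap surface of 6 m radius:

```python
import math

concentration_bq_per_cm2 = 2.6e10   # average concentration over the 5 points
radius_cm = 600.0                   # 6 m radius of the shield plug gap

area_cm2 = math.pi * radius_cm ** 2            # ~1.13e6 cm2
total_bq = concentration_bq_per_cm2 * area_cm2
print(round(total_bq / 1e15, 1))               # in PBq; ~29.4, i.e. roughly 30 PBq
```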

  20. Quality data collection and management technology of aerospace complex product assembly process

    NASA Astrophysics Data System (ADS)

    Weng, Gang; Liu, Jianhua; He, Yongxi; Zhuang, Cunbo

    2017-04-01

Aiming at the problems of difficult management and poor traceability of discrete assembly process quality data, a data collection and management method is proposed that takes the assembly process and BOM as its core. The method includes a data collection approach based on workflow technology, a data model based on the BOM, and quality traceability of the assembly process. Finally, an assembly process quality data management system is developed, realizing effective control and management of quality information for the complex product assembly process.
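The BOM-centered data model described can be sketched as a tree of BOM nodes carrying quality records, with traceability as a walk over the tree; the class and field names below are hypothetical, for illustration only:

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class QualityRecord:
    """One inspection result captured during an assembly step (hypothetical fields)."""
    step_id: str
    characteristic: str
    measured_value: float
    within_spec: bool

@dataclass
class BomNode:
    """A BOM item acting as the anchor for quality data and traceability."""
    part_number: str
    serial_number: str
    children: List["BomNode"] = field(default_factory=list)
    records: List[QualityRecord] = field(default_factory=list)

    def trace(self) -> List[QualityRecord]:
        """Collect all quality records in this subassembly, enabling traceability."""
        out = list(self.records)
        for child in self.children:
            out.extend(child.trace())
        return out
```

Keying every record to a serialized BOM node is what makes the backward trace (from a finished assembly to the offending step) a single tree traversal.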

  1. Xenon-133 and caesium-137 releases into the atmosphere from the Fukushima Dai-ichi nuclear power plant: determination of the source term, atmospheric dispersion, and deposition

    NASA Astrophysics Data System (ADS)

    Stohl, A.; Seibert, P.; Wotawa, G.; Arnold, D.; Burkhart, J. F.; Eckhardt, S.; Tapia, C.; Vargas, A.; Yasunari, T. J.

    2012-03-01

    On 11 March 2011, an earthquake occurred about 130 km off the Pacific coast of Japan's main island Honshu, followed by a large tsunami. The resulting loss of electric power at the Fukushima Dai-ichi nuclear power plant developed into a disaster causing massive release of radioactivity into the atmosphere. In this study, we determine the emissions into the atmosphere of two isotopes, the noble gas xenon-133 (133Xe) and the aerosol-bound caesium-137 (137Cs), which have very different release characteristics as well as behavior in the atmosphere. To determine radionuclide emissions as a function of height and time until 20 April, we made a first guess of release rates based on fuel inventories and documented accident events at the site. This first guess was subsequently improved by inverse modeling, which combined it with the results of an atmospheric transport model, FLEXPART, and measurement data from several dozen stations in Japan, North America and other regions. We used both atmospheric activity concentration measurements as well as, for 137Cs, measurements of bulk deposition. Regarding 133Xe, we find a total release of 15.3 (uncertainty range 12.2-18.3) EBq, which is more than twice as high as the total release from Chernobyl and likely the largest radioactive noble gas release in history. The entire noble gas inventory of reactor units 1-3 was set free into the atmosphere between 11 and 15 March 2011. In fact, our release estimate is higher than the entire estimated 133Xe inventory of the Fukushima Dai-ichi nuclear power plant, which we explain with the decay of iodine-133 (half-life of 20.8 h) into 133Xe. There is strong evidence that the 133Xe release started before the first active venting was made, possibly indicating structural damage to reactor components and/or leaks due to overpressure which would have allowed early release of noble gases. 
For 137Cs, the inversion results give a total emission of 36.6 (20.1-53.1) PBq, or about 43% of the estimated Chernobyl emission. Our results indicate that 137Cs emissions peaked on 14-15 March but were generally high from 12 until 19 March, when they suddenly dropped by orders of magnitude at the time when spraying of water on the spent-fuel pool of unit 4 started. This indicates that emissions may not have originated only from the damaged reactor cores, but also from the spent-fuel pool of unit 4. This would also confirm that the spraying was an effective countermeasure. We explore the main dispersion and deposition patterns of the radioactive cloud, both regionally for Japan as well as for the entire Northern Hemisphere. While at first sight it seemed fortunate that westerly winds prevailed most of the time during the accident, a different picture emerges from our detailed analysis. Exactly during and following the period of the strongest 137Cs emissions on 14 and 15 March as well as after another period with strong emissions on 19 March, the radioactive plume was advected over Eastern Honshu Island, where precipitation deposited a large fraction of 137Cs on land surfaces. Radioactive clouds reached North America on 15 March and Europe on 22 March. By the middle of April, 133Xe was fairly uniformly distributed in the middle latitudes of the entire Northern Hemisphere and was for the first time also measured in the Southern Hemisphere (Darwin station, Australia). In general, simulated and observed concentrations of 133Xe and 137Cs both at Japanese as well as at remote sites were in good quantitative agreement. Altogether, we estimate that 6.4 PBq of 137Cs, or 18% of the total fallout until 20 April, was deposited over Japanese land areas, while most of the rest fell over the North Pacific Ocean. Only 0.7 PBq, or 1.9% of the total fallout, was deposited on land areas other than Japan.
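The inverse-modeling step, which combines a first-guess source term with transport-model sensitivities and observations, can be sketched as a regularized least-squares update. This is a simplified stand-in for the authors' method, which additionally weights by a-priori and observation uncertainties; the numbers below are synthetic:

```python
import numpy as np

def invert_emissions(M, y, x_first_guess, regularization=1.0):
    """Regularized least-squares update of a first-guess source term.

    M: (n_obs, n_sources) sensitivity matrix from the transport model
    y: (n_obs,) measured concentrations/depositions
    x_first_guess: (n_sources,) a-priori release rates
    Minimizes ||M x - y||^2 + regularization * ||x - x_first_guess||^2.
    """
    n = len(x_first_guess)
    A = M.T @ M + regularization * np.eye(n)
    b = M.T @ (y - M @ x_first_guess)
    return x_first_guess + np.linalg.solve(A, b)

# Tiny synthetic example: two release periods, three observations.
M = np.array([[1.0, 0.0], [0.5, 0.5], [0.0, 1.0]])
x_true = np.array([4.0, 1.0])
y = M @ x_true                     # perfect synthetic observations
x_a = np.array([2.0, 2.0])         # deliberately wrong first guess
print(np.round(invert_emissions(M, y, x_a, regularization=1e-6), 2))
```

With weak regularization the update recovers the true release rates; in the real inversion the regularization keeps the solution close to the inventory-based first guess wherever the observations are uninformative.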

  2. Quality management of manufacturing process based on manufacturing execution system

    NASA Astrophysics Data System (ADS)

    Zhang, Jian; Jiang, Yang; Jiang, Weizhuo

    2017-04-01

Quality control elements in the manufacturing process are elaborated, and an approach to quality management of the manufacturing process based on a manufacturing execution system (MES) is discussed. Finally, the functions of the MES for a microcircuit production line are introduced.

  3. Analysis of Bearing Errors from Acoustic Arrays

    DTIC Science & Technology

    2002-04-25


  4. Disposal of disused sealed radiation sources in Boreholes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Vicente, R.

    2007-07-01

This paper describes the concept of a geological repository for the disposal of disused sealed radiation sources (DSRS) under development at the Institute of Energy and Nuclear Research (IPEN), in Brazil. DSRS represent a significant fraction of the total activity of radioactive wastes to be managed. Most DSRS are collected and temporarily stored at IPEN. As of 2006, the total collected activity is 800 TBq in 7,508 industrial gauge or radiotherapy sources, 7.2 TBq in about 72,000 Americium-241 sources detached from lightning rods, and about 0.5 GBq in 20,857 sources from smoke detectors. The estimated inventory of sealed sources in the country is 270,000 sources with 26 PBq. The proposed repository is designed to receive the total inventory of sealed sources. A description of the pre-disposal facilities at IPEN is also presented. (authors)

  5. Quality transitivity and traceability system of herbal medicine products based on quality markers.

    PubMed

    Liu, Changxiao; Guo, De-An; Liu, Liang

    2018-05-15

Because a variety of factors affect herb quality, the existing quality management model is unable to evaluate process control. The development of the concept of the "quality marker" (Q-marker) lays the basis for establishing an independent process quality control system for herbal products. To ensure the highest degree of safety, effectiveness and quality process control of herbal products, this work aims to establish a transitivity and traceability system of quality and process control from raw materials to finished herbal products. Based on the key issues and challenges of quality assessment, the current status of quality and process controls from raw materials to herbal medicinal products listed in the Pharmacopoeia was analyzed, and research models covering the discovery and identification of Q-markers and the analysis and management of quality risk were designed. The authors introduce several technologies and methodologies, such as DNA barcoding, chromatographic technologies, fingerprint analysis, chemical markers, bio-responses, risk management and solutions for quality process control. Quality and process control models for herbal medicinal products are proposed, and a transitivity and traceability system from raw materials to finished products is constructed to improve herbal quality across the entire supply and production chain. The transitivity and traceability system has been established on the basis of quality markers, in particular on controlling the production process under Good Engineering Practices and implementing risk management for quality and process control in herbal medicine production. Copyright © 2018 Elsevier GmbH. All rights reserved.

  6. Hindcast and Forecast of 137Cs Activities in the North Pacific Ocean Waters from 1945 to 2020 by Eddy-resolving ROMS

    NASA Astrophysics Data System (ADS)

    Tsubono, T.; Misumi, K.; Tsumune, D.; Aoyama, M.; Hirose, K.

    2015-12-01

We conducted a hindcast and forecast of 137Cs activities in North Pacific waters from 1945 to 2020, before and after the Fukushima Dai-ichi Nuclear Power Plant (F1NPP) accident. We used the Regional Ocean Model System (ROMS) at high resolution (1/12°-1/4° in the horizontal, 45 vertical levels), with a domain covering the North Pacific Ocean. The model was driven by the exactly repeating "Normal Year" Coordinated Ocean Reference Experiment (CORE) forcing dataset (Large and Yeager, 2008) using bulk formulae, with the model-predicted sea surface temperature and 50-year averaged SODA data as boundary conditions. The reconstructed global fallout from atmospheric nuclear weapons tests and the Chernobyl accident was used as the atmospheric flux of 137Cs from 1945 to 2011. After the accident, atmospheric deposition and direct release of 137Cs from F1NPP were also employed as input conditions. Five ensemble calculations of 137Cs activities in seawater were conducted under different initial conditions but identical forcing. The net input of 16 PBq of 137Cs from F1NPP employed in this study corresponds to 26% of the total amount (61 PBq) of 137Cs estimated to be in the North Pacific before the F1NPP accident in 2011. Before the accident, surface 137Cs activities ranged from 0.75 to 1.7 Bq m-3. Direct comparison between simulated and observed 134Cs activities in the surface layer gave a root-mean-square error of 5.6 Bq m-3 and a correlation coefficient of 0.86, suggesting the model results were consistent with the observations. The main body of high-137Cs-activity water from F1NPP was transported south of the Subarctic Front around 42°N via the Oyashio Coastal Current, the Oyashio intrusion and the Kuroshio bifurcation, and then to the western North Pacific.
The model simulation suggests that the 137Cs activities in surface waters at P26 (P04) will increase to 4.1 Bq m-3 (4.3 Bq m-3) in 2015 (2016) and then decrease to 1.3 Bq m-3 (1.8 Bq m-3) by late 2020.

  7. Anti-p-benzoquinone antibody level as a prospective biomarker to identify smokers at risk for COPD

    PubMed Central

    Banerjee, Santanu; Bhattacharyya, Parthasarathi; Mitra, Subhra; Kundu, Somenath; Panda, Samiran; Chatterjee, Indu B

    2017-01-01

Background and objective: Identification of smokers having a predisposition to COPD is important for early intervention to reduce the huge global burden of the disease. Using a guinea pig model, we have shown that p-benzoquinone (p-BQ) derived from cigarette smoke (CS) in the lung is a causative factor for CS-induced emphysema. p-BQ is also derived from CS in smokers, and it elicits the production of anti-p-BQ antibody in humans. We therefore hypothesized that anti-p-BQ antibody might have a protective role against COPD and could be used as a predictive biomarker for COPD in smokers. The objective of this study was to compare the serum anti-p-BQ antibody level between smokers with and without COPD to evaluate this hypothesis. Methods: Serum anti-p-BQ antibody concentrations of current male smokers with (n=227) or without (n=308) COPD were measured by an indirect enzyme-linked immunosorbent assay (ELISA) developed in our laboratory. COPD was diagnosed by spirometry according to Global Initiative for Chronic Obstructive Lung Disease (GOLD) guidelines. Results and discussion: A significant difference was observed in the serum anti-p-BQ antibody level between smokers with and without COPD (Mann–Whitney U-test =4,632.5, P=0.000). Receiver operating characteristic (ROC) curve analysis indicated that the ELISA had significant precision (area under the curve [AUC] =0.934, 95% confidence interval [CI]: 0.913–0.935) for identifying smokers with COPD from their low antibody level. An antibody cutoff value of 29.4 mg/dL was constructed from the ROC coordinates to estimate the risk for COPD in smokers. While 90.3% of smokers with COPD had a low antibody value (≤29.4 mg/dL), the majority (86.4%) of smokers without COPD had a high antibody value (>29.4 mg/dL); the 13.6% of current smokers without COPD having an antibody level below this cutoff value (odds ratio [OR] =59.3, 95% CI: 34.15–101.99) were considered to be at risk for COPD.
Conclusion and future directions: Our results indicate that the serum anti-p-BQ antibody level may be used as a biomarker to identify asymptomatic smokers at risk for COPD for early intervention in the disease. PMID:28684907
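The cutoff rule described is a simple threshold classifier, and the reported odds ratio follows from the quoted proportions; a small sketch (the `at_risk` helper is an illustration, and the computed odds ratio differs slightly from the published 59.3 because it is derived from the rounded percentages in the abstract):

```python
def odds_ratio(p_exposed_cases: float, p_exposed_controls: float) -> float:
    """Odds ratio from the exposure proportions in cases and controls."""
    odds_cases = p_exposed_cases / (1.0 - p_exposed_cases)
    odds_controls = p_exposed_controls / (1.0 - p_exposed_controls)
    return odds_cases / odds_controls

CUTOFF_MG_DL = 29.4

def at_risk(antibody_mg_dl: float) -> bool:
    """Flag a smoker as at risk when the antibody level is at or below the cutoff."""
    return antibody_mg_dl <= CUTOFF_MG_DL

# Proportions with a low antibody value, as quoted in the abstract:
# 90.3% of smokers with COPD, 13.6% of smokers without COPD.
print(round(odds_ratio(0.903, 0.136), 1))   # ~59.1, close to the reported OR of 59.3
```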

  8. Application of process mining to assess the data quality of routinely collected time-based performance data sourced from electronic health records by validating process conformance.

    PubMed

    Perimal-Lewis, Lua; Teubner, David; Hakendorf, Paul; Horwood, Chris

    2016-12-01

    Effective and accurate use of routinely collected health data to produce Key Performance Indicator reporting is dependent on the underlying data quality. In this research, Process Mining methodology and tools were leveraged to assess the data quality of time-based Emergency Department data sourced from electronic health records. This research was done working closely with the domain experts to validate the process models. The hospital patient journey model was used to assess flow abnormalities which resulted from incorrect timestamp data used in time-based performance metrics. The research demonstrated process mining as a feasible methodology to assess data quality of time-based hospital performance metrics. The insight gained from this research enabled appropriate corrective actions to be put in place to address the data quality issues. © The Author(s) 2015.
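The kind of flow abnormality the authors detect, timestamps that violate the expected patient journey, can be expressed as a simple conformance check; a minimal sketch, with hypothetical stage names that are not taken from the paper:

```python
from datetime import datetime
from typing import Dict, List, Tuple

# Expected order of ED timestamps in a patient journey (hypothetical stage names):
EXPECTED_ORDER = ["arrival", "triage", "seen_by_doctor", "disposition", "departure"]

def conformance_issues(journey: Dict[str, datetime]) -> List[Tuple[str, str]]:
    """Return pairs of stages whose timestamps violate the expected order.

    A non-empty result flags records whose time-based KPIs (e.g. door-to-doctor
    time) would be computed from inconsistent data.
    """
    issues = []
    stages = [s for s in EXPECTED_ORDER if s in journey]
    for earlier, later in zip(stages, stages[1:]):
        if journey[earlier] > journey[later]:
            issues.append((earlier, later))
    return issues

record = {
    "arrival": datetime(2015, 3, 1, 10, 0),
    "triage": datetime(2015, 3, 1, 9, 55),   # entered incorrectly: before arrival
    "departure": datetime(2015, 3, 1, 14, 0),
}
print(conformance_issues(record))   # [('arrival', 'triage')]
```

Process mining tools generalize exactly this idea: they discover the journey model from the log instead of hard-coding it, then replay each record against it.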

  9. A Quality Assurance Initiative for Commercial-Scale Production in High-Throughput Cryopreservation of Blue Catfish Sperm

    PubMed Central

    Hu, E; Liao, T. W.; Tiersch, T. R.

    2013-01-01

Cryopreservation of fish sperm has been studied for decades at a laboratory (research) scale. However, high-throughput cryopreservation of fish sperm has recently been developed to enable industrial-scale production. This study treated high-throughput cryopreservation of blue catfish (Ictalurus furcatus) sperm as a manufacturing production line and initiated development of a quality assurance plan. The main objectives were to identify: 1) the main production quality characteristics; 2) the process features for quality assurance; 3) the internal quality characteristics and their specification designs; 4) the quality control and process capability evaluation methods, and 5) the directions for further improvements and applications. The essential product quality characteristics were identified as fertility-related characteristics. Specification design, which establishes tolerance levels according to demand and process constraints, was performed based on these quality characteristics. Meanwhile, to ensure integrity throughout the process, internal quality characteristics (characteristics at each quality control point within the process) that could affect fertility-related quality characteristics were defined with specifications. Because the process features 100% inspection (quality inspection of every fish), a specific monitoring method, cumulative sum (CUSUM) control charts, was applied to each quality characteristic. An index of overall process evaluation, process capability, was analyzed based on the in-control process and the designed specifications, further integrating the quality assurance plan. With the established quality assurance plan, the process can operate stably and the quality of products will be reliable. PMID:23872356
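The CUSUM monitoring the study applies can be illustrated with the standard tabular CUSUM recursion; a minimal sketch in Python, where the readings, target, allowance k and decision interval h are illustrative values, not the study's:

```python
def cusum(values, target, k, h):
    """Tabular CUSUM: returns (upper_sums, lower_sums, alarm_indices).

    k is the allowance (slack) and h the decision interval, both in the
    units of the monitored quality characteristic.
    """
    hi = lo = 0.0
    highs, lows, alarms = [], [], []
    for i, x in enumerate(values):
        hi = max(0.0, hi + (x - target) - k)   # accumulates upward drift
        lo = max(0.0, lo + (target - x) - k)   # accumulates downward drift
        highs.append(hi)
        lows.append(lo)
        if hi > h or lo > h:
            alarms.append(i)
    return highs, lows, alarms

# Illustrative quality readings (%) drifting below a target of 80:
readings = [81, 80, 79, 78, 76, 74, 73]
_, lows, alarms = cusum(readings, target=80.0, k=0.5, h=4.0)
print(alarms)   # [4, 5, 6]: the downward drift trips the chart at sample 4
```

Unlike a Shewhart chart, the CUSUM accumulates small deviations, which is why it suits the 100% inspection regime described: a slow drift in a fertility-related characteristic raises an alarm before any single reading looks extreme.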

  10. [Practice report: the process-based indicator dashboard. Visualising quality assurance results in standardised processes].

    PubMed

    Petzold, Thomas; Hertzschuch, Diana; Elchlep, Frank; Eberlein-Gonska, Maria

    2014-01-01

Process management (PM) is a valuable method for the systematic analysis and structural optimisation of the quality and safety of clinical treatment. PM requires high motivation and willingness to implement change from both employees and management. The definition of quality indicators is required to systematically measure the quality of the specified processes. One way to represent comparable quality results is to use the quality indicators of external quality assurance in accordance with Sect. 137 SGB V, a method which the Federal Joint Committee (GBA) and the institutions it commissions have employed and consistently enhanced for more than ten years. Information on the quality of inpatient treatment is available for 30 defined subjects throughout Germany. Combining specified processes with quality indicators is beneficial for informing employees: a process-based indicator dashboard provides essential information about the treatment process, which can be used for process analysis. With continuous consideration of these indicator results, deviations can be detected and errors remedied quickly. If due consideration is given to these indicators, they can also be used for benchmarking to identify potential process improvements. Copyright © 2014. Published by Elsevier GmbH.

  11. The experience factory: Can it make you a 5? or what is its relationship to other quality and improvement concepts?

    NASA Technical Reports Server (NTRS)

    Basili, Victor R.

    1992-01-01

The concepts of quality improvement have permeated many businesses. It is clear that the nineties will be the quality era for software, and there is a growing need to develop or adapt quality improvement approaches to the software business. Thus we must understand software as an artifact and software as a business. Since the business we are dealing with is software, we must understand the nature of software and software development. The software discipline is evolutionary and experimental; it is a laboratory science. Software is development, not production. The technologies of the discipline are human-based. There is a lack of models that allow us to reason about the process and the product. All software is not the same: process is a variable, goals are variable, etc. Packaged, reusable experiences require additional resources in the form of organization, processes, people, etc. A variety of organizational frameworks have been proposed to improve quality for various businesses. The ones discussed in this presentation include: Plan-Do-Check-Act, a quality improvement process based upon a feedback cycle for optimizing a single process model/production line; the Experience Factory/Quality Improvement Paradigm, continuous improvement through the experimentation, packaging, and reuse of experiences based upon a business's needs; Total Quality Management, a management approach to long-term success through customer satisfaction based on the participation of all members of an organization; the SEI Capability Maturity Model, a staged process improvement based upon assessment with regard to a set of key process areas until you reach level 5, which represents continuous process improvement; and Lean (software) Development, a principle supporting the concentration of production on 'value-added' activities and the elimination or reduction of 'non-value-added' activities.

  12. [Discussion on research thinking of traditional Chinese medicine standardization system based on whole process quality control].

    PubMed

    Dong, Ling; Sun, Yu; Pei, Wen-Xuan; Dai, Jun-Dong; Wang, Zi-Yu; Pan, Meng; Chen, Jiang-Peng; Wang, Yun

    2017-12-01

    The concept of "quality by design" indicates that good design of the whole life cycle of pharmaceutical production enables the drug to meet the expected quality requirements. Aiming at the existing problems of the traditional Chinese medicine (TCM) industry, a TCM standardization system was put forward in this paper at the national strategic level, under the guidance of quality-control ideas from the international manufacturing industry and with consideration of the TCM industry's own characteristics and development status. The core of this strategy is to establish five interrelated systems: a multi-indicator system based on the tri-indicator system, a quality standard and specification system for TCM herbal materials and decoction pieces, a quality traceability system, a data monitoring system based on whole-process quality control, and a whole-process quality management system for TCM, and to achieve systematic and scientific study of the whole process in the TCM industry through a "top-level design, implement in steps, system integration" workflow. This article analyzed the correlation between the quality standards of all links, established standard operating procedures for each link and for the whole process, and constructed a high-standard overall quality management system for TCM industry chains, in order to provide a demonstration for the establishment of a TCM whole-process quality control system and a systematic reference and basis for the standardization strategy in the TCM industry. Copyright© by the Chinese Pharmaceutical Association.

  13. Quality of nursing documentation: Paper-based health records versus electronic-based health records.

    PubMed

    Akhu-Zaheya, Laila; Al-Maaitah, Rowaida; Bany Hani, Salam

    2018-02-01

    To assess and compare the quality of paper-based and electronic-based health records. The comparison examined three criteria: content, documentation process and structure. Nursing documentation is a significant indicator of the quality of patient care delivery. It can be either paper-based or organised within the system known as the electronic health record. Nursing documentation must be completed to the highest standards to ensure the safety and quality of healthcare services. However, the evidence is not clear on which of the two forms of documentation (paper-based versus electronic health records) is of higher quality. A retrospective, descriptive, comparative design was used to address the study's purposes. A convenience sample of patients' records from two public hospitals was audited using the Cat-ch-Ing audit instrument. The sample consisted of 434 records for both paper-based and electronic health records from medical and surgical wards. Electronic health records were better than paper-based health records in terms of process and structure. In terms of quantity and quality of content, paper-based records were better than electronic health records. The study affirmed the poor quality of nursing documentation and the lack of nurses' knowledge and skills in the nursing process and its application in both paper-based and electronic-based systems. Both forms of documentation revealed drawbacks in terms of content, process and structure. This study provides important information, which can guide policymakers and administrators in identifying effective strategies aimed at enhancing the quality of nursing documentation. 
Policies and actions to ensure quality nursing documentation at the national level should focus on improving nursing knowledge, competencies, practice in nursing process, enhancing the work environment and nursing workload, as well as strengthening the capacity building of nursing practice to improve the quality of nursing care and patients' outcomes. © 2017 John Wiley & Sons Ltd.

  14. [Process orientation as a tool of strategic approaches to corporate governance and integrated management systems].

    PubMed

    Sens, Brigitte

    2010-01-01

    The concept of general process orientation as an instrument of organisation development is the core principle of quality management philosophy, i.e. the learning organisation. Accordingly, prestigious quality awards and certification systems focus on process configuration and continual improvement. In German health care organisations, particularly in hospitals, this general process orientation has not been widely implemented yet - despite enormous change dynamics and the requirements of both quality and economic efficiency of health care processes. But based on a consistent process architecture that considers key processes as well as management and support processes, the strategy of excellent health service provision including quality, safety and transparency can be realised in daily operative work. The core elements of quality (e.g., evidence-based medicine), patient safety and risk management, environmental management, health and safety at work can be embedded in daily health care processes as an integrated management system (the "all in one system" principle). Sustainable advantages and benefits for patients, staff, and the organisation will result: stable, high-quality, efficient, and indicator-based health care processes. Hospitals with their broad variety of complex health care procedures should now exploit the full potential of total process orientation. Copyright © 2010. Published by Elsevier GmbH.

  15. A risk-based auditing process for pharmaceutical manufacturers.

    PubMed

    Vargo, Susan; Dana, Bob; Rangavajhula, Vijaya; Rönninger, Stephan

    2014-01-01

    The purpose of this article is to share ideas on developing a risk-based model for the scheduling of audits (both internal and external). Audits are a key element of a manufacturer's quality system and provide an independent means of evaluating the manufacturer's or the supplier/vendor's compliance status. Pharmaceutical manufacturers are required to establish and implement a quality system: an organizational structure defining the responsibilities, procedures, processes, and resources that the manufacturer has established to ensure quality throughout the manufacturing process. Audits are a component of the manufacturer's quality system and provide a systematic and independent means of evaluating the overall quality system and compliance status. Audits are performed at defined intervals for a specified duration, focus on key areas within the quality system, and may not cover all relevant areas during each audit. In this article, the authors suggest risk-based scheduling approaches to aid pharmaceutical manufacturers in identifying the key focus areas for an audit.
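    The scheduling logic described in such risk-based approaches can be sketched as a simple scoring model. Everything below is illustrative rather than taken from the article: the factor names, weights, and interval cut-offs are invented.

```python
# Hypothetical risk-based audit scheduling sketch: each site gets a risk
# score from weighted factors, and the score maps to a re-audit interval.
# Factor names, weights, and thresholds are illustrative only.

def risk_score(criticality, compliance_history, complexity, weights=(0.5, 0.3, 0.2)):
    """Weighted sum of factor ratings, each rated 1 (low) to 5 (high)."""
    factors = (criticality, compliance_history, complexity)
    if not all(1 <= f <= 5 for f in factors):
        raise ValueError("factor ratings must be in 1..5")
    return sum(w * f for w, f in zip(weights, factors))

def audit_interval_months(score):
    """Map a risk score (1..5) to an audit interval in months."""
    if score >= 4.0:      # high risk: audit annually
        return 12
    elif score >= 2.5:    # medium risk: every two years
        return 24
    else:                 # low risk: every three years
        return 36

site = {"criticality": 5, "compliance_history": 4, "complexity": 3}
score = risk_score(**site)
print(score, audit_interval_months(score))
```

    A real scheme would also feed audit findings back into the next cycle's ratings, which is the "risk-based" part of the loop.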

  16. Mother-infant bonding impairment across the first six months postpartum: The primacy of psychopathology in women with childhood abuse and neglect histories

    PubMed Central

    Muzik, Maria; Bocknek, Erika London; Broderick, Amanda; Richardson, Patricia; Rosenblum, Katherine L.; Thelen, Kelsie; Seng, Julia S.

    2014-01-01

    Purpose Our goal was to examine the trajectory of bonding impairment across the first 6 months postpartum in the contexts of maternal risk, including maternal history of childhood abuse and neglect and postpartum psychopathology, and to test the association between self-reported bonding impairment and observed positive parenting behaviors. Method In a sample of women with childhood abuse and neglect (CA) histories (CA+, n=97) and a healthy control comparison group (CA-, n=53), participants completed questionnaires related to bonding with their infant at 6 weeks, 4 months, and 6 months postpartum, and postpartum psychopathology at 6 months postpartum. In addition, during a home visit at 6 months postpartum, mothers and infants participated in a dyadic play interaction subsequently coded for positive parenting behaviors by blinded coders. Results We found that all women, independent of risk status, increased in bonding to their infant over the first 6 months postpartum; however, women with postpartum psychopathology (depression and PTSD) showed consistently greater bonding impairment scores at all time points. Moreover, we found that at the 6-month assessment bonding impairment and observed parenting behaviors were significantly associated. Conclusion These results highlight the adverse effects of maternal postpartum depression and PTSD on mother-infant bonding in the early postpartum period in women with child abuse and neglect histories. These findings also shed light on the critical need for early detection and effective treatment of postpartum mental illness in order to prevent problematic parenting and the development of disturbed mother-infant relationships. Results support the use of the Postpartum Bonding Questionnaire (PBQ) as a tool to assess parenting quality, given its demonstrated association with observed parenting behaviors. PMID:23064898

  17. The Personality Inventory for DSM-5 Short Form (PID-5-SF): psychometric properties and association with big five traits and pathological beliefs in a Norwegian population.

    PubMed

    Thimm, Jens C; Jordan, Stian; Bach, Bo

    2016-12-07

    With the publication of the fifth edition of the Diagnostic and Statistical Manual of Mental Disorders (DSM-5), an alternative model for personality disorders based on personality dysfunction and pathological personality traits was introduced. The Personality Inventory for DSM-5 (PID-5) is a 220-item self-report inventory designed to assess the personality traits of this model. Recently, a short 100-item version of the PID-5 (PID-5-SF) has been developed. The aim of this study was to investigate the score reliability and structure of the Norwegian PID-5-SF. Further, criterion validity with the five factor model of personality (FFM) and pathological personality beliefs was examined. A derivation sample of university students (N = 503) completed the PID-5, the Big Five Inventory (BFI), and the Personality Beliefs Questionnaire - Short Form (PBQ-SF), whereas a replication sample of 127 students completed the PID-5-SF along with the aforementioned measures. The short PID-5 showed overall good score reliability and structural validity. The associations with FFM traits and pathological personality beliefs were conceptually coherent and similar for the two forms of the PID-5. The results suggest that the Norwegian PID-5 short form is a reliable and efficient measure of the trait criterion of the alternative model for personality disorders in DSM-5.
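    Score reliability of the kind reported here is conventionally summarized with Cronbach's alpha. A minimal sketch follows; the formula is standard, but the item data are made up and nothing below comes from the study.

```python
# Cronbach's alpha: alpha = k/(k-1) * (1 - sum(item variances) / var(total)).
# The item responses below are invented for illustration.
from statistics import pvariance

def cronbach_alpha(items):
    """items: list of per-item score lists, one inner list per item."""
    k = len(items)
    totals = [sum(scores) for scores in zip(*items)]  # total score per respondent
    item_var = sum(pvariance(scores) for scores in items)
    return k / (k - 1) * (1 - item_var / pvariance(totals))

# three items answered by five respondents
items = [
    [3, 4, 3, 5, 4],
    [3, 5, 2, 5, 4],
    [2, 4, 3, 4, 5],
]
print(round(cronbach_alpha(items), 3))  # 0.862
```

    Values around 0.8 or higher are usually read as good internal consistency for a scale of this kind.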

  18. Quality Assessment of TPB-Based Questionnaires: A Systematic Review

    PubMed Central

    Oluka, Obiageli Crystal; Nie, Shaofa; Sun, Yi

    2014-01-01

    Objective This review is aimed at assessing the quality of questionnaires and their development process based on the theory of planned behavior (TPB) change model. Methods A systematic literature search for studies with the primary aim of TPB-based questionnaire development was conducted in relevant databases between 2002 and 2012 using selected search terms. Ten of 1,034 screened abstracts met the inclusion criteria and were assessed for methodological quality using two different appraisal tools: one for the overall methodological quality of each study and the other developed for the appraisal of the questionnaire content and development process. Both appraisal tools consisted of items regarding the likelihood of bias in each study and were eventually combined to give the overall quality score for each included study. Results 8 of the 10 included studies showed low risk of bias in the overall quality assessment of each study, while 9 of the studies were of high quality based on the quality appraisal of questionnaire content and development process. Conclusion Quality appraisal of the questionnaires in the 10 reviewed studies was successfully conducted, highlighting the top problem areas (including: sample size estimation; inclusion of direct and indirect measures; and inclusion of questions on demographics) in the development of TPB-based questionnaires and the need for researchers to provide a more detailed account of their development process. PMID:24722323

  19. [The role of playful interactions in the development of the early mother-child relationship--factors of risk and protection].

    PubMed

    Eigner, Bernadett

    2015-01-01

    The early mother-child relationship takes shape and evolves through the series of everyday interactions. The research focused on risk factors present at the very beginning that influence the quality of the early mother-child relationship and later mother-infant interactions, and on the vulnerability of the early relationship to disorders. Fifty first-time mothers were examined: maternal, infant, and interactional factors were assessed in the days right after birth, at one month, and again at four and a half months of age. Parental stress was assessed with the long form of the PSI (Parenting Stress Index), maternal anxiety with the STAI-Y (State-Trait Anxiety Inventory), and depressive features with the EPDS, the Edinburgh Postnatal Depression Scale. The PBQ, the Postpartum Bonding Questionnaire, revealed the quality of the mother's infant-focused emotions and behaviour. Interactions at four and a half months were observed during 'face-to-face' free play and analysed with a coding system of our own. We found correlations between the initial risk factors and features of maternal interactional style. The indexes of postnatal depression and of anxiety also correlated with the indexes of the mother's attachment to her child and with parental stress. Clear correlations were found between playfulness, the risk factors, and the quality of the interactions, and the mother's interactional style and the infant's interactional strategies likewise showed correlated patterns. Pregnancy history and perinatal events had predictive value for the relationship between the four-and-a-half-month-old infant and his or her mother. 
    These findings can contribute to the prevention and early intervention of early mother-child interaction problems, and to the recognition of potential strengths and weaknesses, in order to prevent later negative outcomes (attachment, cognitive and social-emotional development, emotional and behavioural disorders).

  20. Southward spreading of the Fukushima-derived radiocesium across the Kuroshio Extension in the North Pacific

    PubMed Central

    Kumamoto, Yuichiro; Aoyama, Michio; Hamajima, Yasunori; Aono, Tatsuo; Kouketsu, Shinya; Murata, Akihiko; Kawano, Takeshi

    2014-01-01

    The accident of the Fukushima Dai-ichi nuclear power plant in March 2011 released a large amount of radiocesium into the North Pacific Ocean. Vertical distributions of Fukushima-derived radiocesium were measured at stations along the 149°E meridian in the western North Pacific during the winter of 2012. In the subtropical region, to the south of the Kuroshio Extension, we found a subsurface radiocesium maximum at a depth of about 300 m. It is concluded that atmospheric-deposited radiocesium south of the Kuroshio Extension just after the accident had been transported not only eastward along with surface currents but also southward due to formation/subduction of subtropical mode waters within about 10 months after the accident. The total amount of decay-corrected 134Cs in the mode water was estimated at about 6 PBq, corresponding to 10–60% of the total inventory of Fukushima-derived 134Cs in the North Pacific Ocean. PMID:24589762
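    The phrase "decay-corrected 134Cs" refers to back-correcting measured activities to the release date. A minimal sketch follows, using the physical half-life of 134Cs (about 2.065 years); the measured activity value is invented.

```python
# Decay correction back to the release date: A0 = A * 2^(t / T_half).
# The 134Cs half-life (~2.065 y) is physical data; the activity is made up.
import math

T_HALF_134CS = 2.065  # years

def decay_correct(measured_bq, elapsed_years, t_half=T_HALF_134CS):
    """Back-correct a measured activity to the release date."""
    return measured_bq * 2 ** (elapsed_years / t_half)

# e.g. an activity measured 10 months (~0.83 yr) after the release
measured = 1.0  # PBq, hypothetical
print(round(decay_correct(measured, 10 / 12), 3))
```

    Over the roughly 10-month gap between release and sampling, about a quarter of the 134Cs had already decayed, which is why inventories are reported decay-corrected.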

  1. Can Institutions Have Quality Programming without Utilizing a Systematic Outcomes-Based Assessment Process?

    ERIC Educational Resources Information Center

    Weiner, Lauren; Bresciani, Marilee J.

    2011-01-01

    The researchers explored whether implementation of a systematic outcomes-based assessment process is necessary for demonstrating quality in service learning programs at a two-year and a four-year institution. The findings revealed that Western Community College and the University of the Coast maintained quality service-learning programs, which met…

  2. [Feedforward control strategy and its application in quality improvement of ethanol precipitation process of danhong injection].

    PubMed

    Yan, Bin-Jun; Guo, Zheng-Tai; Qu, Hai-Bin; Zhao, Bu-Chang; Zhao, Tao

    2013-06-01

    In this work, a feedforward control strategy based on the concept of quality by design was established for the manufacturing process of traditional Chinese medicine, to reduce the impact of raw-material quality variation on the drug. The ethanol precipitation process of Danhong injection was taken as an application case of the method. A Box-Behnken design of experiments was conducted. Mathematical models relating the attributes of the concentrate, the process parameters, and the quality of the supernatants produced were established. An optimization model was then built to calculate the best process parameters from the attributes of the concentrate. The quality of the supernatants produced by ethanol precipitation with optimized and non-optimized process parameters was compared. The results showed that using the feedforward control strategy for process parameter optimization can control the quality of the supernatants effectively. The proposed feedforward control strategy can enhance the batch-to-batch consistency of the supernatants produced by ethanol precipitation.
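    The feedforward idea, choosing process parameters from measured raw-material attributes via a fitted model, can be sketched as below. The response-surface model and all numbers are invented stand-ins, not the paper's fitted model.

```python
# Illustrative feedforward-control sketch: given a measured attribute of
# the concentrate, search candidate process parameters for the setting a
# fitted model predicts is best. The model and numbers are invented.

def predicted_quality(density, ethanol_ratio):
    """Stand-in fitted model: supernatant quality as a function of a
    concentrate attribute (density) and a process parameter."""
    return -(ethanol_ratio - (2.0 + 0.5 * density)) ** 2 + 1.0

def best_parameter(density, candidates):
    """Feedforward step: given the measured attribute, pick the
    candidate process parameter with the highest predicted quality."""
    return max(candidates, key=lambda r: predicted_quality(density, r))

candidates = [round(1.0 + 0.1 * i, 1) for i in range(31)]  # 1.0 .. 4.0
print(best_parameter(1.2, candidates))  # 2.6
```

    The key point is that the parameter is chosen per batch from the measured input, before processing, rather than corrected after the fact.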

  3. An official American thoracic society workshop report: developing performance measures from clinical practice guidelines.

    PubMed

    Kahn, Jeremy M; Gould, Michael K; Krishnan, Jerry A; Wilson, Kevin C; Au, David H; Cooke, Colin R; Douglas, Ivor S; Feemster, Laura C; Mularski, Richard A; Slatore, Christopher G; Wiener, Renda Soylemez

    2014-05-01

    Many health care performance measures are either not based on high-quality clinical evidence or not tightly linked to patient-centered outcomes, limiting their usefulness in quality improvement. In this report we summarize the proceedings of an American Thoracic Society workshop convened to address this problem by reviewing current approaches to performance measure development and creating a framework for developing high-quality performance measures by basing them directly on recommendations from well-constructed clinical practice guidelines. Workshop participants concluded that ideally performance measures addressing care processes should be linked to clinical practice guidelines that explicitly rate the quality of evidence and the strength of recommendations, such as the Grading of Recommendations Assessment, Development, and Evaluation (GRADE) process. Under this framework, process-based performance measures would only be developed from strong recommendations based on high- or moderate-quality evidence. This approach would help ensure that clinical processes specified in performance measures are both of clear benefit to patients and supported by strong evidence. Although this approach may result in fewer performance measures, it would substantially increase the likelihood that quality-improvement programs based on these measures actually improve patient care.

  4. [Establishment of industry promotion technology system in Chinese medicine secondary exploitation based on "component structure theory"].

    PubMed

    Cheng, Xu-Dong; Feng, Liang; Zhang, Ming-Hua; Gu, Jun-Fei; Jia, Xiao-Bin

    2014-10-01

    The purpose of the secondary exploitation of Chinese medicine is to improve the quality of Chinese medicine products, enhance their core competitiveness, support better use in clinical practice, and more effectively relieve patient suffering. Herbal materials, extraction, separation, refining, preparation, and quality control are all involved in the industrial side of the secondary exploitation of Chinese medicine. Quality improvement and industry promotion of Chinese medicine can be realized through whole-process optimization, quality control, and overall process improvement. Based on the "component structure theory", the "multi-dimensional structure & process dynamic quality control system", and the systematic and holistic character of Chinese medicine, the impacts of the whole process were discussed. A technology system for the industry promotion of Chinese medicine was built to provide a theoretical basis for improving the quality and efficacy of products from the secondary development of traditional Chinese medicine.

  5. Comparative evaluation of urban storm water quality models

    NASA Astrophysics Data System (ADS)

    Vaze, J.; Chiew, Francis H. S.

    2003-10-01

    The estimation of urban storm water pollutant loads is required for the development of mitigation and management strategies to minimize impacts to receiving environments. Event pollutant loads are typically estimated using either regression equations or "process-based" water quality models. The relative merit of using regression models compared to process-based models is not clear. A modeling study is carried out here to evaluate the comparative ability of the regression equations and process-based water quality models to estimate event diffuse pollutant loads from impervious surfaces. The results indicate that, once calibrated, both the regression equations and the process-based model can estimate event pollutant loads satisfactorily. In fact, the loads estimated using the regression equation as a function of rainfall intensity and runoff rate are better than the loads estimated using the process-based model. Therefore, if only estimates of event loads are required, regression models should be used because they are simpler and require less data compared to process-based models.
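    A single-predictor version of the regression approach can be sketched as follows. The study regresses on rainfall intensity and runoff rate; for brevity one invented predictor is used here, with made-up event data.

```python
# Ordinary least-squares fit of event pollutant load against one runoff
# predictor; both the predictor choice and the data are invented.

def fit_line(xs, ys):
    """Least-squares slope and intercept for y = intercept + slope * x."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    slope = sxy / sxx
    return slope, my - slope * mx

# event runoff volume (mm) vs. measured pollutant load (kg), made up
runoff = [5.0, 12.0, 8.0, 20.0, 15.0]
load = [11.0, 25.0, 17.0, 41.0, 31.0]

slope, intercept = fit_line(runoff, load)

def predict_load(event_runoff):
    return intercept + slope * event_runoff

print(round(slope, 2), round(intercept, 2), round(predict_load(10.0), 1))
```

    This is the sense in which the regression route is "simpler and requires less data": one algebraic fit per pollutant, versus a calibrated build-up/wash-off process model.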

  6. A laser-based vision system for weld quality inspection.

    PubMed

    Huang, Wei; Kovacevic, Radovan

    2011-01-01

    Welding is a very complex process in which the final weld quality can be affected by many process parameters. In order to inspect the weld quality and detect the presence of various weld defects, different methods and systems have been studied and developed. In this paper, a laser-based vision system is developed for non-destructive weld quality inspection. The vision sensor is designed based on the principle of laser triangulation. By processing the images acquired from the vision sensor, the geometrical features of the weld can be obtained. Through visual analysis of the acquired 3D profiles of the weld, the presence as well as the positions and sizes of weld defects can be accurately identified, and non-destructive weld quality inspection can thereby be achieved.

  7. A Laser-Based Vision System for Weld Quality Inspection

    PubMed Central

    Huang, Wei; Kovacevic, Radovan

    2011-01-01

    Welding is a very complex process in which the final weld quality can be affected by many process parameters. In order to inspect the weld quality and detect the presence of various weld defects, different methods and systems have been studied and developed. In this paper, a laser-based vision system is developed for non-destructive weld quality inspection. The vision sensor is designed based on the principle of laser triangulation. By processing the images acquired from the vision sensor, the geometrical features of the weld can be obtained. Through visual analysis of the acquired 3D profiles of the weld, the presence as well as the positions and sizes of weld defects can be accurately identified, and non-destructive weld quality inspection can thereby be achieved. PMID:22344308
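    The triangulation principle this sensor relies on reduces, in its simplest form, to one trigonometric relation. A back-of-envelope sketch with invented numbers follows; real sensors additionally calibrate camera intrinsics and lens distortion.

```python
# Simplified laser triangulation: a laser stripe viewed at angle theta
# shifts sideways in the image when the surface height changes, so
# height = lateral_shift / tan(theta). Numbers are invented.
import math

def height_from_shift(shift_mm, view_angle_deg):
    """Surface height change inferred from the observed stripe shift."""
    return shift_mm / math.tan(math.radians(view_angle_deg))

# a 0.5 mm stripe shift seen from a 45-degree viewing angle
print(round(height_from_shift(0.5, 45.0), 3))
```

    Scanning the stripe along the weld and stacking these height profiles is what yields the 3D profile from which defect positions and sizes are read off.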

  8. Cesium, iodine and tritium in NW Pacific waters - a comparison of the Fukushima impact with global fallout

    NASA Astrophysics Data System (ADS)

    Povinec, P. P.; Aoyama, M.; Biddulph, D.; Breier, R.; Buesseler, K.; Chang, C. C.; Golser, R.; Hou, X. L.; Ješkovský, M.; Jull, A. J. T.; Kaizer, J.; Nakano, M.; Nies, H.; Palcsu, L.; Papp, L.; Pham, M. K.; Steier, P.; Zhang, L. Y.

    2013-08-01

    The impact of the Fukushima Dai-ichi nuclear power plant accident on the distribution of radionuclides in seawater of the NW Pacific Ocean is compared with global fallout from atmospheric tests of nuclear weapons. Surface and water column samples collected during the Ka'imikai-o-Kanaloa (KOK) international expedition carried out in June 2011 were analyzed for 134Cs, 137Cs, 129I and 3H. The 137Cs, 129I and 3H levels in surface seawater offshore Fukushima varied between 0.002-3.5 Bq L-1, 0.01-0.8 μBq L-1, and 0.05-0.15 Bq L-1, respectively. At the sampling site about 40 km from the coast, where all three radionuclides were analyzed, the Fukushima impact on the levels of these three radionuclides represents an increase above the global fallout background by factors of about 1000, 50 and 3, respectively. The water column data indicate that the transport of Fukushima-derived radionuclides downward to a depth of 300 m had already occurred. The observed 137Cs levels in surface waters and in the water column are compared with predictions from an ocean general circulation model, which indicates that the Kuroshio Current acts as a southern boundary for the transport of the radionuclides, which have been carried eastward from the Fukushima coast in the NW Pacific Ocean. The 137Cs inventory in the water column is estimated to be about 2.2 PBq, which can be regarded as a lower limit of the direct liquid discharges into the sea, as the seawater sampling covered only the area from 34 to 37° N and from 142 to 147° E. About 4.6 GBq of 129I was deposited in the NW Pacific Ocean, and 2.4-7 GBq of 129I was directly discharged as liquid waste into the sea offshore Fukushima. The total amount of 3H released and deposited over the NW Pacific Ocean was estimated to be 0.1-0.5 PBq. 
These estimations depend, however, on the evaluation of the total 137Cs activities released as liquid wastes directly into the sea, which should improve when more data are available. Due to a suitable residence time in the ocean, Fukushima-derived radionuclides will provide useful tracers for isotope oceanography studies on the transport of water masses during the next decades in the NW Pacific Ocean.

  9. Using adapted quality-improvement approaches to strengthen community-based health systems and improve care in high HIV-burden sub-Saharan African countries.

    PubMed

    Horwood, Christiane M; Youngleson, Michele S; Moses, Edward; Stern, Amy F; Barker, Pierre M

    2015-07-01

    Achieving long-term retention in HIV care is an important challenge for HIV management and achieving elimination of mother-to-child transmission. Sustainable, affordable strategies are required to achieve this, including strengthening of community-based interventions. Deployment of community-based health workers (CHWs) can improve health outcomes but there is a need to identify systems to support and maintain high-quality performance. Quality-improvement strategies have been successfully implemented to improve quality and coverage of healthcare in facilities and could provide a framework to support community-based interventions. Four community-based quality-improvement projects from South Africa, Malawi and Mozambique are described. Community-based improvement teams linked to the facility-based health system participated in learning networks (modified Breakthrough Series), and used quality-improvement methods to improve process performance. Teams were guided by trained quality mentors who used local data to help nurses and CHWs identify gaps in service provision and test solutions. Learning network participants gathered at intervals to share progress and identify successful strategies for improvement. CHWs demonstrated understanding of quality-improvement concepts, tools and methods, and implemented quality-improvement projects successfully. Challenges of using quality-improvement approaches in community settings included adapting processes, particularly data reporting, to the education level and first language of community members. Quality-improvement techniques can be implemented by CHWs to improve outcomes in community settings but these approaches require adaptation and additional mentoring support to be successful. More research is required to establish the effectiveness of this approach on processes and outcomes of care.

  10. Root Cause Analysis of Quality Defects Using HPLC-MS Fingerprint Knowledgebase for Batch-to-batch Quality Control of Herbal Drugs.

    PubMed

    Yan, Binjun; Fang, Zhonghua; Shen, Lijuan; Qu, Haibin

    2015-01-01

    The batch-to-batch quality consistency of herbal drugs has always been an important issue. The objective was to propose a methodology for batch-to-batch quality control based on HPLC-MS fingerprints and a process knowledgebase. The extraction process of Compound E-jiao Oral Liquid was taken as a case study. After the HPLC-MS fingerprint analysis method was established, fingerprints of the extract solutions produced under normal and abnormal operation conditions were obtained. Multivariate statistical models were built for fault detection, and a discriminant analysis model was built using the probabilistic discriminant partial-least-squares method for fault diagnosis. Based on multivariate statistical analysis, process knowledge was acquired and the cause-effect relationship between process deviations and quality defects was revealed. The quality defects were detected successfully by multivariate statistical control charts, and the types of process deviations were diagnosed correctly by discriminant analysis. This work demonstrates the benefits of combining HPLC-MS fingerprints, process knowledge and multivariate analysis for the quality control of herbal drugs. Copyright © 2015 John Wiley & Sons, Ltd.
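    The fault-detection step can be illustrated with Hotelling's T², a standard multivariate control chart statistic, on two fingerprint features. The reference batches and the control limit below are invented; in practice the limit is derived from an F distribution.

```python
# Hotelling's T^2 for a new batch against reference ("normal operation")
# batches, with the 2x2 covariance inverse written out by hand.
# Feature values and the control limit are invented for illustration.
from statistics import mean

def t_squared(x, y, xs, ys):
    """T^2 of point (x, y) against reference samples xs, ys."""
    mx, my = mean(xs), mean(ys)
    n = len(xs)
    # sample covariance matrix entries
    sxx = sum((a - mx) ** 2 for a in xs) / (n - 1)
    syy = sum((b - my) ** 2 for b in ys) / (n - 1)
    sxy = sum((a - mx) * (b - my) for a, b in zip(xs, ys)) / (n - 1)
    det = sxx * syy - sxy ** 2
    dx, dy = x - mx, y - my
    # T^2 = d' S^-1 d for the 2x2 case
    return (syy * dx * dx - 2 * sxy * dx * dy + sxx * dy * dy) / det

# two fingerprint peak areas across five normal batches
xs = [10.0, 10.2, 9.8, 10.1, 9.9]
ys = [5.0, 5.1, 4.9, 5.0, 5.0]

limit = 12.0  # in practice from an F distribution, not a fixed number
print(t_squared(10.0, 5.0, xs, ys) < limit)  # in-control point
print(t_squared(12.0, 4.0, xs, ys) > limit)  # abnormal batch flagged
```

    Points exceeding the limit trigger the diagnosis step, which in the paper is the discriminant-PLS model rather than this simple chart.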

  11. MapReduce Based Parallel Bayesian Network for Manufacturing Quality Control

    NASA Astrophysics Data System (ADS)

    Zheng, Mao-Kuan; Ming, Xin-Guo; Zhang, Xian-Yu; Li, Guo-Ming

    2017-09-01

    The increasing complexity of industrial products and manufacturing processes has challenged conventional statistics-based quality management approaches under dynamic production conditions. An integrated approach combining Bayesian networks and big data analytics for manufacturing process quality analysis and control is proposed. Based on the Hadoop distributed architecture and the MapReduce parallel computing model, the large volume and variety of quality-related data generated during the manufacturing process can be handled. Artificial intelligence algorithms, including Bayesian network learning, classification, and reasoning, are embedded into the Reduce step. Relying on the ability of Bayesian networks to deal with dynamic and uncertain problems and on the parallel computing power of MapReduce, a Bayesian network of factors affecting quality is built from a prior probability distribution and updated with the posterior distribution. A case study on hull-segment manufacturing precision management for ship and offshore platform building shows that computing speed scales almost linearly with the number of computing nodes. The proposed model also proves feasible for locating and reasoning about root causes, forecasting manufacturing outcomes, and supporting intelligent decisions for precision problem solving. The integration of big data analytics and the BN method offers a whole new perspective on manufacturing quality control.
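    The map/reduce split can be illustrated with a toy in-memory version: mappers emit counts keyed by factor state and outcome, a reducer aggregates them, and the aggregated counts yield the conditional probabilities a Bayesian-network node needs. The records and factor names below are invented.

```python
# Toy MapReduce-style counting for one Bayesian-network node; on Hadoop
# the map and reduce phases run distributed, but the logic is the same.
from collections import Counter
from itertools import chain

records = [
    {"temp": "high", "defect": True},
    {"temp": "high", "defect": False},
    {"temp": "high", "defect": True},
    {"temp": "low", "defect": False},
    {"temp": "low", "defect": False},
]

def map_phase(record):
    """Emit key-value pairs, as a mapper would."""
    yield (("temp", record["temp"], record["defect"]), 1)
    yield (("temp", record["temp"], None), 1)  # marginal count

def reduce_phase(pairs):
    """Aggregate emitted pairs by key, as a reducer would."""
    counts = Counter()
    for key, value in pairs:
        counts[key] += value
    return counts

counts = reduce_phase(chain.from_iterable(map_phase(r) for r in records))

# conditional probability P(defect | temp=high) from the reduced counts
p = counts[("temp", "high", True)] / counts[("temp", "high", None)]
print(round(p, 3))
```

    With many factors and millions of records, each key's counts can be reduced on a different node, which is where the near-linear scaling reported in the case study comes from.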

  12. MO-E-9A-01: Risk Based Quality Management: TG100 In Action

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Huq, M; Palta, J; Dunscombe, P

    2014-06-15

    One of the goals of quality management in radiation therapy is to gain high confidence that patients will receive the prescribed treatment correctly. To accomplish this goal, professional societies such as the American Association of Physicists in Medicine (AAPM) have published many quality assurance (QA), quality control (QC), and quality management (QM) guidance documents. In general, the recommendations provided in these documents have emphasized performing device-specific QA at the expense of process flow and protection of the patient against catastrophic errors. Analyses of radiation therapy incidents find that they are more often caused by flaws in the overall therapy process, from initial consult through final treatment, than by isolated hardware or computer failures detectable by traditional physics QA. This challenge is shared by many intrinsically hazardous industries. Risk assessment tools and analysis techniques have been developed to define, identify, and eliminate known and/or potential failures, problems, or errors from a system, process and/or service before they reach the customer. These include, but are not limited to, process mapping, failure modes and effects analysis (FMEA), fault tree analysis (FTA), and establishment of a quality management program that best avoids the faults and risks that have been identified in the overall process. These tools can be easily adapted to radiation therapy practices because of their simplicity and effectiveness in providing efficient ways to enhance the safety and quality of treatment processes. Task Group 100 (TG100) of the AAPM has developed a risk-based quality management program that uses these tools. This session will be devoted to a discussion of these tools and how they can be used in a given radiotherapy clinic to develop a risk-based QM program. Learning Objectives: Learn how to design a process map for a radiotherapy process. Learn how to perform an FMEA for a given process. 
Learn what fault tree analysis is all about. Learn how to design a quality management program based upon the information obtained from process mapping, FMEA and FTA.
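
    The core of FMEA scoring can be sketched in a few lines. The failure modes and their 1-10 scores for occurrence (O), severity (S), and detectability (D) below are invented for illustration, not taken from TG100; the risk priority number is RPN = O × S × D, and modes are addressed in descending RPN order.

```python
# Hypothetical radiotherapy-planning failure modes:
# (description, occurrence, severity, detectability), each scored 1-10
failure_modes = [
    ("wrong patient orientation entered", 3, 9, 4),
    ("stale CT dataset used for planning", 2, 8, 6),
    ("dose constraint omitted from plan",  4, 7, 3),
]

def rpn(mode):
    """Risk priority number: occurrence x severity x detectability."""
    _, o, s, d = mode
    return o * s * d

# Rank failure modes by RPN, highest risk first
ranked = sorted(failure_modes, key=rpn, reverse=True)
for mode in ranked:
    print(f"RPN={rpn(mode):3d}  {mode[0]}")
```

    In a real TG100-style program the scores come from a multidisciplinary team, and ranking may also consider severity alone, since a rare but catastrophic mode can deserve controls regardless of its RPN.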

  13. Software for pre-processing Illumina next-generation sequencing short read sequences

    PubMed Central

    2014-01-01

    Background When compared to Sanger sequencing technology, next-generation sequencing (NGS) technologies are hindered by shorter sequence read length, higher base-call error rate, non-uniform coverage, and platform-specific sequencing artifacts. These characteristics lower the quality of their downstream analyses, e.g. de novo and reference-based assembly, by introducing sequencing artifacts and errors that may contribute to incorrect interpretation of data. Although many tools have been developed for quality control and pre-processing of NGS data, none of them provide flexible and comprehensive trimming options in conjunction with parallel processing to expedite pre-processing of large NGS datasets. Methods We developed ngsShoRT (next-generation sequencing Short Reads Trimmer), a flexible and comprehensive open-source software package written in Perl that provides a set of algorithms commonly used for pre-processing NGS short read sequences. We compared the features and performance of ngsShoRT with existing tools: CutAdapt, NGS QC Toolkit and Trimmomatic. We also compared the effects of using pre-processed short read sequences generated by different algorithms on de novo and reference-based assembly for three different genomes: Caenorhabditis elegans, Saccharomyces cerevisiae S288c, and Escherichia coli O157 H7. Results Several combinations of ngsShoRT algorithms were tested on publicly available Illumina GA II, HiSeq 2000, and MiSeq eukaryotic and bacterial genomic short read sequences with the focus on removing sequencing artifacts and low-quality reads and/or bases. Our results show that across three organisms and three sequencing platforms, trimming improved the mean quality scores of trimmed sequences. Using trimmed sequences for de novo and reference-based assembly improved assembly quality as well as assembler performance. 
In general, ngsShoRT outperformed comparable trimming tools in terms of trimming speed and improvement of de novo and reference-based assembly as measured by assembly contiguity and correctness. Conclusions Trimming of short read sequences can improve the quality of de novo and reference-based assembly and assembler performance. The parallel processing capability of ngsShoRT reduces trimming time and improves memory efficiency when dealing with large datasets. We recommend combining sequencing artifact removal with quality-score-based read filtering and base trimming as the most consistent method for improving sequence quality and downstream assemblies. ngsShoRT source code, user guide and tutorial are available at http://research.bioinformatics.udel.edu/genomics/ngsShoRT/. ngsShoRT can be incorporated as a pre-processing step in genome and transcriptome assembly projects. PMID:24955109
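
    The quality-score based base trimming mentioned in the conclusions can be sketched minimally as 3' end trimming. This is an illustration only, not ngsShoRT's actual algorithm; it assumes Phred+33 encoded quality strings, as in Illumina FASTQ files.

```python
def trim_3prime(seq, qual, cutoff=20, offset=33):
    """Trim bases from the 3' end while their Phred score is below cutoff."""
    i = len(qual)
    while i > 0 and ord(qual[i - 1]) - offset < cutoff:
        i -= 1
    return seq[:i], qual[:i]

# 'I' encodes Phred 40, '#' encodes Phred 2, '!' encodes Phred 0
seq, qual = trim_3prime("ACGTACGT", "IIIII##!")
print(seq, qual)   # ACGTA IIIII
```

    Real trimmers add sliding-window averaging, adapter removal, and minimum-length filtering on top of this basic loop.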

  14. The choices, choosing model of quality of life: description and rationale.

    PubMed

    Gurland, Barry J; Gurland, Roni V

    2009-01-01

    This introductory paper offers a critical review of current models and measures of quality of life, and describes a choices and choosing (c-c) process as a new model of quality of life. Criteria are proposed for judging the relative merits of models of quality of life with preference being given to explicit mechanisms, linkages to a science base, a means of identifying deficits amenable to rational restorative interventions, and with embedded values of the whole person. A conjectured model, based on the processes of gaining access to choices and choosing among them, matches the proposed criteria. The c-c process is an evolved adaptive mechanism dedicated to the pursuit of quality of life, driven by specific biological and psychological systems, and influenced by social and environmental forces. This model strengthens the science base for the field of quality of life, unifies approaches to concept and measurement, and guides the evaluation of impairments of quality of life. Corresponding interventions can be aimed at relieving restrictions or distortions of the c-c process; thus helping people to preserve and improve their quality of life. RELATED WORK: Companion papers detail relevant aspects of the science base, present methods of identifying deficits and distortions of the c-c model so as to open opportunities for rational restorative interventions, and explore empirical analyses of the relationship between health imposed restrictions of c-c and conventional indicators of diminished quality of life. (c) 2008 John Wiley & Sons, Ltd.

  15. A system framework of inter-enterprise machining quality control based on fractal theory

    NASA Astrophysics Data System (ADS)

    Zhao, Liping; Qin, Yongtao; Yao, Yiyong; Yan, Peng

    2014-03-01

    In order to meet the quality control requirements of dynamic and complicated product machining processes among enterprises, a system framework for inter-enterprise machining quality control based on fractal theory was proposed. In this system framework, the fractal-specific characteristics of the inter-enterprise machining quality control function were analysed, and the model of inter-enterprise machining quality control was constructed from the nature of fractal structures. Furthermore, the goal-driven strategy of inter-enterprise quality control and the dynamic organisation strategy of inter-enterprise quality improvement were constructed through characteristic analysis of this model. In addition, the architecture of inter-enterprise machining quality control based on fractal theory was established by means of Web services. Finally, a case study of the application was presented. The result showed that the proposed method was viable and could provide guidance for quality control and support for product reliability in inter-enterprise machining processes.

  16. Quality assurance of multiport image-guided minimally invasive surgery at the lateral skull base.

    PubMed

    Nau-Hermes, Maria; Schmitt, Robert; Becker, Meike; El-Hakimi, Wissam; Hansen, Stefan; Klenzner, Thomas; Schipper, Jörg

    2014-01-01

    For multiport image-guided minimally invasive surgery at the lateral skull base, quality management is necessary to avoid damage to closely spaced critical neurovascular structures. So far there is no standardized method applicable independently of the surgery. Therefore, we adapt a quality management method, the quality gates (QG), which is well established in, for example, the automotive industry, and apply it to multiport image-guided minimally invasive surgery. QG divide a process into different sections. Passing between sections can only be achieved if previously defined requirements are fulfilled, which secures the process chain. An interdisciplinary team of otosurgeons, computer scientists, and engineers has worked together to define the quality gates and the corresponding criteria that need to be fulfilled before passing each quality gate. In order to evaluate the defined QG and their criteria, the new surgical method was applied with a first prototype to a human skull cadaver model. We show that the QG method can ensure a safe multiport minimally invasive surgical process at the lateral skull base. We thereby present an approach towards the standardization of quality assurance of surgical processes.

  17. Quality Assurance of Multiport Image-Guided Minimally Invasive Surgery at the Lateral Skull Base

    PubMed Central

    Nau-Hermes, Maria; Schmitt, Robert; Becker, Meike; El-Hakimi, Wissam; Hansen, Stefan; Klenzner, Thomas; Schipper, Jörg

    2014-01-01

    For multiport image-guided minimally invasive surgery at the lateral skull base, quality management is necessary to avoid damage to closely spaced critical neurovascular structures. So far there is no standardized method applicable independently of the surgery. Therefore, we adapt a quality management method, the quality gates (QG), which is well established in, for example, the automotive industry, and apply it to multiport image-guided minimally invasive surgery. QG divide a process into different sections. Passing between sections can only be achieved if previously defined requirements are fulfilled, which secures the process chain. An interdisciplinary team of otosurgeons, computer scientists, and engineers has worked together to define the quality gates and the corresponding criteria that need to be fulfilled before passing each quality gate. In order to evaluate the defined QG and their criteria, the new surgical method was applied with a first prototype to a human skull cadaver model. We show that the QG method can ensure a safe multiport minimally invasive surgical process at the lateral skull base. We thereby present an approach towards the standardization of quality assurance of surgical processes. PMID:25105146

  18. Students Matter: Quality Measurements in Online Courses

    ERIC Educational Resources Information Center

    Unal, Zafer; Unal, Aslihan

    2016-01-01

    Quality Matters (QM) is a peer review process designed to certify the quality of online courses and online components. It has generated widespread interest and received national recognition for its peer-based approach to quality assurance and continuous improvement in online education. While the entire QM online course design process is…

  19. Providing leadership to a decentralized total quality process.

    PubMed

    Diederich, J J; Eisenberg, M

    1993-01-01

    Integrating total quality management into the culture of an organization and the daily work of employees requires a decentralized leadership structure that encourages all employees to become involved. This article, based upon the experience of the University of Michigan Hospitals Professional Services Divisional Lead Team, outlines a process for decentralizing the total quality management process.

  20. British Thoracic Society quality standards for home oxygen use in adults

    PubMed Central

    Suntharalingam, Jay; Wilkinson, Tom; Annandale, Joseph; Davey, Claire; Fielding, Rhea; Freeman, Daryl; Gibbons, Michael; Hardinge, Maxine; Hippolyte, Sabrine; Knowles, Vikki; Lee, Cassandra; MacNee, William; Pollington, Jacqueline; Vora, Vandana; Watts, Trefor; Wijesinghe, Meme

    2017-01-01

    Introduction The purpose of the quality standards document is to provide healthcare professionals, commissioners, service providers and patients with a guide to standards of care that should be met for home oxygen provision in the UK, together with measurable markers of good practice. Quality statements are based on the British Thoracic Society (BTS) Guideline for Home Oxygen Use in Adults. Methods Development of BTS Quality Standards follows the BTS process of quality standard production based on the National Institute for Health and Care Excellence process manual for the development of quality standards. Results 10 quality statements have been developed, each describing a key marker of high-quality, cost-effective care for home oxygen use, and each statement is supported by quality measures that aim to improve the structure, process and outcomes of healthcare. Discussion BTS Quality Standards for home oxygen use in adults form a key part of the range of supporting materials that the society produces to assist in the dissemination and implementation of a guideline’s recommendations. PMID:29018527

  1. Information integration and diagnosis analysis of equipment status and production quality for machining process

    NASA Astrophysics Data System (ADS)

    Zan, Tao; Wang, Min; Hu, Jianzhong

    2010-12-01

    Machining status monitoring techniques based on multiple sensors can acquire and analyze machining process information to implement abnormality diagnosis and fault warning. Statistical quality control techniques are normally used to distinguish abnormal fluctuations from normal fluctuations through statistical methods. By comparing the advantages and disadvantages of the two methods, this paper introduces the necessity and feasibility of their integration and fusion. An approach that integrates multi-sensor status monitoring and statistical process control, based on artificial intelligence, internet and database techniques, is then brought forward. Based on virtual instrument techniques, the authors developed the machining quality assurance system MoniSysOnline, which has been used to monitor the grinding process. By analyzing the quality data and AE signal information of the wheel dressing process, the cause of machining quality fluctuation was identified. The experimental results indicate that the approach is suitable for status monitoring and analysis of machining processes.

  2. Analysis of quality raw data of second generation sequencers with Quality Assessment Software.

    PubMed

    Ramos, Rommel Tj; Carneiro, Adriana R; Baumbach, Jan; Azevedo, Vasco; Schneider, Maria Pc; Silva, Artur

    2011-04-18

    Second generation technologies have advantages over Sanger sequencing; however, they have resulted in new challenges for the genome construction process, especially because of the small size of the reads, despite the high degree of coverage. Independent of the program chosen for the construction process, DNA sequences are superimposed, based on identity, to extend the reads, generating contigs; mismatches indicate a lack of homology and are not included. This process improves our confidence in the sequences that are generated. We developed Quality Assessment Software, with which one can review graphs showing the distribution of quality values from the sequencing reads. This software allows us to adopt more stringent quality standards for sequence data, based on quality-graph analysis and estimated coverage after applying the quality filter, providing acceptable sequence coverage for genome construction from short reads. Quality filtering is a fundamental step in the process of constructing genomes, as it reduces the frequency of incorrect alignments that are caused by measuring errors, which can occur during the construction process due to the size of the reads, provoking misassemblies. Application of quality filters to sequence data, using the software Quality Assessment, along with graphing analyses, provided greater precision in the definition of cutoff parameters, which increased the accuracy of genome construction.
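
    The kind of quality filter described above can be sketched as a mean-Phred cutoff applied per read. This is a hedged illustration only (the actual software works on FASTQ files and chooses the cutoff from quality-distribution graphs); the reads and the cutoff of 20 below are invented.

```python
def mean_phred(qual, offset=33):
    """Mean Phred score of a Phred+33 encoded quality string."""
    return sum(ord(c) - offset for c in qual) / len(qual)

# Invented reads: (name, sequence, quality string)
reads = [
    ("read1", "ACGT", "IIII"),   # mean Phred 40
    ("read2", "ACGT", "!!!!"),   # mean Phred 0
    ("read3", "ACGT", "II!!"),   # mean Phred 20
]

# Keep only reads whose mean quality clears the chosen cutoff
kept = [name for name, seq, qual in reads if mean_phred(qual) >= 20]
print(kept)   # ['read1', 'read3']
```

    Filtering before assembly reduces the incorrect alignments caused by base-call errors, which is exactly the misassembly mechanism the abstract describes.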

  3. Clinical nursing leaders' perceptions of nutrition quality indicators in Swedish stroke wards: a national survey.

    PubMed

    Persenius, Mona; Hall-Lord, Marie-Louise; Wilde-Larsson, Bodil; Carlsson, Eva

    2015-09-01

    To describe nursing leaders' perceptions of nutrition quality in Swedish stroke wards. A high risk of undernutrition places great demand on nutritional care in stroke wards. Evidence-based guidelines exist, but healthcare professionals have reported low interest in nutritional care. The Donabedian framework of structure, process and outcome is recommended to monitor and improve nutrition quality. Using a descriptive cross-sectional design, a web-based questionnaire regarding nutritional care quality was delivered to eligible participants. Most clinical nursing leaders reported structure indicators, e.g. access to dieticians. Among process indicators, regular assessment of patients' swallowing was most frequently reported in comprehensive stroke wards compared with other stroke wards. Use of outcomes to monitor nutrition quality was not routine. Wards using standard care plans showed significantly better results. Using the structure, process and outcome framework to examine nutrition quality, quality-improvement needs became visible. To provide high-quality nutrition, all three structure, process and outcome components must be addressed. The use of care pathways, standard care plans, the Senior Alert registry, as well as systematic use of outcome measures could improve nutrition quality. To assist clinical nursing leaders in managing all aspects of quality, structure, process and outcome can be a valuable framework. © 2013 John Wiley & Sons Ltd.

  4. Integrating automated support for a software management cycle into the TAME system

    NASA Technical Reports Server (NTRS)

    Sunazuka, Toshihiko; Basili, Victor R.

    1989-01-01

    Software managers are interested in the quantitative management of software quality, cost and progress. An integrated software management methodology, which can be applied throughout the software life cycle for any number of purposes, is required. The TAME (Tailoring A Measurement Environment) methodology is based on the improvement paradigm and the goal/question/metric (GQM) paradigm. This methodology helps generate a software engineering process and measurement environment based on the project characteristics. The SQMAR (software quality measurement and assurance technology) is a software quality metric system and methodology applied to the development processes. It is based on the feed-forward control principle. Quality target setting is carried out before the plan-do-check-action activities are performed. These methodologies are integrated to realize goal oriented measurement, process control and visual management. A metric setting procedure based on the GQM paradigm, a management system called the software management cycle (SMC), and its application to a case study based on NASA/SEL data are discussed. The expected effects of SMC are quality improvement, managerial cost reduction, accumulation and reuse of experience, and a highly visual management reporting system.

  5. ISO 9001 in a neonatal intensive care unit (NICU).

    PubMed

    Vitner, Gad; Nadir, Erez; Feldman, Michael; Yurman, Shmuel

    2011-01-01

    The aim of this paper is to present the process for approving and certifying a neonatal intensive care unit to ISO 9001 standards. The process started with the department head's decision to improve service quality before deciding to achieve ISO 9001 certification. Department processes were mapped and quality management mechanisms were developed. Process control and performance measurements were defined and implemented to monitor the daily work. A service satisfaction review was conducted to get feedback from families. In total, 28 processes and related work instructions were defined. Process yields showed service improvements. Family satisfaction improved. The paper is based on preparing only one neonatal intensive care unit to the ISO 9001 standard. The case study should act as an incentive for hospital managers aiming to improve service quality based on the ISO 9001 standard. ISO 9001 is becoming a recommended tool to improve clinical service quality.

  6. User-Oriented Quality for OER: Understanding Teachers' Views on Re-Use, Quality, and Trust

    ERIC Educational Resources Information Center

    Clements, K. I.; Pawlowski, J. M.

    2012-01-01

    We analysed how teachers as users of open educational resources (OER) repositories act in the re-use process and how they perceive quality. Based on a quantitative empirical study, we also surveyed which quality requirements users have and how they would contribute to the quality process. Trust in resources, organizations, and technologies seem to…

  7. Assessing the influence of component processing and donor characteristics on quality of red cell concentrates using quality control data.

    PubMed

    Jordan, A; Chen, D; Yi, Q-L; Kanias, T; Gladwin, M T; Acker, J P

    2016-07-01

    Quality control (QC) data collected by blood services are used to monitor production and to ensure compliance with regulatory standards. We demonstrate how analysis of quality control data can be used to highlight the sources of variability within red cell concentrates (RCCs). We merged Canadian Blood Services QC data with manufacturing and donor records for 28 227 RCC between June 2011 and October 2014. Units were categorized based on processing method, bag manufacturer, donor age and donor sex, then assessed based on product characteristics: haemolysis and haemoglobin levels, unit volume, leucocyte count and haematocrit. Buffy-coat method (top/bottom)-processed units exhibited lower haemolysis than units processed using the whole-blood filtration method (top/top). Units from female donors exhibited lower haemolysis than male donations. Processing method influenced unit volume and the ratio of additive solution to residual plasma. Stored red blood cell characteristics are influenced by prestorage processing and donor factors. Understanding the relationship between processing, donors and RCC quality will help blood services to ensure the safety of transfused products. © 2016 International Society of Blood Transfusion.
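
    The kind of stratified comparison reported above can be sketched as group means over merged QC records. The records below are invented stand-ins (the real dataset covers 28 227 units and more variables), chosen so that buffy-coat units show lower haemolysis, mirroring the abstract's finding.

```python
from statistics import mean

# Invented QC records: (processing method, donor sex, haemolysis %)
records = [
    ("buffy_coat", "F", 0.12), ("buffy_coat", "M", 0.18),
    ("whole_blood_filtration", "F", 0.20), ("whole_blood_filtration", "M", 0.29),
    ("buffy_coat", "F", 0.10), ("whole_blood_filtration", "M", 0.27),
]

def mean_haemolysis(method):
    """Mean haemolysis over all units produced with the given method."""
    return mean(h for m, _, h in records if m == method)

bc = mean_haemolysis("buffy_coat")
wbf = mean_haemolysis("whole_blood_filtration")
print(bc < wbf)   # buffy-coat units show lower haemolysis in this toy data
```

    In practice each stratum (method, bag manufacturer, donor age, donor sex) would be compared with an appropriate statistical test rather than raw means.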

  8. Guidelines for Risk-Based Changeover of Biopharma Multi-Product Facilities.

    PubMed

    Lynch, Rob; Barabani, David; Bellorado, Kathy; Canisius, Peter; Heathcote, Doug; Johnson, Alan; Wyman, Ned; Parry, Derek Willison

    2018-01-01

    In multi-product biopharma facilities, the protection from product contamination due to the manufacture of multiple products simultaneously is paramount to assure product quality. To that end, the use of traditional changeover methods (elastomer change-out, full sampling, etc.) have been widely used within the industry and have been accepted by regulatory agencies. However, with the endorsement of Quality Risk Management (1), the use of risk-based approaches may be applied to assess and continuously improve established changeover processes. All processes, including changeover, can be improved with investment (money/resources), parallel activities, equipment design improvements, and standardization. However, processes can also be improved by eliminating waste. For product changeover, waste is any activity not needed for the new process or that does not provide added assurance of the quality of the subsequent product. The application of a risk-based approach to changeover aligns with the principles of Quality Risk Management. Through the use of risk assessments, the appropriate changeover controls can be identified and controlled to assure product quality is maintained. Likewise, the use of risk assessments and risk-based approaches may be used to improve operational efficiency, reduce waste, and permit concurrent manufacturing of products. © PDA, Inc. 2018.

  9. [Discussion on research and development of new traditional Chinese medicine preparation process based on idea of QbD].

    PubMed

    Feng, Yi; Hong, Yan-Long; Xian, Jie-Chen; Du, Ruo-Fei; Zhao, Li-Jie; Shen, Lan

    2014-09-01

    Traditional processes are mostly adopted in traditional Chinese medicine (TCM) preparation production, and product quality is mostly controlled by end-product testing. Potential problems in the production process are unpredictable and are, in most cases, handled on the basis of experience. It is therefore hard to find the key points affecting the preparation process and quality control. A pattern for the research and development of TCM preparation processes based on the idea of Quality by Design (QbD) is proposed after introducing the latest research achievements. Basic theories of micromeritics and rheology are used to characterize the physical properties of TCM raw materials. The TCM preparation process can be designed in a more scientific and rational way by studying the correlations among the physical properties of the raw material, the preparation process, and the product quality of the preparation. In this way, the factors affecting the quality of TCM production can be found, and problems that might occur in the pilot process can be predicted. This would be a foundation for the R&D and production of TCM preparations, as well as support for the "process control" of TCM to be gradually realized in the future.

  10. TU-AB-BRD-04: Development of Quality Management Program

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Thomadsen, B.

    2015-06-15

    Current quality assurance and quality management guidelines provided by various professional organizations are prescriptive in nature, focusing principally on performance characteristics of planning and delivery devices. However, published analyses of events in radiation therapy show that most events are caused by flaws in clinical processes rather than by device failures. This suggests the need for the development of a quality management program that is based on integrated approaches to process and equipment quality assurance. Industrial engineers have developed various risk assessment tools that are used to identify and eliminate potential failures from a system or a process before a failure impacts a customer. These tools include, but are not limited to, process mapping, failure modes and effects analysis, and fault tree analysis. Task Group 100 of the American Association of Physicists in Medicine has developed these tools and used them to formulate an example risk-based quality management program for intensity-modulated radiotherapy. This is a prospective risk assessment approach that analyzes potential error pathways inherent in a clinical process and then ranks them according to relative risk, typically before implementation, followed by the design of a new process or modification of the existing process. Appropriate controls are then put in place to ensure that failures are less likely to occur and, if they do, they will more likely be detected before they propagate through the process, compromising treatment outcome and causing harm to the patient. Such a prospective approach forms the basis of the work of Task Group 100, which has recently been approved by the AAPM. This session will be devoted to a discussion of these tools and practical examples of how they can be used in a given radiotherapy clinic to develop a risk-based quality management program. 
Learning Objectives: Learn how to design a process map for a radiotherapy process. Learn how to perform failure modes and effects analysis for a given process. Learn what fault trees are all about. Learn how to design a quality management program based upon the information obtained from process mapping, failure modes and effects analysis and fault tree analysis. Dunscombe: Director, TreatSafely, LLC and Center for the Assessment of Radiological Sciences; Consultant to IAEA and Varian. Thomadsen: President, Center for the Assessment of Radiological Sciences. Palta: Vice President of the Center for the Assessment of Radiological Sciences.
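
    The fault tree analysis named in the learning objectives can be sketched as gate arithmetic over independent basic events. The tree structure and probabilities below are invented for illustration: a top event "wrong dose delivered" fires if the plan is wrong OR if plan transfer fails AND the pre-treatment check misses it.

```python
def p_or(*ps):
    """Probability that at least one independent event occurs (OR gate)."""
    q = 1.0
    for p in ps:
        q *= 1.0 - p
    return 1.0 - q

def p_and(*ps):
    """Probability that all independent events occur (AND gate)."""
    r = 1.0
    for p in ps:
        r *= p
    return r

# Invented basic-event probabilities
p_plan_error    = 0.01
p_transfer_fail = 0.05
p_check_miss    = 0.20

p_top = p_or(p_plan_error, p_and(p_transfer_fail, p_check_miss))
print(round(p_top, 4))   # 0.0199
```

    The value of the exercise is less the number itself than seeing which AND branches (i.e., which combinations of failure plus missed detection) dominate the top-event probability, since those are where added controls pay off.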

  11. Discrete State Change Model of Manufacturing Quality to Aid Assembly Process Design

    NASA Astrophysics Data System (ADS)

    Koga, Tsuyoshi; Aoyama, Kazuhiro

    This paper proposes a representation model of quality state changes in an assembly process that can be used in a computer-aided process design system. In order to formalize the state changes of manufacturing quality in the assembly process, the functions, operations, and quality changes in the assembly process are represented as a network model that can simulate discrete events. This paper also develops a design method for the assembly process. The design method calculates the space of quality state changes and outputs a better assembly process (better operations and better sequences) that can be used to obtain the intended quality state of the final product. A computational redesign algorithm for the assembly process that considers manufacturing quality is developed. The proposed method can be used to design an improved manufacturing process by simulating quality state changes. A prototype system for planning an assembly process was implemented and applied to the design of an auto-breaker assembly process. The result of the design example indicates that the proposed assembly process planning method outputs a better manufacturing scenario based on the simulation of quality state changes.
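
    The discrete state change idea can be sketched as operations that map a quality state to a new quality state, so that simulating different operation sequences exposes which ordering reaches the intended final state. The operations and attributes below are invented for illustration, not taken from the paper's auto-breaker case.

```python
# Quality state: a dict of attributes; each operation maps state -> state
def press_fit(state):
    new = dict(state, fitted=True)
    # In this toy model, pressing disturbs alignment unless already inspected
    if not new.get("inspected", False):
        new["aligned"] = False
    return new

def inspect(state):
    """Inspection records that alignment was verified before later steps."""
    return dict(state, inspected=True)

def run(sequence, state):
    """Simulate a candidate assembly sequence as a chain of state changes."""
    for op in sequence:
        state = op(state)
    return state

start = {"aligned": True}
bad   = run([press_fit, inspect], start)   # press before inspection
good  = run([inspect, press_fit], start)   # inspect first
print(bad["aligned"], good["aligned"])     # False True
```

    Enumerating sequences and comparing final states is the essence of the search for "better operations and better sequences" that the abstract describes.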

  12. SB certification for mixture-based specification for flexible base.

    DOT National Transportation Integrated Search

    2012-10-01

    Presentation topics: : Establish List of Qualified Producers; : Producers Responsible for Process Control/Quality Control; : Reduce TxDOT Sampling and Testing; : Expedite Aggregate Base Acceptance; : Share Responsibility (Producer/TxDOT) for Quality ...

  13. Usalpharma: A Cloud-Based Architecture to Support Quality Assurance Training Processes in Health Area Using Virtual Worlds

    PubMed Central

    García-Peñalvo, Francisco J.; Pérez-Blanco, Jonás Samuel; Martín-Suárez, Ana

    2014-01-01

    This paper discusses how cloud-based architectures can extend and enhance the functionality of the training environments based on virtual worlds and how, from this cloud perspective, we can provide support to analysis of training processes in the area of health, specifically in the field of training processes in quality assurance for pharmaceutical laboratories, presenting a tool for data retrieval and analysis that allows facing the knowledge discovery in the happenings inside the virtual worlds. PMID:24778593

  14. Designing quality of care--contributions from parents: Parents' experiences of care processes in paediatric care and their contribution to improvements of the care process in collaboration with healthcare professionals.

    PubMed

    Gustavsson, Susanne; Gremyr, Ida; Kenne Sarenmalm, Elisabeth

    2016-03-01

    The aim of this article was to explore whether current quality dimensions for health care services are sufficient to capture how parents perceive and contribute to quality of health care. New quality improvement initiatives that actively involve patients must be examined with a critical view on established quality dimensions to ensure that these measures support patient involvement. This paper used a qualitative and descriptive design. This paper is based on interviews with parents participating in two experience-based co-design projects in a Swedish hospital that included qualitative content analysis of data from 12 parent interviews in paediatric care. Health care professionals often overemphasize their own significance for value creation in care processes and underappreciate parents' ability to influence and contribute to better quality. However, quality is not based solely on how professionals accomplish their task, but is co-created by health care professionals and parents. Consequently, assessment of quality outcomes also must include parents' ability and context. This paper questions current models of quality dimensions in health care, and suggests additional sub-dimensions, such as family quality and involvement quality. This paper underscores the importance of involving parents in health care improvements with health care professionals to capture as many dimensions of quality as possible. © 2015 John Wiley & Sons Ltd.

  15. Data quality and processing for decision making: divergence between corporate strategy and manufacturing processes

    NASA Astrophysics Data System (ADS)

    McNeil, Ronald D.; Miele, Renato; Shaul, Dennis

    2000-10-01

    Information technology is driving improvements in manufacturing systems, resulting in higher productivity and quality. However, corporate strategy is driven by a number of factors, including data and pressure from multiple stakeholders (employees, managers, executives, stockholders, boards, suppliers and customers) as well as information about competitors and emerging technology. Much information is based on the processing of data and thus carries the biases of the processors; stakeholders can base inputs on faulty perceptions that are not grounded in reality, and the underlying data may be inaccurate before processing. Sources of data and information may include demographic reports, statistical analyses, intelligence reports (e.g., marketing data), technology and primary data collection. The reliability and validity of data, as well as the management of sources and information, are critical elements of strategy formulation. The paper explores data collection, processing and analysis from secondary and primary sources, information generation and report presentation for strategy formulation, and contrasts this with the data and information used to drive internal processes such as manufacturing. The hypothesis is that internal processes, such as manufacturing, are subordinate to corporate strategies. The impact of possible divergence in the quality of decisions at the corporate level on IT-driven, quality-manufacturing processes based on measurable outcomes is significant. Recommendations for IT improvements at the corporate strategy level are given.

  16. Process safety improvement--quality and target zero.

    PubMed

    Van Scyoc, Karl

    2008-11-15

    Process safety practitioners have adopted quality management principles in the design of process safety management systems with positive effect, yet achieving safety objectives sometimes remains a distant target. Companies regularly apply tools and methods which have roots in quality and productivity improvement. The "plan, do, check, act" improvement loop, statistical analysis of incidents (non-conformities), and performance trending popularized by Dr. Deming are now commonly used in the context of process safety. Significant advancements in HSE performance are reported after applying methods viewed as fundamental for quality management. In pursuit of continual process safety improvement, the paper examines various quality improvement methods and explores how methods intended for product quality can additionally be applied to the continual improvement of process safety. Methods such as Kaizen, Poka-yoke, and TRIZ, while long established for quality improvement, are quite unfamiliar in the process safety arena. These methods are discussed for application in improving both process safety leadership and field work team performance. Practical ways to advance process safety, based on the methods, are given.

  17. [Investigation on production process quality control of traditional Chinese medicine--Banlangen granule as an example].

    PubMed

    Tan, Manrong; Yan, Dan; Qiu, Lingling; Chen, Longhu; Yan, Yan; Jin, Cheng; Li, Hanbing; Xiao, Xiaohe

    2012-04-01

    The quality management system for herbal medicines, intermediates and finished products suffers from a methodological "short board" (weakest-link) effect. Based on the concept of process control, new strategies and methods for production process quality control were established, taking into account the actual production of traditional Chinese medicine and the characteristics of Chinese medicine. Taking Banlangen granule, an effective and widely used preparation, as a practical example, character identification, determination of index components, chemical fingerprinting and biometric technology were used in sequence to assess the quality of the Banlangen herbal medicine, the intermediates (water extraction and alcohol precipitation) and the finished product. With the transfer rate of chemical information and biological potency as indicators, the effectiveness and transmission of the above assessment and control methods were investigated. Ultimately, process quality control methods for Banlangen granule, based on chemical composition analysis and biometric analysis, were established. These can not only address the current situation in which many manufacturers produce Banlangen granule of varying quality, but also ensure and enhance its clinical efficacy, and they provide a foundation for the construction of quality control of the traditional Chinese medicine production process.

  18. [Quality process control system of Chinese medicine preparation based on "holistic view"].

    PubMed

    Wang, Ya-Qi; Jiao, Jiao-Jiao; Wu, Zhen-Feng; Zheng, Qin; Yang, Ming

    2018-01-01

    "High quality, safety and effectiveness" are the primary principles for the pharmaceutical research and development process in China. The quality of products relies not only on the inspection method, but also on design and development, process control and standardized management. The quality depends on the process control level. In this paper, the history and current development of quality control of traditional Chinese medicine (TCM) preparations are reviewed systematically. Based on the development model of international drug quality control and the misunderstandings in quality control of TCM preparations, the reasons impacting the homogeneity of TCM preparations are analyzed and summarized. According to TCM characteristics, efforts were made to control the diversity of TCM, turn "unstable" TCM into "stable" Chinese patent medicines, and put forward the concepts of "holistic view" and "QbD (quality by design)", so as to create a "holistic, modular, data-driven, standardized" model as the core of the TCM preparation quality process control model. Scientific studies shall conform to the actual production of TCM preparations, and be conducive to supporting advanced equipment and technology upgrades, thoroughly applying the scientific research achievements in Chinese patent medicines, and promoting the cluster application and transformation application of TCM pharmaceutical technology, so as to improve the quality and effectiveness of the TCM industry and realize green development. Copyright© by the Chinese Pharmaceutical Association.

  19. Radiostrontium in the western North Pacific: characteristics, behavior, and the Fukushima impact.

    PubMed

    Povinec, Pavel P; Hirose, Katsumi; Aoyama, Michio

    2012-09-18

    The impact of Fukushima-derived radiostrontium ((90)Sr and (89)Sr) on the western North Pacific Ocean has not been well established, although (90)Sr concentrations recorded in surface seawater offshore of the damaged Fukushima Dai-ichi nuclear power plant were in some areas comparable to, or even higher than, the (137)Cs levels (e.g., 400 kBq m(-3) of (90)Sr in December 2011). The total amount of (90)Sr released to the marine environment in the form of highly radioactive wastewater could reach about 1 PBq. A long-term series (1960-2010) of (90)Sr concentration measurements in subtropical surface waters of the western North Pacific indicated that the concentration has been decreasing gradually with an apparent half-life of 14 y. The pre-Fukushima (90)Sr levels in surface waters, including coastal waters near Fukushima, were estimated to be 1 Bq m(-3). To better assess the impact of radiostrontium levels elevated by about 4-5 orders of magnitude on the marine environment, more detailed measurements in seawater and biota of the western North Pacific are required.
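    The reported 14 y apparent half-life implies a simple exponential decline in surface-water (90)Sr concentration; a minimal sketch, with the starting concentration chosen purely for illustration:

```python
import math

def decay(c0, half_life_y, t_y):
    """Exponential decline C(t) = C0 * exp(-ln(2) * t / T_half)."""
    return c0 * math.exp(-math.log(2) * t_y / half_life_y)

# Illustrative only: a hypothetical 10 Bq m^-3 surface concentration
# declining with a 14 y apparent half-life over 50 years.
c0 = 10.0
c50 = decay(c0, 14.0, 50.0)  # roughly one twelfth of c0
```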

  20. Pharmaceutical quality by design: product and process development, understanding, and control.

    PubMed

    Yu, Lawrence X

    2008-04-01

    The purpose of this paper is to discuss pharmaceutical Quality by Design (QbD) and describe how it can be used to ensure pharmaceutical quality. QbD is described and some of its elements identified. Process parameters and quality attributes were identified for each unit operation during manufacture of solid oral dosage forms. The use of QbD was contrasted with the evaluation of product quality by testing alone. QbD is a systematic approach to pharmaceutical development. It means designing and developing formulations and manufacturing processes to ensure predefined product quality. Some of the QbD elements include: defining the target product quality profile; designing product and manufacturing processes; identifying critical quality attributes, process parameters, and sources of variability; controlling manufacturing processes to produce consistent quality over time. Using QbD, pharmaceutical quality is assured by understanding and controlling formulation and manufacturing variables. Product testing confirms the product quality. Implementation of QbD will enable transformation of the chemistry, manufacturing, and controls (CMC) review of abbreviated new drug applications (ANDAs) into a science-based pharmaceutical quality assessment.

  1. Effects of Reflection Category and Reflection Quality on Learning Outcomes during Web-Based Portfolio Assessment Process: A Case Study of High School Students in Computer Application Course

    ERIC Educational Resources Information Center

    Chou, Pao-Nan; Chang, Chi-Cheng

    2011-01-01

    This study examines the effects of reflection category and reflection quality on learning outcomes during a Web-based portfolio assessment process. Experimental subjects consist of forty-five eighth-grade students in a "Computer Application" course. Through the Web-based portfolio assessment system, these students write reflections, and join…

  2. Remote Sensing Image Quality Assessment Experiment with Post-Processing

    NASA Astrophysics Data System (ADS)

    Jiang, W.; Chen, S.; Wang, X.; Huang, Q.; Shi, H.; Man, Y.

    2018-04-01

    This paper briefly describes a post-processing influence assessment experiment comprising three steps: physical simulation, image processing, and image quality assessment. The physical simulation models a sampled imaging system in the laboratory; the imaging system parameters are tested, and the digital images serving as image-processing input are produced by this imaging system with the same imaging system parameters. The gathered optically sampled images with the tested imaging parameters are processed by three digital image processes: calibration pre-processing, lossy compression at different compression ratios, and image post-processing with different kernels. The image quality assessment method used is just noticeable difference (JND) subjective assessment based on ISO 20462; through subjective assessment of the gathered and processed images, the influence of the different imaging parameters and post-processing on image quality can be found. The six JND subjective assessment data sets validate each other. The main conclusions are: image post-processing can improve image quality; it can improve image quality even with lossy compression, although image quality improves less at higher compression ratios than at lower ones; and, with our image post-processing method, image quality is better when the camera MTF lies within a small range.

  3. Evaluation of ultrasound based sterilization approaches in terms of shelf life and quality parameters of fruit and vegetable juices.

    PubMed

    Khandpur, Paramjeet; Gogate, Parag R

    2016-03-01

    The present work evaluates the performance of ultrasound based sterilization approaches for processing of different fruit and vegetable juices in terms of microbial growth and changes in the quality parameters during storage. Comparison with conventional thermal processing has also been presented. A novel approach based on the combination of ultrasound with ultraviolet irradiation and crude extract of essential oil from orange peels has been used for the first time. The microbial growth (total bacteria and yeast content) in the juices during subsequent storage, the safety for human consumption, and the changes in the quality parameters (Brix, titratable acidity, pH, ORP, salt, conductivity, TSS and TDS) were investigated in detail. The optimized ultrasound parameters for juice sterilization were established as an ultrasound power of 100 W and a treatment time of 15 min for constant-frequency operation (20 kHz). It has been established that more than 5 log reduction was achieved using the novel combined approaches based on ultrasound. The treated juices using different approaches based on ultrasound also showed lower microbial growth and improved quality characteristics as compared to the thermally processed juice. Scale-up studies were also performed using spinach juice as the test sample, with processing at 5 L volume for the first time. The ultrasound treated juice satisfied the microbiological and physiochemical safety limits in refrigerated storage conditions for 20 days for the large scale processing. Overall the present work conclusively established the usefulness of combined treatment approaches based on ultrasound for maintaining the microbiological safety of beverages with enhanced shelf life and excellent quality parameters as compared to the untreated and thermally processed juices. Copyright © 2015 Elsevier B.V. All rights reserved.
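    The "5 log reduction" criterion cited above is the decimal logarithm of the ratio of microbial counts before and after treatment; a minimal sketch, with the counts being hypothetical illustrative values:

```python
import math

def log_reduction(n_initial, n_final):
    """Decimal (log10) reduction between initial and surviving counts."""
    return math.log10(n_initial / n_final)

# Hypothetical: 10^7 CFU/mL reduced to 10^2 CFU/mL after treatment,
# i.e. a 5 log reduction.
lr = log_reduction(1e7, 1e2)
```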

  4. Pay for performance in thoracic surgery.

    PubMed

    Varela, Gonzalo

    2007-08-01

    In the context of improving the quality of medical practice, pay-for-performance (PFP) programs have been developed to reward best medical practice. Early studies showed little gain in quality after implementing PFP family practice programs, and some unintended consequences, such as excluding high-risk patients from medical services when good outcomes were linked to payment. To date, no PFP programs have been implemented in surgical practice, but it is expected that the value-based purchasing philosophy will be extended to surgical specialties in the near future. Quality initiatives in surgery can be based on outcome or process measures. Outcomes-focused quality approaches rely on accurate information obtained from multi-institutional clinical databases for the calculation of risk-adjusted models. Primary outcomes such as surgical mortality are uncommon in modern thoracic surgery, and outcome measures should rely on more prevalent intermediate outcomes such as specific postoperative morbidities or emergency readmission. Process-based quality approaches need to be based on scientific evidence linking process to outcomes. It is our responsibility to develop practice guidelines or international practice consensus to facilitate the parameters to be evaluated in the near future.

  5. TU-FG-201-12: Designing a Risk-Based Quality Assurance Program for a Newly Implemented Y-90 Microspheres Procedure

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Vile, D; Zhang, L; Cuttino, L

    2016-06-15

    Purpose: To create a quality assurance program based upon a risk-based assessment of a newly implemented SirSpheres Y-90 procedure. Methods: A process map was created for a newly implemented SirSpheres procedure at a community hospital. The process map documented each step of this collaborative procedure, as well as the roles and responsibilities of each member. From the process map, different potential failure modes were determined, as well as any current controls in place. From this list, a full failure mode and effects analysis (FMEA) was performed by grading each failure mode's likelihood of occurrence, likelihood of detection, and potential severity. These numbers were then multiplied to compute the risk priority number (RPN) for each potential failure mode. Failure modes were then ranked based on their RPN. Additional controls were then added, with failure modes corresponding to the highest RPNs taking priority. Results: A process map was created that succinctly outlined each step in the SirSpheres procedure in its current implementation. From this, 72 potential failure modes were identified and ranked according to their associated RPN. Quality assurance controls and safety barriers were then added, with failure modes associated with the highest risk being addressed first. Conclusion: A quality assurance program was created from a risk-based assessment of the SirSpheres process. Process mapping and FMEA were effective in identifying potential high-risk failure modes for this new procedure, which were prioritized for new quality assurance controls. TG 100 recommends the fault tree analysis methodology to design a comprehensive and effective QC/QM program, yet we found that simply introducing additional safety barriers to address high-RPN failure modes makes the whole process simpler and safer.
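    The RPN scoring described above multiplies each failure mode's occurrence, detectability and severity grades and then ranks the modes; a minimal sketch, in which the failure modes and scores are hypothetical examples, not data from the study:

```python
# Hypothetical failure modes graded 1-10 for occurrence (O),
# lack of detectability (D), and severity (S), FMEA-style.
failure_modes = [
    {"name": "wrong activity ordered", "O": 2, "D": 7, "S": 9},
    {"name": "dose miscalculation",    "O": 3, "D": 5, "S": 8},
    {"name": "patient mis-identified", "O": 1, "D": 4, "S": 10},
]

# Risk priority number: RPN = O * D * S.
for fm in failure_modes:
    fm["RPN"] = fm["O"] * fm["D"] * fm["S"]

# Address the highest-RPN modes first.
ranked = sorted(failure_modes, key=lambda fm: fm["RPN"], reverse=True)
```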

  6. [Discussion on Quality Evaluation Method of Medical Device During Life-Cycle in Operation Based on the Analytic Hierarchy Process].

    PubMed

    Zheng, Caixian; Zheng, Kun; Shen, Yunming; Wu, Yunyun

    2016-01-01

    The content related to quality during the life-cycle in operation of a medical device includes daily use, repair volume, preventive maintenance, quality control and adverse event monitoring. In view of this, the article discusses a quality evaluation method for medical devices during their life-cycle in operation based on the Analytic Hierarchy Process (AHP). The presented method is shown to be effective by evaluating patient monitors as an example. The method presented can promote and guide device quality control work, and it can provide valuable inputs to decisions about the purchase of new devices.
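    The abstract does not detail the AHP computation; a minimal sketch of the commonly used geometric-mean weighting step is given below, where the three criteria and the pairwise judgments are hypothetical, not taken from the study:

```python
import math

# Pairwise comparisons (Saaty 1-9 scale) for three hypothetical criteria:
# daily use, preventive maintenance, quality control. matrix[i][j] states
# how strongly criterion i is preferred over criterion j.
matrix = [
    [1.0, 3.0, 5.0],
    [1 / 3, 1.0, 3.0],
    [1 / 5, 1 / 3, 1.0],
]

# Geometric-mean method: row geometric means, normalized to sum to 1,
# approximate the principal-eigenvector weights.
n = len(matrix)
gmeans = [math.prod(row) ** (1.0 / n) for row in matrix]
weights = [g / sum(gmeans) for g in gmeans]
```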

  7. Application of a tablet film coating model to define a process-imposed transition boundary for robust film coating.

    PubMed

    van den Ban, Sander; Pitt, Kendal G; Whiteman, Marshall

    2018-02-01

    A scientific understanding of interaction of product, film coat, film coating process, and equipment is important to enable design and operation of industrial scale pharmaceutical film coating processes that are robust and provide the level of control required to consistently deliver quality film coated product. Thermodynamic film coating conditions provided in the tablet film coating process impact film coat formation and subsequent product quality. A thermodynamic film coating model was used to evaluate film coating process performance over a wide range of film coating equipment from pilot to industrial scale (2.5-400 kg). An approximate process-imposed transition boundary, from operating in a dry to a wet environment, was derived, for relative humidity and exhaust temperature, and used to understand the impact of the film coating process on product formulation and process control requirements. This approximate transition boundary may aid in an enhanced understanding of risk to product quality, application of modern Quality by Design (QbD) based product development, technology transfer and scale-up, and support the science-based justification of critical process parameters (CPPs).

  8. Physical and sensory quality of Java Arabica green coffee beans

    NASA Astrophysics Data System (ADS)

    Sunarharum, W. B.; Yuwono, S. S.; Pangestu, N. B. S. W.; Nadhiroh, H.

    2018-03-01

    Demand for high-quality coffee for consumption is continually increasing, not only in the consuming countries (importers) but also in the producing countries (exporters). Coffee quality can be affected by several factors from farm to cup, including the post-harvest processing methods. This research aimed to investigate the influence of different post-harvest processing methods on the physical and sensory quality of Java Arabica green coffee beans. Three post-harvest processing methods for producing green coffee beans (natural/dry, semi-washed and fully-washed) were evaluated under sun drying. Physical quality evaluation was based on the Indonesian National Standard (SNI 01-2907-2008), while sensory quality was evaluated by five expert judges. The results show that fewer defects were observed in wet-processed coffee than in dry-processed coffee. Mechanical drying was also shown to yield higher-quality green coffee beans and to minimise losses.

  9. Evidence based post graduate training. A systematic review of reviews based on the WFME quality framework

    PubMed Central

    2011-01-01

    Background A framework for high quality in post graduate training has been defined by the World Federation for Medical Education (WFME). The objective of this paper is to perform a systematic review of reviews to find current evidence regarding aspects of quality of post graduate training and to organise the results following the 9 areas of the WFME framework. Methods The systematic literature review was conducted in 2009 in Medline Ovid, EMBASE, ERIC and RDRB databases from 1995 onward. The reviews were selected by two independent researchers and a quality appraisal was based on the SIGN tool. Results 31 reviews met inclusion criteria. The majority of the reviews provided information about the training process (WFME area 2), the assessment of trainees (WFME area 3) and the trainees (WFME area 4). One review covered area 8, 'governance and administration'. No review was found relating to mission and outcomes, evaluation of the training process, or continuous renewal (areas 1, 7 and 9 of the WFME framework, respectively). Conclusions The majority of the reviews provided information about the training process, the assessment of trainees and the trainees. Indicators used for quality assessment of post graduate training should be based on this evidence, but further research is needed for some areas, in particular the assessment of the quality of the training process. PMID:21977898

  10. Linkages between Total Quality Management and the Outcomes-Based Approach in an Education Environment

    ERIC Educational Resources Information Center

    de Jager, H. J.; Nieuwenhuis, F. J.

    2005-01-01

    South Africa has embarked on a process of education renewal by adopting outcomes-based education (OBE). This paper focuses on the linkages between total quality management (TQM) and the outcomes-based approach in an education context. Quality assurance in academic programmes in higher education in South Africa is, in some instances, based on the…

  11. Using activity-based learning approach to enhance the quality of instruction in civil engineering in Indonesian universities

    NASA Astrophysics Data System (ADS)

    Priyono, Wena, Made; Rahardjo, Boedi

    2017-09-01

    Experts and practitioners agree that the quality of higher education in Indonesia needs to be improved significantly and continuously. The low quality of university graduates is caused by many factors, one of which is the poor quality of learning. Today's instruction process tends to place great emphasis only on delivering knowledge. To avoid the pitfalls of such instruction, e.g. passive learning, Civil Engineering students should be given more opportunities to interact with others and actively participate in the learning process. Based on a number of theoretical and empirical studies, one appropriate strategy to overcome the aforementioned problem is to develop and implement an activity-based learning approach.

  12. TU-AB-BRD-01: Process Mapping

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Palta, J.

    2015-06-15

    Current quality assurance and quality management guidelines provided by various professional organizations are prescriptive in nature, focusing principally on performance characteristics of planning and delivery devices. However, published analyses of events in radiation therapy show that most events are often caused by flaws in clinical processes rather than by device failures. This suggests the need for the development of a quality management program that is based on integrated approaches to process and equipment quality assurance. Industrial engineers have developed various risk assessment tools that are used to identify and eliminate potential failures from a system or a process before a failure impacts a customer. These tools include, but are not limited to, process mapping, failure modes and effects analysis, and fault tree analysis. Task Group 100 of the American Association of Physicists in Medicine has developed these tools and used them to formulate an example risk-based quality management program for intensity-modulated radiotherapy. This is a prospective risk assessment approach that analyzes potential error pathways inherent in a clinical process and then ranks them according to relative risk, typically before implementation, followed by the design of a new process or modification of the existing process. Appropriate controls are then put in place to ensure that failures are less likely to occur and, if they do, they will more likely be detected before they propagate through the process, compromising treatment outcome and causing harm to the patient. Such a prospective approach forms the basis of the work of Task Group 100 that has recently been approved by the AAPM. This session will be devoted to a discussion of these tools and practical examples of how these tools can be used in a given radiotherapy clinic to develop a risk based quality management program.
    Learning Objectives: (1) Learn how to design a process map for a radiotherapy process; (2) learn how to perform failure modes and effects analysis for a given process; (3) learn what fault trees are all about; (4) learn how to design a quality management program based upon the information obtained from process mapping, failure modes and effects analysis and fault tree analysis. Disclosures: Dunscombe: Director, TreatSafely, LLC and Center for the Assessment of Radiological Sciences; Consultant to IAEA and Varian. Thomadsen: President, Center for the Assessment of Radiological Sciences. Palta: Vice President of the Center for the Assessment of Radiological Sciences.

  13. Development of the Quality of Australian Nursing Documentation in Aged Care (QANDAC) instrument to assess paper-based and electronic resident records.

    PubMed

    Wang, Ning; Björvell, Catrin; Hailey, David; Yu, Ping

    2014-12-01

    To develop the Quality of Australian Nursing Documentation in Aged Care (QANDAC) instrument to measure the quality of paper-based and electronic resident records. The instrument was based on the nursing process model and on three attributes of documentation quality identified in a systematic review. The development process involved five phases following approaches to designing criterion-referenced measures. The face and content validities and the inter-rater reliability of the instrument were estimated using a focus group approach and consensus model. The instrument contains 34 questions in three sections: completion of nursing history and assessment, description of the care process and meeting the requirements of data entry. Estimates of the validity and inter-rater reliability of the instrument gave satisfactory results. The QANDAC instrument may be a useful audit tool for quality improvement and research in aged care documentation. © 2013 ACOTA.

  14. Research on construction quality and improvement of assembly construction

    NASA Astrophysics Data System (ADS)

    Cheng, Fei

    2017-11-01

    With the acceleration of urbanization and the improvement in residents' quality of life, demand for building construction has been increasing. In this context, the construction industry has strengthened the adoption of new technologies in order to improve construction efficiency and quality and to meet the needs of the times. At present, China's engineering construction units are making greater use of assembly-type (prefabricated) construction technology in their projects, which addresses the low efficiency and high time consumption of traditional construction work and promotes a steady improvement in production efficiency. This paper analyses the connotation of assembly-type construction and its structural systems and design, examines the quality problems arising in the construction process of such projects, and puts forward improvement measures to raise building quality and construction speed.

  15. Development and testing of a fast conceptual river water quality model.

    PubMed

    Keupers, Ingrid; Willems, Patrick

    2017-04-15

    Modern, model-based river quality management relies strongly on river water quality models to simulate the temporal and spatial evolution of pollutant concentrations in the water body. Such models are typically constructed by extending detailed hydrodynamic models with a component describing the advection-diffusion and water quality transformation processes in a detailed, physically based way. This approach is too computationally demanding, especially when simulating the long time periods needed for statistical analysis of the results, or when model sensitivity analysis, calibration and validation require a large number of model runs. To overcome this problem, a structure identification method for setting up a conceptual river water quality model has been developed. Instead of calculating the water quality concentrations at each water level and discharge node, the river branch is divided into conceptual reservoirs based on user information such as locations of interest and boundary inputs. These reservoirs are modelled as Plug Flow Reactors (PFR) and Continuously Stirred Tank Reactors (CSTR) to describe advection and diffusion processes. The same water quality transformation processes as in the detailed models are considered, but with residence times adjusted on the basis of the hydrodynamic simulation results and calibrated to the detailed water quality simulation results. The developed approach allows for a much faster calculation time (a factor of 10^5) without significant loss of accuracy, making it feasible to perform time-demanding scenario runs. Copyright © 2017 Elsevier Ltd. All rights reserved.
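    A CSTR reservoir in such a conceptual model is a mass balance with a residence time; the sketch below routes a constant boundary input through reservoirs in series, adding a first-order decay term as an assumed stand-in for the transformation processes (all names, volumes and rates are hypothetical, not taken from the paper):

```python
def simulate_cstr_chain(c_in, volumes, flow, k_decay, dt, steps):
    """Route a constant inflow concentration through CSTRs in series.

    Each reservoir obeys dC/dt = (Q/V) * (C_upstream - C) - k * C.
    Explicit Euler integration; c_in in mg/L, flow Q in m^3/s,
    volumes V in m^3, k_decay in 1/s, dt in s.
    """
    conc = [0.0] * len(volumes)
    for _ in range(steps):
        upstream = c_in
        for i, v in enumerate(volumes):
            dcdt = (flow / v) * (upstream - conc[i]) - k_decay * conc[i]
            conc[i] += dcdt * dt
            upstream = conc[i]  # outflow feeds the next reservoir
    return conc

# Hypothetical branch: three reservoirs with a constant boundary input.
out = simulate_cstr_chain(c_in=5.0, volumes=[1e4, 2e4, 1e4],
                          flow=2.0, k_decay=1e-5, dt=60.0, steps=5000)
```

    The explicit Euler step is stable here because dt is much smaller than each reservoir's residence time V/Q; decay makes the steady-state concentration fall monotonically downstream.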

  16. High-efficiency cell concepts on low-cost silicon sheets

    NASA Technical Reports Server (NTRS)

    Bell, R. O.; Ravi, K. V.

    1985-01-01

    The limitations of sheet growth material in terms of defect structure and minority carrier lifetime are discussed, and the effects of various defects on performance are estimated. Given these limitations, designs for a sheet-growth cell that make the best of the material characteristics are proposed, along with the achievement of optimum synergy between base material quality and device processing variables. A strong coupling exists between material quality, the variables during crystal growth, and device processing variables. Two objectives are outlined: (1) optimization of this coupling for maximum performance at minimal cost; and (2) decoupling of materials from processing by improving base material quality to make it less sensitive to processing variables.

  17. DATA QUALITY OBJECTIVES-FOUNDATION OF A SUCCESSFUL MONITORING PROGRAM

    EPA Science Inventory

    The data quality objectives (DQO) process is a fundamental site characterization tool and the foundation of a successful monitoring program. The DQO process is a systematic planning approach based on the scientific method of inquiry. The process identifies the goals of data col...

  18. Evaluating treatment process redesign by applying the EFQM Excellence Model.

    PubMed

    Nabitz, Udo; Schramade, Mark; Schippers, Gerard

    2006-10-01

    To evaluate a treatment process redesign programme implementing evidence-based treatment as part of total quality management in a Dutch addiction treatment centre. Quality management was monitored over a period of more than 10 years in an addiction treatment centre with 550 professionals. Changes are evaluated by comparing the scores on the nine criteria of the European Foundation for Quality Management (EFQM) Excellence Model before and after a major redesign of treatment processes and ISO certification. In the course of 10 years, most intake, care, and cure processes were reorganized; the support processes were restructured and ISO certified; 29 evidence-based treatment protocols were developed and implemented; and patient follow-up measurement was established to make clinical outcomes transparent. Comparing the situation before and after the changes shows that the client satisfaction scores are stable, that the evaluation by personnel and society is inconsistent, and that clinical, production, and financial outcomes are positive. The overall EFQM assessment by external assessors in 2004 shows much higher scores on the nine criteria than the assessment in 1994. Evidence-based treatment can be implemented successfully in addiction treatment centres through treatment process redesign as part of a total quality management strategy, but not all results are positive.

  19. A systematic review of Human Factors and Ergonomics (HFE)-based healthcare system redesign for quality of care and patient safety

    PubMed Central

    Xie, Anping; Carayon, Pascale

    2014-01-01

    Healthcare systems need to be redesigned to provide care that is safe, effective and efficient, and meets the multiple needs of patients. This systematic review examines how Human Factors and Ergonomics (HFE) is applied to redesign healthcare work systems and processes and improve quality and safety of care. We identified twelve projects representing 23 studies and addressing different physical, cognitive and organizational HFE issues in a variety of healthcare systems and care settings. Some evidence exists for the effectiveness of HFE-based healthcare system redesign in improving process and outcome measures of quality and safety of care. We assessed risk of bias in 16 studies reporting the impact of HFE-based healthcare system redesign and found varying quality across studies. Future research should further assess the impact of HFE on quality and safety of care, and clearly define the mechanisms by which HFE-based system redesign can improve quality and safety of care. Practitioner Summary: Existing evidence shows that HFE-based healthcare system redesign has the potential to improve quality of care and patient safety. Healthcare organizations need to recognize the importance of HFE-based healthcare system redesign to quality of care and patient safety, and invest resources to integrate HFE in healthcare improvement activities. PMID:25323570

  20. A systematic review of human factors and ergonomics (HFE)-based healthcare system redesign for quality of care and patient safety.

    PubMed

    Xie, Anping; Carayon, Pascale

    2015-01-01

    Healthcare systems need to be redesigned to provide care that is safe, effective and efficient, and meets the multiple needs of patients. This systematic review examines how human factors and ergonomics (HFE) is applied to redesign healthcare work systems and processes and improve quality and safety of care. We identified 12 projects representing 23 studies and addressing different physical, cognitive and organisational HFE issues in a variety of healthcare systems and care settings. Some evidence exists for the effectiveness of HFE-based healthcare system redesign in improving process and outcome measures of quality and safety of care. We assessed risk of bias in 16 studies reporting the impact of HFE-based healthcare system redesign and found varying quality across studies. Future research should further assess the impact of HFE on quality and safety of care, and clearly define the mechanisms by which HFE-based system redesign can improve quality and safety of care.

  1. Assessing the structure of non-routine decision processes in Airline Operations Control.

    PubMed

    Richters, Floor; Schraagen, Jan Maarten; Heerkens, Hans

    2016-03-01

    Unfamiliar severe disruptions challenge Airline Operations Control professionals most, as their expertise is stretched to its limits. This study elicited the structure of Airline Operations Control professionals' decision process during unfamiliar disruptions by mapping three macrocognitive activities on the decision ladder: sensemaking, option evaluation and action planning. The relationship between this structure and decision quality was measured. A simulated task was staged, from which think-aloud protocols were obtained. Results show that the general decision process structure resembles the structure of experts working under routine conditions, in terms of the general structure of the macrocognitive activities and the rule-based approach used to identify options and actions. Surprisingly, high quality of decision outcomes was found to relate to the use of rule-based strategies. This implies that successful professionals are capable of dealing with unfamiliar problems by reframing them into familiar ones, rather than engaging in knowledge-based processing. Practitioner Summary: We examined the macrocognitive structure of Airline Operations Control professionals' decision process during a simulated unfamiliar disruption in relation to decision quality. Results suggest that successful professionals are capable of dealing with unfamiliar problems by reframing them into familiar ones, rather than engaging in knowledge-based processing.

  2. TU-AB-BRD-00: Task Group 100

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    NONE

    2015-06-15

    Current quality assurance and quality management guidelines provided by various professional organizations are prescriptive in nature, focusing principally on performance characteristics of planning and delivery devices. However, published analyses of events in radiation therapy show that most events are caused by flaws in clinical processes rather than by device failures. This suggests the need for the development of a quality management program that is based on integrated approaches to process and equipment quality assurance. Industrial engineers have developed various risk assessment tools that are used to identify and eliminate potential failures from a system or a process before a failure impacts a customer. These tools include, but are not limited to, process mapping, failure modes and effects analysis, and fault tree analysis. Task Group 100 of the American Association of Physicists in Medicine has developed these tools and used them to formulate an example risk-based quality management program for intensity-modulated radiotherapy. This is a prospective risk assessment approach that analyzes potential error pathways inherent in a clinical process and then ranks them according to relative risk, typically before implementation, followed by the design of a new process or modification of the existing process. Appropriate controls are then put in place to ensure that failures are less likely to occur and, if they do, that they will more likely be detected before they propagate through the process, compromising treatment outcome and causing harm to the patient. Such a prospective approach forms the basis of the work of Task Group 100, which has recently been approved by the AAPM. This session will be devoted to a discussion of these tools and practical examples of how they can be used in a given radiotherapy clinic to develop a risk-based quality management program.
Learning Objectives: (1) Learn how to design a process map for a radiotherapy process; (2) learn how to perform failure modes and effects analysis for a given process; (3) learn what fault trees are all about; (4) learn how to design a quality management program based upon the information obtained from process mapping, failure modes and effects analysis, and fault tree analysis. Dunscombe: Director, TreatSafely, LLC and Center for the Assessment of Radiological Sciences; Consultant to IAEA and Varian. Thomadsen: President, Center for the Assessment of Radiological Sciences. Palta: Vice President of the Center for the Assessment of Radiological Sciences.

  3. TU-AB-BRD-03: Fault Tree Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dunscombe, P.

    2015-06-15

    Current quality assurance and quality management guidelines provided by various professional organizations are prescriptive in nature, focusing principally on performance characteristics of planning and delivery devices. However, published analyses of events in radiation therapy show that most events are caused by flaws in clinical processes rather than by device failures. This suggests the need for the development of a quality management program that is based on integrated approaches to process and equipment quality assurance. Industrial engineers have developed various risk assessment tools that are used to identify and eliminate potential failures from a system or a process before a failure impacts a customer. These tools include, but are not limited to, process mapping, failure modes and effects analysis, and fault tree analysis. Task Group 100 of the American Association of Physicists in Medicine has developed these tools and used them to formulate an example risk-based quality management program for intensity-modulated radiotherapy. This is a prospective risk assessment approach that analyzes potential error pathways inherent in a clinical process and then ranks them according to relative risk, typically before implementation, followed by the design of a new process or modification of the existing process. Appropriate controls are then put in place to ensure that failures are less likely to occur and, if they do, that they will more likely be detected before they propagate through the process, compromising treatment outcome and causing harm to the patient. Such a prospective approach forms the basis of the work of Task Group 100, which has recently been approved by the AAPM. This session will be devoted to a discussion of these tools and practical examples of how they can be used in a given radiotherapy clinic to develop a risk-based quality management program.
Learning Objectives: (1) Learn how to design a process map for a radiotherapy process; (2) learn how to perform failure modes and effects analysis for a given process; (3) learn what fault trees are all about; (4) learn how to design a quality management program based upon the information obtained from process mapping, failure modes and effects analysis, and fault tree analysis. Dunscombe: Director, TreatSafely, LLC and Center for the Assessment of Radiological Sciences; Consultant to IAEA and Varian. Thomadsen: President, Center for the Assessment of Radiological Sciences. Palta: Vice President of the Center for the Assessment of Radiological Sciences.
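As a concrete illustration of the fault-tree arithmetic this talk covers, the sketch below propagates basic-event probabilities through AND/OR gates to a top event, assuming independent events. The event names and probabilities are invented for illustration, not taken from TG-100.

```python
def or_gate(probs):
    """P(at least one event occurs), for independent basic events."""
    p_none = 1.0
    for q in probs:
        p_none *= (1.0 - q)
    return 1.0 - p_none

def and_gate(probs):
    """P(all events occur), for independent basic events."""
    p_all = 1.0
    for q in probs:
        p_all *= q
    return p_all

# Hypothetical mini fault tree for a radiotherapy top event:
# wrong dose delivered = (planning error AND physics check missed) OR setup error
p_planning_error = 0.01
p_check_missed   = 0.10
p_setup_error    = 0.002
p_top = or_gate([and_gate([p_planning_error, p_check_missed]), p_setup_error])
```

Reading the tree bottom-up like this also shows where a new control helps most: lowering the detectability branch (`p_check_missed`) shrinks the AND term multiplicatively.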

  4. TU-AB-BRD-02: Failure Modes and Effects Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Huq, M.

    2015-06-15

    Current quality assurance and quality management guidelines provided by various professional organizations are prescriptive in nature, focusing principally on performance characteristics of planning and delivery devices. However, published analyses of events in radiation therapy show that most events are caused by flaws in clinical processes rather than by device failures. This suggests the need for the development of a quality management program that is based on integrated approaches to process and equipment quality assurance. Industrial engineers have developed various risk assessment tools that are used to identify and eliminate potential failures from a system or a process before a failure impacts a customer. These tools include, but are not limited to, process mapping, failure modes and effects analysis, and fault tree analysis. Task Group 100 of the American Association of Physicists in Medicine has developed these tools and used them to formulate an example risk-based quality management program for intensity-modulated radiotherapy. This is a prospective risk assessment approach that analyzes potential error pathways inherent in a clinical process and then ranks them according to relative risk, typically before implementation, followed by the design of a new process or modification of the existing process. Appropriate controls are then put in place to ensure that failures are less likely to occur and, if they do, that they will more likely be detected before they propagate through the process, compromising treatment outcome and causing harm to the patient. Such a prospective approach forms the basis of the work of Task Group 100, which has recently been approved by the AAPM. This session will be devoted to a discussion of these tools and practical examples of how they can be used in a given radiotherapy clinic to develop a risk-based quality management program.
Learning Objectives: (1) Learn how to design a process map for a radiotherapy process; (2) learn how to perform failure modes and effects analysis for a given process; (3) learn what fault trees are all about; (4) learn how to design a quality management program based upon the information obtained from process mapping, failure modes and effects analysis, and fault tree analysis. Dunscombe: Director, TreatSafely, LLC and Center for the Assessment of Radiological Sciences; Consultant to IAEA and Varian. Thomadsen: President, Center for the Assessment of Radiological Sciences. Palta: Vice President of the Center for the Assessment of Radiological Sciences.
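The FMEA ranking step this talk addresses can be sketched as follows: severity, occurrence and detectability scores multiply into a Risk Priority Number (RPN), and failure modes are addressed from the highest RPN down. The process steps, descriptions and scores below are hypothetical examples, not TG-100 values.

```python
from dataclasses import dataclass

@dataclass
class FailureMode:
    step: str
    description: str
    severity: int       # 1 (negligible) .. 10 (catastrophic)
    occurrence: int     # 1 (rare) .. 10 (frequent)
    detectability: int  # 1 (always caught) .. 10 (never caught)

    @property
    def rpn(self) -> int:
        # Risk Priority Number, as used in FMEA-style prospective risk ranking
        return self.severity * self.occurrence * self.detectability

modes = [
    FailureMode("contouring", "wrong target volume drawn", 9, 3, 4),
    FailureMode("planning", "incorrect prescription entered", 8, 2, 2),
    FailureMode("delivery", "patient setup error", 6, 4, 5),
]
# Rank failure modes from highest to lowest risk
ranked = sorted(modes, key=lambda m: m.rpn, reverse=True)
```

Note how a moderately severe but poorly detected failure can outrank a more severe one; that is exactly the prioritization behavior a prescriptive, device-centric QA checklist misses.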

  5. Parametric Optimization Of Gas Metal Arc Welding Process By Using Grey Based Taguchi Method On Aisi 409 Ferritic Stainless Steel

    NASA Astrophysics Data System (ADS)

    Ghosh, Nabendu; Kumar, Pradip; Nandi, Goutam

    2016-10-01

    Welding input process parameters play a very significant role in determining the quality of a welded joint. Only by properly controlling every element of the process can product quality be controlled. For better quality in MIG welding of ferritic stainless steel AISI 409, precise control of process parameters, parametric optimization of those parameters, prediction and control of the desired responses (quality indices), and continued, elaborate experiments, analysis and modeling are needed. A knowledge base may thus be generated that practicing engineers and technicians can use to produce good quality welds more precisely, reliably and predictably. In the present work, an X-ray radiographic test has been conducted in order to detect surface and sub-surface defects of weld specimens made of ferritic stainless steel. The quality of the weld has been evaluated in terms of yield strength, ultimate tensile strength and percentage of elongation of the welded specimens. The observed data have been interpreted, discussed and analyzed by considering ultimate tensile strength, yield strength and percentage elongation, combined with use of the Grey-Taguchi methodology.
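The Grey-Taguchi analysis reduces the several quality indices to a single grey relational grade per experimental run, so runs can be ranked on one number. A minimal sketch, with illustrative strength and elongation values rather than the paper's measurements:

```python
import numpy as np

def grey_relational_grade(responses, zeta=0.5):
    """Grey relational grade for larger-the-better responses.

    responses: (n_runs, n_indices) array, e.g. columns for UTS, yield
    strength and % elongation of each welded specimen.
    """
    r = np.asarray(responses, dtype=float)
    # 1) larger-the-better normalisation to [0, 1] per quality index
    norm = (r - r.min(axis=0)) / (r.max(axis=0) - r.min(axis=0))
    # 2) deviation from the ideal sequence (all ones)
    delta = 1.0 - norm
    # 3) grey relational coefficients, with distinguishing coefficient zeta
    coeff = (delta.min() + zeta * delta.max()) / (delta + zeta * delta.max())
    # 4) grade = mean coefficient over the quality indices
    return coeff.mean(axis=1)

# Three hypothetical runs: [UTS (MPa), yield strength (MPa), elongation (%)]
runs = [[420, 310, 22], [455, 330, 25], [440, 305, 28]]
grades = grey_relational_grade(runs)
best_run = int(np.argmax(grades))  # parameter setting with the best overall grade
```

Equal weighting of the indices in step 4 is the common default; weighted means are used when one response (say, UTS) matters more than the others.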

  6. Quality inspection guided laser processing of irregular shape objects by stereo vision measurement: application in badminton shuttle manufacturing

    NASA Astrophysics Data System (ADS)

    Qi, Li; Wang, Shun; Zhang, Yixin; Sun, Yingying; Zhang, Xuping

    2015-11-01

    The quality inspection process is usually carried out after first processing of the raw materials, such as cutting and milling, because the parts of the materials to be used are unidentified until they have been trimmed. If the quality of the material is assessed before the laser process, then the energy and effort wasted on defective materials can be saved. We propose a new production scheme that achieves quantitative quality inspection prior to primitive laser cutting by means of three-dimensional (3-D) vision measurement. First, the 3-D model of the object is reconstructed by the stereo cameras, from which the spatial cutting path is derived. Second, collaborating with another rear camera, the 3-D cutting path is reprojected to both the frontal and rear views of the object, generating the regions-of-interest (ROIs) for surface defect analysis. Accurate vision-guided laser processing and reprojection-based ROI segmentation are enabled by a global-optimization-based trinocular calibration method. The prototype system was built and tested with the processing of raw duck feathers for high-quality badminton shuttle manufacture. Incorporating a two-dimensional wavelet-decomposition-based defect analysis algorithm, both the geometrical and appearance features of the raw feathers are quantified before they are cut into small patches, resulting in fully automatic feather cutting and sorting.
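The reprojection step, mapping the 3-D cutting path into a camera view to generate ROIs, rests on the standard pinhole model x ~ K[R|t]X. A minimal sketch with a made-up intrinsic matrix and an identity pose, not the paper's calibrated trinocular rig:

```python
import numpy as np

def reproject(points_3d, K, R, t):
    """Project 3-D cutting-path points into one camera view (pinhole model)."""
    X = np.asarray(points_3d, dtype=float).T   # shape (3, N)
    cam = R @ X + t.reshape(3, 1)              # world -> camera coordinates
    uv = K @ cam                               # homogeneous image coordinates
    return (uv[:2] / uv[2]).T                  # pixel coordinates, shape (N, 2)

# Illustrative intrinsics: 800 px focal length, principal point (320, 240)
K = np.array([[800.0,   0.0, 320.0],
              [  0.0, 800.0, 240.0],
              [  0.0,   0.0,   1.0]])
R = np.eye(3)          # camera aligned with world axes (assumption)
t = np.zeros(3)        # camera at the world origin (assumption)

path = [[0.0, 0.0, 2.0], [0.1, 0.0, 2.0]]  # two path points 2 m in front
pixels = reproject(path, K, R, t)
```

In the paper's setup each camera would carry its own calibrated (K, R, t), and the projected path polygon would be rasterized into an ROI mask for the defect analysis.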

  7. How product trial changes quality perception of four new processed beef products.

    PubMed

    Saeed, Faiza; Grunert, Klaus G; Therkildsen, Margrethe

    2013-01-01

    The purpose of this paper is a quantitative analysis of the change in quality perception of four new processed beef products from the pre-trial to the post-trial phase. Based on the Total Food Quality Model, differences between the pre- and post-trial phases were measured using a repeated-measures technique for cue evaluation, quality evaluation and purchase motive fulfillment. For two of the tested products, trial resulted in a decline in the evaluation of cues, quality and purchase motive fulfillment compared to pre-trial expectations. For these products, positive expectations were created by giving information about ingredients and ways of processing, which were not confirmed during trial. For the other two products, evaluations on key sensory dimensions based on trial exceeded expectations, whereas the other evaluations remained unchanged. Several demographic factors influenced the pattern of results, notably age and gender, which may be due to underlying differences in previous experience. The study gives useful insights for the testing of new processed meat products before market introduction. Copyright © 2012 Elsevier Ltd. All rights reserved.

  8. Fabrication of high quality aspheric microlens array by dose-modulated lithography and surface thermal reflow

    NASA Astrophysics Data System (ADS)

    Huang, Shengzhou; Li, Mujun; Shen, Lianguan; Qiu, Jinfeng; Zhou, Youquan

    2018-03-01

    A novel fabrication method for high quality aspheric microlens arrays (MLA) was developed by combining dose-modulated DMD-based lithography with a surface thermal reflow process. In this method, the complex shape of an aspheric microlens is pre-modeled via dose modulation in digital micromirror device (DMD) based maskless projection lithography; the dose modulation mainly depends on the distribution of the exposure dose in the photoresist. The pre-shaped aspheric microlens is then polished by a following non-contact thermal reflow (NCTR) process. Different from the normal process, the reflow process here is designed to improve the surface quality while keeping the pre-modeled shape unchanged, thus avoiding the difficulty of generating the aspheric surface during reflow. Fabrication of a designed aspheric MLA with this method was demonstrated in experiments. Results showed that the obtained aspheric MLA was good in both shape accuracy and surface quality. The presented method may be a promising approach to rapidly fabricating high quality aspheric microlenses with complex surfaces.

  9. Modelling and control for laser based welding processes: modern methods of process control to improve quality of laser-based joining methods

    NASA Astrophysics Data System (ADS)

    Zäh, Ralf-Kilian; Mosbach, Benedikt; Hollwich, Jan; Faupel, Benedikt

    2017-02-01

    To ensure the competitiveness of manufacturing companies, it is indispensable to optimize their manufacturing processes. Slight variations of process parameters and machine settings should have only marginal effects on product quality; therefore, the largest possible editing window is required. Such parameters are, for example, the movement of the laser beam across the component in laser keyhole welding. It is therefore necessary to keep the formation of welding seams within specified limits. The quality of laser welding processes is currently ensured by post-process methods, like ultrasonic inspection, or by special in-process methods. These in-process systems achieve only a simple evaluation which shows whether the weld seam is acceptable or not. Furthermore, in-process systems provide no feedback for changing control variables such as the speed of the laser or the adjustment of the laser power. In this paper the research group presents current results in the research field of online monitoring, online controlling and model predictive control of laser welding processes to increase product quality. To record the characteristics of the welding process, tested online methods are used during the process. Based on the measurement data, a state space model is ascertained which includes all the control variables of the system. Using simulation tools, the model predictive controller (MPC) is designed for the model and integrated into an NI real-time system.

  10. Q-marker based strategy for CMC research of Chinese medicine: A case study of Panax Notoginseng saponins.

    PubMed

    Zhong, Yi; Zhu, Jieqiang; Yang, Zhenzhong; Shao, Qing; Fan, Xiaohui; Cheng, Yiyu

    2018-01-31

    To ensure pharmaceutical quality, chemistry, manufacturing and control (CMC) research is essential. However, due to the inherent complexity of Chinese medicine (CM), CMC study of CM remains a great challenge for academia, industry, and regulatory agencies. Recently, the quality-marker (Q-marker) concept was proposed to establish quality standards and quality analysis approaches for Chinese medicine, which sheds light on Chinese medicine's CMC study. Here the manufacturing processes of Panax notoginseng saponins (PNS) are taken as a case study, and the present work establishes a Q-marker based research strategy for CMC of Chinese medicine. The Q-markers of PNS are selected and established by integrating the chemical profile with pharmacological activities. Then, the key processes of PNS manufacturing are identified by material flow analysis. Furthermore, modeling algorithms are employed to explore the relationship between Q-markers and critical process parameters (CPPs) of the key processes. At last, CPPs of the key processes are optimized in order to improve process efficiency. Among the 97 identified compounds, notoginsenoside R1 and ginsenosides Rg1, Re, Rb1 and Rd are selected as the Q-markers of PNS. Our analysis of PNS manufacturing shows that the extraction process and the column chromatography process are the key processes. With the CPPs of each process as the inputs and the Q-markers' contents as the outputs, two process prediction models are built separately for the extraction process and the column chromatography process of Panax notoginseng, both of which possess good prediction ability. Based on the efficiency models of the extraction and column chromatography processes we constructed, the optimal CPPs of both processes are calculated.
Our results show that the Q-marker based CMC research strategy can be applied to analyze the manufacturing processes of Chinese medicine, to assure product quality and promote key process efficiency simultaneously. Copyright © 2018 Elsevier GmbH. All rights reserved.
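The "CPPs as inputs, Q-marker contents as outputs" modeling step can be sketched with a plain least-squares fit; the CPP names, synthetic data and linear form below are illustrative assumptions, since the abstract does not specify the paper's modeling algorithms.

```python
import numpy as np

# Hypothetical CPPs of an extraction process (e.g. ethanol fraction,
# extraction time, temperature), scaled to [0, 1], versus one Q-marker
# content (e.g. ginsenoside Rg1, arbitrary units). Data are synthetic.
rng = np.random.default_rng(0)
cpps = rng.uniform(0.0, 1.0, size=(30, 3))
true_w = np.array([2.0, -1.0, 0.5])                      # ground-truth effects
content = cpps @ true_w + 3.0 + rng.normal(0.0, 0.01, size=30)

# Fit the process prediction model: content ≈ X w (last column is the bias)
X = np.hstack([cpps, np.ones((30, 1))])
w, *_ = np.linalg.lstsq(X, content, rcond=None)

def predict(cpp):
    """Predict Q-marker content for one CPP setting."""
    return np.append(cpp, 1.0) @ w
```

With such a model in hand, "optimizing the CPPs" amounts to maximizing `predict` over the feasible CPP ranges, subject to any process constraints.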

  11. Quality Measurements in Radiology: A Systematic Review of the Literature and Survey of Radiology Benefit Management Groups.

    PubMed

    Narayan, Anand; Cinelli, Christina; Carrino, John A; Nagy, Paul; Coresh, Josef; Riese, Victoria G; Durand, Daniel J

    2015-11-01

    As the US health care system transitions toward value-based reimbursement, there is an increasing need for metrics to quantify health care quality. Within radiology, many quality metrics are in use, and still more have been proposed, but there have been limited attempts to systematically inventory these measures and classify them using a standard framework. The purpose of this study was to develop an exhaustive inventory of public and private sector imaging quality metrics classified according to the classic Donabedian framework (structure, process, and outcome). A systematic review was performed in which eligibility criteria included published articles (from 2000 onward) from multiple databases. Studies were double-read, with discrepancies resolved by consensus. For the radiology benefit management group (RBM) survey, all six nationally known companies were surveyed. Outcome measures were organized on the basis of standard categories (structure, process, and outcome) and reported using Preferred Reporting Items for Systematic Reviews and Meta-Analyses guidelines. The search strategy yielded 1,816 citations; review yielded 110 reports (29 included for final analysis). Three of six RBMs (50%) responded to the survey; the websites of the other RBMs were searched for additional metrics. Seventy-five unique metrics were reported: 35 structure (46%), 20 outcome (27%), and 20 process (27%) metrics. For RBMs, 35 metrics were reported: 27 structure (77%), 4 process (11%), and 4 outcome (11%) metrics. The most commonly cited structure, process, and outcome metrics included ACR accreditation (37%), ACR Appropriateness Criteria (85%), and peer review (95%), respectively. Imaging quality metrics are more likely to be structural (46%) than process (27%) or outcome (27%) based (P < .05). 
As national value-based reimbursement programs increasingly emphasize outcome-based metrics, radiologists must keep pace by developing the data infrastructure required to collect outcome-based quality metrics. Copyright © 2015 American College of Radiology. Published by Elsevier Inc. All rights reserved.

  12. Benchmarking: A Process for Improvement.

    ERIC Educational Resources Information Center

    Peischl, Thomas M.

    One problem with the outcome-based measures used in higher education is that they measure quantity but not quality. Benchmarking, or the use of some external standard of quality to measure tasks, processes, and outputs, can partially solve that difficulty. Benchmarking allows for the establishment of a systematic process to indicate if outputs…

  13. Drop-on-Demand System for Manufacturing of Melt-based Solid Oral Dosage: Effect of Critical Process Parameters on Product Quality.

    PubMed

    Içten, Elçin; Giridhar, Arun; Nagy, Zoltan K; Reklaitis, Gintaras V

    2016-04-01

    The features of a drop-on-demand-based system developed for the manufacture of melt-based pharmaceuticals have been previously reported. In this paper, a supervisory control system, designed to ensure reproducible production of high-quality melt-based solid oral dosages, is presented. This control system enables the production of individual dosage forms with the desired critical quality attributes, amount of active ingredient and drug morphology, by monitoring and controlling critical process parameters such as drop size and product and process temperatures. The effects of these process parameters on the final product quality are investigated, and the properties of the produced dosage forms are characterized using various techniques, such as Raman spectroscopy, optical microscopy, and dissolution testing. A crystallization temperature control strategy, including controlled temperature cycles, is presented to tailor the crystallization behavior of drug deposits and to achieve consistent drug morphology. This control strategy can be used to achieve the desired bioavailability of the drug by mitigating variations in the dissolution profiles. The supervisory control strategy enables the application of the drop-on-demand system to the production of the individualized dosages required for personalized drug regimens.

  14. Self-assessment procedure using fuzzy sets

    NASA Astrophysics Data System (ADS)

    Mimi, Fotini

    2000-10-01

    Self-assessment processes, initiated by a company itself and carried out by its own people, are considered the starting point for a regular strategic or operative planning process to ensure continuous quality improvement. Their importance has increased with the growing relevance and acceptance of international quality awards such as the Malcolm Baldrige National Quality Award, the European Quality Award and the Deming Prize. Award winners in particular use the instrument of a systematic and regular self-assessment, and not only because they have to verify their quality and business results for at least three years. The Total Quality Model of the European Foundation for Quality Management (EFQM), used for the European Quality Award, is the basis for self-assessment in Europe. This paper presents a self-assessment supporting method based on a methodology of fuzzy control systems, providing an effective means of converting linguistic approximations into an automatic control strategy. In particular, the elements of the Quality Model mentioned above are interpreted as linguistic variables. The LR type of fuzzy interval is used for their representation. The input data have a qualitative character based on empirical investigation and expert knowledge, and the base variables are therefore ordinally scaled. The aggregation process takes place on the basis of a hierarchical structure. Finally, in order to render the use of the method more practical, a PC-based software system is developed and implemented.
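A minimal sketch of the bottom-up fuzzy aggregation: linguistic ratings are mapped to triangular fuzzy numbers (a common special case of the LR representation the paper uses) and combined with a weighted mean per criterion. The term set, weights and scores are hypothetical.

```python
def tri_weighted_mean(scores, weights):
    """Weighted mean of triangular fuzzy scores (l, m, u).

    A stand-in for one level of the hierarchical aggregation: each EFQM
    sub-criterion receives a linguistic score mapped to a triangular
    fuzzy number, and sub-criteria roll up into their criterion.
    """
    total = sum(weights)
    l = sum(w * s[0] for s, w in zip(scores, weights)) / total
    m = sum(w * s[1] for s, w in zip(scores, weights)) / total
    u = sum(w * s[2] for s, w in zip(scores, weights)) / total
    return (l, m, u)

# Hypothetical linguistic term set on a [0, 1] base variable
POOR = (0.0, 0.0, 0.25)
FAIR = (0.25, 0.5, 0.75)
GOOD = (0.75, 1.0, 1.0)

# Three sub-criteria of one criterion, with expert-assigned weights
leadership = tri_weighted_mean([GOOD, FAIR, GOOD], [2, 1, 1])
```

Repeating this roll-up level by level yields a fuzzy overall score whose spread (u - l) conveys how uncertain the self-assessment is, which a crisp point score would hide.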

  15. Process service quality evaluation based on Dempster-Shafer theory and support vector machine.

    PubMed

    Pei, Feng-Que; Li, Dong-Bo; Tong, Yi-Fei; He, Fei

    2017-01-01

    Human involvement influences traditional service quality evaluations, leading to low accuracy, poor reliability and unimpressive predictability. This paper proposes a method that employs a support vector machine (SVM) and Dempster-Shafer evidence theory to evaluate the service quality of a production process by handling a high number of input features with a small sample data set; the method is called SVMs-DS. Features that can affect production quality are extracted by a large number of sensors. Preprocessing steps such as feature simplification and normalization are reduced. Based on three individual SVM models, basic probability assignments (BPAs) are constructed, which support the evaluation in both a qualitative and a quantitative way. The process service quality evaluation results are validated by Dempster's rules; the decision threshold to resolve conflicting results is generated from the three SVM models. A case study is presented to demonstrate the effectiveness of the SVMs-DS method.
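The fusion step can be illustrated with Dempster's rule of combination. The two BPAs below stand in for the outputs of two of the individual SVM models, with invented mass values over a two-element frame of discernment {good, bad}.

```python
def dempster_combine(m1, m2):
    """Combine two basic probability assignments with Dempster's rule.

    BPAs are dicts mapping frozenset focal elements to masses; the
    combined mass is normalised by (1 - conflict).
    """
    combined = {}
    conflict = 0.0
    for a, p in m1.items():
        for b, q in m2.items():
            inter = a & b
            if inter:  # non-empty intersection supports that hypothesis
                combined[inter] = combined.get(inter, 0.0) + p * q
            else:      # empty intersection is conflicting evidence
                conflict += p * q
    if conflict >= 1.0:
        raise ValueError("total conflict: sources cannot be combined")
    return {k: v / (1.0 - conflict) for k, v in combined.items()}

GOOD, BAD = frozenset({"good"}), frozenset({"bad"})
EITHER = GOOD | BAD  # mass on the whole frame expresses ignorance

# Hypothetical BPAs from two SVM classifiers judging process quality
m_svm1 = {GOOD: 0.7, BAD: 0.1, EITHER: 0.2}
m_svm2 = {GOOD: 0.6, BAD: 0.2, EITHER: 0.2}
fused = dempster_combine(m_svm1, m_svm2)
```

Because both sources lean toward "good", the fused mass on GOOD exceeds either individual mass; a third BPA (the paper uses three SVMs) would be folded in by calling `dempster_combine` again on the result.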

  16. Total quality management - It works for aerospace information services

    NASA Technical Reports Server (NTRS)

    Erwin, James; Eberline, Carl; Colquitt, Wanda

    1993-01-01

    Today we are in the midst of information and 'total quality' revolutions. At the NASA STI Program's Center for AeroSpace Information (CASI), we are focused on using continuous improvements techniques to enrich today's services and products and to ensure that tomorrow's technology supports the TQM-based improvement of future STI program products and services. The Continuous Improvements Program at CASI is the foundation for Total Quality Management in products and services. The focus is customer-driven; its goal, to identify processes and procedures that can be improved and new technologies that can be integrated with the processes to gain efficiencies, provide effectiveness, and promote customer satisfaction. This Program seeks to establish quality through an iterative defect prevention approach that is based on the incorporation of standards and measurements into the processing cycle.

  17. The quality of instruments to assess the process of shared decision making: A systematic review.

    PubMed

    Gärtner, Fania R; Bomhof-Roordink, Hanna; Smith, Ian P; Scholl, Isabelle; Stiggelbout, Anne M; Pieterse, Arwen H

    2018-01-01

    To inventory instruments assessing the process of shared decision making and appraise their measurement quality, taking into account the methodological quality of their validation studies. In a systematic review we searched seven databases (PubMed, Embase, Emcare, Cochrane, PsycINFO, Web of Science, Academic Search Premier) for studies investigating instruments measuring the process of shared decision making. Per identified instrument, we assessed the level of evidence separately for 10 measurement properties following a three-step procedure: 1) appraisal of the methodological quality using the COnsensus-based Standards for the selection of health Measurement INstruments (COSMIN) checklist, 2) appraisal of the psychometric quality of the measurement property using three possible quality scores, 3) best-evidence synthesis based on the number of studies, their methodological and psychometric quality, and the direction and consistency of the results. The study protocol was registered at PROSPERO: CRD42015023397. We included 51 articles describing the development and/or evaluation of 40 shared decision-making process instruments: 16 patient questionnaires, 4 provider questionnaires, 18 coding schemes and 2 instruments measuring multiple perspectives. There is an overall lack of evidence for their measurement quality, either because validation is missing or methods are poor. The best-evidence synthesis indicated positive results for a major part of instruments for content validity (50%) and structural validity (53%) if these were evaluated, but negative results for a major part of instruments when inter-rater reliability (47%) and hypothesis testing (59%) were evaluated. Due to the lack of evidence on measurement quality, the choice of the most appropriate instrument is best based on the instrument's content and characteristics, such as the perspective it assesses. We recommend the refinement and validation of existing instruments, and the use of the COSMIN guidelines to help guarantee high-quality evaluations.

  18. EDUCATING MANAGERS ABOUT QUALITY THROUGH CUSTOMER-SUPPLIER UNDERSTANDING

    EPA Science Inventory

    The successful implementation of a Quality System depends largely on the commitment to Quality by managers and their participation in the quality management process. Today, an accepted definition of quality is largely based on the concept of customer and supplier partnerships in a...

  19. E-Services quality assessment framework for collaborative networks

    NASA Astrophysics Data System (ADS)

    Stegaru, Georgiana; Danila, Cristian; Sacala, Ioan Stefan; Moisescu, Mihnea; Mihai Stanescu, Aurelian

    2015-08-01

    In a globalised networked economy, collaborative networks (CNs) are formed to take advantage of new business opportunities. Collaboration involves shared resources and capabilities, such as e-Services that can be dynamically composed to automate CN participants' business processes. Quality is essential for the success of business process automation. Current approaches mostly focus on quality of service (QoS)-based service selection and ranking algorithms, overlooking the process of service composition, which requires interoperable, adaptable and secure e-Services to ensure seamless collaboration, data confidentiality and integrity. Lack of assessment of these quality attributes can result in e-Service composition failure. The quality of e-Service composition relies on the quality of each e-Service and on the quality of the composition process. Therefore, there is a need for a framework that addresses quality from both views: product and process. We propose a quality of e-Service composition (QoESC) framework for the quality assessment of e-Service composition for CNs, which comprises a quality model for e-Service evaluation and guidelines for the quality of the e-Service composition process. We implemented a prototype considering a simplified telemedicine use case involving a CN in the e-Healthcare domain. To validate the proposed quality-driven framework, we analysed service composition reliability with and without using the proposed framework.
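
    The QoS-based selection and ranking the abstract contrasts itself with can be sketched as a weighted-sum score over normalized attributes. The attribute names, weights and candidate services below are hypothetical:

    ```python
    def qos_score(service, weights):
        """Weighted-sum QoS score. Attributes are assumed normalized to
        [0, 1] with higher-is-better (invert cost-type attributes first)."""
        return sum(weights[k] * service[k] for k in weights)

    # Hypothetical weights and candidate e-Services
    weights = {"availability": 0.4, "reliability": 0.4, "response": 0.2}
    candidates = {
        "svcA": {"availability": 0.99, "reliability": 0.95, "response": 0.80},
        "svcB": {"availability": 0.90, "reliability": 0.99, "response": 0.95},
    }
    # Rank candidates from best to worst aggregate score
    ranked = sorted(candidates,
                    key=lambda s: qos_score(candidates[s], weights),
                    reverse=True)
    ```

    Such a score captures the product view only; the paper's point is that the composition process itself also needs quality assessment.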

  20. Quality Assurance Processes in Finnish Universities: Direct and Indirect Outcomes and Organisational Conditions

    ERIC Educational Resources Information Center

    Haapakorpi, Arja

    2011-01-01

    In Finland, quality assurance related to the Bologna process has been adapted to existing systems of higher education at the national level and a form of implementation is also recognised at the level of the higher education institution. In universities, varied outcomes of quality assurance are based on interaction of organisational structures,…

  1. Next-to-leading logarithmic QCD contribution of the electromagnetic dipole operator to B¯→Xsγγ with a massive strange quark

    NASA Astrophysics Data System (ADS)

    Asatrian, H. M.; Greub, C.

    2014-05-01

    We calculate the O(αs) corrections to the double differential decay width dΓ77/(ds1ds2) for the process B¯→Xsγγ, originating from diagrams involving the electromagnetic dipole operator O7. The kinematical variables s1 and s2 are defined as si=(pb-qi)2/mb2, where pb, q1, q2 are the momenta of the b quark and two photons. We introduce a nonzero mass ms for the strange quark to regulate configurations where the gluon or one of the photons become collinear with the strange quark and retain terms which are logarithmic in ms, while discarding terms which go to zero in the limit ms→0. When combining virtual and bremsstrahlung corrections, the infrared and collinear singularities induced by soft and/or collinear gluons drop out. By our cuts the photons do not become soft, but one of them can become collinear with the strange quark. This implies that in the final result a single logarithm of ms survives. In principle, the configurations with collinear photon emission could be treated using fragmentation functions. In a related work we find that similar results can be obtained when simply interpreting ms appearing in the final result as a constituent mass. We do so in the present paper and vary ms between 400 and 600 MeV in the numerics. This work extends a previous paper by us, where only the leading power terms with respect to the (normalized) hadronic mass s3=(pb-q1-q2)2/mb2 were taken into account in the underlying triple differential decay width dΓ77/(ds1ds2ds3).

  2. Challenges and Opportunities: One Stop Processing of Automatic Large-Scale Base Map Production Using Airborne LIDAR Data Within GIS Environment. Case Study: Makassar City, Indonesia

    NASA Astrophysics Data System (ADS)

    Widyaningrum, E.; Gorte, B. G. H.

    2017-05-01

    LiDAR data acquisition is recognized as one of the fastest solutions for providing base data for large-scale topographical base maps worldwide. Automatic LiDAR processing is believed to be one possible scheme to accelerate the provision of large-scale topographic base maps by the Geospatial Information Agency in Indonesia. As a progressive advanced technology, Geographic Information System (GIS) opens possibilities for the automatic processing and analysis of geospatial data. Considering the further needs of spatial data sharing and integration, one-stop processing of LiDAR data in a GIS environment is considered a powerful and efficient approach for base map provision. The quality of the automated topographic base map is assessed and analysed based on its completeness, correctness, quality, and the confusion matrix.
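
    The completeness, correctness and quality measures named above are commonly computed from counts of matched, spurious and missed map objects; a minimal sketch with illustrative counts (the record does not report the actual figures):

    ```python
    def map_quality_metrics(tp, fp, fn):
        """Common object-based extraction quality measures:
        tp = correctly extracted, fp = spurious, fn = missed objects."""
        completeness = tp / (tp + fn)   # share of reference objects found
        correctness = tp / (tp + fp)    # share of extracted objects that are real
        quality = tp / (tp + fp + fn)   # combined measure
        return completeness, correctness, quality

    # Hypothetical counts from comparing an automated map to a reference map
    comp, corr, qual = map_quality_metrics(tp=180, fp=20, fn=30)
    ```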

  3. A Portable Computer System for Auditing Quality of Ambulatory Care

    PubMed Central

    McCoy, J. Michael; Dunn, Earl V.; Borgiel, Alexander E.

    1987-01-01

    Prior efforts to effectively and efficiently audit quality of ambulatory care based on comprehensive process criteria have been limited largely by the complexity and cost of data abstraction and management. Over the years, several demonstration projects have generated large sets of process criteria and mapping systems for evaluating quality of care, but these paper-based approaches have been impractical to implement on a routine basis. Recognizing that portable microcomputers could solve many of the technical problems in abstracting data from medical records, we built upon previously described criteria and developed a microcomputer-based abstracting system that facilitates reliable and cost-effective data abstraction.

  4. Ontology Based Quality Evaluation for Spatial Data

    NASA Astrophysics Data System (ADS)

    Yılmaz, C.; Cömert, Ç.

    2015-08-01

    Many institutions will be providing data to the National Spatial Data Infrastructure (NSDI). Current technical background of the NSDI is based on syntactic web services. It is expected that this will be replaced by semantic web services. The quality of the data provided is important in terms of the decision-making process and the accuracy of transactions. Therefore, the data quality needs to be tested. This topic has been neglected in Turkey. Data quality control for NSDI may be done by private or public "data accreditation" institutions. A methodology is required for data quality evaluation. There are studies for data quality including ISO standards, academic studies and software to evaluate spatial data quality. ISO 19157 standard defines the data quality elements. Proprietary software such as, 1Spatial's 1Validate and ESRI's Data Reviewer offers quality evaluation based on their own classification of rules. Commonly, rule based approaches are used for geospatial data quality check. In this study, we look for the technical components to devise and implement a rule based approach with ontologies using free and open source software in semantic web context. Semantic web uses ontologies to deliver well-defined web resources and make them accessible to end-users and processes. We have created an ontology conforming to the geospatial data and defined some sample rules to show how to test data with respect to data quality elements including; attribute, topo-semantic and geometrical consistency using free and open source software. To test data against rules, sample GeoSPARQL queries are created, associated with specifications.
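
    A rule-based quality check of the kind described can be sketched in plain Python; the actual study expresses rules as ontologies with GeoSPARQL queries, and the feature attributes and rules below are hypothetical:

    ```python
    def check_features(features, rules):
        """Apply declarative quality rules to spatial features and return
        violations, tagged with an ISO 19157-style quality element."""
        violations = []
        for feat in features:
            for element, rule in rules:
                if not rule(feat):
                    violations.append((feat["id"], element))
        return violations

    # Hypothetical rules for road features
    rules = [
        ("attribute consistency", lambda f: f["lanes"] >= 1),
        ("geometric consistency", lambda f: f["length_m"] > 0.0),
    ]
    roads = [
        {"id": "r1", "lanes": 2, "length_m": 120.5},
        {"id": "r2", "lanes": 0, "length_m": 80.0},  # violates the lane rule
    ]
    issues = check_features(roads, rules)
    ```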

  5. Development of LLNA:DAE: a new local lymph node assay that includes the elicitation phase, discriminates borderline-positive chemicals, and is useful for cross-sensitization testing.

    PubMed

    Yamashita, Kunihiko; Shinoda, Shinsuke; Hagiwara, Saori; Itagaki, Hiroshi

    2014-02-01

    We developed a new local lymph node assay (LLNA) that includes the elicitation phase, termed LLNA:DAE, for the discrimination of borderline-positive chemicals as classified by the LLNA modified by Daicel based on ATP content (LLNA:DA) and for cross-sensitization testing. Although the LLNA:DA method could help identify skin sensitizers, some skin irritants classified as non-sensitizers by the LLNA were classified as borderline positive, and evaluation of the cross-sensitization potential between chemicals was not possible. In the LLNA:DAE procedure, the test group of mice received four applications of chemicals on the dorsum of the right ear for induction and one application on the dorsum of the left ear for elicitation; the control group of mice received one chemical application on the dorsum of the left ear. We evaluated the sensitizing potential by comparing the weights of the lymph nodes from the left ears between the two groups. The results of using the LLNA:DAE method to examine 24 chemicals, which contained borderline-positive chemicals, were consistent with those from the LLNA method, except for nickel chloride (NiCl2). Two chemical pairs, 2,4-dinitrochlorobenzene (DNCB) with 2,4-dinitrofluorobenzene (DNFB) and hydroquinone (HQ) with p-benzoquinone (p-BQ), showed clear cross-sensitization with each other, while another chemical pair, DNFB with hexylcinnamic aldehyde (HCA), did not. Taken together, our results suggest that the LLNA:DAE method is useful for discriminating borderline-positive chemicals and for determining chemical cross-sensitization.

  6. Measurement of Health Care Quality in Atopic Dermatitis - Development and Application of a Set of Quality Indicators.

    PubMed

    Steinke, S; Beikert, F C; Langenbruch, A; Fölster-Holst, R; Ring, J; Schmitt, J; Werfel, T; Hintzen, S; Franzke, N; Augustin, M

    2018-05-15

    Quality indicators are essential tools for the assessment of health care, in particular for guideline-based procedures. The aims were 1) the development of a set of indicators for the evaluation of process and outcomes quality in atopic dermatitis (AD) care, and 2) the application of the indicators to a cross-sectional study and the creation of a global process quality index. An expert committee consisting of 10 members of the German guideline group on atopic dermatitis condensed potential quality indicators to a final set of 5 outcomes quality and 12 process quality indicators using a Delphi panel. The outcomes quality indicators and 7 or 8 of the process quality indicators were retrospectively applied to a nationwide study of 1,678 patients with atopic dermatitis (AtopicHealth). The individual process quality indicator scores were then summed into a global index (ranging from 0 (no quality achieved) to 100 (full quality achieved)) displaying the quality of health care. In total, the global process quality index revealed a median value of 62.5 and did not, or only slightly, correlate with outcome indicators such as the median SCORAD (SCORing Atopic Dermatitis; rp = 0.08), Dermatology Life Quality Index (DLQI; rp = 0.256), and Patient Benefit Index (PBI; rp = -0.151). Process quality of AD care is moderate to good. The health care process quality index does not substantially correlate with the health status of AD patients as measured by 5 different outcomes quality indicators. Further research should include the investigation of the reliability, responsiveness, and feasibility of the proposed quality indicators for AD. This article is protected by copyright. All rights reserved.
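
    A global index built by averaging indicator fulfilment and scaling to 0-100, as described above, can be sketched as follows; the 0/1 indicator flags are illustrative, not the study's scoring rules:

    ```python
    def global_quality_index(indicator_scores):
        """Average per-patient indicator fulfilment, scaled to 0-100.

        indicator_scores: list of 0/1 flags (process indicator met or not)."""
        if not indicator_scores:
            raise ValueError("no indicators assessed")
        return 100.0 * sum(indicator_scores) / len(indicator_scores)

    # A hypothetical patient meeting 5 of 8 assessed process indicators
    index = global_quality_index([1, 1, 0, 1, 0, 1, 0, 1])
    ```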

  7. Influencing physician prescribing.

    PubMed

    Segal, R; Wang, F

    1999-10-01

    The drug use process suffers from problems related to quality and cost that have not responded well to administrative or educational interventions. In many cases, attempts to improve the quality of physician prescribing have been clumsy, often based on intuition. This article begins by describing the drug use process and the role of prescribing in that process. In the following section, we describe what is known about how physicians make drug choice decisions. The paper concludes with suggestions, based on evidence, about the design of strategies for influencing prescribing.

  8. The relationship between physical workload and quality within line-based assembly.

    PubMed

    Ivarsson, Anna; Eek, Frida

    2016-07-01

    Reducing costs and improvement of product quality are considered important to ensure productivity within a company. Quality deviations during production processes and ergonomics have previously shown to be associated. This study explored the relationship between physical workload and real (found during production processes) and potential (need of extra time and assistance to complete tasks) quality deviations in a line-based assembly plant. The physical workload on and the work rotation between 52 workstations were assessed. As the outcome, real and potential quality deviations were studied during 10 weeks. Results show that workstations with higher physical workload had significantly more real deviations compared to lower workload stations. Static work posture had significantly more potential deviations. Rotation between high and low workload was related to fewer quality deviations compared to rotation between only high workload stations. In conclusion, physical ergonomics seems to be related to real and potential quality deviation within line-based assembly. Practitioner Summary: To ensure good productivity in manufacturing industries, it is important to reduce costs and improve product quality. This study shows that high physical workload is associated with quality deviations and need of extra time and assistance to complete tasks within line-based assembly, which can be financially expensive for a company.

  9. Outcome-Based School-to-Work Transition Planning for Students with Severe Disabilities.

    ERIC Educational Resources Information Center

    Steere, Daniel E.; And Others

    1990-01-01

    A transition planning process that focuses on quality-of-life outcomes is presented. The process, which views employment not as an outcome but as a vehicle for the attainment of quality of life, involves six steps: orientation, personal profile development, identification of employment outcomes, measurement system, compatibility process, and…

  10. Strengthening the regulatory system through the implementation and use of a quality management system.

    PubMed

    Eisner, Reinhold; Patel, Rakeshkumar

    2017-04-20

    Quality management systems (QMS), based on ISO 9001 requirements, are applicable to government service organizations such as Health Canada's Biologics and Genetic Therapies Directorate (BGTD). This communication presents the process that the BGTD followed since the early 2000s to implement a quality management system and describes how the regulatory system was improved as a result of this project. BGTD undertook the implementation of a quality management system based on ISO 9001 and containing aspects of ISO 17025 with the goal of strengthening the regulatory system through improvements in the people, processes, and services of the organization. We discuss the strategy used by BGTD to implement the QMS and the benefits that were realized from the various stages of implementation. The eight quality principles upon which the QMS standards of the ISO 9000 series are based were used by senior management as a framework to guide QMS implementation.

  11. QCloud: A cloud-based quality control system for mass spectrometry-based proteomics laboratories

    PubMed Central

    Chiva, Cristina; Olivella, Roger; Borràs, Eva; Espadas, Guadalupe; Pastor, Olga; Solé, Amanda

    2018-01-01

    The increasing number of biomedical and translational applications in mass spectrometry-based proteomics poses new analytical challenges and raises the need for automated quality control systems. Despite previous efforts to set standard file formats, data processing workflows and key evaluation parameters for quality control, automated quality control systems are not yet widespread among proteomics laboratories, which limits the acquisition of high-quality results, inter-laboratory comparisons and the assessment of variability of instrumental platforms. Here we present QCloud, a cloud-based system to support proteomics laboratories in daily quality assessment using a user-friendly interface, easy setup, automated data processing and archiving, and unbiased instrument evaluation. QCloud supports the most common targeted and untargeted proteomics workflows, it accepts data formats from different vendors and it enables the annotation of acquired data and reporting incidences. A complete version of the QCloud system has successfully been developed and it is now open to the proteomics community (http://qcloud.crg.eu). QCloud system is an open source project, publicly available under a Creative Commons License Attribution-ShareAlike 4.0. PMID:29324744

  12. The remote supervisory and controlling experiment system of traditional Chinese medicine production based on Fieldbus

    NASA Astrophysics Data System (ADS)

    Zhan, Jinliang; Lu, Pei

    2006-11-01

    Since the quality of traditional Chinese medicine products is affected by raw materials, processing and many other factors, it is difficult for the traditional Chinese medicine production process, especially the extracting process, to ensure steady and homogeneous quality. At the same time, quality control blind spots exist due to the lack of on-line quality detection means. If infrared spectrum analysis technology is used in the production process, on the basis of off-line analysis, to detect the quality of semi-finished goods in real time, assisted by advanced automatic control techniques, steady and homogeneous quality can be obtained. The on-line detection of the extracting process thus plays an important role in the development of the Chinese patent medicine industry. In this paper, the design and implementation of a traditional Chinese medicine extracting process monitoring experiment system based on the PROFIBUS-DP field bus, OPC, and Internet technology is introduced. The system integrates intelligent nodes for data gathering with a supervisory sub-system for graphical configuration and remote supervision; during traditional Chinese medicine production it monitors temperature, pressure, quality and other parameters, and it can be controlled by remote nodes in a VPN (Virtual Private Network). Experiments and applications have proved that the system fully reaches the anticipated effect, with the merits of operational stability, real-time response, reliability, and convenient and simple manipulation.
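
    The on-line monitoring of temperature and pressure parameters described above amounts to checking readings against allowed bands; a minimal sketch, where the parameter names and limits are assumptions and a real system would obtain the values over PROFIBUS-DP/OPC rather than from a dict:

    ```python
    # Hypothetical allowed bands for extraction process parameters
    LIMITS = {"temperature_C": (60.0, 95.0), "pressure_kPa": (100.0, 180.0)}

    def check_reading(reading, limits=LIMITS):
        """Return the names of parameters outside their allowed band."""
        alarms = []
        for name, value in reading.items():
            lo, hi = limits[name]
            if not lo <= value <= hi:
                alarms.append(name)
        return alarms

    # One sampled reading: temperature exceeds its upper limit
    alarms = check_reading({"temperature_C": 98.2, "pressure_kPa": 150.0})
    ```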

  13. Research on manufacturing service behavior modeling based on block chain theory

    NASA Astrophysics Data System (ADS)

    Zhao, Gang; Zhang, Guangli; Liu, Ming; Yu, Shuqin; Liu, Yali; Zhang, Xu

    2018-04-01

    According to the attribute characteristics of the processing craft, manufacturing service behavior is divided into service, basic, process and resource attributes, and an attribute information model of manufacturing services is established. The manufacturing service behavior information is divided into public and private domains. Block chain technology is then introduced, and an information model of manufacturing services based on block chain principles is established, which solves the problem of sharing and secreting processing behavior information and ensures that data are not tampered with. Based on key-pair verification relationships, a selective publishing mechanism for manufacturing information is established, achieving the traceability of product data and guaranteeing processing quality.
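
    The tamper-evidence property attributed to the block chain model can be illustrated with a minimal hash chain; the record fields are hypothetical, and this sketch omits the paper's key-pair verification and selective publishing mechanism:

    ```python
    import hashlib
    import json

    def add_block(chain, record):
        """Append a manufacturing-service record; each block's hash covers
        the previous hash, so a later edit invalidates all following blocks."""
        prev_hash = chain[-1]["hash"] if chain else "0" * 64
        payload = json.dumps(record, sort_keys=True)
        block_hash = hashlib.sha256((prev_hash + payload).encode()).hexdigest()
        chain.append({"record": record, "prev": prev_hash, "hash": block_hash})
        return chain

    def verify(chain):
        """Recompute every hash to detect tampering anywhere in the chain."""
        prev = "0" * 64
        for block in chain:
            payload = json.dumps(block["record"], sort_keys=True)
            expected = hashlib.sha256((prev + payload).encode()).hexdigest()
            if block["prev"] != prev or block["hash"] != expected:
                return False
            prev = block["hash"]
        return True

    # Hypothetical processing records chained in production order
    chain = []
    add_block(chain, {"step": "milling", "machine": "M01", "qc": "pass"})
    add_block(chain, {"step": "coating", "machine": "M07", "qc": "pass"})
    ```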

  14. [Potential for the survey of quality indicators based on a national emergency department registry : A systematic literature search].

    PubMed

    Hörster, A C; Kulla, M; Brammen, D; Lefering, R

    2018-06-01

    Emergency department processes are often key to successful treatment; therefore, the collection of quality indicators is demanded. A basis for this collection is systematic electronic documentation. The development of paper-based documentation into an electronic and interoperable national emergency registry is, besides the establishment of quality management for emergency departments, a target of the AKTIN project. The objective of this research is the identification of internationally applied quality indicators. To investigate the current status of quality management in emergency departments based on quality indicators, a systematic literature search of the PubMed database, the Cochrane Library and the internet was performed. Of the 170 internationally applied quality indicators, 25 with at least two references were identified. A total of 10 quality indicators are ascertainable from the data set. An enlargement of the data set would enable the collection of seven further quality indicators, and the inclusion of data on care beyond the emergency processes would provide eight additional quality indicators. This work showed that the potential of a national emergency registry for the establishment of quality indicators corresponds with the international systems taken into consideration and could provide a comparable collection of quality indicators.

  15. A vision-based weld quality evaluation system

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Barnett, R.J.; Cook, G.E.; Strauss, A.M.

    1996-12-31

    Inspection of the appearance of weld beads is an integral part of the overall welding process. Lack of satisfactory appearance in itself may be sufficient grounds for part rejection or the lack of satisfactory appearance may be used as an indirect indicator of more substantive problems such as poor fusion or subsurface cracks. In all cases the inspection process tends to be both time and labor intensive. The present research uses a video system and appropriate image capture and processing to determine the quality of the weld based upon surface appearance. This relative quality rating was compared to similar ratings performed by human inspectors and was found to give very good correlation. The system was implemented for the Gas Tungsten Arc Welding (GTAW) and Gas Metal Arc Welding (GMAW) processes.

  16. Speaking the right language: the scientific method as a framework for a continuous quality improvement program within academic medical research compliance units.

    PubMed

    Nolte, Kurt B; Stewart, Douglas M; O'Hair, Kevin C; Gannon, William L; Briggs, Michael S; Barron, A Marie; Pointer, Judy; Larson, Richard S

    2008-10-01

    The authors developed a novel continuous quality improvement (CQI) process for academic biomedical research compliance administration. A challenge in developing a quality improvement program in a nonbusiness environment is that the terminology and processes are often foreign. Rather than training staff in an existing quality improvement process, the authors opted to develop a novel process based on the scientific method--a paradigm familiar to all team members. The CQI process included our research compliance units. Unit leaders identified problems in compliance administration where a resolution would have a positive impact and which could be resolved or improved with current resources. They then generated testable hypotheses about a change to standard practice expected to improve the problem, and they developed methods and metrics to assess the impact of the change. The CQI process was managed in a "peer review" environment. The program included processes to reduce the incidence of infections in animal colonies, decrease research protocol-approval times, improve compliance and protection of animal and human research subjects, and improve research protocol quality. This novel CQI approach is well suited to the needs and the unique processes of research compliance administration. Using the scientific method as the improvement paradigm fostered acceptance of the project by unit leaders and facilitated the development of specific improvement projects. These quality initiatives will allow us to improve support for investigators while ensuring that compliance standards continue to be met. We believe that our CQI process can readily be used in other academically based offices of research.

  17. [A strategy of constructing the technological system for quality control of Chinese medicine based on process control and management].

    PubMed

    Cheng, Yi-Yu; Qian, Zhong-Zhi; Zhang, Bo-Li

    2017-01-01

    The current situation, bottleneck problems and severe challenges in the quality control technology of Chinese medicine (CM) are briefly described. It is proposed to move drug regulation away from its reliance on post-production testing as the main means and its neglect of process control, to reverse the underdevelopment of process control and management technology in pharmaceutical manufacture, and to reconstruct the technological system for the quality control of CM products. A regulation and technology system based on process control and management should be established to address, at the root, the pressing practical problems of the CM industry, including backward quality control technology, weak quality risk control measures and the poor reputation of product quality. In this way, the obstacles arising from the poor controllability of CM product quality could be removed. Focusing on the difficult problems and weak links in the technical field of CM quality control, it is proposed to build a CMC (Chemistry, Manufacturing and Controls) regulation for CM products with Chinese characteristics and to promote its international recognition as soon as possible. A CMC technical framework was designed that is clinical efficacy-oriented, manufacturing manner-centered and process control-focused. To address the clinical characteristics of traditional Chinese medicine (TCM) and the production features of CM manufacture, it is suggested to establish quality control engineering for CM manufacturing by integrating pharmaceutical analysis, TCM chemistry, TCM pharmacology, pharmaceutical engineering, control engineering, management engineering and other disciplines. Further, a theoretical model of quality control engineering for CM manufacturing and a methodology of digital pharmaceutical engineering are proposed. A technology pathway for promoting CM standards and realizing the strategic goal of CM internationalization is elaborated. Copyright© by the Chinese Pharmaceutical Association.

  18. Postacute rehabilitation quality of care: toward a shared conceptual framework.

    PubMed

    Jesus, Tiago Silva; Hoenig, Helen

    2015-05-01

    There is substantial interest in mechanisms for measuring, reporting, and improving the quality of health care, including postacute care (PAC) and rehabilitation. Unfortunately, current activities generally are either too narrow or too poorly specified to reflect PAC rehabilitation quality of care. In part, this is caused by a lack of a shared conceptual understanding of what construes quality of care in PAC rehabilitation. This article presents the PAC-rehab quality framework: an evidence-based conceptual framework articulating elements specifically pertaining to PAC rehabilitation quality of care. The widely recognized Donabedian structure, process, and outcomes (SPO) model furnished the underlying structure for the PAC-rehab quality framework, and the International Classification of Functioning, Disability and Health (ICF) framed the functional outcomes. A comprehensive literature review provided the evidence base to specify elements within the SPO model and ICF-derived framework. A set of macrolevel-outcomes (functional performance, quality of life of patient and caregivers, consumers' experience, place of discharge, health care utilization) were defined for PAC rehabilitation and then related to their (1) immediate and intermediate outcomes, (2) underpinning care processes, (3) supportive team functioning and improvement processes, and (4) underlying care structures. The role of environmental factors and centrality of patients in the framework are explicated as well. Finally, we discuss why outcomes may best measure and reflect the quality of PAC rehabilitation. The PAC-rehab quality framework provides a conceptually sound, evidence-based framework appropriate for quality of care activities across the PAC rehabilitation continuum. Copyright © 2015 American Congress of Rehabilitation Medicine. Published by Elsevier Inc. All rights reserved.

  19. Laser-based rework in electronics production

    NASA Astrophysics Data System (ADS)

    Albert, Florian; Mys, Ihor; Schmidt, Michael

    2007-02-01

    Although electronics manufacturing has long been an established mass-production process, the problem of rework, i.e. the rejection and replacement of defective components, still exists. The rework operations (soldering, replacement and desoldering) are in most cases performed manually. This practice, however, is characterized by an inconsistent quality of the reworked solder joints and a high degree of physiological stress for the employees. In this paper, we propose a novel fully automated laser-based soldering and rework process. Our soldering system is a pick-and-place unit with an integrated galvanometer scanner, a fiber-coupled diode laser for quasi-simultaneous soldering and a pyrometer-based process control. The system provides soldering and rework processes that take into account the kind of defect, the type of electronic component and the quality requirements of the IPC-610 standard. The paper devotes considerable attention to analyzing the quality of laser-reworked solder joints. This quality depends mainly on the type and thickness of the intermetallic phases between solder, pads and leads; the wetting angles between pad, solder and lead; and the joint microstructure with its mechanical properties. The influence of rework soldering on these three factors is discussed and compared to conventional laser soldering results. To optimize the quality of reworked joints, different strategies of energy input are applied.

  20. Long-term forest paired catchment studies: What do they tell us that landscape-level monitoring does not?

    Treesearch

    Dan Neary

    2016-01-01

    Forested catchments throughout the world are known for producing high quality water for human use. In the 20th Century, experimental forest catchment studies played a key role in studying the processes contributing to high water quality. The hydrologic processes investigated on these paired catchments have provided the science base for examining water quality...

  1. [Study on "multi-dimensional structure and process dynamics quality control system" of Danshen infusion solution based on component structure theory].

    PubMed

    Feng, Liang; Zhang, Ming-Hua; Gu, Jun-Fei; Wang, Gui-You; Zhao, Zi-Yu; Jia, Xiao-Bin

    2013-11-01

    As traditional Chinese medicine (TCM) preparation products feature complex compounds and multiple preparation processes, the implementation of quality control in line with the characteristics of TCM preparation products provides a firm guarantee for their clinical efficacy and safety. Danshen infusion solution is a preparation commonly used in the clinic, but its quality control is restricted to indexes of finished products, which cannot guarantee its inherent quality. Our study group has proposed a "multi-dimensional structure and process dynamics quality control system" on the basis of "component structure theory", for the purpose of controlling the quality of Danshen infusion solution at multiple levels and in multiple links, from the efficacy-related material basis and the safety-related material basis to the characteristics of the dosage form and the preparation process. In this article, we bring forth new ideas and models for the quality control of TCM preparation products.

  2. The choices, choosing model of quality of life: linkages to a science base.

    PubMed

    Gurland, Barry J; Gurland, Roni V

    2009-01-01

    A previous paper began with a critical review of current models and measures of quality of life and then proposed criteria for judging the relative merits of alternative models: preference was given to finding a model with explicit mechanisms, linkages to a science base, a means of identifying deficits amenable to rational restorative interventions, and with embedded values of the whole person. A conjectured model, based on the processes of accessing choices and choosing among them, matched the proposed criteria. The choices and choosing (c-c) process is an evolved adaptive mechanism dedicated to the pursuit of quality of life, driven by specific biological and psychological systems, and influenced also by social and environmental forces. In this paper the c-c model is examined for its potential to strengthen the science base for the field of quality of life and thus to unify many approaches to concept and measurement. A third paper in this set will lay out a guide to applying the c-c model in evaluating impairments of quality of life and will tie this evaluation to corresponding interventions aimed at relieving restrictions or distortions of the c-c process; thus helping people to preserve and improve their quality of life. The fourth paper will demonstrate empirical analyses of the relationship between health imposed restrictions of options for living and conventional indicators of diminished quality of life. (c) 2008 John Wiley & Sons, Ltd.

  3. When high achievers and low achievers work in the same group: the roles of group heterogeneity and processes in project-based learning.

    PubMed

    Cheng, Rebecca Wing-yi; Lam, Shui-fong; Chan, Joanne Chung-yan

    2008-06-01

    There has been an ongoing debate about the inconsistent effects of heterogeneous ability grouping on students in small group work such as project-based learning. The present research investigated the roles of group heterogeneity and processes in project-based learning. At the student level, we examined the interaction effect between students' within-group achievement and group processes on their self- and collective efficacy. At the group level, we examined how group heterogeneity was associated with the average self- and collective efficacy reported by the groups. The participants were 1,921 Hong Kong secondary students in 367 project-based learning groups. Student achievement was determined by school examination marks. Group processes, self-efficacy and collective efficacy were measured by a student-report questionnaire. Hierarchical linear modelling was used to analyse the nested data. When individual students in each group were taken as the unit of analysis, results indicated an interaction effect of group processes and students' within-group achievement on the discrepancy between collective- and self-efficacy. When compared with low achievers, high achievers reported lower collective efficacy than self-efficacy when group processes were of low quality. However, both low and high achievers reported higher collective efficacy than self-efficacy when group processes were of high quality. With 367 groups taken as the unit of analysis, the results showed that group heterogeneity, group gender composition and group size were not related to the discrepancy between collective- and self-efficacy reported by the students. Group heterogeneity was not a determinant factor in students' learning efficacy. Instead, the quality of group processes played a pivotal role because both high and low achievers were able to benefit when group processes were of high quality.

  4. Lower- Versus Higher-Income Populations In The Alternative Quality Contract: Improved Quality And Similar Spending

    PubMed Central

    Song, Zirui; Rose, Sherri; Chernew, Michael E.; Safran, Dana Gelb

    2018-01-01

    As population-based payment models become increasingly common, it is crucial to understand how such payment models affect health disparities. We evaluated health care quality and spending among enrollees in areas with lower versus higher socioeconomic status in Massachusetts before and after providers entered into the Alternative Quality Contract, a two-sided population-based payment model with substantial incentives tied to quality. We compared changes in process measures, outcome measures, and spending between enrollees in areas with lower and higher socioeconomic status from 2006 to 2012 (outcome measures were measured after the intervention only). Quality improved for all enrollees in the Alternative Quality Contract after their provider organizations entered the contract. Process measures improved 1.2 percentage points per year more among enrollees in areas with lower socioeconomic status than among those in areas with higher socioeconomic status. Outcome measure improvement was no different between the subgroups; neither were changes in spending. Larger or comparable improvements in quality among enrollees in areas with lower socioeconomic status suggest a potential narrowing of disparities. Strong pay-for-performance incentives within a population-based payment model could encourage providers to focus on improving quality for more disadvantaged populations. PMID:28069849

  5. A real time quality control application for animal production by image processing.

    PubMed

    Sungur, Cemil; Özkan, Halil

    2015-11-01

    Standards of hygiene and health are of major importance in food production, and quality control has become obligatory in this field. Thanks to rapidly developing technologies, automatic and safe quality control of food production is now possible. For this purpose, image-processing-based quality control systems used in industrial applications are being employed to analyze the quality of food products. In this study, quality control of chicken (Gallus domesticus) eggs was achieved using a real-time image-processing technique. To execute the quality control processes, a conveying mechanism was used. Eggs passing on a conveyor belt were continuously photographed in real time by cameras located above the belt. The images obtained were processed by various methods and techniques. Using digital instrumentation, the volume of the eggs was measured, broken/cracked eggs were separated, and dirty eggs were identified. In accordance with international standards for classifying egg quality, the class of separated eggs was determined through a fuzzy implication model. According to tests carried out on thousands of eggs, a quality control process with an accuracy of 98% was achieved. © 2014 Society of Chemical Industry.
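    The fuzzy implication step can be sketched roughly as follows. The membership functions, thresholds, feature names and class cut-offs here are illustrative assumptions, not the authors' implementation:

```python
def tri(x, a, b, c):
    """Triangular membership function: 0 outside [a, c], peaking at b."""
    if x <= a or x >= c:
        return 0.0
    if x <= b:
        return (x - a) / (b - a)
    return (c - x) / (c - b)

def grade_egg(volume_ml, crack_score, dirt_score):
    """Assign a quality class from fuzzy memberships.

    crack_score / dirt_score in [0, 1] are hypothetical image-derived
    defect indicators (0 = intact / clean).
    """
    # Fuzzy set for 'acceptable volume' (hypothetical medium-egg range).
    vol_ok = tri(volume_ml, 44.0, 55.0, 66.0)
    intact = 1.0 - crack_score
    clean = 1.0 - dirt_score
    # Mamdani-style min aggregation: an egg is only as good as its worst trait.
    quality = min(vol_ok, intact, clean)
    if crack_score > 0.5:
        return "rejected (cracked)"
    if quality >= 0.7:
        return "class A"
    if quality >= 0.3:
        return "class B"
    return "class C"
```

A real system would tune the membership functions against graded reference eggs rather than use fixed constants.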

  6. Development of risk-based air quality management strategies under impacts of climate change.

    PubMed

    Liao, Kuo-Jen; Amar, Praveen; Tagaris, Efthimios; Russell, Armistead G

    2012-05-01

    Climate change is forecast to adversely affect air quality through perturbations in meteorological conditions, photochemical reactions, and precursor emissions. To protect the environment and human health from air pollution, there is an increasing recognition of the necessity of developing effective air quality management strategies under the impacts of climate change. This paper presents a framework for developing risk-based air quality management strategies that can help policy makers improve their decision-making processes in response to current and future climate change over the next 30-50 years. Development of air quality management strategies under the impacts of climate change is fundamentally a risk assessment and risk management process involving four steps: (1) assessment of the impacts of climate change and associated uncertainties; (2) determination of air quality targets; (3) selection of potential air quality management options; and (4) identification of preferred air quality management strategies that minimize control costs, maximize benefits, or limit the adverse effects of climate change on air quality when considering the scarcity of resources. The main challenge relates to the level of uncertainty associated with climate change forecasts and advancements in future control measures, since these will significantly affect the risk assessment results and the development of effective air quality management plans. The concept presented in this paper can help decision makers respond appropriately to climate change, since it provides an integrated approach for climate risk assessment and management when developing air quality management strategies. The risk assessment process includes quantification of climate change impacts on air quality and the associated uncertainties; risk management includes determination of air quality targets, selection of potential management options, and identification of effective air quality management strategies through decision-making models. The risk-based decision-making framework can also be applied to develop climate-responsive management strategies for other environmental dimensions and to assess the costs and benefits of future environmental management policies.

  7. SAGES quality initiative: an introduction.

    PubMed

    Lidor, Anne; Telem, Dana; Bower, Curtis; Sinha, Prashant; Orlando, Rocco; Romanelli, John

    2017-08-01

    The Medicare program has transitioned to paying healthcare providers based on the quality of care delivered, not on the quantity. In May 2015, SAGES held its first-ever Quality Summit. The goal of this meeting was to gather the information necessary to put together a strategic plan for our Society over the next 3-5 years, and to participate actively on a national level in developing valid measures of surgical quality. The transition to value-based medicine requires that providers are now measured and reimbursed based on the quality of services they provide rather than the quantity of patients in their care. As of 2014, quality measures must cover 3 of the 6 available National Quality domains. Physician quality reporting system measures are created via a rigorous process initiated by the proposal of the quality measure and its subsequent validation. Commercial, non-profit, and governmental agencies are now engaged in the measurement of hospital performance through structural measures, process measures, and increasingly outcomes measures. This more recent focus on outcomes measures has been linked to hospital payments through the Value-Based Purchasing program. Outcomes measures of quality drive CMS's new program, MACRA, using two formats: merit-based incentive programs and alternative payment models. However, the quality of information now available is highly variable and difficult for the average consumer to use. Quality metrics serve to guide efforts to improve performance and for consumer education. Professional organizations such as SAGES play a central role in defining the agenda for improving quality, outcomes, and safety. The mission of SAGES is to improve the quality of patient care through education, research, innovation, and leadership, principally in gastrointestinal and endoscopic surgery.

  8. Shifting the focus to practice quality improvement in radiation oncology.

    PubMed

    Crozier, Cheryl; Erickson-Wittmann, Beth; Movsas, Benjamin; Owen, Jean; Khalid, Najma; Wilson, J Frank

    2011-09-01

    To demonstrate how the American College of Radiology, Quality Research in Radiation Oncology (QRRO) process survey database can serve as an evidence base for assessing quality of care in radiation oncology. QRRO has drawn a stratified random sample of radiation oncology facilities in the USA and invited those facilities to participate in a Process Survey. Information from a prior QRRO Facilities Survey has been used along with data collected under the current National Process Survey to calculate national averages and make statistically valid inferences for national process measures for selected cancers in which radiation therapy plays a major role. These measures affect outcomes important to patients and providers and measure quality of care. QRRO's survey data provides national benchmark data for numerous quality indicators. The Process Survey is "fully qualified" as a Practice Quality Improvement project by the American Board of Radiology under its Maintenance of Certification requirements for radiation oncology and radiation physics. © 2011 National Association for Healthcare Quality.

  9. Taguchi experimental design to determine the taste quality characteristic of candied carrot

    NASA Astrophysics Data System (ADS)

    Ekawati, Y.; Hapsari, A. A.

    2018-03-01

    Robust parameter design is used to design products that are robust to noise factors, so that the product's performance fits the target and delivers better quality. In the process of designing and developing the innovative product of candied carrot, robust parameter design is carried out using the Taguchi method. The method is used to determine an optimal quality design based on the process and the composition of product ingredients that are in accordance with consumer needs and requirements. According to the identification of consumer needs in previous research, the quality dimensions that need to be assessed are the taste and texture of the product; the quality dimension assessed in this research is limited to taste. Organoleptic testing is used for this assessment, specifically hedonic testing, which makes assessments based on consumer preferences. Data processing uses mean and signal-to-noise ratio calculations and optimal level selection to determine the optimal process and composition of product ingredients. The optimal values are analyzed through confirmation experiments to prove that the proposed product matches consumer needs and requirements. The results of this research are the identification of the factors that affect the product's taste and the optimal quality of the product according to the Taguchi method.
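    For a hedonic (preference) response, "larger is better" is the usual Taguchi criterion, and the signal-to-noise ratio can be computed as below. The sample scores are made up for illustration; they are not the study's data:

```python
import math

def sn_larger_is_better(scores):
    """Taguchi S/N ratio for a larger-the-better response:
    SN = -10 * log10( (1/n) * sum(1 / y_i^2) ).
    Higher S/N means a higher, more consistent response."""
    n = len(scores)
    return -10.0 * math.log10(sum(1.0 / y**2 for y in scores) / n)

# Compare two hypothetical ingredient settings from replicated taste panels;
# the setting with the higher S/N ratio is preferred.
setting_a = [6.8, 7.1, 6.9]   # consistent hedonic scores (assumed data)
setting_b = [5.0, 8.9, 6.1]   # similar average but noisier (assumed data)
```

The penalty the S/N ratio applies to variability is what distinguishes Taguchi analysis from simply comparing means.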

  10. Quality Saving Mechanisms of Mitochondria during Aging in a Fully Time-Dependent Computational Biophysical Model

    PubMed Central

    Mellem, Daniel; Fischer, Frank; Jaspers, Sören; Wenck, Horst; Rübhausen, Michael

    2016-01-01

    Mitochondria are essential for the energy production of eukaryotic cells. During aging mitochondria run through various processes which change their quality in terms of activity, health and metabolic supply. In recent years, many of these processes such as fission and fusion of mitochondria, mitophagy, mitochondrial biogenesis and energy consumption have been subject of research. Based on numerous experimental insights, it was possible to qualify mitochondrial behaviour in computational simulations. Here, we present a new biophysical model based on the approach of Figge et al. in 2012. We introduce exponential decay and growth laws for each mitochondrial process to derive its time-dependent probability during the aging of cells. All mitochondrial processes of the original model are mathematically and biophysically redefined and additional processes are implemented: Mitochondrial fission and fusion is separated into a metabolic outer-membrane part and a protein-related inner-membrane part, a quality-dependent threshold for mitophagy and mitochondrial biogenesis is introduced and processes for activity-dependent internal oxidative stress as well as mitochondrial repair mechanisms are newly included. Our findings reveal a decrease of mitochondrial quality and a fragmentation of the mitochondrial network during aging. Additionally, the model discloses a quality increasing mechanism due to the interplay of the mitophagy and biogenesis cycle and the fission and fusion cycle of mitochondria. It is revealed that decreased mitochondrial repair can be a quality saving process in aged cells. Furthermore, the model finds strategies to sustain the quality of the mitochondrial network in cells with high production rates of reactive oxygen species due to large energy demands. Hence, the model adds new insights to biophysical mechanisms of mitochondrial aging and provides novel understandings of the interdependency of mitochondrial processes. PMID:26771181
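    The exponential decay and growth laws for time-dependent process probabilities might take a form like the following sketch. The functional shape mirrors the description above, but the parameterisation and values are assumptions, not the paper's fitted model:

```python
import math

def p_decay(t, p0, tau):
    """Probability of a mitochondrial process (e.g. fusion) following an
    exponential decay law with characteristic time tau during aging."""
    return p0 * math.exp(-t / tau)

def p_growth(t, p_max, tau):
    """Probability of a process (e.g. mitophagy triggering) that rises
    toward a ceiling p_max with an exponential growth law."""
    return p_max * (1.0 - math.exp(-t / tau))

# Hypothetical illustration: over cell age t (arbitrary units), fusion
# becomes less likely while quality-threshold mitophagy becomes more likely.
ages = [0.0, 5.0, 20.0]
fusion_probs = [p_decay(t, 0.30, 10.0) for t in ages]
mitophagy_probs = [p_growth(t, 0.50, 10.0) for t in ages]
```

In a full simulation these probabilities would drive stochastic events per time step for each mitochondrion in the network.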

  11. Effect of drying process assisted by high-pressure impregnation on protein quality and digestibility in red abalone (Haliotis rufescens).

    PubMed

    Cepero-Betancourt, Yamira; Oliva-Moresco, Patricio; Pasten-Contreras, Alexis; Tabilo-Munizaga, Gipsy; Pérez-Won, Mario; Moreno-Osorio, Luis; Lemus-Mondaca, Roberto

    2017-10-01

    Abalone (Haliotis spp.) is an exotic seafood product recognized as a protein source of high biological value. Traditional preservation methods such as drying can affect nutritional quality (protein quality and digestibility). A 28-day rat feeding study was conducted to evaluate the effects of a drying process assisted by high-pressure impregnation (HPI) (350, 450, and 500 MPa × 5 min) on chemical proximate and amino acid compositions and on nutritional parameters, such as protein efficiency ratio (PER), true digestibility (TD), net protein ratio, and protein digestibility corrected amino acid score (PDCAAS), of dried abalone. The HPI-assisted drying process ensured excellent protein quality based on PER values, regardless of the pressure level. At 350 and 500 MPa, the HPI-assisted drying process had no negative effect on TD and PDCAAS; therefore, based on the nutritional parameters analysed, we recommend HPI-assisted drying at 350 MPa × 5 min as the best condition for drying abalone. Variations in nutritional parameters compared to casein protein were observed; nevertheless, the high protein quality and digestibility of HPI-assisted dried abalones were maintained, satisfying the metabolic demands of human beings.

  12. A new hyperspectral imaging based device for quality control in plastic recycling

    NASA Astrophysics Data System (ADS)

    Bonifazi, G.; D'Agostini, M.; Dall'Ava, A.; Serranti, S.; Turioni, F.

    2013-05-01

    The quality control of the contamination level in recycled plastics streams has been identified as a key factor for increasing the value of the recycled material by both plastic recycling and compounding industries. Existing quality control methods for the detection of plastic and non-plastic contaminants in plastic waste streams at different stages of the industrial process (e.g. feed, intermediate and final products) are currently based on the manual collection of a sample from the stream and on subsequent off-line laboratory analyses. The results of such analyses are usually available hours, or sometimes even days, after the material has been processed. The laboratory analyses are time-consuming and expensive, both in terms of equipment cost and maintenance and in terms of labour cost. Therefore, a fast on-line assessment to monitor the plastic waste feed streams and to characterize the composition of the different plastic products is fundamental to increasing the value of secondary plastics. This paper describes and evaluates the development of an HSI-based device and of the related software architectures and processing algorithms for quality assessment of plastics in recycling plants, with particular reference to polyolefins (PO). NIR-HSI sensing devices coupled with multivariate data analysis methods were demonstrated to be an objective, rapid and non-destructive technique that can be used for on-line quality and process control in the recycling of POs. In particular, the adoption of the previously mentioned integrated hardware and software architectures can provide a solution to one of the major problems of the recycling industry, which is the lack of an accurate quality certification of materials obtained by recycling processes. These results could therefore assist in developing strategies to certify the composition of recycled PO products.

  13. [Optimize dropping process of Ginkgo biloba dropping pills by using design space approach].

    PubMed

    Shen, Ji-Chen; Wang, Qing-Qing; Chen, An; Pan, Fang-Lai; Gong, Xing-Chu; Qu, Hai-Bin

    2017-07-01

    In this paper, a design space approach was applied to optimize the dropping process of Ginkgo biloba dropping pills. Firstly, potential critical process parameters and potential process critical quality attributes were determined through literature research and pre-experiments. Secondly, experiments were carried out according to Box-Behnken design. Then the critical process parameters and critical quality attributes were determined based on the experimental results. Thirdly, second-order polynomial models were used to describe the quantitative relationships between critical process parameters and critical quality attributes. Finally, a probability-based design space was calculated and verified. The verification results showed that efficient production of Ginkgo biloba dropping pills can be guaranteed by operating within the design space parameters. The recommended operation ranges for the critical dropping process parameters of Ginkgo biloba dropping pills were as follows: dropping distance of 5.5-6.7 cm, and dropping speed of 59-60 drops per minute, providing a reference for industrial production of Ginkgo biloba dropping pills. Copyright© by the Chinese Pharmaceutical Association.
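    A probability-based design space check of the kind described can be sketched as a Monte Carlo computation over a fitted second-order model. The model coefficients, specification limit, and residual error below are hypothetical illustrations, not the paper's values:

```python
import random

# Hypothetical fitted second-order (quadratic) model relating one critical
# quality attribute (CQA, lower is better) to two critical process
# parameters: x1 = dropping distance (cm), x2 = dropping speed (drops/min).
# Coefficients are illustrative only.
def predicted_cqa(x1, x2):
    return (4.0 - 0.6 * x1 - 0.02 * x2
            + 0.05 * x1**2 + 0.0002 * x2**2 + 0.001 * x1 * x2)

def prob_meeting_spec(x1, x2, spec_max, sigma=0.15, n=2000, seed=0):
    """Monte Carlo estimate of P(CQA <= spec_max), treating the model
    prediction as the mean and sigma as the (assumed) prediction error."""
    rng = random.Random(seed)  # fixed seed for reproducibility
    mean = predicted_cqa(x1, x2)
    hits = sum(1 for _ in range(n) if rng.gauss(mean, sigma) <= spec_max)
    return hits / n

# A parameter combination belongs to the probability-based design space when
# the estimated probability exceeds a chosen reliability level, e.g. 0.90.
inside = prob_meeting_spec(6.0, 60.0, spec_max=2.3) > 0.90
```

Scanning this probability over a grid of (x1, x2) values traces out the design space region within which operation is recommended.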

  14. In-situ quality monitoring during laser brazing

    NASA Astrophysics Data System (ADS)

    Ungers, Michael; Fecker, Daniel; Frank, Sascha; Donst, Dmitri; Märgner, Volker; Abels, Peter; Kaierle, Stefan

    Laser brazing of zinc-coated steel is a widely established manufacturing process in the automotive sector, where high quality requirements must be fulfilled. The strength, impermeability and surface appearance of the joint are particularly important for judging its quality. The development of an on-line quality control system is highly desired by the industry. This paper presents recent work on the development of such a system, which consists of two cameras operating in different spectral ranges. For the evaluation of the system, seam imperfections are created artificially during experiments. Finally, image-processing algorithms for monitoring process parameters based on the captured images are presented.

  15. Are hospital process quality indicators influenced by socio-demographic health determinants.

    PubMed

    Buja, Alessandra; Canavese, Daniel; Furlan, Patrizia; Lago, Laura; Saia, Mario; Baldo, Vincenzo

    2015-10-01

    This population-level health service study aimed to address whether hospitals assure the same quality of care to people in equal need, i.e. to see whether any associations exist between social determinants and adherence to four hospital process indicators clearly identified as being linked to better health outcomes for patients. This was a retrospective cohort study based on administrative data collected in the Veneto Region (northeast Italy). We included residents of the Veneto Region hospitalized for ST-segment elevation myocardial infarction (STEMI) or acute myocardial infarction (AMI), hip fracture, or cholecystitis, and women giving birth, who were discharged from any hospital operating under the Veneto Regional Health Service between January 2012 and December 2012. The following quality indicator rates were calculated: patients with STEMI-AMI treated with percutaneous coronary intervention, elderly patients with hip fractures who underwent surgery within 48 h of admission, laparoscopic cholecystectomies, and women who underwent cesarean section. Multilevel, multivariable logistic regression analyses were conducted to test the association between age, gender, formal education or citizenship and the quality of hospital care processes. All the inpatient hospital care process quality indicators measured showed undesirable disparities across the social determinants. Monitoring evidence-based hospital health care process indicators reveals undesirable disparities. Administrative data sets are of considerable practical value in broad-based quality assessments and as a screening tool, also in the health disparities domain. © The Author 2015. Published by Oxford University Press on behalf of the European Public Health Association. All rights reserved.

  16. Weaknesses in Applying a Process Approach in Industry Enterprises

    NASA Astrophysics Data System (ADS)

    Kučerová, Marta; Mĺkva, Miroslava; Fidlerová, Helena

    2012-12-01

    The paper deals with the process approach as one of the main principles of quality management. Quality management systems based on the process approach currently represent one proven way to manage an organization. Sales volume, costs and profit levels are influenced by the quality of processes and by efficient process flow. As the results of the research project showed, there are weaknesses in the application of the process approach in industrial practice: in many organizations in Slovakia, the shift from functional management to process management has often been only formal. For efficient process management it is essential that companies pay attention to how they organize their processes and seek their continuous improvement.

  17. Measuring the impact of computer resource quality on the software development process and product

    NASA Technical Reports Server (NTRS)

    Mcgarry, Frank; Valett, Jon; Hall, Dana

    1985-01-01

    The availability and quality of computer resources during the software development process was speculated to have measurable, significant impact on the efficiency of the development process and the quality of the resulting product. Environment components such as the types of tools, machine responsiveness, and quantity of direct access storage may play a major role in the effort to produce the product and in its subsequent quality as measured by factors such as reliability and ease of maintenance. During the past six years, the NASA Goddard Space Flight Center has conducted experiments with software projects in an attempt to better understand the impact of software development methodologies, environments, and general technologies on the software process and product. Data was extracted and examined from nearly 50 software development projects. All were related to support of satellite flight dynamics ground-based computations. The relationship between computer resources and the software development process and product as exemplified by the subject NASA data was examined. Based upon the results, a number of computer resource-related implications are provided.

  18. Study and Development of Mobile Tracing Terminal Based on GPRS for Agricultural Products Quality Tracking

    NASA Astrophysics Data System (ADS)

    Liu, Shihong; Meng, Hong; Zheng, Huoguo; Wu, Jiangshou

    Traceability systems have become an important means of food safety management. The global food industry and many countries have paid increasing attention to the construction of food traceability systems, but the tracing terminal itself has rarely been addressed. According to the technical requirements of the cereal and oil products quality safety tracing process, we design and develop a mobile tracing terminal based on GPRS for agricultural products quality tracking, enabling quality supervisors and consumers to track and trace the quality of related agricultural products anytime, anywhere.

  19. Thermal Catalytic Oxidation of Airborne Contaminants by a Reactor Using Ultra-Short Channel Length, Monolithic Catalyst Substrates

    NASA Technical Reports Server (NTRS)

    Perry, J. L.; Tomes, K. M.; Tatara, J. D.

    2005-01-01

    Contaminated air, whether in a crewed spacecraft cabin or in terrestrial work and living spaces, is a pervasive problem affecting human health, performance, and well-being. The need for highly effective, economical air quality processes spans a wide range of terrestrial and space flight applications. Typically, air quality control relies on adsorption-based processes. Most industrial packed-bed adsorption processes use activated carbon. Once saturated, the carbon is either dumped or regenerated. In either case, the dumped carbon and concentrated waste streams constitute a hazardous waste that must be handled safely while minimizing environmental impact. Thermal catalytic oxidation processes designed to address waste handling issues are moving to the forefront of cleaner air quality control and process gas decontamination. Careful consideration in designing the catalyst substrate and reactor can lead to more complete contaminant destruction and poisoning resistance. Maintenance improvements leading to reduced waste handling and process downtime can also be realized. The performance of a prototype thermal catalytic reactor based on an ultra-short channel length, monolithic catalyst substrate design, under a variety of process flow and contaminant loading conditions, is discussed.

  20. Evaluation of Thermostabilities of Enzymes, Mediators and Immobilizing Membranes for Enzyme Sensors

    NASA Astrophysics Data System (ADS)

    Yamada, Yohei; Ohnishi, Yuki; Hayashi, Tetsuya; Isobe, Yoshifumi; Yabutani, Tomoki

    The stability of the constituents of electrochemical measurement (electron mediators, enzymes and enzyme-immobilizing membranes) was evaluated at high temperature (maximum 75°C) by electrochemical analysis, UV-Vis spectrometry (UV-Vis) and UV circular dichroism (CD). In the stability evaluation of mediators at 75°C, the electrochemical activity of 2,2'-azinobis(3-ethylbenzothiazoline-6-sulfonic acid) ammonium salt (ABTS), potassium ferricyanide (K3[Fe(CN)6]) and ferrocenemethanol (FcOH) did not change, but that of 2,6-dichloroindophenol (DCIP), p-benzoquinone (p-BQ) and vitamin K3 (VK3) greatly decreased. The stability of diaphorase from Bacillus stearothermophilus (DI) was compared between solution and several types of membranes (Agarose H, poly-L-lysine (PLL) and poly-ion complex (PIC)) by electrochemical analysis. In solution, the activity and secondary structure of DI changed at 65°C or higher. This tendency was not much different in Agarose H, but in PLL the activity was almost maintained up to 70°C. From the magnitude of the current response and the results of prolonged stability evaluation, it is suggested that DI was fixed on the electrodes in high concentration and that elimination of DI from PLL seldom occurred.

  1. Total Quality Management of Information System for Quality Assessment of Pesantren Using Fuzzy-SERVQUAL

    NASA Astrophysics Data System (ADS)

    Faizah, Arbiati; Syafei, Wahyul Amien; Isnanto, R. Rizal

    2018-02-01

    This research proposed a model combining Total Quality Management (TQM) and a fuzzy Service Quality (SERVQUAL) method to assess service quality. TQM was implemented as quality management oriented toward customer satisfaction and involving all stakeholders. The SERVQUAL model was used to measure service quality along five dimensions: tangibles, reliability, responsiveness, assurance, and empathy. Fuzzy set theory was used to accommodate the subjectivity and ambiguity of quality assessment. Input data consisted of indicator data and quality-assessment aspects, which were processed into service-quality assessment questionnaires for the Pesantren using the fuzzy method to obtain service-quality scores. This process consisted of the following steps: entering dimension and questionnaire data into the database system; completing the questionnaire through the system; system calculation of fuzzification, defuzzification, and the gap between the quality expected and the quality received by service recipients; and calculation of each dimension's rating to indicate quality-improvement priorities. The rating of each quality dimension was then displayed on the system dashboard so users could see the information. From the system that was built, the tangible dimension had the highest gap, -0.399, so it should be prioritized for prompt evaluation and improvement.
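
As a rough illustration of the fuzzification, defuzzification, and gap sequence described above, the sketch below computes a SERVQUAL gap score for a single dimension, assuming a triangular-fuzzy-number mapping for a 5-point Likert scale and centroid defuzzification; the mapping, the dimension, and the responses are illustrative, not the study's actual data.

```python
# Hypothetical sketch: fuzzy SERVQUAL gap for one dimension (illustrative only).
# Likert responses are mapped to triangular fuzzy numbers (a, b, c), averaged,
# then defuzzified with the centroid method: (a + b + c) / 3.

# Triangular fuzzy numbers for a 5-point Likert scale (assumed mapping).
TFN = {1: (0, 0, 25), 2: (0, 25, 50), 3: (25, 50, 75), 4: (50, 75, 100), 5: (75, 100, 100)}

def defuzzify(scores):
    """Average the TFNs of all responses, then take the centroid."""
    n = len(scores)
    a = sum(TFN[s][0] for s in scores) / n
    b = sum(TFN[s][1] for s in scores) / n
    c = sum(TFN[s][2] for s in scores) / n
    return (a + b + c) / 3

def servqual_gap(perceived, expected):
    """Gap = perception score - expectation score (negative = quality shortfall)."""
    return defuzzify(perceived) - defuzzify(expected)

# Illustrative responses for the 'tangible' dimension.
perceived = [3, 4, 3, 2, 3]
expected = [5, 5, 4, 5, 4]
print(servqual_gap(perceived, expected))  # negative: received quality below expected
```

A negative gap, as for the tangible dimension in the study, signals the dimension to prioritize for improvement.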

  2. The Use of Multidimensional Image-Based Analysis to Accurately Monitor Cell Growth in 3D Bioreactor Culture

    PubMed Central

    Baradez, Marc-Olivier; Marshall, Damian

    2011-01-01

    The transition from traditional culture methods towards bioreactor-based bioprocessing to produce cells in commercially viable quantities for cell therapy applications requires the development of robust methods to ensure the quality of the cells produced. Standard methods for measuring cell quality parameters such as viability provide only limited information, making process monitoring and optimisation difficult. Here we describe a 3D image-based approach to develop cell distribution maps which can be used to simultaneously measure the number, confluency and morphology of cells attached to microcarriers in a stirred tank bioreactor. The accuracy of the cell distribution measurements is validated using in silico modelling of synthetic image datasets and is shown to be >90%. Using the cell distribution mapping process and principal component analysis we show how cell growth can be quantitatively monitored over a 13-day bioreactor culture period and how changes to manufacturing processes such as initial cell seeding density can significantly influence cell morphology and the rate at which cells are produced. Taken together, these results demonstrate how image-based analysis can be incorporated into cell quality control processes, facilitating the transition towards bioreactor-based manufacture of clinical-grade cells. PMID:22028809
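
The confluency and cell-count measurements described above can be illustrated, in highly simplified form, on a toy binary segmentation mask; this is not the authors' 3D pipeline, just a minimal sketch of the two metrics.

```python
# Hypothetical sketch of two image-derived quality metrics mentioned above:
# confluency (fraction of surface covered) and cell count, computed here from
# a toy binary segmentation mask rather than real 3D bioreactor imagery.

def confluency(mask):
    """Fraction of pixels flagged as cell-covered."""
    total = sum(len(row) for row in mask)
    covered = sum(sum(row) for row in mask)
    return covered / total

def count_cells(mask):
    """Count 4-connected components in the binary mask (one per cell/colony)."""
    rows, cols = len(mask), len(mask[0])
    seen = [[False] * cols for _ in range(rows)]
    count = 0
    for r in range(rows):
        for c in range(cols):
            if mask[r][c] and not seen[r][c]:
                count += 1
                stack = [(r, c)]
                while stack:  # flood fill one connected component
                    y, x = stack.pop()
                    if 0 <= y < rows and 0 <= x < cols and mask[y][x] and not seen[y][x]:
                        seen[y][x] = True
                        stack += [(y + 1, x), (y - 1, x), (y, x + 1), (y, x - 1)]
    return count

mask = [
    [1, 1, 0, 0],
    [1, 0, 0, 1],
    [0, 0, 0, 1],
]
print(count_cells(mask), round(confluency(mask), 2))
```

In the real system these values would be computed per microcarrier from segmented 3D image stacks and fed into the distribution maps and PCA monitoring.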

  4. 76 FR 21894 - Proposed Statement of Antitrust Enforcement Policy Regarding Accountable Care Organizations...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-04-19

    ... structure that includes clinical and administrative processes; (3) processes to promote evidence-based medicine and patient engagement; (4) reporting on quality and cost measures; and (5) coordinated care for... costs and ensure quality.\\16\\ Federal Trade Commission staff advisory opinions discuss evidence...

  5. Kpejigaou: an indigenous, high-protein, low-fat, cowpea-based griddled food proposed for coastal West Africa.

    PubMed

    Amonsou, Eric Oscar; Sakyi-Dawson, Esther; Saalia, Firibu Kwesi; Houssou, Paul

    2008-12-01

    Griddled cowpea paste foods have high nutritional potential because they are low in fat but high in protein. A good understanding of process and product characteristics of kpejigaou is necessary to improve its quality and enhance acceptability. To describe the product, evaluate critical variables in traditional processing, and determine consumer quality criteria and preferences for kpejigaou. A survey of kpejigaou processing was carried out among processors and regular consumers of kpejigaou. Kpejigaou is flat and circular in shape, with uniform thickness and porous structure. The production process of kpejigaou was found to be simple and rapid, but the quality of the finished product varied among processors and among batches. Critical processing variables affecting quality were dehulling of the cowpeas, type of griddling equipment, and griddling temperature. Texture (sponginess) is the most important quality index that determines the preference and acceptability of kpejigaou by consumers. Traditionally processed kpejigaou does not meet current standards for high-quality foods. This study provides the basis for efforts to standardize the kpejigaou process to ensure consistent product quality and enhance the acceptability of kpejigaou among consumers. Kpejigaou has a potential for success if marketed as a low-fat, nutritious fast food.

  6. Implementation of a Quality Improvement Process Aimed to Deliver Higher-Value Physical Therapy for Patients With Low Back Pain: Case Report.

    PubMed

    Karlen, Emily; McCathie, Becky

    2015-12-01

    The current state of health care demands higher-value care. Due to many barriers, clinicians routinely do not implement evidence-based care even though it is known to improve quality and reduce cost of care. The purpose of this case report is to describe a theory-based, multitactic implementation of a quality improvement process aimed to deliver higher-value physical therapy for patients with low back pain. Patients were treated from January 2010 through December 2014 in 1 of 32 outpatient physical therapy clinics within an academic health care system. Data were examined from 47,755 patients (mean age=50.3 years) entering outpatient physical therapy for management of nonspecific low back pain, with or without radicular pain. Development and implementation tactics were constructed from adult learning and change management theory to enhance adherence to best practice care among 130 physical therapists. A quality improvement team implemented 4 tactics: establish care delivery expectations, facilitate peer-led clinical and operational teams, foster a learning environment focused on meeting a population's needs, and continuously collect and analyze outcomes data. Physical therapy utilization and change in functional disability were measured to assess relative cost and quality of care. Secondarily, charge data assessed change in physical therapists' application of evidence-based care. Implementation of a quality improvement process was measured by year-over-year improved clinical outcomes, decreased utilization, and increased adherence to evidence-based physical therapy, which was associated with higher-value care. When adult learning and change management theory are combined in quality improvement efforts, common barriers to implementing evidence-based care can be overcome, creating an environment supportive of delivering higher-value physical therapy for patients with low back pain. © 2015 American Physical Therapy Association.

  7. A quality by design study applied to an industrial pharmaceutical fluid bed granulation.

    PubMed

    Lourenço, Vera; Lochmann, Dirk; Reich, Gabriele; Menezes, José C; Herdling, Thorsten; Schewitz, Jens

    2012-06-01

    The pharmaceutical industry is encouraged within Quality by Design (QbD) to apply science-based manufacturing principles to assure quality not only of new but also of existing processes. This paper presents how QbD principles can be applied to an existing industrial pharmaceutical fluid bed granulation (FBG) process. A three-step approach is presented as follows: (1) implementation of Process Analytical Technology (PAT) monitoring tools at the industrial-scale process, combined with multivariate data analysis (MVDA) of process and PAT data, to increase process knowledge; (2) execution of scaled-down designed experiments at pilot scale, with adequate PAT monitoring tools, to investigate the process response to intended changes in Critical Process Parameters (CPPs); and finally (3) definition of a process Design Space (DS) linking CPPs to Critical Quality Attributes (CQAs), within which product quality is ensured by design and which, after scale-up, can be used at the industrial process scale. The proposed approach was developed for an existing industrial process. Through the enhanced process knowledge established, a significant reduction in the variability of product CQAs, already within quality-specification ranges, was achieved by a better choice of CPP values. The results of this step-wise development and implementation are described. Copyright © 2012 Elsevier B.V. All rights reserved.
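
The design-space concept in step (3) can be sketched as a simple range check over the CPPs; the parameter names and validated ranges below are hypothetical, for illustration only, not values from the study.

```python
# Hypothetical sketch of a design-space (DS) check: the DS links critical
# process parameters (CPPs) to validated ranges within which product CQAs
# are assured. Parameter names and ranges here are illustrative assumptions.

DESIGN_SPACE = {
    "inlet_air_temp_C": (55.0, 75.0),
    "spray_rate_g_min": (80.0, 140.0),
    "atomization_pressure_bar": (1.5, 3.0),
}

def in_design_space(cpps, design_space=DESIGN_SPACE):
    """Return the list of CPPs outside their validated ranges (empty = OK)."""
    return [name for name, value in cpps.items()
            if not (design_space[name][0] <= value <= design_space[name][1])]

batch = {"inlet_air_temp_C": 62.0, "spray_rate_g_min": 150.0,
         "atomization_pressure_bar": 2.1}
print(in_design_space(batch))  # any listed CPP would flag the batch for review
```

Operating anywhere inside the DS is, by the QbD definition, not considered a change requiring regulatory re-approval, which is what makes this check operationally useful.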

  8. Transforming nanomedicine manufacturing toward Quality by Design and microfluidics.

    PubMed

    Colombo, Stefano; Beck-Broichsitter, Moritz; Bøtker, Johan Peter; Malmsten, Martin; Rantanen, Jukka; Bohr, Adam

    2018-04-05

    Nanopharmaceuticals aim at translating the unique features of nano-scale materials into therapeutic products and consequently their development relies critically on the progression in manufacturing technology to allow scalable processes complying with process economy and quality assurance. The relatively high failure rate in translational nanopharmaceutical research and development, with respect to new products on the market, is at least partly due to immature bottom-up manufacturing development and resulting sub-optimal control of quality attributes in nanopharmaceuticals. Recently, quality-oriented manufacturing of pharmaceuticals has undergone an unprecedented change toward process and product development interaction. In this context, Quality by Design (QbD) aims to integrate product and process development resulting in an increased number of product applications to regulatory agencies and stronger proprietary defense strategies of process-based products. Although QbD can be applied to essentially any production approach, microfluidic production offers particular opportunities for QbD-based manufacturing of nanopharmaceuticals. Microfluidics provides unique design flexibility, process control and parameter predictability, and also offers ample opportunities for modular production setups, allowing process feedback for continuously operating production and process control. The present review aims at outlining emerging opportunities in the synergistic implementation of QbD strategies and microfluidic production in contemporary development and manufacturing of nanopharmaceuticals. In doing so, aspects of design and development, but also technology management, are reviewed, as is the strategic role of these tools for aligning nanopharmaceutical innovation, development, and advanced industrialization in the broader pharmaceutical field. Copyright © 2018 Elsevier B.V. All rights reserved.

  9. Quality Matters™: An Educational Input in an Ongoing Design-Based Research Project

    ERIC Educational Resources Information Center

    Adair, Deborah; Shattuck, Kay

    2015-01-01

    Quality Matters (QM) has been transforming established best practices and online education-based research into an applicable, scalable course level improvement process for the last decade. In this article, the authors describe QM as an ongoing design-based research project and an educational input for improving online education.

  10. Quality Assurance in American and British Higher Education: A Comparison.

    ERIC Educational Resources Information Center

    Stanley, Elizabeth C.; Patrick, William J.

    1998-01-01

    Compares quality improvement and accountability processes in the United States and United Kingdom. For the United Kingdom, looks at quality audits, institutional assessment, standards-based quality assurance, and research assessment; in the United States, looks at regional and specialized accreditation, performance indicator systems, academic…

  11. Building quality into medical product software design.

    PubMed

    Mallory, S R

    1993-01-01

    The software engineering and quality assurance disciplines are a requisite to the design of safe and effective software-based medical devices. It is in the areas of software methodology and process that the most beneficial application of these disciplines to software development can be made. Software is a product of complex operations and methodologies and is not amenable to the traditional electromechanical quality assurance processes. Software quality must be built in by the developers, with the software verification and validation engineers acting as the independent instruments for ensuring compliance with performance objectives and with development and maintenance standards. The implementation of a software quality assurance program is a complex process involving management support, organizational changes, and new skill sets, but the benefits are profound. Its rewards provide safe, reliable, cost-effective, maintainable, and manageable software, which may significantly speed the regulatory review process and therefore potentially shorten the overall time to market. The use of a trial project can greatly facilitate the learning process associated with the first-time application of a software quality assurance program.

  12. The quality of instruments to assess the process of shared decision making: A systematic review

    PubMed Central

    Bomhof-Roordink, Hanna; Smith, Ian P.; Scholl, Isabelle; Stiggelbout, Anne M.; Pieterse, Arwen H.

    2018-01-01

    Objective To inventory instruments assessing the process of shared decision making and appraise their measurement quality, taking into account the methodological quality of their validation studies. Methods In a systematic review we searched seven databases (PubMed, Embase, Emcare, Cochrane, PsycINFO, Web of Science, Academic Search Premier) for studies investigating instruments measuring the process of shared decision making. For each identified instrument, we assessed the level of evidence separately for 10 measurement properties following a three-step procedure: 1) appraisal of the methodological quality using the COnsensus-based Standards for the selection of health status Measurement INstruments (COSMIN) checklist, 2) appraisal of the psychometric quality of the measurement property using three possible quality scores, 3) best-evidence synthesis based on the number of studies, their methodological and psychometric quality, and the direction and consistency of the results. The study protocol was registered at PROSPERO: CRD42015023397. Results We included 51 articles describing the development and/or evaluation of 40 shared decision-making process instruments: 16 patient questionnaires, 4 provider questionnaires, 18 coding schemes, and 2 instruments measuring multiple perspectives. There is an overall lack of evidence for their measurement quality, either because validation is missing or because methods are poor. The best-evidence synthesis indicated positive results for a majority of instruments for content validity (50%) and structural validity (53%) where these were evaluated, but negative results for a majority of instruments where inter-rater reliability (47%) and hypotheses testing (59%) were evaluated. Conclusions Given the lack of evidence on measurement quality, the choice of the most appropriate instrument can best be based on the instrument's content and characteristics, such as the perspective that it assesses. We recommend refinement and validation of existing instruments, and the use of the COSMIN guidelines to help guarantee high-quality evaluations. PMID:29447193

  13. Hadoop-Based Distributed System for Online Prediction of Air Pollution Based on Support Vector Machine

    NASA Astrophysics Data System (ADS)

    Ghaemi, Z.; Farnaghi, M.; Alimohammadi, A.

    2015-12-01

    The critical impact of air pollution on human health and the environment on the one hand, and the complexity of pollutant concentration behavior on the other, lead scientists to look for advanced techniques for monitoring and predicting urban air quality. Additionally, recent developments in data measurement techniques have led to the collection of various types of data about air quality. Such data are extremely voluminous and, to be useful, must be processed at high velocity. Due to the complexity of big data analysis, especially for dynamic applications, online forecasting of pollutant concentration trends within a reasonable processing time is still an open problem. The purpose of this paper is to present an online forecasting approach based on the Support Vector Machine (SVM) to predict air quality one day in advance. In order to overcome the computational requirements of large-scale data analysis, distributed computing based on the Hadoop platform has been employed to leverage the processing power of multiple processing units. The MapReduce programming model is adopted for massively parallel processing in this study. Based on the online algorithm and the Hadoop framework, an online forecasting system is designed to predict the air pollution of Tehran for the next 24 hours. The results have been assessed on the basis of processing time and efficiency. Quite accurate predictions of air pollutant indicator levels within an acceptable processing time show that the presented approach is well suited to large-scale air pollution prediction problems.
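
One ingredient of such an approach, framing next-day prediction as supervised learning over a sliding window of past observations, can be sketched as follows; the full system would feed these samples to an SVM regressor trained on Hadoop, and the series and window size here are purely illustrative.

```python
# Hypothetical sketch: framing next-day air-quality prediction as supervised
# learning. Each training sample uses the previous `window` days of a
# pollutant index as features and the following day's value as the target.
# The fitted model (e.g. an SVM regressor) is outside the scope of this sketch.

def make_supervised(series, window=3):
    """Turn a daily pollutant series into (features, target) pairs."""
    X, y = [], []
    for i in range(len(series) - window):
        X.append(series[i:i + window])   # last `window` days of history
        y.append(series[i + window])     # next day's value to predict
    return X, y

daily_pm = [80, 95, 110, 120, 90, 85, 100]   # illustrative daily index values
X, y = make_supervised(daily_pm, window=3)
print(X[0], y[0])  # first sample: three days of history -> the fourth day
```

Under MapReduce, this windowing step is naturally parallel: mappers can emit (features, target) pairs from disjoint slices of the time series, with a small overlap of `window` days at the boundaries.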

  14. Does the process map influence the outcome of quality improvement work? A comparison of a sequential flow diagram and a hierarchical task analysis diagram.

    PubMed

    Colligan, Lacey; Anderson, Janet E; Potts, Henry W W; Berman, Jonathan

    2010-01-07

    Many quality and safety improvement methods in healthcare rely on a complete and accurate map of the process. Process mapping in healthcare is often achieved using a sequential flow diagram, but there is little guidance available in the literature about the most effective type of process map to use. Moreover, there is evidence that the organisation of information in an external representation affects reasoning and decision making. This exploratory study examined whether the type of process map - sequential or hierarchical - affects healthcare practitioners' judgments. A sequential and a hierarchical process map of a community-based anticoagulation clinic were produced based on data obtained from interviews, talk-throughs, attendance at a training session and examination of protocols and policies. Clinic practitioners were asked to specify the parts of the process that they judged to contain quality and safety concerns. The process maps were then shown to them in counter-balanced order and they were asked to circle on the diagrams the parts of the process where they had the greatest quality and safety concerns. A structured interview was then conducted, in which they were asked about various aspects of the diagrams. Quality and safety concerns cited by practitioners differed depending on whether they were or were not looking at a process map, and whether they were looking at a sequential diagram or a hierarchical diagram. More concerns were identified using the hierarchical diagram than with the sequential diagram, and more concerns were identified in relation to clinical work than administrative work. Participants' preference for the sequential or hierarchical diagram depended on the context in which they would be using it. The difficulties of determining the boundaries for the analysis and the granularity required were highlighted. The results indicated that the layout of a process map does influence perceptions of quality and safety problems in a process.
In quality improvement work it is important to carefully consider the type of process map to be used and to consider using more than one map to ensure that different aspects of the process are captured.

  15. Software quality and process improvement in scientific simulation codes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ambrosiano, J.; Webster, R.

    1997-11-01

    This report contains viewgraphs on the quest to develop better simulation code quality through process modeling and improvement. The study is based on the experience of the authors and on interviews with ten subjects chosen from simulation code development teams at LANL. The study is descriptive rather than scientific.

  16. Classification and Quality Evaluation of Tobacco Leaves Based on Image Processing and Fuzzy Comprehensive Evaluation

    PubMed Central

    Zhang, Fan; Zhang, Xinhong

    2011-01-01

    Classification, quality evaluation, and grading of flue-cured tobacco leaves are mostly performed manually, relying on the judgmental experience of experts and inevitably limited by personal, physical, and environmental factors; the classification and quality evaluation are therefore subjective and experience-based. In this paper, an automatic classification method for tobacco leaves based on digital image processing and fuzzy set theory is presented. A grading system based on image processing techniques was developed for automatically inspecting and grading flue-cured tobacco leaves. This system uses machine vision for the extraction and analysis of color, size, shape, and surface texture. Fuzzy comprehensive evaluation provides a high level of confidence in decision making based on fuzzy logic. A neural network is used to estimate and forecast the membership functions of the features of tobacco leaves in the fuzzy sets. The experimental results of the two-level fuzzy comprehensive evaluation (FCE) show that the classification accuracy is about 94% for the trained tobacco leaves and about 72% for non-trained tobacco leaves. We believe that fuzzy comprehensive evaluation is a viable approach for the automatic classification and quality evaluation of tobacco leaves. PMID:22163744
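
The core fuzzy comprehensive evaluation step can be sketched as the common weighted-average operator B = W × R, combining feature weights with a membership matrix over quality grades; the features, grades, weights, and membership values below are illustrative, not the paper's trained values.

```python
# Hypothetical sketch of one level of fuzzy comprehensive evaluation (FCE),
# assuming the weighted-average operator B = W x R. In the paper's system the
# membership values would come from a trained neural network; here they are
# made-up numbers chosen only to show the arithmetic.

def fuzzy_evaluate(weights, membership):
    """B_j = sum_i w_i * r_ij for each quality grade j."""
    grades = len(membership[0])
    return [sum(w * row[j] for w, row in zip(weights, membership))
            for j in range(grades)]

# Rows: features (color, size, shape, texture); columns: grades (high, mid, low).
R = [
    [0.7, 0.2, 0.1],
    [0.5, 0.4, 0.1],
    [0.6, 0.3, 0.1],
    [0.8, 0.1, 0.1],
]
W = [0.4, 0.2, 0.2, 0.2]   # feature weights, summing to 1

B = fuzzy_evaluate(W, R)
grade = B.index(max(B))    # assign the grade with the highest membership
print([round(b, 2) for b in B], grade)
```

A two-level FCE, as in the paper, simply repeats this step: the first level evaluates feature groups, and the second combines the group results with another weight vector.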

  17. Needs Assessment for the Use of NASA Remote Sensing Data in the Development and Implementation of Estuarine and Coastal Water Quality Standards

    NASA Technical Reports Server (NTRS)

    Spiering, Bruce; Underwood, Lauren; Ellis, Chris; Lehrter, John; Hagy, Jim; Schaeffer, Blake

    2010-01-01

    The goals of the project are to provide information from satellite remote sensing to support numeric nutrient criteria development and to determine data processing methods and data quality requirements to support nutrient criteria development and implementation. The approach is to identify water quality indicators that are used by decision makers to assess water quality and that are related to optical properties of the water; to develop remotely sensed data products based on algorithms relating remote sensing imagery to field-based observations of indicator values; to develop methods to assess estuarine water quality, including trends, spatial and temporal variability, and seasonality; and to develop tools to assist in the development and implementation of estuarine and coastal nutrient criteria. Additional slides present process, criteria development, typical data sources and analyses for criteria process, the power of remote sensing data for the process, examples from Pensacola Bay, spatial and temporal variability, pixel matchups, remote sensing validation, remote sensing in coastal waters, requirements for remotely sensed data products, and needs assessment. An additional presentation examines group engagement and information collection. Topics include needs assessment purpose and objectives, understanding water quality decision making, determining information requirements, and next steps.

  18. Continuous counter-current chromatography for capture and polishing steps in biopharmaceutical production.

    PubMed

    Steinebach, Fabian; Müller-Späth, Thomas; Morbidelli, Massimo

    2016-09-01

    The economic advantages of continuous processing of biopharmaceuticals, which include smaller equipment and faster, efficient processes, have increased interest in this technology over the past decade. Continuous processes can also improve quality assurance and enable greater controllability, consistent with the quality initiatives of the FDA. Here, we discuss different continuous multi-column chromatography processes. Differences in the capture and polishing steps result in two different types of continuous processes that employ counter-current column movement. Continuous-capture processes are associated with increased productivity per cycle and decreased buffer consumption, whereas the typical purity-yield trade-off of classical batch chromatography can be surmounted by continuous processes for polishing applications. In the context of continuous manufacturing, different but complementary chromatographic columns or devices are typically combined to improve overall process performance and avoid unnecessary product storage. In the following, these various processes, their performances compared with batch processing and resulting product quality are discussed based on a review of the literature. Based on various examples of applications, primarily monoclonal antibody production processes, conclusions are drawn about the future of these continuous-manufacturing technologies. Copyright © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  19. Development of an evidence-based review with recommendations using an online iterative process.

    PubMed

    Rudmik, Luke; Smith, Timothy L

    2011-01-01

    The practice of modern medicine is governed by evidence-based principles. Due to the plethora of medical literature, clinicians often rely on systematic reviews and clinical guidelines to summarize the evidence and provide best practices. Implementation of an evidence-based clinical approach can minimize variation in health care delivery and optimize the quality of patient care. This article reports a method for developing an "Evidence-based Review with Recommendations" using an online iterative process. The manuscript describes the following steps involved in this process: clinical topic selection, evidence-based review assignment, literature review and initial manuscript preparation, iterative review process with author selection, and manuscript finalization. The goal of this article is to improve efficiency and increase the production of evidence-based reviews while maintaining the high quality and transparency associated with the rigorous methodology utilized for clinical guideline development. With the rise of evidence-based medicine, most medical and surgical specialties have an abundance of clinical topics which would benefit from a formal evidence-based review. Although clinical guideline development is an important methodology, the associated challenges limit development to only the absolute highest-priority clinical topics. As outlined in this article, the online iterative approach to the development of an Evidence-based Review with Recommendations may improve productivity without compromising the quality associated with formal guideline development methodology. Copyright © 2011 American Rhinologic Society-American Academy of Otolaryngic Allergy, LLC.

  20. Quality nursing care: a qualitative enquiry.

    PubMed

    Hogston, R

    1995-01-01

    In spite of the wealth of literature on quality nursing care, a disparity exists in defining quality. The purpose of this study was an attempt to seek out practising nurses' perceptions of quality nursing care and to present a definition of quality as described by nurses. Eighteen nurses from a large hospital in the south of England were interviewed. Qualitative analysis based on a modified grounded theory approach revealed three categories described as 'structure', 'process' and 'outcome'. This supports previous work on evaluating quality care but postulates that structure, process and outcome could also be used as a mechanism for defining quality. The categories are defined by using the words of the informants in order to explain the essential attributes of quality nursing care. The findings demonstrate how more informants cited quality in terms of process and outcome than structure. It is speculated that the significance of this rests with the fact that nurses have direct control over process and outcome whereas the political and economic climate in which nurses work is beyond their control and decisions over structure lie with their managers.

  1. Lower- Versus Higher-Income Populations In The Alternative Quality Contract: Improved Quality And Similar Spending.

    PubMed

    Song, Zirui; Rose, Sherri; Chernew, Michael E; Safran, Dana Gelb

    2017-01-01

    As population-based payment models become increasingly common, it is crucial to understand how such payment models affect health disparities. We evaluated health care quality and spending among enrollees in areas with lower versus higher socioeconomic status in Massachusetts before and after providers entered into the Alternative Quality Contract, a two-sided population-based payment model with substantial incentives tied to quality. We compared changes in process measures, outcome measures, and spending between enrollees in areas with lower and higher socioeconomic status from 2006 to 2012 (outcome measures were measured after the intervention only). Quality improved for all enrollees in the Alternative Quality Contract after their provider organizations entered the contract. Process measures improved 1.2 percentage points per year more among enrollees in areas with lower socioeconomic status than among those in areas with higher socioeconomic status. Outcome measure improvement was no different between the subgroups; neither were changes in spending. Larger or comparable improvements in quality among enrollees in areas with lower socioeconomic status suggest a potential narrowing of disparities. Strong pay-for-performance incentives within a population-based payment model could encourage providers to focus on improving quality for more disadvantaged populations. Project HOPE—The People-to-People Health Foundation, Inc.

  2. Polyethersulfone-based ultrafiltration hollow fibre membrane for drinking water treatment systems

    NASA Astrophysics Data System (ADS)

    Chew, Chun Ming; Ng, K. M. David; Ooi, H. H. Richard

    2017-12-01

    Conventional media/sand filtration has been the mainstream water treatment process for most municipal water treatment plants in Malaysia. Filtrate qualities of conventional media/sand filtration are very much dependent on the coagulation-flocculation process prior to filtration and might be as high as 5 NTU. However, the demands for better quality of drinking water through public piped-water supply systems are growing. Polymeric ultrafiltration (UF) hollow fibre membrane made from modified polyethersulfone (PES) material is highly hydrophilic with high tensile strength and produces excellent quality filtrate of below 0.3 NTU in turbidity. This advanced membrane filtration material is also chemically resistant, which allows a typical lifespan of five years. Comparisons between the conventional media/sand filtration and PES-based UF systems are carried out in this paper. UF has been considered an emerging technology in municipal drinking water treatment plants due to its consistency in producing high quality filtrates even without the coagulation-flocculation process. The decreasing cost of PES-based membrane due to mass production and competitive pricing by manufacturers has made the UF technology affordable for industrial-scale water treatment plants.

  3. Near infrared spectroscopy based monitoring of extraction processes of raw material with the help of dynamic predictive modeling

    NASA Astrophysics Data System (ADS)

    Wang, Haixia; Suo, Tongchuan; Wu, Xiaolin; Zhang, Yue; Wang, Chunhua; Yu, Heshui; Li, Zheng

    2018-03-01

    The control of batch-to-batch quality variations remains a challenging task for pharmaceutical industries, e.g., traditional Chinese medicine (TCM) manufacturing. One difficult problem is to produce pharmaceutical products with consistent quality from raw material of large quality variations. In this paper, an integrated methodology combining near infrared spectroscopy (NIRS) and dynamic predictive modeling is developed for the monitoring and control of the batch extraction process of licorice. With the spectral data in hand, the initial state of the process is first estimated with a state-space model to construct a process monitoring strategy for the early detection of variations induced by the initial process inputs such as raw materials. Second, the quality property of the end product is predicted at mid-course during the extraction process with a partial least squares (PLS) model. The batch-end-time (BET) is then adjusted accordingly to minimize the quality variations. In conclusion, our study shows that with the help of dynamic predictive modeling, NIRS can offer past and future information about the process, which enables more accurate monitoring and control of process performance and product quality.
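The PLS prediction stage described above can be sketched in miniature. The snippet below is an illustrative reconstruction, not the authors' code: it fits a minimal one-block NIPALS PLS1 regression on synthetic "spectra" and predicts a synthetic quality attribute; the state-space monitoring stage is omitted.

```python
import numpy as np

def pls_fit(X, y, n_components=8):
    """Minimal one-block NIPALS PLS1 regression (illustrative sketch)."""
    xm, ym = X.mean(axis=0), y.mean()
    Xk, yk = X - xm, y - ym
    W, P, Q = [], [], []
    for _ in range(n_components):
        w = Xk.T @ yk
        w /= np.linalg.norm(w)               # weight vector
        t = Xk @ w                           # scores
        tt = t @ t
        p = Xk.T @ t / tt                    # X loadings
        q = (yk @ t) / tt                    # y loading
        Xk = Xk - np.outer(t, p)             # deflate X
        yk = yk - q * t                      # deflate y
        W.append(w); P.append(p); Q.append(q)
    W, P, Q = np.array(W).T, np.array(P).T, np.array(Q)
    B = W @ np.linalg.solve(P.T @ W, Q)      # regression coefficients
    return B, xm, ym

rng = np.random.default_rng(0)
X = rng.normal(size=(60, 20))                # 60 "batches" x 20 "wavelengths" (synthetic)
b_true = np.zeros(20); b_true[:3] = 1.0
y = X @ b_true + 0.01 * rng.normal(size=60)  # synthetic end-product quality attribute

B, xm, ym = pls_fit(X, y)
y_hat = (X - xm) @ B + ym                    # predicted quality attribute
r2 = 1 - np.sum((y - y_hat) ** 2) / np.sum((y - ym) ** 2)
print(round(r2, 3))
```

In practice `X` would hold NIR absorbance spectra collected mid-course and `y` the measured end-product quality property; the number of latent components would be chosen by cross-validation rather than fixed.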

  4. [Quality by design approaches for pharmaceutical development and manufacturing of Chinese medicine].

    PubMed

    Xu, Bing; Shi, Xin-Yuan; Wu, Zhi-Sheng; Zhang, Yan-Ling; Wang, Yun; Qiao, Yan-Jiang

    2017-03-01

    Pharmaceutical quality is built by design, formed in the manufacturing process, and improved over the product's lifecycle. Based on a comprehensive literature review of pharmaceutical quality by design (QbD), the essential ideas and implementation strategies of pharmaceutical QbD were interpreted. Considering the complex nature of Chinese medicine, the "4H" model was proposed for implementing QbD in the pharmaceutical development and industrial manufacture of Chinese medicine products. "4H" is an acronym for holistic design, holistic information analysis, holistic quality control, and holistic process optimization, which is consistent with the holistic concept of Chinese medicine theory. Holistic design aims at constructing both the quality problem space from patient requirements and the quality solution space from multidisciplinary knowledge. Holistic information analysis emphasizes understanding the quality pattern of Chinese medicine by integrating and mining multisource data and information at a relatively high level. Batch-to-batch quality consistency and manufacturing system reliability can be realized by the comprehensive application of inspective, statistical, predictive and intelligent quality control strategies. Holistic process optimization is to improve product quality and process capability during product lifecycle management. The implementation of QbD is useful to eliminate the ecosystem contradictions lying in the pharmaceutical development and manufacturing process of Chinese medicine products, and helps guarantee cost effectiveness. Copyright© by the Chinese Pharmaceutical Association.

  5. Opportunities for Applied Behavior Analysis in the Total Quality Movement.

    ERIC Educational Resources Information Center

    Redmon, William K.

    1992-01-01

    This paper identifies critical components of recent organizational quality improvement programs and specifies how applied behavior analysis can contribute to quality technology. Statistical Process Control and Total Quality Management approaches are compared, and behavior analysts are urged to build their research base and market behavior change…

  6. Working toward quality in obstetric anesthesia: a business approach.

    PubMed

    Lynde, Grant C

    2017-06-01

    Physicians are increasingly required to demonstrate that they provide quality care. How does one define quality? A significant body of literature in industries outside of health care provides guidance on how to define appropriate metrics, create teams to troubleshoot problem areas, and sustain those improvements. The modern quality movement in the United States began in response to revolutionary gains in both quality and productivity in Japanese manufacturing in the 1980s. Applying these lessons to the healthcare setting has been slow. Hospitals are only now introducing tools such as failure mode and effects analysis, Lean and Six Sigma into their quality divisions and are seeing significant cost reductions and outcome improvements. The review will discuss the process for creating an effective quality program for an obstetric anesthesia division. Sustainable improvements in delivered care need to be based on an evaluation of service line needs, defining appropriate metrics, understanding current process flows, changing and measuring those processes, and developing mechanisms to ensure the new processes are maintained.

  7. [Application of quality by design in granulation process for Ginkgo leaf tablet (Ⅲ): process control strategy based on design space].

    PubMed

    Cui, Xiang-Long; Xu, Bing; Sun, Fei; Dai, Sheng-Yun; Shi, Xin-Yuan; Qiao, Yan-Jiang

    2017-03-01

    In this paper, under the guidance of the quality by design (QbD) concept, a control strategy for the high shear wet granulation process of the ginkgo leaf tablet, based on the design space, was established to improve process controllability and product quality consistency. The median granule size (D50) and bulk density (Da) of granules were identified as critical quality attributes (CQAs), and potential critical process parameters (pCPPs) were determined by failure mode and effects analysis (FMEA). A Plackett-Burman experimental design was used to screen the pCPPs, and the results demonstrated that the binder amount, the wet massing time and the wet mixing impeller speed were critical process parameters (CPPs). The design space of the high shear wet granulation process was developed within the pCPPs range based on a Box-Behnken design and quadratic polynomial regression models. ANOVA showed that the P-values of the models were less than 0.05 and the P-values of the lack-of-fit tests were greater than 0.1, indicating that the relationship between CQAs and CPPs could be well described by the mathematical models. D50 could be controlled within 170 to 500 μm, and the bulk density could be controlled within 0.30 to 0.44 g•cm⁻³, by using any CPP combination within the scope of the design space. In addition, granules produced with process parameters within the design space also met the tensile strength requirement of the ginkgo leaf tablet. Copyright© by the Chinese Pharmaceutical Association.
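The design-space construction from a quadratic polynomial model can be illustrated as follows. This is a sketch only: the design points, coefficients, and D50 values are synthetic, and a real study would use an actual Box-Behnken design with ANOVA diagnostics.

```python
import numpy as np
from itertools import combinations

def quadratic_design_matrix(X):
    """Full quadratic model: intercept, linear, interaction, and squared terms."""
    n, k = X.shape
    cols = [np.ones(n)] + [X[:, i] for i in range(k)]
    cols += [X[:, i] * X[:, j] for i, j in combinations(range(k), 2)]
    cols += [X[:, i] ** 2 for i in range(k)]
    return np.column_stack(cols)

# Synthetic runs: 3 coded CPPs (binder amount, massing time, impeller speed) -> D50.
rng = np.random.default_rng(1)
X = rng.uniform(-1, 1, size=(15, 3))
d50 = (330 + 80 * X[:, 0] + 40 * X[:, 1] - 25 * X[:, 2]
       + 30 * X[:, 0] * X[:, 1] - 20 * X[:, 2] ** 2
       + rng.normal(0, 5, 15))                 # hypothetical response + noise

beta, *_ = np.linalg.lstsq(quadratic_design_matrix(X), d50, rcond=None)

def predict_d50(cpps):
    """Predicted D50 (um) for one coded CPP combination."""
    return quadratic_design_matrix(np.atleast_2d(cpps)) @ beta

candidate = np.array([0.2, -0.1, 0.3])         # a hypothetical CPP setting
pred = predict_d50(candidate)[0]
in_space = 170 <= pred <= 500                  # D50 acceptance criterion from the study
print(round(pred, 1), in_space)
```

A design space would then be the region of CPP combinations for which all CQA predictions (here only D50) satisfy their acceptance criteria.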

  8. Preface to QoIS 2009

    NASA Astrophysics Data System (ADS)

    Comyn-Wattiau, Isabelle; Thalheim, Bernhard

    Quality assurance is a growing research domain within the Information Systems (IS) and Conceptual Modeling (CM) disciplines. Ongoing research on quality in IS and CM is highly diverse and encompasses theoretical aspects, including quality definition and quality models, and practical/empirical aspects such as the development of methods, approaches and tools for quality measurement and improvement. Current research on quality also includes quality characteristics definitions, validation instruments, methodological and development approaches to quality assurance during software and information systems development, quality monitors, quality assurance during information systems development processes and practices, quality assurance both for data and (meta)schemata, quality support for information systems data import and export, quality of query answering, and cost/benefit analysis of quality assurance processes. Quality assurance also depends on the application area and its specific requirements, for example in the health sector, logistics, the public sector, the financial sector, manufacturing, services, e-commerce, and software. Furthermore, quality assurance must also be supported for data aggregation, ETL processes, web content management and other multi-layered applications. Quality assurance typically requires resources and therefore carries, besides its benefits, a computational and economic trade-off: a compromise between the value of quality data and the cost of quality assurance.

  9. The use of indicators to improve the quality of intensive care: theoretical aspects and experiences from the Dutch intensive care registry.

    PubMed

    van der Voort, P H J; van der Veer, S N; de Vos, M L G

    2012-10-01

    In the concept of total quality management that was originally developed in industry, the use of quality indicators is essential. The implementation of quality indicators in the intensive care unit to improve the quality of care is a complex process. This process can be described in seven subsequent steps of an indicator-based quality improvement (IBQI) cycle. With this IBQI cycle, a continuous quality improvement can be achieved with the use of indicator data in a benchmark setting. After the development of evidence-based indicators, a sense of urgency has to be created, registration should start, raw data must be analysed, feedback must be given, and interpretation and conclusions must be made, followed by a quality improvement plan. The last step is the implementation of changes that needs a sense of urgency, and this completes the IBQI cycle. Barriers and facilitators are found in each step. They should be identified and addressed in a multifaceted quality improvement strategy. © 2012 The Authors. Acta Anaesthesiologica Scandinavica © 2012 The Acta Anaesthesiologica Scandinavica Foundation.

  10. Assessing the quality of radiographic processing in general dental practice.

    PubMed

    Thornley, P H; Stewardson, D A; Rout, P G J; Burke, F J T

    2006-05-13

    To determine if a commercial device (Vischeck) for monitoring film processing quality was a practical option in general dental practice, and to assess processing quality among a group of GDPs in the West Midlands with this device. Clinical evaluation. General dental practice, UK, 2004. Ten GDP volunteers from a practice based research group processed Vischeck strips (a) when chemicals were changed, (b) one week later, and (c) immediately before the next change of chemicals. These were compared with strips processed under ideal conditions. Additionally, a series of duplicate radiographs were produced and processed together with Vischeck strips in progressively more dilute developer solutions to compare the change in radiograph quality assessed clinically with that derived from the Vischeck. The Vischeck strips suggested that at the time chosen for change of processing chemicals, eight dentists had been processing films well beyond the point indicated for replacement. Solutions were changed after a wide range of time periods and number of films processed. The calibration of the Vischeck strip correlated closely to a clinical assessment of acceptable film quality. Vischeck strips are a useful aid to monitoring processing quality in automatic developers in general dental practice. Most of this group of GDPs were using chemicals beyond the point at which diagnostic yield would be affected.

  11. Assessing Faculty Experiences with and Perceptions of an Internal Quality Assurance Process for Undergraduate Distributed Learning Courses: A Pilot Study

    ERIC Educational Resources Information Center

    Rucker, Ryan; Edwards, Karen; Frass, Lydia R.

    2015-01-01

    To ensure that online courses match traditional classes' quality, some institutions are implementing internal standards for online course design and quality review. The University of South Carolina created the Distributed Learning Quality Review program, based on "Quality Matters'" standards. It was designed to be faculty-guided, as…

  12. Network-based production quality control

    NASA Astrophysics Data System (ADS)

    Kwon, Yongjin; Tseng, Bill; Chiou, Richard

    2007-09-01

    This study investigates the feasibility of remote quality control using a host of advanced automation equipment with Internet accessibility. Recent emphasis on product quality and reduction of waste stems from the dynamic, globalized and customer-driven market, which brings opportunities and threats to companies, depending on their response speed and production strategies. Current trends in industry also include the wide spread of distributed manufacturing systems, where design, production, and management facilities are geographically dispersed. This situation mandates not only accessibility to remotely located production equipment for monitoring and control, but also efficient means of responding to a changing environment to counter process variations and diverse customer demands. To compete under such an environment, companies are striving to achieve 100%, sensor-based, automated inspection for zero-defect manufacturing. In this study, the Internet-based quality control scheme is referred to as "E-Quality for Manufacturing", or "EQM" for short. By definition, EQM refers to a holistic approach to designing and embedding efficient quality control functions in the context of network-integrated manufacturing systems. Such a system lets designers located far away from the production facility monitor, control, and adjust the quality inspection processes as the production design evolves.

  13. A tutorial for developing a topical cream formulation based on the Quality by Design approach.

    PubMed

    Simões, Ana; Veiga, Francisco; Vitorino, Carla; Figueiras, Ana

    2018-06-20

    The pharmaceutical industry has entered in a new era, as there is a growing interest in increasing the quality standards of dosage forms, through the implementation of more structured development and manufacturing approaches. For many decades, the manufacturing of drug products was controlled by a regulatory framework to guarantee the quality of the final product through a fixed process and exhaustive testing. Limitations related to the Quality by Test (QbT) system have been widely acknowledged. The emergence of Quality by Design (QbD) as a systematic and risk-based approach introduced a new quality concept based on a good understanding of how raw materials and process parameters influence the final quality profile. Although the QbD system has been recognized as a revolutionary approach to product development and manufacturing, its full implementation in the pharmaceutical field is still limited. This is particularly evident in the case of semisolid complex formulation development. The present review aims at establishing a practical QbD framework to describe all stages comprised in the pharmaceutical development of a conventional cream in a comprehensible manner. Copyright © 2018. Published by Elsevier Inc.

  14. Applying Quality Management Process-Improvement Principles to Learning in Reading Courses: An Improved Learning and Retention Method.

    ERIC Educational Resources Information Center

    Hahn, William G.; Bart, Barbara D.

    2003-01-01

    Business students were taught a total quality management-based outlining process for course readings and a tally method to measure learning efficiency. Comparison of 233 who used the process and 99 who did not showed that the group means of users' test scores were 12.4 points higher than those of nonusers. (Contains 25 references.) (SK)

  15. A New Tool for Quality: The Internal Audit.

    PubMed

    Haycock, Camille; Schandl, Annette

    As health care systems aspire to improve the quality and value for the consumers they serve, quality outcomes must be at the forefront of this value equation. As organizations implement evidence-based practices, electronic records to standardize processes, and quality improvement initiatives, many tactics are deployed to accelerate improvement and care outcomes. This article describes how one organization utilized a formal clinical audit process to identify gaps and/or barriers that may be contributing to underperforming measures and outcomes. This partnership between quality and audit can be a powerful tool and produce insights that can be scaled across a large health care system.

  16. Research to Assembly Scheme for Satellite Deck Based on Robot Flexibility Control Principle

    NASA Astrophysics Data System (ADS)

    Guo, Tao; Hu, Ruiqin; Xiao, Zhengyi; Zhao, Jingjing; Fang, Zhikai

    2018-03-01

    Deck assembly is a critical quality control point in the final satellite assembly process, and cable extrusion and structural collision problems during assembly directly affect the quality and schedule of satellite development. Aimed at these problems, an assembly scheme for the satellite deck based on the robot flexibility control principle is proposed in this paper. The scheme is introduced first; key technologies of end-force perception and flexible docking control in the scheme are then studied; next, the implementation process of the assembly scheme for the satellite deck is described in detail; finally, an actual application case of the assembly scheme is given. The results show that, compared with the traditional assembly scheme, the assembly scheme based on the robot flexibility control principle has obvious advantages in work efficiency, reliability, and universality.

  17. 40 CFR 125.3 - Technology-based treatment requirements in permits.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... techniques; (v) Process changes; and (vi) Non-water quality environmental impact (including energy...-water quality environmental impact (including energy requirements). (3) For BAT requirements: (i) The... achieving such effluent reduction; and (vi) Non-water quality environmental impact (including energy...

  18. 40 CFR 125.3 - Technology-based treatment requirements in permits.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... techniques; (v) Process changes; and (vi) Non-water quality environmental impact (including energy...-water quality environmental impact (including energy requirements). (3) For BAT requirements: (i) The... achieving such effluent reduction; and (vi) Non-water quality environmental impact (including energy...

  19. 40 CFR 125.3 - Technology-based treatment requirements in permits.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... techniques; (v) Process changes; and (vi) Non-water quality environmental impact (including energy...-water quality environmental impact (including energy requirements). (3) For BAT requirements: (i) The... achieving such effluent reduction; and (vi) Non-water quality environmental impact (including energy...

  20. 40 CFR 125.3 - Technology-based treatment requirements in permits.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... techniques; (v) Process changes; and (vi) Non-water quality environmental impact (including energy...-water quality environmental impact (including energy requirements). (3) For BAT requirements: (i) The... achieving such effluent reduction; and (vi) Non-water quality environmental impact (including energy...

  1. Influence of raw milk quality on processed dairy products: How do raw milk quality test results relate to product quality and yield?

    PubMed

    Murphy, Steven C; Martin, Nicole H; Barbano, David M; Wiedmann, Martin

    2016-12-01

    This article provides an overview of the influence of raw milk quality on the quality of processed dairy products and offers a perspective on the merits of investing in quality. Dairy farmers are frequently offered monetary premium incentives to provide high-quality milk to processors. These incentives are most often based on raw milk somatic cell and bacteria count levels well below the regulatory public health-based limits. Justification for these incentive payments can be based on improved processed product quality and manufacturing efficiencies that provide the processor with a return on their investment for high-quality raw milk. In some cases, this return on investment is difficult to measure. Raw milks with high levels of somatic cells and bacteria are associated with increased enzyme activity that can result in product defects. Use of raw milk with somatic cell counts >100,000 cells/mL has been shown to reduce cheese yields, and higher levels, generally >400,000 cells/mL, have been associated with textural and flavor defects in cheese and other products. Although most research indicates that fairly high total bacteria counts (>1,000,000 cfu/mL) in raw milk are needed to cause defects in most processed dairy products, receiving high-quality milk from the farm allows some flexibility for handling raw milk, which can increase efficiencies and reduce the risk of raw milk reaching bacterial levels of concern. Monitoring total bacterial numbers in regard to raw milk quality is imperative, but determining the levels of specific types of bacteria present has gained increasing importance. For example, spores of certain spore-forming bacteria present in raw milk at very low levels (e.g., <1/mL) can survive pasteurization and grow in milk and cheese products to levels that result in defects. With the exception of meeting product specifications often required for milk powders, testing for specific spore-forming groups is currently not used in quality incentive programs in the United States but is used in other countries (e.g., the Netherlands). Copyright © 2016 American Dairy Science Association. Published by Elsevier Inc. All rights reserved.
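The thresholds cited in this review can be read as a simple screening rule. The sketch below is purely illustrative (the rule structure is an assumption, not an industry standard); only the numeric cut-offs come from the text above.

```python
# Hedged sketch: screen a raw milk load against the quality thresholds
# cited in the review (SCC and total bacteria count).

def screen_milk(scc_per_ml, bacteria_cfu_per_ml):
    """Return quality notes for a raw milk load based on the cited thresholds."""
    notes = []
    if scc_per_ml > 400_000:
        notes.append("SCC: texture/flavor defect risk")      # >400,000 cells/mL
    elif scc_per_ml > 100_000:
        notes.append("SCC: reduced cheese yield")            # >100,000 cells/mL
    if bacteria_cfu_per_ml > 1_000_000:
        notes.append("bacteria: defect risk in processed products")
    return notes or ["within cited quality thresholds"]

print(screen_milk(150_000, 20_000))     # → ['SCC: reduced cheese yield']
print(screen_milk(500_000, 2_000_000))  # both thresholds exceeded
```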

  2. Transition from in-hospital ventilation to home ventilation: process description and quality indicators

    PubMed Central

    Kastrup, Marc; Tittmann, Benjamin; Sawatzki, Tanja; Gersch, Martin; Vogt, Charlotte; Rosenthal, Max; Rosseau, Simone; Spies, Claudia

    2017-01-01

    The current demographic development of our society results in an increasing number of elderly patients with chronic diseases being treated in the intensive care unit. A possible long-term consequence of such a treatment is that patients remain dependent on certain invasive organ support systems, such as long-term ventilator dependency. The main goal of this project is to define the transition process between in-hospital and out of hospital (ambulatory) ventilator support. A further goal is to identify evidence-based quality indicators to help define and describe this process. This project describes an ideal sequence of processes (process chain), based on the current evidence from the literature. Besides the process chain, key data and quality indicators were described in detail. Due to the limited project timeline, these indicators were not extensively tested in the clinical environment. The results of this project may serve as a solid basis for proof of feasibility and proof of concept investigations, optimize the transition process of ventilator-dependent patients from a clinical to an ambulatory setting, as well as reduce the rate of emergency re-admissions. PMID:29308061

  3. A simulation-based approach for estimating premining water quality: Red Mountain Creek, Colorado

    USGS Publications Warehouse

    Runkel, Robert L.; Kimball, Briant A; Walton-Day, Katherine; Verplanck, Philip L.

    2007-01-01

    Regulatory agencies are often charged with the task of setting site-specific numeric water quality standards for impaired streams. This task is particularly difficult for streams draining highly mineralized watersheds with past mining activity. Baseline water quality data obtained prior to mining are often non-existent, and application of generic water quality standards developed for unmineralized watersheds is suspect given the geology of most watersheds affected by mining. Various approaches have been used to estimate premining conditions, but none of the existing approaches rigorously consider the physical and geochemical processes that ultimately determine instream water quality. An approach based on simulation modeling is therefore proposed herein. The approach utilizes synoptic data that provide spatially-detailed profiles of concentration, streamflow, and constituent load along the study reach. This field data set is used to calibrate a reactive stream transport model that considers the suite of physical and geochemical processes that affect constituent concentrations during instream transport. A key input to the model is the quality and quantity of waters entering the study reach. This input is based on chemical analyses available from synoptic sampling and observed increases in streamflow along the study reach. Given the calibrated model, additional simulations are conducted to estimate premining conditions. In these simulations, the chemistry of mining-affected sources is replaced with the chemistry of waters that are thought to be unaffected by mining (proximal, premining analogues). The resultant simulations provide estimates of premining water quality that reflect both the reduced loads that were present prior to mining and the processes that affect these loads as they are transported downstream. This simulation-based approach is demonstrated using data from Red Mountain Creek, Colorado, a small stream draining a heavily-mined watershed. Model application to the premining problem for Red Mountain Creek is based on limited field reconnaissance and chemical analyses; additional field work and analyses may be needed to develop definitive, quantitative estimates of premining water quality.
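The substitution step at the heart of the approach, replacing mining-affected inflow chemistry with premining analogues, can be illustrated with a conservative (non-reactive) mixing calculation. The flows and concentrations below are hypothetical, and the study itself uses a calibrated reactive transport model rather than this simple load balance.

```python
# Illustrative load-balance sketch of the premining-substitution idea
# (ignores instream reactions; purely conservative mixing).

def downstream_conc(upstream_q, upstream_c, inflows):
    """Flow-weighted concentration below a set of inflows.

    upstream_q: streamflow (L/s); upstream_c: concentration (mg/L);
    inflows: list of (flow L/s, concentration mg/L) tuples.
    """
    load = upstream_q * upstream_c + sum(q * c for q, c in inflows)
    flow = upstream_q + sum(q for q, _ in inflows)
    return load / flow

upstream = (50.0, 0.05)              # hypothetical upstream flow and Zn concentration
inflows = [(5.0, 12.0), (3.0, 0.4)]  # first inflow is mining-affected

observed = downstream_conc(*upstream, inflows)
# Replace the mining-affected inflow chemistry with a premining analogue:
premining = downstream_conc(*upstream, [(5.0, 0.6), (3.0, 0.4)])
print(round(observed, 3), round(premining, 3))  # → 1.098 0.116
```

The calibrated reactive model plays the same role as `downstream_conc` here, but additionally simulates the sorption, precipitation, and pH-dependent processes that act on the loads during transport.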

  4. Can state-of-the-art HVS-based objective image quality criteria be used for image reconstruction techniques based on ROI analysis?

    NASA Astrophysics Data System (ADS)

    Dostal, P.; Krasula, L.; Klima, M.

    2012-06-01

    Various image processing techniques in multimedia technology are optimized using the visual attention feature of the human visual system. Spatial non-uniformity means that different locations in an image are of different importance in terms of perception of the image. In other words, the perceived image quality depends mainly on the quality of important locations known as regions of interest. The performance of such techniques is measured by subjective evaluation or objective image quality criteria. Many state-of-the-art objective metrics are based on HVS properties: SSIM and MS-SSIM based on image structural information, VIF based on the information that the human brain can ideally gain from the reference image, or FSIM utilizing low-level features to assign a different importance to each location in the image. But still none of these objective metrics utilize the analysis of regions of interest. We address the question of whether these objective metrics can be used for effective evaluation of images reconstructed by processing techniques based on ROI analysis utilizing high-level features. In this paper the authors show that the state-of-the-art objective metrics do not correlate well with subjective evaluation when demosaicing based on ROI analysis is used for reconstruction. The ROI were computed from "ground truth" visual attention data. An algorithm combining two known demosaicing techniques on the basis of ROI location is proposed to reconstruct the ROI in fine quality while the rest of the image is reconstructed with low quality. The color image reconstructed by this ROI approach was compared with selected demosaicing techniques by objective criteria and subjective testing. The qualitative comparison of the objective and subjective results indicates that the state-of-the-art objective metrics are still not suitable for evaluating image processing techniques based on ROI analysis, and new criteria are needed.
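A minimal, single-window variant of SSIM illustrates the structure of such HVS-based metrics. Note this is a sketch: the standard SSIM averages local statistics over a sliding Gaussian window, whereas this version uses global image statistics only.

```python
import numpy as np

def global_ssim(x, y, L=255.0):
    """Simplified single-window SSIM using global statistics.

    The standard metric computes the same luminance/contrast/structure
    comparison per local window and averages; this sketch uses one window.
    """
    C1, C2 = (0.01 * L) ** 2, (0.03 * L) ** 2          # stabilizing constants
    mx, my = x.mean(), y.mean()
    vx, vy = x.var(), y.var()
    cxy = ((x - mx) * (y - my)).mean()                 # covariance
    return ((2 * mx * my + C1) * (2 * cxy + C2)) / \
           ((mx ** 2 + my ** 2 + C1) * (vx + vy + C2))

rng = np.random.default_rng(2)
ref = rng.uniform(0, 255, size=(64, 64))               # synthetic reference image
noisy = np.clip(ref + rng.normal(0, 10, ref.shape), 0, 255)

print(round(global_ssim(ref, ref), 3))    # identical images -> 1.0
print(round(global_ssim(ref, noisy), 3))  # degraded image -> below 1.0
```

An ROI-weighted variant, as the paper argues for, would compute such local scores and weight them by a visual attention map rather than averaging uniformly.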

  5. Addressing and Presenting Quality of Satellite Data via Web-Based Services

    NASA Technical Reports Server (NTRS)

    Leptoukh, Gregory; Lynnes, C.; Ahmad, S.; Fox, P.; Zednik, S.; West, P.

    2011-01-01

    With the recent attention to climate change and the proliferation of remote-sensing data utilization, climate modeling and various environmental monitoring and protection applications have begun to rely increasingly on satellite measurements. Research application users seek good quality satellite data, with uncertainties and biases provided for each data point. However, different communities address remote sensing quality issues rather inconsistently and differently. We describe our attempt to systematically characterize, capture, and provision quality and uncertainty information as it applies to the NASA MODIS Aerosol Optical Depth data product. In particular, we note the semantic differences in quality/bias/uncertainty at the pixel, granule, product, and record levels. We outline the various factors contributing to the uncertainty or error budget. Web-based science analysis and processing tools allow users to access, analyze, and generate visualizations of data while relieving users of directly managing complex data processing operations. These tools provide value by streamlining the data analysis process, but usually shield users from details of the data processing steps, algorithm assumptions, caveats, etc. Correct interpretation of the final analysis requires user understanding of how the data have been generated and processed and what potential biases, anomalies, or errors may have been introduced. By providing services that leverage data lineage provenance and domain expertise, expert systems can be built to aid the user in understanding data sources, processing, and the suitability for use of products generated by the tools. We describe our experiences developing a semantic, provenance-aware, expert-knowledge advisory system applied to the NASA Giovanni web-based Earth science data analysis tool as part of the ESTO AIST-funded Multi-sensor Data Synergy Advisor project.

  6. [Research advances in secondary development of Chinese patent medicines based on quality by design concept].

    PubMed

    Gong, Xing-Chu; Chen, Teng; Qu, Hai-Bin

    2017-03-01

    The quality by design (QbD) concept is an advanced pharmaceutical quality control concept. The application of the QbD concept in the research and development of pharmaceutical processes of traditional Chinese medicines (TCM) mainly contains five parts: the definition of critical processes and their evaluation criteria, the determination of critical process parameters and critical material attributes, the establishment of quantitative models, the development of design space, and the application and continuous improvement of the control strategy. In this work, recent research advances in QbD concept implementation methods in the secondary development of Chinese patent medicines were reviewed, and five promising fields for the implementation of the QbD concept were pointed out: the research and development of TCM new drugs and Chinese medicine granules for formulation, modeling of pharmaceutical processes, development of control strategies based on industrial big data, strengthening research on process scale-up rules, and the development of new pharmaceutical equipment. Copyright© by the Chinese Pharmaceutical Association.

  7. Development and Validation of an Index to Measure the Quality of Facility-Based Labor and Delivery Care Processes in Sub-Saharan Africa

    PubMed Central

    Tripathi, Vandana; Stanton, Cynthia; Strobino, Donna; Bartlett, Linda

    2015-01-01

    Background High quality care is crucial in ensuring that women and newborns receive interventions that may prevent and treat birth-related complications. As facility deliveries increase in developing countries, there are concerns about service quality. Observation is the gold standard for clinical quality assessment, but existing observation-based measures of obstetric quality of care are lengthy and difficult to administer. There is a lack of consensus on quality indicators for routine intrapartum and immediate postpartum care, including essential newborn care. This study identified key dimensions of the quality of the process of intrapartum and immediate postpartum care (QoPIIPC) in facility deliveries and developed a quality assessment measure representing these dimensions. Methods and Findings Global maternal and neonatal care experts identified key dimensions of QoPIIPC through a modified Delphi process. Experts also rated indicators of these dimensions from a comprehensive delivery observation checklist used in quality surveys in sub-Saharan African countries. Potential QoPIIPC indices were developed from combinations of highly-rated indicators. Face, content, and criterion validation of these indices was conducted using data from observations of 1,145 deliveries in Kenya, Madagascar, and Tanzania (including Zanzibar). A best-performing index was selected, composed of 20 indicators of intrapartum/immediate postpartum care, including essential newborn care. This index represented most dimensions of QoPIIPC and effectively discriminated between poorly and well-performed deliveries. Conclusions As facility deliveries increase and the global community pays greater attention to the role of care quality in achieving further maternal and newborn mortality reduction, the QoPIIPC index may be a valuable measure. This index complements and addresses gaps in currently used quality assessment tools. Further evaluation of index usability and reliability is needed. 
The availability of a streamlined, comprehensive, and validated index may enable ongoing and efficient observation-based assessment of care quality during labor and delivery in sub-Saharan Africa, facilitating targeted quality improvement. PMID:26107655
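
    An observation-based index like the QoPIIPC one is, arithmetically, a count or share of care indicators performed during a delivery. The sketch below shows that scoring step only; the indicator names are invented placeholders, not the validated 20-item set.

```python
# Sketch: scoring an observation-based quality index as the share of performed
# care indicators, in the spirit of the 20-item QoPIIPC index. Indicator names
# are illustrative placeholders, not the validated item set.

def index_score(observed: dict) -> float:
    """Fraction of indicators performed during one observed delivery."""
    return sum(observed.values()) / len(observed)

delivery = {
    "hand_hygiene_before_exam":     True,
    "fetal_heart_rate_monitored":   True,
    "uterotonic_given_after_birth": False,
    "newborn_dried_immediately":    True,
}
score = index_score(delivery)  # 3 of 4 indicators performed -> 0.75
```

    Scores computed this way can then be compared across facilities or over time to discriminate between poorly and well-performed deliveries.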

  8. A Model of Process-Based Automation: Cost and Quality Implications in the Medication Management Process

    ERIC Educational Resources Information Center

    Spaulding, Trent Joseph

    2011-01-01

    The objective of this research is to understand how a set of systems, as defined by the business process, creates value. The three studies contained in this work develop the model of process-based automation. The model states that complementarities among systems are specified by handoffs in the business process. The model also provides theory to…

  9. Forging a novel provider and payer partnership in Wisconsin to compensate pharmacists for quality-driven pharmacy and medication therapy management services.

    PubMed

    Trapskin, Kari; Johnson, Curtis; Cory, Patrick; Sorum, Sarah; Decker, Chris

    2009-01-01

    To describe the Wisconsin Pharmacy Quality Collaborative (WPQC), a quality-based network of pharmacies and payers with the common goals of improving medication use and safety, reducing health care costs for payers and patients, and increasing professional recognition and compensation for pharmacist-provided services. Wisconsin between 2006 and 2009. Community (independent, chain, and health-system) pharmacies and private and public health care payers/purchasers with support from the McKesson Corporation. This initiative aligns incentives for pharmacies and payers through implementation of 12 quality-based pharmacy requirements as conditions of pharmacy participation in a practice-advancement pilot. Payers compensate network pharmacies that meet the quality-based requirements for two levels of pharmacy professional services (level 1, intervention-based services; level 2, comprehensive medication review and assessment services). The pilot project is designed to measure the following outcomes: medication-use quality improvements, frequency and types of services provided, drug therapy problems, patient safety, cost savings, identification of factors that facilitate pharmacist participation, and patient satisfaction. The Pharmacy Society of Wisconsin created the WPQC network, which consists of 53 pharmacies, 106 trained pharmacists, 45 student pharmacists, 6 pharmacy technicians, and 2 initial payers. A quality assurance process is followed approximately quarterly to audit the 12 network quality requirements. An evaluation of this collaboration is being conducted. This program demonstrates that collaboration among payers and pharmacists is possible and can result in the development of an incentive-aligned program that stresses quality patient care, standardized services, and professional service compensation for pharmacists. 
This combination of a quality-based credentialing process with a professional services reimbursement schedule is unique and has the promise to enhance the ambulatory pharmacy practice model.

  10. British Thoracic Society quality standards for the investigation and management of pulmonary nodules.

    PubMed

    Baldwin, David; Callister, Matthew; Akram, Ahsan; Cane, Paul; Draffan, Jeanette; Franks, Kevin; Gleeson, Fergus; Graham, Richard; Malhotra, Puneet; Pearson, Philip; Subesinghe, Manil; Waller, David; Woolhouse, Ian

    2018-01-01

    The purpose of the quality standards document is to provide healthcare professionals, commissioners, service providers and patients with a guide to standards of care that should be met for the investigation and management of pulmonary nodules in the UK, together with measurable markers of good practice. Development of British Thoracic Society (BTS) Quality Standards follows the BTS process of quality standard production based on the National Institute for Health and Care Excellence process manual for the development of quality standards. 7 quality statements have been developed, each describing a key marker of high-quality, cost-effective care for the investigation and management of pulmonary nodules, and each statement is supported by quality measures that aim to improve the structure, process and outcomes of healthcare. BTS Quality Standards for the investigation and management of pulmonary nodules form a key part of the range of supporting materials that the Society produces to assist in the dissemination and implementation of guideline recommendations.

  11. British Thoracic Society quality standards for the investigation and management of pulmonary nodules

    PubMed Central

    Baldwin, David; Callister, Matthew; Akram, Ahsan; Cane, Paul; Draffan, Jeanette; Franks, Kevin; Gleeson, Fergus; Graham, Richard; Malhotra, Puneet; Pearson, Philip; Subesinghe, Manil; Waller, David; Woolhouse, Ian

    2018-01-01

    Introduction The purpose of the quality standards document is to provide healthcare professionals, commissioners, service providers and patients with a guide to standards of care that should be met for the investigation and management of pulmonary nodules in the UK, together with measurable markers of good practice. Methods Development of British Thoracic Society (BTS) Quality Standards follows the BTS process of quality standard production based on the National Institute for Health and Care Excellence process manual for the development of quality standards. Results 7 quality statements have been developed, each describing a key marker of high-quality, cost-effective care for the investigation and management of pulmonary nodules, and each statement is supported by quality measures that aim to improve the structure, process and outcomes of healthcare. Discussion BTS Quality Standards for the investigation and management of pulmonary nodules form a key part of the range of supporting materials that the Society produces to assist in the dissemination and implementation of guideline recommendations. PMID:29682290

  12. On-line welding quality inspection system for steel pipe based on machine vision

    NASA Astrophysics Data System (ADS)

    Yang, Yang

    2017-05-01

    In recent years, high-frequency welding has been widely used in production because of its simplicity, reliability, and high quality. In the production process, effectively controlling weld penetration so as to ensure full penetration and a uniform weld, and thereby guarantee welding quality, is a key problem at the present stage and an important research topic in the field of welding technology. In this paper, based on a study of existing welding inspection methods, an on-line welding quality inspection system for steel pipe based on machine vision is designed.
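
    One elementary check such a vision system can make is weld-seam width uniformity: binarize a camera frame, measure the seam width per image row, and flag frames whose width varies beyond a tolerance. The sketch below assumes this check and its tolerance value; neither is specified in the paper.

```python
# Sketch of a weld-uniformity check a machine-vision system might run.
# The binary image and the 20% relative tolerance are illustrative
# assumptions, not values from the paper.

def seam_widths(binary_image):
    """Width of the weld seam (count of 1-pixels) in each image row."""
    return [sum(row) for row in binary_image]

def is_uniform(binary_image, rel_tol=0.2):
    """True if every row's seam width is within rel_tol of the mean width."""
    widths = seam_widths(binary_image)
    mean_w = sum(widths) / len(widths)
    return all(abs(w - mean_w) <= rel_tol * mean_w for w in widths)

frame = [
    [0, 1, 1, 1, 0],
    [0, 1, 1, 1, 0],
    [0, 1, 1, 1, 0],
]
# is_uniform(frame) is True; a row with a much narrower seam would fail.
```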

  13. Analysis of batch-related influences on injection molding processes viewed in the context of electro plating quality demands

    NASA Astrophysics Data System (ADS)

    Siepmann, Jens P.; Wortberg, Johannes; Heinzler, Felix A.

    2016-03-01

    The injection molding process is inherently influenced by the viscosity of the material, and the viscosity of the polymer changes from one material batch to another. Together with the processing parameters, the initial conditions of the material define the process and product quality. A high percentage of the technical polymers processed by injection molding are refined in a follow-up production step, for example electroplating. Processing optimized for electroplating often requires avoiding high shear stresses by using low injection speeds and pressures; differences between the material charges' viscosities therefore matter especially in the quality-relevant low-shear-rate region. These differences and their quality-related influences can be investigated by detailed rheological analysis and by process simulation based on adapted material models. Differences in viscosity between batches can be detected by measurements with high-pressure capillary rheometers or, for low shear rates, oscillatory rheometers; the two measurement techniques can be combined via the Cox-Merz relation. The detected differences in the rheological behavior of the two charges are summarized in two material-describing model approaches and added to the simulation. In this paper, the results of processing simulations with standard filling parameters are presented for two ABS charges. Quality-defining quantities such as temperature, pressure, and shear stress are investigated, and the influence of charge variations is pointed out with respect to electroplating quality demands. Furthermore, the results of simulations with a new quality-related process control are presented and compared to standard processing.
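
    A batch-to-batch viscosity difference can be made concrete with a shear-thinning viscosity model. The sketch below uses the Cross model as a stand-in (the paper does not name its model approaches), with invented parameters; in practice these would be fitted to capillary- and oscillatory-rheometer data merged via the Cox-Merz rule, which identifies the steady-shear viscosity at shear rate γ̇ with the complex viscosity magnitude at angular frequency ω = γ̇.

```python
# Sketch: comparing two material batches with a Cross viscosity model,
#   eta(gamma) = eta_inf + (eta_0 - eta_inf) / (1 + (lambda * gamma)**m).
# All parameter values are invented for illustration; real ones come from
# rheometer measurements (merged via the Cox-Merz rule).

def cross_viscosity(gamma, eta_0, eta_inf, lam, m):
    """Cross-model viscosity in Pa*s at shear rate gamma in 1/s."""
    return eta_inf + (eta_0 - eta_inf) / (1.0 + (lam * gamma) ** m)

batch_a = dict(eta_0=3000.0, eta_inf=10.0, lam=0.05, m=0.7)  # nominal batch
batch_b = dict(eta_0=3600.0, eta_inf=10.0, lam=0.05, m=0.7)  # stiffer batch

low_shear = 1.0  # 1/s -- the quality-relevant low-shear region
diff = cross_viscosity(low_shear, **batch_b) - cross_viscosity(low_shear, **batch_a)
# diff > 0: batch B flows worse at low shear, which matters for plating quality.
```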

  14. Experimental Study in Taguchi Method on Surface Quality Predication of HSM

    NASA Astrophysics Data System (ADS)

    Ji, Yan; Li, Yueen

    2018-05-01

    Studies of the ball-end milling mechanism and of machined-surface formation show that the formation of a high-speed ball-end-milled surface is a time-varying, cumulative thermo-mechanical coupling process. The nature of the problem is that uneven stress and temperature fields affect the machined surface, and the processing parameters interact through the elastic recovery and plastic deformation produced in the elastic-plastic material. The machined surface quality is characterized by a multivariable nonlinear system, so experiments remain an indispensable and effective method for studying the surface quality of high-speed ball milling.
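
    In a Taguchi experiment on surface quality, each parameter level is usually compared through a signal-to-noise ratio; for a response like surface roughness Ra, the "smaller-the-better" form applies. The sketch below shows that calculation with made-up measurements, not data from this study.

```python
# Sketch: Taguchi "smaller-the-better" signal-to-noise (S/N) ratio, the usual
# figure of merit when surface roughness Ra is the response. The Ra values
# are invented example measurements, not data from the study.
import math

def sn_smaller_the_better(values):
    """S/N = -10 * log10(mean of squared responses); larger S/N is better."""
    return -10.0 * math.log10(sum(v * v for v in values) / len(values))

trial_low_speed  = [0.82, 0.85, 0.80]  # Ra in micrometres, 3 repeats
trial_high_speed = [0.41, 0.44, 0.40]

# The parameter level with the larger S/N ratio yields the better surface.
better_is_high = (sn_smaller_the_better(trial_high_speed)
                  > sn_smaller_the_better(trial_low_speed))
```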

  15. Real-time parameter optimization based on neural network for smart injection molding

    NASA Astrophysics Data System (ADS)

    Lee, H.; Liau, Y.; Ryu, K.

    2018-03-01

    The manufacturing industry has been facing several challenges, including the sustainability, performance, and quality of production. Manufacturers attempt to enhance their competitiveness by implementing CPS (Cyber-Physical Systems) at the manufacturing-process level through the convergence of IoT (Internet of Things) and ICT (Information & Communication Technology). The injection molding process has a short cycle time and high productivity, features that make it suitable for mass production. In addition, this process is used to produce precise parts in various industries such as automobiles, optics, and medical devices. Injection molding involves a mixture of discrete and continuous variables, and the variables generated in the process must be considered in order to optimize quality. Furthermore, finding the parameter settings that give the optimum product quality is time-consuming, since process parameters cannot easily be corrected during process execution. In this research, we propose a neural-network-based real-time process parameter optimization methodology that sets optimal process parameters by using mold data, molding machine data, and response data. This paper is expected to make an academic contribution as a novel study of parameter optimization during production, in contrast to the pre-production parameter optimization of typical studies.
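
    The closed loop the paper proposes amounts to: predict part quality from candidate parameter settings with a trained model, pick the setting with the best prediction, and apply it to the next cycle. The sketch below substitutes a simple quadratic surrogate for the neural network; the parameter names, ranges, and coefficients are all illustrative assumptions.

```python
# Sketch of model-based parameter selection for the next molding cycle.
# A quadratic stand-in replaces the trained neural network; coefficients
# and parameter ranges are invented for illustration.

def predicted_defect_rate(melt_temp, hold_pressure):
    """Stand-in surrogate model with its minimum near (230 C, 60 MPa)."""
    return ((melt_temp - 230.0) ** 2 / 100.0
            + (hold_pressure - 60.0) ** 2 / 25.0)

def optimize(temps, pressures):
    """Grid-search the candidate settings for the lowest predicted defect rate."""
    return min(((t, p) for t in temps for p in pressures),
               key=lambda tp: predicted_defect_rate(*tp))

setting = optimize(temps=range(210, 251, 5), pressures=range(40, 81, 5))
# -> (230, 60) for this surrogate
```

    With a real neural network in place of the surrogate, the same loop would re-run after each cycle as new mold and machine data arrive.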

  16. Improving tablet coating robustness by selecting critical process parameters from retrospective data.

    PubMed

    Galí, A; García-Montoya, E; Ascaso, M; Pérez-Lozano, P; Ticó, J R; Miñarro, M; Suñé-Negre, J M

    2016-09-01

    Although tablet coating processes are widely used in the pharmaceutical industry, they often lack adequate robustness. Up-scaling can be challenging as minor changes in parameters can lead to varying quality results. To select critical process parameters (CPP) using retrospective data of a commercial product and to establish a design of experiments (DoE) that would improve the robustness of the coating process. A retrospective analysis of data from 36 commercial batches. Batches were selected based on the quality results generated during batch release, some of which revealed quality deviations concerning the appearance of the coated tablets. The product is already marketed and belongs to the portfolio of a multinational pharmaceutical company. The Statgraphics 5.1 software was used for data processing to determine critical process parameters in order to propose new working ranges. This study confirms that it is possible to determine the critical process parameters and create design spaces based on retrospective data of commercial batches. This type of analysis is thus converted into a tool to optimize the robustness of existing processes. Our results show that a design space can be established with minimum investment in experiments, since current commercial batch data are processed statistically.
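
    One simple way to shortlist critical process parameters from retrospective batch data, before a formal DoE, is to rank candidate parameters by the strength of their correlation with the quality response. The sketch below is not the study's Statgraphics analysis; all data are invented.

```python
# Sketch: ranking candidate coating parameters by |Pearson r| against a
# quality response across retrospective batches. Parameter names and data
# are invented; the study's actual statistical analysis used Statgraphics.

def pearson_r(xs, ys):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs) ** 0.5
    vy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (vx * vy)

batches = {
    "spray_rate": [10, 12, 11, 14, 13, 15],   # g/min per batch
    "inlet_temp": [60, 61, 60, 59, 61, 60],   # deg C per batch
}
defect_score = [2, 4, 3, 7, 6, 8]  # appearance defects per batch

ranked = sorted(batches, key=lambda p: -abs(pearson_r(batches[p], defect_score)))
# spray_rate tracks the defects closely; inlet_temp barely varies.
```

    Parameters that top such a ranking are the natural candidates for the subsequent DoE and design-space work.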

  17. Is non-linear frequency compression amplification beneficial to adults and children with hearing loss? A systematic review.

    PubMed

    Akinseye, Gladys Atinuke; Dickinson, Ann-Marie; Munro, Kevin J

    2018-04-01

    To conduct a systematic review of the benefits of non-linear frequency compression (NLFC) in adults and children. Ten databases were searched for studies comparing the effects of NLFC and conventional processing (CP) for the period January 2008 to September 2017. Twelve articles were included in this review: four with adults and school-aged children only, one with pre-school children only, and three with both adults and school-aged children. A two-stage process was implemented to grade the evidence. The individual studies were graded based on their study type (from 1 = the highest quality of evidence to 5 = the lowest quality) and then sub-graded based on their quality ("a" for "good quality" or "b" for "lesser quality"). All studies were awarded 4a, except the single pre-school study, which was awarded 2a. The overall evidence for each population was graded based on the quality, quantity, and consistency of the studies. The body of evidence was rated as very low for both adults and school-aged children, but high for pre-school children. The low number (and quality) of studies means that the evidence supporting benefit from NLFC is inconclusive. Further high-quality RCTs are required to provide a conclusive answer to this question.

  18. First Processing Steps and the Quality of Wild and Farmed Fish

    PubMed Central

    Borderías, Antonio J; Sánchez-Alonso, Isabel

    2011-01-01

    First processing steps of fish are species-dependent and have common practices for wild and for farmed fish. Fish farming does, however, have certain advantages over traditional fisheries in that the processor can influence postmortem biochemistry and various quality parameters. This review summarizes information about the primary processing of fish based on the influence of catching, slaughtering, bleeding, gutting, washing, and filleting. Recommendations are given for the correct primary processing of fish. PMID:21535702

  19. Meteorological Processes Affecting Air Quality – Research and Model Development Needs

    EPA Science Inventory

    Meteorology modeling is an important component of air quality modeling systems that defines the physical and dynamical environment for atmospheric chemistry. The meteorology models used for air quality applications are based on numerical weather prediction models that were devel...

  20. An assessment model for quality management

    NASA Astrophysics Data System (ADS)

    Völcker, Chr.; Cass, A.; Dorling, A.; Zilioli, P.; Secchi, P.

    2002-07-01

    SYNSPACE, together with InterSPICE and Alenia Spazio, is developing an assessment method to determine the capability of an organisation in the area of quality management. The method, sponsored by the European Space Agency (ESA), is called S9kS (SPiCE-9000 for SPACE). S9kS is based on ISO 9001:2000 with additions from the quality standards issued by the European Committee for Space Standardization (ECSS) and ISO 15504 (Process Assessment). The result is a reference model that supports the expansion of the generic process assessment framework provided by ISO 15504 to non-software areas. In order to be compliant with ISO 15504, requirements from ISO 9001 and ECSS-Q-20 and Q-20-09 have been turned into process definitions in terms of Purpose and Outcomes, supported by a list of detailed indicators such as Practices, Work Products and Work Product Characteristics. In coordination with this project, the capability dimension of ISO 15504 has been revised to be consistent with ISO 9001. As the contributions from ISO 9001 and the space quality assurance standards are separable, the stripped-down version S9k offers organisations in all industries an assessment model based solely on ISO 9001, and is therefore of interest to any organisation that intends to improve its quality management system based on ISO 9001.

  1. [Quality assessment in anesthesia].

    PubMed

    Kupperwasser, B

    1996-01-01

    Quality assessment (assurance/improvement) is the set of methods used to measure and improve the delivered care and the department's performance against pre-established criteria or standards. The four stages of the self-maintained quality assessment cycle are: problem identification, problem analysis, problem correction and evaluation of corrective actions. Quality assessment is a measurable entity for which it is necessary to define and calibrate measurement parameters (indicators) from available data gathered from the hospital anaesthesia environment. Problem identification comes from the accumulation of indicators. There are four types of quality indicators: structure, process, outcome and sentinel indicators. The latter signal a quality defect, are independent of outcomes, are easier to analyse by statistical methods and are closely related to processes and the main targets of quality improvement. The three types of methods used to analyse the problems (indicators) are: peer review, quantitative methods and risk management techniques. Peer review is performed by qualified anaesthesiologists. To improve its validity, the review process should be made explicit and conclusions should be based on standards of practice and literature references. The quantitative methods are statistical analyses applied to the collected data and presented in a graphic format (histogram, Pareto diagram, control charts). The risk management techniques include: a) critical incident analysis, establishing an objective relationship between a 'critical' event and the associated human behaviours; b) system accident analysis, based on the fact that accidents continue to occur despite safety systems and sophisticated technologies, which examines all the process components leading to the unpredictable outcome, not just the human factors; c) cause-effect diagrams, which facilitate problem analysis by reducing its causes to four fundamental components (persons, regulations, equipment, process). 
Definition and implementation of corrective measures, based on the findings of the two previous stages, form the third step of the evaluation cycle. The Hawthorne effect is an improvement in outcomes occurring before the implementation of any corrective actions. Verification of the implemented actions is the final and mandatory step closing the evaluation cycle.
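
    The control charts mentioned among the quantitative methods reduce to simple arithmetic: an individuals chart sets control limits at the baseline mean ± 3 standard deviations and flags indicator values outside them. The sketch below uses invented monthly counts of a sentinel indicator.

```python
# Sketch: Shewhart-style individuals control chart for a quality indicator.
# Limits are mean +/- 3 sigma over a baseline period; the monthly counts
# are invented for illustration.

def control_limits(values):
    """Return (lower limit, centre line, upper limit) for an individuals chart."""
    n = len(values)
    mean = sum(values) / n
    sd = (sum((v - mean) ** 2 for v in values) / n) ** 0.5
    return mean - 3 * sd, mean, mean + 3 * sd

monthly_incidents = [2, 3, 2, 4, 3, 2, 3, 12]  # last month looks aberrant
lcl, centre, ucl = control_limits(monthly_incidents[:-1])  # baseline period
out_of_control = monthly_incidents[-1] > ucl   # signals a problem to analyse
```

    A point beyond the upper limit is exactly the kind of accumulated-indicator signal that feeds the problem-identification stage of the cycle.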

  2. A Preliminary Investigation of the Empirical Validity of Study Quality Appraisal

    ERIC Educational Resources Information Center

    Cook, Bryan G.; Dupuis, Danielle N.; Jitendra, Asha K.

    2017-01-01

    When classifying the evidence base of practices, special education scholars typically appraise study quality to identify and exclude from consideration in their reviews unacceptable-quality studies that are likely biased and might bias review findings if included. However, study quality appraisals used in the process of identifying evidence-based…

  3. CNV-based genome wide association study reveals additional variants contributing to meat quality in swine

    USDA-ARS?s Scientific Manuscript database

    Pork quality is important both to the meat processing industry and consumers’ purchasing attitudes. Copy number variation (CNV) is a burgeoning kind of variant that may influence meat quality. Herein, a genome-wide association study (GWAS) was performed between CNVs and meat quality traits in swine....

  4. Performance measurement: integrating quality management and activity-based cost management.

    PubMed

    McKeon, T

    1996-04-01

    The development of an activity-based management system provides a framework for developing performance measures integral to quality and cost management. Performance measures that cross operational boundaries and embrace core processes provide a mechanism to evaluate operational results related to strategic intention and internal and external customers. The author discusses this measurement process that allows managers to evaluate where they are and where they want to be, and to set a course of action that closes the gap between the two.

  5. Dimensional Precision Research of Wax Molding Rapid Prototyping based on Droplet Injection

    NASA Astrophysics Data System (ADS)

    Mingji, Huang; Geng, Wu; yan, Shan

    2017-11-01

    The traditional casting process is complex: the mold is an essential product, and mold quality directly affects the quality of the final part. Rapid prototyping by 3D printing can be used to produce the mold prototype, and the wax model method has the advantages of high speed, low cost, and the ability to form complex structures. Using orthogonal experiments as the main method, the factors affecting dimensional precision are analyzed. The purpose is to obtain the optimal process parameters and so improve the dimensional accuracy of production based on droplet injection molding.

  6. Fuel quality processing study, volume 1

    NASA Astrophysics Data System (ADS)

    Ohara, J. B.; Bela, A.; Jentz, N. E.; Syverson, H. T.; Klumpe, H. W.; Kessler, R. E.; Kotzot, H. T.; Loran, B. L.

    1981-04-01

    A fuel quality processing study to provide a data base for an intelligent tradeoff between advanced turbine technology and liquid fuel quality, and also, to guide the development of specifications of future synthetic fuels anticipated for use in the time period 1985 to 2000 is given. Four technical performance tests are discussed: on-site pretreating, existing refineries to upgrade fuels, new refineries to upgrade fuels, and data evaluation. The base case refinery is a modern Midwest refinery processing 200,000 BPD of a 60/40 domestic/import petroleum crude mix. The synthetic crudes used for upgrading to marketable products and turbine fuel are shale oil and coal liquids. Of these syncrudes, 50,000 BPD are processed in the existing petroleum refinery, requiring additional process units and reducing petroleum feed, and in a new refinery designed for processing each syncrude to produce gasoline, distillate fuels, resid fuels, and turbine fuel, JPGs and coke. An extensive collection of synfuel properties and upgrading data was prepared for the application of a linear program model to investigate the most economical production slate meeting petroleum product specifications and turbine fuels of various quality grades. Technical and economic projections were developed for 36 scenarios, based on 4 different crude feeds to either modified existing or new refineries operated in 2 different modes to produce 7 differing grades of turbine fuels. A required product selling price of turbine fuel for each processing route was calculated. Procedures and projected economics were developed for on-site treatment of turbine fuel to meet limitations of impurities and emission of pollutants.

  7. Fuel quality processing study, volume 1

    NASA Technical Reports Server (NTRS)

    Ohara, J. B.; Bela, A.; Jentz, N. E.; Syverson, H. T.; Klumpe, H. W.; Kessler, R. E.; Kotzot, H. T.; Loran, B. L.

    1981-01-01

    A fuel quality processing study to provide a data base for an intelligent tradeoff between advanced turbine technology and liquid fuel quality, and also, to guide the development of specifications of future synthetic fuels anticipated for use in the time period 1985 to 2000 is given. Four technical performance tests are discussed: on-site pretreating, existing refineries to upgrade fuels, new refineries to upgrade fuels, and data evaluation. The base case refinery is a modern Midwest refinery processing 200,000 BPD of a 60/40 domestic/import petroleum crude mix. The synthetic crudes used for upgrading to marketable products and turbine fuel are shale oil and coal liquids. Of these syncrudes, 50,000 BPD are processed in the existing petroleum refinery, requiring additional process units and reducing petroleum feed, and in a new refinery designed for processing each syncrude to produce gasoline, distillate fuels, resid fuels, and turbine fuel, JPGs and coke. An extensive collection of synfuel properties and upgrading data was prepared for the application of a linear program model to investigate the most economical production slate meeting petroleum product specifications and turbine fuels of various quality grades. Technical and economic projections were developed for 36 scenarios, based on 4 different crude feeds to either modified existing or new refineries operated in 2 different modes to produce 7 differing grades of turbine fuels. A required product selling price of turbine fuel for each processing route was calculated. Procedures and projected economics were developed for on-site treatment of turbine fuel to meet limitations of impurities and emission of pollutants.

  8. Effective inactivation of Saccharomyces cerevisiae in minimally processed Makgeolli using low-pressure homogenization-based pasteurization.

    PubMed

    Bak, Jin Seop

    2015-01-01

    In order to address the limitations associated with the inefficient pasteurization platform used to make Makgeolli, such as the presence of turbid colloidal dispersions in suspension, commercially available Makgeolli was minimally processed using a low-pressure homogenization-based pasteurization (LHBP) process. This continuous process demonstrates that promptly reducing the exposure time to excessive heat using either large molecules or insoluble particles can dramatically improve internal quality and decrease irreversible damage. Specifically, optimal homogenization increased concomitantly with physical parameters such as colloidal stability (65.0% of maximum and below 25-μm particles) following two repetitions at 25.0 MPa. However, biochemical parameters such as microbial population, acidity, and the presence of fermentable sugars rarely affected Makgeolli quality. Remarkably, there was a 4.5-log reduction in the number of Saccharomyces cerevisiae target cells at 53.5°C for 70 sec in optimally homogenized Makgeolli. This value was higher than the 37.7% measured from traditionally pasteurized Makgeolli. In contrast to the analytical similarity among homogenized Makgeollis, our objective quality evaluation demonstrated significant differences between pasteurized (or unpasteurized) Makgeolli and LHBP-treated Makgeolli. Keywords: low-pressure homogenization-based pasteurization; Makgeolli; minimal processing-preservation; Saccharomyces cerevisiae; suspension stability.
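
    The "4.5-log reduction" reported above is shorthand for standard microbiology arithmetic: the log reduction is log10 of the ratio of initial to surviving viable counts. The sketch below shows that calculation with illustrative counts, not the study's raw data.

```python
# Sketch: the log-reduction arithmetic behind a "4.5-log reduction" claim.
#   reduction = log10(N0 / N), with N0 and N as viable counts (e.g. CFU/mL).
# The counts below are illustrative, not data from the study.
import math

def log_reduction(n_initial, n_final):
    """Decimal (log10) reduction in viable count."""
    return math.log10(n_initial / n_final)

# A 4.5-log reduction means the viable count fell by a factor of ~31,600.
factor = 10 ** 4.5
r = log_reduction(3.2e7, 3.2e7 / factor)
```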

  9. The clinical nurse specialist as resuscitation process manager.

    PubMed

    Schneiderhahn, Mary Elizabeth; Fish, Anne Folta

    2014-01-01

    The purpose of this article was to describe the history and leadership dimensions of the role of resuscitation process manager and provide specific examples of how this role is implemented at a Midwest medical center. In 1992, a medical center in the Midwest needed a nurse to manage resuscitation care. This role designation meant that this nurse became central to all quality improvement efforts in resuscitation care. The role expanded as clinical resuscitation guidelines were updated and as the medical center grew. The role became known as the critical care clinical nurse specialist as resuscitation process manager. This clinical care nurse specialist was called a manager, but she had no direct line authority, so she accomplished her objectives by forming a multitude of collaborative networks. Based on a framework by Finkelman, the manager role incorporated specific leadership abilities in quality improvement: (1) coordination of medical center-wide resuscitation, (2) use of interprofessional teams, (3) integration of evidence into practice, and (4) staff coaching to develop leadership. The manager coordinates resuscitation care with the goals of prevention of arrests if possible, efficient and effective implementation of resuscitation protocols, high quality of patient and family support during and after the resuscitation event, and creation or revision of resuscitation policies for in-hospital and for ambulatory care areas. The manager designs a comprehensive set of meaningful and measurable process and outcome indicators with input from interprofessional teams. The manager engages staff in learning, reflecting on care given, and using the evidence base for resuscitation care. Finally, the manager role is a balance between leading quality improvement efforts and coaching staff to implement and sustain these quality improvement initiatives. 
Revisions to clinical guidelines for resuscitation care since the 1990s have resulted in medical centers developing improved resuscitation processes that require management. The manager enhances collaborative quality improvement efforts that are in line with Institute of Medicine recommendations. The role of resuscitation process manager may be of interest to medical centers striving for excellence in evidence-based resuscitation care.

  10. Cultural adaptation process for international dissemination of the strengthening families program.

    PubMed

    Kumpfer, Karol L; Pinyuchon, Methinin; Teixeira de Melo, Ana; Whiteside, Henry O

    2008-06-01

    The Strengthening Families Program (SFP) is an evidence-based family skills training intervention developed and found efficacious for substance abuse prevention by U.S. researchers in the 1980s. In the 1990s, a cultural adaptation process was developed to transport SFP for effectiveness trials with diverse populations (African, Hispanic, Asian, Pacific Islander, and Native American). Since 2003, SFP has been culturally adapted for use in 17 countries. This article reviews the SFP theory and research and a recommended cultural adaptation process. Challenges in international dissemination of evidence-based programs (EBPs) are discussed based on the results of U.N. and U.S. governmental initiatives to transport EBP family interventions to developing countries. The technology transfer and quality assurance system are described, including the language translation and cultural adaptation process for materials development, staff training, and on-site and online Web-based supervision, technical assistance, and evaluation services to assure quality implementation and provide process evaluation feedback for improvements.

  11. Model-Based PAT for Quality Management in Pharmaceuticals Freeze-Drying: State of the Art

    PubMed Central

    Fissore, Davide

    2017-01-01

    Model-based process analytical technologies can be used for the in-line control and optimization of a pharmaceutical freeze-drying process, as well as for the off-line design of the process, i.e., the identification of the optimal operating conditions. This paper aims at presenting the state of the art in this field, focusing, particularly, on three groups of systems, namely, those based on the temperature measurement (i.e., the soft sensor), on the chamber pressure measurement (i.e., the systems based on the tests of pressure rise and of pressure decrease), and on the sublimation flux estimate (i.e., the tunable diode laser absorption spectroscopy and the valveless monitoring system). The application of these systems for in-line process optimization (e.g., using a model predictive control algorithm) and for achieving true Quality by Design (e.g., through the off-line calculation of the design space of the process) is presented and discussed. PMID:28224123
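
    The temperature-based soft sensor mentioned above rests on a steady-state heat/mass balance across the frozen product. A minimal sketch of that balance follows; the vapor-pressure correlation is a common one from the freeze-drying literature, while the heat transfer coefficient Kv, product resistance Rp, and operating conditions are hypothetical values, not taken from the paper:

```python
import math

def p_ice(T):
    """Vapor pressure of ice [Pa]; correlation commonly used in freeze-drying models."""
    return 3.62e12 * math.exp(-6144.96 / T)

def product_temperature(T_shelf, P_chamber, Kv, Rp, dHs=2.84e6):
    """Solve the steady-state balance Kv*(T_shelf - T_p) = dHs*(p_ice(T_p) - P_chamber)/Rp
    for the product temperature T_p [K] by bisection (residual is monotonic in T_p)."""
    def residual(Tp):
        return Kv * (T_shelf - Tp) - dHs * (p_ice(Tp) - P_chamber) / Rp
    lo, hi = 210.0, T_shelf  # the ice front must stay colder than the shelf
    for _ in range(100):
        mid = 0.5 * (lo + hi)
        if residual(mid) > 0:  # heat supply exceeds sublimation demand -> Tp is higher
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

# hypothetical operating point: shelf at -10 C, chamber at 10 Pa
Tp = product_temperature(T_shelf=263.15, P_chamber=10.0, Kv=20.0, Rp=2.0e5)
flux = (p_ice(Tp) - 10.0) / 2.0e5   # sublimation flux [kg m-2 s-1]
```

    With these illustrative parameters the solver returns a product temperature around 243 K and a flux on the order of 1e-4 kg m-2 s-1, i.e., physically plausible magnitudes for primary drying.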

  12. Sleep Quality Estimation based on Chaos Analysis for Heart Rate Variability

    NASA Astrophysics Data System (ADS)

    Fukuda, Toshio; Wakuda, Yuki; Hasegawa, Yasuhisa; Arai, Fumihito; Kawaguchi, Mitsuo; Noda, Akiko

    In this paper, we propose an algorithm to estimate sleep quality from heart rate variability using chaos analysis. Polysomnography (PSG) is a conventional and reliable system for diagnosing sleep disorders and evaluating their severity and therapeutic effect by estimating sleep quality from multiple channels. However, the recording process requires a lot of time and a controlled measurement environment, and analyzing PSG data is laborious because the huge volume of sensed data must be evaluated manually. Meanwhile, attention has turned to people who make mistakes or cause accidents owing to loss of regular sleep and homeostasis. A simple home system for checking one's own sleep is therefore required, along with an estimation algorithm for such a system. We therefore propose an algorithm that estimates sleep quality based only on heart rate variability, which can be measured in an uncontrolled environment by a simple sensor such as a pressure sensor or an infrared sensor, by experimentally finding the relationship between chaos indices and sleep quality. A system including the estimation algorithm can inform a user of the patterns and quality of daily sleep, so that the user can arrange his or her schedule in advance, pay closer attention to the sleep results, and consult a doctor.
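
    A chaos index of the kind referred to above can be approximated from a delay-embedded RR-interval series. The sketch below is a crude Rosenstein-style divergence-rate estimate run on synthetic data; the embedding parameters and the test signal are illustrative assumptions, not the authors' algorithm:

```python
import math, random

def largest_lyapunov_proxy(x, dim=3, tau=1, horizon=5, min_sep=10):
    """Crude Rosenstein-style estimate: average log divergence rate of
    initially-nearest trajectory pairs in a delay-embedded space."""
    n = len(x) - (dim - 1) * tau
    emb = [[x[i + j * tau] for j in range(dim)] for i in range(n)]
    def dist(a, b):
        return math.sqrt(sum((u - v) ** 2 for u, v in zip(a, b)))
    rates = []
    for i in range(n - horizon):
        # nearest neighbour, excluding temporally adjacent points
        j = min((k for k in range(n - horizon) if abs(k - i) >= min_sep),
                key=lambda k: dist(emb[i], emb[k]), default=None)
        if j is None:
            continue
        d0 = dist(emb[i], emb[j])
        d1 = dist(emb[i + horizon], emb[j + horizon])
        if d0 > 0 and d1 > 0:
            rates.append(math.log(d1 / d0) / horizon)
    return sum(rates) / len(rates)

# synthetic RR-interval series [s]: slow oscillation plus noise
random.seed(0)
rr = [0.8 + 0.05 * math.sin(0.3 * k) + 0.02 * random.random() for k in range(300)]
lam = largest_lyapunov_proxy(rr)
```

    In a home-monitoring setting such an index would be computed per sleep epoch and related empirically to PSG-derived sleep quality, as the abstract describes.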

  13. Exploring global recognition of quality midwifery education: Vision or fiction?

    PubMed

    Luyben, Ans; Barger, Mary; Avery, Melissa; Bharj, Kuldip Kaur; O'Connell, Rhona; Fleming, Valerie; Thompson, Joyce; Sherratt, Della

    2017-06-01

    Midwifery education is the foundation for preparing competent midwives to provide a high standard of safe, evidence-based care for women and their newborns. Global competencies and standards for midwifery education have been defined as benchmarks for establishing quality midwifery education and practice worldwide. However, wide variations in type and nature of midwifery education programs exist. To explore and discuss the opportunities and challenges of a global quality assurance process as a strategy to promote quality midwifery education. Accreditation and recognition as two examples of quality assurance processes in education are discussed. A global recognition process, with its opportunities and challenges, is explored from the perspective of four illustrative case studies from Ireland, Kosovo, Latin America and Bangladesh. The discussion highlights that the establishment of a global recognition process may assist in promoting quality of midwifery education programs world-wide, but cannot take the place of formal national accreditation. In addition, a recognition process will not be feasible for many institutions without additional resources, such as financial support or competent evaluators. In order to achieve quality midwifery education through a global recognition process the authors present 5 Essential Challenges for Quality Midwifery Education. Quality midwifery education is vital for establishing a competent workforce, and improving maternal and newborn health. Defining a global recognition process could be instrumental in moving toward this goal, but dealing with the identified challenges will be essential. Copyright © 2017 Australian College of Midwives. Published by Elsevier Ltd. All rights reserved.

  14. A Quadrupole Dalton-based multi-attribute method for product characterization, process development, and quality control of therapeutic proteins.

    PubMed

    Xu, Weichen; Jimenez, Rod Brian; Mowery, Rachel; Luo, Haibin; Cao, Mingyan; Agarwal, Nitin; Ramos, Irina; Wang, Xiangyang; Wang, Jihong

    2017-10-01

    During manufacturing and storage processes, therapeutic proteins are subject to various post-translational modifications (PTMs), such as isomerization, deamidation, oxidation, disulfide bond modifications, and glycosylation. Certain PTMs may affect bioactivity, stability, or the pharmacokinetic and pharmacodynamic profile and are therefore classified as potential critical quality attributes (pCQAs). Identifying, monitoring, and controlling these PTMs are usually key elements of the Quality by Design (QbD) approach. Traditionally, multiple analytical methods are utilized for these purposes, which is time-consuming and costly. In recent years, multi-attribute monitoring methods have been developed in the biopharmaceutical industry. However, these methods combine high-end mass spectrometry with complicated data analysis software, which could pose difficulty when implementing in a quality control (QC) environment. Here we report a multi-attribute method (MAM) using a Quadrupole Dalton (QDa) mass detector to selectively monitor and quantitate PTMs in a therapeutic monoclonal antibody. The result output from the QDa-based MAM is straightforward and automatic. Evaluation results indicate that this method provides results comparable to the traditional assays. To ensure future application in the QC environment, the method was qualified according to the International Conference on Harmonization (ICH) guideline and applied in the characterization of drug substance and stability samples. The QDa-based MAM is shown to be an extremely useful tool for product and process characterization studies that facilitates understanding of process impact on multiple quality attributes, while being QC-friendly and cost-effective.
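
    Quantitation in a multi-attribute method of this kind typically reduces to relative peak areas of the modified versus unmodified peptide in extracted-ion chromatograms. A minimal sketch; the helper name `ptm_percent` and the peak areas are hypothetical, not from the paper:

```python
def ptm_percent(area_modified, area_unmodified):
    """Relative abundance of a modified peptide, as percent of the total
    extracted-ion chromatogram peak area for that peptide."""
    total = area_modified + area_unmodified
    if total == 0:
        raise ValueError("no signal for this peptide")
    return 100.0 * area_modified / total

# e.g. a deamidated peptide with hypothetical peak areas
deamidation = ptm_percent(area_modified=1.2e5, area_unmodified=2.28e6)
```

    A QC implementation would compute this per monitored attribute and compare the percentage against a qualified acceptance limit.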

  15. Development of a validation process for parameters utilized in optimizing construction quality management of pavements.

    DOT National Transportation Integrated Search

    2006-01-01

    The implementation of an effective performance-based construction quality management requires a tool for determining impacts of construction quality on the life-cycle performance of pavements. This report presents an update on the efforts in the deve...

  16. [Quality assurance and total quality management in residential home care].

    PubMed

    Nübling, R; Schrempp, C; Kress, G; Löschmann, C; Neubart, R; Kuhlmey, A

    2004-02-01

    Quality, quality assurance, and quality management have been important topics in residential care homes for several years. However, only as a result of reform processes in the German legislation (long-term care insurance, care quality assurance) is a systematic discussion taking place. Furthermore, initiatives and holistic model projects, which deal with the assessment and improvement of service quality, were developed in the field of care for the elderly. The present article gives a critical overview of essential developments. Different comprehensive approaches such as the implementation of quality management systems, nationwide expert-based initiatives, and developments towards professionalizing care are discussed. Empirically based approaches, especially those emphasizing the assessment of outcome quality, are focused on in this work. Overall, the authors conclude that in the past few years comprehensive efforts have been made to improve the quality of care. However, the current situation still requires much work to establish a nationwide launch and implementation of evidence-based quality assurance and quality management.

  17. Bioreactor expansion of human mesenchymal stem cells according to GMP requirements.

    PubMed

    Elseberg, Christiane L; Salzig, Denise; Czermak, Peter

    2015-01-01

    In cell therapy, the use of autologous and allogenic human mesenchymal stem cells (hMSCs) is rising. Accordingly, a supply of cells of the highest quality is required for clinical applications. As hMSCs are considered advanced therapy medicinal products (ATMPs), they are subject to the requirements of GMP and PAT according to the authorities (FDA and EMA). The production process of these cells must therefore be documented according to GMP, which is usually done via a GMP protocol based on standard operating procedures. This chapter provides an example of such a GMP protocol for hMSCs, here a genetically modified allogenic cell line, based on a production process in a microcarrier-based stirred-tank reactor, including process monitoring according to PAT and final product quality assurance.

  18. Process perspective on image quality evaluation

    NASA Astrophysics Data System (ADS)

    Leisti, Tuomas; Halonen, Raisa; Kokkonen, Anna; Weckman, Hanna; Mettänen, Marja; Lensu, Lasse; Ritala, Risto; Oittinen, Pirkko; Nyman, Göte

    2008-01-01

    The psychological complexity of multivariate image quality evaluation makes it difficult to develop general image quality metrics. Quality evaluation includes several mental processes, and ignoring these processes and using only a few test images can lead to biased results. Using a qualitative/quantitative (Interpretation Based Quality, IBQ) methodology, we examined the process of pair-wise comparison in a setting where the quality of images printed by a laser printer on different paper grades was evaluated. The test image consisted of a picture of a table covered with several objects. Three other images were also used: photographs of a woman, a cityscape, and a countryside. In addition to the pair-wise comparisons, observers (N=10) were interviewed about the subjective quality attributes they used in making their quality decisions. An examination of the individual pair-wise comparisons revealed serious inconsistencies in observers' evaluations of the test image content, but not of the other contents. The qualitative analysis showed that this inconsistency was due to the observers' focus of attention. The lack of an easily recognizable context in the test image may have contributed to this inconsistency. To obtain reliable knowledge of the effect of image context or attention on subjective image quality, a qualitative methodology is needed.
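
    Inconsistency in pair-wise comparisons of the kind reported here can be quantified by counting circular (intransitive) triads in an observer's preference matrix. A small sketch with a hypothetical 4-image preference matrix, not the study's data:

```python
from itertools import combinations

def circular_triads(pref):
    """Count intransitive (circular) triads in a complete pairwise
    preference matrix: pref[i][j] is truthy if image i beat image j."""
    n = len(pref)
    count = 0
    for i, j, k in combinations(range(n), 3):
        if (pref[i][j] and pref[j][k] and pref[k][i]) or \
           (pref[j][i] and pref[k][j] and pref[i][k]):
            count += 1
    return count

# hypothetical judge: image 3 beats image 0 even though 0 beat 1 and 2,
# producing two circular triads
P = [[0, 1, 1, 0],
     [0, 0, 1, 1],
     [0, 0, 0, 1],
     [1, 0, 0, 0]]
```

    A perfectly consistent judge yields zero circular triads; a higher count flags exactly the kind of attention-driven inconsistency the qualitative analysis uncovered.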

  19. Manufacturing Bms/Iso System Review

    NASA Technical Reports Server (NTRS)

    Gomez, Yazmin

    2004-01-01

    The Quality Management System (QMS) is one that recognizes the need to continuously change and improve an organization's products and services as determined by system feedback and corresponding management decisions. The purpose of a Quality Management System is to minimize quality variability of an organization's products and services. The optimal Quality Management System balances the need for an organization to maintain flexibility in the products and services it provides with the need to apply the appropriate level of discipline and control over the processes used to provide them. The goal of a Quality Management System is to ensure the quality of products and services while consistently (through minimizing quality variability) meeting or exceeding customer expectations. The GRC Business Management System (BMS) is the foundation of the Center's ISO 9001:2000 registered quality system. ISO 9001 is a quality system model developed by the International Organization for Standardization. BMS supports and promotes the Glenn Research Center Quality Policy and aims to ensure customer satisfaction while also meeting quality standards. My assignment this summer is to examine the manufacturing processes used to develop research hardware, which in most cases is one-of-a-kind hardware made with nonconventional equipment and materials. Based on my observations of the hardware development processes, I will determine the best way to meet customer requirements while also achieving GRC quality standards. The purpose of my task is to review the manufacturing processes, identify opportunities to optimize their efficiency, and establish a plan for implementation and continuous improvement.

  20. [Method for the quality assessment of data collection processes in epidemiological studies].

    PubMed

    Schöne, G; Damerow, S; Hölling, H; Houben, R; Gabrys, L

    2017-10-01

    The relevant literature contains no description of test criteria or methodologies for the quantitative evaluation of primary data collection processes in epidemiological surveys based on accompaniment and observation in the field, and thus no application in practice is known. Therefore, methods need to be developed and existing procedures adapted. The aim was to identify quality-relevant developments within quality dimensions by means of inspection points (quality indicators) during the process of data collection. As a result, we seek to implement and establish a methodology for assessing overall survey quality, supplementary to standardized data analyses. Monitors detect deviations from the standard of primary data collection during site visits by applying standardized checklists. Quantitative results, overall and for each dimension, are obtained by numerical calculation of quality indicators. Score results are categorized and color-coded. This visual prioritization indicates the necessity for intervention. The results obtained give clues regarding the current quality of data collection. This allows for the identification of sections where interventions for quality improvement are needed. In addition, the development of process quality can be shown over time on an intercomparable basis. This methodology for evaluating data collection quality can identify deviations from norms, focus quality analyses, and help trace the causes of significant deviations.
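
    The indicator-based scoring with color coding described above might be sketched as follows; the indicator names and traffic-light thresholds are illustrative assumptions, not the published instrument:

```python
def score_process(indicators):
    """Aggregate per-checklist-item deviations (0 = standard met,
    1 = deviation observed) into a percent score and a traffic-light category."""
    score = 100.0 * (1 - sum(indicators.values()) / len(indicators))
    if score >= 90:
        colour = "green"    # no intervention needed
    elif score >= 75:
        colour = "yellow"   # monitor, minor corrections
    else:
        colour = "red"      # intervention required
    return score, colour

# hypothetical site-visit checklist result: one deviation out of four items
obs = {"consent_explained": 0, "device_calibrated": 0,
       "script_followed": 1, "timing_within_limits": 0}
score, colour = score_process(obs)
```

    Tracking such scores per dimension and over successive site visits gives the intercomparable time series the abstract calls for.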

  1. Process-based quality for thermal spray via feedback control

    NASA Astrophysics Data System (ADS)

    Dykhuizen, R. C.; Neiser, R. A.

    2006-09-01

    Quality control of a thermal spray system manufacturing process is difficult due to the many input variables that need to be controlled. Great care must be taken to ensure that the process remains constant to obtain a consistent quality of the parts. Control is greatly complicated by the fact that measurement of particle velocities and temperatures is a noisy stochastic process. This article illustrates the application of quality control concepts to a wire flame spray process. A central feature of the real-time control system is an automatic feedback control scheme that provides fine adjustments to ensure that uncontrolled variations are accommodated. It is shown how the control vectors can be constructed from simple process maps to independently control particle velocity and temperature. This control scheme is shown to perform well in a real production environment. We also demonstrate that slight variations in the feed wire curvature can greatly influence the process. Finally, the geometry of the spray system and sensor must remain constant for the best reproducibility.
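
    The idea of constructing control vectors from simple process maps can be illustrated with a linearized 2x2 map from spray inputs to particle velocity and temperature: inverting the map yields input corrections that move each output independently toward its setpoint. The map entries, gain, and input names below are hypothetical, not the article's calibration:

```python
# Hypothetical linearized process map near the operating point:
# rows = (particle velocity [m/s], particle temperature [K]),
# cols = (wire feed rate, air pressure) in arbitrary input units.
J = [[ 8.0,  15.0],
     [12.0, -10.0]]

def control_step(error_v, error_T, gain=0.5):
    """Invert the 2x2 process map to get input corrections that drive the
    velocity and temperature errors down independently (gain < 1 damps noise)."""
    det = J[0][0] * J[1][1] - J[0][1] * J[1][0]
    du1 = gain * ( J[1][1] * error_v - J[0][1] * error_T) / det
    du2 = gain * (-J[1][0] * error_v + J[0][0] * error_T) / det
    return du1, du2

# closed-loop simulation against an ideal (map-exact) plant
u = [0.0, 0.0]
target = (20.0, 30.0)   # desired deviations of v and T from nominal
for _ in range(50):
    v = J[0][0] * u[0] + J[0][1] * u[1]
    T = J[1][0] * u[0] + J[1][1] * u[1]
    du1, du2 = control_step(target[0] - v, target[1] - T)
    u[0] += du1
    u[1] += du2
```

    With gain 0.5 the residual error halves every step, so small uncontrolled drifts (the noisy stochastic measurements the article mentions) are accommodated without aggressive input moves.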

  2. Digital CODEC for real-time processing of broadcast quality video signals at 1.8 bits/pixel

    NASA Technical Reports Server (NTRS)

    Shalkhauser, Mary JO; Whyte, Wayne A., Jr.

    1989-01-01

    Advances in very large-scale integration and recent work in the field of bandwidth efficient digital modulation techniques have combined to make digital video processing technically feasible and potentially cost competitive for broadcast quality television transmission. A hardware implementation was developed for a DPCM-based digital television bandwidth compression algorithm which processes standard NTSC composite color television signals and produces broadcast quality video in real time at an average of 1.8 bits/pixel. The data compression algorithm and the hardware implementation of the CODEC are described, and performance results are provided.
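
    The DPCM principle behind the CODEC (predict each pixel from its reconstructed neighbour and transmit only a quantized prediction error) can be sketched in a few lines. The one-dimensional previous-sample predictor and step size here are illustrative, not the hardware's actual predictor:

```python
def dpcm_encode(samples, step=4):
    """Previous-sample predictor + uniform quantizer: each pixel is coded
    as a quantized difference from the decoder's reconstruction so far."""
    recon_prev, codes = 0, []
    for s in samples:
        q = round((s - recon_prev) / step)   # quantized prediction error
        codes.append(q)
        recon_prev = max(0, min(255, recon_prev + q * step))
    return codes

def dpcm_decode(codes, step=4):
    recon, out = 0, []
    for q in codes:
        recon = max(0, min(255, recon + q * step))
        out.append(recon)
    return out

line = [100, 102, 105, 110, 120, 119, 90, 60]
decoded = dpcm_decode(dpcm_encode(line))
```

    Because the encoder predicts from its own reconstruction, encoder and decoder stay in lockstep and the reconstruction error is bounded by half the quantizer step; entropy coding of the small error codes is what brings the average rate down toward figures like 1.8 bits/pixel.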

  3. Digital CODEC for real-time processing of broadcast quality video signals at 1.8 bits/pixel

    NASA Technical Reports Server (NTRS)

    Shalkhauser, Mary JO; Whyte, Wayne A.

    1991-01-01

    Advances in very large scale integration and recent work in the field of bandwidth efficient digital modulation techniques have combined to make digital video processing technically feasible and potentially cost competitive for broadcast quality television transmission. A hardware implementation was developed for a DPCM (differential pulse code modulation)-based digital television bandwidth compression algorithm which processes standard NTSC composite color television signals and produces broadcast quality video in real time at an average of 1.8 bits/pixel. The data compression algorithm and the hardware implementation of the codec are described, and performance results are provided.

  4. Improving Quality and Efficiency of Postpartum Hospital Education

    PubMed Central

    Buchko, Barbara L.; Gutshall, Connie H.; Jordan, Elizabeth T.

    2012-01-01

    The purpose of this study was to investigate the implementation of an evidence-based, streamlined, education process (comprehensive education booklet, individualized education plan, and integration of education into the clinical pathway) and nurse education to improve the quality and efficiency of postpartum education during hospitalization. A one-group pretest–posttest design was used to measure the quality of discharge teaching for new mothers and efficiency of the education process for registered nurses before and after implementation of an intervention. Results indicated that a comprehensive educational booklet and enhanced documentation can improve efficiency in the patient education process for nurses. PMID:23997552

  5. Approaching Error-Free Customer Satisfaction through Process Change and Feedback Systems

    ERIC Educational Resources Information Center

    Berglund, Kristin M.; Ludwig, Timothy D.

    2009-01-01

    Employee-based errors result in quality defects that can often impact customer satisfaction. This study examined the effects of a process change and feedback system intervention on error rates of 3 teams of retail furniture distribution warehouse workers. Archival records of error codes were analyzed and aggregated as the measure of quality. The…

  6. Decision-making in honeybee swarms based on quality and distance information of candidate nest sites.

    PubMed

    Laomettachit, Teeraphan; Termsaithong, Teerasit; Sae-Tang, Anuwat; Duangphakdee, Orawan

    2015-01-07

    In the nest-site selection process of honeybee swarms, an individual bee performs a waggle dance to communicate information about direction, quality, and distance of a discovered site to other bees at the swarm. Initially, different groups of bees dance to represent different potential sites, but eventually the swarm usually reaches an agreement for only one site. Here, we model the nest-site selection process in honeybee swarms of Apis mellifera and show how the swarms make adaptive decisions based on a trade-off between the quality and distance to candidate nest sites. We use bifurcation analysis and stochastic simulations to reveal that the swarm's site distance preference is moderate>near>far when the swarms choose between low quality sites. However, the distance preference becomes near>moderate>far when the swarms choose between high quality sites. Our simulations also indicate that swarms with large population size prefer nearer sites and, in addition, are more adaptive at making decisions based on available information compared to swarms with smaller population size. Copyright © 2014 Elsevier Ltd. All rights reserved.
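
    The quality-distance trade-off in collective nest-site choice can be illustrated with a toy recruitment model with cross-inhibition between committed groups. The rate constants and the perceived-value function (quality minus a distance cost) below are illustrative assumptions, not the paper's model:

```python
import random

def swarm_choice(site_values, steps=2000, dt=0.01, sigma=0.0):
    """Toy recruitment dynamics: the fraction x_i committed to site i grows by
    dance recruitment of uncommitted bees at rate v_i and shrinks by
    cross-inhibition from bees committed to competing sites."""
    n = len(site_values)
    x = [1.0 / (n + 1)] * n            # small initial commitment to each site
    for _ in range(steps):
        u = max(0.0, 1.0 - sum(x))     # uncommitted fraction of the swarm
        new_x = []
        for i, v in enumerate(site_values):
            cross = sum(x[j] for j in range(n) if j != i)
            dx = v * x[i] * u - 0.5 * x[i] * cross
            new_x.append(max(0.0, x[i] + dt * (dx + sigma * random.gauss(0, 1))))
        x = new_x
    return x

# perceived value = quality - cost_per_distance * distance (hypothetical units)
sites = [0.9 - 0.1 * 1.0,   # high quality, near
         0.9 - 0.1 * 4.0,   # same quality, far
         0.5 - 0.1 * 1.0]   # low quality, near
shares = swarm_choice(sites, sigma=0.0)
```

    The deterministic run converges on the near, high-quality site; raising `sigma` mimics the stochastic simulations of the paper, where smaller swarms choose less reliably.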

  7. Investigating the CO2 laser cutting parameters of MDF wood composite material

    NASA Astrophysics Data System (ADS)

    Eltawahni, H. A.; Olabi, A. G.; Benyounis, K. Y.

    2011-04-01

    Laser cutting of medium density fibreboard (MDF) is a complicated process, and the selection of process parameter combinations is essential to obtain the highest quality cut section. This paper presents a means of selecting the process parameters for laser cutting of MDF based on the design of experiments (DOE) approach. A CO2 laser was used to cut three thicknesses, 4, 6 and 9 mm, of MDF panels. The process factors investigated are: laser power, cutting speed, air pressure and focal point position. In this work, cutting quality was evaluated by measuring the upper kerf width, the lower kerf width, the ratio of the upper kerf width to the lower kerf width, the roughness of the cut section and the operating cost. The effect of each factor on the quality measures was determined. The optimal cutting combinations were presented in favour of high-quality process output and of low cutting cost.
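
    A full-factorial DOE run plan over such factors can be enumerated directly; the factor levels below are hypothetical, not the levels used in the study:

```python
from itertools import product

# Hypothetical three-level screening design for laser cutting of one thickness
factors = {
    "power_W":      [150, 300, 450],
    "speed_mm_min": [1000, 3000, 5000],
    "pressure_bar": [2, 4, 6],
    "focus_mm":     [-2, 0, 2],
}

# one run per combination of levels: 3^4 = 81 runs
runs = [dict(zip(factors, levels)) for levels in product(*factors.values())]
```

    In practice a fractional or response-surface design would be used to cut the run count; the full enumeration just makes the factor space explicit.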

  8. Improvement of the System of Training of Specialists by University for Coal Mining Enterprises

    NASA Astrophysics Data System (ADS)

    Mikhalchenko, Vadim; Seredkina, Irina

    2017-11-01

    This article considers an ingenious application of the Quality Function Deployment technique to the process of training specialists with higher education at a university. The method is based on the step-by-step conversion of customer requirements into specific organizational, content-related and functional transformations of the university's technological process. A fully deployed quality function includes four stages of tracking customer requirements while creating a product: product planning, product design, process design and production design. Quality Function Deployment can be considered one of the methods for optimizing the technological processes of training specialists with higher education in the current economic conditions. Implemented at the initial stages of the life cycle of the technological process, it ensures not only the high quality of the "product" of graduate school but also the fullest possible satisfaction of the consumer's requests and expectations.

  9. Adaptive scallop height tool path generation for robot-based incremental sheet metal forming

    NASA Astrophysics Data System (ADS)

    Seim, Patrick; Möllensiep, Dennis; Störkle, Denis Daniel; Thyssen, Lars; Kuhlenkötter, Bernd

    2016-10-01

    Incremental sheet metal forming is an emerging process for the production of individualized products or prototypes in low batch sizes and with short times to market. In these processes, the desired shape is produced by the incremental inward motion of the workpiece-independent forming tool in the depth direction and its movement along the contour in the lateral direction. Given this way of producing the shape, tool path generation is a key factor influencing, e.g., the resulting geometric accuracy, the resulting surface quality, and the working time. This paper presents an innovative tool path generation approach, based on a commercial milling CAM package, that considers surface quality and working time. The approach offers the ability to define a specific scallop height, as an indicator of surface quality, for specific faces of a component. Moreover, it decreases the working time required for producing the entire component compared with a commercial software package lacking this adaptive approach. Different forming experiments were performed to verify the newly developed tool path generation. Above all, this approach serves to resolve the existing conflict between working time and surface quality in incremental sheet metal forming.
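
    The link between a prescribed scallop height h and the allowable step between adjacent paths of a hemispherical tool of radius R follows from the tool-sphere geometry: s = 2*sqrt(h*(2R - h)). A sketch with hypothetical per-face scallop targets (tool radius and face names are illustrative, not from the paper):

```python
import math

def step_between_paths(tool_radius, scallop):
    """Step between adjacent tool paths that leaves a ridge ('scallop') of
    exactly the requested height between two passes of a spherical tool tip."""
    if not 0 < scallop < tool_radius:
        raise ValueError("scallop must be between 0 and the tool radius")
    return 2 * math.sqrt(scallop * (2 * tool_radius - scallop))

# per-face scallop targets [mm]: tighter on a visible face, coarser elsewhere
faces = {"visible": 0.005, "hidden": 0.05}
steps = {name: step_between_paths(5.0, h) for name, h in faces.items()}
```

    Allowing a ten-times-coarser scallop on hidden faces roughly triples the step there, which is exactly the working-time saving the adaptive approach exploits.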

  10. Importance of good manufacturing practices in microbiological monitoring in processing human tissues for transplant.

    PubMed

    Pianigiani, Elisa; Ierardi, Francesca; Fimiani, Michele

    2013-12-01

    Skin allografts represent an important therapeutic resource in the treatment of severe skin loss. The risk associated with application of processed tissues in humans is very low, however, human material always carries the risk of disease transmission. To minimise the risk of contamination of grafts, processing is carried out in clean rooms where air quality is monitored. Procedures and quality control tests are performed to standardise the production process and to guarantee the final product for human use. Since we only validate and distribute aseptic tissues, we conducted a study to determine what type of quality controls for skin processing are the most suitable for detecting processing errors and intercurrent contamination, and for faithfully mapping the process without unduly increasing production costs. Two different methods for quality control were statistically compared using the Fisher exact test. On the basis of the current study we selected our quality control procedure based on pre- and post-processing tissue controls, operator and environmental controls. Evaluation of the predictability of our control methods showed that tissue control was the most reliable method of revealing microbial contamination of grafts. We obtained 100 % sensitivity by doubling tissue controls, while maintaining high specificity (77 %).
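
    The statistical comparison reported here used the Fisher exact test. A self-contained two-sided version for a 2x2 table can be written with standard-library combinatorics; it is demonstrated below on Fisher's classic tea-tasting table, not the tissue-bank data:

```python
from math import comb

def fisher_exact_two_sided(a, b, c, d):
    """Two-sided Fisher exact test for a 2x2 table [[a, b], [c, d]]:
    sum the hypergeometric probabilities of all tables with the same
    margins that are no more likely than the observed table."""
    n = a + b + c + d
    row1, col1 = a + b, a + c
    denom = comb(n, col1)
    def p(k):  # probability of the table with k in the top-left cell
        return comb(row1, k) * comb(n - row1, col1 - k) / denom
    p_obs = p(a)
    lo = max(0, col1 - (n - row1))
    hi = min(row1, col1)
    return sum(p(k) for k in range(lo, hi + 1) if p(k) <= p_obs * (1 + 1e-12))

p_value = fisher_exact_two_sided(3, 1, 1, 3)   # Fisher's tea-tasting table
```

    For this table the exact two-sided p-value is 34/70, approximately 0.486; with the small per-batch counts typical of tissue-bank quality controls, the exact test is preferable to a chi-squared approximation.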

  11. Cost Models for MMC Manufacturing Processes

    NASA Technical Reports Server (NTRS)

    Elzey, Dana M.; Wadley, Haydn N. G.

    1996-01-01

    Processes for the manufacture of advanced metal matrix composites are rapidly approaching maturity in the research laboratory, and there is growing interest in their transition to industrial production. However, research conducted to date has almost exclusively focused on overcoming the technical barriers to producing high-quality material, and little attention has been given to the economic feasibility of these laboratory approaches and to process cost issues. A quantitative cost modeling (QCM) approach was developed to address these issues. QCMs are cost analysis tools based on predictive process models relating process conditions to the attributes of the final product. An important attribute of the QCM approach is the ability to predict the sensitivity of material production costs to product quality and to quantitatively explore trade-offs between cost and quality. Applications of the cost models allow more efficient direction of future MMC process technology development and a more accurate assessment of MMC market potential. Cost models were developed for two state-of-the-art metal matrix composite (MMC) manufacturing processes: tape casting and plasma spray deposition. Quality and cost models are presented for both processes, and the resulting predicted quality-cost curves are presented and discussed.

  12. A person-centered integrated care quality framework, based on a qualitative study of patients' evaluation of care in light of chronic care ideals.

    PubMed

    Berntsen, Gro; Høyem, Audhild; Lettrem, Idar; Ruland, Cornelia; Rumpsfeld, Markus; Gammon, Deede

    2018-06-20

    Person-Centered Integrated Care (PC-IC) is believed to improve outcomes and experience for persons with multiple long-term and complex conditions. No broad consensus exists regarding how to capture the patient-experienced quality of PC-IC. Most PC-IC evaluation tools focus on care events or care in general. Building on others' and our previous work, we outlined a 4-stage goal-oriented PC-IC process ideal: 1) personalized goal setting, 2) care planning aligned with goals, 3) care delivery according to plan, and 4) evaluation of goal attainment. We aimed to explore, apply, refine and operationalize this quality of care framework. This paper is a qualitative evaluative review of the individual Patient Pathway (iPP) experiences of 19 strategically chosen persons with multimorbidity in light of ideals for chronic care. The iPP includes all care events, addressing the person's collected health issues, organized by time. We constructed iPPs based on the electronic health record (from general practice, nursing services, and hospital) with patient follow-up interviews. The application of the framework and its refinement were parallel processes. Both were based on analysis of salient themes in the empirical material in light of the PC-IC process ideal and progressively more informed applications of themes and questions. The informants consistently reviewed care quality by how care supported/threatened their long-term goals. Personal goals were either implicit or identified by "What matters to you?" Informants expected care to address their long-term goals and placed responsibility for care quality and delivery at the system level. The PC-IC process framework exposed system failure in identifying long-term goals, provision of shared long-term multimorbidity care plans, monitoring of care delivery and goal evaluation. The PC-IC framework includes descriptions of ideal care, key questions and literature references for each stage of the PC-IC process.
This first version of a PC-IC process framework needs further validation in other settings. Gaps in care that are invisible with event-based quality of care frameworks become apparent when evaluated by a long-term goal-driven PC-IC process framework. The framework appears meaningful to persons with multimorbidity.

  13. On the Road to Quality: Turning Stumbling Blocks into Stepping Stones.

    ERIC Educational Resources Information Center

    Bonstingl, John Jay

    1996-01-01

    W. Edwards Deming's quality philosophy can help organizations develop collaborative, community-building leadership practices. This article outlines five personal practices of quality based on personal leadership, partnerships, a systems focus, a process orientation, and constant dedication to continuous improvement. Stumbling blocks can be…

  14. Process quality planning of quality function deployment for carrot syrup

    NASA Astrophysics Data System (ADS)

    Ekawati, Yurida; Noya, Sunday; Widjaja, Filemon

    2017-06-01

    Carrot products are rarely available in the market. Building on previous research that used QFD to generate a product design for carrot products, this study developed the corresponding process quality plan. The carrot product studied was carrot syrup. The research resulted in a process planning matrix for carrot syrup, which gives information about the critical process plans and their priorities. The critical process plan for the production of carrot syrup consists of carrot sorting, carrot peeling, carrot washing, blanching, carrot cutting, making the carrot puree, filtering the carrot juice, adding sugar to the carrot juice, adding food additives to the carrot juice, boiling the syrup, filtering the syrup, filling the syrup into bottles, bottle closure and cooling. This information will help in designing the production process for carrot syrup.
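The prioritization step in a QFD process planning matrix is an importance-weighted column sum over the relationship matrix. A minimal sketch, with hypothetical weights, steps, and relationship strengths (not the study's actual matrix):

```python
# Hypothetical, simplified QFD process-planning matrix: rows are quality
# requirements with importance weights; columns are candidate process steps;
# cells hold relationship strengths (9 strong / 3 moderate / 1 weak / 0 none).
importance = [5, 3, 4]                       # e.g. taste, colour, shelf life
steps = ["carrot sorting", "blanching", "syrup boiling"]
relation = [                                 # one row per quality requirement
    [9, 3, 9],
    [3, 9, 1],
    [1, 3, 9],
]

# Priority of each process step = importance-weighted column sum.
priority = [sum(w * row[j] for w, row in zip(importance, relation))
            for j in range(len(steps))]
for step, score in sorted(zip(steps, priority), key=lambda t: -t[1]):
    print(step, score)  # highest-priority critical process first
```

With these illustrative numbers, syrup boiling (84) outranks carrot sorting (58) and blanching (54), which is the kind of ordering the paper's matrix provides.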

  15. Canadian Association of Gastroenterology consensus guidelines on safety and quality indicators in endoscopy

    PubMed Central

    Armstrong, David; Barkun, Alan; Bridges, Ron; Carter, Rose; de Gara, Chris; Dubé, Catherine; Enns, Robert; Hollingworth, Roger; MacIntosh, Donald; Borgaonkar, Mark; Forget, Sylviane; Leontiadis, Grigorios; Meddings, Jonathan; Cotton, Peter; Kuipers, Ernst J; Valori, Roland

    2012-01-01

    BACKGROUND: Increasing use of gastrointestinal endoscopy, particularly for colorectal cancer screening, and increasing emphasis on health care quality, highlight the need for clearly defined, evidence-based processes to support quality improvement in endoscopy. OBJECTIVE: To identify processes and indicators of quality and safety relevant to high-quality endoscopy service delivery. METHODS: A multidisciplinary group of 35 voting participants developed recommendation statements and performance indicators. Systematic literature searches generated 50 initial statements that were revised iteratively following a modified Delphi approach using a web-based evaluation and voting tool. Statement development and evidence evaluation followed the AGREE (Appraisal of Guidelines for REsearch and Evaluation) and GRADE (Grading of Recommendations, Assessment, Development and Evaluation) guidelines. At the consensus conference, participants voted anonymously on all statements using a 6-point scale. Subsequent web-based voting evaluated recommendations for specific, individual quality indicators, safety indicators and mandatory endoscopy reporting fields. Consensus was defined a priori as agreement by 80% of participants. RESULTS: Consensus was reached on 23 recommendation statements addressing the following: ethics (statement 1: agreement 100%), facility standards and policies (statements 2 to 9: 90% to 100%), quality assurance (statements 10 to 13: 94% to 100%), training, education, competency and privileges (statements 14 to 19: 97% to 100%), endoscopy reporting standards (statements 20 and 21: 97% to 100%) and patient perceptions (statements 22 and 23: 100%). Additionally, 18 quality indicators (agreement 83% to 100%), 20 safety indicators (agreement 77% to 100%) and 23 recommended endoscopy-reporting elements (agreement 91% to 100%) were identified.
DISCUSSION: The consensus process identified a clear need for high-quality clinical and outcomes research to support quality improvement in the delivery of endoscopy services. CONCLUSIONS: The guidelines support quality improvement in endoscopy by providing explicit recommendations on systematic monitoring, assessment and modification of endoscopy service delivery to yield benefits for all patients affected by the practice of gastrointestinal endoscopy. PMID:22308578

  16. Usability Evaluation and Implementation of a Health Information Technology Dashboard of Evidence-Based Quality Indicators.

    PubMed

    Schall, Mark Christopher; Cullen, Laura; Pennathur, Priyadarshini; Chen, Howard; Burrell, Keith; Matthews, Grace

    2017-06-01

    Health information technology dashboards that integrate evidence-based quality indicators can efficiently and accurately display patient risk information to promote early intervention and improve overall quality of patient care. We describe the process of developing, evaluating, and implementing a dashboard designed to promote quality care through display of evidence-based quality indicators within an electronic health record. Clinician feedback was sought throughout the process. Usability evaluations were provided by three nurse pairs and one physician from medical-surgical areas. Task completion times, error rates, and ratings of system usability were collected to compare the use of quality indicators displayed on the dashboard to the indicators displayed in a conventional electronic health record across eight experimental scenarios. Participants rated the dashboard as "highly usable" following System Usability Scale (mean, 87.5 [SD, 9.6]) and Poststudy System Usability Questionnaire (mean, 1.7 [SD, 0.5]) criteria. Use of the dashboard led to reduced task completion times and error rates in comparison to the conventional electronic health record for quality indicator-related tasks. Clinician responses to the dashboard display capabilities were positive, and a multifaceted implementation plan has been used. Results suggest application of the dashboard in the care environment may lead to improved patient care.
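The System Usability Scale score reported above follows a standard scoring rule: ten 1-5 Likert items, odd-numbered items contribute (response − 1), even-numbered items contribute (5 − response), and the sum is scaled by 2.5 onto 0-100. A minimal sketch of that computation:

```python
def sus_score(responses):
    """System Usability Scale: ten 1-5 Likert responses -> a 0-100 score."""
    if len(responses) != 10 or any(not 1 <= r <= 5 for r in responses):
        raise ValueError("SUS needs ten responses on a 1-5 scale")
    # Odd-numbered items (index 0, 2, ...) score r - 1; even-numbered score 5 - r.
    total = sum((r - 1) if i % 2 == 0 else (5 - r)
                for i, r in enumerate(responses))
    return total * 2.5

print(sus_score([5, 1, 5, 1, 5, 1, 5, 1, 5, 1]))  # → 100.0 (best possible)
print(sus_score([4, 2, 4, 2, 4, 2, 4, 2, 4, 2]))  # → 75.0
```

A mean of 87.5, as the study reports, therefore sits well above the commonly cited ~68 average for SUS evaluations.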

  17. Whole wheat bread: Effect of bran fractions on dough and end-product quality

    USDA-ARS?s Scientific Manuscript database

    Consumption of whole-wheat-based products is encouraged due to their important nutritional elements, which are beneficial to human health. However, processing of whole-wheat-based products, such as whole-wheat bread, results in poor end-product quality. Bran was postulated as the major problem. In this stud...

  18. Education Quality in Kazakhstan in the Context of Competence-Based Approach

    ERIC Educational Resources Information Center

    Nabi, Yskak; Zhaxylykova, Nuriya Ermuhametovna; Kenbaeva, Gulmira Kaparbaevna; Tolbayev, Abdikerim; Bekbaeva, Zeinep Nusipovna

    2016-01-01

    The background of this paper is to present how the education system of Kazakhstan has evolved during the last 24 years of independence, highlighting the contemporary transformational processes. Our aim was to identify education quality in the context of a competence-based approach. Methods: analysis of references, interviewing, experimental work.…

  19. Quality Enhancement and Educational Professional Development

    ERIC Educational Resources Information Center

    Knight, Peter

    2006-01-01

    There is a strong international interest in the enhancement of teaching quality. Enhancement is a big job because teaching is an extensive activity. It is a complex job because learning to teach is not, mainly, a formal process: non-formal, practice-based learning is more significant. These two points, extensiveness and practice-based learning,…

  20. Practical Considerations for Optic Nerve Estimation in Telemedicine

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Karnowski, Thomas Paul; Aykac, Deniz; Chaum, Edward

    The projected increase in diabetes in the United States and worldwide has created a need for broad-based, inexpensive screening for diabetic retinopathy (DR), an eye disease which can lead to vision impairment. A telemedicine network with retina cameras and automated quality control, physiological feature location, and lesion / anomaly detection is a low-cost way of achieving broad-based screening. In this work we report on the effect of quality estimation on an optic nerve (ON) detection method with a confidence metric. We report on an improvement of the fusion technique using a data set from an ophthalmologist's practice, then show the results of the method as a function of image quality on a set of images from an on-line telemedicine network collected in Spring 2009 and another broad-based screening program. We show that the fusion method, combined with quality estimation processing, can improve detection performance and also provide a method for utilizing a physician-in-the-loop for images that may exceed the capabilities of automated processing.

  1. Engaging Clinical Nurses in Quality Improvement Projects.

    PubMed

    Moore, Susan; Stichler, Jaynelle F

    2015-10-01

    Clinical nurses have the knowledge and expertise required to provide efficient and proficient patient care. Time and knowledge deficits can prevent nurses from developing and implementing quality improvement or evidence-based practice projects. This article reviews a process for professional development of clinical nurses that helped them to define, implement, and analyze quality improvement or evidence-based practice projects. The purpose of this project was to educate advanced clinical nurses to manage a change project from inception to completion, using the Six Sigma DMAIC (Define, Measure, Analyze, Improve, Control) Change Acceleration Process as a framework. One-to-one mentoring and didactic in-services advanced the knowledge, appreciation, and practice of advanced practice clinicians who completed multiple change projects. The projects facilitated clinical practice changes, with improved patient outcomes; a unit cultural shift, with appreciation of quality improvement and evidence-based projects; and engagement with colleagues. Project outcomes were displayed in poster presentations at a hospital exposition for knowledge dissemination. Copyright 2015, SLACK Incorporated.

  2. Stepwise heating in Stille polycondensation toward no batch-to-batch variations in polymer solar cell performance.

    PubMed

    Lee, Sang Myeon; Park, Kwang Hyun; Jung, Seungon; Park, Hyesung; Yang, Changduk

    2018-05-14

    For a given π-conjugated polymer, the batch-to-batch variations in molecular weight (Mw) and polydispersity index (Ð) can lead to inconsistent process-dependent material properties and consequent performance variations in the device application. Using a stepwise-heating protocol in the Stille polycondensation in conjunction with optimized processing, we obtained an ultrahigh-quality PTB7 polymer having high Mw and very narrow Ð. The resulting ultrahigh-quality polymer-based solar cells demonstrate up to 9.97% power conversion efficiencies (PCEs), which is over 24% enhancement from the control devices fabricated with commercially available PTB7. Moreover, we observe almost negligible batch-to-batch variations in the overall PCE values from ultrahigh-quality polymer-based devices. The proposed stepwise polymerization demonstrates a facile and effective strategy for synthesizing high-quality semiconducting polymers that can significantly improve device yield in polymer-based solar cells, an important factor for the commercialization of organic solar cells, by mitigating device-to-device variations.

  3. Process evaluation to explore internal and external validity of the "Act in Case of Depression" care program in nursing homes.

    PubMed

    Leontjevas, Ruslan; Gerritsen, Debby L; Koopmans, Raymond T C M; Smalbrugge, Martin; Vernooij-Dassen, Myrra J F J

    2012-06-01

    A multidisciplinary, evidence-based care program to improve the management of depression in nursing home residents, "Act in case of Depression" (AiD), was implemented and tested using a stepped-wedge design in 23 nursing homes (NHs). Before the effect analyses, the objective was to evaluate AiD process data on sampling quality (recruitment and randomization, reach) and intervention quality (relevance and feasibility, extent to which AiD was performed), which can be used for understanding internal and external validity. In this article, a model is presented that divides process evaluation data into first- and second-order process data. Qualitative and quantitative data based on personal files of residents, interviews of nursing home professionals, and a research database were analyzed according to the following process evaluation components: sampling quality and intervention quality. The setting was nursing homes. The pattern of residents' informed consent rates differed for dementia special care units and somatic units during the study. The nursing home staff was satisfied with the AiD program and reported that the program was feasible and relevant. With the exception of the first screening step (nursing staff members using a short observer-based depression scale), AiD components were not performed fully by NH staff as prescribed in the AiD protocol. Although NH staff found the program relevant and feasible and was satisfied with the program content, individual AiD components may have different feasibility. The results on sampling quality implied that statistical analyses of AiD effectiveness should account for the type of unit, whereas the findings on intervention quality implied that, next to the type of unit, analyses should account for the extent to which individual AiD program components were performed. In general, our first-order process data evaluation confirmed the internal and external validity of the AiD trial, and this evaluation enabled further statistical fine-tuning.
The importance of evaluating the first-order process data before executing statistical effect analyses is thus underlined. Copyright © 2012 American Medical Directors Association, Inc. Published by Elsevier Inc. All rights reserved.

  4. Multisensor-based real-time quality monitoring by means of feature extraction, selection and modeling for Al alloy in arc welding

    NASA Astrophysics Data System (ADS)

    Zhang, Zhifen; Chen, Huabin; Xu, Yanling; Zhong, Jiyong; Lv, Na; Chen, Shanben

    2015-08-01

    Multisensory data fusion-based online welding quality monitoring has gained increasing attention in intelligent welding processes. This paper focuses on the automatic detection of typical welding defects for Al alloy in gas tungsten arc welding (GTAW) by means of analyzing arc spectrum, sound and voltage signals. Based on the developed algorithms in the time and frequency domains, 41 feature parameters were successively extracted from these signals to characterize the welding process and seam quality. Then, the proposed feature selection approach, i.e., a hybrid Fisher-based filter and wrapper, was successfully utilized to evaluate the sensitivity of each feature and reduce the feature dimensions. Finally, the optimal feature subset with 19 features was selected to obtain the highest accuracy, i.e., 94.72%, using the established classification model. This study provides a guideline for feature extraction, selection and dynamic modeling based on heterogeneous multisensory data to achieve a reliable online defect detection system in arc welding.
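The filter stage of such a pipeline can be illustrated with a Fisher-score ranking, which rates each feature by between-class scatter over within-class scatter. This is only a sketch of the general technique on synthetic data; the paper's exact hybrid filter-wrapper procedure is not reproduced here:

```python
import numpy as np

def fisher_scores(X, y):
    """Fisher score per feature: between-class scatter over within-class scatter."""
    num = np.zeros(X.shape[1])
    den = np.zeros(X.shape[1])
    overall_mean = X.mean(axis=0)
    for c in np.unique(y):
        Xc = X[y == c]
        num += len(Xc) * (Xc.mean(axis=0) - overall_mean) ** 2
        den += len(Xc) * Xc.var(axis=0)
    return num / (den + 1e-12)  # small epsilon guards against zero variance

# Synthetic stand-in for the 41 extracted features: feature 0 separates the
# "defect" class from the "normal" class, the remaining features are noise.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5))
y = (rng.random(200) > 0.5).astype(int)
X[y == 1, 0] += 3.0
ranking = np.argsort(fisher_scores(X, y))[::-1]
print(ranking[0])  # feature 0 ranks first
```

A wrapper stage would then search subsets of the top-ranked features against classifier accuracy, which is how a 19-of-41 subset like the paper's can be arrived at.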

  5. Analytical and regression models of glass rod drawing process

    NASA Astrophysics Data System (ADS)

    Alekseeva, L. B.

    2018-03-01

    The process of drawing glass rods (light guides) is being studied. The parameters of the process affecting the quality of the light guide have been determined. To solve the problem, mathematical models based on general equations of continuum mechanics are used. The conditions for the stable flow of the drawing process have been found, which are determined by the stability of the motion of the glass mass in the formation zone to small uncontrolled perturbations. The sensitivity of the formation zone to perturbations of the drawing speed and viscosity is estimated. Experimental models of the drawing process, based on the regression analysis methods, have been obtained. These models make it possible to customize a specific production process to obtain light guides of the required quality. They allow one to find the optimum combination of process parameters in the chosen area and to determine the required accuracy of maintaining them at a specified level.
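An empirical regression model of the kind described can be sketched as an ordinary least-squares fit of a quality measure against process parameters. All numbers below are invented for illustration, not taken from the study:

```python
import numpy as np

# Hypothetical drawing-speed / melt-viscosity records and a measured
# rod-diameter deviation, fitted with a linear regression model of the
# kind the paper describes (all values are illustrative only).
speed = np.array([10.0, 12.0, 14.0, 16.0, 18.0, 20.0])       # m/min
visc = np.array([900.0, 950.0, 880.0, 920.0, 870.0, 860.0])  # Pa*s
dev = np.array([4.1, 4.9, 5.8, 6.5, 7.6, 8.3])               # um

# Design matrix with intercept; solve the normal equations via lstsq.
A = np.column_stack([np.ones_like(speed), speed, visc])
coef, *_ = np.linalg.lstsq(A, dev, rcond=None)

def predicted_deviation(s, v):
    """Empirical model: deviation as a linear function of speed and viscosity."""
    return coef[0] + coef[1] * s + coef[2] * v

print(predicted_deviation(15.0, 900.0))  # deviation at a mid-range setting
```

Such a fitted model can then be inverted or searched to find parameter combinations that keep the predicted deviation within tolerance, which is the "customize a specific production process" use the abstract describes.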

  6. [Definition and stabilization of processes II. Clinical Processes in a Urology Department].

    PubMed

    Pascual, Carlos; Luján, Marcos; Mora, José Ramón; Diz, Manuel Ramón; Martín, Carlos; López, María Carmen

    2015-01-01

    New models in clinical management seek a clinical practice based on quality, efficacy and efficiency, avoiding variability and improvisation. In this paper we develop one of the most frequent clinical processes in our speciality, the process based on DRG 311, i.e., transurethral procedures without complications, and describe its components: stabilization form, clinical trajectory, cost calculation, and finally the process flowchart.

  7. Understanding patients' behavioral intentions: evidence from Iran's private hospitals industry.

    PubMed

    Zarei, Ehsan; Arab, Mohammad; Tabatabaei, Seyed Mahmoud Ghazi; Rashidian, Arash; Forushani, Abbas Rahimi; Khabiri, Roghayeh

    2014-01-01

    In the ever-increasing competitive market of the private hospital industry, creating a strong relationship with the customers that shapes patients' loyalty has been considered a key factor in obtaining market share. The purpose of this paper is to test a model of customer loyalty among patients of private hospitals in Iran. This cross-sectional study was carried out in Tehran, the capital of the Islamic Republic of Iran, in 2010. The study sample comprised 969 patients who were consecutively selected from eight private hospitals. The survey instrument was designed based on a review of the related literature and included 36 items. Data analysis was performed using structural equation modeling. For the service quality construct, three dimensions were extracted: process, interaction, and environment. Both process and interaction quality had significant effects on perceived value. Perceived value, along with process and interaction quality, was the most important antecedent of patient overall satisfaction. The direct effect of process and interaction quality on behavioral intentions was insignificant. Perceived value and patient overall satisfaction were the direct antecedents of patient behavioral intentions and the mediators between service quality and behavioral intentions. Environment quality of service delivery had no significant effect on perceived value, overall satisfaction, or behavioral intentions. Contrary to previous similar studies, the role of service quality was investigated not in a general sense, but in the form of three types of quality: quality of environment, quality of process, and quality of interaction.

  8. Dynamic neural network-based methods for compensation of nonlinear effects in multimode communication lines

    NASA Astrophysics Data System (ADS)

    Sidelnikov, O. S.; Redyuk, A. A.; Sygletos, S.

    2017-12-01

    We consider neural network-based schemes of digital signal processing. It is shown that the use of a dynamic neural network-based scheme of signal processing ensures an increase in the optical signal transmission quality in comparison with that provided by other methods for nonlinear distortion compensation.
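As a toy illustration of the idea only (the channel model, network size, and training settings below are invented, not taken from the paper), a small feedforward network can be trained by plain gradient descent to invert a memoryless cubic distortion of the kind that degrades transmission quality:

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy memoryless nonlinear channel (illustrative stand-in for fiber nonlinearity):
# received = sent + 0.3 * sent**3 + noise.
x = rng.uniform(-1.0, 1.0, 2000)              # transmitted symbols
y = x + 0.3 * x**3 + 0.01 * rng.normal(size=x.size)

# One-hidden-layer network trained to recover x from the received samples y.
X, Y = x.reshape(1, -1), y.reshape(1, -1)
W1 = rng.normal(scale=0.5, size=(16, 1)); b1 = np.zeros((16, 1))
W2 = rng.normal(scale=0.5, size=(1, 16)); b2 = np.zeros((1, 1))
lr = 0.05
for _ in range(2000):
    H = np.tanh(W1 @ Y + b1)                  # hidden layer
    err = (W2 @ H + b2) - X                   # recovery error
    gW2 = err @ H.T / X.size
    gb2 = err.mean(axis=1, keepdims=True)
    dH = (W2.T @ err) * (1.0 - H**2)          # backprop through tanh
    gW1 = dH @ Y.T / X.size
    gb1 = dH.mean(axis=1, keepdims=True)
    W1 -= lr * gW1; b1 -= lr * gb1; W2 -= lr * gW2; b2 -= lr * gb2

mse = float((((W2 @ np.tanh(W1 @ Y + b1) + b2) - X) ** 2).mean())
print(f"equalized MSE: {mse:.4f}")
```

A dynamic (recurrent or tapped-delay) network, as in the paper, would additionally take past samples as inputs so that channel memory can be compensated; the static sketch above only handles the instantaneous nonlinearity.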

  9. Total Quality Management in Higher Education.

    ERIC Educational Resources Information Center

    Sherr, Lawrence A.; Lozier, G. Gregory

    1991-01-01

    Total Quality Management, based on the theories of W. Edwards Deming and others, is a style of management using continuous process improvement characterized by mission and customer focus, a systematic approach to operations, vigorous development of human resources, long-term thinking, and a commitment to ensuring quality. The values espoused by this…

  10. A School-Based Quality Improvement Program.

    ERIC Educational Resources Information Center

    Rappaport, Lewis A.

    1993-01-01

    As one Brooklyn high school discovered, quality improvement begins with administrator commitment and participants' immersion in the literature. Other key elements include ongoing training of personnel involved in the quality-improvement process, tools such as the Deming Cycle (plan-do-check-act), voluntary and goal-oriented teamwork, and a worthy…

  11. Managing Change from a Quality Perspective.

    ERIC Educational Resources Information Center

    Snyder, Karolyn J.

    This paper presents findings of a study that examined the change process in 28 schools, with a focus on how principals went about transforming traditional school-work cultures into quality systems. The principals had participated in Managing Productive Schools (MPS), a comprehensive systems-approach program based on quality management concepts.…

  12. Improving patient safety through quality assurance.

    PubMed

    Raab, Stephen S

    2006-05-01

    Anatomic pathology laboratories use several quality assurance tools to detect errors and to improve patient safety. To review some of the anatomic pathology laboratory patient safety quality assurance practices. Different standards and measures in anatomic pathology quality assurance and patient safety were reviewed. Frequency of anatomic pathology laboratory error, variability in the use of specific quality assurance practices, and use of data for error reduction initiatives. Anatomic pathology error frequencies vary according to the detection method used. Based on secondary review, a College of American Pathologists Q-Probes study showed that the mean laboratory error frequency was 6.7%. A College of American Pathologists Q-Tracks study measuring frozen section discrepancy found that laboratories improved the longer they monitored and shared data. There is a lack of standardization across laboratories even for governmentally mandated quality assurance practices, such as cytologic-histologic correlation. The National Institutes of Health funded a consortium of laboratories to benchmark laboratory error frequencies, perform root cause analysis, and design error reduction initiatives, using quality assurance data. Based on the cytologic-histologic correlation process, these laboratories found an aggregate nongynecologic error frequency of 10.8%. Based on gynecologic error data, the laboratory at my institution used Toyota production system processes to lower gynecologic error frequencies and to improve Papanicolaou test metrics. Laboratory quality assurance practices have been used to track error rates, and laboratories are starting to use these data for error reduction initiatives.

  13. Quality risk management in pharmaceutical development.

    PubMed

    Charoo, Naseem Ahmad; Ali, Areeg Anwer

    2013-07-01

    The objective of the ICH Q8, Q9 and Q10 documents is the application of a systematic, science-based approach to formulation development for building quality into the product. There is always some uncertainty in new product development, and good risk management practice is essential for decreasing this uncertainty. In the quality-by-design paradigm, the product performance properties relevant to the patient are predefined in a target product profile (TPP). Together with prior knowledge and experience, the TPP helps in the identification of critical quality attributes (CQAs). An initial risk assessment that identifies risks to these CQAs provides the impetus for product development. Product and process are designed to gain knowledge about these risks, devise strategies to eliminate or mitigate them, and meet the objectives set in the TPP. By laying more emphasis on high-risk events, the protection level of the patient is increased. A scientifically driven process improves the transparency and reliability of the manufacturer. The focus on risk to the patient, together with a flexible development approach, saves invaluable resources, increases confidence in quality and reduces compliance risk. The knowledge acquired in analysing risks to CQAs permits construction of a meaningful design space. Within the boundaries of the design space, variation in critical material characteristics and process parameters must be managed in order to yield a product having the desired characteristics. Specifications based on product and process understanding are established such that the product will meet the specifications if tested. In this way, the product is amenable to real-time release, since specifications only confirm quality; they do not serve as a means of effective process control.

  14. Error Sources in Processing LIDAR Based Bridge Inspection

    NASA Astrophysics Data System (ADS)

    Bian, H.; Chen, S. E.; Liu, W.

    2017-09-01

    Bridge inspection is a critical task in infrastructure management and is facing unprecedented challenges after a series of bridge failures. The prevailing visual inspection is insufficient for providing reliable and quantitative bridge information, even though a systematic quality management framework has been built to ensure visual bridge inspection data quality and minimize errors during the inspection process. LiDAR-based remote sensing is recommended as an effective tool for overcoming some of the disadvantages of visual inspection. In order to evaluate the potential of applying this technology in bridge inspection, some of the error sources in LiDAR-based bridge inspection are analysed. The scanning-angle variance in field data collection and differences in algorithm design in scan-data processing are found to introduce errors into inspection results. Beyond studying the error sources, further consideration should be given to improving inspection data quality, and statistical analysis might be employed to evaluate an inspection operation process that contains a series of uncertain factors. Overall, the development of a reliable bridge inspection system requires not only improved data processing algorithms but also systematic measures to mitigate possible errors across the entire inspection workflow. If LiDAR or some other technology is accepted as a supplement to visual inspection, the current quality management framework will need to be modified or redesigned, and this is as urgent as refining the inspection techniques themselves.

  15. Evolving the ECSS Standards and their Use: Experience Based on Industrial Case Studies

    NASA Astrophysics Data System (ADS)

    Feldt, R.; Ahmad, E.; Raza, B.; Hult, E.; Nordebäck, T.

    2009-05-01

    This paper introduces two case studies conducted at two Swedish companies developing software for the space industry. The overall goal of the project is to evaluate if current use of ECSS is cost efficient and if there are ways to make the process leaner while maintaining quality. The case studies reported on here focused on how the ECSS standard was used by the companies and how that affected software development processes and software quality. This paper describes the results and recommendations based on identified challenges.

  16. Quality Audit in the Fastener Industry

    NASA Technical Reports Server (NTRS)

    Reagan, John R.

    1995-01-01

    Both the financial and quality communities rely on audits to verify customers' records. The financial community is highly structured around three categories of risk: INHERENT RISK, CONTROL RISK, and DETECTION RISK. Combined, the product of these three categories constitutes the AUDIT RISK. The financial community establishes CONTROL RISK based in large part on a systems-level understanding of the process flow. This systems-level understanding is best expressed in a flowchart. The quality community may be able to adopt this structure and thereby reduce cost while maintaining and enhancing quality. The quality community should attempt to flowchart the systems-level quality process before beginning substantive testing. This theory needs to be applied in several trial cases to prove or disprove this hypothesis.
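The multiplicative audit-risk model described above can be stated in a few lines. The component values here are invented for illustration; the standard model treats each component as a probability, so their product gives the combined risk:

```python
def audit_risk(inherent, control, detection):
    """AUDIT RISK as the product of inherent, control and detection risk."""
    for name, r in [("inherent", inherent), ("control", control),
                    ("detection", detection)]:
        if not 0.0 <= r <= 1.0:
            raise ValueError(f"{name} risk must lie in [0, 1], got {r}")
    return inherent * control * detection

# Illustrative figures: high inherent risk, moderate controls, strong testing.
print(round(audit_risk(0.8, 0.5, 0.1), 3))  # → 0.04
```

In practice the relation is often used in reverse: given a target audit risk and assessed inherent and control risks, the auditor solves for the detection risk that substantive testing must achieve.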

  17. Quality audit in the fastener industry

    NASA Astrophysics Data System (ADS)

    Reagan, John R.

    1995-09-01

    Both the financial and quality communities rely on audits to verify customers' records. The financial community is highly structured around three categories of risk: INHERENT RISK, CONTROL RISK, and DETECTION RISK. Combined, the product of these three categories constitutes the AUDIT RISK. The financial community establishes CONTROL RISK based in large part on a systems-level understanding of the process flow. This systems-level understanding is best expressed in a flowchart. The quality community may be able to adopt this structure and thereby reduce cost while maintaining and enhancing quality. The quality community should attempt to flowchart the systems-level quality process before beginning substantive testing. This theory needs to be applied in several trial cases to prove or disprove this hypothesis.

  18. The quality dilemma.

    PubMed

    Lucassen, Peter

    2007-06-01

    In the language and logic of the free market, providers of health care will have to demonstrate the quality of their work. However, in this setting quality is only interpreted in quantitative ways and consequently does not necessarily do justice to good physicians. Moreover, both outcome measures and process measures have serious drawbacks. An emphasis on outcome measures will disadvantage physicians working in deprived areas and doctors managing more complicated cases. Although process measures give the most direct information on the physician's performance, their evidence base is not always as straightforward as commonly supposed. Finally, measurement of quality indicators is complicated and time consuming. Physicians should be aware of the drawbacks of quality measurement and of the poor effects of quality improvement strategies on patient outcomes.

  19. Socially Shared Regulation in Collaborative Groups: An Analysis of the Interplay between Quality of Social Regulation and Group Processes

    ERIC Educational Resources Information Center

    Rogat, Toni Kempler; Linnenbrink-Garcia, Lisa

    2011-01-01

    This study extends prior research on both individual self-regulation and socially shared regulation during group learning to examine the range and quality of the cognitive and behavioral social regulatory sub-processes employed by six small collaborative groups of upper-elementary students (n = 24). Qualitative analyses were conducted based on…

  20. Total quality management: It works for aerospace information services

    NASA Technical Reports Server (NTRS)

    Erwin, James; Eberline, Carl; Colquitt, Wanda

    1993-01-01

    Today we are in the midst of information and 'total quality' revolutions. At the NASA STI Program's Center for AeroSpace Information (CASI), we are focused on using continuous improvement techniques to enrich today's services and products and to ensure that tomorrow's technology supports the TQM-based improvement of future STI program products and services. The Continuous Improvement Program at CASI is the foundation for Total Quality Management in products and services. The focus is customer-driven; its goal, to identify processes and procedures that can be improved and new technologies that can be integrated with the processes to gain efficiencies, provide effectiveness, and promote customer satisfaction. This Program seeks to establish quality through an iterative defect prevention approach that is based on the incorporation of standards and measurements into the processing cycle. Four projects are described that utilize cross-functional, problem-solving teams for identifying requirements and defining tasks and task standards, management participation, attention to critical processes, and measurable long-term goals. The implementation of these projects provides the customer with measurably improved access to information that is provided through several channels: the NASA STI Database, document requests for microfiche and hardcopy, and the Centralized Help Desk.

  1. Intelligent Systems Approaches to Product Sound Quality Analysis

    NASA Astrophysics Data System (ADS)

    Pietila, Glenn M.

    As a product market becomes more competitive, consumers become more discriminating in the way in which they differentiate between engineered products. The consumer often makes a purchasing decision based on the sound emitted from the product during operation by using the sound to judge quality or annoyance. Therefore, in recent years, many sound quality analysis tools have been developed to evaluate the consumer preference as it relates to a product sound and to quantify this preference based on objective measurements. This understanding can be used to direct a product design process in order to help differentiate the product from competitive products or to establish an impression on consumers regarding a product's quality or robustness. The sound quality process is typically a statistical tool that is used to model subjective preference, or merit score, based on objective measurements, or metrics. In this way, new product developments can be evaluated in an objective manner without the laborious process of gathering a sample population of consumers for subjective studies each time. The most common model used today is the Multiple Linear Regression (MLR), although recently non-linear Artificial Neural Network (ANN) approaches are gaining popularity. This dissertation will review publicly available published literature and present additional intelligent systems approaches that can be used to improve on the current sound quality process. The focus of this work is to address shortcomings in the current paired comparison approach to sound quality analysis. This research will propose a framework for an adaptive jury analysis approach as an alternative to the current Bradley-Terry model. The adaptive jury framework uses statistical hypothesis testing to focus on sound pairings that are most interesting and is expected to address some of the restrictions required by the Bradley-Terry model. 
It will also provide a more amicable framework for an intelligent systems approach. Next, an unsupervised jury clustering algorithm is used to identify and classify subgroups within a jury who have conflicting preferences. In addition, a nested Artificial Neural Network (ANN) architecture is developed to predict subjective preference based on objective sound quality metrics, in the presence of non-linear preferences. Finally, statistical decomposition and correlation algorithms are reviewed that can help an analyst establish a clear understanding of the variability of the product sounds used as inputs into the jury study and to identify correlations between preference scores and sound quality metrics in the presence of non-linearities.
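The Bradley-Terry model mentioned above estimates a latent "worth" for each product sound from paired-comparison jury data. As a minimal sketch (not the dissertation's implementation), the standard minorization-maximization fit can be written in a few lines; the win counts below are hypothetical:

```python
import numpy as np

def bradley_terry(wins, iters=200):
    """Fit Bradley-Terry worth parameters from a pairwise win matrix.
    wins[i, j] = number of times sound i was preferred over sound j."""
    n = wins.shape[0]
    p = np.ones(n)
    total = wins + wins.T  # comparisons made between each pair
    for _ in range(iters):
        for i in range(n):
            denom = sum(total[i, j] / (p[i] + p[j])
                        for j in range(n) if j != i)
            p[i] = wins[i].sum() / denom  # MM update
        p = p / p.sum()  # normalize worths to sum to 1
    return p

# Hypothetical jury data for 3 product sounds
wins = np.array([[0, 8, 9],
                 [2, 0, 6],
                 [1, 4, 0]])
worth = bradley_terry(wins)
```

Under the model, the probability that sound i beats sound j is worth[i] / (worth[i] + worth[j]), which is what the adaptive-jury hypothesis tests proposed above would operate on.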

  2. Geodynamics branch data base for main magnetic field analysis

    NASA Technical Reports Server (NTRS)

    Langel, Robert A.; Baldwin, R. T.

    1991-01-01

    The data sets used in geomagnetic field modeling at GSFC are described. Data are measured and obtained from a variety of sources. For clarity, data sets from different sources are categorized and processed separately. The data base is composed of magnetic observatory data, surface data, high-quality aeromagnetic data, high-quality total-intensity marine data, satellite data, and repeat data. These individual data categories are described in detail in a series of notebooks in the Geodynamics Branch, GSFC. This catalog reviews the original data sets, the processing history, and the final data sets available for each individual category of the data base and is to be used as a reference manual for the notebooks. Each data type used in geomagnetic field modeling has varying levels of complexity, requiring specialized processing routines for satellite and observatory data and two general routines for processing aeromagnetic, marine, land survey, and repeat data.

  3. [Quality control of laser imagers].

    PubMed

    Winkelbauer, F; Ammann, M; Gerstner, N; Imhof, H

    1992-11-01

    Multiformat imagers based on laser systems are used for documentation in an increasing number of investigations. The specific problems of quality control are explained, and the persistence of film processing is investigated in imager systems of different configuration, with (Machine 1: 3M-Laser-Imager-Plus M952 with connected 3M Film-Processor, 3M-Film IRB, X-Ray Chemical Mixer 3M-XPM, 3M-Developer and Fixer) or without (Machine 2: 3M-Laser-Imager-Plus M952 with separate DuPont-Cronex Film-processor, Kodak IR-Film, Kodak Automixer, Kodak-Developer and Fixer) a connected film processing unit. In our checks based on DIN 6868 and ONORM S 5240, we found persistence of film processing in the equipment with the directly adapted film processing unit according to DIN and ONORM. The check of film persistence demanded by DIN 6868 could therefore be performed at longer intervals for this equipment. Systems with conventional darkroom processing show clearly increased fluctuations by comparison, and hence the demanded daily control is essential to guarantee appropriate reaction and constant quality of documentation.

  4. Development of an Optimization Methodology for the Aluminum Alloy Wheel Casting Process

    NASA Astrophysics Data System (ADS)

    Duan, Jianglan; Reilly, Carl; Maijer, Daan M.; Cockcroft, Steve L.; Phillion, Andre B.

    2015-08-01

    An optimization methodology has been developed for the aluminum alloy wheel casting process. The methodology is focused on improving the timing of cooling processes in a die to achieve improved casting quality. This methodology utilizes (1) a casting process model, which was developed within the commercial finite element package, ABAQUS™—ABAQUS is a trademark of Dassault Systèmes; (2) a Python-based results extraction procedure; and (3) a numerical optimization module from the open-source Python library, SciPy. To achieve optimal casting quality, a set of constraints has been defined to ensure directional solidification, and an objective function, based on the solidification cooling rates, has been defined to either maximize, or target a specific, cooling rate. The methodology has been applied to a series of casting and die geometries with different cooling system configurations, including a 2-D axisymmetric wheel and die assembly generated from a full-scale prototype wheel. The results show that, with properly defined constraint and objective functions, solidification conditions can be improved and optimal cooling conditions can be achieved, leading to process productivity and product quality improvements.
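The optimization structure described above (a cooling-rate objective plus a directional-solidification constraint, solved with SciPy) can be sketched as follows. The linear surrogate model, its coefficients, the target rate, and the constraint form are all illustrative assumptions, not values from the paper:

```python
import numpy as np
from scipy.optimize import minimize

# Toy surrogate: cooling rates (K/s) at two zones (rim, hub) as a
# linear function of two cooling-channel settings t = (t1, t2).
# Coefficients are made up for illustration.
A = np.array([[0.8, 0.1],
              [0.2, 0.6]])

def cooling_rates(t):
    return A @ t

target = 5.0  # hypothetical desired cooling rate at the rim (zone 0)

def objective(t):
    # target a specific cooling rate at the rim
    return (cooling_rates(t)[0] - target) ** 2

constraints = [
    # directional solidification: rim must cool faster than the hub
    {"type": "ineq", "fun": lambda t: cooling_rates(t)[0] - cooling_rates(t)[1]},
]
bounds = [(0.0, 20.0), (0.0, 20.0)]

res = minimize(objective, x0=[1.0, 1.0], bounds=bounds,
               constraints=constraints)  # SLSQP is used for constrained problems
```

In the actual methodology the "surrogate" is the full ABAQUS thermal model queried through the results-extraction layer; the optimizer interface, however, has exactly this shape.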

  5. [Establishment of design space for production process of traditional Chinese medicine preparation].

    PubMed

    Xu, Bing; Shi, Xin-Yuan; Qiao, Yan-Jiang; Wu, Zhi-Sheng; Lin, Zhao-Zhou

    2013-03-01

    The philosophy of quality by design (QbD) is now leading the changes in the drug manufacturing mode from the conventional test-based approach to the science- and risk-based approach focusing on detailed research into and understanding of the production process. Along with the constant deepening of the understanding of the manufacturing process, the design space will be determined, and the emphasis of quality control will be shifted from the quality standards to the design space. Therefore, the establishment of the design space is a core step in the implementation of QbD, and it is of great importance to study the methods for building the design space. This essay proposes the concept of design space for the production process of traditional Chinese medicine (TCM) preparations, gives a systematic introduction to the concept of the design space, analyzes the feasibility and significance of building the design space in the production process of traditional Chinese medicine preparations, and proposes study approaches on the basis of examples that comply with the characteristics of traditional Chinese medicine preparations, as well as future study orientations.

  6. Scientific and Regulatory Considerations in Solid Oral Modified Release Drug Product Development.

    PubMed

    Li, Min; Sander, Sanna; Duan, John; Rosencrance, Susan; Miksinski, Sarah Pope; Yu, Lawrence; Seo, Paul; Rege, Bhagwant

    2016-11-01

    This review presents scientific and regulatory considerations for the development of solid oral modified release (MR) drug products. It includes a rationale for patient-focused development based on Quality-by-Design (QbD) principles. Product and process understanding of MR products includes identification and risk-based evaluation of critical material attributes (CMAs), critical process parameters (CPPs), and their impact on critical quality attributes (CQAs) that affect the clinical performance. The use of various biopharmaceutics tools that link the CQAs to a predictable and reproducible clinical performance for patient benefit is emphasized. Product and process understanding lead to a more comprehensive control strategy that can maintain product quality through the shelf life and the lifecycle of the drug product. The overall goal is to develop MR products that consistently meet the clinical objectives while mitigating the risks to patients by reducing the probability and increasing the detectability of CQA failures.
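The risk-based evaluation of CMAs and CPPs described above — "reducing the probability and increasing the detectability of CQA failures" — is commonly operationalized with FMEA-style risk priority numbers. The following is a generic sketch of that scoring, with hypothetical scores; the review does not prescribe this particular tool:

```python
def risk_priority(severity, occurrence, detectability):
    """FMEA-style risk priority number on 1-10 scales.
    Lower occurrence (better process understanding) or lower
    detectability score (better in-process detection) reduces risk."""
    return severity * occurrence * detectability

# Hypothetical CPP affecting a dissolution CQA of an MR product:
before = risk_priority(severity=8, occurrence=6, detectability=7)
# After process understanding reduces occurrence and an in-process
# control improves detection:
after = risk_priority(severity=8, occurrence=3, detectability=2)
```

Severity typically stays fixed (the clinical impact of the CQA failure does not change); the control strategy works on the other two factors, which is exactly the probability/detectability framing used in the abstract.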

  7. The quality of paper-based versus electronic nursing care plan in Australian aged care homes: A documentation audit study.

    PubMed

    Wang, Ning; Yu, Ping; Hailey, David

    2015-08-01

    The nursing care plan plays an essential role in supporting care provision in Australian aged care. The implementation of electronic systems in aged care homes was anticipated to improve documentation quality. Standardized nursing terminologies, developed to improve communication and advance the nursing profession, are not required in aged care practice. The language used by nurses in the nursing care plan and the effect of the electronic system on documentation quality in residential aged care need to be investigated. To describe documentation practice for the nursing care plan in Australian residential aged care homes and to compare the quantity and quality of documentation in paper-based and electronic nursing care plans. A nursing documentation audit was conducted in seven residential aged care homes in Australia. One hundred and eleven paper-based and 194 electronic nursing care plans, conveniently selected, were reviewed. The quantity of documentation in a care plan was determined by the number of phrases describing a resident problem and the number of goals and interventions. The quality of documentation was measured using 16 relevant questions in an instrument developed for the study. There was a tendency to omit 'nursing problem' or 'nursing diagnosis' in the nursing process by changing these terms (used in the paper-based care plan) to 'observation' in the electronic version. The electronic nursing care plan documented more signs and symptoms of resident problems and evaluation of care than the paper-based format (48.30 vs. 47.34 out of 60, P<0.01), but had a lower total mean quality score. The electronic care plan contained fewer problem or diagnosis statements, contributing factors and resident outcomes than the paper-based system (P<0.01). Both types of nursing care plan were weak in documenting measurable and concrete resident outcomes. 
The overall quality of documentation content for the nursing process was no better in the electronic system than in the paper-based system. Omission of the nursing problem or diagnosis from the nursing process may reflect a range of factors behind the practice that need to be understood. Further work is also needed on qualitative aspects of the nursing care plan, nurses' attitudes towards standardized terminologies, and the effect of different documentation practices on care quality and resident outcomes. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.

  8. Quality Measures for the Care of Patients with Insomnia

    PubMed Central

    Edinger, Jack D.; Buysse, Daniel J.; Deriy, Ludmila; Germain, Anne; Lewin, Daniel S.; Ong, Jason C.; Morgenthaler, Timothy I.

    2015-01-01

    The American Academy of Sleep Medicine (AASM) commissioned five Workgroups to develop quality measures to optimize management and care for patients with common sleep disorders including insomnia. Following the AASM process for quality measure development, this document describes measurement methods for two desirable outcomes of therapy, improving sleep quality or satisfaction, and improving daytime function, and for four processes important to achieving these goals. To achieve the outcome of improving sleep quality or satisfaction, pre- and post-treatment assessment of sleep quality or satisfaction and providing an evidence-based treatment are recommended. To realize the outcome of improving daytime functioning, pre- and post-treatment assessment of daytime functioning, provision of an evidence-based treatment, and assessment of treatment-related side effects are recommended. All insomnia measures described in this report were developed by the Insomnia Quality Measures Workgroup and approved by the AASM Quality Measures Task Force and the AASM Board of Directors. The AASM recommends the use of these measures as part of quality improvement programs that will enhance the ability to improve care for patients with insomnia. Citation: Edinger JD, Buysse DJ, Deriy L, Germain A, Lewin DS, Ong JC, Morgenthaler TI. Quality measures for the care of patients with insomnia. J Clin Sleep Med 2015;11(3):311–334. PMID:25700881

  9. Cost-effectiveness of an electronic clinical decision support system for improving quality of antenatal and childbirth care in rural Tanzania: an intervention study.

    PubMed

    Saronga, Happiness Pius; Duysburgh, Els; Massawe, Siriel; Dalaba, Maxwell Ayindenaba; Wangwe, Peter; Sukums, Felix; Leshabari, Melkizedeck; Blank, Antje; Sauerborn, Rainer; Loukanova, Svetla

    2017-08-07

    The QUALMAT project aimed at improving the quality of maternal and newborn care in selected health care facilities in three African countries. An electronic clinical decision support system was implemented to help providers comply with established standards in antenatal and childbirth care. Given that health care resources are limited and interventions differ in their potential impact on health and costs (efficiency), this study aimed at assessing the cost-effectiveness of the system in Tanzania. This was a quantitative pre- and post-intervention study involving 6 health centres in rural Tanzania. Cost information was collected from the health provider's perspective. Outcome information was collected through observation of the process of maternal care. Incremental cost-effectiveness ratios for antenatal and childbirth care were calculated with testing of four models in which the system was compared to the conventional paper-based approach to care. One-way sensitivity analysis was conducted to determine whether changes in process quality score and cost would impact the cost-effectiveness ratios. The economic cost of implementation was 167,318 USD, equivalent to 27,886 USD per health center and 43 USD per contact. The system improved antenatal process quality by 4.5% and childbirth care process quality by 23.3%; however, these improvements were not statistically significant. Base-case incremental cost-effectiveness ratios of the system were 2469 USD and 338 USD per 1% change in process quality for antenatal and childbirth care respectively. Cost-effectiveness of the system was sensitive to assumptions made on costs and outcomes. Although the system managed to marginally improve individual process quality variables, it did not have a significant improvement effect on the overall process quality of care in the short term.
A longer duration of usage of the electronic clinical decision support system and retention of staff are critical to the efficiency of the system and can reduce the invested resources. Realization of gains from the system requires effective implementation and an enabling healthcare system. Registered clinical trial at www.clinicaltrials.gov (NCT01409824). Registered May 2009.
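The incremental cost-effectiveness ratios (ICERs) reported above follow the standard definition: the difference in cost between the intervention and the comparator, divided by the difference in effect (here, percentage points of process quality). A minimal sketch, using illustrative figures consistent with the reported childbirth-care ICER:

```python
def icer(cost_new, cost_old, effect_new, effect_old):
    """Incremental cost-effectiveness ratio: extra cost per unit of
    extra effect (here, USD per 1% change in process quality)."""
    return (cost_new - cost_old) / (effect_new - effect_old)

# Illustrative back-calculation: a 23.3-point quality gain at the
# reported 338 USD per 1% implies an incremental cost of about
# 338 * 23.3 = 7875.4 USD (the paper reports costs in more detail).
ratio = icer(cost_new=7875.4, cost_old=0.0,
             effect_new=23.3, effect_old=0.0)
```

The one-way sensitivity analysis mentioned in the abstract amounts to recomputing this ratio while varying one cost or effect input at a time.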

  10. Real-time feedback control of twin-screw wet granulation based on image analysis.

    PubMed

    Madarász, Lajos; Nagy, Zsombor Kristóf; Hoffer, István; Szabó, Barnabás; Csontos, István; Pataki, Hajnalka; Démuth, Balázs; Szabó, Bence; Csorba, Kristóf; Marosi, György

    2018-06-04

    The present paper reports the first dynamic image analysis-based feedback control of a continuous twin-screw wet granulation process. Granulation of a blend of lactose and starch was selected as a model process. The size and size distribution of the obtained particles were successfully monitored by a process camera coupled with image analysis software developed by the authors. Validation of the developed system showed that the particle size analysis tool can determine the size of the granules with an error of less than 5 µm. The next step was to implement real-time feedback control of the process by controlling the liquid feeding rate of the pump through a PC, based on the particle size results determined in real time. After the establishment of the feedback control, the system could correct different real-life disturbances, creating a Process Analytically Controlled Technology (PACT), which guarantees real-time monitoring and control of the quality of the granules. In the event of drift or adverse trends in particle size, the system can automatically compensate for the effect of disturbances, ensuring proper product quality. This kind of quality assurance approach is especially important in the case of continuous pharmaceutical technologies. Copyright © 2018 Elsevier B.V. All rights reserved.
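The control loop described above — measured granule size drives the liquid feeding rate — can be sketched as a single proportional-control step. The gain, units, and limits below are illustrative assumptions; the paper does not disclose its control law in the abstract:

```python
def adjust_feed_rate(current_rate, measured_size_um, target_size_um,
                     gain=0.01, min_rate=0.0, max_rate=50.0):
    """One step of a proportional controller for the liquid pump:
    granules too small -> add more liquid; too large -> reduce it.
    Rates in mL/min and gain are hypothetical."""
    error = target_size_um - measured_size_um
    new_rate = current_rate + gain * error
    # clamp to the pump's physical operating range
    return max(min_rate, min(max_rate, new_rate))

# Granules measured 50 µm under target -> feed rate nudged upward
rate = adjust_feed_rate(current_rate=10.0,
                        measured_size_um=450.0,
                        target_size_um=500.0)
```

In the real PACT setup the measurement comes from the camera/image-analysis pipeline each cycle, and an integral term would typically be added to remove steady-state offset; the clamping step is what keeps a disturbance from driving the pump outside its range.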

  11. What Is Nursing Home Quality and How Is It Measured?

    PubMed Central

    Castle, Nicholas G.; Ferguson, Jamie C.

    2010-01-01

    Purpose: In this commentary, we examine nursing home quality and indicators that have been used to measure nursing home quality. Design and Methods: A brief review of the history of nursing home quality is presented that provides some context and insight into currently used quality indicators. Donabedian's structure, process, and outcome (SPO) model is used to frame the discussion. Current quality indicators and quality initiatives are discussed, including those included in the Facility Quality Indicator Profile Report, Nursing Home Compare, deficiency citations included as part of Medicare/Medicaid certification, and the Advancing Excellence Campaign. Results: Current quality indicators are presented as a mix of structural, process, and outcome measures, each of which has noted advantages and disadvantages. We speculate on steps that need to be taken in the future to address and potentially improve the quality of care provided by nursing homes, including report cards, pay for performance, market-based incentives, and policy developments in the certification process. Areas for future research are identified throughout the review. Implications: We conclude that improvements in nursing home quality have likely occurred, but improvements are still needed. PMID:20631035

  12. Vibronic spectra of the p-benzoquinone radical anion and cation: a matrix isolation and computational study.

    PubMed

    Piech, Krzysztof; Bally, Thomas; Ichino, Takatoshi; Stanton, John

    2014-02-07

    The electronic and vibrational absorption spectra of the radical anion and cation of p-benzoquinone (PBQ) in an Ar matrix between 500 and 40,000 cm⁻¹ are presented and discussed in detail. Of particular interest is the radical cation, which shows very unusual spectroscopic features that can be understood in terms of vibronic coupling between the ground and a very low-lying excited state. The infrared spectrum of PBQ˙⁺ exhibits a very conspicuous and complicated pattern of features above 1900 cm⁻¹ that is due to this electronic transition, and offers an unusually vivid demonstration of the effects of vibronic coupling in what would usually be a relatively simple region of the electromagnetic spectrum associated only with vibrational transitions. As expected, the intensities of most of the IR transitions leading to levels that couple the ground to the very low-lying first excited state of PBQ˙⁺ increase by large factors upon ionization, due to "intensity borrowing" from the D0 → D1 electronic transition. A notable exception is the antisymmetric C=O stretching vibration, which contributes significantly to the vibronic coupling, but has nevertheless quite small intensity in the cation spectrum. This surprising feature is rationalized on the basis of a simple perturbation analysis.

  13. The association between parental mental health and behavioral disorders in pre-school children

    PubMed Central

    Karimzadeh, Mansoureh; Rostami, Mohammad; Teymouri, Robab; Moazzen, Zahra; Tahmasebi, Siyamak

    2017-01-01

    Background and Aim Behavioral disorders among children reflect psychological problems of parents, as mental illness of either parent would increase the likelihood of mental disorder in the child. In view of the negative relationship between parents’ and children’s illness, the current study intended to determine the correlation between mental health of parents and behavioral disorders of pre-school children. Methods The present descriptive-correlational research studied 80 children registered at pre-school centers in Pardis Township, Tehran, Iran during 2014–2015 using convenience sampling. The research tools included the General Health Questionnaire (GHQ) and the Preschool Behavior Questionnaire (PBQ). The resulting data were analyzed using Pearson product-moment correlation coefficients and regression analysis in SPSS 21. Results The research results showed that there was a significant positive correlation between all dimensions of parents’ mental health and general behavioral disorders (p<0.001). The results of the regression analysis showed that parents’ depression was the first and the only predictive variable of behavioral disorders in children, with 26.8% predictive strength. Conclusion Given the strong relationship between children’s behavioral disorders and parents’ general health, and the significant role of parents’ depression in children’s behavioral disorders, it seems necessary to take measures to decrease the impact of parents’ disorders on children. PMID:28848622

  14. In silico validation and structure activity relationship study of a series of pyridine-3-carbohydrazide derivatives as potential anticonvulsants in generalized and partial seizures.

    PubMed

    Sinha, Reema; Sara, Udai Vir Singh; Khosa, Ratan Lal; Stables, James; Jain, Jainendra

    2013-06-01

    A series of twelve compounds (Compounds RNH1-RNH12) of acid hydrazones of pyridine-3-carbohydrazide or nicotinic acid hydrazide was synthesized and evaluated for anticonvulsant activity by the MES, scPTZ, minimal clonic seizure, and corneal kindling seizure tests. Neurotoxicity was also determined for these compounds by the rotarod test. Results showed that halogen substitution at the meta and para positions of the phenyl ring exhibited better protection than ortho substitution. Compounds RNH4 and RNH12 were found to be the active analogs, displaying 6 Hz ED50 values of 75.4 and 14.77 mg/kg, while the corresponding MES ED50 values were 113.4 and 29.3 mg/kg, respectively. In addition, compound RNH12 also showed an scPTZ ED50 of 54.2 mg/kg. In the series, compound RNH12, with a trifluoromethoxy-substituted phenyl ring, was the most potent analog, exhibiting protection in all four animal models of epilepsy. A molecular docking study also showed significant binding interactions of these two compounds with the 1OHV, 2A1H and 1PBQ receptors. Thus, N-[(meta or para halogen substituted) benzylidene] pyridine-3-carbohydrazides could be used as lead compounds in anticonvulsant drug design and discovery.

  15. Establishment of an unrelated umbilical cord blood bank qualification program: ensuring quality while meeting Food and Drug Administration vendor qualification requirements.

    PubMed

    Rabe, Fran; Kadidlo, Diane; Van Orsow, Lisa; McKenna, David

    2013-10-01

    Qualification of a cord blood bank (CBB) is a complex process that includes evaluation of multiple aspects of donor screening and testing, processing, accreditation and approval by professional cell therapy groups, and results of received cord blood units. The University of Minnesota Medical Center Cell Therapy Laboratory has established a CBB vendor qualification process to ensure the CBB meets established regulatory and quality requirements. The deployed qualification of CBBs is based on retrospective and prospective review of the CBB. Forty-one CBBs were evaluated retrospectively: seven CBBs were disqualified based on failed quality control (QC) results. Eight CBBs did not meet the criteria for retrospective qualification because fewer than 3 cord blood units were received and the CBB was not accredited. As of March 2012, three US and one non-US CBBs have been qualified prospectively. One CBB withdrew from the qualification process after successful completion of the comprehensive survey and subsequent failure of the provided QC unit to pass the minimum criteria. One CBB failed the prospective qualification process based on processing methods that were revealed during the paper portion of the evaluation. A CBB qualification process is necessary for a transplant center to manage the qualification of the large number of CBBs needed to support an umbilical cord blood transplantation program. A transplant center that has utilized cord blood for a number of years before implementation of a qualification process should use a retrospective qualification process along with a prospective process. © 2013 American Association of Blood Banks.

  16. Clinical process analysis and activity-based costing at a heart center.

    PubMed

    Ridderstolpe, Lisa; Johansson, Andreas; Skau, Tommy; Rutberg, Hans; Ahlfeldt, Hans

    2002-08-01

    Cost studies; productivity, efficiency, and quality-of-care measures; and the links between resources and patient outcomes are fundamental issues for hospital management today. This paper describes the implementation of a model for process analysis and activity-based costing (ABC)/management at a Heart Center in Sweden as a tool for administrative cost information, strategic decision-making, quality improvement, and cost reduction. A commercial software package (QPR) containing two interrelated parts, "ProcessGuide" and "CostControl," was used. All processes at the Heart Center were mapped and graphically outlined. Processes and activities such as health care procedures, research, and education were identified together with their causal relationship to costs and products/services. The construction of the ABC model in CostControl was time-consuming. However, after the ABC/management system was created, it opened the way for new possibilities including process and activity analysis, simulation, and price calculations. Cost analysis showed large variations in the cost obtained for individual patients undergoing coronary artery bypass grafting (CABG) surgery. We conclude that a process-based costing system is applicable and has the potential to be useful in hospital management.
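The core of activity-based costing as used above is allocating activity costs to individual cases according to each case's consumption of cost drivers. A minimal sketch with entirely hypothetical rates and driver quantities (not figures from the Heart Center study):

```python
# Cost per unit of each activity driver, hypothetical USD figures
activity_costs = {
    "surgery_hour": 1200.0,
    "icu_day": 3000.0,
    "lab_test": 40.0,
}

def case_cost(usage):
    """Total cost of one patient case.
    usage maps an activity driver to the quantity consumed."""
    return sum(activity_costs[activity] * qty
               for activity, qty in usage.items())

# A hypothetical CABG case: 4 surgery hours, 2 ICU days, 15 lab tests
cabg_case = {"surgery_hour": 4, "icu_day": 2, "lab_test": 15}
total = case_cost(cabg_case)
```

The "large variations in cost for individual CABG patients" reported above fall out of exactly this kind of per-case driver accounting, which a single average tariff would hide.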

  17. Dynamic Fuzzy Logic-Based Quality of Interaction within Blended-Learning: The Rare and Contemporary Dance Cases

    ERIC Educational Resources Information Center

    Dias, Sofia B.; Diniz, José A.; Hadjileontiadis, Leontios J.

    2014-01-01

    The combination of the process of pedagogical planning within the Blended (b-) learning environment with the users' quality of interaction ("QoI") with the Learning Management System (LMS) is explored here. The required "QoI" (both for professors and students) is estimated by adopting a fuzzy logic-based modeling approach,…

  18. Evaluation of the quality of the teaching-learning process in undergraduate courses in Nursing.

    PubMed

    González-Chordá, Víctor Manuel; Maciá-Soler, María Loreto

    2015-01-01

    to identify aspects of improvement of the quality of the teaching-learning process through the analysis of tools that evaluated the acquisition of skills by undergraduate students of Nursing. prospective longitudinal study conducted in a population of 60 second-year Nursing students based on registration data, from which quality indicators that evaluate the acquisition of skills were obtained, with descriptive and inferential analysis. nine items were identified and nine learning activities included in the assessment tools that did not reach the established quality indicators (p<0.05). There are statistically significant differences depending on the hospital and clinical practices unit (p<0.05). the analysis of the evaluation tools used in the subject "Nursing Care in Welfare Processes" of the analyzed university undergraduate course enabled the detection of the areas for improvement in the teaching-learning process. The challenge of education in nursing is to reach the best clinical research and educational results, in order to provide improvements to the quality of education and health care.

  19. The network of Shanghai Stroke Service System (4S): A public health-care web-based database using automatic extraction of electronic medical records.

    PubMed

    Dong, Yi; Fang, Kun; Wang, Xin; Chen, Shengdi; Liu, Xueyuan; Zhao, Yuwu; Guan, Yangtai; Cai, Dingfang; Li, Gang; Liu, Jianmin; Liu, Jianren; Zhuang, Jianhua; Wang, Panshi; Chen, Xin; Shen, Haipeng; Wang, David Z; Xian, Ying; Feng, Wuwei; Campbell, Bruce Cv; Parsons, Mark; Dong, Qiang

    2018-07-01

    Background Several stroke outcome and quality control projects have demonstrated success in stroke care quality improvement through structured processes. However, Chinese health-care systems are challenged by overwhelming numbers of patients, limited resources, and large regional disparities. Aim To improve quality of stroke care and address regional disparities through process improvement. Method and design The Shanghai Stroke Service System (4S) is established as a regional network for stroke care quality improvement in the Shanghai metropolitan area. The 4S registry uses a web-based database that automatically extracts data from structured electronic medical records. Site-specific education and training programs will be designed and administered according to baseline characteristics. Both acute reperfusion therapies, including thrombectomy and thrombolysis in the acute phase, and subsequent care are measured and monitored with feedback. The primary outcome is the difference in quality metrics (including rate of thrombolysis in acute stroke and key performance indicators in secondary prevention) between baseline and post-intervention. Conclusions The 4S system is a regional stroke network that monitors ongoing stroke care quality in Shanghai. This project will provide the opportunity to evaluate the spectrum of acute stroke care and design quality improvement processes for better stroke care. A regional stroke network model for quality improvement will be explored and might be expanded to other large cities in China. Clinical Trial Registration-URL http://www.clinicaltrials.gov . Unique identifier: NCT02735226.

  20. Method and apparatus for in-process sensing of manufacturing quality

    DOEpatents

    Hartman, Daniel A [Santa Fe, NM; Dave, Vivek R [Los Alamos, NM; Cola, Mark J [Santa Fe, NM; Carpenter, Robert W [Los Alamos, NM

    2005-02-22

    A method for determining the quality of an examined weld joint comprising the steps of providing acoustical data from the examined weld joint, and performing a neural network operation on the acoustical data to determine the quality of the examined weld joint produced by a friction weld process. The neural network may be trained by the steps of providing acoustical data and observable data from at least one test weld joint, and training the neural network based on the acoustical data and observable data to form a trained neural network so that the trained neural network is capable of determining the quality of an examined weld joint based on acoustical data from the examined weld joint. In addition, an apparatus having a housing, acoustical sensors mounted therein, and means for mounting the housing on a friction weld device so that the acoustical sensors do not contact the weld joint. The apparatus may sample the acoustical data necessary for the neural network to determine the quality of a weld joint.
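The patent above claims a neural network trained on acoustical data plus observable (good/defective) labels. As a generic sketch of that training step, the following fits a tiny one-hidden-layer network on synthetic acoustic features; the features, data, and architecture are fabricated for illustration and are not the patent's method:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for acoustic features of a friction weld
# (e.g. RMS energy, spectral centroid); 1 = good joint, 0 = defective.
n = 200
X = np.vstack([rng.normal([1.0, 0.3], 0.1, size=(n, 2)),   # good welds
               rng.normal([0.4, 0.8], 0.1, size=(n, 2))])  # defective welds
y = np.array([1] * n + [0] * n)

# One hidden layer (tanh), sigmoid output, plain gradient descent
W1 = rng.normal(0, 0.5, (2, 8)); b1 = np.zeros(8)
W2 = rng.normal(0, 0.5, (8, 1)); b2 = np.zeros(1)
sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))

lr = 0.5
for _ in range(1000):
    h = np.tanh(X @ W1 + b1)
    p = sigmoid(h @ W2 + b2).ravel()
    grad_out = (p - y) / len(y)                     # log-loss gradient
    grad_h = grad_out[:, None] @ W2.T * (1 - h**2)  # backprop through tanh
    W2 -= lr * h.T @ grad_out[:, None]; b2 -= lr * grad_out.sum()
    W1 -= lr * X.T @ grad_h;            b1 -= lr * grad_h.sum(axis=0)

# Training accuracy on the (well-separated) synthetic data
h = np.tanh(X @ W1 + b1)
p = sigmoid(h @ W2 + b2).ravel()
accuracy = float(((p > 0.5) == y).mean())
```

In the claimed apparatus, the trained network would then score acoustical data from a new weld in process, without the sensors contacting the joint.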

  1. Method and Apparatus for In-Process Sensing of Manufacturing Quality

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hartman, D.A.; Dave, V.R.; Cola, M.J.

    2005-02-22

    A method for determining the quality of an examined weld joint comprising the steps of providing acoustical data from the examined weld joint, and performing a neural network operation on the acoustical data to determine the quality of the examined weld joint produced by a friction weld process. The neural network may be trained by the steps of providing acoustical data and observable data from at least one test weld joint, and training the neural network based on the acoustical data and observable data to form a trained neural network, so that the trained neural network is capable of determining the quality of an examined weld joint based on acoustical data from the examined weld joint. In addition, an apparatus is provided having a housing, acoustical sensors mounted therein, and means for mounting the housing on a friction weld device so that the acoustical sensors do not contact the weld joint. The apparatus may sample the acoustical data necessary for the neural network to determine the quality of a weld joint.

  2. A method for tailoring the information content of a software process model

    NASA Technical Reports Server (NTRS)

    Perkins, Sharon; Arend, Mark B.

    1990-01-01

    A framework is defined for a general method of selecting a necessary and sufficient subset of a general software life cycle's information products to support a new software development process. Procedures for characterizing problem domains in general and mapping them to a tailored set of life cycle processes and products are presented. An overview of the method is shown using the following steps: (1) During the problem concept definition phase, perform standardized interviews and dialogs between developer and user, and between user and customer; (2) Generate a quality needs profile of the software to be developed, based on information gathered in step 1; (3) Translate the quality needs profile into a profile of quality criteria that must be met by the software to satisfy the quality needs; (4) Map the quality criteria to a set of accepted processes and products for achieving each criterion; (5) Select the information products which match or support the accepted processes and products of step 4; and (6) Select the design methodology which produces the information products selected in step 5.
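    Steps 2 through 5 of this method are essentially a chain of table lookups: quality needs map to quality criteria, which map to accepted processes and products. A minimal sketch of that chain follows; every need, criterion, and product name here is a hypothetical example, not taken from the paper.

    ```python
    # Hypothetical lookup tables illustrating steps 2-5 of the tailoring method.
    NEED_TO_CRITERIA = {
        "high reliability": ["fault tolerance", "testability"],
        "ease of change": ["modularity", "traceability"],
    }
    CRITERION_TO_PRODUCTS = {
        "fault tolerance": ["failure-mode analysis report"],
        "testability": ["unit test plan"],
        "modularity": ["module interface spec"],
        "traceability": ["requirements trace matrix"],
    }

    def tailor(quality_needs):
        """Map a quality-needs profile (step 2) through quality criteria
        (step 3/4) to the set of supporting information products (step 5)."""
        products = set()
        for need in quality_needs:
            for criterion in NEED_TO_CRITERIA.get(need, []):
                products.update(CRITERION_TO_PRODUCTS.get(criterion, []))
        return sorted(products)

    print(tailor(["high reliability"]))
    # → ['failure-mode analysis report', 'unit test plan']
    ```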

  4. Quality by Design (QbD)-Based Process Development for Purification of a Biotherapeutic.

    PubMed

    Rathore, Anurag S

    2016-05-01

    Quality by Design (QbD) is currently receiving increased attention from the pharmaceutical community. As a result, most major biotech manufacturers are in varying stages of implementing QbD. Here, I present a case study that illustrates the step-by-step development using QbD of a purification process for the production of a biosimilar product: granulocyte colony-stimulating factor (GCSF). I also highlight and discuss the advantages that QbD-based process development offers over traditional approaches. The case study is intended to help those who wish to implement QbD towards the development and commercialization of biotech products. Copyright © 2016 Elsevier Ltd. All rights reserved.

  5. Regulatory mechanisms of RNA function: emerging roles of DNA repair enzymes.

    PubMed

    Jobert, Laure; Nilsen, Hilde

    2014-07-01

    The acquisition of an appropriate set of chemical modifications is required to establish the correct structure of RNA molecules and is essential for their function. Modification of RNA bases affects RNA maturation, RNA processing, RNA quality control, and protein translation. Some RNA modifications are directly involved in the regulation of these processes. RNA epigenetics is emerging as a mechanism to achieve dynamic regulation of RNA function. Other modifications may prevent degradation or serve as a signal for it. All types of RNA species are subject to processing or degradation, and numerous cellular mechanisms are involved. Unexpectedly, several studies during the last decade have established a connection between DNA and RNA surveillance mechanisms in eukaryotes. Several proteins that respond to DNA damage, either to process or to signal the presence of damaged DNA, have been shown to participate in RNA quality control, turnover or processing. Some enzymes that repair DNA damage may also process modified RNA substrates. In this review, we give an overview of the DNA repair proteins that function in RNA metabolism. We also discuss the roles of two base excision repair enzymes, SMUG1 and APE1, in RNA quality control.

  6. PHYTOASSESSMENT OF ESTUARINE SEDIMENTS

    EPA Science Inventory

    Most sediment quality assessments and quality guidelines are based on the laboratory response of single animal species and benthic animal community composition. The role of plants in this hazard assessment process is poorly understood despite the fact that plant-dominated habitat...

  7. Annual Quality Assurance Conference Abstracts by Barbara Marshik

    EPA Pesticide Factsheets

    25th Annual Quality Assurance Conference. Abstracts: Material and Process Conditions for Successful Use of Extractive Sampling Techniques and Certification Methods Errors in the Analysis of NMHC and VOCs in CNG-Based Engine Emissions by Barbara Marshik

  8. Benefits of a comprehensive quality program for cryopreserved PBMC covering 28 clinical trials sites utilizing an integrated, analytical web-based portal

    PubMed Central

    Ducar, Constance; Smith, Donna; Pinzon, Cris; Stirewalt, Michael; Cooper, Cristine; McElrath, M. Juliana; Hural, John

    2014-01-01

    The HIV Vaccine Trials Network (HVTN) is a global network of 28 clinical trial sites dedicated to identifying an effective HIV vaccine. Cryopreservation of high-quality peripheral blood mononuclear cells (PBMC) is critical for the assessment of vaccine-induced cellular immune functions. The HVTN PBMC Quality Management Program is designed to ensure viable PBMC are processed, stored and shipped for clinical trial assays from all HVTN clinical trial sites. The program has evolved by developing and incorporating best practices for laboratory and specimen quality and implementing automated, web-based tools. These tools allow the site-affiliated processing laboratories and the central Laboratory Operations Unit to rapidly collect, analyze and report PBMC quality data. The HVTN PBMC Quality Management Program includes five key components: 1) Laboratory Assessment, 2) PBMC Training and Certification, 3) Internal Quality Control, 4) External Quality Control (EQC), and 5) Assay Specimen Quality Control. Fresh PBMC processing data is uploaded from each clinical site processing laboratory to a central HVTN Statistical and Data Management Center database for access and analysis on a web portal. Samples are thawed at a central laboratory for assay or specimen quality control, and sample quality data is uploaded directly to the database by the central laboratory. Four-year cumulative data covering 23,477 blood draws reveal an average fresh PBMC yield of 1.45×10⁶ ± 0.48 cells per milliliter of usable whole blood; 95% of samples were within the acceptable range for fresh cell yield of 0.8–3.2×10⁶ cells/ml of usable blood. Prior to full implementation of the HVTN PBMC Quality Management Program, the 2007 EQC evaluations from 10 international sites showed a mean day-2 thawed viability of 83.1% and recovery of 67.5%. Since then, four-year cumulative data covering 3338 specimens used in immunologic assays show that 99.88% had acceptable viabilities (>66%) for use in cellular assays (mean, 91.46% ± 4.5%), and 96.2% had acceptable recoveries (50%–130%), with a mean recovery of 85.8% ± 19.12% of the originally cryopreserved cells. EQC testing revealed that since August 2009, failed recoveries dropped from 4.1% to 1.6% and failed viabilities dropped from 1.0% to 0.3%. The HVTN PBMC quality program provides for laboratory assessment, training and tools for identifying problems, implementing corrective action and monitoring for improvements. These data support the benefits of implementing a comprehensive, web-based PBMC quality program for large clinical trials networks. PMID:24709391
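    The acceptance criteria quoted in this abstract (fresh yield 0.8–3.2×10⁶ cells per ml of usable blood, thaw viability above 66%, recovery between 50% and 130%) lend themselves to a simple automated check. The function below is a sketch of such a check; the thresholds come from the abstract, but the function itself is hypothetical, not part of the HVTN software.

    ```python
    def pbmc_qc(yield_per_ml, viability_pct, recovery_pct):
        """Check one PBMC specimen against the acceptance ranges quoted
        in the HVTN program: yield 0.8-3.2e6 cells/ml of usable blood,
        thaw viability > 66%, recovery 50-130%."""
        checks = {
            "yield": 0.8e6 <= yield_per_ml <= 3.2e6,
            "viability": viability_pct > 66.0,
            "recovery": 50.0 <= recovery_pct <= 130.0,
        }
        return all(checks.values()), checks

    # The program's reported mean values pass all three checks.
    ok, detail = pbmc_qc(1.45e6, 91.46, 85.8)
    print(ok)  # → True
    ```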

  9. Additive Manufacturing of Single-Crystal Superalloy CMSX-4 Through Scanning Laser Epitaxy: Computational Modeling, Experimental Process Development, and Process Parameter Optimization

    NASA Astrophysics Data System (ADS)

    Basak, Amrita; Acharya, Ranadip; Das, Suman

    2016-08-01

    This paper focuses on additive manufacturing (AM) of single-crystal (SX) nickel-based superalloy CMSX-4 through scanning laser epitaxy (SLE). SLE, a powder bed fusion-based AM process, was explored for the purpose of producing crack-free, dense deposits of CMSX-4 on top of similar-chemistry investment-cast substrates. Optical microscopy and scanning electron microscopy (SEM) investigations revealed the presence of dendritic microstructures that consisted of fine γ' precipitates within the γ matrix in the deposit region. Computational fluid dynamics (CFD)-based process modeling, statistical design of experiments (DoE), and microstructural characterization techniques were combined to produce metallurgically bonded single-crystal deposits of more than 500 μm height in a single pass along the entire length of the substrate. A customized quantitative metallography-based image analysis technique was employed for automatic extraction of various deposit quality metrics from the digital cross-sectional micrographs. The processing parameters were varied, and optimal processing windows were identified to obtain good-quality deposits. The results reported here represent one of the few successes obtained in producing single-crystal epitaxial deposits through a powder bed fusion-based metal AM process and thus demonstrate the potential of SLE to repair and manufacture single-crystal hot-section components of gas turbine systems from nickel-based superalloy powders.

  10. A quality-based cost model for new electronic systems and products

    NASA Astrophysics Data System (ADS)

    Shina, Sammy G.; Saigal, Anil

    1998-04-01

    This article outlines a method for developing a quality-based cost model for the design of new electronic systems and products. The model incorporates a methodology for determining a cost-effective design margin allocation for electronic products and systems and its impact on manufacturing quality and cost. A spreadsheet-based cost estimating tool was developed to help implement this methodology in order for the system design engineers to quickly estimate the effect of design decisions and tradeoffs on the quality and cost of new products. The tool was developed with automatic spreadsheet connectivity to current process capability and with provisions to consider the impact of capital equipment and tooling purchases to reduce the product cost.

  11. Improving the Quality of Services in Residential Treatment Facilities: A Strength-Based Consultative Review Process

    ERIC Educational Resources Information Center

    Pavkov, Thomas W.; Lourie, Ira S.; Hug, Richard W.; Negash, Sesen

    2010-01-01

    This descriptive case study reports on the positive impact of a consultative review methodology used to conduct quality assurance reviews as part of the Residential Treatment Center Evaluation Project. The study details improvement in the quality of services provided to youth in unmonitored residential treatment facilities. Improvements were…

  12. The Pathways from Parents' Marital Quality to Adolescents' School Adjustment in South Korea

    ERIC Educational Resources Information Center

    Jeong, Yu-Jin; Chun, Young-Ju

    2010-01-01

    This study tested the hypothesized pathways from parents' marital quality to Korean adolescents' school adjustment through the perception of self and parent-child relations. Based on previous literature and two major family theories, the authors hypothesized a path model to explain the process of how parents' marital quality influenced school…

  13. Review of the Primary National Ambient Air Quality Standards for Nitrogen Dioxide: Risk and Exposure Assessment Planning Document

    EPA Science Inventory

    The U.S. Environmental Protection Agency (EPA) is conducting a review of the air quality criteria and the primary (health-based) national ambient air quality standards (NAAQS) for nitrogen dioxide (NO2). The major phases of the process for reviewing NAAQS include the following: (...

  14. Review of the Primary National Ambient Air Quality Standard for Sulfur Oxides: Risk and Exposure Assessment Planning Document

    EPA Science Inventory

    The U.S. Environmental Protection Agency (EPA) is conducting a review of the air quality criteria and the primary (health-based) national ambient air quality standards (NAAQS) for sulfur oxides (SOx). The major phases of the process for reviewing NAAQS include the following: (1) ...

  15. BatMass: a Java Software Platform for LC-MS Data Visualization in Proteomics and Metabolomics.

    PubMed

    Avtonomov, Dmitry M; Raskind, Alexander; Nesvizhskii, Alexey I

    2016-08-05

    Mass spectrometry (MS) coupled to liquid chromatography (LC) is a commonly used technique in metabolomic and proteomic research. As the size and complexity of LC-MS-based experiments grow, it becomes increasingly difficult to perform quality control of both raw data and processing results. In a practical setting, quality control steps for raw LC-MS data are often overlooked, and assessment of an experiment's success is based on some derived metrics such as "the number of identified compounds". The human brain interprets visual data much better than plain text, hence the saying "a picture is worth a thousand words". Here, we present the BatMass software package, which allows for performing quick quality control of raw LC-MS data through its fast visualization capabilities. It also serves as a testbed for developers of LC-MS data processing algorithms by providing a data access library for open mass spectrometry file formats and a means of visually mapping processing results back to the original data. We illustrate the utility of BatMass with several use cases of quality control and data exploration.

  16. BatMass: a Java software platform for LC/MS data visualization in proteomics and metabolomics

    PubMed Central

    Avtonomov, Dmitry; Raskind, Alexander; Nesvizhskii, Alexey I.

    2017-01-01

    Mass spectrometry (MS) coupled to liquid chromatography (LC) is a commonly used technique in metabolomic and proteomic research. As the size and complexity of LC/MS-based experiments grow, it becomes increasingly difficult to perform quality control of both raw data and processing results. In a practical setting, quality control steps for raw LC/MS data are often overlooked, and assessment of an experiment's success is based on some derived metrics such as “the number of identified compounds”. The human brain interprets visual data much better than plain text, hence the saying “a picture is worth a thousand words”. Here we present the BatMass software package, which allows users to perform quick quality control of raw LC/MS data through its fast visualization capabilities. It also serves as a testbed for developers of LC/MS data processing algorithms by providing a data access library for open mass spectrometry file formats and a means of visually mapping processing results back to the original data. We illustrate the utility of BatMass with several use cases of quality control and data exploration. PMID:27306858

  17. Developing students' worksheets applying soft skill-based scientific approach for improving building engineering students' competencies in vocational high schools

    NASA Astrophysics Data System (ADS)

    Suparno, Sudomo, Rahardjo, Boedi

    2017-09-01

    Experts and practitioners agree that the quality of vocational high schools needs to be greatly improved. Many construction services have voiced their dissatisfaction with today's low-quality vocational high school graduates. The low quality of graduates is closely related to the quality of the teaching and learning process, particularly teaching materials. In an effort to improve the quality of vocational high school education, the government has implemented Curriculum 2013 (K13) and supplied teaching materials. However, according to the monitoring and evaluation conducted by the Directorate of Vocational High School, Directorate General of Secondary Education (2014), the provision of tasks for students in the teaching materials was totally inadequate. Therefore, to enhance the quality and results of the instructional process, students' worksheets should be provided that can stimulate and improve students' problem-solving skills and soft skills. For the worksheets to meet academic requirements, their development needs to follow an innovative learning approach, namely the soft skill-based scientific approach.

  18. Innovating for quality and value: Utilizing national quality improvement programs to identify opportunities for responsible surgical innovation.

    PubMed

    Woo, Russell K; Skarsgard, Erik D

    2015-06-01

    Innovation in surgical techniques, technology, and care processes is essential for improving the care and outcomes of surgical patients, including children. The time and cost associated with surgical innovation can be significant, and unless it leads to improvements in outcome at equivalent or lower costs, it adds little or no value from the perspective of the patients and decreases the overall resources available to our already financially constrained healthcare system. The emergence of a safety and quality mandate in surgery, and the development of the American College of Surgeons National Surgical Quality Improvement Program (NSQIP), allow needs-based surgical care innovation that leads to value-based improvement in care. In addition to general and procedure-specific clinical outcomes, surgeons should consider the measurement of quality from the patients' perspective. To this end, the integration of validated Patient Reported Outcome Measures (PROMs) into actionable, benchmarked institutional outcomes reporting has the potential to facilitate quality improvement in process, treatment and technology that optimizes value for our patients and health system. Copyright © 2015 Elsevier Inc. All rights reserved.

  19. Rough case-based reasoning system for continuous casting

    NASA Astrophysics Data System (ADS)

    Su, Wenbin; Lei, Zhufeng

    2018-04-01

    Continuous casting occupies a pivotal position in the iron and steel industry. Rough set theory and case-based reasoning (CBR) were combined in the research and implementation of quality assurance for continuous casting billets, to improve the efficiency and accuracy of determining the processing parameters. The object-oriented method was applied to represent the continuous casting cases. The weights of the attributes were calculated by an algorithm based on rough set theory, and a retrieval mechanism for the continuous casting cases was designed. Test cases were used to exercise the retrieval mechanism; analysis of the results revealed how the retrieval attributes influence the determination of the processing parameters. A comprehensive evaluation model was established using attribute recognition theory. According to the features of the defects, different methods were adopted to describe the quality condition of the continuous casting billet. With this system, knowledge is not only inherited but also applied to adjust the processing parameters through case-based reasoning, so as to ensure the quality of the continuous casting and improve the intelligence level of the continuous casting process.
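    The retrieval step of such a CBR system is, at its core, a weighted similarity search over past cases. The sketch below illustrates that step only; in the paper the attribute weights come from a rough-set algorithm, whereas here they are illustrative constants, and the attribute names and cases are invented.

    ```python
    def retrieve(case_base, query, weights):
        """Return the stored casting case most similar to the query,
        using attribute weights (here fixed; rough-set-derived in the paper).
        Attributes are assumed normalized to a 0-1 scale."""
        def similarity(case):
            return sum(
                w * (1.0 - abs(case[attr] - query[attr]))
                for attr, w in weights.items()
            )
        return max(case_base, key=similarity)

    # Hypothetical past cases, each carrying its known-good process parameters.
    cases = [
        {"carbon": 0.2, "casting_speed": 0.5, "mold_temp": 0.6, "params": "A"},
        {"carbon": 0.8, "casting_speed": 0.3, "mold_temp": 0.4, "params": "B"},
    ]
    weights = {"carbon": 0.5, "casting_speed": 0.3, "mold_temp": 0.2}
    query = {"carbon": 0.75, "casting_speed": 0.35, "mold_temp": 0.45}
    print(retrieve(cases, query, weights)["params"])  # → B
    ```

    The retrieved case's parameters would then be adapted to the new situation, which is the "reuse and revise" portion of the CBR cycle the abstract alludes to.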

  20. DataFed: A Federated Data System for Visualization and Analysis of Spatio-Temporal Air Quality Data

    NASA Astrophysics Data System (ADS)

    Husar, R. B.; Hoijarvi, K.

    2017-12-01

    DataFed is a distributed web-services-based computing environment for accessing, processing, and visualizing atmospheric data in support of air quality science and management. The flexible, adaptive environment facilitates the access and flow of atmospheric data from provider to users by enabling the creation of user-driven data processing/visualization applications. DataFed 'wrapper' components non-intrusively wrap heterogeneous, distributed datasets for access by standards-based GIS web services. The mediator components (also web services) map the heterogeneous data into a spatio-temporal data model. Chained web services provide homogeneous data views (e.g., geospatial views, time views) using a global multi-dimensional data model. In addition to data access and rendering, the data processing component services can be programmed for filtering, aggregation, and fusion of multidimensional data. Complete applications are written in a custom-made data-flow language. Currently, the federated data pool consists of over 50 datasets originating from globally distributed data providers delivering surface-based air quality measurements, satellite observations, and emissions data, as well as regional and global-scale air quality models. The web browser-based user interface allows point-and-click navigation and browsing of the XYZT multi-dimensional data space. The key applications of DataFed are exploring spatial patterns of pollutants and their seasonal, weekly, and diurnal cycles and frequency distributions for exploratory air quality research. Since 2008, DataFed has been used to support EPA in the implementation of the Exceptional Event Rule. The data system is also used at universities in the US, Europe and Asia.

  1. [Risk management--a new aspect of quality assessment in intensive care medicine: first results of an analysis of the DIVI's interdisciplinary quality assessment research group].

    PubMed

    Stiletto, R; Röthke, M; Schäfer, E; Lefering, R; Waydhas, Ch

    2006-10-01

    Patient safety has become one of the major aspects of clinical management in recent years. Research has focused chiefly on malpractice. In contrast to process analysis in non-medical fields, the analysis of errors during in-patient treatment has been neglected. Patient risk management can be defined as a structured procedure in a clinical unit with the aim of reducing harmful events. A risk point model was created based on a Delphi process and founded on the DIVI data register. The risk point model was evaluated in clinically working ICU departments participating in the register database. The results of the risk point evaluation will be integrated in the next database update. This might be a step toward improving the reliability of the register for measuring quality assessment in the ICU.

  2. Compressive Video Recovery Using Block Match Multi-Frame Motion Estimation Based on Single Pixel Cameras

    PubMed Central

    Bi, Sheng; Zeng, Xiao; Tang, Xin; Qin, Shujia; Lai, King Wai Chiu

    2016-01-01

    Compressive sensing (CS) theory has opened up new paths for the development of signal processing applications. Based on this theory, a novel single pixel camera architecture has been introduced to overcome the current limitations and challenges of traditional focal plane arrays. However, video quality based on this method is limited by existing acquisition and recovery methods, and the approach is also time-consuming. In this paper, a multi-frame motion estimation algorithm is proposed for CS video to enhance video quality. The proposed algorithm uses multiple frames to implement motion estimation. Experimental results show that using multi-frame motion estimation can improve the quality of recovered videos. To further reduce the motion estimation time, a block match algorithm is used to perform motion estimation. Experiments demonstrate that using the block match algorithm can reduce motion estimation time by 30%. PMID:26950127
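    The block-match step mentioned in the abstract searches a set of candidate positions in a reference frame for the one that best matches a block of the current frame, typically by minimizing the sum of absolute differences (SAD). A minimal sketch of just that matching step follows; the frames and candidate positions are toy values, and the paper's full pipeline (CS acquisition, multi-frame recovery) is not represented here.

    ```python
    def best_match(ref_frame, block, search_positions):
        """Return the (row, col) position in ref_frame whose window has the
        minimum sum-of-absolute-differences against the given block.
        Frames are 2-D lists of pixel intensities."""
        bh, bw = len(block), len(block[0])
        def sad(r, c):
            return sum(
                abs(ref_frame[r + i][c + j] - block[i][j])
                for i in range(bh) for j in range(bw)
            )
        return min(search_positions, key=lambda pos: sad(*pos))

    # Reference frame with a bright 2x2 patch at row 1, col 2.
    ref = [
        [0, 0, 0, 0],
        [0, 0, 9, 9],
        [0, 0, 9, 9],
        [0, 0, 0, 0],
    ]
    block = [[9, 9], [9, 9]]  # block taken from the current frame
    print(best_match(ref, block, [(0, 0), (1, 2), (2, 0)]))  # → (1, 2)
    ```

    Restricting `search_positions` to a small window around the block's original location is what makes block matching cheaper than an exhaustive search, which is the source of the speedup the abstract reports.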

  3. Investigation into image quality difference between total variation and nonlinear sparsifying transform based compressed sensing

    NASA Astrophysics Data System (ADS)

    Dong, Jian; Kudo, Hiroyuki

    2017-03-01

    Compressed sensing (CS) is attracting growing interest in sparse-view computed tomography (CT) image reconstruction. The most standard approach of CS is total variation (TV) minimization. However, images reconstructed by TV usually suffer from distortions, especially in reconstruction of practical CT images, in the form of patchy artifacts, improper serrated edges and loss of image textures. Most existing CS approaches, including TV, achieve image quality improvement by applying linear transforms to the object image, but linear transforms usually fail to take discontinuities into account, such as edges and image textures, which is considered to be the key reason for image distortions. Discussion of nonlinear-filter-based image processing has a long history, and it is well established that nonlinear filters yield better results than linear filters in image processing tasks such as denoising. The median root prior was first utilized by Alenius as a nonlinear transform in CT image reconstruction, with significant gains obtained. Subsequently, Zhang developed the application of nonlocal-means-based CS. It is gradually becoming clear that nonlinear transform based CS has superiority in improving image quality compared with linear transform based CS; however, to the best of our knowledge, this has not been clearly concluded in any previous paper. In this work, we investigated the image quality differences between conventional TV minimization and nonlinear sparsifying transform based CS, as well as image quality differences among different nonlinear sparsifying transform based CSs in sparse-view CT image reconstruction. Additionally, we accelerated the implementation of the nonlinear sparsifying transform based CS algorithm.
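    The abstract's claim that nonlinear filters handle discontinuities better than linear ones can be seen in a one-dimensional toy example: a median filter (nonlinear) removes an isolated impulse exactly, while a moving average (linear) smears it into its neighbors. This sketch is only an illustration of that contrast, not of the paper's CT reconstruction algorithms.

    ```python
    def moving_average(signal, k=3):
        """Linear smoothing: mean over a window of width k (edges kept as-is)."""
        h = k // 2
        out = list(signal)
        for i in range(h, len(signal) - h):
            out[i] = sum(signal[i - h:i + h + 1]) / k
        return out

    def median_filter(signal, k=3):
        """Nonlinear smoothing: median over a window of width k."""
        h = k // 2
        out = list(signal)
        for i in range(h, len(signal) - h):
            out[i] = sorted(signal[i - h:i + h + 1])[k // 2]
        return out

    # A flat signal with one impulse (an isolated discontinuity).
    noisy = [1, 1, 1, 9, 1, 1, 1]
    print(moving_average(noisy))  # impulse is smeared across neighbors
    print(median_filter(noisy))   # → [1, 1, 1, 1, 1, 1, 1], impulse removed
    ```

    The same intuition motivates replacing the linear sparsifying transform in CS with a nonlinear one: edges and textures are discontinuities that linear operators blur.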

  4. [Suggestions to strengthen quality management of herbal decoction pieces--based on production chain of herbal decoction pieces].

    PubMed

    Liu, Yan; Nie, Qing; Chen, Jing

    2015-08-01

    With the development of society and the improvement of people's living standards, the role of Chinese medicine in treatment and health care is becoming more and more prominent. Herbal decoction pieces are an important part of Chinese medicine: they are applied directly in clinical treatment and also serve as the raw material of Chinese patent medicine. The quality of herbal decoction pieces is therefore critical. Their production involves numerous steps, and each step can adversely affect quality. In this paper, based on the production chain of herbal decoction pieces, we analyze the main problems that affect quality at the stages of selection of Chinese herbal medicines, planting, purchasing, processing, packaging, storage and transport, such as the poor quality of seeds and seedlings of plant-based Chinese medicines, the introduction of plants into regions unsuitable for their cultivation, insufficient growth time, and excessive harmful substances. Purchasers and acceptance inspectors often lack professional knowledge and professional ethics. The mechanisms of processing are not clear, standards are not uniform, and qualified processing personnel are scarce. We therefore suggest: intensify basic research on key scientific issues; improve the qualifications of those who work with herbal decoction pieces; establish an "integrated" mode of operation in herbal decoction piece enterprises; breed high-quality plant resources and establish large-scale planting bases; standardize the packaging of herbal decoction pieces; and establish modern traditional Chinese medicine logistics enterprises.

  5. An interval programming model for continuous improvement in micro-manufacturing

    NASA Astrophysics Data System (ADS)

    Ouyang, Linhan; Ma, Yizhong; Wang, Jianjun; Tu, Yiliu; Byun, Jai-Hyun

    2018-03-01

    Continuous quality improvement in micro-manufacturing processes relies on optimization strategies that relate an output performance to a set of machining parameters. However, when determining the optimal machining parameters in a micro-manufacturing process, the economics of continuous quality improvement and decision makers' preference information are typically neglected. This article proposes an economic continuous improvement strategy based on an interval programming model. The proposed strategy differs from previous studies in two ways. First, an interval programming model is proposed to measure the quality level, where decision makers' preference information is considered in order to determine the weight of location and dispersion effects. Second, the proposed strategy is a more flexible approach since it considers the trade-off between the quality level and the associated costs, and leaves engineers a larger decision space through adjusting the quality level. The proposed strategy is compared with its conventional counterparts using an Nd:YLF laser beam micro-drilling process.

  6. Improving the Quality of Academic Services through Implementation of Internal Quality Assurance System in State Institute of Islamic Studies STS Jambi

    ERIC Educational Resources Information Center

    Iskandar

    2017-01-01

    Implementation of the quality assurance system at IAIN STS Jambi began in early 2012 with the building of an internal quality assurance system based on ISO 9001:2008. Strong reasons lay behind the implementation: an atmosphere of academic quality had not grown to meet the accreditation standards of study programs and institutions, as reflected…

  7. Quality assurance paradigms for artificial intelligence in modelling and simulation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Oren, T.I.

    1987-04-01

    New classes of quality assurance concepts and techniques are required for advanced knowledge-processing paradigms (such as artificial intelligence, expert systems, or knowledge-based systems) and for the complex problems that only simulative systems can cope with. A systematization of quality assurance problems is given, along with examples of traditional and cognizant quality assurance techniques in traditional and cognizant modelling and simulation.

  8. The Brazilian Air Force Uniform Distribution Process: Using Lean Thinking, Statistical Process Control and Theory of Constraints to Address Improvement Opportunities

    DTIC Science & Technology

    2015-03-26

    universal definition” (Evans & Lindsay, 1996). Heizer and Render (2010) argue that several definitions of this term are user-based, meaning that quality...for example, really good ice cream has high butterfat levels.” (Heizer & Render, 2010). Garvin, in his Competing in Eight Dimensions of Quality...Montgomery, 2005). As for definition purposes, the concept adopted by this research was provided by Heizer and Render (2010), for whom Statistical Process

  9. Relational-database model for improving quality assurance and process control in a composite manufacturing environment

    NASA Astrophysics Data System (ADS)

    Gentry, Jeffery D.

    2000-05-01

    A relational database is a powerful tool for collecting and analyzing the vast amounts of interrelated data associated with the manufacture of composite materials. A relational database contains many individual database tables that store data that are related in some fashion. Manufacturing process variables as well as quality assurance measurements can be collected and stored in database tables indexed according to lot number, part type, or individual serial number. Relationships between manufacturing process and product quality can then be correlated over a wide range of product types and process variations. This paper presents details on how relational databases are used to collect, store, and analyze process variables and quality assurance data associated with the manufacture of advanced composite materials. Important considerations are covered, including how the various types of data are organized and how relationships between the data are defined. Employing relational database techniques to establish correlative relationships between process variables and quality assurance measurements is then explored. Finally, the benefits of database techniques such as data warehousing, data mining, and web-based client/server architectures are discussed in the context of composite material manufacturing.
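A minimal sketch of this approach using Python's built-in sqlite3: process variables and quality measurements live in separate tables keyed by a shared serial number, so they can later be joined and correlated. Table names, columns, and values are illustrative, not from the paper.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
# One table per kind of data, related through the part serial number.
conn.executescript("""
    CREATE TABLE process (
        serial_no     TEXT PRIMARY KEY,
        cure_temp_c   REAL,
        cure_time_min REAL
    );
    CREATE TABLE quality (
        serial_no        TEXT REFERENCES process(serial_no),
        void_content_pct REAL
    );
""")
conn.executemany("INSERT INTO process VALUES (?, ?, ?)",
                 [("P001", 180.0, 120.0), ("P002", 170.0, 110.0)])
conn.executemany("INSERT INTO quality VALUES (?, ?)",
                 [("P001", 0.8), ("P002", 2.1)])

# Join process variables to quality results to look for relationships.
rows = conn.execute("""
    SELECT p.serial_no, p.cure_temp_c, q.void_content_pct
    FROM process p JOIN quality q ON p.serial_no = q.serial_no
    ORDER BY p.serial_no
""").fetchall()
```

The same join pattern scales to many tables (lots, material batches, operators), which is the correlation capability the paper emphasizes.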

  10. A quality by design approach using artificial intelligence techniques to control the critical quality attributes of ramipril tablets manufactured by wet granulation.

    PubMed

    Aksu, Buket; Paradkar, Anant; de Matas, Marcel; Özer, Özgen; Güneri, Tamer; York, Peter

    2013-02-01

    Quality by design (QbD) is an essential part of the modern approach to pharmaceutical quality. This study was conducted in the framework of a QbD project involving ramipril tablets. Preliminary work included identification of the critical quality attributes (CQAs) and critical process parameters (CPPs) based on the quality target product profiles (QTPPs) using the historical data and risk assessment method failure mode and effect analysis (FMEA). Compendial and in-house specifications were selected as QTPPs for ramipril tablets. CPPs that affected the product and process were used to establish an experimental design. The results thus obtained can be used to facilitate definition of the design space using tools such as design of experiments (DoE), the response surface method (RSM) and artificial neural networks (ANNs). The project was aimed at discovering hidden knowledge associated with the manufacture of ramipril tablets using a range of artificial intelligence-based software, with the intention of establishing a multi-dimensional design space that ensures consistent product quality. At the end of the study, a design space was developed based on the study data and specifications, and a new formulation was optimized. On the basis of this formulation, a new laboratory batch formulation was prepared and tested. It was confirmed that the explored formulation was within the design space.
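One step of the DoE/RSM workflow described above can be illustrated with a toy example: fit a quadratic response model of a CQA against one CPP from three design points, then read off the factor range whose predicted response stays within specification. The factor, response values, and spec limit below are invented, not the ramipril study's data.

```python
def fit_quadratic(points):
    """Quadratic through three (x, y) points via Newton divided
    differences; returns the model as a callable."""
    (x0, y0), (x1, y1), (x2, y2) = points
    d1 = (y1 - y0) / (x1 - x0)
    d2 = ((y2 - y1) / (x2 - x1) - d1) / (x2 - x0)
    return lambda x: y0 + d1 * (x - x0) + d2 * (x - x0) * (x - x1)

# Three levels of a hypothetical CPP (e.g. granulation water, %) and a
# synthetic CQA response (e.g. dissolution at 30 min, %).
runs = [(10.0, 78.48), (20.0, 89.68), (30.0, 84.88)]
model = fit_quadratic(runs)

# Design-space sketch: grid of factor settings whose predicted response
# meets a specification limit of >= 88.
grid = [10.0 + 0.1 * i for i in range(201)]
in_space = [x for x in grid if model(x) >= 88.0]
low, high = in_space[0], in_space[-1]
```

A real QbD exercise would fit multi-factor surfaces with replication and confidence bounds; this only shows the "model, then invert for the acceptable region" logic.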

  11. Paraho environmental data. Part I. Process characterization. Part II. Air quality. Part III. Water quality

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Heistand, R.N.; Atwood, R.A.; Richardson, K.L.

    1980-06-01

    From 1973 to 1978, Development Engineering, Inc. (DEI), a subsidiary of Paraho Development Corporation, demonstrated the Paraho technology for surface oil shale retorting at Anvil Points, Colorado. A considerable amount of environmentally-related research was also conducted. This body of data represents the most comprehensive environmental data base relating to surface retorting that is currently available. In order to make this information available, the DOE Office of Environment has undertaken to compile, assemble, and publish this environmental data. The compilation has been prepared by DEI. This report includes the process characterization, air quality, and water quality categories.

  12. Quality assurance and management in microelectronics companies: ISO 9000 versus Six Sigma

    NASA Astrophysics Data System (ADS)

    Lupan, Razvan; Kobi, Abdessamad; Robledo, Christian; Bacivarov, Ioan; Bacivarov, Angelica

    2009-01-01

    A strategy for the implementation of the Six Sigma method as an improvement solution for the ISO 9000:2000 Quality Standard is proposed. Our approach is focused on integrating the DMAIC cycle of the Six Sigma method with the PDCA process approach, highly recommended by the standard ISO 9000:2000. The Six Sigma steps applied to each part of the PDCA cycle are presented in detail, giving some tools and training examples. Based on this analysis, the authors conclude that applying the Six Sigma philosophy to the Quality Standard implementation process is the best way to achieve optimal results in quality progress and therefore in customer satisfaction.

  13. SkySat-1: very high-resolution imagery from a small satellite

    NASA Astrophysics Data System (ADS)

    Murthy, Kiran; Shearn, Michael; Smiley, Byron D.; Chau, Alexandra H.; Levine, Josh; Robinson, M. Dirk

    2014-10-01

    This paper presents details of the SkySat-1 mission, which is the first microsatellite-class commercial earth-observation system to generate sub-meter resolution panchromatic imagery, in addition to sub-meter resolution 4-band pan-sharpened imagery. SkySat-1 was built and launched for an order of magnitude lower cost than similarly performing missions. The low-cost design enables the deployment of a large imaging constellation that can provide imagery with both high temporal resolution and high spatial resolution. One key enabler of the SkySat-1 mission was simplifying the spacecraft design and instead relying on ground-based image processing to achieve high performance at the system level. The imaging instrument consists of a custom-designed high-quality optical telescope and commercially available high frame rate CMOS image sensors. While each individually captured raw image frame shows moderate quality, ground-based image processing algorithms improve the raw data by combining data from multiple frames to boost image signal-to-noise ratio (SNR) and decrease the ground sample distance (GSD) in a process Skybox calls "digital TDI". Careful quality assessment and tuning of the spacecraft, payload, and algorithms was necessary to generate high-quality panchromatic, multispectral, and pan-sharpened imagery. Furthermore, the framing sensor configuration enabled the first commercial High-Definition full-frame rate panchromatic video to be captured from space, with approximately 1 meter ground sample distance. Details of the SkySat-1 imaging instrument and ground-based image processing system are presented, as well as an overview of the work involved with calibrating and validating the system. Examples of raw and processed imagery are shown, and the raw imagery is compared to pre-launch simulated imagery used to tune the image processing algorithms.
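The frame-combining idea behind "digital TDI" can be sketched as simple frame stacking: averaging N registered frames of the same scene reduces zero-mean noise by roughly sqrt(N). This toy version assumes perfectly registered frames and skips the resampling and super-resolution steps a real pipeline needs; the scene values and noise level are invented.

```python
import random

random.seed(0)

signal = [50.0, 80.0, 120.0, 200.0]   # "true" scene radiances (invented)
n_frames = 16
sigma = 10.0                           # single-frame noise level

# Simulate n_frames noisy, perfectly registered frames of the same scene.
frames = [[s + random.gauss(0.0, sigma) for s in signal]
          for _ in range(n_frames)]

# Stack: per-pixel mean across frames (the essence of frame combination).
stacked = [sum(frame[i] for frame in frames) / n_frames
           for i in range(len(signal))]

# Residual error of the stack should sit well below the single-frame
# sigma; ideally around sigma / sqrt(n_frames) = 2.5 here.
residual = max(abs(est - true) for est, true in zip(stacked, signal))
```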

  14. Peer review of health research funding proposals: A systematic map and systematic review of innovations for effectiveness and efficiency.

    PubMed

    Shepherd, Jonathan; Frampton, Geoff K; Pickett, Karen; Wyatt, Jeremy C

    2018-01-01

    To investigate methods and processes for timely, efficient and good quality peer review of research funding proposals in health. A two-stage evidence synthesis: (1) a systematic map to describe the key characteristics of the evidence base, followed by (2) a systematic review of the studies stakeholders prioritised as relevant from the map on the effectiveness and efficiency of peer review 'innovations'. Standard processes included literature searching, duplicate inclusion criteria screening, study keyword coding, data extraction, critical appraisal and study synthesis. A total of 83 studies from 15 countries were included in the systematic map. The evidence base is diverse, investigating many aspects of the systems for, and processes of, peer review. The systematic review included eight studies from Australia, Canada, and the USA, evaluating a broad range of peer review innovations. These studies showed that simplifying the process by shortening proposal forms, using smaller reviewer panels, or expediting processes can speed up the review process and reduce costs, but this might come at the expense of peer review quality, a key aspect that has not been assessed. Virtual peer review using videoconferencing or teleconferencing appears promising for reducing costs by avoiding the need for reviewers to travel, but again any consequences for quality have not been adequately assessed. There is increasing international research activity into the peer review of health research funding. The studies reviewed had methodological limitations and variable generalisability to research funders. Given these limitations it is not currently possible to recommend immediate implementation of these innovations. However, many appear promising based on existing evidence, and could be adapted as necessary by funders and evaluated. 
Where feasible, experimental evaluation, including randomised controlled trials, should be conducted, evaluating impact on effectiveness, efficiency and quality.

  15. Determination of the smoke-plume heights and their dynamics with ground-based scanning LIDAR

    Treesearch

    V. Kovalev; A. Petkov; C. Wold; S. Urbanski; W. M. Hao

    2015-01-01

    Lidar-data processing techniques are analyzed that allow determination of smoke-plume heights and their dynamics and can be helpful for improving smoke dispersion and air quality models. The data processing algorithms considered in the paper are based on the analysis of two alternative characteristics related to the smoke dispersion process: the regularized...

  16. Prevention of falls, malnutrition and pressure ulcers among older persons - nursing staff's experiences of a structured preventive care process.

    PubMed

    Lannering, Christina; Ernsth Bravell, Marie; Johansson, Linda

    2017-05-01

    A structured and systematic care process for preventive work, aimed to reduce falls, pressure ulcers and malnutrition among older people, has been developed in Sweden. The process involves risk assessment, team-based interventions and evaluation of results. Since development, this structured work process has become web-based and has been implemented in a national quality registry called 'Senior Alert' and used countrywide. The aim of this study was to describe nursing staff's experience of preventive work by using the structured preventive care process as outlined by Senior Alert. Eight focus group interviews were conducted during 2015 including staff from nursing homes and home-based nursing care in three municipalities. The interview material was subjected to qualitative content analysis. In this study, both positive and negative opinions were expressed about the process. The systematic and structured work flow seemed to only partly facilitate care providers to improve care quality by making better clinical assessments, performing team-based planned interventions and learning from results. Participants described lack of reliability in the assessments and varying opinions about the structure. Furthermore, organisational structures limited the preventive work. © 2016 John Wiley & Sons Ltd.

  17. Model-based quality assessment and base-calling for second-generation sequencing data.

    PubMed

    Bravo, Héctor Corrada; Irizarry, Rafael A

    2010-09-01

    Second-generation sequencing (sec-gen) technology can sequence millions of short fragments of DNA in parallel, making it capable of assembling complex genomes for a small fraction of the price and time of previous technologies. In fact, a recently formed international consortium, the 1000 Genomes Project, plans to fully sequence the genomes of approximately 1200 people. The prospect of comparative analysis at the sequence level of a large number of samples across multiple populations may be achieved within the next five years. These data present unprecedented challenges in statistical analysis. For instance, analysis operates on millions of short nucleotide sequences, or reads (strings of A, C, G, or T, between 30 and 100 characters long), which are the result of complex processing of noisy continuous fluorescence intensity measurements known as base-calling. The complexity of the base-calling discretization process results in reads of widely varying quality within and across sequence samples. This variation in processing quality results in infrequent but systematic errors that we have found to mislead downstream analysis of the discretized sequence read data. For instance, a central goal of the 1000 Genomes Project is to quantify across-sample variation at the single nucleotide level. At this resolution, small error rates in sequencing prove significant, especially for rare variants. Sec-gen sequencing is a relatively new technology for which potential biases and sources of obscuring variation are not yet fully understood. Therefore, modeling and quantifying the uncertainty inherent in the generation of sequence reads is of utmost importance. In this article, we present a simple model to capture uncertainty arising in the base-calling procedure of the Illumina/Solexa GA platform. Model parameters have a straightforward interpretation in terms of the chemistry of base-calling, allowing for informative and easily interpretable metrics that capture the variability in sequencing quality. Our model provides informative estimates that are readily usable in quality assessment tools while significantly improving base-calling performance. © 2009, The International Biometric Society.

  18. Interim results of quality-control sampling of surface water for the Upper Colorado River National Water-Quality Assessment Study Unit, water years 1995-96

    USGS Publications Warehouse

    Spahr, N.E.; Boulger, R.W.

    1997-01-01

    Quality-control samples provide part of the information needed to estimate the bias and variability that result from sample collection, processing, and analysis. Quality-control samples of surface water collected for the Upper Colorado River National Water-Quality Assessment study unit for water years 1995-96 are presented and analyzed in this report. The types of quality-control samples collected include pre-processing split replicates, concurrent replicates, sequential replicates, post-processing split replicates, and field blanks. Analysis of the pre-processing split replicates, concurrent replicates, sequential replicates, and post-processing split replicates is based on differences between analytical results of the environmental samples and analytical results of the quality-control samples. Results of these comparisons indicate that variability introduced by sample collection, processing, and handling is low and will not affect interpretation of the environmental data. The differences for most water-quality constituents are on the order of plus or minus 1 or 2 lowest rounding units. A lowest rounding unit is equivalent to the magnitude of the least significant figure reported for analytical results. The use of lowest rounding units avoids some of the difficulty in comparing differences between pairs of samples when concentrations span orders of magnitude and provides a measure of the practical significance of the effect of variability. Analysis of field-blank quality-control samples indicates that, with the exception of chloride and silica, no systematic contamination of samples is apparent. Chloride contamination probably was the result of incomplete rinsing of the dilute cleaning solution from the outlet ports of the decaport sample splitter. Silica contamination seems to have been introduced by the blank water. Sampling and processing procedures for water year 1997 have been modified as a result of these analyses.
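The lowest-rounding-unit comparison described above can be sketched as follows, with invented values: express the difference between paired environmental and quality-control results in multiples of the least significant reported digit, so constituents reported at very different magnitudes can be compared on one footing.

```python
def lru(reported):
    """Lowest rounding unit: the magnitude of the least significant
    figure of a value as reported (given as a decimal string)."""
    if "." in reported:
        return 10.0 ** -len(reported.split(".")[1])
    # No decimal point: trailing zeros set the rounding unit.
    return 10.0 ** (len(reported) - len(reported.rstrip("0")))

def diff_in_lrus(env, qc):
    """Difference between an environmental/QC result pair, expressed in
    LRUs of the environmental result."""
    return (float(env) - float(qc)) / lru(env)

# Invented replicate pairs: one constituent reported to 0.1 mg/L,
# another reported only to the nearest 10 mg/L.
pairs = [("12.3", "12.1"), ("250", "240")]
diffs = [round(diff_in_lrus(env, qc)) for env, qc in pairs]
```

Both pairs come out as 1-2 LRUs despite the twenty-fold difference in concentration, which is the practical-significance point the report makes.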

  19. Optimization and Characterization of the Friction Stir Welded Sheets of AA 5754-H111: Monitoring of the Quality of Joints with Thermographic Techniques.

    PubMed

    De Filippis, Luigi Alberto Ciro; Serio, Livia Maria; Palumbo, Davide; De Finis, Rosa; Galietti, Umberto

    2017-10-11

    Friction Stir Welding (FSW) is a solid-state welding process, based on frictional and stirring phenomena, that offers many advantages with respect to the traditional welding methods. However, several parameters can affect the quality of the produced joints. In this work, an experimental approach has been used for studying and optimizing the FSW process, applied on 5754-H111 aluminum plates. In particular, the thermal behavior of the material during the process has been investigated and two thermal indexes, the maximum temperature and the heating rate of the material, correlated to the frictional power input, were investigated for different process parameters (the travel and rotation tool speeds) configurations. Moreover, other techniques (micrographs, macrographs and destructive tensile tests) were carried out for supporting in a quantitative way the analysis of the quality of welded joints. The potential of thermographic technique has been demonstrated both for monitoring the FSW process and for predicting the quality of joints in terms of tensile strength.

  20. Teamwork for Oversight of Processes and Systems (TOPS). Implementation guide for TOPS version 2.0, 10 August 1992

    NASA Technical Reports Server (NTRS)

    Strand, Albert A.; Jackson, Darryl J.

    1992-01-01

    As the nation redefines priorities to deal with a rapidly changing world order, both government and industry require new approaches for oversight of management systems, particularly for high technology products. Declining defense budgets will lead to significant reductions in government contract management personnel. Concurrently, defense contractors are reducing administrative and overhead staffing to control costs. These combined pressures require bold approaches for the oversight of management systems. In the Spring of 1991, the DPRO and TRW created a Process Action Team (PAT) to jointly prepare a Performance Based Management (PBM) system titled Teamwork for Oversight of Processes and Systems (TOPS). The primary goal is implementation of a performance based management system based on objective data to review critical TRW processes with an emphasis on continuous improvement. The processes are: Finance and Business Systems, Engineering and Manufacturing Systems, Quality Assurance, and Software Systems. The team established a number of goals: delivery of quality products to contractual terms and conditions; ensuring that TRW management systems meet government guidance and good business practices; use of objective data to measure critical processes; elimination of wasteful/duplicative reviews and audits; emphasis on teamwork--all efforts must be perceived to add value by both sides and decisions are made by consensus; and synergy and the creation of a strong working trust between TRW and the DPRO. TOPS permits the adjustment of oversight resources when conditions change or when TRW systems performance indicates that either an increase or a decrease in surveillance is appropriate. Monthly Contractor Performance Assessments (CPA) are derived from a summary of supporting system-level and process-level ratings obtained from objective process-level data. Tiered, objective, data-driven metrics are highly successful in achieving a cooperative and effective method of measuring performance. The teamwork-based culture developed by TOPS proved an unequaled success in removing adversarial relationships and creating an atmosphere of continuous improvement in quality processes at TRW. The new working relationship does not decrease the responsibility or authority of the DPRO to ensure contract compliance, and it permits both parties to work more effectively to improve total quality and reduce cost. By emphasizing teamwork in developing a stronger approach to efficient management of the defense industrial base, TOPS is a singular success.

  2. Retinal image quality assessment based on image clarity and content

    NASA Astrophysics Data System (ADS)

    Abdel-Hamid, Lamiaa; El-Rafei, Ahmed; El-Ramly, Salwa; Michelson, Georg; Hornegger, Joachim

    2016-09-01

    Retinal image quality assessment (RIQA) is an essential step in automated screening systems to avoid misdiagnosis caused by processing poor quality retinal images. A no-reference transform-based RIQA algorithm is introduced that assesses images based on five clarity and content quality issues: sharpness, illumination, homogeneity, field definition, and content. Transform-based RIQA algorithms have the advantage of considering retinal structures while being computationally inexpensive. Wavelet-based features are proposed to evaluate the sharpness and overall illumination of the images. A retinal saturation channel is designed and used along with wavelet-based features for homogeneity assessment. The presented sharpness and illumination features are utilized to assure adequate field definition, whereas color information is used to exclude nonretinal images. Several publicly available datasets of varying quality grades are utilized to evaluate the feature sets, resulting in an area under the receiver operating characteristic curve above 0.99 for each of the individual feature sets. The overall quality is assessed by a classifier that uses the collective features as an input vector. The classification results show superior performance of the algorithm in comparison to other methods from the literature. Moreover, the algorithm efficiently and comprehensively addresses various quality issues and is suitable for automatic screening systems.
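As a generic illustration of the wavelet-based sharpness idea (not the paper's exact feature set), a single-level Haar decomposition splits an image into an approximation band and three detail bands; sharper content puts a larger fraction of the total energy into the details. The test images below are synthetic.

```python
def haar_detail_fraction(img):
    """Fraction of energy in the 1-level 2D Haar detail bands of a
    square image (list of lists, even size). Sharp edges raise it."""
    n = len(img)
    approx_e = detail_e = 0.0
    for i in range(0, n, 2):
        for j in range(0, n, 2):
            a, b = img[i][j], img[i][j + 1]
            c, d = img[i + 1][j], img[i + 1][j + 1]
            ll = (a + b + c + d) / 4.0   # approximation
            lh = (a - b + c - d) / 4.0   # horizontal detail
            hl = (a + b - c - d) / 4.0   # vertical detail
            hh = (a - b - c + d) / 4.0   # diagonal detail
            approx_e += ll * ll
            detail_e += lh * lh + hl * hl + hh * hh
    return detail_e / (approx_e + detail_e)

size = 64
# A smooth ramp (blurry content) versus a hard edge (sharp content).
ramp = [[(i + j) / (2.0 * (size - 1)) for j in range(size)]
        for i in range(size)]
edge = [[1.0 if i + j > size else 0.0 for j in range(size)]
        for i in range(size)]

sharp_score = haar_detail_fraction(edge)
smooth_score = haar_detail_fraction(ramp)
```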

  3. Process-based costing.

    PubMed

    Lee, Robert H; Bott, Marjorie J; Forbes, Sarah; Redford, Linda; Swagerty, Daniel L; Taunton, Roma Lee

    2003-01-01

    Understanding how quality improvement affects costs is important. Unfortunately, low-cost, reliable ways of measuring direct costs are scarce. This article builds on the principles of process improvement to develop a costing strategy that meets both criteria. Process-based costing has 4 steps: developing a flowchart, estimating resource use, valuing resources, and calculating direct costs. To illustrate the technique, this article uses it to cost the care planning process in 3 long-term care facilities. We conclude that process-based costing is easy to implement; generates reliable, valid data; and allows nursing managers to assess the costs of new or modified processes.
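The four costing steps above can be sketched end-to-end; the process steps, staff-minutes, and rates below are invented for illustration.

```python
# Step 1: flowchart of the care-planning process (illustrative steps).
steps = ["assess resident", "draft care plan", "team review"]

# Step 2: estimated resource use (staff-minutes per step, by role).
minutes = {
    "assess resident": {"RN": 30},
    "draft care plan": {"RN": 45},
    "team review":     {"RN": 15, "aide": 15},
}

# Step 3: value the resources (cost per staff-minute, by role).
rate_per_min = {"RN": 0.80, "aide": 0.30}

# Step 4: calculate the direct cost of one pass through the process.
direct_cost = sum(m * rate_per_min[role]
                  for step in steps
                  for role, m in minutes[step].items())
```

Re-costing a modified process is then just a matter of editing the flowchart or the resource estimates and recomputing, which is what makes the technique cheap to repeat.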

  4. Promoting quality: the health-care organization from a management perspective.

    PubMed

    Glickman, Seth W; Baggett, Kelvin A; Krubert, Christopher G; Peterson, Eric D; Schulman, Kevin A

    2007-12-01

    Although agreement about the need for quality improvement in health care is almost universal, the means of achieving effective improvement in overall care is not well understood. Avedis Donabedian developed the structure-process-outcome framework in which to think about quality-improvement efforts. There is now a robust evidence base in the quality-improvement literature on process and outcomes, but structure has received considerably less attention. The health-care field would benefit from expanding the current interpretation of structure to include broader perspectives on organizational attributes as primary determinants of process change and quality improvement. We highlight and discuss the following key elements of organizational attributes from a management perspective: (i) executive management, including senior leadership and board responsibilities; (ii) culture; (iii) organizational design; (iv) incentive structures; and (v) information management and technology. We discuss the relevant contributions from the business and medical literature for each element, and provide this framework as a roadmap for future research in an effort to develop the optimal definition of 'structure' for transforming quality-improvement initiatives.

  5. Is a Quality Course a Worthy Course? Designing for Value and Worth in Online Courses

    ERIC Educational Resources Information Center

    Youger, Robin E.; Ahern, Terence C.

    2015-01-01

    There are many strategies for estimating the effectiveness of instruction. Typically, most methods are based on the student evaluation. Recently a more standardized approach, Quality Matters (QM), has been developed that uses an objectives-based strategy. QM, however, does not account for the learning process, nor for the value and worth of the…

  6. Design and implementation of a cloud based lithography illumination pupil processing application

    NASA Astrophysics Data System (ADS)

    Zhang, Youbao; Ma, Xinghua; Zhu, Jing; Zhang, Fang; Huang, Huijie

    2017-02-01

    Pupil parameters are important for evaluating the quality of a lithography illumination system. In this paper, a cloud-based, full-featured pupil processing application is implemented. A web browser is used for the UI (User Interface), the WebSocket protocol and JSON format are used for communication between the client and the server, and the computing part is implemented on the server side, where the application integrates a variety of high-quality professional libraries, such as the image processing libraries libvips and ImageMagick and the automatic reporting system LaTeX, to support the program. The cloud-based framework takes advantage of the server's superior computing power and rich software collection, and the program can run anywhere there is a modern browser thanks to its web UI design. Compared to the traditional software operation model (purchased, licensed, shipped, downloaded, installed, maintained, and upgraded), the new cloud-based approach, with no installation and easy use and maintenance, opens up a new way. Cloud-based applications may well be the future of software development.

  7. Opportunities for bio-based packaging technologies to improve the quality and safety of fresh and further processed muscle foods.

    PubMed

    Cutter, Catherine Nettles

    2006-09-01

    It has been well documented that vacuum or modified atmosphere packaging materials, made from polyethylene- or other plastic-based materials, improve the stability and safety of raw or further processed muscle foods. However, recent research developments have demonstrated the feasibility, utilization, and commercial application of a variety of bio-based polymers (bio-polymers) made from a variety of materials, including renewable/sustainable agricultural commodities, and applied to muscle foods. Many of these bio-based materials have been shown to prevent moisture loss and drip, reduce lipid oxidation, and improve flavor attributes, as well as enhance the handling properties, color retention, and microbial stability of foods. With consumers demanding more environmentally friendly packaging and desiring more natural products, bio-based films and bio-polymers will continue to play an important role in the food industry by improving the quality of many products, including fresh or further processed muscle foods.

  8. Supplier selection based on complex indicator of finished products quality

    NASA Astrophysics Data System (ADS)

    Chernikova, Anna; Golovkina, Svetlana; Kuzmina, Svetlana; Demenchenok, Tatiana

    2017-10-01

    In the article, the authors consider possible approaches to the problems that arise when selecting a supplier of raw materials for an industrial enterprise; likely difficulties are analyzed and solutions are suggested. Various methods for improving the efficiency of the supplier selection process are considered, based on an analysis of the selection of a paper-bag supplier for the needs of a construction company. The article presents the calculation of generalized indicators and of a complex indicator composed of single indicators grouped to reflect different aspects of quality.

  9. Production system with process quality control: modelling and application

    NASA Astrophysics Data System (ADS)

    Tsou, Jia-Chi

    2010-07-01

    Over the past decade, there has been a great deal of research dedicated to the study of quality and the economics of production. In this article, we develop a dynamic model based on the hypothesis of a traditional economic production quantity model. Taguchi's quality loss function is used to evaluate the cost of poor quality in the dynamic production system. A practical case from the automotive industry, which uses the Six Sigma DMAIC methodology, is discussed to verify the proposed model. This study shows that there is an optimal value of quality investment that brings the production system to a reasonable quality level and minimises the production cost. Based on our model, management can adjust its investment in quality improvement to generate considerable financial return.
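The trade-off the article describes can be sketched with invented functional forms: Taguchi's quadratic loss prices deviation from target, and if quality investment shrinks process spread while adding its own cost, an interior optimum for the investment level exists. The 1/(1 + invest) decay and all constants below are assumptions for illustration, not the article's model.

```python
def taguchi_loss(y, target, k):
    """Taguchi quadratic loss k * (y - target)^2 for a quality
    characteristic y."""
    return k * (y - target) ** 2

def unit_cost(invest, k=2.0, base_sigma=1.0):
    """Toy model: investment shrinks the expected squared deviation
    (so expected Taguchi loss falls like 1/(1 + invest)) but is itself
    a cost, charged here at 0.5 per unit."""
    expected_sq_dev = base_sigma ** 2 / (1.0 + invest)
    return k * expected_sq_dev + 0.5 * invest

# Scan investment levels for the cheapest total unit cost.
levels = [i / 10.0 for i in range(0, 101)]
best_invest = min(levels, key=unit_cost)
```

With these constants, calculus puts the optimum at (1 + invest)^2 = k / 0.5 = 4, i.e. invest = 1, and the grid scan agrees, illustrating the "optimal value of quality investment" the study refers to.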

  10. Material quality development during the automated tow placement process

    NASA Astrophysics Data System (ADS)

    Tierney, John Joseph

    Automated tow placement (ATP) of thermoplastic composites builds on the existing industrial base for equipment, robotics and kinematic placement of material, with the aim of further cost reduction by eliminating the autoclave entirely. During ATP processing, thermoplastic composite tows are deposited on a preconsolidated substrate at rates ranging from 10-100 mm/s and consolidated using the localized application of heat and pressure by a tow placement head mounted on a robot. The process is highly non-isothermal, subjecting the material to multiple heating and cooling cycles at rates approaching 1000°C/s. The requirement for the ATP process is to achieve in seconds the same quality (low void content, full translation of mechanical properties and degree of bonding, and minimal warpage) that the autoclave process achieves in hours. The scientific challenge was first to understand and then to model the relationships between processing, material response, microstructure and quality. The important phenomena affecting quality investigated in this study include steady-state heat transfer, consolidation and deconsolidation (void dynamics), intimate contact and polymer interdiffusion (degree of bonding/mechanical properties), and residual stress and warpage (crystallization and viscoelastic response). A fundamental understanding of the role of materials in these mechanisms, and of their relationship to final quality, is developed and applied toward a method of process control and optimization.

  11. An Artificial Intelligence System to Predict Quality of Service in Banking Organizations

    PubMed Central

    Popovič, Aleš

    2016-01-01

    Quality of service, that is, the waiting time that customers must endure in order to receive a service, is a critical performance aspect in private and public service organizations. Providing good service quality is particularly important in highly competitive sectors where similar services exist. In this paper, focusing on the banking sector, we propose an artificial intelligence system for building a model for the prediction of service quality. While the traditional approach used for building analytical models relies on theories and assumptions about the problem at hand, we propose a novel approach for learning models from actual data. Thus, the proposed approach is not biased by the knowledge that experts may have about the problem, but is completely based on the available data. The system is based on a recently defined variant of genetic programming that allows practitioners to include the concept of semantics in the search process. This has beneficial effects on the search and produces analytical models that are based only on the data and not on domain-dependent knowledge. PMID:27313604

  12. An Artificial Intelligence System to Predict Quality of Service in Banking Organizations.

    PubMed

    Castelli, Mauro; Manzoni, Luca; Popovič, Aleš

    2016-01-01

    Quality of service, that is, the waiting time that customers must endure in order to receive a service, is a critical performance aspect in private and public service organizations. Providing good service quality is particularly important in highly competitive sectors where similar services exist. In this paper, focusing on the banking sector, we propose an artificial intelligence system for building a model for the prediction of service quality. While the traditional approach used for building analytical models relies on theories and assumptions about the problem at hand, we propose a novel approach for learning models from actual data. Thus, the proposed approach is not biased by the knowledge that experts may have about the problem, but is completely based on the available data. The system is based on a recently defined variant of genetic programming that allows practitioners to include the concept of semantics in the search process. This has beneficial effects on the search and produces analytical models that are based only on the data and not on domain-dependent knowledge.

  13. An evaluation method of power quality about electrified railways connected to power grid based on PSCAD/EMTDC

    NASA Astrophysics Data System (ADS)

    Liang, Weibin; Ouyang, Sen; Huang, Xiang; Su, Weijian

    2017-05-01

    Existing approaches to modeling the power quality of electrified railways connected to the power grid are complicated, and the simulated scenarios are incomplete; this paper therefore puts forward a novel power quality evaluation method based on PSCAD/EMTDC. First, a power quality model of an electrified railway connected to the power grid is established on the basis of test reports or measured data. The equivalent model of the electric locomotive captures its power and harmonic characteristics, represented by a load and a harmonic source, respectively. Second, to make the evaluation more complete, an analysis scheme is put forward that combines three dimensions of the electric locomotive: type, working condition, and quantity. Finally, the Shenmao Railway is taken as an example to evaluate power quality under different scenarios; the results show that electrified railways connected to the power grid significantly affect power quality.
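    The harmonic-source side of such a locomotive model lends itself to a standard power quality summary, total harmonic distortion (THD). This is a generic sketch, not the paper's PSCAD/EMTDC model, and the harmonic magnitudes are illustrative:

```python
import math

def thd(fundamental, harmonics):
    """THD = sqrt(sum of squared harmonic magnitudes) / fundamental magnitude."""
    return math.sqrt(sum(h * h for h in harmonics)) / fundamental

# Fundamental current of 100 A with 3rd, 5th and 7th harmonics of 20, 15 and 10 A,
# a crude stand-in for a rectifier-fed locomotive's current spectrum:
print(round(thd(100.0, [20.0, 15.0, 10.0]) * 100, 2), "%")  # 26.93 %
```

    A scenario study of the kind described would sweep locomotive type, working condition and quantity, recomputing such indices at the point of common coupling for each case.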

  14. Assessing Subjectivity in Sensor Data Post Processing via a Controlled Experiment

    NASA Astrophysics Data System (ADS)

    Jones, A. S.; Horsburgh, J. S.; Eiriksson, D.

    2017-12-01

    Environmental data collected by in situ sensors must be reviewed to verify validity, and conducting quality control often requires making edits in post processing to generate approved datasets. This process involves decisions by technicians, data managers, or data users on how to handle problematic data. Options include removing data from a series, retaining data with annotations, and altering data based on algorithms related to adjacent data points or the patterns of data at other locations or of other variables. Ideally, given the same dataset and the same quality control guidelines, multiple data quality control technicians would make the same decisions in data post processing. However, despite the development and implementation of guidelines aimed at ensuring consistent quality control procedures, we have faced ambiguity when performing post processing, and we have noticed inconsistencies in the practices of individuals performing quality control post processing. Technicians with the same level of training and using the same input datasets may produce different results, affecting the overall quality and comparability of finished data products. Different results may also be produced by technicians who do not have the same level of training. In order to assess the effect of subjective decision making by the individual technician on the end data product, we designed an experiment where multiple users performed quality control post processing on the same datasets using a consistent set of guidelines, field notes, and tools. We also assessed the effect of technician experience and training by conducting the same procedures with a group of novices unfamiliar with the data and the quality control process and compared their results to those generated by a group of more experienced technicians.
In this presentation, we report our observations of the degree of subjectivity in sensor data post processing, assessing and quantifying the impacts of individual technician as well as technician experience on quality controlled data products.
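    One simple way to quantify the inter-technician differences described above (a sketch, not the authors' analysis method) is point-by-point percent agreement between two technicians' quality control decisions on the same series:

```python
def percent_agreement(labels_a, labels_b):
    """Fraction of data points on which two reviewers made the same decision."""
    assert len(labels_a) == len(labels_b)
    matches = sum(a == b for a, b in zip(labels_a, labels_b))
    return matches / len(labels_a)

# Decision per data point: keep (K), remove (R), or adjust (A). Illustrative data.
tech1 = ["K", "K", "R", "A", "K", "R", "K", "K"]
tech2 = ["K", "K", "R", "K", "K", "R", "A", "K"]
print(percent_agreement(tech1, tech2))  # 0.75
```

    Comparing average agreement within the experienced group against the novice group would then separate the effect of training from baseline subjectivity.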

  15. Application of ICH Q9 Quality Risk Management Tools for Advanced Development of Hot Melt Coated Multiparticulate Systems.

    PubMed

    Stocker, Elena; Becker, Karin; Hate, Siddhi; Hohl, Roland; Schiemenz, Wolfgang; Sacher, Stephan; Zimmer, Andreas; Salar-Behzadi, Sharareh

    2017-01-01

    This study aimed to apply quality risk management based on the International Conference on Harmonisation (ICH) guideline Q9 to the early development stage of hot melt coated multiparticulate systems for oral administration. N-acetylcysteine crystals were coated with a formulation comprising tripalmitin and polysorbate 65. The critical quality attributes (CQAs) were initially prioritized using failure mode and effects analysis. The CQAs of the coated material were defined as particle size, taste-masking efficiency, and immediate release profile. The hot melt coating process was characterized via a flowchart, based on the identified potential critical process parameters (CPPs) and their impact on the CQAs. These CPPs were prioritized using a process failure mode, effects, and criticality analysis, and their critical impact on the CQAs was experimentally confirmed using a statistical design of experiments. Spray rate, atomization air pressure, and air flow rate were identified as CPPs. Coating amount and content of polysorbate 65 in the coating formulation were identified as critical material attributes. A hazard analysis and critical control points approach was applied to define control strategies at the critical process points. A fault tree analysis evaluated causes of potential process failures. We successfully demonstrated that a standardized quality risk management approach optimizes the sustainability of product development and supports regulatory aspects. Copyright © 2016 American Pharmacists Association®. Published by Elsevier Inc. All rights reserved.

  16. Health services research in urology.

    PubMed

    Yu, Hua-Yin; Ulmer, William; Kowalczyk, Keith J; Hu, Jim C

    2011-06-01

    Health services research (HSR) is increasingly important given the focus on patient-centered, cost-effective, high-quality health care. We examine how HSR affects contemporary evidence-based urologic practice and its role in shaping future urologic research and care. PubMed, urologic texts, and lay literature were reviewed for terms pertaining to HSR/outcomes research and urologic disease processes. HSR is a broad discipline that focuses on access, cost, and outcomes of health care. It has been applied to a myriad of urologic conditions to identify deficiencies in access, to evaluate the cost-effectiveness of therapies, and to evaluate structural, process, and outcome quality measures. HSR utilizes an evidence-based approach to identify the most effective ways to organize, manage, finance, and deliver high-quality urologic care tailored to individuals.

  17. [Thinking on designation of sham acupuncture in clinical research].

    PubMed

    Pan, Li-Jia; Chen, Bo; Zhao, Xue; Guo, Yi

    2014-01-01

    Randomized controlled trials (RCTs) are the source of the raw data of evidence-based medicine, and blinding is adopted in most high-quality RCTs. Sham acupuncture is the main form of blinding in acupuncture clinical trials. To improve the quality of acupuncture clinical trials, this paper reviews the necessity of sham acupuncture in clinical research, its current status, and its existing problems, and puts forward suggestions on new approaches and new design methods that can be adopted as references, as well as factors to be considered during implementation. The various subjective and objective factors involved in the trial process should be taken into account, current international standards should be used, quantification should be pursued wherever possible, and strict quality monitoring should be carried out.

  18. Elucidating the Key Role of a Lewis Base Solvent in the Formation of Perovskite Films Fabricated from the Lewis Adduct Approach.

    PubMed

    Cao, Xiaobing; Zhi, Lili; Li, Yahui; Fang, Fei; Cui, Xian; Yao, Youwei; Ci, Lijie; Ding, Kongxian; Wei, Jinquan

    2017-09-27

    High-quality perovskite films can be fabricated from Lewis acid-base adducts through molecule exchange. Substantial work is needed to fully understand the formation mechanism of the perovskite films, which helps to further improve their quality. Here, we study the formation of CH3NH3PbI3 perovskite films by introducing some dimethylacetamide into the PbI2/N,N-dimethylformamide solution. We reveal that there are three key processes during the formation of perovskite films through the Lewis acid-base adduct approach: molecule intercalation of solvent into the PbI2 lattice, molecule exchange between the solvent and CH3NH3I, and dissolution-recrystallization of the perovskite grains during annealing. The Lewis base solvents play multiple functions in the above processes. The properties of the solvent, including Lewis basicity and boiling point, play key roles in forming smooth perovskite films with large grains. We also provide some rules for choosing Lewis base additives to prepare high-quality perovskite films through the Lewis adduct approach.

  19. Study on Stationarity of Random Load Spectrum Based on the Special Road

    NASA Astrophysics Data System (ADS)

    Yan, Huawen; Zhang, Weigong; Wang, Dong

    2017-09-01

    One method of special-road quality assessment uses a wheel force sensor; its essence is to collect the load spectrum of the vehicle as a reflection of road quality. By the definition of a stochastic process, the load spectrum is itself a stochastic process. However, the analysis methods and ranges of application of different types of random processes differ greatly, especially in engineering practice, and this directly affects the design and development of the experiment. Determining the type of a random process therefore has practical significance. Based on an analysis of the digital characteristics of the road load spectrum, this paper determines that the load spectrum in this experiment belongs to a stationary stochastic process, paving the way for follow-up modeling and feature extraction for the special road.
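    The weak-stationarity screening implied by "analysis of the digital characteristics" can be sketched as a segment-wise check that means and variances stay close to their global values; the tolerance and test series below are illustrative, not the paper's data:

```python
def segment_stats(x, n_segments):
    """Mean and (population) variance of each equal-length segment of x."""
    size = len(x) // n_segments
    stats = []
    for i in range(n_segments):
        s = x[i * size:(i + 1) * size]
        m = sum(s) / len(s)
        v = sum((xi - m) ** 2 for xi in s) / len(s)
        stats.append((m, v))
    return stats

def looks_stationary(x, n_segments=4, tol=0.2):
    """True if every segment mean/variance stays near the global mean/variance."""
    gm = sum(x) / len(x)
    gv = sum((xi - gm) ** 2 for xi in x) / len(x)
    gs = gv ** 0.5
    for m, v in segment_stats(x, n_segments):
        if abs(m - gm) > tol * gs:    # segment mean drifts from global mean
            return False
        if abs(v - gv) > tol * gv:    # segment variance drifts from global
            return False
    return True

flat = [(-1) ** i for i in range(400)]               # constant mean and variance
trend = [(-1) ** i + 0.02 * i for i in range(400)]   # drifting mean
print(looks_stationary(flat), looks_stationary(trend))  # True False
```

    A rigorous analysis would add autocorrelation checks or a formal test, but this segment comparison captures the idea of verifying stationarity before choosing modeling tools.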

  20. A Non-Intrusive GMA Welding Process Quality Monitoring System Using Acoustic Sensing.

    PubMed

    Cayo, Eber Huanca; Alfaro, Sadek Crisostomo Absi

    2009-01-01

    Most of the inspection methods used for detection and localization of welding disturbances are based on the evaluation of direct measurements of welding parameters. Such direct measurement requires the insertion of sensors during the welding process, which could alter the behavior of the metallic transference. An inspection method that evaluates the evolution of the GMA welding process using non-intrusive sensing would allow not only the identification of disturbances during welding runs, and thus reduce inspection time, but would also reduce the interference on the process caused by direct sensing. In this paper a non-intrusive method for weld disturbance detection and localization for weld quality evaluation is demonstrated. The system is based on acoustic sensing of the welding electrical arc. During repetitive tests in welds without disturbances, acoustic stability parameters were calculated and used as comparison references for the detection and localization of disturbances during the weld runs.
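    The comparison-reference idea can be sketched as a mean ± 3 sigma stability band estimated from disturbance-free welds, against which time windows of a new run are checked; the parameter values below are illustrative, not the authors' measurements:

```python
def reference_band(baseline, k=3.0):
    """Mean +/- k standard deviations of an acoustic parameter from clean welds."""
    m = sum(baseline) / len(baseline)
    s = (sum((x - m) ** 2 for x in baseline) / len(baseline)) ** 0.5
    return m - k * s, m + k * s

def flag_disturbances(windows, band):
    """Indices of time windows whose acoustic parameter falls outside the band."""
    lo, hi = band
    return [i for i, w in enumerate(windows) if not (lo <= w <= hi)]

# Acoustic parameter (e.g. mean arc sound level per window, dB) from clean welds:
baseline = [70.1, 69.8, 70.3, 70.0, 69.9, 70.2]
band = reference_band(baseline)

# New weld run; window 3 contains a disturbance:
run = [70.0, 70.1, 69.9, 74.5, 70.2]
print(flag_disturbances(run, band))  # [3]
```

    Because each flagged window has a known time offset and travel speed, the same index also localizes the disturbance along the weld bead.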

  1. A quality by design approach to scale-up of high-shear wet granulation process.

    PubMed

    Pandey, Preetanshu; Badawy, Sherif

    2016-01-01

    High-shear wet granulation is a complex process, which in turn makes scale-up a challenging task. Scale-up of the high-shear wet granulation process has been studied extensively, with various methodologies proposed in the literature. This review article discusses existing scale-up principles and categorizes the approaches into two main strategies: parameter-based and attribute-based. With the advent of the quality by design (QbD) principle in drug product development, increased emphasis on the latter approach may be needed to ensure product robustness. In practice, a combination of both scale-up strategies is often utilized. In a QbD paradigm, there is also a need for increased fundamental and mechanistic understanding of the process. This can be achieved either by increased experimentation, which comes at higher cost, or by using modeling techniques, which are also discussed as part of this review.

  2. A Non-Intrusive GMA Welding Process Quality Monitoring System Using Acoustic Sensing

    PubMed Central

    Cayo, Eber Huanca; Alfaro, Sadek Crisostomo Absi

    2009-01-01

    Most of the inspection methods used for detection and localization of welding disturbances are based on the evaluation of direct measurements of welding parameters. Such direct measurement requires the insertion of sensors during the welding process, which could alter the behavior of the metallic transference. An inspection method that evaluates the evolution of the GMA welding process using non-intrusive sensing would allow not only the identification of disturbances during welding runs, and thus reduce inspection time, but would also reduce the interference on the process caused by direct sensing. In this paper a non-intrusive method for weld disturbance detection and localization for weld quality evaluation is demonstrated. The system is based on acoustic sensing of the welding electrical arc. During repetitive tests in welds without disturbances, acoustic stability parameters were calculated and used as comparison references for the detection and localization of disturbances during the weld runs. PMID:22399990

  3. Finding Quality in "Being Good Enough" Conversations

    ERIC Educational Resources Information Center

    Gibbs, Paul

    2011-01-01

    This article considers how a contribution based on Rorty's philosophy can help in the understanding of quality and its assurance in higher education. The suggestion is that if quality has an edifying purpose, then it should be seen as an ongoing process and might be aptly judged as being just "good enough". This position is argued for it…

  4. Peer Assessment: Judging the Quality of Students' Work by Comments Rather than Marks

    ERIC Educational Resources Information Center

    Davies, Phil

    2006-01-01

    This paper reports the results of a study into the quality of peer feedback provided by students within a computerised peer-assessment environment. The study looks at the creation of a "feedback index" that represents the quality of an essay based upon the feedback provided during a peer-marking process and identifies a significant…

  5. Comparison of case note review methods for evaluating quality and safety in health care.

    PubMed

    Hutchinson, A; Coster, J E; Cooper, K L; McIntosh, A; Walters, S J; Bath, P A; Pearson, M; Young, T A; Rantell, K; Campbell, M J; Ratcliffe, J

    2010-02-01

    To determine which of two methods of case note review, holistic (implicit) and criterion-based (explicit), provides the most useful and reliable information for quality and safety of care, and the level of agreement within and between groups of health-care professionals when they use the two methods to review the same record. To explore the process-outcome relationship between holistic and criterion-based quality-of-care measures and hospital-level outcome indicators. Case notes of patients at randomly selected hospitals in England. In the first part of the study, retrospective multiple reviews of 684 case notes were undertaken at nine acute hospitals using both holistic and criterion-based review methods. Quality-of-care measures included evidence-based review criteria and a quality-of-care rating scale. Textual commentary on the quality of care was provided as a component of holistic review. Review teams comprised combinations of doctors (n = 16), specialist nurses (n = 10), clinically trained audit staff (n = 3) and non-clinical audit staff (n = 9). In the second part of the study, process (quality and safety) of care data were collected from the case notes of 1565 people with either chronic obstructive pulmonary disease (COPD) or heart failure in 20 hospitals. Doctors collected criterion-based data from case notes and used implicit review methods to derive textual comments on the quality of care provided and to score the care overall. Data were analysed for intrarater consistency, inter-rater reliability between pairs of staff using intraclass correlation coefficients (ICCs) and completeness of criterion data capture, and comparisons were made within and between staff groups and between review methods. To explore the process-outcome relationship, a range of publicly available health-care indicator data were used as proxy outcomes in a multilevel analysis.
Overall, 1473 holistic and 1389 criterion-based reviews were undertaken in the first part of the study. When same staff-type reviewer pairs/groups reviewed the same record, holistic scale score inter-rater reliability was moderate within each of the three staff groups [intraclass correlation coefficient (ICC) 0.46-0.52], and inter-rater reliability for criterion-based scores was moderate to good (ICC 0.61-0.88). When different staff-type pairs/groups reviewed the same record, agreement between the reviewer pairs/groups was weak to moderate for overall care (ICC 0.24-0.43). Comparison of holistic review score and criterion-based score of case notes reviewed by doctors and by non-clinical audit staff showed a reasonable level of agreement (p-values for difference 0.406 and 0.223, respectively), although results from all three staff types showed no overall level of agreement (p-value for difference 0.057). Detailed qualitative analysis of the textual data indicated that the three staff types tended to provide different forms of commentary on quality of care, although there was some overlap between some groups. In the process-outcome study there generally were high criterion-based scores for all hospitals, whereas there was more interhospital variation between the holistic review overall scale scores. Textual commentary on the quality of care verified the holistic scale scores. Differences among hospitals with regard to the relationship between mortality and quality of care were not statistically significant. Using the holistic approach, the three groups of staff appeared to interpret the recorded care differently when they each reviewed the same record. When the same clinical record was reviewed by doctors and non-clinical audit staff, there was no significant difference between the assessments of quality of care generated by the two groups. 
All three staff groups performed reasonably well when using criterion-based review, although the quality and type of information provided by doctors was of greater value. Therefore, when measuring quality of care from case notes, consideration needs to be given to the method of review, the type of staff undertaking the review, and the methods of analysis available to the review team. Review can be enhanced using a combination of both criterion-based and structured holistic methods with textual commentary, and variation in quality of care can best be identified from a combination of holistic scale scores and textual data review.
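    The intraclass correlation coefficients reported above can be computed, for the simplest one-way random-effects case, roughly as follows; the ratings table is illustrative, not study data:

```python
def icc_oneway(ratings):
    """One-way random-effects ICC(1,1) for an n-subjects x k-raters table.

    ICC = (MSB - MSW) / (MSB + (k - 1) * MSW), where MSB is the between-subject
    and MSW the within-subject mean square from a one-way ANOVA.
    """
    n = len(ratings)
    k = len(ratings[0])
    grand = sum(sum(r) for r in ratings) / (n * k)
    subj_means = [sum(r) / k for r in ratings]
    msb = k * sum((m - grand) ** 2 for m in subj_means) / (n - 1)
    msw = sum((x - m) ** 2
              for r, m in zip(ratings, subj_means)
              for x in r) / (n * (k - 1))
    return (msb - msw) / (msb + (k - 1) * msw)

# Five case notes, each scored by two reviewers on a 10-point care scale:
ratings = [
    [8, 7],
    [5, 6],
    [9, 9],
    [3, 4],
    [6, 5],
]
print(round(icc_oneway(ratings), 3))  # 0.914
```

    The study's moderate values (ICC 0.46-0.52 for holistic scores) would correspond to reviewer pairs disagreeing far more than in this toy table, which is the substantive finding about implicit review.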

  6. A longitudinal, multi-level comparative study of quality and safety in European hospitals: the QUASER study protocol.

    PubMed

    Robert, Glenn B; Anderson, Janet E; Burnett, Susan J; Aase, Karina; Andersson-Gare, Boel; Bal, Roland; Calltorp, Johan; Nunes, Francisco; Weggelaar, Anne-Marie; Vincent, Charles A; Fulop, Naomi J

    2011-10-26

    Although there is a wealth of information available about quality improvement tools and techniques in healthcare, there is little understanding of how to overcome the challenges of day-to-day implementation in complex organisations like hospitals. The 'Quality and Safety in Europe by Research' (QUASER) study will investigate how hospitals implement, spread and sustain quality improvement, including the difficulties they face and how they overcome them. The overall aim of the study is to explore relationships between the organisational and cultural characteristics of hospitals and how these impact on the quality of health care; the findings will be designed to help policy makers, payers and hospital managers understand the factors and processes that enable hospitals in Europe to achieve, and sustain, high-quality services for their patients. The design is an in-depth multi-level (macro, meso and micro-system) analysis of healthcare quality policies and practices in 5 European countries, including longitudinal case studies in a purposive sample of 10 hospitals. The project design has three major features: • a working definition of quality comprising three components: clinical effectiveness, patient safety and patient experience; • a conceptualisation of quality as a human, social, technical and organisational accomplishment; • an emphasis on translational research that is evidence-based and seeks to provide strategic and practical guidance for hospital practitioners and health care policy makers in the European Union. Throughout the study we will adopt a mixed methods approach, including qualitative (in-depth, narrative-based, ethnographic case studies using interviews, and direct non-participant observation of organisational processes) and quantitative research (secondary analysis of safety and quality data, for example: adverse incident reporting; patient complaints and claims).
The protocol is based on the premise that future research, policy and practice need to address the sociology of improvement in equal measure to the science and technique of improvement, or at least to expand the discipline of improvement to include these critical organisational and cultural processes. We define the 'organisational and cultural characteristics associated with better quality of care' in a broad sense that encompasses all the features of a hospital that might be hypothesised to impact upon clinical effectiveness, patient safety and/or patient experience.

  7. Quality Risk Management: Putting GMP Controls First.

    PubMed

    O'Donnell, Kevin; Greene, Anne; Zwitkovits, Michael; Calnan, Nuala

    2012-01-01

    This paper presents a practical way in which current approaches to quality risk management (QRM) may be improved, such that they better support qualification, validation programs, and change control proposals at manufacturing sites. The paper is focused on the treatment of good manufacturing practice (GMP) controls during QRM exercises. It specifically addresses why it is important to evaluate and classify such controls in terms of how they affect the severity, probability of occurrence, and detection ratings that may be assigned to potential failure modes or negative events. It also presents a QRM process that is designed to directly link the outputs of risk assessments and risk control activities with qualification and validation protocols in the GMP environment. This paper concerns the need for improvement in the use of risk-based principles and tools when working to ensure that the manufacturing processes used to produce medicines, and their related equipment, are appropriate. Manufacturing processes need to be validated (or proven) to demonstrate that they can produce a medicine of the required quality. The items of equipment used in such processes need to be qualified, in order to prove that they are fit for their intended use. Quality risk management (QRM) tools can be used to support such qualification and validation activities, but their use should be science-based and subject to as little subjectivity and uncertainty as possible. When changes are proposed to manufacturing processes, equipment, or related activities, they also need careful evaluation to ensure that any risks present are managed effectively. This paper presents a practical approach to how QRM may be improved so that it better supports qualification, validation programs, and change control proposals in a more scientific way. This improved approach is based on the treatment of what are called good manufacturing practice (GMP) controls during those QRM exercises.
A GMP control can be considered to be any control that is put in place to assure product quality and regulatory compliance. This improved approach is also based on how the detectability of risks is assessed. This is important because when producing medicines, it is not always good practice to place a high reliance upon detection-type controls in the absence of an adequate level of assurance in the manufacturing process that leads to the finished medicine.

  8. Quality and Safety Aspects of Cereals (Wheat) and Their Products.

    PubMed

    Varzakas, Theo

    2016-11-17

    Cereals, and most specifically wheat, are described in this chapter with emphasis on their safety and quality aspects. Wheat quality aspects are addressed in detail because they characterize dough properties and baking quality; the determination of dough properties and pasta quality are also described. Chemometric multivariate analysis is among the analyses carried out. In production, weighing/mixing of flours, kneading, extruded wheat flours, and sodium chloride are important processing steps and raw materials used in the manufacture of pastry products. Staling of cereal-based products is also taken into account. Finally, safety aspects of cereal-based products are documented, with special emphasis on mycotoxins, acrylamide, and near-infrared methodology.

  9. Doctors or technicians: assessing quality of medical education

    PubMed Central

    Hasan, Tayyab

    2010-01-01

    Medical education institutions usually adapt industrial quality management models that measure the quality of the process of a program but not the quality of the product. The purpose of this paper is to analyze the impact of industrial quality management models on medical education and students, and to highlight the importance of introducing a proper educational quality management model. Industrial quality management models can measure the training component in terms of competencies, but they lack the educational component measurement. These models use performance indicators to assess their process improvement efforts. Researchers suggest that the performance indicators used in educational institutions may only measure their fiscal efficiency without measuring the quality of the educational experience of the students. In most of the institutions, where industrial models are used for quality assurance, students are considered as customers and are provided with the maximum services and facilities possible. Institutions are required to fulfill a list of recommendations from the quality control agencies in order to enhance student satisfaction and to guarantee standard services. Quality of medical education should be assessed by measuring the impact of the educational program and quality improvement procedures in terms of knowledge base development, behavioral change, and patient care. Industrial quality models may focus on academic support services and processes, but educational quality models should be introduced in parallel to focus on educational standards and products. PMID:23745059

  11. Findings From a Nursing Care Audit Based on the Nursing Process: A Descriptive Study.

    PubMed

    Poortaghi, Sarieh; Salsali, Mahvash; Ebadi, Abbas; Rahnavard, Zahra; Maleki, Farzaneh

    2015-09-01

    Although using the nursing process improves nursing care quality, few studies have evaluated nursing performance in accordance with nursing process steps, either nationally or internationally. This study aimed to audit nursing care based on a nursing process model. This was a cross-sectional descriptive study in which a nursing audit checklist was designed and validated for assessing nurses' compliance with the nursing process. A total of 300 nurses from various clinical settings of Tehran University of Medical Sciences were selected. Data were analyzed using descriptive and inferential statistics, including frequencies, Pearson correlation coefficients, and independent-samples t-tests. The compliance rate of nursing process indicators was 79.71 ± 0.87. Mean compliance scores did not significantly differ by education level or gender. However, overall compliance scores were correlated with nurses' age (r = 0.26, P = 0.001) and work experience (r = 0.273, P = 0.001). Nursing process indicators can be used to audit nursing care, and such audits can serve as quality assurance tools.
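
    The correlation analysis described above can be sketched in a few lines. The data below are hypothetical and merely stand in for the audit's experience and compliance scores; they are not the study's data.

    ```python
    # Pearson correlation between work experience and audit compliance scores,
    # as used in the nursing-audit analysis (illustrative data only).

    def pearson_r(x, y):
        """Sample Pearson correlation coefficient for two equal-length lists."""
        n = len(x)
        mx, my = sum(x) / n, sum(y) / n
        sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
        sxx = sum((a - mx) ** 2 for a in x)
        syy = sum((b - my) ** 2 for b in y)
        return sxy / (sxx * syy) ** 0.5

    experience_years = [2, 5, 8, 12, 15, 20]          # hypothetical
    compliance_score = [74.0, 77.5, 78.0, 80.5, 82.0, 84.5]  # hypothetical
    r = pearson_r(experience_years, compliance_score)
    ```

    A positive r near 1 would indicate that compliance rises with experience, matching the direction of the correlation the study reports.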

  12. An adaptive framework to differentiate receiving water quality impacts on a multi-scale level.

    PubMed

    Blumensaat, F; Tränckner, J; Helm, B; Kroll, S; Dirckx, G; Krebs, P

    2013-01-01

    The paradigm shift in recent years towards sustainable and coherent water resources management at the river basin scale has turned the subject of investigation into a multi-scale problem, representing a great challenge for all actors participating in the management process. In this regard, planning engineers often face an inherent conflict: providing reliable decision support for complex questions with a minimum of effort. This trend inevitably increases the risk of basing decisions upon uncertain and unverified conclusions. This paper proposes an adaptive framework for integral planning that combines several concepts (flow balancing, water quality monitoring, process modelling, multi-objective assessment) to systematically evaluate management strategies for water quality improvement. As a key element, an S/P matrix is introduced to structure the differentiation of relevant 'pressures' in affected regions, i.e. 'spatial units', which helps in handling complexity. The framework is applied to a small but typical catchment in Flanders, Belgium. The application to this real-life case shows: (1) the proposed approach is adaptive, covers problems of different spatial and temporal scales, efficiently reduces complexity and finally leads to a transparent solution; and (2) water quality and emission-based performance evaluation must be done jointly, as an emission-based performance improvement does not necessarily lead to an improved water quality status, and an assessment solely focusing on water quality criteria may mask non-compliance with emission-based standards. Recommendations derived from the theoretical analysis have been put into practice.

  13. Quality of Information Approach to Improving Source Selection in Tactical Networks

    DTIC Science & Technology

    2017-02-01

    consider the performance of this process based on metrics relating to quality of information: accuracy, timeliness, completeness and reliability. These...that are indicators that the network is meeting these quality requirements. We study effective data rate, social distance, link integrity and the...utility of information as metrics within a multi-genre network to determine the quality of information of its available sources. This paper proposes a

  14. Quality management of human resources. Providers should begin by focusing on education, performance management, and reward systems.

    PubMed

    Blair, C S; Fordyce, M; Barney, S M

    1993-10-01

    For a quality management transformation to occur, a healthcare organization must focus on education and development, performance management, and recognition and reward systems during the first years of implementation. Education and development are perhaps the most important human resource management functions when implementing quality management principles and processes because behavioral changes will be required at all organizational levels. Specific programs that support an organization's quality management effort will vary but should include the conceptual, cultural, and technical aspects of quality management. The essence of quality management is to always satisfy the customer and to continuously improve the services and products the organization offers. The approach to performance management should therefore rely on customer feedback and satisfaction. An organization committed to quality management should base its performance management approach on customer orientation, process improvement, employee involvement, decision making with data, and continuous improvement. Managers and trustees are being challenged to provide innovative recognition and reward systems that reinforce the values and behaviors consistent with quality management. Such systems must also be aligned with the behaviors and outcomes that support the philosophy, mission, and values of the Catholic healthcare ministry. The following components should be considered for a recognition and reward system: base pay, incentives, benefits, and nonmonetary rewards.

  15. Approach to design space from retrospective quality data.

    PubMed

    Puñal Peces, Daniel; García-Montoya, Encarna; Manich, Albert; Suñé-Negre, Josep Maria; Pérez-Lozano, Pilar; Miñarro, Montse; Ticó, Josep Ramon

    2016-01-01

    Nowadays, the entire manufacturing process is based on the current GMPs, which emphasize the reproducibility of the process, and companies have a large amount of recorded data about their processes. The objective of this work was the establishment of the design space (DS) from retrospective data for a wet compression process. A design of experiments (DoE) with historical data from 4 years of industrial production was carried out, using as experimental factors the results of a previous risk analysis and eight key parameters (quality specifications) that encompassed process and quality control data. Statgraphics 5.0 software was applied, and the data were processed to obtain eight DSs as well as their safe and working ranges. Experience shows that it is possible to determine the DS retrospectively, the greatest difficulty being the handling and processing of large amounts of data. However, the practicality of this approach is very attractive, as it yields the DS with minimal investment in experiments, since actual production batch data are processed statistically.
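
    As a hedged illustration of deriving a design space from historical batch records, the sketch below fits a simple linear model to hypothetical process data and inverts it to obtain a working range for one parameter. The variable names, data, and specification limits are assumptions for illustration, not the study's values.

    ```python
    # Toy retrospective-data design space: regress a quality attribute on a
    # process parameter from historical batches, then invert the fitted line
    # to find the parameter range that keeps the attribute within spec.

    def fit_line(x, y):
        """Ordinary least squares for y = a + b*x."""
        n = len(x)
        mx, my = sum(x) / n, sum(y) / n
        b = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
             / sum((xi - mx) ** 2 for xi in x))
        return my - b * mx, b

    # Hypothetical historical batches: granulation water (%) vs. hardness (kp)
    water = [18, 20, 22, 24, 26, 28]
    hardness = [9.1, 10.0, 11.2, 12.1, 13.0, 13.8]
    a, b = fit_line(water, hardness)

    # Hypothetical specification: hardness between 10 and 13 kp
    lo = (10 - a) / b   # lower edge of the working range
    hi = (13 - a) / b   # upper edge of the working range
    ```

    In practice the study used multi-factor DoE analysis over eight responses; this one-factor inversion only shows the principle of turning a fitted model plus specifications into operating ranges.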

  16. Setting quality and safety priorities in a target-rich environment: an academic medical center's challenge.

    PubMed

    Mort, Elizabeth A; Demehin, Akinluwa A; Marple, Keith B; McCullough, Kathryn Y; Meyer, Gregg S

    2013-08-01

    Hospitals are continually challenged to provide safer and higher-quality patient care despite resource constraints. With an ever-increasing range of quality and safety targets at the national, state, and local levels, prioritization is crucial in effective institutional quality goal setting and resource allocation. Organizational goal-setting theory is a performance improvement methodology with strong results across many industries. The authors describe a structured goal-setting process they have established at Massachusetts General Hospital for setting annual institutional quality and safety goals. Begun in 2008, this process has been conducted on an annual basis. Quality and safety data are gathered from many sources, both internal and external to the hospital. These data are collated and classified, and multiple approaches are used to identify the most pressing quality issues facing the institution. The conclusions are subject to stringent internal review, and then the top quality goals of the institution are chosen. Specific tactical initiatives and executive owners are assigned to each goal, and metrics are selected to track performance. A reporting tool based on these tactics and metrics is used to deliver progress updates to senior hospital leadership. The hospital has experienced excellent results and strong organizational buy-in using this effective, low-cost, and replicable goal-setting process. It has led to improvements in structural, process, and outcomes aspects of quality.

  17. Quality Indicators for Safe Medication Preparation and Administration: A Systematic Review

    PubMed Central

    Maaskant, Jolanda M.; de Boer, Monica; Krediet, C. T. Paul; Nieveen van Dijkum, Els J. M.

    2015-01-01

    Background One-third of all medication errors causing harm to hospitalized patients occur in the medication preparation and administration phase, which is predominantly a nursing activity. To monitor, evaluate and improve the quality and safety of this process, evidence-based quality indicators can be used. Objectives The aim of this study was to identify evidence-based quality indicators (structure, process and outcome) for safe in-hospital medication preparation and administration. Methods MEDLINE, EMBASE and CINAHL were searched for relevant studies published up to January 2015. Additionally, nine databases were searched to identify relevant grey literature. Two reviewers independently selected studies if (1) the method for quality indicator development combined a literature search with expert panel opinion, (2) the study contained quality indicators on medication safety, and (3) any of the quality indicators were applicable to hospital medication preparation and administration. A multidisciplinary team appraised the studies independently using the AIRE instrument, which contains four domains and 20 items. Quality indicators applicable to in-hospital medication preparation and administration were extracted using a structured form. Results The search identified 1683 studies, of which 64 were reviewed in detail and five met the inclusion criteria. Overall, according to the AIRE domains, all studies were clear on purpose; most of them applied stakeholder involvement and used evidence reasonably; usage of the indicators in practice was scarcely described. A total of 21 quality indicators were identified: 5 structure indicators (e.g. safety management and high-alert medication), 11 process indicators (e.g. verification and protocols) and 5 outcome indicators (e.g. harm and death). These quality indicators partially cover the 7 rights.
Conclusion Despite the relatively small number of included studies, the identified quality indicators can serve as an excellent starting point for further development of nursing-specific quality indicators for medication safety. Especially for the right patient, right route, right time, and right documentation, there is room for future development of quality indicators. PMID:25884623

  18. Manufacturing history of etanercept (Enbrel®): Consistency of product quality through major process revisions.

    PubMed

    Hassett, Brian; Singh, Ena; Mahgoub, Ehab; O'Brien, Julie; Vicik, Steven M; Fitzpatrick, Brian

    2018-01-01

    Etanercept (ETN) (Enbrel®) is a soluble protein that binds to, and specifically inhibits, tumor necrosis factor (TNF), a proinflammatory cytokine. ETN is synthesized in Chinese hamster ovary cells by recombinant DNA technology as a fusion protein, with a fully human TNFRII ectodomain linked to the Fc portion of human IgG1. Successful manufacture of biologics such as ETN requires sophisticated process and product understanding, as well as meticulous control of operations to maintain product consistency. The objective of this evaluation was to show that the product profile of ETN drug substance (DS) has been consistent over the course of production. Multiple orthogonal biochemical analyses, which included evaluation of attributes indicative of product purity, potency, and quality, were performed on >2,000 batches of ETN from three sites of DS manufacture during the period 1998-2015. Based on the key quality attributes of product purity (assessed by hydrophobic interaction HPLC), binding activity (to TNF by ELISA), potency (inhibition of TNF-induced apoptosis in a cell-based bioassay) and quality (N-linked oligosaccharide map), we show that the integrity of ETN DS has remained consistent over time. This consistency was maintained through three major enhancements to the initial manufacturing process that were supported by detailed comparability assessments and approved by the European Medicines Agency. Examination of results for all major quality attributes of ETN DS indicates a highly consistent process for over 18 years, throughout changes to the manufacturing process, without affecting safety and efficacy, as demonstrated across a wide range of clinical trials of ETN in multiple inflammatory diseases.

  19. Adsorption detection for polylysine biomolecules based on high-Q silica capillary whispering gallery mode microresonator

    NASA Astrophysics Data System (ADS)

    Wu, Jixuan; Liu, Bo; Zhang, Hao; Song, Binbin

    2017-11-01

    A silica-capillary-based whispering gallery mode (WGM) microresonator has been proposed and experimentally demonstrated for real-time monitoring of the polylysine adsorption process. The spectral characteristics of the WGM resonance dips, with high quality factor and good wavelength selectivity, have been investigated to evaluate the dynamic process of polylysine binding to the capillary surface. The WGM transmission spectrum shows a regular shift with increasing observation time, which can be exploited for analysis of the polylysine adsorption process. The proposed WGM microresonator system possesses desirable qualities such as high sensitivity, fast response, label-free operation, high detection resolution and compactness, and could find promising applications in histology and related bioengineering areas.

  20. In-Process Atomic-Force Microscopy (AFM) Based Inspection

    PubMed Central

    Mekid, Samir

    2017-01-01

    A new in-process atomic-force microscopy (AFM) based inspection method is presented for nanolithography, compensating for deviations such as instantaneous degradation of the lithography probe tip. The traditional method uses the AFM probe for lithography work and retracts it to inspect the obtained feature, but this practice degrades the probe tip shape and hence affects measurement quality. This paper proposes a second, dedicated lithography probe positioned back-to-back with the AFM probe under two synchronized controllers to correct any deviation in the process relative to specifications. This method yields improved nanomachining quality, monitoring of in-process probe tip wear, and a better understanding of nanomachining. The system is hosted in a recently developed nanomanipulator for educational and research purposes. PMID:28561747

  1. The practice of quality-associated costing: application to transfusion manufacturing processes.

    PubMed

    Trenchard, P M; Dixon, R

    1997-01-01

    This article applies the new method of quality-associated costing (QAC) to the mixture of processes that create red cell and plasma products from whole blood donations. The article compares QAC with two commonly encountered but arbitrary models and illustrates the invalidity of clinical cost-benefit analysis based on these models. The first, an "isolated" cost model, seeks to allocate each whole process cost to only one product class. The other is a "shared" cost model, and it seeks to allocate an approximately equal share of all process costs to all associated products.

  2. Qualification process of CR system and quantification of digital image quality

    NASA Astrophysics Data System (ADS)

    Garnier, P.; Hun, L.; Klein, J.; Lemerle, C.

    2013-01-01

    CEA Valduc uses several X-ray generators to carry out many inspections: void search, welding expertise, gap measurements, etc. Most of these inspections are carried out on silver-based plates. For several years, CEA Valduc has worked to qualify new devices such as digital plates or CCD/flat-panel detectors. On one hand, this technological orientation anticipates the expected eventual disappearance of silver-based plates; on the other hand, it keeps the laboratory's skills up-to-date. The main improvement brought by digital plates is the continuous progress in measurement accuracy, especially with image data processing. It is now common to measure defect thickness or depth position within a part. In such applications, image data processing is used to obtain complementary information compared with scanned silver-based plates. The scanning procedure is detrimental to measurements: it degrades resolution, adds numerical noise, and is time-consuming. Digital plates make it possible to eliminate the scanning procedure and to increase resolution. It is nonetheless difficult to define a single criterion for digital image quality. A procedure has to be defined to estimate the quality of the digital data itself; the impact of the scanning device and the configuration parameters must also be taken into account. This presentation deals with the qualification process developed by CEA Valduc for digital plates (DUR-NDT), based on the study of quantitative criteria chosen to define a direct numerical image quality that can be compared with scanned silver-based pictures and the classical optical density. The versatility of the X-ray parameters (voltage, intensity, exposure time) is also discussed. The aim is to transfer CEA Valduc's years of experience with silver-based plate inspection to these new digital plate supports. This is an industrial stake.

  3. Integrated Application of Quality-by-Design Principles to Drug Product Development: A Case Study of Brivanib Alaninate Film-Coated Tablets.

    PubMed

    Badawy, Sherif I F; Narang, Ajit S; LaMarche, Keirnan R; Subramanian, Ganeshkumar A; Varia, Sailesh A; Lin, Judy; Stevens, Tim; Shah, Pankaj A

    2016-01-01

    Modern drug product development is expected to follow the quality-by-design (QbD) paradigm. At the same time, although there are several issue-specific examples in the literature that demonstrate the application of QbD principles, a holistic demonstration of the application of QbD principles to drug product development and control strategy is lacking. This article provides an integrated case study on the systematic application of QbD to product development and demonstrates the implementation of QbD concepts in the different aspects of product and process design for brivanib alaninate film-coated tablets. Using a risk-based approach, the development strategy entailed identification of product critical quality attributes (CQAs), assessment of risks to the CQAs, and performing experiments to understand and mitigate the identified risks. Quality risk assessments and design of experiments were performed to understand the quality of the input raw materials required for a robust formulation and the impact of manufacturing process parameters on CQAs. In addition to the material property and process parameter controls, the proposed control strategy includes the use of process analytical technology and conventional analytical tests to control in-process material attributes and ensure the quality of the final product. Copyright © 2016. Published by Elsevier Inc.

  4. Make or buy analysis model based on tolerance allocation to minimize manufacturing cost and fuzzy quality loss

    NASA Astrophysics Data System (ADS)

    Rosyidi, C. N.; Puspitoingrum, W.; Jauhari, W. A.; Suhardi, B.; Hamada, K.

    2016-02-01

    The specification of tolerances has a significant impact on product quality and final production cost. A company should therefore pay careful attention to component and product tolerances so that it can produce a good-quality product at the lowest cost. Tolerance allocation has been widely used to solve the problem of selecting a particular process or supplier. But before getting into the selection process, the company must first analyse whether a component should be made in house (make), purchased from a supplier (buy), or produced by a combination of both. This paper discusses an optimization model for process and supplier selection that minimizes manufacturing cost and fuzzy quality loss. The model can also be used to determine the allocation of components to the selected processes or suppliers. Tolerance, process capability, and production capacity are three important constraints that affect the decision. A fuzzy quality loss function is used to describe the semantics of quality, in which the product quality level is divided into several grades. The proposed model is demonstrated on a numerical example involving a simple assembly product consisting of three components. A metaheuristic approach was implemented using the OptQuest software from Oracle Crystal Ball to obtain the optimal solution of the numerical example.
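
    The make-or-buy trade-off described above can be sketched by enumerating the options and scoring each by unit cost plus quality loss. All numbers here are hypothetical, and a crisp Taguchi-style quadratic loss stands in for the paper's fuzzy quality loss function.

    ```python
    # Minimal make-or-buy sketch: each option carries a unit cost and a
    # process standard deviation; total cost adds a quadratic quality loss.

    options = {
        "make_process_A":  {"cost": 1.20, "sigma": 0.020},  # hypothetical
        "make_process_B":  {"cost": 0.90, "sigma": 0.035},  # hypothetical
        "buy_supplier_S1": {"cost": 0.80, "sigma": 0.050},  # hypothetical
    }

    K = 400.0  # assumed loss coefficient ($ per squared unit of deviation)

    def total_cost(opt):
        """Unit cost plus Taguchi-style quadratic quality loss K * sigma^2."""
        return opt["cost"] + K * opt["sigma"] ** 2

    best = min(options, key=lambda name: total_cost(options[name]))
    ```

    With these numbers the tighter but pricier in-house process wins, illustrating how a cheap option with wide process variation can lose once quality loss is counted; the paper's full model adds tolerance, capability, and capacity constraints on top of this objective.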

  5. Winter wheat quality monitoring and forecasting system based on remote sensing and environmental factors

    NASA Astrophysics Data System (ADS)

    Haiyang, Yu; Yanmei, Liu; Guijun, Yang; Xiaodong, Yang; Dong, Ren; Chenwei, Nie

    2014-03-01

    To achieve dynamic winter wheat quality monitoring and forecasting over large regions, the objective of this study was to design and develop a winter wheat quality monitoring and forecasting system using a remote sensing index and environmental factors. The winter wheat quality trend was forecasted before harvest, and quality was monitored after harvest. The traditional quality-vegetation index remote sensing monitoring and forecasting models were improved. Combined with latitude information, the vegetation index was used to estimate agronomic parameters related to winter wheat quality in the early growth stages in order to forecast the quality trend. A combination of rainfall in May, temperature in May, illumination in late May, soil available nitrogen content, and other environmental factors established the quality monitoring model. Compared with a simple quality-vegetation index, the remote sensing monitoring and forecasting models used in this system achieved greatly improved accuracy. Winter wheat quality was monitored and forecasted based on the above models, and the system was implemented using WebGIS technology. Finally, in 2010 the operation of the winter wheat quality monitoring system was demonstrated in Beijing, and the monitoring and forecasting results were output as thematic maps.

  6. Quality prediction modeling for sintered ores based on mechanism models of sintering and extreme learning machine based error compensation

    NASA Astrophysics Data System (ADS)

    Tiebin, Wu; Yunlian, Liu; Xinjun, Li; Yi, Yu; Bin, Zhang

    2018-06-01

    Aiming at the difficulty of quality prediction for sintered ores, a hybrid prediction model is established based on mechanism models of sintering and time-weighted error compensation using an extreme learning machine (ELM). First, mechanism models of the drum index, total iron, and alkalinity are constructed according to the chemical reaction mechanisms and conservation of matter in the sintering process. As the process is simplified in the mechanism models, these models cannot describe its high nonlinearity, so errors are inevitable. For this reason, a time-weighted ELM-based error compensation model is established. Simulation results verify that the hybrid model has high accuracy and can meet the requirements of industrial applications.
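
    The hybrid structure described above (a mechanism model corrected by a data-driven error term) can be sketched as follows. Everything here is a toy assumption: the "mechanism model" is a made-up linear stand-in, and a time-weighted average of recent residuals replaces the paper's ELM compensation model.

    ```python
    # Structural sketch of a hybrid predictor: mechanism-model prediction
    # plus a time-weighted correction learned from recent prediction errors.

    def mechanism_model(ore_fe, coke_ratio):
        """Toy stand-in for a first-principles quality-index model."""
        return 70.0 + 0.5 * ore_fe - 2.0 * coke_ratio

    def compensated(ore_fe, coke_ratio, past_errors, decay=0.5):
        """Mechanism prediction plus a decay-weighted mean of past residuals.

        past_errors: (measured - predicted) residuals, newest first.
        """
        weights = [decay ** k for k in range(len(past_errors))]
        correction = sum(w * e for w, e in zip(weights, past_errors)) / sum(weights)
        return mechanism_model(ore_fe, coke_ratio) + correction

    errors = [1.2, 0.8, 0.5]  # hypothetical recent residuals, newest first
    pred = compensated(ore_fe=10.0, coke_ratio=1.5, past_errors=errors)
    ```

    The point of the structure is that the correction term absorbs the systematic bias the simplified mechanism model cannot capture; in the paper an ELM learns that residual as a function of process inputs rather than averaging it.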

  7. British Thoracic Society Quality Standards for acute non-invasive ventilation in adults

    PubMed Central

    Davies, Michael; Allen, Martin; Bentley, Andrew; Bourke, Stephen C; Creagh-Brown, Ben; D’Oliveiro, Rachel; Glossop, Alastair; Gray, Alasdair; Jacobs, Phillip; Mahadeva, Ravi; Moses, Rachael; Setchfield, Ian

    2018-01-01

    Introduction The purpose of the quality standards document is to provide healthcare professionals, commissioners, service providers and patients with a guide to the standards of care that should be met for the provision of acute non-invasive ventilation in adults, together with measurable markers of good practice. Methods Development of British Thoracic Society (BTS) Quality Standards follows the BTS process of quality standard production, which is based on the National Institute for Health and Care Excellence process manual for the development of quality standards. Results Six quality statements have been developed, each describing a standard of care for the provision of acute non-invasive ventilation in the UK, together with measurable markers of good practice. Conclusion BTS Quality Standards for acute non-invasive ventilation in adults form a key part of the range of supporting materials that the Society produces to assist in the dissemination and implementation of guideline recommendations. PMID:29636979

  8. US EPA Base Study Standard Operating Procedure for Data Processing and Data Management

    EPA Pesticide Factsheets

    The purpose of the Standard Operating Procedures (SOP) for data management and data processing is to facilitate consistent documentation and completion of data processing duties and management responsibilities in order to maintain a high standard of data quality.

  9. Evaluating the quality of a cell counting measurement process via a dilution series experimental design.

    PubMed

    Sarkar, Sumona; Lund, Steven P; Vyzasatya, Ravi; Vanguri, Padmavathy; Elliott, John T; Plant, Anne L; Lin-Gibson, Sheng

    2017-12-01

    Cell counting measurements are critical in the research, development and manufacturing of cell-based products, yet determining cell quantity with accuracy and precision remains a challenge. Validating and evaluating a cell counting measurement process can be difficult because of the lack of appropriate reference material. Here we describe an experimental design and statistical analysis approach to evaluate the quality of a cell counting measurement process in the absence of appropriate reference materials or reference methods. The experimental design is based on a dilution series study with replicate samples and observations as well as measurement process controls. The statistical analysis evaluates the precision and proportionality of the cell counting measurement process and can be used to compare the quality of two or more counting methods. As an illustration of this approach, cell counting measurement processes (automated and manual methods) were compared for a human mesenchymal stromal cell (hMSC) preparation. For the hMSC preparation investigated, results indicated that the automated method performed better than the manual counting methods in terms of precision and proportionality. By conducting well controlled dilution series experimental designs coupled with appropriate statistical analysis, quantitative indicators of repeatability and proportionality can be calculated to provide an assessment of cell counting measurement quality. This approach does not rely on the use of a reference material or comparison to "gold standard" methods known to have limited assurance of accuracy and precision. The approach presented here may help the selection, optimization, and/or validation of a cell counting measurement process. Published by Elsevier Inc.
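
    The dilution-series idea can be sketched as follows, with hypothetical counts: a well-behaved counting method should give observed concentration proportional to the dilution fraction, so a least-squares fit through the origin and its goodness-of-fit offer simple indicators of proportionality. The data and statistic choices below are illustrative assumptions, not the study's protocol.

    ```python
    # Proportionality check for a dilution series: fit counts = k * fraction
    # through the origin and report the slope and coefficient of determination.

    def proportional_fit(fractions, counts):
        """Least-squares slope through the origin, plus R^2 of that fit."""
        k = (sum(f * c for f, c in zip(fractions, counts))
             / sum(f * f for f in fractions))
        ss_res = sum((c - k * f) ** 2 for f, c in zip(fractions, counts))
        mean_c = sum(counts) / len(counts)
        ss_tot = sum((c - mean_c) ** 2 for c in counts)
        return k, 1 - ss_res / ss_tot

    fractions = [1.00, 0.75, 0.50, 0.25]       # dilution fractions
    counts = [1.02e6, 0.74e6, 0.51e6, 0.26e6]  # hypothetical cells/mL means
    k, r2 = proportional_fit(fractions, counts)
    ```

    A slope close to the stock concentration with R^2 near 1 suggests the counting process is proportional over the range tested; the study's full analysis also quantifies precision from replicate samples and observations.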

  10. Improving transcriptome construction in non-model organisms: integrating manual and automated gene definition in Emiliania huxleyi.

    PubMed

    Feldmesser, Ester; Rosenwasser, Shilo; Vardi, Assaf; Ben-Dor, Shifra

    2014-02-22

    The advent of Next Generation Sequencing technologies and corresponding bioinformatics tools allows the definition of transcriptomes in non-model organisms. Non-model organisms are of great ecological and biotechnological significance, and consequently the understanding of their unique metabolic pathways is essential. Several methods that integrate de novo assembly with genome-based assembly have been proposed. Yet, there are many open challenges in defining genes, particularly where genomes are not available or incomplete. Despite the large number of transcriptome assemblies that have been performed, quality control of the transcript building process, particularly at the protein level, is rarely, if ever, performed. To test and improve the quality of the automated transcriptome reconstruction, we used manually defined and curated genes, several of them experimentally validated. Several approaches to transcript construction were utilized, based on the available data: a draft genome, high-quality RNAseq reads, and ESTs. In order to maximize the contribution of the various data, we integrated methods including de novo and genome-based assembly, as well as EST clustering. After each step, a set of manually curated genes was used for quality assessment of the transcripts. The interplay between the automated pipeline and the quality control indicated which additional processes were required to improve the transcriptome reconstruction. We discovered that E. huxleyi has a very high percentage of non-canonical splice junctions and relatively high rates of intron retention, which caused unique issues with the currently available tools. While individual tools missed genes and artificially joined overlapping transcripts, combining the results of several tools improved the completeness and quality considerably.
The final collection, created from the integration of several quality control and improvement rounds, was compared to the manually defined set both on the DNA and protein levels, and resulted in an improvement of 20% versus any of the read-based approaches alone. To the best of our knowledge, this is the first time that an automated transcript definition is subjected to quality control using manually defined and curated genes and thereafter the process is improved. We recommend using a set of manually curated genes to troubleshoot transcriptome reconstruction.

  11. Implementation of quality by design toward processing of food products.

    PubMed

    Rathore, Anurag S; Kapoor, Gautam

    2017-05-28

    Quality by design (QbD) is a systematic approach that begins with predefined objectives and emphasizes product and process understanding and process control. It is an approach based on principles of sound science and quality risk management. As the food processing industry continues to embrace the idea of in-line, online, and/or at-line sensors and real-time characterization for process monitoring and control, the existing gaps in our ability to monitor the multiple parameters/variables associated with the manufacturing process will be alleviated over time. Investments made in tools and approaches that facilitate high-throughput analytical and process development, process analytical technology, design of experiments, risk analysis, knowledge management, and enhancement of process/product understanding would pave the way for operational and economic benefits later in the commercialization process and across other product pipelines. This article aims to achieve two major objectives: first, to review the progress that has been made in recent years on the topic of QbD implementation in the processing of food products, and second, to present a case study that illustrates the benefits of such QbD implementation.

  12. Launching a Laboratory Testing Process Quality Improvement Toolkit: From the Shared Networks of Colorado Ambulatory Practices and Partners (SNOCAP).

    PubMed

    Fernald, Douglas; Hamer, Mika; James, Kathy; Tutt, Brandon; West, David

    2015-01-01

    Family medicine and internal medicine physicians order diagnostic laboratory tests for nearly one-third of patient encounters in an average week, yet among medical errors in primary care, an estimated 15% to 54% are attributed to laboratory testing processes. From a practice improvement perspective, we (1) describe the need for laboratory testing process quality improvements from the perspective of primary care practices, and (2) describe the approaches and resources needed to implement laboratory testing process quality improvements in practice. We applied practice observations, process mapping, and interviews with primary care practices in the Shared Networks of Colorado Ambulatory Practices and Partners (SNOCAP)-affiliated practice-based research networks that field-tested in 2013 a laboratory testing process improvement toolkit. From the data collected in each of the 22 participating practices, common testing quality issues included, but were not limited to, 3 main testing process steps: laboratory test preparation, test tracking, and patient notification. Three overarching qualitative themes emerged: practices readily acknowledge multiple laboratory testing process problems; practices know that they need help addressing the issues; and practices face challenges with finding patient-centered solutions compatible with practice priorities and available resources. While practices were able to get started with guidance and a toolkit to improve laboratory testing processes, most did not seem able to achieve their quality improvement aims unassisted. Providing specific guidance tools with practice facilitation or other rapid-cycle quality improvement support may be an effective approach to improve common laboratory testing issues in primary care. © Copyright 2015 by the American Board of Family Medicine.

  13. The quality of the evidence base for clinical pathway effectiveness: room for improvement in the design of evaluation trials.

    PubMed

    Rotter, Thomas; Kinsman, Leigh; James, Erica; Machotta, Andreas; Steyerberg, Ewout W

    2012-06-18

The purpose of this article is to report on the quality of the existing evidence base regarding the effectiveness of clinical pathway (CPW) research in the hospital setting. The analysis is based on a recently published Cochrane review of the effectiveness of CPWs. An integral component of the review process was a rigorous appraisal of the methodological quality of published CPW evaluations, which allowed the identification of strengths and limitations of the evidence base for CPW effectiveness. We followed the validated Cochrane Effective Practice and Organisation of Care Group (EPOC) criteria for randomized and non-randomized clinical pathway evaluations. In addition, we tested the hypothesis that simple pre-post studies tend to overestimate reported CPW effects. Out of the 260 primary studies meeting CPW content criteria, only 27 met the EPOC study design criteria; the majority of CPW studies (more than 70%) were excluded from the review on the basis that they were simple pre-post evaluations, mostly comparing two or more annual patient cohorts. Methodologically poor study designs are often used to evaluate CPWs, and this compromises the quality of the existing evidence base. Cochrane EPOC methodological criteria, including the selection of rigorous study designs along with detailed descriptions of CPW development and implementation processes, are recommended for quantitative evaluations to improve the evidence base for the use of CPWs in hospitals.

  14. Quality Measures in Orthopaedic Sports Medicine: A Systematic Review.

    PubMed

    Abrams, Geoffrey D; Greenberg, Daniel R; Dragoo, Jason L; Safran, Marc R; Kamal, Robin N

    2017-10-01

    To report the current quality measures that are applicable to orthopaedic sports medicine physicians. Six databases were searched with a customized search term to identify quality measures relevant to orthopaedic sports medicine surgeons: MEDLINE/PubMed, EMBASE, the National Quality Forum (NQF) Quality Positioning System (QPS), the Agency for Healthcare Research and Quality (AHRQ) National Quality Measures Clearinghouse (NQMC), the Physician Quality Reporting System (PQRS) database, and the American Academy of Orthopaedic Surgeons (AAOS) website. Results were screened by 2 Board-certified orthopaedic surgeons with fellowship training in sports medicine and dichotomized based on sports medicine-specific or general orthopaedic (nonarthroplasty) categories. Hip and knee arthroplasty measures were excluded. Included quality measures were further categorized based on Donabedian's domains and the Center for Medicare and Medicaid (CMS) National Quality Strategy priorities. A total of 1,292 quality measures were screened and 66 unique quality measures were included. A total of 47 were sports medicine-specific and 19 related to the general practice of orthopaedics for a fellowship-trained sports medicine specialist. Nineteen (29%) quality measures were collected within PQRS, with 5 of them relating to sports medicine and 14 relating to general orthopaedics. AAOS Clinical Practice Guidelines (CPGs) comprised 40 (60%) of the included measures and were all within sports medicine. Five (8%) additional measures were collected within AHRQ and 2 (3%) within NQF. Most quality measures consist of process rather than outcome or structural measures. No measures addressing concussions were identified. There are many existing quality measures relating to the practice of orthopaedic sports medicine. Most quality measures are process measures described within PQRS or AAOS CPGs. 
Knowledge of quality measures is important as they may be used to improve care, are increasingly being used to determine physician reimbursement, and can inform future quality measure development efforts. Published by Elsevier Inc.

  15. Delivery System Integration and Health Care Spending and Quality for Medicare Beneficiaries

    PubMed Central

    McWilliams, J. Michael; Chernew, Michael E.; Zaslavsky, Alan M.; Hamed, Pasha; Landon, Bruce E.

    2013-01-01

Background The Medicare accountable care organization (ACO) programs rely on delivery system integration and provider risk sharing to lower spending while improving quality of care. Methods Using 2009 Medicare claims and linked American Medical Association Group Practice data, we assigned 4.29 million beneficiaries to provider groups based on primary care use. We categorized group size according to eligibility thresholds for the Shared Savings (≥5,000 assigned beneficiaries) and Pioneer (≥15,000) ACO programs and distinguished hospital-based from independent groups. We compared spending and quality of care between larger and smaller provider groups and examined how size-related differences varied by 2 factors considered central to ACO performance: group primary care orientation (measured by the primary care share of large groups’ specialty mix) and provider risk sharing (measured by county health maintenance organization penetration and its relationship to financial risk accepted by different group types for managed care patients). Spending and quality of care measures included total medical spending, spending by type of service, 5 process measures of quality, and 30-day readmissions, all adjusted for sociodemographic and clinical characteristics. Results Compared with smaller groups, larger hospital-based groups had higher total per-beneficiary spending in 2009 (mean difference: +$849), higher 30-day readmission rates (+1.3 percentage points), and similar performance on 4 of 5 process measures of quality. In contrast, larger independent physician groups performed better than smaller groups on all process measures and exhibited significantly lower per-beneficiary spending in counties where risk sharing by these groups was more common (−$426). Among all groups sufficiently large to participate in ACO programs, a strong primary care orientation was associated with lower spending, fewer readmissions, and better quality of diabetes care. 
Conclusions Spending was lower and quality of care better for Medicare beneficiaries served by larger independent physician groups with strong primary care orientations in environments where providers accepted greater risk. PMID:23780467

  16. Tandem mass spectrometry data quality assessment by self-convolution.

    PubMed

    Choo, Keng Wah; Tham, Wai Mun

    2007-09-20

Many algorithms have been developed for deciphering tandem mass spectrometry (MS) data sets. They fall essentially into two classes: the first searches a database of theoretical mass spectra, while the second performs de novo sequencing from raw mass spectrometry data. In both cases, the quality of the mass spectra significantly affects the protein identification process. This prompted the authors to explore ways to measure the quality of MS data sets before subjecting them to protein identification algorithms, thus allowing for more meaningful searches and an increased confidence level in the proteins identified. The proposed method measures the quality of MS data sets based on the symmetric property of the b- and y-ion peaks present in an MS spectrum. Self-convolution of the MS data with its time-reversed copy was employed: owing to the symmetric nature of the b-ion and y-ion peaks, the self-convolution result of a good spectrum produces its highest intensity peak at the midpoint. To reduce processing time, self-convolution was computed using the Fast Fourier Transform and its inverse, followed by removal of the "DC" (direct current) component and normalisation of the data set. The quality score was defined as the ratio of the intensity at the midpoint to that of the remaining peaks of the convolution result. The method was validated using both theoretical mass spectra, with various permutations, and several real MS data sets. The results were encouraging, revealing a high percentage of positive predictions for spectra with good quality scores. We have demonstrated in this work a method for determining the quality of tandem MS data sets. By assessing the quality of tandem MS data before subjecting them to protein identification algorithms, spurious protein predictions due to poor tandem MS data are avoided, giving scientists greater confidence in the predicted results. We conclude that the algorithm performs well and could potentially be used as a pre-processing step for any mass spectrometry based protein identification tool.
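The scoring idea in this record can be sketched in a few lines. The snippet below is a minimal illustration, not the authors' implementation: it assumes a spectrum already binned onto a uniform mass axis, reads the self-convolution as piling the intensity of complementary b-/y-ion pairs onto the midpoint, and simplifies the paper's exact DC-removal and normalisation steps.

```python
import numpy as np

def quality_score(intensities):
    """Symmetry-based quality score for a binned MS/MS spectrum.

    Because complementary b- and y-ion masses sum to (roughly) the
    precursor mass, self-convolving a good spectrum concentrates
    intensity at the midpoint of the result.  The score is the ratio
    of the midpoint intensity to the sum of the remaining values.
    """
    x = np.asarray(intensities, dtype=float)
    n = 2 * len(x) - 1                    # length of the full linear convolution
    nfft = 1 << (n - 1).bit_length()      # next power of two >= n
    # Self-convolution via FFT (the paper's fast route), truncated to length n.
    conv = np.fft.irfft(np.fft.rfft(x, nfft) ** 2, nfft)[:n]
    conv /= conv.sum()                    # normalise to unit total intensity
    mid = n // 2
    return conv[mid] / (conv.sum() - conv[mid])

# A spectrum whose peaks mirror about the centre scores much higher
# than the same intensities shifted out of symmetry.
symmetric = [0, 5, 1, 3, 1, 5, 0]
shifted = [5, 5, 1, 3, 1, 0, 0]
print(quality_score(symmetric) > quality_score(shifted))  # True
```

Thresholding this ratio would then separate spectra worth submitting to identification from those likely to produce spurious matches.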

  17. Quality of the delivery services in health facilities in Northern Ethiopia.

    PubMed

    Fisseha, Girmatsion; Berhane, Yemane; Worku, Alemayehu; Terefe, Wondwossen

    2017-03-09

Substantial improvements have been observed in the coverage of and access to maternal health services, especially skilled birth attendance, in Ethiopia. However, the quality of care has been lagging behind. Therefore, this study investigated the status of the quality of delivery services in Northern Ethiopia. A facility-based survey was conducted from December 2014 to February 2015 in Northern Ethiopia. The quality of delivery services was assessed in 32 health facilities using a facility audit checklist, by reviewing delivery records, by conducting in-depth interviews and observations, and by conducting exit interviews with eligible mothers. Facilities were considered to have 'good quality' if they scored positively on 75% of the quality indicators set in the national guidelines for all three components: input (materials, infrastructure, and human resources), process (adherence to standard care procedures during the intrapartum and immediate postpartum periods), and output (the mothers' satisfaction and utilization of lifesaving procedures). Overall, 2 of the 32 (6.3%) study facilities fulfilled all three quality components: input, process, and output. Two of the three components were assessed as good in 11 of the 32 (34.4%) health facilities. Input quality fared best of the components, being good in 21 of the 32 (65.6%) health facilities. The process and output quality were good in only 10 of the 32 (31.3%) facilities. Only 6.3% of the studied health facilities had good quality in all three dimensions of the quality measures assessed in accordance with the national delivery service guidelines. The most compromised quality component was the process. Systematic and sustained efforts need to be strengthened to improve all dimensions of quality, in order to achieve the desired quality of delivery services and increase the proportion of births occurring in health facilities.

  18. Leadership, safety climate, and continuous quality improvement: impact on process quality and patient safety.

    PubMed

    McFadden, Kathleen L; Stock, Gregory N; Gowen, Charles R

    2014-10-01

    Successful amelioration of medical errors represents a significant problem in the health care industry. There is a need for greater understanding of the factors that lead to improved process quality and patient safety outcomes in hospitals. We present a research model that shows how transformational leadership, safety climate, and continuous quality improvement (CQI) initiatives are related to objective quality and patient safety outcome measures. The proposed framework is tested using structural equation modeling, based on data collected for 204 hospitals, and supplemented with objective outcome data from the Centers for Medicare and Medicaid Services. The results provide empirical evidence that a safety climate, which is connected to the chief executive officer's transformational leadership style, is related to CQI initiatives, which are linked to improved process quality. A unique finding of this study is that, although CQI initiatives are positively associated with improved process quality, they are also associated with higher hospital-acquired condition rates, a measure of patient safety. Likewise, safety climate is directly related to improved patient safety outcomes. The notion that patient safety climate and CQI initiatives are not interchangeable or universally beneficial is an important contribution to the literature. The results confirm the importance of using CQI to effectively enhance process quality in hospitals, and patient safety climate to improve patient safety outcomes. The overall pattern of findings suggests that simultaneous implementation of CQI initiatives and patient safety climate produces greater combined benefits.

  19. Leadership, safety climate, and continuous quality improvement: impact on process quality and patient safety.

    PubMed

    McFadden, Kathleen L; Stock, Gregory N; Gowen, Charles R

    2015-01-01

    Successful amelioration of medical errors represents a significant problem in the health care industry. There is a need for greater understanding of the factors that lead to improved process quality and patient safety outcomes in hospitals. We present a research model that shows how transformational leadership, safety climate, and continuous quality improvement (CQI) initiatives are related to objective quality and patient safety outcome measures. The proposed framework is tested using structural equation modeling, based on data collected for 204 hospitals, and supplemented with objective outcome data from the Centers for Medicare and Medicaid Services. The results provide empirical evidence that a safety climate, which is connected to the chief executive officer's transformational leadership style, is related to CQI initiatives, which are linked to improved process quality. A unique finding of this study is that, although CQI initiatives are positively associated with improved process quality, they are also associated with higher hospital-acquired condition rates, a measure of patient safety. Likewise, safety climate is directly related to improved patient safety outcomes. The notion that patient safety climate and CQI initiatives are not interchangeable or universally beneficial is an important contribution to the literature. The results confirm the importance of using CQI to effectively enhance process quality in hospitals, and patient safety climate to improve patient safety outcomes. The overall pattern of findings suggests that simultaneous implementation of CQI initiatives and patient safety climate produces greater combined benefits.

  20. Estimation of contribution ratios of pollutant sources to a specific section based on an enhanced water quality model.

    PubMed

    Cao, Bibo; Li, Chuan; Liu, Yan; Zhao, Yue; Sha, Jian; Wang, Yuqiu

    2015-05-01

    Because water quality monitoring sections or sites could reflect the water quality status of rivers, surface water quality management based on water quality monitoring sections or sites would be effective. For the purpose of improving water quality of rivers, quantifying the contribution ratios of pollutant resources to a specific section is necessary. Because physical and chemical processes of nutrient pollutants are complex in water bodies, it is difficult to quantitatively compute the contribution ratios. However, water quality models have proved to be effective tools to estimate surface water quality. In this project, an enhanced QUAL2Kw model with an added module was applied to the Xin'anjiang Watershed, to obtain water quality information along the river and to assess the contribution ratios of each pollutant source to a certain section (the Jiekou state-controlled section). Model validation indicated that the results were reliable. Then, contribution ratios were analyzed through the added module. Results show that among the pollutant sources, the Lianjiang tributary contributes the largest part of total nitrogen (50.43%), total phosphorus (45.60%), ammonia nitrogen (32.90%), nitrate (nitrite + nitrate) nitrogen (47.73%), and organic nitrogen (37.87%). Furthermore, contribution ratios in different reaches varied along the river. Compared with pollutant loads ratios of different sources in the watershed, an analysis of contribution ratios of pollutant sources for each specific section, which takes the localized chemical and physical processes into consideration, was more suitable for local-regional water quality management. In summary, this method of analyzing the contribution ratios of pollutant sources to a specific section based on the QUAL2Kw model was found to support the improvement of the local environment.

  1. Study on the Quality Management of Building Electricity Engineering Construction in the Whole Process

    NASA Astrophysics Data System (ADS)

    Qin, Minwu

    2018-05-01

With the progress of science and technology, the types of electrical equipment in use have multiplied and their functions have grown more complex, placing higher demands on the construction quality of building electrical work. Ignoring necessary quality requirements or violating operating specifications during building electrical construction introduces serious safety risks, can result in huge economic losses, and may even endanger personal safety. The management and control of construction quality must therefore run through the whole construction process. According to the construction characteristics of building electrical work, and based on management theory, this article analyzes construction details that are easily overlooked yet critically important, and puts forward methods of quality management for the whole process of building electrical construction.

  2. Automatic Assessment of Acquisition and Transmission Losses in Indian Remote Sensing Satellite Data

    NASA Astrophysics Data System (ADS)

    Roy, D.; Purna Kumari, B.; Manju Sarma, M.; Aparna, N.; Gopal Krishna, B.

    2016-06-01

The quality of remote sensing data is an important parameter that defines the extent of its usability in various applications. The data from remote sensing satellites are received as raw data frames at the ground station. These data may be corrupted by losses due to interference during data transmission, data acquisition, and sensor anomalies. It is therefore important to assess the quality of the raw data before product generation, for early anomaly detection, faster corrective action, and minimization of product rejection. Manual screening of raw images is time-consuming and not very accurate. In this paper, an automated process for the identification and quantification of losses in raw data, such as pixel drop-out, line loss, and data loss due to sensor anomalies, is discussed. Quality assessment of raw scenes based on these losses is also explained. This process is introduced at the data pre-processing stage and gives users crucial data-quality information at the time of browsing data for product ordering. It has also improved the product generation workflow by enabling faster and more accurate quality estimation.
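The loss types named in this record lend themselves to simple array checks. The sketch below is a hypothetical illustration, not the actual IRS ground-station pipeline: it assumes lost data shows up as zero-valued pixels and that a fully zero row marks a lost line, then turns the counts into a quality figure.

```python
import numpy as np

# Hypothetical raw frame with injected defects: one dropped (all-zero)
# line and two isolated pixel drop-outs inside otherwise valid data.
rng = np.random.default_rng(2)
frame = rng.integers(50, 200, size=(100, 120))  # valid counts are all >= 50
frame[37, :] = 0          # line loss
frame[10, 5] = 0          # pixel drop-out
frame[80, 99] = 0         # pixel drop-out

# Rows that are entirely zero are counted as lost lines; remaining
# zeros are counted as individual pixel drop-outs.
line_loss = np.where((frame == 0).all(axis=1))[0]
pixel_dropouts = int((frame == 0).sum() - line_loss.size * frame.shape[1])

# A simple quality grade: fraction of pixels unaffected by any loss.
quality = 1 - (frame == 0).sum() / frame.size
print(f"lost lines: {line_loss.tolist()}, dropped pixels: {pixel_dropouts}, "
      f"quality: {quality:.4f}")
```

A product-ordering interface could surface such per-scene figures so that users see the damage before generating a product.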

  3. Evaluation of the quality of the teaching-learning process in undergraduate courses in Nursing 1

    PubMed Central

    González-Chordá, Víctor Manuel; Maciá-Soler, María Loreto

    2015-01-01

Abstract Objective: to identify aspects for improving the quality of the teaching-learning process through the analysis of tools that evaluated the acquisition of skills by undergraduate students of Nursing. Method: prospective longitudinal study conducted in a population of 60 second-year Nursing students, based on registration data from which quality indicators evaluating the acquisition of skills were obtained, with descriptive and inferential analysis. Results: nine items and nine learning activities included in the assessment tools were identified that did not reach the established quality indicators (p<0.05). There are statistically significant differences depending on the hospital and clinical practice unit (p<0.05). Conclusion: the analysis of the evaluation tools used in the subject "Nursing Care in Welfare Processes" of the analyzed university undergraduate course enabled the detection of areas for improvement in the teaching-learning process. The challenge of education in nursing is to reach the best clinical research and educational results, in order to improve the quality of education and health care. PMID:26444173

  4. [Quality assurance and quality improvement. Personal experiences and intentions].

    PubMed

    Roche, B G; Sommer, C

    1995-01-01

In May 1994 we were selected by the Swiss surgical association to carry out a study of quality in the USA. During our travels we visited three types of institution: hospitals, the National Institute of Standards and Technology, and industry (Johnson & Johnson). We compared two types of quality program: Quality Assurance (QA) and Continuous Quality Improvement (CQI). In traditional healthcare circles, QA is the process established to meet external regulatory requirements and to assure that patient care is consistent with established standards. In modern quality terms, QA outside of healthcare means designing a product or service, and controlling its production, so well that quality is inevitable. The idea of W. Edwards Deming is that there is never improvement just by inspection; he developed a theory based on 14 principles. Productive work is accomplished through processes, and understanding the variability of processes is a key to improving quality. Quality management sees each person in an organisation as part of one or more processes. The job of every worker is to receive the work of others, add value to that work, and supply it to the next person in the process. This is called the triple role of the worker: customer, processor, and supplier. The main source of quality defects is problems in the process. The old assumption is that quality fails when people do the right thing wrong; the new assumption is that, more often, quality failures arise when people do the wrong thing right. Exhortation, incentives, and discipline of workers are unlikely to improve quality: if quality is failing when people do their jobs as designed, then exhorting them to do better is managerial nonsense. Modern quality theory is customer focused, with customers identified both internally and externally, and the modern approach to quality is thoroughly grounded in scientific and statistical thinking. As in medicine, the symptom is a defect in quality. 
The therapist of the process must perform diagnostic tests, formulate hypotheses of cause, test those hypotheses, apply remedies, and assess the effect of the remedies. Total employee involvement is critical; power comes from enabling all employees to become involved in quality improvement. A great advantage of CQI is the prevention orientation of the concept, and CQI promotes a collegial approach in which people learn how to work together to improve. CQI is, however, a time-consuming procedure. During our travels we learned the definition of quality as customer satisfaction. Building a CQI concept takes employee time, but it involves all employees in quality improvement. By applying CQI we could eventually move beyond quality control programs.

  5. Exploiting mAb structure characteristics for a directed QbD implementation in early process development.

    PubMed

    Karlberg, Micael; von Stosch, Moritz; Glassey, Jarka

    2018-03-07

In today's biopharmaceutical industry, developing and producing a new monoclonal antibody takes years before it can be launched commercially. The reasons lie in the complexity of monoclonal antibodies and the need for high product quality to ensure clinical safety, which has a significant impact on process development time. Frameworks such as quality by design are becoming widely used in the pharmaceutical industry as they introduce a systematic approach for building quality into the product. However, full implementation of quality by design has still not been achieved, mainly because of limited risk assessment of product properties and the large number of process factors affecting product quality that need to be investigated during process development. This has created a need for better methods and tools for early risk assessment and prediction of critical product properties and process factors, to enhance process development and reduce costs. In this review, we investigate how the quantitative structure-activity relationships framework can be applied to an existing process development framework such as quality by design in order to increase product understanding based on the protein structure of monoclonal antibodies. Compared to quality by design, where the effects of process parameters on the drug product are explored, quantitative structure-activity relationships give a reversed perspective, investigating how the protein structure can affect performance in different unit operations. This provides valuable information that can be used during the early process development of new drug products, where limited process understanding is available. Thus, quantitative structure-activity relationships methodology is explored and explained in detail, and we investigate the means of directly linking the structural properties of monoclonal antibodies to process data. The resulting information, used as a decision tool, can help enhance risk assessment, better aid process development, and thereby overcome some of the limitations and challenges present in QbD implementation today.

  6. Laser welding of polymers: phenomenological model for a quick and reliable process quality estimation considering beam shape influences

    NASA Astrophysics Data System (ADS)

    Timpe, Nathalie F.; Stuch, Julia; Scholl, Marcus; Russek, Ulrich A.

    2016-03-01

This contribution presents a phenomenological, analytical model for laser welding of polymers which is suited to quick process quality estimation by the practitioner. Besides material properties of the polymer and processing parameters like welding pressure, feed rate, and laser power, the model is based on a simple few-parameter description of the size and shape of the laser power density distribution (PDD) in the processing zone. The model allows an estimation of the weld seam tensile strength. It is based on energy balance considerations within a thin sheet, with the thickness of the optical penetration depth, on the surface of the absorbing welding partner. The joining process itself is modelled by a phenomenological approach. The model correctly reproduces the experimentally known process windows for the main process parameters. Using the parameters describing the shape of the laser PDD, the critical dependence of the process windows on the PDD shape is predicted and compared with experiments. The adaption of the model to other laser manufacturing processes where the PDD influence can be modelled comparably is discussed.

  7. WE-A-BRC-00: The Quality Gap

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    NONE

Quality and safety in healthcare are inextricably linked. There are compelling data that link poor quality radiation therapy to inferior patient survival. Radiation Oncology clinical trial protocol deviations often involve incorrect target volume delineation or dosing, akin to radiotherapy incidents which also often involve partial geometric miss or improper radiation dosing. When patients with radiation protocol variations are compared to those without significant protocol variations, clinical outcome is negatively impacted. Traditionally, quality assurance in radiation oncology has been driven largely by new technological advances, and safety improvement has been driven by reactive responses to past system failures and prescriptive mandates recommended by professional organizations and promulgated by regulators. Prescriptive approaches to quality and safety alone often do not address the huge variety of process and technique used in radiation oncology. Risk-based assessments of radiotherapy processes provide a mechanism to enhance quality and safety, both for new and for established techniques. It is imperative that we explore such a paradigm shift at this time, when expectations from patients as well as providers are rising while available resources are falling. There is much we can learn from our past experiences to be applied towards the new risk-based assessments. Learning Objectives: understand the impact of clinical and technical quality on outcomes; understand the importance of quality care in radiation oncology; learn to assess the impact of quality on clinical outcomes. D. Followill, NIH Grant CA180803.

  8. Current Use of Evidence-Based Medicine in Pediatric Spine Surgery.

    PubMed

    Oetgen, Matthew E

    2018-04-01

    Evidence-based medicine (EBM) is a process of decision-making aimed at making the best clinical decisions as they relate to patients' health. The current use of EBM in pediatric spine surgery is varied, based mainly on the availability of high-quality data. The use of EBM is limited in idiopathic scoliosis, whereas EBM has been used to investigate the treatment of pediatric spondylolysis. Studies on early onset scoliosis are of low quality, making EBM difficult in this condition. Future focus and commitment to study quality in pediatric spinal surgery will likely increase the role of EBM in these conditions. Copyright © 2017 Elsevier Inc. All rights reserved.

  9. Chemometrics-based process analytical technology (PAT) tools: applications and adaptation in pharmaceutical and biopharmaceutical industries.

    PubMed

    Challa, Shruthi; Potumarthi, Ravichandra

    2013-01-01

    Process analytical technology (PAT) is used to monitor and control critical process parameters in raw materials and in-process products to maintain the critical quality attributes and build quality into the product. Process analytical technology can be successfully implemented in pharmaceutical and biopharmaceutical industries not only to impart quality into the products but also to prevent out-of-specification results and improve productivity. PAT implementation eliminates the drawbacks of traditional methods, which involve excessive sampling, and facilitates rapid testing through direct sampling without any destruction of the sample. However, to successfully adapt PAT tools to the pharmaceutical and biopharmaceutical environment, a thorough understanding of the process is needed, along with mathematical and statistical tools to analyze the large multidimensional spectral data generated by PAT tools. Chemometrics is a chemical discipline which incorporates both statistical and mathematical methods to obtain and analyze relevant information from PAT spectral tools. Applications of commonly used PAT tools in combination with appropriate chemometric methods are discussed, along with their advantages and working principles. Finally, the systematic application of PAT tools in a biopharmaceutical environment to control critical process parameters for achieving product quality is diagrammatically represented.
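
    As a hedged illustration of the chemometric calibration described above, the sketch below uses principal component regression (one simple member of the family that also includes PLS) to relate synthetic "spectra" to an analyte concentration; all data, peak shapes and noise levels are invented.

```python
import numpy as np

# Hypothetical sketch (invented data): principal component regression,
# a simple chemometric calibration of analyte concentration against
# multivariate "spectra" of the kind PAT instruments produce.
rng = np.random.default_rng(0)
n_samples, n_wavelengths = 40, 200
conc = rng.uniform(0.0, 1.0, n_samples)             # latent concentrations
peak = np.exp(-0.5 * ((np.arange(n_wavelengths) - 100) / 8.0) ** 2)
X = conc[:, None] * peak + 0.01 * rng.standard_normal((n_samples, n_wavelengths))

# Mean-centre the spectra and extract a few principal components via SVD
Xc = X - X.mean(axis=0)
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
k = 3
scores = Xc @ Vt[:k].T                              # sample scores on k PCs

# Ordinary least squares of concentration on the PC scores
D = np.column_stack([np.ones(n_samples), scores])
coef, *_ = np.linalg.lstsq(D, conc, rcond=None)
pred = D @ coef
r2 = 1 - np.sum((conc - pred) ** 2) / np.sum((conc - conc.mean()) ** 2)
print(f"PCR calibration R^2 = {r2:.3f}")            # close to 1 on this clean toy data
```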

  10. Investigation into the influence of laser energy input on selective laser melted thin-walled parts by response surface method

    NASA Astrophysics Data System (ADS)

    Liu, Yang; Zhang, Jian; Pang, Zhicong; Wu, Weihui

    2018-04-01

    Selective laser melting (SLM) provides a feasible way to manufacture complex thin-walled parts directly; however, the energy input during the SLM process, derived from the laser power, scanning speed, layer thickness and scanning space, among other factors, has a great influence on the thin wall's quality. The aim of this work is to relate the thin wall's parameters (responses), namely track width, surface roughness and hardness, to the process parameters considered in this research (laser power, scanning speed and layer thickness) and to find the optimal manufacturing conditions. Design of experiments (DoE) was used, implementing a central composite design to achieve better manufacturing quality. Mathematical models derived from the statistical analysis were used to establish the relationships between the process parameters and the responses. Also, the effects of the process parameters on each response were determined. Then, a numerical optimization was performed to find the process settings at which the quality features are at their desired values. Based on this study, the relationship between the process parameters and the SLMed thin-walled structure was revealed, and thus the corresponding optimal process parameters can be used to manufacture thin-walled parts with high quality.
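
    The modelling-and-optimization loop described above can be sketched numerically. The toy example below (coded factors, an invented quadratic "roughness" response, and a grid search standing in for the paper's numerical optimization) fits a second-order response surface to a two-factor central composite design.

```python
import numpy as np
from itertools import product

# Illustrative sketch (not the paper's data): fit a second-order response
# surface to a two-factor central composite design and locate the optimum.
# The coded factors stand in for laser power and scanning speed; the
# "roughness" response is a made-up quadratic plus noise.
alpha = np.sqrt(2)
ccd = np.array(
    list(product([-1, 1], repeat=2))                      # factorial points
    + [[-alpha, 0], [alpha, 0], [0, -alpha], [0, alpha]]  # axial points
    + [[0, 0]] * 3)                                       # centre replicates
rng = np.random.default_rng(1)
x1, x2 = ccd[:, 0], ccd[:, 1]
y = 5 + (x1 - 0.3) ** 2 + 2 * (x2 + 0.2) ** 2 + 0.05 * rng.standard_normal(len(ccd))

# Design matrix for the full quadratic model, fitted by least squares
D = np.column_stack([np.ones(len(ccd)), x1, x2, x1**2, x2**2, x1 * x2])
b, *_ = np.linalg.lstsq(D, y, rcond=None)

# Numerical optimisation: grid search for the minimum-roughness setting
g = np.linspace(-alpha, alpha, 201)
G1, G2 = np.meshgrid(g, g)
Z = b[0] + b[1]*G1 + b[2]*G2 + b[3]*G1**2 + b[4]*G2**2 + b[5]*G1*G2
i = np.unravel_index(np.argmin(Z), Z.shape)
print(f"optimum near x1={G1[i]:.2f}, x2={G2[i]:.2f}")     # near (0.3, -0.2)
```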

  11. Knowledge work productivity effect on quality of knowledge work in software development process in SME

    NASA Astrophysics Data System (ADS)

    Yusoff, Mohd Zairol; Mahmuddin, Massudi; Ahmad, Mazida

    2016-08-01

    Knowledge and skill are necessary to develop the capability of knowledge workers. However, there is very little understanding of what the necessary knowledge work (KW) is, and how it influences the quality of knowledge work or knowledge work productivity (KWP) in the software development process, including in small and medium-sized enterprises (SMEs). SMEs constitute a major part of the economy, yet they have been relatively unsuccessful in developing KWP. Accordingly, this paper seeks to explore the dimensions of KWP that affect the quality of KW in the SME environment. First, based on an analysis of the existing literature, the key characteristics of KW productivity are defined. Second, a conceptual model is proposed which explores the dimensions of KWP and its quality. This study analyses data collected from 150 respondents (based on [1]) who are involved in SMEs in Malaysia and validates the models by using structural equation modeling (SEM). The results provide an analysis of the effect of KWP on the quality of KW and business success, and have significant relevance for both research and practice in the SME sector.

  12. Benefits of a comprehensive quality program for cryopreserved PBMC covering 28 clinical trials sites utilizing an integrated, analytical web-based portal.

    PubMed

    Ducar, Constance; Smith, Donna; Pinzon, Cris; Stirewalt, Michael; Cooper, Cristine; McElrath, M Juliana; Hural, John

    2014-07-01

    The HIV Vaccine Trials Network (HVTN) is a global network of 28 clinical trial sites dedicated to identifying an effective HIV vaccine. Cryopreservation of high-quality peripheral blood mononuclear cells (PBMC) is critical for the assessment of vaccine-induced cellular immune functions. The HVTN PBMC Quality Management Program is designed to ensure that viable PBMC are processed, stored and shipped for clinical trial assays from all HVTN clinical trial sites. The program has evolved by developing and incorporating best practices for laboratory and specimen quality and implementing automated, web-based tools. These tools allow the site-affiliated processing laboratories and the central Laboratory Operations Unit to rapidly collect, analyze and report PBMC quality data. The HVTN PBMC Quality Management Program includes five key components: 1) Laboratory Assessment, 2) PBMC Training and Certification, 3) Internal Quality Control, 4) External Quality Control (EQC), and 5) Assay Specimen Quality Control. Fresh PBMC processing data are uploaded from each clinical site processing laboratory to a central HVTN Statistical and Data Management Center database for access and analysis on a web portal. Samples are thawed at a central laboratory for assay or specimen quality control, and sample quality data are uploaded directly to the database by the central laboratory. Four-year cumulative data covering 23,477 blood draws reveal an average fresh PBMC yield of (1.45±0.48)×10⁶ cells per milliliter of usable whole blood. 95% of samples were within the acceptable range for fresh cell yield of 0.8-3.2×10⁶ cells/ml of usable blood. Prior to full implementation of the HVTN PBMC Quality Management Program, the 2007 EQC evaluations from 10 international sites showed a mean day-2 thawed viability of 83.1% and a recovery of 67.5%. Since then, four-year cumulative data covering 3338 specimens used in immunologic assays show that 99.88% had acceptable viabilities (>66%) for use in cellular assays (mean 91.46%±4.5%), and 96.2% had acceptable recoveries (50%-130%), with a mean recovery of 85.8%±19.12% of the originally cryopreserved cells. EQC testing revealed that since August 2009, failed recoveries dropped from 4.1% to 1.6% and failed viabilities dropped from 1.0% to 0.3%. The HVTN PBMC quality program provides for laboratory assessment, training and tools for identifying problems, implementing corrective action and monitoring for improvements. These data support the benefits of implementing a comprehensive, web-based PBMC quality program for large clinical trial networks. Copyright © 2014 Elsevier B.V. All rights reserved.
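
    The acceptance criteria quoted above (yield 0.8-3.2×10⁶ cells/ml, viability >66%, recovery 50%-130%) lend themselves to a simple sketch; the sample values below are invented.

```python
# Minimal sketch of the acceptance checks described above, using the
# published thresholds. The sample records are invented for illustration.
samples = [
    {"yield_e6_per_ml": 1.45, "viability_pct": 91.5, "recovery_pct": 85.8},
    {"yield_e6_per_ml": 0.70, "viability_pct": 60.0, "recovery_pct": 140.0},
]

def qc_flags(s):
    """Apply the three acceptance windows to one processed sample."""
    return {
        "yield_ok": 0.8 <= s["yield_e6_per_ml"] <= 3.2,
        "viability_ok": s["viability_pct"] > 66.0,
        "recovery_ok": 50.0 <= s["recovery_pct"] <= 130.0,
    }

results = [qc_flags(s) for s in samples]
pass_rate = sum(all(r.values()) for r in results) / len(results)
print(results)
print(f"overall pass rate: {pass_rate:.0%}")
```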

  13. The Bologna Process and the Search for Excellence: Between Rhetoric and Reality, the Emotional Reactions of Teachers

    ERIC Educational Resources Information Center

    Bahia, Sara; Freire, Isabel P.; Estrela, Maria Teresa; Amaral, Anabela; Espírito Santo, José António

    2017-01-01

    There has been an overall change in higher education towards the achievement of outstanding patterns of quality and excellence that assure competitiveness at a global scale. Teachers feel the pressure of official regulations for achieving quality and excellence, based on questionable concepts of quality that do not take into account the experience…

  14. Social Justice and School Improvement: Improving the Quality of Schooling in the Poorest Neighbourhoods

    ERIC Educational Resources Information Center

    Lupton, Ruth

    2005-01-01

    Social justice in education demands, at the very least, that all students should have access to the same quality of educational processes, even if their outcomes turn out to be unequal. Yet schools in the poorest neighbourhoods are consistently adjudged to provide a lower quality of education than those in more advantaged areas. Based on a…

  15. Enhancing Early Child Care Quality and Learning for Toddlers at Risk: The Responsive Early Childhood Program

    ERIC Educational Resources Information Center

    Landry, Susan H.; Zucker, Tricia A.; Taylor, Heather B.; Swank, Paul R.; Williams, Jeffrey M.; Assel, Michael; Crawford, April; Huang, Weihua; Clancy-Menchetti, Jeanine; Lonigan, Christopher J.; Phillips, Beth M.; Eisenberg, Nancy; Spinrad, Tracy L.; de Viliers, Jill; de Viliers, Peter; Barnes, Marcia; Starkey, Prentice; Klein, Alice

    2014-01-01

    Despite reports of positive effects of high-quality child care, few experimental studies have examined the process of improving low-quality center-based care for toddler-age children. In this article, we report intervention effects on child care teachers' behaviors and children's social, social-emotional classroom activities (RECC).…

  16. An approach to quality and security of supply for single-use bioreactors.

    PubMed

    Barbaroux, Magali; Gerighausen, Susanne; Hackel, Heiko

    2014-01-01

    Single-use systems (also referred to as disposables) have become a huge part of the bioprocessing industry, which raised concern in the industry regarding quality and security of supply. Processes must be in place to assure the supply and control of outsourced activities and quality of purchased materials along the product life cycle. Quality and security of supply for single-use bioreactors (SUBs) are based on a multidisciplinary approach. Developing a state-of-the-art SUB-system based on quality by design (QbD) principles requires broad expertise and know-how including the cell culture application, polymer chemistry, regulatory requirements, and a deep understanding of the biopharmaceutical industry. Using standardized products reduces the complexity and strengthens the robustness of the supply chain. Well-established supplier relations including risk mitigation strategies are the basis for achieving long-term security of supply. Well-developed quality systems including change control approaches aligned with the requirements of the biopharmaceutical industry are a key factor in supporting long-term product availability. This chapter outlines the approach to security of supply for key materials used in single-use production processes for biopharmaceuticals from a supplier perspective.

  17. [Improvement of medical processes with Six Sigma - practicable zero-defect quality in preparation for surgery].

    PubMed

    Sobottka, Stephan B; Töpfer, Armin; Eberlein-Gonska, Maria; Schackert, Gabriele; Albrecht, D Michael

    2010-01-01

    Six Sigma is an innovative management approach for reaching practicable zero-defect quality in medical service processes. The Six Sigma principle utilizes strategies which are based on quantitative measurements and which seek to optimize processes, limiting deviations or dispersion from the target process. Hence, Six Sigma aims to eliminate errors or quality problems of all kinds. A pilot project to optimize the preparation for neurosurgery could now show that the Six Sigma method enhanced patient safety in medical care, while at the same time disturbances in the hospital processes and failure costs could be avoided. All six defined safety-relevant quality indicators were significantly improved by changes in the workflow using a standardized process- and patient-oriented approach. Certain defined quality standards, such as a 100% complete surgical preparation at the start of surgery and the required initial contact of the surgeon with the patient/surgical record on the eve of surgery, could be fulfilled within the range of practicable zero-defect quality. Likewise, the degree of completion of the surgical record by 4 p.m. on the eve of surgery and its quality could be improved by factors of 170 and 16, respectively, at sigma values of 4.43 and 4.38. The other two safety quality indicators, "non-communicated changes in the OR schedule" and "completeness of the OR schedule by 12:30 a.m. on the day before surgery", also showed an impressive improvement, by factors of 2.8 and 7.7, respectively, corresponding to sigma values of 3.34 and 3.51. The results of this pilot project demonstrate that the Six Sigma method is eminently suitable for improving the quality of medical processes. In our experience this methodology is suitable even for complex clinical processes with a variety of stakeholders. In particular, in processes in which patient safety plays a key role, the objective of achieving zero-defect quality is reasonable and should definitely be aspired to.
Copyright © 2010. Published by Elsevier GmbH.
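
    The sigma values quoted above can be related to defect rates by the conventional short-term sigma-level calculation (the z-score of the process yield plus the customary 1.5σ shift); the defect counts in this sketch are invented, not the project's data.

```python
from statistics import NormalDist

# Hedged sketch of the usual short-term sigma-level calculation:
# z-score of the yield plus the conventional 1.5-sigma shift.
def sigma_level(defects, opportunities):
    yield_frac = 1 - defects / opportunities
    return NormalDist().inv_cdf(yield_frac) + 1.5

print(f"{sigma_level(1, 1000):.2f}")   # 0.1% defects -> about 4.59 sigma
print(f"{sigma_level(50, 1000):.2f}")  # 5% defects -> about 3.14 sigma
```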

  18. Measuring housing quality in the absence of a monetized real estate market.

    PubMed

    Rindfuss, Ronald R; Piotrowski, Martin; Thongthai, Varachai; Prasartkul, Pramote

    2007-03-01

    Measuring housing quality or value or both has been a weak component of demographic and development research in less developed countries that lack an active real estate (housing) market. We describe a new method based on a standardized subjective rating process. It is designed to be used in settings that do not have an active, monetized housing market. The method is applied in an ongoing longitudinal study in north-east Thailand and could be straightforwardly used in many other settings. We develop a conceptual model of the process whereby households come to reside in high-quality or low-quality housing units. We use this theoretical model in conjunction with longitudinal data to show that the new method of measuring housing quality behaves as theoretically expected, thus providing evidence of face validity.

  19. Incorporating Topic Assignment Constraint and Topic Correlation Limitation into Clinical Goal Discovering for Clinical Pathway Mining.

    PubMed

    Xu, Xiao; Jin, Tao; Wei, Zhijie; Wang, Jianmin

    2017-01-01

    Clinical pathways are widely used around the world for providing quality medical treatment and controlling healthcare costs. However, expert-designed clinical pathways can hardly deal with the variances among hospitals and patients. This calls for a more dynamic and adaptive process, derived from various clinical data. Topic-based clinical pathway mining is an effective approach to discovering a concise process model. In this approach, the latent topics found by latent Dirichlet allocation (LDA) represent the clinical goals, and process mining methods are used to extract the temporal relations between these topics. However, the topic quality is usually not desirable due to the low performance of LDA on clinical data. In this paper, we incorporate a topic assignment constraint and a topic correlation limitation into LDA to enhance its ability to discover high-quality topics. Two real-world datasets are used to evaluate the proposed method. The results show that the topics discovered by our method have higher coherence, informativeness, and coverage than those of the original LDA. These quality topics are suitable for representing the clinical goals. We also illustrate that our method is effective in generating a comprehensive topic-based clinical pathway model.
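
    For orientation, the baseline the paper improves on, plain LDA, can be sketched with a tiny collapsed Gibbs sampler; the "patient day" documents and clinical-event vocabulary below are invented, and the paper's added constraints are not implemented here.

```python
import numpy as np

# Toy sketch of plain LDA fit with a minimal collapsed Gibbs sampler.
# Each "document" is one invented patient day of clinical events.
docs = [
    ["xray", "cast", "cast", "analgesic"],
    ["xray", "cast", "analgesic", "analgesic"],
    ["culture", "antibiotic", "antibiotic", "iv_fluid"],
    ["culture", "antibiotic", "iv_fluid", "iv_fluid"],
]
vocab = sorted({w for d in docs for w in d})
w2i = {w: i for i, w in enumerate(vocab)}
K, V, alpha, beta = 2, len(vocab), 0.1, 0.01
rng = np.random.default_rng(0)

# Random initial topic assignments and the corresponding count tables
z = [rng.integers(K, size=len(d)) for d in docs]
ndk = np.zeros((len(docs), K)); nkw = np.zeros((K, V)); nk = np.zeros(K)
for d, doc in enumerate(docs):
    for n, w in enumerate(doc):
        k = z[d][n]; ndk[d, k] += 1; nkw[k, w2i[w]] += 1; nk[k] += 1

for _ in range(200):                       # Gibbs sweeps
    for d, doc in enumerate(docs):
        for n, w in enumerate(doc):
            k, wi = z[d][n], w2i[w]
            ndk[d, k] -= 1; nkw[k, wi] -= 1; nk[k] -= 1
            p = (ndk[d] + alpha) * (nkw[:, wi] + beta) / (nk + V * beta)
            k = rng.choice(K, p=p / p.sum())
            z[d][n] = k; ndk[d, k] += 1; nkw[k, wi] += 1; nk[k] += 1

phi = (nkw + beta) / (nk[:, None] + V * beta)   # topic-word distributions
for k in range(K):
    top = [vocab[i] for i in np.argsort(phi[k])[::-1][:2]]
    print(f"topic {k}: {top}")
```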

  1. Can current analytical quality performance of UK clinical laboratories support evidence-based guidelines for diabetes and ischaemic heart disease?--A pilot study and a proposal.

    PubMed

    Jassam, Nuthar; Yundt-Pacheco, John; Jansen, Rob; Thomas, Annette; Barth, Julian H

    2013-08-01

    The implementation of national and international guidelines is beginning to standardise clinical practice. However, since many guidelines have decision limits based on laboratory tests, there is an urgent need to ensure that different laboratories obtain the same analytical result on any sample. A scientifically based quality control process will be a prerequisite for providing the level of analytical performance that supports evidence-based guidelines and the movement of patients across boundaries while maintaining standardised outcomes. We discuss the findings of a pilot study performed to assess UK clinical laboratories' readiness to work to higher-grade quality specifications, such as those based on biological variation. Internal quality control (IQC) data for HbA1c, glucose, creatinine, cholesterol and high-density lipoprotein (HDL)-cholesterol were collected from UK laboratories participating in the Bio-Rad Unity QC programme. The median coefficient of variation (CV%) of the participating laboratories was evaluated against the CV% based on biological variation. Except for creatinine, the four analytes showed variable degrees of compliance with the biological variation-based quality specifications. More than 75% of the laboratories met the biological variation-based quality specifications for glucose, cholesterol and HDL-cholesterol. Slightly over 50% of the laboratories met the analytical goal for HbA1c. Only one analyte (cholesterol) had a performance achieving the higher quality specifications consistent with 5σ. Our IQC data do not consistently demonstrate that the results from clinical laboratories meet evidence-based quality specifications. Therefore, we propose that a graded scale of quality specifications may be needed at this stage.
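
    The compliance check described above can be sketched using the widely quoted "desirable" imprecision specification CVa ≤ 0.5 × CVi, where CVi is within-subject biological variation; the CVi figures below are approximate published values and the laboratory CVs are invented.

```python
# Sketch of grading laboratory imprecision against biological
# variation-based specifications (CVa <= 0.5 * CVi). CVi values are
# approximate textbook figures; the lab CVs are invented examples.
cvi = {"glucose": 5.6, "cholesterol": 5.4, "HbA1c": 1.9}     # within-subject CV%
lab_cv = {"glucose": 2.2, "cholesterol": 2.0, "HbA1c": 2.5}  # median IQC CV%

for analyte, cv in lab_cv.items():
    spec = 0.5 * cvi[analyte]
    verdict = "meets" if cv <= spec else "fails"
    print(f"{analyte}: CV {cv:.1f}% vs desirable spec {spec:.2f}% -> {verdict}")
```

    Note how the tight within-subject variation of HbA1c makes its specification much harder to meet than those of glucose or cholesterol, mirroring the pattern the pilot study reports.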

  2. Contact and non-contact ultrasonic measurement in the food industry: a review

    NASA Astrophysics Data System (ADS)

    Taufiq Mohd Khairi, Mohd; Ibrahim, Sallehuddin; Yunus, Mohd Amri Md; Faramarzi, Mahdi

    2016-01-01

    The monitoring of the food manufacturing process is vital since it determines the safety and quality of foods, which directly affect consumers’ health. Companies which produce high-quality products gain the trust of consumers, which in turn helps them make profits. The use of efficient and appropriate sensors for the monitoring process can also reduce cost. Food assessment based on ultrasonic sensors has attracted the attention of the food industry due to its excellent capabilities in several applications. The utilization of low or high frequencies for the ultrasonic transducer has provided enormous benefits for analysing, modifying and guaranteeing the quality of food. The contact and non-contact ultrasonic measurement modes have also contributed significantly to food processing. This paper presents a review of the application of contact and non-contact modes of ultrasonic measurement, focusing on safety and quality control; the results of previous research are presented and elaborated.
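
    Many of the reviewed measurements rest on the pulse-echo principle, distance from time of flight, d = v·t/2; a minimal sketch with invented example values:

```python
# Pulse-echo time-of-flight sketch: one-way distance d = v * t / 2.
# Sound speed and echo time are invented example values.
def thickness_mm(v_m_per_s, tof_us):
    """One-way distance in mm from round-trip time of flight."""
    return v_m_per_s * (tof_us * 1e-6) / 2 * 1000

# e.g. an echo after 20 us through a water-like medium (v ~ 1500 m/s)
print(f"{thickness_mm(1500, 20):.1f} mm")  # 15.0 mm
```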

  3. Comparison of modelling accuracy with and without exploiting automated optical monitoring information in predicting the treated wastewater quality.

    PubMed

    Tomperi, Jani; Leiviskä, Kauko

    2018-06-01

    Traditionally, modelling of the activated sludge process has been based solely on process measurements, but as interest in optically monitoring wastewater samples to characterize floc morphology has increased, in recent years the results of image analyses have been more frequently utilized to predict the characteristics of wastewater. This study shows that the traditional process measurements or the automated optical monitoring variables by themselves are not capable of producing the best predictive models for the treated wastewater quality in a full-scale wastewater treatment plant; only by utilizing these variables together are the optimal models, which show the level of and changes in the treated wastewater quality, achieved. With this early warning, process operation can be optimized to avoid environmental damage and economic losses. The study also shows that specific optical monitoring variables are important in modelling a certain quality parameter, regardless of the other input variables available.
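
    The paper's central comparison, process measurements alone versus optical monitoring variables alone versus both, can be mimicked on synthetic data; the variable names and coefficients below are invented.

```python
import numpy as np

# Illustrative numeric sketch of the feature-set comparison: predict an
# effluent quality value from (a) process measurements only, (b) optical
# floc-morphology variables only, (c) both. All data are synthetic.
rng = np.random.default_rng(2)
n = 300
process = rng.standard_normal((n, 2))     # e.g. flow, aeration (invented)
optical = rng.standard_normal((n, 2))     # e.g. floc size, count (invented)
y = 1.0 * process[:, 0] + 0.8 * optical[:, 0] + 0.3 * rng.standard_normal(n)

def r2_of(X):
    """R^2 of an ordinary least-squares fit of y on the columns of X."""
    X1 = np.column_stack([np.ones(len(X)), X])
    coef, *_ = np.linalg.lstsq(X1, y, rcond=None)
    resid = y - X1 @ coef
    return 1 - resid.var() / y.var()

for name, X in [("process only", process), ("optical only", optical),
                ("combined", np.hstack([process, optical]))]:
    print(f"{name}: R^2 = {r2_of(X):.2f}")   # combined set scores highest
```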

  4. EARSEC SAR processing system

    NASA Astrophysics Data System (ADS)

    Protheroe, Mark; Sloggett, David R.; Sieber, Alois J.

    1994-12-01

    Traditionally, the production of high-quality Synthetic Aperture Radar imagery has been an area where a potential user would have to expend large amounts of money either on the bespoke development of a processing chain dedicated to his requirements or on the purchase of a dedicated hardware platform adapted with accelerator boards and enhanced memory management. Whichever option the user adopted, there were limitations based on the desire for a realistic throughput in data load and time. The user had a choice, made early at purchase, between a system that adopted innovative algorithmic manipulation to limit the processing time and the purchase of expensive hardware. The former limits the quality of the product, while the latter excludes the user from any visibility into the processing chain. Clearly there was a need for a SAR processing architecture that gave the user a choice of methodology for a particular processing sequence, allowing him to decide on either a quick (lower-quality) product or a slower, detailed (high-quality) product, without having to change the algorithmic base of his processor or the hardware platform. The European Commission, through the Advanced Techniques unit of the Joint Research Centre (JRC) Institute for Remote Sensing at Ispra in Italy, realizing the limitations of current processing abilities, initiated its own program to build airborne SAR and electro-optical (EO) sensor systems, called the European Airborne Remote Sensing Capabilities (EARSEC) program. This paper describes the processing system developed for the airborne SAR sensor system. It considers the requirements for the system and the design of the EARSEC Airborne SAR Processing System, highlighting the development of an open SAR processing architecture in which users have full access to the intermediate products that arise from each of the major processing stages. It also describes the main processing stages in the overall architecture and illustrates the results of each of the key stages in the processor.
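
    One early stage of a typical SAR processing chain, range compression by matched filtering of a linear-FM chirp, can be sketched as follows; the parameter values are arbitrary and not taken from the EARSEC processor.

```python
import numpy as np

# Hedged sketch of SAR range compression: matched filtering of a
# linear-FM chirp in the frequency domain. Parameters are invented.
fs, T, B = 100e6, 10e-6, 30e6                 # sample rate, pulse length, bandwidth
t = np.arange(int(T * fs)) / fs
chirp = np.exp(1j * np.pi * (B / T) * t**2)   # transmitted replica

echo = np.zeros(4096, dtype=complex)
delay = 1500                                  # invented target position (samples)
echo[delay:delay + len(chirp)] = 0.5 * chirp  # one point target

# Matched filter: multiply by the conjugate chirp spectrum
N = len(echo)
H = np.conj(np.fft.fft(chirp, N))
compressed = np.fft.ifft(np.fft.fft(echo) * H)
peak = int(np.argmax(np.abs(compressed)))
print(f"target compressed to sample {peak}")  # peak at the inserted delay, 1500
```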

  5. AQMEII: A New International Initiative on Air Quality Model Evaluation

    EPA Science Inventory

    We provide a conceptual view of the process of evaluating regional-scale three-dimensional numerical photochemical air quality modeling systems, based on an examination of existing approaches to the evaluation of such systems as they are currently used in a variety of applications…

  6. Applying a Continuous Quality Improvement Model To Assess Institutional Effectiveness.

    ERIC Educational Resources Information Center

    Roberts, Keith

    This handbook outlines techniques and processes for improving institutional effectiveness and ensuring continuous quality improvement, based on strategic planning activities at Wisconsin's Milwaukee Area Technical College (MATC). First, institutional effectiveness is defined and 17 core indicators of effectiveness developed by the Wisconsin…

  7. QAIT: a quality assurance issue tracking tool to facilitate the improvement of clinical data quality.

    PubMed

    Zhang, Yonghong; Sun, Weihong; Gutchell, Emily M; Kvecher, Leonid; Kohr, Joni; Bekhash, Anthony; Shriver, Craig D; Liebman, Michael N; Mural, Richard J; Hu, Hai

    2013-01-01

    In clinical and translational research as well as clinical trial projects, clinical data collection is prone to errors such as missing data, and misinterpretation or inconsistency of the data. A good quality assurance (QA) program can resolve many such errors though this requires efficient communications between the QA staff and data collectors. Managing such communications is critical to resolving QA problems but imposes a major challenge for a project involving multiple clinical and data processing sites. We have developed a QA issue tracking (QAIT) system to support clinical data QA in the Clinical Breast Care Project (CBCP). This web-based application provides centralized management of QA issues with role-based access privileges. It has greatly facilitated the QA process and enhanced the overall quality of the CBCP clinical data. As a stand-alone system, QAIT can supplement any other clinical data management systems and can be adapted to support other projects. Copyright © 2012 Elsevier Ireland Ltd. All rights reserved.

  8. QoS measurement of workflow-based web service compositions using Colored Petri net.

    PubMed

    Nematzadeh, Hossein; Motameni, Homayun; Mohamad, Radziah; Nematzadeh, Zahra

    2014-01-01

    Workflow-based web service compositions (WB-WSCs) form one of the main composition categories in service-oriented architecture (SOA). Eflow, the polymorphic process model (PPM), and the business process execution language (BPEL) are the main techniques in the category of WB-WSCs. Due to the maturity of web services, measuring the quality of composite web services developed by different techniques has become one of the most important challenges in today's web environments. Businesses should try to provide a composed web service of good quality with regard to customers' requirements. Thus, quality of service (QoS), which refers to nonfunctional parameters, is important to measure so that the quality degree of a certain web service composition can be determined. This paper tries to find a deterministic analytical method for dependability and performance measurement using Colored Petri nets (CPN) with explicit routing constructs and application of the theory of probability. A computer tool called WSET was also developed for modeling and supporting QoS measurement through simulation.
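
    Independent of the Colored Petri net machinery, the kind of composite QoS figure such an analysis yields can be sketched with the usual aggregation rules (sequential response times add and reliabilities multiply; a parallel split waits for its slowest branch); the service figures below are invented.

```python
# Sketch of QoS aggregation over a small workflow composition.
# All service figures are invented examples.
services = {
    "checkStock": {"rt_ms": 40,  "rel": 0.999},
    "charge":     {"rt_ms": 120, "rel": 0.995},
    "ship":       {"rt_ms": 80,  "rel": 0.998},
    "notify":     {"rt_ms": 30,  "rel": 0.990},
}

def sequence(*names):
    """Sequential flow: response times add, reliabilities multiply."""
    rt = sum(services[n]["rt_ms"] for n in names)
    rel = 1.0
    for n in names:
        rel *= services[n]["rel"]
    return rt, rel

def parallel(*names):
    """AND-split: wait for the slowest branch; all branches must succeed."""
    rt = max(services[n]["rt_ms"] for n in names)
    rel = 1.0
    for n in names:
        rel *= services[n]["rel"]
    return rt, rel

# Composition: checkStock ; charge ; (ship || notify)
rt_seq, rel_seq = sequence("checkStock", "charge")
rt_par, rel_par = parallel("ship", "notify")
total_rt, total_rel = rt_seq + rt_par, rel_seq * rel_par
print(f"composite: {total_rt} ms, reliability {total_rel:.4f}")  # 240 ms, 0.9821
```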

  9. From Empiricism to Total Quality Management in Greek Education

    NASA Astrophysics Data System (ADS)

    Karavasilis, Ioannis; Samoladas, Ioannis; Nedos, Apostolos

    Nowadays the education system in Greece is moving towards democratization and decentralization. The school unit is the basic cell of the education system. The principal's role is highly demanding, multi-dimensional, and a critical determinant of school performance and effectiveness. The paper proposes an effective organizational plan for school units in primary education based on basic administration processes and Total Quality Management. Using the theory of emotional intelligence and the Blake-Mouton grid, it emphasizes the impact of the principal's leadership on democratizing the school unit, on creating a safe and secure environment and a positive school climate, and on motivating the teachers' committee to participate in the decision-making process.

  10. Extension of quality-by-design concept to the early development phase of pharmaceutical R&D processes.

    PubMed

    Csóka, Ildikó; Pallagi, Edina; Paál, Tamás L

    2018-03-27

    Here, we propose the extension of the quality-by-design (QbD) concept to also fit the early development phases of pharmaceuticals by adding elements that are currently widely applied but not yet included in the QbD model in a structured way. These are the introduction of a 'zero' preformulation phase (i.e., selection of the drug substance and of possible dosage forms and administration routes based on the evaluated therapeutic need); building stakeholders' (industry, patient, and regulatory) requirements into the quality target product profile (QTPP); and the use of modern quality management tools during the composition and process design phase [collecting critical quality attributes (CQAs) and selecting critical process parameters (CPPs)] for (still laboratory-scale) design space (DS) development. Moreover, during industrial scale-up, CQAs as well as CPPs can change; we therefore recommend that the existing QbD elements be reconsidered and updated after this phase. Copyright © 2018 Elsevier Ltd. All rights reserved.

  11. Image processing developments and applications for water quality monitoring and trophic state determination

    NASA Technical Reports Server (NTRS)

    Blackwell, R. J.

    1982-01-01

    Remote sensing data analysis for water quality monitoring is evaluated. Data analysis and image processing techniques are applied to LANDSAT remote sensing data to produce an effective operational tool for lake water quality surveying and monitoring. Digital image processing and analysis techniques were designed, developed, tested, and applied to LANDSAT multispectral scanner (MSS) data and conventional surface-acquired data. Utilization of these techniques facilitates the surveying and monitoring of large numbers of lakes in an operational manner. Supervised multispectral classification, when used in conjunction with surface-acquired water quality indicators, is used to characterize water body trophic status. Unsupervised multispectral classification, when interpreted by lake scientists familiar with a specific water body, yields classifications of equal validity with supervised methods and in a more cost-effective manner. Image data base technology is used to great advantage in characterizing other contributing effects on water quality. These effects include drainage basin configuration, terrain slope, soil, precipitation and land cover characteristics.
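
    The unsupervised multispectral classification mentioned above can be illustrated with plain k-means on two synthetic MSS-like bands; labelling the resulting clusters with trophic meaning would, as the abstract notes, remain with the lake scientist.

```python
import numpy as np

# Toy sketch of unsupervised classification: k-means on two synthetic
# spectral bands. Pixel values and class means are invented.
rng = np.random.default_rng(3)
clear  = rng.normal([20, 10], 2, size=(200, 2))   # low-reflectance pixels
turbid = rng.normal([60, 45], 2, size=(200, 2))   # high-reflectance pixels
pixels = np.vstack([clear, turbid])

k = 2
centroids = np.array([pixels[0], pixels[-1]])     # deterministic init for the demo
for _ in range(20):                               # Lloyd's iterations
    dist = np.linalg.norm(pixels[:, None] - centroids[None], axis=2)
    labels = dist.argmin(axis=1)
    centroids = np.array([pixels[labels == j].mean(axis=0) for j in range(k)])

print(np.round(centroids, 1))                     # near the two class means
```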

  12. Quality expectations and tolerance limits of trial master files (TMF) – Developing a risk-based approach for quality assessments of TMFs

    PubMed Central

    Hecht, Arthur; Busch-Heidger, Barbara; Gertzen, Heiner; Pfister, Heike; Ruhfus, Birgit; Sanden, Per-Holger; Schmidt, Gabriele B.

    2015-01-01

    This article addresses the question of when a trial master file (TMF) can be considered sufficiently accurate and complete: What attributes does the TMF need to have so that a clinical trial can be adequately reconstructed from documented data and procedures? Clinical trial sponsors face significant challenges in assembling the TMF, especially when dealing with large, international, multicenter studies; despite all newly introduced archiving techniques it is becoming more and more difficult to ensure that the TMF is complete. This is directly reflected in the number of inspection findings reported and published by the EMA in 2014. Based on quality risk management principles in clinical trials the authors defined the quality expectations for the different document types in a TMF and furthermore defined tolerance limits for missing documents. This publication provides guidance on what type of documents and processes are most important, and in consequence, indicates on which documents and processes trial team staff should focus in order to achieve a high-quality TMF. The members of this working group belong to the CQAG Group (Clinical Quality Assurance Germany) and are QA (quality assurance) experts (auditors or compliance functions) with long-term experience in the practical handling of TMFs. PMID:26693218
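
    The tolerance-limit idea can be sketched as a simple completeness check per document type; the types, counts and tolerated fractions below are invented examples, not the working group's table.

```python
# Sketch of a risk-based TMF completeness check: each document type
# carries a tolerated fraction of missing items. All values invented.
doc_types = {
    # type: (expected, missing, tolerated_missing_fraction)
    "informed_consents":  (120, 0, 0.00),   # critical: zero tolerance
    "monitoring_reports": (40, 1, 0.05),
    "training_records":   (60, 6, 0.15),
}

findings = []
for name, (expected, missing, tol) in doc_types.items():
    frac = missing / expected
    if frac > tol:
        findings.append(name)
    print(f"{name}: {frac:.1%} missing (tolerance {tol:.0%})")

print("TMF acceptable" if not findings else f"follow up on: {findings}")
```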

  13. Quality expectations and tolerance limits of trial master files (TMF) - Developing a risk-based approach for quality assessments of TMFs.

    PubMed

    Hecht, Arthur; Busch-Heidger, Barbara; Gertzen, Heiner; Pfister, Heike; Ruhfus, Birgit; Sanden, Per-Holger; Schmidt, Gabriele B

    2015-01-01

    This article addresses the question of when a trial master file (TMF) can be considered sufficiently accurate and complete: What attributes does the TMF need to have so that a clinical trial can be adequately reconstructed from documented data and procedures? Clinical trial sponsors face significant challenges in assembling the TMF, especially when dealing with large, international, multicenter studies; despite all newly introduced archiving techniques, it is becoming more and more difficult to ensure that the TMF is complete. This is directly reflected in the number of inspection findings reported and published by the EMA in 2014. Based on quality risk management principles in clinical trials, the authors defined the quality expectations for the different document types in a TMF and furthermore defined tolerance limits for missing documents. This publication provides guidance on which types of documents and processes are most important and, in consequence, indicates which documents and processes trial team staff should focus on in order to achieve a high-quality TMF. The members of this working group belong to the CQAG Group (Clinical Quality Assurance Germany) and are QA (quality assurance) experts (auditors or compliance functions) with long-term experience in the practical handling of TMFs.

  14. Manufacture and quality control of interconnecting wire harnesses, Volume 1

    NASA Technical Reports Server (NTRS)

    1972-01-01

    A standard is presented for the manufacture, installation, and quality control of eight types of interconnecting wire harnesses. The processes, process controls, and inspection and test requirements reflected are based on acknowledgment of harness design requirements, acknowledgment of harness installation requirements, identification of the various parts, materials, etc., utilized in harness manufacture, and formulation of a typical manufacturing flow diagram identifying each manufacturing and quality control process, operation, inspection, and test. The document covers interconnecting wire harnesses defined in the design standard, including type 1, enclosed in fluorocarbon elastomer convolute tubing; type 2, enclosed in TFE convolute tubing lined with fiberglass braid; type 3, enclosed in TFE convolute tubing; and type 5, a combination of types 3 and 4. Knowledge gained through experience on the Saturn 5 program, coupled with recent advances in techniques, materials, and processes, was incorporated.

  15. Optimization and Characterization of the Friction Stir Welded Sheets of AA 5754-H111: Monitoring of the Quality of Joints with Thermographic Techniques

    PubMed Central

    De Filippis, Luigi Alberto Ciro; Serio, Livia Maria; Galietti, Umberto

    2017-01-01

    Friction Stir Welding (FSW) is a solid-state welding process, based on frictional and stirring phenomena, that offers many advantages over traditional welding methods. However, several parameters can affect the quality of the produced joints. In this work, an experimental approach has been used to study and optimize the FSW process applied to 5754-H111 aluminum plates. In particular, the thermal behavior of the material during the process has been investigated: two thermal indexes, the maximum temperature and the heating rate of the material, both correlated to the frictional power input, were evaluated for different configurations of the process parameters (the tool travel and rotation speeds). Moreover, other techniques (micrographs, macrographs and destructive tensile tests) were carried out to support a quantitative analysis of the quality of the welded joints. The potential of the thermographic technique has been demonstrated both for monitoring the FSW process and for predicting the quality of joints in terms of tensile strength. PMID:29019948
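The two thermal indexes named above can be illustrated with a short sketch: given a weld-zone temperature history, the maximum temperature is the peak of the series, and the heating rate can be taken as the peak time derivative. This is an assumption about how such indexes might be computed, not the authors' exact procedure; the temperature samples are synthetic, whereas real data would come from an IR camera.

```python
# Illustrative sketch: extracting two thermal indexes (maximum temperature and
# peak heating rate) from a synthetic thermographic time series.
import numpy as np

t = np.linspace(0.0, 10.0, 101)                  # time, s
temp = 25.0 + 400.0 * (1.0 - np.exp(-t / 2.0))   # synthetic weld-zone temperature, °C

t_max = temp.max()                               # index 1: maximum temperature, °C
heating_rate = np.gradient(temp, t).max()        # index 2: peak heating rate, °C/s

print(round(t_max, 1), round(heating_rate, 1))
```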

  16. Processed and ultra-processed foods are associated with lower-quality nutrient profiles in children from Colombia.

    PubMed

    Cornwell, Brittany; Villamor, Eduardo; Mora-Plazas, Mercedes; Marin, Constanza; Monteiro, Carlos A; Baylin, Ana

    2018-01-01

    To determine if processed and ultra-processed foods consumed by children in Colombia are associated with lower-quality nutrient profiles than less processed foods. We obtained information on sociodemographic and anthropometric variables and dietary information through dietary records and 24 h recalls from a convenience sample of the Bogotá School Children Cohort. Foods were classified into three categories: (i) unprocessed and minimally processed foods, (ii) processed culinary ingredients and (iii) processed and ultra-processed foods. We also examined the combination of unprocessed foods and processed culinary ingredients. Representative sample of children from low- to middle-income families in Bogotá, Colombia. Children aged 5-12 years in the 2011 Bogotá School Children Cohort. We found that processed and ultra-processed foods are of lower dietary quality in general. Nutrients that were lower in processed and ultra-processed foods following adjustment for total energy intake included n-3 PUFA, vitamins A, B12, C and E, Ca and Zn. Nutrients that were higher in energy-adjusted processed and ultra-processed foods compared with unprocessed foods included Na, sugar and trans-fatty acids, although some healthy nutrients, including folate and Fe, were also higher in processed and ultra-processed foods than in unprocessed and minimally processed foods. Processed and ultra-processed foods generally have unhealthy nutrition profiles. Our findings suggest that categorizing foods by processing characteristics is promising for understanding the influence of food processing on children's dietary quality. More studies accounting for the type and degree of food processing are needed.
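The energy adjustment mentioned above is commonly performed with the residual method: regress each nutrient on total energy intake and keep the residuals (plus the mean) as the energy-adjusted intake. The abstract does not name its exact method, so this sketch is only an illustrative assumption, with invented intake values.

```python
# Hypothetical sketch of residual-method energy adjustment. Intake values are
# invented; the assumption is that nutrient intake is regressed on total
# energy and the residuals (re-centred on the mean) are used.
import numpy as np

energy = np.array([1500.0, 1800.0, 2100.0, 2400.0, 2700.0])  # kcal/day
zinc   = np.array([6.0, 7.1, 8.3, 9.0, 10.2])                # mg/day

# Fit zinc ~ a + b * energy, then take residual + mean as adjusted intake.
b, a = np.polyfit(energy, zinc, 1)
adjusted = zinc - (a + b * energy) + zinc.mean()

print(np.round(adjusted, 2))  # energy-adjusted zinc intakes, mg/day
```

By construction the adjusted values have the same mean as the raw intakes but are uncorrelated with total energy, which is what makes between-group nutrient comparisons meaningful.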

  17. Automated geospatial Web Services composition based on geodata quality requirements

    NASA Astrophysics Data System (ADS)

    Cruz, Sérgio A. B.; Monteiro, Antonio M. V.; Santos, Rafael

    2012-10-01

    Service-Oriented Architecture and Web Services technologies improve the performance of activities involved in geospatial analysis on a distributed computing architecture. However, designing the geospatial analysis process on this platform by combining component Web Services presents some open issues, and the automated construction of these compositions is an important research topic. Some approaches to this problem are based on AI planning methods coupled with semantic service descriptions. This work presents a new approach that uses AI planning methods to improve the robustness of the produced geospatial Web Services composition. For this purpose, we use semantic descriptions of geospatial data quality requirements in rule-based form. These rules allow the semantic annotation of geospatial data and, coupled with a conditional planning method, represent more precisely the situations of nonconformity with geodata quality requirements that may occur during the execution of the Web Services composition. The service compositions produced by this method are more robust, improving process reliability when working with a composition of chained geospatial Web Services.
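The rule-based quality requirements described above can be sketched as simple predicates over a dataset's quality metadata, with a planner-style dispatcher branching to a remedial action when a rule fails. The attribute names and thresholds below are invented for illustration, not the paper's actual rule language.

```python
# Hypothetical sketch: each rule checks one quality attribute of a geospatial
# dataset's metadata; a conditional planner would branch on the violations.
# Attribute names and thresholds are invented for illustration.

rules = {
    "positional_accuracy_m": lambda v: v <= 30.0,   # at least 30 m accuracy
    "cloud_cover_pct":       lambda v: v <= 10.0,   # at most 10 % cloud cover
    "completeness_pct":      lambda v: v >= 95.0,   # at least 95 % coverage
}

def check_quality(metadata):
    """Return the list of quality rules the dataset violates."""
    return [name for name, ok in rules.items()
            if name in metadata and not ok(metadata[name])]

dataset = {"positional_accuracy_m": 25.0, "cloud_cover_pct": 22.5,
           "completeness_pct": 97.0}
print(check_quality(dataset))  # → ['cloud_cover_pct']
```

In a conditional plan, each possible violation list corresponds to a branch (e.g. substitute a cloud-free scene, or invoke a gap-filling service) rather than letting the composition fail at execution time.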

  18. Template-based protein structure modeling using the RaptorX web server.

    PubMed

    Källberg, Morten; Wang, Haipeng; Wang, Sheng; Peng, Jian; Wang, Zhiyong; Lu, Hui; Xu, Jinbo

    2012-07-19

    A key challenge of modern biology is to uncover the functional role of the protein entities that compose cellular proteomes. To this end, the availability of reliable three-dimensional atomic models of proteins is often crucial. This protocol presents a community-wide web-based method using RaptorX (http://raptorx.uchicago.edu/) for protein secondary structure prediction, template-based tertiary structure modeling, alignment quality assessment and sophisticated probabilistic alignment sampling. RaptorX distinguishes itself from other servers by the quality of the alignment between a target sequence and one or multiple distantly related template proteins (especially those with sparse sequence profiles) and by a novel nonlinear scoring function and a probabilistic-consistency algorithm. Consequently, RaptorX delivers high-quality structural models for many targets with only remote templates. At present, it takes RaptorX ~35 min to finish processing a sequence of 200 amino acids. Since its official release in August 2011, RaptorX has processed ~6,000 sequences submitted by ~1,600 users from around the world.

  19. Template-based protein structure modeling using the RaptorX web server

    PubMed Central

    Källberg, Morten; Wang, Haipeng; Wang, Sheng; Peng, Jian; Wang, Zhiyong; Lu, Hui; Xu, Jinbo

    2016-01-01

    A key challenge of modern biology is to uncover the functional role of the protein entities that compose cellular proteomes. To this end, the availability of reliable three-dimensional atomic models of proteins is often crucial. This protocol presents a community-wide web-based method using RaptorX (http://raptorx.uchicago.edu/) for protein secondary structure prediction, template-based tertiary structure modeling, alignment quality assessment and sophisticated probabilistic alignment sampling. RaptorX distinguishes itself from other servers by the quality of the alignment between a target sequence and one or multiple distantly related template proteins (especially those with sparse sequence profiles) and by a novel nonlinear scoring function and a probabilistic-consistency algorithm. Consequently, RaptorX delivers high-quality structural models for many targets with only remote templates. At present, it takes RaptorX ~35 min to finish processing a sequence of 200 amino acids. Since its official release in August 2011, RaptorX has processed ~6,000 sequences submitted by ~1,600 users from around the world. PMID:22814390

  20. Robust Design of Sheet Metal Forming Process Based on Kriging Metamodel

    NASA Astrophysics Data System (ADS)

    Xie, Yanmin

    2011-08-01

    Nowadays, sheet metal forming process design is not a trivial task due to the complex issues to be taken into account (conflicting design goals, forming of complex shapes, and so on). Proper design methods that reduce time and costs therefore have to be developed, mostly based on computer-aided procedures, and optimization methods have been widely applied in sheet metal forming for this purpose. At the same time, variations during manufacturing processes may significantly influence final product quality, rendering nominally optimal solutions non-robust. In this paper, a small design of experiments is conducted to investigate how the stochastic behavior of noise factors affects drawing quality. The finite element software LS_DYNA is used to simulate the complex sheet metal stamping processes. A Kriging metamodel is adopted to map the relation between the input process parameters and part quality. The robust design model for the sheet metal forming process integrates adaptive importance sampling with the Kriging model in order to minimize the impact of the variations and achieve reliable process parameters. In the adaptive sampling, an improved criterion is used to indicate where additional training samples should be added to improve the Kriging model. Nonlinear test functions and a square stamping example (NUMISHEET'93) are employed to verify the proposed method. The final results indicate the feasibility of the proposed method for multi-response robust design.
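The Kriging metamodel idea can be sketched with a Gaussian-process regressor standing in for the surrogate: fit it on a handful of (process parameters, quality response) pairs, then query the cheap surrogate instead of the expensive FE solver. The response function and parameter ranges below are synthetic stand-ins, not LS_DYNA output.

```python
# Minimal sketch of a Kriging (Gaussian-process) metamodel mapping two
# forming-process parameters to a quality response. The response is a
# synthetic stand-in for an FE simulation result.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

rng = np.random.default_rng(0)
# 20 design points over two hypothetical parameters (arbitrary units).
X = rng.uniform([0.5, 10.0], [2.0, 50.0], size=(20, 2))
y = np.sin(X[:, 0]) + 0.02 * X[:, 1]           # synthetic quality response

gp = GaussianProcessRegressor(kernel=RBF(length_scale=[0.5, 10.0]),
                              normalize_y=True).fit(X, y)

# Query the surrogate at a new parameter combination; the predictive standard
# deviation is what adaptive sampling criteria use to pick new training points.
mean, std = gp.predict(np.array([[1.2, 30.0]]), return_std=True)
print(float(mean[0]), float(std[0]))
```

The predictive uncertainty (`std`) is exactly the quantity an adaptive sampling criterion exploits: new FE runs are placed where the surrogate is least certain or where the robustness objective is most sensitive.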

  1. Designing an in-situ ultrasonic nondestructive evaluation system for ultrasonic additive manufacturing

    NASA Astrophysics Data System (ADS)

    Nadimpalli, Venkata K.; Nagy, Peter B.

    2018-04-01

    Ultrasonic Additive Manufacturing (UAM) is a solid-state, layer-by-layer manufacturing process that utilizes vibration-induced plastic deformation to form a metallurgical bond between a thin layer and an existing base structure. Due to the vibration-based bonding mechanism, the quality of components at each layer depends on the geometry of the structure. In-situ monitoring during and between UAM manufacturing steps offers the potential for closed-loop control to optimize process parameters and to repair existing defects. The interface most prone to delamination is the base/build interface, and UAM component height and quality are often limited by failure there. Low manufacturing temperatures and the favorable orientation of typical interface defects in UAM make ultrasonic NDE an attractive candidate for online monitoring. Two approaches for in-situ NDE are discussed, and the design of the monitoring system is optimized so that the quality of UAM components is not affected by the addition of the NDE setup. Preliminary results from in-situ ultrasonic NDE indicate its potential for online qualification, closed-loop control and offline certification of UAM components.

  2. Plasma spectroscopy analysis technique based on optimization algorithms and spectral synthesis for arc-welding quality assurance.

    PubMed

    Mirapeix, J; Cobo, A; González, D A; López-Higuera, J M

    2007-02-19

    A new plasma spectroscopy analysis technique based on the generation of synthetic spectra by means of optimization processes is presented in this paper. The technique has been developed for its application in arc-welding quality assurance. The new approach has been checked through several experimental tests, yielding results in reasonably good agreement with the ones offered by the traditional spectroscopic analysis technique.
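The core of the technique, generating a synthetic spectrum and adjusting its parameters by optimization until it matches the measurement, can be sketched as a least-squares fit. The Gaussian line shape, wavelengths, and parameter values below are assumptions for illustration, not the authors' plasma model.

```python
# Illustrative sketch: fit a synthetic emission-line profile (assumed Gaussian
# here) to a measured spectrum by least squares. All numbers are invented.
import numpy as np
from scipy.optimize import least_squares

def synthetic(params, wl):
    amp, centre, width = params
    return amp * np.exp(-0.5 * ((wl - centre) / width) ** 2)

wl = np.linspace(500.0, 520.0, 200)              # wavelength grid, nm
measured = synthetic([1.0, 510.0, 1.5], wl)      # noise-free stand-in "measurement"

# Optimize the synthetic-spectrum parameters to match the measurement.
fit = least_squares(lambda p: synthetic(p, wl) - measured,
                    x0=[0.5, 508.0, 1.0])
print(np.round(fit.x, 3))  # recovered amplitude, centre, width
```

In the actual technique the synthetic spectrum would encode plasma parameters (e.g. temperature through line-intensity ratios), so the converged fit parameters characterize the welding arc.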

  3. Rice industrial processing worldwide and impact on macro- and micronutrient content, stability, and retention

    USDA-ARS?s Scientific Manuscript database

    Various processing methods are used in the food industry worldwide to produce numerous rice products with desirable sensory qualities based on cultural and cooking preferences and nutritional considerations. The processes result in variable degrees of macro- and micronutrient content, stability, and...

  4. A method for evaluating treatment quality using in vivo EPID dosimetry and statistical process control in radiation therapy.

    PubMed

    Fuangrod, Todsaporn; Greer, Peter B; Simpson, John; Zwan, Benjamin J; Middleton, Richard H

    2017-03-13

    Purpose: Due to increasing complexity, modern radiotherapy techniques require comprehensive quality assurance (QA) programmes, which to date generally focus on the pre-treatment stage. The purpose of this paper is to provide a method for evaluating the QA of individual patient treatments and for identifying a "quality gap" for continuous quality improvement. Design/methodology/approach: Statistical process control (SPC) was applied to evaluate treatment delivery using in vivo electronic portal imaging device (EPID) dosimetry. A moving-range control chart was constructed to monitor individual patient treatment performance based on a control limit generated from initial data of 90 intensity-modulated radiotherapy (IMRT) and ten volumetric-modulated arc therapy (VMAT) patient deliveries. A process capability index was used to evaluate continuing treatment quality within three quality classes: treatment type-specific, treatment linac-specific, and body site-specific. Findings: The determined control limits were 62.5 and 70.0 per cent of the χ pass-rate for IMRT and VMAT deliveries, respectively. In total, 14 patients were selected for a pilot study, the results of which showed that about 1 per cent of all treatments contained errors relating to unexpected anatomical changes between treatment fractions. Both rectum and pelvis cancer treatments demonstrated process capability indices less than 1, indicating the potential for quality improvement, and hence may benefit from further assessment. Research limitations/implications: The study relied on the application of in vivo EPID dosimetry for patients treated at a specific centre, and the sample used for generating the control limits was limited to 100 patients. Whilst the quantitative results are specific to the clinical techniques and equipment used, the described method is generally applicable to IMRT and VMAT treatment QA. Whilst more work is required to determine the level of clinical significance, the authors have demonstrated the capability of the method for both treatment-specific QA and continuing quality improvement. Practical implications: The proposed method is a valuable tool for assessing the accuracy of treatment delivery whilst also improving treatment quality and patient safety. Originality/value: Assessing in vivo EPID dosimetry with SPC can be used to improve the quality of radiation treatment for cancer patients.
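The moving-range control-chart arithmetic used in SPC can be sketched as follows. The pass-rate values are invented, and the 2.66, 3.267, and 1.128 factors are the standard individuals/moving-range (I-MR) chart constants for n = 2 moving ranges, not values from the paper.

```python
# Generic sketch of I-MR control-chart arithmetic applied to per-fraction
# EPID pass-rates. Data and the lower specification limit are invented.
import numpy as np

pass_rates = np.array([93.1, 95.4, 92.8, 96.0, 94.5, 93.9, 95.1, 94.2])  # % χ pass-rate

mr = np.abs(np.diff(pass_rates))            # moving ranges between fractions
mr_bar = mr.mean()
x_bar = pass_rates.mean()

lcl = x_bar - 2.66 * mr_bar                 # lower control limit, individuals chart
ucl_mr = 3.267 * mr_bar                     # upper control limit, MR chart

# One-sided capability index against a lower specification limit
# (e.g. a hypothetical clinical action level); sigma = MR-bar / d2, d2 = 1.128.
lsl = 85.0
cpl = (x_bar - lsl) / (3 * mr_bar / 1.128)

print(round(lcl, 2), round(ucl_mr, 2), round(cpl, 2))
```

A fraction whose pass-rate falls below `lcl` signals a special cause (such as the anatomical changes noted above), and a capability index below 1 flags a treatment class worth further assessment.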

  5. European Society of Gynaecologic Oncology Quality Indicators for Advanced Ovarian Cancer Surgery.

    PubMed

    Querleu, Denis; Planchamp, François; Chiva, Luis; Fotopoulou, Christina; Barton, Desmond; Cibula, David; Aletti, Giovanni; Carinelli, Silvestro; Creutzberg, Carien; Davidson, Ben; Harter, Philip; Lundvall, Lene; Marth, Christian; Morice, Philippe; Rafii, Arash; Ray-Coquard, Isabelle; Rockall, Andrea; Sessa, Cristiana; van der Zee, Ate; Vergote, Ignace; du Bois, Andreas

    2016-09-01

    The surgical management of advanced ovarian cancer involves complex surgery. Implementation of a quality management program has a major impact on survival. The goal of this work was to develop a list of quality indicators (QIs) for advanced ovarian cancer surgery that can be used to audit and improve the clinical practice. This task has been carried out under the auspices of the European Society of Gynaecologic Oncology (ESGO). Quality indicators were based on scientific evidence and/or expert consensus. A 4-step evaluation process included a systematic literature search for the identification of potential QIs and the documentation of scientific evidence, physical meetings of an ad hoc multidisciplinary International Development Group, an internal validation of the targets and scoring system, and an external review process involving physicians and patients. Ten structural, process, or outcome indicators were selected. Quality indicators 1 to 3 are related to achievement of complete cytoreduction, caseload in the center, training, and experience of the surgeon. Quality indicators 4 to 6 are related to the overall management, including active participation to clinical research, decision-making process within a structured multidisciplinary team, and preoperative workup. Quality indicator 7 addresses the high value of adequate perioperative management. Quality indicators 8 to 10 highlight the need of recording pertinent information relevant to improvement of quality. An ESGO-approved template for the operative report has been designed. Quality indicators were described using a structured format specifying what the indicator is measuring, measurability specifications, and targets. Each QI was associated with a score, and an assessment form was built. The ESGO quality criteria can be used for self-assessment, for institutional or governmental quality assurance programs, and for the certification of centers. 
Quality indicators and corresponding targets give practitioners and health administrators a quantitative basis for improving care and organizational processes in the surgical management of advanced ovarian cancer.

  6. An empirical evaluation of software quality assurance practices and challenges in a developing country: a comparison of Nigeria and Turkey.

    PubMed

    Sowunmi, Olaperi Yeside; Misra, Sanjay; Fernandez-Sanz, Luis; Crawford, Broderick; Soto, Ricardo

    2016-01-01

    The importance of quality assurance in the software development process cannot be overemphasized, because its adoption results in high reliability and easy maintenance of the software system and other software products. Software quality assurance includes different activities such as quality control, quality management, quality standards, quality planning, process standardization and improvement, amongst others. The aim of this work is to further investigate the software quality assurance practices of practitioners in Nigeria. While our previous work covered quality planning, adherence to standardized processes and the inherent challenges, this work has been extended to include quality control, software process improvement and membership of international quality standards organizations. It also makes a comparison based on a similar study carried out in Turkey. The goal is to generate more robust findings that can properly support decision making by the software community. A qualitative research approach, specifically the use of questionnaire instruments, was applied to acquire data from software practitioners. In addition to the previous results, it was observed that quality assurance practices are quite neglected, and this can be the cause of low patronage. Moreover, software practitioners are aware of neither international standards organizations nor the required process improvement techniques; as such, their claimed standards are not aligned with those of accredited bodies and are limited to their local experience and knowledge, which makes them questionable. The comparison with Turkey also yielded similar findings, making the results typical of developing countries. The research instrument used was tested for internal consistency using Cronbach's alpha and proved reliable. 
For the software industry in developing countries to grow strong and become a viable source of external revenue, software quality assurance practices have to be taken seriously, because their effect is evident in the final product. Moreover, quality frameworks and tools that require minimal time and cost are sorely needed in these countries.

  7. A survey-based benchmarking approach for health care using the Baldrige quality criteria.

    PubMed

    Jennings, K; Westfall, F

    1994-09-01

    Since 1988, manufacturing and service industries have been using the Malcolm Baldrige National Quality Award to assess their management processes (for example, leadership, information, and analysis) against critical performance criteria. Recognizing that the typical Baldrige assessment is time intensive and dependent on intensive training, The Pacer Group, a consulting firm in Dayton, Ohio, developed a self-assessment tool based on the Baldrige criteria which provides a snapshot assessment of an organization's management practices. The survey was administered at 25 hospitals within a health care system. Hospitals were able to compare their scores with other hospitals in the system, as well as the scores of a Baldrige award winner. Results were also analyzed on a systemwide basis to identify strengths and weaknesses across the system. For all 25 hospitals, the following areas were identified as strengths: management of process quality, leadership, and customer focus and satisfaction. Weaknesses included lack of employee involvement in the quality planning process, poor design of quality systems, and lack of cross-departmental cooperation. One of the surveyed hospitals launched improvement initiatives in knowledge of improvement tools and methods and in a patient satisfaction focus. A team was formed to improve the human resource management system. Also, a new unit was designed using patient-centered care principles. A team re-evaluated every operation that affected patients on the unit. A survey modeled after the Baldrige Award criteria can be useful in benchmarking an organization's quality improvement practices.

  8. Design of video processing and testing system based on DSP and FPGA

    NASA Astrophysics Data System (ADS)

    Xu, Hong; Lv, Jun; Chen, Xi'ai; Gong, Xuexia; Yang, Chen'na

    2007-12-01

    Based on a high-speed Digital Signal Processor (DSP) and a Field Programmable Gate Array (FPGA), a compact, low-power video capture, processing and display system is presented. In this system, a triple-buffering scheme is used for capture and display, so that the application can always obtain a new buffer without waiting. The DSP provides image-processing capability and is used to detect the boundary of the workpiece's image. A video graduation technique is used to aim at the position to be tested, which also enhances the system's flexibility. Character superposition, implemented on the DSP, displays the test result on the screen in character format. This system can process image information in real time, ensure test precision, and help to enhance product quality and quality management.
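The triple-buffering scheme mentioned above can be sketched as three rotating buffer slots: the capture side always has a free back buffer to fill, and the display side always takes the newest completed frame without blocking. This is a generic illustration in Python, not the DSP/FPGA implementation.

```python
# Illustrative triple-buffering sketch: writer and reader never wait on each
# other; only the slot-index swap is locked, never the frame copy itself.
import threading

class TripleBuffer:
    def __init__(self):
        self._buffers = [None, None, None]     # three frame slots
        self._back, self._middle, self._front = 0, 1, 2
        self._lock = threading.Lock()
        self._fresh = False                    # middle holds an unread frame?

    def write(self, frame):
        """Capture side: fill the back buffer, then swap it into the middle."""
        self._buffers[self._back] = frame      # done outside the lock
        with self._lock:
            self._back, self._middle = self._middle, self._back
            self._fresh = True

    def read(self):
        """Display side: take the newest completed frame, never blocking."""
        with self._lock:
            if self._fresh:
                self._front, self._middle = self._middle, self._front
                self._fresh = False
        return self._buffers[self._front]

buf = TripleBuffer()
buf.write("frame-1")
buf.write("frame-2")                           # overwrites before any read
print(buf.read())  # → frame-2 (the newest frame; frame-1 was dropped)
```

If the display is slower than the capture, stale frames are silently dropped rather than queued, which is exactly the behavior wanted for live video.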

  9. Numerical simulation of polishing U-tube based on solid-liquid two-phase

    NASA Astrophysics Data System (ADS)

    Li, Jun-ye; Meng, Wen-qing; Wu, Gui-ling; Hu, Jing-lei; Wang, Bao-zuo

    2018-03-01

    As an advanced technology for the ultra-precision machining of small-hole parts and complex cavity parts, abrasive grain flow processing offers high efficiency, high quality and low cost, and therefore plays an important role in many areas of precision machining. Based on the theory of solid-liquid two-phase flow coupling, a solid-liquid two-phase MIXTURE model is used to simulate the abrasive flow polishing process on the inner surface of a U-tube, and the temperature, turbulent viscosity and turbulent dissipation rate during abrasive flow machining of the U-tube are compared and analyzed under different inlet pressures. The influence of different inlet pressures on the surface quality of the workpiece during abrasive flow machining is studied and discussed, providing a theoretical basis for research on the abrasive flow machining process.

  10. The report of Task Group 100 of the AAPM: Application of risk analysis methods to radiation therapy quality management

    PubMed Central

    Huq, M. Saiful; Fraass, Benedick A.; Dunscombe, Peter B.; Gibbons, John P.; Mundt, Arno J.; Mutic, Sasa; Palta, Jatinder R.; Rath, Frank; Thomadsen, Bruce R.; Williamson, Jeffrey F.; Yorke, Ellen D.

    2016-01-01

    The increasing complexity of modern radiation therapy planning and delivery challenges traditional prescriptive quality management (QM) methods, such as many of those included in guidelines published by organizations such as the AAPM, ASTRO, ACR, ESTRO, and IAEA. These prescriptive guidelines have traditionally focused on monitoring all aspects of the functional performance of radiotherapy (RT) equipment by comparing parameters against tolerances set at strict but achievable values. Many errors that occur in radiation oncology are not due to failures in devices and software; rather they are failures in workflow and process. A systematic understanding of the likelihood and clinical impact of possible failures throughout a course of radiotherapy is needed to direct limited QM resources efficiently to produce maximum safety and quality of patient care. Task Group 100 of the AAPM has taken a broad view of these issues and has developed a framework for designing QM activities, based on estimates of the probability of identified failures and their clinical outcome through the RT planning and delivery process. The Task Group has chosen a specific radiotherapy process required for “intensity modulated radiation therapy (IMRT)” as a case study. The goal of this work is to apply modern risk-based analysis techniques to this complex RT process in order to demonstrate to the RT community that such techniques may help identify more effective and efficient ways to enhance the safety and quality of our treatment processes. The task group generated by consensus an example quality management program strategy for the IMRT process performed at the institution of one of the authors. This report describes the methodology and nomenclature developed, presents the process maps, FMEAs, fault trees, and QM programs developed, and makes suggestions on how this information could be used in the clinic.
The development and implementation of risk-assessment techniques will make radiation therapy safer and more efficient. PMID:27370140

  11. The report of Task Group 100 of the AAPM: Application of risk analysis methods to radiation therapy quality management

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Huq, M. Saiful, E-mail: HUQS@UPMC.EDU

    The increasing complexity of modern radiation therapy planning and delivery challenges traditional prescriptive quality management (QM) methods, such as many of those included in guidelines published by organizations such as the AAPM, ASTRO, ACR, ESTRO, and IAEA. These prescriptive guidelines have traditionally focused on monitoring all aspects of the functional performance of radiotherapy (RT) equipment by comparing parameters against tolerances set at strict but achievable values. Many errors that occur in radiation oncology are not due to failures in devices and software; rather they are failures in workflow and process. A systematic understanding of the likelihood and clinical impact of possible failures throughout a course of radiotherapy is needed to direct limited QM resources efficiently to produce maximum safety and quality of patient care. Task Group 100 of the AAPM has taken a broad view of these issues and has developed a framework for designing QM activities, based on estimates of the probability of identified failures and their clinical outcome through the RT planning and delivery process. The Task Group has chosen a specific radiotherapy process required for “intensity modulated radiation therapy (IMRT)” as a case study. The goal of this work is to apply modern risk-based analysis techniques to this complex RT process in order to demonstrate to the RT community that such techniques may help identify more effective and efficient ways to enhance the safety and quality of our treatment processes. The task group generated by consensus an example quality management program strategy for the IMRT process performed at the institution of one of the authors. This report describes the methodology and nomenclature developed, presents the process maps, FMEAs, fault trees, and QM programs developed, and makes suggestions on how this information could be used in the clinic. 
The development and implementation of risk-assessment techniques will make radiation therapy safer and more efficient.

  12. The report of Task Group 100 of the AAPM: Application of risk analysis methods to radiation therapy quality management.

    PubMed

    Huq, M Saiful; Fraass, Benedick A; Dunscombe, Peter B; Gibbons, John P; Ibbott, Geoffrey S; Mundt, Arno J; Mutic, Sasa; Palta, Jatinder R; Rath, Frank; Thomadsen, Bruce R; Williamson, Jeffrey F; Yorke, Ellen D

    2016-07-01

    The increasing complexity of modern radiation therapy planning and delivery challenges traditional prescriptive quality management (QM) methods, such as many of those included in guidelines published by organizations such as the AAPM, ASTRO, ACR, ESTRO, and IAEA. These prescriptive guidelines have traditionally focused on monitoring all aspects of the functional performance of radiotherapy (RT) equipment by comparing parameters against tolerances set at strict but achievable values. Many errors that occur in radiation oncology are not due to failures in devices and software; rather they are failures in workflow and process. A systematic understanding of the likelihood and clinical impact of possible failures throughout a course of radiotherapy is needed to direct limited QM resources efficiently to produce maximum safety and quality of patient care. Task Group 100 of the AAPM has taken a broad view of these issues and has developed a framework for designing QM activities, based on estimates of the probability of identified failures and their clinical outcome through the RT planning and delivery process. The Task Group has chosen a specific radiotherapy process required for "intensity modulated radiation therapy (IMRT)" as a case study. The goal of this work is to apply modern risk-based analysis techniques to this complex RT process in order to demonstrate to the RT community that such techniques may help identify more effective and efficient ways to enhance the safety and quality of our treatment processes. The task group generated by consensus an example quality management program strategy for the IMRT process performed at the institution of one of the authors. This report describes the methodology and nomenclature developed, presents the process maps, FMEAs, fault trees, and QM programs developed, and makes suggestions on how this information could be used in the clinic. 
The development and implementation of risk-assessment techniques will make radiation therapy safer and more efficient.
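    The FMEA technique mentioned above ranks failure modes by a Risk Priority Number, conventionally the product of occurrence, severity, and detectability scores. The following sketch illustrates the idea; the failure modes, scores, and class names are hypothetical examples, not taken from the TG-100 report:

    ```python
    # Illustrative FMEA sketch: rank failure modes by Risk Priority Number,
    # RPN = occurrence * severity * detectability (each scored 1..10).
    from dataclasses import dataclass

    @dataclass
    class FailureMode:
        step: str
        occurrence: int    # 1 (rare) .. 10 (frequent)
        severity: int      # 1 (negligible) .. 10 (catastrophic)
        detectability: int # 1 (always caught) .. 10 (never caught)

        @property
        def rpn(self) -> int:
            return self.occurrence * self.severity * self.detectability

    # Hypothetical failure modes in an IMRT workflow.
    modes = [
        FailureMode("wrong CT dataset imported", 2, 9, 6),
        FailureMode("MLC leaf position miscalibration", 4, 7, 3),
        FailureMode("plan approved without physics check", 3, 8, 5),
    ]

    # Direct QM effort toward the highest-risk failure modes first.
    for m in sorted(modes, key=lambda m: m.rpn, reverse=True):
        print(f"{m.step}: RPN={m.rpn}")
    ```

    Sorting by RPN is what lets a clinic concentrate its limited QM resources on the steps where failures are most likely, most harmful, and hardest to catch.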

  13. Basecalling with LifeTrace

    PubMed Central

    Walther, Dirk; Bartha, Gábor; Morris, Macdonald

    2001-01-01

    A pivotal step in electrophoresis sequencing is the conversion of the raw, continuous chromatogram data into the actual sequence of discrete nucleotides, a process referred to as basecalling. We describe a novel algorithm for basecalling implemented in the program LifeTrace. Like Phred, currently the most widely used basecalling software program, LifeTrace takes processed trace data as input. It was designed to be tolerant to variable peak spacing by means of an improved peak-detection algorithm that emphasizes local chromatogram information over global properties. LifeTrace is shown to generate high-quality basecalls and reliable quality scores. It proved particularly effective when applied to MegaBACE capillary sequencing machines. In a benchmark test of 8372 dye-primer MegaBACE chromatograms, LifeTrace generated 17% fewer substitution errors, 16% fewer insertion/deletion errors, and 2.4% more aligned bases to the finished sequence than did Phred. For two sets totaling 6624 dye-terminator chromatograms, the performance improvement was 15% fewer substitution errors, 10% fewer insertion/deletion errors, and 2.1% more aligned bases. The processing time required by LifeTrace is comparable to that of Phred. The predicted quality scores were in line with observed quality scores, permitting direct use for quality clipping and in silico single nucleotide polymorphism (SNP) detection. Furthermore, we introduce a new type of quality score associated with every basecall: the gap-quality. It estimates the probability of a deletion error between the current and the following basecall. This additional quality score improves detection of single basepair deletions when used for locating potential basecalling errors during the alignment. We also describe a new protocol for benchmarking that we believe better discerns basecaller performance differences than methods previously published. PMID:11337481
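    The per-base and gap-quality scores described above follow the standard Phred-scale convention, in which a quality score encodes the estimated probability that the call is wrong. A minimal sketch of that convention (the function names are ours, not LifeTrace's):

    ```python
    # Phred-scale quality convention: Q = -10 * log10(p_error).
    import math

    def phred_quality(p_error: float) -> float:
        """Convert an error probability to a Phred-scale quality score."""
        return -10.0 * math.log10(p_error)

    def error_probability(q: float) -> float:
        """Invert a Phred quality score back to an error probability."""
        return 10.0 ** (-q / 10.0)

    # Q20 corresponds to a 1-in-100 chance the basecall is wrong,
    # Q30 to a 1-in-1000 chance.
    print(round(phred_quality(0.01)))  # 20
    print(error_probability(30.0))     # 0.001
    ```

    Because the scale is probabilistic, scores from different basecallers can be compared directly, which is what makes the quality clipping and in silico SNP detection mentioned in the abstract possible.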

  14. Steps in Moving Evidence-Based Health Informatics from Theory to Practice.

    PubMed

    Rigby, Michael; Magrabi, Farah; Scott, Philip; Doupi, Persephone; Hypponen, Hannele; Ammenwerth, Elske

    2016-10-01

    To demonstrate and promote the importance of applying a scientific process to health IT design and implementation, and of basing this on research principles and techniques. A review by international experts linked to the IMIA Working Group on Technology Assessment and Quality Development. Four approaches are presented, linking to the creation of national professional expectations, adherence to research-based standards, quality assurance approaches to ensure safety, and scientific measurement of impact. Solely marketing- and aspiration-based approaches to health informatics applications are no longer ethical or acceptable when scientifically grounded evidence-based approaches are available and in use.

  15. Quality control in the recycling stream of PVC cable waste by hyperspectral imaging analysis

    NASA Astrophysics Data System (ADS)

    Luciani, Valentina; Serranti, Silvia; Bonifazi, Giuseppe; Rem, Peter

    2005-05-01

    In recent years recycling has gained a key role in the manufacturing industry. The use of recycled materials in the production of new goods has the double advantage of saving energy and natural resources; moreover, from an economic point of view, recycled materials are in general cheaper than virgin ones. Despite these environmental and economic strengths, the use of recycled sources is still low compared to raw-material consumption; indeed, in Europe only 10% of the market is covered by recycled products. One of the reasons for this reluctance to use secondary sources is the lack of an accurate quality certification system. The inputs of a recycling process are not always the same, which means that the output of a particular process can also vary depending on the initial composition of the treated material. Usually, if a continuous quality control system is not present at the end of the process, the quality of the output material is assessed on the minimum certified characteristics. Solving this issue is crucial to expand the possible applications of recycled materials and to assign a price based on the real characteristics of the material. The possibility of applying a quality control system based on hyperspectral imaging (HSI) technology working in the near infrared (NIR) range to the output of a separation process for PVC cable wastes is explored in this paper. The analysed material was a residue fraction of a traditional separation process further treated by magnetic density separation. Results show that PVC, PE, rubber and copper particles can be identified and classified adopting the NIR-HSI approach.
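    One common way to classify pixel spectra against reference material signatures in hyperspectral sorting is the spectral angle, which compares spectral shape independently of brightness. A minimal sketch, with entirely made-up reflectance values standing in for real NIR signatures:

    ```python
    # Sketch of spectral-angle classification of NIR pixel spectra.
    # Reference signatures below are hypothetical, for illustration only.
    import math

    def spectral_angle(a, b):
        """Angle (radians) between two spectra; smaller = more similar."""
        dot = sum(x * y for x, y in zip(a, b))
        na = math.sqrt(sum(x * x for x in a))
        nb = math.sqrt(sum(y * y for y in b))
        return math.acos(max(-1.0, min(1.0, dot / (na * nb))))

    # Hypothetical reflectance at four NIR bands per material class.
    references = {
        "PVC":    [0.42, 0.35, 0.30, 0.28],
        "PE":     [0.55, 0.52, 0.48, 0.50],
        "rubber": [0.10, 0.09, 0.08, 0.08],
    }

    def classify(pixel):
        """Assign a pixel spectrum to the nearest reference material."""
        return min(references, key=lambda m: spectral_angle(pixel, references[m]))

    print(classify([0.41, 0.36, 0.29, 0.27]))  # PVC
    ```

    In a real sorting line this classification would run per pixel over the full hyperspectral cube, typically after spectral preprocessing and with calibrated reference libraries.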

  16. Digital Light Processing update: status and future applications

    NASA Astrophysics Data System (ADS)

    Hornbeck, Larry J.

    1999-05-01

    Digital Light Processing (DLP) projection displays based on the Digital Micromirror Device (DMD) were introduced to the market in 1996. Less than 3 years later, DLP-based projectors are found in such diverse applications as mobile, conference room, video wall, home theater, and large-venue. They provide high-quality, seamless, all-digital images that have exceptional stability as well as freedom from both flicker and image lag. Marked improvements have been made in the image quality of DLP-based projection displays, including brightness, resolution, contrast ratio, and border image. DLP-based mobile projectors that weighed about 27 pounds in 1996 now weigh only about 7 pounds. This weight reduction has been responsible for the definition of an entirely new projector class, the ultraportable. New applications are being developed for this important new projection display technology; these include digital photofinishing for high process speed minilab and maxilab applications and DLP Cinema for the digital delivery of films to audiences around the world. This paper describes the status of DLP-based projection display technology, including its manufacturing, performance improvements, and new applications, with emphasis on DLP Cinema.

  17. Research on the processing technology of elongated holes based on rotary ultrasonic drilling

    NASA Astrophysics Data System (ADS)

    Tong, Yi; Chen, Jianhua; Sun, Lipeng; Yu, Xin; Wang, Xin

    2014-08-01

    Optical glass is hard, brittle, and difficult to process. Based on the method of rotary ultrasonic drilling, a single-factor study of drilling elongated holes in optical glass was carried out. The processing equipment was a DAMA ultrasonic machine, and the machining tools were electroplated diamond tools. Through detection and analysis of the machining quality and surface roughness, the process parameters of rotary ultrasonic drilling (spindle speed, amplitude, and feed rate) were investigated, and the influence of the processing parameters on surface roughness was obtained, providing a reference and basis for actual machining.

  18. Total Quality Management: Implications for Educational Assessment.

    ERIC Educational Resources Information Center

    Rankin, Stuart C.

    1992-01-01

    Deming's "System of Profound Knowledge" is even more fundamental than his 14-principle system transformation guide and is based on 4 elements: systems theory, statistical variation, a theory of knowledge, and psychology. Management should revamp total system processes so that quality of product is continually improved. Implications for…

  19. How Do Deaf Adults Define Quality of Life?

    ERIC Educational Resources Information Center

    McAbee, Emilee R.; Drasgow, Erik; Lowrey, K. Alisa

    2017-01-01

    Six deaf adults defined quality of life (QOL) in personal interviews. Questions were based on an eight-domain QOL framework: physical well-being, emotional well-being, interpersonal relations, social inclusion, personal development, material well-being, self-determination, and rights (Schalock & Alonso, 2002). The interview process had three…

  20. Microfilm Permanence and Archival Quality

    ERIC Educational Resources Information Center

    Avedon, Don M.

    1972-01-01

    The facts about microfilm permanence and archival quality are presented in simple terms. The major factors, including the film base material, the film emulsion, processing, and storage conditions are reviewed. The designations on the edge of the film are explained and a list of references provided. (14 references) (Author)
