Sample records for chemico process

  1. Predicting skin sensitisation using a decision tree integrated testing strategy with an in silico model and in chemico/in vitro assays.

    PubMed

    Macmillan, Donna S; Canipa, Steven J; Chilton, Martyn L; Williams, Richard V; Barber, Christopher G

    2016-04-01

    There is a pressing need for non-animal methods to predict skin sensitisation potential and a number of in chemico and in vitro assays have been designed with this in mind. However, some compounds can fall outside the applicability domain of these in chemico/in vitro assays and may not be predicted accurately. Rule-based in silico models such as Derek Nexus are expert-derived from animal and/or human data and the mechanism-based alert domain can take a number of factors into account (e.g. abiotic/biotic activation). Therefore, Derek Nexus may be able to predict for compounds outside the applicability domain of in chemico/in vitro assays. To this end, an integrated testing strategy (ITS) decision tree using Derek Nexus and a maximum of two assays (from DPRA, KeratinoSens, LuSens, h-CLAT and U-SENS) was developed. Generally, the decision tree improved upon other ITS evaluated in this study with positive and negative predictivity calculated as 86% and 81%, respectively. Our results demonstrate that an ITS using an in silico model such as Derek Nexus with a maximum of two in chemico/in vitro assays can predict the sensitising potential of a number of chemicals, including those outside the applicability domain of existing non-animal assays. Copyright © 2016 Elsevier Inc. All rights reserved.
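
    The positive and negative predictivity values quoted above are standard confusion-matrix ratios. As a minimal illustration only (hypothetical counts, not data from the paper), the Python sketch below shows how such figures are computed from the numbers of true/false positive and negative ITS calls:

    ```python
    # Minimal sketch: positive/negative predictivity from a 2x2 confusion matrix.
    # The counts below are hypothetical and chosen only to reproduce ~86%/81%.
    def predictivity(tp: int, fp: int, tn: int, fn: int) -> tuple[float, float]:
        ppv = tp / (tp + fp)  # positive predictivity: correct positive calls / all positive calls
        npv = tn / (tn + fn)  # negative predictivity: correct negative calls / all negative calls
        return ppv, npv

    ppv, npv = predictivity(tp=86, fp=14, tn=81, fn=19)
    print(f"positive predictivity = {ppv:.0%}, negative predictivity = {npv:.0%}")
    ```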

  2. Chemical stability and in chemico reactivity of 24 fragrance ingredients of concern for skin sensitization risk assessment.

    PubMed

    Avonto, Cristina; Wang, Mei; Chittiboyina, Amar G; Vukmanovic, Stanislav; Khan, Ikhlas A

    2018-02-01

    Twenty-four pure fragrance ingredients have been identified as being of potential concern for skin sensitization. Several of these compounds are chemically unstable and convert into reactive species upon exposure to air or light. In the present work, a systematic investigation of the correlation between chemical stability and reactivity has been undertaken. The compounds were subjected to forced photodegradation for three months and the chemical changes were studied with GC-MS. At the end of the stability study, two-thirds of the samples were found to be unstable. The generation of chemically reactive species was investigated using the in chemico HTS-DCYA assay. Eleven and fourteen compounds were chemically reactive before and after three months, respectively. A significant increase in reactivity upon degradation was found for isoeugenol, linalool, limonene, lyral, citronellol and geraniol; under the same conditions, the reactivity of hydroxycitronellal decreased. The non-reactive compounds α-isomethyl ionone, benzyl alcohol, amyl cinnamal and farnesol became reactive after photo-oxidative degradation. Overall, forced degradation caused four non-reactive fragrance compounds to display in chemico thiol reactivity, while ten out of 24 compounds remained inactive. Chemical degradation does not necessarily entail the generation of reactive species. Non-chemical activation may be involved for the 10 stable unreactive compounds. Copyright © 2017 Elsevier Ltd. All rights reserved.

  3. Identification of a compound isolated from German chamomile (Matricaria chamomilla) with dermal sensitization potential

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Avonto, Cristina

    German chamomile is one of the most popular herbal ingredients used in cosmetics and personal care products. Allergic skin reactions following topical application of German chamomile have been occasionally reported, although it is not fully understood which of the chemical constituents is responsible for this adverse effect. In the present work, three candidate sensitizers were isolated from German chamomile based on activity-guided fractionation of chamomile extracts tested using the in vitro KeratinoSens™ assay. The compounds were identified as the polyacetylene tonghaosu (1), and both trans- and cis-glucomethoxycinnamic acids (2 and 3). These three compounds were classified as non- to weakly reactive using in chemico methods; however, aged tonghaosu was found to be more reactive when compared to freshly isolated tonghaosu. The polyacetylene (1) constituent was determined to be chemically unstable, generating a small electrophilic spirolactone, 1,6-dioxaspiro[4.4]non-3-en-2-one (4), upon aging. This small lactone (4) was strongly reactive in both in chemico HTS- and NMR-DCYA methods and further confirmed as a potential skin sensitizer by Local Lymph Node Assay (LLNA). - Highlights: • Fractions of German chamomile tested positive in the KeratinoSens™ assay. • Three compounds containing structural alerts were isolated and tested with in chemico methods. • The polyacetylene tonghaosu was found to be unstable and categorized as potential pre-hapten. • A degradation product of tonghaosu tested as positive dermal sensitizer in animal studies.

  4. Waste treatment integration in space

    NASA Technical Reports Server (NTRS)

    Baresi, L.; Kern, R.

    1991-01-01

    The circumstances and criteria for waste treatment in space-based bioregenerative life-support systems differ in many ways from those needed in terrestrial applications. In fact, the term "waste" may not even be appropriate in the context of nearly closed, cycling ecosystems such as those under consideration. Because of these constraints, there is a need for innovative approaches to the problem of "materials recycling". Hybrid physico-chemico-biological systems offer advantages over strictly physico-chemical or strictly biological approaches and would be beneficial to materials recycling. To effectively emulate terrestrial cycling, the use of various microbial consortia ("assemblies of interdependent microbes") should be seriously considered for the biological components of such systems. This paper will examine the use of consortia in the context of a hybrid system for materials recycling in space.

  5. Antonella Amore | NREL

    Science.gov Websites

    Pleurotus ostreatus," Biotechnol. Appl. Biochem. (2014) "Enzymes for Food and Beverage Industries : Current Situation, Challenges and Perspectives" in Advances in Food Biotechnology (2013) " : Enzymatic and Microbial Tools for Bioethanol Production (2013) "Chemico-physical factors affecting food

  6. Machine Learning Approaches for Predicting Human Skin Sensitization Hazard

    EPA Science Inventory

    One of ICCVAM’s top priorities is the development and evaluation of non-animal approaches to identify potential skin sensitizers. The complexity of biological events necessary for a substance to elicit a skin sensitization reaction suggests that no single in chemico, in vit...

  7. A fluorescence high throughput screening method for the detection of reactive electrophiles as potential skin sensitizers.

    PubMed

    Avonto, Cristina; Chittiboyina, Amar G; Rua, Diego; Khan, Ikhlas A

    2015-12-01

    Skin sensitization is an important toxicological end-point in the risk assessment of chemical allergens. Because of the complexity of the biological mechanisms associated with skin sensitization, integrated approaches combining different chemical, biological and in silico methods are recommended to replace conventional animal tests. Chemical methods are intended to characterize the potential of a sensitizer to induce earlier molecular initiating events. The presence of an electrophilic mechanistic domain is considered one of the essential chemical features to covalently bind to the biological target and induce further haptenation processes. Current in chemico assays rely on the quantification of unreacted model nucleophiles after incubation with the candidate sensitizer. In the current study, a new fluorescence-based method, 'HTS-DCYA assay', is proposed. The assay aims at the identification of reactive electrophiles based on their chemical reactivity toward a model fluorescent thiol. The reaction workflow enabled the development of a High Throughput Screening (HTS) method to directly quantify the reaction adducts. The reaction conditions have been optimized to minimize solubility issues, oxidative side reactions and increase the throughput of the assay while minimizing the reaction time, which are common issues with existing methods. Thirty-six chemicals previously classified with LLNA, DPRA or KeratinoSens™ were tested as a proof of concept. Preliminary results gave an estimated 82% accuracy, 78% sensitivity, 90% specificity, comparable to other in chemico methods such as Cys-DPRA. In addition to validated chemicals, six natural products were analyzed and a prediction of their sensitization potential is presented for the first time. Copyright © 2015 Elsevier Inc. All rights reserved.

  8. Survey of flue gas desulfurization systems: Dickerson Station, Potomac Electric Power Co. Final report, Feb--Aug 1975

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Isaacs, G.A.

    1975-09-01

    Results are given of a survey of a flue gas desulfurization system, utilizing the Chemico/Basic MgO-SO2 removal/recovery process, that has been retrofitted to handle approximately half of the exhaust gas from the 190 MW unit 3 at Potomac Electric Power Company's Dickerson Station. The system was installed at a cost of $0.5 million. The boiler burns 2% sulfur coal and is equipped with a 94% efficient electrostatic precipitator. A single two-stage scrubber/absorber is used. The liquor streams for the two stages are separate, both operating in a closed-loop mode. Magnesium oxide (MgO) is regenerated off-site. (GRA)

  9. Methyl methacrylate and respiratory sensitization: A critical review

    PubMed Central

    Borak, Jonathan; Fields, Cheryl; Andrews, Larry S; Pemberton, Mark A

    2011-01-01

    Methyl methacrylate (MMA) is a respiratory irritant and dermal sensitizer that has been associated with occupational asthma in a small number of case reports. Those reports have raised concern that it might be a respiratory sensitizer. To better understand that possibility, we reviewed the in silico, in chemico, in vitro, and in vivo toxicology literature, and also epidemiologic and occupational medicine reports related to the respiratory effects of MMA. Numerous in silico and in chemico studies indicate that MMA is unlikely to be a respiratory sensitizer. The few in vitro studies suggest that MMA has generally weak effects. In vivo studies have documented contact skin sensitization, nonspecific cytotoxicity, and weakly positive responses on local lymph node assay; guinea pig and mouse inhalation sensitization tests have not been performed. Cohort and cross-sectional worker studies reported irritation of eyes, nose, and upper respiratory tract associated with short-term peak exposures, but little evidence for respiratory sensitization or asthma. Nineteen case reports described asthma, laryngitis, or hypersensitivity pneumonitis in MMA-exposed workers; however, exposures were either not well described or involved mixtures containing more reactive respiratory sensitizers and irritants. The weight of evidence, both experimental and observational, argues that MMA is not a respiratory sensitizer. PMID:21401327

  10. Identification of a compound isolated from German chamomile (Matricaria chamomilla) with dermal sensitization potential.

    PubMed

    Avonto, Cristina; Rua, Diego; Lasonkar, Pradeep B; Chittiboyina, Amar G; Khan, Ikhlas A

    2017-03-01

    German chamomile is one of the most popular herbal ingredients used in cosmetics and personal care products. Allergic skin reactions following topical application of German chamomile have been occasionally reported, although it is not fully understood which of the chemical constituents is responsible for this adverse effect. In the present work, three candidate sensitizers were isolated from German chamomile based on activity-guided fractionation of chamomile extracts tested using the in vitro KeratinoSens™ assay. The compounds were identified as the polyacetylene tonghaosu (1), and both trans- and cis-glucomethoxycinnamic acids (2 and 3). These three compounds were classified as non- to weakly reactive using in chemico methods; however, aged tonghaosu was found to be more reactive when compared to freshly isolated tonghaosu. The polyacetylene (1) constituent was determined to be chemically unstable, generating a small electrophilic spirolactone, 1,6-dioxaspiro[4.4]non-3-en-2-one (4), upon aging. This small lactone (4) was strongly reactive in both in chemico HTS- and NMR-DCYA methods and further confirmed as a potential skin sensitizer by Local Lymph Node Assay (LLNA). Copyright © 2017 Elsevier Inc. All rights reserved.

  11. Production and purification of recombinant human glucagon overexpressed as intein fusion protein in Escherichia coli.

    PubMed

    Esipov, Roman S; Stepanenko, Vasily N; Gurevich, Alexandr I; Chupova, Larisa A; Miroshnikov, Anatoly I

    2006-01-01

    Chemico-enzymatic synthesis and cloning in Escherichia coli of an artificial gene encoding human glucagon was performed. A recombinant plasmid containing the hybrid glucagon gene and the intein Ssp DnaB from Synechocystis sp. was designed. Expression of the obtained hybrid gene in E. coli, properties of the formed hybrid protein, and conditions of its autocatalytic cleavage leading to glucagon formation were studied.

  12. In chemico evaluation of tea tree essential oils as skin sensitizers: Impact of the chemical composition on aging and generation of reactive species

    USDA-ARS?s Scientific Manuscript database

    Tea tree oil (TTO) is a popular skin remedy obtained from the leaves of Melaleuca alternifolia, M. linariifolia or M. dissitiflora. Due to the commercial importance of TTO, substitution or adulteration with other tea tree species (such as cajeput, niaouli, manuka and kanuka oils) is common and may p...

  13. Surface investigation of naturally corroded gilded copper-based objects

    NASA Astrophysics Data System (ADS)

    Ingo, G. M.; Riccucci, C.; Lavorgna, M.; Salzano de Luna, M.; Pascucci, M.; Di Carlo, G.

    2016-11-01

    Gold and silver coated copper-based artefacts subjected to long-term natural corrosion phenomena were studied by means of the combined use of X-ray photoelectron spectroscopy (XPS), scanning electron microscopy coupled with energy dispersive X-ray spectroscopy (SEM + EDS), and optical microscopy (OM). The results allowed the identification of the chemistry and structure of the Au or Ag layers deposited by fire-gilding or mercury-silvering and the determination of the corrosion products formed due to interaction with the surrounding environment. Different degradation phenomena of the noble metal layer and copper substrate are induced by the presence of chlorine, sulphur and phosphorus, and they are boosted by the metal galvanic coupling, which makes gilded-metal artworks unstable from a chemico-physical point of view. The SEM + EDS and OM results also suggest that particular care must be taken during the removal of the encrustations and of the external corrosion products to avoid the loss of the remains of the noble layer, often floating or embedded in the corrosion products. Furthermore, in order to avoid the reaction between nantokite (CuCl) and moisture, the use of non-toxic or low-toxicity inhibitors is suggested to prevent further severe degradation phenomena, enhancing the long-lasting chemico-physical stability of these precious artefacts and giving them a greater chance of survival.

  14. COMPUTER-AIDED DRUG DISCOVERY AND DEVELOPMENT (CADDD): in silico-chemico-biological approach

    PubMed Central

    Kapetanovic, I.M.

    2008-01-01

    It is generally recognized that drug discovery and development are very time- and resource-consuming processes. There is an ever-growing effort to apply computational power to the combined chemical and biological space in order to streamline drug discovery, design, development and optimization. In the biomedical arena, computer-aided or in silico design is being utilized to expedite and facilitate hit identification, hit-to-lead selection, optimize the absorption, distribution, metabolism, excretion and toxicity profile and avoid safety issues. Commonly used computational approaches include ligand-based drug design (pharmacophore, a 3-D spatial arrangement of chemical features essential for biological activity), structure-based drug design (drug-target docking), and quantitative structure-activity and quantitative structure-property relationships. Regulatory agencies as well as the pharmaceutical industry are actively involved in the development of computational tools that will improve the effectiveness and efficiency of the drug discovery and development process, decrease the use of animals, and increase predictability. It is expected that the power of CADDD will grow as the technology continues to evolve. PMID:17229415

  15. Compliant electrospun silk fibroin tubes for small vessel bypass grafting.

    PubMed

    Marelli, Benedetto; Alessandrino, Antonio; Farè, Silvia; Freddi, Giuliano; Mantovani, Diego; Tanzi, Maria Cristina

    2010-10-01

    Processing silk fibroin (SF) by electrospinning offers a very attractive opportunity for producing three-dimensional nanofibrillar matrices in tubular form, which may be useful for a biomimetic approach to small calibre vessel regeneration. Bypass grafting of small calibre vessels, with a diameter less than 6 mm, is performed mainly using autografts, like the saphenous vein or internal mammary artery. At present no polymeric grafts made of SF are commercially available, mainly due to inadequate properties (low compliance and lack of endothelial cells). The aim of this work was to electrospin SF into tubular structures (Ø = 6 mm) for small calibre vessel grafting, characterize the morphological, chemico-physical and mechanical properties of the electrospun SF structures and to validate their potential to interact with cells. The morphological properties of electrospun SF nanofibres were investigated by scanning electron microscopy. Chemico-physical analyses revealed an increase in the crystallinity of the structure of SF nanofibres on methanol treatment. Mechanical tests, i.e. compliance and burst pressure measurements, of the electrospun SF tubes showed that the inner pressure to radial deformation ratio was linear for elongation up to 15% and pressure up to 400 mmHg. The mean compliance value between 80 and 120 mmHg was higher than the values reported for both Goretex® and Dacron® grafts and for bovine heterografts, but still slightly lower than those of the saphenous and umbilical vein, which nowadays represent the gold standard for the replacement of small calibre arteries. The electrospun tubes resisted up to 575 ± 17 mmHg, which is more than four times the upper physiological pressure of 120 mmHg and more than twice the pathological upper pressures (range 180-220 mmHg). The in vitro tests showed good cytocompatibility of the electrospun SF tubes. Therefore, the electrospun SF tubes developed within this work represent a suitable candidate for small calibre blood vessel replacement. 2010 Acta Materialia Inc. Published by Elsevier Ltd. All rights reserved.

  16. A fluorescence high throughput screening method for the detection of reactive electrophiles as potential skin sensitizers

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Avonto, Cristina; Chittiboyina, Amar G.; Rua, Diego

    2015-12-01

    Skin sensitization is an important toxicological end-point in the risk assessment of chemical allergens. Because of the complexity of the biological mechanisms associated with skin sensitization, integrated approaches combining different chemical, biological and in silico methods are recommended to replace conventional animal tests. Chemical methods are intended to characterize the potential of a sensitizer to induce earlier molecular initiating events. The presence of an electrophilic mechanistic domain is considered one of the essential chemical features to covalently bind to the biological target and induce further haptenation processes. Current in chemico assays rely on the quantification of unreacted model nucleophiles after incubation with the candidate sensitizer. In the current study, a new fluorescence-based method, ‘HTS-DCYA assay’, is proposed. The assay aims at the identification of reactive electrophiles based on their chemical reactivity toward a model fluorescent thiol. The reaction workflow enabled the development of a High Throughput Screening (HTS) method to directly quantify the reaction adducts. The reaction conditions have been optimized to minimize solubility issues, oxidative side reactions and increase the throughput of the assay while minimizing the reaction time, which are common issues with existing methods. Thirty-six chemicals previously classified with LLNA, DPRA or KeratinoSens™ were tested as a proof of concept. Preliminary results gave an estimated 82% accuracy, 78% sensitivity, 90% specificity, comparable to other in chemico methods such as Cys-DPRA. In addition to validated chemicals, six natural products were analyzed and a prediction of their sensitization potential is presented for the first time. - Highlights: • A novel fluorescence-based method to detect electrophilic sensitizers is proposed. • A model fluorescent thiol was used to directly quantify the reaction products. • A discussion of the reaction workflow and critical parameters is presented. • The method could provide a useful tool to complement existing chemical assays.

  17. All-SPEEK flexible supercapacitor exploiting laser-induced graphenization

    NASA Astrophysics Data System (ADS)

    Lamberti, A.; Serrapede, M.; Ferraro, G.; Fontana, M.; Perrucci, F.; Bianco, S.; Chiolerio, A.; Bocchini, S.

    2017-09-01

    Flexible supercapacitors have emerged as one of the most promising and efficient space-saving energy storage systems for portable and wearable electronics. Laser-induced graphenization has been recently proposed as a powerful and scalable method to directly convert a polymeric substrate into a 3D network of few-layer graphene as a high-performance supercapacitor electrode. Unfortunately, this outstanding process has been reported to be feasible for only a few thermoplastic polymers, strongly limiting its future developments. Here we show that laser-induced graphenization of sulfonated poly(ether ether ketone) (SPEEK) can be obtained, and the mechanism of this novel process is proposed. The resulting material can act at the same time as a binder-free electrode and current collector. Moreover, SPEEK is also used both as separator and polymeric electrolyte, allowing the assembly of an all-SPEEK flexible supercapacitor. Chemico-physical characterization provides a deep understanding of the laser-induced graphenization process, reported on this polymer for the first time, while the device performance, studied by cyclic voltammetry, charging-discharging, and impedance spectroscopy, proves the enormous potential of the proposed approach.

  18. Control of Breast Tumor Cell Growth by Dietary Indoles

    DTIC Science & Technology

    1997-09-01

    N-nitrosodimethylamine metabolites to mouse liver macromolecules. Chemico-Biol. Interactions 48, 81-90. 5. Bailey, G.S., Hendricks, J.D., Shelton...Food Chem. Toxicol. 21, 31-36. 7. Dashwood, R.H., Arbogast, D.N., Fong, A.T., Hendricks, J.D. and Bailey, G.S. (1988) Mechanisms of... penicillin, 50 units/ml streptomycin, and 2 mM L-glutamine. MDA-MB-231 cells were grown in DMEM supplemented with 10% FBS, 50 units/ml penicillin, 50

  19. In Vivo Reactivation by Oximes of Inhibited Blood, Brain and Peripheral Tissue Cholinesterase Activity Following Exposure to Nerve Agents in Guinea Pigs

    DTIC Science & Technology

    2010-01-01

    L.W. Harris, D.L. Stitcher, Reactivation of VX-inhibited cholinesterase by 2-PAM and HS-6 in rats, Drug Chem. Toxicol. 6 (1983) 235–240. [9] P.M. Lundy, T.-M...rat, monkey and human, Arch. Toxicol. 68 (1994) 648–655. ... T.-M. Shih et al. / Chemico-Biological Interactions ... [27] L.W. Harris, W.C. Heyl, D.L. Stitcher

  20. The local lymph node assay in 2014.

    PubMed

    Basketter, David A; Gerberick, G Frank; Kimber, Ian

    2014-01-01

    Toxicology endeavors to predict the potential of materials to cause adverse health (and environmental) effects and to assess the risk(s) associated with exposure. For skin sensitizers, the local lymph node assay was the first method to be fully and independently validated, as well as the first to offer an objective end point with a quantitative measure of sensitizing potency (in addition to hazard identification). Fifteen years later, it serves as the primary standard for the development of in vitro/in chemico/in silico alternatives.

  1. Chemico-Genetic Identification of Drebrin as a Regulator of Calcium Responses

    PubMed Central

    Mercer, Jason C.; Qi, Qian; Mottram, Laurie F.; Law, Mankit; Bruce, Danny; Iyer, Archana; Morales, J. Luis; Yamazaki, Hiroyuki; Shirao, Tomoaki; Peterson, Blake R.; August, Avery

    2009-01-01

    Store-operated calcium channels are plasma membrane Ca2+ channels that are activated by depletion of intracellular Ca2+ stores, resulting in an increase in intracellular Ca2+ concentration, which is maintained for prolonged periods in some cell types. Increases in intracellular Ca2+ concentration serve as signals that activate a number of cellular processes; however, little is known about the regulation of these channels. We have characterized the immuno-suppressant compound BTP, which blocks store-operated channel mediated calcium influx into cells. Using an affinity purification scheme to identify potential targets of BTP, we identified the actin reorganizing protein, drebrin, and demonstrated that loss of drebrin protein expression prevents store-operated channel mediated Ca2+ entry, similar to BTP treatment. BTP also blocks actin rearrangements induced by drebrin. While actin cytoskeletal reorganization has been implicated in store-operated calcium channel regulation, little is known about actin binding proteins that are involved in this process, or how actin regulates channel function. The identification of drebrin as a mediator of this process should provide new insight into the interaction between actin rearrangement and store-operated channel mediated calcium influx. PMID:19948240

  2. Non-animal methods to predict skin sensitization (II): an assessment of defined approaches *.

    PubMed

    Kleinstreuer, Nicole C; Hoffmann, Sebastian; Alépée, Nathalie; Allen, David; Ashikaga, Takao; Casey, Warren; Clouet, Elodie; Cluzel, Magalie; Desprez, Bertrand; Gellatly, Nichola; Göbel, Carsten; Kern, Petra S; Klaric, Martina; Kühnl, Jochen; Martinozzi-Teissier, Silvia; Mewes, Karsten; Miyazawa, Masaaki; Strickland, Judy; van Vliet, Erwin; Zang, Qingda; Petersohn, Dirk

    2018-05-01

    Skin sensitization is a toxicity endpoint of widespread concern, for which the mechanistic understanding and concurrent necessity for non-animal testing approaches have evolved to a critical juncture, with many available options for predicting sensitization without using animals. Cosmetics Europe and the National Toxicology Program Interagency Center for the Evaluation of Alternative Toxicological Methods collaborated to analyze the performance of multiple non-animal data integration approaches for the skin sensitization safety assessment of cosmetics ingredients. The Cosmetics Europe Skin Tolerance Task Force (STTF) collected and generated data on 128 substances in multiple in vitro and in chemico skin sensitization assays selected based on a systematic assessment by the STTF. These assays, together with certain in silico predictions, are key components of various non-animal testing strategies that have been submitted to the Organization for Economic Cooperation and Development as case studies for skin sensitization. Curated murine local lymph node assay (LLNA) and human skin sensitization data were used to evaluate the performance of six defined approaches, comprising eight non-animal testing strategies, for both hazard and potency characterization. Defined approaches examined included consensus methods, artificial neural networks, support vector machine models, Bayesian networks, and decision trees, most of which were reproduced using open source software tools. Multiple non-animal testing strategies incorporating in vitro, in chemico, and in silico inputs demonstrated equivalent or superior performance to the LLNA when compared to both animal and human data for skin sensitization.
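
    Several of the defined approaches evaluated in studies like this one are simple, fixed data-integration rules, such as consensus (majority) voting over the individual assay calls. The sketch below is an assumption for illustration only, not the STTF's actual strategies: it shows a generic "2 out of 3" consensus call over DPRA, KeratinoSens and h-CLAT results.

    ```python
    # Hedged sketch of a "2 out of 3" consensus defined approach.
    # The assay set and voting rule are illustrative, not the strategies assessed in the paper.
    from typing import Mapping

    def consensus_call(assay_results: Mapping[str, bool]) -> bool:
        """Return True (predicted sensitizer) if at least two of three assays are positive."""
        required = ("DPRA", "KeratinoSens", "h-CLAT")
        positives = sum(bool(assay_results[name]) for name in required)
        return positives >= 2

    print(consensus_call({"DPRA": True, "KeratinoSens": True, "h-CLAT": False}))  # True
    ```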

  3. Chemico-therapeutic approach to prevention of dental caries [using stannous fluoride gel]

    NASA Technical Reports Server (NTRS)

    Shannon, I. L.

    1975-01-01

    The program of chemical preventive dentistry is based primarily upon the development of a procedure for stabilizing stannous fluoride in solution by forcing it into glycerin. New topical fluoride treatment concentrates, fluoride containing gels and prophylaxis pastes, as well as a completely stable stannous fluoride dentifrice are made possible by the development of a rather complicated heat application method to force stannous fluoride into solution in glycerin. That the stannous fluoride is clinically effective in such a preparation is demonstrated briefly on orthodontic patients.

  4. Applications of High and Ultra High Pressure Homogenization for Food Safety.

    PubMed

    Patrignani, Francesca; Lanciotti, Rosalba

    2016-01-01

    Traditionally, the shelf-life and safety of foods have been achieved by thermal processing. Low temperature long time and high temperature short time treatments are the most commonly used hurdles for the pasteurization of fluid foods and raw materials. However, the thermal treatments can reduce the product quality and freshness. Consequently, some non-thermal pasteurization processes have been proposed during the last decades, including high hydrostatic pressure, pulsed electric field, ultrasound (US), and high pressure homogenization (HPH). This last technique has been demonstrated to have a great potential to provide "fresh-like" products with prolonged shelf-life. Moreover, the recent developments in high-pressure-homogenization technology and the design of new homogenization valves able to withstand pressures up to 350-400 MPa have opened new opportunities to homogenization processing in the food industries and, consequently, permitted the development of new products differentiated from traditional ones by sensory and structural characteristics or functional properties. For this reason, this review deals with the principal mechanisms of action of HPH against microorganisms of food concern in relation to the adopted homogenizer and process parameters. In addition, the effects of homogenization on foodborne pathogenic species inactivation in relation to the food matrix and food chemico-physical and process variables will be reviewed. The combined use of this alternative technology with other non-thermal technologies will also be considered.

  5. Applications of High and Ultra High Pressure Homogenization for Food Safety

    PubMed Central

    Patrignani, Francesca; Lanciotti, Rosalba

    2016-01-01

    Traditionally, the shelf-life and safety of foods have been achieved by thermal processing. Low temperature long time and high temperature short time treatments are the most commonly used hurdles for the pasteurization of fluid foods and raw materials. However, the thermal treatments can reduce the product quality and freshness. Consequently, some non-thermal pasteurization processes have been proposed during the last decades, including high hydrostatic pressure, pulsed electric field, ultrasound (US), and high pressure homogenization (HPH). This last technique has been demonstrated to have a great potential to provide “fresh-like” products with prolonged shelf-life. Moreover, the recent developments in high-pressure-homogenization technology and the design of new homogenization valves able to withstand pressures up to 350–400 MPa have opened new opportunities to homogenization processing in the food industries and, consequently, permitted the development of new products differentiated from traditional ones by sensory and structural characteristics or functional properties. For this reason, this review deals with the principal mechanisms of action of HPH against microorganisms of food concern in relation to the adopted homogenizer and process parameters. In addition, the effects of homogenization on foodborne pathogenic species inactivation in relation to the food matrix and food chemico-physical and process variables will be reviewed. The combined use of this alternative technology with other non-thermal technologies will also be considered. PMID:27536270

  6. Physicochemical regeneration of high silica zeolite Y used to clean-up water polluted with sulfonamide antibiotics.

    PubMed

    Braschi, I; Blasioli, S; Buscaroli, E; Montecchio, D; Martucci, A

    2016-05-01

    High silica zeolite Y has been positively evaluated to clean up water polluted with sulfonamides, an antibiotic family known to be involved in the evolution of antibiotic resistance. To define possible strategies for the regeneration of the exhausted zeolite, the efficacy of some chemico-physical treatments on the zeolite loaded with four different sulfonamides was evaluated. The progress of photolysis, Fenton-like reactions, thermal treatments, and solvent extractions, and the occurrence of any entrapped organic residues in the zeolite pores, were elucidated by a combined thermogravimetric (TGA-DTA), diffractometric (XRPD), and spectroscopic (FT-IR) approach. The chemical processes were not able to remove the organic guest from the zeolite pores, and only a limited transformation of the embedded molecules was observed. In contrast, both thermal treatment and solvent extraction succeeded in regenerating the zeolite loaded from deionized and natural fresh water. The recyclability of the regenerated zeolite was evaluated over several adsorption/regeneration cycles, based on the treatment efficacy, its stability, and its ability to regain the structural features of the unloaded material. Copyright © 2015. Published by Elsevier B.V.

  7. Synthesis and mechanical behavior of β-tricalcium phosphate/titania composites addressed to regeneration of long bone segments.

    PubMed

    Sprio, Simone; Guicciardi, Stefano; Dapporto, Massimiliano; Melandri, Cesare; Tampieri, Anna

    2013-01-01

    Bioactive tricalcium phosphate/titania ceramic composites were synthesized by pressureless air sintering of mixed hydroxyapatite and titania (TiO2) powders. The sintering process was optimized to achieve dense ceramic bodies consisting of a bioactive/bioresorbable matrix (β-tricalcium phosphate) reinforced with defined amounts of sub-micron sized titania particles. Extensive chemico-physical and mechanical characterization was carried out on the resulting composites, which displayed values of flexural strength, fracture toughness and elastic modulus within or above the typical ranges of values exhibited by human cortical bone. It was shown that titania particles provided a toughening effect to the calcium-phosphate matrix and a reinforcement in fracture strength, in comparison with sintered hydroxyapatite bodies characterized by similar relative density. The characteristics of the resulting composites, i.e. bioactivity/bioresorbability and the ability to exhibit biomimetic mechanical behavior, are features that can promote processes of bone regeneration in load-bearing sites. Hence, in the perspective of developing porous bone scaffolds with high bioactivity and improved biomechanical behavior, TCP/TiO2 composites with controlled composition can be considered very promising biomaterials for application in a field of orthopedics where no acceptable clinical solutions yet exist. Copyright © 2012 Elsevier Ltd. All rights reserved.

  8. Challenges of UV light processing of low UVT foods and beverages

    NASA Astrophysics Data System (ADS)

    Koutchma, Tatiana

    2010-08-01

    Ultraviolet (UV) technology holds promise as a low-cost, non-thermal alternative to heat pasteurization of liquid foods and beverages. However, its application for foods is still limited due to low UV transmittance (LUVT). LUVT foods have a diverse range of chemical (pH, Brix, Aw), physical (density and viscosity) and optical (absorbance and scattering) properties that are critical for system and process design. The commercially available UV sources tested for foods include low and medium pressure mercury lamps (LPM and MPM), excimer and pulsed lamps (PUV). The LPM and excimer lamps are monochromatic sources, whereas the emission of MPM and PUV is polychromatic. Optimized designs of UV systems and UV sources with parameters matched to specific product spectra have the potential to make UV treatments of LUVT foods more effective and will support their further commercialization. In order to select a UV source for a specific food application, processing effects on nutritional, quality, sensorial and safety markers have to be evaluated. This paper will review the current status of UV technology for food processing along with regulatory requirements. Discussion of approaches and results of measurements of chemico-physical and optical properties of various foods (fresh juices, milk, liquid whey proteins and sweeteners) that are critical for UV process and system design will follow. Available UV sources did not prove totally effective, either resulting in low microbial reduction or in UV over-dosing of the product, thereby leading to sensory changes. Beam shaping of UV light presents new opportunities to improve dosage uniformity and delivery of UV photons in LUVT foods.

  9. Responsive hydrogels--structurally and dimensionally optimized smart frameworks for applications in catalysis, micro-system technology and material science.

    PubMed

    Döring, Artjom; Birnbaum, Wolfgang; Kuckling, Dirk

    2013-09-07

    Although the technological and scientific importance of functional polymers has been well established over the last few decades, the most recent focus that has attracted much attention has been on stimuli-responsive polymers. This group of materials is of particular interest due to its ability to respond to internal and/or external chemico-physical stimuli, which is often manifested as large macroscopic responses. Aside from the scientific challenges of designing stimuli-responsive polymers, the main technological interest lies in their numerous applications, ranging from catalysis through microsystem technology and chemomechanical actuators to sensors, which have been extensively explored. Since the phase transition phenomenon of hydrogels is theoretically well understood, advanced materials based on these predictions can be prepared. Since the volume phase transition of hydrogels is a diffusion-limited process, the size of the synthesized hydrogels is an important factor. Consistent downscaling of the gel size will result in fast smart gels with sufficient response times. In order to apply smart gels in microsystems and sensors, new preparation techniques for hydrogels have to be developed. For the upcoming nanotechnology, nano-sized gels as actuating materials would be of great interest.

  10. Development of novel antibiofouling materials from natural phenol compounds

    NASA Astrophysics Data System (ADS)

    Chelikani, Rahul; Kim, Dong Shik

    2007-03-01

    Biofilms consist of a gelatinous matrix formed on a solid surface by microbial organisms. Biofilm formation is caused by the adhesion of microbes to solid surfaces with the production of extracellular polymers, and the process of biofilm formation is referred to as biofouling. Biofouling causes serious problems in the chemical, medical and pharmaceutical industries. Although some antibiofouling materials have been developed over the years, no plausible results have been found yet. Natural polyphenolic compounds like flavonoids and catechins have strong antioxidant and antimicrobial properties. Recently, apocynin, a phenol derivative, was polymerized to form oligomers, which can regulate intracellular pathways in cancer cells, preventing cell proliferation and migration. These natural phenolic compounds have never been applied to solid surfaces to prevent biofouling, probably because of the difficulty of crosslinking them to form a stable coating. In this study, some novel polyphenolic compounds synthesized using an enzymatic technique from cashew nut shell liquid, a cheap and renewable byproduct of the cashew industry, are used as coating materials to prevent biofouling. The interaction of these materials with microbes in preventing fouling on surfaces and the chemico-physical properties of the materials causing the antibiofouling effect will be discussed. It is critical to understand the antibiofouling mechanism of these materials for better design and application in various fields.

  11. AtlasCBS: a web server to map and explore chemico-biological space

    NASA Astrophysics Data System (ADS)

    Cortés-Cabrera, Álvaro; Morreale, Antonio; Gago, Federico; Abad-Zapatero, Celerino

    2012-09-01

    New approaches are needed that can help decrease the unsustainable failure in small-molecule drug discovery. Ligand Efficiency Indices (LEI) are making a great impact on early-stage compound selection and prioritization. Given a target-ligand database with chemical structures and associated biological affinities/activities for a target, the AtlasCBS server generates two-dimensional, dynamical representations of its contents in terms of LEI. These variables allow an effective decoupling of the chemical (angular) and biological (radial) components. BindingDB, PDBBind and ChEMBL databases are currently implemented. Proprietary datasets can also be uploaded and compared. The utility of this atlas-like representation in the future of drug design is highlighted with some examples. The web server can be accessed at http://ub.cbm.uam.es/atlascbs and https://www.ebi.ac.uk/chembl/atlascbs.

  12. AtlasCBS: a web server to map and explore chemico-biological space.

    PubMed

    Cortés-Cabrera, Alvaro; Morreale, Antonio; Gago, Federico; Abad-Zapatero, Celerino

    2012-09-01

    New approaches are needed that can help decrease the unsustainable failure in small-molecule drug discovery. Ligand Efficiency Indices (LEI) are making a great impact on early-stage compound selection and prioritization. Given a target-ligand database with chemical structures and associated biological affinities/activities for a target, the AtlasCBS server generates two-dimensional, dynamical representations of its contents in terms of LEI. These variables allow an effective decoupling of the chemical (angular) and biological (radial) components. BindingDB, PDBBind and ChEMBL databases are currently implemented. Proprietary datasets can also be uploaded and compared. The utility of this atlas-like representation in the future of drug design is highlighted with some examples. The web server can be accessed at http://ub.cbm.uam.es/atlascbs and https://www.ebi.ac.uk/chembl/atlascbs.
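
    As a rough illustration of the kind of ligand-efficiency variables such a server plots, the sketch below uses two commonly cited definitions (BEI = pKi / MW in kDa and SEI = pKi per 100 Å² of polar surface area) and converts them to polar coordinates; the exact indices and plane used by AtlasCBS may differ, so treat the formulas as assumptions.

    ```python
    import math

    # Hedged sketch: common ligand-efficiency indices and a polar decomposition.
    # Assumed definitions: BEI = pKi / MW(kDa), SEI = pKi / (PSA / 100 A^2);
    # the variables actually plotted by AtlasCBS may differ.
    def efficiency_plane(pki: float, mw_da: float, psa_a2: float) -> dict:
        bei = pki / (mw_da / 1000.0)                 # efficiency per kDa of molecular weight
        sei = pki / (psa_a2 / 100.0)                 # efficiency per 100 A^2 of polar surface area
        radius = math.hypot(sei, bei)                # affinity-driven ("biological") magnitude
        angle = math.degrees(math.atan2(bei, sei))   # composition-driven ("chemical") direction
        return {"BEI": bei, "SEI": sei, "r": radius, "theta_deg": angle}

    # Illustrative ligand: pKi = 7.0, MW = 350 Da, PSA = 90 A^2.
    print(efficiency_plane(7.0, 350.0, 90.0))
    ```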

  13. Integrated Decision Strategies for Skin Sensitization Hazard

    PubMed Central

    Strickland, Judy; Zang, Qingda; Kleinstreuer, Nicole; Paris, Michael; Lehmann, David M.; Choksi, Neepa; Matheson, Joanna; Jacobs, Abigail; Lowit, Anna; Allen, David; Casey, Warren

    2016-01-01

    One of the top priorities of ICCVAM is the identification and evaluation of non-animal alternatives for skin sensitization testing. Although skin sensitization is a complex process, the key biological events of the process have been well characterized in an adverse outcome pathway (AOP) proposed by OECD. Accordingly, ICCVAM is working to develop integrated decision strategies based on the AOP using in vitro, in chemico, and in silico information. Data were compiled for 120 substances tested in the murine local lymph node assay (LLNA), direct peptide reactivity assay (DPRA), human cell line activation test (h-CLAT), and KeratinoSens assay. Data for six physicochemical properties that may affect skin penetration were also collected, and skin sensitization read-across predictions were performed using OECD QSAR Toolbox. All data were combined into a variety of potential integrated decision strategies to predict LLNA outcomes using a training set of 94 substances and an external test set of 26 substances. Fifty-four models were built using multiple combinations of machine learning approaches and predictor variables. The seven models with the highest accuracy (89–96% for the test set and 96–99% for the training set) for predicting LLNA outcomes used a support vector machine (SVM) approach with different combinations of predictor variables. The performance statistics of the SVM models were higher than any of the non-animal tests alone and higher than simple test battery approaches using these methods. These data suggest that computational approaches are promising tools to effectively integrate data sources to identify potential skin sensitizers without animal testing. PMID:26851134
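
    For readers unfamiliar with the modeling step, the sketch below shows the general shape of a support vector machine workflow of this kind in scikit-learn. The feature columns and data are random placeholders, not the study's dataset, and the actual models used additional inputs and tuning.

    ```python
    # Hedged sketch of an SVM skin-sensitization hazard classifier.
    # Features and labels are random placeholders standing in for assay + physicochemical inputs.
    import numpy as np
    from sklearn.model_selection import cross_val_score
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler
    from sklearn.svm import SVC

    rng = np.random.default_rng(0)
    # Assumed columns: DPRA depletion, h-CLAT response, KeratinoSens EC1.5, logP, MW, read-across call
    X = rng.random((94, 6))
    y = rng.integers(0, 2, size=94)  # LLNA outcome: 1 = sensitizer, 0 = non-sensitizer

    model = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0))
    scores = cross_val_score(model, X, y, cv=5)  # cross-validated accuracy on the training set
    print(f"mean CV accuracy: {scores.mean():.2f}")
    ```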

  14. Molecular Mechanism of Acrylamide Neurotoxicity: Lessons Learned from Organic Chemistry

    PubMed Central

    Gavin, Terrence

    2012-01-01

    Background: Acrylamide (ACR) produces cumulative neurotoxicity in exposed humans and laboratory animals through a direct inhibitory effect on presynaptic function. Objectives: In this review, we delineate how knowledge of chemistry provided an unprecedented understanding of the ACR neurotoxic mechanism. We also show how application of the hard and soft, acids and bases (HSAB) theory led to the recognition that the α,β-unsaturated carbonyl structure of ACR is a soft electrophile that preferentially forms covalent bonds with soft nucleophiles. Methods: In vivo proteomic and in chemico studies demonstrated that ACR formed covalent adducts with highly nucleophilic cysteine thiolate groups located within active sites of presynaptic proteins. Additional research showed that resulting protein inactivation disrupted nerve terminal processes and impaired neurotransmission. Discussion: ACR is a type-2 alkene, a chemical class that includes structurally related electrophilic environmental pollutants (e.g., acrolein) and endogenous mediators of cellular oxidative stress (e.g., 4-hydroxy-2-nonenal). Members of this chemical family produce toxicity via a common molecular mechanism. Although individual environmental concentrations might not be toxicologically relevant, exposure to an ambient mixture of type-2 alkene pollutants could pose a significant risk to human health. Furthermore, environmentally derived type-2 alkenes might act synergistically with endogenously generated unsaturated aldehydes to amplify cellular damage and thereby accelerate human disease/injury processes that involve oxidative stress. Conclusions: These possibilities have substantial implications for environmental risk assessment and were realized through an understanding of ACR adduct chemistry. The approach delineated here can be broadly applied because many toxicants of different chemical classes are electrophiles that produce toxicity by interacting with cellular proteins. PMID:23060388

  15. El Niño impact on mollusk biomineralization-implications for trace element proxy reconstructions and the paleo-archeological record.

    PubMed

    Pérez-Huerta, Alberto; Etayo-Cadavid, Miguel F; Andrus, C Fred T; Jeffries, Teresa E; Watkins, Clifton; Street, Shane C; Sandweiss, Daniel H

    2013-01-01

    Marine macroinvertebrates are ideal sentinel organisms to monitor rapid environmental changes associated with climatic phenomena. These organisms build up protective exoskeletons incrementally by biologically-controlled mineralization, which is deeply rooted in long-term evolutionary processes. Recent studies relating potential rapid environmental fluctuations to climate change, such as ocean acidification, suggest modifications on carbonate biominerals of marine invertebrates. However, the influence of known, and recurrent, climatic events on these biological processes during active mineralization is still insufficiently understood. Analysis of Peruvian cockles from the 1982-83 large magnitude El Niño event shows significant alterations of the chemico-structure of carbonate biominerals. Here, we show that bivalves modify the main biomineralization mechanism during the event to continue shell secretion. As a result, magnesium content increases to stabilize amorphous calcium carbonate (ACC), inducing a rise in Mg/Ca unrelated to the associated increase in sea-surface temperature. Analysis of variations in Sr/Ca also suggests that this proxy should not be used in these bivalves to detect the temperature anomaly, while Ba/Ca peaks are recorded in shells in response to an increase in productivity, or dissolved barium in seawater, after the event. Presented data contribute to a better understanding of the effects of abrupt climate change on shell biomineralization, while also offering an alternative view of bivalve elemental proxy reconstructions. Furthermore, biomineralization changes in mollusk shells can be used as a novel potential proxy to provide a more nuanced historical record of El Niño and similar rapid environmental change events.

  16. Graphene-like layers as promising chemiresistive sensing material for detection of alcohols at low concentration

    NASA Astrophysics Data System (ADS)

    Gargiulo, Valentina; Alfano, Brigida; Di Capua, Roberto; Alfé, Michela; Vorokhta, Mykhailo; Polichetti, Tiziana; Massera, Ettore; Miglietta, Maria Lucia; Schiattarella, Chiara; Di Francia, Girolamo

    2018-01-01

    Among the wide variety of materials for Volatile Organic Compound (VOC) sensing, graphene-related materials (GRMs) have gained special attention thanks to their versatility and overall chemico-physical tunability as a function of the specific application. In this work, the sensing performances of graphene-like (GL) layers, a new material belonging to the GRM family, are tested against ethanol and n-butanol. Two typologies of GL samples were produced by employing two different approaches and tested in view of their application as VOC sensors. The experiments were performed under atmospheric pressure, in dry air, and at room temperature, and demonstrated that the sensing capabilities are related to the film surface features. The results indicated that GL films are promising candidates for the detection of low concentrations of VOCs at room temperature. The present investigation thus paves the way for VOC sensing optimization using cost-effective and easily scalable materials.

  17. Recent Advances in Marine Algae Polysaccharides: Isolation, Structure, and Activities.

    PubMed

    Xu, Shu-Ying; Huang, Xuesong; Cheong, Kit-Leong

    2017-12-13

    Marine algae have attracted a great deal of interest as excellent sources of nutrients. Polysaccharides are the main components in marine algae, hence a great deal of attention has been directed at the isolation and characterization of marine algae polysaccharides because of their numerous health benefits. In this review, extraction and purification approaches and chemico-physical properties of marine algae polysaccharides (MAPs) are summarized. The biological activities, which include immunomodulatory, antitumor, antiviral, antioxidant, and hypolipidemic effects, are also discussed. Additionally, structure-function relationships are analyzed and summarized. MAPs' biological activities are closely correlated with their monosaccharide composition, molecular weights, linkage types, and chain conformation. In order to promote further exploitation and utilization of polysaccharides from marine algae in functional food and pharmaceutical areas, high-efficiency and low-cost polysaccharide extraction and purification methods, quality control, structure-function activity relationships, and the specific mechanisms of MAP activation need to be extensively investigated.

  18. Regulation of mesenchymal stem cell 3D microenvironment: From macro to microfluidic bioreactors.

    PubMed

    Sart, Sébastien; Agathos, Spiros N; Li, Yan; Ma, Teng

    2016-01-01

    Human mesenchymal stem cells (hMSCs) have emerged as an important cell type in cell therapy and tissue engineering. In these applications, maintaining the therapeutic properties of hMSCs requires tight control of the culture environments and the structural cell organizations. Bioreactor systems are essential tools to achieve these goals in the clinical-scale expansion and tissue engineering applications. This review summarizes how different bioreactors provide cues to regulate the structure and the chemico-mechanical microenvironment of hMSCs with a focus on 3D organization. In addition to conventional bioreactors, recent advances in microfluidic bioreactors as a novel approach to better control the hMSC microenvironment are also discussed. These advancements highlight the key role of bioreactor systems in preserving hMSC's functional properties by providing dynamic and temporal regulation of in vitro cellular microenvironment. Copyright © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  19. A Designed Peptide Targets Two Types of Modifications of p53 with Anti-cancer Activity.

    PubMed

    Liang, Lunxi; Wang, Huanbin; Shi, Hubing; Li, Zhaoli; Yao, Han; Bu, Zhigao; Song, Ningning; Li, Chushu; Xiang, Dabin; Zhang, Yao; Wang, Jilin; Hu, Ye; Xu, Qi; Ma, Yanlei; Cheng, Zhongyi; Wang, Yingchao; Zhao, Shuliang; Qian, Jin; Chen, Yingxuan; Fang, Jing-Yuan; Xu, Jie

    2018-06-21

    Many cancer-related proteins are controlled by composite post-translational modifications (PTMs), but prevalent strategies only target one type of modification. Here we describe a designed peptide that controls two types of modifications of the p53 tumor suppressor, based on the discovery of a protein complex that suppresses p53 (suppresome). We found that Morn3, a cancer-testis antigen, recruits different PTM enzymes, such as sirtuin deacetylase and ubiquitin ligase, to confer composite modifications on p53. The molecular functions of Morn3 were validated through in vivo assays and chemico-biological intervention. A rationally designed Morn3-targeting peptide (Morncide) successfully activated p53 and suppressed tumor growth. These findings shed light on the regulation of protein PTMs and present a strategy for targeting two modifications with one molecule. Copyright © 2018 Elsevier Ltd. All rights reserved.

  20. Sourdough fermentation and chestnut flour in gluten-free bread: A shelf-life evaluation.

    PubMed

    Rinaldi, Massimiliano; Paciulli, Maria; Caligiani, Augusta; Scazzina, Francesca; Chiavaro, Emma

    2017-06-01

    The effect of sourdough fermentation combined with chestnut flour on the technological and nutritional quality of gluten-free bread was investigated during a 5-day shelf life by means of chemico-physical and nutritional properties. Sourdough fermentation, by itself and with chestnut flour, reduced loaf volume and heterogeneity in crumb grain. Sourdough technology allowed increasing crumb moisture content, with no significant variations during shelf life. Chestnut flour darkened the crumb and crust, while no effects on colour were observed for sourdough. Sourdough and/or chestnut flour addition caused a significant increase in crumb hardness at time 0, while a significant reduction of staling was observed only at 5 days, even if a decrease in amylopectin fusion enthalpy was observed. The percentage of hydrolysed starch during in vitro digestion was significantly reduced by sourdough fermentation, with a presumably lower glycaemic index. Copyright © 2016 Elsevier Ltd. All rights reserved.

  1. Phase transitions and their energetics in calcite biominerals

    NASA Astrophysics Data System (ADS)

    Gilbert, Pupa

    2013-03-01

    Biominerals include mollusk shells and the skeletons of algae, sponges, corals, sea urchins and most other animals. The functions of biominerals are diverse: mechanical support, attack, defense, grinding, biting and chewing, gravitational and magnetic field sensing, light focusing, and many others. The exquisite nanostructure of biominerals is directly controlled by the organisms, which have evolved to master the chemico-physical aspects of mineralization. By controlling the inorganic precursor nanoparticle size, packing, and phase transitions, organisms efficiently fill space and produce tough and hard structures, with micro- or macroscopic morphology optimized for their functions. Specifically, this talk will address two key questions: Q: How are the beautiful biomineral morphologies achieved? A: Using amorphous precursor phases, with phase transitions kinetically regulated (retarded) by proteins. Q: How do organisms co-orient their single-crystalline biominerals? A: By controlling the propagation of crystallinity one nanoparticle at a time, not atom-by-atom.

  2. Optical and mechanical properties of UV-weathered biodegradable PHBV/PBAT nanocomposite films containing halloysite nanotubes

    NASA Astrophysics Data System (ADS)

    Scarfato, P.; Avallone, E.; Acierno, D.; Russo, P.

    2014-05-01

    In recent years, the increasing use of plastics, pressing environmental concerns and the awareness of the progressive depletion of available petrochemical resources have increasingly directed research interest towards the investigation and development of innovative materials that are intrinsically biodegradable or derived from renewable sources, generally known as bio-based polymers. Among the biobased and biodegradable polymers, many investigations have been reported in the literature on a family of polyesters known as poly(hydroxyalkanoate)s (PHAs), one of the most prevalent of which is poly(3-hydroxybutyrate-co-3-hydroxyvalerate) (PHBV). In this context, we report here the results of a photo-degradation study performed on biodegradable blown film samples based on a commercial grade PHBV/PBAT formulation. The films, subjected to photo-oxidative weathering in a climatic chamber under UV exposure, were systematically analysed in order to assess the chemico-physical changes induced by the aging protocol, taking the as-produced films as the reference materials.

  3. Experimental verification, and domain definition, of structural alerts for protein binding: epoxides, lactones, nitroso, nitros, aldehydes and ketones.

    PubMed

    Nelms, M D; Cronin, M T D; Schultz, T W; Enoch, S J

    2013-01-01

    This study outlines how a combination of in chemico and Tetrahymena pyriformis data can be used to define the applicability domain of selected structural alerts within the profilers of the OECD QSAR Toolbox. Thirty-three chemicals were profiled using the OECD and OASIS profilers, enabling the applicability domain of six structural alerts to be defined: epoxides, lactones, nitroso compounds, nitro compounds, aldehydes and ketones. Analysis of the experimental data showed the applicability domains for the epoxide, nitroso, aldehyde and ketone structural alerts to be well defined. In contrast, the data showed that the applicability domains for the lactone and nitro structural alerts needed modifying. The accurate definition of the applicability domain for structural alerts within in silico profilers is important because of their use in chemical category formation in predictive and regulatory toxicology. This study highlights the importance of utilizing multiple profilers in category formation.

  4. Non-animal sensitization testing: state-of-the-art.

    PubMed

    Vandebriel, Rob J; van Loveren, Henk

    2010-05-01

    Predictive tests to identify the sensitizing properties of chemicals are carried out using animals. In the European Union, timelines for phasing out many standard animal tests have been established for cosmetics. Following this policy, the new European Chemicals Legislation (REACH) favors alternative methods, if validated and appropriate. In this review the authors aim to provide a state-of-the-art overview of alternative methods (in silico, in chemico, and in vitro) to identify contact and respiratory sensitizing capacity and, in some cases, to give a measure of potency. The past few years have seen major advances in QSAR (quantitative structure-activity relationship) models, where mechanism-based models in particular have great potential; in peptide reactivity assays, where multiple parameters can be measured simultaneously to provide a more complete reactivity profile; and in cell-based assays. Several cell-based assays are in development, using not only different cell types but also several specifically developed formats, such as three-dimensionally (3D) reconstituted skin models, an antioxidant response reporter assay, determination of signaling pathways, and gene profiling. Some of these assays show relatively high sensitivity and specificity for a large number of sensitizers and should enter validation (or are indeed entering this process). Integrating multiple assays in a decision tree or integrated testing system is a next step, but has yet to be developed. Adequate risk assessment, however, is likely to require significantly more time and effort.

  5. Part Marking and Identification Materials for MISSE

    NASA Technical Reports Server (NTRS)

    Roxby, Donald; Finckenor, Miria M.

    2008-01-01

    The Materials on International Space Station Experiment (MISSE) is being conducted with funding from NASA and the U.S. Department of Defense in order to evaluate candidate materials and processes for flight hardware. MISSE modules include test specimens used to validate NASA technical standards for part markings exposed to harsh environments in low-Earth orbit and space, including atomic oxygen, ultraviolet radiation, thermal vacuum cycling, and meteoroid and orbital debris impact. Marked test specimens are evaluated and then mounted in a passive experiment container (PEC) that is affixed to an exterior surface of the International Space Station (ISS). They are exposed to atomic oxygen and/or ultraviolet radiation for a year or more before being retrieved and reevaluated. Criteria include percent contrast, axial uniformity, print growth, error correction, and overall grade. MISSE 1 and 2 (2001-2005), MISSE 3 and 4 (2006-2007), and MISSE 5 (2005-2006) have been completed to date. Acceptable results were found for test specimens marked with Data Matrix(TradeMark) symbols by Intermec Inc. and Robotic Vision Systems Inc. using laser bonding, vacuum arc vapor deposition, gas-assisted laser etch, chemical etch, mechanical dot peening, laser shot peening, laser etching, and laser-induced surface improvement. MISSE 6 (2008-2009) is exposing specimens marked by DataLase(Registered Trademark), Chemico Technologies Inc., Intermec Inc., and tesa with laser-markable paint, nanocode tags, DataLase and tesa laser markings, and anodized metal labels.

  6. CO2 sorption on surface-modified carbonaceous support: Probing the influence of the carbon black microporosity and surface polarity

    NASA Astrophysics Data System (ADS)

    Gargiulo, Valentina; Alfè, Michela; Ammendola, Paola; Raganati, Federica; Chirone, Riccardo

    2016-01-01

    The use of solid sorbents is a convenient option in post-combustion CO2 capture strategies. Sorbent selection is a key point because the materials are required to be both low-cost and versatile under typical post-combustion conditions in order to guarantee an economically advantageous overall process. This work compares strategies to tailor the chemico-physical features of carbon black (CB) by surface modification and/or coating with a CO2-sorbent phase. The influence of the CB microporosity, enhanced by chemical/thermal treatments, is also taken into account. Three CB surface modifications are performed and compared: (i) oxidation and functionalization with amino groups, (ii) coating with iron oxides and (iii) impregnation with an ionic liquid (IL). The CO2 capture performance is evaluated on the basis of breakthrough curves measured at atmospheric pressure and room temperature in a lab-scale fixed-bed micro-reactor. Most of the tested solids adsorb a CO2 amount significantly higher than a 13X zeolite and DARCO FGD (Norit) activated carbon (up to 4 times more in the best case). The sorbents bearing basic functionalities (amino groups and IL) exhibit the highest CO2 sorption capacity. The use of a microporous carbonaceous support limits the accessibility of CO2 toward the adsorbing phase (IL or FM), lowering the number of accessible binding sites for CO2.
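    The abstract reports capacities derived from breakthrough curves but does not detail the calculation. As an illustration of the standard procedure (not the authors' exact method or data), the amount of CO2 retained by the bed can be estimated by integrating (1 - C/C0) over time and multiplying by the inlet CO2 molar flow; the sketch below uses synthetic, assumed numbers throughout.

        import numpy as np

        # Hypothetical breakthrough data: outlet/inlet CO2 ratio C/C0 versus time (s).
        # Values are illustrative only, not taken from the study.
        t = np.linspace(0, 1800, 181)                        # s
        c_ratio = 1.0 / (1.0 + np.exp(-(t - 900) / 120.0))   # sigmoid-shaped breakthrough

        # Assumed operating conditions (not from the paper)
        Q = 50e-6 / 60.0             # gas flow, 50 mL/min -> m^3/s
        y_co2 = 0.15                 # CO2 molar fraction in the feed
        P, T, R = 101325.0, 298.15, 8.314
        m_sorbent = 2.0              # g of sorbent in the bed

        # Moles of CO2 retained = inlet CO2 molar flow * integral of (1 - C/C0) dt
        n_in_rate = P * Q * y_co2 / (R * T)                  # mol/s of CO2 fed
        n_adsorbed = n_in_rate * np.trapz(1.0 - c_ratio, t)  # mol

        capacity = n_adsorbed / m_sorbent * 1000.0           # mmol CO2 per g sorbent
        print(f"Estimated capacity: {capacity:.2f} mmol/g")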

  7. A novel two-step ultrasound post-assisted lye peeling regime for tomatoes: Reducing pollution while improving product yield and quality.

    PubMed

    Gao, Ruiping; Ye, Fayin; Lu, Zhiqiang; Wang, Jiajia; Li Shen, Xiao; Zhao, Guohua

    2018-07-01

    In this paper, the effects and mechanisms of a novel two-step tomato peeling method, hot lye with post-assistance of ultrasound, were investigated. The present work aims to improve the environmental friendliness of the conventional hot lye tomato peeling method (10% w/v, 97 °C, 45 s). The results showed that a 4% (w/v) lye treatment at 97 °C for 30 s, followed by a 31.97 W/L ultrasound treatment at 70 °C for 50 s, achieved 100% peelability. In this scenario, the peeling yield and the lycopene content of the peeled product were significantly higher than those obtained with the conventional hot lye peeling method. The present two-step peeling method was concluded to operate through chemico-mechanical synergism, in which the hot lye acts mainly chemically while the ultrasound acts mechanically. On the lye side in particular, this work demonstrated for the first time that the lye penetrates the tomato skin via a pitting model rather than evenly. The findings reported in this paper not only provide a novel tomato peeling method with significant environmental benefits but also offer new clues to the peeling mechanism of hot lye. Copyright © 2018 Elsevier B.V. All rights reserved.

  8. Immunocytochemical and autoradiographic studies on the process of keratinization in avian epidermis suggests absence of keratohyalin.

    PubMed

    Alibardi, Lorenzo

    2004-02-01

    The process of keratinization in apteric avian epidermis and in scutate scales of some avian species has been studied by autoradiography for histidine and immunohistochemistry for keratins and other epidermal proteins. Acidic or basic alpha-keratins are present in the basal, spinosus, and transitional layers, but are not seen in the corneous layer. Keratinization-specific alpha-keratins (AE2-positive) are observed in the corneous layer of apteric epidermis but not in that of scutate scales, which contain mainly beta-keratin. Alpha-keratin bundles accumulate along the plasma membrane of transitional cells of apteric epidermis. In contrast to the situation in scutate scales, filaggrin-like, loricrin-like, and transglutaminase immunoreactivities are present in the transitional layer and in the lowermost part of the corneous layer of apteric epidermis. The lack of isopeptide bond immunoreactivity suggests that isopeptide bonds, if present, occur at undetectable levels in avian keratinocytes. Using immunogold ultrastructural immunocytochemistry, a low but localized loricrin-like and, to a lesser extent, filaggrin-like labeling is seen over round-oval granules or vesicles among keratin bundles of upper spinosus and transitional keratinocytes of apteric epidermis. Filaggrin- and loricrin-labeling are absent in the alpha-keratin bundles localized along the plasma membrane and in the corneous layer, formerly considered keratohyalin. Using ultrastructural autoradiography for tritiated histidine, occasional trace grains are seen among these alpha-keratin bundles. A different mechanism of redistribution of matrix and corneous cell envelope proteins probably operates in avian keratinocytes compared with that of mammals. Keratin bundles are compacted around the lipid core of apteric epidermis keratinocytes, which do not form the complex, chemico/mechanically resistant corneous cell envelopes found in mammalian keratinocytes. These observations suggest that low amounts of matrix proteins are present among keratin bundles of avian keratinocytes and that keratohyalin granules are absent. Copyright 2003 Wiley-Liss, Inc.

  9. Visible light driven mineralization of spiramycin over photostructured N-doped TiO2 on up conversion phosphors.

    PubMed

    Sacco, Olga; Vaiano, Vincenzo; Sannino, Diana; Ciambelli, Paolo

    2017-04-01

    A novel visible light-active photocatalyst formulation (NdT/OP) was obtained by supporting N-doped TiO2 (NdT) particles on up-conversion luminescent organic phosphors (OP). The photocatalytic activity of such catalysts was evaluated for the mineralization of spiramycin in aqueous solution. The effect of NdT loading in the range 15-60 wt.% on the bulk and surface characteristics of NdT/OP catalysts was investigated by several chemico-physical characterization techniques. The photocatalytic performance of NdT/OP catalysts in the removal of spiramycin from aqueous solution was assessed through photocatalytic tests under visible light irradiation. Total organic carbon (TOC) of the aqueous solution, and the CO and CO2 gas concentrations evolved during photodegradation, were analyzed. A dramatic enhancement of the photocatalytic activity of the photostructured visible-active NdT/OP catalysts, compared to the NdT catalyst, was observed. Only CO2 was detected in the gas phase during visible light irradiation, proving that the photocatalytic process is effective in the mineralization of spiramycin, reaching very high values of TOC removal. The NdT/OP photocatalyst at 30 wt.% NdT loading showed the highest photocatalytic activity (58% of TOC removed after 180 min of irradiation, against only 31% removal after 300 min of irradiation with NdT). We attribute this enhanced activity to the highly effective utilization of visible light through improved light harvesting. OP particles act as a "photoactive support", able to be excited by the external visible light irradiation and to re-emit luminescence at wavelengths suitable to promote the photomineralization activity of NdT. Copyright © 2016. Published by Elsevier B.V.

  10. Preparation and characterization of shape memory polymer scaffolds via solvent casting/particulate leaching.

    PubMed

    De Nardo, Luigi; Bertoldi, Serena; Cigada, Alberto; Tanzi, Maria Cristina; Haugen, Håvard Jostein; Farè, Silvia

    2012-09-27

    Porous Shape Memory Polymers (SMPs) are ideal candidates for the fabrication of defect fillers able to support tissue regeneration via minimally invasive approaches. In this regard, control of pore size, shape and interconnection is required to achieve adequate nutrient transport and cell ingrowth. Here, we assessed the feasibility of preparing SMP porous structures and characterized their chemico-physical properties and in vitro cell response. SMP scaffolds were obtained via solvent casting/particulate leaching of gelatin microspheres prepared via oil/water emulsion. A solution of a commercial polyether-urethane (MM-4520, Mitsubishi Heavy Industries) was cast on compacted microspheres, which were leached off after evaporation of the polymer solvent. The obtained structures were characterized in terms of morphology (SEM and micro-CT), thermo-mechanical properties (DMTA), shape recovery behavior in compression mode, and in vitro cytocompatibility (MG63 osteoblast-like cell line). The fabrication process enabled easy control of scaffold morphology, pore size, and pore shape by varying the gelatin microsphere morphology. Homogeneous spherical and interconnected pores were achieved together with preservation of the shape memory ability, with a recovery rate of up to 90%. Regardless of pore dimensions, MG63 cells were observed to adhere to and spread on the inner surfaces of the obtained scaffolds for up to seven days of static in vitro tests. A new class of SMP porous structures has been obtained and tested in vitro: according to these preliminary results, SMP scaffolds can be further exploited in the design of a new class of implantable devices.

  11. [Medicine and chemistry in the context of the Enlightenment: the thesis of the Lausanne physician Marc-Louis Vullyamoz].

    PubMed

    Terrier, Georges

    2003-01-01

    The chemico-medical essay "De sale lactis essentiali" is a thesis presented in Leyden in 1756 by a physician from Lausanne, M.-L. Vullyamoz, to obtain his medical degree. It shows that chemistry had become a university science connected with medicine and that combustion was explained at that time by G. E. Stahl's theory of phlogiston. It reminds us that the hypotheses of this German physician, which were part of the animistic doctrine, were widely adopted in Europe. In chemistry they began to fade in 1789 after the publication of Lavoisier's work on oxidation. Nevertheless, they contributed to establishing a modern science. In medicine, Stahl's animism evolved towards vitalism, which survived in several forms. Vullyamoz's thesis, which presents chemical experiments intended to analyse and promote a popular medicine, is a testimony to the spirit of the Enlightenment, which rejects dogmatism and tries to understand facts through observation and the use of reason.

  12. New strategy for drug discovery by large-scale association analysis of molecular networks of different species.

    PubMed

    Zhang, Bo; Fu, Yingxue; Huang, Chao; Zheng, Chunli; Wu, Ziyin; Zhang, Wenjuan; Yang, Xiaoyan; Gong, Fukai; Li, Yuerong; Chen, Xiaoyu; Gao, Shuo; Chen, Xuetong; Li, Yan; Lu, Aiping; Wang, Yonghua

    2016-02-25

    The development of modern omics technology has not significantly improved the efficiency of drug development; precise and targeted drug discovery remains an unsolved problem. Here, a large-scale cross-species molecular network association (CSMNA) approach for targeted drug screening from natural sources is presented. The algorithm integrates molecular network omics data from humans and 267 plants and microbes, establishing the biological relationships between them and extracting evolutionarily convergent chemicals. This technique allows the researcher to assess targeted drugs for specific human diseases based on specific plant or microbe pathways. In a prospective validation, connections between the plant Halliwell-Asada (HA) cycle and the human Nrf2-ARE pathway were verified, and the manner in which the HA cycle molecules act on the human Nrf2-ARE pathway as antioxidants was determined. This shows the potential applicability of this approach in drug discovery. The current method integrates disparate evolutionary species into chemico-biologically coherent circuits, suggesting a new cross-species omics analysis strategy for rational drug development.

  13. Integrating non-animal test information into an adaptive testing strategy - skin sensitization proof of concept case.

    PubMed

    Jaworska, Joanna; Harol, Artsiom; Kern, Petra S; Gerberick, G Frank

    2011-01-01

    There is an urgent need to develop data integration and testing strategy frameworks allowing interpretation of results from animal alternative test batteries. To this end, we developed a Bayesian Network Integrated Testing Strategy (BN ITS) with the goal to estimate skin sensitization hazard as a test case of previously developed concepts (Jaworska et al., 2010). The BN ITS combines in silico, in chemico, and in vitro data related to skin penetration, peptide reactivity, and dendritic cell activation, and guides testing strategy by Value of Information (VoI). The approach offers novel insights into testing strategies: there is no one best testing strategy, but the optimal sequence of tests depends on information at hand, and is chemical-specific. Thus, a single generic set of tests as a replacement strategy is unlikely to be most effective. BN ITS offers the possibility of evaluating the impact of generating additional data on the target information uncertainty reduction before testing is commenced.
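    The abstract does not reproduce the Bayesian network itself, and the published BN ITS combines a full network with Value of Information to select the next test. As a minimal sketch of the underlying idea only, the hypothetical example below performs a naive-Bayes update of the probability that a chemical is a sensitizer from two assay outcomes; the prior, sensitivities and specificities are illustrative assumptions, not values from the study.

        # Toy illustration of Bayesian evidence integration for skin sensitization.
        # The prior, sensitivities and specificities below are hypothetical,
        # not the values used in the published BN ITS.

        def posterior(prior, tests):
            """Naive-Bayes update: tests = [(result, sensitivity, specificity), ...]."""
            odds = prior / (1.0 - prior)
            for positive, sens, spec in tests:
                # Likelihood ratio of a positive (or negative) assay outcome
                lr = sens / (1.0 - spec) if positive else (1.0 - sens) / spec
                odds *= lr
            return odds / (1.0 + odds)

        prior = 0.5  # assume no prior information on the chemical
        evidence = [
            (True, 0.80, 0.85),   # DPRA positive (hypothetical assay performance)
            (True, 0.85, 0.80),   # h-CLAT positive (hypothetical assay performance)
        ]
        print(f"P(sensitizer | evidence) = {posterior(prior, evidence):.2f}")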

  14. Binders for Energetics - Modelling and Synthesis in Harmony

    NASA Astrophysics Data System (ADS)

    Dossi, Licia; Cleaver, Doug; Gould, Peter; Dunnett, Jim; Cavaye, Hamish; Ellison, Laurence; Luppi, Federico; Hollands, Ron; Bradley, Mark

    The Binders by Design UK programme develops new polymeric materials for energetic applications that can overcome problems related to the chemico-physical properties, aging, additives, environmental impact and performance of energetic compositions. Combined multi-scale modelling and experiment are used to develop a new modelling tool, with the aim of producing novel materials with greater confidence and faster turnaround. Newly synthesised binders with attractive properties for energetic applications are used to provide a high level of confidence in the results of the developed models. Molecular dynamics simulations investigate the thermal behaviour, and the results feed directly into a Group Interaction Model (GIM). A viscoelastic constitutive model has been developed to examine stress development in energetic/binder configurations. GIM data have been used as the basis for developing hydrocode equations of state, which are then applied in run-to-detonation investigations to examine the effect of the shock properties of a binder on the reactivity of a typical Polymer Bonded Explosive in a high-velocity impact scenario. The Binders by Design UK programme is funded through the Weapons Science and Technology Centre by DSTL.

  15. Biomolecular imaging of 13C-butyrate with dissolution-DNP: Polarization enhancement and formulation for in vivo studies

    NASA Astrophysics Data System (ADS)

    Flori, Alessandra; Giovannetti, Giulio; Santarelli, Maria Filomena; Aquaro, Giovanni Donato; De Marchi, Daniele; Burchielli, Silvia; Frijia, Francesca; Positano, Vincenzo; Landini, Luigi; Menichetti, Luca

    2018-06-01

    Magnetic Resonance Spectroscopy of hyperpolarized, isotopically enriched molecules facilitates the non-invasive, real-time investigation of in vivo tissue metabolism within a time-frame of a few minutes; this opens up a new avenue in the development of biomolecular probes. Dissolution Dynamic Nuclear Polarization is a hyperpolarization technique yielding a more than four orders of magnitude increase in 13C polarization for in vivo Magnetic Resonance Spectroscopy studies. As reported in several studies, the polarization performance of dissolution Dynamic Nuclear Polarization relies on the chemico-physical properties of the sample. In this study, we describe and quantify the effects of the different sample components on the dissolution Dynamic Nuclear Polarization performance of [1-13C]butyrate. In particular, we focus on the polarization enhancement provided by the incremental addition of the glassy agent dimethyl sulfoxide and of a gadolinium chelate to the formulation. Finally, preliminary results obtained after injection in healthy rats are also reported, showing the feasibility of an in vivo Magnetic Resonance Spectroscopy study with hyperpolarized [1-13C]butyrate using a 3T clinical set-up.
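    The "more than four orders of magnitude" figure can be put in perspective against the thermal-equilibrium 13C polarization given by the Boltzmann distribution. The sketch below is a rough illustration only: the hyperpolarized level of 30% is an assumed, typical-order value, not the study's measured result.

        import math

        # Thermal-equilibrium 13C polarization, P = tanh(gamma * hbar * B0 / (2 * k_B * T)),
        # and the resulting enhancement for an assumed hyperpolarized level.
        gamma_13c = 67.2828e6        # rad s^-1 T^-1
        hbar = 1.054571817e-34       # J s
        k_B = 1.380649e-23           # J K^-1

        B0, T = 3.0, 298.0           # clinical field strength (T) and room temperature (K)
        p_thermal = math.tanh(gamma_13c * hbar * B0 / (2 * k_B * T))

        p_hyper = 0.30               # assumed hyperpolarized 13C polarization (illustrative)
        print(f"Thermal 13C polarization at 3 T: {p_thermal:.2e}")
        print(f"Enhancement factor: {p_hyper / p_thermal:.1e}")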

  16. The first investigative science-based evidence of Morgellons psychogenesis.

    PubMed

    Roncati, Luca; Gatti, Antonietta Morena; Pusiol, Teresa; Piscioli, Francesco; Barbolini, Giuseppe; Maiorana, Antonio

    2016-01-01

    Morgellons disease is an infrequent syndromic condition that typically affects middle-aged white women and is characterized by crawling sensations on and under the skin, associated with itchy rashes, stinging sores, fiber-like filaments emerging from the sores, severe fatigue, difficulty concentrating, and memory loss. The scientific community is prone to believe that Morgellons is the manifestation of various psychiatric syndromes (Munchausen, Munchausen by proxy, Ekbom, Wittmaack-Ekbom). Up until now, no investigative science-based evidence about its psychogenesis had been provided. In order to close this gap, we analyzed the filaments extracted from the skin lesions of a 49-year-old Caucasian female patient using a Field Emission Gun-Environmental Scanning Electron Microscope equipped with an X-ray microprobe, for the chemico-elemental characterization of the filaments, comparing them with those collected during a detailed indoor investigation, with careful air monitoring, in her apartment. Our results prove the self-introduction of environmental filaments under the epidermis. For the first time in the literature, we have scientifically demonstrated the self-induced nature of Morgellons disease, thereby dismissing fanciful theories about its etiopathogenesis.

  17. Carbon-based hybrid nanogels: a synergistic nanoplatform for combined biosensing, bioimaging, and responsive drug delivery.

    PubMed

    Wang, Hui; Chen, Qianwang; Zhou, Shuiqin

    2018-06-05

    Nanosized crosslinked polymer networks, termed nanogels, are playing an increasingly important role in a diverse range of applications by virtue of their porous structures, large surface area, good biocompatibility and responsiveness to internal and/or external chemico-physical stimuli. Recently, a variety of carbon nanomaterials, such as carbon quantum dots, graphene/graphene oxide nanosheets, fullerenes, carbon nanotubes, and nanodiamonds, have been embedded into responsive polymer nanogels in order to integrate the unique electro-optical properties of carbon nanomaterials with the merits of nanogels into a single hybrid nanogel system and so improve their applications in nanomedicine. A large number of studies have explored the applications of carbon-based hybrid nanogels in biomedical areas for biosensing, bioimaging, and smart drug carriers with combinatorial therapies and/or theranostic ability. New synthetic methods and structures have been developed to prepare carbon-based hybrid nanogels with versatile properties and functions. In this review, we summarize the latest developments and applications and address the future perspectives of these carbon-based hybrid nanogels in the biomedical field.

  18. Parallel computational and experimental studies of the morphological modification of calcium carbonate by cobalt

    NASA Astrophysics Data System (ADS)

    Braybrook, A. L.; Heywood, B. R.; Jackson, R. A.; Pitt, K.

    2002-08-01

    Crystal growth can be controlled by the incorporation of dopant ions into the lattice and yet the question of how such substituents affect the morphology has not been addressed. This paper describes the forms of calcite (CaCO 3) which arise when the growth assay is doped with cobalt. Distinct and specific morphological changes are observed; the calcite crystals adopt a morphology which is dominated by the {01.1} family of faces. These experimental studies paralleled the development of computational methods for the analysis of crystal habit as a function of dopant concentration. In this case, the predicted defect morphology also argued for the dominance of the (01.1) face in the growth form. The appearance of this face was related to the preferential segregation of the dopant ions to the crystal surface. This study confirms the evolution of a robust computational model for the analysis of calcite growth forms under a range of environmental conditions and presages the use of such tools for the predictive development of crystal morphologies in those applications where chemico-physical functionality is linked closely to a specific crystallographic form.

  19. Integrated decision strategies for skin sensitization hazard.

    PubMed

    Strickland, Judy; Zang, Qingda; Kleinstreuer, Nicole; Paris, Michael; Lehmann, David M; Choksi, Neepa; Matheson, Joanna; Jacobs, Abigail; Lowit, Anna; Allen, David; Casey, Warren

    2016-09-01

    One of the top priorities of the Interagency Coordinating Committee for the Validation of Alternative Methods (ICCVAM) is the identification and evaluation of non-animal alternatives for skin sensitization testing. Although skin sensitization is a complex process, the key biological events of the process have been well characterized in an adverse outcome pathway (AOP) proposed by the Organisation for Economic Co-operation and Development (OECD). Accordingly, ICCVAM is working to develop integrated decision strategies based on the AOP using in vitro, in chemico and in silico information. Data were compiled for 120 substances tested in the murine local lymph node assay (LLNA), direct peptide reactivity assay (DPRA), human cell line activation test (h-CLAT) and KeratinoSens assay. Data for six physicochemical properties, which may affect skin penetration, were also collected, and skin sensitization read-across predictions were performed using OECD QSAR Toolbox. All data were combined into a variety of potential integrated decision strategies to predict LLNA outcomes using a training set of 94 substances and an external test set of 26 substances. Fifty-four models were built using multiple combinations of machine learning approaches and predictor variables. The seven models with the highest accuracy (89-96% for the test set and 96-99% for the training set) for predicting LLNA outcomes used a support vector machine (SVM) approach with different combinations of predictor variables. The performance statistics of the SVM models were higher than any of the non-animal tests alone and higher than simple test battery approaches using these methods. These data suggest that computational approaches are promising tools to effectively integrate data sources to identify potential skin sensitizers without animal testing. Published 2016. This article has been contributed to by US Government employees and their work is in the public domain in the USA.
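    The abstract does not specify the feature encoding or SVM hyperparameters used. As a minimal, hypothetical sketch of the general approach, the example below combines binary assay calls with physicochemical descriptors in a scikit-learn SVM classifier, using synthetic stand-in data rather than the actual 120-substance dataset.

        import numpy as np
        from sklearn.pipeline import make_pipeline
        from sklearn.preprocessing import StandardScaler
        from sklearn.svm import SVC
        from sklearn.model_selection import cross_val_score

        rng = np.random.default_rng(0)

        # Synthetic stand-in data: 120 substances, three binary assay calls
        # (e.g. DPRA, h-CLAT, KeratinoSens) plus two physicochemical descriptors.
        # Labels mimic LLNA outcomes and are generated, not measured.
        n = 120
        assays = rng.integers(0, 2, size=(n, 3))
        physchem = rng.normal(size=(n, 2))
        X = np.hstack([assays, physchem])
        y = (assays.sum(axis=1) + rng.normal(scale=0.8, size=n) > 1.5).astype(int)

        # RBF-kernel SVM as one plausible configuration (hyperparameters are guesses)
        model = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0, gamma="scale"))
        scores = cross_val_score(model, X, y, cv=5, scoring="accuracy")
        print(f"Cross-validated accuracy: {scores.mean():.2f} +/- {scores.std():.2f}")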

  20. Climate change driven plant-metal-microbe interactions.

    PubMed

    Rajkumar, Mani; Prasad, Majeti Narasimha Vara; Swaminathan, Sandhya; Freitas, Helena

    2013-03-01

    Various biotic and abiotic stress factors affect the growth and productivity of crop plants. In particular, climatic and/or heavy metal stress influences various processes including growth, physiology, biochemistry, and yield of crops. Climatic changes, particularly elevated atmospheric CO₂, enhance biomass production and metal accumulation in plants and help plants to support greater microbial populations and/or protect the microorganisms against the impacts of heavy metals. Besides, the indirect effects of climatic change (e.g., changes in the function and structure of plant roots and in the diversity and activity of rhizosphere microbes) would lead to altered metal bioavailability in soils and concomitantly affect plant growth. However, the effects of warming, drought or combined climatic stress on plant growth and metal accumulation vary substantially with the physico-chemico-biological properties of the environment (e.g., soil pH, heavy metal type and its bioavailable concentration, microbial diversity, and interactive effects of climatic factors) and the plant species used. Overall, direct and/or indirect effects of climate change on heavy metal mobility in soils may further hinder the ability of plants to adapt and make them more susceptible to stress. Here, we review and discuss how climatic parameters including atmospheric CO₂, temperature and drought influence plant-metal interactions in polluted soils. Other aspects, including the effects of climate change and heavy metals on plant-microbe interactions, heavy metal phytoremediation and the safety of food and feed, are also discussed. This review shows that predicting how plant-metal interactions respond to climate change is critical to selecting suitable crop plants able to produce higher yields and tolerate multi-stress conditions without accumulating toxic heavy metals, for future food security. Copyright © 2012 Elsevier Ltd. All rights reserved.

  1. Nano/micro hybrid scaffold of PCL or P3HB nanofibers combined with silk fibroin for tendon and ligament tissue engineering.

    PubMed

    Naghashzargar, Elham; Farè, Silvia; Catto, Valentina; Bertoldi, Serena; Semnani, Dariush; Karbasi, Saeed; Tanzi, Maria Cristina

    2015-07-04

    A novel biodegradable nano/micro hybrid structure was obtained by electrospinning P3HB or PCL nanofibers onto a twisted silk fibroin (SF) structure, with the aim of fabricating a suitable scaffold for tendon and ligament tissue engineering. The electrospinning (ES) processing parameters for P3HB and PCL were optimized on 2D samples and applied to produce two different nano/micro hybrid constructs (SF/ES-PCL and SF/ES-P3HB). Morphological, chemico-physical and mechanical properties of the novel hybrid scaffolds were evaluated by SEM, ATR FT-IR, DSC, tensile and dynamic mechanical tests. The results demonstrated that the nanofibers were tightly wrapped around the silk filaments, and that the crystallinity of the SF twisted yarns was not influenced by the presence of the electrospun polymers. The slightly higher mechanical properties of the hybrid constructs confirmed an increase of internal forces due to the interaction between the nano and micro components. Cell culture tests with L929 fibroblasts, in the presence of the sample eluates or in direct contact with the hybrid structures, showed no cytotoxic effects and a good level of cytocompatibility of the nano/micro hybrid structures in terms of cell viability, particularly at day 1. Cell viability on the nano/micro hybrid structures decreased from the first to the third day of culture when compared with the control culture plastic, but appeared to be higher when compared with the uncoated SF yarns. Although additional in vitro and in vivo tests are needed, the original fabrication method described here appears promising for scaffolds suitable for tendon and ligament tissue engineering.

  2. Chemical composition of igneous rocks expressed by means of diagrams, with reference to rock classification on a quantitative chemico-mineralogical basis

    USGS Publications Warehouse

    Iddings, J.P.

    1903-01-01

    The value of graphical methods for expressing relative quantities has been well established in all kinds of statistical exposition and discussion. Their use in conveying definite conceptions of relative quantities of chemical and mineral components of rocks is becoming more and more frequent, and the value of the results in some cases can not be overestimated. This is especially true when a series or group of rocks is being considered. The intricate variations in the amounts of numerous mineral components, or of chemical components, baffle most attempts to comprehend their interrelationships by simple contemplation or by study of the numbers in which they may be expressed. Many facts and relations are overlooked which are readily observed when diagrams are used to represent numerical figures. Moreover, visual memory is sufficiently developed in most persons to enable them to carry in mind simple geometrical forms, where it does not permit them to recollect manifold assemblages of oft-repeated numbers. Mental impressions of simple diagrams are, therefore, more definite and lasting and enable the student to store up a much greater amount of quantitative data than he could otherwise acquire.

  3. Magnesium- and strontium-co-substituted hydroxyapatite: the effects of doped-ions on the structure and chemico-physical properties.

    PubMed

    Aina, Valentina; Lusvardi, Gigliola; Annaz, Basil; Gibson, Iain R; Imrie, Flora E; Malavasi, Gianluca; Menabue, Ledi; Cerrato, Giuseppina; Martra, Gianmario

    2012-12-01

    The present study is aimed at investigating the contribution of two biologically important cations, Mg(2+) and Sr(2+), when substituted into the structure of hydroxyapatite (Ca(10)(PO(4))(6)(OH)(2),HA). The substituted samples were synthesized by an aqueous precipitation method that involved the addition of Mg(2+)- and Sr(2+)-containing precursors to partially replace Ca(2+) ions in the apatite structure. Eight substituted HA samples with different concentrations of single (only Mg(2+)) or combined (Mg(2+) and Sr(2+)) substitution of cations have been investigated and the results compared with those of pure HA. The obtained materials were characterized by X-ray powder diffraction, specific surface area and porosity measurements (N(2) adsorption at 77 K), FT-IR and Raman spectroscopies and scanning electron microscopy. The results indicate that the co-substitution gives rise to the formation of HA and β-TCP structure types, with a variation of their cell parameters and of the crystallinity degree of HA with varying levels of substitution. An evaluation of the amount of substituents allows us to design and prepare BCP composite materials with a desired HA/β-TCP ratio.

  4. Monohalogenated acetamide-induced cellular stress and genotoxicity are related to electrophilic softness and thiol/thiolate reactivity.

    PubMed

    Pals, Justin A; Wagner, Elizabeth D; Plewa, Michael J; Xia, Menghang; Attene-Ramos, Matias S

    2017-08-01

    Haloacetamides (HAMs) are cytotoxic, genotoxic, and mutagenic byproducts of drinking water disinfection. They are soft electrophilic compounds that form covalent bonds with the free thiol/thiolate in cysteine residues through an SN2 reaction mechanism. Toxicity of the monohalogenated HAMs (iodoacetamide, IAM; bromoacetamide, BAM; or chloroacetamide, CAM) varied depending on the halogen substituent. The aim of this research was to investigate how the halogen atom affects the reactivity and toxicological properties of HAMs, measured as induction of oxidative/electrophilic stress response and genotoxicity. Additionally, we wanted to determine how well in silico estimates of electrophilic softness matched thiol/thiolate reactivity and in vitro toxicological endpoints. Each of the HAMs significantly induced nuclear Rad51 accumulation and ARE signaling activity compared to a negative control. The rank order of effect was IAM>BAM>CAM for Rad51, and BAM≈IAM>CAM for ARE. In general, electrophilic softness and in chemico thiol/thiolate reactivity provided a qualitative indicator of toxicity, as the softer electrophiles IAM and BAM were more thiol/thiolate reactive and were more toxic than CAM. Copyright © 2017. Published by Elsevier B.V.
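    Electrophilic softness of this kind is commonly estimated from frontier orbital energies in conceptual DFT, with global hardness taken as roughly half the HOMO-LUMO gap and softness as its reciprocal (conventions differ by a factor of 2). The toy sketch below uses hypothetical orbital energies chosen only to reproduce the qualitative IAM > BAM > CAM softness ordering; they are not computed values for these compounds.

        # Toy estimate of global hardness/softness from frontier orbital energies
        # (conceptual DFT; conventions differ by a factor of 2). Orbital energies
        # below are hypothetical placeholders, not computed values for the HAMs.
        compounds = {
            "IAM": (-7.1, -1.3),   # (E_HOMO, E_LUMO) in eV, illustrative only
            "BAM": (-7.4, -1.1),
            "CAM": (-7.8, -0.8),
        }

        for name, (e_homo, e_lumo) in compounds.items():
            hardness = (e_lumo - e_homo) / 2.0   # eta ~ (I - A)/2
            softness = 1.0 / (2.0 * hardness)    # S = 1/(2*eta)
            print(f"{name}: hardness = {hardness:.2f} eV, softness = {softness:.3f} 1/eV")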

  5. Pyrene maleimide as a probe of microenvironmental and dynamics properties of protein binding sites

    NASA Astrophysics Data System (ADS)

    Benci, S.; Vaccari, S.; Schianchi, G.; Locatelli, Donata; Vaghi, P.; Bottiroli, Giovanni F.

    1995-01-01

    N-(1-Pyrene)maleimide is highly fluorescent upon covalent binding to sulfhydryl and amino groups of proteins. Multiexponential fluorescence decays were observed for the dye bound to different proteins, even when a single binding site is involved. The lack of information about the fluorescence decay of the free dye does not allow one to define the variations of the fluorescence parameters following conjugation and their correlation with the binding properties of the fluorophore. In this work, a study of the fluorescence of the probe has been performed: free in solution, bound to different antibodies, and bound to the antigen-antibody complex both in solution and in cells. The experimental results showed that the chemico-physical properties of the medium influence the fluorescence decay of the probe in both the free and bound forms, although to different extents. The variations of fluorescence decay and anisotropy of the bound probe are related to the electronic characteristics of the microenvironment and show an increased stabilization of the probe binding site with increasing complexity of the substrate. The sensitivity of the fluorescence properties of the probe to the binding site environment opens interesting perspectives concerning the application of Py-maleimide fluorochromization to assess the degree of specificity of immunocytochemical labelling.
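    Multiexponential decays of this kind are typically analyzed by fitting a sum of exponentials to the time-resolved signal. The sketch below fits a bi-exponential model to synthetic data with scipy; the amplitudes and lifetimes are illustrative assumptions, not measured values from this work.

        import numpy as np
        from scipy.optimize import curve_fit

        # Bi-exponential decay model, I(t) = a1*exp(-t/tau1) + a2*exp(-t/tau2)
        def biexp(t, a1, tau1, a2, tau2):
            return a1 * np.exp(-t / tau1) + a2 * np.exp(-t / tau2)

        # Synthetic decay with two lifetimes (illustrative values, in ns)
        t = np.linspace(0, 200, 400)
        rng = np.random.default_rng(1)
        signal = biexp(t, 0.7, 12.0, 0.3, 60.0) + rng.normal(scale=0.01, size=t.size)

        popt, _ = curve_fit(biexp, t, signal, p0=(0.5, 10.0, 0.5, 50.0))
        a1, tau1, a2, tau2 = popt
        # Amplitude-weighted mean lifetime, a common summary of multiexponential decays
        tau_mean = (a1 * tau1 + a2 * tau2) / (a1 + a2)
        print(f"tau1 = {tau1:.1f} ns, tau2 = {tau2:.1f} ns, <tau> = {tau_mean:.1f} ns")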

  6. Exploiting novel sterilization techniques for porous polyurethane scaffolds.

    PubMed

    Bertoldi, Serena; Farè, Silvia; Haugen, Håvard Jostein; Tanzi, Maria Cristina

    2015-05-01

    Porous polyurethane (PU) structures raise increasing interest as scaffolds in tissue engineering applications. Understanding the effects of sterilization on their properties is mandatory to assess their potential use in clinical practice. The aim of this work is the evaluation of the effects of two innovative sterilization techniques (i.e. plasma, Sterrad(®) system, and ozone) on the morphological, chemico-physical and mechanical properties of a PU foam synthesized by gas foaming, using water as the expanding agent. In addition, possible toxic effects of the sterilization were evaluated by in vitro cytotoxicity tests. Plasma sterilization did not affect the morphological and mechanical properties of the PU foam, but caused degradative phenomena to some extent, as detected by infrared spectroscopy. Ozone sterilization had a major effect on foam morphology, causing the formation of new small pores and stronger degradation and oxidation of the structure of the material. These modifications also affected the mechanical properties of the sterilized PU foam. Nevertheless, no cytotoxic effects were observed after either plasma or ozone sterilization, as confirmed by the good values of cell viability assessed by the Alamar Blue assay. The results obtained here can help in understanding the effects of sterilization procedures on porous polymeric scaffolds, and how the scaffold morphology, in particular porosity, can influence the effects of sterilization, and vice versa.

  7. Alternatives to In Vivo Draize Rabbit Eye and Skin Irritation Tests with a Focus on 3D Reconstructed Human Cornea-Like Epithelium and Epidermis Models

    PubMed Central

    Lee, Miri; Hwang, Jee-Hyun; Lim, Kyung-Min

    2017-01-01

    Human eyes and skin are frequently exposed to chemicals, accidentally or on purpose, due to their external location. Therefore, chemicals are required to undergo evaluation of ocular and dermal irritancy for their safe handling and use before release onto the market. The Draize rabbit eye and skin irritation tests, developed in 1944, have been the gold-standard tests, adopted as OECD TG 404 and OECD TG 405, but they have been criticized with respect to animal welfare due to their invasive and cruel procedures. To replace them, diverse alternatives have been developed: (i) for the Draize eye irritation test, organotypic assays, in vitro cytotoxicity-based methods, in chemico tests, in silico prediction models, and 3D reconstructed human cornea-like epithelium (RhCE); (ii) for the Draize skin irritation test, in vitro cytotoxicity-based cell models and 3D reconstructed human epidermis models (RhE). Of these, RhCE and RhE models are in the spotlight as promising alternatives with a wide applicability domain covering cosmetics and personal care products. In this review, we give an overview of the current alternatives to the Draize test, with a focus on 3D human epithelium models, to provide an insight into advancing and widening their utility. PMID:28744350

  8. Morphology and Histochemistry of the Glandular Trichomes of Lippia scaberrima (Verbenaceae)

    PubMed Central

    Combrinck, S.; Du Plooy, G. W.; McCrindle, R. I.; Botha, B. M.

    2007-01-01

    Background and Aims Lippia scaberrima, an aromatic indigenous South African plant with medicinal applications, potentially has economic value. The production of essential oil from this plant has not been optimized, and this study of its chemico-morphological characteristics was aimed at determining the location of oil production within the plant. Furthermore, the location of other secondary metabolites important in medicinal applications needed to be ascertained. This information would be useful in deciding the protocol required for the isolation of such compounds. Methods The morphology of the glandular trichomes was investigated using a combination of scanning electron and light microscopy. Concurrently, the chemical content was studied by applying various chemical reagents and fluorescence microscopy. Key Results Three types of trichomes were distinguished on the material investigated. Large, bulbous peltate glands containing compounds of a terpenoid nature are probably the main site of essential oil accumulation. Small glands were found to be both peltate and capitate, and fluorescent staining indicated the possible presence of phenolic compounds. The third type was a slender tapered seta with an ornamented surface and uniseriate base, evidently secretory in nature. Conclusions This study linking the chemical content and morphology of the glandular trichomes of L. scaberrima has contributed to the knowledge and understanding of the secretory structures of Lippia spp. in general. PMID:17468110

  9. Concentration Dependent Ion-Protein Interaction Patterns Underlying Protein Oligomerization Behaviours

    NASA Astrophysics Data System (ADS)

    Batoulis, Helena; Schmidt, Thomas H.; Weber, Pascal; Schloetel, Jan-Gero; Kandt, Christian; Lang, Thorsten

    2016-04-01

    Salts and proteins comprise two of the basic molecular components of biological materials. Kosmotropic/chaotropic co-solvation and matching ion-water affinities explain basic ionic effects on protein aggregation observed in simple solutions. However, it is unclear how these theories apply to proteins in complex biological environments and what the underlying ionic binding patterns are. Using the positive ion Ca2+ and the negatively charged membrane protein SNAP25, we studied ion effects on protein oligomerization in solution, in native membranes and in molecular dynamics (MD) simulations. We find that concentration-dependent, ion-induced protein oligomerization is a fundamental chemico-physical principle that applies not only to soluble but also to membrane-anchored proteins in their native environment. Oligomerization is driven by the interaction of Ca2+ ions with the carboxylate groups of aspartate and glutamate. From low to intermediate concentrations, salt bridges between Ca2+ ions and two or more protein residues lead to increasingly larger oligomers, while at high concentrations oligomers disperse due to overcharging effects. These insights provide a conceptual framework at the interface of physics, chemistry and biology to explain the binding of ions to charged protein surfaces on an atomistic scale, as occurring during protein solubilisation, aggregation and oligomerization both in simple solutions and in membrane systems.

  10. Identification of Chemical Features Linked to Thyroperoxidase ...

    EPA Pesticide Factsheets

    Disruption of maternal serum thyroid hormone (TH) adversely affects fetal neurodevelopment. Therefore, assay development within the US EPA ToxCast program is ongoing to enable screening for chemicals that may disrupt TH, in support of the Endocrine Disruptor Screening Program for the 21st Century (EDSP21). The AUR-TPO assay was recently developed to screen >1,000 ToxCast chemicals for potential thyroperoxidase (TPO) inhibition activity. TPO is critical for TH synthesis and is a known target of thyroid-disrupting chemicals. The bioactivity results from the AUR-TPO assay were used to identify chemical substructures associated with in vitro TPO inhibition. Substructure profiles were generated for each chemical in the ToxCast test set using the publicly available ToxPrint 2.0 chemotypes. Chemotypes enriched among the putative TPO inhibitors were identified using a cumulative hypergeometric probability (p < 0.01). Of the 729 chemotypes evaluated, 31 were overrepresented among TPO inhibitors. Examination of those 31 chemotypes revealed four basic pharmacophores that accounted for 70% of the ToxCast chemicals active in the AUR-TPO assay: aromatic alcohols, aromatic amines, thiocarbonyls and phosphorothioates. Chemico-structural analysis of AUR-TPO screening results enabled the identification of chemical features that likely drive TPO inhibition in the AUR-TPO assay. This highlights the potential to identify thyroid-disrupting chemicals in silico using structural alerts identified by
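    The chemotype enrichment described corresponds to a one-sided (upper-tail) hypergeometric test of overlap between assay actives and chemotype carriers. The sketch below illustrates that calculation with scipy, using made-up counts rather than the actual ToxCast numbers.

        from scipy.stats import hypergeom

        # Hypothetical counts, for illustration only (not the ToxCast numbers):
        M = 1000   # chemicals screened and profiled
        n = 40     # chemicals carrying a given ToxPrint chemotype
        N = 150    # chemicals active in the AUR-TPO assay (putative TPO inhibitors)
        k = 15     # actives that also carry the chemotype

        # Cumulative (upper-tail) hypergeometric probability of observing >= k overlaps
        p = hypergeom.sf(k - 1, M, n, N)
        print(f"Enrichment p-value: {p:.2e}")  # flag the chemotype if p < 0.01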

  11. Local tolerance testing under REACH: Accepted non-animal methods are not on equal footing with animal tests.

    PubMed

    Sauer, Ursula G; Hill, Erin H; Curren, Rodger D; Raabe, Hans A; Kolle, Susanne N; Teubner, Wera; Mehling, Annette; Landsiedel, Robert

    2016-07-01

    In general, no single non-animal method can cover the complexity of any given animal test. Therefore, fixed sets of in vitro (and in chemico) methods have been combined into testing strategies for skin and eye irritation and skin sensitisation testing, with pre-defined prediction models for substance classification. Many of these methods have been adopted as OECD test guidelines. Various testing strategies have been successfully validated in extensive in-house and inter-laboratory studies, but they have not yet received formal acceptance for substance classification. Therefore, under the European REACH Regulation, data from testing strategies can, in general, only be used in so-called weight-of-evidence approaches. While animal testing data generated under the specific REACH information requirements are per se sufficient, the sufficiency of weight-of-evidence approaches can be questioned under the REACH system, and further animal testing can be required. This constitutes an imbalance between the regulatory acceptance of data from approved non-animal methods and animal tests that is not justified on scientific grounds. To ensure that testing strategies for local tolerance testing truly serve to replace animal testing for the REACH registration 2018 deadline (when the majority of existing chemicals have to be registered), clarity on their regulatory acceptance as complete replacements is urgently required. 2016 FRAME.

  12. Rational design of Ag/TiO2 nanosystems by a combined RF-sputtering/sol-gel approach.

    PubMed

    Armelao, Lidia; Barreca, Davide; Bottaro, Gregorio; Gasparotto, Alberto; Maccato, Chiara; Tondello, Eugenio; Lebedev, Oleg I; Turner, Stuart; Van Tendeloo, Gustaaf; Sada, Cinzia; Stangar, Urska Lavrencic

    2009-12-21

    The present work is devoted to the preparation of Ag/TiO(2) nanosystems by an original synthetic strategy based on the radio-frequency (RF) sputtering of silver particles on titania-based xerogels prepared by the sol-gel (SG) route. This approach takes advantage of the synergy between the microporous xerogel structure and the infiltration power characteristic of RF-sputtering, whose combination enables a tailored dispersion of Ag-containing particles in the titania matrix. In addition, the chemico-physical features of the system can be further tuned through proper ex situ thermal treatments in air at 400 and 600 degrees C. The synthesized composites are extensively characterized by the joint use of complementary techniques, namely X-ray photoelectron and X-ray excited Auger electron spectroscopies (XPS, XE-AES), secondary ion mass spectrometry (SIMS), glancing incidence X-ray diffraction (GIXRD), field emission scanning electron microscopy (FE-SEM), transmission electron microscopy (TEM), electron diffraction (ED), high-angle annular dark field scanning TEM (HAADF-STEM), energy-filtered TEM (EF-TEM) and optical absorption spectroscopy. Finally, the photocatalytic performance of selected samples in the decomposition of the azo-dye Plasmocorinth B is preliminarily investigated. The obtained results highlight the possibility of tailoring the system characteristics over a broad range, directly influencing their eventual functional properties.

  13. Polyurethane foam/nano hydroxyapatite composite as a suitable scaffold for bone tissue regeneration.

    PubMed

    Meskinfam, M; Bertoldi, S; Albanese, N; Cerri, A; Tanzi, M C; Imani, R; Baheiraei, N; Farokhi, M; Farè, S

    2018-01-01

    In bone tissue regeneration, the use of biomineralized scaffolds to create the 3D porous structure needed to fit the defect size and support appropriate cell interactions is a promising alternative to autologous and heterologous bone grafts. Biomineralized polyurethane (PU) foams are investigated here as scaffolds for bone tissue regeneration. Biomineralization of the foams was carried out by activation of the PU surface using a two-step procedure performed for different times (1 to 4 weeks). The scaffolds were investigated for morphological, chemico-physical and mechanical properties, as well as for in vitro interaction with rat Bone Marrow Mesenchymal Stem Cells (BMSCs). Untreated and biomineralized PU samples showed a homogenous morphology and regular pore size (average Ø = 407 μm). The phase and structure of the calcium phosphate (CaP) layer formed on the PU foam were analyzed by Fourier Transform Infrared spectroscopy and X-ray diffraction, proving the formation of bone-like nano-hydroxyapatite. Biomineralization caused a significant increase in the mechanical properties of treated foams compared to untreated ones. Biomineralization also improved the cytocompatibility of the PU scaffold, providing a more appropriate surface for cell attachment and proliferation. Considering the obtained results, the proposed scaffold can be considered suitable for bone tissue regeneration. Copyright © 2017 Elsevier B.V. All rights reserved.

  14. Saporin-S6: A Useful Tool in Cancer Therapy

    PubMed Central

    Polito, Letizia; Bortolotti, Massimo; Mercatelli, Daniele; Battelli, Maria Giulia; Bolognesi, Andrea

    2013-01-01

    Thirty years ago, the type 1 ribosome-inactivating protein (RIP) saporin-S6 (also known as saporin) was isolated from Saponaria officinalis L. seeds. Since then, the properties and mechanisms of action of saporin-S6 have been well characterized, and it has been widely employed in the construction of conjugates and immunotoxins for different purposes. These immunotoxins have shown many interesting results when used in cancer therapy, particularly in hematological tumors. The high enzymatic activity, stability and resistance to conjugation procedures and blood proteases make saporin-S6 a very useful tool in cancer therapy. High efficacy has been reported in clinical trials with saporin-S6-containing immunotoxins, at dosages that induced only mild and transient side effects, which were mainly fever, myalgias, hepatotoxicity, thrombocytopenia and vascular leak syndrome. Moreover, saporin-S6 triggers multiple cell death pathways, rendering impossible the selection of RIP-resistant mutants. In this review, some aspects of saporin-S6, such as the chemico-physical characteristics, the structural properties, its endocytosis, its intracellular routing and the pathogenetic mechanisms of the cell damage, are reported. In addition, the recent progress and developments of saporin-S6-containing immunotoxins in cancer immunotherapy are summarized, including in vitro and in vivo pre-clinical studies and clinical trials. PMID:24105401

  15. Biogenic amines in dry fermented sausages: a review.

    PubMed

    Suzzi, Giovanna; Gardini, Fausto

    2003-11-15

    Biogenic amines are compounds commonly present in living organisms, in which they are responsible for many essential functions. They can be naturally present in many foods such as fruits and vegetables, meat, fish, chocolate and milk, but they can also be produced in high amounts by microorganisms through the activity of amino acid decarboxylases. Excessive consumption of these amines can be of health concern because an unbalanced intake can generate various disorders through their action on the nervous, gastric and intestinal systems and on blood pressure. The high microbial counts that characterise fermented foods often unavoidably lead to considerable accumulation of biogenic amines, especially tyramine, 2-phenylethylamine, tryptamine, cadaverine, putrescine and histamine. However, great fluctuations in amine content are reported within the same type of product. These differences depend on many variables: the quali-quantitative composition of the microbial flora, the chemico-physical variables, the hygienic procedures adopted during production, and the availability of precursors. Dry fermented sausages are fermented meat products diffused worldwide that can be a source of biogenic amines. Even in the absence of specific rules and regulations regarding the presence of these compounds in sausages and other fermented products, increasing attention is being given to biogenic amines, especially in relation to the growing number of consumers with enhanced sensitivity to biogenic amines caused by the inhibition of amino oxidases, the enzymes involved in the detoxification of these substances. The aim of this paper is to give an overview of the presence of these compounds in dry fermented sausages and to discuss the most important factors influencing their accumulation. These include process and implicit factors, as well as the role of starter and non-starter microflora growing in the different steps of sausage production. Moreover, the role of microorganisms with amino oxidase activity as starter cultures to control or reduce the accumulation of biogenic amines during ripening and storage of sausages is discussed.

  16. Decontamination and functional reclamation of dredged brackish sediments.

    PubMed

    Doni, S; Macci, C; Peruzzi, E; Iannelli, R; Ceccanti, B; Masciandaro, G

    2013-07-01

    The continuous stream of sediments dredged from harbors and waterways to keep shipping traffic efficient is a considerable ongoing problem recognized worldwide. The problem is made worse because most of the sediments dredged from commercial ports and waterways turn out to be polluted by a wide range of organic and inorganic contaminants. In this study, phytoremediation was explored as a sustainable reclamation technology for turning slightly polluted brackish dredged sediments into a matrix feasible for productive use. To test this possibility, a phytoremediation experiment was carried out in containers of about 0.7 m(3) each, filled with brackish dredged sediments contaminated by heavy metals and hydrocarbons. The sediments were pre-conditioned by adding an agronomic soil (30 % v/v) to improve their clayey granulometric composition, and by topping the mixture with high-quality compost (4 kg m(-2)) to favour the initial adaptation of the selected plant species. The following plant treatments were tested: (1) Paspalum vaginatum, (2) Phragmites australis, (3) Spartium junceum + P. vaginatum, (4) Nerium oleander + P. vaginatum, (5) Tamarix gallica + P. vaginatum, and (6) unplanted control. Eighteen months after the beginning of the experiment, all the plant species were found to be healthy and well developed. Throughout the whole experiment, the monitored biological parameters (total microbial population and dehydrogenase activity) generally increased more in the planted sediments than in the control, pointing to an improvement of the chemico-physical conditions for both microorganisms and plants. The decrease in the concentrations of organic and inorganic contaminants (>35 and 20 %, respectively) in the planted treatments, particularly in T. gallica + P. vaginatum, confirmed the importance of root-microorganism interactions in activating the decontamination processes. Finally, the healthy state of the plants and the sediment characteristics, approaching those of an uncontaminated natural soil (technosoil), indicated the efficiency and success of this technology for brackish sediment reclamation.

  17. The influence of the wooden equipment employed for cheese manufacture on the characteristics of a traditional stretched cheese during ripening.

    PubMed

    Di Grigoli, Antonino; Francesca, Nicola; Gaglio, Raimondo; Guarrasi, Valeria; Moschetti, Marta; Scatassa, Maria Luisa; Settanni, Luca; Bonanno, Adriana

    2015-04-01

    The influence of the wooden equipment used for traditional cheese manufacture from raw milk on the variations of chemico-physical characteristics and microbial populations during ripening of Caciocavallo Palermitano cheese was evaluated. Milk from two farms (A, extensive; B, intensive) was processed under traditional and standard conditions. Chemical and physical traits of the cheeses were affected by the farming system and the cheese-making technology, and changed during ripening. NaCl and soluble N contents were lower, and paste consistency was higher, in cheese from the extensive farm and traditional technology, whereas ripening increased soluble N as well as paste yellowness and consistency. Ripening time decreased the counts of all lactic acid bacteria (LAB) groups, except enterococci, which were detected at approximately constant levels (10(4) and 10(5) cfu g(-1) for standard and traditional cheeses, respectively) up to 120 d of ripening. In all productions, at each ripening time, the levels detected for enterococci were lower than those for the other LAB groups. The canonical discriminant analysis of chemical, physical and microbiological data was able to separate cheeses from different productions and ripening times. The dominant LAB were isolated, phenotypically characterised and grouped, genetically differentiated at strain level and identified. Ten species of LAB were found, and the strains detected at the highest levels were Pediococcus acidilactici and Lactobacillus casei. Ten strains, mainly belonging to Lactobacillus rhamnosus and Lactobacillus fermentum, showed antibacterial activity. The comparison of the polymorphic profiles of the LAB strains isolated from the wooden vat with those of the strains collected during maturation showed the persistence of three enterococci in traditional cheeses, with Enterococcus faecalis found at dominant levels over the Enterococcus population up to 120 d; the absence of these strains in the standard productions demonstrated the contribution of vat LAB during Caciocavallo Palermitano cheese ripening. Copyright © 2014 Elsevier Ltd. All rights reserved.
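
    The canonical discriminant analysis mentioned above can be illustrated with a minimal sketch; the measurements, group labels and the use of scikit-learn's LinearDiscriminantAnalysis (a linear discriminant stand-in for canonical discriminant analysis) are assumptions for illustration only and do not reproduce the study's data or workflow.

```python
# Minimal sketch: separating cheese samples by production type along a
# canonical (linear discriminant) axis.  Features and labels are hypothetical.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(0)

# Hypothetical measurements: [NaCl %, soluble N %, log10 enterococci cfu/g]
traditional = rng.normal([1.8, 0.9, 5.0], 0.1, size=(10, 3))
standard = rng.normal([2.2, 1.2, 4.0], 0.1, size=(10, 3))

X = np.vstack([traditional, standard])
y = ["traditional"] * 10 + ["standard"] * 10

lda = LinearDiscriminantAnalysis(n_components=1)
scores = lda.fit_transform(X, y)      # canonical variate scores per sample
print("scores along the first canonical axis:", scores.ravel().round(2))
print("training classification accuracy:", lda.score(X, y))
```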

  18. Alpha-momorcharin: a ribosome-inactivating protein from Momordica charantia, possessing DNA cleavage properties.

    PubMed

    Wang, Shuzhen; Zheng, Yinzhen; Yan, Junjie; Zhu, Zhixuan; Wu, Zhihua; Ding, Yi

    2013-11-01

    Ribosome-inactivating proteins (RIPs) inhibit protein synthesis through the removal of specific adenine residues from eukaryotic ribosomal RNA, rendering the 60S subunit unable to bind elongation factor 2. They have received much attention in biological and biomedical research due to their unique activities toward tumor cells, as well as their important roles in plant defense. Alpha-momorcharin (α-MC), a member of the type I family of RIPs, is abundant in the seeds of Momordica charantia L. Previous studies demonstrated that α-MC is an effective antifungal and antibacterial protein. In this study, a detailed analysis of the DNase-like activity of α-MC was conducted. Results showed that the DNase-like activity toward plasmid DNA was time-dependent, temperature-related, and pH-stable. Moreover, a requirement for divalent metal ions in the catalytic domain of α-MC was confirmed. Additionally, Tyr(93) was found to be a critical residue for the DNase-like activity, while Tyr(134), Glu(183), Arg(186), and Trp(215) were activity-related residues. This study of the chemico-physical properties and mechanism of action of α-MC will improve its utilization in scientific research, as well as its potential industrial uses. These results may also assist in the characterization and elucidation of the DNase-like enzymatic properties of other RIPs.

  19. Alpha-enolase on apical surface of renal tubular epithelial cells serves as a calcium oxalate crystal receptor

    NASA Astrophysics Data System (ADS)

    Fong-Ngern, Kedsarin; Thongboonkerd, Visith

    2016-10-01

    To search for a strategy to prevent kidney stone formation/recurrence, this study addressed the role of α-enolase on the apical membrane of renal tubular cells in mediating calcium oxalate monohydrate (COM) crystal adhesion. Its presence on the apical membrane and in the COM crystal-bound fraction was confirmed by Western blotting and immunofluorescence staining. Pretreating MDCK cells with anti-α-enolase antibody, but not with isotype control IgG, dramatically reduced cell-crystal adhesion. Immunofluorescence staining also confirmed the direct binding of purified α-enolase to COM crystals at the {121} > {100} > {010} crystal faces. Coating COM crystals with urinary proteins diminished the crystal binding capacity to cells and to purified α-enolase. Moreover, α-enolase selectively bound to COM, not to other crystals. Chemico-protein interaction analysis revealed that α-enolase interacted directly with Ca2+ and Mg2+. Incubating the cells with Mg2+ prior to the cell-crystal adhesion assay significantly reduced crystal binding on the cell surface, whereas preincubation with EDTA, a divalent cation chelator, completely abolished the Mg2+ effect, indicating that COM and Mg2+ competitively bind to α-enolase. Taken together, we successfully confirmed the role of α-enolase as a COM crystal receptor mediating COM crystal adhesion at the apical membrane of renal tubular cells. It may also serve as a target for stone prevention by blocking cell-crystal adhesion and stone nidus formation.

  20. Heavy metals affect nematocysts discharge response and biological activity of crude venom in the jellyfish Pelagia noctiluca (Cnidaria, Scyphozoa).

    PubMed

    Morabito, Rossana; Dossena, Silvia; La Spada, Giuseppa; Marino, Angela

    2014-01-01

    Pollution of marine ecosystems and, specifically, heavy metal contamination may compromise the physiology of marine animals through events occurring at the cellular and molecular level. The present study focuses on the effect of short-term exposure to heavy metals such as zinc, cadmium, cobalt and lanthanum (2-10 mM) on the homeostasis of Pelagia noctiluca (Cnidaria, Scyphozoa), a jellyfish abundant in the Mediterranean Sea. This species possesses stinging organoids, termed nematocysts, whose discharge and concomitant delivery of venom underlie the survival of all Cnidaria. The nematocyst discharge response, elicited by combined chemico-physical stimulation, was verified on excised oral arms exposed to heavy metals for 20 min. In addition, the hemolytic activity of toxins contained in the crude venom extracted from nematocysts isolated from oral arms was tested on human erythrocytes in the presence of heavy metals or their mixture. Treatment with heavy metals significantly inhibited both the nematocyst discharge response and the hemolytic activity of the crude venom in a dose-dependent manner that did not involve oxidative events and was irreversible in the case of lanthanum. Our findings show that the homeostasis of Pelagia noctiluca, in terms of nematocyst discharge capability and effectiveness of venom toxins, is dramatically and rapidly compromised by heavy metals, and confirm that this jellyfish is a suitable model for ecotoxicological investigations. © 2014 S. Karger AG, Basel.

  1. Multivariate Models for Prediction of Human Skin Sensitization ...

    EPA Pesticide Factsheets

    One of the Interagency Coordinating Committee on the Validation of Alternative Methods' (ICCVAM) top priorities is the development and evaluation of non-animal approaches to identify potential skin sensitizers. The complexity of biological events necessary to produce skin sensitization suggests that no single alternative method will replace the currently accepted animal tests. ICCVAM is evaluating an integrated approach to testing and assessment based on the adverse outcome pathway for skin sensitization that uses machine learning approaches to predict human skin sensitization hazard. We combined data from three in chemico or in vitro assays - the direct peptide reactivity assay (DPRA), human cell line activation test (h-CLAT) and KeratinoSens™ assay - six physicochemical properties and an in silico read-across prediction of skin sensitization hazard into 12 variable groups. The variable groups were evaluated using two machine learning approaches, logistic regression and support vector machine, to predict human skin sensitization hazard. Models were trained on 72 substances and tested on an external set of 24 substances. The six models (three logistic regression and three support vector machine) with the highest accuracy (92%) used: (1) DPRA, h-CLAT and read-across; (2) DPRA, h-CLAT, read-across and KeratinoSens; or (3) DPRA, h-CLAT, read-across, KeratinoSens and log P. The models performed better at predicting human skin sensitization hazard than the murine local lymph node assay.

  2. Memantine-derived drugs as potential antitumor agents for the treatment of glioblastoma.

    PubMed

    Cacciatore, Ivana; Fornasari, Erika; Marinelli, Lisa; Eusepi, Piera; Ciulla, Michele; Ozdemir, Ozlem; Tatar, Abdulgani; Turkez, Hasan; Di Stefano, Antonio

    2017-11-15

    Glioblastoma is one of the most aggressive malignant primary brain cancers in adults. To date, surgery, radiotherapy and current pharmacological treatments are not sufficient to manage this pathology, which has a high mortality rate (median survival 12-15 months). Recently, anticancer multi-targeted compounds have attracted much attention, with the aim of obtaining new drugs able to hit different biological targets involved in the onset and progression of the disease. Here, we report the synthesis of novel memantine-derived drugs (MP1-10) and their potential antitumor activities in the human U87MG glioblastoma cell line. MP1-10 were synthesized by joining memantine, an NMDA antagonist, to different histone deacetylase inhibitors to obtain single molecules with improved therapeutic efficacy. Biological results indicated that MP1 and MP2 possessed more potent anti-proliferative effects on U87MG cells than MP3-10, in a dose-dependent manner. MP1 and MP2 induced significant cell death by apoptosis, characterized by apoptotic morphological changes in Hoechst staining. Both drugs also exhibited non-genotoxic and only mild cytotoxic effects in human whole blood cells. However, only MP1, showing good chemico-physical properties (solubility, LogP) and enzymatic stability in gastric and intestinal fluids, can be considered a suitable candidate for in vivo pharmacokinetic studies. Copyright © 2017 Elsevier B.V. All rights reserved.

  3. Almond tree and organic fertilization for soil quality improvement in southern Italy.

    PubMed

    Macci, Cristina; Doni, Serena; Peruzzi, Eleonora; Masciandaro, Grazia; Mennone, Carmelo; Ceccanti, Brunello

    2012-03-01

    The semi-arid Mediterranean region, characterized by long dry periods followed by heavy bursts of rainfall, is particularly prone to soil erosion. The main goal of this study is to evaluate soil quality under different practices of bio-physical amelioration involving the soil-plant system (almond trees) and microorganism-manure. This study, carried out in the South of Italy (Basilicata Region, Pantanello farm), considered two types of fertilization (mineral and organic) and three slope gradients (0, 2 and 6%), in order to evaluate the effects of management practices in resisting soil erosion. Chemical (organic carbon and nitrogen), physical (soil shrinkage and bulk density) and biochemical (dehydrogenase activity and hydrolytic enzyme activities) parameters were selected as markers to follow agro-ecological changes over time. The organic treatment affected soil microbiological and physico-chemical properties by increasing soil nutrient availability and microbial activity, and by improving soil structure. The consistently higher values of the hydrolytic enzyme activities (β-glucosidase, phosphatase, urease and protease), often observed in the presence of plants and on the 0 and 2% slopes, suggested the stimulation of nutrient cycles by tree roots, which improve the conditions for soil microorganisms to carry out their metabolic activity. On the 6% slope and, in particular, in the mineral fertilizer treatment, soil metabolism was lower, as indicated by the dehydrogenase activity, which was 50% lower than that found on the 0 and 2% slopes; this seemed to be related to a slowdown in nutrient cycling and organic carbon metabolism. However, on this slope, in both mineral and organic treatments, a significant stimulation of hydrolytic enzyme activities and an improvement of soil structure (reduction of bulk density of about 10% and increase in total shrinkage from 20 to 60%) were observed with plants compared to the control soil. The combination of organic fertilization and almond trees proved effective, even on the steepest slope, in mitigating degradation processes through the improvement of chemico-nutritional, biochemical and physical soil properties. Copyright © 2010 Elsevier Ltd. All rights reserved.

  4. Mechanisms of detonation formation due to a temperature gradient

    NASA Astrophysics Data System (ADS)

    Kapila, A. K.; Schwendeman, D. W.; Quirk, J. J.; Hawa, T.

    2002-12-01

    Emergence of a detonation in a homogeneous, exothermically reacting medium can be deemed to occur in two phases. The first phase processes the medium so as to create conditions ripe for the onset of detonation. The actual events leading up to preconditioning may vary from one experiment to the next, but typically, at the end of this stage the medium is hot and in a state of nonuniformity. The second phase consists of the actual formation of the detonation wave via chemico-gasdynamic interactions. This paper considers an idealized medium with simple, rate-sensitive kinetics for which the preconditioned state is modelled as one with an initially prescribed linear gradient of temperature. Accurate and well-resolved numerical computations are carried out to determine the mode of detonation formation as a function of the size of the initial gradient. For shallow gradients, the result is a decelerating supersonic reaction wave, a weak detonation, whose trajectory is dictated by the initial temperature profile, with only weak intervention from hydrodynamics. If the domain is long enough, or the gradient less shallow, the wave slows down to the Chapman-Jouguet speed and undergoes a swift transition to the ZND structure. For sharp gradients, gasdynamic nonlinearity plays a much stronger role. Now the path to detonation is through an accelerating pulse that runs ahead of the reaction wave and rearranges the induction-time distribution there to one that bears little resemblance to that corresponding to the initial temperature gradient. The pulse amplifies and steepens, transforming itself into a complex consisting of a lead shock, an induction zone, and a following fast deflagration. As the pulse advances, its three constituent entities attain progressively higher levels of mutual coherence, to emerge as a ZND detonation. For initial gradients that are intermediate in size, aspects of both the extreme scenarios appear in the path to detonation. The novel aspect of this study resides in the fact that it is guided by, and its results are compared with, existing asymptotic analyses of detonation evolution.
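
    The dependence of the formation mode on the initial gradient can be pictured with the classical spontaneous-wave estimate: for one-step Arrhenius kinetics the induction time varies along the temperature gradient, and the phase speed of the spontaneous reaction wave is the inverse of the induction-time gradient. The sketch below is a generic illustration with assumed parameter values (activation temperature, pre-exponential factor, sound speed), not the paper's model or numbers.

```python
# Sketch of the spontaneous-wave estimate: with a linear temperature profile
# T(x) and a one-step Arrhenius induction time t_ind(T) ~ exp(E/(R*T)),
# the spontaneous reaction-wave speed is u_sp(x) = 1 / |d t_ind / dx|.
# Shallow gradients give u_sp far above the sound speed (weak-detonation
# regime); sharper gradients bring u_sp toward acoustic speeds, where
# gasdynamics takes over.  All numbers are illustrative assumptions.
import numpy as np

E_over_R = 15000.0      # activation temperature [K] (assumed)
T0, A = 1200.0, 1e-9    # hot-end temperature [K], pre-exponential factor [s]
a_sound = 700.0         # representative sound speed [m/s] (assumed)

def induction_time(T):
    return A * np.exp(E_over_R / T)

for grad in (5.0, 50.0, 500.0):          # |dT/dx| in K/m (assumed)
    x = np.linspace(0.0, 0.02, 401)      # 2 cm domain
    T = T0 - grad * x                    # temperature falls away from the hot end
    t_ind = induction_time(T)
    u_sp = 1.0 / np.gradient(t_ind, x)   # spontaneous-wave speed [m/s]
    print(f"dT/dx = {grad:6.1f} K/m -> u_sp/a between "
          f"{u_sp.min()/a_sound:7.1f} and {u_sp.max()/a_sound:7.1f}")
```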

  5. Influence of compositional variation on structural, electrical and magnetic characteristics of (Ba1-x Gd x )(Ti1-x Fe x )O3 (0.2 ≤ x ≤ 0.5)

    NASA Astrophysics Data System (ADS)

    Sahoo, Sushrisangita; Mahapatra, P. K.; Choudhary, R. N. P.; Alagarsamy, Perumal

    2018-01-01

    The effect of compositional variation of (Ba1-x Gd x )(Ti1-x Fe x )O3 (0.2 ≤ x ≤ 0.5) on structural, optical, electrical and multiferroic properties was investigated. The polycrystalline samples were fabricated by a chemico-thermal route. While the compounds with composition x ≤ 0.3 have a tetragonal structure akin to BaTiO3, the higher compositions (x > 0.3) crystallize in a mixed phase of the tetragonal and orthorhombic structures. The different polarization mechanisms in the compound were analyzed on the basis of the ferroelectric-paraelectric phase transition at 120 °C, the magnetic reorientation mediated by the Gd3+ ↔ Fe3+ exchange interaction at 200 °C, and that induced by antiferromagnetic ordering mediated through the Fe3+ ↔ Fe3+ exchange interactions at 380 °C. Analysis of the ac conductivity on the basis of Jonscher's power law indicates the presence of a correlated barrier hopping conduction mechanism in the samples. Among the studied samples, the composition with x = 0.3 exhibits improved material properties, such as a lower optical band gap and higher optical absorption, a high dielectric constant (830 at room temperature, with peak values of 3944 at 160 °C and 6478 at 377.5 °C), and a room-temperature ME coefficient of 1.53 mV cm-1 Oe-1, making it promising for technological applications.
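
    The Jonscher power-law analysis referred to above, σ(ω) = σ_dc + Aω^s, amounts to a three-parameter fit of the measured ac conductivity versus angular frequency; the exponent s is what signals correlated barrier hopping. The sketch below fits synthetic data and the parameter values are assumptions for illustration, not the paper's measurements.

```python
# Minimal sketch of a Jonscher power-law fit, sigma(w) = sigma_dc + A*w**s.
# The "measured" conductivities below are synthetic (2% noise).
import numpy as np
from scipy.optimize import curve_fit

def jonscher(omega, sigma_dc, A, s):
    return sigma_dc + A * omega**s

omega = np.logspace(2, 6, 60)                 # angular frequency [rad/s]
true_params = (1e-6, 2e-11, 0.78)             # assumed values for the synthetic data
noise = 1 + 0.02 * np.random.default_rng(1).standard_normal(omega.size)
sigma = jonscher(omega, *true_params) * noise

popt, _ = curve_fit(jonscher, omega, sigma, p0=(1e-6, 1e-10, 0.7), maxfev=20000)
print("sigma_dc = {:.2e} S/cm, A = {:.2e}, s = {:.3f}".format(*popt))
```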

  6. Chemistry and biology of reactive species with special reference to the antioxidative defence status in pancreatic β-cells.

    PubMed

    Lenzen, Sigurd

    2017-08-01

    Diabetes mellitus is a serious metabolic disease. Dysfunction and subsequent loss of the β-cells in the islets of Langerhans through apoptosis ultimately cause a life-threatening insulin deficiency. The underlying reason for the particular vulnerability of the β-cells is an extraordinary sensitivity to the toxicity of reactive oxygen and nitrogen species (ROS and RNS) due to their low antioxidative defense status. This review considers the different aspects of the chemistry and biology of the biologically most important reactive species and their chemico-biological interactions in the β-cell toxicity of proinflammatory cytokines in type 1 diabetes and of lipotoxicity in type 2 diabetes development. The weak antioxidative defense equipment in the different subcellular organelles makes the β-cells particularly vulnerable and prone to mitochondrial, peroxisomal and ER stress. Considering the enzyme deficiencies that are responsible for the low antioxidative defense status of the pancreatic β-cells, the crucial one is the lack of enzymatic capacity for H2O2 inactivation at all major subcellular sites. Diabetes is the most prevalent metabolic disorder, with a steadily increasing incidence of both type 1 and type 2 diabetes worldwide. The weak protection of the pancreatic β-cells against oxidative stress is a major reason for their particular vulnerability. Thus, careful protection of the β-cells is required for prevention of the disease. Copyright © 2017 Elsevier B.V. All rights reserved.

  7. Skin sensitisation: the Colipa strategy for developing and evaluating non-animal test methods for risk assessment.

    PubMed

    Maxwell, Gavin; Aeby, Pierre; Ashikaga, Takao; Bessou-Touya, Sandrine; Diembeck, Walter; Gerberick, Frank; Kern, Petra; Marrec-Fairley, Monique; Ovigne, Jean-Marc; Sakaguchi, Hitoshi; Schroeder, Klaus; Tailhardat, Magali; Teissier, Silvia; Winkler, Petra

    2011-01-01

    Allergic contact dermatitis is a delayed-type hypersensitivity reaction induced by small reactive chemicals (haptens). Currently, the sensitising potential and potency of new chemicals are usually characterised using data generated via animal studies, such as the local lymph node assay (LLNA). There are, however, increasing public and political concerns regarding the use of animals for the testing of new chemicals. Consequently, the development of in vitro, in chemico or in silico models for predicting the sensitising potential and/or potency of new chemicals is receiving widespread interest. The Colipa Skin Tolerance task force currently collaborates with and/or funds several academic research groups to expand our understanding of the molecular and cellular events occurring during the acquisition of skin sensitisation. Knowledge gained from this research is being used to support the development and evaluation of novel alternative approaches for the identification and characterisation of skin-sensitising chemicals. At present, three non-animal test methods (Direct Peptide Reactivity Assay (DPRA), Myeloid U937 Skin Sensitisation Test (MUSST) and human Cell Line Activation Test (hCLAT)) have been evaluated in Colipa interlaboratory ring trials for their potential to predict skin sensitisation potential and were recently submitted to ECVAM for formal pre-validation. Data from all three test methods will now be used to support the study and development of testing strategy approaches for skin sensitiser potency prediction. This publication represents the current viewpoint of the cosmetics industry on the feasibility of replacing the need for animal test data for informing skin sensitisation risk assessment decisions.

  8. Integrated testing strategies for toxicity employing new and existing technologies.

    PubMed

    Combes, Robert D; Balls, Michael

    2011-07-01

    We have developed individual, integrated testing strategies (ITS) for predicting the toxicity of general chemicals, cosmetics, pharmaceuticals, inhaled chemicals, and nanoparticles. These ITS are based on published schemes developed previously for the risk assessment of chemicals to fulfil the requirements of REACH, which have been updated to take account of the latest developments in advanced in chemico modelling and in vitro technologies. In addition, we propose an ITS for neurotoxicity, based on the same principles, for incorporation in the other ITS. The technologies are deployed in a step-wise manner, as a basis for decision-tree approaches, incorporating weight-of-evidence stages. This means that testing can be stopped at the point where a risk assessment and/or classification can be performed, with labelling in accordance with the requirements of the regulatory authority concerned, rather than following a checklist approach to hazard identification. In addition, the strategies are intelligent, in that they are based on the fundamental premise that there is no hazard in the absence of exposure - which is why pharmacokinetic modelling plays a key role in each ITS. The new technologies include the use of complex, three-dimensional human cell tissue culture systems with in vivo-like structural, physiological and biochemical features, as well as dosing conditions. In this way, problems of inter-species extrapolation and in vitro/in vivo extrapolation are minimised. This is reflected in the ITS placing more emphasis on the use of volunteers at the whole organism testing stage, rather than on existing animal testing, which is the current situation. 2011 FRAME.

  9. Multivariate Models for Prediction of Human Skin Sensitization Hazard

    PubMed Central

    Strickland, Judy; Zang, Qingda; Paris, Michael; Lehmann, David M.; Allen, David; Choksi, Neepa; Matheson, Joanna; Jacobs, Abigail; Casey, Warren; Kleinstreuer, Nicole

    2016-01-01

    One of ICCVAM’s top priorities is the development and evaluation of non-animal approaches to identify potential skin sensitizers. The complexity of biological events necessary to produce skin sensitization suggests that no single alternative method will replace the currently accepted animal tests. ICCVAM is evaluating an integrated approach to testing and assessment based on the adverse outcome pathway for skin sensitization that uses machine learning approaches to predict human skin sensitization hazard. We combined data from three in chemico or in vitro assays—the direct peptide reactivity assay (DPRA), human cell line activation test (h-CLAT), and KeratinoSens™ assay—six physicochemical properties, and an in silico read-across prediction of skin sensitization hazard into 12 variable groups. The variable groups were evaluated using two machine learning approaches, logistic regression (LR) and support vector machine (SVM), to predict human skin sensitization hazard. Models were trained on 72 substances and tested on an external set of 24 substances. The six models (three LR and three SVM) with the highest accuracy (92%) used: (1) DPRA, h-CLAT, and read-across; (2) DPRA, h-CLAT, read-across, and KeratinoSens; or (3) DPRA, h-CLAT, read-across, KeratinoSens, and log P. The models performed better at predicting human skin sensitization hazard than the murine local lymph node assay (accuracy = 88%), any of the alternative methods alone (accuracy = 63–79%), or test batteries combining data from the individual methods (accuracy = 75%). These results suggest that computational methods are promising tools to effectively identify potential human skin sensitizers without animal testing. PMID:27480324
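
    A minimal sketch of the kind of model described here, combining binary DPRA, h-CLAT and read-across calls with a continuous descriptor such as log P and fitting logistic regression and a support vector machine, is given below. The feature encoding and data are hypothetical placeholders and do not reproduce the authors' dataset, variable groups or results.

```python
# Sketch: predicting a binary human skin-sensitization hazard call from
# assay outcomes (DPRA, h-CLAT, read-across as 0/1) plus log P, with
# logistic regression and an SVM.  Data are random placeholders.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(42)
n = 96                                   # e.g. 72 training + 24 test substances
dpra, hclat, readx = (rng.integers(0, 2, n) for _ in range(3))
logp = rng.normal(2.0, 1.5, n)
X = np.column_stack([dpra, hclat, readx, logp])
# Hypothetical "truth": sensitizer if at least two of the three calls are positive.
y = ((dpra + hclat + readx) >= 2).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=24, random_state=0)

for name, model in [("logistic regression", LogisticRegression(max_iter=1000)),
                    ("support vector machine", SVC(kernel="rbf", C=1.0))]:
    model.fit(X_tr, y_tr)
    acc = accuracy_score(y_te, model.predict(X_te))
    print(f"{name}: external-set accuracy = {acc:.2f}")
```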

  10. In Silico Molecular Interaction of Bisphenol Analogues with Human Nuclear Receptors Reveals their Stronger Affinity vs. Classical Bisphenol A.

    PubMed

    Sharma, Shikha; Ahmad, Shahzad; Faraz Khan, Mohemmed; Parvez, Suhel; Raisuddin, Sheikh

    2018-06-21

    Bisphenol A (BPA) is known for its endocrine disrupting activity. In order to replace BPA, a number of bisphenol analogues have been designed. However, their activity profile is poorly described, and little information exists about their endocrine disrupting potential and interactions with nuclear receptors. An understanding of such interactions may unravel the mechanism of their molecular action and provide valuable inputs for risk assessment. BPA binds and activates peroxisome proliferator-activated receptors (PPARs) and retinoid X receptors (RXRs), which act as transcription factors and regulate genes involved in glucose, lipid, and cholesterol metabolism and adipogenesis. We studied the binding efficiency of 18 bisphenol analogues and BPA with human PPARs and RXRs. Using Maestro Schrodinger 9.4, docking scores of the bisphenols were compared with those of known endogenous and exogenous ligands of hPPARs and hRXRs. BPA showed good binding efficiency. Several analogues also showed higher binding efficiency than BPA. BPPH, which has a high tendency to be absorbed in tissues, showed the strongest binding with hPPARα, hPPARβ, hPPARγ and hRXRα, whereas two of the most toxic bisphenols, BPM and BPAF, showed the strongest binding with hRXRβ and hRXRγ. Some of the bisphenol analogues showed a stronger binding affinity with PPAR and RXR compared to BPA, implying that BPA substitutes may not be fully safe and that chemico-biological interactions indicate their toxic potential. These results may also serve to plan further studies for determining the safety profile of bisphenol analogues and be helpful in risk characterization.

  11. A novel in chemico method to detect skin sensitizers in highly diluted reaction conditions.

    PubMed

    Yamamoto, Yusuke; Tahara, Haruna; Usami, Ryota; Kasahara, Toshihiko; Jimbo, Yoshihiro; Hioki, Takanori; Fujita, Masaharu

    2015-11-01

    The direct peptide reactivity assay (DPRA) is a simple and versatile alternative method for the evaluation of skin sensitization that involves the reaction of test chemicals with two peptides. However, this method requires concentrated solutions of test chemicals, and hydrophobic substances may not dissolve at the concentrations required. Furthermore, hydrophobic test chemicals may precipitate when added to the reaction solution. We previously established a high-sensitivity method, the amino acid derivative reactivity assay (ADRA). This method uses a novel cysteine derivative (NAC) and a novel lysine derivative (NAL), which were synthesized by introducing a naphthalene ring onto the amine group of cysteine and lysine residues. In this study, we modified the ADRA method by reducing the concentration of the test chemicals 100-fold. We investigated the accuracy of skin sensitization predictions made using the modified method, which was designated the ADRA-dilutional method (ADRA-DM). The predictive accuracy of the ADRA-DM for skin sensitization was 90% for 82 test chemicals that were also evaluated with the ADRA, and the predictive accuracy of the ADRA-DM was higher than that of the ADRA and the DPRA. Furthermore, no precipitation of test compounds was observed at the initiation of the ADRA-DM reaction. These results show that the ADRA-DM allows the use of test chemicals at concentrations two orders of magnitude lower than those possible with the ADRA. In addition, the ADRA-DM does not have the restrictions on test compound solubility that were a major problem with the DPRA. Therefore, the ADRA-DM is a versatile and useful method. Copyright © 2015 John Wiley & Sons, Ltd.

  12. Carbon sequestration and fertility after centennial time scale incorporation of charcoal into soil

    NASA Astrophysics Data System (ADS)

    Criscuoli, Irene; Alberti, Giorgio; Baronti, Silvia; Favilli, Filippo; Martinez, Cristina; Calzolari, Costanza; Pusceddu, Emanuela; Rumpel, Cornelia; Viola, Roberto; Miglietta, Franco

    2014-05-01

    The addition of pyrogenic carbon (C) to the soil is considered a sustainable strategy to achieve direct C sequestration and a potential reduction of non-CO2 greenhouse gas emissions. In this paper, we investigated the long-term effects of charcoal addition on C sequestration and soil chemico-physical properties by studying a series of abandoned charcoal hearths in the Eastern Alps established in the 19th century. This natural setting can be seen as an analogue of a deliberate experiment with replications. Carbon sequestration was assessed indirectly by comparing the amount of C present in the hearths with the estimated amount of charcoal that was left on the soil after carbonization. Approximately 80% of the C originally added to the soil via charcoal can still be found today, thus supporting the view that charcoal incorporation is an effective way to sequester atmospheric CO2. We also observed an improvement in the physical properties (hydrophobicity and bulk density) of the charcoal hearth soils and an accumulation of nutrients compared to the adjacent soil without charcoal. We then focused on the morphological and physical characterization of several fragments, using scanning electron microscopy (SEM), X-ray diffraction (XRD) and X-ray fluorescence (XRF). This study enabled the identification of peculiar morphological features of the tracheids, which were tentatively associated with a differential oxidation of the structures created from lignin and cellulose during carbonization. In order to assess the effect of soil aging, we compared the old biochar with a modern one obtained from the same feedstock with a similar carbonization process. XRD and XRF analyses were performed on both old and modern biochar in order to study the multiphase crystalline structure and the chemical elements present. We observed mineralization and fossilization of the old biochar samples with respect to the modern ones, with accumulation of several mineral oxides and a substantial presence of quartz. A graphene structure was also found, indicating weak bonds in the carbon structures, explained by inter-molecular Van der Waals forces. Furthermore, we detected a graphite oxide structure responsible for the bending effect in the tracheids revealed in the SEM images. We consider that these results may contribute to the ongoing debate on the best, most suitable geo-engineering strategies that can potentially enable effective and sustainable carbon sequestration in agricultural soils using biochar.
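
    The indirect retention estimate described above reduces to a ratio of the pyrogenic carbon stock measured today to the charcoal carbon reconstructed as the initial input. A trivial worked example with invented numbers is sketched below; neither value is taken from the study.

```python
# Worked example (invented numbers): fraction of charcoal-derived carbon
# still present in a hearth soil, estimated as measured stock divided by
# the reconstructed initial charcoal-C input.
c_measured_kg_m2 = 12.0   # pyrogenic C measured in the hearth today (assumed)
c_added_kg_m2 = 15.0      # charcoal C left on the soil after carbonization (assumed)

retained_pct = 100.0 * c_measured_kg_m2 / c_added_kg_m2
print(f"carbon retained after ~150 years: {retained_pct:.0f}%")  # ~80% in this example
```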

  13. Probabilistic hazard assessment for skin sensitization potency by dose–response modeling using feature elimination instead of quantitative structure–activity relationships

    PubMed Central

    McKim, James M.; Hartung, Thomas; Kleensang, Andre; Sá-Rocha, Vanessa

    2016-01-01

    Supervised learning methods promise to improve integrated testing strategies (ITS), but must be adjusted to handle high dimensionality and dose-response data. ITS approaches are currently fueled by the increasing mechanistic understanding of adverse outcome pathways (AOP) and the development of tests reflecting these mechanisms. Simple approaches to combine skin sensitization data sets, such as weight of evidence, fail due to problems of information redundancy and high dimensionality. The problem is further amplified when potency information (dose/response) of hazards is to be estimated. Skin sensitization currently serves as the foster child for AOP and ITS development, as legislative pressures combined with a very good mechanistic understanding of contact dermatitis have led to test development and relatively large high-quality data sets. We curated such a data set and combined it with a recursive variable selection algorithm to evaluate the information available through in silico, in chemico and in vitro assays. Chemical similarity alone could not cluster chemicals' potency, and in vitro models consistently ranked high in recursive feature elimination. This allows reducing the number of tests included in an ITS. Next, we analyzed the data with a hidden Markov model that takes advantage of an intrinsic inter-relationship among the local lymph node assay classes, i.e. the monotonic connection between local lymph node assay class and dose. The dose-informed random forest/hidden Markov model was superior to the dose-naive random forest model on all data sets. Although the balanced accuracy improvement may seem small, this obscures the actual improvement in misclassifications, as the dose-informed hidden Markov model strongly reduced "false negatives" (i.e. extreme sensitizers predicted as non-sensitizers) on all data sets. PMID:26046447

  14. The MyoRobot: A novel automated biomechatronics system to assess voltage/Ca2+ biosensors and active/passive biomechanics in muscle and biomaterials.

    PubMed

    Haug, M; Reischl, B; Prölß, G; Pollmann, C; Buckert, T; Keidel, C; Schürmann, S; Hock, M; Rupitsch, S; Heckel, M; Pöschel, T; Scheibel, T; Haynl, C; Kiriaev, L; Head, S I; Friedrich, O

    2018-04-15

    We engineered an automated biomechatronics system, the MyoRobot, for robust, objective and versatile assessment of the (bio-)mechanics of muscle or polymer materials. It covers multiple levels of muscle biosensor assessment, e.g. membrane voltage or contractile apparatus Ca2+ ion responses (force resolution 1 µN, 0-10 mN for the given sensor; [Ca2+] range ~100 nM-25 µM). It replaces previously tedious manual protocols to obtain exhaustive information on active/passive biomechanical properties across various morphological tissue levels. Deciphering mechanisms of muscle weakness requires sophisticated force protocols, dissecting contributions from altered Ca2+ homeostasis, electro-chemical or chemico-mechanical biosensors, or visco-elastic components. From the whole-organ to the single-fibre level, experimental demands and hardware requirements increase, limiting biomechanics research potential, as reflected by the few commercial biomechatronics systems that can address resolution, experimental versatility and, above all, automation of force recordings. Our MyoRobot combines optical force transducer technology with high-precision 3D actuation (e.g. voice coil, 1 µm encoder resolution; stepper motors, 4 µm feed motion) and customized control software, enabling modular experimentation packages and automated data pre-analysis. In small bundles and single muscle fibres, we demonstrate automated recordings of (i) caffeine-induced force, (ii) electrical field stimulation (EFS)-induced force, (iii) pCa-force relationships, (iv) slack tests and (v) passive length-tension curves. The system easily reproduces results from manual systems (two times larger stiffness in slow over fast muscle) and provides novel insights into unloaded shortening velocities (declining with increasing slack lengths). The MyoRobot enables automated, complex biomechanics assessment in muscle research. Applications also extend to the material sciences, exemplarily shown here for spider silk and collagen biopolymers. Copyright © 2017 Elsevier B.V. All rights reserved.

  15. The "Janus face" of the thrombin binding aptamer: Investigating the anticoagulant and antiproliferative properties through straightforward chemical modifications.

    PubMed

    Esposito, Veronica; Russo, Annapina; Amato, Teresa; Vellecco, Valentina; Bucci, Mariarosaria; Mayol, Luciano; Russo, Giulia; Virgilio, Antonella; Galeone, Aldo

    2018-02-01

    The thrombin binding aptamer (TBA) is endowed with both anticoagulant and antiproliferative activities. Its chemico-physical and/or biological properties can be tuned by the site-specific replacement of selected residues. Four oligodeoxynucleotides (ODNs) based on the TBA sequence (5'-GGTTGGTGTGGTTGG-3') and containing 2'-deoxyuridine (U) or 5-bromo-2'-deoxyuridine (B) residues at position 4 or 13 have been investigated by NMR and CD techniques. Furthermore, their anticoagulant (PT assay) and antiproliferative (MTT assay) properties have been tested and compared with those of two further ODNs containing 5-hydroxymethyl-2'-deoxyuridine (H) residues in the same positions, investigated previously. The CD and NMR data suggest that all the investigated ODNs are able to form G-quadruplexes strictly resembling that of TBA. The introduction of B residues at position 4 or 13 increases the melting temperature of the modified aptamers by 7 °C. The replacement of thymidines with U in the same positions results in an enhanced anticoagulant activity compared to TBA, also at low ODN concentration. Although all the ODNs show antiproliferative properties, only the TBA derivatives containing H at positions 4 and 13 lose the anticoagulant activity while remarkably preserving the antiproliferative one. All the ODNs have shown antiproliferative activities against two cancer cell lines, but only those with U and B are endowed with anticoagulant activities similar to or improved over that of TBA. The appropriate site-specific replacement of the residues in the TT loops of TBA with commercially available thymine analogues is a useful strategy either to improve the anticoagulant activity or to preserve the antiproliferative properties while quenching the anticoagulant ones. Copyright © 2017 Elsevier Inc. All rights reserved.

  16. Not all boronic acids with a five-membered cycle induce tremor, neuronal damage and decreased dopamine.

    PubMed

    Pérez-Rodríguez, Maribel; García-Mendoza, Esperanza; Farfán-García, Eunice D; Das, Bhaskar C; Ciprés-Flores, Fabiola J; Trujillo-Ferrara, José G; Tamay-Cach, Feliciano; Soriano-Ursúa, Marvin A

    2017-09-01

    Several striatal toxins can be used to induce motor disruption. One example is MPTP (1-methyl-4-phenyl-1,2,3,6-tetrahydropyridine), whose toxicity is accepted as a murine model of parkinsonism. Recently, 3-thienylboronic acid (3TB) was found to produce motor disruption and biased neuronal damage to the basal ganglia in mice. The aim of this study was to examine the toxic effects of four boronic acids with a close structural relationship to 3TB (all having a five-membered cycle), as well as boric acid and 3TB. These boron-containing compounds were compared to MPTP regarding brain access, morphological disruption of the CNS, and behavioral manifestations of such disruption. Data were collected through acute toxicity evaluations, motor behavior tests, necropsies, determination of neuronal survival by immunohistochemistry, Raman spectroscopic analysis of brain tissue, and HPLC measurement of dopamine in substantia nigra and striatum tissue. Each compound showed a distinct profile of motor disruption. For example, motor activity was not disrupted by boric acid, but was decreased by two boronic acids (owing to a sedative effect). 3TB, 2-thienylboronic acid and 2-furanylboronic acid gave rise to shaking behavior. The various manifestations generated by these compounds can be linked, in part, to different levels of dopamine (measured by HPLC) and degrees of neuronal damage in the basal ganglia and cerebellum. Clearly, motor disruption is not induced by all boronic acids with a five-membered cycle as substituent. Possible explanations are given for the diverse chemico-morphological changes and degrees of disruption of the motor system, considering the role of boron and the structure-toxicity relationship. Copyright © 2017 Elsevier B.V. All rights reserved.

  17. Homeostasis and the concept of 'interstitial fluids hierarchy': Relevance of cerebrospinal fluid sodium concentrations and brain temperature control (Review)

    PubMed Central

    Agnati, Luigi F.; Marcoli, Manuela; Leo, Giuseppina; Maura, Guido; Guidolin, Diego

    2017-01-01

    In this review, the aspects and further developments of the concept of homeostasis are discussed, also from the perspective of their possible impact on clinical practice, particularly as far as psychic homeostasis is concerned. A brief historical survey and comments on the concepts of homeostasis and allostasis are presented to introduce our proposal, which is based on the classical assumption of the interstitial fluid (ISF) as the internal medium of multicellular organisms. However, the new concept of a hierarchic role of the ISF of the various organs is introduced. Additionally, it is suggested that, particularly for some chemico-physical parameters, oscillatory rhythms within their proper set-ranges should be considered a fundamental component of homeostasis. Against this background, we propose that the brain ISF has the highest hierarchic role in human beings, providing the optimal environment not simply for brain cell survival, but also for complex brain functions and the oscillatory rhythms of some parameters, such as cerebrospinal fluid sodium and brain ISF pressure waves, which may play a crucial role in brain physio-pathological states. Thus, according to this proposal, the brain ISF represents the real internal medium, since the maintenance of its dynamic intra-set-range homeostasis is the main factor for a free and independent life of higher vertebrates. Furthermore, the evolutionary links between brain and kidney and their synergistic role in H2O/Na balance and brain temperature control are discussed. Finally, it is surmised that these two interrelated parameters have deep effects on the higher integrative actions of the Central Nervous System (CNS), such as those linked to psychic homeostasis. PMID:28204813

  18. Insulin-secretagogue, antihyperlipidemic and other protective effects of gallic acid isolated from Terminalia bellerica Roxb. in streptozotocin-induced diabetic rats.

    PubMed

    Latha, R Cecily Rosemary; Daisy, P

    2011-01-15

    Diabetes mellitus causes derangement of carbohydrate, protein and lipid metabolism, which eventually leads to a number of secondary complications. Terminalia bellerica is widely used in Indian medicine to treat various diseases, including diabetes. The present study was carried out to isolate and identify the putative antidiabetic compound from the fruit rind of T. bellerica and assess its chemico-biological interactions in experimental diabetic rat models. Bioassay-guided fractionation was followed to isolate the active compound; its structure was elucidated using (1)H and (13)C NMR, IR, UV and mass spectrometry, and the compound was identified as gallic acid (GA). GA isolated from T. bellerica and synthetic GA were administered to streptozotocin (STZ)-induced diabetic male Wistar rats at different doses for 28 days. Plasma glucose level was significantly (p<0.05) reduced in a dose-dependent manner when compared to the control. Histopathological examination of the pancreatic sections showed regeneration of the β-cells of the islets in GA-treated rats when compared to untreated diabetic rats. In addition, oral administration of GA (20 mg/kg bw) significantly decreased serum total cholesterol, triglyceride, LDL-cholesterol, urea, uric acid and creatinine, and at the same time markedly increased plasma insulin, C-peptide and glucose tolerance levels. GA also restored the total protein, albumin and body weight of diabetic rats to near normal. Thus our findings indicate that the gallic acid present in the fruit rind of T. bellerica is the active principle responsible for the regeneration of β-cells and the normalization of all the biochemical parameters related to the patho-biochemistry of diabetes mellitus, and hence it could be used as a potent antidiabetic agent. Copyright © 2010 Elsevier Ireland Ltd. All rights reserved.

  19. Probabilistic hazard assessment for skin sensitization potency by dose-response modeling using feature elimination instead of quantitative structure-activity relationships.

    PubMed

    Luechtefeld, Thomas; Maertens, Alexandra; McKim, James M; Hartung, Thomas; Kleensang, Andre; Sá-Rocha, Vanessa

    2015-11-01

    Supervised learning methods promise to improve integrated testing strategies (ITS), but must be adjusted to handle high dimensionality and dose-response data. ITS approaches are currently fueled by the increasing mechanistic understanding of adverse outcome pathways (AOP) and the development of tests reflecting these mechanisms. Simple approaches to combine skin sensitization data sets, such as weight of evidence, fail due to problems of information redundancy and high dimensionality. The problem is further amplified when potency information (dose/response) of hazards is to be estimated. Skin sensitization currently serves as the foster child for AOP and ITS development, as legislative pressures combined with a very good mechanistic understanding of contact dermatitis have led to test development and relatively large high-quality data sets. We curated such a data set and combined it with a recursive variable selection algorithm to evaluate the information available through in silico, in chemico and in vitro assays. Chemical similarity alone could not cluster chemicals' potency, and in vitro models consistently ranked high in recursive feature elimination. This allows reducing the number of tests included in an ITS. Next, we analyzed the data with a hidden Markov model that takes advantage of an intrinsic inter-relationship among the local lymph node assay classes, i.e. the monotonic connection between local lymph node assay class and dose. The dose-informed random forest/hidden Markov model was superior to the dose-naive random forest model on all data sets. Although the balanced accuracy improvement may seem small, this obscures the actual improvement in misclassifications, as the dose-informed hidden Markov model strongly reduced "false negatives" (i.e. extreme sensitizers predicted as non-sensitizers) on all data sets. Copyright © 2015 John Wiley & Sons, Ltd.
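
    Recursive feature elimination of the kind used here to rank in silico, in chemico and in vitro variables can be sketched with scikit-learn's RFE wrapper around a random forest; the feature names, data and number of retained features below are placeholders, not the curated dataset or the study's configuration.

```python
# Sketch of recursive feature elimination with a random forest, mirroring
# the variable-selection step described above.  Data and feature names are
# invented placeholders.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.feature_selection import RFE

rng = np.random.default_rng(7)
n = 120
features = ["DPRA_depletion", "KeratinoSens_EC1.5", "hCLAT_CD86",
            "logP", "mol_weight", "chem_similarity"]
X = rng.normal(size=(n, len(features)))
# Hypothetical binary potency label driven mainly by the first three columns.
y = (X[:, 0] + X[:, 1] + X[:, 2] + 0.2 * rng.normal(size=n) > 0).astype(int)

rfe = RFE(RandomForestClassifier(n_estimators=200, random_state=0),
          n_features_to_select=3)
rfe.fit(X, y)
for name, rank in sorted(zip(features, rfe.ranking_), key=lambda t: t[1]):
    print(f"{name:22s} rank {rank}")   # rank 1 = retained in the reduced ITS
```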

  20. Methodological Approaches toward Chemico-Biological Diagnostics of the State of Soils in Technogenically Transformed Territories

    NASA Astrophysics Data System (ADS)

    Fokina, A. I.; Dabakh, E. V.; Domracheva, L. I.; Skugoreva, S. G.; Lyalina, E. I.; Ashikhmina, T. Ya.; Zykova, Yu. N.; Leonova, K. A.

    2018-05-01

    The comprehensive diagnostics of the state of soils in the impact zone of the thermal power station (TPS-5) in the city of Kirov was performed on the basis of soil chemical analyses and the study of the biota response to the loads at different organization levels. The chemical analyses attested to a satisfactory state of the soils. However, the use of soil cyanobacteria and bird's-foot trefoil (Lotus corniculatus) as test objects showed the toxicity of the studied soil samples. The toxicity of the samples was judged from the bioindication effects of cyanophytization and melanization of soil microbial complexes. The obtained results demonstrated that, at relatively low concentrations of total and mobile heavy metal compounds in the soil samples, their amount released into the tested soil water (1:4) extract exceeded the limit allowable for normal functioning of living organisms. For the first time, the express cyanobacterial tetrazole-topographic method of biotesting was applied in a geoecological study to estimate the toxicity of the soil samples. The results obtained with the help of the traditional and express methods proved to be comparable. The express method was sufficiently sensitive and efficient. It allowed the determination of the samples' toxicity in five hours, i.e., four to five times faster than the traditional technique. An inverse relationship between the number of viable cells of cyanobacteria (as judged from the inclusion of formazan crystals) and the concentration of lead ions in the tested soil extracts was found. This finding can be considered a prerequisite for further study and application of the express method in the practice of geoecological monitoring. Our study demonstrated the necessity of a comprehensive approach for the assessment of the real ecological state of soils in the investigated impact zone of the thermal power station.

  1. Accounting for data variability, a key factor in in vivo/in vitro relationships: application to the skin sensitization potency (in vivo LLNA versus in vitro DPRA) example.

    PubMed

    Dimitrov, S; Detroyer, A; Piroird, C; Gomes, C; Eilstein, J; Pauloin, T; Kuseva, C; Ivanova, H; Popova, I; Karakolev, Y; Ringeissen, S; Mekenyan, O

    2016-12-01

    When searching for alternative methods to animal testing, confidently rescaling an in vitro result to the corresponding in vivo classification is still a challenging problem. Although one of the most important factors affecting good correlation is sample characteristics, they are very rarely integrated into correlation studies. Usually, in these studies, it is implicitly assumed that both compared values are error-free numbers, which they are not. In this work, we propose a general methodology to analyze and integrate data variability, and thus confidence estimation, when rescaling from one test to another. The methodology is demonstrated through the case study of rescaling the in vitro Direct Peptide Reactivity Assay (DPRA) reactivity to the in vivo Local Lymph Node Assay (LLNA) skin sensitization potency classifications. In a first step, a comprehensive statistical analysis evaluating the reliability and variability of the LLNA and DPRA as such was performed. These results allowed us to link the concepts of gray zones and confidence probability, which in turn represents a new perspective for more precise knowledge of the classification of chemicals within their in vivo OR in vitro test. Next, the novelty and practical value of our methodology, which introduces variability into the threshold optimization between the in vitro AND in vivo tests, reside in the fact that it attributes a confidence probability to the predicted classification. The methodology, classification and screening approach presented in this study are not restricted to skin sensitization only. They could also be helpful for fate, toxicity and health hazard assessment, where plenty of in vitro and in chemico assays and/or QSAR models are available. Copyright © 2016 John Wiley & Sons, Ltd.
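
    One way to read the confidence-probability idea: if an in vitro readout carries a known measurement standard deviation, the probability that the true value lies on either side of a classification threshold can be attached to the predicted class, and values whose probability is close to 0.5 fall into a gray zone. The sketch below is a generic illustration under a normal-error assumption; the threshold, standard deviation and measured values are invented and do not represent the authors' actual procedure.

```python
# Generic illustration: attaching a confidence probability to a threshold-
# based classification when the measurement has known variability.
# Threshold, standard deviation and measured values are assumptions.
from math import erf, sqrt

THRESHOLD = 20.0   # % mean peptide depletion separating classes (assumed)
SD = 4.0           # assay standard deviation (assumed)

def prob_above(measured, threshold=THRESHOLD, sd=SD):
    """P(true value > threshold) assuming normally distributed measurement error."""
    z = (measured - threshold) / sd
    return 0.5 * (1.0 + erf(z / sqrt(2.0)))

for depletion in (8.0, 18.0, 22.0, 35.0):
    p = prob_above(depletion)
    call = "positive" if p >= 0.5 else "negative"
    zone = "gray zone" if 0.3 < p < 0.7 else "confident"
    print(f"depletion {depletion:5.1f}% -> {call} (P = {p:.2f}, {zone})")
```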

  2. Arsenic removal by discontinuous ZVI two steps system for drinking water production at household scale.

    PubMed

    Casentini, Barbara; Falcione, Fabiano Teo; Amalfitano, Stefano; Fazi, Stefano; Rossetti, Simona

    2016-12-01

    Several countries in Europe still suffer from elevated arsenic (As) concentrations in groundwaters used for human consumption. In the case of households not connected to the distribution system, decentralized water supply systems, such as Point of Use (POU) and Point of Entry (POE), offer a direct benefit for the consumers. Field-scale ex-situ treatment systems based on metallic iron (ZVI) are already available for the production of reduced volumes of drinking water in remote areas (village scale). To address drinking water needs at a larger scale, we designed a pilot unit able to produce an elevated daily volume of water for human consumption. We tested the long-term As removal efficiency of a two-step ZVI treatment unit for the production of 400 L/day of clean water, based on the combination of the ZVI corrosion process with sedimentation and retention of freshly formed Fe precipitates. The system treated 100 μg/L As(V)-contaminated oxic groundwater in a discontinuous operation mode at a flow rate of 1 L/min for 31 days. Final removal was 77-96%, and the most performing step was the aeration/sedimentation (A/S) tank, with a 60-94% efficiency. Arsenic in the outflow slightly exceeded the drinking water limit of 10 μg/L only after 6000 L had been treated, and the Fe concentration was always below 0.2 mg/L. Under the proposed operating conditions ZVI passivation readily occurred and, as a consequence, Fe production sharply decreased. Arsenic associated with the particulate phase was 13-60% after the ZVI column and 37-100% after the A/S tank. Uniform amorphous clusters of Fe nanoparticles (100 nm) formed during aeration drove the As removal process, with an adsorption capacity corresponding to 20.5 mg As/g Fe. Research studies often focus only on chemico-physical aspects, disregarding the importance of biological processes that may co-occur and interfere with ZVI corrosion, As removal and safe water production. We explored the microbial transport dynamics by flow cytometry, which proved to be a suitable tool to monitor the fate of both single cells and bioactive particles along the treatment train of the pilot unit. A net release of bioactive particles, representing on average 26.5% of flow cytometric events, was promoted by the ZVI filter, with densities 10 times higher than those found in the inflow. In conclusion, the proposed system was efficient at treating large daily volumes of As-contaminated groundwater. However, filter design and operating conditions should be carefully adapted to the specific situation, since several key factors affect As removal efficiency. An effort should be made to optimize the ZVI filter design in order to reduce the rapid ZVI passivation observed and the low As adsorption capacity of the whole filter. More attention should be given to biomass retention and to the bioactive particles travelling within the unit, in order to elucidate bacterial influences on As removal efficiency and the related sanitary risks on a long-term basis. Copyright © 2016 Elsevier Ltd. All rights reserved.
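
    An adsorption capacity of the kind quoted above follows from a simple mass balance: arsenic removed from the treated volume divided by the mass of iron released by corrosion. The worked sketch below uses round, assumed numbers to show the calculation; apart from the 100 μg/L feed concentration stated above, none of the inputs are the study's measurements.

```python
# Worked mass-balance sketch (assumed numbers): adsorption capacity of the
# freshly formed Fe phases, q = As removed / Fe produced.
as_in_ug_L = 100.0       # inflow As, as stated for the feed water
as_out_ug_L = 10.0       # assumed mean outflow As
volume_L = 6000.0        # assumed treated volume
fe_produced_g = 26.0     # assumed mass of corrosion-derived Fe

as_removed_mg = (as_in_ug_L - as_out_ug_L) * volume_L / 1000.0   # μg -> mg
q_mg_as_per_g_fe = as_removed_mg / fe_produced_g
print(f"adsorption capacity ≈ {q_mg_as_per_g_fe:.1f} mg As / g Fe")
# With these assumed inputs the result (~20.8 mg/g) is of the same order as
# the 20.5 mg As/g Fe reported above.
```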

  3. The Pietra Grande thrust (Brenta Dolomites, Italy): looking for co-seismic indicators along a main fault in carbonate sequences

    NASA Astrophysics Data System (ADS)

    Viganò, Alfio; Tumiati, Simone; Martin, Silvana; Rigo, Manuel

    2013-04-01

    At present, pseudotachylytes (i.e. solidified frictional melts) are the only unambiguous geological record of seismic faulting. Although pseudotachylytes are frequently observed along faults within crystalline rocks, they have been found along carbonate faults in very few cases only, suggesting that chemico-physical processes other than melting could occur (e.g. thermal decomposition). In order to investigate possible co-seismic indicators, we study the Pietra Grande thrust, a carbonate fault in the Brenta Dolomites (Trentino, NE Italy), analysing the field structure, microtextures and composition of rocks from the principal slip plane, the fault core and the damage zone. The Pietra Grande thrust is developed within limestones and dolomitic limestones of Late Triassic-Early Jurassic age (Calcari di Zu and Monte Zugna Formations). The thrust, interpreted as a north-vergent décollement deeply connected with the major Cima Tosa thrust, is a sub-horizontal fault plane gently dipping to the north that mainly separates the massive Monte Zugna Fm. limestones (upper side) from the stratified Calcari di Zu Fm. limestones with intercalated marls (lower side). On the western face of the Pietra Grande klippe the thrust is continuously well exposed for about 1 km. The main fault plane shows reddish infillings, which form veins with thicknesses between a few millimetres and several decimetres. These red veins lie parallel to the thrust plane or, in some cases, inject lateral fractures and minor high-angle faults departing from the main fault plane. The veins have a carbonate composition and show textures characterized by a fine-grained reddish matrix with embedded carbonate clasts of different sizes (from a few millimetres to centimetres). In some portions, decimetre-sized carbonate boulders are embedded in the red matrix, while the clast content generally decreases significantly at the vein borders (chilled margins). Red veins are typically associated with cohesive cataclasites and/or breccias of the fault zone. Host and fault rocks are locally folded, with fold axes having a roughly E-W direction compatible with simultaneous thrust activation, suggesting deformation under brittle-ductile conditions. A late brittle deformation is evidenced by near-vertical fractures and strike-slip faults (WNW-directed) intersecting the whole thrust system. Field structure, microtextures, and the chemical and mineralogical compositions of host rocks, cataclasites and breccias are analysed. In particular, the red veins are carefully compared with the very similar Grigne carbonate pseudotachylytes (Viganò et al. 2011, Terra Nova, vol. 23, pp. 187-194), in order to evaluate whether they could represent a definite geological record of seismic faulting along the Pietra Grande thrust.

  4. Combined chemical and physical transformation method with RbCl and sepiolite for the transformation of various bacterial species.

    PubMed

    Ren, Jun; Lee, Haram; Yoo, Seung Min; Yu, Myeong-Sang; Park, Hansoo; Na, Dokyun

    2017-04-01

    DNA transformation, which delivers plasmid DNAs into bacterial cells, is fundamental in genetic manipulation for engineering and studying bacteria. The transformation methods developed to date are optimized for specific bacterial species to achieve high efficiency, so there is a continuing demand for simple and species-independent transformation methods. We herein describe the development of a chemico-physical transformation method that combines a rubidium chloride (RbCl)-based chemical method and a sepiolite-based physical method, and report its use for the simple and efficient delivery of DNA into various bacterial species. Using this method, the best transformation efficiency for Escherichia coli DH5α was 4.3×10^6 CFU/μg of pUC19 plasmid, which is higher than or comparable to the transformation efficiencies reported to date. This method also allowed the introduction of plasmid DNAs into Bacillus subtilis (5.7×10^3 CFU/μg of pSEVA3b67Rb), Bacillus megaterium (2.5×10^3 CFU/μg of pSPAsp-hp), Lactococcus lactis subsp. lactis (1.0×10^2 CFU/μg of pTRKH3-ermGFP), and Lactococcus lactis subsp. cremoris (2.2×10^2 CFU/μg of pMSP3535VA). Remarkably, even when the conventional chemical and physical methods failed to generate transformed cells in Bacillus sp. and Enterococcus faecalis, E. malodoratus and E. mundtii, our combined method showed significant transformation efficiencies (2.4×10^4, 4.5×10^2, 2×10^1 and 0.5×10^1 CFU/μg of plasmid DNA, respectively). Based on our results, we anticipate that our simple and efficient transformation method should prove useful for introducing DNA into various bacterial species without complicated optimization of the parameters affecting DNA entry into the cell. Copyright © 2017. Published by Elsevier B.V.

  5. Investigations on the Interactions of 5-Fluorouracil with Herring Sperm DNA: Steady State/Time Resolved and Molecular Modeling Studies

    NASA Astrophysics Data System (ADS)

    Chinnathambi, Shanmugavel; Karthikeyan, Subramani; Velmurugan, Devadasan; Hanagata, Nobutaka; Aruna, Prakasarao; Ganesan, Singaravelu

    2015-04-01

    In the present study, the interaction of 5-Fluorouracil with herring sperm DNA is reported using spectroscopic and molecular modeling techniques. This binding study of 5-FU with hs-DNA, carried out without altering the original structure of the DNA, is of paramount importance in understanding chemico-biological interactions for drug design, pharmacy and biochemistry. The challenge of the study was to find the exact binding mode of the drug 5-Fluorouracil with hs-DNA. From the absorption studies, a hyperchromic effect was observed for the herring sperm DNA in the presence of 5-Fluorouracil, and a binding constant of 6.153 × 10^3 M^-1 indicates a weak interaction between 5-Fluorouracil and herring sperm DNA. Ethidium bromide-loaded herring sperm DNA showed quenching of the fluorescence intensity after the addition of 5-Fluorouracil. The binding constants for 5-Fluorouracil with DNA and the competitive binding of 5-FU with the DNA-EB system were examined by fluorescence spectroscopy. The Stern-Volmer plots and fluorescence lifetime results confirm the static quenching nature of the drug-DNA complex. The binding constant Kb (2.5 × 10^4 L mol^-1) and the number of binding sites (1.17) for 5-FU on the DNA system were calculated using a double logarithmic plot. From the Förster nonradiative energy transfer study, the distance between 5-FU and DNA was found to be 4.24 nm. In addition to the spectroscopic results, the molecular modeling studies also revealed major groove binding as well as a partial intercalation mode of binding between 5-Fluorouracil and herring sperm DNA; binding energies of -6.04 kcal mol^-1 and -6.31 kcal mol^-1 were calculated from the modeling studies. All of the evidence indicates that the binding modes between 5-Fluorouracil and DNA are groove binding and partial intercalation.
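
    The double logarithmic treatment mentioned above follows log[(F0 - F)/F] = log Kb + n log[Q]. A minimal sketch of that regression is shown below, assuming hypothetical fluorescence quenching data; the concentrations and intensities are placeholders, not the study's measurements.

      import numpy as np

      # Double-logarithmic analysis: log10[(F0 - F)/F] = log10(Kb) + n * log10([Q]).
      # The concentrations and intensities below are illustrative placeholders.
      F0 = 100.0                                    # fluorescence without quencher (a.u.)
      Q = np.array([2e-5, 4e-5, 6e-5, 8e-5, 1e-4])  # 5-FU concentrations, mol/L
      F = np.array([92.0, 85.5, 79.8, 74.9, 70.5])  # fluorescence at each concentration

      x = np.log10(Q)
      y = np.log10((F0 - F) / F)

      n, log_Kb = np.polyfit(x, y, 1)               # slope = n, intercept = log10(Kb)
      Kb = 10 ** log_Kb

      print(f"binding sites n = {n:.2f}")
      print(f"binding constant Kb = {Kb:.3g} L/mol")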

  6. Codominance of Lactobacillus plantarum and obligate heterofermentative lactic acid bacteria during sourdough fermentation.

    PubMed

    Ventimiglia, Giusi; Alfonzo, Antonio; Galluzzo, Paola; Corona, Onofrio; Francesca, Nicola; Caracappa, Santo; Moschetti, Giancarlo; Settanni, Luca

    2015-10-01

    Fifteen sourdoughs produced in western Sicily (southern Italy) were analysed by classical methods for their chemico-physical characteristics and levels of lactic acid bacteria (LAB). pH and total titratable acidity (TTA) were mostly in the range commonly reported for similar products produced in Italy, but the fermentation quotient (FQ) of the majority of samples was above 4.0, due to the low concentration of acetic acid estimated by high performance liquid chromatography (HPLC). Specific counts of LAB showed levels higher than 10^8 CFU g^-1 for many samples. Colonies representing the various morphologies were isolated and, after differentiation based on phenotypic characteristics, divided into 10 groups. The most numerous group was composed of facultative heterofermentative isolates, indicating the relevance of this bacterial group during fermentation. Genetic analysis by randomly amplified polymorphic DNA (RAPD)-PCR, 16S rRNA gene sequencing and species-specific PCRs identified 33 strains as Lactobacillus plantarum, Lactobacillus curvatus and Lactobacillus graminis. Owing to the consistent presence of L. plantarum, it was concluded that this species codominates with obligate heterofermentative LAB in sourdough production in this geographical area. To evaluate the performance underlying their fitness, the 29 L. plantarum strains were investigated for several technological traits. Twelve cultures showed good acidifying abilities in vitro, and L. plantarum PON100148 produced the highest concentrations of organic acids. Eleven strains were positive for extracellular protease activity. Bacteriocin-like inhibitory substance (BLIS) production and antifungal activity were scored positive for several strains, including L. plantarum PON100148, which was selected as the starter for experimental sourdough production. The characteristics of the sourdoughs and the resulting breads indicated that the best products were obtained in the presence of L. plantarum PON100148. Copyright © 2015 Elsevier Ltd. All rights reserved.

  7. [The possible uses of balneotherapy in treating chronic venous insufficiency of lower limbs].

    PubMed

    Petraccia, L; Mennuni, G; Fontana, M; Nocchi, S; Libri, F; Conte, S; Alhadeff, A; Romano, B; Messini, F; Grassi, M; Fraioli, A

    2013-01-01

    Chronic Venous Insufficiency (CVI) of the lower limbs is a widespread disease, with an increasing incidence as a consequence of longer life expectancy, lifestyle, obesity, smoking, the use of drugs such as oestrogens and progestins, and working conditions. Medical therapy still lacks evidence of efficacy, and compression therapy is useful only in preventing a worsening of the condition. Surgical treatment is the only radical therapy effective in the advanced phases of the disease. In this context, spa balneotherapy can be considered a possible option to improve some subjective and objective symptoms of CVI of the lower limbs and to prevent worsening of the condition. The authors performed a review of the relevant scientific literature concerning the treatment of CVI of the lower limbs with mineral water balneotherapy, in order to evaluate its effects on objective and subjective symptoms and its effectiveness in preventing further worsening. We searched the PubMed/Medline, Cochrane Library, Embase and Web of Science databases for articles published between 1990 and 2011 on this topic. To this end, the authors selected a few controlled clinical and case-control studies in which patients affected by CVI of the lower limbs were treated with balneotherapy at health spas with sulphureous, sulphate, salsojodic or salsobromojodic mineral waters. Baths in mineral waters were often combined with hydromassage therapy and vascular pathway treatments. The effects of spa balneotherapy are related to some non-specific properties, such as hydrostatic pressure, osmotic pressure and water temperature, and in part to the specific chemico-physical properties of the mineral water used. The controlled clinical studies on spa therapy showed significant improvement of subjective symptoms (such as itch, paresthesias, pain and heaviness) and objective signs (namely edema and skin dyschromias). These studies suggest that spa balneotherapy may offer a good chance of secondary prevention and effective therapy of CVI of the lower limbs, but also that further controlled clinical trials are needed.

  8. Dietary Effects of Oregano (Origanum Vulgaris L.) Plant or Sweet Chestnut (Castanea Sativa Mill.) Wood Extracts on Microbiological, Chemico-Physical Characteristics and Lipid Oxidation of Cooked ham During Storage.

    PubMed

    Ranucci, David; Miraglia, Dino; Trabalza-Marinucci, Massimo; Acuti, Gabriele; Codini, Michela; Ceccarini, Maria Rachele; Forte, Claudio; Branciari, Raffaella

    2015-11-02

    The aim of this study was to evaluate the effect of feeding pigs diets enriched with sweet chestnut wood (Castanea sativa Mill.) or oregano (Origanum vulgaris L.) extract on the microbiological and chemical characteristics of cooked pork ham. Three groups of 10 pigs were fed a control diet (CTRL), the CTRL diet enriched with 0.2% oregano extract (OR), or the CTRL diet enriched with 0.2% sweet chestnut wood extract (SCW), respectively. Six cooked hams per group were produced, sliced and packaged under a modified atmosphere (N2:CO2=80:20) and stored at refrigeration temperature (4±1°C). Three packages per cooked ham were sampled for analysis at three different storage times (0, 10 and 20 days). At day 0, the antioxidant capacity of the products (ORAC-FL assay) and the chemical composition were determined. At each sampling time, the following analyses were performed on all samples: total microbial count (TMC), lactic acid bacteria count (LAB), Enterobacteriaceae count, Listeria monocytogenes, pH value, colour coordinates (L*, a*, b*), total basic volatile nitrogen (TBVN) and thiobarbituric acid reactive substances (TBARS). No differences in TMC, LAB and Enterobacteriaceae counts, pH, TBVN, chemical composition or L* values were registered among the three groups at any of the sampling times considered. No Listeria monocytogenes was detected in the samples tested. Significant differences were registered for ORAC-FL at day 0, and for a*, b* and TBARS values at 10 and 20 days of storage, with higher ORAC-FL, a* and b* values and lower TBARS values in SCW and OR than in CTRL. No antimicrobial effect could be recorded for OR and SCW, but higher oxidative stability, also reflected in the colour maintenance, was observed in both OR and SCW.

  9. Multivariate models for prediction of human skin sensitization hazard.

    PubMed

    Strickland, Judy; Zang, Qingda; Paris, Michael; Lehmann, David M; Allen, David; Choksi, Neepa; Matheson, Joanna; Jacobs, Abigail; Casey, Warren; Kleinstreuer, Nicole

    2017-03-01

    One of the Interagency Coordinating Committee on the Validation of Alternative Methods' (ICCVAM) top priorities is the development and evaluation of non-animal approaches to identify potential skin sensitizers. The complexity of the biological events necessary to produce skin sensitization suggests that no single alternative method will replace the currently accepted animal tests. ICCVAM is evaluating an integrated approach to testing and assessment based on the adverse outcome pathway for skin sensitization that uses machine learning approaches to predict human skin sensitization hazard. We combined data from three in chemico or in vitro assays - the direct peptide reactivity assay (DPRA), human cell line activation test (h-CLAT) and KeratinoSens™ assay - six physicochemical properties and an in silico read-across prediction of skin sensitization hazard into 12 variable groups. The variable groups were evaluated using two machine learning approaches, logistic regression and support vector machine, to predict human skin sensitization hazard. Models were trained on 72 substances and tested on an external set of 24 substances. The six models (three logistic regression and three support vector machine) with the highest accuracy (92%) used: (1) DPRA, h-CLAT and read-across; (2) DPRA, h-CLAT, read-across and KeratinoSens; or (3) DPRA, h-CLAT, read-across, KeratinoSens and log P. The models performed better at predicting human skin sensitization hazard than the murine local lymph node assay (accuracy 88%), any of the alternative methods alone (accuracy 63-79%) or test batteries combining data from the individual methods (accuracy 75%). These results suggest that computational methods are promising tools for identifying potential human skin sensitizers effectively without animal testing. Published 2016. This article has been contributed to by US Government employees and their work is in the public domain in the USA.
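
    As an illustration of the modeling approach described above, the following sketch combines binary assay calls and log P into logistic regression and support vector machine classifiers with scikit-learn. The feature layout and 72/24 split only loosely mirror the abstract; the values are random placeholders, not the ICCVAM dataset, and the variable names are hypothetical.

      import numpy as np
      from sklearn.linear_model import LogisticRegression
      from sklearn.svm import SVC
      from sklearn.model_selection import train_test_split
      from sklearn.metrics import accuracy_score

      rng = np.random.default_rng(0)
      n = 96  # 72 training + 24 test substances, echoing the study design

      X = np.column_stack([
          rng.integers(0, 2, n),      # DPRA call (0 = negative, 1 = positive)
          rng.integers(0, 2, n),      # h-CLAT call
          rng.integers(0, 2, n),      # KeratinoSens call
          rng.integers(0, 2, n),      # read-across prediction
          rng.normal(2.0, 1.5, n),    # log P
      ])
      y = rng.integers(0, 2, n)       # human skin sensitization hazard (placeholder labels)

      X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=24, random_state=0)

      for model in (LogisticRegression(max_iter=1000), SVC(kernel="rbf")):
          model.fit(X_train, y_train)
          acc = accuracy_score(y_test, model.predict(X_test))
          print(type(model).__name__, f"test accuracy = {acc:.2f}")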

  10. Study of the Martian Subsurface with a Fiber Optics Spectrometer: the Ma_Miss Experiment

    NASA Astrophysics Data System (ADS)

    Coradini, A.; de Sanctis, M. C.; Ammannito, E.; Boccaccini, A.; Battistelli, E.; Capanni, A.

    2009-04-01

    This presentation describes the investigation we intend to carry out with a small imaging spectrometer that will be integrated into the drill of the ExoMars-Pasteur rover. The spectrometer is named Ma_Miss (Mars Multispectral Imager for Subsurface Studies). The Ma_Miss experiment is located in the drill, which will be able to bore into Martian soil and rock to a depth of up to 2 m. Ma_Miss includes the optical head of the spectrometer, a lamp to illuminate the borehole walls, and the optical fibre that brings the signal to the spectrometer. The multispectral images are acquired by means of a sapphire window placed on the lateral wall of the drill tool, as close as possible to the drill head. The images are gathered by means of an optical fibre system and analysed using the spectrometer. The light gathered by Ma_Miss, containing the scientific information, is transferred to the array detector and electronics of the instrument by means of an optical rotary joint implemented in the roto-translation group of the drill (the Ma_Miss-Dibs architecture). This experiment will be extremely valuable since it will allow, for the first time, an idea of the mineralogical composition of the Martian subsurface to be obtained and freshly cut rocks to be studied. The study of the surface and subsurface mineralogy of Martian soil and rocks is key to understanding the chemico-physical processes that led to the formation and evolution of the Red Planet. The history of water and other volatiles, as well as the signatures of weathering processes, are important for understanding present and past environmental conditions associated with the possibility of life. Surface samples are highly influenced by exogenous processes (weathering, erosion, sedimentation, impact) that alter their original properties. Thus, the analysis of uncontaminated samples by means of instrumented drills and in situ analytic stations is the key to an unambiguous interpretation of the original environment that led to the formation of the rocks. Analysis of subsurface layers is the only approach that guarantees measurements on samples close to their original composition. The upper few metres of the surface materials on Mars play a crucial role in its history, providing important geologic, hydrologic and climatic constraints on the history of the planet. Drilling into the near-surface crust will provide an opportunity to assess variations in composition, texture, stratification, unconformities, etc. that will help define its lithology and structure, and provide important clues regarding its origin and subsequent evolution. The subsurface material can give information on the evolution of surface sediments (erosion, transport, deposition), on the relation between sediments and bedrock, and on the relation between environmental conditions and surface processes, permitting us to "investigate planetary processes that influence habitability." Investigation of the mineralogical composition of near-surface geological materials is needed to fully characterize the geology of the regions that will be visited by the rover at all appropriate spatial scales, and to interpret the processes that have formed and modified rocks and regolith. Subsurface access, sampling material below the oxidized layer, can be the key to "assess the biological potential of the target environment (past or present)". To date, we have direct observations relative only to the Martian surface.
Little is known about the characteristics of the first subsurface layers. The possibility of sampling subsurface materials to be delivered to other instruments, and of recording the context of the sampled soil through in situ borehole mineralogical analysis, is fundamental to the search for traces of past or present life on Mars. The spectrometer observes a single point target, 0.1 mm in diameter, on the borehole wall surface. Depending on the surface features of interest, the observation window can scan the borehole's surface by means of drill tip rotation or translation. When the drill is translated, a "Column Image" is acquired; the translation step can be equal to the observation spot (0.1 mm). A "Ring Image" can be obtained by rotation of the drill tip; a rotation step of about 0.5° (corresponding to 720 acquisitions in the ring) is sufficient to ensure full coverage of the ring.

  11. Linking contemporary high resolution magnetic resonance imaging to the von Economo legacy: A study on the comparison of MRI cortical thickness and histological measurements of cortical structure.

    PubMed

    Scholtens, Lianne H; de Reus, Marcel A; van den Heuvel, Martijn P

    2015-08-01

    The cerebral cortex is a distinctive part of the mammalian nervous system, displaying a spatial variety in cyto-, chemico-, and myelinoarchitecture. As part of a rich history of histological findings, pioneering anatomists von Economo and Koskinas provided detailed mappings on the cellular structure of the human cortex, reporting on quantitative aspects of cytoarchitecture of cortical areas. Current day investigations into the structure of human cortex have embraced technological advances in Magnetic Resonance Imaging (MRI) to assess macroscale thickness and organization of the cortical mantle in vivo. However, direct comparisons between current day MRI estimates and the quantitative measurements of early anatomists have been limited. Here, we report on a simple, but nevertheless important cross-analysis between the histological reports of von Economo and Koskinas on variation in thickness of the cortical mantle and MRI derived measurements of cortical thickness. We translated the von Economo cortical atlas to a subdivision of the commonly used Desikan-Killiany atlas (as part of the FreeSurfer Software package and a commonly used parcellation atlas in studies examining MRI cortical thickness). Next, values of "width of the cortical mantle" as provided by the measurements of von Economo and Koskinas were correlated to cortical thickness measurements derived from high-resolution anatomical MRI T1 data of 200+ subjects of the Human Connectome Project (HCP). Cross-correlation revealed a significant association between group-averaged MRI measurements of cortical thickness and histological recordings (r = 0.54, P < 0.001). Further validating such a correlation, we manually segmented the von Economo parcellation atlas on the standardized Colin27 brain dataset and applied the obtained three-dimensional von Economo segmentation atlas to the T1 data of each of the HCP subjects. Highly consistent with our findings for the mapping to the Desikan-Killiany regions, cross-correlation between in vivo MRI cortical thickness and von Economo histology-derived values of cortical mantle width revealed a strong positive association (r = 0.62, P < 0.001). Linking today's state-of-the-art T1-weighted imaging to early histological examinations our findings indicate that MRI technology is a valid method for in vivo assessment of thickness of human cortex. © 2015 Wiley Periodicals, Inc.
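
    A minimal sketch of the kind of region-wise cross-correlation described above, assuming placeholder values rather than the published von Economo/Koskinas measurements or HCP-derived thicknesses; the region labels are hypothetical.

      import numpy as np
      from scipy.stats import pearsonr

      # Placeholder region-averaged values (mm); not the published measurements.
      regions = ["FA", "FB", "PC", "OA", "TE"]            # hypothetical region labels
      histology_mm = np.array([3.4, 3.0, 2.5, 2.2, 3.1])  # cortical mantle width (histology)
      mri_mm = np.array([3.1, 2.8, 2.4, 2.1, 2.9])        # group-averaged MRI thickness

      r, p = pearsonr(histology_mm, mri_mm)
      print(f"{len(regions)} regions: r = {r:.2f}, P = {p:.3g}")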

  12. The Sea Monitoring Virtual Research Community (VRC) in the EVER-EST Project (a virtual research environment for the Earth Sciences).

    NASA Astrophysics Data System (ADS)

    Foglini, Federica; Boero, Ferdinando; Guarino, Raffaele

    2016-04-01

    The EU's H2020 EVER-EST Project is dedicated to the realization of a Virtual Research Environment (VRE) for Earth Science researchers during 2015-2018. In this framework, Sea Monitoring represents one of the four use-case VRCs chosen to validate the EVER-EST e-infrastructure, which aims to represent a wide and multidisciplinary Earth Science domain. The objective of the Sea Monitoring Virtual Research Community (VRC) is to provide useful and applicable contributions to the identification and definition of the variables indicated by the European Commission in the Marine Directive under the framework for Good Environmental Status (GES). The European Marine Strategy Framework Directive (MSFD, http://ec.europa.eu/environment/marine/index_en.htm) has defined the descriptors for Good Environmental Status in marine waters. The first descriptor is biodiversity; the second is the presence of non-indigenous species, while the remaining nine (even when they consider physical, chemical or geological variables) require proper functioning of the ecosystem, linked to a good state of biodiversity. The Sea Monitoring VRC is directed at providing practical methods, procedures and protocols to support a coherent and widely accepted interpretation of Descriptors 1 (biodiversity), 2 (non-indigenous species), 4 (food webs) and 6 (seafloor integrity) identified for GES. In that context, the criteria and methodological standards already identified by the European Commission will be taken into account, while at the same time considering the activities and projects in progress in the marine framework. This search for practical methods to estimate and measure GES parameters requires close cooperation among different disciplines, including biologists, geologists, geophysicists, oceanographers, Earth observation experts and others. It will also require a number of different types of scientific data and observations (e.g. biological, chemico-physical, etc.) from different inputs and sensors (e.g. remote sensing, on-site buoys, marine stations, administrations, citizen observations, etc.). Furthermore, different communities require support and guidance to be able to effectively interoperate and share practices, methods, standards and terminologies. The EVER-EST VRE will provide the Sea Monitoring VRC user community with an innovative framework aimed at enhancing their ability to interoperate and share knowledge, experience and methods for GES assessment and monitoring. Furthermore, the Sea Monitoring VRC will focus on the implementation of Research Objects (ROs, semantically rich aggregations of resources bringing together data, documents and methods in scientific investigations) for GES assessment, to be shared among the wider sea monitoring community for the first time.

  13. Development of a high-performance composite cathode for LT-SOFC

    NASA Astrophysics Data System (ADS)

    Lee, Byung Wook

    The Solid Oxide Fuel Cell (SOFC) has drawn considerable attention for decades due to its high efficiency and low pollution, made possible because chemical energy is converted directly to electrical energy without combustion. However, successful commercialization of SOFC has been delayed by its high production cost, mainly related to the expensive interconnect materials and other structural components required for high-temperature operation. This is why intermediate-temperature (IT) or low-temperature (LT) SOFC, operating at 600~800°C or at 650°C and below, respectively, is of particular significance: it allows a wider selection of cheaper materials, such as stainless steel, for interconnects and the other structural components. Extended lifetime and system reliability are also expected owing to lower thermal stress through the system at reduced temperature, and more rapid start-up/shut-down procedures are another advantage of lowering the operating temperature. As a result, commercialization of SOFC becomes more viable. However, performance drops as the operating temperature is reduced, due to increased polarization resistances of the electrode electrochemical reactions and decreased electrolyte conductivity. Since the ohmic polarization of the electrolyte can be significantly reduced with state-of-the-art thin-film technology, and cathode polarization has a more drastic effect on total SOFC electrochemical performance than anode polarization as temperature decreases, the development of a high-performance cathode operating in the IT or LT range is essential. At the same time, the chemical stability of the cathode and its chemical compatibility with the electrolyte must also be considered, since instability and incompatibility of the cathode will likewise cause substantial performance loss. Based on the cathode requirements mentioned above, in this study several chemico-physical approaches were pursued to develop a high-performance composite cathode, in particular for LT-SOFC operating at 650°C and below, since the stability and compatibility of the materials of interest are secured at low temperatures. First, a nano-sized pyrochlore bismuth ruthenate (Bi2Ru2O7, or BRO7 for short), one of the promising cathode materials, was successfully synthesized using the glycine-nitrate combustion (GNC) route. Stoichiometric Bi2Ru2O7 without any impurity phase was achieved with a considerably improved processing condition, leading to a crystallite size of ~24 nm in diameter. Even though the resulting powder tends to agglomerate, giving an overall 200~400 nm size range, it still showed better quality than powder prepared by the solid-state (SS) reaction route followed by extra milling steps such as vibro-milling and sonication for further particle size reduction. The glycine-to-nitrate (G/N) ratio was found to play a critical role in determining the reaction temperature and reaction duration, and thus the phase purity and particle morphology (particle size, shape, agglomeration, etc.). Composite cathodes of the GNC-prepared BRO7 (GNC BRO7) combined with SS erbia-stabilized bismuth oxide (Bi1.6Er0.4O3, or ESB) showed better electrochemical performance than vibro-milled BRO7 (VM BRO7)-SS ESB. ASR values of 0.123 Ω cm² at 700°C and 4.59 Ω cm² at 500°C were achieved, which follows well the trend of the particle-size effect on the performance of composite cathodes. Additionally, the number of processing steps (and thus time) was reduced by the GNC route.
Several issues regarding the synthesis process and the characteristics of the BRO7 material itself are addressed in this dissertation. Secondly, a unique in-situ composite cathode synthesis was successfully developed and applied to BRO7-ESB composite cathodes to improve percolation and to reduce agglomeration of each phase inside the cathode, so that the effective triple phase boundary (TPB) length was extended. To disperse and stabilize ESB powder in de-ionized (DI) water, the zeta potential profile of ESB powder in DI water as a function of pH was first obtained. The effect of a dispersant (ammonium citrate dibasic) on the stability of ESB powder dispersed in DI water was also investigated. Knowledge of BRO7 wet chemical synthesis from the previous study was utilized for the final in-situ BRO7-ESB composite cathodes. The composite particles thus prepared were characterized, and the electrochemical performance of the in-situ BRO7-ESB composite cathodes was examined as well. Performance enhancement was observed, with ASR values of 0.097 Ω cm² and 3.58 Ω cm² achieved at 700°C and 500°C, respectively, improvements of 19% and 22% compared with those of conventionally mixed BRO7-ESB composite cathodes. Finally, a highly controlled nanostructured BRO7-ESB composite cathode was developed by infiltration of BRO7 onto ESB scaffolds to maximize the effective TPB length, to improve the connectivity of the ESB phase inside the cathode for better oxygen-ion diffusion, and to minimize delamination between the electrolyte and cathode layers. The ESB scaffolds were first established by adding a graphite pore-former and controlling the heat treatment conditions. Nano-sized BRO7 particles were successfully created on the surface of the previously formed ESB scaffold by infiltration of a concentrated (Bi, Ru) nitrate solution followed by an optimized heat treatment. The composite cathodes thus prepared exhibited electrochemical performance superior to conventionally made BRO7-ESB composite cathodes and even better than the GNC BRO7-SS ESB developed in this dissertation, e.g. 0.073 Ω cm² at 700°C and 1.82 Ω cm² at 500°C, respectively. This cathode system proved highly competitive among all the reported composite cathodes consisting of the same or different materials prepared by various processing techniques. It was demonstrated that the extended TPB length arising from the continuous network of BRO7 nanoparticles and the better connectivity of the ESB scaffolds enabled the outstanding performance. Moreover, delamination of the cathode from the electrolyte was prevented thanks to improved adhesion between the ESB scaffolds and the ESB electrolyte. Dissociative adsorption of oxygen gas was proposed to be the dominant rate-determining process for the overall oxygen reduction reaction at low temperatures (500-600°C), whereas all of the constituent sub-reactions, such as oxygen gas dissociative adsorption, oxygen ion diffusion towards the TPB region, and oxygen ion incorporation, were found to play competing roles in the overall reaction at relatively high operating temperatures (650-700°C), based on analysis of the impedance spectra.

  14. EDITORIAL: A support for prospective nanomaterials A support for prospective nanomaterials

    NASA Astrophysics Data System (ADS)

    Demming, Anna

    2011-01-01

    In the early 1990s, scanning probe microscopy was empowering researchers to view nanoscale features, previously the domain of imaginative theorists. Such images undoubtedly sparked the imagination and goaded researchers into ever more creative endeavours to understand nanoscale systems. At the same time, reports on molecular self-assembly were revolutionizing the philosophy behind how such systems could be produced, and promoting genuine bottom-up fabrication technology. The nanoworld was not only available to be gazed at and probed, but could be recreated by imposing the right conditions for the self assembly of complex and controlled structures. Many creative variants of self-assembly processes have been investigated. Researchers in Germany demonstrated that block copolymer micelle nanolithography could be used to generate nanostructured interfaces, where the pattern dimension and geometry is controlled by a combination of the self-assembly of block copolymer micelles with pre-structures formed by photo or electron-beam lithography [1]. The building blocks of life, DNA molecules, have also bred novel synthesis techniques as described in work by researchers in Finland, in a study of two different techniques of 'DNA origami' for the fabrication of complex protein structures [2]. Some fascinating properties have been revealed in self-assembled structures. Researchers in the US demonstrated extraordinary transmission in the infrared using self-assembled monolayers, phospholipid bilayers, and membrane-bound proteins on a subwavelength metallic array [3]. The surface plasmon properties of the arrays are accentuated by stacking them one upon the other, thus enabling extraordinary transmission and providing the basis of a nanospaced capacitive sensor. So-called nanoscaffolds have been used to promote assembly of biological matter in organized forms. A particularly inspiring application of nanoscaffolds has been found in nerve regeneration. Researchers in Singapore demonstrated the potential of biodegradable poly(L-lactide-co-glycolide) nanofibres to guide axon regeneration in vivo [4] and showed that aligned nanofibrous poly(l-lactic acid) scaffolds could be used as a potential cell carrier in neural tissue engineering [5]. In this issue, researchers in Italy demonstrate the use of magnetic bio-hybrid porous scaffolds for nucleating nano-apatite in situ on self-assembling collagen in the presence of magnetite nano-particles [6]. The magnetic nanoparticles provide a sort of crosslinking agent for the collagen, inducing a chemico-physical-mechanical stabilization of the material and allowing control of the porosity of the scaffold network. The work contributes towards developing assistance to bone regeneration guided by an external magnetic field. Another application of nanoscaffolds is in the development of hydrogen storage technology. Scaffolds can be used to help avoid aggregation of hydrogen storage nanoparticles and aid efficient cycling of storage and release [7, 8]. For more on hydrogen storage and other work on developing energy sources using nanotechnology, keep an eye out for our new Energy section to be launched this spring, 2011. The section will consider both the technological aspects and fundamental physics associated with innovations in the energy industry that exploit the properties of nanoscale structures. As we usher in the new year we can be sure that nanotechnology will remain a topic at the forefront of research agendas. 
There has been much hype over developments in nanotechnology over the years, and without a doubt the smart self-assembly of complex systems and materials has provoked awe of both a positive and negative nature in its time. Great progress has been made in advancing our control over promoting and guiding the self-assembly of biological and industrial materials. The benefits available in applications of such research in medicine, renewable energy and many other industries are evident. Richard Feynman, often touted as a pioneer of a 'bottom-up' approach to nanotechnology, once said 'I was born not knowing and have had only a little time to change that here and there'. In a similar sense, what we do not yet know and understand about the ability to create and manipulate nanosystems is apparently infinite, but as can also be conceded in the case of Richard Feynman, the 'little' we have had time to learn so far holds more than a little promise. References [1] Glass R, Möller M and Spatz J P 2003 Nanotechnology 14 314 [2] Kuzyk A, Laitinen K T and Törmä P 2009 Nanotechnology 20 235305 [3] Williams S M, Rodriguez K R, Teeters-Kennedy S, Shah S, Rogers T M, Stafford A D and Coe J V 2004 Nanotechnology 15 S495 [4] Bini T B, Gao S, Tan T C, Wang S, Lim A, Hai L B and Ramakrishna S 2004 Nanotechnology 15 1459 [5] Yang F, Murugan R, Wang S and Ramakrishna S 2005 Biomaterials 26 2603-10 [6] Tampieri A, Landi E, Valentini F, Sandri M, D'Alessandro T, Dediu V and Marcacci M 2011 Nanotechnology 22 015104 [7] Gross A F, Ahn C C, Van Atta S L, Liu P and Vajo J J 2009 Nanotechnology 20 204005 [8] Zhang S, Gross A F, Van Atta S L, Lopez M, Liu P, Ahn C C, Vajo J J and Jensen C M 2009 Nanotechnology 20 204027

  15. Transforming Collaborative Process Models into Interface Process Models by Applying an MDA Approach

    NASA Astrophysics Data System (ADS)

    Lazarte, Ivanna M.; Chiotti, Omar; Villarreal, Pablo D.

    Collaborative business models among enterprises require defining collaborative business processes. Enterprises implement B2B collaborations to execute these processes. In B2B collaborations the integration and interoperability of processes and systems of the enterprises are required to support the execution of collaborative processes. From a collaborative process model, which describes the global view of the enterprise interactions, each enterprise must define the interface process that represents the role it performs in the collaborative process in order to implement the process in a Business Process Management System. Hence, in this work we propose a method for the automatic generation of the interface process model of each enterprise from a collaborative process model. This method is based on a Model-Driven Architecture to transform collaborative process models into interface process models. By applying this method, interface processes are guaranteed to be interoperable and defined according to a collaborative process.
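
    As a rough illustration of the projection step described above (and not the authors' MDA transformation or metamodel), the sketch below derives one enterprise's interface view from a global collaborative process model by keeping only the interactions in which that enterprise's role participates; the interaction names and roles are hypothetical.

      from dataclasses import dataclass

      @dataclass
      class Interaction:
          """One message exchange in the global (collaborative) process view."""
          name: str
          sender: str
          receiver: str

      # Hypothetical collaborative process model: the global view of interactions.
      collaborative_process = [
          Interaction("SendForecast", "Customer", "Supplier"),
          Interaction("SendPlan", "Supplier", "Customer"),
          Interaction("NotifyCarrier", "Supplier", "Carrier"),
      ]

      def interface_process(model, role):
          """Keep only interactions involving `role`, annotated as send or receive."""
          view = []
          for i in model:
              if i.sender == role:
                  view.append((i.name, "send"))
              elif i.receiver == role:
                  view.append((i.name, "receive"))
          return view

      print(interface_process(collaborative_process, "Supplier"))
      # [('SendForecast', 'receive'), ('SendPlan', 'send'), ('NotifyCarrier', 'send')]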

  16. Extensible packet processing architecture

    DOEpatents

    Robertson, Perry J.; Hamlet, Jason R.; Pierson, Lyndon G.; Olsberg, Ronald R.; Chun, Guy D.

    2013-08-20

    A technique for distributed packet processing includes sequentially passing packets associated with packet flows between a plurality of processing engines along a flow-through data bus linking the plurality of processing engines in series. At least one packet within a given packet flow is marked by a given processing engine to signify to the other processing engines that the given processing engine has claimed the given packet flow for processing. A processing function is applied to each of the packet flows within the processing engines, and the processed packets are output on a time-shared, arbitrated data bus coupled to the plurality of processing engines.
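
    A minimal sketch of the flow-claiming idea described in this record, with a deliberately simple capacity-based claiming policy standing in for whatever criterion a real implementation would use; the classes, names and policy are assumptions, not the patented design.

      class Engine:
          """One processing engine on the serial flow-through bus."""

          def __init__(self, name, capacity=1):
              self.name = name
              self.capacity = capacity        # max flows this engine claims (placeholder policy)
              self.claimed = set()

          def handle(self, packet):
              flow = packet["flow"]
              if flow in self.claimed:
                  packet["claimed_by"] = self.name      # mark so downstream engines skip the flow
              elif packet.get("claimed_by") is None and len(self.claimed) < self.capacity:
                  packet["claimed_by"] = self.name      # claim the unclaimed flow by marking a packet
                  self.claimed.add(flow)
              if flow in self.claimed:
                  packet["processed_by"] = self.name    # apply this engine's processing function
              return packet                             # pass the packet along the bus

      engines = [Engine("E0"), Engine("E1"), Engine("E2")]
      packets = [{"flow": f, "payload": i} for i, f in enumerate("AABBA")]

      for pkt in packets:
          for eng in engines:                           # engines linked in series
              pkt = eng.handle(pkt)
          print(pkt["flow"], "processed by", pkt["processed_by"])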

  17. Processed and ultra-processed foods are associated with lower-quality nutrient profiles in children from Colombia.

    PubMed

    Cornwell, Brittany; Villamor, Eduardo; Mora-Plazas, Mercedes; Marin, Constanza; Monteiro, Carlos A; Baylin, Ana

    2018-01-01

    To determine if processed and ultra-processed foods consumed by children in Colombia are associated with lower-quality nutrition profiles than less processed foods. We obtained information on sociodemographic and anthropometric variables and dietary information through dietary records and 24 h recalls from a convenience sample of the Bogotá School Children Cohort. Foods were classified into three categories: (i) unprocessed and minimally processed foods, (ii) processed culinary ingredients and (iii) processed and ultra-processed foods. We also examined the combination of unprocessed foods and processed culinary ingredients. Representative sample of children from low- to middle-income families in Bogotá, Colombia. Children aged 5-12 years in 2011 Bogotá School Children Cohort. We found that processed and ultra-processed foods are of lower dietary quality in general. Nutrients that were lower in processed and ultra-processed foods following adjustment for total energy intake included: n-3 PUFA, vitamins A, B12, C and E, Ca and Zn. Nutrients that were higher in energy-adjusted processed and ultra-processed foods compared with unprocessed foods included: Na, sugar and trans-fatty acids, although we also found that some healthy nutrients, including folate and Fe, were higher in processed and ultra-processed foods compared with unprocessed and minimally processed foods. Processed and ultra-processed foods generally have unhealthy nutrition profiles. Our findings suggest the categorization of foods based on processing characteristics is promising for understanding the influence of food processing on children's dietary quality. More studies accounting for the type and degree of food processing are needed.

  18. Dynamic control of remelting processes

    DOEpatents

    Bertram, Lee A.; Williamson, Rodney L.; Melgaard, David K.; Beaman, Joseph J.; Evans, David G.

    2000-01-01

    An apparatus and method of controlling a remelting process by providing measured process variable values to a process controller; estimating process variable values using a process model of a remelting process; and outputting estimated process variable values from the process controller. Feedback and feedforward control devices receive the estimated process variable values and adjust inputs to the remelting process. Electrode weight, electrode mass, electrode gap, process current, process voltage, electrode position, electrode temperature, electrode thermal boundary layer thickness, electrode velocity, electrode acceleration, slag temperature, melting efficiency, cooling water temperature, cooling water flow rate, crucible temperature profile, slag skin temperature, and/or drip short events are employed, as are parameters representing physical constraints of electroslag remelting or vacuum arc remelting, as applicable.
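
    The control structure described above pairs a process model (for estimating process variable values) with feedback and feedforward adjustment of the inputs. The sketch below illustrates that structure for a single manipulated input (process current) and a single estimated variable (melt rate); the model, gains and numbers are placeholder assumptions, not values from the patent.

      # All dynamics, gains and numbers are placeholder assumptions.
      melt_rate_setpoint = 5.0          # desired melt rate, kg/min
      k_model = 8.5e-4                  # assumed model gain, kg/min per ampere of process current
      kp_feedback = 200.0               # proportional feedback gain, A per (kg/min) of error
      feedforward = melt_rate_setpoint / k_model   # current suggested by the process model alone

      current = 6000.0                  # manipulated input: process current, A
      weight = 1000.0                   # measured electrode weight, kg
      prev_weight = None

      for minute in range(5):
          melt_rate_model = k_model * current       # model-based estimate of the melt rate
          weight -= melt_rate_model                 # simulated electrode weight measurement
          if prev_weight is not None:
              melt_rate_meas = prev_weight - weight # melt rate inferred from weight change
              error = melt_rate_setpoint - melt_rate_meas
              current = feedforward + kp_feedback * error   # feedforward plus feedback correction
          prev_weight = weight
          print(f"t={minute} min  current={current:7.1f} A  model rate={melt_rate_model:.2f} kg/min")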

  19. On Intelligent Design and Planning Method of Process Route Based on Gun Breech Machining Process

    NASA Astrophysics Data System (ADS)

    Hongzhi, Zhao; Jian, Zhang

    2018-03-01

    The paper presents an approach to the intelligent design and planning of process routes based on the gun breech machining process, addressing several problems such as the complexity of gun breech machining, the tedious nature of route design, and the long lead time of the traditional, hard-to-manage process route. Based on the gun breech machining process, an intelligent process route design and planning system was developed using DEST and VC++. The system includes two functional modules: intelligent process route design and process route planning. The intelligent route design module, through analysis of the gun breech machining process, summarizes breech process knowledge to complete the design of the knowledge base and inference engine, from which the gun breech process route is output automatically. On the basis of the intelligent route design module, the final process route is made, edited and managed in the process route planning module.

  20. Gasoline from coal in the state of Illinois: feasibility study. Volume I. Design. [KBW gasification process, ICI low-pressure methanol process and Mobil M-gasoline process

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1980-01-01

    Volume 1 describes the proposed plant: the KBW gasification process, the ICI low-pressure methanol process and the Mobil M-gasoline process, together with ancillary processes such as the oxygen plant, shift process, RECTISOL purification process, sulfur recovery equipment and pollution control equipment. Numerous engineering diagrams are included. (LTN)

  1. Performing a local reduction operation on a parallel computer

    DOEpatents

    Blocksome, Michael A; Faraj, Daniel A

    2013-06-04

    A parallel computer including compute nodes, each including two reduction processing cores, a network write processing core, and a network read processing core, each processing core assigned an input buffer. Copying, in interleaved chunks by the reduction processing cores, contents of the reduction processing cores' input buffers to an interleaved buffer in shared memory; copying, by one of the reduction processing cores, contents of the network write processing core's input buffer to shared memory; copying, by another of the reduction processing cores, contents of the network read processing core's input buffer to shared memory; and locally reducing in parallel by the reduction processing cores: the contents of the reduction processing core's input buffer; every other interleaved chunk of the interleaved buffer; the copied contents of the network write processing core's input buffer; and the copied contents of the network read processing core's input buffer.

  2. Performing a local reduction operation on a parallel computer

    DOEpatents

    Blocksome, Michael A.; Faraj, Daniel A.

    2012-12-11

    A parallel computer including compute nodes, each including two reduction processing cores, a network write processing core, and a network read processing core, each processing core assigned an input buffer. Copying, in interleaved chunks by the reduction processing cores, contents of the reduction processing cores' input buffers to an interleaved buffer in shared memory; copying, by one of the reduction processing cores, contents of the network write processing core's input buffer to shared memory; copying, by another of the reduction processing cores, contents of the network read processing core's input buffer to shared memory; and locally reducing in parallel by the reduction processing cores: the contents of the reduction processing core's input buffer; every other interleaved chunk of the interleaved buffer; the copied contents of the network write processing core's input buffer; and the copied contents of the network read processing core's input buffer.
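
    The sketch below illustrates the interleaved-copy-then-reduce pattern described in these two records, using NumPy arrays as a sequential stand-in for shared memory and the four processing cores; the buffer sizes, chunk size and the summation reduction are assumptions for illustration, not the patented implementation.

      import numpy as np

      CHUNK = 4
      buf_red0 = np.arange(16)            # reduction core 0 input buffer
      buf_red1 = np.arange(16) * 2        # reduction core 1 input buffer
      buf_netw = np.ones(16, dtype=int)   # network write core input buffer
      buf_netr = np.full(16, 3)           # network read core input buffer

      # Copy the two reduction buffers, in interleaved chunks, to "shared memory".
      interleaved = np.empty(32, dtype=int)
      for c in range(16 // CHUNK):
          interleaved[2 * c * CHUNK:(2 * c + 1) * CHUNK] = buf_red0[c * CHUNK:(c + 1) * CHUNK]
          interleaved[(2 * c + 1) * CHUNK:(2 * c + 2) * CHUNK] = buf_red1[c * CHUNK:(c + 1) * CHUNK]

      def local_reduce(core):
          """Each core sums every other interleaved chunk plus one copied network buffer."""
          chunks = [interleaved[i * CHUNK:(i + 1) * CHUNK]
                    for i in range(core, 32 // CHUNK, 2)]
          extra = buf_netw.sum() if core == 0 else buf_netr.sum()
          return sum(c.sum() for c in chunks) + extra

      total = local_reduce(0) + local_reduce(1)
      print(total, "==", buf_red0.sum() + buf_red1.sum() + buf_netw.sum() + buf_netr.sum())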

  3. Situation awareness acquired from monitoring process plants - the Process Overview concept and measure.

    PubMed

    Lau, Nathan; Jamieson, Greg A; Skraaning, Gyrd

    2016-07-01

    We introduce Process Overview, a situation awareness characterisation of the knowledge derived from monitoring process plants. Process Overview is based on observational studies of process control work in the literature. The characterisation is applied to develop a query-based measure called the Process Overview Measure. The goal of the measure is to improve coupling between situation and awareness according to process plant properties and operator cognitive work. A companion article presents the empirical evaluation of the Process Overview Measure in a realistic process control setting. The Process Overview Measure demonstrated sensitivity and validity by revealing significant effects of experimental manipulations that corroborated with other empirical results. The measure also demonstrated adequate inter-rater reliability and practicality for measuring SA based on data collected by process experts. Practitioner Summary: The Process Overview Measure is a query-based measure for assessing operator situation awareness from monitoring process plants in representative settings.

  4. 43 CFR 2804.19 - How will BLM process my Processing Category 6 application?

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 43 Public Lands: Interior 2 2012-10-01 2012-10-01 false How will BLM process my Processing... process my Processing Category 6 application? (a) For Processing Category 6 applications, you and BLM must enter into a written agreement that describes how BLM will process your application. The final agreement...

  5. 43 CFR 2804.19 - How will BLM process my Processing Category 6 application?

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 43 Public Lands: Interior 2 2013-10-01 2013-10-01 false How will BLM process my Processing... process my Processing Category 6 application? (a) For Processing Category 6 applications, you and BLM must enter into a written agreement that describes how BLM will process your application. The final agreement...

  6. Process Correlation Analysis Model for Process Improvement Identification

    PubMed Central

    Park, Sooyong

    2014-01-01

    Software process improvement aims at improving the development process of software systems. It is initiated by a process assessment identifying strengths and weaknesses, and based on the findings, improvement plans are developed. In general, a process reference model (e.g., CMMI) is used throughout the process of software process improvement as the base. CMMI defines a set of process areas involved in software development and what is to be carried out in those process areas in terms of goals and practices. Process areas and their elements (goals and practices) are often correlated due to the iterative nature of the software development process. However, in current practice, correlations of process elements are often overlooked in the development of an improvement plan, which diminishes the efficiency of the plan. This is mainly attributed to the significant effort required and the lack of necessary expertise. In this paper, we present a process correlation analysis model that helps identify correlations of process elements from the results of process assessment. This model is defined based on CMMI and empirical data of improvement practices. We evaluate the model using industrial data. PMID:24977170

  7. Process correlation analysis model for process improvement identification.

    PubMed

    Choi, Su-jin; Kim, Dae-Kyoo; Park, Sooyong

    2014-01-01

    Software process improvement aims at improving the development process of software systems. It is initiated by a process assessment identifying strengths and weaknesses, and based on the findings, improvement plans are developed. In general, a process reference model (e.g., CMMI) is used throughout the process of software process improvement as the base. CMMI defines a set of process areas involved in software development and what is to be carried out in those process areas in terms of goals and practices. Process areas and their elements (goals and practices) are often correlated due to the iterative nature of the software development process. However, in current practice, correlations of process elements are often overlooked in the development of an improvement plan, which diminishes the efficiency of the plan. This is mainly attributed to the significant effort required and the lack of necessary expertise. In this paper, we present a process correlation analysis model that helps identify correlations of process elements from the results of process assessment. This model is defined based on CMMI and empirical data of improvement practices. We evaluate the model using industrial data.

  8. Cleanliness of Ti-bearing Al-killed ultra-low-carbon steel during different heating processes

    NASA Astrophysics Data System (ADS)

    Guo, Jian-long; Bao, Yan-ping; Wang, Min

    2017-12-01

    During the production of Ti-bearing Al-killed ultra-low-carbon (ULC) steel, two different heating processes were used when the converter tapping temperature or the molten steel temperature in the Ruhrstahl-Heraeus (RH) process was low: heating by Al addition during the RH decarburization process and final deoxidation at the end of the RH decarburization process (process-I), and increasing the oxygen content at the end of RH decarburization, heating and final deoxidation by one-time Al addition (process-II). Temperature increases of 10°C by different processes were studied; the results showed that the two heating processes could achieve the same heating effect. The T.[O] content in the slab and the refining process was better controlled by process-I than by process-II. Statistical analysis of inclusions showed that the numbers of inclusions in the slab obtained by process-I were substantially less than those in the slab obtained by process-II. For process-I, the Al2O3 inclusions produced by Al added to induce heating were substantially removed at the end of decarburization. The amounts of inclusions were substantially greater for process-II than for process-I at different refining stages because of the higher dissolved oxygen concentration in process-II. Industrial test results showed that process-I was more beneficial for improving the cleanliness of molten steel.

  9. Application of agent-based system for bioprocess description and process improvement.

    PubMed

    Gao, Ying; Kipling, Katie; Glassey, Jarka; Willis, Mark; Montague, Gary; Zhou, Yuhong; Titchener-Hooker, Nigel J

    2010-01-01

    Modeling plays an important role in bioprocess development for design and scale-up. Predictive models can also be used in biopharmaceutical manufacturing to assist decision-making either to maintain process consistency or to identify optimal operating conditions. To predict the whole bioprocess performance, the strong interactions present in a processing sequence must be adequately modeled. Traditionally, bioprocess modeling considers process units separately, which makes it difficult to capture the interactions between units. In this work, a systematic framework is developed to analyze the bioprocesses based on a whole process understanding and considering the interactions between process operations. An agent-based approach is adopted to provide a flexible infrastructure for the necessary integration of process models. This enables the prediction of overall process behavior, which can then be applied during process development or once manufacturing has commenced, in both cases leading to the capacity for fast evaluation of process improvement options. The multi-agent system comprises a process knowledge base, process models, and a group of functional agents. In this system, agent components co-operate with each other in performing their tasks. These include the description of the whole process behavior, evaluating process operating conditions, monitoring of the operating processes, predicting critical process performance, and providing guidance to decision-making when coping with process deviations. During process development, the system can be used to evaluate the design space for process operation. During manufacture, the system can be applied to identify abnormal process operation events and then to provide suggestions as to how best to cope with the deviations. In all cases, the function of the system is to ensure an efficient manufacturing process. The implementation of the agent-based approach is illustrated via selected application scenarios, which demonstrate how such a framework may enable the better integration of process operations by providing a plant-wide process description to facilitate process improvement. Copyright 2009 American Institute of Chemical Engineers

  10. Electricity from sunlight. [low cost silicon for solar cells

    NASA Technical Reports Server (NTRS)

    Yaws, C. L.; Miller, J. W.; Lutwack, R.; Hsu, G.

    1978-01-01

    The paper discusses a number of new unconventional processes proposed for the low-cost production of silicon for solar cells. Consideration is given to: (1) the Battelle process (Zn/SiCl4), (2) the Battelle process (SiI4), (3) the Silane process, (4) the Motorola process (SiF4/SiF2), (5) the Westinghouse process (Na/SiCl4), (6) the Dow Corning process (C/SiO2), (7) the AeroChem process (SiCl4/H atom), and (8) the Stanford process (Na/SiF4). Preliminary results indicate that the conventional process and the SiI4 process cannot meet the project goal of $10/kg by 1986. Preliminary cost evaluation results for the Zn/SiCl4 process are favorable.

  11. Composing Models of Geographic Physical Processes

    NASA Astrophysics Data System (ADS)

    Hofer, Barbara; Frank, Andrew U.

    Processes are central to geographic information science; yet geographic information systems (GIS) lack capabilities to represent process-related information. A prerequisite to including processes in GIS software is a general method to describe geographic processes independently of application disciplines. This paper presents such a method, namely a process description language. The vocabulary of the process description language is derived formally from mathematical models. Physical processes in geography can be described in two equivalent languages: partial differential equations or partial difference equations, where the latter can be shown graphically and used as a method for application specialists to enter their process models. The vocabulary of the process description language comprises components for describing the general behavior of prototypical geographic physical processes. These process components can be composed into basic models of geographic physical processes, which is shown by means of an example.
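
    To make the equivalence mentioned above concrete, the following example (chosen here for illustration; it is not taken from the paper) writes a diffusion-type geographic process first as a partial differential equation and then as the corresponding partial difference equation on a regular grid with spacing Δx and time step Δt.

      % A diffusion-type process written in the two equivalent languages:
      % the PDE and its partial difference form on a regular grid.
      \begin{align*}
        \frac{\partial u}{\partial t}
          &= D \left( \frac{\partial^2 u}{\partial x^2}
                    + \frac{\partial^2 u}{\partial y^2} \right) \\
        \frac{u^{t+\Delta t}_{i,j} - u^{t}_{i,j}}{\Delta t}
          &= D \, \frac{u^{t}_{i+1,j} + u^{t}_{i-1,j} + u^{t}_{i,j+1}
                        + u^{t}_{i,j-1} - 4\,u^{t}_{i,j}}{(\Delta x)^2}
      \end{align*}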

  12. Process-based tolerance assessment of connecting rod machining process

    NASA Astrophysics Data System (ADS)

    Sharma, G. V. S. S.; Rao, P. Srinivasa; Surendra Babu, B.

    2016-06-01

    Process tolerancing based on process capability studies is an optimistic and pragmatic approach to determining manufacturing process tolerances. On adopting the define-measure-analyze-improve-control approach, the process potential capability index (Cp) and the process performance capability index (Cpk) values of the identified process characteristics of the connecting rod machining process are brought to levels greater than the industry benchmark of 1.33, i.e., the four sigma level. The tolerance chain diagram methodology is applied to the connecting rod in order to verify the manufacturing process tolerances at the various operations of the connecting rod manufacturing process. This paper bridges the gap between the existing dimensional tolerances obtained via tolerance charting and the process capability studies of the connecting rod component. Finally, a process tolerancing comparison has been carried out using tolerance capability expert software.
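
    For reference, the two indices named above have the standard definitions Cp = (USL - LSL)/(6σ) and Cpk = min((USL - μ)/(3σ), (μ - LSL)/(3σ)). The sketch below evaluates them for placeholder specification limits and process statistics, not data from the connecting rod study.

      def capability(usl, lsl, mean, sigma):
          """Standard process capability indices Cp and Cpk."""
          cp = (usl - lsl) / (6 * sigma)
          cpk = min((usl - mean) / (3 * sigma), (mean - lsl) / (3 * sigma))
          return cp, cpk

      # Placeholder specification limits and process statistics (not study data).
      cp, cpk = capability(usl=50.05, lsl=49.95, mean=50.005, sigma=0.011)
      print(f"Cp = {cp:.2f}, Cpk = {cpk:.2f}")   # both exceed the 1.33 (four sigma) benchmark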

  13. Intranode data communications in a parallel computer

    DOEpatents

    Archer, Charles J; Blocksome, Michael A; Miller, Douglas R; Ratterman, Joseph D; Smith, Brian E

    2014-01-07

    Intranode data communications in a parallel computer that includes compute nodes configured to execute processes, where the data communications include: allocating, upon initialization of a first process of a compute node, a region of shared memory; establishing, by the first process, a predefined number of message buffers, each message buffer associated with a process to be initialized on the compute node; sending, to a second process on the same compute node, a data communications message without determining whether the second process has been initialized, including storing the data communications message in the message buffer of the second process; and upon initialization of the second process: retrieving, by the second process, a pointer to the second process's message buffer; and retrieving, by the second process from the second process's message buffer in dependence upon the pointer, the data communications message sent by the first process.

  14. Intranode data communications in a parallel computer

    DOEpatents

    Archer, Charles J; Blocksome, Michael A; Miller, Douglas R; Ratterman, Joseph D; Smith, Brian E

    2013-07-23

    Intranode data communications in a parallel computer that includes compute nodes configured to execute processes, where the data communications include: allocating, upon initialization of a first process of a compute node, a region of shared memory; establishing, by the first process, a predefined number of message buffers, each message buffer associated with a process to be initialized on the compute node; sending, to a second process on the same compute node, a data communications message without determining whether the second process has been initialized, including storing the data communications message in the message buffer of the second process; and upon initialization of the second process: retrieving, by the second process, a pointer to the second process's message buffer; and retrieving, by the second process from the second process's message buffer in dependence upon the pointer, the data communications message sent by the first process.

  15. Canadian Libraries and Mass Deacidification.

    ERIC Educational Resources Information Center

    Pacey, Antony

    1992-01-01

    Considers the advantages and disadvantages of six mass deacidification processes that libraries can use to salvage printed materials: the Wei T'o process, the Diethyl Zinc (DEZ) process, the FMC (Lithco) process, the Book Preservation Associates (BPA) process, the "Bookkeeper" process, and the "Lyophilization" process. The…

  16. A new process sensitivity index to identify important system processes under process model and parametric uncertainty

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dai, Heng; Ye, Ming; Walker, Anthony P.

    Hydrological models are always composed of multiple components that represent processes key to intended model applications. When a process can be simulated by multiple conceptual-mathematical models (process models), model uncertainty in representing the process arises. While global sensitivity analysis methods have been widely used for identifying important processes in hydrologic modeling, the existing methods consider only parametric uncertainty but ignore the model uncertainty for process representation. To address this problem, this study develops a new method to probe multimodel process sensitivity by integrating the model averaging methods into the framework of variance-based global sensitivity analysis, given that the model averaging methods quantify both parametric and model uncertainty. A new process sensitivity index is derived as a metric of relative process importance, and the index includes variance in model outputs caused by uncertainty in both process models and model parameters. For demonstration, the new index is used to evaluate the processes of recharge and geology in a synthetic study of groundwater reactive transport modeling. The recharge process is simulated by two models that convert precipitation to recharge, and the geology process is also simulated by two models of different parameterizations of hydraulic conductivity; each process model has its own random parameters. The new process sensitivity index is mathematically general, and can be applied to a wide range of problems in hydrology and beyond.

  17. Depth-of-processing effects on priming in stem completion: tests of the voluntary-contamination, conceptual-processing, and lexical-processing hypotheses.

    PubMed

    Richardson-Klavehn, A; Gardiner, J M

    1998-05-01

    Depth-of-processing effects on incidental perceptual memory tests could reflect (a) contamination by voluntary retrieval, (b) sensitivity of involuntary retrieval to prior conceptual processing, or (c) a deficit in lexical processing during graphemic study tasks that affects involuntary retrieval. The authors devised an extension of incidental test methodology--making conjunctive predictions about response times as well as response proportions--to discriminate among these alternatives. They used graphemic, phonemic, and semantic study tasks, and a word-stem completion test with incidental, intentional, and inclusion instructions. Semantic study processing was superior to phonemic study processing in the intentional and inclusion tests, but semantic and phonemic study processing produced equal priming in the incidental test, showing that priming was uncontaminated by voluntary retrieval--a conclusion reinforced by the response-time data--and that priming was insensitive to prior conceptual processing. The incidental test nevertheless showed a priming deficit following graphemic study processing, supporting the lexical-processing hypothesis. Adding a lexical decision to the 3 study tasks eliminated the priming deficit following graphemic study processing, but did not influence priming following phonemic and semantic processing. The results provide the first clear evidence that depth-of-processing effects on perceptual priming can reflect lexical processes, rather than voluntary contamination or conceptual processes.

  18. Improving operational anodising process performance using simulation approach

    NASA Astrophysics Data System (ADS)

    Liong, Choong-Yeun; Ghazali, Syarah Syahidah

    2015-10-01

    The use of aluminium is very widespread, especially in transportation, electrical and electronics, architectural, automotive and engineering applications sectors. Therefore, the anodizing process is an important process for aluminium in order to make the aluminium durable, attractive and weather resistant. This research is focused on the anodizing process operations in manufacturing and supplying of aluminium extrusion. The data required for the development of the model is collected from the observations and interviews conducted in the study. To study the current system, the processes involved in the anodizing process are modeled by using Arena 14.5 simulation software. Those processes consist of five main processes, namely the degreasing process, the etching process, the desmut process, the anodizing process, the sealing process and 16 other processes. The results obtained were analyzed to identify the problems or bottlenecks that occurred and to propose improvement methods that can be implemented on the original model. Based on the comparisons that have been done between the improvement methods, the productivity could be increased by reallocating the workers and reducing loading time.
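
    As a rough illustration of the kind of flow-line reasoning such a simulation supports, the plain-Python sketch below pushes a handful of jobs through the five main tanks in sequence and reports the makespan and the bottleneck stage; the stage times and job count are invented, and no simulation package is assumed.

      # Rough serial flow-line sketch of the main anodising tanks. Stage times and the
      # number of jobs are made-up; the point is only to show how the bottleneck stage
      # dominates throughput before trying worker-reallocation scenarios.

      stage_minutes = {"degrease": 10, "etch": 15, "desmut": 5, "anodise": 45, "seal": 20}

      def completion_times(n_jobs, stages):
          """One tank per stage: a job cannot enter a stage before the previous job
          has left it, nor before the job itself has finished the previous stage."""
          free_at = {name: 0.0 for name in stages}   # time each tank becomes free
          finish = []
          for _ in range(n_jobs):
              t = 0.0
              for name, minutes in stages.items():
                  start = max(t, free_at[name])
                  t = start + minutes
                  free_at[name] = t
              finish.append(t)
          return finish

      finish = completion_times(8, stage_minutes)
      bottleneck = max(stage_minutes, key=stage_minutes.get)
      print("makespan (min):", finish[-1])
      print("bottleneck stage:", bottleneck,
            "-> long-run cycle time ~", stage_minutes[bottleneck], "min per job")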

  19. Value-driven process management: using value to improve processes.

    PubMed

    Melnyk, S A; Christensen, R T

    2000-08-01

    Every firm can be viewed as consisting of various processes. These processes affect everything that the firm does from accepting orders and designing products to scheduling production. In many firms, the management of processes often reflects considerations of efficiency (cost) rather than effectiveness (value). In this article, we introduce a well-structured process for managing processes that begins not with the process, but rather with the customer and the product and the concept of value. This process progresses through a number of steps which include issues such as defining value, generating the appropriate metrics, identifying the critical processes, mapping and assessing the performance of these processes, and identifying long- and short-term areas for action. What makes the approach presented in this article so powerful is that it explicitly links the customer to the process and that the process is evaluated in terms of its ability to effectively serve the customers.

  20. Method for routing events from key strokes in a multi-processing computer system

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rhodes, D.A.; Rustici, E.; Carter, K.H.

    1990-01-23

    The patent describes a method of routing user input in a computer system which concurrently runs a plurality of processes. It comprises: generating keycodes representative of keys typed by a user; distinguishing generated keycodes by looking up each keycode in a routing table which assigns each possible keycode to an individual assigned process of the plurality of processes, one of which processes being a supervisory process; then, sending each keycode to its assigned process until a keycode assigned to the supervisory process is received; sending keycodes received subsequent to the keycode assigned to the supervisory process to a buffer; next, providing additional keycodes to the supervisory process from the buffer until the supervisory process has completed operation; and sending keycodes stored in the buffer to processes assigned therewith after the supervisory process has completed operation.

  1. Issues Management Process Course # 38401

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Binion, Ula Marie

    The purpose of this training is to advise Issues Management Coordinators (IMCs) on the revised Contractor Assurance System (CAS) Issues Management (IM) process. Terminal Objectives: Understand the Laboratory’s IM process; Understand your role in the Laboratory’s IM process. Learning Objectives: Describe the IM process within the context of the CAS; Describe the importance of implementing an institutional IM process at LANL; Describe the process flow for the Laboratory’s IM process; Apply the definition of an issue; Use available resources to determine initial screening risk levels for issues; Describe the required major process steps for each risk level; Describe the personnel responsibilities for IM process implementation; Access available resources to support IM process implementation.

  2. Social network supported process recommender system.

    PubMed

    Ye, Yanming; Yin, Jianwei; Xu, Yueshen

    2014-01-01

    Process recommendation technologies have gained more and more attention in the field of intelligent business process modeling to assist process modeling. However, most of the existing technologies only use process structure analysis and do not take the social features of processes into account, even though process modeling is complex and comprehensive in most situations. This paper studies the feasibility of social network research technologies on process recommendation and builds a social network system of processes based on feature similarities. Then, three process matching degree measurements are presented and the system implementation is discussed subsequently. Finally, experimental evaluations and future work are introduced.
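
    The paper's own similarity and matching-degree measures are richer than this, but as a first intuition the sketch below scores how close two processes are by the Jaccard overlap of their activity sets; the example processes are invented.

      # One plausible "feature similarity" between processes: Jaccard overlap of their
      # activity sets (a simplification; the paper's actual measures are richer).

      def jaccard(a, b):
          a, b = set(a), set(b)
          return len(a & b) / len(a | b) if a | b else 0.0

      order_process   = ["receive order", "check credit", "pick goods", "ship", "invoice"]
      return_process  = ["receive return", "check credit", "refund", "invoice"]
      procure_process = ["create PO", "approve PO", "receive goods", "invoice"]

      candidates = {"return": return_process, "procurement": procure_process}
      scores = {name: jaccard(order_process, acts) for name, acts in candidates.items()}
      print(sorted(scores.items(), key=lambda kv: kv[1], reverse=True))  # recommend the closest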

  3. [Definition and stabilization of processes I. Management processes and support in a Urology Department].

    PubMed

    Pascual, Carlos; Luján, Marcos; Mora, José Ramón; Chiva, Vicente; Gamarra, Manuela

    2015-01-01

    The implementation of total quality management models in clinical departments can better adapt to the 2009 ISO 9004 model. An essential part of the implementation of these models is the establishment of processes and their stabilization. There are four types of processes: key, management, support and operative (clinical). Management processes have four parts: process stabilization form, process procedures form, medical activities cost estimation form, and process flow chart. In this paper we will detail the creation of an essential process in a surgical department, such as the process of management of the surgery waiting list.

  4. T-Check in Technologies for Interoperability: Business Process Management in a Web Services Context

    DTIC Science & Technology

    2008-09-01

    List-of-figures excerpt: Figure 3: BPMN Diagram of the Order Processing Business Process; Figure 4: T-Check Process for Technology Evaluation; Figure 5: Notional System Architecture; Figure 6: Flow Chart of the Order Processing Business Process; Figure 7: Order Processing Activities. Figure 3 (created with Intalio BPMS Designer [Intalio 2008]) shows a BPMN view of the Order Processing business process that is used in the ...

  5. A case study: application of statistical process control tool for determining process capability and sigma level.

    PubMed

    Chopra, Vikram; Bairagi, Mukesh; Trivedi, P; Nagar, Mona

    2012-01-01

    Statistical process control is the application of statistical methods to the measurement and analysis of variation in a process. Various regulatory authorities such as Validation Guidance for Industry (2011), International Conference on Harmonisation ICH Q10 (2009), the Health Canada guidelines (2009), Health Science Authority, Singapore: Guidance for Product Quality Review (2008), and International Organization for Standardization ISO-9000:2005 provide regulatory support for the application of statistical process control for better process control and understanding. In this study risk assessments, normal probability distributions, control charts, and capability charts are employed for selection of critical quality attributes, determination of normal probability distribution, statistical stability, and capability of production processes, respectively. The objective of this study is to determine tablet production process quality in the form of sigma process capability. By interpreting data and graph trends, forecasting of critical quality attributes, sigma process capability, and stability of the process were studied. The overall study contributes to an assessment of the process at the sigma level with respect to out-of-specification attributes produced. Finally, the study will point to an area where the application of quality improvement and quality risk assessment principles for achievement of six sigma-capable processes is possible. Statistical process control is the most advantageous tool for determination of the quality of any production process. This tool is new for the pharmaceutical tablet production process. In the case of pharmaceutical tablet production processes, the quality control parameters act as quality assessment parameters. Application of risk assessment provides selection of critical quality attributes among quality control parameters. Sequential application of normality distributions, control charts, and capability analyses provides a valid statistical process control study of the process. Interpretation of such a study provides information about stability, process variability, changing of trends, and quantification of process ability against defective production. Comparative evaluation of critical quality attributes by Pareto charts provides the least capable and most variable process that is liable for improvement. Statistical process control thus proves to be an important tool for six sigma-capable process development and continuous quality improvement.
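
    As a pointer to the chart mechanics behind such a study, the sketch below sets up Shewhart individuals/moving-range limits for one critical quality attribute; the tablet-weight data and the choice of an individuals chart are illustrative assumptions, not details from the study.

      # Illustrative Shewhart individuals/moving-range limits for one critical quality
      # attribute (e.g. tablet weight); the data and context are invented.
      from statistics import mean

      weights_mg = [251, 249, 252, 250, 248, 251, 253, 250, 249, 252, 251, 250]

      center = mean(weights_mg)
      moving_ranges = [abs(a - b) for a, b in zip(weights_mg[1:], weights_mg[:-1])]
      mr_bar = mean(moving_ranges)
      sigma_hat = mr_bar / 1.128            # d2 constant for subgroups of size 2
      ucl, lcl = center + 3 * sigma_hat, center - 3 * sigma_hat

      print(f"center line = {center:.1f} mg, UCL = {ucl:.1f} mg, LCL = {lcl:.1f} mg")
      print("out-of-control points:", [w for w in weights_mg if not lcl <= w <= ucl])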

  6. A practical approach for exploration and modeling of the design space of a bacterial vaccine cultivation process.

    PubMed

    Streefland, M; Van Herpen, P F G; Van de Waterbeemd, B; Van der Pol, L A; Beuvery, E C; Tramper, J; Martens, D E; Toft, M

    2009-10-15

    A licensed pharmaceutical process is required to be executed within the validated ranges throughout the lifetime of product manufacturing. Changes to the process, especially for processes involving biological products, usually require the manufacturer to demonstrate that the safety and efficacy of the product remains unchanged by new or additional clinical testing. Recent changes in the regulations for pharmaceutical processing allow broader ranges of process settings, the so-called process design space, to be submitted for regulatory approval. This means that a manufacturer can optimize the process within the submitted ranges after the product has entered the market, which allows more flexible processes. In this article, the applicability of this concept of the process design space is investigated for the cultivation process step for a vaccine against whooping cough disease. An experimental design (DoE) is applied to investigate the ranges of critical process parameters that still result in a product that meets specifications. The on-line process data, including near infrared spectroscopy, are used to build a descriptive model of the processes used in the experimental design. Finally, the data of all processes are integrated in a multivariate batch monitoring model that represents the investigated process design space. This article demonstrates how the general principles of PAT and process design space can be applied for an undefined biological product such as a whole cell vaccine. The approach chosen for model development described here allows on-line monitoring and control of cultivation batches in order to assure in real time that a process is running within the process design space.
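
    The abstract does not give the actual design, but a two-level full factorial over a few critical process parameters is a common starting point for this kind of DoE; the factor names and ranges in the sketch below are invented for illustration.

      # Sketch of a two-level full factorial design of the kind such a study might start
      # from; factor names and ranges are hypothetical, not those of the vaccine process.
      from itertools import product

      factors = {
          "temperature_C": (33.0, 37.0),
          "pH": (7.0, 7.6),
          "stirrer_rpm": (300, 600),
      }

      design = [dict(zip(factors, combo)) for combo in product(*factors.values())]
      for i, run in enumerate(design, start=1):
          print(f"run {i:2d}: {run}")
      print(f"{len(design)} runs = 2^{len(factors)} full factorial (before centre points/replicates)")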

  7. Processing approaches to cognition: the impetus from the levels-of-processing framework.

    PubMed

    Roediger, Henry L; Gallo, David A; Geraci, Lisa

    2002-01-01

    Processing approaches to cognition have a long history, from act psychology to the present, but perhaps their greatest boost was given by the success and dominance of the levels-of-processing framework. We review the history of processing approaches, and explore the influence of the levels-of-processing approach, the procedural approach advocated by Paul Kolers, and the transfer-appropriate processing framework. Processing approaches emphasise the procedures of mind and the idea that memory storage can be usefully conceptualised as residing in the same neural units that originally processed information at the time of encoding. Processing approaches emphasise the unity and interrelatedness of cognitive processes and maintain that they can be dissected into separate faculties only by neglecting the richness of mental life. We end by pointing to future directions for processing approaches.

  8. Global Sensitivity Analysis for Process Identification under Model Uncertainty

    NASA Astrophysics Data System (ADS)

    Ye, M.; Dai, H.; Walker, A. P.; Shi, L.; Yang, J.

    2015-12-01

    The environmental system consists of various physical, chemical, and biological processes, and environmental models are always built to simulate these processes and their interactions. For model building, improvement, and validation, it is necessary to identify important processes so that limited resources can be used to better characterize the processes. While global sensitivity analysis has been widely used to identify important processes, the process identification is always based on deterministic process conceptualization that uses a single model for representing a process. However, environmental systems are complex, and it often happens that a single process may be simulated by multiple alternative models. Ignoring the model uncertainty in process identification may lead to biased identification in that identified important processes may not be so in the real world. This study addresses this problem by developing a new method of global sensitivity analysis for process identification. The new method is based on the concept of Sobol sensitivity analysis and model averaging. Similar to the Sobol sensitivity analysis to identify important parameters, our new method evaluates variance change when a process is fixed at its different conceptualizations. The variance considers both parametric and model uncertainty using the method of model averaging. The method is demonstrated using a synthetic study of groundwater modeling that considers a recharge process and a parameterization process. Each process has two alternative models. Important processes of groundwater flow and transport are evaluated using our new method. The method is mathematically general, and can be applied to a wide range of environmental problems.

  9. A new process sensitivity index to identify important system processes under process model and parametric uncertainty

    DOE PAGES

    Dai, Heng; Ye, Ming; Walker, Anthony P.; ...

    2017-03-28

    A hydrological model consists of multiple process level submodels, and each submodel represents a process key to the operation of the simulated system. Global sensitivity analysis methods have been widely used to identify important processes for system model development and improvement. The existing methods of global sensitivity analysis only consider parametric uncertainty, and are not capable of handling model uncertainty caused by multiple process models that arise from competing hypotheses about one or more processes. To address this problem, this study develops a new method to probe model output sensitivity to competing process models by integrating model averaging methods with variance-based global sensitivity analysis. A process sensitivity index is derived as a single summary measure of relative process importance, and the index includes variance in model outputs caused by uncertainty in both process models and their parameters. Here, for demonstration, the new index is used to assign importance to the processes of recharge and geology in a synthetic study of groundwater reactive transport modeling. The recharge process is simulated by two models that convert precipitation to recharge, and the geology process is simulated by two models of hydraulic conductivity. Each process model has its own random parameters. Finally, the new process sensitivity index is mathematically general, and can be applied to a wide range of problems in hydrology and beyond.
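
    The exact estimator is given in the paper, not here, but the idea of a sensitivity index whose "factor" is a whole process (model choice plus that model's parameters) can be sketched with a brute-force Monte Carlo toy. Everything below (the two candidate models per process, their parameter ranges, the response function, and the sample sizes) is invented solely to illustrate the variance-of-conditional-means construction.

      # Brute-force toy illustration of a process sensitivity index that pools model and
      # parameter uncertainty. Two "processes" (recharge, geology) each have two candidate
      # models with their own random parameters; the index for a process is the variance of
      # the conditional mean output when that whole process is fixed, over total variance.
      import random
      random.seed(1)

      def draw_recharge():
          """Randomly pick a recharge model and its parameters; return precip -> recharge."""
          if random.random() < 0.5:
              frac = random.uniform(0.15, 0.25)                  # model R1: fixed fraction
              return lambda precip: frac * precip
          thresh = random.uniform(20.0, 40.0)                    # model R2: threshold excess
          return lambda precip: max(0.0, 0.3 * (precip - thresh))

      def draw_geology():
          """Randomly pick a conductivity model and draw a hydraulic-conductivity value."""
          if random.random() < 0.5:
              return random.gauss(10.0, 1.0)                     # model G1: normal K
          return random.lognormvariate(2.3, 0.3)                 # model G2: lognormal K

      def response(recharge_fn, K):
          precip = random.gauss(100.0, 10.0)                     # shared random forcing
          return recharge_fn(precip) / K                         # toy head-like output

      def mean(xs):
          return sum(xs) / len(xs)

      def variance(xs):
          m = mean(xs)
          return sum((x - m) ** 2 for x in xs) / len(xs)

      N_OUTER, N_INNER = 300, 300
      total_var = variance([response(draw_recharge(), draw_geology()) for _ in range(3 * N_OUTER)])

      ps_recharge = variance([mean([response(r, draw_geology()) for _ in range(N_INNER)])
                              for r in (draw_recharge() for _ in range(N_OUTER))]) / total_var
      ps_geology = variance([mean([response(draw_recharge(), K) for _ in range(N_INNER)])
                             for K in (draw_geology() for _ in range(N_OUTER))]) / total_var

      print("PS(recharge) ~", round(ps_recharge, 2), " PS(geology) ~", round(ps_geology, 2))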

  10. A new process sensitivity index to identify important system processes under process model and parametric uncertainty

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dai, Heng; Ye, Ming; Walker, Anthony P.

    A hydrological model consists of multiple process level submodels, and each submodel represents a process key to the operation of the simulated system. Global sensitivity analysis methods have been widely used to identify important processes for system model development and improvement. The existing methods of global sensitivity analysis only consider parametric uncertainty, and are not capable of handling model uncertainty caused by multiple process models that arise from competing hypotheses about one or more processes. To address this problem, this study develops a new method to probe model output sensitivity to competing process models by integrating model averaging methods with variance-based global sensitivity analysis. A process sensitivity index is derived as a single summary measure of relative process importance, and the index includes variance in model outputs caused by uncertainty in both process models and their parameters. Here, for demonstration, the new index is used to assign importance to the processes of recharge and geology in a synthetic study of groundwater reactive transport modeling. The recharge process is simulated by two models that convert precipitation to recharge, and the geology process is simulated by two models of hydraulic conductivity. Each process model has its own random parameters. Finally, the new process sensitivity index is mathematically general, and can be applied to a wide range of problems in hydrology and beyond.

  11. Social Network Supported Process Recommender System

    PubMed Central

    Ye, Yanming; Yin, Jianwei; Xu, Yueshen

    2014-01-01

    Process recommendation technologies have gained more and more attention in the field of intelligent business process modeling to assist process modeling. However, most of the existing technologies only use process structure analysis and do not take the social features of processes into account, even though process modeling is complex and comprehensive in most situations. This paper studies the feasibility of social network research technologies on process recommendation and builds a social network system of processes based on feature similarities. Then, three process matching degree measurements are presented and the system implementation is discussed subsequently. Finally, experimental evaluations and future work are introduced. PMID:24672309

  12. A model for process representation and synthesis. Ph.D. Thesis

    NASA Technical Reports Server (NTRS)

    Thomas, R. H.

    1971-01-01

    The problem of representing groups of loosely connected processes is investigated, and a model for process representation useful for synthesizing complex patterns of process behavior is developed. There are three parts. The first part isolates the concepts which form the basis for the process representation model by focusing on questions such as: What is a process? What is an event? Should one process be able to restrict the capabilities of another? The second part develops a model for process representation which captures the concepts and intuitions developed in the first part. The model presented is able to describe both the internal structure of individual processes and the interface structure between interacting processes. Much of the model's descriptive power derives from its use of the notion of process state as a vehicle for relating the internal and external aspects of process behavior. The third part demonstrates by example that the model for process representation is a useful one for synthesizing process behavior patterns. In it the model is used to define a variety of interesting process behavior patterns. The dissertation closes by suggesting how the model could be used as a semantic base for a very potent language extension facility.

  13. Process and Post-Process: A Discursive History.

    ERIC Educational Resources Information Center

    Matsuda, Paul Kei

    2003-01-01

    Examines the history of process and post-process in composition studies, focusing on ways in which terms such as "current-traditional rhetoric," "process," and "post-process" have contributed to the discursive construction of reality. Argues that use of the term post-process in the context of second language writing needs to be guided by a…

  14. Improving operational anodising process performance using simulation approach

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Liong, Choong-Yeun, E-mail: lg@ukm.edu.my; Ghazali, Syarah Syahidah, E-mail: syarah@gapps.kptm.edu.my

    The use of aluminium is very widespread, especially in transportation, electrical and electronics, architectural, automotive and engineering applications sectors. Therefore, the anodizing process is an important process for aluminium in order to make the aluminium durable, attractive and weather resistant. This research is focused on the anodizing process operations in manufacturing and supplying of aluminium extrusion. The data required for the development of the model is collected from the observations and interviews conducted in the study. To study the current system, the processes involved in the anodizing process are modeled by using Arena 14.5 simulation software. Those processes consist of five main processes, namely the degreasing process, the etching process, the desmut process, the anodizing process, the sealing process and 16 other processes. The results obtained were analyzed to identify the problems or bottlenecks that occurred and to propose improvement methods that can be implemented on the original model. Based on the comparisons that have been done between the improvement methods, the productivity could be increased by reallocating the workers and reducing loading time.

  15. Feller processes: the next generation in modeling. Brownian motion, Lévy processes and beyond.

    PubMed

    Böttcher, Björn

    2010-12-03

    We present a simple construction method for Feller processes and a framework for the generation of sample paths of Feller processes. The construction is based on state space dependent mixing of Lévy processes. Brownian Motion is one of the most frequently used continuous time Markov processes in applications. In recent years also Lévy processes, of which Brownian Motion is a special case, have become increasingly popular. Lévy processes are spatially homogeneous, but empirical data often suggest the use of spatially inhomogeneous processes. Thus it seems necessary to go to the next level of generalization: Feller processes. These include Lévy processes and in particular Brownian motion as special cases but allow spatial inhomogeneities. Many properties of Feller processes are known, but proving the very existence is, in general, very technical. Moreover, an applicable framework for the generation of sample paths of a Feller process was missing. We explain, with practitioners in mind, how to overcome both of these obstacles. In particular our simulation technique allows to apply Monte Carlo methods to Feller processes.
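
    The paper's construction is more careful than this, but the flavour of a state-space-dependent mixing can be conveyed by an Euler-type scheme in which each small increment is drawn from a Lévy-type step whose volatility and jump rate depend on the current state; the particular coefficient functions below are arbitrary illustrative choices.

      # Illustrative Euler-type sample path with state-dependent Lévy characteristics:
      # locally Brownian with state-dependent volatility, plus jumps whose rate also
      # depends on the state. Coefficient functions are arbitrary, not from the paper.
      import math, random
      random.seed(7)

      def sigma(x):        # state-dependent diffusion coefficient
          return 0.5 + 0.3 * math.tanh(x)

      def jump_rate(x):    # state-dependent intensity of unit-mean exponential jumps
          return 0.2 + 0.1 * abs(x)

      def sample_path(x0=0.0, T=10.0, dt=0.01):
          x, path = x0, [x0]
          for _ in range(int(T / dt)):
              dx = sigma(x) * math.sqrt(dt) * random.gauss(0.0, 1.0)   # local Brownian part
              if random.random() < jump_rate(x) * dt:                   # occasional jump
                  dx += random.expovariate(1.0) * random.choice((-1.0, 1.0))
              x += dx
              path.append(x)
          return path

      path = sample_path()
      print("final value:", round(path[-1], 3),
            " min:", round(min(path), 3), " max:", round(max(path), 3))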

  16. Feller Processes: The Next Generation in Modeling. Brownian Motion, Lévy Processes and Beyond

    PubMed Central

    Böttcher, Björn

    2010-01-01

    We present a simple construction method for Feller processes and a framework for the generation of sample paths of Feller processes. The construction is based on state space dependent mixing of Lévy processes. Brownian Motion is one of the most frequently used continuous time Markov processes in applications. In recent years also Lévy processes, of which Brownian Motion is a special case, have become increasingly popular. Lévy processes are spatially homogeneous, but empirical data often suggest the use of spatially inhomogeneous processes. Thus it seems necessary to go to the next level of generalization: Feller processes. These include Lévy processes and in particular Brownian motion as special cases but allow spatial inhomogeneities. Many properties of Feller processes are known, but proving the very existence is, in general, very technical. Moreover, an applicable framework for the generation of sample paths of a Feller process was missing. We explain, with practitioners in mind, how to overcome both of these obstacles. In particular our simulation technique allows to apply Monte Carlo methods to Feller processes. PMID:21151931

  17. AIRSAR Automated Web-based Data Processing and Distribution System

    NASA Technical Reports Server (NTRS)

    Chu, Anhua; vanZyl, Jakob; Kim, Yunjin; Lou, Yunling; Imel, David; Tung, Wayne; Chapman, Bruce; Durden, Stephen

    2005-01-01

    In this paper, we present an integrated, end-to-end synthetic aperture radar (SAR) processing system that accepts data processing requests, submits processing jobs, performs quality analysis, delivers and archives processed data. This fully automated SAR processing system utilizes database and internet/intranet web technologies to allow external users to browse and submit data processing requests and receive processed data. It is a cost-effective way to manage a robust SAR processing and archival system. The integration of these functions has reduced operator errors and increased processor throughput dramatically.

  18. Simplified process model discovery based on role-oriented genetic mining.

    PubMed

    Zhao, Weidong; Liu, Xi; Dai, Weihui

    2014-01-01

    Process mining is automated acquisition of process models from event logs. Although many process mining techniques have been developed, most of them are based on control flow. Meanwhile, the existing role-oriented process mining methods focus on correctness and integrity of roles while ignoring role complexity of the process model, which directly impacts understandability and quality of the model. To address these problems, we propose a genetic programming approach to mine the simplified process model. Using a new metric of process complexity in terms of roles as the fitness function, we can find simpler process models. The new role complexity metric of process models is designed from role cohesion and coupling, and applied to discover roles in process models. Moreover, the higher fitness derived from the role complexity metric also provides a guideline for redesigning process models. Finally, we conduct a case study and experiments to show that the proposed method is more effective for streamlining the process, compared with related studies.

  19. Electrotechnologies to process foods

    USDA-ARS?s Scientific Manuscript database

    Electrical energy is being used to process foods. In conventional food processing plants, electricity drives mechanical devices and controls the degree of process. In recent years, several processing technologies are being developed to process foods directly with electricity. Electrotechnologies use...

  20. Challenges associated with the implementation of the nursing process: A systematic review.

    PubMed

    Zamanzadeh, Vahid; Valizadeh, Leila; Tabrizi, Faranak Jabbarzadeh; Behshid, Mojghan; Lotfi, Mojghan

    2015-01-01

    Nursing process is a scientific approach in the provision of qualified nursing cares. However, in practice, the implementation of this process is faced with numerous challenges. With the knowledge of the challenges associated with the implementation of the nursing process, the nursing processes can be developed appropriately. Due to the lack of comprehensive information on this subject, the current study was carried out to assess the key challenges associated with the implementation of the nursing process. To achieve and review related studies on this field, databases of Iran medix, SID, Magiran, PUBMED, Google scholar, and Proquest were assessed using the main keywords of nursing process and nursing process systematic review. The articles were retrieved in three steps including searching by keywords, review of the proceedings based on inclusion criteria, and final retrieval and assessment of available full texts. Systematic assessment of the articles showed different challenges in implementation of the nursing process. Intangible understanding of the concept of nursing process, different views of the process, lack of knowledge and awareness among nurses related to the execution of process, supports of managing systems, and problems related to recording the nursing process were the main challenges that were extracted from review of literature. On systematically reviewing the literature, intangible understanding of the concept of nursing process has been identified as the main challenge in nursing process. To achieve the best strategy to minimize the challenge, in addition to preparing facilitators for implementation of nursing process, intangible understanding of the concept of nursing process, different views of the process, and forming teams of experts in nursing education are recommended for internalizing the nursing process among nurses.

  1. Challenges associated with the implementation of the nursing process: A systematic review

    PubMed Central

    Zamanzadeh, Vahid; Valizadeh, Leila; Tabrizi, Faranak Jabbarzadeh; Behshid, Mojghan; Lotfi, Mojghan

    2015-01-01

    Background: Nursing process is a scientific approach in the provision of qualified nursing cares. However, in practice, the implementation of this process is faced with numerous challenges. With the knowledge of the challenges associated with the implementation of the nursing process, the nursing processes can be developed appropriately. Due to the lack of comprehensive information on this subject, the current study was carried out to assess the key challenges associated with the implementation of the nursing process. Materials and Methods: To achieve and review related studies on this field, databases of Iran medix, SID, Magiran, PUBMED, Google scholar, and Proquest were assessed using the main keywords of nursing process and nursing process systematic review. The articles were retrieved in three steps including searching by keywords, review of the proceedings based on inclusion criteria, and final retrieval and assessment of available full texts. Results: Systematic assessment of the articles showed different challenges in implementation of the nursing process. Intangible understanding of the concept of nursing process, different views of the process, lack of knowledge and awareness among nurses related to the execution of process, supports of managing systems, and problems related to recording the nursing process were the main challenges that were extracted from review of literature. Conclusions: On systematically reviewing the literature, intangible understanding of the concept of nursing process has been identified as the main challenge in nursing process. To achieve the best strategy to minimize the challenge, in addition to preparing facilitators for implementation of nursing process, intangible understanding of the concept of nursing process, different views of the process, and forming teams of experts in nursing education are recommended for internalizing the nursing process among nurses. PMID:26257793

  2. Automated synthesis of image processing procedures using AI planning techniques

    NASA Technical Reports Server (NTRS)

    Chien, Steve; Mortensen, Helen

    1994-01-01

    This paper describes the Multimission VICAR (Video Image Communication and Retrieval) Planner (MVP) (Chien 1994) system, which uses artificial intelligence planning techniques (Iwasaki & Friedland, 1985, Pemberthy & Weld, 1992, Stefik, 1981) to automatically construct executable complex image processing procedures (using models of the smaller constituent image processing subprograms) in response to image processing requests made to the JPL Multimission Image Processing Laboratory (MIPL). The MVP system allows the user to specify the image processing requirements in terms of the various types of correction required. Given this information, MVP derives unspecified required processing steps and determines appropriate image processing programs and parameters to achieve the specified image processing goals. This information is output as an executable image processing program which can then be executed to fill the processing request.

  3. Optimisation of shock absorber process parameters using failure mode and effect analysis and genetic algorithm

    NASA Astrophysics Data System (ADS)

    Mariajayaprakash, Arokiasamy; Senthilvelan, Thiyagarajan; Vivekananthan, Krishnapillai Ponnambal

    2013-07-01

    The various process parameters affecting the quality characteristics of the shock absorber during the process were identified using the Ishikawa diagram and by failure mode and effect analysis. The identified process parameters are welding process parameters (squeeze, heat control, wheel speed, and air pressure), damper sealing process parameters (load, hydraulic pressure, air pressure, and fixture height), washing process parameters (total alkalinity, temperature, pH value of rinsing water, and timing), and painting process parameters (flowability, coating thickness, pointage, and temperature). In this paper, the process parameters, namely, painting and washing process parameters, are optimized by the Taguchi method. Though the defects are reasonably minimized by the Taguchi method, in order to achieve zero defects during the processes, a genetic algorithm technique is applied to the optimized parameters obtained by the Taguchi method.

  4. Methods, media and systems for managing a distributed application running in a plurality of digital processing devices

    DOEpatents

    Laadan, Oren; Nieh, Jason; Phung, Dan

    2012-10-02

    Methods, media and systems for managing a distributed application running in a plurality of digital processing devices are provided. In some embodiments, a method includes running one or more processes associated with the distributed application in virtualized operating system environments on a plurality of digital processing devices, suspending the one or more processes, and saving network state information relating to network connections among the one or more processes. The method further includes storing process information relating to the one or more processes, recreating the network connections using the saved network state information, and restarting the one or more processes using the stored process information.

  5. SEMICONDUCTOR TECHNOLOGY A signal processing method for the friction-based endpoint detection system of a CMP process

    NASA Astrophysics Data System (ADS)

    Chi, Xu; Dongming, Guo; Zhuji, Jin; Renke, Kang

    2010-12-01

    A signal processing method for the friction-based endpoint detection system of a chemical mechanical polishing (CMP) process is presented. The signal processing method uses the wavelet threshold denoising method to reduce the noise contained in the measured original signal, extracts the Kalman filter innovation from the denoised signal as the feature signal, and judges the CMP endpoint based on the feature of the Kalman filter innovation sequence during the CMP process. Applying the signal processing method, the endpoint detection experiments of the Cu CMP process were carried out. The results show that the signal processing method can judge the endpoint of the Cu CMP process.
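
    The sketch below reproduces the two stages in spirit on a synthetic friction-like trace: PyWavelets soft-threshold denoising followed by a scalar random-walk Kalman filter whose innovation sequence is scanned for the endpoint. The wavelet family, noise model, and the simple five-sigma decision rule are assumptions for illustration; the paper's actual tuning is not given here.

      # Rough sketch: wavelet soft-threshold denoising (PyWavelets) followed by a scalar
      # random-walk Kalman filter; the innovation sequence flags the endpoint. The
      # synthetic trace, wavelet choice, and decision threshold are assumptions.
      import numpy as np
      import pywt

      rng = np.random.default_rng(0)
      n = 1024
      friction = np.concatenate([np.full(700, 1.0), np.full(n - 700, 0.8)])  # drop at "endpoint"
      signal = friction + 0.05 * rng.standard_normal(n)

      # 1) wavelet threshold denoising
      coeffs = pywt.wavedec(signal, "db4", level=4)
      noise_sigma = np.median(np.abs(coeffs[-1])) / 0.6745        # noise estimate, finest scale
      thr = noise_sigma * np.sqrt(2.0 * np.log(n))                # universal threshold
      coeffs = [coeffs[0]] + [pywt.threshold(c, thr, mode="soft") for c in coeffs[1:]]
      denoised = pywt.waverec(coeffs, "db4")[:n]

      # 2) scalar Kalman filter (random-walk state), keeping the innovation sequence
      x, P, q, r = denoised[0], 1.0, 1e-5, noise_sigma**2
      innovations = []
      for z in denoised:
          P += q                       # predict
          innov = z - x                # innovation
          K = P / (P + r)              # update
          x += K * innov
          P *= (1.0 - K)
          innovations.append(innov)

      innovations = np.abs(np.array(innovations))
      endpoint = int(np.argmax(innovations > 5 * innovations[:500].std()))  # crude rule
      print("flagged endpoint near sample", endpoint)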

  6. Composite faces are not (necessarily) processed coactively: A test using systems factorial technology and logical-rule models.

    PubMed

    Cheng, Xue Jun; McCarthy, Callum J; Wang, Tony S L; Palmeri, Thomas J; Little, Daniel R

    2018-06-01

    Upright faces are thought to be processed more holistically than inverted faces. In the widely used composite face paradigm, holistic processing is inferred from interference in recognition performance from a to-be-ignored face half for upright and aligned faces compared with inverted or misaligned faces. We sought to characterize the nature of holistic processing in composite faces in computational terms. We use logical-rule models (Fifić, Little, & Nosofsky, 2010) and Systems Factorial Technology (Townsend & Nozawa, 1995) to examine whether composite faces are processed through pooling top and bottom face halves into a single processing channel-coactive processing-which is one common mechanistic definition of holistic processing. By specifically operationalizing holistic processing as the pooling of features into a single decision process in our task, we are able to distinguish it from other processing models that may underlie composite face processing. For instance, a failure of selective attention might result even when top and bottom components of composite faces are processed in serial or in parallel without processing the entire face coactively. Our results show that performance is best explained by a mixture of serial and parallel processing architectures across all 4 upright and inverted, aligned and misaligned face conditions. The results indicate multichannel, featural processing of composite faces in a manner inconsistent with the notion of coactivity. (PsycINFO Database Record (c) 2018 APA, all rights reserved).

  7. Fuzzy image processing in sun sensor

    NASA Technical Reports Server (NTRS)

    Mobasser, S.; Liebe, C. C.; Howard, A.

    2003-01-01

    This paper will describe how the fuzzy image processing is implemented in the instrument. Comparison of the fuzzy image processing and a more conventional image processing algorithm is provided and shows that the fuzzy image processing yields better accuracy than conventional image processing.

  8. DESIGNING ENVIRONMENTAL, ECONOMIC AND ENERGY EFFICIENT CHEMICAL PROCESSES

    EPA Science Inventory

    The design and improvement of chemical processes can be very challenging. The earlier energy conservation, process economics and environmental aspects are incorporated into the process development, the easier and less expensive it is to alter the process design. Process emissio...

  9. Reversing the conventional leather processing sequence for cleaner leather production.

    PubMed

    Saravanabhavan, Subramani; Thanikaivelan, Palanisamy; Rao, Jonnalagadda Raghava; Nair, Balachandran Unni; Ramasami, Thirumalachari

    2006-02-01

    Conventional leather processing generally involves a combination of single and multistep processes that employs as well as expels various biological, inorganic, and organic materials. It involves nearly 14-15 steps and discharges a huge amount of pollutants. This is primarily due to the fact that conventional leather processing employs a "do-undo" process logic. In this study, the conventional leather processing steps have been reversed to overcome the problems associated with the conventional method. The charges of the skin matrix and of the chemicals and pH profiles of the process have been judiciously used for reversing the process steps. This reversed process eventually avoids several acidification and basification/neutralization steps used in conventional leather processing. The developed process has been validated through various analyses such as chromium content, shrinkage temperature, softness measurements, scanning electron microscopy, and physical testing of the leathers. Further, the performance of the leathers is shown to be on par with conventionally processed leathers through bulk property evaluation. The process enjoys a significant reduction in COD and TS by 53 and 79%, respectively. Water consumption and discharge is reduced by 65 and 64%, respectively. Also, the process benefits from significant reduction in chemicals, time, power, and cost compared to the conventional process.

  10. Group processing in an undergraduate biology course for preservice teachers: Experiences and attitudes

    NASA Astrophysics Data System (ADS)

    Schellenberger, Lauren Brownback

    Group processing is a key principle of cooperative learning in which small groups discuss their strengths and weaknesses and set group goals or norms. However, group processing has not been well-studied at the post-secondary level or from a qualitative or mixed methods perspective. This mixed methods study uses a phenomenological framework to examine the experience of group processing for students in an undergraduate biology course for preservice teachers. The effect of group processing on students' attitudes toward future group work and group processing is also examined. Additionally, this research investigated preservice teachers' plans for incorporating group processing into future lessons. Students primarily experienced group processing as a time to reflect on past performance. Also, students experienced group processing as a time to increase communication among group members and become motivated for future group assignments. Three factors directly influenced students' experiences with group processing: (1) previous experience with group work, (2) instructor interaction, and (3) gender. Survey data indicated that group processing had a slight positive effect on students' attitudes toward future group work and group processing. Participants who were interviewed felt that group processing was an important part of group work and that it had increased their group's effectiveness as well as their ability to work effectively with other people. Participants held positive views on group work prior to engaging in group processing, and group processing did not alter their attitude toward group work. Preservice teachers who were interviewed planned to use group work and a modified group processing protocol in their future classrooms. They also felt that group processing had prepared them for their future professions by modeling effective collaboration and group skills. Based on this research, a new model for group processing has been created which includes extensive instructor interaction and additional group processing sessions. This study offers a new perspective on the phenomenon of group processing and informs science educators and teacher educators on the effective implementation of this important component of small-group learning.

  11. Properties of the Bivariate Delayed Poisson Process

    DTIC Science & Technology

    1974-07-01

    and Lewis (1972) in their Berkeley Symposium paper and here their analysis of the bivariate Poisson processes (without Poisson noise) is carried... Poisson processes. They cannot, however, be independent Poisson processes because their events are associated in pairs by the displacement centres... process because its marginal processes for events of each type are themselves (univariate) Poisson processes. Cox and Lewis (1972) assumed a

  12. The Application of Six Sigma Methodologies to University Processes: The Use of Student Teams

    ERIC Educational Resources Information Center

    Pryor, Mildred Golden; Alexander, Christine; Taneja, Sonia; Tirumalasetty, Sowmya; Chadalavada, Deepthi

    2012-01-01

    The first student Six Sigma team (activated under a QEP Process Sub-team) evaluated the course and curriculum approval process. The goal was to streamline the process and thereby shorten process cycle time and reduce confusion about how the process works. Members of this team developed flowcharts on how the process is supposed to work (by…

  13. Impact of Radio Frequency Identification (RFID) on the Marine Corps’ Supply Process

    DTIC Science & Technology

    2006-09-01

    Table-of-contents excerpt: Order Processing Time: Hypothetical Improvement Using a Real-Time Order Processing System Vice a Batch Order Processing System; As-Is: The Current ... Processing System Vice a Batch Order Processing System; V. Results; A. Simulation ...; As-Is: The ...

  14. Global-local processing relates to spatial and verbal processing: implications for sex differences in cognition.

    PubMed

    Pletzer, Belinda; Scheuringer, Andrea; Scherndl, Thomas

    2017-09-05

    Sex differences have been reported for a variety of cognitive tasks and related to the use of different cognitive processing styles in men and women. It was recently argued that these processing styles share some characteristics across tasks, i.e. male approaches are oriented towards holistic stimulus aspects and female approaches are oriented towards stimulus details. In that respect, sex-dependent cognitive processing styles share similarities with attentional global-local processing. A direct relationship between cognitive processing and global-local processing has however not been previously established. In the present study, 49 men and 44 women completed a Navon paradigm and a Kimchi Palmer task as well as a navigation task and a verbal fluency task with the goal to relate the global advantage (GA) effect as a measure of global processing to holistic processing styles in both tasks. Indeed participants with larger GA effects displayed more holistic processing during spatial navigation and phonemic fluency. However, the relationship to cognitive processing styles was modulated by the specific condition of the Navon paradigm, as well as the sex of participants. Thus, different types of global-local processing play different roles for cognitive processing in men and women.

  15. 21 CFR 113.83 - Establishing scheduled processes.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... competent processing authorities. If incubation tests are necessary for process confirmation, they shall... instituting the process. The incubation tests for confirmation of the scheduled processes should include the.... Complete records covering all aspects of the establishment of the process and associated incubation tests...

  16. 21 CFR 113.83 - Establishing scheduled processes.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... competent processing authorities. If incubation tests are necessary for process confirmation, they shall... instituting the process. The incubation tests for confirmation of the scheduled processes should include the.... Complete records covering all aspects of the establishment of the process and associated incubation tests...

  17. 21 CFR 113.83 - Establishing scheduled processes.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... competent processing authorities. If incubation tests are necessary for process confirmation, they shall... instituting the process. The incubation tests for confirmation of the scheduled processes should include the.... Complete records covering all aspects of the establishment of the process and associated incubation tests...

  18. A mathematical study of a random process proposed as an atmospheric turbulence model

    NASA Technical Reports Server (NTRS)

    Sidwell, K.

    1977-01-01

    A random process is formed by the product of a local Gaussian process and a random amplitude process, and the sum of that product with an independent mean value process. The mathematical properties of the resulting process are developed, including the first and second order properties and the characteristic function of general order. An approximate method for the analysis of the response of linear dynamic systems to the process is developed. The transition properties of the process are also examined.
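
    As a quick way to get a feel for such a product model, the sketch below simulates x(t) = m(t) + a(t)·g(t) with simple autoregressive stand-ins for the local Gaussian process, the slowly varying amplitude, and the independent mean-value process; the component models and parameters are illustrative, not those analyzed in the report.

      # Toy simulation of x(t) = m(t) + a(t)*g(t): local Gaussian process g, positive
      # slowly varying amplitude a, independent mean-value process m. All component
      # models (AR(1) stand-ins) and parameters are illustrative choices.
      import math, random, statistics
      random.seed(3)

      def ar1(n, phi, sigma, x0=0.0):
          """First-order autoregressive series as a stand-in for a correlated random process."""
          x, out = x0, []
          for _ in range(n):
              x = phi * x + sigma * random.gauss(0.0, 1.0)
              out.append(x)
          return out

      n = 2000
      g = ar1(n, phi=0.95, sigma=0.3)                            # local Gaussian process
      a = [math.exp(v) for v in ar1(n, phi=0.999, sigma=0.02)]   # positive, slowly varying amplitude
      m = ar1(n, phi=0.999, sigma=0.01)                          # independent mean-value process
      x = [mi + ai * gi for mi, ai, gi in zip(m, a, g)]

      # The product structure makes excursions heavier-tailed than a plain Gaussian series.
      mu = statistics.mean(x)
      var = statistics.pvariance(x)
      kurt = sum((v - mu) ** 4 for v in x) / (n * var ** 2)
      print("sample kurtosis of x(t):", round(kurt, 2), "(3.0 for a Gaussian series)")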

  19. Standard services for the capture, processing, and distribution of packetized telemetry data

    NASA Technical Reports Server (NTRS)

    Stallings, William H.

    1989-01-01

    Standard functional services for the capture, processing, and distribution of packetized data are discussed with particular reference to the future implementation of packet processing systems, such as those for the Space Station Freedom. The major functions are listed under the following major categories: input processing, packet processing, and output processing. A functional block diagram of a packet data processing facility is presented, showing the distribution of the various processing functions as well as the primary data flow through the facility.

  20. Assessment of hospital processes using a process mining technique: Outpatient process analysis at a tertiary hospital.

    PubMed

    Yoo, Sooyoung; Cho, Minsu; Kim, Eunhye; Kim, Seok; Sim, Yerim; Yoo, Donghyun; Hwang, Hee; Song, Minseok

    2016-04-01

    Many hospitals are increasing their efforts to improve processes because processes play an important role in enhancing work efficiency and reducing costs. However, to date, a quantitative tool has not been available to examine the before and after effects of processes and environmental changes, other than the use of indirect indicators, such as mortality rate and readmission rate. This study used process mining technology to analyze process changes based on changes in the hospital environment, such as the construction of a new building, and to measure the effects of environmental changes in terms of consultation wait time, time spent per task, and outpatient care processes. Using process mining technology, electronic health record (EHR) log data of outpatient care before and after constructing a new building were analyzed, and the effectiveness of the technology in terms of the process was evaluated. Using the process mining technique, we found that the total time spent in outpatient care did not increase significantly compared to that before the construction of a new building, considering that the number of outpatients increased, and the consultation wait time decreased. These results suggest that the operation of the outpatient clinic was effective after changes were implemented in the hospital environment. We further identified improvements in processes using the process mining technique, thereby demonstrating the usefulness of this technique for analyzing complex hospital processes at a low cost. This study confirmed the effectiveness of process mining technology at an actual hospital site. In future studies, the use of process mining technology will be expanded by applying this approach to a larger variety of process change situations. Copyright © 2016. Published by Elsevier Ireland Ltd.
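
    The study works from real EHR logs and a full process-mining toolchain; as a small, self-contained illustration of the kind of quantities involved, the sketch below computes time spent per task and the consultation wait from a toy outpatient event log with pandas. The column names and the arrival-to-consultation wait definition are assumptions.

      # Toy outpatient event log: time spent per task and consultation wait time.
      # Column names and the arrival/consultation wait definition are assumptions.
      import pandas as pd

      log = pd.DataFrame(
          [
              ("p1", "arrival",      "2016-03-01 08:55"),
              ("p1", "consultation", "2016-03-01 09:20"),
              ("p1", "payment",      "2016-03-01 09:35"),
              ("p2", "arrival",      "2016-03-01 09:05"),
              ("p2", "blood test",   "2016-03-01 09:30"),
              ("p2", "consultation", "2016-03-01 10:10"),
          ],
          columns=["case_id", "activity", "timestamp"],
      )
      log["timestamp"] = pd.to_datetime(log["timestamp"])
      log = log.sort_values(["case_id", "timestamp"])

      # time spent between consecutive activities within each case
      log["next_ts"] = log.groupby("case_id")["timestamp"].shift(-1)
      log["duration_min"] = (log["next_ts"] - log["timestamp"]).dt.total_seconds() / 60
      print(log.groupby("activity")["duration_min"].mean().dropna())

      # consultation wait: arrival -> first consultation, per case
      wide = log.pivot_table(index="case_id", columns="activity", values="timestamp", aggfunc="min")
      wait = (wide["consultation"] - wide["arrival"]).dt.total_seconds() / 60
      print("mean consultation wait (min):", wait.mean())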

  1. Study of process variables associated with manufacturing hermetically-sealed nickel-cadmium cells

    NASA Technical Reports Server (NTRS)

    Miller, L.

    1974-01-01

    A two year study of the major process variables associated with the manufacturing process for sealed, nickel-cadmium, aerospace cells is summarized. Effort was directed toward identifying the major process variables associated with a manufacturing process, experimentally assessing each variable's effect, and imposing the necessary changes (optimization) and controls for the critical process variables to improve results and uniformity. A critical process variable associated with the sintered nickel plaque manufacturing process was identified as the manual forming operation. Critical process variables identified with the positive electrode impregnation/polarization process were impregnation solution temperature, free acid content, vacuum impregnation, and sintered plaque strength. Positive and negative electrodes were identified as a major source of carbonate contamination in sealed cells.

  2. Monitoring autocorrelated process: A geometric Brownian motion process approach

    NASA Astrophysics Data System (ADS)

    Li, Lee Siaw; Djauhari, Maman A.

    2013-09-01

    Autocorrelated process control is common in today's industrial process control practice. The current practice for autocorrelated processes is to eliminate the autocorrelation by using an appropriate model, such as a Box-Jenkins model, and then to conduct the process control operation on the residuals. In this paper we show that many time series are governed by a geometric Brownian motion (GBM) process. Therefore, in this case, by using the properties of a GBM process, we only need an appropriate transformation and a model of the transformed data to meet the conditions required by traditional process control. An industrial example of a cocoa powder production process at a Malaysian company is presented and discussed to illustrate the advantages of the GBM approach.
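    As a rough illustration of the transformation idea (simulated data only; the cocoa-powder series and the exact charting scheme of the paper are not reproduced): if a series follows a GBM, its log-returns are independent normal increments, so a conventional 3-sigma chart can be applied to the transformed data.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    # Simulate a GBM-like autocorrelated series: X_t = X_0 * exp((mu - s^2/2)t + s*W_t)
    mu, sigma, dt, n = 0.02, 0.05, 1.0, 300
    increments = (mu - 0.5 * sigma**2) * dt + sigma * np.sqrt(dt) * rng.normal(size=n)
    x = 100.0 * np.exp(np.cumsum(increments))

    # Transformation: log-returns of a GBM are i.i.d. normal, so classical
    # control-chart assumptions hold for the transformed data.
    r = np.diff(np.log(x))

    center = r.mean()
    s = r.std(ddof=1)
    ucl, lcl = center + 3 * s, center - 3 * s
    out_of_control = np.where((r > ucl) | (r < lcl))[0]
    print("3-sigma limits: [%.4f, %.4f], signals at samples: %s" % (lcl, ucl, out_of_control))
    ```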

  3. Meta-control of combustion performance with a data mining approach

    NASA Astrophysics Data System (ADS)

    Song, Zhe

    Large-scale combustion processes are complex and pose challenges for performance optimization. Traditional approaches based on thermal dynamics have limited ability to find optimal operating regions due to the time-shift nature of the process. Recent advances in information technology enable people to collect large volumes of process data easily and continuously. The collected process data contain rich information about the process and, to some extent, represent a digital copy of the process over time. Although large volumes of data exist in industrial combustion processes, they are not fully utilized to the level where the process can be optimized. Data mining is an emerging science that finds patterns or models in large data sets. It has found many successful applications in business marketing, medical, and manufacturing domains. The focus of this dissertation is on applying data mining to industrial combustion processes, and ultimately on optimizing combustion performance; however, the philosophy, methods, and frameworks discussed in this research can also be applied to other industrial processes. Optimizing an industrial combustion process poses two major challenges. One is that the underlying process model changes over time, so obtaining an accurate process model is nontrivial. The other is that a high-fidelity process model is usually highly nonlinear, so solving the optimization problem requires efficient heuristics. This dissertation sets out to address these two challenges. The major contribution of this four-year research is a data-driven solution for optimizing the combustion process, in which a process model, or process knowledge, is identified from the process data, and optimization is then executed by evolutionary algorithms to search for optimal operating regions.
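    A toy sketch of the two-step, data-driven idea is given below; the variable names, the synthetic data, and the quadratic least-squares surrogate are all invented stand-ins for the models identified in the dissertation, and a simple (mu + lambda) evolution strategy stands in for the evolutionary algorithms used there.

    ```python
    import numpy as np

    rng = np.random.default_rng(2)

    # Step 1: identify a process model from historical data (here: synthetic data
    # and a quadratic least-squares surrogate; a real study would use richer models).
    X = rng.uniform([0.8, 300.0], [1.4, 600.0], size=(500, 2))   # [excess air, mill load]
    y = -(X[:, 0] - 1.1)**2 - ((X[:, 1] - 450.0) / 150.0)**2 + rng.normal(0, 0.02, 500)

    def features(X):
        a, b = X[:, 0], X[:, 1]
        return np.column_stack([np.ones_like(a), a, b, a*a, b*b, a*b])

    coef, *_ = np.linalg.lstsq(features(X), y, rcond=None)
    predict = lambda X: features(np.atleast_2d(X)) @ coef

    # Step 2: (mu + lambda) evolutionary search over the surrogate for good settings.
    lo, hi = np.array([0.8, 300.0]), np.array([1.4, 600.0])
    pop = rng.uniform(lo, hi, size=(20, 2))
    for _ in range(100):
        children = np.clip(pop + rng.normal(0, [0.02, 10.0], pop.shape), lo, hi)
        both = np.vstack([pop, children])
        pop = both[np.argsort(predict(both))[::-1][:20]]   # keep the 20 best

    print("suggested operating point:", pop[0],
          "predicted performance:", predict(pop[0])[0])
    ```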

  4. 5 CFR 1653.13 - Processing legal processes.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 5 Administrative Personnel 3 2014-01-01 2014-01-01 false Processing legal processes. 1653.13 Section 1653.13 Administrative Personnel FEDERAL RETIREMENT THRIFT INVESTMENT BOARD COURT ORDERS AND LEGAL PROCESSES AFFECTING THRIFT SAVINGS PLAN ACCOUNTS Legal Process for the Enforcement of a Participant's Legal...

  5. 5 CFR 1653.13 - Processing legal processes.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 5 Administrative Personnel 3 2013-01-01 2013-01-01 false Processing legal processes. 1653.13 Section 1653.13 Administrative Personnel FEDERAL RETIREMENT THRIFT INVESTMENT BOARD COURT ORDERS AND LEGAL PROCESSES AFFECTING THRIFT SAVINGS PLAN ACCOUNTS Legal Process for the Enforcement of a Participant's Legal...

  6. A Search Algorithm for Generating Alternative Process Plans in Flexible Manufacturing System

    NASA Astrophysics Data System (ADS)

    Tehrani, Hossein; Sugimura, Nobuhiro; Tanimizu, Yoshitaka; Iwamura, Koji

    The capabilities and complexity of manufacturing systems are increasing, driving the move toward an integrated manufacturing environment. The availability of alternative process plans is a key factor for the integration of design, process planning, and scheduling. This paper describes an algorithm for generating alternative process plans by extending the existing framework of process plan networks. A class diagram is introduced for generating process plans and process plan networks from the viewpoint of integrated process planning and scheduling systems. An incomplete search algorithm is developed for generating and searching the process plan networks. The benefit of this algorithm is that the whole process plan network does not have to be generated before the search starts, so the algorithm is applicable to very large process plan networks and can search wide areas of the network according to user requirements. The algorithm can generate alternative process plans and select a suitable one based on the objective functions.
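    A minimal sketch of what an incomplete, lazily expanding search over a process plan network can look like follows; the operations, costs, successor table, and completion criterion are all hypothetical, and the bound on expansions is what keeps the search "incomplete" without requiring the full network up front.

    ```python
    import heapq

    # Hypothetical successor table: from a partial process plan (a tuple of
    # machining operations) return feasible next operations with their costs.
    OPERATIONS = {
        (): [("rough_mill", 5.0), ("rough_turn", 4.0)],
        ("rough_mill",): [("drill", 2.0), ("finish_mill", 3.0)],
        ("rough_turn",): [("drill", 2.5), ("finish_turn", 3.5)],
        ("rough_mill", "drill"): [("finish_mill", 3.0)],
        ("rough_turn", "drill"): [("finish_turn", 3.5)],
    }
    GOAL_LENGTH = 3   # a plan is complete after three operations (toy criterion)

    def successors(plan):
        return OPERATIONS.get(plan, [])

    def best_first(max_expansions=100, keep=5):
        """Expand the plan network lazily; stop after a bounded number of
        expansions and return up to `keep` complete plans ordered by cost."""
        frontier = [(0.0, ())]
        complete = []
        expansions = 0
        while frontier and expansions < max_expansions and len(complete) < keep:
            cost, plan = heapq.heappop(frontier)
            if len(plan) == GOAL_LENGTH:
                complete.append((cost, plan))
                continue
            expansions += 1
            for op, c in successors(plan):
                heapq.heappush(frontier, (cost + c, plan + (op,)))
        return complete

    for cost, plan in best_first():
        print("%.1f  %s" % (cost, " -> ".join(plan)))
    ```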

  7. PyMS: a Python toolkit for processing of gas chromatography-mass spectrometry (GC-MS) data. Application and comparative study of selected tools.

    PubMed

    O'Callaghan, Sean; De Souza, David P; Isaac, Andrew; Wang, Qiao; Hodkinson, Luke; Olshansky, Moshe; Erwin, Tim; Appelbe, Bill; Tull, Dedreia L; Roessner, Ute; Bacic, Antony; McConville, Malcolm J; Likić, Vladimir A

    2012-05-30

    Gas chromatography-mass spectrometry (GC-MS) is a technique frequently used in targeted and non-targeted measurements of metabolites. Most existing software tools for processing of raw instrument GC-MS data tightly integrate data processing methods with a graphical user interface, facilitating interactive data processing. While interactive processing remains critically important in GC-MS applications, high-throughput studies increasingly dictate the need for command line tools, suitable for scripting of high-throughput, customized processing pipelines. PyMS comprises a library of functions for processing of instrument GC-MS data, developed in Python. PyMS currently provides a complete set of GC-MS processing functions, including reading of standard data formats (ANDI-MS/NetCDF and JCAMP-DX), noise smoothing, baseline correction, peak detection, peak deconvolution, peak integration, and peak alignment by dynamic programming. A novel common-ion single quantitation algorithm allows automated, accurate quantitation of GC-MS electron impact (EI) fragmentation spectra when a large number of experiments are being analyzed. PyMS implements parallel processing for by-row and by-column data processing tasks based on the Message Passing Interface (MPI), allowing processing to scale on multiple CPUs in distributed computing environments. A set of specifically designed experiments was performed in-house and used to comparatively evaluate the performance of PyMS and three widely used software packages for GC-MS data processing (AMDIS, AnalyzerPro, and XCMS). PyMS is a novel software package for the processing of raw GC-MS data, particularly suitable for scripting of customized processing pipelines and for data processing in batch mode. PyMS provides limited graphical capabilities and can be used both for routine data processing and for interactive/exploratory data analysis. In real-life GC-MS data processing scenarios PyMS performs as well as or better than leading software packages. We demonstrate data processing scenarios that are simple to implement in PyMS, yet difficult to achieve with many conventional GC-MS data processing software packages. Automated sample processing and quantitation with PyMS can provide substantial time savings compared to more traditional interactive software systems that tightly integrate data processing with the graphical user interface.
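    The snippet below is not PyMS code and does not use the PyMS API; it only illustrates, in plain NumPy on a synthetic chromatogram, the kind of smoothing / baseline correction / peak-detection chain that such command-line pipelines script together. Consult the PyMS documentation for the real functions.

    ```python
    import numpy as np

    rng = np.random.default_rng(3)

    # Synthetic total-ion chromatogram: drifting baseline + two peaks + noise
    t = np.arange(0, 600, 1.0)                       # retention time [s]
    tic = (0.02 * t                                   # baseline drift
           + 800 * np.exp(-0.5 * ((t - 200) / 6)**2)
           + 500 * np.exp(-0.5 * ((t - 430) / 8)**2)
           + rng.normal(0, 10, t.size))

    def moving_average(x, w=5):
        """Noise smoothing with a simple boxcar window."""
        return np.convolve(x, np.ones(w) / w, mode="same")

    def rolling_min_baseline(x, w=51):
        """Crude baseline estimate: local minimum in a sliding window."""
        pad = w // 2
        xp = np.pad(x, pad, mode="edge")
        return np.array([xp[i:i + w].min() for i in range(x.size)])

    smoothed = moving_average(tic)
    corrected = smoothed - rolling_min_baseline(smoothed)

    # Peak detection: local maxima above a noise-based threshold
    thr = 5 * corrected[:50].std()
    peaks = [i for i in range(1, corrected.size - 1)
             if corrected[i] > thr
             and corrected[i] >= corrected[i - 1]
             and corrected[i] > corrected[i + 1]]
    print("detected peak apexes at t =", [t[i] for i in peaks])
    ```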

  8. Processing mode during repetitive thinking in socially anxious individuals: evidence for a maladaptive experiential mode.

    PubMed

    Wong, Quincy J J; Moulds, Michelle L

    2012-12-01

    Evidence from the depression literature suggests that an analytical processing mode adopted during repetitive thinking leads to maladaptive outcomes relative to an experiential processing mode. To date, in socially anxious individuals, the impact of processing mode during repetitive thinking related to an actual social-evaluative situation has not been investigated. We thus tested whether an analytical processing mode would be maladaptive relative to an experiential processing mode during anticipatory processing and post-event rumination. High and low socially anxious participants were induced to engage in either an analytical or experiential processing mode during: (a) anticipatory processing before performing a speech (Experiment 1; N = 94), or (b) post-event rumination after performing a speech (Experiment 2; N = 74). Mood, cognition, and behavioural measures were employed to examine the effects of processing mode. For high socially anxious participants, the modes had a similar effect on self-reported anxiety during both anticipatory processing and post-event rumination. Unexpectedly, relative to the analytical mode, the experiential mode led to stronger high standard and conditional beliefs during anticipatory processing, and stronger unconditional beliefs during post-event rumination. These experiments are the first to investigate processing mode during anticipatory processing and post-event rumination. Hence, these results are novel and will need to be replicated. These findings suggest that an experiential processing mode is maladaptive relative to an analytical processing mode during repetitive thinking characteristic of socially anxious individuals. Copyright © 2012 Elsevier Ltd. All rights reserved.

  9. Using process elicitation and validation to understand and improve chemotherapy ordering and delivery.

    PubMed

    Mertens, Wilson C; Christov, Stefan C; Avrunin, George S; Clarke, Lori A; Osterweil, Leon J; Cassells, Lucinda J; Marquard, Jenna L

    2012-11-01

    Chemotherapy ordering and administration, in which errors have potentially severe consequences, was quantitatively and qualitatively evaluated by employing process formalism (or formal process definition), a technique derived from software engineering, to elicit and rigorously describe the process, after which validation techniques were applied to confirm the accuracy of the described process. The chemotherapy ordering and administration process, including exceptional situations and individuals' recognition of and responses to those situations, was elicited through informal, unstructured interviews with members of an interdisciplinary team. The process description (or process definition), written in a notation developed for software quality assessment purposes, guided process validation (which consisted of direct observations and semistructured interviews to confirm the elicited details for the treatment plan portion of the process). The overall process definition yielded 467 steps; 207 steps (44%) were dedicated to handling 59 exceptional situations. Validation yielded 82 unique process events (35 new expected but not yet described steps, 16 new exceptional situations, and 31 new steps in response to exceptional situations). Process participants actively altered the process as ambiguities and conflicts were discovered by the elicitation and validation components of the study. Chemotherapy error rates declined significantly during and after the project, which was conducted from October 2007 through August 2008. Each elicitation method and the subsequent validation discussions contributed uniquely to understanding the chemotherapy treatment plan review process, supporting rapid adoption of changes, improved communication regarding the process, and ensuing error reduction.

  10. Modeling interdependencies between business and communication processes in hospitals.

    PubMed

    Brigl, Birgit; Wendt, Thomas; Winter, Alfred

    2003-01-01

    The optimization and redesign of business processes in hospitals is an important challenge for hospital information management, which has to design and implement a suitable HIS architecture. Nevertheless, there are no tools available that specialize in modeling information-driven business processes and the consequences for the communication between information processing tools. Therefore, we present an approach which facilitates the representation and analysis of business processes and the resulting communication processes between application components, together with their interdependencies. This approach aims not only to visualize those processes, but also to evaluate whether there are weaknesses in the information processing infrastructure which hinder the smooth implementation of the business processes.

  11. Life cycle analysis within pharmaceutical process optimization and intensification: case study of active pharmaceutical ingredient production.

    PubMed

    Ott, Denise; Kralisch, Dana; Denčić, Ivana; Hessel, Volker; Laribi, Yosra; Perrichon, Philippe D; Berguerand, Charline; Kiwi-Minsker, Lioubov; Loeb, Patrick

    2014-12-01

    As the demand for new drugs is rising, the pharmaceutical industry faces the quest of shortening development time, and thus, reducing the time to market. Environmental aspects typically still play a minor role within the early phase of process development. Nevertheless, it is highly promising to rethink, redesign, and optimize process strategies as early as possible in active pharmaceutical ingredient (API) process development, rather than later at the stage of already established processes. The study presented herein deals with a holistic life-cycle-based process optimization and intensification of a pharmaceutical production process targeting a low-volume, high-value API. Striving for process intensification by transfer from batch to continuous processing, as well as an alternative catalytic system, different process options are evaluated with regard to their environmental impact to identify bottlenecks and improvement potentials for further process development activities. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  12. SOI-CMOS Process for Monolithic, Radiation-Tolerant, Science-Grade Imagers

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Williams, George; Lee, Adam

    In Phase I, Voxtel worked with Jazz and Sandia to document and simulate the processes necessary to implement a DH-BSI SOI CMOS imaging process. The development is based upon mature SOI CMOS processes at both fabs, with the addition of only a few custom processing steps for integration and electrical interconnection of the fully-depleted photodetectors. In Phase I, Voxtel also characterized the Sandia process, including the CMOS7 design rules, and we developed the outline of a process option that included a “BOX etch” that will permit a “detector in handle” SOI CMOS process to be developed. The process flows were developed in cooperation with both Jazz and Sandia process engineers, along with detailed TCAD modeling and testing of the photodiode array architectures. In addition, Voxtel tested the radiation performance of Jazz's CA18HJ process, using standard and circular-enclosed transistors.

  13. Face to face with emotion: holistic face processing is modulated by emotional state.

    PubMed

    Curby, Kim M; Johnson, Kareem J; Tyson, Alyssa

    2012-01-01

    Negative emotions are linked with a local, rather than global, visual processing style, which may preferentially facilitate feature-based, relative to holistic, processing mechanisms. Because faces are typically processed holistically, and because social contexts are prime elicitors of emotions, we examined whether negative emotions decrease holistic processing of faces. We induced positive, negative, or neutral emotions via film clips and measured holistic processing before and after the induction: participants made judgements about cued parts of chimeric faces, and holistic processing was indexed by the interference caused by task-irrelevant face parts. Emotional state significantly modulated face-processing style, with the negative emotion induction leading to decreased holistic processing. Furthermore, self-reported change in emotional state correlated with changes in holistic processing. These results contrast with general assumptions that holistic processing of faces is automatic and immune to outside influences, and they illustrate emotion's power to modulate socially relevant aspects of visual perception.

  14. 5 CFR 581.203 - Information minimally required to accompany legal process.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... accompany legal process. 581.203 Section 581.203 Administrative Personnel OFFICE OF PERSONNEL MANAGEMENT... Process § 581.203 Information minimally required to accompany legal process. (a) Sufficient identifying information must accompany the legal process in order to enable processing by the governmental entity named...

  15. 5 CFR 581.203 - Information minimally required to accompany legal process.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... accompany legal process. 581.203 Section 581.203 Administrative Personnel OFFICE OF PERSONNEL MANAGEMENT... Process § 581.203 Information minimally required to accompany legal process. (a) Sufficient identifying information must accompany the legal process in order to enable processing by the governmental entity named...

  16. 5 CFR 581.203 - Information minimally required to accompany legal process.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... accompany legal process. 581.203 Section 581.203 Administrative Personnel OFFICE OF PERSONNEL MANAGEMENT... Process § 581.203 Information minimally required to accompany legal process. (a) Sufficient identifying information must accompany the legal process in order to enable processing by the governmental entity named...

  17. 5 CFR 581.203 - Information minimally required to accompany legal process.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... accompany legal process. 581.203 Section 581.203 Administrative Personnel OFFICE OF PERSONNEL MANAGEMENT... Process § 581.203 Information minimally required to accompany legal process. (a) Sufficient identifying information must accompany the legal process in order to enable processing by the governmental entity named...

  18. 5 CFR 581.203 - Information minimally required to accompany legal process.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... accompany legal process. 581.203 Section 581.203 Administrative Personnel OFFICE OF PERSONNEL MANAGEMENT... Process § 581.203 Information minimally required to accompany legal process. (a) Sufficient identifying information must accompany the legal process in order to enable processing by the governmental entity named...

  19. 20 CFR 405.725 - Effect of expedited appeals process agreement.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... PROCESS FOR ADJUDICATING INITIAL DISABILITY CLAIMS Expedited Appeals Process for Constitutional Issues § 405.725 Effect of expedited appeals process agreement. After an expedited appeals process agreement is... 20 Employees' Benefits 2 2010-04-01 2010-04-01 false Effect of expedited appeals process agreement...

  20. Common and distinct networks for self-referential and social stimulus processing in the human brain.

    PubMed

    Herold, Dorrit; Spengler, Stephanie; Sajonz, Bastian; Usnich, Tatiana; Bermpohl, Felix

    2016-09-01

    Self-referential processing is a complex cognitive function, involving a set of implicit and explicit processes, which complicates investigation of its distinct neural signature. The present study explores the functional overlap and dissociability of self-referential and social stimulus processing. We combined an established paradigm for explicit self-referential processing with an implicit social stimulus processing paradigm in one fMRI experiment to determine the neural effects of self-relatedness and social processing within one study. Overlapping activations were found in the orbitofrontal cortex and in the intermediate part of the precuneus. Stimuli judged as self-referential specifically activated the posterior cingulate cortex, the ventral medial prefrontal cortex, extending into anterior cingulate cortex and orbitofrontal cortex, the dorsal medial prefrontal cortex, the ventral and dorsal lateral prefrontal cortex, the left inferior temporal gyrus, and occipital cortex. Social processing specifically involved the posterior precuneus and bilateral temporo-parietal junction. Taken together, our data show not only common networks for both processes in the medial prefrontal and medial parietal cortex, but also functional differentiations between self-referential and social processing: an anterior-posterior gradient for social versus self-referential processing within the medial parietal cortex, with specific activations for self-referential processing in the medial and lateral prefrontal cortex and for social processing in the temporo-parietal junction.

  1. Using Unified Modelling Language (UML) as a process-modelling technique for clinical-research process improvement.

    PubMed

    Kumarapeli, P; De Lusignan, S; Ellis, T; Jones, B

    2007-03-01

    The Primary Care Data Quality programme (PCDQ) is a quality-improvement programme which processes routinely collected general practice computer data. Patient data collected from a wide range of different brands of clinical computer systems are aggregated, processed, and fed back to practices in an educational context to improve the quality of care. Process modelling is a well-established approach used to gain understanding and systematic appraisal of a business process and to identify areas for improvement. Unified modelling language (UML) is a general-purpose modelling technique used for this purpose. We used UML to appraise the PCDQ process to see if the efficiency and predictability of the process could be improved. Activity analysis and thinking-aloud sessions were used to collect data to generate UML diagrams. The UML model highlighted the sequential nature of the current process as a barrier to efficiency gains. It also identified the uneven distribution of process controls, the lack of symmetric communication channels, critical dependencies among processing stages, and the failure to implement all the lessons learned in the piloting phase. It also suggested that improved structured reporting at each stage - especially from the pilot phase - parallel processing of data, and correctly positioned process controls should improve the efficiency and predictability of research projects. Process modelling provided a rational basis for the critical appraisal of a clinical data processing system; its potential may be underutilized within health care.

  2. Use of Analogies in the Study of Diffusion

    ERIC Educational Resources Information Center

    Letic, Milorad

    2014-01-01

    Emergent processes, such as diffusion, are considered more difficult to understand than direct processes. In physiology, most processes are presented as direct processes, so emergent processes, when encountered, are even more difficult to understand. It has been suggested that, when studying diffusion, misconceptions about random processes are the…

  3. Is Analytic Information Processing a Feature of Expertise in Medicine?

    ERIC Educational Resources Information Center

    McLaughlin, Kevin; Rikers, Remy M.; Schmidt, Henk G.

    2008-01-01

    Diagnosing begins by generating an initial diagnostic hypothesis by automatic information processing. Information processing may stop here if the hypothesis is accepted, or analytical processing may be used to refine the hypothesis. This description portrays analytic processing as an optional extra in information processing, leading us to…

  4. 5 CFR 582.305 - Honoring legal process.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 5 Administrative Personnel 1 2014-01-01 2014-01-01 false Honoring legal process. 582.305 Section... GARNISHMENT OF FEDERAL EMPLOYEES' PAY Compliance With Legal Process § 582.305 Honoring legal process. (a) The agency shall comply with legal process, except where the process cannot be complied with because: (1) It...

  5. 5 CFR 582.305 - Honoring legal process.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 5 Administrative Personnel 1 2011-01-01 2011-01-01 false Honoring legal process. 582.305 Section... GARNISHMENT OF FEDERAL EMPLOYEES' PAY Compliance With Legal Process § 582.305 Honoring legal process. (a) The agency shall comply with legal process, except where the process cannot be complied with because: (1) It...

  6. 5 CFR 581.305 - Honoring legal process.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 5 Administrative Personnel 1 2014-01-01 2014-01-01 false Honoring legal process. 581.305 Section... GARNISHMENT ORDERS FOR CHILD SUPPORT AND/OR ALIMONY Compliance With Process § 581.305 Honoring legal process. (a) The governmental entity shall comply with legal process, except where the process cannot be...

  7. 5 CFR 581.305 - Honoring legal process.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 5 Administrative Personnel 1 2010-01-01 2010-01-01 false Honoring legal process. 581.305 Section... GARNISHMENT ORDERS FOR CHILD SUPPORT AND/OR ALIMONY Compliance With Process § 581.305 Honoring legal process. (a) The governmental entity shall comply with legal process, except where the process cannot be...

  8. 5 CFR 582.305 - Honoring legal process.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 5 Administrative Personnel 1 2010-01-01 2010-01-01 false Honoring legal process. 582.305 Section... GARNISHMENT OF FEDERAL EMPLOYEES' PAY Compliance With Legal Process § 582.305 Honoring legal process. (a) The agency shall comply with legal process, except where the process cannot be complied with because: (1) It...

  9. 5 CFR 581.305 - Honoring legal process.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 5 Administrative Personnel 1 2011-01-01 2011-01-01 false Honoring legal process. 581.305 Section... GARNISHMENT ORDERS FOR CHILD SUPPORT AND/OR ALIMONY Compliance With Process § 581.305 Honoring legal process. (a) The governmental entity shall comply with legal process, except where the process cannot be...

  10. Articulating the Resources for Business Process Analysis and Design

    ERIC Educational Resources Information Center

    Jin, Yulong

    2012-01-01

    Effective process analysis and modeling are important phases of the business process management lifecycle. When many activities and multiple resources are involved, it is very difficult to build a correct business process specification. This dissertation provides a resource perspective of business processes. It aims at a better process analysis…

  11. An Integrated Model of Emotion Processes and Cognition in Social Information Processing.

    ERIC Educational Resources Information Center

    Lemerise, Elizabeth A.; Arsenio, William F.

    2000-01-01

    Interprets literature on contributions of social cognitive and emotion processes to children's social competence in the context of an integrated model of emotion processes and cognition in social information processing. Provides neurophysiological and functional evidence for the centrality of emotion processes in personal-social decision making.…

  12. Data Processing and First Products from the Hyperspectral Imager for the Coastal Ocean (HICO) on the International Space Station

    DTIC Science & Technology

    2010-04-01

    NRL Stennis Space Center (NRL-SSC) for further processing using the NRL SSC Automated Processing System (APS). APS was developed for processing...have not previously developed automated processing for hyperspectral ocean color data. The hyperspectral processing branch includes several

  13. DISCRETE COMPOUND POISSON PROCESSES AND TABLES OF THE GEOMETRIC POISSON DISTRIBUTION.

    DTIC Science & Technology

    A concise summary of the salient properties of discrete Poisson processes, with emphasis on comparing the geometric and logarithmic Poisson processes. The...the geometric Poisson process are given for 176 sets of parameter values. New discrete compound Poisson processes are also introduced. These...processes have properties that are particularly relevant when the summation of several different Poisson processes is to be analyzed. This study provides the

  14. Management of processes of electrochemical dimensional processing

    NASA Astrophysics Data System (ADS)

    Akhmetov, I. D.; Zakirova, A. R.; Sadykov, Z. B.

    2017-09-01

    In many industries, large numbers of high-precision parts are produced from difficult-to-machine, scarce materials. Such parts can be formed only by non-contact processing, or with minimal mechanical force, which is achievable, for example, by electrochemical machining. At the present stage of development of metal-working processes, the management of electrochemical machining and its automation are important issues. This article presents some indicators and factors of the electrochemical machining process.

  15. The Hyperspectral Imager for the Coastal Ocean (HICO): Sensor and Data Processing Overview

    DTIC Science & Technology

    2010-01-20

    backscattering coefficients, and others. Several of these software modules will be developed within the Automated Processing System (APS), a data... Automated Processing System (APS) NRL developed APS, which processes satellite data into ocean color data products. APS is a collection of methods...used for ocean color processing which provide the tools for the automated processing of satellite imagery [1]. These tools are in the process of

  16. [Study on culture and philosophy of processing of traditional Chinese medicines].

    PubMed

    Yang, Ming; Zhang, Ding-Kun; Zhong, Ling-Yun; Wang, Fang

    2013-07-01

    From the standpoint of cultural views and philosophical thought, this paper studies the cultural origin, thinking modes, core principles, and general rules and methods of processing; it traces the culture and history of processing, including its generation and deduction, its accumulation of experience and its promotion, and its core values; and it summarizes the basic principles of processing, which are guided by holistic, objective, dynamic, balanced, and appropriateness-oriented thinking. The aim is to propagate the cultural characteristics and philosophical wisdom of traditional Chinese medicine processing, to promote the inheritance and development of processing, and to ensure the maximum clinical therapeutic value of Chinese medicine.

  17. Containerless automated processing of intermetallic compounds and composites

    NASA Technical Reports Server (NTRS)

    Johnson, D. R.; Joslin, S. M.; Reviere, R. D.; Oliver, B. F.; Noebe, R. D.

    1993-01-01

    An automated containerless processing system has been developed to directionally solidify high temperature materials, intermetallic compounds, and intermetallic/metallic composites. The system incorporates a wide range of ultra-high purity chemical processing conditions. The utilization of image processing for automated control negates the need for temperature measurements for process control. The list of recent systems that have been processed includes Cr, Mo, Mn, Nb, Ni, Ti, V, and Zr containing aluminides. Possible uses of the system, process control approaches, and properties and structures of recently processed intermetallics are reviewed.

  18. A continuous process for the development of Kodak Aerochrome Infrared Film 2443 as a negative

    NASA Astrophysics Data System (ADS)

    Klimes, D.; Ross, D. I.

    1993-02-01

    A process for the continuous dry-to-dry development of Kodak Aerochrome Infrared Film 2443 as a negative (CIR-neg) is described. The process is well suited for production processing of long film lengths. Chemicals from three commercial film processes are used with modifications. Sensitometric procedures are recommended for the monitoring of processing quality control. Sensitometric data and operational aerial exposures indicate that films developed in this process have approximately the same effective aerial film speed as films processed in the reversal process recommended by the manufacturer (Kodak EA-5). The CIR-neg process is useful when aerial photography is acquired for resources management applications which require print reproductions. Originals can be readily reproduced using conventional production equipment (electronic dodging) in black and white or color (color compensation).

  19. Antibiotics with anaerobic ammonium oxidation in urban wastewater treatment

    NASA Astrophysics Data System (ADS)

    Zhou, Ruipeng; Yang, Yuanming

    2017-05-01

    The biofilter process is an aerobic wastewater treatment process that combines the design ideas of rapid sand filtration with biological oxidation, integrating filtration, adsorption, and biodegradation into a single purification step. An engineering example shows that the process is well suited to treating low-concentration sewage and industrial wastewater. The anaerobic ammonium oxidation (anammox) process, owing to its high efficiency and low energy consumption, has broad application prospects in biological nitrogen removal and has become a research focus in practical wastewater treatment both at home and abroad. This paper reviews the habitats and species diversity of anammox bacteria and the diversity of anammox process configurations, and compares the operating conditions of single-stage and two-stage processes, focusing on laboratory research and engineering applications of anammox technology to various types of wastewater, including sludge digestion pressure filtrate, landfill leachate, aquaculture wastewater, monosodium glutamate wastewater, municipal sewage, fecal sewage, and high-salinity wastewater, together with their water-quality characteristics, research progress, and obstacles to application. Finally, we summarize potential problems of the anammox process in treating actual wastewater and suggest that future research focus on in-depth study of the water-quality factors that inhibit anammox and their regulation, and on developing combined and optimized processes on this basis.

  20. Understanding scaling through history-dependent processes with collapsing sample space.

    PubMed

    Corominas-Murtra, Bernat; Hanel, Rudolf; Thurner, Stefan

    2015-04-28

    History-dependent processes are ubiquitous in natural and social systems. Many such stochastic processes, especially those that are associated with complex systems, become more constrained as they unfold, meaning that their sample space, or their set of possible outcomes, reduces as they age. We demonstrate that these sample-space-reducing (SSR) processes necessarily lead to Zipf's law in the rank distributions of their outcomes. We show that by adding noise to SSR processes the corresponding rank distributions remain exact power laws, p(x) ~ x(-λ), where the exponent directly corresponds to the mixing ratio of the SSR process and noise. This allows us to give a precise meaning to the scaling exponent in terms of the degree to which a given process reduces its sample space as it unfolds. Noisy SSR processes further allow us to explain a wide range of scaling exponents in frequency distributions ranging from α = 2 to ∞. We discuss several applications showing how SSR processes can be used to understand Zipf's law in word frequencies, and how they are related to diffusion processes in directed networks, or aging processes such as in fragmentation processes. SSR processes provide a new alternative to understand the origin of scaling in complex systems without the recourse to multiplicative, preferential, or self-organized critical processes.
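    The core mechanism is easy to reproduce numerically. The sketch below simulates a noise-free SSR process (start at N, jump uniformly to a lower state until state 1 is reached) and compares the empirical visit frequencies with the Zipf prediction p(x) ∝ 1/x; the start state and sample counts are arbitrary choices.

    ```python
    import random
    from collections import Counter

    def ssr_run(n_start, rng):
        """One sample path of a sample-space-reducing process: from state i,
        jump uniformly to one of the states 1..i-1 until state 1 is hit."""
        states = []
        i = n_start
        while i > 1:
            i = rng.randint(1, i - 1)
            states.append(i)
        return states

    rng = random.Random(0)
    visits = Counter()
    for _ in range(20_000):
        visits.update(ssr_run(1000, rng))

    # Compare empirical visit frequencies with the Zipf prediction p(x) ~ 1/x
    total = sum(visits.values())
    harmonic = sum(1.0 / k for k in range(1, 1000))
    for x in (1, 2, 5, 10, 50):
        print("state %3d: empirical %.4f   (1/x)/H prediction %.4f"
              % (x, visits[x] / total, (1.0 / x) / harmonic))
    ```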

  1. Effects of Processing Parameters on the Forming Quality of C-Shaped Thermosetting Composite Laminates in Hot Diaphragm Forming Process

    NASA Astrophysics Data System (ADS)

    Bian, X. X.; Gu, Y. Z.; Sun, J.; Li, M.; Liu, W. P.; Zhang, Z. G.

    2013-10-01

    In this study, the effects of processing temperature and vacuum applying rate on the forming quality of C-shaped carbon fiber reinforced epoxy resin matrix composite laminates during the hot diaphragm forming process were investigated. C-shaped prepreg preforms were produced using home-made hot diaphragm forming equipment. The thickness variations of the preforms and the manufacturing defects after the diaphragm forming process, including fiber wrinkling and voids, were evaluated to understand the forming mechanism. Furthermore, both the interlaminar slipping friction and the compaction behavior of the prepreg stacks were experimentally analyzed to show the importance of the processing parameters. In addition, autoclave processing was used to cure the C-shaped preforms to investigate the changes in the defects before and after the cure process. The results show that C-shaped prepreg preforms with good forming quality can be achieved by increasing the processing temperature and reducing the vacuum applying rate, which clearly promotes interlaminar slipping of the prepreg. The processing temperature and forming rate in the hot diaphragm forming process strongly influence the prepreg interply frictional force, and the maximum interlaminar frictional force can be taken as a key parameter for processing parameter optimization. The autoclave process is effective in eliminating voids in the preforms and can alleviate fiber wrinkles to a certain extent.

  2. Assessment of Advanced Coal Gasification Processes

    NASA Technical Reports Server (NTRS)

    McCarthy, John; Ferrall, Joseph; Charng, Thomas; Houseman, John

    1981-01-01

    This report represents a technical assessment of the following advanced coal gasification processes: AVCO High Throughput Gasification (HTG) Process; Bell Single-Stage High Mass Flux (HMF) Process; Cities Service/Rockwell (CS/R) Hydrogasification Process; Exxon Catalytic Coal Gasification (CCG) Process. Each process is evaluated for its potential to produce SNG from a bituminous coal. In addition to identifying the new technology these processes represent, key similarities/differences, strengths/weaknesses, and potential improvements to each process are identified. The AVCO HTG and the Bell HMF gasifiers share similarities with respect to: short residence time (SRT), high throughput rate, slagging and syngas as the initial raw product gas. The CS/R Hydrogasifier is also SRT but is non-slagging and produces a raw gas high in methane content. The Exxon CCG gasifier is a long residence time, catalytic, fluidized-bed reactor producing all of the raw product methane in the gasifier. The report makes the following assessments: 1) while each process has significant potential as coal gasifiers, the CS/R and Exxon processes are better suited for SNG production; 2) the Exxon process is the closest to a commercial level for near-term SNG production; and 3) the SRT processes require significant development including scale-up and turndown demonstration, char processing and/or utilization demonstration, and reactor control and safety features development.

  3. Integrated Process Modeling-A Process Validation Life Cycle Companion.

    PubMed

    Zahel, Thomas; Hauer, Stefan; Mueller, Eric M; Murphy, Patrick; Abad, Sandra; Vasilieva, Elena; Maurer, Daniel; Brocard, Cécile; Reinisch, Daniela; Sagmeister, Patrick; Herwig, Christoph

    2017-10-17

    During the regulatory requested process validation of pharmaceutical manufacturing processes, companies aim to identify, control, and continuously monitor process variation and its impact on critical quality attributes (CQAs) of the final product. It is difficult to directly connect the impact of single process parameters (PPs) to final product CQAs, especially in biopharmaceutical process development and production, where multiple unit operations are stacked together and interact with each other. Therefore, we present the application of Monte Carlo (MC) simulation using an integrated process model (IPM) that enables estimation of process capability even in early stages of process validation. Once the IPM is established, its capability in risk and criticality assessment is furthermore demonstrated. IPMs can be used to enable holistic production control strategies that take interactions of process parameters of multiple unit operations into account. Moreover, IPMs can be trained with development data, refined with qualification runs, and maintained with routine manufacturing data, which underlines the lifecycle concept. These applications will be shown by means of a process characterization study recently conducted at a world-leading contract manufacturing organization (CMO). The new IPM methodology therefore allows anticipation of out-of-specification (OOS) events, identification of critical process parameters, and risk-based decisions on counteractions that increase process robustness and decrease the likelihood of OOS events.
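    A toy Monte Carlo sketch of the integrated-process-model idea follows; the unit operations, process-parameter distributions, regression coefficients, and the specification limit are all invented, and the point is only to show how variation is propagated through stacked, interacting steps to an estimated OOS probability.

    ```python
    import numpy as np

    rng = np.random.default_rng(4)
    n = 100_000   # Monte Carlo runs

    # Hypothetical process parameters (PPs) with their assumed variation
    ferm_temp  = rng.normal(37.0, 0.4, n)    # fermentation temperature [deg C]
    elution_ph = rng.normal(3.6, 0.08, n)    # capture-chromatography elution pH
    load_ratio = rng.normal(25.0, 1.5, n)    # polishing-step load [g/L resin]

    # Hypothetical regression models of the individual unit operations, chained so
    # that the output of one step is an input of the next (the "integrated" part).
    titer   = 5.0 - 0.8 * (ferm_temp - 37.0)**2 + rng.normal(0, 0.1, n)          # [g/L]
    purity1 = 88.0 + 1.2 * titer - 30.0 * np.abs(elution_ph - 3.6) + rng.normal(0, 0.5, n)
    cqa     = purity1 + 6.0 - 0.15 * (load_ratio - 25.0)**2 + rng.normal(0, 0.4, n)  # final purity [%]

    spec_lower = 95.0
    p_oos = np.mean(cqa < spec_lower)
    print("predicted OOS probability: %.3f%%" % (100 * p_oos))
    print("CQA mean %.2f, 0.5%% / 99.5%% quantiles: %.2f / %.2f"
          % (cqa.mean(), *np.quantile(cqa, [0.005, 0.995])))
    ```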

  4. Effects of rigor status during high-pressure processing on the physical qualities of farm-raised abalone (Haliotis rufescens).

    PubMed

    Hughes, Brianna H; Greenberg, Neil J; Yang, Tom C; Skonberg, Denise I

    2015-01-01

    High-pressure processing (HPP) is used to increase meat safety and shelf-life, with conflicting quality effects depending on rigor status during HPP. In the seafood industry, HPP is used to shuck and pasteurize oysters, but its use on abalones has only been minimally evaluated and the effect of rigor status during HPP on abalone quality has not been reported. Farm-raised abalones (Haliotis rufescens) were divided into 12 HPP treatments and 1 unprocessed control treatment. Treatments were processed pre-rigor or post-rigor at 2 pressures (100 and 300 MPa) and 3 processing times (1, 3, and 5 min). The control was analyzed post-rigor. Uniform plugs were cut from adductor and foot meat for texture profile analysis, shear force, and color analysis. Subsamples were used for scanning electron microscopy of muscle ultrastructure. Texture profile analysis revealed that post-rigor processed abalone was significantly (P < 0.05) less firm and chewy than pre-rigor processed irrespective of muscle type, processing time, or pressure. L values increased with pressure to 68.9 at 300 MPa for pre-rigor processed foot, 73.8 for post-rigor processed foot, 90.9 for pre-rigor processed adductor, and 89.0 for post-rigor processed adductor. Scanning electron microscopy images showed fraying of collagen fibers in processed adductor, but did not show pressure-induced compaction of the foot myofibrils. Post-rigor processed abalone meat was more tender than pre-rigor processed meat, and post-rigor processed foot meat was lighter in color than pre-rigor processed foot meat, suggesting that waiting for rigor to resolve prior to processing abalones may improve consumer perceptions of quality and market value. © 2014 Institute of Food Technologists®

  5. PROCESSING ALTERNATIVES FOR DESTRUCTION OF TETRAPHENYLBORATE

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lambert, D; Thomas Peters, T; Samuel Fink, S

    Two processes were chosen in the 1980's at the Savannah River Site (SRS) to decontaminate the soluble High Level Waste (HLW). The In Tank Precipitation (ITP) process (1,2) was developed at SRS for the removal of radioactive cesium and actinides from the soluble HLW. Sodium tetraphenylborate was added to the waste to precipitate cesium and monosodium titanate (MST) was added to adsorb actinides, primarily uranium and plutonium. Two products of this process were a low activity waste stream and a concentrated organic stream containing cesium tetraphenylborate and actinides adsorbed on monosodium titanate (MST). A copper catalyzed acid hydrolysis process was built to process (3, 4) the Tank 48H cesium tetraphenylborate waste in the SRS's Defense Waste Processing Facility (DWPF). Operation of the DWPF would have resulted in the production of benzene for incineration in SRS's Consolidated Incineration Facility. This process was abandoned together with the ITP process in 1998 due to high benzene in ITP caused by decomposition of excess sodium tetraphenylborate. Processing in ITP resulted in the production of approximately 1.0 million liters of HLW. SRS has chosen a solvent extraction process combined with adsorption of the actinides to decontaminate the soluble HLW stream (5). However, the waste in Tank 48H is incompatible with existing waste processing facilities. As a result, a processing facility is needed to disposition the HLW in Tank 48H. This paper will describe the process used by SRS task teams to search for processing options for the disposition of the waste in Tank 48H. In addition, attempts to develop a caustic hydrolysis process for in-tank destruction of tetraphenylborate will be presented. Lastly, the development of both a caustic and an acidic copper catalyzed peroxide oxidation process will be discussed.

  6. Manufacturing Process Selection of Composite Bicycle’s Crank Arm using Analytical Hierarchy Process (AHP)

    NASA Astrophysics Data System (ADS)

    Luqman, M.; Rosli, M. U.; Khor, C. Y.; Zambree, Shayfull; Jahidi, H.

    2018-03-01

    The crank arm is one of the important parts of a bicycle and is an expensive product due to the high cost of material and of the production process. This research aims to investigate potential manufacturing processes for fabricating a composite bicycle crank arm and to describe an approach based on the analytical hierarchy process (AHP) that assists decision makers or manufacturing engineers in determining the most suitable process to be employed in manufacturing the composite bicycle crank arm at an early stage of the product development process, so as to reduce production cost. Four types of processes were considered, namely resin transfer molding (RTM), compression molding (CM), vacuum bag molding and filament winding (FW). The analysis ranks these four processes for their suitability in the manufacturing of the bicycle crank arm based on five main selection factors and 10 sub-factors. The selection of the right manufacturing process was performed following the AHP steps. A consistency test was performed to make sure the judgements were consistent during the comparisons. The results indicated that compression molding was the most appropriate manufacturing process because it has the highest priority value (33.6%) among the considered manufacturing processes.
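    The sketch below shows the basic AHP arithmetic on a single invented pairwise comparison matrix for the four candidate processes (the study's actual judgements, five main factors, and ten sub-factors are not reproduced): priorities from the principal eigenvector and Saaty's consistency ratio.

    ```python
    import numpy as np

    processes = ["RTM", "Compression molding", "Vacuum bag molding", "Filament winding"]

    # Hypothetical pairwise comparison matrix on Saaty's 1-9 scale for one
    # criterion (a full study would aggregate several criteria and sub-factors).
    A = np.array([
        [1.0, 1/3, 2.0, 3.0],
        [3.0, 1.0, 4.0, 5.0],
        [1/2, 1/4, 1.0, 2.0],
        [1/3, 1/5, 1/2, 1.0],
    ])

    # Priority vector = normalized principal right eigenvector
    eigvals, eigvecs = np.linalg.eig(A)
    k = np.argmax(eigvals.real)
    w = np.abs(eigvecs[:, k].real)
    w /= w.sum()

    # Consistency check: CI = (lambda_max - n)/(n - 1), CR = CI/RI
    n = A.shape[0]
    lambda_max = eigvals.real[k]
    ci = (lambda_max - n) / (n - 1)
    ri = 0.90                       # Saaty's random index for n = 4
    cr = ci / ri

    for name, weight in sorted(zip(processes, w), key=lambda p: -p[1]):
        print("%-22s %.3f" % (name, weight))
    print("consistency ratio CR = %.3f (acceptable if < 0.10)" % cr)
    ```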

  7. A System-Oriented Approach for the Optimal Control of Process Chains under Stochastic Influences

    NASA Astrophysics Data System (ADS)

    Senn, Melanie; Schäfer, Julian; Pollak, Jürgen; Link, Norbert

    2011-09-01

    Process chains in manufacturing consist of multiple connected processes in terms of dynamic systems. The properties of a product passing through such a process chain are influenced by the transformation of each single process. There exist various methods for the control of individual processes, such as classical state controllers from cybernetics or function mapping approaches realized by statistical learning. These controllers ensure that a desired state is obtained at the process end despite variations in the input and disturbances. The interactions between the single processes are thereby neglected, but they play an important role in the optimization of the entire process chain. We divide the overall optimization into two phases: (1) the solution of the optimization problem by Dynamic Programming to find the optimal control variable values for each process for any encountered end state of its predecessor, and (2) the application of the optimal control variables at runtime for the detected initial process state. The optimization problem is solved by selecting adequate control variables for each process in the chain backwards, based on predefined quality requirements for the final product. For the demonstration of the proposed concept, we have chosen a process chain from sheet metal manufacturing with simplified transformation functions.
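    A small sketch of phase (1) and phase (2) for a two-process chain is given below, with a one-dimensional discretized product state, a handful of control settings, and invented transformation and cost functions; it only illustrates how Dynamic Programming stores an optimal control for every possible end state of the predecessor, which is then looked up at runtime.

    ```python
    import numpy as np

    # Discretized product state (e.g., a normalized sheet-thickness deviation)
    states = np.linspace(-1.0, 1.0, 41)
    controls = [-0.4, -0.2, 0.0, 0.2, 0.4]      # available control actions per process

    def step(x, u, process):
        """Invented transformation of the product state by one process."""
        gain = 0.6 if process == 0 else 0.8
        return np.clip(gain * x + u, -1.0, 1.0)

    def terminal_cost(x):
        return x**2                               # quality requirement: end state near 0

    def control_cost(u):
        return 0.05 * u**2

    n_proc = 2
    V = [None] * (n_proc + 1)
    policy = [None] * n_proc
    V[n_proc] = terminal_cost(states)

    # Phase 1 (offline): backward sweep storing, for every possible end state of
    # the predecessor, the optimal control of the current process.
    for p in range(n_proc - 1, -1, -1):
        V[p] = np.empty_like(states)
        policy[p] = np.empty_like(states)
        for i, x in enumerate(states):
            costs = []
            for u in controls:
                x_next = step(x, u, p)
                j = np.abs(states - x_next).argmin()      # nearest grid point
                costs.append(control_cost(u) + V[p + 1][j])
            best = int(np.argmin(costs))
            V[p][i], policy[p][i] = costs[best], controls[best]

    # Phase 2 (runtime): look up the stored optimal controls for a measured state.
    x = 0.7
    for p in range(n_proc):
        u = policy[p][np.abs(states - x).argmin()]
        print("process %d: state %+.2f -> control %+.2f" % (p, x, u))
        x = step(x, u, p)
    print("final state %+.3f" % x)
    ```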

  8. Quantitative analysis of geomorphic processes using satellite image data at different scales

    NASA Technical Reports Server (NTRS)

    Williams, R. S., Jr.

    1985-01-01

    When aerial and satellite photographs and images are used in the quantitative analysis of geomorphic processes, either through direct observation of active processes or by analysis of landforms resulting from inferred active or dormant processes, a number of limitations in the use of such data must be considered. Active geomorphic processes work at different scales and rates. Therefore, the capability of imaging an active or dormant process depends primarily on the scale of the process and the spatial-resolution characteristic of the imaging system. Scale is an important factor in recording continuous and discontinuous active geomorphic processes, because what is not recorded will not be considered or even suspected in the analysis of orbital images. If the geomorphic process, or the landform change caused by the process, is less than 200 m in its x or y dimension, then it will not be recorded. Although the scale factor is critical, in the recording of discontinuous active geomorphic processes the repeat interval of orbital-image acquisition of a planetary surface is also a consideration, in order to capture a recurring short-lived geomorphic process or to record changes caused by either a continuous or a discontinuous geomorphic process.

  9. Remote Sensing Image Quality Assessment Experiment with Post-Processing

    NASA Astrophysics Data System (ADS)

    Jiang, W.; Chen, S.; Wang, X.; Huang, Q.; Shi, H.; Man, Y.

    2018-04-01

    This paper briefly describes a post-processing influence assessment experiment. The experiment includes three steps: physical simulation, image processing, and image quality assessment. The physical simulation models a sampled imaging system in the laboratory; the imaging system parameters are measured, and the digital images serving as image-processing input are produced by this imaging system with the same imaging parameters. The gathered optically sampled images with the tested imaging parameters are processed by three digital image processes, including calibration pre-processing, lossy compression at different compression ratios, and image post-processing with different kernels. The image quality assessment method used is just noticeable difference (JND) subjective assessment based on ISO 20462; through subjective assessment of the gathered and processed images, the influence of different imaging parameters and of post-processing on image quality can be determined. The six sets of JND subjective assessment data can be used to validate each other. The main conclusions are: image post-processing can improve image quality; post-processing can improve image quality even with lossy compression, although image quality improves less at higher compression ratios than at lower ones; and with our image post-processing method, image quality is better when the camera MTF lies within a small range.

  10. Microstructure and Texture of Al-2.5wt.%Mg Processed by Combining Accumulative Roll Bonding and Conventional Rolling

    NASA Astrophysics Data System (ADS)

    Gatti, J. R.; Bhattacharjee, P. P.

    2014-12-01

    The evolution of microstructure and texture during severe deformation and annealing was studied in an Al-2.5%Mg alloy processed by two different routes, namely monotonic Accumulative Roll Bonding (ARB) and a hybrid route combining ARB and conventional rolling (CR). For this purpose, Al-2.5%Mg sheets were subjected to 5 cycles of monotonic ARB processing (equivalent strain (ɛeq) = 4.0), while in the hybrid route (ARB + CR) 3-cycle ARB-processed sheets were further deformed by conventional rolling to a 75% reduction in thickness (ɛeq = 4.0). Although the formation of an ultrafine structure was observed for both processing routes, the monotonic ARB-processed material showed a finer microstructure but weaker texture than the ARB + CR-processed material. After complete recrystallization, the ARB + CR-processed material showed a weak cube texture ({001}<100>), whereas the cube component was almost negligible in the monotonic ARB-processed material. However, the ND-rotated cube components were stronger in the monotonic ARB-processed material. The observed differences in microstructure and texture evolution during deformation and annealing could be explained by the characteristic differences between the two processing routes.

  11. Process Materialization Using Templates and Rules to Design Flexible Process Models

    NASA Astrophysics Data System (ADS)

    Kumar, Akhil; Yao, Wen

    The main idea in this paper is to show how flexible processes can be designed by combining generic process templates and business rules. We instantiate a process by applying rules to specific case data, and running a materialization algorithm. The customized process instance is then executed in an existing workflow engine. We present an architecture and also give an algorithm for process materialization. The rules are written in a logic-based language like Prolog. Our focus is on capturing deeper process knowledge and achieving a holistic approach to robust process design that encompasses control flow, resources and data, as well as makes it easier to accommodate changes to business policy.
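    A minimal Python stand-in for the materialization step follows (the paper uses a logic-based rule language such as Prolog and executes the customized instance in an existing workflow engine; the template, rules, and case data here are invented): rules inspect the case data and rewrite a generic template into a concrete process instance.

    ```python
    # Generic process template: ordered steps, some marked optional ("?") or variable (":")
    TEMPLATE = ["receive_order", "check_credit?", "pick_goods", "ship:standard", "invoice"]

    # Business rules: predicates over the case data paired with template rewrites
    RULES = [
        # drop the credit check for trusted repeat customers with small orders
        (lambda case: case["repeat_customer"] and case["amount"] < 1000,
         lambda steps: [s for s in steps if s != "check_credit?"]),
        # express shipping for premium customers
        (lambda case: case["premium"],
         lambda steps: [s.replace("ship:standard", "ship:express") for s in steps]),
    ]

    def materialize(template, rules, case):
        """Apply every applicable rule to the template to obtain a concrete instance."""
        steps = list(template)
        for condition, transform in rules:
            if condition(case):
                steps = transform(steps)
        # strip the optional marker from any steps that remain
        return [s.rstrip("?") for s in steps]

    case = {"repeat_customer": True, "premium": True, "amount": 450}
    print(materialize(TEMPLATE, RULES, case))
    # -> ['receive_order', 'pick_goods', 'ship:express', 'invoice']
    ```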

  12. HMI conventions for process control graphics.

    PubMed

    Pikaar, Ruud N

    2012-01-01

    Process operators supervise and control complex processes. To enable the operator to do an adequate job, instrumentation and process control engineers need to address several related topics, such as console design, information design, navigation, and alarm management. In process control upgrade projects, usually a 1:1 conversion of existing graphics is proposed. This paper suggests another approach, efficiently leading to a reduced number of new, powerful process graphics supported by permanent process overview displays. In addition, a road map for structuring content (process information) and conventions for the presentation of objects, symbols, and so on have been developed. The impact of the human factors engineering approach on process control upgrade projects is illustrated by several cases.

  13. A novel processed food classification system applied to Australian food composition databases.

    PubMed

    O'Halloran, S A; Lacy, K E; Grimes, C A; Woods, J; Campbell, K J; Nowson, C A

    2017-08-01

    The extent of food processing can affect the nutritional quality of foodstuffs. Categorising foods by the level of processing emphasises the differences in nutritional quality between foods within the same food group and is likely useful for determining dietary processed food consumption. The present study aimed to categorise foods within Australian food composition databases according to the level of food processing using a processed food classification system, as well as to assess the variation in the levels of processing within food groups. A processed food classification system was applied to food and beverage items contained within the Australian Food and Nutrient (AUSNUT) 2007 (n = 3874) and AUSNUT 2011-13 (n = 5740) databases. The proportions of Minimally Processed (MP), Processed Culinary Ingredients (PCI), Processed (P) and Ultra Processed (ULP) foods by AUSNUT food group, and the overall proportions of the four processed food categories across AUSNUT 2007 and AUSNUT 2011-13, were calculated. Across the food composition databases, the overall proportions of foods classified as MP, PCI, P and ULP were 27%, 3%, 26% and 44% for AUSNUT 2007 and 38%, 2%, 24% and 36% for AUSNUT 2011-13. Although there was wide variation in the classification of food processing within the food groups, approximately one-third of foodstuffs were classified as ULP food items across both the 2007 and 2011-13 AUSNUT databases. This Australian processed food classification system will allow researchers to easily quantify the contribution of processed foods within the Australian food supply to assist in assessing the nutritional quality of the dietary intake of population groups. © 2017 The British Dietetic Association Ltd.

  14. Process and domain specificity in regions engaged for face processing: an fMRI study of perceptual differentiation.

    PubMed

    Collins, Heather R; Zhu, Xun; Bhatt, Ramesh S; Clark, Jonathan D; Joseph, Jane E

    2012-12-01

    The degree to which face-specific brain regions are specialized for different kinds of perceptual processing is debated. This study parametrically varied demands on featural, first-order configural, or second-order configural processing of faces and houses in a perceptual matching task to determine the extent to which the process of perceptual differentiation was selective for faces regardless of processing type (domain-specific account), specialized for specific types of perceptual processing regardless of category (process-specific account), engaged in category-optimized processing (i.e., configural face processing or featural house processing), or reflected generalized perceptual differentiation (i.e., differentiation that crosses category and processing type boundaries). ROIs were identified in a separate localizer run or with a similarity regressor in the face-matching runs. The predominant principle accounting for fMRI signal modulation in most regions was generalized perceptual differentiation. Nearly all regions showed perceptual differentiation for both faces and houses for more than one processing type, even if the region was identified as face-preferential in the localizer run. Consistent with process specificity, some regions showed perceptual differentiation for first-order processing of faces and houses (right fusiform face area and occipito-temporal cortex and right lateral occipital complex), but not for featural or second-order processing. Somewhat consistent with domain specificity, the right inferior frontal gyrus showed perceptual differentiation only for faces in the featural matching task. The present findings demonstrate that the majority of regions involved in perceptual differentiation of faces are also involved in differentiation of other visually homogenous categories.

  15. Process- and Domain-Specificity in Regions Engaged for Face Processing: An fMRI Study of Perceptual Differentiation

    PubMed Central

    Collins, Heather R.; Zhu, Xun; Bhatt, Ramesh S.; Clark, Jonathan D.; Joseph, Jane E.

    2015-01-01

    The degree to which face-specific brain regions are specialized for different kinds of perceptual processing is debated. The present study parametrically varied demands on featural, first-order configural or second-order configural processing of faces and houses in a perceptual matching task to determine the extent to which the process of perceptual differentiation was selective for faces regardless of processing type (domain-specific account), specialized for specific types of perceptual processing regardless of category (process-specific account), engaged in category-optimized processing (i.e., configural face processing or featural house processing) or reflected generalized perceptual differentiation (i.e. differentiation that crosses category and processing type boundaries). Regions of interest were identified in a separate localizer run or with a similarity regressor in the face-matching runs. The predominant principle accounting for fMRI signal modulation in most regions was generalized perceptual differentiation. Nearly all regions showed perceptual differentiation for both faces and houses for more than one processing type, even if the region was identified as face-preferential in the localizer run. Consistent with process-specificity, some regions showed perceptual differentiation for first-order processing of faces and houses (right fusiform face area and occipito-temporal cortex, and right lateral occipital complex), but not for featural or second-order processing. Somewhat consistent with domain-specificity, the right inferior frontal gyrus showed perceptual differentiation only for faces in the featural matching task. The present findings demonstrate that the majority of regions involved in perceptual differentiation of faces are also involved in differentiation of other visually homogenous categories. PMID:22849402

  16. Achieving Continuous Manufacturing for Final Dosage Formation: Challenges and How to Meet Them May 20-21 2014 Continuous Manufacturing Symposium.

    PubMed

    Byrn, Stephen; Futran, Maricio; Thomas, Hayden; Jayjock, Eric; Maron, Nicola; Meyer, Robert F; Myerson, Allan S; Thien, Michael P; Trout, Bernhardt L

    2015-03-01

    We describe the key issues and possibilities for continuous final dosage formation, otherwise known as downstream processing or drug product manufacturing. A distinction is made between heterogeneous processing and homogeneous processing, the latter of which is expected to add more value to continuous manufacturing. We also give the key motivations for moving to continuous manufacturing, some of the exciting new technologies, and the barriers to implementation of continuous manufacturing. Continuous processing of heterogeneous blends is the natural first step in converting existing batch processes to continuous. In heterogeneous processing, there are discrete particles that can segregate, versus in homogeneous processing, components are blended and homogenized such that they do not segregate. Heterogeneous processing can incorporate technologies that are closer to existing technologies, where homogeneous processing necessitates the development and incorporation of new technologies. Homogeneous processing has the greatest potential for reaping the full rewards of continuous manufacturing, but it takes long-term vision and a more significant change in process development than heterogeneous processing. Heterogeneous processing has the detriment that, as the technologies are adopted rather than developed, there is a strong tendency to incorporate correction steps, what we call below "The Rube Goldberg Problem." Thus, although heterogeneous processing will likely play a major role in the near-term transformation of heterogeneous to continuous processing, it is expected that homogeneous processing is the next step that will follow. Specific action items for industry leaders are. © 2014 Wiley Periodicals, Inc. and the American Pharmacists Association.

  17. An Analysis of the Air Force Government Operated Civil Engineering Supply Store Logistic System: How Can It Be Improved?

    DTIC Science & Technology

    1990-09-01

    Excerpts (table of contents and text): Logistics Systems; GOCESS Operation; Work Order Processing; Job Order Processing. Work order and job order processing to the Material Control Section are discussed separately: Figure 2 illustrates typical WO processing in a GOCESS operation, and Figure 3 illustrates typical JO processing, which is similar.

  18. Adaptive-optics optical coherence tomography processing using a graphics processing unit.

    PubMed

    Shafer, Brandon A; Kriske, Jeffery E; Kocaoglu, Omer P; Turner, Timothy L; Liu, Zhuolin; Lee, John Jaehwan; Miller, Donald T

    2014-01-01

    Graphics processing units are increasingly being used for scientific computing because of their powerful parallel processing abilities and moderate price compared to supercomputers and computing grids. In this paper we have used a general-purpose graphics processing unit to process adaptive-optics optical coherence tomography (AOOCT) images in real time. Increasing the processing speed of AOOCT is an essential step in moving this super-high-resolution technology closer to clinical viability.

  19. Data processing system for the Sneg-2MP experiment

    NASA Technical Reports Server (NTRS)

    Gavrilova, Y. A.

    1980-01-01

    The data processing system for scientific experiments on stations of the "Prognoz" type provides for the processing sequence to be broken down into a number of consecutive stages: preliminary processing, primary processing, secondary processing. The tasks of each data processing stage are examined for an experiment designed to study gamma flashes of galactic origin and solar flares lasting from several minutes to seconds in the 20 keV to 1000 keV energy range.
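
    The staged pipeline described above (preliminary, primary, secondary processing) can be sketched in a few lines of Python; the data format, calibration factor, and summary product below are invented for illustration.

      # Minimal sketch of a three-stage telemetry-processing pipeline, loosely
      # following the stage names in the abstract; all details are invented.
      from typing import Dict, List

      def preliminary(raw_frames: List[bytes]) -> List[Dict]:
          """Unpack raw frames and drop empty (corrupted) ones."""
          return [{"counts": frame[0]} for frame in raw_frames if len(frame) > 0]

      def primary(records: List[Dict]) -> List[Dict]:
          """Convert instrument counts to physical units (hypothetical calibration)."""
          return [{"flux": r["counts"] * 0.5} for r in records]

      def secondary(records: List[Dict]) -> Dict:
          """Derive a summary product, here a mean flux over the interval."""
          fluxes = [r["flux"] for r in records]
          return {"mean_flux": sum(fluxes) / len(fluxes) if fluxes else 0.0}

      if __name__ == "__main__":
          raw = [bytes([10]), bytes([12]), b""]
          print(secondary(primary(preliminary(raw))))   # {'mean_flux': 5.5}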

  20. General RMP Guidance - Appendix D: OSHA Guidance on PSM

    EPA Pesticide Factsheets

    OSHA's Process Safety Management (PSM) Guidance on providing complete and accurate written information concerning process chemicals, process technology, and process equipment; including process hazard analysis and material safety data sheets.

  1. Elaboration Likelihood and the Counseling Process: The Role of Affect.

    ERIC Educational Resources Information Center

    Stoltenberg, Cal D.; And Others

    The role of affect in counseling has been examined from several orientations. The depth of processing model views the efficiency of information processing as a function of the extent to which the information is processed. The notion of cognitive processing capacity states that processing information at deeper levels engages more of one's limited…

  2. 5 CFR 582.202 - Service of legal process.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 5 Administrative Personnel 1 2011-01-01 2011-01-01 false Service of legal process. 582.202 Section... GARNISHMENT OF FEDERAL EMPLOYEES' PAY Service of Legal Process § 582.202 Service of legal process. (a) A person using this part shall serve interrogatories and legal process on the agent to receive process as...

  3. 5 CFR 582.202 - Service of legal process.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 5 Administrative Personnel 1 2014-01-01 2014-01-01 false Service of legal process. 582.202 Section... GARNISHMENT OF FEDERAL EMPLOYEES' PAY Service of Legal Process § 582.202 Service of legal process. (a) A person using this part shall serve interrogatories and legal process on the agent to receive process as...

  4. Information Processing Concepts: A Cure for "Technofright." Information Processing in the Electronic Office. Part 1: Concepts.

    ERIC Educational Resources Information Center

    Popyk, Marilyn K.

    1986-01-01

    Discusses the new automated office and its six major technologies (data processing, word processing, graphics, image, voice, and networking), the information processing cycle (input, processing, output, distribution/communication, and storage and retrieval), ergonomics, and ways to expand office education classes (versus class instruction). (CT)

  5. Facial Speech Gestures: The Relation between Visual Speech Processing, Phonological Awareness, and Developmental Dyslexia in 10-Year-Olds

    ERIC Educational Resources Information Center

    Schaadt, Gesa; Männel, Claudia; van der Meer, Elke; Pannekamp, Ann; Friederici, Angela D.

    2016-01-01

    Successful communication in everyday life crucially involves the processing of auditory and visual components of speech. Viewing our interlocutor and processing visual components of speech facilitates speech processing by triggering auditory processing. Auditory phoneme processing, analyzed by event-related brain potentials (ERP), has been shown…

  6. 40 CFR 65.62 - Process vent group determination.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ..., or Group 2B) for each process vent. Group 1 process vents require control, and Group 2A and 2B... 40 Protection of Environment 15 2010-07-01 2010-07-01 false Process vent group determination. 65... (CONTINUED) CONSOLIDATED FEDERAL AIR RULE Process Vents § 65.62 Process vent group determination. (a) Group...

  7. 40 CFR 63.138 - Process wastewater provisions-performance standards for treatment processes managing Group 1...

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    .../or Table 9 compounds are similar and often identical. (3) Biological treatment processes. Biological treatment processes in compliance with this section may be either open or closed biological treatment processes as defined in § 63.111. An open biological treatment process in compliance with this section need...

  8. 5 CFR 581.202 - Service of process.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 5 Administrative Personnel 1 2010-01-01 2010-01-01 false Service of process. 581.202 Section 581... GARNISHMENT ORDERS FOR CHILD SUPPORT AND/OR ALIMONY Service of Process § 581.202 Service of process. (a) A... facilitate proper service of process on its designated agent(s). If legal process is not directed to any...

  9. 30 CFR 828.11 - In situ processing: Performance standards.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 30 Mineral Resources 3 2011-07-01 2011-07-01 false In situ processing: Performance standards. 828... STANDARDS-IN SITU PROCESSING § 828.11 In situ processing: Performance standards. (a) The person who conducts in situ processing activities shall comply with 30 CFR 817 and this section. (b) In situ processing...

  10. 30 CFR 828.11 - In situ processing: Performance standards.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 30 Mineral Resources 3 2012-07-01 2012-07-01 false In situ processing: Performance standards. 828... STANDARDS-IN SITU PROCESSING § 828.11 In situ processing: Performance standards. (a) The person who conducts in situ processing activities shall comply with 30 CFR 817 and this section. (b) In situ processing...

  11. 30 CFR 828.11 - In situ processing: Performance standards.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 30 Mineral Resources 3 2013-07-01 2013-07-01 false In situ processing: Performance standards. 828... STANDARDS-IN SITU PROCESSING § 828.11 In situ processing: Performance standards. (a) The person who conducts in situ processing activities shall comply with 30 CFR 817 and this section. (b) In situ processing...

  12. 30 CFR 828.11 - In situ processing: Performance standards.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 30 Mineral Resources 3 2010-07-01 2010-07-01 false In situ processing: Performance standards. 828... STANDARDS-IN SITU PROCESSING § 828.11 In situ processing: Performance standards. (a) The person who conducts in situ processing activities shall comply with 30 CFR 817 and this section. (b) In situ processing...

  13. 30 CFR 828.11 - In situ processing: Performance standards.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 30 Mineral Resources 3 2014-07-01 2014-07-01 false In situ processing: Performance standards. 828... STANDARDS-IN SITU PROCESSING § 828.11 In situ processing: Performance standards. (a) The person who conducts in situ processing activities shall comply with 30 CFR 817 and this section. (b) In situ processing...

  14. Processing Depth, Elaboration of Encoding, Memory Stores, and Expended Processing Capacity.

    ERIC Educational Resources Information Center

    Eysenck, Michael W.; Eysenck, M. Christine

    1979-01-01

    The effects of several factors on expended processing capacity were measured. Expended processing capacity was greater when information was retrieved from secondary memory than from primary memory, when processing was of a deep, semantic nature than when it was shallow and physical, and when processing was more elaborate. (Author/GDC)

  15. Speed isn’t everything: Complex processing speed measures mask individual differences and developmental changes in executive control

    PubMed Central

    Cepeda, Nicholas J.; Blackwell, Katharine A.; Munakata, Yuko

    2012-01-01

    The rate at which people process information appears to influence many aspects of cognition across the lifespan. However, many commonly accepted measures of “processing speed” may require goal maintenance, manipulation of information in working memory, and decision-making, blurring the distinction between processing speed and executive control and resulting in overestimation of processing-speed contributions to cognition. This concern may apply particularly to studies of developmental change, as even seemingly simple processing speed measures may require executive processes to keep children and older adults on task. We report two new studies and a re-analysis of a published study, testing predictions about how different processing speed measures influence conclusions about executive control across the life span. We find that the choice of processing speed measure affects the relationship observed between processing speed and executive control, in a manner that changes with age, and that choice of processing speed measure affects conclusions about development and the relationship among executive control measures. Implications for understanding processing speed, executive control, and their development are discussed. PMID:23432836

  16. The impact of working memory and the “process of process modelling” on model quality: Investigating experienced versus inexperienced modellers

    NASA Astrophysics Data System (ADS)

    Martini, Markus; Pinggera, Jakob; Neurauter, Manuel; Sachse, Pierre; Furtner, Marco R.; Weber, Barbara

    2016-05-01

    A process model (PM) represents the graphical depiction of a business process, for instance, the entire process from online ordering a book until the parcel is delivered to the customer. Knowledge about relevant factors for creating PMs of high quality is lacking. The present study investigated the role of cognitive processes as well as modelling processes in creating a PM in experienced and inexperienced modellers. Specifically, two working memory (WM) functions (holding and processing of information and relational integration) and three process of process modelling phases (comprehension, modelling, and reconciliation) were related to PM quality. Our results show that the WM function of relational integration was positively related to PM quality in both modelling groups. The ratio of comprehension phases was negatively related to PM quality in inexperienced modellers and the ratio of reconciliation phases was positively related to PM quality in experienced modellers. Our research reveals central cognitive mechanisms in process modelling and has potential practical implications for the development of modelling software and teaching the craft of process modelling.

  17. A new class of random processes with application to helicopter noise

    NASA Technical Reports Server (NTRS)

    Hardin, Jay C.; Miamee, A. G.

    1989-01-01

    The concept of dividing random processes into classes (e.g., stationary, locally stationary, periodically correlated, and harmonizable) has long been employed. A new class of random processes is introduced which includes many of these processes as well as other interesting processes which fall into none of the above classes. Such random processes are denoted as linearly correlated. This class is shown to include the familiar stationary and periodically correlated processes as well as many other, both harmonizable and non-harmonizable, nonstationary processes. When a process is linearly correlated for all t and harmonizable, its two-dimensional power spectral density S_X(omega_1, omega_2) is shown to take a particularly simple form, being non-zero only on lines such that omega_1 - omega_2 = +/- r_k, where the r_k are (not necessarily equally spaced) roots of a characteristic function. The relationship of such processes to the class of stationary processes is examined. In addition, the application of such processes in the analysis of typical helicopter noise signals is described.
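
    Written out, the support condition described above reads as follows; this is a plausible reconstruction in LaTeX, with the minus sign and subscripts inferred from context rather than taken from the paper's own notation.

      % Spectral support of a harmonizable, linearly correlated process
      % (reconstructed reading of the relation quoted in the abstract).
      \[
        S_X(\omega_1, \omega_2) \neq 0
        \quad \text{only on the lines} \quad
        \omega_1 - \omega_2 = \pm r_k ,
      \]
      % where the r_k are the (not necessarily equally spaced) roots of a
      % characteristic function. A stationary process concentrates its spectral
      % mass on the single line \omega_1 = \omega_2, and a periodically
      % correlated process on equally spaced lines, consistent with both being
      % special cases of this class.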

  18. The impact of working memory and the “process of process modelling” on model quality: Investigating experienced versus inexperienced modellers

    PubMed Central

    Martini, Markus; Pinggera, Jakob; Neurauter, Manuel; Sachse, Pierre; Furtner, Marco R.; Weber, Barbara

    2016-01-01

    A process model (PM) represents the graphical depiction of a business process, for instance, the entire process from online ordering a book until the parcel is delivered to the customer. Knowledge about relevant factors for creating PMs of high quality is lacking. The present study investigated the role of cognitive processes as well as modelling processes in creating a PM in experienced and inexperienced modellers. Specifically, two working memory (WM) functions (holding and processing of information and relational integration) and three process of process modelling phases (comprehension, modelling, and reconciliation) were related to PM quality. Our results show that the WM function of relational integration was positively related to PM quality in both modelling groups. The ratio of comprehension phases was negatively related to PM quality in inexperienced modellers and the ratio of reconciliation phases was positively related to PM quality in experienced modellers. Our research reveals central cognitive mechanisms in process modelling and has potential practical implications for the development of modelling software and teaching the craft of process modelling. PMID:27157858

  20. The impact of working memory and the "process of process modelling" on model quality: Investigating experienced versus inexperienced modellers.

    PubMed

    Martini, Markus; Pinggera, Jakob; Neurauter, Manuel; Sachse, Pierre; Furtner, Marco R; Weber, Barbara

    2016-05-09

    A process model (PM) represents the graphical depiction of a business process, for instance, the entire process from online ordering a book until the parcel is delivered to the customer. Knowledge about relevant factors for creating PMs of high quality is lacking. The present study investigated the role of cognitive processes as well as modelling processes in creating a PM in experienced and inexperienced modellers. Specifically, two working memory (WM) functions (holding and processing of information and relational integration) and three process of process modelling phases (comprehension, modelling, and reconciliation) were related to PM quality. Our results show that the WM function of relational integration was positively related to PM quality in both modelling groups. The ratio of comprehension phases was negatively related to PM quality in inexperienced modellers and the ratio of reconciliation phases was positively related to PM quality in experienced modellers. Our research reveals central cognitive mechanisms in process modelling and has potential practical implications for the development of modelling software and teaching the craft of process modelling.

  1. Rapid Automatized Naming in Children with Dyslexia: Is Inhibitory Control Involved?

    PubMed

    Bexkens, Anika; van den Wildenberg, Wery P M; Tijms, Jurgen

    2015-08-01

    Rapid automatized naming (RAN) is widely seen as an important indicator of dyslexia. The nature of the cognitive processes involved in rapid naming is, however, still a topic of controversy. We hypothesized that in addition to the involvement of phonological processes and processing speed, RAN is a function of inhibition processes, in particular of interference control. A total of 86 children with dyslexia and 31 normal readers were recruited. Our results revealed that in addition to phonological processing and processing speed, interference control predicts rapid naming in dyslexia, but in contrast to these other two cognitive processes, inhibition is not significantly associated with their reading and spelling skills. After variance in reading and spelling associated with processing speed, interference control and phonological processing was partialled out, naming speed was no longer consistently associated with the reading and spelling skills of children with dyslexia. Finally, dyslexic children differed from normal readers on naming speed, literacy skills, phonological processing and processing speed, but not on inhibition processes. Both theoretical and clinical interpretations of these results are discussed. Copyright © 2014 John Wiley & Sons, Ltd.

  2. Feasibility of using continuous chromatography in downstream processing: Comparison of costs and product quality for a hybrid process vs. a conventional batch process.

    PubMed

    Ötes, Ozan; Flato, Hendrik; Winderl, Johannes; Hubbuch, Jürgen; Capito, Florian

    2017-10-10

    The protein A capture step is the main cost-driver in downstream processing, with high attrition costs, especially when protein A resin is not used to the end of its lifetime. Here we describe a feasibility study, transferring a batch downstream process to a hybrid process, aimed at replacing batch protein A capture chromatography with a continuous capture step, while leaving the polishing steps unchanged to minimize required process adaptations compared to a batch process. 35 g of antibody were purified using the hybrid approach, resulting in comparable product quality and step yield compared to the batch process. Productivity for the protein A step could be increased up to 420%, reducing buffer amounts by 30-40% and showing robustness for at least 48 h continuous run time. Additionally, to enable its potential application in a clinical trial manufacturing environment, the cost of goods was compared for the protein A step between the hybrid process and the batch process, showing a 300% cost reduction, depending on processed volumes and batch cycles. Copyright © 2017 Elsevier B.V. All rights reserved.

  3. Non-Conscious Perception of Emotions in Psychiatric Disorders: The Unsolved Puzzle of Psychopathology.

    PubMed

    Lee, Seung A; Kim, Chai-Youn; Lee, Seung-Hwan

    2016-03-01

    Psychophysiological and functional neuroimaging studies have frequently and consistently shown that emotional information can be processed outside of the conscious awareness. Non-conscious processing comprises automatic, uncontrolled, and fast processing that occurs without subjective awareness. However, how such non-conscious emotional processing occurs in patients with various psychiatric disorders requires further examination. In this article, we reviewed and discussed previous studies on the non-conscious emotional processing in patients diagnosed with anxiety disorder, schizophrenia, bipolar disorder, and depression, to further understand how non-conscious emotional processing varies across these psychiatric disorders. Although the symptom profile of each disorder does not often overlap with one another, these patients commonly show abnormal emotional processing based on the pathology of their mood and cognitive function. This indicates that the observed abnormalities of emotional processing in certain social interactions may derive from a biased mood or cognition process that precedes consciously controlled and voluntary processes. Since preconscious forms of emotional processing appear to have a major effect on behaviour and cognition in patients with these disorders, further investigation is required to understand these processes and their impact on patient pathology.

  4. Empirical evaluation of the Process Overview Measure for assessing situation awareness in process plants.

    PubMed

    Lau, Nathan; Jamieson, Greg A; Skraaning, Gyrd

    2016-03-01

    The Process Overview Measure is a query-based measure developed to assess operator situation awareness (SA) from monitoring process plants. A companion paper describes how the measure has been developed according to process plant properties and operator cognitive work. The Process Overview Measure demonstrated practicality, sensitivity, validity and reliability in two full-scope simulator experiments investigating dramatically different operational concepts. Practicality was assessed based on qualitative feedback of participants and researchers. The Process Overview Measure demonstrated sensitivity and validity by revealing significant effects of experimental manipulations that corroborated with other empirical results. The measure also demonstrated adequate inter-rater reliability and practicality for measuring SA in full-scope simulator settings based on data collected on process experts. Thus, full-scope simulator studies can employ the Process Overview Measure to reveal the impact of new control room technology and operational concepts on monitoring process plants. Practitioner Summary: The Process Overview Measure is a query-based measure that demonstrated practicality, sensitivity, validity and reliability for assessing operator situation awareness (SA) from monitoring process plants in representative settings.

  5. A Framework for Business Process Change Requirements Analysis

    NASA Astrophysics Data System (ADS)

    Grover, Varun; Otim, Samuel

    The ability to quickly and continually adapt business processes to accommodate evolving requirements and opportunities is critical for success in competitive environments. Without appropriate linkage between redesign decisions and strategic inputs, identifying processes that need to be modified will be difficult. In this paper, we draw attention to the analysis of business process change requirements in support of process change initiatives. Business process redesign is a multifaceted phenomenon involving processes, organizational structure, management systems, human resource architecture, and many other aspects of organizational life. To be successful, the business process initiative should focus not only on identifying the processes to be redesigned, but also pay attention to various enablers of change. Above all, a framework is just a blueprint; management must lead change. We hope our modest contribution will draw attention to the broader framing of requirements for business process change.

  6. Process Analytical Technology (PAT): batch-to-batch reproducibility of fermentation processes by robust process operational design and control.

    PubMed

    Gnoth, S; Jenzsch, M; Simutis, R; Lübbert, A

    2007-10-31

    The Process Analytical Technology (PAT) initiative of the FDA is a reaction to the increasing discrepancy between current possibilities in process supervision and control of pharmaceutical production processes and their current application in industrial manufacturing processes. With rigid approval practices based on standard operational procedures, adaptations of production reactors towards the state of the art were more or less inhibited for many years. Now PAT paves the way for continuous process and product improvements through improved process supervision based on knowledge-based data analysis, "Quality-by-Design" concepts, and, finally, through feedback control. Examples of up-to-date implementations of this concept are presented. They are taken from one key group of processes in recombinant pharmaceutical protein manufacturing, the cultivations of genetically modified Escherichia coli bacteria.

  7. When teams shift among processes: insights from simulation and optimization.

    PubMed

    Kennedy, Deanna M; McComb, Sara A

    2014-09-01

    This article introduces process shifts to study the temporal interplay among transition and action processes espoused in the recurring phase model proposed by Marks, Mathieu, and Zacarro (2001). Process shifts are those points in time when teams complete a focal process and change to another process. By using team communication patterns to measure process shifts, this research explores (a) when teams shift among different transition processes and initiate action processes and (b) the potential of different interventions, such as communication directives, to manipulate process shift timing and order and, ultimately, team performance. Virtual experiments are employed to compare data from observed laboratory teams not receiving interventions, simulated teams receiving interventions, and optimal simulated teams generated using genetic algorithm procedures. Our results offer insights about the potential for different interventions to affect team performance. Moreover, certain interventions may promote discussions about key issues (e.g., tactical strategies) and facilitate shifting among transition processes in a manner that emulates optimal simulated teams' communication patterns. Thus, we contribute to theory regarding team processes in 2 important ways. First, we present process shifts as a way to explore the timing of when teams shift from transition to action processes. Second, we use virtual experimentation to identify those interventions with the greatest potential to affect performance by changing when teams shift among processes. Additionally, we employ computational methods including neural networks, simulation, and optimization, thereby demonstrating their applicability in conducting team research. PsycINFO Database Record (c) 2014 APA, all rights reserved.

  8. Nitrous oxide and methane emissions from different treatment processes in full-scale municipal wastewater treatment plants.

    PubMed

    Rena, Y G; Wang, J H; Li, H F; Zhang, J; Qi, P Y; Hu, Z

    2013-01-01

    Nitrous oxide (N2O) and methane (CH4) are two important greenhouse gases (GHG) emitted from biological nutrient removal (BNR) processes in municipal wastewater treatment plants (WWTP). In this study, three typical biological wastewater treatment processes were studied in WWTP of Northern China: the pre-anaerobic carrousel oxidation ditch (A+OD) process, the pre-anoxic anaerobic-anoxic-oxic (A-A/A/O) process and the reverse anaerobic-anoxic-oxic (r-A/A/O) process. The N2O and CH4 emissions from these three different processes were measured in every processing unit of each WWTP. Results showed that N2O and CH4 were mainly discharged during the nitrification/denitrification process and the anaerobic/anoxic treatment process, respectively, and the amounts of their formation and release were significantly influenced by the different BNR processes implemented in these WWTP. The N2O conversion ratio of the r-A/A/O process was the lowest among the three WWTP, being 10.9% and 18.6% lower than that of the A-A/A/O process and the A+OD process, respectively. Similarly, the CH4 conversion ratio of the r-A/A/O process was the lowest among the three WWTP, being 89.1% and 80.8% lower than that of the A-A/A/O process and the A+OD process, respectively. The factors influencing N2O and CH4 formation and emission in the three WWTP were investigated to explain the difference between these processes. The nitrite concentration and oxidation-reduction potential (ORP) value were found to be the dominant influencing factors affecting N2O and CH4 production, respectively. The flow-based emission factors of N2O and CH4 of the WWTP were determined for better quantification of GHG emissions and further technical assessments of mitigation options.

  9. Effects of children's working memory capacity and processing speed on their sentence imitation performance.

    PubMed

    Poll, Gerard H; Miller, Carol A; Mainela-Arnold, Elina; Adams, Katharine Donnelly; Misra, Maya; Park, Ji Sook

    2013-01-01

    More limited working memory capacity and slower processing for language and cognitive tasks are characteristics of many children with language difficulties. Individual differences in processing speed have not consistently been found to predict language ability or severity of language impairment. There are conflicting views on whether working memory and processing speed are integrated or separable abilities. To evaluate four models for the relations of individual differences in children's processing speed and working memory capacity in sentence imitation. The models considered whether working memory and processing speed are integrated or separable, as well as the effect of the number of operations required per sentence. The role of working memory as a mediator of the effect of processing speed on sentence imitation was also evaluated. Forty-six children with varied language and reading abilities imitated sentences. Working memory was measured with the Competing Language Processing Task (CLPT), and processing speed was measured with a composite of truth-value judgment and rapid automatized naming tasks. Mixed-effects ordinal regression models evaluated the CLPT and processing speed as predictors of sentence imitation item scores. A single mediator model evaluated working memory as a mediator of the effect of processing speed on sentence imitation total scores. Working memory was a reliable predictor of sentence imitation accuracy, but processing speed predicted sentence imitation only as a component of a processing speed by number of operations interaction. Processing speed predicted working memory capacity, and there was evidence that working memory acted as a mediator of the effect of processing speed on sentence imitation accuracy. The findings support a refined view of working memory and processing speed as separable factors in children's sentence imitation performance. Processing speed does not independently explain sentence imitation accuracy for all sentence types, but contributes when the task requires more mental operations. Processing speed also has an indirect effect on sentence imitation by contributing to working memory capacity. © 2013 Royal College of Speech and Language Therapists.
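
    The single-mediator analysis mentioned above can be illustrated with the generic product-of-coefficients approach; the sketch below uses simulated data and ordinary least squares, not the study's actual data or its mixed-effects ordinal regressions.

      # Generic single-mediator sketch: processing speed (X) -> working memory (M)
      # -> sentence imitation (Y), on simulated data. Effect sizes are invented.
      import numpy as np

      rng = np.random.default_rng(0)
      n = 200
      speed = rng.normal(size=n)                                           # predictor X
      wm = 0.6 * speed + rng.normal(scale=0.8, size=n)                     # mediator M
      imitation = 0.5 * wm + 0.2 * speed + rng.normal(scale=0.8, size=n)   # outcome Y

      def ols(y, *xs):
          """Ordinary least squares coefficients for y ~ 1 + xs."""
          X = np.column_stack([np.ones_like(y)] + list(xs))
          return np.linalg.lstsq(X, y, rcond=None)[0]

      a = ols(wm, speed)[1]                        # path X -> M
      _, c_prime, b = ols(imitation, speed, wm)    # direct effect c' and path M -> Y
      print(f"indirect effect a*b = {a * b:.2f}, direct effect c' = {c_prime:.2f}")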

  10. Q-marker based strategy for CMC research of Chinese medicine: A case study of Panax Notoginseng saponins.

    PubMed

    Zhong, Yi; Zhu, Jieqiang; Yang, Zhenzhong; Shao, Qing; Fan, Xiaohui; Cheng, Yiyu

    2018-01-31

    To ensure pharmaceutical quality, chemistry, manufacturing and control (CMC) research is essential. However, due to the inherent complexity of Chinese medicine (CM), CMC study of CM remains a great challenge for academia, industry, and regulatory agencies. Recently, the quality-marker (Q-marker) concept was proposed to establish quality standards or quality analysis approaches for Chinese medicine, which sheds light on Chinese medicine's CMC study. Here, the manufacturing process of Panax Notoginseng Saponins (PNS) is taken as a case study, and the present work aims to establish a Q-marker based research strategy for CMC of Chinese medicine. The Q-markers of PNS are selected and established by integrating the chemical profile with pharmacological activities. Then, the key processes of PNS manufacturing are identified by material flow analysis. Furthermore, modeling algorithms are employed to explore the relationship between Q-markers and critical process parameters (CPPs) of the key processes. Finally, the CPPs of the key processes are optimized in order to improve process efficiency. Among the 97 identified compounds, Notoginsenoside R1, ginsenoside Rg1, Re, Rb1 and Rd are selected as the Q-markers of PNS. Our analysis of PNS manufacturing shows that the extraction process and the column chromatography process are the key processes. With the CPPs of each process as the inputs and the Q-markers' contents as the outputs, two process prediction models are built separately for the extraction process and the column chromatography process of Panax notoginseng, which both possess good prediction ability. Based on the efficiency models we constructed for the extraction and column chromatography processes, the optimal CPPs of both processes are calculated. Our results show that the Q-markers derived from the CMC research strategy can be applied to analyze the manufacturing processes of Chinese medicine to assure product quality and improve the efficiency of the key processes simultaneously. Copyright © 2018 Elsevier GmbH. All rights reserved.
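
    The "CPPs in, Q-marker content out" modelling and optimisation step described above can be sketched as a small response-surface fit followed by a grid search; the parameters, model form, and data below are invented and are not the models built in the paper.

      # Sketch: fit a quadratic model from simulated extraction CPPs (ethanol
      # fraction, time) to a Q-marker content response, then grid-search the
      # CPP setting with the highest predicted content. All values are invented.
      import numpy as np

      rng = np.random.default_rng(1)
      ethanol = rng.uniform(0.5, 0.9, 60)
      time_h = rng.uniform(1.0, 3.0, 60)
      content = 10 - 40 * (ethanol - 0.7) ** 2 + 2 * time_h + rng.normal(0, 0.3, 60)

      # Quadratic response-surface model fitted by least squares.
      X = np.column_stack([np.ones(60), ethanol, ethanol ** 2, time_h])
      coef, *_ = np.linalg.lstsq(X, content, rcond=None)

      # Grid search for the CPPs predicted to maximise Q-marker content.
      e_grid, t_grid = np.meshgrid(np.linspace(0.5, 0.9, 41), np.linspace(1.0, 3.0, 21))
      pred = coef[0] + coef[1] * e_grid + coef[2] * e_grid ** 2 + coef[3] * t_grid
      i, j = np.unravel_index(np.argmax(pred), pred.shape)
      print(f"predicted optimum: ethanol = {e_grid[i, j]:.2f}, time = {t_grid[i, j]:.1f} h")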

  11. PyMS: a Python toolkit for processing of gas chromatography-mass spectrometry (GC-MS) data. Application and comparative study of selected tools

    PubMed Central

    2012-01-01

    Background: Gas chromatography–mass spectrometry (GC-MS) is a technique frequently used in targeted and non-targeted measurements of metabolites. Most existing software tools for processing of raw instrument GC-MS data tightly integrate data processing methods with graphical user interface facilitating interactive data processing. While interactive processing remains critically important in GC-MS applications, high-throughput studies increasingly dictate the need for command line tools, suitable for scripting of high-throughput, customized processing pipelines. Results: PyMS comprises a library of functions for processing of instrument GC-MS data developed in Python. PyMS currently provides a complete set of GC-MS processing functions, including reading of standard data formats (ANDI-MS/NetCDF and JCAMP-DX), noise smoothing, baseline correction, peak detection, peak deconvolution, peak integration, and peak alignment by dynamic programming. A novel common ion single quantitation algorithm allows automated, accurate quantitation of GC-MS electron impact (EI) fragmentation spectra when a large number of experiments are being analyzed. PyMS implements parallel processing for by-row and by-column data processing tasks based on Message Passing Interface (MPI), allowing processing to scale on multiple CPUs in distributed computing environments. A set of specifically designed experiments was performed in-house and used to comparatively evaluate the performance of PyMS and three widely used software packages for GC-MS data processing (AMDIS, AnalyzerPro, and XCMS). Conclusions: PyMS is a novel software package for the processing of raw GC-MS data, particularly suitable for scripting of customized processing pipelines and for data processing in batch mode. PyMS provides limited graphical capabilities and can be used both for routine data processing and interactive/exploratory data analysis. In real-life GC-MS data processing scenarios PyMS performs as well or better than leading software packages. We demonstrate data processing scenarios simple to implement in PyMS, yet difficult to achieve with many conventional GC-MS data processing software. Automated sample processing and quantitation with PyMS can provide substantial time savings compared to more traditional interactive software systems that tightly integrate data processing with the graphical user interface. PMID:22647087
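
    The processing stages named above (noise smoothing, baseline correction, peak detection) can be illustrated generically on a synthetic chromatogram; the sketch below uses plain NumPy and is not the PyMS API.

      # Generic illustration of smoothing, baseline correction, and peak detection
      # on a synthetic chromatogram. This is NOT the PyMS API; it only sketches
      # the kind of stages such a pipeline performs.
      import numpy as np

      rng = np.random.default_rng(2)
      t = np.linspace(0, 10, 1000)
      signal = (np.exp(-((t - 3) ** 2) / 0.02) + 0.6 * np.exp(-((t - 7) ** 2) / 0.02)
                + 0.05 * t + rng.normal(0, 0.01, t.size))  # two peaks + drift + noise

      # 1. Noise smoothing: simple moving average.
      smooth = np.convolve(signal, np.ones(9) / 9, mode="same")

      # 2. Baseline correction: subtract a rolling minimum as a crude baseline.
      half = 50
      baseline = np.array([smooth[max(0, i - half):i + half].min()
                           for i in range(smooth.size)])
      corrected = smooth - baseline

      # 3. Peak detection: local maxima above a fixed threshold.
      peaks = [i for i in range(1, corrected.size - 1)
               if corrected[i] > 0.2
               and corrected[i] >= corrected[i - 1]
               and corrected[i] > corrected[i + 1]]
      print("peak retention times:", [round(t[i], 2) for i in peaks])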

  12. The Research Process on Converter Steelmaking Process by Using Limestone

    NASA Astrophysics Data System (ADS)

    Tang, Biao; Li, Xing-yi; Cheng, Han-chi; Wang, Jing; Zhang, Yun-long

    2017-08-01

    Compared with the traditional converter steelmaking process, the steelmaking process with limestone partly replaces lime with limestone. Many researchers have studied the new steelmaking process. There is much related research on material balance calculations, the behaviour of limestone in the slag, limestone powder injection in the converter, and the application of limestone in iron and steel enterprises. The results show that the surplus heat of the converter can meet the needs of limestone calcination, and the new process can reduce energy loss across the whole steelmaking process, reduce carbon dioxide emissions, and improve the quality of the gas.

  13. Gas processing handbook

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1982-04-01

    Brief details are given of processes including: BGC-Lurgi slagging gasification, COGAS, Exxon catalytic coal gasification, FW-Stoic 2-stage, GI two stage, HYGAS, Koppers-Totzek, Lurgi pressure gasification, Saarberg-Otto, Shell, Texaco, U-Gas, W-D.IGI, Wellman-Galusha, Westinghouse, and Winkler coal gasification processes; the Rectisol process; the Catacarb and the Benfield processes for removing CO2, H2S and COS from gases produced by the partial oxidation of coal; the Selectamine DD, Selexol solvent, and Sulfinol gas cleaning processes; the sulphur-tolerant shift (SSK) process; and the Super-meth process for the production of high-Btu gas from synthesis gas.

  14. Working on the Boundaries: Philosophies and Practices of the Design Process

    NASA Technical Reports Server (NTRS)

    Ryan, R.; Blair, J.; Townsend, J.; Verderaime, V.

    1996-01-01

    While the systems engineering process is a formal program management technique and is contractually binding, the design process is the informal practice of achieving the design project requirements throughout all design phases of the systems engineering process. The design process and organization are system and component dependent. Informal reviews include technical information meetings and concurrent engineering sessions, and formal technical discipline reviews are conducted through the systems engineering process. This paper discusses and references major philosophical principles in the design process, identifies its role in interacting systems and discipline analyses and integrations, and illustrates the application of the process in experienced aerostructural designs.

  15. Chemical processing of lunar materials

    NASA Technical Reports Server (NTRS)

    Criswell, D. R.; Waldron, R. D.

    1979-01-01

    The paper highlights recent work on the general problem of processing lunar materials. The discussion covers lunar source materials, refined products, motivations for using lunar materials, and general considerations for a lunar or space processing plant. Attention is given to chemical processing through various techniques, including electrolysis of molten silicates, carbothermic/silicothermic reduction, carbo-chlorination process, NaOH basic-leach process, and HF acid-leach process. Several options for chemical processing of lunar materials are well within the state of the art of applied chemistry and chemical engineering to begin development based on the extensive knowledge of lunar materials.

  16. Coordination and organization of security software process for power information application environment

    NASA Astrophysics Data System (ADS)

    Wang, Qiang

    2017-09-01

    As an important part of software engineering, the software process decides the success or failure of a software product. The design and development features of the security software process are discussed, as are the necessity and present significance of using such a process. In coordination with the functional software, the process for security software and its testing are discussed in depth. The process includes requirements analysis, design, coding, debugging and testing, submission, and maintenance. For each phase, the paper proposes subprocesses to support software security. As an example, the paper applies the above process to the power information platform.

  17. Sensor-based atomic layer deposition for rapid process learning and enhanced manufacturability

    NASA Astrophysics Data System (ADS)

    Lei, Wei

    In the search for a sensor-based atomic layer deposition (ALD) process to accelerate process learning and enhance manufacturability, we have explored new reactor designs and applied in-situ process sensing to W and HfO2 ALD processes. A novel wafer scale ALD reactor, which features fast gas switching, good process sensing compatibility and significant similarity to the real manufacturing environment, is constructed. The reactor has a unique movable reactor cap design that allows two possible operation modes: (1) steady-state flow with alternating gas species; or (2) fill-and-pump-out cycling of each gas, accelerating the pump-out by lifting the cap to employ the large chamber volume as ballast. Downstream quadrupole mass spectrometry (QMS) sampling is applied for in-situ process sensing of the tungsten ALD process. The QMS reveals essential surface reaction dynamics through real-time signals associated with byproduct generation as well as precursor introduction and depletion for each ALD half cycle, which are then used for process learning and optimization. More subtle interactions such as imperfect surface saturation and reactant dose interaction are also directly observed by QMS, indicating that the ALD process is more complicated than the suggested layer-by-layer growth. By integrating the byproduct QMS signals in real time over each exposure and plotting them against process cycle number, the deposition kinetics on the wafer are directly measured. For continuous ALD runs, the total integrated byproduct QMS signal in each ALD run also scales linearly with ALD film thickness, and therefore can be used for ALD film thickness metrology. The in-situ process sensing is also applied to an HfO2 ALD process carried out in a furnace-type ALD reactor. Precursor dose end-point control is applied to precisely control the precursor dose in each half cycle. Multiple process sensors, including a quartz crystal microbalance (QCM) and QMS, are used to provide real-time process information. The sensing results confirm the proposed surface reaction path and once again reveal the complexity of ALD processes. The impact of this work includes: (1) It explores new ALD reactor designs which enable the implementation of in-situ process sensors for rapid process learning and enhanced manufacturability; (2) It demonstrates for the first time that in-situ QMS can reveal detailed process dynamics and film growth kinetics in a wafer-scale ALD process, and thus can be used for ALD film thickness metrology. (3) Based on results from two different processes carried out in two different reactors, it is clear that ALD is a more complicated process than normally believed or advertised, but real-time observation of the operational chemistries in ALD by in-situ sensors provides critical insight into the process and the basis for more effective process control for ALD applications.
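
    The thickness-metrology idea described above (the integrated byproduct QMS signal scales linearly with film thickness) can be illustrated with a short calibration sketch; the cycle count, growth per cycle, and signal values below are invented.

      # Sketch of ALD thickness metrology from a per-cycle integrated byproduct
      # signal: the cumulative signal is related to thickness by a linear
      # calibration. All numbers are invented for illustration.
      import numpy as np

      rng = np.random.default_rng(3)
      n_cycles = 100
      growth_per_cycle = 0.05                      # nm per cycle (assumed constant)

      per_cycle_signal = 1.0 + rng.normal(0, 0.05, n_cycles)     # integrated QMS peak
      cumulative_signal = np.cumsum(per_cycle_signal)
      thickness = growth_per_cycle * np.arange(1, n_cycles + 1)  # reference thickness

      # Linear calibration through the origin: thickness ~ k * cumulative signal.
      k = np.sum(cumulative_signal * thickness) / np.sum(cumulative_signal ** 2)
      estimated = k * cumulative_signal
      print(f"calibration slope k = {k:.4f} nm per signal unit")
      print(f"max |error| over the run = {np.abs(estimated - thickness).max():.3f} nm")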

  18. Implicit Processes, Self-Regulation, and Interventions for Behavior Change.

    PubMed

    St Quinton, Tom; Brunton, Julie A

    2017-01-01

    The ability to regulate and subsequently change behavior is influenced by both reflective and implicit processes. Traditional theories have focused on conscious processes by highlighting the beliefs and intentions that influence decision making. However, their success in changing behavior has been modest with a gap between intention and behavior apparent. Dual-process models have been recently applied to health psychology; with numerous models incorporating implicit processes that influence behavior as well as the more common conscious processes. Such implicit processes are theorized to govern behavior non-consciously. The article provides a commentary on motivational and volitional processes and how interventions have combined to attempt an increase in positive health behaviors. Following this, non-conscious processes are discussed in terms of their theoretical underpinning. The article will then highlight how these processes have been measured and will then discuss the different ways that the non-conscious and conscious may interact. The development of interventions manipulating both processes may well prove crucial in successfully altering behavior.

  19. All varieties of encoding variability are not created equal: Separating variable processing from variable tasks

    PubMed Central

    Huff, Mark J.; Bodner, Glen E.

    2014-01-01

    Whether encoding variability facilitates memory is shown to depend on whether item-specific and relational processing are both performed across study blocks, and whether study items are weakly versus strongly related. Variable-processing groups studied a word list once using an item-specific task and once using a relational task. Variable-task groups’ two different study tasks recruited the same type of processing each block. Repeated-task groups performed the same study task each block. Recall and recognition were greatest in the variable-processing group, but only with weakly related lists. A variable-processing benefit was also found when task-based processing and list-type processing were complementary (e.g., item-specific processing of a related list) rather than redundant (e.g., relational processing of a related list). That performing both item-specific and relational processing across trials, or within a trial, yields encoding-variability benefits may help reconcile decades of contradictory findings in this area. PMID:25018583

  20. Continuous welding of unidirectional fiber reinforced thermoplastic tape material

    NASA Astrophysics Data System (ADS)

    Schledjewski, Ralf

    2017-10-01

    Continuous welding techniques such as thermoplastic tape placement with in situ consolidation offer several advantages over traditional manufacturing processes such as autoclave consolidation, thermoforming, etc. However, several important processing issues still need to be solved before it becomes an economically viable process. Intensive process analysis and optimization have been carried out in the past through experimental investigation, model definition, and simulation development. Today, process simulation is capable of predicting the resulting consolidation quality. The effects of material imperfections or process parameter variations are well known. However, using this knowledge to control the process based on online process monitoring and a corresponding adaptation of the process parameters is still challenging. Solving inverse problems and using methods for automated code generation that allow fast implementation of algorithms on target hardware are required. The paper explains the placement technique in general. Process-material-property relationships and typical material imperfections are described. Furthermore, online monitoring techniques and how to use them for a model-based process control system are presented.

  1. Economics of polysilicon process: A view from Japan

    NASA Technical Reports Server (NTRS)

    Shimizu, Y.

    1986-01-01

    The production process of solar-grade silicon (SOG-Si) through trichlorosilane (TCS) was researched in a program sponsored by the New Energy Development Organization (NEDO). The NEDO process consists of the following two steps: TCS production from by-product silicon tetrachloride (STC) and SOG-Si formation from TCS using a fluidized bed reactor. Based on the data obtained during the research program, the manufacturing costs of the NEDO process and other polysilicon manufacturing processes were compared. The manufacturing cost was calculated on the basis of 1000 tons/year production. The cost estimate showed that the cost of producing silicon by all of the new processes is less than the cost of the conventional Siemens process. Using a new process, the cost of producing semiconductor-grade silicon was found to be virtually the same for any of the TCS, dichlorosilane, and monosilane processes when by-products were recycled. The SOG-Si manufacturing process using the fluidized bed reactor, which needs further development, shows a greater probability of cost reduction than the filament processes.

  2. Autonomous Agents for Dynamic Process Planning in the Flexible Manufacturing System

    NASA Astrophysics Data System (ADS)

    Nik Nejad, Hossein Tehrani; Sugimura, Nobuhiro; Iwamura, Koji; Tanimizu, Yoshitaka

    Rapid changes in market demands and pressures of competition require manufacturers to maintain highly flexible manufacturing systems to cope with a complex manufacturing environment. This paper deals with the development of an agent-based architecture of dynamic systems for incremental process planning in manufacturing systems. In consideration of alternative manufacturing processes and machine tools, the process plans and the schedules of the manufacturing resources are generated incrementally and dynamically. A negotiation protocol is discussed in this paper to generate suitable process plans for the target products in real time and dynamically, based on the alternative manufacturing processes. The alternative manufacturing processes are represented by the process plan networks discussed in the previous paper, and suitable process plans are searched and generated to cope with both dynamic changes of the product specifications and disturbances of the manufacturing resources. We combine the heuristic search algorithms for the process plan networks with the negotiation protocols in order to generate suitable process plans in the dynamic manufacturing environment.
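
    The heuristic search over process plan networks mentioned above can be illustrated with a tiny shortest-path example; the network, process names, and costs below are invented, and the negotiation protocol among resource agents is not modelled.

      # Search a small process plan network: nodes are part states, edges are
      # alternative manufacturing processes with costs, and the cheapest path
      # from raw material to finished part is one candidate process plan.
      import heapq

      # state -> list of (process name, next state, cost); invented example data.
      network = {
          "raw":      [("rough_mill", "milled", 4.0), ("cast", "near_net", 6.0)],
          "milled":   [("drill", "drilled", 2.0)],
          "near_net": [("finish_mill", "drilled", 1.5)],
          "drilled":  [("grind", "finished", 3.0), ("polish", "finished", 2.5)],
          "finished": [],
      }

      def cheapest_plan(start, goal):
          """Dijkstra over the plan network; returns (cost, list of processes)."""
          heap = [(0.0, start, [])]
          settled = {}
          while heap:
              cost, state, plan = heapq.heappop(heap)
              if state == goal:
                  return cost, plan
              if settled.get(state, float("inf")) <= cost:
                  continue
              settled[state] = cost
              for process, nxt, c in network[state]:
                  heapq.heappush(heap, (cost + c, nxt, plan + [process]))
          return float("inf"), []

      print(cheapest_plan("raw", "finished"))
      # -> (8.5, ['rough_mill', 'drill', 'polish'])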

  3. Implicit Processes, Self-Regulation, and Interventions for Behavior Change

    PubMed Central

    St Quinton, Tom; Brunton, Julie A.

    2017-01-01

    The ability to regulate and subsequently change behavior is influenced by both reflective and implicit processes. Traditional theories have focused on conscious processes by highlighting the beliefs and intentions that influence decision making. However, their success in changing behavior has been modest, with a gap between intention and behavior apparent. Dual-process models have recently been applied to health psychology, with numerous models incorporating implicit processes that influence behavior as well as the more common conscious processes. Such implicit processes are theorized to govern behavior non-consciously. The article provides a commentary on motivational and volitional processes and how interventions have combined them in attempts to increase positive health behaviors. Following this, non-conscious processes are discussed in terms of their theoretical underpinning. The article then highlights how these processes have been measured and discusses the different ways that non-conscious and conscious processes may interact. The development of interventions manipulating both processes may well prove crucial in successfully altering behavior. PMID:28337164

  4. Models of recognition: a review of arguments in favor of a dual-process account.

    PubMed

    Diana, Rachel A; Reder, Lynne M; Arndt, Jason; Park, Heekyeong

    2006-02-01

    The majority of computationally specified models of recognition memory have been based on a single-process interpretation, claiming that familiarity is the only influence on recognition. There is increasing evidence that recognition is, in fact, based on two processes: recollection and familiarity. This article reviews the current state of the evidence for dual-process models, including the usefulness of the remember/know paradigm, and interprets the relevant results in terms of the source of activation confusion (SAC) model of memory. We argue that the evidence from each of the areas we discuss, when combined, presents a strong case that inclusion of a recollection process is necessary. Given this conclusion, we also argue that the dual-process claim that the recollection process is always available is, in fact, more parsimonious than the single-process claim that the recollection process is used only in certain paradigms. The value of a well-specified process model such as the SAC model is discussed with regard to other types of dual-process models.

  5. Integrating Thermal Tools Into the Mechanical Design Process

    NASA Technical Reports Server (NTRS)

    Tsuyuki, Glenn T.; Siebes, Georg; Novak, Keith S.; Kinsella, Gary M.

    1999-01-01

    The intent of mechanical design is to deliver a hardware product that meets or exceeds customer expectations, while reducing cycle time and cost. To this end, an integrated mechanical design process enables the idea of parallel development (concurrent engineering). This represents a shift from the traditional mechanical design process. With such a concurrent process, there are significant issues that have to be identified and addressed before re-engineering the mechanical design process to facilitate concurrent engineering. These issues also assist in the integration and re-engineering of the thermal design sub-process since it resides within the entire mechanical design process. With these issues in mind, a thermal design sub-process can be re-defined in a manner that has a higher probability of acceptance, thus enabling an integrated mechanical design process. However, the actual implementation is not always problem-free. Experience in applying the thermal design sub-process to actual situations provides the evidence for improvement, but more importantly, for judging the viability and feasibility of the sub-process.

  6. [Monitoring method of extraction process for Schisandrae Chinensis Fructus based on near infrared spectroscopy and multivariate statistical process control].

    PubMed

    Xu, Min; Zhang, Lei; Yue, Hong-Shui; Pang, Hong-Wei; Ye, Zheng-Liang; Ding, Li

    2017-10-01

    To establish an on-line monitoring method for the extraction process of Schisandrae Chinensis Fructus, the formula medicinal material of Yiqi Fumai lyophilized injection, by combining near infrared spectroscopy with multivariate data analysis technology. The multivariate statistical process control (MSPC) model was established based on 5 normal batches in production, and 2 test batches were monitored by PC scores, DModX and Hotelling T2 control charts. The results showed that the MSPC model had a good monitoring ability for the extraction process. The application of the MSPC model to the actual production process could effectively achieve on-line monitoring of the extraction process of Schisandrae Chinensis Fructus and reflect changes in material properties during production in real time. This established process monitoring method could provide a reference for the application of process analysis technology in the process quality control of traditional Chinese medicine injections. Copyright© by the Chinese Pharmaceutical Association.
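    A minimal sketch of the MSPC idea described above is given below, assuming a generic principal component analysis model of in-control spectra: new observations are flagged when their Hotelling T² on the scores or their squared reconstruction residual (a DModX-like statistic) exceeds an empirical limit. The data shapes, component count and 99th-percentile limits are illustrative assumptions, not the published method.

```python
# Hedged sketch of the MSPC idea: fit a PCA model on spectra from normal
# batches, then flag new observations whose Hotelling T^2 or residual
# statistic exceeds an empirical limit. Shapes and limits are illustrative.
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)
normal = rng.normal(size=(200, 50))          # 200 in-control NIR spectra
pca = PCA(n_components=3).fit(normal)

def t2_and_residual(x: np.ndarray):
    """Hotelling T^2 on PCA scores and sum-of-squares residual (SPE)."""
    scores = pca.transform(x)
    t2 = np.sum(scores**2 / pca.explained_variance_, axis=1)
    recon = pca.inverse_transform(scores)
    spe = np.sum((x - recon) ** 2, axis=1)
    return t2, spe

# Simple empirical limits taken from the normal batches (99th percentile).
t2_ref, spe_ref = t2_and_residual(normal)
t2_lim, spe_lim = np.percentile(t2_ref, 99), np.percentile(spe_ref, 99)

new_batch = rng.normal(size=(10, 50)) + 0.5   # drifted test spectra
t2, spe = t2_and_residual(new_batch)
print((t2 > t2_lim) | (spe > spe_lim))        # True = out-of-control point
```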

  7. Method for automatically evaluating a transition from a batch manufacturing technique to a lean manufacturing technique

    DOEpatents

    Ivezic, Nenad; Potok, Thomas E.

    2003-09-30

    A method for automatically evaluating a manufacturing technique comprises the steps of: receiving from a user manufacturing process step parameters characterizing a manufacturing process; accepting from the user a selection for an analysis of a particular lean manufacturing technique; automatically compiling process step data for each process step in the manufacturing process; automatically calculating process metrics from a summation of the compiled process step data for each process step; and, presenting the automatically calculated process metrics to the user. A method for evaluating a transition from a batch manufacturing technique to a lean manufacturing technique can comprise the steps of: collecting manufacturing process step characterization parameters; selecting a lean manufacturing technique for analysis; communicating the selected lean manufacturing technique and the manufacturing process step characterization parameters to an automatic manufacturing technique evaluation engine having a mathematical model for generating manufacturing technique evaluation data; and, using the lean manufacturing technique evaluation data to determine whether to transition from an existing manufacturing technique to the selected lean manufacturing technique.
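    The compilation of per-step data into overall process metrics can be illustrated with the hedged sketch below. The step parameters (cycle time, queue time, value-added flag) and the derived metrics (total lead time, value-added ratio) are generic lean-analysis quantities chosen for illustration; they are not taken from the patent.

```python
# Hedged sketch: summing per-step data into simple process metrics, in the
# spirit of the automated evaluation described above. The specific metrics
# and step parameters are illustrative, not the patented model.
from dataclasses import dataclass

@dataclass
class ProcessStep:
    name: str
    cycle_time_min: float     # time to process one unit
    queue_time_min: float     # average waiting time before the step
    value_added: bool         # does the step transform the product?

steps = [
    ProcessStep("stamp", 2.0, 30.0, True),
    ProcessStep("inspect", 1.0, 15.0, False),
    ProcessStep("assemble", 5.0, 45.0, True),
]

total_lead_time = sum(s.cycle_time_min + s.queue_time_min for s in steps)
value_added_time = sum(s.cycle_time_min for s in steps if s.value_added)
print(f"lead time: {total_lead_time} min")
print(f"value-added ratio: {value_added_time / total_lead_time:.1%}")
```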

  8. Process yield improvements with process control terminal for varian serial ion implanters

    NASA Astrophysics Data System (ADS)

    Higashi, Harry; Soni, Ameeta; Martinez, Larry; Week, Ken

    Implant processes in a modern wafer production fab are extremely complex. There can be several types of misprocessing, e.g. wrong dose or species, double implants and missed implants. Process Control Terminals (PCTs) for Varian 350Ds installed at Intel fabs were found to substantially reduce the number of misprocessing steps. This paper describes those misprocessing steps and their subsequent reduction with the use of PCTs. Reliable and simple process control with serial process ion implanters has been in increasing demand. A well designed process control terminal greatly increases device yield by monitoring all pertinent implanter functions and enabling process engineering personnel to set up process recipes for simple and accurate system operation. By programming user-selectable interlocks, implant errors are reduced and those that occur are logged for further analysis and prevention. A process control terminal should also be compatible with office personal computers for greater flexibility in system use and data analysis. The impact of this capability is increased productivity and hence higher device yield.

  9. An Aspect-Oriented Framework for Business Process Improvement

    NASA Astrophysics Data System (ADS)

    Pourshahid, Alireza; Mussbacher, Gunter; Amyot, Daniel; Weiss, Michael

    Recently, many organizations invested in Business Process Management Systems (BPMSs) in order to automate and monitor their processes. Business Activity Monitoring is one of the essential modules of a BPMS as it provides the core monitoring capabilities. Although the natural step after process monitoring is process improvement, most of the existing systems do not provide the means to help users with the improvement step. In this paper, we address this issue by proposing an aspect-oriented framework that allows the impact of changes to business processes to be explored with what-if scenarios based on the most appropriate process redesign patterns among several possibilities. As the four cornerstones of a BPMS are process, goal, performance and validation views, these views need to be aligned automatically by any approach that intends to support automated improvement of business processes. Our framework therefore provides means to reflect process changes also in the other views of the business process. A health care case study presented as a proof of concept suggests that this novel approach is feasible.

  10. Structure and Randomness of Continuous-Time, Discrete-Event Processes

    NASA Astrophysics Data System (ADS)

    Marzen, Sarah E.; Crutchfield, James P.

    2017-10-01

    Loosely speaking, the Shannon entropy rate is used to gauge a stochastic process' intrinsic randomness; the statistical complexity gives the cost of predicting the process. We calculate, for the first time, the entropy rate and statistical complexity of stochastic processes generated by finite unifilar hidden semi-Markov models—memoryful, state-dependent versions of renewal processes. Calculating these quantities requires introducing novel mathematical objects (ε-machines of hidden semi-Markov processes) and new information-theoretic methods to stochastic processes.
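    As a simplified, hedged illustration of the quantities involved, the sketch below computes the entropy rate of a discrete-time unifilar hidden Markov model, for which the entropy rate reduces to the conditional entropy of the next symbol given the current (causal) state. The two-state machine is invented; the continuous-time semi-Markov case treated in the paper requires the additional machinery described there.

```python
# Hedged sketch: entropy rate of a *discrete-time* unifilar hidden Markov
# model, as a simplified stand-in for the continuous-time quantities
# discussed above. The two-state transition structure is made up.
import numpy as np

# emit[s, x] = probability of emitting symbol x from state s;
# next_state[(s, x)] = deterministic successor state (unifilarity).
emit = np.array([[0.7, 0.3],
                 [0.2, 0.8]])
next_state = {(0, 0): 0, (0, 1): 1, (1, 0): 0, (1, 1): 1}

# State-to-state transition matrix implied by emissions plus unifilarity.
T = np.zeros((2, 2))
for (s, x), s2 in next_state.items():
    T[s, s2] += emit[s, x]

# Stationary distribution pi solves pi T = pi (left eigenvector for 1).
w, v = np.linalg.eig(T.T)
pi = np.real(v[:, np.argmax(np.real(w))])
pi /= pi.sum()

# For a unifilar model the entropy rate is H[X_t | causal state].
h = -sum(pi[s] * emit[s, x] * np.log2(emit[s, x])
         for s in range(2) for x in range(2))
print(f"entropy rate = {h:.3f} bits/symbol")
```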

  11. Combined mesophilic anaerobic and thermophilic aerobic digestion process for high-strength food wastewater to increase removal efficiency and reduce sludge discharge.

    PubMed

    Jang, H M; Park, S K; Ha, J H; Park, J M

    2014-01-01

    In this study, a process that combines the mesophilic anaerobic digestion (MAD) process with thermophilic aerobic digestion (TAD) for high-strength food wastewater (FWW) treatment was developed to examine the removal of organic matter and methane production. All effluent discharged from the MAD process was separated into solid and liquid portions. The liquid part was discarded and the sludge part was passed to the TAD process for further degradation. Then, the digested sludge from the TAD process was recycled back to the MAD unit to achieve low sludge discharge from the combined process. The reactor combination was operated in two phases: during Phase I, 40 d of total hydraulic retention time (HRT) was applied; during Phase II, 20 d was applied. HRT of the TAD process was fixed at 5 d. For a comparison, a control process (single-stage MAD) was operated with the same HRTs of the combined process. Our results indicated that the combined process showed over 90% total solids, volatile solids and chemical oxygen demand removal efficiencies. In addition, the combined process showed a significantly higher methane production rate than that of the control process. Consequently, the experimental data demonstrated that the combined MAD-TAD process was successfully employed for high-strength FWW treatment with highly efficient organic matter reduction and methane production.

  12. Leading processes of patient care and treatment in hierarchical healthcare organizations in Sweden--process managers' experiences.

    PubMed

    Nilsson, Kerstin; Sandoff, Mette

    2015-01-01

    The purpose of this study is to gain better understanding of the roles and functions of process managers by describing Swedish process managers' experiences of leading processes involving patient care and treatment when working in a hierarchical health-care organization. This study is based on an explorative design. The data were gathered from interviews with 12 process managers at three Swedish hospitals. These data underwent qualitative and interpretative analysis with a modified editing style. The process managers' experiences of leading processes in a hierarchical health-care organization are described under three themes: having or not having a mandate, exposure to conflict situations and leading process development. The results indicate a need for clarity regarding process manager's responsibility and work content, which need to be communicated to all managers and staff involved in the patient care and treatment process, irrespective of department. There also needs to be an emphasis on realistic expectations and orientation of the goals that are an intrinsic part of the task of being a process manager. Generalizations from the results of the qualitative interview studies are limited, but a deeper understanding of the phenomenon was reached, which, in turn, can be transferred to similar settings. This study contributes qualitative descriptions of leading care and treatment processes in a functional, hierarchical health-care organization from process managers' experiences, a subject that has not been investigated earlier.

  13. Information Technology Process Improvement Decision-Making: An Exploratory Study from the Perspective of Process Owners and Process Managers

    ERIC Educational Resources Information Center

    Lamp, Sandra A.

    2012-01-01

    There is information available in the literature that discusses information technology (IT) governance and investment decision making from an executive-level perception, yet there is little information available that offers the perspective of process owners and process managers pertaining to their role in IT process improvement and investment…

  14. 43 CFR 2884.17 - How will BLM process my Processing Category 6 application?

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 43 Public Lands: Interior 2 2012-10-01 2012-10-01 false How will BLM process my Processing...-WAY UNDER THE MINERAL LEASING ACT Applying for MLA Grants or TUPs § 2884.17 How will BLM process my... written agreement that describes how BLM will process your application. The final agreement consists of a...

  15. 43 CFR 2884.17 - How will BLM process my Processing Category 6 application?

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 43 Public Lands: Interior 2 2013-10-01 2013-10-01 false How will BLM process my Processing...-WAY UNDER THE MINERAL LEASING ACT Applying for MLA Grants or TUPs § 2884.17 How will BLM process my... written agreement that describes how BLM will process your application. The final agreement consists of a...

  16. 15 CFR 15.3 - Acceptance of service of process.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 15 Commerce and Foreign Trade 1 2010-01-01 2010-01-01 false Acceptance of service of process. 15.3... Process § 15.3 Acceptance of service of process. (a) Except as otherwise provided in this subpart, any... employee by law is to be served personally with process. Service of process in this case is inadequate when...

  17. Weaknesses in Applying a Process Approach in Industry Enterprises

    NASA Astrophysics Data System (ADS)

    Kučerová, Marta; Mĺkva, Miroslava; Fidlerová, Helena

    2012-12-01

    The paper deals with the process approach as one of the main principles of quality management. Quality management systems based on the process approach currently represent one of the proven ways to manage an organization. The volume of sales, costs and profit levels are influenced by the quality of processes and efficient process flow. As the results of the research project showed, there are some weaknesses in applying the process approach in industrial practice, and in many organizations in Slovakia it has often been only a formal change from functional management to process management. For efficient process management it is essential that companies pay attention to the way they organize their processes and seek their continuous improvement.

  18. Is Primary-Process Cognition a Feature of Hypnosis?

    PubMed

    Finn, Michael T; Goldman, Jared I; Lyon, Gyrid B; Nash, Michael R

    2017-01-01

    The division of cognition into primary and secondary processes is an important part of contemporary psychoanalytic metapsychology. Whereas primary processes are most characteristic of unconscious thought and loose associations, secondary processes generally govern conscious thought and logical reasoning. It has been theorized that an induction into hypnosis is accompanied by a predomination of primary-process cognition over secondary-process cognition. The authors hypothesized that highly hypnotizable individuals would demonstrate more primary-process cognition as measured by a recently developed cognitive-perceptual task. This hypothesis was not supported. In fact, low hypnotizable participants demonstrated higher levels of primary-process cognition. Exploratory analyses suggested a more specific effect: felt connectedness to the hypnotist seemed to promote secondary-process cognition among low hypnotizable participants.

  19. [Dual process in large number estimation under uncertainty].

    PubMed

    Matsumuro, Miki; Miwa, Kazuhisa; Terai, Hitoshi; Yamada, Kento

    2016-08-01

    According to dual process theory, there are two systems in the mind: an intuitive and automatic System 1 and a logical and effortful System 2. While many previous studies about number estimation have focused on simple heuristics and automatic processes, the deliberative System 2 process has not been sufficiently studied. This study focused on the System 2 process for large number estimation. First, we described an estimation process based on participants’ verbal reports. The task, corresponding to the problem-solving process, consisted of creating subgoals, retrieving values, and applying operations. Second, we investigated the influence of such deliberative process by System 2 on intuitive estimation by System 1, using anchoring effects. The results of the experiment showed that the System 2 process could mitigate anchoring effects.

  20. Object-processing neural efficiency differentiates object from spatial visualizers.

    PubMed

    Motes, Michael A; Malach, Rafael; Kozhevnikov, Maria

    2008-11-19

    The visual system processes object properties and spatial properties in distinct subsystems, and we hypothesized that this distinction might extend to individual differences in visual processing. We conducted a functional MRI study investigating the neural underpinnings of individual differences in object versus spatial visual processing. Nine participants of high object-processing ability ('object' visualizers) and eight participants of high spatial-processing ability ('spatial' visualizers) were scanned, while they performed an object-processing task. Object visualizers showed lower bilateral neural activity in lateral occipital complex and lower right-lateralized neural activity in dorsolateral prefrontal cortex. The data indicate that high object-processing ability is associated with more efficient use of visual-object resources, resulting in less neural activity in the object-processing pathway.

  1. Process simulation for advanced composites production

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Allendorf, M.D.; Ferko, S.M.; Griffiths, S.

    1997-04-01

    The objective of this project is to improve the efficiency and lower the cost of chemical vapor deposition (CVD) processes used to manufacture advanced ceramics by providing the physical and chemical understanding necessary to optimize and control these processes. Project deliverables include: numerical process models; databases of thermodynamic and kinetic information related to the deposition process; and process sensors and software algorithms that can be used for process control. Target manufacturing techniques include CVD fiber coating technologies (used to deposit interfacial coatings on continuous fiber ceramic preforms), chemical vapor infiltration, thin-film deposition processes used in the glass industry, and coating techniques used to deposit wear-, abrasion-, and corrosion-resistant coatings for use in the pulp and paper, metals processing, and aluminum industries.

  2. CDO budgeting

    NASA Astrophysics Data System (ADS)

    Nesladek, Pavel; Wiswesser, Andreas; Sass, Björn; Mauermann, Sebastian

    2008-04-01

    The critical dimension off-target (CDO) is a key parameter for mask house customers, directly affecting the performance of the mask. The CDO is the difference between the feature size target and the measured feature size. The change of CD during the process is compensated either within the process or by data correction; these compensation methods are commonly called process bias and data bias, respectively. The difference between data bias and process bias in manufacturing results in a systematic CDO error; however, this systematic error does not take into account the instability of the process bias. This instability results from minor variations - instabilities of manufacturing processes and changes in materials and/or logistics. Using several masks, the CDO of the manufacturing line can be estimated. For systematic investigation of the unit process contributions to CDO and analysis of the factors influencing the CDO contributors, a solid understanding of each unit process and a huge number of masks are necessary. Rough identification of contributing processes and splitting of the final CDO variation between processes can be done with approximately 50 masks with identical design, material and process. Such an amount of data allows us to identify the main contributors and estimate their effect by means of analysis of variance (ANOVA) combined with multivariate analysis. The analysis does not provide information about the root cause of the variation within a particular unit process, but it provides a good estimate of the impact of the process on the stability of the manufacturing line. Additionally, this analysis can be used to identify possible interactions between processes, which cannot be investigated if only single processes are considered. The goal of this work is to evaluate the limits of CDO budgeting models given by the precision and the number of measurements, as well as to partition the variation within the manufacturing process. The CDO variation splits, according to the suggested model, into contributions from particular processes or process groups. Last but not least, the power of this method to determine the absolute strength of each parameter will be demonstrated. Identification of the root cause of the variation within the unit process itself is not in the scope of this work.
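    A hedged sketch of the variance-budgeting step is shown below: if the per-process CDO contributions measured on a set of identical masks are treated as independent and additive, their variances add, and each process's share of the total CDO variance can be reported directly. The process names, sample size and simulated contributions are illustrative assumptions, not fab data.

```python
# Hedged sketch: additive variance budgeting across unit processes, in the
# spirit of the ANOVA-based partitioning described above. The process names
# and simulated contributions are illustrative only.
import numpy as np

rng = np.random.default_rng(1)
n_masks = 50
# Hypothetical per-process CDO contributions (nm) for 50 identical masks.
contrib = {
    "write":   rng.normal(0.0, 1.5, n_masks),
    "develop": rng.normal(0.0, 1.0, n_masks),
    "etch":    rng.normal(0.0, 2.0, n_masks),
}
total_cdo = sum(contrib.values())

# If the contributions are independent, their variances add; report shares.
total_var = total_cdo.var(ddof=1)
for name, c in contrib.items():
    print(f"{name}: {c.var(ddof=1) / total_var:.1%} of CDO variance")
```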

  3. Application of high-throughput mini-bioreactor system for systematic scale-down modeling, process characterization, and control strategy development.

    PubMed

    Janakiraman, Vijay; Kwiatkowski, Chris; Kshirsagar, Rashmi; Ryll, Thomas; Huang, Yao-Ming

    2015-01-01

    High-throughput systems and processes have typically been targeted for process development and optimization in the bioprocessing industry. For process characterization, bench scale bioreactors have been the system of choice. Due to the need for performing different process conditions for multiple process parameters, the process characterization studies typically span several months and are considered time and resource intensive. In this study, we have shown the application of a high-throughput mini-bioreactor system viz. the Advanced Microscale Bioreactor (ambr15(TM) ), to perform process characterization in less than a month and develop an input control strategy. As a pre-requisite to process characterization, a scale-down model was first developed in the ambr system (15 mL) using statistical multivariate analysis techniques that showed comparability with both manufacturing scale (15,000 L) and bench scale (5 L). Volumetric sparge rates were matched between ambr and manufacturing scale, and the ambr process matched the pCO2 profiles as well as several other process and product quality parameters. The scale-down model was used to perform the process characterization DoE study and product quality results were generated. Upon comparison with DoE data from the bench scale bioreactors, similar effects of process parameters on process yield and product quality were identified between the two systems. We used the ambr data for setting action limits for the critical controlled parameters (CCPs), which were comparable to those from bench scale bioreactor data. In other words, the current work shows that the ambr15(TM) system is capable of replacing the bench scale bioreactor system for routine process development and process characterization. © 2015 American Institute of Chemical Engineers.
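    The volumetric sparge-rate matching mentioned above can be illustrated with the short sketch below, which simply holds the gas flow per liquid volume (vvm) constant across scales. The vvm setpoint and working volumes are assumptions for illustration.

```python
# Hedged sketch: matching gas sparge rate on a volumetric basis (vvm =
# gas volume per liquid volume per minute) across scales, as described
# above. The vvm setpoint and working volumes are illustrative.
def sparge_rate(working_volume_l: float, vvm: float) -> float:
    """Gas flow (L/min) needed to hold a given vvm at a given volume."""
    return vvm * working_volume_l

vvm = 0.02  # hypothetical setpoint
for scale_name, volume_l in [("ambr15", 0.015), ("bench 5 L", 5.0),
                             ("manufacturing 15,000 L", 15000.0)]:
    print(f"{scale_name}: {sparge_rate(volume_l, vvm):.4f} L/min")
```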

  4. Consumers' conceptualization of ultra-processed foods.

    PubMed

    Ares, Gastón; Vidal, Leticia; Allegue, Gimena; Giménez, Ana; Bandeira, Elisa; Moratorio, Ximena; Molina, Verónika; Curutchet, María Rosa

    2016-10-01

    Consumption of ultra-processed foods has been associated with low diet quality, obesity and other non-communicable diseases. This situation makes it necessary to develop educational campaigns to discourage consumers from substituting meals based on unprocessed or minimally processed foods by ultra-processed foods. In this context, the aim of the present work was to investigate how consumers conceptualize the term ultra-processed foods and to evaluate if the foods they perceive as ultra-processed are in concordance with the products included in the NOVA classification system. An online study was carried out with 2381 participants. They were asked to explain what they understood by ultra-processed foods and to list foods that can be considered ultra-processed. Responses were analysed using inductive coding. The great majority of the participants was able to provide an explanation of what ultra-processed foods are, which was similar to the definition described in the literature. Most of the participants described ultra-processed foods as highly processed products that usually contain additives and other artificial ingredients, stressing that they have low nutritional quality and are unhealthful. The most relevant products for consumers' conceptualization of the term were in agreement with the NOVA classification system and included processed meats, soft drinks, snacks, burgers, powdered and packaged soups and noodles. However, some of the participants perceived processed foods, culinary ingredients and even some minimally processed foods as ultra-processed. This suggests that in order to accurately convey their message, educational campaigns aimed at discouraging consumers from consuming ultra-processed foods should include a clear definition of the term and describe some of their specific characteristics, such as the type of ingredients included in their formulation and their nutritional composition. Copyright © 2016 Elsevier Ltd. All rights reserved.

  5. Rapid communication: Global-local processing affects recognition of distractor emotional faces.

    PubMed

    Srinivasan, Narayanan; Gupta, Rashmi

    2011-03-01

    Recent studies have shown links between happy faces and global, distributed attention as well as sad faces to local, focused attention. Emotions have been shown to affect global-local processing. Given that studies on emotion-cognition interactions have not explored the effect of perceptual processing at different spatial scales on processing stimuli with emotional content, the present study investigated the link between perceptual focus and emotional processing. The study investigated the effects of global-local processing on the recognition of distractor faces with emotional expressions. Participants performed a digit discrimination task with digits at either the global level or the local level presented against a distractor face (happy or sad) as background. The results showed that global processing associated with broad scope of attention facilitates recognition of happy faces, and local processing associated with narrow scope of attention facilitates recognition of sad faces. The novel results of the study provide conclusive evidence for emotion-cognition interactions by demonstrating the effect of perceptual processing on emotional faces. The results along with earlier complementary results on the effect of emotion on global-local processing support a reciprocal relationship between emotional processing and global-local processing. Distractor processing with emotional information also has implications for theories of selective attention.

  6. Tomographical process monitoring of laser transmission welding with OCT

    NASA Astrophysics Data System (ADS)

    Ackermann, Philippe; Schmitt, Robert

    2017-06-01

    Process control of laser processes still encounters many obstacles. Although these processes are stable, narrow process parameter windows and process deviations have increased the requirements on the process itself and on monitoring devices. Laser transmission welding, as a contactless and locally limited joining technique, is well-established in a variety of demanding production areas. For example, sensitive parts demand a particle-free joining technique which does not affect the inner components. Inline integrated, non-destructive optical measurement systems capable of providing non-invasive tomographical images of the transparent material, the weld seam and its surrounding areas with micron resolution would improve the overall process. The obtained measurement data enable qualitative feedback into the system to adapt parameters for a more robust process. Within this paper we present the inline monitoring device based on Fourier-domain optical coherence tomography developed within the European-funded research project "Manunet Weldable". This device, after adaptation to the laser transmission welding process, is optically and mechanically integrated into the existing laser system. The main target lies within the inline process control destined to extract tomographical geometrical measurement data from the weld seam forming process. Usage of this technology makes offline destructive testing of produced parts obsolete.

  7. A quality-refinement process for medical imaging applications.

    PubMed

    Neuhaus, J; Maleike, D; Nolden, M; Kenngott, H-G; Meinzer, H-P; Wolf, I

    2009-01-01

    To introduce and evaluate a process for the refinement of software quality that is suitable for research groups. In order to avoid constraining researchers too much, the quality improvement process has to be designed carefully. The scope of this paper is to present and evaluate a process to advance quality aspects of existing research prototypes in order to make them ready for initial clinical studies. The proposed process is tailored for research environments and is therefore more lightweight than traditional quality management processes. It focuses on quality criteria that are important at the given stage of the software life cycle and emphasizes the usage of tools that automate aspects of the process. To evaluate the additional effort that comes along with the process, it was exemplarily applied to eight prototypical software modules for medical image processing. The introduced process has been applied to improve the quality of all prototypes so that they could be successfully used in clinical studies. The quality refinement yielded an average of 13 person-days of additional effort per project. Overall, 107 bugs were found and resolved by applying the process. Careful selection of quality criteria and the usage of automated process tools lead to a lightweight quality refinement process suitable for scientific research groups that can be applied to ensure a successful transfer of technical software prototypes into clinical research workflows.

  8. Negative Binomial Process Count and Mixture Modeling.

    PubMed

    Zhou, Mingyuan; Carin, Lawrence

    2015-02-01

    The seemingly disjoint problems of count and mixture modeling are united under the negative binomial (NB) process. A gamma process is employed to model the rate measure of a Poisson process, whose normalization provides a random probability measure for mixture modeling and whose marginalization leads to an NB process for count modeling. A draw from the NB process consists of a Poisson distributed finite number of distinct atoms, each of which is associated with a logarithmic distributed number of data samples. We reveal relationships between various count- and mixture-modeling distributions and construct a Poisson-logarithmic bivariate distribution that connects the NB and Chinese restaurant table distributions. Fundamental properties of the models are developed, and we derive efficient Bayesian inference. It is shown that with augmentation and normalization, the NB process and gamma-NB process can be reduced to the Dirichlet process and hierarchical Dirichlet process, respectively. These relationships highlight theoretical, structural, and computational advantages of the NB process. A variety of NB processes, including the beta-geometric, beta-NB, marked-beta-NB, marked-gamma-NB and zero-inflated-NB processes, with distinct sharing mechanisms, are also constructed. These models are applied to topic modeling, with connections made to existing algorithms under Poisson factor analysis. Example results show the importance of inferring both the NB dispersion and probability parameters.
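    The gamma-Poisson construction underlying the NB process can be illustrated with the hedged sketch below: a gamma-distributed rate is drawn for each observation and a Poisson count is drawn given that rate, so that marginally the counts follow a negative binomial distribution. The shape and scale values are arbitrary illustrative choices.

```python
# Hedged sketch of the gamma-Poisson (negative binomial) construction
# referenced above: a gamma-distributed rate, mixed over a Poisson count,
# yields NB-distributed counts. Parameter values are illustrative.
import numpy as np

rng = np.random.default_rng(42)
r, theta = 3.0, 2.0                     # gamma shape and scale

# Draw a gamma rate per observation, then a Poisson count given that rate.
rates = rng.gamma(shape=r, scale=theta, size=100_000)
counts = rng.poisson(rates)

# Marginally the counts are negative binomial with mean r*theta and
# variance r*theta*(1 + theta); compare with the empirical moments.
print("empirical mean/var:", counts.mean(), counts.var())
print("analytic  mean/var:", r * theta, r * theta * (1 + theta))
```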

  9. [Process management in the hospital pharmacy for the improvement of the patient safety].

    PubMed

    Govindarajan, R; Perelló-Juncá, A; Parès-Marimòn, R M; Serrais-Benavente, J; Ferrandez-Martí, D; Sala-Robinat, R; Camacho-Calvente, A; Campabanal-Prats, C; Solà-Anderiu, I; Sanchez-Caparrós, S; Gonzalez-Estrada, J; Martinez-Olalla, P; Colomer-Palomo, J; Perez-Mañosas, R; Rodríguez-Gallego, D

    2013-01-01

    To define a process management model for a hospital pharmacy in order to measure, analyse and make continuous improvements in patient safety and healthcare quality. In order to implement process management, Igualada Hospital was divided into different processes, one of which was the Hospital Pharmacy. A multidisciplinary management team was given responsibility for each process. For each sub-process one person was identified to be responsible, and a working group was formed under his/her leadership. With the help of each working group, a risk analysis using failure modes and effects analysis (FMEA) was performed, and the corresponding improvement actions were implemented. Sub-process indicators were also identified, and different process management mechanisms were introduced. The first risk analysis with FMEA produced more than thirty preventive actions to improve patient safety. Later, the weekly analysis of errors, as well as the monthly analysis of key process indicators, permitted us to monitor process results and, as each sub-process manager participated in these meetings, also to assume accountability and responsibility, thus consolidating the culture of excellence. The introduction of different process management mechanisms, with the participation of people responsible for each sub-process, introduces a participative management tool for the continuous improvement of patient safety and healthcare quality. Copyright © 2012 SECA. Published by Elsevier Espana. All rights reserved.
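    The FMEA scoring step used by the working groups can be illustrated, in generic form, with the sketch below: each failure mode is scored for severity, occurrence and detectability, and the product (risk priority number) is used to rank which preventive actions to take first. The failure modes and scores are invented for illustration.

```python
# Hedged sketch: the usual FMEA scoring step (risk priority number =
# severity x occurrence x detectability) used to rank failure modes.
# The failure modes and scores below are made up for illustration.
failure_modes = [
    {"mode": "wrong dose transcribed", "severity": 9, "occurrence": 3, "detectability": 4},
    {"mode": "look-alike drug mix-up", "severity": 8, "occurrence": 2, "detectability": 6},
    {"mode": "label printed illegibly", "severity": 4, "occurrence": 5, "detectability": 2},
]

for fm in failure_modes:
    fm["rpn"] = fm["severity"] * fm["occurrence"] * fm["detectability"]

# Highest RPN first: these would receive preventive actions first.
for fm in sorted(failure_modes, key=lambda f: f["rpn"], reverse=True):
    print(f'{fm["mode"]}: RPN = {fm["rpn"]}')
```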

  10. Distributed processing method for arbitrary view generation in camera sensor network

    NASA Astrophysics Data System (ADS)

    Tehrani, Mehrdad P.; Fujii, Toshiaki; Tanimoto, Masayuki

    2003-05-01

    A camera sensor network, as a new advance in technology, is a network in which each sensor node can capture video signals, process them, and communicate them to other nodes. The processing task in this network is to generate an arbitrary view, which can be requested by the central node or a user. To avoid unnecessary communication between nodes in the camera sensor network and to speed up processing time, we have distributed the processing tasks between nodes. In this method, each sensor node processes part of the interpolation algorithm to generate the interpolated image with local communication between nodes. The processing task in the camera sensor network is ray-space interpolation, which is an object-independent method based on MSE minimization using adaptive filtering. Two methods were proposed for distributing the processing tasks and sharing image data locally: Fully Image Shared Decentralized Processing (FIS-DP) and Partially Image Shared Decentralized Processing (PIS-DP). Comparison of the proposed methods with the Centralized Processing (CP) method shows that PIS-DP has the highest processing speed after FIS-DP, and CP has the lowest processing speed. The communication rates of CP and PIS-DP are almost the same and better than that of FIS-DP. Therefore, PIS-DP is recommended because of its better performance than CP and FIS-DP.

  11. EEG alpha synchronization is related to top-down processing in convergent and divergent thinking

    PubMed Central

    Benedek, Mathias; Bergner, Sabine; Könen, Tanja; Fink, Andreas; Neubauer, Aljoscha C.

    2011-01-01

    Synchronization of EEG alpha activity has been referred to as being indicative of cortical idling, but according to more recent evidence it has also been associated with active internal processing and creative thinking. The main objective of this study was to investigate to what extent EEG alpha synchronization is related to internal processing demands and to specific cognitive process involved in creative thinking. To this end, EEG was measured during a convergent and a divergent thinking task (i.e., creativity-related task) which once were processed involving low and once involving high internal processing demands. High internal processing demands were established by masking the stimulus (after encoding) and thus preventing further bottom-up processing. Frontal alpha synchronization was observed during convergent and divergent thinking only under exclusive top-down control (high internal processing demands), but not when bottom-up processing was allowed (low internal processing demands). We conclude that frontal alpha synchronization is related to top-down control rather than to specific creativity-related cognitive processes. Frontal alpha synchronization, which has been observed in a variety of different creativity tasks, thus may not reflect a brain state that is specific for creative cognition but can probably be attributed to high internal processing demands which are typically involved in creative thinking. PMID:21925520

  12. Kennedy Space Center Payload Processing

    NASA Technical Reports Server (NTRS)

    Lawson, Ronnie; Engler, Tom; Colloredo, Scott; Zide, Alan

    2011-01-01

    This slide presentation reviews the payload processing functions at Kennedy Space Center. It details some of the payloads processed at KSC, the typical processing tasks, the facilities available for processing payloads, and the capabilities and customer services that are available.

  13. Improving the Document Development Process: Integrating Relational Data and Statistical Process Control.

    ERIC Educational Resources Information Center

    Miller, John

    1994-01-01

    Presents an approach to document numbering, document titling, and process measurement which, when used with fundamental techniques of statistical process control, reveals meaningful process-element variation as well as nominal productivity models. (SR)

  14. USE OF INDICATOR ORGANISMS FOR DETERMINING PROCESS EFFECTIVENESS

    EPA Science Inventory

    Wastewaters, process effluents and treatment process residuals contain a variety of microorganisms. Many factors influence their densities as they move through collection systems and process equipment. Biological treatment systems rely on the catabolic processes of such microor...

  15. Food processing by high hydrostatic pressure.

    PubMed

    Yamamoto, Kazutaka

    2017-04-01

    The high hydrostatic pressure (HHP) process, as a nonthermal process, can be used to inactivate microbes while minimizing chemical reactions in food. In this regard, an HHP level of 100 MPa (986.9 atm/1019.7 kgf/cm²) or more is applied to food. Conventional thermal processing damages food components relating to color, flavor, and nutrition via enhanced chemical reactions. However, the HHP process minimizes this damage and inactivates microbes, allowing the processing of high-quality, safe foods. The first commercial HHP-processed foods were launched in 1990 as fruit products such as jams, and some other products have since been commercialized: retort rice products (enhanced water impregnation), cooked hams and sausages (shelf life extension), soy sauce with minimized salt (short-time fermentation owing to enhanced enzymatic reactions), and beverages (shelf life extension). The characteristics of HHP food processing are reviewed from the viewpoints of nonthermal processing, history, research and development, physical and biochemical changes, and processing equipment.

  16. [Near infrared spectroscopy based process trajectory technology and its application in monitoring and controlling of traditional Chinese medicine manufacturing process].

    PubMed

    Li, Wen-Long; Qu, Hai-Bin

    2016-10-01

    In this paper, the principle of NIRS (near infrared spectroscopy)-based process trajectory technology was introduced. The main steps of the technique include: ① in-line collection of the process spectra of the different techniques; ② unfolding of the 3-D process spectra; ③ determination of the process trajectories and their normal limits; and ④ monitoring of new batches with the established MSPC (multivariate statistical process control) models. Applications of the technology to chemical and biological medicines were reviewed briefly. Through a comprehensive introduction of our feasibility research on monitoring traditional Chinese medicine manufacturing processes using NIRS-based multivariate process trajectories, several important problems of practical application that need urgent solutions are identified, and the application prospects of the NIRS-based process trajectory technology are discussed. Copyright© by the Chinese Pharmaceutical Association.
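    Step ② above, the unfolding of the three-way batch data, can be illustrated with the hedged sketch below, which reshapes a (batch × time × wavelength) array either batch-wise or variable-wise so that ordinary two-way PCA/MSPC tools can be applied. The array dimensions are illustrative assumptions.

```python
# Hedged sketch of the unfolding step: reshape a 3-D batch data array
# (batch x time x wavelength) into a 2-D matrix so that ordinary PCA/MSPC
# tools can be applied. Array sizes are illustrative.
import numpy as np

n_batches, n_times, n_wavelengths = 5, 120, 200
spectra_3d = np.random.default_rng(0).normal(size=(n_batches, n_times, n_wavelengths))

# Batch-wise unfolding: one row per batch, time and wavelength concatenated.
batch_unfolded = spectra_3d.reshape(n_batches, n_times * n_wavelengths)

# Variable-wise unfolding: one row per (batch, time) observation.
variable_unfolded = spectra_3d.reshape(n_batches * n_times, n_wavelengths)

print(batch_unfolded.shape, variable_unfolded.shape)
```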

  17. Recollection is a continuous process: implications for dual-process theories of recognition memory.

    PubMed

    Mickes, Laura; Wais, Peter E; Wixted, John T

    2009-04-01

    Dual-process theory, which holds that recognition decisions can be based on recollection or familiarity, has long seemed incompatible with signal detection theory, which holds that recognition decisions are based on a singular, continuous memory-strength variable. Formal dual-process models typically regard familiarity as a continuous process (i.e., familiarity comes in degrees), but they construe recollection as a categorical process (i.e., recollection either occurs or does not occur). A continuous process is characterized by a graded relationship between confidence and accuracy, whereas a categorical process is characterized by a binary relationship such that high confidence is associated with high accuracy but all lower degrees of confidence are associated with chance accuracy. Using a source-memory procedure, we found that the relationship between confidence and source-recollection accuracy was graded. Because recollection, like familiarity, is a continuous process, dual-process theory is more compatible with signal detection theory than previously thought.

  18. A qualitative assessment of a random process proposed as an atmospheric turbulence model

    NASA Technical Reports Server (NTRS)

    Sidwell, K.

    1977-01-01

    A random process is formed by the product of two Gaussian processes and the sum of that product with a third Gaussian process. The resulting total random process is interpreted as the sum of an amplitude modulated process and a slowly varying, random mean value. The properties of the process are examined, including an interpretation of the process in terms of the physical structure of atmospheric motions. The inclusion of the mean value variation gives an improved representation of the properties of atmospheric motions, since the resulting process can account for the differences in the statistical properties of atmospheric velocity components and their gradients. The application of the process to atmospheric turbulence problems, including the response of aircraft dynamic systems, is examined. The effects of the mean value variation upon aircraft loads are small in most cases, but can be important in the measurement and interpretation of atmospheric turbulence data.
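    The construction described above can be reproduced numerically with the hedged sketch below: two Gaussian processes are multiplied to give an amplitude-modulated component, and a third, slowly varying Gaussian process is added as a random mean. Simple first-order autoregressive (Ornstein-Uhlenbeck) processes with invented correlation times stand in for the Gaussian processes; the spectral details of the actual turbulence model are not reproduced.

```python
# Hedged sketch of the construction described above: the product of two
# Gaussian processes plus a third, giving an amplitude-modulated process
# with a slowly varying random mean. The correlation times are made up.
import numpy as np

def ou_process(n: int, tau: float, rng) -> np.ndarray:
    """Simple discrete Ornstein-Uhlenbeck (Gaussian) process, unit variance."""
    a = np.exp(-1.0 / tau)
    x = np.zeros(n)
    for i in range(1, n):
        x[i] = a * x[i - 1] + np.sqrt(1 - a**2) * rng.normal()
    return x

rng = np.random.default_rng(7)
n = 10_000
carrier   = ou_process(n, tau=10.0, rng=rng)    # small-scale fluctuations
amplitude = ou_process(n, tau=500.0, rng=rng)   # slowly varying intensity
mean_wind = ou_process(n, tau=2000.0, rng=rng)  # slowly varying mean value

total = amplitude * carrier + mean_wind
print(total.mean(), total.std())
```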

  19. Metals Recovery from Artificial Ore in Case of Printed Circuit Boards, Using Plasmatron Plasma Reactor

    PubMed Central

    Szałatkiewicz, Jakub

    2016-01-01

    This paper presents the investigation of metals production from artificial ore consisting of printed circuit board (PCB) waste, processed in a plasmatron plasma reactor. A test setup was designed and built that enabled research on plasma processing of PCB waste at a scale of more than 700 kg/day. The designed plasma process is presented and discussed. The process in tests consumed 2 kWh/kg of processed waste. An investigation of the process products is presented, with elemental analyses of the metals and slag. The average recovery of metals in the presented experiments is 76%. Metals recovered include: Ag, Au, Pd, Cu, Sn, Pb, and others. The chosen process parameters are presented: energy consumption, throughput, process temperatures, and air consumption. The presented technology allows processing of variable and hard-to-process printed circuit board waste that can reach up to 100% of the input mass. PMID:28773804

  20. Characterisation and Processing of Some Iron Ores of India

    NASA Astrophysics Data System (ADS)

    Krishna, S. J. G.; Patil, M. R.; Rudrappa, C.; Kumar, S. P.; Ravi, B. P.

    2013-10-01

    A lack of process characterization data for the ores - based on granulometry, texture, mineralogy, physical and chemical properties, the merits and limitations of the process, and market and local conditions - may mislead the mineral processing entrepreneur. The proper implementation of process characterization and geotechnical map data will result in optimized, sustainable utilization of the resource by processing. A few case studies of process characterization of some Indian iron ores are dealt with. The tentative ascending order of process refractoriness of iron ores is: massive hematite/magnetite < marine black iron oxide sands < laminated soft friable siliceous ore fines < massive banded magnetite quartzite < laminated soft friable clayey aluminous ore fines < massive banded hematite quartzite/jasper < massive clayey hydrated iron oxide ore < massive manganese-bearing iron ores < Ti-V bearing magmatic magnetite ore < ferruginous cherty quartzite. Based on diagnostic process characterization, the ores have been classified and generic processes have been adopted for some Indian iron ores.

  1. Measuring health care process quality with software quality measures.

    PubMed

    Yildiz, Ozkan; Demirörs, Onur

    2012-01-01

    Existing quality models focus on specific diseases, clinics or clinical areas. Although they contain structure, process, or output type measures, there is no model that measures the quality of health care processes comprehensively. In addition, because overall process quality is not measured, hospitals cannot compare the quality of their processes internally or externally. To address these problems, a new model was developed from software quality measures. We adopted the ISO/IEC 9126 software quality standard for health care processes. Then, JCIAS (Joint Commission International Accreditation Standards for Hospitals) measurable elements were added to the model scope to unify functional requirements. Measurement results for the assessment (diagnosing) process are provided in this paper. After the application, it was concluded that the model determines weak and strong aspects of the processes, gives a more detailed picture of process quality, and provides quantifiable information that allows hospitals to compare their processes with those of multiple organizations.

  2. Thermal Stir Welding: A New Solid State Welding Process

    NASA Technical Reports Server (NTRS)

    Ding, R. Jeffrey

    2003-01-01

    Thermal stir welding is a new welding process developed at NASA's Marshall Space Flight Center in Huntsville, AL. Thermal stir welding is similar to friction stir welding in that it joins similar or dissimilar materials without melting the parent material. However, unlike friction stir welding, the heating, stirring and forging elements of the process are all independent of each other and are separately controlled. Furthermore, the heating element of the process can be either a solid-state process (such as a thermal blanket, induction-type process, etc.) or a fusion process (YAG laser, plasma torch, etc.). The separation of the heating, stirring and forging elements of the process allows more degrees of freedom for greater process control. This paper introduces the mechanics of the thermal stir welding process. In addition, weld mechanical property data are presented for selected alloys, as well as metallurgical analysis.

  3. Thermal Stir Welding: A New Solid State Welding Process

    NASA Technical Reports Server (NTRS)

    Ding, R. Jeffrey; Munafo, Paul M. (Technical Monitor)

    2002-01-01

    Thermal stir welding is a new welding process developed at NASA's Marshall Space Flight Center in Huntsville, AL. Thermal stir welding is similar to friction stir welding in that it joins similar or dissimilar materials without melting the parent material. However, unlike friction stir welding, the heating, stirring and forging elements of the process are all independent of each other and are separately controlled. Furthermore, the heating element of the process can be either a solid-state process (such as a thermal blanket, induction-type process, etc.) or a fusion process (YAG laser, plasma torch, etc.). The separation of the heating, stirring and forging elements of the process allows more degrees of freedom for greater process control. This paper introduces the mechanics of the thermal stir welding process. In addition, weld mechanical property data are presented for selected alloys, as well as metallurgical analysis.

  4. Metals Recovery from Artificial Ore in Case of Printed Circuit Boards, Using Plasmatron Plasma Reactor.

    PubMed

    Szałatkiewicz, Jakub

    2016-08-10

    This paper presents the investigation of metals production from artificial ore consisting of printed circuit board (PCB) waste, processed in a plasmatron plasma reactor. A test setup was designed and built that enabled research on plasma processing of PCB waste at a scale of more than 700 kg/day. The designed plasma process is presented and discussed. The process in tests consumed 2 kWh/kg of processed waste. An investigation of the process products is presented, with elemental analyses of the metals and slag. The average recovery of metals in the presented experiments is 76%. Metals recovered include: Ag, Au, Pd, Cu, Sn, Pb, and others. The chosen process parameters are presented: energy consumption, throughput, process temperatures, and air consumption. The presented technology allows processing of variable and hard-to-process printed circuit board waste that can reach up to 100% of the input mass.

  5. The origins of levels-of-processing effects in a conceptual test: evidence for automatic influences of memory from the process-dissociation procedure.

    PubMed

    Bergerbest, Dafna; Goshen-Gottstein, Yonatan

    2002-12-01

    In three experiments, we explored automatic influences of memory in a conceptual memory task, as affected by a levels-of-processing (LoP) manipulation. We also explored the origins of the LoP effect by examining whether the effect emerged only when participants in the shallow condition truncated the perceptual processing (the lexical-processing hypothesis) or even when the entire word was encoded in this condition (the conceptual-processing hypothesis). Using the process-dissociation procedure and an implicit association-generation task, we found that the deep encoding condition yielded higher estimates of automatic influences than the shallow condition. In support of the conceptual processing hypothesis, the LoP effect was found even when the shallow task did not lead to truncated processing of the lexical units. We suggest that encoding for meaning is a prerequisite for automatic processing on conceptual tests of memory.

  6. Exploring business process modelling paradigms and design-time to run-time transitions

    NASA Astrophysics Data System (ADS)

    Caron, Filip; Vanthienen, Jan

    2016-09-01

    The business process management literature describes a multitude of approaches (e.g. imperative, declarative or event-driven) that each result in a different mix of process flexibility, compliance, effectiveness and efficiency. Although the use of a single approach over the process lifecycle is often assumed, transitions between approaches at different phases in the process lifecycle may also be considered. This article explores several business process strategies by analysing the approaches at different phases in the process lifecycle as well as the various transitions.

  7. System Engineering Concept Demonstration, Process Model. Volume 3

    DTIC Science & Technology

    1992-12-01

    Process or Process Model: The System Engineering process must be the enactment of the aforementioned definitions. Therefore, a process is an enactment of a... The Prototype Tradeoff Scenario demonstrates six levels of abstraction in the Process Model. The Process Model symbology is explained within the "Help" icon...

  8. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Eun, H.C.; Cho, Y.Z.; Choi, J.H.

    A regeneration process of LiCl-KCl eutectic waste salt generated from the pyrochemical process of spent nuclear fuel has been studied. This regeneration process is composed of a chemical conversion process and a vacuum distillation process. Through the regeneration process, a high efficiency of renewable salt recovery can be obtained from the waste salt and rare earth nuclides in the waste salt can be separated as oxide or phosphate forms. Thus, the regeneration process can contribute greatly to a reduction of the waste volume and a creation of durable final waste forms. (authors)

  9. An open system approach to process reengineering in a healthcare operational environment.

    PubMed

    Czuchry, A J; Yasin, M M; Norris, J

    2000-01-01

    The objective of this study is to examine the applicability of process reengineering in a healthcare operational environment. The intake process of a mental healthcare service delivery system is analyzed systematically to identify process-related problems. A methodology which utilizes an open system orientation coupled with process reengineering is utilized to overcome operational and patient related problems associated with the pre-reengineered intake process. The systematic redesign of the intake process resulted in performance improvements in terms of cost, quality, service and timing.

  10. Developing the JPL Engineering Processes

    NASA Technical Reports Server (NTRS)

    Linick, Dave; Briggs, Clark

    2004-01-01

    This paper briefly recounts the recent history of process reengineering at the NASA Jet Propulsion Laboratory, with a focus on the engineering processes. The JPL process structure is described and the process development activities of the past several years outlined. The main focus of the paper is on the current process structure, the emphasis on the flight project life cycle, the governance approach that lead to Flight Project Practices, and the remaining effort to capture process knowledge at the detail level of the work group.

  11. Water-saving liquid-gas conditioning system

    DOEpatents

    Martin, Christopher; Zhuang, Ye

    2014-01-14

    A method for treating a process gas with a liquid comprises contacting a process gas with a hygroscopic working fluid in order to remove a constituent from the process gas. A system for treating a process gas with a liquid comprises a hygroscopic working fluid comprising a component adapted to absorb or react with a constituent of a process gas, and a liquid-gas contactor for contacting the working fluid and the process gas, wherein the constituent is removed from the process gas within the liquid-gas contactor.

  12. Model for Simulating a Spiral Software-Development Process

    NASA Technical Reports Server (NTRS)

    Mizell, Carolyn; Curley, Charles; Nayak, Umanath

    2010-01-01

    A discrete-event simulation model, and a computer program that implements the model, have been developed as means of analyzing a spiral software-development process. This model can be tailored to specific development environments for use by software project managers in making quantitative cases for deciding among different software-development processes, courses of action, and cost estimates. A spiral process can be contrasted with a waterfall process, which is a traditional process that consists of a sequence of activities that include analysis of requirements, design, coding, testing, and support. A spiral process is an iterative process that can be regarded as a repeating modified waterfall process. Each iteration includes assessment of risk, analysis of requirements, design, coding, testing, delivery, and evaluation. A key difference between a spiral and a waterfall process is that a spiral process can accommodate changes in requirements at each iteration, whereas in a waterfall process, requirements are considered to be fixed from the beginning and, therefore, a waterfall process is not flexible enough for some projects, especially those in which requirements are not known at the beginning or may change during development. For a given project, a spiral process may cost more and take more time than does a waterfall process, but may better satisfy a customer's expectations and needs. Models for simulating various waterfall processes have been developed previously, but until now, there have been no models for simulating spiral processes. The present spiral-process-simulating model and the software that implements it were developed by extending a discrete-event simulation process model of the IEEE 12207 Software Development Process, which was built using commercially available software known as the Process Analysis Tradeoff Tool (PATT). Typical inputs to PATT models include industry-average values of product size (expressed as number of lines of code), productivity (number of lines of code per hour), and number of defects per source line of code. The user provides the number of resources, the overall percent of effort that should be allocated to each process step, and the number of desired staff members for each step. The output of PATT includes the size of the product, a measure of effort, a measure of rework effort, the duration of the entire process, and the numbers of injected, detected, and corrected defects as well as a number of other interesting features. In the development of the present model, steps were added to the IEEE 12207 waterfall process, and this model and its implementing software were made to run repeatedly through the sequence of steps, each repetition representing an iteration in a spiral process. Because the IEEE 12207 model is founded on a waterfall paradigm, it enables direct comparison of spiral and waterfall processes. The model can be used throughout a software-development project to analyze the project as more information becomes available. For instance, data from early iterations can be used as inputs to the model, and the model can be used to estimate the time and cost of carrying the project to completion.
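
    PATT is a commercial tool and its internals are not reproduced here; as a rough, hypothetical illustration of the iteration-by-iteration bookkeeping described above, the Python sketch below runs a short waterfall-like step sequence repeatedly and accumulates size, effort and defect counts. All rates and distributions are invented placeholders, not PATT industry-average inputs.

        import random

        # Hypothetical per-iteration parameters (placeholders, not PATT data).
        LINES_PER_ITERATION = 2000          # new source lines added per spiral iteration
        PRODUCTIVITY_LOC_PER_HOUR = 4.0     # lines of code produced per staff-hour
        DEFECTS_PER_KLOC = 8.0              # defects injected per thousand lines
        DETECTION_RATE = 0.7                # fraction of open defects found by testing

        def simulate_spiral(iterations, seed=0):
            random.seed(seed)
            total_loc = total_effort = open_defects = detected = 0
            for i in range(1, iterations + 1):
                total_loc += LINES_PER_ITERATION
                total_effort += LINES_PER_ITERATION / PRODUCTIVITY_LOC_PER_HOUR   # hours
                injected = int(LINES_PER_ITERATION / 1000 * DEFECTS_PER_KLOC
                               * random.uniform(0.8, 1.2))
                open_defects += injected
                found = int(open_defects * DETECTION_RATE)   # testing step of this iteration
                open_defects -= found
                detected += found
                print(f"iteration {i}: size={total_loc} LOC, effort={total_effort:.0f} h, "
                      f"defects found={found}, still open={open_defects}")
            return total_loc, total_effort, detected

        simulate_spiral(iterations=4)

    A fuller model would also re-sample requirement changes and rework at each pass, which is the behaviour the spiral extension of the IEEE 12207 waterfall model is meant to capture.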

  13. Magnitude processing of symbolic and non-symbolic proportions: an fMRI study.

    PubMed

    Mock, Julia; Huber, Stefan; Bloechle, Johannes; Dietrich, Julia F; Bahnmueller, Julia; Rennig, Johannes; Klein, Elise; Moeller, Korbinian

    2018-05-10

    Recent research indicates that processing proportion magnitude is associated with activation in the intraparietal sulcus. Thus, brain areas associated with the processing of numbers (i.e., absolute magnitude) were activated during the processing of symbolic fractions as well as non-symbolic proportions. Here, we systematically investigated the cognitive processing of symbolic (e.g., fractions and decimals) and non-symbolic proportions (e.g., dot patterns and pie charts) in a two-stage procedure. First, we investigated relative magnitude-related activations of proportion processing. Second, we evaluated whether symbolic and non-symbolic proportions share common neural substrates. We conducted an fMRI study using magnitude comparison tasks with symbolic and non-symbolic proportions, respectively. As an indicator of magnitude-related processing of proportions, the distance effect was evaluated. A conjunction analysis indicated joint activation of specific occipito-parietal areas including the right intraparietal sulcus (IPS) during proportion magnitude processing. More specifically, the results indicate that the IPS, which is commonly associated with absolute magnitude processing, is involved in processing relative magnitude information as well, irrespective of symbolic or non-symbolic presentation format. However, we also found distinct activation patterns for the magnitude processing of the different presentation formats. Our findings suggest that processing of the separate presentation formats is associated not only with magnitude manipulations in the IPS, but also with increasing demands on executive functions and strategy use associated with frontal brain regions, as well as with visual attention and encoding in occipital regions. Thus, the magnitude processing of proportions may not exclusively reflect the processing of number magnitude information but rather also domain-general processes.

  14. [Alcohol-purification technology and its particle sedimentation process in manufactory of Fufang Kushen injection].

    PubMed

    Liu, Xiaoqian; Tong, Yan; Wang, Jinyu; Wang, Ruizhen; Zhang, Yanxia; Wang, Zhimin

    2011-11-01

    Fufang Kushen injection was selected as the model drug to optimize its alcohol-purification process, to characterize the particle sedimentation process, and to investigate the feasibility of using process analytical technology (PAT) in traditional Chinese medicine (TCM) manufacturing. Total alkaloids (calculated as matrine, oxymatrine, sophoridine and oxysophoridine) and macrozamin were selected as quality evaluation markers to optimize the alcohol purification of Fufang Kushen injection. Process parameters of the particulates formed during alcohol purification, such as their number, density and sedimentation velocity, were also determined to define the sedimentation time and to better understand the process. The optimized purification process adds alcohol to the concentrated extract solution (drug material) to a defined concentration in two stages, allowing the alcohol-drug solution to sediment each time: 60% alcohol deposited for 36 hours, followed by filtration, and then 80%-90% alcohol deposited for 6 hours. The content of total alkaloids decreased slightly during the deposition steps. The average settling times of particles with diameters of 10 and 25 μm were 157.7 and 25.2 h in the first alcohol-purification step, and 84.2 and 13.5 h in the second, respectively. The optimized alcohol-purification process retained the marker components better than the initial process and was more time-saving and economical. The manufacturing quality of TCM injections can be controlled through the process, and a PAT scheme must be designed on the basis of a thorough understanding of the TCM production process.
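
    The abstract does not state which settling model underlies the reported times, but the values (157.7 h for 10 μm versus 25.2 h for 25 μm particles) scale roughly with the inverse square of the particle diameter, which is what laminar Stokes settling predicts. The Python sketch below shows that scaling only; every fluid and particle property in it is an assumed placeholder, not a value from the paper.

        # Stokes-law settling-time sketch; property values are illustrative assumptions.
        G = 9.81                 # gravitational acceleration, m/s^2

        def settling_time_hours(diameter_um, fall_height_m=1.0,
                                rho_particle=1200.0, rho_fluid=900.0, viscosity=0.003):
            """Time to fall a given height at the Stokes terminal velocity
            v = g * d**2 * (rho_p - rho_f) / (18 * mu)."""
            d = diameter_um * 1e-6                                    # micrometres -> metres
            v = G * d**2 * (rho_particle - rho_fluid) / (18.0 * viscosity)
            return fall_height_m / v / 3600.0                         # seconds -> hours

        t10 = settling_time_hours(10.0)
        t25 = settling_time_hours(25.0)
        print(f"10 um: {t10:.1f} h, 25 um: {t25:.1f} h, ratio: {t10 / t25:.2f}")
        # The ratio is (25/10)**2 = 6.25, close to the ~6.3 ratio of the reported times.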

  15. Application of volume-retarded osmosis and low-pressure membrane hybrid process for water reclamation.

    PubMed

    Im, Sung-Ju; Choi, Jungwon; Lee, Jung-Gil; Jeong, Sanghyun; Jang, Am

    2018-03-01

    A new concept of volume-retarded osmosis and low-pressure membrane (VRO-LPM) hybrid process was developed and evaluated for the first time in this study. Commercially available forward osmosis (FO) and ultrafiltration (UF) membranes were employed in the VRO-LPM hybrid process to overcome the energy limitations of draw solution (DS) regeneration and permeate production in the FO process. To evaluate its feasibility as a water reclamation process, and to optimize the operational conditions, cross-flow FO and dead-end mode UF processes were individually evaluated. For the FO process, a DS concentration of 0.15 g mL-1 of polysulfonate styrene (PSS) was determined to be optimal, having a high flux with a low reverse salt flux. The UF membrane with a molecular weight cut-off of 1 kDa was chosen for its high PSS rejection in the LPM process. As a single process, UF (LPM) exhibited a higher flux than FO, but this could be controlled by adjusting the effective membrane areas of the FO and UF membranes in the VRO-LPM system. The VRO-LPM hybrid process only required a circulation pump for the FO process. This led to a decrease in the specific energy consumption of the VRO-LPM process for potable water production, which became similar to that of the single FO process. Therefore, the newly developed VRO-LPM hybrid process, with an appropriate DS selection, can be used as an energy-efficient water production method and can outperform conventional water reclamation processes. Copyright © 2017 Elsevier Ltd. All rights reserved.

  16. Quality control process improvement of flexible printed circuit board by FMEA

    NASA Astrophysics Data System (ADS)

    Krasaephol, Siwaporn; Chutima, Parames

    2018-02-01

    This research focuses on quality control process improvement for Flexible Printed Circuit Board (FPCB) production, centred on model 7-Flex, using the Failure Mode and Effect Analysis (FMEA) method to decrease the proportion of defective finished goods found at the final inspection process. Because a number of defective units were found only at final inspection, defective product risked escaping to customers. The problem stems from a quality control process that is not efficient enough to filter defective products in-process, because there is no In-Process Quality Control (IPQC) or sampling inspection within the process. Therefore, the quality control process had to be improved by setting inspection gates and IPQCs at critical processes in order to filter out defective products. The critical processes were analysed by the FMEA method. IPQC is used for detecting defective products and reducing the chance of defective finished goods escaping to customers. Reducing the proportion of defective finished goods also decreases scrap cost, because finished goods incur a higher scrap cost than work in-process. Moreover, defective products found during the process can reveal abnormal processes, so engineers and operators can solve the problems in a timely manner. The improved quality control was implemented on 7-Flex production lines from July 2017 to September 2017. The results show decreases in the average proportion of defective finished goods and in the average Customer Manufacturers Lot Reject Rate (%LRR of CMs) of 4.5% and 4.1%, respectively. Furthermore, the cost saving from this quality control process amounts to 100K Baht.
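
    The paper's actual severity, occurrence and detection scores are not reproduced here; as a generic illustration of how FMEA ranks critical process steps, the Python sketch below computes Risk Priority Numbers (RPN = severity x occurrence x detection) for a few hypothetical FPCB failure modes.

        # Generic FMEA Risk Priority Number (RPN) ranking.
        # Failure modes and 1-10 ratings are hypothetical, not the study's data.
        failure_modes = [
            # (process step, failure mode, severity, occurrence, detection)
            ("etching",        "over-etched traces",   7, 4, 5),
            ("lamination",     "delamination",         8, 3, 6),
            ("final assembly", "missed open circuit",  9, 2, 7),
        ]

        ranked = sorted(failure_modes, key=lambda r: r[2] * r[3] * r[4], reverse=True)
        for step, mode, s, o, d in ranked:
            print(f"{step:15s} {mode:22s} RPN = {s}*{o}*{d} = {s * o * d}")
        # Steps with the highest RPN are the natural candidates for IPQC inspection gates.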

  17. Formulating poultry processing sanitizers from alkaline salts of fatty acids

    USDA-ARS?s Scientific Manuscript database

    Though some poultry processing operations remove microorganisms from carcasses, other processing operations cause cross-contamination that spreads microorganisms between carcasses, processing water, and processing equipment. One method used by commercial poultry processors to reduce microbial contam...

  18. Fabrication Process for Cantilever Beam Micromechanical Switches

    DTIC Science & Technology

    1993-08-01

    Contents excerpt: Cantilever Beam Design; Chemistry and Materials Used in Cantilever Beam Process; Photomask levels and composite. Cantilever Beam Fabrication Process: The beam fabrication process incorporates four different photomasking levels with 62 processing...

  19. Reports of planetary geology program, 1983

    NASA Technical Reports Server (NTRS)

    Holt, H. E. (Compiler)

    1984-01-01

    Several areas of the Planetary Geology Program were addressed including outer solar system satellites, asteroids, comets, Venus, cratering processes and landform development, volcanic processes, aeolian processes, fluvial processes, periglacial and permafrost processes, geomorphology, remote sensing, tectonics and stratigraphy, and mapping.

  20. Cognitive Processes in Discourse Comprehension: Passive Processes, Reader-Initiated Processes, and Evolving Mental Representations

    ERIC Educational Resources Information Center

    van den Broek, Paul; Helder, Anne

    2017-01-01

    As readers move through a text, they engage in various types of processes that, if all goes well, result in a mental representation that captures their interpretation of the text. With each new text segment the reader engages in passive and, at times, reader-initiated processes. These processes are strongly influenced by the readers'…

  1. The Use of Knowledge Based Decision Support Systems in Reengineering Selected Processes in the U. S. Marine Corps

    DTIC Science & Technology

    2001-09-01

    With the adoption of technology by businesses in hopes of achieving a measurable benefit in terms of process efficiency and effectiveness, business process reengineering (BPR) is becoming increasingly important. BPR suggests... Contents excerpt: KOPER-LITE; How might the military benefit from process reengineering efforts.

  2. 30 CFR 206.181 - How do I establish processing costs for dual accounting purposes when I do not process the gas?

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    Section 206.181, Mineral Resources, MINERALS... Processing Allowances. § 206.181 How do I establish processing costs for dual accounting purposes when I do not process the gas? Where accounting for comparison (dual accounting) is required for gas production...

  3. Conceptual models of information processing

    NASA Technical Reports Server (NTRS)

    Stewart, L. J.

    1983-01-01

    The conceptual information processing issues are examined. Human information processing is defined as an active cognitive process that is analogous to a system. It is the flow and transformation of information within a human. The human is viewed as an active information seeker who is constantly receiving, processing, and acting upon the surrounding environmental stimuli. Human information processing models are conceptual representations of cognitive behaviors. Models of information processing are useful in representing the different theoretical positions and in attempting to define the limits and capabilities of human memory. It is concluded that an understanding of conceptual human information processing models and their applications to systems design leads to a better human factors approach.

  4. Industrial application of semantic process mining

    NASA Astrophysics Data System (ADS)

    Espen Ingvaldsen, Jon; Atle Gulla, Jon

    2012-05-01

    Process mining relates to the extraction of non-trivial and useful information from information system event logs. It is a new research discipline that has evolved significantly since the early work on idealistic process logs. Over the last years, process mining prototypes have incorporated elements from semantics and data mining and targeted visualisation techniques that are more user-friendly to business experts and process owners. In this article, we present a framework for evaluating different aspects of enterprise process flows and address practical challenges of state-of-the-art industrial process mining. We also explore the inherent strengths of the technology for more efficient process optimisation.

  5. Reliability and performance of a system-on-a-chip by predictive wear-out based activation of functional components

    DOEpatents

    Cher, Chen-Yong; Coteus, Paul W; Gara, Alan; Kursun, Eren; Paulsen, David P; Schuelke, Brian A; Sheets, II, John E; Tian, Shurong

    2013-10-01

    A processor-implemented method for determining aging of a processing unit in a processor, the method comprising: calculating an effective aging profile for the processing unit, wherein the effective aging profile quantifies the effects of aging on the processing unit; combining the effective aging profile with process variation data, actual workload data and operating conditions data for the processing unit; and determining aging through an aging sensor of the processing unit using the effective aging profile, the process variation data, the actual workload data, architectural characteristics and redundancy data, and the operating conditions data for the processing unit.

  6. Fuzzy control of burnout of multilayer ceramic actuators

    NASA Astrophysics Data System (ADS)

    Ling, Alice V.; Voss, David; Christodoulou, Leo

    1996-08-01

    To improve the yield and repeatability of the burnout process of multilayer ceramic actuators (MCAs), an intelligent processing of materials (IPM-based) control system has been developed for the manufacture of MCAs. IPM involves the active (ultimately adaptive) control of a material process using empirical or analytical models and in situ sensing of critical process states (part features and process parameters) to modify the processing conditions in real time to achieve predefined product goals. Thus, the three enabling technologies for the IPM burnout control system are process modeling, in situ sensing and intelligent control. This paper presents the design of an IPM-based control strategy for the burnout process of MCAs.

  7. Direct access inter-process shared memory

    DOEpatents

    Brightwell, Ronald B; Pedretti, Kevin; Hudson, Trammell B

    2013-10-22

    A technique for directly sharing physical memory between processes executing on processor cores is described. The technique includes loading a plurality of processes into the physical memory for execution on a corresponding plurality of processor cores sharing the physical memory. An address space is mapped to each of the processes by populating a first entry in a top level virtual address table for each of the processes. The address space of each of the processes is cross-mapped into each of the processes by populating one or more subsequent entries of the top level virtual address table with the first entry in the top level virtual address table from other processes.
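
    The patent covers low-level cross-mapping of page-table entries, which is not reproduced here; as a loosely related, much higher-level illustration of two processes reading and writing one shared physical buffer, the sketch below uses Python's standard-library multiprocessing.shared_memory module rather than the patented mechanism.

        from multiprocessing import Process, shared_memory

        def writer(name):
            shm = shared_memory.SharedMemory(name=name)   # attach to the existing block
            shm.buf[:5] = b"hello"                        # write into the shared buffer
            shm.close()

        if __name__ == "__main__":
            # Create a small shared block; parent and child map the same memory.
            shm = shared_memory.SharedMemory(create=True, size=16)
            child = Process(target=writer, args=(shm.name,))
            child.start()
            child.join()
            print(bytes(shm.buf[:5]))                     # b'hello', written by the child
            shm.close()
            shm.unlink()                                  # release the shared block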

  8. Biotechnology in Food Production and Processing

    NASA Astrophysics Data System (ADS)

    Knorr, Dietrich; Sinskey, Anthony J.

    1985-09-01

    The food processing industry is the oldest and largest industry using biotechnological processes. Further development of food products and processes based on biotechnology depends upon the improvement of existing processes, such as fermentation, immobilized biocatalyst technology, and production of additives and processing aids, as well as the development of new opportunities for food biotechnology. Improvements are needed in the characterization, safety, and quality control of food materials, in processing methods, in waste conversion and utilization processes, and in currently used food microorganism and tissue culture systems. Also needed are fundamental studies of the structure-function relationship of food materials and of the cell physiology and biochemistry of raw materials.

  9. What is a good public participation process? Five perspectives from the public.

    PubMed

    Webler, T; Tuler, S; Krueger, R

    2001-03-01

    It is now widely accepted that members of the public should be involved in environmental decision-making. This has inspired many to search for principles that characterize good public participation processes. In this paper we report on a study that identifies discourses about what defines a good process. Our case study was a forest planning process in northern New England and New York. We employed Q methodology to learn how participants characterize a good process differently, by selecting, defining, and privileging different principles. Five discourses, or perspectives, about good process emerged from our study. One perspective emphasizes that a good process acquires and maintains popular legitimacy. A second sees a good process as one that facilitates an ideological discussion. A third focuses on the fairness of the process. A fourth perspective conceptualizes participatory processes as a power struggle--in this instance a power play between local land-owning interests and outsiders. A fifth perspective highlights the need for leadership and compromise. Dramatic differences among these views suggest an important challenge for those responsible for designing and carrying out public participation processes. Conflicts may emerge about process designs because people disagree about what is good in specific contexts.

  10. Alternating event processes during lifetimes: population dynamics and statistical inference.

    PubMed

    Shinohara, Russell T; Sun, Yifei; Wang, Mei-Cheng

    2018-01-01

    In the literature studying recurrent event data, a large amount of work has been focused on univariate recurrent event processes where the occurrence of each event is treated as a single point in time. There are many applications, however, in which univariate recurrent events are insufficient to characterize the features of the process because patients experience nontrivial durations associated with each event. This results in an alternating event process where the disease status of a patient alternates between exacerbations and remissions. In this paper, we consider the dynamics of a chronic disease and its associated exacerbation-remission process over two time scales: calendar time and time-since-onset. In particular, over calendar time, we explore population dynamics and the relationship between incidence, prevalence and duration for such alternating event processes. We provide nonparametric estimation techniques for characteristic quantities of the process. In some settings, exacerbation processes are observed from an onset time until death; to account for the relationship between the survival and alternating event processes, nonparametric approaches are developed for estimating the exacerbation process over the lifetime. By understanding the population dynamics and the within-process structure, the paper provides a new and general way to study alternating event processes.
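
    As a hedged illustration of the population quantities discussed above (not the authors' estimators), the Python sketch below simulates alternating exacerbation/remission durations for a cohort and estimates prevalence at a calendar time point as the fraction of subjects currently in exacerbation. The exponential durations are an arbitrary modelling choice made only for the simulation.

        import random

        def simulate_subject(horizon, mean_exac=1.0, mean_remis=5.0):
            """Return (start, end) exacerbation intervals for one subject up to `horizon`."""
            t, intervals = 0.0, []
            while t < horizon:
                t += random.expovariate(1.0 / mean_remis)      # remission period
                start = t
                t += random.expovariate(1.0 / mean_exac)       # exacerbation period
                if start < horizon:
                    intervals.append((start, min(t, horizon)))
            return intervals

        def prevalence_at(cohort, time_point):
            """Fraction of subjects in the exacerbation state at `time_point`."""
            in_state = sum(any(s <= time_point < e for s, e in subj) for subj in cohort)
            return in_state / len(cohort)

        random.seed(1)
        cohort = [simulate_subject(horizon=50.0) for _ in range(2000)]
        print(f"estimated prevalence at t=25: {prevalence_at(cohort, 25.0):.3f}")
        # With mean durations of 1 (exacerbation) and 5 (remission), the long-run
        # prevalence should settle near 1 / (1 + 5), i.e. about 0.17.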

  11. Process mining in oncology using the MIMIC-III dataset

    NASA Astrophysics Data System (ADS)

    Prima Kurniati, Angelina; Hall, Geoff; Hogg, David; Johnson, Owen

    2018-03-01

    Process mining is a data analytics approach to discover and analyse process models based on the real activities captured in information systems. There is a growing body of literature on process mining in healthcare, including oncology, the study of cancer. In earlier work we found 37 peer-reviewed papers describing process mining research in oncology with a regular complaint being the limited availability and accessibility of datasets with suitable information for process mining. Publicly available datasets are one option and this paper describes the potential to use MIMIC-III, for process mining in oncology. MIMIC-III is a large open access dataset of de-identified patient records. There are 134 publications listed as using the MIMIC dataset, but none of them have used process mining. The MIMIC-III dataset has 16 event tables which are potentially useful for process mining and this paper demonstrates the opportunities to use MIMIC-III for process mining in oncology. Our research applied the L* lifecycle method to provide a worked example showing how process mining can be used to analyse cancer pathways. The results and data quality limitations are discussed along with opportunities for further work and reflection on the value of MIMIC-III for reproducible process mining research.
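
    The study itself follows the L* lifecycle method with full process-mining tooling; as a minimal, tool-free illustration of the discovery idea, the sketch below builds a directly-follows relation from a toy event log of (case id, activity, timestamp) rows, the basic ingredient most discovery algorithms start from. The rows are invented and are not MIMIC-III data.

        from collections import Counter, defaultdict

        # Toy event log: (case_id, activity, timestamp). Rows are illustrative only.
        event_log = [
            (1, "admission",    "2012-01-01"),
            (1, "chemotherapy", "2012-01-03"),
            (1, "discharge",    "2012-01-10"),
            (2, "admission",    "2012-02-01"),
            (2, "surgery",      "2012-02-02"),
            (2, "chemotherapy", "2012-02-09"),
            (2, "discharge",    "2012-02-20"),
        ]

        # Group events per case, order by timestamp, then count directly-follows pairs.
        traces = defaultdict(list)
        for case_id, activity, ts in sorted(event_log, key=lambda e: (e[0], e[2])):
            traces[case_id].append(activity)

        directly_follows = Counter()
        for activities in traces.values():
            for a, b in zip(activities, activities[1:]):
                directly_follows[(a, b)] += 1

        for (a, b), n in directly_follows.most_common():
            print(f"{a} -> {b}: {n}")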

  12. Research on the technique of large-aperture off-axis parabolic surface processing using tri-station machine and its applicability.

    PubMed

    Zhang, Xin; Luo, Xiao; Hu, Haixiang; Zhang, Xuejun

    2015-09-01

    In order to process large-aperture aspherical mirrors, we designed and constructed a tri-station machine processing center with a three-station device, which bears vectored feed motion of up to 10 axes. Based on this processing center, an aspherical mirror-processing model is proposed, in which each station implements traversal processing of large-aperture aspherical mirrors using only two axes, while the stations are switchable, thus lowering cost and enhancing processing efficiency. The applicability of the tri-station machine is also analyzed. At the same time, a simple and efficient zero-calibration method for processing is proposed. To validate the processing model, we used our processing center to process an off-axis parabolic SiC mirror with an aperture diameter of 1450 mm. The experimental results indicate that, with a one-step iterative process, the peak to valley (PV) and root mean square (RMS) of the mirror converged from 3.441 and 0.5203 μm to 2.637 and 0.2962 μm, respectively, with the RMS reduced by 43%. The validity and high accuracy of the model are thereby demonstrated.
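
    Peak-to-valley (PV) and root-mean-square (RMS) figures such as those quoted above are simple statistics of the measured surface-error map; the NumPy sketch below computes both on synthetic data standing in for an interferometric measurement, not on the SiC mirror data.

        import numpy as np

        def surface_stats(error_map_um):
            """PV and RMS (about the mean) of a surface error map in micrometres."""
            pv = float(np.nanmax(error_map_um) - np.nanmin(error_map_um))
            rms = float(np.sqrt(np.nanmean(np.square(error_map_um - np.nanmean(error_map_um)))))
            return pv, rms

        rng = np.random.default_rng(0)
        surface = 0.3 * rng.standard_normal((512, 512))     # synthetic residual error, um
        pv, rms = surface_stats(surface)
        print(f"PV = {pv:.3f} um, RMS = {rms:.3f} um")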

  13. Patterning of Indium Tin Oxide Films

    NASA Technical Reports Server (NTRS)

    Immer, Christopher

    2008-01-01

    A relatively rapid, economical process has been devised for patterning a thin film of indium tin oxide (ITO) that has been deposited on a polyester film. ITO is a transparent, electrically conductive substance made from a mixture of indium oxide and tin oxide that is commonly used in touch panels, liquid-crystal and plasma display devices, gas sensors, and solar photovoltaic panels. In a typical application, the ITO film must be patterned to form electrodes, current collectors, and the like. Heretofore it has been common practice to pattern an ITO film by means of either a laser ablation process or a photolithography/etching process. The laser ablation process requires expensive equipment to precisely position and focus a laser. The photolithography/etching process is time-consuming. The present process is a variant of the direct toner process, an inexpensive but often highly effective process for patterning conductors for printed circuits. Relative to a conventional photolithography/etching process, this process is simpler, takes less time, and is less expensive. This process involves equipment that costs less than $500 (at 2005 prices) and enables patterning of an ITO film in a process time of less than about a half hour.

  14. Assessment of Process Capability: the case of Soft Drinks Processing Unit

    NASA Astrophysics Data System (ADS)

    Sri Yogi, Kottala

    2018-03-01

    Process capability studies have a significant impact in investigating process variation, which is important in achieving product quality characteristics. Capability indices measure the inherent variability of a process and thus help to improve process performance radically. The main objective of this paper is to understand whether the process of a soft drinks processing unit, which produces premier brands marketed in India, is capable of producing within specification. A few selected critical parameters in soft drinks processing, namely the concentration of gas volume, the concentration of brix, and the torque of the crock, were considered for this study. Relevant statistical parameters, short-term capability and long-term capability, were assessed from a process capability indices perspective. For the assessment we used real-time data from a soft drinks bottling company located in the state of Chhattisgarh, India. The analysis suggested reasons for variations in the process, which were validated using ANOVA; a Taguchi cost function was also used to predict and assess the waste in monetary terms, which the organization can use to improve its process parameters. This research work has substantially benefitted the organization in understanding the variations of the selected critical parameters for achieving zero rejection.
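
    As a generic sketch of the capability indices mentioned above (not the plant's actual specification limits or measurements), the Python code below computes Cp and Cpk for one quality characteristic and an average quadratic Taguchi loss; all numbers are placeholders.

        import statistics

        def capability(data, lsl, usl):
            """Cp and Cpk from the sample mean and standard deviation."""
            mu = statistics.mean(data)
            sigma = statistics.stdev(data)
            cp = (usl - lsl) / (6 * sigma)
            cpk = min(usl - mu, mu - lsl) / (3 * sigma)
            return cp, cpk

        def taguchi_loss(data, target, k=1.0):
            """Average quadratic loss L = k * (x - target)**2 per unit."""
            return k * statistics.mean((x - target) ** 2 for x in data)

        # Placeholder measurements of a single characteristic (units arbitrary).
        sample = [3.52, 3.48, 3.55, 3.47, 3.50, 3.53, 3.49, 3.51]
        cp, cpk = capability(sample, lsl=3.40, usl=3.60)
        print(f"Cp = {cp:.2f}, Cpk = {cpk:.2f}, "
              f"loss = {taguchi_loss(sample, target=3.50):.4f}")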

  15. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dafler, J.R.; Sinnott, J.; Novil, M.

    The first phase of a study to identify candidate processes and products suitable for future exploitation using high-temperature solar energy is presented. This phase has been principally analytical, consisting of techno-economic studies, thermodynamic assessments of chemical reactions and processes, and the determination of market potentials for major chemical commodities that use significant amounts of fossil resources today. The objective was to identify energy-intensive processes that would be suitable for the production of chemicals and fuels using solar energy process heat. Of particular importance was the comparison of relative costs and energy requirements for the selected solar product versus costs for the product derived from conventional processing. The assessment methodology used a systems analytical approach to identify processes and products having the greatest potential for solar energy-thermal processing. This approach was used to establish the basis for work to be carried out in subsequent phases of development. It has been the intent of the program to divide the analysis and process identification into the following three distinct areas: (1) process selection, (2) process evaluation, and (3) ranking of processes. Four conventional processes were selected for assessment, namely methanol synthesis, styrene monomer production, vinyl chloride monomer production, and terephthalic acid production.

  16. An Application of X-Ray Fluorescence as Process Analytical Technology (PAT) to Monitor Particle Coating Processes.

    PubMed

    Nakano, Yoshio; Katakuse, Yoshimitsu; Azechi, Yasutaka

    2018-06-01

    An attempt was made to apply X-Ray Fluorescence (XRF) analysis to evaluate a small-particle coating process as a Process Analytical Technology (PAT). XRF analysis was used to monitor the coating level in the small-particle coating process in an at-line manner. A small-particle coating process usually consists of multiple coating steps. This study was conducted on simple coated particles prepared by a first coating of a model compound (DL-methionine) and a second coating of talc on spherical microcrystalline cellulose cores. Particles with this two-layer coating are sufficient to demonstrate the small-particle coating process. The results showed that the XRF signal played different roles: the signals from the first coating (layering) and the second coating (mask coating) reflected the extent of coating through different mechanisms. Furthermore, the coating of particles of different sizes was also investigated to evaluate the size effect in these coating processes. From these results, it was concluded that XRF can be used as a PAT tool for monitoring particle coating processes and can become a powerful tool in pharmaceutical manufacturing.

  17. Single-Run Single-Mask Inductively-Coupled-Plasma Reactive-Ion-Etching Process for Fabricating Suspended High-Aspect-Ratio Microstructures

    NASA Astrophysics Data System (ADS)

    Yang, Yao-Joe; Kuo, Wen-Cheng; Fan, Kuang-Chao

    2006-01-01

    In this work, we present a single-run single-mask (SRM) process for fabricating suspended high-aspect-ratio structures on standard silicon wafers using an inductively coupled plasma-reactive ion etching (ICP-RIE) etcher. This process eliminates extra fabrication steps which are required for structure release after trench etching. Released microstructures with 120 μm thickness are obtained by this process. The corresponding maximum aspect ratio of the trench is 28. The SRM process is an extended version of the standard process proposed by BOSCH GmbH (BOSCH process). The first step of the SRM process is a standard BOSCH process for trench etching, then a polymer layer is deposited on trench sidewalls as a protective layer for the subsequent structure-releasing step. The structure is released by dry isotropic etching after the polymer layer on the trench floor is removed. All the steps can be integrated into a single-run ICP process. Also, only one mask is required. Therefore, the process complexity and fabrication cost can be effectively reduced. Discussions on each SRM step and considerations for avoiding undesired etching of the silicon structures during the release process are also presented.

  18. Auditory-musical processing in autism spectrum disorders: a review of behavioral and brain imaging studies.

    PubMed

    Ouimet, Tia; Foster, Nicholas E V; Tryfon, Ana; Hyde, Krista L

    2012-04-01

    Autism spectrum disorder (ASD) is a complex neurodevelopmental condition characterized by atypical social and communication skills, repetitive behaviors, and atypical visual and auditory perception. Studies in vision have reported enhanced detailed ("local") processing but diminished holistic ("global") processing of visual features in ASD. Individuals with ASD also show enhanced processing of simple visual stimuli but diminished processing of complex visual stimuli. Relative to the visual domain, auditory global-local distinctions, and the effects of stimulus complexity on auditory processing in ASD, are less clear. However, one remarkable finding is that many individuals with ASD have enhanced musical abilities, such as superior pitch processing. This review provides a critical evaluation of behavioral and brain imaging studies of auditory processing with respect to current theories in ASD. We have focused on auditory-musical processing in terms of global versus local processing and simple versus complex sound processing. This review contributes to a better understanding of auditory processing differences in ASD. A deeper comprehension of sensory perception in ASD is key to better defining ASD phenotypes and, in turn, may lead to better interventions. © 2012 New York Academy of Sciences.

  19. Discrete State Change Model of Manufacturing Quality to Aid Assembly Process Design

    NASA Astrophysics Data System (ADS)

    Koga, Tsuyoshi; Aoyama, Kazuhiro

    This paper proposes a representation model of the quality state change in an assembly process that can be used in a computer-aided process design system. In order to formalize the state change of the manufacturing quality in the assembly process, the functions, operations, and quality changes in the assembly process are represented as a network model that can simulate discrete events. This paper also develops a design method for the assembly process. The design method calculates the space of quality state change and outputs a better assembly process (better operations and better sequences) that can be used to obtain the intended quality state of the final product. A computational redesigning algorithm of the assembly process that considers the manufacturing quality is developed. The proposed method can be used to design an improved manufacturing process by simulating the quality state change. A prototype system for planning an assembly process is implemented and applied to the design of an auto-breaker assembly process. The result of the design example indicates that the proposed assembly process planning method outputs a better manufacturing scenario based on the simulation of the quality state change.

  20. Effect of simulated mechanical recycling processes on the structure and properties of poly(lactic acid).

    PubMed

    Beltrán, F R; Lorenzo, V; Acosta, J; de la Orden, M U; Martínez Urreaga, J

    2018-06-15

    The aim of this work is to study the effects of different simulated mechanical recycling processes on the structure and properties of PLA. A commercial grade of PLA was melt compounded and compression molded, then subjected to two different recycling processes. The first recycling process consisted of an accelerated ageing and a second melt processing step, while the other recycling process included an accelerated ageing, a demanding washing process and a second melt processing step. The intrinsic viscosity measurements indicate that both recycling processes produce a degradation in PLA, which is more pronounced in the sample subjected to the washing process. DSC results suggest an increase in the mobility of the polymer chains in the recycled materials; however the degree of crystallinity of PLA seems unchanged. The optical, mechanical and gas barrier properties of PLA do not seem to be largely affected by the degradation suffered during the different recycling processes. These results suggest that, despite the degradation of PLA, the impact of the different simulated mechanical recycling processes on the final properties is limited. Thus, the potential use of recycled PLA in packaging applications is not jeopardized. Copyright © 2017 Elsevier Ltd. All rights reserved.

  1. Consumption of ultra-processed foods predicts diet quality in Canada.

    PubMed

    Moubarac, Jean-Claude; Batal, M; Louzada, M L; Martinez Steele, E; Monteiro, C A

    2017-01-01

    This study describes food consumption patterns in Canada according to the type of food processing, using the Nova classification, and investigates the association between consumption of ultra-processed foods and the nutrient profile of the diet. Dietary intakes of 33,694 individuals aged 2 years and above from the 2004 Canadian Community Health Survey were analyzed. Food and drinks were classified using Nova into unprocessed or minimally processed foods, processed culinary ingredients, processed foods and ultra-processed foods. Average consumption (total daily energy intake) and relative consumption (% of total energy intake) provided by each of the food groups were calculated. Consumption of ultra-processed foods according to sex, age, education, residential location and relative family revenue was assessed. The mean nutrient content of ultra-processed foods and non-ultra-processed foods was compared, and the average nutrient content of the overall diet across quintiles of dietary share of ultra-processed foods was measured. In 2004, 48% of calories consumed by Canadians came from ultra-processed foods. Consumption of such foods was high amongst all socioeconomic groups, and particularly in children and adolescents. As a group, ultra-processed foods were grossly nutritionally inferior to non-ultra-processed foods. After adjusting for covariates, a significant and positive relationship was found between the dietary share of ultra-processed foods and the content of carbohydrates, free sugars, total and saturated fats and energy density, while an inverse relationship was observed with the dietary content of protein, fiber, vitamins A, C, D, B6 and B12, niacin, thiamine, riboflavin, as well as zinc, iron, magnesium, calcium, phosphorus and potassium. Lowering the dietary share of ultra-processed foods and raising the consumption of hand-made meals from unprocessed or minimally processed foods would substantially improve the diet quality of Canadians. Copyright © 2016 Elsevier Ltd. All rights reserved.
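
    The dietary-share statistic used throughout the study is simply the percentage of total energy contributed by items classified as ultra-processed (Nova group 4); the toy Python sketch below shows that calculation on an invented day of intake, not on CCHS data.

        # Toy daily intake: (food item, energy in kcal, Nova group); group 4 = ultra-processed.
        intake = [
            ("apple",            95,  1),
            ("home-cooked rice", 205, 1),
            ("olive oil",        120, 2),
            ("cheese",           110, 3),
            ("soft drink",       150, 4),
            ("packaged cookies", 300, 4),
        ]

        total_energy = sum(kcal for _, kcal, _ in intake)
        upf_energy = sum(kcal for _, kcal, group in intake if group == 4)
        print(f"ultra-processed share of energy: {100.0 * upf_energy / total_energy:.1f}%")
        # About 45.9% for this invented day; the study reports 48% for Canada in 2004.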

  2. Development of Statistical Process Control Methodology for an Environmentally Compliant Surface Cleaning Process in a Bonding Laboratory

    NASA Technical Reports Server (NTRS)

    Hutchens, Dale E.; Doan, Patrick A.; Boothe, Richard E.

    1997-01-01

    Bonding labs at both MSFC and the northern Utah production plant prepare bond test specimens which simulate or witness the production of NASA's Reusable Solid Rocket Motor (RSRM). The current process for preparing the bonding surfaces employs 1,1,1-trichloroethane vapor degreasing, which simulates the current RSRM process. Government regulations (e.g., the 1990 Amendments to the Clean Air Act) have mandated a production phase-out of a number of ozone depleting compounds (ODCs), including 1,1,1-trichloroethane. In order to comply with these regulations, the RSRM Program is qualifying a spray-in-air (SIA) precision cleaning process using Brulin 1990, an aqueous blend of surfactants. Accordingly, surface preparation prior to bonding process simulation test specimens must reflect the new production cleaning process. The Bonding Lab Statistical Process Control (SPC) program monitors the progress of the lab and its capabilities, as well as certifies the bonding technicians, by periodically preparing D6AC steel tensile adhesion panels with EA-913NA epoxy adhesive using a standardized process. SPC methods are then used to ensure the process is statistically in control, thus producing reliable data for bonding studies, and to identify any problems which might develop. Since the specimen cleaning process is being changed, new SPC limits must be established. This report summarizes side-by-side testing of D6AC steel tensile adhesion witness panels and tapered double cantilevered beams (TDCBs) using both the current baseline vapor degreasing process and a lab-scale spray-in-air process. A Proceco 26-inch Typhoon dishwasher cleaned both the tensile adhesion witness panels and the TDCBs in a process which simulates the new production process. The tests were performed six times during 1995, and subsequent statistical analysis of the data established new upper control limits (UCL) and lower control limits (LCL). The data also demonstrated that the new process was equivalent to the vapor degreasing process.
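
    The report's limits are derived from its own tensile-adhesion data, and its exact charting method is not restated here; as a generic sketch of how Shewhart-style control limits are commonly set (mean plus or minus three standard deviations), consider the following, with hypothetical strength values.

        import statistics

        def control_limits(samples, k=3.0):
            """Simple Shewhart-style limits: mean +/- k standard deviations."""
            mu = statistics.mean(samples)
            sigma = statistics.stdev(samples)
            return mu - k * sigma, mu, mu + k * sigma

        # Hypothetical tensile-adhesion strengths (psi) from periodic witness panels.
        strengths = [4820, 4910, 4785, 4875, 4930, 4860, 4895, 4840]
        lcl, center, ucl = control_limits(strengths)
        print(f"LCL = {lcl:.0f}, center = {center:.0f}, UCL = {ucl:.0f}")
        # Future witness-panel results falling outside [LCL, UCL] would flag the
        # cleaning/bonding process as statistically out of control.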

  3. Evaluation of stabilization techniques for ion implant processing

    NASA Astrophysics Data System (ADS)

    Ross, Matthew F.; Wong, Selmer S.; Minter, Jason P.; Marlowe, Trey; Narcy, Mark E.; Livesay, William R.

    1999-06-01

    With the integration of high current ion implant processing into volume CMOS manufacturing, the need for photoresist stabilization to achieve a stable ion implant process is critical. This study compares electron beam stabilization, a non-thermal process, with more traditional thermal stabilization techniques such as hot plate baking and vacuum oven processing. The electron beam processing is carried out in a flood exposure system with no active heating of the wafer. These stabilization techniques are applied to typical ion implant processes that might be found in a CMOS production process flow. The stabilization processes are applied to a 1.1 micrometers thick PFI-38A i-line photoresist film prior to ion implant processing. Post stabilization CD variation is detailed with respect to wall slope and feature integrity. SEM photographs detail the effects of the stabilization technique on photoresist features. The thermal stability of the photoresist is shown for different levels of stabilization and post stabilization thermal cycling. Thermal flow stability of the photoresist is detailed via SEM photographs. A significant improvement in thermal stability is achieved with the electron beam process, such that photoresist features are stable to temperatures in excess of 200 degrees C. Ion implant processing parameters are evaluated and compared for the different stabilization methods. Ion implant system end-station chamber pressure is detailed as a function of ion implant process and stabilization condition. The ion implant process conditions are detailed for varying factors such as ion current, energy, and total dose. A reduction in the ion implant systems end-station chamber pressure is achieved with the electron beam stabilization process over the other techniques considered. This reduction in end-station chamber pressure is shown to provide a reduction in total process time for a given ion implant dose. Improvements in the ion implant process are detailed across several combinations of current and energy.

  4. The prevalence of medial coronoid process disease is high in lame large breed dogs and quantitative radiographic assessments contribute to the diagnosis.

    PubMed

    Mostafa, Ayman; Nolte, Ingo; Wefstaedt, Patrick

    2018-06-05

    Medial coronoid process disease is a common leading cause of thoracic limb lameness in dogs. Computed tomography and arthroscopy are superior to radiography to diagnose medial coronoid process disease, however, radiography remains the most available diagnostic imaging modality in veterinary practice. Objectives of this retrospective observational study were to describe the prevalence of medial coronoid process disease in lame large breed dogs and apply a novel method for quantifying the radiographic changes associated with medial coronoid process and subtrochlear-ulnar region in Labrador and Golden Retrievers with confirmed medial coronoid process disease. Purebred Labrador and Golden Retrievers (n = 143, 206 elbows) without and with confirmed medial coronoid process disease were included. The prevalence of medial coronoid process disease in lame large breed dogs was calculated. Mediolateral and craniocaudal radiographs of elbows were analyzed to assess the medial coronoid process length and morphology, and subtrochlear-ulnar width. Mean grayscale value was calculated for radial and subtrochlear-ulnar zones. The prevalence of medial coronoid process disease was 20.8%. Labrador and Golden Retrievers were the most affected purebred dogs (29.6%). Elbows with confirmed medial coronoid process disease had short (P < 0.0001) and deformed (∼95%) medial coronoid process, with associated medial coronoid process osteophytosis (7.5%). Subtrochlear-ulnar sclerosis was evidenced in ∼96% of diseased elbows, with a significant increase (P < 0.0001) in subtrochlear-ulnar width and standardized grayscale value. Radial grayscale value did not differ between groups. Periarticular osteophytosis was identified in 51.4% of elbows with medial coronoid process disease. Medial coronoid process length and morphology, and subtrochlear-ulnar width and standardized grayscale value varied significantly in dogs with confirmed medial coronoid process disease compared to controls. Findings indicated that medial coronoid process disease has a high prevalence in lame large breed dogs and that quantitative radiographic assessments can contribute to the diagnosis. © 2018 American College of Veterinary Radiology.

  5. The role of rational and experiential processing in influencing the framing effect.

    PubMed

    Stark, Emily; Baldwin, Austin S; Hertel, Andrew W; Rothman, Alexander J

    2017-01-01

    Research on individual differences and the framing effect has focused primarily on how variability in rational processing influences choice. However, we propose that measuring only rational processing presents an incomplete picture of how participants are responding to framed options, as orthogonal individual differences in experiential processing might be relevant. In two studies, we utilize the Rational Experiential Inventory, which captures individual differences in rational and experiential processing, to investigate how both processing types influence decisions. Our results show that differences in experiential processing, but not rational processing, moderated the effect of frame on choice. We suggest that future research should more closely examine the influence of experiential processing on making decisions, to gain a broader understanding of the conditions that contribute to the framing effect.

  6. Study and Analysis of The Robot-Operated Material Processing Systems (ROMPS)

    NASA Technical Reports Server (NTRS)

    Nguyen, Charles C.

    1996-01-01

    This report presents the progress of a research grant funded by NASA for work performed during 1 Oct. 1994 - 31 Sep. 1995. The report deals with the development and investigation of the potential use of software for data processing for the Robot Operated Material Processing System (ROMPS). It reports on the progress of data processing of calibration samples processed by ROMPS in space and on earth. First, data were retrieved using the I/O software and manually processed using Microsoft Excel. Then the data retrieval and processing were automated using a program written in C, which is able to read the telemetry data and produce plots of time responses of sample temperatures and other desired variables. LabVIEW was also employed to automatically retrieve and process the telemetry data.
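
    The original work used a custom C program and LabVIEW; purely as a sketch of the same kind of task in Python, with a hypothetical file name and column names (the real ROMPS telemetry format is not described here), plotting a sample-temperature time response could look like this.

        import csv
        import matplotlib.pyplot as plt

        # Hypothetical telemetry file and column names, for illustration only.
        times, temps = [], []
        with open("romps_telemetry.csv", newline="") as f:
            for row in csv.DictReader(f):
                times.append(float(row["elapsed_s"]))
                temps.append(float(row["sample_temp_C"]))

        plt.plot(times, temps)
        plt.xlabel("elapsed time (s)")
        plt.ylabel("sample temperature (deg C)")
        plt.title("ROMPS sample temperature time response (illustrative)")
        plt.show()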

  7. Application of statistical process control and process capability analysis procedures in orbiter processing activities at the Kennedy Space Center

    NASA Technical Reports Server (NTRS)

    Safford, Robert R.; Jackson, Andrew E.; Swart, William W.; Barth, Timothy S.

    1994-01-01

    Successful ground processing at KSC requires that flight hardware and ground support equipment conform to specifications at tens of thousands of checkpoints. Knowledge of conformance is an essential requirement for launch. That knowledge of conformance at every requisite point does not, however, enable identification of past problems with equipment, or potential problem areas. This paper describes how the introduction of Statistical Process Control and Process Capability Analysis identification procedures into existing shuttle processing procedures can enable identification of potential problem areas and candidates for improvements to increase processing performance measures. Results of a case study describing application of the analysis procedures to Thermal Protection System processing are used to illustrate the benefits of the approaches described in the paper.

  8. Separate cortical networks involved in music perception: preliminary functional MRI evidence for modularity of music processing.

    PubMed

    Schmithorst, Vincent J

    2005-04-01

    Music perception is a quite complex cognitive task, involving the perception and integration of various elements including melody, harmony, pitch, rhythm, and timbre. A preliminary functional MRI investigation of music perception was performed, using a simplified passive listening task. Group independent component analysis (ICA) was used to separate out various components involved in music processing, as the hemodynamic responses are not known a priori. Various components consistent with auditory processing, expressive language, syntactic processing, and visual association were found. The results are discussed in light of various hypotheses regarding modularity of music processing and its overlap with language processing. The results suggest that, while some networks overlap with ones used for language processing, music processing may involve its own domain-specific processing subsystems.

  9. Industrial implementation of spatial variability control by real-time SPC

    NASA Astrophysics Data System (ADS)

    Roule, O.; Pasqualini, F.; Borde, M.

    2016-10-01

    Advanced technology nodes require more and more information to get the wafer process properly set up. The critical dimension of components decreases following Moore's law. At the same time, the intra-wafer dispersion linked to the spatial non-uniformity of tool processes cannot decrease in the same proportion. APC (Advanced Process Control) systems are being developed in the waferfab to automatically adjust and tune wafer processing, based on a large amount of process context information. They can generate and monitor complex intra-wafer process profile corrections between different process steps. This leads us to put the spatial variability under control in real time with our SPC (Statistical Process Control) system. This paper will outline the architecture of an integrated process control system for shape monitoring in 3D, implemented in the waferfab.

  10. Modeling and analysis of power processing systems: Feasibility investigation and formulation of a methodology

    NASA Technical Reports Server (NTRS)

    Biess, J. J.; Yu, Y.; Middlebrook, R. D.; Schoenfeld, A. D.

    1974-01-01

    A review is given of future power processing systems planned for the next 20 years, and the state-of-the-art of power processing design modeling and analysis techniques used to optimize power processing systems. A methodology of modeling and analysis of power processing equipment and systems has been formulated to fulfill future tradeoff studies and optimization requirements. Computer techniques were applied to simulate power processor performance and to optimize the design of power processing equipment. A program plan to systematically develop and apply the tools for power processing systems modeling and analysis is presented so that meaningful results can be obtained each year to aid the power processing system engineer and power processing equipment circuit designers in their conceptual and detail design and analysis tasks.

  11. Laser displacement sensor to monitor the layup process of composite laminate production

    NASA Astrophysics Data System (ADS)

    Miesen, Nick; Groves, Roger M.; Sinke, Jos; Benedictus, Rinze

    2013-04-01

    Several types of flaw can occur during the layup process of prepreg composite laminates. Quality control after the production process checks the end product by testing the specimens for flaws which are introduced during the layup or curing process; however, by then these flaws are already irreversibly embedded in the laminate. This paper demonstrates the use of a laser displacement sensor technique applied during the layup process of prepreg laminates for in-situ detection of typical flaws that can occur during the composite production process. An incorrect number of layers and fibre wrinkling are the dominant flaws during layup. These and other dominant flaws have been modeled to determine the requirements for in-situ monitoring during the layup process of prepreg laminates.

  12. Levels of integration in cognitive control and sequence processing in the prefrontal cortex.

    PubMed

    Bahlmann, Jörg; Korb, Franziska M; Gratton, Caterina; Friederici, Angela D

    2012-01-01

    Cognitive control is necessary to flexibly act in changing environments. Sequence processing is needed in language comprehension to build the syntactic structure in sentences. Functional imaging studies suggest that sequence processing engages the left ventrolateral prefrontal cortex (PFC). In contrast, cognitive control processes additionally recruit bilateral rostral lateral PFC regions. The present study aimed to investigate these two types of processes in one experimental paradigm. Sequence processing was manipulated using two different sequencing rules varying in complexity. Cognitive control was varied with different cue-sets that determined the choice of a sequencing rule. Univariate analyses revealed distinct PFC regions for the two types of processing (i.e. sequence processing: left ventrolateral PFC and cognitive control processing: bilateral dorsolateral and rostral PFC). Moreover, in a common brain network (including left lateral PFC and intraparietal sulcus) no interaction between sequence and cognitive control processing was observed. In contrast, a multivariate pattern analysis revealed an interaction of sequence and cognitive control processing, such that voxels in left lateral PFC and parietal cortex showed different tuning functions for tasks involving different sequencing and cognitive control demands. These results suggest that the difference between the process of rule selection (i.e. cognitive control) and the process of rule-based sequencing (i.e. sequence processing) find their neuronal underpinnings in distinct activation patterns in lateral PFC. Moreover, the combination of rule selection and rule sequencing can shape the response of neurons in lateral PFC and parietal cortex.

  13. Levels of Integration in Cognitive Control and Sequence Processing in the Prefrontal Cortex

    PubMed Central

    Bahlmann, Jörg; Korb, Franziska M.; Gratton, Caterina; Friederici, Angela D.

    2012-01-01

    Cognitive control is necessary to flexibly act in changing environments. Sequence processing is needed in language comprehension to build the syntactic structure in sentences. Functional imaging studies suggest that sequence processing engages the left ventrolateral prefrontal cortex (PFC). In contrast, cognitive control processes additionally recruit bilateral rostral lateral PFC regions. The present study aimed to investigate these two types of processes in one experimental paradigm. Sequence processing was manipulated using two different sequencing rules varying in complexity. Cognitive control was varied with different cue-sets that determined the choice of a sequencing rule. Univariate analyses revealed distinct PFC regions for the two types of processing (i.e. sequence processing: left ventrolateral PFC and cognitive control processing: bilateral dorsolateral and rostral PFC). Moreover, in a common brain network (including left lateral PFC and intraparietal sulcus) no interaction between sequence and cognitive control processing was observed. In contrast, a multivariate pattern analysis revealed an interaction of sequence and cognitive control processing, such that voxels in left lateral PFC and parietal cortex showed different tuning functions for tasks involving different sequencing and cognitive control demands. These results suggest that the difference between the process of rule selection (i.e. cognitive control) and the process of rule-based sequencing (i.e. sequence processing) find their neuronal underpinnings in distinct activation patterns in lateral PFC. Moreover, the combination of rule selection and rule sequencing can shape the response of neurons in lateral PFC and parietal cortex. PMID:22952762

  14. Flow chemistry using milli- and microstructured reactors-from conventional to novel process windows.

    PubMed

    Illg, Tobias; Löb, Patrick; Hessel, Volker

    2010-06-01

    The term Novel Process Windows unites different methods for improving existing processes by applying unconventional and harsh process conditions, such as process routes at much elevated pressure or temperature, or processing in a thermal runaway regime, to achieve a significant impact on process performance. This paper reviews parts of IMM's work, in particular the applicability of the above-mentioned Novel Process Windows to selected chemical reactions. First, general characteristics of microreactors are discussed, such as excellent mass and heat transfer and improved mixing quality. Different types of reactions are presented in which the use of microstructured devices led to increased process performance through Novel Process Windows. These examples were chosen to demonstrate how chemical reactions can benefit from the use of milli- and microstructured devices and how existing protocols can be shifted toward process conditions hitherto not applicable in standard laboratory equipment. The milli- and microstructured reactors used can also offer advantages in other areas, for example high-throughput screening of catalysts and better control of size distribution in particle synthesis through improved mixing. The chemical industry is continuously improving: much research is devoted to synthesizing high-value chemicals, optimizing existing processes with respect to process safety and energy consumption, and searching for new routes to produce such chemicals. Leitmotifs of such undertakings are often sustainable development(1) and Green Chemistry(2).

  15. Fast but fleeting: adaptive motor learning processes associated with aging and cognitive decline.

    PubMed

    Trewartha, Kevin M; Garcia, Angeles; Wolpert, Daniel M; Flanagan, J Randall

    2014-10-01

    Motor learning has been shown to depend on multiple interacting learning processes. For example, learning to adapt when moving grasped objects with novel dynamics involves a fast process that adapts and decays quickly (and has been linked to explicit memory) and a slower process that adapts and decays more gradually. Each process is characterized by a learning rate that controls how strongly motor memory is updated based on experienced errors and a retention factor determining the movement-to-movement decay in motor memory. Here we examined whether fast and slow motor learning processes involved in learning novel dynamics differ between younger and older adults. In addition, we investigated how age-related decline in explicit memory performance influences learning and retention parameters. Although the groups adapted equally well, they did so with markedly different underlying processes. Whereas the groups had similar fast processes, they had different slow processes. Specifically, the older adults exhibited decreased retention in their slow process compared with younger adults. Within the older group, who exhibited considerable variation in explicit memory performance, we found that poor explicit memory was associated with reduced retention in the fast process, as well as the slow process. These findings suggest that explicit memory resources are a determining factor in impairments in both the fast and slow processes for motor learning, but that aging effects on the slow process are independent of explicit memory declines. Copyright © 2014 the authors 0270-6474/14/3413411-11$15.00/0.
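
    The fast and slow processes described above are commonly formalized as a two-state, error-driven learning model in which each state has its own retention factor and learning rate. The sketch below is a minimal illustration of that kind of model; the parameter values and variable names are assumptions chosen only to show the qualitative fast/slow behaviour, not values estimated in the study.

```python
# Minimal two-state motor adaptation sketch (illustrative parameters only).
# Each state keeps a memory x that decays by a retention factor A and is
# updated from the movement error e by a learning rate B.

def simulate(n_trials=200, perturbation=1.0,
             A_fast=0.60, B_fast=0.20,   # fast: low retention, high learning rate
             A_slow=0.99, B_slow=0.02):  # slow: high retention, low learning rate
    x_fast, x_slow = 0.0, 0.0
    net_adaptation = []
    for _ in range(n_trials):
        x_net = x_fast + x_slow          # total adaptation expressed on this trial
        error = perturbation - x_net     # experienced movement error
        x_fast = A_fast * x_fast + B_fast * error
        x_slow = A_slow * x_slow + B_slow * error
        net_adaptation.append(x_net)
    return net_adaptation

adaptation = simulate()
print(f"adaptation after 10 trials:  {adaptation[9]:.2f}")
print(f"adaptation after 200 trials: {adaptation[-1]:.2f}")
```

    In this formulation, the reduced retention reported for older adults would correspond to a lower A_slow, so the slow state loses more of its memory from one movement to the next.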

  16. Parallel Activation in Bilingual Phonological Processing

    ERIC Educational Resources Information Center

    Lee, Su-Yeon

    2011-01-01

    In bilingual language processing, the parallel activation hypothesis suggests that bilinguals activate their two languages simultaneously during language processing. Support for the parallel activation mainly comes from studies of lexical (word-form) processing, with relatively less attention to phonological (sound) processing. According to…

  17. OCLC-MARC Tape Processing: A Functional Analysis.

    ERIC Educational Resources Information Center

    Miller, Bruce Cummings

    1984-01-01

    Analyzes structure of, and data in, the OCLC-MARC record in the form delivered via OCLC's Tape Subscription Service, and outlines important processing functions involved: "unreadable tapes," duplicate records and deduping, match processing, choice processing, locations processing, "automatic" and "input" stamps,…

  18. 7 Processes that Enable NASA Software Engineering Technologies: Value-Added Process Engineering

    NASA Technical Reports Server (NTRS)

    Housch, Helen; Godfrey, Sally

    2011-01-01

    The presentation reviews Agency process requirements and the purpose, benefits, and experiences of seven software engineering processes. The processes include: product integration, configuration management, verification, software assurance, measurement and analysis, requirements management, and planning and monitoring.

  19. Risk-based Strategy to Determine Testing Requirement for the Removal of Residual Process Reagents as Process-related Impurities in Bioprocesses.

    PubMed

    Qiu, Jinshu; Li, Kim; Miller, Karen; Raghani, Anil

    2015-01-01

    The purpose of this article is to recommend a risk-based strategy for determining clearance testing requirements for the process reagents used in manufacturing biopharmaceutical products. The strategy takes account of four risk factors. First, the process reagents are classified into two categories according to their safety profile and history of use: generally recognized as safe (GRAS) and potential safety concern (PSC) reagents. Clearance testing of GRAS reagents can be eliminated because of their history of safe use and the capability of the process to remove these reagents. An estimated safety margin (Se) value, the ratio of the exposure limit to the estimated maximum reagent amount, is then used to evaluate the necessity of testing the PSC reagents at an early development stage. The Se value is calculated from two risk factors: the starting PSC reagent amount per maximum product dose (Me) and the exposure limit (Le). A worst-case scenario is assumed to estimate the Me value: the PSC reagent of interest is co-purified with the product and no clearance occurs throughout the entire purification process. No clearance testing is required for a PSC reagent if its Se value is ≥1; otherwise clearance testing is needed. Finally, the point at which the process reagent is introduced into the process is also considered in determining the necessity of clearance testing. How to use the measured safety margin as a criterion for determining PSC reagent testing at the process characterization, process validation, and commercial production stages is also described. A large number of process reagents are used in biopharmaceutical manufacturing to control process performance. Clearance testing for all of the process reagents would be an enormous analytical task. In this article, a risk-based strategy is described to eliminate unnecessary clearance testing for the majority of the process reagents using four risk factors. The risk factors included in the strategy are (i) the safety profile of the reagents, (ii) the starting amount of the process reagents used in the manufacturing process, (iii) the maximum dose of the product, and (iv) the point of introduction of the process reagents in the process. The implementation of the risk-based strategy can eliminate clearance testing for approximately 90% of the process reagents used in the manufacturing processes. This science-based strategy allows us to ensure patient safety and meet regulatory agency expectations throughout the product development life cycle. © PDA, Inc. 2015.
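
    Because the abstract defines the safety margin as the ratio of the exposure limit (Le) to the worst-case reagent amount per maximum dose (Me), the decision rule reduces to a one-line calculation. The sketch below illustrates that rule; the reagent names and numbers are hypothetical and are not values from the article.

```python
# Illustrative safety-margin screen for PSC reagents (hypothetical numbers).
# Se = Le / Me; clearance testing is waived when Se >= 1.

def needs_clearance_testing(exposure_limit_le, worst_case_amount_me):
    """Return (Se, testing_required) for one PSC reagent."""
    se = exposure_limit_le / worst_case_amount_me
    return se, se < 1  # True -> clearance testing required

# Hypothetical reagents: (name, Le in ug/dose, worst-case Me in ug/dose)
reagents = [
    ("reagent A", 500.0, 120.0),   # Se > 1 -> testing can be waived
    ("reagent B",  50.0, 200.0),   # Se < 1 -> clearance testing needed
]

for name, le, me in reagents:
    se, test = needs_clearance_testing(le, me)
    print(f"{name}: Se = {se:.2f} -> {'test clearance' if test else 'no testing required'}")
```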

  20. Titania nanotube powders obtained by rapid breakdown anodization in perchloric acid electrolytes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ali, Saima, E-mail: saima.ali@aalto.fi; Hannula, Simo-Pekka

    Titania nanotube (TNT) powders are prepared by rapid breakdown anodization (RBA) in a 0.1 M perchloric acid (HClO₄) solution (Process 1), and in an ethylene glycol (EG) mixture with HClO₄ and water (Process 2). A study of the as-prepared and calcined TNT powders obtained by both processes was carried out to evaluate and compare the morphology, crystal structure, specific surface area, and composition of the nanotubes. Longer TNTs are formed in Process 1, while comparatively larger pore diameter and wall thickness are obtained for the nanotubes prepared by Process 2. The TNTs obtained by Process 1 are converted to nanorods at 350 °C, while nanotubes obtained by Process 2 preserve their tubular morphology up to 350 °C. In addition, the TNTs prepared in the aqueous electrolyte have a crystalline structure, whereas the TNTs obtained by Process 2 are amorphous. Samples calcined up to 450 °C show XRD peaks from the anatase phase, while the rutile phase appears at 550 °C for the TNTs prepared by both processes. The Raman spectra also show clear anatase peaks for all samples except the as-prepared sample obtained by Process 2, thus supporting the XRD findings. FTIR spectra reveal the presence of O-H groups in the structure for the TNTs obtained by both processes, although this presence is less prominent for annealed samples. Additionally, TNTs obtained by Process 2 have a carbonaceous impurity in the structure, attributed to the electrolyte used in that process. While a negligible weight loss is typical for TNTs prepared from aqueous electrolytes, a weight loss of 38.6% in the temperature range of 25–600 °C is found for TNTs prepared in the EG electrolyte (Process 2). A large specific surface area of 179.2 m² g⁻¹ is obtained for TNTs prepared by Process 1, whereas Process 2 produces nanotubes with a lower specific surface area. The difference appears to correspond to the dimensions of the nanotubes obtained by the two processes. - Graphical abstract: Titania nanotube powders prepared by Process 1 and Process 2 have different crystal structure and specific surface area. - Highlights: • Titania nanotube (TNT) powder is prepared in a low-water organic electrolyte. • Characterization of TNT powders prepared from aqueous and organic electrolytes. • TNTs prepared by Process 1 are crystalline with a higher specific surface area. • TNTs obtained by Process 2 have carbonaceous impurities in the structure.

  1. A processing approach to the working memory/long-term memory distinction: evidence from the levels-of-processing span task.

    PubMed

    Rose, Nathan S; Craik, Fergus I M

    2012-07-01

    Recent theories suggest that performance on working memory (WM) tasks involves retrieval from long-term memory (LTM). To examine whether WM and LTM tests have common principles, Craik and Tulving's (1975) levels-of-processing paradigm, which is known to affect LTM, was administered as a WM task: Participants made uppercase, rhyme, or category-membership judgments about words, and immediate recall of the words was required after every 3 or 8 processing judgments. In Experiment 1, immediate recall did not demonstrate a levels-of-processing effect, but a subsequent LTM test (delayed recognition) of the same words did show a benefit of deeper processing. Experiment 2 showed that surprise immediate recall of 8-item lists did demonstrate a levels-of-processing effect, however. A processing account of the conditions in which levels-of-processing effects are and are not found in WM tasks was advanced, suggesting that the extent to which levels-of-processing effects are similar between WM and LTM tests largely depends on the amount of disruption to active maintenance processes. 2012 APA, all rights reserved

  2. Emotional words can be embodied or disembodied: the role of superficial vs. deep types of processing

    PubMed Central

    Abbassi, Ensie; Blanchette, Isabelle; Ansaldo, Ana I.; Ghassemzadeh, Habib; Joanette, Yves

    2015-01-01

    Emotional words are processed rapidly and automatically in the left hemisphere (LH) and slowly, with the involvement of attention, in the right hemisphere (RH). This review aims to find the reason for this difference and suggests that emotional words can be processed superficially or deeply due to the involvement of the linguistic and imagery systems, respectively. During superficial processing, emotional words likely make connections only with semantically associated words in the LH. This part of the process is automatic and may be sufficient for the purpose of language processing. Deep processing, in contrast, seems to involve conceptual information and imagery of a word’s perceptual and emotional properties using autobiographical memory contents. Imagery and the involvement of autobiographical memory likely differentiate between emotional and neutral word processing and explain the salient role of the RH in emotional word processing. It is concluded that the level of emotional word processing in the RH should be deeper than in the LH and, thus, it is conceivable that the slow mode of processing adds certain qualities to the output. PMID:26217288

  3. Process Monitoring Evaluation and Implementation for the Wood Abrasive Machining Process

    PubMed Central

    Saloni, Daniel E.; Lemaster, Richard L.; Jackson, Steven D.

    2010-01-01

    Wood processing industries have continuously developed and improved technologies and processes to transform wood to obtain better final product quality and thus increase profits. Abrasive machining is one of the most important of these processes and therefore merits special attention and study. The objective of this work was to evaluate and demonstrate a process monitoring system for use in the abrasive machining of wood and wood based products. The system developed increases the life of the belt by detecting (using process monitoring sensors) and removing (by cleaning) the abrasive loading during the machining process. This study focused on abrasive belt machining processes and included substantial background work, which provided a solid base for understanding the behavior of the abrasive, and the different ways that the abrasive machining process can be monitored. In addition, the background research showed that abrasive belts can effectively be cleaned by the appropriate cleaning technique. The process monitoring system developed included acoustic emission sensors which tended to be sensitive to belt wear, as well as platen vibration, but not loading, and optical sensors which were sensitive to abrasive loading. PMID:22163477

  4. Adaptive memory: determining the proximate mechanisms responsible for the memorial advantages of survival processing.

    PubMed

    Burns, Daniel J; Burns, Sarah A; Hwang, Ana J

    2011-01-01

    J. S. Nairne, S. R. Thompson, and J. N. S. Pandeirada (2007) suggested that our memory systems may have evolved to help us remember fitness-relevant information and showed that retention of words rated for their relevance to survival is superior to that of words encoded under other deep processing conditions. The authors present 4 experiments that uncover the proximate mechanisms likely responsible. The authors obtained a recall advantage for survival processing compared with conditions that promoted only item-specific processing or only relational processing. This effect was eliminated when control conditions encouraged both item-specific and relational processing. Data from separate measures of item-specific and relational processing generally were consistent with the view that the memorial advantage for survival processing results from the encoding of both types of processing. Although the present study suggests the proximate mechanisms for the effect, the authors argue that survival processing may be fundamentally different from other memory phenomena for which item-specific and relational processing differences have been implicated. (PsycINFO Database Record (c) 2010 APA, all rights reserved).

  5. Implementation of quality by design toward processing of food products.

    PubMed

    Rathore, Anurag S; Kapoor, Gautam

    2017-05-28

    Quality by design (QbD) is a systematic approach that begins with predefined objectives and emphasizes product and process understanding and process control. It is an approach based on principles of sound science and quality risk management. As the food processing industry continues to embrace the idea of in-line, online, and/or at-line sensors and real-time characterization for process monitoring and control, the existing gaps in our ability to monitor the multiple parameters and variables associated with the manufacturing process will be alleviated over time. Investments made in the development of tools and approaches that facilitate high-throughput analytical and process development, process analytical technology, design of experiments, risk analysis, knowledge management, and enhancement of process/product understanding will pave the way for operational and economic benefits later in the commercialization process and across other product pipelines. This article aims to achieve two major objectives: first, to review the progress that has been made in recent years on QbD implementation in the processing of food products and, second, to present a case study that illustrates the benefits of such QbD implementation.

  6. Energy saving processes for nitrogen removal in organic wastewater from food processing industries in Thailand.

    PubMed

    Johansen, N H; Suksawad, N; Balslev, P

    2004-01-01

    Nitrogen removal from organic wastewater is becoming a requirement in developed communities. The use of nitrite as an intermediate in the treatment of wastewater has been largely ignored, but it is actually a relevant energy-saving process compared to conventional nitrification/denitrification using nitrate as the intermediate. Full-scale and pilot-scale results using this process are presented. The process needs some additional process considerations and process control to be utilized. Especially under tropical conditions the nitritation process will proceed readily, and it must be expected that many activated sludge (AS) treatment plants in the food industry already produce NO2-N. This uncontrolled nitrogen conversion can be the main cause of sludge bulking problems. It is expected that sludge bulking problems can in many cases be solved simply by changing the process control in order to run a more consistent nitritation. Theoretically this process decreases the oxygen consumption for oxidation by 25%, and the use of a carbon source for the reduction is decreased by 40% compared to the conventional process.
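
    The 25% and 40% savings quoted above follow from standard nitrification/denitrification stoichiometry: oxidizing ammonium only to nitrite skips the nitrite-to-nitrate step, and denitrifying from nitrite transfers fewer electrons than denitrifying from nitrate. The sketch below is a back-of-the-envelope check of those two ratios; it uses textbook stoichiometry, not data from the paper.

```python
# Back-of-the-envelope stoichiometry behind the quoted savings (illustrative only).
# Oxidation: NH4+ -> NO2- consumes 1.5 mol O2 per mol N; NO2- -> NO3- another 0.5 mol O2.
# Reduction: denitrifying NO3- to N2 transfers 5 electrons per N, NO2- to N2 only 3.

o2_to_nitrite = 1.5
o2_to_nitrate = 1.5 + 0.5
oxygen_saving = 1 - o2_to_nitrite / o2_to_nitrate                     # = 0.25

electrons_from_nitrate = 5
electrons_from_nitrite = 3
carbon_saving = 1 - electrons_from_nitrite / electrons_from_nitrate   # = 0.40

print(f"oxygen saving via the nitrite route: {oxygen_saving:.0%}")
print(f"carbon-source saving via the nitrite route: {carbon_saving:.0%}")
```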

  7. Application of Ozone MBBR Process in Refinery Wastewater Treatment

    NASA Astrophysics Data System (ADS)

    Lin, Wang

    2018-01-01

    The Moving Bed Biofilm Reactor (MBBR) is a sewage treatment technology based on the fluidized bed; it can also be regarded as an efficient new reactor type positioned between the activated sludge process and the biofilm process. The application of the ozone + MBBR process in refinery wastewater treatment is the main subject of this study. The key point is the design of a combined ozone + MBBR process built on the MBBR process. The ozone + MBBR process is used to treat the COD in the reverse osmosis concentrate discharged from the refinery wastewater treatment plant. The experimental results show that the average removal rate of COD is 46.0%–67.3% in the treatment of reverse osmosis concentrate by the ozone + MBBR process, and the effluent can meet the relevant standard requirements. Compared with the traditional process, the ozone + MBBR process is more flexible. The main investment items for this process are the ozone generator, the blower, and similar equipment; these items are relatively inexpensive, and their costs can be offset by the larger investment required for a traditional activated sludge process. At the same time, the ozone + MBBR process has clear advantages in water quality, stability, and other aspects.

  8. Models of recognition: A review of arguments in favor of a dual-process account

    PubMed Central

    DIANA, RACHEL A.; REDER, LYNNE M.; ARNDT, JASON; PARK, HEEKYEONG

    2008-01-01

    The majority of computationally specified models of recognition memory have been based on a single-process interpretation, claiming that familiarity is the only influence on recognition. There is increasing evidence that recognition is, in fact, based on two processes: recollection and familiarity. This article reviews the current state of the evidence for dual-process models, including the usefulness of the remember/know paradigm, and interprets the relevant results in terms of the source of activation confusion (SAC) model of memory. We argue that the evidence from each of the areas we discuss, when combined, presents a strong case that inclusion of a recollection process is necessary. Given this conclusion, we also argue that the dual-process claim that the recollection process is always available is, in fact, more parsimonious than the single-process claim that the recollection process is used only in certain paradigms. The value of a well-specified process model such as the SAC model is discussed with regard to other types of dual-process models. PMID:16724763

  9. Emotional words can be embodied or disembodied: the role of superficial vs. deep types of processing.

    PubMed

    Abbassi, Ensie; Blanchette, Isabelle; Ansaldo, Ana I; Ghassemzadeh, Habib; Joanette, Yves

    2015-01-01

    Emotional words are processed rapidly and automatically in the left hemisphere (LH) and slowly, with the involvement of attention, in the right hemisphere (RH). This review aims to find the reason for this difference and suggests that emotional words can be processed superficially or deeply due to the involvement of the linguistic and imagery systems, respectively. During superficial processing, emotional words likely make connections only with semantically associated words in the LH. This part of the process is automatic and may be sufficient for the purpose of language processing. Deep processing, in contrast, seems to involve conceptual information and imagery of a word's perceptual and emotional properties using autobiographical memory contents. Imagery and the involvement of autobiographical memory likely differentiate between emotional and neutral word processing and explain the salient role of the RH in emotional word processing. It is concluded that the level of emotional word processing in the RH should be deeper than in the LH and, thus, it is conceivable that the slow mode of processing adds certain qualities to the output.

  10. Techno-economic analysis of biocatalytic processes for production of alkene epoxides

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Borole, Abhijeet P

    2007-01-01

    A techno-economic analysis of two different bioprocesses was conducted, one for the conversion of propylene to propylene oxide (PO) and the other for the conversion of styrene to styrene epoxide (SO). The first process was a lipase-mediated chemo-enzymatic reaction, whereas the second was a one-step enzymatic process using chloroperoxidase. The PO produced through the chemo-enzymatic process is a racemic product, whereas the latter process (based on chloroperoxidase) produces an enantio-pure product. The former process thus falls under the category of a high-volume commodity chemical (PO), whereas the latter is a low-volume, high-value product (SO). A simulation of the process was conducted using the bioprocess engineering software SuperPro Designer v6.0 (Intelligen, Inc., Scotch Plains, NJ) to determine the economic feasibility of the process. The purpose of the exercise was to compare biocatalytic processes with existing chemical processes for the production of alkene epoxides. The results show that further improvements in biocatalyst stability are needed to make these bioprocesses competitive with chemical processes.

  11. The representation of conceptual knowledge: visual, auditory, and olfactory imagery compared with semantic processing.

    PubMed

    Palmiero, Massimiliano; Di Matteo, Rosalia; Belardinelli, Marta Olivetti

    2014-05-01

    Two experiments comparing imaginative processing in different modalities with semantic processing were carried out to investigate whether conceptual knowledge can be represented in different formats. Participants were asked to judge the similarity between visual images, auditory images, and olfactory images in the imaginative block, and whether two items belonged to the same category in the semantic block. Items were verbally cued in both experiments. The degree of similarity between the imaginative and semantic items was varied across experiments. Experiment 1 showed that semantic processing was faster than visual and auditory imaginative processing, whereas no differentiation was possible between semantic processing and olfactory imaginative processing. Experiment 2 revealed that only visual imaginative processing could be differentiated from semantic processing in terms of accuracy. These results show that visual and auditory imaginative processing can be differentiated from semantic processing, although both visual and auditory images strongly rely on semantic representations. In contrast, no differentiation is possible within the olfactory domain. Results are discussed in the context of the imagery debate.

  12. Working memory load eliminates the survival processing effect.

    PubMed

    Kroneisen, Meike; Rummel, Jan; Erdfelder, Edgar

    2014-01-01

    In a series of experiments, Nairne, Thompson, and Pandeirada (2007) demonstrated that words judged for their relevance to a survival scenario are remembered better than words judged for a scenario not relevant on a survival dimension. They explained this survival-processing effect by arguing that nature "tuned" our memory systems to process and remember fitness-relevant information. Kroneisen and Erdfelder (2011) proposed that it may not be survival processing per se that facilitates recall but the richness and distinctiveness with which information is encoded. To further test this account, we investigated how the survival processing effect is affected by cognitive load. If the survival processing effect is due to automatic processes or, alternatively, if survival processing is routinely prioritized in dual-task contexts, we would expect this effect to persist under cognitive load conditions. If the effect relies on cognitively demanding processes like richness and distinctiveness of encoding, however, the survival processing benefit should be hampered by increased cognitive load during encoding. Results were in line with the latter prediction, that is, the survival processing effect vanished under dual-task conditions.

  13. E-learning process maturity level: a conceptual framework

    NASA Astrophysics Data System (ADS)

    Rahmah, A.; Santoso, H. B.; Hasibuan, Z. A.

    2018-03-01

    ICT advancement is inevitable, and its impact influences many domains, including learning in both formal and informal situations. It leads to a new mindset: we should not only utilize the available ICT to support the learning process, but also improve it gradually, a change that involves many factors. This phenomenon is called e-learning process evolution. Accordingly, this study explores the maturity level concept to provide a direction for gradual improvement and progression monitoring of the individual e-learning process. An extensive literature review, observation, and construct formation were conducted to develop a conceptual framework for the e-learning process maturity level. The conceptual framework consists of the learner, the e-learning process, continuous improvement, evolution of the e-learning process, technology, and learning objectives. The evolution of the e-learning process is depicted as the current versus expected condition of the e-learning process maturity level. The study concludes that the e-learning process maturity level conceptual framework may guide the evolution roadmap for the e-learning process, accelerate the evolution, and decrease the negative impact of ICT. The conceptual framework will be verified and tested in a future study.

  14. Heat input and accumulation for ultrashort pulse processing with high average power

    NASA Astrophysics Data System (ADS)

    Finger, Johannes; Bornschlegel, Benedikt; Reininghaus, Martin; Dohrn, Andreas; Nießen, Markus; Gillner, Arnold; Poprawe, Reinhart

    2018-05-01

    Materials processing using ultrashort pulsed laser radiation with pulse durations <10 ps is known to enable very precise processing with negligible thermal load. However, even for picosecond and femtosecond laser radiation, not all of the absorbed energy is converted into ablation products, and a distinct fraction remains as residual heat in the processed workpiece. For low average powers and power densities, this heat is usually not relevant to the processing result and simply dissipates into the workpiece. In contrast, when higher average powers and repetition rates are applied to increase throughput and upscale ultrashort pulse processing, this heat input becomes relevant and significantly affects the achieved processing results. In this paper, we outline the relevance of heat input for ultrashort pulse processing, starting with the heat input of a single ultrashort laser pulse. Heat accumulation during ultrashort pulse processing with high repetition rates is discussed, as well as heat accumulation for materials processing using pulse bursts. In addition, the relevance of heat accumulation with multiple scanning passes and processing with multiple laser spots is shown.
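
    To make the accumulation argument concrete, the toy model below tracks the local temperature rise when a fixed fraction of each pulse's energy stays behind as residual heat and the deposit relaxes exponentially between pulses. It is only an illustrative sketch under assumed numbers (residual-heat fraction, relaxation time, effective heat capacity); it is not the heat-accumulation model used in the paper.

```python
# Toy heat-accumulation sketch (assumed parameters, not values from the paper).
# Each pulse leaves a residual-heat fraction of its energy in the workpiece;
# between pulses the local temperature rise relaxes exponentially.
import math

pulse_energy_j        = 100e-6   # 100 uJ per pulse (assumed)
residual_fraction     = 0.3      # fraction of absorbed energy left as heat (assumed)
heat_capacity_j_per_k = 5e-4     # effective heat capacity of the heated volume (assumed)
relaxation_time_s     = 50e-6    # thermal relaxation time of that volume (assumed)

def peak_rise_after(n_pulses, rep_rate_hz):
    """Temperature rise just after the n-th pulse for a given repetition rate."""
    dt = 1.0 / rep_rate_hz
    decay = math.exp(-dt / relaxation_time_s)   # fraction of the rise surviving between pulses
    per_pulse = residual_fraction * pulse_energy_j / heat_capacity_j_per_k
    rise = 0.0
    for _ in range(n_pulses):
        rise = rise * decay + per_pulse
    return rise

for f in (10e3, 100e3, 1e6):    # 10 kHz, 100 kHz, 1 MHz
    print(f"{f/1e3:7.0f} kHz: ~{peak_rise_after(1000, f):6.1f} K rise after 1000 pulses")
```

    Even with identical pulse energy, the accumulated rise grows sharply with repetition rate because less of each pulse's residual heat has relaxed before the next pulse arrives, which is the qualitative effect the paper addresses.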

  15. Defining and reconstructing clinical processes based on IHE and BPMN 2.0.

    PubMed

    Strasser, Melanie; Pfeifer, Franz; Helm, Emmanuel; Schuler, Andreas; Altmann, Josef

    2011-01-01

    This paper describes the current status and results of our process management system for defining and reconstructing clinical care processes, which makes it possible to compare, analyze, and evaluate clinical processes and to identify high-cost tasks or stays. The system is founded on IHE, which guarantees standardized interfaces and interoperability between clinical information systems. At its heart is BPMN, a modeling notation and specification language that allows the definition and execution of clinical processes. The system provides functionality to define clinical core processes independently of any particular healthcare information system and to execute those processes in a workflow engine. Furthermore, the reconstruction of clinical processes is done by evaluating an IHE audit log database, which records patient movements within a health care facility. The main goal of the system is to assist hospital operators and clinical process managers in detecting discrepancies between defined and actual clinical processes and in identifying the main causes of high medical costs. Beyond that, the system can potentially contribute to reconstructing and improving clinical processes and to enhancing cost control and patient care quality.
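
    The reconstruction step described above amounts to turning audit-log entries into per-patient event traces ordered by time, which can then be compared with the defined BPMN process. The sketch below is a minimal, hypothetical illustration of that idea; the field names and events are invented for the example and do not come from the IHE audit schema used in the paper.

```python
# Minimal sketch: rebuild per-patient traces from audit-log records (hypothetical fields).
from collections import defaultdict

audit_log = [  # (patient_id, timestamp, event) -- invented example data
    ("p1", "2011-03-01T08:05", "admission"),
    ("p2", "2011-03-01T08:20", "admission"),
    ("p1", "2011-03-01T09:10", "radiology"),
    ("p1", "2011-03-01T11:45", "surgery"),
    ("p2", "2011-03-01T10:00", "radiology"),
    ("p1", "2011-03-02T16:30", "discharge"),
]

traces = defaultdict(list)
for patient, ts, event in sorted(audit_log, key=lambda r: (r[0], r[1])):
    traces[patient].append(event)            # events in chronological order per patient

for patient, trace in traces.items():
    print(patient, "->", " > ".join(trace))  # actual path, to compare with the defined process
```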

  16. Process qualification and testing of LENS deposited AY1E0125 D-bottle brackets.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Atwood, Clinton J.; Smugeresky, John E.; Jew, Michael

    2006-11-01

    The LENS Qualification team had the goal of performing a process qualification for the Laser Engineered Net Shaping™ (LENS®) process. Process qualification requires that a part be selected for process demonstration. The AY1E0125 D-Bottle Bracket from the W80-3 was selected for this work. The repeatability of the LENS process was baselined to determine process parameters. Six D-Bottle brackets were deposited using LENS, machined to final dimensions, and tested in comparison to conventionally processed brackets. The tests, taken from ES1E0003, included a mass analysis and structural dynamic testing including free-free and assembly-level modal tests, and Haversine shock tests. The LENS brackets performed with very similar characteristics to the conventionally processed brackets. Based on the results of the testing, it was concluded that the performance of the brackets made them eligible for parallel path testing in subsystem-level tests. The testing results and process rigor qualified the LENS process as detailed in EER200638525A.

  17. Sustainability assessment of shielded metal arc welding (SMAW) process

    NASA Astrophysics Data System (ADS)

    Alkahla, Ibrahim; Pervaiz, Salman

    2017-09-01

    The shielded metal arc welding (SMAW) process is one of the most commonly employed material joining processes, utilized in various industrial sectors such as marine, shipbuilding, automotive, aerospace, construction, and petrochemicals. Increasing pressure on the manufacturing sector demands that welding processes be sustainable in nature. The SMAW process involves several types of input and output streams, and its sustainability concerns are linked to these streams: electrical energy requirements, input material consumption, slag formation, fume emission, and hazardous working conditions affecting human health and occupational safety. To enhance the environmental performance of SMAW welding, the process needs to be characterized within a broad sustainability framework. Most of the available literature focuses on the technical and economic aspects of the welding process; the environmental and social aspects are rarely addressed. This study reviews the SMAW process with respect to the triple bottom line (economic, environmental, and social) approach to sustainability. Finally, the study concludes with recommendations toward achieving an economical and sustainable SMAW welding process.

  18. Decontamination and disposal of PCB wastes.

    PubMed Central

    Johnston, L E

    1985-01-01

    Decontamination and disposal processes for PCB wastes are reviewed. Processes are classed as incineration, chemical reaction, or decontamination. Incineration technologies are not limited to rigorous high-temperature treatment but include those with innovations in the use of oxidant, heat transfer, and residue recycle. Chemical processes include the sodium processes, radiant energy processes, and low-temperature oxidations. Typical processing rates and associated costs are provided where possible. PMID:3928363

  19. Logistics Control Facility: A Normative Model for Total Asset Visibility in the Air Force Logistics System

    DTIC Science & Technology

    1994-09-01

    Issue: Computers, information systems, and communication systems are being increasingly used in transportation, warehousing, order processing, materials...inventory levels, reduced order processing times, reduced order processing costs, and increased customer satisfaction. While purchasing and transportation...process, the speed with which orders are processed would increase significantly. Lowering the order processing time in turn lowers the lead time, which in

  20. Definition and documentation of engineering processes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    McDonald, G.W.

    1997-11-01

    This tutorial is an extract of a two-day workshop developed under the auspices of the Quality Engineering Department at Sandia National Laboratories. The presentation starts with basic definitions and addresses why processes should be defined and documented. It covers three primary topics: (1) process considerations and rationale, (2) an approach to defining and documenting engineering processes, and (3) an IDEF0 model of the process for defining engineering processes.

  1. Method for enhanced atomization of liquids

    DOEpatents

    Thompson, Richard E.; White, Jerome R.

    1993-01-01

    In a process for atomizing a slurry or liquid process stream in which a slurry or liquid is passed through a nozzle to provide a primary atomized process stream, an improvement which comprises subjecting the liquid or slurry process stream to microwave energy as the liquid or slurry process stream exits the nozzle, wherein sufficient microwave heating is provided to flash vaporize the primary atomized process stream.

  2. Rethinking a Negative Event: The Affective Impact of Ruminative versus Imagery-Based Processing of Aversive Autobiographical Memories.

    PubMed

    Slofstra, Christien; Eisma, Maarten C; Holmes, Emily A; Bockting, Claudi L H; Nauta, Maaike H

    2017-01-01

    Ruminative (abstract verbal) processing during recall of aversive autobiographical memories may serve to dampen their short-term affective impact. Experimental studies indeed demonstrate that verbal processing of non-autobiographical material and positive autobiographical memories evokes weaker affective responses than imagery-based processing. In the current study, we hypothesized that abstract verbal or concrete verbal processing of an aversive autobiographical memory would result in weaker affective responses than imagery-based processing. The affective impact of abstract verbal versus concrete verbal versus imagery-based processing during recall of an aversive autobiographical memory was investigated in a non-clinical sample (n = 99) using both an observational and an experimental design. Observationally, it was examined whether spontaneous use of processing modes (both state and trait measures) was associated with the impact of aversive autobiographical memory recall on negative and positive affect. Experimentally, the causal relation between processing modes and affective impact was investigated by manipulating the processing mode during retrieval of the same aversive autobiographical memory. The main findings were that higher levels of trait (but not state) measures of both ruminative and imagery-based processing and depressive symptomatology were positively correlated with higher levels of negative affective impact in the observational part of the study. In the experimental part, no main effect of processing modes on the affective impact of autobiographical memories was found. However, a significant moderating effect of depressive symptomatology was found: only for individuals with low levels of depressive symptomatology did concrete verbal (but not abstract verbal) processing of the aversive autobiographical memory result in weaker affective responses compared to imagery-based processing. These results cast doubt on the hypothesis that ruminative processing of aversive autobiographical memories serves to avoid the negative emotions evoked by such memories. Furthermore, the findings suggest that depressive symptomatology is associated with the spontaneous use and the affective impact of processing modes during recall of aversive autobiographical memories. Clinical studies are needed that examine the role of processing modes during aversive autobiographical memory recall in depression, including the potential effectiveness of targeting processing modes in therapy.

  3. Active pharmaceutical ingredient (API) production involving continuous processes--a process system engineering (PSE)-assisted design framework.

    PubMed

    Cervera-Padrell, Albert E; Skovby, Tommy; Kiil, Søren; Gani, Rafiqul; Gernaey, Krist V

    2012-10-01

    A systematic framework is proposed for the design of continuous pharmaceutical manufacturing processes. Specifically, the design framework focuses on organic chemistry based, active pharmaceutical ingredient (API) synthetic processes, but could potentially be extended to biocatalytic and fermentation-based products. The method exploits the synergistic combination of continuous flow technologies (e.g., microfluidic techniques) and process systems engineering (PSE) methods and tools for faster process design and increased process understanding throughout the whole drug product and process development cycle. The design framework structures the many different and challenging design problems (e.g., solvent selection, reactor design, and design of separation and purification operations), driving the user from the initial drug discovery steps--where process knowledge is very limited--toward the detailed design and analysis. Examples from the literature of PSE methods and tools applied to pharmaceutical process design and novel pharmaceutical production technologies are provided throughout the text, assisting in the accumulation and interpretation of process knowledge. Different criteria are suggested for the selection of batch and continuous processes so that the whole design results in low capital and operational costs as well as a low environmental footprint. The design framework has been applied to the retrofit of an existing batch-wise process used by H. Lundbeck A/S to produce an API: zuclopenthixol. Some of its batch operations were successfully converted into continuous mode, obtaining higher yields that allowed a significant simplification of the whole process. The material and environmental footprint of the process--evaluated through the process mass intensity index, that is, kg of material used per kg of product--was reduced to half of its initial value, with potential for further reduction. The case study includes reaction steps typically used by the pharmaceutical industry featuring different characteristic reaction times, as well as L-L separation and distillation-based solvent exchange steps, and thus constitutes a good example of how the design framework can be used to efficiently design novel or existing API manufacturing processes that take advantage of continuous processing. Copyright © 2012 Elsevier B.V. All rights reserved.
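
    The process mass intensity (PMI) figure quoted above is a simple ratio of total material input to product output, so halving it corresponds to halving the material used per kilogram of API. The sketch below shows the calculation on invented numbers; the stream names and masses are assumptions for illustration, not data from the Lundbeck case study.

```python
# Process mass intensity (PMI) = total mass of materials used / mass of product.
# The stream masses below are invented solely to illustrate the calculation.

def pmi(material_streams_kg, product_kg):
    return sum(material_streams_kg.values()) / product_kg

batch_streams      = {"reagents": 40.0, "solvents": 820.0, "aqueous washes": 260.0}  # hypothetical
continuous_streams = {"reagents": 38.0, "solvents": 390.0, "aqueous washes": 130.0}  # hypothetical
product_kg = 10.0

print(f"batch PMI:      {pmi(batch_streams, product_kg):.0f} kg material per kg API")
print(f"continuous PMI: {pmi(continuous_streams, product_kg):.0f} kg material per kg API")
```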

  4. On the facilitative effects of face motion on face recognition and its development

    PubMed Central

    Xiao, Naiqi G.; Perrotta, Steve; Quinn, Paul C.; Wang, Zhe; Sun, Yu-Hao P.; Lee, Kang

    2014-01-01

    For the past century, researchers have extensively studied human face processing and its development. These studies have advanced our understanding of not only face processing, but also visual processing in general. However, most of what we know about face processing was investigated using static face images as stimuli. Therefore, an important question arises: to what extent does our understanding of static face processing generalize to face processing in real-life contexts in which faces are mostly moving? The present article addresses this question by examining recent studies on moving face processing to uncover the influence of facial movements on face processing and its development. First, we describe evidence on the facilitative effects of facial movements on face recognition and two related theoretical hypotheses: the supplementary information hypothesis and the representation enhancement hypothesis. We then highlight several recent studies suggesting that facial movements optimize face processing by activating specific face processing strategies that accommodate to task requirements. Lastly, we review the influence of facial movements on the development of face processing in the first year of life. We focus on infants' sensitivity to facial movements and explore the facilitative effects of facial movements on infants' face recognition performance. We conclude by outlining several future directions to investigate moving face processing and emphasize the importance of including dynamic aspects of facial information to further understand face processing in real-life contexts. PMID:25009517

  5. Comparison of property between two Viking Seismic tapes

    NASA Astrophysics Data System (ADS)

    Yamamoto, Y.; Yamada, R.

    2016-12-01

    The restoration work on the seismometer data from the Viking Lander 2 is still continuing. Originally, the data were processed and archived separately at MIT and UTIG, and each dataset is accessible via the Internet today. Their file formats differ, but both are currently readable thanks to continuous investigation. However, there is some inconsistency between the two datasets, although most of the data are highly consistent. Understanding the differences requires knowledge of the archiving and off-line processing of spacecraft data, because the differences were caused by the off-line processing. The off-line processing of spacecraft data often requires merging and sorting of raw data: merging is normally performed to eliminate duplicated data, and sorting is performed to fix the data order. UTIG does not appear to have performed this merging and sorting, so the UTIG-processed data retain duplicates. The MIT-processed data did go through merging and sorting, but the raw data sometimes include wrong time tags, which cannot be fixed strictly by sorting. Also, the MIT-processed data have enough documentation to understand the metadata, while the UTIG data have only a brief instruction. Therefore, the MIT and UTIG data are treated as complementary, and a better data set can be established using both of them. In this presentation, we show the method used to build a better data set of the Viking Lander 2 seismic data.
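
    The merge-and-sort step described above is essentially time-ordering records and dropping duplicates. The sketch below is a minimal, hypothetical illustration of that idea; the record fields are invented and do not reflect the actual Viking tape format.

```python
# Minimal merge/sort/dedup sketch for telemetry-like records (hypothetical fields).

def merge_records(*sources):
    """Combine record lists, sort by time tag, and drop exact duplicates."""
    seen = set()
    merged = []
    for record in sorted((r for src in sources for r in src), key=lambda r: r["time"]):
        key = (record["time"], record["frame"])
        if key not in seen:          # keep the first copy of each (time, frame) pair
            seen.add(key)
            merged.append(record)
    return merged

mit  = [{"time": 100, "frame": 1}, {"time": 101, "frame": 2}]
utig = [{"time": 101, "frame": 2}, {"time": 102, "frame": 3}, {"time": 100, "frame": 1}]

print(merge_records(mit, utig))      # three unique records, in time order
```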

  6. Holistic processing, contact, and the other-race effect in face recognition.

    PubMed

    Zhao, Mintao; Hayward, William G; Bülthoff, Isabelle

    2014-12-01

    Face recognition, holistic processing, and processing of configural and featural facial information are known to be influenced by face race, with better performance for own- than other-race faces. However, whether these various other-race effects (OREs) arise from the same underlying mechanisms or from different processes remains unclear. The present study addressed this question by measuring the OREs in a set of face recognition tasks, and testing whether these OREs are correlated with each other. Participants performed different tasks probing (1) face recognition, (2) holistic processing, (3) processing of configural information, and (4) processing of featural information for both own- and other-race faces. Their contact with other-race people was also assessed with a questionnaire. The results show significant OREs in tasks testing face memory and processing of configural information, but not in tasks testing either holistic processing or processing of featural information. Importantly, there was no cross-task correlation between any of the measured OREs. Moreover, the level of other-race contact predicted only the OREs obtained in tasks testing face memory and processing of configural information. These results indicate that these various cross-race differences originate from different aspects of face processing, contrary to the view that the ORE in face recognition is due to cross-race differences in holistic processing. Copyright © 2014 The Authors. Published by Elsevier Ltd. All rights reserved.

  7. Process monitoring and visualization solutions for hot-melt extrusion: a review.

    PubMed

    Saerens, Lien; Vervaet, Chris; Remon, Jean Paul; De Beer, Thomas

    2014-02-01

    Hot-melt extrusion (HME) is applied as a continuous pharmaceutical manufacturing process for the production of a variety of dosage forms and formulations. To ensure the continuity of this process, the quality of the extrudates must be assessed continuously during manufacturing. The objective of this review is to provide an overview and evaluation of the available process analytical techniques which can be applied in hot-melt extrusion. Pharmaceutical extruders are equipped with traditional (univariate) process monitoring tools, observing barrel and die temperatures, throughput, screw speed, torque, drive amperage, melt pressure and melt temperature. The relevance of several spectroscopic process analytical techniques for monitoring and control of pharmaceutical HME has been explored recently. Nevertheless, many other sensors visualizing HME and measuring diverse critical product and process parameters with potential use in pharmaceutical extrusion are available, and have been thoroughly studied in polymer extrusion. The implementation of process analytical tools in HME serves two purposes: (1) improving process understanding by monitoring and visualizing the material behaviour, and (2) monitoring and analysing critical product and process parameters for process control, making it possible to maintain a desired process state and guarantee the quality of the end product. This review is the first to provide an evaluation of the process analytical tools applied for pharmaceutical HME monitoring and control, and discusses techniques that have been used in polymer extrusion and have potential for monitoring and control of pharmaceutical HME. © 2013 Royal Pharmaceutical Society.
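
    Since the review lists the traditional univariate signals an extruder already reports (barrel and die temperatures, throughput, screw speed, torque, melt pressure, melt temperature), the most basic control check is comparing each signal against its acceptable range. The sketch below illustrates that kind of univariate limit check on invented readings and limits; it is not a control strategy taken from the review.

```python
# Univariate limit check on typical extruder signals (invented limits and readings).

limits = {                      # acceptable (low, high) ranges -- hypothetical values
    "die_temperature_C": (150.0, 170.0),
    "screw_speed_rpm":   (90.0, 110.0),
    "melt_pressure_bar": (30.0, 60.0),
    "torque_pct":        (40.0, 80.0),
}

reading = {"die_temperature_C": 173.2, "screw_speed_rpm": 101.0,
           "melt_pressure_bar": 44.5, "torque_pct": 62.0}

for signal, value in reading.items():
    low, high = limits[signal]
    status = "OK" if low <= value <= high else "ALARM"
    print(f"{signal:>20}: {value:7.1f}  [{low}-{high}]  {status}")
```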

  8. Process for improving metal production in steelmaking processes

    DOEpatents

    Pal, Uday B.; Gazula, Gopala K. M.; Hasham, Ali

    1996-01-01

    A process and apparatus for improving metal production in ironmaking and steelmaking processes is disclosed. The use of an inert metallic conductor in the slag containing crucible and the addition of a transition metal oxide to the slag are the disclosed process improvements.

  9. Materials processing in space: Early experiments

    NASA Technical Reports Server (NTRS)

    Naumann, R. J.; Herring, H. W.

    1980-01-01

    The characteristics of the space environment are reviewed. Potential applications of space processing are discussed, including metallurgical processing and the processing of semiconductor materials. The behavior of fluids in low gravity is described. The evolution of apparatus for materials processing in space is reviewed.

  10. Abhijit Dutta | NREL

    Science.gov Websites

    Research areas: techno-economic analysis; process model development for existing and conceptual processes; detailed heat integration; economic analysis of integrated processes; integration of process simulation learnings into control. Related publication: "Conceptual Process Design and Techno-Economic Assessment of Ex Situ Catalytic Fast Pyrolysis of Biomass: A…"

  11. 7 CFR 52.806 - Color.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... PROCESSED FRUITS AND VEGETABLES, PROCESSED PRODUCTS THEREOF, AND CERTAIN OTHER PROCESSED FOOD PRODUCTS 1... cherries that vary markedly from this color due to oxidation, improper processing, or other causes, or that... to oxidation, improper processing, or other causes, or that are undercolored, does not exceed the...

  12. 7 CFR 52.806 - Color.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... PROCESSED FRUITS AND VEGETABLES, PROCESSED PRODUCTS THEREOF, AND CERTAIN OTHER PROCESSED FOOD PRODUCTS 1... cherries that vary markedly from this color due to oxidation, improper processing, or other causes, or that... to oxidation, improper processing, or other causes, or that are undercolored, does not exceed the...

  13. Meat Processing.

    ERIC Educational Resources Information Center

    Legacy, Jim; And Others

    This publication provides an introduction to meat processing for adult students in vocational and technical education programs. Organized in four chapters, the booklet provides a brief overview of the meat processing industry and the techniques of meat processing and butchering. The first chapter introduces the meat processing industry and…

  14. 40 CFR 60.2025 - What if my chemical recovery unit is not listed in § 60.2020(n)?

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... materials that are recovered. (3) A description (including a process flow diagram) of the process in which... process. (4) A description (including a process flow diagram) of the chemical constituent recovery process...

  15. 40 CFR 60.2025 - What if my chemical recovery unit is not listed in § 60.2020(n)?

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... materials that are recovered. (3) A description (including a process flow diagram) of the process in which... process. (4) A description (including a process flow diagram) of the chemical constituent recovery process...

  16. 40 CFR 60.2025 - What if my chemical recovery unit is not listed in § 60.2020(n)?

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... materials that are recovered. (3) A description (including a process flow diagram) of the process in which... process. (4) A description (including a process flow diagram) of the chemical constituent recovery process...

  17. 40 CFR 60.2558 - What if a chemical recovery unit is not listed in § 60.2555(n)?

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... materials that are recovered. (3) A description (including a process flow diagram) of the process in which... process. (4) A description (including a process flow diagram) of the chemical constituent recovery process...

  18. Integrated decontamination process for metals

    DOEpatents

    Snyder, Thomas S.; Whitlow, Graham A.

    1991-01-01

    An integrated process for decontamination of metals, particularly metals that are used in the nuclear energy industry contaminated with radioactive material. The process combines the processes of electrorefining and melt refining to purify metals that can be decontaminated using either electrorefining or melt refining processes.

  19. Case Studies in Continuous Process Improvement

    NASA Technical Reports Server (NTRS)

    Mehta, A.

    1997-01-01

    This study focuses on improving the SMT assembly process in a low-volume, high-reliability environment with emphasis on fine pitch and BGA packages. Before a process improvement is carried out, it is important to evaluate where the process stands in terms of process capability.
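
    Process capability is usually summarized with the Cp and Cpk indices, which compare the specification window with the natural spread of the process. The sketch below shows the standard calculation on invented placement-offset data for a fine-pitch feature; the specification limits and measurements are hypothetical and are not taken from the study.

```python
# Standard process-capability indices on invented measurement data.
import statistics

usl, lsl = 0.10, -0.10          # hypothetical spec limits for a placement offset (mm)
offsets = [0.01, -0.02, 0.03, 0.00, -0.01, 0.02, 0.04, -0.03, 0.01, 0.00]  # invented data

mu = statistics.mean(offsets)
sigma = statistics.stdev(offsets)            # sample standard deviation

cp  = (usl - lsl) / (6 * sigma)              # potential capability (ignores centering)
cpk = min(usl - mu, mu - lsl) / (3 * sigma)  # actual capability (accounts for centering)

print(f"mean = {mu:+.3f} mm, sigma = {sigma:.3f} mm")
print(f"Cp = {cp:.2f}, Cpk = {cpk:.2f}  (>= 1.33 is a common target)")
```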

  20. Mathematical Model of Nonstationary Separation Processes Proceeding in the Cascade of Gas Centrifuges in the Process of Separation of Multicomponent Isotope Mixtures

    NASA Astrophysics Data System (ADS)

    Orlov, A. A.; Ushakov, A. A.; Sovach, V. P.

    2017-03-01

    We have developed and implemented in software a mathematical model of the nonstationary separation processes proceeding in cascades of gas centrifuges during the separation of multicomponent isotope mixtures. Using this model, the parameters of the separation process for germanium isotopes have been calculated. It is shown that the model adequately describes the nonstationary processes in the cascade and is suitable for calculating their parameters during the separation of multicomponent isotope mixtures.

  1. International Best Practices for Pre-Processing and Co-Processing Municipal Solid Waste and Sewage Sludge in the Cement Industry

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hasanbeigi, Ali; Lu, Hongyou; Williams, Christopher

    The purpose of this report is to describe international best practices for pre-processing and co-processing of MSW and sewage sludge in cement plants, for the benefit of countries that wish to develop co-processing capacity. The report is divided into three main sections. Section 2 describes the fundamentals of co-processing, Section 3 describes exemplary international regulatory and institutional frameworks for co-processing, and Section 4 describes international best practices related to the technological aspects of co-processing.

  2. Thermochemical water decomposition processes

    NASA Technical Reports Server (NTRS)

    Chao, R. E.

    1974-01-01

    Thermochemical processes that lead to the production of hydrogen and oxygen from water without the consumption of any other material have a number of advantages compared to other processes such as water electrolysis. It is possible to operate a sequence of chemical steps with net work requirements equal to zero at temperatures well below the temperature required for water dissociation in a single step. Various types of processes are discussed, with attention given to halide processes, reverse Deacon processes, iron oxide and carbon oxide processes, and metal and alkali metal processes. Economic questions are also considered.

  3. Voyager image processing at the Image Processing Laboratory

    NASA Astrophysics Data System (ADS)

    Jepsen, P. L.; Mosher, J. A.; Yagi, G. M.; Avis, C. C.; Lorre, J. J.; Garneau, G. W.

    1980-09-01

    This paper discusses new digital processing techniques as applied to the Voyager Imaging Subsystem and devised to explore atmospheric dynamics, spectral variations, and the morphology of Jupiter, Saturn and their satellites. Radiometric and geometric decalibration processes, the modulation transfer function, and processes to determine and remove photometric properties of the atmosphere and surface of Jupiter and its satellites are examined. It is exhibited that selected images can be processed into 'approach at constant longitude' time lapse movies which are useful in observing atmospheric changes of Jupiter. Photographs are included to illustrate various image processing techniques.

  4. Voyager image processing at the Image Processing Laboratory

    NASA Technical Reports Server (NTRS)

    Jepsen, P. L.; Mosher, J. A.; Yagi, G. M.; Avis, C. C.; Lorre, J. J.; Garneau, G. W.

    1980-01-01

    This paper discusses new digital processing techniques as applied to the Voyager Imaging Subsystem and devised to explore atmospheric dynamics, spectral variations, and the morphology of Jupiter, Saturn and their satellites. Radiometric and geometric decalibration processes, the modulation transfer function, and processes to determine and remove photometric properties of the atmosphere and surface of Jupiter and its satellites are examined. It is exhibited that selected images can be processed into 'approach at constant longitude' time lapse movies which are useful in observing atmospheric changes of Jupiter. Photographs are included to illustrate various image processing techniques.

  5. A novel process control method for a TT-300 E-Beam/X-Ray system

    NASA Astrophysics Data System (ADS)

    Mittendorfer, Josef; Gallnböck-Wagner, Bernhard

    2018-02-01

    This paper presents some aspects of the process control method for a TT-300 E-Beam/X-Ray system at Mediscan, Austria. The novelty of the approach is the seamless integration of routine monitoring dosimetry with process data. This makes it possible to calculate a parametric dose for each production unit and thus to monitor process performance in a fine-grained, holistic way. Process performance is documented in process control charts for the analysis of individual runs as well as for historic trending of runs in specific process categories over a specified time range.
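
    A minimal sketch of the control-chart idea described above, assuming hypothetical per-run dose values and the standard Shewhart individuals-chart constant; it does not reproduce the TT-300 parametric-dose model.

```python
"""Minimal Shewhart individuals (I) chart sketch for per-run dose values.

Illustrative only: the dose numbers are invented and the 2.66 moving-range
constant follows standard SPC practice, not the parametric-dose model
described for the TT-300 system."""
import statistics

# Hypothetical routine-monitoring doses (kGy), one value per production run.
doses = [25.1, 25.4, 24.8, 25.0, 25.2, 24.9, 25.3, 27.5, 25.0, 25.1]

mean_dose = statistics.mean(doses)
moving_ranges = [abs(b - a) for a, b in zip(doses, doses[1:])]
mr_bar = statistics.mean(moving_ranges)

# Standard individuals-chart limits: mean +/- 2.66 * average moving range.
ucl = mean_dose + 2.66 * mr_bar
lcl = mean_dose - 2.66 * mr_bar

print(f"centre line = {mean_dose:.2f} kGy, UCL = {ucl:.2f}, LCL = {lcl:.2f}")
for run, dose in enumerate(doses, start=1):
    if dose > ucl or dose < lcl:
        print(f"run {run}: dose {dose} kGy is outside the control limits")
```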

  6. A minimally processed dietary pattern is associated with lower odds of metabolic syndrome among Lebanese adults.

    PubMed

    Nasreddine, Lara; Tamim, Hani; Itani, Leila; Nasrallah, Mona P; Isma'eel, Hussain; Nakhoul, Nancy F; Abou-Rizk, Joana; Naja, Farah

    2018-01-01

    To (i) estimate the consumption of minimally processed, processed and ultra-processed foods in a sample of Lebanese adults; (ii) explore patterns of intakes of these food groups; and (iii) investigate the association of the derived patterns with cardiometabolic risk. Cross-sectional survey. Data collection included dietary assessment using an FFQ and biochemical, anthropometric and blood pressure measurements. Food items were categorized into twenty-five groups based on the NOVA food classification. The contribution of each food group to total energy intake (TEI) was estimated. Patterns of intakes of these food groups were examined using exploratory factor analysis. Multivariate logistic regression analysis was used to evaluate the associations of derived patterns with cardiometabolic risk factors. Greater Beirut area, Lebanon. Adults ≥18 years (n 302) with no prior history of chronic diseases. Of TEI, 36·53 and 27·10 % were contributed by ultra-processed and minimally processed foods, respectively. Two dietary patterns were identified: the 'ultra-processed' and the 'minimally processed/processed'. The 'ultra-processed' consisted mainly of fast foods, snacks, meat, nuts, sweets and liquor, while the 'minimally processed/processed' consisted mostly of fruits, vegetables, legumes, breads, cheeses, sugar and fats. Participants in the highest quartile of the 'minimally processed/processed' pattern had significantly lower odds for metabolic syndrome (OR=0·18, 95 % CI 0·04, 0·77), hyperglycaemia (OR=0·25, 95 % CI 0·07, 0·98) and low HDL cholesterol (OR=0·17, 95 % CI 0·05, 0·60). The study findings may be used for the development of evidence-based interventions aimed at encouraging the consumption of minimally processed foods.
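
    The adjusted odds ratios above come from multivariate logistic regression; as a hedged reminder of how such a ratio is read, the sketch below computes a crude odds ratio and a Woolf 95% confidence interval from a toy 2x2 table (all counts invented, no adjustment for covariates).

```python
"""Crude odds ratio from a 2x2 table with a Woolf (log) 95% CI.

Toy counts only; the study itself reported adjusted odds ratios from
multivariate logistic regression, which this sketch does not reproduce."""
import math

# Rows: highest vs. lowest quartile of the 'minimally processed' pattern.
# Columns: metabolic syndrome present / absent (hypothetical counts).
a, b = 6, 70    # exposed: cases, non-cases
c, d = 18, 58   # unexposed: cases, non-cases

odds_ratio = (a * d) / (b * c)
se_log_or = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
lo = math.exp(math.log(odds_ratio) - 1.96 * se_log_or)
hi = math.exp(math.log(odds_ratio) + 1.96 * se_log_or)

print(f"OR = {odds_ratio:.2f}, 95% CI {lo:.2f}-{hi:.2f}")
# An OR below 1 with a CI excluding 1 indicates lower odds in the exposed group.
```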

  7. Increasing patient safety and efficiency in transfusion therapy using formal process definitions.

    PubMed

    Henneman, Elizabeth A; Avrunin, George S; Clarke, Lori A; Osterweil, Leon J; Andrzejewski, Chester; Merrigan, Karen; Cobleigh, Rachel; Frederick, Kimberly; Katz-Bassett, Ethan; Henneman, Philip L

    2007-01-01

    The administration of blood products is a common, resource-intensive, and potentially problem-prone area that may place patients at elevated risk in the clinical setting. Much of the emphasis in transfusion safety has been targeted toward quality control measures in laboratory settings where blood products are prepared for administration as well as in automation of certain laboratory processes. In contrast, the process of transfusing blood in the clinical setting (ie, at the point of care) has essentially remained unchanged over the past several decades. Many of the currently available methods for improving the quality and safety of blood transfusions in the clinical setting rely on informal process descriptions, such as flow charts and medical algorithms, to describe medical processes. These informal descriptions, although useful in presenting an overview of standard processes, can be ambiguous or incomplete. For example, they often describe only the standard process and leave out how to handle possible failures or exceptions. One alternative to these informal descriptions is to use formal process definitions, which can serve as the basis for a variety of analyses because these formal definitions offer precision in the representation of all possible ways that a process can be carried out in both standard and exceptional situations. Formal process definitions have not previously been used to describe and improve medical processes. The use of such formal definitions to prospectively identify potential error and improve the transfusion process has not previously been reported. The purpose of this article is to introduce the concept of formally defining processes and to describe how formal definitions of blood transfusion processes can be used to detect and correct transfusion process errors in ways not currently possible using existing quality improvement methods.
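
    A minimal sketch of the idea that a formal, machine-checkable process definition makes exceptional paths explicit; the step names and table layout below are hypothetical and are not the authors' formalism.

```python
"""Toy 'formal process definition' as a table of steps with explicit
exception handlers. Step names are hypothetical; the point is only that a
machine-checkable definition can be audited for unhandled failure paths,
unlike an informal flow chart."""

process = {
    "verify_order":     {"next": "check_patient_id", "on_failure": "notify_physician"},
    "check_patient_id": {"next": "check_blood_unit", "on_failure": "stop_transfusion"},
    "check_blood_unit": {"next": "administer",       "on_failure": "return_unit_to_lab"},
    "administer":       {"next": "monitor_patient",  "on_failure": "stop_transfusion"},
    "monitor_patient":  {"next": None,               "on_failure": None},  # handler missing
}

def unhandled_failures(defn):
    """Return the steps whose failure path is left unspecified."""
    return [step for step, spec in defn.items() if spec["on_failure"] is None]

print("steps without an exception handler:", unhandled_failures(process))
```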

  8. Chemical interaction matrix between reagents in a Purex based process

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brahman, R.K.; Hennessy, W.P.; Paviet-Hartmann, P.

    2008-07-01

    The United States Department of Energy (DOE) is the responsible entity for the disposal of the United States excess weapons-grade plutonium. DOE selected a PUREX-based process to convert plutonium to low-enriched mixed oxide fuel for use in commercial nuclear power plants. To initiate this process in the United States, a Mixed Oxide (MOX) Fuel Fabrication Facility (MFFF) is under construction and will be operated by Shaw AREVA MOX Services at the Savannah River Site. This facility will be licensed and regulated by the U.S. Nuclear Regulatory Commission (NRC). A PUREX process, similar to the one used at La Hague, France, will purify plutonium feedstock through solvent extraction. MFFF employs two major process operations to manufacture MOX fuel assemblies: (1) the Aqueous Polishing (AP) process to remove gallium and other impurities from plutonium feedstock and (2) the MOX fuel fabrication process (MP), which processes the oxides into pellets and manufactures the MOX fuel assemblies. The AP process consists of three major steps, dissolution, purification, and conversion, and is the center of the primary chemical processing. A study of process hazards controls has been initiated that will provide knowledge of and protection against the chemical risks associated with mixing of reagents over the lifetime of the process. This paper presents a comprehensive chemical interaction matrix evaluation for the reagents used in the PUREX-based process. The chemical interaction matrix supplements the process conditions by providing a checklist of any potential inadvertent chemical reactions that may take place. It also identifies the chemical compatibility/incompatibility of the reagents if mixed by failure of operations or equipment within the process itself or mixed inadvertently by a technician in the laboratories.

  9. Ultra-processed foods have the worst nutrient profile, yet they are the most available packaged products in a sample of New Zealand supermarkets.

    PubMed

    Luiten, Claire M; Steenhuis, Ingrid Hm; Eyles, Helen; Ni Mhurchu, Cliona; Waterlander, Wilma E

    2016-02-01

    To examine the availability of packaged food products in New Zealand supermarkets by level of industrial processing, nutrient profiling score (NPSC), price (energy, unit and serving costs) and brand variety. Secondary analysis of cross-sectional survey data on packaged supermarket food and non-alcoholic beverages. Products were classified according to level of industrial processing (minimally, culinary and ultra-processed) and their NPSC. Packaged foods available in four major supermarkets in Auckland, New Zealand. Packaged supermarket food products for the years 2011 and 2013. The majority (84% in 2011 and 83% in 2013) of packaged foods were classified as ultra-processed. A significant positive association was found between the level of industrial processing and NPSC, i.e., ultra-processed foods had a worse nutrient profile (NPSC=11.63) than culinary processed foods (NPSC=7.95), which in turn had a worse nutrient profile than minimally processed foods (NPSC=3.27), P<0.001. No clear associations were observed between the three price measures and level of processing. The study observed many variations of virtually the same product. The ten largest food manufacturers produced 35% of all packaged foods available. In New Zealand supermarkets, ultra-processed foods comprise the largest proportion of packaged foods and are less healthy than less processed foods. The lack of significant price difference between ultra- and less processed foods suggests ultra-processed foods might provide time-poor consumers with more value for money. These findings highlight the need to improve the supermarket food supply by reducing numbers of ultra-processed foods and by reformulating products to improve their nutritional profile.

  10. Trends in consumption of ultra-processed foods and obesity in Sweden between 1960 and 2010.

    PubMed

    Juul, Filippa; Hemmingsson, Erik

    2015-12-01

    To investigate how consumption of ultra-processed foods has changed in Sweden in relation to obesity. Nationwide ecological analysis of changes in processed foods along with corresponding changes in obesity. Trends in per capita food consumption during 1960-2010 were investigated using data from the Swedish Board of Agriculture. Food items were classified as group 1 (unprocessed/minimally processed), group 2 (processed culinary ingredients) or group 3 (3·1, processed food products; and 3·2, ultra-processed products). Obesity prevalence data were pooled from the peer-reviewed literature, Statistics Sweden and the WHO Global Health Observatory. Nationwide analysis in Sweden, 1960-2010. Swedish nationals aged 18 years and older. During the study period consumption of group 1 foods (minimal processing) decreased by 2 %, while consumption of group 2 foods (processed ingredients) decreased by 34 %. Consumption of group 3·1 foods (processed food products) increased by 116 % and group 3·2 foods (ultra-processed products) increased by 142 %. Among ultra-processed products, there were particularly large increases in soda (315 %; 22 v. 92 litres/capita per annum) and snack foods such as crisps and candies (367 %; 7 v. 34 kg/capita per annum). In parallel to these changes in ultra-processed products, rates of adult obesity increased from 5 % in 1980 to over 11 % in 2010. The consumption of ultra-processed products (i.e. foods with low nutritional value but high energy density) has increased dramatically in Sweden since 1960, which mirrors the increased prevalence of obesity. Future research should clarify the potential causal role of ultra-processed products in weight gain and obesity.

  11. Differential Phonological and Semantic Modulation of Neurophysiological Responses to Visual Word Recognition.

    PubMed

    Drakesmith, Mark; El-Deredy, Wael; Welbourne, Stephen

    2015-01-01

    Reading words for meaning relies on orthographic, phonological and semantic processing. The triangle model implicates a direct orthography-to-semantics pathway and a phonologically mediated orthography-to-semantics pathway, which interact with each other. The temporal evolution of processing in these routes is not well understood, although theoretical evidence predicts early phonological processing followed by interactive phonological and semantic processing. This study used electroencephalography-event-related potential (ERP) analysis and magnetoencephalography (MEG) source localisation to identify temporal markers and the corresponding neural generators of these processes in early (∼200 ms) and late (∼400 ms) neurophysiological responses to visual words, pseudowords and consonant strings. ERP showed an effect of phonology but not semantics in both time windows, although at ∼400 ms there was an effect of stimulus familiarity. Phonological processing at ~200 ms was localised to the left occipitotemporal cortex and the inferior frontal gyrus. At 400 ms, there was continued phonological processing in the inferior frontal gyrus and additional semantic processing in the anterior temporal cortex. There was also an area in the left temporoparietal junction which was implicated in both phonological and semantic processing. In ERP, the semantic response at ∼400 ms appeared to be masked by concurrent processes relating to familiarity, while MEG successfully differentiated these processes. The results support the prediction of early phonological processing followed by an interaction of phonological and semantic processing during word recognition. Neuroanatomical loci of these processes are consistent with previous neuropsychological and functional magnetic resonance imaging studies. The results also have implications for the classical interpretation of N400-like responses as markers for semantic processing.

  12. Basic abnormalities in visual processing affect face processing at an early age in autism spectrum disorder.

    PubMed

    Vlamings, Petra Hendrika Johanna Maria; Jonkman, Lisa Marthe; van Daalen, Emma; van der Gaag, Rutger Jan; Kemner, Chantal

    2010-12-15

    A detailed visual processing style has been noted in autism spectrum disorder (ASD); this contributes to problems in face processing and has been directly related to abnormal processing of spatial frequencies (SFs). Little is known about the early development of face processing in ASD and the relation with abnormal SF processing. We investigated whether young ASD children show abnormalities in low spatial frequency (LSF, global) and high spatial frequency (HSF, detailed) processing and explored whether these are crucially involved in the early development of face processing. Three- to 4-year-old children with ASD (n = 22) were compared with developmentally delayed children without ASD (n = 17). Spatial frequency processing was studied by recording visual evoked potentials from visual brain areas while children passively viewed gratings (HSF/LSF). In addition, children watched face stimuli with different expressions, filtered to include only HSF or LSF. Enhanced activity in visual brain areas was found in response to HSF versus LSF information in children with ASD, in contrast to control subjects. Furthermore, facial-expression processing was also primarily driven by detail in ASD. Enhanced visual processing of detailed (HSF) information is present early in ASD and occurs for neutral (gratings), as well as for socially relevant stimuli (facial expressions). These data indicate that there is a general abnormality in visual SF processing in early ASD and are in agreement with suggestions that a fast LSF subcortical face processing route might be affected in ASD. This could suggest that abnormal visual processing is causative in the development of social problems in ASD. Copyright © 2010 Society of Biological Psychiatry. Published by Elsevier Inc. All rights reserved.

  13. [Contention on the theory of processing techniques of Chinese materia medica in the Ming-Qing period].

    PubMed

    Chen, Bin; Jia, Tianzhu

    2015-03-01

    Building on the golden age of development of medicinal processing techniques in the Song dynasty, the theory and techniques of processing developed further and matured in the Ming-Qing dynasties. The views of some physicians on the processing of common medicinals, such as Radix Rehmanniae and Radix Ophiopogonis, were questioned, with new processing methods put forward and debated against those who insisted on the traditional ones, marking the progress of the art of processing. By reviewing this contention over the technical theory of medicinal processing in the Ming-Qing period, useful references can be provided for the inheritance and development of the traditional art of processing medicinals.

  14. Process Feasibility Study in Support of Silicon Material, Task 1

    NASA Technical Reports Server (NTRS)

    Li, K. Y.; Hansen, K. C.; Yaws, C. L.

    1979-01-01

    During this reporting period, major activities were devoted to process system properties, chemical engineering, and economic analyses. Analysis of process system properties continued for materials involved in the alternate processes under consideration for solar-cell-grade silicon. The following property data are reported for silicon tetrafluoride: critical constants, vapor pressure, heat of vaporization, heat capacity, density, surface tension, viscosity, thermal conductivity, heat of formation, and Gibbs free energy of formation. Chemical engineering analysis of the BCL process continued, with primary effort devoted to the preliminary process design. Status and progress are reported for base case conditions, the process flow diagram, reaction chemistry, material and energy balances, and major process equipment design.

  15. Technology and development requirements for advanced coal conversion systems

    NASA Technical Reports Server (NTRS)

    1981-01-01

    A compendium of coal conversion process descriptions is presented. The SRS and MC databases were utilized to provide information particularly in the areas of existing process designs and process evaluations. Additional information requirements were established and arrangements were made to visit process developers, pilot plants, and process development units to obtain information that was not otherwise available. Plant designs, process descriptions and operating conditions, and performance characteristics were analyzed, and requirements for further development were identified and evaluated to determine the impact of these requirements on the process commercialization potential from the standpoint of economics and technical feasibility. A preliminary methodology was established for the comparative technical and economic assessment of advanced processes.

  16. The s-process in massive stars: the Shell C-burning contribution

    NASA Astrophysics Data System (ADS)

    Pignatari, Marco; Gallino, R.; Baldovin, C.; Wiescher, M.; Herwig, F.; Heger, A.; Heil, M.; Käppeler, F.

    In massive stars the s-process (slow neutron capture process) is activated at different temperatures, during He-burning and during convective shell C-burning. At solar metallicity, the neutron capture process in the convective C-shell adds a substantial contribution to the s-process yields made by the previous core He-burning, and the final results carry the signature of both processes. With decreasing metallicity, the contribution of the C-burning shell to the weak s-process rapidly decreases, because of the effect of the primary neutron poisons. On the other hand, the s-process efficiency in the He core also decreases with metallicity.

  17. Clean-up and disposal process of polluted sediments from urban rivers.

    PubMed

    He, P J; Shao, L M; Gu, G W; Bian, C L; Xu, C

    2001-10-01

    In this paper, the discussion concentrates on the properties of the polluted sediments and on a combined clean-up and disposal process for the heavily polluted, highly flowable upper sediment layer. Based on systematic analyses of various clean-up processes, a suitable engineering process has been evaluated and recommended. The process has been applied to river reclamation in the Yangpu District of Shanghai, China. An improved centrifuge is used for dewatering the dredged sludge and plays an important role in the combined clean-up and disposal process. Assessment of the engineering process shows its environmental, technical, and economic feasibility to be much better than that of traditional dredging-disposal processes.

  18. Adopting software quality measures for healthcare processes.

    PubMed

    Yildiz, Ozkan; Demirörs, Onur

    2009-01-01

    In this study, we investigated the adoptability of software quality measures for healthcare process measurement. Quality measures of ISO/IEC 9126 are redefined from a process perspective to build a generic healthcare process quality measurement model. The case study research method is used, and the model is applied to a public hospital's Entry to Care process. After the application, weak and strong aspects of the process can be easily observed. Access auditability, fault removal, completeness of documentation, and machine utilization are weak aspects, and these aspects are candidates for process improvement. On the other hand, functional completeness, fault ratio, input validity checking, response time, and throughput time are strong aspects of the process.
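
    As a hedged sketch of how two of the named measures might be computed from process records, assuming toy case data and invented field names:

```python
"""Sketch of two process-level quality measures named in the abstract
(fault ratio and throughput time), computed from toy case records.
All fields and numbers are hypothetical."""

cases = [
    {"id": 1, "entry_min": 0,  "exit_min": 42, "faults": 0},
    {"id": 2, "entry_min": 5,  "exit_min": 90, "faults": 2},
    {"id": 3, "entry_min": 12, "exit_min": 60, "faults": 1},
]

fault_ratio = sum(c["faults"] for c in cases) / len(cases)
throughput_times = [c["exit_min"] - c["entry_min"] for c in cases]
mean_throughput = sum(throughput_times) / len(throughput_times)

print(f"fault ratio          = {fault_ratio:.2f} faults/case")
print(f"mean throughput time = {mean_throughput:.1f} min")
```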

  19. Business process modeling in healthcare.

    PubMed

    Ruiz, Francisco; Garcia, Felix; Calahorra, Luis; Llorente, César; Gonçalves, Luis; Daniel, Christel; Blobel, Bernd

    2012-01-01

    The importance of the process point of view is not restricted to a specific enterprise sector. In the field of health, as a result of the nature of the service offered, health institutions' processes are also the basis for decision making focused on achieving their objective of providing quality medical assistance. In this chapter the application of business process modelling using the Business Process Modelling Notation (BPMN) standard is described. The main challenges of business process modelling in healthcare are the definition of healthcare processes, the multi-disciplinary nature of healthcare, the flexibility and variability of the activities involved in healthcare processes, the need for interoperability between multiple information systems, and the continuous updating of scientific knowledge in healthcare.

  20. Survey of the US materials processing and manufacturing in space program

    NASA Technical Reports Server (NTRS)

    Mckannan, E. C.

    1981-01-01

    To promote potential commercial applications of low-g technology, the materials processing and manufacturing in space program is structured to: (1) analyze the scientific principles of gravitational effects on processes used in producing materials; (2) apply the research toward the technology used to control production processes (on Earth or in space, as appropriate); and (3) establish the legal and managerial framework for commercial ventures. Presently federally funded NASA research is described, as well as agreements for privately funded commercial activity and a proposed academic participation process. The future scope of the program and related capabilities using ground-based facilities, aircraft, sounding rockets, and space shuttles are discussed. Areas of interest described include crystal growth; solidification of metals and alloys; containerless processing; fluids and chemical processes (including biological separation processes); and processing extraterrestrial materials.

  1. On the fractal characterization of Paretian Poisson processes

    NASA Astrophysics Data System (ADS)

    Eliazar, Iddo I.; Sokolov, Igor M.

    2012-06-01

    Paretian Poisson processes are Poisson processes which are defined on the positive half-line, have maximal points, and are quantified by power-law intensities. Paretian Poisson processes are elemental in statistical physics, and are the bedrock of a host of power-law statistics ranging from Pareto's law to anomalous diffusion. In this paper we establish evenness-based fractal characterizations of Paretian Poisson processes. Considering an array of socioeconomic evenness-based measures of statistical heterogeneity, we show that: amongst the realm of Poisson processes which are defined on the positive half-line, and have maximal points, Paretian Poisson processes are the unique class of 'fractal processes' exhibiting scale-invariance. The results established in this paper are diametric to previous results asserting that the scale-invariance of Poisson processes (with respect to physical randomness-based measures of statistical heterogeneity) is characterized by exponential Poissonian intensities.
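
    A minimal simulation sketch, assuming a power-law intensity lambda(x) = c*x^(-(alpha+1)) and illustrative parameter values not taken from the paper: above a cutoff the point count is Poisson and the points themselves are Pareto distributed.

```python
"""Sampling the points of a 'Paretian' Poisson process above a cutoff.

Assumed intensity lambda(x) = c * x**(-(alpha + 1)) on the positive
half-line (c, alpha, eps are illustrative values). Above a cutoff eps the
point count is Poisson with mean Lambda(eps) = (c / alpha) * eps**(-alpha),
and each point is Pareto distributed with shape alpha and scale eps."""
import numpy as np

rng = np.random.default_rng(0)
c, alpha, eps = 2.0, 1.5, 0.1          # assumed parameters

mean_count = (c / alpha) * eps ** (-alpha)
n_points = rng.poisson(mean_count)

# Pareto-I samples with scale eps: (standard Pareto draw + 1) * eps.
points = (rng.pareto(alpha, size=n_points) + 1.0) * eps

print(f"expected {mean_count:.1f} points above {eps}, drew {n_points}")
print("largest (maximal) point:", points.max() if n_points else None)
```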

  2. Mobil process converts methanol to high-quality synthetic gasoline

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wood, A.

    1978-12-11

    If production of gasoline from coal becomes commercially attractive in the United States, a process under development at the Mobil Research and Development Corp. may compete with better-known coal liquefaction processes. The Mobil process converts methanol to high-octane, unleaded gasoline; methanol can be produced commercially from coal. If gasoline is the desired product, the Mobil process offers strong technical and cost advantages over H-coal, Exxon donor solvent, solvent-refined coal, and Fischer-Tropsch processes. The cost analysis, contained in a report to the Dept. of Energy, concludes that the Mobil process produces more-expensive liquid products than any other liquefaction process except Fischer-Tropsch. But Mobil's process produces ready-to-use gasoline, while the others produce oils which require further expensive refining to yield gasoline. Disadvantages and advantages are discussed.

  3. Using Waste Heat for External Processes (English/Chinese) (Fact Sheet) (in Chinese; English)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    Chinese translation of the Using Waste Heat for External Processes fact sheet. Provides suggestions on how to use waste heat in industrial applications. The temperature of exhaust gases from fuel-fired industrial processes depends mainly on the process temperature and the waste heat recovery method. Figure 1 shows the heat lost in exhaust gases at various exhaust gas temperatures and percentages of excess air. Energy from gases exhausted from higher temperature processes (primary processes) can be recovered and used for lower temperature processes (secondary processes). One example is to generate steam using waste heat boilers for the fluid heaters used in petroleum crude processing. In addition, many companies install heat exchangers on the exhaust stacks of furnaces and ovens to produce hot water or to generate hot air for space heating.
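
    A back-of-the-envelope sketch of the recoverable sensible heat in an exhaust stream, using Q = m_dot * cp * (T_exhaust - T_stack) with assumed flow, specific heat, and temperatures.

```python
"""Back-of-the-envelope recoverable sensible heat from an exhaust stream,
Q = m_dot * cp * (T_exhaust - T_stack). All numbers are assumed for
illustration; real recovery also depends on excess air, the flue-gas dew
point, and the heat-exchanger approach temperature."""

m_dot = 3.0        # exhaust mass flow, kg/s (assumed)
cp = 1.1           # mean specific heat of flue gas, kJ/(kg*K) (approximate)
t_exhaust = 650.0  # process exhaust temperature, deg C (assumed)
t_stack = 180.0    # temperature after recovery, deg C (kept above dew point)

q_kw = m_dot * cp * (t_exhaust - t_stack)   # kJ/s == kW
print(f"recoverable heat ~= {q_kw:.0f} kW")
```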

  4. In-situ acoustic signature monitoring in additive manufacturing processes

    NASA Astrophysics Data System (ADS)

    Koester, Lucas W.; Taheri, Hossein; Bigelow, Timothy A.; Bond, Leonard J.; Faierson, Eric J.

    2018-04-01

    Additive manufacturing is a rapidly maturing process for the production of complex metallic, ceramic, polymeric, and composite components. The processes used are numerous and, combined with the complex geometries involved, this can make quality control and standardization of the process and of inspection difficult. Acoustic emission measurements have been used previously to monitor a number of processes including machining and welding. The authors have identified acoustic signature measurement as a potential means of monitoring metal additive manufacturing processes using process noise characteristics and those discrete acoustic emission events characteristic of defect growth, including cracks and delamination. Results of acoustic monitoring for a metal additive manufacturing process (directed energy deposition) are reported. The work investigated correlations between acoustic emissions and process noise with variations in machine state and deposition parameters, and provided proof-of-concept data that such correlations do exist.
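
    A toy sketch of the burst-detection idea, assuming a synthetic signal, an invented sample rate, and an arbitrary threshold; it is not the authors' acoustic-signature pipeline.

```python
"""Toy burst detector for an acoustic-emission-style signal: compute RMS in
short windows and flag windows well above the running baseline. The synthetic
signal and thresholds are illustrative, not the authors' processing chain."""
import numpy as np

rng = np.random.default_rng(1)
fs = 50_000                                      # sample rate, Hz (assumed)
signal = 0.01 * rng.standard_normal(fs)          # 1 s of background noise
signal[20_000:20_200] += 0.2 * rng.standard_normal(200)  # injected burst

win = 500                                        # 10 ms windows
n_win = len(signal) // win
rms = np.sqrt((signal[: n_win * win].reshape(n_win, win) ** 2).mean(axis=1))

baseline = np.median(rms)
burst_windows = np.flatnonzero(rms > 5 * baseline)
print("burst detected in windows:", burst_windows,
      "(t ~", burst_windows * win / fs, "s)")
```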

  5. NPTool: Towards Scalability and Reliability of Business Process Management

    NASA Astrophysics Data System (ADS)

    Braghetto, Kelly Rosa; Ferreira, João Eduardo; Pu, Calton

    Currently an important challenge in business process management is to provide both scalability and reliability of business process executions. This difficulty becomes more accentuated when execution control involves countless complex business processes. This work presents NavigationPlanTool (NPTool), a tool to control the execution of business processes. NPTool is supported by Navigation Plan Definition Language (NPDL), a language for business process specification that uses process algebra as its formal foundation. NPTool implements the NPDL language as a SQL extension. The main contribution of this paper is a description of NPTool showing how process algebra features combined with a relational database model can be used to provide scalable and reliable control of the execution of business processes. The next steps of NPTool include reuse of control-flow patterns and support for data flow management.

  6. A new intuitionistic fuzzy rule-based decision-making system for an operating system process scheduler.

    PubMed

    Butt, Muhammad Arif; Akram, Muhammad

    2016-01-01

    We present a new intuitionistic fuzzy rule-based decision-making system based on intuitionistic fuzzy sets for a process scheduler of a batch operating system. Our proposed intuitionistic fuzzy scheduling algorithm inputs the nice value and burst time of all available processes in the ready queue, intuitionistically fuzzifies the input values, triggers appropriate rules of our intuitionistic fuzzy inference engine, and finally calculates the dynamic priority (dp) of all the processes in the ready queue. Once the dp of every process is calculated, the ready queue is sorted in decreasing order of dp. The process with the maximum dp value is sent to the central processing unit for execution. Finally, we show the complete working of our algorithm on two different data sets and give comparisons with some standard non-preemptive process schedulers.
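
    A hedged, simplified sketch of the dynamic-priority idea: normalize nice and burst values, form a toy membership/non-membership pair, score each process, and sort the ready queue. The membership formulas and numbers are assumptions, not the paper's intuitionistic fuzzy rule base.

```python
"""Toy dynamic-priority scheduler loosely inspired by intuitionistic fuzzy
values: each process gets a membership degree mu ('deserves the CPU now'),
a non-membership degree nu, and a score dp = mu - nu. The formulas and the
example processes are illustrative assumptions."""

ready_queue = [
    {"pid": 1, "nice": 0,  "burst": 12},
    {"pid": 2, "nice": 10, "burst": 3},
    {"pid": 3, "nice": -5, "burst": 25},
]

NICE_MIN, NICE_MAX, BURST_MAX = -20, 19, 30.0

for p in ready_queue:
    favour_nice = 1.0 - (p["nice"] - NICE_MIN) / (NICE_MAX - NICE_MIN)
    favour_short = 1.0 - min(p["burst"] / BURST_MAX, 1.0)
    mu = 0.5 * (favour_nice + favour_short)        # membership degree
    nu = 0.8 * (1.0 - mu)                          # non-membership (mu + nu <= 1)
    p["dp"] = mu - nu                              # score of the IF value

# Sort by decreasing dynamic priority; the head of the queue is dispatched.
ready_queue.sort(key=lambda p: p["dp"], reverse=True)
print("dispatch order:", [(p["pid"], round(p["dp"], 2)) for p in ready_queue])
```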

  7. Achieving continuous manufacturing for final dosage formation: challenges and how to meet them. May 20-21, 2014 Continuous Manufacturing Symposium.

    PubMed

    Byrn, Stephen; Futran, Maricio; Thomas, Hayden; Jayjock, Eric; Maron, Nicola; Meyer, Robert F; Myerson, Allan S; Thien, Michael P; Trout, Bernhardt L

    2015-03-01

    We describe the key issues and possibilities for continuous final dosage formation, otherwise known as downstream processing or drug product manufacturing. A distinction is made between heterogeneous processing and homogeneous processing, the latter of which is expected to add more value to continuous manufacturing. We also give the key motivations for moving to continuous manufacturing, some of the exciting new technologies, and the barriers to implementation of continuous manufacturing. Continuous processing of heterogeneous blends is the natural first step in converting existing batch processes to continuous. In heterogeneous processing, there are discrete particles that can segregate, versus in homogeneous processing, components are blended and homogenized such that they do not segregate. Heterogeneous processing can incorporate technologies that are closer to existing technologies, where homogeneous processing necessitates the development and incorporation of new technologies. Homogeneous processing has the greatest potential for reaping the full rewards of continuous manufacturing, but it takes long-term vision and a more significant change in process development than heterogeneous processing. Heterogeneous processing has the detriment that, as the technologies are adopted rather than developed, there is a strong tendency to incorporate correction steps, what we call below "The Rube Goldberg Problem." Thus, although heterogeneous processing will likely play a major role in the near-term transformation of heterogeneous to continuous processing, it is expected that homogeneous processing is the next step that will follow. Specific action items for industry leaders are: Form precompetitive partnerships, including industry (pharmaceutical companies and equipment manufacturers), government, and universities. These precompetitive partnerships would develop case studies of continuous manufacturing and ideally perform joint-technology development, including development of small-scale equipment and processes. Develop ways to invest internally in continuous manufacturing. How best to do this will depend on the specifics of a given organization, in particular the current development projects. Upper managers will need to energize their process developers to incorporate continuous manufacturing in at least part of their processes to gain experience and demonstrate directly the benefits. Training of continuous manufacturing technologies, organizational approaches, and regulatory approaches is a key area that industrial leaders should pursue together. © 2014 Wiley Periodicals, Inc. and the American Pharmacists Association.

  8. 25 CFR 42.4 - What are alternative dispute resolution processes?

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... What are alternative dispute resolution processes? Alternative dispute resolution (ADR) processes are... action. (a) ADR processes may: (1) Include peer adjudication, mediation, and conciliation; and (2... that these practices are readily identifiable. (b) For further information on ADR processes and how to...

  9. 25 CFR 42.4 - What are alternative dispute resolution processes?

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... What are alternative dispute resolution processes? Alternative dispute resolution (ADR) processes are... action. (a) ADR processes may: (1) Include peer adjudication, mediation, and conciliation; and (2... that these practices are readily identifiable. (b) For further information on ADR processes and how to...

  10. Characterization of Nonhomogeneous Poisson Processes Via Moment Conditions.

    DTIC Science & Technology

    1986-08-01

    Poisson processes play an important role in many fields. The Poisson process is one of the simplest counting processes and is a building block for...place of independent increments. This provides a somewhat different viewpoint for examining Poisson processes . In addition, new characterizations for

  11. West Valley demonstration project: Alternative processes for solidifying the high-level wastes

    NASA Astrophysics Data System (ADS)

    Holton, L. K.; Larson, D. E.; Partain, W. L.; Treat, R. L.

    1981-10-01

    Two pretreatment approaches and several waste form processes for radioactive wastes were selected for evaluation. The two waste treatment approaches were the salt/sludge separation process and the combined waste process. Both terminal and interim waste form processes were studied.

  12. Process for improving metal production in steelmaking processes

    DOEpatents

    Pal, U.B.; Gazula, G.K.M.; Hasham, A.

    1996-06-18

    A process and apparatus for improving metal production in ironmaking and steelmaking processes are disclosed. The use of an inert metallic conductor in the slag-containing crucible and the addition of a transition metal oxide to the slag are the disclosed process improvements. 6 figs.

  13. 40 CFR 409.10 - Applicability; description of the beet sugar processing subcategory.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... sugar processing subcategory. 409.10 Section 409.10 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) EFFLUENT GUIDELINES AND STANDARDS SUGAR PROCESSING POINT SOURCE CATEGORY Beet Sugar Processing Subcategory § 409.10 Applicability; description of the beet sugar processing subcategory. The...

  14. 40 CFR 409.10 - Applicability; description of the beet sugar processing subcategory.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... sugar processing subcategory. 409.10 Section 409.10 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) EFFLUENT GUIDELINES AND STANDARDS SUGAR PROCESSING POINT SOURCE CATEGORY Beet Sugar Processing Subcategory § 409.10 Applicability; description of the beet sugar processing subcategory. The...

  15. Enhancing Manufacturing Process Education via Computer Simulation and Visualization

    ERIC Educational Resources Information Center

    Manohar, Priyadarshan A.; Acharya, Sushil; Wu, Peter

    2014-01-01

    Industrially significant metal manufacturing processes such as melting, casting, rolling, forging, machining, and forming are multi-stage, complex processes that are labor, time, and capital intensive. Academic research develops mathematical modeling of these processes that provide a theoretical framework for understanding the process variables…

  16. 9 CFR 318.304 - Operations in the thermal processing area.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... factor over the specified thermal processing operation times. Temperature/time recording devices shall... minimum initial temperatures and operating procedures for thermal processing equipment, shall be posted in... available to the thermal processing system operator and the inspector. (b) Process indicators and retort...

  17. 9 CFR 318.304 - Operations in the thermal processing area.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... factor over the specified thermal processing operation times. Temperature/time recording devices shall... minimum initial temperatures and operating procedures for thermal processing equipment, shall be posted in... available to the thermal processing system operator and the inspector. (b) Process indicators and retort...

  18. 9 CFR 318.304 - Operations in the thermal processing area.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... factor over the specified thermal processing operation times. Temperature/time recording devices shall... minimum initial temperatures and operating procedures for thermal processing equipment, shall be posted in... available to the thermal processing system operator and the inspector. (b) Process indicators and retort...

  19. 9 CFR 318.304 - Operations in the thermal processing area.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... factor over the specified thermal processing operation times. Temperature/time recording devices shall... minimum initial temperatures and operating procedures for thermal processing equipment, shall be posted in... available to the thermal processing system operator and the inspector. (b) Process indicators and retort...

  20. 20 CFR 404.926 - Agreement in expedited appeals process.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... DISABILITY INSURANCE (1950- ) Determinations, Administrative Review Process, and Reopening of Determinations and Decisions Expedited Appeals Process § 404.926 Agreement in expedited appeals process. If you meet... 20 Employees' Benefits 2 2010-04-01 2010-04-01 false Agreement in expedited appeals process. 404...

  1. 40 CFR 409.10 - Applicability; description of the beet sugar processing subcategory.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... sugar processing subcategory. 409.10 Section 409.10 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) EFFLUENT GUIDELINES AND STANDARDS SUGAR PROCESSING POINT SOURCE CATEGORY Beet Sugar Processing Subcategory § 409.10 Applicability; description of the beet sugar processing subcategory. The...

  2. 40 CFR 409.10 - Applicability; description of the beet sugar processing subcategory.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... sugar processing subcategory. 409.10 Section 409.10 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) EFFLUENT GUIDELINES AND STANDARDS SUGAR PROCESSING POINT SOURCE CATEGORY Beet Sugar Processing Subcategory § 409.10 Applicability; description of the beet sugar processing subcategory. The...

  3. 40 CFR 409.10 - Applicability; description of the beet sugar processing subcategory.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... sugar processing subcategory. 409.10 Section 409.10 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) EFFLUENT GUIDELINES AND STANDARDS SUGAR PROCESSING POINT SOURCE CATEGORY Beet Sugar Processing Subcategory § 409.10 Applicability; description of the beet sugar processing subcategory. The...

  4. 75 FR 16388 - Approval and Promulgation of Implementation Plans; Commonwealth of Kentucky: Prevention of...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-04-01

    ... ``chemical process plants'' that produce ethanol through a natural fermentation process (hereafter referred... for excluding ``chemical process plants'' that produce ethanol through a natural fermentation process... facilities that produce ethanol by natural fermentation processes. Kentucky's February 5, 2010, SIP...

  5. Process for selecting engineering tools : applied to selecting a SysML tool.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    De Spain, Mark J.; Post, Debra S.; Taylor, Jeffrey L.

    2011-02-01

    Process for Selecting Engineering Tools outlines the process and tools used to select a SysML (Systems Modeling Language) tool. The process is general in nature, and users can apply it to select most engineering tools and software applications.

  6. Process of discharging charge-build up in slag steelmaking processes

    DOEpatents

    Pal, Uday B.; Gazula, Gopala K. M.; Hasham, Ali

    1994-01-01

    A process and apparatus for improving metal production in ironmaking and steelmaking processes are disclosed. The use of an inert metallic conductor in the slag-containing crucible and the addition of a transition metal oxide to the slag are the disclosed process improvements.

  7. 40 CFR 408.170 - Applicability; description of the Alaskan mechanized salmon processing subcategory.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... Alaskan mechanized salmon processing subcategory. 408.170 Section 408.170 Protection of Environment... PROCESSING POINT SOURCE CATEGORY Alaskan Mechanized Salmon Processing Subcategory § 408.170 Applicability; description of the Alaskan mechanized salmon processing subcategory. The provisions of this subpart are...

  8. 40 CFR 408.170 - Applicability; description of the Alaskan mechanized salmon processing subcategory.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... Alaskan mechanized salmon processing subcategory. 408.170 Section 408.170 Protection of Environment... PROCESSING POINT SOURCE CATEGORY Alaskan Mechanized Salmon Processing Subcategory § 408.170 Applicability; description of the Alaskan mechanized salmon processing subcategory. The provisions of this subpart are...

  9. Business Process Modeling: Perceived Benefits

    NASA Astrophysics Data System (ADS)

    Indulska, Marta; Green, Peter; Recker, Jan; Rosemann, Michael

    The process-centered design of organizations and information systems is globally seen as an appropriate response to the increased economic pressure on organizations. At the methodological core of process-centered management is process modeling. However, business process modeling in large initiatives can be a time-consuming and costly exercise, making it potentially difficult to convince executive management of its benefits. To date, and despite substantial interest and research in the area of process modeling, the understanding of the actual benefits of process modeling in academia and practice is limited. To address this gap, this paper explores the perception of benefits derived from process modeling initiatives, as reported through a global Delphi study. The study incorporates the views of three groups of stakeholders - academics, practitioners and vendors. Our findings lead to the first identification and ranking of 19 unique benefits associated with process modeling. The study in particular found that process modeling benefits vary significantly between practitioners and academics. We argue that the variations may point to a disconnect between research projects and practical demands.

  10. Off-target model based OPC

    NASA Astrophysics Data System (ADS)

    Lu, Mark; Liang, Curtis; King, Dion; Melvin, Lawrence S., III

    2005-11-01

    Model-based Optical Proximity correction has become an indispensable tool for achieving wafer-pattern-to-design fidelity at current manufacturing process nodes. Most model-based OPC is performed considering the nominal process condition, with limited consideration of through-process manufacturing robustness. This study examines the use of off-target process models - models that represent non-nominal process states such as would occur with a dose or focus variation - to understand and manipulate the final pattern correction into a more process-robust configuration. The study will first examine and validate the process of generating an off-target model, then examine the quality of the off-target model. Once the off-target model is proven, it will be used to demonstrate methods of generating process-robust corrections. The concepts are demonstrated using a 0.13 μm logic gate process. Preliminary indications show success in both off-target model production and process-robust corrections. With these off-target models as tools, mask production cycle times can be reduced.
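
    A toy sketch of the robustness idea, assuming simple stand-in linear models for the nominal, defocus, and dose-skewed process states; real calibrated OPC models are far more complex, and all numbers below are invented.

```python
"""Toy 'process-robust correction' selection: score each candidate mask bias
against a nominal model and two off-target models (defocus, dose skew) and
keep the bias with the smallest worst-case edge placement error (EPE).
The linear model functions are stand-ins, not calibrated OPC models."""

TARGET_CD = 130.0   # nm, nominal gate CD (assumed)

def printed_cd(bias_nm, model):
    """Hypothetical printed CD as a simple linear response to mask bias."""
    gain = {"nominal": 1.0, "defocus": 0.85, "dose_minus": 0.92}[model]
    offset = {"nominal": 0.0, "defocus": -6.0, "dose_minus": -3.0}[model]
    return TARGET_CD + gain * bias_nm + offset

candidates = [b / 2.0 for b in range(-10, 21)]      # candidate mask biases, nm

def worst_case_epe(bias):
    return max(abs(printed_cd(bias, m) - TARGET_CD)
               for m in ("nominal", "defocus", "dose_minus"))

best = min(candidates, key=worst_case_epe)
print(f"most robust bias = {best:+.1f} nm, "
      f"worst-case EPE = {worst_case_epe(best):.2f} nm")
```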

  11. Burst of virus infection and a possibly largest epidemic threshold of non-Markovian susceptible-infected-susceptible processes on networks

    NASA Astrophysics Data System (ADS)

    Liu, Qiang; Van Mieghem, Piet

    2018-02-01

    Since a real epidemic process is not necessarily Markovian, the epidemic threshold obtained under the Markovian assumption may not be realistic. To understand general non-Markovian epidemic processes on networks, we study the Weibullian susceptible-infected-susceptible (SIS) process in which the infection process is a renewal process with a Weibull time distribution. We find that, if the infection rate exceeds 1/ln(λ1 + 1), where λ1 is the largest eigenvalue of the network's adjacency matrix, then the infection will persist on the network under the mean-field approximation. Thus, 1/ln(λ1 + 1) is possibly the largest epidemic threshold for a general non-Markovian SIS process with a Poisson curing process under the mean-field approximation. Furthermore, non-Markovian SIS processes may result in a multimodal prevalence. As a byproduct, we show that a limiting Weibullian SIS process has the potential to model bursts of a synchronized infection.
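
    A short sketch comparing the Markovian mean-field threshold 1/λ1 with the 1/ln(λ1 + 1) bound quoted above, on an arbitrarily chosen small graph.

```python
"""Compare the Markovian SIS mean-field threshold 1/lambda_1 with the
bound 1/ln(lambda_1 + 1) discussed for the Weibullian (non-Markovian) SIS
process, on a small example graph (the graph choice is illustrative)."""
import numpy as np

# Adjacency matrix of a 5-node ring graph (symmetric, unweighted).
n = 5
A = np.zeros((n, n))
for i in range(n):
    A[i, (i + 1) % n] = A[(i + 1) % n, i] = 1

lambda_1 = np.max(np.linalg.eigvalsh(A))      # largest adjacency eigenvalue
markovian_threshold = 1.0 / lambda_1
non_markovian_bound = 1.0 / np.log(lambda_1 + 1.0)

print(f"lambda_1 = {lambda_1:.3f}")
print(f"Markovian mean-field threshold 1/lambda_1 = {markovian_threshold:.3f}")
print(f"Bound 1/ln(lambda_1 + 1) from the abstract = {non_markovian_bound:.3f}")
```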

  12. How does processing affect storage in working memory tasks? Evidence for both domain-general and domain-specific effects.

    PubMed

    Jarrold, Christopher; Tam, Helen; Baddeley, Alan D; Harvey, Caroline E

    2011-05-01

    Two studies that examine whether the forgetting caused by the processing demands of working memory tasks is domain-general or domain-specific are presented. In each, separate groups of adult participants were asked to carry out either verbal or nonverbal operations on exactly the same processing materials while maintaining verbal storage items. The imposition of verbal processing tended to produce greater forgetting even though verbal processing operations took no longer to complete than did nonverbal processing operations. However, nonverbal processing did cause forgetting relative to baseline control conditions, and evidence from the timing of individuals' processing responses suggests that individuals in both processing groups slowed their responses in order to "refresh" the memoranda. Taken together the data suggest that processing has a domain-general effect on working memory performance by impeding refreshment of memoranda but can also cause effects that appear domain-specific and that result from either blocking of rehearsal or interference.

  13. Semantic and self-referential processing of positive and negative trait adjectives in older adults

    PubMed Central

    Glisky, Elizabeth L.; Marquine, Maria J.

    2008-01-01

    The beneficial effects of self-referential processing on memory have been demonstrated in numerous experiments with younger adults but have rarely been studied in older individuals. In the present study we tested young people, younger-older adults, and older-older adults in a self-reference paradigm, and compared self-referential processing to general semantic processing. Findings indicated that older adults over the age of 75 and those with below-average episodic memory function showed a decreased benefit from both semantic and self-referential processing relative to a structural baseline condition. However, these effects appeared to be confined to the shared semantic processes for the two conditions, leaving the added advantage for self-referential processing unaffected. These results suggest that reference to the self engages qualitatively different processes compared to general semantic processing. These processes seem relatively impervious to age and to declining memory and executive function, suggesting that they might provide a particularly useful way for older adults to improve their memories. PMID:18608973

  14. The time course of attentional modulation on emotional conflict processing.

    PubMed

    Zhou, Pingyan; Yang, Guochun; Nan, Weizhi; Liu, Xun

    2016-01-01

    Cognitive conflict resolution is critical to human survival in a rapidly changing environment. However, emotional conflict processing seems to be particularly important for human interactions. This study examined whether the time course of attentional modulation on emotional conflict processing was different from cognitive conflict processing during a flanker task. Results showed that emotional N200 and P300 effects, similar to colour conflict processing, appeared only during the relevant task. However, the emotional N200 effect preceded the colour N200 effect, indicating that emotional conflict can be identified earlier than cognitive conflict. Additionally, a significant emotional N100 effect revealed that emotional valence differences could be perceived during early processing based on rough aspects of input. The present data suggest that emotional conflict processing is modulated by top-down attention, similar to cognitive conflict processing (reflected by N200 and P300 effects). However, emotional conflict processing seems to have more time advantages during two different processing stages.

  15. Seeing the forest for the trees: Networked workstations as a parallel processing computer

    NASA Technical Reports Server (NTRS)

    Breen, J. O.; Meleedy, D. M.

    1992-01-01

    Unlike traditional 'serial' processing computers, in which one central processing unit performs one instruction at a time, parallel processing computers contain several processing units, thereby performing several instructions at once. Many of today's fastest supercomputers achieve their speed by employing thousands of processing elements working in parallel. Few institutions can afford these state-of-the-art parallel processors, but many already have the makings of a modest parallel processing system. Workstations on existing high-speed networks can be harnessed as nodes in a parallel processing environment, bringing the benefits of parallel processing to many. While such a system cannot rival the industry's latest machines, many common tasks can be accelerated greatly by spreading the processing burden and exploiting idle network resources. We study several aspects of this approach, from algorithms to select nodes to speed gains in specific tasks. With ever-increasing volumes of astronomical data, it becomes all the more necessary to utilize our computing resources fully.
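
    A minimal sketch of the task-farming idea on a single machine using a local process pool; extending it to networked workstations, as in the paper, would additionally require a transport layer (for example sockets or SSH) that is not shown here.

```python
"""Minimal 'spread the processing burden' sketch using a local process pool.
Only the task-farming idea is shown; the per-frame workload is a stand-in."""
from multiprocessing import Pool

def reduce_frame(frame_id: int) -> int:
    """Stand-in for an expensive per-frame reduction step."""
    return sum(i * i for i in range(100_000 + frame_id)) % 97

if __name__ == "__main__":
    frames = list(range(16))
    with Pool(processes=4) as pool:          # 4 local worker processes
        results = pool.map(reduce_frame, frames)
    print(dict(zip(frames, results)))
```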

  16. Design and application of process control charting methodologies to gamma irradiation practices

    NASA Astrophysics Data System (ADS)

    Saylor, M. C.; Connaghan, J. P.; Yeadon, S. C.; Herring, C. M.; Jordan, T. M.

    2002-12-01

    The relationship between the contract irradiation facility and the customer has historically been based upon a "PASS/FAIL" approach, with little or no quality metrics used to gauge the control of the irradiation process. Application of process control charts, designed in coordination with mathematical simulation of routine radiation processing, can provide a basis for understanding irradiation events. By using tools that simulate the physical rules associated with the irradiation process, end-users can explore process-related boundaries and the effects of process changes. Consequently, the relationship between contractor and customer can evolve based on the derived knowledge. The resulting level of mutual understanding of the irradiation process and its resultant control benefits both the customer and the contract operation, and provides necessary assurances to regulators. In this article we examine the complementary nature of theoretical (point kernel) and experimental (dosimetric) process evaluation, and the resulting by-product of improved understanding, communication and control generated through the implementation of effective process control charting strategies.
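
    A hedged sketch of the textbook point-kernel flux estimate mentioned above; the source strength, attenuation coefficient, and buildup factor are placeholders, not values from the article's simulation.

```python
"""Textbook point-kernel estimate of gamma flux at distance r from an
isotropic point source: phi = B * S * exp(-mu * r) / (4 * pi * r**2).
The source strength, attenuation coefficient and buildup factor below are
placeholder values, not parameters of the tool described in the article."""
import math

S = 3.7e10        # source strength, photons/s (assumed, ~1 Ci equivalent)
mu = 0.006        # linear attenuation coefficient in air, 1/cm (placeholder)
B = 1.2           # buildup factor for scattered photons (placeholder)

for r_cm in (50.0, 100.0, 200.0):
    phi = B * S * math.exp(-mu * r_cm) / (4.0 * math.pi * r_cm ** 2)
    print(f"r = {r_cm:5.0f} cm  ->  flux ~ {phi:.3e} photons/(cm^2*s)")
```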

  17. Launch Site Computer Simulation and its Application to Processes

    NASA Technical Reports Server (NTRS)

    Sham, Michael D.

    1995-01-01

    This paper provides an overview of computer simulation, the Lockheed developed STS Processing Model, and the application of computer simulation to a wide range of processes. The STS Processing Model is an icon driven model that uses commercial off the shelf software and a Macintosh personal computer. While it usually takes one year to process and launch 8 space shuttles, with the STS Processing Model this process is computer simulated in about 5 minutes. Facilities, orbiters, or ground support equipment can be added or deleted and the impact on launch rate, facility utilization, or other factors measured as desired. This same computer simulation technology can be used to simulate manufacturing, engineering, commercial, or business processes. The technology does not require an 'army' of software engineers to develop and operate, but instead can be used by the layman with only a minimal amount of training. Instead of making changes to a process and realizing the results after the fact, with computer simulation, changes can be made and processes perfected before they are implemented.

  18. Is the processing of affective prosody influenced by spatial attention? an ERP study

    PubMed Central

    2013-01-01

    Background The present study asked whether the processing of affective prosody is modulated by spatial attention. Pseudo-words with a neutral, happy, threatening, and fearful prosody were presented at two spatial positions. Participants attended to one position in order to detect infrequent targets. Emotional prosody was task irrelevant. The electro-encephalogram (EEG) was recorded to assess processing differences as a function of spatial attention and emotional valence. Results Event-related potentials (ERPs) differed as a function of emotional prosody both when attended and when unattended. While emotional prosody effects interacted with effects of spatial attention at early processing levels (< 200 ms), these effects were additive at later processing stages (> 200 ms). Conclusions Emotional prosody, therefore, seems to be partially processed outside the focus of spatial attention. Whereas at early sensory processing stages spatial attention modulates the degree of emotional voice processing as a function of emotional valence, emotional prosody is processed outside of the focus of spatial attention at later processing stages. PMID:23360491

  19. Becoming a Lunari or Taiyo expert: learned attention to parts drives holistic processing of faces.

    PubMed

    Chua, Kao-Wei; Richler, Jennifer J; Gauthier, Isabel

    2014-06-01

    Faces are processed holistically, but the locus of holistic processing remains unclear. We created two novel races of faces (Lunaris and Taiyos) to study how experience with face parts influences holistic processing. In Experiment 1, subjects individuated Lunaris wherein the top, bottom, or both face halves contained diagnostic information. Subjects who learned to attend to face parts exhibited no holistic processing. This suggests that individuation only leads to holistic processing when the whole face is attended. In Experiment 2, subjects individuated both Lunaris and Taiyos, with diagnostic information in complementary face halves of the two races. Holistic processing was measured with composites made of either diagnostic or nondiagnostic face parts. Holistic processing was only observed for composites made from diagnostic face parts, demonstrating that holistic processing can occur for diagnostic face parts that were never seen together. These results suggest that holistic processing is an expression of learned attention to diagnostic face parts. PsycINFO Database Record (c) 2014 APA, all rights reserved.

  20. Applications of colored petri net and genetic algorithms to cluster tool scheduling

    NASA Astrophysics Data System (ADS)

    Liu, Tung-Kuan; Kuo, Chih-Jen; Hsiao, Yung-Chin; Tsai, Jinn-Tsong; Chou, Jyh-Horng

    2005-12-01

    In this paper, we propose a method that uses a Coloured Petri Net (CPN) and a genetic algorithm (GA) to obtain an optimal deadlock-free schedule and to solve the re-entrant problem for the flexible process of the cluster tool. The process of the cluster tool for producing a wafer can usually be classified into three types: 1) sequential process, 2) parallel process, and 3) sequential-parallel process. However, these processes are not economical enough to produce a variety of wafers in small volumes. Therefore, this paper proposes the flexible process, in which the operations for fabricating wafers can be arranged arbitrarily to achieve the best utilization of the cluster tool. The flexible process may, however, have deadlock and re-entrant problems, which can be detected by the CPN. On the other hand, GAs have been applied to find optimal schedules for many types of manufacturing processes. Therefore, we successfully integrate the CPN and a GA to obtain an optimal schedule free of deadlock and re-entrant problems for the flexible process of the cluster tool.
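
    As an illustration of the genetic-algorithm half of such an approach, the toy sketch below evolves an operation sequence with order crossover and swap mutation against a simple surrogate objective; the operation times are hypothetical, and the CPN-based deadlock and re-entrancy checks described in the paper are not reproduced:

        # Toy permutation GA for sequencing wafer operations -- illustrative sketch only.
        import random

        random.seed(0)
        PROC_TIME = [4, 2, 5, 3, 6, 2]          # hypothetical operation times on a shared module

        def total_flow_time(order):
            # Surrogate objective: sum of completion times of the sequenced operations.
            t, total = 0, 0
            for op in order:
                t += PROC_TIME[op]
                total += t
            return total

        def crossover(p1, p2):
            # Order crossover (OX): keep a slice of p1, fill the rest in p2's order.
            a, b = sorted(random.sample(range(len(p1)), 2))
            child = [-1] * len(p1)
            child[a:b] = p1[a:b]
            rest = [g for g in p2 if g not in child]
            for i in range(len(child)):
                if child[i] == -1:
                    child[i] = rest.pop(0)
            return child

        pop = [random.sample(range(len(PROC_TIME)), len(PROC_TIME)) for _ in range(30)]
        for _ in range(100):
            pop.sort(key=total_flow_time)
            parents = pop[:10]
            children = [crossover(*random.sample(parents, 2)) for _ in range(20)]
            for c in children:                   # swap mutation
                if random.random() < 0.2:
                    i, j = random.sample(range(len(c)), 2)
                    c[i], c[j] = c[j], c[i]
            pop = parents + children

        best = min(pop, key=total_flow_time)
        print("best sequence:", best, "objective:", total_flow_time(best))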

  1. Extending BPM Environments of Your Choice with Performance Related Decision Support

    NASA Astrophysics Data System (ADS)

    Fritzsche, Mathias; Picht, Michael; Gilani, Wasif; Spence, Ivor; Brown, John; Kilpatrick, Peter

    What-if simulations have been identified as one solution for business performance-related decision support. Such support is especially useful in cases where it can be generated automatically out of Business Process Management (BPM) Environments from the existing business process models and from performance parameters monitored from the executed business process instances. Currently, some of the available BPM Environments offer basic-level performance prediction capabilities. However, these functionalities are normally too limited to be generally useful for performance-related decision support at the business process level. In this paper, an approach is presented which allows the non-intrusive integration of sophisticated tooling for what-if simulations, analytic performance prediction tools, process optimizations, or a combination of such solutions into already existing BPM environments. The approach abstracts from specific process modelling techniques, which enables automatic decision support spanning processes across numerous BPM Environments. For instance, this enables end-to-end decision support for composite processes modelled with the Business Process Modelling Notation (BPMN) on top of existing Enterprise Resource Planning (ERP) processes modelled with proprietary languages.

  2. Improving the process of process modelling by the use of domain process patterns

    NASA Astrophysics Data System (ADS)

    Koschmider, Agnes; Reijers, Hajo A.

    2015-01-01

    The use of business process models has become prevalent in a wide area of enterprise applications. But while their popularity is expanding, concerns are growing with respect to their proper creation and maintenance. An obvious way to boost the efficiency of creating high-quality business process models would be to reuse relevant parts of existing models. At this point, however, limited support exists to guide process modellers towards the usage of appropriate model content. In this paper, a set of content-oriented patterns is presented, which is extracted from a large set of process models from the order management and manufacturing production domains. The patterns are derived using a newly proposed set of algorithms, which are discussed in this paper. The authors demonstrate how such Domain Process Patterns, in combination with information on their historic usage, can support process modellers in generating new models. To support the wider dissemination and development of Domain Process Patterns within and beyond the studied domains, an accompanying website has been set up.

  3. Visual Motion Perception and Visual Attentive Processes.

    DTIC Science & Technology

    1988-04-01

    88-0551. Visual Motion Perception and Visual Attentive Processes. George Sperling, New York University. Grant AFOSR 85-0364. ... Sperling, G. HIPS: A Unix-based image processing system. Computer Vision, Graphics, and Image Processing, 1984, 25, 331-347. (HIPS is the Human Information Processing Laboratory's Image Processing System.) van Santen, Jan P. H., and George Sperling (1985). Elaborated Reichardt detectors. Journal of the Optical...

  4. Some functional limit theorems for compound Cox processes

    NASA Astrophysics Data System (ADS)

    Korolev, Victor Yu.; Chertok, A. V.; Korchagin, A. Yu.; Kossova, E. V.; Zeifman, Alexander I.

    2016-06-01

    An improved version of the functional limit theorem is proved, establishing weak convergence of random walks generated by compound doubly stochastic Poisson processes (compound Cox processes) to Lévy processes in the Skorokhod space under more realistic moment conditions. As corollaries, theorems are proved on convergence of random walks with jumps having finite variances to Lévy processes with variance-mean mixed normal distributions, in particular, to stable Lévy processes.
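
    For reference, a compound Cox process has the standard form shown below (a textbook definition, not quoted from the paper); the theorem concerns the weak convergence of such sums, suitably centred and normalised, in the Skorokhod space:

        % Standard form of a compound Cox process (textbook definition, not quoted from the paper):
        \[
          S(t) \;=\; \sum_{i=1}^{N_1(\Lambda(t))} X_i ,
        \]
        % where N_1 is a standard Poisson process, \Lambda is a nondecreasing, right-continuous
        % random process with \Lambda(0) = 0 (the directing process) independent of N_1, and the
        % jumps X_1, X_2, \ldots are i.i.d. and independent of the doubly stochastic Poisson
        % process N_1(\Lambda(t)).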

  5. Some functional limit theorems for compound Cox processes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Korolev, Victor Yu.; Institute of Informatics Problems FRC CSC RAS; Chertok, A. V.

    2016-06-08

    An improved version of the functional limit theorem is proved, establishing weak convergence of random walks generated by compound doubly stochastic Poisson processes (compound Cox processes) to Lévy processes in the Skorokhod space under more realistic moment conditions. As corollaries, theorems are proved on convergence of random walks with jumps having finite variances to Lévy processes with variance-mean mixed normal distributions, in particular, to stable Lévy processes.

  6. 37 CFR 205.12 - Process served on the Register of Copyrights or an employee in his or her official capacity.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... PROCESSES Service of Process § 205.12 Process served on the Register of Copyrights or an employee in his or... and mode of service. (d) The Office will accept service of process for an employee only when the legal... procedure. Service of process in this case is inadequate when made only on the General Counsel. An employee...

  7. GOCI Level-2 Processing Improvements and Cloud Motion Analysis

    NASA Technical Reports Server (NTRS)

    Robinson, Wayne D.

    2015-01-01

    The Ocean Biology Processing Group has been working with the Korea Institute of Ocean Science and Technology (KIOST) to process geosynchronous ocean color data from GOCI (the Geostationary Ocean Color Imager) aboard COMS (the Communications, Ocean and Meteorological Satellite). The level-2 processing program, l2gen, has GOCI processing as an option. Improvements made to that processing are discussed here, along with a discussion of cloud-motion effects.

  8. CrossTalk. The Journal of Defense Software Engineering. Volume 25, Number 3

    DTIC Science & Technology

    2012-06-01

    OMG) standard Business Process Modeling and Notation (BPMN) [6] graphical notation. I will address each of these: identify and document steps...to a value stream map using BPMN and textual process narratives. The resulting process narratives or process metadata includes key information...objectives. Once the processes are identified we can graphically document them, capturing the process using BPMN (see Figure 1). The BPMN models

  9. Evidence for the contribution of a threshold retrieval process to semantic memory.

    PubMed

    Kempnich, Maria; Urquhart, Josephine A; O'Connor, Akira R; Moulin, Chris J A

    2017-10-01

    It is widely held that episodic retrieval can recruit two processes: a threshold context retrieval process (recollection) and a continuous signal strength process (familiarity). Conversely the processes recruited during semantic retrieval are less well specified. We developed a semantic task analogous to single-item episodic recognition to interrogate semantic recognition receiver-operating characteristics (ROCs) for a marker of a threshold retrieval process. We fitted observed ROC points to three signal detection models: two models typically used in episodic recognition (unequal variance and dual-process signal detection models) and a novel dual-process recollect-to-reject (DP-RR) signal detection model that allows a threshold recollection process to aid both target identification and lure rejection. Given the nature of most semantic questions, we anticipated the DP-RR model would best fit the semantic task data. Experiment 1 (506 participants) provided evidence for a threshold retrieval process in semantic memory, with overall best fits to the DP-RR model. Experiment 2 (316 participants) found within-subjects estimates of episodic and semantic threshold retrieval to be uncorrelated. Our findings add weight to the proposal that semantic and episodic memory are served by similar dual-process retrieval systems, though the relationship between the two threshold processes needs to be more fully elucidated.
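
    For orientation, the two-process baseline mentioned above is commonly written with the standard dual-process signal detection equations shown below; the paper's DP-RR variant, in which recollection also supports lure rejection, is not reproduced here:

        % Standard dual-process signal detection (DPSD) ROC equations (background only):
        \begin{align*}
          P(\text{``old''} \mid \text{target}) &= R + (1 - R)\,\Phi(d' - c_i),\\
          P(\text{``old''} \mid \text{lure})   &= \Phi(-c_i),
        \end{align*}
        % where R is the probability of threshold recollection, d' the mean familiarity strength
        % of targets, \Phi the standard normal CDF, and c_i the response criterion that is swept
        % across confidence levels to trace out the ROC.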

  10. The Processing of Somatosensory Information Shifts from an Early Parallel into a Serial Processing Mode: A Combined fMRI/MEG Study.

    PubMed

    Klingner, Carsten M; Brodoehl, Stefan; Huonker, Ralph; Witte, Otto W

    2016-01-01

    The question regarding whether somatosensory inputs are processed in parallel or in series has not been clearly answered. Several studies that have applied dynamic causal modeling (DCM) to fMRI data have arrived at seemingly divergent conclusions. However, these divergent results could be explained by the hypothesis that the processing route of somatosensory information changes with time. Specifically, we suggest that somatosensory stimuli are processed in parallel only during the early stage, whereas the processing is later dominated by serial processing. This hypothesis was revisited in the present study based on fMRI analyses of tactile stimuli and the application of DCM to magnetoencephalographic (MEG) data collected during sustained (260 ms) tactile stimulation. Bayesian model comparisons were used to infer the processing stream. We demonstrated that the favored processing stream changes over time. We found that the neural activity elicited in the first 100 ms following somatosensory stimuli is best explained by models that support a parallel processing route, whereas a serial processing route is subsequently favored. These results suggest that the secondary somatosensory area (SII) receives information regarding a new stimulus in parallel with the primary somatosensory area (SI), whereas later processing in the SII is dominated by the preprocessed input from the SI.
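
    Bayesian model comparison of this kind typically converts approximate log model evidences (e.g., variational free energies from DCM) into posterior model probabilities; the minimal sketch below uses hypothetical numbers, not values from the study:

        # Fixed-effects Bayesian model comparison from approximate log evidences -- sketch only,
        # with hypothetical model names and numbers.
        import math

        log_evidence = {"parallel": -1240.3, "serial": -1235.1, "serial_SI_to_SII": -1233.8}

        m = max(log_evidence.values())
        unnorm = {k: math.exp(v - m) for k, v in log_evidence.items()}   # subtract max for stability
        z = sum(unnorm.values())
        posterior = {k: v / z for k, v in unnorm.items()}                # uniform model priors assumed

        for model, p in sorted(posterior.items(), key=lambda kv: -kv[1]):
            print(f"{model}: posterior probability {p:.3f}")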

  11. Selection of Sustainable Processes using Sustainability ...

    EPA Pesticide Factsheets

    Chemical products can be obtained by process pathways involving varying amounts and types of resources, utilities, and byproduct formation. When competing process options are available, such as the six processes for making methanol considered in this study, it is necessary to identify the most sustainable option. Sustainability of a chemical process is generally evaluated with indicators that require process and chemical property data. These indicators individually reflect the impacts of the process on areas of sustainability, such as the environment or society. In order to choose among several alternative processes, an overall comparative analysis is essential. Generally, net profit will show the most economic process. A mixed integer optimization problem can also be solved to identify the most economic among competing processes. This method uses economic optimization and leaves aside the environmental and societal impacts. To make a decision on the most sustainable process, the method presented here rationally aggregates the sustainability indicators into a single index called the sustainability footprint (De). Process flow and economic data were used to compute the indicator values. Results from the sustainability footprint (De) are compared with those from solving a mixed integer optimization problem. In order to identify the rank order of importance of the indicators, a multivariate analysis is performed using partial least squares variable importance in projection (PLS-VIP).
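
    The paper's sustainability footprint (De) and PLS-VIP weighting are not reproduced here, but the sketch below illustrates the general idea of aggregating normalized indicators into a single comparative index; the process names, indicator values and weights are hypothetical:

        # Generic aggregation of normalized sustainability indicators into a single index -- sketch only.
        indicators = {            # hypothetical per-process indicator values, already scaled to [0, 1]
            "process_A": {"energy": 0.7, "emissions": 0.4, "profit": 0.8, "safety": 0.6},
            "process_B": {"energy": 0.5, "emissions": 0.6, "profit": 0.9, "safety": 0.5},
        }
        weights = {"energy": 0.3, "emissions": 0.3, "profit": 0.2, "safety": 0.2}  # hypothetical weights

        def footprint(vals):
            # Weighted Euclidean distance from the ideal point (1.0 on every indicator);
            # a smaller footprint means a more sustainable process in this toy scheme.
            return sum(weights[k] * (1.0 - v) ** 2 for k, v in vals.items()) ** 0.5

        for name in sorted(indicators, key=lambda p: footprint(indicators[p])):
            print(name, round(footprint(indicators[name]), 3))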

  12. Improvement of hospital processes through business process management in Qaem Teaching Hospital: A work in progress.

    PubMed

    Yarmohammadian, Mohammad H; Ebrahimipour, Hossein; Doosty, Farzaneh

    2014-01-01

    In a world of continuously changing business environments, organizations have no option but to deal with a high level of transformation in order to adjust to the consequential demands. Therefore, many companies need to continually improve and review their processes to maintain their competitive advantages in an uncertain environment. Meeting these challenges requires implementing the most efficient possible business processes, geared to the needs of the industry and market segments that the organization serves globally. In the last 10 years, total quality management, business process reengineering, and business process management (BPM) have been some of the management tools applied by organizations to increase business competitiveness. This paper is an original article that presents the implementation of the BPM approach in the healthcare domain, which allows an organization to improve and review its critical business processes. This project was performed in Qaem Teaching Hospital in Mashhad city, Iran and consists of four distinct steps: (1) identify business processes, (2) document the process, (3) analyze and measure the process, and (4) improve the process. Implementing BPM in Qaem Teaching Hospital changed the nature of management by allowing the organization to avoid the complexity of disparate, siloed systems. BPM instead enabled the organization to focus on business processes at a higher level.

  13. The Processing of Somatosensory Information Shifts from an Early Parallel into a Serial Processing Mode: A Combined fMRI/MEG Study

    PubMed Central

    Klingner, Carsten M.; Brodoehl, Stefan; Huonker, Ralph; Witte, Otto W.

    2016-01-01

    The question regarding whether somatosensory inputs are processed in parallel or in series has not been clearly answered. Several studies that have applied dynamic causal modeling (DCM) to fMRI data have arrived at seemingly divergent conclusions. However, these divergent results could be explained by the hypothesis that the processing route of somatosensory information changes with time. Specifically, we suggest that somatosensory stimuli are processed in parallel only during the early stage, whereas the processing is later dominated by serial processing. This hypothesis was revisited in the present study based on fMRI analyses of tactile stimuli and the application of DCM to magnetoencephalographic (MEG) data collected during sustained (260 ms) tactile stimulation. Bayesian model comparisons were used to infer the processing stream. We demonstrated that the favored processing stream changes over time. We found that the neural activity elicited in the first 100 ms following somatosensory stimuli is best explained by models that support a parallel processing route, whereas a serial processing route is subsequently favored. These results suggest that the secondary somatosensory area (SII) receives information regarding a new stimulus in parallel with the primary somatosensory area (SI), whereas later processing in the SII is dominated by the preprocessed input from the SI. PMID:28066197

  14. A quality by design study applied to an industrial pharmaceutical fluid bed granulation.

    PubMed

    Lourenço, Vera; Lochmann, Dirk; Reich, Gabriele; Menezes, José C; Herdling, Thorsten; Schewitz, Jens

    2012-06-01

    The pharmaceutical industry is encouraged within Quality by Design (QbD) to apply science-based manufacturing principles to assure quality not only of new but also of existing processes. This paper presents how QbD principles can be applied to an existing industrial pharmaceutical fluid bed granulation (FBG) process. A three-step approach is presented as follows: (1) implementation of Process Analytical Technology (PAT) monitoring tools at the industrial-scale process, combined with multivariate data analysis (MVDA) of process and PAT data, to increase process knowledge; (2) execution of scaled-down designed experiments at pilot scale, with adequate PAT monitoring tools, to investigate the process response to intended changes in Critical Process Parameters (CPPs); and finally (3) the definition of a process Design Space (DS) linking CPPs to Critical Quality Attributes (CQAs), within which product quality is ensured by design and which, after scale-up, can be used at the industrial process scale. The proposed approach was developed for an existing industrial process. Through the enhanced process knowledge established, a significant reduction in the variability of product CQAs, already within quality specification ranges, was achieved by a better choice of CPP values. The results of such step-wise development and implementation are described. Copyright © 2012 Elsevier B.V. All rights reserved.
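
    The kind of multivariate link between CPPs and CQAs described in steps (1)-(3) can be sketched with partial least squares regression; the example below uses scikit-learn and synthetic data, with hypothetical variable names, and is not the authors' model:

        # Linking CPPs to CQAs with PLS regression -- illustrative sketch on synthetic data.
        import numpy as np
        from sklearn.cross_decomposition import PLSRegression

        rng = np.random.default_rng(0)
        n_batches = 40
        X = rng.normal(size=(n_batches, 3))          # CPPs: e.g. inlet air temp, spray rate, airflow
        true_coef = np.array([[0.8, -0.2], [0.1, 0.6], [-0.4, 0.3]])
        Y = X @ true_coef + 0.1 * rng.normal(size=(n_batches, 2))   # CQAs: e.g. granule size, moisture

        pls = PLSRegression(n_components=2)
        pls.fit(X, Y)
        print("explained CQA variance (R^2):", round(pls.score(X, Y), 3))
        print("predicted CQAs for a candidate CPP setting:", pls.predict([[0.5, -0.3, 0.1]]).round(3))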

  15. Potato processing scenario in India: Industrial constraints, future projections, challenges ahead and remedies - A review.

    PubMed

    Marwaha, R S; Pandey, S K; Kumar, Dinesh; Singh, S V; Kumar, Parveen

    2010-03-01

    Indian potato (Solanum tuberosum L.) processing industry has emerged fast due to economic liberalization coupled with growing urbanization, expanding market options and the development of indigenous processing varieties. India's first potato processing varieties 'Kufri Chipsona-1' and 'Kufri Chipsona-2' were developed in 1998, followed by an improved processing variety 'Kufri Chipsona-3' in 2005 for the Indian plains and the first chipping variety 'Kufri Himsona' for the hills. These varieties have >21% tuber dry matter content, contain low reducing sugars (<0.1% on fresh wt) and are most suitable for producing chips, French fries and dehydrated products. The availability of these varieties and the standardization of storage techniques for processing potatoes at 10-12°C with the sprout suppressant isopropyl N-(3-chlorophenyl) carbamate have revolutionized the processing scenario within a short span of 10 years. Currently about 4% of the total potato produce is processed in the organized and unorganized sectors. The potato processing industry mainly comprises four segments: potato chips, French fries, potato flakes/powder and other processed products. However, potato chips still continue to be the most popular processed product. The major challenge facing the industry lies in arranging a year-round supply of processing varieties at a reasonable price for uninterrupted operation, besides several others which are discussed at length and addressed with concrete solutions.

  16. Six sigma: process of understanding the control and capability of ranitidine hydrochloride tablet.

    PubMed

    Chabukswar, Ar; Jagdale, Sc; Kuchekar, Bs; Joshi, Vd; Deshmukh, Gr; Kothawade, Hs; Kuckekar, Ab; Lokhande, Pd

    2011-01-01

    The process of understanding the control and capability (PUCC) is an iterative closed-loop process for continuous improvement. It covers the DMAIC toolkit in its three phases. PUCC is an iterative approach that rotates between the three pillars of process understanding, process control, and process capability, with each iteration resulting in a more capable and robust process. It is rightly said that being at the top is a marathon and not a sprint. The objective of the six sigma study of Ranitidine hydrochloride tablets is to achieve perfection in tablet manufacturing by reviewing the present robust manufacturing process and finding ways to improve and modify it, which will yield tablets that are defect-free and give greater customer satisfaction. The application of six sigma led to an improved process capability, due to the improvement in the sigma level of the process from 1.5 to 4; a higher yield, due to reduced variation and a reduction of thick tablets; a reduction in packing-line stoppages; a reduction in re-work by 50%; a more standardized process with smooth flow and a change in the coating suspension reconstitution level (8% w/w); a cost reduction of approximately Rs. 90 to 95 lakhs per annum; an improvement in overall efficiency of approximately 30%; and improved overall quality of the product.
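
    The capability and sigma-level figures quoted above can be illustrated with a short sketch using the standard formulas for Cp, Cpk and the short-term approximation sigma level ≈ 3·Cpk; the specification limits and assay values below are hypothetical, not data from the study:

        # Process capability and sigma level from tablet assay data -- illustrative sketch only.
        import statistics

        LSL, USL = 95.0, 105.0                     # hypothetical specification limits (% label claim)
        assay = [99.2, 100.4, 98.7, 101.1, 99.8, 100.9, 99.5, 100.2, 98.9, 100.6]

        mu = statistics.mean(assay)
        sigma = statistics.stdev(assay)

        cp = (USL - LSL) / (6 * sigma)                             # potential capability
        cpk = min(USL - mu, mu - LSL) / (3 * sigma)                # actual capability
        sigma_level = 3 * cpk                                      # short-term sigma level

        print(f"Cp={cp:.2f}  Cpk={cpk:.2f}  sigma level~{sigma_level:.1f}")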

  17. Friction spinning - Twist phenomena and the capability of influencing them

    NASA Astrophysics Data System (ADS)

    Lossen, Benjamin; Homberg, Werner

    2016-10-01

    The friction spinning process can be classified among the incremental forming techniques. The process combines elements from both metal spinning and friction welding. The selective combination of process elements from these two processes results in the integration of friction sub-processes into a spinning process. This enables self-induced heat generation and with it the possibility of manufacturing functionally graded parts from tubes and sheets. Compared with conventional spinning processes, this in-process heat treatment permits the extension of existing forming limits and also the production of more complex geometries. Furthermore, the defined adjustment of part properties such as strength, grain size/orientation and surface conditions can be achieved through appropriate process parameter settings, and consequently by setting a specific temperature profile in combination with the degree of deformation. The results presented for tube forming start with an investigation into the twist phenomena that arise in flange processing. In this way, the influence of the main parameters, such as rotation speed, feed rate, forming paths and tool friction surface, and their effects on temperature, forces and, finally, the twist behavior are analyzed. Following this, the significant correlations with the parameters and a new process strategy are set out in order to show the possibility of achieving a defined grain texture orientation.

  18. Statistical process control: A feasibility study of the application of time-series measurement in early neurorehabilitation after acquired brain injury.

    PubMed

    Markovic, Gabriela; Schult, Marie-Louise; Bartfai, Aniko; Elg, Mattias

    2017-01-31

    Progress in early cognitive recovery after acquired brain injury is uneven and unpredictable, and thus the evaluation of rehabilitation is complex. The use of time-series measurements is susceptible to statistical change due to process variation. To evaluate the feasibility of using a time-series method, statistical process control, in early cognitive rehabilitation. Participants were 27 patients with acquired brain injury undergoing interdisciplinary rehabilitation of attention within 4 months post-injury. The outcome measure, the Paced Auditory Serial Addition Test, was analysed using statistical process control. Statistical process control identifies if and when change occurs in the process, according to three patterns: rapid, steady or stationary performers. The statistical process control method was adjusted, in terms of constructing the baseline and the total number of measurement points, in order to measure a process in change. Statistical process control methodology is feasible for use in early cognitive rehabilitation, since it provides information about change in a process, thus enabling adjustment of the individual treatment response. Together with the results indicating discernible subgroups that respond differently to rehabilitation, statistical process control could be a valid tool in clinical decision-making. This study is a starting point in understanding the rehabilitation process using a real-time-measurements approach.
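
    One simple way a statistical process control chart signals if and when change occurs is through run rules; the sketch below applies an eight-points-above-the-center-line rule to hypothetical repeated test scores, not the study's PASAT data:

        # Detecting when change occurs in a patient's repeated test scores -- illustrative run-rule sketch.
        baseline = [31, 33, 30, 32, 34, 31]                   # baseline-phase measurements (hypothetical)
        follow_up = [33, 35, 36, 34, 37, 38, 36, 39, 40, 41]  # measurements during rehabilitation

        center = sum(baseline) / len(baseline)                # center line from the baseline phase

        run, signal_at = 0, None
        for i, score in enumerate(follow_up, start=1):
            run = run + 1 if score > center else 0            # count consecutive points above the center
            if run >= 8 and signal_at is None:
                signal_at = i                                 # first point completing an 8-point run
        print(f"center line = {center:.1f}; change signalled at follow-up point {signal_at}")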

  19. Acid and alkaline solubilization (pH shift) process: a better approach for the utilization of fish processing waste and by-products.

    PubMed

    Surasani, Vijay Kumar Reddy

    2018-05-22

    Several technologies and methods have been developed over the years to address the environmental pollution and nutritional losses associated with the dumping of fish processing waste and low-cost fish and by-products. Despite continuous efforts in this field, none of the developed technologies was successful in addressing these issues, due to various technical problems. To solve the problems associated with fish processing waste and low-value fish and by-products, a process called the pH shift (acid and alkaline solubilization) process was developed. In this process, proteins are first solubilized using acid or alkali and then precipitated at their isoelectric pH to recover functional and stable protein isolates from underutilized fish species and by-products. Many studies have been conducted using the pH shift process to recover proteins from fish and fish by-products; the process was found to be most successful in recovering proteins with higher yields than the conventional surimi (three-cycle washing) process and with good functional properties. In this paper, the problems associated with conventional processing, the advantages and principle of pH shift processing, the effect of the pH shift process on the quality and storage stability of recovered isolates, applications of protein isolates, etc., are discussed in detail for better understanding.

  20. Six Sigma: Process of Understanding the Control and Capability of Ranitidine Hydrochloride Tablet

    PubMed Central

    Chabukswar, AR; Jagdale, SC; Kuchekar, BS; Joshi, VD; Deshmukh, GR; Kothawade, HS; Kuckekar, AB; Lokhande, PD

    2011-01-01

    The process of understanding the control and capability (PUCC) is an iterative closed-loop process for continuous improvement. It covers the DMAIC toolkit in its three phases. PUCC is an iterative approach that rotates between the three pillars of process understanding, process control, and process capability, with each iteration resulting in a more capable and robust process. It is rightly said that being at the top is a marathon and not a sprint. The objective of the six sigma study of Ranitidine hydrochloride tablets is to achieve perfection in tablet manufacturing by reviewing the present robust manufacturing process and finding ways to improve and modify it, which will yield tablets that are defect-free and give greater customer satisfaction. The application of six sigma led to an improved process capability, due to the improvement in the sigma level of the process from 1.5 to 4; a higher yield, due to reduced variation and a reduction of thick tablets; a reduction in packing-line stoppages; a reduction in re-work by 50%; a more standardized process with smooth flow and a change in the coating suspension reconstitution level (8% w/w); a cost reduction of approximately Rs. 90 to 95 lakhs per annum; an improvement in overall efficiency of approximately 30%; and improved overall quality of the product. PMID:21607050

Top