Sample records for main processing steps

  1. Discrete pre-processing step effects in registration-based pipelines, a preliminary volumetric study on T1-weighted images.

    PubMed

    Muncy, Nathan M; Hedges-Muncy, Ariana M; Kirwan, C Brock

    2017-01-01

    Pre-processing MRI scans prior to performing volumetric analyses is common practice in MRI studies. As pre-processing steps adjust the voxel intensities, the space in which the scan exists, and the amount of data in the scan, it is possible that the steps have an effect on the volumetric output. To date, studies have compared between and not within pipelines, and so the impact of each step is unknown. This study aims to quantify the effects of pre-processing steps on volumetric measures in T1-weighted scans within a single pipeline. It was our hypothesis that pre-processing steps would significantly impact ROI volume estimations. One hundred fifteen participants from the OASIS dataset were used, where each participant contributed three scans. All scans were then pre-processed using a step-wise pipeline. Bilateral hippocampus, putamen, and middle temporal gyrus volume estimations were assessed following each successive step, and all data were processed by the same pipeline 5 times. Repeated-measures analyses tested for main effects of pipeline step, scan-rescan (for MRI scanner consistency), and repeated pipeline run (for algorithmic consistency). A main effect of pipeline step was detected and, interestingly, an interaction between pipeline step and ROI was present. No effect of either scan-rescan or repeated pipeline run was detected. We then supply a correction for noise in the data resulting from pre-processing.

  2. Discrete pre-processing step effects in registration-based pipelines, a preliminary volumetric study on T1-weighted images

    PubMed Central

    2017-01-01

    Pre-processing MRI scans prior to performing volumetric analyses is common practice in MRI studies. As pre-processing steps adjust the voxel intensities, the space in which the scan exists, and the amount of data in the scan, it is possible that the steps have an effect on the volumetric output. To date, studies have compared between and not within pipelines, and so the impact of each step is unknown. This study aims to quantify the effects of pre-processing steps on volumetric measures in T1-weighted scans within a single pipeline. It was our hypothesis that pre-processing steps would significantly impact ROI volume estimations. One hundred fifteen participants from the OASIS dataset were used, where each participant contributed three scans. All scans were then pre-processed using a step-wise pipeline. Bilateral hippocampus, putamen, and middle temporal gyrus volume estimations were assessed following each successive step, and all data were processed by the same pipeline 5 times. Repeated-measures analyses tested for main effects of pipeline step, scan-rescan (for MRI scanner consistency), and repeated pipeline run (for algorithmic consistency). A main effect of pipeline step was detected and, interestingly, an interaction between pipeline step and ROI was present. No effect of either scan-rescan or repeated pipeline run was detected. We then supply a correction for noise in the data resulting from pre-processing. PMID:29023597

  3. Qualitative Features Extraction from Sensor Data using Short-time Fourier Transform

    NASA Technical Reports Server (NTRS)

    Amini, Abolfazl M.; Figueroa, Fernando

    2004-01-01

    The information gathered from sensors is used to determine the health of a sensor. Once a normal mode of operation is established, any deviation from the normal behavior indicates a change. This change may be due to a malfunction of the sensor(s) or the system (or process). The step-up and step-down features, as well as sensor disturbances, are assumed to be exponential. An RC network is used to model the main process, which is defined by a step-up (charging), drift, and step-down (discharging). The sensor disturbances and spike are added while the system is in drift. The system runs for a period of at least three time constants of the main process every time a process feature occurs (e.g., a step change). The short-time Fourier transform of the signal is taken using a Hamming window. Three window widths are used. The DC value is removed from the windowed data prior to taking the FFT. The resulting three-dimensional spectral plots provide good time-frequency resolution. The results indicate distinct shapes corresponding to each process.
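
The recipe in this record (an RC-modelled process signal, a Hamming window, per-segment DC removal, and STFTs at several window widths) can be sketched as follows. All signal parameters here (sampling rate, time constant, segment boundaries, drift slope) are illustrative assumptions, not values from the paper:

```python
import numpy as np
from scipy.signal import stft

# Synthetic stand-in for the RC-modelled process: exponential step-up
# (charging), a slow drift, then exponential step-down (discharging).
fs = 100.0                     # samples per second (assumption)
tau = 1.0                      # RC time constant (assumption)
t = np.arange(0, 9, 1 / fs)    # run >= 3 time constants per feature
sig = np.piecewise(
    t,
    [t < 3, (t >= 3) & (t < 6), t >= 6],
    [lambda u: 1 - np.exp(-u / tau),            # step-up (charging)
     lambda u: 0.95 + 0.02 * (u - 3),           # drift (values approximate)
     lambda u: 1.01 * np.exp(-(u - 6) / tau)])  # step-down (discharging)

# STFT with a Hamming window at three widths; detrend='constant'
# removes the DC value from each windowed segment before the FFT.
for nper in (64, 128, 256):
    f, tt, Zxx = stft(sig, fs=fs, window='hamming', nperseg=nper,
                      detrend='constant')
```

Plotting `|Zxx|` against `tt` and `f` for each window width reproduces the kind of time-frequency surfaces the abstract describes.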

  4. Dynamic Modeling of the Main Blow in Basic Oxygen Steelmaking Using Measured Step Responses

    NASA Astrophysics Data System (ADS)

    Kattenbelt, Carolien; Roffel, B.

    2008-10-01

    In the control and optimization of basic oxygen steelmaking, it is important to have an understanding of the influence of control variables on the process. However, important process variables such as the composition of the steel and slag cannot be measured continuously. The decarburization rate and the accumulation rate of oxygen, which can be derived from the generally measured waste gas flow and composition, are an indication of changes in steel and slag composition. The influence of the control variables on the decarburization rate and the accumulation rate of oxygen can best be determined in the main blow period. In this article, the measured step responses of the decarburization rate and the accumulation rate of oxygen to step changes in the oxygen blowing rate, lance height, and the addition rate of iron ore during the main blow are presented. These measured step responses are subsequently used to develop a dynamic model for the main blow. The model consists of an iron oxide and a carbon balance and an additional equation describing the influence of the lance height and the oxygen blowing rate on the decarburization rate. With this simple dynamic model, the measured step responses can be explained satisfactorily.

  5. Extraction of Qualitative Features from Sensor Data Using Windowed Fourier Transform

    NASA Technical Reports Server (NTRS)

    Amini, Abolfazl M.; Figueroa, Fernando

    2003-01-01

    In this paper, we use Matlab to model the health monitoring of a system through the information gathered from sensors. This implies assessment of the condition of the system components. Once a normal mode of operation is established, any deviation from the normal behavior indicates a change. This change may be due to a malfunction of an element, a qualitative change, or a change due to a problem with another element in the network. For example, if one sensor indicates that the temperature in the tank has experienced a step change, then a pressure sensor associated with the process in the tank should also experience a step change. The step-up and step-down features, as well as sensor disturbances, are assumed to be exponential. An RC network is used to model the main process, which is a step-up (charging), drift, and step-down (discharging). The sensor disturbances and spike are added while the system is in drift. The system is allowed to run for a period equal to three time constants of the main process before changes occur. Then each point of the signal is selected along with trailing data collected previously. Two trailing lengths of data are selected, one equal to two time constants of the main process and the other equal to two time constants of the sensor disturbance. Next, the DC component is removed from each set of data, and then the data are passed through a window, followed by calculation of spectra for each set. In order to extract features, the signal power, peak, and spectrum are plotted vs. time. The results indicate distinct shapes corresponding to each process. The study is also carried out for a number of Gaussian-distributed noisy cases.
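
The per-window feature loop this record describes (take a window of data, remove the DC component, apply a window function, take the FFT, then track total power and spectral peak over time) can be sketched as below; the window length, hop size, and test signal are assumptions for illustration:

```python
import numpy as np

def windowed_features(signal, fs, win_len):
    """Slide a Hamming window over the signal; for each position, remove
    the DC component, take the FFT, and record the total spectral power
    and the peak frequency (a sketch of the feature-extraction idea)."""
    win = np.hamming(win_len)
    freqs = np.fft.rfftfreq(win_len, d=1.0 / fs)
    powers, peaks = [], []
    for start in range(0, len(signal) - win_len + 1, win_len // 2):
        seg = signal[start:start + win_len]
        seg = seg - seg.mean()                    # remove DC before the FFT
        spec = np.abs(np.fft.rfft(seg * win)) ** 2
        powers.append(spec.sum())                 # signal power vs. time
        peaks.append(freqs[np.argmax(spec)])      # spectral peak vs. time
    return np.array(powers), np.array(peaks)

# toy check: a 5 Hz sine sampled at 100 Hz should peak near 5 Hz
fs = 100.0
t = np.arange(0, 4, 1 / fs)
sig = np.sin(2 * np.pi * 5 * t)
powers, peaks = windowed_features(sig, fs, win_len=100)
```

Plotting `powers` and `peaks` against window position gives the power-vs-time and peak-vs-time traces the abstract uses to distinguish the processes.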

  6. Feasibility of using continuous chromatography in downstream processing: Comparison of costs and product quality for a hybrid process vs. a conventional batch process.

    PubMed

    Ötes, Ozan; Flato, Hendrik; Winderl, Johannes; Hubbuch, Jürgen; Capito, Florian

    2017-10-10

    The protein A capture step is the main cost-driver in downstream processing, with high attrition costs, especially when protein A resin is not used to the end of its lifetime. Here we describe a feasibility study, transferring a batch downstream process to a hybrid process, aimed at replacing batch protein A capture chromatography with a continuous capture step, while leaving the polishing steps unchanged to minimize the process adaptations required compared to a batch process. 35 g of antibody were purified using the hybrid approach, resulting in product quality and step yield comparable to the batch process. Productivity for the protein A step could be increased by up to 420%, reducing buffer amounts by 30-40% and showing robustness for at least 48 h of continuous run time. Additionally, to enable its potential application in a clinical trial manufacturing environment, the cost of goods for the protein A step was compared between the hybrid process and the batch process, showing a 300% cost reduction, depending on processed volumes and batch cycles. Copyright © 2017 Elsevier B.V. All rights reserved.

  7. A comparative study of one-step and two-step approaches for MAPbI3 perovskite layer and its influence on the performance of mesoscopic perovskite solar cell

    NASA Astrophysics Data System (ADS)

    Wang, Minhuan; Feng, Yulin; Bian, Jiming; Liu, Hongzhu; Shi, Yantao

    2018-01-01

    Mesoscopic perovskite solar cells (M-PSCs) were synthesized with MAPbI3 perovskite layers as light harvesters, grown with one-step and two-step solution processes, respectively. A comparative study was performed through the quantitative correlation of the resulting device performance and the crystalline quality of the perovskite layers. Compared with the one-step counterpart, a pronounced improvement of 56.86% in the steady-state power conversion efficiency (PCE) was achieved with the two-step process, which mainly resulted from a significant enhancement in fill factor (FF) from 48% to 77% without sacrificing the open-circuit voltage (Voc) or short-circuit current (Jsc). The enhanced FF was attributed to reduced non-radiative recombination channels due to the better crystalline quality and larger grain size of the two-step-processed perovskite layer. Moreover, the superiority of the two-step over the one-step process was demonstrated with rather good reproducibility.

  8. Sewage treatment method

    DOEpatents

    Fassbender, Alex G.

    1995-01-01

    The invention greatly reduces the amount of ammonia in sewage plant effluent. The process of the invention has three main steps. The first step is dewatering without first digesting, thereby producing a first ammonia-containing stream having a low concentration of ammonia, and a second solids-containing stream. The second step is sending the second solids-containing stream through a means for separating the solids from the liquid and producing an aqueous stream containing a high concentration of ammonia. The third step is removal of ammonia from the aqueous stream using a hydrothermal process.

  9. Evolution of Taste Compounds of Dezhou-Braised Chicken During Cooking Evaluated by Chemical Analysis and an Electronic Tongue System.

    PubMed

    Liu, Dengyong; Li, Shengjie; Wang, Nan; Deng, Yajun; Sha, Lei; Gai, Shengmei; Liu, Huan; Xu, Xinglian

    2017-05-01

    This paper aimed to study the time-course changes in taste compounds of Dezhou-braised chicken during the entire cooking process, which mainly consists of deep-frying, high-temperature boiling, and low-temperature braising steps. For this purpose, meat samples at different processing stages were analyzed for 5'-nucleotides and free amino acids, and were also subjected to electronic tongue measurements. Results showed that IMP, Glu, Lys, and sodium chloride were the main compounds contributing to the taste attributes of the final product. IMP and Glu increased in the boiling step and remained unchanged in the following braising steps. Meanwhile, a decrease in Lys content and an increase in sodium chloride content were observed over time in both the boiling and braising steps. Intensities for bitterness, saltiness, and Aftertaste-B obtained from the electronic tongue analysis were correlated with the concentrations of these chemical compounds. Therefore, the electronic tongue system could be applied to evaluate the taste development of Dezhou-braised chicken during processing. © 2017 Institute of Food Technologists®.

  10. Glycidyl fatty acid esters in refined edible oils: A review on formation, occurrence, analysis, and elimination methods

    USDA-ARS's Scientific Manuscript database

    Glycidyl fatty acid esters (GEs), one of the main contaminants in processed oil, are mainly formed during the deodorization step in the oil refining process of edible oils and therefore occur in almost all refined edible oils. GEs are potential carcinogens, due to the fact that they hydrolyze into t...

  11. DPPP: Default Pre-Processing Pipeline

    NASA Astrophysics Data System (ADS)

    van Diepen, Ger; Dijkema, Tammo Jan

    2018-04-01

    DPPP (Default Pre-Processing Pipeline, also referred to as NDPPP) reads and writes radio-interferometric data in the form of Measurement Sets, mainly those that are created by the LOFAR telescope. It goes through visibilities in time order and contains standard operations like averaging, phase-shifting and flagging bad stations. Between the steps in a pipeline, the data is not written to disk, making this tool suitable for operations where I/O dominates. More advanced procedures such as gain calibration are also included. Other computing steps can be provided by loading a shared library; currently supported external steps are the AOFlagger (ascl:1010.017) and a bridge that enables loading python steps.
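
Pipelines like the one this record describes are driven by a parameter file listing the steps to run. A hypothetical parset of the kind DPPP consumes is shown below; the step names and keys are written from memory and should be checked against the DPPP documentation before use:

```
msin=L123456.MS
msout=L123456_avg.MS
steps=[flag,avg]
flag.type=aoflagger
avg.type=average
avg.timestep=4
avg.freqstep=8
```

Each entry in `steps` names a step whose options are then set via `<step>.<key>` lines; data flows from one step to the next without being written to disk in between.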

  12. A step-by-step methodology for enterprise interoperability projects

    NASA Astrophysics Data System (ADS)

    Chalmeta, Ricardo; Pazos, Verónica

    2015-05-01

    Enterprise interoperability is one of the key factors for enhancing enterprise competitiveness. Achieving enterprise interoperability is an extremely complex process which involves different technological, human and organisational elements. In this paper we present a framework to support enterprise interoperability. The framework has been developed taking into account the three domains of interoperability: Enterprise Modelling, Architecture and Platform, and Ontologies. The main novelty of the framework in comparison to existing ones is that it includes a step-by-step methodology that explains how to carry out an enterprise interoperability project, taking into account different interoperability views, such as business, process, human resources, technology, knowledge and semantics.

  13. Stable Lévy motion with inverse Gaussian subordinator

    NASA Astrophysics Data System (ADS)

    Kumar, A.; Wyłomańska, A.; Gajda, J.

    2017-09-01

    In this paper we study the stable Lévy motion subordinated by the so-called inverse Gaussian process. This process extends the well-known normal inverse Gaussian (NIG) process introduced by Barndorff-Nielsen, which arises by subordinating ordinary Brownian motion (with drift) with the inverse Gaussian process. The NIG process has found many interesting applications, especially in financial data description. We discuss here the main features of the introduced subordinated process, such as distributional properties, existence of fractional-order moments, and asymptotic tail behavior. We show the connection of the process with the continuous-time random walk. Further, the governing fractional partial differential equation for the probability density function is also obtained. Moreover, we discuss the asymptotic distribution of the sample mean square displacement, the main tool in the detection of anomalous diffusion phenomena (Metzler et al., 2014). In order to apply the stable Lévy motion time-changed by the inverse Gaussian subordinator, we propose a step-by-step procedure for parameter estimation. At the end, we show how the examined process can be useful for modelling financial time series.
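
The time-changed process this record studies can be simulated directly: draw inverse-Gaussian increments for the subordinator, then use the self-similarity of stable motion to draw the matching stable increments. A sketch, where alpha, the step size, and the IG parameters are illustrative assumptions:

```python
import numpy as np
from scipy.stats import levy_stable

rng = np.random.default_rng(0)

def subordinated_stable(alpha, n_steps, dt=0.01, ig_mean=1.0, ig_scale=1.0):
    """Simulate Z(t) = X(S(t)): a symmetric alpha-stable Levy motion X
    time-changed by an inverse Gaussian subordinator S (a sketch, not
    the paper's estimation procedure)."""
    # inverse Gaussian subordinator: independent IG-distributed increments
    dS = rng.wald(ig_mean * dt, ig_scale, size=n_steps)
    # stable motion at the subordinated times: by self-similarity, each
    # increment is alpha-stable with scale (dS)**(1/alpha)
    dX = levy_stable.rvs(alpha, 0.0, scale=dS ** (1.0 / alpha),
                         size=n_steps, random_state=rng)
    return np.cumsum(dX)

path = subordinated_stable(alpha=1.7, n_steps=1000)
```

With `alpha=2` this reduces (up to scaling) to the NIG case of Brownian motion subordinated by an inverse Gaussian process.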

  14. A practical guide for the identification of major sulcogyral structures of the human cortex.

    PubMed

    Destrieux, Christophe; Terrier, Louis Marie; Andersson, Frédéric; Love, Scott A; Cottier, Jean-Philippe; Duvernoy, Henri; Velut, Stéphane; Janot, Kevin; Zemmoura, Ilyess

    2017-05-01

    The precise sulcogyral localization of cortical lesions is mandatory to improve communication between practitioners and to predict and prevent post-operative deficits. This process, which assumes a good knowledge of the cortex anatomy and a systematic analysis of images, is, nevertheless, sometimes neglected in the neurological and neurosurgical training. This didactic paper proposes a brief overview of the sulcogyral anatomy, using conventional MR-slices, and also reconstructions of the cortical surface after a more or less extended inflation process. This method simplifies the cortical anatomy by removing part of the cortical complexity induced by the folding process, and makes it more understandable. We then reviewed several methods for localizing cortical structures, and proposed a three-step identification: after localizing the lateral, medial or ventro-basal aspect of the hemisphere (step 1), the main interlobar sulci were located to limit the lobes (step 2). Finally, intralobar sulci and gyri were identified (step 3) thanks to the same set of rules. This paper does not propose any new identification method but should be regarded as a set of practical guidelines, useful in daily clinical practice, for detecting the main sulci and gyri of the human cortex.

  15. RNA splicing process analysis for identifying antisense oligonucleotide inhibitors with padlock probe-based isothermal amplification.

    PubMed

    Ren, Xiaojun; Deng, Ruijie; Wang, Lida; Zhang, Kaixiang; Li, Jinghong

    2017-08-01

    RNA splicing, which mainly involves two transesterification steps, is a fundamental process of gene expression and its abnormal regulation contributes to serious genetic diseases. Antisense oligonucleotides (ASOs) are genetic control tools that can be used to specifically control genes through alteration of the RNA splicing pathway. Despite intensive research, how ASOs or various other factors influence the multiple processes of RNA splicing still remains obscure. This is largely due to an inability to analyze the splicing efficiency of each step in the RNA splicing process with high sensitivity. We addressed this limitation by introducing a padlock probe-based isothermal amplification assay to achieve quantification of the specific products in different splicing steps. With this amplified assay, the roles that ASOs play in RNA splicing inhibition in the first and second steps could be distinguished. We identified that 5'-ASO could block RNA splicing by inhibiting the first step, while 3'-ASO could block RNA splicing by inhibiting the second step. This method provides a versatile tool for assisting efficient ASO design and discovering new splicing modulators and therapeutic drugs.

  16. Lignocellulosic ethanol: Technology design and its impact on process efficiency.

    PubMed

    Paulova, Leona; Patakova, Petra; Branska, Barbora; Rychtera, Mojmir; Melzoch, Karel

    2015-11-01

    This review provides current information on the production of ethanol from lignocellulosic biomass, with the main focus on relationships between process design and efficiency, expressed as ethanol concentration, yield and productivity. In spite of unquestionable advantages of lignocellulosic biomass as a feedstock for ethanol production (availability, price, non-competitiveness with food, waste material), many technological bottlenecks hinder its wide industrial application and competitiveness with 1st generation ethanol production. Among the main technological challenges are the recalcitrant structure of the material, and thus the need for extensive pretreatment (usually physico-chemical followed by enzymatic hydrolysis) to yield fermentable sugars, and a relatively low concentration of monosaccharides in the medium that hinders the achievement of ethanol concentrations comparable with those obtained using 1st generation feedstocks (e.g. corn or molasses). The presence of both pentose and hexose sugars in the fermentation broth, the price of cellulolytic enzymes, and the presence of toxic compounds that can inhibit cellulolytic enzymes and microbial producers of ethanol are major issues. In this review, different process configurations of the main technological steps (enzymatic hydrolysis, fermentation of hexose and/or pentose sugars) are discussed and their efficiencies are compared. The main features, benefits and drawbacks of simultaneous saccharification and fermentation (SSF), simultaneous saccharification and fermentation with delayed inoculation (dSSF), and consolidated bioprocesses (CBP) combining production of cellulolytic enzymes, hydrolysis of biomass and fermentation into one step, together with an approach combining utilization of both pentose and hexose sugars, are discussed and compared with separate hydrolysis and fermentation (SHF) processes. The impact of individual technological steps on final process efficiency is emphasized and the potential for use of immobilized biocatalysts is considered. Copyright © 2014 Elsevier Inc. All rights reserved.

  17. Automatic diagnosis of malaria based on complete circle-ellipse fitting search algorithm.

    PubMed

    Sheikhhosseini, M; Rabbani, H; Zekri, M; Talebi, A

    2013-12-01

    Diagnosis of malaria parasitemia from blood smears is a subjective and time-consuming task for pathologists. An automatic diagnostic process would reduce the diagnostic time. It can also serve as a second opinion for pathologists and may be useful in malaria screening. This study presents an automatic method for malaria diagnosis from thin blood smears. Because the malaria life cycle starts with the formation of a ring around the parasite nucleus, the proposed approach is mainly based on curve fitting to detect the parasite ring in the blood smear. The method is composed of six main phases. The first is a stained-object extraction step, which extracts candidate objects that may be infected by malaria parasites; this phase includes stained-pixel extraction based on intensity and colour, and stained-object segmentation by defining stained-circle matching. The second step is a preprocessing phase that makes use of nonlinear diffusion filtering. The process continues with detection of the parasite nucleus from the resulting image of the previous step according to image intensity. The fourth step introduces a complete search process in which the circle search identifies the direction and initial points for the direct least-squares ellipse fitting algorithm; furthermore, in the ellipse searching process, although the parasite shape is completed, undesired regions with high error values are removed and the ellipse parameters are modified. In the fifth step, features are extracted from the parasite candidate region instead of the whole candidate object. By employing this feature-extraction approach, which is enabled by the special searching process, the need for clump-splitting methods is removed; defining the stained-circle matching process in the first step also speeds up the whole procedure. Finally, a series of decision rules is applied to the extracted features to decide on the positivity or negativity of malaria parasite presence. The algorithm was applied to 26 digital images obtained from thin blood smear films. The images contained 1274 objects, which may be parasite-infected or healthy. Applying the automatic identification of malaria to this database showed a sensitivity of 82.28% and a specificity of 98.02%. © 2013 The Authors Journal of Microscopy © 2013 Royal Microscopical Society.
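
The direct least-squares ellipse fitting step this record relies on is a standard algorithm (in the spirit of Fitzgibbon et al.). A numpy sketch of the idea, not the authors' implementation:

```python
import numpy as np

rng = np.random.default_rng(0)

def fit_ellipse_direct(x, y):
    """Direct least-squares ellipse fit: returns conic coefficients
    (a, b, c, d, e, f) of a*x^2 + b*x*y + c*y^2 + d*x + e*y + f = 0
    subject to the ellipse constraint 4ac - b^2 > 0."""
    D = np.column_stack([x**2, x*y, y**2, x, y, np.ones_like(x)])
    S = D.T @ D                    # scatter matrix
    C = np.zeros((6, 6))           # encodes the constraint 4ac - b^2 = 1
    C[0, 2] = C[2, 0] = 2.0
    C[1, 1] = -1.0
    # generalized eigenproblem S v = lam C v, solved via inv(S) C
    eigval, eigvec = np.linalg.eig(np.linalg.solve(S, C))
    for k in np.argsort(-eigval.real):
        v = eigvec[:, k].real
        if 4 * v[0] * v[2] - v[1] ** 2 > 0:   # ellipse, not hyperbola
            return v
    raise ValueError("no ellipse solution found")

# toy check: noisy points on the ellipse x^2/9 + y^2 = 1
theta = np.linspace(0, 2 * np.pi, 200, endpoint=False)
x = 3 * np.cos(theta) + rng.normal(0, 0.01, theta.size)
y = np.sin(theta) + rng.normal(0, 0.01, theta.size)
coef = fit_ellipse_direct(x, y)
```

The constraint matrix guarantees that exactly one eigenvector corresponds to an ellipse, which is why the fit is "direct" (non-iterative).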

  18. MODELING TREE LEVEL PROCESSES

    EPA Science Inventory

    An overview of three main types of simulation approach (explanatory, abstraction, and estimation) is presented, along with a discussion of their capabilities, limitations, and the steps required for their validation. A process model being developed through the Forest Response Prog...

  19. How to Develop an Engineering Design Task

    ERIC Educational Resources Information Center

    Dankenbring, Chelsey; Capobianco, Brenda M.; Eichinger, David

    2014-01-01

    In this article, the authors provide an overview of engineering and the engineering design process, and describe the steps they took to develop a fifth grade-level, standards-based engineering design task titled "Getting the Dirt on Decomposition." Their main goal was to focus more on modeling the discrete steps they took to create and…

  20. [Collaborative application of BEPS at different time steps].

    PubMed

    Lu, Wei; Fan, Wen Yi; Tian, Tian

    2016-09-01

    BEPSHourly is designed to simulate the ecological and physiological processes of vegetation at hourly time steps, and is often applied to analyze the diurnal change of gross primary productivity (GPP) and net primary productivity (NPP) at site scale because of its more complex model structure and time-consuming solving process. However, the daily photosynthetic rate calculation in the BEPSDaily model is simpler and less time-consuming, not involving many iterative processes. It is suitable for simulating regional primary productivity and analyzing the spatial distribution of regional carbon sources and sinks. According to the characteristics and applicability of the BEPSDaily and BEPSHourly models, this paper proposed a method for the collaborative application of BEPS at daily and hourly time steps. Firstly, BEPSHourly was used to optimize the main photosynthetic parameters, the maximum carboxylation rate (Vcmax) and the maximum rate of photosynthetic electron transport (Jmax), at site scale; then the two optimized parameters were introduced into the BEPSDaily model to estimate NPP at regional scale. The results showed that optimization of the main photosynthesis parameters based on the flux data could improve the simulation ability of the model. In 2011, the primary productivity of the different forest types, in descending order, was: deciduous broad-leaved forest, mixed forest, coniferous forest. The collaborative application of carbon cycle models at different time steps proposed in this study can effectively optimize the main photosynthesis parameters Vcmax and Jmax, simulate the monthly averaged diurnal GPP and NPP, calculate regional NPP, and analyze the spatial distribution of regional carbon sources and sinks.
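
The first step of the procedure this record proposes, calibrating Vcmax and Jmax against flux data, is in essence a least-squares parameter fit. A toy sketch with a deliberately simplified stand-in for the photosynthesis model (not the Farquhar-type model actually used in BEPS; all parameter values are invented):

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(1)

def toy_gpp(par, vcmax, jmax):
    """Toy stand-in for an hourly photosynthesis model: GPP limited by a
    carboxylation-like rate (vcmax) and a light-driven, electron-transport-
    like term scaled by jmax. Purely illustrative."""
    return np.minimum(vcmax, jmax * par / (par + 500.0))

# synthetic "flux tower" GPP observations generated with known parameters
par = rng.uniform(0, 2000, 200)                      # incident PAR
gpp_obs = toy_gpp(par, 40.0, 80.0) + rng.normal(0, 0.5, 200)

# step 1 (hourly model): optimize Vcmax and Jmax against the flux data
res = minimize(lambda p: np.mean((toy_gpp(par, *p) - gpp_obs) ** 2),
               x0=[20.0, 50.0], method='Nelder-Mead')
vcmax_opt, jmax_opt = res.x
# step 2 would pass these optimized parameters to the daily model
```

The fit recovers the generating parameters because the plateau of the response constrains the vcmax-like rate while the rising limb constrains the jmax-like rate.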

  1. Post-processing procedure for industrial quantum key distribution systems

    NASA Astrophysics Data System (ADS)

    Kiktenko, Evgeny; Trushechkin, Anton; Kurochkin, Yury; Fedorov, Aleksey

    2016-08-01

    We present algorithmic solutions aimed at the post-processing procedure for industrial quantum key distribution systems with hardware sifting. The main steps of the procedure are error correction, parameter estimation, and privacy amplification. Authentication of the classical public communication channel is also considered.
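
Of the post-processing steps listed here, privacy amplification is commonly implemented with 2-universal Toeplitz hashing. A sketch of that generic construction (not necessarily the cited system's exact scheme):

```python
import numpy as np
from scipy.linalg import toeplitz

rng = np.random.default_rng(42)

def privacy_amplification(key_bits, out_len):
    """Compress a reconciled key with a random binary Toeplitz matrix
    over GF(2) -- a standard 2-universal hash family used for privacy
    amplification. A generic sketch."""
    n = len(key_bits)
    # out_len + n - 1 random bits fully determine the Toeplitz matrix
    seed = rng.integers(0, 2, size=out_len + n - 1)
    T = toeplitz(seed[:out_len], seed[out_len - 1:])  # first column, first row
    return (T @ key_bits) % 2        # matrix-vector product modulo 2

key = rng.integers(0, 2, size=128)   # reconciled (error-corrected) key
secret = privacy_amplification(key, 64)
```

The output length would in practice be chosen from the parameter-estimation step, so that the compression removes the eavesdropper's estimated information.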

  2. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Osmanlioglu, Ahmet Erdal

    Pre-treatment of radioactive waste is the first step in the waste management program that occurs after waste generation from various applications in Turkey. Pre-treatment and characterization practices are carried out in the Radioactive Waste Management Unit (RWMU) at Cekmece Nuclear Research and Training Center (CNRTC) in Istanbul. This facility has been assigned to take all low-level radioactive wastes generated by nuclear applications in Turkey. The wastes are generated from research and nuclear applications, mainly in medicine, biology, agriculture, and quality control in the metal processing and construction industries. These wastes are classified as low-level radioactive wastes. Pre-treatment practices cover several steps. In this paper, the main steps of pre-treatment and characterization are presented. Basically these are: collection, segregation, chemical adjustment, size reduction, and decontamination operations. (author)

  3. Guiding gate-etch process development using 3D surface reaction modeling for 7nm and beyond

    NASA Astrophysics Data System (ADS)

    Dunn, Derren; Sporre, John R.; Deshpande, Vaibhav; Oulmane, Mohamed; Gull, Ronald; Ventzek, Peter; Ranjan, Alok

    2017-03-01

    Increasingly, advanced process nodes such as 7nm (N7) are fundamentally 3D and require stringent control of critical dimensions over high-aspect-ratio features. Process integration in these nodes requires a deep understanding of complex physical mechanisms to control critical dimensions from lithography through final etch. Polysilicon gate etch processes are critical steps in several device architectures for advanced nodes that rely on self-aligned patterning approaches to gate definition. These processes are required to meet several key metrics: (a) vertical etch profiles over high aspect ratios; (b) clean gate sidewalls free of etch process residue; (c) minimal erosion of liner oxide films protecting key architectural elements such as fins; and (d) residue-free corners at gate interfaces with critical device elements. In this study, we explore how hybrid modeling approaches can be used to model a multi-step finFET polysilicon gate etch process. Initial parts of the patterning process through hardmask assembly are modeled using process emulation. Important aspects of gate definition are then modeled using a particle Monte Carlo (PMC) feature-scale model that incorporates surface chemical reactions [1]. When necessary, species and energy flux inputs to the PMC model are derived from simulations of the etch chamber. The modeled polysilicon gate etch process consists of several steps, including a hard mask breakthrough step (BT), main feature etch steps (ME), and over-etch steps (OE) that control gate profiles at the gate-fin interface. An additional constraint on this etch flow is that fin spacer oxides are left intact after the final profile tuning steps. A natural optimization required for these processes is to maximize vertical gate profiles while minimizing erosion of fin spacer films [2].

  4. Non-cellulosic polysaccharides from cotton fibre are differently impacted by textile processing.

    PubMed

    Runavot, Jean-Luc; Guo, Xiaoyuan; Willats, William G T; Knox, J Paul; Goubet, Florence; Meulewaeter, Frank

    2014-01-01

    Cotton fibre is mainly composed of cellulose, although non-cellulosic polysaccharides play key roles during fibre development and are still present in the harvested fibre. This study aimed at determining the fate of non-cellulosic polysaccharides during cotton textile processing. We analyzed non-cellulosic cotton fibre polysaccharides during different steps of cotton textile processing using GC-MS, HPLC and comprehensive microarray polymer profiling to obtain monosaccharide and polysaccharide amounts and linkage compositions. Additionally, in situ detection was used to obtain information on polysaccharide localization and accessibility. We show that pectic and hemicellulosic polysaccharide levels decrease during cotton textile processing and that some processing steps have more impact than others. Pectins and arabinose-containing polysaccharides are strongly impacted by the chemical treatments, with most being removed during bleaching and scouring. However, some forms of pectin are more resistant than others. Xylan and xyloglucan are affected in later processing steps and to a lesser extent, whereas callose showed a strong resistance to the chemical processing steps. This study shows that non-cellulosic polysaccharides are differently impacted by the treatments used in cotton textile processing with some hemicelluloses and callose being resistant to these harsh treatments.

  5. DEVELOPMENT OF PERMANENT MECHANICAL REPAIR SLEEVE FOR PLASTIC PIPE

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hitesh Patadia

    2005-04-29

    The report presents a comprehensive summary of the prototype development process used to develop a permanent mechanical repair fitting intended to be installed on damaged PE mains under blowing gas conditions. Specifically, the step-by-step construction approach is presented, together with the experimental data supporting the mitigation of ensuing slow crack growth (SCG) in the damaged area.

  6. Use of proteomics for validation of the isolation process of clotting factor IX from human plasma.

    PubMed

    Clifton, James; Huang, Feilei; Gaso-Sokac, Dajana; Brilliant, Kate; Hixson, Douglas; Josic, Djuro

    2010-01-03

    The use of proteomic techniques in the monitoring of different production steps of plasma-derived clotting factor IX (pd F IX) was demonstrated. The first step, solid-phase extraction with a weak anion-exchange resin, fractionates the bulk of human serum albumin (HSA), immunoglobulin G, and other non-binding proteins from F IX. The proteins that strongly bind to the anion-exchange resin are eluted by higher salt concentrations. In the second step, anion-exchange chromatography, residual HSA, some proteases and other contaminating proteins are separated. In the last chromatographic step, affinity chromatography with immobilized heparin, the majority of the residual impurities are removed. However, some contaminating proteins still remain in the eluate from the affinity column. The next step in the production process, virus filtration, is also an efficient step for the removal of residual impurities, mainly high molecular weight proteins, such as vitronectin and inter-alpha inhibitor proteins. In each production step, the active component, pd F IX and contaminating proteins are monitored by biochemical and immunochemical methods and by LC-MS/MS and their removal documented. Our methodology is very helpful for further process optimization, rapid identification of target proteins with relatively low abundance, and for the design of subsequent steps for their removal or purification.

  7. Graphical modeling and query language for hospitals.

    PubMed

    Barzdins, Janis; Barzdins, Juris; Rencis, Edgars; Sostaks, Agris

    2013-01-01

    So far there has been little evidence that implementation of health information technologies (HIT) is leading to health care cost savings. One reason for this lack of impact likely lies in the complexity of business process ownership in hospitals. The goal of our research is to develop a business model-based method for hospital use which would allow doctors to retrieve ad-hoc information directly from various hospital databases. We have developed a special domain-specific process modelling language called MedMod. Formally, we define the MedMod language as a profile on UML Class diagrams, but we also demonstrate it on examples, where we explain the semantics of all its elements informally. Moreover, we have developed the Process Query Language (PQL), which is based on the MedMod process definition language. The purpose of PQL is to allow a doctor to query (filter) runtime data of hospital processes described using MedMod. The MedMod language tries to overcome deficiencies in existing process modeling languages by allowing the loosely-defined sequence of steps in a clinical process to be specified. The main advantages of PQL lie in two areas, usability and efficiency: 1) the view on data through the "glasses" of a familiar process; 2) the simple and easy-to-perceive means of setting filtering conditions, which require no more expertise than using spreadsheet applications; 3) the dynamic response to each step in the construction of the complete query, which greatly shortens the learning curve and reduces the error rate; and 4) the selected means of filtering and data retrieval, which allow queries to be executed in O(n) time with respect to the size of the dataset. We plan to continue this project with three further steps. First, we are planning to develop user-friendly graphical editors for the MedMod process modeling and query languages. The second step is to evaluate the usability of the proposed language and tool with physicians from several hospitals in Latvia, working with real data from these hospitals. Our third step is to develop an efficient implementation of the query language.
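The O(n) filtering behaviour described above can be illustrated with a minimal sketch; the record structure and field names below are hypothetical stand-ins, not the actual MedMod/PQL data model:

```python
# Hypothetical runtime data for a clinical process; the field names are
# illustrative only, not the actual MedMod/PQL schema.
records = [
    {"patient": "A", "step": "admission", "duration_h": 2},
    {"patient": "A", "step": "surgery", "duration_h": 5},
    {"patient": "B", "step": "admission", "duration_h": 1},
]

def filter_steps(records, **conditions):
    """One linear pass over the records: O(n) in the dataset size."""
    return [r for r in records
            if all(r.get(k) == v for k, v in conditions.items())]

admissions = filter_steps(records, step="admission")
```

Because each condition is checked against every record exactly once, the running time grows linearly with the number of records, as the abstract claims for PQL.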

  8. Systematic development of a self-help and motivational enhancement intervention to promote sexual health in HIV-positive men who have sex with men.

    PubMed

    Van Kesteren, Nicole M C; Kok, Gerjo; Hospers, Harm J; Schippers, Jan; De Wildt, Wencke

    2006-12-01

    The objective of this study was to describe the application of a systematic process, Intervention Mapping, to the development of a theory- and evidence-based intervention to promote sexual health in HIV-positive men who have sex with men (MSM). Intervention Mapping provides a framework that gives program planners a systematic method for decision-making in each phase of intervention development. In Step 1, we focused on the improvement of two health-promoting behaviors: satisfactory sexual functioning and safer sexual behavior. These behaviors were then linked with selected personal and external determinants, such as attitudes and social support, to produce a set of proximal program objectives. In Step 2, theoretical methods were identified to influence the proximal program objectives and were translated into practical strategies. Although theoretical methods were derived from various theories, self-regulation theory and a cognitive model of behavior change provided the main framework for selecting the intervention methods. The main strategies chosen were bibliotherapy (i.e., the use of written material to help people solve problems or change behavior) and motivational interviewing. In Step 3, the theoretical methods and practical strategies were applied in a program that comprised a self-help guide, a motivational interviewing session and a motivational interviewing telephone call, both delivered by specialist nurses in HIV treatment centers. In Step 4, implementation was anticipated by developing a linkage group to ensure involvement of program users in the planning process and by conducting additional research to understand how to implement our program better. In Step 5, program evaluation was anticipated based on the planning process from the previous Intervention Mapping steps.

  9. Automating the evaluation of flood damages: methodology and potential gains

    NASA Astrophysics Data System (ADS)

    Eleutério, Julian; Martinez, Edgar Daniel

    2010-05-01

    The evaluation of flood damage potential consists of three main steps: assessing and processing data, combining data and calculating potential damages. The first step consists of modelling hazard and assessing vulnerability. In general, this step of the evaluation demands more time and investments than the others. The second step of the evaluation consists of combining spatial data on hazard with spatial data on vulnerability. Geographic Information System (GIS) is a fundamental tool in the realization of this step. GIS software allows the simultaneous analysis of spatial and matrix data. The third step of the evaluation consists of calculating potential damages by means of damage-functions or contingent analysis. All steps demand time and expertise. However, the last two steps must be realized several times when comparing different management scenarios. In addition, uncertainty analysis and sensitivity test are made during the second and third steps of the evaluation. The feasibility of these steps could be relevant in the choice of the extent of the evaluation. Low feasibility could lead to choosing not to evaluate uncertainty or to limit the number of scenario comparisons. Several computer models have been developed over time in order to evaluate the flood risk. GIS software is largely used to realise flood risk analysis. The software is used to combine and process different types of data, and to visualise the risk and the evaluation results. The main advantages of using a GIS in these analyses are: the possibility of "easily" realising the analyses several times, in order to compare different scenarios and study uncertainty; the generation of datasets which could be used any time in future to support territorial decision making; the possibility of adding information over time to update the dataset and make other analyses. However, these analyses require personnel specialisation and time. 
    The use of GIS software to evaluate flood risk requires personnel with a double professional specialisation. The professional should be proficient in GIS software and in flood damage analysis (which is already a multidisciplinary field). Great effort is necessary in order to correctly evaluate flood damages, and updating and improving the evaluation over time becomes a difficult task. The automation of this process should bring great advances in flood management studies over time, especially for public utilities. This study has two specific objectives: (1) to show the entire process of automating the second and third steps of flood damage evaluations; and (2) to analyse the potential gains induced in terms of the time and expertise needed for the analysis. A programming language is used within GIS software in order to automate the combination of hazard and vulnerability data and the calculation of potential damages. We discuss the overall process of flood damage evaluation. The main result of this study is a computational tool which allows significant operational gains in flood loss analyses. We quantify these gains by means of a hypothetical example. The tool significantly reduces the time of analysis and the need for expertise. An indirect gain is that sensitivity and cost-benefit analyses can be more easily realised.
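The second and third evaluation steps (combining hazard with vulnerability, then applying damage functions) can be sketched in a few lines; the depth-damage curve and asset values below are illustrative assumptions, not values from the study:

```python
# Hazard layer (water depth per parcel) and vulnerability layer (asset value);
# toy dictionaries standing in for GIS raster/vector data.
depth_m = {"parcel_1": 0.5, "parcel_2": 1.8, "parcel_3": 0.0}
asset_value = {"parcel_1": 200_000.0, "parcel_2": 150_000.0, "parcel_3": 90_000.0}

def damage_fraction(depth):
    """Hypothetical piecewise-linear depth-damage function: full loss at 3 m."""
    return min(max(depth, 0.0) / 3.0, 1.0)

# Step three: potential damage = asset value x damage fraction, summed over parcels.
total_damage = sum(asset_value[p] * damage_fraction(d) for p, d in depth_m.items())
```

Automating exactly this combine-and-calculate loop is what lets scenario comparisons and sensitivity tests be rerun cheaply, since only the input layers change between runs.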

  10. RNA splicing process analysis for identifying antisense oligonucleotide inhibitors with padlock probe-based isothermal amplification† †Electronic supplementary information (ESI) available: Additional experimental materials, methods, DNA sequences and supplementary figures and tables. See DOI: 10.1039/c7sc01336a Click here for additional data file.

    PubMed Central

    Ren, Xiaojun; Deng, Ruijie; Wang, Lida; Zhang, Kaixiang

    2017-01-01

    RNA splicing, which mainly involves two transesterification steps, is a fundamental process of gene expression and its abnormal regulation contributes to serious genetic diseases. Antisense oligonucleotides (ASOs) are genetic control tools that can be used to specifically control genes through alteration of the RNA splicing pathway. Despite intensive research, how ASOs or various other factors influence the multiple processes of RNA splicing still remains obscure. This is largely due to an inability to analyze the splicing efficiency of each step in the RNA splicing process with high sensitivity. We addressed this limitation by introducing a padlock probe-based isothermal amplification assay to achieve quantification of the specific products in different splicing steps. With this amplified assay, the roles that ASOs play in RNA splicing inhibition in the first and second steps could be distinguished. We identified that 5′-ASO could block RNA splicing by inhibiting the first step, while 3′-ASO could block RNA splicing by inhibiting the second step. This method provides a versatile tool for assisting efficient ASO design and discovering new splicing modulators and therapeutic drugs. PMID:28989608

  11. Formation of aqueous-phase sulfate during the haze period in China: Kinetics and atmospheric implications

    NASA Astrophysics Data System (ADS)

    Zhang, Haijie; Chen, Shilu; Zhong, Jie; Zhang, Shaowen; Zhang, Yunhong; Zhang, Xiuhui; Li, Zesheng; Zeng, Xiao Cheng

    2018-03-01

    Sulfate is one of the most important components of aerosol due to its key role in air pollution and global climate change. Recent work has suggested that reactive nitrogen chemistry in aerosol water can explain the missing source of sulfate. Herein, we have mapped out the energy profile of the oxidation of SO2 by NO2, and two feasible three-step mechanisms have been proposed. For the oxidation of HOSO2- and HSO3- by dissolved NO2 in weakly acidic and neutral aerosol (pH ≤ 7), the main contribution to the missing sulfate production comes from the oxidation of HOSO2-. The whole process is self-sustaining. For the oxidation of SO32- in alkaline aerosol (pH > 7), the third step, either the decomposition of H2O or the hydrolysis of SO3 (two parallel processes), is rate-limiting. The present results help to better understand the missing source of sulfate in the aerosol and hence may lead to better science-based solutions for resolving the severe haze problems in China.

  12. Learning and study strategies correlate with medical students' performance in anatomical sciences.

    PubMed

    Khalil, Mohammed K; Williams, Shanna E; Gregory Hawkins, H

    2018-05-06

    Much of the content delivered during medical students' preclinical years is assessed nationally by such testing as the United States Medical Licensing Examination® (USMLE®) Step 1 and Comprehensive Osteopathic Medical Licensing Examination® (COMLEX-USA®) Step 1. Improvement of students' study/learning strategy skills is associated with academic success in internal and external (USMLE Step 1) examinations. This research explores the strength of association between Learning and Study Strategies Inventory (LASSI) scores and student performance in the anatomical sciences and USMLE Step 1 examinations. The LASSI inventory assesses learning and study strategies based on ten subscale measures. These subscales include three components of strategic learning: skill (Information processing, Selecting main ideas, and Test strategies), will (Anxiety, Attitude, and Motivation) and self-regulation (Concentration, Time management, Self-testing, and Study aids). During second year (M2) orientation, 180 students (Classes of 2016, 2017, and 2018) were administered the LASSI survey instrument. Pearson Product-Moment correlation analyses identified significant associations between five of the ten LASSI subscales (Anxiety, Information processing, Motivation, Selecting main ideas, and Test strategies) and students' performance in the anatomical sciences and USMLE Step 1 examinations. Identification of students lacking these skills within the anatomical sciences curriculum allows targeted interventions, which not only maximize academic achievement in an aspect of an institution's internal examinations, but in the external measure of success represented by USMLE Step 1 scores. Anat Sci Educ 11: 236-242. © 2017 American Association of Anatomists.
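The Pearson product-moment correlation used to relate subscale scores to performance can be computed directly; the score lists below are made-up numbers for illustration, not study data:

```python
import math
import statistics

def pearson_r(x, y):
    """Pearson product-moment correlation coefficient."""
    mx, my = statistics.fmean(x), statistics.fmean(y)
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    var_x = sum((a - mx) ** 2 for a in x)
    var_y = sum((b - my) ** 2 for b in y)
    return cov / math.sqrt(var_x * var_y)

# Hypothetical "Test strategies" subscale scores vs. exam scores.
lassi = [22.0, 28.0, 31.0, 35.0, 40.0]
exam = [205.0, 214.0, 230.0, 228.0, 246.0]
r = pearson_r(lassi, exam)
```

A value of r near +1 indicates the strong positive association between subscale score and examination performance that the study reports for five of the ten subscales.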

  13. How Things Work: Physics in the Copy Machine.

    ERIC Educational Resources Information Center

    Crane, H. Richard, Ed.

    1984-01-01

    Discusses the physics principles applied to the main steps of the photocopying process. Of particular interest (and at the heart of the process) are the ways in which electric charges, or particles carrying charges, are caused to transfer from one surface or medium to another at each stage. (JN)

  14. Data processing has major impact on the outcome of quantitative label-free LC-MS analysis.

    PubMed

    Chawade, Aakash; Sandin, Marianne; Teleman, Johan; Malmström, Johan; Levander, Fredrik

    2015-02-06

    High-throughput multiplexed protein quantification using mass spectrometry is steadily increasing in popularity, with the two major techniques being data-dependent acquisition (DDA) and targeted acquisition using selected reaction monitoring (SRM). However, both techniques involve extensive data processing, which can be performed by a multitude of different software solutions. Analysis of quantitative LC-MS/MS data is mainly performed in three major steps: processing of raw data, normalization, and statistical analysis. To evaluate the impact of data processing steps, we developed two new benchmark data sets, one each for DDA and SRM, with samples consisting of a long-range dilution series of synthetic peptides spiked in a total cell protein digest. The generated data were processed by eight different software workflows and three postprocessing steps. The results show that the choice of the raw data processing software and the postprocessing steps play an important role in the final outcome. Also, the linear dynamic range of the DDA data could be extended by an order of magnitude through feature alignment and a charge state merging algorithm proposed here. Furthermore, the benchmark data sets are made publicly available for further benchmarking and software developments.
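As a concrete example of the normalization step, a simple median normalization (one common choice among the many workflow variants the paper benchmarks, not the paper's prescribed method) can be sketched:

```python
import statistics

def median_normalize(runs):
    """Scale each run so its median intensity equals the global median."""
    global_median = statistics.median(x for run in runs for x in run)
    normalized = []
    for run in runs:
        factor = global_median / statistics.median(run)
        normalized.append([x * factor for x in run])
    return normalized

# Two toy LC-MS runs with a systematic 10x intensity offset between them.
runs = [[10.0, 20.0, 30.0], [100.0, 200.0, 300.0]]
normed = median_normalize(runs)
```

After scaling, both runs share the same median intensity, removing the systematic offset so that downstream statistical analysis compares like with like.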

  15. Study of Nickel Silicide as a Copper Diffusion Barrier in Monocrystalline Silicon Solar Cells

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kale, Abhijit; Beese, Emily; Saenz, Theresa

    NiSi has been studied as a conductive Cu diffusion barrier to silicon. Using XRD and Raman spectroscopy, we demonstrate that NiSi films formed using a single-step annealing process are as good as those formed by the two-step process. The quality of NiSi films formed using e-beam Ni and the electroless Ni process has been compared. Incomplete surface coverage and the presence of constituents other than Ni are the main challenges with electroless Ni. We also demonstrate that Cu reduces the thermal stability of NiSi films. The detection of Cu has proven to be difficult due to temperature limitations.

  16. Implementation of Competency-Based Pharmacy Education (CBPE)

    PubMed Central

    Koster, Andries; Schalekamp, Tom; Meijerman, Irma

    2017-01-01

    Implementation of competency-based pharmacy education (CBPE) is a time-consuming, complicated process, which requires agreement on the tasks of a pharmacist, commitment, institutional stability, and a goal-directed developmental perspective of all stakeholders involved. In this article the main steps in the development of a fully-developed competency-based pharmacy curriculum (bachelor, master) are described and tips are given for a successful implementation. After the decision to adopt CBPE is made and a competency framework is adopted (step 1), intended learning outcomes are defined (step 2), followed by analyzing the required developmental trajectory (step 3) and the selection of appropriate assessment methods (step 4). Designing the teaching-learning environment involves the selection of learning activities, student experiences, and instructional methods (step 5). Finally, an iterative process of evaluation and adjustment of individual courses, and the curriculum as a whole, is entered (step 6). Successful implementation of CBPE requires a system of effective quality management and continuous professional development as a teacher. In this article suggestions for the organization of CBPE and references to more detailed literature are given, with the aim of facilitating the implementation of CBPE. PMID:28970422

  17. Pre-eruptive magmatic processes re-timed using a non-isothermal approach to magma chamber dynamics.

    PubMed

    Petrone, Chiara Maria; Bugatti, Giuseppe; Braschi, Eleonora; Tommasini, Simone

    2016-10-05

    Constraining the timescales of pre-eruptive magmatic processes in active volcanic systems is paramount to understand magma chamber dynamics and the triggers for volcanic eruptions. Temporal information of magmatic processes is locked within the chemical zoning profiles of crystals but can be accessed by means of elemental diffusion chronometry. Mineral compositional zoning testifies to the occurrence of substantial temperature differences within magma chambers, which often bias the estimated timescales in the case of multi-stage zoned minerals. Here we propose a new Non-Isothermal Diffusion Incremental Step model to take into account the non-isothermal nature of pre-eruptive processes, deconstructing the main core-rim diffusion profiles of multi-zoned crystals into different isothermal steps. The Non-Isothermal Diffusion Incremental Step model represents a significant improvement in the reconstruction of crystal lifetime histories. Unravelling stepwise timescales at contrasting temperatures provides a novel approach to constraining pre-eruptive magmatic processes and greatly increases our understanding of magma chamber dynamics.
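The incremental-step bookkeeping behind such non-isothermal modelling can be illustrated numerically: over a sequence of isothermal segments, the accumulated squared diffusion length is the sum of D(T_i)·t_i with an Arrhenius diffusivity. The D0 and Ea values below are placeholders for illustration, not calibrated for any mineral:

```python
import math

R = 8.314        # gas constant, J mol^-1 K^-1
D0 = 1.0e-6      # pre-exponential factor, m^2 s^-1 (hypothetical)
Ea = 250.0e3     # activation energy, J mol^-1 (hypothetical)

def diffusivity(T_kelvin):
    """Arrhenius temperature dependence of the diffusion coefficient."""
    return D0 * math.exp(-Ea / (R * T_kelvin))

# Two isothermal steps: (duration in seconds, temperature in kelvin).
steps = [(3.15e7, 1373.0),   # ~1 year at the cooler storage temperature
         (3.15e6, 1473.0)]   # ~0.1 year at the hotter recharge temperature
L_squared = sum(diffusivity(T) * t for t, T in steps)
L_micron = math.sqrt(L_squared) * 1e6  # characteristic diffusion length, in micrometres
```

Because D depends exponentially on temperature, a short hot step can dominate the accumulated diffusion length, which is why treating a multi-stage history as a single isotherm biases the recovered timescales.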

  18. Pre-eruptive magmatic processes re-timed using a non-isothermal approach to magma chamber dynamics

    PubMed Central

    Petrone, Chiara Maria; Bugatti, Giuseppe; Braschi, Eleonora; Tommasini, Simone

    2016-01-01

    Constraining the timescales of pre-eruptive magmatic processes in active volcanic systems is paramount to understand magma chamber dynamics and the triggers for volcanic eruptions. Temporal information of magmatic processes is locked within the chemical zoning profiles of crystals but can be accessed by means of elemental diffusion chronometry. Mineral compositional zoning testifies to the occurrence of substantial temperature differences within magma chambers, which often bias the estimated timescales in the case of multi-stage zoned minerals. Here we propose a new Non-Isothermal Diffusion Incremental Step model to take into account the non-isothermal nature of pre-eruptive processes, deconstructing the main core-rim diffusion profiles of multi-zoned crystals into different isothermal steps. The Non-Isothermal Diffusion Incremental Step model represents a significant improvement in the reconstruction of crystal lifetime histories. Unravelling stepwise timescales at contrasting temperatures provides a novel approach to constraining pre-eruptive magmatic processes and greatly increases our understanding of magma chamber dynamics. PMID:27703141

  19. Conversion treatment of thin titanium layer deposited on carbon steel

    NASA Astrophysics Data System (ADS)

    Benarioua, Younes; Wendler, Bogdan; Chicot, Didier

    2018-05-01

    The present study has been conducted in order to obtain titanium carbide layer using a conversion treatment consisting of two main steps. In the first step a thin pure titanium layer was deposited on 120C4 carbon steel by PVD. In the second step, the carbon atoms from the substrate diffuse to the titanium coating due to a vacuum annealing treatment and the Ti coating transforms into titanium carbide. Depending on the annealing temperature a partial or complete conversion into TiC is obtained. The hardness of the layer can be expected to differ depending on the processing temperatures. By a systematic study of the hardness as a function of the applied load, we confirm the process of growth of the layer.

  20. One-step solution combustion synthesis of pure Ni nanopowders with enhanced coercivity: The fuel effect

    NASA Astrophysics Data System (ADS)

    Khort, Alexander; Podbolotov, Kirill; Serrano-García, Raquel; Gun'ko, Yurii K.

    2017-09-01

    In this paper, we report a new modified one-step combustion synthesis technique for the production of Ni metal nanoparticles. The main unique feature of our approach is the use of microwave assisted foam preparation. The effect of different types of fuels (urea, citric acid, glycine and hexamethylenetetramine) on the combustion process and the characteristics of the resultant solid products was also investigated. It is observed that the combination of microwave assisted foam preparation and the use of hexamethylenetetramine as a fuel allows the production of pure ferromagnetic Ni metal nanoparticles with enhanced coercivity (78 Oe) and a high saturation magnetization (52 emu/g) by one-step solution combustion synthesis under a normal air atmosphere without any post-reduction processing.

  1. Impact of Marine Drugs on Animal Reproductive Processes

    PubMed Central

    Silvestre, Francesco; Tosti, Elisabetta

    2009-01-01

    The discovery and description of bioactive substances from natural sources has been a research topic for the last 50 years. In this respect, marine animals have been used to extract many new compounds exerting different actions. Reproduction is a complex process whose main steps are the production and maturation of gametes, their activation, fertilisation and the beginning of development. The literature shows that many substances extracted from marine organisms may have a profound influence on the reproductive behaviour, function, strategies and survival of species. However, despite the central importance of reproduction and thus the maintenance of species, there are still few studies on how reproductive mechanisms are impacted by marine bioactive drugs. At present, studies in both marine and terrestrial animals have been particularly important in identifying which specific reproductive mechanisms are affected by marine-derived substances. In this review we describe the main steps of the biology of reproduction and the impact of substances from the marine environment and organisms on reproductive processes. PMID:20098597

  2. A sandpile model of grain blocking and consequences for sediment dynamics in step-pool streams

    NASA Astrophysics Data System (ADS)

    Molnar, P.

    2012-04-01

    Coarse grains (cobbles to boulders) are set in motion in steep mountain streams by floods with sufficient energy to erode the particles locally and transport them downstream. During transport, grains are often blocked and form width-spanning structures called steps, separated by pools. The step-pool system is a transient, self-organizing and self-sustaining structure. The temporary storage of sediment in steps and the release of that sediment in avalanche-like pulses when steps collapse leads to complex nonlinear threshold-driven dynamics in sediment transport, which has been observed in laboratory experiments (e.g., Zimmermann et al., 2010) and in the field (e.g., Turowski et al., 2011). The basic question in this paper is whether the emergent statistical properties of sediment transport in step-pool systems may be linked to the transient state of the bed, i.e. sediment storage and morphology, and to the dynamics in sediment input. The hypothesis is that this state, in which sediment transporting events due to the collapse and rebuilding of steps of all sizes occur, is analogous to a critical state in self-organized open dissipative dynamical systems (Bak et al., 1988). To explore the process of self-organization, a cellular automaton sandpile model is used to simulate the processes of grain blocking and hydraulically-driven step collapse in a 1-d channel. Particles are injected at the top of the channel and are allowed to travel downstream based on various local threshold rules, with the travel distance drawn from a chosen probability distribution. In sandpile modelling this is a simple 1-d limited non-local model; however, it has been shown to have nontrivial dynamical behaviour (Kadanoff et al., 1989), and it captures the essence of stochastic sediment transport in step-pool systems.
The numerical simulations are used to illustrate the differences between input and output sediment transport rates, mainly focussing on the magnification of intermittency and variability in the system response by the processes of grain blocking and step collapse. The temporal correlation in input and output rates and the number of grains stored in the system at any given time are quantified by spectral analysis and statistics of long-range dependence. Although the model is only conceptually conceived to represent the real processes of step formation and collapse, connections will be made between the modelling results and some field and laboratory data on step-pool systems. The main focus in the discussion will be to demonstrate how even in such a simple model the processes of grain blocking and step collapse may impact the sediment transport rates to the point that certain changes in input are not visible anymore, along the lines of "shredding the signals" proposed by Jerolmack and Paola (2010). The consequences are that the notions of stability and equilibrium, the attribution of cause and effect, and the timescales of process and form in step-pool systems, and perhaps in many other fluvial systems, may have very limited applicability.
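A minimal version of such a 1-d limited non-local sandpile can be sketched as follows; the threshold and travel-distance distribution are illustrative choices, not the parameters used in the study:

```python
import random

random.seed(0)
N, THRESHOLD = 20, 4
heights = [0] * N   # grains stored at each site along the channel
output = 0          # grains that have left the downstream end

for _ in range(500):
    heights[0] += 1                          # feed one grain at the top
    unstable = True
    while unstable:                          # relax until no site topples
        unstable = False
        for i in range(N):
            if heights[i] >= THRESHOLD:      # local threshold rule (step collapse)
                heights[i] -= 1
                j = i + random.randint(1, 3) # non-local downstream travel distance
                if j >= N:
                    output += 1              # grain exits the reach
                else:
                    heights[j] += 1          # grain is blocked and stored again
                unstable = True
```

Even this toy model produces pulsed output: grains accumulate until a threshold is crossed, then avalanche downstream, so the output series is far more intermittent than the steady one-grain-per-step input.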

  3. Refining each process step to accelerate the development of biorefineries

    DOE PAGES

    Chandra, Richard P.; Ragauskas, Art J.

    2016-06-21

    Research over the past decade has been mainly focused on overcoming hurdles in the pretreatment, enzymatic hydrolysis, and fermentation steps of biochemical processing. Pretreatments have improved significantly in their ability to fractionate and recover the cellulose, hemicellulose, and lignin components of biomass while producing substrates containing carbohydrates that can be easily broken down by hydrolytic enzymes. There is a rapid movement towards pretreatment processes that incorporate mechanical treatments that make use of existing infrastructure in the pulp and paper industry, which has experienced a downturn in its traditional markets. Enzyme performance has also made great strides with breakthrough developments in nonhydrolytic protein components, such as lytic polysaccharide monooxygenases, as well as the improvement of enzyme cocktails. The fermentability of pretreated and hydrolyzed sugar streams has been improved through strategies such as the use of reducing agents for detoxification, strain selection, and strain improvements. Although significant progress has been made, tremendous challenges still remain to advance each step of biochemical conversion, especially when processing woody biomass. In addition to technical and scale-up issues within each step of the bioconversion process, biomass feedstock supply and logistics challenges still remain at the forefront of biorefinery research.

  4. Cold Test Operation of the German VEK Vitrification Plant

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fleisch, J.; Schwaab, E.; Weishaupt, M.

    2008-07-01

    In 2007 the German High-Level Liquid Waste (HLLW) Vitrification plant VEK (Verglasungseinrichtung Karlsruhe) passed a three-month integral cold test operation as the final step before entering the hot phase. The overall performance of the vitrification process equipment, with a liquid-fed ceramic glass melter as the main component, proved to be completely in line with the requirements of the regulatory body. The retention efficiency of the main radioactivity-bearing elements across the melter and wet off-gas treatment system distinctly exceeded the design values. The strategy to produce a specified waste glass was successfully demonstrated. The results of the cold test operation allow entering the next step of hot commissioning, i.e. processing of approximately 2 m³ of diluted HLLW. In summary: an important step of the VEK vitrification plant towards hot operation has been the performance of the cold test operation from April to July 2007. This first integral operation was carried out under boundary conditions and rules established for radioactive operation. Operation and process control were carried out following the procedures documented in the licensed operational manuals. The function of the process technology and the safety of operation were demonstrated. No severe problems were encountered. Based on the positive results of the cold test, application for the license for hot operation has been initiated and approval is expected in the near future. (authors)

  5. Supercritical CO2 assisted process for the production of high-purity and sterile nano-hydroxyapatite/chitosan hybrid scaffolds.

    PubMed

    Ruphuy, G; Souto-Lopes, M; Paiva, D; Costa, P; Rodrigues, A E; Monteiro, F J; Salgado, C L; Fernandes, M H; Lopes, J C; Dias, M M; Barreiro, M F

    2018-04-01

    Hybrid scaffolds composed of hydroxyapatite (HAp), in particular in its nanometric form (n-HAp), and chitosan (CS) are promising materials for non-load-bearing bone graft applications. The main constraints on their production concern the successful implementation of the final purification/neutralization and sterilization steps. Often, the purification strategies used can compromise scaffold structural features, and conventional sterilization techniques can result in thermal degradation of the material and/or contamination with toxic residues. In this context, this work presents a process to produce n-HAp/CS scaffolds mimicking bone composition and structure, where an innovative single step based on supercritical CO2 extraction was used for both purification and sterilization. Removal of 80% of the residual acetic acid was obtained (T = 75°C, p = 8.0 MPa, 2 extraction cycles of 2 h), giving rise to scaffolds exhibiting an adequate interconnected porous structure, fast swelling, and a storage modulus compatible with non-load-bearing applications. Moreover, the obtained scaffolds showed cytocompatibility and osteoconductivity without further need for disinfection/sterilization procedures. Among the main advantages, the proposed process comprises only three steps (n-HAp/CS dispersion preparation; freeze-drying; and supercritical CO2 extraction), and the supercritical CO2 extraction shows clear advantages over currently used procedures based on neutralization steps. © 2017 Wiley Periodicals, Inc. J Biomed Mater Res Part B: Appl Biomater, 106B: 965-975, 2018.

  6. Secretory immunoglobulin purification from whey by chromatographic techniques.

    PubMed

    Matlschweiger, Alexander; Engelmaier, Hannah; Himmler, Gottfried; Hahn, Rainer

    2017-08-15

    Secretory immunoglobulins (SIg) are a major fraction of the mucosal immune system and represent potential drug candidates. So far, platform technologies for their purification do not exist. SIg from animal whey was used as a model to develop a simple, efficient and potentially generic chromatographic purification process. Several chromatographic stationary phases were tested. A combination of two anion-exchange steps resulted in the highest purity. The key step was the use of a small-pore anion exchanger operated in flow-through mode. Diffusion of SIg into the resin particles was significantly hindered, while the main impurities, IgG and serum albumin, were bound. In this step, initial purity was increased from 66% to 89% with a step yield of 88%. In a second anion-exchange step using giga-porous material, SIg was captured and purified by step or linear gradient elution to obtain fractions with purities >95%. For the step gradient elution, the step yield of highly pure SIg was 54%. Elution of SIgA and SIgM with a linear gradient resulted in step yields of 56% and 35%, respectively. The overall yield for both anion-exchange steps was 43% for the combination of flow-through and step elution modes. Combining flow-through and linear gradient elution modes resulted in yields of 44% for SIgA and 39% for SIgM. The proposed process allows the purification of biologically active SIg from animal whey at preparative scale. For future applications, the process can easily be adapted for purification of recombinant secretory immunoglobulin species. Copyright © 2017 Elsevier B.V. All rights reserved.

  7. Compaction behavior of out-of-autoclave prepreg materials

    NASA Astrophysics Data System (ADS)

    Serrano, Léonard; Olivier, Philippe; Cinquin, Jacques

    2017-10-01

    The main challenges in composite part manufacturing are related to the curing means, mainly autoclaves, the length of their cycles and their operating costs. In order to decrease this dependency, out-of-autoclave materials have been considered as a solution for high-production-rate parts such as spars, flaps, etc. However, most out-of-autoclave processes do not possess the same maturity as their autoclave counterparts, especially concerning part quality [1]. Some pre-cure processes such as compaction and ply lay-up are usually less of a concern for autoclave manufacturing: the pressure applied during the cycle helps reduce potential defects (porosity caused by poor-quality lay-up, bad compaction, entrapped air or humidity). For out-of-autoclave parts, these are crucial steps which may have many consequences for the final quality of the laminate [2]. In order to avoid this quality loss, these steps must be well understood.

  8. Process modeling of a HLA research lab

    NASA Astrophysics Data System (ADS)

    Ribeiro, Bruna G. C.; Sena, Alexandre C.; Silva, Dilson; Marzulo, Leandro A. J.

    2017-11-01

    Bioinformatics has enabled tremendous breakthroughs in the field of molecular biology. All this evolution has generated a large volume of biological data that increasingly requires the use of computing for analysis and storage. The identification of human leukocyte antigen (HLA) genotypes is critical to the success of organ transplants in humans. HLA typing involves not only laboratory tests but also DNA sequencing, with the participation of several professionals responsible for different stages of the process. Thus, the objective of this paper is to map the main steps in HLA typing in a laboratory specialized in performing such procedures, analyzing each process and proposing solutions to speed up these steps while avoiding mistakes.

  9. Electrochemical reduction of CerMet fuels for transmutation using surrogate CeO2-Mo pellets

    NASA Astrophysics Data System (ADS)

    Claux, B.; Souček, P.; Malmbeck, R.; Rodrigues, A.; Glatz, J.-P.

    2017-08-01

    One of the concepts chosen for the transmutation of minor actinides in Accelerator Driven Systems or fast reactors proposes the use of fuels and targets containing minor actinide oxides embedded in an inert matrix composed either of molybdenum metal (CerMet fuel) or of ceramic magnesium oxide (CerCer fuel). Since sufficient transmutation cannot be achieved in a single step, multi-recycling of the fuel is required, including recovery of the non-transmuted minor actinides. In the present work, a pyrochemical process for the treatment of Mo metal inert-matrix-based CerMet fuels is studied, particularly the electroreduction in molten chloride salt as a head-end step required prior to the main separation process. At the initial stage, different inactive pellets simulating the fuel, containing CeO2 as a minor actinide surrogate, were examined. The main parameters studied for process efficiency were the porosity and composition of the pellets and process parameters such as current density and passed charge. The results indicated the feasibility of the process, gave insight into its limiting parameters and defined the parameters for a future experiment on minor actinide containing material.

  10. CNC Machining Of The Complex Copper Electrodes

    NASA Astrophysics Data System (ADS)

    Popan, Ioan Alexandru; Balc, Nicolae; Popan, Alina

    2015-07-01

    This paper presents the machining process of complex copper electrodes. Machining complex shapes in copper is difficult because this material is soft and sticky. This research presents the main steps for processing these copper electrodes with high dimensional accuracy and good surface quality. Special tooling solutions are required for this machining process, and optimal process parameters have been found for the accurate CNC equipment, using smart CAD/CAM software.

  11. Processing of LEU targets for {sup 99}Mo production--testing and modification of the Cintichem process

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wu, D.; Landsberger, S.; Buchholz, B.

    1995-09-01

    Recent experimental results on testing and modification of the Cintichem process to allow substitution of low enriched uranium (LEU) for high enriched uranium (HEU) targets are presented in this report. The main focus is on {sup 99}Mo recovery and purification by its precipitation with {alpha}-benzoin oxime. Parameters that were studied include concentrations of nitric and sulfuric acids, partial neutralization of the acids, molybdenum and uranium concentrations, and the ratio of {alpha}-benzoin oxime to molybdenum. Decontamination factors for uranium, neptunium, and various fission products were measured. Experiments with tracer levels of irradiated LEU were conducted to test the {sup 99}Mo recovery and purification during each step of the Cintichem process. Improving the process with additional processing steps was also attempted. The results indicate that the conversion of molybdenum chemical processing from HEU to LEU targets is possible.

  12. Isostrychnine synthesis mediated by hypervalent iodine reagent.

    PubMed

    Jacquemot, Guillaume; Maertens, Gaëtan; Canesi, Sylvain

    2015-05-18

    Although there are several reported synthetic routes to strychnine, one of the most widely recognized alkaloids, we report an unexplored route with an oxidative dearomatizing process mediated by hypervalent iodine as the key step. New syntheses of isostrychnine and strychnine have been achieved from a readily available phenol in nine and ten steps, respectively. In addition to the key step, these syntheses involve an aza-Michael-ether-enol tandem transformation, two Heck-type cyclizations, a reductive isomerization, and a cascade double reductive amination leading to the main core of the alkaloid. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  13. Supporting Active Patient and Health Care Collaboration: A Prototype for Future Health Care Information Systems.

    PubMed

    Åhlfeldt, Rose-Mharie; Persson, Anne; Rexhepi, Hanife; Wåhlander, Kalle

    2016-12-01

    This article presents and illustrates the main features of a proposed process-oriented approach for patient information distribution in future health care information systems, by using a prototype of a process support system. The development of the prototype was based on the Visuera method, which includes five defined steps. The results indicate that a visualized prototype is a suitable tool for illustrating both the opportunities and constraints of future ideas and solutions in e-Health. The main challenges for developing and implementing a fully functional process support system concern both technical and organizational/management aspects. © The Author(s) 2015.

  14. 3D road marking reconstruction from street-level calibrated stereo pairs

    NASA Astrophysics Data System (ADS)

    Soheilian, Bahman; Paparoditis, Nicolas; Boldo, Didier

    This paper presents an automatic approach to road marking reconstruction using stereo pairs acquired by a mobile mapping system in a dense urban area. Two types of road markings were studied: zebra crossings (crosswalks) and dashed lines. These two types of road markings consist of strips having known shape and size. These geometric specifications are used to constrain the recognition of strips. In both cases (i.e. zebra crossings and dashed lines), the reconstruction method consists of three main steps. The first step extracts edge points from the left and right images of a stereo pair and computes 3D linked edges using a matching process. The second step comprises a filtering process that uses the known geometric specifications of road marking objects. The goal is to preserve linked edges that can plausibly belong to road markings and to filter others out. The final step uses the remaining linked edges to fit a theoretical model to the data. The method developed has been used for processing a large number of images. Road markings are successfully and precisely reconstructed in dense urban areas under real traffic conditions.
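    The three-step pipeline above lends itself to a miniature sketch; the snippet below shows only the specification-based filtering (step 2), with hypothetical strip dimensions and tolerance rather than the paper's actual road-marking specifications.

```python
# Minimal sketch of the specification-based filtering step (step 2).
# Strip dimensions and tolerance are hypothetical, not the paper's specs.

def filter_strips(candidates, length_spec, width_spec, tol=0.15):
    """Keep candidate strips whose measured length and width fall within
    a relative tolerance of the known road-marking specification."""
    kept = []
    for length, width in candidates:
        if (abs(length - length_spec) <= tol * length_spec and
                abs(width - width_spec) <= tol * width_spec):
            kept.append((length, width))
    return kept

# Dashed-line strips: e.g. 3.0 m long, 0.15 m wide (illustrative values).
candidates = [(3.05, 0.14), (1.2, 0.15), (2.9, 0.16), (3.0, 0.40)]
print(filter_strips(candidates, length_spec=3.0, width_spec=0.15))
# → [(3.05, 0.14), (2.9, 0.16)]
```

    In the real system this filter would operate on 3D linked edges from stereo matching; here plain (length, width) pairs stand in for them.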

  15. Improving the Desulfurization Degree of High-Grade Nickel Matte via a Two-Step Oxidation Roasting Process

    NASA Astrophysics Data System (ADS)

    Xi, Zhao; Wang, Zhixing; Li, Xinhai; Guo, Huajun; Yan, Guochun; Wang, Jiexi

    2018-05-01

    Generally, sulfur elimination from nickel matte was incomplete in the one-step oxidation roasting process. In this work, X-ray diffraction, scanning electron microscopy/energy-dispersive X-ray spectroscopy, and chemical analysis of the roasted products were carried out to explain this phenomenon. The results indicated that the melting of heazlewoodite was the main limiting factor. Thereafter, the oxidation mechanism of high-grade nickel matte from room temperature to 1000 °C was studied. It was found that the transformation from heazlewoodite (Ni3S2) to nickel sulfide (NiS) took place from 400 °C to 520 °C. Considering that the melting temperature of NiS was much higher than that of Ni3S2, a low-temperature roasting step was suggested to suppress the melting of heazlewoodite. Under the optimum conditions (520 °C for 120 minutes followed by 800 °C for 80 minutes), the degree of desulfurization reached 99.52 pct. These results indicated that the two-step oxidation roasting method could be a promising process for producing low-sulfur calcine from high-grade nickel matte.
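    For reference, the degree of desulfurization is simply the fraction of the initial sulfur content removed; a minimal sketch with hypothetical assay values (not the study's measurements):

```python
# Illustrative arithmetic only: degree of desulfurization as the fraction of
# initial sulfur removed. The assay values below are hypothetical.
def desulfurization_degree(s_initial_pct, s_final_pct):
    """Percent of the initial sulfur content removed during roasting."""
    return (s_initial_pct - s_final_pct) / s_initial_pct * 100.0

print(round(desulfurization_degree(22.0, 0.1056), 2))  # → 99.52
```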

  16. A three step supercritical process to improve the dissolution rate of eflucimibe.

    PubMed

    Rodier, Elisabeth; Lochard, Hubert; Sauceau, Martial; Letourneau, Jean-Jacques; Freiss, Bernard; Fages, Jacques

    2005-10-01

    The aim of this study is to improve the dissolution properties of a poorly soluble active substance, eflucimibe, by associating it with gamma-cyclodextrin. To achieve this objective, a new three-step process based on supercritical fluid technology has been proposed. First, eflucimibe and cyclodextrin are co-crystallized using an anti-solvent process, dimethylsulfoxide being the solvent and supercritical carbon dioxide the anti-solvent. Second, the co-crystallized powder is held in static mode under supercritical conditions for several hours; this is the maturing step. Third, in a final stripping step, supercritical CO2 is flowed through the matured powder to extract the residual solvent. The coupling of the first two steps brings about a significant synergistic effect in improving the dissolution rate of the drug. The nature of the entity obtained at the end of each step is discussed and some suggestions are made as to what happens in these operations. It is shown that co-crystallization ensures good dispersion of both compounds and is rather insensitive to the operating parameters tested. The maturing step allows some dissolution-recrystallization to occur, thus intensifying the intimate contact between the two compounds. Addition of water is necessary to make maturing effective, as this is governed by the transfer properties of the medium. The stripping step allows extraction of the residual solvent but also removes some of the eflucimibe, which is the main drawback of this final stage.

  17. Hybrid codes with finite electron mass

    NASA Astrophysics Data System (ADS)

    Lipatov, A. S.

    This report is devoted to the current status of the hybrid multiscale simulation technique. Different aspects of modeling are discussed. In particular, we consider the different levels of description of the plasma model; however, the main attention is paid to conventional hybrid models. We discuss the main steps of the time integration of the Vlasov/Maxwell system of equations, with particular attention to models with finite electron mass. Such a model may allow us to explore plasma systems with multiscale phenomena ranging from ion to electron scales. As applications of the hybrid modeling technique we consider the simulation of plasma processes at collisionless shocks and, briefly, magnetic field reconnection processes.
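    As a minimal illustration of the particle-push part of such a time integration, the following sketches the standard Boris mover commonly used for the kinetic ions in hybrid codes; the fields and parameters are illustrative, not taken from the report.

```python
import numpy as np

# Minimal sketch of a Boris particle push, a standard ion mover in hybrid
# codes. Field values and parameters below are illustrative only.

def boris_push(x, v, E, B, q_m, dt):
    """Advance one particle by dt in given E, B fields (Boris rotation)."""
    v_minus = v + 0.5 * q_m * E * dt          # first half electric kick
    t = 0.5 * q_m * B * dt                    # rotation vector
    v_prime = v_minus + np.cross(v_minus, t)
    s = 2.0 * t / (1.0 + np.dot(t, t))
    v_plus = v_minus + np.cross(v_prime, s)   # magnetic rotation
    v_new = v_plus + 0.5 * q_m * E * dt       # second half electric kick
    return x + v_new * dt, v_new

# Gyration test: with E = 0 the Boris rotation conserves kinetic energy.
x, v = np.zeros(3), np.array([1.0, 0.0, 0.0])
B = np.array([0.0, 0.0, 1.0])
for _ in range(100):
    x, v = boris_push(x, v, np.zeros(3), B, q_m=1.0, dt=0.1)
print(np.linalg.norm(v))  # speed stays 1.0 up to round-off
```

    In a full hybrid model this push is coupled to a field solver with a generalized Ohm's law, which is where the finite electron mass enters.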

  18. Numerical modeling of the fracture process in a three-unit all-ceramic fixed partial denture.

    PubMed

    Kou, Wen; Kou, Shaoquan; Liu, Hongyuan; Sjögren, Göran

    2007-08-01

    The main objectives were to examine the fracture mechanism and process of a ceramic fixed partial denture (FPD) framework under simulated mechanical loading using a recently developed numerical modeling code, the R-T(2D) code, and to evaluate the suitability of the R-T(2D) code as a tool for this purpose. Using the R-T(2D) code, the fracture mechanism and process of a three-unit yttria-tetragonal zirconia polycrystal ceramic (Y-TZP) FPD framework were simulated under static loading. In addition, the fracture pattern obtained using the numerical simulation was compared with the fracture pattern obtained in a previous laboratory test. The results revealed that the simulated framework fracture pattern agreed with that observed in the previous laboratory test. The quasi-photoelastic stress fringe pattern and acoustic emission showed that the fracture mechanism was tensile failure and that the crack started at the lower boundary of the framework. The fracture process could be followed both step by step and within each step. Based on the findings of the current study, the R-T(2D) code seems suitable for use as a complement to other tests and clinical observations in studying stress distribution, fracture mechanisms and fracture processes in ceramic FPD frameworks.

  19. Best Practices In Overset Grid Generation

    NASA Technical Reports Server (NTRS)

    Chan, William M.; Gomez, Reynaldo J., III; Rogers, Stuart E.; Buning, Pieter G.; Kwak, Dochan (Technical Monitor)

    2002-01-01

    Grid generation for overset grids on complex geometry can be divided into four main steps: geometry processing, surface grid generation, volume grid generation and domain connectivity. For each of these steps, the procedures currently practiced by experienced users are described. Typical problems encountered are also highlighted and discussed. Most of the guidelines are derived from experience on a variety of problems including space launch and return vehicles, subsonic transports with propulsion and high lift devices, supersonic vehicles, rotorcraft vehicles, and turbomachinery.

  20. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fisher, L.M.

    Presently, several methods of HTS-conductor processing are under study in the authors' laboratory, among them "powder-in-tube" (PIT), "jelly-roll", and electrophoresis. The PIT process has developed predominantly, in view of the achieved J{sub c} values. The Bi-2223 phase was used as the core material for these tapes. Since the main purpose of the task order was to advance the development of long-length high-temperature superconductor tapes, the authors considered it reasonable to perfect the PIT process step by step, or tape by tape. To realize this they have chosen, while keeping the basic scheme of the PIT process stable, to vary several technological parameters, which are as follows: (1) type of initial powder; (2) sheath material; (3) tape construction (filament number, cross section, etc.); and (4) processing regimes. This report covers the fabrication process and characteristics of the produced conductors.

  1. Ni-MH spent batteries: a raw material to produce Ni-Co alloys.

    PubMed

    Lupi, Carla; Pilone, Daniela

    2002-01-01

    Ni-MH spent batteries are heterogeneous and complex materials, so any metallurgical recovery process needs a mechanical pre-treatment, at least to separate ferrous materials and recyclable plastics (like ABS), in order to obtain additional profit from this saleable scrap as well as to minimize waste arising from the breaking/separation process. Pyrometallurgical processing is not suitable for treating Ni-MH batteries, mainly because of rare earth losses in the slag. On the other hand, the hydrometallurgical route, which offers better opportunities in terms of recovery yield and higher purity of Ni, Co and rare earths (RE), requires several process steps, as shown in the technical literature. The main problems during leach liquor purification are the removal of elements such as Mn, Zn and Cd, dissolved during the leaching step, and the separation of Ni from Co. In the present work, the latter problem is overcome by co-deposition of a Ni-35/40 wt% Co alloy of good quality. The experiments carried out in a laboratory-scale pilot plant show that a current efficiency higher than 91% can be reached in long-duration electrowinning tests performed at 50°C and a catholyte pH of 4.3.

  2. Writing a Cochrane systematic review on preventive interventions to improve safety: the case of the construction industry.

    PubMed

    van der Molen, H F; Hoonakker, P L T; Lehtola, Marika M; Hsiao, H; Haslam, R A; Hale, A R; Verbeek, J H

    2009-01-01

    The objective of this paper is to describe the main steps in conducting a systematic literature review on preventive interventions for work-related injuries and to illustrate the process. Based on the Cochrane handbook, a structured framework of six steps was outlined for the development of a systematic review. This framework was used to describe a Cochrane systematic review (CSR) on the effectiveness of interventions to prevent work-related injuries in the construction industry. The six main steps in writing a CSR were: formulating the problem and objectives; locating and selecting studies; assessing study quality; collecting data; analysing data and presenting results; and interpreting results. The CSR on preventing injuries in the construction industry yielded five eligible intervention studies. Re-analysis of the original injury data of the studies on regulatory interventions, correcting for pre-intervention injury trends, led to different conclusions about the effectiveness of the interventions than those reported in the original studies. The Cochrane handbook for systematic reviews of interventions provides a practical and feasible six-step framework for developing and reporting a systematic review of preventive interventions.

  3. Sequential lignin depolymerization by combination of biocatalytic and formic acid/formate treatment steps.

    PubMed

    Gasser, Christoph A; Čvančarová, Monika; Ammann, Erik M; Schäffer, Andreas; Shahgaldian, Patrick; Corvini, Philippe F-X

    2017-03-01

    Lignin, a complex three-dimensional amorphous polymer, is considered to be a potential natural renewable resource for the production of low-molecular-weight aromatic compounds. In the present study, a novel sequential lignin treatment method consisting of a biocatalytic oxidation step followed by a formic acid-induced lignin depolymerization step was developed and optimized using response surface methodology. The biocatalytic step employed a laccase mediator system using the redox mediator 1-hydroxybenzotriazole. Laccases were immobilized on superparamagnetic nanoparticles using a sorption-assisted surface conjugation method allowing easy separation and reuse of the biocatalysts after treatment. Under optimized conditions, as much as 45 wt% of lignin could be solubilized either in aqueous solution after the first treatment or in ethyl acetate after the second (chemical) treatment. The solubilized products were found to be mainly low-molecular-weight aromatic monomers and oligomers. The process might be used for the production of low-molecular-weight soluble aromatic products that can be purified and/or upgraded by applying further downstream processes.

  4. Synthesis of Titanium Oxycarbide from Titanium Slag by Methane-Containing Gas

    NASA Astrophysics Data System (ADS)

    Dang, Jie; Fatollahi-Fard, Farzin; Pistorius, Petrus Christiaan; Chou, Kuo-Chih

    2018-02-01

    In this study, reaction steps of a process for synthesis of titanium oxycarbide from titanium slag were demonstrated. This process involves the reduction of titanium slag by a methane-hydrogen-argon mixture at 1473 K (1200 °C) and the leaching of the reduced products by hydrofluoric acid near room temperature to remove the main impurity (Fe3Si). Some iron was formed by disproportionation of the main M3O5 phase before gaseous reduction started. Upon reduction, more iron formed first, followed by reduction of titanium dioxide to suboxides and eventually oxycarbide.

  5. The line roughness improvement with plasma coating and cure treatment for 193nm lithography and beyond

    NASA Astrophysics Data System (ADS)

    Zheng, Erhu; Huang, Yi; Zhang, Haiyang

    2017-03-01

    As CMOS technology reaches the 14nm node and beyond, one of the key challenges in extending 193nm immersion lithography is how to control line edge and line width roughness (LER/LWR). For Self-aligned Multiple Patterning (SaMP), LER becomes larger while LWR becomes smaller as the process proceeds [1]. This means the plasma etch process becomes increasingly dominant for LER reduction. In this work, we mainly focus on the core etch solution, including an extra plasma coating process introduced before the bottom anti-reflective coating (BARC) open step, and an extra plasma cure process applied right after the BARC-open step. Firstly, we leveraged the optimal design experiment (ODE) to investigate the impact of the plasma coating step on LER and identified the optimal condition. ODE is an appropriate method for screening experiments on non-linear parameters in dynamic process models, especially for cost-intensive industries [2]. Finally, we obtained a plasma coating treatment condition that has been proven to achieve a 32% LER improvement compared with the standard process. Furthermore, the plasma cure scheme has also been optimized with the ODE method to counter the LWR degradation induced by the plasma coating treatment.

  6. An AIDS risk reduction program for Dutch drug users: an intervention mapping approach to planning.

    PubMed

    van Empelen, Pepijn; Kok, Gerjo; Schaalma, Herman P; Bartholomew, L Kay

    2003-10-01

    This article presents the development of a theory- and evidence-based AIDS prevention program targeting Dutch drug users and aimed at promoting condom use. The emphasis is on the development of the program using a five-step intervention development protocol called intervention mapping (IM). Preceding Step 1 of the IM process, an assessment of the HIV problem among drug users was conducted. The product of IM Step 1 was a series of program objectives specifying what drug users should learn in order to use condoms consistently. In Step 2, theoretical methods for influencing the most important determinants were chosen and translated into practical strategies that fit the program objectives. The main strategy chosen was behavioral journalism. In Step 3, leaflets with role-model stories based on authentic interviews with drug users were developed and pilot tested. Finally, the need for cooperation with program users is discussed in IM Steps 4 and 5.

  7. The effect of double steps heat treatment on the microstructure of nanostructure bainitic medium carbon steels

    NASA Astrophysics Data System (ADS)

    Foughani, Milad; Kolahi, Alireza; Palizdar, Yahya

    2018-01-01

    Nowadays, nanostructured bainitic steels have attracted attention mostly because of their special mechanical properties, such as high tensile strength, hardness, adequate toughness and low manufacturing cost. The main concern for the mass production of this type of steel is the prolonged austempering process, which increases production costs as well as time. In this research, in order to accelerate the bainitic transformation and decrease production time, a medium carbon steel was prepared and a two-step austempering process was employed to prevent thickening of the bainite laths. The samples were austenitized at 1000°C for 15 min and were kept in a salt bath for 1-12 hours at 290°C in the one-step treatment, and for 1-12 hours in the temperature range of 250°C-300°C in the two-step bainite transformation. The resulting microstructures were studied by optical and field-emission scanning electron microscopy (FESEM), and the mechanical properties were investigated using tensile and hardness tests. The results show that the two-step austempering process and lower carbon concentration lead to shorter austempering times as well as the formation of more stable retained austenite and nanostructured bainite laths, which results in better mechanical properties.

  8. Managing frame diversity in environmental participatory processes - Example from the Fogera woreda in Ethiopia.

    PubMed

    Hassenforder, Emeline; Brugnach, Marcela; Cullen, Beth; Ferrand, Nils; Barreteau, Olivier; Daniell, Katherine Anne; Pittock, Jamie

    2016-07-15

    Many participatory processes fail to generate social change and collaborative outcomes. This failure can partly be explained by how divergent stakeholders' frames are handled. This paper builds on the framing and participation literature to explain how facilitators can manage frame diversity and foster collaborative outcomes. It suggests two pragmatic steps: identifying frames and managing frames. The two steps are applied to a participatory process for natural resource management in Fogera, Ethiopia. The effectiveness of facilitators' strategies to manage frame diversity in the Fogera case is discussed. Two main elements challenging effectiveness are identified: counter-strategies used by facilitators and the most powerful stakeholders, and the constraining factors of knowledge, champions and frame sponsorship. We argue that these elements need to be taken into account by participatory process facilitators when managing frame diversity. Copyright © 2016 Elsevier Ltd. All rights reserved.

  9. Development and in-line validation of a Process Analytical Technology to facilitate the scale up of coating processes.

    PubMed

    Wirges, M; Funke, A; Serno, P; Knop, K; Kleinebudde, P

    2013-05-05

    Incorporation of an active pharmaceutical ingredient (API) into the coating layer of film-coated tablets is a method mainly used to formulate fixed-dose combinations. Uniform and precise spray-coating of an API represents a substantial challenge, which could be overcome by applying Raman spectroscopy as a process analytical tool. In the pharmaceutical industry, Raman spectroscopy is still mainly used as a benchtop laboratory analytical method and is usually not implemented in the production process. Many scientific approaches stop at the level of feasibility studies and do not manage the step to production-scale process applications. The present work focuses on the scale-up of an active coating process, a step of the highest importance during pharmaceutical development. Active coating experiments were performed at lab and production scale. Using partial least squares (PLS), a multivariate model was constructed by correlating in-line measured Raman spectral data with the coated amount of API. By transferring this model, implemented for a lab-scale process, to a production-scale process, the robustness of this analytical method and thus its applicability as a Process Analytical Technology (PAT) tool for correct endpoint determination in pharmaceutical manufacturing could be shown. Finally, the method was validated according to the European Medicines Agency (EMA) guideline with respect to the special requirements of the applied in-line model development strategy. Copyright © 2013 Elsevier B.V. All rights reserved.
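    The kind of PLS calibration described above can be sketched in a few lines; the snippet below is a bare-bones NIPALS PLS1 on synthetic low-rank "spectra", not a reproduction of the paper's actual Raman model.

```python
import numpy as np

# Bare-bones PLS1 (NIPALS) sketch: relate spectra X to a scalar response y.
# The data below are synthetic stand-ins, not the paper's Raman spectra.

def pls1_fit(X, y, n_comp):
    """Fit a PLS1 model; returns centering terms and a regression vector."""
    x_mean, y_mean = X.mean(axis=0), y.mean()
    Xc, yc = X - x_mean, y - y_mean
    W, P, Q = [], [], []
    for _ in range(n_comp):
        w = Xc.T @ yc
        w /= np.linalg.norm(w)          # weight vector
        t = Xc @ w                      # scores
        tt = t @ t
        p = Xc.T @ t / tt               # X loadings
        q = (yc @ t) / tt               # y loading
        Xc = Xc - np.outer(t, p)        # deflate X and y
        yc = yc - q * t
        W.append(w); P.append(p); Q.append(q)
    W, P = np.array(W).T, np.array(P).T
    B = W @ np.linalg.solve(P.T @ W, np.array(Q))
    return x_mean, y_mean, B

def pls1_predict(model, X):
    x_mean, y_mean, B = model
    return (X - x_mean) @ B + y_mean

# Synthetic rank-2 "spectra": 20 samples x 50 channels driven by 2 factors.
rng = np.random.default_rng(0)
T = rng.normal(size=(20, 2))
X = T @ rng.normal(size=(2, 50))
y = T[:, 0] - 0.5 * T[:, 1]             # response lies in the factor space
model = pls1_fit(X, y, n_comp=2)
err = np.abs(pls1_predict(model, X) - y).max()
print(err < 1e-6)  # two latent variables recover this calibration exactly
```

    In practice the model would be fitted on lab-scale spectra and then challenged with production-scale data, which is the transfer step the abstract describes.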

  10. Integrating intelligent transportation systems within the transportation planning process : an interim handbook

    DOT National Transportation Integrated Search

    1999-06-01

    The main purpose of Phase I of this project was to develop a methodology for predicting consequences of hazardous material (HM) crashes, such as injuries and property damage. An initial step in developing a risk assessment is to reliably estimate the...

  11. Recent developments in biohythane production from household food wastes: A review.

    PubMed

    Bolzonella, David; Battista, Federico; Cavinato, Cristina; Gottardo, Marco; Micolucci, Federico; Lyberatos, Gerasimos; Pavan, Paolo

    2018-06-01

    Biohythane is a hydrogen-methane blend with a hydrogen concentration between 10 and 30% v/v. It can be produced from different organic substrates by two sequential anaerobic stages: a dark fermentation step followed by an anaerobic digestion step, for hydrogen and methane production, respectively. The advantages of this blend over either hydrogen or methane as separate biofuels are first presented in this work. The two-stage anaerobic process and its main operating parameters are then discussed. Attention is focused on the production of biohythane from household food wastes, one of the most abundant organic substrates available for anaerobic digestion: the main milestones and future trends are presented. In particular, the possibility of co-digesting food wastes and sewage sludge to improve the process yield is discussed. Finally, the paper illustrates developments in automotive applications of biohythane as well as its reduced environmental burden. Copyright © 2018 Elsevier Ltd. All rights reserved.

  12. Functional-to-form mapping for assembly design automation

    NASA Astrophysics Data System (ADS)

    Xu, Z. G.; Liu, W. M.; Shen, W. D.; Yang, D. Y.; Liu, T. T.

    2017-11-01

    Assembly-level function-to-form mapping is the most effective procedure towards design automation. The research work mainly includes the assembly-level function definitions, the product network model, and the two-step mapping mechanisms. The function-to-form mapping is divided into two steps: the first-step mapping of function to behavior, and the second-step mapping of behavior to structure. After the first-step mapping, the three-dimensional transmission chain (or 3D sketch) is studied, and feasible design computing tools are developed. The mapping procedure is relatively easy to implement interactively but quite difficult to complete automatically, so manual, semi-automatic, automatic, and interactive modification of the mapping model are studied. A function-to-form mapping process for a mechanical hand is illustrated to verify the design methodologies.

  13. Simulation study of the initial crystallization processes of poly(3-hexylthiophene) in solution: ordering dynamics of main chains and side chains.

    PubMed

    Takizawa, Yuumi; Shimomura, Takeshi; Miura, Toshiaki

    2013-05-23

    We study the initial nucleation dynamics of poly(3-hexylthiophene) (P3HT) in solution, focusing on the relationship between the ordering process of main chains and that of side chains. We carried out Langevin dynamics simulations and found that the initial nucleation process consists of three steps: the ordering of ring orientation, the ordering of main-chain vectors, and the ordering of side chains. At the start, the normal vectors of the thiophene rings aligned in a very short time, followed by alignment of the main-chain end-to-end vectors. The flexible side-chain ordering took almost five times longer than the rigid main-chain ordering. The simulation results indicated that the ordering of side chains was induced after the formation of the regular stack structure of main chains. This slow ordering of the flexible side chains is one of the factors that cause anisotropic nuclei growth, which would be closely related to the formation of nanofiber structures without an external flow field. Our simulation results revealed how the combined structure of the planar, rigid main-chain backbone and the sparse flexible side chains leads to specific ordering behaviors that are not observed in ordinary linear polymer crystallization processes.
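
    A common way to track this kind of ordering numerically is the nematic order parameter P2, obtained from the largest eigenvalue of the ordering tensor Q. The sketch below uses invented vectors (not simulation output) to show the measure distinguishing aligned from isotropic sets of directions, e.g. ring normals or main-chain vectors:

```python
# Illustrative sketch (not the paper's analysis code): quantifying the
# alignment of a set of direction vectors with the nematic order
# parameter P2 = <(3 cos^2(theta) - 1)/2>, computed as the largest
# eigenvalue of the ordering tensor Q.
import numpy as np

def nematic_order(vectors):
    """P2 order parameter for an (n, 3) array of direction vectors."""
    u = vectors / np.linalg.norm(vectors, axis=1, keepdims=True)
    q = 1.5 * np.einsum('ni,nj->ij', u, u) / len(u) - 0.5 * np.eye(3)
    return float(np.linalg.eigvalsh(q)[-1])  # ~1: aligned, ~0: isotropic

rng = np.random.default_rng(1)
aligned = np.tile([0.0, 0.0, 1.0], (500, 1)) + 0.05 * rng.standard_normal((500, 3))
isotropic = rng.standard_normal((2000, 3))
print(round(nematic_order(aligned), 2), round(nematic_order(isotropic), 2))
```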

  14. Development of expert systems for modeling of technological process of pressure casting on the basis of artificial intelligence

    NASA Astrophysics Data System (ADS)

    Gavarieva, K. N.; Simonova, L. A.; Pankratov, D. L.; Gavariev, R. V.

    2017-09-01

    The article considers the main component of an expert system for the high-pressure die-casting process, which consists of algorithms united into logical models. The characteristics of the system, which present data on the state of the controlled object, are described. A series of logically interconnected steps that increase the quality of the produced castings is developed.

  15. SED16 autonomous star tracker night sky testing

    NASA Astrophysics Data System (ADS)

    Foisneau, Thierry; Piriou, Véronique; Perrimon, Nicolas; Jacob, Philippe; Blarre, Ludovic; Vilaire, Didier

    2017-11-01

    The SED16 is an autonomous multi-mission star tracker which delivers the three-axis satellite attitude in an inertial reference frame, and the satellite angular velocity, with no prior information. The qualification process of this star sensor includes five validation steps using an optical star simulator, a digitized image simulator, and a night sky test setup. Night sky testing was the final step of the qualification process, during which all the functions of the star tracker were used in almost nominal conditions: autonomous acquisition of the attitude and autonomous tracking of ten stars. These tests were performed in Calern on the premises of the Observatoire de la Côte d'Azur (OCA). The test setup and the test results are described after a brief review of the sensor's main characteristics and qualification process.

  16. A Unified Model of Cloud-to-Ground Lightning Stroke

    NASA Astrophysics Data System (ADS)

    Nag, A.; Rakov, V. A.

    2014-12-01

    The first stroke in a cloud-to-ground lightning discharge is thought to follow (or be initiated by) the preliminary breakdown process which often produces a train of relatively large microsecond-scale electric field pulses. This process is poorly understood and rarely modeled. Each lightning stroke is composed of a downward leader process and an upward return-stroke process, which are usually modeled separately. We present a unified engineering model for computing the electric field produced by a sequence of preliminary breakdown, stepped leader, and return stroke processes, serving to transport negative charge to ground. We assume that a negatively-charged channel extends downward in a stepped fashion through the relatively-high-field region between the main negative and lower positive charge centers and then through the relatively-low-field region below the lower positive charge center. A relatively-high-field region is also assumed to exist near ground. The preliminary breakdown pulse train is assumed to be generated when the negatively-charged channel interacts with the lower positive charge region. At each step, an equivalent current source is activated at the lower extremity of the channel, resulting in a step current wave that propagates upward along the channel. The leader deposits net negative charge onto the channel. Once the stepped leader attaches to ground (upward connecting leader is presently neglected), an upward-propagating return stroke is initiated, which neutralizes the charge deposited by the leader along the channel. We examine the effect of various model parameters, such as step length and current propagation speed, on model-predicted electric fields. We also compare the computed fields with pertinent measurements available in the literature.

  17. A Featured-Based Strategy for Stereovision Matching in Sensors with Fish-Eye Lenses for Forest Environments

    PubMed Central

    Herrera, Pedro Javier; Pajares, Gonzalo; Guijarro, Maria; Ruz, José J.; Cruz, Jesús M.; Montes, Fernando

    2009-01-01

    This paper describes a novel feature-based stereovision matching process based on a pair of omnidirectional images of forest stands acquired with a stereovision sensor equipped with fish-eye lenses. The stereo analysis problem consists of the following steps: image acquisition, camera modelling, feature extraction, image matching and depth determination. Once the depths of significant points on the trees are obtained, the growing stock volume can be estimated by considering the geometrical camera modelling, which is the final goal. The key steps are feature extraction and image matching, and this paper is devoted solely to these two. In the first stage, a segmentation process extracts the trunks, which are the regions used as features, and each feature is identified through a set of attributes useful for matching. In the second stage, the features are matched by applying four well-known matching constraints: epipolar, similarity, ordering and uniqueness. The combination of the segmentation and matching processes for this specific kind of sensor makes up the main contribution of the paper. The method is tested with satisfactory results and compared against the human expert criterion. PMID:22303134
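
    Of the four constraints, similarity and uniqueness are the easiest to sketch in isolation. Below, the trunk features are hypothetical attribute vectors (area, mean intensity, centroid row; invented, not the paper's attribute set), similarity is a normalized attribute distance, and uniqueness is enforced through an optimal one-to-one assignment. In the paper's pipeline the epipolar and ordering constraints would further prune candidates around this step:

```python
# Hedged sketch of the similarity and uniqueness matching constraints:
# left/right features are hypothetical attribute vectors, similarity is
# a normalized attribute distance, and uniqueness is enforced by an
# optimal one-to-one assignment over that distance.
import numpy as np
from scipy.optimize import linear_sum_assignment

left = np.array([[120.0, 0.55, 40.0], [300.0, 0.60, 90.0], [80.0, 0.40, 150.0]])
right = np.array([[82.0, 0.41, 151.0], [118.0, 0.54, 41.0], [310.0, 0.62, 88.0]])

# Normalize each attribute so all contribute comparably to the distance.
scale = np.maximum(left.max(axis=0), right.max(axis=0))
cost = np.linalg.norm(left[:, None, :] / scale - right[None, :, :] / scale, axis=2)

rows, cols = linear_sum_assignment(cost)   # uniqueness: one match per feature
matches = list(zip(rows.tolist(), cols.tolist()))
print(matches)
```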

  18. Process-Structure Linkages Using a Data Science Approach: Application to Simulated Additive Manufacturing Data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Popova, Evdokia; Rodgers, Theron M.; Gong, Xinyi

    A novel data science workflow is developed and demonstrated to extract process-structure linkages (i.e., reduced-order models) for microstructure evolution problems in which the final microstructure depends on (simulation or experimental) processing parameters. Our workflow consists of four main steps: data pre-processing, microstructure quantification, dimensionality reduction, and extraction/validation of process-structure linkages. The methods that can be employed within each step vary based on the type and amount of available data. In this paper, this data-driven workflow is applied to a set of synthetic additive manufacturing microstructures obtained using the Potts-kinetic Monte Carlo (kMC) approach. Additive manufacturing techniques inherently produce complex microstructures that can vary significantly with processing conditions. Using the developed workflow, a low-dimensional data-driven model was established to correlate process parameters with the predicted final microstructure. In addition, the modular workflows developed and presented in this work facilitate easy dissemination and curation by the broader community.

  19. Process-Structure Linkages Using a Data Science Approach: Application to Simulated Additive Manufacturing Data

    DOE PAGES

    Popova, Evdokia; Rodgers, Theron M.; Gong, Xinyi; ...

    2017-03-13

    A novel data science workflow is developed and demonstrated to extract process-structure linkages (i.e., reduced-order models) for microstructure evolution problems in which the final microstructure depends on (simulation or experimental) processing parameters. Our workflow consists of four main steps: data pre-processing, microstructure quantification, dimensionality reduction, and extraction/validation of process-structure linkages. The methods that can be employed within each step vary based on the type and amount of available data. In this paper, this data-driven workflow is applied to a set of synthetic additive manufacturing microstructures obtained using the Potts-kinetic Monte Carlo (kMC) approach. Additive manufacturing techniques inherently produce complex microstructures that can vary significantly with processing conditions. Using the developed workflow, a low-dimensional data-driven model was established to correlate process parameters with the predicted final microstructure. In addition, the modular workflows developed and presented in this work facilitate easy dissemination and curation by the broader community.
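
    The four-step workflow can be sketched end-to-end on invented data. The "microstructure statistics" below are random stand-ins for the spatial statistics typically used (none of this is the paper's kMC data); the sketch simply checks that a low-dimensional process-structure linkage is recoverable:

```python
# Sketch of the four-step workflow on invented data: (1) pre-processed
# inputs, (2) microstructure quantification as a statistics vector,
# (3) PCA dimensionality reduction, (4) extraction/validation of a
# process-structure linkage. All data are synthetic.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(2)
n_micro, n_stats = 60, 50
process_param = rng.uniform(0.0, 1.0, n_micro)   # e.g. a normalized scan speed

# Steps 1-2: synthetic "microstructure statistics" driven by the process
# parameter (linear + quadratic trends in a 2-D subspace, plus noise).
basis = rng.standard_normal((2, n_stats))
stats = np.outer(process_param, basis[0]) + np.outer(process_param ** 2, basis[1])
stats += 0.05 * rng.standard_normal((n_micro, n_stats))

# Step 3: reduce each microstructure to a few principal-component scores.
scores = PCA(n_components=3).fit_transform(stats)

# Step 4: fit the linkage on 40 cases and validate on the held-out 20.
model = LinearRegression().fit(scores[:40], process_param[:40])
r2 = float(model.score(scores[40:], process_param[40:]))
print(round(r2, 3))
```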

  20. The Search for Accountability and Transparency in Plan Colombia: Reforming Judicial Institutions - Again

    DTIC Science & Technology

    2001-05-01

    acquired money coming mainly from drug sources heavily influenced Colombia’s political process and institutions. The first step to ending corruption is to...and took his case for Plan Colombia directly to the international stage in the formative process for the plan. In fact, a Spanish language version of...by which repressive colonial regimes enforced first monarchical and later executive authority over the populace. In the process, the subservience of

  1. Iranian Clinical Nurses’ Activities for Self-Directed Learning: A Qualitative Study

    PubMed Central

    Ghiyasvandian, Shahrzad; Malekian, Morteza; Cheraghi, Mohammad Ali

    2016-01-01

    Background: Clinical nurses need lifelong learning skills to respond to the rapid changes of clinical settings. One of the best strategies for lifelong learning is self-directed learning. The aim of this study was to explore Iranian clinical nurses’ activities for self-directed learning. Methods: In this qualitative study, 23 semi-structured personal interviews were conducted with 19 clinical nurses working in the four hospitals affiliated to the Isfahan Social Security Organization, Isfahan, Iran. Study data were analyzed using the content analysis approach. The study was conducted from June 2013 to October 2014. Findings: Study participants’ activities for self-directed learning fell into two main categories: striving for knowledge acquisition and striving for skill development. The main theme of the study was ‘Revising personal performance based on intellectual-experiential activities’. Conclusions: Study findings suggest that Iranian clinical nurses continually revise their personal performance by performing self-directed intellectual and experiential activities to acquire expertise. The process of acquiring expertise is a linear process comprising two key steps: knowledge acquisition and knowledge development. To acquire and advance their knowledge, nurses perform mental learning activities such as sensory perception, self-evaluation, and suspended judgment step by step. Moreover, they develop their skills through activities like apprenticeship, masterly performance, and self-regulation. The absolute prerequisite for acquiring expertise is that a nurse follow these two steps in sequence. PMID:26652072

  2. Environmental Benign Process for Production of Molybdenum Metal from Sulphide Based Minerals

    NASA Astrophysics Data System (ADS)

    Rajput, Priyanka; Janakiram, Vangada; Jayasankar, Kalidoss; Angadi, Shivakumar; Bhoi, Bhagyadhar; Mukherjee, Partha Sarathi

    2017-10-01

    Molybdenum is a strategic, high-temperature refractory metal that is not found free in nature; it occurs in the earth's crust predominantly as MoO3/MoS2. The main disadvantage of the industrial treatment of Mo concentrate is that the process comprises many stages and requires very high temperatures. Almost every step generates gaseous, liquid, and solid chemical by-products that require further treatment. To overcome these drawbacks, a new alternative one-step process was developed for the treatment of sulphide and trioxide molybdenum concentrates. This paper presents the results of investigations on molybdenite (MoS2) dissociation using a microwave-assisted plasma unit as well as a transferred-arc thermal plasma torch. It is a single-step process for the preparation of pure molybdenum metal from MoS2 by hydrogen reduction in thermal plasma. Process variables such as H2 gas, Ar gas, input current, voltage and time were examined to prepare molybdenum metal. Molybdenum recovery of the order of 95% was achieved. The XRD results confirm the phases of molybdenum metal, and the chemical analysis of the end product indicates the formation of metallic molybdenum (98% Mo).

  3. Integration of the Anammox process to the rejection water and main stream lines of WWTPs.

    PubMed

    Morales, Nicolás; Val Del Río, Ángeles; Vázquez-Padín, José Ramón; Méndez, Ramón; Mosquera-Corral, Anuska; Campos, José Luis

    2015-12-01

    The application of Anammox-based processes in wastewater treatment plants has recently taken a step forward. The new goal consists of removing the nitrogen present in the main stream of the WWTP to improve its energy efficiency. This new approach aims not only to remove the nitrogen but also to make better use of the energy contained in the organic matter. The organic matter would be removed either by an anaerobic psychrophilic membrane reactor or by an aerobic stage operated at low solids retention time followed by anaerobic digestion of the generated sludge. The ammonia coming from these units would then be removed in an Anammox-based process in a single-unit system. The second strategy provides the best results in terms of operational costs and would allow reductions of about 28%. Recent research on Anammox-based processes operated at relatively low temperatures and/or low ammonia concentrations was carried out in single-stage systems using biofilms, granules, or a mixture of flocculent nitrifying and granular Anammox biomasses. These systems allowed the appropriate retention of Anammox and ammonia-oxidizing bacteria, but also the proliferation of nitrite-oxidizing bacteria, which seems to be the main obstacle to achieving the effluent quality required for disposal. Therefore, prior to full-scale implementation of Anammox-based processes in the water line, a reliable strategy to avoid nitrite oxidation should be defined in order to maintain process stability and obtain the desired effluent quality. Otherwise, a post-denitrification step would be necessary. Copyright © 2015 Elsevier Ltd. All rights reserved.

  4. Using a contextualized sensemaking model for interaction design: A case study of tumor contouring.

    PubMed

    Aselmaa, Anet; van Herk, Marcel; Laprie, Anne; Nestle, Ursula; Götz, Irina; Wiedenmann, Nicole; Schimek-Jasch, Tanja; Picaud, Francois; Syrykh, Charlotte; Cagetti, Leonel V; Jolnerovski, Maria; Song, Yu; Goossens, Richard H M

    2017-01-01

    Sensemaking theories help designers understand the cognitive processes of a user performing a complicated task. This paper introduces a two-step approach to incorporating sensemaking support within the design of health information systems by: (1) modeling the sensemaking process of physicians while performing a task, and (2) identifying software interaction design requirements that support sensemaking based on this model. The approach is presented through a case study of the tumor contouring clinical task for radiotherapy planning. In the first step, a contextualized sensemaking model was developed to describe the sensemaking process based on the goal, the workflow and the context of the task. In the second step, based on a research software prototype, an experiment was conducted in which eight physicians each performed three contouring tasks. Four types of navigation interactions and five types of interaction sequence patterns were identified by analyzing the interaction log data gathered from those twenty-four cases. Further in-depth study of each navigation interaction and interaction sequence pattern in relation to the contextualized sensemaking model revealed five main areas of design improvement to increase sensemaking support. Outcomes of the case study indicate that the proposed two-step approach was beneficial for gaining a deeper understanding of the sensemaking process during the task, as well as for identifying design requirements for better sensemaking support. Copyright © 2016. Published by Elsevier Inc.

  5. Effects of industrial processing on folate content in green vegetables.

    PubMed

    Delchier, Nicolas; Ringling, Christiane; Le Grandois, Julie; Aoudé-Werner, Dalal; Galland, Rachel; Georgé, Stéphane; Rychlik, Michael; Renard, Catherine M G C

    2013-08-15

    Folates are sensitive to various physical parameters such as heat, light, pH, and leaching. Most studies on folate degradation during processing or cooking were carried out on model solutions or on vegetables subjected only to thermal treatments. Our aim was to identify which steps are involved in folate loss in industrial processing chains, and which mechanisms underlie these losses. For this, folate contents were monitored along an industrial canning chain for green beans and an industrial freezing chain for spinach. Folate contents decreased significantly by 25% during the washing step of the spinach freezing process, and by 30% after sterilisation in the green bean canning process, with 20% of the initial amount being transferred into the covering liquid. The main mechanism involved in folate loss during both green bean canning and spinach freezing was leaching. Limiting the contact between vegetables and water, or using steaming, seems to be an adequate measure to limit folate losses during processing. Copyright © 2013 Elsevier Ltd. All rights reserved.

  6. Dynamic modeling and analyses of simultaneous saccharification and fermentation process to produce bio-ethanol from rice straw.

    PubMed

    Ko, Jordon; Su, Wen-Jun; Chien, I-Lung; Chang, Der-Ming; Chou, Sheng-Hsin; Zhan, Rui-Yu

    2010-02-01

    Rice straw, an agricultural waste from Asia's main staple crop, was collected as feedstock to convert cellulose into ethanol through enzymatic hydrolysis followed by fermentation. When the two process steps are performed sequentially, the scheme is referred to as separate hydrolysis and fermentation (SHF); performed simultaneously, it is simultaneous saccharification and fermentation (SSF). In this research, the kinetic model parameters of the cellulose saccharification step using rice straw as feedstock are obtained from experimental data of cellulase hydrolysis. This model is then combined with a fermentation model valid at high glucose and ethanol concentrations to form an SSF model. The fermentation model is based on the cybernetic approach from the literature, extended to include both glucose and ethanol inhibition terms so as to better approximate actual plants. Dynamic effects of the operating variables in the enzymatic hydrolysis and fermentation models are analyzed, and the operation of the SSF process is compared to the SHF process. It is shown that the SSF process is better at reducing the processing time when the product (ethanol) concentration is high. Means to improve the productivity of the overall SSF process by properly using aeration during batch operation are also discussed.
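
    A minimal SSF-style model in this spirit can be written as a small ODE system. All rate constants, yields, and inhibition terms below are invented for illustration; they are not the fitted parameters of this study:

```python
# Generic SSF kinetic sketch: cellulose -> glucose (enzymatic,
# product-inhibited) while yeast converts glucose to ethanol (Monod
# growth with ethanol inhibition). Every parameter here is invented.
from scipy.integrate import solve_ivp

def ssf(t, y):
    C, G, E, X = y                 # cellulose, glucose, ethanol, biomass (g/L)
    hydrolysis = 0.08 * C / (1.0 + G / 50.0)             # glucose-inhibited, 1/h
    mu = 0.3 * G / (1.5 + G) * max(0.0, 1.0 - E / 90.0)  # Monod + EtOH inhibition
    growth = mu * X
    uptake = growth / 0.1          # biomass yield Yxs = 0.1 g/g glucose
    return [-hydrolysis,
            1.056 * hydrolysis - uptake,   # 1.056: mass gain on hydrolysis
            0.45 * uptake,                 # ethanol yield Yps = 0.45 g/g
            growth]

# 72 h batch starting from 100 g/L cellulose and 1 g/L yeast.
sol = solve_ivp(ssf, (0.0, 72.0), [100.0, 0.0, 0.0, 1.0])
C, G, E, X = sol.y[:, -1]
print(round(C, 1), round(E, 1))
```

    Because hydrolysis and fermentation run concurrently, glucose never accumulates to strongly inhibiting levels; running the same two kinetics sequentially (SHF-style) is the comparison the abstract draws.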

  7. Study of the highly ordered TiO2 nanotubes physical properties prepared with two-step anodization

    NASA Astrophysics Data System (ADS)

    Pishkar, Negin; Ghoranneviss, Mahmood; Ghorannevis, Zohreh; Akbari, Hossein

    2018-06-01

    Highly ordered, hexagonally close-packed titanium dioxide nanotubes (TiO2 NTs) were successfully grown by a two-step anodization process. The TiO2 NTs were synthesized by electrochemical anodization of titanium foils in an ethylene glycol based electrolyte solution containing 0.3 wt% NH4F and 2 vol% deionized (DI) water at constant potential (50 V) for 1 h at room temperature. The physical properties of TiO2 NTs prepared via one- and two-step anodization were compared. Atomic force microscopy (AFM) analysis revealed that anodization and the subsequent peeling off of the TiO2 NTs left a periodic pattern on the Ti surface. Field emission scanning electron microscopy (FESEM) of the nanotube morphology revealed that the two-step anodization produced highly ordered hexagonal TiO2 NTs. The crystal structure of the TiO2 NTs, determined by X-ray diffraction analysis, was mainly anatase. Optical studies by diffuse reflectance spectroscopy (DRS) and photoluminescence (PL) analysis showed that the band gap of TiO2 NTs prepared via two-step anodization was lower than that of samples prepared by the one-step anodization process.

  8. Electrohydraulic linear actuator with two stepping motors controlled by overshoot-free algorithm

    NASA Astrophysics Data System (ADS)

    Milecki, Andrzej; Ortmann, Jarosław

    2017-11-01

    The paper describes electrohydraulic spool valves that use stepping motors as electromechanical transducers. A new concept of a proportional valve in which two stepping motors work differentially is introduced. Such a valve changes the fluid flow in proportion to the sum or difference of the motors' step counts. The valve design and its principle of operation are described. Theoretical equations and simulation models are proposed for all elements of the drive, i.e., the stepping motor units, the hydraulic valve and the cylinder. The main features of the valve and drive operation are described; some specific problem areas arising from the nature of stepping motors and their differential work in the valve are also considered. A non-linear model of the whole servo drive is proposed and used for simulation investigations. Initial simulations of the drive with the new valve showed a significant overshoot in the drive's step response, which is not acceptable in a positioning process. Therefore, additional effort was spent on reducing the overshoot and, in consequence, the settling time. A special predictive algorithm is proposed to this end. The proposed control method was tested and further improved in simulations. The model was then implemented in hardware and the whole servo drive system was tested. The investigation results presented in this paper show an overshoot-free positioning process that enables high positioning accuracy.

  9. Measuring diagnoses: ICD code accuracy.

    PubMed

    O'Malley, Kimberly J; Cook, Karon F; Price, Matt D; Wildes, Kimberly Raiford; Hurdle, John F; Ashton, Carol M

    2005-10-01

    To examine potential sources of errors at each step of the described inpatient International Classification of Diseases (ICD) coding process. The use of disease codes from the ICD has expanded from classifying morbidity and mortality information for statistical purposes to diverse sets of applications in research, health care policy, and health care finance. By describing a brief history of ICD coding, detailing the process for assigning codes, identifying where errors can be introduced into the process, and reviewing methods for examining code accuracy, we help code users more systematically evaluate code accuracy for their particular applications. We summarize the inpatient ICD diagnostic coding process from patient admission to diagnostic code assignment. We examine potential sources of errors at each step and offer code users a tool for systematically evaluating code accuracy. Main error sources along the "patient trajectory" include amount and quality of information at admission, communication among patients and providers, the clinician's knowledge and experience with the illness, and the clinician's attention to detail. Main error sources along the "paper trail" include variance in the electronic and written records, coder training and experience, facility quality-control efforts, and unintentional and intentional coder errors, such as misspecification, unbundling, and upcoding. By clearly specifying the code assignment process and heightening their awareness of potential error sources, code users can better evaluate the applicability and limitations of codes for their particular situations. ICD codes can then be used in the most appropriate ways.

  10. Tequila production.

    PubMed

    Cedeño, M

    1995-01-01

    Tequila is obtained by distilling the fermented juice of the agave plant, Agave tequilana, to which up to 49% (w/v) of adjunct sugars, mainly from cane or corn, may be added. Agave plants require 8 to 12 years to mature, and during all this time cleaning, pest control, and loosening of the land are required to produce an initial raw material with the appropriate chemical composition for tequila production. The production process comprises four steps: cooking to hydrolyze inulin into fructose, milling to extract the sugars, fermentation with a strain of Saccharomyces cerevisiae to convert the sugars into ethanol and organoleptic compounds, and, finally, a two-step distillation. Maturation, if required, is carried out in white oak barrels for 2 or 12 months to obtain rested or aged tequila, respectively.

  11. Process development and tooling design for intrinsic hybrid composites

    NASA Astrophysics Data System (ADS)

    Riemer, M.; Müller, R.; Drossel, W. G.; Landgrebe, D.

    2017-09-01

    Hybrid parts, which combine the advantages of different material classes, are moving into the focus of lightweight applications. This development is amplified by their high potential for use in crash-relevant structures. In the current state of the art, hybrid parts are mainly made in separate, subsequent forming and joining processes. With the concept of an intrinsic hybrid, the shaping of the part and the joining of the different materials are performed in a single process step, shortening the overall processing time and thereby the manufacturing costs. The investigated hybrid part is made from continuous fibre-reinforced plastic (FRP) into which a metallic reinforcement structure is integrated. The connection between these layered components is realized by a combination of adhesive bonding and a geometrical form fit, with the form-fit elements generated intrinsically during the forming process. This contribution covers the development of the forming process and the design of the forming tool for the single-step production of a hybrid part. To this end, a forming tool which combines the thermoforming and metal forming processes is developed. The main challenge in designing the tool is the temperature management of the tool elements for the variothermal forming process. The process parameters are determined in basic tests and finite element (FE) simulation studies, and on this basis a control concept for steering the motion axes and the tool temperature is developed. Forming tests were carried out with the developed tool, and the manufactured parts were analysed by computed tomography (CT) scans.

  12. A kinetic study of struvite precipitation recycling technology with NaOH/Mg(OH)2 addition.

    PubMed

    Yu, Rongtai; Ren, Hongqiang; Wang, Yanru; Ding, Lili; Geng, Jingji; Xu, Ke; Zhang, Yan

    2013-09-01

    Struvite precipitation recycling technology has received wide attention for removing ammonium and phosphate from wastewater. Past studies, however, focused on process efficiency rather than kinetics, and a kinetic study is essential for the design and optimization of struvite precipitation recycling technology in practice. The kinetics of struvite with NaOH/Mg(OH)2 addition were studied by thermogravimetric analysis at three heating rates (5, 10, 20 °C/min), using the Friedman method and the Ozawa-Flynn-Wall method, respectively. The degradation of struvite with NaOH/Mg(OH)2 addition proceeded in three steps, with the stripping of ammonia from struvite occurring mainly in the first step. In the first step, the activation energy was about 70 kJ/mol and gradually declined as the reaction progressed. Model-fitting studies revealed the appropriate mechanism function for the struvite decomposition process with NaOH/Mg(OH)2 addition: f(α) = α^a (1 − α)^n, a Prout-Tompkins nth-order (Bna) model. Copyright © 2013 Elsevier Ltd. All rights reserved.
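
    The Friedman method mentioned above can be illustrated on synthetic data: at a fixed conversion α, ln(dα/dt) is regressed against 1/T across the heating rates, and the slope gives −Ea/R. The kinetic triplet below (Ea = 70 kJ/mol, A = 1e8 1/s, first-order model) is invented for the sketch, with Ea chosen merely to echo the abstract's first-step value:

```python
# Friedman isoconversional sketch on synthetic first-order TGA data at
# the abstract's three heating rates. Kinetic parameters are invented.
import numpy as np

R = 8.314          # J/(mol K)
Ea_true = 70.0e3   # J/mol, chosen to echo the abstract's first step
A = 1.0e8          # 1/s, invented pre-exponential factor
T = np.linspace(300.0, 550.0, 50001)   # temperature grid, K

def alpha_curve(beta):
    """First-order conversion alpha(T) at heating rate beta (K/s)."""
    k = A * np.exp(-Ea_true / (R * T))
    integral = np.cumsum(k) * (T[1] - T[0]) / beta   # cumulative int k dT / beta
    return 1.0 - np.exp(-integral)

target = 0.3                       # fixed conversion for the isoconversional fit
inv_T, ln_rate = [], []
for beta_K_per_min in (5.0, 10.0, 20.0):
    beta = beta_K_per_min / 60.0   # K/min -> K/s
    alpha = alpha_curve(beta)
    i = int(np.searchsorted(alpha, target))
    rate = A * np.exp(-Ea_true / (R * T[i])) * (1.0 - target)  # dalpha/dt there
    inv_T.append(1.0 / T[i])
    ln_rate.append(np.log(rate))

# Slope of ln(rate) vs 1/T gives -Ea/R.
slope = np.polyfit(inv_T, ln_rate, 1)[0]
Ea_est = -slope * R
print(round(Ea_est / 1000.0, 1))
```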

  13. Cellulase production using different streams of wheat grain- and wheat straw-based ethanol processes.

    PubMed

    Gyalai-Korpos, Miklós; Mangel, Réka; Alvira, Pablo; Dienes, Dóra; Ballesteros, Mercedes; Réczey, Kati

    2011-07-01

    Pretreatment is a necessary step in the biomass-to-ethanol conversion process. The side stream of the pretreatment step is the liquid fraction, also referred to as the hydrolyzate, which arises after the separation of the pretreated solid and is composed of valuable carbohydrates along with compounds that are potentially toxic to microbes (mainly furfural, acetic acid, and formic acid). The aim of our study was to utilize the liquid fraction from steam-exploded wheat straw as a carbon source for cellulase production by Trichoderma reesei RUT C30. Results showed that without detoxification, the fungus failed to utilize any dilution of the hydrolyzate; however, after a two-step detoxification process, it was able to grow on a fourfold dilution of the treated liquid fraction. Supplementation of the fourfold-diluted, treated liquid fraction with washed pretreated wheat straw or ground wheat grain led to enhanced cellulase (filter paper) activity. The enzymes produced were tested in the hydrolysis of washed pretreated wheat straw. Supplementation with ground wheat grain provided a more efficient enzyme mixture for the hydrolysis, owing to the nearly doubled β-glucosidase activity obtained.

  14. Cross-current leaching of indium from end-of-life LCD panels.

    PubMed

    Rocchetti, Laura; Amato, Alessia; Fonti, Viviana; Ubaldini, Stefano; De Michelis, Ida; Kopacek, Bernd; Vegliò, Francesco; Beolchini, Francesca

    2015-08-01

    Indium is a critical element mainly produced as a by-product of zinc mining, and it is largely used in the production of liquid crystal display (LCD) panels. End-of-life LCDs represent a possible source of indium in the field of urban mining. In the present paper, we apply, for the first time, cross-current leaching to mobilize indium from end-of-life LCD panels. We carried out a series of treatments to leach indium. The best leaching conditions were 2 M sulfuric acid at 80 °C for 10 min, which allowed us to mobilize indium completely. Given the low indium content of end-of-life LCDs, about 100 ppm, a single leaching step is not cost-effective. We tested 6 steps of cross-current leaching: in the first step indium leaching was complete, in the second step it was in the range of 85-90%, and by the sixth step it was about 50-55%. The indium concentration in the leachate was about 35 mg/L after the first leaching step, almost 2-fold that value at the second step, and about 3-fold at the fifth step. We then simulated scaling the cross-current leaching process up to 10 steps, followed by cementation with zinc to recover indium. In this simulation, the indium recovery process was advantageous from both an economic and an environmental point of view. Indeed, cross-current leaching made it possible to concentrate indium, save reagents, and reduce CO2 emissions (with 10 steps, we assessed that the emission of about 90 kg CO2-eq. could be avoided) thanks to the recovery of indium. This new strategy represents a useful approach for secondary production of indium from waste LCD panels. Copyright © 2015 Elsevier Ltd. All rights reserved.
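The buildup of indium in the liquor across cross-current steps can be sketched with a toy accumulation model: the same leachate contacts a fresh batch of panels at each step, so concentration accumulates while the per-step yield decays as the liquor loads up. The per-step yields below are assumed for illustration; only the ~35 mg/L single-step concentration is taken from the abstract:

```python
FULL_STEP_MG_L = 35.0  # indium added by a complete (100%-yield) step, from the abstract

def cross_current_concentration(yields):
    """Cumulative indium concentration (mg/L) in the leachate after each
    cross-current step, given the fractional yield achieved at each step."""
    conc, profile = 0.0, []
    for y in yields:
        conc += y * FULL_STEP_MG_L  # each step adds yield * full-step increment
        profile.append(conc)
    return profile

# Assumed, monotonically declining per-step yields (illustrative only):
yields = [1.00, 0.87, 0.78, 0.68, 0.60, 0.52]
profile = cross_current_concentration(yields)
print([round(c, 1) for c in profile])  # concentration rises step by step
```

The sketch reproduces the qualitative trend reported (complete leaching at step 1, declining yield but rising concentration afterwards); the real step-to-step yields depend on acid consumption and saturation effects not modeled here.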

  15. Business Modeling to Implement an eHealth Portal for Infection Control: A Reflection on Co-Creation With Stakeholders.

    PubMed

    van Limburg, Maarten; Wentzel, Jobke; Sanderman, Robbert; van Gemert-Pijnen, Lisette

    2015-08-13

    It is acknowledged that the success and uptake of eHealth improve with the involvement of users and stakeholders, which makes the technology reflect their needs. Involving stakeholders in implementation research is thus a crucial element in developing eHealth technology. Business modeling is an approach to guide implementation research for eHealth. Stakeholders are involved in business modeling by identifying relevant stakeholders, conducting value co-creation dialogs, and co-creating a business model. Because implementation activities are often underestimated as a crucial step in developing eHealth, comprehensive and applicable approaches to business modeling in eHealth are scarce. This paper demonstrates the potential of several stakeholder-oriented analysis methods, whose practical application is illustrated using Infectionmanager as an example case. We aim to demonstrate how business modeling, with a focus on stakeholder involvement, is used to co-create an eHealth implementation. We divided business modeling into 4 main research steps. For stakeholder identification, we performed literature scans, expert recommendations, and snowball sampling (Step 1). For stakeholder analysis, we performed a "basic stakeholder analysis," stakeholder salience, and ranking via the analytic hierarchy process (Step 2). For the value co-creation dialogs, we performed a process analysis and stakeholder interviews based on the business model canvas (Step 3). Finally, for business model generation, we combined all findings into the business model canvas (Step 4). Based on the applied methods, we synthesized a step-by-step guide for business modeling with stakeholder-oriented analysis methods that we consider suitable for implementing eHealth. This guide enables eHealth researchers to apply a systematic, multidisciplinary, co-creative approach to implementing eHealth. Business modeling thus becomes an active part of the entire eHealth development process and establishes an early focus on implementation, in which stakeholders help to co-create the basis necessary for satisfying success and uptake of the eHealth technology.
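Step 2 above mentions ranking stakeholders with the analytic hierarchy process (AHP). A minimal sketch, using the common row geometric-mean approximation of AHP priority weights and an invented 3-stakeholder pairwise comparison matrix (not data from the study):

```python
import math

def ahp_weights(pairwise):
    """Approximate AHP priority weights from a pairwise comparison matrix
    using the row geometric-mean method (a standard approximation of the
    principal-eigenvector weights)."""
    n = len(pairwise)
    gm = [math.prod(row) ** (1.0 / n) for row in pairwise]  # row geometric means
    total = sum(gm)
    return [g / total for g in gm]  # normalize to sum to 1

# Hypothetical matrix on the Saaty 1-9 scale: M[i][j] = how much more
# important stakeholder i is judged to be than stakeholder j.
M = [[1.0, 3.0, 5.0],
     [1/3, 1.0, 3.0],
     [1/5, 1/3, 1.0]]
weights = ahp_weights(M)
print([round(w, 3) for w in weights])  # stakeholder 0 gets the largest weight
```

In practice the comparison matrix would be elicited from the stakeholder interviews, and a consistency ratio check would be added before trusting the ranking.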

  16. Steps to Ensure a Successful Implementation of Occupational Health and Safety Interventions at an Organizational Level

    PubMed Central

    Herrera-Sánchez, Isabel M.; León-Pérez, José M.; León-Rubio, José M.

    2017-01-01

    There is increasing meta-analytic evidence that addresses the positive impact of evidence-based occupational health and safety interventions on employee health and well-being. However, such evidence is less clear when interventions are approached at an organizational level and are aimed at changing organizational policies and processes. Given that occupational health and safety interventions are usually tailored to specific organizational contexts, generalizing and transferring such interventions to other organizations is a complex endeavor. In response, several authors have argued that an evaluation of the implementation process is crucial for assessing the intervention’s effectiveness and for understanding how and why the intervention has been (un)successful. Thus, this paper focuses on the implementation process and attempts to move this field forward by identifying the main factors that contribute toward ensuring a greater success of occupational health and safety interventions conducted at the organizational level. In doing so, we propose some steps that can guide a successful implementation. These implementation steps are illustrated using examples of evidence-based best practices reported in the literature that have described and systematically evaluated the implementation process behind their interventions during the last decade. PMID:29375413

  17. Process and apparatus for igniting a burner in an inert atmosphere

    DOEpatents

    Coolidge, Dennis W.; Rinker, Franklin G.

    1994-01-01

    According to this invention there is provided a process and apparatus for the ignition of a pilot burner in an inert atmosphere without substantially contaminating the inert atmosphere. The process includes the steps of providing a controlled amount of combustion air for a predetermined interval of time to the combustor, then substantially simultaneously providing a controlled mixture of fuel and air to the pilot burner and to a flame generator. The controlled mixture of fuel and air to the flame generator is then periodically energized to produce a secondary flame. The secondary flame ignites the controlled mixture of fuel and air to the pilot burner, together with the combustion air, to produce a pilot burner flame. The pilot burner flame is then used to ignite a mixture of main fuel and combustion air to produce a main burner flame. The main burner flame in turn ignites a mixture of process-derived fuel and combustion air to produce products of combustion for use as an inert gas in a heat treatment process.
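The ignition sequence described in the abstract can be sketched as a linear state machine; the state names below paraphrase the steps and are not terms from the patent:

```python
# Hypothetical sketch: each state must succeed before the next is attempted,
# and any failure falls back to purging with combustion air.
SEQUENCE = [
    "purge_with_combustion_air",             # timed combustion-air supply
    "fuel_air_to_pilot_and_flame_generator", # simultaneous fuel/air feeds
    "secondary_flame_lit",                   # flame generator energized
    "pilot_flame_lit",                       # secondary flame ignites pilot
    "main_burner_flame_lit",                 # pilot ignites main fuel + air
    "process_fuel_flame_lit",                # main flame ignites process-derived fuel
]

class IgnitionController:
    def __init__(self):
        self.step = 0

    @property
    def state(self):
        return SEQUENCE[self.step]

    def advance(self, ok: bool):
        """Move to the next step on success; restart from the purge on failure."""
        self.step = self.step + 1 if ok else 0
        self.step = min(self.step, len(SEQUENCE) - 1)

ctrl = IgnitionController()
for _ in range(5):
    ctrl.advance(ok=True)
print(ctrl.state)  # process_fuel_flame_lit
```

A real controller would also gate each transition on flame-detection and timing interlocks; the sketch only captures the ordering of the steps.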

  18. A Step for Evaluating Constructivist Approach Integrated Online Courses

    ERIC Educational Resources Information Center

    Gazi, Zehra A.

    2011-01-01

    This research aims to reveal the validation of 86-items in order to develop a scale for evaluating constructivist approach integrated online courses in higher education practices. The main aim of this research process is to reveal a scale to further evaluate whether the online education practices in higher education have the notions of…

  19. Added value of involving patients in the first step of multidisciplinary guideline development: a qualitative interview study among infertile patients.

    PubMed

    den Breejen, Elvira M E; Hermens, Rosella P M G; Galama, Wienke H; Willemsen, Wim N P; Kremer, Jan A M; Nelen, Willianne L D M

    2016-06-01

    Patient involvement in scoping the guideline is emphasized, but published initiatives actively involving patients are generally limited to the writing and reviewing phase. To assess patients' added value to the scoping phase of a multidisciplinary guideline on infertility. Qualitative interview study. We conducted interviews among 12 infertile couples and 17 professionals. We listed and compared the couples' and professionals' key clinical issues (=care aspects that need improvement) to be addressed in the guideline according to four domains: current guidelines, professionals, patients and organization of care. Main key clinical issues suggested by more than three quarters of the infertile couples and/or at least two professionals were identified and compared. Overall, we identified 32 key clinical issues among infertile couples and 23 among professionals. Of the defined main key clinical issues, infertile couples mentioned eight issues that were not mentioned by the professionals. These main key clinical issues mainly concerned patient-centred (e.g. poor information provision and poor alignment of care) aspects of care on the professional and organizational domain. Both groups mentioned two main key clinical issues collectively that were interpreted differently: the lack of emotional support and respect for patients' values. Including patients from the first phase of the guideline development process leads to valuable additional main key clinical issues for the next step of a multidisciplinary guideline development process and broadens the scope of the guideline, particularly regarding patient-centredness and organizational issues from a patients' perspective. © The Author 2016. Published by Oxford University Press in association with the International Society for Quality in Health Care; all rights reserved.

  20. Study of ND3-enhanced MAR processes in D2-N2 plasmas to induce plasma detachment

    NASA Astrophysics Data System (ADS)

    Abe, Shota; Chakraborty Thakur, Saikat; Doerner, Russ; Tynan, George

    2017-10-01

    The Molecular Assisted Recombination (MAR) process is thought to be a main channel of volumetric recombination for inducing detached plasma operation. The authors have focused on a new plasma recombination process supported by ammonia molecules, which will be formed by N2 impurity seeding for controlling divertor plasma temperature and heat loads in ITER. This ammonia-enhanced MAR process would occur in two steps. In this study, the first step of the new MAR process is investigated in low-density plasmas (Ne ~ 10^16 m^-3, Te ~ 4 eV) fueled by D2 and N2. Ion and neutral densities are measured by a calibrated Electrostatic Quadrupole Plasma (EQP) analyzer, a combination of an ion energy analyzer and a mass spectrometer. The EQP shows the formation of ND3 during discharges. Ion densities calculated by a rate-equation model are compared with experimental results, and we find that the model can reproduce the observed ion densities in the plasma. The model calculation shows that the dominant neutralization channel of Dx+ (x = 1-3) ions in the volume is the formation of NDy+ (y = 3 or 4) through charge/D+ exchange reactions with ND3. Furthermore, high-density plasmas (Ne ~ 10^16 m^-3) have been achieved to investigate the electron-impact dissociative recombination of the formed NDy+, which is the second step of this MAR process.
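As a rough illustration of the kind of rate-equation balance mentioned above (not the authors' actual model), one can integrate the loss of Dx+ ions through an assumed charge-exchange channel with ND3; every density and the rate coefficient below are placeholder orders of magnitude:

```python
def evolve(n_d_ion, n_nd3, k_cx, dt, steps):
    """Forward-Euler integration of d(nD+)/dt = -k_cx * nD+ * nND3,
    with each lost D-ion reappearing as an NDy+ ion."""
    n_ndy = 0.0
    for _ in range(steps):
        loss = k_cx * n_d_ion * n_nd3 * dt
        loss = min(loss, n_d_ion)   # keep the density non-negative
        n_d_ion -= loss
        n_ndy += loss
    return n_d_ion, n_ndy

# Illustrative values only (orders of magnitude, not measured data):
nD0 = 1e16    # m^-3, initial Dx+ density
nND3 = 1e18   # m^-3, neutral ND3, treated as a constant reservoir
k = 1e-15     # m^3/s, assumed charge-exchange rate coefficient
nD, nNDy = evolve(nD0, nND3, k, dt=1e-5, steps=1000)
print(f"{nD:.2e} {nNDy:.2e}")  # Dx+ depleted, NDy+ built up
```

The total ion count is conserved by construction, which is the bookkeeping property a real rate-equation model of this neutralization channel would also satisfy before adding further reactions.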

  1. User’s Manual for the Ride Motion Simulator

    DTIC Science & Technology

    1989-08-01

    1800 psi). Step 16. Pressurize the system by moving the main pressure switch to "ON." Wait for the roll, pitch, and yaw error signals to go to "Zero...Carefully, help the test subject dismount. Step 41. Flip the main pressure switch on the hydraulic control panel to "OFF." This will block hydraulic...1.13, thus lowering the seat. Release the "Low Limit Override" switch. Step 5. Dismount the test subject. Step 6. Move the main pressure switch to the

  2. Safety Assessment of TACOM’s Ride Motion Simulator

    DTIC Science & Technology

    1990-01-24

    level (1300 to 1800 psi). 24 Step 16. Pressurize the system by moving the main pressure switch to "ON." Wait for the roll, pitch, and yaw error signals...the appropriate seat/shoulder/safety belts and harnesses. Carefully, help the test subject dismount. Step 41. Flip the main pressure switch on the...Dismount the test subject. Step 6. Move the main pressure switch to the "OFF" position. This will block any hydraulic flow to the system. Step 7. Move the

  3. Scale-up and economic analysis of biodiesel production from municipal primary sewage sludge.

    PubMed

    Olkiewicz, Magdalena; Torres, Carmen M; Jiménez, Laureano; Font, Josep; Bengoa, Christophe

    2016-08-01

    Municipal wastewater sludge is a promising lipid feedstock for biodiesel production, but the need to eliminate the high water content before lipid extraction is the main limitation for scaling up. This study evaluates the economic feasibility of biodiesel production directly from liquid primary sludge based on experimental data at laboratory scale. Computational tools were used to model the process scale-up and the different lipid-extraction configurations in order to optimise this step, as it is the most expensive. The operational variables with the greatest influence on cost were the extraction time and the amount of solvent. The optimised extraction process had a break-even biodiesel price of 1232 $/t, economically competitive with the current cost of fossil diesel. The proposed biodiesel production process from waste sludge eliminates the expensive sludge-drying step, lowering the biodiesel price. Copyright © 2016 Elsevier Ltd. All rights reserved.
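The break-even price quoted above can be reproduced in form (not in figures) by a minimal sketch: annualized capital plus operating cost, divided by annual output. All plant figures below are hypothetical placeholders, not the paper's data:

```python
def break_even_price(capex, annualization_factor, opex_per_year, tonnes_per_year):
    """Price ($/t) at which annual revenue exactly covers the annualized
    capital cost plus the yearly operating cost."""
    annual_cost = capex * annualization_factor + opex_per_year
    return annual_cost / tonnes_per_year

# Hypothetical plant (placeholders): $5M capex annualized at 10%/yr,
# $1.5M/yr operating cost, 1600 t/yr of biodiesel produced.
price = break_even_price(capex=5e6, annualization_factor=0.1,
                         opex_per_year=1.5e6, tonnes_per_year=1600)
print(round(price, 1))  # $/t
```

The study's optimization amounts to driving the operating-cost term down (less solvent, shorter extraction) until this quotient competes with the fossil diesel price.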

  4. Perfusion mammalian cell culture for recombinant protein manufacturing - A critical review.

    PubMed

    Bielser, Jean-Marc; Wolf, Moritz; Souquet, Jonathan; Broly, Hervé; Morbidelli, Massimo

    The manufacturing of recombinant protein is traditionally divided into two main steps: upstream (cell culture and synthesis of the target protein) and downstream (purification and formulation of the protein into a drug substance or drug product). Today, cost pressure, market uncertainty and market growth challenge the existing manufacturing technologies. Leaders in the field are active in designing the process of the future, and continuous manufacturing is recurrently mentioned as a potential solution to address some of the current limitations. This review focuses on the application of continuous processing to the first step of the manufacturing process. Enabling technologies and operation modes are described in the first part. In the second part, recent advances in the field that have the potential to support its successful future development are critically discussed. Copyright © 2018 The Authors. Published by Elsevier Inc. All rights reserved.

  5. Plasma Heating and Alfvénic Turbulence Enhancement During Two Steps of Energy Conversion in Magnetic Reconnection Exhaust Region of Solar Wind

    NASA Astrophysics Data System (ADS)

    Jiansen, He; Xingyu, Zhu; Yajie, Chen; Chadi, Salem; Michael, Stevens; Hui, Li; Wenzhi, Ruan; Lei, Zhang; Chuanyi, Tu

    2018-04-01

    The magnetic reconnection exhaust is a pivotal region with enormous magnetic energy being continuously released and converted. The physical processes of energy conversion involved are so complicated that an all-round understanding based on in situ measurements is still lacking. We present the evidence of plasma heating by illustrating the broadening of proton and electron velocity distributions, which are extended mainly along the magnetic field, in an exhaust of interchange reconnection between two interplanetary magnetic flux tubes of the same polarity on the Sun. The exhaust is asymmetric across an interface, with both sides being bounded by a pair of compound discontinuities consisting of rotational discontinuity and slow shock. The energized plasmas are found to be firehose unstable, and responsible for the emanation of Alfvén waves during the second step of energy conversion. It is realized that the energy conversion in the exhaust can be a two-step process involving both plasma energization and wave emission.

  6. Preparation of cellulose based microspheres by combining spray coagulating with spray drying.

    PubMed

    Wang, Qiao; Fu, Aiping; Li, Hongliang; Liu, Jingquan; Guo, Peizhi; Zhao, Xiu Song; Xia, Lin Hua

    2014-10-13

    Porous microspheres of regenerated cellulose with sizes in the range of 1-2 μm and composite microspheres of chitosan-coated cellulose with sizes of 1-3 μm were obtained through a two-step spray-assisted approach. The spray-coagulating process must be combined with a spray-drying step to guarantee the formation of stable cellulose microspheres. This approach has two main virtues: first, the preparation uses an aqueous solution of cellulose as the precursor, without organic solvent or surfactant; second, neither a crosslinking agent nor a separate crosslinking process is required to form stable microspheres. Moreover, the spray-drying step also provides the opportunity to encapsulate guests into the resultant cellulose microspheres. The potential application of the cellulose microspheres as a drug delivery vector has been studied in two PBS (phosphate-buffered saline) solutions with pH values of 4.0 and 7.4 to mimic the environments of the stomach and intestine, respectively. Copyright © 2014 Elsevier Ltd. All rights reserved.

  7. A Model for Effective Performance in the Indonesian Navy.

    DTIC Science & Technology

    1987-06-01

    Navy LMET ... This thesis describes a process of designing a management ...effective from ineffective manager . In the process of building the model two main steps are taken. First, a literature study of the empirical analysis of... management competencies was conducted to identify management competencies in the United States in general and the U.S. Navy in particular. Second, a

  8. Robust perception algorithms for road and track autonomous following

    NASA Astrophysics Data System (ADS)

    Marion, Vincent; Lecointe, Olivier; Lewandowski, Cecile; Morillon, Joel G.; Aufrere, Romuald; Marcotegui, Beatrix; Chapuis, Roland; Beucher, Serge

    2004-09-01

    The French Military Robotic Study Program (introduced in Aerosense 2003), sponsored by the French Defense Procurement Agency and managed by Thales Airborne Systems as the prime contractor, focuses on about 15 robotic themes, which can provide an immediate "operational add-on value." The paper details the "road and track following" theme (named AUT2), whose main purpose was to develop a vision-based sub-system to automatically detect the roadsides of an extended range of roads and tracks suitable for military missions. To achieve the goal, efforts focused on three main areas: (1) Improvement of image quality at the algorithms' inputs, thanks to the selection of adapted video cameras and the development of a THALES-patented algorithm that removes in real time most of the disturbing shadows in images taken in natural environments, enhances contrast, and reduces reflections due to films of water. (2) Selection and improvement of two complementary algorithms (one segment oriented, the other region based). (3) Development of a fusion process between both algorithms, which feeds a road model in real time with the best available data. Each previous step was developed so that the global perception process is reliable and safe: as an example, the process continuously evaluates itself and outputs confidence criteria qualifying the roadside detection. The paper presents the processes in detail, along with the results obtained from the military acceptance tests that were passed, which trigger the next step: autonomous track following (named AUT3).

  9. Failure modes and effects analysis for ocular brachytherapy.

    PubMed

    Lee, Yongsook C; Kim, Yongbok; Huynh, Jason Wei-Yeong; Hamilton, Russell J

    The aim of the study was to identify potential failure modes (FMs) having a high risk and to improve our current quality management (QM) program in Collaborative Ocular Melanoma Study (COMS) ocular brachytherapy by undertaking a failure modes and effects analysis (FMEA) and a fault tree analysis (FTA). Process mapping and FMEA were performed for COMS ocular brachytherapy. For all FMs identified in FMEA, risk priority numbers (RPNs) were determined by assigning and multiplying occurrence, severity, and lack of detectability values, each ranging from 1 to 10. FTA was performed for the major process that had the highest ranked FM. Twelve major processes, 121 sub-process steps, 188 potential FMs, and 209 possible causes were identified. For 188 FMs, RPN scores ranged from 1.0 to 236.1. The plaque assembly process had the highest ranked FM. The majority of FMs were attributable to human failure (85.6%), and medical physicist-related failures were the most numerous (58.9% of all causes). After FMEA, additional QM methods were included for the top 10 FMs and 6 FMs with severity values > 9.0. As a result, for these 16 FMs and the 5 major processes involved, quality control steps were increased from 8 (50%) to 15 (93.8%), and major processes having quality assurance steps were increased from 2 to 4. To reduce high risk in current clinical practice, we proposed QM methods. They mainly include a check or verification of procedures/steps and the use of checklists for both ophthalmology and radiation oncology staff, and intraoperative ultrasound-guided plaque positioning for ophthalmology staff. Copyright © 2017 American Brachytherapy Society. Published by Elsevier Inc. All rights reserved.
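The RPN scoring used in this FMEA multiplies occurrence, severity, and lack-of-detectability values, each on a 1-10 scale, and ranks failure modes by the product. A minimal sketch with invented failure modes and scores (not the study's actual ratings):

```python
def rpn(occurrence, severity, detectability):
    """Risk priority number: product of occurrence (O), severity (S) and
    lack of detectability (D), each rated on a 1-10 scale."""
    for v in (occurrence, severity, detectability):
        assert 1 <= v <= 10, "ratings must lie on the 1-10 scale"
    return occurrence * severity * detectability

# Hypothetical failure modes with made-up (O, S, D) ratings:
failure_modes = {
    "wrong seed activity ordered": (2, 10, 4),
    "plaque assembled with wrong seeds": (3, 9, 6),
    "plaque mispositioned on sclera": (4, 9, 5),
}
ranked = sorted(failure_modes, key=lambda fm: rpn(*failure_modes[fm]),
                reverse=True)
print(ranked[0])  # highest-RPN failure mode gets QM attention first
```

Ranking by RPN, plus separately flagging any mode with a very high severity regardless of its product (as done for the severity > 9.0 modes above), is the standard way such an analysis feeds back into the QM program.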

  10. Engineering design and prototype development of a full scale ultrasound system for virgin olive oil by means of numerical and experimental analysis.

    PubMed

    Clodoveo, Maria Lisa; Moramarco, Vito; Paduano, Antonello; Sacchi, Raffaele; Di Palmo, Tiziana; Crupi, Pasquale; Corbo, Filomena; Pesce, Vito; Distaso, Elia; Tamburrano, Paolo; Amirante, Riccardo

    2017-07-01

    The aim of the virgin olive oil extraction process is mainly to obtain the best-quality oil from the fruit by applying only mechanical actions, while guaranteeing the highest overall efficiency. Currently, the mechanical methods used to extract virgin oils from olives are basically of two types: the discontinuous system (obsolete) and the continuous one. However, the system defined as "continuous" comprises several steps that are not all truly continuous, owing to the presence of the malaxer, a device that works in batch mode. The aim of this paper was to design, build, and test the first full-scale sono-exchanger for the virgin olive oil industry, to be placed immediately after the crusher and before the malaxer. The innovative device mainly consists of a triple concentric-pipe heat exchanger combined with three ultrasound probes. This mechanical solution allows the cell walls to be broken down more effectively (releasing the oil droplets along with the minor compounds) and the heat exchange between the olive paste and the process water to be accelerated. This strategy represents the first step towards transforming malaxing from a batch operation into a truly continuous process, thus improving the working capacity of industrial plants. Considering the heterogeneity of the olive paste, which is composed of different tissues, the design of the sono-exchanger required a thorough fluid dynamic analysis. The thermal effects of the sono-exchanger were monitored by measuring the temperature of the product at the inlet and the outlet of the device; in addition, measurement of the pigment concentration in the product allowed the mechanical effects of the sono-exchanger to be monitored. The effects of the innovative process were also evaluated in terms of extra virgin olive oil yield and quality, assessing the main legal parameters and the polyphenol and tocopherol content.
Moreover, the activity of the polyphenol oxidase enzyme in the olive paste was measured. Copyright © 2017 Elsevier B.V. All rights reserved.
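The sono-exchanger's thermal function, heat exchange between the olive paste and the process water, can be illustrated with the standard log-mean temperature difference (LMTD) relation for a counter-current exchanger. The temperatures and the UA value below are invented for illustration and are not from the study:

```python
import math

def lmtd_counterflow(t_hot_in, t_hot_out, t_cold_in, t_cold_out):
    """Log-mean temperature difference (K or °C) for a counter-current
    heat exchanger: (dT1 - dT2) / ln(dT1/dT2)."""
    dT1 = t_hot_in - t_cold_out   # terminal difference at the hot inlet
    dT2 = t_hot_out - t_cold_in   # terminal difference at the hot outlet
    if abs(dT1 - dT2) < 1e-9:     # equal differences: LMTD equals either one
        return dT1
    return (dT1 - dT2) / math.log(dT1 / dT2)

# Hypothetical case: warm process water (35 -> 30 °C) heating olive paste
# (20 -> 27 °C) in counter-flow.
lmtd = lmtd_counterflow(35.0, 30.0, 20.0, 27.0)
q = 500.0 * lmtd   # duty Q = UA * LMTD, with an assumed UA = 500 W/K
print(round(lmtd, 2), round(q, 1))
```

For the real device, UA would come from the triple concentric-pipe geometry and the paste-side heat-transfer coefficient, which the ultrasound action is intended to enhance.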

  11. Chemical kinetic simulation of kerosene combustion in an individual flame tube.

    PubMed

    Zeng, Wen; Liang, Shuang; Li, Hai-Xia; Ma, Hong-An

    2014-05-01

    The use of detailed chemical reaction mechanisms of kerosene is still very limited in analyses of the combustion process in aircraft engine combustion chambers. In this work, a new reduced chemical kinetic mechanism for n-decane, selected as a surrogate fuel for kerosene, containing 210 elementary reactions (including 92 reversible reactions and 26 irreversible reactions) and 50 species was developed, and the ignition and combustion characteristics of this fuel in both a shock tube and a flat-flame burner were kinetically simulated using this reduced reaction mechanism. The computed results were validated against experimental data. The calculated ignition delay times at pressures of 12 and 50 bar and equivalence ratios of 1.0 and 2.0, respectively, and the calculated mole fractions of the main reactants and main products agree well with the experimental data. The combustion processes in the individual flame tube of a heavy-duty gas turbine combustor were simulated by coupling this reduced reaction mechanism for the surrogate fuel n-decane, and a one-step reaction mechanism for the surrogate fuel C12H23, into computational fluid dynamics software. The reduced reaction mechanism shows clear advantages over the one-step reaction mechanism in simulating the ignition and combustion processes in the individual flame tube.
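Ignition delay times of the kind validated here are commonly correlated with an Arrhenius-type expression τ = A·p^n·exp(Ea/(R·T)). The sketch below uses invented fit parameters, not values from the reduced mechanism, purely to illustrate the pressure and temperature trends:

```python
import math

R = 8.314  # molar gas constant, J/(mol*K)

def ignition_delay(T, p_bar, A=1e-9, n=-0.6, Ea=120e3):
    """Arrhenius-type ignition-delay correlation tau = A * p^n * exp(Ea/(R*T)).
    A, n and Ea are illustrative fit parameters (assumptions, not the
    paper's values); T in K, p in bar, tau in arbitrary time units."""
    return A * p_bar**n * math.exp(Ea / (R * T))

# Delay shortens with pressure (n < 0) and with temperature:
t_12bar = ignition_delay(1100.0, 12.0)
t_50bar = ignition_delay(1100.0, 50.0)
print(t_50bar < t_12bar)  # True: higher pressure, shorter delay
```

Fitting such a correlation to the shock-tube data at 12 and 50 bar is a common way to summarize a mechanism's ignition behavior before embedding it in a CFD calculation.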

  12. Chemical kinetic simulation of kerosene combustion in an individual flame tube

    PubMed Central

    Zeng, Wen; Liang, Shuang; Li, Hai-xia; Ma, Hong-an

    2013-01-01

    The use of detailed chemical reaction mechanisms of kerosene is still very limited in analyses of the combustion process in aircraft engine combustion chambers. In this work, a new reduced chemical kinetic mechanism for n-decane, selected as a surrogate fuel for kerosene, containing 210 elementary reactions (including 92 reversible reactions and 26 irreversible reactions) and 50 species was developed, and the ignition and combustion characteristics of this fuel in both a shock tube and a flat-flame burner were kinetically simulated using this reduced reaction mechanism. The computed results were validated against experimental data. The calculated ignition delay times at pressures of 12 and 50 bar and equivalence ratios of 1.0 and 2.0, respectively, and the calculated mole fractions of the main reactants and main products agree well with the experimental data. The combustion processes in the individual flame tube of a heavy-duty gas turbine combustor were simulated by coupling this reduced reaction mechanism for the surrogate fuel n-decane, and a one-step reaction mechanism for the surrogate fuel C12H23, into computational fluid dynamics software. The reduced reaction mechanism shows clear advantages over the one-step reaction mechanism in simulating the ignition and combustion processes in the individual flame tube. PMID:25685503

  13. Step-by-step seeding procedure for preparing HKUST-1 membrane on porous α-alumina support.

    PubMed

    Nan, Jiangpu; Dong, Xueliang; Wang, Wenjin; Jin, Wanqin; Xu, Nanping

    2011-04-19

    Metal-organic framework (MOF) membranes have attracted considerable attention because of their striking advantages in small-molecule separation. The preparation of an integrated MOF membrane is still a major challenge. Depositing a uniform seed layer on a support for secondary growth is a main route to obtaining an integrated MOF membrane. A novel seeding method to prepare HKUST-1 (known as Cu(3)(btc)(2)) membranes on porous α-alumina supports is reported. The in situ production of the seed layer was realized in a step-by-step fashion via the coordination of H(3)btc and Cu(2+) on an α-alumina support. The formation process of the seed layer was observed by ultraviolet-visible absorption spectroscopy and atomic force microscopy. An integrated HKUST-1 membrane could be synthesized by secondary hydrothermal growth on the seeded support. The gas permeation performance of the membrane was evaluated. © 2011 American Chemical Society

  14. Towards an Approach for an Accessible and Inclusive Virtual Education Using ESVI-AL Project Results

    ERIC Educational Resources Information Center

    Amado-Salvatierra, Hector R.; Hilera, Jose R.

    2015-01-01

    Purpose: This paper aims to present an approach to achieve accessible and inclusive Virtual Education for all, but especially intended for students with disabilities. This work proposes main steps to take into consideration for stakeholders involved in the educational process related to an inclusive e-Learning. Design/methodology/approach: The…

  15. Qvo Vadis Magister Artium? Policy Implications of Executive Master's Programmes in an Israeli Research University

    ERIC Educational Resources Information Center

    Yogev, Abraham

    2010-01-01

    During recent decades master's studies have mainly become professional, but in some countries, like Israel, they still are a stepping stone toward doctorate studies. Changes in that respect may however occur due to recent university marketization processes. Using Tel Aviv University as a case study, we focus on the executive master's programmes…

  16. Main Quality Attributes of Monoclonal Antibodies and Effect of Cell Culture Components

    PubMed

    Torkashvand, Fatemeh; Vaziri, Behrouz

    2017-05-01

    Culture media optimization is an inevitable part of upstream process development in the production of therapeutic monoclonal antibodies (mAbs). The quality by design (QbD) approach defines the assured quality of the final product through the development stage. An important step in QbD is the determination of the main quality attributes. During media optimization, some of the main quality attributes, such as the glycosylation pattern, charge variants, aggregates, and low-molecular-weight species, can be significantly altered. Here, we provide an overview of how cell culture medium components affect the main quality attributes of mAbs. Knowledge of the relationship between culture media components and the main quality attributes can be successfully utilized for the rational optimization of mammalian cell culture media for industrial mAbs production.

  17. Influence of bilayer resist processing on p-i-n OLEDs: towards multicolor photolithographic structuring of organic displays

    NASA Astrophysics Data System (ADS)

    Krotkus, Simonas; Nehm, Frederik; Janneck, Robby; Kalkura, Shrujan; Zakhidov, Alex A.; Schober, Matthias; Hild, Olaf R.; Kasemann, Daniel; Hofmann, Simone; Leo, Karl; Reineke, Sebastian

    2015-03-01

    Recently, bilayer resist processing combined with development in hydrofluoroether (HFE) solvents has been shown to enable single-color structuring of vacuum-deposited state-of-the-art organic light-emitting diodes (OLEDs). In this work, we focus on the further steps required to achieve multicolor structuring of p-i-n OLEDs using a bilayer resist approach. We show that the green phosphorescent OLED stack is undamaged after lift-off in HFEs, a necessary step toward an RGB pixel array structured by means of photolithography. Furthermore, we investigate the influence of both double-resist processing and exposure to ambient conditions on red OLEDs, based on their electrical, optical and lifetime parameters. Additionally, water vapor transmission rates of the single- and bilayer systems are evaluated with a thin Ca film conductance test. We conclude that diffusion of propylene glycol methyl ether acetate (PGMEA) through the fluoropolymer film is the main mechanism behind the OLED degradation observed after bilayer processing.

  18. Technical difficulties and solutions of direct transesterification process of microbial oil for biodiesel synthesis.

    PubMed

    Yousuf, Abu; Khan, Maksudur Rahman; Islam, M Amirul; Wahid, Zularisam Ab; Pirozzi, Domenico

    2017-01-01

    Microbial oils are considered an alternative to vegetable oils or animal fats as biodiesel feedstock. Microalgae and oleaginous yeasts are the main candidates among microbial oil producers. However, biodiesel synthesis from these sources is associated with high cost and process complexity. The traditional transesterification method includes several steps such as biomass drying, cell disruption, oil extraction and solvent recovery. Therefore, direct transesterification or in situ transesterification, which combines all the steps in a single reactor, has been suggested to make the process cost-effective. Nevertheless, the process is not yet applicable to large-scale biodiesel production owing to several difficulties: the high water content of the biomass slows the reaction rate, and the hurdles of cell disruption lower the efficiency of oil extraction. Additionally, it requires high heating energy in the solvent extraction and recovery stage. To resolve these difficulties, this review suggests the application of antimicrobial peptides and high electric fields to foster microbial cell wall disruption.

  19. Measuring Diagnoses: ICD Code Accuracy

    PubMed Central

    O'Malley, Kimberly J; Cook, Karon F; Price, Matt D; Wildes, Kimberly Raiford; Hurdle, John F; Ashton, Carol M

    2005-01-01

    Objective To examine potential sources of errors at each step of the described inpatient International Classification of Diseases (ICD) coding process. Data Sources/Study Setting The use of disease codes from the ICD has expanded from classifying morbidity and mortality information for statistical purposes to diverse sets of applications in research, health care policy, and health care finance. By describing a brief history of ICD coding, detailing the process for assigning codes, identifying where errors can be introduced into the process, and reviewing methods for examining code accuracy, we help code users more systematically evaluate code accuracy for their particular applications. Study Design/Methods We summarize the inpatient ICD diagnostic coding process from patient admission to diagnostic code assignment. We examine potential sources of errors at each step and offer code users a tool for systematically evaluating code accuracy. Principal Findings Main error sources along the “patient trajectory” include amount and quality of information at admission, communication among patients and providers, the clinician's knowledge and experience with the illness, and the clinician's attention to detail. Main error sources along the “paper trail” include variance in the electronic and written records, coder training and experience, facility quality-control efforts, and unintentional and intentional coder errors, such as misspecification, unbundling, and upcoding. Conclusions By clearly specifying the code assignment process and heightening their awareness of potential error sources, code users can better evaluate the applicability and limitations of codes for their particular situations. ICD codes can then be used in the most appropriate ways. PMID:16178999

  20. Impact of different post-harvest processing methods on the chemical compositions of peony root.

    PubMed

    Zhu, Shu; Shirakawa, Aimi; Shi, Yanhong; Yu, Xiaoli; Tamura, Takayuki; Shibahara, Naotoshi; Yoshimatsu, Kayo; Komatsu, Katsuko

    2018-06-01

    The impact of key processing steps such as boiling, peeling, drying and storing on the chemical compositions and morphologic features of the produced peony root was investigated in detail by applying 15 processing methods to fresh roots of Paeonia lactiflora and then monitoring the contents of eight main components, as well as internal root color. The results showed that low-temperature (4 °C) storage of fresh roots for approximately 1 month after harvest resulted in a slightly increased and stable content of paeoniflorin, which might be due to suppression of enzymatic degradation. This storage also prevented the roots from discoloring, facilitating production of roots with a favorable bright color. The boiling process triggered decomposition of polygalloylglucoses, thereby leading to a significant increase in the contents of pentagalloylglucose and gallic acid. The peeling process resulted in a decrease of albiflorin and catechin contents. As a result, an optimized and practicable processing method ensuring high contents of the main active components in the produced root was developed.

  1. Chain of evidence generation for contrast enhancement in digital image forensics

    NASA Astrophysics Data System (ADS)

    Battiato, Sebastiano; Messina, Giuseppe; Strano, Daniela

    2010-01-01

    The quality of images obtained by digital cameras has improved greatly since the early days of digital photography. Unfortunately, it is not unusual in image forensics to encounter wrongly exposed pictures. This is mainly due to obsolete techniques or old technologies, but also to backlight conditions. Recovering otherwise invisible details therefore requires stretching the image contrast. Forensic rules for producing evidence require complete documentation of the processing steps, enabling replication of the entire process. The automation of enhancement techniques is thus quite difficult and needs to be carefully documented. This work presents an automatic procedure to find contrast enhancement settings, allowing both image correction and automatic script generation. The technique is based on a preprocessing step which extracts the features of the image and selects correction parameters. The parameters are then saved through JavaScript code that is used in the second step of the approach to correct the image. The generated script is Adobe Photoshop compliant (which is largely used in image forensics analysis), thus permitting replication of the enhancement steps. Experiments on a dataset of images are also reported, showing the effectiveness of the proposed methodology.
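    A percentile-based stretch is one simple way to realize the kind of automatic contrast-parameter selection described above. The sketch below assumes an 8-bit grayscale image and illustrative percentile bounds; it is not the authors' exact procedure, whose parameters would be exported to a Photoshop-compliant script rather than applied directly.

```python
import numpy as np

def auto_contrast_params(image, low_pct=1.0, high_pct=99.0):
    """Select stretch bounds from intensity percentiles (illustrative choice)."""
    lo, hi = np.percentile(image, [low_pct, high_pct])
    return float(lo), float(hi)

def apply_contrast_stretch(image, lo, hi):
    """Linearly map [lo, hi] to the full 8-bit range, clipping outliers."""
    stretched = (image.astype(np.float64) - lo) / max(hi - lo, 1e-9) * 255.0
    return np.clip(stretched, 0, 255).astype(np.uint8)
```

    The selected (lo, hi) pair is exactly the kind of setting that can be serialized to a script, so the enhancement stays documented and replicable, as forensic rules require.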

  2. Isolation and Purification of Biotechnological Products

    NASA Astrophysics Data System (ADS)

    Hubbuch, Jürgen; Kula, Maria-Regina

    2007-05-01

    The production of modern pharmaceutical proteins is one of the most rapidly growing fields in biotechnology. The overall development and production is a complex task ranging from strain development and cultivation to the purification and formulation of the drug. Downstream processing, however, still accounts for the major part of production costs. This is mainly due to the high demands on purity, and thus safety, of the final product, and results in processes with a sequence of typically more than 10 unit operations. Consequently, even if each process step operated at near-optimal yield, a very significant amount of product would be lost. The majority of unit operations applied in downstream processing have a long history in the field of chemical and process engineering; nevertheless, mathematical descriptions of the respective processes and the economical large-scale production of modern pharmaceutical products are hampered by the complexity of the biological feedstock, especially the high molecular weight and limited stability of proteins. In order to develop new operational steps as well as a successful overall process, it is thus a necessary prerequisite to develop a deeper understanding of the thermodynamics and physics behind the applied processes, as well as the implications for the product.

  3. Development and test of combustion chamber for Stirling engine heated by natural gas

    NASA Astrophysics Data System (ADS)

    Li, Tie; Song, Xiange; Gui, Xiaohong; Tang, Dawei; Li, Zhigang; Cao, Wenyu

    2014-04-01

    The combustion chamber is an important component of a Stirling engine heated by natural gas. In this paper, we develop a combustion chamber for a Stirling engine intended to generate 3-5 kWe of electric power. The combustion chamber includes three main components: the combustion module, the heat exchange cavity and the thermal head. Its distinguishing feature is that the structure separates the "combustion" and "heat transfer" processes into two distinct steps that occur one after the other. Since natural gas can mix fully with air before burning, the combustion process can be completed easily without a secondary air supply. The flame avoids contacting the thermal head of the Stirling engine, and the temperature fields can be easily controlled. The designed combustion chamber was manufactured and its performance tested in a two-step experiment. The result of the first step proves that the mixture of air and natural gas can be easily ignited and that the flame burns stably. In the second step, the combustion heat flux reached 20 kW, and the energy utilization efficiency of the thermal head exceeded 0.5. These test results show that the thermal performance of the combustion chamber has reached the design goal. The designed combustion chamber can be applied to a real Stirling engine heated by natural gas that is to generate 3-5 kWe of electric power.

  4. A methodology to event reconstruction from trace images.

    PubMed

    Milliet, Quentin; Delémont, Olivier; Sapin, Eric; Margot, Pierre

    2015-03-01

    The widespread use of digital imaging devices for surveillance (CCTV) and entertainment (e.g., mobile phones, compact cameras) has increased the number of images recorded and the opportunities to consider images as traces or documentation of criminal activity. The forensic science literature focuses almost exclusively on technical issues and evidence assessment [1]. Earlier steps in the investigation phase have been neglected and must be considered. This article is the first comprehensive description of a methodology for event reconstruction using images. This formal methodology was conceptualised from practical experience and applied to different contexts and case studies to test and refine it. Based on this practical analysis, we propose a systematic approach that includes a preliminary analysis followed by four main steps. These steps form a sequence in which the results of each step rely on the previous step. The methodology is not strictly linear, however, but a cyclic, iterative progression for obtaining knowledge about an event. The preliminary analysis is a pre-evaluation phase, wherein the potential relevance of images is assessed. In the first step, images are detected and collected as pertinent trace material; the second step involves organising them and assessing their quality and informative potential. The third step includes reconstruction using clues about space, time and actions. Finally, in the fourth step, the images are evaluated and selected as evidence. These steps are described and illustrated using practical examples. The paper outlines how images elicit information about persons, objects, space, time and actions throughout the investigation process to reconstruct an event step by step. We emphasise the hypothetico-deductive reasoning framework, which demonstrates the contribution of images to generating, refining or eliminating propositions or hypotheses.
This methodology provides a sound basis for extending image use as evidence and, more generally, as clues in investigation and crime reconstruction processes. Copyright © 2015 Forensic Science Society. Published by Elsevier Ireland Ltd. All rights reserved.

  5. Business Modeling to Implement an eHealth Portal for Infection Control: A Reflection on Co-Creation With Stakeholders

    PubMed Central

    Wentzel, Jobke; Sanderman, Robbert; van Gemert-Pijnen, Lisette

    2015-01-01

    Background It is acknowledged that the success and uptake of eHealth improve with the involvement of users and stakeholders, so that the technology reflects their needs. Involving stakeholders in implementation research is thus a crucial element in developing eHealth technology. Business modeling is an approach to guide implementation research for eHealth. Stakeholders are involved in business modeling by identifying relevant stakeholders, conducting value co-creation dialogs, and co-creating a business model. Because implementation activities are often underestimated as a crucial step in developing eHealth, comprehensive and applicable approaches geared toward business modeling in eHealth are scarce. Objective This paper demonstrates the potential of several stakeholder-oriented analysis methods and their practical application, using Infectionmanager as an example case. We aim to demonstrate how business modeling, with a focus on stakeholder involvement, is used to co-create an eHealth implementation. Methods We divided business modeling into 4 main research steps. For stakeholder identification, we performed literature scans, expert recommendations, and snowball sampling (Step 1). For stakeholder analyses, we performed “basic stakeholder analysis,” stakeholder salience, and ranking/analytic hierarchy process (Step 2). For value co-creation dialogs, we performed a process analysis and stakeholder interviews based on the business model canvas (Step 3). Finally, for business model generation, we combined all findings into the business model canvas (Step 4). Results Based on the applied methods, we synthesized a step-by-step guide for business modeling with stakeholder-oriented analysis methods that we consider suitable for implementing eHealth.
Conclusions The step-by-step guide for business modeling with stakeholder involvement enables eHealth researchers to apply a systematic and multidisciplinary, co-creative approach for implementing eHealth. Business modeling becomes an active part in the entire development process of eHealth and starts an early focus on implementation, in which stakeholders help to co-create the basis necessary for a satisfying success and uptake of the eHealth technology. PMID:26272510

  6. Identifying Key Features of Student Performance in Educational Video Games and Simulations through Cluster Analysis

    ERIC Educational Resources Information Center

    Kerr, Deirdre; Chung, Gregory K. W. K.

    2012-01-01

    The assessment cycle of "evidence-centered design" (ECD) provides a framework for treating an educational video game or simulation as an assessment. One of the main steps in the assessment cycle of ECD is the identification of the key features of student performance. While this process is relatively simple for multiple choice tests, when…

  7. Understanding of Protein Synthesis in a Living Cell

    ERIC Educational Resources Information Center

    Mustapha, Y.; Muhammad, S.

    2006-01-01

    The assembly of proteins takes place in the cytoplasm of a cell. There are three main steps. In initiation, far left, all the necessary parts of the process are brought together by a small molecule called a ribosome. During elongation, amino acids, the building blocks of proteins, are joined to one another in a long chain. The sequence in which…

  8. Synchronized observations of cloud-to-ground lightning using VHF broadband interferometer and acoustic arrays

    NASA Astrophysics Data System (ADS)

    Qiu, Shi; Zhou, Bi-Hua; Shi, Li-Hua

    2012-10-01

    A single-station-based lightning discharge channel reconstruction system, combining a two-dimensional (2D) VHF broadband interferometer and a three-dimensional (3D) acoustic lightning mapping system, has been developed and used for lightning observations. Two cloud-to-ground (CG) flashes with highly branched leaders recorded by the system are analyzed and presented in this paper. VHF radiation could well delineate the development of simultaneous leader branches, while acoustic emissions were mainly located along the main channel traversed by the return stroke (RS) process. Localizations by VHF and acoustic emissions agree well with each other. The mapping results confirm that the audible acoustic emission of a lightning discharge is mainly associated with high-current processes like the RS. Leaders can generate detectable acoustic signals, with amplitudes at least an order of magnitude weaker than the ensuing RS, but they are hard to identify except at ranges closer than the main channel. As a significant result, this paper provides the first 3D locations of the sources of tearing sounds, which are inferred to be generated by downward negative leaders as they approach the ground. The synchronized observations enable the VHF interferometer to locate lightning development in quasi-3D; three stepped leaders, five dart leaders and two dart-stepped leaders were identified, with 3D velocities of (1.3-3.9) × 10⁵ m/s, (1.0-2.9) × 10⁷ m/s and from (1.0-1.3) × 10⁷ m/s to (2.4-2.6) × 10⁶ m/s, respectively. In addition, the application of this approach to improving the accuracy of thunder ranging is discussed.

  9. The artificial object detection and current velocity measurement using SAR ocean surface images

    NASA Astrophysics Data System (ADS)

    Alpatov, Boris; Strotov, Valery; Ershov, Maksim; Muraviev, Vadim; Feldman, Alexander; Smirnov, Sergey

    2017-10-01

    Because the water surface covers wide areas, remote sensing is the most appropriate way to obtain information about the ocean environment for vessel tracking, security purposes, ecological studies and other applications. Processing of synthetic aperture radar (SAR) images is extensively used for control and monitoring of the ocean surface. Image data can be acquired from Earth observation satellites such as TerraSAR-X, ERS, and COSMO-SkyMed. Thus, SAR image processing can be used to solve many problems arising in this field of research. This paper discusses some of them, including ship detection, oil pollution control and ocean current mapping. Due to the complexity of the problem, several specialized algorithms need to be developed. The oil spill detection algorithm consists of the following main steps: image preprocessing, detection of dark areas, parameter extraction and classification. The ship detection algorithm consists of the following main steps: prescreening, land masking, image segmentation combined with parameter measurement, ship orientation estimation and object discrimination. The proposed approach to ocean current mapping is based on the Doppler effect. The results of computer modeling on real SAR images are presented. Based on these results, it is concluded that the proposed approaches can be used in maritime applications.
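    As a rough illustration of the dark-area detection step in the oil-spill pipeline, low-backscatter pixels can be flagged against a local mean. The window size and threshold factor below are assumptions made for the sketch, not values from the paper.

```python
import numpy as np

def detect_dark_areas(sar_image, window=5, k=0.6):
    """Flag pixels markedly darker than their local neighbourhood mean.

    Illustrative stand-in for SAR dark-area detection: a pixel is marked
    as a dark-area candidate when its value falls below k times the mean
    of the surrounding window x window patch (edge-padded at borders).
    """
    padded = np.pad(sar_image.astype(np.float64), window // 2, mode="edge")
    h, w = sar_image.shape
    local_mean = np.empty((h, w))
    for i in range(h):
        for j in range(w):
            local_mean[i, j] = padded[i:i + window, j:j + window].mean()
    return sar_image < k * local_mean
```

    In a full pipeline the resulting mask would feed the parameter-extraction and classification stages that separate oil slicks from look-alikes.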

  10. Extraction of medium chain fatty acids from organic municipal waste and subsequent production of bio-based fuels.

    PubMed

    Kannengiesser, Jan; Sakaguchi-Söder, Kaori; Mrukwia, Timo; Jager, Johannes; Schebek, Liselotte

    2016-01-01

    This paper provides an overview of investigations into a new technology to generate bio-based fuel additives from bio-waste. The investigations are taking place at the composting plant in Darmstadt-Kranichstein (Germany). The aim is to explore the potential of bio-waste as a feedstock for producing different bio-based products (or bio-based fuels). For this investigation, a facultative anaerobic process is integrated into the normal aerobic waste treatment process for composting. The bio-waste is treated in four steps to produce biofuels. The first step is the facultative anaerobic treatment of the waste in a rotting box, namely percolation, to generate a fatty-acid-rich liquid fraction. Hydrolysis takes place in the rotting box during the waste treatment: the organic compounds are dissolved and transferred into the liquid phase. Browne et al. (2013) describe hydrolysis as an enzymatic degradation of high-solid substrates to soluble products, which are further degraded to volatile fatty acids (VFA). This is confirmed by analytical tests on the liquid fraction: after percolation, volatile and medium-chain fatty acids are found in the liquid phase. Concentrations of fatty acids between 8.0 and 31.5 were detected, depending on the nature of the input material. In the second step, a fermentation process is initiated to produce additional fatty acids: the existing microorganism mass is activated to degrade the organic components still remaining in the percolate. After fermentation, the quantity of fatty acids in the four investigated reactors increased 3-5 times. During fermentation, mainly non-polar fatty acids (pentanoic to octanoic acid) are formed. In addition to the fermentation process, a chain-elongation step is arranged by adding ethanol to the fatty-acid-rich percolate. In these investigations, chain elongation mainly of fatty acids with even numbers of carbon atoms (acetate, butanoic and hexanoic acid) was demonstrated.
    After these three pre-treatments, the percolate is brought to a refinery to extract the non-polar fatty acids using bio-diesel, which was generated from used kitchen oil at the refinery. The extraction tests in the lab have proved that the efficiency of the liquid-liquid extraction is directly linked to the chain length and polarity of the fatty acids. Using a non-polar bio-diesel, mainly the non-polar fatty acids, such as pentanoic to octanoic acid, are extracted. After extraction, the bio-diesel enriched with the fatty acids is esterified. As a result, bio-diesel with a lower viscosity than usual is produced. The fatty acids remaining in the percolate after the extraction can be used in a further fermentation process to generate biogas. Copyright © 2015 Elsevier Ltd. All rights reserved.

  11. One-step global parameter estimation of kinetic inactivation parameters for Bacillus sporothermodurans spores under static and dynamic thermal processes.

    PubMed

    Cattani, F; Dolan, K D; Oliveira, S D; Mishra, D K; Ferreira, C A S; Periago, P M; Aznar, A; Fernandez, P S; Valdramidis, V P

    2016-11-01

    Bacillus sporothermodurans produces highly heat-resistant endospores that can survive ultra-high-temperature treatment. Highly heat-resistant spore-forming bacteria are one of the main causes of spoilage and safety problems in low-acid foods. They can be used as indicators or surrogates to establish the minimum requirements for heat processes, but it is necessary to understand their thermal inactivation kinetics. The aim of the present work was to study the inactivation kinetics of B. sporothermodurans under both static and dynamic conditions in a vegetable soup. Ordinary least squares one-step regression and sequential procedures were applied to estimate these parameters. Results showed that multiple dynamic heating profiles, when analyzed simultaneously, can be used to accurately estimate the kinetic parameters while significantly reducing estimation errors and data collection. Copyright © 2016 Elsevier Ltd. All rights reserved.
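    The idea of one-step regression, estimating all kinetic parameters directly from pooled survival data rather than fitting each temperature separately, can be sketched with a log-linear (Bigelow) model. The reference temperature, grid bounds and data below are illustrative assumptions, not the study's values.

```python
import numpy as np

T_REF = 121.0  # assumed reference temperature (deg C), not from the study

def fit_bigelow_one_step(t, T, logS, z_grid=np.linspace(4, 20, 1601)):
    """One-step estimation of (D_ref, z) for log-linear inactivation.

    Model: log10(N/N0) = -t / D(T), with D(T) = D_ref * 10**((T_REF - T)/z).
    For each candidate z the model is linear in 1/D_ref, so that slope is
    solved by ordinary least squares and the z with the smallest residual
    sum of squares is kept -- all (time, temperature) observations from
    every heating profile enter a single pooled fit.
    """
    best = None
    for z in z_grid:
        x = -t * 10.0 ** ((T - T_REF) / z)   # logS = x * (1/D_ref)
        slope = (x @ logS) / (x @ x)         # OLS estimate of 1/D_ref
        rss = np.sum((logS - slope * x) ** 2)
        if best is None or rss < best[0]:
            best = (rss, 1.0 / slope, z)
    return best[1], best[2]                  # D_ref (min), z (deg C)
```

    With noiseless synthetic data the fit recovers the generating parameters exactly, which is a convenient sanity check before applying it to real survivor counts.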

  12. Public Participation Procedure in Integrated Transport and Green Infrastructure Planning

    NASA Astrophysics Data System (ADS)

    Finka, Maroš; Ondrejička, Vladimír; Jamečný, Ľubomír; Husár, Milan

    2017-10-01

    The dialogue among decision makers and stakeholders is a crucial part of any decision-making process, particularly in the case of integrated transportation planning and planning of green infrastructure, where a multitude of actors is present. Although the theory of public participation is well developed after several decades of research, there is still a lack of practical guidelines due to the specificity of public participation challenges. The paper presents a model of public participation for integrated transport and green infrastructure planning for the international project TRANSGREEN, covering the area of five European countries: Slovakia, the Czech Republic, Austria, Hungary and Romania. The challenge of the project is to coordinate the efforts of public actors and NGOs in an international environment in the oftentimes precarious projects of transport infrastructure building and green infrastructure development. The project aims at developing an environmentally friendly and safe international transport network. The proposed public participation procedure consists of five main steps: spread of information (passive), collection of information (consultation), intermediate discussion, engagement, and partnership (empowerment). The initial spread of information is a process of communicating with the stakeholders, informing and educating them, and it is based on their willingness to be informed. The methods used in this stage are public displays, newsletters or press releases. The second step, consultation, is based on conveying the opinions of stakeholders to the decision makers. Polls, surveys, public hearings or written responses are examples of the multitude of ways to achieve this objective, and the main principle is openness of stakeholders. The third step is intermediate discussion, where all sides are invited to a dialogue using tools such as public meetings, workshops or urban walks.
    The fourth step is engagement, based on humble negotiation, arbitration and mediation; the collaborative skill needed here is dealing with conflicts. The final step in the procedure is partnership and empowerment, employing methods such as multi-actor decision making, voting or referenda. The leading principle is cooperation. In this ultimate step, the stakeholders become decision makers themselves, and the success factor is continuous evaluation.

  13. A characterization of the two-step reaction mechanism of phenol decomposition by a Fenton reaction

    NASA Astrophysics Data System (ADS)

    Valdés, Cristian; Alzate-Morales, Jans; Osorio, Edison; Villaseñor, Jorge; Navarro-Retamal, Carlos

    2015-11-01

    Phenol is one of the worst contaminants to date, and its degradation has been a crucial task over the years. Here, the decomposition process of phenol in a Fenton reaction is described. Using scavengers, it was observed that the decomposition of phenol was mainly influenced by the production of hydroxyl radicals. Experimental and theoretical activation energies (Ea) for phenol oxidation intermediates were calculated. According to these Ea values, phenol decomposition follows a two-step reaction mechanism mediated predominantly by hydroxyl radicals, producing a decomposition yield order of hydroquinone > catechol > resorcinol. Furthermore, traces of reaction-derived acids were detected by HPLC and GC-MS.

  14. A new hyperspectral image compression paradigm based on fusion

    NASA Astrophysics Data System (ADS)

    Guerra, Raúl; Melián, José; López, Sebastián.; Sarmiento, Roberto

    2016-10-01

    The on-board compression of remotely sensed hyperspectral images is an important task nowadays. One of the main difficulties is that the compression of these images must be performed on the satellite which carries the hyperspectral sensor. Hence, this process must be performed by space-qualified hardware with area, power and speed limitations. Moreover, it is important to achieve high compression ratios without compromising the quality of the decompressed image. In this manuscript we propose a new methodology for compressing hyperspectral images based on hyperspectral image fusion concepts. The proposed compression process has two independent steps. The first is to spatially degrade the remotely sensed hyperspectral image to obtain a low-resolution hyperspectral image. The second is to spectrally degrade the remotely sensed hyperspectral image to obtain a high-resolution multispectral image. These two degraded images are then sent to the Earth's surface, where they must be fused, using a fusion algorithm for hyperspectral and multispectral images, in order to recover the remotely sensed hyperspectral image. The main advantage of the proposed methodology is that the compression process, which must be performed on-board, becomes very simple, while the fusion process used to reconstruct the image is the more complex one. An extra advantage is that the compression ratio can be fixed in advance. Many simulations have been performed using different fusion algorithms and different methodologies for degrading the hyperspectral image. The results obtained corroborate the benefits of the proposed methodology.
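    The two on-board degradation steps can be sketched in a few lines: block-averaging reduces spatial resolution (low-resolution hyperspectral image), while band-averaging reduces spectral resolution (high-resolution multispectral image). The block size and band grouping below are arbitrary illustrative choices, not the degradation operators evaluated in the paper.

```python
import numpy as np

def spatial_degrade(cube, factor):
    """Average non-overlapping factor x factor blocks in each band,
    producing the low-spatial-resolution hyperspectral image."""
    rows, cols, bands = cube.shape
    r, c = rows // factor, cols // factor
    return cube[:r * factor, :c * factor, :].reshape(
        r, factor, c, factor, bands).mean(axis=(1, 3))

def spectral_degrade(cube, band_groups):
    """Average groups of adjacent bands, producing the
    high-spatial-resolution multispectral image."""
    return np.stack([cube[:, :, g].mean(axis=2) for g in band_groups], axis=2)
```

    Both operators are cheap enough for on-board hardware; the heavy lifting (fusion-based reconstruction) then happens on the ground.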

  15. 27. VIEW FROM AFT OF MAIN HOISTING ENGINE WITH HOISTING ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    27. VIEW FROM AFT OF MAIN HOISTING ENGINE WITH HOISTING DRUM IN FOREGROUND. NOTE MAIN HOISTING DRUM IS A STEP DRUM, WITH TWO DIAMETERS ON DRUM. WHEN BUCKET IS IN WATER THE CABLE IS ON THE SMALLER STEP, AS PICTURED, GIVING MORE POWER TO THE LINE. THE CABLE STEPS TO LARGER DIAMETER WHEN BUCKET IS OUT OF WATER, WHERE SPEED IS MORE IMPORTANT THAN POWER. SMALLER BACKING DRUM IN BACKGROUND. - Dredge CINCINNATI, Docked on Ohio River at foot of Lighthill Street, Pittsburgh, Allegheny County, PA

  16. Computational mate choice: theory and empirical evidence.

    PubMed

    Castellano, Sergio; Cadeddu, Giorgia; Cermelli, Paolo

    2012-06-01

    The present review is based on the thesis that mate choice results from information-processing mechanisms governed by computational rules and that, to understand how females choose their mates, we should identify which are the sources of information and how they are used to make decisions. We describe mate choice as a three-step computational process and for each step we present theories and review empirical evidence. The first step is a perceptual process. It describes the acquisition of evidence, that is, how females use multiple cues and signals to assign an attractiveness value to prospective mates (the preference function hypothesis). The second step is a decisional process. It describes the construction of the decision variable (DV), which integrates evidence (private information by direct assessment), priors (public information), and value (perceived utility) of prospective mates into a quantity that is used by a decision rule (DR) to produce a choice. We make the assumption that females are optimal Bayesian decision makers and we derive a formal model of DV that can explain the effects of preference functions, mate copying, social context, and females' state and condition on the patterns of mate choice. The third step of mating decision is a deliberative process that depends on the DRs. We identify two main categories of DRs (absolute and comparative rules), and review the normative models of mate sampling tactics associated to them. We highlight the limits of the normative approach and present a class of computational models (sequential-sampling models) that are based on the assumption that DVs accumulate noisy evidence over time until a decision threshold is reached. These models force us to rethink the dichotomy between comparative and absolute decision rules, between discrimination and recognition, and even between rational and irrational choice. 
Since they have a robust biological basis, we think they may represent a useful theoretical tool for behavioural ecologists interested in integrating proximate and ultimate causes of mate choice. Copyright © 2012 Elsevier B.V. All rights reserved.
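The sequential-sampling models discussed above can be illustrated with a minimal drift-diffusion sketch, in which a decision variable accumulates noisy evidence until it crosses an acceptance or rejection threshold; all parameter values below are illustrative assumptions, not values taken from the review:

```python
import random

def sequential_sampling_choice(drift, threshold, noise_sd=1.0, dt=0.01,
                               max_steps=10000, rng=None):
    """Accumulate noisy evidence until the decision variable (DV)
    crosses +threshold (accept the mate) or -threshold (reject).
    Returns the choice (+1/-1, or 0 on timeout) and the step count."""
    rng = rng or random.Random(0)
    dv = 0.0
    for step in range(1, max_steps + 1):
        dv += drift * dt + rng.gauss(0.0, noise_sd) * dt ** 0.5
        if dv >= threshold:
            return +1, step
        if dv <= -threshold:
            return -1, step
    return 0, max_steps  # no decision within the time limit

# With a strongly positive drift (an attractive prospective mate),
# acceptances should dominate across repeated trials.
rng = random.Random(42)
choices = [sequential_sampling_choice(drift=1.0, threshold=1.0, rng=rng)[0]
           for _ in range(200)]
accept_rate = choices.count(+1) / len(choices)
```

For drift 1.0 and symmetric thresholds at ±1.0, standard diffusion theory puts the acceptance probability near 0.88, so most simulated trials end in acceptance.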

  17. Brunn: an open source laboratory information system for microplates with a graphical plate layout design process.

    PubMed

    Alvarsson, Jonathan; Andersson, Claes; Spjuth, Ola; Larsson, Rolf; Wikberg, Jarl E S

    2011-05-20

    Compound profiling and drug screening generate large amounts of data and are generally based on microplate assays. Current information systems used for handling this are mainly commercial, closed source, expensive, and heavyweight, and there is a need for a flexible, lightweight, open system for handling plate design, data validation, and data preparation. A Bioclipse plugin consisting of a client part and a relational database was constructed. A multiple-step plate layout point-and-click interface was implemented inside Bioclipse. The system contains a data validation step, where outliers can be removed, and finally a plate report with all relevant calculated data, including dose-response curves. Brunn is capable of handling the data from microplate assays. It can create dose-response curves and calculate IC50 values. Using a system of this sort facilitates work in the laboratory. Being able to reuse already constructed plates and plate layouts by starting out from an earlier step in the plate layout design process saves time and cuts down on error sources.
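The abstract does not specify how Brunn computes IC50 values; as a hedged illustration of the underlying idea, the sketch below simulates plate readings on a four-parameter logistic dose-response curve and recovers the half-maximal dose by log-linear interpolation (all function names and parameter values are invented for this example):

```python
import math

def four_pl(dose, bottom, top, ic50, hill):
    """Four-parameter logistic dose-response curve."""
    return bottom + (top - bottom) / (1.0 + (dose / ic50) ** hill)

def ic50_by_interpolation(doses, responses):
    """Estimate the dose giving a half-maximal response by
    log-linear interpolation between the two bracketing points."""
    top, bottom = max(responses), min(responses)
    half = (top + bottom) / 2.0
    for (d1, r1), (d2, r2) in zip(zip(doses, responses),
                                  zip(doses[1:], responses[1:])):
        if (r1 - half) * (r2 - half) <= 0:  # points bracket the half-maximum
            frac = (half - r1) / (r2 - r1)
            return 10 ** (math.log10(d1)
                          + frac * (math.log10(d2) - math.log10(d1)))
    raise ValueError("half-maximal response not bracketed by the data")

# Simulated readings on a serial dilution (true IC50 = 1.0, arbitrary units).
doses = [0.01 * 3 ** i for i in range(10)]
responses = [four_pl(d, bottom=5.0, top=100.0, ic50=1.0, hill=1.0)
             for d in doses]
est = ic50_by_interpolation(doses, responses)
```

Because the 4PL curve is not exactly log-linear between dilution points, the interpolated estimate lands near, but not exactly at, the true IC50.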

  18. Optimization of an incubation step to maximize sulforaphane content in pre-processed broccoli.

    PubMed

    Mahn, Andrea; Pérez, Carmen

    2016-11-01

    Sulforaphane is a powerful anticancer compound, found naturally in food, which comes from the hydrolysis of glucoraphanin, the main glucosinolate of broccoli. The aim of this work was to maximize sulforaphane content in broccoli by designing an incubation step after subjecting broccoli pieces to an optimized blanching step. Incubation was optimized through a Box-Behnken design using ascorbic acid concentration, incubation temperature and incubation time as factors. The optimal incubation conditions were 38 °C for 3 h and 0.22 mg ascorbic acid per g fresh broccoli. The maximum sulforaphane concentration predicted by the model was 8.0 µmol g⁻¹, which was confirmed experimentally, yielding a value of 8.1 ± 0.3 µmol g⁻¹. This represents a 585% increase with respect to fresh broccoli and a 119% increase in relation to blanched broccoli, equivalent to a conversion of 94% of glucoraphanin. The process proposed here allows sulforaphane content to be maximized, thus avoiding artificial chemical synthesis. The compound could probably be isolated from broccoli, and may find application as a nutraceutical or functional ingredient.
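As a hedged sketch of the response-surface reasoning behind a Box-Behnken optimization, the snippet below locates the stationary point of a fitted second-order model in a single coded factor; the coefficients and factor levels are invented for illustration and are not the paper's fitted model:

```python
def quadratic_response(x, b0, b1, b2):
    """Second-order (response-surface) model in one coded factor."""
    return b0 + b1 * x + b2 * x * x

def stationary_point(b1, b2):
    """Coded factor level that maximizes the response (requires b2 < 0)."""
    if b2 >= 0:
        raise ValueError("no interior maximum: quadratic term must be negative")
    return -b1 / (2.0 * b2)

# Illustrative coefficients (NOT the paper's model): a response surface
# in coded incubation temperature with a maximum between design levels.
b0, b1, b2 = 7.0, 0.8, -1.6
x_opt = stationary_point(b1, b2)                  # coded optimum
y_opt = quadratic_response(x_opt, b0, b1, b2)     # predicted response

# Decode: assume coded -1/0/+1 correspond to 25/35/45 °C (invented levels).
temp_opt = 35.0 + 10.0 * x_opt
```

In a real Box-Behnken analysis the coefficients come from least-squares fitting of the design runs, and the stationary point is found jointly across all three factors.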

  19. Computer-Aided Diagnosis Systems for Lung Cancer: Challenges and Methodologies

    PubMed Central

    El-Baz, Ayman; Beache, Garth M.; Gimel'farb, Georgy; Suzuki, Kenji; Okada, Kazunori; Elnakib, Ahmed; Soliman, Ahmed; Abdollahi, Behnoush

    2013-01-01

    This paper overviews one of the most important, interesting, and challenging problems in oncology, the problem of lung cancer diagnosis. Developing an effective computer-aided diagnosis (CAD) system for lung cancer is of great clinical importance and can increase the patient's chance of survival. For this reason, CAD systems for lung cancer have been investigated in a huge number of research studies. A typical CAD system for lung cancer diagnosis is composed of four main processing steps: segmentation of the lung fields, detection of nodules inside the lung fields, segmentation of the detected nodules, and diagnosis of the nodules as benign or malignant. This paper overviews the current state-of-the-art techniques that have been developed to implement each of these CAD processing steps. For each technique, various aspects of technical issues, implemented methodologies, training and testing databases, and validation methods, as well as achieved performances, are described. In addition, the paper addresses several challenges that researchers face in each implementation step and outlines the strengths and drawbacks of the existing approaches for lung cancer CAD systems. PMID:23431282

  20. Zeolite based microconcentrators for volatile organic compounds sensing at trace-level: fabrication and performance

    NASA Astrophysics Data System (ADS)

    Almazán, Fernando; Pellejero, Ismael; Morales, Alberto; Urbiztondo, Miguel A.; Sesé, Javier; Pina, M. Pilar; Santamaría, Jesús

    2016-08-01

    A novel 6-step microfabrication process is proposed in this work to prepare microfluidic devices with integrated zeolite layers. In particular, microfabricated preconcentrators designed for volatile organic compound (VOC) sensing applications are fully described. The main novelty of this work is the integration of a pure siliceous MFI-type zeolite (silicalite-1) polycrystalline layer, 4.0 ± 0.5 μm thick, as the active phase within the microfabrication process, just before the anodic bonding step. Following this new procedure, Si microdevices with an excellent distribution of the adsorbent material, integrated resistive heaters and Pyrex caps have been obtained. First, the microconcentrator performance has been assessed by means of n-hexane breakthrough curves as a function of sampling and desorption flow rates, temperature and micropreconcentrator design. As a further step, the best preconcentrator device has been tested in combination with downstream Si-based microcantilevers deployed as VOC detectors. Thus, a preliminary evaluation of the improvement in detection sensitivity provided by silicalite-1-based microconcentrators is presented.

  1. Breast cancer mitosis detection in histopathological images with spatial feature extraction

    NASA Astrophysics Data System (ADS)

    Albayrak, Abdülkadir; Bilgin, Gökhan

    2013-12-01

    In this work, cellular mitosis detection in histopathological images has been investigated. Mitosis detection is a very expensive and time-consuming process. The development of digital imaging in pathology has enabled a reasonable and effective solution to this problem. Segmentation of digital images provides easier analysis of cell structures in histopathological data. To differentiate normal and mitotic cells in histopathological images, the feature extraction step is crucial for system accuracy. A mitotic cell has more distinctive textural dissimilarities than other normal cells. Hence, it is important to incorporate spatial information in the feature extraction or post-processing steps. As a main part of this study, the Haralick texture descriptor has been proposed with different spatial window sizes in the RGB and La*b* color spaces, so that the spatial dependencies of normal and mitotic cellular pixels can be evaluated within different pixel neighborhoods. Extracted features are compared across various sample sizes by Support Vector Machines using k-fold cross-validation. According to the presented results, the separation accuracy on mitotic and non-mitotic cellular pixels improves with increasing spatial window size.
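The Haralick descriptor mentioned above is built on gray-level co-occurrence statistics computed within a spatial window. A minimal sketch of the idea follows (not the authors' implementation; the toy images, single pixel offset, and 4 gray levels are assumptions for illustration):

```python
def glcm(image, dx=1, dy=0, levels=4):
    """Normalized gray-level co-occurrence matrix for one offset (dx, dy)."""
    h, w = len(image), len(image[0])
    m = [[0] * levels for _ in range(levels)]
    for y in range(h):
        for x in range(w):
            x2, y2 = x + dx, y + dy
            if 0 <= x2 < w and 0 <= y2 < h:
                m[image[y][x]][image[y2][x2]] += 1
    total = sum(sum(row) for row in m)
    return [[c / total for c in row] for row in m]

def haralick_contrast(p):
    """Haralick contrast: sum over (i, j) of p(i, j) * (i - j)^2."""
    n = len(p)
    return sum(p[i][j] * (i - j) ** 2 for i in range(n) for j in range(n))

# A uniform patch has zero contrast; a striped patch does not, which is
# the kind of textural dissimilarity that separates mitotic pixels.
flat = [[1] * 4 for _ in range(4)]
stripes = [[0, 3, 0, 3] for _ in range(4)]
c_flat = haralick_contrast(glcm(flat))
c_stripes = haralick_contrast(glcm(stripes))
```

In practice such features are computed per color channel over sliding windows and fed to the SVM classifier.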

  2. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lobo, R.; Revah, S.; Viveros-Garcia, T.

    An analysis of the local processes occurring in a trickle-bed bioreactor (TBB) with a first-order bioreaction shows that the identification of the TBB operating regime requires knowledge of the substrate concentration in the liquid phase. If the substrate liquid concentration is close to zero, the rate-controlling step is mass transfer at the gas-liquid interface; when it is close to the value in equilibrium with the gas phase, the controlling step is the phenomena occurring in the biofilm. CS₂ removal rate data obtained in a TBB with a Thiobacilli consortium biofilm are analyzed to obtain the mass transfer and kinetic parameters, and to show that the bioreactor operates in a regime mainly controlled by mass transfer. A TBB model with two experimentally determined parameters is developed and used to show how the bioreactor size depends on the rate-limiting step, the absorption factor, the substrate fractional conversion, and on the gas and liquid contact pattern. Under certain conditions, the TBB size is independent of the flowing phases' contact pattern. The model effectively describes substrate gas and liquid concentration data for mass transfer and biodegradation rate controlled processes.
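The regime identification described above can be sketched as a steady-state balance: gas-liquid transfer k_La(C* − C_L) equals a first-order biofilm reaction k_bio·C_L, and the resulting liquid concentration indicates the controlling step. A minimal illustration with invented parameter values (not the paper's fitted parameters):

```python
def steady_state_liquid_conc(c_star, kla, kbio):
    """Steady-state substrate concentration in the liquid phase when
    gas-liquid transfer kla*(c_star - c_l) balances a first-order
    biofilm reaction kbio*c_l."""
    return kla * c_star / (kla + kbio)

def controlling_step(c_star, kla, kbio, tol=0.1):
    """Classify the operating regime from where C_L sits between
    zero and the gas-equilibrium value C*."""
    c_l = steady_state_liquid_conc(c_star, kla, kbio)
    if c_l < tol * c_star:
        return "gas-liquid mass transfer"
    if c_l > (1 - tol) * c_star:
        return "biofilm reaction"
    return "mixed control"

# Fast biofilm kinetics drive C_L toward zero: transfer-limited.
regime_fast_bio = controlling_step(c_star=1.0, kla=0.01, kbio=1.0)
# Slow kinetics leave C_L near equilibrium with the gas: reaction-limited.
regime_slow_bio = controlling_step(c_star=1.0, kla=1.0, kbio=0.01)
```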

  3. Design of a novel automated methanol feed system for pilot-scale fermentation of Pichia pastoris.

    PubMed

    Hamaker, Kent H; Johnson, Daniel C; Bellucci, Joseph J; Apgar, Kristie R; Soslow, Sherry; Gercke, John C; Menzo, Darrin J; Ton, Christopher

    2011-01-01

    Large-scale fermentation of Pichia pastoris requires a large volume of methanol feed during the induction phase. However, a large volume of methanol feed is difficult to use in the processing suite because of the inconvenience of constant monitoring, manual manipulation steps, and fire and explosion hazards. To optimize and improve safety of the methanol feed process, a novel automated methanol feed system has been designed and implemented for industrial fermentation of P. pastoris. Details of the design of the methanol feed system are described. The main goals of the design were to automate the methanol feed process and to minimize the hazardous risks associated with storing and handling large quantities of methanol in the processing area. The methanol feed system is composed of two main components: a bulk feed (BF) system and up to three portable process feed (PF) systems. The BF system automatically delivers methanol from a central location to the portable PF system. The PF system provides precise flow control of linear, step, or exponential feed of methanol to the fermenter. Pilot-scale fermentations with linear and exponential methanol feeds were conducted using two Mut(+) (methanol utilization plus) strains, one expressing a recombinant therapeutic protein and the other a monoclonal antibody. Results show that the methanol feed system is accurate, safe, and efficient. The feed rates for both linear and exponential feed methods were within ± 5% of the set points, and the total amount of methanol fed was within 1% of the targeted volume. Copyright © 2011 American Institute of Chemical Engineers (AIChE).
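An exponential feed profile of the kind mentioned above can be sketched as F(t) = F0·exp(µt); the initial rate, growth rate and induction time below are invented for illustration and are not the paper's operating values:

```python
import math

def exponential_feed_rate(f0, mu, t):
    """Exponential methanol feed profile F(t) = F0 * exp(mu * t)."""
    return f0 * math.exp(mu * t)

def total_volume(f0, mu, t_end):
    """Analytic total methanol volume delivered over [0, t_end]:
    the integral of F(t), i.e. F0 * (exp(mu * t_end) - 1) / mu."""
    return f0 * (math.exp(mu * t_end) - 1.0) / mu

# Hypothetical induction-phase profile: 10 mL/h initial rate,
# exponent 0.05 1/h, 24 h induction (assumed values).
f0, mu, t_end = 10.0, 0.05, 24.0
rates = [exponential_feed_rate(f0, mu, t) for t in range(25)]
v_total = total_volume(f0, mu, t_end)
```

Comparing the delivered volume against this analytic target is one simple way to check that a feed controller stayed within a tolerance such as the ±5% reported above.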

  4. From vision to action: roadmapping as a strategic method and tool to implement climate change adaptation - the example of the roadmap 'water sensitive urban design 2020'.

    PubMed

    Hasse, J U; Weingaertner, D E

    2016-01-01

    As the central product of the BMBF-KLIMZUG-funded Joint Network and Research Project (JNRP) 'dynaklim - Dynamic adaptation of regional planning and development processes to the effects of climate change in the Emscher-Lippe region (North Rhine-Westphalia, Germany)', the Roadmap 2020 'Regional Climate Adaptation' has been developed by the various regional stakeholders and institutions and contains specific regional scenarios, strategies and adaptation measures applicable throughout the region. This paper presents the method, elements and main results of this regional roadmap process, using the example of the thematic sub-roadmap 'Water Sensitive Urban Design 2020'. With a focus on the process support tool 'KlimaFLEX', one of the main adaptation measures of the WSUD 2020 roadmap, typical challenges for integrated climate change adaptation, such as scattered knowledge, knowledge gaps and divided responsibilities, but also potential solutions and promising opportunities for urban development and urban water management, are discussed. With the roadmap and the related tool, the relevant stakeholders of the Emscher-Lippe region have jointly developed important prerequisites to integrate their knowledge, to clarify vulnerabilities, adaptation goals, responsibilities and interests, and to coordinate, with foresight, measures, resources, priorities and schedules for efficient joint urban planning, well-grounded decision-making in times of continued uncertainty, and step-by-step implementation of adaptation measures from now on.

  5. Toward the reconstitution of synthetic cell motility

    PubMed Central

    Siton-Mendelson, Orit; Bernheim-Groswasser, Anne

    2016-01-01

    Cellular motility is a fundamental process essential for embryonic development, wound healing, immune responses, and tissue development. Cells mostly move by crawling on external, or internal, substrates, which can differ in their surface composition, geometry, and dimensionality. Cells can adopt different migration phenotypes, e.g., bleb-based and protrusion-based, depending on myosin contractility, surface adhesion, and cell confinement. In the past few decades, research on cell motility has focused on uncovering the major molecular players and their order of events. Despite major progress, our ability to infer the collective behavior from the molecular properties remains a major challenge, especially because cell migration integrates numerous chemical and mechanical processes that are coupled via feedbacks spanning a large range of time and length scales. For this reason, reconstituted model systems were developed. These systems allow for full control of the molecular constituents and various system parameters, thereby providing insight into their individual roles and functions. In this review we describe the various reconstituted model systems that were developed in the past decades. Because of the multiple steps involved in cell motility and the complexity of the overall process, most of the model systems focus on very specific aspects of the individual steps of cell motility. Here we describe the main advancements in cell motility reconstitution and discuss the main challenges toward the realization of a synthetic motile cell. PMID:27019160

  6. Study on the synthesis process of tetracaine hydrochloride

    NASA Astrophysics Data System (ADS)

    Li, Wenli; Zhao, Jie; Cui, Yujie

    2017-05-01

    Tetracaine hydrochloride is a long-acting, ester-type local anesthetic, usually present in the form of a hydrochloride salt. Firsleb first synthesized tetracaine in 1928, and it is one of the recognized clinically potent anesthetics. This medicine has the advantages of stable physical and chemical properties, rapid onset and long duration of action. Tetracaine is used as one of the main local anesthetics for conduction block anesthesia, mucosal surface anesthesia, epidural anesthesia and ophthalmic surface anesthesia. So far, research has mainly addressed its clinical applications, and relatively little work has been done on its synthetic technology. The existing production processes generally have high cost and low yield; in addition, the reaction time is long and the reaction conditions are harsh. In this paper, a new synthetic method is proposed for the synthesis of tetracaine hydrochloride. The reaction route has the advantages of few steps, high yield, short reaction time and mild reaction conditions. Cheap p-nitrobenzoic acid was selected as the raw material. After esterification with ethanol and reaction with n-butyraldehyde (a sequence comprising nitro reduction, aldol condensation and hydrogenation reduction), the intermediate was transesterified with dimethylaminoethanol under basic conditions. Finally, the pH value was adjusted in ethanol solvent. After four reaction steps, crude tetracaine hydrochloride was obtained.

  7. Semi-autonomous remote sensing time series generation tool

    NASA Astrophysics Data System (ADS)

    Babu, Dinesh Kumar; Kaufmann, Christof; Schmidt, Marco; Dhams, Thorsten; Conrad, Christopher

    2017-10-01

    High spatial and temporal resolution data are vital for crop monitoring and phenology change detection. Owing to limitations of satellite architecture and frequent cloud cover, the availability of daily high-spatial-resolution data is still far from reality. Remote sensing time series generation of high spatial and temporal resolution data by data fusion seems to be a practical alternative. However, it is not an easy process, since it involves multiple steps and also requires multiple tools. In this paper, a framework for a Geo Information System (GIS) based tool is presented for semi-autonomous time series generation. This tool eliminates the difficulties by automating all the steps and enables users to generate synthetic time series data with ease. First, all the steps required for the time series generation process are identified and grouped into blocks based on their functionalities. Then two main frameworks are created: one to perform all the pre-processing steps on various satellite data, and the other to perform data fusion to generate the time series. The two frameworks can be used individually to perform specific tasks, or they can be combined to perform both processes in one go. The tool can handle most of the known geo data formats currently available, which makes it a generic tool for time series generation from various remote sensing satellite data. It is developed as a common platform with a good interface that provides many functionalities to enable further development of more remote sensing applications. A detailed description of the capabilities and advantages of the frameworks is given in this paper.

  8. SARTools: A DESeq2- and EdgeR-Based R Pipeline for Comprehensive Differential Analysis of RNA-Seq Data.

    PubMed

    Varet, Hugo; Brillet-Guéguen, Loraine; Coppée, Jean-Yves; Dillies, Marie-Agnès

    2016-01-01

    Several R packages exist for the detection of differentially expressed genes from RNA-Seq data. The analysis process includes three main steps, namely normalization, dispersion estimation and testing for differential expression. Quality control steps along this process are recommended but not mandatory, and failing to check the characteristics of the dataset may lead to spurious results. In addition, normalization methods and statistical models are not exchangeable across the packages without adequate transformations of which users are often not aware. Thus, dedicated analysis pipelines are needed to include systematic quality control steps and prevent errors from misuse of the proposed methods. SARTools is an R pipeline for differential analysis of RNA-Seq count data. It can handle designs involving two or more conditions of a single biological factor, with or without a blocking factor (such as a batch effect or a sample pairing). It is based on DESeq2 and edgeR and is composed of an R package and two R script templates (for DESeq2 and edgeR, respectively). By tuning a small number of parameters and executing one of the R scripts, users have access to the full results of the analysis, including lists of differentially expressed genes and an HTML report that (i) displays diagnostic plots for quality control and model hypothesis checking and (ii) keeps track of the whole analysis process, parameter values and versions of the R packages used. SARTools provides systematic quality controls of the dataset as well as diagnostic plots that help to tune the model parameters. It gives access to the main parameters of DESeq2 and edgeR and prevents untrained users from misusing some functionalities of both packages. By keeping track of all the parameters of the analysis process, it fits the requirements of reproducible research.
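SARTools itself is an R pipeline; as a language-neutral sketch of the normalization step it delegates to DESeq2, the snippet below implements median-of-ratios size factors in Python (a simplified reimplementation for illustration, not the package's code):

```python
import math
from statistics import median

def size_factors(counts):
    """DESeq2-style median-of-ratios size factors.
    counts: list of samples, each a list of per-gene counts."""
    n_genes = len(counts[0])
    # Log geometric mean per gene across samples (genes with a zero skipped).
    log_geo = []
    for g in range(n_genes):
        vals = [s[g] for s in counts]
        if all(v > 0 for v in vals):
            log_geo.append((g, sum(math.log(v) for v in vals) / len(vals)))
    factors = []
    for s in counts:
        # Median of log-ratios of this sample to the per-gene reference.
        ratios = [math.log(s[g]) - lg for g, lg in log_geo if s[g] > 0]
        factors.append(math.exp(median(ratios)))
    return factors

# Sample b is the same library sequenced twice as deep as sample a,
# so its size factor should be twice sample a's.
a = [10, 20, 30, 40]
b = [20, 40, 60, 80]
fa, fb = size_factors([a, b])
```

Dividing each sample's counts by its size factor makes expression levels comparable across libraries of different sequencing depth.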

  9. Full-waveform data for building roof step edge localization

    NASA Astrophysics Data System (ADS)

    Słota, Małgorzata

    2015-08-01

    Airborne laser scanning data perfectly represent flat or gently sloped areas; to date, however, accurate breakline detection is the main drawback of this technique. This issue becomes particularly important in the case of modeling buildings, where accuracy higher than the footprint size is often required. This article covers several issues related to full-waveform data registered on building step edges. First, a full-waveform data simulator was developed and is presented in this paper. Second, this article provides a full description of the changes in echo amplitude, echo width and returned power caused by the presence of edges within the laser footprint. Additionally, two important properties of step edge echoes, peak shift and echo asymmetry, were noted and described. It was shown that these properties lead to incorrect echo positioning along the laser center line and can significantly reduce the accuracy of edge points. For these reasons, and because all points are aligned with the center of the beam regardless of the actual target position within the beam footprint, we can state that step edge points require geometric corrections. This article presents a novel algorithm for the refinement of step edge points. The main distinguishing advantage of the developed algorithm is that no additional data, such as emitted signal parameters, beam divergence, approximate edge geometry or scanning settings, are required. The proposed algorithm works only on georeferenced profiles of reflected laser energy. Another major advantage is the simplicity of the calculation, allowing for very efficient data processing. Additionally, the developed method of point correction allows for the accurate determination of points lying on edges and for edge point densification. For this reason, fully automatic localization of building roof step edges from full-waveform LiDAR data, with accuracy higher than the size of the LiDAR footprint, is feasible.
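The peak-shift effect described above can be sketched by summing two Gaussian echo components returned from the two surfaces at a step edge: the merged waveform peaks at neither true surface position, biasing the recorded range. All pulse parameters below are illustrative assumptions, not values from the paper's simulator:

```python
import math

def gaussian_echo(t, amp, center, width):
    """One returned-pulse component modeled as a Gaussian."""
    return amp * math.exp(-0.5 * ((t - center) / width) ** 2)

def waveform_peak(times, samples):
    """Sample time of the maximum recorded amplitude."""
    return max(zip(samples, times))[1]

# Two overlapping echoes from the upper and lower roof surfaces at a
# step edge (centers 8.0 and 11.0 ns, equal widths, unequal amplitudes).
times = [i * 0.1 for i in range(200)]
merged = [gaussian_echo(t, 1.0, 8.0, 1.5) + gaussian_echo(t, 0.8, 11.0, 1.5)
          for t in times]
peak = waveform_peak(times, merged)
```

Because the two components overlap within the footprint, the merged peak falls between the true surface returns rather than at the stronger component's center, which is the positioning error the refinement algorithm corrects.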

  10. Donor-π-Acceptor Polymer with Alternating Triarylborane and Triphenylamine Moieties.

    PubMed

    Li, Haiyan; Jäkle, Frieder

    2010-05-12

    A luminescent main chain donor-π-acceptor-type polymer (4) was prepared via organometallic polycondensation reaction followed by post modification. With both electron-rich amine and electron-deficient borane moieties embedded in the main chain, 4 exhibits an interesting ambipolar character: it can be reduced and oxidized electrochemically at moderate potentials and shows a strong solvatochromic effect in the emission spectra. Complexation studies show that 4 selectively binds to fluoride and cyanide; quantitative titration with cyanide reveals a two-step binding process. Copyright © 2010 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  11. Production Process for Stem Cell Based Therapeutic Implants: Expansion of the Production Cell Line and Cultivation of Encapsulated Cells

    NASA Astrophysics Data System (ADS)

    Weber, C.; Pohl, S.; Poertner, R.; Pino-Grace, Pablo; Freimark, D.; Wallrapp, C.; Geigle, P.; Czermak, P.

    Cell based therapy promises the treatment of many diseases like diabetes mellitus, Parkinson's disease or stroke. Microencapsulation of the cells protects them against host-versus-graft reactions and thus enables the use of allogenic cell lines for the manufacturing of cell therapeutic implants. The production process of such implants consists mainly of three steps: expansion of the cells, encapsulation of the cells, and cultivation of the encapsulated cells in order to increase their vitality and thus quality. This chapter deals with the development of fixed-bed bioreactor-based cultivation procedures used in the first and third steps of production. The bioreactor system for the expansion of the stem cell line (hMSC-TERT) is based on non-porous glass spheres, which support cell growth and harvesting with high yield and vitality. The cultivation process for the spherical cell based implants leads to an increase in vitality and additionally enables the application of a medium-based differentiation protocol.

  12. High-resolution onshore-offshore morpho-bathymetric records of modern chalk and granitic shore platforms in NW France

    NASA Astrophysics Data System (ADS)

    Duperret, Anne; Raimbault, Céline; Le Gall, Bernard; Authemayou, Christine; van Vliet-Lanoë, Brigitte; Regard, Vincent; Dromelet, Elsa; Vandycke, Sara

    2016-07-01

    Modern shore platforms developed on rocky coasts are key areas for understanding coastal erosion processes during the Holocene. This contribution offers a detailed picture of two contrasting shore-platform systems, based on new high-resolution shallow-water bathymetry coupled with aerial LiDAR topography. Merged land-sea digital elevation models were produced for two distinct types of rocky coasts along the eastern English Channel in France (Picardy and Upper Normandy: PUN) and in a NE Atlantic area (SW Brittany: SWB) in NW France. In the PUN case, submarine steps, identified as paleo-shorelines, parallel the present coastline. Coastal erosion processes appear to have been continuous and regular through time, since at least the mid-Holocene. In SWB, there is a discrepancy between the present coastline orientation and a continuous step extending from inland to offshore, identified as a paleo-shoreline. This illustrates polyphased and inherited shore-platform construction, mainly controlled by tectonic processes.

  13. Platinum and rhenium extraction from a spent refinery catalyst using Bacillus megaterium as a cyanogenic bacterium: statistical modeling and process optimization.

    PubMed

    Motaghed, M; Mousavi, S M; Rastegar, S O; Shojaosadati, S A

    2014-11-01

    The present study evaluated the potential of Bacillus megaterium as a cyanogenic bacterium to produce cyanide for solubilization of platinum and rhenium from a spent refinery catalyst. Response surface methodology was applied to study the effects of, and interaction between, two main parameters: initial glycine concentration and pulp density. Maximum Pt and Re recoveries of 15.7% and 98%, respectively, were obtained under the optimum conditions of 12.8 g/l initial glycine concentration and 4% (w/v) pulp density after 7 days. Increasing the free cyanide concentration to 3.6 mg/l, varying the pH from 6.7 to 9, and increasing the dissolved oxygen from 2 to 5 mg/l characterized the growth of B. megaterium during the bioleaching process. The modified shrinking core model was used to determine the rate-limiting step of the process. It was found that diffusion through the product layer is the rate-controlling step. Copyright © 2014 Elsevier Ltd. All rights reserved.
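For diffusion through the product layer as the controlling step, the shrinking-core model gives the conversion-time relation 1 − 3(1−X)^(2/3) + 2(1−X) = kt. A minimal sketch that inverts this relation numerically (the rate constant below is invented for illustration, not the paper's fitted value):

```python
def product_layer_g(x):
    """Shrinking-core model, diffusion through the product layer:
    g(X) = 1 - 3(1-X)^(2/3) + 2(1-X), with g(X) = k*t."""
    return 1.0 - 3.0 * (1.0 - x) ** (2.0 / 3.0) + 2.0 * (1.0 - x)

def conversion_at(t, k, tol=1e-10):
    """Invert g(X) = k*t for X in [0, 1] by bisection (g is increasing)."""
    target = k * t
    lo, hi = 0.0, 1.0
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if product_layer_g(mid) < target:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

# Full conversion (X = 1) corresponds to g = 1, i.e. t = 1/k.
k = 0.01  # illustrative rate constant, per day
x_half_time = conversion_at(50.0, k)   # g target 0.5
x_full_time = conversion_at(100.0, k)  # g target 1.0
```

A linear plot of g(X) against time for measured conversions is what identifies product-layer diffusion as the rate-controlling step.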

  14. Reinforced two-step-ahead weight adjustment technique for online training of recurrent neural networks.

    PubMed

    Chang, Li-Chiu; Chen, Pin-An; Chang, Fi-John

    2012-08-01

    A reliable forecast of future events possesses great value. The main purpose of this paper is to propose an innovative learning technique for reinforcing the accuracy of two-step-ahead (2SA) forecasts. The real-time recurrent learning (RTRL) algorithm for recurrent neural networks (RNNs) can effectively model the dynamics of complex processes and has been used successfully in one-step-ahead forecasts for various time series. A reinforced RTRL algorithm for 2SA forecasts using RNNs is proposed in this paper, and its performance is investigated on two well-known benchmark time series and on streamflow during flood events in Taiwan. Results demonstrate that the proposed reinforced 2SA RTRL algorithm for RNNs can adequately forecast the benchmark (theoretical) time series, significantly improve the accuracy of flood forecasts, and effectively reduce time-lag effects.
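A two-step-ahead forecast can in general be produced by iterating a one-step-ahead model, feeding the first prediction back as input. The sketch below demonstrates that idea with a simple AR(2) predictor rather than the paper's RTRL-trained RNN (coefficients and data are illustrative assumptions):

```python
def one_step_ar2(history, a1, a2):
    """A fitted one-step-ahead AR(2) predictor (coefficients assumed known)."""
    return a1 * history[-1] + a2 * history[-2]

def two_step_ahead(history, a1, a2):
    """2SA forecast by iterating the one-step model: the first
    prediction is appended to the history and the model re-applied."""
    y1 = one_step_ar2(history, a1, a2)
    return one_step_ar2(history + [y1], a1, a2)

# Noise-free AR(2) series: the iterated 2SA forecast is exact,
# which checks the recursion is wired correctly.
a1, a2 = 0.6, 0.3
series = [1.0, 1.0]
for _ in range(20):
    series.append(one_step_ar2(series, a1, a2))
forecast = two_step_ahead(series[:-2], a1, a2)
err = abs(forecast - series[-1])
```

On noisy data, iterating the one-step model compounds errors, which is precisely the weakness the reinforced 2SA training in the paper targets.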

  15. Developing a workbook to support the contextualisation of global health systems guidance: a case study identifying steps and critical factors for success in this process at WHO.

    PubMed

    Alvarez, Elizabeth; Lavis, John N; Brouwers, Melissa; Schwartz, Lisa

    2018-03-02

    Global guidance can help countries strengthen their health systems to deliver effective interventions to their populations. However, to have an impact, guidance needs to be contextualised or adapted to local settings; this process includes consideration of health system arrangements and political system factors. To date, methods to support contextualisation do not exist. In response, a workbook was designed to provide specific methods and strategies to enable the contextualisation of WHO's 'Optimizing health worker roles to improve maternal and newborn health' (OptimizeMNH) guidance at the national or subnational level. The objective of this study was to describe the process of developing the workbook and identify key steps of the development process, barriers that arose and facilitators that helped overcome some of these barriers. A qualitative single case study design was carried out. Interviews, documents and a reflexive journal were used. Constant comparison and an edit-style of organisation were used during data analysis to develop concepts, themes, subthemes and relationships among them. Thirteen interviews were conducted and 52 documents were reviewed. Three main steps were identified in the process of developing the workbook for health systems guidance contextualisation, namely (1) determining the need for and gaining approval to develop the workbook, (2) developing the workbook (taking on the task, creating the structure of the workbook, operationalising its components, undergoing approval processes and editing it), and (3) implementing the workbook both at the WHO level and at the national/subnational level. Five barriers and/or facilitators emerged relevant to each step, namely (1) having well-placed and credible champions, (2) creating and capitalising on opportunities, (3) finding the right language to engage various actors and obtain buy-in, (4) obtaining and maintaining meaningful buy-in, and (5) ensuring access to resources. 
Understanding the key steps and the critical factors involved in the process of developing the workbook could help in the planning of similar and other tools aimed to support the implementation of WHO guidance. A plan for dissemination and implementation needs to be addressed during the preparation of these tools.

  16. The function of advanced treatment process in a drinking water treatment plant with organic matter-polluted source water.

    PubMed

    Lin, Huirong; Zhang, Shuting; Zhang, Shenghua; Lin, Wenfang; Yu, Xin

    2017-04-01

    To understand the relationship between chemical and microbial treatment at each treatment step, as well as the relationship between the microbial community structure in biofilter biofilms and their ecological functions, a drinking water plant with severely organic matter-polluted source water was investigated. The bacterial community dynamics of two drinking water supply systems (traditional and advanced treatment processes) in this plant were studied from the source to the product water. Analysis by 454 pyrosequencing was conducted to characterize the bacterial diversity at each step of the treatment processes. The bacterial communities in these two treatment processes were highly diverse. Proteobacteria, mainly consisting of beta-proteobacteria, was the dominant phylum. The two treatment processes used in the plant could effectively remove organic pollutants and microbial pollution, especially the advanced treatment process. Significant differences in the detection of the major groups were observed in the product water samples of the treatment processes. The treatment processes, particularly the biological pretreatment and O3-biological activated carbon steps in the advanced treatment process, strongly influenced the microbial community composition and the water quality. Some opportunistic pathogens were found in the water. Nitrogen-related microorganisms found in the biofilm of the filters may play an important role in the microbial community composition and in water quality improvement.

  17. Rainfall Stochastic models

    NASA Astrophysics Data System (ADS)

    Campo, M. A.; Lopez, J. J.; Rebole, J. P.

    2012-04-01

    This work was carried out in the north of Spain. The San Sebastian meteorological station, for which precipitation records every ten minutes are available, was selected. The precipitation data cover October 1927 to September 1997. Pulse models describe the temporal process of rainfall as a succession of rain cells: a main storm process, whose origins are distributed in time according to a Poisson process, and a secondary process that generates a random number of rain cells within each storm. Among the different pulse models, the Bartlett-Lewis model was used. On the other hand, alternative renewal processes and Markov chains describe the way in which the process will evolve in the future depending only on the current state; they are therefore not dependent on past events. Two basic processes are considered when describing the occurrence of rain: the alternation of wet and dry periods, and the temporal distribution of rainfall within each rain event, which determines the rainwater collected in each of the intervals that make up the rain. This allows the introduction of alternative renewal processes and three-state Markov chains, where the interstorm time is given by either of the two dry states, short or long. Thus, the stochastic Markov chain model tries to reproduce the basis of pulse models, the succession of storms, each composed of a series of rain cells separated by a short interval of time, without the theoretical complexity of the latter. In a first step, we analyzed all variables involved in the sequential process of the rain: rain event duration, duration of non-rain periods, average rainfall intensity in rain events and, finally, the temporal distribution of rainfall within the rain event. Additionally, for the calibration of the Bartlett-Lewis pulse model, the main descriptive statistics were calculated for each month, taking the seasonality of rainfall into account. In a second step, both models were calibrated.
Finally, synthetic series were simulated with the calibration parameters; the series were recorded every ten minutes and also aggregated hourly. Preliminary results show adequate simulation of the main features of rain. The main variables are well simulated for the ten-minute time series, and also for the hourly precipitation time series, which are those that generate the higher rainfall used in hydrologic design. At finer scales, below one hour, the simulated rainfall durations are not adequate. One hypothesis is an excessive number of simulated events, which causes further fragmentation of storms, resulting in an excess of short rain events (less than 1 hour), and therefore also of short intervals between rain events, compared with those occurring in the actual series.
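    The three-state occurrence model described above can be sketched as a simple Markov chain simulation. This is a minimal illustration only: the states follow the abstract (rainy, short dry, long dry), but the transition probabilities are invented placeholders, not the calibrated values from the study.

```python
import random

# States from the abstract: one wet state, two dry states (short/long).
# Transition probabilities below are illustrative assumptions only.
STATES = ["rainy", "short_dry", "long_dry"]
P = {
    "rainy":     {"rainy": 0.5, "short_dry": 0.3, "long_dry": 0.2},
    "short_dry": {"rainy": 0.6, "short_dry": 0.3, "long_dry": 0.1},
    "long_dry":  {"rainy": 0.2, "short_dry": 0.2, "long_dry": 0.6},
}

def simulate(n_steps, start="long_dry", seed=42):
    """Generate a synthetic sequence of n_steps ten-minute intervals."""
    rng = random.Random(seed)
    state, sequence = start, []
    for _ in range(n_steps):
        sequence.append(state)
        r, cum = rng.random(), 0.0
        for nxt, p in P[state].items():
            cum += p
            if r < cum:
                state = nxt
                break
    return sequence

series = simulate(1000)
```

    Calibration would replace the placeholder matrix with transition frequencies estimated from the observed ten-minute record, month by month, to capture seasonality.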

  18. Occupational exposure in the fluorescent lamp recycling sector in France

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zimmermann, François, E-mail: francois.zimmermann@inrs.fr; Lecler, Marie-Thérèse; Clerc, Frédéric

    Highlights: • Chemical risks were assessed in five fluorescent lamp recycling facilities. • The main hazardous agents are mercury vapors and dust containing lead and yttrium. • Exposure and pollutant levels were correlated with steps and processes. • All stages and processes involve worrying levels of pollutants. • We suggest recommendations to reduce chemical risk. - Abstract: The fluorescent lamp recycling sector is growing considerably in Europe due to increasingly strict regulations aimed at encouraging the consumption of low-energy light bulbs and their end-of-life management. Chemical risks were assessed in fluorescent lamp recycling facilities by field measurement surveys in France, highlighting that occupational exposure and pollutant levels in the working environment were correlated with the main recycling steps and processes. The mean levels of worker exposure are 4.4 mg/m³, 15.4 μg/m³, 14.0 μg/m³ and 247.6 μg/m³, respectively, for total inhalable dust, mercury, lead and yttrium. The mean levels of airborne pollutants are 3.1 mg/m³, 9.0 μg/m³, 9.0 μg/m³ and 219.2 μg/m³, respectively, for total inhalable dust, mercury, lead and yttrium. The ranges are very wide. Surface samples from employees' skin and granulometric analyses were also carried out. The overview shows that all the stages and processes involved in lamp recycling carry a risk of hazardous substances penetrating the bodies of employees, although exposure varies depending on the processes and tasks they perform. The conclusion of this study strongly recommends the development of a new generation of processes in parallel with more information sharing and regulatory measures.

  19. [ASSESSMENT OF EXTREME FACTORS OF SHIFT WORK IN ARCTIC CONDITIONS BY WORKERS WITH DIFFERENT REGULATORY PROCESSES].

    PubMed

    Korneeva, Ya A; Simonova, N N

    2016-01-01

    A man working on a shift basis in the Arctic is daily under the influence of various extreme factors that are inevitable in the oil and gas industry. To adapt to shift work, employees draw on various individual resources. The purpose of this research is to determine the personal resources shift workers use to overcome the adverse environmental factors in the Arctic. The study involved 191 builders of main gas pipelines working in shifts in the Tyumen region (shift length of 52 days) aged 23 to 59 (mean age 34.9 ± 8.1) years. Methods: psychological testing, questioning, observation, descriptive statistics, and stepwise discriminant analysis. A correlation was revealed between the subjective assessment of the majority of adverse climatic factors and the regulatory process "assessment of results"; production factors correlated with regulatory processes such as flexibility, autonomy, modeling, and the general level of self-regulation; social factors are more associated with the severity of such regulatory processes as flexibility and evaluation of results.

  20. A History of the Chemical Innovations in Silver-Halide Materials for Color Photography III. Dye Transfer Process — Instant Color Photography

    NASA Astrophysics Data System (ADS)

    Oishi, Yasushi

    A historical review of the technological developments of the instant color photographic process is presented, with emphasis on the innovation processes at the following main turning points: 1) the creation of instant photography by E. H. Land in 1948 (one-step processing by transfer of image-forming materials); 2) the advent of instant color photography based on the dye developer, by Polaroid Corp., in 1963 (departing from dye-forming development, forming a direct positive preformed-dye image with a negative emulsion, but constraining the sensitive-material designs); 3) the introduction of a color instant product containing a redox dye releaser with an improved auto-positive emulsion, by Eastman Kodak Co., in 1976 (producing much improved color image quality, freed from the design constraints); and 4) the realization of absolute one-step photography by the integral film-unit system, by Polaroid in 1972. The patent litigation (1976-86) brought by Polaroid against Kodak for allegedly infringing the integral film-unit patents had a vast impact on the industry.

  1. Biomass-to-electricity: analysis and optimization of the complete pathway steam explosion--enzymatic hydrolysis--anaerobic digestion with ICE vs SOFC as biogas users.

    PubMed

    Santarelli, M; Barra, S; Sagnelli, F; Zitella, P

    2012-11-01

    The paper deals with the energy analysis and optimization of a complete biomass-to-electricity energy pathway, from raw biomass to the production of renewable electricity. The first step (biomass-to-biogas) is based on a real pilot plant located in Environment Park S.p.A. (Torino, Italy) with three main stages ((1) impregnation; (2) steam explosion; (3) enzymatic hydrolysis), completed by a two-step anaerobic fermentation. For the second step (biogas-to-electricity), the paper considers two technologies: internal combustion engines (ICE) and a stack of solid oxide fuel cells (SOFC). First, the complete pathway was modeled and validated against experimental data. Afterwards, the model was used for an analysis and optimization of the complete thermo-chemical and biological process, with the objective of maximizing the energy balance at minimum consumption. The comparison between ICE and SOFC shows the better performance of the integrated plants based on SOFC. Copyright © 2012 Elsevier Ltd. All rights reserved.

  2. Real-time color image processing for forensic fiber investigations

    NASA Astrophysics Data System (ADS)

    Paulsson, Nils

    1995-09-01

    This paper describes a system for automatic fiber debris detection based on color identification. The properties of the system are fast analysis and high selectivity, a necessity when analyzing forensic fiber samples: an ordinary investigation separates the material into well over 100,000 video images to analyze. The system is based on standard techniques, with a CCD camera, a motorized sample table, and an IBM-compatible PC/AT with add-on boards for video frame digitization and stepping motor control as the main parts. It is possible to operate the instrument at full video rate (25 images/s) with the aid of the HSI (hue-saturation-intensity) color system and software optimization. High selectivity is achieved by separating the analysis into several steps. The first step is fast direct color identification of objects in the analyzed video images; the second step analyzes the detected objects in a more complex and time-consuming stage of the investigation to identify single fiber fragments for subsequent analysis with more selective techniques.

  3. Thermodynamic analyses of hydrogen production from sub-quality natural gas. Part I: Pyrolysis and autothermal pyrolysis

    NASA Astrophysics Data System (ADS)

    Huang, Cunping; T-Raissi, Ali

    Sub-quality natural gas (SQNG) is defined as natural gas whose composition exceeds pipeline specifications for nitrogen, carbon dioxide (CO2) and/or hydrogen sulfide (H2S). Approximately one-third of the U.S. natural gas resource is sub-quality gas [1]. Due to the high cost of removing H2S from hydrocarbons using current processing technologies, SQNG wells are often capped and the gas remains in the ground. We propose and analyze a two-step hydrogen production scheme using SQNG as feedstock. The first step of the process involves hydrocarbon processing (via steam-methane reformation, autothermal steam-methane reformation, pyrolysis and autothermal pyrolysis) in the presence of H2S. Our analyses reveal that the H2S present in SQNG is stable and can be considered an inert gas. No sulfur dioxide (SO2) and/or sulfur trioxide (SO3) is formed from the introduction of oxygen to SQNG. In the second step, after the separation of hydrogen from the main stream, unreacted H2S is used to reform the remaining methane, generating more hydrogen and carbon disulfide (CS2). Thermodynamic analyses of SQNG feedstock containing up to 10% (v/v) H2S have shown that no H2S separation is required in this process. Part I of this paper includes only the thermodynamic analyses for SQNG pyrolysis and autothermal pyrolysis.

  4. Is the size of the useful field of view affected by postural demands associated with standing and stepping?

    PubMed

    Reed-Jones, James G; Reed-Jones, Rebecca J; Hollands, Mark A

    2014-04-30

    The useful field of view (UFOV) is the visual area from which information is obtained at a brief glance. While studies have examined the effects of increased cognitive load on the visual field, no one has specifically looked at the effects of postural control or locomotor activity on the UFOV. The current study aimed to examine the effects of postural demand and locomotor activity on UFOV performance in healthy young adults. Eleven participants were tested on three modified UFOV tasks (central processing, peripheral processing, and divided attention) while seated, standing, and stepping in place. Across all postural conditions, participants showed no difference in their central or peripheral processing. However, in the divided-attention task (reporting the letter in central vision and the target location in peripheral vision amongst distracter items), a main effect of posture condition on peripheral target accuracy was found for targets at 57° of eccentricity (p=.037). Mean accuracy decreased from 80.5% (standing) to 74% (seated) to 56.3% (stepping). These findings show that postural demands do affect UFOV divided-attention performance. In particular, the size of the useful field of view significantly decreases when stepping. This finding has important implications for how the results of a UFOV test are used to evaluate the general size of the UFOV during varying activities, as the traditional seated test procedure may overestimate the size of the UFOV during locomotor activities. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.

  5. Pretreatment methods for bioethanol production.

    PubMed

    Xu, Zhaoyang; Huang, Fang

    2014-09-01

    Lignocellulosic biomass, such as wood, grass, and agricultural and forest residues, is a potential resource for the production of bioethanol. The current biochemical process of converting biomass to bioethanol typically consists of three main steps: pretreatment, enzymatic hydrolysis, and fermentation. In this process, pretreatment is probably the most crucial step, since it has a large impact on the efficiency of the overall bioconversion. The aim of pretreatment is to disrupt the recalcitrant structures of cellulosic biomass to make cellulose more accessible to the enzymes that convert carbohydrate polymers into fermentable sugars. This paper reviews several leading acidic, neutral, and alkaline pretreatment technologies. Different pretreatment methods, including dilute acid pretreatment (DAP), steam explosion pretreatment (SEP), organosolv, liquid hot water (LHW), ammonia fiber expansion (AFEX), soaking in aqueous ammonia (SAA), sodium hydroxide/lime pretreatments, and ozonolysis, are introduced and discussed in depth. In this minireview, the focus is on the structural changes, primarily in cellulose, hemicellulose, and lignin, during the above leading pretreatment technologies.

  6. Superthermostability of nanoscale TiC-reinforced copper alloys manufactured by a two-step ball-milling process

    NASA Astrophysics Data System (ADS)

    Wang, Fenglin; Li, Yunping; Xu, Xiandong; Koizumi, Yuichiro; Yamanaka, Kenta; Bian, Huakang; Chiba, Akihiko

    2015-12-01

    A Cu-TiC alloy, with nanoscale TiC particles highly dispersed in the submicron-grained Cu matrix, was manufactured by a self-developed two-step ball-milling process applied to Cu, Ti and C powders. The thermostability of the composite was evaluated by high-temperature isothermal annealing treatments, with temperatures ranging from 727 to 1273 K. The nanoscale TiC particles, semicoherent with the Cu matrix and mainly located along the grain boundaries, were found to exhibit the promising trait of blocking grain boundary migration, which leads to a super-stabilized microstructure up to approximately the melting point of copper (1223 K). Furthermore, the Cu-TiC alloys annealed at 1323 K showed a slight decrease in Vickers hardness as well as a duplex microstructure due to selective grain growth, which are discussed in terms of hardness contributions from various mechanisms.

  7. Study of the influence of selected anisotropic parameter in the Barlat's model on the drawpiece shape

    NASA Astrophysics Data System (ADS)

    Kaldunski, Pawel; Kukielka, Leon; Patyk, Radoslaw; Kulakowska, Agnieszka; Bohdal, Lukasz; Chodor, Jaroslaw; Kukielka, Krzysztof

    2018-05-01

    In this paper, a numerical analysis and computer simulation of the deep drawing process is presented. An incremental model of the process in an updated Lagrangian formulation, with regard to geometrical and physical nonlinearity, was evaluated by variational and finite element methods. Frederic Barlat's model, which takes into consideration the anisotropy of materials in three main and six tangent directions, was used. The application developed in the Ansys/LS-Dyna program allows a complex step-by-step analysis and prediction of the shape, dimensions and stress and strain state of the drawpiece. The paper presents the influence of a selected anisotropic parameter in Barlat's model on the drawpiece shape, which includes height, sheet thickness and maximum drawing force. The important factors determining the proper formation of the drawpiece and the ways of determining them are described.

  8. Process Parameters Optimization in Single Point Incremental Forming

    NASA Astrophysics Data System (ADS)

    Gulati, Vishal; Aryal, Ashmin; Katyal, Puneet; Goswami, Amitesh

    2016-04-01

    This work aims to optimize the formability and surface roughness of parts formed by the single-point incremental forming process for an Aluminium-6063 alloy. The tests are based on Taguchi's L18 orthogonal array, selected on the basis of degrees of freedom (DOF). The tests were carried out on a vertical machining center (DMC70V) using CAD/CAM software (SolidWorks V5/MasterCAM). Two levels of tool radius and three levels of sheet thickness, step size, tool rotational speed, feed rate and lubrication were considered as the input process parameters. Wall angle and surface roughness were considered as the process responses. The process parameters influential for formability and surface roughness were identified with the help of statistical tools (response table, main effect plot and ANOVA). The parameter that has the utmost influence on both formability and surface roughness is lubrication. For formability, lubrication followed by tool rotational speed, feed rate, sheet thickness, step size and tool radius have influence in descending order, whereas for surface roughness, lubrication followed by feed rate, step size, tool radius, sheet thickness and tool rotational speed have influence in descending order. The predicted optimal values for the wall angle and surface roughness are 88.29° and 1.03225 µm. The confirmation experiments were conducted thrice, and the wall angle and surface roughness were found to be 85.76° and 1.15 µm, respectively.
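    The response-table analysis used above can be sketched as averaging the measured response over each level of one factor; the factor with the largest spread between level means has the greatest influence. The runs and values below are invented toy data, not the paper's measurements.

```python
# Minimal sketch of a Taguchi-style main-effect (response table) step:
# average the response for each level of a factor, then compare levels.
def main_effects(runs, factor):
    """Mean response per level of one factor."""
    totals = {}
    for run in runs:
        totals.setdefault(run[factor], []).append(run["response"])
    return {lvl: sum(v) / len(v) for lvl, v in totals.items()}

# Toy experiment: surface roughness (response) under two factors.
runs = [
    {"lubrication": "yes", "feed": 500,  "response": 1.1},
    {"lubrication": "yes", "feed": 1000, "response": 1.3},
    {"lubrication": "no",  "feed": 500,  "response": 1.9},
    {"lubrication": "no",  "feed": 1000, "response": 2.1},
]
effects = main_effects(runs, "lubrication")
# The larger the gap between level means, the more influential the factor.
```

    In a full Taguchi analysis this is done for every factor in the L18 array, and ANOVA then tests whether each spread is statistically significant.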

  9. JSC Pharmacy Services for Remote Operations

    NASA Technical Reports Server (NTRS)

    Stoner, Paul S.; Bayuse, Tina

    2005-01-01

    The Johnson Space Center (JSC) Pharmacy began operating in March 2003. The pharmacy serves in two main capacities: directly providing medications and services in support of the medical clinics at the Johnson Space Center, physician travel kits for NASA flight surgeon staff, and remote operations such as the clinics in Devon Island, Star City and Moscow; and indirectly providing medications and services for the International Space Station and Space Shuttle medical kits. Process changes that occurred, and continued to evolve, with the installation of the new JSC Pharmacy, and the process of stocking medications for each of the aforementioned areas, are discussed. Methods: The incorporation of pharmacy involvement to provide services for remote operations and to supply medical kits was evaluated. The first step was to review the current processes and integrate the JSC Pharmacy into the existing system. The second step was to provide medications to these areas; considerations for the timeline of expiring medications for shipment are reviewed with each request. The third step was the development of a process to provide accountability for the medications. Results: The JSC Pharmacy utilizes a pharmacy management system to document all medications leaving the pharmacy. Challenges inherent to providing medications to remote areas were encountered. A process has been designed to incorporate usage into the electronic medical record upon return of the information from these remote areas. This is an evolving program, and several areas have been identified for further improvement.

  10. Development of DKB ETL module in case of data conversion

    NASA Astrophysics Data System (ADS)

    Kaida, A. Y.; Golosova, M. V.; Grigorieva, M. A.; Gubin, M. Y.

    2018-05-01

    Modern scientific experiments involve producing huge volumes of data, which requires new approaches to data processing and storage. These data, as well as their processing and storage, are accompanied by a valuable amount of additional information, called metadata, distributed over multiple information systems and repositories and having a complicated, heterogeneous structure. Gathering these metadata for experiments in the field of high energy nuclear physics (HENP) is a complex issue, requiring unconventional solutions. One of the tasks is to integrate metadata from different repositories into a central storage. During the integration process, metadata taken from the original source repositories go through several processing steps: metadata aggregation, transformation according to the current data model, and loading into the general storage in a standardized form. The Data Knowledge Base (DKB), an R&D project of the ATLAS experiment at the LHC, aims to provide fast and easy access to significant information about LHC experiments for the scientific community. The data integration subsystem being developed for the DKB project can be represented as a number of individual pipelines arranging the data flow from data sources to the main DKB storage. The data transformation process, represented by a single pipeline, can be considered as a number of successive data transformation steps, where each step is implemented as an individual program module. This article outlines the specifics of the program modules used in the dataflow and describes one of the modules developed and integrated into the data integration subsystem of the DKB.
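    The aggregate-transform-load pipeline of independent modules described above can be sketched as a chain of functions, each implementing one step. The module names, record fields, and normalization rule here are illustrative assumptions, not the actual DKB interfaces.

```python
# Minimal sketch of an ETL pipeline built from independent step modules.
def aggregate(sources):
    # Gather metadata records from several (hypothetical) repositories.
    return [record for src in sources for record in src]

def transform(records):
    # Normalize to a common data model (here: lowercase field names).
    return [{k.lower(): v for k, v in r.items()} for r in records]

def load(records, storage):
    # Load standardized records into the general storage.
    storage.extend(records)
    return storage

def run_pipeline(sources, storage, steps=(aggregate, transform)):
    data = sources
    for step in steps:   # each step is an individual program module
        data = step(data)
    return load(data, storage)

storage = []
sources = [[{"Name": "dataset1"}], [{"Name": "dataset2"}]]
run_pipeline(sources, storage)
```

    The point of the design is that each step module has the same interface, so pipelines can be recomposed for different data sources without changing the modules themselves.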

  11. Effects of processing steps on the phenolic content and antioxidant activity of beer.

    PubMed

    Leitao, Céline; Marchioni, Eric; Bergaentzlé, Martine; Zhao, Minjie; Didierjean, Luc; Taidi, Behnam; Ennahar, Saïd

    2011-02-23

    A new analytical method (liquid chromatography-antioxidant, LC-AOx) was used, intended to separate beer polyphenols and to determine the potential antioxidant activity of these constituents after online reaction with a buffered solution of the radical cation 2,2'-azinobis(3-ethylbenzothiazoline-6-sulfonic acid) (ABTS(•+)). Using the LC-AOx method, it was possible to demonstrate that the extent of the antioxidant activity depended strongly on the phenolic compound considered. The method was also applied to the analysis of beer extracts and allowed the evaluation of their antioxidant activity at different steps of beer processing: brewing, boiling, and fermentation. This study showed that the total antioxidant activity remained unchanged throughout beer processing, as opposed to the polyphenolic content, which showed a 3-fold increase. The hopping and fermentation steps were the main causes of this increase. However, the increase measured after fermentation was attributed to a better extraction of polyphenols due to the presence of ethanol, rather than to a real increase in their content. Moreover, this method allowed the detection of three unknown antioxidant compounds, which accounted for 64 ± 4% of the total antioxidant activity of beer and were individually more efficient than caffeic acid and epicatechin.

  12. [FMEA applied to the radiotherapy patient care process].

    PubMed

    Meyrieux, C; Garcia, R; Pourel, N; Mège, A; Bodez, V

    2012-10-01

    Failure modes and effects analysis (FMEA) is a risk analysis method used at the Radiotherapy Department of the Institute Sainte-Catherine as part of a strategy to continuously improve the quality and safety of treatments. The method comprises several steps: definition of the main processes; for each of them, description of every step of prescription, treatment preparation and treatment application; identification of the possible risks, their consequences and their origins; search for existing safety elements that may avert these risks; and grading of the risks to assign a criticality score, resulting in a numerical ranking of the risks. Finally, the impact of the proposed corrective actions was estimated by a new grading round. For each process studied, a detailed map of the risks was obtained, facilitating the identification of priority actions to be undertaken. For example, five steps in patient treatment planning had an unacceptable level of risk, 62 a moderate level of risk and 31 an acceptable level of risk. The FMEA method, used in the industrial domain and applied here to health care, is an effective tool for risk management in patient care. However, the time and training required to implement this method should not be underestimated. Copyright © 2012 Société française de radiothérapie oncologique (SFRO). Published by Elsevier SAS. All rights reserved.
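    The grading step described above can be sketched with the common FMEA convention of a criticality (risk priority) score computed as severity × occurrence × detectability, then classified against thresholds. The three-factor formula, the thresholds, and the example failure modes are standard-practice assumptions for illustration, not values taken from the paper.

```python
# Minimal FMEA grading sketch (common convention, not the paper's scale).
def criticality(severity, occurrence, detectability):
    """Each factor graded on a 1-10 scale; higher is worse."""
    return severity * occurrence * detectability

def classify(score, moderate=50, unacceptable=150):
    # Illustrative thresholds for the three risk levels.
    if score >= unacceptable:
        return "unacceptable"
    if score >= moderate:
        return "moderate"
    return "acceptable"

# Hypothetical failure modes in a radiotherapy workflow.
failure_modes = {
    "wrong patient file loaded": criticality(9, 2, 3),
    "dose prescription typo":    criticality(8, 3, 7),
}
levels = {name: classify(score) for name, score in failure_modes.items()}
```

    Corrective actions lower occurrence or improve detectability; re-grading after the action then shows whether the step has dropped below the unacceptable threshold.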

  13. Atomistics of vapour–liquid–solid nanowire growth

    PubMed Central

    Wang, Hailong; Zepeda-Ruiz, Luis A.; Gilmer, George H.; Upmanyu, Moneesh

    2013-01-01

    The vapour–liquid–solid route and its variants are routinely used for scalable synthesis of semiconducting nanowires, yet the fundamental growth processes remain unknown. Here we employ atomic-scale computations based on model potentials to study the stability and growth of gold-catalysed silicon nanowires. Equilibrium studies uncover segregation at the solid-like surface of the catalyst particle, a liquid AuSi droplet, and a silicon-rich droplet–nanowire interface enveloped by heterogeneous truncating facets. Supersaturation of the droplets leads to rapid one-dimensional growth on the truncating facets and much slower nucleation-controlled two-dimensional growth on the main facet. Surface diffusion is suppressed and the excess Si flux occurs through the droplet bulk which, together with the Si-rich interface and contact line, lowers the nucleation barrier on the main facet. The ensuing step flow is modified by Au diffusion away from the step edges. Our study highlights key interfacial characteristics for morphological and compositional control of semiconducting nanowire arrays. PMID:23752586

  14. The semantic distance task: Quantifying semantic distance with semantic network path length.

    PubMed

    Kenett, Yoed N; Levi, Effi; Anaki, David; Faust, Miriam

    2017-09-01

    Semantic distance is a determining factor in cognitive processes, such as semantic priming, operating upon semantic memory. The main computational approach to computing semantic distance is latent semantic analysis (LSA). However, objections have been raised against this approach, mainly its failure at predicting semantic priming. We propose a novel approach to computing semantic distance, based on network science methodology. Path length in a semantic network represents the number of steps needed to traverse from one word in the network to another. We examine whether path length can be used as a measure of semantic distance by investigating how path length affects performance in a semantic relatedness judgment task and recall from memory. Our results show a differential effect on performance: up to 4 steps separating word-pairs, participants exhibit an increase in reaction time (RT) and a decrease in the percentage of word-pairs judged as related; from 4 steps onward, participants exhibit a significant decrease in RT and the word-pairs are predominantly judged as unrelated. Furthermore, we show that as the path length between word-pairs increases, success in free- and cued-recall decreases. Finally, we demonstrate how our measure outperforms computational methods of measuring semantic distance (LSA and positive pointwise mutual information) in predicting participants' RT and subjective judgments of semantic strength. Thus, we provide a computational alternative for computing semantic distance. Furthermore, this approach addresses key issues in cognitive theory, namely the breadth of the spreading activation process and the effect of semantic distance on memory retrieval. (PsycINFO Database Record (c) 2017 APA, all rights reserved).
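    The path-length measure described above is the shortest-path distance in an undirected word-association network, computable by breadth-first search. The toy network below is invented for illustration; the study's actual network is built from free-association norms.

```python
from collections import deque

# Toy undirected semantic network (invented word associations).
network = {
    "cat":    {"dog", "mouse"},
    "dog":    {"cat", "bone"},
    "mouse":  {"cat", "cheese"},
    "bone":   {"dog"},
    "cheese": {"mouse", "milk"},
    "milk":   {"cheese"},
}

def path_length(graph, start, goal):
    """Breadth-first search: number of steps from start to goal."""
    if start == goal:
        return 0
    seen, queue = {start}, deque([(start, 0)])
    while queue:
        node, dist = queue.popleft()
        for nbr in graph[node]:
            if nbr == goal:
                return dist + 1
            if nbr not in seen:
                seen.add(nbr)
                queue.append((nbr, dist + 1))
    return None  # no connecting path
```

    Under this measure, "cat" and "milk" are 3 steps apart, a larger semantic distance than the directly linked "cat" and "dog".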

  15. An optimized one-step wet etching process of Pb(Zr0.52Ti0.48)O3 thin films for microelectromechanical system applications

    NASA Astrophysics Data System (ADS)

    Che, L.; Halvorsen, E.; Chen, X.

    2011-10-01

    The existence of insoluble residues as intermediate products produced during the wet etching process is the main quality-reducing and structure-patterning issue for lead zirconate titanate (PZT) thin films. A one-step wet etching process using solutions of buffered HF (BHF) and HNO3 acid was developed for patterning PZT thin films for microelectromechanical system (MEMS) applications. PZT thin films with 1 µm thickness were prepared on a Pt/Ti/SiO2/Si substrate by the sol-gel process for compatibility with Si micromachining. Various compositions of the etchant were investigated and the patterns were examined to optimize the etching process. The optimal result is demonstrated by a high etch rate (3.3 µm min-1) and low undercutting (1.1:1). The patterned PZT thin film exhibits a remnant polarization of 24 µC cm-2, a coercive field of 53 kV cm-1, a leakage current density of 4.7 × 10-8 A cm-2 at 320 kV cm-1 and a dielectric constant of 1100 at 1 kHz.

  16. Optimization of process parameters during carbonization for improved carbon fibre strength

    NASA Astrophysics Data System (ADS)

    Köhler, T.; Pursche, F.; Burscheidt, P.; Seide, G.; Gries, T.

    2017-10-01

    Owing to their extraordinary properties, carbon fibres nowadays play a significant role in modern industries. In recent years, carbon fibres have been increasingly used for lightweight constructions in the energy and transportation industries. However, greater market penetration of carbon fibres is still hindered by high prices (~ 22 /kg) [3]. One crucial step in carbon fibre production is the carbonization of the stabilized fibres. However, the cause-effect relationships of carbonization are not yet fully understood. Therefore, the main goal of this research work is the quantification of the cause-effect relationships of process parameters such as temperature and residence time on carbon fibre strength.

  17. An eye movement pre-training fosters the comprehension of processes and functions in technical systems.

    PubMed

    Skuballa, Irene T; Fortunski, Caroline; Renkl, Alexander

    2015-01-01

    The main research goal of the present study was to investigate to what extent pre-training eye movements can facilitate knowledge acquisition in multimedia learning (pre-training principle). We combined considerations from research on eye movement modeling and pre-training to design and test a non-verbal, eye-movement-based pre-training. Participants in the experimental condition watched an animated circle moving in close spatial resemblance to a static visualization of a solar plant accompanied by a narration in a subsequently presented learning environment. This training was expected to foster top-down processes, as reflected in gaze behavior during the learning process, and to enhance knowledge acquisition. We compared two groups (N = 45): participants in the experimental condition received the pre-training in a first step and processed the learning material in a second step, whereas the control group underwent the second step without any pre-training. The pre-training group outperformed the control group in learning outcomes, particularly in knowledge about processes and functions of the solar plant. However, the superior learning outcomes of the pre-training group could not be explained by eye-movement patterns. Furthermore, the pre-training moderated the relationship between experienced stress and learning outcomes: in the control group, high stress levels hindered learning, which was not found for the pre-training group. On a delayed posttest, participants were requested to draw a picture of the learning content. Despite a non-significant effect of training on the quality of the drawings, the pre-training group showed associations between learning outcomes at the first testing time and process-related aspects of the quality of their drawings. Overall, non-verbal pre-training is a successful instructional intervention to promote learning processes in novices, although these processes were not directly reflected in learners' eye movement behavior during learning.

  18. CDC Kerala 1: Organization of clinical child development services (1987-2013).

    PubMed

    Nair, M K C; George, Babu; Nair, G S Harikumaran; Bhaskaran, Deepa; Leena, M L; Russell, Paul Swamidhas Sudhakar

    2014-12-01

The main objective of establishing the Child Development Centre (CDC), Kerala for piloting a comprehensive child and adolescent development program in India has been to understand the conceptualization, design and scaling up of a pro-active, positive child development initiative that is easily replicable all over India. The process of establishing CDC Kerala for research, clinical services, training and community extension services over the last 25 y has been as follows: Step 1: Conceptualization - the life cycle approach to child development; Step 2: Research basis - CDC-model early stimulation is effective; Step 3: Development and validation of seven simple developmental screening tools; Step 4: CDC diagnostic services - ultrasonology and genetic and metabolic laboratory; Step 5: Development of seven intervention packages; Step 6: Training - post graduate diploma in clinical child development; Step 7: CDC clinic services - seven major ones; Step 8: CDC community services - child development referral units; Step 9: Community service delivery models - childhood disability and adolescent care counselling projects; Step 10: National capacity building - four child development related courses. CDC Kerala follow-up and clinic services are offered till 18 y of age and premarital counselling till 24 y of age, as shown in the "CDC Kerala Clinic Services Flow Chart", and 74,291 children have availed themselves of CDC clinic services in the last 10 y. CDC Kerala is the first model for comprehensive child and adolescent development services using a life-cycle approach in the Government sector, and it has hence been declared the collaborative centre for Rashtriya Bal Swasthya Karyakram (RBSK) in Kerala.

  19. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rocchetti, Laura; Amato, Alessia; Fonti, Viviana

    Highlights: • End-of-life LCD panels represent a source of indium. • Several experimental conditions for indium leaching have been assessed. • Indium is completely extracted with 2 M sulfuric acid at 80 °C for 10 min. • Cross-current leaching improves indium extraction and lowers operating costs. • Environmental benefits come from reduced CO2 emissions and reagent use. - Abstract: Indium is a critical element mainly produced as a by-product of zinc mining, and it is largely used in the production process of liquid crystal display (LCD) panels. End-of-life LCDs represent a possible source of indium in the field of urban mining. In the present paper, we apply, for the first time, cross-current leaching to mobilize indium from end-of-life LCD panels. We carried out a series of treatments to leach indium. The best leaching conditions were 2 M sulfuric acid at 80 °C for 10 min, which allowed us to mobilize indium completely. Taking into account the low indium content of end-of-life LCDs, about 100 ppm, a single leaching step is not cost-effective. We tested 6 steps of cross-current leaching: in the first step indium leaching was complete, in the second step it was in the range of 85-90%, and by the sixth step it was about 50-55%. Indium concentration in the leachate was about 35 mg/L after the first leaching step, almost 2-fold higher at the second step and about 3-fold higher at the fifth step. We then hypothesized scaling the cross-current leaching process up to 10 steps, followed by cementation with zinc to recover indium. In this simulation, the indium recovery process was advantageous from an economic and environmental point of view. Indeed, cross-current leaching allowed us to concentrate indium, save reagents, and reduce CO2 emissions (with 10 steps we assessed that the emission of about 90 kg CO2-Eq. could be avoided) thanks to the recovery of indium. This new strategy represents a useful approach for secondary production of indium from waste LCD panels.
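    The cross-current mass balance described in this abstract can be sketched numerically. The per-step efficiencies below are assumptions loosely matching the reported ranges (complete extraction at step 1, 85-90% at step 2, about 50-55% at step 6); they are illustrative, not the paper's data.

    ```python
    # Illustrative sketch of cross-current leaching: the same leachate contacts a
    # fresh batch of LCD powder at each step, so indium accumulates in solution
    # while per-step extraction efficiency decays as the acid is consumed.
    # All numbers are assumptions for illustration only.

    def cross_current(steps, c_first=35.0, efficiencies=None):
        """Return indium concentration in the leachate (mg/L) after each step.

        c_first: indium added by a fully efficient step (mg/L), taken from the
        reported ~35 mg/L after step 1. Efficiencies are assumed to decay
        linearly from 1.0 toward 0.5 over six steps.
        """
        if efficiencies is None:
            efficiencies = [max(0.5, 1.0 - 0.1 * n) for n in range(steps)]
        conc, history = 0.0, []
        for eff in efficiencies[:steps]:
            conc += c_first * eff          # fresh batch adds eff * c_first mg/L
            history.append(round(conc, 1))
        return history

    print(cross_current(6))
    ```

    Under these assumed efficiencies the leachate concentration roughly doubles by step 2, in line with the trend the abstract reports.
    
    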

  20. Detail of concrete pillars and steps leading to main entry ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    Detail of concrete pillars and steps leading to main entry at southeast elevation; camera facing west. - Mare Island Naval Shipyard, Hospital Ward, Johnson Lane, west side at intersection of Johnson Lane & Cossey Street, Vallejo, Solano County, CA

  1. The research on construction and application of machining process knowledge base

    NASA Astrophysics Data System (ADS)

    Zhao, Tan; Qiao, Lihong; Qie, Yifan; Guo, Kai

    2018-03-01

    In order to realize the application of knowledge in machining process design, and from the perspective of knowledge use in computer-aided process planning (CAPP), a hierarchical knowledge classification structure is established according to the characteristics of the mechanical engineering field. Machining process knowledge is expressed in structured form by means of production rules and object-oriented methods. Three kinds of knowledge base models are constructed according to this representation of machining process knowledge. In this paper, the definition and classification of machining process knowledge, the knowledge model, and the application flow of knowledge-based process design are given, and the main steps of machine tool selection are carried out as an example application of the knowledge base.
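    The production-rule style of knowledge representation mentioned in this abstract can be sketched as a tiny forward-chaining lookup. The features, thresholds and recommended processes below are hypothetical illustrations, not taken from the paper's knowledge base.

    ```python
    # Minimal sketch of IF-THEN production rules over an object-oriented feature
    # description, in the spirit of the CAPP knowledge base described above.
    # Rules and thresholds are invented for illustration.

    class Feature:
        def __init__(self, kind, diameter_mm, tolerance_mm):
            self.kind = kind
            self.diameter_mm = diameter_mm
            self.tolerance_mm = tolerance_mm

    RULES = [
        # (condition, recommended process); first matching rule wins
        (lambda f: f.kind == "hole" and f.tolerance_mm <= 0.02, "drill + ream"),
        (lambda f: f.kind == "hole", "drill"),
        (lambda f: f.kind == "flat" and f.tolerance_mm <= 0.01, "mill + grind"),
        (lambda f: f.kind == "flat", "mill"),
    ]

    def select_process(feature):
        """Forward-chaining lookup: return the process of the first rule that fires."""
        for condition, process in RULES:
            if condition(feature):
                return process
        return "manual review"

    print(select_process(Feature("hole", 10.0, 0.01)))  # → drill + ream
    ```
    
    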

  2. Microbial population dynamics during startup of a full-scale anaerobic digester treating industrial food waste in Kyoto eco-energy project.

    PubMed

    Ike, Michihiko; Inoue, Daisuke; Miyano, Tomoki; Liu, Tong Tong; Sei, Kazunari; Soda, Satoshi; Kadoshin, Shiro

    2010-06-01

    The microbial community in a full-scale anaerobic digester (2300 m3) treating industrial food waste in the Kyoto Eco-Energy Project was analyzed using terminal restriction fragment length polymorphism of eubacterial and archaeal 16S rRNA genes. Both thermophilic and mesophilic sludge from treated swine waste were seeded into the digestion tank. During the 150-day startup period, coffee grounds as the main food waste, along with potato, kelp, boiled beans, tofu, bean curd lees, and deep-fried bean curd, were fed to the digestion process step by step (max. 40 t/d). Finally, the methane yield reached 360 m3/t-feed with 40 days' retention time, although temporary accumulation of propionate was observed. The eubacterial communities that formed in the thermophilic digestion tank differed greatly from both the thermophilic and mesophilic seed sludges. Results suggest that Actinomyces/Thermomonospora and Ralstonia/Shewanella contributed to the hydrolysis and degradation of food waste into volatile fatty acids. The acetate-utilizing methanogens Methanosaeta were dominant in both seed sludges, but decreased drastically during processing in the digestion tank. Methanosarcina and Methanobrevibacter/Methanobacterium were, respectively, the likely main contributors to methane production from acetate and from H2 plus CO2. Copyright 2010 Elsevier Ltd. All rights reserved.

  3. Volatile changes in cv. Verdeal Transmontana olive oil: From the drupe to the table, including storage.

    PubMed

    Malheiro, Ricardo; Casal, Susana; Rodrigues, Nuno; Renard, Catherine M G C; Pereira, José Alberto

    2018-04-01

    This study focused on the volatile changes in cv. Verdeal Transmontana throughout the entire olive oil processing chain, from the drupe to olive oil storage for up to 12 months, while correlating them with quality parameters and sensory quality. During crushing and malaxation, the volatiles formed were mainly "green-leaf volatiles" (GLVs), namely (E)-2-hexenal, hexanal, and 1-hexanol. The centrifugation and clarification steps increased the total volatile amounts to 130 mg/kg. However, clarification also increased nonanal and (E)-2-decenal contents, two markers of oxidation, with a noticeable loss of phenolic compounds and oxidative stability. During storage, the total volatile amounts decreased drastically (by 94% at 12 months after extraction), together with the positive sensory attributes fruity, green, bitter, and pungent. Although still classified as extra-virgin after one year of storage, the oil showed significantly higher peroxide and conjugated diene values, together with reductions in antioxidant capacity, phenolic compounds (about 50% less) and oxidative stability (57%). The present work shows that the extraction process modulates the volatile composition of olive oil, with a concentration of volatiles at the clarification step. During storage, volatiles are lost, mainly from eight months after extraction onwards, leading to the loss of important sensory attributes. Copyright © 2018 Elsevier Ltd. All rights reserved.

  4. Dynamic Analysis of Large In-Space Deployable Membrane Antennas

    NASA Technical Reports Server (NTRS)

    Fang, Houfei; Yang, Bingen; Ding, Hongli; Hah, John; Quijano, Ubaldo; Huang, John

    2006-01-01

    This paper presents a vibration analysis of an eight-meter diameter membrane reflectarray antenna, which is composed of a thin membrane and a deployable frame. This analysis process has two main steps. In the first step, a two-variable-parameter (2-VP) membrane model is developed to determine the in-plane stress distribution of the membrane due to pre-tensioning, which eventually yields the differential stiffness of the membrane. In the second step, the obtained differential stiffness is incorporated in a dynamic equation governing the transverse vibration of the membrane-frame assembly. This dynamic equation is then solved by a semi-analytical method, called the Distributed Transfer Function Method (DTFM), which produces the natural frequencies and mode shapes of the antenna. The combination of the 2-VP model and the DTFM provides an accurate prediction of the in-plane stress distribution and modes of vibration for the antenna.
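    As a rough order-of-magnitude companion to the analysis described above, the fundamental frequency of an ideal, uniformly pre-tensioned circular membrane follows from the first zero of the Bessel function J0. This ignores the deployable frame and the non-uniform stress field the 2-VP model captures, and the tension and areal density below are hypothetical placeholders, not the antenna's properties.

    ```python
    import math

    # f_01 = (alpha_01 / (2*pi*a)) * sqrt(T / rho), with alpha_01 = 2.405 the
    # first zero of J0. T and rho are assumed values for a thin polymer film;
    # the real antenna's modes also involve the frame and non-uniform stress.

    a = 4.0        # membrane radius in m (8 m diameter reflectarray)
    T = 100.0      # assumed uniform membrane tension, N/m
    rho = 0.05     # assumed areal density, kg/m^2

    f_01 = 2.405 / (2 * math.pi * a) * math.sqrt(T / rho)
    print(round(f_01, 2))  # → 4.28 (Hz, for these assumed values)
    ```

    The point of the sketch is only the scale: metre-class tensioned membranes have fundamental frequencies of a few hertz, which is why the differential (stress) stiffness computed in the first step dominates the dynamics.
    
    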

  5. Automatic alignment method for calibration of hydrometers

    NASA Astrophysics Data System (ADS)

    Lee, Y. J.; Chang, K. H.; Chon, J. C.; Oh, C. Y.

    2004-04-01

    This paper presents a new method to automatically align specific scale-marks for the calibration of hydrometers. A hydrometer calibration system adopting the new method consists of a vision system, a stepping motor, and software to control the system. The vision system is composed of a CCD camera and a frame grabber and is used to acquire images. The stepping motor moves the camera, which is attached to the vessel containing the reference liquid, along the hydrometer. The operating program has two main functions: processing images from the camera to find the position of the horizontal plane, and controlling the stepping motor to align the horizontal plane with a particular scale-mark. A system adopting this automatic alignment method provides a convenient and precise means of calibrating hydrometers. The performance of the proposed method is illustrated by comparing calibration results obtained using the automatic alignment method with those obtained using the manual method.
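    A minimal sketch of how the horizontal-plane detection and motor command might work, assuming the liquid surface appears as the sharpest brightness transition between image rows. The function names and the rows-per-step ratio are hypothetical, not from the paper.

    ```python
    import numpy as np

    # Assumed approach: locate the liquid surface as the image row with the
    # strongest vertical intensity gradient, then command the stepper to bring
    # that row onto the row of the target scale-mark.

    def surface_row(gray):
        """gray: 2D array (rows x cols); return the row index of the surface."""
        profile = gray.mean(axis=1)                       # mean brightness per row
        return int(np.argmax(np.abs(np.diff(profile))))   # sharpest jump

    def steps_to_align(surface, target, rows_per_step=2.5):
        """Signed motor steps to move the surface row onto the target row
        (rows_per_step is an assumed camera/stepper calibration constant)."""
        return round((target - surface) / rows_per_step)

    # synthetic frame: dark air above row 60, bright liquid below
    img = np.vstack([np.full((60, 80), 40.0), np.full((40, 80), 200.0)])
    print(surface_row(img))  # → 59
    ```
    
    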

  6. Peer Interventions to Promote Health: Conceptual Considerations

    PubMed Central

    Simoni, Jane M.; Franks, Julie C.; Lehavot, Keren; Yard, Samantha S.

    2013-01-01

    Peers have intervened to promote health since ancient times, yet few attempts have been made to describe their role and their interventions theoretically. After a brief overview of the history and variety of peer-based health interventions, a 4-part definition of peer interveners is presented, with a consideration of the dimensions of their involvement in health promotion. A 2-step process is then proposed as a means of conceptualizing peer interventions to promote health. Step 1 involves establishing a theoretical framework for the intervention’s main focus (i.e., education, social support, social norms, self-efficacy, or patient advocacy), and Step 2 involves identifying a theory that justifies the use of peers and might explain their impact. Candidate theories include perspectives from the mutual support group and self-help literature, social cognitive and social learning theories, the social support literature, social comparison theory, social network approaches, and empowerment models. PMID:21729015

  7. False colors removal on the YCr-Cb color space

    NASA Astrophysics Data System (ADS)

    Tomaselli, Valeria; Guarnera, Mirko; Messina, Giuseppe

    2009-01-01

    Post-processing algorithms are usually placed in the pipeline of imaging devices to remove residual color artifacts introduced by the demosaicing step. Although demosaicing solutions aim to eliminate, limit or correct false colors and other impairments caused by non-ideal sampling, post-processing techniques are usually more powerful in achieving this purpose, mainly because the input of a post-processing algorithm is a fully restored RGB color image. Moreover, post-processing can be applied more than once in order to meet quality criteria. In this paper we propose an effective technique for reducing the color artifacts generated by conventional color interpolation algorithms, operating in the YCrCb color space. The solution efficiently removes false colors and can be executed while performing the edge emphasis process.
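    A common post-processing idea in this spirit (not necessarily the authors' exact algorithm) is to filter only the chroma planes while leaving luma untouched, which suppresses isolated false-color speckle without softening edges. A pure-NumPy sketch:

    ```python
    import numpy as np

    # Hedged sketch: median-filter the Cr and Cb planes only. Y carries the
    # detail, so chroma smoothing removes demosaicing color speckle while
    # preserving apparent sharpness. Not the paper's specific algorithm.

    def median3(plane):
        """3x3 median filter with edge replication, pure NumPy."""
        p = np.pad(plane, 1, mode='edge')
        h, w = plane.shape
        windows = [p[i:i + h, j:j + w] for i in range(3) for j in range(3)]
        return np.median(np.stack(windows), axis=0)

    def remove_false_colors(y, cr, cb):
        """Return (Y, filtered Cr, filtered Cb); luma passes through unchanged."""
        return y, median3(cr), median3(cb)

    cr = np.zeros((5, 5))
    cr[2, 2] = 100.0                  # isolated false-color speckle
    print(median3(cr)[2, 2])          # → 0.0  (speckle suppressed)
    ```
    
    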

  8. A Contribution For The Understanding of The Deformation Pattern Across The Terceira Axis

    NASA Astrophysics Data System (ADS)

    Navarro, A.; Catalão, J.; Miranda, J. M.

    In spite of several geodynamic studies performed in the Azores region, little is known about the deformation pattern of the tectonically more active sector around the Terceira Axis. GPS campaigns performed in the area in the last few years were mainly concerned with the study of the relative motions between the Eurasian, African and North-American plates. This study, developed in the scope of the STAMINA project, has as its main purpose the establishment of a dense GPS network to study the crustal deformation pattern in the area between the North Hirondelle basin and the East Graciosa basin. The GPS network consists of 20 stations uniformly distributed throughout the island. The first GPS survey was carried out during days 90 to 98 of 2001. The TERC and TCAT stations were used as reference stations, recording continuously throughout the survey. All the other stations were occupied for at least three sessions; except for cases of receiver malfunction, each session had a duration of 12 to 24 hours. The GPS data processing approach consisted of three main steps: (1) first, all sessions were processed separately using GAMIT in order to obtain a daily solution for two local sites (TERC and TCAT) and six global tracking stations (CCV3, RABT, SAV1, SFER, STJO and WSRT), using precise orbits from the IGS; (2) then, all stations of the local network were processed together; and (3) finally, all stations, including the global tracking ones, were reprocessed. In each step a compensation program was used to compute a least-squares network-adjusted solution for the campaign, in which all sessions are combined to yield improved estimates of station coordinates. The final solution achieved with the described methodology is documented in this paper. Further geodetic observations are needed in order to estimate station velocities and displacements and consequently to determine the rate of deformation of the island.

  9. [Procedural analysis of acid-base balance disorder: case serials in 4 patents].

    PubMed

    Ma, Chunyuan; Wang, Guijie

    2017-05-01

    To establish a standardized process for acid-base balance analysis and to analyze cases of acid-base disorder with the aid of an acid-base balance coordinate graph, recent research progress on acid-base balance theory was reviewed systematically, and the important concepts, definitions, formulas, parameters, regularities and inferences in acid-base balance analysis were studied. The processes and steps of analyzing acid-base disorders were diagrammed, and the application of the acid-base balance coordinate graph to the cases was introduced. A "four parameters-four steps" method was put forward to analyze acid-base disorders completely. The "four parameters" are pH, arterial partial pressure of carbon dioxide (PaCO2), HCO3- and the anion gap (AG). The "four steps" are as follows: (1) according to pH, PaCO2 and HCO3-, the primary or main type of acid-base disorder is determined; (2) the primary or main type of acid-base disorder is used to choose the appropriate compensation formula and to determine the presence of a double mixed acid-base disorder; (3) the primary acid-base disorders are divided into respiratory acidosis or respiratory alkalosis; at the same time, the potential HCO3- is calculated and the measured HCO3- replaced with the potential HCO3-, to determine whether a triple mixed acid-base disorder is present; (4) data judged by the above analysis to be simple increased-AG metabolic acidosis are analyzed further: the ratio ΔAG↑/ΔHCO3-↓ is calculated to determine whether a normal-AG metabolic acidosis or a metabolic alkalosis is also present.
In clinical practice, PaCO2 (as the abscissa) and HCO3- (as the ordinate) are used to establish a rectangular coordinate system; through the origin (0, 0) and the point (40, 24) a straight line can be drawn, and at every point on this line pH is equal to 7.40. The acid-base balance coordinate graph can thus be divided into seven areas by three straight lines [namely the pH = 7.40 isoline, the PaCO2 = 40 mmHg line (1 mmHg = 0.133 kPa) and the HCO3- = 24 mmol/L line]: a main respiratory alkalosis area, a main metabolic alkalosis area, a respiratory + metabolic alkalosis area, a main respiratory acidosis area, a main metabolic acidosis area, a respiratory + metabolic acidosis area and a normal area. It is easier to determine the type of acid-base disorder by identifying the location of the (PaCO2, HCO3-) or (PaCO2, potential HCO3-) point on the coordinate graph. The "four parameters-four steps" method is systematic and comprehensive, and with the acid-base balance coordinate graph it is simpler to estimate the types of acid-base balance disorders. It is worthy of popularization.
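    The seven-area lookup on the coordinate graph can be sketched in code. This is an illustrative reading of the three dividing lines only (no compensation formulas, potential HCO3- or ΔAG step) and is emphatically not a clinical tool.

    ```python
    # Sketch of the coordinate-graph regions described above: the pH = 7.40
    # isoline through (40, 24), the PaCO2 = 40 mmHg line and the
    # HCO3- = 24 mmol/L line divide the plane into seven areas.

    def classify(paco2, hco3):
        """Locate a (PaCO2 [mmHg], HCO3- [mmol/L]) point among the seven areas.
        Illustrative only; real analysis needs the full four-step method."""
        if paco2 == 40 and hco3 == 24:
            return "normal"
        # points above the isoline (HCO3-/PaCO2 > 24/40) have pH > 7.40
        alkalemic = hco3 / paco2 > 24.0 / 40.0
        if alkalemic:
            if hco3 > 24 and paco2 < 40:
                return "respiratory + metabolic alkalosis"
            return "metabolic alkalosis" if hco3 > 24 else "respiratory alkalosis"
        if hco3 < 24 and paco2 > 40:
            return "respiratory + metabolic acidosis"
        return "metabolic acidosis" if hco3 < 24 else "respiratory acidosis"

    print(classify(60, 15))  # → respiratory + metabolic acidosis
    ```
    
    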

  10. A Two-Step Bioconversion Process for Canolol Production from Rapeseed Meal Combining an Aspergillus niger Feruloyl Esterase and the Fungus Neolentinus lepideus

    PubMed Central

    Odinot, Elise; Fine, Frédéric; Sigoillot, Jean-Claude; Navarro, David; Laguna, Oscar; Bisotto, Alexandra; Peyronnet, Corinne; Lecomte, Jérôme; Faulds, Craig B.

    2017-01-01

    Rapeseed meal is a cheap and abundant raw material, particularly rich in phenolic compounds of biotechnological interest. In this study, we developed a two-step bioconversion process of naturally occurring sinapic acid (4-hydroxy-3,5-dimethoxycinnamic acid) from rapeseed meal into canolol by combining the complementary potential of two filamentous fungi, the micromycete Aspergillus niger and the basidiomycete Neolentinus lepideus. Canolol could find numerous industrial applications because of its high antioxidant, antimutagenic and anticarcinogenic properties. In the first step of the process, the feruloyl esterase type-A (named AnFaeA) produced with the recombinant strain A. niger BRFM451 was used to release free sinapic acid from the raw meal by hydrolysing the conjugated forms of sinapic acid in the meal (mainly sinapine and glucopyranosyl sinapate). An amount of 39 nkat AnFaeA per gram of raw meal, at 55 °C and pH 5, led to the recovery of 6.6 to 7.4 mg of free sinapic acid per gram of raw meal, corresponding to a global hydrolysis yield of 68 to 76% and 100% hydrolysis of sinapine. The XAD2 adsorbent (a styrene-divinylbenzene copolymer resin), used at pH 4, then enabled the efficient recovery of the released sinapic acid and its concentration after elution with ethanol. In the second step, 3-day-old submerged cultures of the strain N. lepideus BRFM15 were supplied with the recovered sinapic acid as the substrate for bioconversion into canolol by a non-oxidative decarboxylation pathway. Canolol production reached 1.3 g/L with a molar bioconversion yield of 80% and a productivity of 100 mg/L per day. The same XAD2 resin, used at pH 7, allowed the recovery and purification of canolol from the culture broth of N. lepideus. The two-step process used mild conditions compatible with green chemistry. PMID:29036919

  11. Kerosene: Contributing agent to xylene as a clearing agent in tissue processing.

    PubMed

    Shah, Amisha Ashokkumar; Kulkarni, Dinraj; Ingale, Yashwant; Koshy, Ajit V; Bhagalia, Sanjay; Bomble, Nikhil

    2017-01-01

    Research methodology in oral and maxillofacial pathology has illimitable potential. Tissue processing involves many steps, one of the most important of which is "clearing," the process of replacing the dehydrant with a substance miscible with the embedding medium (paraffin wax). Xylene is one of the most common clearing agents used in the laboratory, but it is also hazardous. The main aim of this study is to substitute conventionally used xylene with a mixture of kerosene and xylene in the clearing step without altering the morphology and staining characteristics of tissue sections; this would also minimize toxic effects and be more economical. One hundred and twenty bits of tissue samples were collected and randomly separated into 4 groups (A, B, C and D), then processed routinely up to the clearing step; during clearing, instead of conventional xylene, we used mixtures of xylene and kerosene in 4 ratios ([A - K:X 50:50]; [B - K:X 70:30]; [C - absolute kerosene]; [D - absolute xylene, as control]) and examined the sections by light microscopy using H and E staining, IHC (D2-40) and special stains (periodic acid-Schiff and Congo red). The results were subjected to statistical analysis using Fisher's exact test. Compared with the control group D, Groups A and B were absolutely cleared without altering tissue morphology or cellular details, and optimum embedding characteristics and better staining characteristics were also noted, whereas Group C presented poor staining characteristics with reduced cellular details; tissues embedded in Group C also presented a rough, irregular surface and appeared shrunken. A combined mixture of xylene and kerosene as a clearing agent in ratio A (K:X 50:50) or B (K:X 70:30) can thus be used without posing any health risk or compromising cellular integrity.

  12. Comparison of step-by-step kinematics in repeated 30m sprints in female soccer players.

    PubMed

    van den Tillaar, Roland

    2018-01-04

    The aim of this study was to compare kinematics in repeated 30-m sprints in female soccer players. Seventeen subjects performed seven 30-m sprints every 30 s in one session. Kinematics were measured with an infrared contact mat and laser gun, and running times with an electronic timing device. The main finding was that sprint times increased over the repeated-sprint ability test. The main changes in kinematics over the repeated sprints were increased contact time and decreased step frequency, while no change in step length was observed. Within a sprint, step velocity increased in almost every step until the 14th step, which occurred around 22 m; after this, the velocity was stable until the last step, when it decreased. This within-sprint increase in step velocity was mainly caused by increasing step length and decreasing contact times. It was concluded that the fatigue induced by repeated 30-m sprints in female soccer players resulted in decreased step frequency and increased contact time. Employing this approach in combination with a laser gun and infrared mat over 30 m makes it very easy to analyse running kinematics in repeated sprints in training. This extra information gives the athlete, coach and sports scientist the opportunity to give more detailed feedback and to target these changes in kinematics to enhance repeated-sprint performance.
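    The step variables in this abstract relate through standard definitions: step frequency is the reciprocal of contact time plus flight time, and step velocity is step length times step frequency. The numbers below are illustrative assumptions, not the study's data.

    ```python
    # Standard step-kinematics relation:
    #   step frequency = 1 / (contact time + flight time)
    #   step velocity  = step length * step frequency

    def step_velocity(step_length_m, contact_s, flight_s):
        return step_length_m / (contact_s + flight_s)

    # Assumed illustrative numbers: at the same step length, the longer contact
    # time seen under fatigue lowers step frequency and hence step velocity.
    v_fresh = step_velocity(1.90, 0.11, 0.12)
    v_fatigued = step_velocity(1.90, 0.13, 0.12)
    print(round(v_fresh, 2), round(v_fatigued, 2))
    ```
    
    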

  13. The continuous assembly and transfer of nanoelements

    NASA Astrophysics Data System (ADS)

    Kumar, Arun

    Patterned nanoelements on flexible polymeric substrates at the micro/nano scale, produced at high rate and low cost by a commercially viable route, offer an opportunity for manufacturing devices with micro/nano scale features. Features made with various nanoelements can enhance device functionality in sensing and switching due to their improved conductivity and better mechanical properties. In this research, a fundamental understanding of high-rate assembly and transfer of nanoelements has been developed in three parts. In the first part, the use of electrophoresis for the controlled assembly of CNTs on interdigitated templates is shown. The assembly time scale reported is shorter than previously reported assembly times (60 seconds). The mass deposited was also predicted using Hamaker's law. It is also shown that pre-patterned CNTs can be transferred from the rigid templates onto flexible polymeric substrates using a thermoforming process. The transfer time scale is less than one minute (50 seconds) and was found to depend on polymer chemistry: CNTs preferentially transfer from the Au electrode to non-polar polymeric substrates (polyurethane and polyethylene terephthalate glycol) in the thermoforming process. In the second part, a novel process (pulsed electrophoresis) is shown for the first time to assist the assembly of conducting polyaniline on gold nanowire interdigitated templates. This technique offers dynamic control over heat build-up, which has been a main drawback of DC electrophoresis and AC dielectrophoresis as well as the main cause of nanowire template damage. The technique allowed higher voltages to be applied, resulting in shorter assembly times (e.g., 17.4 seconds, at an assembly resolution of 100 nm).
The pre-patterned templates with PANi deposition were subsequently used to transfer the nanoscale-assembled PANi from the rigid templates to thermoplastic polyurethane using the thermoforming process. In the third part, a novel integration of high-rate pulsed electrophoretic assembly with thermally assisted transfer in a roll-to-roll process is shown. This integration allowed the whole assembly and transfer process to take place in only 30 seconds. Further, a processing window is developed to control the percent area coverage of PANi via the belt speed. The effect of different types of polymer on the quality of transfer is also shown, and it is concluded that the transfer is affected by the polymer chemistry.

  14. Development of in situ product removal strategies in biocatalysis applying scaled-down unit operations.

    PubMed

    Heintz, Søren; Börner, Tim; Ringborg, Rolf H; Rehn, Gustav; Grey, Carl; Nordblad, Mathias; Krühne, Ulrich; Gernaey, Krist V; Adlercreutz, Patrick; Woodley, John M

    2017-03-01

    An experimental platform based on scaled-down unit operations combined in a plug-and-play manner enables easy and highly flexible testing of advanced biocatalytic process options such as in situ product removal (ISPR) strategies. In such a platform, it is possible to compartmentalize different process steps while operating them as a combined system, making it possible to test and characterize the performance of novel process concepts and biocatalysts with minimal influence of inhibitory products. Here the capabilities of performing process development with scaled-down unit operations are highlighted through a case study investigating the asymmetric synthesis of 1-methyl-3-phenylpropylamine (MPPA) using ω-transaminase, an enzyme in the sub-family of amino transferases (ATAs). An on-line HPLC system was applied to avoid manual sample handling and to semi-automatically characterize ω-transaminases in a scaled-down packed-bed reactor (PBR) module, showing MPPA to be a strong inhibitor. To overcome the inhibition, a two-step liquid-liquid extraction (LLE) ISPR concept was tested using scaled-down unit operations combined in a plug-and-play manner. With this ISPR concept, it was possible to continuously feed the main substrate benzylacetone (BA) and extract the main product MPPA throughout the reaction, thereby overcoming the challenges of low substrate solubility and product inhibition. The tested ISPR concept achieved a product concentration of 26.5 g MPPA/L, a purity of up to 70% (g MPPA per g total) and a recovery of about 80% (mol/mol) of MPPA in 20 h, with the possibility of increasing the concentration, purity, and recovery further. Biotechnol. Bioeng. 2017;114:600-609. © 2016 Wiley Periodicals, Inc.

  15. Challenges associated with the implementation of the nursing process: A systematic review.

    PubMed

    Zamanzadeh, Vahid; Valizadeh, Leila; Tabrizi, Faranak Jabbarzadeh; Behshid, Mojghan; Lotfi, Mojghan

    2015-01-01

    The nursing process is a scientific approach to the provision of quality nursing care. In practice, however, the implementation of this process faces numerous challenges. With knowledge of these challenges, implementation of the nursing process can be developed appropriately. Due to the lack of comprehensive information on this subject, the current study was carried out to assess the key challenges associated with the implementation of the nursing process. To retrieve and review related studies in this field, the databases Iran medix, SID, Magiran, PUBMED, Google scholar, and Proquest were searched using the main keywords of nursing process and nursing process systematic review. The articles were retrieved in three steps: searching by keywords, review of the results based on inclusion criteria, and final retrieval and assessment of the available full texts. Systematic assessment of the articles showed different challenges in the implementation of the nursing process. An intangible understanding of the concept of the nursing process, differing views of the process, lack of knowledge and awareness among nurses about execution of the process, inadequate support from managing systems, and problems related to recording the nursing process were the main challenges extracted from the review of the literature. On systematically reviewing the literature, an intangible understanding of the concept of the nursing process was identified as the main challenge. As the best strategy to minimize this challenge, in addition to preparing facilitators for implementation of the nursing process, addressing the intangible understanding of the concept and the differing views of the process, and forming teams of experts in nursing education, are recommended to internalize the nursing process among nurses.

  16. Challenges associated with the implementation of the nursing process: A systematic review

    PubMed Central

    Zamanzadeh, Vahid; Valizadeh, Leila; Tabrizi, Faranak Jabbarzadeh; Behshid, Mojghan; Lotfi, Mojghan

    2015-01-01

    Background: The nursing process is a scientific approach to the provision of quality nursing care. In practice, however, the implementation of this process faces numerous challenges. With knowledge of these challenges, implementation of the nursing process can be developed appropriately. Due to the lack of comprehensive information on this subject, the current study was carried out to assess the key challenges associated with the implementation of the nursing process. Materials and Methods: To retrieve and review related studies in this field, the databases Iran medix, SID, Magiran, PUBMED, Google scholar, and Proquest were searched using the main keywords of nursing process and nursing process systematic review. The articles were retrieved in three steps: searching by keywords, review of the results based on inclusion criteria, and final retrieval and assessment of the available full texts. Results: Systematic assessment of the articles showed different challenges in the implementation of the nursing process. An intangible understanding of the concept of the nursing process, differing views of the process, lack of knowledge and awareness among nurses about execution of the process, inadequate support from managing systems, and problems related to recording the nursing process were the main challenges extracted from the review of the literature. Conclusions: On systematically reviewing the literature, an intangible understanding of the concept of the nursing process was identified as the main challenge. As the best strategy to minimize this challenge, in addition to preparing facilitators for implementation of the nursing process, addressing the intangible understanding of the concept and the differing views of the process, and forming teams of experts in nursing education, are recommended to internalize the nursing process among nurses. PMID:26257793

  17. NPTool: Towards Scalability and Reliability of Business Process Management

    NASA Astrophysics Data System (ADS)

    Braghetto, Kelly Rosa; Ferreira, João Eduardo; Pu, Calton

Currently, an important challenge in business process management is providing scalability and reliability of business process executions at the same time. This difficulty becomes more pronounced when the execution control involves countless complex business processes. This work presents the NavigationPlanTool (NPTool), a tool to control the execution of business processes. NPTool is supported by the Navigation Plan Definition Language (NPDL), a language for business process specification that uses process algebra as its formal foundation. NPTool implements the NPDL language as a SQL extension. The main contribution of this paper is a description of NPTool showing how process algebra features combined with a relational database model can provide scalable and reliable control of the execution of business processes. The next steps for NPTool include reuse of control-flow patterns and support for data flow management.

  18. Analysis of pure tar substances (polycyclic aromatic hydrocarbons) in the gas stream using ultraviolet visible (UV-Vis) spectroscopy and multivariate curve resolution (MCR).

    PubMed

    Weide, Tobias; Guschin, Viktor; Becker, Wolfgang; Koelle, Sabine; Maier, Simon; Seidelt, Stephan

    2015-01-01

The analysis of tar, mostly characterized as polycyclic aromatic hydrocarbons (PAHs), is a topic that has been researched for years. In particular, an online analysis of tar in the gas stream is needed to characterize tar conversion and formation in the biomass gasification process. The online analysis in the gas stream is carried out with ultraviolet-visible (UV-Vis) spectroscopy (190-720 nm), using a measuring cell developed by the Fraunhofer Institute for Chemical Technology (ICT). To date, online tar measurements using UV-Vis spectroscopy have not been carried out in detail. Therefore, PAHs are analyzed as follows. The measurements are split into different steps. The first step to prove the online method is to vaporize single tar substances. These experiments show that a qualitative analysis of PAHs in the gas stream with the measurement setup used is possible. Furthermore, the method provides very exact results, so that a differentiation of various PAHs is possible. The next step is to vaporize a PAH mixture, consisting of five pure substances vaporized almost simultaneously. The resulting data are interpreted with a chemometric method, multivariate curve resolution (MCR). The verification of the calculated results is the main aim of this experiment. It is shown that the tar mixture can be analyzed qualitatively and quantitatively (in arbitrary units) in detail using MCR. Finally, the main goal of this paper is to show the first steps toward applying UV-Vis spectroscopy and the measurement setup to online tar analysis, with a view to characterizing the biomass gasification process. To that end, the laboratory-scale gasification plant developed and constructed by the Fraunhofer ICT has been used to vaporize these substances.
Using this gasification plant for the experiments also enables the use of the measurement setup for spectroscopic analysis of tar formation during biomass gasification.

  19. Automatic Road Gap Detection Using Fuzzy Inference System

    NASA Astrophysics Data System (ADS)

    Hashemi, S.; Valadan Zoej, M. J.; Mokhtarzadeh, M.

    2011-09-01

Automatic feature extraction from aerial and satellite images is a high-level data processing task that remains one of the most important research topics in the field. Most research in this area focuses on the early step of road detection, where road tracking methods, morphological analysis, dynamic programming and snakes, multi-scale and multi-resolution methods, stereoscopic and multi-temporal analysis, and hyperspectral experiments are some of the mature approaches. Although most research is focused on detection algorithms, none of them can extract the road network perfectly. On the other hand, post-processing algorithms, aimed at refining road detection results, are not as well developed. In this article, the main aim is to design an intelligent method to detect and compensate for road gaps remaining in the early results of road detection algorithms. The proposed algorithm consists of five main steps, as follows: 1) Short gap coverage: in this step, a multi-scale morphological operator is designed that covers short gaps in a hierarchical scheme. 2) Long gap detection: in this step, the long gaps that could not be covered in the previous stage are detected using a fuzzy inference system. For this purpose, a knowledge base consisting of expert rules is designed, which is fired on gap candidates from the road detection results. 3) Long gap coverage: in this stage, detected long gaps are compensated by two strategies, linear and polynomial fitting; shorter gaps are filled by line fitting, while longer ones are compensated by polynomials. 4) Accuracy assessment: in order to evaluate the obtained results, some accuracy assessment criteria are proposed. These criteria are obtained by comparing the obtained results with truly compensated ones produced by a human expert. The complete evaluation of the obtained results, with their technical discussion, is the material of the full paper.
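The short-gap coverage step (step 1) can be sketched as a hierarchical morphological closing. The snippet below is an illustrative reconstruction, not the authors' implementation; the function name, radius schedule, and square structuring element are assumptions, and a binary road mask is assumed as input:

```python
import numpy as np
from scipy import ndimage

def cover_short_gaps(road_mask, max_radius=2):
    """Hierarchically close short gaps in a binary road mask by applying
    morphological closing with structuring elements of increasing size."""
    result = road_mask.copy()
    for r in range(1, max_radius + 1):
        # square structuring element of radius r (side 2r + 1)
        structure = np.ones((2 * r + 1, 2 * r + 1), dtype=bool)
        result = ndimage.binary_closing(result, structure=structure)
    return result
```

A one-pixel break in an otherwise continuous road segment is filled by the first (smallest-radius) closing; larger radii pick up progressively longer gaps, mimicking the hierarchical scheme described above.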

  20. Calibration process of highly parameterized semi-distributed hydrological model

    NASA Astrophysics Data System (ADS)

    Vidmar, Andrej; Brilly, Mitja

    2017-04-01

Hydrological phenomena take place in the hydrological system, which is governed by nature, and are essentially stochastic. These phenomena are unique, non-recurring, and changeable across space and time. Since any river basin, with its own natural characteristics, and any hydrological event therein are unique, this is a complex process that has not been researched enough. Calibration is a procedure for determining the parameters of a model that are not known well enough. Input and output variables and mathematical model expressions are known, while only some parameters are unknown; these are determined by calibrating the model. The software used for hydrological modelling nowadays is equipped with sophisticated algorithms for calibration, but without the possibility for the modeler to manage the process, and the results are often not the best. We developed a procedure for an expert-driven calibration process. We use the HBV-light-CLI hydrological model, which has a command-line interface, and couple it with PEST. PEST is a parameter estimation tool that is widely used in groundwater modeling and can also be applied to surface waters. A calibration process managed directly by the expert affects the outcome of the inversion procedure in proportion to the expert's knowledge, and achieves better results than if the procedure had been left to the selected optimization algorithm. The first step is to properly define the spatial characteristics and structural design of the semi-distributed model, including all morphological and hydrological phenomena, such as karstic, alluvial, and forest areas. This step includes and requires the geological, meteorological, hydraulic, and hydrological knowledge of the modeler. The second step is to set the initial parameter values to their preferred values based on expert knowledge. In this step we also define all parameter and observation groups. Peak data are essential in the calibration process if we are mainly interested in flood events.
Each sub-catchment in the model has its own observation group. The third step is to set appropriate bounds on the parameters within their range of realistic values. The fourth step is to use singular value decomposition (SVD), which ensures that PEST maintains numerical stability regardless of how ill-posed the inverse problem is. The fifth step is to run PWTADJ1. This creates a new PEST control file in which weights are adjusted so that the contribution made to the total objective function by each observation group is the same. This prevents the information content of any group from being invisible to the inversion process. The sixth step is to add Tikhonov regularization to the PEST control file by running the ADDREG1 utility (Doherty, J., 2013). In adding regularization to the PEST control file, ADDREG1 automatically provides a prior information equation for each parameter in which the preferred value of that parameter is equated to its initial value. The last step is to run PEST. We run BeoPEST, a parallel version of PEST that can be run on multiple computers simultaneously over TCP communication, which speeds up the calibration process. The case study, with results of calibration and validation of the model, will be presented.
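The SVD and Tikhonov elements that PEST combines can be illustrated with a minimal numpy sketch of one regularized linearized inversion step. This is a toy illustration of the idea, not PEST's actual algorithm; the names `J` (Jacobian), `r` (residuals), `p0` (preferred/initial parameters), and `lam` (regularization weight) are hypothetical:

```python
import numpy as np

def tikhonov_solve(J, r, p0, lam):
    """Solve the linearized inverse problem
        minimize ||J dp - r||^2 + lam^2 ||dp||^2
    preferring parameters close to p0. SVD keeps the solve
    numerically stable even when the problem is ill-posed."""
    U, s, Vt = np.linalg.svd(J, full_matrices=False)
    # damped inverse singular values (small s are suppressed by lam)
    d = s / (s ** 2 + lam ** 2)
    dp = Vt.T @ (d * (U.T @ r))
    return p0 + dp
```

With `lam = 0` this reduces to ordinary least squares; a large `lam` shrinks the update toward zero, i.e. toward the preferred (initial) parameter values, which is the behavior ADDREG1's prior information equations encode in PEST.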

  1. Automatic cloud coverage assessment of Formosat-2 image

    NASA Astrophysics Data System (ADS)

    Hsu, Kuo-Hsien

    2011-11-01

The Formosat-2 satellite is equipped with a high-spatial-resolution (2 m ground sampling distance) remote sensing instrument. It has been operated on a daily-revisiting mission orbit by the National Space Organization (NSPO) of Taiwan since May 21, 2004. NSPO also serves as one of the ground receiving stations, daily processing the received Formosat-2 images. The current cloud coverage assessment of Formosat-2 images in the NSPO Image Processing System generally consists of two major steps. First, an unsupervised K-means method is used to automatically estimate the cloud statistics of a Formosat-2 image. Second, cloud coverage is estimated from the Formosat-2 image by manual examination. A more accurate Automatic Cloud Coverage Assessment (ACCA) method would therefore increase the efficiency of the second step by providing a good prediction of the cloud statistics. In this paper, based mainly on the research results of Chang et al., Irish, and Gotoh, we propose a modified Formosat-2 ACCA method that includes pre-processing and post-processing analysis. For the pre-processing analysis, cloud statistics are determined using unsupervised K-means classification, Sobel's method, Otsu's method, non-cloudy pixel reexamination, and a cross-band filter method. A box-counting fractal method is used as a post-processing tool to double-check the pre-processing results and increase the efficiency of manual examination.
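The unsupervised K-means step can be sketched in its simplest form: two-class clustering of pixel intensities, taking the brighter cluster as cloud. This is an illustrative toy, not NSPO's implementation, and it assumes a single-band image where clouds are the brighter class:

```python
import numpy as np

def cloud_fraction(pixels, n_iter=20):
    """Estimate cloud coverage by unsupervised 2-means clustering of
    pixel intensities; the brighter cluster is taken to be cloud."""
    pixels = np.asarray(pixels, dtype=float).ravel()
    centers = np.array([pixels.min(), pixels.max()], dtype=float)
    for _ in range(n_iter):
        # assign each pixel to its nearest cluster center
        labels = np.abs(pixels[:, None] - centers[None, :]).argmin(axis=1)
        for k in range(2):
            if np.any(labels == k):
                centers[k] = pixels[labels == k].mean()
    cloud = labels == centers.argmax()
    return cloud.mean()
```

A real pipeline would of course add the edge, threshold, and cross-band checks the paper lists, precisely because plain intensity clustering confuses bright ground with cloud.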

  2. Potential radiological impact of tornadoes on the safety of Nuclear Fuel Services' West Valley Fuel Reprocessing Plant. Volume I. Tornado effects on head-end cell airflow

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Holloway, L.J.; Andrae, R.W.

    1981-09-01

This report describes the results of a parametric study of the impact of tornado-generated depressurization on airflow in the contaminated process cells within the presently inoperative Nuclear Fuel Services fuel reprocessing facility near West Valley, NY. The study involved the following tasks: (1) mathematical modeling of installed ventilation and abnormal exhaust pathways from the cells and prediction of tornado-induced airflows in these pathways; (2) mathematical modeling of individual cell flow characteristics and prediction of in-cell velocities induced by the flows from step 1; and (3) evaluation of the results of steps 1 and 2 to determine whether any of the pathways investigated have the potential to release quantities of radioactively contaminated air from the main process cells. The study concluded that in the event of a tornado strike, certain pathways from the cells have the potential to release radioactive materials to the atmosphere. Determination of the quantities of radioactive material released from the cells through the pathways identified in step 3 is presented in Part II of this report.

  3. One-step production of multilayered microparticles by tri-axial electro-flow focusing

    NASA Astrophysics Data System (ADS)

    Si, Ting; Feng, Hanxin; Li, Yang; Luo, Xisheng; Xu, Ronald

    2014-03-01

Microencapsulation of drugs and imaging agents in the same carrier is of great significance for the simultaneous detection and treatment of diseases. In this work, we have developed a tri-axial electro-flow focusing (TEFF) device using three needles in a novel concentric arrangement to form multilayered microparticles in a single step. The TEFF process can be characterized as a multi-fluidic compound cone-jet configuration in the core of a high-speed coflowing gas stream under an axial electric field. The tri-axial liquid jet eventually breaks up into multilayered droplets. To validate the method, the effect of the main process parameters on the characteristics of the cone and the jet has been studied experimentally. The applied electric field can dramatically promote the stability of the compound cone and enhance the atomization of compound liquid jets. Microparticles with three-layer, double-layer, and single-layer structures have been obtained. The results show that the TEFF technique has great benefits for fabricating multilayered microparticles at smaller scales. This method will be able to encapsulate multiple therapeutic and imaging agents in one step for biomedical applications such as multi-modal imaging, drug delivery, and biomedicine.

  4. Biomass-based magnetic fluorescent nanoparticles: One-step scalable synthesis, application as drug carriers and mechanism study.

    PubMed

    Li, Lei; Wang, Feijun; Shao, Ziqiang

    2018-03-15

Biomass-based magnetic fluorescent nanoparticles (MFNPs) were synthesized in situ via a one-step high-gravity approach; they consist of a magnetic core of Fe3O4 nanoparticles, a fluorescent marker of carbon dots (CDs), and shells of chitosan (CS). The obtained MFNPs had a 10 nm average diameter, a narrow particle size distribution, low cytotoxicity, and superior fluorescent emission and superparamagnetic properties. Encapsulation and release experiments with 5-fluorouracil confirmed that the introduction of CS/CDs effectively improved the drug loading capacity. Mechanism and kinetic studies showed that: (i) monolayer adsorption was the main sorption mode under the studied conditions; (ii) the whole adsorption process was controlled by intra-liquid diffusion mass transfer and governed by chemisorption; and (iii) the release process was controlled by Fickian diffusion. These results demonstrate that this method can continuously produce MFNPs in one step, and that the resulting non-toxic nanostructure offers great advantages in current nano-delivery systems, with high application value in targeted drug delivery, magnetic fluid hyperthermia treatment, magnetic resonance imaging (MRI), in vitro testing, and related research. Copyright © 2017 Elsevier Ltd. All rights reserved.

  5. Method for automatically evaluating a transition from a batch manufacturing technique to a lean manufacturing technique

    DOEpatents

    Ivezic, Nenad; Potok, Thomas E.

    2003-09-30

    A method for automatically evaluating a manufacturing technique comprises the steps of: receiving from a user manufacturing process step parameters characterizing a manufacturing process; accepting from the user a selection for an analysis of a particular lean manufacturing technique; automatically compiling process step data for each process step in the manufacturing process; automatically calculating process metrics from a summation of the compiled process step data for each process step; and, presenting the automatically calculated process metrics to the user. A method for evaluating a transition from a batch manufacturing technique to a lean manufacturing technique can comprise the steps of: collecting manufacturing process step characterization parameters; selecting a lean manufacturing technique for analysis; communicating the selected lean manufacturing technique and the manufacturing process step characterization parameters to an automatic manufacturing technique evaluation engine having a mathematical model for generating manufacturing technique evaluation data; and, using the lean manufacturing technique evaluation data to determine whether to transition from an existing manufacturing technique to the selected lean manufacturing technique.
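The compile-and-calculate steps the patent describes can be sketched in a few lines. The step parameters and metrics below (queue time, process cycle efficiency) are illustrative choices for a lean analysis, not the parameters or mathematical model of the patent itself:

```python
from dataclasses import dataclass

@dataclass
class ProcessStep:
    name: str
    cycle_time: float  # minutes of value-added work per unit (assumed)
    queue_time: float  # minutes a unit waits before this step (assumed)

def process_metrics(steps):
    """Compile per-step data and summarize it into overall process
    metrics, e.g. total lead time and process cycle efficiency
    (the value-added share of total lead time)."""
    value_added = sum(s.cycle_time for s in steps)
    lead_time = sum(s.cycle_time + s.queue_time for s in steps)
    return {
        "total_value_added_time": value_added,
        "total_lead_time": lead_time,
        "process_cycle_efficiency": value_added / lead_time,
    }
```

A low process cycle efficiency (most lead time spent waiting) is the classic signal that a transition from batch to lean flow is worth evaluating.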

  6. Porous single-phase NiTi processed under Ca reducing vapor for use as a bone graft substitute.

    PubMed

    Bertheville, Bernard

    2006-03-01

Porous nickel-titanium alloys (NiTi, nitinol) have recently attracted attention in clinical surgery because they are a very interesting alternative to the more brittle and less machinable conventional porous Ca-based ceramics. The main remaining limitations come from the chemical homogeneity of the as-processed porous nickel-titanium alloys, which always contain undesired secondary Ti- and Ni-rich phases. These are known to weaken the NiTi products, to favor their cavitation corrosion and to decrease their biocompatibility. Elemental nickel must also be avoided because it could give rise to several adverse tissue reactions. Therefore, the synthesis of porous single-phase NiTi alloys by using a basic single-step sintering procedure is an important step towards the processing of safe implant materials. The sintering process used in this work is based on a vapor phase calciothermic reduction operating during the NiTi compound formation. The as-processed porous nickel-titanium microstructure is single-phase and shows a uniformly open pore distribution with porosity of about 53% and pore diameters in the range 20-100 μm. Furthermore, due to the process, fine CaO layers grow on the NiTi outer and inner surfaces, acting as possible promoting agents for the ingrowth of bone cells at the implantation site.

  7. RFID in the blood supply chain--increasing productivity, quality and patient safety.

    PubMed

    Briggs, Lynne; Davis, Rodeina; Gutierrez, Alfonso; Kopetsky, Matthew; Young, Kassandra; Veeramani, Raj

    2009-01-01

    As part of an overall design of a new, standardized RFID-enabled blood transfusion medicine supply chain, an assessment was conducted for two hospitals: the University of Iowa Hospital and Clinics (UIHC) and Mississippi Baptist Health System (MBHS). The main objectives of the study were to assess RFID technological and economic feasibility, along with possible impacts to productivity, quality and patient safety. A step-by-step process analysis focused on the factors contributing to process "pain points" (errors, inefficiency, product losses). A process re-engineering exercise produced blueprints of RFID-enabled processes to alleviate or eliminate those pain-points. In addition, an innovative model quantifying the potential reduction in adverse patient effects as a result of RFID implementation was created, allowing improvement initiatives to focus on process areas with the greatest potential impact to patient safety. The study concluded that it is feasible to implement RFID-enabled processes, with tangible improvements to productivity and safety expected. Based on a comprehensive cost/benefit model, it is estimated for a large hospital (UIHC) to recover investment from implementation within two to three years, while smaller hospitals may need longer to realize ROI. More importantly, the study estimated that RFID technology could reduce morbidity and mortality effects substantially among patients receiving transfusions.

  8. Fundamental mechanisms and reactions in non-catalytic subcritical hydrothermal processes: A review.

    PubMed

    Yousefifar, Azadeh; Baroutian, Saeid; Farid, Mohammed M; Gapes, Daniel J; Young, Brent R

    2017-10-15

    The management and disposal of solid waste is of increasing concern across the globe. Hydrothermal processing of sludge has been suggested as a promising solution to deal with the considerable amounts of sludge produced worldwide. Such a process not only degrades organic compounds and reduces waste volume, but also provides an opportunity to recover valuable substances. Hydrothermal processing comprises two main sub-processes: wet oxidation (WO) and thermal hydrolysis (TH), in which the formation of various free radicals results in the production of different intermediates. Volatile fatty acids (VFAs), especially acetic acid, are usually the main intermediates which remain as a by-product of the process. This paper aims to review the fundamental mechanism for hydrothermal processing of sludge, and the formation of different free radicals and intermediates therein. In addition, the proposed kinetic models for the two processes (WO and TH) from the literature are reviewed and the advantages and disadvantages of each model are outlined. The effect of mass transfer as a critical component of the design and development of the processes, which has been neglected in most of these proposed models, is also reviewed, and the effect of influencing parameters on the processes' controlling step (reaction or mass transfer) is discussed. Copyright © 2017 Elsevier Ltd. All rights reserved.
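The lumped kinetic models reviewed (substrate degrading partly through acetic acid as a refractory intermediate) can be sketched as a simple reaction-scheme integration. The scheme and rate constants below are a generic illustration of this class of model, not a specific model from the review:

```python
import numpy as np

def lumped_wo_kinetics(k1, k2, k3, c0, t_end=60.0, dt=0.01):
    """Integrate a generic lumped wet-oxidation scheme with explicit Euler:
        substrate --k1--> acetic acid --k3--> end products
        substrate --k2--> end products (direct route)
    Concentrations are in arbitrary units; all rates are first order."""
    n = int(t_end / dt)
    s, a = c0, 0.0
    ts, ss, aa = [0.0], [s], [a]
    for i in range(1, n + 1):
        ds = -(k1 + k2) * s          # substrate consumed by both routes
        da = k1 * s - k3 * a         # acetic acid formed, then slowly oxidized
        s += ds * dt
        a += da * dt
        ts.append(i * dt); ss.append(s); aa.append(a)
    return np.array(ts), np.array(ss), np.array(aa)
```

With k3 much smaller than k1 + k2, acetic acid accumulates and then decays slowly, reproducing the reviewed observation that VFAs (especially acetic acid) remain as the main by-product of the process.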

  9. New numerical approach for the modelling of machining applied to aeronautical structural parts

    NASA Astrophysics Data System (ADS)

    Rambaud, Pierrick; Mocellin, Katia

    2018-05-01

The manufacturing of aluminium alloy structural aerospace parts involves several steps: forming (rolling, forging, etc.), heat treatments, and machining. Before machining, the manufacturing processes have embedded residual stresses in the workpiece. The final geometry is obtained during this last step, when up to 90% of the raw material volume is removed by machining. During this operation, the mechanical equilibrium of the part is in constant evolution due to the redistribution of the initial stresses. This redistribution is the main cause of workpiece deflections during machining and of distortions after unclamping. Both may lead to non-conformity of the part with regard to geometrical and dimensional specifications, and therefore to rejection of the part or to additional conforming steps. In order to improve the machining accuracy and the robustness of the process, the effect of the residual stresses has to be considered in the definition of the machining process plan and even in the geometrical definition of the part. In this paper, the authors present two new numerical approaches to the modelling of machining of aeronautical structural parts. The first deals with the use of an immersed volume framework to model the cutting step, improving the robustness and the quality of the resulting mesh compared to the previous version. The second concerns the mechanical modelling of the machining problem. The authors show that, in the framework of rolled aluminium parts, the use of a linear elasticity model is functional in the finite element formulation and promising with regard to the reduction of computation times.

  10. Value of Collaboration With Standardized Patients and Patient Facilitators in Enhancing Reflection During the Process of Building a Simulation.

    PubMed

    Stanley, Claire; Lindsay, Sally; Parker, Kathryn; Kawamura, Anne; Samad Zubairi, Mohammad

    2018-05-09

We previously reported that experienced clinicians find that the process of collectively building and participating in simulations provides (1) a unique reflective opportunity; (2) a venue to identify different perspectives through discussion and action in a group; and (3) a safe environment for learning. No studies have assessed the value of collaborating with standardized patients (SPs) and patient facilitators (PFs) in this process. In this work, we describe this collaboration in building a simulation and the key elements that facilitate reflection. Three simulation scenarios surrounding communication were built by teams of clinicians, a PF, and SPs. Six build sessions were audio recorded, transcribed, and thematically analyzed through an iterative process to (1) describe the steps of building a simulation scenario and (2) identify the key elements involved in the collaboration. The five main steps in building a simulation scenario were (1) storytelling and reflection; (2) defining objectives and brainstorming ideas; (3) building a stem and creating a template; (4) refining the scenario with feedback from SPs; and (5) mock run-throughs with follow-up discussion. During these steps, the PF shared personal insights, challenging participants to reflect more deeply to better understand and consider the patient's perspective. The SPs provided a unique outside perspective to the group. In addition, the interaction between the SPs and the PF helped refine character roles. A collaborative approach incorporating feedback from PFs and SPs to create a simulation scenario is a valuable method to enhance reflective practice for clinicians.

  11. Hybrid method to estimate two-layered superficial tissue optical properties from simulated data of diffuse reflectance spectroscopy.

    PubMed

    Hsieh, Hong-Po; Ko, Fan-Hua; Sung, Kung-Bin

    2018-04-20

    An iterative curve fitting method has been applied in both simulation [J. Biomed. Opt.17, 107003 (2012)JBOPFO1083-366810.1117/1.JBO.17.10.107003] and phantom [J. Biomed. Opt.19, 077002 (2014)JBOPFO1083-366810.1117/1.JBO.19.7.077002] studies to accurately extract optical properties and the top layer thickness of a two-layered superficial tissue model from diffuse reflectance spectroscopy (DRS) data. This paper describes a hybrid two-step parameter estimation procedure to address two main issues of the previous method, including (1) high computational intensity and (2) converging to local minima. The parameter estimation procedure contained a novel initial estimation step to obtain an initial guess, which was used by a subsequent iterative fitting step to optimize the parameter estimation. A lookup table was used in both steps to quickly obtain reflectance spectra and reduce computational intensity. On simulated DRS data, the proposed parameter estimation procedure achieved high estimation accuracy and a 95% reduction of computational time compared to previous studies. Furthermore, the proposed initial estimation step led to better convergence of the following fitting step. Strategies used in the proposed procedure could benefit both the modeling and experimental data processing of not only DRS but also related approaches such as near-infrared spectroscopy.
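The two-step procedure (a lookup-table-style coarse search for the initial guess, followed by iterative fitting from that guess) can be sketched generically. The forward model below is a stand-in toy, not the actual two-layered tissue reflectance model, and the grid and refinement settings are arbitrary:

```python
import numpy as np

def forward_model(p):
    """Stand-in forward model mapping two parameters to a 'spectrum'."""
    a, b = p
    x = np.linspace(0.0, 1.0, 50)
    return a * np.exp(-b * x)

def fit_two_step(measured, grid_a, grid_b, n_refine=200, step=0.05, seed=0):
    # Step 1: initial estimation via a precomputable lookup table
    # (here: exhaustive coarse grid search over the two parameters)
    best, best_err = None, np.inf
    for a in grid_a:
        for b in grid_b:
            err = np.sum((forward_model((a, b)) - measured) ** 2)
            if err < best_err:
                best, best_err = np.array([a, b]), err
    # Step 2: iterative fitting started from the initial guess
    # (here: a simple accept-if-better random local search)
    rng = np.random.default_rng(seed)
    p = best
    for _ in range(n_refine):
        cand = p + rng.normal(scale=step, size=2)
        err = np.sum((forward_model(cand) - measured) ** 2)
        if err < best_err:
            p, best_err = cand, err
    return p
```

The point of the design is the same as in the paper: the coarse lookup stage is cheap and keeps the iterative stage from converging to a distant local minimum, while the refinement recovers the accuracy the coarse grid lacks.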

  12. The influence of patient portals on users' decision making is insufficiently investigated: A systematic methodological review.

    PubMed

    Fraccaro, Paolo; Vigo, Markel; Balatsoukas, Panagiotis; Buchan, Iain E; Peek, Niels; van der Veer, Sabine N

    2018-03-01

    Patient portals are considered valuable conduits for supporting patients' self-management. However, it is unknown why they often fail to impact on health care processes and outcomes. This may be due to a scarcity of robust studies focusing on the steps that are required to induce improvement: users need to effectively interact with the portal (step 1) in order to receive information (step 2), which might influence their decision-making (step 3). We aimed to explore this potential knowledge gap by investigating to what extent each step has been investigated for patient portals, and explore the methodological approaches used. We performed a systematic literature review using Coiera's information value chain as a guiding theoretical framework. We searched MEDLINE and Scopus by combining terms related to patient portals and evaluation methodologies. Two reviewers selected relevant papers through duplicate screening, and one extracted data from the included papers. We included 115 articles. The large majority (n = 104) evaluated aspects related to interaction with patient portals (step 1). Usage was most often assessed (n = 61), mainly by analysing system interaction data (n = 50), with most authors considering participants as active users if they logged in at least once. Overall usability (n = 57) was commonly assessed through non-validated questionnaires (n = 44). Step 2 (information received) was investigated in 58 studies, primarily by analysing interaction data to evaluate usage of specific system functionalities (n = 34). Eleven studies explicitly assessed the influence of patient portals on patients' and clinicians' decisions (step 3). Whereas interaction with patient portals has been extensively studied, their influence on users' decision-making remains under-investigated. Methodological approaches to evaluating usage and usability of portals showed room for improvement. 
To unlock the potential of patient portals, more (robust) research should focus on better understanding the complex process of how portals lead to improved health and care. Copyright © 2018 The Authors. Published by Elsevier B.V. All rights reserved.

  13. Laser Surface Treatment of Sintered Alumina

    NASA Astrophysics Data System (ADS)

    Hagemann, R.; Noelke, C.; Kaierle, S.; Wesling, V.

Sintered alumina ceramics are used as refractory materials for industrial aluminum furnaces. In this environment the ceramic surface is in permanent contact with molten aluminum, resulting in deposition of oxidic material on its surface. Consequently, a lower volume capacity as well as lower thermal efficiency of the furnaces follows. To reduce oxidic adherence of the ceramic material, two laser-based surface treatment processes were investigated: powder-based single-step laser cladding and laser surface remelting. The main objective is to achieve an improved surface quality of the ceramic material considering industrial requirements such as a high process speed.

  14. Surgery and Research: A Practical Approach to Managing the Research Process

    PubMed Central

    Swiatek, Peter R.; Chung, Kevin C.; Mahmoudi, Elham

    2016-01-01

    Following a practical project management method is essential in completing a research project on time and within budget. Although this concept is well developed in the business world, it has yet to be explored in academic surgical research. Defining and adhering to a suitable workflow would increase portability, reusability, and therefore, efficiency of the research process. In this article, we briefly review project management techniques. We specifically underline four main steps of project management: (1) definition and organization, (2) planning, (3) execution, and (4) evaluation, using practical examples from our own multidisciplinary plastic surgery research team. PMID:26710037

  15. How We Used NASA Lunar Set in Planetary Material Science Analog Studies on Lunar Basalts and Breccias with Industrial Materials of Steels and Ceramics

    NASA Technical Reports Server (NTRS)

    Berczi, S.; Cech, V.; Jozsa, S.; Szakmany, G.; Fabriczy, A.; Foldi, T.; Varga, T.

    2005-01-01

Analog studies play an important role in space materials education. Various aspects of analogies are used in our courses. This year, two main rock types from the NASA Lunar Set were used in analog studies of processes and textures, together with selected industrial material samples. For breccias and basalts on the lunar side, ceramics and steels were found as analogs on the industrial side. Their processing steps were identified on the basis of their textures, both in the lunar and in the industrial groups of materials.

  16. Regional and County-Level Disparities in the Post-Socialist Urban System of Romania

    NASA Astrophysics Data System (ADS)

    Török, Ibolya; Veress, Nóra-Csilla

    2016-10-01

The evolution of the urban system in Romania in recent decades has been strongly influenced by its historical background, as well as by the changing political, social, and economic context. A main step in this process came in 2004, when 38 settlements received urban status, influencing not only the country's urbanization level but also inter-regional disparities. The paper aims to analyze the post-urbanization process in Romania, highlighting the factors which have contributed to the deepening development differences between the country's urban areas.

  17. An automatic agricultural zone classification procedure for crop inventory satellite images

    NASA Technical Reports Server (NTRS)

    Parada, N. D. J. (Principal Investigator); Kux, H. J.; Velasco, F. R. D.; Deoliveira, M. O. B.

    1982-01-01

    A classification procedure for assessing crop areal proportion in multispectral scanner images is discussed. The procedure is divided into four parts: labeling, classification, proportion estimation, and evaluation. It also has the following characteristics: multitemporal classification, a minimal need for field information, and the capability to verify automatic classification against analyst labeling. The processing steps and the main algorithms involved are discussed, and an outlook on the future of this technology is also presented.

  18. Design review - A tool for all seasons.

    NASA Technical Reports Server (NTRS)

    Liberman, D. S.

    1972-01-01

    The origins of design review are considered together with questions of definitions. The main characteristics which distinguish the concept of design review discussed from the basic master-apprentice relationship include competence, objectivity, formality, and a systematic approach. Preliminary, major, and final reviews are the steps used in the management of the design and development process in each company. It is shown that the design review is generically a systems engineering milestone review with certain unique characteristics.

  19. Complete nitrogen removal from municipal wastewater via partial nitrification by appropriately alternating anoxic/aerobic conditions in a continuous plug-flow step feed process.

    PubMed

    Ge, Shijian; Peng, Yongzhen; Qiu, Shuang; Zhu, Ao; Ren, Nanqi

    2014-05-15

    This study assessed the technical feasibility of removing nitrogen from municipal wastewater by partial nitrification (nitritation) in a continuous plug-flow step feed process. Nitrite in the effluent accumulated to 81.5 ± 9.2% but disappeared when process operation shifted from the anoxic/oxic mode to the anaerobic/anoxic/oxic mode. Batch tests showed clear stimulation of ammonia oxidizing bacteria (AOB) (an increased ammonia oxidation rate) and inhibition of nitrite oxidizing bacteria (NOB) (a reduced nitrite oxidation rate) under transient anoxic conditions. Two main factors contributed to nitritation in this continuous plug-flow process: one was the alternating anoxic and oxic operational conditions; the other was the step feed strategy, which guaranteed timely denitrification in the anoxic zones, reducing the energy supply (nitrite) available to NOB. Fluorescence in situ hybridization and quantitative real-time polymerase chain reaction analyses indicated that the NOB population gradually decreased to 1.0 ± 0.1% of the total bacterial population (dominated by Nitrospira spp., 1.55 × 10^9 copies/L) while AOB approximately doubled (7.4 ± 0.9%, 1.25 × 10^10 copies/L) during the above anoxic-to-anaerobic transition. Most importantly, without addition of external carbon sources, the process reached 86.0 ± 4.2% total nitrogen (TN) removal with only 7.23 ± 2.31 mg/L of TN in the effluent, which met the discharge requirements. Copyright © 2014 Elsevier Ltd. All rights reserved.

  20. One-Step Preservation and Decalcification of Bony Tissue for Molecular Profiling.

    PubMed

    Mueller, Claudius; Harpole, Michael G; Espina, Virginia

    2017-01-01

    Bone metastasis from primary cancer sites creates diagnostic and therapeutic challenges. Calcified bone is difficult to biopsy due to tissue hardness and patient discomfort, thus limiting the frequency and availability of bone/bone marrow biopsy material for molecular profiling. In addition, bony tissue must be demineralized (decalcified) prior to histomorphologic analysis. Decalcification processes rely on three main principles: (a) solubility of calcium salts in an acid, such as formic or nitric acid; (b) calcium chelation with ethylenediaminetetraacetic acid (EDTA); or (c) ion-exchange resins in a weak acid. A major roadblock in molecular profiling of bony tissue has been the lack of a suitable demineralization process that preserves the histomorphology of calcified and soft tissue elements while also preserving phosphoproteins and nucleic acids. In this chapter, we describe general issues relevant to specimen collection and preservation of osseous tissue for molecular profiling. We provide two protocols: (a) one-step preservation of tissue histomorphology, proteins, and posttranslational modifications, with simultaneous decalcification of the bony tissue, and (b) ethanol-based tissue processing for TheraLin-fixed bony tissue.

  1. Estimation of water quality parameters of inland and coastal waters with the use of a toolkit for processing of remote sensing data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dekker, A.G.; Hoogenboom, H.J.; Rijkeboer, M.

    1997-06-01

    Deriving thematic maps of water quality parameters from a remote sensing image requires a number of processing steps, such as calibration, atmospheric correction, air/water interface correction, and application of water quality algorithms. A prototype software environment has recently been developed that enables the user to perform and control these processing steps. Its main parts are: (i) access to the MODTRAN 3 radiative transfer code for removing atmospheric and air-water interface influences, (ii) a tool for analyzing algorithms for estimating water quality, and (iii) a spectral database containing apparent and inherent optical properties and associated water quality parameters. The use of the software is illustrated by applying the implemented chlorophyll algorithms to data from a spectral library of Dutch inland waters with CHL ranging from 1 to 500 µg l⁻¹. The algorithms currently implemented in the Toolkit software are recommended for optically simple waters; for optically complex waters, development of more advanced retrieval methods is required.
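
The "water quality algorithm" step above can be illustrated with a generic two-band chlorophyll ratio regression calibrated against a spectral library. This is a hypothetical sketch with synthetic data; the band choice (a common red/NIR ratio form), the coefficients, and the data values are all invented and are not the toolkit's:

```python
import numpy as np

# Hypothetical two-band chlorophyll algorithm of the common form
# CHL = a * (R(705) / R(670)) + b, calibrated against a synthetic
# 'spectral library' (all numbers below are invented for illustration).
rng = np.random.default_rng(0)
ratio = np.linspace(0.7, 8.0, 50)                    # band ratio per sample
chl = 61.3 * ratio - 37.7 + rng.normal(0, 2.0, 50)   # 'measured' CHL, ug/l

a, b = np.polyfit(ratio, chl, 1)                     # calibrate the algorithm
pred = a * ratio + b
rmse = float(np.sqrt(np.mean((pred - chl) ** 2)))
print(f"CHL = {a:.1f} * ratio + {b:.1f}, RMSE = {rmse:.2f} ug/l")
```

In a toolkit setting, such a regression would be re-fitted per optical water type, and the RMSE against the spectral library is the natural figure of merit for comparing candidate algorithms.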

  2. Collaborative partnership in age-friendly cities: two case studies from Quebec, Canada.

    PubMed

    Garon, Suzanne; Paris, Mario; Beaulieu, Marie; Veil, Anne; Laliberté, Andréanne

    2014-01-01

    This article aims to explain the collaborative partnership conditions and factors that foster implementation effectiveness within the age-friendly cities (AFC) in Quebec (AFC-QC), Canada. Based on a community-building approach that emphasizes collaborative partnership, the AFC-QC implementation process is divided into three steps: (1) a social diagnostic of older adults' needs; (2) an action plan based on a logic model; and (3) implementation through collaborations. AFC-QC promotes direct involvement of older adults and seniors' associations at each of the three steps of the implementation process, as well as of other stakeholders in the community. Based on two contrasting case studies, this article illustrates the importance of collaborative partnership for the success of AFC implementation. Results show that stakeholders, agencies, and organizations are exposed to a new form of governance where coordination and collaborative partnership among members of the steering committee are essential. Furthermore, despite the importance of the seniors' associations' participation in the process, their capacity to implement age-friendly environments solely by themselves proved significantly limited. In conclusion, we identify the main collaborative partnership conditions and factors in AFC-QC.

  3. Steam-blanched highbush blueberry (Vaccinium corymbosum L.) juice: phenolic profile and antioxidant capacity in relation to cultivar selection.

    PubMed

    Brambilla, Ada; Lo Scalzo, Roberto; Bertolo, Gianni; Torreggiani, Danila

    2008-04-23

    High-quality standards in blueberry juice can be met only by taking into account fruit compositional variability and preserving it along the processing chain. In this work, five highbush blueberry cultivars grown under the same environmental conditions were individually processed into juice after an initial blanching step, and the influence of the cultivar on juice phenolic content and distribution and on the relative antioxidant activity, measured as scavenging capacity toward the artificial free radical 2,2-diphenyl-1-picrylhydrazyl (DPPH*), was studied. A chromatographic protocol was developed to separate all the main phenolic compounds in berries. A total of 15 glycosylated anthocyanins, catechin, the galactoside, glucoside, and rhamnoside quercetin 3-derivatives, and the main benzoic and cinnamic acids were identified. The total content and relative distribution of anthocyanins, chlorogenic acid, and quercetin in each juice were cultivar dependent, and the total content was highly correlated (r_xy = 0.97) with the antioxidant capacity. A selective protective effect of berry blanching on the more labile anthocyanin compounds was observed in juice processing.

  4. View synthesis using parallax invariance

    NASA Astrophysics Data System (ADS)

    Dornaika, Fadi

    2001-06-01

    View synthesis has become a focus of attention in both the computer vision and computer graphics communities. It consists of creating novel images of a scene as it would appear from novel viewpoints. View synthesis can be used in a wide variety of applications such as video compression, graphics generation, virtual reality and entertainment. This paper addresses the following problem: given a dense disparity map between two reference images, synthesize a novel view of the same scene associated with a novel viewpoint. Most existing work relies on building a set of 3D meshes which are then projected onto the new image (the rendering is performed using texture mapping). The advantages of our view synthesis approach are as follows. First, the novel view is specified by a rotation and a translation, which are the most natural way to express the virtual location of the camera. Second, the approach is able to synthesize highly realistic images whose viewing position is significantly far from the reference viewpoints. Third, the approach handles the visibility problem during synthesis. Our framework has two main steps. The first (analysis) step consists of computing the homography at infinity, the epipoles, and thus the parallax field associated with the reference images. The second (synthesis) step consists of warping the reference image into a new one, based on the invariance of the computed parallax field. The analysis step works directly on the reference views and needs to be performed only once. Examples of synthesizing novel views using either feature correspondences or a dense disparity map have demonstrated the feasibility of the proposed approach.
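
The synthesis step built on a homography at infinity, epipoles, and a parallax field is commonly written as the plane-plus-parallax point transfer p' ~ H∞·p + γ·e'. A minimal sketch of that transfer equation follows; the matrix, epipole, and parallax values are made-up toy numbers, not quantities from the paper:

```python
import numpy as np

def transfer_point(p, H_inf, epipole, gamma):
    """Plane-plus-parallax transfer: p' ~ H_inf @ p + gamma * e',
    where gamma is the point's parallax relative to the reference plane."""
    q = H_inf @ p + gamma * epipole
    return q / q[2]                     # normalize homogeneous coordinates

# Toy numbers (all hypothetical): identity infinite-homography, small epipole.
H_inf = np.eye(3)
e_prime = np.array([100.0, 0.0, 1e-3])
p = np.array([320.0, 240.0, 1.0])       # a reference pixel (homogeneous)

print(transfer_point(p, H_inf, e_prime, gamma=0.0))  # no parallax: unchanged
print(transfer_point(p, H_inf, e_prime, gamma=1.0))  # shifted toward epipole
```

Warping a whole image applies this transfer per pixel using the dense parallax (disparity-derived) field, with visibility resolved by the ordering of the warped pixels.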

  5. Framework for Service Composition in G-Lite

    NASA Astrophysics Data System (ADS)

    Goranova, R.

    2011-11-01

    G-Lite is a Grid middleware, currently the main middleware installed on all clusters in Bulgaria. It is used by scientists for solving problems that require large amounts of storage and computational resources. On the other hand, scientists work with complex processes in which job execution in the Grid is just one step. It is therefore strategically important for g-Lite to provide a mechanism for service composition and business process management; such a mechanism has not yet been specified. In this article we propose a framework for service composition in g-Lite and discuss business process modeling, deployment and execution in this Grid environment. The examples used to demonstrate the concept are based on some IBM products.

  6. Further theoretical insight into the reaction mechanism of the hepatitis C NS3/NS4A serine protease

    NASA Astrophysics Data System (ADS)

    Martínez-González, José Ángel; Rodríguez, Alex; Puyuelo, María Pilar; González, Miguel; Martínez, Rodrigo

    2015-01-01

    The main reactions of the hepatitis C virus NS3/NS4A serine protease are studied using the second-order Møller-Plesset ab initio method and rather large basis sets to correct the previously reported AM1/CHARMM22 potential energy surfaces. The reaction efficiencies measured for the different substrates are explained in terms of the tetrahedral-intermediate formation step (the rate-limiting process). The energies of the barrier and the corresponding intermediate are so close that the possibility of a concerted mechanism remains open (especially for the NS5A/5B substrate). This contrasts with the suggested general reaction mechanism of serine proteases, where a two-step mechanism is postulated.

  7. Using Cellular Proteins to Reveal Mechanisms of HIV Infection | Center for Cancer Research

    Cancer.gov

    A vital step in HIV infection is the insertion of viral DNA into the genome of the host cell. In order for the insertion to occur, viral nucleic acid must be transported through the membrane that separates the main cellular compartment (the cytoplasm) from the nucleus, where the host DNA is located. Scientists are actively studying the mechanism used to transport viral DNA into the nucleus in the hopes of targeting this step with future anti-HIV treatments. Up to this point, researchers have identified some of the viral components that play a role in nuclear transport, but they have not determined how viral interactions with other molecules in the cell contribute to the process.

  8. Solution of the 2-D steady-state radiative transfer equation in participating media with specular reflections using SUPG and DG finite elements

    NASA Astrophysics Data System (ADS)

    Le Hardy, D.; Favennec, Y.; Rousseau, B.

    2016-08-01

    The 2D radiative transfer equation coupled with specular reflection boundary conditions is solved using finite element schemes. Both Discontinuous Galerkin (DG) and Streamline-Upwind Petrov-Galerkin (SUPG) variational formulations are fully developed. The two schemes are validated step-by-step for all the operators involved (transport, scattering, reflection) using analytical formulations. Numerical comparison of the two schemes in terms of convergence rate reveals that the quadratic SUPG scheme is efficient for solving such problems; this comparison constitutes the main contribution of the paper. Moreover, the solution process is accelerated using block SOR-type iterative methods, for which the optimal parameter is determined in a very cheap way.
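
The SOR acceleration mentioned above reduces, in its simplest scalar form, to the classic successive over-relaxation sweep with a tunable parameter ω. A minimal sketch on a 1-D model problem follows (a stand-in system, not the paper's block SOR applied to the discretized radiative transfer equation), including a crude scan for ω:

```python
import numpy as np

def sor(A, b, omega, iters, x0=None):
    """One-unknown-at-a-time SOR sweep:
    x_i <- (1 - omega) * x_i + omega * (b_i - sum_{j != i} a_ij x_j) / a_ii."""
    x = np.zeros(len(b)) if x0 is None else x0.copy()
    for _ in range(iters):
        for i in range(len(b)):
            s = A[i] @ x - A[i, i] * x[i]          # off-diagonal contribution
            x[i] = (1 - omega) * x[i] + omega * (b[i] - s) / A[i, i]
    return x

# 1-D Laplacian model problem (symmetric, diagonally dominant).
n = 20
A = 2 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)
b = np.ones(n)
x_exact = np.linalg.solve(A, b)

# Crude scan over omega; for this matrix the optimum is close to 1.74,
# and over-relaxation clearly beats plain Gauss-Seidel (omega = 1).
errs = {w: float(np.linalg.norm(sor(A, b, w, iters=60) - x_exact))
        for w in (1.0, 1.5, 1.74)}
best = min(errs, key=errs.get)
print(best)
```

Block SOR applies the same relaxation idea to blocks of unknowns at a time (e.g., per ordinate direction), but the role of the parameter ω is the same.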

  9. Eutrophication of lakes and reservoirs: A framework for making management decisions

    USGS Publications Warehouse

    Rast, W.; Holland, M.

    1988-01-01

    The development of management strategies for the protection of environmental quality usually involves consideration of both technical and nontechnical issues. A logical, step-by-step framework for the development of such strategies is provided, and its application to the control of cultural eutrophication of lakes and reservoirs illustrates its potential usefulness. From the perspective of the policymaker, the main consideration is that the eutrophication-related water quality of a lake or reservoir can be managed for given water uses. The approach presented here allows the rational assessment of relevant water-quality parameters and the establishment of water-quality goals, the consideration of social and other nontechnical issues, public involvement in the decision-making process, and a reasonable economic analysis, all within a management framework.

  10. A multilevel-skin neighbor list algorithm for molecular dynamics simulation

    NASA Astrophysics Data System (ADS)

    Zhang, Chenglong; Zhao, Mingcan; Hou, Chaofeng; Ge, Wei

    2018-01-01

    Searching for interaction pairs and organizing the interaction processes are important steps in molecular dynamics (MD) algorithms and are critical to the overall efficiency of the simulation. Neighbor lists are widely used for these steps: a thicker skin reduces the frequency of list updating, but the gain is offset by more computation in the distance checks for particle pairs. In this paper, we propose a new neighbor-list-based algorithm with a precisely designed multilevel skin which reduces unnecessary computation of inter-particle distances. The performance advantages over traditional methods are then analyzed against the main simulation parameters on Intel CPUs and MICs (many integrated cores), and are clearly demonstrated. The algorithm can be generalized to various discrete simulations using neighbor lists.
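
The skin trade-off described above can be sketched with a plain single-level Verlet list (a simplified stand-in for illustration, not the paper's multilevel-skin algorithm): pairs within r_cut + skin are stored, and the list stays valid until some particle has moved more than skin / 2 since the last build.

```python
import numpy as np

def build_neighbor_list(pos, r_cut, skin):
    """Verlet list: keep every pair within r_cut + skin, so the list
    remains valid while no particle moves farther than skin / 2."""
    r_list = r_cut + skin
    pairs = []
    for i in range(len(pos)):
        d = np.linalg.norm(pos[i + 1:] - pos[i], axis=1)
        pairs += [(i, int(i + 1 + j)) for j in np.where(d < r_list)[0]]
    return pairs

def needs_rebuild(pos, pos_at_build, skin):
    """Rebuild once any particle has moved more than skin / 2."""
    return float(np.linalg.norm(pos - pos_at_build, axis=1).max()) > skin / 2

# Three particles on a line; interaction cutoff 1.0, skin 0.3.
pos = np.array([[0.0, 0.0, 0.0], [0.9, 0.0, 0.0], [2.5, 0.0, 0.0]])
pairs = build_neighbor_list(pos, r_cut=1.0, skin=0.3)
moved = pos + np.array([0.1, 0.0, 0.0])      # everyone drifts 0.1 in x
print(pairs, needs_rebuild(moved, pos, skin=0.3))
```

A thicker skin makes `needs_rebuild` trigger less often but enlarges the candidate list that the force loop must distance-check; the paper's multilevel skin is aimed exactly at cutting that extra distance-check cost.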

  11. Automatic Registration of GF4 Pms: a High Resolution Multi-Spectral Sensor on Board a Satellite on Geostationary Orbit

    NASA Astrophysics Data System (ADS)

    Gao, M.; Li, J.

    2018-04-01

    Geometric correction is an important preprocessing step in the application of GF4 PMS imagery. Geometric correction based on the manual selection of ground control points is time-consuming and laborious. The more common method, based on a reference image, is automatic image registration, which involves several steps and parameters. For the multi-spectral sensor GF4 PMS, it is necessary to identify the best combination of parameters and steps. This study mainly focuses on the following issues: the necessity of Rational Polynomial Coefficients (RPC) correction before automatic registration, the choice of base band in the automatic registration, and the configuration of the GF4 PMS spatial resolution.

  12. How to assess extreme weather impacts - case European transport network

    NASA Astrophysics Data System (ADS)

    Leviäkangas, P.

    2010-09-01

    Assessing the impacts of climate change and preparing for them is a process, one that we must understand and learn to apply. EWENT (Extreme Weather impacts on European Networks of Transport) will be a test bench for one prospective approach. It has the following main components: 1) identifying what is "extreme"; 2) assessing the change in probabilities; 3) constructing causal impact models; 4) finding appropriate methods of pricing and costing; 5) finding alternative strategy options; 6) assessing the efficiency of the strategy options. This process in fact follows the steps of a standardized risk management process. Each step is challenging, but if the EWENT project succeeds in assessing the extreme weather impacts on European transport networks, it will be a possible benchmark for carrying out similar analyses in other regions and at the country level. The EWENT approach could be particularly useful for weather and climate information service providers, offering transport authorities and financiers tools to assess weather risks and then manage those risks rationally. The EWENT project is financed by the European Commission with the participation of met-service organisations and transport research institutes from different parts of Europe. The presentation will explain the EWENT approach in detail and present the findings of the first work packages.

  13. [Internal audit in medical laboratory: what means of control for an effective audit process?].

    PubMed

    Garcia-Hejl, Carine; Chianéa, Denis; Dedome, Emmanuel; Sanmartin, Nancy; Bugier, Sarah; Linard, Cyril; Foissaud, Vincent; Vest, Philippe

    2013-01-01

    To prepare for the French Accreditation Committee (COFRAC) visit for the initial accreditation of our medical laboratory, our management evaluated its quality management system (QMS) and all its technical activities. This evaluation was performed by means of an internal audit, which was outsourced. The auditors had expertise in auditing and a thorough knowledge of biological standards, and were independent. Several nonconformities were identified at that time, including a lack of control of several steps of the internal audit process. Hence, the necessary corrective actions were taken in order to meet the requirements of the standards; in particular, the formalization of all stages, from the audit program to the implementation, review and follow-up of the corrective actions taken, and the provision of the resources needed to carry out audits on a pre-established schedule. To ensure optimal control of each step, the main concepts of risk management were applied: the process approach, root cause analysis, and failure modes, effects and criticality analysis (FMECA). After a critical analysis of our practices, this methodology allowed us to define our "internal audit" process, then to formalize it and follow it up, supported by a complete documentation system.

  14. [Near infrared spectroscopy based process trajectory technology and its application in monitoring and controlling of traditional Chinese medicine manufacturing process].

    PubMed

    Li, Wen-Long; Qu, Hai-Bin

    2016-10-01

    In this paper, the principle of NIRS (near infrared spectroscopy)-based process trajectory technology is introduced. The main steps of the technique are: (1) in-line collection of process spectra across the different process stages; (2) unfolding of the 3-D process spectra; (3) determination of the process trajectories and their normal limits; and (4) monitoring of new batches with the established MSPC (multivariate statistical process control) models. Applications of the technology to chemical and biological medicines are reviewed briefly. Through a comprehensive introduction of our feasibility research on monitoring traditional Chinese medicine manufacturing processes using NIRS-based multivariate process trajectories, several important practical problems needing urgent solutions are identified, and the application prospects of the NIRS-based process trajectory technology are discussed. Copyright© by the Chinese Pharmaceutical Association.
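
The four steps above can be sketched end-to-end on synthetic data. Everything numeric here is an illustrative assumption (batch counts, the two-component PCA model, and a simple 3-sigma limit on the Q/SPE statistic), not the paper's settings:

```python
import numpy as np

rng = np.random.default_rng(1)

# Step 1: in-line spectra of 20 normal batches, 50 time points x 10 wavelengths.
X = rng.normal(0.0, 0.01, (20, 50, 10)) + np.linspace(0, 1, 50)[None, :, None]

# Step 2: batch-wise unfolding of the 3-D array to 2-D (batches x variables).
Xu = X.reshape(20, -1)
mu, sd = Xu.mean(axis=0), Xu.std(axis=0) + 1e-12
Z = (Xu - mu) / sd

# Step 3: PCA model of normal operation and a 3-sigma control limit on the
# Q (squared prediction error) statistic of the training batches.
_, _, Vt = np.linalg.svd(Z, full_matrices=False)
P = Vt[:2]                                        # loadings, two components
Q_train = ((Z - (Z @ P.T) @ P) ** 2).sum(axis=1)
Q_limit = float(Q_train.mean() + 3 * Q_train.std())

# Step 4: monitor a faulty batch (an offset appears in the late time points).
bad = rng.normal(0.0, 0.01, (50, 10)) + np.linspace(0, 1, 50)[:, None]
bad[30:] += 0.5
z_new = (bad.reshape(-1) - mu) / sd
Q_new = float(((z_new - (z_new @ P.T) @ P) ** 2).sum())
print(Q_new > Q_limit)   # fault flagged -> True
```

In practice the trajectory limits are usually time-resolved rather than a single per-batch Q limit, but the unfold-model-monitor structure is the same.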

  15. Identification of column edges of DNA fragments by using K-means clustering and mean algorithm on lane histograms of DNA agarose gel electrophoresis images

    NASA Astrophysics Data System (ADS)

    Turan, Muhammed K.; Sehirli, Eftal; Elen, Abdullah; Karas, Ismail R.

    2015-07-01

    Gel electrophoresis (GE) is one of the most widely used methods to separate DNA, RNA and protein molecules according to size, weight and quantity in many areas such as genetics, molecular biology, biochemistry and microbiology. The main way to separate the molecules is to find the borders of each molecule fragment. This paper presents a software application that shows the column edges of DNA fragments in three steps. In the first step, the application obtains lane histograms of agarose gel electrophoresis images by projection onto the x-axis. In the second step, it uses the k-means clustering algorithm to classify the point values of the lane histogram into left-side values, right-side values and undesired values. In the third step, the column edges of the DNA fragments are extracted using a mean algorithm and mathematical operations that separate the DNA fragments from the background in a fully automated way. In addition, the application reports the locations of the DNA fragments and how many DNA fragments exist in images captured by a scientific camera.
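
The three-step procedure can be sketched as follows. This is a minimal, hypothetical version on a synthetic image: a real implementation would also need background correction and the finer left-side/right-side/undesired three-cluster split described in the paper.

```python
import numpy as np

def lane_histogram(gel_image):
    """Step 1: project the image onto the x-axis to get the lane histogram."""
    return gel_image.sum(axis=0)

def kmeans_1d(values, k, iters=50):
    """Step 2: tiny 1-D k-means; centers start spread over the value range."""
    centers = np.linspace(values.min(), values.max(), k)
    for _ in range(iters):
        labels = np.argmin(np.abs(values[:, None] - centers[None, :]), axis=1)
        for j in range(k):
            if np.any(labels == j):
                centers[j] = values[labels == j].mean()
    return labels, centers

def column_edges(labels, lane_label):
    """Step 3: first and last columns assigned to the lane cluster."""
    cols = np.where(labels == lane_label)[0]
    return int(cols.min()), int(cols.max())

# Toy 'gel': one bright lane spanning columns 10..19 of a 50 x 40 image.
img = np.zeros((50, 40))
img[:, 10:20] = 1.0
hist = lane_histogram(img)
labels, centers = kmeans_1d(hist, k=2)
lane = int(np.argmax(centers))        # brighter cluster = lane interior
print(column_edges(labels, lane))     # -> (10, 19)
```

Counting the separated clusters of lane columns, rather than just the min/max of one cluster, gives the number and locations of fragments across the full image.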

  16. Frequency analysis of a step dynamic pressure calibrator.

    PubMed

    Choi, In-Mook; Yang, Inseok; Yang, Tae-Heon

    2012-09-01

    A dynamic high-pressure standard is becoming more essential in the fields of mobile engines, space science, and especially defense areas such as long-range missile development. However, a complication arises when a dynamic high-pressure sensor is compared with a reference dynamic pressure gauge calibrated in static mode. It is also difficult to determine a reference dynamic pressure signal from the calibrator, because a dynamic high-pressure calibrator generates unwanted oscillations in a positive-going pressure step method. A dynamic high-pressure calibrator using a quick-opening ball valve generates a fast step pressure change within 1 ms; however, it also generates a large impulse force that can shorten the lifetime of the system and produce oscillations in the response of the dynamic sensor to be calibrated. In this paper, the unwanted resonant frequencies beyond those of the step function are characterized using frequency analysis, and the main sources of resonance are described. Post-processing results obtained with a filter to remove the unwanted frequencies are given, and a modification of the dynamic calibration system is proposed.
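
The frequency-analysis-plus-filtering idea can be sketched on a simulated step with superimposed ringing. The numbers here (an 8 kHz resonance, 100 kHz sampling, a 4 kHz cutoff) are invented for illustration and do not come from the paper:

```python
import numpy as np

fs = 100_000                         # sampling rate, Hz (assumed)
t = np.arange(0, 0.01, 1 / fs)       # 10 ms record, 1000 samples
step = (t >= 0.001).astype(float)    # ideal pressure step at t = 1 ms
signal = step + 0.2 * np.sin(2 * np.pi * 8000 * t) * step  # 8 kHz ringing

# Frequency analysis: locate the unwanted resonance above the step's band.
spec = np.abs(np.fft.rfft(signal - signal.mean()))
freqs = np.fft.rfftfreq(len(signal), 1 / fs)
hi = freqs > 4000
resonance = freqs[hi][np.argmax(spec[hi])]

# Post-processing: remove it with a frequency-domain low-pass at 4 kHz.
F = np.fft.rfft(signal)
F[freqs > 4000] = 0.0
cleaned = np.fft.irfft(F, n=len(signal))
print(int(resonance))
```

A brick-wall cutoff like this introduces some ringing of its own at the step edge; a real calibrator analysis would use a smoother filter, but the identify-then-suppress workflow is the same.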

  18. Study on characteristics of printed circuit board liberation and its crushed products.

    PubMed

    Quan, Cui; Li, Aimin; Gao, Ningbo

    2012-11-01

    Recycling printed circuit board waste (PCBW) is a hot issue in environmental protection and resource recycling. Mechanical and thermo-chemical methods are the two traditional recycling processes for PCBW. In the present research, a two-step crushing process combining a coarse-crushing step and a fine-pulverizing step was adopted, and the crushed products were classified into seven size fractions with a standard sieve. The liberation and particle shape in the different size fractions were observed. Properties of the different size fractions, such as heating value and thermogravimetric, proximate, ultimate and chemical analyses, were determined. The Rosin-Rammler model was applied to analyze the particle size distribution of the crushed material. The results indicated that complete liberation of metals from the PCBW was achieved at sizes less than 0.59 mm, but the nonmetal particles in the smaller-than-0.15 mm fraction are liable to aggregate. Copper was the most prominent metal in PCBW and was mainly enriched in the 0.25-0.42 mm size fraction. The Rosin-Rammler equation adequately fitted the particle size distribution data of crushed PCBW, with a correlation coefficient of 0.9810. The heating value and proximate analysis revealed that the PCBW had a low heating value and high ash content. The combustion and pyrolysis processes of PCBW differed, and there was an obvious oxidation peak of Cu in the combustion runs.
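
A Rosin-Rammler fit of the kind used above can be reproduced with a short linearized least-squares sketch. The sieve sizes and parameters below are synthetic, not the paper's measurements:

```python
import numpy as np

def fit_rosin_rammler(d, R):
    """Fit the Rosin-Rammler oversize distribution R(d) = exp(-(d/de)**n)
    by linear regression on ln(-ln(R)) = n*ln(d) - n*ln(de)."""
    x = np.log(d)
    y = np.log(-np.log(R))
    n, b = np.polyfit(x, y, 1)          # slope n, intercept -n*ln(de)
    de = np.exp(-b / n)
    r = np.corrcoef(x, y)[0, 1]         # correlation of the linearized fit
    return float(n), float(de), float(r)

# Synthetic crushed-product data generated from n = 1.2, de = 0.5 mm.
d = np.array([0.15, 0.25, 0.42, 0.59, 0.83, 1.0])   # sieve sizes, mm
R = np.exp(-(d / 0.5) ** 1.2)                       # mass fraction retained
n, de, r = fit_rosin_rammler(d, R)
print(round(n, 2), round(de, 2), round(r, 4))
```

With real sieve data, R is the measured cumulative mass fraction retained on each sieve, and the reported correlation coefficient (0.9810 in the paper) is exactly this r of the linearized fit.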

  19. Extraction and purification methods in downstream processing of plant-based recombinant proteins.

    PubMed

    Łojewska, Ewelina; Kowalczyk, Tomasz; Olejniczak, Szymon; Sakowicz, Tomasz

    2016-04-01

    During the last two decades, the production of recombinant proteins in plant systems has received increasing attention. Proteins are currently considered the most important biopharmaceuticals. However, high costs and problems with scaling up the purification and isolation processes make the production of plant-based recombinant proteins a challenging task. This paper summarizes the information regarding downstream processing in plant systems and provides a comprehensive overview of its key steps, such as extraction and purification. To highlight recent progress, mainly new developments in downstream technology have been chosen. Furthermore, besides the most popular techniques, alternative methods are described. Copyright © 2015 Elsevier Inc. All rights reserved.

  20. Oligodendrogenesis in the normal and pathological central nervous system

    PubMed Central

    El Waly, Bilal; Macchi, Magali; Cayre, Myriam; Durbec, Pascale

    2014-01-01

    Oligodendrocytes (OLGs) are generated late in development, and myelination is thus a late event in the brain developmental process. It is, however, maintained throughout life at a lower rate, and the myelin sheath is crucial for proper signal transmission and neuronal survival. Unfortunately, OLGs are highly susceptible to oxidative stress, so demyelination often takes place secondary to diverse brain lesions or pathologies. OLGs can also be the target of immune attacks, leading to primary demyelination lesions. Following oligodendrocytic death, spontaneous remyelination may occur to a certain extent. In this review, we mainly focus on the adult brain and on the two main sources of progenitor cells that contribute to oligodendrogenesis: parenchymal oligodendrocyte precursor cells (OPCs) and subventricular zone (SVZ)-derived progenitors. We briefly revisit the main steps of oligodendrogenesis in the postnatal and adult brain and summarize the key factors involved in the determination of oligodendrocytic fate. We then shed light on the main causes of demyelination in the adult brain and present the animal models that have been developed to gain insight into the demyelination/remyelination process. Finally, we synthesize the results of studies searching for factors able to modulate spontaneous myelin repair. PMID:24971048

  1. Breastfeeding Practices and Barriers to Implementing the Ten Steps to Successful Breastfeeding in Mississippi Hospitals.

    PubMed

    Alakaam, Amir; Lemacks, Jennifer; Yadrick, Kathleen; Connell, Carol; Choi, Hwanseok Winston; Newman, Ray G

    2018-05-01

    Mississippi has the lowest rates of breastfeeding in the United States at 6 and 12 months. There is growing evidence that the rates and duration of infant breastfeeding improve after hospitals implement the Ten Steps to Successful Breastfeeding; moreover, the Ten Steps approach is considered the standard model for evaluating breastfeeding practices in birthplaces. Research aim: This study aimed to examine the level of implementation of the Ten Steps and to identify barriers to implementing them in Mississippi hospitals. A cross-sectional self-report survey was used. Nurse managers of the birthing and maternity units of all 43 Mississippi hospitals that provided birthing and maternity care were recruited, yielding a response rate of 72% (N = 31). Implementation of the Ten Steps in these hospitals was categorized as low, partial, moderate, or high; implementation was classified as moderate in 29% of hospitals and as partial in 71%. The level of implementation was significantly positively associated with the hospital delivery rate and with the hospital cesarean section rate per year. The main barriers reported for implementing the Ten Steps were resistance to new policies, limited financial and human resources, and lack of support from national and state governments. Breastfeeding practices in Mississippi hospitals need to be improved, and new policies are needed to encourage Mississippi hospitals to adopt the Ten Steps policies and practices in their maternity and birthing units.

  2. Strategies for Stabilizing Nitrogenous Compounds in ECLSS Wastewater: Top-Down System Design and Unit Operation Selection with Focus on Bio-Regenerative Processes for Short and Long Term Scenarios

    NASA Technical Reports Server (NTRS)

    Lunn, Griffin M.

    2011-01-01

    Water recycling and eventual nutrient recovery are crucial for surviving in or beyond low Earth orbit. New approaches and system architecture considerations need to be addressed to meet current and future system requirements. This paper proposes a flexible system architecture that breaks pretreatment down into discrete areas where multiple unit operations can be considered. An overview focusing on the urea and ammonia conversion steps allows an analysis of each process's strengths and weaknesses and its synergy with upstream and downstream processing. Process technologies covered include chemical pretreatment, biological urea hydrolysis, chemical urea hydrolysis, combined nitrification-denitrification, nitrate nitrification, anammox denitrification, and regenerative ammonia absorption through struvite formation. Biological processes are considered mainly for their ability both to maximize water recovery and to produce nutrients for future plant systems. Unit operations can be evaluated against traditional equivalent-system-mass requirements in the near term, or by what they can provide downstream in the form of usable chemicals or nutrients for the long-term closed-loop ecological control and life support system. Optimally, this would allow a system to meet the former while supporting the latter without major modification.

  3. The complexities of hydrolytic enzymes from the termite digestive system.

    PubMed

    Saadeddin, Anas

    2014-06-01

    The main challenge in second-generation bioethanol production is the efficient breakdown of cellulose to sugar monomers (hydrolysis). Due to the recalcitrant character of cellulose, feedstock pretreatment and adapted hydrolysis steps are needed to obtain fermentable sugar monomers. The conventional industrial production process for second-generation bioethanol from biomass comprises several steps: thermochemical pretreatment, enzymatic hydrolysis, and sugar fermentation. This process is undergoing continuous optimization in order to increase the bioethanol yield and reduce the economic cost. Therefore, the discovery of new enzymes with high lignocellulolytic activity, or of new strategies, is extremely important. In nature, wood-feeding termites have developed a sophisticated and efficient cellulose-degrading system in terms of the rate and extent of cellulose hydrolysis and exploitation. This system, which represents a model of digestive symbiosis, has attracted the attention of biofuel researchers. This review describes the termite digestive system, gut symbionts, termite enzyme resources, in vitro studies of isolated enzymes, and lignin degradation in termites.

  4. One-pot aldol condensation and hydrodeoxygenation of biomass-derived carbonyl compounds for biodiesel synthesis.

    PubMed

    Faba, Laura; Díaz, Eva; Ordóñez, Salvador

    2014-10-01

    Integrating reaction steps is of key interest in the development of processes for transforming lignocellulosic materials into drop-in fuels. We propose a procedure for performing the aldol condensation (the reaction between furfural and acetone is taken as a model reaction) and the total hydrodeoxygenation of the resulting condensation adducts in one step, yielding n-alkanes. Different combinations of catalysts (bifunctional catalysts or mechanical mixtures), reaction conditions, and solvents (aqueous and organic) have been tested for performing these reactions in an isothermal batch reactor. The results suggest that the use of bifunctional catalysts in the aqueous phase leads to an effective integration of both reactions: selectivities to n-alkanes higher than 50% were obtained with this catalyst at typical hydrogenation conditions (T=493 K, P=4.5 MPa, 24 h reaction time). The use of organic solvents, carbonaceous supports, or mechanical mixtures of monofunctional catalysts leads to poorer results owing to side effects, mainly hydrogenation of reactants and adsorption processes. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  5. [Steps to transform a necessity into a validated and useful screening tool for early detection of developmental problems in Mexican children].

    PubMed

    Rizzoli-Córdoba, Antonio; Delgado-Ginebra, Ismael

    A screening test is an instrument whose primary function is to identify individuals with a probable disease among an apparently healthy population, establishing risk or suspicion of disease. Caution must be taken when using a screening tool in order to avoid unrealistic measurements, which would delay an intervention for those who may benefit from it. Before introducing a screening test into clinical practice, it is necessary to verify that it possesses certain characteristics that make it useful. This "certification" process is called validation. The main objective of this paper is to describe the different steps that must be taken from the identification of a need for early detection through the generation of a validated and reliable screening tool, using as an example the process followed for the modified version of the Child Development Evaluation Test (CDE or Prueba EDI) in Mexico. Copyright © 2015 Hospital Infantil de México Federico Gómez. Publicado por Masson Doyma México S.A. All rights reserved.

  6. Vibration study of a vehicle suspension assembly with the finite element method

    NASA Astrophysics Data System (ADS)

    Cătălin Marinescu, Gabriel; Castravete, Ştefan-Cristian; Dumitru, Nicolae

    2017-10-01

    The main steps of the present work constitute a methodology for analysing various vibration effects on the suspension mechanical parts of a vehicle. A McPherson-type suspension from an existing vehicle was modelled using CAD software. Using the CAD model as input, a finite element model of the suspension assembly was developed. The Abaqus finite element analysis software was used to pre-process, solve, and post-process the results. Geometric nonlinearities are included in the model, as are severe sources of nonlinearity such as friction and contact. The McPherson spring is modelled as a linear spring. The analysis includes several steps: preload, modal analysis, reduction of the model to 200 generalized coordinates, a deterministic external excitation, and a random excitation representing different types of roads. The vibration data used as input for the simulation were previously obtained by experimental means. The mathematical expressions used for the simulation are also presented in the paper.
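    The modal-analysis step described above can be illustrated with a deliberately small example. The sketch below is our own illustration, not the paper's Abaqus model: it extracts the undamped natural frequencies of a 2-DOF spring-mass chain by solving det(K - w^2 M) = 0 in closed form. The masses and stiffnesses are hypothetical values.

```python
import math

def natural_frequencies_2dof(m1, m2, k1, k2):
    """Natural frequencies (rad/s) of the chain ground--k1--m1--k2--m2.

    Expanding det(K - L*M) = 0 with K = [[k1+k2, -k2], [-k2, k2]] and
    M = diag(m1, m2) gives a quadratic a*L^2 + b*L + c = 0 in L = w^2.
    """
    a = m1 * m2
    b = -(m1 * k2 + m2 * (k1 + k2))
    c = (k1 + k2) * k2 - k2 ** 2  # simplifies to k1 * k2
    disc = math.sqrt(b * b - 4 * a * c)
    lams = sorted([(-b - disc) / (2 * a), (-b + disc) / (2 * a)])
    return [math.sqrt(lam) for lam in lams]

# Hypothetical unit values: two 1 kg masses, 2 N/m and 1 N/m springs.
freqs = natural_frequencies_2dof(m1=1.0, m2=1.0, k1=2.0, k2=1.0)
```

In the full suspension model the same eigenproblem is solved numerically over many thousands of degrees of freedom before the reduction to the 200 generalized coordinates mentioned in the abstract.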

  7. From the CMS Computing Experience in the WLCG STEP'09 Challenge to the First Data Taking of the LHC Era

    NASA Astrophysics Data System (ADS)

    Bonacorsi, D.; Gutsche, O.

    The Worldwide LHC Computing Grid (WLCG) project decided in March 2009 to perform scale tests of parts of its overall Grid infrastructure before the start of LHC data taking. The "Scale Test for the Experiment Program" (STEP'09) was performed mainly in June 2009, with further selected tests in September and October 2009, and emphasized the simultaneous testing of the computing systems of all 4 LHC experiments. CMS tested its Tier-0 tape writing and processing capabilities. The Tier-1 tape systems were stress tested using the complete range of Tier-1 work-flows: transfer from Tier-0 and custody of data on tape, processing and subsequent archival, redistribution of datasets amongst all Tier-1 sites, as well as burst transfers of datasets to Tier-2 sites. The Tier-2 analysis capacity was tested using bulk analysis job submissions to backfill normal user activity. In this talk, we report on the tests performed and present their post-mortem analysis.

  8. Application of Barcoding to Reduce Error of Patient Identification and to Increase Patient's Information Confidentiality of Test Tube Labelling in a Psychiatric Teaching Hospital.

    PubMed

    Liu, Hsiu-Chu; Li, Hsing; Chang, Hsin-Fei; Lu, Mei-Rou; Chen, Feng-Chuan

    2015-01-01

    Learning from the experience of another medical center in Taiwan, Kaohsiung Municipal Kai-Suan Psychiatric Hospital has changed its nursing informatics system step by step over the past year and a half. We considered ethics in the original idea of implementing barcodes on test tube labels to process the identification of psychiatric patients. The main aims of this project are to maintain confidential information and to transport samples effectively. The primary nurses used different work sheets for this project to ensure acceptance of the new barcode system. In the past two years the errors in the blood testing process were as high as 11,000 in 14,000 events per year, resulting in wastage of resources. The actions taken by the nurses and the implementation of the new barcode system can improve clinical nursing care quality, patient safety, and efficiency, while decreasing costs due to human error.

  9. The NASA Continuous Risk Management Process

    NASA Technical Reports Server (NTRS)

    Pokorny, Frank M.

    2004-01-01

    As an intern this summer in the GRC Risk Management Office, I have become familiar with the NASA Continuous Risk Management Process. In this process, risk is considered in terms of the probability that an undesired event will occur and the impact of the event, should it occur (ref. NASA-NPG: 7120.5). Risk management belongs in every part of every project and should be ongoing from start to finish. Another key point is that a risk is not a problem until it has happened. With that in mind, there is a six-step cycle for continuous risk management that prevents risks from becoming problems. The steps are: identify, analyze, plan, track, control, and communicate & document. Incorporated in the first step are several methods to identify risks, such as brainstorming and using lessons learned. Once a risk is identified, a risk statement is made on a risk information sheet consisting of a single condition and one or more consequences. There can also be a context section where the risk is explained in more detail. Analyzing a risk has three main goals: evaluate, classify, and prioritize. Here a value is given to the attributes of a risk (i.e., probability, impact, and timeframe) based on a multi-level classification system (e.g., low, medium, high). It is important to keep in mind that the definitions of these levels are probably different for each project. Furthermore, the risks can be combined into groups. Then, the risks are prioritized to determine which risks to mitigate first. After the risks are analyzed, a plan is made to mitigate as many risks as feasible. Each risk should be assigned to someone in the project with knowledge in the area of the risk. The possible approaches to choose from are: research, accept, watch, or mitigate. Next, all risks, mitigated or not, are tracked either individually or in groups. As the plan is executed, risks are re-evaluated, and the attribute values are adjusted as necessary.
Metrics are established and monitored as tools for risk tracking, and a trigger or threshold should be set on the metric data to indicate when an action is needed. Results of this tracking are usually evaluated and reported in a relevant format at weekly or monthly meetings. Choosing controls is the subsequent step, which draws on the results of the tracking. The three basic controls are: close, continue tracking, and re-plan. Finally, communicate & document is the last step, but it occurs throughout the process. It is vital that the main risks, plans, changes, and progress are known by everyone in the project. A good way to keep everyone updated and to inform other projects of common issues is by thoroughly documenting project risks. NASA sees value in risk management and believes that projects have a greater probability of success by using the NASA Continuous Risk Management Process.
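    The analyze-and-prioritize steps described above can be sketched in code. The following is a hypothetical illustration, not NASA's actual tooling: the risk information sheet becomes a small record with probability, impact, and timeframe attributes, and the additive scoring scheme and level definitions are our own assumptions (each project defines its own).

```python
from dataclasses import dataclass

# Illustrative multi-level classification; projects define their own levels.
LEVELS = {"low": 1, "medium": 2, "high": 3}

@dataclass
class Risk:
    """One risk statement: a single condition plus a consequence."""
    condition: str
    consequence: str
    probability: str  # "low" | "medium" | "high"
    impact: str
    timeframe: str

    def score(self) -> int:
        # Hypothetical additive score over the three attributes.
        return (LEVELS[self.probability] + LEVELS[self.impact]
                + LEVELS[self.timeframe])

def prioritize(risks):
    """Sort so the highest-scoring risk is considered for mitigation first."""
    return sorted(risks, key=lambda r: r.score(), reverse=True)

risks = [
    Risk("Vendor delay", "schedule slip", "medium", "high", "medium"),
    Risk("Test rig failure", "rework", "low", "medium", "low"),
]
top = prioritize(risks)[0]  # the risk to plan for first
```

A real implementation would also attach the chosen approach (research, accept, watch, or mitigate) and the tracking metrics to each record.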

  10. Protein mass spectra data analysis for clinical biomarker discovery: a global review.

    PubMed

    Roy, Pascal; Truntzer, Caroline; Maucort-Boulch, Delphine; Jouve, Thomas; Molinari, Nicolas

    2011-03-01

    The identification of new diagnostic or prognostic biomarkers is one of the main aims of clinical cancer research. In recent years there has been growing interest in using high-throughput technologies for the detection of such biomarkers. In particular, mass spectrometry appears to be an exciting tool with great potential. However, to extract any benefit from the massive potential of clinical proteomic studies, appropriate methods, improvements, and validation are required. To better understand the key statistical points involved in such studies, this review presents the main steps of protein mass spectra data analysis, from pre-processing of the data to the identification and validation of biomarkers.
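    As a concrete illustration of the pre-processing stage mentioned above, the sketch below implements one common step: baseline estimation with a moving-minimum filter followed by subtraction. This is our own minimal example, not a method from the review; the window size and the toy spectrum are assumptions.

```python
def moving_min_baseline(intensities, window=3):
    """Estimate a baseline as the local minimum in a sliding window."""
    n = len(intensities)
    baseline = []
    for i in range(n):
        lo, hi = max(0, i - window), min(n, i + window + 1)
        baseline.append(min(intensities[lo:hi]))
    return baseline

def subtract_baseline(intensities, window=3):
    """Return the baseline-corrected spectrum."""
    base = moving_min_baseline(intensities, window)
    return [x - b for x, b in zip(intensities, base)]

# Toy spectrum: flat noise floor near 5 with one peak at index 3.
spectrum = [5, 5, 6, 20, 7, 5, 5]
corrected = subtract_baseline(spectrum)  # peak now stands on a ~0 baseline
```

Real pipelines combine several such steps (denoising, normalization, peak detection, alignment) before any biomarker statistics are computed.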

  11. Development of interactive hypermedia software for high school biology: A research and development study

    NASA Astrophysics Data System (ADS)

    Alturki, Uthman T.

    The goal of this research was to research, design, and develop a hypertext program for students who study biology. The Ecology Hypertext Program was developed using Research and Development (R&D) methodology. The purpose of this study was to place the final "product", a CD-ROM for learning biology concepts, in the hands of teachers and students to help them in the learning and teaching process. The product was created through a cycle of literature review, needs assessment, development, and a cycle of field tests and revisions. I applied the ten steps of the R&D process suggested by Borg and Gall (1989), which consisted of: (1) literature review, (2) needs assessment, (3) planning, (4) development of the preliminary product, (5) preliminary field testing, (6) preliminary revision, (7) main field testing, (8) main revision, (9) final field testing, and (10) final product revision. The literature review and needs assessment provided support and a foundation for designing the preliminary product, the Ecology Hypertext Program. Participants in the needs assessment joined a focus group discussion; they were a group of graduate students in education who highlighted the importance of designing this product. For the preliminary field test, the participants were a group of high school students studying biology, the potential users of the product. They reviewed the preliminary product and then filled out a questionnaire. Their feedback and suggestions were used to develop and improve the product in a step called preliminary revision. The second round of field testing was the main field test, in which the participants joined a focus group discussion. They were the same group who participated in the needs assessment task. They reviewed the revised product and then provided ideas and suggestions to improve it. Their feedback was categorized and implemented to develop the product in the main revision task.
Finally, a group of science teachers participated in this study by reviewing the product and then filling out the questionnaire. Their suggestions were used to conduct the final step in the R&D methodology, the final product revision. The primary result of this study was the Ecology Hypertext Program. It represents a modest attempt to give students an opportunity to learn through an interactive hypertext program. In addition, the R&D methodology proved to be an appropriate procedure for designing and developing new educational products and materials.

  12. High-energy capacitance electrostatic micromotors

    NASA Astrophysics Data System (ADS)

    Baginsky, I. L.; Kostsov, E. G.

    2003-03-01

    The design and parameters of a new electrostatic micromotor with high energy output are described. The motor is fabricated by means of microelectronic technology. Its operation is based on electromechanical energy conversion during the electrostatic rolling of metallic films (petals) on the surface of a ferroelectric film. A mathematical simulation of the main characteristics of the rolling process is carried out. Experimentally measured parameters of the petal step micromotors are presented, and the motor operation and its efficiency are investigated.

  13. The strategic relevance of manufacturing technology: An overall quality concept to promote innovation preventing drug shortage.

    PubMed

    Panzitta, Michele; Ponti, Mauro; Bruno, Giorgio; Cois, Giancarlo; D'Arpino, Alessandro; Minghetti, Paola; Mendicino, Francesca Romana; Perioli, Luana; Ricci, Maurizio

    2017-01-10

    Manufacturing is the bridge between research and patient: without product, there is no clinical outcome. Shortage has a variety of causes; in this paper we analyse only causes related to manufacturing technology, and we use shortage as a paradigm highlighting the relevance of pharmaceutical technology. Product and process complexity and capacity issues are the main challenges for the pharmaceutical industry supply chain. Manufacturing technology should be acknowledged as an R&D step and as a very important subject in university degrees in pharmacy and related disciplines, promoting collaboration between academia and industry; it should be measured during the HTA step and rewarded in terms of price and reimbursement. These elements are not yet properly recognised, and manufacturing technology is taken into consideration only when a shortage is in place. In a previous work, Panzitta et al. proposed performing a full technology assessment at the Health Technology Assessment stage, evaluating three main technical aspects of a medicine: manufacturing process, physicochemical properties, and formulation characteristics. In this paper, we develop the concept of manufacturing appraisal, providing a technical overview of upcoming challenges, a risk-based approach, and an economic picture of shortage costs. We also develop an overall quality concept, not limited to GMP factors but broadened to all elements leading to a robust supply and promoting technical innovation. Copyright © 2016 Elsevier B.V. All rights reserved.

  14. Preparation of Water-Repellent Glass by Sol-Gel Process Using Perfluoroalkylsilane and Tetraethoxysilane.

    PubMed

    Jeong, Hye-Jeong; Kim, Dong-Kwon; Lee, Soo-Bok; Kwon, Soo-Han; Kadono, Kohei

    2001-03-01

    Coating films on glass substrates were prepared by a sol-gel process using alkoxide solutions containing perfluoroalkylsilane (PFAS) and tetraethoxysilane (TEOS). The physical properties of the coating films were characterized by SEM, FT-IR, and XRD, and their surface properties were investigated by measuring contact angles and atomic compositions. Transparent coating films with smooth surfaces and uniform thickness could be obtained. The contact angles of the coating films for water and methylene iodide are extremely high, at 118 degrees and 97 degrees, respectively, and their surface free energies are about 9.7 dyn/cm. It was found that the water-repellent glass prepared is very hydrophobic and exhibits excellent water repellency. Hydrophobic perfluoroalkyl groups are preferentially enriched in the outermost layer at the coating film-air interface, and two layers probably exist in the coating film: the upper layer, oriented toward the air, is composed mainly of perfluoroalkyl groups originating from PFAS, and the lower layer is composed mainly of -OSiO- groups originating from TEOS. Heat treatment after the drying step does not influence the surface enrichment of the perfluoroalkyl groups. The hydrolysis reaction should be driven closer to completion before the dip-coating step to obtain a lower surface free energy, and the burning temperature should be kept below 300 degrees C because the perfluoroalkyl group begins to decompose at this temperature. Copyright 2001 Academic Press.
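    Surface free energies like the ~9.7 dyn/cm reported above are commonly estimated from a pair of contact angles. The sketch below uses the Owens-Wendt geometric-mean method with standard literature values for the test liquids; both the method choice and the liquid component values are our assumptions, not necessarily what the authors used.

```python
import math

# Common literature surface tension components, mN/m (= dyn/cm).
WATER = {"total": 72.8, "disp": 21.8, "polar": 51.0}
DIIODOMETHANE = {"total": 50.8, "disp": 50.8, "polar": 0.0}  # methylene iodide

def owens_wendt(theta_water_deg, theta_mi_deg):
    """Estimate total solid surface free energy from two contact angles."""
    # Methylene iodide is treated as purely dispersive, so
    # (1 + cos t) * g_L / 2 = sqrt(g_s_d * g_L_d) gives the dispersive part.
    rhs_mi = (1 + math.cos(math.radians(theta_mi_deg))) * DIIODOMETHANE["total"] / 2
    g_s_d = rhs_mi ** 2 / DIIODOMETHANE["disp"]
    # The water equation then yields the polar part.
    rhs_w = (1 + math.cos(math.radians(theta_water_deg))) * WATER["total"] / 2
    polar_term = rhs_w - math.sqrt(g_s_d * WATER["disp"])
    g_s_p = max(polar_term, 0.0) ** 2 / WATER["polar"]
    return g_s_d + g_s_p

# Contact angles from the abstract: 118 deg (water), 97 deg (methylene iodide).
energy = owens_wendt(118.0, 97.0)  # low value, consistent with a hydrophobic film
```

With these assumed liquid components the estimate lands near 10 mN/m, in the same range as the value quoted in the abstract.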

  15. Characterizing local biological hotspots in the Gulf of Maine using remote sensing data

    NASA Astrophysics Data System (ADS)

    Ribera, Marta M.

    Researchers increasingly advocate the use of ecosystem-based management (EBM) for managing complex marine ecosystems. This approach requires managers to focus on processes and cross-scale interactions, rather than individual components. However, they often lack appropriate tools and data sources to pursue this change in management approach. One method that has been proposed to understand the ecological complexity inherent in marine ecosystems is the study of biological hotspots. Biological hotspots are locations where organisms from different trophic levels aggregate to feed on abundant supplies, and they are considered a first step toward understanding the processes driving spatial and temporal heterogeneity in marine systems. Biological hotspots are supported by phytoplankton aggregations, which are characterized by high spatial and temporal variability. As a result, methods developed to locate biological hotspots in relatively stable terrestrial systems are not well suited for more dynamic marine ecosystems. The main objective of this thesis is thus to identify and characterize local-scale biological hotspots in the western side of the Gulf of Maine. The first chapter describes a new methodological framework with the steps needed to locate these types of hotspots in marine ecosystems using remote sensing datasets. Then, in the second chapter these hotspots are characterized using a novel metric that uses time series information and spatial statistics to account for both the temporal variability and spatial structure of these marine aggregations. This metric redefines biological hotspots as areas with a high probability of exhibiting positive anomalies of productivity compared to the expected regional seasonal pattern. 
Finally, the third chapter compares the resulting biological hotspots to fishery-dependent abundance indices of surface and benthic predators to determine the effect of the location and magnitude of phytoplankton aggregations on the rest of the ecosystem. Analyses indicate that the spatial scale and magnitude of biological hotspots in the Gulf of Maine depend on the location and time of the year. Results also show that these hotspots change over time in response to both short-term oceanographic processes and long-term climatic cycles. Finally, the new metric presented here facilitates the spatial comparison between different trophic levels, thus allowing interdisciplinary ecosystem-wide studies.

  16. Contribution of seismic processing to put up the scaffolding for the 3-dimensional study of deep sedimentary basins: the fundaments of trans-national 3D modelling in the project GeoMol

    NASA Astrophysics Data System (ADS)

    Capar, Laure

    2013-04-01

    Within the framework of the transnational project GeoMol, geophysical and geological information on the entire Molasse Basin and on the Po Basin is gathered to build consistent cross-border 3D geological models based on borehole evidence and seismic data. Benefiting from important progress in seismic processing, these new models will provide answers to various questions regarding the usage of subsurface resources, such as geothermal energy, CO2 and gas storage, and oil and gas production, and will support decision-making by national and local administrations as well as by industry. More than 28,000 km of 2D seismic lines are compiled, reprocessed, and harmonized. This work faces various problems, such as the vertical drop of more than 700 meters between the west and the east of the Molasse Basin (and, to a lesser extent, in the Po Plain), the heterogeneities of the substratum, the large disparities in the period and parameters of seismic acquisition, and, depending on availability, the use of two types of seismic data: raw and processed. The main challenge is to harmonize all lines to the same reference level, amplitude, and stage of signal processing from France to Austria, spanning more than 1000 km, in order to avoid misfits at crossing points between seismic lines and artifacts at the country borders, thereby facilitating the interpretation of the various geological layers in the Molasse Basin and Po Basin. A generalized stratigraphic column for the two basins is set up, representing all geological layers relevant to subsurface usage. This stratigraphy constitutes the harmonized framework for seismic reprocessing. In general, processed seismic data are available on paper at the stack stage, and the mandatory information needed to take these seismic lines to the final stage of processing, the migration step, comprises the datum plane and the replacement velocity. However, several datum planes and replacement velocities were used during previous processing projects.
Our processing sequence is first to digitize the data, so as to have them in SEG-Y format. The second step is to apply some post-stack processing to obtain good data quality before the final migration step. The third step is the final migration, using optimized migration velocities, and the fourth step is the post-migration processing. In the case of raw seismic data, the mandatory information for processing is made accessible, for example from observer logs, coordinates, and field seismic data; the processing sequence used to obtain the final usable version of the seismic line is then based on a pre-stack time migration, and a complex processing sequence is applied. One main issue is dealing with the significant changes in topography along the seismic lines and in the uppermost twenty-meter layer, the low-velocity zone (LVZ) or weathered zone, where lateral velocity variations occur and disturb the wave propagation and therefore the seismic signal. In seismic processing, this is handled by static corrections, which remove the effects of these lateral velocity variations and of topography. Another main item is the accurate determination of root-mean-square velocities for migration, to improve the final result of the seismic processing. Within GeoMol, generalized 3D models of stacking velocities are calculated in order to perform a rapid time-depth conversion. Finally, all seismic lines of the project GeoMol will be at the same level of processing, the migration level. But to tie all these lines, a single appropriate datum plane and replacement velocity for the entire Molasse Basin and Po Plain, respectively, have to be carefully set up to avoid misties at crossing points. The reprocessing and use of these 28,000 km of seismic lines in the project GeoMol provide the pivotal database to build a 3D framework model for regional subsurface information on the Alpine foreland basins (cf. Rupf et al. 2013, EGU2013-8924).
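    The datum-plane and replacement-velocity correction described above can be sketched as a simple elevation static. The function and the values below are a hypothetical illustration only; production statics also incorporate a weathered-zone (LVZ) velocity model.

```python
def elevation_static_ms(source_elev_m, receiver_elev_m, datum_m, v_repl_m_s):
    """Two-way time shift (ms) that moves source and receiver to the datum.

    The travel path above (or below) the datum is replaced by propagation
    at the replacement velocity; stations above the datum arrive "early",
    so the shift is negative.
    """
    delta = (source_elev_m - datum_m) + (receiver_elev_m - datum_m)
    return -1000.0 * delta / v_repl_m_s

# Hypothetical example: stations at 620 m and 640 m, datum 500 m, 2000 m/s.
shift = elevation_static_ms(source_elev_m=620.0, receiver_elev_m=640.0,
                            datum_m=500.0, v_repl_m_s=2000.0)
```

Choosing one datum plane and one replacement velocity per basin, as the project does, guarantees that shifts computed this way agree at line crossing points.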
The project GeoMol is co-funded by the Alpine Space Program as part of the European Territorial Cooperation 2007-2013. The project integrates partners from Austria, France, Germany, Italy, Slovenia and Switzerland and runs from September 2012 to June 2015. Further information on www.geomol.eu The GeoMol seismic interpretation team: Roland Baumberger (swisstopo), Agnès BRENOT (BRGM), Alessandro CAGNONI (RLB), Renaud COUËFFE (BRGM), Gabriel COURRIOUX (BRGM), Chiara D'Ambrogi (ISPRA), Chrystel Dezayes (BRGM), Charlotte Fehn (LGRB), Sunseare GABALDA (BRGM), Gregor Götzl (GBA), Andrej Lapanje (GeoZS), Stéphane MARC (BRGM), Alberto MARTINI (RER-SGSS), Fabio Carlo Molinari (RER-SGSS), Edgar Nitsch (LGRB), Robert Pamer (LfU BY), Marco PANTALONI (ISPRA), Sebastian Pfleiderer (GBA), Andrea PICCIN (RLB), Nils Oesterling (swisstopo), Isabel Rupf (LGRB), Uta Schulz (LfU BY), Yves SIMEON (BRGM), Günter SÖKOL (LGRB), Heiko Zumsprekel (LGRB)

  17. Effect of one-step recrystallization on the grain boundary evolution of CoCrFeMnNi high entropy alloy and its subsystems.

    PubMed

    Chen, Bo-Ru; Yeh, An-Chou; Yeh, Jien-Wei

    2016-02-29

    In this study, the grain boundary evolution of equiatomic CoCrFeMnNi, CoCrFeNi, and FeCoNi alloys after one-step recrystallization was investigated. The special boundary fraction and twin density of these alloys were evaluated by electron backscatter diffraction analysis. Among the three alloys tested, FeCoNi exhibited the highest special boundary fraction and twin density after one-step recrystallization. The special boundary increment after one-step recrystallization was mainly affected by grain boundary velocity, while twin density was mainly affected by average grain boundary energy and twin boundary energy.

  18. Web processing service for landslide hazard assessment

    NASA Astrophysics Data System (ADS)

    Sandric, I.; Ursaru, P.; Chitu, D.; Mihai, B.; Savulescu, I.

    2012-04-01

    Hazard analysis requires heavy computation and specialized software. Web processing services can offer complex solutions that can be accessed through a light client (web or desktop). This paper presents a web processing service (both WPS and Esri Geoprocessing Service) for landslide hazard assessment. The web processing service was built with the Esri ArcGIS Server solution and Python, developed using ArcPy, GDAL Python, and NumPy. A complex model for landslide hazard analysis, using both predisposing and triggering factors combined into a Bayesian temporal network with uncertainty propagation, was built and published as a WPS and Geoprocessing service using ArcGIS Standard Enterprise 10.1. The model uses as predisposing factors the first and second derivatives of the DEM, the effective precipitation, runoff, lithology, and land use. All these parameters can be served to the client from other WFS services or by uploading and processing the data on the server. The user can choose to create the first and second derivatives of the DEM automatically on the server or to upload data already calculated. One of the main dynamic factors in the landslide analysis model is the leaf area index (LAI). The LAI offers the advantage of modelling not just the changes between different time periods expressed in years, but also the seasonal changes in land use throughout a year. The LAI can be derived from various satellite images or downloaded as a product; the upload of such data (time series) is possible using the NetCDF file format. The model is run at a monthly time step, and for each time step all the parameter values and the a priori, conditional, and posterior probabilities are obtained and stored in a log file. The validation process uses landslides that have occurred during the period up to the active time step and checks the recorded probabilities and parameter values for those time steps against the values of the active time step.
Each time a landslide is positively identified, new a priori probabilities are recorded for each parameter. A complete log for the entire model is saved and used for statistical analysis, and a NetCDF file is created that can be downloaded from the server together with the log file.
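    The monthly update from a priori to posterior probabilities described above can be sketched with a minimal Bayes step. This is our own illustration, not the paper's model (which is a full Bayesian temporal network with uncertainty propagation over many factors); the likelihood values are hypothetical.

```python
def bayes_update(prior, p_obs_given_event, p_obs_given_no_event):
    """Posterior P(event | observation) from a prior and two likelihoods."""
    num = p_obs_given_event * prior
    denom = num + p_obs_given_no_event * (1 - prior)
    return num / denom

# Illustrative run over two monthly time steps: each month we observe one
# factor state (e.g. high LAI) with assumed likelihoods under "landslide"
# vs "no landslide"; the posterior becomes the next month's prior.
prior = 0.05  # hypothetical initial monthly landslide probability
for p_e, p_ne in [(0.8, 0.3), (0.7, 0.4)]:
    prior = bayes_update(prior, p_e, p_ne)
posterior = prior
```

In the service itself these per-step values are exactly what gets written to the log file at each monthly time step, which is what makes the validation against past landslides possible.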

  19. Change classification in SAR time series: a functional approach

    NASA Astrophysics Data System (ADS)

    Boldt, Markus; Thiele, Antje; Schulz, Karsten; Hinz, Stefan

    2017-10-01

    Change detection represents a broad field of research in SAR remote sensing, consisting of many different approaches. Besides the simple recognition of change areas, the analysis of the type, category, or class of the change areas is at least as important for creating a comprehensive result. Conventional strategies for change classification are based on supervised or unsupervised land-use/land-cover classifications. The main drawback of such approaches is that the quality of the classification result depends directly on the selection of training and reference data. Additionally, supervised processing methods require an experienced operator who capably selects the training samples. This training step is not necessary when using unsupervised strategies, but meaningful reference data must nevertheless be available for identifying the resulting classes; consequently, an experienced operator is indispensable. In this study, an innovative concept for the classification of changes in SAR time series data is proposed. In contrast to the traditional strategies described above, it requires no training data. Moreover, the method can be applied by an operator who does not yet have detailed knowledge of the scenery; this knowledge is provided by the algorithm. The final step of the procedure, whose main aspect is the iterative optimization of an initial class scheme with respect to the categorized change objects, is the classification of these objects into the finally resulting classes. This assignment step is the subject of this paper.

  20. Recovery of metals from a mixture of various spent batteries by a hydrometallurgical process.

    PubMed

    Tanong, Kulchaya; Coudert, Lucie; Mercier, Guy; Blais, Jean-Francois

    2016-10-01

Spent batteries contain hazardous materials, including numerous metals (cadmium, lead, nickel, zinc, etc.) that are present at high concentrations. Therefore, proper treatment of these wastes is necessary to prevent their harmful effects on human health and the environment. Current recycling processes are mainly applied to treat each type of spent battery separately. In this laboratory study, a hydrometallurgical process has been developed to simultaneously and efficiently solubilize metals from spent batteries. Among the various chemical leaching agents tested, sulfuric acid was found to be the most efficient and cheapest reagent. A Box-Behnken design was used to identify the influence of several parameters (acid concentration, solid/liquid ratio, retention time and number of leaching steps) on the removal of metals from spent batteries. According to the results, the solid/liquid ratio and acid concentration were the main parameters influencing the solubilization of zinc, manganese, nickel, cadmium and cobalt from spent batteries. The highest metal removals were obtained under the optimal leaching conditions (pulp density = 180 g/L (w/v), [H2SO4] = 1 M, number of leaching steps = 3 and leaching time = 30 min). Under these optimum conditions, the removal yields were estimated to be 65% for Mn, 99.9% for Cd, 100% for Zn, 74% for Co and 68% for Ni. Further studies will be performed to improve the solubilization of Mn and to selectively recover the metals. Copyright © 2016 Elsevier Ltd. All rights reserved.

  1. Autonomic nervous system correlates in movement observation and motor imagery

    PubMed Central

    Collet, C.; Di Rienzo, F.; El Hoyek, N.; Guillot, A.

    2013-01-01

The purpose of the current article is to provide a comprehensive overview of the literature offering a better understanding of the autonomic nervous system (ANS) correlates in motor imagery (MI) and movement observation. These are two higher brain functions involving sensorimotor coupling, mediated by memory systems. How observing or mentally rehearsing a movement affects ANS activity has not been extensively investigated. The links between cognitive functions and ANS responses are not so obvious. We will first describe the organization of the ANS, whose main purposes are controlling vital functions by maintaining the homeostasis of the organism and providing adaptive responses when changes occur either in the external or internal milieu. We will then review how scientific knowledge evolved, thus integrating recent findings related to ANS functioning, and show how these are linked to mental functions. In turn, we will describe how movement observation or MI may elicit physiological responses at the peripheral level of the autonomic effectors, thus eliciting autonomic correlates to cognitive activity. A key feature of this paper is to draw a step-by-step progression from the understanding of ANS physiology to its relationships with high mental processes such as movement observation or MI. We will further provide evidence that mental processes are co-programmed both at the somatic and autonomic levels of the central nervous system (CNS). We will thus detail how peripheral physiological responses may be analyzed to provide objective evidence that MI is actually performed. The main perspective is thus to consider that, during movement observation and MI, ANS activity is an objective witness of mental processes. PMID:23908623

  2. Kinematic Measurement of Knee Prosthesis from Single-Plane Projection Images

    NASA Astrophysics Data System (ADS)

    Hirokawa, Shunji; Ariyoshi, Shogo; Takahashi, Kenji; Maruyama, Koichi

In this paper, the measurement of 3D motion from 2D perspective projections of knee prostheses is described. The technique reported by Banks and Hodge was further developed in this study. The estimation was performed in two steps. The first-step estimation was performed under the assumption of orthogonal projection. The second-step estimation was then carried out based upon the perspective projection to achieve a more accurate estimate. The simulation results demonstrated that the technique achieved sufficient accuracy of position/orientation estimation for prosthetic kinematics. We then applied our algorithm to CCD images, thereby examining the influence of various artifacts, possibly introduced through the imaging process, on the estimation accuracy. We found that accuracy in the experiment was influenced mainly by geometric discrepancies between the prosthesis component and the computer-generated model and by spatial inconsistencies between the coordinate axes of the positioner and those of the computer model. However, we verified that our algorithm could achieve proper and consistent estimation even for the CCD images.
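The first-step estimation under the orthogonal-projection assumption can be illustrated, in 2D, by a closed-form least-squares fit of rotation and translation (a Procrustes/Kabsch-style sketch; the actual method estimates the full 3D pose of the prosthesis model):

```python
import math

def fit_rotation_translation(model, observed):
    """Least-squares in-plane rotation + translation mapping 2D model points
    onto observed image points (2D Kabsch; illustrative first-step estimate
    under an orthogonal-projection assumption, not the authors' 3D method)."""
    n = len(model)
    mx = sum(p[0] for p in model) / n
    my = sum(p[1] for p in model) / n
    ox = sum(p[0] for p in observed) / n
    oy = sum(p[1] for p in observed) / n
    # cross-covariance terms of the centred point sets
    sxx = sxy = syx = syy = 0.0
    for (px, py), (qx, qy) in zip(model, observed):
        ax, ay = px - mx, py - my
        bx, by = qx - ox, qy - oy
        sxx += ax * bx; sxy += ax * by
        syx += ay * bx; syy += ay * by
    theta = math.atan2(sxy - syx, sxx + syy)   # optimal rotation angle
    c, s = math.cos(theta), math.sin(theta)
    tx = ox - (c * mx - s * my)                # translation aligning centroids
    ty = oy - (s * mx + c * my)
    return theta, (tx, ty)
```

A second, perspective-aware refinement (as in the paper) would then start from this closed-form estimate.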

3. MolIDE: a homology modeling framework you can click with.

    PubMed

    Canutescu, Adrian A; Dunbrack, Roland L

    2005-06-15

Molecular Integrated Development Environment (MolIDE) is an integrated application designed to provide homology modeling tools and protocols under a uniform, user-friendly graphical interface. Its main purpose is to combine the most frequent modeling steps in a semi-automatic, interactive way, guiding the user from the target protein sequence to the final three-dimensional protein structure. The typical basic homology modeling process is composed of building sequence profiles of the target sequence family, secondary structure prediction, sequence alignment with PDB structures, assisted alignment editing, side-chain prediction and loop building. All of these steps are available through a graphical user interface. MolIDE's user-friendly and streamlined interactive modeling protocol allows the user to focus on the important modeling questions, hiding from the user the raw data generation and conversion steps. MolIDE was designed from the ground up as an open-source, cross-platform, extensible framework. This allows developers to integrate additional third-party programs into MolIDE. http://dunbrack.fccc.edu/molide/molide.php rl_dunbrack@fccc.edu.

  4. Certify for success: A methodology for human-centered certification of advanced aviation systems

    NASA Technical Reports Server (NTRS)

    Small, Ronald L.; Rouse, William B.

    1994-01-01

    This position paper uses the methodology in Design for Success as a basis for a human factors certification program. The Design for Success (DFS) methodology espouses a multi-step process to designing and developing systems in a human-centered fashion. These steps are as follows: (1) naturalizing - understand stakeholders and their concerns; (2) marketing - understand market-oriented alternatives to meeting stakeholder concerns; (3) engineering - detailed design and development of the system considering tradeoffs between technology, cost, schedule, certification requirements, etc.; (4) system evaluation - determining if the system meets its goal(s); and (5) sales and service - delivering and maintaining the system. Because the main topic of this paper is certification, we will focus our attention on step 4, System Evaluation, since it is the natural precursor to certification. Evaluation involves testing the system and its parts for their correct behaviors. Certification focuses not only on ensuring that the system exhibits the correct behaviors, but ONLY the correct behaviors.

  5. Video Completion in Digital Stabilization Task Using Pseudo-Panoramic Technique

    NASA Astrophysics Data System (ADS)

    Favorskaya, M. N.; Buryachenko, V. V.; Zotin, A. G.; Pakhirka, A. I.

    2017-05-01

Video completion is a necessary stage after stabilization of a non-stationary video sequence if the resolution of the stabilized frames is to equal the resolution of the original frames. Usually the cropped stabilized frames lose 10-20% of their area, which worsens the visibility of the reconstructed scenes. The extension of the field of view may be required due to unwanted pan-tilt-zoom camera movement. Our approach prepares a pseudo-panoramic key frame during the stabilization stage as a pre-processing step for the subsequent inpainting. It is based on a multi-layered representation of each frame, including the background and objects moving differently. The proposed algorithm involves four steps: background completion, local motion inpainting, local warping, and seamless blending. Our experiments show that seamless stitching is needed more often than the local warping step. Therefore, seamless blending was investigated in detail, covering four main categories: feathering-based, pyramid-based, gradient-based, and optimal seam-based blending.
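Of the four blending categories, feathering is the simplest; a minimal 1-D sketch (real frames are 2-D, and the linear weighting ramp here is an assumption):

```python
def feather_blend(left, right, overlap):
    """Feathering-based seamless blend of two 1-D pixel rows that share
    `overlap` pixels (illustrative sketch of one of the four blending
    categories named above).

    left, right : lists of pixel intensities; the last `overlap` pixels of
    `left` cover the same scene content as the first `overlap` pixels of
    `right`.
    """
    out = list(left[:-overlap])
    for i in range(overlap):
        w = (i + 1) / (overlap + 1)   # linear alpha ramp from left to right
        out.append((1 - w) * left[len(left) - overlap + i] + w * right[i])
    out.extend(right[overlap:])
    return out
```

Pyramid-, gradient- and seam-based methods replace this simple per-pixel ramp with multi-scale, gradient-domain, or optimal-cut weighting, respectively.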

  6. Collection and conversion of algal lipid

    NASA Astrophysics Data System (ADS)

    Lin, Ching-Chieh

Sustainable economic activities mandate a significant replacement of fossil energy by renewable forms. Algae-derived biofuels are increasingly seen as an alternative source of energy with potential to supplement the world's ever increasing demand. Our primary objective is, once the algae are cultivated, to eliminate or streamline the energy-intensive processing steps of collection, drying, grinding, and solvent extraction prior to conversion. To overcome this processing barrier, we propose to streamline the path from cultivated algae to biodiesel via algal biomass collection by sand filtration, cell rupturing with ozone, and immediate transesterification. To collect the algal biomass, the specific Chlorococcum aquaticum suspension was acidified to pH 3.3 to promote agglomeration prior to sand filtration. The algae-loaded filter bed was drained of free water; methanol was then added and the bed was ozonated for 2 min to rupture the cell membranes and accelerate release of the cellular contents. The methanol solution, now containing the dissolved lipid product, was collected by draining, while the filter bed was regenerated by further ozonation when needed. The results showed 95% collection of the algal biomass from the suspension and a 16% yield of lipid from the algae, as well as restoration of the filtration velocity of the sand bed via ozonation. The results further showed increased lipid yield upon cell rupturing and transesterified products composed entirely of fatty acid methyl ester (FAME) compounds, demonstrating that the rupture and transesterification processes could proceed consecutively in the same medium, requiring no separate steps of drying, extraction, and conversion. The FAME products from algae without exposure to ozone were mainly of 16 to 18 carbons containing up to 3 double bonds, while those from algae having been ozonated were smaller, highly saturated hydrocarbons.
The new technique streamlines individual steps from cultivated algal lipid to transesterified products and represents an improvement over existing energy-intensive steps.

  7. The use of feasibility studies for stepped-wedge cluster randomised trials: protocol for a review of impact and scope

    PubMed Central

    Kristunas, Caroline A; Hemming, Karla; Eborall, Helen C; Gray, Laura J

    2017-01-01

Introduction: The stepped-wedge cluster randomised trial (SW-CRT) is a complex design, for which many decisions about key design parameters must be made during the planning. These include the number of steps and the duration of time needed to embed the intervention. Feasibility studies are likely to be useful for informing these decisions and increasing the likelihood of the main trial's success. However, the number of feasibility studies being conducted for SW-CRTs is currently unknown. This review aims to establish the number of feasibility studies being conducted for SW-CRTs and determine which feasibility issues are commonly investigated. Methods and analysis: Fully published feasibility studies for SW-CRTs will be identified, according to predefined inclusion criteria, from searches conducted in Ovid MEDLINE, Scopus, Embase and PsycINFO. To also identify and gain information on unpublished feasibility studies, the following will be contacted: authors of published SW-CRTs (identified from the most recent systematic reviews); contacts for registered SW-CRTs (identified from clinical trials registries); lead statisticians of UK registered clinical trials units and researchers known to work in the area of SW-CRTs. Data extraction will be conducted independently by two reviewers. For the fully published feasibility studies, data will be extracted on the study characteristics, the rationale for the study, the process for determining progression to a main trial, how the study informed the main trial and whether the main trial went ahead. The researchers involved in the unpublished feasibility studies will be contacted to elicit the same information. A narrative synthesis will be conducted and provided alongside a descriptive analysis of the study characteristics. Ethics and dissemination: This review does not require ethical approval, as no individual patient data will be used. The results of this review will be published in an open-access peer-reviewed journal.
PMID:28765139

  8. The use of feasibility studies for stepped-wedge cluster randomised trials: protocol for a review of impact and scope.

    PubMed

    Kristunas, Caroline A; Hemming, Karla; Eborall, Helen C; Gray, Laura J

    2017-08-01

The stepped-wedge cluster randomised trial (SW-CRT) is a complex design, for which many decisions about key design parameters must be made during the planning. These include the number of steps and the duration of time needed to embed the intervention. Feasibility studies are likely to be useful for informing these decisions and increasing the likelihood of the main trial's success. However, the number of feasibility studies being conducted for SW-CRTs is currently unknown. This review aims to establish the number of feasibility studies being conducted for SW-CRTs and determine which feasibility issues are commonly investigated. Fully published feasibility studies for SW-CRTs will be identified, according to predefined inclusion criteria, from searches conducted in Ovid MEDLINE, Scopus, Embase and PsycINFO. To also identify and gain information on unpublished feasibility studies the following will be contacted: authors of published SW-CRTs (identified from the most recent systematic reviews); contacts for registered SW-CRTs (identified from clinical trials registries); lead statisticians of UK registered clinical trials units and researchers known to work in the area of SW-CRTs. Data extraction will be conducted independently by two reviewers. For the fully published feasibility studies, data will be extracted on the study characteristics, the rationale for the study, the process for determining progression to a main trial, how the study informed the main trial and whether the main trial went ahead. The researchers involved in the unpublished feasibility studies will be contacted to elicit the same information. A narrative synthesis will be conducted and provided alongside a descriptive analysis of the study characteristics. This review does not require ethical approval, as no individual patient data will be used. The results of this review will be published in an open-access peer-reviewed journal.
© Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2017. All rights reserved. No commercial use is permitted unless otherwise expressly granted.

  9. The aluminum smelting process and innovative alternative technologies.

    PubMed

    Kvande, Halvor; Drabløs, Per Arne

    2014-05-01

The industrial aluminum production process is addressed. The purpose is to give a short but comprehensive description of the electrolysis cell technology, the raw materials used, and the health and safety relevance of the process. This article is based on a study of the extensive chemical and medical literature on primary aluminum production. At present, there are two main technological challenges for the process: to reduce energy consumption and to mitigate greenhouse gas emissions. A future step may be carbon dioxide gas capture and sequestration related to the electric power generation from fossil sources. Workers' health and safety have now become an integrated part of the aluminum business. Work-related injuries and illnesses are preventable, and the ultimate goal to eliminate accidents with lost-time injuries may hopefully be approached in the future.

  10. IONIO Project: Computer-mediated Decision Support System and Communication in Ocean Science

    NASA Astrophysics Data System (ADS)

    Oddo, Paolo; Acierno, Arianna; Cuna, Daniela; Federico, Ivan; Galati, Maria Barbara; Awad, Esam; Korres, Gerasimos; Lecci, Rita; Manzella, Giuseppe M. R.; Merico, Walter; Perivoliotis, Leonidas; Pinardi, Nadia; Shchekinova, Elena; Mannarini, Gianandrea; Vamvakaki, Chrysa; Pecci, Leda; Reseghetti, Franco

    2013-04-01

A decision support system is built in four main steps. The first is the definition of the problem: the issue to be covered and the decisions to be taken. Different causes can provoke different problems; for each cause or its effects, it is necessary to define a list of the information and/or data required in order to take the best decision. The second step is the determination of the sources from which the information/data needed for decision-making can be obtained, and who holds that information. Furthermore, it must be possible to evaluate the quality of the sources to see which of them can provide the best information, and to identify the mode and format in which the information is presented. The third step relies on the processing of knowledge, i.e. whether the information/data are fit for purpose. It has to be decided which parts of the information/data need to be used, what additional data or information it is necessary to access, and how the information can best be presented so that the situation can be understood and decisions taken. Finally, the decision-making process is an interactive and inclusive process involving all concerned parties, whose different views must be taken into consideration. A knowledge-based discussion forum is necessary to reach a consensus. A decision-making process needs to be examined closely, refined, and modified to meet differing needs over time. The report presents the legal framework and knowledge base for a science-based decision support system and briefly explores some of the skills that enhance the quality of decisions taken.

  11. Design and validation of instruments to measure knowledge.

    PubMed

    Elliott, T E; Regal, R R; Elliott, B A; Renier, C M

    2001-01-01

    Measuring health care providers' learning after they have participated in educational interventions that use experimental designs requires valid, reliable, and practical instruments. A literature review was conducted. In addition, experience gained from designing and validating instruments for measuring the effect of an educational intervention informed this process. The eight main steps for designing, validating, and testing the reliability of instruments for measuring learning outcomes are presented. The key considerations and rationale for this process are discussed. Methods for critiquing and adapting existent instruments and creating new ones are offered. This study may help other investigators in developing valid, reliable, and practical instruments for measuring the outcomes of educational activities.
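One common reliability step when validating such instruments is an internal-consistency estimate; a minimal sketch of Cronbach's alpha (a standard statistic, though not necessarily the exact procedure these authors used):

```python
def cronbach_alpha(item_scores):
    """Internal-consistency reliability (Cronbach's alpha) of an instrument.

    item_scores : list of rows, one per respondent, each row holding one
    score per item. Values near 1 indicate the items measure a common
    construct; a generic reliability check, not the authors' full protocol.
    """
    n_items = len(item_scores[0])

    def var(xs):  # population variance
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / len(xs)

    item_vars = [var([row[i] for row in item_scores]) for i in range(n_items)]
    total_var = var([sum(row) for row in item_scores])
    return (n_items / (n_items - 1)) * (1 - sum(item_vars) / total_var)
```

For perfectly correlated items the statistic reaches 1; uncorrelated items pull it toward 0.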

  12. Intrafirm planning and mathematical modeling of owner's equity in industrial enterprises

    NASA Astrophysics Data System (ADS)

    Ponomareva, S. V.; Zheleznova, I. V.

    2018-05-01

The article reviews different approaches to intrafirm planning of owner's equity in industrial enterprises. Since charter capital, additional capital and reserve capital do not change in the course of enterprise activity, the main interest lies in share repurchases from shareholders and retained earnings within the owner's equity of the enterprise. To study the effect of share repurchases on the activities of the enterprise, we use mathematical methods such as event study and econometric modeling. The article describes a step-by-step algorithm for carrying out an event study and justifies the choice of a logit model for the econometric analysis. It presents the basic results of a regression analysis of the effect of share repurchases on key financial indicators in industrial enterprises.
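The event-study step can be sketched with the standard market-model procedure: fit alpha and beta over an estimation window, then sum abnormal returns over the event window (a generic illustration of the method, not the authors' exact specification):

```python
def cumulative_abnormal_return(stock, market, est_window, event_window):
    """Market-model event study around, e.g., a buyback announcement.

    stock, market : equal-length lists of daily returns
    est_window, event_window : (start, end) index pairs, end exclusive
    Returns the cumulative abnormal return (CAR) over the event window.
    """
    s0, s1 = est_window
    xs, ys = market[s0:s1], stock[s0:s1]
    mx = sum(xs) / len(xs)
    my = sum(ys) / len(ys)
    # OLS market-model fit: stock_t = alpha + beta * market_t + error
    beta = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
            / sum((x - mx) ** 2 for x in xs))
    alpha = my - beta * mx
    e0, e1 = event_window
    # abnormal return = actual minus market-model expected return
    return sum(stock[t] - (alpha + beta * market[t]) for t in range(e0, e1))
```

A logit model would then relate such event-window outcomes, or the decision to repurchase, to firm-level covariates.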

  13. An eye movement pre-training fosters the comprehension of processes and functions in technical systems

    PubMed Central

    Skuballa, Irene T.; Fortunski, Caroline; Renkl, Alexander

    2015-01-01

The main research goal of the present study was to investigate to what extent an eye movement pre-training can facilitate knowledge acquisition in multimedia (pre-training principle). We combined considerations from research on eye movement modeling and pre-training to design and test a non-verbal eye movement-based pre-training. Participants in the experimental condition watched an animated circle moving in close spatial resemblance to a static visualization of a solar plant accompanied by a narration in a subsequently presented learning environment. This training was expected to foster top-down processes as reflected in gaze behavior during the learning process and enhance knowledge acquisition. We compared two groups (N = 45): participants in the experimental condition received pre-training in a first step and processed the learning material in a second step, whereas the control group underwent the second step without any pre-training. The pre-training group outperformed the control group in their learning outcomes, particularly in knowledge about processes and functions of the solar plant. However, the superior learning outcomes in the pre-training group could not be explained by eye-movement patterns. Furthermore, the pre-training moderated the relationship between experienced stress and learning outcomes: in the control group, high stress levels hindered learning, which was not found for the pre-training group. On a delayed posttest, participants were asked to draw a picture of the learning content. Despite a non-significant effect of training on the quality of the drawings, the pre-training group showed associations between learning outcomes at the first testing time and process-related aspects of drawing quality. Overall, non-verbal pre-training is a successful instructional intervention to promote learning processes in novices, although these processes were not directly reflected in learners' eye movement behavior during learning. PMID:26029138

  14. A novel precursor system and its application to produce tin doped indium oxide.

    PubMed

    Veith, M; Bubel, C; Zimmer, M

    2011-06-14

    A new type of precursor has been developed by molecular design and synthesised to produce tin doped indium oxide (ITO). The precursor consists of a newly developed bimetallic indium tin alkoxide, Me(2)In(O(t)Bu)(3)Sn (Me = CH(3), O(t)Bu = OC(CH(3))(3)), which is in equilibrium with an excess of Me(2)In(O(t)Bu). This quasi single-source precursor is applied in a sol-gel process to produce powders and coatings of ITO using a one-step heat treatment process under an inert atmosphere. The main advantage of this system is the simple heat treatment that leads to the disproportionation of the bivalent Sn(II) precursor into Sn(IV) and metallic tin, resulting in an overall reduced state of the metal in the final tin doped indium oxide (ITO) material, hence avoiding the usually necessary reduction step. Solid state (119)Sn-NMR measurements of powder samples confirm the appearance of Sn(II) in an amorphous gel state and of metallic tin after annealing under nitrogen. The corresponding preparation of ITO coatings by spin coating on glass leads to transparent conductive layers with a high transmittance of visible light and a low electrical resistivity without the necessity of a reduction step.

  15. Opposing dorsal/ventral stream dynamics during figure-ground segregation.

    PubMed

    Wokke, Martijn E; Scholte, H Steven; Lamme, Victor A F

    2014-02-01

    The visual system has been commonly subdivided into two segregated visual processing streams: The dorsal pathway processes mainly spatial information, and the ventral pathway specializes in object perception. Recent findings, however, indicate that different forms of interaction (cross-talk) exist between the dorsal and the ventral stream. Here, we used TMS and concurrent EEG recordings to explore these interactions between the dorsal and ventral stream during figure-ground segregation. In two separate experiments, we used repetitive TMS and single-pulse TMS to disrupt processing in the dorsal (V5/HMT⁺) and the ventral (lateral occipital area) stream during a motion-defined figure discrimination task. We presented stimuli that made it possible to differentiate between relatively low-level (figure boundary detection) from higher-level (surface segregation) processing steps during figure-ground segregation. Results show that disruption of V5/HMT⁺ impaired performance related to surface segregation; this effect was mainly found when V5/HMT⁺ was perturbed in an early time window (100 msec) after stimulus presentation. Surprisingly, disruption of the lateral occipital area resulted in increased performance scores and enhanced neural correlates of surface segregation. This facilitatory effect was also mainly found in an early time window (100 msec) after stimulus presentation. These results suggest a "push-pull" interaction in which dorsal and ventral extrastriate areas are being recruited or inhibited depending on stimulus category and task demands.

  16. Applied Swarm-based medicine: collecting decision trees for patterns of algorithms analysis.

    PubMed

    Panje, Cédric M; Glatzer, Markus; von Rappard, Joscha; Rothermundt, Christian; Hundsberger, Thomas; Zumstein, Valentin; Plasswilm, Ludwig; Putora, Paul Martin

    2017-08-16

    The objective consensus methodology has recently been applied in consensus finding in several studies on medical decision-making among clinical experts or guidelines. The main advantages of this method are an automated analysis and comparison of treatment algorithms of the participating centers which can be performed anonymously. Based on the experience from completed consensus analyses, the main steps for the successful implementation of the objective consensus methodology were identified and discussed among the main investigators. The following steps for the successful collection and conversion of decision trees were identified and defined in detail: problem definition, population selection, draft input collection, tree conversion, criteria adaptation, problem re-evaluation, results distribution and refinement, tree finalisation, and analysis. This manuscript provides information on the main steps for successful collection of decision trees and summarizes important aspects at each point of the analysis.

  17. Assessment in health care education - modelling and implementation of a computer supported scoring process.

    PubMed

    Alfredsson, Jayne; Plichart, Patrick; Zary, Nabil

    2012-01-01

Research on computer-supported scoring of assessments in health care education has mainly focused on automated scoring. Little attention has been given to how informatics can support the currently predominant human-based grading approach. This paper reports the steps taken to develop a model for a computer-supported scoring process that focuses on optimizing a task previously undertaken without computer support. The model was also implemented in the open source assessment platform TAO in order to study its benefits. The ability to score test takers anonymously, analytics on the graders' reliability, and a more time-efficient process are examples of the observed benefits. Computer-supported scoring will increase the quality of the assessment results.
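One possible reliability analytic for graders in such a scoring process is chance-corrected agreement; a minimal sketch of Cohen's kappa for two graders (a standard statistic, not the TAO platform's own implementation):

```python
def cohens_kappa(grades_a, grades_b):
    """Chance-corrected agreement between two graders (Cohen's kappa).

    grades_a, grades_b : equal-length lists of categorical grades given by
    graders A and B to the same anonymized responses.
    """
    n = len(grades_a)
    cats = set(grades_a) | set(grades_b)
    # observed proportion of exact agreement
    observed = sum(a == b for a, b in zip(grades_a, grades_b)) / n
    # agreement expected by chance from each grader's marginal frequencies
    expected = sum((grades_a.count(c) / n) * (grades_b.count(c) / n)
                   for c in cats)
    return (observed - expected) / (1 - expected)
```

Values near 1 indicate consistent graders; values near 0 indicate agreement no better than chance, flagging items or graders for review.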

  18. Ground Reaction Forces of the Lead and Trail Limbs when Stepping Over an Obstacle

    PubMed Central

    Bovonsunthonchai, Sunee; Khobkhun, Fuengfa; Vachalathiti, Roongtiwa

    2015-01-01

Background: Precise force generation and absorption during stepping over different obstacles need to be quantified for task accomplishment. This study aimed to quantify how the lead limb (LL) and trail limb (TL) generate and absorb forces while stepping over obstacles of various heights. Material/Methods: Thirteen healthy young women participated in the study. Force data were collected from 2 force plates while participants stepped over obstacles. Two limbs (right LL and left TL) and 4 stepping conditions (no obstacle and stepping over 5 cm, 20 cm, and 30 cm obstacle heights) were tested for main and interaction effects by 2-way ANOVA. Paired t-tests and 1-way repeated-measures ANOVA were used to compare differences of variables between limbs and among stepping conditions, respectively. Results: Main effects of limb were found in first peak vertical force, minimum vertical force, propulsive peak force, and propulsive impulse. Significant main effects of condition were found in time to minimum force, time to the second peak force, time to propulsive peak force, first peak vertical force, braking peak force, propulsive peak force, vertical impulse, braking impulse, and propulsive impulse. Interaction effects of limb and condition were found in first peak vertical force, propulsive peak force, braking impulse, and propulsive impulse. Conclusions: Adaptations of force generation in the LL and TL reflect adaptability to an altered external environment during stepping in healthy young adults. PMID:26169293

  19. Process, including PSA and membrane separation, for separating hydrogen from hydrocarbons

    DOEpatents

    Baker, Richard W.; Lokhandwala, Kaaeid A.; He, Zhenjie; Pinnau, Ingo

    2001-01-01

    An improved process for separating hydrogen from hydrocarbons. The process includes a pressure swing adsorption step, a compression/cooling step and a membrane separation step. The membrane step relies on achieving a methane/hydrogen selectivity of at least about 2.5 under the conditions of the process.

  20. Broadband Rupture Process of the 2001 Kunlun Fault (Mw 7.8) Earthquake

    NASA Astrophysics Data System (ADS)

    Antolik, M.; Abercrombie, R.; Ekstrom, G.

    2003-04-01

We model the source process of the 14 November 2001 Kunlun fault earthquake using broadband body waves from the Global Digital Seismographic Network (P, SH) and both point-source and distributed slip techniques. The point-source mechanism technique is a non-linear iterative inversion that solves for focal mechanism, moment rate function, depth, and rupture directivity. The P waves reveal a complex rupture process for the first 30 s, with smooth unilateral rupture toward the east along the Kunlun fault accounting for the remainder of the 120 s long rupture. The obtained focal mechanism for the main portion of the rupture is (strike=96°, dip=83°, rake=-8°), which is consistent with both the Harvard CMT solution and observations of the surface rupture. The seismic moment is 5.29×10^20 Nm and the average rupture velocity is ~3.5 km/s. However, the initial portion of the P waves cannot be fit at all with this mechanism. A strong pulse visible in the first 20 s can only be matched with an oblique-slip subevent (MW ~6.8-7.0) involving a substantial normal faulting component, but the nodal planes of this mechanism are not well constrained. The first-motion polarities of the P waves clearly require a strike-slip mechanism with a similar orientation to the Kunlun fault. Field observations of the surface rupture (Xu et al., SRL, 73, No. 6) reveal a small 26 km-long strike-slip rupture at the far western end (90.5°E) with a 45 km-long gap and extensional step-over between this rupture and the main Kunlun fault rupture. We hypothesize that the initial fault break occurred on this segment, with release of the normal faulting energy as a continuous rupture through the extensional step enabling transfer of the slip to the main Kunlun fault. This process is similar to that which occurred during the 2002 Denali fault (MW 7.9) earthquake sequence, except that 11 days elapsed between the October 23 (MW 6.7) foreshock and the initial break of the Denali earthquake along a thrust fault.

  1. Hybrid finite element and Brownian dynamics method for diffusion-controlled reactions.

    PubMed

    Bauler, Patricia; Huber, Gary A; McCammon, J Andrew

    2012-04-28

    Diffusion is often the rate-determining step in many biological processes. Currently, the two main computational methods for studying diffusion are stochastic methods, such as Brownian dynamics, and continuum methods, such as the finite element method. This paper proposes a new hybrid diffusion method that combines the strengths of the two methods. The method is derived for a general multidimensional system, and is presented using basic test cases for 1D linear and radially symmetric diffusion systems.
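    The complementarity of the two approaches can be illustrated by checking a stochastic simulation against the continuum result for free 1D diffusion, where the mean squared displacement is <x^2> = 2*D*t. This is a minimal stdlib sketch of the Brownian dynamics side only, not the paper's coupled method.

```python
import random
import statistics

def brownian_1d(n_walkers=5000, D=1.0, dt=5e-3, n_steps=200, seed=42):
    # Brownian dynamics: each walker takes a Gaussian increment with
    # variance 2*D*dt per time step (Einstein relation).
    rng = random.Random(seed)
    sigma = (2.0 * D * dt) ** 0.5
    positions = [0.0] * n_walkers
    for _ in range(n_steps):
        positions = [x + rng.gauss(0.0, sigma) for x in positions]
    return positions

positions = brownian_1d()
msd = statistics.fmean(x * x for x in positions)
# Continuum theory for free diffusion predicts <x^2> = 2*D*t = 2.0 here.
```

The sampled MSD converges to the continuum prediction as the number of walkers grows, which is the consistency a hybrid scheme must preserve at the stochastic/continuum interface.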

  2. A Serious Game for Anterior Cruciate Ligament Rehabilitation: Software Development Aspects and Game Engagement Assessment.

    PubMed

    Cordeiro d'Ornellas, Marcos; Santos Machado, Alison; de Moraes, Jefferson Potiguara; Cervi Prado, Ana Lúcia

    2017-01-01

    This work presents the steps for developing a serious game that allows interaction through natural gestures and whose main purpose is to contribute to the treatment of individuals who have suffered an injury to the anterior cruciate ligament (ACL). In addition to the serious game development process, an assessment of the users' gaming experience was performed. This evaluation yielded positive results on various aspects of game engagement, confirming the playful character of the activity.

  3. Minto, Alaska Lakeview Lodge START Program Weatherization and Rehab Project Final Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Titus, Bessie; Messier, Dave

    This report details the process that the Minto Village Council undertook during the DOE-sponsored START program and the work that was completed on the main energy consumer in the community, the Minto Lakeview Lodge. The report reviews the steps leading up to the large weatherization and renovation project, the work that was completed as a result of the funding, and the results in terms of the effect on the community and real energy savings.

  4. Effect of one-step recrystallization on the grain boundary evolution of CoCrFeMnNi high entropy alloy and its subsystems

    PubMed Central

    Chen, Bo-Ru; Yeh, An-Chou; Yeh, Jien-Wei

    2016-01-01

    In this study, the grain boundary evolution of equiatomic CoCrFeMnNi, CoCrFeNi, and FeCoNi alloys after one-step recrystallization was investigated. The special boundary fraction and twin density of these alloys were evaluated by electron backscatter diffraction analysis. Among the three alloys tested, FeCoNi exhibited the highest special boundary fraction and twin density after one-step recrystallization. The special boundary increment after one-step recrystallization was mainly affected by grain boundary velocity, while twin density was mainly affected by average grain boundary energy and twin boundary energy. PMID:26923713

  5. Waste Minimization Study on Pyrochemical Reprocessing Processes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Boussier, H.; Conocar, O.; Lacquement, J.

    2006-07-01

    Ideally a new pyro-process should not generate more waste, and should be at least as safe and cost-effective as the hydrometallurgical processes currently implemented at industrial scale. This paper describes the thought process, the methodology and some results obtained by process integration studies to devise potential pyro-processes and to assess their capability of achieving this challenging objective. As an example, the assessment of a process based on salt/metal reductive extraction, designed for the reprocessing of Generation IV carbide spent fuels, is developed. Salt/metal reductive extraction uses the capability of some metals, aluminum in this case, to selectively reduce actinide fluorides previously dissolved in a fluoride salt bath. The reduced actinides enter the metal phase, from which they are subsequently recovered; the fission products remain in the salt phase. In fact, the process is not so simple, as it requires upstream and downstream subsidiary steps. All these process steps generate secondary waste flows representing sources of actinide leakage and/or fission-product discharge. In aqueous processes the main solvent (nitric acid solution) has a low boiling point and evaporates easily or can be removed by distillation, leaving behind only a limited flow containing the dissolved substances to be incorporated in a confinement matrix. From the point of view of waste generation, one main handicap of molten salt processes is that the saline phase (fluoride in our case) used as solvent is of the same nature as the solutes (radionuclide fluorides) and has a quite high boiling point. It is therefore not as easy as with aqueous solutions to separate solvent and solutes in order to confine only the radioactive material and limit the final waste flows.
    Starting from the initial block diagram devised two years ago, the paper shows how process integration studies were able to propose process modifications that reduce the variety and volume of waste flows, leading to an 'ideal' new block diagram that allows internal solvent recycling and self-eliminating reactants. This new flowsheet minimizes the quantity of inactive inlet flows that would otherwise inevitably have to be incorporated in a final waste form. The study identifies all knowledge gaps to be filled and suggests possible R&D issues to confirm or refute the feasibility of the proposed process modifications. (authors)

  6. Modeling and Analysis of CNC Milling Process Parameters on Al3030 based Composite

    NASA Astrophysics Data System (ADS)

    Gupta, Anand; Soni, P. K.; Krishna, C. M.

    2018-04-01

    The machining of Al3030-based composites on Computer Numerical Control (CNC) high-speed milling machines has assumed importance because of their wide application in the aerospace, marine and automotive industries. Industries mainly focus on surface irregularities, material removal rate (MRR) and tool wear rate (TWR), which usually depend on the input process parameters, namely cutting speed, feed in mm/min, depth of cut and step-over ratio. Many researchers have worked in this area, but very few have also taken step-over ratio (radial depth of cut) as one of the input variables. In this research work, the machining characteristics of Al3030 are studied on a high-speed CNC milling machine over the speed range of 3000 to 5000 rpm. Step-over ratio, depth of cut and feed rate are the other input variables considered. A total of nine experiments are conducted according to a Taguchi L9 orthogonal array. The machining is carried out on a high-speed CNC milling machine using a flat end mill of diameter 10 mm. Flatness, MRR and TWR are taken as output parameters. Flatness is measured using a portable Coordinate Measuring Machine (CMM). Linear regression models are developed using Minitab 18 software and the results are validated by conducting a selected additional set of experiments. The selection of input process parameters to obtain the best machining outputs is the key contribution of this research work.
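    The L9(3^4) array used to plan the nine runs assigns four three-level factors (here: cutting speed, feed, depth of cut, step-over ratio) so that every pair of factors sees all nine level combinations exactly once. A quick stdlib check of that orthogonality property:

```python
from itertools import product

# Standard Taguchi L9(3^4) orthogonal array: 9 runs, 4 factors at
# 3 levels each (levels coded 0, 1, 2).
L9 = [
    [0, 0, 0, 0], [0, 1, 1, 1], [0, 2, 2, 2],
    [1, 0, 1, 2], [1, 1, 2, 0], [1, 2, 0, 1],
    [2, 0, 2, 1], [2, 1, 0, 2], [2, 2, 1, 0],
]

def is_orthogonal(arr):
    # Every pair of columns must contain each of the 9 possible
    # level combinations exactly once.
    ncols = len(arr[0])
    for a in range(ncols):
        for b in range(a + 1, ncols):
            pairs = [(row[a], row[b]) for row in arr]
            if sorted(pairs) != sorted(product(range(3), repeat=2)):
                return False
    return True
```

This balance is what lets main effects of all four parameters be estimated from only nine of the 81 possible runs.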

  7. Knowledge Management Orientation: An Innovative Perspective to Hospital Management.

    PubMed

    Ghasemi, Matina; Ghadiri Nejad, Mazyar; Bagzibagli, Kemal

    2017-12-01

    By considering innovation as a new project in hospitals, all of project management's standard steps should be followed in execution. This study investigated the validation of a new set of measures that provide a procedure for knowledge management-oriented innovation to enrich the hospital management system. The relation between innovation and all the knowledge management areas, as the main constructs of project management, was illustrated by referring to project management standard steps and previous studies. Through consultations and meetings with a committee of professional project managers, a questionnaire was developed to measure ten knowledge management areas in a hospital's innovation process. Additionally, a group of experts from hospital management were invited to comment on the applicability of the questionnaire, considering whether the items are practically measurable in hospitals. Close-ended, Likert-type scale items, comprising ten sections, were developed based on the project management body of knowledge through the Delphi technique. The instrument enables managers to evaluate a hospital's situation and determine whether the organization follows knowledge management standards in its innovation process. In a pilot study, confirmatory factor analysis and exploratory factor analysis were conducted to ensure the validity and reliability of the measurement items. The developed items have the potential to help hospital managers deliver new products/services successfully based on the standard procedures in their organization. In all innovation processes, the knowledge management areas and their standard steps assist hospital managers through this new questionnaire-based tool.

  8. A Neuro-Mechanical Model Explaining the Physiological Role of Fast and Slow Muscle Fibres at Stop and Start of Stepping of an Insect Leg

    PubMed Central

    Toth, Tibor Istvan; Grabowska, Martyna; Schmidt, Joachim; Büschges, Ansgar; Daun-Gruhn, Silvia

    2013-01-01

    Stop and start of stepping are two basic actions of the musculo-skeletal system of a leg. Although they are basic phenomena, they require the coordinated activities of the leg muscles. However, little is known of the details of how these activities are generated by the interactions between the local neuronal networks controlling the fast and slow muscle fibres at the individual leg joints. In the present work, we aim at uncovering some of those details using a suitable neuro-mechanical model. It is an extension of the model in the accompanying paper and now includes all three antagonistic muscle pairs of the main joints of an insect leg, together with their dedicated neuronal control, as well as common inhibitory motoneurons and the residual stiffness of the slow muscles. This model enabled us to study putative processes of intra-leg coordination during stop and start of stepping. We also made use of the effects of sensory signals encoding the position and velocity of the leg joints. Where experimental observations are available, the corresponding simulation results are in good agreement with them. Our model makes detailed predictions as to the coordination processes of the individual muscle systems both at stop and start of stepping. In particular, it reveals a possible role of the slow muscle fibres at stop in accelerating the convergence of the leg to its steady-state position. These findings lend our model physiological relevance and can therefore be used to elucidate details of the stop and start of stepping in insects, and perhaps in other animals, too. PMID:24278108

  9. Investigating calcite growth rates using a quartz crystal microbalance with dissipation (QCM-D)

    NASA Astrophysics Data System (ADS)

    Cao, Bo; Stack, Andrew G.; Steefel, Carl I.; DePaolo, Donald J.; Lammers, Laura N.; Hu, Yandi

    2018-02-01

    Calcite precipitation plays a significant role in processes such as geological carbon sequestration and toxic metal sequestration, and yet the rates and mechanisms of calcite growth under close-to-equilibrium conditions are far from well understood. In this study, a quartz crystal microbalance with dissipation (QCM-D) was used for the first time to measure macroscopic calcite growth rates. Calcite seed crystals were first nucleated and grown on sensors; growth rates of the seed crystals were then measured in real time under close-to-equilibrium conditions (saturation index, SI = log({Ca2+}{CO32-}/Ksp) = 0.01-0.7, where {i} denotes the activity of ion i and Ksp = 10^-8.48 is the calcite thermodynamic solubility constant). At the end of the experiments, the total masses of calcite crystals on the sensors measured by QCM-D and by inductively coupled plasma mass spectrometry (ICP-MS) were consistent, validating the QCM-D measurements. Calcite growth rates measured by QCM-D were compared with reported macroscopic growth rates measured with auto-titration, ICP-MS, and microbalance methods. They were also compared with microscopic growth rates measured by atomic force microscopy (AFM) and with rates predicted by two process-based crystal growth models. The discrepancies in growth rates among the AFM measurements and model predictions appear to arise mainly from differences in step densities, while the step velocities were consistent among the AFM measurements and with both model predictions. Using the predicted steady-state step velocity and the measured step densities, both models predict well the growth rates measured by QCM-D and AFM. This study provides valuable insight into the effects of reactive site densities on calcite growth rate, which may help in designing future growth models that predict transient-state step densities.
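    The saturation index is the log ratio of the ion activity product to the solubility constant, SI = log10({Ca2+}{CO3^2-}/Ksp). A direct transcription; the activity values in the example are illustrative placeholders chosen to fall inside the reported SI range, not measured data.

```python
import math

K_SP = 10.0 ** -8.48  # calcite thermodynamic solubility constant

def saturation_index(a_ca, a_co3):
    # SI > 0: supersaturated (growth thermodynamically favored);
    # SI = 0: equilibrium; SI < 0: undersaturated (dissolution).
    return math.log10(a_ca * a_co3 / K_SP)

# Illustrative activities giving SI = 0.3, inside the 0.01-0.7 range:
si = saturation_index(1.0e-4, 10.0 ** -4.18)
```

Keeping SI this small is what makes the experiments "close to equilibrium" and the rates so hard to resolve without a sensitive mass probe such as QCM-D.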

  10. Automation in the Teaching of Descriptive Geometry and CAD. High-Level CAD Templates Using Script Languages

    NASA Astrophysics Data System (ADS)

    Moreno, R.; Bazán, A. M.

    2017-10-01

    The main purpose of this work is to study improvements to the learning of technical drawing and descriptive geometry by applying automated processes, assisted by high-level CAD templates (HLCts), to exercises that are traditionally solved manually. Given that an exercise can be solved step by step with traditional procedures, as detailed in technical drawing and descriptive geometry manuals, CAD applications allow us to do the same and later generalize it by incorporating references. Traditional teaching methods have become obsolete and have been relegated in current curricula; however, they can be applied in certain automation processes. The use of geometric references (variables in script languages) and their incorporation into HLCts allow drawing processes to be automated. Instead of repeatedly creating similar exercises or modifying data in the same exercises, users can employ HLCts to generate future variations of these exercises. This paper introduces the automation process for generating exercises based on CAD script files, aided by parametric geometry calculation tools. The proposed method allows new exercises to be designed without user intervention. The integration of CAD, mathematics, and descriptive geometry facilitates their joint learning. Automation in the generation of exercises not only saves time but also increases the quality of the problem statements and reduces the possibility of human error.
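    The template idea can be sketched as parameter sampling over geometric references: each exercise instance is derived from one parametric template by drawing values for its variables. The template (a plane-cone intersection) and its parameter names below are hypothetical illustrations, not the paper's HLCts.

```python
import random

def generate_exercises(n, seed=1):
    # HLCt-style generator sketch: sample geometric references for a
    # hypothetical "intersect a plane with a cone" exercise template.
    # A seeded RNG makes each exercise batch reproducible.
    rng = random.Random(seed)
    exercises = []
    for i in range(n):
        exercises.append({
            "id": i,
            "cone_half_angle_deg": rng.choice([15, 30, 45]),
            "plane_dip_deg": rng.randrange(10, 80, 5),
            "apex_height_mm": rng.randrange(40, 130, 10),
        })
    return exercises

batch = generate_exercises(30)
```

In an actual HLCt these parameter sets would be written into a CAD script file that draws each variant without user intervention.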

  11. Realisation of the guidelines for faculty-internal exams at the Department of General Medicine at the University of Munich: Pushing medical exams one step ahead with IMSm.

    PubMed

    Boeder, Niklas; Holzer, Matthias; Schelling, Jörg

    2012-01-01

    Graded exams are prerequisites for admission to the medical state examination. Accordingly, the exams must be of good quality to allow benchmarking within the faculty and between different universities. Criteria for good quality need to be considered, namely objectivity, validity and reliability. The guidelines for the processing of exams published by the GMA are intended to help maintain these criteria. In 2008, the Department of General Medicine at the University of Munich fulfilled only 14 of 18 items. A review process, appropriate training of the staff and the introduction of the IMSm software were the main changes that helped improve the 'GMA score' to 30 fulfilled items. We see the introduction of the IMSm system as our biggest challenge ahead. IMSm helps to streamline the necessary workflows and improves their quality (e.g. by the detection of cueing and by item analysis). Overall, we evaluate the steps taken to improve the exam process as very positive. We plan to engage co-workers outside the department to assist in the various review processes in the future. Furthermore, we think it might be of value to contact other departments and faculties to benefit from each other's question pools.

  12. Failure mode and effects analysis of witnessing protocols for ensuring traceability during IVF.

    PubMed

    Rienzi, Laura; Bariani, Fiorenza; Dalla Zorza, Michela; Romano, Stefania; Scarica, Catello; Maggiulli, Roberta; Nanni Costa, Alessandro; Ubaldi, Filippo Maria

    2015-10-01

    Traceability of cells during IVF is a fundamental aspect of treatment, and involves witnessing protocols. Failure mode and effects analysis (FMEA) is a method of identifying real or potential breakdowns in processes, and allows strategies to mitigate risks to be developed. To examine the risks associated with witnessing protocols, an FMEA was carried out in a busy IVF centre, before and after implementation of an electronic witnessing system (EWS). A multidisciplinary team was formed and moderated by human factors specialists. Possible causes of failures, and their potential effects, were identified, and the risk priority number (RPN) for each failure was calculated. A second FMEA analysis was carried out after implementation of the EWS. The IVF team identified seven main process phases, 19 associated process steps and 32 possible failure modes. The highest RPN was 30, confirming the relatively low risk of mismatches occurring in IVF when a manual witnessing system is used. The introduction of the EWS reduced the moderate-risk failure modes by two-thirds (highest RPN = 10). In our experience, FMEA is effective in supporting multidisciplinary IVF groups to understand the witnessing process, identify critical steps and plan changes in practice to enhance safety. Copyright © 2015 Reproductive Healthcare Ltd. Published by Elsevier Ltd. All rights reserved.
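    RPN figures such as 30 and 10 are conventionally the product of severity, occurrence and detection scores, each rated on a 1-10 ordinal scale. The particular score decomposition shown is illustrative, not the centre's actual scoring.

```python
def rpn(severity, occurrence, detection):
    # Conventional FMEA risk priority number: failure modes are
    # ranked by the product of the three 1-10 ordinal scores.
    for score in (severity, occurrence, detection):
        if not 1 <= score <= 10:
            raise ValueError("FMEA scores must lie in 1..10")
    return severity * occurrence * detection

# A hypothetical scoring consistent with the pre-EWS maximum of 30:
worst_case = rpn(5, 3, 2)
```

Ranking the 32 identified failure modes by descending RPN is what directs mitigation effort to the riskiest witnessing steps first.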

  13. Evaluating sources and processing of nonpoint source nitrate in a small suburban watershed in China

    NASA Astrophysics Data System (ADS)

    Han, Li; Huang, Minsheng; Ma, Minghai; Wei, Jinbao; Hu, Wei; Chouhan, Seema

    2018-04-01

    Identifying nonpoint sources of nitrate has been a long-term challenge in mixed land-use watersheds. In the present study, we combine dual nitrate isotopes with runoff and stream water monitoring to elucidate the nonpoint nitrate sources across land uses and to determine the relative importance of biogeochemical processes for nitrate export in a small suburban watershed, the Longhongjian watershed, China. Our study suggested that NH4+ fertilizer, soil NH4+, litter fall and groundwater were the main nitrate sources in Longhongjian Stream. There were large changes in nitrate sources in response to season and land use. Runoff analysis illustrated that the tea plantation and forest areas contributed a dominant proportion of the TN export. Spatial analysis illustrated that the NO3- concentration was high in the tea plantation and forest areas, and that δ15N-NO3 and δ18O-NO3 were enriched in the step ponds. Temporal analysis showed high NO3- levels in spring, and nitrate isotopes were enriched in summer. The study also showed that the step ponds played an important role in mitigating nitrate pollution. Nitrification and plant uptake were the significant biogeochemical processes contributing to nitrogen transformation, and denitrification hardly occurred in the stream.

  14. Fining of Red Wine Monitored by Multiple Light Scattering.

    PubMed

    Ferrentino, Giovanna; Ramezani, Mohsen; Morozova, Ksenia; Hafner, Daniela; Pedri, Ulrich; Pixner, Konrad; Scampicchio, Matteo

    2017-07-12

    This work describes a new approach based on multiple light scattering to study red wine clarification processes. The whole spectral signal (1933 backscattering points along the length of each sample vial) was fitted by a multivariate kinetic model built on a three-step mechanism: (1) adsorption of wine colloids to fining agents, (2) aggregation into larger particles, and (3) sedimentation. Each step is characterized by a reaction rate constant. Based on the first reaction rate, the results showed that gelatin was the most efficient fining agent with respect to the main objective, clarification of the wine and the consequent increase in its limpidity. This trend was also discussed in relation to the results obtained by nephelometry, total phenols, ζ-potential, color, sensory, and electronic nose analyses. Higher concentrations of the fining agent (from 5 to 30 g/100 L) or higher temperatures (from 10 to 20 °C) also sped up the process. Finally, the advantage of using the whole spectral signal over classical univariate approaches was demonstrated by comparing the uncertainties associated with the rate constants of the proposed kinetic model. Overall, the multiple light scattering technique showed great potential for studying fining processes compared with classical univariate approaches.
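    The three-step mechanism can be sketched as a first-order chain, free colloids -> colloid-agent complexes -> aggregates -> sediment, integrated here with a simple explicit Euler scheme. The rate constants are arbitrary illustrative values, not the fitted ones.

```python
def clarification_kinetics(k1, k2, k3, t_end=10.0, dt=1e-3):
    # A: free wine colloids, B: colloid-fining-agent complexes,
    # C: large aggregates, S: sediment; first-order chain A->B->C->S
    # mirroring the adsorption/aggregation/sedimentation steps.
    A, B, C, S = 1.0, 0.0, 0.0, 0.0
    for _ in range(int(t_end / dt)):
        dA = -k1 * A
        dB = k1 * A - k2 * B
        dC = k2 * B - k3 * C
        dS = k3 * C
        A += dA * dt
        B += dB * dt
        C += dC * dt
        S += dS * dt
    return A, B, C, S

A, B, C, S = clarification_kinetics(5.0, 5.0, 5.0)
# Total mass is conserved; nearly all colloids end up as sediment.
```

Fitting k1, k2, k3 to the full backscattering profile, rather than to one scalar turbidity value, is what tightens the rate-constant uncertainties reported in the study.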

  15. Monte Carlo modeling of single-molecule cytoplasmic dynein.

    PubMed

    Singh, Manoranjan P; Mallik, Roop; Gross, Steven P; Yu, Clare C

    2005-08-23

    Molecular motors are responsible for active transport and organization in the cell, underlying an enormous number of crucial biological processes. Dynein is more complicated in its structure and function than other motors. Recent experiments have found that, unlike other motors, dynein can take different size steps along microtubules depending on load and ATP concentration. We use Monte Carlo simulations to model the molecular motor function of cytoplasmic dynein at the single-molecule level. The theory relates dynein's enzymatic properties to its mechanical force production. Our simulations reproduce the main features of recent single-molecule experiments that found a discrete distribution of dynein step sizes, depending on load and ATP concentration. The model reproduces the large steps found experimentally under high ATP and no load by assuming that the ATP binding affinities at the secondary sites decrease as the number of ATP bound to these sites increases. Additionally, to capture the essential features of the step-size distribution at very low ATP concentration and no load, the ATP hydrolysis of the primary site must be dramatically reduced when none of the secondary sites have ATP bound to them. We make testable predictions that should guide future experiments related to dynein function.
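    The load- and ATP-dependent step-size distribution can be illustrated with a toy discrete sampler whose weights shift toward larger steps as ATP occupancy of the secondary sites rises, mimicking the trend that large steps dominate at high ATP and no load. The weights below are illustrative, not the fitted model from the simulations.

```python
import random

def sample_step(atp_saturation, rng):
    # Toy dynein step-size sampler (nm).  atp_saturation in [0, 1]
    # stands in for how many secondary AAA sites have ATP bound.
    sizes = [8, 16, 24, 32]
    low = [0.60, 0.25, 0.10, 0.05]   # few secondary sites occupied
    high = [0.05, 0.10, 0.25, 0.60]  # secondary sites saturated
    s = min(max(atp_saturation, 0.0), 1.0)
    weights = [(1 - s) * a + s * b for a, b in zip(low, high)]
    return rng.choices(sizes, weights=weights)[0]

rng = random.Random(0)
mean_low = sum(sample_step(0.0, rng) for _ in range(5000)) / 5000
mean_high = sum(sample_step(1.0, rng) for _ in range(5000)) / 5000
# The mean step size grows with ATP saturation, as in the experiments.
```

A full Monte Carlo model would additionally couple these weights to load and to the ATP hydrolysis rate at the primary site, as described above.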

  16. Recovery curves of the surface electric field after lightning discharges occurring between the positive charge pocket and negative charge centre in a thundercloud

    NASA Astrophysics Data System (ADS)

    Pawar, S. D.; Kamra, A. K.

    2002-12-01

    Surface observations of the electric field recovery curves of lightning discharges occurring between the positive charge pocket and the main negative charge centre in an overhead thundercloud are reported. Such recovery curves are observed to have an additional step of very slow field change at an after-discharge electric field value of 5-6 kV m-1. The behavior of the recovery curves is explained in terms of the corona charge and the relative efficiencies of the charge-generating processes responsible for the growth of the positive charge pocket and the main negative charge centre in the thundercloud. The charging currents responsible for the growth of charge in positive charge pockets are computed to be 2-4 times larger than those for the growth of the main negative charge. However, the charge destroyed in such a discharge is found to be comparable to that in a discharge between the main charge centres of the thundercloud.

  17. Modeling the archetype cysteine protease reaction using dispersion corrected density functional methods in ONIOM-type hybrid QM/MM calculations; the proteolytic reaction of papain.

    PubMed

    Fekete, Attila; Komáromi, István

    2016-12-07

    A proteolytic reaction of papain with a simple peptide model substrate, N-methylacetamide, has been studied. Our aim was twofold: (i) to propose a plausible reaction mechanism with the aid of potential energy surface scans and second geometrical derivatives calculated at the stationary points, and (ii) to investigate the applicability of dispersion-corrected density functional methods, in comparison with the popular hybrid generalized gradient approximation (GGA) method (B3LYP) without such a correction, in QM/MM calculations for this particular problem. In the resting state of papain the ion-pair and neutral forms of the Cys-His catalytic dyad have approximately the same energy and are separated by only a small barrier. Zero-point vibrational energy correction shifted this equilibrium slightly toward the neutral form. On the other hand, the electrostatic solvation free energy corrections, calculated using the Poisson-Boltzmann method for structures sampled from molecular dynamics simulation trajectories, resulted in a more stable ion-pair form. All the methods we applied predicted an acylation process of at least two elementary steps via a zwitterionic tetrahedral intermediate. Using dispersion-corrected DFT methods, the thioester S-C bond formation and the proton transfer from histidine occur in the same elementary step, although not synchronously. The proton transfer lags behind (or at least does not precede) the S-C bond formation. The predicted transition state corresponds mainly to the S-C bond formation while the proton is still on the histidine Nδ atom. In contrast, the B3LYP method with larger basis sets predicts a transition state in which the S-C bond is almost fully formed and which is mainly characterized by the Nδ(histidine) to N(amide) proton transfer. A considerably lower activation energy was predicted (especially by the B3LYP method) for the subsequent amide-bond-breaking elementary step of acyl-enzyme formation.
    Deacylation appeared to be a single-elementary-step process with all the methods we applied.

  18. Walking velocity and step length adjustments affect knee joint contact forces in healthy weight and obese adults.

    PubMed

    Milner, Clare E; Meardon, Stacey A; Hawkins, Jillian L; Willson, John D

    2018-04-28

    Knee osteoarthritis is a major public health problem, and adults with obesity are particularly at risk. One approach to alleviating this problem is to reduce the mechanical load at the joint during daily activity. Adjusting temporospatial parameters of walking could mitigate cumulative knee joint mechanical loads. The purpose of this study was to determine how adjustments to velocity and step length affect knee joint loading in healthy-weight adults and adults with obesity. We collected three-dimensional gait analysis data on 10 adults with a normal body mass index and 10 adults with obesity during overground walking in nine different conditions. In addition to preferred velocity and step length, we also tested combinations of 15% increased and decreased velocity and step length. Peak tibiofemoral joint impulse and knee adduction angular impulse were reduced in the decreased step length conditions in both healthy-weight adults (main effect) and those with obesity (interaction effect). Peak knee adduction moment was also reduced with decreased step length, and with decreased velocity, in both groups. We conclude from these results that adopting shorter step lengths during daily activity and when walking for exercise can reduce the mechanical stimuli associated with articular cartilage degenerative processes in adults with and without obesity. Thus, walking with reduced step length may benefit adults at risk for disability due to knee osteoarthritis. © 2018 Orthopaedic Research Society. Published by Wiley Periodicals, Inc. J Orthop Res 9999:XX-XX, 2018.
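    Knee adduction angular impulse is the time integral of the adduction moment over stance; with sampled gait data it reduces to a trapezoidal sum. The moment values below are hypothetical, chosen only to show the calculation.

```python
def angular_impulse(moments, dt):
    # Trapezoidal integration of a uniformly sampled joint moment
    # (N*m) over stance; returns angular impulse in N*m*s.
    total = 0.0
    for a, b in zip(moments, moments[1:]):
        total += 0.5 * (a + b) * dt
    return total

# Hypothetical adduction moment samples over a 0.6 s stance phase,
# sampled every 0.1 s:
moments = [0.0, 20.0, 35.0, 40.0, 35.0, 20.0, 0.0]
impulse = angular_impulse(moments, dt=0.1)
```

Shorter steps reduce this impulse both by lowering the peak moment and, over a fixed walking distance, by shortening each stance integral.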

  19. Efficient anaerobic transformation of raw wheat straw by a robust cow rumen-derived microbial consortium.

    PubMed

    Lazuka, Adèle; Auer, Lucas; Bozonnet, Sophie; Morgavi, Diego P; O'Donohue, Michael; Hernandez-Raquet, Guillermina

    2015-11-01

    A rumen-derived microbial consortium was enriched on raw wheat straw as the sole carbon source in a sequential batch-reactor (SBR) process under strictly mesophilic anaerobic conditions. Five cycles of enrichment enabled the selection of a stable and efficient lignocellulolytic microbial consortium, mainly constituted by members of the Firmicutes and Bacteroidetes phyla. The enriched community, designated the rumen-wheat straw-derived consortium (RWS), efficiently hydrolyzed lignocellulosic biomass, degrading 55.5% w/w of raw wheat straw over 15 days at 35 °C and accumulating carboxylates as the main products. Cellulolytic and hemicellulolytic activities, detected mainly in the cell-bound fraction, were produced in the earlier steps of degradation, and their production correlated with the maximal lignocellulose degradation rates. Overall, these results demonstrate the potential of RWS to convert unpretreated lignocellulosic substrates into useful chemicals. Copyright © 2015 Elsevier Ltd. All rights reserved.
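    If straw loss followed simple first-order kinetics (an assumption made here for illustration, not the paper's model), the reported 55.5% w/w degradation in 15 days would imply a rate constant and half-life of:

```python
import math

degraded_fraction = 0.555   # w/w raw wheat straw degraded (reported)
duration_days = 15.0

# First-order assumption: remaining fraction = exp(-k * t)
k = -math.log(1.0 - degraded_fraction) / duration_days
half_life_days = math.log(2.0) / k
# k ~ 0.054 per day, half-life ~ 12.8 days under this assumption
```

Such back-of-the-envelope rates are useful for sizing SBR cycle lengths when comparing consortia.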

  20. Enhancing field GP engagement in hospital-based studies. Rationale, design, main results and participation in the Diagest 3-GP motivation study.

    PubMed

    Berkhout, Christophe; Vandaele-Bétancourt, Marie; Robert, Stéphane; Lespinasse, Solène; Mitha, Gamil; Bradier, Quentin; Vambergue, Anne; Fontaine, Pierre

    2012-06-21

    Diagest 3 was a study aimed at lowering the risk of developing type 2 diabetes within 3 years after childbirth. Women with gestational diabetes were enrolled in the study. After childbirth, the subjects showed little interest in the structured education programme and did not attend workshops. Their general practitioners (GPs) were approached to help motivate the subjects to participate in Diagest 3, but the GPs were reluctant. The present study aimed to understand field GPs' attitudes towards hospital-based studies, and to develop strategies to enhance their involvement and reduce subject drop-out rates. We used a three-step process: step one used a phenomenological approach exploring the beliefs, attitudes, motivations and environmental factors contributing to the GPs' level of interest in the study. Data were collected in face-to-face interviews and coded by hand and with hermeneutic software to develop distinct GP profiles. Step two was a cross-sectional survey by questionnaire to determine the distribution of the profiles in the GP study population and whether completion of an attached case report form (CRF) was associated with a particular GP profile. In step three, we assessed the impact of the motivation study on participation rates in the main study. Fifteen interviews were conducted to achieve data saturation. Theorisation led to the definition of 4 distinct GP profiles. The response rate to the questionnaire was 73%, but dropped to 52% when a CRF was attached. The link between GP profiles and the rate of CRF completion remains to be verified. The GPs provided data on the CRF that was of comparable quality to those collected in the main trial. Our analysis showed that the motivation study increased overall participation in the main study by 23%, accounting for 16% (24/152) of all final visits for 536 patients who were initially enrolled in the Diagest 3 study. 
When a hospital-led study explores issues in primary care, its design must anticipate GP participation early in the trial. Based on our questionnaire response rates, we found that one in two GPs were willing to participate in our hospital-led study, regardless of their initial attitudes.

  1. Hot Forging of a Cladded Component by Automated GMAW Process

    NASA Astrophysics Data System (ADS)

    Rafiq, Muhammad; Langlois, Laurent; Bigot, Régis

    2011-01-01

    Weld cladding is employed to improve the service life of engineering components by increasing corrosion and wear resistance and reducing the cost. The acceptable multi-bead cladding layer depends on single bead geometry. Hence, in the first step, the relationship between the input process parameters and the single bead geometry was studied, and in the second step a comprehensive study of multi-bead clad layer deposition was carried out. This paper highlights an experimental study carried out to obtain a single cladding layer deposited by the automated Gas Metal Arc Welding (GMAW) process and to assess the possibility of hot forming the clad workpiece to obtain the final, improved hot-formed structure. GMAW is an arc welding process that uses an arc between a consumable electrode and the weld pool with an external shielding gas; cladding is performed by depositing weld beads side by side. The single bead experiments were conducted by varying the three main process parameters (wire feed rate, arc voltage and welding speed) while keeping other parameters, such as nozzle-to-work distance, shielding gas and its flow rate, and torch angle, constant. The effect of bead spacing and torch orientation on the quality of single-layer cladding was studied using the results of the single bead deposition. The effect of dilution rate and nominal energy on the hot bending quality of the clad layer was also studied at different temperatures.
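    The two clad-quality measures named above, dilution rate and nominal energy, have simple standard definitions that can be computed directly from bead cross-section areas and the welding parameters. A minimal sketch, assuming the usual area-based definition of dilution and arc energy per unit bead length; the numeric values are illustrative, not measured data from the study:

```python
def dilution_percent(area_penetration_mm2, area_reinforcement_mm2):
    """Dilution of a weld bead: share of melted base metal (penetration
    area) in the total bead cross-section."""
    total = area_penetration_mm2 + area_reinforcement_mm2
    return 100.0 * area_penetration_mm2 / total

def nominal_energy_j_per_mm(arc_voltage_v, current_a, travel_speed_mm_s):
    """Arc energy per unit length of bead, before any process-efficiency
    factor is applied."""
    return arc_voltage_v * current_a / travel_speed_mm_s

# Illustrative single-bead values
d = dilution_percent(12.0, 28.0)               # 12 mm^2 melted base, 28 mm^2 deposit
e = nominal_energy_j_per_mm(24.0, 180.0, 8.0)  # 24 V, 180 A, 8 mm/s travel
```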

  2. Development of MCNPX-ESUT computer code for simulation of neutron/gamma pulse height distribution

    NASA Astrophysics Data System (ADS)

    Abolfazl Hosseini, Seyed; Vosoughi, Naser; Zangian, Mehdi

    2015-05-01

    In this paper, the development of the MCNPX-ESUT (MCNPX-Energy Engineering of Sharif University of Technology) computer code for simulation of neutron/gamma pulse height distributions is reported. Since liquid organic scintillators like NE-213 are well suited and routinely used for spectrometry in mixed neutron/gamma fields, this type of detector was selected for simulation in the present study. The proposed simulation algorithm includes four main steps. The first step is the modeling of neutron/gamma transport and the particles' interactions with the materials in the environment and detector volume. In the second step, the number of scintillation photons produced by charged particles such as electrons, alphas, protons and carbon nuclei in the scintillator material is calculated. In the third step, the transport of scintillation photons in the scintillator and light guide is simulated. Finally, the resolution corresponding to the experiment is applied in the last step of the simulation. Unlike similar computer codes such as SCINFUL, NRESP7 and PHRESP, the developed code is applicable to both neutron and gamma sources; hence, discrimination of neutrons and gammas in mixed fields may be performed using MCNPX-ESUT. The main feature of MCNPX-ESUT is that the neutron/gamma pulse height simulation may be performed without any post-processing. In the present study, the pulse height distributions due to monoenergetic neutron/gamma sources in an NE-213 detector are simulated using MCNPX-ESUT. The simulated neutron pulse height distributions are validated by comparison with experimental data (Gohil et al., Nuclear Instruments and Methods in Physics Research Section A: Accelerators, Spectrometers, Detectors and Associated Equipment, 664 (2012) 304-309) and with the results obtained from similar computer codes such as SCINFUL, NRESP7 and Geant4.
The simulated gamma pulse height distribution for a 137Cs source is also compared with the experimental data.
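    Steps two and four of the algorithm above (light generation by charged particles, then resolution broadening) can be sketched as a toy model. The light-output and resolution coefficients below are illustrative placeholders, not the NE-213 parameters used in MCNPX-ESUT:

```python
import math
import random

def light_output_mevee(particle, energy_mev):
    """Step 2 sketch: scintillation light produced by a charged particle.
    Electrons respond ~linearly; protons follow a saturating, Birks-like
    response (coefficients are illustrative only)."""
    if particle == "electron":
        return energy_mev
    if particle == "proton":
        a, b = 0.95, 8.0
        return a * energy_mev ** 2 / (energy_mev + b)
    raise ValueError(f"unknown particle: {particle}")

def broadened_pulse_height(light, rng, a=0.10, b=0.05, c=0.002):
    """Step 4 sketch: Gaussian broadening with the common parametrisation
    FWHM/L = sqrt(a^2 + b^2/L + c^2/L^2)."""
    fwhm = light * math.sqrt(a ** 2 + b ** 2 / light + (c / light) ** 2)
    return rng.gauss(light, fwhm / 2.355)  # FWHM -> sigma

rng = random.Random(0)
heights = [broadened_pulse_height(light_output_mevee("electron", 1.0), rng)
           for _ in range(2000)]
```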

  3. 46 CFR 171.067 - Treatment of stepped and recessed bulkheads in Type I subdivision.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 46 Shipping 7 2010-10-01 2010-10-01 false Treatment of stepped and recessed bulkheads in Type I... Treatment of stepped and recessed bulkheads in Type I subdivision. (a) For the purpose of this section— (1) The main transverse watertight bulkhead immediately forward of a stepped bulkhead is referred to as...

  4. 46 CFR 171.073 - Treatment of stepped and recessed bulkheads in Type II subdivision.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 46 Shipping 7 2010-10-01 2010-10-01 false Treatment of stepped and recessed bulkheads in Type II... Treatment of stepped and recessed bulkheads in Type II subdivision. (a) A main transverse watertight bulkhead may not be stepped unless additional watertight bulkheads are located as shown in Figure 171.067(a...

  5. Rapid Thermal Processing (RTP) of semiconductors in space

    NASA Technical Reports Server (NTRS)

    Anderson, T. J.; Jones, K. S.

    1993-01-01

    The progress achieved on the project entitled 'Rapid Thermal Processing of Semiconductors in Space' over a 12-month period of activity ending March 31, 1993 is summarized. The activity of this group is being performed under the direct auspices of the ROMPS program. The main objective of this program is to develop and demonstrate the use of advanced robotics in space, with rapid thermal processing (RTP) of semiconductors providing the test technology. Rapid thermal processing is an ideal processing step for demonstration purposes since it encompasses many of the characteristics of other processes used in solid state device manufacturing. Furthermore, a low thermal budget is becoming more important in existing manufacturing practice, while a low thermal budget is critical to successful processing in space. A secondary objective of this project is to determine the influence of microgravity on the rapid thermal process for a variety of operating modes. In many instances, this involves one or more fluid phases. The advancement of microgravity processing science is an important ancillary objective.

  6. Conjugated Polymers Atypically Prepared in Water

    PubMed Central

    Invernale, Michael A.; Pendergraph, Samuel A.; Yavuz, Mustafa S.; Ombaba, Matthew; Sotzing, Gregory A.

    2010-01-01

    Processability remains a fundamental issue for the implementation of conducting polymer technology. A simple synthetic route towards processable precursors to conducting polymers (main chain and side chain) was developed using commercially available materials. These soluble precursor systems were converted to conjugated polymers electrochemically in aqueous media, offering a cheaper and greener method of processing. Oxidative conversion in aqueous and organic media produced equivalent electrochromics. The precursor method enhances the yield of the electrochromic polymer obtained over that of electrodeposition, and it relies on a less corruptible electrolyte bath. However, electrochemical conversion of the precursor polymers often relies on organic salts and solvents. The ability to achieve oxidative conversion in brine offers a less costly and more environmentally friendly processing step. It is also beneficial for biological applications. The electrochromics obtained herein were evaluated for electronic, spectral, and morphological properties. PMID:20959869

  7. Method for localizing and isolating an errant process step

    DOEpatents

    Tobin, Jr., Kenneth W.; Karnowski, Thomas P.; Ferrell, Regina K.

    2003-01-01

    A method for localizing and isolating an errant process includes the steps of retrieving, from a defect image database, a selection of images, each having image content similar to the image content extracted from a query image depicting a defect, and each having corresponding defect characterization data. A conditional probability distribution of the defect having occurred in a particular process step is derived from the defect characterization data. The process step that is the most probable source of the defect according to the derived conditional probability distribution is then identified. A method for process step defect identification includes the steps of characterizing anomalies in a product, the anomalies being detected by an imaging system. A query image of a product defect is then acquired. A particular characterized anomaly is then correlated with the query image. An errant process step is then associated with the correlated image.
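    The core of the localization step, turning the process-step labels attached to the retrieved similar images into a conditional probability distribution and taking its mode, can be sketched in a few lines. The step names here are hypothetical placeholders, not labels from the patent:

```python
from collections import Counter

def localize_errant_step(retrieved_step_labels):
    """Derive P(step | defect) from the process-step labels of the
    retrieved similar defect images; return the most probable step
    and the full distribution."""
    counts = Counter(retrieved_step_labels)
    total = sum(counts.values())
    dist = {step: n / total for step, n in counts.items()}
    return max(dist, key=dist.get), dist

# Hypothetical labels attached to five retrieved similar-defect images
best, dist = localize_errant_step(
    ["etch", "etch", "etch", "lithography", "deposition"])
```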

  8. Automation in clinical bacteriology: what system to choose?

    PubMed

    Greub, G; Prod'hom, G

    2011-05-01

    With increased activity and reduced financial and human resources, there is a need for automation in clinical bacteriology. Initial processing of clinical samples includes repetitive and tedious steps. These tasks are suitable for automation, and several instruments are now available on the market, including the WASP (Copan), Previ-Isola (BioMerieux), Innova (Becton-Dickinson) and Inoqula (KIESTRA) systems. These new instruments allow efficient and accurate inoculation of samples, including four main steps: (i) selecting the appropriate Petri dish; (ii) inoculating the sample; (iii) spreading the inoculum on agar plates to obtain, upon incubation, well-separated bacterial colonies; and (iv) accurate labelling and sorting of each inoculated medium. The challenge for clinical bacteriologists is to determine which automated system is ideal for their own laboratory. Indeed, different solutions will be preferred according to the number and variety of samples, and to the types of sample that will be processed with the automated system. The final choice is difficult, because audits proposed by manufacturers risk being biased towards their own company's solution, and because these automated systems may not be easily tested on site prior to the final decision, owing to the complexity of computer connections between the laboratory information system and the instrument. This article thus summarizes the main parameters that need to be taken into account when choosing the optimal system, and provides some clues to help clinical bacteriologists make their choice. © 2011 The Authors. Clinical Microbiology and Infection © 2011 European Society of Clinical Microbiology and Infectious Diseases.

  9. Production of Low Cost Carbon-Fiber through Energy Optimization of Stabilization Process.

    PubMed

    Golkarnarenji, Gelayol; Naebe, Minoo; Badii, Khashayar; Milani, Abbas S; Jazar, Reza N; Khayyam, Hamid

    2018-03-05

    To produce high quality, low cost carbon fiber-based composites, optimizing the carbon fiber production process and the resulting fiber properties is key. The stabilization process is the most important step in carbon fiber production; it consumes a large amount of energy, and its optimization can reduce the cost to a large extent. In this study, two intelligent optimization techniques, namely Support Vector Regression (SVR) and Artificial Neural Network (ANN), were studied and compared, using a limited dataset, to predict a physical property (density) of oxidatively stabilized PAN fiber (OPF) in the second zone of a stabilization oven within a carbon fiber production line. The results were then used to optimize the energy consumption of the process. The case study can be beneficial to chemical industries involved in carbon fiber manufacturing for assessing and optimizing different stabilization process conditions at large.
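    As an illustration of the regression task described (predicting OPF density from a process variable), the sketch below fits an RBF-kernel ridge regression, a close dependency-free stand-in for SVR, not the method actually used in the study. The zone-temperature and density values are invented for the example:

```python
import math

def rbf(x, y, gamma=0.5):
    """Gaussian (RBF) kernel between two scalar inputs."""
    return math.exp(-gamma * (x - y) ** 2)

def fit_kernel_ridge(xs, ys, lam=1e-3, gamma=0.5):
    """Solve (K + lam*I) alpha = y by naive Gaussian elimination."""
    n = len(xs)
    A = [[rbf(xs[i], xs[j], gamma) + (lam if i == j else 0.0)
          for j in range(n)] + [ys[i]] for i in range(n)]
    for col in range(n):                       # forward elimination
        piv = max(range(col, n), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        for r in range(col + 1, n):
            f = A[r][col] / A[col][col]
            for c in range(col, n + 1):
                A[r][c] -= f * A[col][c]
    alpha = [0.0] * n                          # back substitution
    for r in range(n - 1, -1, -1):
        s = A[r][n] - sum(A[r][c] * alpha[c] for c in range(r + 1, n))
        alpha[r] = s / A[r][r]
    return alpha

def predict(xs, alpha, x, gamma=0.5):
    return sum(a * rbf(xi, x, gamma) for a, xi in zip(alpha, xs))

# Invented data: normalised zone temperature -> OPF density (g/cm^3)
xs = [0.0, 1.0, 2.0, 3.0]
ys = [1.30, 1.34, 1.37, 1.39]
alpha = fit_kernel_ridge(xs, ys)
```

A fitted surrogate like this is what makes the energy optimization step cheap: candidate oven settings can be scored against the model instead of against the physical line.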

  10. Production of Low Cost Carbon-Fiber through Energy Optimization of Stabilization Process

    PubMed Central

    Golkarnarenji, Gelayol; Naebe, Minoo; Badii, Khashayar; Milani, Abbas S.; Jazar, Reza N.; Khayyam, Hamid

    2018-01-01

    To produce high quality, low cost carbon fiber-based composites, optimizing the carbon fiber production process and the resulting fiber properties is key. The stabilization process is the most important step in carbon fiber production; it consumes a large amount of energy, and its optimization can reduce the cost to a large extent. In this study, two intelligent optimization techniques, namely Support Vector Regression (SVR) and Artificial Neural Network (ANN), were studied and compared, using a limited dataset, to predict a physical property (density) of oxidatively stabilized PAN fiber (OPF) in the second zone of a stabilization oven within a carbon fiber production line. The results were then used to optimize the energy consumption of the process. The case study can be beneficial to chemical industries involved in carbon fiber manufacturing for assessing and optimizing different stabilization process conditions at large. PMID:29510592

  11. 25 CFR 15.11 - What are the basic steps of the probate process?

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 25 Indians 1 2010-04-01 2010-04-01 false What are the basic steps of the probate process? 15.11 Section 15.11 Indians BUREAU OF INDIAN AFFAIRS, DEPARTMENT OF THE INTERIOR PROBATE PROBATE OF INDIAN... are the basic steps of the probate process? The basic steps of the probate process are: (a) We learn...

  12. Particulate photocatalysts for overall water splitting

    NASA Astrophysics Data System (ADS)

    Chen, Shanshan; Takata, Tsuyoshi; Domen, Kazunari

    2017-10-01

    The conversion of solar energy to chemical energy is a promising way of generating renewable energy. Hydrogen production by means of water splitting over semiconductor photocatalysts is a simple, cost-effective approach to large-scale solar hydrogen synthesis. Since the discovery of the Honda-Fujishima effect, considerable progress has been made in this field, and numerous photocatalytic materials and water-splitting systems have been developed. In this Review, we summarize existing water-splitting systems based on particulate photocatalysts, focusing on the main components: light-harvesting semiconductors and co-catalysts. The essential design principles of the materials employed for overall water-splitting systems based on one-step and two-step photoexcitation are also discussed, concentrating on three elementary processes: photoabsorption, charge transfer and surface catalytic reactions. Finally, we outline challenges and potential advances associated with solar water splitting by particulate photocatalysts for future commercial applications.

  13. Investigation for the differentiation process of mouse ES cells by Raman spectroscopy

    NASA Astrophysics Data System (ADS)

    Yamaguchi, Yoshinori; El-Hagrasy, Maha A.; Shimizu, Eiichi; Saito, Masato; Tamiya, Eiichi

    2012-03-01

    The arrangement of differentiating pluripotent embryonic stem cells into three-dimensional aggregates, known as embryoid bodies, is a main step in progressing embryonic stem cell differentiation. In this work, embryonic stem cells that were produced directly from the hanging drop step as a three-dimensional structure, with no further two-dimensional differentiation, were examined with Raman spectroscopy as a non-invasive and label-free technique. Raman spectroscopy was employed to discriminate between mouse embryoid bodies of different degrees of maturation. EBs were prepared using the hanging drop method. The Raman scattering measurements were obtained in vitro with a Nanophoton RAMAN-11 micro-spectrometer (Nanophoton, Japan; www.nanophoton.jp) equipped with an Olympus XLUM Plan FLN 20X/NA=1.0 objective lens. Spectral data were smoothed, baseline corrected and normalized to the well-defined, intense 1003 cm-1 band (phenylalanine), which is insensitive to changes in conformation or environment. The differentiation process of embryonic stem cells is initiated by the removal of LIF from the culture medium. 1-, 7- and 17-day-old embryonic stem cells were collected and investigated by Raman spectroscopy. The main differences involve bands which decreased with maturation, such as 784 cm-1 (U, T, C ring breathing in DNA/RNA, O-P-O stretch), 1177 cm-1 (cytosine, guanine) and 1578 cm-1 (G, A). It was found that the protein content was amplified as differentiation progressed. The increase of the protein to nucleic acid ratio was also previously observed with the progress of the differentiation process. Raman spectroscopy has the potential to distinguish between the Raman signatures of live embryonic stem cells with different degrees of maturation.
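    The spectral pre-treatment described above (baseline correction, then normalisation to the 1003 cm-1 phenylalanine band) can be sketched with a simple linear baseline drawn between the spectrum endpoints. Real pipelines typically use polynomial or rolling-ball baselines, so this is a minimal illustration only:

```python
def preprocess_spectrum(wavenumbers, intensities, ref_band=1003.0):
    """Subtract a linear baseline drawn between the spectrum endpoints,
    then normalise to the point nearest the reference band."""
    n = len(intensities)
    b0, b1 = intensities[0], intensities[-1]
    baseline = [b0 + (b1 - b0) * i / (n - 1) for i in range(n)]
    corrected = [y - b for y, b in zip(intensities, baseline)]
    # index of the channel closest to the 1003 cm-1 phenylalanine band
    i_ref = min(range(n), key=lambda i: abs(wavenumbers[i] - ref_band))
    scale = corrected[i_ref]
    return [c / scale for c in corrected]

# Tiny three-channel toy spectrum (wavenumbers in cm-1)
spec = preprocess_spectrum([780.0, 1003.0, 1200.0], [2.0, 6.0, 4.0])
```

After this normalisation, band ratios such as 784 cm-1 / 1003 cm-1 become comparable across spectra of EBs at different maturation stages.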

  14. Comparison of adsorption of Remazol Black B and Acidol Red on microporous activated carbon felt.

    PubMed

    Donnaperna, L; Duclaux, L; Gadiou, R; Hirn, M-P; Merli, C; Pietrelli, L

    2009-11-15

    The adsorption of two anionic dyes, Remazol Black B (RB5) and Acidol Red 2BE-NW (AR42), onto a microporous activated carbon felt was investigated. The characterization of the carbon surface chemistry by X-ray microanalysis, Boehm titrations, and pH-PZC measurements indicates that the surface oxygenated groups are mainly acidic. The rate of adsorption depends on the pH, and the experimental data fit the intraparticle diffusion model. The pore size distribution obtained by DFT analysis shows that the mean pore size is close to 1 nm, which indicates that a slow intraparticle diffusion process controls the adsorption. The adsorption isotherms were measured for different pH values. The Khan and the Langmuir-Freundlich models lead to the best agreement with experimental data for RB5 and AR42, respectively. These isotherm simulations and the pH dependence of adsorption show that the adsorption capacity is mainly controlled by nondispersive electrostatic interactions for pH values below 4. The adsorption kinetics, the irreversibility of the process, and the influence of the pH indicate that adsorption in this microporous felt proceeds through two steps. The first one is fast and results from direct interaction of dye molecules with the external surface of the carbon material (which accounts for 10% of the whole surface area); in the second, slow step, the adsorption rate is controlled by the slow diffusion of dye molecules into the narrow micropores. The influence of temperature on the adsorption isotherms was studied and the thermodynamic parameters were obtained. They show that the process is spontaneous and exothermic.
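    The models named above have standard closed forms. A small sketch (parameters illustrative, not the fitted values from the study) showing the Khan and Langmuir-Freundlich isotherms alongside the Weber-Morris intraparticle diffusion step, and that both isotherms reduce to the Langmuir form in the appropriate limit:

```python
import math

def langmuir(ce, qm, k):
    """Langmuir isotherm: qe = qm K Ce / (1 + K Ce)."""
    return qm * k * ce / (1 + k * ce)

def langmuir_freundlich(ce, qm, k, n):
    """Langmuir-Freundlich (Sips): qe = qm (K Ce)^n / (1 + (K Ce)^n);
    reduces to Langmuir at n = 1."""
    t = (k * ce) ** n
    return qm * t / (1 + t)

def khan(ce, qm, bk, ak):
    """Khan isotherm: qe = qm bK Ce / (1 + bK Ce)^aK;
    reduces to Langmuir at aK = 1."""
    return qm * bk * ce / (1 + bk * ce) ** ak

def weber_morris(t, k_id, c):
    """Intraparticle diffusion kinetics: qt = k_id sqrt(t) + C."""
    return k_id * math.sqrt(t) + c
```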

  15. Research on the treatment of liquid waste containing cesium by an adsorption-microfiltration process with potassium zinc hexacyanoferrate.

    PubMed

    Zhang, Chang-Ping; Gu, Ping; Zhao, Jun; Zhang, Dong; Deng, Yue

    2009-08-15

    The removal of cesium from an aqueous solution by an adsorption-microfiltration (AMF) process was investigated in jar tests and lab-scale tests. The adsorbent was K(2)Zn(3)[Fe(CN)(6)](2). The cesium data obtained in the jar test fit a Freundlich-type isotherm well. In the lab-scale test, the mean cesium concentrations of the raw water and the effluent were 106.87 microg/L and 0.59 microg/L, respectively, the mean removal of cesium was 99.44%, and the mean decontamination factor (DF) and concentration factor (CF) were 208 and 539, respectively. The removal of cesium in the lab-scale test was better than that in the jar test because the old adsorbents remaining in the reactor still had adsorption capacity, provided that no significant desorption was observed, and because the continuous renewal of the adsorbent surface improved the adsorption capacity of the adsorbent. Some of the suspended solids were deposited on the bottom of the reactor, which would affect the mixing of adsorbents with the raw water and the renewal of the adsorbent surface. Membrane fouling was the main physical fouling mechanism, and the cake layer was the main filtration resistance. Specific flux (SF) decreased step by step during the whole period of operation due to membrane fouling and concentration polarization. The quality of the effluent was good: the turbidity remained lower than 0.1 NTU, and the toxic anion CN(-) could not be detected because of its low concentration, indicating that the effluent was safe. The AMF process was feasible for practical application in the treatment of liquid waste containing cesium.
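    The removal percentage follows directly from the feed and effluent concentrations, and a linearised least-squares fit is one common way to obtain Freundlich constants from jar-test data. A sketch, using the reported concentrations for the removal figure and synthetic data (not the study's measurements) for the fit:

```python
import math

def percent_removal(c_in, c_out):
    """Removal efficiency from feed and effluent concentrations."""
    return 100.0 * (c_in - c_out) / c_in

def fit_freundlich(ce, qe):
    """Linearised fit of qe = Kf * Ce^(1/n):
    log10 qe = log10 Kf + (1/n) log10 Ce."""
    xs = [math.log10(c) for c in ce]
    ys = [math.log10(q) for q in qe]
    m = len(xs)
    mx, my = sum(xs) / m, sum(ys) / m
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    kf = 10 ** (my - slope * mx)
    return kf, 1.0 / slope  # (Kf, n)

removal = percent_removal(106.87, 0.59)   # reported mean raw vs effluent Cs
# Synthetic isotherm generated with Kf = 2, n = 2; the fit should recover both
ce = [1.0, 10.0, 100.0]
kf, n = fit_freundlich(ce, [2.0 * c ** 0.5 for c in ce])
```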

  16. A New Vegetation Segmentation Approach for Cropped Fields Based on Threshold Detection from Hue Histograms

    PubMed Central

    Hassanein, Mohamed; El-Sheimy, Naser

    2018-01-01

    Over the last decade, the use of unmanned aerial vehicle (UAV) technology has evolved significantly in different applications, as it provides a special platform capable of combining the benefits of terrestrial and aerial remote sensing. Therefore, such technology has been established as an important source of data collection for different precision agriculture (PA) applications such as crop health monitoring and weed management. Generally, these PA applications depend on performing a vegetation segmentation process as an initial step, which aims to detect the vegetation objects in collected agricultural field images. The main result of the vegetation segmentation process is a binary image, where vegetation is presented in white and the remaining objects are presented in black. Such a process can easily be performed using different vegetation indexes derived from multispectral imagery. Recently, to expand the use of UAV imagery systems for PA applications, it has become important to reduce the cost of such systems by using low-cost RGB cameras. Thus, developing vegetation segmentation techniques for RGB images is a challenging problem. This paper introduces a new vegetation segmentation methodology for low-cost UAV RGB images, which depends on using the Hue color channel. The proposed methodology follows the assumption that the colors in any agricultural field image can be divided into vegetation and non-vegetation colors. Therefore, four main steps are developed to detect five different threshold values using the hue histogram of the RGB image; these thresholds are capable of discriminating the dominant color, either vegetation or non-vegetation, within the agricultural field image. The achieved results from implementing the proposed methodology showed its ability to generate accurate and stable vegetation segmentation performance, with a mean accuracy of 87.29% and a standard deviation of 12.5%. PMID:29670055
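    One simple way to detect a threshold in a hue histogram, in the spirit of the histogram-based step described above, is Otsu's between-class variance criterion. The paper derives five thresholds through its own four-step procedure; this sketch finds a single threshold on a synthetic bimodal hue histogram and is an illustration, not the paper's method:

```python
def otsu_threshold(hist):
    """Return the bin index that maximises between-class variance for a
    1-D histogram (e.g. a 180-bin hue histogram)."""
    total = sum(hist)
    sum_all = sum(i * h for i, h in enumerate(hist))
    w0 = cum0 = 0
    best_t, best_var = 0, -1.0
    for t, h in enumerate(hist):
        w0 += h                      # pixels at or below bin t
        cum0 += t * h
        if w0 == 0 or w0 == total:
            continue
        m0 = cum0 / w0               # mean hue of class 0
        m1 = (sum_all - cum0) / (total - w0)
        var = w0 * (total - w0) * (m0 - m1) ** 2
        if var > best_var:
            best_var, best_t = var, t
    return best_t

# Synthetic bimodal hue histogram: a soil-like mode near bin 20 and a
# vegetation-like mode near bin 100 (counts are illustrative)
hist = [0] * 180
for b, c in [(18, 5), (19, 12), (20, 20), (21, 12), (22, 5),
             (98, 5), (99, 12), (100, 20), (101, 12), (102, 5)]:
    hist[b] = c
t = otsu_threshold(hist)
```

For two well-separated modes the maximiser lands at the upper edge of the lower mode, cleanly splitting soil-like from vegetation-like hues.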

  17. Experimental study of the flow over a backward-facing rounded ramp

    NASA Astrophysics Data System (ADS)

    Duriez, Thomas; Aider, Jean-Luc; Wesfreid, Jose Eduardo

    2010-11-01

    The backward-facing rounded ramp (BFR) is a very simple geometry leading to boundary layer separation, close to the backward-facing step (BFS) flow. The main difference from the BFS flow is that the separation location depends on the incoming flow, while it is fixed at the step edge for the BFS flow. Despite the simplicity of the geometry, the flow is complex and the transition process still has to be investigated. In this study we investigate the BFR flow using time-resolved PIV. For Reynolds numbers ranging between 300 and 12 000, we first study the time-averaged properties such as the positions of separation and reattachment, the recirculation length and the shear layer thickness. The time resolution also gives access to the characteristic frequencies of the time-dependent flow. An appropriate Fourier filtering of the flow field, around each frequency peak in the global spectrum, allows an investigation of each mode in order to extract its wavelength, phase velocity and spatial distribution. We then sort the spectral content and relate the main frequencies to the most amplified Kelvin-Helmholtz instability mode and its harmonics, the vortex pairing, the low-frequency recirculation bubble oscillation, and the interactions between all these phenomena.
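    The frequency-sorting step, finding the peaks in the global spectrum before Fourier-filtering each mode, can be illustrated with a naive DFT on a sampled velocity signal. A real PIV analysis would use an FFT applied at every point of the field; the 5 Hz probe signal below is synthetic:

```python
import cmath
import math

def dominant_frequency(signal, dt):
    """Return the frequency (Hz) of the largest non-DC spectral peak,
    using a naive O(n^2) DFT on the mean-subtracted signal."""
    n = len(signal)
    mean = sum(signal) / n
    x = [s - mean for s in signal]
    best_k, best_mag = 1, -1.0
    for k in range(1, n // 2):
        xk = sum(x[j] * cmath.exp(-2j * math.pi * k * j / n) for j in range(n))
        if abs(xk) > best_mag:
            best_mag, best_k = abs(xk), k
    return best_k / (n * dt)

# Synthetic probe signal: a 5 Hz oscillation sampled at 64 Hz for 1 s
dt = 1.0 / 64.0
sig = [math.sin(2 * math.pi * 5.0 * j * dt) for j in range(64)]
f = dominant_frequency(sig, dt)
```

Once a peak frequency is known, band-passing the velocity field around it isolates the corresponding mode for wavelength and phase-velocity extraction.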

  18. Construction and evaluation of a modular biofilm-forming chamber for microbial recovery of neodymium and semi-continuous biofilm preparation. Tolerance of Serratia sp.N14 on acidic conditions and neutralized aqua regia.

    PubMed

    Vavlekas, Dimitrios A

    2017-02-01

    Recovery of neodymium from liquid metallic wastes and scrap leachates is a crucial step for its recycling, which can take place through the immobilized biofilms of Serratia sp. N14. These biofilms are produced in a fermentor vessel with a turnaround time of 10-14 days, which is unacceptable from an economic point of view for an industrial process. This study proposes the construction and evaluation of a modular system, whereby a biofilm-forming chamber is inserted into the continuous biomass outflow of the main chemostat vessel, for an alternative semi-continuous and economic production of biofilm. The activity of the biofilm from the outflow chamber was found to be the same as that of the biofilm from the main chamber, which was stored in a cold room (4°C) for 9-12 months, depending on a 24 h nucleation step. Moreover, the ability of the biofilm to function in the presence of a leaching agent (aqua regia) or in acidic conditions was also evaluated. The biofilm of the main chamber can remain active even at 50% neutralized aqua regia (pH 3.0), while in acidic conditions, phosphate release by the cells is reduced to 50%. This strain proves to be very tolerant of low pH or high salt concentration solutions. The biofilm produced from the outflow of the main fermentor vessel is of acceptable activity and can be used rather than being disposed of.

  19. Gastropod diversification and community structuring processes in ancient Lake Ohrid: a metacommunity speciation perspective

    NASA Astrophysics Data System (ADS)

    Hauffe, T.; Albrecht, C.; Wilke, T.

    2015-09-01

    The Balkan Lake Ohrid is the oldest and most speciose freshwater lacustrine system in Europe. However, it remains unclear whether the diversification of its endemic taxa is mainly driven by neutral processes, environmental factors, or species interactions. This calls for a holistic perspective involving both evolutionary processes and ecological dynamics. Such a unifying framework - the metacommunity speciation model - considers how community assembly affects diversification and vice versa by assessing the relative contribution of the three main community assembly processes: dispersal limitation, environmental filtering, and species interaction. The current study therefore used the species-rich model taxon Gastropoda to assess how extant communities in Lake Ohrid are structured by performing process-based metacommunity analyses. Specifically, the study aimed at (i) identifying the relative importance of the three community assembly processes and (ii) testing whether the importance of these individual processes changes gradually with lake depth or whether they are distinctively related to eco-zones. Based on specific simulation steps for each of the three processes, it could be demonstrated that dispersal limitation had the strongest influence on gastropod community structures in Lake Ohrid. However, it was not the exclusive assembly process but acted together with the other two processes, environmental filtering and species interaction. In fact, the relative importance of the three community assembly processes varied both with lake depth and eco-zones, though the processes were better predicted by the latter. The study thus corroborated the high importance of dispersal limitation for both maintaining species richness in Lake Ohrid (through its impact on community structure) and generating endemic biodiversity (via its influence on diversification processes).
However, according to the metacommunity speciation model, the inferred importance of environmental filtering and biotic interaction also suggests a small but significant influence of ecological speciation. These findings contribute to the main goal of the SCOPSCO initiative - inferring the drivers of biotic evolution - and might provide an integrative perspective on biological and limnological dynamics in ancient Lake Ohrid.

  20. Precise turnaround time measurement of laboratory processes using radiofrequency identification technology.

    PubMed

    Mayer, Horst; Brümmer, Jens; Brinkmann, Thomas

    2011-01-01

    To implement Lean Six Sigma in our central laboratory, we conducted a project to measure single pre-analytical steps influencing the turnaround time (TAT) of emergency department (ED) serum samples. The traditional approach of extracting data from the Laboratory Information System (LIS) for a retrospective calculation of a mean TAT is not suitable. Therefore, we used radiofrequency identification (RFID) chips for real-time tracking of individual samples at any pre-analytical step. 1,200 serum tubes were labelled with RFID chips and were provided to the emergency department. Three RFID receivers were installed in the laboratory: at the outlet of the pneumatic tube system, at the centrifuge, and in the analyser area. In addition, time stamps of sample entry at the automated sample distributor and communication of results from the analyser were collected from the LIS. 1,023 labelled serum tubes arrived at our laboratory. 899 RFID tags were used for TAT calculation. The following transfer times were determined (median/95th percentile in min:sec): pneumatic tube system --> centrifuge (01:25/04:48), centrifuge --> sample distributor (14:06/5:33), sample distributor --> analysis system zone (02:39/15:07), analysis system zone --> result communication (12:42/22:21). Total TAT was calculated at 33:19/57:40 min:sec. Manual processes around centrifugation were identified as a major part of TAT, with 44%/60% (median/95th percentile). RFID is a robust, easy to use, and error-free technology that is not susceptible to interferences in the laboratory environment. With this study design we were able to measure significant variations in a single manual sample transfer process. We showed that TAT is mainly influenced by manual steps around the centrifugation process, and we concluded that centrifugation should be integrated into solutions for total laboratory automation.
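    The median/95th-percentile reporting used above can be reproduced with a nearest-rank percentile over per-sample step durations. A sketch; the durations are invented for illustration, not the study's RFID data:

```python
import math

def percentile(values, p):
    """Nearest-rank percentile: smallest value with at least p% of the
    data at or below it."""
    s = sorted(values)
    k = max(0, math.ceil(p / 100.0 * len(s)) - 1)
    return s[k]

def fmt_mmss(seconds):
    """Format a duration in seconds as mm:ss, matching the TAT tables."""
    return f"{seconds // 60:02d}:{seconds % 60:02d}"

# Invented per-sample durations (seconds) for one pre-analytical step
durations = [60, 85, 90, 120, 300]
median = percentile(durations, 50)
p95 = percentile(durations, 95)
```

With per-sample RFID timestamps, the same two numbers can be computed for each transfer (tube system to centrifuge, centrifuge to distributor, and so on) rather than only a laboratory-wide mean.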

  1. Canada's Deep Geological Repository For Used Nuclear Fuel -The Geoscientific Site Evaluation Process

    NASA Astrophysics Data System (ADS)

    Hirschorn, S.; Ben Belfadhel, M.; Blyth, A.; DesRoches, A. J.; McKelvie, J. R. M.; Parmenter, A.; Sanchez-Rico Castejon, M.; Urrutia-Bustos, A.; Vorauer, A.

    2014-12-01

    The Nuclear Waste Management Organization (NWMO) is responsible for implementing Adaptive Phased Management, the approach selected by the Government of Canada for long-term management of used nuclear fuel generated by Canadian nuclear reactors. In May 2010, the NWMO published and initiated a nine-step site selection process to find an informed and willing community to host a deep geological repository for Canada's used nuclear fuel. The site selection process is designed to address a broad range of technical and social, economic and cultural factors. The suitability of candidate areas will be assessed in a stepwise manner over a period of many years and include three main steps: Initial Screenings; Preliminary Assessments; and Detailed Site Characterizations. The Preliminary Assessment is conducted in two phases. NWMO has completed Phase 1 preliminary assessments for the first eight communities that entered into this step. While the Phase 1 desktop geoscientific assessments showed that each of the eight communities contains general areas that have the potential to satisfy the geoscientific safety requirements for hosting a deep geological repository, the assessment identified varying degrees of geoscientific complexity and uncertainty between communities, reflecting their different geological settings and structural histories. Phase 2 activities will include a sequence of high-resolution airborne geophysical surveys and focused geological field mapping to ground-truth lithology and structural features, followed by limited deep borehole drilling and testing. These activities will further evaluate the site's ability to meet the safety functions that a site would need to ultimately satisfy in order to be considered suitable. This paper provides an update on the site evaluation process and describes the approach, methods and criteria that are being used to conduct the geoscientific Preliminary Assessments.

  2. 48 CFR 15.202 - Advisory multi-step process.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 48 Federal Acquisition Regulations System 1 2010-10-01 2010-10-01 false Advisory multi-step... Information 15.202 Advisory multi-step process. (a) The agency may publish a presolicitation notice (see 5.204... participate in the acquisition. This process should not be used for multi-step acquisitions where it would...

  3. Friction Stir Welding of GR-Cop 84 for Combustion Chamber Liners

    NASA Technical Reports Server (NTRS)

    Russell, Carolyn K.; Carter, Robert; Ellis, David L.; Goudy, Richard

    2004-01-01

    GRCop-84 is a copper-chromium-niobium alloy developed by the Glenn Research Center for liquid rocket engine combustion chamber liners. GRCop-84 exhibits superior properties over conventional copper-base alloys in a liquid hydrogen-oxygen operating environment. The Next Generation Launch Technology program has funded a program to demonstrate scale-up production capabilities of GRCop-84 to levels suitable for main combustion chamber production for the prototype rocket engine. This paper describes a novel method of manufacturing the main combustion chamber liner. The process consists of several steps: extrude the GRCop-84 powder into billets, roll the billets into plates, bump form the plates into cylinder halves and friction stir weld the halves into a cylinder. The cylinder is then metal spun formed to near-net liner dimensions, followed by finish machining to the final configuration. This paper describes the friction stir weld process development, including tooling and non-destructive inspection techniques, culminating in the successful production of a liner preform completed through spin forming.

  4. A Sequential Multiplicative Extended Kalman Filter for Attitude Estimation Using Vector Observations.

    PubMed

    Qin, Fangjun; Chang, Lubin; Jiang, Sai; Zha, Feng

    2018-05-03

    In this paper, a sequential multiplicative extended Kalman filter (SMEKF) is proposed for attitude estimation using vector observations. In the proposed SMEKF, each of the vector observations is processed sequentially to update the attitude, which can make the measurement model linearization more accurate for the next vector observation. This is the main difference to Murrell’s variation of the MEKF, which does not update the attitude estimate during the sequential procedure. Meanwhile, the covariance is updated after all the vector observations have been processed, which is used to account for the special characteristics of the reset operation necessary for the attitude update. This is the main difference to the traditional sequential EKF, which updates the state covariance at each step of the sequential procedure. The numerical simulation study demonstrates that the proposed SMEKF has more consistent and accurate performance in a wide range of initial estimate errors compared to the MEKF and its traditional sequential forms.
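    The sequential structure described above can be sketched in a simplified small-angle form. This is a toy illustration, not the authors' exact SMEKF: real implementations use quaternions and a covariance reset step, and the `skew` helper, noise values and immediate-correction scheme here are assumptions.

    ```python
    import numpy as np

    def skew(v):
        """Cross-product (skew-symmetric) matrix of a 3-vector."""
        return np.array([[0.0, -v[2], v[1]],
                         [v[2], 0.0, -v[0]],
                         [-v[1], v[0], 0.0]])

    def sequential_attitude_update(A_ref, P, obs, R):
        """Process vector observations one at a time, applying the attitude
        correction immediately after each observation (the SMEKF idea), so the
        next linearization uses the freshest estimate; the covariance handed
        back to the caller is formed once, after the loop.

        A_ref : 3x3 attitude matrix estimate (reference frame -> body frame)
        P     : 3x3 covariance of the small-angle attitude error
        obs   : list of (b, r) pairs: measured body vector b, reference vector r
        R     : 3x3 measurement noise covariance
        """
        P_work = P.copy()
        for b, r in obs:
            pred = A_ref @ r                         # predicted body-frame vector
            H = skew(pred)                           # linearized measurement matrix
            S = H @ P_work @ H.T + R
            K = P_work @ H.T @ np.linalg.inv(S)
            da = K @ (b - pred)                      # small-angle correction
            A_ref = (np.eye(3) - skew(da)) @ A_ref   # apply correction immediately
            P_work = (np.eye(3) - K @ H) @ P_work    # interim covariance for gains
        return A_ref, P_work
    ```

    With two noiseless vector observations and a small true rotation, a single pass recovers the attitude to first order.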

  5. A method to identify the main mode of machine tool under operating conditions

    NASA Astrophysics Data System (ADS)

    Wang, Daming; Pan, Yabing

    2017-04-01

    The identification of modal parameters under experimental conditions is the most common procedure when solving the problem of machine tool structure vibration. However, the influence of each mode on machine tool vibration in real working conditions remains unknown. In fact, the contributions each mode makes to machine tool vibration during the machining process are different. In this article, an active excitation modal analysis is applied to identify the modal parameters under operating conditions, and the Operating Deflection Shapes (ODS) at the frequencies of high-level vibration that affect machining quality in real working conditions are obtained. The ODS is then decomposed over the mode shapes identified under operating conditions, so that the contribution each mode makes to machine tool vibration during the machining process is given by the decomposition coefficients. From these steps, the main modes that most significantly affect the machine tool in working conditions can be identified. The method was also verified experimentally.
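    The decomposition step described above amounts to a least-squares projection of the measured ODS onto the identified mode shapes; a minimal numpy sketch (matrix sizes and values are illustrative, not taken from the paper):

    ```python
    import numpy as np

    def mode_contributions(mode_shapes, ods):
        """Least-squares participation coefficients of each identified mode.

        mode_shapes : (n_dof, n_modes) matrix, one identified mode shape per column
        ods         : (n_dof,) measured operating deflection shape at one frequency
        """
        coeffs, *_ = np.linalg.lstsq(mode_shapes, ods, rcond=None)
        return coeffs
    ```

    A larger coefficient magnitude marks a mode that contributes more to the vibration under working conditions.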

  6. A Sequential Multiplicative Extended Kalman Filter for Attitude Estimation Using Vector Observations

    PubMed Central

    Qin, Fangjun; Jiang, Sai; Zha, Feng

    2018-01-01

    In this paper, a sequential multiplicative extended Kalman filter (SMEKF) is proposed for attitude estimation using vector observations. In the proposed SMEKF, each of the vector observations is processed sequentially to update the attitude, which can make the measurement model linearization more accurate for the next vector observation. This is the main difference to Murrell’s variation of the MEKF, which does not update the attitude estimate during the sequential procedure. Meanwhile, the covariance is updated after all the vector observations have been processed, which is used to account for the special characteristics of the reset operation necessary for the attitude update. This is the main difference to the traditional sequential EKF, which updates the state covariance at each step of the sequential procedure. The numerical simulation study demonstrates that the proposed SMEKF has more consistent and accurate performance in a wide range of initial estimate errors compared to the MEKF and its traditional sequential forms. PMID:29751538

  7. Small Angle Neutron Scattering Observation of Chain Retraction after a Large Step Deformation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Blanchard, A.; Heinrich, M.; Pyckhout-Hintzen, W.

    The process of retraction in entangled linear chains after a fast nonlinear stretch was detected from time-resolved but quenched small angle neutron scattering (SANS) experiments on long, well-entangled polyisoprene chains. The statically obtained SANS data cover the relevant time regime for retraction, and they provide a direct, microscopic verification of this nonlinear process as predicted by the tube model. Clear, quantitative agreement is found with recent theories of contour length fluctuations and convective constraint release, using parameters obtained mainly from linear rheology. The theory captures the full range of scattering vectors once the crossover to fluctuations on length scales below the tube diameter is accounted for.

  8. VoxelMages: a general-purpose graphical interface for designing geometries and processing DICOM images for PENELOPE.

    PubMed

    Giménez-Alventosa, V; Ballester, F; Vijande, J

    2016-12-01

    The design and construction of geometries for Monte Carlo calculations is an error-prone, time-consuming, and complex step in simulations describing particle interactions and transport in the field of medical physics. The software VoxelMages has been developed to help the user in this task. It allows the user to design complex geometries and to process DICOM image files for simulations with the general-purpose Monte Carlo code PENELOPE in an easy and straightforward way. VoxelMages also allows importing DICOM-RT structure contour information as delivered by a treatment planning system. Its main characteristics, usage and performance benchmarking are described in detail. Copyright © 2016 Elsevier Ltd. All rights reserved.

  9. Sugar (sucrose) holograms

    NASA Astrophysics Data System (ADS)

    Ponce-Lee, E. L.; Olivares-Pérez, A.; Fuentes-Tapia, I.

    2004-06-01

    Computer holograms made with sugar crystals are reported. The material is well known as a sweetener: sucrose, from sugar cane or sugar beet. The sweetener can be applied as a syrup (water with dissolved sugar) easily onto any substrate, such as plastic or glass, without critical conditions for the development process; this step corresponds only to curing the sucrose, as in a photopolymer process. The maximum of the absorption spectrum lies in the UV region, at λ=240 nm. Gratings recorded with lithographic techniques show a good diffraction efficiency of around 45%, and the material has sufficient resolution for making diffraction gratings. These properties are attractive because they open the possibility of making phase holograms on candies; the phase modulation is mainly by refractive index.

  10. A macrosonic system for industrial processing

    PubMed

    Gallego-Juarez; Rodriguez-Corral; Riera-Franco de Sarabia E; Campos-Pozuelo; Vazquez-Martinez; Acosta-Aparicio

    2000-03-01

    The development of high-power applications of sonic and ultrasonic energy in industrial processing requires a great variety of practical systems with characteristics which are dependent on the effect to be exploited. Nevertheless, the majority of systems are basically constituted of a treatment chamber and one or several transducers coupled to it. Therefore, the feasibility of the application mainly depends on the efficiency of the transducer-chamber system. This paper deals with a macrosonic system which is essentially constituted of a high-power transducer with a double stepped-plate radiator coupled to a chamber of square section. The radiator, which has a rectangular shape, is placed on one face of the chamber in order to drive the inside fluid volume. The stepped profile of the radiator allows a piston-like radiation to be obtained. The radiation from the back face of the radiator is also applied to the chamber by using adequate reflectors. Transducer-chamber systems for sonic and ultrasonic frequencies have been developed with power capacities up to about 5 kW for the treatment of fluid volumes of several cubic meters. The characteristics of these systems are presented in this paper.

  11. Separate and Concentrate Lactic Acid Using Combination of Nanofiltration and Reverse Osmosis Membranes

    NASA Astrophysics Data System (ADS)

    Li, Yebo; Shahbazi, Abolghasem; Williams, Karen; Wan, Caixia

    The processes of lactic acid production include two key stages, which are (a) fermentation and (b) product recovery. In this study, free cell of Bifidobacterium longum was used to produce lactic acid from cheese whey. The produced lactic acid was then separated and purified from the fermentation broth using combination of nanofiltration and reverse osmosis membranes. Nanofiltration membrane with a molecular weight cutoff of 100-400 Da was used to separate lactic acid from lactose and cells in the cheese whey fermentation broth in the first step. The obtained permeate from the above nanofiltration is mainly composed of lactic acid and water, which was then concentrated with a reverse osmosis membrane in the second step. Among the tested nanofiltration membranes, HL membrane from GE Osmonics has the highest lactose retention (97±1%). In the reverse osmosis process, the ADF membrane could retain 100% of lactic acid to obtain permeate with water only. The effect of membrane and pressure on permeate flux and retention of lactose/lactic acid was also reported in this paper.
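    The retention figures quoted above follow the usual observed-retention definition; a small helper (the example concentrations are made up, only the ~97% figure echoes the abstract):

    ```python
    def observed_retention(c_permeate, c_feed):
        """Observed retention R = 1 - Cp/Cf of a solute across a membrane,
        as a fraction (0 = fully permeating, 1 = fully retained)."""
        return 1.0 - c_permeate / c_feed

    # e.g. a (hypothetical) feed at 45.0 g/L lactose with a permeate at
    # 1.35 g/L corresponds to the ~97% lactose retention reported for the
    # HL nanofiltration membrane
    ```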

  12. A Rapid One-Step Process for Fabrication of Biomimetic Superhydrophobic Surfaces by Pulse Electrodeposition.

    PubMed

    Jiang, Shuzhen; Guo, Zhongning; Liu, Guixian; Gyimah, Glenn Kwabena; Li, Xiaoying; Dong, Hanshan

    2017-10-25

    Inspired by some typical plants such as lotus leaves, superhydrophobic surfaces are commonly prepared by a combination of low surface energy materials and hierarchical micro/nano structures. In this work, superhydrophobic surfaces on copper substrates were prepared by a rapid, facile one-step pulse electrodepositing process, with different duty ratios in an electrolyte containing lanthanum chloride (LaCl₃·6H₂O), myristic acid (CH₃(CH₂)₁₂COOH), and ethanol. The equivalent electrolytic time was only 10 min. The surface morphology, chemical composition and superhydrophobic property of the pulse electrodeposited surfaces were fully investigated with SEM, EDX, XRD, contact angle meter and time-lapse photographs of water droplets bouncing method. The results show that the as-prepared surfaces have micro/nano dual scale structures mainly consisting of La[CH₃(CH₂)₁₂COO]₃ crystals. The maximum water contact angle (WCA) is about 160.9°, and the corresponding sliding angle is about 5°. This method is time-saving and can be easily extended to other conductive materials, having a great potential for future applications.

  13. Interactions of double patterning technology with wafer processing, OPC and design flows

    NASA Astrophysics Data System (ADS)

    Lucas, Kevin; Cork, Chris; Miloslavsky, Alex; Luk-Pat, Gerry; Barnes, Levi; Hapli, John; Lewellen, John; Rollins, Greg; Wiaux, Vincent; Verhaegen, Staf

    2008-03-01

    Double patterning technology (DPT) is one of the main options for printing logic devices with half-pitch less than 45nm; and flash and DRAM memory devices with half-pitch less than 40nm. DPT methods decompose the original design intent into two individual masking layers which are each patterned using single exposures and existing 193nm lithography tools. The results of the individual patterning layers combine to re-create the design intent pattern on the wafer. In this paper we study interactions of DPT with lithography, mask synthesis and physical design flows. Double exposure and etch patterning steps create complexity for both process and design flows. DPT decomposition is a critical software step which will be performed in physical design and also in mask synthesis. Decomposition includes cutting (splitting) of original design intent polygons into multiple polygons where required; and coloring of the resulting polygons. We evaluate the ability to meet key physical design goals such as: reduce circuit area; minimize rework; ensure DPT compliance; guarantee patterning robustness on individual layer targets; ensure symmetric wafer results; and create uniform wafer density for the individual patterning layers.
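    The coloring sub-step can be viewed as 2-coloring a conflict graph in which features closer than the single-exposure pitch limit are joined by an edge; a layout is DPT-compliant only if that graph is bipartite. A minimal sketch of that standard formulation (the data structures are assumptions, not the authors' tool):

    ```python
    from collections import deque

    def dpt_color(n_features, conflicts):
        """Assign each feature to mask 0 or 1 by breadth-first 2-coloring.

        conflicts : dict mapping a feature index to the list of features it
                    conflicts with (adjacency must be symmetric).
        Returns the color list, or None if an odd conflict cycle makes the
        layout non-compliant (i.e. polygon cutting or redesign is required).
        """
        color = [None] * n_features
        for start in range(n_features):
            if color[start] is not None:
                continue
            color[start] = 0
            queue = deque([start])
            while queue:
                u = queue.popleft()
                for v in conflicts.get(u, []):
                    if color[v] is None:
                        color[v] = 1 - color[u]   # opposite mask
                        queue.append(v)
                    elif color[v] == color[u]:
                        return None               # odd cycle: not 2-colorable
        return color
    ```

    A chain of three conflicting features splits cleanly across the two masks, while a triangle of mutual conflicts is rejected.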

  14. A Rapid One-Step Process for Fabrication of Biomimetic Superhydrophobic Surfaces by Pulse Electrodeposition

    PubMed Central

    Jiang, Shuzhen; Guo, Zhongning; Liu, Guixian; Gyimah, Glenn Kwabena; Li, Xiaoying; Dong, Hanshan

    2017-01-01

    Inspired by some typical plants such as lotus leaves, superhydrophobic surfaces are commonly prepared by a combination of low surface energy materials and hierarchical micro/nano structures. In this work, superhydrophobic surfaces on copper substrates were prepared by a rapid, facile one-step pulse electrodepositing process, with different duty ratios in an electrolyte containing lanthanum chloride (LaCl3·6H2O), myristic acid (CH3(CH2)12COOH), and ethanol. The equivalent electrolytic time was only 10 min. The surface morphology, chemical composition and superhydrophobic property of the pulse electrodeposited surfaces were fully investigated with SEM, EDX, XRD, contact angle meter and time-lapse photographs of water droplets bouncing method. The results show that the as-prepared surfaces have micro/nano dual scale structures mainly consisting of La[CH3(CH2)12COO]3 crystals. The maximum water contact angle (WCA) is about 160.9°, and the corresponding sliding angle is about 5°. This method is time-saving and can be easily extended to other conductive materials, having a great potential for future applications. PMID:29068427

  15. Fine- and hyperfine-structure effects in molecular photoionization. II. Resonance-enhanced multiphoton ionization and hyperfine-selective generation of molecular cations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Germann, Matthias; Willitsch, Stefan, E-mail: stefan.willitsch@unibas.ch

    2016-07-28

    Resonance-enhanced multiphoton ionization (REMPI) is a widely used technique for studying molecular photoionization and producing molecular cations for spectroscopy and dynamics studies. Here, we present a model for describing hyperfine-structure effects in the REMPI process and for predicting hyperfine populations in molecular ions produced by this method. This model is a generalization of our model for fine- and hyperfine-structure effects in one-photon ionization of molecules presented in Paper I [M. Germann and S. Willitsch, J. Chem. Phys. 145, 044314 (2016)]. This generalization is achieved by covering two main aspects: (1) treatment of the neutral bound-bound transition including the hyperfine structure that makes up the first step of the REMPI process and (2) modification of our ionization model to account for anisotropic populations resulting from this first excitation step. Our findings may be used for analyzing results from experiments with molecular ions produced by REMPI and may serve as a theoretical background for hyperfine-selective ionization experiments.

  16. Key process parameters involved in the treatment of olive mill wastewater by membrane bioreactor.

    PubMed

    Jaouad, Y; Villain-Gambier, M; Mandi, L; Marrot, B; Ouazzani, N

    2018-04-18

    The Olive Mill Wastewater (OMWW) biodegradation in an external ceramic membrane bioreactor (MBR) was investigated with a starting acclimation step with an Ultrafiltration (UF) membrane (150 kDa) and no sludge discharge, in order to develop a specific biomass adapted to OMWW biodegradation. After the acclimation step, UF was replaced by a Microfiltration (MF) membrane (0.1 µm). Sludge Retention Time (SRT) was set around 25 days and the Food to Microorganisms ratio (F/M) was fixed at 0.2 kg COD kg MLVSS⁻¹ d⁻¹. At stable state, removal of the main phenolic compounds (hydroxytyrosol and tyrosol) and Chemical Oxygen Demand (COD) was successfully reached (95% for both). Considered as a predominant fouling factor, but never quantified in MBR-treated OMWW, Soluble Microbial Products (SMP) proteins, polysaccharides and humic substances concentrations were determined (80, 110 and 360 mg L⁻¹, respectively). At the same time, fouling was easily managed due to the favourable hydraulic conditions of the external ceramic MBR. Therefore, OMWW could be efficiently and durably treated by an MF MBR process under adapted operating parameters.

  17. Current status and challenges for automotive battery production technologies

    NASA Astrophysics Data System (ADS)

    Kwade, Arno; Haselrieder, Wolfgang; Leithoff, Ruben; Modlinger, Armin; Dietrich, Franz; Droeder, Klaus

    2018-04-01

    Production technology for automotive lithium-ion battery (LIB) cells and packs has improved considerably in the past five years. However, the transfer of developments in materials, cell design and processes from lab scale to production scale remains a challenge due to the large number of consecutive process steps and the significant impact of material properties, electrode compositions and cell designs on processes. This requires an in-depth understanding of the individual production processes and their interactions, and pilot-scale investigations into process parameter selection and prototype cell production. Furthermore, emerging process concepts must be developed at lab and pilot scale that reduce production costs and improve cell performance. Here, we present an introductory summary of the state-of-the-art production technologies for automotive LIBs. We then discuss the key relationships between process, quality and performance, as well as explore the impact of materials and processes on scale and cost. Finally, future developments and innovations that aim to overcome the main challenges are presented.

  18. Comparison of machinability of manganese alloyed austempered ductile iron produced using conventional and two step austempering processes

    NASA Astrophysics Data System (ADS)

    Hegde, Ananda; Sharma, Sathyashankara

    2018-05-01

    Austempered Ductile Iron (ADI) is a revolutionary material with high strength and hardness combined with optimum ductility and toughness. The discovery of the two step austempering process has led to a superior combination of all the mechanical properties. However, because of the high strength and hardness of ADI, there is a concern regarding its machinability. In the present study, the machinability of ADI produced using conventional and two step heat treatment processes is assessed using tool life and surface roughness. Speed, feed and depth of cut are considered as the machining parameters in the dry turning operation. The machinability results along with the mechanical properties are compared for ADI produced using both conventional and two step austempering processes. The results have shown that the two step austempering process produced better toughness with good hardness and strength without sacrificing ductility. Addition of 0.64 wt% manganese did not cause any detrimental effect on the machinability of ADI in either the conventional or the two step process. Marginal improvements in tool life and surface roughness were observed in the two step process compared with the conventional process.

  19. Development and verification of a real-time stochastic precipitation nowcasting system for urban hydrology in Belgium

    NASA Astrophysics Data System (ADS)

    Foresti, L.; Reyniers, M.; Seed, A.; Delobbe, L.

    2016-01-01

    The Short-Term Ensemble Prediction System (STEPS) is implemented in real-time at the Royal Meteorological Institute (RMI) of Belgium. The main idea behind STEPS is to quantify the forecast uncertainty by adding stochastic perturbations to the deterministic Lagrangian extrapolation of radar images. The stochastic perturbations are designed to account for the unpredictable precipitation growth and decay processes and to reproduce the dynamic scaling of precipitation fields, i.e., the observation that large-scale rainfall structures are more persistent and predictable than small-scale convective cells. This paper presents the development, adaptation and verification of the STEPS system for Belgium (STEPS-BE). STEPS-BE provides in real-time 20-member ensemble precipitation nowcasts at 1 km and 5 min resolutions up to 2 h lead time using a 4 C-band radar composite as input. In the context of the PLURISK project, STEPS forecasts were generated to be used as input in sewer system hydraulic models for nowcasting urban inundations in the cities of Ghent and Leuven. Comprehensive forecast verification was performed in order to detect systematic biases over the given urban areas and to analyze the reliability of probabilistic forecasts for a set of case studies in 2013 and 2014. The forecast biases over the cities of Leuven and Ghent were found to be small, which is encouraging for future integration of STEPS nowcasts into the hydraulic models. Probabilistic forecasts of exceeding 0.5 mm h⁻¹ are reliable up to 60-90 min lead time, while the ones of exceeding 5.0 mm h⁻¹ are only reliable up to 30 min. The STEPS ensembles are slightly under-dispersive and represent only 75-90 % of the forecast errors.
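    The probabilistic exceedance forecasts verified above are derived from the ensemble in the usual way, as the fraction of members above a threshold at each pixel; a toy sketch (array shapes and values are illustrative, not RMI's implementation):

    ```python
    import numpy as np

    def exceedance_probability(ensemble, threshold_mm_h):
        """ensemble: (n_members, ny, nx) array of nowcast rain rates in mm/h.
        Returns the (ny, nx) per-pixel probability of exceeding the threshold,
        estimated as the fraction of ensemble members above it."""
        return (ensemble > threshold_mm_h).mean(axis=0)
    ```

    For a 4-member ensemble over a 1x2 field, a pixel where two members exceed 0.5 mm/h gets probability 0.5.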

  20. Development and verification of a real-time stochastic precipitation nowcasting system for urban hydrology in Belgium

    NASA Astrophysics Data System (ADS)

    Foresti, L.; Reyniers, M.; Seed, A.; Delobbe, L.

    2015-07-01

    The Short-Term Ensemble Prediction System (STEPS) is implemented in real-time at the Royal Meteorological Institute (RMI) of Belgium. The main idea behind STEPS is to quantify the forecast uncertainty by adding stochastic perturbations to the deterministic Lagrangian extrapolation of radar images. The stochastic perturbations are designed to account for the unpredictable precipitation growth and decay processes and to reproduce the dynamic scaling of precipitation fields, i.e. the observation that large-scale rainfall structures are more persistent and predictable than small-scale convective cells. This paper presents the development, adaptation and verification of the system STEPS for Belgium (STEPS-BE). STEPS-BE provides in real-time 20-member ensemble precipitation nowcasts at 1 km and 5 min resolution up to 2 h lead time using a 4 C-band radar composite as input. In the context of the PLURISK project, STEPS forecasts were generated to be used as input in sewer system hydraulic models for nowcasting urban inundations in the cities of Ghent and Leuven. Comprehensive forecast verification was performed in order to detect systematic biases over the given urban areas and to analyze the reliability of probabilistic forecasts for a set of case studies in 2013 and 2014. The forecast biases over the cities of Leuven and Ghent were found to be small, which is encouraging for future integration of STEPS nowcasts into the hydraulic models. Probabilistic forecasts of exceeding 0.5 mm h⁻¹ are reliable up to 60-90 min lead time, while the ones of exceeding 5.0 mm h⁻¹ are only reliable up to 30 min. The STEPS ensembles are slightly under-dispersive and represent only 80-90 % of the forecast errors.

  1. Step-by-step strategy in the management of residual hepatolithiasis using post-operative cholangioscopy

    PubMed Central

    Wen, Xu-dong; Wang, Tao; Huang, Zhu; Zhang, Hong-jian; Zhang, Bing-yin; Tang, Li-jun; Liu, Wei-hui

    2017-01-01

    Hepatolithiasis is the presence of calculi within the intrahepatic bile duct specifically located proximal to the confluence of the left and right hepatic ducts. The ultimate goal of hepatolithiasis treatment is the complete removal of the stone, the correction of the associated strictures and the prevention of recurrent cholangitis. Although hepatectomy could effectively achieve the above goals, it can be restricted by the risk of insufficient residual liver volume, and has a 15.6% rate of residual hepatolithiasis. With improvements in minimally invasive surgery, post-operative cholangioscopy (POC) provides an additional option for hepatolithiasis treatment, with a higher clearance rate and fewer severe complications. POC is very safe, and can be performed repeatedly until full patient benefit is achieved. During POC three main steps are accomplished: first, the analysis of the residual hepatolithiasis distribution, indirectly by imaging methods or directly by endoscopic observation; second, the establishment of the surgical pathway to relieve the strictures; and third, the removal of the stone by a combination of different techniques such as simple basket extraction, mechanical fragmentation, electrohydraulic lithotripsy or laser lithotripsy, among others. In summary, a step-by-step strategy of POC should be put forward to standardize the procedures, especially when dealing with complicated residual hepatolithiasis. This review briefly summarizes the classification, management and complications of hepatolithiasis during the POC process. PMID:29147136

  2. Preliminary Investigation of Time Remaining Display on the Computer-based Emergency Operating Procedure

    NASA Astrophysics Data System (ADS)

    Suryono, T. J.; Gofuku, A.

    2018-02-01

    One of the important things in the mitigation of nuclear power plant accidents is time management. The accidents should be resolved as soon as possible in order to prevent core melting and the release of radioactive material to the environment. In this case, operators should follow the emergency operating procedure related to the accident, step by step and within the allowable time. Nowadays, advanced main control rooms are equipped with computer-based procedures (CBPs), which make it easier for operators to do their tasks of monitoring and controlling the reactor. However, most CBPs do not include a time-remaining display feature, which informs operators of the time available to execute procedure steps and warns them if they reach the time limit. Furthermore, such a feature would increase the awareness of operators about their current situation in the procedure. This paper investigates this issue. A simplified emergency operating procedure (EOP) for a steam generator tube rupture (SGTR) accident of a PWR plant is applied. In addition, the sequence of actions in each step of the procedure is modelled using multilevel flow modelling (MFM) and an influence propagation rule. The prediction of the action time for each step is acquired based on similar accident cases and Support Vector Regression. The derived time is processed and then displayed on a CBP user interface.

  3. Harmonised framework for ecological risk assessment of sediments from ports and estuarine zones of North and South Atlantic.

    PubMed

    Choueri, R B; Cesar, A; Abessa, D M S; Torres, R J; Riba, I; Pereira, C D S; Nascimento, M R L; Morais, R D; Mozeto, A A; DelValls, T A

    2010-04-01

    This paper presents a harmonised framework of sediment quality assessment and dredging material characterisation for estuaries and port zones of North and South Atlantic. This framework, based on the weight-of-evidence approach, provides a structure and a process for conducting sediment/dredging material assessment that leads to a decision. The main structure consists of "step 1" (examination of available data); "step 2" (chemical characterisation and toxicity assessment); "decision 1" (any chemical level higher than reference values? are sediments toxic?); "step 3" (assessment of benthic community structure); "step 4" (integration of the results); "decision 2" (are sediments toxic or benthic community impaired?); "step 5" (construction of the decision matrix) and "decision 3" (is there environmental risk?). The sequence of assessments may be interrupted when the information obtained is judged to be sufficient for a correct characterisation of the risk posed by the sediments/dredging material. This framework brought novel features compared to other sediment/dredging material risk assessment frameworks: data integration through multivariate analysis allows the identification of which samples are toxic and/or related to impaired benthic communities; it also discriminates the chemicals responsible for negative biological effects; and the framework dispenses with the use of a reference area. We demonstrated the successful application of this framework in different port and estuarine zones of the North (Gulf of Cádiz) and South Atlantic (Santos and Paranaguá Estuarine Systems).
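    The step/decision sequence above can be paraphrased as a small decision function; the boolean inputs and return strings are illustrative simplifications of the weight-of-evidence logic, not the authors' exact criteria:

    ```python
    def sediment_decision(chem_above_reference, toxic, benthic_impaired):
        """Schematic walk through the framework's decision points.

        chem_above_reference : any chemical level above reference values?
        toxic                : do toxicity assays indicate toxicity?
        benthic_impaired     : is the benthic community structure impaired?
        """
        # Decision 1: chemistry below reference values and no toxicity lets
        # the assessment stop early (the sequence "may be interrupted").
        if not chem_above_reference and not toxic:
            return "no further assessment needed"
        # Decision 2: integrate toxicity and benthic community evidence.
        if toxic or benthic_impaired:
            return "environmental risk: build decision matrix (step 5)"
        return "no environmental risk indicated"
    ```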

  4. One-Step Catalytic Synthesis of CuO/Cu2O in a Graphitized Porous C Matrix Derived from the Cu-Based Metal-Organic Framework for Li- and Na-Ion Batteries.

    PubMed

    Kim, A-Young; Kim, Min Kyu; Cho, Keumnam; Woo, Jae-Young; Lee, Yongho; Han, Sung-Hwan; Byun, Dongjin; Choi, Wonchang; Lee, Joong Kee

    2016-08-03

    The hybrid composite electrode comprising CuO and Cu2O micronanoparticles in a highly graphitized porous C matrix (CuO/Cu2O-GPC) has a rational design and is a favorable approach to increasing the rate capability and reversible capacity of metal oxide negative materials for Li- and Na-ion batteries. CuO/Cu2O-GPC is synthesized through a Cu-based metal-organic framework via a one-step thermal transformation process. The electrochemical performances of the CuO/Cu2O-GPC negative electrode in Li- and Na-ion batteries are systematically studied and exhibit excellent capacities of 887.3 mAh g⁻¹ at 60 mA g⁻¹ after 200 cycles in a Li-ion battery and 302.9 mAh g⁻¹ at 50 mA g⁻¹ after 200 cycles in a Na-ion battery. The high electrochemical stability was obtained via the rational strategy, mainly owing to the synergy effect of the CuO and Cu2O micronanoparticles and highly graphitized porous C formed by catalytic graphitization of Cu nanoparticles. Owing to the simple one-step thermal transformation process and resulting high electrochemical performance, CuO/Cu2O-GPC is one of the prospective negative active materials for rechargeable Li- and Na-ion batteries.

  5. A distributed fault-detection and diagnosis system using on-line parameter estimation

    NASA Technical Reports Server (NTRS)

    Guo, T.-H.; Merrill, W.; Duyar, A.

    1991-01-01

    The development of a model-based fault-detection and diagnosis system (FDD) is reviewed. The system can be used as an integral part of an intelligent control system. It determines the faults of a system from comparison of the measurements of the system with a priori information represented by the model of the system. The method of modeling a complex system is described and a description of diagnosis models which include process faults is presented. There are three distinct classes of fault modes covered by the system performance model equation: actuator faults, sensor faults, and performance degradation. A system equation for a complete model that describes all three classes of faults is given. The strategy for detecting the fault and estimating the fault parameters using a distributed on-line parameter identification scheme is presented. A two-step approach is proposed. The first step is composed of a group of hypothesis testing modules, (HTM) in parallel processing to test each class of faults. The second step is the fault diagnosis module which checks all the information obtained from the HTM level, isolates the fault, and determines its magnitude. The proposed FDD system was demonstrated by applying it to detect actuator and sensor faults added to a simulation of the Space Shuttle Main Engine. The simulation results show that the proposed FDD system can adequately detect the faults and estimate their magnitudes.
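    The two-step structure (a bank of hypothesis-testing modules followed by a diagnosis module) can be sketched as follows; the fault-class names, scores and threshold are illustrative assumptions, not values from the paper:

    ```python
    def diagnose(htm_scores, threshold=1.0):
        """Step 2 of the FDD scheme: inspect the outputs of the parallel
        hypothesis-testing modules (HTMs), isolate the fault class with the
        largest residual score, and report it only if it clears the threshold.

        htm_scores : dict mapping fault class -> residual score from its HTM
        """
        fault, score = max(htm_scores.items(), key=lambda kv: kv[1])
        if score < threshold:
            return None  # no fault isolated
        return fault
    ```

    In a full system the diagnosis module would also estimate the fault magnitude; this sketch covers only the isolation step.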

  6. Implementation of density-based solver for all speeds in the framework of OpenFOAM

    NASA Astrophysics Data System (ADS)

    Shen, Chun; Sun, Fengxian; Xia, Xinlin

    2014-10-01

    In the framework of the open-source CFD code OpenFOAM, a density-based solver for flows at all speeds is developed. In this solver the preconditioned all-speeds AUSM+(P) scheme is adopted, and a dual time scheme is implemented to handle unsteady processes. Parallel computation can be used to accelerate the solution process. Different interface reconstruction algorithms are implemented, and their accuracy with respect to convection is compared. Three benchmark tests, lid-driven cavity flow, flow crossing over a bump, and flow over a forward-facing step, are presented to show the accuracy of the AUSM+(P) solver for low-speed incompressible flow, transonic flow, and supersonic/hypersonic flow. First, for the lid-driven cavity flow, the computational results obtained by the different interface reconstruction algorithms are compared. The results indicate that the one-dimensional reconstruction scheme adopted in this solver possesses high accuracy and that the solver developed in this paper can effectively capture the features of low-speed incompressible flow. Then, via the test cases of flow crossing over a bump and over a forward-facing step, the ability to capture the characteristics of transonic and supersonic/hypersonic flows is confirmed. The forward-facing step proves to be the most challenging for the preconditioned solvers with and without the dual time scheme. Nonetheless, the solvers described in this paper reproduce the main features of this flow, including the evolution of the initial transient.
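    The dual time scheme mentioned above can be illustrated on a scalar toy problem: each physical (backward Euler) step is converged by marching a pseudo-time residual to zero. This is a sketch of the idea only; the actual solver applies it to the preconditioned compressible flow equations.

```python
# Toy dual time stepping for du/dt = -k*u: backward Euler in physical
# time, explicit pseudo-time relaxation of the inner residual R(u).

def dual_time_step(u_n, k=1.0, dt=0.1, dtau=0.01, max_inner=2000):
    """Advance one physical step by driving R(u) = (u - u_n)/dt + k*u to 0."""
    u = u_n
    for _ in range(max_inner):
        residual = (u - u_n) / dt + k * u
        if abs(residual) < 1e-12:
            break                      # inner (pseudo-time) loop converged
        u -= dtau * residual           # pseudo-time march toward steady state
    return u

u = 1.0
for _ in range(10):                    # ten physical time steps
    u = dual_time_step(u)
# u now matches the exact backward Euler solution u_0 / (1 + k*dt)**10
```

    The design point is that the inner loop may use any steady-state acceleration (local pseudo-time steps, preconditioning) without affecting the physical time accuracy, which is set by the outer discretization alone.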

  7. A semi-automatic method for quantification and classification of erythrocytes infected with malaria parasites in microscopic images.

    PubMed

    Díaz, Gloria; González, Fabio A; Romero, Eduardo

    2009-04-01

    Visual quantification of parasitemia in thin blood films is a very tedious, subjective and time-consuming task. This study presents an original method for quantification and classification of erythrocytes in stained thin blood films infected with Plasmodium falciparum. The proposed approach comprises three main phases: a preprocessing step that corrects luminance differences; a segmentation step that uses the normalized RGB color space to classify pixels as either erythrocyte or background, followed by an Inclusion-Tree representation that structures the pixel information into objects, from which erythrocytes are found; and, finally, a two-step classification process that identifies infected erythrocytes and differentiates the infection stage using a trained bank of classifiers. Additionally, user intervention is allowed when the approach cannot make a proper decision. Four hundred fifty malaria images were used for training and evaluating the method. Automatic identification of infected erythrocytes showed a specificity of 99.7% and a sensitivity of 94%. The infection stage was determined with an average sensitivity of 78.8% and an average specificity of 91.2%.
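    The pixel-classification idea in the segmentation step can be sketched as follows: converting to normalized RGB removes luminance so a simple color rule can separate stained-cell pixels from background. The threshold and the one-rule classifier are invented for illustration; the paper structures pixels further via an Inclusion-Tree and uses trained classifiers downstream.

```python
# Sketch of the segmentation idea: normalized RGB (chromaticity)
# removes luminance so pixels can be labelled by color alone.
# The 0.40 red threshold is a hypothetical value, not the paper's.

def normalized_rgb(r, g, b):
    """Map an RGB pixel to chromaticity coordinates (r', g', b')."""
    s = r + g + b
    if s == 0:
        return 0.0, 0.0, 0.0
    return r / s, g / s, b / s

def is_erythrocyte(pixel, red_min=0.40):
    """Label a pixel as erythrocyte if its red chromaticity dominates."""
    rn, gn, bn = normalized_rgb(*pixel)
    return rn > red_min

# A reddish stained pixel vs. a bright gray background pixel.
labels = [is_erythrocyte(p) for p in [(180, 60, 60), (200, 200, 200)]]
```

    Note that the gray pixel is rejected even though it is brighter overall, which is exactly the luminance-invariance the normalized color space provides.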

  8. Combining Open-domain and Biomedical Knowledge for Topic Recognition in Consumer Health Questions.

    PubMed

    Mrabet, Yassine; Kilicoglu, Halil; Roberts, Kirk; Demner-Fushman, Dina

    2016-01-01

    Determining the main topics in consumer health questions is a crucial step in their processing as it allows narrowing the search space to a specific semantic context. In this paper we propose a topic recognition approach based on biomedical and open-domain knowledge bases. In the first step of our method, we recognize named entities in consumer health questions using an unsupervised method that relies on a biomedical knowledge base, UMLS, and an open-domain knowledge base, DBpedia. In the next step, we cast topic recognition as a binary classification problem of deciding whether a named entity is the question topic or not. We evaluated our approach on a dataset from the National Library of Medicine (NLM), introduced in this paper, and another from the Genetic and Rare Disease Information Center (GARD). The combination of knowledge bases outperformed the results obtained by individual knowledge bases by up to 16.5% F1 and achieved state-of-the-art performance. Our results demonstrate that combining open-domain knowledge bases with biomedical knowledge bases can lead to a substantial improvement in understanding user-generated health content.
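    Casting topic recognition as a binary decision per named entity can be sketched as below. The features and the stand-in rule are invented for illustration; the actual approach trains a classifier on features derived from UMLS and DBpedia.

```python
# Step 2 in miniature: each recognized entity yields one binary example,
# "is this entity the question topic?". The feature set and hard-coded
# rule below are hypothetical; a trained classifier replaces them.

def entity_features(entity, question):
    """Tiny illustrative feature vector for one candidate entity."""
    pos = question.lower().find(entity.lower())
    return {
        "relative_position": pos / max(len(question), 1),
        "num_tokens": len(entity.split()),
    }

def is_topic(features):
    """Stand-in rule: entities mentioned early in the question are topical.
    A real classifier would weigh many such features jointly."""
    return 0 <= features["relative_position"] < 0.6

question = "What are the treatments for cystic fibrosis in children?"
candidates = ["cystic fibrosis", "children"]   # output of the NER step
topics = [e for e in candidates
          if is_topic(entity_features(e, question))]
```

    The per-entity framing is what lets open-domain and biomedical knowledge bases be combined: each knowledge base simply contributes more features for the same binary decision.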

  9. Intercalation of acrylic acid and sodium acrylate into kaolinite and their in situ polymerization

    NASA Astrophysics Data System (ADS)

    Zhang, Bo; Li, Yanfeng; Pan, Xiaobing; Jia, Xin; Wang, Xiaolong

    2007-02-01

    Novel poly(acrylic acid)-kaolinite nano-composites were prepared by intercalation and in situ polymerization. The nano-composites were obtained by in situ polymerization of acrylic acid (AA) and sodium acrylate (AANa) intercalated into organo-kaolinite, which was itself obtained by refining the original clay and chemically modifying it via a solution-intercalation step in order to increase its basal plane distance. The modification was completed step by step using dimethyl sulfoxide (DMSO)/methanol and potassium acetate (KAc)/water systems. The materials were characterized by XRD, FT-IR and TEM; the results confirmed that poly(acrylic acid) (PAA) and poly(sodium acrylate) (PAANa) were intercalated into the interlamellar spaces of kaolinite. Of the resulting copolymer composites (CC0: copolymer crude-kaolinite composite, CC1: copolymer DMSO-kaolinite composite, CC2: copolymer KAc-kaolinite composite), CC2 exhibited a lamellar nano-composite with a mixed nano-morphology, in which partial exfoliation of the intercalated clay platelets should be the main morphology. Finally, the effect of the neutralization degree on the intercalation behavior was also investigated.

  10. Silicon compilation: From the circuit to the system

    NASA Astrophysics Data System (ADS)

    Obrien, Keven

    The methodology used for the compilation of silicon from a behavioral level to a system level is presented. The aim was to link the heretofore unrelated areas of high-level synthesis and system-level design. This link will play an important role in the development of future design automation tools, as it will allow hardware/software co-designs to be synthesized. A design methodology is presented that, through the use of an intermediate representation, SOLAR, allows a System level Design Language (SDL) to be combined with a Hardware Description Language (VHDL). Two main steps are required to transform this specification into a synthesizable one. First, a system-level synthesis step including partitioning and communication synthesis is required in order to split the model into a set of interconnected subsystems, each of which will be processed by a high-level synthesis tool. For this latter step AMICAL is used, which allows powerful scheduling techniques that accept very abstract descriptions of control-flow-dominated circuits as input, and generates interconnected RTL blocks that may feed existing logic-level synthesis tools.

  11. Automatic OPC repair flow: optimized implementation of the repair recipe

    NASA Astrophysics Data System (ADS)

    Bahnas, Mohamed; Al-Imam, Mohamed; Word, James

    2007-10-01

    Virtual manufacturing, enabled by rapid, accurate, full-chip simulation, is a main pillar in achieving successful mask tape-out in cutting-edge low-k1 lithography. It facilitates detecting printing failures before a costly and time-consuming mask tape-out and wafer print occur. The role of the OPC verification step is critical in the early production phases of a new process development, since various layout patterns suspected of failing or causing performance degradation need to be accurately flagged and fed back to the OPC engineer for further learning and enhancement of the OPC recipe. At the advanced phases of process development there is a much lower probability of detecting failures, but the OPC verification step still acts as the last line of defense for all of the implemented RET work. In a recent publication, the optimum approach to responding to these detected failures was addressed, and a solution was proposed to repair these defects with an automated methodology fully integrated with, and compatible with, the main RET/OPC flow. In this paper the authors present further work on and optimizations of this repair flow. An automated methodology for analyzing the root causes of defects and classifying them to cover all possible causes will be discussed. This automated analysis approach includes all the learning from previously highlighted causes as well as any new discoveries. Next, according to the automated pre-classification of the defects, the appropriate OPC repair approach (i.e., OPC knob) can easily be selected for each classified defect location, instead of applying all approaches at all locations. This helps cut the runtime of OPC repair processing and reduces the number of iterations needed to reach zero defects. An output report of the existing causes of defects, and of how the tool handled them, is generated. The report will help further learning and facilitate enhancement of the main OPC recipe. Accordingly, the main OPC recipe can become more robust and converge faster, probably in fewer iterations. This knowledge feedback loop is one of the fruitful benefits of the automatic OPC repair flow.

  12. Defining process design space for a hydrophobic interaction chromatography (HIC) purification step: application of quality by design (QbD) principles.

    PubMed

    Jiang, Canping; Flansburg, Lisa; Ghose, Sanchayita; Jorjorian, Paul; Shukla, Abhinav A

    2010-12-15

    The concept of design space has been taking root under the quality by design paradigm as a foundation of in-process control strategies for biopharmaceutical manufacturing processes. This paper outlines the development of a design space for a hydrophobic interaction chromatography (HIC) process step. The design space included the impact of raw material lot-to-lot variability and variations in the feed stream from cell culture. A failure modes and effects analysis was employed as the basis for the process characterization exercise. During mapping of the process design space, multi-dimensional combinations of operational variables were studied to quantify their impact on process performance in terms of yield and product quality. Variability in resin hydrophobicity was found to have a significant influence on step yield and high-molecular-weight aggregate clearance through the HIC step. A robust operating window was identified for this process step that enabled a higher step yield while ensuring acceptable product quality. © 2010 Wiley Periodicals, Inc.

  13. Knowledge Management Orientation: An Innovative Perspective to Hospital Management

    PubMed Central

    GHASEMI, Matina; GHADIRI NEJAD, Mazyar; BAGZIBAGLI, Kemal

    2017-01-01

    Background: By considering innovation as a new project in hospitals, all of the standard project management steps should be followed in its execution. This study investigated the validation of a new set of measures that provide a procedure for knowledge management-oriented innovation to enrich the hospital management system. Methods: The relation between innovation and all the knowledge management areas, as the main constructs of project management, was illustrated by referring to standard project management steps and previous studies. Through consultations and meetings with a committee of professional project managers, a questionnaire was developed to measure ten knowledge management areas in the hospital innovation process. Additionally, a group of expert hospital managers was invited to comment on the applicability of the questionnaire, considering whether the items are practically measurable in hospitals. Results: Close-ended, Likert-type scale items, organized into ten sections, were developed based on the project management body of knowledge through the Delphi technique. The instrument enables managers to evaluate a hospital's situation and determine whether the organization follows knowledge management standards in its innovation process. In a pilot study, confirmatory factor analysis and exploratory factor analysis were conducted to ensure the validity and reliability of the measurement items. Conclusion: The developed items have the potential to help hospital managers deliver new products/services successfully based on standard procedures in their organizations. In all innovation processes, the knowledge management areas and their standard steps assist hospital managers through a new tool in questionnaire format. PMID:29259938

  14. Industrial production of L-ascorbic Acid (vitamin C) and D-isoascorbic acid.

    PubMed

    Pappenberger, Günter; Hohmann, Hans-Peter

    2014-01-01

    L-ascorbic acid (vitamin C) was first isolated in 1928 and subsequently identified as the long-sought antiscorbutic factor. Industrially produced L-ascorbic acid is widely used in the feed, food, and pharmaceutical sectors as a nutritional supplement and preservative, making use of its antioxidative properties. Until recently, the Reichstein-Grüssner process, designed in 1933, was the main industrial route. Here, D-sorbitol is converted to L-ascorbic acid via 2-keto-L-gulonic acid (2KGA) as the key intermediate, using a bio-oxidation with Gluconobacter oxydans and several chemical steps. Today, industrial production processes use additional bio-oxidation steps with Ketogulonicigenium vulgare as biocatalyst to convert D-sorbitol to the intermediate 2KGA without chemical steps. The enzymes involved are characterized by a broad substrate range but remarkable regiospecificity. This puzzling specificity pattern can be understood from the preferences of these enzymes for certain of the many isomeric structures that the carbohydrate substrates adopt in aqueous solution. Recently, novel enzymes were identified that generate L-ascorbic acid directly via oxidation of L-sorbosone, an intermediate of the bio-oxidation of D-sorbitol to 2KGA. This opens the possibility of a direct route from D-sorbitol to L-ascorbic acid, obviating the need for chemical rearrangement of 2KGA. Similar concepts for industrial processes apply to the production of D-isoascorbic acid, the C5 epimer of L-ascorbic acid. D-isoascorbic acid has the same configuration at C5 as D-glucose and can be derived more directly than L-ascorbic acid from this common carbohydrate feedstock.

  15. Reducing acid leaching of manganiferous ore: effect of the iron removal operation on solid waste disposal.

    PubMed

    De Michelis, Ida; Ferella, Francesco; Beolchini, Francesca; Vegliò, Francesco

    2009-01-01

    The process of reducing acid leaching of manganiferous ore is aimed at the extraction of manganese from low-grade manganese ores. This work is focused on the iron removal operation. The following items have been considered in order to investigate the effect of the main operating conditions on solid waste disposal and on the process costs: (i) the type and quantity of the base agent used for iron precipitation, (ii) the effective need for leaching waste separation prior to the iron removal operation, (iii) the presence of a second leaching stage with the roasted ore, which might also act as a preliminary iron removal step, and (iv) the effect of tailings washing on the solid waste classification. Different base compounds have been tested, including CaO, CaCO3, NaOH, and Na2CO3. The latter gave the best results concerning both the precipitation process kinetics and the reagent consumption. Filtration of the leach liquor prior to iron removal was not necessary, implying significant savings in capital costs. A reduction of chemical consumption and an increase of manganese concentration in the solution were obtained by introducing secondary leaching tests with the previously roasted ore; this additional step was introduced without a significant decrease of the global manganese extraction yield. Finally, toxicity characteristic leaching procedure (TCLP) tests carried out on the leaching solid waste showed: (i) a reduction of arsenic mobility in the presence of iron precipitates, and (ii) the need for a washing step in order to produce a waste that is classifiable as not dangerous under the existing national environmental laws.

  16. Image Registration Workshop Proceedings

    NASA Technical Reports Server (NTRS)

    LeMoigne, Jacqueline (Editor)

    1997-01-01

    Automatic image registration has often been considered a preliminary step for higher-level processing, such as object recognition or data fusion. But with the unprecedented amounts of data which are being, and will continue to be, generated by newly developed sensors, automatic image registration has become an important research topic in its own right. This workshop presents a collection of very high quality work which has been grouped into four main areas: (1) theoretical aspects of image registration; (2) applications to satellite imagery; (3) applications to medical imagery; and (4) image registration for computer vision research.

  17. Insertion of lithium into electrochromic devices after completion

    DOEpatents

    Berland, Brian Spencer; Lanning, Bruce Roy; Frey, Jonathan Mack; Barrett, Kathryn Suzanne; DuPont, Paul Damon; Schaller, Ronald William

    2015-12-22

    The present disclosure describes methods of inserting lithium into an electrochromic device after completion. In the disclosed methods, an ideal amount of lithium can be added post-fabrication to maximize or tailor the free lithium ion density of a layer or the coloration range of a device. Embodiments are directed towards a method to insert lithium into the main device layers of an electrochromic device as a post-processing step after the device has been manufactured. In an embodiment, the methods described are designed to maximize the coloration range while compensating for blind charge loss.

  18. Developing a Learning Algorithm-Generated Empirical Relaxer

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mitchell, Wayne; Kallman, Josh; Toreja, Allen

    2016-03-30

    One of the main difficulties when running Arbitrary Lagrangian-Eulerian (ALE) simulations is determining how much to relax the mesh during the Eulerian step. This determination is currently made by the user on a simulation-by-simulation basis. We present a Learning Algorithm-Generated Empirical Relaxer (LAGER), which uses a random forest regression algorithm to automate this decision process. We also demonstrate that LAGER successfully relaxes a variety of test problems, maintains simulation accuracy, and has the potential to significantly decrease both the person-hours and the computational hours needed to run a successful ALE simulation.
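    The regression idea behind LAGER can be sketched with a tiny bagged ensemble of regression stumps: mesh-quality features in, relaxation amount out. Everything below (the single "distortion" feature, the training pairs, the stump learner) is an invented stand-in for the full random forest used in the paper.

```python
import random

# Toy stand-in for the LAGER idea: an ensemble of depth-1 regression
# trees ("stumps") learns how much to relax the mesh from a mesh-quality
# feature. Training data and feature are illustrative only.

def fit_stump(data):
    """Fit one depth-1 regression tree on (distortion, relax_amount) pairs."""
    xs = sorted(x for x, _ in data)
    best = None
    for split in xs:
        left = [y for x, y in data if x <= split]
        right = [y for x, y in data if x > split]
        if not left or not right:
            continue
        lm, rm = sum(left) / len(left), sum(right) / len(right)
        err = sum((y - (lm if x <= split else rm)) ** 2 for x, y in data)
        if best is None or err < best[0]:
            best = (err, split, lm, rm)
    if best is None:                     # degenerate bootstrap sample
        m = sum(y for _, y in data) / len(data)
        return (xs[0], m, m)
    return best[1:]                      # (split, left_mean, right_mean)

def fit_forest(data, n_trees=10, seed=0):
    """Bagged ensemble: each stump is trained on a bootstrap resample."""
    rng = random.Random(seed)
    return [fit_stump([rng.choice(data) for _ in data])
            for _ in range(n_trees)]

def predict(trees, distortion):
    """Average the stump predictions for one mesh-distortion value."""
    votes = [lm if distortion <= split else rm for split, lm, rm in trees]
    return sum(votes) / len(votes)

# Toy training history: highly distorted zones were relaxed more.
history = [(0.1, 0.0), (0.2, 0.0), (0.6, 0.5), (0.8, 1.0), (0.9, 1.0)]
forest = fit_forest(history)
```

    The averaging over bootstrap-trained trees is what gives the method its robustness: no single user-supplied rule decides the relaxation amount.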

  19. Microbiological profile of selected mucks

    NASA Astrophysics Data System (ADS)

    Dąbek-Szreniawska, M.; Wyczółkowski, A. I.

    2009-04-01

    INTRODUCTION Matyka-Sarzynska and Sokolowska (2000) emphasize that peats and peat soils comprise large areas of Poland. The creation of soil begins when the formation of the swamp has ended. Gawlik (2000) states that the degree of influence of the mucky process of organic soils on the differentiation of the conditions of growth and development of plants is mainly connected with the changes in the moisture-retentive properties of the mucks which constitute the material for these soils, and with the loss of their wetting capacities. These changes, which usually occur gradually and show a clear connection with the extent of dehydration and, at times, with its duration, intensify significantly when the soils are under cultivation. The mucky process of peat soils leads to transformations of their physical, chemical and biological properties. The main ingredient of peat soils is organic substance, which is maintained within them by the protective activity of water. The process of land improvement reduces the humidity of the environment, and this intensifies the activity of soil microorganisms which cause the decay of the organic substance. The decay proceeds along two parallel processes: mineralization and humification. All groups of chemical substances constituting peat undergo mineralization. Special attention should be paid to the mineralization of carbon and nitrogen compounds, which constitute a large percentage of the organic substance of the peat organic mass. Okruszko (1976) has examined the scientific bases of the classification of peat soils depending on the intensity of the muck process. The aim of this publication was to conduct a microbiological characterization of selected mucky material. METHODS AND MATERIALS Soil samples used in the experiments were acquired from the Leczynsko-Wlodawski Lake Region, a large area of which constitutes a part of the Poleski National Park and is covered to a large extent with high peat bogs. It was a mucky-peat soil with different degrees of the muck process, described by Gawlik (2000) as MtI (first step of the muck process) and MtII (second step of the muck process). The numbers of selected groups of microorganisms were established using the cultivation method. The total number of microorganisms; zymogenic, aerobic and anaerobic microorganisms (Fred, Waksman 1928); oligotrophic microorganisms; the number of fungi (Parkinson 1982); ammonifiers (Parkinson et al. 1971); and nitrogen reducers and amylolytic microorganisms (Pochon and Tardieux 1962) were determined. RESULTS The interpretation of the obtained results should take into consideration not only the characteristics of the studied objects, but also the characteristics of the methods used and of the examined microorganisms. As a result of the experiments that were carried out, significant differences in the numbers of the examined groups of microorganisms, depending on the degree of the muck process, were observed. The numbers of the examined groups were significantly higher in the soil at the first step of the muck process than at the second step. Amylolytic bacteria were an exception. Probably, during the muck process, ammonification, nitrification and nitrogen reduction processes take place at the same time, which is indicated by the numbers of the individual groups of examined microorganisms. CONCLUSIONS During the muck process, the number of microorganisms in the soil decreases. It can be presupposed that during the muck process the basic process realized by microorganisms is the degradation of organic substance, using nitrates as oxidizers. Dąbek-Szreniawska M.: 1992 Results of microbiological analysis related to soil physical properties. Zesz. Probl. Post. Nauk Roln., 398, 1-6. Fred E.B., Waksman S.A.: 1928 Laboratory manual of general microbiology. McGraw-Hill Book Company, New York - London, pp. 145. Gawlik J.: 2000 Division of differently silted peat formations into classes according to their state of secondary transformations. Acta Agrophysica, 26, 17-24. Maciak F.: 1985 Materiały do ćwiczeń z rekultywacji teren

  20. Writing a case report: polishing a gem?

    PubMed

    Papanas, N; Lazarides, M K

    2008-08-01

    Case reports describe patient cases which are of particular interest due to their novelty and their potential message for clinical practice. While there are several types of case reports, originality and clinical implications constitute the main virtues by which case reports are judged. Defining the educational message and choosing the right audience are vital steps in the process of writing. Generally, a case report is structured, its main components being the abstract, the introduction, the case description and the discussion. Guidelines and tips for writing case reports are not enough for making a successful author, but they help, especially less experienced doctors, to exercise and improve their writing. If properly prepared, case reports can still communicate new observations in an interesting and pleasant way, thereby enriching our knowledge, even in the era of evidence-based medicine.

  1. Issues on Building Kazakhstan Geospatial Portal to Implement E-Government

    NASA Astrophysics Data System (ADS)

    Sagadiyev, K.; Kang, H. K.; Li, K. J.

    2016-06-01

    A main issue in developing e-government is how to integrate and organize many complicated processes and different stakeholders. Interestingly, geospatial information provides an efficient framework with which to integrate and organize them. In particular, it is very useful to integrate the land management process in e-government with a geospatial information framework, since most land management tasks are related to geospatial properties. In this paper, we present a use case from the e-government project in Kazakhstan for land management. We develop a geoportal to connect many tasks and different users via a geospatial information framework. This geoportal is based on open-source geospatial software including GeoServer, PostGIS, and OpenLayers. With this geoportal, we expect three achievements. First, we establish a transparent governmental process, which is one of the main goals of e-government: every stakeholder can monitor what is happening in the land management process. Second, we can significantly reduce the time and effort spent in the government process. For example, a grant procedure for a building construction has taken more than one year, with more than 50 steps; it is expected that this procedure would be reduced to 2 weeks by the geoportal framework. Third, we provide a collaborative environment between different governmental structures via the geoportal, whereas many conflicts and mismatches have previously been a critical issue in governmental administration processes.

  2. Discovering the infectome of human endothelial cells challenged with Aspergillus fumigatus applying a mass spectrometry label-free approach.

    PubMed

    Curty, N; Kubitschek-Barreira, P H; Neves, G W; Gomes, D; Pizzatti, L; Abdelhay, E; Souza, G H M F; Lopes-Bezerra, L M

    2014-01-31

    Blood vessel invasion is a key feature of invasive aspergillosis. This angioinvasion process contributes to tissue thrombosis, which can impair the access of leukocytes and antifungal drugs to the site of infection. It has been demonstrated that human umbilical vein endothelial cells (HUVECs) are activated and assume a prothrombotic phenotype following contact with Aspergillus fumigatus hyphae or germlings, a process that is independent of fungus viability. However, the molecular mechanisms by which this pathogen can activate endothelial cells, together with the endothelial pathways that are involved in this process, remain unknown. Using a label-free approach by High Definition Mass Spectrometry (HDMS(E)), differentially expressed proteins were identified during the HUVEC-A. fumigatus interaction. Among these, 89 proteins were determined to be up- or down-regulated, and another 409 proteins were exclusive to one experimental condition: the HUVEC control or the HUVEC:AF interaction. The in silico predictions provided a general view of which biological processes and/or pathways were regulated during the HUVEC:AF interaction; they mainly included cell signaling, immune response and hemostasis pathways. This work describes the first global proteomic analysis of HUVECs following interaction with A. fumigatus germlings, the fungus morphotype that represents the first step of invasion and dissemination within the host. A. fumigatus causes the main opportunistic invasive fungal infection in neutropenic hematologic patients. One of the key steps during the establishment of invasive aspergillosis is angioinvasion, but the mechanism associated with the interaction of A. fumigatus with the vascular endothelium remains unknown. The identification of the up- and down-regulated proteins expressed by human endothelial cells in response to fungal infection can contribute to revealing the mechanism of the endothelial response and to understanding the pathophysiology of this high-mortality disease. This article is part of a Special Issue entitled: Trends in Microbial Proteomics. © 2013 Elsevier B.V. All rights reserved.

  3. Hot melt extrusion of ion-exchange resin for taste masking.

    PubMed

    Tan, David Cheng Thiam; Ong, Jeremy Jianming; Gokhale, Rajeev; Heng, Paul Wan Sia

    2018-05-30

    Taste masking is important for some unpleasant-tasting bioactives in oral dosage forms. Among the many methods available for taste masking, the use of ion-exchange resin (IER) holds promise. IER combined with hot melt extrusion (HME) may offer additional advantages over solvent methods. IER provides taste masking by complexing with the drug ions and preventing drug dissolution in the mouth. Drug-IER complexation approaches described in the literature are mainly based either on batch processing or on column elution. These methods of drug-IER complexation have obvious limitations, such as high solvent volume requirements, multiple processing steps and extended processing times. Thus, the objective of this study was to develop a single-step, solvent-free, continuous HME process for drug-IER complexation. The screening study evaluated the drug-to-IER ratio, types of IER and drug complexation methods. In the screening study, a potassium salt of a weakly acidic carboxylate-based cationic IER was found suitable for the HME method. Thereafter, an optimization study was conducted by varying HME process parameters such as screw speed, extrusion temperature and drug-to-IER ratio. It was observed that extrusion temperature and drug-to-IER ratio are imperative in drug-IER complexation through HME. In summary, this study has established the feasibility of a continuous drug-IER complexation method using HME for taste masking. Copyright © 2018 Elsevier B.V. All rights reserved.

  4. A multicriteria-based methodology for site prioritisation in sediment management.

    PubMed

    Alvarez-Guerra, Manuel; Viguri, Javier R; Voulvoulis, Nikolaos

    2009-08-01

    Decision-making for sediment management is a complex task that incorporates the selections of areas for remediation and the assessment of options for any mitigation required. The application of Multicriteria Analysis (MCA) to rank different areas, according to their need for sediment management, provides a great opportunity for prioritisation, a first step in an integrated methodology that finally aims to assess and select suitable alternatives for managing the identified priority sites. This paper develops a methodology that starts with the delimitation of management units within areas of study, followed by the application of MCA methods that allows ranking of these management units, according to their need for remediation. This proposed process considers not only scientific evidence on sediment quality, but also other relevant aspects such as social and economic criteria associated with such decisions. This methodology is illustrated with its application to the case study area of the Bay of Santander, in northern Spain, highlighting some of the implications of utilising different MCA methods in the process. It also uses site-specific data to assess the subjectivity in the decision-making process, mainly reflected through the assignment of the criteria weights and uncertainties in the criteria scores. Analysis of the sensitivity of the results to these factors is used as a way to assess the stability and robustness of the ranking as a first step of the sediment management decision-making process.
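    A weighted-sum aggregation is one of the simplest MCA methods of the kind discussed above; the sketch below ranks hypothetical management units by need for remediation. The units, criteria, scores, and weights are all invented for illustration, and the paper's methodology additionally examines sensitivity to weights and score uncertainty.

```python
# Minimal weighted-sum MCA: each management unit gets one aggregate
# score from normalized criterion scores (0-1, 1 = greatest need),
# and units are ranked by that score. All data here are illustrative.

def weighted_score(scores, weights):
    """Aggregate normalized criterion scores into a single priority value."""
    total_w = sum(weights.values())
    return sum(scores[c] * w for c, w in weights.items()) / total_w

weights = {"sediment_quality": 0.5, "social": 0.3, "economic": 0.2}
units = {
    "unit_A": {"sediment_quality": 0.9, "social": 0.6, "economic": 0.4},
    "unit_B": {"sediment_quality": 0.3, "social": 0.8, "economic": 0.9},
    "unit_C": {"sediment_quality": 0.5, "social": 0.2, "economic": 0.1},
}
ranking = sorted(units, key=lambda u: weighted_score(units[u], weights),
                 reverse=True)       # highest aggregate score = top priority
```

    Changing the weights (for example, raising the economic weight) can reorder the ranking, which is why sensitivity analysis on the weights is used to assess the stability of the result.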

  5. Automated assay for screening the enzymatic release of reducing sugars from micronized biomass.

    PubMed

    Navarro, David; Couturier, Marie; da Silva, Gabriela Ghizzi Damasceno; Berrin, Jean-Guy; Rouau, Xavier; Asther, Marcel; Bignon, Christophe

    2010-07-16

    To reduce the production cost of bioethanol obtained from fermentation of the sugars provided by degradation of lignocellulosic biomass (i.e., second-generation bioethanol), it is necessary to screen for new enzymes endowed with more efficient biomass-degrading properties. This demands the set-up of high-throughput screening methods. Several methods have been devised, all using microplates in the industrial SBS format. Although this size reduction and standardization have greatly improved the screening process, the published methods comprise one or more manual steps that seriously decrease throughput. We therefore worked to devise a screening method devoid of any manual steps. We describe a fully automated assay for measuring the amount of reducing sugars released by biomass-degrading enzymes from wheat straw and spruce. The method comprises two independent and automated steps. The first is the making of "substrate plates": 96-well microplates are filled with slurry suspensions of micronized substrate and stored frozen until use. The second is the enzymatic activity assay: after thawing, the substrate plates are supplemented by the robot with cell-wall-degrading enzymes where necessary, and the whole process from addition of enzymes to quantification of released sugars is performed autonomously by the robot. We describe how the critical parameters (amount of substrate, amount of enzyme, incubation duration and temperature) were selected to fit our specific use. The ability of this automated small-scale assay to discriminate among different enzymatic activities was validated using a set of commercial enzymes. Using an automatic microplate sealer solved three main problems generally encountered when setting up methods for measuring the sugar-releasing activity of plant cell-wall-degrading enzymes: throughput, automation, and evaporation losses. In its present set-up, the robot can autonomously process 120 triplicate wheat-straw samples per day. This throughput can be doubled if the incubation time is reduced from 24 h to 4 h (for initial-rate measurements, for instance). The method can potentially be used with any insoluble substrate that can be micronized. A video illustrating the method can be seen at the following URL: http://www.youtube.com/watch?v=NFg6TxjuMWU.

  6. Automatic Detection of Clouds and Shadows Using High Resolution Satellite Image Time Series

    NASA Astrophysics Data System (ADS)

    Champion, Nicolas

    2016-06-01

    Detecting clouds and their shadows is one of the primary steps to perform when processing satellite images because they may alter the quality of some products such as large-area orthomosaics. The main goal of this paper is to present the automatic method developed at IGN-France for detecting clouds and shadows in a sequence of satellite images. In our work, surface reflectance ortho-images are used; they were processed from the initial satellite images using dedicated software. The cloud detection step consists of a region-growing algorithm. Seeds are first extracted: for each input ortho-image to process, we select the other ortho-images of the sequence that intersect it, and the pixels of the input ortho-image are labelled seeds if the difference in reflectance (in the blue channel) with the overlapping ortho-images is greater than a given threshold. Clouds are then delineated using a region-growing method based on a radiometric and homogeneity criterion. Regarding shadow detection, our method is based on the idea that a shadow pixel is darker when compared to the other images of the time series. The detection is composed of three steps. First, we compute a synthetic ortho-image covering the whole study area; its pixels have a value corresponding to the median value of all input reflectance ortho-images intersecting at that pixel location. Second, for each input ortho-image, a pixel is labelled shadow if the difference in reflectance (in the NIR channel) with the synthetic ortho-image is below a given threshold. Finally, an optional region-growing step may be used to refine the results. Note that pixels labelled clouds during the cloud detection are not used for computing the median value in the first step; the NIR channel is used for shadow detection because it proved to better discriminate shadow pixels. The method was tested on time series of Landsat 8 and Pléiades-HR images, and our first experiments show the feasibility of automating the detection of shadows and clouds in satellite image sequences.
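
    The two thresholding ideas, cloud seeds from blue-channel differences and shadows from a cloud-free temporal median in the NIR, can be sketched with NumPy. The threshold values and the "brighter than every overlapping image" seed rule are simplifying assumptions for illustration, not the authors' exact implementation:

```python
import numpy as np

def detect_seeds_and_shadows(stack_blue, stack_nir, cloud_thr=0.15, shadow_thr=0.08):
    """stack_*: arrays of shape (n_images, rows, cols) with co-registered
    surface reflectance. Returns per-image cloud-seed and shadow masks."""
    n = stack_blue.shape[0]
    cloud_seeds = np.zeros_like(stack_blue, dtype=bool)
    for i in range(n):
        others = np.delete(stack_blue, i, axis=0)
        # Cloud seed: pixel much brighter (blue) than the overlapping images.
        cloud_seeds[i] = (stack_blue[i] - others.max(axis=0)) > cloud_thr

    # Synthetic ortho-image: per-pixel temporal median, excluding cloud seeds.
    nir_masked = np.where(cloud_seeds, np.nan, stack_nir)
    median_nir = np.nanmedian(nir_masked, axis=0)

    # Shadow: pixel clearly darker (NIR) than the temporal median.
    shadows = (median_nir[None, :, :] - stack_nir) > shadow_thr
    return cloud_seeds, shadows
```

The region-growing refinement around the seeds is omitted here; the sketch covers only the per-pixel labelling steps described in the abstract.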

  7. A low-frequency near-field interferometric-TOA 3-D Lightning Mapping Array

    NASA Astrophysics Data System (ADS)

    Lyu, Fanchao; Cummer, Steven A.; Solanki, Rahulkumar; Weinert, Joel; McTague, Lindsay; Katko, Alex; Barrett, John; Zigoneanu, Lucian; Xie, Yangbo; Wang, Wenqi

    2014-11-01

    We report on the development of an easily deployable LF near-field interferometric-time of arrival (TOA) 3-D Lightning Mapping Array applied to imaging of entire lightning flashes. An interferometric cross-correlation technique is applied in our system to compute windowed two-sensor time differences with submicrosecond time resolution before TOA is used for source location. Compared to previously reported LF lightning location systems, our system captures many more LF sources, mainly owing to the improved mapping of continuous lightning processes by this hybrid interferometry/TOA processing method. We show with five-station measurements that the array detects and maps different lightning processes, such as stepped and dart leaders, during both in-cloud and cloud-to-ground flashes. Lightning images mapped by our LF system are remarkably similar to those created by VHF mapping systems, which may suggest special links between LF and VHF emission during lightning processes.
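
    The windowed two-sensor time-difference step can be illustrated with a cross-correlation peak search. This is a generic sketch of the technique on a synthetic pulse, not the authors' implementation; real processing would add windowing and sub-sample peak interpolation:

```python
import numpy as np

def time_difference(sig_a, sig_b, fs):
    """Delay of sig_a relative to sig_b (seconds) from the cross-correlation peak."""
    corr = np.correlate(sig_a, sig_b, mode="full")
    lag_samples = int(np.argmax(corr)) - (len(sig_b) - 1)
    return lag_samples / fs

# Example: the same synthetic LF pulse arriving 3 samples later at sensor A.
fs = 1.0e6                                      # 1 MHz sampling
t = np.arange(256)
pulse = np.exp(-0.5 * ((t - 50) / 4.0) ** 2)    # Gaussian pulse at sample 50
sensor_b = pulse
sensor_a = np.roll(pulse, 3)                    # delayed copy (3 microseconds)
print(time_difference(sensor_a, sensor_b, fs))
```

With sample-level resolution the result is quantised to 1/fs; sub-microsecond precision at lower sampling rates requires interpolating the correlation peak.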

  8. Optimizing electrocoagulation process using experimental design for COD removal from unsanitary landfill leachate.

    PubMed

    Ogedey, Aysenur; Tanyol, Mehtap

    2017-12-01

    Leachate is among the most difficult wastewaters to treat because of its complex composition and high pollutant load. Because it cannot be treated adequately by a single process, pre-treatment is needed. In the present study, a batch electrocoagulation reactor containing aluminum and iron electrodes was used to reduce the chemical oxygen demand (COD) of landfill leachate (Tunceli, Turkey). Optimization of COD removal was carried out with response surface methodology to describe the interaction effects of four main process parameters (current density, inter-electrode distance, pH and time of electrolysis). The optimum current density, inter-electrode distance, pH and time of electrolysis for maximum COD removal (43%) were found to be 19.42 mA/m², 0.96 cm, 7.23 and 67.64 min, respectively. The results showed that the electrocoagulation process can be used as a pre-treatment step for leachate.
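
    Response surface methodology of the kind used above fits a second-order polynomial to designed experiments and locates the optimum on the fitted surface. A minimal two-factor sketch with synthetic responses (not the study's measurements, and only two of its four factors) could look like this:

```python
import numpy as np

# Synthetic COD-removal responses y at coded factor settings (x1, x2),
# e.g., current density and electrolysis time in coded units.
X = np.array([[-1, -1], [1, -1], [-1, 1], [1, 1],
              [0, 0], [-1, 0], [1, 0], [0, -1], [0, 1]], dtype=float)
y = np.array([20.0, 30.0, 25.0, 33.0, 43.0, 35.0, 38.0, 32.0, 36.0])

# Design matrix for the full quadratic model:
# y = b0 + b1*x1 + b2*x2 + b11*x1^2 + b22*x2^2 + b12*x1*x2
x1, x2 = X[:, 0], X[:, 1]
D = np.column_stack([np.ones(len(y)), x1, x2, x1**2, x2**2, x1 * x2])
coeffs, *_ = np.linalg.lstsq(D, y, rcond=None)

# Locate the predicted optimum on a grid over the experimental region.
g1, g2 = np.meshgrid(np.linspace(-1, 1, 201), np.linspace(-1, 1, 201))
pred = (coeffs[0] + coeffs[1]*g1 + coeffs[2]*g2
        + coeffs[3]*g1**2 + coeffs[4]*g2**2 + coeffs[5]*g1*g2)
i = np.unravel_index(np.argmax(pred), pred.shape)
print("optimum (coded):", g1[i], g2[i], "predicted removal:", pred[i])
```

The negative quadratic coefficients confirm a concave surface, so the interior maximum is meaningful; in the study the same logic is applied over all four factors.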

  9. An integrated process for the extraction of fuel and chemicals from marine macroalgal biomass

    NASA Astrophysics Data System (ADS)

    Trivedi, Nitin; Baghel, Ravi S.; Bothwell, John; Gupta, Vishal; Reddy, C. R. K.; Lali, Arvind M.; Jha, Bhavanath

    2016-07-01

    We describe an integrated process that can be applied to biomass of the green seaweed Ulva fasciata to allow the sequential recovery of four economically important fractions: mineral-rich liquid extract (MRLE), lipid, ulvan, and cellulose. The main benefits of our process are a) its simplicity and b) the consistent yields obtained from the residual biomass after each successive extraction step. For example, dry Ulva biomass yields ~26% of its starting mass as MRLE, ~3% as lipid, ~25% as ulvan, and ~11% as cellulose, with the enzymatic hydrolysis and fermentation of the final cellulose fraction under optimized conditions producing ethanol at a competitive 0.45 g/g reducing sugar. These yields are comparable to those obtained by direct processing of the individual components from primary biomass. We propose that this integration of ethanol production and chemical feedstock recovery from macroalgal biomass could substantially enhance the sustainability of marine biomass use.

  10. Processing of zero-derived words in English: an fMRI investigation.

    PubMed

    Pliatsikas, Christos; Wheeldon, Linda; Lahiri, Aditi; Hansen, Peter C

    2014-01-01

    Derivational morphological processes allow us to create new words (e.g. punish (V) to noun (N) punishment) from base forms. The number of steps from the basic units to derived words often varies (e.g., multi-step nation → national → nationality vs. single-step bridge (N) → bridge (V), i.e., zero-derivation; Aronoff, 1980). We compared the processing of one-step (soaking

  11. Scale structure of aluminised Manet steel after HIP treatment

    NASA Astrophysics Data System (ADS)

    Glasbrenner, H.; Stein-Fechner, K.; Konys, J.

    2000-12-01

    Coatings on low activation steels are required in fusion technology in order to reduce the tritium permeation rate through the steel into the cooling water system by a factor of at least 100. Alumina seems to be a promising coating material. However, an appropriate coating system must also have the potential for self-healing, since the ceramic alumina scale tends to fail if mechanical stress is applied. A technology is introduced here to form a ductile Al-enriched surface scale on Manet II steel (Fe-10.3%Cr) with an alumina overlayer. This technology consists of two main process steps. Hot dip aluminising is performed at 700°C for 30 s in order to introduce Al into the near-surface zone. The very hard intermetallic Fe2Al5 scale which forms during the immersion process is completely transformed into FeAl2, FeAl and α-Fe(Al) phases during a subsequent hot isostatic pressing (HIP) step at high pressure at 1040°C for 30 min. The pressures chosen for the HIPing were 1000 and 2000 bar. Without HIPing, pores form due to the Kirkendall effect. The influence of the high pressure on the heat treatment (1040°C, 30 min) is discussed in this paper.

  12. Influence of surface pretreatments on the quality of trivalent chromium process coatings on aluminum alloy

    NASA Astrophysics Data System (ADS)

    Viroulaud, Rémi; Światowska, Jolanta; Seyeux, Antoine; Zanna, Sandrine; Tardelli, Joffrey; Marcus, Philippe

    2017-11-01

    The effects of surface pretreatments (degreasing and pickling) on the characteristics of the Trivalent Chromium Process (TCP) coating on pure aluminum and on AA2024-T351 aluminum alloy were investigated by means of surface-sensitive techniques: X-ray photoelectron spectroscopy (XPS) and time-of-flight secondary ion mass spectrometry (ToF-SIMS). The XPS and ToF-SIMS results show that the TCP coating homogeneity is strongly dependent on the pretreatment process used. The TCP coverage factor, calculated from the XPS results, is significantly lower on both the pure aluminum and the AA2024-T351 alloy surface when a pickling step is applied. One of the main effects of the pickling pretreatment is a strong metallic copper enrichment at the surface of the 2024 alloy, associated with chemical dissolution of Al-Cu intermetallic particles. However, it is shown here that the copper enrichment is not detrimental to the quality of the TCP coating. The coating failure observed when the pickling step is applied can be attributed either to faster coating-growth kinetics, leading to a thicker conversion coating more susceptible to cracking, or to the localized presence of aluminum fluoride species, leading to coating defects or detachment.

  13. Numerical Issues Associated with Compensating and Competing Processes in Climate Models: an Example from ECHAM-HAM

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wan, Hui; Rasch, Philip J.; Zhang, Kai

    2013-06-26

    The purpose of this paper is to draw attention to the need for appropriate numerical techniques to represent process interactions in climate models. In two versions of the ECHAM-HAM model, different time integration methods are used to solve the sulfuric acid (H2SO4) gas evolution equation, which lead to substantially different results in the H2SO4 gas concentration and the aerosol nucleation rate. Using convergence tests and sensitivity simulations performed with various time stepping schemes, it is confirmed that numerical errors in the second model version are significantly smaller than those in version one. The use of sequential operator splitting in combination with a long time step is identified as the main reason for the large systematic biases in the old model. The remaining errors in version two in the nucleation rate, related to the competition between condensation and nucleation, have a clear impact on the simulated concentration of cloud condensation nuclei in the lower troposphere. These errors can be significantly reduced by employing an implicit solver that handles production, condensation and nucleation at the same time. Lessons learned in this work underline the need for more caution when treating multi-time-scale problems involving compensating and competing processes, a common occurrence in current climate models.
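
    The numerical issue can be reproduced with a toy version of the gas evolution equation, dC/dt = P − kC (constant production P, linear condensation sink k): sequential operator splitting with a long step badly underestimates the steady state, while an implicit step that treats both processes together recovers it. The parameter values below are illustrative only and unrelated to the model's:

```python
import numpy as np

P, k = 1.0e6, 1.0e-2        # production rate and condensation sink (toy units)
dt, n_steps = 1000.0, 200   # deliberately long time step: k*dt = 10
exact_steady = P / k        # analytic steady state of dC/dt = P - k*C

c_split, c_impl = 0.0, 0.0
for _ in range(n_steps):
    # Sequential operator splitting: production for a full step,
    # then the (exact) condensation decay for a full step.
    c_split = (c_split + P * dt) * np.exp(-k * dt)
    # Coupled implicit (backward Euler) step handling both terms at once.
    c_impl = (c_impl + P * dt) / (1.0 + k * dt)

print("exact steady state:", exact_steady)
print("splitting:", c_split, " coupled implicit:", c_impl)
```

The coupled backward Euler update has P/k as its exact fixed point, whereas the splitting's fixed point, P·dt·e^(−k·dt)/(1 − e^(−k·dt)), collapses toward zero as k·dt grows, which mirrors the systematic bias described above.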

  14. Biosynthesis of cis,cis-muconic acid and its aromatic precursors, catechol and protocatechuic acid, from renewable feedstocks by Saccharomyces cerevisiae.

    PubMed

    Weber, Christian; Brückner, Christine; Weinreb, Sheila; Lehr, Claudia; Essl, Christine; Boles, Eckhard

    2012-12-01

    Adipic acid is a high-value compound used primarily as a precursor for the synthesis of nylon, coatings, and plastics. Today it is produced mainly in chemical processes from petrochemicals like benzene. Because of the strong environmental impact of these production processes and their dependence on fossil resources, biotechnological production processes would provide an interesting alternative. Here we describe the first engineered Saccharomyces cerevisiae strain expressing a heterologous biosynthetic pathway converting the intermediate 3-dehydroshikimate of the aromatic amino acid biosynthesis pathway via protocatechuic acid and catechol into cis,cis-muconic acid, which can be chemically hydrogenated to adipic acid. The pathway consists of three heterologous microbial enzymes: 3-dehydroshikimate dehydratase, protocatechuic acid decarboxylase (composed of three different subunits), and catechol 1,2-dioxygenase. For each heterologous reaction step, we analyzed several potential candidates for their expression and activity in yeast to compose a functional cis,cis-muconic acid synthesis pathway. Carbon flow into the heterologous pathway was optimized by increasing the flux through selected steps of the common aromatic amino acid biosynthesis pathway and by blocking the conversion of 3-dehydroshikimate into shikimate. The recombinant yeast cells finally produced about 1.56 mg/liter cis,cis-muconic acid.

  15. Manufacturing Precise, Lightweight Paraboloidal Mirrors

    NASA Technical Reports Server (NTRS)

    Hermann, Frederick Thomas

    2006-01-01

    A process for fabricating a precise, diffraction-limited, ultra-lightweight, composite-material (matrix/fiber) paraboloidal telescope mirror has been devised. Unlike the traditional process for fabricating heavier glass-based mirrors, this process involves a minimum of manual steps and subjective judgment. Instead, it involves objectively controllable, repeatable steps and is hence better suited for mass production. Other processes investigated for the fabrication of precise composite-material lightweight mirrors have resulted in print-through of fiber patterns onto reflecting surfaces and have not provided adequate structural support for maintaining stable, diffraction-limited surface figures. In contrast, this process does not result in print-through of the fiber pattern onto the reflecting surface and does provide a lightweight, rigid structure capable of maintaining a diffraction-limited surface figure in the face of changing temperature, humidity, and air pressure. The process consists mainly of the following steps: 1. A precise glass mandrel is fabricated by conventional optical grinding and polishing. 2. The mandrel is coated with a release agent and covered with layers of a carbon-fiber composite material. 3. The outer surface of the outer layer of the carbon-fiber composite material is coated with a surfactant chosen to provide for the proper flow of an epoxy resin to be applied subsequently. 4. The mandrel as thus covered is mounted on a temperature-controlled spin table. 5. The table is heated to a suitable temperature and spun at a suitable speed as the epoxy resin is poured onto the coated carbon-fiber composite material. 6. The surface figure of the optic is monitored and adjusted by use of traditional Ronchi, Foucault, and interferometric optical measurement techniques while the speed of rotation and the temperature are adjusted to obtain the desired figure. The proper selection of surfactant, speed of rotation, viscosity of the epoxy, and temperature makes it possible to obtain the desired diffraction-limited, smooth (1/50th wave) parabolic outer surface, suitable for reflective coating. 7. A reflective coat is applied by use of conventional coating techniques. 8. Once the final figure is set, a lightweight structural foam is applied to the rear of the optic to ensure stability of the figure.

  16. Effect of production management on semen quality during long-term storage in different European boar studs.

    PubMed

    Schulze, M; Kuster, C; Schäfer, J; Jung, M; Grossfeld, R

    2018-03-01

    The processing of ejaculates is a fundamental step for the fertilizing capacity of boar spermatozoa. The aim of the present study was to identify factors that affect the quality of boar semen doses. The production process during one day of semen processing in 26 European boar studs was monitored. In each boar stud, nine to 19 randomly selected ejaculates from 372 Pietrain boars were analyzed for sperm motility, acrosome and plasma membrane integrity, mitochondrial activity and thermo-resistance (TRT). Each ejaculate was monitored for production time and temperature at each step in semen processing using the specially programmed software SEQU (version 1.7, Minitüb, Tiefenbach, Germany). The dilution of ejaculates with a short-term extender was completed in one step in 10 AI centers (n = 135 ejaculates), in two steps in 11 AI centers (n = 158 ejaculates) and in three steps in five AI centers (n = 79 ejaculates). Semen quality was greater with one-step isothermal dilution than with multi-step dilution of AI semen doses (total motility TRT d7: 71.1 ± 19.2%, 64.6 ± 20.0% and 47.1 ± 27.1% for one-step, two-step and three-step dilution, respectively; P < .05). One-step isothermal dilution showed a marked advantage regarding time management, preservation suitability, stability and stress resistance. It also resulted in significantly shorter holding times of raw ejaculates and reduced the risk of mistakes owing to a smaller number of processing steps. These results lead to refined recommendations for boar semen processing. Copyright © 2018 Elsevier B.V. All rights reserved.

  17. Practical and Ethical Aspects of Advance Research Directives for Research on Healthy Aging: German and Israeli Professionals' Perspectives.

    PubMed

    Werner, Perla; Schicktanz, Silke

    2018-01-01

    Healthy aging is the development and maintenance of optimal cognitive, social and physical well-being and function in older adults. Preventing or minimizing disease is one of the main ways of achieving healthy aging. Dementia is one of the most prevalent and life-changing diseases of old age; thus, dementia prevention research is defined as one of the main priorities worldwide. However, conducting research with persons who lack the capacity to give consent is a major ethical issue. Our study attempts to explore if and how advance research directives (ARDs) may be used as a future tool to deal with the ethical and practical issues in dementia research. We conducted focus groups and in-depth interviews with German and Israeli professional stakeholders from the fields of gerontology, ethics, medical law, psychiatry, neurology and policy advice (n = 16), and analyzed the main topics discussed with regard to cross-national similarities and controversies within the groups, as well as across the two national contexts. While both countries are in the midst of a developmental process and have recognized the importance of and need for ARDs as a tool for expanding healthy aging, Germany is at a more advanced stage than Israel because of the EU regulation process, which indicates the influence of international harmonization on these research-related ethical issues. Consensual themes within the qualitative material were identified: the need for a broader debate on ARDs, the ethical importance of autonomy and risk-benefit assessment for ARD implementation, the role of the proxy, and the need to differentiate types of dementia research. Controversies and dilemmas arose around themes such as the current role of IRBs in each country, the need for limits, and how to guarantee safeguarding and control. Implementing a new tool is a step-by-step procedure requiring a thorough understanding of the current state of knowledge as well as of the challenges and hurdles ahead. As long as improving quality of life and promoting autonomy continue to be core elements in the process of healthy aging, efforts to advance knowledge and solve the dilemmas associated with the implementation of ARDs are of the utmost importance.

  18. [The future of intensive medicine].

    PubMed

    Palencia Herrejón, E; González Díaz, G; Mancebo Cortés, J

    2011-05-01

    Although Intensive Care Medicine is a young specialty compared with other medical disciplines, it currently plays a key role in the process of care for many patients. Experience has shown that professionals with specific training in Intensive Care Medicine are needed to provide high quality care to critically ill patients. In Europe, important steps have been taken towards the standardization of training programs of the different member states. However, it is now necessary to take one more step forward, that is, the creation of a primary specialty in Intensive Care Medicine. Care of the critically ill needs to be led by specialists who have received specific and complete training and who have the necessary professional competences to provide maximum quality care to their patients. The future of the specialty presents challenges that must be faced with determination, with the main objective of meeting the needs of the population. Copyright © 2011 Elsevier España, S.L. y SEMICYUC. All rights reserved.

  19. Extrusion of xylans extracted from corn cobs into biodegradable polymeric materials.

    PubMed

    Bahcegul, Erinc; Akinalan, Busra; Toraman, Hilal E; Erdemir, Duygu; Ozkan, Necati; Bakir, Ufuk

    2013-12-01

    The solvent casting technique, which comprises multiple energy-demanding steps including the dissolution of a polymer in a solvent followed by the evaporation of the solvent from the polymer solution, is currently the main technique for the production of xylan-based polymeric materials. The present study shows that sufficient water content renders arabinoglucuronoxylan (AGX) polymers extrudable, enabling the production of AGX-based polymeric materials in a single step via extrusion, which is economically advantageous over the solvent casting process for mass production. AGX polymers with a water content of 27% were found to yield extrudates at an extrusion temperature of 90°C. The extruded strips showed very good mechanical properties, with an ultimate tensile strength of 76 ± 6 MPa and an elongation at break of 35 ± 8%, superior to the mechanical properties of strips obtained from polylactic acid. Copyright © 2013 Elsevier Ltd. All rights reserved.

  20. How many theories for the origin of /proto/life

    NASA Technical Reports Server (NTRS)

    Fox, S. W.

    1980-01-01

    The sequence of primordial chemical events leading to contemporary metabolism is considered, taking into account primordial reactants, amino acids, proteinoid, protocells, ATP, polynucleotides, and protein. The right kind of matter, thermal copolyamino acids, can organize itself into cell-like structures, in the absence of discrete lipid, when triggered to do so by water. Another unpredicted result of examination of the microsystems formed was the step-by-step realization that the component processes of a primitive form of replication were latent in the proteinoid microsystems. At the present time, four modes of primitive replication of proteinoid microsystems have been identified, plus one that has the appearance of protosexual reproduction. Two main conceptual pathways have received attention. One is the proteinoid theory, derived from experiments. The other is the DNA-first theory, for which attempts at conceptual construction and experimental support continue to be sought.

  1. Underground structure pattern and multi AO reaction with step feed concept for upgrading a large wastewater treatment plant

    NASA Astrophysics Data System (ADS)

    Peng, Yi; Zhang, Jie; Li, Dong

    2018-03-01

    A large wastewater treatment plant (WWTP) in China, built on a US treatment technology, could no longer meet the demands of the urban environment and the need for reclaimed water. Thus a multi AO reaction process (anaerobic/oxic/anoxic/oxic/anoxic/oxic) WWTP with an underground structure was proposed for the upgrade project. Four main new technologies were applied: (1) multi AO reaction with step-feed technology; (2) deodorization; (3) new energy-saving technology such as a water source heat pump and an optical fiber lighting system; (4) reliable measures to support the old WWTP's water quality during the new WWTP's construction. After construction, the upgraded WWTP had saved two thirds of the land occupation, increased treatment capacity by 80% and improved the effluent standard by more than two times. Moreover, it had become a benchmark for turning an ecological liability into an ecological asset.

  2. Rapid biodiesel synthesis from waste pepper seeds without lipid isolation step.

    PubMed

    Lee, Jechan; Kim, Jieun; Ok, Yong Sik; Kwon, Eilhann E

    2017-09-01

    In situ transformation of the lipid in waste pepper seeds into biodiesel (i.e., fatty acid methyl esters: FAMEs) via thermally induced transmethylation on silica was investigated in this study. Waste pepper seeds contained 26.9 wt% lipid, and 94.1% of the total lipid in the seeds could be converted into biodiesel without a lipid extraction step, with a reaction time of only ~1 min. The optimal temperature for in situ transmethylation was identified as 390°C. Moreover, comparison of the in situ process with conventional transmethylation catalyzed by H2SO4 showed that the biodiesel conversion introduced in this study had a higher tolerance to impurities and is thus technically feasible. In situ biodiesel production from other oil-bearing food wastes can be studied in future work. Copyright © 2017 Elsevier Ltd. All rights reserved.

  3. Highly Reconfigurable Beamformer Stimulus Generator

    NASA Astrophysics Data System (ADS)

    Vaviļina, E.; Gaigals, G.

    2018-02-01

    The present paper proposes a highly reconfigurable beamformer stimulus generator for a radar antenna array, which includes three main blocks: antenna array settings, object (signal source) settings and a beamforming simulator. Depending on the configuration of the antenna array and object settings, different stimuli can be generated as the input signal for a beamformer. This stimulus generator is developed under a broader concept with two fully independent paths, one being the stimulus generator and the other the hardware beamformer. The two paths can be cross-checked at intermediate as well as final steps to verify and improve system performance. In this way the technology development process is promoted by making each of the future hardware steps more substantive. Stimulus generator configuration capabilities and test results are presented, demonstrating the application of the stimulus generator to the development and tuning of an FPGA-based beamforming unit as an alternative to an actual antenna system.
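
    The general idea, array settings plus source settings in, beamformer input snapshots out, can be sketched for a narrowband uniform linear array. This is a generic illustration of such a stimulus generator, not the authors' design; element count, spacing and source angle below are arbitrary:

```python
import numpy as np

def ula_stimulus(n_elements, spacing_wl, source_angles_deg, amplitudes, n_snapshots=1):
    """Noise-free narrowband snapshots X (n_elements x n_snapshots) for a ULA.

    spacing_wl: element spacing in wavelengths; angles measured from broadside."""
    theta = np.deg2rad(np.asarray(source_angles_deg))
    idx = np.arange(n_elements)[:, None]                    # element index
    # Steering matrix: one column per source.
    A = np.exp(2j * np.pi * spacing_wl * idx * np.sin(theta)[None, :])
    S = np.asarray(amplitudes, dtype=complex)[:, None] * np.ones((1, n_snapshots))
    return A @ S

# Generate a stimulus for one source at 20 degrees, then verify that a
# conventional (delay-and-sum) beamformer scan peaks at that angle.
X = ula_stimulus(8, 0.5, [20.0], [1.0])
scan = np.arange(-90, 91)
idx = np.arange(8)[:, None]
A_scan = np.exp(2j * np.pi * 0.5 * idx * np.sin(np.deg2rad(scan))[None, :])
power = np.abs(A_scan.conj().T @ X[:, 0]) ** 2
print("peak at", scan[np.argmax(power)], "degrees")
```

Feeding such synthetic snapshots to the beamformer under development lets its output be checked against the known source configuration, which is the role the stimulus-generator path plays opposite the hardware path.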

  4. Real Gas Computation Using an Energy Relaxation Method and High-Order WENO Schemes

    NASA Technical Reports Server (NTRS)

    Montarnal, Philippe; Shu, Chi-Wang

    1998-01-01

    In this paper, we use a recently developed energy relaxation theory by Coquel and Perthame and high-order weighted essentially non-oscillatory (WENO) schemes to simulate the Euler equations for real gases. The main idea is a decomposition of the energy into two parts: one part is associated with a simpler pressure law, and the other part (the nonlinear deviation) is convected with the flow. A relaxation process is performed at each time step to ensure that the original pressure law is satisfied. The characteristic decomposition needed for the high-order WENO schemes is performed on the characteristic fields based on the first part. The algorithm only calls the original pressure law once per grid point per time step, without the need to compute its derivatives or any Riemann solvers. Both one- and two-dimensional numerical examples are shown to illustrate the effectiveness of this approach.
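
    The relaxation projection at the heart of the method can be sketched in isolation: the total energy is split so that a simple gamma-law part reproduces the real-gas pressure, and the remainder becomes the convected deviation. The stiffened-gas law below stands in for a general real-gas EOS, and all parameter values are illustrative; the advection of the deviation and the WENO scheme itself are omitted:

```python
import numpy as np

GAMMA1 = 1.4                   # polytropic exponent of the simpler pressure law
GAMMA_R, P_INF = 2.8, 6.0e4    # toy "real gas" (stiffened-gas) parameters

def p_real(rho, e):
    """Toy real-gas pressure law standing in for a general EOS p(rho, e)."""
    return (GAMMA_R - 1.0) * rho * e - GAMMA_R * P_INF

def relax(rho, e):
    """Relaxation projection: split e into e1 (gamma-law part carrying the
    pressure) and e2 (nonlinear deviation, convected until the next step)."""
    p = p_real(rho, e)                  # the single EOS call per point per step
    e1 = p / ((GAMMA1 - 1.0) * rho)     # gamma-law energy reproducing p
    e2 = e - e1
    return e1, e2

rho, e = 1.2, 2.5e5
e1, e2 = relax(rho, e)
p_simple = (GAMMA1 - 1.0) * rho * e1    # pressure from the simpler law
print(p_simple, p_real(rho, e))         # identical after relaxation
```

This makes concrete why the scheme never needs EOS derivatives or real-gas Riemann solvers: after each projection, all wave-propagation work can use the gamma-law part alone.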

  5. Randomization in clinical trials in orthodontics: its significance in research design and methods to achieve it.

    PubMed

    Pandis, Nikolaos; Polychronopoulou, Argy; Eliades, Theodore

    2011-12-01

    Randomization is a key step in reducing selection bias during the treatment allocation phase in randomized clinical trials. The process of randomization follows specific steps: generation of the randomization list, allocation concealment, and implementation of randomization. Characterizing treatment allocation as random is frequent in the dental and orthodontic literature; however, the randomization procedures followed are often not appropriate. Randomization methods assign treatment to the trial arms at random, without foreknowledge of allocation by either the participants or the investigators, thus reducing selection bias. Randomization entails generation of the random allocation sequence, allocation concealment, and the actual methodology for implementing treatment allocation randomly and unpredictably. The most popular randomization methods include some form of restricted and/or stratified randomization. This article introduces the reasons that make randomization an integral part of solid clinical trial methodology and presents the main randomization schemes applicable to clinical trials in orthodontics.
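
    The list-generation step for the most common restricted scheme, permuted-block randomization, can be sketched in a few lines. The block size, arm labels and seed are arbitrary choices for illustration:

```python
import random

def permuted_block_list(n_participants, block_size=4, arms=("A", "B"), seed=2024):
    """Generate a restricted (permuted-block) randomization list: within each
    block, the treatments appear equally often, in random order."""
    assert block_size % len(arms) == 0
    rng = random.Random(seed)
    allocation = []
    while len(allocation) < n_participants:
        block = list(arms) * (block_size // len(arms))
        rng.shuffle(block)               # random order within the block
        allocation.extend(block)
    return allocation[:n_participants]

alloc = permuted_block_list(20)
print(alloc)  # 10 "A" and 10 "B", balanced within every block of 4
```

A generated list only reduces selection bias if allocation concealment is also respected, e.g. by keeping the list with a central randomization service rather than with the recruiting clinicians.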

  6. Comparative analysis of strain fields in layers of step-graded metamorphic buffers of various designs

    NASA Astrophysics Data System (ADS)

    Aleshin, A. N.; Bugaev, A. S.; Ruban, O. A.; Tabachkova, N. Yu.; Shchetinin, I. V.

    2017-10-01

    Spatial distribution of residual elastic strain in the layers of two step-graded metamorphic buffers of various designs, grown by molecular beam epitaxy from ternary InxAl1-xAs solutions on GaAs(001) substrates, is obtained using reciprocal space mapping by three-axis X-ray diffractometry and the linear theory of elasticity. The difference in the design of the buffers enabled the formation of a dislocation-free layer of different thickness in each of the heterostructures, which was the main basis of this study. It is shown that, in spite of the different designs of the graded metamorphic buffers, the nature of the strain fields in them is the same, and the residual elastic strains in the final elements of both buffers, adjusted for the effect of work hardening, obey the same phenomenological law, which describes the strain-relief process in single-layer heterostructures.

  7. A biconjugate gradient type algorithm on massively parallel architectures

    NASA Technical Reports Server (NTRS)

    Freund, Roland W.; Hochbruck, Marlis

    1991-01-01

    The biconjugate gradient (BCG) method is the natural generalization of the classical conjugate gradient algorithm for Hermitian positive definite matrices to general non-Hermitian linear systems. Unfortunately, the original BCG algorithm is susceptible to possible breakdowns and numerical instabilities. Recently, Freund and Nachtigal have proposed a novel BCG-type approach, the quasi-minimal residual method (QMR), which overcomes the problems of BCG. Here, an implementation of QMR is presented, based on an s-step version of the nonsymmetric look-ahead Lanczos algorithm. The main feature of the s-step Lanczos algorithm is that, in general, all inner products, except for one, can be computed in parallel at the end of each block; this is unlike the standard Lanczos process, where inner products are generated sequentially. The resulting implementation of QMR is particularly attractive on massively parallel SIMD architectures, such as the Connection Machine.
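
    For reference, the classical conjugate gradient iteration that BCG generalizes can be sketched as follows (a minimal textbook implementation in Python/NumPy, not the authors' QMR code):

```python
import numpy as np

def conjugate_gradient(A, b, tol=1e-10, max_iter=1000):
    """Minimal conjugate gradient for a symmetric positive definite A.

    Each iteration needs one matrix-vector product and two inner
    products; s-step variants batch inner products so they can be
    evaluated together on parallel hardware.
    """
    x = np.zeros_like(b)
    r = b - A @ x          # residual
    p = r.copy()           # search direction
    rs_old = r @ r
    for _ in range(max_iter):
        Ap = A @ p
        alpha = rs_old / (p @ Ap)
        x += alpha * p
        r -= alpha * Ap
        rs_new = r @ r
        if np.sqrt(rs_new) < tol:
            break
        p = r + (rs_new / rs_old) * p
        rs_old = rs_new
    return x

A = np.array([[4.0, 1.0], [1.0, 3.0]])
b = np.array([1.0, 2.0])
x = conjugate_gradient(A, b)
```

    The sequential inner products (`rs_old`, `p @ Ap`) are exactly the synchronization bottleneck that the s-step Lanczos formulation is designed to relieve.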

  8. Introduction to multifractal detrended fluctuation analysis in matlab.

    PubMed

    Ihlen, Espen A F

    2012-01-01

    Fractal structures are found in biomedical time series from a wide range of physiological phenomena. The multifractal spectrum identifies the deviations in fractal structure within time periods with large and small fluctuations. The present tutorial is an introduction to multifractal detrended fluctuation analysis (MFDFA) that estimates the multifractal spectrum of biomedical time series. The tutorial presents MFDFA step-by-step in an interactive Matlab session. All the Matlab tools needed are available in the Introduction to MFDFA folder at the website www.ntnu.edu/inm/geri/software. MFDFA is introduced in Matlab code boxes where the reader can apply pieces of, or the entire, MFDFA to example time series. After introducing MFDFA, the tutorial discusses best practice for MFDFA in biomedical signal processing. The main aim of the tutorial is to give the reader a simple, self-contained guide to the implementation of MFDFA and the interpretation of the resulting multifractal spectra.
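
    The detrended-fluctuation core of MFDFA can be sketched in a few lines (a simplified first-order, monofractal DFA in Python for illustration; the tutorial's own Matlab code covers the full multifractal analysis):

```python
import numpy as np

def dfa(signal, scales):
    """Simplified first-order detrended fluctuation analysis.

    Returns the scaling exponent, i.e. the slope of log F(s) versus
    log s; MFDFA generalizes this by weighting fluctuations with
    q-th powers to obtain the multifractal spectrum.
    """
    profile = np.cumsum(signal - np.mean(signal))  # integrated series
    flucts = []
    for s in scales:
        n_seg = len(profile) // s
        rms = []
        for i in range(n_seg):
            seg = profile[i * s:(i + 1) * s]
            t = np.arange(s)
            trend = np.polyval(np.polyfit(t, seg, 1), t)  # local linear detrend
            rms.append(np.sqrt(np.mean((seg - trend) ** 2)))
        flucts.append(np.mean(rms))
    slope, _ = np.polyfit(np.log(scales), np.log(flucts), 1)
    return slope

rng = np.random.default_rng(0)
alpha = dfa(rng.standard_normal(4096), scales=[16, 32, 64, 128, 256])
```

    For uncorrelated white noise the exponent is close to 0.5; long-range correlated physiological series yield larger values.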

  9. Introduction to Multifractal Detrended Fluctuation Analysis in Matlab

    PubMed Central

    Ihlen, Espen A. F.

    2012-01-01

    Fractal structures are found in biomedical time series from a wide range of physiological phenomena. The multifractal spectrum identifies the deviations in fractal structure within time periods with large and small fluctuations. The present tutorial is an introduction to multifractal detrended fluctuation analysis (MFDFA) that estimates the multifractal spectrum of biomedical time series. The tutorial presents MFDFA step-by-step in an interactive Matlab session. All the Matlab tools needed are available in the Introduction to MFDFA folder at the website www.ntnu.edu/inm/geri/software. MFDFA is introduced in Matlab code boxes where the reader can apply pieces of, or the entire, MFDFA to example time series. After introducing MFDFA, the tutorial discusses best practice for MFDFA in biomedical signal processing. The main aim of the tutorial is to give the reader a simple, self-contained guide to the implementation of MFDFA and the interpretation of the resulting multifractal spectra. PMID:22675302

  10. Scientific Insights in the Preparation and Characterisation of a Lead-based Naga Bhasma.

    PubMed

    Nagarajan, S; Krishnaswamy, S; Pemiah, Brindha; Rajan, K S; Krishnan, Umamaheswari; Sethuraman, S

    2014-01-01

    Naga bhasma is one of the herbo-metallic preparations used in Ayurveda, a traditional Indian System of Medicine. The preparation of Naga bhasma involves thermal treatment of 'Naga' (metallic lead) in a series of quenching liquids, followed by reaction with realgar and herbal constituents, before calcination to prepare a fine product. We have analysed the intermediates obtained during different stages of preparation to understand the relevance and importance of different steps involved in the preparation. Our results show that 'Sodhana' (purification process) removes heavy metals other than lead, apart from making it soft and amenable for trituration. The use of powders of tamarind bark and peepal bark maintains the oxidation state of lead in Jarita Naga (lead oxide) as Pb(2+). The repeated calcination steps result in the formation of nano-crystalline lead sulphide, the main chemical species present in Naga bhasma.

  11. Scientific Insights in the Preparation and Characterisation of a Lead-based Naga Bhasma

    PubMed Central

    Nagarajan, S.; Krishnaswamy, S.; Pemiah, Brindha; Rajan, K. S.; Krishnan, Umamaheswari; Sethuraman, S.

    2014-01-01

    Naga bhasma is one of the herbo-metallic preparations used in Ayurveda, a traditional Indian System of Medicine. The preparation of Naga bhasma involves thermal treatment of ‘Naga’ (metallic lead) in a series of quenching liquids, followed by reaction with realgar and herbal constituents, before calcination to prepare a fine product. We have analysed the intermediates obtained during different stages of preparation to understand the relevance and importance of different steps involved in the preparation. Our results show that ‘Sodhana’ (purification process) removes heavy metals other than lead, apart from making it soft and amenable for trituration. The use of powders of tamarind bark and peepal bark maintains the oxidation state of lead in Jarita Naga (lead oxide) as Pb2+. The repeated calcination steps result in the formation of nano-crystalline lead sulphide, the main chemical species present in Naga bhasma. PMID:24799737

  12. [A Concept Analysis for Mind-Body Interaction].

    PubMed

    Chen, Hsing-Wen; Yeh, Mei-Ling; Rong, Jiin-Ru

    2015-08-01

    Mind-body interaction (MBI) refers to the holistic association and interactive process between wisdom, thinking, belief, and physiological reaction, which critically affects health. The main goal of nursing is to maintain mind and body in a healthy state of well-being. Few reports in the literature have addressed the evaluation and application of MBI. Thus, a conceptual analysis of this subject is worth exploring in depth. This paper analyzes the MBI concept step by step based on the procedures of Walker and Avant. The result defines the characteristics of MBI as (1) awareness of psychosomatic effects, (2) interaction between psychology, neurology, immunology and other systems, and (3) production of a bio-psycho-social status. Antecedents include geography, culture, race, gender, age, education, profession, values, personality, experience, and health status. Consequences of MBI include well-being, illness, and death. This paper provides new information on MBI that clarifies its meaning, provides comprehensive cognition, and suggests useful applications.

  13. Fish welfare assurance system: initial steps to set up an effective tool to safeguard and monitor farmed fish welfare at a company level.

    PubMed

    van de Vis, J W; Poelman, M; Lambooij, E; Bégout, M-L; Pilarczyk, M

    2012-02-01

    The objective was to take a first step in the development of a process-oriented quality assurance (QA) system for monitoring and safeguarding of fish welfare at a company level. A process-oriented approach is focused on preventing hazards and involves establishment of critical steps in a process that require careful control. The seven principles of the Hazard Analysis Critical Control Points (HACCP) concept were used as a framework to establish the QA system. HACCP is an internationally agreed approach for management of food safety, which was adapted for the purpose of safeguarding and monitoring the welfare of farmed fish. As the main focus of this QA system is farmed fish welfare assurance at a company level, it was named the Fish Welfare Assurance System (FWAS). In this paper we present the initial steps of setting up FWAS for the on-growing of sea bass (Dicentrarchus labrax), carp (Cyprinus carpio) and European eel (Anguilla anguilla). Four major hazards were selected, which depended on the fish species. Critical Control Points (CCPs) that need to be controlled to minimize or avoid the four hazards are presented. For FWAS, monitoring of CCPs at a farm level is essential. For monitoring purposes, Operational Welfare Indicators (OWIs) are needed to establish whether critical biotic, abiotic, managerial and environmental factors are controlled. For the OWIs we present critical limits/target values. A critical limit is the maximum or minimum value to which a factor must be controlled at a critical control point to prevent, eliminate or reduce a hazard to an acceptable level. For managerial factors, target levels are more appropriate than critical limits. Regarding the international trade of farmed fish products, we propose that FWAS be standardized in aquaculture chains. This standardization requires a consensus on the concept of fish welfare, methods to assess welfare objectively, and knowledge of the needs of farmed fish.

  14. Why do organizations not learn from incidents? Bottlenecks, causes and conditions for a failure to effectively learn.

    PubMed

    Drupsteen, Linda; Hasle, Peter

    2014-11-01

    If organizations would be able to learn more effectively from incidents that occurred in the past, future incidents and consequential injury or damage can be prevented. To improve learning from incidents, this study aimed to identify limiting factors, i.e. the causes of the failure to effectively learn. In seven organizations focus groups were held to discuss factors that according to employees contributed to the failure to learn. By use of a model of the learning from incidents process, the steps, where difficulties for learning arose, became visible, and the causes for these difficulties could be studied. Difficulties were identified in multiple steps of the learning process, but most difficulties became visible when planning actions, which is the phase that bridges the gap from incident investigation to actions for improvement. The main causes for learning difficulties, which were identified by the participants in this study, were tightly related to the learning process, but some indirect causes - or conditions - such as lack of ownership and limitations in expertise were also mentioned. The results illustrate that there are two types of causes for the failure to effectively learn: direct causes and indirect causes, here called conditions. By actively and systematically studying learning, more conditions might be identified and indicators for a successful learning process may be determined. Studying the learning process does, however, require a shift from learning from incidents to learning to learn. Copyright © 2014 Elsevier Ltd. All rights reserved.

  15. Pleiades image quality: from users' needs to products definition

    NASA Astrophysics Data System (ADS)

    Kubik, Philippe; Pascal, Véronique; Latry, Christophe; Baillarin, Simon

    2005-10-01

    Pleiades is the highest resolution civilian Earth-observing system ever developed in Europe. This imagery programme is conducted by the French National Space Agency, CNES. In 2008-2009 it will begin operating two agile satellites designed to provide optical images to civilian and defence users. Images will be acquired simultaneously in Panchromatic (PA) and multispectral (XS) mode, which allows, under Nadir acquisition conditions, delivery of 20 km wide, false- or natural-colored scenes with a 70 cm ground sampling distance after PA+XS fusion. Imaging capabilities have been highly optimized in order to acquire along-track mosaics, stereo pairs and triplets, and multi-targets. To fulfill the operational requirements and ensure quick access to information, ground processing has to perform the radiometric and geometric corrections automatically. Since ground processing capabilities were taken into account very early in the programme development, it was possible to relax some costly on-board component requirements in order to achieve a cost-effective on-board/ground compromise. Starting from an overview of the system characteristics, this paper deals with the image product definitions (raw level, perfect sensor, orthoimage and along-track orthomosaics) and the main processing steps. It shows how each system performance results from the satellite performance followed by appropriate ground processing. Finally, it focuses on the radiometric performance of the final products, which is intimately linked to the following processing steps: radiometric corrections, PA restoration, image resampling and PAN-sharpening.

  16. A Coordinated Initialization Process for the Distributed Space Exploration Simulation

    NASA Technical Reports Server (NTRS)

    Crues, Edwin Z.; Phillips, Robert G.; Dexter, Dan; Hasan, David

    2007-01-01

    A viewgraph presentation on the federate initialization process for the Distributed Space Exploration Simulation (DSES) is described. The topics include: 1) Background: DSES; 2) Simulation requirements; 3) Nine Step Initialization; 4) Step 1: Create the Federation; 5) Step 2: Publish and Subscribe; 6) Step 3: Create Object Instances; 7) Step 4: Confirm All Federates Have Joined; 8) Step 5: Achieve initialize Synchronization Point; 9) Step 6: Update Object Instances With Initial Data; 10) Step 7: Wait for Object Reflections; 11) Step 8: Set Up Time Management; 12) Step 9: Achieve startup Synchronization Point; and 13) Conclusions
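
    The role of the two synchronization points can be illustrated with a toy example (plain Python threads standing in for federates; this is not the HLA/RTI API that DSES actually uses):

```python
import threading

N_FEDERATES = 3
init_point = threading.Barrier(N_FEDERATES)     # "initialize" sync point
startup_point = threading.Barrier(N_FEDERATES)  # "startup" sync point
log = []
log_lock = threading.Lock()

def federate(name):
    """Each simulated federate joins, initializes, then starts running."""
    with log_lock:
        log.append((name, "joined"))
    init_point.wait()      # nobody proceeds until every federate has joined
    with log_lock:
        log.append((name, "initialized"))
    startup_point.wait()   # all initial data exchanged before time advances
    with log_lock:
        log.append((name, "running"))

threads = [threading.Thread(target=federate, args=(f"fed{i}",))
           for i in range(N_FEDERATES)]
for t in threads:
    t.start()
for t in threads:
    t.join()
```

    The barriers guarantee that every "joined" entry precedes every "initialized" entry, and every "initialized" entry precedes every "running" entry, regardless of thread scheduling.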

  17. A General Accelerated Degradation Model Based on the Wiener Process.

    PubMed

    Liu, Le; Li, Xiaoyang; Sun, Fuqiang; Wang, Ning

    2016-12-06

    Accelerated degradation testing (ADT) is an efficient tool to conduct material service reliability and safety evaluations by analyzing performance degradation data. Traditional stochastic process models are mainly for linear or linearized degradation paths. However, those methods are not applicable in situations where the degradation process cannot be linearized. Hence, in this paper, a general ADT model based on the Wiener process is proposed to solve the problem for accelerated degradation data analysis. The general model can consider the unit-to-unit variation and temporal variation of the degradation process, and is suitable for both linear and nonlinear ADT analyses with single or multiple acceleration variables. The statistical inference is given to estimate the unknown parameters in both constant-stress and step-stress ADT. The simulation example and two real applications demonstrate that the proposed method can yield reliable lifetime evaluation results compared with the existing linear and time-scale transformation Wiener processes in both linear and nonlinear ADT analyses.
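
    The basic Wiener degradation path underlying such models, X(t) = μt + σB(t), can be simulated directly (an illustrative sketch with arbitrary parameter values, not the paper's general model):

```python
import numpy as np

def wiener_degradation(mu, sigma, t_grid, n_units, rng):
    """Simulate Wiener-process degradation paths X(t) = mu*t + sigma*B(t).

    mu is the drift (the degradation rate, typically a function of the
    stress level in ADT); sigma scales the Brownian temporal variation.
    """
    dt = np.diff(t_grid, prepend=0.0)
    noise = rng.standard_normal((n_units, len(t_grid)))
    increments = mu * dt + sigma * np.sqrt(dt) * noise
    return np.cumsum(increments, axis=1)

rng = np.random.default_rng(42)
t = np.linspace(0.1, 100.0, 1000)
paths = wiener_degradation(mu=0.5, sigma=0.2, t_grid=t, n_units=200, rng=rng)
```

    A unit fails when its path first crosses a failure threshold; in ADT, mu (and possibly sigma) is tied to the stress level through an acceleration model.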

  18. A General Accelerated Degradation Model Based on the Wiener Process

    PubMed Central

    Liu, Le; Li, Xiaoyang; Sun, Fuqiang; Wang, Ning

    2016-01-01

    Accelerated degradation testing (ADT) is an efficient tool to conduct material service reliability and safety evaluations by analyzing performance degradation data. Traditional stochastic process models are mainly for linear or linearized degradation paths. However, those methods are not applicable in situations where the degradation process cannot be linearized. Hence, in this paper, a general ADT model based on the Wiener process is proposed to solve the problem for accelerated degradation data analysis. The general model can consider the unit-to-unit variation and temporal variation of the degradation process, and is suitable for both linear and nonlinear ADT analyses with single or multiple acceleration variables. The statistical inference is given to estimate the unknown parameters in both constant-stress and step-stress ADT. The simulation example and two real applications demonstrate that the proposed method can yield reliable lifetime evaluation results compared with the existing linear and time-scale transformation Wiener processes in both linear and nonlinear ADT analyses. PMID:28774107

  19. Two-Step Plasma Process for Cleaning Indium Bonding Bumps

    NASA Technical Reports Server (NTRS)

    Greer, Harold F.; Vasquez, Richard P.; Jones, Todd J.; Hoenk, Michael E.; Dickie, Matthew R.; Nikzad, Shouleh

    2009-01-01

    A two-step plasma process has been developed as a means of removing surface oxide layers from indium bumps used in flip-chip hybridization (bump bonding) of integrated circuits. The two-step plasma process makes it possible to remove surface indium oxide, without incurring the adverse effects of the acid etching process.

  20. Influence of parameters controlling the extrusion step in fused filament fabrication (FFF) process applied to polymers using numerical simulation

    NASA Astrophysics Data System (ADS)

    Shahriar, Bakrani Balani; Arthur, Cantarel; France, Chabert; Valérie, Nassiet

    2018-05-01

    Extrusion is one of the oldest manufacturing processes; it is widely used for manufacturing finished and semi-finished products. Extrusion is also the main process in additive manufacturing technologies such as Fused Filament Fabrication (FFF). In the FFF process, the parts are manufactured layer by layer using thermoplastic material. The latter, in the form of a filament, is melted in the liquefier, then extruded and deposited on the previous layer. The mechanical properties of the printed parts rely on the coalescence of each extrudate with its neighbours. The coalescence phenomenon is driven by the flow properties of the melted polymer as it comes out of the nozzle just before the deposition step. This study aims to master the quality of the printed parts by controlling the effect of the extruder parameters on the flow properties in the FFF process. In the current study, numerical simulation of the polymer coming out of the extruder was carried out using Computational Fluid Dynamics (CFD) and a two-phase flow (TPF) Level Set (LS) simulation in the 2D axisymmetric module of the COMSOL Multiphysics software. In order to couple heat transfer with the flow simulation, an advection-diffusion equation was used, implemented as a Partial Differential Equation (PDE) in the software. In order to define the variation of the polymer's viscosity with temperature, the rheological behaviors of two thermoplastics were measured with an extensional rheometer and with the parallel-plate configuration of an oscillatory rheometer. The results highlight the influence of the environment temperature and the cooling rate on the temperature and viscosity of the extrudate exiting the nozzle. Moreover, the temperature and its corresponding viscosity at different times have been determined using numerical simulation. At the highest shear rates, the extrudate deforms from its typical cylindrical shape.
These results are required to predict the coalescence of filaments, a step towards understanding the mechanical properties of the printed parts.
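
    The temperature dependence of melt viscosity that such simulations must capture is often described by an Arrhenius-type law (an illustrative sketch with assumed parameter values, not data from this study):

```python
import math

def arrhenius_viscosity(T, eta_ref, T_ref, E_a, R=8.314):
    """Arrhenius melt viscosity: eta(T) = eta_ref * exp((E_a/R) * (1/T - 1/T_ref)).

    All parameter values used below are illustrative, not measured FFF data.
    """
    return eta_ref * math.exp((E_a / R) * (1.0 / T - 1.0 / T_ref))

# Hypothetical melt: 1000 Pa*s at 483 K (210 C), activation energy 60 kJ/mol
eta_nozzle = arrhenius_viscosity(T=483.0, eta_ref=1000.0, T_ref=483.0, E_a=60e3)
eta_cooled = arrhenius_viscosity(T=453.0, eta_ref=1000.0, T_ref=483.0, E_a=60e3)
```

    In this example a 30 K drop after the nozzle exit nearly triples the viscosity, which is why the cooling rate governs how long neighbouring extrudates remain able to coalesce.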

  1. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Soliman, A; Safigholi, H; Sunnybrook Health Sciences Center, Toronto, ON

    Purpose: To propose a new method that provides a positive contrast visualization of the prostate brachytherapy seeds using the phase information from MR images. Additionally, the feasibility of using the processed phase information to distinguish seeds from calcifications is explored. Methods: A gel phantom was constructed using 2% agar dissolved in 1 L of distilled water. Contrast agents were added to adjust the relaxation times. Four iodine-125 (Eckert & Ziegler SML86999) dummy seeds were placed at different orientations with respect to the main magnetic field (B0). Calcifications were obtained from a sheep femur cortical bone due to its close similarity to human bone tissue composition. Five samples of calcifications were shaped into different dimensions with lengths ranging between 1.2 – 6.1 mm. MR imaging was performed on a 3T Philips Achieva using an 8-channel head coil. Eight images were acquired at eight echo-times using a multi-gradient echo sequence. Spatial resolution was 0.7 × 0.7 × 2 mm, TR/TE/dTE = 20.0/2.3/2.3 ms and BW = 541 Hz/pixel. Complex images were acquired and fed into a two-step processing pipeline: the first step includes phase unwrapping and background phase removal using a Laplacian operator (Wei et al. 2013). The second step applies a specific phase mask on the resulting tissue phase from the first step to provide the desired positive contrast of the seeds and to, potentially, differentiate them from the calcifications. Results: The phase processing was performed in less than 30 seconds. The proposed method successfully resulted in a positive contrast of the brachytherapy seeds. Additionally, the final processed phase image showed a difference between the appearance of seeds and calcifications. However, the shape of the seeds was slightly distorted compared to the original dimensions. Conclusion: It is feasible to provide a positive contrast of the seeds from MR images using Laplacian operator-based phase processing.

  2. A sample of potential disk hosting first ascent red giants

    NASA Astrophysics Data System (ADS)

    Steele, Amy; Debes, John

    2018-01-01

    Observations of (sub)giants with planets and disks provide the first set of proof that disks can survive the first stages of post-main-sequence evolution, even though the disks are expected to dissipate by this time. The infrared (IR) excesses present around a number of post-main-sequence (PMS) stars could be due to a traditional debris disk with planets (e.g. kappa CrB), some remnant of enhanced mass loss (e.g. the shell-like structure of R Sculptoris), and/or background contamination. We present a sample of potential disk hosting first ascent red giants. These stars all have infrared excesses at 22 microns, and possibly host circumstellar debris. We summarize the characteristics of the sample to better inform the incidence rates of thermally emitting material around giant stars. A thorough follow-up study of these candidates would serve as the first step in probing the composition of the dust in these systems that have left the main sequence, providing clues to the degree of disk processing that occurs beyond the main-sequence.

  3. The birth of a supermassive black hole binary

    NASA Astrophysics Data System (ADS)

    Pfister, Hugo; Lupi, Alessandro; Capelo, Pedro R.; Volonteri, Marta; Bellovary, Jillian M.; Dotti, Massimo

    2017-11-01

    We study the dynamical evolution of supermassive black holes, in the late stage of galaxy mergers, from kpc to pc scales. In particular, we capture the formation of the binary, a necessary step before the final coalescence, and trace back the main processes causing the decay of the orbit. We use hydrodynamical simulations of galaxy mergers with different resolutions, from 20 pc down to 1 pc, in order to study the effects of the resolution on our results, remove numerical effects, and assess that resolving the influence radius of the orbiting black hole is a minimum condition to fully capture the formation of the binary. Our simulations include the relevant physical processes, namely star formation, supernova feedback, accretion on to the black holes and the ensuing feedback. We find that, in these mergers, dynamical friction from the smooth stellar component of the nucleus is the main process that drives black holes from kpc to pc scales. Gas does not play a crucial role and even clumps do not induce scattering or perturb the orbits. We compare the time needed for the formation of the binary to analytical predictions and suggest how to apply such analytical formalism to obtain estimates of binary formation times in lower resolution simulations.

  4. Assessment of Polarimetric SAR Interferometry for Improving Ship Classification based on Simulated Data

    PubMed Central

    Margarit, Gerard; Mallorqui, Jordi J.

    2008-01-01

    This paper uses a complete and realistic SAR simulation processing chain, GRECOSAR, to study the potential of Polarimetric SAR Interferometry (POLInSAR) for developing new ship classification methods. Its high processing efficiency and scenario flexibility have allowed exhaustive scattering studies to be carried out. The results have revealed, first, that vessels' geometries can be described by specific combinations of Permanent Polarimetric Scatterers (PePS) and, second, that each type of vessel can be characterized by a particular spatial and polarimetric distribution of PePS. These properties have recently been exploited to propose a new Vessel Classification Algorithm (VCA) working with POLInSAR data, which, according to several simulation tests, may provide promising performance in real scenarios. The paper explains the main steps of the research activity carried out with ships and GRECOSAR and provides examples of the main results and VCA validation tests. Special attention is devoted to the new improvements achieved, which relate to simulations using a new and highly realistic sea surface model. The paper shows that, for POLInSAR data with fine resolution, VCA can help to classify ships with notable robustness under diverse and adverse observation conditions. PMID:27873954

  5. Production of chemicals from C1 gases (CO, CO2) by Clostridium carboxidivorans.

    PubMed

    Fernández-Naveira, Ánxela; Abubackar, Haris Nalakath; Veiga, María C; Kennes, Christian

    2017-03-01

    Bioprocesses in conventional second-generation biorefineries are mainly based on the fermentation of sugars obtained from lignocellulosic biomass or agro-industrial wastes. An alternative to this process consists of gasifying those same feedstocks, or even other carbon-containing materials, to obtain syngas, which can also be fermented by some anaerobic bacteria to produce chemicals or fuels. Carbon monoxide, carbon dioxide and hydrogen, the main components of syngas, are also found in some industrial waste gases, among others in steel industries. Clostridium carboxidivorans is able to metabolise such gases to produce ethanol and higher alcohols, i.e. butanol and hexanol, following the Wood-Ljungdahl pathway. This simultaneously allows the removal of volatile pollutants involved in climate change. The bioconversion is a two-step process in which organic acids (acetate, butyrate, hexanoate) are produced first, followed by the accumulation of alcohols, although partial overlap in time of acid and alcohol production may sometimes take place as well. Several parameters, among others pH, temperature, and gas-feed flow rates in bioreactors, affect the bioconversion process. Besides, the accumulation of high concentrations of alcohols in the fermentation broth inhibits the growth and metabolic activity of C. carboxidivorans.

  6. Defining hazards of supplemental oxygen therapy in neonatology using the FMEA tool.

    PubMed

    van der Eijk, Anne Catherine; Rook, Denise; Dankelman, Jenny; Smit, Bert Johan

    2013-01-01

    To prospectively evaluate hazards in the process of supplemental oxygen therapy in very preterm infants hospitalized in a Dutch NICU. A Failure Mode and Effects Analysis (FMEA) was conducted by a multidisciplinary team. This team identified, evaluated, and prioritized hazards of supplemental oxygen therapy in preterm infants. After assigning "hazard scores" to each step in this process, recommendations were formulated for the main hazards. Performing the FMEA took seven 2-hour meetings. The top 10 hazards could all be categorized into three main topics: incorrect adjustment of the fraction of inspired oxygen (FiO2), incorrect alarm limits for SpO2, and incorrect pulse-oximetry alarm limits on patient monitors for temporary use. The FMEA culminated in recommendations in both educational and technical areas. These included suggestions for (changes in) protocols on alarm limits and manual FiO2 adjustments, education of NICU staff on hazards of supplemental oxygen, and technical improvements in respiratory devices and patient monitors. The FMEA prioritized flaws in the process of supplemental oxygen therapy in very preterm infants. Thanks to the structured approach of the analysis by a multidisciplinary team, several recommendations were made. These recommendations are currently being implemented in the study's center.
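
    Hazard scores in an FMEA are conventionally computed as a risk priority number, RPN = severity × occurrence × detection (a generic sketch; the failure modes and scores below are illustrative, not the study's actual values):

```python
def risk_priority_number(severity, occurrence, detection):
    """Classic FMEA risk priority number; each factor is scored 1-10."""
    return severity * occurrence * detection

# Illustrative failure modes loosely named after the study's top hazards
failure_modes = [
    ("incorrect FiO2 adjustment",         9, 6, 4),
    ("incorrect SpO2 alarm limits",       8, 5, 5),
    ("wrong limits on temporary monitor", 7, 4, 6),
]

ranked = sorted(
    ((name, risk_priority_number(s, o, d)) for name, s, o, d in failure_modes),
    key=lambda item: item[1],
    reverse=True,
)
```

    Sorting by RPN gives the multidisciplinary team a defensible order in which to address hazards.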

  7. ICME — A Mere Coupling of Models or a Discipline of Its Own?

    NASA Astrophysics Data System (ADS)

    Bambach, Markus; Schmitz, Georg J.; Prahl, Ulrich

    Technically, ICME — Integrated computational materials engineering — is an approach for solving advanced engineering problems related to the design of new materials and processes by combining individual materials and process models. To date, the combination of models has mainly been achieved by manually transforming the output of one simulation to form the input of a subsequent one. This subsequent simulation is either performed at a different length scale or constitutes a subsequent step along the process chain. Is ICME thus just a synonym for the coupling of simulations? In fact, most ICME publications to date are examples of the joint application of selected models and software codes to a specific problem. However, from a systems point of view, the coupling of individual models and/or software codes across length scales and along material processing chains leads to highly complex meta-models. Their viability has to be ensured by joint efforts from science, industry, software developers and independent organizations. This paper identifies some developments that seem necessary to make future ICME simulations viable, sustainable and broadly accessible and accepted. The main conclusion is that ICME is more than a multi-disciplinary subject: it is a discipline of its own, for which a generic structural framework has to be elaborated and established.

  8. Ultrasound: a subexploited tool for sample preparation in metabolomics.

    PubMed

    Luque de Castro, M D; Delgado-Povedano, M M

    2014-01-02

    Metabolomics, one of the most recently emerged "omics" disciplines, has taken advantage of ultrasound (US) to improve sample preparation (SP) steps. The pairing of metabolomics with US-assisted SP has developed unevenly, depending on the area (vegetal or animal) and the SP step. Thus, vegetal metabolomics and US-assisted leaching have received the greatest attention (encompassing subdisciplines such as metallomics, xenometabolomics and, mainly, lipidomics), but liquid-liquid extraction and (bio)chemical reactions in metabolomics have also taken advantage of US energy. Clinical and animal samples have likewise benefited from US-assisted SP in metabolomics studies, but to a lesser extent. The main effects of US have been shortening of the time required for the given step and/or increases in its efficiency or suitability for automation; nevertheless, attention paid to potential degradation caused by US has been scant or nil. Achievements and weak points of US-assisted SP in metabolomics are discussed, and possible solutions to the present shortcomings are proposed. Copyright © 2013 Elsevier B.V. All rights reserved.

  9. A fast exact simulation method for a class of Markov jump processes.

    PubMed

    Li, Yao; Hu, Lili

    2015-11-14

    A new stochastic simulation algorithm (SSA), named the Hashing-Leaping method (HLM), for exact simulation of a class of Markov jump processes is presented in this paper. The HLM has a conditionally constant computational cost per event, independent of the number of exponential clocks in the Markov process. The main idea of the HLM is to repeatedly apply a hash-table-like bucket sort to all times of occurrence covered by a time step of length τ. This paper serves as an introduction to this new SSA method. We introduce the method, demonstrate its implementation, analyze its properties, and compare its performance with three other commonly used SSA methods in four examples. Our performance tests and CPU operation statistics show certain advantages of the HLM for large-scale problems.
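
    The bucket-sort idea at the heart of the HLM can be illustrated as follows (a simplified sketch, not the authors' implementation): event times falling within a step of length τ are hashed into equal-width buckets in constant time, after which the buckets are drained in order.

```python
import random

def bucket_sort_events(event_times, tau, n_buckets):
    """Hash event times in [0, tau] into buckets, then emit them in order.

    Placing an event costs O(1); with about one event per bucket on
    average, retrieving the next event is also O(1) on average.
    """
    buckets = [[] for _ in range(n_buckets)]
    for t in event_times:
        idx = min(int(t / tau * n_buckets), n_buckets - 1)  # O(1) hash
        buckets[idx].append(t)
    ordered = []
    for bucket in buckets:
        ordered.extend(sorted(bucket))  # each bucket is tiny on average
    return ordered

rng = random.Random(7)
times = [rng.uniform(0.0, 1.0) for _ in range(100)]
ordered = bucket_sort_events(times, tau=1.0, n_buckets=100)
```

    Because the cost of placing and retrieving an event does not grow with the number of clocks, the overall cost per event stays (conditionally) constant.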

  10. Reservoir rehabilitations: Seeking the Fountain of Youth

    USGS Publications Warehouse

    Pegg, Mark A.; Pope, Kevin L.; Powell, L.A.; Turek, Kelly C.; Spurgeon, Jonathan J.; Stewart, Nathaniel T.; Hogberg, Nick P.; Porath, Mark T.

    2017-01-01

    Aging of reservoirs alters the functions, and associated services, of these systems through time. The goal of habitat rehabilitation is often to alter the trajectory of the aging process such that the duration of the desired state is prolonged. There are two important characteristics in alteration of the trajectory—the amplitude relative to current state and the subsequent rate of change, or aging—that ultimately determine the duration of extension for the desired state. Rehabilitation processes largely fall into three main categories: fish community manipulation, water quality manipulation, and physical habitat manipulation. We can slow aging of reservoirs through carefully implemented management actions, perhaps even turning back the hands of time, but we cannot stop aging. We call for new, innovative perspectives that incorporate an understanding of aging processes in all steps of rehabilitation of reservoirs, especially in planning and assessing.

  11. Discrimination of dynamical system models for biological and chemical processes.

    PubMed

    Lorenz, Sönke; Diederichs, Elmar; Telgmann, Regina; Schütte, Christof

    2007-06-01

    In technical chemistry, systems biology and biotechnology, the construction of predictive models has become an essential step in process design and product optimization. Accurate modelling of the reactions requires detailed knowledge about the processes involved. However, when developing new products and production techniques, for example, this knowledge is often not available due to the lack of experimental data. Thus, when one has to work with a selection of proposed models, one of the main tasks of early development is to discriminate between these models. In this article, a new statistical approach to model discrimination is described that ranks models with respect to the probability with which they reproduce the given data. The article introduces the new approach, discusses its statistical background, presents numerical techniques for its implementation and illustrates the application to examples from biokinetics.
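    The abstract does not reproduce the ranking criterion itself; as a stand-in, the sketch below scores candidate models by the Gaussian log-likelihood of their residuals and normalises the scores to probabilities under a uniform prior. The model functions and noise level σ are assumptions for illustration, and the paper's actual criterion may differ.

```python
import math

# Hypothetical likelihood-based model ranking: each candidate predicts the
# data, models are scored by the Gaussian log-likelihood of the residuals,
# and scores are normalised to probabilities (uniform prior assumed).
def rank_models(models, t, y, sigma=1.0):
    lls = []
    for name, f in models:
        ll = sum(-0.5 * ((yi - f(ti)) / sigma) ** 2 for ti, yi in zip(t, y))
        lls.append((ll, name))
    m = max(ll for ll, _ in lls)                       # shift for numerical stability
    weights = [(math.exp(ll - m), name) for ll, name in lls]
    z = sum(w for w, _ in weights)
    return sorted(((w / z, name) for w, name in weights), reverse=True)
```

    Given data generated by a line, the linear candidate receives essentially all of the probability mass, which is the discrimination behaviour the method aims for.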

  12. The Aluminum Smelting Process and Innovative Alternative Technologies

    PubMed Central

    Drabløs, Per Arne

    2014-01-01

    Objective: The industrial aluminum production process is addressed. The purpose is to give a short but comprehensive description of the electrolysis cell technology, the raw materials used, and the health and safety relevance of the process. Methods: This article is based on a study of the extensive chemical and medical literature on primary aluminum production. Results: At present, there are two main technological challenges for the process—to reduce energy consumption and to mitigate greenhouse gas emissions. A future step may be carbon dioxide gas capture and sequestration related to the electric power generation from fossil sources. Conclusions: Workers' health and safety have now become an integrated part of the aluminum business. Work-related injuries and illnesses are preventable, and the ultimate goal to eliminate accidents with lost-time injuries may hopefully be approached in the future. PMID:24806723

  13. fMRI paradigm designing and post-processing tools

    PubMed Central

    James, Jija S; Rajesh, PG; Chandran, Anuvitha VS; Kesavadas, Chandrasekharan

    2014-01-01

    In this article, we first review some aspects of functional magnetic resonance imaging (fMRI) paradigm designing for major cognitive functions by using stimulus delivery systems like Cogent, E-Prime, Presentation, etc., along with their technical aspects. We also review the stimulus presentation possibilities (block, event-related) for visual or auditory paradigms and their advantages in both clinical and research settings. The second part mainly focuses on various fMRI data post-processing tools such as Statistical Parametric Mapping (SPM) and Brain Voyager, and discusses the particulars of the various preprocessing steps involved (realignment, co-registration, normalization, smoothing) in these software packages, as well as the statistical analysis principles of General Linear Modeling for the final interpretation of a functional activation result. PMID:24851001

  14. Two-dimensional assembly structure of graphene and TiO2 nanosheets from titanic acid with enhanced visible-light photocatalytic performance

    NASA Astrophysics Data System (ADS)

    Hao, Rong; Guo, Shien; Wang, Xiuwen; Feng, Tong; Feng, Qingmao; Li, Mingxia; Jiang, Baojiang

    2016-06-01

    Titanic acid sheets were prepared by a one-step hydrazine hydrate-assisted hydrothermal process. Reduced graphite oxide (rGO)@TiO2 nanosheet composites were then obtained through ultrasonic exfoliation followed by a calcination treatment. The rGO@TiO2 nanosheet composites show excellent hydrogen production performance under an AM1.5 light source. The highest hydrogen evolution yield (923.23 μmol) is nearly two times higher than that of pure TiO2, mainly due to the special electronic structure and the greater number of active sites of the TiO2 nanosheets. The introduction of graphene improves the stability of the TiO2 nanosheets and extends the visible-light absorption range.

  15. Research of flaw image collecting and processing technology based on multi-baseline stereo imaging

    NASA Astrophysics Data System (ADS)

    Yao, Yong; Zhao, Jiguang; Pang, Xiaoyan

    2008-03-01

    Aiming at the practical demands of gun bore flaw image collection, such as accurate design, complex algorithms and precise technical requirements, the design framework of a 3-D image collecting and processing system based on multi-baseline stereo imaging is presented in this paper. The system mainly comprises a computer, an electrical control box, a stepping motor and a CCD camera, and it implements image collection, stereo matching, 3-D information reconstruction and post-processing. Theoretical analysis and experimental results show that the images collected by this system are precise and that the multi-baseline approach efficiently resolves the matching ambiguity produced by uniform or repeated textures. At the same time, the system achieves higher measurement speed and precision.
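    For context, depth recovery in any stereo rig reduces to triangulation from disparity via the standard relation Z = f·B/d; a multi-baseline system evaluates it per baseline and keeps the depth consistent across baselines, which is what suppresses repeated-texture mismatches. The focal length, baseline and disparity values below are purely illustrative.

```python
# Standard stereo triangulation: depth Z = f * B / d, where f is the focal
# length in pixels, B the baseline in metres and d the disparity in pixels.
# A multi-baseline rig applies this for each baseline and cross-checks the
# candidate depths to reject ambiguous matches on repeated textures.
def depth_from_disparity(f_px, baseline_m, disparity_px):
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return f_px * baseline_m / disparity_px
```

    For f = 1000 px and B = 0.1 m, a disparity of 50 px corresponds to a depth of 2 m; doubling the baseline doubles the depth for the same disparity.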

  16. Fe(III)-solar light induced degradation of diethyl phthalate (DEP) in aqueous solutions.

    PubMed

    Mailhot, G; Sarakha, M; Lavedrine, B; Cáceres, J; Malato, S

    2002-11-01

    The degradation of diethyl phthalate (DEP) photoinduced by Fe(III) in aqueous solutions has been investigated under solar irradiation in the compound parabolic collector reactor at the Plataforma Solar de Almeria. Hydroxyl radicals (*OH), responsible for the degradation, are formed via an intramolecular photoredox process in the excited state of Fe(III) aquacomplexes. The primary step of the reaction is mainly the attack of *OH radicals on the aromatic ring. Upon prolonged irradiation, DEP and its photoproducts are completely mineralized owing to the regeneration of the absorbing species and the continuous formation of *OH radicals, which confers a catalytic aspect on the process. Consequently, degradation photoinduced by Fe(III) could be an efficient method for DEP removal from water.

  17. Artistic creativity, style and brain disorders.

    PubMed

    Bogousslavsky, Julien

    2005-01-01

    The production of novel, motivated or useful material defines creativity, which appears to be one of the higher, specifically human brain functions. While creativity can express itself in virtually any domain, art may illustrate particularly well how creativity is modulated by the normal or pathological brain. Evidence emphasizes global brain functioning in artistic creativity and output, but critical steps that link perceptual processing to the execution of a work, such as extraction-abstraction, as well as the development of non-esthetic values attached to art, also point to complex activation and inhibition processes mainly localized in the frontal lobe. Neurological diseases in artists provide a unique opportunity to study brain-creativity relationships, in particular through the stylistic changes that may develop after a brain lesion. (c) 2005 S. Karger AG, Basel

  18. Synthesis of nonionic-anionic colloidal systems based on alkaline and ammonium β-nonylphenol polyethyleneoxy (n = 3-20) propionates/dodecylbenzenesulfonates with prospects for food hygiene

    PubMed Central

    2012-01-01

    Background: The main objective of this work was to obtain a binary system of surface-active components (a nonionic soap and an alkaline and/or ammonium dodecylbenzenesulfonate) with potential applications in food hygiene, by following a scheme of classical reactions (cyanoethylation, total acid hydrolysis and stoichiometric neutralization with inorganic alkaline and/or organic ammonium bases) adapted to heterogeneously polyethoxylated nonylphenols (n = 3-20). In this processing scheme, dodecylbenzenesulfonic acid, initially the acid catalyst for the exhaustive hydrolysis of β-nonylphenolpolyethyleneoxy (n = 3-20) propionitriles, becomes, together with the nonionic soap formed, the second surface-active component of the binary system. Results: Within the adopted reaction scheme, the influence of the main operating parameters (duration, temperature, molar ratio of reagents) and structural parameters (degree of oligomerization of the polyoxyethylene chain) on the yields of the synthetic steps was followed. The favorable role of the polyoxyethylene chain size on the cyanoethylation yields, through its specific conformation and its ability to sequester alkaline cations, is noteworthy, as is the beneficial influence of phase-transfer catalysts in the total acid hydrolysis step. The chemical stability of dodecylbenzenesulfonic acid (DBSH) at the temperature and strongly acidic pH of the reaction environment is confirmed. The controlled adjustment of the amount of DBSH in the final binary system will later confer on it potential colloidal properties for food hygiene recipes. Conclusions: The preliminary synthetic tests performed confirmed the prospect of obtaining a broad range of useful colloidal properties for various food hygiene scenarios. PMID:22958389

  19. A Novel Multi-Sensor Environmental Perception Method Using Low-Rank Representation and a Particle Filter for Vehicle Reversing Safety

    PubMed Central

    Zhang, Zutao; Li, Yanjun; Wang, Fubing; Meng, Guanjun; Salman, Waleed; Saleem, Layth; Zhang, Xiaoliang; Wang, Chunbai; Hu, Guangdi; Liu, Yugang

    2016-01-01

    Environmental perception and information processing are two key steps of active safety for vehicle reversing. Single-sensor environmental perception cannot meet the needs of vehicle reversing safety due to its low reliability. In this paper, we present a novel multi-sensor environmental perception method using low-rank representation and a particle filter for vehicle reversing safety. The proposed system consists of four main modules, namely multi-sensor environmental perception, information fusion, target recognition and tracking using low-rank representation and a particle filter, and vehicle reversing speed control. First, the multi-sensor environmental perception module, based on a binocular-camera system and ultrasonic range finders, obtains the distance data for obstacles behind the vehicle while the vehicle is reversing. Second, an information fusion algorithm using an adaptive Kalman filter processes the data obtained by the multi-sensor environmental perception module, which greatly improves the robustness of the sensors. The framework of a particle filter and low-rank representation is then used to track the main obstacles; the low-rank representation optimizes an objective particle template that has the smallest L1 norm. Finally, the electronic throttle opening and automatic braking are controlled by the proposed vehicle reversing control strategy before any potential collision, making reversing control safer and more reliable. The final system simulation and practical testing results demonstrate the validity of the proposed multi-sensor environmental perception method using low-rank representation and a particle filter for vehicle reversing safety. PMID:27294931

  20. A Novel Multi-Sensor Environmental Perception Method Using Low-Rank Representation and a Particle Filter for Vehicle Reversing Safety.

    PubMed

    Zhang, Zutao; Li, Yanjun; Wang, Fubing; Meng, Guanjun; Salman, Waleed; Saleem, Layth; Zhang, Xiaoliang; Wang, Chunbai; Hu, Guangdi; Liu, Yugang

    2016-06-09

    Environmental perception and information processing are two key steps of active safety for vehicle reversing. Single-sensor environmental perception cannot meet the needs of vehicle reversing safety due to its low reliability. In this paper, we present a novel multi-sensor environmental perception method using low-rank representation and a particle filter for vehicle reversing safety. The proposed system consists of four main modules, namely multi-sensor environmental perception, information fusion, target recognition and tracking using low-rank representation and a particle filter, and vehicle reversing speed control. First, the multi-sensor environmental perception module, based on a binocular-camera system and ultrasonic range finders, obtains the distance data for obstacles behind the vehicle while the vehicle is reversing. Second, an information fusion algorithm using an adaptive Kalman filter processes the data obtained by the multi-sensor environmental perception module, which greatly improves the robustness of the sensors. The framework of a particle filter and low-rank representation is then used to track the main obstacles; the low-rank representation optimizes an objective particle template that has the smallest L1 norm. Finally, the electronic throttle opening and automatic braking are controlled by the proposed vehicle reversing control strategy before any potential collision, making reversing control safer and more reliable. The final system simulation and practical testing results demonstrate the validity of the proposed multi-sensor environmental perception method using low-rank representation and a particle filter for vehicle reversing safety.
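    The fusion step above can be illustrated with a deliberately simplified 1-D Kalman filter that sequentially folds in a range reading from each sensor. The process and measurement noise values here are assumptions; the adaptive Kalman filter described in the paper would estimate such quantities online.

```python
# Minimal 1-D Kalman-style fusion of two range sensors (e.g. an ultrasonic
# finder and a camera-derived distance) under a random-walk motion model.
# The noise variances q and r are illustrative, not values from the paper.
def fuse_ranges(readings, q=0.01, r=(0.25, 0.09)):
    x = readings[0][0]          # initialise state from the first reading
    p = 1.0                     # initial state covariance
    estimates = []
    for z in readings:          # z = (ultrasonic_m, camera_m)
        p += q                  # predict: obstacle may drift between frames
        for zi, ri in zip(z, r):
            k = p / (p + ri)    # Kalman gain for this sensor
            x += k * (zi - x)   # correct the estimate toward the reading
            p *= (1.0 - k)      # shrink the covariance
        estimates.append(x)
    return estimates
```

    With both sensors reporting a steady 2 m, the estimate stays pinned at 2 m while the covariance shrinks; with disagreeing sensors, the output is a variance-weighted blend of the two readings.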

  1. Comparison of three different wastewater sludge and their respective drying processes: Solar, thermal and reed beds - Impact on organic matter characteristics.

    PubMed

    Collard, Marie; Teychené, Benoit; Lemée, Laurent

    2017-12-01

    Drying aims at minimising the volume of wastewater sludge (WWS) before disposal; however, it can affect sludge characteristics. Owing to its high content of organic matter (OM) and lipids, sludge is mainly valorised by land farming but can also be considered as a feedstock for biodiesel production. As sludge composition is a major parameter in the choice of disposal techniques, the objective of this study was to determine the influence of the drying process. To this end, three sludges obtained from solar, reed-bed and thermal drying processes were investigated at the global and molecular scales. Before the drying step, the sludges presented similar physico-chemical characteristics (OM content, elemental analysis, pH, infrared spectra) and lipid contents. A strong influence of the drying process on lipid and humic-like substance contents was observed through OM fractionation. Thermochemolysis-GCMS of raw sludge and lipids revealed a similar molecular content, mainly constituted of steroids and fatty acids. Molecular changes were noticeable for thermal drying through differences in the branched-to-linear fatty acid ratio. Finally, thermal drying induced a weakening of the OM, whereas solar drying led to its complexification. These findings show that mild drying processes such as solar or reed-bed drying are preferable for amendment production, whereas the thermal process yields pellets with a high lipid content that could be considered for fuel production. Copyright © 2016 Elsevier Ltd. All rights reserved.

  2. Next Step for Main Street Credit Availability Act of 2009

    THOMAS, 111th Congress

    Sen. Snowe, Olympia J. [R-ME

    2009-08-06

    Senate - 08/06/2009 Read twice and referred to the Committee on Small Business and Entrepreneurship. Status: Introduced.

  3. The “Health Coaching” programme: a new patient-centred and visually supported approach for health behaviour change in primary care

    PubMed Central

    2013-01-01

    Background: Health related behaviour is an important determinant of chronic disease, with a high impact on public health. Motivating and assisting people to change their unfavourable health behaviour is thus a major challenge for health professionals. The objective of the study was to develop a structured programme of counselling in primary care practice, and to test its feasibility and acceptance among general practitioners (GPs) and their patients. Methods: Our new concept integrates change of roles, shared responsibility, patient-centredness, and modern communication techniques, such as motivational interviewing. A new colour-coded visual communication tool is used for the purpose of leading through the 4-step counselling process. As doctors' communication skills are crucial, communication training is a mandatory part of the programme. We tested the feasibility and acceptance of the "Health Coaching" programme with 20 GPs and 1045 patients, using questionnaires and semistructured interviewing techniques. The main outcomes were participation rates; the duration of counselling; patients' self-rated behavioural change in their areas of choice; and ratings of motivational, conceptual, acceptance, and feasibility issues. Results: In total, 37% (n=350) of the patients enrolled in step 1 completed the entire 4-step counselling process, with each step taking 8–22 minutes. 50% of ratings (n=303) improved by one or two categories in the three-colour circle, and the proportion of favourable health behaviour ratings increased from 9% to 39%. The ratings for motivation, concept, acceptance, and feasibility of the "Health Coaching" programme were consistently high. Conclusions: Our innovative, patient-centred counselling programme for health behaviour change was well accepted and feasible among patients and physicians in a primary care setting. Randomised controlled studies will have to establish cost-effectiveness and promote dissemination. PMID:23865509

  4. Description of the Baudet Surgical Technique and Introduction of a Systematic Method for Training Surgeons to Perform Male-to-Female Sex Reassignment Surgery.

    PubMed

    Leclère, Franck Marie; Casoli, Vincent; Baudet, Jacques; Weigert, Romain

    2015-12-01

    Male-to-female sex reassignment surgery involves three main procedures, namely, clitoroplasty, new urethral meatoplasty and vaginopoiesis. Herein we describe the key steps of our surgical technique. Male-to-female sex reassignment surgery includes the following 14 key steps which are documented in this article: (1) patient installation and draping, (2) urethral catheter placement, (3) scrotal incision and vaginal cavity formation, (4) bilateral orchidectomy, (5) penile skin inversion, (6) dismembering of the urethra from the corpora, (7) neoclitoris formation, (8) neoclitoris refinement, (9) neovaginalphallic cylinder formation, (10) fixation of the neoclitoris, (11) neovaginalphallic cylinder insertion, (12) contouring of the labia majora and positioning the neoclitoris and urethra, (13) tie-over dressing and (14) compression dressing. The size and position of the neoclitoris, position of the urethra, adequacy of the neovaginal cavity, position and tension on the triangular flap, size of the neo labia minora, size of the labia majora, symmetry and ease of intromission are important factors when considering the immediate results of the surgery. We present our learning process of graduated responsibility for optimisation of these results. We describe our postoperative care and the possible complications. Herein, we have described the 14 steps of the Baudet technique for male-to-female sex reassignment surgery which include clitoroplasty, new urethral meatoplasty and vaginopoiesis. The review of each key stage of the procedure represents the first step of our global teaching process. This journal requires that authors assign a level of evidence to each article. For a full description of these Evidence-Based Medicine ratings, please refer to the Table of Contents or the online Instructions to Authors www.springer.com/00266.

  5. The "Health Coaching" programme: a new patient-centred and visually supported approach for health behaviour change in primary care.

    PubMed

    Neuner-Jehle, Stefan; Schmid, Margareta; Grüninger, Ueli

    2013-07-17

    Health related behaviour is an important determinant of chronic disease, with a high impact on public health. Motivating and assisting people to change their unfavourable health behaviour is thus a major challenge for health professionals. The objective of the study was to develop a structured programme of counselling in primary care practice, and to test its feasibility and acceptance among general practitioners (GPs) and their patients. Our new concept integrates change of roles, shared responsibility, patient-centredness, and modern communication techniques-such as motivational interviewing. A new colour-coded visual communication tool is used for the purpose of leading through the 4-step counselling process. As doctors' communication skills are crucial, communication training is a mandatory part of the programme. We tested the feasibility and acceptance of the "Health Coaching" programme with 20 GPs and 1045 patients, using questionnaires and semistructured interviewing techniques. The main outcomes were participation rates; the duration of counselling; patients' self-rated behavioural change in their areas of choice; and ratings of motivational, conceptual, acceptance, and feasibility issues. In total, 37% (n=350) of the patients enrolled in step 1 completed the entire 4-Step counselling process, with each step taking 8-22 minutes. 50% of ratings (n=303) improved by one or two categories in the three-colour circle, and the proportion of favourable health behaviour ratings increased from 9% to 39%. The ratings for motivation, concept, acceptance, and feasibility of the "Health Coaching" programme were consistently high. Our innovative, patient-centred counselling programme for health behaviour change was well accepted and feasible among patients and physicians in a primary care setting. Randomised controlled studies will have to establish cost-effectiveness and promote dissemination.

  6. Education and research in medical optronics in France

    NASA Astrophysics Data System (ADS)

    Demongeot, Jacques; Fleute, M.; Herve, T.; Lavallee, Stephane

    2000-06-01

    We first present the main post-graduate courses proposed in France, for both physicians and engineers, in medical optronics. We then explain which medical domains are concerned by this teaching, essentially computer-assisted surgery, telemedicine and functional exploration. Next, we show the main research axes in these fields, in which new jobs have to be invented and new educational approaches have to be prepared in order to satisfy the demand coming both from hospitals (mainly referral hospitals) and from industry (essentially medical imaging and instrumentation companies). Finally, we conclude that medical optronics is an important step in an entire chain of acquisition and processing of medical data, capable of creating the medical knowledge a surgeon or a physician needs for diagnosis or therapy. Optimizing the teaching of medical optronics requires complete integration, from acquiring to modeling the medical reality. This tendency to give a holistic education in medical imaging and instrumentation is called 'Model-driven Acquisition' learning.

  7. Application of Quality by Design to the characterization of the cell culture process of an Fc-Fusion protein.

    PubMed

    Rouiller, Yolande; Solacroup, Thomas; Deparis, Véronique; Barbafieri, Marco; Gleixner, Ralf; Broly, Hervé; Eon-Duval, Alex

    2012-06-01

    The production bioreactor step of an Fc-Fusion protein manufacturing cell culture process was characterized following Quality by Design principles. Using scientific knowledge derived from the literature and process knowledge gathered during development studies and manufacturing to support clinical trials, potential critical and key process parameters with a possible impact on product quality and process performance, respectively, were determined during a risk assessment exercise. The identified process parameters were evaluated using a design of experiment approach. The regression models generated from the data allowed characterizing the impact of the identified process parameters on quality attributes. The main parameters having an impact on product titer were pH and dissolved oxygen, while those having the highest impact on process- and product-related impurities and variants were pH and culture duration. The models derived from characterization studies were used to define the cell culture process design space. The design space limits were set in such a way as to ensure that the drug substance material would consistently have the desired quality. Copyright © 2012 Elsevier B.V. All rights reserved.

  8. Study of the aroma formation and transformation during the manufacturing process of oolong tea by solid-phase micro-extraction and gas chromatography-mass spectrometry combined with chemometrics.

    PubMed

    Ma, Chengying; Li, Junxing; Chen, Wei; Wang, Wenwen; Qi, Dandan; Pang, Shi; Miao, Aiqing

    2018-06-01

    Oolong tea is a typical semi-fermented tea, famous for its unique aroma. The aim of this study was to compare the volatile compounds across the manufacturing process to reveal how the aroma forms. A method based on head-space solid-phase microextraction/gas chromatography-mass spectrometry (HS-SPME/GC-MS) combined with chemometrics was developed to assess volatile profiles during manufacturing (fresh leaves, sun-withered leaves, rocked leaves and leaves after de-enzyming). A total of 24 aroma compounds showing significant differences during processing were identified. Based on these compounds, principal component analysis and hierarchical cluster analysis clearly distinguished the four samples from each other, suggesting that the 24 identified volatile compounds can represent the changes in volatiles across the four steps. Additionally, sun-withering, rocking and de-enzyming influence the variations of volatile compounds to different degrees; the changes in the withering step were smaller than in the other two steps, indicating that the characteristic volatile compounds of oolong tea may be formed mainly in the rocking stage through biological reactions and in the de-enzyming stage through thermal chemical transformations, rather than in the withering stage. This study suggests that HS-SPME/GC-MS combined with chemometric methods is accurate, sensitive, fast and ideal for rapid routine analysis of changes in aroma compounds in oolong tea during manufacturing. Copyright © 2018 Elsevier Ltd. All rights reserved.

  9. Fate and Prediction of Phenolic Secoiridoid Compounds throughout the Different Stages of the Virgin Olive Oil Making Process.

    PubMed

    Fregapane, Giuseppe; Salvador, M Desamparados

    2017-08-03

    The evolution of the main phenolic secoiridoid compounds throughout the different stages of the virgin olive oil making process (crushing, malaxation and liquid-solid separation) is studied here, with the goal of making it possible to predict the partition and transformation that take place in the different steps of the process. The concentrations of hydroxytyrosol secoiridoids produced under the different crushing conditions studied are reasonably proportional to the intensity of the milling stage, and depend strongly on the olive variety processed. During malaxation, the content of the main phenolic secoiridoids is reduced, especially in the case of the hydroxytyrosol derivatives, for which a variety-dependent behaviour is observed. The prediction of the concentration of phenolic secoiridoids finally transferred from the kneaded paste to the virgin olive oil is also feasible, and depends on the phenolic content and the amount of water in the olive paste. The determination of the phenolic compounds in the olive fruit, olive paste and olive oil was carried out by LC-MS (liquid chromatography-mass spectrometry). This improved knowledge could help in choosing more adequate processing conditions for the production of virgin olive oil with desired properties, for example a higher or lower phenolic content, as the amount of these minor components is directly related to the oil's sensory, antioxidant and health properties.

  10. Advanced Vacuum Plasma Spray (VPS) for a Robust, Longlife and Safe Space Shuttle Main Engine (SSME)

    NASA Technical Reports Server (NTRS)

    Holmes, Richard R.; Elam, Sandra K.; McKechnie, Timothy N.; Power, Christopher A.

    2010-01-01

    In 1984, the Vacuum Plasma Spray Lab was built at NASA/Marshall Space Flight Center for applying durable, protective coatings to turbine blades for the space shuttle main engine (SSME) high pressure fuel turbopump. Existing turbine blades were cracking and breaking off after five hot fire tests, while VPS coated turbine blades showed no wear or cracking after 40 hot fire tests. Following that, a major manufacturing problem of copper coatings peeling off the SSME Titanium Main Fuel Valve Housing was corrected with a tenacious VPS copper coating. A patented VPS process utilizing Functional Gradient Material (FGM) application was developed to build ceramic lined metallic cartridges for space furnace experiments, safely containing gallium arsenide at 1260 degrees centigrade. The VPS/FGM process was then translated to build robust, long life, liquid rocket combustion chambers for the space shuttle main engine. A 5K (5,000 lb thrust) thruster with the VPS/FGM protective coating remained in pristine condition through 220 hot firing tests with no wear, whereas the SSME showed blanching (surface pulverization) and cooling channel cracks in fewer than 30 of the same hot firing tests, and after 35 of the tests its injector face plates disintegrated. The VPS/FGM process was then applied to spraying protective thermal barrier coatings on the face plates, which showed 50% cooler operating temperature and no wear after 50 hot fire tests. Cooling channels were closed out in two weeks, compared to one year for the SSME. To work up the TRL (Technology Readiness Level) and establish the VPS/FGM process as viable technology, a 40K thruster was built and is currently being tested. The proposed final step in establishing the VPS/FGM process TRL for space flight is to build a J-2X size liquid rocket engine.

  11. Advanced metrology by offline SEM data processing

    NASA Astrophysics Data System (ADS)

    Lakcher, Amine; Schneider, Loïc.; Le-Gratiet, Bertrand; Ducoté, Julien; Farys, Vincent; Besacier, Maxime

    2017-06-01

    Today's technology nodes contain increasingly complex designs, bringing growing challenges to chip manufacturing process steps. Efficient metrology is necessary to assess the process variability of these complex patterns and thus extract relevant data to generate process-aware design rules and improve OPC models. Today, process variability is mostly addressed through the analysis of in-line monitoring features, which are often designed to support robust measurements and as a consequence are not always very representative of critical design rules. CD-SEM is the main CD metrology technique used in the chip manufacturing process, but it is challenged when it comes to measuring metrics like tip-to-tip, tip-to-line, areas or necking in high quantity and with robustness. CD-SEM images contain a lot of information that is not always used in metrology. Suppliers have provided tools that allow engineers to extract the SEM contours of their features and convert them into a GDS. Contours can be seen as the signature of a shape, as they contain all the dimensional data. The methodology is thus to use the CD-SEM to take high-quality images, generate SEM contours from them, and build a database of contours. These contours feed an offline metrology tool that processes them to extract different metrics. It was shown in two previous papers that it is possible to perform complex measurements on hotspots at different process steps (lithography, etch, copper CMP) by using SEM contours with an in-house offline metrology tool. In the current paper, the methodology presented previously is extended to improve its robustness and combined with the use of phylogeny to classify the SEM images according to their geometrical proximity.
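    As a toy illustration of an offline contour metric, the snippet below computes the minimum gap between two contours given as point lists, a brute-force stand-in for a tip-to-tip style measurement. Real tools operate on GDS polygons and edges rather than raw point clouds, and the point data here is hypothetical.

```python
import math

# Brute-force minimum gap between two extracted contours, each a list of
# (x, y) points in nanometres. This only sketches the idea of computing
# tip-to-tip-like metrics offline from SEM contours; production tools use
# polygon geometry on GDS data.
def min_gap(contour_a, contour_b):
    return min(math.dist(p, q) for p in contour_a for q in contour_b)
```

    Two line-end contours 30 nm apart yield a 30 nm tip-to-tip value regardless of how densely each contour is sampled.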

  12. Pasteurization of shell eggs using radio frequency heating

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Geveke, David J.; Bigley, Andrew B. W.; Brunkhorst, Christopher D.

    The USDA-FSIS estimates that pasteurization of all shell eggs in the U.S. would reduce the annual number of illnesses by more than 110,000. However, less than 3% of shell eggs are commercially pasteurized. One of the main reasons for this is that the commercial hot water process requires as much as 60 min to complete. In the present study, a radio frequency (RF) apparatus was constructed, and a two-step process using RF energy and hot water was developed to pasteurize eggs in less than half that time. In order to select an appropriate RF generator, the impedance of shell eggs was measured in the frequency range of 10–70 MHz. The power density within the egg was modeled to prevent potential hotspots. Escherichia coli (ATCC 35218) was inoculated in the yolk to approximately 7.5 log CFU/ml. The combination process first heated the egg in 35.0 °C water for 3.5 min using 60 MHz RF energy. This resulted in the yolk being preferentially heated to 61 °C. Then, the egg was heated for an additional 20 min with 56.7 °C water. This two-step process reduced the population of E. coli by 6.5 log. The total time for the process was 23.5 min. By contrast, processing for 60 min was required to reduce the E. coli by 6.6 log using just hot water. The novel RF pasteurization process presented in this study was considerably faster than the existing commercial process. As a result, this should lead to an increase in the percentage of eggs being pasteurized, as well as a reduction of foodborne illnesses.
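
    The time saving claimed in the abstract can be checked with the quoted figures alone:

```python
# Figures quoted in the abstract.
rf_step_min = 3.5          # RF preheat in 35.0 C water
hot_water_step_min = 20.0  # finishing step in 56.7 C water
combined_total = rf_step_min + hot_water_step_min

conventional_total = 60.0  # hot-water-only process, min
speedup = conventional_total / combined_total
print(combined_total, round(speedup, 2))  # 23.5 min, ~2.55x faster
```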

  13. Pasteurization of shell eggs using radio frequency heating

    DOE PAGES

    Geveke, David J.; Bigley, Andrew B. W.; Brunkhorst, Christopher D.

    2016-08-21

    The USDA-FSIS estimates that pasteurization of all shell eggs in the U.S. would reduce the annual number of illnesses by more than 110,000. However, less than 3% of shell eggs are commercially pasteurized. One of the main reasons for this is that the commercial hot water process requires as much as 60 min to complete. In the present study, a radio frequency (RF) apparatus was constructed, and a two-step process using RF energy and hot water was developed to pasteurize eggs in less than half that time. In order to select an appropriate RF generator, the impedance of shell eggs was measured in the frequency range of 10–70 MHz. The power density within the egg was modeled to prevent potential hotspots. Escherichia coli (ATCC 35218) was inoculated in the yolk to approximately 7.5 log CFU/ml. The combination process first heated the egg in 35.0 °C water for 3.5 min using 60 MHz RF energy. This resulted in the yolk being preferentially heated to 61 °C. Then, the egg was heated for an additional 20 min with 56.7 °C water. This two-step process reduced the population of E. coli by 6.5 log. The total time for the process was 23.5 min. By contrast, processing for 60 min was required to reduce the E. coli by 6.6 log using just hot water. The novel RF pasteurization process presented in this study was considerably faster than the existing commercial process. As a result, this should lead to an increase in the percentage of eggs being pasteurized, as well as a reduction of foodborne illnesses.

  14. A two-step crushed lava rock filter unit for grey water treatment at household level in an urban slum.

    PubMed

    Katukiza, A Y; Ronteltap, M; Niwagaba, C B; Kansiime, F; Lens, P N L

    2014-01-15

    Decentralised grey water treatment in urban slums using low-cost and robust technologies offers opportunities to minimise public health risks and to reduce environmental pollution caused by the highly polluted grey water, i.e. with COD and N concentrations of 3000-6000 mg L⁻¹ and 30-40 mg L⁻¹, respectively. However, there has been very limited action research on reducing the pollution load from uncontrolled grey water discharge by households in urban slums. This study was therefore carried out to investigate the potential of a two-step filtration process using a crushed lava rock filter to reduce the grey water pollution load in an urban slum, to determine the main filter design and operation parameters, and to assess the effect of intermittent flow on the grey water effluent quality. A two-step crushed lava rock filter unit was designed and implemented for use by a household in the Bwaise III slum in Kampala city (Uganda). It was monitored at a varying hydraulic loading rate (HLR) of 0.5-1.1 m d⁻¹ as well as at a constant HLR of 0.39 m d⁻¹. The removal efficiencies of COD, TP and TKN were, respectively, 85.9%, 58% and 65.5% under the varying HLR, and 90.5%, 59.5% and 69% when operating at the constant HLR. In addition, the log removals of Escherichia coli, Salmonella spp. and total coliforms were, respectively, 3.8, 3.2 and 3.9 under the varying HLR and 3.9, 3.5 and 3.9 at the constant HLR. The results show that the use of a two-step filtration process as well as a lower constant HLR increased the pollutant removal efficiencies. Further research is needed to investigate the feasibility of adding a tertiary treatment step to increase the removal of nutrients and microorganisms from grey water. Copyright © 2013 Elsevier Ltd. All rights reserved.
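
    The reported removal efficiencies and log removals follow from standard definitions; a short sketch with hypothetical influent/effluent values (not the study's raw data):

```python
import math

def removal_pct(c_in, c_out):
    """Percent removal of a chemical parameter (e.g. COD, mg/L)."""
    return 100.0 * (c_in - c_out) / c_in

def log_removal(n_in, n_out):
    """Log10 removal of an indicator organism (e.g. E. coli counts)."""
    return math.log10(n_in / n_out)

# Hypothetical influent/effluent pairs, not the study's raw data.
print(removal_pct(4000.0, 400.0))  # 90.0 % COD removal
print(log_removal(1.0e7, 1.0e3))   # 4.0 log removal
```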

  15. Limits of acceptable change and natural resources planning: when is LAC useful, when is it not?

    Treesearch

    David N. Cole; Stephen F. McCool

    1997-01-01

    There are ways to improve the LAC process and its implementational procedures. One significant procedural modification is the addition of a new step. This step — which becomes the first step in the process — involves more explicitly defining goals and desired conditions. For other steps in the process, clarifications of concept and terminology are advanced, as are...

  16. Role of step stiffness and kinks in the relaxation of vicinal (001) with zigzag [110] steps

    NASA Astrophysics Data System (ADS)

    Mahjoub, B.; Hamouda, Ajmi BH.; Einstein, TL.

    2017-08-01

    We present a kinetic Monte Carlo study of the relaxation dynamics and steady state configurations of 〈110〉 steps on a vicinal (001) simple cubic surface. This system is interesting because 〈110〉 (fully kinked) steps have different elementary excitation energetics and favor step diffusion more than 〈100〉 (nominally straight) steps. In this study we show how this leads to different relaxation dynamics as well as to different steady state configurations, including that 2-bond breaking processes are rate determining for 〈110〉 steps, in contrast to the 3-bond breaking processes for 〈100〉 steps found in previous work [Surface Sci. 602, 3569 (2008)]. The analysis of the terrace-width distribution (TWD) shows a significant role of kink generation-annihilation processes during the relaxation of steps: the kinetics of relaxation toward the steady state are much faster in the case of 〈110〉 zigzag steps, with a higher standard deviation of the TWD, in agreement with the decrease of step stiffness for this orientation. We conclude that smaller step stiffness leads inexorably to faster step dynamics toward the steady state. Step-edge anisotropy slows the relaxation of steps and increases the strength of the effective step-step interactions.
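
    The bond-counting energetics described above fit naturally into a rejection-free (BKL/Gillespie-type) kinetic Monte Carlo scheme, in which an event is selected with probability proportional to its Arrhenius rate. The sketch below uses illustrative bond energies and temperature, not the paper's parameters:

```python
import math
import random

def arrhenius(n_bonds, e_bond=0.3, kT=0.05, nu=1.0):
    """Rate for an event that must break n_bonds bonds (illustrative eV values)."""
    return nu * math.exp(-n_bonds * e_bond / kT)

def pick_event(rates, rng):
    """Rejection-free KMC: choose event i with probability rates[i] / sum(rates)."""
    x = rng.random() * sum(rates)
    for i, r in enumerate(rates):
        x -= r
        if x <= 0.0:
            return i
    return len(rates) - 1

rng = random.Random(0)
rates = [arrhenius(n) for n in (1, 2, 3)]  # 1-, 2- and 3-bond breaking events
counts = [0, 0, 0]
for _ in range(10_000):
    counts[pick_event(rates, rng)] += 1
print(counts)  # 1-bond events dominate; 3-bond events are essentially frozen out
```

    With these parameters each extra broken bond costs a factor exp(-0.3/0.05) ≈ 2.5e-3 in rate, which is why the cheapest rate-determining excitation (2-bond for 〈110〉 vs 3-bond for 〈100〉) controls the relaxation speed.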

  17. Computation of turbulent flows over backward and forward-facing steps using a near-wall Reynolds stress model

    NASA Technical Reports Server (NTRS)

    Ko, Sung Ho

    1993-01-01

    Separation and reattachment of turbulent shear layers is observed in many important engineering applications, yet it remains poorly understood. This has motivated many studies on understanding and predicting the processes of separation and reattachment of turbulent shear layers. Both situations, in which separation is induced either by an adverse pressure gradient or by discontinuities of geometry, have attracted the attention of turbulence model developers. Formulating turbulence closure models that accurately describe the essential features of separated turbulent flows is still a formidable task. Computations of separated flows associated with sharp-edged bluff bodies are described. For the past two decades, the backward-facing step flow, the simplest separated flow, has been a popular test case for turbulence models. Detailed studies on the performance of many turbulence models for flows over steps, including two-equation turbulence models and Reynolds stress models, can be found in the papers by Thangam & Speziale and by Lasher & Taulbee. These studies indicate that almost all existing turbulence models fail to accurately predict many important features of backward-facing step flow, such as the reattachment length, the recovery rate of the redeveloping boundary layer downstream of the reattachment point, streamlines near the reattachment point, and the skin friction coefficient. The main objectives are to calculate flows over backward- and forward-facing steps using the near-wall Reynolds stress model (NRSM) and to make use of the newest DNS data for detailed comparison. This will give insights for possible improvements of the turbulence model.

  18. A Cell Programmable Assay (CPA) chip.

    PubMed

    Ju, Jongil; Warrick, Jay; Beebe, David J

    2010-08-21

    This article describes two kinds of "Cell Programmable Assay" (CPA) chips that utilize passive pumping for the culture and autonomous staining of cells to simplify common protocols. One is a single timer channel CPA (sCPA) chip that has one timer channel and one main channel containing a cell culture chamber. The sCPA is used to culture and stain cells using Hoechst nuclear staining dye (a 2-step staining process). The other is a dual timer channel CPA (dCPA) chip that has two timer channels and one main channel with a chamber for cell culture. The dCPA is used here to culture, fix, permeabilize, and stain cells using DAPI. The additional timer channel of the dCPA chip allows for automation of 3 steps. The CPA chips were successfully evaluated using HEK 293 cells. In addition, we provide a simplified equation for tuning or redesigning CPA chips to meet the needs of a variety of protocols that may require different timings. The equation is easy to use, as it depends only upon the dimensions of the microchannel and the volume of the reagent drops. The sCPA and dCPA chips can be readily modified to apply to a wide variety of common cell culture methods and procedures.
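
    Although the paper's own design equation is not reproduced here, the physics behind passive-pumping timers is standard: a small input drop has a higher Laplace pressure than the large output drop, and the drain time follows from the drop volume and the channel's hydraulic resistance. A rough order-of-magnitude sketch with hypothetical dimensions (textbook formulas, not the authors' equation):

```python
# Order-of-magnitude passive-pumping timing (made-up channel and drop sizes).
gamma = 0.072            # surface tension of water, N/m
mu = 1.0e-3              # viscosity of water, Pa*s

# Hypothetical microchannel: 500 um wide, 100 um tall, 10 mm long.
w, h, L = 500e-6, 100e-6, 10e-3
R = 12 * mu * L / (w * h ** 3)  # shallow-channel hydraulic resistance (approx.)

r_drop = 0.5e-3                 # input drop radius, m
dP = 2 * gamma / r_drop         # Laplace driving pressure, Pa
Q = dP / R                      # volumetric flow rate, m^3/s
V = 2.0e-9                      # 2 uL input drop, m^3
t_drain = V / Q
print(t_drain)                  # a couple of seconds for these dimensions
```

    Treating the driving pressure as constant over the drop's life is a simplification; real timer-channel design also accounts for the evolving drop radius.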

  19. Improving marine disease surveillance through sea temperature monitoring, outlooks and projections

    PubMed Central

    Maynard, Jeffrey; van Hooidonk, Ruben; Harvell, C. Drew; Eakin, C. Mark; Liu, Gang; Willis, Bette L.; Williams, Gareth J.; Dobson, Andrew; Heron, Scott F.; Glenn, Robert; Reardon, Kathleen; Shields, Jeffrey D.

    2016-01-01

    To forecast marine disease outbreaks as oceans warm requires new environmental surveillance tools. We describe an iterative process for developing these tools that combines research, development and deployment for suitable systems. The first step is to identify candidate host–pathogen systems. The 24 candidate systems we identified include sponges, corals, oysters, crustaceans, sea stars, fishes and sea grasses (among others). To illustrate the other steps, we present a case study of epizootic shell disease (ESD) in the American lobster. Increasing prevalence of ESD is a contributing factor to lobster fishery collapse in southern New England (SNE), raising concerns that disease prevalence will increase in the northern Gulf of Maine under climate change. The lowest maximum bottom temperature associated with ESD prevalence in SNE is 12°C. Our seasonal outlook for 2015 and long-term projections show bottom temperatures greater than or equal to 12°C may occur in this and coming years in the coastal bays of Maine. The tools presented will allow managers to target efforts to monitor the effects of ESD on fishery sustainability and will be iteratively refined. The approach and case example highlight that temperature-based surveillance tools can inform research, monitoring and management of emerging and continuing marine disease threats. PMID:26880840
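
    The 12°C criterion described above translates directly into a simple surveillance flag over a bottom-temperature series; a sketch with made-up monthly maxima:

```python
ESD_THRESHOLD_C = 12.0  # lowest max bottom temp linked to ESD prevalence in SNE

def flag_months(monthly_max_temps, threshold=ESD_THRESHOLD_C):
    """Return the months whose maximum bottom temperature reaches the threshold."""
    return [m for m, t in monthly_max_temps if t >= threshold]

# Hypothetical monthly maxima for a coastal Maine bay (degrees C).
series = [("Jun", 10.8), ("Jul", 11.9), ("Aug", 12.4), ("Sep", 12.1), ("Oct", 11.2)]
print(flag_months(series))  # ['Aug', 'Sep']
```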

  20. Improving marine disease surveillance through sea temperature monitoring, outlooks and projections.

    PubMed

    Maynard, Jeffrey; van Hooidonk, Ruben; Harvell, C Drew; Eakin, C Mark; Liu, Gang; Willis, Bette L; Williams, Gareth J; Groner, Maya L; Dobson, Andrew; Heron, Scott F; Glenn, Robert; Reardon, Kathleen; Shields, Jeffrey D

    2016-03-05

    To forecast marine disease outbreaks as oceans warm requires new environmental surveillance tools. We describe an iterative process for developing these tools that combines research, development and deployment for suitable systems. The first step is to identify candidate host-pathogen systems. The 24 candidate systems we identified include sponges, corals, oysters, crustaceans, sea stars, fishes and sea grasses (among others). To illustrate the other steps, we present a case study of epizootic shell disease (ESD) in the American lobster. Increasing prevalence of ESD is a contributing factor to lobster fishery collapse in southern New England (SNE), raising concerns that disease prevalence will increase in the northern Gulf of Maine under climate change. The lowest maximum bottom temperature associated with ESD prevalence in SNE is 12 °C. Our seasonal outlook for 2015 and long-term projections show bottom temperatures greater than or equal to 12 °C may occur in this and coming years in the coastal bays of Maine. The tools presented will allow managers to target efforts to monitor the effects of ESD on fishery sustainability and will be iteratively refined. The approach and case example highlight that temperature-based surveillance tools can inform research, monitoring and management of emerging and continuing marine disease threats. © 2016 The Authors.

  1. Separation of porcine parvovirus from bovine serum albumin using PEG-salt aqueous two-phase system.

    PubMed

    Vijayaragavan, K Saagar; Zahid, Amna; Young, Jonathan W; Heldt, Caryn L

    2014-09-15

    Vaccine production faces a challenge in adopting conventional downstream processing steps that can efficiently purify large viral particles. Major issues that plague vaccine purification are purity, potency, and quality. The industry currently considers 30% an acceptable virus recovery for a vaccine purification process, including all downstream steps, whereas antibody recovery from CHO cell culture is generally around 80-85%. A platform technology with improved virus recovery would revolutionize vaccine production. In a quest to fulfill this goal, we have been exploring aqueous two-phase systems (ATPSs) as an alternative method to purify virus. ATPS has been unable to gain wide implementation mainly due to loss of virus infectivity, co-purification of proteins, and difficulty of polymer recycling. Non-enveloped viruses are chemically resistant enough to withstand the high polymer and salt concentrations that are required for effective ATPS separations. We used infectious porcine parvovirus (PPV), a non-enveloped DNA virus, as a model virus to develop and test an ATPS separation method. We successfully tackled two of the three main disadvantages of ATPS stated above: we achieved a high infectious yield of 64% in a PEG-citrate ATPS process while separating out the main contaminant protein, bovine serum albumin (BSA). The most dominant forces in the separation were biomolecule charge, virus surface hydrophobicity, and the ATPS surface tension. Highly hydrophobic viruses are likely to benefit from the discovered ATPS for high-purity vaccine production and ease of implementation. Copyright © 2014 Elsevier B.V. All rights reserved.
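
    ATPS results of this kind are typically summarized by a partition coefficient and a recovery yield in the PEG-rich phase. A sketch with hypothetical titers and volumes (the 64% below is chosen to mirror the abstract's figure, not taken from the paper's data):

```python
def partition_coefficient(c_top, c_bottom):
    """K = concentration in the top (PEG-rich) phase over the bottom phase."""
    return c_top / c_bottom

def recovery_pct(c_top, v_top, c_feed, v_feed):
    """Percent of loaded infectious virus recovered in the top phase."""
    return 100.0 * (c_top * v_top) / (c_feed * v_feed)

# Hypothetical infectious titers (arbitrary infectious units/mL) and volumes (mL).
print(partition_coefficient(8e5, 5e4))   # K = 16.0
print(recovery_pct(8e5, 4.0, 1e6, 5.0))  # 64.0 % recovery in the top phase
```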

  2. Accurate documentation in cultural heritage by merging TLS and high-resolution photogrammetric data

    NASA Astrophysics Data System (ADS)

    Grussenmeyer, Pierre; Alby, Emmanuel; Assali, Pierre; Poitevin, Valentin; Hullo, Jean-François; Smigiel, Eddie

    2011-07-01

    Several recording techniques are used together in Cultural Heritage Documentation projects. The main purpose of the documentation and conservation work is usually to generate geometric and photorealistic 3D models for both accurate reconstruction and visualization purposes. The recording approach discussed in this paper is based on the combination of photogrammetric dense matching and Terrestrial Laser Scanning (TLS) techniques. Both techniques have pros and cons, and criteria such as geometry, texture, accuracy, resolution, and recording and processing time are often compared. TLS techniques (time-of-flight or phase-shift systems) are often used for the recording of large and complex objects or sites. Point cloud generation from images by dense stereo or multi-image matching can be used as an alternative or complementary method to TLS. Compared to TLS, the photogrammetric solution is a low-cost one, as the acquisition system is limited to a digital camera and a few accessories. Indeed, the stereo matching process offers a cheap, flexible and accurate solution to get 3D point clouds and textured models. The calibration of the camera allows the processing of distortion-free images, accurate orientation of the images, and matching at the subpixel level. The main advantage of this photogrammetric methodology is to obtain at the same time a point cloud (whose resolution depends on the size of the pixel on the object), and therefore an accurate meshed object with its texture. After the matching and processing steps, the resulting data can be used in much the same way as a TLS point cloud, but with much better raster information for textures. The paper addresses the automation of the recording and processing steps, the assessment of the results, and the deliverables (e.g. PDF-3D files). Visualization aspects of the final 3D models are presented.
    Two case studies with merged photogrammetric and TLS data are finally presented: the Gallo-Roman Theatre of Mandeure (France), and the Medieval Fortress of Châtel-sur-Moselle (France), where a network of underground galleries and vaults has been recorded.

  3. In-depth analysis of chloride treatments for thin-film CdTe solar cells

    PubMed Central

    Major, J. D.; Al Turkestani, M.; Bowen, L.; Brossard, M.; Li, C.; Lagoudakis, P.; Pennycook, S. J.; Phillips, L. J.; Treharne, R. E.; Durose, K.

    2016-01-01

    CdTe thin-film solar cells are now the main industrially established alternative to silicon-based photovoltaics. These cells remain reliant on the so-called chloride activation step in order to achieve high conversion efficiencies. Here, by comparison of effective and ineffective chloride treatments, we show the main role of the chloride process to be the modification of grain boundaries through chlorine accumulation, which leads to an increase in the carrier lifetime. It is also demonstrated that while improvements in fill factor and short circuit current may be achieved through use of the ineffective chlorides, or indeed simple air annealing, voltage improvement is linked directly to chlorine incorporation at the grain boundaries. This suggests that focus on improved or more controlled grain boundary treatments may provide a route to achieving higher cell voltages and thus efficiencies. PMID:27775037

  4. Through ARIPAR-GIS the quantified area risk analysis supports land-use planning activities.

    PubMed

    Spadoni, G; Egidi, D; Contini, S

    2000-01-07

    The paper first summarises the main aspects of the ARIPAR methodology, whose steps can be applied to quantify the impact on a territory of major accident risks due to processing, storing and transporting dangerous substances. The capabilities of the new decision support tool ARIPAR-GIS, which implements the mentioned procedure, are then described, together with its main features and the types of results it produces. These are clearly shown through a short description of the updated ARIPAR study (reference year 1994), in which the impact of changes due to industrial and transportation dynamics on the Ravenna territory in Italy was evaluated. The brief explanation of how the results have been used by local administrations offers the opportunity to discuss the advantages of the quantitative area risk analysis tool in supporting risk management, risk control and land-use planning activities.

  5. Drive piston assembly for a valve actuator assembly

    DOEpatents

    Sun, Zongxuan

    2010-02-23

    A drive piston assembly is provided that is operable to selectively open a poppet valve. The drive piston assembly includes a cartridge defining a generally stepped bore. A drive piston is movable within the generally stepped bore and a boost sleeve is coaxially disposed with respect to the drive piston. A main fluid chamber is at least partially defined by the generally stepped bore, drive piston, and boost sleeve. First and second feedback chambers are at least partially defined by the drive piston and each are disposed at opposite ends of the drive piston. At least one of the drive piston and the boost sleeve is sufficiently configured to move within the generally stepped bore in response to fluid pressure within the main fluid chamber to selectively open the poppet valve. A valve actuator assembly and engine are also provided incorporating the disclosed drive piston assembly.

  6. General principles for the treatment of non-infectious uveitis.

    PubMed

    Díaz-Llopis, Manuel; Gallego-Pinazo, Roberto; García-Delpech, Salvador; Salom-Alonso, David

    2009-09-01

    Ocular inflammatory disorders constitute a sight-threatening group of diseases that are managed according to their severity. Their treatment guidelines undergo constant changes with new agents that improve the results obtained with former drugs. Nowadays we can make use of a five-step protocol in which topical, periocular and systemic corticosteroids remain the main therapy for non-infectious uveitis. In addition, immunosuppressive drugs can be added in order to enhance the anti-inflammatory effects and to serve as corticosteroid-sparing agents. These can be organized in four further steps: Cyclosporine and Methotrexate in the second; Azathioprine, Mycophenolate Mofetil and Tacrolimus in the third; biological anti-TNF drugs in the fourth; and a theoretical last step with Cyclophosphamide and Chlorambucil. In the present review we go through the main characteristics and complications of all these treatments and provide a rationale for this five-step treatment protocol for non-infectious posterior uveitis.

  7. A stochastic approach to noise modeling for barometric altimeters.

    PubMed

    Sabatini, Angelo Maria; Genovese, Vincenzo

    2013-11-18

    The question of whether barometric altimeters can be applied to accurately track human motions is still debated, since their measurement performance is rather poor due to either coarse resolution or drift. As a step toward accurate short-time tracking of changes in height (up to a few minutes), we develop a stochastic model that attempts to capture some statistical properties of the barometric altimeter noise. The barometric altimeter noise is decomposed into three components with different physical origins and properties: a deterministic time-varying mean, mainly correlated with global environment changes, whose effects are prominent in long-time motion tracking; a first-order Gauss-Markov (GM) random process, mainly accounting for short-term, local environment changes, whose effects are prominent in short-time motion tracking; and an uncorrelated random process, mainly due to wideband electronic noise, including quantization noise. Autoregressive moving-average (ARMA) system identification techniques are used to capture the correlation structure of the piecewise stationary GM component and to estimate its standard deviation, together with the standard deviation of the uncorrelated component. M-point moving average filters, used alone or in combination with whitening filters learnt from the ARMA model parameters, are further tested in a few dynamic motion experiments and discussed with respect to their capability of short-time tracking of small-amplitude, low-frequency motions.
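
    The first-order Gauss-Markov component has the standard discrete-time form x[k+1] = exp(-dt/tau)·x[k] + w[k]; a simulation sketch with illustrative parameters (not the values identified in the paper):

```python
import math
import random

def simulate_gm(n, dt, tau, sigma, seed=0):
    """First-order Gauss-Markov process: x[k+1] = phi*x[k] + w[k],
    with correlation time tau and steady-state standard deviation sigma."""
    phi = math.exp(-dt / tau)
    q = sigma * math.sqrt(1.0 - phi * phi)  # driving-noise std for stationarity
    rng = random.Random(seed)
    x, out = 0.0, []
    for _ in range(n):
        x = phi * x + q * rng.gauss(0.0, 1.0)
        out.append(x)
    return out

# 0.1 s sampling, 5 s correlation time, 0.5 m steady-state std (illustrative).
samples = simulate_gm(n=20000, dt=0.1, tau=5.0, sigma=0.5)
var = sum(s * s for s in samples) / len(samples)
print(var)  # should approach sigma**2 = 0.25
```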

  8. Dynamic Analysis of the Temperature and the Concentration Profiles of an Industrial Rotary Kiln Used in Clinker Production.

    PubMed

    Rodrigues, Diulia C Q; Soares, Atílio P; Costa, Esly F; Costa, Andréa O S

    2017-01-01

    Cement is one of the most used building materials in the world. The process of cement production involves numerous and complex reactions that occur at different temperatures. Thus, there is great interest in the optimization of cement manufacturing. Clinker production is one of the main steps of cement production and it occurs inside the kiln. In this paper, the dry process of clinker production is analysed in a rotary kiln that operates in counter flow. The main phenomena involved in clinker production are as follows: evaporation of free residual water from the raw material, decomposition of magnesium carbonate, decarbonation, formation of C3A and C4AF, formation of dicalcium silicate, and formation of tricalcium silicate. The main objective of this study was to propose a mathematical model that realistically describes the temperature profile and the concentrations of the clinker components in a real rotary kiln. In addition, the influence of different inlet velocities of gas and solids on the system was analysed. The mathematical model is composed of partial differential equations. The model was implemented in Mathcad (available at CCA/UFES) and solved using industrial input data. The proposed model satisfactorily describes the temperature and concentration profiles of a real rotary kiln.
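
    The counter-flow heat exchange at the core of such kiln models can be illustrated with a steady-state two-stream energy balance, integrated along the kiln and solved by shooting for the unknown gas temperature at the solids inlet. The coefficients below are illustrative, not the paper's:

```python
def profiles(tg0, ts_in=300.0, u=0.08, cs=1.0, cg=1.2, n=1000, length=50.0):
    """Explicit Euler march of the steady-state counter-flow balances
    dTs/dz = u*(Tg - Ts)/cs and dTg/dz = u*(Tg - Ts)/cg (gas moves in -z,
    so Tg rises with z toward its hot inlet at z = length)."""
    dz = length / n
    ts, tg = ts_in, tg0
    for _ in range(n):
        d = u * (tg - ts)
        ts, tg = ts + d / cs * dz, tg + d / cg * dz
    return ts, tg  # solids-outlet and gas-inlet temperatures at z = length

def shoot(tg_inlet=1700.0, lo=300.0, hi=1700.0):
    """Bisect on the unknown gas exit temperature at z = 0 so the marched
    gas temperature matches its known hot-end inlet value (monotone in tg0)."""
    for _ in range(60):
        mid = 0.5 * (lo + hi)
        if profiles(mid)[1] < tg_inlet:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

tg0 = shoot()
ts_out, tg_in = profiles(tg0)
print(round(ts_out), round(tg_in))  # solids leave close to the hot gas inlet temperature
```

    The paper's full PDE model adds reaction enthalpies and transport terms on top of this kind of exchange balance.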

  9. Modeling the MHC class I pathway by combining predictions of proteasomal cleavage, TAP transport and MHC class I binding.

    PubMed

    Tenzer, S; Peters, B; Bulik, S; Schoor, O; Lemmel, C; Schatz, M M; Kloetzel, P-M; Rammensee, H-G; Schild, H; Holzhütter, H-G

    2005-05-01

    Epitopes presented by major histocompatibility complex (MHC) class I molecules are selected by a multi-step process. Here we present the first computational prediction of this process based on in vitro experiments characterizing proteasomal cleavage, transport by the transporter associated with antigen processing (TAP) and MHC class I binding. Our novel prediction method for proteasomal cleavages outperforms existing methods when tested on in vitro cleavage data. The analysis of our predictions for a new dataset consisting of 390 endogenously processed MHC class I ligands from cells with known proteasome composition shows that the immunological advantage of switching from constitutive to immunoproteasomes is mainly to suppress the creation of peptides in the cytosol that TAP cannot transport. Furthermore, we show that proteasomes are unlikely to generate MHC class I ligands with a C-terminal lysine residue, suggesting processing of these ligands by a different protease that may be tripeptidyl-peptidase II (TPPII).
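
    Multi-stage pathway predictions of this kind are commonly combined by scoring each stage and ranking peptides on the joint score. In the toy sketch below, the scores and the reduced weight on the TAP term are invented for illustration; they are not the authors' trained models:

```python
import math

def pathway_score(cleavage, tap, mhc, w_tap=0.2):
    """Toy combination: log-space sum of per-stage scores in (0, 1],
    with a reduced (invented) weight on the TAP transport term."""
    return math.log(cleavage) + w_tap * math.log(tap) + math.log(mhc)

# Hypothetical per-peptide stage scores: (cleavage, TAP transport, MHC binding).
peptides = {
    "good-everywhere": (0.9, 0.8, 0.95),
    "poor-TAP":        (0.9, 0.05, 0.95),
    "poor-MHC":        (0.9, 0.8, 0.05),
}
ranked = sorted(peptides, key=lambda p: pathway_score(*peptides[p]), reverse=True)
print(ranked)  # ['good-everywhere', 'poor-TAP', 'poor-MHC']
```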

  10. Optimization of High-Throughput Sequencing Kinetics for determining enzymatic rate constants of thousands of RNA substrates

    PubMed Central

    Niland, Courtney N.; Jankowsky, Eckhard; Harris, Michael E.

    2016-01-01

    Quantification of the specificity of RNA binding proteins and RNA processing enzymes is essential to understanding their fundamental roles in biological processes. High Throughput Sequencing Kinetics (HTS-Kin) uses high throughput sequencing and internal competition kinetics to simultaneously monitor the processing rate constants of thousands of substrates by RNA processing enzymes. This technique has provided unprecedented insight into the substrate specificity of the tRNA processing endonuclease ribonuclease P. Here, we investigate the accuracy and robustness of the measurements associated with each step of the HTS-Kin procedure. We examine the effect of substrate concentration on the observed rate constant, determine the optimal kinetic parameters, and provide guidelines for reducing error in amplification of the substrate population. Importantly, we find that high-throughput sequencing and experimental reproducibility contribute their own sources of error, and these are the main sources of imprecision in the quantified results when the otherwise optimized guidelines are followed. PMID:27296633
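
    In internal competition kinetics, all substrates in one reaction share the same enzyme-availability integral, so relative rate constants reduce to ratios of log fractions remaining. A sketch with hypothetical fractions (as would be estimated from normalized read counts):

```python
import math

def relative_k(frac_remaining, frac_remaining_ref):
    """k_i / k_ref for substrates competing in the same reaction:
    ln f_i = -k_i * F(t), with the same F(t) for every substrate,
    so the unknown F(t) cancels in the ratio."""
    return math.log(frac_remaining) / math.log(frac_remaining_ref)

# Hypothetical fractions remaining at one quenched time point.
print(relative_k(0.25, 0.5))  # 2.0: consumed twice as fast as the reference
print(relative_k(0.50, 0.5))  # 1.0: same rate as the reference
```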

  11. Green dyeing process of modified cotton fibres using natural dyes extracted from Tamarix aphylla (L.) Karst. leaves.

    PubMed

    Baaka, Noureddine; Mahfoudhi, Adel; Haddar, Wafa; Mhenni, Mohamed Farouk; Mighri, Zine

    2017-01-01

    This research work involves an eco-friendly process for dyeing modified cotton with the aqueous extract of Tamarix aphylla leaves. In this process, the dyeing step was carried out on cotton modified by several cationising agents in order to improve its dyeability. The influence of the main dyeing conditions (dye bath pH, dyeing time, dyeing temperature, salt addition) on the performance of this dyeing process was studied. The dyeing performance was assessed by measuring the colour yield (K/S) and the fastness properties of the dyed samples. The effect of mordant type with different mordanting methods on dyeing quality was also studied. The results showed that mordanting gave deeper shades and enhanced fastness properties. In addition, environmental indicators (BOD5, COD and COD/BOD5) were used to describe potential improvements in the biodegradability of the dyebath wastewater. Further, HPLC was used to identify the major phenolic compounds in the extracted dye.
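
    Colour yield K/S is conventionally computed from reflectance at the absorption maximum via the Kubelka-Munk relation; a sketch with hypothetical reflectance values:

```python
def k_over_s(reflectance):
    """Kubelka-Munk colour yield from decimal reflectance R (0 < R <= 1):
    K/S = (1 - R)^2 / (2R)."""
    return (1.0 - reflectance) ** 2 / (2.0 * reflectance)

# Hypothetical reflectances at the absorption maximum: a deeper
# (e.g. mordanted) shade reflects less light, so K/S rises.
print(k_over_s(0.40))  # lighter shade
print(k_over_s(0.10))  # deeper shade, roughly nine-fold higher K/S
```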

  12. [The processes of manuscript evaluation and publication in Medicina Clínica. The editorial committee of Medicina Clínica].

    PubMed

    Ribera, Josep M; Cardellach, Francesc; Selva, Albert

    2005-12-01

    The decision-making process includes a series of activities undertaken in biomedical journals from the moment a manuscript is received until it is accepted or rejected. Firstly, the manuscript is evaluated by the members of the Editorial Board, who analyze both its suitability for the journal and its scientific quality. After this initial evaluation, the article is evaluated by peer reviewers, an essential process to guarantee its scientific validity. Both the Editorial Board and the peer reviewers usually use checklists which are of enormous help in this task. Once the biomedical article has been accepted, the publication process is started, which in turn includes a series of steps, beginning with technical and medical review of the article's contents and ending with the article's publication in the journal. The present article provides a detailed description of the main technical and ethical issues involved in the processes of decision-making and publication of biomedical articles.

  13. A two-in-one process for reliable graphene transistors processed with photo-lithography

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ahlberg, P.; Hinnemo, M.; Song, M.

    2015-11-16

    Research on graphene field-effect transistors (GFETs) has mainly relied on devices fabricated using electron-beam lithography for pattern generation, a method with known problems of polymer contaminants. GFETs fabricated via photo-lithography suffer even more from other chemical contamination, which may lead to strong unintentional doping of the graphene. In this letter, we report on a scalable fabrication process for reliable GFETs based on ordinary photo-lithography that eliminates the aforementioned issues. The key to making this GFET processing compatible with silicon technology lies in a two-in-one process in which a gate dielectric is deposited by means of atomic layer deposition. During this deposition step, contaminants, likely unintentionally introduced during the graphene transfer and patterning, are effectively removed. The resulting GFETs exhibit current-voltage characteristics representative of intrinsic, non-doped graphene. Fundamental aspects of the surface engineering employed in this work are investigated in the light of chemical analysis in combination with electrical characterization.

  14. Ten steps to successful software process improvement

    NASA Technical Reports Server (NTRS)

    Kandt, R. K.

    2003-01-01

This paper identifies ten steps for managing change that address organizational and cultural issues. Four of these steps are critical; if they are not performed, failure is almost guaranteed. This ten-step program emphasizes the alignment of business goals, change-process goals, and the work performed by the employees of an organization.

  15. Bistatic SAR: Signal Processing and Image Formation.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wahl, Daniel E.; Yocky, David A.

This report describes the significant processing steps used to take the raw recorded digitized signals from the bistatic synthetic aperture radar (SAR) hardware built for the NCNS Bistatic SAR project to a final bistatic SAR image. In general, the process steps herein are applicable to bistatic SAR signals that include the direct-path signal and the reflected signal. The steps include pre-processing, data extraction to form a phase history, and finally, image formation. Plots and values are shown at most steps to illustrate the processing for a bistatic COSMO-SkyMed collection gathered on June 10, 2013 on Kirtland Air Force Base, New Mexico.

  16. Production of polyimide ceria nanocomposites by development of molecular hook technology in nano-sonochemistry.

    PubMed

    Hatami, Mehdi

    2018-06-01

Poly(amic acid), the precursor of polyimide (PI), was used for the preparation of PI/CeO2 nanocomposites (NCs) by an ultrasound-assisted technique via insertion of surface-modified CeO2 nanoparticles (NPs) into the PI matrix. In the first preparation step, the CeO2 NPs were modified with hexadecyltrimethoxysilane (HDTMS) as a binder using ultrasonic waves. In the second step, a newly designed PI structure acting as a molecular hook was formed by a sonochemical imidization process. In this step two different reactions occurred: acetic acid elimination in the main chain of the macromolecule, and acetylation of the side chains of poly(amic acid). The acetylation created the hook structure for trapping the modified nanoparticles. In the final step, the PI NCs were prepared by a sonochemical process. The structural and thermal properties of pure PI and the PI/CeO2 NCs were studied by several techniques, including Fourier transform infrared spectroscopy (FT-IR), nuclear magnetic resonance spectroscopy (NMR), field-emission scanning electron microscopy (FE-SEM), transmission electron microscopy (TEM), atomic force microscopy (AFM), X-ray diffraction (XRD), and thermal analyses. FT-IR and 1H NMR spectra confirmed the successful preparation of the PI matrix. The FE-SEM, TEM, and AFM analyses showed a uniform distribution of CeO2 NPs in the PI matrix. The XRD patterns of the NCs show the presence of crystalline CeO2 NPs in the amorphous PI matrix. The thermal analysis results reveal that the thermal stability of the samples improved as the CeO2 NP content of the PI matrix increased. Copyright © 2018 Elsevier B.V. All rights reserved.

  17. Quantum Mechanics and Molecular Mechanics Study of the Catalytic Mechanism of Human AMSH-LP Domain Deubiquitinating Enzymes.

    PubMed

    Zhu, Wenyou; Liu, Yongjun; Ling, Baoping

    2015-08-25

    Deubiquitinating enzymes (DUBs) catalyze the cleavage of the isopeptide bond in polyubiquitin chains to control and regulate the deubiquitination process in all known eukaryotic cells. The human AMSH-LP DUB domain specifically cleaves the isopeptide bonds in the Lys63-linked polyubiquitin chains. In this article, the catalytic mechanism of AMSH-LP has been studied using a combined quantum mechanics and molecular mechanics method. Two possible hydrolysis processes (Path 1 and Path 2) have been considered. Our calculation results reveal that the activation of Zn(2+)-coordinated water molecule is the essential step for the hydrolysis of isopeptide bond. In Path 1, the generated hydroxyl first attacks the carbonyl group of Gly76, and then the amino group of Lys63 is protonated, which is calculated to be the rate limiting step with an energy barrier of 13.1 kcal/mol. The energy barrier of the rate limiting step and the structures of intermediate and product are in agreement with the experimental results. In Path 2, the protonation of amino group of Lys63 is prior to the nucleophilic attack of activated hydroxyl. The two proton transfer processes in Path 2 correspond to comparable overall barriers (33.4 and 36.1 kcal/mol), which are very high for an enzymatic reaction. Thus, Path 2 can be ruled out. During the reaction, Glu292 acts as a proton transfer mediator, and Ser357 mainly plays a role in stabilizing the negative charge of Gly76. Besides acting as a Lewis acid, Zn(2+) also influences the reaction by coordinating to the reaction substrates (W1 and Gly76).

  18. Reducing acid leaching of manganiferous ore: Effect of the iron removal operation on solid waste disposal

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    De Michelis, Ida; Ferella, Francesco; Beolchini, Francesca

    2009-01-15

The process of reducing acid leaching of manganiferous ore is aimed at the extraction of manganese from low-grade manganese ores. This work focuses on the iron removal operation. The following items have been considered in order to investigate the effect of the main operating conditions on solid waste disposal and on process costs: (i) the type and quantity of the base agent used for iron precipitation, (ii) the effective need for leaching waste separation prior to the iron removal operation, (iii) the presence of a second leaching stage with the roasted ore, which might also act as a preliminary iron removal step, and (iv) the effect of tailings washing on the solid waste classification. Different base compounds have been tested, including CaO, CaCO3, NaOH, and Na2CO3. The latter gave the best results concerning both precipitation kinetics and reagent consumption. Filtration of the leach liquor prior to iron removal was not necessary, implying significant savings in capital costs. A reduction of chemical consumption and an increase of manganese concentration in the solution were obtained by introducing secondary leaching tests with the previously roasted ore; this additional step was introduced without a significant decrease of the global manganese extraction yield. Finally, toxicity characteristic leaching procedure (TCLP) tests carried out on the leaching solid waste showed: (i) a reduction of arsenic mobility in the presence of iron precipitates, and (ii) the need for a washing step in order to produce a waste that is classifiable as not dangerous under the existing national environmental laws.

  19. A DFT study on NHC-catalyzed intramolecular aldehyde-ketone crossed-benzoin reaction: mechanism, regioselectivity, stereoselectivity, and role of NHC.

    PubMed

    Zhang, Wei; Wang, Yang; Wei, Donghui; Tang, Mingsheng; Zhu, Xinju

    2016-07-06

    A systematic theoretical study has been carried out to understand the mechanism and stereoselectivity of N-heterocyclic carbene (NHC)-catalyzed intramolecular crossed-benzoin reaction of enolizable keto-aldehyde using density functional theory (DFT) calculations. The calculated results reveal that the most favorable pathway contains four steps, i.e., the nucleophilic attack of NHC on the carbonyl carbon atom of a formyl group, the formation of a Breslow intermediate, a ring-closure process coupled with proton transfer, and regeneration of the catalyst. For the formation of the Breslow intermediate via the [1,2]-proton transfer process, apart from the direct proton transfer mechanism, the base Et3N and the in situ generated Brønsted acid Et3N·H(+) mediated proton transfer mechanisms have also been investigated; the free energy barriers for the crucial proton transfer steps are found to be significantly lowered by explicit inclusion of the Brønsted acid Et3N·H(+). The computational results show that the ring-closure process is the stereoselectivity-determining step, in which two chirality centers assigned on the coupling carbon atoms are formed, and the S-configured diastereomer is the predominant product, which is in good agreement with the experimental observations. NCI and NBO analyses are employed to disclose the origin of stereoselectivity and regioselectivity. Moreover, a global reaction index (GRI) analysis has been performed to confirm that NHC mainly plays the role of a Lewis base. The mechanistic insights obtained in the present study should be valuable for the rational design of an effective organocatalyst for this kind of reaction with high stereoselectivity and regioselectivity.

  20. Assessment of molecular contamination in mask pod

    NASA Astrophysics Data System (ADS)

    Foray, Jean Marie; Dejaune, Patrice; Sergent, Pierre; Gough, Stuart; Cheung, D.; Davenet, Magali; Favre, Arnaud; Rude, C.; Trautmann, T.; Tissier, Michel; Fontaine, H.; Veillerot, M.; Avary, K.; Hollein, I.; Lerit, R.

    2008-04-01

Context / study motivation: Contamination, and especially airborne molecular contamination (AMC), is a critical issue for the mask material flow, with a severe and fairly unpredictable risk of induced contamination and damage, especially for 193 nm lithography. It is therefore essential to measure, to understand, and then to try to reduce AMC in the mask environment. The mask material flow was studied in a global approach by a pool of European partners, especially within the frame of the European MEDEA+ project "MUSCLE". This paper deals first with results and assessment of the mask pod environment in terms of molecular contamination, and second with preliminary studies to reduce mask pod influence and contamination due to material outgassing. Approach and techniques: A specific assessment of environmental / molecular contamination along the supply chain was performed by all partners. After previous work presented at EMLC 07, further studies were performed on real-time contamination measurement of pods at different sites (including a mask manufacturing site, blank manufacturing sites, and an IC fab). The studies were linked to the main critical issues: cleaning, storage, handling, materials, and processes. Contamination measurement campaigns were carried out along the mask supply chain using a dedicated Adixen analyzer in order to monitor organic contaminants (ppb level) in mask pods in real time. Key results are presented: VOC, AMC, and humidity levels in different kinds of mask carriers, the impact of basic cleaning on pod outgassing measurements (VOC, NH3), and process influence on pod contamination. In a second step, preliminary pod conditioning studies for a better pod environment were performed based on the Adixen vacuum process. Process influence was measured experimentally in terms of molecular outgassing from mask pods.
Different AMC experimental characterization methods were carried out, leading to results on a wide range of organic and inorganic contaminants: inline techniques based on Adixen humidity, VOC, and organic sensors, together with off-line techniques already used in the extensive previous mask pod benchmark (TD-GCMS and ionic chromatography). Humidity and VOC levels from mask carriers showed a significant reduction after the Adixen pod conditioning process. Focus was placed on an optimized vacuum step (for AMC) after the particle carrier cleaning cycle. Based upon these key results, new procedures, as well as guidelines for mask carrier cleaning optimization, are proposed to improve pod contamination control. Summary results / next steps: This paper reports molecular contamination measurement campaigns performed by a pool of European partners along the mask supply chain. It allowed us to investigate, identify, and quantify critical molecular contamination in mask pods, including VOC and humidity issues depending on location, use, and carrier type. Preliminary studies highlight initial process solutions for pod conditioning that are being used for short-term industrialization and further industrialization.

  1. Effective virus inactivation and removal by steps of Biotest Pharmaceuticals IGIV production process

    PubMed Central

    Dichtelmüller, Herbert O.; Flechsig, Eckhard; Sananes, Frank; Kretschmar, Michael; Dougherty, Christopher J.

    2012-01-01

    The virus validation of three steps of Biotest Pharmaceuticals IGIV production process is described here. The steps validated are precipitation and removal of fraction III of the cold ethanol fractionation process, solvent/detergent treatment and 35 nm virus filtration. Virus validation was performed considering combined worst case conditions. By these validated steps sufficient virus inactivation/removal is achieved, resulting in a virus safe product. PMID:24371563

  2. Increased interestingness of extraneous details in a multimedia science presentation leads to decreased learning.

    PubMed

    Mayer, Richard E; Griffith, Emily; Jurkowitz, Ilana T N; Rothman, Daniel

    2008-12-01

    In Experiment 1, students received an illustrated booklet, PowerPoint presentation, or narrated animation that explained 6 steps in how a cold virus infects the human body. The material included 6 high-interest details mainly about the role of viruses in sex or death (high group) or 6 low-interest details consisting of facts and health tips about viruses (low group). The low group outperformed the high group across all 3 media on a subsequent test of problem-solving transfer (d = .80) but not retention (d = .05). In Experiment 2, students who studied a PowerPoint lesson explaining the steps in how digestion works performed better on a problem-solving transfer test if the lesson contained 7 low-interest details rather than 7 high-interest details (d = .86), but the groups did not differ on retention (d = .26). In both experiments, as the interestingness of details was increased, student understanding decreased (as measured by transfer). Results are consistent with a cognitive theory of multimedia learning, in which highly interesting details sap processing capacity away from deeper cognitive processing of the core material during learning. PsycINFO Database Record (c) 2008 APA, all rights reserved.

  3. Space Object Collision Probability via Monte Carlo on the Graphics Processing Unit

    NASA Astrophysics Data System (ADS)

    Vittaldev, Vivek; Russell, Ryan P.

    2017-09-01

Fast and accurate collision probability computations are essential for protecting space assets. Monte Carlo (MC) simulation is the most accurate but computationally intensive method. A Graphics Processing Unit (GPU) is used to parallelize the computation and reduce the overall runtime. Using MC techniques to compute the collision probability is common in the literature as the benchmark. An optimized implementation on the GPU, however, is a challenging problem and is the main focus of the current work. The MC simulation takes samples from the uncertainty distributions of the Resident Space Objects (RSOs) at any time during a time window of interest and outputs the separations at closest approach. Therefore, any uncertainty propagation method may be used, and the collision probability is automatically computed as a function of the RSO collision radii. Integration using a fixed time step and a quartic interpolation after every Runge-Kutta step ensures that no close approaches are missed. Two orders of magnitude speedup over a serial CPU implementation is shown, and speedups improve moderately with higher-fidelity dynamics. The tool makes the MC approach tractable on a single workstation and can be used as a final product, or for verifying surrogate and analytical collision probability methods.
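The core of the MC estimate described in this record can be sketched as follows. This is a minimal serial stand-in for the GPU implementation; the Gaussian covariances, miss distance, combined radius, and sample count are illustrative assumptions, not values from the paper.

```python
import numpy as np

def mc_collision_probability(mu_a, cov_a, mu_b, cov_b,
                             combined_radius, n_samples=200_000, seed=0):
    """Estimate collision probability by sampling the position
    uncertainties of two objects at their closest approach."""
    rng = np.random.default_rng(seed)
    pos_a = rng.multivariate_normal(mu_a, cov_a, n_samples)
    pos_b = rng.multivariate_normal(mu_b, cov_b, n_samples)
    separations = np.linalg.norm(pos_a - pos_b, axis=1)
    # A "hit" is any sample whose separation falls below the
    # sum of the two collision radii.
    return float(np.mean(separations < combined_radius))

# Illustrative 3-D numbers (km); covariances and radius are assumptions.
mu_a = np.zeros(3)
mu_b = np.array([0.4, 0.0, 0.0])
cov = np.diag([0.1, 0.1, 0.1]) ** 2
p = mc_collision_probability(mu_a, cov, mu_b, cov, combined_radius=0.2)
```

Because the estimate is just a mean over independent samples, it parallelizes trivially, which is what makes a GPU implementation attractive.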

  4. The HERSCHEL/PACS early Data Products

    NASA Astrophysics Data System (ADS)

    Wieprecht, E.; Wetzstein, M.; Huygen, R.; Vandenbussche, B.; De Meester, W.

    2006-07-01

ESA's Herschel Space Observatory, to be launched in 2007, is the first space observatory covering the full far-infrared and submillimeter wavelength range (60 - 670 microns). The Photodetector Array Camera & Spectrometer (PACS) is one of its three science instruments. It contains two Ge:Ga photoconductor arrays and two bolometer arrays to perform imaging line spectroscopy and imaging photometry in the 60 - 210 micron wavelength band. The HERSCHEL ground segment (Herschel Common Science System - HCSS) is implemented using Java technology and written in a common effort by the HERSCHEL Science Center and the three instrument teams. The PACS Common Software System (PCSS) is based on the HCSS and used for the online and offline analysis of PACS data. For telemetry bandwidth reasons, PACS science data are partially processed on board, compressed, cut into telemetry packets, and transmitted to the ground. These steps are instrument-mode dependent. We present the software model that makes it possible to reverse the discrete on-board processing steps and evaluate the data. After decompression and reconstruction, the detector data and instrument status information are organized in two main PACS Products. The design of these Java classes considers the individual sampling rates, data formats, memory and performance optimization aspects, and comfortable user interfaces.

  5. Dynamic Scaling and Island Growth Kinetics in Pulsed Laser Deposition of SrTiO 3

    DOE PAGES

    Eres, Gyula; Tischler, J. Z.; Rouleau, C. M.; ...

    2016-11-11

We use real-time diffuse surface x-ray diffraction to probe the evolution of island size distributions and its effects on surface smoothing in pulsed laser deposition (PLD) of SrTiO3. In this study, we show that the island size evolution obeys dynamic scaling and exhibits two distinct regimes of island growth kinetics. Our data show that PLD film growth can persist without roughening even when thermally driven Ostwald ripening, the main mechanism for surface smoothing, is shut down. The absence of roughening is concomitant with decreasing island density, contradicting the prevailing view that increasing island density is the key to surface smoothing in PLD. We also report a previously unobserved crossover from diffusion-limited to attachment-limited island growth that reveals the influence of nonequilibrium atomic-level surface transport processes on the growth modes in PLD. We show by direct measurements that attachment-limited island growth is the dominant process in PLD that creates step-flow-like behavior, or quasi-step flow, as PLD "self-organizes" local step flow on a length scale consistent with the substrate temperature and PLD parameters.

  6. Analysis of Flatness Deviations for Austenitic Stainless Steel Workpieces after Efficient Surface Machining

    NASA Astrophysics Data System (ADS)

    Nadolny, K.; Kapłonek, W.

    2014-08-01

The following work is an analysis of flatness deviations of a workpiece made of X2CrNiMo17-12-2 austenitic stainless steel. The workpiece surface was shaped using efficient machining techniques (milling, grinding, and smoothing). After the machining was completed, all surfaces underwent stylus measurements in order to obtain surface flatness and roughness parameters. For this purpose the stylus profilometer Hommel-Tester T8000 by Hommelwerke with HommelMap software was used. The research results are presented in the form of 2D surface maps, 3D surface topographies with extracted single profiles, Abbott-Firestone curves, and graphical studies of the Sk parameters. The results of these experimental tests demonstrated a possible correlation between flatness and roughness parameters, and enabled an analysis of changes in these parameters from shaping and rough grinding to finish machining. The main novelty of this paper is a comprehensive analysis of measurement results obtained during a three-step machining process of austenitic stainless steel. Simultaneous analysis of the individual machining steps (milling, grinding, and smoothing) enabled a complementary assessment of the process of shaping the workpiece surface macro- and micro-geometry, giving special consideration to minimizing flatness deviations.

  8. Treatment of high strength distillery wastewater (cherry stillage) by integrated aerobic biological oxidation and ozonation.

    PubMed

    Beltrán, F J; Alvarez, P M; Rodríguez, E M; García-Araya, J F; Rivas, J

    2001-01-01

The performance of integrated aerobic digestion and ozonation for the treatment of high-strength distillery wastewater (i.e., cherry stillage) is reported. Experiments were conducted in laboratory batch systems operating in draw-and-fill mode. For the biological step, activated sludge from a municipal wastewater treatment facility was used as inoculum, showing a high degree of activity towards distillery wastewater. Thus, BOD and COD overall conversions of 95% and 82% were achieved, respectively. However, polyphenol content and absorbance at 254 nm (A(254)) could not be reduced by more than 35% and 15%, respectively, by single biological oxidation. By considering COD as substrate, the aerobic digestion process followed Contois model kinetics, from which the maximum specific growth rate of microorganisms (mu(max)) and the inhibition factor, beta, were evaluated at different conditions of temperature and pH. In the combined process, the effect of a post-ozonation stage was studied. The main goals achieved by the ozonation step were the removal of polyphenols and A(254). Therefore, ozonation was shown to be an appropriate technology to aid aerobic biological oxidation in the treatment of cherry stillage.
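The Contois rate law mentioned in this record makes the specific growth rate depend on the substrate-to-biomass ratio rather than on substrate concentration alone. A minimal sketch follows; here beta is used as the Contois constant, and all parameter values are illustrative assumptions, not the fitted values from the study.

```python
def contois_growth_rate(mu_max, beta, S, X):
    """Contois kinetics: mu = mu_max * S / (beta * X + S),
    where S is substrate (e.g. COD) and X is biomass."""
    return mu_max * S / (beta * X + S)

# Illustrative values only: mu_max in 1/h, S and X in g/L.
mu = contois_growth_rate(mu_max=0.4, beta=0.5, S=10.0, X=4.0)
# At high S/X the rate saturates towards mu_max; at high biomass
# density the rate drops, unlike in the Monod model.
```

The contrast with Monod kinetics (where the denominator is a constant plus S) is what lets the model capture the biomass-dependent slowdown reported for high-strength wastewaters.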

  9. A two-step along-track spectral analysis for estimating the magnetic signals of magnetospheric ring current from Swarm data

    NASA Astrophysics Data System (ADS)

    Martinec, Zdeněk; Velímský, Jakub; Haagmans, Roger; Šachl, Libor

    2018-02-01

This study deals with the analysis of Swarm vector magnetic field measurements in order to estimate the magnetic field of the magnetospheric ring current. For a single Swarm satellite, the magnetic measurements are processed by along-track spectral analysis on a track-by-track basis. The main and lithospheric magnetic fields are modelled by the CHAOS-6 field model and subtracted from the along-track Swarm magnetic data. The mid-latitude residual signal is then spectrally analysed and extrapolated to the polar regions. The resulting model of the magnetosphere (model MME) is compared to the existing Swarm Level 2 magnetospheric field model (MMA_SHA_2C). Differences of up to 10 nT are found in nightside Swarm data from 2014 April 8 to May 10; these are due to the different processing schemes used to construct the two magnetospheric magnetic field models. The forward-simulated magnetospheric magnetic field generated by the external part of model MME then demonstrates the consistency of the separation of the Swarm along-track signal into external and internal parts by the two-step along-track spectral analysis.
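The idea behind an along-track spectral analysis can be illustrated with a toy example: the ring-current contribution varies slowly along an orbit, so keeping only the lowest spatial harmonics of the residual isolates it from small-scale structure. This is a crude stand-in for the paper's method; the harmonic cutoff and the synthetic signal amplitudes are assumptions for illustration only.

```python
import numpy as np

def keep_low_harmonics(residual, n_harmonics=2):
    """Retain only the lowest spatial harmonics of an along-track
    residual signal, discarding small-scale content."""
    spectrum = np.fft.rfft(residual)
    spectrum[n_harmonics + 1:] = 0.0   # zero everything above order n
    return np.fft.irfft(spectrum, n=len(residual))

# Synthetic track (360 samples around one orbit): a slow 8 nT
# "ring current" signal plus a 0.5 nT small-scale disturbance.
theta = np.linspace(0.0, 2.0 * np.pi, 360, endpoint=False)
large_scale = 8.0 * np.cos(theta)
small_scale = 0.5 * np.cos(25.0 * theta)
recovered = keep_low_harmonics(large_scale + small_scale)
```

Because the two components live in disjoint harmonics here, the low-pass step recovers the large-scale signal essentially exactly; real residuals overlap in frequency, which is why the paper's two-step scheme and polar extrapolation are needed.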

  10. Two-Step Production of Phenylpyruvic Acid from L-Phenylalanine by Growing and Resting Cells of Engineered Escherichia coli: Process Optimization and Kinetics Modeling.

    PubMed

    Hou, Ying; Hossain, Gazi Sakir; Li, Jianghua; Shin, Hyun-Dong; Liu, Long; Du, Guocheng; Chen, Jian

    2016-01-01

Phenylpyruvic acid (PPA) is widely used in the pharmaceutical, food, and chemical industries. Here, a two-step bioconversion process, involving growing and resting cells, was established to produce PPA from L-phenylalanine using an engineered Escherichia coli strain constructed previously. First, the biotransformation conditions for growing cells were optimized (L-phenylalanine concentration 20.0 g·L-1, temperature 35°C) and a two-stage temperature control strategy (hold at 20°C for 12 h, then raise the temperature to 35°C until the end of biotransformation) was applied. The biotransformation conditions for resting cells were then optimized in a 3-L bioreactor as follows: agitation speed 500 rpm, aeration rate 1.5 vvm, and L-phenylalanine concentration 30 g·L-1. The total maximal production (mass conversion rate) reached 29.8 ± 2.1 g·L-1 (99.3%) and 75.1 ± 2.5 g·L-1 (93.9%) in the flask and the 3-L bioreactor, respectively. Finally, a kinetic model was established, revealing that substrate and product inhibition were the main limiting factors for resting-cell biotransformation.
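One common way to express combined substrate and product inhibition of the kind this record identifies is a Haldane-type substrate term multiplied by a simple product-inhibition factor. The functional form and all parameter values below are illustrative assumptions, not the model fitted in the study.

```python
def inhibited_rate(v_max, Km, S, P, Ki_s, Ki_p):
    """Michaelis-Menten-type rate with a Haldane substrate-inhibition
    term (S**2 / Ki_s) and a simple product-inhibition factor."""
    return v_max * S / ((Km + S + S**2 / Ki_s) * (1.0 + P / Ki_p))

# Illustrative parameters only (g/L-based units assumed).
rate_no_product = inhibited_rate(v_max=10.0, Km=5.0, S=30.0, P=0.0,
                                 Ki_s=100.0, Ki_p=50.0)
rate_with_product = inhibited_rate(v_max=10.0, Km=5.0, S=30.0, P=20.0,
                                   Ki_s=100.0, Ki_p=50.0)
```

The shape of such a curve explains the observed behavior: raising the substrate feed beyond a point, or letting product accumulate, both depress the conversion rate.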

  11. On brain lesions, the milkman and Sigmunda.

    PubMed

    Izquierdo, I; Medina, J H

    1998-10-01

Lesion studies have been of historical importance in establishing the brain systems involved in memory processes. Many of those studies, however, have been overinterpreted in terms of the actual role of each system and of the connections between systems. The more recent molecular pharmacological approach has produced major advances in these two areas. The main biochemical steps of memory formation in the CA1 region of the hippocampus have been established by localized microinfusions of drugs acting on specific enzymes or receptors, by subcellular measurements of the activity or function of those enzymes and receptors at definite times, and by transgenic deletions or changes of those proteins. The biochemical steps of long-term memory formation in CA1 have been found to be quite similar to those of long-term potentiation in the same region, and of other forms of plasticity. Connections between the hippocampus and the entorhinal and parietal cortices in the formation and modulation of short- and long-term memory have also been elucidated using these techniques. Lesion studies, coupled with imaging studies, still have a role to play; with regard to human memory, this role is in many ways unique. But these methods by themselves are not informative as to the mechanisms of memory processing, storage or modulation.

  12. Ontology-based coupled optimisation design method using state-space analysis for the spindle box system of large ultra-precision optical grinding machine

    NASA Astrophysics Data System (ADS)

    Wang, Qianren; Chen, Xing; Yin, Yuehong; Lu, Jian

    2017-08-01

With the increasing complexity of mechatronic products, traditional empirical or step-by-step design methods face great challenges, as various factors and different stages inevitably become coupled during the design process. Management of massive information, or big data, as well as the efficient operation of information flow, is deeply involved in the process of coupled design. Designers have to address increasingly sophisticated situations when coupled optimisation is also engaged. Aiming to overcome these difficulties in the design of the spindle box system of an ultra-precision optical grinding machine, this paper proposes a coupled optimisation design method based on state-space analysis, with the design knowledge represented by ontologies and their semantic networks. An electromechanical coupled model integrating the mechanical structure, control system, and motor driving system is established, mainly concerning the stiffness matrices of the hydrostatic bearings, ball-screw nut, and rolling guide sliders. The effectiveness and precision of the method are validated by simulation results for the natural frequency and deformation of the spindle box when an impact force is applied to the grinding wheel.

  13. To image analysis in computed tomography

    NASA Astrophysics Data System (ADS)

    Chukalina, Marina; Nikolaev, Dmitry; Ingacheva, Anastasia; Buzmakov, Alexey; Yakimchuk, Ivan; Asadchikov, Victor

    2017-03-01

The presence of errors in a tomographic image may lead to misdiagnosis when computed tomography (CT) is used in medicine, or to wrong decisions about the parameters of technological processes when CT is used in industrial applications. These errors arise for two main reasons. First, errors occur at the measurement step, e.g. from incorrect calibration and estimation of the geometric parameters of the set-up. The second reason is the nature of the tomographic reconstruction step: at this stage a mathematical model to calculate the projection data is created, and the optimization and regularization methods applied, along with their numerical implementations, have their own specific errors. Nowadays, many research teams try to analyze these errors and establish the relations between error sources. In this paper, we do not analyze the nature of the final error, but present a new approach for calculating its distribution in the reconstructed volume. We hope that visualization of the error distribution will allow experts to clarify the medical report impression or expert summary they give after analyzing CT results. To illustrate the efficiency of the proposed approach we present both simulation and real-data processing results.

  14. Steam pretreatment of Saccharum officinarum L. bagasse by adding of impregnating agents for advanced bioethanol production.

    PubMed

    Verardi, A; Blasi, A; De Bari, I; Calabrò, V

    2016-12-01

The main byproduct of the sugarcane industry, Saccharum officinarum L. bagasse (sugarcane bagasse, SCB), is widely used as a lignocellulosic biomass for bio-ethanol (EtOH) production. In this study, SCB was pretreated by the steam explosion (SE) method using two different impregnating agents: sulfur dioxide (SD) and hydrogen peroxide (HP). Impregnating agents improve the performance of the SE method, increasing the concentration of fermentable sugars after enzymatic saccharification and decreasing the inhibitor compounds produced during the steam pretreatment step. The aim of this study was to investigate and compare the two impregnating agents under various SE conditions in order to optimize the pretreatment parameters. For every pretreatment condition, the following were evaluated: the concentration of fermentable sugars, glucose and xylose yields, and the effects of inhibitor compounds on the enzymatic hydrolysis step. The results make it possible to improve the efficiency of the whole bio-EtOH synthesis process, enhancing the amount of fermentable sugars produced and the eco-sustainability of the process. Indeed, the optimization of steam pretreatment leads to reduced energy requirements and a lower environmental impact. Copyright © 2015 Elsevier Inc. All rights reserved.

  15. Optimization of airport security lanes

    NASA Astrophysics Data System (ADS)

    Chen, Lin

    2018-05-01

    The current airport security management system is widely implemented around the world to ensure the safety of passengers, but it might not be optimal. This paper seeks a better security system, one that maximizes security while minimizing inconvenience to passengers. Firstly, we apply a Petri net model to identify the steps where the main bottlenecks lie. Based on average tokens and time transitions, the most time-consuming steps of the security process can be found: inspection of passengers' identification and documents, preparing belongings to be scanned, and retrieving belongings afterwards. Then, we develop a queuing model to identify the factors affecting those time-consuming steps. Effective improvement measures include converting the current system to single-queue, multi-server operation; intelligently predicting the number of security checkpoints that should be opened; and building green biological convenience lanes. Furthermore, to test the theoretical results, we use data to simulate the model, and the simulation results are consistent with those obtained through modeling. Finally, we apply our queuing model to a multi-cultural background. The result suggests that, by quantifying and modifying the variance in wait time, the model can be applied to individuals with various customs and habits. Overall, our paper considers multiple affecting factors, employs several models and performs extensive calculations, making it practical and reliable for real-world use. In addition, with more precise data available, we can further test and improve our models.
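
    The queuing step can be sketched with a standard M/M/c model (an assumed formulation; the abstract does not give the paper's exact equations): the Erlang C formula gives the probability that an arriving passenger must wait, from which the mean queue time at, say, an ID-check station follows, so one can compare different numbers of open lanes.

```python
import math

def erlang_c(lam, mu, c):
    """Probability an arriving passenger must wait (Erlang C) in an
    M/M/c queue with arrival rate lam, service rate mu, c servers."""
    rho = lam / (c * mu)          # server utilization, must be < 1
    a = lam / mu                  # offered load
    s = sum(a**k / math.factorial(k) for k in range(c))
    top = a**c / (math.factorial(c) * (1.0 - rho))
    return top / (s + top)

def mean_wait(lam, mu, c):
    """Mean time spent in queue, W_q = P(wait) / (c*mu - lam)."""
    return erlang_c(lam, mu, c) / (c * mu - lam)

# Illustrative numbers: 3 passengers/min arrive, an ID check takes 40 s.
lam, mu = 3.0, 60.0 / 40.0        # rates per minute
for c in (3, 4, 5):               # number of open checkpoints
    print(c, "lanes -> mean wait", round(mean_wait(lam, mu, c), 3), "min")
```

    Opening more lanes drives the mean wait down sharply, which is the trade-off the paper's checkpoint-prediction measure exploits.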

  16. Mechanical and Metallurgical Evolution of Stainless Steel 321 in a Multi-step Forming Process

    NASA Astrophysics Data System (ADS)

    Anderson, M.; Bridier, F.; Gholipour, J.; Jahazi, M.; Wanjara, P.; Bocher, P.; Savoie, J.

    2016-04-01

    This paper examines the metallurgical evolution of AISI Stainless Steel 321 (SS 321) during multi-step forming, a process that involves cycles of deformation with intermediate heat treatment steps. The multi-step forming process was simulated by implementing interrupted uniaxial tensile testing experiments. The evolution of the mechanical properties as well as of microstructural features, such as twins and the textures of the austenite and martensite phases, was studied as a function of the multi-step forming process. The characteristics of the Strain-Induced Martensite (SIM) were also documented for each deformation step and intermediate stress relief heat treatment. The results indicated that the intermediate heat treatments considerably increased the formability of SS 321. Texture analysis showed that the effect of the intermediate heat treatment on the austenite was minor and led to partial recrystallization, while deformation was observed to reinforce the crystallographic texture of the austenite. For the SIM, an Olson-Cohen-type equation was identified to analytically predict its formation during the multi-step forming process. The generated SIM was textured, and its texture weakened with increasing deformation.
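
    The Olson-Cohen model referred to above has the closed form f = 1 − exp(−β·(1 − exp(−α·ε))^n), giving the martensite volume fraction f as a function of plastic strain ε. A minimal sketch follows; the parameter values are illustrative, not the values fitted to SS 321 in the paper:

```python
import math

def olson_cohen(strain, alpha=6.0, beta=1.5, n=4.5):
    """Olson-Cohen martensite fraction vs. plastic strain.
    alpha: shear-band formation rate; beta: probability that a
    shear-band intersection nucleates martensite; n: fixed exponent
    (~4.5 in the original model). Parameters here are illustrative."""
    shear_bands = 1.0 - math.exp(-alpha * strain)
    return 1.0 - math.exp(-beta * shear_bands**n)

for eps in (0.05, 0.10, 0.20, 0.40):
    print(f"strain {eps:.2f} -> martensite fraction {olson_cohen(eps):.3f}")
```

    The sigmoidal shape (slow nucleation at low strain, then rapid growth, then saturation) is the behavior such equations are fitted to reproduce.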

  17. [The centralized mycobacteriological laboratory is a necessary component of a phthisiological service in large towns of Russia].

    PubMed

    Dorozhkova, I R; Freĭman, G E; Moroz, A M

    2007-01-01

    The paper presents the main points of the authors' own concept of the centralization of the mycobacteriological service in large towns of the Russian Federation. The step-by-step organizational and methodological measures required to solve this problem are described in detail. Consecutive measures to realize the proposed mycobacteriological service centralization model, begun in January 2004 using the Moscow Eastern Administrative District (1,380,000 inhabitants) as a model, are described.

  18. Room-Temperature Fabricated Thin-Film Transistors Based on Compounds with Lanthanum and Main Family Element Boron.

    PubMed

    Xiao, Peng; Huang, Junhua; Dong, Ting; Xie, Jianing; Yuan, Jian; Luo, Dongxiang; Liu, Baiquan

    2018-06-06

    For the first time, compounds of lanthanum and the main-group element boron (LaB x ) were investigated as an active layer for thin-film transistors (TFTs). Detailed studies showed that the room-temperature-fabricated LaB x thin film was in a crystalline state with a relatively narrow optical band gap of 2.28 eV. The La/B atomic ratio was related to the working pressure during the sputtering process and increased with increasing working pressure, which results in more free electrons in the LaB x thin film. A LaB x -TFT without any intentional annealing steps exhibited a saturation mobility of 0.44 cm²·V −1 ·s −1 , a subthreshold swing ( SS ) of 0.26 V/decade and an I on / I off ratio larger than 10⁴. The room-temperature process is attractive for its compatibility with almost all kinds of flexible substrates, and the LaB x semiconductor may be a new choice for channel materials in TFTs.
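
    The reported figures of merit come straight from a transfer curve; a hedged sketch with synthetic bias points (the paper's raw I-V data are not given here) shows how SS and the on/off ratio are computed:

```python
import math

def subthreshold_swing(v1, i1, v2, i2):
    """SS in V/decade from two bias points in the subthreshold region:
    SS = dVg / d(log10 Id)."""
    return (v2 - v1) / (math.log10(i2) - math.log10(i1))

def on_off_ratio(i_on, i_off):
    """Ion/Ioff ratio from on- and off-state drain currents."""
    return i_on / i_off

# Synthetic points chosen so SS matches the reported 0.26 V/decade;
# current values are illustrative, not measured data.
ss = subthreshold_swing(0.00, 1e-10, 0.26, 1e-9)
ratio = on_off_ratio(1e-5, 1e-10)
print(f"SS = {ss:.2f} V/decade, Ion/Ioff = {ratio:.0e}")
```
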

  19. Endoplasmic reticulum stress and proteasomal system in amyotrophic lateral sclerosis.

    PubMed

    Karademir, Betul; Corek, Ceyda; Ozer, Nesrin Kartal

    2015-11-01

    Protein processing, including folding, unfolding and degradation, is involved in the mechanisms of many diseases. The unfolded protein response and/or endoplasmic reticulum stress are accepted to be the first steps, which should be resolved via protein degradation. In this context, the proteasomal system and autophagy play important roles as the degradation pathways and are controlled via complex mechanisms. Amyotrophic lateral sclerosis is a multifactorial neurodegenerative disease, also known as one of the most catastrophic. Mutations of many different genes are involved in the pathogenesis, such as superoxide dismutase 1, chromosome 9 open reading frame 72 and ubiquilin 2. These genes are mainly related to antioxidant defense systems, endoplasmic reticulum stress-related proteins, and protein aggregation and degradation pathways, and therefore their mutation causes the related disorders. This review focuses on the role of protein processing via the endoplasmic reticulum and the proteasomal system, the main players in the pathology of amyotrophic lateral sclerosis. In this context, dysfunction of endoplasmic reticulum-associated degradation and the related cell death mechanisms, autophagy and apoptosis, are detailed. Copyright © 2015 Elsevier Inc. All rights reserved.

  20. Physico-Chemical Alternatives in Lignocellulosic Materials in Relation to the Kind of Component for Fermenting Purposes

    PubMed Central

    Coz, Alberto; Llano, Tamara; Cifrián, Eva; Viguri, Javier; Maican, Edmond; Sixta, Herbert

    2016-01-01

    The complete bioconversion of the carbohydrate fraction is of great importance for a lignocellulosic-based biorefinery. However, due to the structure of lignocellulosic materials, and depending basically on the main parameters within the pretreatment steps, numerous byproducts are generated, and they act as inhibitors in the fermentation operations. In this sense, the impact of inhibitory compounds derived from lignocellulosic materials is one of the major challenges for a sustainable biomass-to-biofuel and -bioproduct industry. To minimise the negative effects of these compounds, numerous methodologies have been tested, including physical, chemical, and biological processes. The main physical and chemical treatments are studied in this work in relation to the lignocellulosic material and the inhibitor in order to identify the best mechanisms for fermenting purposes. In addition, special attention is paid to lignocellulosic hydrolysates obtained by chemical processes with SO2, due to the complex matrix of these materials and the growing use of these methodologies in future biorefinery markets. Recommendations for different detoxification methods are given. PMID:28773700

  1. Computer Based Procedures for Field Workers - FY16 Research Activities

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Oxstrand, Johanna; Bly, Aaron

    The Computer-Based Procedure (CBP) research effort is a part of the Light-Water Reactor Sustainability (LWRS) Program, which provides the technical foundations for licensing and managing the long-term, safe, and economical operation of current nuclear power plants. One of the primary missions of the LWRS program is to help the U.S. nuclear industry adopt new technologies and engineering solutions that facilitate the continued safe operation of the plants and extension of the current operating licenses. One area that could yield tremendous savings in increased efficiency and safety is improving procedure use. A CBP provides the opportunity to incorporate context-driven job aids, such as drawings, photos, and just-in-time training. The presentation of information in CBPs can be much more flexible and tailored to the task, the actual plant condition, and the operation mode. The dynamic presentation of the procedure guides the user down the path of relevant steps, thus minimizing the time the field worker spends evaluating plant conditions and decisions related to the applicability of each step. This dynamic presentation also minimizes the risk of conducting steps out of order and/or incorrectly assessing the applicability of steps. This report provides a summary of the main research activities conducted in the Computer-Based Procedures for Field Workers effort since 2012. The main focus of the report is on the research activities conducted in fiscal year 2016. The activities discussed are the Nuclear Electronic Work Packages – Enterprise Requirements initiative, the development of design guidance for CBPs (which compiles all insights gained through the years of CBP research), the facilitation of vendor studies at the Idaho National Laboratory (INL) Advanced Test Reactor (ATR), a pilot study on how to enhance the plant design modification work process, the collection of feedback from a field evaluation study at Plant Vogtle, and the path forward to commercialize INL's CBP system.

  2. Separation process using pervaporation and dephlegmation

    DOEpatents

    Vane, Leland M.; Mairal, Anurag P.; Ng, Alvin; Alvarez, Franklin R.; Baker, Richard W.

    2004-06-29

    A process for treating liquids containing organic compounds and water. The process includes a pervaporation step in conjunction with a dephlegmation step to treat at least a portion of the permeate vapor from the pervaporation step. The process yields a membrane residue stream, a stream enriched in the more volatile component (usually the organic) as the overhead stream from the dephlegmator and a condensate stream enriched in the less volatile component (usually the water) as a bottoms stream from the dephlegmator. Any of these may be the principal product of the process. The membrane separation step may also be performed in the vapor phase, or by membrane distillation.

  3. [Technology transfer between academic laboratories and industrial laboratories: licensing].

    PubMed

    Salauze, D

    2010-09-01

    The time when academic and industrial research operated in two separate worlds is now over. Technology transfer from one to the other is now frequent and organized. It starts with filing a patent. Indeed, given the amounts at stake in developing a product, especially in the healthcare field, an invention not protected by a patent has virtually no chance of eventually reaching the community. But this is only the first step of a long process that continues with licensing deals, whose main common clauses we will examine. Copyright © 2010 Elsevier Masson SAS. All rights reserved.

  4. Single-Run Single-Mask Inductively-Coupled-Plasma Reactive-Ion-Etching Process for Fabricating Suspended High-Aspect-Ratio Microstructures

    NASA Astrophysics Data System (ADS)

    Yang, Yao-Joe; Kuo, Wen-Cheng; Fan, Kuang-Chao

    2006-01-01

    In this work, we present a single-run single-mask (SRM) process for fabricating suspended high-aspect-ratio structures on standard silicon wafers using an inductively coupled plasma-reactive ion etching (ICP-RIE) etcher. This process eliminates extra fabrication steps which are required for structure release after trench etching. Released microstructures with 120 μm thickness are obtained by this process. The corresponding maximum aspect ratio of the trench is 28. The SRM process is an extended version of the standard process proposed by BOSCH GmbH (BOSCH process). The first step of the SRM process is a standard BOSCH process for trench etching, then a polymer layer is deposited on trench sidewalls as a protective layer for the subsequent structure-releasing step. The structure is released by dry isotropic etching after the polymer layer on the trench floor is removed. All the steps can be integrated into a single-run ICP process. Also, only one mask is required. Therefore, the process complexity and fabrication cost can be effectively reduced. Discussions on each SRM step and considerations for avoiding undesired etching of the silicon structures during the release process are also presented.

  5. Effect of the processing steps on compositions of table olive since harvesting time to pasteurization.

    PubMed

    Nikzad, Nasim; Sahari, Mohammad A; Vanak, Zahra Piravi; Safafar, Hamed; Boland-nazar, Seyed A

    2013-08-01

    Weight, oil, fatty acid, tocopherol, polyphenol, and sterol properties of 5 olive cultivars (Zard, Fishomi, Ascolana, Amigdalolia, and Conservalia) during the crude, lye treatment, washing, fermentation, and pasteurization steps were studied. Results showed that oil percentage was highest in Ascolana (crude step) and lowest in Fishomi (pasteurization step); during the processing steps, in all cultivars, oleic, palmitic, linoleic, and stearic acids predominated; the largest changes in saturated and unsaturated fatty acids occurred in the fermentation step; the highest and lowest ω3/ω6 ratios were in Ascolana (washing step) and Zard (pasteurization step), respectively; tocopherol was highest in Amigdalolia and lowest in Fishomi, with the major damage occurring in the lye step; polyphenols were highest in Ascolana (crude step) and lowest in Zard and Ascolana (pasteurization step), with the major damage across cultivars occurring during the lye step, in which the polyphenol content was reduced to one-tenth of its initial value; sterols did not change during the steps. A review of olive patents shows that many fruit compositions, such as oil quality and the quantities and fractions of fatty acids, can be changed by altering the cultivar and the process.

  6. Adaptive multiresolution modeling of groundwater flow in heterogeneous porous media

    NASA Astrophysics Data System (ADS)

    Malenica, Luka; Gotovac, Hrvoje; Srzic, Veljko; Andric, Ivo

    2016-04-01

    The proposed methodology was originally developed by our scientific team in Split, who designed a multiresolution approach for analyzing flow and transport processes in highly heterogeneous porous media. The main properties of the adaptive Fup multiresolution approach are: 1) the computational capabilities of Fup basis functions with compact support, capable of resolving all spatial and temporal scales; 2) multiresolution presentation of the heterogeneity as well as of all other input and output variables; 3) an accurate, adaptive and efficient strategy; and 4) semi-analytical properties which increase our understanding of usually complex flow and transport processes in porous media. The main computational idea behind this approach is to separately find the minimum number of basis functions and resolution levels necessary to describe each flow and transport variable with the desired accuracy on a particular adaptive grid. Therefore, each variable is analyzed separately, and the adaptive and multi-scale nature of the methodology enables not only computational efficiency and accuracy, but also describes subsurface processes in a way closely tied to their physical interpretation. The methodology inherently supports a mesh-free procedure, avoiding classical numerical integration, and yields continuous velocity and flux fields, which is vitally important for flow and transport simulations. In this paper, we show recent improvements within the proposed methodology. Since state-of-the-art multiresolution approaches usually use the method of lines and only a spatially adaptive procedure, the temporal approximation has rarely been treated as multiscale. Therefore, a novel adaptive implicit Fup integration scheme is developed, resolving all time scales within each global time step: the algorithm uses smaller time steps only along lines where the solution changes rapidly. Application of Fup basis functions enables continuous time approximation, simple interpolation across different temporal lines, and local time-stepping control. A critical aspect of time integration accuracy is the construction of the spatial stencil for accurate calculation of spatial derivatives. Whereas the common approach for wavelets and splines uses a finite difference operator, we develop here a collocation one that includes solution values and the differential operator. In this way, the new improved algorithm is adaptive in space and time, enabling accurate solutions for groundwater flow problems, especially in highly heterogeneous porous media with large lnK variances and different correlation length scales. In addition, differences between the collocation and finite volume approaches are discussed. Finally, results show the application of the methodology to groundwater flow problems in highly heterogeneous confined and unconfined aquifers.
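
    The local time-stepping idea (smaller steps only where the solution changes rapidly) can be illustrated with classic step-doubling error control on a model ODE; this is a generic sketch of adaptive time stepping, not the implicit Fup scheme itself:

```python
def adaptive_euler(f, t0, t1, y0, tol=1e-4, h0=0.1):
    """Integrate y' = f(t, y) with step doubling: compare one full
    Euler step against two half steps and shrink/grow h based on the
    estimated local error."""
    t, y, h = t0, y0, h0
    ts, ys = [t], [y]
    while t < t1:
        h = min(h, t1 - t)
        full = y + h * f(t, y)
        half = y + 0.5 * h * f(t, y)
        two_half = half + 0.5 * h * f(t + 0.5 * h, half)
        err = abs(two_half - full)
        if err <= tol or h < 1e-8:        # accept the step
            t, y = t + h, two_half
            ts.append(t)
            ys.append(y)
            if err < tol / 4.0:           # solution smooth: grow the step
                h *= 2.0
        else:                             # solution changing fast: shrink
            h *= 0.5
    return ts, ys

# Stiff-ish decay: small steps early, larger steps once y flattens.
ts, ys = adaptive_euler(lambda t, y: -5.0 * y, 0.0, 2.0, 1.0)
```

    The step sizes adapt automatically: they shrink where the exponential decays steeply and grow once the solution flattens, mirroring the "smaller time steps only along lines where the solution changes rapidly" behavior described above.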

  7. Radioactive characterization of the main materials involved in the titanium dioxide production process and their environmental radiological impact.

    PubMed

    Mantero, J; Gazquez, M J; Bolivar, J P; Garcia-Tenorio, R; Vaca, F

    2013-06-01

    A study of the distribution of several radionuclides from the uranium and thorium series along the production process of a typical NORM industry devoted to the production of titanium dioxide has been performed. To this end, the activity concentrations in raw materials, final product, co-products, and wastes of the production process have been determined by both gamma-ray and alpha-particle spectrometry. The main raw material used in the studied process (ilmenite) presents activity concentrations of around 300 Bq kg(-1) for Th-series radionuclides and 100 Bq kg(-1) for U-series ones. In the industrial process these radionuclides are distributed among the different steps of the production process mostly according to the chemical behaviour of each radioelement, following different routes. As an example, most of the radium remains associated with the undissolved material waste, with activity concentrations of around 3 kBq kg(-1) of (228)Ra and around 1 kBq kg(-1) of (226)Ra, while the final commercial products (TiO2 pigments and co-products) contain negligible amounts of radioactivity. The results obtained have allowed assessing the possible public radiological impact associated with the use of the products and co-products obtained in this type of industry, as well as the environmental radiological impact associated with the solid residues and the generated liquid discharges. Copyright © 2013 Elsevier Ltd. All rights reserved.

  8. An updated review of dietary isoflavones: Nutrition, processing, bioavailability and impacts on human health.

    PubMed

    Zaheer, Khalid; Humayoun Akhtar, M

    2017-04-13

    Isoflavones (genistein, daidzein, and glycitein) are bioactive compounds with mildly estrogenic properties, often referred to as phytoestrogens. They are present in significant quantities (up to 4-5 mg·g⁻¹ on a dry basis) in legumes, mainly soybeans, green beans and mung beans. In grains (raw materials) they are present mostly as glycosides, which are poorly absorbed on consumption. Thus, soybeans are processed into various food products to improve digestibility, taste and the bioavailability of nutrients and bioactives. The main processing steps include steaming, cooking, roasting and microbial fermentation, which destroy protease inhibitors and also cleave the glycoside bond to yield the absorbable aglycone in processed soy products such as miso, natto, soy milk and tofu, and which increase shelf life. Processed soy food products have been an integral part of regular diets in many Asia-Pacific countries, e.g. China, Japan and Korea, for centuries. In the last two decades, however, there have been concerted efforts to introduce soy products into Western diets for their health benefits, with some success. Isoflavones have been hailed as a magical natural component credited with preventing some major prevailing health concerns. Consumption of soy products has been linked to reduction in the incidence or severity of chronic diseases such as cardiovascular disease, breast and prostate cancers, menopausal symptoms and bone loss. Overall, consuming moderate amounts of traditionally prepared and minimally processed soy foods may offer modest health benefits while minimizing the potential for any adverse health effects.

  9. Manufacturing Process Selection of Composite Bicycle’s Crank Arm using Analytical Hierarchy Process (AHP)

    NASA Astrophysics Data System (ADS)

    Luqman, M.; Rosli, M. U.; Khor, C. Y.; Zambree, Shayfull; Jahidi, H.

    2018-03-01

    The crank arm is one of the important parts of a bicycle and is expensive due to the high cost of material and the production process. This research aims to investigate potential manufacturing processes for fabricating a composite bicycle crank arm, and to describe an approach based on the analytical hierarchy process (AHP) that assists decision makers or manufacturing engineers in determining the most suitable process to employ at the early stage of product development, in order to reduce production cost. Four types of processes were considered, namely resin transfer molding (RTM), compression molding (CM), vacuum bag molding and filament winding (FW). The analysis ranks these four processes for their suitability in manufacturing a bicycle crank arm based on five main selection factors and 10 sub-factors. The right manufacturing process was determined by following the AHP steps. A consistency test was performed to make sure the judgements were consistent during the comparison. The results indicated that compression molding was the most appropriate manufacturing process because it had the highest value (33.6%) among the manufacturing processes.
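
    The AHP steps (pairwise comparison, priority weights, consistency check) can be sketched as follows; the 3x3 comparison matrix here is hypothetical, not the paper's actual judgements over its five factors and ten sub-factors:

```python
import math

def ahp_priorities(matrix):
    """Priority weights from a pairwise-comparison matrix via the
    geometric-mean method, plus Saaty's consistency ratio (CR)."""
    n = len(matrix)
    gm = [math.prod(row) ** (1.0 / n) for row in matrix]
    total = sum(gm)
    w = [g / total for g in gm]
    # lambda_max: average of (A w)_i / w_i over all rows.
    aw = [sum(matrix[i][j] * w[j] for j in range(n)) for i in range(n)]
    lam = sum(aw[i] / w[i] for i in range(n)) / n
    ci = (lam - n) / (n - 1)
    ri = {1: 0.0, 2: 0.0, 3: 0.58, 4: 0.90, 5: 1.12}[n]  # random index
    cr = ci / ri if ri else 0.0
    return w, cr

# Hypothetical comparison of cost vs. quality vs. cycle time:
A = [[1.0, 3.0, 5.0],
     [1/3, 1.0, 3.0],
     [1/5, 1/3, 1.0]]
w, cr = ahp_priorities(A)
print("weights:", [round(x, 3) for x in w], "CR:", round(cr, 3))
```

    A CR below 0.1 is the usual threshold for accepting the judgements as consistent, which is the role of the consistency test mentioned in the abstract.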

  10. Remote Sensing Image Quality Assessment Experiment with Post-Processing

    NASA Astrophysics Data System (ADS)

    Jiang, W.; Chen, S.; Wang, X.; Huang, Q.; Shi, H.; Man, Y.

    2018-04-01

    This paper briefly describes a post-processing influence assessment experiment that includes three steps: physical simulation, image processing, and image quality assessment. The physical simulation models a sampled imaging system in the laboratory; the imaging system parameters are measured, and the digital images serving as image-processing input are produced by this imaging system with those parameters. The gathered optically sampled images are processed by 3 digital image processes: calibration pre-processing, lossy compression with different compression ratios, and image post-processing with different kernels. The image quality assessment method used is just-noticeable-difference (JND) subjective assessment based on ISO 20462; through subjective assessment of the gathered and processed images, the influence of different imaging parameters and of post-processing on image quality can be found. The six JND subjective assessment data sets validate each other. The main conclusions are: image post-processing can improve image quality; it can improve image quality even with lossy compression, although image quality at higher compression ratios improves less than at lower ratios; and with our image post-processing method, image quality is better when the camera MTF is within a small range.

  11. The time course of reading processes in children with and without dyslexia: an ERP study

    PubMed Central

    Hasko, Sandra; Groth, Katarina; Bruder, Jennifer; Bartling, Jürgen; Schulte-Körne, Gerd

    2013-01-01

    The main diagnostic criterion for developmental dyslexia (DD) in transparent orthographies is a remarkable reading speed deficit, often accompanied by spelling difficulties. These deficits have been traced back to deficits in both orthographic and phonological processing. For a better understanding of the reading speed deficit in DD it is necessary to clarify which processing steps are degraded in children with DD during reading. To address this question the present study used EEG to investigate three reading-related ERPs: the N170, N400 and LPC. Twenty-nine children without DD and 52 children with DD performed a phonological lexical decision (PLD) task, which tapped both orthographic and phonological processing. Children were presented with words, pseudohomophones, pseudowords and false fonts and had to decide whether the presented stimulus sounded like an existing German word or not. Compared to control children, children with DD showed deficits in all the investigated ERPs. Firstly, a diminished mean area under the curve for the word-material/false-font contrasts in the time window of the N170 was observed, indicating a reduced degree of print sensitivity; secondly, N400 amplitudes, suggested to reflect access to the orthographic lexicon and grapheme-phoneme conversion, were attenuated; and lastly, phonological access as indexed by the LPC was degraded in children with DD. In children without DD, processing differences dependent on the linguistic material were observed only in the LPC, suggesting that similar reading processes were adopted independent of orthographic familiarity. The results of this study suggest that effective treatment should include both orthographic and phonological training. Furthermore, more longitudinal studies utilizing the same task and stimuli are needed to clarify how these processing steps and their time course change during reading development. PMID:24109444

  12. Surface Modified Particles By Multi-Step Addition And Process For The Preparation Thereof

    DOEpatents

    Cook, Ronald Lee; Elliott, Brian John; Luebben, Silvia DeVito; Myers, Andrew William; Smith, Bryan Matthew

    2006-01-17

    The present invention relates to a new class of surface modified particles and to a multi-step surface modification process for the preparation of the same. The multi-step surface functionalization process involves two or more reactions to produce particles that are compatible with various host systems and/or to provide the particles with particular chemical reactivities. The initial step comprises the attachment of a small organic compound to the surface of the inorganic particle. The subsequent steps attach additional compounds to the previously attached organic compounds through organic linking groups.

  13. Homojunction silicon solar cells doping by ion implantation

    NASA Astrophysics Data System (ADS)

    Milési, Frédéric; Coig, Marianne; Lerat, Jean-François; Desrues, Thibaut; Le Perchec, Jérôme; Lanterne, Adeline; Lachal, Laurent; Mazen, Frédéric

    2017-10-01

    Production costs and energy efficiency are the main priorities for the photovoltaic (PV) industry (COP21 conclusions). To lower costs and increase efficiency, we propose reducing the number of processing steps involved in the manufacture of N-type Passivated Rear Totally Diffused (PERT) silicon solar cells. Replacing the conventional thermal diffusion doping steps by ion implantation followed by thermal annealing reduces the number of steps from 7 to 3 while maintaining similar efficiency. This alternative approach was investigated in the present work. Beamline and plasma immersion ion implantation (BLII and PIII) methods were used to insert n-type (phosphorus) and p-type (boron) dopants into the Si substrate. With higher throughput and lower costs, PIII is a better candidate for the photovoltaic industry than BLII; however, the optimization of the plasma conditions is demanding and more complex than for the beamline approach. Subsequent annealing was performed on selected samples to activate the dopants on both sides of the solar cell. Two annealing methods were investigated: soak and spike thermal annealing. The best-performing solar cells, showing a PV efficiency of about 20%, were obtained using spike annealing with adapted ion implantation conditions.

  14. A new model in achieving Green Accounting at hotels in Bali

    NASA Astrophysics Data System (ADS)

    Astawa, I. P.; Ardina, C.; Yasa, I. M. S.; Parnata, I. K.

    2018-01-01

    The concept of green accounting is debated in terms of its implementation in a company. Previous studies indicate that there is no standard model for its implementation in support of performance. This research aims to create a green accounting model that differs from other models by using local cultural elements as the variables in building it. The research was conducted in two steps. The first step was designing the model based on theoretical studies, considering the main and supporting elements in building the concept of green accounting. The second step was testing the model at 60 five-star hotels, starting with data collection through a questionnaire, followed by data processing using descriptive statistics. The results indicate that the hotel owners have implemented green accounting attributes, which supports previous studies. Another result, a new finding, shows that the presence of local culture, government regulation, and the awareness of hotel owners play an important role in the development of the green accounting concept. The results of the research contribute to accounting science in terms of green reporting. Hotel management should adopt local culture in building the character of the accountants hired in the accounting department.

  15. Scaled-up production of poacic acid, a plant-derived antifungal agent

    DOE PAGES

    Yue, Fengxia; Gao, Ruili; Piotrowski, Jeff S.; ...

    2017-09-01

    Poacic acid, a decarboxylated product from 8–5-diferulic acid that is commonly found in monocot lignocellulosic hydrolysates, has been identified as a natural antifungal agent against economically significant fungi and oomycete plant pathogens. Starting from commercially available or monocot-derivable ferulic acid, a three-step synthetic procedure has been developed for the production of the poacic acid needed for field testing in a controlled agricultural setting. First, ferulic acid was esterified to produce ethyl ferulate in 92% yield. Second, peroxidase-catalyzed free radical dehydrodimerization of ethyl ferulate produced crude diferulates, mainly 8–5-diferulate, in 91% yield. Finally, crystalline poacic acid was obtained in 25% yield via alkaline hydrolysis of the crude diferulates after purification by flash-column chromatography. Thus, this new procedure offers two key improvements relevant to large-scale production: 1) bubbling air through the reaction mixture in the second step to remove acetone greatly improves the recovery efficiency of the crude diferulates; and 2) telescoping minor impurities directly into the alkaline hydrolysis step eliminates the need for additional column purifications, thus reducing the overall cost of production and removing a major impediment to process scale-up.
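
    The overall yield of a multi-step route is the product of the per-step yields; a quick check using the yields stated in the abstract:

```python
# Per-step yields reported in the abstract:
yields = {"esterification": 0.92,
          "dehydrodimerization": 0.91,
          "alkaline hydrolysis": 0.25}

overall = 1.0
for step, y in yields.items():
    overall *= y   # yields compound multiplicatively

print(f"overall yield = {overall:.1%}")   # about 21% over three steps
```

    This makes the final hydrolysis (25%) the clear bottleneck of the route, consistent with the abstract's focus on improving recovery in the later steps.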

  17. Significant improvements of electrical discharge machining performance by step-by-step updated adaptive control laws

    NASA Astrophysics Data System (ADS)

    Zhou, Ming; Wu, Jianyang; Xu, Xiaoyi; Mu, Xin; Dou, Yunping

    2018-02-01

    To obtain improved electrical discharge machining (EDM) performance, we have spent more than a decade correcting one essential EDM defect, the weak stability of the machining process, by developing adaptive control systems. Machining instabilities are mainly caused by complicated disturbances in discharging. To counteract the effects of these disturbances on machining, we theoretically developed three control laws: a minimum variance (MV) control law, a coupled minimum variance and pole placement (MVPPC) control law, and a two-step-ahead prediction (TP) control law. Based on real-time estimation of the EDM process model parameters and the measured ratio of arcing pulses, also called the gap state, the electrode discharging cycle was directly and adaptively tuned so that stable machining could be achieved. We thus not only provide three theoretically proven control laws for the developed EDM adaptive control system, but also show in practice that the TP control law deals best with machining instability and machining efficiency, although the MVPPC control law already provided much better EDM performance than the MV control law. The TP control law also provided burn-free machining.
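The core feedback idea, tuning the discharging cycle from the measured gap state, can be sketched as a simple proportional update. This is an illustrative simplification, not the authors' MV/MVPPC/TP laws (which rely on online-estimated process model parameters not given in the abstract); the target ratio, gain, and bounds are assumptions:

```python
# Simplified sketch of gap-state feedback, not the actual MV/MVPPC/TP laws.
# Idea: lengthen the electrode off-time when the measured ratio of arcing
# pulses (gap state) exceeds a target, shorten it when the gap is stable.

def update_off_time(off_time_us: float, arcing_ratio: float,
                    target_ratio: float = 0.1, gain: float = 50.0) -> float:
    """Proportional adjustment of the electrode off-time in microseconds."""
    error = arcing_ratio - target_ratio
    new_off_time = off_time_us + gain * error
    # Clamp to a plausible operating window (assumed bounds).
    return max(2.0, min(new_off_time, 100.0))

# Unstable gap (many arcing pulses) -> off-time increases
print(f"{update_off_time(10.0, arcing_ratio=0.4):.1f}")  # 25.0
# Stable gap -> off-time decreases
print(f"{update_off_time(10.0, arcing_ratio=0.0):.1f}")  # 5.0
```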

  18. Detection and localization of damage using empirical mode decomposition and multilevel support vector machine

    NASA Astrophysics Data System (ADS)

    Dushyanth, N. D.; Suma, M. N.; Latte, Mrityanjaya V.

    2016-03-01

    Damage in a structure can incur significant maintenance costs and serious safety problems, so detecting damage at an early stage is of prime importance. The main contribution of this investigation is a generic, optimal methodology to improve the accuracy of locating a flaw in a structure. This novel approach involves a two-step process. The first step extracts damage-sensitive features from the received signal; these extracted features, often termed the damage index or damage indices, indicate whether damage is present. In particular, a multilevel SVM (support vector machine) plays a vital role in distinguishing faulty from healthy structures. Once a structure is identified as damaged, the second step locates the damage using the Hilbert-Huang transform. The proposed algorithm has been evaluated in both simulations and experimental tests on a 6061 aluminum plate with dimensions 300 mm × 300 mm × 5 mm, yielding considerable improvement in the accuracy of estimating the position of the flaw.
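The first step of the approach, a damage index that flags a faulty structure, can be illustrated with a toy energy-based index. The signals, attenuation factor, and threshold below are invented; the paper's actual pipeline (empirical mode decomposition, a multilevel SVM, and the Hilbert-Huang transform) is not reproduced here:

```python
# Toy sketch of step 1 (detection via a damage index); all values invented.
import math

def damage_index(baseline: list, measured: list) -> float:
    """Normalized energy deviation between a healthy baseline signal and a
    measured signal; larger values suggest damage."""
    e_base = sum(x * x for x in baseline)
    e_meas = sum(x * x for x in measured)
    return abs(e_meas - e_base) / e_base

baseline = [math.sin(0.1 * n) for n in range(100)]     # healthy response
damaged = [0.7 * x for x in baseline]                  # attenuated, as if scattered by a flaw

di = damage_index(baseline, damaged)
print(f"damage index = {di:.2f}, damaged = {di > 0.2}")  # threshold assumed
```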

  19. Data-based control of a multi-step forming process

    NASA Astrophysics Data System (ADS)

    Schulte, R.; Frey, P.; Hildenbrand, P.; Vogel, M.; Betz, C.; Lechner, M.; Merklein, M.

    2017-09-01

    The fourth industrial revolution represents a new stage in the organization and management of the entire value chain. In the field of forming technology, however, it has so far arrived only gradually. To make a valuable contribution to the digital factory, the control of a multistage forming process was investigated. Within the framework of the investigation, an abstracted and transferable model is used to outline which data have to be collected, how a practical interface between the different forming machines can be designed, and which control tasks must be fulfilled. The goal of this investigation was to control the subsequent process step based on the data recorded in the first step. The investigated process chain links various metal forming processes that are typical elements of a multi-step forming process. Data recorded in the first step of the process chain are analyzed and processed to improve the control of the subsequent process. On the basis of the knowledge gained, forming operations can be made more robust and at the same time more flexible, creating the foundation for linking various production processes efficiently.
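The control idea above, feeding a measurement from the first forming step forward to adjust the next step, can be sketched abstractly. The quantities and the linear compensation rule are illustrative assumptions, since the abstract does not specify the measured variables or control law:

```python
# Abstracted sketch of feed-forward control between two forming steps;
# measured quantities, sensitivity, and the linear rule are assumptions.

def step2_setpoint(nominal_stroke_mm: float, measured_thickness_mm: float,
                   target_thickness_mm: float, sensitivity: float = 2.0) -> float:
    """Adjust the second step's stroke to compensate a thickness deviation
    recorded after the first step (assumed linear relation)."""
    deviation = measured_thickness_mm - target_thickness_mm
    return nominal_stroke_mm - sensitivity * deviation

# Part came out 0.05 mm too thick after step 1 -> step-2 stroke is reduced
print(f"{step2_setpoint(12.0, measured_thickness_mm=1.05, target_thickness_mm=1.00):.1f}")
```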

  20. Oxidation mechanism of diethyl ether: a complex process for a simple molecule.

    PubMed

    Di Tommaso, Stefania; Rotureau, Patricia; Crescenzi, Orlando; Adamo, Carlo

    2011-08-28

    A large number of organic compounds, such as ethers, spontaneously form unstable peroxides through a self-propagating process of autoxidation (peroxidation). Although the hazards of organic peroxides are well known, the oxidation mechanisms reported in the literature for peroxidizable compounds like ethers are vague and often based on old experiments carried out under very different conditions (e.g. atmospheric, combustion). To partially fill this gap, this paper presents an extensive Density Functional Theory (DFT) study of the autoxidation reaction of diethyl ether (DEE), a chemical that is widely used as a laboratory solvent and is considered responsible for various accidents. The aim of the work is to investigate the most probable reaction paths involved in the autoxidation process and to identify all potentially hazardous intermediates, such as peroxides. Beyond the determination of a complex oxidation mechanism for such a simple molecule, our results suggest that the two main reaction channels open in solution are the direct decomposition (β-scission) of the DEE radical formed in the initiation step and the isomerization of the peroxy radical formed upon oxygen attack (DEEOO˙). A simple kinetic evaluation of these two competing reaction channels hints that radical isomerization may play an unexpectedly important role in the global DEE oxidation process. Finally, industrial hazards could be related to hydroperoxide formation and accumulation during the chain propagation step. The resulting information may contribute to the understanding of the accidental risks associated with the use of diethyl ether.
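A "simple kinetic evaluation" of two competing channels typically means comparing Arrhenius rate constants. A minimal sketch of that comparison follows; the barrier heights and prefactors are placeholders chosen for illustration, not the paper's DFT values:

```python
# Illustrative Arrhenius comparison of two competing channels; barrier
# heights and prefactors are placeholders, not the paper's DFT results.
import math

R = 8.314  # gas constant, J/(mol*K)

def rate(prefactor: float, ea_kj_mol: float, temp_k: float) -> float:
    """Arrhenius rate constant k = A * exp(-Ea / (R*T))."""
    return prefactor * math.exp(-ea_kj_mol * 1000.0 / (R * temp_k))

T = 298.15
k_scission = rate(1e13, 80.0, T)  # direct beta-scission (placeholder Ea)
k_isomer = rate(1e11, 55.0, T)    # peroxy radical isomerization (placeholder Ea)

# Even with a smaller prefactor, the lower-barrier channel can dominate:
print(f"k_isomer / k_scission = {k_isomer / k_scission:.1f}")
```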
