Sample records for multi-step process involving

  1. Surface Modified Particles By Multi-Step Addition And Process For The Preparation Thereof

    DOEpatents

    Cook, Ronald Lee; Elliott, Brian John; Luebben, Silvia DeVito; Myers, Andrew William; Smith, Bryan Matthew

    2006-01-17

    The present invention relates to a new class of surface modified particles and to a multi-step surface modification process for the preparation of the same. The multi-step surface functionalization process involves two or more reactions to produce particles that are compatible with various host systems and/or to provide the particles with particular chemical reactivities. The initial step comprises the attachment of a small organic compound to the surface of the inorganic particle. The subsequent steps attach additional compounds to the previously attached organic compounds through organic linking groups.

  2. Mechanical and Metallurgical Evolution of Stainless Steel 321 in a Multi-step Forming Process

    NASA Astrophysics Data System (ADS)

    Anderson, M.; Bridier, F.; Gholipour, J.; Jahazi, M.; Wanjara, P.; Bocher, P.; Savoie, J.

    2016-04-01

This paper examines the metallurgical evolution of AISI Stainless Steel 321 (SS 321) during multi-step forming, a process that involves cycles of deformation with intermediate heat treatment steps. The multi-step forming process was simulated by implementing interrupted uniaxial tensile testing experiments. Evolution of the mechanical properties as well as the microstructural features, such as twins and textures of the austenite and martensite phases, was studied as a function of the multi-step forming process. The characteristics of the Strain-Induced Martensite (SIM) were also documented for each deformation step and intermediate stress relief heat treatment. The results indicated that the intermediate heat treatments considerably increased the formability of SS 321. Texture analysis showed that the effect of the intermediate heat treatment on the austenite was minor and led to partial recrystallization, while deformation was observed to reinforce the crystallographic texture of austenite. For the SIM, an Olson-Cohen-type equation was identified to analytically predict its formation during the multi-step forming process. The generated SIM was textured and weakened with increasing deformation.
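The Olson-Cohen relation referenced in this abstract predicts the strain-induced martensite fraction from accumulated strain. A minimal numerical sketch follows; the parameter values (alpha, beta, n) are illustrative assumptions, not values fitted to SS 321:

```python
import math

def olson_cohen(strain, alpha=6.0, beta=4.0, n=4.5):
    """Olson-Cohen strain-induced martensite (SIM) fraction.

    alpha: shear-band formation rate; beta: probability that a
    shear-band intersection nucleates martensite; n: fixed exponent.
    All parameter values here are illustrative, not fitted to SS 321.
    """
    shear_band_fraction = 1.0 - math.exp(-alpha * strain)
    return 1.0 - math.exp(-beta * shear_band_fraction ** n)

# The SIM fraction grows sigmoidally with accumulated strain
for eps in (0.05, 0.15, 0.30):
    print(f"strain {eps:.2f}: f_SIM = {olson_cohen(eps):.3f}")
```

In a multi-step schedule with intermediate stress-relief treatments, each deformation step would restart this curve from a partially reset state, which is one way such a model can be applied per step.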

  3. Surface Modified Particles By Multi-Step Michael-Type Addition And Process For The Preparation Thereof

    DOEpatents

    Cook, Ronald Lee; Elliott, Brian John; Luebben, Silvia DeVito; Myers, Andrew William; Smith, Bryan Matthew

    2005-05-03

A new class of surface modified particles and a multi-step Michael-type addition surface modification process for the preparation of the same is provided. The multi-step Michael-type addition surface modification process involves two or more reactions to compatibilize particles with various host systems and/or to provide the particles with particular chemical reactivities. The initial step comprises the attachment of a small organic compound to the surface of the inorganic particle. The subsequent steps attach additional compounds to the previously attached organic compounds through reactive organic linking groups. Specifically, these reactive groups are activated carbon-carbon pi bonds and carbon and non-carbon nucleophiles that react via Michael or Michael-type additions.

  4. A Systems Approach towards an Intelligent and Self-Controlling Platform for Integrated Continuous Reaction Sequences**

    PubMed Central

    Ingham, Richard J; Battilocchio, Claudio; Fitzpatrick, Daniel E; Sliwinski, Eric; Hawkins, Joel M; Ley, Steven V

    2015-01-01

    Performing reactions in flow can offer major advantages over batch methods. However, laboratory flow chemistry processes are currently often limited to single steps or short sequences due to the complexity involved with operating a multi-step process. Using new modular components for downstream processing, coupled with control technologies, more advanced multi-step flow sequences can be realized. These tools are applied to the synthesis of 2-aminoadamantane-2-carboxylic acid. A system comprising three chemistry steps and three workup steps was developed, having sufficient autonomy and self-regulation to be managed by a single operator. PMID:25377747

  5. Multi-step process for concentrating magnetic particles in waste sludges

    DOEpatents

    Watson, John L.

    1990-01-01

    This invention involves a multi-step, multi-force process for dewatering sludges which have high concentrations of magnetic particles, such as waste sludges generated during steelmaking. This series of processing steps involves (1) mixing a chemical flocculating agent with the sludge; (2) allowing the particles to aggregate under non-turbulent conditions; (3) subjecting the mixture to a magnetic field which will pull the magnetic aggregates in a selected direction, causing them to form a compacted sludge; (4) preferably, decanting the clarified liquid from the compacted sludge; and (5) using filtration to convert the compacted sludge into a cake having a very high solids content. Steps 2 and 3 should be performed simultaneously. This reduces the treatment time and increases the extent of flocculation and the effectiveness of the process. As partially formed aggregates with active flocculating groups are pulled through the mixture by the magnetic field, they will contact other particles and form larger aggregates. This process can increase the solids concentration of steelmaking sludges in an efficient and economic manner, thereby accomplishing either of two goals: (a) it can convert hazardous wastes into economic resources for recycling as furnace feed material, or (b) it can dramatically reduce the volume of waste material which must be disposed.

  6. Multi-step process for concentrating magnetic particles in waste sludges

    DOEpatents

    Watson, J.L.

    1990-07-10

    This invention involves a multi-step, multi-force process for dewatering sludges which have high concentrations of magnetic particles, such as waste sludges generated during steelmaking. This series of processing steps involves (1) mixing a chemical flocculating agent with the sludge; (2) allowing the particles to aggregate under non-turbulent conditions; (3) subjecting the mixture to a magnetic field which will pull the magnetic aggregates in a selected direction, causing them to form a compacted sludge; (4) preferably, decanting the clarified liquid from the compacted sludge; and (5) using filtration to convert the compacted sludge into a cake having a very high solids content. Steps 2 and 3 should be performed simultaneously. This reduces the treatment time and increases the extent of flocculation and the effectiveness of the process. As partially formed aggregates with active flocculating groups are pulled through the mixture by the magnetic field, they will contact other particles and form larger aggregates. This process can increase the solids concentration of steelmaking sludges in an efficient and economic manner, thereby accomplishing either of two goals: (a) it can convert hazardous wastes into economic resources for recycling as furnace feed material, or (b) it can dramatically reduce the volume of waste material which must be disposed. 7 figs.

  7. Self-regenerating column chromatography

    DOEpatents

    Park, Woo K.

    1995-05-30

    The present invention provides a process for treating both cations and anions by using a self-regenerating, multi-ionic exchange resin column system which requires no separate regeneration steps. The process involves alternating ion-exchange chromatography for cations and anions in a multi-ionic exchange column packed with a mixture of cation and anion exchange resins. The multi-ionic mixed-charge resin column works as a multi-function column, capable of independently processing either cationic or anionic exchange, or simultaneously processing both cationic and anionic exchanges. The major advantage offered by the alternating multi-function ion exchange process is the self-regeneration of the resins.

  8. 48 CFR 15.202 - Advisory multi-step process.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

... 48 Federal Acquisition Regulations System Advisory multi-step... Information 15.202 Advisory multi-step process. (a) The agency may publish a presolicitation notice (see 5.204... participate in the acquisition. This process should not be used for multi-step acquisitions where it would...

  9. Systematic assessment of benefits and risks: study protocol for a multi-criteria decision analysis using the Analytic Hierarchy Process for comparative effectiveness research

    PubMed Central

    Singh, Sonal

    2013-01-01

    Background: Regulatory decision-making involves assessment of risks and benefits of medications at the time of approval or when relevant safety concerns arise with a medication. The Analytic Hierarchy Process (AHP) facilitates decision-making in complex situations involving tradeoffs by considering risks and benefits of alternatives. The AHP allows a more structured method of synthesizing and understanding evidence in the context of importance assigned to outcomes. Our objective is to evaluate the use of an AHP in a simulated committee setting selecting oral medications for type 2 diabetes.  Methods: This study protocol describes the AHP in five sequential steps using a small group of diabetes experts representing various clinical disciplines. The first step will involve defining the goal of the decision and developing the AHP model. In the next step, we will collect information about how well alternatives are expected to fulfill the decision criteria. In the third step, we will compare the ability of the alternatives to fulfill the criteria and judge the importance of eight criteria relative to the decision goal of the optimal medication choice for type 2 diabetes. We will use pairwise comparisons to sequentially compare the pairs of alternative options regarding their ability to fulfill the criteria. In the fourth step, the scales created in the third step will be combined to create a summary score indicating how well the alternatives met the decision goal. The resulting scores will be expressed as percentages and will indicate the alternative medications' relative abilities to fulfill the decision goal. The fifth step will consist of sensitivity analyses to explore the effects of changing the estimates. We will also conduct a cognitive interview and process evaluation.  
Discussion: Multi-criteria decision analysis using the AHP will aid, support and enhance the ability of decision makers to make evidence-based informed decisions consistent with their values and preferences. PMID:24555077
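The pairwise-comparison and synthesis steps described in this protocol can be sketched compactly. The geometric-mean approximation of the priority vector and the three hypothetical criteria below are assumptions for illustration, not the authors' eight-criterion model:

```python
import math

def ahp_priorities(matrix):
    """Approximate AHP priority vector via the geometric-mean method.

    matrix[i][j] holds the judged importance of criterion i over
    criterion j on Saaty's 1-9 scale (matrix[j][i] = 1 / matrix[i][j]).
    """
    geo_means = [math.prod(row) ** (1.0 / len(row)) for row in matrix]
    total = sum(geo_means)
    return [g / total for g in geo_means]

# Hypothetical 3-criterion example: efficacy vs safety vs cost
judgements = [
    [1.0, 3.0, 5.0],   # efficacy moderately > safety, strongly > cost
    [1/3, 1.0, 3.0],
    [1/5, 1/3, 1.0],
]
weights = ahp_priorities(judgements)
print([round(w, 3) for w in weights])
```

The same procedure would score each alternative medication against each criterion; the criterion weights then combine those scores into the summary percentages described in the fourth step.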

  10. Systematic assessment of benefits and risks: study protocol for a multi-criteria decision analysis using the Analytic Hierarchy Process for comparative effectiveness research.

    PubMed

    Maruthur, Nisa M; Joy, Susan; Dolan, James; Segal, Jodi B; Shihab, Hasan M; Singh, Sonal

    2013-01-01

    Regulatory decision-making involves assessment of risks and benefits of medications at the time of approval or when relevant safety concerns arise with a medication. The Analytic Hierarchy Process (AHP) facilitates decision-making in complex situations involving tradeoffs by considering risks and benefits of alternatives. The AHP allows a more structured method of synthesizing and understanding evidence in the context of importance assigned to outcomes. Our objective is to evaluate the use of an AHP in a simulated committee setting selecting oral medications for type 2 diabetes.  This study protocol describes the AHP in five sequential steps using a small group of diabetes experts representing various clinical disciplines. The first step will involve defining the goal of the decision and developing the AHP model. In the next step, we will collect information about how well alternatives are expected to fulfill the decision criteria. In the third step, we will compare the ability of the alternatives to fulfill the criteria and judge the importance of eight criteria relative to the decision goal of the optimal medication choice for type 2 diabetes. We will use pairwise comparisons to sequentially compare the pairs of alternative options regarding their ability to fulfill the criteria. In the fourth step, the scales created in the third step will be combined to create a summary score indicating how well the alternatives met the decision goal. The resulting scores will be expressed as percentages and will indicate the alternative medications' relative abilities to fulfill the decision goal. The fifth step will consist of sensitivity analyses to explore the effects of changing the estimates. We will also conduct a cognitive interview and process evaluation.  Multi-criteria decision analysis using the AHP will aid, support and enhance the ability of decision makers to make evidence-based informed decisions consistent with their values and preferences.

  11. Multi-step high-throughput conjugation platform for the development of antibody-drug conjugates.

    PubMed

    Andris, Sebastian; Wendeler, Michaela; Wang, Xiangyang; Hubbuch, Jürgen

    2018-07-20

Antibody-drug conjugates (ADCs) form a rapidly growing class of biopharmaceuticals that has attracted considerable attention throughout the industry due to its high potential for cancer therapy. ADCs combine the specificity of a monoclonal antibody (mAb) with the cell-killing capacity of highly cytotoxic small-molecule drugs. Site-specific conjugation approaches involve a multi-step process for covalent linkage of antibody and drug via a linker. Despite the range of parameters that have to be investigated, high-throughput methods have so far been scarcely used in ADC development. In this work, an automated high-throughput platform for a site-specific multi-step conjugation process on a liquid-handling station is presented using a model conjugation system. A high-throughput solid-phase buffer exchange was successfully incorporated for reagent removal by utilization of a batch cation exchange step. To ensure accurate screening of conjugation parameters, an intermediate UV/Vis-based concentration determination was established, including feedback to the process. For conjugate characterization, a high-throughput-compatible reversed-phase chromatography method with a runtime of 7 min and no sample preparation was developed. Two case studies illustrate the efficient use of the platform for mapping the operating space of a conjugation process. Due to the degree of automation and parallelization, the platform is capable of significantly reducing process development efforts and material demands and shortening development timelines for antibody-drug conjugates. Copyright © 2018 Elsevier B.V. All rights reserved.

  12. Governance for public health and health equity: The Tröndelag model for public health work.

    PubMed

    Lillefjell, Monica; Magnus, Eva; Knudtsen, Margunn SkJei; Wist, Guri; Horghagen, Sissel; Espnes, Geir Arild; Maass, Ruca; Anthun, Kirsti Sarheim

    2018-06-01

Multi-sectoral governance of population health is linked to the realization that health is the property of many societal systems. This study aims to contribute knowledge and methods that can strengthen the capacities of municipalities to work in a more systematic, knowledge-based and multi-sectoral way when promoting health and health equity in the population. Process evaluation was conducted, applying a mixed-methods research design combining qualitative and quantitative data collection methods. Processes strengthening the systematic and multi-sectoral development, implementation and evaluation of research-based measures to promote health, quality of life, and health equity in, for and with municipalities were revealed. A step-by-step model that emphasizes the promotion of knowledge-based, systematic, multi-sectoral public health work, as well as joint ownership of local resources, initiatives and policies, has been developed. Implementation of systematic, knowledge-based and multi-sectoral governance of public health measures in municipalities demands a shared understanding of the challenges, an updated overview of population health and its impact factors, anchoring in plans, new skills and methods for the selection and implementation of measures, and the development of trust, ownership, and shared ethics and goals among those involved.

  13. Acoustic resonator and method of making same

    DOEpatents

    Kline, Gerald R.; Lakin, Kenneth M.

    1985-03-05

    A method of fabricating an acoustic wave resonator wherein all processing steps are accomplished from a single side of said substrate. The method involves deposition of a multi-layered Al/AlN structure on a GaAs substrate followed by a series of fabrication steps to define a resonator from said composite. The resulting resonator comprises an AlN layer between two Al layers and another layer of AlN on an exterior of one of said Al layers.

  14. Acoustic resonator and method of making same

    DOEpatents

    Kline, G.R.; Lakin, K.M.

    1983-10-13

    A method of fabricating an acoustic wave resonator wherein all processing steps are accomplished from a single side of said substrate. The method involves deposition of a multi-layered Al/AlN structure on a GaAs substrate followed by a series of fabrication steps to define a resonator from said composite. The resulting resonator comprises an AlN layer between two Al layers and another layer of AlN on an exterior of one of said Al layers.

  15. Long-term memory-based control of attention in multi-step tasks requires working memory: evidence from domain-specific interference

    PubMed Central

    Foerster, Rebecca M.; Carbone, Elena; Schneider, Werner X.

    2014-01-01

Evidence for long-term memory (LTM)-based control of attention has been found during the execution of highly practiced multi-step tasks. However, does LTM directly control attention, or are working memory (WM) processes involved? In the present study, this question was investigated with a dual-task paradigm. Participants executed either a highly practiced visuospatial sensorimotor task (speed stacking) or a verbal task (high-speed poem reciting), while maintaining visuospatial or verbal information in WM. Results revealed unidirectional and domain-specific interference. Neither speed stacking nor high-speed poem reciting was influenced by WM retention. Stacking disrupted the retention of visuospatial locations, but did not modify memory performance of verbal material (letters). Reciting reduced the retention of verbal material substantially whereas it affected the memory performance of visuospatial locations to a smaller degree. We suggest that the selection of task-relevant information from LTM for the execution of overlearned multi-step tasks recruits domain-specific WM. PMID:24847304

  16. Acoustic resonator with Al electrodes on an AlN layer and using a GaAs substrate

    DOEpatents

    Kline, Gerald R.; Lakin, Kenneth M.

    1985-12-03

    A method of fabricating an acoustic wave resonator wherein all processing steps are accomplished from a single side of said substrate. The method involves deposition of a multi-layered Al/AlN structure on a GaAs substrate followed by a series of fabrication steps to define a resonator from said composite. The resulting resonator comprises an AlN layer between two Al layers and another layer of AlN on an exterior of one of said Al layers.

  17. Acoustic resonator and method of making same

    DOEpatents

    Kline, G.R.; Lakin, K.M.

    1985-03-05

    A method is disclosed of fabricating an acoustic wave resonator wherein all processing steps are accomplished from a single side of said substrate. The method involves deposition of a multi-layered Al/AlN structure on a GaAs substrate followed by a series of fabrication steps to define a resonator from said composite. The resulting resonator comprises an AlN layer between two Al layers and another layer of AlN on an exterior of one of said Al layers. 4 figs.

  18. Automatic Registration of GF4 Pms: a High Resolution Multi-Spectral Sensor on Board a Satellite on Geostationary Orbit

    NASA Astrophysics Data System (ADS)

    Gao, M.; Li, J.

    2018-04-01

Geometric correction is an important preprocessing step in the application of GF4 PMS imagery. Geometric correction based on the manual selection of geometric control points is time-consuming and laborious; the more common method, based on a reference image, is automatic image registration. This method involves several steps and parameters, so for the multi-spectral sensor GF4 PMS it is necessary to identify the best combination of both. This study mainly focuses on the following issues: the necessity of Rational Polynomial Coefficients (RPC) correction before automatic registration, the choice of base band for automatic registration, and the configuration of GF4 PMS spatial resolution.

  19. Multi-objective optimization of process parameters of multi-step shaft formed with cross wedge rolling based on orthogonal test

    NASA Astrophysics Data System (ADS)

    Han, S. T.; Shu, X. D.; Shchukin, V.; Kozhevnikova, G.

    2018-06-01

In order to achieve reasonable process parameters for forming a multi-step shaft by cross wedge rolling, this research studied the rolling-forming process of a multi-step shaft using the DEFORM-3D finite element software. An interactive orthogonal experiment was used to study the effect of eight parameters on the quality of the shaft end and the microstructure uniformity: the first section shrinkage rate φ1, the first forming angle α1, the first spreading angle β1, the first spreading length L1, the second section shrinkage rate φ2, the second forming angle α2, the second spreading angle β2 and the second spreading length L2. By using the fuzzy mathematics comprehensive evaluation method and extreme difference (range) analysis, the degree of influence of the process parameters on the quality of the multi-step shaft was obtained: β2 > φ2 > L1 > α1 > β1 > φ1 > α2 > L2. The results of the study can provide guidance for obtaining multi-step shafts with high mechanical properties and achieving near-net forming without a stub bar in cross wedge rolling.
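The extreme-difference (range) analysis used to rank the parameters can be sketched as follows; the orthogonal array and quality scores below are hypothetical, not the study's data:

```python
def range_analysis(levels, responses):
    """Extreme-difference (range) analysis for an orthogonal test.

    levels: per-run tuples of factor-level indices; responses: the
    measured quality score for each run. Returns each factor's range
    R = max(level mean) - min(level mean); a larger R indicates a
    stronger influence of that factor on the response.
    """
    n_factors = len(levels[0])
    ranges = []
    for f in range(n_factors):
        sums, counts = {}, {}
        for run, y in zip(levels, responses):
            lv = run[f]
            sums[lv] = sums.get(lv, 0.0) + y
            counts[lv] = counts.get(lv, 0) + 1
        means = [sums[lv] / counts[lv] for lv in sums]
        ranges.append(max(means) - min(means))
    return ranges

# Hypothetical L4(2^3) array: 3 two-level factors, 4 runs
runs = [(0, 0, 0), (0, 1, 1), (1, 0, 1), (1, 1, 0)]
scores = [8.0, 6.0, 9.0, 7.0]
print(range_analysis(runs, scores))
```

Sorting the factors by their range values gives the influence ordering of the kind reported in the abstract.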

  20. From framework to action: the DESIRE approach to combat desertification.

    PubMed

    Hessel, R; Reed, M S; Geeson, N; Ritsema, C J; van Lynden, G; Karavitis, C A; Schwilch, G; Jetten, V; Burger, P; van der Werff Ten Bosch, M J; Verzandvoort, S; van den Elsen, E; Witsenburg, K

    2014-11-01

    It has become increasingly clear that desertification can only be tackled through a multi-disciplinary approach that not only involves scientists but also stakeholders. In the DESIRE project such an approach was taken. As a first step, a conceptual framework was developed in which the factors and processes that may lead to land degradation and desertification were described. Many of these factors do not work independently, but can reinforce or weaken one another, and to illustrate these relationships sustainable management and policy feedback loops were included. This conceptual framework can be applied globally, but can also be made site-specific to take into account that each study site has a unique combination of bio-physical, socio-economic and political conditions. Once the conceptual framework was defined, a methodological framework was developed in which the methodological steps taken in the DESIRE approach were listed and their logic and sequence were explained. The last step was to develop a concrete working plan to put the project into action, involving stakeholders throughout the process. This series of steps, in full or in part, offers explicit guidance for other organizations or projects that aim to reduce land degradation and desertification.

  1. Multi-step splicing of sphingomyelin synthase linear and circular RNAs.

    PubMed

    Filippenkov, Ivan B; Sudarkina, Olga Yu; Limborska, Svetlana A; Dergunova, Lyudmila V

    2018-05-15

    The SGMS1 gene encodes the enzyme sphingomyelin synthase 1 (SMS1), which is involved in the regulation of lipid metabolism, apoptosis, intracellular vesicular transport and other significant processes. The SGMS1 gene is located on chromosome 10 and has a size of 320 kb. Previously, we showed that dozens of alternative transcripts of the SGMS1 gene are present in various human tissues. In addition to mRNAs that provide synthesis of the SMS1 protein, this gene participates in the synthesis of non-coding transcripts, including circular RNAs (circRNAs), which include exons of the 5'-untranslated region (5'-UTR) and are highly represented in the brain. In this study, using the high-throughput technology RNA-CaptureSeq, many new SGMS1 transcripts were identified, including both intronic unspliced RNAs (premature RNAs) and RNAs formed via alternative splicing. Recursive exons (RS-exons) that can participate in the multi-step splicing of long introns of the gene were also identified. These exons participate in the formation of circRNAs. Thus, multi-step splicing may provide a variety of linear and circular RNAs of eukaryotic genes in tissues. Copyright © 2018 Elsevier B.V. All rights reserved.

  2. Multi-Step Deep Reactive Ion Etching Fabrication Process for Silicon-Based Terahertz Components

    NASA Technical Reports Server (NTRS)

    Reck, Theodore (Inventor); Perez, Jose Vicente Siles (Inventor); Lee, Choonsup (Inventor); Cooper, Ken B. (Inventor); Jung-Kubiak, Cecile (Inventor); Mehdi, Imran (Inventor); Chattopadhyay, Goutam (Inventor); Lin, Robert H. (Inventor); Peralta, Alejandro (Inventor)

    2016-01-01

    A multi-step silicon etching process has been developed to fabricate silicon-based terahertz (THz) waveguide components. This technique provides precise dimensional control across multiple etch depths with batch processing capabilities. Nonlinear and passive components such as mixers and multipliers waveguides, hybrids, OMTs and twists have been fabricated and integrated into a small silicon package. This fabrication technique enables a wafer-stacking architecture to provide ultra-compact multi-pixel receiver front-ends in the THz range.

  3. Multi-objective evolutionary algorithms for fuzzy classification in survival prediction.

    PubMed

    Jiménez, Fernando; Sánchez, Gracia; Juárez, José M

    2014-03-01

This paper presents a novel rule-based fuzzy classification methodology for survival/mortality prediction in severely burnt patients. Due to the ethical aspects involved in this medical scenario, physicians tend not to accept a computer-based evaluation unless they understand why and how such a recommendation is given. Therefore, any fuzzy classifier model must be both accurate and interpretable. The proposed methodology is a three-step process: (1) multi-objective constrained optimization of a patient's data set, using Pareto-based elitist multi-objective evolutionary algorithms to maximize accuracy and minimize the complexity (number of rules) of classifiers, subject to interpretability constraints; this step produces a set of alternative (Pareto) classifiers; (2) linguistic labeling, which assigns a linguistic label to each fuzzy set of the classifiers; this step is essential to the interpretability of the classifiers; (3) decision making, whereby a classifier is chosen, if it is satisfactory, according to the preferences of the decision maker. If no classifier is satisfactory for the decision maker, the process starts again in step (1) with a different input parameter set. The performance of three multi-objective evolutionary algorithms, the niched pre-selection multi-objective algorithm, the elitist Pareto-based multi-objective evolutionary algorithm for diversity reinforcement (ENORA) and the non-dominated sorting genetic algorithm (NSGA-II), was tested using a patient's data set from an intensive care burn unit and a standard machine learning data set from a standard machine learning repository. The results were compared using the hypervolume multi-objective metric. In addition, the results were compared with other non-evolutionary techniques and validated with a multi-objective cross-validation technique.
Our proposal improves the classification rate obtained by other non-evolutionary techniques (decision trees, artificial neural networks, Naive Bayes, and case-based reasoning), with ENORA achieving a classification rate of 0.9298, specificity of 0.9385, and sensitivity of 0.9364, with 14.2 interpretable fuzzy rules on average. Our proposal improves the accuracy and interpretability of the classifiers, compared with other non-evolutionary techniques. We also conclude that ENORA outperforms the niched pre-selection and NSGA-II algorithms. Moreover, given that our multi-objective evolutionary methodology is non-combinatorial and based on real-parameter optimization, the time cost is significantly reduced compared with other evolutionary approaches in the literature that are based on combinatorial optimization. Copyright © 2014 Elsevier B.V. All rights reserved.
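All three evolutionary algorithms compared here rely on the same Pareto-dominance test to keep a set of trade-off classifiers. A minimal sketch, using hypothetical (error rate, rule count) objectives, both to be minimized:

```python
def pareto_front(points):
    """Return the non-dominated points when every objective is minimized.

    A point dominates another if it is no worse in all objectives and
    strictly better in at least one; this is the core test used by
    Pareto-based algorithms such as NSGA-II and ENORA.
    """
    def dominates(a, b):
        return (all(x <= y for x, y in zip(a, b))
                and any(x < y for x, y in zip(a, b)))
    return [p for p in points
            if not any(dominates(q, p) for q in points if q != p)]

# Hypothetical candidate classifiers: (1 - accuracy, number of rules)
candidates = [(0.07, 14), (0.10, 8), (0.07, 20), (0.12, 8)]
print(pareto_front(candidates))
```

The surviving points form the set of alternative classifiers from which, per step (3) of the methodology, the decision maker picks one according to their preferences.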

  4. Systematic Approach to Calculate the Concentration of Chemical Species in Multi-Equilibrium Problems

    ERIC Educational Resources Information Center

    Baeza-Baeza, Juan Jose; Garcia-Alvarez-Coque, Maria Celia

    2011-01-01

    A general systematic approach is proposed for the numerical calculation of multi-equilibrium problems. The approach involves several steps: (i) the establishment of balances involving the chemical species in solution (e.g., mass balances, charge balance, and stoichiometric balance for the reaction products), (ii) the selection of the unknowns (the…
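The balance-based approach can be illustrated on the simplest case, a monoprotic weak acid: substituting the mass balance and the equilibrium expression into the charge balance leaves a single equation in [H+], which a root finder can solve. The bisection solver below is a sketch of that idea, not the authors' general method; the Ka value is the textbook constant for acetic acid:

```python
import math

def weak_acid_h(ca, ka, kw=1e-14):
    """Solve [H+] for a monoprotic weak acid HA of total concentration ca.

    Combining mass balance (ca = [HA] + [A-]), the equilibrium
    expression ([A-] = ka*ca / (ka + [H+])) and the charge balance
    ([H+] = [A-] + [OH-]) gives one monotone equation in h = [H+]:
        f(h) = h - ka*ca/(ka + h) - kw/h = 0
    which is solved here by bisection.
    """
    lo, hi = 1e-14, 1.0
    for _ in range(200):
        mid = (lo + hi) / 2
        f = mid - ka * ca / (ka + mid) - kw / mid
        if f > 0:
            hi = mid
        else:
            lo = mid
    return (lo + hi) / 2

# 0.10 M acetic acid (Ka ~ 1.8e-5): pH comes out near 2.88
h = weak_acid_h(0.10, 1.8e-5)
print(round(-math.log10(h), 2))
```

For multi-equilibrium systems, the same balances yield a system of nonlinear equations, which is what the systematic approach in the abstract organizes.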

  5. Color sensitivity of the multi-exposure HDR imaging process

    NASA Astrophysics Data System (ADS)

    Lenseigne, Boris; Jacobs, Valéry Ann; Withouck, Martijn; Hanselaer, Peter; Jonker, Pieter P.

    2013-04-01

Multi-exposure high dynamic range (HDR) imaging builds HDR radiance maps by stitching together different views of the same scene taken with varying exposures. In practice, this process involves converting raw sensor data into low dynamic range (LDR) images, estimating the camera response curves, and using them to recover the irradiance for every pixel. During export, white balance settings and image stitching are applied, both of which influence the color balance of the final image. In this paper, we use a calibrated quasi-monochromatic light source, an integrating sphere, and a spectrograph to evaluate and compare the average spectral response of the image sensor. We finally draw some conclusions about the color consistency of HDR imaging and the additional steps necessary to use multi-exposure HDR imaging as a tool to measure physical quantities such as radiance and luminance.
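The irradiance-recovery step the abstract refers to can be sketched in the classic weighted-average form used in Debevec-Malik-style HDR merging; the idealized logarithmic response curve and triangular weighting below are assumptions for illustration:

```python
import math

def recover_irradiance(pixel_values, exposure_times, g=math.log):
    """Per-pixel irradiance from a multi-exposure stack.

    pixel_values: one pixel's 8-bit value in each exposure;
    g: inverse camera response (idealized here as a log response);
    each usable sample contributes g(Z) - ln(t), weighted by a
    triangular hat that downweights values near 0 or saturation.
    """
    def w(z):
        return z if z <= 127 else 255 - z
    usable = [(z, t) for z, t in zip(pixel_values, exposure_times)
              if 0 < z < 255]
    num = sum(w(z) * (g(z) - math.log(t)) for z, t in usable)
    den = sum(w(z) for z, _ in usable)
    return math.exp(num / den)

# Ideal linear sensor (value = irradiance x time): both exposures agree on E = 50
print(round(recover_irradiance([50, 100], [1.0, 2.0]), 1))  # 50.0
```

Any exposure-dependent disagreement between samples of the same pixel, such as the color shifts studied in this paper, shows up as inconsistency in exactly this weighted average.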

  6. MREG V1.1 : a multi-scale image registration algorithm for SAR applications.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Eichel, Paul H.

    2013-08-01

MREG V1.1 is the sixth generation SAR image registration algorithm developed by the Signal Processing & Technology Department for Synthetic Aperture Radar applications. Like its predecessor algorithm REGI, it employs a powerful iterative multi-scale paradigm to achieve the competing goals of sub-pixel registration accuracy and the ability to handle large initial offsets. Since it is not model based, it allows for high fidelity tracking of spatially varying terrain-induced misregistration. Since it does not rely on image domain phase, it is equally adept at coherent and noncoherent image registration. This document provides a brief history of the registration processors developed by Dept. 5962 leading up to MREG V1.1, a full description of the signal processing steps involved in the algorithm, and a user's manual with application specific recommendations for CCD, TwoColor MultiView, and SAR stereoscopy.

  7. Method to Improve Indium Bump Bonding via Indium Oxide Removal Using a Multi-Step Plasma Process

    NASA Technical Reports Server (NTRS)

    Dickie, Matthew R. (Inventor); Nikzad, Shouleh (Inventor); Greer, H. Frank (Inventor); Jones, Todd J. (Inventor); Vasquez, Richard P. (Inventor); Hoenk, Michael E. (Inventor)

    2012-01-01

    A process for removing indium oxide from indium bumps in a flip-chip structure to reduce contact resistance, by a multi-step plasma treatment. A first plasma treatment of the indium bumps with an argon, methane and hydrogen plasma reduces indium oxide, and a second plasma treatment with an argon and hydrogen plasma removes residual organics. The multi-step plasma process for removing indium oxide from the indium bumps is more effective in reducing the oxide, and yet does not require the use of halogens, does not change the bump morphology, does not attack the bond pad material or under-bump metallization layers, and creates no new mechanisms for open circuits.

  8. Multi-electrolyte-step anodic aluminum oxide method for the fabrication of self-organized nanochannel arrays

    PubMed Central

    2012-01-01

    Nanochannel arrays were fabricated by the self-organized multi-electrolyte-step anodic aluminum oxide (AAO) method in this study. Anodization began with a phosphoric acid solution as the electrolyte under an applied high voltage; the electrolyte was then changed to an oxalic acid solution under an applied low voltage. This method produced self-organized nanochannel arrays with good regularity and circularity, with less power loss and shorter processing time than the multi-step AAO method. PMID:22333268

  9. Capturing remote mixing due to internal tides using multi-scale modeling tool: SOMAR-LES

    NASA Astrophysics Data System (ADS)

    Santilli, Edward; Chalamalla, Vamsi; Scotti, Alberto; Sarkar, Sutanu

    2016-11-01

    Internal tides that are generated during the interaction of an oscillating barotropic tide with the bottom bathymetry dissipate only a fraction of their energy near the generation region. The rest is radiated away in the form of low- and high-mode internal tides. These internal tides dissipate energy at remote locations when they interact with the upper ocean pycnocline, continental slope, and large scale eddies. Capturing the wide range of length and time scales involved during the life-cycle of internal tides is computationally very expensive. A recently developed multi-scale modeling tool called SOMAR-LES combines the adaptive grid refinement features of SOMAR with the turbulence modeling features of a Large Eddy Simulation (LES) to capture multi-scale processes at a reduced computational cost. Numerical simulations of internal tide generation at idealized bottom bathymetries are performed to demonstrate this multi-scale modeling technique. Although each of the remote mixing phenomena has been considered independently in previous studies, this work aims to capture remote mixing processes during the life cycle of an internal tide in more realistic settings, by allowing multi-level (coarse and fine) grids to co-exist and exchange information during the time stepping process.

  10. Multi-compartmental modeling of SORLA’s influence on amyloidogenic processing in Alzheimer’s disease

    PubMed Central

    2012-01-01

    Background Proteolytic breakdown of the amyloid precursor protein (APP) by secretases is a complex cellular process that results in formation of neurotoxic Aβ peptides, causative of neurodegeneration in Alzheimer’s disease (AD). Processing involves monomeric and dimeric forms of APP that traffic through distinct cellular compartments where the various secretases reside. Amyloidogenic processing is also influenced by modifiers such as sorting receptor-related protein (SORLA), an inhibitor of APP breakdown and major AD risk factor. Results In this study, we developed a multi-compartment model to simulate the complexity of APP processing in neurons and to accurately describe the effects of SORLA on these processes. Based on dose–response data, our study concludes that SORLA specifically impairs processing of APP dimers, the preferred secretase substrate. In addition, SORLA alters the dynamic behavior of β-secretase, the enzyme responsible for the initial step in the amyloidogenic processing cascade. Conclusions Our multi-compartment model represents a major conceptual advance over single-compartment models previously used to simulate APP processing; and it identified APP dimers and β-secretase as the two distinct targets of the inhibitory action of SORLA in Alzheimer’s disease. PMID:22727043

  11. A ruthenium dimer complex with a flexible linker slowly threads between DNA bases in two distinct steps.

    PubMed

    Bahira, Meriem; McCauley, Micah J; Almaqwashi, Ali A; Lincoln, Per; Westerlund, Fredrik; Rouzina, Ioulia; Williams, Mark C

    2015-10-15

    Several multi-component DNA intercalating small molecules have been designed around ruthenium-based intercalating monomers to optimize DNA binding properties for therapeutic use. Here we probe the DNA binding ligand [μ-C4(cpdppz)2(phen)4Ru2](4+), which consists of two Ru(phen)2dppz(2+) moieties joined by a flexible linker. To quantify ligand binding, double-stranded DNA is stretched with optical tweezers and exposed to ligand under constant applied force. In contrast to other bis-intercalators, we find that ligand association is described by a two-step process, which consists of fast bimolecular intercalation of the first dppz moiety followed by ∼10-fold slower intercalation of the second dppz moiety. The second step is rate-limited by the requirement for a DNA-ligand conformational change that allows the flexible linker to pass through the DNA duplex. Based on our measured force-dependent binding rates and ligand-induced DNA elongation measurements, we are able to map out the energy landscape and structural dynamics for both ligand binding steps. In addition, we find that at zero force the overall binding process involves fast association (∼10 s), slow dissociation (∼300 s), and very high affinity (Kd ∼10 nM). The methodology developed in this work will be useful for studying the mechanism of DNA binding by other multi-step intercalating ligands and proteins. © The Author(s) 2015. Published by Oxford University Press on behalf of Nucleic Acids Research.
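
    The two-step association described above can be sketched as a sequential first-order scheme: free ligand intercalates one dppz moiety quickly, then threads the second one roughly ten times more slowly. The rate constants below are hypothetical, chosen only to reproduce the reported ~10-fold ratio between the two steps, not fitted values from the paper.

```python
# Hypothetical rate constants (1/s); only their ~10-fold ratio matters here.
K1, K2 = 0.1, 0.01

def two_step_binding(t_end=600.0, dt=0.01):
    """Euler-integrate free -> half-intercalated -> fully threaded populations."""
    steps = int(t_end / dt)
    free, half, full = 1.0, 0.0, 0.0
    for _ in range(steps):
        # Compute all derivatives from the current state before updating.
        d_free = -K1 * free
        d_half = K1 * free - K2 * half
        d_full = K2 * half
        free += d_free * dt
        half += d_half * dt
        full += d_full * dt
    return free, half, full
```

After several multiples of the slow timescale, essentially all ligand ends up fully threaded, while total population is conserved.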

  12. Multi-objective design optimization of antenna structures using sequential domain patching with automated patch size determination

    NASA Astrophysics Data System (ADS)

    Koziel, Slawomir; Bekasiewicz, Adrian

    2018-02-01

    In this article, a simple yet efficient and reliable technique for fully automated multi-objective design optimization of antenna structures using sequential domain patching (SDP) is discussed. The optimization procedure according to SDP is a two-step process: (i) obtaining the initial set of Pareto-optimal designs representing the best possible trade-offs between considered conflicting objectives, and (ii) Pareto set refinement for yielding the optimal designs at the high-fidelity electromagnetic (EM) simulation model level. For the sake of computational efficiency, the first step is realized at the level of a low-fidelity (coarse-discretization) EM model by sequential construction and relocation of small design space segments (patches) in order to create a path connecting the extreme Pareto front designs obtained beforehand. The second stage involves response correction techniques and local response surface approximation models constructed by reusing EM simulation data acquired in the first step. A major contribution of this work is an automated procedure for determining the patch dimensions. It allows for appropriate selection of the number of patches for each geometry variable so as to ensure reliability of the optimization process while maintaining its low cost. The importance of this procedure is demonstrated by comparing it with uniform patch dimensions.
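
    The first SDP stage presumes a way to identify non-dominated designs among candidates. A minimal Pareto filter for two minimized objectives might look like this; it is a generic sketch of the concept, not the authors' implementation.

```python
def dominates(q, p):
    """q dominates p if q is no worse in every objective and better in at least one."""
    return all(qi <= pi for qi, pi in zip(q, p)) and any(qi < pi for qi, pi in zip(q, p))

def pareto_front(points):
    """Keep only the non-dominated (objective1, objective2) pairs, both minimized."""
    return [p for p in points if not any(dominates(q, p) for q in points)]
```

The surviving points are the trade-off set between which the SDP patches are then constructed.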

  13. Regulation of floral stem cell termination in Arabidopsis

    PubMed Central

    Sun, Bo; Ito, Toshiro

    2015-01-01

    In Arabidopsis, floral stem cells are maintained only at the initial stages of flower development, and they are terminated at a specific time to ensure proper development of the reproductive organs. Floral stem cell termination is a dynamic and multi-step process involving many transcription factors, chromatin remodeling factors and signaling pathways. In this review, we discuss the mechanisms involved in floral stem cell maintenance and termination, highlighting the interplay between transcriptional regulation and epigenetic machinery in the control of specific floral developmental genes. In addition, we discuss additional factors involved in floral stem cell regulation, with the goal of untangling the complexity of the floral stem cell regulatory network. PMID:25699061

  14. Focused-electron-beam-induced processing (FEBIP) for emerging applications in carbon nanoelectronics

    NASA Astrophysics Data System (ADS)

    Fedorov, Andrei G.; Kim, Songkil; Henry, Mathias; Kulkarni, Dhaval; Tsukruk, Vladimir V.

    2014-12-01

    Focused-electron-beam-induced processing (FEBIP), a resist-free additive nanomanufacturing technique, is an actively researched method for "direct-write" processing of a wide range of structural and functional nanomaterials, with a high degree of spatial and time-domain control. This article attempts to critically assess the FEBIP capabilities and unique value proposition in the context of processing of electronics materials, with a particular emphasis on emerging carbon (i.e., graphene- and carbon-nanotube-based) devices and interconnect structures. One of the major hurdles in advancing carbon-based electronic materials and device fabrication is the disjoint nature of the various processing steps involved in making a functional device from the precursor graphene/CNT materials. Not only does this multi-step sequence severely limit the throughput and increase the cost, it also dramatically reduces processing reproducibility and negatively impacts quality because of possible between-the-step contamination, especially for impurity-susceptible materials such as graphene. FEBIP provides a unique opportunity to address many challenges of carbon nanoelectronics, especially when it is employed as part of an integrated processing environment based on multiple "beams" of energetic particles, including electrons, photons, and molecules. This avenue is promising from an applications perspective, as such a multi-functional (electron/photon/molecule) beam tool enables one to define shapes (patterning), form structures (deposition/etching), and modify (cleaning/doping/annealing) properties with locally resolved control on the nanoscale using the same tool without ever changing the processing environment. It thus will have a direct positive impact on enhancing functionality, improving quality, and reducing fabrication costs for electronic devices based on both conventional CMOS and emerging carbon (CNT/graphene) materials.

  15. A framework for conducting mechanistic based reliability assessments of components operating in complex systems

    NASA Astrophysics Data System (ADS)

    Wallace, Jon Michael

    2003-10-01

    Reliability prediction of components operating in complex systems has historically been conducted in a statistically isolated manner. Current physics-based, i.e. mechanistic, component reliability approaches focus more on component-specific attributes and mathematical algorithms and not enough on the influence of the system. The result is that significant error can be introduced into the component reliability assessment process. The objective of this study is the development of a framework that infuses the needs and influence of the system into the process of conducting mechanistic-based component reliability assessments. The formulated framework consists of six primary steps. The first three steps, identification, decomposition, and synthesis, are primarily qualitative in nature and employ system reliability and safety engineering principles to construct an appropriate starting point for the component reliability assessment. The following two steps are the most unique. They involve a step to efficiently characterize and quantify the system-driven local parameter space and a subsequent step using this information to guide the reduction of the component parameter space. The local statistical space quantification step is accomplished using two proposed multivariate probability models: Multi-Response First Order Second Moment and Taylor-Based Inverse Transformation. Where existing joint probability models require preliminary distribution and correlation information of the responses, these models combine statistical information of the input parameters with an efficient sampling of the response analyses to produce the multi-response joint probability distribution. Parameter space reduction is accomplished using Approximate Canonical Correlation Analysis (ACCA) employed as a multi-response screening technique. 
The novelty of this approach is that each individual local parameter and even subsets of parameters representing entire contributing analyses can now be rank ordered with respect to their contribution to not just one response, but the entire vector of component responses simultaneously. The final step of the framework is the actual probabilistic assessment of the component. Although the same multivariate probability tools employed in the characterization step can be used for the component probability assessment, variations of this final step are given to allow for the utilization of existing probabilistic methods such as response surface Monte Carlo and Fast Probability Integration. The overall framework developed in this study is implemented to assess the finite-element based reliability prediction of a gas turbine airfoil involving several failure responses. Results of this implementation are compared to results generated using the conventional 'isolated' approach as well as a validation approach conducted through large sample Monte Carlo simulations. The framework resulted in a considerable improvement to the accuracy of the part reliability assessment and an improved understanding of the component failure behavior. Considerable statistical complexity in the form of joint non-normal behavior was found and accounted for using the framework. Future applications of the framework elements are discussed.
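
    The large-sample Monte Carlo validation mentioned above rests on a simple idea: sample the uncertain inputs, evaluate the limit state, and count violations. A toy sketch with a single strength-minus-load limit state; the distributions and their parameters are illustrative assumptions, not values from the study.

```python
import random

def failure_probability(n_samples=100_000, seed=42):
    """Crude Monte Carlo estimate of P(load > strength) for a toy limit state.

    Strength ~ N(10, 1) and load ~ N(7, 1) are hypothetical, for illustration.
    """
    rng = random.Random(seed)
    failures = 0
    for _ in range(n_samples):
        strength = rng.gauss(10.0, 1.0)
        load = rng.gauss(7.0, 1.0)
        if load > strength:  # limit state g = strength - load violated
            failures += 1
    return failures / n_samples
```

With these numbers the margin is N(3, sqrt(2)), so the true failure probability is about 1.7%, and the estimate should land nearby for large sample counts.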

  16. Eating and drinking interventions for people at risk of lacking decision-making capacity: who decides and how?

    PubMed

    Clarke, Gemma; Galbraith, Sarah; Woodward, Jeremy; Holland, Anthony; Barclay, Stephen

    2015-06-11

    Some people with progressive neurological diseases find they need additional support with eating and drinking at mealtimes, and may require artificial nutrition and hydration. Decisions concerning artificial nutrition and hydration at the end of life are ethically complex, particularly if the individual lacks decision-making capacity. Decisions may concern issues of life and death: weighing the potential for increasing morbidity and prolonging suffering, with potentially shortening life. When individuals lack decision-making capacity, the standard processes of obtaining informed consent for medical interventions are disrupted. Increasingly multi-professional groups are being utilised to make difficult ethical decisions within healthcare. This paper reports upon a service evaluation which examined decision-making within a UK hospital Feeding Issues Multi-Professional Team. A three month observation of a hospital-based multi-professional team concerning feeding issues, and a one year examination of their records. The key research questions are: a) How are decisions made concerning artificial nutrition for individuals at risk of lacking decision-making capacity? b) What are the key decision-making factors that are balanced? c) Who is involved in the decision-making process? Decision-making was not a singular decision, but rather involved many different steps. Discussions involving relatives and other clinicians, often took place outside of meetings. Topics of discussion varied but the outcome relied upon balancing the information along four interdependent axes: (1) Risks, burdens and benefits; (2) Treatment goals; (3) Normative ethical values; (4) Interested parties. Decision-making was a dynamic ongoing process with many people involved. The multiple points of decision-making, and the number of people involved with the decision-making process, mean the question of 'who decides' cannot be fully answered. 
There is a potential for anonymity to arise among multiple decision-makers. Decisions in real-world clinical practice may not fit precisely into a model of decision-making. The findings from this service evaluation illustrate that, within multi-professional team decision-making, decisions may contain elements of both substituted and supported decision-making, and may be better represented as existing upon a continuum.

  17. Toward a Model Framework of Generalized Parallel Componential Processing of Multi-Symbol Numbers

    ERIC Educational Resources Information Center

    Huber, Stefan; Cornelsen, Sonja; Moeller, Korbinian; Nuerk, Hans-Christoph

    2015-01-01

    In this article, we propose and evaluate a new model framework of parallel componential multi-symbol number processing, generalizing the idea of parallel componential processing of multi-digit numbers to the case of negative numbers by considering the polarity signs similar to single digits. In a first step, we evaluated this account by defining…

  18. Enantioselective Syntheses of (−)-Alloyohimbane and (−)-Yohimbane by an Efficient Enzymatic Desymmetrization Process

    PubMed Central

    Ghosh, Arun K.; Sarkar, Anindya

    2016-01-01

    Enantioselective syntheses of (−)-alloyohimbane and (−)-yohimbane were accomplished in a convergent manner. The key step involved a modified mild protocol for the enantioselective enzymatic desymmetrization of a meso-diacetate. The protocol provided convenient access to an optically active monoacetate on a multi-gram scale and in high enantiomeric purity. This monoacetate was converted to (−)-alloyohimbane. Reductive amination of the derived aldehyde caused isomerization to the trans-product and allowed the synthesis of (−)-yohimbane. PMID:28757804

  19. Recent progress on understanding the mechanisms of amyloid nucleation.

    PubMed

    Chatani, Eri; Yamamoto, Naoki

    2018-04-01

    Amyloid fibrils are supramolecular protein assemblies with a fibrous morphology and cross-β structure. The formation of amyloid fibrils typically follows a nucleation-dependent polymerization mechanism, in which a one-step nucleation scheme has been widely accepted. However, a variety of oligomers have been identified in early stages of fibrillation, and a nucleated conformational conversion (NCC) mechanism, in which oligomers serve as a precursor of amyloid nucleation and convert to amyloid nuclei, has been proposed. This development has raised the need to consider more complicated multi-step nucleation processes in addition to the simplest one-step process, and evidence for the direct involvement of oligomers as nucleation precursors has been obtained both experimentally and theoretically. Interestingly, the NCC mechanism has some analogy with the two-step nucleation mechanism proposed for inorganic and organic crystals and protein crystals, although a more dramatic conformational conversion of proteins must be considered in amyloid nucleation. Clarifying the properties of the nucleation precursors of amyloid fibrils in detail, in comparison with those of crystals, will allow a better understanding of the nucleation of amyloid fibrils and pave the way for techniques to regulate it.

  20. Controllable 3D architectures of aligned carbon nanotube arrays by multi-step processes

    NASA Astrophysics Data System (ADS)

    Huang, Shaoming

    2003-06-01

    An effective way to fabricate large-area three-dimensional (3D) aligned CNT patterns based on pyrolysis of iron(II) phthalocyanine (FePc) by two-step processes is reported. The controllable generation of different lengths and the selective growth of aligned CNT arrays on metal-patterned (e.g., Ag and Au) substrates are the bases for generating such 3D aligned CNT architectures. By controlling experimental conditions, 3D aligned CNT arrays with different lengths/densities and morphologies/structures, as well as multi-layered architectures, can be fabricated on a large scale by multi-step pyrolysis of FePc. These 3D architectures could have interesting properties and be applied in developing novel nanotube-based devices.

  1. A Student Synthesis of the Housefly Sex Attractant.

    ERIC Educational Resources Information Center

    Cormier, Russell; And Others

    1979-01-01

    A novel and efficient (34 percent overall) multi-step synthesis of the housefly sex attractant, muscalure, is described. Each of the steps involves types of reactions with which the undergraduate student would be familiar after one-and-one-half semesters of organic chemistry. (BB)

  2. Data-based control of a multi-step forming process

    NASA Astrophysics Data System (ADS)

    Schulte, R.; Frey, P.; Hildenbrand, P.; Vogel, M.; Betz, C.; Lechner, M.; Merklein, M.

    2017-09-01

    The fourth industrial revolution represents a new stage in the organization and management of the entire value chain. Concerning the field of forming technology, however, it has so far arrived only gradually. To make a valuable contribution to the digital factory, the control of a multi-stage forming process was investigated. Within the framework of the investigation, an abstracted and transferable model is used to outline which data have to be collected, how an interface between the different forming machines can be designed in practice, and which control tasks must be fulfilled. The goal of this investigation was to control the subsequent process step based on the data recorded in the first step. The investigated process chain links various metal forming processes that are typical elements of a multi-step forming process. Data recorded in the first step of the process chain are analyzed and processed for improved process control of the subsequent step. On the basis of the scientific knowledge gained, it is possible to make forming operations more robust and at the same time more flexible, and thus create the foundation for linking various production processes in an efficient way.
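
    The idea of controlling the second step from data recorded in the first can be sketched as a simple feed-forward correction: measure the deviation produced by step 1 and adjust a step-2 parameter to compensate. The linear process model and its sensitivity coefficient below are hypothetical placeholders, not values from the paper.

```python
def adjust_second_step(measured_thickness, target_thickness, nominal_force,
                       sensitivity=0.8):
    """Feed-forward correction of the step-2 press force from a step-1 measurement.

    sensitivity: assumed force change per unit thickness deviation
    (a hypothetical linear process model for illustration).
    """
    deviation = measured_thickness - target_thickness
    # A thicker-than-target blank gets a correspondingly reduced press force.
    return nominal_force - sensitivity * deviation
```

For example, if step 1 leaves a blank 0.1 mm thicker than targeted, the nominal step-2 force is lowered by the assumed sensitivity times that deviation.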

  3. Improved perovskite phototransistor prepared using multi-step annealing method

    NASA Astrophysics Data System (ADS)

    Cao, Mingxuan; Zhang, Yating; Yu, Yu; Yao, Jianquan

    2018-02-01

    Organic-inorganic hybrid perovskites with good intrinsic physical properties have received substantial interest for solar cell and optoelectronic applications. However, perovskite films always suffer from low carrier mobility due to structural imperfections, including sharp grain boundaries and pinholes, restricting their device performance and application potential. Here we demonstrate a straightforward strategy based on a multi-step annealing process to improve the performance of perovskite photodetectors. Annealing temperature and duration greatly affect the surface morphology and optoelectrical properties of perovskites, which in turn determine the device properties of the phototransistor. Perovskite films treated with the multi-step annealing method tend to be highly uniform, well crystallized, and of high surface coverage, and they exhibit stronger ultraviolet-visible absorption and photoluminescence compared to perovskites prepared by the conventional one-step annealing process. The field-effect mobility of the perovskite photodetector treated by the one-step direct annealing method is 0.121 (0.062) cm2V-1s-1 for holes (electrons), which increases to 1.01 (0.54) cm2V-1s-1 for the device treated with the multi-step slow annealing method. Moreover, the perovskite phototransistors exhibit a fast photoresponse speed of 78 μs. In general, this work focuses on the influence of annealing methods on perovskite phototransistors rather than on obtaining their best parameters. These findings prove that multi-step annealing is a feasible route to high-performance perovskite-based photodetectors.

  4. Fabrication of diamond shells

    DOEpatents

    Hamza, Alex V.; Biener, Juergen; Wild, Christoph; Woerner, Eckhard

    2016-11-01

    A novel method for fabricating diamond shells is introduced. The fabrication of such shells is a multi-step process, which involves diamond chemical vapor deposition on predetermined mandrels followed by polishing, microfabrication of holes, and removal of the mandrel by an etch process. The resultant shells of the present invention can be configured with a surface roughness at the nanometer level (e.g., down to about 10 nm RMS) on a mm length scale, and exhibit excellent hardness/strength and good transparency in both the infrared and visible. Specifically, a novel process is disclosed herein which allows coating of spherical substrates with optical-quality diamond films or nanocrystalline diamond films.

  5. Practical Aspects of Designing and Conducting Validation Studies Involving Multi-study Trials.

    PubMed

    Coecke, Sandra; Bernasconi, Camilla; Bowe, Gerard; Bostroem, Ann-Charlotte; Burton, Julien; Cole, Thomas; Fortaner, Salvador; Gouliarmou, Varvara; Gray, Andrew; Griesinger, Claudius; Louhimies, Susanna; Gyves, Emilio Mendoza-de; Joossens, Elisabeth; Prinz, Maurits-Jan; Milcamps, Anne; Parissis, Nicholaos; Wilk-Zasadna, Iwona; Barroso, João; Desprez, Bertrand; Langezaal, Ingrid; Liska, Roman; Morath, Siegfried; Reina, Vittorio; Zorzoli, Chiara; Zuang, Valérie

    This chapter focuses on practical aspects of conducting prospective in vitro validation studies, in particular those carried out by laboratories that are members of the European Union Network of Laboratories for the Validation of Alternative Methods (EU-NETVAL), which is coordinated by the EU Reference Laboratory for Alternatives to Animal Testing (EURL ECVAM). Prospective validation studies involving EU-NETVAL, comprising a multi-study trial involving several laboratories or "test facilities", typically consist of two main steps: (1) the design of the validation study by EURL ECVAM and (2) the execution of the multi-study trial by a number of qualified laboratories within EU-NETVAL, coordinated and supported by EURL ECVAM. The approach adopted in the conduct of these validation studies adheres to the principles described in the OECD Guidance Document on the Validation and International Acceptance of new or updated test methods for Hazard Assessment No. 34 (OECD 2005). The context and scope of conducting prospective in vitro validation studies is dealt with in Chap. 4. Here we focus mainly on the processes followed to carry out a prospective validation of in vitro methods involving different laboratories, with the ultimate aim of generating a dataset that can support a decision in relation to the possible development of an international test guideline (e.g. by the OECD) or the establishment of performance standards.

  6. Cytoplasmic dynein binding, run length, and velocity are guided by long-range electrostatic interactions

    PubMed Central

    Li, Lin; Alper, Joshua; Alexov, Emil

    2016-01-01

    Dyneins are important molecular motors involved in many essential biological processes, including cargo transport along microtubules, mitosis, and ciliary motility. Dynein motility involves the coupling of microtubule binding and unbinding to a change in the configuration of the linker domain induced by ATP hydrolysis, two events that occur some 25 nm apart. This leaves dynein stepping relatively inaccurate and susceptible to thermal noise. Using multi-scale modeling with a computational focusing technique, we demonstrate that the microtubule forms an electrostatic funnel that guides the dynein's microtubule binding domain (MTBD) as it finally docks to the precise, keyed binding location on the microtubule. Furthermore, we demonstrate that the electrostatic component of the MTBD's binding free energy is linearly correlated with the velocity and run length of dynein, and we use this linearity to predict the effect of mutating each glutamic and aspartic acid in the MTBD to alanine. Lastly, we show that the binding of dynein to the microtubule is associated with conformational changes involving several helices, and we localize flexible hinge points within the stalk helices. Taken all together, we demonstrate that long-range electrostatic interactions bring a level of precision to an otherwise noisy dynein stepping process. PMID:27531742

  7. Multi-target-qubit unconventional geometric phase gate in a multi-cavity system

    NASA Astrophysics Data System (ADS)

    Liu, Tong; Cao, Xiao-Zhi; Su, Qi-Ping; Xiong, Shao-Jie; Yang, Chui-Ping

    2016-02-01

    Cavity-based large scale quantum information processing (QIP) may involve multiple cavities and require performing various quantum logic operations on qubits distributed in different cavities. Geometric-phase-based quantum computing has drawn much attention recently, which offers advantages against inaccuracies and local fluctuations. In addition, multiqubit gates are particularly appealing and play important roles in QIP. We here present a simple and efficient scheme for realizing a multi-target-qubit unconventional geometric phase gate in a multi-cavity system. This multiqubit phase gate has a common control qubit but different target qubits distributed in different cavities, which can be achieved using a single-step operation. The gate operation time is independent of the number of qubits and only two levels for each qubit are needed. This multiqubit gate is generic, e.g., by performing single-qubit operations, it can be converted into two types of significant multi-target-qubit phase gates useful in QIP. The proposal is quite general, which can be used to accomplish the same task for a general type of qubits such as atoms, NV centers, quantum dots, and superconducting qubits.

  8. Multi-target-qubit unconventional geometric phase gate in a multi-cavity system.

    PubMed

    Liu, Tong; Cao, Xiao-Zhi; Su, Qi-Ping; Xiong, Shao-Jie; Yang, Chui-Ping

    2016-02-22

    Cavity-based large scale quantum information processing (QIP) may involve multiple cavities and require performing various quantum logic operations on qubits distributed in different cavities. Geometric-phase-based quantum computing has drawn much attention recently, which offers advantages against inaccuracies and local fluctuations. In addition, multiqubit gates are particularly appealing and play important roles in QIP. We here present a simple and efficient scheme for realizing a multi-target-qubit unconventional geometric phase gate in a multi-cavity system. This multiqubit phase gate has a common control qubit but different target qubits distributed in different cavities, which can be achieved using a single-step operation. The gate operation time is independent of the number of qubits and only two levels for each qubit are needed. This multiqubit gate is generic, e.g., by performing single-qubit operations, it can be converted into two types of significant multi-target-qubit phase gates useful in QIP. The proposal is quite general, which can be used to accomplish the same task for a general type of qubits such as atoms, NV centers, quantum dots, and superconducting qubits.

  9. Investigation of the Josephin Domain protein-protein interaction by molecular dynamics.

    PubMed

    Deriu, Marco A; Grasso, Gianvito; Licandro, Ginevra; Danani, Andrea; Gallo, Diego; Tuszynski, Jack A; Morbiducci, Umberto

    2014-01-01

    Spinocerebellar ataxia (SCA) 3, the most common form of SCA, is a rare neurodegenerative disease characterized by polyglutamine tract expansion and self-assembly of misfolded Ataxin3 (At3) proteins into highly organized fibrillar aggregates. The At3 N-terminal Josephin Domain (JD) has been suggested as being responsible for mediating the initial phase of the At3 double-step fibrillogenesis. Several issues concerning the residues involved in JD aggregation and, more generally, the JD clumping mechanism have not yet been clarified. In this paper we present an investigation of the JD protein-protein interaction by means of molecular modeling. Our results suggest possible amino acids involved in JD contacts, together with local and non-local effects following JD dimerization. Surprisingly, the JD conformational changes following binding may involve the ubiquitin binding sites and the hairpin region, even though these do not pertain to the JD interaction surfaces. Moreover, the JD binding event was found to shift the hairpin from an open-like toward a closed-like arrangement over the simulated timescale. Finally, our results suggest that JD aggregation might be a multi-step process, with an initial fast JD-JD binding mainly driven by Arg101, followed by slower global structural rearrangements involving the exposure to the solvent of Leu84-Trp87, which might play a role in a second step of JD aggregation.

  10. Using Sudoku to Introduce Proof Techniques

    ERIC Educational Resources Information Center

    Snyder, Brian A.

    2010-01-01

    In this article we show how the Sudoku puzzle and the three simple rules determining its solution can be used as an introduction to proof-based mathematics. In the completion of the puzzle, students can construct multi-step solutions that involve sequencing of steps, use methods such as backtracking and proof by cases, and proof by contradiction…
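    The proof techniques named above translate directly into a backtracking solver: proof by cases corresponds to trying each candidate digit, and proof by contradiction to abandoning a branch when no digit fits. A minimal sketch (standard 9×9 rules; not code from the article):

```python
def valid(board, r, c, v):
    """Can value v be placed at (r, c) under the three Sudoku rules?"""
    if v in board[r]:
        return False                                  # row rule
    if any(board[i][c] == v for i in range(9)):
        return False                                  # column rule
    br, bc = 3 * (r // 3), 3 * (c // 3)
    return all(board[i][j] != v                       # 3x3 box rule
               for i in range(br, br + 3) for j in range(bc, bc + 3))

def solve(board):
    """Backtracking with proof by cases: try each candidate at the first
    empty cell; a dead end (no candidate fits) is a contradiction, so
    undo and backtrack. Mutates `board` in place; returns True if solved."""
    for r in range(9):
        for c in range(9):
            if board[r][c] == 0:
                for v in range(1, 10):
                    if valid(board, r, c, v):
                        board[r][c] = v
                        if solve(board):
                            return True
                        board[r][c] = 0               # undo and try next case
                return False                          # contradiction here
    return True                                       # no empty cells: solved
```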

  11. Fully chip-embedded automation of a multi-step lab-on-a-chip process using a modularized timer circuit.

    PubMed

    Kang, Junsu; Lee, Donghyeon; Heo, Young Jin; Chung, Wan Kyun

    2017-11-07

    For highly integrated microfluidic systems, an actuation system is necessary to control the flow; however, the bulk of actuation devices, including pumps and valves, has impeded the broad application of integrated microfluidic systems. Here, we suggest a microfluidic process control method based on built-in microfluidic circuits. The circuit is composed of a fluidic timer circuit and a pneumatic logic circuit. The fluidic timer circuit is a serial connection of modularized timer units, which sequentially pass high pressure to the pneumatic logic circuit. The pneumatic logic circuit is a NOR gate array designed to control the liquid-handling process. By using the timer circuit as a built-in signal generator, multi-step processes can be performed entirely inside the microchip without any external controller. The timer circuit uses only two valves per unit, and the number of process steps can be extended without limit by adding timer units. As a demonstration, an automation chip has been designed for a six-step droplet treatment comprising 1) loading, 2) separation, 3) reagent injection, 4) incubation, 5) clearing and 6) unloading. Each step was successfully performed for a pre-defined step time without any external control device.
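    The serial hand-off of the timer chain can be sketched as a schedule computation: each unit becomes active when the previous unit's timer expires. Step names follow the abstract; the durations are hypothetical, since the paper does not list them here:

```python
STEPS = ["loading", "separation", "reagent injection",
         "incubation", "clearing", "unloading"]

def schedule(steps, durations_s):
    """Serial timer chain: each unit fires when the previous one
    finishes, so start times are cumulative sums of durations."""
    t, plan = 0.0, []
    for step, d in zip(steps, durations_s):
        plan.append((t, step))        # stage becomes active at time t
        t += d
    return plan, t                    # (start_time, step) pairs, total time

# Hypothetical per-step durations in seconds
plan, total = schedule(STEPS, [5, 10, 3, 60, 5, 5])
```

    Adding a step means appending one more (name, duration) pair, mirroring the paper's point that the step count extends by simply adding timer units.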

  12. Strengths-Based Nursing: A Process for Implementing a Philosophy Into Practice.

    PubMed

    Gottlieb, Laurie N; Gottlieb, Bruce

    2017-08-01

    Strengths-Based Nursing (SBN) is both a philosophy and a value-driven approach that can guide clinicians, educators, managers/leaders, and researchers. SBN is rooted in principles of person/family-centered care, empowerment, relational care, and innate health and healing. SBN is a form of family nursing, yet not all family nursing models are strengths-based. The challenge is how to translate a philosophy into changed practice. In this article, we describe an implementation process that has organically evolved into a multi-layered, multi-pronged approach involving patients and families, clinicians, educators, leaders, managers, and researchers, as well as key stakeholders including union leaders, opinion leaders, and policy makers from both nursing and other disciplines. There are two phases to the implementation process: Phase 1, pre-commitment/pre-adoption, and Phase 2, adoption. Each phase consists of distinct steps with accompanying strategies, and the phases occur both sequentially and concurrently. Facilitating factors that enable the implementation process include aligned values, readiness to accept SBN, curiosity-courage-commitment on the part of early adopters, a critical mass of early adopters, and making the SBN approach both relevant and context-specific.

  13. Fast quantification of bovine milk proteins employing external cavity-quantum cascade laser spectroscopy.

    PubMed

    Schwaighofer, Andreas; Kuligowski, Julia; Quintás, Guillermo; Mayer, Helmut K; Lendl, Bernhard

    2018-06-30

    Analysis of proteins in bovine milk is usually tackled by time-consuming analytical approaches involving wet-chemical, multi-step sample clean-up procedures. The use of external cavity-quantum cascade laser (EC-QCL) based IR spectroscopy was evaluated as an alternative screening tool for direct and simultaneous quantification of individual proteins (i.e. casein and β-lactoglobulin) and total protein content in commercial bovine milk samples. Mid-IR spectra of protein standard mixtures were used for building partial least squares (PLS) regression models. A sample set comprising different milk types (pasteurized; differently processed extended shelf life, ESL; ultra-high temperature, UHT) was analysed and results were compared to reference methods. Concentration values of the QCL-IR spectroscopy approach obtained within several minutes are in good agreement with reference methods involving multiple sample preparation steps. The potential application as a fast screening method for estimating the heat load applied to liquid milk is demonstrated. Copyright © 2018 Elsevier Ltd. All rights reserved.
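    The underlying calibration idea, predicting protein concentrations from spectra via a linear model, can be illustrated with synthetic data. The paper builds PLS models; the noise-free toy below uses ordinary least squares instead, and every spectrum and concentration in it is fabricated:

```python
import numpy as np

rng = np.random.default_rng(42)
n_wn = 50                                # number of wavenumber channels
S = rng.random((2, n_wn))                # fake pure spectra: casein, beta-LG
C_train = rng.random((20, 2))            # known training concentrations
X_train = C_train @ S                    # Beer-Lambert additive mixtures

# Least-squares calibration matrix; real, noisy, collinear IR data is
# why the paper uses PLS regression rather than plain least squares.
B, *_ = np.linalg.lstsq(X_train, C_train, rcond=None)

c_true = np.array([0.3, 0.7])            # an "unknown" sample
c_pred = (c_true @ S) @ B                # predicted concentrations
```

    With noise-free mixtures the minimum-norm least-squares solution recovers the concentrations exactly; PLS earns its keep once measurement noise and band overlap enter.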

  14. Direct electrochemistry of cytochrome c immobilized on titanium nitride/multi-walled carbon nanotube composite for amperometric nitrite biosensor.

    PubMed

    Haldorai, Yuvaraj; Hwang, Seung-Kyu; Gopalan, Anantha-Iyengar; Huh, Yun Suk; Han, Young-Kyu; Voit, Walter; Sai-Anand, Gopalan; Lee, Kwang-Pill

    2016-05-15

    In this report, a titanium nitride (TiN) nanoparticle-decorated multi-walled carbon nanotube (MWCNT) nanocomposite is fabricated via a two-step process: decoration of titanium dioxide nanoparticles onto the MWCNT surface, followed by thermal nitridation. Transmission electron microscopy shows that TiN nanoparticles with a mean diameter of ≤ 20 nm are homogeneously dispersed on the MWCNT surface. Direct electrochemistry and electrocatalysis of cytochrome c immobilized on the MWCNT-TiN composite coated on a glassy carbon electrode are investigated for nitrite sensing. Under optimum conditions, the current response is linear in nitrite concentration from 1 µM to 2000 µM, with a sensitivity of 121.5 µA µM(-1)cm(-2) and a low detection limit of 0.0014 µM. The proposed electrode shows good reproducibility and long-term stability. The applicability of the as-prepared biosensor is validated by the successful detection of nitrite in tap and sea water samples. Copyright © 2015 Elsevier B.V. All rights reserved.
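    The reported sensitivity makes the calibration easy to invert: concentration is current density divided by 121.5 µA µM⁻¹ cm⁻². The electrode area and current reading below are illustrative values, not from the paper:

```python
SENSITIVITY = 121.5            # uA per uM per cm^2, as reported in the abstract

def nitrite_conc_uM(current_uA, area_cm2, baseline_uA=0.0):
    """Invert the linear amperometric calibration:
    current density (uA/cm^2) = SENSITIVITY * concentration (uM)."""
    current_density = (current_uA - baseline_uA) / area_cm2
    return current_density / SENSITIVITY

# Hypothetical reading on a hypothetical 0.07 cm^2 electrode
c_uM = nitrite_conc_uM(current_uA=8505.0, area_cm2=0.07)
```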

  15. Monte Carlo Planning Method Estimates Planning Horizons during Interactive Social Exchange.

    PubMed

    Hula, Andreas; Montague, P Read; Dayan, Peter

    2015-06-01

    Reciprocating interactions represent a central feature of all human exchanges. They have been the target of various recent experiments, with healthy participants and psychiatric populations engaging as dyads in multi-round exchanges such as a repeated trust task. Behaviour in such exchanges involves complexities related to each agent's preference for equity with their partner, beliefs about the partner's appetite for equity, beliefs about the partner's model of their partner, and so on. Agents may also plan different numbers of steps into the future. Providing a computationally precise account of the behaviour is an essential step towards understanding what underlies choices. A natural framework for this is that of an interactive partially observable Markov decision process (IPOMDP). However, the various complexities make IPOMDPs inordinately computationally challenging. Here, we show how to approximate the solution for the multi-round trust task using a variant of the Monte-Carlo tree search algorithm. We demonstrate that the algorithm is efficient and effective, and therefore can be used to invert observations of behavioural choices. We use generated behaviour to elucidate the richness and sophistication of interactive inference.
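    The select-expand-simulate-backpropagate loop of Monte-Carlo tree search is compact enough to sketch. The toy UCT player below solves a take-1-or-2 Nim game (last stone wins) rather than the paper's trust task, whose IPOMDP belief machinery is far heavier, but the algorithmic skeleton is the same:

```python
import math, random

class Node:
    def __init__(self, state, parent=None, move=None):
        self.state, self.parent, self.move = state, parent, move
        self.children, self.visits, self.wins = [], 0, 0

def moves(n):
    return [m for m in (1, 2) if m <= n]      # take 1 or 2 stones

def best_move(heap, iters=3000, seed=7):
    """Minimal UCT search. Node.wins counts playout wins for the player
    who moved INTO the node (the usual MCTS bookkeeping convention)."""
    rng = random.Random(seed)
    root = Node(heap)
    for _ in range(iters):
        node = root
        # 1) selection: descend fully expanded nodes by UCB1
        while node.children and len(node.children) == len(moves(node.state)):
            node = max(node.children,
                       key=lambda c: c.wins / c.visits +
                       math.sqrt(2 * math.log(node.visits) / c.visits))
        # 2) expansion: add one untried child, if any
        tried = {c.move for c in node.children}
        untried = [m for m in moves(node.state) if m not in tried]
        if untried:
            m = rng.choice(untried)
            child = Node(node.state - m, node, m)
            node.children.append(child)
            node = child
        # 3) simulation: random playout from `node`; result = 1 iff the
        #    player to move at `node` takes the last stone and wins
        state, turn, result = node.state, 0, 0
        while state > 0:
            state -= rng.choice(moves(state))
            if state == 0:
                result = 1 if turn == 0 else 0
            turn ^= 1
        # 4) backpropagation, flipping the perspective at each level
        win = 1 - result                      # win for the mover into `node`
        while node is not None:
            node.visits += 1
            node.wins += win
            win = 1 - win
            node = node.parent
    return max(root.children, key=lambda c: c.visits).move
```

    From a heap of 4 the search settles on taking 1 stone, leaving the opponent a losing multiple of 3.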

  16. Monte Carlo Planning Method Estimates Planning Horizons during Interactive Social Exchange

    PubMed Central

    Hula, Andreas; Montague, P. Read; Dayan, Peter

    2015-01-01

    Reciprocating interactions represent a central feature of all human exchanges. They have been the target of various recent experiments, with healthy participants and psychiatric populations engaging as dyads in multi-round exchanges such as a repeated trust task. Behaviour in such exchanges involves complexities related to each agent’s preference for equity with their partner, beliefs about the partner’s appetite for equity, beliefs about the partner’s model of their partner, and so on. Agents may also plan different numbers of steps into the future. Providing a computationally precise account of the behaviour is an essential step towards understanding what underlies choices. A natural framework for this is that of an interactive partially observable Markov decision process (IPOMDP). However, the various complexities make IPOMDPs inordinately computationally challenging. Here, we show how to approximate the solution for the multi-round trust task using a variant of the Monte-Carlo tree search algorithm. We demonstrate that the algorithm is efficient and effective, and therefore can be used to invert observations of behavioural choices. We use generated behaviour to elucidate the richness and sophistication of interactive inference. PMID:26053429

  17. Multi-layered nanoparticles for penetrating the endosome and nuclear membrane via a step-wise membrane fusion process.

    PubMed

    Akita, Hidetaka; Kudo, Asako; Minoura, Arisa; Yamaguti, Masaya; Khalil, Ikramy A; Moriguchi, Rumiko; Masuda, Tomoya; Danev, Radostin; Nagayama, Kuniaki; Kogure, Kentaro; Harashima, Hideyoshi

    2009-05-01

    Efficient targeting of DNA to the nucleus is a prerequisite for effective gene therapy. The gene-delivery vehicle must penetrate the plasma membrane and the DNA-impermeable double-membraned nuclear envelope, and deposit its DNA cargo in a form ready for transcription. Here we introduce a concept for overcoming intracellular membrane barriers that involves step-wise membrane fusion. To achieve this, a nanotechnology was developed that creates a multi-layered nanoparticle, which we refer to as a Tetra-lamellar Multi-functional Envelope-type Nano Device (T-MEND). The critical structural elements of the T-MEND are a DNA-polycation condensed core coated with two nuclear-membrane-fusogenic inner envelopes and two endosome-fusogenic outer envelopes, which are shed in stepwise fashion. A double-lamellar envelope is required for nuclear delivery via stepwise fusion with the double-layered nuclear membrane. Intracellular membrane fusions to endosomes and nuclear membranes were verified by spectral imaging of fluorescence resonance energy transfer (FRET) between donor and acceptor fluorophores that had been dually labeled on the liposome surface. Coating the core with the minimum number of nucleus-fusogenic lipid envelopes (i.e., 2) is essential to facilitate transcription. As a result, the T-MEND achieves dramatic levels of transgene expression in non-dividing cells.

  18. Ten Steps to Conducting a Large, Multi-Site, Longitudinal Investigation of Language and Reading in Young Children

    PubMed Central

    Farquharson, Kelly; Murphy, Kimberly A.

    2016-01-01

    Purpose: This paper describes methodological procedures involving execution of a large-scale, multi-site longitudinal study of language and reading comprehension in young children. Researchers in the Language and Reading Research Consortium (LARRC) developed and implemented these procedures to ensure data integrity across multiple sites, schools, and grades. Specifically, major features of our approach, as well as lessons learned, are summarized in 10 steps essential for successful completion of a large-scale longitudinal investigation in early grades. Method: Over 5 years, children in preschool through third grade were administered a battery of 35 higher- and lower-level language, listening, and reading comprehension measures (RCM). Data were collected from children, their teachers, and their parents/guardians at four sites across the United States. Substantial and rigorous effort was aimed toward maintaining consistency in processes and data management across sites for children, assessors, and staff. Conclusion: With appropriate planning, flexibility, and communication strategies in place, LARRC developed and executed a successful multi-site longitudinal research study that will meet its goal of investigating the contribution and role of language skills in the development of children's listening and reading comprehension. Through dissemination of our design strategies and lessons learned, research teams embarking on similar endeavors can be better equipped to anticipate the challenges. PMID:27064308

  19. Metallic superhydrophobic surfaces via thermal sensitization

    NASA Astrophysics Data System (ADS)

    Vahabi, Hamed; Wang, Wei; Popat, Ketul C.; Kwon, Gibum; Holland, Troy B.; Kota, Arun K.

    2017-06-01

    Superhydrophobic surfaces (i.e., surfaces extremely repellent to water) allow water droplets to bead up and easily roll off from the surface. While a few methods have been developed to fabricate metallic superhydrophobic surfaces, these methods typically involve expensive equipment, environmental hazards, or multi-step processes. In this work, we developed a universal, scalable, solvent-free, one-step methodology based on thermal sensitization to create appropriate surface texture and fabricate metallic superhydrophobic surfaces. To demonstrate the feasibility of our methodology and elucidate the underlying mechanism, we fabricated superhydrophobic surfaces using ferritic (430) and austenitic (316) stainless steels (representative alloys) with roll off angles as low as 4° and 7°, respectively. We envision that our approach will enable the fabrication of superhydrophobic metal alloys for a wide range of civilian and military applications.

  20. Numerical simulation of machining distortions on a forged aerospace component following a one and a multi-step approaches

    NASA Astrophysics Data System (ADS)

    Prete, Antonio Del; Franchi, Rodolfo; Antermite, Fabrizio; Donatiello, Iolanda

    2018-05-01

    Residual stresses appear in a component as a consequence of thermo-mechanical processes (e.g., ring rolling), casting, and heat treatments. When machining these kinds of components, distortions arise from the redistribution of residual stresses built up by the foregoing process history inside the material. If distortions are excessive, they can lead to a large number of scrap parts. Since dimensional accuracy directly affects engine efficiency, dimensional control of aerospace components is a non-trivial issue. In this paper, the problem of distortions in large thin-walled aeroengine components made of nickel superalloys has been addressed. In order to estimate distortions on inner diameters after internal turning operations, a 3D Finite Element Method (FEM) analysis has been developed on a real industrial test case. The whole process history has been taken into account by developing FEM models of the ring rolling process and heat treatments. Three different ring rolling strategies have been studied, and the combination of related parameters that yields the best dimensional accuracy has been found. Furthermore, grain size evolution and recrystallization phenomena during the manufacturing process have been numerically investigated using a semi-empirical Johnson-Mehl-Avrami-Kolmogorov (JMAK) model. The volume subtractions have been simulated by Boolean trimming: both a one-step and a multi-step analysis have been performed. The multi-step procedure has allowed the selection of the best material removal sequence in order to reduce machining distortions.
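    The JMAK recrystallization kinetics mentioned above follow the classic Avrami form X(t) = 1 − exp(−k tⁿ). A minimal sketch; the rate constant k and exponent n below are placeholders, not the paper's fitted values:

```python
import math

def jmak_fraction(t, k=0.05, n=2.0):
    """Recrystallized volume fraction X(t) = 1 - exp(-k * t**n),
    with placeholder kinetics k and n."""
    return 1.0 - math.exp(-k * t ** n)

def jmak_time(x, k=0.05, n=2.0):
    """Time required to reach recrystallized fraction x (inverse of above)."""
    return (-math.log(1.0 - x) / k) ** (1.0 / n)

t50 = jmak_time(0.5)           # time to 50% recrystallization
```

    By construction `jmak_fraction(t50)` recovers 0.5; in practice k carries the temperature dependence, which is why the recrystallized fraction must be tracked through each forming and heat-treatment step.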

  1. Electron correlations and pre-collision in the re-collision picture of high harmonic generation

    NASA Astrophysics Data System (ADS)

    Mašín, Zdeněk; Harvey, Alex G.; Spanner, Michael; Patchkovskii, Serguei; Ivanov, Misha; Smirnova, Olga

    2018-07-01

    We discuss the seminal three-step model and the re-collision picture in the context of high harmonic generation in molecules. In particular, we stress the importance of multi-electron correlation during the first and the third of the three steps of the process: (1) the strong-field ionization and (3) the recombination. We point out how an accurate account of multi-electron correlations during the third recombination step allows one to gauge the importance of pre-collision: the term coined by Eberly (n.d. private communication) to describe unusual pathways during the first, ionization, step.

  2. Video Processes in Teacher Education Programs; Scope, Techniques, and Assessment. Multi-State Teacher Education Project, Monograph III.

    ERIC Educational Resources Information Center

    Bosley, Howard E.; And Others

    "Video Processes Are Changing Teacher Education" by Howard Bosley (the first of five papers comprising this document) discusses the Multi-State Teacher Education Project (M-STEP) experimentation with media; it lists various uses of video processes, concentrating specifically on microteaching and the use of simulation and critical…

  3. Comparison of microbial community shifts in two parallel multi-step drinking water treatment processes.

    PubMed

    Xu, Jiajiong; Tang, Wei; Ma, Jun; Wang, Hong

    2017-07-01

    Drinking water treatment processes remove undesirable chemicals and microorganisms from source water, which is vital to public health protection. The purpose of this study was to investigate the effects of treatment processes and configuration on the microbiome by comparing microbial community shifts in two series of different treatment processes operated in parallel within a full-scale drinking water treatment plant (DWTP) in Southeast China. Illumina sequencing of 16S rRNA genes of water samples demonstrated little effect of coagulation/sedimentation and pre-oxidation steps on bacterial communities, in contrast to dramatic and concurrent microbial community shifts during ozonation, granular activated carbon treatment, sand filtration, and disinfection for both series. A large number of unique operational taxonomic units (OTUs) at these four treatment steps further illustrated their strong influence in shaping the drinking water microbial communities. Interestingly, multidimensional scaling analysis revealed tight clustering of biofilm samples collected from different treatment steps, with Nitrospira, a genus of nitrite-oxidizing bacteria, noted at higher relative abundances in biofilm than in water samples. Overall, this study provides a snapshot of step-to-step microbial evolution in multi-step drinking water treatment systems, and the results provide insight into controlling and manipulating the drinking water microbiome via optimization of DWTP design and operation.

  4. The Synthesis of 2-acetyl-1,4-naphthoquinone: A Multi-step Synthesis.

    ERIC Educational Resources Information Center

    Green, Ivan R.

    1982-01-01

    Outlines 2 procedures for synthesizing 2-acetyl-1,4-naphthoquinone to compare relative merits of the two pathways. The major objective of the exercise is to demonstrate that certain factors should be considered when selecting a pathway for synthesis including availability of starting materials, cost of reagents, number of steps involved,…

  5. Multi-catalysis cascade reactions based on the methoxycarbonylketene platform: diversity-oriented synthesis of functionalized non-symmetrical malonates for agrochemicals and pharmaceuticals.

    PubMed

    Ramachary, Dhevalapally B; Venkaiah, Chintalapudi; Reddy, Y Vijayendar; Kishor, Mamillapalli

    2009-05-21

    In this paper we describe new multi-catalysis cascade (MCC) reactions for the one-pot synthesis of highly functionalized non-symmetrical malonates. These metal-free reactions are either five-step (olefination/hydrogenation/alkylation/ketenization/esterification) or six-step (olefination/hydrogenation/alkylation/ketenization/esterification/alkylation), and employ aldehydes/ketones, Meldrum's acid, 1,4-dihydropyridine/o-phenylenediamine, diazomethane, alcohols and active ethylene/acetylenes, and involve iminium-, self-, self-, self- and base-catalysis, respectively. Many of the products have direct application in agricultural and pharmaceutical chemistry.

  6. From Structure to Function: A Comprehensive Compendium of Tools to Unveil Protein Domains and Understand Their Role in Cytokinesis.

    PubMed

    Rincon, Sergio A; Paoletti, Anne

    2016-01-01

    Unveiling the function of a novel protein is a challenging task that requires careful experimental design. Yeast cytokinesis is a conserved process that involves modular structural and regulatory proteins. For such proteins, an important step is to identify their domains and structural organization. Here we briefly discuss a collection of methods commonly used for sequence alignment and prediction of protein structure; these represent powerful tools for the identification of homologous domains and for the design of structure-function approaches to experimentally test the function of multi-domain proteins such as those implicated in yeast cytokinesis.

  7. Macro-fingerprint analysis-through-separation of licorice based on FT-IR and 2DCOS-IR

    NASA Astrophysics Data System (ADS)

    Wang, Yang; Wang, Ping; Xu, Changhua; Yang, Yan; Li, Jin; Chen, Tao; Li, Zheng; Cui, Weili; Zhou, Qun; Sun, Suqin; Li, Huifen

    2014-07-01

    In this paper, a step-by-step analysis-through-separation method guided by multi-step IR macro-fingerprints (FT-IR integrated with second-derivative IR (SD-IR) and 2DCOS-IR) was developed for comprehensively characterizing the hierarchical chemical fingerprints of licorice, from the entirety down to single active components. The chemical profile variations of three fractions (flavonoids, saponins and saccharides) in the separation process were holistically revealed, and the number of matching peaks and the correlation coefficients with pure-compound standards increased along the extraction direction. The findings were supported by UPLC results and a verification experiment on an aqueous separation process. It has been demonstrated that the developed multi-step IR macro-fingerprint analysis-through-separation approach could be a rapid, effective and integrated method, not only objectively providing comprehensive chemical characterization of licorice and all its separated fractions, but also rapidly revealing the global enrichment trend of the active components during licorice separation.
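    The 2DCOS-IR analysis referenced above rests on Noda's synchronous correlation spectrum, computed from mean-centered (dynamic) spectra as Φ = ỸᵀỸ/(m−1). A minimal sketch with fabricated band intensities:

```python
import numpy as np

def synchronous_2dcos(spectra):
    """Synchronous 2D correlation map for an (m, n) stack of m
    perturbation-dependent spectra over n wavenumber channels."""
    dyn = spectra - spectra.mean(axis=0)        # dynamic spectra
    return dyn.T @ dyn / (spectra.shape[0] - 1)

# Toy perturbation series: bands 0 and 1 grow together, band 2 shrinks
t = np.linspace(0.0, 1.0, 8)
S = synchronous_2dcos(np.column_stack([t, t, 1.0 - t]))
```

    Positive off-diagonal entries mark bands varying in concert (here bands 0 and 1); negative entries mark bands varying in opposition (bands 0 and 2), which is how 2DCOS resolves overlapped IR features during a separation sequence.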

  8. Pre-eruptive magmatic processes re-timed using a non-isothermal approach to magma chamber dynamics.

    PubMed

    Petrone, Chiara Maria; Bugatti, Giuseppe; Braschi, Eleonora; Tommasini, Simone

    2016-10-05

    Constraining the timescales of pre-eruptive magmatic processes in active volcanic systems is paramount to understand magma chamber dynamics and the triggers for volcanic eruptions. Temporal information of magmatic processes is locked within the chemical zoning profiles of crystals but can be accessed by means of elemental diffusion chronometry. Mineral compositional zoning testifies to the occurrence of substantial temperature differences within magma chambers, which often bias the estimated timescales in the case of multi-stage zoned minerals. Here we propose a new Non-Isothermal Diffusion Incremental Step model to take into account the non-isothermal nature of pre-eruptive processes, deconstructing the main core-rim diffusion profiles of multi-zoned crystals into different isothermal steps. The Non-Isothermal Diffusion Incremental Step model represents a significant improvement in the reconstruction of crystal lifetime histories. Unravelling stepwise timescales at contrasting temperatures provides a novel approach to constraining pre-eruptive magmatic processes and greatly increases our understanding of magma chamber dynamics.
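    The temperature sensitivity that motivates the non-isothermal treatment comes from the Arrhenius dependence of diffusivity, D = D₀ exp(−Eₐ/RT), combined with the characteristic diffusion time t ≈ x²/D. The D₀ and Eₐ values below are placeholders, not the paper's calibrations:

```python
import math

R = 8.314                      # gas constant, J/(mol K)

def diffusion_time_yr(width_m, d0_m2s, ea_j_mol, temp_k):
    """Characteristic diffusion time t ~ x^2 / D for a zoning profile
    of half-width x, converted to years. D0 and Ea are placeholders."""
    d = d0_m2s * math.exp(-ea_j_mol / (R * temp_k))
    return width_m ** 2 / d / (365.25 * 24 * 3600)

# The same 10-micron profile "read" at two temperatures
t_hot = diffusion_time_yr(10e-6, d0_m2s=1e-6, ea_j_mol=250e3, temp_k=1400)
t_cold = diffusion_time_yr(10e-6, d0_m2s=1e-6, ea_j_mol=250e3, temp_k=1200)
```

    A 200 K error in the assumed temperature changes the inferred timescale by more than an order of magnitude here, which is why deconstructing a multi-zoned profile into separate isothermal steps matters.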

  9. Pre-eruptive magmatic processes re-timed using a non-isothermal approach to magma chamber dynamics

    PubMed Central

    Petrone, Chiara Maria; Bugatti, Giuseppe; Braschi, Eleonora; Tommasini, Simone

    2016-01-01

    Constraining the timescales of pre-eruptive magmatic processes in active volcanic systems is paramount to understand magma chamber dynamics and the triggers for volcanic eruptions. Temporal information of magmatic processes is locked within the chemical zoning profiles of crystals but can be accessed by means of elemental diffusion chronometry. Mineral compositional zoning testifies to the occurrence of substantial temperature differences within magma chambers, which often bias the estimated timescales in the case of multi-stage zoned minerals. Here we propose a new Non-Isothermal Diffusion Incremental Step model to take into account the non-isothermal nature of pre-eruptive processes, deconstructing the main core-rim diffusion profiles of multi-zoned crystals into different isothermal steps. The Non-Isothermal Diffusion Incremental Step model represents a significant improvement in the reconstruction of crystal lifetime histories. Unravelling stepwise timescales at contrasting temperatures provides a novel approach to constraining pre-eruptive magmatic processes and greatly increases our understanding of magma chamber dynamics. PMID:27703141

  10. AN INTEGRATED PERSPECTIVE ON THE ASSESSMENT OF TECHNOLOGIES: INTEGRATE-HTA.

    PubMed

    Wahlster, Philip; Brereton, Louise; Burns, Jacob; Hofmann, Björn; Mozygemba, Kati; Oortwijn, Wija; Pfadenhauer, Lisa; Polus, Stephanie; Rehfuess, Eva; Schilling, Imke; van der Wilt, Gert Jan; Gerhardus, Ansgar

    2017-01-01

    Current health technology assessment (HTA) is not well equipped to assess complex technologies as insufficient attention is being paid to the diversity in patient characteristics and preferences, context, and implementation. Strategies to integrate these and several other aspects, such as ethical considerations, in a comprehensive assessment are missing. The aim of the European research project INTEGRATE-HTA was to develop a model for an integrated HTA of complex technologies. A multi-method, four-stage approach guided the development of the INTEGRATE-HTA Model: (i) definition of the different dimensions of information to be integrated, (ii) literature review of existing methods for integration, (iii) adjustment of concepts and methods for assessing distinct aspects of complex technologies in the frame of an integrated process, and (iv) application of the model in a case study and subsequent revisions. The INTEGRATE-HTA Model consists of five steps, each involving stakeholders: (i) definition of the technology and the objective of the HTA; (ii) development of a logic model to provide a structured overview of the technology and the system in which it is embedded; (iii) evidence assessment on effectiveness, economic, ethical, legal, and socio-cultural aspects, taking variability of participants, context, implementation issues, and their interactions into account; (iv) populating the logic model with the data generated in step 3; (v) structured process of decision-making. The INTEGRATE-HTA Model provides a structured process for integrated HTAs of complex technologies. Stakeholder involvement in all steps is essential as a means of ensuring relevance and meaningful interpretation of the evidence.

  11. Space station automation study: Automation requirements derived from space manufacturing concepts. Volume 1: Executive summary

    NASA Technical Reports Server (NTRS)

    1984-01-01

    The electroepitaxial process and the Very Large Scale Integration (VLSI) circuits (chips) facilities were chosen because each requires a very high degree of automation, and therefore involves extensive use of teleoperators, robotics, process mechanization, and artificial intelligence. Both cover a raw materials process and a sophisticated multi-step process and are therefore highly representative of the kinds of difficult operation, maintenance, and repair challenges which can be expected for any type of space manufacturing facility. Generic areas were identified which will require significant further study. The initial design will be based on terrestrial state-of-the-art hard automation. One hundred candidate missions were evaluated on the basis of automation potential and availability of meaningful knowledge. The design requirements and unconstrained design concepts developed for the two missions are presented.

  12. KEEPING A STEP AHEAD - FORMATIVE PHASE OF A WORKPLACE INTERVENTION TRIAL TO PREVENT OBESITY

    PubMed Central

    Zapka, Jane; Lemon, Stephenie C.; Estabrook, Barbara B.; Jolicoeur, Denise G.

    2008-01-01

    Background Ecological interventions hold promise for promoting overweight and obesity prevention in worksites. Given the paucity of evaluative research in the hospital worksite setting, considerable formative work is required for successful implementation and evaluation. Purpose This paper describes the formative phases of Step Ahead, a site-randomized controlled trial of a multi-level intervention that promotes physical activity and healthy eating in 6 hospitals in central Massachusetts. The purpose of the formative research phase was to increase the feasibility, effectiveness and likelihood of sustainability of the intervention. Design and Procedures The Step Ahead ecological intervention approach targets change at the organization, the interpersonal work environment and the individual levels. The intervention was developed using fundamental steps of intervention mapping and important tenets of participatory research. Formative research methods were used to engage leadership support and assistance and to develop an intervention plan that is both theoretically and practically grounded. This report uses observational data, program minutes and reports, and process tracking data. Developmental Strategies and Observations Leadership involvement (key informant interviews and advisory boards), employee focus groups and advisory boards, and quantitative environmental assessments cultivated participation and support. Determining multiple foci of change and designing measurable objectives and generic assessment tools to document progress are complex challenges encountered in planning phases. Lessons Learned Multi-level trials in diverse organizations require flexibility and balance of theory application and practice-based perspectives to affect impact and outcome objectives. Formative research is an essential component. PMID:18073339

  13. A preliminary model of work during initial examination and treatment planning appointments.

    PubMed

    Irwin, J Y; Torres-Urquidy, M H; Schleyer, T; Monaco, V

    2009-01-10

    Objective: This study's objective was to formally describe the work process for charting and treatment planning in general dental practice to inform the design of a new clinical computing environment. Methods: Using a process called contextual inquiry, researchers observed 23 comprehensive examination and treatment planning sessions during 14 visits to 12 general US dental offices. For each visit, field notes were analysed and reformulated as formalised models. Subsequently, each model type was consolidated across all offices and visits. Interruptions to the workflow, called breakdowns, were identified. Results: Clinical work during dental examination and treatment planning appointments is a highly collaborative activity involving dentists, hygienists and assistants. Personnel with multiple overlapping roles complete complex multi-step tasks supported by a large and varied collection of equipment, artifacts and technology. Most of the breakdowns were related to technology, which interrupted the workflow, caused rework and increased the number of steps in work processes. Conclusion: Current dental software could be significantly improved with regard to its support for communication and collaboration, workflow, information design and presentation, information content, and data entry.

  14. Using the Binary Phase-Field Crystal Model to Describe Non-Classical Nucleation Pathways in Gold Nanoparticles

    NASA Astrophysics Data System (ADS)

    Smith, Nathan; Provatas, Nikolas

    Recent experimental work has shown that gold nanoparticles can precipitate from an aqueous solution through a non-classical, multi-step nucleation process. This multi-step process begins with spinodal decomposition into solute-rich and solute-poor liquid domains, followed by nucleation from within the solute-rich domains. We present a binary phase-field crystal theory that shows the same phenomenology, and we examine various cross-over regimes in the growth and coarsening of liquid and solid domains. We would like to thank the Canada Research Chairs (CRC) program for funding this work.

  15. Dissolvable fluidic time delays for programming multi-step assays in instrument-free paper diagnostics.

    PubMed

    Lutz, Barry; Liang, Tinny; Fu, Elain; Ramachandran, Sujatha; Kauffman, Peter; Yager, Paul

    2013-07-21

    Lateral flow tests (LFTs) are an ingenious format for rapid and easy-to-use diagnostics, but they are fundamentally limited to assay chemistries that can be reduced to a single chemical step. In contrast, most laboratory diagnostic assays rely on multiple timed steps carried out by a human or a machine. Here, we use dissolvable sugar applied to paper to create programmable flow delays and present a paper network topology that uses these time delays to program automated multi-step fluidic protocols. Solutions of sucrose at different concentrations (10-70% of saturation) were added to paper strips and dried to create fluidic time delays spanning minutes to nearly an hour. A simple folding card format employing sugar delays was shown to automate a four-step fluidic process initiated by a single user activation step (folding the card); this device was used to perform a signal-amplified sandwich immunoassay for a diagnostic biomarker for malaria. The cards are capable of automating multi-step assay protocols normally used in laboratories, but in a rapid, low-cost, and easy-to-use format.

  16. Dissolvable fluidic time delays for programming multi-step assays in instrument-free paper diagnostics

    PubMed Central

    Lutz, Barry; Liang, Tinny; Fu, Elain; Ramachandran, Sujatha; Kauffman, Peter; Yager, Paul

    2013-01-01

    Lateral flow tests (LFTs) are an ingenious format for rapid and easy-to-use diagnostics, but they are fundamentally limited to assay chemistries that can be reduced to a single chemical step. In contrast, most laboratory diagnostic assays rely on multiple timed steps carried out by a human or a machine. Here, we use dissolvable sugar applied to paper to create programmable flow delays and present a paper network topology that uses these time delays to program automated multi-step fluidic protocols. Solutions of sucrose at different concentrations (10-70% of saturation) were added to paper strips and dried to create fluidic time delays spanning minutes to nearly an hour. A simple folding card format employing sugar delays was shown to automate a four-step fluidic process initiated by a single user activation step (folding the card); this device was used to perform a signal-amplified sandwich immunoassay for a diagnostic biomarker for malaria. The cards are capable of automating multi-step assay protocols normally used in laboratories, but in a rapid, low-cost, and easy-to-use format. PMID:23685876

  17. An automated flow system incorporating in-line acid dissolution of bismuth metal from a cyclotron irradiated target assembly for use in the isolation of astatine-211

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    O’Hara, Matthew J.; Krzysko, Anthony J.; Niver, Cynthia M.

    Astatine-211 (211At) is a promising cyclotron-produced radionuclide being investigated for use in targeted alpha therapy of blood-borne and metastatic cancers, as well as treatment of tumor remnants after surgical resections. The isolation of trace quantities of 211At, produced within several grams of a Bi metal cyclotron target, involves a complex, multi-step procedure: (1) Bi metal dissolution in strong HNO3, (2) distillation of the HNO3 to yield Bi salts containing 211At, (3) dissolution of the salts in strong HCl, (4) solvent extraction of 211At from bismuth salts with diisopropyl ether (DIPE), and (5) back-extraction of 211At from DIPE into NaOH, leading to a purified 211At product. Step (1) has been addressed first to begin the process of automating the onerous 211At isolation process. A computer-controlled Bi target dissolution system has been designed. The system performs in-line dissolution of Bi metal from the target assembly using an enclosed target dissolution block, routing the resulting solubilized 211At/Bi mixture to the subsequent process step. The primary parameters involved in Bi metal solubilization (HNO3 concentration and influent flow rate) were optimized prior to evaluation of the system performance on replicate cyclotron irradiated targets. The results indicate that the system performs reproducibly, having nearly quantitative release of 211At from irradiated targets, with cumulative 211At recoveries that follow a sigmoidal function. The predictable nature of the 211At release profile allows the user to tune the system to meet target processing requirements.

  18. Development, upscaling and validation of the purification process for human-cl rhFVIII (Nuwiq®), a new generation recombinant factor VIII produced in a human cell-line.

    PubMed

    Winge, Stefan; Yderland, Louise; Kannicht, Christoph; Hermans, Pim; Adema, Simon; Schmidt, Torben; Gilljam, Gustav; Linhult, Martin; Tiemeyer, Maya; Belyanskaya, Larisa; Walter, Olaf

    2015-11-01

    Human-cl rhFVIII (Nuwiq®), a new generation recombinant factor VIII (rFVIII), is the first rFVIII produced in a human cell-line approved by the European Medicines Agency. The aim was to describe the development, upscaling and process validation of industrial-scale human-cl rhFVIII purification. The purification process involves one centrifugation step, two filtration steps, five chromatography columns and two dedicated pathogen clearance steps (solvent/detergent treatment and 20 nm nanofiltration). The key purification step uses an affinity resin (VIIISelect) with high specificity for FVIII, removing essentially all host-cell proteins with >80% product recovery. The production-scale multi-step purification process efficiently removes process- and product-related impurities and results in a high-purity rhFVIII product, with an overall yield of ∼50%. Specific activity of the final product was >9000 IU/mg, and the ratio between active FVIII and total FVIII protein present was >0.9. The entire production process is free of animal-derived products. Leaching of potentially harmful compounds from chromatography resins and all pathogens tested were below the limit of quantification in the final product. Human-cl rhFVIII can be produced at 500 L bioreactor scale, maintaining high purity and recoveries. The innovative purification process ensures a high-purity and high-quality human-cl rhFVIII product with a high pathogen safety margin. Copyright © 2015 The Authors. Published by Elsevier Inc. All rights reserved.

  19. Mobile magnetic particles as solid-supports for rapid surface-based bioanalysis in continuous flow.

    PubMed

    Peyman, Sally A; Iles, Alexander; Pamme, Nicole

    2009-11-07

    An extremely versatile microfluidic device is demonstrated in which multi-step (bio)chemical procedures can be performed in continuous flow. The system operates by generating several co-laminar flow streams, which contain reagents for specific (bio)reactions across a rectangular reaction chamber. Functionalized magnetic microparticles are employed as mobile solid-supports and are pulled from one side of the reaction chamber to the other by use of an external magnetic field. As the particles traverse the co-laminar reagent streams, binding and washing steps are performed on their surface in one operation in continuous flow. The applicability of the platform was first demonstrated by performing a proof-of-principle binding assay between streptavidin coated magnetic particles and biotin in free solution with a limit of detection of 20 ng mL(-1) of free biotin. The system was then applied to a mouse IgG sandwich immunoassay as a first example of a process involving two binding steps and two washing steps, all performed within 60 s, a fraction of the time required for conventional testing.

  20. Applying micro-costing methods to estimate the costs of pharmacy interventions: an illustration using multi-professional clinical medication reviews in care homes for older people.

    PubMed

    Sach, Tracey H; Desborough, James; Houghton, Julie; Holland, Richard

    2014-11-06

    Economic methods are underutilised within pharmacy research, resulting in a lack of quality evidence to support funding decisions for pharmacy interventions. The aim of this study is to illustrate the methods of micro-costing within the pharmacy context in order to raise awareness and use of this approach in pharmacy research. Micro-costing methods are particularly useful where a new service or intervention is being evaluated and for which no previous estimates of the costs of providing the service exist. This paper describes the rationale for undertaking a micro-costing study before detailing and illustrating the process involved. The illustration relates to a recently completed trial of multi-professional medication reviews as an intervention provided in care homes. All costs are presented in UK£2012. In general, costing methods involve three broad steps (identification, measurement and valuation); when using micro-costing, closer attention to detail is required within all three stages of this process. The mean (standard deviation; 95% confidence interval (CI)) cost per resident of the multi-professional medication review intervention was £104.80 (50.91; 98.72 to 109.45), such that the overall cost of providing the intervention to all intervention home residents was £36,221.29 (95% CI, £32,810.81 to £39,631.77). This study has demonstrated that micro-costing can be a useful method, not only for estimating the cost of a pharmacy intervention to feed into a pharmacy economic evaluation, but also as a source of information to help inform those designing pharmacy services about the potential time and costs involved in delivering such services. © 2014 Royal Pharmaceutical Society.
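The three costing steps named above (identification, measurement, valuation) can be sketched in a few lines. This is a minimal illustration only: the unit costs and per-resident resource-use records below are invented, not figures from the trial, and the CI uses a simple normal approximation.

```python
import math
import statistics

# Step 3 (valuation): assumed unit costs in GBP per resource unit.
UNIT_COSTS = {
    "pharmacist_hour": 47.0,
    "gp_hour": 144.0,
    "admin_hour": 14.0,
}

# Step 2 (measurement): resource units consumed per resident's review (invented).
residents = [
    {"pharmacist_hour": 1.00, "gp_hour": 0.25, "admin_hour": 0.50},
    {"pharmacist_hour": 1.50, "gp_hour": 0.25, "admin_hour": 0.25},
    {"pharmacist_hour": 0.75, "gp_hour": 0.50, "admin_hour": 0.50},
    {"pharmacist_hour": 1.25, "gp_hour": 0.25, "admin_hour": 0.75},
]

def cost_per_resident(use):
    # Step 1 (identification) is implicit in the resource keys of UNIT_COSTS.
    return sum(UNIT_COSTS[item] * units for item, units in use.items())

costs = [cost_per_resident(r) for r in residents]
mean = statistics.mean(costs)
sd = statistics.stdev(costs)
half_width = 1.96 * sd / math.sqrt(len(costs))  # normal-approximation 95% CI
print(f"mean £{mean:.2f}, 95% CI £{mean - half_width:.2f} to £{mean + half_width:.2f}")
```

The micro-costing detail lives entirely in how finely the resource dictionary and unit-cost table are specified; a gross-costing study would collapse both into a single tariff.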

  1. Certify for success: A methodology for human-centered certification of advanced aviation systems

    NASA Technical Reports Server (NTRS)

    Small, Ronald L.; Rouse, William B.

    1994-01-01

    This position paper uses the methodology in Design for Success as a basis for a human factors certification program. The Design for Success (DFS) methodology espouses a multi-step process for designing and developing systems in a human-centered fashion. These steps are as follows: (1) naturalizing - understand stakeholders and their concerns; (2) marketing - understand market-oriented alternatives to meeting stakeholder concerns; (3) engineering - detailed design and development of the system, considering tradeoffs between technology, cost, schedule, certification requirements, etc.; (4) system evaluation - determining if the system meets its goal(s); and (5) sales and service - delivering and maintaining the system. Because the main topic of this paper is certification, we focus our attention on step 4, system evaluation, since it is the natural precursor to certification. Evaluation involves testing the system and its parts for their correct behaviors. Certification focuses on ensuring not only that the system exhibits the correct behaviors, but ONLY the correct behaviors.

  2. Video Completion in Digital Stabilization Task Using Pseudo-Panoramic Technique

    NASA Astrophysics Data System (ADS)

    Favorskaya, M. N.; Buryachenko, V. V.; Zotin, A. G.; Pakhirka, A. I.

    2017-05-01

    Video completion is a necessary stage after stabilization of a non-stationary video sequence, if it is desirable to make the resolution of the stabilized frames equalled the resolution of the original frames. Usually the cropped stabilized frames lose 10-20% of area that means the worse visibility of the reconstructed scenes. The extension of a view of field may appear due to the pan-tilt-zoom unwanted camera movement. Our approach deals with a preparing of pseudo-panoramic key frame during a stabilization stage as a pre-processing step for the following inpainting. It is based on a multi-layered representation of each frame including the background and objects, moving differently. The proposed algorithm involves four steps, such as the background completion, local motion inpainting, local warping, and seamless blending. Our experiments show that a necessity of a seamless stitching occurs often than a local warping step. Therefore, a seamless blending was investigated in details including four main categories, such as feathering-based, pyramid-based, gradient-based, and optimal seam-based blending.
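Of the four blending categories listed above, feathering-based blending is the simplest to illustrate. The sketch below applies a linear alpha ramp across the overlap of two one-dimensional scanlines; the pixel values and overlap width are invented for illustration, and a real stitcher would operate on 2-D frames.

```python
# Minimal sketch of feathering-based blending on a single scanline.

def feather_blend(left, right, overlap):
    """Blend two scanlines that share `overlap` pixels.

    `left` contributes its tail, `right` its head; inside the overlap
    a linear alpha ramp fades from the left image to the right image.
    """
    assert overlap <= min(len(left), len(right))
    out = list(left[:-overlap])                  # left-only region
    for i in range(overlap):                     # feathered seam region
        alpha = (i + 1) / (overlap + 1)          # ramps from 0 toward 1
        out.append((1 - alpha) * left[len(left) - overlap + i]
                   + alpha * right[i])
    out.extend(right[overlap:])                  # right-only region
    return out

line = feather_blend([100, 100, 100, 100], [200, 200, 200, 200], overlap=2)
print(line)  # intensity ramps smoothly across the 2-pixel seam
```

Pyramid-, gradient-, and optimal seam-based blending replace the fixed linear ramp with multi-scale, gradient-domain, or seam-placement machinery, but all fill the same role in the pipeline's final step.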

  3. 78 FR 13868 - New England Fishery Management Council; Public Meeting

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-03-01

    .... It is an important early step in what will be a multi-step process to develop the policy. The agenda... opportunity for communications between participants about management and science issues that relate to the ABC...

  4. Microarc oxidation coating covered Ti implants with micro-scale gouges formed by a multi-step treatment for improving osseointegration.

    PubMed

    Bai, Yixin; Zhou, Rui; Cao, Jianyun; Wei, Daqing; Du, Qing; Li, Baoqiang; Wang, Yaming; Jia, Dechang; Zhou, Yu

    2017-07-01

    A sub-microporous microarc oxidation (MAO) coating covering a Ti implant with micro-scale gouges has been fabricated via a multi-step MAO process to overcome compromised bone-implant integration. The as-prepared implant was further modified by post-heat treatment to compare the effects of the -OH functional group and the nano-scale orange peel-like morphology on osseointegration. The bone regeneration, bone-implant contact interface, and biomechanical push-out force of the modified Ti implant are discussed thoroughly in this work. The greatly improved push-out force for the MAO-coated Ti implants with micro-scale gouges could be attributed to the excellent mechanical interlocking between the implants and biologically meshed bone tissues. Because the -OH functional group promotes synostosis between the biologically meshed bone and the gouge surface of the implant, the multi-step MAO process could be an effective strategy to improve the osseointegration of Ti implants. Copyright © 2017 Elsevier B.V. All rights reserved.

  5. Asynchronous adaptive time step in quantitative cellular automata modeling

    PubMed Central

    Zhu, Hao; Pang, Peter YH; Sun, Yan; Dhar, Pawan

    2004-01-01

    Background The behaviors of cells in metazoans are context dependent, thus large-scale multi-cellular modeling is often necessary, for which cellular automata are natural candidates. Two related issues are involved in cellular automata based multi-cellular modeling: how to introduce differential equation based quantitative computing to precisely describe cellular activity, and, building on that, how to address the heavy time consumption of simulation. Results Based on a modified, language-based cellular automata system that we extended to allow ordinary differential equations in models, we introduce a method implementing asynchronous adaptive time steps in simulation that can considerably improve efficiency without a significant sacrifice of accuracy. An average speedup rate of 4–5 is achieved in the given example. Conclusions Strategies for reducing time consumption in simulation are indispensable for large-scale, quantitative multi-cellular models, because even a small 100 × 100 × 100 tissue slab contains one million cells. A distributed, adaptive time step is a practical solution in a cellular automata environment. PMID:15222901
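The core idea of an asynchronous, per-cell adaptive time step can be sketched as follows. This is a toy, not the paper's system: each "cell" integrates its own uncoupled ODE dy/dt = -k*y with explicit Euler and step-doubling error control, and the scheduler always advances the cell that lags furthest behind; real models would couple the cells.

```python
# Toy sketch: asynchronous, adaptive per-cell time stepping (assumed dynamics).

def euler(y, k, dt):
    return y + dt * (-k * y)

def adaptive_step(y, k, dt, tol=1e-4):
    """Step-doubling error control: compare one dt step with two dt/2 steps."""
    coarse = euler(y, k, dt)
    fine = euler(euler(y, k, dt / 2), k, dt / 2)
    if abs(fine - coarse) > tol:
        return adaptive_step(y, k, dt / 2, tol)  # reject: halve the step
    return fine, dt                              # accept the finer estimate

# Three cells with very different rate constants get very different steps.
cells = [{"y": 1.0, "k": k, "t": 0.0, "dt": 0.5} for k in (0.1, 1.0, 10.0)]
T_END = 1.0
while any(c["t"] < T_END for c in cells):
    c = min(cells, key=lambda c: c["t"])         # advance the laggard cell
    step = min(c["dt"], T_END - c["t"])
    c["y"], used = adaptive_step(c["y"], c["k"], step)
    c["t"] += used
    c["dt"] = used * 2                           # optimistically grow next step

for c in cells:
    print(f"k={c['k']}: y(1.0) ~ {c['y']:.4f}")
```

The speedup in the paper comes from exactly this asymmetry: slowly changing cells take large steps instead of marching in lockstep with the fastest cell.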

  6. The parallel algorithm for the 2D discrete wavelet transform

    NASA Astrophysics Data System (ADS)

    Barina, David; Najman, Pavel; Kleparnik, Petr; Kula, Michal; Zemcik, Pavel

    2018-04-01

    The discrete wavelet transform can be found at the heart of many image-processing algorithms. Until now, the transform on general-purpose processors (CPUs) was mostly computed using a separable lifting scheme. As the lifting scheme consists of a small number of operations, it is preferred for processing on single-core CPUs. However, for parallel processing on multi-core processors, this scheme is inappropriate due to its large number of steps. On such architectures, each step corresponds to a synchronization point at which data are exchanged, and these points often form a performance bottleneck. Our approach appropriately rearranges the calculations inside the transform and thereby reduces the number of steps. In other words, we propose a new scheme that is friendly to parallel environments. When evaluated on multi-core CPUs, our approach consistently outperforms the original lifting scheme. The evaluation was performed on 61-core Intel Xeon Phi and 8-core Intel Xeon processors.
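The step structure the abstract refers to can be seen in a minimal 1-D lifting example. Below is the integer-to-integer CDF 5/3 (LeGall) wavelet, chosen for illustration; each lifting step (predict, then update) consumes values produced by the previous step, and in a parallel setting each such boundary is a data-exchange point.

```python
# Minimal 1-D CDF 5/3 lifting sketch (integer-to-integer, perfectly invertible).

def dwt53_forward(x):
    assert len(x) % 2 == 0 and len(x) >= 4
    s, d = list(x[0::2]), list(x[1::2])           # split into even/odd samples
    n = len(d)
    for i in range(n):                            # predict step -> details
        right = s[i + 1] if i + 1 < n else s[i]   # symmetric edge handling
        d[i] -= (s[i] + right) // 2
    for i in range(n):                            # update step -> approximation
        left = d[i - 1] if i > 0 else d[i]
        s[i] += (left + d[i] + 2) // 4
    return s, d

def dwt53_inverse(s, d):
    s, d, n = list(s), list(d), len(d)
    for i in range(n):                            # undo update
        left = d[i - 1] if i > 0 else d[i]
        s[i] -= (left + d[i] + 2) // 4
    for i in range(n):                            # undo predict
        right = s[i + 1] if i + 1 < n else s[i]
        d[i] += (s[i] + right) // 2
    out = [0] * (2 * n)
    out[0::2], out[1::2] = s, d
    return out

signal = [3, 7, 1, 8, 2, 8, 1, 9]
lo, hi = dwt53_forward(signal)
assert dwt53_inverse(lo, hi) == signal            # perfect reconstruction
```

Because every lifting step is of the form x ± f(neighbors), each one inverts exactly; the paper's contribution is rearranging such steps so fewer synchronization boundaries are needed on multi-core hardware.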

  7. Understanding missed opportunities for more timely diagnosis of cancer in symptomatic patients after presentation

    PubMed Central

    Lyratzopoulos, G; Vedsted, P; Singh, H

    2015-01-01

    The diagnosis of cancer is a complex, multi-step process. In this paper, we highlight factors involved in missed opportunities to diagnose cancer more promptly in symptomatic patients and discuss responsible mechanisms and potential strategies to shorten intervals from presentation to diagnosis. Missed opportunities are instances in which post-hoc judgement indicates that alternative decisions or actions could have led to more timely diagnosis. They can occur in any of the three phases of the diagnostic process (initial diagnostic assessment; diagnostic test performance and interpretation; and diagnostic follow-up and coordination) and can involve patient, doctor/care team, and health-care system factors, often in combination. In this perspective article, we consider epidemiological 'signals' suggestive of missed opportunities and draw on evidence from retrospective case reviews of cancer patient cohorts to summarise factors that contribute to missed opportunities. Multi-disciplinary research targeting such factors is important to shorten diagnostic intervals post presentation. Insights from the fields of organisational and cognitive psychology, human factors science and informatics can be extremely valuable in this emerging research agenda. We provide a conceptual foundation for the development of future interventions to minimise the occurrence of missed opportunities in cancer diagnosis, enriching current approaches that chiefly focus on clinical decision support or on widening access to investigations. PMID:25734393

  8. Molecular genetic analysis of plant gravitropism

    NASA Technical Reports Server (NTRS)

    Lomax, T. L.

    1997-01-01

    The analysis of mutants is a powerful approach for elucidating the components of complex biological processes. A growing number of mutants have been isolated which affect plant gravitropism and the classes of mutants found thus far provide important information about the gravity response mechanism. The wide variety of mutants isolated, especially in Arabidopsis, indicates that gravitropism is a complex, multi-step process. The existence of mutants altered in either root gravitropism alone, shoot gravitropism alone, or both indicates that the root and shoot gravitropic mechanisms have both separate and common steps. Reduced starch mutants have confirmed the role of amyloplasts in sensing the gravity signal. The hormone auxin is thought to act as the transducing signal between the sites of gravity perception (the starch parenchyma cells surrounding the vascular tissue in shoots and the columella cells of root caps) and asymmetric growth (the epidermal cells of the elongation zone(s) of each organ). To date, all mutants that are resistant to high concentrations of auxin have also been found to exhibit a reduced gravitropic response, thus supporting the role of auxin. Not all gravitropic mutants are auxin-resistant, however, indicating that there are additional steps which do not involve auxin. Studies with mutants of tomato which exhibit either reduced or reversed gravitropic responses further support the role of auxin redistribution in gravitropism and suggest that both red light and cytokinin interact with gravitropism through controlling lateral auxin transport. Plant responses to gravity thus likely involve changes in both auxin transport and sensitivity.

  9. Selective dry etching of silicon containing anti-reflective coating

    NASA Astrophysics Data System (ADS)

    Sridhar, Shyam; Nolan, Andrew; Wang, Li; Karakas, Erdinc; Voronin, Sergey; Biolsi, Peter; Ranjan, Alok

    2018-03-01

    Multi-layer patterning schemes involve the use of Silicon containing Anti-Reflective Coating (SiARC) films for their anti-reflective properties. Completing the pattern transfer requires complete and selective removal of SiARC, which is very difficult due to its high silicon content (>40%). Typically, SiARC removal is accomplished either through a non-selective etch with fluorine-containing plasmas during the pattern transfer process, or through an ex-situ wet etch with hydrofluoric acid to remove residual SiARC after pattern transfer. Using a non-selective etch may result in profile distortion or wiggling, due to distortion of the underlying organic layer. The drawbacks of using a wet etch process for SiARC removal are increased overall processing time and the need for additional equipment. Many applications may involve patterning of active structures in a poly-Si layer with an underlying oxide stopping layer. In such applications, SiARC removal selective to oxide using a wet process may prove futile. Removing SiARC selectively to SiO2 using a dry etch process is also challenging, due to the similarity of the chemical bonds (Si-O) in the two materials. In this work, we present highly selective etching of SiARC in a plasma driven by a surface wave radial line slot antenna. The first step in the process involves an in-situ modification of the SiARC layer in O2 plasma, followed by selective etching in a NF3/H2 plasma. Surface treatment in O2 plasma resulted in enhanced etching of the SiARC layer. For the right processing conditions, the in-situ NF3/H2 dry etch process demonstrated selectivity values greater than 15:1 with respect to SiO2. The etching chemistry, however, was sensitive to the NF3:H2 gas ratio. For dilute NF3 in H2, no SiARC etching was observed, presumably due to the deposition of an ammonium fluorosilicate layer that occurs in dilute NF3/H2 plasmas. Additionally, challenges involved in selective SiARC removal (selective to SiO2, organic and Si layers) post pattern transfer in a multi-layer structure will be discussed.

  10. Unifying Temporal and Structural Credit Assignment Problems

    NASA Technical Reports Server (NTRS)

    Agogino, Adrian K.; Tumer, Kagan

    2004-01-01

    Single-agent reinforcement learners in time-extended domains and multi-agent systems share a common dilemma known as the credit assignment problem. Multi-agent systems have the structural credit assignment problem of determining the contributions of a particular agent to a common task. In turn, time-extended single-agent systems have the temporal credit assignment problem of determining the contribution of a particular action to the quality of the full sequence of actions. Traditionally these two problems are considered different and are handled in separate ways. In this article we show how these two forms of the credit assignment problem are equivalent. In this unified framework, a single-agent Markov decision process can be broken down into a single-time-step multi-agent process. Furthermore we show that Monte-Carlo estimation or Q-learning (depending on whether the values of resulting actions in the episode are known at the time of learning) are equivalent to different agent utility functions in a multi-agent system. This equivalence shows how an often neglected issue in multi-agent systems is equivalent to a well-known deficiency in multi-time-step learning and lays the basis for solving time-extended multi-agent problems, where both credit assignment problems are present.
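The two estimators the abstract contrasts can be shown side by side on a toy problem. This sketch is not from the paper: it uses an invented two-step chain (state 0 leads to state 1, where action "a" pays 1 and "b" pays 0) and compares Monte-Carlo return averaging against one-step Q-learning with bootstrapping.

```python
import random

random.seed(0)
GAMMA = 0.9

def episode(policy):
    """State 0 -> state 1 (reward 0) -> terminal; reward depends on the action at state 1."""
    a = policy()
    return a, (1.0 if a == "a" else 0.0)

# Monte-Carlo estimation: credit each action with the full episode return.
mc_q = {"a": 0.0, "b": 0.0}
counts = {"a": 0, "b": 0}
for _ in range(1000):
    a, r = episode(lambda: random.choice("ab"))
    counts[a] += 1
    mc_q[a] += (r - mc_q[a]) / counts[a]          # running-average return

# Q-learning: bootstrap one step at a time instead of waiting for the return.
q1 = {"a": 0.0, "b": 0.0}                          # Q(state 1, action)
v0 = 0.0                                           # V(state 0), single dummy action
ALPHA = 0.1
for _ in range(1000):
    a, r = episode(lambda: random.choice("ab"))
    q1[a] += ALPHA * (r - q1[a])                   # terminal step: no bootstrap term
    v0 += ALPHA * (GAMMA * max(q1.values()) - v0)  # state 0 bootstraps off state 1

print(mc_q, q1, round(v0, 3))
```

Both estimators converge to the same values here (Q(1,"a") near 1, V(0) near GAMMA); the paper's point is that the choice between them corresponds to different agent utility functions once the time-extended process is unrolled into a one-step multi-agent one.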

  11. Long range personalized cancer treatment strategies incorporating evolutionary dynamics.

    PubMed

    Yeang, Chen-Hsiang; Beckman, Robert A

    2016-10-22

    Current cancer precision medicine strategies match therapies to static consensus molecular properties of an individual's cancer, thus determining the next therapeutic maneuver. These strategies typically maintain a constant treatment while the cancer is not worsening. However, cancers feature complicated sub-clonal structure and dynamic evolution. We have recently shown, in a comprehensive simulation of two non-cross resistant therapies across a broad parameter space representing realistic tumors, that substantial improvement in cure rates and median survival can be obtained utilizing dynamic precision medicine strategies. These dynamic strategies explicitly consider intratumoral heterogeneity and evolutionary dynamics, including predicted future drug resistance states, and reevaluate optimal therapy every 45 days. However, the optimization is performed in single 45 day steps ("single-step optimization"). Herein we evaluate analogous strategies that think multiple therapeutic maneuvers ahead, considering potential outcomes at 5 steps ahead ("multi-step optimization") or 40 steps ahead ("adaptive long term optimization (ALTO)") when recommending the optimal therapy in each 45 day block, in simulations involving both 2 and 3 non-cross resistant therapies. We also evaluate an ALTO approach for situations where simultaneous combination therapy is not feasible ("Adaptive long term optimization: serial monotherapy only (ALTO-SMO)"). Simulations utilize populations of 764,000 and 1,700,000 virtual patients for 2 and 3 drug cases, respectively. Each virtual patient represents a unique clinical presentation including sizes of major and minor tumor subclones, growth rates, evolution rates, and drug sensitivities. While multi-step optimization and ALTO provide no significant average survival benefit, cure rates are significantly increased by ALTO. Furthermore, in the subset of individual virtual patients demonstrating clinically significant difference in outcome between approaches, by far the majority show an advantage of multi-step or ALTO over single-step optimization. ALTO-SMO delivers cure rates superior or equal to those of single- or multi-step optimization, in 2 and 3 drug cases respectively. In selected virtual patients incurable by dynamic precision medicine using single-step optimization, analogous strategies that "think ahead" can deliver long-term survival and cure without any disadvantage for non-responders. When therapies require dose reduction in combination (due to toxicity), optimal strategies feature complex patterns involving rapidly interleaved pulses of combinations and high dose monotherapy. This article was reviewed by Wendy Cornell, Marek Kimmel, and Andrzej Swierniak. Wendy Cornell and Andrzej Swierniak are external reviewers (not members of the Biology Direct editorial board). Andrzej Swierniak was nominated by Marek Kimmel.
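The contrast between single-step and multi-step optimization can be sketched on a deliberately tiny model. Everything here is invented for illustration (two subclones, one drug each, fixed kill/growth/mutation rates, and exhaustive search in place of the paper's optimization); the only guaranteed relationship is that exhaustive lookahead can never end with a higher burden than the greedy plan it subsumes.

```python
from itertools import product

KILL, GROW, MUT = 0.1, 2.0, 1e-3   # assumed per-block rates, illustration only
H = 8                              # planning horizon, in treatment blocks

def step(state, drug):
    s, r = state
    r += MUT * s                   # sensitive cells spawn resistant ones
    if drug == 1:                  # drug 1 kills S, leaves R growing
        return s * KILL, r * GROW
    return s * GROW, r * KILL      # drug 2 kills R, leaves S growing

def burden(state, drugs):
    for d in drugs:
        state = step(state, d)
    return sum(state)

start = (1e6, 10.0)                # large sensitive clone, small resistant clone

# Single-step optimization: pick whichever drug minimizes next-block burden.
state, greedy_plan = start, []
for _ in range(H):
    d = min((1, 2), key=lambda d: sum(step(state, d)))
    greedy_plan.append(d)
    state = step(state, d)
greedy_burden = sum(state)

# Multi-step lookahead: exhaustively search all drug sequences of length H.
best_plan = min(product((1, 2), repeat=H), key=lambda p: burden(start, p))
best_burden = burden(start, best_plan)

print(greedy_plan, f"{greedy_burden:.3g}")
print(list(best_plan), f"{best_burden:.3g}")
```

ALTO's 40-block horizon makes exhaustive enumeration infeasible, which is why the paper's strategies rely on structured search rather than the brute force shown here.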

  12. Multi-Step Time Series Forecasting with an Ensemble of Varied Length Mixture Models.

    PubMed

    Ouyang, Yicun; Yin, Hujun

    2018-05-01

    Many real-world problems require modeling and forecasting of time series, such as weather temperature, electricity demand, stock prices and foreign exchange (FX) rates. Often, the tasks involve predicting over a long-term period, e.g. several weeks or months. Most existing time series models are inherently one-step predictors, that is, they predict one time point ahead. Multi-step or long-term prediction is difficult and challenging due to the lack of information and the accumulation of uncertainty or error. The main existing approaches, iterative and independent, either apply a one-step model recursively or treat each forecast horizon with an independent model; they generally perform poorly in practical applications. In this paper, as an extension of the self-organizing mixture autoregressive (AR) model, the varied length mixture (VLM) models are proposed to model and forecast time series over multiple steps. The key idea is to preserve the dependencies between the time points within the prediction horizon. Training data are segmented to various lengths corresponding to various forecasting horizons, and the VLM models are trained in a self-organizing fashion on these segments to capture these dependencies in component AR models of various predicting horizons. The VLM models form a probabilistic mixture of these varied length models. A combination of short and long VLM models and an ensemble of them are proposed to further enhance the prediction performance. The effectiveness of the proposed methods and their marked improvements over existing methods are demonstrated through a number of experiments on synthetic data, real-world FX rates and weather temperatures.
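The iterative-versus-independent distinction the abstract draws can be sketched on a noiseless AR(1) series, where the two strategies coincide exactly; on noisy data the iterative recursion compounds one-step errors while the direct model absorbs them into its fit. The coefficient and series below are invented for illustration, and VLM models themselves are far more elaborate than either baseline.

```python
# Sketch of the two standard multi-step forecasting strategies on a
# noiseless AR(1) series y[t] = PHI * y[t-1] (illustrative coefficients).

PHI = 0.8
series = [1.0]
for _ in range(29):
    series.append(PHI * series[-1])

def iterative_forecast(last, h, phi=PHI):
    """Recurse a one-step model h times; errors compound on real data."""
    y = last
    for _ in range(h):
        y = phi * y
    return y

def direct_forecast(train, h):
    """Fit a separate direct model y[t+h] = c * y[t] by least squares."""
    pairs = [(train[t], train[t + h]) for t in range(len(train) - h)]
    num = sum(x * y for x, y in pairs)
    den = sum(x * x for x, _ in pairs)
    return (num / den) * train[-1]

h = 5
print(iterative_forecast(series[-1], h), direct_forecast(series, h))
```

On this clean series the direct least-squares coefficient recovers PHI**h exactly, so both forecasts agree; the VLM approach instead keeps the whole horizon in one model so that dependencies between the h intermediate points are preserved.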

  13. Effect of Continuous Multi-Walled Carbon Nanotubes on Thermal and Mechanical Properties of Flexible Composite Film

    PubMed Central

    Cha, Ji Eun; Kim, Seong Yun; Lee, Seung Hee

    2016-01-01

    To investigate the effect of continuous multi-walled carbon nanotubes (MWCNTs) on the thermal and mechanical properties of composites, we propose a fabrication method for a buckypaper-filled flexible composite film prepared by a two-step process involving buckypaper fabrication using vacuum filtration of MWCNTs, and composite film fabrication using the dipping method. The thermal conductivity and tensile strength of the composite film filled with the buckypaper exhibited improved results, respectively 76% and 275% greater than those of the individual MWCNT-filled composite film. It was confirmed that forming continuous MWCNT fillers is an important factor which determines the physical characteristics of the composite film. In light of the study findings, composite films using buckypaper as a filler and polydimethylsiloxane (PDMS) as a flexible matrix have sufficient potential to be applied as a heat-dissipating material, and as a flexible film with high thermal conductivity and excellent mechanical properties. PMID:28335310

  14. Defining process design space for a hydrophobic interaction chromatography (HIC) purification step: application of quality by design (QbD) principles.

    PubMed

    Jiang, Canping; Flansburg, Lisa; Ghose, Sanchayita; Jorjorian, Paul; Shukla, Abhinav A

    2010-12-15

    The concept of design space has been taking root under the quality by design paradigm as a foundation of in-process control strategies for biopharmaceutical manufacturing processes. This paper outlines the development of a design space for a hydrophobic interaction chromatography (HIC) process step. The design space included the impact of raw material lot-to-lot variability and variations in the feed stream from cell culture. A failure modes and effects analysis was employed as the basis for the process characterization exercise. During mapping of the process design space, the multi-dimensional combination of operational variables was studied to quantify the impact on process performance in terms of yield and product quality. Variability in resin hydrophobicity was found to have a significant influence on step yield and high-molecular-weight aggregate clearance through the HIC step. A robust operating window was identified for this process step that enabled a higher step yield while ensuring acceptable product quality. © 2010 Wiley Periodicals, Inc.

  15. Lignocellulose hydrolysis by multienzyme complexes

    USDA-ARS?s Scientific Manuscript database

    Lignocellulosic biomass is the most abundant renewable resource on the planet. Converting this material into a usable fuel is a multi-step process, the rate-limiting step being enzymatic hydrolysis of organic polymers into monomeric sugars. While the substrate can be complex and require a multitud...

  16. GIST Clinic Application 2018 | Center for Cancer Research

    Cancer.gov

    Clinic date: June 20-22, 2018 This Application is the first step in a multi-step process for being considered for participation in our upcoming Pediatric and wild-type GIST clinic. Please review all 3 pages and complete all questions in full.

  17. Analysing UK clinicians' understanding of cognitive symptoms in major depression: A survey of primary care physicians and psychiatrists.

    PubMed

    McAllister-Williams, R Hamish; Bones, Kate; Goodwin, Guy M; Harrison, John; Katona, Cornelius; Rasmussen, Jill; Strong, Sarah; Young, Allan H

    2017-01-01

    Cognitive dysfunction occurs in depression and can persist into remission. It impacts on patient functioning but remains largely unrecognised, unmonitored and untreated. We explored understanding of cognitive dysfunction in depression among UK clinicians through a multi-step consultation process. Step 1: a multi-stakeholder steering committee identified key themes of burden, detection and management of cognitive dysfunction in depression, and developed statements on each to explore understanding and degree of agreement among clinicians. Step 2: 100 general practitioners (GPs) and 100 psychiatrists indicated their level of agreement with these statements. Step 3: the steering committee reviewed responses and highlighted priority areas for future education and research. There was agreement that clinicians are not fully aware of cognitive dysfunction in depression. Views of the relationship between cognitive dysfunction and the severity of other depressive symptoms were not consistent with the literature. In particular, there was a lack of recognition that some cognitive dysfunction can persist into remission. There was understandable uncertainty around treatment options, given the current limited evidence base. However, it was recognised that cognitive dysfunction is an area of unmet need and that there is a lack of objective tests of cognition appropriate for depressed patients that can be easily implemented in the clinic. Respondents are likely to have been 'led' by the direction of the statements they reviewed. The study did not involve patients and carers. UK clinicians should undergo training regarding cognitive dysfunction in depression, and further research is needed into its assessment, treatment and monitoring. Copyright © 2016 Elsevier B.V. All rights reserved.

  18. Application of the quality by design approach to the drug substance manufacturing process of an Fc fusion protein: towards a global multi-step design space.

    PubMed

    Eon-duval, Alex; Valax, Pascal; Solacroup, Thomas; Broly, Hervé; Gleixner, Ralf; Strat, Claire L E; Sutter, James

    2012-10-01

    The article describes how Quality by Design principles can be applied to the drug substance manufacturing process of an Fc fusion protein. First, the quality attributes of the product were evaluated for their potential impact on safety and efficacy using risk management tools. Similarly, process parameters that have a potential impact on critical quality attributes (CQAs) were also identified through a risk assessment. Critical process parameters were then evaluated for their impact on CQAs, individually and in interaction with each other, using multivariate design of experiment techniques during the process characterisation phase. The global multi-step Design Space, defining operational limits for the entire drug substance manufacturing process so as to ensure that the drug substance quality targets are met, was devised using predictive statistical models developed during the characterisation study. The validity of the global multi-step Design Space was then confirmed by performing the entire process, from cell bank thawing to final drug substance, at its limits during the robustness study: the quality of the final drug substance produced under different conditions was verified against predefined targets. An adaptive strategy was devised whereby the Design Space can be adjusted to the quality of the input material to ensure reliable drug substance quality. Finally, all the data obtained during the process described above, together with data generated during additional validation studies as well as manufacturing data, were used to define the control strategy for the drug substance manufacturing process using a risk assessment methodology. Copyright © 2012 Wiley-Liss, Inc.

  19. Multi-Criteria Decision Making for a Spatial Decision Support System on the Analysis of Changing Risk

    NASA Astrophysics Data System (ADS)

    Olyazadeh, Roya; van Westen, Cees; Bakker, Wim H.; Aye, Zar Chi; Jaboyedoff, Michel; Derron, Marc-Henri

    2014-05-01

    Natural hazard risk management requires decision making in several stages. Decision making on alternatives for risk reduction planning starts with an intelligence phase, in which the decision problems are recognised and the objectives identified. The design phase involves developing the alternatives and assigning variables to each of them. The final phase evaluates the optimal choice by comparing the alternatives, defining indicators, assigning a weight to each, and ranking them. This process is referred to as Multi-Criteria Decision Making analysis (MCDM), Multi-Criteria Evaluation (MCE) or Multi-Criteria Analysis (MCA). In the framework of the ongoing 7th Framework Program "CHANGES" (2011-2014, Grant Agreement No. 263953) of the European Commission, a Spatial Decision Support System is under development that aims to analyse changes in hydro-meteorological risk and to support the selection of the best risk reduction alternative. This paper describes the module for Multi-Criteria Decision Making analysis (MCDM), which incorporates monetary and non-monetary criteria in the analysis of the optimal alternative. The MCDM module consists of several components. The first step is to define criteria (or indicators), which are subdivided into disadvantages (criteria that indicate the difficulty of implementing a risk reduction strategy, also referred to as costs) and advantages (criteria that indicate its favorability, also referred to as benefits). In the next step, stakeholders can use the developed web-based tool to prioritise the criteria and build the decision matrix. Public participation plays a role in decision making, and this is also planned through a mobile web version in which the general public can indicate their agreement with the proposed alternatives. The application is being tested through a case study on risk reduction in a mountainous valley in the Alps affected by flooding.
    Four alternatives are evaluated in this case study: construction of defence structures, relocation, implementation of an early warning system, and spatial planning regulations. Some of the criteria are determined partly in other modules of the CHANGES SDSS, such as the costs of implementation, the risk reduction in monetary terms, and societal risk. Other criteria, which may be environmental, economic, cultural or perception-related in nature, are defined by different stakeholders such as local authorities, expert organisations, the private sector, and the local public. In the next step, the stakeholders weight the importance of the criteria by pairwise comparison and visualise the decision matrix, a matrix of criteria versus alternative values. Finally, the alternatives are ranked using the Analytic Hierarchy Process (AHP) method. We expect that this approach will ease the decision makers' work and reduce their costs, because the process is more transparent, more accurate and involves a group decision; in that way there will be more confidence in the overall decision-making process. Keywords: MCDM, Analytic Hierarchy Process (AHP), SDSS, Natural Hazard Risk Management
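    The pairwise-comparison and ranking step can be illustrated with Saaty's classic eigenvector method. This is a minimal sketch of standard AHP, assuming a reciprocal comparison matrix; the SDSS's actual implementation is not described in the abstract:

```python
import numpy as np

def ahp_priorities(pairwise):
    """Criterion weights from a reciprocal pairwise-comparison matrix,
    taken as the normalised principal eigenvector (Saaty's AHP)."""
    A = np.asarray(pairwise, dtype=float)
    vals, vecs = np.linalg.eig(A)
    w = np.abs(vecs[:, np.argmax(vals.real)].real)
    return w / w.sum()

def rank_alternatives(scores, weights):
    """Rank alternatives (rows of per-criterion scores) by weighted score,
    best first."""
    totals = np.asarray(scores, dtype=float) @ np.asarray(weights)
    return list(np.argsort(-totals))
```

    In a group setting such as the one described, each stakeholder's comparison matrix can be aggregated (e.g. by the geometric mean of the entries) before computing the priorities.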

  20. Enhancing the functional properties of thermophilic enzymes by chemical modification and immobilization.

    PubMed

    Cowan, Don A; Fernandez-Lafuente, Roberto

    2011-09-10

    The immobilization of proteins (most typically enzymes) onto solid supports is a mature technology and has been used successfully to enhance biocatalytic processes in a wide range of industrial applications. However, continued developments in immobilization technology have led to more sophisticated and specialized applications of the process. A combination of targeted chemistries, for both the support and the protein, sometimes in combination with additional chemical and/or genetic engineering, has led to the development of methods for the modification of protein functional properties, for enhancing protein stability and for the recovery of specific proteins from complex mixtures. In particular, the development of effective methods for immobilizing large multi-subunit proteins with multiple covalent linkages (multi-point immobilization) has been effective in stabilizing proteins where subunit dissociation is the initial step in enzyme inactivation. In some instances, multiple benefits are achievable in a single process. Here we comprehensively review the literature pertaining to immobilization and chemical modification of different enzyme classes from thermophiles, with emphasis on the chemistries involved and their implications for modification of the enzyme functional properties. We also highlight the potential for synergies in the combined use of immobilization and other chemical modifications. Copyright © 2011 Elsevier Inc. All rights reserved.

  1. Communication Audits in the Age of the Internet.

    ERIC Educational Resources Information Center

    Goldhaber, Gerald M.

    2002-01-01

    Describes the history of a multi-instrument approach for auditing the communication behavior of organizations. Notes that with the advent of the Internet, limitations of survey research have virtually been eliminated. Outlines four necessary steps involved in a Web-based communication survey. (PM)

  2. Multi-objective optimization of solid waste flows: environmentally sustainable strategies for municipalities.

    PubMed

    Minciardi, Riccardo; Paolucci, Massimo; Robba, Michela; Sacile, Roberto

    2008-11-01

    An approach to sustainable municipal solid waste (MSW) management is presented, with the aim of supporting the decision on the optimal flows of solid waste sent to landfill, recycling and different types of treatment plants, whose sizes are also decision variables. This problem is modeled with a non-linear, multi-objective formulation. Specifically, four objectives to be minimized have been taken into account, which are related to economic costs, unrecycled waste, sanitary landfill disposal and environmental impact (incinerator emissions). An interactive reference point procedure has been developed to support decision making; these methods are considered appropriate for multi-objective decision problems in environmental applications. In addition, interactive methods are generally preferred by decision makers as they can be directly involved in the various steps of the decision process. Some results deriving from the application of the proposed procedure are presented. The application of the procedure is exemplified by considering the interaction with two different decision makers who are assumed to be in charge of planning the MSW system in the municipality of Genova (Italy).
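    An interactive reference point procedure of the kind described can be sketched with an achievement scalarizing function. This is a generic textbook form for minimised objectives, not the authors' exact formulation:

```python
def achievement(objectives, reference, weights):
    """Largest weighted excess of each (minimised) objective over the
    decision maker's aspiration level; smaller is better."""
    return max(w * (f - z) for f, z, w in zip(objectives, reference, weights))

def best_alternative(alternatives, reference, weights):
    """Pick the alternative closest to the reference point in this sense."""
    return min(alternatives, key=lambda f: achievement(f, reference, weights))
```

    In an interactive session, the decision maker inspects the chosen alternative, adjusts the reference (aspiration) point, and re-solves, which is why such methods suit the direct stakeholder involvement the abstract emphasises.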

  3. A Study on Segmented Multiple-Step Forming of Doubly Curved Thick Plate by Reconfigurable Multi-Punch Dies

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ko, Young Ho; Han, Myoung Soo; Han, Jong Man

    2007-05-17

    Doubly curved thick plate forming in the shipbuilding industry is currently performed by a thermal forming process called line heating, using gas flame torches. Because this is empirical manual work, the industry is eager for an alternative way to manufacture curved thick plates for ships. This study envisaged manufacturing doubly curved thick plates by multi-punch die forming. Experiments and finite element analyses were conducted to evaluate the feasibility of reconfigurable discrete die forming for thick plates. Single and segmented multiple-step forming procedures were considered from the standpoints of both forming efficiency and accuracy. The configuration of multi-punch dies suitable for segmented multiple-step forming was also explored. As a result, segmented multiple-step forming with matched dies had limited formability when the objective shapes became complicated, while an unmatched die configuration provided a better possibility of manufacturing large curved plates for ships.

  4. A Multi-step Transcriptional and Chromatin State Cascade Underlies Motor Neuron Programming from Embryonic Stem Cells.

    PubMed

    Velasco, Silvia; Ibrahim, Mahmoud M; Kakumanu, Akshay; Garipler, Görkem; Aydin, Begüm; Al-Sayegh, Mohamed Ahmed; Hirsekorn, Antje; Abdul-Rahman, Farah; Satija, Rahul; Ohler, Uwe; Mahony, Shaun; Mazzoni, Esteban O

    2017-02-02

    Direct cell programming via overexpression of transcription factors (TFs) aims to control cell fate with the degree of precision needed for clinical applications. However, the regulatory steps involved in successful terminal cell fate programming remain obscure. We have investigated the underlying mechanisms by looking at gene expression, chromatin states, and TF binding during the uniquely efficient Ngn2, Isl1, and Lhx3 motor neuron programming pathway. Our analysis reveals a highly dynamic process in which Ngn2 and the Isl1/Lhx3 pair initially engage distinct regulatory regions. Subsequently, Isl1/Lhx3 binding shifts from one set of targets to another, controlling regulatory region activity and gene expression as cell differentiation progresses. Binding of Isl1/Lhx3 to later motor neuron enhancers depends on the Ebf and Onecut TFs, which are induced by Ngn2 during the programming process. Thus, motor neuron programming is the product of two initially independent transcriptional modules that converge with a feedforward transcriptional logic. Copyright © 2017 Elsevier Inc. All rights reserved.

  5. Large-scale automated image analysis for computational profiling of brain tissue surrounding implanted neuroprosthetic devices using Python.

    PubMed

    Rey-Villamizar, Nicolas; Somasundar, Vinay; Megjhani, Murad; Xu, Yan; Lu, Yanbin; Padmanabhan, Raghav; Trett, Kristen; Shain, William; Roysam, Badri

    2014-01-01

    In this article, we describe the use of Python for large-scale automated server-based bio-image analysis in FARSIGHT, a free and open-source toolkit of image analysis methods for quantitative studies of complex and dynamic tissue microenvironments imaged by modern optical microscopes, including confocal, multi-spectral, multi-photon, and time-lapse systems. The core FARSIGHT modules for image segmentation, feature extraction, tracking, and machine learning are written in C++, leveraging widely used libraries including ITK, VTK, Boost, and Qt. For solving complex image analysis tasks, these modules must be combined into scripts using Python. As a concrete example, we consider the problem of analyzing 3-D multi-spectral images of brain tissue surrounding implanted neuroprosthetic devices, acquired using high-throughput multi-spectral spinning disk step-and-repeat confocal microscopy. The resulting images typically contain 5 fluorescent channels. Each channel consists of 6000 × 10,000 × 500 voxels with 16 bits/voxel, implying image sizes exceeding 250 GB. These images must be mosaicked, pre-processed to overcome imaging artifacts, and segmented to enable cellular-scale feature extraction. The features are used to identify cell types, and perform large-scale analysis for identifying spatial distributions of specific cell types relative to the device. Python was used to build a server-based script (Dell 910 PowerEdge servers with 4 sockets/server with 10 cores each, 2 threads per core and 1TB of RAM running on Red Hat Enterprise Linux linked to a RAID 5 SAN) capable of routinely handling image datasets at this scale and performing all these processing steps in a collaborative multi-user multi-platform environment. Our Python script enables efficient data storage and movement between computers and storage servers, logs all the processing steps, and performs full multi-threaded execution of all codes, including open and closed-source third party libraries.
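    The kind of multi-threaded, logged pipeline execution described can be sketched with Python's standard library. The step and channel names below are illustrative, not FARSIGHT's actual API:

```python
from concurrent.futures import ThreadPoolExecutor

def run_pipeline(channels, steps, max_workers=4):
    """Apply an ordered list of (name, function) processing steps to each
    channel in parallel, keeping a per-channel log of completed steps."""
    log = {}

    def process(item):
        name, data = item
        completed = []
        for step_name, fn in steps:
            data = fn(data)          # e.g. mosaic, pre-process, segment
            completed.append(step_name)
        log[name] = completed        # record of processing steps, as the
        return name, data            # script described above logs them

    with ThreadPoolExecutor(max_workers=max_workers) as pool:
        results = dict(pool.map(process, channels.items()))
    return results, log
```

    For CPU-bound image-processing steps backed by C++ modules that release the GIL (as with ITK/VTK bindings), threads can give real parallelism; otherwise a process pool would be the usual substitute.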

  6. Parallel workflow tools to facilitate human brain MRI post-processing

    PubMed Central

    Cui, Zaixu; Zhao, Chenxi; Gong, Gaolang

    2015-01-01

    Multi-modal magnetic resonance imaging (MRI) techniques are widely applied in human brain studies. To obtain specific brain measures of interest from MRI datasets, a number of complex image post-processing steps are typically required. Parallel workflow tools have recently been developed, concatenating individual processing steps and enabling fully automated processing of raw MRI data to obtain the final results. These workflow tools are also designed to make optimal use of available computational resources and to support the parallel processing of different subjects or of independent processing steps for a single subject. Automated, parallel MRI post-processing tools can greatly facilitate relevant brain investigations and are being increasingly applied. In this review, we briefly summarize these parallel workflow tools and discuss relevant issues. PMID:26029043

  7. Evaluation of accuracy in implant site preparation performed in single- or multi-step drilling procedures.

    PubMed

    Marheineke, Nadine; Scherer, Uta; Rücker, Martin; von See, Constantin; Rahlf, Björn; Gellrich, Nils-Claudius; Stoetzer, Marcus

    2018-06-01

    Dental implant failure and insufficient osseointegration are proven results of mechanical and thermal damage during the surgical process. We herein performed a comparative study of a less invasive single-step drilling preparation protocol and a conventional multiple drilling sequence. The accuracy of the drilling holes was precisely analyzed, and the influence of the handlers' different levels of expertise and of additional drill-template guidance was evaluated. Six experimental groups, deployed in an osseous study model, represented template-guided and freehanded drilling actions in a stepwise drilling procedure in comparison to a single-drill protocol. Each experimental condition was studied through the drilling actions of three persons without surgical knowledge as well as three highly experienced oral surgeons. Drilling actions were performed and diameters were recorded with a precision measuring instrument. Less experienced operators were able to significantly increase drilling accuracy using a guiding template, especially when multi-step preparations were performed. Improved accuracy without template guidance was observed when experienced operators executed the single-step rather than the multi-step technique. Single-step drilling protocols were shown to produce more accurate results than multi-step procedures. The outcome of either protocol can be further improved by the use of guiding templates, and operator experience can be a contributing factor. Single-step preparations are less invasive and promote osseointegration. Even highly experienced surgeons achieve higher levels of accuracy by combining this technique with template guidance. Template guidance thereby reduces hands-on time and side effects during surgery and leads to a more predictable clinical diameter.

  8. Regulation of wound healing and fibrosis by hypoxia and hypoxia-inducible factor-1.

    PubMed

    Ruthenborg, Robin J; Ban, Jae-Jun; Wazir, Anum; Takeda, Norihiko; Kim, Jung-Whan

    2014-09-01

    Wound healing is a complex multi-step process that requires spatial and temporal orchestration of cellular and non-cellular components. Hypoxia is one of the prominent microenvironmental factors in tissue injury and wound healing. Hypoxic responses, mainly mediated by a master transcription factor of oxygen homeostasis, hypoxia-inducible factor-1 (HIF-1), have been shown to be critically involved in virtually all processes of wound healing and remodeling. Yet, mechanisms underlying hypoxic regulation of wound healing are still poorly understood. Better understanding of how the wound healing process is regulated by the hypoxic microenvironment and HIF-1 signaling pathway will provide insight into the development of a novel therapeutic strategy for impaired wound healing conditions such as diabetic wound and fibrosis. In this review, we will discuss recent studies illuminating the roles of HIF-1 in physiologic and pathologic wound repair and further, the therapeutic potentials of HIF-1 stabilization or inhibition.

  9. Age and gender estimation using Region-SIFT and multi-layered SVM

    NASA Astrophysics Data System (ADS)

    Kim, Hyunduk; Lee, Sang-Heon; Sohn, Myoung-Kyu; Hwang, Byunghun

    2018-04-01

    In this paper, we propose an age and gender estimation framework using the region-SIFT feature and a multi-layered SVM classifier. The suggested framework entails three processes. The first step is landmark-based face alignment. The second step is feature extraction. In this step, we introduce the region-SIFT feature extraction method based on facial landmarks: we first define sub-regions of the face and then extract SIFT features from each sub-region. In order to reduce the dimensionality of the features, we employ Principal Component Analysis (PCA) and Linear Discriminant Analysis (LDA). Finally, we classify age and gender using multi-layered Support Vector Machines (SVMs) for efficient classification. Rather than performing gender estimation and age estimation independently, the multi-layered SVM can improve the classification rate by constructing a classifier that estimates age according to gender. Moreover, we collected a dataset of face images, called DGIST_C, from the internet. A performance evaluation of the proposed method was performed with the FERET, CACD, and DGIST_C databases. The experimental results demonstrate that the proposed approach estimates age and gender efficiently and accurately.
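    The gender-then-age layering can be sketched as follows. To keep the example self-contained we use a trivial nearest-centroid stand-in for the SVMs, and all feature values in the usage below are toy numbers, not real SIFT/PCA/LDA features:

```python
import numpy as np

class NearestCentroid:
    """Minimal stand-in for an SVM: predicts the nearest class centroid."""
    def fit(self, X, y):
        self.classes_ = np.unique(y)
        self.centroids_ = np.array([X[y == c].mean(axis=0) for c in self.classes_])
        return self

    def predict(self, X):
        d = np.linalg.norm(X[:, None, :] - self.centroids_[None, :, :], axis=2)
        return self.classes_[d.argmin(axis=1)]

class LayeredEstimator:
    """Layer 1 predicts gender; layer 2 holds one age classifier per gender,
    mirroring the multi-layered scheme described above."""
    def fit(self, X, gender, age):
        self.gender_clf = NearestCentroid().fit(X, gender)
        self.age_clf = {g: NearestCentroid().fit(X[gender == g], age[gender == g])
                        for g in np.unique(gender)}
        return self

    def predict(self, X):
        g = self.gender_clf.predict(X)
        a = np.array([self.age_clf[gi].predict(xi[None, :])[0]
                      for gi, xi in zip(g, X)])
        return g, a
```

    Conditioning the second layer on the first layer's output is the design choice the paper argues for: each age classifier only ever sees samples of one gender, so its decision boundary can be simpler.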

  10. Process synthesis involving multi-period operations by the P-graph framework

    EPA Science Inventory

    The P-graph (process graph) framework is an effective tool for process-network synthesis (PNS). Here we extended it to multi-period operations. The efficacy of the P-graph methodology has been demonstrated by numerous applications. The unambiguous representation of processes and ...

  11. Event-triggered logical flow control for comprehensive process integration of multi-step assays on centrifugal microfluidic platforms.

    PubMed

    Kinahan, David J; Kearney, Sinéad M; Dimov, Nikolay; Glynn, Macdara T; Ducrée, Jens

    2014-07-07

    The centrifugal "lab-on-a-disc" concept has proven to have great potential for process integration of bioanalytical assays, in particular where ease of use, ruggedness, portability, fast turn-around time and cost efficiency are of paramount importance. Yet, as all liquids residing on the disc are exposed to the same centrifugal field, an inherent challenge of these systems remains the automation of multi-step, multi-liquid sample processing and subsequent detection. In order to orchestrate the underlying bioanalytical protocols, an ample palette of rotationally and externally actuated valving schemes has been developed. While they excel in the level of flow control they offer, externally actuated valves require interaction with peripheral instrumentation, thus compromising the conceptual simplicity of the centrifugal platform. In turn, for rotationally controlled schemes, such as common capillary burst valves, typical manufacturing tolerances tend to limit the number of consecutive laboratory unit operations (LUOs) that can be automated on a single disc. In this paper, a major advancement on recently established dissolvable film (DF) valving is presented; for the very first time, a liquid handling sequence can be controlled in response to the completion of a preceding liquid-transfer event, i.e. completely independently of external stimulus or changes in the speed of disc rotation. The basic, event-triggered valve configuration is further adapted to leverage conditional, large-scale process integration. First, we demonstrate a fluidic network on a disc encompassing 10 discrete valving steps including logical relationships such as an AND-conditional as well as serial and parallel flow control. Then we present a disc which is capable of implementing common laboratory unit operations such as metering and selective routing of flows.
Finally, as a pilot study, these functions are integrated on a single disc to automate a common, multi-step lab protocol for the extraction of total RNA from mammalian cell homogenate.

  12. Developing a Mind-Body Exercise Programme for Stressed Children

    ERIC Educational Resources Information Center

    Wang, Claudia; Seo, Dong-Chul; Geib, Roy W

    2017-01-01

    Objective: To describe the process of developing a Health Qigong programme for stressed children using a formative evaluation approach. Methods: A multi-step formative evaluation method was utilised. These steps included (1) identifying programme content and drafting the curriculum, (2) synthesising effective and age-appropriate pedagogies, (3)…

  13. The Ocean Observatories Initiative: Data pre-Processing: Diagnostic Tools to Prepare Data for QA/QC Processing.

    NASA Astrophysics Data System (ADS)

    Belabbassi, L.; Garzio, L. M.; Smith, M. J.; Knuth, F.; Vardaro, M.; Kerfoot, J.

    2016-02-01

    The Ocean Observatories Initiative (OOI), funded by the National Science Foundation, provides users with access to long-term datasets from a variety of deployed oceanographic sensors. The Pioneer Array in the Atlantic Ocean off the Coast of New England hosts 10 moorings and 6 gliders. Each mooring is outfitted with 6 to 19 different instruments telemetering more than 1000 data streams. These data are available to science users to collaborate on common scientific goals such as water quality monitoring and scale variability measures of continental shelf processes and coastal open ocean exchanges. To serve this purpose, the acquired datasets undergo an iterative multi-step quality assurance and quality control procedure automated to work with all types of data. Data processing involves several stages, including a fundamental pre-processing step when the data are prepared for processing. This takes a considerable amount of processing time and is often not given enough thought in development initiatives. The volume and complexity of OOI data necessitates the development of a systematic diagnostic tool to enable the management of a comprehensive data information system for the OOI arrays. We present two examples to demonstrate the current OOI pre-processing diagnostic tool. First, Data Filtering is used to identify incomplete, incorrect, or irrelevant parts of the data and then replaces, modifies or deletes the coarse data. This provides data consistency with similar datasets in the system. Second, Data Normalization occurs when the database is organized in fields and tables to minimize redundancy and dependency. At the end of this step, the data are stored in one place to reduce the risk of data inconsistency and promote easy and efficient mapping to the database.
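    A data-filtering step of the kind described, which deletes incomplete or out-of-range records, can be sketched as follows. The field names and thresholds are illustrative only, not the OOI system's actual schema:

```python
def filter_records(records, required_fields, valid_range):
    """Drop records that are incomplete (missing required fields) or
    incorrect (value outside the plausible range), keeping the dataset
    consistent with similar datasets in the system."""
    lo, hi = valid_range
    clean = []
    for rec in records:
        if any(rec.get(f) is None for f in required_fields):
            continue                      # incomplete: delete
        if not (lo <= rec["value"] <= hi):
            continue                      # incorrect: delete
        clean.append(rec)
    return clean
```

    In practice such a filter would be one stage of the automated QA/QC chain, followed by the normalization stage that reorganises the cleaned records into redundancy-free tables.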

  14. Design and Processing of a Novel Chaos-Based Stepped Frequency Synthesized Wideband Radar Signal.

    PubMed

    Zeng, Tao; Chang, Shaoqiang; Fan, Huayu; Liu, Quanhua

    2018-03-26

    The linear stepped frequency and linear frequency shift keying (FSK) signal has been widely used in radar systems. However, such linear modulation signals suffer from the range-Doppler coupling that degrades radar multi-target resolution. Moreover, the fixed frequency-hopping or frequency-coded sequence can be easily predicted by the interception receiver in the electronic countermeasures (ECM) environments, which limits radar anti-jamming performance. In addition, the single FSK modulation reduces the radar low probability of intercept (LPI) performance, for it cannot achieve a large time-bandwidth product. To solve such problems, we propose a novel chaos-based stepped frequency (CSF) synthesized wideband signal in this paper. The signal introduces chaotic frequency hopping between the coherent stepped frequency pulses, and adopts a chaotic frequency shift keying (CFSK) and phase shift keying (PSK) composited coded modulation in a subpulse, called CSF-CFSK/PSK. Correspondingly, the processing method for the signal has been proposed. According to our theoretical analyses and the simulations, the proposed signal and processing method achieve better multi-target resolution and LPI performance. Furthermore, flexible modulation is able to increase the robustness against identification of the interception receiver and improve the anti-jamming performance of the radar.
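    A chaotic frequency-hopping sequence of the kind described can be generated from, for example, a logistic map; the specific map, seed and parameters below are illustrative, not the paper's actual CSF design:

```python
def chaotic_hop_sequence(n_pulses, n_freqs, x0=0.37, r=3.99):
    """Hop indices driven by the logistic map x -> r*x*(1-x), which is
    chaotic for r near 4: a tiny change in the seed yields a different
    sequence, making it hard for an interception receiver to predict."""
    x, seq = x0, []
    for _ in range(n_pulses):
        x = r * x * (1.0 - x)
        seq.append(min(int(x * n_freqs), n_freqs - 1))
    return seq
```

    The transmitter and a cooperating receiver sharing the same seed reproduce the identical sequence deterministically, which is what allows coherent synthesis of the stepped-frequency pulses despite the apparently random hopping.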

  15. If we build it, will they come? Curation and use of the ESO telescope bibliography

    NASA Astrophysics Data System (ADS)

    Grothkopf, Uta; Meakins, Silvia; Bordelon, Dominic

    2015-12-01

    The ESO Telescope Bibliography (telbib) is a database of refereed papers published by the ESO users community. It links data in the ESO Science Archive with the published literature, and vice versa. Developed and maintained by the ESO library, telbib also provides insights into the organization's research output and impact as measured through bibliometric studies. Curating telbib is a multi-step process that involves extensive tagging of the database records. Based on selected use cases, this talk will explain how the rich metadata provide parameters for reports and statistics in order to investigate the performance of ESO's facilities and to understand trends and developments in the publishing behaviour of the user community.

  16. Quantification of soil water retention parameters using multi-section TDR-waveform analysis

    NASA Astrophysics Data System (ADS)

    Baviskar, S. M.; Heimovaara, T. J.

    2017-06-01

    Soil water retention parameters are important for describing flow in variably saturated soils. TDR is one of the standard methods for determining the water content of soil samples. In this study, we present an approach to estimate the water retention parameters of a sample that is initially saturated and then subjected to incremental decreases in boundary head, causing it to drain in a multi-step fashion. TDR waveforms are measured along the height of the sample, at assumed hydrostatic conditions, at daily intervals. The cumulative discharge outflow drained from the sample is also recorded. The saturated water content is obtained by volumetric analysis after the final step of the multi-step drainage. The equation obtained by coupling the unsaturated parametric function with the apparent dielectric permittivity is fitted to a TDR wave-propagation forward model. The unsaturated parametric function is used to spatially interpolate the water contents along the TDR probe. The cumulative discharge outflow data are fitted with the cumulative discharge estimated from the unsaturated parametric function, and the weight of water inside the sample at the first and final boundary heads of the multi-step drainage is fitted with the corresponding weights calculated from the unsaturated parametric function. A Bayesian optimization scheme is used to obtain optimized water retention parameters for these different objective functions. This approach can be used for samples of considerable height and is especially suitable for characterizing sands with a uniform particle size distribution at low capillary heads.
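
    The abstract does not name the unsaturated parametric function. Assuming the widely used van Genuchten form (an assumption, not stated in the paper), spatial interpolation of water content along the probe under hydrostatic equilibrium can be sketched as:

```python
def van_genuchten_theta(h, theta_r, theta_s, alpha, n):
    """Water content at capillary pressure head h (positive, cm) using the
    van Genuchten retention function with m = 1 - 1/n."""
    if h <= 0:
        return theta_s
    m = 1.0 - 1.0 / n
    se = (1.0 + (alpha * h) ** n) ** (-m)   # effective saturation
    return theta_r + (theta_s - theta_r) * se

def hydrostatic_profile(z_levels, water_table_z, **vg):
    """Water content at heights z (cm) assuming hydrostatic equilibrium:
    capillary head equals height above the water-table (boundary-head) level."""
    return [van_genuchten_theta(max(z - water_table_z, 0.0), **vg)
            for z in z_levels]

# Illustrative sand-like parameters (hypothetical values).
profile = hydrostatic_profile([0, 10, 20, 40], water_table_z=0,
                              theta_r=0.05, theta_s=0.38, alpha=0.03, n=4.0)
```

    Each boundary-head step shifts `water_table_z`, and the resulting profiles feed the TDR forward model and outflow objective functions.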

  17. Multi-Mission Automated Task Invocation Subsystem

    NASA Technical Reports Server (NTRS)

    Cheng, Cecilia S.; Patel, Rajesh R.; Sayfi, Elias M.; Lee, Hyun H.

    2009-01-01

    Multi-Mission Automated Task Invocation Subsystem (MATIS) is software that establishes a distributed data-processing framework for automated generation of instrument data products from a spacecraft mission. Each mission may set up a set of MATIS servers for processing its data products. MATIS embodies lessons learned in experience with prior instrument- data-product-generation software. MATIS is an event-driven workflow manager that interprets project-specific, user-defined rules for managing processes. It executes programs in response to specific events under specific conditions according to the rules. Because requirements of different missions are too diverse to be satisfied by one program, MATIS accommodates plug-in programs. MATIS is flexible in that users can control such processing parameters as how many pipelines to run and on which computing machines to run them. MATIS has a fail-safe capability. At each step, MATIS captures and retains pertinent information needed to complete the step and start the next step. In the event of a restart, this information is retrieved so that processing can be resumed appropriately. At this writing, it is planned to develop a graphical user interface (GUI) for monitoring and controlling a product generation engine in MATIS. The GUI would enable users to schedule multiple processes and manage the data products produced in the processes. Although MATIS was initially designed for instrument data product generation…
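
    The event-driven, rule-based dispatch described above can be sketched in miniature (the class, rule shapes, and names below are hypothetical illustrations, not the actual MATIS API):

```python
class RuleEngine:
    """Toy event-driven workflow manager: user-defined rules bind an
    (event type, condition) pair to an action, and completed steps are
    logged so processing could be resumed after a restart."""

    def __init__(self):
        self.rules = []   # list of (event_type, condition, action)
        self.log = []     # fail-safe record of completed steps

    def add_rule(self, event_type, condition, action):
        self.rules.append((event_type, condition, action))

    def dispatch(self, event_type, payload):
        # Run every action whose event type matches and whose condition holds.
        for etype, condition, action in self.rules:
            if etype == event_type and condition(payload):
                result = action(payload)
                self.log.append((event_type, payload, result))

engine = RuleEngine()
engine.add_rule("file_arrived",
                lambda p: p["name"].endswith(".dat"),
                lambda p: f"product:{p['name']}")
engine.dispatch("file_arrived", {"name": "orbit42.dat"})
```

    Plug-in programs would take the place of the lambda actions, and the retained log is what a restart would replay.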

  18. The multiBac protein complex production platform at the EMBL.

    PubMed

    Berger, Imre; Garzoni, Frederic; Chaillet, Maxime; Haffke, Matthias; Gupta, Kapil; Aubert, Alice

    2013-07-11

    Proteomics research has revealed the impressive complexity of eukaryotic proteomes in unprecedented detail. It is now a commonly accepted notion that proteins in cells mostly exist not as isolated entities but exert their biological activity in association with many other proteins, in humans often ten or more, forming assembly lines in the cell for most if not all vital functions.(1,2) Knowledge of the function and architecture of these multiprotein assemblies requires their provision in superior quality and sufficient quantity for detailed analysis. The paucity of many protein complexes in cells, in particular in eukaryotes, prohibits their extraction from native sources and necessitates recombinant production. The baculovirus expression vector system (BEVS) has proven particularly useful for producing eukaryotic proteins, the activity of which often relies on post-translational processing that other commonly used expression systems cannot support.(3) BEVS uses a recombinant baculovirus, into which the gene of interest has been inserted, to infect insect cell cultures, which in turn produce the protein of choice. MultiBac is a BEVS that has been particularly tailored for the production of eukaryotic protein complexes that contain many subunits.(4) A vital prerequisite for the efficient production of proteins and their complexes is robust protocols for all steps involved in an expression experiment, ideally implemented as standard operating procedures (SOPs) that can be followed by non-specialist users with comparative ease. 
The MultiBac platform at the European Molecular Biology Laboratory (EMBL) uses SOPs for all steps involved in a multiprotein complex expression experiment, starting from insertion of the genes into an engineered baculoviral genome optimized for heterologous protein production properties to small-scale analysis of the protein specimens produced.(5-8) The platform is installed in an open-access mode at EMBL Grenoble and has supported many scientists from academia and industry to accelerate protein complex research projects.

  19. A Selection Method That Succeeds!

    ERIC Educational Resources Information Center

    Weitman, Catheryn J.

    Provided a structural selection method is carried out, it is possible to find quality early childhood personnel. The hiring process involves five definite steps, each of which establishes a base for the next. A needs assessment formulating basic minimal qualifications is the first step. The second step involves review of current job descriptions…

  20. Expression of metastasis suppressor BRMS1 in breast cancer cells results in a marked delay in cellular adhesion to matrix

    USDA-ARS?s Scientific Manuscript database

    Metastatic dissemination is a multi-step process that depends on cancer cells’ ability to respond to microenvironmental cues by adapting adhesion abilities and undergoing cytoskeletal rearrangement. Breast Cancer Metastasis Suppressor 1 (BRMS1) affects several steps of the metastatic cascade: it dec...

  1. Implementing the Indiana Model. Indiana Leadership Consortium: Equity through Change.

    ERIC Educational Resources Information Center

    Indiana Leadership Consortium.

    This guide, which was developed as a part of a multi-year, statewide effort to institutionalize gender equity in various educational settings throughout Indiana, presents a step-by-step process model for achieving gender equity in the state's secondary- and postsecondary-level vocational programs through coalition building and implementation of a…

  2. Vesselness propagation: a fast interactive vessel segmentation method

    NASA Astrophysics Data System (ADS)

    Cai, Wenli; Dachille, Frank; Harris, Gordon J.; Yoshida, Hiroyuki

    2006-03-01

    With the rapid development of multi-detector computed tomography (MDCT), resulting in increasing temporal and spatial resolution of data sets, clinical use of computed tomographic angiography (CTA) is rapidly increasing. Analysis of vascular structures is much needed in CTA images; however, the basis of the analysis, vessel segmentation, can still be a challenging problem. In this paper, we present a fast interactive method for CTA vessel segmentation, called vesselness propagation. This method is a two-step procedure, with a pre-processing step and an interactive step. During the pre-processing step, a vesselness volume is computed by application of a CTA transfer function followed by a multi-scale Hessian filtering. At the interactive stage, the propagation is controlled interactively in terms of the priority of the vesselness. This method was used successfully in many CTA applications such as the carotid artery, coronary artery, and peripheral arteries. It takes less than one minute for a user to segment the entire vascular structure. Thus, the proposed method provides an effective way of obtaining an overview of vascular structures.
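
    The interactive propagation stage can be sketched as priority-ordered region growing on a precomputed vesselness map. The following is a simplified 2-D illustration under assumed 4-connectivity, not the authors' implementation:

```python
import heapq

def propagate(vesselness, seed, threshold):
    """Grow a segmentation from a seed voxel, always expanding the
    highest-vesselness frontier element first (2-D grid sketch).

    vesselness: 2-D list of per-voxel vesselness scores in [0, 1].
    threshold:  propagation stops into voxels below this score.
    """
    rows, cols = len(vesselness), len(vesselness[0])
    segmented = set()
    # Max-priority behavior via negated scores in Python's min-heap.
    frontier = [(-vesselness[seed[0]][seed[1]], seed)]
    while frontier:
        neg_v, (r, c) = heapq.heappop(frontier)
        if (r, c) in segmented or -neg_v < threshold:
            continue
        segmented.add((r, c))
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < rows and 0 <= nc < cols and (nr, nc) not in segmented:
                heapq.heappush(frontier, (-vesselness[nr][nc], (nr, nc)))
    return segmented

# A tiny map with a bright vertical "vessel" in the middle column.
grid = [[0.1, 0.9, 0.1],
        [0.1, 0.8, 0.1],
        [0.1, 0.7, 0.1]]
vessel = propagate(grid, seed=(0, 1), threshold=0.5)
```

    In the interactive setting, the user would adjust the threshold (or stop propagation) while watching the frontier advance in vesselness-priority order.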

  3. Multi-enzyme logic network architectures for assessing injuries: digital processing of biomarkers.

    PubMed

    Halámek, Jan; Bocharova, Vera; Chinnapareddy, Soujanya; Windmiller, Joshua Ray; Strack, Guinevere; Chuang, Min-Chieh; Zhou, Jian; Santhosh, Padmanabhan; Ramirez, Gabriela V; Arugula, Mary A; Wang, Joseph; Katz, Evgeny

    2010-12-01

    A multi-enzyme biocatalytic cascade processing simultaneously five biomarkers characteristic of traumatic brain injury (TBI) and soft tissue injury (STI) was developed. The system operates as a digital biosensor based on concerted function of 8 Boolean AND logic gates, resulting in the decision about the physiological conditions based on the logic analysis of complex patterns of the biomarkers. The system represents the first example of a multi-step/multi-enzyme biosensor with the built-in logic for the analysis of complex combinations of biochemical inputs. The approach is based on recent advances in enzyme-based biocomputing systems and the present paper demonstrates the potential applicability of biocomputing for developing novel digital biosensor networks.
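
    The concerted Boolean AND logic can be illustrated with a toy digital model; the biomarker names, pairings, and thresholds below are hypothetical, not those of the actual biocatalytic cascade:

```python
def and_gate(a, b, thr_a, thr_b):
    """Digital AND: output 1 only if both biomarker levels exceed thresholds."""
    return int(a > thr_a and b > thr_b)

def assess(levels, gates):
    """Evaluate a bank of AND gates over measured biomarker levels.

    gates maps a gate name to ((biomarker_a, thr_a), (biomarker_b, thr_b));
    the output pattern encodes the logic-derived physiological decision.
    """
    return {name: and_gate(levels[a], levels[b], ta, tb)
            for name, ((a, ta), (b, tb)) in gates.items()}

# Hypothetical pairing of two injury biomarkers into one gate.
gates = {"TBI": (("enolase", 5.0), ("s100b", 0.5))}
out = assess({"enolase": 8.2, "s100b": 0.9}, gates)
```

    In the paper's system the thresholding and AND behavior are realized chemically by the enzyme cascade rather than in software.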

  4. The noni anthraquinone damnacanthal is a multi-kinase inhibitor with potent anti-angiogenic effects.

    PubMed

    García-Vilas, Javier A; Pino-Ángeles, Almudena; Martínez-Poveda, Beatriz; Quesada, Ana R; Medina, Miguel Ángel

    2017-01-28

    The natural bioactive compound damnacanthal inhibits several tyrosine kinases. Herein, we show that damnacanthal is, in fact, a multi-kinase inhibitor. A docking and molecular dynamics simulation approach provides further insight into the inhibitory effect of damnacanthal on three different kinases: vascular endothelial growth factor receptor-2, c-Met, and focal adhesion kinase. Several of the kinases targeted and inhibited by damnacanthal are involved in angiogenesis. Ex vivo and in vivo experiments clearly demonstrate that damnacanthal is indeed a very potent inhibitor of angiogenesis. A number of in vitro assays help determine the specific effects of damnacanthal on each step of the angiogenic process, including inhibition of tubulogenesis, endothelial cell proliferation, survival, migration, and production of extracellular matrix remodeling enzymes. Taken together, these results suggest that damnacanthal could be of potential interest for the treatment of cancer and other angiogenesis-dependent diseases. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.

  5. Intervention Mapping as a Participatory Approach to Developing an HIV prevention Intervention in Rural African American Communities

    PubMed Central

    Corbie-Smith, Giselle; Akers, Aletha; Blumenthal, Connie; Council, Barbara; Wynn, Mysha; Muhammad, Melvin; Stith, Doris

    2011-01-01

    Southeastern states are among the hardest hit by the HIV epidemic in this country, and racial disparities in HIV rates are high in this region. This is particularly true in our communities of interest in rural eastern North Carolina. Although most recent efforts to prevent HIV attempt to address multiple contributing factors, we have found few multilevel HIV interventions that have been developed, tailored or tested in rural communities for African Americans. We describe how Project GRACE integrated Intervention Mapping (IM) methodology with community based participatory research (CBPR) principles to develop a multi-level, multi-generational HIV prevention intervention. IM was carried out in a series of steps from review of relevant data through producing program components. Through the IM process, all collaborators agreed that we needed a family-based intervention involving youth and their caregivers. We found that the structured approach of IM can be adapted to incorporate the principles of CBPR. PMID:20528128

  6. Simplified Helium Refrigerator Cycle Analysis Using the `Carnot Step'

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    P. Knudsen; V. Ganni

    2006-05-01

    An analysis of the Claude form of an idealized helium liquefier for minimum input work reveals the "Carnot Step" for helium refrigerator cycles. Just as the "Carnot Step" for a multi-stage polytropic compression process consists of equal-pressure-ratio stages, the "Carnot Step" for an idealized helium liquefier consists of equal-temperature-ratio stages for a given number of expansion stages. This paper presents the analytical basis and some useful equations for the preliminary examination of existing and new Claude helium refrigeration cycles.
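
    The equal-ratio staging can be written down directly. A small sketch under the paper's idealizations (the function names are illustrative):

```python
def stage_pressure_ratios(p_in, p_out, n_stages):
    """Equal per-stage pressure ratio for an idealized multi-stage polytropic
    compression: r = (p_out / p_in) ** (1 / n_stages)."""
    r = (p_out / p_in) ** (1.0 / n_stages)
    return [r] * n_stages

def stage_temperature_ratios(t_warm, t_cold, n_stages):
    """Equal per-stage temperature ratio across the expansion stages of an
    idealized Claude-cycle helium liquefier ("Carnot Step")."""
    r = (t_warm / t_cold) ** (1.0 / n_stages)
    return [r] * n_stages

# Three expansion stages spanning 300 K down to 4.2 K (illustrative values).
ratios = stage_temperature_ratios(300.0, 4.2, 3)
```

    The product of the per-stage ratios recovers the overall span, which is what makes the equal-ratio division the minimum-work ("Carnot Step") choice in the idealized analysis.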

  7. A Multi-touch Tool for Co-creation

    NASA Astrophysics Data System (ADS)

    Ludden, Geke D. S.; Broens, Tom

    Multi-touch technology provides an attractive way for knowledge workers to collaborate. Co-creation is an important collaboration process in which collecting resources, creating results and distributing these results is essential. We propose a wall-based multi-touch system (called CoCreate) in which these steps are made easy due to the notion of connected private spaces and a shared co-create space. We present our ongoing work, expert evaluation of interaction scenarios and future plans.

  8. Automated Geo/Co-Registration of Multi-Temporal Very-High-Resolution Imagery.

    PubMed

    Han, Youkyung; Oh, Jaehong

    2018-05-17

    For time-series analysis using very-high-resolution (VHR) multi-temporal satellite images, both accurate georegistration to the map coordinates and subpixel-level co-registration among the images should be conducted. However, applying well-known matching methods, such as scale-invariant feature transform and speeded up robust features for VHR multi-temporal images, has limitations. First, they cannot be used for matching an optical image to heterogeneous non-optical data for georegistration. Second, they produce a local misalignment induced by differences in acquisition conditions, such as acquisition platform stability, the sensor's off-nadir angle, and relief displacement of the considered scene. Therefore, this study addresses the problem by proposing an automated geo/co-registration framework for full-scene multi-temporal images acquired from a VHR optical satellite sensor. The proposed method comprises two primary steps: (1) a global georegistration process, followed by (2) a fine co-registration process. During the first step, two-dimensional multi-temporal satellite images are matched to three-dimensional topographic maps to assign the map coordinates. During the second step, a local analysis of registration noise pixels extracted between the multi-temporal images that have been mapped to the map coordinates is conducted to extract a large number of well-distributed corresponding points (CPs). The CPs are finally used to construct a non-rigid transformation function that enables minimization of the local misalignment existing among the images. Experiments conducted on five Kompsat-3 full scenes confirmed the effectiveness of the proposed framework, showing that the georegistration performance resulted in an approximately pixel-level accuracy for most of the scenes, and the co-registration performance further improved the results among all combinations of the georegistered Kompsat-3 image pairs by increasing the calculated cross-correlation values.
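
    The cross-correlation values used above to score the registered image pairs can be illustrated with a zero-mean normalized cross-correlation function (a standard formulation, assumed here rather than taken from the paper):

```python
import math

def ncc(patch_a, patch_b):
    """Zero-mean normalized cross-correlation between two equal-size patches
    (flattened to 1-D). Returns a value in [-1, 1]; 1 indicates a perfect
    linear match, as expected for well co-registered image content."""
    n = len(patch_a)
    ma = sum(patch_a) / n
    mb = sum(patch_b) / n
    num = sum((a - ma) * (b - mb) for a, b in zip(patch_a, patch_b))
    da = math.sqrt(sum((a - ma) ** 2 for a in patch_a))
    db = math.sqrt(sum((b - mb) ** 2 for b in patch_b))
    return num / (da * db)

# Two patches differing only by a linear intensity change correlate perfectly.
score = ncc([1, 2, 3, 4], [2, 4, 6, 8])
```

    Improved co-registration raises such scores because corresponding pixels line up before the correlation is taken.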

  9. Time-Accurate Local Time Stepping and High-Order Time CESE Methods for Multi-Dimensional Flows Using Unstructured Meshes

    NASA Technical Reports Server (NTRS)

    Chang, Chau-Lyan; Venkatachari, Balaji Shankar; Cheng, Gary

    2013-01-01

    With the wide availability of affordable multiple-core parallel supercomputers, next-generation numerical simulations of flow physics are being focused on unsteady computations for problems involving multiple time scales and multiple physics. These simulations require higher solution accuracy than most currently available algorithms and computational fluid dynamics codes can provide. This paper focuses on the developmental effort for high-fidelity multi-dimensional, unstructured-mesh flow solvers using the space-time conservation element, solution element (CESE) framework. Two approaches have been investigated in this research in order to provide high-accuracy, cross-cutting numerical simulations for a variety of flow regimes: 1) time-accurate local time stepping and 2) a high-order CESE method. The first approach utilizes consistent numerical formulations in the space-time flux integration to preserve temporal conservation across cells with different marching time steps. This approach relieves the stringent time step constraint associated with the smallest time step in the computational domain while preserving temporal accuracy for all the cells. For flows involving multiple scales, both numerical accuracy and efficiency can be significantly enhanced. The second approach extends the current CESE solver to higher-order accuracy. Unlike other existing explicit high-order methods for unstructured meshes, the CESE framework maintains a CFL condition of one for arbitrarily high-order formulations while retaining the same compact stencil as its second-order counterpart. For large-scale unsteady computations, this feature substantially enhances numerical efficiency. Numerical formulations and validations using benchmark problems are discussed in this paper along with realistic examples.

  10. Construction and analysis of lncRNA-lncRNA synergistic networks to reveal clinically relevant lncRNAs in cancer.

    PubMed

    Li, Yongsheng; Chen, Juan; Zhang, Jinwen; Wang, Zishan; Shao, Tingting; Jiang, Chunjie; Xu, Juan; Li, Xia

    2015-09-22

    Long non-coding RNAs (lncRNAs) play key roles in diverse biological processes. Moreover, the development and progression of cancer often involves the combined actions of several lncRNAs. Here we propose a multi-step method for constructing lncRNA-lncRNA functional synergistic networks (LFSNs) through co-regulation of functional modules having three features: common coexpressed genes of lncRNA pairs, enrichment in the same functional category, and close proximity within protein interaction networks. Applied to three cancers, we constructed cancer-specific LFSNs and found that they exhibit a scale-free and modular architecture. In addition, cancer-associated lncRNAs tend to be hubs and are enriched within modules. Although there is little synergistic pairing of lncRNAs across cancers, lncRNA pairs are involved in the same cancer hallmarks by regulating the same or different biological processes. Finally, we identify prognostic biomarkers within cancer lncRNA expression datasets using modules derived from LFSNs. In summary, this proof-of-principle study indicates that synergistic lncRNA pairs can be identified through integrative analysis of genome-wide expression data sets and functional information.

  11. Multi-criteria decision making to support waste management: A critical review of current practices and methods.

    PubMed

    Goulart Coelho, Lineker M; Lange, Liséte C; Coelho, Hosmanny Mg

    2017-01-01

    Solid waste management is a complex domain involving the interaction of several dimensions; thus, its analysis and control impose continuous challenges for decision makers. In this context, multi-criteria decision-making models have become important and convenient supporting tools for solid waste management because they can handle problems involving multiple dimensions and conflicting criteria. However, the selection of the multi-criteria decision-making method is a hard task since there are several multi-criteria decision-making approaches, each one with a large number of variants whose applicability depends on information availability and the aim of the study. Therefore, to support researchers and decision makers, the objectives of this article are to present a literature review of multi-criteria decision-making applications used in solid waste management, offer a critical assessment of the current practices, and provide suggestions for future works. A brief review of fundamental concepts on this topic is first provided, followed by the analysis of 260 articles related to the application of multi-criteria decision making in solid waste management. These studies were investigated in terms of the methodology, including specific steps such as normalisation, weighting, and sensitivity analysis. In addition, information related to waste type, the study objective, and aspects considered was recorded. From the articles analysed it is noted that studies using multi-criteria decision making in solid waste management are predominantly addressed to problems related to municipal solid waste involving facility location or management strategy.

  12. Stochastic modelling of animal movement.

    PubMed

    Smouse, Peter E; Focardi, Stefano; Moorcroft, Paul R; Kie, John G; Forester, James D; Morales, Juan M

    2010-07-27

    Modern animal movement modelling derives from two traditions. Lagrangian models, based on random walk behaviour, are useful for multi-step trajectories of single animals. Continuous Eulerian models describe expected behaviour, averaged over stochastic realizations, and are usefully applied to ensembles of individuals. We illustrate three modern research arenas. (i) Models of home-range formation describe the process of an animal 'settling down', accomplished by including one or more focal points that attract the animal's movements. (ii) Memory-based models are used to predict how accumulated experience translates into biased movement choices, employing reinforced random walk behaviour, with previous visitation increasing or decreasing the probability of repetition. (iii) Lévy movement involves a step-length distribution that is over-dispersed, relative to standard probability distributions, and adaptive in exploring new environments or searching for rare targets. Each of these modelling arenas implies more detail in the movement pattern than general models of movement can accommodate, but realistic empiric evaluation of their predictions requires dense locational data, both in time and space, only available with modern GPS telemetry.
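
    The over-dispersed step-length distribution of a Lévy movement model can be sampled by inverse-transform sampling from a power-law tail. A minimal sketch (the `x_min` and `mu` values are illustrative, not from the paper):

```python
import random

def levy_steps(n, x_min=1.0, mu=2.0, seed=42):
    """Draw n step lengths from a power-law (Pareto-type) distribution
    p(x) ~ x**(-mu) for x >= x_min, via inverse-transform sampling:
    the CDF F(x) = 1 - (x_min / x)**(mu - 1) inverts to
    x = x_min * (1 - u)**(-1 / (mu - 1)) for uniform u in [0, 1)."""
    rng = random.Random(seed)
    return [x_min * (1.0 - rng.random()) ** (-1.0 / (mu - 1.0))
            for _ in range(n)]

steps = levy_steps(1000)
```

    The heavy tail produces occasional very long relocations among many short ones, the signature that makes Lévy movement efficient for exploring new environments or finding rare targets.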

  13. Porous polycarbene-bearing membrane actuator for ultrasensitive weak-acid detection and real-time chemical reaction monitoring.

    PubMed

    Sun, Jian-Ke; Zhang, Weiyi; Guterman, Ryan; Lin, Hui-Juan; Yuan, Jiayin

    2018-04-30

    Soft actuators that integrate ultrasensitivity with the capability to interact simultaneously with multiple stimuli throughout an entire event demand a high level of structural complexity, adaptability, and/or multi-responsiveness, which is a great challenge. Here, we develop a porous polycarbene-bearing membrane actuator built up from ionic complexation between a poly(ionic liquid) and trimesic acid (TA). The actuator features two concurrent structure gradients, i.e., an electrostatic complexation (EC) degree and a density distribution of a carbene-NH3 adduct (CNA), along the membrane cross-section. The membrane actuator exhibits the highest sensitivity among state-of-the-art soft proton actuators toward acetic acid, down to the 10^-6 mol L^-1 (M) level in aqueous media. Through the competing actuation of the two gradients, it is capable of monitoring an entire proton-involved chemical reaction process comprising multiple stimuli and operational steps. The present achievement constitutes a significant step toward the real-life application of soft actuators in chemical sensing and reaction technology.

  14. Dynamic behavior of the weld pool in stationary GMAW

    NASA Astrophysics Data System (ADS)

    Chapuis, J.; Romero, E.; Bordreuil, C.; Soulié, F.; Fras, G.

    2010-06-01

    Because hump formation limits welding productivity, a better understanding of the humping phenomena during the welding process is needed to identify process modifications that decrease the tendency for hump formation and thus allow higher-productivity welding. From a physical point of view, the mechanism identified is a Rayleigh instability initiated by a strong surface tension gradient, which induces a variation of the kinetic flow. But the causes of the appearance of this instability are not yet well explained. Because the phenomena are complex and multi-physical, we chose as a first step to analyze the characteristic times involved in the weld pool in pulsed stationary GMAW. The goal is to study the dynamic behavior of the weld pool using our experimental multi-physics approach. The experimental tools and methodology developed to understand these fast phenomena are presented first: frame acquisition with a high-speed digital camera and specific optical devices, and a numerical library. The analysis of the geometric parameters of the weld pool during the welding operation is presented in the last part: we observe the variations of the wetting angles (or contact-line angles), and of the base and height of the weld pool (macro-drop), versus weld time.

  15. Limits of acceptable change and natural resources planning: when is LAC useful, when is it not?

    Treesearch

    David N. Cole; Stephen F. McCool

    1997-01-01

    There are ways to improve the LAC process and its implementational procedures. One significant procedural modification is the addition of a new step. This step — which becomes the first step in the process — involves more explicitly defining goals and desired conditions. For other steps in the process, clarifications of concept and terminology are advanced, as are...

  16. Patterning control strategies for minimum edge placement error in logic devices

    NASA Astrophysics Data System (ADS)

    Mulkens, Jan; Hanna, Michael; Slachter, Bram; Tel, Wim; Kubis, Michael; Maslow, Mark; Spence, Chris; Timoshkov, Vadim

    2017-03-01

    In this paper we discuss the edge placement error (EPE) for multi-patterning semiconductor manufacturing. In a multi-patterning scheme the creation of the final pattern is the result of a sequence of lithography and etching steps, and consequently the contour of the final pattern contains error sources of the different process steps. We describe the fidelity of the final pattern in terms of EPE, which is defined as the relative displacement of the edges of two features from their intended target position. We discuss our holistic patterning optimization approach to understand and minimize the EPE of the final pattern. As an experimental test vehicle we use the 7-nm logic device patterning process flow as developed by IMEC. This patterning process is based on Self-Aligned-Quadruple-Patterning (SAQP) using ArF lithography, combined with line cut exposures using EUV lithography. The computational metrology method to determine EPE is explained. It will be shown that ArF to EUV overlay, CDU from the individual process steps, and local CD and placement of the individual pattern features, are the important contributors. Based on the error budget, we developed an optimization strategy for each individual step and for the final pattern. Solutions include overlay and CD metrology based on angle resolved scatterometry, scanner actuator control to enable high order overlay corrections and computational lithography optimization to minimize imaging induced pattern placement errors of devices and metrology targets.
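
    One common way to turn the listed contributors (overlay, CDU, local CD and placement) into an EPE budget is a quadrature (root-sum-square) combination of independent error terms. This is a standard budgeting approximation, not necessarily the exact model used in the paper:

```python
import math

def epe_budget(contributors):
    """Root-sum-square combination of independent edge-placement error
    contributors, each given as a 3-sigma value in nm. Assumes the terms
    are statistically independent and approximately Gaussian."""
    return math.sqrt(sum(c ** 2 for c in contributors.values()))

# Hypothetical 3-sigma contributor values in nm.
budget = epe_budget({"ArF-EUV overlay": 2.5,
                     "CDU": 1.5,
                     "local CD/placement": 2.0})
```

    Such a breakdown makes it easy to see which single contributor dominates and therefore where metrology and scanner-control effort pays off most.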

  17. Dual-step synthesis of 3-dimensional niobium oxide - Zinc oxide

    NASA Astrophysics Data System (ADS)

    Rani, Rozina Abdul; Zoolfakar, Ahmad Sabirin; Rusop, M.

    2018-05-01

    A facile fabrication process for constructing a 3-dimensional (3D) structure of niobium oxide - zinc oxide (Nb2O5-ZnO), consisting of branched ZnO microrods on top of nanoporous Nb2O5 films, was developed based on a dual-step synthesis approach. The preliminary procedure was anodization of sputtered niobium metal on fluorine-doped tin oxide (FTO) to produce nanoporous Nb2O5, followed by the growth of branched ZnO microrods by a hydrothermal process. This approach offers insight into the development of novel 3D metal oxide films via a dual-step synthesis process, which might potentially be used for multi-functional applications ranging from sensing to photoconversion.

  18. Numerical Issues Associated with Compensating and Competing Processes in Climate Models: an Example from ECHAM-HAM

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wan, Hui; Rasch, Philip J.; Zhang, Kai

    2013-06-26

    The purpose of this paper is to draw attention to the need for appropriate numerical techniques to represent process interactions in climate models. In two versions of the ECHAM-HAM model, different time integration methods are used to solve the sulfuric acid (H2SO4) gas evolution equation, which lead to substantially different results in the H2SO4 gas concentration and the aerosol nucleation rate. Using convergence tests and sensitivity simulations performed with various time stepping schemes, it is confirmed that numerical errors in the second model version are significantly smaller than those in version one. The use of sequential operator splitting in combination with a long time step is identified as the main reason for the large systematic biases in the old model. The remaining errors in version two in the nucleation rate, related to the competition between condensation and nucleation, have a clear impact on the simulated concentration of cloud condensation nuclei in the lower troposphere. These errors can be significantly reduced by employing an implicit solver that handles production, condensation and nucleation at the same time. Lessons learned in this work underline the need for more caution when treating multi-time-scale problems involving compensating and competing processes, a common occurrence in current climate models.

  19. Using Resin-Based 3D Printing to Build Geometrically Accurate Proxies of Porous Sedimentary Rocks.

    PubMed

    Ishutov, Sergey; Hasiuk, Franciszek J; Jobe, Dawn; Agar, Susan

    2018-05-01

    Three-dimensional (3D) printing is capable of transforming intricate digital models into tangible objects, allowing geoscientists to replicate the geometry of 3D pore networks of sedimentary rocks. We provide a refined method for building scalable pore-network models ("proxies") using stereolithography 3D printing that can be used in repeated flow experiments (e.g., core flooding, permeametry, porosimetry). Typically, this workflow involves two steps, model design and 3D printing. In this study, we explore how the addition of post-processing and validation can reduce uncertainty in the 3D-printed proxy accuracy (difference of proxy geometry from the digital model). Post-processing is a multi-step cleaning of porous proxies involving pressurized ethanol flushing and oven drying. Proxies are validated by: (1) helium porosimetry and (2) digital measurements of porosity from thin-section images of 3D-printed proxies. 3D printer resolution was determined by measuring the smallest open channel in 3D-printed "gap test" wafers. This resolution (400 µm) was insufficient to build porosity of Fontainebleau sandstone (∼13%) from computed tomography data at the sample's natural scale, so proxies were printed at 15-, 23-, and 30-fold magnifications to validate the workflow. Helium porosities of the 3D-printed proxies differed from digital calculations by up to 7% points. Results improved after pressurized flushing with ethanol (e.g., porosity difference reduced to ∼1% point), though uncertainties remain regarding the nature of sub-micron "artifact" pores imparted by the 3D printing process. This study shows the benefits of including post-processing and validation in any workflow to produce porous rock proxies. © 2017, National Ground Water Association.

  20. Faculty Salary Equity: Issues and Options. AIR 1993 Annual Forum Paper.

    ERIC Educational Resources Information Center

    Snyder, Julie K.; And Others

    This paper presents a multi-phased approach to identifying and correcting gender-based salary inequities within institutions of higher education. The major steps in this approach involve: (1) determining if a problem situation exists by using a conceptually sound, objective procedure that does a legal and effective job of explaining faculty…

  1. Isotopic and trace element characteristics of an unusual refractory inclusion from Essebi

    NASA Technical Reports Server (NTRS)

    Deloule, E.; Kennedy, A. K.; Hutcheon, I. D.; Elgoresy, A.

    1993-01-01

    The isotopic and chemical properties of Ca-Al-rich inclusions (CAI) provide important clues to the early solar nebula environment. While the abundances of refractory major and trace elements are similar to those expected for high temperature condensates, the variety of textural, chemical, and isotopic signatures indicate most CAI experienced complex, multi-stage histories involving repeated episodes of condensation, evaporation, and metamorphism. Evidence of multiple processes is especially apparent in an unusual refractory inclusion from Essebi (URIE) described by El Goresy et al. The melilite (mel)-rich core of URIE contains polygonal framboids of spinel (sp) and hibonite (hb) or sp and fassaite (fas) and is surrounded by a rim sequence consisting of five layers. In contrast to rims on Allende, the mineralogy of the URIE rim layers becomes increasingly refractory from the core outwards, ending in a layer of spinel-Al2O3 solid solution + Sc-rich fassaite. The chemical and mineralogical features of URIE are inconsistent with crystallization from a homogeneous melt, and El Goresy et al. proposed a multi-step history involving condensation of sp + hb and aggregation into framboids, capture of framboids by a refractory silicate melt droplet, condensation of rim layers, and alteration of mel to calcite and feldspathoid. The PANURGE ion probe was used to investigate the isotopic and trace element characteristics of URIE to develop a more complete picture of the multiple processes leading to formation and metamorphism.

  2. 'We didn't know anything, it was a mess!' Emergent structures and the effectiveness of a rescue operation multi-team system.

    PubMed

    Fleştea, Alina Maria; Fodor, Oana Cătălina; Curşeu, Petru Lucian; Miclea, Mircea

    2017-01-01

    Multi-team systems (MTS) are used to tackle unpredictable events and to respond effectively to fast-changing environmental contingencies. Their effectiveness is influenced by within- as well as between-team processes (i.e. communication, coordination) and emergent phenomena (i.e. situational awareness). The present case study explores the way in which emergent structures and the involvement of bystanders intertwine with the dynamics of processes and emergent states both within and between the component teams. Our findings show that an inefficient transition process and ambiguous leadership generated poor coordination and hindered the development of emergent phenomena within the whole system. Emergent structures and bystanders substituted leadership functions and provided a pool of critical resources for the MTS. Their involvement fostered the emergence of situational awareness and facilitated contingency planning processes. However, bystander involvement impaired the emergence of cross-understandings and interfered with coordination processes between the component teams. Practitioner Summary: Based on a real emergency situation, the present research provides important theoretical and practical insights about the role of bystander involvement in the dynamics of multi-team systems composed to tackle complex tasks and respond to fast-changing and unpredictable environmental contingencies.

  3. Combined non-parametric and parametric approach for identification of time-variant systems

    NASA Astrophysics Data System (ADS)

    Dziedziech, Kajetan; Czop, Piotr; Staszewski, Wieslaw J.; Uhl, Tadeusz

    2018-03-01

    Identification of systems, structures and machines with variable physical parameters is a challenging task especially when time-varying vibration modes are involved. The paper proposes a new combined, two-step - i.e. non-parametric and parametric - modelling approach in order to determine time-varying vibration modes based on input-output measurements. Single-degree-of-freedom (SDOF) vibration modes from multi-degree-of-freedom (MDOF) non-parametric system representation are extracted in the first step with the use of time-frequency wavelet-based filters. The second step involves time-varying parametric representation of extracted modes with the use of recursive linear autoregressive-moving-average with exogenous inputs (ARMAX) models. The combined approach is demonstrated using system identification analysis based on the experimental mass-varying MDOF frame-like structure subjected to random excitation. The results show that the proposed combined method correctly captures the dynamics of the analysed structure, using minimum a priori information on the model.
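
The second, parametric step can be sketched with recursive least squares with exponential forgetting, here simplified from ARMAX to a time-varying ARX model of a simulated system under random excitation (the model order, forgetting factor, and parameter trajectory are illustrative choices, not taken from the paper):

```python
import numpy as np

rng = np.random.default_rng(1)
N = 2000
u = rng.standard_normal(N)                                  # random excitation
a_true = 0.5 + 0.3 * np.sin(2 * np.pi * np.arange(N) / N)   # time-varying pole
b_true = 1.0

y = np.zeros(N)
for k in range(1, N):
    y[k] = a_true[k] * y[k - 1] + b_true * u[k - 1] + 0.01 * rng.standard_normal()

# Recursive least squares with exponential forgetting (lam < 1)
# tracks the time-varying ARX parameters theta = [a, b].
lam = 0.98
theta = np.zeros(2)
P = np.eye(2) * 1e3
a_hat = np.zeros(N)
for k in range(1, N):
    phi = np.array([y[k - 1], u[k - 1]])     # regressor
    K = P @ phi / (lam + phi @ P @ phi)      # gain vector
    theta = theta + K * (y[k] - phi @ theta)
    P = (P - np.outer(K, phi @ P)) / lam
    a_hat[k] = theta[0]

err = np.mean(np.abs(a_hat[500:] - a_true[500:]))
print(f"mean tracking error of a(k): {err:.3f}")
```

The forgetting factor trades tracking speed against noise sensitivity; in the paper's two-step scheme each wavelet-extracted SDOF mode would be fed to such a recursive estimator separately.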

  4. A Multi-Scale, Multi-Physics Optimization Framework for Additively Manufactured Structural Components

    NASA Astrophysics Data System (ADS)

    El-Wardany, Tahany; Lynch, Mathew; Gu, Wenjiong; Hsu, Arthur; Klecka, Michael; Nardi, Aaron; Viens, Daniel

    This paper proposes an optimization framework enabling the integration of multi-scale / multi-physics simulation codes to perform structural optimization design for additively manufactured components. Cold spray was selected as the additive manufacturing (AM) process and its constraints were identified and included in the optimization scheme. The developed framework first utilizes topology optimization to maximize stiffness for conceptual design. The subsequent step applies shape optimization to refine the design for stress-life fatigue. The component weight was reduced by 20% while stresses were reduced by 75% and the rigidity was improved by 37%. The framework and analysis codes were implemented using Altair software as well as an in-house loading code. The optimized design was subsequently produced by the cold spray process.

  5. Effects of protein and phosphate buffer concentrations on thermal denaturation of lysozyme analyzed by isoconversional method.

    PubMed

    Cao, X M; Tian, Y; Wang, Z Y; Liu, Y W; Wang, C X

    2016-07-03

    Thermal denaturation of lysozyme was studied as a function of protein concentration, phosphate buffer concentration, and scan rate using differential scanning calorimetry (DSC), and the data were analyzed by the isoconversional method. The results showed that lysozyme thermal denaturation was only slightly affected by the protein concentration and scan rate. When the protein concentration and scan rate increased, the denaturation temperature (Tm) also increased accordingly. On the contrary, the Tm decreased with increasing phosphate buffer concentration: denaturation was accelerated and thermal stability reduced as the phosphate concentration increased. Part of the denaturation process, in which aggregation occurred, was irreversible; the remainder was reversible. The apparent activation energy (Ea), computed by the isoconversional method, decreased with increasing conversion ratio (α). The observed denaturation process therefore could not be described by a simple reaction mechanism: it was not a two-state reversible process but a multi-step one. This isoconversional method offers new opportunities for investigating the kinetics of protein denaturation.
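
The isoconversional analysis can be sketched on synthetic DSC-like data: simulate conversion curves at several scan rates for an assumed first-order Arrhenius process, then recover Ea at a fixed conversion via the Friedman method. All rate parameters below are invented; the real denaturation kinetics described in the abstract are more complex than this single-step model.

```python
import numpy as np

R = 8.314            # gas constant, J/(mol K)
Ea_true = 4.0e5      # J/mol, assumed apparent activation energy
A = 1e59             # 1/s, pre-exponential chosen so Tm falls near 340 K

def conversion_curve(beta):
    # Integrate d(alpha)/dT = (A/beta) * exp(-Ea/(R T)) * (1 - alpha)
    # for a heating rate beta (K/s); forward Euler in T.
    T = np.arange(300.0, 380.0, 0.01)
    alpha = np.zeros_like(T)
    for i in range(1, T.size):
        k = A * np.exp(-Ea_true / (R * T[i]))
        alpha[i] = min(alpha[i - 1] + 0.01 * (k / beta) * (1 - alpha[i - 1]), 1.0)
    return T, alpha

betas = np.array([0.5, 1.0, 2.0, 4.0]) / 60.0   # scan rates, K/min -> K/s
alpha_level = 0.5
invT, rates = [], []
for beta in betas:
    T, alpha = conversion_curve(beta)
    dadT = np.gradient(alpha, T)
    invT.append(1.0 / np.interp(alpha_level, alpha, T))
    rates.append(beta * np.interp(alpha_level, alpha, dadT))

# Friedman isoconversional method: at fixed alpha, ln(dalpha/dt)
# versus 1/T is a straight line with slope -Ea/R.
slope = np.polyfit(invT, np.log(rates), 1)[0]
Ea_est = -slope * R
print(f"estimated Ea = {Ea_est / 1e3:.0f} kJ/mol (true: {Ea_true / 1e3:.0f})")
```

Repeating the fit at several alpha levels yields the Ea(α) curve; a strong dependence of Ea on α is the paper's evidence against a single-step mechanism.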

  6. Continuous-Time Random Walk with multi-step memory: an application to market dynamics

    NASA Astrophysics Data System (ADS)

    Gubiec, Tomasz; Kutner, Ryszard

    2017-11-01

    An extended version of the Continuous-Time Random Walk (CTRW) model with memory is herein developed. This memory involves the dependence between an arbitrary number of successive jumps of the process, while waiting times between jumps are considered as i.i.d. random variables. This dependence was established by analyzing empirical histograms for the stochastic process of a single share price on a market within the high-frequency time scale. It was then justified theoretically by considering the bid-ask bounce mechanism, which contains a delay characteristic of any double-auction market. The model proves to be exactly analytically solvable, which enables a direct comparison of its predictions with their empirical counterparts, for instance, with the empirical velocity autocorrelation function. Thus, the present research significantly extends the capabilities of the CTRW formalism. Contribution to the Topical Issue "Continuous Time Random Walk Still Trendy: Fifty-year History, Current State and Outlook", edited by Ryszard Kutner and Jaume Masoliver.
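
The one-step version of such jump memory, the bid-ask bounce, can be sketched as a two-state Markov chain on jump signs with i.i.d. waiting times; its lag-1 jump autocorrelation is 1 - 2p, where p is the reversal probability. This is a deliberate simplification of the paper's arbitrary-order memory, and the reversal probability below is invented.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 200_000
p_reverse = 0.8     # prob. that a jump reverses the previous one (bid-ask bounce)

# Jump signs with one-step memory: each jump flips the previous one
# with probability p_reverse, otherwise repeats it.
jumps = np.empty(n)
jumps[0] = 1.0
for i in range(1, n):
    jumps[i] = -jumps[i - 1] if rng.random() < p_reverse else jumps[i - 1]

# Waiting times between jumps are i.i.d., as in the CTRW framework.
waits = rng.exponential(1.0, n)

# Lag-1 autocorrelation of jumps; for this chain it approaches
# 1 - 2*p_reverse (negative, the anti-persistence of bid-ask bounce).
rho = np.corrcoef(jumps[:-1], jumps[1:])[0, 1]
print(f"empirical lag-1 correlation {rho:.3f}, theory {1 - 2 * p_reverse:.3f}")
```

The negative jump correlation is what produces the characteristic dip in the velocity autocorrelation function that the model is compared against.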

  7. Anti-apoptotic effect of hyperglycemia can allow survival of potentially autoreactive T cells.

    PubMed

    Ramakrishnan, P; Kahn, D A; Baltimore, D

    2011-04-01

    Thymocyte development is a tightly controlled multi-step process involving selective elimination of self-reactive and non-functional T cells by apoptosis. This developmental process depends on signaling by Notch, IL-7 and active glucose metabolism. In this study, we explored the requirement of glucose for thymocyte survival and found that in addition to metabolic regulation, glucose leads to the expression of anti-apoptotic genes. Under hyperglycemic conditions, both mouse and human thymocytes demonstrate enhanced survival. We show that glucose-induced anti-apoptotic genes are dependent on NF-κB p65 because high glucose is unable to attenuate normal ongoing apoptosis of thymocytes isolated from p65 knockout mice. Furthermore, we demonstrate that in vivo hyperglycemia decreases apoptosis of thymocytes allowing for survival of potentially self-reactive thymocytes. These results imply that hyperglycemic conditions could contribute to the development of autoimmunity through dysregulated thymic selection. © 2011 Macmillan Publishers Limited

  8. Convergence and Extrusion Are Required for Normal Fusion of the Mammalian Secondary Palate

    PubMed Central

    Kim, Seungil; Lewis, Ace E.; Singh, Vivek; Ma, Xuefei; Adelstein, Robert; Bush, Jeffrey O.

    2015-01-01

    The fusion of two distinct prominences into one continuous structure is common during development and typically requires integration of two epithelia and subsequent removal of that intervening epithelium. Using confocal live imaging, we directly observed the cellular processes underlying tissue fusion, using the secondary palatal shelves as a model. We find that convergence of a multi-layered epithelium into a single-layer epithelium is an essential early step, driven by cell intercalation, and is concurrent to orthogonal cell displacement and epithelial cell extrusion. Functional studies in mice indicate that this process requires an actomyosin contractility pathway involving Rho kinase (ROCK) and myosin light chain kinase (MLCK), culminating in the activation of non-muscle myosin IIA (NMIIA). Together, these data indicate that actomyosin contractility drives cell intercalation and cell extrusion during palate fusion and suggest a general mechanism for tissue fusion in development. PMID:25848986

  9. TMSEEG: A MATLAB-Based Graphical User Interface for Processing Electrophysiological Signals during Transcranial Magnetic Stimulation.

    PubMed

    Atluri, Sravya; Frehlich, Matthew; Mei, Ye; Garcia Dominguez, Luis; Rogasch, Nigel C; Wong, Willy; Daskalakis, Zafiris J; Farzan, Faranak

    2016-01-01

    Concurrent recording of electroencephalography (EEG) during transcranial magnetic stimulation (TMS) is an emerging and powerful tool for studying brain health and function. Despite a growing interest in adaptation of TMS-EEG across neuroscience disciplines, its widespread utility is limited by signal processing challenges. These challenges arise due to the nature of TMS and the sensitivity of EEG to artifacts that often mask TMS-evoked potentials (TEPs). With an increase in the complexity of data processing methods and a growing interest in multi-site data integration, analysis of TMS-EEG data requires the development of a standardized method to recover TEPs from various sources of artifacts. This article introduces TMSEEG, an open-source MATLAB application comprised of multiple algorithms organized to facilitate a step-by-step procedure for TMS-EEG signal processing. Using a modular design and interactive graphical user interface (GUI), this toolbox aims to streamline TMS-EEG signal processing for both novice and experienced users. Specifically, TMSEEG provides: (i) targeted removal of TMS-induced and general EEG artifacts; (ii) a step-by-step modular workflow with flexibility to modify existing algorithms and add customized algorithms; (iii) a comprehensive display and quantification of artifacts; (iv) quality control check points with visual feedback of TEPs throughout the data processing workflow; and (v) capability to label and store a database of artifacts. In addition to these features, the software architecture of TMSEEG ensures minimal user effort in initial setup and configuration of parameters for each processing step. This is partly accomplished through a close integration with EEGLAB, a widely used open-source toolbox for EEG signal processing. In this article, we introduce TMSEEG, validate its features and demonstrate its application in extracting TEPs across several single- and multi-pulse TMS protocols. 
As the first open-source GUI-based pipeline for TMS-EEG signal processing, this toolbox intends to promote the widespread utility and standardization of an emerging technology in brain research.
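
The step-by-step modular workflow with quality-control check points can be sketched as an ordered list of processing functions with a snapshot stored after each step. The cleaning steps below (baseline removal, pulse-window interpolation, a crude moving-average filter) are illustrative stand-ins, not TMSEEG's actual algorithms.

```python
import numpy as np

def remove_baseline(x, fs, t):
    return x - x[t < 0].mean()           # subtract pre-stimulus mean

def interpolate_pulse(x, fs, t, win=(0.0, 0.01)):
    # Replace the TMS pulse window with a linear interpolation, a
    # common first step before any filtering touches the artifact.
    i0, i1 = np.searchsorted(t, win)
    y = x.copy()
    y[i0:i1] = np.linspace(y[i0 - 1], y[i1], i1 - i0)
    return y

def lowpass(x, fs, t, width=5):
    k = np.ones(width) / width
    return np.convolve(x, k, mode="same")   # crude moving-average filter

# Modular workflow: ordered (name, function) pairs, with a quality-
# control snapshot stored after every step (cf. TMSEEG's check points).
pipeline = [("baseline", remove_baseline),
            ("pulse_interp", interpolate_pulse),
            ("lowpass", lowpass)]

fs = 1000.0
t = (np.arange(400) - 100) / fs              # 0.4 s epoch, stimulus at t = 0
rng = np.random.default_rng(3)
eeg = 2.0 + rng.standard_normal(t.size)      # noise plus DC offset
eeg[(t >= 0) & (t < 0.01)] += 500.0          # huge simulated TMS artifact

snapshots = {}
x = eeg
for name, step in pipeline:
    x = step(x, fs, t)
    snapshots[name] = x.copy()               # QC check point

print(f"peak amplitude before {eeg.max():.0f}, after {x.max():.1f}")
```

Keeping a snapshot per step is what makes the visual-feedback check points and the flexibility to swap algorithms cheap to implement.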

  10. TMSEEG: A MATLAB-Based Graphical User Interface for Processing Electrophysiological Signals during Transcranial Magnetic Stimulation

    PubMed Central

    Atluri, Sravya; Frehlich, Matthew; Mei, Ye; Garcia Dominguez, Luis; Rogasch, Nigel C.; Wong, Willy; Daskalakis, Zafiris J.; Farzan, Faranak

    2016-01-01

    Concurrent recording of electroencephalography (EEG) during transcranial magnetic stimulation (TMS) is an emerging and powerful tool for studying brain health and function. Despite a growing interest in adaptation of TMS-EEG across neuroscience disciplines, its widespread utility is limited by signal processing challenges. These challenges arise due to the nature of TMS and the sensitivity of EEG to artifacts that often mask TMS-evoked potentials (TEPs). With an increase in the complexity of data processing methods and a growing interest in multi-site data integration, analysis of TMS-EEG data requires the development of a standardized method to recover TEPs from various sources of artifacts. This article introduces TMSEEG, an open-source MATLAB application comprised of multiple algorithms organized to facilitate a step-by-step procedure for TMS-EEG signal processing. Using a modular design and interactive graphical user interface (GUI), this toolbox aims to streamline TMS-EEG signal processing for both novice and experienced users. Specifically, TMSEEG provides: (i) targeted removal of TMS-induced and general EEG artifacts; (ii) a step-by-step modular workflow with flexibility to modify existing algorithms and add customized algorithms; (iii) a comprehensive display and quantification of artifacts; (iv) quality control check points with visual feedback of TEPs throughout the data processing workflow; and (v) capability to label and store a database of artifacts. In addition to these features, the software architecture of TMSEEG ensures minimal user effort in initial setup and configuration of parameters for each processing step. This is partly accomplished through a close integration with EEGLAB, a widely used open-source toolbox for EEG signal processing. In this article, we introduce TMSEEG, validate its features and demonstrate its application in extracting TEPs across several single- and multi-pulse TMS protocols. 
As the first open-source GUI-based pipeline for TMS-EEG signal processing, this toolbox intends to promote the widespread utility and standardization of an emerging technology in brain research. PMID:27774054

  11. Multi-GPU implementation of a VMAT treatment plan optimization algorithm.

    PubMed

    Tian, Zhen; Peng, Fei; Folkerts, Michael; Tan, Jun; Jia, Xun; Jiang, Steve B

    2015-06-01

    Volumetric modulated arc therapy (VMAT) optimization is a computationally challenging problem due to its large data size, high degrees of freedom, and many hardware constraints. High-performance graphics processing units (GPUs) have been used to speed up the computations. However, GPU's relatively small memory size cannot handle cases with a large dose-deposition coefficient (DDC) matrix in cases of, e.g., those with a large target size, multiple targets, multiple arcs, and/or small beamlet size. The main purpose of this paper is to report an implementation of a column-generation-based VMAT algorithm, previously developed in the authors' group, on a multi-GPU platform to solve the memory limitation problem. While the column-generation-based VMAT algorithm has been previously developed, the GPU implementation details have not been reported. Hence, another purpose is to present detailed techniques employed for GPU implementation. The authors also would like to utilize this particular problem as an example problem to study the feasibility of using a multi-GPU platform to solve large-scale problems in medical physics. The column-generation approach generates VMAT apertures sequentially by solving a pricing problem (PP) and a master problem (MP) iteratively. In the authors' method, the sparse DDC matrix is first stored on a CPU in coordinate list format (COO). On the GPU side, this matrix is split into four submatrices according to beam angles, which are stored on four GPUs in compressed sparse row format. Computation of beamlet price, the first step in PP, is accomplished using multi-GPUs. A fast inter-GPU data transfer scheme is accomplished using peer-to-peer access. The remaining steps of PP and MP problems are implemented on CPU or a single GPU due to their modest problem scale and computational loads. Barzilai and Borwein algorithm with a subspace step scheme is adopted here to solve the MP problem. 
    A head and neck (H&N) cancer case is then used to validate the authors' method. The authors also compare their multi-GPU implementation with three different single-GPU implementation strategies, i.e., truncating the DDC matrix (S1), repeatedly transferring the DDC matrix between CPU and GPU (S2), and porting computations involving the DDC matrix to CPU (S3), in terms of both plan quality and computational efficiency. Two more H&N patient cases and three prostate cases are used to demonstrate the advantages of the authors' method. The authors' multi-GPU implementation can finish the optimization process within ∼1 min for the H&N patient case. S1 leads to an inferior plan quality although its total time was 10 s shorter than the multi-GPU implementation due to the reduced matrix size. S2 and S3 yield the same plan quality as the multi-GPU implementation but take ∼4 and ∼6 min, respectively. High computational efficiency was consistently achieved for the other five patient cases tested, with VMAT plans of clinically acceptable quality obtained within 23-46 s. By comparison, to obtain clinically comparable or acceptable plans for all six of these VMAT cases that the authors have tested in this paper, the optimization time needed in a commercial TPS system on CPU was found to be on the order of several minutes. The results demonstrate that the multi-GPU implementation of the authors' column-generation-based VMAT optimization can handle the large-scale VMAT optimization problem efficiently without sacrificing plan quality. The authors' study may serve as an example to shed some light on other large-scale medical physics problems that require multi-GPU techniques.
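
The per-device data layout, splitting the sparse DDC matrix by beam angle and evaluating the beamlet price D^T r per submatrix, can be sketched in a single process with scipy.sparse. Dimensions and density are invented, and each submatrix merely stands in for one GPU's share of the data.

```python
import numpy as np
from scipy import sparse

rng = np.random.default_rng(4)
n_vox, n_beamlets, n_angles = 500, 400, 4
# Dose-deposition coefficient (DDC) matrix in COO format; columns are
# beamlets, assumed grouped contiguously by beam angle.
D = sparse.random(n_vox, n_beamlets, density=0.02, random_state=4, format="coo")

# Split by beam angle into CSR submatrices, one per "GPU".
cols_per_angle = np.array_split(np.arange(n_beamlets), n_angles)
subs = [D.tocsc()[:, cols].tocsr() for cols in cols_per_angle]

# Beamlet price (first step of the pricing problem): price = D^T r,
# evaluated per submatrix and concatenated -- the per-device layout.
r = rng.standard_normal(n_vox)               # dose residual vector
price_split = np.concatenate([S.T @ r for S in subs])
price_full = D.tocsr().T @ r

print(np.allclose(price_split, price_full))  # split result == monolithic
```

Because the price computation is columnwise-separable, splitting the matrix changes nothing mathematically; only the inter-device transfer of r and of the concatenated result remains to be engineered, which is where the peer-to-peer scheme in the paper comes in.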

  12. Multisensor data fusion across time and space

    NASA Astrophysics Data System (ADS)

    Villeneuve, Pierre V.; Beaven, Scott G.; Reed, Robert A.

    2014-06-01

    Field measurement campaigns typically deploy numerous sensors having different sampling characteristics for spatial, temporal, and spectral domains. Data analysis and exploitation is made more difficult and time consuming as the sample data grids between sensors do not align. This report summarizes our recent effort to demonstrate feasibility of a processing chain capable of "fusing" image data from multiple independent and asynchronous sensors into a form amenable to analysis and exploitation using commercially-available tools. Two important technical issues were addressed in this work: 1) image spatial registration onto a common pixel grid, and 2) image temporal interpolation onto a common time base. The first step leverages existing image matching and registration algorithms. The second step relies upon a new and innovative use of optical flow algorithms to perform accurate temporal upsampling of slower frame rate imagery. Optical flow field vectors were first derived from high-frame-rate, high-resolution imagery, and then used as a basis for temporal upsampling of the slower frame rate sensor's imagery. Optical flow field values are computed using a multi-scale image pyramid, thus allowing for more extreme object motion. This involves preprocessing imagery to varying resolution scales and initializing new vector flow estimates using those from the previous coarser-resolution image. Overall performance of this processing chain is demonstrated using sample data involving complex motion observed by multiple sensors mounted to the same base, including a high-speed visible camera and a coarser-resolution LWIR camera.
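
The temporal-upsampling step can be sketched as a backward warp of a slow-sensor frame by a fraction of an optical-flow field. Here the flow is specified directly rather than estimated from a fast camera, and the scene is a toy translating square; both are stand-ins for the report's real data.

```python
import numpy as np
from scipy.ndimage import map_coordinates

def upsample_frame(frame, flow, s):
    """Backward-warp `frame` by fraction s of the flow field to
    synthesize an intermediate frame (s in [0, 1])."""
    h, w = frame.shape
    yy, xx = np.mgrid[0:h, 0:w].astype(float)
    # Sample the source frame at positions displaced by -s * flow
    # (flow[..., 0] = x displacement, flow[..., 1] = y displacement).
    coords = [yy - s * flow[..., 1], xx - s * flow[..., 0]]
    return map_coordinates(frame, coords, order=1, mode="nearest")

# Toy scene: a bright square translating 4 px/frame to the right,
# standing in for the slow-frame-rate sensor's imagery.
f0 = np.zeros((64, 64)); f0[20:30, 10:20] = 1.0
f1 = np.zeros((64, 64)); f1[20:30, 14:24] = 1.0
flow = np.zeros((64, 64, 2)); flow[..., 0] = 4.0   # known/estimated flow

mid = upsample_frame(f0, flow, 0.5)                # synthetic frame at t = 0.5
true_mid = np.zeros((64, 64)); true_mid[20:30, 12:22] = 1.0
err = np.abs(mid - true_mid).mean()
print(f"mean error vs ground-truth midpoint frame: {err:.4f}")
```

In the full chain the flow field itself would come from a multi-scale (pyramid) estimator run on the high-frame-rate imagery, then applied to the slower sensor after spatial registration.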

  13. Business Modeling to Implement an eHealth Portal for Infection Control: A Reflection on Co-Creation With Stakeholders.

    PubMed

    van Limburg, Maarten; Wentzel, Jobke; Sanderman, Robbert; van Gemert-Pijnen, Lisette

    2015-08-13

    It is acknowledged that the success and uptake of eHealth improve with the involvement of users and stakeholders to make technology reflect their needs. Involving stakeholders in implementation research is thus a crucial element in developing eHealth technology. Business modeling is an approach to guide implementation research for eHealth. Stakeholders are involved in business modeling by identifying relevant stakeholders, conducting value co-creation dialogs, and co-creating a business model. Because implementation activities are often underestimated as a crucial step while developing eHealth, comprehensive and applicable approaches geared toward business modeling in eHealth are scarce. This paper demonstrates the potential of several stakeholder-oriented analysis methods and their practical application, using Infectionmanager as an example case. In this paper, we aim to demonstrate how business modeling, with the focus on stakeholder involvement, is used to co-create an eHealth implementation. We divided business modeling into 4 main research steps. As part of stakeholder identification, we performed literature scans, expert recommendations, and snowball sampling (Step 1). For stakeholder analyses, we performed "basic stakeholder analysis," stakeholder salience, and ranking/analytic hierarchy process (Step 2). For value co-creation dialogs, we performed a process analysis and stakeholder interviews based on the business model canvas (Step 3). Finally, for business model generation, we combined all findings into the business model canvas (Step 4). Based on the applied methods, we synthesized a step-by-step guide for business modeling with stakeholder-oriented analysis methods that we consider suitable for implementing eHealth. The step-by-step guide for business modeling with stakeholder involvement enables eHealth researchers to apply a systematic and multidisciplinary, co-creative approach for implementing eHealth. Business modeling thereby becomes an active part of the entire development process of eHealth and establishes an early focus on implementation, in which stakeholders help to co-create the basis necessary for successful uptake of the eHealth technology.
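
The ranking/analytic hierarchy process of Step 2 can be sketched as a pairwise-comparison matrix whose principal eigenvector gives stakeholder priority weights, with a consistency check on the judgments. The stakeholder names and comparison values below are invented for illustration; in practice they come from expert judgments on Saaty's 1-9 scale.

```python
import numpy as np

stakeholders = ["clinicians", "IT dept", "management", "patients"]
# A[i, j] = how much more important stakeholder i is than j
# (reciprocal matrix on Saaty's scale; values are hypothetical).
A = np.array([[1.0, 3.0, 2.0, 5.0],
              [1/3, 1.0, 1/2, 2.0],
              [1/2, 2.0, 1.0, 3.0],
              [1/5, 1/2, 1/3, 1.0]])

# AHP: the principal right eigenvector of A gives the priority weights.
vals, vecs = np.linalg.eig(A)
w = np.real(vecs[:, np.argmax(np.real(vals))])
w = w / w.sum()

# Consistency ratio; CR < 0.1 is conventionally acceptable.
lam_max = np.real(vals).max()
n = A.shape[0]
CI = (lam_max - n) / (n - 1)
CR = CI / 0.90                        # random index RI = 0.90 for n = 4

for s, wi in sorted(zip(stakeholders, w), key=lambda t: -t[1]):
    print(f"{s:12s} {wi:.3f}")
print(f"consistency ratio = {CR:.3f}")
```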

  14. The handling of thin substrates and its potential for new architectures in multi-junction solar cells technology

    NASA Astrophysics Data System (ADS)

    Colin, Clément; Jaouad, Abdelatif; Darnon, Maxime; De Lafontaine, Mathieu; Volatier, Maïté; Boucherif, Abderraouf; Arès, Richard; Fafard, Simon; Aimez, Vincent

    2017-09-01

    In this paper, we investigate the development of a robust handling process for thin (<50 µm) substrates in the framework of monolithic multi-junction solar cell (MJSC) technology. The process, designed for versatility, is based on temporary front-side bonding of the cell with a polymeric adhesive followed by permanent back-side soldering, allowing classical cell micro-fabrication steps on both sides of the wafer. We have demonstrated that the process does not degrade the performance of monolithic MJSCs with Ge substrate thickness reduced from 170 µm to 25 µm. Finally, we investigate a perspective unlocked by this work: the study of 3D-interconnect architectures for multi-junction solar cells.

  15. Improving the two-step remediation process for CCA-treated wood. Part I, Evaluating oxalic acid extraction

    Treesearch

    Carol Clausen

    2004-01-01

    In this study, three possible improvements to a remediation process for chromated-copper-arsenate (CCA) treated wood were evaluated. The process involves two steps: oxalic acid extraction of wood fiber followed by bacterial culture with Bacillus licheniformis CC01. The three potential improvements to the oxalic acid extraction step were (1) reusing oxalic acid for...

  16. Single-step affinity purification of enzyme biotherapeutics: a platform methodology for accelerated process development.

    PubMed

    Brower, Kevin P; Ryakala, Venkat K; Bird, Ryan; Godawat, Rahul; Riske, Frank J; Konstantinov, Konstantin; Warikoo, Veena; Gamble, Jean

    2014-01-01

    Downstream sample purification for quality attribute analysis is a significant bottleneck in process development for non-antibody biologics. Multi-step chromatography process train purifications are typically required prior to many critical analytical tests. This prerequisite leads to limited throughput, long lead times to obtain purified product, and significant resource requirements. In this work, immunoaffinity purification technology has been leveraged to achieve single-step affinity purification of two different enzyme biotherapeutics (Fabrazyme® [agalsidase beta] and Enzyme 2) with polyclonal and monoclonal antibodies, respectively, as ligands. Target molecules were rapidly isolated from cell culture harvest in sufficient purity to enable analysis of critical quality attributes (CQAs). Most importantly, this is the first study that demonstrates the application of predictive analytics techniques to predict critical quality attributes of a commercial biologic. The data obtained using the affinity columns were used to generate appropriate models to predict quality attributes that would be obtained after traditional multi-step purification trains. These models empower process development decision-making with drug substance-equivalent product quality information without generation of actual drug substance. Optimization was performed to ensure maximum target recovery and minimal target protein degradation. The methodologies developed for Fabrazyme were successfully reapplied for Enzyme 2, indicating platform opportunities. The impact of the technology is significant, including reductions in time and personnel requirements, rapid product purification, and substantially increased throughput. Applications are discussed, including upstream and downstream process development support to achieve the principles of Quality by Design (QbD) as well as integration with bioprocesses as a process analytical technology (PAT). © 2014 American Institute of Chemical Engineers.
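
The predictive-analytics idea, mapping a quality attribute measured after single-step affinity capture to the value expected after the full multi-step purification train, can be sketched as a simple regression on paired lots. All numbers below are synthetic; the study's actual models and attributes are not described in this abstract.

```python
import numpy as np

rng = np.random.default_rng(5)
# Paired (synthetic) data: a quality attribute -- say percent aggregate --
# measured after single-step affinity capture (x) and after the full
# multi-step purification train (y). In the study such pairs would come
# from historical development lots.
x_affinity = rng.uniform(1.0, 6.0, 20)
y_train = 0.6 * x_affinity + 0.3 + rng.normal(0, 0.1, 20)

# Minimal predictive model: a least-squares line mapping the affinity
# measurement to the expected drug-substance value.
slope, intercept = np.polyfit(x_affinity, y_train, 1)

new_lot = 3.5                                 # new affinity-column measurement
predicted = slope * new_lot + intercept
print(f"predicted drug-substance attribute: {predicted:.2f}")
```

The point of such a model is decision-making: development teams get drug-substance-equivalent quality information without running the full multi-step train.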

  17. Polysaccharide of Black cumin (Nigella sativa) modulates molecular signaling cascade of gastric ulcer pathogenesis.

    PubMed

    Manjegowda, Srikanta Belagihalli; Rajagopal, Harsha Mysore; Dharmesh, Shylaja Mallaiah

    2017-08-01

    Gastric ulcer is a multi-step disease and healing requires a complex process including repair and re-architecture of gastric mucosa with the involvement of molecular events. The current study was designed to understand the gastric ulcer healing mechanism of the rhamnogalacturonan-I type pectic polysaccharide of black cumin (BCPP) utilizing acetic acid induced gastric ulcers in rats. BCPP-fed groups at 200 mg/kg b.w. for 10 days showed up to 85% healing of gastric ulcers with modulation of key molecular events involved in the ulcer healing process, such as increases in gastric mucin content, cyclooxygenase-2 (COX-2) and prostaglandin E2 (PGE2). The increase in extracellular signal-regulated kinase-2 (ERK-2) indicated that BCPP could induce PGE2 synthesis by increasing ERK-2 mediated COX-2 activity. The increase in matrix metalloproteinase-2 (MMP-2) and decrease in MMP-9 levels in BCPP-treated groups indicated differential regulation of MMP-2 and MMP-9, an essential event required for gastric mucosal re-modulation. BCPP containing bound phenolics (26 mg/g) might have also played a role in increasing the speed and quality of ulcer healing by inhibiting H+, K+-ATPase and decreasing free radical mediated oxidation and cellular damage. Overall, the studies showed that the polysaccharide can mediate ulcer healing by modulating signaling pathways involved in either ulcer aggravation or the healing process. Copyright © 2017. Published by Elsevier B.V.

  18. English for Everyday Activities: A Picture Process Dictionary.

    ERIC Educational Resources Information Center

    Zwier, Lawrence J.

    These books are designed to help English-as-a-Second-Language (ESL) students learn the skills they need to communicate the step-by-step aspects of daily activities. Unlike most picture dictionaries, this is a verb-based multi-skills program that uses a student text with a clear and colorful pictorial detail as a starting point and focuses on the…

  19. The activation of human endogenous retrovirus K (HERV-K) is implicated in melanoma cell malignant transformation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Serafino, A.; Balestrieri, E.; Pierimarchi, P.

    2009-03-10

    Melanoma development is a multi-step process arising from a series of genetic and epigenetic events. Although the sequential stages involved in progression from melanocytes to malignant melanoma are clearly defined, our current understanding of the mechanisms leading to melanoma onset is still incomplete. Growing evidence shows that the activation of endogenous retroviral sequences might be involved in transformation of melanocytes as well as in the increased ability of melanoma cells to escape immune surveillance. Here we show that human melanoma cells in vitro undergo a transition from an adherent to a more malignant, non-adherent phenotype when exposed to stress conditions. Melanoma-derived non-adherent cells are characterized by an increased proliferative potential and a decreased expression of both HLA class I molecules and Melan-A/MART-1 antigen, similarly to highly malignant cells. These phenotypic and functional modifications are accompanied by the activation of human endogenous retrovirus K (HERV-K) expression and massive production of viral-like particles. Down-regulation of HERV-K expression by RNA interference prevents the transition from the adherent to the non-adherent growth phenotype in low serum. These results implicate HERV-K in at least some critical steps of melanoma progression.

  20. Characteristics of lipid micro- and nanoparticles based on supercritical formation for potential pharmaceutical application

    PubMed Central

    2013-01-01

    The pharmaceutical industry's interest in lipid drug delivery systems, owing to their prolonged release profile, biocompatibility, reduction of side effects, and so on, is well established. However, conventional methods of preparing these structures for use and production in the pharmaceutical industry are problematic, since they are usually multi-step and involve large amounts of organic solvents. Furthermore, some processes require extreme conditions, which can lead to increased heterogeneity of particle size and degradation of the drug. An alternative for drug delivery system production is the supercritical fluid technique. Lipid particles produced by supercritical fluids have shown different physicochemical properties in comparison to lipid particles produced by classical methods; in particular, they have shown greater physical stability and narrower size distributions. In this paper, a critical overview of supercritical fluid-based processes for the production of lipid micro- and nanoparticles is given and the most important characteristics of each process are highlighted. PMID:24034341

  1. Auto Guided Oil Palm Planter by using multi-GNSS

    NASA Astrophysics Data System (ADS)

    Nur Aini, I.; W, Aimrun; Amin, M. S. M.; Ezrin, M. H.; Shafri, H. Z.

    2014-06-01

    Planting is one of the most important operations in a plantation because, as the starting point of cultivation, it affects the productivity of the total area. In an oil palm plantation, the lining and spacing of oil palms must be laid out to coincide with the topography and the drainage system. Conventionally, planting oil palm requires a lining ("polarization") process to counter the influence of sunrise direction and obtain regular crop rows. Polarization is done after clearing of the area is complete, using wooden spikes 1 m long, painted at the top, and a 100 m length of wire. This generally requires at least five people at a time to pull the wire and carry the spikes, while two others act as observer and spike craftsman respectively; such a team can cover about 3 ha/day. The aim of this project was therefore to develop an oil palm planting technique using multi-GNSS (Global Navigation Satellite System). The project involved five main steps: design of the planting pattern using SOLIDWORKS software, determination of the boundary coordinates of the planting area, georeferencing with ArcGIS, stakeout with Tracy software and, finally, marking the locations with wooden spikes. The results showed that multi-GNSS can provide high accuracy, with positioning errors of less than 1 m, without augmentation data. With a single person, the time taken to complete a 70 m × 50 m planting area was 290 min, which is 25 min faster than using GPS (Global Positioning System) alone.

  2. Interactive Design Strategy for a Multi-Functional PAMAM Dendrimer-Based Nano-Therapeutic Using Computational Models and Experimental Analysis

    PubMed Central

    Lee, Inhan; Williams, Christopher R.; Athey, Brian D.; Baker, James R.

    2010-01-01

    Molecular dynamics simulations of nano-therapeutics as a final product and of all intermediates in the process of generating a multi-functional nano-therapeutic based on a poly(amidoamine) (PAMAM) dendrimer were performed along with chemical analyses of each of them. The actual structures of the dendrimers were predicted, based on potentiometric titration, gel permeation chromatography, and NMR. The chemical analyses determined the numbers of functional molecules, based on the actual structure of the dendrimer. Molecular dynamics simulations calculated the configurations of the intermediates and the radial distributions of functional molecules, based on their numbers. This interactive process between the simulation results and the chemical analyses provided a further strategy to design the next reaction steps and to gain insight into the products at each chemical reaction step. PMID:20700476

  3. Lights, Camera, Action: Facilitating the Design and Production of Effective Instructional Videos

    ERIC Educational Resources Information Center

    Di Paolo, Terry; Wakefield, Jenny S.; Mills, Leila A.; Baker, Laura

    2017-01-01

    This paper outlines a rudimentary process intended to guide faculty in K-12 and higher education through the steps involved to produce video for their classes. The process comprises four steps: planning, development, delivery and reflection. Each step is infused with instructional design information intended to support the collaboration between…

  4. Dynamic Neuromuscular Control of the Lower Limbs in Response to Unexpected Single-Planar versus Multi-Planar Support Perturbations in Young, Active Adults.

    PubMed

    Malfait, Bart; Staes, Filip; de Vries, Aijse; Smeets, Annemie; Hawken, Malcolm; Robinson, Mark A; Vanrenterghem, Jos; Verschueren, Sabine

    2015-01-01

    An anterior cruciate ligament (ACL) injury involves a multi-planar injury mechanism. Nevertheless, unexpected multi-planar perturbations have not yet been used to screen athletes in the context of ACL injury prevention, although they could reveal those more at risk. The objective of this study was to compare neuromuscular responses to multi-planar (MPP) and single-planar perturbations (SPP) during a stepping-down task. These results might serve as a basis for future implementation of external perturbations in ACL injury screening programs. Thirteen young adults performed a single-leg stepping-down task in eight conditions (four MPP and four SPP, each with a specified amplitude and velocity). The amplitudes of vastus lateralis (VL), vastus medialis (VM), lateral hamstrings (HL) and medial hamstrings (HM) EMG activity, medio-lateral and anterior-posterior centre of mass (COM) displacements, and the peak knee flexion and abduction angles were compared between conditions using a one-way ANOVA. The number of stepping responses was monitored during all conditions. Significantly greater muscle activity levels were found in response to the more challenging MPP and SPP compared to the less challenging conditions (p < 0.05). No differences in neuromuscular activity were found between the MPP conditions and their equivalents in the SPP. Eighteen stepping responses were monitored in the SPP versus nine in the MPP, indicating that overall neuromuscular control was even more challenged during the SPP, which was supported by greater COM displacements in the SPP. The more intense MPP and SPP evoked different neuromuscular responses, resulting in greater muscle activity levels compared to small perturbations. Based on the COM displacements and the number of stepping responses, dynamic neuromuscular control of the knee joint appeared less challenged during the MPP. Therefore, future work should investigate extensively whether other neuromuscular differences (i.e. co-activation patterns and kinetics) exist between MPP and SPP. In addition, future work should examine the influence on neuromuscular control of the magnitude of the perturbations and of the stepping height and stepping distance.

  5. An Evaluation of the Peer Helper Component of "Go!": A Multimessage, Multi-"Step" Obesity Prevention Intervention

    ERIC Educational Resources Information Center

    de Souza, Rebecca; Dauner, Kim Nichols; Goei, Ryan; LaCaille, Lara; Kotowski, Michael R.; Schultz, Jennifer Feenstra; LaCaille, Rick; Versnik Nowak, Amy L.

    2014-01-01

    Background: Obesity prevention efforts typically involve changing eating and exercise behaviors as well as the physical and social environment in which those behaviors occur. Due to existing social networks, worksites are a logical choice for implementing such interventions. Purpose: This article describes the development and implementation of a…

  6. Measuring the Quality of Life of University Students. Research Monograph Series. Volume 1.

    ERIC Educational Resources Information Center

    Roberts, Lance W.; Clifton, Rodney A.

    This study sought to develop a valid set of scales in the cognitive and affective domains for measuring the quality of life of university students. In addition, the study attempted to illustrate the usefulness of Thomas Piazza's procedures for constructing valid scales in educational research. Piazza's method involves a multi-step construction of…

  7. Discrepancy between mRNA and protein abundance: Insight from information retrieval process in computers

    PubMed Central

    Wang, Degeng

    2008-01-01

    Discrepancy between the abundance of cognate protein and RNA molecules is frequently observed. A theoretical understanding of this discrepancy remains elusive, and it is frequently described in the literature as a surprise and/or a technical difficulty. Protein and RNA represent different steps of the multi-step cellular genetic information flow process, in which they are dynamically produced and degraded. This paper explores a comparison with a similar process in computers: multi-step information flow from the storage level to the execution level. Functional similarities can be found in almost every facet of the retrieval process. Firstly, a common architecture is shared, as the ribonome (RNA space) and the proteome (protein space) are functionally similar to computer primary memory and computer cache memory, respectively. Secondly, the retrieval process functions, in both systems, to support the operation of dynamic networks – biochemical regulatory networks in cells and, in computers, the virtual networks (of CPU instructions) that the CPU travels through while executing computer programs. Moreover, many regulatory techniques are implemented in computers at each step of the information retrieval process, with the goal of optimizing system performance. Cellular counterparts can easily be identified for these regulatory techniques. In other words, this comparative study attempts to use theoretical insight from computer system design principles as a catalyst to sketch an integrative view of the gene expression process, that is, how it functions to ensure efficient operation of the overall cellular regulatory network. In the context of this bird's-eye view, the discrepancy between protein and RNA abundance becomes a logical observation one would expect. This discrepancy, when interpreted in the context of system operation, may serve as a source of information for deciphering the regulatory logic underlying biochemical network operation. PMID:18757239

  8. Discrepancy between mRNA and protein abundance: insight from information retrieval process in computers.

    PubMed

    Wang, Degeng

    2008-12-01

    Discrepancy between the abundance of cognate protein and RNA molecules is frequently observed. A theoretical understanding of this discrepancy remains elusive, and it is frequently described in the literature as a surprise and/or a technical difficulty. Protein and RNA represent different steps of the multi-step cellular genetic information flow process, in which they are dynamically produced and degraded. This paper explores a comparison with a similar process in computers: multi-step information flow from the storage level to the execution level. Functional similarities can be found in almost every facet of the retrieval process. Firstly, a common architecture is shared, as the ribonome (RNA space) and the proteome (protein space) are functionally similar to computer primary memory and computer cache memory, respectively. Secondly, the retrieval process functions, in both systems, to support the operation of dynamic networks - biochemical regulatory networks in cells and, in computers, the virtual networks (of CPU instructions) that the CPU travels through while executing computer programs. Moreover, many regulatory techniques are implemented in computers at each step of the information retrieval process, with the goal of optimizing system performance. Cellular counterparts can easily be identified for these regulatory techniques. In other words, this comparative study attempts to use theoretical insight from computer system design principles as a catalyst to sketch an integrative view of the gene expression process, that is, how it functions to ensure efficient operation of the overall cellular regulatory network. In the context of this bird's-eye view, the discrepancy between protein and RNA abundance becomes a logical observation one would expect. This discrepancy, when interpreted in the context of system operation, may serve as a source of information for deciphering the regulatory logic underlying biochemical network operation.

  9. Multi-phase functionalization of titanium for enhanced photon absorption in the vis-NIR region.

    PubMed

    Thakur, Pooja; Tan, Bo; Venkatakrishnan, Krishnan

    2015-10-19

    Inadequate absorption of near-infrared (NIR) photons by conventional silicon solar cells has been a major stumbling block to attaining a high-efficiency "full spectrum" solar cell. An effective enhancement in the absorption of such photons is desirable, as they account for a considerable portion of the tappable solar energy. In this work, we report a remarkable gain in the absorption of photons in the near-infrared and visible region (400 nm-1000 nm) by a novel multi-phased oxide of titanium. Synthesised via a single-step ultrafast laser pulse interaction with pure titanium, this oxide of titanium has been identified by characterisation studies to be multi-phased, composed of Ti3O, (TiO.716)3.76 and TiO2 (rutile). Computed to have an average band gap of 2.39 eV, this ultrafast-laser-induced multi-phased titanium oxide exhibits notably steady absorption in the NIR range of 750-1000 nm, which, to the best of our knowledge, has never been reported before. The unique NIR absorption properties of the laser-functionalised titanium, coupled with the simplicity and versatility of the ultrafast laser interaction process involved, provide tremendous potential for the photon sensitization of titanium and, thereafter, for the inception of a "full spectrum" solar device.

  10. 75 FR 42378 - Fisheries of the South Atlantic; Southeast Data, Assessment, and Review (SEDAR); South Atlantic...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-07-21

    ... (SEDAR) process, a multi-step method for determining the status of fish stocks in the Southeast Region. SEDAR includes a Data Workshop, a Stock Assessment Process and a Review Workshop. The product of the... datasets are appropriate for assessment analyses. The product of the Stock Assessment Process is a stock...

  11. Measurement needs guided by synthetic radar scans in high-resolution model output

    NASA Astrophysics Data System (ADS)

    Varble, A.; Nesbitt, S. W.; Borque, P.

    2017-12-01

    Microphysical and dynamical process interactions within deep convective clouds are not well understood, partly because measurement strategies often focus on statistics of cloud state rather than cloud processes. While processes cannot be directly measured, they can be inferred with sufficiently frequent and detailed scanning radar measurements focused on the life cycle of individual cloud regions. This is a primary goal of the 2018-19 DOE ARM Cloud, Aerosol, and Complex Terrain Interactions (CACTI) and NSF Remote sensing of Electrification, Lightning, And Mesoscale/microscale Processes with Adaptive Ground Observations (RELAMPAGO) field campaigns in central Argentina, where orographic deep convective initiation is frequent, with some high-impact systems growing into the tallest and largest in the world. An array of fixed and mobile scanning multi-wavelength dual-polarization radars will be coupled with surface observations, sounding systems, multi-wavelength vertical profilers, and aircraft in situ measurements to characterize convective cloud life cycles and their relationship with environmental conditions. While detailed cloud processes are an observational target, the radar scan patterns that are most ideal for observing them are unclear. They depend on the locations and scales of key microphysical and dynamical processes operating within the cloud. High-resolution simulations of clouds, while imperfect, can provide information on these locations and scales that guides radar measurement needs. Radar locations are set in the model domain based on planned experiment locations, and simulated orographic deep convective initiation and upscale growth are sampled using a number of different scans involving RHIs or PPIs with predefined elevation and azimuthal angles that approximately conform to radar range and beam width specifications. Each full scan pattern is applied to output at single model time steps, with time step intervals that depend on the length of time required to complete each scan in the real world. The ability of different scans to detect key processes within the convective cloud life cycle is examined in connection with previous and subsequent dynamical and microphysical transitions. This work will guide the strategic scan patterns used during CACTI and RELAMPAGO.

  12. Cross-cultural adaptation of instruments assessing breastfeeding determinants: a multi-step approach

    PubMed Central

    2014-01-01

    Background Cross-cultural adaptation is a necessary process for effectively using existing instruments in other cultural and language settings. The process of cross-culturally adapting an existing instrument, including translation, is a critical step in establishing a meaningful instrument for use in another setting. Using a multi-step approach is considered best practice for achieving cultural and semantic equivalence in the adapted version. We aimed to ensure the content validity of our instruments in the cultural context of KwaZulu-Natal, South Africa. Methods The Iowa Infant Feeding Attitudes Scale, Breastfeeding Self-Efficacy Scale-Short Form and additional items comprise our consolidated instrument, which was cross-culturally adapted using a multi-step approach during August 2012. Cross-cultural adaptation was achieved through steps to maintain content validity and attain semantic equivalence in the target version. Specifically, Lynn's recommendation to apply an item-level content validity index score was followed. The revised instrument was translated and back-translated. To ensure semantic equivalence, Brislin's back-translation approach was used, followed by committee review to address any discrepancies that emerged from translation. Results Our consolidated instrument was adapted to be culturally relevant and translated to yield more reliable and valid results for measuring infant feeding determinants effectively in the target cultural context of our larger research study. Conclusions Undertaking rigorous steps to ensure cross-cultural adaptation increases our confidence that the conclusions we draw from our self-report instrument(s) will be stronger. In this way, our aim of achieving strong cross-cultural adaptation of our consolidated instrument was met, while also providing a clear framework for other researchers choosing to utilize existing instruments in other cultural, geographic and population settings. PMID:25285151
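    The item-level content validity index (I-CVI) the abstract attributes to Lynn's recommendation has a simple arithmetic definition; a minimal sketch, assuming the common convention of a 4-point relevance scale where ratings of 3 or 4 count as "relevant" (the specific items and panel of the study are not reproduced here):

    ```python
    def item_cvi(ratings, relevant_threshold=3):
        """Item-level content validity index: the fraction of expert raters
        who judge the item relevant (>= threshold on a 4-point scale)."""
        relevant = sum(1 for r in ratings if r >= relevant_threshold)
        return relevant / len(ratings)

    def scale_cvi_ave(item_ratings):
        """Scale-level CVI (averaging method): the mean of the item-level CVIs."""
        cvis = [item_cvi(r) for r in item_ratings]
        return sum(cvis) / len(cvis)

    # Hypothetical example: six experts rate three items on a 1-4 scale.
    items = [
        [4, 3, 4, 4, 3, 4],  # all six judge the item relevant -> I-CVI = 1.0
        [4, 3, 2, 4, 3, 4],  # five of six -> I-CVI ~ 0.83
        [2, 1, 3, 4, 2, 3],  # three of six -> I-CVI = 0.5, a candidate for revision
    ]
    print([round(item_cvi(r), 2) for r in items])  # [1.0, 0.83, 0.5]
    ```

    Items with a low I-CVI are typically revised or dropped before translation and back-translation, which is consistent with the multi-step workflow the record describes.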

  13. Mechanisms of mononuclear phagocyte recruitment in Alzheimer's disease.

    PubMed

    Hickman, Suzanne E; El Khoury, Joseph

    2010-04-01

    Alzheimer's disease (AD) is associated with a significant neuroinflammatory component. Mononuclear phagocytes including monocytes and microglia are the principal cells involved, and they accumulate at perivascular sites of beta-amyloid (Abeta) deposition and in senile plaques. Recent evidence suggests that mononuclear phagocyte accumulation in the AD brain is dependent on chemokines. CCL2, a major monocyte chemokine, is upregulated in the AD brain. Interaction of CCL2 with its receptor CCR2 regulates mononuclear phagocyte accumulation in a mouse model of AD. CCR2 deficiency leads to lower mononuclear phagocyte accumulation and is associated with higher brain Abeta levels, specifically around blood vessels, suggesting that monocytes accumulate at sites of Abeta deposition in an initial attempt to clear these deposits and stop or delay their neurotoxic effects. Indeed, enhancing mononuclear phagocyte accumulation delays progression of AD. Here we review the mechanisms of mononuclear phagocyte accumulation in AD and discuss the potential roles of additional chemokines and their receptors in this process. We also propose a multi-step model for recruitment of mononuclear phagocytes into the brain. The first step involves egress of monocyte/microglial precursors from the bone marrow into the blood. The second step is crossing the blood-brain barrier to the perivascular areas and into the brain parenchyma. The final step includes movement of monocytes/microglia from areas of the brain that lack any amyloid deposition to senile plaques. Understanding the mechanism of recruitment of mononuclear phagocytes to the AD brain is necessary to further understand the role of these cells in the pathogenesis of AD and to identify any potential therapeutic use of these cells for the treatment of this disease.

  14. Quasi-multi-pulse voltage source converter design with two control degrees of freedom

    NASA Astrophysics Data System (ADS)

    Vural, A. M.; Bayindir, K. C.

    2015-05-01

    In this article, the design details of a quasi-multi-pulse voltage source converter (VSC) switched at a line frequency of 50 Hz are given in a step-by-step process. The proposed converter comprises four 12-pulse converter units and is suitable for the simulation of single- and multi-converter flexible alternating current transmission system devices as well as high-voltage direct-current systems operating at the transmission level. The magnetic interface of the converter is an original design, with all parameters given for 100 MVA operation. The so-called two-angle control method is adopted to control the voltage magnitude and the phase angle of the converter independently. PSCAD simulation results verify both four-quadrant converter operation and closed-loop control of the converter operated as a static synchronous compensator (STATCOM).
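    Why magnitude and angle give two independent control degrees of freedom can be seen from the standard lossless power-transfer relations for a VSC connected to an AC bus through a coupling reactance X; a minimal sketch of those textbook relations (not the paper's specific magnetic-interface design):

    ```python
    import math

    def vsc_power_flow(Vs, Vc, delta, X):
        """Active and reactive power delivered from an AC bus (voltage Vs, per-unit)
        to a lossless VSC (terminal voltage Vc, phase lag delta in radians) through
        reactance X. Standard relations: P = Vs*Vc*sin(delta)/X and
        Q = (Vs^2 - Vs*Vc*cos(delta))/X."""
        P = Vs * Vc * math.sin(delta) / X
        Q = (Vs**2 - Vs * Vc * math.cos(delta)) / X
        return P, Q

    # For small angles, the phase angle controls P almost exclusively,
    # while the magnitude difference controls Q (STATCOM-style operation).
    P1, Q1 = vsc_power_flow(1.0, 1.0, math.radians(5), 0.1)  # angle shift only
    P2, Q2 = vsc_power_flow(1.0, 1.05, 0.0, 0.1)             # magnitude shift only
    print(P1, Q1)  # large P, small Q
    print(P2, Q2)  # zero P, negative Q (converter injects reactive power)
    ```

    This decoupling is what the two-angle control method exploits: one control input sets the exchanged active power, the other the reactive power.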

  15. Data Processing And Machine Learning Methods For Multi-Modal Operator State Classification Systems

    NASA Technical Reports Server (NTRS)

    Hearn, Tristan A.

    2015-01-01

    This document is intended as an introduction to a set of common signal processing and machine learning methods that may be used in the software portion of a functional crew state monitoring system. This includes overviews of the theory of the methods involved as well as examples of implementation. Practical considerations are discussed for implementing modular, flexible, and scalable processing and classification software for a multi-modal, multi-channel monitoring system. Example source code is also given for all of the discussed processing and classification methods.

  16. Fabrication of magnetic bubble memory overlay

    NASA Technical Reports Server (NTRS)

    1973-01-01

    A self-contained magnetic bubble memory overlay is fabricated by a process that employs epitaxial deposition to form a multi-layered complex of magnetically active components on a single chip. Overlay fabrication comprises three metal deposition steps followed by a subtractive etch.

  17. Earth As An Unstructured Mesh and Its Recovery from Seismic Waveform Data

    NASA Astrophysics Data System (ADS)

    De Hoop, M. V.

    2015-12-01

    We consider multi-scale representations of Earth's interior from the point of view of their possible recovery from multi- and high-frequency seismic waveform data. These representations are intrinsically connected to (geologic, tectonic) structures, that is, geometric parametrizations of Earth's interior. Indeed, we address the construction and recovery of such parametrizations using local iterative methods with appropriately designed data misfits and guaranteed convergence. The geometric parametrizations contain interior boundaries (defining, for example, faults, salt bodies, tectonic blocks, slabs) which can, in principle, be obtained from successive segmentation. We make use of unstructured meshes. For the adaptation and recovery of an unstructured mesh we introduce an energy functional which is derived from the Hausdorff distance. Via an augmented Lagrangian method, we incorporate the mentioned data misfit. The recovery is constrained by shape optimization of the interior boundaries, and is reminiscent of Hausdorff warping. We use elastic deformation via finite elements as a regularization while following a two-step procedure. The first step is an update determined by the energy functional; in the second step, we modify the outcome of the first step where necessary to ensure that the new mesh is regular. This modification entails an array of techniques including topology correction involving interior boundary contacting and breakup, edge warping and edge removal. We implement this as a feed-back mechanism from volume to interior boundary mesh optimization. We invoke and apply a criterion of mesh quality control for coarsening, and for dynamical local multi-scale refinement. We present a novel (fluid-solid) numerical framework based on the Discontinuous Galerkin method.

  18. Continuous-Time Bilinear System Identification

    NASA Technical Reports Server (NTRS)

    Juang, Jer-Nan

    2003-01-01

    The objective of this paper is to describe a new method for identification of a continuous-time multi-input and multi-output bilinear system. The approach is to make judicious use of the linear-model properties of the bilinear system when subjected to a constant input. Two steps are required in the identification process. The first step is to use a set of pulse responses resulting from a constant input of one sample period to identify the state matrix, the output matrix, and the direct transmission matrix. The second step is to use another set of pulse responses with the same constant input over multiple sample periods to identify the input matrix and the coefficient matrices associated with the coupling terms between the state and the inputs. Numerical examples are given to illustrate the concept and the computational algorithm for the identification method.
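    The abstract's first step, identifying the state, output, and direct-transmission matrices from pulse responses, belongs to the same family of pulse-response realization methods as the Eigensystem Realization Algorithm (ERA). As a hedged illustration, here is a minimal discrete-time, single-input/single-output ERA sketch; the continuous-time bilinear coupling-matrix step of the paper is not reproduced:

    ```python
    import numpy as np

    def era(h, n):
        """Eigensystem Realization Algorithm (SISO sketch).
        h: pulse response with h[0] = D and h[k] = C A^(k-1) B for k >= 1.
        n: assumed model order. Returns (A, B, C, D)."""
        N = (len(h) - 1) // 2
        # Hankel matrices of Markov parameters and their one-step shift.
        H0 = np.array([[h[i + j + 1] for j in range(N)] for i in range(N)])
        H1 = np.array([[h[i + j + 2] for j in range(N)] for i in range(N)])
        U, s, Vt = np.linalg.svd(H0)
        U, s, Vt = U[:, :n], s[:n], Vt[:n, :]      # truncate to order n
        Sr, Sri = np.diag(s**0.5), np.diag(s**-0.5)
        A = Sri @ U.T @ H1 @ Vt.T @ Sri
        B = (Sr @ Vt)[:, :1]
        C = (U @ Sr)[:1, :]
        return A, B, C, h[0]

    # Verify on a hypothetical 2-state system: reconstruct its pulse response.
    A0 = np.array([[0.9, 0.1], [0.0, 0.5]])
    B0 = np.array([[1.0], [1.0]])
    C0 = np.array([[1.0, 0.0]])
    h = [0.0] + [(C0 @ np.linalg.matrix_power(A0, k) @ B0).item() for k in range(11)]
    A, B, C, D = era(h, 2)
    h_id = [D] + [(C @ np.linalg.matrix_power(A, k) @ B).item() for k in range(11)]
    print(np.allclose(h, h_id))  # True
    ```

    The realization is only determined up to a similarity transform, so the identified (A, B, C) differ from (A0, B0, C0) while reproducing the same input-output behavior, which is all the second identification step of the paper requires.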

  19. Insights into the mechanism of X-ray-induced disulfide-bond cleavage in lysozyme crystals based on EPR, optical absorption and X-ray diffraction studies

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sutton, Kristin A.; Black, Paul J.; Mercer, Kermit R.

    2013-12-01

    Electron paramagnetic resonance (EPR) and online UV–visible absorption microspectrophotometry with X-ray crystallography have been used in a complementary manner to follow X-ray-induced disulfide-bond cleavage, to confirm a multi-track radiation-damage process and to develop a model of that process. Online UV–visible spectroscopy showed that upon X-irradiation, disulfide radicalization appeared to saturate at an absorbed dose of approximately 0.5–0.8 MGy, in contrast to the saturating dose of ∼0.2 MGy observed using EPR at much lower dose rates. The observations suggest that a multi-track model involving product formation owing to the interaction of two separate tracks is a valid model for radiation damage in protein crystals. The saturation levels are remarkably consistent given the widely different experimental parameters and the range of total absorbed doses studied. The results indicate that even at the lowest doses used for structural investigations disulfide bonds are already radicalized. Multi-track considerations offer the first step in a comprehensive model of radiation damage that could potentially lead to a combined computational and experimental approach to identifying when damage is likely to be present, to quantitate it and to provide the ability to recover the native unperturbed structure.

  20. Cleave and couple: toward fully sustainable catalytic conversion of lignocellulose to value added building blocks and fuels.

    PubMed

    Sun, Zhuohua; Barta, Katalin

    2018-06-21

    The structural complexity of lignocellulose offers unique opportunities for the development of entirely new, energy-efficient and waste-free pathways to valuable bio-based building blocks. Such sustainable catalytic methods, specifically tailored to the efficient conversion of abundant renewable starting materials, are necessary to compete successfully, in the future, with fossil-based multi-step processes. In this contribution we give a summary of recent developments in this field and describe our "cleave and couple" strategy, where "cleave" refers to the catalytic deconstruction of lignocellulose to aromatic and aliphatic alcohol intermediates, and "couple" involves the development of novel, sustainable transformations for the formation of C-C and C-N bonds in order to obtain a range of attractive products from lignocellulose.

  1. New concept: cellular senescence in pathophysiology of cholangiocarcinoma.

    PubMed

    Sasaki, Motoko; Nakanuma, Yasuni

    2016-01-01

    Cholangiocarcinoma, a malignant tumor arising in the hepatobiliary system, presents with poor prognosis because of difficulty in its early detection/diagnosis. Recent progress revealed that cellular senescence may be involved in the pathophysiology of cholangiocarcinoma. Cellular senescence is defined as permanent growth arrest caused by several cellular injuries, such as oncogenic mutations and oxidative stress. "Oncogene-induced" and/or stress-induced senescence may occur in the process of multi-step cholangiocarcinogenesis, and overexpression of a polycomb group protein EZH2 may play a role in the escape from, and/or bypassing of, senescence. Furthermore, senescent cells may play important roles in tumor development and progression via the production of senescence-associated secretory phenotypes. Cellular senescence may be a new target for the prevention, early diagnosis, and therapy of cholangiocarcinoma in the near future.

  2. A computational kinetic model of diffusion for molecular systems.

    PubMed

    Teo, Ivan; Schulten, Klaus

    2013-09-28

    Regulation of biomolecular transport in cells involves intra-protein steps like gating and passage through channels, but these steps are preceded by extra-protein steps, namely, diffusive approach and admittance of solutes. The extra-protein steps develop over a 10-100 nm length scale, typically in a highly particular environment characterized by the protein's geometry, surrounding electrostatic field, and location. In order to account for solute energetics and mobility in this environment at a relevant resolution, we propose a particle-based kinetic model of diffusion based on a Markov State Model framework. The prerequisite input data consist of diffusion coefficient and potential-of-mean-force maps generated from extensive molecular dynamics simulations of proteins and their environment that sample multi-nanosecond durations. The suggested diffusion model can describe transport processes beyond microsecond duration, relevant for biological function and beyond the realm of molecular dynamics simulation. For this purpose the systems are represented by a discrete set of states specified by the positions, volumes, and surface elements of Voronoi grid cells distributed according to a density function resolving the often intricate relevant diffusion space. Validation tests carried out for generic diffusion spaces show that the model and the associated Brownian motion algorithm are viable over a large range of parameter values such as time step, diffusion coefficient, and grid density. A concrete application of the method is demonstrated for ion diffusion around and through the Escherichia coli mechanosensitive channel of small conductance ecMscS.
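    The Brownian motion that such a kinetic model coarse-grains into Voronoi-cell states can be sketched in a few lines; a minimal free-diffusion example with isotropic, constant D (the position-dependent diffusion maps, potential of mean force, and grid discretization of the paper are not reproduced):

    ```python
    import numpy as np

    def brownian_steps(n_particles, n_steps, D, dt, rng):
        """Free 3-D Brownian dynamics: each coordinate receives a Gaussian
        displacement with variance 2*D*dt per time step (Einstein relation)."""
        x = np.zeros((n_particles, 3))
        sigma = np.sqrt(2.0 * D * dt)
        for _ in range(n_steps):
            x += rng.normal(0.0, sigma, size=x.shape)
        return x

    # Illustrative parameters (e.g. D in nm^2/us, dt in us) chosen for the sketch.
    rng = np.random.default_rng(0)
    D, dt, n_steps = 100.0, 1e-3, 200
    x = brownian_steps(20000, n_steps, D, dt, rng)
    msd = np.mean(np.sum(x**2, axis=1))
    print(msd, 6 * D * dt * n_steps)  # mean squared displacement ~ 6*D*t in 3-D
    ```

    A validation test of the kind the abstract mentions amounts to checking that the ensemble mean squared displacement recovers 6*D*t regardless of the chosen time step and grid density.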

  3. Effect of production management on semen quality during long-term storage in different European boar studs.

    PubMed

    Schulze, M; Kuster, C; Schäfer, J; Jung, M; Grossfeld, R

    2018-03-01

    The processing of ejaculates is a fundamental step for the fertilizing capacity of boar spermatozoa. The aim of the present study was to identify factors that affect the quality of boar semen doses. The production process during one day of semen processing in 26 European boar studs was monitored. In each boar stud, nine to 19 randomly selected ejaculates from 372 Pietrain boars were analyzed for sperm motility, acrosome and plasma membrane integrity, mitochondrial activity and thermo-resistance (TRT). Each ejaculate was monitored for production time and temperature at each step of semen processing using the specially programmed software SEQU (version 1.7, Minitüb, Tiefenbach, Germany). The dilution of ejaculates with a short-term extender was completed in one step in 10 AI centers (n = 135 ejaculates), in two steps in 11 AI centers (n = 158 ejaculates) and in three steps in five AI centers (n = 79 ejaculates). Results indicated greater semen quality with one-step isothermal dilution than with multi-step dilution of AI semen doses (total motility TRT d7: 71.1 ± 19.2%, 64.6 ± 20.0%, 47.1 ± 27.1% for one-step, two-step and three-step dilution, respectively; P < .05). There was a marked advantage of one-step isothermal dilution with regard to time management, preservation suitability, stability and stress resistance. One-step dilution resulted in significantly lower holding times of raw ejaculates and reduced the risk of mistakes owing to the lower number of processing steps. These results lead to refined recommendations for boar semen processing. Copyright © 2018 Elsevier B.V. All rights reserved.

  4. Analyzing Idioms and Their Frequency in Three Advanced ILI Textbooks: A Corpus-Based Study

    ERIC Educational Resources Information Center

    Alavi, Sepideh; Rajabpoor, Aboozar

    2015-01-01

    The present study aimed at identifying and quantifying the idioms used in three ILI "Advanced" level textbooks based on three different English corpora: MICASE, BNC and the Brown Corpus, and comparing the frequencies of the idioms across the three corpora. The first step of the study involved searching the books to find multi-word…

  5. 78 FR 15707 - Fisheries of the Atlantic and Gulf of Mexico; Southeast Data, Assessment, and Review (SEDAR...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-03-12

    ... Southeast Data, Assessment and Review (SEDAR) process, a multi-step method for determining the status of... Center. Participants include: data collectors and database managers; stock assessment scientists...

  6. Cold-in-place recycling in New York State.

    DOT National Transportation Integrated Search

    2010-07-01

    Cold in-place recycling (CIPR) is a continuous multi-step process in which the existing asphalt pavement is recycled using specialized equipment that cold mills the asphaltic pavement and blends asphalt emulsion and aggregate (if necessary) with ...

  7. Multi-Family Pediatric Pain Group Therapy: Capturing Acceptance and Cultivating Change.

    PubMed

    Huestis, Samantha E; Kao, Grace; Dunn, Ashley; Hilliard, Austin T; Yoon, Isabel A; Golianu, Brenda; Bhandari, Rashmi P

    2017-12-07

    Behavioral health interventions for pediatric chronic pain include cognitive-behavioral (CBT), acceptance and commitment (ACT), and family-based therapies, though literature regarding multi-family therapy (MFT) is sparse. This investigation examined the utility and outcomes of the Courage to Act with Pain: Teens Identifying Values, Acceptance, and Treatment Effects (CAPTIVATE) program, which included all three modalities (CBT, ACT, MFT) for youth with chronic pain and their parents. Program utility, engagement, and satisfaction were evaluated via quantitative and qualitative feedback. Pain-specific psychological, behavioral, and interpersonal processes were examined along with outcomes related to disability, quality of life, pain interference, fatigue, anxiety, and depressive symptoms. Participants indicated that CAPTIVATE was constructive, engaging, and helpful for social and family systems. Clinical and statistical improvements with large effect sizes were captured for pain catastrophizing, acceptance, and protective parenting but not family functioning. Similar effects were found for functional disability, pain interference, fatigue, anxiety, and depression. Given the importance of targeting multiple systems in the management of pediatric chronic pain, preliminary findings suggest a potential new group-based treatment option for youth and families. Next steps involve evaluating the differential effect of the program over treatment as usual, as well as specific CBT, ACT, and MFT components and processes that may affect outcomes.

  8. Identification of Intermediate in Hepatitis B Virus CCC DNA Formation and Sensitive and Selective CCC DNA Detection.

    PubMed

    Luo, Jun; Cui, Xiuji; Gao, Lu; Hu, Jianming

    2017-06-21

    The hepatitis B virus (HBV) covalently closed circular (CCC) DNA functions as the only viral template capable of coding for all the viral RNA species and is thus essential to initiate and sustain viral replication. CCC DNA is converted, in a multi-step and ill-understood process, from a relaxed circular (RC) DNA, in which neither of the two DNA strands is covalently closed. To detect putative intermediates during RC to CCC DNA conversion, two 3' exonucleases, Exo I and Exo III, were used in combination to degrade all DNA strands with a free 3' end, which would nevertheless preserve closed circular DNA, either single-stranded (SS) or double-stranded (DS). Indeed, an RC DNA species with a covalently closed minus strand but an open plus strand (closed minus-strand RC DNA or cM-RC DNA) was detected by this approach. Further analyses indicated that at least some of the plus strands in such a putative intermediate likely still retained the RNA primer that is attached to the 5' end of the plus strand in RC DNA, suggesting that minus strand closing can occur before plus strand processing. Furthermore, the same nuclease treatment proved to be useful for sensitive and specific detection of CCC DNA by removing all DNA species other than closed circular DNA. Application of these and similar approaches may allow the identification of additional intermediates during CCC DNA formation and facilitate specific and sensitive detection of CCC DNA, which should help elucidate the pathways of CCC DNA formation and the factors involved. IMPORTANCE The hepatitis B virus (HBV) covalently closed circular (CCC) DNA is the molecular basis of viral persistence, by serving as the viral transcriptional template. CCC DNA is converted, in a multi-step and ill-understood process, from a relaxed circular (RC) DNA. Little is currently understood about the pathways or factors involved in CCC DNA formation. 
We have now detected a likely intermediate during the conversion of RC to CCC DNA, thus providing important clues to the pathways of CCC DNA formation. Furthermore, the same experimental approach that led to the detection of the intermediate could also facilitate specific and sensitive detection of CCC DNA, which has remained challenging. This and similar approaches will help identify additional intermediates during CCC DNA formation and elucidate the pathways and factors involved. Copyright © 2017 American Society for Microbiology.

  9. Business Modeling to Implement an eHealth Portal for Infection Control: A Reflection on Co-Creation With Stakeholders

    PubMed Central

    Wentzel, Jobke; Sanderman, Robbert; van Gemert-Pijnen, Lisette

    2015-01-01

    Background It is acknowledged that the success and uptake of eHealth improve with the involvement of users and stakeholders, so that the technology reflects their needs. Involving stakeholders in implementation research is thus a crucial element in developing eHealth technology. Business modeling is an approach to guide implementation research for eHealth. Stakeholders are involved in business modeling by identifying relevant stakeholders, conducting value co-creation dialogs, and co-creating a business model. Because implementation activities are often underestimated as a crucial step in developing eHealth, comprehensive and applicable approaches geared toward business modeling in eHealth are scarce. Objective This paper demonstrates the potential of several stakeholder-oriented analysis methods and their practical application, using Infectionmanager as an example case. We aim to demonstrate how business modeling, with a focus on stakeholder involvement, is used to co-create an eHealth implementation. Methods We divided business modeling into 4 main research steps. For stakeholder identification, we performed literature scans, expert recommendations, and snowball sampling (Step 1). For stakeholder analysis, we performed "basic stakeholder analysis," stakeholder salience, and ranking/analytic hierarchy process (Step 2). For value co-creation dialogs, we performed a process analysis and stakeholder interviews based on the business model canvas (Step 3). Finally, for business model generation, we combined all findings into the business model canvas (Step 4). Results Based on the applied methods, we synthesized a step-by-step guide for business modeling with stakeholder-oriented analysis methods that we consider suitable for implementing eHealth. 
    Conclusions The step-by-step guide for business modeling with stakeholder involvement enables eHealth researchers to apply a systematic, multidisciplinary, co-creative approach to implementing eHealth. Business modeling becomes an active part of the entire development process of eHealth and establishes an early focus on implementation, in which stakeholders help to co-create the basis necessary for satisfactory success and uptake of the eHealth technology. PMID:26272510
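    The ranking/analytic hierarchy process (AHP) named in Step 2 can be sketched as follows. The pairwise-comparison values are hypothetical, and the principal-eigenvector method shown is the standard AHP prioritization, not necessarily the exact variant the authors applied.

    ```python
    import numpy as np

    def ahp_priorities(pairwise):
        """Derive priority weights from an AHP pairwise-comparison matrix
        via its principal eigenvector (computed by power iteration), and
        return the consistency index of the judgments."""
        A = np.asarray(pairwise, dtype=float)
        n = A.shape[0]
        w = np.ones(n)
        for _ in range(100):                 # power iteration
            w = A @ w
            w /= w.sum()
        lam = (A @ w / w).mean()             # principal eigenvalue estimate
        ci = (lam - n) / (n - 1)             # consistency index
        return w, ci

    # Hypothetical comparison of 3 stakeholder criteria on Saaty's 1-9 scale:
    # criterion 1 is 3x as important as 2 and 5x as important as 3, etc.
    A = [[1,   3,   5],
         [1/3, 1,   2],
         [1/5, 1/2, 1]]
    weights, ci = ahp_priorities(A)
    print(np.round(weights, 3), round(ci, 4))
    ```

    A consistency index near zero indicates the pairwise judgments are nearly transitive; in practice a consistency ratio above about 0.1 signals that stakeholders should revisit their comparisons.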

  10. 75 FR 59226 - Fisheries of the South Atlantic, Gulf of Mexico, and Caribbean; Southeastern Data, Assessment...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-09-27

    ... the Southeast Data, Assessment and Review (SEDAR) process, a multi-step method for determining the... the South Atlantic, Gulf of Mexico, and Caribbean; Southeastern Data, Assessment, and Review (SEDAR... Committee will meet to discuss the SEDAR assessment schedule, budget, and the SEDAR process. See...

  11. Reversing the conventional leather processing sequence for cleaner leather production.

    PubMed

    Saravanabhavan, Subramani; Thanikaivelan, Palanisamy; Rao, Jonnalagadda Raghava; Nair, Balachandran Unni; Ramasami, Thirumalachari

    2006-02-01

    Conventional leather processing generally involves a combination of single-step and multi-step processes that employ, as well as expel, various biological, inorganic, and organic materials. It involves some 14-15 steps and discharges a huge amount of pollutants, primarily because conventional leather processing follows a "do-undo" process logic. In this study, the conventional leather processing steps have been reversed to overcome the problems associated with the conventional method. The charges of the skin matrix and of the chemicals, and the pH profiles of the process, have been judiciously used for reversing the process steps. The reversed process avoids several acidification and basification/neutralization steps used in conventional leather processing. The developed process has been validated through analyses such as chromium content, shrinkage temperature, softness measurements, scanning electron microscopy, and physical testing of the leathers. Further, the performance of the leathers is shown to be on par with conventionally processed leathers through bulk property evaluation. The process achieves significant reductions in COD and TS of 53% and 79%, respectively. Water consumption and discharge are reduced by 65% and 64%, respectively. The process also benefits from significant reductions in chemicals, time, power, and cost compared to the conventional process.

  12. Quality measurement and benchmarking of HPV vaccination services: a new approach.

    PubMed

    Maurici, Massimo; Paulon, Luca; Campolongo, Alessandra; Meleleo, Cristina; Carlino, Cristiana; Giordani, Alessandro; Perrelli, Fabrizio; Sgricia, Stefano; Ferrante, Maurizio; Franco, Elisabetta

    2014-01-01

    A new measurement process based upon a well-defined mathematical model was applied to evaluate the quality of human papillomavirus (HPV) vaccination centers in 3 of 12 Local Health Units (ASLs) within the Lazio Region of Italy. The quality aspects considered for evaluation were communicational efficiency, organizational efficiency and comfort. The overall maximum achievable value was 86.10%, while the HPV vaccination quality scores for ASL1, ASL2 and ASL3 were 73.07%, 71.08%, and 67.21%, respectively. With this new approach it is possible to represent the probabilistic reasoning of a stakeholder who evaluates the quality of a healthcare provider. All ASLs had margins for improvements and optimal quality results can be assessed in terms of better performance conditions, confirming the relationship between the resulting quality scores and HPV vaccination coverage. The measurement process was structured into three steps and involved four stakeholder categories: doctors, nurses, parents and vaccinated women. In Step 1, questionnaires were administered to collect different stakeholders' points of view (i.e., subjective data) that were elaborated to obtain the best and worst performance conditions when delivering a healthcare service. Step 2 of the process involved the gathering of performance data during the service delivery (i.e., objective data collection). Step 3 of the process involved the elaboration of all data: subjective data from step 1 are used to define a "standard" to test objective data from step 2. This entire process led to the creation of a set of scorecards. Benchmarking is presented as a result of the probabilistic meaning of the evaluated scores.
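    The three-step logic, in which subjective best/worst performance conditions (Step 1) define a standard against which objective measurements (Step 2) are scored (Step 3), can be sketched with a simple min-max scoring function. The indicator names, weights and values below are invented for illustration; the paper's actual mathematical model is more elaborate.

    ```python
    def quality_score(observed, worst, best, weights):
        """Score observed indicators against stakeholder-derived worst/best
        performance conditions (min-max normalisation), then combine them
        with importance weights into a single percentage score."""
        total = 0.0
        for key, w in weights.items():
            lo, hi = worst[key], best[key]
            norm = (observed[key] - lo) / (hi - lo)
            norm = min(max(norm, 0.0), 1.0)   # clamp outside the standard range
            total += w * norm
        return 100.0 * total / sum(weights.values())

    # Hypothetical indicators for one vaccination centre
    worst    = {"wait_min": 60, "info_items": 0,  "comfort": 1}
    best     = {"wait_min": 5,  "info_items": 10, "comfort": 5}
    observed = {"wait_min": 20, "info_items": 7,  "comfort": 4}
    weights  = {"wait_min": 2,  "info_items": 1,  "comfort": 1}
    print(round(quality_score(observed, worst, best, weights), 2))  # 72.61
    ```

    Because "best" can be a smaller number than "worst" (as for waiting time), the normalisation handles both orientations automatically through the sign of hi - lo.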

  13. Process of Converting Military Training Materials to Competency-Based Modules for Civilian Use. A Documentation.

    ERIC Educational Resources Information Center

    Organization and Human Resources Development Associates, Inc., Austin, TX.

    This document outlines the steps in the process of converting military training materials in physician and dental assistant education to competency-based learning modules for use in the civilian sector. Subsections discuss the activity and any problems or issues involved for 14 steps. The 14 steps are as follow: establish liaison to obtain…

  14. Structure determination of an 11-subunit exosome in complex with RNA by molecular replacement

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Makino, Debora Lika, E-mail: dmakino@biochem.mpg.de; Conti, Elena

    The crystallographic steps towards the structure determination of a complete eukaryotic exosome complex bound to RNA are presented. Phasing of this 11-protein-subunit complex was carried out via molecular replacement. The RNA exosome is an evolutionarily conserved multi-protein complex involved in the 3′ degradation of a variety of RNA transcripts. In the nucleus, the exosome participates in the maturation of structured RNAs, in the surveillance of pre-mRNAs and in the decay of a variety of noncoding transcripts. In the cytoplasm, the exosome degrades mRNAs in constitutive and regulated turnover pathways. Several structures of subcomplexes of eukaryotic exosomes or related prokaryotic exosome-like complexes are known, but how the complete assembly is organized to fulfil processive RNA degradation has been unclear. An atomic snapshot of a Saccharomyces cerevisiae 420 kDa exosome complex bound to an RNA substrate in the pre-cleavage state of a hydrolytic reaction has been determined. Here, the crystallographic steps towards the structural elucidation, which was carried out by molecular replacement, are presented.

  15. Laser resonance ionization spectroscopy on lutetium for the MEDICIS project

    NASA Astrophysics Data System (ADS)

    Gadelshin, V.; Cocolios, T.; Fedoseev, V.; Heinke, R.; Kieck, T.; Marsh, B.; Naubereit, P.; Rothe, S.; Stora, T.; Studer, D.; Van Duppen, P.; Wendt, K.

    2017-11-01

    The MEDICIS-PROMED Innovative Training Network under the Horizon 2020 EU program aims to establish a network of early-stage researchers, involving scientific exchange and active cooperation between leading European research institutions, universities, hospitals, and industry. Its primary scientific goal is to provide and test novel radioisotopes for nuclear medical imaging and radionuclide therapy. Within a closely linked project at CERN, a dedicated electromagnetic mass separator system is presently under installation for the production of innovative radiopharmaceutical isotopes at the new CERN-MEDICIS laboratory, directly adjacent to the existing CERN-ISOLDE radioactive ion beam facility. It is planned to implement a resonance ionization laser ion source (RILIS) to ensure high efficiency and unrivaled purity in the production of radioactive ions. To provide a highly efficient ionization process, identification and characterization of a specific multi-step laser ionization scheme is required for each individual element with isotopes of interest. The element lutetium is of primary relevance and was therefore considered as the first candidate. Three two-step excitation schemes for lutetium atoms are presented in this work, and the spectroscopic results are compared with data from other authors.

  16. Independent component analysis (ICA) and self-organizing map (SOM) approach to multidetection system for network intruders

    NASA Astrophysics Data System (ADS)

    Abdi, Abdi M.; Szu, Harold H.

    2003-04-01

    With the growing rate of interconnection among computer systems, network security is becoming a real challenge. An Intrusion Detection System (IDS) is designed to protect the availability, confidentiality and integrity of critical network information systems. Today's approach to network intrusion detection involves the use of rule-based expert systems to identify indications of known attacks or anomalies. However, these techniques are less successful in identifying today's attacks. Hackers are perpetually inventing new and previously unanticipated techniques to compromise information infrastructure. This paper proposes a dynamic way of detecting network intruders in time series data. The proposed approach consists of a two-step process. First, an efficient multi-user detection method is obtained by employing the recently introduced complexity-minimization approach as a generalization of standard ICA. Second, an unsupervised-learning neural network architecture based on Kohonen's Self-Organizing Map is identified for potential functional clustering. These two steps, working together adaptively, provide a pseudo-real-time novelty-detection attribute to supplement the current intrusion detection statistical methodology.
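    The second stage, functional clustering with Kohonen's Self-Organizing Map, can be sketched in plain NumPy. The ICA front end and any real traffic features are omitted; the data below are synthetic stand-ins, and all parameter values are illustrative assumptions.

    ```python
    import numpy as np

    def train_som(data, grid=(8, 8), epochs=20, lr0=0.5, sigma0=3.0, seed=0):
        """Minimal Kohonen Self-Organizing Map: for each sample, find the
        best-matching unit (BMU) and pull the BMU and its grid neighbours
        toward the sample, with decaying learning rate and neighbourhood."""
        rng = np.random.default_rng(seed)
        h, w = grid
        weights = rng.random((h, w, data.shape[1]))
        # grid coordinates of every unit, for neighbourhood distances
        coords = np.stack(np.meshgrid(np.arange(h), np.arange(w),
                                      indexing="ij"), axis=-1)
        for epoch in range(epochs):
            lr = lr0 * np.exp(-epoch / epochs)
            sigma = sigma0 * np.exp(-epoch / epochs)
            for x in data:
                dists = np.linalg.norm(weights - x, axis=-1)
                bmu = np.unravel_index(dists.argmin(), dists.shape)
                grid_d2 = ((coords - np.array(bmu)) ** 2).sum(axis=-1)
                influence = np.exp(-grid_d2 / (2 * sigma ** 2))
                weights += lr * influence[..., None] * (x - weights)
        return weights

    # Two synthetic traffic clusters ("normal" vs "anomalous" feature vectors)
    rng = np.random.default_rng(1)
    normal = rng.normal(0.2, 0.05, (100, 4))
    attack = rng.normal(0.8, 0.05, (20, 4))
    som = train_som(np.vstack([normal, attack]))
    ```

    After training, samples whose best-matching units fall in the sparsely populated region of the map are candidates for novelty, which is the clustering role the abstract assigns to the SOM.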

  17. Analysis, design, fabrication, and performance of three-dimensional braided composites

    NASA Astrophysics Data System (ADS)

    Kostar, Timothy D.

    1998-11-01

    Cartesian 3-D (track and column) braiding as a method of composite preforming has been investigated. A complete analysis of the process was conducted to understand the limitations and potentials of the process. Knowledge of the process was enhanced through development of a computer simulation, and it was discovered that individual control of each track and column and multiple-step braid cycles greatly increases possible braid architectures. Derived geometric constraints coupled with the fundamental principles of Cartesian braiding resulted in an algorithm to optimize preform geometry in relation to processing parameters. The design of complex and unusual 3-D braids was investigated in three parts: grouping of yarns to form hybrid composites via an iterative simulation; design of composite cross-sectional shape through implementation of the Universal Method; and a computer algorithm developed to determine the braid plan based on specified cross-sectional shape. Several 3-D braids, which are the result of variations or extensions to Cartesian braiding, are presented. An automated four-step braiding machine with axial yarn insertion has been constructed and used to fabricate two-step, double two-step, four-step, and four-step with axial and transverse yarn insertion braids. A working prototype of a multi-step braiding machine was used to fabricate four-step braids with surrogate material insertion, unique hybrid structures from multiple track and column displacement and multi-step cycles, and complex-shaped structures with constant or varying cross-sections. Braid materials include colored polyester yarn to study the yarn grouping phenomena, Kevlar, glass, and graphite for structural reinforcement, and polystyrene, silicone rubber, and fasteners for surrogate material insertion. A verification study for predicted yarn orientation and volume fraction was conducted, and a topological model of 3-D braids was developed. 
The solid model utilizes architectural parameters, generated from the process simulation, to determine the composite elastic properties. Methods of preform consolidation are investigated and the results documented. The extent of yarn deformation (packing) resulting from preform consolidation was investigated through cross-sectional micrographs. The fiber volume fraction of select hybrid composites was measured and representative unit cells are suggested. Finally, a comparison study of the elastic performance of Kevlar/epoxy and carbon/Kevlar hybrid composites was conducted.

  18. Underground structure pattern and multi AO reaction with step feed concept for upgrading a large wastewater treatment plant

    NASA Astrophysics Data System (ADS)

    Peng, Yi; Zhang, Jie; Li, Dong

    2018-03-01

    A large wastewater treatment plant (WWTP) using a US treatment technology could no longer meet the new demands of the urban environment and the need for reclaimed water in China. Thus a multi AO reaction process (Anaerobic/oxic/anoxic/oxic/anoxic/oxic) WWTP with an underground structure was proposed to carry out the upgrade project. Four main new technologies were applied: (1) multi AO reaction with step feed technology; (2) deodorization; (3) new energy-saving technology such as a water source heat pump and an optical fiber lighting system; (4) dependable measures to support the old WWTP's water quality during the new WWTP's construction. After construction, the upgraded WWTP had saved two thirds of the land occupation, increased treatment capacity by 80% and improved the effluent standard more than twofold. Moreover, it had become a benchmark for turning an ecological negative capital into a positive capital.

  19. Development of the Fray-Farthing-Chen Cambridge Process: Towards the Sustainable Production of Titanium and Its Alloys

    NASA Astrophysics Data System (ADS)

    Hu, Di; Dolganov, Aleksei; Ma, Mingchan; Bhattacharya, Biyash; Bishop, Matthew T.; Chen, George Z.

    2018-02-01

    The Kroll process has been employed for titanium extraction since the 1950s. It is a labour- and energy-intensive multi-step semi-batch process. The post-extraction processes for making the raw titanium into alloys and products are also excessive, including multiple remelting steps. Invented in the late 1990s, the Fray-Farthing-Chen (FFC) Cambridge process extracts titanium from solid oxides at lower energy consumption via electrochemical reduction in molten salts. Its ability to produce alloys and powders while retaining the cathode shape also promises energy- and material-efficient manufacturing. Focusing on titanium and its alloys, this article reviews the recent development of the FFC-Cambridge process in two aspects: (1) resource and process sustainability and (2) advanced post-extraction processing.

  20. Towards a conceptual multi-agent-based framework to simulate the spatial group decision-making process

    NASA Astrophysics Data System (ADS)

    Ghavami, Seyed Morsal; Taleai, Mohammad

    2017-04-01

    Most spatial problems are multi-actor, multi-issue and multi-phase in nature. In addition to their intrinsic complexity, spatial problems usually involve groups of actors from different organizational and cognitive backgrounds, all of whom participate in a social structure to resolve the problem or reduce its complexity. Hence, it is important to study and evaluate which different aspects influence the spatial problem resolution process. Recently, multi-agent systems, consisting of groups of separate agent entities interacting with each other, have been put forward as appropriate tools for studying and resolving such problems. In this study, in order to generate a better understanding of the spatial group decision-making process, a conceptual multi-agent-based framework is used that represents and specifies all the concepts and entities needed to aid group decision making, based on a simulation of the group decision-making process as well as the relationships that exist among the different concepts involved. The study uses five main influencing entities as concepts in the simulation process: spatial influence, individual-level influence, group-level influence, negotiation influence and group performance measures. Further, it explains the relationships among the different concepts in a descriptive rather than explanatory manner. To illustrate the proposed framework, the approval process for an urban land use master plan in Zanjan, a provincial capital in Iran, is simulated using MAS, with the results highlighting the effectiveness of applying an MAS-based framework to study the group decision-making process used to resolve spatial problems.

  1. a Global Registration Algorithm of the Single-Closed Ring Multi-Stations Point Cloud

    NASA Astrophysics Data System (ADS)

    Yang, R.; Pan, L.; Xiang, Z.; Zeng, H.

    2018-04-01

    To address the global registration problem of a single closed-ring, multi-station point cloud, a formula for calculating the error of the rotation matrix was constructed according to the definition of error. A global registration algorithm for the multi-station point cloud was then derived to minimize this rotation-matrix error, and fast-computing formulas for the transformation matrix were given together with their implementation steps and a simulation experiment scheme. Comparing three different processing schemes for the multi-station point cloud, the experimental results verified the effectiveness of the new global registration method, showing that it can effectively complete the global registration of the point cloud.
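    For context, the pairwise building block underlying any such multi-station registration, the least-squares rigid transform between two corresponding point sets, can be sketched with the standard Kabsch/SVD method. This is a generic illustration, not the authors' closed-ring global algorithm.

    ```python
    import numpy as np

    def kabsch(P, Q):
        """Least-squares rigid transform (R, t) mapping point set P onto Q
        via the Kabsch/SVD method: minimizes sum ||R p_i + t - q_i||^2."""
        cP, cQ = P.mean(axis=0), Q.mean(axis=0)
        H = (P - cP).T @ (Q - cQ)                # cross-covariance matrix
        U, _, Vt = np.linalg.svd(H)
        d = np.sign(np.linalg.det(Vt.T @ U.T))   # guard against reflections
        R = Vt.T @ np.diag([1, 1, d]) @ U.T
        t = cQ - R @ cP
        return R, t

    # Synthetic check: recover a known rotation about z and a translation
    rng = np.random.default_rng(0)
    P = rng.random((50, 3))
    theta = 0.3
    Rz = np.array([[np.cos(theta), -np.sin(theta), 0],
                   [np.sin(theta),  np.cos(theta), 0],
                   [0, 0, 1]])
    Q = P @ Rz.T + np.array([1.0, -2.0, 0.5])
    R, t = kabsch(P, Q)
    err = np.abs(R - Rz).max()   # should be near machine precision
    ```

    A global (closed-ring) method like the abstract's goes further by distributing the accumulated error of the chained pairwise transforms around the loop, rather than registering each station pair independently.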

  2. Multi-actor involvement for integrating ecosystem services in strategic environmental assessment of spatial plans

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rozas-Vásquez, Daniel, E-mail: danielrozas@gmail.com; Laboratorio de Planificación Territorial, Universidad Católica de Temuco, Rudecindo ortega, 02950 Temuco; Fürst, Christine

    Integrating an ecosystem services (ES) approach into Strategic Environmental Assessment (SEA) of spatial plans potentially enhances the consideration of the value of nature in decision making and policy processes. However, there is increasing concern about the institutional context and a lack of a common understanding of SEA and ecosystem services for adopting them as an integrated framework. This paper addresses this concern by analysing the current understanding and network relations in a multi-actor arrangement as a first step towards a successful integration of ES in SEA and spatial planning. Our analysis focuses on a case study in Chile, where we administered a questionnaire survey to some of the main actors involved in the spatial planning process. The questionnaire focused on issues such as network relations among actors and on conceptual understanding, perceptions and challenges for integrating ES in SEA and spatial planning, knowledge on methodological approaches, and the connections and gaps in the science-policy interface. Our findings suggest that a common understanding of SEA and especially of ES in a context of multiple actors is still at an initial stage in Chile. Additionally, the lack of institutional guidelines and methodological support is considered the main challenge for integration. We conclude that preconditions exist in Chile for integrating ES in SEA for spatial planning, but they strongly depend on appropriate governance schemes that promote a close science-policy interaction, as well as collaborative work and learning. - Highlights: • Linking ecosystem services in SEA is an effective framework for sustainability. • Multi-actor understanding and networks in ecosystem services and SEA were analyzed. • Understanding of SEA and especially of ES is still in an initial stage in Chile. • A lack of institutional guidelines is one of the key challenges for this link.

  3. Approaches of multilayer overlay process control for 28nm FD-SOI derivative applications

    NASA Astrophysics Data System (ADS)

    Duclaux, Benjamin; De Caunes, Jean; Perrier, Robin; Gatefait, Maxime; Le Gratiet, Bertrand; Chapon, Jean-Damien; Monget, Cédric

    2018-03-01

    Derivative technology like embedded Non-Volatile Memories (eNVM) is raising new types of challenges on the "more than Moore" path. By its construction, overlay is critical across multiple layers; by its operating mode, the use of high voltages stresses leakage and breakdown; and its targeted markets (automotive, industrial automation, secure transactions…) all demand high device reliability (typically below the 1 ppm level). As a consequence, overlay specifications are tight, not only between one layer and its reference, but also among the critical layers sharing the same reference. This work describes a broad picture of the key points for multilayer overlay process control in the case of a 28nm FD-SOI technology and its derivative flows. First, the alignment trees of the different flow options were optimized using realistic process assumptions to calculate indirect overlay. Then, in the case of a complex alignment tree involving a heterogeneous scanner toolset, the criticality of tool matching between the reference layer and the critical layers of the flow was highlighted. Improving the APC control loops of these multilayer dependencies was studied with feed-forward simulations as well as by implementing a new rework algorithm based on multi-measures. Finally, the management of these measurement steps raises some issues for inline support, and using calculations or "virtual overlay" could help to gain some tool capability. A first step towards multilayer overlay process control has been taken.

  4. Sealed-bladdered chemical processing method and apparatus

    DOEpatents

    Harless, D. Phillip

    1999-01-01

    A method and apparatus which enables a complete multi-stepped chemical treatment process to occur within a single, sealed-bladdered vessel 31. The entire chemical process occurs without interruption of the sealed-bladdered vessel 31 such as opening the sealed-bladdered vessel 31 between various steps of the process. The sealed-bladdered vessel 31 is loaded with a batch to be dissolved, treated, decanted, rinsed and/or dried. A pressure filtration step may also occur. The self-contained chemical processing apparatus 32 contains a sealed-bladder 32, a fluid pump 34, a reservoir 20, a compressed gas inlet, a vacuum pump 24, and a cold trap 23 as well as the associated piping 33, numerous valves 21,22,25,26,29,30,35,36 and other controls associated with such an apparatus. The claimed invention allows for dissolution and/or chemical treatment without the operator of the self-contained chemical processing apparatus 38 coming into contact with any of the process materials.

  5. Step by Step to Smoke-Free Schools.

    ERIC Educational Resources Information Center

    VanSciver, James H.; Roberts, H. Earl

    1989-01-01

    This ERIC digest discusses ways of effectively banning smoking in schools so that controversies do not continue after implementation of the policy. By advocating a process approach, the document cites steps taken by the Lake Forest School Board to prohibit smoking in and around school grounds. Step one involved committee planning involving…

  6. Large Spatial Scale Ground Displacement Mapping through the P-SBAS Processing of Sentinel-1 Data on a Cloud Computing Environment

    NASA Astrophysics Data System (ADS)

    Casu, F.; Bonano, M.; de Luca, C.; Lanari, R.; Manunta, M.; Manzo, M.; Zinno, I.

    2017-12-01

    Since its launch in 2014, the Sentinel-1 (S1) constellation has played a key role in SAR data availability and dissemination all over the world. Indeed, the free and open access data policy adopted by the European Copernicus program, together with the global coverage acquisition strategy, makes the Sentinel constellation a game changer in the Earth Observation scenario. As SAR data become ubiquitous, the technological and scientific challenge is to maximize the exploitation of such a huge data flow. In this direction, the use of innovative processing algorithms and distributed computing infrastructures, such as Cloud Computing platforms, can play a crucial role. In this work we present a Cloud Computing solution for the advanced interferometric (DInSAR) processing chain based on the Parallel SBAS (P-SBAS) approach, aimed at processing S1 Interferometric Wide Swath (IWS) data for the generation of large-spatial-scale deformation time series in an efficient, automatic and systematic way. This DInSAR chain ingests Sentinel-1 SLC images and carries out several processing steps to finally compute deformation time series and mean deformation velocity maps. Different parallel strategies have been designed ad hoc for each processing step of the P-SBAS S1 chain, encompassing both multi-core and multi-node programming techniques, in order to maximize the computational efficiency achieved within a Cloud Computing environment and cut down the relevant processing times. The presented P-SBAS S1 processing chain has been implemented on the Amazon Web Services platform, and a thorough analysis of the attained parallel performance has been carried out to identify and overcome the major bottlenecks to scalability. The presented approach is used to perform national-scale DInSAR analyses over Italy, involving the processing of more than 3000 S1 IWS images acquired from both ascending and descending orbits. Such an experiment confirms the significant advantage of exploiting the large computational and storage resources of Cloud Computing platforms for large-scale DInSAR analysis. The presented Cloud Computing P-SBAS processing chain can be a precious tool for developing operational services for the EO scientific community related to hazard monitoring and risk prevention and mitigation.
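
A minimal sketch of the per-step parallelization idea: within one processing step, independent work units (e.g. the image pairs to be interferometrically combined) are distributed over a worker pool. A thread pool stands in here for the multi-core/multi-node scheduling of the actual P-SBAS chain, and `process_pair` is a hypothetical placeholder for one embarrassingly parallel step.

```python
from concurrent.futures import ThreadPoolExecutor

def process_pair(pair):
    """Stand-in for one independent work unit of a processing step,
    e.g. forming the interferogram for one SAR image pair."""
    a, b = pair
    return a * b  # placeholder computation

# Hypothetical list of image-pair indices to process in parallel.
pairs = [(i, i + 1) for i in range(8)]
with ThreadPoolExecutor(max_workers=4) as pool:
    results = list(pool.map(process_pair, pairs))
print(results)  # results of each pair, in submission order
```

The real chain additionally has to balance I/O-bound and CPU-bound steps, which is why different parallel strategies are designed per step rather than one scheme for the whole pipeline.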

  7. Effects of protein and phosphate buffer concentrations on thermal denaturation of lysozyme analyzed by isoconversional method

    PubMed Central

    Cao, X.M.; Tian, Y.; Wang, Z.Y.; Liu, Y.W.; Wang, C.X.

    2016-01-01

    ABSTRACT Thermal denaturation of lysozyme was studied as a function of protein concentration, phosphate buffer concentration, and scan rate using differential scanning calorimetry (DSC), and the data were analyzed by the isoconversional method. The results showed that lysozyme thermal denaturation was only slightly affected by the protein concentration and scan rate. When the protein concentration and scan rate increased, the denaturation temperature (Tm) also increased accordingly. On the contrary, the Tm decreased with increasing phosphate buffer concentration: the denaturation process was accelerated and the thermal stability reduced as the phosphate concentration increased. One part of the denaturation process, in which aggregation occurred, was not reversible; the other part was reversible. The apparent activation energy (Ea), computed by the isoconversional method, decreased with increasing conversion ratio (α). The observed denaturation process could therefore not be described by a simple reaction mechanism: it was not a process involving two standard reversible states, but a multi-step process. This isoconversional method offers new opportunities for investigating the kinetics of protein denaturation. PMID:27459596
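
The isoconversional analysis can be illustrated with the Kissinger-Akahira-Sunose variant: at a fixed conversion ratio α, ln(β/T²) is linear in 1/T with slope −Ea/R, so Ea(α) follows from a linear fit over several scan rates. The sketch below uses fabricated synthetic data, not the paper's measurements, and repeating the fit at several α values yields the Ea(α) trend discussed in the abstract.

```python
import numpy as np

R = 8.314  # gas constant, J/(mol K)

def activation_energy_kas(betas, t_alpha):
    """Kissinger-Akahira-Sunose isoconversional estimate: at a fixed
    conversion ratio, ln(beta / T^2) is linear in 1/T with slope -Ea/R.
    betas: heating rates; t_alpha: temperature (K) at which the chosen
    conversion ratio is reached for each heating rate."""
    x = 1.0 / np.asarray(t_alpha)
    y = np.log(np.asarray(betas) / np.asarray(t_alpha) ** 2)
    slope, _ = np.polyfit(x, y, 1)
    return -slope * R  # apparent activation energy, J/mol

# Synthetic check: fabricate heating rates exactly consistent with
# Ea = 300 kJ/mol, then recover it from the fit.
Ea_true = 300e3
T = np.array([340.0, 342.0, 344.0, 346.0])
betas = T ** 2 * np.exp(5.0 - Ea_true / (R * T))
print(activation_energy_kas(betas, T))  # recovers ~3.0e5 J/mol
```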

  8. Automatic Road Gap Detection Using Fuzzy Inference System

    NASA Astrophysics Data System (ADS)

    Hashemi, S.; Valadan Zoej, M. J.; Mokhtarzadeh, M.

    2011-09-01

    Automatic feature extraction from aerial and satellite images is a high-level data processing task which is still one of the most important research topics of the field. In this area, most research is focused on the early step of road detection, where road tracking methods, morphological analysis, dynamic programming and snakes, multi-scale and multi-resolution methods, stereoscopic and multi-temporal analysis, and hyperspectral experiments are some of the mature methods in this field. Although most research is focused on detection algorithms, none of them can extract the road network perfectly. On the other hand, post-processing algorithms, aimed at refining road detection results, are not as well developed. In this article, the main aim is to design an intelligent method to detect and compensate road gaps remaining in the early results of road detection algorithms. The proposed algorithm consists of the following main steps: 1) Short gap coverage: a multi-scale morphological scheme is designed that covers short gaps hierarchically. 2) Long gap detection: the long gaps that could not be covered in the previous stage are detected using a fuzzy inference system. To this end, a knowledge base consisting of expert rules is designed, which is fired on gap candidates from the road detection results. 3) Long gap coverage: detected long gaps are compensated by two strategies, linear and polynomial: shorter gaps are filled by line fitting, while longer ones are compensated by polynomials. 4) Accuracy assessment: in order to evaluate the obtained results, some accuracy assessment criteria are proposed. These criteria are obtained by comparing the obtained results with correctly compensated ones produced by a human expert. The complete evaluation of the obtained results, with their technical discussion, is the material of the full paper.
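
Step 1 (short gap coverage) can be sketched as a hierarchy of morphological closings at growing scales, with the union taken against the input mask so the detected road is preserved. This is an illustrative stand-in for the paper's multi-scale scheme, using SciPy's `binary_closing`; the mask is a toy example.

```python
import numpy as np
from scipy.ndimage import binary_closing

def close_short_gaps(road_mask, max_scale=3):
    """Hierarchical short-gap coverage: apply binary closing with
    structuring elements of growing size and accumulate the union, so
    that breaks are bridged at the smallest scale that spans them."""
    out = road_mask.astype(bool)
    for k in range(1, max_scale + 1):
        out |= binary_closing(road_mask,
                              structure=np.ones((2 * k + 1, 2 * k + 1)))
    return out

# A 1-pixel-wide horizontal road with a 2-pixel break at columns 4-5.
mask = np.zeros((7, 11), bool)
mask[3, :4] = mask[3, 6:] = True
print(close_short_gaps(mask)[3].all())  # True: the break is bridged
```

Longer gaps defeat any reasonable structuring-element size, which is why the paper hands them to the fuzzy inference system of step 2 instead.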

  9. Non-aqueous Electrode Processing and Construction of Lithium-ion Coin Cells.

    PubMed

    Stein, Malcolm; Chen, Chien-Fan; Robles, Daniel J; Rhodes, Christopher; Mukherjee, Partha P

    2016-02-01

    Research into new and improved materials to be utilized in lithium-ion batteries (LIB) necessitates an experimental counterpart to any computational analysis. Testing of lithium-ion batteries in an academic setting has taken on several forms, but at the most basic level lies the coin cell construction. In traditional LIB electrode preparation, a multi-phase slurry composed of active material, binder, and conductive additive is cast out onto a substrate. An electrode disc can then be punched from the dried sheet and used in the construction of a coin cell for electrochemical evaluation. Utilization of the potential of the active material in a battery is critically dependent on the microstructure of the electrode, as an appropriate distribution of the primary components is crucial to ensuring optimal electrical conductivity, porosity, and tortuosity, such that electrochemical and transport interaction is optimized. Processing steps ranging from the combination of dry powder, wet mixing, and drying can all critically affect the multi-phase interactions that influence microstructure formation. Electrochemical probing necessitates the construction of electrodes and coin cells with the utmost care and precision. This paper aims to provide a step-by-step guide to non-aqueous electrode processing and coin cell construction for lithium-ion batteries within an academic setting, with emphasis on deciphering the influence of drying and calendering.

  10. Non-aqueous Electrode Processing and Construction of Lithium-ion Coin Cells

    PubMed Central

    Stein, Malcolm; Chen, Chien-Fan; Robles, Daniel J.; Rhodes, Christopher; Mukherjee, Partha P.

    2016-01-01

    Research into new and improved materials to be utilized in lithium-ion batteries (LIB) necessitates an experimental counterpart to any computational analysis. Testing of lithium-ion batteries in an academic setting has taken on several forms, but at the most basic level lies the coin cell construction. In traditional LIB electrode preparation, a multi-phase slurry composed of active material, binder, and conductive additive is cast out onto a substrate. An electrode disc can then be punched from the dried sheet and used in the construction of a coin cell for electrochemical evaluation. Utilization of the potential of the active material in a battery is critically dependent on the microstructure of the electrode, as an appropriate distribution of the primary components is crucial to ensuring optimal electrical conductivity, porosity, and tortuosity, such that electrochemical and transport interaction is optimized. Processing steps ranging from the combination of dry powder, wet mixing, and drying can all critically affect the multi-phase interactions that influence microstructure formation. Electrochemical probing necessitates the construction of electrodes and coin cells with the utmost care and precision. This paper aims to provide a step-by-step guide to non-aqueous electrode processing and coin cell construction for lithium-ion batteries within an academic setting, with emphasis on deciphering the influence of drying and calendering. PMID:26863503

  11. Parallel Multi-Step/Multi-Rate Integration of Two-Time Scale Dynamic Systems

    NASA Technical Reports Server (NTRS)

    Chang, Johnny T.; Ploen, Scott R.; Sohl, Garett A.; Martin, Bryan J.

    2004-01-01

    Increasing fidelity demands for real-time and high-fidelity simulations are stressing the capacity of modern processors. New integration techniques are required that provide maximum efficiency for systems that are parallelizable. However, many current techniques make assumptions that are at odds with non-cascadable systems. A new serial multi-step/multi-rate integration algorithm for dual-timescale continuous-state systems is presented which applies to these systems, and it is extended to a parallel multi-step/multi-rate algorithm. The superior performance of both algorithms is demonstrated through a representative example.
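
The multi-rate idea can be sketched with a two-time-scale linear system: the slow state takes one macro step H while the fast state is sub-cycled with step H/m, holding the slow state constant over the macro step. Forward Euler is used here for brevity; the paper's multi-step algorithm is more sophisticated, and the system below is a hypothetical example.

```python
def multirate_euler(x_slow, x_fast, H, substeps, t_end):
    """Multi-rate integration sketch for the two-time-scale system
    x_fast' = -100*x_fast + x_slow (fast) and x_slow' = -x_slow (slow):
    the fast state is sub-cycled with step H/substeps while the slow
    state is held constant, then the slow state takes one macro step."""
    h = H / substeps
    t = 0.0
    while t < t_end - 1e-12:
        for _ in range(substeps):      # sub-cycle the stiff/fast dynamics
            x_fast += h * (-100.0 * x_fast + x_slow)
        x_slow += H * (-x_slow)        # one macro step for the slow dynamics
        t += H
    return x_slow, x_fast

xs, xf = multirate_euler(1.0, 1.0, H=0.1, substeps=100, t_end=1.0)
print(xs, xf)  # x_slow approaches exp(-1) ~ 0.37 as H shrinks
```

The payoff is that the expensive fine step is spent only on the fast subsystem; when the subsystems are independent enough, the sub-cycling loops can also be farmed out to separate processors, which is the parallel variant the abstract describes.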

  12. A Novel Multi-Sensor Environmental Perception Method Using Low-Rank Representation and a Particle Filter for Vehicle Reversing Safety

    PubMed Central

    Zhang, Zutao; Li, Yanjun; Wang, Fubing; Meng, Guanjun; Salman, Waleed; Saleem, Layth; Zhang, Xiaoliang; Wang, Chunbai; Hu, Guangdi; Liu, Yugang

    2016-01-01

    Environmental perception and information processing are two key steps of active safety for vehicle reversing. Single-sensor environmental perception cannot meet the needs of vehicle reversing safety due to its low reliability. In this paper, we present a novel multi-sensor environmental perception method using low-rank representation and a particle filter for vehicle reversing safety. The proposed system consists of four main modules, namely multi-sensor environmental perception, information fusion, target recognition and tracking using low-rank representation and a particle filter, and vehicle reversing speed control. First of all, the multi-sensor environmental perception module, based on a binocular-camera system and ultrasonic range finders, obtains the distance data for obstacles behind the vehicle when the vehicle is reversing. Secondly, an information fusion algorithm using an adaptive Kalman filter is used to process the data obtained with the multi-sensor environmental perception module, which greatly improves the robustness of the sensors. Then the framework of a particle filter and low-rank representation is used to track the main obstacles. The low-rank representation is used to optimize an objective particle template that has the smallest L1 norm. Finally, the electronic throttle opening and automatic braking are controlled by the proposed vehicle reversing control strategy prior to any potential collision, making the reversing control safer and more reliable. The final system simulation and practical testing results demonstrate the validity of the proposed multi-sensor environmental perception method using low-rank representation and a particle filter for vehicle reversing safety. PMID:27294931
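
The fusion step can be illustrated with a scalar Kalman filter combining a camera reading and an ultrasonic reading of the same obstacle distance. The static-distance model, the noise variances, and the measurement values below are assumptions for illustration, not the paper's adaptive filter.

```python
def kalman_fuse(z_cam, z_ultra, var_cam, var_ultra, x, p, q=0.01):
    """One predict/update cycle of a scalar Kalman filter fusing two
    range measurements of the same obstacle distance (an illustrative
    stand-in for the paper's adaptive Kalman fusion)."""
    p = p + q  # predict: static-distance model with process noise q
    for z, r in ((z_cam, var_cam), (z_ultra, var_ultra)):
        k = p / (p + r)       # Kalman gain for this sensor
        x = x + k * (z - x)   # corrected distance estimate
        p = (1.0 - k) * p     # reduced estimate uncertainty
    return x, p

# Fuse repeated readings: camera says 4.0 m (low noise),
# ultrasonic says 4.2 m (higher noise); start from a poor prior.
x, p = 5.0, 1.0
for _ in range(10):
    x, p = kalman_fuse(4.0, 4.2, var_cam=0.04, var_ultra=0.09, x=x, p=p)
print(x, p)  # settles between the readings, weighted toward the camera
```

The fused estimate is more robust than either sensor alone: a sensor's influence shrinks automatically as its noise variance grows, which is the property the abstract credits for the improved robustness.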

  13. A Novel Multi-Sensor Environmental Perception Method Using Low-Rank Representation and a Particle Filter for Vehicle Reversing Safety.

    PubMed

    Zhang, Zutao; Li, Yanjun; Wang, Fubing; Meng, Guanjun; Salman, Waleed; Saleem, Layth; Zhang, Xiaoliang; Wang, Chunbai; Hu, Guangdi; Liu, Yugang

    2016-06-09

    Environmental perception and information processing are two key steps of active safety for vehicle reversing. Single-sensor environmental perception cannot meet the needs of vehicle reversing safety due to its low reliability. In this paper, we present a novel multi-sensor environmental perception method using low-rank representation and a particle filter for vehicle reversing safety. The proposed system consists of four main modules, namely multi-sensor environmental perception, information fusion, target recognition and tracking using low-rank representation and a particle filter, and vehicle reversing speed control. First of all, the multi-sensor environmental perception module, based on a binocular-camera system and ultrasonic range finders, obtains the distance data for obstacles behind the vehicle when the vehicle is reversing. Secondly, an information fusion algorithm using an adaptive Kalman filter is used to process the data obtained with the multi-sensor environmental perception module, which greatly improves the robustness of the sensors. Then the framework of a particle filter and low-rank representation is used to track the main obstacles. The low-rank representation is used to optimize an objective particle template that has the smallest L1 norm. Finally, the electronic throttle opening and automatic braking are controlled by the proposed vehicle reversing control strategy prior to any potential collision, making the reversing control safer and more reliable. The final system simulation and practical testing results demonstrate the validity of the proposed multi-sensor environmental perception method using low-rank representation and a particle filter for vehicle reversing safety.

  14. The "Motor" in Implicit Motor Sequence Learning: A Foot-stepping Serial Reaction Time Task.

    PubMed

    Du, Yue; Clark, Jane E

    2018-05-03

    This protocol describes a modified serial reaction time (SRT) task used to study implicit motor sequence learning. Unlike the classic SRT task that involves finger-pressing movements while sitting, the modified SRT task requires participants to step with both feet while maintaining a standing posture. This stepping task necessitates whole body actions that impose postural challenges. The foot-stepping task complements the classic SRT task in several ways. The foot-stepping SRT task is a better proxy for the daily activities that require ongoing postural control, and thus may help us better understand sequence learning in real-life situations. In addition, response time serves as an indicator of sequence learning in the classic SRT task, but it is unclear whether response time, reaction time (RT) representing mental process, or movement time (MT) reflecting the movement itself, is a key player in motor sequence learning. The foot-stepping SRT task allows researchers to disentangle response time into RT and MT, which may clarify how motor planning and movement execution are involved in sequence learning. Lastly, postural control and cognition are interactively related, but little is known about how postural control interacts with learning motor sequences. With a motion capture system, the movement of the whole body (e.g., the center of mass (COM)) can be recorded. Such measures allow us to reveal the dynamic processes underlying discrete responses measured by RT and MT, and may aid in elucidating the relationship between postural control and the explicit and implicit processes involved in sequence learning. Details of the experimental set-up, procedure, and data processing are described. The representative data are adopted from one of our previous studies. Results are related to response time, RT, and MT, as well as the relationship between the anticipatory postural response and the explicit processes involved in implicit motor sequence learning.

  15. Theoretical Studies of Chemical Reactions following Electronic Excitation

    NASA Technical Reports Server (NTRS)

    Chaban, Galina M.

    2003-01-01

    The use of multi-configurational wave functions is demonstrated for several processes: tautomerization reactions in the ground and excited states of the DNA base adenine, dissociation of the glycine molecule after electronic excitation, and decomposition/deformation of novel rare gas molecules HRgF. These processes involve bond breaking/formation and require multi-configurational approaches that include dynamic correlation.

  16. A step-by-step translation of evidence into a psychosocial intervention for everyday activities in dementia: a focus group study.

    PubMed

    Giebel, Clarissa M; Challis, David; Hooper, Nigel M; Ferris, Sally

    2018-03-01

    In order to increase the efficacy of psychosocial interventions in dementia, a step-by-step process translating evidence and public engagement should be adhered to. This paper describes such a process by involving a two-stage focus group with people with dementia (PwD), informal carers, and staff. Based on previous evidence, general aspects of effective interventions were drawn out. These were tested in the first stage of focus groups, one with informal carers and PwD and one with staff. Findings from this stage helped shape the intervention further specifying its content. In the second stage, participants were consulted about the detailed components. The extant evidence base and focus groups helped to identify six practical and situation-specific elements worthy of consideration in planning such an intervention, including underlying theory and personal motivations for participation. Carers, PwD, and staff highlighted the importance of rapport between practitioners and PwD prior to commencing the intervention. It was also considered important that the intervention would be personalised to each individual. This paper shows how valuable public involvement can be to intervention development, and outlines a process of public involvement for future intervention development. The next step would be to formally test the intervention.

  17. SU-F-T-248: FMEA Risk Analysis Implementation (AAPM TG-100) in Total Skin Electron Irradiation Technique

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ibanez-Rosello, B; Bautista-Ballesteros, J; Bonaque, J

    2016-06-15

    Purpose: Total Skin Electron Irradiation (TSEI) is a radiotherapy treatment which involves irradiating the entire body surface as homogeneously as possible. It is an extensive multi-step technique in which quality management requires a high consumption of resources and fluid communication between the staff involved, necessary to improve the safety of the treatment. TG-100 proposes a new perspective on quality management in radiotherapy, presenting a systematic method of risk analysis across the global flow of stages through which the patient passes. The purpose of this work has been to apply the TG-100 approach to the TSEI procedure in our institution. Methods: A multidisciplinary team specifically targeting the TSEI procedure was formed, which met regularly and jointly developed the process map (PM), following the TG-100 guidelines of the AAPM. This PM is a visual representation of the temporal flow of steps through which the patient passes from the start until the end of the stay in the radiotherapy service. Results: This is the first stage of the full risk analysis being carried out at the center. The PM provides an overview of the process and facilitates the understanding of the team members who will participate in the subsequent analysis. Currently, the team is implementing the failure modes and effects analysis (FMEA). The failure modes of each of the steps have been identified, and assessors are individually assigning values for severity (S), frequency of occurrence (O) and lack of detectability (D). To our knowledge, this is the first PM made for TSEI. The developed PM can be useful for centers that intend to implement the TSEI technique. Conclusion: The PM of the TSEI technique has been established as the first stage of a full risk analysis performed at a reference center for this treatment.

  18. Software Defined Radio with Parallelized Software Architecture

    NASA Technical Reports Server (NTRS)

    Heckler, Greg

    2013-01-01

    This software implements software-defined radio processing over multi-core, multi-CPU systems in a way that maximizes the use of CPU resources in the system. The software treats each processing step in a communications or navigation modulator or demodulator system as an independent, threaded block. Each threaded block is defined with a programmable number of input or output buffers; these buffers are implemented using POSIX pipes. In addition, each threaded block is assigned a unique thread upon block installation. A modulator or demodulator system is built by assembling the threaded blocks into a flow graph, which arranges the processing blocks to accomplish the desired signal processing. This software architecture allows the software to scale effortlessly between single-CPU/single-core computers and multi-CPU/multi-core computers without recompilation. NASA spaceflight and ground communications systems currently rely exclusively on ASICs or FPGAs. This software allows low- and medium-bandwidth (100 bps to 50 Mbps) software-defined radios to be designed and implemented solely in C/C++ software, while lowering development costs and facilitating reuse and extensibility.
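
The threaded-block/flow-graph architecture can be sketched in miniature: each block runs in its own thread and communicates with its neighbors through POSIX pipes, as the text describes. Python stands in here for the C/C++ implementation, and the two-block "source → doubler → sink" graph is a hypothetical toy.

```python
import os
import threading

def source(w):
    """Threaded block: writes samples into its output pipe, then
    closes it to signal end-of-stream downstream."""
    with os.fdopen(w, "wb") as out:
        for i in range(5):
            out.write(bytes([i]))

def doubler(r, w):
    """Threaded block: reads samples from its input pipe, applies a
    trivial 'processing step', and writes to its output pipe."""
    with os.fdopen(r, "rb") as inp, os.fdopen(w, "wb") as out:
        while (b := inp.read(1)):
            out.write(bytes([b[0] * 2]))

# Assemble the flow graph: source -> doubler -> sink (this thread).
r1, w1 = os.pipe()
r2, w2 = os.pipe()
threads = [threading.Thread(target=source, args=(w1,)),
           threading.Thread(target=doubler, args=(r1, w2))]
for t in threads:
    t.start()
with os.fdopen(r2, "rb") as sink:
    received = list(sink.read())
for t in threads:
    t.join()
print(received)  # [0, 2, 4, 6, 8]
```

Because the blocks only touch their pipes, the same graph runs unchanged whether the operating system schedules the threads on one core or many, which is the scaling-without-recompilation property the abstract highlights.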

  19. Optimal design of an alignment-free two-DOF rehabilitation robot for the shoulder complex.

    PubMed

    Galinski, Daniel; Sapin, Julien; Dehez, Bruno

    2013-06-01

    This paper presents the optimal design of an alignment-free exoskeleton for the rehabilitation of the shoulder complex. The robot structure consists of two actuated joints and is linked to the arm through passive degrees of freedom (DOFs) to drive the flexion-extension and abduction-adduction movements of the upper arm. The optimal design of this structure is performed in two steps. The first step is a multi-objective optimization process aiming to find the best parameters characterizing the robot and its position relative to the patient. The second step is a comparison process aiming to select the best solution from the optimization results on the basis of several criteria related to practical considerations. The optimal design process leads to a solution that outperforms an existing one in aspects such as kinematics and ergonomics while being simpler.

  20. Cascade catalysis in membranes with enzyme immobilization for multi-enzymatic conversion of CO2 to methanol.

    PubMed

    Luo, Jianquan; Meyer, Anne S; Mateiu, R V; Pinelo, Manuel

    2015-05-25

    Facile co-immobilization of enzymes is highly desirable for bioconversion methods involving multi-enzymatic cascade reactions. Here we show for the first time that three enzymes can be immobilized in flat-sheet polymeric membranes simultaneously or separately by simple pressure-driven filtration (i.e. by directing membrane fouling formation), without any addition of organic solvent. Such co-immobilization and sequential immobilization systems were examined for the production of methanol from CO2 with formate dehydrogenase (FDH), formaldehyde dehydrogenase (FaldDH) and alcohol dehydrogenase (ADH). Enzyme activity was fully retained by this non-covalent immobilization strategy. The two immobilization systems had similar catalytic efficiencies because the second reaction (formic acid→formaldehyde) catalyzed by FaldDH was found to be the cascade bottleneck (a threshold substrate concentration was required). Moreover, the trade-off between the mitigation of product inhibition and low substrate concentration for the adjacent enzymes probably made the co-immobilization meaningless. Thus, sequential immobilization could be used for multi-enzymatic cascade reactions, as it allowed the operational conditions for each single step to be optimized, not only during the enzyme immobilization but also during the reaction process, and the pressure-driven mass transfer (flow-through mode) could overcome the diffusion resistance between enzymes. This study not only offers a green and facile immobilization method for multi-enzymatic cascade systems, but also reveals the reaction bottleneck and provides possible solutions for the bioconversion of CO2 to methanol. Copyright © 2015 Elsevier B.V. All rights reserved.

  1. Italian translation and cultural adaptation of the communication assessment tool in an outpatient surgical clinic.

    PubMed

    Scala, Daniela; Menditto, Enrica; Armellino, Mariano Fortunato; Manguso, Francesco; Monetti, Valeria Marina; Orlando, Valentina; Antonino, Antonio; Makoul, Gregory; De Palma, Maurizio

    2016-04-29

    The aim of the study is to translate and cross-culturally adapt, for use in the Italian context, the Communication Assessment Tool (CAT) developed by Makoul and colleagues. The study was performed in the out-patient clinic of the Surgical Department of Cardarelli Hospital in Naples, Italy. It involved a systematic, standardized, multi-step process adhering to internationally accepted and recommended guidelines. Corrections and adjustments to the translation addressed both linguistic factors and cultural components. The CAT was translated into Italian by two independent Italian mother-tongue translators. The consensus version was then back-translated by an English mother-tongue translator. This translation process was followed by a consensus meeting between the authors of translation and investigators, and then by two comprehension tests on a total of 65 patients. Results of the translation and cross-cultural adaptation were satisfactory and indicate that the Italian translation of the CAT can be used with confidence in the Italian context.

  2. Method for network analyzation and apparatus

    DOEpatents

    Bracht, Roger B.; Pasquale, Regina V.

    2001-01-01

    A portable network analyzer and method having multi-channel transmit and receive capability for real-time monitoring of processes, which maintains phase integrity, requires low power, and provides full vector analysis, output frequencies of up to 62.5 MHz, and fine frequency resolution. The present invention includes a multi-channel means for transmitting and a multi-channel means for receiving, both in electrical communication with a software means for controlling. The means for controlling is programmed to provide a signal to a system under investigation which steps consecutively over a range of predetermined frequencies. The resulting signal received from the system provides complete time-domain response information by executing a frequency transform of the magnitude and phase information acquired at each frequency step.
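
The final frequency transform can be illustrated: stepping consecutively over frequencies yields magnitude and phase samples of the system's transfer function, and an inverse FFT turns those samples into a time-domain response. The sketch below uses a synthetic pure delay as the "system under investigation"; it illustrates the general idea, not the patented instrument's algorithm.

```python
import numpy as np

n = 64
freqs = np.arange(n)          # consecutive frequency steps (bin indices)
delay = 5                     # hypothetical system: a pure 5-sample delay

# "Measured" magnitude/phase at each frequency step: a pure delay has
# unit magnitude and a phase that ramps linearly with frequency.
H = np.exp(-2j * np.pi * freqs * delay / n)

h = np.fft.ifft(H)            # frequency transform -> time-domain response
print(int(np.argmax(np.abs(h))))  # energy concentrates at sample 5
```

Because magnitude and phase are both retained at every step (the "full vector" data), the time-domain reconstruction recovers where in time the system's response occurs, not just how strong it is.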

  3. Cellulose Biosynthesis: Current Views and Evolving Concepts

    PubMed Central

    SAXENA, INDER M.; BROWN, R. MALCOLM

    2005-01-01

    • Aims To outline the current state of knowledge and discuss the evolution of various viewpoints put forth to explain the mechanism of cellulose biosynthesis. • Scope Understanding the mechanism of cellulose biosynthesis is one of the major challenges in plant biology. The simplicity in the chemical structure of cellulose belies the complexities that are associated with the synthesis and assembly of this polysaccharide. Assembly of cellulose microfibrils in most organisms is visualized as a multi-step process involving a number of proteins with the key protein being the cellulose synthase catalytic sub-unit. Although genes encoding this protein have been identified in almost all cellulose synthesizing organisms, it has been a challenge in general, and more specifically in vascular plants, to demonstrate cellulose synthase activity in vitro. The assembly of glucan chains into cellulose microfibrils of specific dimensions, viewed as a spontaneous process, necessitates the assembly of synthesizing sites unique to most groups of organisms. The steps of polymerization (requiring the specific arrangement and activity of the cellulose synthase catalytic sub-units) and crystallization (directed self-assembly of glucan chains) are certainly interlinked in the formation of cellulose microfibrils. Mutants affected in cellulose biosynthesis have been identified in vascular plants. Studies on these mutants and herbicide-treated plants suggest an interesting link between the steps of polymerization and crystallization during cellulose biosynthesis. • Conclusions With the identification of a large number of genes encoding cellulose synthases and cellulose synthase-like proteins in vascular plants and the supposed role of a number of other proteins in cellulose biosynthesis, a complete understanding of this process will necessitate a wider variety of research tools and approaches than was thought to be required a few years back. PMID:15894551

  4. Cellulose biosynthesis: current views and evolving concepts.

    PubMed

    Saxena, Inder M; Brown, R Malcolm

    2005-07-01

    To outline the current state of knowledge and discuss the evolution of various viewpoints put forth to explain the mechanism of cellulose biosynthesis. * Understanding the mechanism of cellulose biosynthesis is one of the major challenges in plant biology. The simplicity in the chemical structure of cellulose belies the complexities that are associated with the synthesis and assembly of this polysaccharide. Assembly of cellulose microfibrils in most organisms is visualized as a multi-step process involving a number of proteins with the key protein being the cellulose synthase catalytic sub-unit. Although genes encoding this protein have been identified in almost all cellulose synthesizing organisms, it has been a challenge in general, and more specifically in vascular plants, to demonstrate cellulose synthase activity in vitro. The assembly of glucan chains into cellulose microfibrils of specific dimensions, viewed as a spontaneous process, necessitates the assembly of synthesizing sites unique to most groups of organisms. The steps of polymerization (requiring the specific arrangement and activity of the cellulose synthase catalytic sub-units) and crystallization (directed self-assembly of glucan chains) are certainly interlinked in the formation of cellulose microfibrils. Mutants affected in cellulose biosynthesis have been identified in vascular plants. Studies on these mutants and herbicide-treated plants suggest an interesting link between the steps of polymerization and crystallization during cellulose biosynthesis. * With the identification of a large number of genes encoding cellulose synthases and cellulose synthase-like proteins in vascular plants and the supposed role of a number of other proteins in cellulose biosynthesis, a complete understanding of this process will necessitate a wider variety of research tools and approaches than was thought to be required a few years back.

  5. Density functional theory and RRKM calculations of decompositions of the metastable E-2,4-pentadienal molecular ions.

    PubMed

    Solano Espinoza, Eduardo A; Vallejo Narváez, Wilmer E

    2010-07-01

The potential energy profiles for the fragmentations that lead to [C(5)H(5)O](+) and [C(4)H(6)](+*) ions from the molecular ions [C(5)H(6)O](+*) of E-2,4-pentadienal were obtained from calculations at the UB3LYP/6-311++G(3df,3pd)//UB3LYP/6-31G(d,p) level of theory. Kinetic barriers and harmonic frequencies obtained by the density functional method were then employed in Rice-Ramsperger-Kassel-Marcus calculations of individual rate coefficients for a large number of reaction steps. The pre-equilibrium and rate-controlling step approximations were applied to different regions of the complex potential energy surface, allowing the overall rate of decomposition to be calculated and the three rival pathways, C-H bond cleavage, decarbonylation, and cyclization, to be discriminated. These processes compete for an equilibrated mixture of four conformers of the E-2,4-pentadienal ions. The direct dissociation, however, only becomes important in the high-energy regime. In contrast, loss of CO and cyclization are observable processes in the metastable kinetic window. The former involves a slow 1,2-hydrogen shift from the carbonyl group that is immediately followed by the formation of an ion-neutral complex which, in turn, decomposes rapidly to the s-trans-1,3-butadiene ion [C(4)H(6)](+*). The predominant metastable channel is the second, a multi-step ring closure that starts with a rate-limiting cis-trans isomerization. This process yields a mixture of interconverting pyran ions that dissociates to the pyrylium ions [C(5)H(5)O](+). These results can be used to rationalize the CID mass spectrum of E-2,4-pentadienal in the low-energy regime. 2010 John Wiley & Sons, Ltd.
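The microcanonical rate coefficients at the heart of such RRKM treatments can be illustrated with the simpler classical RRK expression. A minimal sketch follows; the barrier height, frequency factor, and mode count are illustrative placeholders, not values from the paper.

```python
def rrk_rate(E, E0, nu, s):
    """Classical RRK estimate of the microcanonical rate coefficient:
    k(E) = nu * (1 - E0/E)**(s - 1) for E > E0, and 0 below threshold.
    E and E0 share units (e.g. kcal/mol); nu is a frequency factor in 1/s;
    s counts the active vibrational modes."""
    if E <= E0:
        return 0.0
    return nu * (1.0 - E0 / E) ** (s - 1)

# Illustrative values only, not taken from the paper: a 30 kcal/mol barrier,
# a 1e13 1/s frequency factor, and 33 modes for a C5H6O(+*)-sized ion.
rates = [rrk_rate(E, 30.0, 1e13, 33) for E in (25.0, 35.0, 50.0, 80.0)]
```

The rate is zero below the barrier and rises steeply with internal energy, which is why the competing channels dominate in different energy regimes.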

  6. The eClinical Care Pathway Framework: a novel structure for creation of online complex clinical care pathways and its application in the management of sexually transmitted infections.

    PubMed

    Gibbs, Jo; Sutcliffe, Lorna J; Gkatzidou, Voula; Hone, Kate; Ashcroft, Richard E; Harding-Esch, Emma M; Lowndes, Catherine M; Sadiq, S Tariq; Sonnenberg, Pam; Estcourt, Claudia S

    2016-07-22

    Despite considerable international eHealth impetus, there is no guidance on the development of online clinical care pathways. Advances in diagnostics now enable self-testing with home diagnosis, to which comprehensive online clinical care could be linked, facilitating completely self-directed, remote care. We describe a new framework for developing complex online clinical care pathways and its application to clinical management of people with genital chlamydia infection, the commonest sexually transmitted infection (STI) in England. Using the existing evidence-base, guidelines and examples from contemporary clinical practice, we developed the eClinical Care Pathway Framework, a nine-step iterative process. Step 1: define the aims of the online pathway; Step 2: define the functional units; Step 3: draft the clinical consultation; Step 4: expert review; Step 5: cognitive testing; Step 6: user-centred interface testing; Step 7: specification development; Step 8: software testing, usability testing and further comprehension testing; Step 9: piloting. We then applied the Framework to create a chlamydia online clinical care pathway (Online Chlamydia Pathway). Use of the Framework elucidated content and structure of the care pathway and identified the need for significant changes in sequences of care (Traditional: history, diagnosis, information versus Online: diagnosis, information, history) and prescribing safety assessment. The Framework met the needs of complex STI management and enabled development of a multi-faceted, fully-automated consultation. The Framework provides a comprehensive structure on which complex online care pathways such as those needed for STI management, which involve clinical services, public health surveillance functions and third party (sexual partner) management, can be developed to meet national clinical and public health standards. 
The Online Chlamydia Pathway's standardised method of collecting data on demographics and sexual behaviour, with potential for interoperability with surveillance systems, could be a powerful tool for public health and clinical management.

  7. Communicative Interaction Processes Involving Non-Vocal Physically Handicapped Children.

    ERIC Educational Resources Information Center

    Harris, Deberah

    1982-01-01

    Communication prostheses are critical components of the nonvocal child's communication process, but are only one component. This article focuses on the steps involved in communicative interaction processes and the potential barriers to the development of effective interaction and analysis of nonvocal communicative interactions. A discussion of the…

  8. Connection between quantum systems involving the fourth Painlevé transcendent and k-step rational extensions of the harmonic oscillator related to Hermite exceptional orthogonal polynomial

    NASA Astrophysics Data System (ADS)

    Marquette, Ian; Quesne, Christiane

    2016-05-01

The purpose of this communication is to point out the connection between a 1D quantum Hamiltonian involving the fourth Painlevé transcendent PIV, obtained in the context of second-order supersymmetric quantum mechanics and third-order ladder operators, and a hierarchy of families of quantum systems called k-step rational extensions of the harmonic oscillator, related to multi-indexed X_{m1,m2,…,mk} Hermite exceptional orthogonal polynomials of type III. The connection between these exactly solvable models is established at the level of the equivalence of the Hamiltonians using rational solutions of the fourth Painlevé equation in terms of generalized Hermite and Okamoto polynomials. We also relate the different ladder operators obtained by various combinations of supersymmetric constructions involving Darboux-Crum and Krein-Adler supercharges, their zero modes and the corresponding energies. These results demonstrate and clarify the relation observed for a particular case in previous papers.

  9. Using Institutional Survey Data to Jump-Start Your Benchmarking Process

    ERIC Educational Resources Information Center

    Chow, Timothy K. C.

    2012-01-01

    Guided by the missions and visions, higher education institutions utilize benchmarking processes to identify better and more efficient ways to carry out their operations. Aside from the initial planning and organization steps involved in benchmarking, a matching or selection step is crucial for identifying other institutions that have good…

  10. Initial Crisis Reaction and Poliheuristic Theory

    ERIC Educational Resources Information Center

    DeRouen, Karl, Jr.; Sprecher, Christopher

    2004-01-01

    Poliheuristic (PH) theory models foreign policy decisions using a two-stage process. The first step eliminates alternatives on the basis of a simplifying heuristic. The second step involves a selection from among the remaining alternatives and can employ a more rational and compensatory means of processing information. The PH model posits that…

  11. Program Evaluation: Roles and Responsibilities of Boards of Education Relative to Thorough and Efficient Legislation.

    ERIC Educational Resources Information Center

    Research for Better Schools, Inc., Philadelphia, PA.

    The process for providing a "thorough and efficient" (T & E) education according to New Jersey statutes and regulations involves six basic steps. This document suggests procedures for handling the fifth step, educational program evaluation. Processes discussed include committee formation, evaluation planning, action plan…

  12. Guiding gate-etch process development using 3D surface reaction modeling for 7nm and beyond

    NASA Astrophysics Data System (ADS)

    Dunn, Derren; Sporre, John R.; Deshpande, Vaibhav; Oulmane, Mohamed; Gull, Ronald; Ventzek, Peter; Ranjan, Alok

    2017-03-01

Increasingly, advanced process nodes such as 7nm (N7) are fundamentally 3D and require stringent control of critical dimensions over high aspect ratio features. Process integration in these nodes requires a deep understanding of complex physical mechanisms to control critical dimensions from lithography through final etch. Polysilicon gate etch processes are critical steps in several device architectures for advanced nodes that rely on self-aligned patterning approaches to gate definition. These processes are required to meet several key metrics: (a) vertical etch profiles over high aspect ratios; (b) clean gate sidewalls free of etch process residue; (c) minimal erosion of liner oxide films protecting key architectural elements such as fins; and (d) residue-free corners at gate interfaces with critical device elements. In this study, we explore how hybrid modeling approaches can be used to model a multi-step finFET polysilicon gate etch process. Initial parts of the patterning process through hardmask assembly are modeled using process emulation. Important aspects of gate definition are then modeled using a particle Monte Carlo (PMC) feature scale model that incorporates surface chemical reactions.1 When necessary, species and energy flux inputs to the PMC model are derived from simulations of the etch chamber. The modeled polysilicon gate etch process consists of several steps including a hard mask breakthrough step (BT), main feature etch steps (ME), and over-etch steps (OE) that control gate profiles at the gate fin interface. An additional constraint on this etch flow is that fin spacer oxides are left intact after final profile tuning steps. A natural optimization required from these processes is to maximize vertical gate profiles while minimizing erosion of fin spacer films.2
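As a toy illustration of why high aspect ratios make profile control hard, the sketch below (not the paper's PMC feature-scale model) estimates the fraction of ions entering a trench that reach the bottom without striking a sidewall; the angular spread and geometry are assumptions.

```python
import math
import random

def bottom_flux_fraction(aspect_ratio, sigma_deg, n=20000, seed=1):
    """Monte Carlo estimate of the fraction of ions entering a trench of
    given aspect ratio (depth/width) that reach the bottom without hitting
    a sidewall. Ions enter uniformly across a unit-width opening with a
    Gaussian angular spread of sigma_deg degrees about the vertical."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(n):
        x = rng.random()                       # entry position in [0, 1]
        theta = rng.gauss(0.0, math.radians(sigma_deg))
        # lateral displacement accumulated while traversing the depth
        x_bottom = x + aspect_ratio * math.tan(theta)
        if 0.0 <= x_bottom <= 1.0:
            hits += 1
    return hits / n
```

Deeper trenches intercept a larger share of off-normal ions on their sidewalls, starving the bottom of flux, which is one reason multi-step BT/ME/OE recipes are needed to keep profiles vertical.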

  13. The limits to biocatalysis: pushing the envelope.

    PubMed

    Sheldon, Roger A; Brady, Dean

    2018-06-12

In the period 1985 to 1995, applications of biocatalysis, driven by the need for more sustainable manufacture of chemicals and for catalytic (enantio)selective methods for the synthesis of pharmaceutical intermediates, largely involved the available hydrolases. This was followed, in the next two decades, by revolutionary developments in protein engineering and directed evolution for the optimisation of enzyme function and performance that totally changed the biocatalysis landscape. In the same period, metabolic engineering and synthetic biology revolutionised the use of whole-cell biocatalysis in the synthesis of commodity chemicals by fermentation. In particular, developments in the enzymatic enantioselective synthesis of chiral alcohols and amines are highlighted. Progress in enzyme immobilisation facilitated applications under harsh industrial conditions, such as in organic solvents. The emergence of biocatalytic or chemoenzymatic cascade processes, often with co-immobilised enzymes, has enabled telescoping of multi-step processes. Discovering and inventing new biocatalytic processes, based on (meta)genomic sequencing, evolving enzyme promiscuity, chemomimetic biocatalysis, artificial metalloenzymes, and the introduction of non-canonical amino acids into proteins, are pushing back the limits of biocatalytic function. Finally, the integral role of biocatalysis in developing a biobased carbon-neutral economy is discussed.

  14. XUV-induced reactions in benzene on sub-10 fs timescale: nonadiabatic relaxation and proton migration.

    PubMed

    Galbraith, M C E; Smeenk, C T L; Reitsma, G; Marciniak, A; Despré, V; Mikosch, J; Zhavoronkov, N; Vrakking, M J J; Kornilov, O; Lépine, F

    2017-08-02

Unraveling ultrafast dynamical processes in highly excited molecular species has an impact on our understanding of chemical processes such as combustion or the chemical composition of molecular clouds in the universe. In this article we use short (<7 fs) XUV pulses to produce excited cationic states of benzene molecules and probe their dynamics using few-cycle VIS/NIR laser pulses. The excited states produced by the XUV pulses lie in an especially complex spectral region where multi-electronic effects play a dominant role. We show that very fast (τ ≈ 20 fs) nonadiabatic processes dominate the relaxation of these states, in agreement with the timescale expected for most excited cationic states in benzene. In the CH3(+) fragmentation channel of the doubly ionized benzene cation we identify pathways that involve structural rearrangement and proton migration to a specific carbon atom. Further, we observe non-trivial transient behavior in this fragment channel, which can be interpreted either in terms of propagation of the nuclear wavepacket in the initially excited electronic state of the cation or as a two-step electronic relaxation via an intermediate state.

15. An Investigation of Factors Involved When Educational Psychologists Supervise Other Professionals

    ERIC Educational Resources Information Center

    Callicott, Katie; Leadbetter, Jane

    2013-01-01

    Inter-professional supervision combines the social processes of supervision and multi-agency working: both complex and often poorly understood processes. This paper discusses the first author's research of inter-professional supervision, involving an educational psychologist (EP) supervising another professional and complements the recently…

  16. One-step production of multilayered microparticles by tri-axial electro-flow focusing

    NASA Astrophysics Data System (ADS)

    Si, Ting; Feng, Hanxin; Li, Yang; Luo, Xisheng; Xu, Ronald

    2014-03-01

Microencapsulation of drugs and imaging agents in the same carrier is of great significance for simultaneous detection and treatment of diseases. In this work, we have developed a tri-axial electro-flow focusing (TEFF) device using three needles in a novel concentric arrangement to form multilayered microparticles in a single step. The TEFF process can be characterized as a multi-fluidic compound cone-jet configuration in the core of a high-speed coflowing gas stream under an axial electric field. The tri-axial liquid jet eventually breaks up into multilayered droplets. To validate the method, the effect of the main process parameters on characteristics of the cone and the jet has been studied experimentally. The applied electric field can dramatically promote the stability of the compound cone and enhance the atomization of compound liquid jets. Microparticles with three-layer, double-layer, and single-layer structures have been obtained. The results show that the TEFF technique has great benefits in fabricating multilayered microparticles at smaller scales. This method will enable single-step encapsulation of multiple therapeutic and imaging agents for biomedical applications such as multi-modal imaging, drug delivery and biomedicine.

  17. Multi-objective optimization for an automated and simultaneous phase and baseline correction of NMR spectral data

    NASA Astrophysics Data System (ADS)

    Sawall, Mathias; von Harbou, Erik; Moog, Annekathrin; Behrens, Richard; Schröder, Henning; Simoneau, Joël; Steimers, Ellen; Neymeyr, Klaus

    2018-04-01

    Spectral data preprocessing is an integral and sometimes inevitable part of chemometric analyses. For Nuclear Magnetic Resonance (NMR) spectra a possible first preprocessing step is a phase correction which is applied to the Fourier transformed free induction decay (FID) signal. This preprocessing step can be followed by a separate baseline correction step. Especially if series of high-resolution spectra are considered, then automated and computationally fast preprocessing routines are desirable. A new method is suggested that applies the phase and the baseline corrections simultaneously in an automated form without manual input, which distinguishes this work from other approaches. The underlying multi-objective optimization or Pareto optimization provides improved results compared to consecutively applied correction steps. The optimization process uses an objective function which applies strong penalty constraints and weaker regularization conditions. The new method includes an approach for the detection of zero baseline regions. The baseline correction uses a modified Whittaker smoother. The functionality of the new method is demonstrated for experimental NMR spectra. The results are verified against gravimetric data. The method is compared to alternative preprocessing tools. Additionally, the simultaneous correction method is compared to a consecutive application of the two correction steps.
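A heavily reduced sketch of automated zero-order phase correction by penalized optimization follows. The penalty functional (squared imaginary part plus a strong penalty on negative real intensity) and the grid search are assumptions standing in for the paper's Pareto objective, and the Whittaker baseline step is omitted.

```python
import cmath
import math

def lorentzian(n=256, center=128, width=5.0):
    """Synthetic absorption-mode line: purely real and positive."""
    return [complex(width / ((k - center) ** 2 + width ** 2), 0.0)
            for k in range(n)]

def apply_phase(spec, phi):
    """Zero-order phase rotation of a complex spectrum."""
    rot = cmath.exp(1j * phi)
    return [s * rot for s in spec]

def penalty(spec):
    """Squared imaginary part plus a strong penalty on negative real
    intensity; the paper's exact functional is not reproduced here."""
    p = 0.0
    for s in spec:
        p += s.imag ** 2
        if s.real < 0.0:
            p += 10.0 * s.real ** 2
    return p

def auto_phase(spec, steps=720):
    """Grid search for the zero-order phase minimizing the penalty."""
    best = min(range(steps),
               key=lambda i: penalty(apply_phase(spec, 2 * math.pi * i / steps)))
    return 2 * math.pi * best / steps

distorted = apply_phase(lorentzian(), 1.0)  # inject a known 1.0 rad error
phi = auto_phase(distorted)                 # recovered correction, ≈ -1.0 rad mod 2π
```

The recovered correction cancels the injected phase error; a practical routine would refine this with first-order (frequency-dependent) phase terms and a simultaneous baseline model, as the paper does.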

  18. Factors associated with the use of cognitive aids in operating room crises: a cross-sectional study of US hospitals and ambulatory surgical centers.

    PubMed

    Alidina, Shehnaz; Goldhaber-Fiebert, Sara N; Hannenberg, Alexander A; Hepner, David L; Singer, Sara J; Neville, Bridget A; Sachetta, James R; Lipsitz, Stuart R; Berry, William R

    2018-03-26

    Operating room (OR) crises are high-acuity events requiring rapid, coordinated management. Medical judgment and decision-making can be compromised in stressful situations, and clinicians may not experience a crisis for many years. A cognitive aid (e.g., checklist) for the most common types of crises in the OR may improve management during unexpected and rare events. While implementation strategies for innovations such as cognitive aids for routine use are becoming better understood, cognitive aids that are rarely used are not yet well understood. We examined organizational context and implementation process factors influencing the use of cognitive aids for OR crises. We conducted a cross-sectional study using a Web-based survey of individuals who had downloaded OR cognitive aids from the websites of Ariadne Labs or Stanford University between January 2013 and January 2016. In this paper, we report on the experience of 368 respondents from US hospitals and ambulatory surgical centers. We analyzed the relationship of more successful implementation (measured as reported regular cognitive aid use during applicable clinical events) with organizational context and with participation in a multi-step implementation process. We used multivariable logistic regression to identify significant predictors of reported, regular OR cognitive aid use during OR crises. In the multivariable logistic regression, small facility size was associated with a fourfold increase in the odds of a facility reporting more successful implementation (p = 0.0092). Completing more implementation steps was also significantly associated with more successful implementation; each implementation step completed was associated with just over 50% higher odds of more successful implementation (p ≤ 0.0001). More successful implementation was associated with leadership support (p < 0.0001) and dedicated time to train staff (p = 0.0189). 
Less successful implementation was associated with resistance among clinical providers to using cognitive aids (p < 0.0001), absence of an implementation champion (p = 0.0126), and unsatisfactory content or design of the cognitive aid (p = 0.0112). Successful implementation of cognitive aids in ORs was associated with a supportive organizational context and following a multi-step implementation process. Building strong organizational support and following a well-planned multi-step implementation process will likely increase the use of OR cognitive aids during intraoperative crises, which may improve patient outcomes.
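The reported per-step effect can be translated into logistic-regression terms. In the sketch below, the 1.5 odds multiplier is a rounding of the abstract's "just over 50% higher odds" per completed implementation step, used purely for illustration.

```python
import math

# If each completed implementation step multiplies the odds of successful
# cognitive-aid use by about 1.5, the corresponding logistic-regression
# coefficient is ln(1.5), and k completed steps compound multiplicatively.
BETA_PER_STEP = math.log(1.5)

def odds_multiplier(k):
    """Cumulative odds ratio after completing k implementation steps."""
    return math.exp(BETA_PER_STEP * k)
```

Completing, say, four steps would then be associated with roughly a fivefold increase in the odds of regular use, which is why the authors emphasize following the full multi-step process.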

  19. Method of thermally processing superplastically formed aluminum-lithium alloys to obtain optimum strengthening

    NASA Technical Reports Server (NTRS)

    Anton, Claire E. (Inventor)

    1993-01-01

    Optimum strengthening of a superplastically formed aluminum-lithium alloy structure is achieved via a thermal processing technique which eliminates the conventional step of solution heat-treating immediately following the step of superplastic forming of the structure. The thermal processing technique involves quenching of the superplastically formed structure using static air, forced air or water quenching.

  20. Multi-scale modelling of non-uniform consolidation of uncured toughened unidirectional prepregs

    NASA Astrophysics Data System (ADS)

    Sorba, G.; Binetruy, C.; Syerko, E.; Leygue, A.; Comas-Cardona, S.; Belnoue, J. P.-H.; Nixon-Pearson, O. J.; Ivanov, D. S.; Hallett, S. R.; Advani, S. G.

    2018-05-01

Consolidation is a crucial step in manufacturing of composite parts with prepregs because its role is to eliminate inter- and intra-ply gaps and porosity. Some thermoset prepreg systems are toughened with thermoplastic particles. Depending on their size, thermoplastic particles can be either located in between plies or distributed within the inter-fibre regions. When subjected to transverse compaction, resin will bleed out of low-viscosity unidirectional prepregs along the fibre direction, whereas one would expect transverse squeeze flow to dominate for higher viscosity prepregs. Recent experimental work showed that the consolidation of uncured toughened prepregs involves complex flow and deformation mechanisms where both bleeding and squeeze flow patterns are observed [1]. Micrographs of compacted and cured samples confirm these features as shown in Fig. 1. A phenomenological model was proposed [2] where bleeding flow and squeeze flow are combined. A criterion for the transition from shear flow to resin bleeding was also proposed. However, the micrographs also reveal a resin-rich layer between plies which may be contributing to the complex flow mechanisms during the consolidation process. In an effort to provide additional insight into these complex mechanisms, this work focuses on the 3D numerical modelling of the compaction of uncured toughened prepregs in the cross-ply configuration described in [1]. A transversely isotropic fluid model is used to describe the flow behaviour of the plies, coupled with inter-ply resin flow of an isotropic fluid. The multi-scale flow model used is based on [3, 4]. A numerical parametric study is carried out where the resin viscosity, permeability and inter-ply thickness are varied to identify the role of important variables. The squeezing flow and the bleeding flow are compared for a range of process parameters to investigate the coupling and competition between the two flow mechanisms. 
Figure 4 shows the predicted displacement of the sample edge with the multi-scale compaction model after one time step [3]. The ply distortion and resin flow observed in Fig. 1 are qualitatively retrieved by the computational model.

  1. MOCAT: A Metagenomics Assembly and Gene Prediction Toolkit

    PubMed Central

    Li, Junhua; Chen, Weineng; Chen, Hua; Mende, Daniel R.; Arumugam, Manimozhiyan; Pan, Qi; Liu, Binghang; Qin, Junjie; Wang, Jun; Bork, Peer

    2012-01-01

    MOCAT is a highly configurable, modular pipeline for fast, standardized processing of single or paired-end sequencing data generated by the Illumina platform. The pipeline uses state-of-the-art programs to quality control, map, and assemble reads from metagenomic samples sequenced at a depth of several billion base pairs, and predict protein-coding genes on assembled metagenomes. Mapping against reference databases allows for read extraction or removal, as well as abundance calculations. Relevant statistics for each processing step can be summarized into multi-sheet Excel documents and queryable SQL databases. MOCAT runs on UNIX machines and integrates seamlessly with the SGE and PBS queuing systems, commonly used to process large datasets. The open source code and modular architecture allow users to modify or exchange the programs that are utilized in the various processing steps. Individual processing steps and parameters were benchmarked and tested on artificial, real, and simulated metagenomes resulting in an improvement of selected quality metrics. MOCAT can be freely downloaded at http://www.bork.embl.de/mocat/. PMID:23082188
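MOCAT's exchangeable-steps design can be caricatured as an ordered list of step functions that users swap in or out. The step names, filters, and thresholds below are illustrative stand-ins, not MOCAT's actual modules.

```python
# Hedged sketch of a modular pipeline with exchangeable processing steps,
# in the spirit of MOCAT's architecture (step names are illustrative only).
def quality_control(reads):
    """Drop reads shorter than a minimum length (stand-in for read QC)."""
    return [r for r in reads if len(r) >= 30]

def remove_host(reads):
    """Stand-in for mapping against a reference database for read removal."""
    return [r for r in reads if not r.startswith("HOST")]

PIPELINE = [quality_control, remove_host]  # users can swap steps in or out

def run(reads, steps=PIPELINE):
    """Thread the data through each configured step in order."""
    for step in steps:
        reads = step(reads)
    return reads
```

Because each step shares the same list-in/list-out interface, replacing one program with another only requires editing the `PIPELINE` list, mirroring the modularity the abstract describes.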

  2. MOCAT: a metagenomics assembly and gene prediction toolkit.

    PubMed

    Kultima, Jens Roat; Sunagawa, Shinichi; Li, Junhua; Chen, Weineng; Chen, Hua; Mende, Daniel R; Arumugam, Manimozhiyan; Pan, Qi; Liu, Binghang; Qin, Junjie; Wang, Jun; Bork, Peer

    2012-01-01

    MOCAT is a highly configurable, modular pipeline for fast, standardized processing of single or paired-end sequencing data generated by the Illumina platform. The pipeline uses state-of-the-art programs to quality control, map, and assemble reads from metagenomic samples sequenced at a depth of several billion base pairs, and predict protein-coding genes on assembled metagenomes. Mapping against reference databases allows for read extraction or removal, as well as abundance calculations. Relevant statistics for each processing step can be summarized into multi-sheet Excel documents and queryable SQL databases. MOCAT runs on UNIX machines and integrates seamlessly with the SGE and PBS queuing systems, commonly used to process large datasets. The open source code and modular architecture allow users to modify or exchange the programs that are utilized in the various processing steps. Individual processing steps and parameters were benchmarked and tested on artificial, real, and simulated metagenomes resulting in an improvement of selected quality metrics. MOCAT can be freely downloaded at http://www.bork.embl.de/mocat/.

  3. A multi-step chromatographic strategy to purify three fungal endo-β-glucanases.

    PubMed

    McCarthy, Tracey; Tuohy, Maria G

    2011-01-01

    Fungi and fungal enzymes have traditionally occupied a central role in biotechnology. Understanding the biochemical properties of the variety of enzymes produced by these eukaryotes has been an area of research interest for decades and again more recently due to global interest in greener bio-production technologies. Purification of an individual enzyme allows its unique biochemical and functional properties to be determined, can provide key information as to the role of individual biocatalysts within a complex enzyme system, and can inform both protein engineering and enzyme production strategies in the development of novel green technologies based on fungal biocatalysts. Many enzymes of current biotechnological interest are secreted by fungi into the extracellular culture medium. These crude enzyme mixtures are typically complex, multi-component, and generally also contain other non-enzymatic proteins and secondary metabolites. In this chapter, we describe a multi-step chromatographic strategy required to isolate three new endo-β-glucanases (denoted EG V, EG VI, and EG VII) with activity against cereal mixed-linkage β-glucans from the thermophilic fungus Talaromyces emersonii. This work also illustrates the challenges frequently involved in isolating individual extracellular fungal proteins in general.

  4. Stability of discrete time recurrent neural networks and nonlinear optimization problems.

    PubMed

    Singh, Jayant; Barabanov, Nikita

    2016-02-01

We consider the method of Reduction of Dissipativity Domain to prove global Lyapunov stability of Discrete Time Recurrent Neural Networks. The standard and advanced criteria for Absolute Stability of these essentially nonlinear systems produce rather weak results. The method mentioned above is proved to be more powerful. It involves a multi-step procedure with maximization of special nonconvex functions over polytopes at every step. We derive conditions which guarantee the existence of at most one point of local maximum for such functions over every hyperplane. This nontrivial result is valid for a wide range of neuron transfer functions. Copyright © 2015 Elsevier Ltd. All rights reserved.
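For context on what is being proved, a minimal sketch of a much cruder sufficient condition follows: a discrete-time recurrent network x(t+1) = tanh(W x(t)) is globally asymptotically stable at the origin when a norm of W is below one. The weights are illustrative; the dissipativity-domain method of the abstract is a sharper test than this.

```python
import math

# Illustrative weight matrix with infinity-norm 0.7 < 1, so every iteration
# contracts the state toward the origin regardless of the initial condition.
W = [[0.4, -0.3],
     [0.2, 0.5]]

def step(x):
    """One update of the recurrent network x(t+1) = tanh(W x(t))."""
    return [math.tanh(sum(w * xi for w, xi in zip(row, x))) for row in W]

x = [0.9, -0.8]          # arbitrary initial state
for _ in range(200):
    x = step(x)          # converges to the unique equilibrium at the origin
```

Norm conditions like this are conservative; the point of the dissipativity-domain method is to certify stability for networks that fail such simple tests.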

  5. Setting the Scope of Concept Inventories for Introductory Computing Subjects

    ERIC Educational Resources Information Center

    Goldman, Ken; Gross, Paul; Heeren, Cinda; Herman, Geoffrey L.; Kaczmarczyk, Lisa; Loui, Michael C.; Zilles, Craig

    2010-01-01

    A concept inventory is a standardized assessment tool intended to evaluate a student's understanding of the core concepts of a topic. In order to create a concept inventory it is necessary to accurately identify these core concepts. A Delphi process is a structured multi-step process that uses a group of experts to achieve a consensus opinion. We…

  6. Batch and multi-step fed-batch enzymatic saccharification of Formiline-pretreated sugarcane bagasse at high solid loadings for high sugar and ethanol titers.

    PubMed

    Zhao, Xuebing; Dong, Lei; Chen, Liang; Liu, Dehua

    2013-05-01

Formiline pretreatment is a biomass fractionation process. In the present work, Formiline-pretreated sugarcane bagasse was hydrolyzed with cellulases by batch and multi-step fed-batch processes at 20% solid loading. For wet pulp, after 144 h incubation with a cellulase loading of 10 FPU/g dry solid, the fed-batch process obtained ~150 g/L glucose and ~80% glucan conversion, while the batch process obtained ~130 g/L glucose with corresponding ~70% glucan conversion. Solid loading could be further increased to 30% for the acetone-dried pulp. By fed-batch hydrolysis of the dried pulp in pH 4.8 buffer solution, glucose concentration reached 247.3±1.6 g/L with corresponding 86.1±0.6% glucan conversion. The enzymatic hydrolyzates were well converted to ethanol by a subsequent fermentation using Saccharomyces cerevisiae, with ethanol titers of 60-70 g/L. Batch and fed-batch SSF indicated that the Formiline-pretreated substrate showed excellent fermentability. The final ethanol concentration was 80 g/L, corresponding to 82.7% of theoretical yield. Copyright © 2012 Elsevier Ltd. All rights reserved.
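The relation between glucose titer and glucan conversion can be checked with the anhydroglucose correction factor 162/180 (glucose at 180 g/mol hydrates glucan units of 162 g/mol). The solids glucan fraction used below is a back-calculated assumption, not a value stated in the abstract.

```python
def glucan_conversion(glucose_g_per_l, solids_g_per_l, glucan_fraction):
    """Glucan conversion implied by a glucose titer: glucose mass is
    converted back to glucan-equivalent mass via the 162/180 factor,
    then divided by the glucan initially present in the solids."""
    glucan = solids_g_per_l * glucan_fraction
    return (glucose_g_per_l * 162.0 / 180.0) / glucan

# 247.3 g/L glucose at 30% (300 g/L) solids matches the reported ~86.1%
# conversion if the pulp is roughly 86% glucan (assumption: Formiline
# fractionation yields a highly cellulose-enriched pulp).
conv = glucan_conversion(247.3, 300.0, 0.862)
```

The reported titer and conversion are thus mutually consistent for a cellulose-enriched pulp, which supports the fractionation claim.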

  7. A unified inversion scheme to process multifrequency measurements of various dispersive electromagnetic properties

    NASA Astrophysics Data System (ADS)

    Han, Y.; Misra, S.

    2018-04-01

Multi-frequency measurement of a dispersive electromagnetic (EM) property, such as electrical conductivity, dielectric permittivity, or magnetic permeability, is commonly analyzed for purposes of material characterization. Such an analysis requires inversion of the multi-frequency measurement based on a specific relaxation model, such as the Cole-Cole model or Pelton's model. We develop a unified inversion scheme that can be coupled to various types of relaxation models to independently process multi-frequency measurements of varied EM properties for improved EM-based geomaterial characterization. The proposed inversion scheme is first tested on a few synthetic cases in which different relaxation models are coupled into the inversion scheme, and then applied to multi-frequency complex conductivity, complex resistivity, complex permittivity, and complex impedance measurements. The method estimates up to seven relaxation-model parameters, exhibiting convergence and accuracy for random initializations of the relaxation-model parameters within up to three orders of magnitude variation around the true parameter values. The proposed inversion method implements a bounded Levenberg algorithm with tuned initial values of the damping parameter and its iterative adjustment factor, which are fixed in all the cases shown in this paper irrespective of the type of measured EM property and the type of relaxation model. Notably, a jump-out step and a jump-back-in step are implemented as automated methods in the inversion scheme to prevent the inversion from getting trapped around local minima and to honor physical bounds of the model parameters. The proposed inversion scheme can be easily used to process various types of EM measurements without major changes to the inversion scheme.
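A deliberately reduced sketch of the bounded Levenberg idea follows: a one-parameter Debye relaxation stands in for the multi-parameter Cole-Cole/Pelton models, damping is adapted by accept/reject, and clamping to [lo, hi] crudely mimics the jump-back-in step. All numbers are illustrative, not the paper's settings.

```python
def debye(omega, tau):
    """Real part of a single Debye relaxation; a one-parameter stand-in
    for the Cole-Cole / Pelton models handled by the paper's scheme."""
    return 1.0 / (1.0 + (omega * tau) ** 2)

def fit_tau(omegas, data, tau0, lo=1e-4, hi=1e4,
            lam=1e-2, up=10.0, down=0.1, iters=200):
    """Scalar bounded Levenberg fit of the relaxation time tau. lam is the
    damping parameter with adjustment factors up/down; updates are clamped
    back into [lo, hi], a crude version of the jump-back-in step."""
    tau = tau0
    for _ in range(iters):
        r = [d - debye(w, tau) for w, d in zip(omegas, data)]
        # analytic derivative of the model with respect to tau
        j = [-2.0 * w * w * tau / (1.0 + (w * tau) ** 2) ** 2 for w in omegas]
        jtj = sum(x * x for x in j)
        jtr = sum(x * y for x, y in zip(j, r))
        new_tau = min(max(tau + jtr / (jtj + lam), lo), hi)  # damped, bounded step
        old_cost = sum(x * x for x in r)
        new_cost = sum((d - debye(w, new_tau)) ** 2
                       for w, d in zip(omegas, data))
        if new_cost < old_cost:
            tau, lam = new_tau, lam * down   # accept, relax damping
        else:
            lam *= up                        # reject, increase damping
    return tau

omegas = [0.01, 0.03, 0.1, 0.3, 1.0, 3.0, 10.0]
data = [debye(w, 2.0) for w in omegas]       # noiseless synthetic data
tau_hat = fit_tau(omegas, data, tau0=0.002)  # start three decades off
```

Even with an initialization three orders of magnitude from the truth, the damped, bounded iteration recovers tau, echoing the robustness to random initialization reported in the abstract.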

  8. Delamination detection by Multi-Level Wavelet Processing of Continuous Scanning Laser Doppler Vibrometry data

    NASA Astrophysics Data System (ADS)

    Chiariotti, P.; Martarelli, M.; Revel, G. M.

    2017-12-01

    A novel non-destructive testing procedure for delamination detection is presented, based on the simultaneous time and spatial sampling provided by Continuous Scanning Laser Doppler Vibrometry (CSLDV) and the feature-extraction capability of multi-level wavelet-based processing. The processing procedure is a multi-step approach. Once the optimal mother wavelet is selected from the candidate mother-wavelet space as the one maximizing the Energy to Shannon Entropy Ratio criterion, a pruning operation identifies the best combination of nodes inside the full binary tree given by Wavelet Packet Decomposition (WPD). The two-step pruning algorithm combines a measure of the randomness of the point-pattern distribution on the damage-map space with an analysis of the energy concentration of the wavelet coefficients on the nodes retained by the first pruning step. Combining the point-pattern distributions provided by each node of the ensemble node set from the pruning algorithm yields a Damage Reliability Index associated with the final damage map. The effectiveness of the whole approach is proven on both simulated and real test cases. A sensitivity analysis of the influence of noise on the CSLDV signal provided to the algorithm is also discussed, showing that the processing is robust to measurement noise. The method is promising: damage is well identified on different materials and for different damage-structure combinations.
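    The mother-wavelet selection criterion named above, the Energy to Shannon Entropy Ratio, can be sketched compactly. This is a hedged illustration, not the authors' implementation: it assumes the criterion is evaluated on the coefficient vector each candidate mother wavelet produces (the WPD tree and the pruning steps are not shown), and the candidate names are hypothetical.

```python
import numpy as np

def energy_entropy_ratio(coeffs):
    """Energy to Shannon Entropy Ratio of a coefficient vector: energy
    concentrated in few coefficients gives low entropy, hence a high
    ratio. Assumes the coefficients are not all zero."""
    e = np.asarray(coeffs, dtype=float) ** 2
    energy = e.sum()
    p = e[e > 0] / energy                  # normalized energy distribution
    entropy = -np.sum(p * np.log2(p))      # Shannon entropy (bits)
    return energy / max(entropy, 1e-12)    # guard the zero-entropy case

def best_mother_wavelet(candidates):
    """Pick the candidate (name -> coefficient vector) that maximizes
    the ratio; the names used here are hypothetical."""
    return max(candidates, key=lambda k: energy_entropy_ratio(candidates[k]))
```

    With equal total energy, a vector whose energy sits in one coefficient scores higher than one whose energy is spread uniformly, which is exactly the "maximizing" behaviour the selection step relies on.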

  9. Multi-GPU implementation of a VMAT treatment plan optimization algorithm

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tian, Zhen, E-mail: Zhen.Tian@UTSouthwestern.edu, E-mail: Xun.Jia@UTSouthwestern.edu, E-mail: Steve.Jiang@UTSouthwestern.edu; Folkerts, Michael; Tan, Jun

    Purpose: Volumetric modulated arc therapy (VMAT) optimization is a computationally challenging problem due to its large data size, high degrees of freedom, and many hardware constraints. High-performance graphics processing units (GPUs) have been used to speed up the computations. However, a GPU's relatively small memory cannot handle cases with a large dose-deposition coefficient (DDC) matrix, e.g., those with a large target size, multiple targets, multiple arcs, and/or a small beamlet size. The main purpose of this paper is to report an implementation of a column-generation-based VMAT algorithm, previously developed in the authors' group, on a multi-GPU platform to solve the memory limitation problem. While the column-generation-based VMAT algorithm has been previously developed, the GPU implementation details have not been reported. Hence, another purpose is to present the detailed techniques employed for the GPU implementation. The authors also use this particular problem as an example to study the feasibility of using a multi-GPU platform to solve large-scale problems in medical physics. Methods: The column-generation approach generates VMAT apertures sequentially by solving a pricing problem (PP) and a master problem (MP) iteratively. In the authors' method, the sparse DDC matrix is first stored on a CPU in coordinate list (COO) format. On the GPU side, this matrix is split into four submatrices according to beam angles, which are stored on four GPUs in compressed sparse row (CSR) format. Computation of the beamlet price, the first step of the PP, is accomplished using multiple GPUs. Fast inter-GPU data transfer is accomplished using peer-to-peer access. The remaining steps of the PP and the MP are implemented on the CPU or a single GPU due to their modest problem scale and computational load. The Barzilai-Borwein algorithm with a subspace step scheme is adopted to solve the MP.
A head and neck (H and N) cancer case is then used to validate the authors' method. The authors also compare their multi-GPU implementation with three different single-GPU implementation strategies, i.e., truncating the DDC matrix (S1), repeatedly transferring the DDC matrix between CPU and GPU (S2), and porting computations involving the DDC matrix to the CPU (S3), in terms of both plan quality and computational efficiency. Two more H and N patient cases and three prostate cases are used to demonstrate the advantages of the authors' method. Results: The authors' multi-GPU implementation can finish the optimization process within ∼1 min for the H and N patient case. S1 leads to inferior plan quality, although its total time was 10 s shorter than the multi-GPU implementation due to the reduced matrix size. S2 and S3 yield the same plan quality as the multi-GPU implementation but take ∼4 and ∼6 min, respectively. High computational efficiency was consistently achieved for the other five patient cases tested, with VMAT plans of clinically acceptable quality obtained within 23–46 s. Conversely, to obtain clinically comparable or acceptable plans for all six of the VMAT cases tested in this paper, the optimization time needed in a commercial TPS system on CPU was found to be on the order of several minutes. Conclusions: The results demonstrate that the multi-GPU implementation of the authors' column-generation-based VMAT optimization can handle the large-scale VMAT optimization problem efficiently without sacrificing plan quality. The authors' study may serve as an example to shed some light on other large-scale medical physics problems that require multi-GPU techniques.
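    The data layout described in the Methods, a sparse DDC matrix held in COO format on the CPU and split by beam angle into per-GPU CSR submatrices, can be sketched in NumPy. This is an illustrative sketch only: the angle-to-GPU assignment rule and the matrix sizes are made-up stand-ins, and the actual inter-GPU transfers and pricing kernels are not shown.

```python
import numpy as np

def coo_to_csr(rows, cols, vals, n_rows):
    """Convert COO triplets to CSR arrays (indptr, indices, data)."""
    order = np.argsort(rows, kind="stable")   # group entries by row
    rows, cols, vals = rows[order], cols[order], vals[order]
    indptr = np.zeros(n_rows + 1, dtype=np.int64)
    np.add.at(indptr, rows + 1, 1)            # count entries per row
    return np.cumsum(indptr), cols, vals

def split_ddc_by_angle(rows, cols, vals, beamlet_angle, n_rows, n_gpus):
    """Partition DDC columns (beamlets) by beam angle into one CSR
    submatrix per GPU. The angle -> GPU rule here (modulo) is a toy
    stand-in for the authors' four-way angular split."""
    gpu_of = beamlet_angle % n_gpus
    subs = []
    for g in range(n_gpus):
        m = gpu_of[cols] == g
        subs.append(coo_to_csr(rows[m], cols[m], vals[m], n_rows))
    return subs
```

    CSR keeps each voxel row contiguous, which is what makes the per-GPU beamlet-price products cheap; the split is lossless, since every COO entry lands in exactly one angular submatrix.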

  10. Implementation of Competency-Based Pharmacy Education (CBPE)

    PubMed Central

    Koster, Andries; Schalekamp, Tom; Meijerman, Irma

    2017-01-01

    Implementation of competency-based pharmacy education (CBPE) is a time-consuming, complicated process, which requires agreement on the tasks of a pharmacist, commitment, institutional stability, and a goal-directed developmental perspective among all stakeholders involved. In this article the main steps in the development of a fully developed competency-based pharmacy curriculum (bachelor, master) are described and tips are given for a successful implementation. After the decision to adopt CBPE is made and a competency framework is chosen (step 1), intended learning outcomes are defined (step 2), followed by analysis of the required developmental trajectory (step 3) and the selection of appropriate assessment methods (step 4). Designing the teaching-learning environment involves the selection of learning activities, student experiences, and instructional methods (step 5). Finally, an iterative process of evaluation and adjustment of individual courses, and of the curriculum as a whole, is entered (step 6). Successful implementation of CBPE requires a system of effective quality management and continuous professional development of teachers. In this article we also give suggestions for the organization of CBPE and references to more detailed literature, in the hope of facilitating the implementation of CBPE. PMID:28970422

  11. Introduction to Remote Sensing Image Registration

    NASA Technical Reports Server (NTRS)

    Le Moigne, Jacqueline

    2017-01-01

    For many applications, accurate and fast image registration of large amounts of multi-source data is the necessary first step before subsequent processing and integration. Image registration comprises several steps, and each step can be approached by various methods, which all present distinct advantages and drawbacks depending on the type of data, the type of application, the a priori information known about the data, and the accuracy required. This paper first presents a general overview of remote sensing image registration and then goes over a few specific methods and their applications.

  12. Fox Valley Technical College Quality First Process Model.

    ERIC Educational Resources Information Center

    Fox Valley Technical Coll., Appleton, WI.

    An overview is provided of the Quality First Process Model developed by Fox Valley Technical College (FVTC), Wisconsin, to provide guidelines for quality instruction and service consistent with the highest educational standards. The 16-step model involves activities that should be adaptable to any organization. The steps of the quality model are…

  13. Teaching Statistics from the Operating Table: Minimally Invasive and Maximally Educational

    ERIC Educational Resources Information Center

    Nowacki, Amy S.

    2015-01-01

    Statistics courses that focus on data analysis in isolation, discounting the scientific inquiry process, may not motivate students to learn the subject. By involving students in other steps of the inquiry process, such as generating hypotheses and data, students may become more interested and vested in the analysis step. Additionally, such an…

  14. Development of Regulatory Documents for Creation (Upgrade) of Physical Protection Systems under the Russian/American MPC&A Program

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Izmaylov, Alexandr V.; Babkin, Vladimir; Kurov, Valeriy

    2009-10-07

    The development of new or the upgrade of existing physical protection systems (PPS) for nuclear facilities involves a multi-step and multidimensional process. The process consists of conceptual design, design, and commissioning stages. The activities associated with each of these stages are governed by Russian government and agency regulations. To ensure a uniform approach to developing or upgrading PPS at Russian nuclear facilities, the development of a range of regulatory and methodological documents is necessary. Some issues of PPS development are covered by the regulatory documents developed by Rosatom, as well as by other Russian agencies with nuclear facilities under their control. This regulatory development has been accomplished as part of the U.S.-Russian MPC&A cooperation or independently by the Russian Federation. While regulatory coverage is extensive, there are a number of issues, such as vulnerability analysis, effectiveness assessment, PPS upgrades, and protection of information systems for PPS, that require additional regulations to be developed. This paper reports on the status of regulatory coverage for PPS development and upgrade, and outlines a new approach to regulatory document development. It describes the evolutionary process of regulatory development through experience gained in the design, development, and implementation of PPS, as well as through the cooperative efforts of Russian and U.S. experts involved in the development of MPC&A regulations.

  15. Authorship issues in multi-centre clinical trials: the importance of making an authorship contract.

    PubMed

    Rosenberg, Jacob; Burcharth, Jakob; Pommergaard, Hans-Christian; Vinther, Siri

    2015-02-01

    Discussions about authorship often arise in multi-centre clinical trials. Such trials may involve up to hundreds of contributors, of whom some will eventually co-author the final publication. It is, however, often impossible to involve all contributors in the manuscript process sufficiently for them to qualify for authorship as defined by the International Committee of Medical Journal Editors. Therefore, rules for authorship in multi-centre trials are strongly recommended. We propose two contracts to prevent conflicts regarding authorship; both are freely available for use, provided the original source is referenced.

  16. On the tandem Morita-Baylis-Hillman/transesterification processes. Mechanistic insights for the role of protic solvents

    NASA Astrophysics Data System (ADS)

    Carpanez, Arthur G.; Coelho, Fernando; Amarante, Giovanni W.

    2018-02-01

    Despite the remarkable rate acceleration in protic solvents such as alcohols and water, the use of acrylates as activated alkenes poses a problem due to the possibility of ester hydrolysis or transesterification. Therefore, the tandem transesterification/Morita-Baylis-Hillman (MBH) reactions were investigated by ESI(+)-MS/(MS) and 1H NMR techniques. For the first time, the MBH back-reaction was fully examined by ESI(+)-MS/(MS) using labelled reagents, revealing the complex equilibrium involving the Michael-type addition step of DABCO to the acrylate. C- and O-protonation were observed at this stage, showing that the transesterification process occurs prior to the aldol step, which is the rate-determining step of the mechanism. A short-lived tetrahedral intermediate might be involved at this stage and should be considered in these processes.

  17. Performance monitoring and response conflict resolution associated with choice stepping reaction tasks.

    PubMed

    Watanabe, Tatsunori; Tsutou, Kotaro; Saito, Kotaro; Ishida, Kazuto; Tanabe, Shigeo; Nojima, Ippei

    2016-11-01

    Choice reaction requires response conflict resolution, and the resolution processes that occur during a choice stepping reaction task undertaken in a standing position, which requires maintenance of balance, may be different to those processes occurring during a choice reaction task performed in a seated position. The study purpose was to investigate the resolution processes during a choice stepping reaction task at the cortical level using electroencephalography and compare the results with a control task involving ankle dorsiflexion responses. Twelve young adults either stepped forward or dorsiflexed the ankle in response to a visual imperative stimulus presented on a computer screen. We used the Simon task and examined the error-related negativity (ERN) that follows an incorrect response and the correct-response negativity (CRN) that follows a correct response. Error was defined as an incorrect initial weight transfer for the stepping task and as an incorrect initial tibialis anterior activation for the control task. Results revealed that ERN and CRN amplitudes were similar in size for the stepping task, whereas the amplitude of ERN was larger than that of CRN for the control task. The ERN amplitude was also larger in the stepping task than the control task. These observations suggest that a choice stepping reaction task involves a strategy emphasizing post-response conflict and general performance monitoring of actual and required responses and also requires greater cognitive load than a choice dorsiflexion reaction. The response conflict resolution processes appear to be different for stepping tasks and reaction tasks performed in a seated position.

  18. Tracking Virus Particles in Fluorescence Microscopy Images Using Multi-Scale Detection and Multi-Frame Association.

    PubMed

    Jaiswal, Astha; Godinez, William J; Eils, Roland; Lehmann, Maik Jorg; Rohr, Karl

    2015-11-01

    Automatic fluorescent particle tracking is an essential task to study the dynamics of a large number of biological structures at a sub-cellular level. We have developed a probabilistic particle tracking approach based on multi-scale detection and two-step multi-frame association. The multi-scale detection scheme allows coping with particles in close proximity. For finding associations, we have developed a two-step multi-frame algorithm, which is based on a temporally semiglobal formulation as well as spatially local and global optimization. In the first step, reliable associations are determined for each particle individually in local neighborhoods. In the second step, the global spatial information over multiple frames is exploited jointly to determine optimal associations. The multi-scale detection scheme and the multi-frame association finding algorithm have been combined with a probabilistic tracking approach based on the Kalman filter. We have successfully applied our probabilistic tracking approach to synthetic as well as real microscopy image sequences of virus particles and quantified the performance. We found that the proposed approach outperforms previous approaches.
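    As a minimal sketch of the Kalman-filter side of such a tracker (the multi-scale detection and the two-step multi-frame association of the abstract are not reproduced here), the following follows a single particle through per-frame detection lists with a constant-velocity model and a nearest-neighbor gate; all parameter values are assumptions.

```python
import numpy as np

def kalman_track(detections, q=1e-3, r=1e-2, gate=1.0):
    """Track one particle along a sequence of 1-D detection lists with a
    constant-velocity Kalman filter; at each frame the detection nearest
    to the prediction (within `gate`) updates the state, so distant
    clutter detections are ignored."""
    F = np.array([[1.0, 1.0], [0.0, 1.0]])   # state: [position, velocity]
    H = np.array([[1.0, 0.0]])               # we observe position only
    Q = q * np.eye(2)
    R = np.array([[r]])
    x = np.array([detections[0][0], 0.0])
    P = np.eye(2)
    track = [x[0]]
    for frame in detections[1:]:
        x = F @ x                             # predict state
        P = F @ P @ F.T + Q                   # predict covariance
        cand = np.asarray(frame, dtype=float)
        d = np.abs(cand - x[0])
        j = np.argmin(d)                      # nearest-neighbor association
        if d[j] < gate:
            S = H @ P @ H.T + R               # innovation covariance (1x1)
            K = (P @ H.T) / S                 # Kalman gain
            x = x + (K * (cand[j] - x[0])).ravel()
            P = (np.eye(2) - K @ H) @ P
        track.append(x[0])
    return np.array(track)
```

    A full tracker would run one such filter per particle and resolve competing associations jointly across frames, which is where the two-step local/global association of the abstract comes in.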

  19. Preparatory steps for a robust dynamic model for organically bound tritium dynamics in agricultural crops

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Melintescu, A.; Galeriu, D.; Diabate, S.

    2015-03-15

    The processes involved in tritium transfer in crops are complex and regulated by many feedback mechanisms. A full mechanistic model is difficult to develop due to the complexity of the processes involved in tritium transfer and of the environmental conditions. First, existing models (ORYZA2000, CROPTRIT and WOFOST) are reviewed, presenting their features and limits. Secondly, the preparatory steps for a robust model are discussed, considering the role of dry matter and of the photosynthesis contribution to the OBT (Organically Bound Tritium) dynamics in crops.

  20. A Multi-organisational Approach to Service Delivery

    NASA Astrophysics Data System (ADS)

    Purchase, Valerie; Mills, John; Parry, Glenn

    Who is involved in delivering a service? There has been growing recognition in a wide variety of contexts that service is increasingly being delivered by multi- rather than single-organisational entities. Such recognition is evident not only in our experience but in a number of areas of literature including strategy development, core competence analysis, operations and supply chain management, and is reflected in and further facilitated by ICT developments. Customers have always been involved in some degree in the process of value delivery and such involvement is increasing to include complex co-creation of value. Such interactions are challenging when they involve individual customers, however, this becomes ever more challenging when the 'customer' is another organisation or when there are multiple 'customers'. Within this chapter we will consider some of the key drivers for a multi-organisational approach to service delivery; examine the ways in which the parties involved in service co-creation have expanded to include multiple service providers and customers; and finally, identify some of the challenges created by a multi-organisational approach to service delivery.

  1. Toward Transparent Data Management in Multi-layer Storage Hierarchy for HPC Systems

    DOE PAGES

    Wadhwa, Bharti; Byna, Suren; Butt, Ali R.

    2018-04-17

    Upcoming exascale high performance computing (HPC) systems are expected to comprise a multi-tier storage hierarchy, and thus will necessitate innovative storage and I/O mechanisms. Traditional disk- and block-based interfaces and file systems face severe challenges in utilizing the capabilities of storage hierarchies due to the lack of hierarchy support and semantic interfaces. Object-based and semantically-rich data abstractions for scientific data management on large scale systems offer a sustainable solution to these challenges. Such data abstractions can also simplify users' involvement in data movement. Here, we take the first steps toward realizing such an object abstraction and explore storage mechanisms for these objects to enhance I/O performance, especially for scientific applications. We explore how an object-based interface can facilitate next generation scalable computing systems by presenting the mapping of data I/O from two real-world HPC scientific use cases: a plasma physics simulation code (VPIC) and a cosmology simulation code (HACC). Our storage model stores data objects in different physical organizations to support data movement across the layers of the memory/storage hierarchy. Our implementation scales well to 16K parallel processes and, compared to the state of the art, such as MPI-IO and HDF5, our object-based data abstractions and data placement strategy in a multi-level storage hierarchy achieve up to 7X I/O performance improvement for scientific data.

  2. Toward Transparent Data Management in Multi-layer Storage Hierarchy for HPC Systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wadhwa, Bharti; Byna, Suren; Butt, Ali R.

    Upcoming exascale high performance computing (HPC) systems are expected to comprise a multi-tier storage hierarchy, and thus will necessitate innovative storage and I/O mechanisms. Traditional disk- and block-based interfaces and file systems face severe challenges in utilizing the capabilities of storage hierarchies due to the lack of hierarchy support and semantic interfaces. Object-based and semantically-rich data abstractions for scientific data management on large scale systems offer a sustainable solution to these challenges. Such data abstractions can also simplify users' involvement in data movement. Here, we take the first steps toward realizing such an object abstraction and explore storage mechanisms for these objects to enhance I/O performance, especially for scientific applications. We explore how an object-based interface can facilitate next generation scalable computing systems by presenting the mapping of data I/O from two real-world HPC scientific use cases: a plasma physics simulation code (VPIC) and a cosmology simulation code (HACC). Our storage model stores data objects in different physical organizations to support data movement across the layers of the memory/storage hierarchy. Our implementation scales well to 16K parallel processes and, compared to the state of the art, such as MPI-IO and HDF5, our object-based data abstractions and data placement strategy in a multi-level storage hierarchy achieve up to 7X I/O performance improvement for scientific data.

  3. Design and Implementation of the ARTEMIS Lunar Transfer Using Multi-Body Dynamics

    NASA Technical Reports Server (NTRS)

    Folta, David; Woodard, Mark; Sweetser, Theodore; Broschart, Stephen B.; Cosgrove, Daniel

    2011-01-01

    The use of multi-body dynamics to design the transfer of spacecraft from Earth elliptical orbits to Earth-Moon libration (L(sub 1) and L(sub 2)) orbits has been successfully demonstrated by the Acceleration, Reconnection, Turbulence and Electrodynamics of the Moon's Interaction with the Sun (ARTEMIS) mission. Operational support of the two ARTEMIS spacecraft is the final step in the realization of a design process that can be used to transfer spacecraft with restrictive operational constraints and fuel limitations. The focus of this paper is to describe in detail the processes and implementation of this successful approach.

  4. Steps in the open space planning process

    Treesearch

    Stephanie B. Kelly; Melissa M. Ryan

    1995-01-01

    This paper presents the steps involved in developing an open space plan. The steps are generic in that the methods may be applied to communities of various sizes. The intent is to provide a framework for developing an open space plan that meets Massachusetts requirements for funding of open space acquisition.

  5. Syntactic and semantic restrictions on morphological recomposition: MEG evidence from Greek.

    PubMed

    Neophytou, K; Manouilidou, C; Stockall, L; Marantz, A

    2018-05-16

    Complex morphological processing has been extensively studied in the past decades. However, most of this work has either focused on only certain steps involved in this process or been conducted on a few languages, like English. The purpose of the present study is to investigate the spatiotemporal cortical processing profile of the distinct steps previously reported in the literature, from decomposition to re-composition of morphologically complex items, in a relatively understudied language, Greek. Using magnetoencephalography, we confirm the role of the fusiform gyrus in early, form-based morphological decomposition; we relate the syntactic licensing of stem-suffix combinations to the ventral visual processing stream, somewhat independent of lexical access for the stem; and we further elucidate the role of orbitofrontal regions in semantic composition. Thus, the current study offers the most comprehensive test to date of visual morphological processing and additional, crosslinguistic validation of the steps involved in it. Copyright © 2018 The Authors. Published by Elsevier Inc. All rights reserved.

  6. Comparison of Photoluminescence Imaging on Starting Multi-Crystalline Silicon Wafers to Finished Cell Performance: Preprint

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Johnston, S.; Yan, F.; Dorn, D.

    2012-06-01

    Photoluminescence (PL) imaging techniques can be applied to multicrystalline silicon wafers throughout the manufacturing process. Both band-to-band PL and defect-band emissions, which are longer-wavelength emissions from sub-bandgap transitions, are used to characterize wafer quality and defect content on starting multicrystalline silicon wafers and on neighboring wafers processed at each step through completion of finished cells. Both PL imaging techniques spatially highlight defect regions that represent dislocations and defect clusters. The relative intensities of these imaged defect regions change with processing. Band-to-band PL on wafers in the later steps of processing shows good correlation to cell quality and performance. The defect-band images show regions that change relative intensity through processing, and better correlation to cell efficiency and reverse-bias breakdown is more evident at the starting-wafer stage than at later process steps. We show that thermal processing in the 200-400 °C range causes impurities to diffuse to different defect regions, changing their relative defect-band emissions.

  7. Synchrotron x-ray study of a low-roughness and high-efficiency K2CsSb photocathode during film growth

    DOE PAGES

    Xie, Junqi; Demarteau, Marcel; Wagner, Robert; ...

    2017-04-24

    Reduction of roughness to the nm level is critical to achieving the ultimate performance from photocathodes used in high-gradient fields. The thrust of this paper is to explore the evolution of roughness during sequential growth, and to show that deposition of multilayer structures consisting of very thin reacted layers results in an nm-level smooth photocathode. Synchrotron x-ray methods were applied to study the multi-step growth process of a high-efficiency K2CsSb photocathode. We observed a transition point of the Sb film grown on Si at a film thickness of ~40 Å with the substrate temperature at 100 °C and the growth rate at 0.1 Å/s. The final K2CsSb photocathode exhibits a thickness of around five times that of the total deposited Sb film, regardless of how the Sb film was grown. The film surface roughening process occurs first at the step when K diffuses into the crystalline Sb. Furthermore, the photocathode we obtained from the multi-step growth exhibits roughness an order of magnitude lower than that from the normal sequential process. X-ray diffraction measurements show that the material goes through two structural changes of the crystalline phase during formation, from crystalline Sb to K3Sb and finally to K2CsSb.

  8. Synchrotron x-ray study of a low-roughness and high-efficiency K2CsSb photocathode during film growth

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Xie, Junqi; Demarteau, Marcel; Wagner, Robert

    Reduction of roughness to the nm level is critical to achieving the ultimate performance from photocathodes used in high-gradient fields. The thrust of this paper is to explore the evolution of roughness during sequential growth, and to show that deposition of multilayer structures consisting of very thin reacted layers results in an nm-level smooth photocathode. Synchrotron x-ray methods were applied to study the multi-step growth process of a high-efficiency K2CsSb photocathode. We observed a transition point of the Sb film grown on Si at a film thickness of ~40 Å with the substrate temperature at 100 °C and the growth rate at 0.1 Å/s. The final K2CsSb photocathode exhibits a thickness of around five times that of the total deposited Sb film, regardless of how the Sb film was grown. The film surface roughening process occurs first at the step when K diffuses into the crystalline Sb. Furthermore, the photocathode we obtained from the multi-step growth exhibits roughness an order of magnitude lower than that from the normal sequential process. X-ray diffraction measurements show that the material goes through two structural changes of the crystalline phase during formation, from crystalline Sb to K3Sb and finally to K2CsSb.

  9. Statistical identification of stimulus-activated network nodes in multi-neuron voltage-sensitive dye optical recordings.

    PubMed

    Fathiazar, Elham; Anemuller, Jorn; Kretzberg, Jutta

    2016-08-01

    Voltage-Sensitive Dye (VSD) imaging is an optical imaging method that allows measuring the graded voltage changes of multiple neurons simultaneously. In neuroscience, this method is used to reveal networks of neurons involved in certain tasks. However, the recorded relative changes in dye fluorescence are usually low, and the signals are superimposed with noise and artifacts. Therefore, establishing a reliable method to identify which cells are activated by specific stimulus conditions is the first step toward identifying functional networks. In this paper, we present a statistical method to identify stimulus-activated network nodes as cells whose activities during sensory network stimulation differ significantly from the un-stimulated control condition. This method is demonstrated on voltage-sensitive dye recordings from up to 100 neurons in a ganglion of the medicinal leech responding to tactile skin stimulation. Without relying on any prior physiological knowledge, the network nodes identified by our statistical analysis were found to match well with published cell types involved in tactile stimulus processing and to be consistent across stimulus conditions and preparations.
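    One simple way to formalize "activities that differ significantly from the un-stimulated control" is a per-cell permutation test on trial labels. The sketch below is not the authors' statistical method, only a plausible stand-in with assumed data shapes (trials x cells) and an uncorrected significance threshold.

```python
import numpy as np

def activated_cells(stim, ctrl, n_perm=2000, alpha=0.05, seed=0):
    """Flag cells whose mean response under stimulation differs from the
    un-stimulated control, via a per-cell permutation test on trial
    labels. stim and ctrl are (n_trials, n_cells) response amplitudes."""
    rng = np.random.default_rng(seed)
    obs = stim.mean(axis=0) - ctrl.mean(axis=0)   # observed mean difference
    pooled = np.vstack([stim, ctrl])
    n_s = stim.shape[0]
    exceed = np.zeros(stim.shape[1])
    for _ in range(n_perm):
        idx = rng.permutation(pooled.shape[0])    # shuffle trial labels
        d = pooled[idx[:n_s]].mean(axis=0) - pooled[idx[n_s:]].mean(axis=0)
        exceed += np.abs(d) >= np.abs(obs)
    p = (exceed + 1) / (n_perm + 1)               # two-sided permutation p
    return p < alpha, p
```

    Because it makes no distributional assumptions about the fluorescence amplitudes, a permutation test of this kind is a natural fit for noisy VSD data; a real analysis would also correct for testing up to 100 cells at once.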

  10. Stable aqueous dispersions of functionalized multi-layer graphene by pulsed underwater plasma exfoliation of graphite

    NASA Astrophysics Data System (ADS)

    Meyer-Plath, Asmus; Beckert, Fabian; Tölle, Folke J.; Sturm, Heinz; Mülhaupt, Rolf

    2016-02-01

    A process was developed for exfoliating graphite particles in water into stably dispersed multi-layer graphene. It uses electrohydraulic shockwaves and the functionalizing effect of solution plasma discharges in water. The discharges were excited by 100 ns high-voltage pulsing of graphite particle chains that bridge an electrode gap. The underwater discharges allow simultaneous exfoliation and chemical functionalization of graphite particles to partially oxidized multi-layer graphene. Exfoliation is caused by shockwaves that result from the rapid evaporation of carbon and water to plasma-excited gas species. Depending on discharge energy and locus of ignition, the shockwaves cause stirring, erosion, exfoliation and/or expansion of graphite flakes. The process was optimized to produce long-term stable aqueous dispersions of multi-layer graphene from graphite in a single process step, without requiring the addition of intercalants, surfactants, binders or special solvents. A setup was developed that allows continuous production of aqueous dispersions of flake-size-selected multi-layer graphenes. Due to the well-preserved sp2-carbon structure, thin films made from the dispersed graphene exhibited high electrical conductivity. Underwater plasma discharge processing exhibits high innovation potential for morphological and chemical modification of carbonaceous materials and surfaces, especially for the generation of stable dispersions of two-dimensional, layered materials.

  11. Using the critical incident technique in community-based participatory research: a case study.

    PubMed

    Belkora, Jeffrey; Stupar, Lauren; O'Donnell, Sara

    2011-01-01

    Successful community-based participatory research involves the community partner in every step of the research process. The primary study for this paper took place in rural, Northern California. Collaborative partners included an academic researcher and two community-based resource centers that provide supportive services to people diagnosed with cancer. This paper describes our use of the Critical Incident Technique (CIT) to conduct community-based participatory research. We ask: Did the CIT facilitate or impede the active engagement of the community in all steps of the study process? We identified factors about the Critical Incident Technique that were either barriers or facilitators to involving the community partner in every step of the research process. Facilitators included the CIT's ability to accommodate involvement from a large spectrum of the community, its flexible design, and its personal approach. Barriers to community engagement included the training required to conduct interviews, the depth of interview probes, and the time required. Overall, our academic-community partners felt that our use of the CIT facilitated community involvement in our community-based participatory research project, where we used it to formally document the forces promoting and inhibiting successful achievement of community aims.

  12. Impact of Scale-Dependent Coupled Processes on Solute Fate and Transport in the Critical Zone: Case Studies Involving Inorganic and Radioactive Contaminants

    NASA Astrophysics Data System (ADS)

    Jardine, P. M.; Gentry, R. W.

    2011-12-01

    Soil, the thin veneer of matter covering the Earth's surface that supports a web of living diversity, is often abused through anthropogenic inputs of toxic waste. This subsurface regime, coupled with life-sustaining surface water and groundwater, is known as the "Critical Zone". The disposal of radioactive and toxic organic and inorganic waste generated by industry and various government agencies has historically involved shallow land burial or the use of surface impoundments in unsaturated soils and sediments. Presently, contaminated sites are closing rapidly, and many remediation strategies have chosen to leave contaminants in place. As such, contaminants will continue to interact with the geosphere, and investigations of long-term changes and interactive processes are imperative for verifying risks. In this presentation we provide a snapshot of subsurface science research from the past 25 years that seeks to provide an improved understanding and predictive capability of multi-scale contaminant fate and transport processes in heterogeneous unsaturated and saturated environments. Investigations focus on coupled hydrological, geochemical, and microbial processes that control reactive contaminant transport and that involve multi-scale fundamental research ranging from the molecular scale (e.g. synchrotrons, electron sources, arrays) to in situ plume interrogation strategies at the macroscopic scale (e.g. geophysics, field biostimulation, coupled-process monitoring). We show how this fundamental research is used to provide multi-process, multi-scale predictive monitoring and modeling tools that can be used at contaminated sites to (1) inform and improve the technical basis for decision making, and (2) assess which sites are amenable to natural attenuation and which would benefit from source zone remedial intervention.

  13. A regenerating ultrasensitive electrochemical impedance immunosensor for the detection of adenovirus.

    PubMed

    Lin, Donghai; Tang, Thompson; Jed Harrison, D; Lee, William E; Jemere, Abebaw B

    2015-06-15

    We report on the development of a regenerable, sensitive immunosensor based on electrochemical impedance spectroscopy for the detection of type 5 adenovirus. The multi-layered immunosensor fabrication involved successive modification steps on gold electrodes: (i) modification with a self-assembled layer of 1,6-hexanedithiol, to which gold nanoparticles were attached via the distal thiol groups; (ii) formation of a self-assembled monolayer of 11-mercaptoundecanoic acid on the gold nanoparticles; (iii) covalent immobilization of a monoclonal anti-adenovirus 5 antibody via an EDC/NHS coupling reaction on the nanoparticles, completing the immunosensor. The immunosensor displayed a very good detection limit of 30 virus particles/ml and a wide linear dynamic range of 10^5. An electrochemical reductive desorption technique was employed to completely desorb the components of the immunosensor surface, then re-assemble the sensing layer and reuse the sensor. On a single electrode, the multi-layered immunosensor could be assembled and disassembled at least 30 times with 87% of the original signal intact. The changes in electrode behavior after each assembly and desorption cycle were investigated by cyclic voltammetry, electrochemical impedance spectroscopy and X-ray photoelectron spectroscopy. Copyright © 2014 Elsevier B.V. All rights reserved.

  14. Global Manufacturing of CAR T Cell Therapy.

    PubMed

    Levine, Bruce L; Miskin, James; Wonnacott, Keith; Keir, Christopher

    2017-03-17

    Immunotherapy using chimeric antigen receptor-modified T cells has demonstrated high response rates in patients with B cell malignancies, and chimeric antigen receptor T cell therapy is now being investigated in several hematologic and solid tumor types. Chimeric antigen receptor T cells are generated by removing T cells from a patient's blood and engineering the cells to express the chimeric antigen receptor, which reprograms the T cells to target tumor cells. As chimeric antigen receptor T cell therapy moves into later-phase clinical trials and becomes an option for more patients, compliance of the chimeric antigen receptor T cell manufacturing process with global regulatory requirements becomes a topic for extensive discussion. Additionally, the challenges of taking a chimeric antigen receptor T cell manufacturing process from a single institution to a large-scale multi-site manufacturing center must be addressed. We have anticipated such concerns in our experience with the CD19 chimeric antigen receptor T cell therapy CTL019. In this review, we discuss steps involved in the cell processing of the technology, including the use of an optimal vector for consistent cell processing, along with addressing the challenges of expanding chimeric antigen receptor T cell therapy to a global patient population.

  15. A process for preparing an ultra-thin, adhesiveless, multi-layered, patterned polymer substrate

    NASA Technical Reports Server (NTRS)

    Bryant, Robert G. (Inventor); Kruse, Nancy H. M. (Inventor); Fox, Robert L. (Inventor); Tran, Sang Q. (Inventor)

    1995-01-01

    A process for preparing an ultra-thin, adhesiveless, multi-layered, patterned polymer substrate is disclosed. The process may be used to prepare both rigid and flexible cables and circuit boards. A substrate is provided and a polymeric solution comprising a self-bonding, soluble polymer and a solvent is applied to the substrate. Next, the polymer solution is dried to form a polymer coated substrate. The polymer coated substrate is metallized and patterned. At least one additional coating of the polymeric solution is applied to the metallized, patterned, polymer coated substrate and the steps of metallizing and patterning are repeated. Lastly, a cover coat is applied. When preparing a flexible cable and flexible circuit board, the polymer coating is removed from the substrate.

  16. Thermodynamics and Kinetics of Prenucleation Clusters, Classical and Non-Classical Nucleation

    PubMed Central

    Zahn, Dirk

    2015-01-01

    Recent observations of prenucleation species and multi-stage crystal nucleation processes challenge the long-established view on the thermodynamics of crystal formation. Here, we review and generalize extensions to classical nucleation theory. Going beyond the conventional implementation that has been in use for more than a century, nucleation inhibitors, precursor clusters and non-classical nucleation processes are rationalized as well by analogous concepts based on competing interface and bulk energy terms. This is illustrated by recent examples of species formed prior to/instead of crystal nucleation and multi-step nucleation processes. Many of the insights discussed were obtained from molecular simulation using advanced sampling techniques, briefly summarized herein for both nucleation-controlled and diffusion-controlled aggregate formation. PMID:25914369

  17. A Simple Method to Simultaneously Detect and Identify Spikes from Raw Extracellular Recordings.

    PubMed

    Petrantonakis, Panagiotis C; Poirazi, Panayiota

    2015-01-01

    The ability to track, in an efficient and reliable manner, when and which neurons fire in the vicinity of an electrode can revolutionize the neuroscience field. The current bottleneck lies in spike sorting algorithms; existing methods for detecting and discriminating the activity of multiple neurons rely on inefficient, multi-step processing of extracellular recordings. In this work, we show that single-step processing of raw (unfiltered) extracellular signals is sufficient for both the detection and identification of active neurons, thus greatly simplifying and optimizing the spike sorting approach. The efficiency and reliability of our method are demonstrated on both real and simulated data.
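
    The paper's single-step algorithm is not reproducible from this abstract; purely as an illustration of detecting spikes directly in a raw trace, here is a minimal amplitude-threshold detector with a MAD-based noise estimate and a 1 ms refractory window. The signal, spike shape, and all parameters are synthetic assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)
fs = 20000                            # assumed sampling rate, Hz
signal = rng.normal(0.0, 1.0, fs)     # 1 s of synthetic "raw" trace
true_spikes = [2000, 8000, 15000]
for s in true_spikes:
    # add a decaying-exponential spike waveform at each firing time
    signal[s:s + 20] += 8.0 * np.exp(-np.arange(20) / 5.0)

# robust noise level via the median absolute deviation (MAD)
mad = np.median(np.abs(signal - np.median(signal))) / 0.6745
threshold = 5.0 * mad

above = np.flatnonzero(signal > threshold)
refractory = fs // 1000  # collapse crossings within 1 ms into one event
detected = []
for idx in above:
    if not detected or idx - detected[-1] > refractory:
        detected.append(int(idx))
```

    The MAD estimate is preferred over the raw standard deviation because the spikes themselves would otherwise inflate the noise estimate and raise the threshold.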

  18. Multi-criteria clinical decision support: A primer on the use of multiple criteria decision making methods to promote evidence-based, patient-centered healthcare.

    PubMed

    Dolan, James G

    2010-01-01

    Current models of healthcare quality recommend that patient management decisions be evidence-based and patient-centered. Evidence-based decisions require a thorough understanding of current information regarding the natural history of disease and the anticipated outcomes of different management options. Patient-centered decisions incorporate patient preferences, values, and unique personal circumstances into the decision making process and actively involve both patients and health care providers as much as possible. Fundamentally, therefore, evidence-based, patient-centered decisions are multi-dimensional and typically involve multiple decision makers. Advances in the decision sciences have led to the development of a number of multiple criteria decision making methods. These multi-criteria methods are designed to help people make better choices when faced with complex decisions involving several dimensions. They are especially helpful when there is a need to combine "hard data" with subjective preferences, to make trade-offs between desired outcomes, and to involve multiple decision makers. Evidence-based, patient-centered clinical decision making has all of these characteristics. This close match suggests that clinical decision support systems based on multi-criteria decision making techniques have the potential to enable patients and providers to carry out the tasks required to implement evidence-based, patient-centered care effectively and efficiently in clinical settings. The goal of this paper is to give readers a general introduction to the range of multi-criteria methods available and show how they could be used to support clinical decision-making. Methods discussed include the balance sheet, the even swap method, ordinal ranking methods, direct weighting methods, multi-attribute decision analysis, and the analytic hierarchy process (AHP).
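
    Of the methods listed, the analytic hierarchy process (AHP) is the most algorithmic: criterion weights are taken from the principal eigenvector of a pairwise comparison matrix, and Saaty's consistency ratio checks the judgments. The 3x3 matrix below is hypothetical, not from the paper.

```python
import numpy as np

# hypothetical pairwise comparison matrix for three criteria
# (e.g. efficacy, side effects, cost); A[i, j] = importance of i over j
A = np.array([[1.0, 3.0, 5.0],
              [1.0 / 3.0, 1.0, 2.0],
              [1.0 / 5.0, 1.0 / 2.0, 1.0]])

eigvals, eigvecs = np.linalg.eig(A)
k = int(np.argmax(eigvals.real))
weights = eigvecs[:, k].real
weights = weights / weights.sum()  # normalized criterion weights

# Saaty consistency check (random index RI = 0.58 for n = 3)
n = A.shape[0]
ci = (eigvals[k].real - n) / (n - 1)
cr = ci / 0.58  # judgments are usually deemed acceptable when cr < 0.1
```

    Reciprocal entries (A[j, i] = 1/A[i, j]) encode the decision maker's relative preferences; a perfectly consistent matrix has its largest eigenvalue equal to n, so cr measures departure from consistency.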

  19. Multi-criteria clinical decision support: A primer on the use of multiple criteria decision making methods to promote evidence-based, patient-centered healthcare

    PubMed Central

    Dolan, James G.

    2010-01-01

    Current models of healthcare quality recommend that patient management decisions be evidence-based and patient-centered. Evidence-based decisions require a thorough understanding of current information regarding the natural history of disease and the anticipated outcomes of different management options. Patient-centered decisions incorporate patient preferences, values, and unique personal circumstances into the decision making process and actively involve both patients and health care providers as much as possible. Fundamentally, therefore, evidence-based, patient-centered decisions are multi-dimensional and typically involve multiple decision makers. Advances in the decision sciences have led to the development of a number of multiple criteria decision making methods. These multi-criteria methods are designed to help people make better choices when faced with complex decisions involving several dimensions. They are especially helpful when there is a need to combine “hard data” with subjective preferences, to make trade-offs between desired outcomes, and to involve multiple decision makers. Evidence-based, patient-centered clinical decision making has all of these characteristics. This close match suggests that clinical decision support systems based on multi-criteria decision making techniques have the potential to enable patients and providers to carry out the tasks required to implement evidence-based, patient-centered care effectively and efficiently in clinical settings. The goal of this paper is to give readers a general introduction to the range of multi-criteria methods available and show how they could be used to support clinical decision-making. Methods discussed include the balance sheet, the even swap method, ordinal ranking methods, direct weighting methods, multi-attribute decision analysis, and the analytic hierarchy process (AHP). PMID:21394218

  20. A time to search: finding the meaning of variable activation energy.

    PubMed

    Vyazovkin, Sergey

    2016-07-28

    This review deals with the phenomenon of variable activation energy frequently observed when studying kinetics in the liquid or solid phase. This phenomenon commonly manifests itself through nonlinear Arrhenius plots or through dependencies of the activation energy on conversion computed by isoconversional methods. A variable activation energy signifies a multi-step process and has the meaning of a collective parameter linked to the activation energies of the individual steps. It is demonstrated that, by using appropriate models of the processes, the link can be established in algebraic form. This allows one to analyze experimentally observed dependencies of the activation energy in a quantitative fashion and, as a result, to obtain activation energies of individual steps, to evaluate and predict other important parameters of the process, and generally to gain deeper kinetic and mechanistic insights. This review provides multiple examples of such analysis as applied to the processes of crosslinking polymerization, crystallization and melting of polymers, gelation, and solid-solid morphological and glass transitions. The use of appropriate computational techniques is discussed as well.
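
    As a worked sketch of the isoconversional idea (with synthetic Arrhenius data, not the review's own), the activation energy at a fixed conversion follows from the slope of ln(rate) versus 1/T; repeating this at each conversion yields the E(α) dependence the review analyzes.

```python
import numpy as np

R = 8.314          # gas constant, J/(mol K)
E_true = 120.0e3   # assumed activation energy at a fixed conversion, J/mol
A_pre = 1.0e12     # assumed pre-exponential factor, 1/s

# isoconversional rates at four temperatures (synthetic Arrhenius data)
T = np.array([500.0, 520.0, 540.0, 560.0])   # K
rate = A_pre * np.exp(-E_true / (R * T))

# Friedman-type estimate: ln(rate) vs 1/T has slope -E/R at fixed conversion
slope, intercept = np.polyfit(1.0 / T, np.log(rate), 1)
E_est = -slope * R
```

    A single-step process gives the same E_est at every conversion; curvature in the Arrhenius plot, or an E_est that drifts with conversion, is the signature of the multi-step kinetics discussed above.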

  1. Design of multi-body Lambert type orbits with specified departure and arrival positions

    NASA Astrophysics Data System (ADS)

    Ishii, Nobuaki; Kawaguchi, Jun'ichiro; Matsuo, Hiroki

    1991-10-01

    A new procedure for designing a multi-body Lambert-type orbit comprising a multiple-swingby process is developed, aiming to relieve a numerical difficulty inherent in the highly nonlinear swingby mechanism. The proposed algorithm, Recursive Multi-Step Linearization, first divides the whole orbit into several trajectory segments. Then, making maximum use of piecewise transition matrices, the segmented orbit is repeatedly upgraded until an approximate orbit, initially based on a patched-conics method, eventually converges. In application to the four-body Earth-Moon system with the Sun's gravitation, one of the double lunar swingby orbits including 12 lunar swingbys is successfully designed without any velocity mismatch.
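
    The Recursive Multi-Step Linearization algorithm itself is not reproducible from this abstract; the underlying idea of correcting a trajectory iteratively through numerically estimated transition (sensitivity) matrices can, however, be illustrated on a single-segment two-body shooting problem in normalized units. All values below are hypothetical.

```python
import numpy as np

mu = 1.0  # gravitational parameter, normalized units

def accel(r):
    return -mu * r / np.linalg.norm(r) ** 3

def propagate(r0, v0, tof, n=2000):
    """Fixed-step RK4 propagation of planar two-body motion."""
    dt = tof / n
    r, v = r0.astype(float), v0.astype(float)
    for _ in range(n):
        k1r, k1v = v, accel(r)
        k2r, k2v = v + 0.5 * dt * k1v, accel(r + 0.5 * dt * k1r)
        k3r, k3v = v + 0.5 * dt * k2v, accel(r + 0.5 * dt * k2r)
        k4r, k4v = v + dt * k3v, accel(r + dt * k3r)
        r = r + dt / 6.0 * (k1r + 2 * k2r + 2 * k3r + k4r)
        v = v + dt / 6.0 * (k1v + 2 * k2v + 2 * k3v + k4v)
    return r

r0 = np.array([1.0, 0.0])
r_target = np.array([0.0, 1.0])  # reachable by a unit circular orbit
tof = np.pi / 2.0                # quarter period of that orbit
v = np.array([0.0, 0.9])         # deliberately off initial guess

for _ in range(10):
    r_final = propagate(r0, v, tof)
    miss = r_final - r_target
    if np.linalg.norm(miss) < 1e-10:
        break
    # finite-difference transition (sensitivity) matrix d(r_final)/d(v0)
    J = np.empty((2, 2))
    eps = 1e-6
    for j in range(2):
        dv = np.zeros(2)
        dv[j] = eps
        J[:, j] = (propagate(r0, v + dv, tof) - r_final) / eps
    v = v - np.linalg.solve(J, miss)  # linearized correction step
```

    The paper's contribution is to apply this kind of linearized correction segment by segment across swingbys, which tames the sensitivity that makes a single global correction ill-conditioned.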

  2. Immortalization of normal human fibroblasts by treatment with 4-nitroquinoline 1-oxide.

    PubMed

    Bai, L; Mihara, K; Kondo, Y; Honma, M; Namba, M

    1993-02-01

    Normal human fibroblasts (the OUMS-24 strain), derived from a 6-week-old human embryo, were transformed (into the OUMS-24F line) and immortalized by repeated treatments (59 times) with 4-nitroquinoline 1-oxide (4NQO). Treatment began during primary culture and ended at the 51st population doubling level (PDL). At the 57th PDL (146 days after the last treatment), morphologically altered, epithelial-type cells appeared, began to grow and became immortal (now past the 100th PDL). However, the control fibroblasts, which were not treated with 4NQO, senesced at the 62nd PDL. The finding that extensive, repeated treatments with 4NQO are required for the immortalization of normal human cells indicates that multiple mutational events are involved in the immortalization of human cells in general. In other words, immortalization itself seems to be a multi-step process. Karyotypic analysis showed that many cells were hypodiploid before immortalization, but that afterwards chromosomes were distributed broadly in the diploid to tetraploid regions. The immortalized cells showed amplification and enhanced expression of c-myc. Two-dimensional electrophoretic analysis showed that the number of disappearing cellular proteins was greater than the number of newly appearing ones after the cells became immortalized. Since the immortalized cells showed neither anchorage-independent growth nor tumorigenicity, they are useful for studying factors that can contribute to multi-step carcinogenesis in human cells. In addition, the genetically matched normal (OUMS-24) and immortalized (OUMS-24F) cells will be useful for analyzing the genes related to cellular mortality and immortalization.

  3. Assessing Face Validity of a Food Behavior Checklist for Limited-resource Filipinos

    PubMed Central

    Buchthal, Opal Vanessa; Tauyan, Socorro

    2015-01-01

    Diet-related chronic health conditions are prevalent in the Filipino American community; however, there is a lack of rigorously validated nutrition education evaluation tools in Tagalog for use in this population. This study aimed to develop and evaluate the face validity of a Tagalog-language food behavior checklist (FBC). A multi-step method was used, involving translation of questionnaire text from English to Tagalog by a team of professionals, creation of accompanying color photographs, cognitive testing with the target population, final review by the team of professionals, and assessment of readability. Subjects for cognitive testing were men (n=6) and women (n=14) 18 years or older in Hawai‘i who received or were eligible to receive Supplemental Nutrition Assistance Program (SNAP) benefits, self-identified as Filipino, and preferred Tagalog rather than English. Participants were recruited from churches, the Filipino Center, and other community sites. Cognitive interviews revealed several issues with text and photographs, such as preferences for specific terms, and images that did not adequately illustrate the text. Image changes were made to reflect items most commonly consumed. The team of professionals agreed with participant suggestions. Assessment of readability revealed a grade 5.9 reading level, appropriate for a low-literacy population. The multi-step process, which allowed members of the target audience to assess the appropriateness of the questionnaire, yielded a Tagalog-language FBC found to have adequate face validity. After further evaluation of validity and reliability, this tool may be used to evaluate behavior change resulting from the United States Department of Agriculture's (USDA) nutrition education programs. PMID:26535163

  4. Multi-frame partially saturated images blind deconvolution

    NASA Astrophysics Data System (ADS)

    Ye, Pengzhao; Feng, Huajun; Xu, Zhihai; Li, Qi; Chen, Yueting

    2016-12-01

    When blurred images contain saturated or over-exposed pixels, conventional blind deconvolution approaches often fail to estimate an accurate point spread function (PSF) and introduce local ringing artifacts. In this paper, we propose a method to deal with this problem within a modified multi-frame blind deconvolution framework. First, in the kernel estimation step, a light-streak detection scheme using multi-frame blurred images is incorporated into the regularization constraint. Second, we treat image regions affected by saturated pixels separately by modeling a weighting matrix during each multi-frame deconvolution iteration. Both synthetic and real-world examples show that more accurate PSFs can be estimated and that restored images show richer detail and fewer negative effects compared to state-of-the-art methods.
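
    The paper's weighting scheme is not spelled out in this abstract; a common way to exclude saturated pixels during non-blind deconvolution is a weighted Richardson-Lucy iteration, sketched here in 1D on synthetic data. This is an illustrative stand-in, not the authors' method.

```python
import numpy as np

def rl_deconv_masked(y, psf, w, n_iter=200):
    """Richardson-Lucy iteration with per-sample weights w
    (0 = saturated/ignored, 1 = trusted)."""
    x = np.full_like(y, y.mean())
    psf_flip = psf[::-1]
    for _ in range(n_iter):
        est = np.convolve(x, psf, mode="same")
        ratio = w * y / np.maximum(est, 1e-12)
        x = x * np.convolve(ratio, psf_flip, mode="same") / \
            np.maximum(np.convolve(w, psf_flip, mode="same"), 1e-12)
    return x

# synthetic scene: two point sources blurred by a box PSF, then clipped
true = np.zeros(64)
true[30], true[45] = 10.0, 4.0
psf = np.ones(5) / 5.0
blurred = np.convolve(true, psf, mode="same")
saturation = 1.5
observed = np.minimum(blurred, saturation)
weights = (blurred < saturation).astype(float)  # exclude clipped samples

restored = rl_deconv_masked(observed, psf, weights)
```

    Zero weights drop the clipped samples from both the data-fidelity ratio and the normalization term, so the saturated region no longer biases the estimate; the unsaturated source is restored normally.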

  5. Digital enhancement of X-rays for NDT

    NASA Technical Reports Server (NTRS)

    Butterfield, R. L.

    1980-01-01

    Report is "cookbook" for digital processing of industrial X-rays. Computer techniques, previously used primarily in laboratory and developmental research, have been outlined and codified into step-by-step procedures for enhancing X-ray images. Those involved in nondestructive testing should find report valuable asset, particularly if visual inspection is method currently used to process X-ray images.

  6. Recovery of nickel, cobalt and some salts from spent Ni-MH batteries.

    PubMed

    Rabah, M A; Farghaly, F E; Abd-El Motaleb, M A

    2008-01-01

    This work provides a method to recover nickel and cobalt metals, and some of their salts having market value, from spent nickel-metal hydride batteries (SNiB). The methodology exploits the solubility of the battery electrode materials in sulfuric or hydrochloric acid. The results obtained showed that sulfuric acid was slightly less powerful than HCl in leaching SNiB. Despite that, sulfuric acid was applied on an economic basis. The highest level of solubility attained 93.5% using 3N sulfuric acid at 90 degrees C for 3 h. The addition of hydrogen peroxide to the reacting acid solution improved the level of solubility and enhanced the process in a shorter time. The maximum recovery of nickel and cobalt metals was 99.9% and 99.4%, respectively. The results were explained in the light of a model assuming that dissolution was a first-order reaction involving a multi-step sequence, the first step of which was the rate-determining step of the overall dissolution. Nickel salts such as the hydroxide, chloride, hexamminenickel chloride, hexamminenickel nitrate, oxalate and oleate were prepared. With cobalt, the basic carbonate, chloride, nitrate, citrate, oleate and acetate salts were prepared from cobalt hydroxide. Cost estimates showed that the prices of the end products were nearly 30% lower than the prices of the same chemicals prepared from primary resources.
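
    The first-order dissolution model mentioned above implies ln(1 - x) = -kt for dissolved fraction x at time t, so the rate constant follows from a straight-line fit. The rate constant below is hypothetical, chosen only so that roughly 90% dissolves within 3 h, in the spirit of the reported leaching data.

```python
import numpy as np

k_true = 0.8   # hypothetical first-order rate constant, 1/h
t = np.array([0.5, 1.0, 1.5, 2.0, 2.5, 3.0])   # leaching time, h
x = 1.0 - np.exp(-k_true * t)                  # fraction dissolved

# first-order kinetics: ln(1 - x) = -k * t, so a linear fit recovers k
slope, intercept = np.polyfit(t, np.log(1.0 - x), 1)
k_est = -slope
half_life = np.log(2.0) / k_est   # time to dissolve half the solids
```

    Linearity of ln(1 - x) versus t (zero intercept, constant slope) is the diagnostic for first-order behavior; curvature would point to a change in the rate-determining step.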

  7. Application of advanced diffraction based optical metrology overlay capabilities for high-volume manufacturing

    NASA Astrophysics Data System (ADS)

    Chen, Kai-Hsiung; Huang, Guo-Tsai; Hsieh, Hung-Chih; Ni, Wei-Feng; Chuang, S. M.; Chuang, T. K.; Ke, Chih-Ming; Huang, Jacky; Rao, Shiuan-An; Cumurcu Gysen, Aysegul; d'Alfonso, Maxime; Yueh, Jenny; Izikson, Pavel; Soco, Aileen; Wu, Jon; Nooitgedagt, Tjitte; Ottens, Jeroen; Kim, Yong Ho; Ebert, Martin

    2017-03-01

    On-product overlay requirements are becoming more challenging with every technology node due to the continued decrease of device dimensions and process tolerances. Therefore, current and future technology nodes require demanding metrology capabilities, such as target designs that are robust towards process variations and high overlay measurement density (e.g. for higher-order process corrections), to enable advanced process control solutions. The impact of advanced control solutions based on YieldStar overlay data is presented in this paper. Multi-patterning techniques are applied for critical layers, leading to additional overlay measurement demands. The use of 1D process steps results in the need for overlay measurements relative to more than one layer. Dealing with the increased number of overlay measurements while maintaining high measurement density and metrology accuracy presents a challenge for high-volume manufacturing (HVM). These challenges are addressed by the capability to measure multi-layer targets with the recently introduced YieldStar metrology tool, YS350. On-product overlay results of such multi-layer and standard targets are presented, including measurement stability performance.

  8. PRIMO: An Interactive Homology Modeling Pipeline.

    PubMed

    Hatherley, Rowan; Brown, David K; Glenister, Michael; Tastan Bishop, Özlem

    2016-01-01

    The development of automated servers to predict the three-dimensional structure of proteins has seen much progress over the years. These servers make calculations simpler, but largely exclude users from the process. In this study, we present the PRotein Interactive MOdeling (PRIMO) pipeline for homology modeling of protein monomers. The pipeline eases the multi-step modeling process, and reduces the workload required by the user, while still allowing engagement from the user during every step. Default parameters are given for each step, which can either be modified or supplemented with additional external input. PRIMO has been designed for users of varying levels of experience with homology modeling. The pipeline incorporates a user-friendly interface that makes it easy to alter parameters used during modeling. During each stage of the modeling process, the site provides suggestions for novice users to improve the quality of their models. PRIMO provides functionality that allows users to also model ligands and ions in complex with their protein targets. Herein, we assess the accuracy of the fully automated capabilities of the server, including a comparative analysis of the available alignment programs, as well as of the refinement levels used during modeling. The tests presented here demonstrate the reliability of the PRIMO server when producing a large number of protein models. While PRIMO does focus on user involvement in the homology modeling process, the results indicate that in the presence of suitable templates, good quality models can be produced even without user intervention. This gives an idea of the base level accuracy of PRIMO, which users can improve upon by adjusting parameters in their modeling runs. The accuracy of PRIMO's automated scripts is being continuously evaluated by the CAMEO (Continuous Automated Model EvaluatiOn) project. The PRIMO site is free for non-commercial use and can be accessed at https://primo.rubi.ru.ac.za/.

  9. PRIMO: An Interactive Homology Modeling Pipeline

    PubMed Central

    Glenister, Michael

    2016-01-01

    The development of automated servers to predict the three-dimensional structure of proteins has seen much progress over the years. These servers make calculations simpler, but largely exclude users from the process. In this study, we present the PRotein Interactive MOdeling (PRIMO) pipeline for homology modeling of protein monomers. The pipeline eases the multi-step modeling process, and reduces the workload required by the user, while still allowing engagement from the user during every step. Default parameters are given for each step, which can either be modified or supplemented with additional external input. PRIMO has been designed for users of varying levels of experience with homology modeling. The pipeline incorporates a user-friendly interface that makes it easy to alter parameters used during modeling. During each stage of the modeling process, the site provides suggestions for novice users to improve the quality of their models. PRIMO provides functionality that allows users to also model ligands and ions in complex with their protein targets. Herein, we assess the accuracy of the fully automated capabilities of the server, including a comparative analysis of the available alignment programs, as well as of the refinement levels used during modeling. The tests presented here demonstrate the reliability of the PRIMO server when producing a large number of protein models. While PRIMO does focus on user involvement in the homology modeling process, the results indicate that in the presence of suitable templates, good quality models can be produced even without user intervention. This gives an idea of the base level accuracy of PRIMO, which users can improve upon by adjusting parameters in their modeling runs. The accuracy of PRIMO’s automated scripts is being continuously evaluated by the CAMEO (Continuous Automated Model EvaluatiOn) project. The PRIMO site is free for non-commercial use and can be accessed at https://primo.rubi.ru.ac.za/. PMID:27855192

  10. Method for sequentially processing a multi-level interconnect circuit in a vacuum chamber

    NASA Technical Reports Server (NTRS)

    Routh, D. E.; Sharma, G. C. (Inventor)

    1982-01-01

    The processing of wafer devices to form multilevel interconnects for microelectronic circuits is described. The method is directed to performing the sequential steps of etching the via, removing the photo resist pattern, back sputtering the entire wafer surface and depositing the next layer of interconnect material under common vacuum conditions without exposure to atmospheric conditions. Apparatus for performing the method includes a vacuum system having a vacuum chamber in which wafers are processed on rotating turntables. The vacuum chamber is provided with an RF sputtering system and a DC magnetron sputtering system. A gas inlet is provided in the chamber for the introduction of various gases to the vacuum chamber and the creation of various gas plasma during the sputtering steps.

  11. Integrated explosive preconcentrator and electrochemical detection system for 2,4,6-trinitrotoluene (TNT) vapor.

    PubMed

    Cizek, Karel; Prior, Chad; Thammakhet, Chongdee; Galik, Michal; Linker, Kevin; Tsui, Ray; Cagan, Avi; Wake, John; La Belle, Jeff; Wang, Joseph

    2010-02-19

    This article reports on an integrated explosive-preconcentration/electrochemical detection system for 2,4,6-trinitrotoluene (TNT) vapor. The challenges involved in such system integration are discussed. A hydrogel-coated screen-printed electrode is used for the detection of the thermally desorbed TNT from a preconcentration device using rapid square wave voltammetry. Optimization of the preconcentration system for desorption of TNT and subsequent electrochemical detection was conducted, yielding a desorption temperature of 120 degrees C under a flow rate of 500 mL min^-1. Such conditions resulted in a characteristic electrochemical signal for TNT representing the multi-step reduction process. Quantitative measurements produced a linear signal dependence on TNT quantity exposed to the preconcentrator from 0.25 to 10 microg. Finally, the integrated device was successfully demonstrated using a sample of solid TNT located upstream of the preconcentrator. Copyright 2009 Elsevier B.V. All rights reserved.
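
    The reported linear signal dependence from 0.25 to 10 microgram suggests a standard calibration-line workflow: fit current against mass, check linearity, and invert the line to quantify unknowns. The current values below are hypothetical, not the paper's data.

```python
import numpy as np

# hypothetical calibration: sensor response vs TNT mass over 0.25-10 ug
mass = np.array([0.25, 0.5, 1.0, 2.5, 5.0, 10.0])            # ug
current = 1.9 * mass + 0.1 + np.array([0.02, -0.01, 0.03,
                                       -0.02, 0.01, -0.03])  # uA, noisy

slope, intercept = np.polyfit(mass, current, 1)
r = np.corrcoef(mass, current)[0, 1]   # linearity of the calibration

def quantify(i_meas):
    """Invert the calibration line: estimate TNT mass from a current."""
    return (i_meas - intercept) / slope

estimate = quantify(1.9 * 3.0 + 0.1)   # current for a nominal 3 ug sample
```

    Quantification is only trusted inside the calibrated range; extrapolating below 0.25 ug or above 10 ug would leave the demonstrated linear regime.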

  12. Facile Synthesis of Highly Aligned Multiwalled Carbon Nanotubes from Polymer Precursors

    DOE PAGES

    Han, Catherine Y.; Xiao, Zhi-Li; Wang, H. Hau; ...

    2009-01-01

    We report a facile one-step approach, involving no flammable gas, no catalyst, and no in situ polymerization, for the preparation of a well-aligned carbon nanotube array. A polymer precursor is placed on top of an anodized aluminum oxide (AAO) membrane containing regular nanopore arrays, and slow heating under Ar flow allows the molten polymer to wet the template through adhesive force. The polymer spreads into the nanopores of the template to form polymer nanotubes. Upon carbonization the resulting multi-walled carbon nanotubes duplicate the nanopore morphology precisely. The process is demonstrated for 230, 50, and 20 nm pore membranes. The synthesized carbon nanotubes are characterized with scanning/transmission electron microscopy, Raman spectroscopy, and resistive measurements. Convenient functionalization of the nanotubes with this method is demonstrated through premixing CoPt nanoparticles in the polymer precursors.

  13. Multi-perspective smFRET reveals rate-determining late intermediates of ribosomal translocation

    PubMed Central

    Wasserman, Michael R.; Alejo, Jose L.; Altman, Roger B.; Blanchard, Scott C.

    2016-01-01

    Directional translocation of the ribosome through the messenger RNA open reading frame is a critical determinant of translational fidelity. This process entails a complex interplay of large-scale conformational changes within the actively translating particle, which together coordinate the movement of transfer and messenger RNA substrates with respect to the large and small ribosomal subunits. Using pre-steady-state, single-molecule fluorescence resonance energy transfer imaging, we have tracked the nature and timing of these conformational events within the Escherichia coli ribosome from five structural perspectives. Our investigations reveal direct evidence of structurally and kinetically distinct, late intermediates during substrate movement, whose resolution is rate-determining to the translocation mechanism. These steps involve intra-molecular events within the EF-G(GDP)-bound ribosome, including exaggerated, reversible fluctuations of the small subunit head domain, which ultimately facilitate peptidyl-tRNA’s movement into its final post-translocation position. PMID:26926435

  14. Novel Virtual Screening Approach for the Discovery of Human Tyrosinase Inhibitors

    PubMed Central

    Ai, Ni; Welsh, William J.; Santhanam, Uma; Hu, Hong; Lyga, John

    2014-01-01

    Tyrosinase is the key enzyme involved in the human pigmentation process, as well as in the undesired browning of fruits and vegetables. Compounds inhibiting tyrosinase catalytic activity are an important class of cosmetic and dermatological agents which show high potential as depigmentation agents for skin lightening. The multi-step protocol employed for the identification of novel tyrosinase inhibitors incorporated the Shape Signatures computational algorithm for rapid screening of chemical libraries. This algorithm converts the size and shape of a molecule, as well as its surface charge distribution and other bio-relevant properties, into compact histograms (signatures) that lend themselves to rapid comparison between molecules. Shape Signatures excels at scaffold hopping across different chemical families, which enables identification of new actives whose molecular structure is distinct from other known actives. Using this approach, we identified a novel class of depigmentation agents that demonstrated promise for skin lightening product development. PMID:25426625

  15. Novel virtual screening approach for the discovery of human tyrosinase inhibitors.

    PubMed

    Ai, Ni; Welsh, William J; Santhanam, Uma; Hu, Hong; Lyga, John

    2014-01-01

    Tyrosinase is the key enzyme involved in the human pigmentation process, as well as in the undesired browning of fruits and vegetables. Compounds inhibiting tyrosinase catalytic activity are an important class of cosmetic and dermatological agents which show high potential as depigmentation agents for skin lightening. The multi-step protocol employed for the identification of novel tyrosinase inhibitors incorporated the Shape Signatures computational algorithm for rapid screening of chemical libraries. This algorithm converts the size and shape of a molecule, as well as its surface charge distribution and other bio-relevant properties, into compact histograms (signatures) that lend themselves to rapid comparison between molecules. Shape Signatures excels at scaffold hopping across different chemical families, which enables identification of new actives whose molecular structure is distinct from other known actives. Using this approach, we identified a novel class of depigmentation agents that demonstrated promise for skin lightening product development.

  16. Process for removing an organic compound from water

    DOEpatents

    Baker, Richard W.; Kaschemekat, Jurgen; Wijmans, Johannes G.; Kamaruddin, Henky D.

    1993-12-28

    A process for removing organic compounds from water is disclosed. The process involves gas stripping followed by membrane separation treatment of the stripping gas. The stripping step can be carried out using one or multiple gas strippers and using air or any other gas as stripping gas. The membrane separation step can be carried out using a single-stage membrane unit or a multistage unit. Apparatus for carrying out the process is also disclosed. The process is particularly suited for treatment of contaminated groundwater or industrial wastewater.

  17. The Madden-Julian Oscillation in the NCAR Community Earth System Model Coupled Data Assimilation System

    NASA Astrophysics Data System (ADS)

    Chatterjee, A.; Anderson, J. L.; Moncrieff, M.; Collins, N.; Danabasoglu, G.; Hoar, T.; Karspeck, A. R.; Neale, R. B.; Raeder, K.; Tribbia, J. J.

    2014-12-01

    We present a quantitative evaluation of the simulated MJO in analyses produced with a coupled data assimilation (CDA) framework developed at the National Center for Atmospheric Research. This system is based on the Community Earth System Model (CESM; previously known as the Community Climate System Model, CCSM) interfaced to a community facility for ensemble data assimilation (the Data Assimilation Research Testbed, DART). The system (multi-component CDA) assimilates data into each of the respective ocean/atmosphere/land model components during the assimilation step, followed by an exchange of information between the model components during the forecast step. This is an advancement over many existing prototypes of coupled data assimilation systems, which typically assimilate observations into only one of the model components (i.e., single-component CDA). The more realistic treatment of air-sea interactions and improvements to the model mean state in the multi-component CDA recover many aspects of MJO representation, from its space-time structure and propagation (see Figure 1) to the governing relationships between precipitation and sea surface temperature on intra-seasonal scales. Standard qualitative and process-based diagnostics identified by the MJO Task Force (currently under the auspices of the Working Group on Numerical Experimentation) have been used to detect MJO signals across a suite of coupled model experiments involving both multi-component and single-component DA experiments as well as a free run of the coupled CESM model (i.e., CMIP5-style, without data assimilation). Short predictability experiments during the boreal winter demonstrate that the decay rates of the MJO convective anomalies are slower in the multi-component CDA system, which allows it to retain the MJO dynamics for a longer period. We anticipate that the knowledge gained through this study will enhance our understanding of MJO feedback mechanisms across the air-sea interface, especially regarding ocean impacts on the MJO, and highlight the capability of coupled data assimilation systems for predictions of related tropical intraseasonal variability.

  18. Membrane loop process for separating carbon dioxide for use in gaseous form from flue gas

    DOEpatents

    Wijmans, Johannes G; Baker, Richard W; Merkel, Timothy C

    2014-10-07

    The invention is a process involving membrane-based gas separation for separating and recovering carbon dioxide emissions from combustion processes in partially concentrated form, and then transporting the carbon dioxide and using or storing it in a confined manner without concentrating it to high purity. The process of the invention involves building up the concentration of carbon dioxide in a gas flow loop between the combustion step and a membrane separation step. A portion of the carbon dioxide-enriched gas can then be withdrawn from this loop and transported, without the need to liquefy the gas or otherwise create a high-purity stream, to a destination where it is used or confined, preferably in an environmentally benign manner.

  19. A fractional-N frequency divider for multi-standard wireless transceiver fabricated in 0.18 μm CMOS process

    NASA Astrophysics Data System (ADS)

    Wang, Jiafeng; Fan, Xiangning; Shi, Xiaoyang; Wang, Zhigong

    2017-12-01

    With the rapid evolution of wireless communication technology, integrating various communication modes in a mobile terminal has become a popular trend, and multi-standard wireless technology is accordingly one of the hot spots in current research. This paper presents a wideband fractional-N frequency divider for a multi-standard wireless transceiver serving many applications. A high-speed divide-by-2 stage with traditional source-coupled logic is designed for very wideband usage. A phase-switching technique and a chain of divide-by-2/3 cells are applied to the programmable frequency divider, giving a step size of 0.5. The narrower step of the programmable frequency divider reduces the phase noise of the whole frequency synthesizer. The Δ-Σ modulator is realized with an improved MASH 1-1-1 structure, which performs well in many respects, such as noise, spurs, and input dynamic range. Fabricated in a TSMC 0.18 μm CMOS process, the fractional-N frequency divider occupies a chip area of 1130 × 510 μm² and divides correctly within the frequency range of 0.8-9 GHz. With a 1.8 V supply voltage, its division ratio ranges from 62.5 to 254 and the total current consumption is 29 mA.
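
    A MASH 1-1-1 Δ-Σ modulator of the kind named in this abstract is conventionally modeled as three cascaded first-order accumulators whose carry (overflow) bits are recombined by a noise-cancellation network. The sketch below is a generic behavioral model of that standard structure, not the paper's circuit; the bit width and parameter names are illustrative. The long-run average of the output sequence equals the programmed fraction K/2^N.

```python
def mash111(k, nbits, nsamples):
    """Behavioral model of a MASH 1-1-1 delta-sigma modulator: three cascaded
    first-order accumulators; each stage accumulates the previous stage's
    residue, and the carries are recombined into the divider control word."""
    mod = 1 << nbits
    a1 = a2 = a3 = 0
    c2_d = 0            # carry of stage 2, delayed one sample
    c3_d1 = c3_d2 = 0   # carry of stage 3, delayed one and two samples
    out = []
    for _ in range(nsamples):
        a1 += k
        c1, a1 = divmod(a1, mod)
        a2 += a1
        c2, a2 = divmod(a2, mod)
        a3 += a2
        c3, a3 = divmod(a3, mod)
        # Noise cancellation: y = c1 + (1 - z^-1)*c2 + (1 - z^-1)^2 * c3
        y = c1 + (c2 - c2_d) + (c3 - 2 * c3_d1 + c3_d2)
        c2_d, c3_d2, c3_d1 = c2, c3_d1, c3
        out.append(y)
    return out
```

    Averaged over many samples, the output of `mash111(k, nbits, ...)` converges to k/2^nbits, which is what lets the divider realize fractional division ratios; the instantaneous output dithers over a few integer levels, pushing the quantization noise to high frequencies.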

  20. Method of forming composite fiber blends

    NASA Technical Reports Server (NTRS)

    McMahon, Paul E. (Inventor); Chung, Tai-Shung (Inventor); Ying, Lincoln (Inventor)

    1989-01-01

    The instant invention involves a process used in preparing fibrous tows which may be formed into polymeric plastic composites. The process involves the steps of (a) forming a tow of strong filamentary materials; (b) forming a thermoplastic polymeric fiber; (c) intermixing the two tows; and (d) withdrawing the intermixed tow for further use.

  1. Quantifying multi-dimensional attributes of human activities at various geographic scales based on smartphone tracking.

    PubMed

    Zhou, Xiaolu; Li, Dongying

    2018-05-09

    Advancements in location-aware technologies and in information and communication technology over the past decades have furthered our knowledge of the interaction between human activities and the built environment. An increasing number of studies have collected data on individual activities to better understand how the environment shapes human behavior. Despite this growing interest, challenges remain in collecting and processing individuals' activity data, e.g., capturing people's precise environmental contexts and analyzing data at multiple spatial scales. In this study, we propose and implement an innovative system that integrates smartphone-based step tracking with an app and sequential tile-scan techniques to collect and process activity data. We apply the OpenStreetMap tile system to aggregate positioning points at various scales. We also propose duration, step, and probability surfaces to quantify the multi-dimensional attributes of activities. Results show that, by running the app in the background, smartphones can measure multi-dimensional attributes of human activities, including space, duration, step, and location uncertainty at various spatial scales. By coordinating the Global Positioning System (GPS) sensor with the accelerometer, the app conserves battery power that would otherwise be drained quickly by the GPS sensor. Based on a test dataset, we were able to detect the recreational center and sports center as the spaces where the user was most active, among other places visited. The methods provide techniques to address key issues in analyzing human activity data. The system can support future studies on behavioral and health consequences related to individuals' environmental exposure.
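
    The OpenStreetMap tile aggregation mentioned in this abstract can be illustrated with the standard "slippy map" tile arithmetic: each GPS fix maps to a tile index at a chosen zoom level, and dwell time is accumulated per tile to form a duration surface. This is a minimal sketch of the general technique; the function and field names are ours, not from the study's app.

```python
import math
from collections import defaultdict

def deg2tile(lat, lon, zoom):
    """Convert a WGS84 lat/lon fix to OpenStreetMap (slippy map) tile indices."""
    n = 2 ** zoom
    x = int((lon + 180.0) / 360.0 * n)
    lat_rad = math.radians(lat)
    y = int((1.0 - math.asinh(math.tan(lat_rad)) / math.pi) / 2.0 * n)
    return x, y

def duration_surface(fixes, zoom):
    """Aggregate (lat, lon, seconds) GPS fixes into time spent per tile.
    Coarser zoom levels merge nearby fixes into the same tile, giving the
    multi-scale aggregation described in the abstract."""
    surface = defaultdict(float)
    for lat, lon, seconds in fixes:
        surface[deg2tile(lat, lon, zoom)] += seconds
    return dict(surface)
```

    For example, `deg2tile(51.5074, -0.1278, 10)` returns the zoom-10 tile covering central London; running `duration_surface` on the same fixes at several zoom values yields the per-scale surfaces.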

  2. Short-Chain 3-Hydroxyacyl-Coenzyme A Dehydrogenase Associates with a Protein Super-Complex Integrating Multiple Metabolic Pathways

    PubMed Central

    Narayan, Srinivas B.; Master, Stephen R.; Sireci, Anthony N.; Bierl, Charlene; Stanley, Paige E.; Li, Changhong; Stanley, Charles A.; Bennett, Michael J.

    2012-01-01

    Proteins involved in mitochondrial metabolic pathways engage in functionally relevant multi-enzyme complexes. We previously described an interaction between short-chain 3-hydroxyacyl-coenzyme A dehydrogenase (SCHAD) and glutamate dehydrogenase (GDH) explaining the clinical phenotype of hyperinsulinism in SCHAD-deficient patients and adding SCHAD to the list of mitochondrial proteins capable of forming functional, multi-pathway complexes. In this work, we provide evidence of SCHAD's involvement in additional interactions forming tissue-specific metabolic super complexes involving both membrane-associated and matrix-dwelling enzymes and spanning multiple metabolic pathways. As an example, in murine liver, we find SCHAD interaction with aspartate transaminase (AST) and GDH from amino acid metabolic pathways, carbamoyl phosphate synthase I (CPS-1) from ureagenesis, other fatty acid oxidation and ketogenesis enzymes and fructose-bisphosphate aldolase, an extra-mitochondrial enzyme of the glycolytic pathway. Most of the interactions appear to be independent of SCHAD's role in the penultimate step of fatty acid oxidation suggesting an organizational, structural or non-enzymatic role for the SCHAD protein. PMID:22496890

  3. Multi-step routes of capuchin monkeys in a laser pointer traveling salesman task.

    PubMed

    Howard, Allison M; Fragaszy, Dorothy M

    2014-09-01

    Prior studies have claimed that nonhuman primates plan their routes multiple steps in advance. However, a recent reexamination of multi-step route planning in nonhuman primates indicated that there is no evidence for planning more than one step ahead. We tested multi-step route planning in capuchin monkeys using a pointing device to "travel" to distal targets while stationary. This device enabled us to determine whether capuchins distinguish the spatial relationship between goals and themselves and spatial relationships between goals and the laser dot, allocentrically. In Experiment 1, two subjects were presented with identical food items in Near-Far (one item nearer to subject) and Equidistant (both items equidistant from subject) conditions with a laser dot visible between the items. Subjects moved the laser dot to the items using a joystick. In the Near-Far condition, one subject demonstrated a bias for items closest to self but the other subject chose efficiently. In the second experiment, subjects retrieved three food items in similar Near-Far and Equidistant arrangements. Both subjects preferred food items nearest the laser dot and showed no evidence of multi-step route planning. We conclude that these capuchins do not make choices on the basis of multi-step look ahead strategies. © 2014 Wiley Periodicals, Inc.

  4. An ESL Audio-Script Writing Workshop

    ERIC Educational Resources Information Center

    Miller, Carla

    2012-01-01

    The roles of dialogue, collaborative writing, and authentic communication have been explored as effective strategies in second language writing classrooms. In this article, the stages of an innovative, multi-skill writing method, which embeds students' personal voices into the writing process, are explored. A 10-step ESL Audio Script Writing Model…

  5. Energy--What to Do until the Computer Comes.

    ERIC Educational Resources Information Center

    Johnston, Archie B.

    Drawing from Tallahassee Community College's (TCC's) experiences with energy conservation, this paper offers suggestions for reducing energy costs through computer-controlled systems and other means. After stating the energy problems caused by TCC's multi-zone heating and cooling system, the paper discusses the five-step process by which TCC…

  6. A Quantitative Tunneling/Desorption Model for the Exchange Current at the Porous Electrode/Beta - Alumina/Alkali Metal Gas Three Phase Zone at 700-1300K

    NASA Technical Reports Server (NTRS)

    Williams, R. M.; Ryan, M. A.; Saipetch, C.; LeDuc, H. G.

    1996-01-01

    The exchange current observed at porous metal electrodes on sodium or potassium beta -alumina solid electrolytes in alkali metal vapor is quantitatively modeled with a multi-step process with good agreement with experimental results.

  7. Discovery of multi-ring basins - Gestalt perception in planetary science

    NASA Technical Reports Server (NTRS)

    Hartmann, W. K.

    1981-01-01

    Early selenographers resolved individual structural components of multi-ring basin systems but missed the underlying large-scale multi-ring basin patterns. The recognition of multi-ring basins as a general class of planetary features can be divided into five steps. Gilbert (1893) took a first step in recognizing radial 'sculpture' around the Imbrium basin system. Several writers through the 1940's rediscovered the radial sculpture and extended this concept by describing concentric rings around several circular maria. Some reminiscences are given about the fourth step - discovery of the Orientale basin and other basin systems by rectified lunar photography at the University of Arizona in 1961-62. Multi-ring basins remained a lunar phenomenon until the fifth step - discovery of similar systems of features on other planets, such as Mars (1972), Mercury (1974), and possibly Callisto and Ganymede (1979). This sequence is an example of gestalt recognition whose implications for scientific research are discussed.

  8. A novel method for a multi-level hierarchical composite with brick-and-mortar structure

    PubMed Central

    Brandt, Kristina; Wolff, Michael F. H.; Salikov, Vitalij; Heinrich, Stefan; Schneider, Gerold A.

    2013-01-01

    The fascination with hierarchically structured hard tissues such as enamel or nacre arises from their unique structure-property relationship. Over the last decades this has motivated numerous syntheses of composites mimicking the brick-and-mortar structure of nacre. However, there is still a lack of synthetic engineering materials displaying a true hierarchical structure. Here, we present a novel multi-step processing route for anisotropic two-level hierarchical composites that combines different coating techniques on different length scales. It comprises polymer-encapsulated ceramic particles as building blocks for the first level, followed by spouted-bed spray granulation for a second level, and finally directional hot pressing to anisotropically consolidate the composite. The resulting microstructure reveals a brick-and-mortar hierarchical structure with distinct, though not yet optimized, mechanical properties on each level. This opens up a completely new processing route for the synthesis of multi-level hierarchically structured composites, giving prospects for multi-functional structure-property relationships. PMID:23900554

  9. A novel method for a multi-level hierarchical composite with brick-and-mortar structure.

    PubMed

    Brandt, Kristina; Wolff, Michael F H; Salikov, Vitalij; Heinrich, Stefan; Schneider, Gerold A

    2013-01-01

    The fascination with hierarchically structured hard tissues such as enamel or nacre arises from their unique structure-property relationship. Over the last decades this has motivated numerous syntheses of composites mimicking the brick-and-mortar structure of nacre. However, there is still a lack of synthetic engineering materials displaying a true hierarchical structure. Here, we present a novel multi-step processing route for anisotropic two-level hierarchical composites that combines different coating techniques on different length scales. It comprises polymer-encapsulated ceramic particles as building blocks for the first level, followed by spouted-bed spray granulation for a second level, and finally directional hot pressing to anisotropically consolidate the composite. The resulting microstructure reveals a brick-and-mortar hierarchical structure with distinct, though not yet optimized, mechanical properties on each level. This opens up a completely new processing route for the synthesis of multi-level hierarchically structured composites, giving prospects for multi-functional structure-property relationships.

  10. A novel method for a multi-level hierarchical composite with brick-and-mortar structure

    NASA Astrophysics Data System (ADS)

    Brandt, Kristina; Wolff, Michael F. H.; Salikov, Vitalij; Heinrich, Stefan; Schneider, Gerold A.

    2013-07-01

    The fascination with hierarchically structured hard tissues such as enamel or nacre arises from their unique structure-property relationship. Over the last decades this has motivated numerous syntheses of composites mimicking the brick-and-mortar structure of nacre. However, there is still a lack of synthetic engineering materials displaying a true hierarchical structure. Here, we present a novel multi-step processing route for anisotropic two-level hierarchical composites that combines different coating techniques on different length scales. It comprises polymer-encapsulated ceramic particles as building blocks for the first level, followed by spouted-bed spray granulation for a second level, and finally directional hot pressing to anisotropically consolidate the composite. The resulting microstructure reveals a brick-and-mortar hierarchical structure with distinct, though not yet optimized, mechanical properties on each level. This opens up a completely new processing route for the synthesis of multi-level hierarchically structured composites, giving prospects for multi-functional structure-property relationships.

  11. Stalking the IQ Quark.

    ERIC Educational Resources Information Center

    Sternberg, Robert J.

    1979-01-01

    An information-processing framework is presented for understanding intelligence. Two levels of processing are discussed: the steps involved in solving a complex intellectual task, and higher-order processes used to decide how to solve the problem. (MH)

  12. Fabrication of Natural Uranium UO2 Disks (Phase II): Texas A&M Work for Others Summary Document

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gerczak, Tyler J.; Baldwin, Charles A.; Schmidlin, Joshua E.

    The steps to fabricate natural UO2 disks for an irradiation campaign led by Texas A&M University are outlined. The process was initiated with stoichiometry adjustment of the parent U3O8 powder. The next stage of sample preparation involved exploratory pellet pressing and sintering to achieve the desired natural UO2 pellet densities. Ideal densities were achieved through the use of a bimodal powder size blend. The steps involved with disk fabrication are also presented, describing the coring and thinning process executed to achieve the final dimensions.

  13. Targeting, out-scaling and prioritising climate-smart interventions in agricultural systems: Lessons from applying a generic framework to the livestock sector in sub-Saharan Africa.

    PubMed

    Notenbaert, An; Pfeifer, Catherine; Silvestri, Silvia; Herrero, Mario

    2017-02-01

    As a result of population growth, urbanization, and climate change, agricultural systems around the world face enormous pressure on the use of resources. There is a pressing need for wide-scale innovation leading to development that improves the livelihoods and food security of the world's population while at the same time addressing climate change adaptation and mitigation. A variety of promising climate-smart interventions have been identified. What remains, however, is the prioritization of interventions for investment and broad dissemination. The suitability and adoption of interventions depend on a variety of bio-physical and socio-economic factors, and their impacts, when adopted and out-scaled, are likely to be highly heterogeneous. This heterogeneity expresses itself not only spatially and temporally but also in terms of the stakeholders affected: some might win and some might lose. A mechanism that facilitates a systematic, holistic assessment of the likely spread and consequential impact of potential interventions is one way of improving the selection and targeting of such options. In this paper we provide climate-smart agriculture (CSA) planners and implementers at all levels with a generic framework for evaluating and prioritising potential interventions. This entails an iterative process of mapping out recommendation domains, assessing adoption potential, and estimating impacts. Through examples related to livestock production in sub-Saharan Africa, we demonstrate each of the steps and how they are interlinked. The framework is applicable in many different forms, scales, and settings. It has wide applicability beyond the examples presented, and we hope to stimulate readers to integrate the concepts in the planning process for climate-smart agriculture, which invariably involves multi-stakeholder, multi-scale, and multi-objective decision-making.

  14. Use of wavelet-packet transforms to develop an engineering model for multifractal characterization of mutation dynamics in pathological and nonpathological gene sequences

    NASA Astrophysics Data System (ADS)

    Walker, David Lee

    1999-12-01

    This study uses dynamical analysis to examine, in a quantitative fashion, the information-coding mechanism in DNA sequences, going beyond simply modeling the mechanism by treating DNA sequence walks as fractional Brownian motion (fBm) processes. The 2-D mappings of the DNA sequences for this research are Iterated Function System (IFS) mappings (also known as the ``Chaos Game Representation'' (CGR)). This technique converts a 1-D sequence into a 2-D representation that preserves subsequence structure and provides a visual representation. The second step of this analysis involves the application of wavelet packet transforms, a recently developed technique from the field of signal processing. A multi-fractal model is built by using wavelet transforms to estimate the Hurst exponent, H, a non-parametric measurement of the dynamism of a system. This procedure is used to evaluate gene-coding events in the DNA sequence of cystic fibrosis mutations. The H exponent is calculated for various mutation sites in this gene. The results of this study indicate the presence of anti-persistent random walks and persistent ``sub-periods'' in the sequence, which suggests that the hypothesis of a multi-fractal model of DNA information encoding warrants further consideration. This work examines the model's behavior in both pathological (mutation) and non-pathological (healthy) base-pair sequences of the cystic fibrosis gene. These mutations, both natural and synthetic, were introduced by computer manipulation of the original base-pair text files. The results show that disease severity and system ``information dynamics'' correlate. These results have implications for genetic engineering as well as mathematical biology, and suggest that there is scope for more multi-fractal models to be developed.
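
    The Hurst exponent H at the center of this abstract can be estimated in several ways. As a simple, hedged stand-in for the wavelet-packet estimator used in the study, the sketch below uses the aggregated block-sum method: for an increment series, the standard deviation of non-overlapping block sums scales as m^H with block size m, so H is the slope of a log-log fit. The function name and block sizes are illustrative choices of ours.

```python
import numpy as np

def hurst_block_sums(increments, block_sizes=(4, 8, 16, 32, 64, 128, 256)):
    """Estimate the Hurst exponent H of an increment series: the standard
    deviation of non-overlapping block sums scales as m**H, so H is the
    slope of log(std of block sums) versus log(block size)."""
    x = np.asarray(increments, dtype=float)
    sizes, stds = [], []
    for m in block_sizes:
        k = len(x) // m          # number of complete blocks of size m
        if k < 2:
            continue
        sums = x[: k * m].reshape(k, m).sum(axis=1)
        sizes.append(m)
        stds.append(sums.std())
    slope, _ = np.polyfit(np.log(sizes), np.log(stds), 1)
    return slope

# White-noise increments correspond to an ordinary (memoryless) random walk,
# for which H is close to 0.5; H > 0.5 indicates persistence, H < 0.5
# anti-persistence, as discussed in the abstract.
rng = np.random.default_rng(0)
h = hurst_block_sums(rng.standard_normal(20000))
```

    Applied to a numeric walk derived from a DNA sequence (e.g., purine/pyrimidine increments of ±1), the same regression distinguishes persistent from anti-persistent sub-periods.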

  15. Performance Characteristics of the Multi-Zone NAS Parallel Benchmarks

    NASA Technical Reports Server (NTRS)

    Jin, Haoqiang; VanderWijngaart, Rob F.

    2003-01-01

    We describe a new suite of computational benchmarks that models applications featuring multiple levels of parallelism. Such parallelism is often available in realistic flow computations on systems of grids, but had not previously been captured in benchmarks. The new suite, named NPB Multi-Zone, is extended from the NAS Parallel Benchmarks suite and involves solving the application benchmarks LU, BT, and SP on collections of loosely coupled discretization meshes. The solutions on the meshes are updated independently, but after each time step they exchange boundary value information. This strategy provides relatively easily exploitable coarse-grain parallelism between meshes. Three reference implementations are available: one serial, one hybrid using the Message Passing Interface (MPI) and OpenMP, and another hybrid using a shared-memory multi-level programming model (SMP+OpenMP). We examine the effectiveness of hybrid parallelization paradigms in these implementations on three different parallel computers. We also use an empirical formula to investigate the performance characteristics of the multi-zone benchmarks.
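
    The update-then-exchange pattern described in this abstract (zones advance independently, then swap boundary values after each time step) can be sketched on a toy 1-D diffusion problem with two zones and ghost cells. This illustrates only the coupling pattern, not the LU/BT/SP solvers themselves; all names and the stencil are our illustrative choices.

```python
import numpy as np

def step_zone(u, alpha=0.25):
    """Explicit diffusion update of one zone's interior cells; u[0] and u[-1]
    are ghost cells holding boundary values from the previous exchange."""
    u[1:-1] += alpha * (u[:-2] - 2.0 * u[1:-1] + u[2:])

def exchange(a, b):
    """Boundary-value exchange performed once per time step: outer ghosts
    mirror the interior (zero-flux walls); inner ghosts copy the
    neighboring zone's edge cell, coupling the two zones."""
    a[0] = a[1]      # zero-flux outer wall of zone A
    b[-1] = b[-2]    # zero-flux outer wall of zone B
    a[-1] = b[1]     # A's right ghost <- B's leftmost interior cell
    b[0] = a[-2]     # B's left ghost  <- A's rightmost interior cell

# Two zones of 16 interior cells (+2 ghost cells each): A hot, B cold.
a = np.ones(18)
b = np.zeros(18)
for _ in range(5000):
    exchange(a, b)   # coarse-grain synchronization point between zones
    step_zone(a)     # zones then update independently (parallelizable)
    step_zone(b)
```

    Because the interface flux seen by each zone is symmetric, total heat is conserved and both zones relax toward the common mean of 0.5; in the benchmark suite, each `step_zone` call stands in for a full mesh solve that can run on its own processor group.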

  16. Domain Definition and Search Techniques in Meta-Analyses of L2 Research (or Why 18 Meta-Analyses of Feedback Have Different Results)

    ERIC Educational Resources Information Center

    Plonsky, Luke; Brown, Dan

    2015-01-01

    Applied linguists have turned increasingly in recent years to meta-analysis as the preferred means of synthesizing quantitative research. The first step in the meta-analytic process involves defining a domain of interest. Despite its apparent simplicity, this step involves a great deal of subjectivity on the part of the meta-analyst. This article…

  17. Implementation of Joint Multi-Segment Training

    NASA Technical Reports Server (NTRS)

    Reagan, Marc; Smith, Wyatt; Bugrova, Skella; Silkov, Sergei

    2000-01-01

    The highest level of training for ISS flight is Joint Multi-Segment Training (JMST) simulations. These simulations allow two or more partners to conduct multi-segment training for their respective Mission Control Centers (MCC), include actual crew members, and usually include training facilities in each of the participating International Partner (IP) locations. It is the dress rehearsal for those events that exercise the interface between different IP modules and/or the decision making process between the different MCCs involved. This presentation will describe the challenge of successfully implementing JMST. It will start with a brief overview of who is involved, where they are located, and when JMSTs are required. Finally, it will illustrate many of the complications involved in just running a JMST between MCC-M and MCC-H. The viewer will leave with a much better appreciation for the complexities involved in successfully conducting a JMST of this nature, as well as an idea of how the picture will change as the other partners and payloads become involved.

  18. Imaging Study of Multi-Crystalline Silicon Wafers Throughout the Manufacturing Process: Preprint

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Johnston, S.; Yan, F.; Zaunbracher, K.

    2011-07-01

    Imaging techniques are applied to multi-crystalline silicon bricks, wafers at various process steps, and finished solar cells. Photoluminescence (PL) imaging is used to characterize defects and material quality on bricks and wafers. Defect regions within the wafers are influenced by brick position within an ingot and height within the brick. The defect areas in as-cut wafers are compared to imaging results from reverse-bias electroluminescence and dark lock-in thermography and to the cell parameters of near-neighbor finished cells. Defect areas are also characterized by defect band emissions. The defect areas measured by these techniques on as-cut wafers are shown to correlate with finished cell performance.

  19. Testing the methodology for dosimetry audit of heterogeneity corrections and small MLC-shaped fields: Results of IAEA multi-center studies

    PubMed Central

    Izewska, Joanna; Wesolowska, Paulina; Azangwe, Godfrey; Followill, David S.; Thwaites, David I.; Arib, Mehenna; Stefanic, Amalia; Viegas, Claudio; Suming, Luo; Ekendahl, Daniela; Bulski, Wojciech; Georg, Dietmar

    2016-01-01

    Abstract The International Atomic Energy Agency (IAEA) has a long tradition of supporting the development of methodologies for national networks providing quality audits in radiotherapy. A series of co-ordinated research projects (CRPs) has been conducted by the IAEA since 1995, assisting national external audit groups in developing national audit programs. The CRP ‘Development of Quality Audits for Radiotherapy Dosimetry for Complex Treatment Techniques’ was conducted in 2009–2012 as an extension of previously developed audit programs. Material and methods. The CRP work described in this paper focused on developing and testing two steps of dosimetry audit: verification of heterogeneity corrections, and treatment planning system (TPS) modeling of small MLC fields, which are important for the initial stages of complex radiation treatments, such as IMRT. The project involved development of a new solid slab phantom with heterogeneities containing special measurement inserts for thermoluminescent dosimeters (TLD) and radiochromic films. The phantom and the audit methodology have been developed at the IAEA and tested in multi-center studies involving the CRP participants. Results. The results of multi-center testing of the methodology for two steps of dosimetry audit show that the design of the audit procedures is adequate and the methodology is feasible for meeting the audit objectives. A total of 97% of TLD results in heterogeneity situations obtained in the study were within 3%, and all results were within 5% agreement with the TPS-predicted doses. In contrast, only 64% of small-beam profiles were within 3 mm agreement between the TPS-calculated and film-measured doses. Film dosimetry results have highlighted some limitations in TPS modeling of small beam profiles in the direction of MLC leaf movements. Discussion. Through multi-center testing, any challenges or difficulties in the proposed audit methodology were identified, and the methodology improved. Using the experience of these studies, the participants could incorporate the auditing procedures into their national programs. PMID:26934916

  20. Testing the methodology for dosimetry audit of heterogeneity corrections and small MLC-shaped fields: Results of IAEA multi-center studies.

    PubMed

    Izewska, Joanna; Wesolowska, Paulina; Azangwe, Godfrey; Followill, David S; Thwaites, David I; Arib, Mehenna; Stefanic, Amalia; Viegas, Claudio; Suming, Luo; Ekendahl, Daniela; Bulski, Wojciech; Georg, Dietmar

    2016-07-01

    The International Atomic Energy Agency (IAEA) has a long tradition of supporting the development of methodologies for national networks providing quality audits in radiotherapy. A series of co-ordinated research projects (CRPs) has been conducted by the IAEA since 1995, assisting national external audit groups in developing national audit programs. The CRP 'Development of Quality Audits for Radiotherapy Dosimetry for Complex Treatment Techniques' was conducted in 2009-2012 as an extension of previously developed audit programs. The CRP work described in this paper focused on developing and testing two steps of dosimetry audit: verification of heterogeneity corrections, and treatment planning system (TPS) modeling of small MLC fields, which are important for the initial stages of complex radiation treatments, such as IMRT. The project involved the development of a new solid slab phantom with heterogeneities containing special measurement inserts for thermoluminescent dosimeters (TLD) and radiochromic films. The phantom and the audit methodology have been developed at the IAEA and tested in multi-center studies involving the CRP participants. The results of multi-center testing of the methodology for the two steps of dosimetry audit show that the design of the audit procedures is adequate and the methodology is feasible for meeting the audit objectives. A total of 97% of the TLD results in heterogeneity situations were within 3% agreement, and all results within 5% agreement, with the TPS-predicted doses. In contrast, only 64% of the small beam profiles were within 3 mm agreement between the TPS-calculated and film-measured doses. Film dosimetry results have highlighted some limitations in TPS modeling of small beam profiles in the direction of MLC leaf movements. Through multi-center testing, any challenges or difficulties in the proposed audit methodology were identified, and the methodology improved. 
Using the experience of these studies, the participants could incorporate the auditing procedures in their national programs.
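
    The audit comparisons above are tolerance checks of measured against TPS-predicted doses. As a minimal sketch of how such 3%/5% agreement fractions are computed (the function name and dose values below are illustrative, not from the study):

```python
def agreement_fraction(measured, predicted, tolerance):
    """Fraction of measured/predicted dose pairs whose ratio is within
    +/- tolerance (expressed as a fraction, e.g. 0.03 for 3%)."""
    within = sum(1 for m, p in zip(measured, predicted)
                 if abs(m / p - 1.0) <= tolerance)
    return within / len(measured)

# hypothetical TLD-measured vs TPS-predicted doses (Gy), for illustration only
tps = [2.00, 2.00, 2.00, 2.00]
tld = [2.02, 1.97, 2.04, 1.92]

frac_3pct = agreement_fraction(tld, tps, 0.03)  # fraction within 3%
frac_5pct = agreement_fraction(tld, tps, 0.05)  # fraction within 5%
```

    In this invented sample, one of four points exceeds the 3% tolerance but all pass at 5%, mirroring the structure (though not the numbers) of the reported TLD results.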

  1. Controlling high-throughput manufacturing at the nano-scale

    NASA Astrophysics Data System (ADS)

    Cooper, Khershed P.

    2013-09-01

    Interest in nano-scale manufacturing research and development is growing. The reason is to accelerate the translation of discoveries and inventions of nanoscience and nanotechnology into products that would benefit industry, the economy and society. Ongoing research in nanomanufacturing is focused primarily on developing novel nanofabrication techniques for a variety of applications—materials, energy, electronics, photonics, biomedical, etc. Our goal is to foster the development of high-throughput methods of fabricating nano-enabled products. Large-area parallel processing and high-speed continuous processing are high-throughput means for mass production. An example of large-area processing is step-and-repeat nanoimprinting, by which nanostructures are reproduced again and again over a large area, such as a 12-inch wafer. Roll-to-roll processing is an example of continuous processing, by which it is possible to print and imprint multi-level nanostructures and nanodevices on a moving flexible substrate. The big pay-off is high-volume production and low unit cost. However, the anticipated cost benefits can only be realized if the increased production rate is accompanied by high yields of high-quality products. To ensure product quality, we need to design and construct manufacturing systems such that the processes can be closely monitored and controlled. One approach is to bring cyber-physical systems (CPS) concepts to nanomanufacturing. CPS involves the control of a physical system such as manufacturing through modeling, computation, communication and control. Such a closely coupled system will involve in-situ metrology and closed-loop control of the physical processes, guided by physics-based models and driven by appropriate instrumentation, sensing and actuation. This paper will discuss these ideas in the context of controlling high-throughput manufacturing at the nano-scale.
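
    The closed-loop idea described above (in-situ metrology feeding a controller that drives the process toward a target) can be illustrated with a minimal discrete PI loop. The plant model, gains and variable names below are invented for illustration and are not from the paper:

```python
def pi_control(setpoint, plant_gain=0.8, kp=0.5, ki=0.3, steps=50):
    """Minimal discrete closed-loop sketch: each cycle, an (assumed)
    in-situ measurement y is compared to the setpoint and a PI law
    computes the actuation command; the plant is a toy static gain."""
    y = 0.0
    integral = 0.0
    history = []
    for _ in range(steps):
        error = setpoint - y             # metrology: measured vs target
        integral += error                # accumulated error for the I-term
        u = kp * error + ki * integral   # PI actuation command
        y = plant_gain * u               # illustrative static plant response
        history.append(y)
    return history

history = pi_control(100.0)  # drive a process variable toward 100
```

    With these (arbitrary) gains the loop starts far from the target and settles onto the setpoint, which is the qualitative behavior a CPS-style nanomanufacturing controller aims for.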

  2. Fabrication of hybrid molecular devices using multi-layer graphene break junctions.

    PubMed

    Island, J O; Holovchenko, A; Koole, M; Alkemade, P F A; Menelaou, M; Aliaga-Alcalde, N; Burzurí, E; van der Zant, H S J

    2014-11-26

    We report on the fabrication of hybrid molecular devices employing multi-layer graphene (MLG) flakes which are patterned with a constriction using a helium ion microscope or an oxygen plasma etch. The patterning step allows for the localization of a few-nanometer gap, created by electroburning, that can host single molecules or molecular ensembles. By controlling the width of the sculpted constriction, we regulate the critical power at which the electroburning process begins. We estimate the flake temperature given the critical power and find that at low powers it is possible to electroburn MLG with superconducting contacts in close proximity. Finally, we demonstrate the fabrication of hybrid devices with superconducting contacts and anthracene-functionalized copper curcuminoid molecules. This method is extendable to spintronic devices with ferromagnetic contacts and a first step towards molecular integrated circuits.

  3. Fabrication of hybrid molecular devices using multi-layer graphene break junctions

    NASA Astrophysics Data System (ADS)

    Island, J. O.; Holovchenko, A.; Koole, M.; Alkemade, P. F. A.; Menelaou, M.; Aliaga-Alcalde, N.; Burzurí, E.; van der Zant, H. S. J.

    2014-11-01

    We report on the fabrication of hybrid molecular devices employing multi-layer graphene (MLG) flakes which are patterned with a constriction using a helium ion microscope or an oxygen plasma etch. The patterning step allows for the localization of a few-nanometer gap, created by electroburning, that can host single molecules or molecular ensembles. By controlling the width of the sculpted constriction, we regulate the critical power at which the electroburning process begins. We estimate the flake temperature given the critical power and find that at low powers it is possible to electroburn MLG with superconducting contacts in close proximity. Finally, we demonstrate the fabrication of hybrid devices with superconducting contacts and anthracene-functionalized copper curcuminoid molecules. This method is extendable to spintronic devices with ferromagnetic contacts and a first step towards molecular integrated circuits.

  4. Responsible innovation in port development: the Rotterdam Maasvlakte 2 and the Dalian Dayao Bay extension projects.

    PubMed

    Ravesteijn, Wim; Liu, Yi; Yan, Ping

    2015-01-01

    The paper outlines and specifies 'responsible port innovation', introducing the development of a methodological and procedural step-by-step plan for the implementation and evaluation of (responsible) innovations. Subsequently, it uses this as a guideline for the analysis and evaluation of two case-studies. The construction of the Rotterdam Maasvlakte 2 Port meets most of the formulated requirements, though making values more explicit and treating it as a process right from the start could have benefitted the project. The Dalian Dayao Port could improve its decision-making procedures in several respects, including the introduction of new methods to handle value tensions. Both projects show that public support is crucial in responsible port innovation and that it should be not only a multi-faceted but also a multi-level strategy.

  5. Microelectrode voltammetry of multi-electron transfers complicated by coupled chemical equilibria: a general theory for the extended square scheme.

    PubMed

    Laborda, Eduardo; Gómez-Gil, José María; Molina, Angela

    2017-06-28

    A very general and simple theoretical solution is presented for the current-potential-time response of reversible multi-electron transfer processes complicated by homogeneous chemical equilibria (the so-called extended square scheme). The expressions presented here are applicable regardless of the number of electrons transferred and coupled chemical processes, and they are particularized for a wide variety of microelectrode geometries. The voltammetric response of very different systems presenting multi-electron transfers is considered for the most widely-used techniques (namely, cyclic voltammetry, square wave voltammetry, differential pulse voltammetry and steady state voltammetry), studying the influence of the microelectrode geometry and the number and thermodynamics of the (electro)chemical steps. Most appropriate techniques and procedures for the determination of the 'interaction' between successive transfers are discussed. Special attention is paid to those situations where homogeneous chemical processes, such as protonation, complexation or ion association, affect the electrochemical behaviour of the system by different stabilization of the oxidation states.
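
    The closed-form flavour of such solutions can be illustrated, far more modestly, with the textbook steady-state response of a reversible two-step EE reduction (O + e ⇌ I, I + e ⇌ R) at a microelectrode. The function below is a sketch under equal-diffusivity, Nernstian assumptions; the names are ours, not the paper's notation:

```python
import math

F_RT = 96485.0 / (8.314 * 298.15)  # F/RT at 25 C, in 1/V

def ee_steady_state_current(E, E1, E2):
    """Steady-state current, in units of the one-electron limiting current,
    for a reversible EE reduction O -e-> I -e-> R with formal potentials
    E1 and E2, assuming equal diffusion coefficients and Nernstian
    equilibrium at the electrode surface."""
    t1 = math.exp(F_RT * (E - E1))  # surface ratio [O]/[I]
    t2 = math.exp(F_RT * (E - E2))  # surface ratio [I]/[R]
    return (1.0 + 2.0 / t2) / (t1 + 1.0 + 1.0 / t2)
```

    At potentials far negative of both formal potentials the normalized current approaches 2 (both electrons transferred); with well-separated steps, E = E1 sits at half of the first one-electron wave. The spacing of E1 and E2 is the 'interaction' between successive transfers discussed in the record.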

  6. Impact of user influence on information multi-step communication in a micro-blog

    NASA Astrophysics Data System (ADS)

    Wu, Yue; Hu, Yong; He, Xiao-Hai; Deng, Ken

    2014-06-01

    User influence is generally considered one of the most critical factors affecting information cascade spreading. Based on this common assumption, this paper proposes a theoretical model to examine user influence on multi-step information communication in a micro-blog. The steps of information communication are divided into first-step and non-first-step, and user influence is classified into five dimensions. Actual data from the Sina micro-blog are collected to construct the model by means of a structural-equation approach using the Partial Least Squares (PLS) technique. Our experimental results indicate that the number of fans and their authority significantly impact first-step communication. Leader rank has a positive impact on both first-step and non-first-step communication. Moreover, global centrality and weight of friends are positively related to non-first-step communication, but authority is found to have much less relation to it.

  7. An analysis of hydrogen production via closed-cycle schemes. [thermochemical processings from water

    NASA Technical Reports Server (NTRS)

    Chao, R. E.; Cox, K. E.

    1975-01-01

    A thermodynamic analysis and state-of-the-art review of three basic schemes for the production of hydrogen from water are presented: electrolysis, thermal water-splitting, and multi-step thermochemical closed cycles. Criteria for work-saving thermochemical closed-cycle processes are established, and several schemes are reviewed in light of such criteria. An economic analysis is also presented in the context of energy costs.

  8. Recognising and referring children exposed to domestic abuse: a multi-professional, proactive systems-based evaluation using a modified Failure Mode and Effects Analysis (FMEA).

    PubMed

    Ashley, Laura; Armitage, Gerry; Taylor, Julie

    2017-03-01

    Failure Modes and Effects Analysis (FMEA) is a prospective quality assurance methodology increasingly used in healthcare, which identifies potential vulnerabilities in complex, high-risk processes and generates remedial actions. We aimed, for the first time, to apply FMEA in a social care context to evaluate the process for recognising and referring children exposed to domestic abuse within one Midlands city safeguarding area in England. A multidisciplinary, multi-agency team of 10 front-line professionals undertook the FMEA, using a modified methodology, over seven group meetings. The FMEA included mapping out the process under evaluation to identify its component steps, identifying failure modes (potential errors) and possible causes for each step and generating corrective actions. In this article, we report the output from the FMEA, including illustrative examples of the failure modes and corrective actions generated. We also present an analysis of feedback from the FMEA team and provide future recommendations for the use of FMEA in appraising social care processes and practice. Although challenging, the FMEA was unequivocally valuable for team members and generated a significant number of corrective actions locally for the safeguarding board to consider in its response to children exposed to domestic abuse. © 2016 John Wiley & Sons Ltd.

  9. Application of multi response optimization with grey relational analysis and fuzzy logic method

    NASA Astrophysics Data System (ADS)

    Winarni, Sri; Wahyu Indratno, Sapto

    2018-01-01

    Multi-response optimization is an optimization process that considers multiple responses simultaneously. The purpose of this research is to find the optimum point in a multi-response optimization process using grey relational analysis and the fuzzy logic method. The optimum point is determined from the Fuzzy-GRG (Grey Relational Grade) variable, which is a conversion of the Signal-to-Noise Ratios of the responses involved. The case study used in this research is the optimization of electrical process parameters in electrical discharge machining. It was found that the combination of treatments resulting in optimum MRR and SR was a gap voltage of 70 V, a peak current of 9 A and a duty factor of 0.8.
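
    The grey-relational-grade step of this approach can be sketched as follows. The fuzzy-logic conversion is omitted; the sketch assumes larger-is-better S/N ratios, the common distinguishing coefficient ζ = 0.5, equal response weights, and invented sample values:

```python
def grey_relational_grade(sn_matrix, zeta=0.5):
    """Grey relational grade (GRG) for larger-is-better S/N ratios.
    sn_matrix: rows = treatment combinations, columns = responses.
    Assumes every response column varies (max != min)."""
    cols = list(zip(*sn_matrix))
    # normalize each response to [0, 1], larger-is-better
    norm_cols = [[(x - min(c)) / (max(c) - min(c)) for x in c] for c in cols]
    grades = []
    for row in zip(*norm_cols):
        deltas = [1.0 - x for x in row]  # deviation from the ideal value 1.0
        # after normalization the global minimum deviation is 0 and the
        # maximum is 1, so the grey relational coefficient reduces to
        # zeta / (delta + zeta)
        coeffs = [zeta / (d + zeta) for d in deltas]
        grades.append(sum(coeffs) / len(coeffs))  # equal response weights
    return grades

# invented S/N values for three treatment combinations and two responses
grades = grey_relational_grade([[10.0, 5.0], [8.0, 3.0], [6.0, 1.0]])
```

    The treatment with the highest grade is taken as the multi-response optimum; here the first row dominates both responses and receives grade 1.0.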

  10. Measurements of gluconeogenesis and glycogenolysis: A methodological review

    USDA-ARS?s Scientific Manuscript database

    Gluconeogenesis is a complex metabolic process that involves multiple enzymatic steps regulated by myriad factors, including substrate concentrations, the redox state, activation and inhibition of specific enzyme steps, and hormonal modulation. At present, the most widely accepted technique to deter...

  11. Industrial application experiment series

    NASA Technical Reports Server (NTRS)

    Bluhm, S. A.

    1980-01-01

    Parabolic dish systems will be deployed into the industrial sector to provide users, suppliers, sponsors, and developers with a realistic assessment of system feasibility in selected near-term industrial applications, initially through the industrial module experiment and later through additional experiments involving thermal, electric, and combined thermal and electric systems. The approach is to progress through steps, from single-module to multi-module systems, and from thermal-only applications to more complex combined thermal and electric applications. The experience of other solar thermal experiments, particularly those involving parabolic dish hardware, will be utilized to the fullest extent possible in experiment planning and implementation.

  12. An experimental investigation on the thermal field of overlapping layers in laser-assisted tape winding process

    NASA Astrophysics Data System (ADS)

    Hosseini, S. M. A.; Baran, I.; Akkerman, R.

    2018-05-01

    Laser-assisted tape winding (LATW) is an automated process for manufacturing fiber-reinforced thermoplastic tubular products, such as pipes and pressure vessels. Multi-physical phenomena such as heat transfer, mechanical bonding, phase changes and solid mechanics take place during the process. These phenomena need to be understood and described well for improved product reliability. Temperature is one of the important parameters for controlling and optimizing product quality in this process, and it can be employed in an intelligent model-based inline control system. Depending on the lay-up configuration, the incoming tape can overlap with an already-wound layer, stepping on or off a previously deposited layer/laminate. During such overlapping, the part temperature changes due to the variation of the geometry caused by the previously deposited layer, i.e. a bump geometry. In order to characterize the temperature behavior at the bump regions, an experimental set-up is designed on a flat laminate. Artificial bumps/steps are formed on the laminate with various thicknesses and fiber orientations. As the laser head passes the step-on and step-off, an IR (infra-red) camera and embedded thermocouples measure the temperature on the surface and inside the laminate, respectively. During step-on, a small drop in temperature is observed, while during step-off a higher temperature peak is observed. It can be concluded that the change in temperature during overlapping is due to the change in laser incident angle caused by the bump geometry. The effect of the step thickness on the temperature peak is quantified and found to be significant.

  13. Process for separating carbon dioxide from flue gas using sweep-based membrane separation and absorption steps

    DOEpatents

    Wijmans, Johannes G.; Baker, Richard W.; Merkel, Timothy C.

    2012-08-21

    A gas separation process for treating flue gases from combustion processes, and combustion processes including such gas separation. The invention involves routing a first portion of the flue gas stream to be treated to an absorption-based carbon dioxide capture step, while simultaneously flowing a second portion of the flue gas across the feed side of a membrane, flowing a sweep gas stream, usually air, across the permeate side, then passing the permeate/sweep gas to the combustor.

  14. Submesoscale Flows and Mixing in the Oceanic Surface Layer Using the Regional Oceanic Modeling System (ROMS)

    DTIC Science & Technology

    2014-09-30

    continuation of the evolution of the Regional Oceanic Modeling System (ROMS) as a multi-scale, multi-process model and its utilization for...hydrostatic component of ROMS (Kanarska et al., 2007) is required to increase its efficiency and generality. The non-hydrostatic ROMS involves the solution...instability and wind-driven mixing. For the computational regime where those processes can be partially, but not yet fully resolved, it will

  15. The autumn effect: timing of physical dormancy break in seeds of two winter annual species of Geraniaceae by a stepwise process

    PubMed Central

    Gama-Arachchige, N. S.; Baskin, J. M.; Geneve, R. L.; Baskin, C. C.

    2012-01-01

    Background and Aims The involvement of two steps in the physical dormancy (PY)-breaking process previously has been demonstrated in seeds of Fabaceae and Convolvulaceae. Even though there is a claim for a moisture-controlled stepwise PY-breaking in some species of Geraniaceae, no study has evaluated the role of temperature in the PY-breaking process in this family. The aim of this study was to determine whether a temperature-controlled stepwise PY-breaking process occurs in seeds of the winter annuals Geranium carolinianum and G. dissectum. Methods Seeds of G. carolinianum and G. dissectum were stored under different temperature regimes to test the effect of storage temperature on PY-break. The role of temperature and moisture regimes in regulating PY-break was investigated by treatments simulating natural conditions. Greenhouse (non-heated) experiments on seed germination and burial experiments (outdoors) were carried out to determine the PY-breaking behaviour in the natural habitat. Key Results Irrespective of moisture conditions, sensitivity to the PY-breaking step in seeds of G. carolinianum was induced at temperatures ≥20 °C, and exposure to temperatures ≤20 °C made the sensitive seeds permeable. Sensitivity of seeds increased with time. In G. dissectum, PY-break occurred at temperatures ≥20 °C in a single step under constant wet or dry conditions and in two steps under alternate wet–dry conditions if seeds were initially kept wet. Conclusions Timing of seed germination with the onset of autumn can be explained by PY-breaking processes involving (a) two temperature-dependent steps in G. carolinianum and (b) one or two moisture-dependent step(s) along with the inability to germinate under high temperatures in G. dissectum. Geraniaceae is the third of 18 families with PY in which a two-step PY-breaking process has been demonstrated. PMID:22684684
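
    The two temperature-dependent steps reported for G. carolinianum lend themselves to a state-machine caricature. The sketch below is an illustrative simplification (moisture, exposure duration and behaviour exactly at the 20 °C boundary are glossed over):

```python
def py_break_carolinianum(temps_c):
    """Two-step physical dormancy (PY) break, simplified:
    step 1: temperature >= 20 C induces sensitivity in dormant seeds;
    step 2: a later temperature <= 20 C makes sensitive seeds permeable."""
    state = "dormant"
    for t in temps_c:
        if state == "dormant" and t >= 20.0:
            state = "sensitive"
        elif state == "sensitive" and t <= 20.0:
            state = "permeable"  # PY broken; germination can proceed
    return state
```

    A summer-then-autumn temperature sequence breaks dormancy, while constantly cool or constantly warm conditions do not, which is the "autumn effect" of the title.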

  16. Systems Engineering Lessons Learned for Class D Missions

    NASA Technical Reports Server (NTRS)

    Rojdev, Kristina; Piatek, Irene; Moore, Josh; Calvert, Derek

    2015-01-01

    One of NASA's goals within human exploration is to determine how to get humans to Mars safely and to live and work on the Martian surface. To accomplish this goal, several smaller missions act as stepping-stones to the larger end goal. NASA uses these smaller missions to develop new technologies and learn about how to survive outside of Low Earth Orbit for long periods. Additionally, keeping a cadence of these missions allows the team to maintain proficiency in the complex art of bringing spacecraft to fruition. Many of these smaller missions are robotic in nature and have smaller timescales, whereas there are others that involve crew and have longer mission timelines. Given the timelines associated with these various missions, different levels of risk and rigor need to be implemented to be more in line with what is appropriate for the mission. Thus, NASA has four different classifications that range from Class A to Class D based on the mission details. One of these projects is the Resource Prospector (RP) Mission, which is a multi-center and multi-institution collaborative project to search for volatiles in the polar regions of the Moon. The RP mission is classified as a Class D mission and as such, has the opportunity to more tightly manage, and therefore accept, greater levels of risk. The requirements for Class D missions were at the forefront of the design and thus presented unique challenges in vehicle development and systems engineering processes. This paper will discuss the systems engineering process at NASA and how that process is tailored for Class D missions, specifically the RP mission.

  17. Linking the Grain Scale to Experimental Measurements and Other Scales

    NASA Astrophysics Data System (ADS)

    Vogler, Tracy

    2017-06-01

    A number of physical processes occur at the scale of grains that can have a profound influence on the behavior of materials under shock loading. Examples include inelastic deformation, pore collapse, fracture, friction, and internal wave reflections. In some cases such as the initiation of energetics and brittle fracture, these processes can have first order effects on the behavior of materials: the emergent behavior from the grain scale is the dominant one. In other cases, many aspects of the bulk behavior can be described by a continuum description, but some details of the behavior are missed by continuum descriptions. The multi-scale model paradigm envisions flow of information from smaller scales (atomic, dislocation, etc.) to the grain or mesoscale and the up to the continuum scale. A significant challenge in this approach is the need to validate each step. For the grain scale, diagnosing behavior is challenging because of the small spatial and temporal scales involved. Spatially resolved diagnostics have begun to shed light on these processes, and, more recently, advanced light sources have started to be used to probe behavior at the grain scale. In this talk, I will discuss some interesting phenomena that occur at the grain scale in shock loading, experimental approaches to probe the grain scale, and efforts to link the grain scale to smaller and larger scales. Sandia National Laboratories is a multi-mission laboratory managed and operated by Sandia Corporation, a wholly owned subsidiary of Lockheed Martin Corporation, for the U.S. Department of Energy's National Nuclear Security Administration under contract DE.

  18. Method of forming composite fiber blends and molding same

    NASA Technical Reports Server (NTRS)

    McMahon, Paul E. (Inventor); Chung, Tai-Shung (Inventor)

    1989-01-01

    The instant invention involves a process used in preparing fibrous tows which may be formed into polymeric plastic composites. The process involves the steps of (a) forming a tow of strong filamentary materials; (b) forming a thermoplastic polymeric fiber; (c) intermixing the two tows; and (d) withdrawing the intermixed tow for further use.

  19. Continuous, linearly intermixed fiber tows and composite molded article therefrom

    NASA Technical Reports Server (NTRS)

    McMahon, Paul E. (Inventor); Chung, Tai-Shung (Inventor); Ying, Lincoln (Inventor)

    2000-01-01

    The instant invention involves a process used in preparing fibrous tows which may be formed into polymeric plastic composites. The process involves the steps of (a) forming a carbon fiber tow; (b) forming a thermoplastic polymeric fiber tow; (c) intermixing the two tows; and (d) withdrawing the intermixed tow for further use.

  20. One-pot growth of two-dimensional lateral heterostructures via sequential edge-epitaxy

    NASA Astrophysics Data System (ADS)

    Sahoo, Prasana K.; Memaran, Shahriar; Xin, Yan; Balicas, Luis; Gutiérrez, Humberto R.

    2018-01-01

    Two-dimensional heterojunctions of transition-metal dichalcogenides have great potential for application in low-power, high-performance and flexible electro-optical devices, such as tunnelling transistors, light-emitting diodes, photodetectors and photovoltaic cells. Although complex heterostructures have been fabricated via the van der Waals stacking of different two-dimensional materials, the in situ fabrication of high-quality lateral heterostructures with multiple junctions remains a challenge. Transition-metal-dichalcogenide lateral heterostructures have been synthesized via single-step, two-step or multi-step growth processes. However, these methods lack the flexibility to control, in situ, the growth of individual domains. In situ synthesis of multi-junction lateral heterostructures does not require multiple exchanges of sources or reactors, a limitation in previous approaches as it exposes the edges to ambient contamination, compromises the homogeneity of domain size in periodic structures, and results in long processing times. Here we report a one-pot synthetic approach, using a single heterogeneous solid source, for the continuous fabrication of lateral multi-junction heterostructures consisting of monolayers of transition-metal dichalcogenides. The sequential formation of heterojunctions is achieved solely by changing the composition of the reactive gas environment in the presence of water vapour. This enables selective control of the water-induced oxidation and volatilization of each transition-metal precursor, as well as its nucleation on the substrate, leading to sequential edge-epitaxy of distinct transition-metal dichalcogenides. Photoluminescence maps confirm the sequential spatial modulation of the bandgap, and atomic-resolution images reveal defect-free lateral connectivity between the different transition-metal-dichalcogenide domains within a single crystal structure. 
Electrical transport measurements revealed diode-like responses across the junctions. Our new approach offers greater flexibility and control than previous methods for continuous growth of transition-metal-dichalcogenide-based multi-junction lateral heterostructures. These findings could be extended to other families of two-dimensional materials, and establish a foundation for the development of complex and atomically thin in-plane superlattices, devices and integrated circuits.

  1. A Case Study Approach to Marine and Aquatic Issues.

    ERIC Educational Resources Information Center

    Snively, Gloria

    1993-01-01

    Suggests using case studies of resource management conflict involving marine and aquatic resource issues to increase student involvement in decision-making processes. Provides information for a potential case involving oyster farms and six steps to help students explore problems and make decisions. (MDH)

  2. Multi-dimensional Fokker-Planck equation analysis using the modified finite element method

    NASA Astrophysics Data System (ADS)

    Náprstek, J.; Král, R.

    2016-09-01

    The Fokker-Planck equation (FPE) is a frequently used tool for the solution of the cross probability density function (PDF) of a dynamic system response excited by a vector of random processes. FEM represents a very effective solution possibility, particularly when transition processes are investigated or a more detailed solution is needed. Existing papers deal with single-degree-of-freedom (SDOF) systems only, so the respective FPE includes only two independent space variables. Stepping beyond this limit to MDOF systems, a number of specific problems related to true multi-dimensionality must be overcome. Unlike earlier studies, multi-dimensional simplex elements in arbitrary dimension should be deployed and rectangular (multi-brick) elements abandoned. Simple closed formulae for integration over a multi-dimensional domain have been derived. Another specific problem is the generation of the multi-dimensional finite element mesh. The assembly of the global system matrices requires newly composed algorithms owing to the multi-dimensionality: the system matrices are quite full, so no advantage can be taken of their sparse character, as is common in conventional FEM applications to 2D/3D problems. After verification of the partial algorithms, an illustrative example dealing with a 2DOF non-linear aeroelastic system under combined random and deterministic excitation is discussed.
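
    As a much simpler, low-dimensional illustration of the equation the record solves (using finite differences rather than the paper's multi-dimensional simplex FEM), the 1-D FPE for an Ornstein-Uhlenbeck process can be marched to its known Gaussian steady state; all parameter values are illustrative:

```python
def fpe_ou_steady_state(gamma=1.0, D=0.5, L=5.0, n=101, dt=1.0e-3, steps=3000):
    """Explicit finite-difference march of the 1-D Fokker-Planck equation
    dp/dt = d/dx(gamma*x*p) + D*d2p/dx2 (Ornstein-Uhlenbeck process)
    until near steady state. The exact stationary PDF is Gaussian with
    variance D/gamma. Returns the grid and the normalized density."""
    dx = 2.0 * L / (n - 1)
    xs = [-L + i * dx for i in range(n)]
    p = [0.5 if abs(x) <= 1.0 else 0.0 for x in xs]  # initial guess on [-1, 1]
    for _ in range(steps):
        new = p[:]
        for i in range(1, n - 1):
            drift = (gamma * xs[i + 1] * p[i + 1]
                     - gamma * xs[i - 1] * p[i - 1]) / (2.0 * dx)
            diffusion = D * (p[i + 1] - 2.0 * p[i] + p[i - 1]) / dx ** 2
            new[i] = p[i] + dt * (drift + diffusion)
        new[0] = new[-1] = 0.0  # far boundaries, where the density is negligible
        p = new
    Z = sum(p) * dx
    return xs, [v / Z for v in p]

xs, p = fpe_ou_steady_state()
dx = xs[1] - xs[0]
variance = sum(pi * x * x for pi, x in zip(p, xs)) * dx  # should approach D/gamma
```

    Already in 1-D the time step is constrained by the grid (dt ≲ dx²/2D); the paper's point is that moving to MDOF systems multiplies such difficulties across dimensions.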

  3. Analysis of the phosphorescent dye concentration dependence of triplet-triplet annihilation in organic host-guest systems

    NASA Astrophysics Data System (ADS)

    Zhang, L.; van Eersel, H.; Bobbert, P. A.; Coehoorn, R.

    2016-10-01

    Using a novel method for analyzing transient photoluminescence (PL) experiments, a microscopic description is obtained for the dye concentration dependence of triplet-triplet annihilation (TTA) in phosphorescent host-guest systems. It is demonstrated that the TTA-mechanism, which could be a single-step dominated process or a diffusion-mediated multi-step process, can be deduced for any given dye concentration from a recently proposed PL intensity analysis. A comparison with the results of kinetic Monte Carlo simulations provides the TTA-Förster radius and shows that the TTA enhancement due to triplet diffusion can be well described in a microscopic manner assuming Förster- or Dexter-type energy transfer.
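
    For the single-step-dominated limit mentioned above, a common starting point is the rate equation dT/dt = -k*T - (1/2)*k_TT*T² for the triplet density T, which has a closed-form solution. The sketch below uses illustrative placeholder rate constants, not values fitted in the paper:

```python
import math

def triplet_density(t, T0, k=1.0e6, kTT=1.0e-12):
    """Closed-form solution of dT/dt = -k*T - 0.5*kTT*T**2:
    monomolecular decay (rate k, 1/s) plus single-step triplet-triplet
    annihilation (rate constant kTT, cm^3/s); T0 is the initial density."""
    ratio = kTT * T0 / (2.0 * k)
    decay = math.exp(-k * t)
    return T0 * decay / (1.0 + ratio * (1.0 - decay))
```

    In transient PL experiments the emitted intensity is proportional to T(t), so a deviation from single-exponential decay at early times and high excitation densities signals the TTA contribution; diffusion-mediated multi-step TTA requires the kind of kinetic Monte Carlo treatment used in the paper.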

  4. Membrane loop process for separating carbon dioxide for use in gaseous form from flue gas

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wijmans, Johannes G; Baker, Richard W; Merkel, Timothy C

    The invention is a process involving membrane-based gas separation for separating and recovering carbon dioxide emissions from combustion processes in partially concentrated form, and then transporting the carbon dioxide and using or storing it in a confined manner without concentrating it to high purity. The process of the invention involves building up the concentration of carbon dioxide in a gas flow loop between the combustion step and a membrane separation step. A portion of the carbon dioxide-enriched gas can then be withdrawn from this loop and transported, without the need to liquefy the gas or otherwise create a high-purity stream, to a destination where it is used or confined, preferably in an environmentally benign manner.

  5. Software Defined Radio with Parallelized Software Architecture

    NASA Technical Reports Server (NTRS)

    Heckler, Greg

    2013-01-01

    This software implements software-defined radio processing over multicore, multi-CPU systems in a way that maximizes the use of CPU resources in the system. The software treats each processing step in either a communications or navigation modulator or demodulator system as an independent, threaded block. Each threaded block is defined with a programmable number of input or output buffers; these buffers are implemented using POSIX pipes. In addition, each threaded block is assigned a unique thread upon block installation. A modulator or demodulator system is built by assembly of the threaded blocks into a flow graph, which arranges the processing blocks to accomplish the desired signal processing. This software architecture allows the software to scale effortlessly between single-CPU/single-core computers and multi-CPU/multi-core computers without recompilation. NASA spaceflight and ground communications systems currently rely exclusively on ASICs or FPGAs. This software allows low- and medium-bandwidth (100 bps to approx. 50 Mbps) software-defined radios to be designed and implemented solely in C/C++ software, while lowering development costs and facilitating reuse and extensibility.
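The threaded-block flow graph described above can be sketched in a few lines, with `queue.Queue` standing in for the POSIX pipes of the original C/C++ implementation (the class and the two toy processing functions are illustrative, not NASA's API):

```python
import threading, queue

class ThreadedBlock:
    """One processing step run on its own thread, reading samples from an
    input buffer and writing results to an output buffer, mimicking the
    record's threaded-block flow-graph architecture."""
    def __init__(self, func, inq, outq):
        self.func, self.inq, self.outq = func, inq, outq
        self.thread = threading.Thread(target=self.run, daemon=True)
    def run(self):
        while True:
            item = self.inq.get()
            if item is None:          # sentinel: propagate shutdown downstream
                self.outq.put(None)
                return
            self.outq.put(self.func(item))
    def start(self):
        self.thread.start()

# Assemble a two-block flow graph: scale, then offset.
q1, q2, q3 = queue.Queue(), queue.Queue(), queue.Queue()
blocks = [ThreadedBlock(lambda x: 2 * x, q1, q2),
          ThreadedBlock(lambda x: x + 1, q2, q3)]
for b in blocks:
    b.start()
for sample in [1, 2, 3]:
    q1.put(sample)
q1.put(None)

results = []
while True:
    item = q3.get()
    if item is None:
        break
    results.append(item)
print(results)  # -> [3, 5, 7]
```

Because each block owns its thread and blocks on its input queue, the same graph spreads across however many cores the host offers, which is the scaling property the record emphasizes.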

  6. Update of KDBI: Kinetic Data of Bio-molecular Interaction database

    PubMed Central

    Kumar, Pankaj; Han, B. C.; Shi, Z.; Jia, J.; Wang, Y. P.; Zhang, Y. T.; Liang, L.; Liu, Q. F.; Ji, Z. L.; Chen, Y. Z.

    2009-01-01

    Knowledge of the kinetics of biomolecular interactions is important for facilitating the study of cellular processes and underlying molecular events, and is essential for quantitative study and simulation of biological systems. The Kinetic Data of Bio-molecular Interaction database (KDBI) has been developed to provide information about experimentally determined kinetic data of protein–protein, protein–nucleic acid, protein–ligand, and nucleic acid–ligand binding or reaction events described in the literature. To accommodate increasing demand for studying and simulating biological systems, numerous improvements and updates have been made to KDBI, including new ways to access data by pathway and molecule names, data files in Systems Biology Markup Language format, a more efficient search engine, access to published parameter sets of simulation models of 63 pathways, and a 2.3-fold increase of data (19 263 entries of 10 532 distinctive biomolecular binding and 11 954 interaction events, involving 2635 proteins/protein complexes, 847 nucleic acids, 1603 small molecules and 45 multi-step processes). KDBI is publicly available at http://bidd.nus.edu.sg/group/kdbi/kdbi.asp. PMID:18971255

  7. Multi-step oxidations catalyzed by cytochrome P450 enzymes: Processive vs. distributive kinetics and the issue of carbonyl oxidation in chemical mechanisms

    PubMed Central

    Guengerich, F. Peter; Sohl, Christal D.; Chowdhury, Goutam

    2010-01-01

    Catalysis of sequential oxidation reactions is not unusual in cytochrome P450 (P450) reactions, not only in steroid metabolism but also with many xenobiotics. One issue is how processive/distributive these reactions are, i.e. how much do the “intermediate” products dissociate. Our work with human P450s 2E1, 2A6, and 19A1 on this subject has revealed a mixture of systems, surprisingly with a more distributive mechanism with an endogenous substrate (P450 19A1) than for some xenobiotics (P450s 2E1, 2A6). One aspect of this research involves carbonyl intermediates, and the choice of catalytic mechanism is linked to the hydration state of the aldehyde. The non-enzymatic rates of hydration and dehydration of carbonyls are not rapid and whether P450s catalyze the reversible hydration is unknown. If carbonyl hydration and dehydration are slow, the mechanism may be set by the carbonyl hydration status. PMID:20804723
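In the simplest two-step scheme, the processive-versus-distributive distinction above reduces to a competition between dissociation of the enzyme-bound intermediate (rate constant k_off) and its second oxidation (k_ox2); a toy branching-ratio calculation under that simplification (not the authors' fitted kinetic model, and the rate constants in the test are made up):

```python
def released_intermediate_fraction(k_off, k_ox2):
    """Fraction of the 'intermediate' product that dissociates from the
    enzyme before the second oxidation, k_off / (k_off + k_ox2).
    A value near 1 indicates distributive kinetics; near 0, processive."""
    return k_off / (k_off + k_ox2)
```

A distributive system (fast dissociation) releases most of its intermediate into solution, which is exactly what makes the intermediate observable in the steady state.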

  8. Structural insight into the role of VAL1 B3 domain for targeting to FLC locus in Arabidopsis thaliana.

    PubMed

    Wu, Baixing; Zhang, Mengmeng; Su, Shichen; Liu, Hehua; Gan, Jianhua; Ma, Jinbiao

    2018-06-22

    Vernalization is a pivotal stage for some plants, involving many epigenetic changes during cold exposure. In Arabidopsis, an essential step in vernalization for subsequent flowering is the successful silencing of the potent floral repressor Flowering Locus C (FLC) through repressive histone marks. AtVAL1 is a multi-functional protein containing five domains that participate in many recognition processes; it has been validated to recruit the repressive histone modifier PHD-PRC2 complex and to interact with components of the ASAP complex, targeting the FLC nucleation region by recognizing a cis element known as the CME (cold memory element) through its plant-specific B3 domain. Here, we determine the crystal structure of the B3 domain in complex with the Sph/RY motif of the CME. Our structural analysis reveals the specific DNA recognition by the B3 domain; combined with our in vitro experiments, it provides structural insight into the important role of the AtVAL1 B3 domain in the flowering process. Copyright © 2018 Elsevier Inc. All rights reserved.

  9. Image simulation for automatic license plate recognition

    NASA Astrophysics Data System (ADS)

    Bala, Raja; Zhao, Yonghui; Burry, Aaron; Kozitsky, Vladimir; Fillion, Claude; Saunders, Craig; Rodríguez-Serrano, José

    2012-01-01

    Automatic license plate recognition (ALPR) is an important capability for traffic surveillance applications, including toll monitoring and detection of different types of traffic violations. ALPR is a multi-stage process comprising plate localization, character segmentation, optical character recognition (OCR), and identification of originating jurisdiction (i.e. state or province). Training of an ALPR system for a new jurisdiction typically involves gathering vast amounts of license plate images and associated ground truth data, followed by iterative tuning and optimization of the ALPR algorithms. The substantial time and effort required to train and optimize the ALPR system can result in excessive operational cost and overhead. In this paper we propose a framework to create an artificial set of license plate images for accelerated training and optimization of ALPR algorithms. The framework comprises two steps: the synthesis of license plate images according to the design and layout for a jurisdiction of interest; and the modeling of imaging transformations and distortions typically encountered in the image capture process. Distortion parameters are estimated by measurements of real plate images. The simulation methodology is successfully demonstrated for training of OCR.
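The second step of the framework, modeling capture-time distortions, can be sketched as a simple camera model applied to a synthetic plate image: a contrast/brightness change plus additive Gaussian sensor noise (a toy stand-in for the distortion models estimated from real plate images; the parameters are hypothetical, not the paper's):

```python
import random

def distort(image, gain, offset, noise_sigma, seed=0):
    """Apply a simple imaging distortion to a grayscale image given as a
    list of rows of pixel values: per-pixel linear contrast/brightness
    change plus additive Gaussian noise, clamped to the 0-255 range."""
    rng = random.Random(seed)  # seeded for reproducible test images
    out = []
    for row in image:
        out.append([min(255, max(0, gain * p + offset
                                 + rng.gauss(0.0, noise_sigma)))
                    for p in row])
    return out
```

Real plate simulators would add geometric warps, blur, and compression artifacts on top of this, with parameters fitted to measurements of real captures as the abstract describes.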

  10. Transformational System Concepts and Technologies for Our Future in Space

    NASA Technical Reports Server (NTRS)

    Howell, Joe T.; Mankins, John C.

    2004-01-01

    Continued constrained budgets and growing national and international interest in the commercialization and development of space require NASA to be constantly vigilant, to be creative, and to seize every opportunity for assuring the maximum return on space infrastructure investments. Accordingly, efforts are underway to forge new and innovative approaches to transform our space systems in the future to ultimately achieve two or three or five times as much with the same resources. This bold undertaking can be achieved only through extensive cooperative efforts throughout the aerospace community and truly effective planning to pursue advanced space system design concepts and high-risk/high-leverage research and technology. Definitive implementation strategies and roadmaps containing new methodologies and revolutionary approaches must be developed to economically accommodate the continued exploration and development of space. Transformation can be realized through modular design and stepping-stone development. This approach involves sustainable budget levels and multi-purpose systems development of supporting capabilities that lead to a diverse array of sustainable future space activities. Transformational design and development requires revolutionary advances by using modular designs and a planned, stepping-stone development process. A modular approach to space systems potentially offers many improvements over traditional one-of-a-kind space systems comprised of different subsystem elements with little standardization in interfaces or functionality. Modular systems must be more flexible, scalable, reconfigurable, and evolvable. Costs can be reduced through learning curve effects and economies of scale, and by enabling servicing and repair that would not otherwise be feasible.
This paper briefly discusses a promising approach to transforming space systems planning and evolution into a meaningful stepping-stone design, development, and implementation process. The success of this well-planned and orchestrated approach holds great promise for achieving innovation and revolutionary technology development in support of the future exploration and development of space.

  11. Visible CWDM system design for Multi-Gbit/s transmission over SI-POF

    NASA Astrophysics Data System (ADS)

    Vázquez, Carmen; Pinzón, Plinio Jesús; Pérez, Isabel

    2015-01-01

    In order to increase the data rates of Multi-Gbit/s links based on large-core step-index (SI) plastic optical fibers (POF), different modulation schemes have been proposed. Another option is to use multiple optical carriers for parallel transmission of communication channels over the same fiber. Designs reaching data rates of 14.77 Gb/s over 50 m with 4 channels have been developed using offline processing. In this work, designs to test the potential of real Multi-Gbit/s transmission systems using commercial products are reported. Special care in designing low-insertion-loss multiplexers and demultiplexers is taken to allow for greener solutions in terms of power consumption.

  12. Multi-parameter phenotypic profiling: using cellular effects to characterize small-molecule compounds.

    PubMed

    Feng, Yan; Mitchison, Timothy J; Bender, Andreas; Young, Daniel W; Tallarico, John A

    2009-07-01

    Multi-parameter phenotypic profiling of small molecules provides important insights into their mechanisms of action, as well as a systems level understanding of biological pathways and their responses to small molecule treatments. It therefore deserves more attention at an early step in the drug discovery pipeline. Here, we summarize the technologies that are currently in use for phenotypic profiling--including mRNA-, protein- and imaging-based multi-parameter profiling--in the drug discovery context. We think that an earlier integration of phenotypic profiling technologies, combined with effective experimental and in silico target identification approaches, can improve success rates of lead selection and optimization in the drug discovery process.

  13. High-throughput process development: I. Process chromatography.

    PubMed

    Rathore, Anurag S; Bhambure, Rahul

    2014-01-01

    Chromatographic separation serves as "a workhorse" for downstream process development and plays a key role in removal of product-related, host cell-related, and process-related impurities. Complex and poorly characterized raw materials and feed material, low feed concentration, product instability, and poor mechanistic understanding of the processes are some of the critical challenges that are faced during development of a chromatographic step. Traditional process development is performed as a trial-and-error-based evaluation and often leads to a suboptimal process. A high-throughput process development (HTPD) platform involves an integration of miniaturization, automation, and parallelization and provides a systematic approach for time- and resource-efficient chromatography process development. Creation of such platforms requires integration of mechanistic knowledge of the process with various statistical tools for data analysis. The relevance of such a platform is high in view of the constraints with respect to time and resources that the biopharma industry faces today. This protocol describes the steps involved in performing HTPD of a process chromatography step. It describes the operation of a commercially available device (PreDictor™ plates from GE Healthcare). This device is available in a 96-well format with 2 or 6 μL well sizes. We also discuss the challenges that one faces when performing such experiments, as well as possible solutions to alleviate them. Besides describing the operation of the device, the protocol also presents an approach for statistical analysis of the data gathered from such a platform. A case study involving the use of the protocol for examining ion-exchange chromatography of granulocyte colony-stimulating factor (GCSF), a therapeutic product, is briefly discussed. This is intended to demonstrate the usefulness of this protocol in generating data that are representative of the data obtained at the traditional lab scale.
The agreement in the data is indeed very significant (regression coefficient 0.93). We think that this protocol will be of significant value to those involved in performing high-throughput process development of process chromatography.
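The agreement quoted above is a correlation measure; a minimal sketch of computing such a coefficient (Pearson's r) between paired high-throughput and lab-scale measurements, with made-up data in place of the protocol's (the r = 0.93 in the record comes from the authors' real data set):

```python
def pearson_r(x, y):
    """Pearson correlation coefficient between two equal-length series,
    e.g. microwell-plate measurements vs. lab-scale column measurements."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    return sxy / (sxx * syy) ** 0.5
```

A value near 1 indicates the miniaturized format tracks the traditional scale linearly, which is the claim the case study supports.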

  14. Dissolving Hydroxyolite: A DNA Molecule into Its Hydroxyapatite Mold.

    PubMed

    Bertran, Oscar; Revilla-López, Guillermo; Casanovas, Jordi; Del Valle, Luis J; Turon, Pau; Puiggalí, Jordi; Alemán, Carlos

    2016-05-04

    In spite of the clinical importance of hydroxyapatite (HAp), the mechanism that controls its dissolution in acidic environments remains unclear. Knowledge of such a process is highly desirable to provide better understanding of different pathologies, for example osteoporosis, and of the HAp potential as a vehicle for gene delivery to replace damaged DNA. In this work, the mechanism of dissolution in acid conditions of HAp nanoparticles encapsulating double-stranded DNA has been investigated at the atomistic level using computer simulations. For this purpose, four consecutive (multi-step) molecular dynamics simulations, involving different temperatures and proton transfer processes, have been carried out. Results are consistent with a polynuclear decalcification mechanism in which proton transfer processes, from the surface to the internal regions of the particle, play a crucial role. In addition, the DNA remains protected by the mineral mold from both temperature and the transferred protons. These results, which indicate that biomineralization imparts very effective protection to DNA, also have important implications in other biomedical fields, for example in the design of artificial bones or in the fight against osteoporosis by promoting the fixation of Ca(2+) ions. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  15. Artificial concurrent catalytic processes involving enzymes.

    PubMed

    Köhler, Valentin; Turner, Nicholas J

    2015-01-11

    The concurrent operation of multiple catalysts can lead to enhanced reaction features including (i) simultaneous linear multi-step transformations in a single reaction flask; (ii) the control of intermediate equilibria; (iii) stereoconvergent transformations; (iv) rapid processing of labile reaction products. Enzymes occupy a prominent position for the development of such processes, due to their high potential compatibility with other biocatalysts. Genes for different enzymes can be co-expressed to reconstruct natural or construct artificial pathways and applied in the form of engineered whole cell biocatalysts to carry out complex transformations or, alternatively, the enzymes can be combined in vitro after isolation. Moreover, enzyme variants provide a wider substrate scope for a given reaction and often display altered selectivities and specificities. Man-made transition metal catalysts and engineered or artificial metalloenzymes also widen the range of reactivities and catalysed reactions that are potentially employable. Cascades for simultaneous cofactor or co-substrate regeneration or co-product removal are now firmly established. Many applications of more ambitious concurrent cascade catalysis are only just beginning to appear in the literature. The current review presents some of the most recent examples, with an emphasis on the combination of transition metal with enzymatic catalysis, and aims to encourage researchers to contribute to this emerging field.

  16. A Microsoft Project-Based Planning, Tracking, and Management Tool for the National Transonic Facility's Model Changeover Process

    NASA Technical Reports Server (NTRS)

    Vairo, Daniel M.

    1998-01-01

    The removal and installation of sting-mounted wind tunnel models in the National Transonic Facility (NTF) is a multi-task process having a large impact on the annual throughput of the facility. Approximately ten model removal and installation cycles occur annually at the NTF with each cycle requiring slightly over five days to complete. The various tasks of the model changeover process were modeled in Microsoft Project as a template to provide a planning, tracking, and management tool. The template can also be used as a tool to evaluate improvements to this process. This document describes the development of the template and provides step-by-step instructions on its use and as a planning and tracking tool. A secondary role of this document is to provide an overview of the model changeover process and briefly describe the tasks associated with it.

  17. Impact of multi-resolution analysis of artificial intelligence models inputs on multi-step ahead river flow forecasting

    NASA Astrophysics Data System (ADS)

    Badrzadeh, Honey; Sarukkalige, Ranjan; Jayawardena, A. W.

    2013-12-01

    Discrete wavelet transform was applied to decompose the ANN and ANFIS inputs. A novel approach of WNF with subtractive clustering was applied for flow forecasting. Forecasting was performed 1-5 steps ahead, using multi-variate inputs. The forecasting accuracy of peak values and at longer lead times was significantly improved.
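The multi-resolution pre-processing named in the highlights can be sketched as one level of a discrete wavelet transform; a Haar-wavelet version in plain Python (illustrative only, since the record does not specify the mother wavelet used):

```python
import math

def haar_dwt(signal):
    """One level of the discrete wavelet transform with the Haar wavelet:
    split a series into approximation (low-frequency) and detail
    (high-frequency) coefficients by pairwise sums and differences,
    each scaled by 1/sqrt(2)."""
    s = math.sqrt(2.0)
    approx = [(signal[i] + signal[i + 1]) / s
              for i in range(0, len(signal) - 1, 2)]
    detail = [(signal[i] - signal[i + 1]) / s
              for i in range(0, len(signal) - 1, 2)]
    return approx, detail
```

In the forecasting setting, the approximation and detail series (possibly over several decomposition levels) become separate inputs to the ANN/ANFIS models instead of the raw flow series.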

  18. Remote Access to the PXRR Macromolecular Crystallography Facilities at the NSLS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Soares, A.S.; Schneider, D. K.; Skinner, J. M.

    2008-09-01

    The most recent surge of innovations that have simplified and streamlined the process of determining macromolecular structures by crystallography owes much to the efforts of the structural genomics community. However, this was only the last step in a long evolution that saw the metamorphosis of crystallography from an heroic effort that involved years of dedication and skill into a straightforward measurement that is occasionally almost trivial. Many of the steps in this remarkable odyssey involved reducing the physical labor that is demanded of experimenters in the field. Other steps reduced the technical expertise required for conducting those experiments.

  19. Remote Access to the PXRR Macromolecular Crystallography Facilities at the NSLS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    A Soares; D Schneider; J Skinner

    2011-12-31

    The most recent surge of innovations that have simplified and streamlined the process of determining macromolecular structures by crystallography owes much to the efforts of the structural genomics community. However, this was only the last step in a long evolution that saw the metamorphosis of crystallography from an heroic effort that involved years of dedication and skill into a straightforward measurement that is occasionally almost trivial. Many of the steps in this remarkable odyssey involved reducing the physical labor that is demanded of experimenters in the field. Other steps reduced the technical expertise required for conducting those experiments.

  20. Development of Active Learning with Simulations and Games

    ERIC Educational Resources Information Center

    Zapalska, Alina; Brozik, Dallas; Rudd, Denis

    2012-01-01

    Educational games and simulations are excellent active learning tools that offer students hands-on experience. Little research is available on developing games and simulations and how teachers can be assisted in making their own games and simulations. In this context, the paper presents a multi-step process of how to develop games and simulations…

  1. Developing a Multi-Year Learning Progression for Carbon Cycling in Socio-Ecological Systems

    ERIC Educational Resources Information Center

    Mohan, Lindsey; Chen, Jing; Anderson, Charles W.

    2009-01-01

    This study reports on our steps toward achieving a conceptually coherent and empirically validated learning progression for carbon cycling in socio-ecological systems. It describes an iterative process of designing and analyzing assessment and interview data from students in upper elementary through high school. The product of our development…

  2. Scholastic Audits. Research Brief

    ERIC Educational Resources Information Center

    Walker, Karen

    2009-01-01

    What is a scholastic audit? The purpose of the audit is to assist individual schools and districts improve. The focus is on gathering data and preparing recommendations that can be used to guide school improvement initiatives. Scholastic audits use a multi-step approach and include: (1) Preparing for the Audit; (2) Audit process; (3) Audit report;…

  3. 78 FR 36525 - Fisheries of the Atlantic and the Gulf of Mexico; Southeast Data, Assessment, and Review (SEDAR...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-06-18

    ...) Atlantic Sharpnose (Rhizoprionodon terraenovae) and Bonnethead (Sphyrna tiburo) sharks. SUMMARY: The SEDAR 34 assessment of HMS Atlantic Sharpnose and Bonnethead sharks will consist of an in-person workshop... for determining the status of fish stocks in the Southeast Region. SEDAR is a multi-step process...

  4. SED16 autonomous star tracker night sky testing

    NASA Astrophysics Data System (ADS)

    Foisneau, Thierry; Piriou, Véronique; Perrimon, Nicolas; Jacob, Philippe; Blarre, Ludovic; Vilaire, Didier

    2017-11-01

    The SED16 is an autonomous multi-mission star tracker which delivers three-axis satellite attitude in an inertial reference frame and the satellite angular velocity with no prior information. The qualification process of this star sensor includes five validation steps using an optical star simulator, a digitized image simulator, and a night sky test setup. The night sky testing was the final step of the qualification process, during which all the functions of the star tracker were used in almost nominal conditions: autonomous acquisition of the attitude and autonomous tracking of ten stars. These tests were performed at Calern in the premises of the OCA (Observatoire de la Côte d'Azur). The test setup and the test results are described after a brief review of the sensor's main characteristics and qualification process.

  5. NASA Expendable Launch Vehicle (ELV) Payload Safety Review Process

    NASA Technical Reports Server (NTRS)

    Starbus, Calvert S.; Donovan, Shawn; Dook, Mike; Palo, Tom

    2007-01-01

    Issues addressed by this program: (1) complicated roles and responsibilities associated with multi-partner projects; (2) working relationships and communications between all organizations involved in the payload safety process; (3) consistent interpretation and implementation of safety requirements from one project to the next; (4) consistent implementation of the Tailoring Process; (5) clearly defined NASA decision-making authority; (6) bringing an Agency-wide perspective to each ELV payload project. The current process requires a Payload Safety Working Group (PSWG) for each payload, with representatives from all involved organizations.

  6. Hot Forging of a Cladded Component by Automated GMAW Process

    NASA Astrophysics Data System (ADS)

    Rafiq, Muhammad; Langlois, Laurent; Bigot, Régis

    2011-01-01

    Weld cladding is employed to improve the service life of engineering components by increasing corrosion and wear resistance and reducing cost. The quality of a multi-bead cladding layer depends on single-bead geometry. Hence, in a first step, the relationship between the input process parameters and the single-bead geometry is studied, and in a second step a comprehensive study of multi-bead clad layer deposition is carried out. This paper highlights an experimental study carried out to obtain a single cladding layer deposited by an automated Gas Metal Arc Welding (GMAW) process and to assess the feasibility of hot forming the cladded workpiece into an improved final structure. GMAW is an arc welding process that uses an arc between a consumable electrode and the weld pool with an external shielding gas; cladding is done by side-by-side deposition of weld beads. The single-bead experiments were conducted by varying the three main process parameters (wire feed rate, arc voltage, and welding speed) while keeping other parameters, such as nozzle-to-work distance, shielding gas and its flow rate, and torch angle, constant. The effect of bead spacing and torch orientation on the cladding quality of a single layer was studied from the results of the single-bead deposition. The effect of dilution rate and nominal energy on the hot-bending quality of the cladded layer was also studied at different temperatures.

  7. Effect of randomness on multi-frequency aeroelastic responses resolved by Unsteady Adaptive Stochastic Finite Elements

    NASA Astrophysics Data System (ADS)

    Witteveen, Jeroen A. S.; Bijl, Hester

    2009-10-01

    The Unsteady Adaptive Stochastic Finite Elements (UASFE) method resolves the effect of randomness in numerical simulations of single-mode aeroelastic responses with a constant accuracy in time for a constant number of samples. In this paper, the UASFE framework is extended to multi-frequency responses and continuous structures by employing a wavelet decomposition pre-processing step to decompose the sampled multi-frequency signals into single-frequency components. The effect of the randomness on the multi-frequency response is then obtained by summing the results of the UASFE interpolation at constant phase for the different frequency components. Results for multi-frequency responses and continuous structures show a three orders of magnitude reduction of computational costs compared to crude Monte Carlo simulations in a harmonically forced oscillator, a flutter panel problem, and the three-dimensional transonic AGARD 445.6 wing aeroelastic benchmark subject to random fields and random parameters with various probability distributions.

  8. Effects of acute alcohol intoxication on automated processing: evidence from the double-step paradigm.

    PubMed

    Vorstius, Christian; Radach, Ralph; Lang, Alan R

    2012-02-01

    Reflexive and voluntary levels of processing have been studied extensively with respect to possible impairments due to alcohol intoxication. This study examined alcohol effects at the 'automated' level of processing essential to many complex visual processing tasks (e.g., reading, visual search) that involve ongoing modifications or reprogramming of well-practiced routines. Data from 30 participants (16 male) were collected in two counterbalanced sessions (alcohol vs. no-alcohol control; mean breath alcohol concentration = 68 mg/dL vs. 0 mg/dL). Eye movements were recorded during a double-step task where 75% of trials involved two target stimuli in rapid succession (inter-stimulus interval [ISI]=40, 70, or 100 ms) so that they could elicit two distinct saccades or eye movements (double steps). On 25% of trials a single target appeared. Results indicated that saccade latencies were longer under alcohol. In addition, the proportion of single-step responses and the mean saccade amplitude (length) of primary saccades decreased significantly with increasing ISI. The key novel finding, however, was that the reprogramming time needed to cancel the first saccade and adjust saccade amplitude was extended significantly by alcohol. The additional time made available by prolonged latencies due to alcohol was not utilized by the saccade programming system to decrease the number of two-step responses. These results represent the first demonstration of specific alcohol-induced programming deficits at the automated level of oculomotor processing.

  9. Homology modeling and docking analyses of M. leprae Mur ligases reveals the common binding residues for structure based drug designing to eradicate leprosy.

    PubMed

    Shanmugam, Anusuya; Natarajan, Jeyakumar

    2012-06-01

    The multi-drug resistance of Mycobacterium leprae (MDR-Mle) creates a pressing need for developing new anti-leprosy drugs. Since most drugs target a single enzyme, mutation in the active site renders the antibiotic ineffective. However, structural and mechanistic information on essential bacterial enzymes in a pathway could lead to the development of antibiotics that target multiple enzymes. Peptidoglycan is an important component of the cell wall of M. leprae. The biosynthesis of bacterial peptidoglycan represents an important target for the development of new antibacterial drugs. Biosynthesis of peptidoglycan is a multi-step process that involves four key Mur ligase enzymes: MurC (EC:6.3.2.8), MurD (EC:6.3.2.9), MurE (EC:6.3.2.13) and MurF (EC:6.3.2.10). Hence, in our work, we modeled the three-dimensional structures of the above Mur ligases using homology modeling and analyzed their common binding features. The residues playing an important role in the catalytic activity of each of the Mur enzymes were predicted by docking these Mur ligases with their substrates and ATP. The conserved sequence motifs significant for ATP binding were predicted as the probable residues for structure-based drug designing. Overall, the study was successful in listing significant and common binding residues of the Mur enzymes of the peptidoglycan pathway for multi-targeted therapy.

  10. Design of Ultra-Wideband Tapered Slot Antenna by Using Binomial Transformer with Corrugation

    NASA Astrophysics Data System (ADS)

    Chareonsiri, Yosita; Thaiwirot, Wanwisa; Akkaraekthalin, Prayoot

    2017-05-01

    In this paper, a tapered slot antenna (TSA) with corrugation is proposed for UWB applications. A multi-section binomial transformer is used to design the taper profile of the proposed TSA without time-consuming optimization. A step-by-step procedure for synthesizing the step impedance values, related to the step slot widths of the taper profile, is presented. A smooth taper can be achieved by fitting a smoothing curve to the entire step slot profile. The TSA designed with this method yields a quite flat gain and a wide impedance bandwidth covering the UWB spectrum from 3.1 GHz to 10.6 GHz. To further improve the radiation characteristics, corrugation is added on both edges of the proposed TSA. The effects of different corrugation shapes on the improvement of antenna gain and front-to-back ratio (F-to-B ratio) are investigated. To demonstrate the validity of the design, prototypes of the TSA without and with corrugation are fabricated and measured. The results show good agreement between simulation and measurement.
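The multi-section binomial transformer synthesis referred to above follows the standard small-reflection design formula ln(Z_{n+1}/Z_n) = 2^(-N) C(N, n) ln(Z_L/Z_0); a minimal sketch (the mapping from these impedance steps to slot widths is the paper's contribution and is not reproduced here):

```python
from math import comb, log, exp

def binomial_transformer(Z0, ZL, N):
    """Characteristic impedances of an N-section binomial (maximally
    flat) impedance transformer between source impedance Z0 and load
    impedance ZL, using ln(Z[n+1]/Z[n]) = 2**-N * C(N, n) * ln(ZL/Z0).
    Returns [Z0, Z1, ..., ZN, ZL] (the last value equals ZL up to
    rounding, since the binomial coefficients sum to 2**N)."""
    Z = [float(Z0)]
    for n in range(N + 1):
        Z.append(Z[-1] * exp(2.0 ** -N * comb(N, n) * log(ZL / Z0)))
    return Z
```

For a 3-section 50-to-100 ohm transformer the first step is Z1 = 50 * 2^(1/8), about 54.5 ohms, and the recursion lands exactly on the load impedance.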

  11. Taking Stock and Creating a Vision: A Middle School Community Takes the First Steps toward Creating an Accelerated School.

    ERIC Educational Resources Information Center

    McBride, Ron E.; Stuessy, Carol

    1996-01-01

    Accelerated schools strive to bring at-risk students into the educational mainstream and perform at grade level through acceleration rather than remediation. Describes four steps to initiate the accelerated process and how a Texas middle school involved all members of the school community in implementing the first two steps, taking stock and…

  12. Multislice spiral CT simulator for dynamic cardiopulmonary studies

    NASA Astrophysics Data System (ADS)

    De Francesco, Silvia; Ferreira da Silva, Augusto M.

    2002-04-01

    We have developed a multi-slice spiral CT simulator modeling the acquisition process of a real tomograph over a 4-dimensional phantom (4D MCAT) of the human thorax. The simulator allows us to visually characterize artifacts due to insufficient temporal sampling and to evaluate a priori the quality of the images obtained in cardio-pulmonary studies (with single-/multi-slice as well as ECG-gated acquisition processes). The simulation environment supports both conventional and spiral scanning modes and includes a model of noise in the acquisition process. For spiral scanning, the reconstruction facilities include longitudinal interpolation methods (360LI and 180LI, for both single and multi-slice), after which the section is reconstructed through filtered back-projection (FBP). The reconstructed images/volumes are affected by distortion due to insufficient temporal sampling of the moving object. The simulation environment allows us to investigate the nature of this distortion and to characterize it qualitatively and quantitatively (using, for example, Herman's measures). Much of our work is focused on determining adequate temporal sampling and sinogram regularization techniques. At present the simulator is limited to multi-slice tomographs; extension to cone-beam or area detectors is planned as the next development step.
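
    The 360LI method mentioned above synthesizes, for each gantry angle, a projection at the reconstruction plane by linearly interpolating the two spiral projections measured at that angle one rotation apart. A minimal sketch of that interpolation step (names and the list-of-samples representation are illustrative):

```python
def li_360(proj_a, z_a, proj_b, z_b, z_plane):
    """360LI: linear interpolation between two projections acquired at the same
    gantry angle on consecutive rotations (z_a <= z_plane <= z_b), producing a
    synthetic projection at table position z_plane."""
    w = (z_plane - z_a) / (z_b - z_a)
    return [(1 - w) * a + w * b for a, b in zip(proj_a, proj_b)]
```

    180LI works the same way but pairs each projection with its complementary (opposite-ray) projection half a rotation away, halving the interpolation distance along z.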

  13. Separation of Zirconium and Hafnium: A Review

    NASA Astrophysics Data System (ADS)

    Xu, L.; Xiao, Y.; van Sandwijk, A.; Xu, Q.; Yang, Y.

    Zirconium is an ideal material for nuclear reactors due to its low absorption cross-section for thermal neutrons, whereas the hafnium it typically contains, with its strong neutron absorption, is very harmful in this application. This paper provides an overview of the processes for separating hafnium from zirconium. The separation processes are roughly classified into hydro- and pyrometallurgical routes. The current dominant zirconium production route involves pyrometallurgical ore cracking, multi-step hydrometallurgical liquid-liquid extraction for hafnium removal, and reduction of zirconium tetrachloride to the pure metal by the Kroll process. The lengthy hydrometallurgical Zr-Hf separation operations lead to high production costs, intensive labour and a heavy environmental burden. A compact pyrometallurgical separation method could simplify the whole production flowsheet with higher process efficiency. The known separation methods are discussed based on the following reaction features: redox characteristics, volatility, electrochemical properties and molten salt extraction. The commercially operated extractive distillation process is a significant advance in Zr-Hf separation technology, but it suffers from high process maintenance costs. A recently developed process based on molten salt-metal equilibrium shows great potential for industrial application, offering a compact route to nuclear-grade zirconium starting from crude ore. In the present paper, the available separation technologies are compared, and their advantages and disadvantages as well as future directions of research and development for nuclear-grade zirconium production are discussed.

  14. Complex network analysis of brain functional connectivity under a multi-step cognitive task

    NASA Astrophysics Data System (ADS)

    Cai, Shi-Min; Chen, Wei; Liu, Dong-Bai; Tang, Ming; Chen, Xun

    2017-01-01

    Functional brain networks have been widely studied to understand the relationship between brain organization and behavior. In this paper, we explore the functional connectivity of the brain network under a multi-step cognitive task involving consecutive behaviors, to better understand the effect of behaviors on brain organization. The functional brain networks are constructed from a high spatial and temporal resolution fMRI dataset and analyzed via a complex-network approach. We find that at the voxel level the functional brain network shows robust small-worldness and scale-free characteristics, while its assortativity and rich-club organization are only slightly affected by the order in which behaviors are performed. More interestingly, the functional connectivity of the network in activated ROIs correlates strongly with the behaviors and clearly depends on their order. These empirical results suggest that brain organization has the generic properties of small-worldness and scale-free characteristics, and that the diverse functional connectivity emerging from activated ROIs is strongly driven by behavioral activities via the plasticity of the brain.
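
    The small-worldness assessment above rests on two graph measures, the average clustering coefficient and the characteristic path length (a network is small-world when it clusters like a lattice but has path lengths like a random graph). A pure-Python sketch of both measures on an adjacency-set representation, a toy stand-in for the voxel-level networks analyzed in the paper:

```python
from collections import deque

def avg_clustering(adj):
    """Mean local clustering coefficient of an undirected graph
    given as {node: set_of_neighbors}."""
    total = 0.0
    for v, nbrs in adj.items():
        k = len(nbrs)
        if k < 2:
            continue  # nodes with < 2 neighbors contribute 0
        links = sum(1 for i in nbrs for j in nbrs if i < j and j in adj[i])
        total += 2.0 * links / (k * (k - 1))
    return total / len(adj)

def char_path_length(adj):
    """Characteristic path length: mean shortest-path distance over all
    reachable ordered pairs, via BFS from every node."""
    total, pairs = 0, 0
    for s in adj:
        dist = {s: 0}
        q = deque([s])
        while q:
            u = q.popleft()
            for v in adj[u]:
                if v not in dist:
                    dist[v] = dist[u] + 1
                    q.append(v)
        total += sum(dist.values())
        pairs += len(dist) - 1
    return total / pairs
```

    A common small-world index then compares both quantities against a degree-matched random graph, sigma = (C/C_rand)/(L/L_rand).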

  15. Closing the loop of the medication use process using electronic medication administration registration.

    PubMed

    Lenderink, Bertil W; Egberts, Toine C G

    2004-08-01

    Recent reports and studies of errors in the medication process have raised awareness of the threat such errors pose to public health. An essential step in this multi-stage process is the actual administration of a medicine to the patient. A closed-loop system is thought to be a way of preventing medication errors, and current information technology can facilitate this process. This article describes how barcode technology is being used to facilitate medication administration registration on several wards in our hospital and nursing home.

  16. Lessons Learned for Collaborative Clinical Content Development

    PubMed Central

    Collins, S.A.; Bavuso, K.; Zuccotti, G.; Rocha, R.A.

    2013-01-01

    Background Site-specific content configuration of vendor-based Electronic Health Records (EHRs) is a vital step in the development of standardized and interoperable content that can be used for clinical decision support, reporting, care coordination, and information exchange. The multi-site, multi-stakeholder Acute Care Documentation (ACD) project at Partners Healthcare Systems (PHS) aimed to develop highly structured clinical content with adequate breadth and depth to meet the needs of all types of acute care clinicians at two academic medical centers. The Knowledge Management (KM) team at PHS led the informatics and knowledge management effort for the project. Objectives We aimed to evaluate the role, governance, and project management processes and resources for the KM team's effort as part of the standardized clinical content creation. Methods We employed the Centers for Disease Control and Prevention's six-step Program Evaluation Framework to guide our evaluation. We administered a forty-four-question, open-ended, semi-structured voluntary survey to gather focused, credible evidence from members of the KM team. Qualitative open coding was performed to identify themes for lessons learned and concluding recommendations. Results Six surveys were completed. Qualitative data analysis informed five lessons learned and thirty specific recommendations associated with them. The five lessons learned are: 1) Assess and meet knowledge needs and set expectations at the start of the project; 2) Define an accountable decision-making process; 3) Increase team meeting moderation skills; 4) Ensure adequate resources and competency training with online asynchronous collaboration tools; 5) Develop focused, goal-oriented teams and supportive, consultative service-based teams.
Conclusions Knowledge management requirements for the development of standardized clinical content within a vendor-based EHR among multi-stakeholder teams and sites include: 1) assessing and meeting informatics knowledge needs, 2) setting expectations and standardizing the process for decision-making, and 3) ensuring the availability of adequate resources and competency training. PMID:23874366

  17. Step-by-step: a model for practice-based learning.

    PubMed

    Kane, Gabrielle M

    2007-01-01

    Innovative technology has led to high-precision radiation therapy that has dramatically altered the practice of radiation oncology. This qualitative study explored the implementation of this innovation into practice from the perspective of the practitioners in a large academic radiation medicine program, and aimed to improve understanding of and facilitate the educational process of this change. Multi-professional staff participated in a series of seven focus groups and nine in-depth interviews, and the descriptive data from the transcripts were analyzed using grounded theory methodology. Practitioners believed that there had been a major effect on many aspects of their practice. The team structure supported the adoption of change. The technology changed the way the practices worked. Learning new skills increased workload and stress but led to a new conception of the discipline and the generation of new practice-based knowledge. When the concepts were examined longitudinally, a four-step process of learning was identified. In step 1, there was anxiety as staff acquired the skills to use the technology. Step 2 involved learning to interpret new findings and images, experiencing uncertainty until new perspectives developed. Step 3 involved questioning assumptions and critical reflection, which resulted in new understanding. The final step 4 identified a process of constructing new knowledge through research, development, and dialogue within the profession. These findings expand our understanding of how practice-based learning occurs in the context of change and can guide learning activities appropriate to each stage.

  18. A step-by-step methodology for enterprise interoperability projects

    NASA Astrophysics Data System (ADS)

    Chalmeta, Ricardo; Pazos, Verónica

    2015-05-01

    Enterprise interoperability is one of the key factors for enhancing enterprise competitiveness. Achieving it is an extremely complex process involving different technological, human and organisational elements. In this paper we present a framework to help achieve enterprise interoperability. The framework has been developed taking into account the three domains of interoperability: Enterprise Modelling, Architecture and Platform, and Ontologies. The main novelty of the framework in comparison to existing ones is that it includes a step-by-step methodology that explains how to carry out an enterprise interoperability project taking into account different interoperability views, such as business, process, human resources, technology, knowledge and semantics.

  19. Superhydrophobic aluminum alloy surfaces by a novel one-step process.

    PubMed

    Saleema, N; Sarkar, D K; Paynter, R W; Chen, X-G

    2010-09-01

    A simple one-step process has been developed to render aluminum alloy surfaces superhydrophobic by immersing the aluminum alloy substrates in a solution containing NaOH and fluoroalkyl-silane (FAS-17) molecules. Scanning electron microscopy (SEM), X-ray photoelectron spectroscopy (XPS) and water contact angle measurements have been performed to characterize the morphological features, chemical composition and superhydrophobicity of the surfaces. The resulting surfaces provided a water contact angle as high as ∼162° and a contact angle hysteresis as low as ∼4°. The study indicates that it is possible to fabricate superhydrophobic aluminum surfaces easily and effectively without involving the traditional two-step processes.

  20. High-volume workflow management in the ITN/FBI system

    NASA Astrophysics Data System (ADS)

    Paulson, Thomas L.

    1997-02-01

    The Identification Tasking and Networking (ITN) Federal Bureau of Investigation system will manage the processing of more than 70,000 submissions per day. The workflow manager controls the routing of each submission through a combination of automated and manual processing steps whose exact sequence is dynamically determined by the results at each step. For most submissions, one or more of the steps involve the visual comparison of fingerprint images. The ITN workflow manager is implemented within a scalable client/server architecture. The paper describes the key aspects of the ITN workflow manager design which allow the high volume of daily processing to be successfully accomplished.

  1. NASA Planning for Orion Multi-Purpose Crew Vehicle Ground Operations

    NASA Technical Reports Server (NTRS)

    Letchworth, Gary; Schlierf, Roland

    2011-01-01

    The NASA Orion Ground Processing Team was originally formed by the Kennedy Space Center (KSC) Constellation (Cx) Project Office's Orion Division to define, refine and mature pre-launch and post-landing ground operations for the Orion human spacecraft. The multidisciplinary KSC Orion team consisted of KSC civil servant, SAIC, Productivity Apex, Inc. and Boeing-CAPPS engineers, project managers and safety engineers, as well as engineers from Constellation's Orion Project and Lockheed Martin, the Orion prime contractor. The team evaluated the Orion design configurations as the spacecraft concept matured between Systems Design Review (SDR), Systems Requirement Review (SRR) and Preliminary Design Review (PDR). The team functionally decomposed pre-launch and post-landing steps at three levels of detail, or tiers, beginning with functional flow block diagrams (FFBDs). The third-tier FFBDs were used to build logic networks and nominal timelines. Orion ground support equipment (GSE) was identified and mapped to each step. This information was subsequently used in developing lower-level operations steps in a Ground Operations Planning Document PDR product. Subject matter experts for each spacecraft and GSE subsystem defined 5th-95th percentile processing times for each FFBD step using the Delphi method. Discrete event simulations used this information and the logic network to provide processing timeline confidence intervals for launch rate assessments. The team also used the capabilities of the KSC Visualization Lab, the FFBDs and knowledge of the spacecraft, GSE and facilities to build visualizations of Orion pre-launch and post-landing processing at KSC. Visualizations were a powerful tool for communicating planned operations within the KSC community (i.e., the Ground Systems design team), and externally to the Orion Project, Lockheed Martin spacecraft designers and other Constellation Program stakeholders during the SRR to PDR timeframe.
Other operations planning tools included Kaizen/Lean events, mockups and human factors analysis. The majority of products developed by this team are applicable as KSC prepares 21st Century Ground Systems for the Orion Multi-Purpose Crew Vehicle and Space Launch System.
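
    Turning per-step percentile estimates into timeline confidence intervals, as the discrete event simulations above did, can be sketched as a small Monte Carlo run. The step names and (min, mode, max) durations below are invented placeholders, not Orion data, and triangular distributions stand in for whatever distributions the actual simulation used:

```python
import random

# Illustrative (min, mode, max) hours for three serial processing steps.
STEPS = {"power-up": (4, 6, 10), "fueling": (8, 12, 20), "closeout": (2, 3, 5)}

def timeline_interval(n=20000, seed=1):
    """Sample the total duration of a purely serial flow with triangular step
    distributions and return the (5th, 95th) percentile confidence interval."""
    random.seed(seed)
    totals = sorted(
        sum(random.triangular(lo, hi, mode) for lo, mode, hi in STEPS.values())
        for _ in range(n)
    )
    return totals[int(0.05 * n)], totals[int(0.95 * n)]
```

    A real discrete event simulation would additionally honor the logic network (parallel branches, resource contention) rather than summing steps serially.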

  3. Development of an E. coli strain for one-pot biofuel production from ionic liquid pretreated cellulose and switchgrass

    DOE PAGES

    Frederix, Marijke; Mingardon, Florence; Hu, Matthew; ...

    2016-04-11

    Biological production of chemicals and fuels using microbial transformation of sustainable carbon sources, such as pretreated and saccharified plant biomass, is a multi-step process. Typically, each segment of the workflow is optimized separately, often generating conditions that may not be suitable for integration or consolidation with the upstream or downstream steps. While significant effort has gone into developing solutions to incompatibilities at discrete steps, very few studies report the consolidation of the multi-step workflow into a single-pot reactor system. Here we demonstrate a one-pot biofuel production process that uses the ionic liquid 1-ethyl-3-methylimidazolium acetate ([C2C1Im][OAc]) for pretreatment of switchgrass biomass. [C2C1Im][OAc] is highly effective in deconstructing lignocellulose, but nonetheless leaves behind residual reagents that are toxic to standard saccharification enzymes and the microbial production host. We report the discovery of a [C2C1Im]-tolerant E. coli strain, where [C2C1Im] tolerance is bestowed by a P7Q mutation in the transcriptional regulator encoded by rcdA. We establish that the causal impact of this mutation is the derepression of a hitherto uncharacterized major facilitator family transporter, YbjJ. To develop the strain for a one-pot process we engineered this [C2C1Im]-tolerant strain to express a recently reported d-limonene production pathway. We also screened previously reported [C2C1Im]-tolerant cellulases to select one that would function over the range of E. coli cultivation conditions, and expressed it in the [C2C1Im]-tolerant E. coli strain so as to secrete this [C2C1Im]-tolerant cellulase. The final strain digests pretreated biomass and uses the liberated sugars to produce the bio-jet fuel candidate precursor d-limonene in a one-pot process.

  4. Working through. A process of restitution.

    PubMed

    Gottesman, D M

    A number of authors, including Freud, have written about the process of working through but have left unsettled what is actually involved. I have attempted to outline the step-by-step process of working through, starting with recollection and repetition and ending with restitution and resolution. I have introduced the term restitution in order to give more importance to an already existing step in the working-through process; it should not be looked upon as an artificial device. Restitution allows the patient to find appropriate gratification in present reality, and this helps him to relinquish the past. Rather than allowing the patient to "wallow in the muck of guilt," as Eveoleen Rexford suggests society "wallows" in its inability to help its children, restitution gives appropriate direction for change. It is a natural step in the successful resolution of treatment.

  5. Shadowing effects on multi-step Langmuir probe array on HL-2A tokamak

    NASA Astrophysics Data System (ADS)

    Ke, R.; Xu, M.; Nie, L.; Gao, Z.; Wu, Y.; Yuan, B.; Chen, J.; Song, X.; Yan, L.; Duan, X.

    2018-05-01

    Multi-step Langmuir probe arrays have been designed and installed on the HL-2A tokamak [1]-[2] to study turbulent transport in the edge plasma, especially for measurement of the poloidal momentum flux, i.e. the Reynolds stress Rs. However, except for the probe tips on the top step, all tips on the lower steps are shadowed by the graphite skeleton, so it is necessary to estimate the shadowing effects on equilibrium and fluctuation measurements. In this paper, a comparison of shadowed tips with unshadowed ones is presented. The results show that shadowing can strongly reduce the effective ion and electron collection area; however, its effect on turbulence intensity and coherence measurements is negligible, confirming that the multi-step LP array is suitable for turbulent transport measurement.

  6. Combining living anionic polymerization with branching reactions in an iterative fashion to design branched polymers.

    PubMed

    Higashihara, Tomoya; Sugiyama, Kenji; Yoo, Hee-Soo; Hayashi, Mayumi; Hirao, Akira

    2010-06-16

    This paper reviews the precise synthesis of many-armed and multi-compositional star-branched polymers, exact graft (co)polymers, and structurally well-defined dendrimer-like star-branched polymers, all synthetically difficult targets, by a commonly featured iterative methodology combining living anionic polymerization with branching reactions. The methodology basically involves only two synthetic steps: (a) preparation of a polymeric building block corresponding to each branched polymer and (b) connection of the resulting building unit to another unit. These steps were repeated in a stepwise fashion several times to successively synthesize a series of well-defined target branched polymers. Copyright © 2010 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  7. Multi-step rhodopsin inactivation schemes can account for the size variability of single photon responses in Limulus ventral photoreceptors

    PubMed Central

    1994-01-01

    Limulus ventral photoreceptors generate highly variable responses to the absorption of single photons. We have obtained data on the size distribution of these responses, derived the distribution predicted from simple transduction cascade models and compared the theory and data. In the simplest of models, the active state of the visual pigment (defined by its ability to activate G protein) is turned off in a single reaction. The output of such a cascade is predicted to be highly variable, largely because of stochastic variation in the number of G proteins activated. The exact distribution predicted is exponential, but we find that an exponential does not adequately account for the data. The data agree much better with the predictions of a cascade model in which the active state of the visual pigment is turned off by a multi-step process. PMID:8057085
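
    The statistical argument above can be made concrete: if the active pigment shuts off in a single exponential step, its lifetime (and hence the number of G proteins activated) is exponentially distributed with coefficient of variation 1; if shutoff requires n sequential steps, the lifetime is Erlang-distributed and the coefficient of variation shrinks as 1/sqrt(n). A minimal simulation (rates and step counts are illustrative, not fitted to the Limulus data):

```python
import random
import statistics

def lifetime_cv(n_steps, rate=1.0, trials=20000, seed=42):
    """Coefficient of variation of the active-pigment lifetime when shutoff
    requires n_steps sequential exponential transitions (Erlang distribution)."""
    random.seed(seed)
    lifetimes = [sum(random.expovariate(rate) for _ in range(n_steps))
                 for _ in range(trials)]
    return statistics.pstdev(lifetimes) / statistics.mean(lifetimes)
```

    A single-step model gives CV close to 1 (the highly variable exponential case the authors reject), while a four-step shutoff gives CV close to 0.5, illustrating why multi-step inactivation narrows the single-photon response distribution.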

  8. Combined Economic and Hydrologic Modeling to Support Collaborative Decision Making Processes

    NASA Astrophysics Data System (ADS)

    Sheer, D. P.

    2008-12-01

    For more than a decade, the core concept of the author's efforts in support of collaborative decision-making processes has been a combination of hydrologic simulation and multi-objective optimization. The OASIS model developed by HydroLogics Inc. solves a multi-objective optimization at each time step using a mixed integer linear program (MILP). The MILP can be configured to include any user-defined objective, including but not limited to economic objectives. For example, estimated marginal values of water for crops and municipal and industrial (M&I) use were included in the objective function to drive trades in a model of the lower Rio Grande. The formulation of the MILP, its constraints and objectives, in any time step is conditional: it changes based on the values of state variables and dynamic external forcing functions, such as rainfall, hydrology, market prices, arrival of migratory fish, and water temperature. It therefore acts as a dynamic short-term multi-objective economic optimization for each time step. The MILP is capable of solving a general problem that includes a very realistic representation of the physical system characteristics in addition to the usual multi-objective optimization objectives and constraints included in economic models. In all of these models, the short-term objective function is a surrogate for achieving long-term multi-objective results. The long-term performance of any alternative (especially including operating strategies) is evaluated by simulation. An operating rule is the combination of conditions, parameters, constraints and objectives used to determine the formulation of the short-term optimization in each time step.
    Heuristic wrappers for the simulation program have been developed to improve the parameters of an operating rule, and research is under way on a wrapper that will employ a genetic algorithm to improve the form of the rule (conditions, constraints, and short-term objectives) as well. In the models, operating rules represent different models of human behavior, and the objective of the modeling is to find rules for human behavior that perform well in terms of long-term human objectives. The conceptual model used to represent human behavior incorporates multi-objective economic optimization of surrogate objectives, with rules that set those objectives based on current conditions while accounting for uncertainty, at least implicitly. The author asserts that real-world operating rules follow this form and have evolved because they have been perceived as successful in the past. Thus, the modeling efforts focus on human behavior in much the same way that economic models do. This paper illustrates the above concepts with real-world examples.
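
    As a toy illustration of the per-time-step economic optimization (not the OASIS MILP itself): with a separable linear objective and a single supply constraint, the optimum is simply to allocate water in order of marginal value. All names, values and volumes below are invented:

```python
def allocate(supply, demands):
    """Greedy allocation by marginal value ($/unit) -- exact for a single linear
    supply constraint, standing in for the per-time-step MILP described above.
    demands: list of (name, marginal_value, max_volume)."""
    alloc = {name: 0.0 for name, _, _ in demands}
    for name, value, need in sorted(demands, key=lambda d: -d[1]):
        take = min(need, supply)  # serve the highest-value use first
        alloc[name] = take
        supply -= take
    return alloc
```

    The real MILP generalizes this with network routing, integer on/off decisions and conditional constraints, which is why a solver rather than a greedy pass is needed there.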

  9. Robust model predictive control for multi-step short range spacecraft rendezvous

    NASA Astrophysics Data System (ADS)

    Zhu, Shuyi; Sun, Ran; Wang, Jiaolong; Wang, Jihe; Shao, Xiaowei

    2018-07-01

    This work presents a robust model predictive control (MPC) approach for the multi-step short-range spacecraft rendezvous problem. During the short-range phase concerned, the chaser is assumed to be initially outside the line-of-sight (LOS) cone. The rendezvous process therefore naturally comprises two steps: first transferring the chaser into the LOS cone, and then transferring it into the aimed region with its motion confined within the LOS cone. A novel MPC framework named Mixed MPC (M-MPC) is proposed, combining the Variable-Horizon MPC (VH-MPC) framework and the Fixed-Instant MPC (FI-MPC) framework. The M-MPC framework enables the optimization for the two steps to be carried out jointly rather than artificially separated, and its computational workload is acceptable for the usually low-power processors onboard spacecraft. Then, considering that disturbances including modeling error, sensor noise and thrust uncertainty may induce undesired constraint violations, a robust technique is developed and attached to the M-MPC framework to form a robust M-MPC approach. The robust technique is based on the chance-constrained idea, which ensures that constraints are satisfied with a prescribed probability. It improves on the robust technique proposed by Gavilan et al. by eliminating unnecessary conservatism through explicit incorporation of known statistical properties of the navigation uncertainty. The efficacy of the robust M-MPC approach is shown in a simulation study.
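
    The chance-constrained idea amounts to deterministic constraint tightening: under zero-mean Gaussian uncertainty with known standard deviation, backing a bound off by the inverse normal quantile times sigma guarantees the original constraint with probability 1 - epsilon. A minimal sketch of that tightening (a generic illustration, not the paper's exact formulation):

```python
from statistics import NormalDist

def tightened_bound(bound, sigma, epsilon):
    """Back off a linear constraint x <= bound so that P(x + w <= bound) >= 1 - epsilon
    when w ~ N(0, sigma**2): enforce x <= bound - Phi^{-1}(1 - epsilon) * sigma."""
    return bound - NormalDist().inv_cdf(1.0 - epsilon) * sigma
```

    Using the actual navigation covariance in sigma, rather than a worst-case bound, is what removes the conservatism the authors criticize in earlier robust MPC schemes.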

  10. Freestanding Aligned Multi-walled Carbon Nanotubes for Supercapacitor Devices

    NASA Astrophysics Data System (ADS)

    Moreira, João Vitor Silva; Corat, Evaldo José; May, Paul William; Cardoso, Lays Dias Ribeiro; Lelis, Pedro Almeida; Zanin, Hudson

    2016-11-01

    We report on the synthesis and electrochemical properties of multi-walled carbon nanotubes (MWCNTs) for supercapacitor devices. Freestanding vertically-aligned MWCNTs and MWCNT powder were grown concomitantly in a one-step chemical vapour deposition process. Samples were characterized by scanning and transmission electron microscopies and Fourier transform infrared and Raman spectroscopies. At similar film thicknesses and surface areas, the freestanding MWCNT electrodes showed higher electrochemical capacitance and gravimetric specific energy and power than the randomly-packed nanoparticle-based electrodes. This suggests that more ordered electrode film architectures facilitate faster electron and ion transport during the charge-discharge processes. Energy storage and supply or supercapacitor devices made from these materials could bridge the gap between rechargeable batteries and conventional high-power electrostatic capacitors.

  11. Research of flaw image collecting and processing technology based on multi-baseline stereo imaging

    NASA Astrophysics Data System (ADS)

    Yao, Yong; Zhao, Jiguang; Pang, Xiaoyan

    2008-03-01

    To address the practical demands of gun-bore flaw image collection, such as accurate optical design, complex algorithms and precise technical requirements, this paper presents the design framework of a 3-D image collection and processing system based on multi-baseline stereo imaging. The system mainly comprises a computer, an electrical control box, a stepping motor and a CCD camera, and it performs image collection, stereo matching, 3-D information reconstruction and post-processing. Theoretical analysis and experimental results show that the images collected by this system are precise and that it efficiently resolves the matching ambiguity produced by uniform or repeated textures. At the same time, the system offers faster measurement speed and higher measurement precision.

  12. Planning that works: Empowerment through stakeholder focused interactive planning (SFIP)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Beck, J.E.; Ison, S.A.

    1994-12-31

    This paper describes a powerful planning tool that can enable government, private industry, and public-interest organizations to actualize their visions through sound decision making. The stakeholder focused interactive planning model is designed to integrate stakeholders and ultimately gain their investment in the successful attainment of the vision. The only concessions required of the planning organization using this process are acceptance of the premises that sustained success of the vision requires the support of both internal and external stakeholders, and that each step in the process must be used as a validation of the previous step and is essential to the completion of the next step. What is stakeholder/public involvement? It is the process in which the stakeholders' (both internal and external) values, interests and expectations are included in decision-making processes. The primary goal of public involvement efforts is to include all those who have a stake in the decision, whether or not they have already been identified. Stakeholders are individuals, contractors, clients, suppliers, public organizations, state and local governments, Indian tribes, federal agencies, and other parties affected by decisions.

  13. Analyzing the dependence of oxygen incorporation current density on overpotential and oxygen partial pressure in mixed conducting oxide electrodes.

    PubMed

    Guan, Zixuan; Chen, Di; Chueh, William C

    2017-08-30

    The oxygen incorporation reaction, which involves the transformation of an oxygen gas molecule into two lattice oxygen ions in a mixed ionic and electronic conducting solid, is a ubiquitous and fundamental reaction in solid-state electrochemistry. To understand the reaction pathway and to identify the rate-determining step, near-equilibrium measurements have been employed to quantify the exchange coefficients as a function of oxygen partial pressure and temperature. However, because the exchange coefficient contains contributions from both forward and reverse reaction rate constants and depends on both oxygen partial pressure and oxygen fugacity in the solid, unique and definitive mechanistic assessment has been challenging. In this work, we derive a current density equation as a function of both oxygen partial pressure and overpotential, and consider both the near- and far-from-equilibrium limits. Rather than considering specific reaction pathways, we generalize the multi-step oxygen incorporation reaction into the rate-determining step, preceding and following quasi-equilibrium steps, and consider the number of oxygen ions and electrons involved in each. By evaluating the dependence of current density on oxygen partial pressure and overpotential separately, one obtains the reaction orders for oxygen gas molecules and for solid-state species in the electrode. We simulated the oxygen incorporation current density-overpotential curves for praseodymium-doped ceria for various candidate rate-determining steps. This work highlights a promising method for studying the exchange kinetics far from equilibrium.
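
    The generalized current-overpotential relation described above takes a Butler-Volmer-type form in which the exchange current density carries the oxygen partial-pressure reaction order and the transfer coefficients encode the assumed rate-determining step. A sketch with illustrative parameter values (gamma and the transfer coefficients are placeholders, not values fitted to Pr-doped ceria):

```python
from math import exp

F, R = 96485.0, 8.314  # Faraday constant (C/mol), gas constant (J/(mol*K))

def incorporation_current(eta, p_o2, t=873.0, i0_ref=1.0, gamma=0.25,
                          alpha_a=0.5, alpha_c=0.5):
    """i = i0(pO2) * [exp(alpha_a*F*eta/RT) - exp(-alpha_c*F*eta/RT)], where the
    pO2 reaction order gamma and the transfer coefficients alpha_a, alpha_c are
    set by the candidate rate-determining step (illustrative values here)."""
    i0 = i0_ref * p_o2 ** gamma
    return i0 * (exp(alpha_a * F * eta / (R * t)) - exp(-alpha_c * F * eta / (R * t)))
```

    Measuring how the current scales with pO2 at fixed overpotential (the gamma slope) and with overpotential at fixed pO2 (the Tafel slopes) then discriminates between candidate rate-determining steps, which is the strategy the abstract describes.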

  14. Bacoside A downregulates matrix metalloproteinases 2 and 9 in DEN-induced hepatocellular carcinoma.

    PubMed

    Janani, Panneerselvam; Sivakumari, Kanakarajan; Geetha, Arumugam; Yuvaraj, Sambandam; Parthasarathy, Chandrakesan

    2010-03-01

    Cancer metastasis is a complex multi-step process responsible for a majority of cancer-related deaths, affecting critical organs and causing complications in therapy. Hepatocellular carcinoma is a multi-factorial disease and the third most common cause of cancer-related mortality worldwide. Clinical and experimental studies have shown that MMP-2 and MMP-9 are involved in tumor invasion and metastasis, and their elevated expression has been associated with poor prognosis. Our recent studies showed strong anti-oxidant and hepatoprotective effects of bacoside A (BA) against carcinogen exposure. Nevertheless, the effect of BA on the activities and expression of MMP-2 and MMP-9 during hepatocellular carcinoma has not yet been characterized. Therefore, the present study was designed to assess this. Gelatin zymography showed that BA co-treatment significantly decreased the activities of MMP-2 and MMP-9, which are increased during hepatocellular carcinoma. Further immunoblot analysis showed decreased expression of MMP-2 and MMP-9 in rats co-treated with BA compared with rats with DEN-induced hepatocellular carcinoma. Our results reveal that BA exerts its anti-metastatic effect against DEN-induced hepatocellular carcinoma by inhibiting the activities and expression of MMP-2 and MMP-9. Copyright © 2010 John Wiley & Sons, Ltd.

  15. Desulfurization of benzonaphthothiophenes and dibenzothiophene with a Raney nickel catalyst and its relationship to the π-electron density

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nagai, M.; Urimoto, H.; Uetake, K.

    The hydrodesulfurization of heavy petroleum feedstocks and coal-derived liquids requires the conversion of high molecular weight compounds like dibenzothiophene and benzonaphthothiophenes. There are several studies in the literature which deal with the mechanism of the hydrodesulfurization of multi-ring thiophenic compounds on cobalt- or nickel-molybdenum catalysts at high pressure. However, there are only a few studies which relate the chemical reactivity of these compounds to their electronic structure. The reactivity of a multi-ring sulfur-containing compound is not determined solely by the size of the molecule. In addition, other workers studied the relationship between the first step in the hydrotreating reaction of benzonaphthothiophene and the Coulombic interaction term of the compounds using the CNDO/S method. Because there is competition between the different processes (hydrogenation and desulfurization) during reaction, it is difficult to understand the relationship between desulfurization and the electronic properties of the compounds under reaction conditions. The calculation of electronic structures necessarily involves many sigma bonds of hydrogenated aromatic rings as well as many electrons of high molecular weight compounds. For this reason, it is best to select a catalyst and reaction conditions under which desulfurization takes place without hydrogenation.

  16. Glial brain tumor detection by using symmetry analysis

    NASA Astrophysics Data System (ADS)

    Pedoia, Valentina; Binaghi, Elisabetta; Balbi, Sergio; De Benedictis, Alessandro; Monti, Emanuele; Minotto, Renzo

    2012-02-01

    In this work, a fully automatic algorithm to detect brain tumors by using symmetry analysis is proposed. In recent years, a great deal of research effort in the field of medical imaging has been focused on brain tumor segmentation. The quantitative analysis of MRI brain tumors yields useful key indicators of disease progression. The complex problem of segmenting tumors in MRI can be successfully addressed by considering modular, multi-step approaches that mimic the human visual inspection process. Tumor detection is often an essential preliminary phase for solving the segmentation problem successfully. In visual analysis of MRI, the first step of the expert's cognitive process is the detection of an anomaly with respect to normal tissue, whatever its nature. A healthy brain has a strong sagittal symmetry, which is weakened by the presence of a tumor. The comparison between the healthy and ill hemispheres, exploiting the fact that tumors are generally not symmetrically placed in both hemispheres, was used to detect the anomaly. A clustering method based on energy minimization through Graph-Cut is applied to the volume computed as the difference between the left hemisphere and the right hemisphere mirrored across the symmetry plane. Differential analysis loses the knowledge of which side the tumor is on; the ill hemisphere is then recognized through a histogram analysis. Many experiments were performed to assess the performance of the detection strategy on MRI volumes containing tumors varied in terms of shape, position and intensity level. The experiments showed good results even in complex situations.
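
    The hemisphere-difference idea described above can be sketched in a few lines. This is a simplified illustration only: it assumes the mid-sagittal plane is the central plane of the first array axis, replaces the paper's Graph-Cut clustering with simple thresholding, and uses invented function names.

```python
import numpy as np

def asymmetry_map(volume, threshold=0.2):
    """Mirror a brain volume across its (assumed central) sagittal
    plane and flag voxels whose left/right intensity difference
    exceeds `threshold`. Note the flag appears at a voxel and at its
    mirror position; side identification happens afterwards."""
    mirrored = volume[::-1, ...]          # flip across axis 0
    diff = np.abs(volume - mirrored)
    return diff > threshold

def ill_hemisphere(volume, mask):
    """Pick the hemisphere with the larger total flagged intensity,
    mimicking the histogram-based recognition of the ill side."""
    mid = volume.shape[0] // 2
    left = volume[:mid][mask[:mid]].sum()
    right = volume[mid:][mask[mid:]].sum()
    return 'left' if left > right else 'right'
```
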

  17. Development of a floating drug delivery system with superior buoyancy in gastric fluid using hot-melt extrusion coupled with pressurized CO₂.

    PubMed

    Almutairy, B K; Alshetaili, A S; Ashour, E A; Patil, H; Tiwari, R V; Alshehri, S M; Repka, M A

    2016-03-01

    The present study aimed to develop a continuous single-step manufacturing platform to prepare a porous, low-density, and floating multi-particulate system (mini-tablet, 4 mm size). This process involves injecting inert, non-toxic pressurized CO₂ gas (P-CO₂) in zone 4 of a 16-mm hot-melt extruder (HME) to continuously generate pores throughout the carrier matrix. Unlike conventional methods for preparing floating drug delivery systems, additional chemical excipients and additives are not needed in this approach to create minute openings on the surface of the matrices. The buoyancy efficiency of the prepared floating system (injection of P-CO₂) in terms of lag time (0 s) significantly improved (P < 0.05), compared to the formulation prepared by adding the excipient sodium bicarbonate (lag time 120 s). The main advantages of this novel manufacturing technique include: (i) no additional chemical excipients need to be incorporated in the formulation, (ii) few manufacturing steps are required, (iii) high buoyancy efficiency is attained, and (iv) the extrudate is free of toxic solvent residues. Floating mini-tablets containing acetaminophen (APAP) as a model drug within the matrix-forming carrier (Eudragit® RL PO) have been successfully processed via this combined technique (P-CO₂/HME). The desired controlled-release profile of APAP from the polymer Eudragit® RL PO was attained in the optimized formulation, which remains buoyant on the surface of gastric fluids up to the gastric emptying time (on average 4 h).

  18. Implementing successful strategic plans: a simple formula.

    PubMed

    Blondeau, Whitney; Blondeau, Benoit

    2015-01-01

    Strategic planning is a process. One way to think of strategic planning is to envision its development and design as a framework that will help your hospital navigate through internal and external changing environments over time. Although the process of strategic planning can feel daunting, following a simple formula involving five steps using the mnemonic B.E.G.I.N. (Begin, Evaluate, Goals & Objectives, Integration, and Next steps) will help the planning process feel more manageable, and lead you to greater success.

  19. Laser 3D micro-manufacturing

    NASA Astrophysics Data System (ADS)

    Piqué, Alberto; Auyeung, Raymond C. Y.; Kim, Heungsoo; Charipar, Nicholas A.; Mathews, Scott A.

    2016-06-01

    Laser-based materials processing techniques are gaining widespread use in micro-manufacturing applications. The use of laser microfabrication techniques enables the processing of micro- and nanostructures from a wide range of materials and geometries without the need for the masking and etching steps commonly associated with photolithography. This review aims to describe the broad applications space covered by laser-based micro- and nanoprocessing techniques and the benefits offered by the use of lasers in micro-manufacturing processes. Given their non-lithographic nature, these processes are also referred to as laser direct-write and constitute some of the earliest demonstrations of 3D printing or additive manufacturing at the microscale. As this review will show, the use of lasers enables precise control of the various types of processing steps—from subtractive to additive—over a wide range of scales with an extensive materials palette. Overall, laser-based direct-write techniques offer multiple modes of operation, including the removal (via ablative processes) and addition (via photopolymerization or printing) of most classes of materials, in many cases using the same equipment. The versatility provided by these multi-function, multi-material and multi-scale laser micro-manufacturing processes cannot be matched by photolithography or by other direct-write microfabrication techniques, and it offers unique opportunities for current and future 3D micro-manufacturing applications.

  20. Development of a multi-criteria evaluation system to assess growing pig welfare.

    PubMed

    Martín, P; Traulsen, I; Buxadé, C; Krieter, J

    2017-03-01

    The aim of this paper was to present an alternative multi-criteria evaluation model to assess animal welfare on farms based on the Welfare Quality® (WQ) project, using an example of welfare assessment of growing pigs. The WQ assessment protocol follows a three-step aggregation process: measures are aggregated into criteria, criteria into principles and principles into an overall assessment. This study focussed on the first step of the aggregation. Multi-attribute utility theory (MAUT) was used to produce a value of welfare for each criterion. The utility functions and the aggregation function were constructed in two separate steps. The Measuring Attractiveness by a Categorical Based Evaluation Technique (MACBETH) method was used for utility function determination and the Choquet integral (CI) was used as an aggregation operator. The WQ decision-makers' preferences were fitted in order to construct the utility functions and to determine the CI parameters. The methods were tested with generated data sets for farms of growing pigs. Using MAUT, results similar to those obtained by applying the WQ protocol aggregation methods were achieved. It can be concluded that, owing to the use of an interactive approach such as MACBETH, this alternative methodology is more transparent and more flexible than the methodology proposed by WQ, and it allows the model to be modified according, for instance, to new scientific knowledge.
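
    The Choquet integral used as the aggregation operator can be sketched as follows. This is the generic discrete Choquet integral with respect to a capacity (a set function over coalitions of criteria); the actual WQ capacities and MACBETH-derived utilities are not reproduced here.

```python
def choquet(values, capacity):
    """Discrete Choquet integral of criterion scores `values`
    (dict: criterion -> score in [0, 1]) with respect to a
    normalized capacity (dict: frozenset of criteria -> weight,
    with mu(empty set) = 0 and mu(all criteria) = 1)."""
    crits = sorted(values, key=values.get)   # ascending scores
    total, prev = 0.0, 0.0
    for i, c in enumerate(crits):
        # coalition of criteria scoring at least as high as `c`
        coalition = frozenset(crits[i:])
        total += (values[c] - prev) * capacity[coalition]
        prev = values[c]
    return total
```

    With an additive capacity the Choquet integral reduces to a weighted mean; non-additive capacities let the aggregation model interactions between criteria, which is the reason it is used in place of a weighted sum.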

  1. Abundance and composition of indigenous bacterial communities in a multi-step biofiltration-based drinking water treatment plant.

    PubMed

    Lautenschlager, Karin; Hwang, Chiachi; Ling, Fangqiong; Liu, Wen-Tso; Boon, Nico; Köster, Oliver; Egli, Thomas; Hammes, Frederik

    2014-10-01

    Indigenous bacterial communities are essential for biofiltration processes in drinking water treatment systems. In this study, we examined the microbial community composition and abundance of three different biofilter types (rapid sand, granular activated carbon, and slow sand filters) and their respective effluents in a full-scale, multi-step treatment plant (Zürich, CH). Detailed analysis of organic carbon degradation underpinned biodegradation as the primary function of the biofilter biomass. The biomass was present in concentrations ranging between 2-5 × 10¹⁵ cells/m³ in all filters but was phylogenetically, enzymatically and metabolically diverse. Based on 16S rRNA gene-based 454 pyrosequencing analysis of microbial community composition, similar microbial taxa (predominantly Proteobacteria, Planctomycetes, Acidobacteria, Bacteroidetes, Nitrospira and Chloroflexi) were present in all biofilters and in their respective effluents, but the ratio of microbial taxa was different in each filter type. This change was also reflected in the cluster analysis, which revealed a change of 50-60% in microbial community composition between the different filter types. This study documents the direct influence of the filter biomass on the microbial community composition of the final drinking water, particularly when the water is distributed without post-disinfection. The results provide new insights on the complexity of indigenous bacteria colonizing drinking water systems, especially in different biofilters of a multi-step treatment plant. Copyright © 2014 Elsevier Ltd. All rights reserved.

  2. Actual operation and regulatory activities on steam generator replacement in Japan

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Saeki, Hitoshi

    1997-02-01

    This paper summarizes the operating reactors in Japan, and the status of the steam generators in these plants. It reviews plans for replacement of existing steam generators, and then goes into more detail on the planning and regulatory steps which must be addressed in the process of accomplishing this maintenance. The paper also reviews the typical steps involved in the process of removal and replacement of steam generators.

  3. A comparison of transfer-appropriate processing and multi-process frameworks for prospective memory performance.

    PubMed

    McBride, Dawn M; Abney, Drew H

    2012-01-01

    We examined multi-process (MP) and transfer-appropriate processing descriptions of prospective memory (PM). Three conditions were compared that varied the overlap in processing type (perceptual/conceptual) between the ongoing and PM tasks such that two conditions involved a match of perceptual processing and one condition involved a mismatch in processing (conceptual ongoing task/perceptual PM task). One of the matched processing conditions also created a focal PM task, whereas the other two conditions were considered non-focal (Einstein & McDaniel, 2005). PM task accuracy and ongoing task completion speed in baseline and PM task conditions were measured. Accuracy results indicated a higher PM task completion rate for the focal condition than the non-focal conditions, a finding that is consistent with predictions made by the MP view. However, reaction time (RT) analyses indicated that PM task cost did not differ across conditions when practice effects are considered. Thus, the PM accuracy results are consistent with a MP description of PM, but RT results did not support the MP view predictions regarding PM cost.

  4. A Systematic Approach to Subgroup Classification in Intellectual Disability

    ERIC Educational Resources Information Center

    Schalock, Robert L.; Luckasson, Ruth

    2015-01-01

    This article describes a systematic approach to subgroup classification based on a classification framework and sequential steps involved in the subgrouping process. The sequential steps are stating the purpose of the classification, identifying the classification elements, using relevant information, and using clearly stated and purposeful…

  5. Validation of a multi-criteria evaluation model for animal welfare.

    PubMed

    Martín, P; Czycholl, I; Buxadé, C; Krieter, J

    2017-04-01

    The aim of this paper was to validate an alternative multi-criteria evaluation system to assess animal welfare on farms based on the Welfare Quality® (WQ) project, using an example of welfare assessment of growing pigs. This alternative methodology aims to be more transparent for stakeholders and more flexible than the methodology proposed by WQ. The WQ assessment protocol for growing pigs was implemented to collect data on different farms in Schleswig-Holstein, Germany. In total, 44 observations were carried out. The aggregation system proposed in the WQ protocol follows a three-step aggregation process: measures are aggregated into criteria, criteria into principles and principles into an overall assessment. This study focussed on the first two steps of the aggregation. Multi-attribute utility theory (MAUT) was used to produce a value of welfare for each criterion and principle. The utility functions and the aggregation function were constructed in two separate steps. The MACBETH (Measuring Attractiveness by a Categorical-Based Evaluation Technique) method was used for utility function determination and the Choquet integral (CI) was used as an aggregation operator. The WQ decision-makers' preferences were fitted in order to construct the utility functions and to determine the CI parameters. The validation of the MAUT model was divided into two steps: first, the results of the model were compared with the results of the WQ project at the criteria and principle levels; second, a sensitivity analysis of the model was carried out to demonstrate the relative importance of welfare measures in the different steps of the multi-criteria aggregation process. Using MAUT, results similar to those obtained by applying the WQ protocol aggregation methods were achieved, at both the criteria and principle levels. Thus, this model could be implemented to produce an overall assessment of animal welfare in the context of the WQ protocol for growing pigs. Furthermore, this methodology could also be used as a framework to produce an overall assessment of welfare for other livestock species. Two main findings were obtained from the sensitivity analysis: first, a limited number of measures had a strong influence on improving or worsening the level of welfare at the criteria level; second, the MAUT model was not very sensitive to an improvement or worsening of single welfare measures at the principle level. The use of weighted sums and the conversion of disease measures into ordinal scores should be reconsidered.

  6. Considerations In The Design And Specifications Of An Automatic Inspection System

    NASA Astrophysics Data System (ADS)

    Lee, David T.

    1980-05-01

    Considerable activities have been centered around the automation of manufacturing quality control and inspection functions. Several reasons can be cited for this development. The continuous pressure of direct and indirect labor cost increase is only one of the obvious motivations. With the drive for electronics miniaturization come more and more complex processes where control parameters are critical and the yield is highly susceptible to inadequate process monitor and inspection. With multi-step, multi-layer process for substrate fabrication, process defects that are not detected and corrected at certain critical points may render the entire subassembly useless. As a process becomes more complex, the time required to test the product increases significantly in the total build cycle. The urgency to reduce test time brings more pressure to improve in-process control and inspection. The advances and improvements of components, assemblies and systems such as micro-processors, micro-computers, programmable controllers, and other intelligent devices, have made the automation of quality control much more cost effective and justifiable.

  7. Ultramap: the all in One Photogrammetric Solution

    NASA Astrophysics Data System (ADS)

    Wiechert, A.; Gruber, M.; Karner, K.

    2012-07-01

    This paper describes in detail the dense matcher developed over several years by Vexcel Imaging in Graz for Microsoft's Bing Maps project. This dense matcher was developed exclusively for, and used by, Microsoft for the production of the 3D city models of Virtual Earth. It will now be made available to the public with the UltraMap software release in mid-2012, which represents a revolutionary step in digital photogrammetry. The dense matcher automatically generates digital surface models (DSM) and digital terrain models (DTM) from a set of overlapping UltraCam images. The models have an outstanding point density of several hundred points per square meter and sub-pixel accuracy. The dense matcher consists of two steps. The first step rectifies overlapping image areas to speed up the dense image matching process; this rectification ensures very efficient processing and detects occluded areas by applying a back-matching step. In this dense image matching process, a cost function consisting of a matching score and a smoothness term is minimized. In the second step, the resulting range image patches are fused into a DSM by optimizing a global cost function. The whole process is optimized for multi-core CPUs and optionally uses GPUs if available. UltraMap 3.0 also features an additional step, presented in this paper: a completely automated true-ortho and ortho workflow. For this, the UltraCam images are combined with the DSM or DTM in an automated rectification step, which yields high-quality true-ortho or ortho images from a highly automated workflow. The paper presents the new workflow and first results.
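
    The two matching ideas mentioned above, cost minimization plus back-matching to detect occlusions, can be caricatured with a toy 1-D scanline matcher. This is purely illustrative: it uses a plain sum-of-absolute-differences data term with no smoothness term, and all function names are invented.

```python
import numpy as np

def match_scanline(left, right, max_disp, window=2):
    """Toy 1-D dense matcher: for each pixel of `left`, find the
    disparity (leftward shift into `right`) minimizing the sum of
    absolute differences over a small window."""
    n = len(left)
    disp = np.zeros(n, dtype=int)
    for x in range(window, n - window):
        best, best_d = None, 0
        for d in range(0, min(max_disp, x - window) + 1):
            cost = np.abs(left[x - window:x + window + 1]
                          - right[x - d - window:x - d + window + 1]).sum()
            if best is None or cost < best:
                best, best_d = cost, d
        disp[x] = best_d
    return disp

def back_match_ok(disp_l, disp_r, tol=1):
    """Left-right consistency check used to flag occlusions: a pixel
    passes if matching back from the right image lands within `tol`
    of the original disparity."""
    ok = np.zeros_like(disp_l, dtype=bool)
    for x, d in enumerate(disp_l):
        xr = x - d
        if 0 <= xr < len(disp_r):
            ok[x] = abs(disp_r[xr] - d) <= tol
    return ok
```
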

  8. Participation of cysteine-rich secretory proteins (CRISP) in mammalian sperm-egg interaction.

    PubMed

    Cohen, Débora J; Busso, Dolores; Da Ros, Vanina; Ellerman, Diego A; Maldera, Julieta A; Goldweic, Nadia; Cuasnicu, Patricia S

    2008-01-01

    Mammalian fertilization is a complex multi-step process mediated by different molecules present on both gametes. CRISP1 (cysteine-rich secretory protein 1) is an epididymal protein thought to participate in gamete fusion through its binding to egg-complementary sites. Structure-function studies using recombinant fragments of CRISP1 as well as synthetic peptides reveal that its egg-binding ability resides in a 12 amino acid region corresponding to an evolutionary conserved motif of the CRISP family, named Signature 2 (S2). Further experiments analyzing both the ability of other CRISP proteins to bind to the rat egg and the amino acid sequence of their S2 regions show that the amino acid sequence of the S2 is needed for CRISP1 to interact with the egg. CRISP1 appears to be involved in the first step of sperm binding to the zona pellucida, identifying a novel role for this protein in fertilization. The observation that sperm testicular CRISP2 is also able to bind to the egg surface suggests a role for this protein in gamete fusion. Subsequent experiments confirmed the participation of CRISP2 in this step of fertilization and revealed that CRISP1 and CRISP2 interact with common egg surface binding sites. Together, these results suggest a functional cooperation between CRISP1 and CRISP2 to ensure the success of fertilization. These observations contribute to a better understanding of the molecular mechanisms underlying mammalian fertilization.

  9. Integrated experimental and technoeconomic evaluation of two-stage Cu-catalyzed alkaline–oxidative pretreatment of hybrid poplar

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bhalla, Aditya; Fasahati, Peyman; Particka, Chrislyn A.

    2018-05-17

    When applied to recalcitrant lignocellulosic feedstocks, multi-stage pretreatments can provide more processing flexibility to optimize or balance process outcomes such as increasing delignification, preserving hemicellulose, and maximizing enzymatic hydrolysis yields. We previously reported that adding an alkaline pre-extraction step to a copper-catalyzed alkaline hydrogen peroxide (Cu-AHP) pretreatment process resulted in improved sugar yields, but the process still utilized relatively high chemical inputs (catalyst and H2O2) and enzyme loadings. We hypothesized that by increasing the temperature of the alkaline pre-extraction step in water or ethanol, we could reduce the inputs required during Cu-AHP pretreatment and enzymatic hydrolysis without significant loss in sugar yield. We also performed technoeconomic analysis to determine whether ethanol or water was the more cost-effective solvent during alkaline pre-extraction and whether the expense associated with increasing the temperature was economically justified.

  10. Ultra-fast consensus of discrete-time multi-agent systems with multi-step predictive output feedback

    NASA Astrophysics Data System (ADS)

    Zhang, Wenle; Liu, Jianchang

    2016-04-01

    This article addresses the ultra-fast consensus problem of high-order discrete-time multi-agent systems based on a unified consensus framework. A novel multi-step predictive output mechanism is proposed under a directed communication topology containing a spanning tree. By predicting the outputs of a network several steps ahead and adding this information into the consensus protocol, it is shown that the asymptotic convergence factor is improved by a power of q + 1 compared to the routine consensus. The difficult problem of selecting the optimal control gain is solved well by introducing a variable called convergence step. In addition, the ultra-fast formation achievement is studied on the basis of this new consensus protocol. Finally, the ultra-fast consensus with respect to a reference model and robust consensus is discussed. Some simulations are performed to illustrate the effectiveness of the theoretical results.
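
    For contrast with the predictive protocol studied above, the routine first-order consensus iteration it builds on can be sketched as follows. The q-step output prediction itself is omitted, and the gain `eps` is an illustrative value, not a parameter from the article.

```python
import numpy as np

def consensus_step(x, A, eps=0.3):
    """One step of the routine discrete-time consensus protocol
    x[k+1] = x[k] + eps * sum_j a_ij (x_j - x_i), written compactly
    as x - eps * L @ x with the graph Laplacian L."""
    L = np.diag(A.sum(axis=1)) - A
    return x - eps * (L @ x)

def run_consensus(x0, A, steps=100, eps=0.3):
    """Iterate the protocol from initial states `x0` under
    adjacency matrix `A`."""
    x = np.asarray(x0, dtype=float)
    for _ in range(steps):
        x = consensus_step(x, A, eps)
    return x
```

    On an undirected graph this drives all agents to the average of the initial states; the article's multi-step predictive mechanism improves the asymptotic convergence factor of exactly this kind of iteration by a power of q + 1.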

  11. Validation of a One-Step Method for Extracting Fatty Acids from Salmon, Chicken and Beef Samples.

    PubMed

    Zhang, Zhichao; Richardson, Christine E; Hennebelle, Marie; Taha, Ameer Y

    2017-10-01

    Fatty acid extraction methods are time-consuming and expensive because they involve multiple steps and copious amounts of extraction solvents. In an effort to streamline the fatty acid extraction process, this study compared the standard Folch lipid extraction method to a one-step method involving a column that selectively elutes the lipid phase. The methods were tested on raw beef, salmon, and chicken. Compared to the standard Folch method, the one-step extraction process generally yielded statistically insignificant differences in chicken and salmon fatty acid concentrations, percent composition and weight percent. Initial testing showed that beef stearic, oleic and total fatty acid concentrations were significantly lower by 9-11% with the one-step method as compared to the Folch method, but retesting on a different batch of samples showed a significant 4-8% increase in several omega-3 and omega-6 fatty acid concentrations with the one-step method relative to the Folch. Overall, the findings reflect the utility of a one-step extraction method for routine and rapid monitoring of fatty acids in chicken and salmon. Inconsistencies in beef concentrations, although minor (within 11%), may be due to matrix effects. A one-step fatty acid extraction method has broad applications for rapidly and routinely monitoring fatty acids in the food supply and formulating controlled dietary interventions. © 2017 Institute of Food Technologists®.

  12. Adaptive Time Stepping for Transient Network Flow Simulation in Rocket Propulsion Systems

    NASA Technical Reports Server (NTRS)

    Majumdar, Alok K.; Ravindran, S. S.

    2017-01-01

    Fluid and thermal transients found in rocket propulsion systems, such as a propellant feedline system, are complex processes involving fast phases followed by slow phases. Their time-accurate computation therefore requires the use of a short time step initially, followed by a much larger time step. Yet there are instances that involve fast-slow-fast phases. In this paper, we present a feedback-control-based adaptive time stepping algorithm and discuss its use in network flow simulation of fluid and thermal transients. The time step is automatically controlled during the simulation by monitoring changes in certain key variables and by feedback. To demonstrate the viability of time adaptivity for engineering problems, we applied it to simulate water hammer and cryogenic chilldown in pipelines. Our comparisons and validation demonstrate the accuracy and efficiency of this adaptive strategy.
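
    A feedback step-size controller of the general kind described can be sketched as below. The constants (safety factor, growth clamps, step limits) are illustrative defaults, not the paper's monitored-variable scheme.

```python
def adapt_dt(dt, err, tol, order=1, fac=0.9, fac_min=0.2, fac_max=5.0,
             dt_min=1e-9, dt_max=1.0):
    """Elementary feedback step-size controller: grow the time step
    when the local error estimate `err` is below tolerance `tol`,
    shrink it when above, with a safety factor `fac` and clamping of
    both the growth ratio and the step itself."""
    if err == 0.0:
        ratio = fac_max                      # error negligible: grow maximally
    else:
        ratio = fac * (tol / err) ** (1.0 / (order + 1))
    ratio = min(max(ratio, fac_min), fac_max)  # limit step-size jumps
    return min(max(dt * ratio, dt_min), dt_max)
```

    During a fast transient the error estimate rises and the controller shortens the step; as the solution enters a slow phase the error falls and the step grows back, which is the fast-slow-fast behaviour the paper targets.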

  13. Adaptation Criteria for the Personalised Delivery of Learning Materials: A Multi-Stage Empirical Investigation

    ERIC Educational Resources Information Center

    Thalmann, Stefan

    2014-01-01

    Personalised e-Learning represents a major step-change from the one-size-fits-all approach of traditional learning platforms to a more customised and interactive provision of learning materials. Adaptive learning can support the learning process by tailoring learning materials to individual needs. However, this requires the initial preparation of…

  14. Multi-functional micromotor: microfluidic fabrication and water treatment application.

    PubMed

    Chen, Anqi; Ge, Xue-Hui; Chen, Jian; Zhang, Liyuan; Xu, Jian-Hong

    2017-12-05

    Micromotors are important for a wide variety of applications. Here, we develop a microfluidic approach for one-step fabrication of a Janus self-propelled micromotor with multiple functions. By fine tuning the fabrication parameters and loading functional nanoparticles, our micromotor reaches a high speed and achieves an oriented function to promote the water purification efficiency and recycling process.

  15. 76 FR 103 - Fisheries of the Gulf of Mexico; Southeast Data, Assessment, and Review (SEDAR); Public Meeting

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-01-03

    ... have implemented the Southeast Data, Assessment and Review (SEDAR) process, a multi-step method for... Workshop Schedule February 14-17, 2011; SEDAR 22 Review Workshop February 14, 2010: 1 p.m.-8 p.m.; February... the Gulf of Mexico; Southeast Data, Assessment, and Review (SEDAR); Public Meeting AGENCY: National...

  16. Forest statistics for Mississippi counties - 1987

    Treesearch

    Bryan L. Donner; F. Dee Hines

    1987-01-01

    The tables and figures in this report were derived from data obtained through a multi-resource inventory of 82 counties and five survey regions comprising the state of Mississippi (fig. 1). The data on forest acreage and timber volume were secured by a three-step process. First, a forest-non-forest classification was accomplished on aerial photographs for points...

  17. 77 FR 16812 - Fisheries of the Gulf of Mexico; Southeast Data, Assessment, and Review (SEDAR); Public Meetings

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-03-22

    ...: Notice of SEDAR 29 assessment webinars for Highly Migratory Species (HMS) blacktip shark (Carcharhinus limbatus). SUMMARY: The SEDAR 29 assessment of HMS blacktip shark will consist of a workshop and series of... Review (SEDAR) process, a multi-step method for determining the status of fish stocks in the Southeast...

  18. 78 FR 34046 - Fisheries of the Gulf of Mexico; Southeast Data, Assessment, and Review (SEDAR); Public Meetings

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-06-06

    ... Bonnethead sharks. SUMMARY: The SEDAR assessment of the HMS stocks of Atlantic Sharpnose and Bonnethead sharks will consist of one workshop and a series of Webinars. See SUPPLEMENTARY INFORMATION. DATES: The... status of fish stocks in the Southeast Region. SEDAR is a multi-step process including: (1) Data...

  19. Research to Go: Taking an Information Literacy Credit Course Online

    ERIC Educational Resources Information Center

    Long, Jessica; Burke, John J.; Tumbleson, Beth

    2012-01-01

    Adapting an existing face-to-face information literacy course that teaches undergraduates how to successfully conduct research and creating an online or hybrid version is a multi-step process. It begins with a desire to reach more students and help them achieve academic success. The primary learning outcomes for any information literacy course are…

  20. Making Healthy Eating and Physical Activity Policy Practice: Process Evaluation of a Group Randomized Controlled Intervention in Afterschool Programs

    ERIC Educational Resources Information Center

    Weaver, R. Glenn; Beets, Michael W.; Hutto, Brent; Saunders, Ruth P.; Moore, Justin B.; Turner-McGrievy, Gabrielle; Huberty, Jennifer L.; Ward, Dianne S.; Pate, Russell R.; Beighle, Aaron; Freedman, Darcy

    2015-01-01

    This study describes the link between level of implementation and outcomes from an intervention to increase afterschool programs' (ASPs) achievement of healthy eating and physical activity (HE-PA) Standards. Ten intervention ASPs implemented the Strategies-To-Enhance-Practice (STEPs), a multi-component, adaptive intervention framework identifying…

  1. Modified Unzipping Technique to Prepare Graphene Nano-Sheets

    NASA Astrophysics Data System (ADS)

    Al-Tamimi, B. H.; Farid, S. B. H.; Chyad, F. A.

    2018-05-01

Graphene nano-sheets have been prepared by unzipping multiwall carbon nanotubes (MWCNTs). The method comprises two chemical steps: a multi-parameter oxidation step that unzips the carbon nanotubes, followed by a reduction step that yields the final graphene nano-sheets. In the oxidation step, the amount of oxidant was minimized and balanced with a longer curing time. This modification was made to reduce the oxygen functional groups at the ends of the graphene basal planes, which would otherwise lower the electrical conductivity. A similar adjustment was made in the reduction step, i.e. the amount of chemicals consumed was reduced, making the overall process more economical and eco-friendly. The prepared nano-sheets were characterized by atomic force microscopy, scanning electron microscopy, and Raman spectroscopy. The average thickness of the prepared graphene was about 5.23 nm.

  2. A new spatial multi-criteria decision support tool for site selection for implementation of managed aquifer recharge.

    PubMed

    Rahman, M Azizur; Rusteberg, Bernd; Gogu, R C; Lobo Ferreira, J P; Sauter, Martin

    2012-05-30

This study reports the development of a new spatial multi-criteria decision analysis (SMCDA) software tool for selecting suitable sites for Managed Aquifer Recharge (MAR) systems. The tool combines existing multi-criteria evaluation methods with modern decision analysis techniques: non-compensatory screening, criteria standardization and weighting, and the Analytical Hierarchy Process (AHP) are combined with Weighted Linear Combination (WLC) and Ordered Weighted Averaging (OWA). The SMCDA tool can accommodate a wide range of decision-maker preferences. Its user-friendly interface guides the decision maker through the sequential steps of site selection: constraint mapping, criteria hierarchy, criteria standardization and weighting, and criteria overlay. The tool offers predetermined default criteria and standard methods to balance ease of use against efficiency. Integrated into ArcGIS, it can draw on GIS tools for spatial analysis, data processing, and display. The tool is non-site-specific, adaptive, and comprehensive, and may be applied to any type of site-selection problem. To demonstrate its robustness, a case study was planned and executed in the Algarve Region, Portugal, which also demonstrated the tool's efficiency in the decision-making process for selecting suitable MAR sites. Specific features such as built-in default criteria, explicit decision steps, and flexibility in choosing different options were key benefits in the study. The SMCDA tool can be augmented with groundwater flow and transport modeling to achieve a more comprehensive approach to selecting the best locations for MAR infiltration basins, as well as the locations of recovery wells and areas of groundwater protection. 
The new spatial multi-criteria analysis tool has already been implemented within the GIS-based Gabardine decision support system as an innovative MAR planning tool. Copyright © 2012 Elsevier Ltd. All rights reserved.
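The core WLC overlay step in record 2 is a weighted sum of standardized criterion layers using AHP-derived weights. A minimal sketch, with criterion names, weights, and cell values that are purely illustrative (not from the paper):

```python
import numpy as np

# Hypothetical standardized criterion layers (one value per raster cell,
# already scaled to [0, 1]); names are illustrative only.
criteria = {
    "depth_to_water_table": np.array([0.8, 0.4, 0.9]),
    "soil_infiltration":    np.array([0.6, 0.9, 0.3]),
    "slope":                np.array([0.9, 0.5, 0.7]),
}
# AHP-derived weights; they must sum to 1 for a valid WLC overlay.
weights = {"depth_to_water_table": 0.5, "soil_infiltration": 0.3, "slope": 0.2}

# Weighted Linear Combination: per-cell suitability = sum of weight * layer.
suitability = sum(weights[name] * values for name, values in criteria.items())
```

In the actual tool the same overlay is performed on ArcGIS raster layers after non-compensatory screening has masked out infeasible cells.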

  3. Multi-off-grid methods in multi-step integration of ordinary differential equations

    NASA Technical Reports Server (NTRS)

    Beaudet, P. R.

    1974-01-01

    Description of methods of solving first- and second-order systems of differential equations in which all derivatives are evaluated at off-grid locations in order to circumvent the Dahlquist stability limitation on the order of on-grid methods. The proposed multi-off-grid methods require off-grid state predictors for the evaluation of the n derivatives at each step. Progressing forward in time, the off-grid states are predicted using a linear combination of back on-grid state values and off-grid derivative evaluations. A comparison is made between the proposed multi-off-grid methods and the corresponding Adams and Cowell on-grid integration techniques in integrating systems of ordinary differential equations, showing a significant reduction in the error at larger step sizes in the case of the multi-off-grid integrator.
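Record 3 compares the off-grid methods against on-grid Adams integrators. A minimal sketch of the two-step on-grid Adams-Bashforth baseline (the off-grid predictor machinery itself is not reproduced here):

```python
import math

def adams_bashforth2(f, t0, y0, h, n):
    """Two-step Adams-Bashforth: y_{k+1} = y_k + h*(3/2*f_k - 1/2*f_{k-1}).
    A single forward-Euler step bootstraps the second starting value."""
    ts, ys = [t0], [y0]
    ts.append(t0 + h)
    ys.append(y0 + h * f(t0, y0))          # Euler bootstrap
    for k in range(1, n):
        fk = f(ts[k], ys[k])
        fk_prev = f(ts[k - 1], ys[k - 1])
        ys.append(ys[k] + h * (1.5 * fk - 0.5 * fk_prev))
        ts.append(ts[k] + h)
    return ts, ys

# Test problem: y' = -y, y(0) = 1, exact solution y(t) = exp(-t).
ts, ys = adams_bashforth2(lambda t, y: -y, 0.0, 1.0, 0.01, 100)
err = abs(ys[-1] - math.exp(-1.0))  # global error is O(h^2)
```

The Dahlquist barrier the abstract refers to limits the order of stable on-grid methods like this one; the proposed multi-off-grid methods evaluate derivatives at off-grid points to circumvent it.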

  4. The Study of Residential Areas Extraction Based on GF-3 Texture Image Segmentation

    NASA Astrophysics Data System (ADS)

    Shao, G.; Luo, H.; Tao, X.; Ling, Z.; Huang, Y.

    2018-04-01

The study uses standard stripmap, dual-polarization SAR images from GF-3 as the basic data. Processes and methods for extracting residential areas based on GF-3 image texture segmentation are compared and analyzed. GF-3 image processing includes radiometric calibration, complex data conversion, multi-look processing, and image filtering; a suitability analysis of different filtering methods showed that the Kuan filter is effective for extracting residential areas. Texture feature vectors were then calculated and analyzed using the GLCM (Gray Level Co-occurrence Matrix), with the moving window size, step size, and angle as parameters; a window size of 11x11, a step of 1, and an angle of 0° proved effective and optimal for residential area extraction. Using the FNEA (Fractal Net Evolution Approach), the GLCM texture images were segmented and the residential areas extracted by threshold setting. The extraction result was verified and assessed with a confusion matrix: overall accuracy is 0.897 and kappa is 0.881. For comparison, residential areas were also extracted by SVM classification of the GF-3 images; its overall accuracy is 0.09 lower than that of the texture image segmentation method. We conclude that residential area extraction based on multi-scale segmentation of GF-3 SAR texture images is simple and highly accurate. Since multi-spectral remote sensing images are difficult to obtain in southern China, which is cloudy and rainy throughout the year, this approach has practical reference value.
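The GLCM features in record 4 are built from co-occurrence counts of gray-level pairs at a given offset. A minimal sketch of the matrix itself on a tiny toy image (the paper's 11x11 moving-window computation and full feature vectors are not reproduced; an offset of dx=1, dy=0 corresponds to step 1, angle 0°):

```python
import numpy as np

def glcm(img, dx=1, dy=0, levels=4):
    """Gray Level Co-occurrence Matrix for offset (dx, dy): counts how often
    gray level i co-occurs with gray level j at that offset, then normalizes
    the counts to joint probabilities."""
    m = np.zeros((levels, levels))
    h, w = img.shape
    for y in range(h - dy):
        for x in range(w - dx):
            m[img[y, x], img[y + dy, x + dx]] += 1
    return m / m.sum()

def contrast(p):
    """Haralick contrast: sum of p(i, j) * (i - j)^2 over the GLCM."""
    i, j = np.indices(p.shape)
    return float(np.sum(p * (i - j) ** 2))

img = np.array([[0, 0, 1, 1],
                [0, 0, 1, 1],
                [0, 2, 2, 2],
                [2, 2, 3, 3]], dtype=int)
p = glcm(img)
```

In practice a library routine such as scikit-image's `graycomatrix` would replace the explicit loops; the sliding 11x11 window repeats this computation around every pixel.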

  5. Investigation into Cloud Computing for More Robust Automated Bulk Image Geoprocessing

    NASA Technical Reports Server (NTRS)

    Brown, Richard B.; Smoot, James C.; Underwood, Lauren; Armstrong, C. Duane

    2012-01-01

Geospatial resource assessments frequently require timely processing of large multivariate remote sensing data sets. In particular, disaster response requires rapid access to large data volumes, substantial storage space, and high-performance processing capability. Turning these data into usable information products requires a processing pipeline that can efficiently manage the required storage, computing utilities, and data handling. In recent years, cloud computing platforms have made available a powerful new computing infrastructure that can meet this need. To assess the utility of this resource, this project investigates cloud computing platforms for bulk, automated geoprocessing with respect to data handling and application development requirements. This presentation covers work conducted by the Applied Sciences Program Office at NASA Stennis Space Center. A prototypical set of image manipulation and transformation processes incorporating sample Unmanned Airborne System data was developed to create value-added products and tested for implementation on the "cloud". The project outlines the steps involved in creating and testing open-source process code on a local prototype platform, then transitioning this code, with its associated environment requirements, to an analogous but memory- and processor-enhanced cloud platform. A data processing cloud was used to store both standard digital camera panchromatic and multi-band image data, which were subsequently subjected to standard image processing functions such as NDVI (Normalized Difference Vegetation Index), NDMI (Normalized Difference Moisture Index), band stacking, reprojection, and similar data processes. Cloud infrastructure service providers were evaluated by taking these locally tested processing functions and applying them to a given cloud-enabled infrastructure to assess and compare environment setup options and enabled technologies. The project reviews findings observed when cloud platforms were evaluated for bulk geoprocessing capabilities based on data handling and application development requirements.
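The NDVI and NDMI functions exercised in record 5 are simple normalized band ratios. A minimal sketch, with illustrative band reflectance values:

```python
import numpy as np

def ndvi(nir, red):
    """Normalized Difference Vegetation Index: (NIR - Red) / (NIR + Red)."""
    nir, red = nir.astype(float), red.astype(float)
    return (nir - red) / (nir + red)

def ndmi(nir, swir):
    """Normalized Difference Moisture Index: (NIR - SWIR) / (NIR + SWIR)."""
    nir, swir = nir.astype(float), swir.astype(float)
    return (nir - swir) / (nir + swir)

# Illustrative reflectances for two pixels; higher NDVI indicates denser
# vegetation, higher NDMI indicates higher canopy moisture.
nir = np.array([[0.6, 0.5]])
red = np.array([[0.1, 0.3]])
swir = np.array([[0.2, 0.4]])
veg = ndvi(nir, red)
moist = ndmi(nir, swir)
```

Either index is an embarrassingly parallel per-pixel operation, which is what makes such functions natural candidates for the bulk cloud geoprocessing the record describes.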

  6. ADVANCED SULFUR CONTROL CONCEPTS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Apostolos A. Nikolopoulos; Santosh K. Gangwal; William J. McMichael

Conventional sulfur removal in integrated gasification combined cycle (IGCC) power plants involves numerous steps: COS (carbonyl sulfide) hydrolysis, amine scrubbing/regeneration, the Claus process, and tail-gas treatment. Advanced sulfur removal in IGCC systems typically involves the use of zinc oxide-based sorbents. The sulfided sorbent is regenerated using dilute air to produce a dilute SO₂ (sulfur dioxide) tail gas. Under previous contracts the highly effective first-generation Direct Sulfur Recovery Process (DSRP) for catalytic reduction of this SO₂ tail gas to elemental sulfur was developed. This process is currently undergoing field-testing. In this project, advanced concepts were evaluated to reduce the number of unit operations in sulfur removal and recovery. Substantial effort was directed towards developing sorbents that could be directly regenerated to elemental sulfur in an Advanced Hot Gas Process (AHGP). Development of this process is described in detail in Appendices A-F. RTI began the development of the Single-step Sulfur Recovery Process (SSRP) to eliminate the use of sorbents and multiple reactors in sulfur removal and recovery. This process showed promising preliminary results, and thus further development of the AHGP was abandoned in favor of the SSRP. The SSRP is a direct Claus process that consists of injecting SO₂ directly into the quenched coal gas from a coal gasifier and reacting the H₂S-SO₂ mixture over a selective catalyst to both remove and recover sulfur in a single step. The process is conducted at gasifier pressure and 125 to 160 C. The proposed commercial embodiment of the SSRP involves a liquid phase of molten sulfur with dispersed catalyst in a slurry bubble-column reactor (SBCR).

  7. MicroGen: a MIAME compliant web system for microarray experiment information and workflow management.

    PubMed

    Burgarella, Sarah; Cattaneo, Dario; Pinciroli, Francesco; Masseroli, Marco

    2005-12-01

Improvements in bio- and nano-technologies and biomolecular techniques have led to increasing production of high-throughput experimental data. Spotted cDNA microarray is one of the most widespread technologies, used both in single research laboratories and in biotechnology service facilities. Although they are routinely performed, spotted microarray experiments are complex procedures entailing several experimental steps and actors with different technical skills and roles. During an experiment, the actors involved, who may also be at different locations, need to access and share specific experiment information according to their roles. Furthermore, complete information describing all experimental steps must be collected in an orderly fashion to allow subsequent correct interpretation of the experimental results. We developed MicroGen, a web system for managing information and workflow in the production pipeline of spotted microarray experiments. It consists of a core multi-database system able to store all the data completely characterizing different spotted microarray experiments according to the Minimum Information About Microarray Experiments (MIAME) standard, and of an intuitive, user-friendly web interface able to support the collaborative work required among the multidisciplinary actors and roles involved in spotted microarray experiment production. MicroGen supports six user roles: the researcher who designs and requests the experiment, the spotting operator, the hybridisation operator, the image processing operator, the system administrator, and the generic public user who can access the unrestricted part of the system to get information about MicroGen services. MicroGen thus represents a MIAME-compliant information system that enables workflow management and supports collaborative work in spotted microarray experiment production.

  8. Discovery of optimal zeolites for challenging separations and chemical conversions through predictive materials modeling

    NASA Astrophysics Data System (ADS)

    Siepmann, J. Ilja; Bai, Peng; Tsapatsis, Michael; Knight, Chris; Deem, Michael W.

    2015-03-01

Zeolites play numerous important roles in modern petroleum refineries and have the potential to advance the production of fuels and chemical feedstocks from renewable resources. The performance of a zeolite as a separation medium and catalyst depends on its framework structure and the type or location of active sites. To date, 213 framework types have been synthesized and >330,000 thermodynamically accessible zeolite structures have been predicted. Hence, identifying optimal zeolites for a given application from the large pool of candidate structures is attractive for accelerating the pace of materials discovery. Here we identify, through a large-scale, multi-step computational screening process, promising zeolite structures for two energy-related applications: the purification of ethanol beyond the ethanol/water azeotropic concentration in a single separation step from fermentation broths, and the hydroisomerization of alkanes with 18-30 carbon atoms encountered in petroleum refining. These results demonstrate that predictive modeling and data-driven science can now be applied to solve some of the most challenging separation problems involving highly non-ideal mixtures and highly articulated compounds. Financial support from the Department of Energy Office of Basic Energy Sciences, Division of Chemical Sciences, Geosciences and Biosciences under Award DE-FG02-12ER16362 is gratefully acknowledged.

  9. Phosphate uptake studies of cross-linked chitosan bead materials.

    PubMed

    Mahaninia, Mohammad H; Wilson, Lee D

    2017-01-01

A systematic experimental study is reported that provides a molecular-based understanding of cross-linked chitosan beads and their adsorption properties in aqueous solution containing phosphate dianion (HPO₄²⁻) species. Synthetic modification of chitosan with epichlorohydrin and glutaraldehyde cross-linkers results in surface-modified beads with variable hydrophile-lipophile character and tunable HPO₄²⁻ uptake properties. The kinetic and thermodynamic adsorption properties of cross-linked chitosan beads with HPO₄²⁻ species were studied in aqueous solution. Complementary structural and physicochemical characterization of the chitosan beads via potentiometry, Raman spectroscopy, DSC, and dye adsorption measurements was carried out to establish structure-property relationships. The maximum uptake (Qm) of the bead systems with HPO₄²⁻ at equilibrium was 52.1 mg g⁻¹, whereas kinetic uptake for the chitosan bead/phosphate systems is relatively rapid (0.111-0.113 min⁻¹) with an intraparticle-diffusion rate-limiting step. The adsorption process follows a multi-step pathway involving inner- and outer-sphere complexes with significant changes in hydration. Phosphate uptake strongly depends on the composition and type of cross-linker used to prepare the chitosan beads. The adsorption isotherms and structural characterization of the bead systems illustrate the role of surface charge, hydrophile-lipophile balance, adsorption site accessibility, and hydration properties of the chitosan bead surface. Copyright © 2016 Elsevier Inc. All rights reserved.
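The rapid kinetics in record 9 (rate constants 0.111-0.113 min⁻¹, maximum uptake 52.1 mg g⁻¹) can be illustrated with a pseudo-first-order uptake curve. The functional form is an assumption here, since the record reports only the constants, not the fitted model:

```python
import math

def pseudo_first_order(qe, k1, t):
    """Pseudo-first-order uptake: q(t) = qe * (1 - exp(-k1 * t)).
    qe in mg/g, k1 in 1/min, t in minutes."""
    return qe * (1.0 - math.exp(-k1 * t))

# Constants from the record above.
qe, k1 = 52.1, 0.111
# Time to reach half the equilibrium uptake (about 6.2 min for this k1).
half_saturation_time = math.log(2) / k1
```

The short half-saturation time is one way to see why the abstract describes the uptake as "relatively rapid".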

  10. Construction and characterization of mismatch-containing circular DNA molecules competent for assessment of nick-directed human mismatch repair in vitro.

    PubMed

    Larson, Erik D; Nickens, David; Drummond, James T

    2002-02-01

The ability of cell-free extracts to correct DNA mismatches has been demonstrated in both prokaryotes and eukaryotes. Such an assay requires a template containing both a mismatch and a strand discrimination signal, and the multi-step construction process can be technically difficult. We have developed a three-step procedure for preparing DNA heteroduplexes containing a site-specific nick. The mismatch composition, sequence context, distance to the strand signal, and the means for assessing repair in each strand are adjustable features built into a synthetic oligonucleotide. Controlled ligation events involving three of the four DNA strands incorporate the oligonucleotide into a circular template and generate the repair-directing nick. Mismatch correction in either strand of a prototype G·T mismatch was achieved by placing a nick 10-40 bp away from the targeted base. This proximity of nick and mismatch represents a setting where repair has not been well characterized, but the presence of a nick was shown to be essential, as was the MSH2/MSH6 heterodimer, although low levels of repair occurred in extracts defective in each protein. All repair events were inhibited by a peptide that interacts with proliferating cell nuclear antigen and inhibits both mismatch repair and long-patch replication.

  11. Single-Step Assembly of Multi-Modal Imaging Nanocarriers: MRI and Long-Wavelength Fluorescence Imaging

    PubMed Central

    Pinkerton, Nathalie M.; Gindy, Marian E.; Calero-DdelC, Victoria L.; Wolfson, Theodore; Pagels, Robert F.; Adler, Derek; Gao, Dayuan; Li, Shike; Wang, Ruobing; Zevon, Margot; Yao, Nan; Pacheco, Carlos; Therien, Michael J.; Rinaldi, Carlos; Sinko, Patrick J.

    2015-01-01

MRI- and NIR-active, multi-modal Composite NanoCarriers (CNCs) are prepared using a simple, one-step process, Flash NanoPrecipitation (FNP). The FNP process allows for the independent control of the hydrodynamic diameter, co-core excipient and NIR dye loading, and iron oxide-based nanocrystal (IONC) content of the CNCs. In the controlled precipitation process, 10 nm IONCs are encapsulated into poly(ethylene glycol)-stabilized CNCs to make biocompatible T2 contrast agents. By adjusting the formulation, CNC size is tuned between 80 and 360 nm. Holding the CNC size constant at an intensity-weighted average diameter of 99 ± 3 nm (PDI width 28 nm), the particle relaxivity varies linearly with encapsulated IONC content, ranging from 66 to 533 mM⁻¹ s⁻¹ for CNCs formulated with 4 to 16 wt% IONC. To demonstrate the use of CNCs as in vivo MRI contrast agents, CNCs are surface-functionalized with liver-targeting hydroxyl groups. The CNCs enable the detection of 0.8 mm³ non-small cell lung cancer metastases in mice livers via MRI. Incorporating the hydrophobic NIR dye PZn3 into CNCs enables complementary visualization with long-wavelength fluorescence at 800 nm. In vivo imaging demonstrates the ability of CNCs to act both as MRI and fluorescent imaging agents. PMID:25925128
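Record 11 reports relaxivity varying linearly with IONC loading, from 66 mM⁻¹ s⁻¹ at 4 wt% to 533 mM⁻¹ s⁻¹ at 16 wt%. A sketch of the implied linear model; fitting a line through only the two reported endpoints is an assumption, since the intermediate data are not given in the abstract:

```python
import numpy as np

# Reported endpoints: (wt% IONC, relaxivity r2 in mM^-1 s^-1).
wt = np.array([4.0, 16.0])
r2 = np.array([66.0, 533.0])

# Degree-1 least-squares fit; with two points this passes through both.
slope, intercept = np.polyfit(wt, r2, 1)

# Interpolated relaxivity at an intermediate loading (illustrative only).
predicted_at_10wt = slope * 10.0 + intercept
```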

  12. Conceptual Transformation and Cognitive Processes in Origami Paper Folding

    ERIC Educational Resources Information Center

    Tenbrink, Thora; Taylor, Holly A.

    2015-01-01

    Research on problem solving typically does not address tasks that involve following detailed and/or illustrated step-by-step instructions. Such tasks are not seen as cognitively challenging problems to be solved. In this paper, we challenge this assumption by analyzing verbal protocols collected during an Origami folding task. Participants…

  13. Understanding, Developing, and Writing Effective IEPs: A Step-by-Step Guide for Educators

    ERIC Educational Resources Information Center

    Pierangelo, Roger; Giuliani, George A.

    2007-01-01

    Creating and evaluating Individualized Education Programs (IEPs) for students with disabilities is a major responsibility for teachers and school leaders, yet the process involves legal components not always understood by educators. In "Understanding, Developing, and Writing Effective IEPs," legal and special education experts Roger…

  14. Advanced metal lift-off process using electron-beam flood exposure of single-layer photoresist

    NASA Astrophysics Data System (ADS)

    Minter, Jason P.; Ross, Matthew F.; Livesay, William R.; Wong, Selmer S.; Narcy, Mark E.; Marlowe, Trey

    1999-06-01

In the manufacture of many types of integrated circuit and thin-film devices, it is desirable to use a lift-off process for the metallization step, to avoid manufacturing problems encountered when creating metal interconnect structures using plasma etch. These problems include both metal adhesion and plasma etch difficulties. Key to the success of the lift-off process is the creation of a retrograde, or undercut, profile in the photoresist before the metal deposition step. Until now, lift-off processing has relied on costly multi-layer photoresist schemes, image reversal, and non-repeatable photoresist processes to obtain the desired lift-off profiles in patterned photoresist. This paper presents a simple, repeatable process for creating robust, user-defined lift-off profiles in single-layer photoresist using a non-thermal electron-beam flood exposure. For this investigation, lift-off profiles created using electron-beam flood exposure of many popular photoresists were evaluated. Results for lift-off profiles created in positive-tone AZ7209 and ip3250 are presented here.

  15. Effect of paste processing on residue levels of imidacloprid, pyraclostrobin, azoxystrobin and fipronil in winter jujube.

    PubMed

    Peng, Wei; Zhao, Liuwei; Liu, Fengmao; Xue, Jiaying; Li, Huichen; Shi, Kaiwei

    2014-01-01

The changes in imidacloprid, pyraclostrobin, azoxystrobin and fipronil residues were studied to investigate the carryover of pesticide residues during the processing of winter jujube into paste. A multi-residue analytical method for winter jujube was developed based on the QuEChERS approach. Recoveries for the pesticides were between 87.5% and 116.2%, and LODs ranged from 0.002 to 0.1 mg kg⁻¹. The processing factor (Pf) is defined as the ratio of the pesticide residue concentration in the paste to that in the winter jujube. Pf was higher than 1 for the removal of extra water, while for the other steps it was generally less than 1, indicating that the whole process lowered pesticide residue levels in the paste. Peeling was the critical step for pesticide removal, and processing factors varied among the pesticides studied. The results are useful for optimising processing techniques in a manner that leads to considerable pesticide residue reduction.
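The processing factor in record 15 is the simple ratio defined in the abstract. A minimal sketch, using hypothetical residue concentrations:

```python
def processing_factor(residue_in_paste, residue_in_fruit):
    """Pf = residue concentration in the processed paste divided by the
    concentration in the raw winter jujube. Pf < 1 means the processing
    step reduces the residue level; Pf > 1 means it concentrates it."""
    return residue_in_paste / residue_in_fruit

# Hypothetical concentrations in mg/kg, for illustration only.
pf = processing_factor(0.04, 0.10)
```

A water-removal step concentrates residues (Pf > 1), while steps such as peeling dilute or remove them (Pf < 1), which is exactly the pattern the record reports.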

  16. Applications of Multi-Body Dynamical Environments: The ARTEMIS Transfer Trajectory Design

    NASA Technical Reports Server (NTRS)

    Folta, David C.; Woodard, Mark; Howell, Kathleen; Patterson, Chris; Schlei, Wayne

    2010-01-01

The application of forces in multi-body dynamical environments to permit the transfer of spacecraft from Earth orbit to Sun-Earth weak stability regions, and then return to Earth-Moon libration (L1 and L2) orbits, has been successfully accomplished for the first time. This demonstrated transfer is a positive step in the realization of a design process that can be used to transfer spacecraft with minimal Delta-V expenditures. Initialized using gravity assists to overcome fuel constraints, the ARTEMIS trajectory design has successfully placed two spacecraft into Earth-Moon libration orbits by means of these applications.

  17. A novel design solution to the fraenal notch of maxillary dentures.

    PubMed

    White, J A P; Bond, I P; Jagger, D C

    2013-09-01

    This study investigates a novel design feature for the fraenal notch of maxillary dentures, using computational and experimental methods, and shows that its use could significantly increase the longevity of the prosthesis. A two-step process can be used to create the design feature with current denture base materials, but would be highly dependent on the individual skill of the dental technician. Therefore, an alternative form of manufacture, multi-material additive layer manufacture (or '3D printing'), has been proposed as a future method for the direct production of complete dentures with multi-material design features.

  18. Energy transport towards magnetosphere: current background and perspectives

    NASA Astrophysics Data System (ADS)

    Savin, Sergey; Zelenyi, Lev

Against the background of a rising number of multi-scale magnetospheric satellite constellations (e.g. MMS, ROY, SCOPE), we discuss realistic options for future experimental efforts in the current international framework. Space weather predictions now require cross-scale (i.e. multi-point) and micro-scale measurements (down to the electron inertial length and gyroradius, i.e. a few km and 0.1 s), which should facilitate fundamental turbulence explorations with impact on, e.g., fusion and astrophysical problems. Both ROY and SCOPE could provide 4-6 spacecraft under wide international collaboration. For SCOPE the near-equatorial plane is the region for the multi-scale studies, while ROY will start from high latitudes and finish at intermediate and, hopefully, low ones. We suggest a new strategy for the correlated measurements instead of a multi-tetrahedron configuration: place spacecraft along the magnetospheric boundaries (magnetopause, neutral sheet, bow shock, etc.) rather than in a tetrahedron Cluster-like configuration, so as to obtain multi-scale measurements along the natural boundaries; monitor the processes along the streamlines in the magnetosheath; use an extra 2-8 nano/pico-satellites for campaigns of multi-spacecraft exploration; and utilize multi-frequency radio-tomography for monitoring the inter-spacecraft processes. Both SCOPE and ROY launchers have the respective payload resources which, with appropriate international cooperation, should provide a new step in magnetospheric plasma exploration.

  19. Effects of low energy E-beam irradiation on graphene and graphene field effect transistors and raman metrology of graphene on split gate test structures

    NASA Astrophysics Data System (ADS)

    Rao, Gayathri S.

    2011-12-01

Apart from its compelling performance in conventional nanoelectronic device geometries, graphene is an appropriate candidate for studying certain interesting phenomena (e.g. the Veselago lens effect) predicted on the basis of its linear electron dispersion relation. A key requirement for the observation of such phenomena in graphene, and for its use in conventional field-effect transistor (FET) devices, is the need to minimize defects, such as those consisting of (or resulting from) adsorbates and lattice non-uniformities, and to reduce deleterious substrate effects. Consequently, investigating the origin and interaction of defects in the graphene lattice is essential to improve and tailor graphene-based device performance. In this thesis, optical spectroscopic studies of the influence of low-energy electron irradiation on adsorbate-induced defectivity and doping for substrate-supported and suspended graphene were carried out, along with spectroscopic and transport measurements on graphene FETs. A comparative investigation of the effects of single-step versus multi-step, low-energy (500 eV) electron irradiation on suspended and substrate-supported graphene and on graphene FETs is reported. E-beam irradiation (single-step and multi-step) of substrate-supported graphene resulted in an increase in the Raman ID/IG ratio, largely from hydrogenation due to radiolysis of the interfacial water layer between the graphene and the SiO2 substrate and from irradiated surface adsorbates. GFETs subjected to single- and multi-step irradiation showed n-doping, with CNP (charge neutrality point) shifts of ˜ -8 and ˜ -16 V respectively. Correlation of this data with Raman analysis of suspended and supported graphene samples implied a strong role of the substrate and irradiation sequence in determining the level of doping. 
A correspondingly higher reduction in mobility per incident electron was also observed for GFETs subjected to multi-step irradiation compared to single-step, in line with the measured Raman ID/IG ratios. Additionally, the variation of the Raman G-band ΔFWHM was strongly dependent on the nature of the e-beam irradiation and the presence of the substrate. Single-step irradiated, substrate-supported graphene exhibited substantial broadening, while multi-step irradiation resulted in G-band narrowing. This behavior was not observed for suspended graphene, which indicated the addition or elimination of substrate-induced phonon-relaxation mechanisms in response to each type of irradiation. The narrowing of the FWHM(G) in the multi-step case is attributed to doping, consistent with the Dirac point shift of ˜ -16 V and the removal of Landau phonon damping above EF > ℏωG/2. In strong contrast, single-step irradiation of substrate-supported graphene yielded a broadening of the FWHM(G) accompanied by a CNP shift of ˜ -8 V, indicating appreciable n-doping. This reveals the presence of alternate phonon decay channels even when Landau damping above EF > ℏωG/2 is removed. It is proposed in this dissertation that this phenomenon is linked to hybridization of silicon oxide defect states (induced by single-step e-beam irradiation) and graphene electron states. This hybridization promotes a graphene phonon decay channel distinct from Landau damping, the latter being forbidden under sufficient doping. It is proposed that the alternate phonon decay channel involves two-component inelastic scattering, wherein the graphene phonons transfer energy to the carriers in the lattice, which in turn couple to the polar phonons of the substrate, resulting in mobility reduction. 
Furthermore, it is proposed that this defect-induced graphene phonon decay channel is inhibited in multi-step e-beam irradiation due to the presence of adsorbates on the graphene introduced during ambient exposure between radiation cycles. On e-beam irradiation, the adsorbates induce polar orientation of water dipoles at the graphene/SiO2 interface. This polar layer shifts the hybridized defect bands closer to the graphene Dirac bands, thereby reducing the inelastic scattering and inhibiting the phonon decay mediated by SiO2 surface polar phonons (SPP). This model also explains the enhancement of n-type doping observed in GFETs for multi-step irradiation. These results highlight the impact of substrate defects and the interaction of induced defectivity with the e-beam, along with the role of interfacial water, on graphene device performance. The thesis also presents data on Raman-based characterization of graphene, including layer number determination and carrier concentration measurement. Determination of layer number for graphene exfoliates focused on the splitting of the 2D Raman band. In addition, an alternate Raman-based thickness metrology was evaluated for CVD-based polycrystalline graphene. Both were carried out on split-gate test structures as a method for monolayer or bilayer confirmation in device geometries. Carrier concentration measurements of exfoliates on 300 nm SiO2 and split-gate test structure substrates were also characterized with back-gate biasing. These measurements made use of the stiffening of the Raman G-band with doping and the narrowing of the G-band FWHM. These results were important for validating conclusions from the e-beam irradiation experiments mentioned above regarding carrier doping.

  20. Organic-Inorganic Hybrid Materials: Multi-Functional Solids for Multi-Step Reaction Processes.

    PubMed

    Díaz, Urbano; Corma, Avelino

    2018-03-15

    The design of new hybrid materials with tailored properties at the nano-, meso-, and macro-scale, with the use of structural functional nanobuilding units, is carried out to obtain specific multi-functional materials. Organization into controlled 1D, 2D, and 3D architectures with selected functionalities is key for developing advanced catalysts, but this is hardly accomplished using conventional synthesis procedures. The use of pre-formed nanostructures, derived either from known materials or made with specific innovative synthetic methodologies, has enormous potential in the generation of multi-site catalytic materials for one-pot processes. The present concept article introduces a new archetype wherein self-assembled nanostructured building units are the base for the design of multifunctional catalysts, which combine catalytic efficiency with fast reactant and product diffusion. The article addresses a new generation of versatile hybrid organic-inorganic multi-site catalytic materials for use in the production of (chiral) high-added-value products within the scope of chemicals and fine chemicals production. The use of these multi-reactive solids for more nanotechnological applications, such as sensors, owing to the inclusion of electron donor-acceptor structural arrays, is also considered, together with their adsorption-desorption capacities arising from the combination of hydrophobic and hydrophilic sub-domains. The innovative structured hybrid materials for multipurpose processes considered here can allow the development of multi-stage one-pot reactions with industrial applications, using the materials as single nanoreactor systems and favoring more sustainable production pathways with economic, environmental, and energetic advantages. © 2018 Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.

  1. Mechanical, thermal and morphological characterization of polycarbonate/oxidized carbon nanofiber composites produced with a lean 2-step manufacturing process.

    PubMed

    Lively, Brooks; Kumar, Sandeep; Tian, Liu; Li, Bin; Zhong, Wei-Hong

    2011-05-01

    In this study we report the advantages of a 2-step method that incorporates an additional process pre-conditioning step for rapid and precise blending of the constituents prior to the commonly used melt compounding method for preparing polycarbonate/oxidized carbon nanofiber composites. This additional step (equivalent to a manufacturing cell) involves the formation of a highly concentrated solid nano-nectar of polycarbonate/carbon nanofiber composite using a solution mixing process, followed by melt mixing with pure polycarbonate. This combined method yields excellent dispersion and improved mechanical and thermal properties as compared to the 1-step melt mixing method. The test results indicated that inclusion of carbon nanofibers into composites via the 2-step method resulted in a dramatically reduced (48% lower) coefficient of thermal expansion compared to that of pure polycarbonate, and 30% lower than that from the 1-step processing, at the same loading of 1.0 wt%. Improvements were also found in dynamic mechanical analysis and flexural mechanical properties. The 2-step approach is more precise and leads to better dispersion, higher quality, consistency, and improved performance in critical application areas. It is also consistent with Lean Manufacturing principles, in which manufacturing cells are linked together using fewer key resources, creating a smoother production flow. Therefore, this 2-step process can be more attractive for industry.

  2. RNA splicing process analysis for identifying antisense oligonucleotide inhibitors with padlock probe-based isothermal amplification.

    PubMed

    Ren, Xiaojun; Deng, Ruijie; Wang, Lida; Zhang, Kaixiang; Li, Jinghong

    2017-08-01

    RNA splicing, which mainly involves two transesterification steps, is a fundamental process of gene expression, and its abnormal regulation contributes to serious genetic diseases. Antisense oligonucleotides (ASOs) are genetic control tools that can be used to specifically control genes through alteration of the RNA splicing pathway. Despite intensive research, how ASOs or various other factors influence the multiple processes of RNA splicing remains obscure. This is largely due to an inability to analyze the splicing efficiency of each step in the RNA splicing process with high sensitivity. We addressed this limitation by introducing a padlock probe-based isothermal amplification assay to achieve quantification of the specific products in different splicing steps. With this amplified assay, the roles that ASOs play in RNA splicing inhibition in the first and second steps could be distinguished. We identified that 5'-ASO could block RNA splicing by inhibiting the first step, while 3'-ASO could block RNA splicing by inhibiting the second step. This method provides a versatile tool for assisting efficient ASO design and discovering new splicing modulators and therapeutic drugs.

  3. Force-Manipulation Single-Molecule Spectroscopy Studies of Enzymatic Dynamics

    NASA Astrophysics Data System (ADS)

    Lu, H. Peter; He, Yufan; Lu, Maolin; Cao, Jin; Guo, Qing

    2014-03-01

    Subtle conformational changes play a crucial role in protein functions, especially in enzymatic reactions involving complex substrate-enzyme interactions and chemical reactions. We applied AFM-enhanced and magnetic tweezers-correlated single-molecule spectroscopy to study the mechanisms and dynamics of enzymatic reactions involving kinase and lysozyme proteins. Enzymatic reaction turnovers and the associated structural changes of individual protein molecules were observed simultaneously in real time by single-molecule FRET detection. Our single-molecule spectroscopy measurements of enzymatic conformational dynamics have revealed a time-bunching effect and intermittent coherence in the conformational state change dynamics involved in enzymatic reaction cycles. The coherent conformational state dynamics suggests that enzymatic catalysis involves a multi-step conformational motion along the coordinates of substrate-enzyme complex formation and product release. Our results support a multiple-conformational-state model, consistent with a complementary conformational-selection and induced-fit, enzymatic loop-gated conformational change mechanism in substrate-enzyme active complex formation.

  4. Multi-step prediction for influenza outbreak by an adjusted long short-term memory.

    PubMed

    Zhang, J; Nawata, K

    2018-05-01

    Influenza results in approximately 3-5 million annual cases of severe illness and 250 000-500 000 deaths. We urgently need an accurate multi-step-ahead time-series forecasting model to help hospitals perform dynamic assignment of beds to influenza patients for the annually varying influenza season, and to aid pharmaceutical companies in formulating a flexible manufacturing plan for the yearly different influenza vaccine. In this study, we utilised four different multi-step prediction algorithms in the long short-term memory (LSTM). The results showed that implementing multiple single-output predictions in a six-layer LSTM structure achieved the best accuracy. The mean absolute percentage errors from two- to 13-step-ahead prediction for the US influenza-like illness rates were all <15%, averaging 12.930%. To the best of our knowledge, this is the first time that LSTM has been applied and refined to perform multi-step-ahead prediction for influenza outbreaks. Hopefully, this modelling methodology can be applied in other countries and therefore help prevent and control influenza worldwide.
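The best-performing strategy in this record, multiple single-output ("direct") prediction, trains one model per forecast horizon rather than feeding predictions back recursively. A minimal pure-Python sketch of the direct strategy, with a trivial mean-offset model standing in for the six-layer LSTM (all function names are illustrative, not from the paper):

```python
def make_lagged(series, n_lags, horizon):
    """Build (lag-window, target) pairs where the target is `horizon` steps ahead."""
    X, y = [], []
    for i in range(len(series) - n_lags - horizon + 1):
        X.append(series[i:i + n_lags])
        y.append(series[i + n_lags + horizon - 1])
    return X, y

def fit_direct_models(series, n_lags, max_horizon):
    """One model per horizon step; here each 'model' is just the mean offset of
    the target above the lag-window mean (a stand-in for a trained LSTM)."""
    models = {}
    for h in range(1, max_horizon + 1):
        X, y = make_lagged(series, n_lags, h)
        offsets = [t - sum(x) / len(x) for x, t in zip(X, y)]
        models[h] = sum(offsets) / len(offsets)
    return models

def predict_direct(models, last_window):
    """Each horizon is predicted by its own model from the same final window."""
    base = sum(last_window) / len(last_window)
    return [base + models[h] for h in sorted(models)]
```

Because each horizon gets its own model fitted on targets shifted h steps ahead, errors do not compound across steps as they would in recursive forecasting.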

  5. Multi objective multi refinery optimization with environmental and catastrophic failure effects objectives

    NASA Astrophysics Data System (ADS)

    Khogeer, Ahmed Sirag

    2005-11-01

    Petroleum refining is a capital-intensive business. With stringent environmental regulations on the processing industry and declining refining margins, political instability, and increased risk of war and terrorist attacks in which refineries and fuel transportation grids may be targeted, higher pressures are exerted on refiners to optimize performance and find the best combination of feed and processes to produce salable products that meet stricter product specifications, while at the same time meeting refinery supply commitments and, of course, making profit. This is done through multi-objective optimization. For corporate refining companies and at the national level, Intra-Refinery and Inter-Refinery optimization is the second step in optimizing the operation of the whole refining chain as a single system. Most refinery-wide optimization methods do not cover multiple objectives such as minimizing environmental impact, avoiding catastrophic failures, or enhancing product spec upgrade effects. This work starts by carrying out a refinery-wide, single-objective optimization, then moves to multi-objective, single-refinery optimization. The last step is multi-objective, multi-refinery optimization, the objectives of which are analysis of the effects of economic, environmental, product spec, strategic, and catastrophic failure factors. Simulation runs were carried out using both MATLAB and ASPEN PIMS, utilizing nonlinear techniques to solve the optimization problem. The results addressed the need to debottleneck some refineries or transportation media in order to meet the demand for essential products under partial or total failure scenarios. They also addressed how importing some high-spec products can help recover some of the losses and what is needed in order to accomplish this. In addition, the results showed nonlinear relations among local and global objectives for some refineries.
The results demonstrate that refineries can have a local multi-objective optimum that does not follow the same trends as either global or local single-objective optima. Catastrophic failure effects on refinery operations and on local objectives are more significant than environmental objective effects, and changes in the capacity or the local objectives follow a discrete behavioral pattern, in contrast to environmental objective cases in which the effects are smoother. (Abstract shortened by UMI.)
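Multi-objective optimization of the kind carried out here ultimately returns trade-offs rather than a single optimum: a solution is kept only if no other solution is at least as good on every objective. A minimal sketch of extracting that nondominated (Pareto) set for two minimized objectives, purely illustrative and unrelated to the MATLAB/ASPEN PIMS models used in the work:

```python
def pareto_front(points):
    """Return the nondominated subset of (obj1, obj2) pairs, both minimized.

    A point p is dominated if some other point q is no worse on both
    objectives (q[0] <= p[0] and q[1] <= p[1]) without being p itself.
    """
    return [p for p in points
            if not any(q[0] <= p[0] and q[1] <= p[1] and q != p
                       for q in points)]
```

For example, among candidate (cost, environmental-impact) pairs (1, 5), (2, 2), (5, 1), and (3, 3), the last is dominated by (2, 2) and drops out of the front.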

  6. Transcriptome Profiling of Khat (Catha edulis) and Ephedra sinica Reveals Gene Candidates Potentially Involved in Amphetamine-Type Alkaloid Biosynthesis

    PubMed Central

    Groves, Ryan A.; Hagel, Jillian M.; Zhang, Ye; Kilpatrick, Korey; Levy, Asaf; Marsolais, Frédéric; Lewinsohn, Efraim; Sensen, Christoph W.; Facchini, Peter J.

    2015-01-01

    Amphetamine analogues are produced by plants in the genus Ephedra and by khat (Catha edulis), and include the widely used decongestants and appetite suppressants (1S,2S)-pseudoephedrine and (1R,2S)-ephedrine. The production of these metabolites, which derive from L-phenylalanine, involves a multi-step pathway partially mapped out at the biochemical level using knowledge of benzoic acid metabolism established in other plants, and direct evidence using khat and Ephedra species as model systems. Despite the commercial importance of amphetamine-type alkaloids, only a single step in their biosynthesis has been elucidated at the molecular level. We have employed Illumina next-generation sequencing technology, paired with Trinity and Velvet-Oases assembly platforms, to establish data-mining frameworks for Ephedra sinica and khat plants. Sequence libraries representing a combined 200,000 unigenes were subjected to an annotation pipeline involving direct searches against public databases. Annotations included the assignment of Gene Ontology (GO) terms used to allocate unigenes to functional categories. As part of our functional genomics program aimed at novel gene discovery, the databases were mined for enzyme candidates putatively involved in alkaloid biosynthesis. Queries used for mining included enzymes with established roles in benzoic acid metabolism, as well as enzymes catalyzing reactions similar to those predicted for amphetamine alkaloid metabolism. Gene candidates were evaluated based on phylogenetic relationships, FPKM-based expression data, and mechanistic considerations. Establishment of expansive sequence resources is a critical step toward pathway characterization, a goal with both academic and industrial implications. PMID:25806807
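The FPKM-based expression data used above to rank candidate unigenes normalizes raw read counts for both transcript length and library size. A minimal sketch of the standard FPKM formula (the numbers in the usage line below are invented for illustration):

```python
def fpkm(gene_reads, gene_length_bp, total_mapped_reads):
    """Fragments Per Kilobase of transcript per Million mapped reads."""
    # (reads / (length / 1000)) / (total / 1e6)
    # simplifies to reads * 1e9 / (length * total)
    return gene_reads * 1e9 / (gene_length_bp * total_mapped_reads)
```

For instance, 500 reads mapped to a 2 kb transcript in a 10-million-read library yield an FPKM of 25.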

  7. Hazard interactions and interaction networks (cascades) within multi-hazard methodologies

    NASA Astrophysics Data System (ADS)

    Gill, Joel C.; Malamud, Bruce D.

    2016-08-01

    This paper combines research and commentary to reinforce the importance of integrating hazard interactions and interaction networks (cascades) into multi-hazard methodologies. We present a synthesis of the differences between multi-layer single-hazard approaches and multi-hazard approaches that integrate such interactions. This synthesis suggests that ignoring interactions between important environmental and anthropogenic processes could distort management priorities, increase vulnerability to other spatially relevant hazards or underestimate disaster risk. In this paper we proceed to present an enhanced multi-hazard framework through the following steps: (i) description and definition of three groups (natural hazards, anthropogenic processes and technological hazards/disasters) as relevant components of a multi-hazard environment, (ii) outlining of three types of interaction relationship (triggering, increased probability, and catalysis/impedance), and (iii) assessment of the importance of networks of interactions (cascades) through case study examples (based on the literature, field observations and semi-structured interviews). We further propose two visualisation frameworks to represent these networks of interactions: hazard interaction matrices and hazard/process flow diagrams. Our approach reinforces the importance of integrating interactions between different aspects of the Earth system, together with human activity, into enhanced multi-hazard methodologies. Multi-hazard approaches support the holistic assessment of hazard potential and consequently disaster risk. We conclude by describing three ways by which understanding networks of interactions contributes to the theoretical and practical understanding of hazards, disaster risk reduction and Earth system management. 
Understanding interactions and interaction networks helps us to better (i) model the observed reality of disaster events, (ii) constrain potential changes in physical and social vulnerability between successive hazards, and (iii) prioritise resource allocation for mitigation and disaster risk reduction.
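The hazard interaction matrices and cascade networks proposed above map naturally onto an adjacency structure keyed by hazard, with each link labelled by its interaction type; a cascade is then a breadth-first walk along triggering links. A minimal sketch with a hypothetical interaction set (the hazards and relationship labels are illustrative, not the paper's case-study data):

```python
# Hypothetical interaction matrix: interactions[a][b] = relationship type
interactions = {
    "earthquake": {"landslide": "triggering", "tsunami": "triggering"},
    "landslide": {"flood": "increased_probability"},
    "storm": {"flood": "triggering"},
}

def cascade(start, interactions):
    """Breadth-first walk of triggering links from an initial hazard."""
    seen, frontier = {start}, [start]
    while frontier:
        nxt = []
        for hazard in frontier:
            for other, rel in interactions.get(hazard, {}).items():
                if rel == "triggering" and other not in seen:
                    seen.add(other)
                    nxt.append(other)
        frontier = nxt
    return seen - {start}
```

Starting from "earthquake", only triggering links are followed, so "flood" (reachable only through an increased-probability link) is excluded from the cascade.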

  8. To analyse a trace or not? Evaluating the decision-making process in the criminal investigation.

    PubMed

    Bitzer, Sonja; Ribaux, Olivier; Albertini, Nicola; Delémont, Olivier

    2016-05-01

    In order to broaden our knowledge and understanding of the decision steps in the criminal investigation process, we started by evaluating the decision to analyse a trace and the factors involved in this decision step. This decision step is embedded in the complete criminal investigation process, which involves multiple decision and triaging steps. Considering robbery cases occurring in a geographic region during a 2-year period, we studied the factors influencing the decision to submit biological traces, sampled directly at the scene of the robbery or on collected objects, for analysis. The factors were categorised into five knowledge dimensions: strategic, immediate, physical, criminal, and utility; decision tree analysis was then carried out. Factors in each category played a role in the decision to analyse a biological trace. Interestingly, factors involving information available prior to the analysis are of importance, such as the fact that a positive result (a profile suitable for comparison) is already available in the case, or that a suspect has been identified through traditional police work before analysis. One factor that was taken into account, but was not significant, is the matrix of the trace. Hence, the decision to analyse a trace is not influenced by this variable. Overall, the decision to analyse a trace is very complex, and many of the tested variables were taken into account. The decisions are often made on a case-by-case basis. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
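Decision tree analysis of the kind applied in this study repeatedly selects the factor whose split best separates outcomes, commonly scored by weighted Gini impurity. A minimal one-level sketch with hypothetical binary factors (the factor names are invented for illustration and are not the study's variables):

```python
def gini(labels):
    """Gini impurity of a list of class labels."""
    n = len(labels)
    return 1.0 - sum((labels.count(c) / n) ** 2 for c in set(labels))

def best_split(rows, labels):
    """rows: list of dicts of binary factors. Pick the factor whose split
    minimises weighted Gini impurity (one level of a decision tree)."""
    best = None
    for factor in rows[0]:
        left = [l for r, l in zip(rows, labels) if r[factor]]
        right = [l for r, l in zip(rows, labels) if not r[factor]]
        if not left or not right:
            continue  # factor does not split the data
        score = (len(left) * gini(left) + len(right) * gini(right)) / len(rows)
        if best is None or score < best[1]:
            best = (factor, score)
    return best[0]
```

A full tree recurses on each branch; here a factor that perfectly separates the outcomes wins with a weighted impurity of zero.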

  9. Measurement of intrahepatic pressure during radiofrequency ablation in porcine liver.

    PubMed

    Kawamoto, Chiaki; Yamauchi, Atsushi; Baba, Yoko; Kaneko, Keiko; Yakabi, Koji

    2010-04-01

    To identify the most effective procedures to avoid increased intrahepatic pressure during radiofrequency ablation, we evaluated different ablation methods. Laparotomy was performed in 19 pigs. Intrahepatic pressure was monitored using an invasive blood pressure monitor. Radiofrequency ablation was performed as follows: single-step standard ablation; single-step at 30 W; single-step at 70 W; 4-step at 30 W; 8-step at 30 W; 8-step at 70 W; and cooled-tip. The array was fully deployed in single-step methods. In the multi-step methods, the array was gradually deployed in four or eight steps. With the cooled-tip, ablation was performed by increasing output by 10 W/min, starting at 40 W. Intrahepatic pressure was as follows: single-step standard ablation, 154.5 +/- 30.9 mmHg; single-step at 30 W, 34.2 +/- 20.0 mmHg; single-step at 70 W, 46.7 +/- 24.3 mmHg; 4-step at 30 W, 42.3 +/- 17.9 mmHg; 8-step at 30 W, 24.1 +/- 18.2 mmHg; 8-step at 70 W, 47.5 +/- 31.5 mmHg; and cooled-tip, 114.5 +/- 16.6 mmHg. The radiofrequency ablation-induced area was spherical with single-step standard ablation, 4-step at 30 W, and 8-step at 30 W. Conversely, the ablated area was irregular with single-step at 30 W, single-step at 70 W, and 8-step at 70 W. The ablation time was significantly shorter for the multi-step method than for the single-step method. Increased intrahepatic pressure could be controlled using multi-step methods. From the shapes of the ablation area, 30-W 8-step expansions appear to be most suitable for radiofrequency ablation.

  10. Social Studies Research Papers: A Writing Process Approach.

    ERIC Educational Resources Information Center

    Gilstrap, Robert L.

    1987-01-01

    Describes a writing process approach to research papers which involves four steps: prewriting, composing, rewriting, and sharing. Illustrates the process using an intermediate grade level example but states that the process is appropriate at higher levels. Stresses that this approach is important because it integrates writing skills with social…

  11. Multi-tier drugs assessment in a decentralised health care system. The Italian case-study.

    PubMed

    Jommi, Claudio; Costa, Enrico; Michelon, Alessandra; Pisacane, Maria; Scroccaro, Giovanna

    2013-10-01

    To investigate the organisation and decision-making processes of regional and local therapeutic committees in Italy, as a case study of decentralised health care systems. A structured questionnaire was designed, validated, and self-administered to respondents. Committee membership, prioritisation, assessment process and criteria, and transparency of committees were investigated. The respondents represent 100% of the 17 regional committees across 21 regions (in 4 regions there is no regional formulary), 88% of the 16 hospital networks, and 42% of the 183 public hospitals. The assessment process appears fragmented and may take a long time: inclusion of drugs into hospital formularies requires two steps in most regions (regional and local assessment). Most of the therapeutic committees are closed to involvement by industry and patient associations. Prioritisation in the assessment is mostly driven by disease severity, clinical evidence, and the absence of therapeutic alternatives. Only 13 of the 17 regional committees have a public application form for drug inclusion into the regional formulary. Regional and local committees (i) often re-assess the clinical evidence already evaluated at the central level and (ii) mostly rely on comparative drug unit prices per DDD and drug budget impact. The level of transparency is quite low. The Italian case study provides useful insights into the appropriate management of multi-tier drug assessment, which is particularly complex in decentralised health care systems but exists also in centralised systems where drugs are assessed by local therapeutic committees. A clear definition of regulatory competences at different levels, closer collaboration between central, regional and local actors, and increased transparency are necessary to pursue consistency between central policies on price and reimbursement and budget accountability at the regional and local levels. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.

  12. Mechanism and the origins of stereospecificity in copper-catalyzed ring expansion of vinyl oxiranes: a traceless dual transition-metal-mediated process.

    PubMed

    Mustard, Thomas J L; Mack, Daniel J; Njardarson, Jon T; Cheong, Paul Ha-Yeon

    2013-01-30

    Density functional theory computations show that the Cu-catalyzed ring expansion of vinyl oxiranes is mediated by a traceless dual Cu(I)-catalyst mechanism. Overall, the reaction involves a monomeric Cu(I) catalyst, but a single key step, the Cu migration, requires two Cu(I) catalysts for the transformation. This dual-Cu step is found to be a true double-Cu(I) transition state rather than a single-Cu(I) transition state in the presence of an adventitious, spectator Cu(I). Both Cu(I) catalysts are involved in the bond forming and breaking process. The single-Cu(I) transition state is not a stationary point on the potential energy surface. Interestingly, the reductive elimination is rate-determining for the major diastereomeric product, while the Cu(I) migration step is rate-determining for the minor. Thus, while the reaction requires dual Cu(I) activation to proceed, kinetically the presence of the dual-Cu(I) step is untraceable. The diastereospecificity of this reaction is controlled by the Cu migration step. Suprafacial migration is favored over antarafacial migration due to the distorted Cu π-allyl in the latter.

  13. Genetics Home Reference: peroxisomal acyl-CoA oxidase deficiency

    MedlinePlus

    ... of certain fat molecules called very long-chain fatty acids (VLCFAs). Specifically, it is involved in the first step of a process called the peroxisomal fatty acid beta-oxidation pathway. This process shortens the VLCFA ...

  14. Cryogenic deformation of high temperature superconductive composite structures

    DOEpatents

    Roberts, Peter R.; Michels, William; Bingert, John F.

    2001-01-01

    An improvement in a process of preparing a composite high temperature oxide superconductive wire is provided and involves conducting at least one cross-sectional reduction step in the processing preparation of the wire at sub-ambient temperatures.

  15. Facile one-step construction of covalently networked, self-healable, and transparent superhydrophobic composite films

    NASA Astrophysics Data System (ADS)

    Lee, Yujin; You, Eun-Ah; Ha, Young-Geun

    2018-07-01

    Despite the considerable demand for bioinspired superhydrophobic surfaces with highly transparent, self-cleaning, and self-healable properties, a facile and scalable fabrication method for multifunctional superhydrophobic films with strong chemical networks has rarely been established. Here, we report a rationally designed facile one-step construction of covalently networked, transparent, self-cleaning, and self-healable superhydrophobic films via a one-step preparation and single-reaction process of multi-components. As coating materials for achieving the one-step fabrication of multifunctional superhydrophobic films, we included two different sizes of Al2O3 nanoparticles for hierarchical micro/nano dual-scale structures and transparent films, fluoroalkylsilane for both low surface energy and covalent binding functions, and aluminum nitrate for aluminum oxide networked films. On the basis of stability tests for the robust film composition, the optimized, covalently linked superhydrophobic composite films with a high water contact angle (>160°) and low sliding angle (<1°) showed excellent thermal stability (up to 400 °C), transparency (≈80%), self-healing, self-cleaning, and waterproof abilities. Therefore, the rationally designed, covalently networked superhydrophobic composite films, fabricated via a one-step solution-based process, can be further utilized for various optical and optoelectronic applications.

  16. Hazard Interactions and Interaction Networks (Cascades) within Multi-Hazard Methodologies

    NASA Astrophysics Data System (ADS)

    Gill, Joel; Malamud, Bruce D.

    2016-04-01

    Here we combine research and commentary to reinforce the importance of integrating hazard interactions and interaction networks (cascades) into multi-hazard methodologies. We present a synthesis of the differences between 'multi-layer single hazard' approaches and 'multi-hazard' approaches that integrate such interactions. This synthesis suggests that ignoring interactions could distort management priorities, increase vulnerability to other spatially relevant hazards or underestimate disaster risk. We proceed to present an enhanced multi-hazard framework, through the following steps: (i) describe and define three groups (natural hazards, anthropogenic processes and technological hazards/disasters) as relevant components of a multi-hazard environment; (ii) outline three types of interaction relationship (triggering, increased probability, and catalysis/impedance); and (iii) assess the importance of networks of interactions (cascades) through case-study examples (based on literature, field observations and semi-structured interviews). We further propose visualisation frameworks to represent these networks of interactions. Our approach reinforces the importance of integrating interactions between natural hazards, anthropogenic processes and technological hazards/disasters into enhanced multi-hazard methodologies. Multi-hazard approaches support the holistic assessment of hazard potential, and consequently disaster risk. We conclude by describing three ways by which understanding networks of interactions contributes to the theoretical and practical understanding of hazards, disaster risk reduction and Earth system management. Understanding interactions and interaction networks helps us to better (i) model the observed reality of disaster events, (ii) constrain potential changes in physical and social vulnerability between successive hazards, and (iii) prioritise resource allocation for mitigation and disaster risk reduction.

  17. Melanin fluorescence spectra by step-wise three photon excitation

    NASA Astrophysics Data System (ADS)

    Lai, Zhenhua; Kerimo, Josef; DiMarzio, Charles A.

    2012-03-01

    Melanin is the characteristic chromophore of human skin, with various potential biological functions. Kerimo discovered enhanced melanin fluorescence by step-wise three-photon excitation in 2011. In this article, the step-wise three-photon excited fluorescence (STPEF) spectrum of melanin between 450 nm and 700 nm is reported. The melanin STPEF spectrum exhibited an exponential increase with wavelength. However, there was a probability of about 33% that another kind of step-wise multi-photon excited fluorescence (SMPEF), peaking at 525 nm as shown by previous research, could also be generated using the same process. Using an excitation source at 920 nm as opposed to 830 nm increased the potential for generating the SMPEF peak at 525 nm. The SMPEF spectrum peaking at 525 nm photo-bleached faster than the STPEF spectrum.

  18. Effects of industrial processing on folate content in green vegetables.

    PubMed

    Delchier, Nicolas; Ringling, Christiane; Le Grandois, Julie; Aoudé-Werner, Dalal; Galland, Rachel; Georgé, Stéphane; Rychlik, Michael; Renard, Catherine M G C

    2013-08-15

    Folates are described as sensitive to different physical parameters such as heat, light, pH, and leaching. Most studies on folate degradation during processing or cooking treatments were carried out on model solutions or on vegetables with thermal treatments only. Our aim was to identify which steps were involved in folate loss in industrial processing chains, and which mechanisms underlay these losses. For this, folate contents were monitored along an industrial canning chain for green beans and along an industrial freezing chain for spinach. Folate contents decreased significantly, by 25% during the washing step for spinach in the freezing process, and by 30% in the green bean canning process after sterilisation, with 20% of the initial amount being transferred into the covering liquid. The main mechanism involved in folate loss during both green bean canning and spinach freezing was leaching. Limiting the contact between vegetables and water, or using steaming, seems to be an adequate measure to limit folate losses during processing. Copyright © 2013 Elsevier Ltd. All rights reserved.

  19. Process for remediation of plastic waste

    DOEpatents

    Pol, Vilas G. [Westmont, IL]; Thiyagarajan, Pappannan [Germantown, MD]

    2012-04-10

    A single-step process for degrading plastic waste by converting the plastic waste into carbonaceous products via thermal decomposition: the plastic waste is placed into a reactor, heated under an inert or air atmosphere until a temperature of 700 °C is achieved, the reactor is allowed to cool down, and the resulting decomposition products are recovered therefrom. The decomposition products that this process yields are carbonaceous materials, more specifically egg-shaped and spherical solid carbons. Additionally, in the presence of a transition metal compound, this thermal decomposition process produces multi-walled carbon nanotubes.

  20. The exoribonuclease Nibbler controls 3' end processing of microRNAs in Drosophila.

    PubMed

    Liu, Nan; Abe, Masashi; Sabin, Leah R; Hendriks, Gert-Jan; Naqvi, Ammar S; Yu, Zhenming; Cherry, Sara; Bonini, Nancy M

    2011-11-22

    MicroRNAs (miRNAs) are endogenous noncoding small RNAs with important roles in many biological pathways; their generation and activity are under precise regulation [1-3]. Emerging evidence suggests that miRNA pathways are precisely modulated with controls at the level of transcription [4-8], processing [9-11], and stability [12, 13], with miRNA deregulation linked with diseases [14] and neurodegenerative disorders [15]. In the Drosophila miRNA biogenesis pathway, long primary miRNA transcripts undergo sequential cleavage [16-18] to release the embedded miRNAs. Mature miRNAs are then loaded into Argonaute1 (Ago1) within the RNA-induced silencing complex (RISC) [19, 20]. Intriguingly, we found that Drosophila miR-34 displays multiple isoforms that differ at the 3' end, suggesting a novel biogenesis mechanism involving 3' end processing. To define the cellular factors responsible, we performed an RNA interference (RNAi) screen and identified a putative 3'→5' exoribonuclease CG9247/nibbler essential for the generation of the smaller isoforms of miR-34. Nibbler (Nbr) interacts with Ago1 and processes miR-34 within RISC. Deep sequencing analysis revealed a larger set of multi-isoform miRNAs that are controlled by nibbler. These findings suggest that Nbr-mediated 3' end processing represents a critical step in miRNA maturation that impacts miRNA diversity. Copyright © 2011 Elsevier Ltd. All rights reserved.

  1. Signal Processing in Functional Near-Infrared Spectroscopy (fNIRS): Methodological Differences Lead to Different Statistical Results.

    PubMed

    Pfeifer, Mischa D; Scholkmann, Felix; Labruyère, Rob

    2017-01-01

    Even though research in the field of functional near-infrared spectroscopy (fNIRS) has been performed for more than 20 years, consensus on signal processing methods is still lacking. A significant knowledge gap exists between established researchers and those entering the field. One major issue regularly observed in publications from researchers new to the field is the failure to consider possible signal contamination by hemodynamic changes unrelated to neurovascular coupling (i.e., scalp blood flow and systemic blood flow). This might be due to the fact that these researchers use the signal processing methods provided by the manufacturers of their measurement device without an advanced understanding of the performed steps. The aim of the present study was to investigate how different signal processing approaches (including and excluding approaches that partially correct for the possible signal contamination) affect the results of a typical functional neuroimaging study performed with fNIRS. In particular, we evaluated one standard signal processing method provided by a commercial company and compared it to three customized approaches. We thereby investigated the influence of the chosen method on the statistical outcome of a clinical data set (task-evoked motor cortex activity). No short-channels were used in the present study and therefore two types of multi-channel corrections based on multiple long-channels were applied. The choice of the signal processing method had a considerable influence on the outcome of the study. While methods that ignored the contamination of the fNIRS signals by task-evoked physiological noise yielded several significant hemodynamic responses over the whole head, the statistical significance of these findings disappeared when accounting for part of the contamination using a multi-channel regression. 
We conclude that adopting signal processing methods that correct for physiological confounding effects might yield more realistic results in cases where multi-distance measurements are not possible. Furthermore, we recommend using manufacturers' standard signal processing methods only when the user has an advanced understanding of every signal processing step performed.
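
    The multi-channel regression correction described above (removing hemodynamic contamination shared across long channels) can be sketched as ordinary least squares against a reference regressor. This is a minimal illustration, not the study's actual pipeline; using the across-channel mean as the systemic reference, and the helper name `regress_out_systemic`, are assumptions:

```python
import numpy as np

def regress_out_systemic(channels):
    """Remove a shared systemic component from fNIRS long-channel signals.

    channels: array of shape (n_channels, n_samples).
    The across-channel mean is used as the physiological reference
    regressor (an assumption; real pipelines may use short channels).
    """
    ref = channels.mean(axis=0)
    X = np.column_stack([ref, np.ones_like(ref)])  # regressor + intercept
    corrected = np.empty_like(channels)
    for i, y in enumerate(channels):
        beta, *_ = np.linalg.lstsq(X, y, rcond=None)
        corrected[i] = y - X @ beta  # keep only the residual activity
    return corrected
```

    Task-evoked responses that scale identically with the systemic signal are removed as well, which is exactly why the statistical significance of contaminated channels can disappear after such a correction.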

  2. The Generation and Maintenance of Visual Mental Images: Evidence from Image Type and Aging

    ERIC Educational Resources Information Center

    De Beni, Rossana; Pazzaglia, Francesca; Gardini, Simona

    2007-01-01

    Imagery is a multi-componential process involving different mental operations. This paper addresses whether separate processes underlie the generation, maintenance and transformation of mental images or whether these cognitive processes rely on the same mental functions. We also examine the influence of age on these mental operations for…

  3. Optical pattern recognition algorithms on neural-logic equivalent models and demonstration of their prospects and possible implementations

    NASA Astrophysics Data System (ADS)

    Krasilenko, Vladimir G.; Nikolsky, Alexander I.; Zaitsev, Alexandr V.; Voloshin, Victor M.

    2001-03-01

    The history of the 'equivalental algebra', an algebra-logic apparatus for describing neural-network paradigms and algorithms, is reviewed; this algebra unifies neural-network (NN) theory, linear algebra and generalized neurobiology, extended to the matrix case. A survey of 'equivalental models' of neural networks and associative memory is given, and new, modified matrix-tensor neuro-logic equivalental models (MTNLEMs) with double adaptive-equivalental weighing (DAEW) are proposed for spatially non-invariant recognition (SNIR) and space-invariant recognition (SIR) of 2D images (patterns). It is shown that MTNLEMs with DAEW are the most general: they can describe processes in NNs both within the frames of known paradigms and within a new 'equivalental' paradigm of the non-interaction type, and computation in NNs using the proposed MTNLEMs reduces to two-step and multi-step algorithms with step-by-step matrix-tensor procedures (for SNIR) and procedures for determining space-dependent equivalental functions from two images (for SIR).

  4. Multiple dual mode counter-current chromatography with variable duration of alternating phase elution steps.

    PubMed

    Kostanyan, Artak E; Erastov, Andrey A; Shishilov, Oleg N

    2014-06-20

    The multiple dual mode (MDM) counter-current chromatography separation processes consist of a succession of two isocratic counter-current steps and are characterized by the shuttle (forward and back) transport of the sample in chromatographic columns. In this paper, an improved MDM method based on variable duration of the alternating phase elution steps has been developed and validated. The MDM separation processes with variable-duration phase elution steps are analyzed. Based on the cell model, analytical solutions are developed for impulse and non-impulse sample loading at the beginning of the column. Using these analytical solutions, a calculation program is presented to facilitate the simulation of MDM with variable duration of phase elution steps, which can be used to select optimal process conditions for the separation of a given feed mixture. Two variants of MDM separation are analyzed: (1) one-step solute elution, in which the sample is transferred forward and back with the upper and lower phases inside the column until the desired separation of the components is reached, and then each individual component elutes entirely within one step; and (2) multi-step solute elution, in which the fractions of individual components are collected over several steps. It is demonstrated that proper selection of the duration of individual cycles (phase flow times) can greatly increase the separation efficiency of CCC columns. Experiments were carried out using model mixtures of compounds from the GUESSmix with hexane/ethyl acetate/methanol/water solvent systems. The experimental results are compared to the predictions of the theory, and a good agreement between theory and experiment has been demonstrated. Copyright © 2014 Elsevier B.V. All rights reserved.
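
    The shuttle transport underlying MDM can be illustrated with a simple Craig-type cell model. This is a hedged sketch, not the paper's analytical solution: equal phase volumes and a single solute with partition coefficient K are assumed, and `mdm_shuttle` is a hypothetical helper name:

```python
def mdm_shuttle(K, n_cells, schedule):
    """Craig-type cell model of multiple dual mode (MDM) CCC.

    K: partition coefficient (upper/lower phase), equal phase volumes
    assumed (a simplification).
    schedule: sequence of ('U', steps) / ('L', steps) phase-elution blocks,
    i.e. the variable-duration steps discussed in the abstract.
    Returns (column_profile, eluted_with_upper, eluted_with_lower).
    """
    f = K / (1.0 + K)          # equilibrium fraction in the upper phase
    col = [0.0] * n_cells
    col[0] = 1.0               # impulse sample loaded at the column inlet
    out_u, out_l = 0.0, 0.0
    for phase, steps in schedule:
        for _ in range(steps):
            if phase == 'U':   # upper phase flows forward by one cell
                moving = [f * c for c in col]
                out_u += moving[-1]
                col = [c - m for c, m in zip(col, moving)]
                for i in range(n_cells - 1, 0, -1):
                    col[i] += moving[i - 1]
            else:              # lower phase flows backward by one cell
                moving = [(1.0 - f) * c for c in col]
                out_l += moving[0]
                col = [c - m for c, m in zip(col, moving)]
                for i in range(n_cells - 1):
                    col[i] += moving[i + 1]
    return col, out_u, out_l
```

    Varying the step counts in `schedule` for solutes with different K values shows how the choice of phase flow times changes which end of the column each component leaves from.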

  5. Managing Multi-center Flow Cytometry Data for Immune Monitoring

    PubMed Central

    White, Scott; Laske, Karoline; Welters, Marij JP; Bidmon, Nicole; van der Burg, Sjoerd H; Britten, Cedrik M; Enzor, Jennifer; Staats, Janet; Weinhold, Kent J; Gouttefangeas, Cécile; Chan, Cliburn

    2014-01-01

    With the recent results of promising cancer vaccines and immunotherapy [1-5], immune monitoring has become increasingly relevant for measuring treatment-induced effects on T cells, and an essential tool for shedding light on the mechanisms responsible for a successful treatment. Flow cytometry is the canonical multi-parameter assay for the fine characterization of single cells in solution, and is ubiquitously used in pre-clinical tumor immunology and in cancer immunotherapy trials. Current state-of-the-art polychromatic flow cytometry involves multi-step, multi-reagent assays followed by sample acquisition on sophisticated instruments capable of capturing up to 20 parameters per cell at a rate of tens of thousands of cells per second. Given the complexity of flow cytometry assays, reproducibility is a major concern, especially for multi-center studies. A promising approach for improving reproducibility is the use of automated analysis borrowing from statistics, machine learning and information visualization [21-23], as these methods directly address the subjectivity, operator dependence, labor intensiveness and low fidelity of manual analysis. However, it is quite time-consuming to investigate and test new automated analysis techniques on large data sets without some centralized information management system. For large-scale automated analysis to be practical, the presence of consistent and high-quality data linked to the raw FCS files is indispensable. In particular, the use of machine-readable standard vocabularies to characterize channel metadata is essential when constructing analytic pipelines to avoid errors in processing, analysis and interpretation of results. For automation, this high-quality metadata needs to be programmatically accessible, implying the need for a consistent Application Programming Interface (API).
In this manuscript, we propose that upfront time spent normalizing flow cytometry data to conform to carefully designed data models enables automated analysis, potentially saving time in the long run. The ReFlow informatics framework was developed to address these data management challenges. PMID:26085786

  6. Discovery of optimal zeolites for challenging separations and chemical transformations using predictive materials modeling

    NASA Astrophysics Data System (ADS)

    Bai, Peng; Jeon, Mi Young; Ren, Limin; Knight, Chris; Deem, Michael W.; Tsapatsis, Michael; Siepmann, J. Ilja

    2015-01-01

    Zeolites play numerous important roles in modern petroleum refineries and have the potential to advance the production of fuels and chemical feedstocks from renewable resources. The performance of a zeolite as a separation medium and catalyst depends on its framework structure. To date, 213 framework types have been synthesized and >330,000 thermodynamically accessible zeolite structures have been predicted. Hence, identification of optimal zeolites for a given application from the large pool of candidate structures is attractive for accelerating the pace of materials discovery. Here we identify, through a large-scale, multi-step computational screening process, promising zeolite structures for two energy-related applications: the purification of ethanol from fermentation broths and the hydroisomerization of alkanes with 18-30 carbon atoms encountered in petroleum refining. These results demonstrate that predictive modelling and data-driven science can now be applied to solve some of the most challenging separation problems involving highly non-ideal mixtures and highly articulated compounds.

  7. Photoionization pathways and thresholds in generation of Lyman-α radiation by resonant four-wave mixing in Kr-Ar mixture

    NASA Astrophysics Data System (ADS)

    Louchev, Oleg A.; Saito, Norihito; Oishi, Yu; Miyazaki, Koji; Okamura, Kotaro; Nakamura, Jumpei; Iwasaki, Masahiko; Wada, Satoshi

    2016-09-01

    We develop a set of analytical approximations for estimating the combined effect of the various photoionization processes involved in the resonant four-wave mixing generation of ns pulsed Lyman-α (L-α) radiation using 212.556 nm and 820-845 nm laser radiation pulses in a Kr-Ar mixture: (i) multi-photon ionization, (ii) step-wise (2+1)-photon ionization via resonant 2-photon excitation of Kr followed by 1-photon ionization, and (iii) laser-induced avalanche ionization produced by the generated free electrons. The developed expressions, validated by order-of-magnitude estimates and available experimental data, allow us to identify an operating window at high input laser intensities that avoids the onset of full-scale discharge, loss of efficiency and inhibition of the generated L-α radiation. The calculations reveal an opportunity to scale up the output energy of the experimentally generated pulsed L-α radiation without a significant enhancement of photoionization.

  8. Modernizing an ambulatory care pharmacy in a large multi-clinic institution.

    PubMed

    Miller, R F; Herrick, J D

    1979-03-01

    The steps involved in modernizing an outdated outpatient pharmacy, including the functional planning process, development of a work-flow pattern that makes the patient an integral part of the system, budget considerations and evaluation of the new pharmacy, are described. Objectives of the modernization were to: (1) provide a facility conducive to efficient and high quality services for the ambulatory patient; (2) provide an attractive and comfortable area for patients and staff; (3) provide a work flow which keeps the patient in the system and allows the pharmacist time for instruction and patient education; and (4) establish a patient medication record system. After one year of operation, average overall prescription volume increased by 50%, while average waiting time declined by 74%. Facility and procedural changes allowed the pharmacist to substantially increase patient counseling activity. The application of functional planning and facility design to the renovation and restructuring of an outpatient pharmacy allowed pharmacists to provide efficient, patient-oriented service.

  9. How to Build the Master Schedule in 10 Easy Steps: A Guide for Secondary School Administrators

    ERIC Educational Resources Information Center

    Kussin, Steven S.

    2007-01-01

    This book is an incredibly valuable resource to anyone involved in building a master schedule. The author provides a comprehensive description of the processes involved and makes the reader aware of what needs to be considered and done throughout the process. One of the most time-consuming tasks for school leaders is creating a master schedule…

  10. Optimized breeding strategies for multiple trait integration: II. Process efficiency in event pyramiding and trait fixation.

    PubMed

    Peng, Ting; Sun, Xiaochun; Mumm, Rita H

    2014-01-01

    Multiple trait integration (MTI) is a multi-step process of converting an elite variety/hybrid for value-added traits (e.g. transgenic events) through backcross breeding. From a breeding standpoint, MTI involves four steps: single event introgression, event pyramiding, trait fixation, and version testing. This study explores the feasibility of marker-aided backcross conversion of a target maize hybrid for 15 transgenic events in light of the overall goal of MTI: recovering equivalent performance in the finished hybrid conversion along with reliable expression of the value-added traits. Building on results from optimized single event introgression (Peng et al. Optimized breeding strategies for multiple trait integration: I. Minimizing linkage drag in single event introgression. Mol Breed, 2013), which produced single event conversions of recurrent parents (RPs) with ≤8 cM of residual non-recurrent parent (NRP) germplasm and ~1 cM of NRP germplasm in the 20 cM regions flanking the event, this study focused on optimizing process efficiency in the second and third steps of MTI: event pyramiding and trait fixation. Using computer simulation and probability theory, we aimed to (1) fit an optimal breeding strategy for pyramiding eight events into the female RP and seven into the male RP, and (2) identify optimal breeding strategies for trait fixation to create a 'finished' conversion of each RP homozygous for all events. In addition, next-generation seed needs were taken into account for a practical approach to process efficiency. Building on work by Ishii and Yonezawa (Optimization of the marker-based procedures for pyramiding genes from multiple donor lines: I. Schedule of crossing between the donor lines. Crop Sci 47:537-546, 2007a), a symmetric crossing schedule for event pyramiding was devised for stacking eight (seven) events in a given RP.
Options for trait fixation breeding strategies considered selfing and doubled haploid approaches to achieve homozygosity as well as seed chipping and tissue sampling approaches to facilitate genotyping. With selfing approaches, two generations of selfing rather than one for trait fixation (i.e. 'F2 enrichment' as per Bonnett et al. in Strategies for efficient implementation of molecular markers in wheat breeding. Mol Breed 15:75-85, 2005) were utilized to eliminate bottlenecking due to extremely low frequencies of desired genotypes in the population. The efficiency indicators such as total number of plants grown across generations, total number of marker data points, total number of generations, number of seeds sampled by seed chipping, number of plants requiring tissue sampling, and number of pollinations (i.e. selfing and crossing) were considered in comparisons of breeding strategies. A breeding strategy involving seed chipping and a two-generation selfing approach (SC + SELF) was determined to be the most efficient breeding strategy in terms of time to market and resource requirements. Doubled haploidy may have limited utility in trait fixation for MTI under the defined breeding scenario. This outcome paves the way for optimizing the last step in the MTI process, version testing, which involves hybridization of female and male RP conversions to create versions of the converted hybrid for performance evaluation and possible commercial release.
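
    The bottleneck that motivates 'F2 enrichment' can be illustrated with elementary probability. Assuming unlinked loci and selfing from a fully heterozygous F1 (a simplification of the study's simulations; `frac_desired` is a hypothetical helper), the expected fraction of plants homozygous for the event allele at all loci is:

```python
def frac_desired(n_loci, g):
    """Expected fraction of plants homozygous for the event allele at all
    n unlinked loci after g generations of selfing from an F1 that is
    heterozygous at every locus.  For one locus, P(AA after g selfings)
    = (1 - 0.5**g) / 2, since heterozygosity halves each generation."""
    return ((1.0 - 0.5 ** g) / 2.0) ** n_loci
```

    With eight events, a single selfing generation leaves roughly 1 plant in 65,000 with the desired genotype, while a second generation improves the frequency about 26-fold, which is the kind of bottleneck the two-generation approach is meant to relieve.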

  11. Evaluating the compatibility of multi-functional and intensive urban land uses

    NASA Astrophysics Data System (ADS)

    Taleai, M.; Sharifi, A.; Sliuzas, R.; Mesgari, M.

    2007-12-01

    This research is aimed at developing a model for assessing land use compatibility in densely built-up urban areas. A new model was developed by combining a suite of existing methods and tools: a geographical information system, Delphi methods, and spatial decision support tools, namely multi-criteria evaluation analysis, the analytical hierarchy process and the ordered weighted average method. The developed model can calculate land use compatibility in both horizontal and vertical directions. Furthermore, the compatibility between the use of each floor in a building and its neighboring land uses can be evaluated. The method was tested in a built-up urban area located in Tehran, the capital city of Iran. The results show that the model is robust in clarifying different levels of physical compatibility between neighboring land uses. This paper describes the various steps and processes of developing the proposed land use compatibility evaluation model (CEM).

  12. A transition from using multi-step procedures to a fully integrated system for performing extracorporeal photopheresis: A comparison of costs and efficiencies.

    PubMed

    Azar, Nabih; Leblond, Veronique; Ouzegdouh, Maya; Button, Paul

    2017-12-01

    The Pitié Salpêtrière Hospital Hemobiotherapy Department, Paris, France, has been providing extracorporeal photopheresis (ECP) since November 2011, and started using the Therakos® CELLEX® fully integrated system in 2012. This report summarizes our single-center experience of transitioning from multi-step ECP procedures to the fully integrated ECP system, considering the capacity and cost implications. The total number of ECP procedures performed in 2011-2015 was derived from department records. The time taken to complete a single ECP treatment using a multi-step technique and using the fully integrated system at our department was assessed. Resource costs (2014 €) were obtained for materials and calculated for personnel time required. Time-driven activity-based costing methods were applied to provide a cost comparison. The number of ECP treatments per year increased from 225 (2012) to 727 (2015). A single multi-step procedure took 270 min, compared to 120 min for the fully integrated system. The total calculated per-session cost of performing ECP using the multi-step procedure was greater than with the CELLEX® system (€1,429.37 and €1,264.70 per treatment, respectively). For hospitals considering a transition from multi-step procedures to fully integrated methods for ECP where cost may be a barrier, time-driven activity-based costing should be utilized to gain a more comprehensive understanding of the full benefit such a transition offers. The example from our department confirmed not just cost and time savings, but also that the time efficiencies gained with CELLEX® allow for more patient treatments per year. © 2017 The Authors Journal of Clinical Apheresis Published by Wiley Periodicals, Inc.
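
    The time-driven activity-based costing comparison can be sketched as personnel time valued at a capacity cost rate plus per-session materials. The rates and materials costs below are purely illustrative assumptions, not the department's actual 2014 figures:

```python
def tdabc_session_cost(minutes, staff_rate_per_min, materials_cost):
    """Time-driven activity-based cost of one ECP session:
    personnel time valued at a capacity cost rate (cost per minute of
    available staff time) plus per-session materials."""
    return minutes * staff_rate_per_min + materials_cost

# Illustrative comparison with hypothetical rates: the longer multi-step
# procedure (270 min) versus the integrated system (120 min).
multi_step = tdabc_session_cost(270, 1.0, 1000.0)
integrated = tdabc_session_cost(120, 1.0, 1050.0)
```

    Even when the integrated system's materials cost more per kit, the 150 minutes of staff time saved per session can dominate the comparison, which is the pattern the department reported.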

  13. The Use of Novel Camtasia Videos to Improve Performance of At-Risk Students in Undergraduate Physiology Courses

    ERIC Educational Resources Information Center

    Miller, Cynthia J.

    2014-01-01

    Students in undergraduate physiology courses often have difficulty understanding complex, multi-step processes, and these concepts consume a large portion of class time. For this pilot study, it was hypothesized that online multimedia resources may improve student performance in a high-risk population and reduce the in-class workload. A narrated…

  14. Global Business Literacy in the Classroom: Developing and Applying an Assessment Framework

    ERIC Educational Resources Information Center

    Arevalo, Jorge A.; McCrea, Elizabeth; Yin, Jason Z.

    2012-01-01

    This study develops and applies a framework to evaluate undergraduate Global Business Literacy (GBL) learning outcomes, which is defined here as the ability to adapt and function in the global business context and to be knowledgeable about its core issues and trends. As a first step in a multi-stage research process, we used extant expatriate and…

  15. 77 FR 11066 - Fisheries of the Gulf of Mexico; Southeast Data, Assessment, and Review (SEDAR); Public Meetings

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-02-24

    ...: Notice of SEDAR Data/Assessment Workshop for Highly Migratory Species (HMS) blacktip sharks. SUMMARY: The SEDAR assessment of the HMS stocks of Gulf of Mexico blacktip sharks will consist of one workshop and a..., Assessment and Review (SEDAR) process, a multi-step method for determining the status of fish stocks in the...

  16. RCRA/UST, Superfund, and EPCRA hotline training module. Introduction to superfund community involvement. Directive

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    NONE

    1996-03-01

    This module covers EPA's Superfund community involvement program, a set of requirements under the National Contingency Plan (NCP) designed to ensure that the public is informed about site conditions and given the opportunity to comment on the proposed remedy for a Superfund site. The NCP serves to uphold the public's right to voice opinions and express concerns about Superfund site activities. EPA must involve communities throughout the Superfund process, particularly at critical decision-making steps in the process.

  17. Multi-stage circulating fluidized bed syngas cooling

    DOEpatents

    Liu, Guohai; Vimalchand, Pannalal; Guan, Xiaofeng; Peng, WanWang

    2016-10-11

    A method and apparatus for cooling hot gas streams in the temperature range 800 °C to 1600 °C using multi-stage circulating fluid bed (CFB) coolers is disclosed. The invention relates to cooling the hot syngas from coal gasifiers in which the hot syngas entrains substances that foul, erode and corrode heat transfer surfaces upon contact in conventional coolers. The hot syngas is cooled by extracting and indirectly transferring heat to heat transfer surfaces with circulating inert solid particles in CFB syngas coolers. The CFB syngas coolers are staged to facilitate generation of steam at multiple conditions and hot boiler feed water that are necessary for power generation in an IGCC process. The multi-stage syngas cooler can include internally circulating fluid bed coolers, externally circulating fluid bed coolers and hybrid coolers that incorporate features of both internally and externally circulating fluid bed coolers. Higher process efficiencies can be realized as the invention can handle hot syngas from various types of gasifiers without the need for a less efficient precooling step.

  18. An efficient multi-resolution GA approach to dental image alignment

    NASA Astrophysics Data System (ADS)

    Nassar, Diaa Eldin; Ogirala, Mythili; Adjeroh, Donald; Ammar, Hany

    2006-02-01

    Automating the process of postmortem identification of individuals using dental records is receiving increased attention in forensic science, especially with the large volume of victims encountered in mass disasters. Dental radiograph alignment is a key step required for automating the dental identification process. In this paper, we address the problem of dental radiograph alignment using a Multi-Resolution Genetic Algorithm (MR-GA) approach. We use the location and orientation of edge points as features, assume that affine transformations suffice to restore the geometric discrepancies between two images of a tooth, efficiently search the 6D space of affine parameters using a GA progressively across multi-resolution image versions, and use a Hausdorff distance measure to compute the similarity between a reference tooth and a query tooth under a candidate alignment transform. Testing results based on 52 teeth-pair images suggest that our algorithm converges to reasonable solutions in more than 85% of the test cases, with most of the error in the remaining cases due to excessive misalignments.
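
    The Hausdorff similarity measure used to score candidate alignments can be written directly from its definition. This sketch works on 2-D point sets and omits the GA search over affine parameters; the function names are illustrative:

```python
def directed_hausdorff(A, B):
    """Directed Hausdorff distance max_{a in A} min_{b in B} ||a - b||
    between two 2-D point sets given as sequences of (x, y) pairs."""
    return max(
        min(((ax - bx) ** 2 + (ay - by) ** 2) ** 0.5 for bx, by in B)
        for ax, ay in A
    )

def hausdorff(A, B):
    """Symmetric Hausdorff distance: an alignment fitness score (lower is
    better) that a GA would minimize over candidate affine transforms."""
    return max(directed_hausdorff(A, B), directed_hausdorff(B, A))
```

    In an alignment loop, the query tooth's edge points would be transformed by each candidate affine transform and scored with `hausdorff` against the reference tooth's edge points.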

  19. Thermodynamic modeling of small scale biomass gasifiers: Development and assessment of the ''Multi-Box'' approach.

    PubMed

    Vakalis, Stergios; Patuzzi, Francesco; Baratieri, Marco

    2016-04-01

    Modeling can be a powerful tool for designing and optimizing gasification systems. Modeling of small-scale, fixed-bed biomass gasifiers is of particular interest given their growing commercial use. Fixed bed gasifiers are characterized by a wide range of operational conditions and are multi-zoned processes. The reactants are distributed in different phases, and the products from each zone influence the following process steps and thus the composition of the final products. The present study aims to improve conventional 'Black-Box' thermodynamic modeling by developing multiple intermediate 'boxes' that calculate two-phase (solid-vapor) equilibria in small scale gasifiers; the model is therefore named 'Multi-Box'. Experimental data from a small scale gasifier have been used for the validation of the model. The returned results are significantly closer to the actual case-study measurements than those of single-stage thermodynamic modeling. Copyright © 2016 Elsevier Ltd. All rights reserved.

  20. Sea-land segmentation for infrared remote sensing images based on superpixels and multi-scale features

    NASA Astrophysics Data System (ADS)

    Lei, Sen; Zou, Zhengxia; Liu, Dunge; Xia, Zhenghuan; Shi, Zhenwei

    2018-06-01

    Sea-land segmentation is a key step in the information processing of ocean remote sensing images. Traditional sea-land segmentation algorithms ignore the local similarity prior of sea and land, and thus fail in complex scenarios. In this paper, we propose a new sea-land segmentation method for infrared remote sensing images that tackles the problem using superpixels and multi-scale features. Considering the connectivity and local similarity of sea and land, we interpret the sea-land segmentation task in terms of superpixels rather than pixels, so that similar pixels are clustered and local similarity is exploited. Moreover, the multi-scale features are elaborately designed, comprising a gray-level histogram and multi-scale total variation. Experimental results on infrared bands of Landsat-8 satellite images demonstrate that the proposed method obtains more accurate and more robust sea-land segmentation results than traditional algorithms.
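
    A simplified version of the per-superpixel features (a gray-level histogram plus a total-variation term) might look as follows. This is a sketch under stated assumptions: the single-scale gradient stands in for the paper's multi-scale design, and `superpixel_features` is a hypothetical helper taking a precomputed superpixel label map:

```python
import numpy as np

def superpixel_features(gray, labels, n_bins=8):
    """Per-superpixel features for an 8-bit gray image: a normalized
    gray-level histogram concatenated with the region's mean absolute
    gradient (a single-scale total-variation term).

    gray:   2-D array of intensities in [0, 255].
    labels: 2-D integer superpixel label map of the same shape.
    Returns {label: feature vector of length n_bins + 1}.
    """
    gy, gx = np.gradient(gray.astype(float))
    tv = np.abs(gx) + np.abs(gy)           # local total variation
    feats = {}
    for lab in np.unique(labels):
        mask = labels == lab
        hist, _ = np.histogram(gray[mask], bins=n_bins, range=(0, 256))
        hist = hist / hist.sum()           # normalize per superpixel
        feats[lab] = np.concatenate([hist, [tv[mask].mean()]])
    return feats
```

    A classifier over these feature vectors would then assign each superpixel to sea or land, exploiting the fact that sea regions tend to have flatter histograms and lower total variation than land.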

  1. Holistic approach for overlay and edge placement error to meet the 5nm technology node requirements

    NASA Astrophysics Data System (ADS)

    Mulkens, Jan; Slachter, Bram; Kubis, Michael; Tel, Wim; Hinnen, Paul; Maslow, Mark; Dillen, Harm; Ma, Eric; Chou, Kevin; Liu, Xuedong; Ren, Weiming; Hu, Xuerang; Wang, Fei; Liu, Kevin

    2018-03-01

    In this paper, we discuss the metrology methods and error budget that describe the edge placement error (EPE). EPE quantifies the pattern fidelity of a device structure made in a multi-patterning scheme. Here the pattern is the result of a sequence of lithography and etching steps, and consequently the contour of the final pattern contains error sources from the different process steps. EPE is computed by combining optical and e-beam metrology data. We show that a high-NA optical scatterometer can be used to densely measure in-device CD and overlay errors. A large-field e-beam system enables massive CD metrology, which is used to characterize the local CD error. The local CD distribution needs to be characterized beyond 6 sigma, which requires a high-throughput e-beam system. We present in this paper the first images from a multi-beam e-beam inspection system. We discuss our holistic patterning optimization approach to understand and minimize the EPE of the final pattern. As a use case, we evaluated a 5-nm logic patterning process based on Self-Aligned Quadruple Patterning (SAQP) using ArF lithography, combined with line-cut exposures using EUV lithography.
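
    A common simplified way to combine overlay and CD contributions into a single EPE number is a root-sum-square budget reported at a chosen sigma level. This generic approximation is for illustration only and is not the paper's exact error budget:

```python
import math

def epe_budget(overlay_sigma, global_cd_sigma, local_cd_sigma, n_sigma=3):
    """Simplified edge placement error budget.

    Each CD term contributes half its variation to one edge, the overlay
    term contributes fully, and independent terms are combined in
    quadrature (an assumption) and reported at n_sigma.
    """
    edge_terms = [overlay_sigma, global_cd_sigma / 2.0, local_cd_sigma / 2.0]
    return n_sigma * math.sqrt(sum(t * t for t in edge_terms))
```

    Such a budget makes the trade-off in the abstract concrete: shrinking the overlay term alone helps little once the local CD term, measured with massive e-beam CD metrology, dominates the quadrature sum.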

  2. The design of a multi-harmonic step-tunable gyrotron

    NASA Astrophysics Data System (ADS)

    Qi, Xiang-Bo; Du, Chao-Hai; Zhu, Juan-Feng; Pan, Shi; Liu, Pu-Kun

    2017-03-01

    The theoretical study of a step-tunable gyrotron controlled by successive excitation of multi-harmonic modes is presented in this paper. An axis-encircling electron beam is employed to eliminate harmonic mode competition. Physical pictures are presented to elaborate the multi-harmonic interaction mechanism and to determine the operating parameters at which arbitrary harmonic tuning can be realized by magnetic-field sweeping, achieving controlled multi-band frequency radiation. An important principle is revealed: a weak coupling coefficient at a high harmonic can be compensated by a high Q-factor. To some extent, this complementarity between a high Q-factor and a weak coupling coefficient gives high-harmonic modes the potential to achieve high efficiency. Based on a previously optimized magnetic cusp gun, the multi-harmonic step-tunable gyrotron is feasible using harmonic tuning of the first-to-fourth harmonic modes. Multimode simulation shows that the multi-harmonic gyrotron can operate on the 34 GHz first-harmonic TE11 mode, the 54 GHz second-harmonic TE21 mode, the 74 GHz third-harmonic TE31 mode, and the 94 GHz fourth-harmonic TE41 mode, with peak efficiencies of 28.6%, 35.7%, 17.1%, and 11.4%, respectively. The multi-harmonic step-tunable gyrotron opens new possibilities in millimeter-terahertz source development, especially for advanced terahertz applications.

  3. Characterization of a multi-metal binding biosorbent: Chemical modification and desorption studies.

    PubMed

    Abdolali, Atefeh; Ngo, Huu Hao; Guo, Wenshan; Zhou, John L; Du, Bin; Wei, Qin; Wang, Xiaochang C; Nguyen, Phuoc Dan

    2015-10-01

    This work addresses the preparation and characterization of a novel multi-metal binding biosorbent, including chemical modification and desorption studies. The biomass is a combination of tea waste, maple leaves and mandarin peels in fixed proportions, used to adsorb cadmium, copper, lead and zinc ions from aqueous solutions. The mechanism involved in metal removal was investigated by SEM, SEM/EDS and FTIR. SEM/EDS showed the presence of different chemicals and adsorbed heavy metal ions on the surface of the biosorbent. FTIR of both unmodified and modified biosorbents revealed the important role of carboxylate groups in heavy metal biosorption. Desorption was tested with different eluents, among which 0.1 M HCl showed the best performance. The effectiveness of a regeneration step with 1 M CaCl2 over five successive cycles of sorption and desorption shows that this multi-metal binding biosorbent (MMBB) can be effectively utilized to remove heavy metal ions from aqueous solutions across five cycles of sorption/desorption/regeneration. Copyright © 2015 Elsevier Ltd. All rights reserved.

  4. Molecular mechanisms involved in the early steps of flavivirus cell entry.

    PubMed

    Kaufmann, Bärbel; Rossmann, Michael G

    2011-01-01

    Flaviviruses enter their host cells by receptor-mediated endocytosis, a well-orchestrated process of receptor recognition, penetration and uncoating. Recent findings on these early steps in the life cycle of flaviviruses are the focus of this review. Copyright © 2010 Institut Pasteur. Published by Elsevier SAS. All rights reserved.

  5. Building and Managing Your Private Practice.

    ERIC Educational Resources Information Center

    Richards, Daniel L.

    The number of clinicians entering private practice is growing each day. This book presents a step-by-step process for prospective entrepreneurs who wish to become private practitioners. The text is divided into eight sections. Section 1 looks at the rationale for private practice and addresses the personal questions involving clinical skills,…

  6. Real-Time Visualization of an HPF-based CFD Simulation

    NASA Technical Reports Server (NTRS)

    Kremenetsky, Mark; Vaziri, Arsi; Haimes, Robert; Chancellor, Marisa K. (Technical Monitor)

    1996-01-01

    Current time-dependent CFD simulations produce very large multi-dimensional data sets at each time step. The visual analysis of computational results is traditionally performed by post-processing the static data on graphics workstations. We present results from an alternative approach in which we analyze the simulation data in situ, on each processing node, at the time of simulation. The locally analyzed results, usually more economical and in reduced form, are then combined and sent back for visualization on a graphics workstation.
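The in-situ pattern described above (analyze locally on each node, then combine the reduced results for visualization) can be sketched in a few lines. The node data and the particular reductions chosen (min/max/mean) are illustrative assumptions, not details from the paper:

```python
# Minimal sketch of in-situ data reduction: each simulation node summarizes
# its local portion of a large time-step field, and only the small summaries
# are gathered for visualization on the workstation.

def local_reduce(field):
    """Reduce a node's local data to a small summary dict."""
    return {"n": len(field), "min": min(field), "max": max(field), "sum": sum(field)}

def combine(summaries):
    """Combine per-node summaries into one global summary for visualization."""
    total_n = sum(s["n"] for s in summaries)
    return {
        "n": total_n,
        "min": min(s["min"] for s in summaries),
        "max": max(s["max"] for s in summaries),
        "mean": sum(s["sum"] for s in summaries) / total_n,
    }

# Example: three "nodes", each holding a slice of one time step's field.
node_data = [[1.0, 2.0, 3.0], [4.0, 5.0], [0.5, 6.5]]
global_summary = combine([local_reduce(d) for d in node_data])
```

Only the summaries cross the network, which is the economy the abstract refers to; the full field never leaves its node.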

  7. Development of a Colorectal Cancer Screening Intervention for Iranian Adults: Applying Intervention Mapping

    PubMed

    Besharati, Fereshteh; Karimi-Shahanjarini, Akram; Hazavehei, Seyed Mohammad Mehdi; Bashirian, Saeid; Bagheri, Fahimeh; Faradmal, Javad

    2017-08-27

    Background: While the incidence rate of colorectal cancer (CRC) has been increasing over the last three decades in Iran, very few interventions to increase CRC screening have been developed for the Iranian population. The purpose of this study was to describe the use of Intervention Mapping (IM) for applying theory and evidence, and considering local contexts, to develop a CRC screening program among adults in Iran. Materials and Methods: From April 2014 to July 2016, the six steps of the IM process were formulated and implemented. First, a needs assessment was conducted involving relevant stakeholders, using focus group discussions (n=10), individual interviews (n=20), and a household survey (n=480). Then a matrix of change objectives was developed for each behavioral outcome, and theoretical methods and their practical applications were identified to guide intervention development and implementation. A multi-component intervention was developed and piloted, and decisions on which parts of the intervention to retain were based on feedback from the pilot study. Finally, an evaluation plan including process and outcome evaluation was generated and conducted to inform future scale-up. Results: The needs assessment highlighted factors affecting CRC screening, including knowledge, self-efficacy, social support, and perceived benefits and barriers (e.g., financial problems and fear of cancer detection). Results of the needs assessment were used to develop the subsequent IM steps. The program utilized methods such as information delivery, modeling, and persuasion; practical applications included video presentations, group discussions, role playing, and postcards. The program was assessed through a cluster-randomized controlled trial, which showed a significant difference in CRC screening uptake between the intervention and control groups (P<0.001). Conclusions: IM is a useful process for designing a theory-based intervention addressing CRC screening among the Iranian population.

  8. Multi-Aperture Shower Design for the Improvement of the Transverse Uniformity of MOCVD-Derived GdYBCO Films

    PubMed Central

    Zhao, Ruipeng; Liu, Qing; Xia, Yudong; Zhang, Fei; Lu, Yuming; Cai, Chuanbing; Tao, Bowan; Li, Yanrong

    2017-01-01

    A multi-aperture shower design is reported to improve the transverse uniformity of GdYBCO superconducting films on the template of sputtered-LaMnO3/epitaxial-MgO/IBAD-MgO/solution deposition planarization (SDP)-Y2O3-buffered Hastelloy tapes. The GdYBCO films were prepared by the metal organic chemical vapor deposition (MOCVD) process. The transverse uniformity of structure, morphology, thickness, and performance was characterized by X-ray diffraction (XRD), scanning electron microscopy (SEM), a step profiler, and the standard four-probe method using the criterion of 1 μV/cm, respectively. By adopting the multi-aperture shower instead of the slit shower, the thickness difference between the middle and the edges seen with the slit shower design was effectively eliminated, as measured by the step profiler. Characterization by SEM showed that a GdYBCO film with a smooth surface was successfully prepared. Moreover, the transport critical current density (Jc) at the middle and edge positions, at 77 K and self-field, was found to exceed 5 MA/cm2 as measured by the micro-bridge four-probe method. PMID:28914793

  9. Performance implications from sizing a VM on multi-core systems: A data analytic application's view

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lim, Seung-Hwan; Horey, James L; Begoli, Edmon

    In this paper, we present a quantitative performance analysis of data analytics applications running on multi-core virtual machines. Such environments form the core of cloud computing. In addition, data analytics applications, such as Cassandra and Hadoop, are becoming increasingly popular on cloud computing platforms. This convergence necessitates a better understanding of the performance and cost implications of such hybrid systems. For example, the very first step in hosting applications in virtualized environments requires the user to configure the number of virtual processors and the size of memory. To understand the performance implications of this step, we benchmarked three Yahoo Cloud Serving Benchmark (YCSB) workloads in a virtualized multi-core environment. Our measurements indicate that the performance of Cassandra for YCSB workloads does not heavily depend on the processing capacity of the system, while the size of the data set relative to allocated memory is critical to performance. We also identified a strong relationship between the running time of workloads and various hardware events (last-level cache loads, misses, and CPU migrations). From this analysis, we provide several suggestions to improve the performance of data analytics applications running in cloud computing environments.

  10. Collaborative simulation method with spatiotemporal synchronization process control

    NASA Astrophysics Data System (ADS)

    Zou, Yisheng; Ding, Guofu; Zhang, Weihua; Zhang, Jian; Qin, Shengfeng; Tan, John Kian

    2016-10-01

    When designing a complex mechatronic system, such as a high-speed train, it is difficult to effectively simulate the entire system's dynamic behavior because it involves multi-disciplinary subsystems. Currently, the most practical approach for multi-disciplinary simulation is the interface-based coupling simulation method, but it faces a twofold challenge: spatial and temporal desynchronization among multi-directional coupling simulations of subsystems. A new collaborative simulation method with spatiotemporal synchronization process control is proposed for coupling-simulating a given complex mechatronic system across multiple subsystems on different platforms. The method consists of (1) a coupler-based coupling mechanism to define the interfacing and interaction among subsystems, and (2) a simulation process control algorithm to realize the coupling simulation in a spatiotemporally synchronized manner. The test results from a case study show that the proposed method (1) can be used to simulate subsystem interactions under different simulation conditions in an engineering system, and (2) effectively supports multi-directional coupling simulation among multi-disciplinary subsystems. The method has been successfully applied in China's high-speed train design and development processes, demonstrating that it can be applied to a wide range of engineering systems with improved design and simulation efficiency and effectiveness.
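The coupler idea the abstract describes (exchange interface values, then advance every subsystem with a shared step so they stay synchronized in time) can be illustrated with a toy two-subsystem co-simulation. Both subsystem models and their coupling terms are invented for illustration, not taken from the paper:

```python
# Toy interface-based coupling simulation with a coupler that enforces a
# shared (synchronized) macro time step between two subsystems.

def step_a(x, y_in, dt):
    """Subsystem A advances its state x using B's interface output y_in."""
    return x + dt * (-x + y_in)

def step_b(y, x_in, dt):
    """Subsystem B advances its state y using A's interface output x_in."""
    return y + dt * (-2.0 * y + x_in)

def cosimulate(x0, y0, dt, n_steps):
    """Coupler: exchange interface data, then advance both subsystems with
    the same dt each macro step (temporal synchronization)."""
    x, y = x0, y0
    for _ in range(n_steps):
        x_out, y_out = x, y       # exchange interface values first
        x = step_a(x, y_out, dt)  # both subsystems then step together,
        y = step_b(y, x_out, dt)  # never drifting apart in simulated time
    return x, y

x_end, y_end = cosimulate(x0=1.0, y0=0.0, dt=0.01, n_steps=1000)
```

With a stable coupled system like this toy one, both states decay toward equilibrium; the point of the sketch is the exchange-then-step loop, not the dynamics.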

  11. Massively Parallel Signal Processing using the Graphics Processing Unit for Real-Time Brain-Computer Interface Feature Extraction.

    PubMed

    Wilson, J Adam; Williams, Justin C

    2009-01-01

    The clock speeds of modern computer processors have nearly plateaued in the past 5 years. Consequently, neural prosthetic systems that rely on processing large quantities of data in a short period of time face a bottleneck, in that it may not be possible to process all of the data recorded from an electrode array with high channel counts and bandwidth, such as electrocorticographic grids or other implantable systems. Therefore, in this study a method of using the processing capabilities of a graphics card [graphics processing unit (GPU)] was developed for real-time neural signal processing of a brain-computer interface (BCI). The NVIDIA CUDA system was used to offload processing to the GPU, which is capable of running many operations in parallel, potentially greatly increasing the speed of existing algorithms. The BCI system records many channels of data, which are processed and translated into a control signal, such as the movement of a computer cursor. This signal processing chain involves computing a matrix-matrix multiplication (i.e., a spatial filter), followed by calculating the power spectral density on every channel using an auto-regressive method, and finally classifying appropriate features for control. In this study, the first two computationally intensive steps were implemented on the GPU, and the speed was compared to both the current implementation and a multi-threaded central processing unit (CPU)-based implementation. Significant performance gains were obtained with GPU processing: the current implementation processed 1000 channels of 250 ms of data in 933 ms, while the new GPU method took only 27 ms, an improvement of nearly 35 times.
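The signal chain described above can be sketched on the CPU with NumPy. Note the assumptions: the paper's GPU implementation uses an autoregressive spectral estimator, for which this sketch substitutes a plain FFT periodogram, and the spatial filter chosen (a common-average reference) is one illustrative option, not the paper's filter:

```python
import numpy as np

# CPU sketch of the two computationally intensive BCI steps:
# (1) a spatial filter applied as a matrix-matrix multiplication, and
# (2) per-channel spectral power, here via a simple periodogram.

rng = np.random.default_rng(0)
fs = 1000                        # sampling rate in Hz (illustrative)
n_channels, n_samples = 8, 250   # a 250 ms buffer, as in the timing example

data = rng.standard_normal((n_channels, n_samples))

# Step 1: spatial filter; common-average reference as one concrete choice.
W = np.eye(n_channels) - np.ones((n_channels, n_channels)) / n_channels
filtered = W @ data              # the matrix-matrix multiply (GPU-friendly core)

# Step 2: per-channel power spectrum (periodogram stand-in for the AR method).
spectrum = np.abs(np.fft.rfft(filtered, axis=1)) ** 2 / (fs * n_samples)
freqs = np.fft.rfftfreq(n_samples, d=1.0 / fs)

# Feature extraction: mean band power in the 8-12 Hz band, per channel.
band = (freqs >= 8) & (freqs <= 12)
features = spectrum[:, band].mean(axis=1)
```

Both steps are dense array operations over all channels at once, which is why they map so well onto a GPU.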

  12. Protein crystal growth in low gravity

    NASA Technical Reports Server (NTRS)

    Feigelson, Robert S.

    1994-01-01

    This research involved (1) using the atomic force microscope (AFM) in a study of the growth of lysozyme crystals and (2) refinement of the design of the Thermonucleator, which separately controls the supersaturation required for the nucleation and for the growth of protein crystals. AFM studies of the (110) tetragonal face confirmed that lysozyme crystals grow by step propagation. There appeared to be very little step pile-up in the growth regimes studied. The step height was measured at approximately 54 Å, equal to the (110) interplanar spacing. The AFM images showed areas of step retardation and the formation of pits; these defects ranged in size from 0.1 to 0.4 μm, and their source was not determined. The redesign of the Thermonucleator produced an instrument based on thermoelectric technology that is both easier to use and more amenable to use in a microgravity environment. The use of thermoelectric technology resulted in a considerable size reduction, which will allow the design of a multi-unit growth apparatus. The performance of the new apparatus was demonstrated to be the same as that of the original design.

  13. Scatterometry-based metrology for SAQP pitch walking using virtual reference

    NASA Astrophysics Data System (ADS)

    Kagalwala, Taher; Vaid, Alok; Mahendrakar, Sridhar; Lenahan, Michael; Fang, Fang; Isbester, Paul; Shifrin, Michael; Etzioni, Yoav; Cepler, Aron; Yellai, Naren; Dasari, Prasad; Bozdog, Cornel

    2016-03-01

    Advanced technology nodes, 10 nm and beyond, employing multi-patterning techniques for pitch reduction pose new process and metrology challenges in maintaining consistent positioning of structural features. The Self-Aligned Quadruple Patterning (SAQP) process is used to create the fins in FinFET devices with pitch values well below optical lithography limits. The SAQP process bears compounding effects from successive Reactive Ion Etch (RIE) and spacer deposition steps. These processes induce a shift in the pitch value of one fin relative to a neighboring fin, known as pitch walking. Pitch walking affects device performance as well as later processes that assume consistent spacing between fins. In SAQP there are three pitch walking parameters of interest, each linked to specific process steps in the flow. These parameters are difficult to discriminate at a specific process step by a single evaluation technique, or even with reference metrology such as Transmission Electron Microscopy (TEM). In this paper we utilize a virtual reference to generate a scatterometry model to measure pitch walk for the SAQP process flow.
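What pitch walking quantifies can be illustrated numerically: the fin-to-fin pitch should be uniform, but process-induced shifts make the pitches alternate. The fin positions, nominal pitch, and modulo-4 grouping below are invented for illustration; the paper itself tracks three specific pitch-walk parameters tied to process steps:

```python
# Illustrative pitch-walk computation from hypothetical fin positions.
nominal_pitch = 30.0  # nm, illustrative target after quadruple patterning

# Hypothetical measured fin center positions (nm) along the wafer surface.
fin_centers = [0.0, 31.5, 60.0, 92.0, 120.0, 151.5, 180.0, 212.0]

# Local pitches between neighboring fins.
pitches = [b - a for a, b in zip(fin_centers, fin_centers[1:])]

# Pitch walk: systematic deviation of each pitch from the nominal value.
pitch_walk = [p - nominal_pitch for p in pitches]

# In SAQP the walk repeats with a short period (here assumed four fins), so
# averaging every fourth pitch isolates the walk tied to each position.
walk_params = [sum(pitch_walk[i::4]) / len(pitch_walk[i::4]) for i in range(4)]
```

In the made-up data above the even-indexed pitches run wide and the odd-indexed ones run narrow, which is exactly the alternating signature pitch-walk metrology has to resolve.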

  14. Rule Driven Multi-Objective Management (RDMOM) - An Alternative Form for Describing and Developing Effective Water Resources Management Strategies

    NASA Astrophysics Data System (ADS)

    Sheer, D. P.

    2011-12-01

    Economics provides a model for describing human behavior as applied to the management of water resources, but that model assumes, among other things, that managers have a way of directly relating immediate actions to long-term economic outcomes. This is rarely the case in water resources problems, where uncertainty significantly affects the effectiveness of management strategies and where the management objectives are very difficult to render commensurable. The difficulty in using economics is even greater in multiparty disputes, where each party places a different relative value on each of the management objectives, and many of the objectives are shared. A three-step approach to collaborative decision making can overcome these difficulties. The first step involves creating science-based performance measures and evaluation tools to estimate the effect of alternative management strategies on each of the non-commensurate objectives. The second step involves developing short-term surrogate operating objectives that implicitly deal with all aspects of the long-term uncertainty. Management that continually "optimizes" the short-term objectives, subject to physical and other constraints that change through time, can be characterized as Rule Driven Multi-Objective Management (RDMOM). RDMOM strategies are then tested in simulation models to provide the basis for evaluating performance measures. Participants in the collaborative process then engage in multiparty discussions that create new alternatives, and "barter" a deal. RDMOM does not assume that managers fully understand the link between current actions and long-term goals. Rather, it assumes that managers operate to achieve short-term surrogate objectives which they believe will achieve an appropriate balance of both short- and long-term incommensurable benefits. A reservoir rule curve is a simple, though often not particularly effective, example of a real-world implementation of RDMOM.
    Water managers find they can easily describe and explain their written and unwritten protocols using RDMOM, and that the use of short-term surrogates is both intellectually appealing and pragmatic. The identification of operating targets as short-term surrogates leads naturally to a critical discussion of long-term objectives, and to the development of performance measures for those objectives. The transparency and practical feasibility of RDMOM-based strategies is often crucial to the success of collaborative efforts. Complex disputes in the Delaware and Susquehanna Basins, the Everglades and the Lower East Coast of South Florida, Southern Nevada, Washington DC, and many other regions have been resolved using RDMOM strategies.
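The reservoir rule curve cited in the abstract as the simplest short-term surrogate can be sketched as a proportional release rule: the decision depends only on current storage relative to a seasonal target, not on any explicit long-term optimization. All targets and coefficients below are invented:

```python
# Toy reservoir rule curve: month -> target storage fraction of capacity.
rule_curve = {"jan": 0.70, "apr": 0.85, "jul": 0.60, "oct": 0.55}

def release(month, storage_frac, base_release=100.0, gain=400.0):
    """Release more water when storage is above the seasonal target,
    less when it is below (a simple proportional rule, in made-up units)."""
    target = rule_curve[month]
    adjustment = gain * (storage_frac - target)
    return max(0.0, base_release + adjustment)
```

The rule is transparent and easy to follow day to day, which is the appeal of a surrogate objective; whether it balances the long-term objectives well is exactly what RDMOM tests in simulation.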

  15. Applying Psychology in Local Authority Emergency Planning Processes

    ERIC Educational Resources Information Center

    Posada, Susan E.

    2006-01-01

    This article describes the work of two EPs involved in a multi-agency project to produce Local Authority (LA) guidelines on psycho/social support following critical incidents and disasters. EPs were involved as participant observers during a simulation of setting up and running a LA reception centre for evacuees. A questionnaire was then…

  16. Simulation of materials processing: Fantasy or reality?

    NASA Technical Reports Server (NTRS)

    Jenkins, Thomas J.; Bright, Victor M.

    1994-01-01

    This experiment introduces students to the application of computer-aided design (CAD) and analysis of materials processing in the context of integrated circuit (IC) fabrication. The fabrication of modern IC's is a complex process which consists of several sequential steps. These steps involve the precise control of processing variables such as temperature, humidity, and ambient gas composition. In essence, the particular process employed during the fabrication becomes a 'recipe'. Due to economic and other considerations, CAD is becoming an indispensable part of the development of new recipes for IC fabrication. In particular, this experiment permits the students to explore the CAD of the thermal oxidation of silicon.
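The process model underlying thermal-oxidation CAD of this kind is classically the Deal-Grove relation, x² + Ax = B(t + τ), solved for oxide thickness x at time t. The sketch below does that; the rate constants are illustrative textbook-scale values for wet oxidation near 1100 °C, not calibrated recipe data:

```python
import math

# Deal-Grove model for thermal oxidation of silicon: x**2 + A*x = B*(t + tau).
def oxide_thickness(t_hours, A=0.226, B=0.51, tau=0.0):
    """Oxide thickness in micrometers after t_hours of oxidation.

    A (um) and B (um^2/h) set the linear and parabolic growth regimes;
    tau shifts time to account for any initial oxide already present.
    """
    # Positive root of the quadratic in x.
    return (-A + math.sqrt(A * A + 4.0 * B * (t_hours + tau))) / 2.0
```

For short times growth is roughly linear (reaction-limited, rate B/A); for long times it is parabolic (diffusion-limited, rate B), which is why recipe development needs exactly the kind of temperature- and ambient-dependent parameters the experiment has students explore.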

  17. Multi-model groundwater-management optimization: reconciling disparate conceptual models

    NASA Astrophysics Data System (ADS)

    Timani, Bassel; Peralta, Richard

    2015-09-01

    Disagreement among policymakers often involves policy issues and differences between the decision makers' implicit utility functions. Significant disagreement can also exist concerning conceptual models of the physical system. Disagreement on the validity of a single simulation model delays discussion of policy issues and prevents the adoption of consensus management strategies. For such a contentious situation, the proposed multi-conceptual model optimization (MCMO) can help stakeholders reach a compromise strategy. MCMO computes mathematically optimal strategies that simultaneously satisfy analogous constraints and bounds in multiple numerical models that differ in boundary conditions, hydrogeologic stratigraphy, and discretization. Shadow prices and trade-offs guide the process of refining the first MCMO-developed multi-model strategy into a realistic compromise management strategy. By employing automated cycling, MCMO is practical for linear and nonlinear aquifer systems. In this reconnaissance study, MCMO is applied to the multilayer Cache Valley (Utah and Idaho, USA) river-aquifer system using two simulation models with analogous background conditions but different vertical discretization and boundary conditions. The objective is to maximize additional safe pumping (beyond current pumping), subject to constraints on groundwater head and on seepage from the aquifer to surface waters. The MCMO application reveals that, in order to protect the local ecosystem, increased groundwater pumping can satisfy only 40% of the projected increase in water demand. To explore the possibility of increasing that pumping while protecting the ecosystem, MCMO clearly identifies localities requiring additional field data. MCMO is applicable to areas and optimization problems other than those used here; the steps to prepare comparable sub-models for MCMO use are area-dependent.
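The core MCMO idea, that a strategy counts as feasible only if it satisfies the constraints in every conceptual model simultaneously, can be sketched with two toy linear response functions standing in for the numerical simulators. All numbers and function names are invented:

```python
# Toy MCMO: a pumping strategy must respect the head constraint in *all*
# conceptual models at once, so the most pessimistic model binds.

def head_drop_model_1(pumping):
    """Model 1's predicted head decline (m) for added pumping (ML/day)."""
    return 0.05 * pumping

def head_drop_model_2(pumping):
    """Model 2 has different boundary conditions, hence a steeper response."""
    return 0.08 * pumping

MAX_HEAD_DROP = 2.0  # m, the shared ecological constraint in both models

def max_safe_pumping(models, limit, step=0.1):
    """Largest added pumping feasible in every model (simple line search)."""
    pumping = 0.0
    while all(m(pumping + step) <= limit for m in models):
        pumping += step
    return round(pumping, 10)

best = max_safe_pumping([head_drop_model_1, head_drop_model_2], MAX_HEAD_DROP)
```

Here model 2 is the binding one, so the compromise strategy is more conservative than either camp's single-model optimum would be; the real method adds shadow prices and automated cycling on top of this feasibility logic.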

  18. Conducting a paediatric multi-centre RCT with an industry partner: challenges and lessons learned.

    PubMed

    Maskell, Jessica; Newcombe, Peter; Martin, Graham; Kimble, Roy

    2012-11-01

    There are many benefits of multi-centred research including large sample sizes, statistical power, timely recruitment and generalisability of results. However, there are numerous considerations when planning and implementing a multi-centred study. This article reviews the challenges and successes of planning and implementing a multi-centred prospective randomised control trial involving an industry partner. The research investigated the impact on psychosocial functioning of a cosmetic camouflage product for children and adolescents with burn scarring. Multi-centred studies commonly have many stakeholders. Within this study, six Australian and New Zealand paediatric burn units as well as an industry partner were involved. The inclusion of an industry partner added complexities as they brought different priorities and expectations to the research. Further, multifaceted ethical and institutional approval processes needed to be negotiated. The challenges, successes, lessons learned and recommendations from this study regarding Australian and New Zealand ethics and research governance approval processes, collaboration with industry partners and the management of differing expectations will be outlined. Recommendations for future multi-centred research with industry partners include provision of regular written reports for the industry partner; continual monitoring and prompt resolution of concerns; basic research practices education for industry partners; minimisation of industry partner contact with participants; clear roles and responsibilities of all stakeholders and utilisation of single ethical review if available. © 2012 The Authors. Journal of Paediatrics and Child Health © 2012 Paediatrics and Child Health Division (Royal Australasian College of Physicians).

  19. Pozzolanic filtration/solidification of radionuclides in nuclear reactor cooling water

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Englehardt, J.D.; Peng, C.

    1995-12-31

    Laboratory studies investigating the feasibility of one- and two-step processes for precipitating/coprecipitating radionuclides from nuclear reactor cooling water, filtering with pozzolanic filter aid, and solidifying are reported in this paper. In the one-step process, a ferrocyanide salt and excess lime are added ahead of the filter, and the resulting filter cake solidifies by a pozzolanic reaction. The two-step process involves addition of solidifying agents subsequent to filtration. It was found that high-surface-area diatomaceous synthetic calcium silicate powders, sold commercially as functional fillers and carriers, adsorb nickel isotopes from solution at neutral and slightly basic pH. Addition of the silicates to cooling water allowed removal of the tested metal isotopes (nickel, iron, manganese, cobalt, and cesium) simultaneously at neutral to slightly basic pH. The lime-to-diatomite ratio was the compositional characteristic most influential on final strength, with higher lime ratios giving higher strength. Diatomaceous earth filter aids manufactured without sodium fluxes exhibited higher pozzolanic activity. Pozzolanic filter cake solidified with sodium silicate and a ratio of 0.45 parts lime to 1 part diatomite had compressive strength ranging from 470 to 595 psi at a 90% confidence level. Leachability indices of all tested metals in the solidified waste were acceptable. In light of the typical requirement of removing iron and the desirability of control over process pH, a two-step process involving addition of Portland cement to the filter cake may be most generally applicable.

  20. Next generation calmodulin affinity purification: Clickable calmodulin facilitates improved protein purification

    PubMed Central

    Kinzer-Ursem, Tamara L.

    2018-01-01

    As the proteomics field continues to expand, scientists are looking to integrate cross-disciplinary tools for studying protein structure, function, and interactions. Protein purification remains a key tool for many characterization studies. Calmodulin (CaM) is a calcium-binding messenger protein with over a hundred downstream binding partners, involved in a host of physiological processes from learning and memory to immune and cardiac function. To facilitate biophysical studies of calmodulin, researchers have designed a site-specific labeling process for use in bioconjugation applications while maintaining high levels of protein activity. Here, we present a platform for selective conjugation of calmodulin directly from clarified cell lysates under bioorthogonal reaction conditions. Using a chemoenzymatically modified calmodulin, we employ popular click chemistry reactions to conjugate calmodulin to Sepharose resin, thereby streamlining what was previously a multi-step purification and conjugation process. We show that this "next-generation" calmodulin-Sepharose resin is not only easy to produce, but also purifies more calmodulin-binding proteins per volume of resin than traditional calmodulin-Sepharose resins. We expect these methods to be translatable to other proteins of interest and to other conjugation applications, such as surface-based assays for characterizing protein-protein interaction dynamics. PMID:29864125
