Sample records for complex multi-step process

  1. A Systems Approach towards an Intelligent and Self-Controlling Platform for Integrated Continuous Reaction Sequences

    PubMed Central

    Ingham, Richard J; Battilocchio, Claudio; Fitzpatrick, Daniel E; Sliwinski, Eric; Hawkins, Joel M; Ley, Steven V

    2015-01-01

    Performing reactions in flow can offer major advantages over batch methods. However, laboratory flow chemistry processes are currently often limited to single steps or short sequences due to the complexity involved with operating a multi-step process. Using new modular components for downstream processing, coupled with control technologies, more advanced multi-step flow sequences can be realized. These tools are applied to the synthesis of 2-aminoadamantane-2-carboxylic acid. A system comprising three chemistry steps and three workup steps was developed, having sufficient autonomy and self-regulation to be managed by a single operator. PMID:25377747
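The self-regulation described above rests on closed-loop control of process variables. As a minimal sketch (not the paper's actual control system), a proportional controller can nudge a monitored quantity such as a pump's flow rate toward its setpoint; the units and gain below are hypothetical.

```python
# Hypothetical proportional control loop for one process variable,
# illustrating the kind of self-regulation a single operator could supervise.

def p_controller(setpoint, reading, gain=0.5):
    """Return a correction proportional to the current error."""
    return gain * (setpoint - reading)

def regulate(setpoint, reading, steps=50):
    """Apply the correction repeatedly, as a monitoring loop would."""
    for _ in range(steps):
        reading += p_controller(setpoint, reading)
    return reading

final = regulate(setpoint=2.0, reading=0.0)  # e.g. mL/min, hypothetical units
```

With each pass the error shrinks by the gain factor, so the reading converges to the setpoint; a real platform would add sensor noise handling and safety interlocks on top of this skeleton.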

  2. Lignocellulose hydrolysis by multienzyme complexes

    USDA-ARS's Scientific Manuscript database

    Lignocellulosic biomass is the most abundant renewable resource on the planet. Converting this material into a usable fuel is a multi-step process, the rate-limiting step being enzymatic hydrolysis of organic polymers into monomeric sugars. While the substrate can be complex and require a multitud...

  3. Multi-enzyme logic network architectures for assessing injuries: digital processing of biomarkers.

    PubMed

    Halámek, Jan; Bocharova, Vera; Chinnapareddy, Soujanya; Windmiller, Joshua Ray; Strack, Guinevere; Chuang, Min-Chieh; Zhou, Jian; Santhosh, Padmanabhan; Ramirez, Gabriela V; Arugula, Mary A; Wang, Joseph; Katz, Evgeny

    2010-12-01

    A multi-enzyme biocatalytic cascade that simultaneously processes five biomarkers characteristic of traumatic brain injury (TBI) and soft tissue injury (STI) was developed. The system operates as a digital biosensor based on the concerted function of eight Boolean AND logic gates, yielding a decision about the physiological condition from logic analysis of complex biomarker patterns. The system represents the first example of a multi-step/multi-enzyme biosensor with built-in logic for the analysis of complex combinations of biochemical inputs. The approach builds on recent advances in enzyme-based biocomputing systems, and the present paper demonstrates the potential applicability of biocomputing to developing novel digital biosensor networks.
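The gate logic described above can be sketched in a few lines: analog biomarker levels are digitized against thresholds and the resulting bits are combined through Boolean AND gates. The marker names, readings, and thresholds here are illustrative placeholders, not the paper's calibrated values.

```python
# Hedged sketch of the digital-biosensor logic; values are invented.

def digitize(concentration, threshold):
    """Map an analog biomarker level to a logic 1/0."""
    return concentration > threshold

def and_gate(*inputs):
    """A Boolean AND gate over digitized inputs."""
    return all(inputs)

readings = {"marker_a": 80.0, "marker_b": 3.5}    # hypothetical assay outputs
thresholds = {"marker_a": 50.0, "marker_b": 2.0}  # hypothetical cut-offs

injury_flag = and_gate(*(digitize(readings[m], thresholds[m]) for m in readings))
```

In the actual system the "gates" are enzymatic reactions whose product concentrations encode the logic levels; the digital abstraction above only captures the decision structure.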

  4. Fabrication of magnetic bubble memory overlay

    NASA Technical Reports Server (NTRS)

    1973-01-01

    Self-contained magnetic bubble memory overlay is fabricated by process that employs epitaxial deposition to form multi-layered complex of magnetically active components on single chip. Overlay fabrication comprises three metal deposition steps followed by subtractive etch.

  5. 48 CFR 15.202 - Advisory multi-step process.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 15.202 Advisory multi-step process. (a) The agency may publish a presolicitation notice (see 5.204... participate in the acquisition. This process should not be used for multi-step acquisitions where it would...

  6. A unified inversion scheme to process multifrequency measurements of various dispersive electromagnetic properties

    NASA Astrophysics Data System (ADS)

    Han, Y.; Misra, S.

    2018-04-01

    Multi-frequency measurement of a dispersive electromagnetic (EM) property, such as electrical conductivity, dielectric permittivity, or magnetic permeability, is commonly analyzed for purposes of material characterization. Such an analysis requires inversion of the multi-frequency measurement based on a specific relaxation model, such as the Cole-Cole model or Pelton's model. We develop a unified inversion scheme that can be coupled to various types of relaxation models to independently process multi-frequency measurements of varied EM properties for improved EM-based geomaterial characterization. The proposed inversion scheme is first tested on a few synthetic cases, in which different relaxation models are coupled into the inversion scheme, and then applied to multi-frequency complex conductivity, complex resistivity, complex permittivity, and complex impedance measurements. The method estimates up to seven relaxation-model parameters, exhibiting convergence and accuracy for random initializations of the relaxation-model parameters within up to three orders of magnitude of the true parameter values. The proposed inversion method implements a bounded Levenberg algorithm with tuned initial values of the damping parameter and its iterative adjustment factor; these are fixed in all the cases shown in this paper, irrespective of the type of measured EM property and the type of relaxation model. Notably, jump-out and jump-back-in steps are implemented as automated methods in the inversion scheme to prevent the inversion from getting trapped around local minima and to honor the physical bounds of model parameters. The proposed inversion scheme can easily be used to process various types of EM measurements without major changes to the scheme.
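To make the bounded, damping-adjusted Levenberg idea concrete, here is a deliberately simplified sketch: a single-relaxation Cole-Cole forward model with only the relaxation time fitted (the other parameters assumed known), a finite-difference Jacobian, bound clamping, and an accept/reject rule that adjusts the damping factor. This is an illustration of the algorithm class, not the authors' seven-parameter implementation.

```python
def cole_cole(omega, sigma_inf, d_sigma, tau, c):
    """Single-relaxation Cole-Cole model for a complex EM property."""
    return sigma_inf + d_sigma / (1.0 + (1j * omega * tau) ** c)

def fit_tau(omegas, data, tau0, lo=1e-6, hi=1e2, lam=1e-3, nu=10.0, iters=60):
    """Bounded Levenberg fit of the relaxation time tau only; the remaining
    Cole-Cole parameters are assumed known (a simplification for this sketch)."""
    def forward(t):
        return [cole_cole(w, 1.0, 0.5, t, 0.7) for w in omegas]

    def cost(t):
        return sum(abs(m - d) ** 2 for m, d in zip(forward(t), data))

    tau, err = tau0, cost(tau0)
    for _ in range(iters):
        h = tau * 1e-6                                    # finite-difference step
        r = [m - d for m, d in zip(forward(tau), data)]
        j = [(mp - m) / h for mp, m in zip(forward(tau + h), forward(tau))]
        grad = sum((jk.conjugate() * rk).real for jk, rk in zip(j, r))
        hess = sum(abs(jk) ** 2 for jk in j)
        trial = min(max(tau - grad / (hess + lam), lo), hi)  # honor physical bounds
        trial_err = cost(trial)
        if trial_err < err:        # accept the step and relax the damping
            tau, err, lam = trial, trial_err, lam / nu
        else:                      # reject the step and stiffen the damping
            lam *= nu
    return tau

# Hypothetical synthetic measurement with true tau = 0.01 s
omegas = [10.0 ** k for k in range(-1, 5)]
data = [cole_cole(w, 1.0, 0.5, 0.01, 0.7) for w in omegas]
tau_est = fit_tau(omegas, data, tau0=0.1)
```

The accept/reject rule gives the monotone cost decrease that makes Levenberg-type schemes robust to poor initializations; the paper's jump-out/jump-back-in steps address the multi-parameter local-minimum problem that this one-parameter toy does not exhibit.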

  7. Optimal design of an alignment-free two-DOF rehabilitation robot for the shoulder complex.

    PubMed

    Galinski, Daniel; Sapin, Julien; Dehez, Bruno

    2013-06-01

    This paper presents the optimal design of an alignment-free exoskeleton for the rehabilitation of the shoulder complex. The robot structure consists of two actuated joints and is linked to the arm through passive degrees of freedom (DOFs) to drive the flexion-extension and abduction-adduction movements of the upper arm. The optimal design of this structure is performed in two steps. The first step is a multi-objective optimization process aiming to find the best parameters characterizing the robot and its position relative to the patient. The second step is a comparison process aiming to select the best solution from the optimization results on the basis of several criteria related to practical considerations. The optimal design process leads to a solution that outperforms an existing solution in aspects such as kinematics and ergonomics while being simpler.
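The two-step design scheme above (multi-objective optimization, then selection among the surviving candidates) hinges on Pareto dominance. A minimal sketch, with invented objective values and both objectives minimized:

```python
# Pareto-front filter for candidate designs; the numbers are illustrative,
# not results from the paper's optimization.

def dominates(a, b):
    """True if design a is at least as good in every objective and strictly
    better in at least one (both objectives minimized)."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_front(designs):
    """Keep only the non-dominated designs."""
    return [p for p in designs if not any(dominates(q, p) for q in designs)]

designs = [(1.0, 5.0), (2.0, 2.0), (3.0, 1.0), (4.0, 4.0)]
front = pareto_front(designs)
```

A subsequent comparison step, like the paper's, would then rank the members of `front` by practical criteria (ergonomics, simplicity) that were not part of the optimization objectives.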

  8. Large-scale automated image analysis for computational profiling of brain tissue surrounding implanted neuroprosthetic devices using Python.

    PubMed

    Rey-Villamizar, Nicolas; Somasundar, Vinay; Megjhani, Murad; Xu, Yan; Lu, Yanbin; Padmanabhan, Raghav; Trett, Kristen; Shain, William; Roysam, Badri

    2014-01-01

    In this article, we describe the use of Python for large-scale automated server-based bio-image analysis in FARSIGHT, a free and open-source toolkit of image analysis methods for quantitative studies of complex and dynamic tissue microenvironments imaged by modern optical microscopes, including confocal, multi-spectral, multi-photon, and time-lapse systems. The core FARSIGHT modules for image segmentation, feature extraction, tracking, and machine learning are written in C++, leveraging widely used libraries including ITK, VTK, Boost, and Qt. For solving complex image analysis tasks, these modules must be combined into scripts using Python. As a concrete example, we consider the problem of analyzing 3-D multi-spectral images of brain tissue surrounding implanted neuroprosthetic devices, acquired using high-throughput multi-spectral spinning disk step-and-repeat confocal microscopy. The resulting images typically contain 5 fluorescent channels. Each channel consists of 6000 × 10,000 × 500 voxels with 16 bits/voxel, implying image sizes exceeding 250 GB. These images must be mosaicked, pre-processed to overcome imaging artifacts, and segmented to enable cellular-scale feature extraction. The features are used to identify cell types, and perform large-scale analysis for identifying spatial distributions of specific cell types relative to the device. Python was used to build a server-based script (Dell 910 PowerEdge servers with 4 sockets/server with 10 cores each, 2 threads per core and 1TB of RAM running on Red Hat Enterprise Linux linked to a RAID 5 SAN) capable of routinely handling image datasets at this scale and performing all these processing steps in a collaborative multi-user multi-platform environment. Our Python script enables efficient data storage and movement between computers and storage servers, logs all the processing steps, and performs full multi-threaded execution of all codes, including open and closed-source third party libraries.

  9. Parallel workflow tools to facilitate human brain MRI post-processing

    PubMed Central

    Cui, Zaixu; Zhao, Chenxi; Gong, Gaolang

    2015-01-01

    Multi-modal magnetic resonance imaging (MRI) techniques are widely applied in human brain studies. To obtain specific brain measures of interest from MRI datasets, a number of complex image post-processing steps are typically required. Parallel workflow tools have recently been developed, concatenating individual processing steps and enabling fully automated processing of raw MRI data to obtain the final results. These workflow tools are also designed to make optimal use of available computational resources and to support the parallel processing of different subjects or of independent processing steps for a single subject. Automated, parallel MRI post-processing tools can greatly facilitate relevant brain investigations and are being increasingly applied. In this review, we briefly summarize these parallel workflow tools and discuss relevant issues. PMID:26029043
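The core pattern described above — subjects are independent, so they can be processed concurrently while each subject's own steps stay sequential — can be sketched with the standard library. The step functions below are placeholders, not any real MRI toolchain.

```python
# Minimal sketch of a parallel per-subject workflow; step names are invented.
from concurrent.futures import ThreadPoolExecutor

def skull_strip(img):
    return img + ":stripped"      # placeholder processing step

def normalize(img):
    return img + ":normalized"    # placeholder processing step

def process_subject(subject_id):
    """Run this subject's steps in order."""
    img = subject_id
    for step in (skull_strip, normalize):
        img = step(img)
    return img

subjects = ["sub-01", "sub-02", "sub-03"]
with ThreadPoolExecutor(max_workers=3) as pool:
    results = list(pool.map(process_subject, subjects))
```

`Executor.map` preserves input order, so results line up with the subject list; real workflow engines add to this skeleton dependency tracking between steps and resource-aware scheduling.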

  10. Multi-compartmental modeling of SORLA’s influence on amyloidogenic processing in Alzheimer’s disease

    PubMed Central

    2012-01-01

    Background Proteolytic breakdown of the amyloid precursor protein (APP) by secretases is a complex cellular process that results in formation of neurotoxic Aβ peptides, causative of neurodegeneration in Alzheimer’s disease (AD). Processing involves monomeric and dimeric forms of APP that traffic through distinct cellular compartments where the various secretases reside. Amyloidogenic processing is also influenced by modifiers such as sorting receptor-related protein (SORLA), an inhibitor of APP breakdown and major AD risk factor. Results In this study, we developed a multi-compartment model to simulate the complexity of APP processing in neurons and to accurately describe the effects of SORLA on these processes. Based on dose–response data, our study concludes that SORLA specifically impairs processing of APP dimers, the preferred secretase substrate. In addition, SORLA alters the dynamic behavior of β-secretase, the enzyme responsible for the initial step in the amyloidogenic processing cascade. Conclusions Our multi-compartment model represents a major conceptual advance over single-compartment models previously used to simulate APP processing; and it identified APP dimers and β-secretase as the two distinct targets of the inhibitory action of SORLA in Alzheimer’s disease. PMID:22727043

  11. Mechanical and Metallurgical Evolution of Stainless Steel 321 in a Multi-step Forming Process

    NASA Astrophysics Data System (ADS)

    Anderson, M.; Bridier, F.; Gholipour, J.; Jahazi, M.; Wanjara, P.; Bocher, P.; Savoie, J.

    2016-04-01

    This paper examines the metallurgical evolution of AISI Stainless Steel 321 (SS 321) during multi-step forming, a process that involves cycles of deformation with intermediate heat treatment steps. The multi-step forming process was simulated by implementing interrupted uniaxial tensile testing experiments. Evolution of the mechanical properties as well as the microstructural features, such as twins and textures of the austenite and martensite phases, was studied as a function of the multi-step forming process. The characteristics of the Strain-Induced Martensite (SIM) were also documented for each deformation step and intermediate stress relief heat treatment. The results indicated that the intermediate heat treatments considerably increased the formability of SS 321. Texture analysis showed that the effect of the intermediate heat treatment on the austenite was minor and led to partial recrystallization, while deformation was observed to reinforce the crystallographic texture of austenite. For the SIM, an equation of the Olson-Cohen type was identified to analytically predict its formation during the multi-step forming process. The generated SIM was textured, and its texture weakened with increasing deformation.
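The Olson-Cohen relation referred to above is commonly written in the following standard form (the coefficients fitted for SS 321 in the paper are not reproduced here):

```latex
f_{\alpha'} = 1 - \exp\!\left\{-\beta \left[1 - \exp(-\alpha\,\varepsilon)\right]^{n}\right\}
```

where \(f_{\alpha'}\) is the strain-induced martensite volume fraction, \(\varepsilon\) the plastic strain, \(\alpha\) and \(\beta\) temperature-dependent fitting parameters governing shear-band formation and martensite nucleation, and \(n\) a fixed exponent.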

  12. Considerations In The Design And Specifications Of An Automatic Inspection System

    NASA Astrophysics Data System (ADS)

    Lee, David T.

    1980-05-01

    Considerable activity has centered on the automation of manufacturing quality control and inspection functions. Several reasons can be cited for this development. The continuous pressure of direct and indirect labor cost increases is only one of the obvious motivations. With the drive for electronics miniaturization come more and more complex processes where control parameters are critical and the yield is highly susceptible to inadequate process monitoring and inspection. With multi-step, multi-layer processes for substrate fabrication, process defects that are not detected and corrected at certain critical points may render the entire subassembly useless. As a process becomes more complex, the time required to test the product increases significantly within the total build cycle. The urgency to reduce test time brings more pressure to improve in-process control and inspection. The advances and improvements of components, assemblies, and systems such as micro-processors, micro-computers, programmable controllers, and other intelligent devices have made the automation of quality control much more cost effective and justifiable.

  13. Surface Modified Particles By Multi-Step Addition And Process For The Preparation Thereof

    DOEpatents

    Cook, Ronald Lee; Elliott, Brian John; Luebben, Silvia DeVito; Myers, Andrew William; Smith, Bryan Matthew

    2006-01-17

    The present invention relates to a new class of surface modified particles and to a multi-step surface modification process for the preparation of the same. The multi-step surface functionalization process involves two or more reactions to produce particles that are compatible with various host systems and/or to provide the particles with particular chemical reactivities. The initial step comprises the attachment of a small organic compound to the surface of the inorganic particle. The subsequent steps attach additional compounds to the previously attached organic compounds through organic linking groups.

  14. Analysis, design, fabrication, and performance of three-dimensional braided composites

    NASA Astrophysics Data System (ADS)

    Kostar, Timothy D.

    1998-11-01

    Cartesian 3-D (track and column) braiding as a method of composite preforming has been investigated. A complete analysis of the process was conducted to understand the limitations and potentials of the process. Knowledge of the process was enhanced through development of a computer simulation, and it was discovered that individual control of each track and column and multiple-step braid cycles greatly increases possible braid architectures. Derived geometric constraints coupled with the fundamental principles of Cartesian braiding resulted in an algorithm to optimize preform geometry in relation to processing parameters. The design of complex and unusual 3-D braids was investigated in three parts: grouping of yarns to form hybrid composites via an iterative simulation; design of composite cross-sectional shape through implementation of the Universal Method; and a computer algorithm developed to determine the braid plan based on specified cross-sectional shape. Several 3-D braids, which are the result of variations or extensions to Cartesian braiding, are presented. An automated four-step braiding machine with axial yarn insertion has been constructed and used to fabricate two-step, double two-step, four-step, and four-step with axial and transverse yarn insertion braids. A working prototype of a multi-step braiding machine was used to fabricate four-step braids with surrogate material insertion, unique hybrid structures from multiple track and column displacement and multi-step cycles, and complex-shaped structures with constant or varying cross-sections. Braid materials include colored polyester yarn to study the yarn grouping phenomena, Kevlar, glass, and graphite for structural reinforcement, and polystyrene, silicone rubber, and fasteners for surrogate material insertion. A verification study for predicted yarn orientation and volume fraction was conducted, and a topological model of 3-D braids was developed. 
The solid model utilizes architectural parameters, generated from the process simulation, to determine the composite elastic properties. Methods of preform consolidation are investigated and the results documented. The extent of yarn deformation (packing) resulting from preform consolidation was investigated through cross-sectional micrographs. The fiber volume fraction of select hybrid composites was measured and representative unit cells are suggested. Finally, a comparison study of the elastic performance of Kevlar/epoxy and carbon/Kevlar hybrid composites was conducted.

  15. The Use of Novel Camtasia Videos to Improve Performance of At-Risk Students in Undergraduate Physiology Courses

    ERIC Educational Resources Information Center

    Miller, Cynthia J.

    2014-01-01

    Students in undergraduate physiology courses often have difficulty understanding complex, multi-step processes, and these concepts consume a large portion of class time. For this pilot study, it was hypothesized that online multimedia resources may improve student performance in a high-risk population and reduce the in-class workload. A narrated…

  16. Multi-objective optimization of process parameters of multi-step shaft formed with cross wedge rolling based on orthogonal test

    NASA Astrophysics Data System (ADS)

    Han, S. T.; Shu, X. D.; Shchukin, V.; Kozhevnikova, G.

    2018-06-01

    In order to achieve reasonable process parameters for forming a multi-step shaft by cross wedge rolling, the rolling-forming process of the multi-step shaft was studied with the DEFORM-3D finite element software. An interactive orthogonal experiment was used to study the effect of eight parameters, the first section shrinkage rate φ1, the first forming angle α1, the first spreading angle β1, the first spreading length L1, the second section shrinkage rate φ2, the second forming angle α2, the second spreading angle β2, and the second spreading length L2, on the quality of the shaft end and the microstructure uniformity. Using the fuzzy mathematics comprehensive evaluation method and extreme difference analysis, the order of influence of the process parameters on the quality of the multi-step shaft is obtained: β2 > φ2 > L1 > α1 > β1 > φ1 > α2 > L2. The results of the study can provide guidance for obtaining multi-step shafts with high mechanical properties and achieving near-net forming without a stub bar in cross wedge rolling.
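The extreme-difference (range) analysis used above scores each factor by the spread of mean responses across its levels: the larger the range, the stronger the factor's influence. A toy sketch with a made-up two-factor, two-level table (not the paper's orthogonal array or data):

```python
# Hypothetical orthogonal-test runs: (level of factor A, level of factor B, response)
runs = [
    (1, 1, 10.0),
    (1, 2, 12.0),
    (2, 1, 15.0),
    (2, 2, 13.0),
]

def level_means(runs, factor):
    """Mean response at each level of the given factor column."""
    acc = {}
    for row in runs:
        acc.setdefault(row[factor], []).append(row[-1])
    return {lvl: sum(v) / len(v) for lvl, v in acc.items()}

def extreme_difference(runs, factor):
    """Range R = max level mean - min level mean; larger R means a stronger factor."""
    means = level_means(runs, factor).values()
    return max(means) - min(means)

r_a = extreme_difference(runs, 0)  # factor A
r_b = extreme_difference(runs, 1)  # factor B
```

Ranking the factors by R reproduces the kind of influence ordering quoted in the abstract; the fuzzy comprehensive evaluation step combines the two quality indicators into the single response before this ranking is computed.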

  17. Surface Modified Particles By Multi-Step Michael-Type Addition And Process For The Preparation Thereof

    DOEpatents

    Cook, Ronald Lee; Elliott, Brian John; Luebben, Silvia DeVito; Myers, Andrew William; Smith, Bryan Matthew

    2005-05-03

    A new class of surface modified particles and a multi-step Michael-type addition surface modification process for the preparation of the same is provided. The multi-step Michael-type addition surface modification process involves two or more reactions to compatibilize particles with various host systems and/or to provide the particles with particular chemical reactivities. The initial step comprises the attachment of a small organic compound to the surface of the inorganic particle. The subsequent steps attach additional compounds to the previously attached organic compounds through reactive organic linking groups. Specifically, these reactive groups are activated carbon-carbon pi bonds and carbon and non-carbon nucleophiles that react via Michael or Michael-type additions.

  18. A framework for conducting mechanistic based reliability assessments of components operating in complex systems

    NASA Astrophysics Data System (ADS)

    Wallace, Jon Michael

    2003-10-01

    Reliability prediction of components operating in complex systems has historically been conducted in a statistically isolated manner. Current physics-based, i.e. mechanistic, component reliability approaches focus more on component-specific attributes and mathematical algorithms and not enough on the influence of the system. The result is that significant error can be introduced into the component reliability assessment process. The objective of this study is the development of a framework that infuses the needs and influence of the system into the process of conducting mechanistic-based component reliability assessments. The formulated framework consists of six primary steps. The first three steps, identification, decomposition, and synthesis, are primarily qualitative in nature and employ system reliability and safety engineering principles to construct an appropriate starting point for the component reliability assessment. The following two steps are the most unique. They involve a step to efficiently characterize and quantify the system-driven local parameter space and a subsequent step using this information to guide the reduction of the component parameter space. The local statistical space quantification step is accomplished using two proposed multivariate probability models: Multi-Response First Order Second Moment and Taylor-Based Inverse Transformation. Where existing joint probability models require preliminary distribution and correlation information of the responses, these models combine statistical information of the input parameters with an efficient sampling of the response analyses to produce the multi-response joint probability distribution. Parameter space reduction is accomplished using Approximate Canonical Correlation Analysis (ACCA) employed as a multi-response screening technique. 
The novelty of this approach is that each individual local parameter and even subsets of parameters representing entire contributing analyses can now be rank ordered with respect to their contribution to not just one response, but the entire vector of component responses simultaneously. The final step of the framework is the actual probabilistic assessment of the component. Although the same multivariate probability tools employed in the characterization step can be used for the component probability assessment, variations of this final step are given to allow for the utilization of existing probabilistic methods such as response surface Monte Carlo and Fast Probability Integration. The overall framework developed in this study is implemented to assess the finite-element based reliability prediction of a gas turbine airfoil involving several failure responses. Results of this implementation are compared to results generated using the conventional 'isolated' approach as well as a validation approach conducted through large sample Monte Carlo simulations. The framework resulted in a considerable improvement to the accuracy of the part reliability assessment and an improved understanding of the component failure behavior. Considerable statistical complexity in the form of joint non-normal behavior was found and accounted for using the framework. Future applications of the framework elements are discussed.
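The large-sample Monte Carlo validation mentioned above reduces, in its simplest form, to sampling the component's load and capacity and counting failures. A stress-strength sketch with illustrative distributions (not the airfoil's actual response models):

```python
import random

def failure_probability(n=100_000, seed=1):
    """Monte Carlo estimate of P(stress > strength) for hypothetical
    normal stress and strength distributions."""
    random.seed(seed)
    failures = 0
    for _ in range(n):
        stress = random.gauss(50.0, 5.0)    # hypothetical load response
        strength = random.gauss(70.0, 5.0)  # hypothetical capacity
        failures += stress > strength
    return failures / n

pof = failure_probability()
```

The framework in the study replaces these independent one-dimensional draws with system-driven joint distributions over the component's parameter space, which is exactly where the "isolated" approach loses accuracy.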

  19. One-pot growth of two-dimensional lateral heterostructures via sequential edge-epitaxy

    NASA Astrophysics Data System (ADS)

    Sahoo, Prasana K.; Memaran, Shahriar; Xin, Yan; Balicas, Luis; Gutiérrez, Humberto R.

    2018-01-01

    Two-dimensional heterojunctions of transition-metal dichalcogenides have great potential for application in low-power, high-performance and flexible electro-optical devices, such as tunnelling transistors, light-emitting diodes, photodetectors and photovoltaic cells. Although complex heterostructures have been fabricated via the van der Waals stacking of different two-dimensional materials, the in situ fabrication of high-quality lateral heterostructures with multiple junctions remains a challenge. Transition-metal-dichalcogenide lateral heterostructures have been synthesized via single-step, two-step or multi-step growth processes. However, these methods lack the flexibility to control, in situ, the growth of individual domains. In situ synthesis of multi-junction lateral heterostructures does not require multiple exchanges of sources or reactors, a limitation in previous approaches as it exposes the edges to ambient contamination, compromises the homogeneity of domain size in periodic structures, and results in long processing times. Here we report a one-pot synthetic approach, using a single heterogeneous solid source, for the continuous fabrication of lateral multi-junction heterostructures consisting of monolayers of transition-metal dichalcogenides. The sequential formation of heterojunctions is achieved solely by changing the composition of the reactive gas environment in the presence of water vapour. This enables selective control of the water-induced oxidation and volatilization of each transition-metal precursor, as well as its nucleation on the substrate, leading to sequential edge-epitaxy of distinct transition-metal dichalcogenides. Photoluminescence maps confirm the sequential spatial modulation of the bandgap, and atomic-resolution images reveal defect-free lateral connectivity between the different transition-metal-dichalcogenide domains within a single crystal structure. 
Electrical transport measurements revealed diode-like responses across the junctions. Our new approach offers greater flexibility and control than previous methods for continuous growth of transition-metal-dichalcogenide-based multi-junction lateral heterostructures. These findings could be extended to other families of two-dimensional materials, and establish a foundation for the development of complex and atomically thin in-plane superlattices, devices and integrated circuits.

  20. Monte Carlo Planning Method Estimates Planning Horizons during Interactive Social Exchange.

    PubMed

    Hula, Andreas; Montague, P Read; Dayan, Peter

    2015-06-01

    Reciprocating interactions represent a central feature of all human exchanges. They have been the target of various recent experiments, with healthy participants and psychiatric populations engaging as dyads in multi-round exchanges such as a repeated trust task. Behaviour in such exchanges involves complexities related to each agent's preference for equity with their partner, beliefs about the partner's appetite for equity, beliefs about the partner's model of their partner, and so on. Agents may also plan different numbers of steps into the future. Providing a computationally precise account of the behaviour is an essential step towards understanding what underlies choices. A natural framework for this is that of an interactive partially observable Markov decision process (IPOMDP). However, the various complexities make IPOMDPs inordinately computationally challenging. Here, we show how to approximate the solution for the multi-round trust task using a variant of the Monte-Carlo tree search algorithm. We demonstrate that the algorithm is efficient and effective, and therefore can be used to invert observations of behavioural choices. We use generated behaviour to elucidate the richness and sophistication of interactive inference.
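The Monte-Carlo planning idea can be illustrated on a drastically simplified two-round trust exchange: the investor chooses how much to invest, the stake is tripled, and a cooperative trustee returns half. The payoffs and trustee policy below are invented for illustration; the paper's IPOMDP model, with nested beliefs and equity preferences, is far richer.

```python
import math
import random

ACTIONS = (0, 1, 2)   # units the investor can invest each round
ROUNDS = 2

def payoff(action):
    """Stake is tripled and half is returned, so the investor nets +0.5 per unit."""
    return 0.5 * action

def rollout(rounds_left):
    """Random playout of the remaining rounds."""
    return sum(payoff(random.choice(ACTIONS)) for _ in range(rounds_left))

def plan(iters=2000, c=1.0, seed=0):
    """One-ply Monte-Carlo planning with UCB1 selection at the root."""
    random.seed(seed)
    visits = {a: 0 for a in ACTIONS}
    totals = {a: 0.0 for a in ACTIONS}
    for n in range(1, iters + 1):
        # UCB1: unvisited actions first, then mean value plus exploration bonus
        a = max(ACTIONS, key=lambda x: math.inf if visits[x] == 0 else
                totals[x] / visits[x] + c * math.sqrt(math.log(n) / visits[x]))
        value = payoff(a) + rollout(ROUNDS - 1)   # simulate, then random playout
        visits[a] += 1
        totals[a] += value
    return max(ACTIONS, key=lambda x: totals[x] / visits[x])

best = plan()
```

Full MCTS extends this root-level bandit into a tree over interaction histories; the paper's variant additionally searches over belief states about the partner, which is what makes the IPOMDP tractable to approximate.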

  2. Research of flaw image collecting and processing technology based on multi-baseline stereo imaging

    NASA Astrophysics Data System (ADS)

    Yao, Yong; Zhao, Jiguang; Pang, Xiaoyan

    2008-03-01

    Aiming at practical considerations of gun bore flaw image collection, such as accurate optimal design, complex algorithms, and precise technical demands, the design framework of a 3-D image collecting and processing system based on multi-baseline stereo imaging is presented in this paper. The system mainly includes a computer, an electrical control box, a stepping motor, and a CCD camera, and it realizes image collection, stereo matching, 3-D information reconstruction, and post-processing. Theoretical analysis and experimental results show that images collected by this system are precise and that it can efficiently resolve the matching ambiguity produced by uniform or repeated veins. At the same time, the system offers faster measurement speed and higher measurement precision.
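For context, the depth-disparity relation underlying multi-baseline stereo can be sketched in standard pinhole form (textbook symbols, not the paper's notation):

```latex
z = \frac{f\,B_i}{d_i}, \qquad i = 1, \dots, N
```

where \(z\) is depth, \(f\) the focal length, \(B_i\) the \(i\)-th baseline, and \(d_i\) the corresponding disparity. Requiring all \(N\) baselines to agree on a single depth \(z\) is what suppresses the false matches that a single baseline produces on uniform or repeated textures.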

  3. TMSEEG: A MATLAB-Based Graphical User Interface for Processing Electrophysiological Signals during Transcranial Magnetic Stimulation.

    PubMed

    Atluri, Sravya; Frehlich, Matthew; Mei, Ye; Garcia Dominguez, Luis; Rogasch, Nigel C; Wong, Willy; Daskalakis, Zafiris J; Farzan, Faranak

    2016-01-01

    Concurrent recording of electroencephalography (EEG) during transcranial magnetic stimulation (TMS) is an emerging and powerful tool for studying brain health and function. Despite a growing interest in adaptation of TMS-EEG across neuroscience disciplines, its widespread utility is limited by signal processing challenges. These challenges arise due to the nature of TMS and the sensitivity of EEG to artifacts that often mask TMS-evoked potentials (TEPs). With an increase in the complexity of data processing methods and a growing interest in multi-site data integration, analysis of TMS-EEG data requires the development of a standardized method to recover TEPs from various sources of artifacts. This article introduces TMSEEG, an open-source MATLAB application comprised of multiple algorithms organized to facilitate a step-by-step procedure for TMS-EEG signal processing. Using a modular design and interactive graphical user interface (GUI), this toolbox aims to streamline TMS-EEG signal processing for both novice and experienced users. Specifically, TMSEEG provides: (i) targeted removal of TMS-induced and general EEG artifacts; (ii) a step-by-step modular workflow with flexibility to modify existing algorithms and add customized algorithms; (iii) a comprehensive display and quantification of artifacts; (iv) quality control checkpoints with visual feedback of TEPs throughout the data processing workflow; and (v) capability to label and store a database of artifacts. In addition to these features, the software architecture of TMSEEG ensures minimal user effort in initial setup and configuration of parameters for each processing step. This is partly accomplished through a close integration with EEGLAB, a widely used open-source toolbox for EEG signal processing. In this article, we introduce TMSEEG, validate its features and demonstrate its application in extracting TEPs across several single- and multi-pulse TMS protocols. 
As the first open-source GUI-based pipeline for TMS-EEG signal processing, this toolbox intends to promote the widespread utility and standardization of an emerging technology in brain research.

  4. TMSEEG: A MATLAB-Based Graphical User Interface for Processing Electrophysiological Signals during Transcranial Magnetic Stimulation

    PubMed Central

    Atluri, Sravya; Frehlich, Matthew; Mei, Ye; Garcia Dominguez, Luis; Rogasch, Nigel C.; Wong, Willy; Daskalakis, Zafiris J.; Farzan, Faranak

    2016-01-01

    Concurrent recording of electroencephalography (EEG) during transcranial magnetic stimulation (TMS) is an emerging and powerful tool for studying brain health and function. Despite a growing interest in adaptation of TMS-EEG across neuroscience disciplines, its widespread utility is limited by signal processing challenges. These challenges arise due to the nature of TMS and the sensitivity of EEG to artifacts that often mask TMS-evoked potentials (TEPs). With an increase in the complexity of data processing methods and a growing interest in multi-site data integration, analysis of TMS-EEG data requires the development of a standardized method to recover TEPs from various sources of artifacts. This article introduces TMSEEG, an open-source MATLAB application comprising multiple algorithms organized to facilitate a step-by-step procedure for TMS-EEG signal processing. Using a modular design and interactive graphical user interface (GUI), this toolbox aims to streamline TMS-EEG signal processing for both novice and experienced users. Specifically, TMSEEG provides: (i) targeted removal of TMS-induced and general EEG artifacts; (ii) a step-by-step modular workflow with flexibility to modify existing algorithms and add customized algorithms; (iii) a comprehensive display and quantification of artifacts; (iv) quality control check points with visual feedback of TEPs throughout the data processing workflow; and (v) capability to label and store a database of artifacts. In addition to these features, the software architecture of TMSEEG ensures minimal user effort in initial setup and configuration of parameters for each processing step. This is partly accomplished through a close integration with EEGLAB, a widely used open-source toolbox for EEG signal processing. In this article, we introduce TMSEEG, validate its features and demonstrate its application in extracting TEPs across several single- and multi-pulse TMS protocols. 
As the first open-source GUI-based pipeline for TMS-EEG signal processing, this toolbox intends to promote the widespread utility and standardization of an emerging technology in brain research. PMID:27774054
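
    The modular, checkpointed workflow described above can be caricatured in a few lines. The sketch below is an illustrative toy in plain NumPy, not the actual MATLAB toolbox: the two step functions (pulse-window interpolation, linear detrending) are hypothetical stand-ins for the toolbox's algorithms, and the quality-control metric is a crude standard deviation.

```python
import numpy as np

def remove_tms_pulse(data, pulse_idx, width=5):
    # Hypothetical step: interpolate across the TMS pulse artifact window
    out = data.copy()
    lo, hi = pulse_idx - width, pulse_idx + width
    for ch in range(out.shape[0]):
        out[ch, lo:hi] = np.linspace(out[ch, lo], out[ch, hi], hi - lo)
    return out

def detrend(data):
    # Hypothetical step: remove per-channel linear drift
    t = np.arange(data.shape[1])
    out = np.empty_like(data)
    for ch in range(data.shape[0]):
        slope, intercept = np.polyfit(t, data[ch], 1)
        out[ch] = data[ch] - (slope * t + intercept)
    return out

class Pipeline:
    """Ordered, modular steps with a quality-control snapshot after each."""
    def __init__(self, steps):
        self.steps = steps
        self.snapshots = []   # (step name, crude QC metric) pairs
    def run(self, data):
        for name, fn in self.steps:
            data = fn(data)
            self.snapshots.append((name, float(data.std())))
        return data

rng = np.random.default_rng(0)
eeg = rng.normal(0.0, 1.0, (4, 200))   # 4 channels, 200 samples
eeg[:, 100] += 50.0                    # simulated TMS pulse artifact
pipe = Pipeline([("pulse removal", lambda d: remove_tms_pulse(d, 100)),
                 ("detrend", detrend)])
clean = pipe.run(eeg)
```

The snapshot list gives the visual-feedback checkpoints a place to hang: after each step one can inspect or plot the intermediate data rather than only the final TEP.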

  5. Multi-objective evolutionary algorithms for fuzzy classification in survival prediction.

    PubMed

    Jiménez, Fernando; Sánchez, Gracia; Juárez, José M

    2014-03-01

    This paper presents a novel rule-based fuzzy classification methodology for survival/mortality prediction in severe burnt patients. Due to the ethical aspects involved in this medical scenario, physicians tend not to accept a computer-based evaluation unless they understand why and how such a recommendation is given. Therefore, any fuzzy classifier model must be both accurate and interpretable. The proposed methodology is a three-step process: (1) multi-objective constrained optimization of a patient's data set, using Pareto-based elitist multi-objective evolutionary algorithms to maximize accuracy and minimize the complexity (number of rules) of classifiers, subject to interpretability constraints; this step produces a set of alternative (Pareto) classifiers; (2) linguistic labeling, which assigns a linguistic label to each fuzzy set of the classifiers; this step is essential to the interpretability of the classifiers; (3) decision making, whereby a classifier is chosen, if it is satisfactory, according to the preferences of the decision maker. If no classifier is satisfactory for the decision maker, the process starts again in step (1) with a different input parameter set. The performance of three multi-objective evolutionary algorithms, the niched pre-selection multi-objective algorithm, the elitist Pareto-based multi-objective evolutionary algorithm for diversity reinforcement (ENORA) and the non-dominated sorting genetic algorithm (NSGA-II), was tested using a patient data set from an intensive care burn unit and a benchmark data set from a standard machine learning repository. The results are compared using the hypervolume multi-objective metric. In addition, the results were compared with other non-evolutionary techniques and validated with a multi-objective cross-validation technique. 
Our proposal improves the classification rate obtained by other non-evolutionary techniques (decision trees, artificial neural networks, Naive Bayes, and case-based reasoning), achieving with ENORA a classification rate of 0.9298, a specificity of 0.9385, and a sensitivity of 0.9364, with 14.2 interpretable fuzzy rules on average. Our proposal improves the accuracy and interpretability of the classifiers compared with other non-evolutionary techniques. We also conclude that ENORA outperforms the niched pre-selection and NSGA-II algorithms. Moreover, given that our multi-objective evolutionary methodology is based on real-parameter rather than combinatorial optimization, the time cost is significantly reduced compared with other evolutionary approaches in the literature that are based on combinatorial optimization. Copyright © 2014 Elsevier B.V. All rights reserved.
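
    The accuracy-versus-complexity trade-off in step (1) reduces to Pareto dominance over (accuracy, number of rules) pairs. A minimal sketch of extracting the non-dominated set of candidate classifiers (the numeric values are illustrative, not the study's results):

```python
def dominates(a, b):
    # a dominates b: accuracy no worse AND rules no more, strictly better in one.
    # Each candidate is (accuracy, n_rules); accuracy maximized, rules minimized.
    return (a[0] >= b[0] and a[1] <= b[1]) and (a[0] > b[0] or a[1] < b[1])

def pareto_front(cands):
    # Keep every candidate not dominated by any other candidate
    return [c for c in cands
            if not any(dominates(o, c) for o in cands if o is not c)]

# Hypothetical (accuracy, number of fuzzy rules) pairs
classifiers = [(0.93, 14), (0.90, 8), (0.93, 20), (0.85, 5), (0.88, 8)]
front = pareto_front(classifiers)
```

The decision maker of step (3) then chooses among the surviving trade-offs, e.g. a slightly less accurate classifier with far fewer rules.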

  6. Structure determination of an 11-subunit exosome in complex with RNA by molecular replacement

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Makino, Debora Lika, E-mail: dmakino@biochem.mpg.de; Conti, Elena

    The crystallographic steps towards the structure determination of a complete eukaryotic exosome complex bound to RNA are presented. Phasing of this 11-protein subunit complex was carried out via molecular replacement. The RNA exosome is an evolutionarily conserved multi-protein complex involved in the 3′ degradation of a variety of RNA transcripts. In the nucleus, the exosome participates in the maturation of structured RNAs, in the surveillance of pre-mRNAs and in the decay of a variety of noncoding transcripts. In the cytoplasm, the exosome degrades mRNAs in constitutive and regulated turnover pathways. Several structures of subcomplexes of eukaryotic exosomes or related prokaryotic exosome-like complexes are known, but how the complete assembly is organized to fulfil processive RNA degradation has been unclear. An atomic snapshot of a Saccharomyces cerevisiae 420 kDa exosome complex bound to an RNA substrate in the pre-cleavage state of a hydrolytic reaction has been determined. Here, the crystallographic steps towards the structural elucidation, which was carried out by molecular replacement, are presented.

  7. Microelectrode voltammetry of multi-electron transfers complicated by coupled chemical equilibria: a general theory for the extended square scheme.

    PubMed

    Laborda, Eduardo; Gómez-Gil, José María; Molina, Angela

    2017-06-28

    A very general and simple theoretical solution is presented for the current-potential-time response of reversible multi-electron transfer processes complicated by homogeneous chemical equilibria (the so-called extended square scheme). The expressions presented here are applicable regardless of the number of electrons transferred and coupled chemical processes, and they are particularized for a wide variety of microelectrode geometries. The voltammetric response of very different systems presenting multi-electron transfers is considered for the most widely used techniques (namely, cyclic voltammetry, square wave voltammetry, differential pulse voltammetry and steady-state voltammetry), studying the influence of the microelectrode geometry and the number and thermodynamics of the (electro)chemical steps. The most appropriate techniques and procedures for the determination of the 'interaction' between successive transfers are discussed. Special attention is paid to those situations where homogeneous chemical processes, such as protonation, complexation or ion association, affect the electrochemical behaviour of the system by differently stabilizing the oxidation states.
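
    For the simplest case of the extended square scheme, two successive reversible one-electron reductions with no coupled chemistry, the steady-state wave follows directly from applying the Nernst equation at each step. The sketch below assumes formal potentials E1 and E2 and equal diffusion coefficients; it is an illustration of the textbook EE mechanism, not the paper's general solution.

```python
import numpy as np

F_RT = 96485.0 / (8.314 * 298.0)   # f = F/RT at 298 K, in 1/V

def ee_wave(E, E1=-0.10, E2=-0.30):
    """Normalized steady-state current for two reversible 1e- reductions
    O -> I -> R (equal diffusion coefficients assumed)."""
    u = np.exp(F_RT * (E1 - E))    # [I]/[O] from the Nernst equation
    v = np.exp(F_RT * (E2 - E))    # [R]/[I] from the Nernst equation
    theta_I = u / (1 + u + u * v)        # surface fraction of intermediate
    theta_R = u * v / (1 + u + u * v)    # surface fraction of fully reduced
    return (theta_I + 2 * theta_R) / 2.0  # average electrons transferred / 2

E = np.linspace(0.2, -0.6, 401)    # potential swept negative
i_norm = ee_wave(E)
```

With E1 - E2 large and positive the wave splits into two resolved one-electron steps; as the separation shrinks (or inverts) the waves merge, which is the kind of 'interaction' between successive transfers the abstract discusses.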

  8. Multi-Step Deep Reactive Ion Etching Fabrication Process for Silicon-Based Terahertz Components

    NASA Technical Reports Server (NTRS)

    Reck, Theodore (Inventor); Perez, Jose Vicente Siles (Inventor); Lee, Choonsup (Inventor); Cooper, Ken B. (Inventor); Jung-Kubiak, Cecile (Inventor); Mehdi, Imran (Inventor); Chattopadhyay, Goutam (Inventor); Lin, Robert H. (Inventor); Peralta, Alejandro (Inventor)

    2016-01-01

    A multi-step silicon etching process has been developed to fabricate silicon-based terahertz (THz) waveguide components. This technique provides precise dimensional control across multiple etch depths with batch-processing capabilities. Nonlinear and passive components such as mixers, multipliers, waveguides, hybrids, OMTs and twists have been fabricated and integrated into a small silicon package. This fabrication technique enables a wafer-stacking architecture to provide ultra-compact multi-pixel receiver front-ends in the THz range.

  9. The multiBac protein complex production platform at the EMBL.

    PubMed

    Berger, Imre; Garzoni, Frederic; Chaillet, Maxime; Haffke, Matthias; Gupta, Kapil; Aubert, Alice

    2013-07-11

    Proteomics research revealed the impressive complexity of eukaryotic proteomes in unprecedented detail. It is now a commonly accepted notion that proteins in cells mostly exist not as isolated entities but exert their biological activity in association with many other proteins, in humans ten or more, forming assembly lines in the cell for most if not all vital functions.(1,2) Knowledge of the function and architecture of these multiprotein assemblies requires their provision in superior quality and sufficient quantity for detailed analysis. The paucity of many protein complexes in cells, in particular in eukaryotes, prohibits their extraction from native sources and necessitates recombinant production. The baculovirus expression vector system (BEVS) has proven to be particularly useful for producing eukaryotic proteins, the activity of which often relies on post-translational processing that other commonly used expression systems often cannot support.(3) BEVS uses a recombinant baculovirus, into which the gene of interest has been inserted, to infect insect cell cultures, which in turn produce the protein of choice. MultiBac is a BEVS that has been particularly tailored for the production of eukaryotic protein complexes that contain many subunits.(4) A vital prerequisite for efficient production of proteins and their complexes is a set of robust protocols for all steps involved in an expression experiment that ideally can be implemented as standard operating procedures (SOPs) and followed also by non-specialist users with comparative ease. 
The MultiBac platform at the European Molecular Biology Laboratory (EMBL) uses SOPs for all steps involved in a multiprotein complex expression experiment, starting from insertion of the genes into an engineered baculoviral genome optimized for heterologous protein production properties to small-scale analysis of the protein specimens produced.(5-8) The platform is installed in an open-access mode at EMBL Grenoble and has supported many scientists from academia and industry to accelerate protein complex research projects.

  10. Method to Improve Indium Bump Bonding via Indium Oxide Removal Using a Multi-Step Plasma Process

    NASA Technical Reports Server (NTRS)

    Dickie, Matthew R. (Inventor); Nikzad, Shouleh (Inventor); Greer, H. Frank (Inventor); Jones, Todd J. (Inventor); Vasquez, Richard P. (Inventor); Hoenk, Michael E. (Inventor)

    2012-01-01

    A process for removing indium oxide from indium bumps in a flip-chip structure to reduce contact resistance, by a multi-step plasma treatment. A first plasma treatment of the indium bumps with an argon, methane and hydrogen plasma reduces indium oxide, and a second plasma treatment with an argon and hydrogen plasma removes residual organics. The multi-step plasma process for removing indium oxide from the indium bumps is more effective in reducing the oxide, and yet does not require the use of halogens, does not change the bump morphology, does not attack the bond pad material or under-bump metallization layers, and creates no new mechanisms for open circuits.

  11. A ruthenium dimer complex with a flexible linker slowly threads between DNA bases in two distinct steps.

    PubMed

    Bahira, Meriem; McCauley, Micah J; Almaqwashi, Ali A; Lincoln, Per; Westerlund, Fredrik; Rouzina, Ioulia; Williams, Mark C

    2015-10-15

    Several multi-component DNA intercalating small molecules have been designed around ruthenium-based intercalating monomers to optimize DNA binding properties for therapeutic use. Here we probe the DNA binding ligand [μ-C4(cpdppz)2(phen)4Ru2](4+), which consists of two Ru(phen)2dppz(2+) moieties joined by a flexible linker. To quantify ligand binding, double-stranded DNA is stretched with optical tweezers and exposed to ligand under constant applied force. In contrast to other bis-intercalators, we find that ligand association is described by a two-step process, which consists of fast bimolecular intercalation of the first dppz moiety followed by ∼10-fold slower intercalation of the second dppz moiety. The second step is rate-limited by the requirement for a DNA-ligand conformational change that allows the flexible linker to pass through the DNA duplex. Based on our measured force-dependent binding rates and ligand-induced DNA elongation measurements, we are able to map out the energy landscape and structural dynamics for both ligand binding steps. In addition, we find that at zero force the overall binding process involves fast association (∼10 s), slow dissociation (∼300 s), and very high affinity (Kd ∼10 nM). The methodology developed in this work will be useful for studying the mechanism of DNA binding by other multi-step intercalating ligands and proteins. © The Author(s) 2015. Published by Oxford University Press on behalf of Nucleic Acids Research.
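
    The reported two-step association, fast intercalation of the first dppz moiety followed by roughly 10-fold slower threading of the second, can be caricatured as a sequential first-order scheme free -> mono -> bis integrated with a simple Euler loop. The rate constants below are illustrative placeholders, not the measured values.

```python
import numpy as np

def two_step_binding(k1, k2, t_end=2000.0, dt=0.1):
    """Sequential two-step intercalation (sketch): free -> mono -> bis,
    modeled as first-order steps with rate constants k1 >> k2."""
    n = int(t_end / dt)
    free, mono, bis = 1.0, 0.0, 0.0
    traj = np.empty((n, 3))
    for i in range(n):
        d_free = -k1 * free              # first dppz moiety intercalates
        d_mono = k1 * free - k2 * mono   # second, linker-limited step
        d_bis = k2 * mono
        free += d_free * dt
        mono += d_mono * dt
        bis += d_bis * dt
        traj[i] = (free, mono, bis)
    return traj

# ~10-fold slower second step, mimicking the reported rate ratio
traj = two_step_binding(k1=0.1, k2=0.01)
```

The transient build-up and decay of the mono-intercalated intermediate is the signature that distinguishes this mechanism from single-step bis-intercalation in the force-clamp experiments.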

  12. Multi-electrolyte-step anodic aluminum oxide method for the fabrication of self-organized nanochannel arrays

    PubMed Central

    2012-01-01

    Nanochannel arrays were fabricated by the self-organized multi-electrolyte-step anodic aluminum oxide [AAO] method in this study. The anodization conditions used in the multi-electrolyte-step AAO method initially included a phosphoric acid solution as the electrolyte and a high applied voltage. The phosphoric acid was then replaced by an oxalic acid solution as the electrolyte, together with a lower applied voltage. This method was used to produce self-organized nanochannel arrays with good regularity and circularity, resulting in less power loss and shorter processing time than the multi-step AAO method. PMID:22333268

  13. AN INTEGRATED PERSPECTIVE ON THE ASSESSMENT OF TECHNOLOGIES: INTEGRATE-HTA.

    PubMed

    Wahlster, Philip; Brereton, Louise; Burns, Jacob; Hofmann, Björn; Mozygemba, Kati; Oortwijn, Wija; Pfadenhauer, Lisa; Polus, Stephanie; Rehfuess, Eva; Schilling, Imke; van der Wilt, Gert Jan; Gerhardus, Ansgar

    2017-01-01

    Current health technology assessment (HTA) is not well equipped to assess complex technologies as insufficient attention is being paid to the diversity in patient characteristics and preferences, context, and implementation. Strategies to integrate these and several other aspects, such as ethical considerations, in a comprehensive assessment are missing. The aim of the European research project INTEGRATE-HTA was to develop a model for an integrated HTA of complex technologies. A multi-method, four-stage approach guided the development of the INTEGRATE-HTA Model: (i) definition of the different dimensions of information to be integrated, (ii) literature review of existing methods for integration, (iii) adjustment of concepts and methods for assessing distinct aspects of complex technologies in the frame of an integrated process, and (iv) application of the model in a case study and subsequent revisions. The INTEGRATE-HTA Model consists of five steps, each involving stakeholders: (i) definition of the technology and the objective of the HTA; (ii) development of a logic model to provide a structured overview of the technology and the system in which it is embedded; (iii) evidence assessment on effectiveness, economic, ethical, legal, and socio-cultural aspects, taking variability of participants, context, implementation issues, and their interactions into account; (iv) populating the logic model with the data generated in step 3; (v) structured process of decision-making. The INTEGRATE-HTA Model provides a structured process for integrated HTAs of complex technologies. Stakeholder involvement in all steps is essential as a means of ensuring relevance and meaningful interpretation of the evidence.

  14. A Geometry Based Infra-structure for Computational Analysis and Design

    NASA Technical Reports Server (NTRS)

    Haimes, Robert

    1997-01-01

    The computational steps traditionally taken for most engineering analysis (CFD, structural analysis, etc.) are: Surface Generation - usually by employing a CAD system; Grid Generation - preparing the volume for the simulation; Flow Solver - producing the results at the specified operational point; and Post-processing Visualization - interactively attempting to understand the results. For structural analysis, integrated systems can be obtained from a number of commercial vendors. For CFD, these steps have worked well in the past for simple steady-state simulations at the expense of much user interaction. The data was transmitted between phases via files. Specifically the problems with this procedure are: (1) File based. Information flows from one step to the next via data files with formats specified for that procedure. (2) 'Good' Geometry. A bottleneck in getting results from a solver is the construction of proper geometry to be fed to the grid generator. With 'good' geometry a grid can be constructed in tens of minutes (even with a complex configuration) using unstructured techniques. (3) One-Way communication. All information travels on from one phase to the next. Until this process can be automated, more complex problems such as multi-disciplinary analysis or using the above procedure for design become prohibitive.

  15. Guiding gate-etch process development using 3D surface reaction modeling for 7nm and beyond

    NASA Astrophysics Data System (ADS)

    Dunn, Derren; Sporre, John R.; Deshpande, Vaibhav; Oulmane, Mohamed; Gull, Ronald; Ventzek, Peter; Ranjan, Alok

    2017-03-01

    Increasingly, advanced process nodes such as 7nm (N7) are fundamentally 3D and require stringent control of critical dimensions over high aspect ratio features. Process integration in these nodes requires a deep understanding of complex physical mechanisms to control critical dimensions from lithography through final etch. Polysilicon gate etch processes are critical steps in several device architectures for advanced nodes that rely on self-aligned patterning approaches to gate definition. These processes are required to meet several key metrics: (a) vertical etch profiles over high aspect ratios; (b) clean gate sidewalls free of etch process residue; (c) minimal erosion of liner oxide films protecting key architectural elements such as fins; and (d) residue-free corners at gate interfaces with critical device elements. In this study, we explore how hybrid modeling approaches can be used to model a multi-step finFET polysilicon gate etch process. Initial parts of the patterning process through hardmask assembly are modeled using process emulation. Important aspects of gate definition are then modeled using a particle Monte Carlo (PMC) feature scale model that incorporates surface chemical reactions. When necessary, species and energy flux inputs to the PMC model are derived from simulations of the etch chamber. The modeled polysilicon gate etch process consists of several steps including a hard mask breakthrough step (BT), main feature etch steps (ME), and over-etch steps (OE) that control gate profiles at the gate fin interface. An additional constraint on this etch flow is that fin spacer oxides are left intact after final profile tuning steps. A natural optimization required from these processes is to maximize vertical gate profiles while minimizing erosion of fin spacer films.
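
    The particle Monte Carlo idea can be reduced to a toy: a vertical (anisotropic) ion flux, a hard mask, and a surface reaction probability per arriving particle. Everything below, grid size, mask layout, reaction probability, is an assumption for illustration; real feature-scale PMC models track angular distributions, redeposition and multiple species.

```python
import numpy as np

rng = np.random.default_rng(1)

def mc_etch(width=40, n_particles=20000, mask=(0, 10, 30, 40), p_react=0.8):
    """Feature-scale particle Monte Carlo sketch: a vertical ion flux etches
    the unmasked opening; each arriving particle removes material with
    probability p_react. Returns etch depth per surface column."""
    depth = np.zeros(width)
    masked = np.zeros(width, dtype=bool)
    masked[mask[0]:mask[1]] = True    # left hard-mask region
    masked[mask[2]:mask[3]] = True    # right hard-mask region
    for _ in range(n_particles):
        col = rng.integers(0, width)  # uniform vertical flux across the cell
        if masked[col]:
            continue                  # hard mask stops the particle
        if rng.random() < p_react:
            depth[col] += 1.0         # surface reaction removes one cell
    return depth, masked

depth, masked = mc_etch()
```

Even this toy shows why the BT/ME/OE steps are tuned separately: the reaction probability and flux anisotropy set both the mean etch depth in the opening and the statistical roughness of the resulting profile.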

  16. Sea-land segmentation for infrared remote sensing images based on superpixels and multi-scale features

    NASA Astrophysics Data System (ADS)

    Lei, Sen; Zou, Zhengxia; Liu, Dunge; Xia, Zhenghuan; Shi, Zhenwei

    2018-06-01

    Sea-land segmentation is a key step in the information processing of ocean remote sensing images. Traditional sea-land segmentation algorithms ignore the local similarity prior of sea and land, and thus fail in complex scenarios. In this paper, we propose a new sea-land segmentation method for infrared remote sensing images that tackles this problem using superpixels and multi-scale features. Considering the connectivity and local similarity of sea or land, we interpret the sea-land segmentation task in terms of superpixels rather than pixels, so that similar pixels are clustered and the local similarity is exploited. Moreover, the multi-scale features are elaborately designed, comprising a gray histogram and multi-scale total variation. Experimental results on infrared bands of Landsat-8 satellite images demonstrate that the proposed method obtains more accurate and more robust sea-land segmentation results than the traditional algorithms.
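
    A stripped-down version of the superpixel-plus-features idea can be sketched with fixed image blocks standing in for true superpixels, a gray histogram as the per-region feature, and a midpoint threshold standing in for a trained classifier. The synthetic scene and every parameter below are assumptions; this is not the paper's algorithm.

```python
import numpy as np

rng = np.random.default_rng(2)

def block_superpixels(img, bs=8):
    """Crude stand-in for superpixels: split the image into bs x bs blocks
    and describe each by its mean gray level and gray-histogram spread."""
    h, w = img.shape
    feats, coords = [], []
    for i in range(0, h, bs):
        for j in range(0, w, bs):
            block = img[i:i+bs, j:j+bs]
            hist, _ = np.histogram(block, bins=8, range=(0.0, 1.0))
            feats.append((block.mean(), hist.std()))
            coords.append((i, j))
    return np.array(feats), coords

# Synthetic infrared-like scene: dark, smooth sea (left); bright, textured land (right)
img = np.empty((64, 64))
img[:, :32] = 0.2 + 0.02 * rng.standard_normal((64, 32))
img[:, 32:] = 0.7 + 0.10 * rng.standard_normal((64, 32))
feats, coords = block_superpixels(img)

# Two-class split on mean gray level (midpoint threshold as a sketch)
thr = feats[:, 0].mean()
is_land = feats[:, 0] > thr
```

Classifying regions rather than pixels is what encodes the local-similarity prior: a noisy pixel inside a sea block is outvoted by its neighbours.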

  17. Biomimetic surface structuring using cylindrical vector femtosecond laser beams

    NASA Astrophysics Data System (ADS)

    Skoulas, Evangelos; Manousaki, Alexandra; Fotakis, Costas; Stratakis, Emmanuel

    2017-03-01

    We report on a new, single-step and scalable method to fabricate highly ordered, multi-directional and complex surface structures that mimic the unique morphological features of certain species found in nature. Biomimetic surface structuring was realized by exploiting the unique and versatile angular profile and the electric field symmetry of cylindrical vector (CV) femtosecond (fs) laser beams. It is shown that highly controllable, periodic structures exhibiting sizes at nano-, micro- and dual micro/nano scales can be directly written on Ni upon line and large-area scanning with radially and azimuthally polarized beams. Depending on the irradiation conditions, new complex multi-directional nanostructures, inspired by the morphology of shark skin, as well as superhydrophobic dual-scale structures mimicking the water-repellent properties of the lotus leaf, can be attained. It is concluded that the versatility and variety of the structures formed are by far superior to those obtained via laser processing with linearly polarized beams. More importantly, by exploiting the capabilities offered by fs CV fields, the present technique can be further extended to fabricate even more complex and unconventional structures. We believe that our approach provides a new concept in laser materials processing, which can be further exploited for expanding the breadth and novelty of applications.

  18. Recognising and referring children exposed to domestic abuse: a multi-professional, proactive systems-based evaluation using a modified Failure Mode and Effects Analysis (FMEA).

    PubMed

    Ashley, Laura; Armitage, Gerry; Taylor, Julie

    2017-03-01

    Failure Mode and Effects Analysis (FMEA) is a prospective quality assurance methodology increasingly used in healthcare, which identifies potential vulnerabilities in complex, high-risk processes and generates remedial actions. We aimed, for the first time, to apply FMEA in a social care context to evaluate the process for recognising and referring children exposed to domestic abuse within one Midlands city safeguarding area in England. A multidisciplinary, multi-agency team of 10 front-line professionals undertook the FMEA, using a modified methodology, over seven group meetings. The FMEA included mapping out the process under evaluation to identify its component steps, identifying failure modes (potential errors) and possible causes for each step and generating corrective actions. In this article, we report the output from the FMEA, including illustrative examples of the failure modes and corrective actions generated. We also present an analysis of feedback from the FMEA team and provide future recommendations for the use of FMEA in appraising social care processes and practice. Although challenging, the FMEA was unequivocally valuable for team members and generated a significant number of corrective actions locally for the safeguarding board to consider in its response to children exposed to domestic abuse. © 2016 John Wiley & Sons Ltd.
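
    Classical FMEA prioritizes failure modes by a Risk Priority Number, the product of severity, occurrence and detectability scores. The study above used a modified FMEA, but the core calculation looks like this; the failure modes and 1-10 scores below are hypothetical, not taken from the evaluation.

```python
# Hypothetical failure modes with (severity, occurrence, detectability) on a 1-10 scale
failure_modes = [
    ("disclosure not recognised at triage", 9, 6, 7),
    ("referral form incomplete",            5, 7, 4),
    ("inter-agency handover delayed",       8, 5, 6),
]

def rpn(sev, occ, det):
    """Risk Priority Number: severity x occurrence x detectability."""
    return sev * occ * det

# Highest-RPN failure modes are addressed first with corrective actions
ranked = sorted(failure_modes, key=lambda m: rpn(*m[1:]), reverse=True)
```

Ranking by RPN gives the team an explicit, auditable ordering for deciding which corrective actions to generate first.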

  19. Porous polycarbene-bearing membrane actuator for ultrasensitive weak-acid detection and real-time chemical reaction monitoring.

    PubMed

    Sun, Jian-Ke; Zhang, Weiyi; Guterman, Ryan; Lin, Hui-Juan; Yuan, Jiayin

    2018-04-30

    Soft actuators with integration of ultrasensitivity and capability of simultaneous interaction with multiple stimuli through an entire event ask for a high level of structure complexity, adaptability, and/or multi-responsiveness, which is a great challenge. Here, we develop a porous polycarbene-bearing membrane actuator built up from ionic complexation between a poly(ionic liquid) and trimesic acid (TA). The actuator features two concurrent structure gradients, i.e., an electrostatic complexation (EC) degree and a density distribution of a carbene-NH3 adduct (CNA) along the membrane cross-section. The membrane actuator exhibits the highest sensitivity among the state-of-the-art soft proton actuators toward acetic acid at the 10(-6) mol L(-1) (M) level in aqueous media. Through competing actuation of the two gradients, it is capable of monitoring an entire process of proton-involved chemical reactions that comprise multiple stimuli and operational steps. The present achievement constitutes a significant step toward real-life application of soft actuators in chemical sensing and reaction technology.

  20. Deciphering Late-Pleistocene landscape evolution: linking proxies by combining pedo-stratigraphy and luminescence dating

    NASA Astrophysics Data System (ADS)

    Kreutzer, Sebastian; Meszner, Sascha; Faust, Dominik; Fuchs, Markus

    2014-05-01

    Interpreting past landscape evolution requires understanding the processes that sculpted such landforms, which means deciphering complex systems. For reconstructing terrestrial Quaternary environments based on loess archives this might be considered, at least, a three-step process: (1) identifying valuable records in appropriate morphological positions in a previously defined research area, (2) analysing the profiles by field work and laboratory methods and finally (3) linking the previously considered pseudo-isolated systems to set up a comprehensive picture. The first and the last steps in particular may bring pitfalls, as it is tempting to treat single records as pseudo-isolated, closed systems. They might be, with regard to their preservation in their specific morphological position, but in fact they are part of a complex, open system. Between 2008 and 2013, Late-Pleistocene loess archives in Saxony have been intensively investigated by field and laboratory methods. Linking pedo- and luminescence-dating-based chronostratigraphies, a composite profile for the entire Saxonian Loess Region has been established. With this at least two-fold approach we tried to avoid misinterpretations that might appear when focussing on one standard profile in an open morphological system. Our contribution focuses on this multi-proxy approach to decipher the Late-Pleistocene landscape evolution in the Saxonian Loess Region. Highlighting the challenges and advantages of combining different methods, we believe that (1) this multi-proxy approach is without alternative, and (2) the combination of different profiles may simplify the more complex reality, but it may be a useful generalisation to understand and reveal the stratigraphical significance of the landscape evolution in this region.

  1. Contaminant source and release history identification in groundwater: A multi-step approach

    NASA Astrophysics Data System (ADS)

    Gzyl, G.; Zanini, A.; Frączek, R.; Kura, K.

    2014-02-01

    The paper presents a new multi-step approach aiming at source identification and release history estimation. The new approach consists of three steps: performing integral pumping tests, identifying sources, and recovering the release history by means of a geostatistical approach. The present paper shows the results obtained from the application of the approach within a complex case study in Poland in which several areal sources were identified. The investigated site is situated in the vicinity of a former chemical plant in southern Poland in the city of Jaworzno in the valley of the Wąwolnica River; the plant has been in operation since the First World War producing various chemicals. From an environmental point of view the most relevant activity was the production of pesticides, especially lindane. The application of the multi-step approach enabled a significant increase in the knowledge of contamination at the site. Some suspected contamination sources have been proven to have minor effect on the overall contamination. Other suspected sources have been proven to have key significance. Some areas not taken into consideration previously have now been identified as key sources. The method also enabled estimation of the magnitude of the sources, and a list of priority reclamation actions will be drawn up as a result. The multi-step approach has proven to be effective and may be applied to other complicated contamination cases. Moreover, the paper shows the capability of the geostatistical approach to manage a complex real case study.
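
    The release-history recovery step can be illustrated with a linear forward model d = Hs (observed concentrations as a smoothed, delayed image of the release history) and regularized least squares standing in for the full geostatistical inversion. The transport kernel, release pulse and noise level below are synthetic assumptions, not site data.

```python
import numpy as np

rng = np.random.default_rng(3)

# Assumed forward model: observations d = H s + noise, where H delays and
# smears the release history s (a crude surrogate for advection-dispersion)
n = 60
t = np.arange(n)
true_s = np.exp(-0.5 * ((t - 20.0) / 5.0) ** 2)           # one release pulse
H = np.array([[np.exp(-0.5 * ((i - j - 8.0) / 4.0) ** 2) for j in range(n)]
              for i in range(n)])                          # transport kernel
H /= H.sum(axis=1, keepdims=True)                          # conserve mass
d = H @ true_s + 0.01 * rng.standard_normal(n)

# Tikhonov-regularized least squares (stand-in for the geostatistical step)
lam = 1e-2
s_hat = np.linalg.solve(H.T @ H + lam * np.eye(n), H.T @ d)
```

The regularization parameter plays the role of the prior covariance in the geostatistical formulation: too small and noise is amplified into spurious release events, too large and genuine pulses are smeared away.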

  2. Systematic assessment of benefits and risks: study protocol for a multi-criteria decision analysis using the Analytic Hierarchy Process for comparative effectiveness research

    PubMed Central

    Singh, Sonal

    2013-01-01

    Background: Regulatory decision-making involves assessment of risks and benefits of medications at the time of approval or when relevant safety concerns arise with a medication. The Analytic Hierarchy Process (AHP) facilitates decision-making in complex situations involving tradeoffs by considering risks and benefits of alternatives. The AHP allows a more structured method of synthesizing and understanding evidence in the context of importance assigned to outcomes. Our objective is to evaluate the use of an AHP in a simulated committee setting selecting oral medications for type 2 diabetes.  Methods: This study protocol describes the AHP in five sequential steps using a small group of diabetes experts representing various clinical disciplines. The first step will involve defining the goal of the decision and developing the AHP model. In the next step, we will collect information about how well alternatives are expected to fulfill the decision criteria. In the third step, we will compare the ability of the alternatives to fulfill the criteria and judge the importance of eight criteria relative to the decision goal of the optimal medication choice for type 2 diabetes. We will use pairwise comparisons to sequentially compare the pairs of alternative options regarding their ability to fulfill the criteria. In the fourth step, the scales created in the third step will be combined to create a summary score indicating how well the alternatives met the decision goal. The resulting scores will be expressed as percentages and will indicate the alternative medications' relative abilities to fulfill the decision goal. The fifth step will consist of sensitivity analyses to explore the effects of changing the estimates. We will also conduct a cognitive interview and process evaluation.  
Discussion: Multi-criteria decision analysis using the AHP will aid, support and enhance the ability of decision makers to make evidence-based informed decisions consistent with their values and preferences. PMID:24555077
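
    The pairwise-comparison and scoring steps described in this protocol follow standard AHP arithmetic, which can be sketched as follows. This is a generic illustration using the row geometric-mean approximation to Saaty's eigenvector weights; the comparison values are hypothetical and are not the study's actual criteria or judgments.

```python
import numpy as np

def ahp_priorities(pairwise):
    """Priority weights from a reciprocal pairwise-comparison matrix,
    via the row geometric-mean method (a standard approximation to
    Saaty's principal-eigenvector weights)."""
    A = np.asarray(pairwise, dtype=float)
    gm = A.prod(axis=1) ** (1.0 / A.shape[0])  # row geometric means
    return gm / gm.sum()                       # normalize to sum to 1

# Hypothetical judgments for three criteria on Saaty's 1-9 scale:
# criterion 0 is judged 3x as important as 1 and 5x as important as 2.
A = [[1.0, 3.0, 5.0],
     [1/3, 1.0, 2.0],
     [1/5, 1/2, 1.0]]
w = ahp_priorities(A)  # weight vector for combining criterion scores
```

    In the protocol's fourth step, a summary score for each alternative medication would then be the dot product of this weight vector with the medication's per-criterion scores.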

  3. Systematic assessment of benefits and risks: study protocol for a multi-criteria decision analysis using the Analytic Hierarchy Process for comparative effectiveness research.

    PubMed

    Maruthur, Nisa M; Joy, Susan; Dolan, James; Segal, Jodi B; Shihab, Hasan M; Singh, Sonal

    2013-01-01

    Regulatory decision-making involves assessment of risks and benefits of medications at the time of approval or when relevant safety concerns arise with a medication. The Analytic Hierarchy Process (AHP) facilitates decision-making in complex situations involving tradeoffs by considering risks and benefits of alternatives. The AHP allows a more structured method of synthesizing and understanding evidence in the context of importance assigned to outcomes. Our objective is to evaluate the use of an AHP in a simulated committee setting selecting oral medications for type 2 diabetes.  This study protocol describes the AHP in five sequential steps using a small group of diabetes experts representing various clinical disciplines. The first step will involve defining the goal of the decision and developing the AHP model. In the next step, we will collect information about how well alternatives are expected to fulfill the decision criteria. In the third step, we will compare the ability of the alternatives to fulfill the criteria and judge the importance of eight criteria relative to the decision goal of the optimal medication choice for type 2 diabetes. We will use pairwise comparisons to sequentially compare the pairs of alternative options regarding their ability to fulfill the criteria. In the fourth step, the scales created in the third step will be combined to create a summary score indicating how well the alternatives met the decision goal. The resulting scores will be expressed as percentages and will indicate the alternative medications' relative abilities to fulfill the decision goal. The fifth step will consist of sensitivity analyses to explore the effects of changing the estimates. We will also conduct a cognitive interview and process evaluation.  Multi-criteria decision analysis using the AHP will aid, support and enhance the ability of decision makers to make evidence-based informed decisions consistent with their values and preferences.

  4. Metaphase II oocytes from human unilaminar follicles grown in a multi-step culture system.

    PubMed

    McLaughlin, M; Albertini, D F; Wallace, W H B; Anderson, R A; Telfer, E E

    2018-03-01

    Can complete oocyte development be achieved from human ovarian tissue containing primordial/unilaminar follicles and grown in vitro in a multi-step culture to meiotic maturation demonstrated by the formation of polar bodies and a Metaphase II spindle? Development of human oocytes from primordial/unilaminar stages to resumption of meiosis (Metaphase II) and emission of a polar body was achieved within a serum free multi-step culture system. Complete development of oocytes in vitro has been achieved in mouse, where in vitro grown (IVG) oocytes from primordial follicles have resulted in the production of live offspring. Human oocytes have been grown in vitro from the secondary/multi-laminar stage to obtain fully grown oocytes capable of meiotic maturation. However, there are no reports of a culture system supporting complete growth from the earliest stages of human follicle development through to Metaphase II. Ovarian cortical biopsies were obtained with informed consent from women undergoing elective caesarean section (mean age: 30.7 ± 1.7; range: 25-39 years, n = 10). Laboratory setting. Ovarian biopsies were dissected into thin strips, and after removal of growing follicles were cultured in serum free medium for 8 days (Step 1). At the end of this period secondary/multi-laminar follicles were dissected from the strips and intact follicles 100-150 μm in diameter were selected for further culture. Isolated follicles were cultured individually in serum free medium in the presence of 100 ng/ml of human recombinant Activin A (Step 2). Individual follicles were monitored and after 8 days, cumulus oocyte complexes (COCs) were retrieved by gentle pressure on the cultured follicles. Complexes with complete cumulus and adherent mural granulosa cells were selected and cultured in the presence of Activin A and FSH on membranes for a further 4 days (Step 3). 
At the end of Step 3, complexes containing oocytes >100 μm in diameter were selected for IVM in SAGE medium (Step 4) and then fixed for analysis. Pieces of human ovarian cortex cultured in serum free medium for 8 days (Step 1) supported early follicle growth, and 87 secondary follicles of diameter 120 ± 6 μm (mean ± SEM) could be dissected for further culture. After a further 8 days, 54 of the 87 follicles had reached the antral stage of development. COCs were retrieved by gentle pressure from the cultured follicles and those with adherent mural granulosa cells (n = 48) were selected and cultured for a further 4 days (Step 3). At the end of Step 3, 32 complexes containing oocytes >100 μm in diameter were selected for IVM (Step 4). Nine of these complexes contained polar bodies within 24 h and all polar bodies were abnormally large. Confocal immuno-histochemical analysis showed the presence of a Metaphase II spindle, confirming that these IVG oocytes had resumed meiosis, but their developmental potential is unknown. This is a small number of samples but provides proof of concept that complete development of human oocytes can occur in vitro. Further optimization with morphological evaluation and fertilization potential of IVG oocytes is required to determine whether they are normal. The ability to develop human oocytes from the earliest follicular stages in vitro through to maturation and fertilization would benefit fertility preservation practice. Funded by MRC Grants (G0901839 and MR/L00299X/1). No competing interests.

  5. The eClinical Care Pathway Framework: a novel structure for creation of online complex clinical care pathways and its application in the management of sexually transmitted infections.

    PubMed

    Gibbs, Jo; Sutcliffe, Lorna J; Gkatzidou, Voula; Hone, Kate; Ashcroft, Richard E; Harding-Esch, Emma M; Lowndes, Catherine M; Sadiq, S Tariq; Sonnenberg, Pam; Estcourt, Claudia S

    2016-07-22

    Despite considerable international eHealth impetus, there is no guidance on the development of online clinical care pathways. Advances in diagnostics now enable self-testing with home diagnosis, to which comprehensive online clinical care could be linked, facilitating completely self-directed, remote care. We describe a new framework for developing complex online clinical care pathways and its application to clinical management of people with genital chlamydia infection, the commonest sexually transmitted infection (STI) in England. Using the existing evidence-base, guidelines and examples from contemporary clinical practice, we developed the eClinical Care Pathway Framework, a nine-step iterative process. Step 1: define the aims of the online pathway; Step 2: define the functional units; Step 3: draft the clinical consultation; Step 4: expert review; Step 5: cognitive testing; Step 6: user-centred interface testing; Step 7: specification development; Step 8: software testing, usability testing and further comprehension testing; Step 9: piloting. We then applied the Framework to create a chlamydia online clinical care pathway (Online Chlamydia Pathway). Use of the Framework elucidated content and structure of the care pathway and identified the need for significant changes in sequences of care (Traditional: history, diagnosis, information versus Online: diagnosis, information, history) and prescribing safety assessment. The Framework met the needs of complex STI management and enabled development of a multi-faceted, fully-automated consultation. The Framework provides a comprehensive structure on which complex online care pathways such as those needed for STI management, which involve clinical services, public health surveillance functions and third party (sexual partner) management, can be developed to meet national clinical and public health standards. 
The Online Chlamydia Pathway's standardised method of collecting data on demographics and sexual behaviour, with potential for interoperability with surveillance systems, could be a powerful tool for public health and clinical management.

  6. An integrated approach for the knowledge discovery in computer simulation models with a multi-dimensional parameter space

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Khawli, Toufik Al; Eppelt, Urs; Hermanns, Torsten

    2016-06-08

In production industries, parameter identification, sensitivity analysis and multi-dimensional visualization are vital steps in the planning process for achieving optimal designs and gaining valuable information. Sensitivity analysis and visualization can help in identifying the most-influential parameters and quantify their contribution to the model output, reduce the model complexity, and enhance the understanding of the model behavior. Typically, this requires a large number of simulations, which can be both very expensive and time consuming when the simulation models are numerically complex and the number of parameter inputs increases. There are three main constituent parts in this work. The first part is to substitute the numerical, physical model by an accurate surrogate model, the so-called metamodel. The second part includes a multi-dimensional visualization approach for the visual exploration of metamodels. In the third part, the metamodel is used to provide the two global sensitivity measures: i) the Elementary Effect for screening the parameters, and ii) the variance decomposition method for calculating the Sobol indices that quantify both the main and interaction effects. The application of the proposed approach is illustrated with an industrial application with the goal of optimizing a drilling process using a Gaussian laser beam.

  7. An integrated approach for the knowledge discovery in computer simulation models with a multi-dimensional parameter space

    NASA Astrophysics Data System (ADS)

    Khawli, Toufik Al; Gebhardt, Sascha; Eppelt, Urs; Hermanns, Torsten; Kuhlen, Torsten; Schulz, Wolfgang

    2016-06-01

    In production industries, parameter identification, sensitivity analysis and multi-dimensional visualization are vital steps in the planning process for achieving optimal designs and gaining valuable information. Sensitivity analysis and visualization can help in identifying the most-influential parameters and quantify their contribution to the model output, reduce the model complexity, and enhance the understanding of the model behavior. Typically, this requires a large number of simulations, which can be both very expensive and time consuming when the simulation models are numerically complex and the number of parameter inputs increases. There are three main constituent parts in this work. The first part is to substitute the numerical, physical model by an accurate surrogate model, the so-called metamodel. The second part includes a multi-dimensional visualization approach for the visual exploration of metamodels. In the third part, the metamodel is used to provide the two global sensitivity measures: i) the Elementary Effect for screening the parameters, and ii) the variance decomposition method for calculating the Sobol indices that quantify both the main and interaction effects. The application of the proposed approach is illustrated with an industrial application with the goal of optimizing a drilling process using a Gaussian laser beam.
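
    The two sensitivity measures named above are standard global methods; the first-order Sobol computation, for instance, can be sketched with a Monte Carlo pick-freeze estimator. The additive toy function below merely stands in for the metamodel (the paper's laser-drilling surrogate is not reproduced here):

```python
import numpy as np

rng = np.random.default_rng(0)

def surrogate(X):
    # Toy additive stand-in for a metamodel; NOT the paper's model.
    return 4.0 * X[:, 0] + 1.0 * X[:, 1] + 0.1 * X[:, 2]

def first_order_sobol(f, d, n=200_000):
    """Saltelli-style pick-freeze estimate of first-order Sobol
    indices for d inputs uniform on [0, 1]."""
    A, B = rng.random((n, d)), rng.random((n, d))
    fA, fB = f(A), f(B)
    var = np.concatenate([fA, fB]).var()
    S = np.empty(d)
    for i in range(d):
        ABi = A.copy()
        ABi[:, i] = B[:, i]              # swap in column i only
        S[i] = np.mean(fB * (f(ABi) - fA)) / var
    return S

S = first_order_sobol(surrogate, d=3)    # S[0] dominates, as expected
```

    For a purely additive model like this one, the first-order indices should sum to roughly 1; interaction effects would show up as a shortfall.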

  8. Heat damaged forages: effects on forage energy content

    USDA-ARS?s Scientific Manuscript database

    Traditionally, educational materials describing the effects of heat damage within baled hays have focused on reduced bioavailability of crude protein as a result of Maillard reactions. These reactions are not simple, but actually occur in complex, multi-step pathways. Typically, the initial step inv...

  9. Reducing workpieces to their base geometry for multi-step incremental forming using manifold harmonics

    NASA Astrophysics Data System (ADS)

    Carette, Yannick; Vanhove, Hans; Duflou, Joost

    2018-05-01

    Single Point Incremental Forming is a flexible process that is well-suited for small batch production and rapid prototyping of complex sheet metal parts. The distributed nature of the deformation process and the unsupported sheet imply that controlling the final accuracy of the workpiece is challenging. To improve the process limits and the accuracy of SPIF, the use of multiple forming passes has been proposed and discussed by a number of authors. Most methods use multiple intermediate models, where the previous one is strictly smaller than the next one, while gradually increasing the workpieces' wall angles. Another method that can be used is the manufacture of a smoothed-out "base geometry" in the first pass, after which more detailed features can be added in subsequent passes. In both methods, the selection of these intermediate shapes is freely decided by the user. However, their practical implementation in the production of complex freeform parts is not straightforward. The original CAD model can be manually adjusted or completely new CAD models can be created. This paper discusses an automatic method that is able to extract the base geometry from a full STL-based CAD model in an analytical way. Harmonic decomposition is used to express the final geometry as the sum of individual surface harmonics. It is then possible to filter these harmonic contributions to obtain a new CAD model with a desired level of geometric detail. This paper explains the technique and its implementation, as well as its use in the automatic generation of multi-step geometries.
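
    The harmonic decomposition described above can be illustrated in reduced form. The sketch below low-pass filters a closed 1D profile through the eigenvectors of its graph Laplacian, the same idea the paper applies to a full STL mesh; the profile and the cutoff are invented for illustration:

```python
import numpy as np

# Closed 1D profile standing in for an STL surface: a smooth base
# geometry plus a high-frequency detail feature.
n = 256
t = np.linspace(0, 2 * np.pi, n, endpoint=False)
base = 10.0 * np.sin(t)            # smooth "base geometry"
detail = 0.5 * np.sin(40 * t)      # fine feature to be filtered out
z = base + detail

# Graph Laplacian of the cycle graph; its eigenvectors are the
# discrete harmonics of the closed profile.
I = np.eye(n)
L = 2 * I - np.roll(I, 1, axis=0) - np.roll(I, -1, axis=0)
evals, evecs = np.linalg.eigh(L)   # eigenvalues sorted low to high

k = 20                             # keep only the k lowest harmonics
z_base = evecs[:, :k] @ (evecs[:, :k].T @ z)

# z_base now tracks the smooth base and drops the fine detail.
```

    Varying the cutoff k plays the role of the paper's filtering of harmonic contributions to obtain intermediate models with a chosen level of geometric detail.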

  10. Multi-User Spaceport Update News Conference

    NASA Image and Video Library

    2014-01-23

CAPE CANAVERAL, Fla. – Frank DiBello, right, president and CEO of Space Florida, joins Sierra Nevada Corporation (SNC) Space Systems as the company announces the steps it will take to prepare for a November 2016 orbital flight of its Dream Chaser spacecraft from Florida’s Space Coast. The steps are considered substantial for SNC and important to plans by NASA and Space Florida for Kennedy Space Center’s transformation into a multi-user spaceport for both commercial and government customers. SNC said it plans to work with United Launch Alliance (ULA) to launch the Dream Chaser spacecraft into orbit atop an Atlas V rocket from Space Launch Complex 41 at Cape Canaveral Air Force Station; intends to land the winged spacecraft at Kennedy’s 3.5-mile-long runway at the Shuttle Landing Facility; to lease office space at Exploration Park, right outside Kennedy’s gates; and to process the spacecraft in the high bay of the Operations and Checkout Building at Kennedy, with Lockheed Martin performing the work. Photo credit: NASA/Kim Shiflett

  11. Process Simulation of Aluminium Sheet Metal Deep Drawing at Elevated Temperatures

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Winklhofer, Johannes; Trattnig, Gernot; Lind, Christoph

Lightweight design is essential for an economic and environmentally friendly vehicle. Aluminium sheet metal is well known for its ability to improve the strength to weight ratio of lightweight structures. One disadvantage of aluminium is that it is less formable than steel. Therefore complex part geometries can only be realized by expensive multi-step production processes. One method for overcoming this disadvantage is deep drawing at elevated temperatures. In this way the formability of aluminium sheet metal can be improved significantly, and the number of necessary production steps can thereby be reduced. This paper introduces deep drawing of aluminium sheet metal at elevated temperatures, a corresponding simulation method, a characteristic process and its optimization. The temperature and strain rate dependent material properties of a 5xxx series alloy and their modelling are discussed. A three dimensional thermomechanically coupled finite element deep drawing simulation model and its validation are presented. Based on the validated simulation model an optimised process strategy regarding formability, time and cost is introduced.

  12. Abundance and composition of indigenous bacterial communities in a multi-step biofiltration-based drinking water treatment plant.

    PubMed

    Lautenschlager, Karin; Hwang, Chiachi; Ling, Fangqiong; Liu, Wen-Tso; Boon, Nico; Köster, Oliver; Egli, Thomas; Hammes, Frederik

    2014-10-01

Indigenous bacterial communities are essential for biofiltration processes in drinking water treatment systems. In this study, we examined the microbial community composition and abundance of three different biofilter types (rapid sand, granular activated carbon, and slow sand filters) and their respective effluents in a full-scale, multi-step treatment plant (Zürich, CH). Detailed analysis of organic carbon degradation underpinned biodegradation as the primary function of the biofilter biomass. The biomass was present at concentrations of 2-5 × 10^15 cells/m^3 in all filters but was phylogenetically, enzymatically and metabolically diverse. Based on 16S rRNA gene-based 454 pyrosequencing analysis of microbial community composition, similar microbial taxa (predominantly Proteobacteria, Planctomycetes, Acidobacteria, Bacteroidetes, Nitrospira and Chloroflexi) were present in all biofilters and in their respective effluents, but the ratio of microbial taxa was different in each filter type. This change was also reflected in the cluster analysis, which revealed a change of 50-60% in microbial community composition between the different filter types. This study documents the direct influence of the filter biomass on the microbial community composition of the final drinking water, particularly when the water is distributed without post-disinfection. The results provide new insights into the complexity of indigenous bacteria colonizing drinking water systems, especially in the different biofilters of a multi-step treatment plant. Copyright © 2014 Elsevier Ltd. All rights reserved.
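
    The 50-60% community shift reported by the cluster analysis is the kind of figure a community-dissimilarity metric yields; Bray-Curtis dissimilarity is one common choice for such comparisons. The abundance profiles below are invented for illustration and are not the study's pyrosequencing data:

```python
import numpy as np

def bray_curtis(u, v):
    """Bray-Curtis dissimilarity between two abundance profiles
    (0 = identical communities, 1 = no shared taxa)."""
    u, v = np.asarray(u, float), np.asarray(v, float)
    return np.abs(u - v).sum() / (u + v).sum()

# Hypothetical relative abundances of six taxa in two filter types.
rapid_sand = [0.40, 0.25, 0.15, 0.10, 0.05, 0.05]
slow_sand  = [0.10, 0.20, 0.30, 0.15, 0.15, 0.10]

d = bray_curtis(rapid_sand, slow_sand)  # fraction of community shifted
```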

  13. Toward a Model Framework of Generalized Parallel Componential Processing of Multi-Symbol Numbers

    ERIC Educational Resources Information Center

    Huber, Stefan; Cornelsen, Sonja; Moeller, Korbinian; Nuerk, Hans-Christoph

    2015-01-01

    In this article, we propose and evaluate a new model framework of parallel componential multi-symbol number processing, generalizing the idea of parallel componential processing of multi-digit numbers to the case of negative numbers by considering the polarity signs similar to single digits. In a first step, we evaluated this account by defining…

  14. Controllable 3D architectures of aligned carbon nanotube arrays by multi-step processes

    NASA Astrophysics Data System (ADS)

    Huang, Shaoming

    2003-06-01

An effective way to fabricate large-area three-dimensional (3D) aligned CNT patterns based on pyrolysis of iron(II) phthalocyanine (FePc) by two-step processes is reported. The controllable generation of different lengths and the selective growth of aligned CNT arrays on metal-patterned (e.g., Ag and Au) substrates are the bases for generating such 3D aligned CNT architectures. By controlling the experimental conditions, 3D aligned CNT arrays with different lengths/densities and morphologies/structures, as well as multi-layered architectures, can be fabricated at large scale by multi-step pyrolysis of FePc. These 3D architectures could have interesting properties and be applied in developing novel nanotube-based devices.

  15. Biomimetic surface structuring using cylindrical vector femtosecond laser beams

    PubMed Central

    Skoulas, Evangelos; Manousaki, Alexandra; Fotakis, Costas; Stratakis, Emmanuel

    2017-01-01

We report on a new, single-step and scalable method to fabricate highly ordered, multi-directional and complex surface structures that mimic the unique morphological features of certain species found in nature. Biomimetic surface structuring was realized by exploiting the unique and versatile angular profile and the electric-field symmetry of cylindrical vector (CV) femtosecond (fs) laser beams. It is shown that highly controllable periodic structures with sizes at the nano-, micro- and dual micro/nano scales can be directly written on Ni upon line and large-area scanning with radially and azimuthally polarized beams. Depending on the irradiation conditions, new complex multi-directional nanostructures inspired by the shark's skin morphology, as well as superhydrophobic dual-scale structures mimicking the lotus leaf's water-repellent properties, can be attained. It is concluded that the versatility and variety of the structures formed are by far superior to those obtained via laser processing with linearly polarized beams. More importantly, by exploiting the capabilities offered by fs CV fields, the present technique can be further extended to fabricate even more complex and unconventional structures. We believe that our approach provides a new concept in laser materials processing, which can be further exploited to expand the breadth and novelty of its applications. PMID:28327611

  16. Data-based control of a multi-step forming process

    NASA Astrophysics Data System (ADS)

    Schulte, R.; Frey, P.; Hildenbrand, P.; Vogel, M.; Betz, C.; Lechner, M.; Merklein, M.

    2017-09-01

The fourth industrial revolution represents a new stage in the organization and management of the entire value chain. In the field of forming technology, however, the fourth industrial revolution has so far arrived only gradually. In order to make a valuable contribution to the digital factory, the control of a multi-stage forming process was investigated. Within the framework of the investigation, an abstracted and transferable model is used to outline which data have to be collected, how a practical interface between the different forming machines can be designed, and which control tasks must be fulfilled. The goal of this investigation was to control the subsequent process step based on the data recorded in the first step. The investigated process chain links various metal forming processes, which are typical elements of a multi-step forming process. Data recorded in the first step of the process chain are analyzed and processed for improved control of the subsequent process. On the basis of the scientific knowledge gained, forming operations can be made more robust and at the same time more flexible, creating the foundation for linking various production processes in an efficient way.
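
    The control principle, adjusting the subsequent step from data recorded in the first step, can be reduced to a minimal numerical sketch. The linear process model and the scatter figures below are invented for illustration, not the paper's actual forming machines or measured data:

```python
import numpy as np

rng = np.random.default_rng(1)

def step1(n_parts):
    # Forming step 1: nominal output of 2.0 mm with part-to-part scatter.
    return 2.0 + 0.05 * rng.standard_normal(n_parts)

def step2(y1, u2):
    # Toy linear model of step 2: final dimension = incoming dimension + setting.
    return y1 + u2

target = 2.5
y1 = step1(1000)                       # data recorded after step 1

open_loop  = step2(y1, 0.5)            # fixed setting: scatter is inherited
controlled = step2(y1, target - y1)    # setting corrected per part from y1 data
```

    With the per-part correction, the scatter introduced in step 1 no longer propagates into the final dimension, which is the robustness gain the abstract describes.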

  17. Complex supramolecular interfacial tessellation through convergent multi-step reaction of a dissymmetric simple organic precursor

    NASA Astrophysics Data System (ADS)

    Zhang, Yi-Qi; Paszkiewicz, Mateusz; Du, Ping; Zhang, Liding; Lin, Tao; Chen, Zhi; Klyatskaya, Svetlana; Ruben, Mario; Seitsonen, Ari P.; Barth, Johannes V.; Klappenberger, Florian

    2018-03-01

    Interfacial supramolecular self-assembly represents a powerful tool for constructing regular and quasicrystalline materials. In particular, complex two-dimensional molecular tessellations, such as semi-regular Archimedean tilings with regular polygons, promise unique properties related to their nontrivial structures. However, their formation is challenging, because current methods are largely limited to the direct assembly of precursors, that is, where structure formation relies on molecular interactions without using chemical transformations. Here, we have chosen ethynyl-iodophenanthrene (which features dissymmetry in both geometry and reactivity) as a single starting precursor to generate the rare semi-regular (3.4.6.4) Archimedean tiling with long-range order on an atomically flat substrate through a multi-step reaction. Intriguingly, the individual chemical transformations converge to form a symmetric alkynyl-Ag-alkynyl complex as the new tecton in high yields. Using a combination of microscopy and X-ray spectroscopy tools, as well as computational modelling, we show that in situ generated catalytic Ag complexes mediate the tecton conversion.

  18. Self-regenerating column chromatography

    DOEpatents

    Park, Woo K.

    1995-05-30

    The present invention provides a process for treating both cations and anions by using a self-regenerating, multi-ionic exchange resin column system which requires no separate regeneration steps. The process involves alternating ion-exchange chromatography for cations and anions in a multi-ionic exchange column packed with a mixture of cation and anion exchange resins. The multi-ionic mixed-charge resin column works as a multi-function column, capable of independently processing either cationic or anionic exchange, or simultaneously processing both cationic and anionic exchanges. The major advantage offered by the alternating multi-function ion exchange process is the self-regeneration of the resins.

  19. Molecular Cloning Designer Simulator (MCDS): All-in-one molecular cloning and genetic engineering design, simulation and management software for complex synthetic biology and metabolic engineering projects.

    PubMed

    Shi, Zhenyu; Vickers, Claudia E

    2016-12-01

    Molecular Cloning Designer Simulator (MCDS) is a powerful new all-in-one cloning and genetic engineering design, simulation and management software platform developed for complex synthetic biology and metabolic engineering projects. In addition to standard functions, it has a number of features that are either unique, or are not found in combination in any one software package: (1) it has a novel interactive flow-chart user interface for complex multi-step processes, allowing an integrated overview of the whole project; (2) it can perform a user-defined workflow of cloning steps in a single execution of the software; (3) it can handle multiple types of genetic recombineering, a technique that is rapidly replacing classical cloning for many applications; (4) it includes experimental information to conveniently guide wet lab work; and (5) it can store results and comments to allow the tracking and management of the whole project in one platform. MCDS is freely available from https://mcds.codeplex.com.

  20. Synthetic Biology for Cell-Free Biosynthesis: Fundamentals of Designing Novel In Vitro Multi-Enzyme Reaction Networks.

    PubMed

    Morgado, Gaspar; Gerngross, Daniel; Roberts, Tania M; Panke, Sven

    Cell-free biosynthesis in the form of in vitro multi-enzyme reaction networks or enzyme cascade reactions emerges as a promising tool to carry out complex catalysis in one-step, one-vessel settings. It combines the advantages of well-established in vitro biocatalysis with the power of multi-step in vivo pathways. Such cascades have been successfully applied to the synthesis of fine and bulk chemicals, monomers and complex polymers of chemical importance, and energy molecules from renewable resources as well as electricity. The scale of these initial attempts remains small, suggesting that more robust control of such systems and more efficient optimization are currently major bottlenecks. To this end, the very nature of enzyme cascade reactions as multi-membered systems requires novel approaches for implementation and optimization, some of which can be obtained from in vivo disciplines (such as pathway refactoring and DNA assembly), and some of which can be built on the unique, cell-free properties of cascade reactions (such as easy analytical access to all system intermediates to facilitate modeling).
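
    The modeling advantage mentioned at the end, that all cascade intermediates are analytically accessible, can be illustrated with a minimal kinetic sketch: a two-enzyme Michaelis-Menten cascade S → I → P integrated by forward Euler. The rate constants are illustrative, not measurements of any real cascade:

```python
# Two-enzyme cascade S -(E1)-> I -(E2)-> P with Michaelis-Menten kinetics.
Vmax1, Km1 = 1.0, 0.5    # illustrative constants for enzyme 1
Vmax2, Km2 = 0.8, 0.3    # illustrative constants for enzyme 2

S, I, P = 1.0, 0.0, 0.0  # initial concentrations (arbitrary units)
dt = 1e-3
for _ in range(int(20.0 / dt)):       # integrate to t = 20
    v1 = Vmax1 * S / (Km1 + S)        # rate of the first enzymatic step
    v2 = Vmax2 * I / (Km2 + I)        # rate of the second step
    S -= v1 * dt
    I += (v1 - v2) * dt
    P += v2 * dt

# Unlike in an in vivo pathway, S, I and P are all directly observable
# here, and mass is conserved (S + I + P stays at the initial 1.0).
```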

  1. Improved perovskite phototransistor prepared using multi-step annealing method

    NASA Astrophysics Data System (ADS)

    Cao, Mingxuan; Zhang, Yating; Yu, Yu; Yao, Jianquan

    2018-02-01

Organic-inorganic hybrid perovskites with good intrinsic physical properties have received substantial interest for solar cell and optoelectronic applications. However, perovskite films often suffer from low carrier mobility due to structural imperfections, including sharp grain boundaries and pinholes, restricting their device performance and application potential. Here we demonstrate a straightforward strategy, based on a multi-step annealing process, to improve the performance of perovskite photodetectors. Annealing temperature and duration greatly affect the surface morphology and optoelectrical properties of the perovskite, which determine the device properties of the phototransistor. Perovskite films treated with the multi-step annealing method tend to be highly uniform, well-crystallized and of high surface coverage, and exhibit stronger ultraviolet-visible absorption and photoluminescence compared to perovskites prepared by the conventional one-step annealing process. The field-effect mobility of a perovskite photodetector treated by the one-step direct annealing method is 0.121 (0.062) cm^2 V^-1 s^-1 for holes (electrons), which increases to 1.01 (0.54) cm^2 V^-1 s^-1 with the multi-step slow annealing method. Moreover, the perovskite phototransistors exhibit a fast photoresponse speed of 78 μs. In general, this work focuses on the influence of annealing methods on perovskite phototransistors rather than on obtaining their optimal parameters. These findings show that multi-step annealing is a feasible route to high-performance perovskite-based photodetectors.

  2. Applying macromolecular crowding to 3D bioprinting: fabrication of 3D hierarchical porous collagen-based hydrogel constructs.

    PubMed

    Ng, Wei Long; Goh, Min Hao; Yeong, Wai Yee; Naing, May Win

    2018-02-27

    Native tissues and/or organs possess complex hierarchical porous structures that confer highly-specific cellular functions. Despite advances in fabrication processes, it is still very challenging to emulate the hierarchical porous collagen architecture found in most native tissues. Hence, the ability to recreate such hierarchical porous structures would result in biomimetic tissue-engineered constructs. Here, a single-step drop-on-demand (DOD) bioprinting strategy is proposed to fabricate hierarchical porous collagen-based hydrogels. Printable macromolecule-based bio-inks (polyvinylpyrrolidone, PVP) have been developed and printed in a DOD manner to manipulate the porosity within the multi-layered collagen-based hydrogels by altering the collagen fibrillogenesis process. The experimental results have indicated that hierarchical porous collagen structures could be achieved by controlling the number of macromolecule-based bio-ink droplets printed on each printed collagen layer. This facile single-step bioprinting process could be useful for the structural design of collagen-based hydrogels for various tissue engineering applications.

  3. The Study of Residential Areas Extraction Based on GF-3 Texture Image Segmentation

    NASA Astrophysics Data System (ADS)

    Shao, G.; Luo, H.; Tao, X.; Ling, Z.; Huang, Y.

    2018-04-01

The study chooses standard-stripe, dual-polarization SAR images from GF-3 as the basic data. Residential-area extraction processes and methods based on texture segmentation of GF-3 images are compared and analyzed. Preprocessing of the GF-3 images includes radiometric calibration, complex-data conversion, multi-look processing and image filtering; a suitability analysis of different filtering methods shows that the Kuan filter is effective for extracting residential areas. We then calculated and analyzed texture feature vectors using the GLCM (Gray Level Co-occurrence Matrix), whose parameters include the moving-window size, step size and angle; the results show that a window size of 11*11, a step of 1 and an angle of 0° are effective and optimal for extracting residential areas. Using the FNEA (Fractal Net Evolution Approach), we segmented the GLCM texture images and extracted the residential areas by thresholding. The extraction result was verified and assessed with a confusion matrix: overall accuracy is 0.897 and kappa is 0.881. We also extracted residential areas from the GF-3 images by SVM classification; its overall accuracy is 0.09 lower than that of the texture-segmentation method. We conclude that residential-area extraction based on multi-scale segmentation of GF-3 SAR texture images is simple and highly accurate. Since it is difficult to obtain multi-spectral remote sensing images in southern China, which is cloudy and rainy throughout the year, this approach has particular reference value for such regions.
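
    The GLCM parameters reported above (window 11*11, step 1, angle 0°) reduce, within each window, to counting co-occurring gray-level pairs at a fixed pixel offset. A minimal sketch on a toy 4-level image (the image values are illustrative, not GF-3 data):

```python
import numpy as np

def glcm(img, dx, dy, levels):
    """Gray Level Co-occurrence Matrix for pixel offset (dx, dy);
    a step of 1 at angle 0 degrees corresponds to (dx, dy) = (1, 0)."""
    M = np.zeros((levels, levels), dtype=np.int64)
    h, w = img.shape
    for y in range(h):
        for x in range(w):
            x2, y2 = x + dx, y + dy
            if 0 <= x2 < w and 0 <= y2 < h:
                M[img[y, x], img[y2, x2]] += 1
    return M

img = np.array([[0, 0, 1, 1],
                [0, 0, 1, 1],
                [0, 2, 2, 2],
                [2, 2, 3, 3]])
M = glcm(img, dx=1, dy=0, levels=4)

# Texture features such as contrast are then derived from the matrix:
idx = np.arange(4)
contrast = (M * (idx[:, None] - idx[None, :]) ** 2).sum() / M.sum()
```

    In the paper's workflow this computation would run per 11*11 sliding window to build the texture feature images that are then segmented.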

  4. Failure Analysis of a Complex Learning Framework Incorporating Multi-Modal and Semi-Supervised Learning

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pullum, Laura L; Symons, Christopher T

    2011-01-01

    Machine learning is used in many applications, from machine vision to speech recognition to decision support systems, and is used to test applications. However, though much has been done to evaluate the performance of machine learning algorithms, little has been done to verify the algorithms or examine their failure modes. Moreover, complex learning frameworks often require stepping beyond black box evaluation to distinguish between errors based on natural limits on learning and errors that arise from mistakes in implementation. We present a conceptual architecture, failure model and taxonomy, and failure modes and effects analysis (FMEA) of a semi-supervised, multi-modal learning system, and provide specific examples from its use in a radiological analysis assistant system. The goal of the research described in this paper is to provide a foundation from which dependability analysis of systems using semi-supervised, multi-modal learning can be conducted. The methods presented provide a first step towards that overall goal.

  5. Multi-step process for concentrating magnetic particles in waste sludges

    DOEpatents

    Watson, John L.

    1990-01-01

    This invention involves a multi-step, multi-force process for dewatering sludges which have high concentrations of magnetic particles, such as waste sludges generated during steelmaking. This series of processing steps involves (1) mixing a chemical flocculating agent with the sludge; (2) allowing the particles to aggregate under non-turbulent conditions; (3) subjecting the mixture to a magnetic field which will pull the magnetic aggregates in a selected direction, causing them to form a compacted sludge; (4) preferably, decanting the clarified liquid from the compacted sludge; and (5) using filtration to convert the compacted sludge into a cake having a very high solids content. Steps 2 and 3 should be performed simultaneously. This reduces the treatment time and increases the extent of flocculation and the effectiveness of the process. As partially formed aggregates with active flocculating groups are pulled through the mixture by the magnetic field, they will contact other particles and form larger aggregates. This process can increase the solids concentration of steelmaking sludges in an efficient and economic manner, thereby accomplishing either of two goals: (a) it can convert hazardous wastes into economic resources for recycling as furnace feed material, or (b) it can dramatically reduce the volume of waste material which must be disposed.

  6. Multi-step process for concentrating magnetic particles in waste sludges

    DOEpatents

    Watson, J.L.

    1990-07-10

    This invention involves a multi-step, multi-force process for dewatering sludges which have high concentrations of magnetic particles, such as waste sludges generated during steelmaking. This series of processing steps involves (1) mixing a chemical flocculating agent with the sludge; (2) allowing the particles to aggregate under non-turbulent conditions; (3) subjecting the mixture to a magnetic field which will pull the magnetic aggregates in a selected direction, causing them to form a compacted sludge; (4) preferably, decanting the clarified liquid from the compacted sludge; and (5) using filtration to convert the compacted sludge into a cake having a very high solids content. Steps 2 and 3 should be performed simultaneously. This reduces the treatment time and increases the extent of flocculation and the effectiveness of the process. As partially formed aggregates with active flocculating groups are pulled through the mixture by the magnetic field, they will contact other particles and form larger aggregates. This process can increase the solids concentration of steelmaking sludges in an efficient and economic manner, thereby accomplishing either of two goals: (a) it can convert hazardous wastes into economic resources for recycling as furnace feed material, or (b) it can dramatically reduce the volume of waste material which must be disposed. 7 figs.

  7. Reynolds-averaged Navier-Stokes based ice accretion for aircraft wings

    NASA Astrophysics Data System (ADS)

    Lashkajani, Kazem Hasanzadeh

    This thesis addresses one of the current issues in flight safety towards increasing icing simulation capabilities for prediction of complex 2D and 3D glaze ice shapes over aircraft surfaces. During the 1980s and 1990s, the field of aero-icing was established to support design and certification of aircraft flying in icing conditions. The multidisciplinary technologies used in such codes were: aerodynamics (panel method), droplet trajectory calculations (Lagrangian framework), a thermodynamic module (Messinger model) and a geometry module (ice accretion). These are embedded in a quasi-steady module to simulate the time-dependent ice accretion process (multi-step procedure). The objective of the present research is to upgrade the aerodynamic module from a Laplace solver to a Reynolds-Averaged Navier-Stokes (RANS) solver. The advantages are many. First, the physical model allows accounting for viscous effects in the aerodynamic module. Second, the solution of the aero-icing module directly provides the means for characterizing the aerodynamic effects of icing, such as loss of lift and increased drag. Third, the use of a finite volume approach to solving the Partial Differential Equations allows rigorous mesh and time convergence analysis. Finally, the approaches developed in 2D can be easily transposed to 3D problems. The research was performed in three major steps, each providing insights into the overall numerical approaches. The most important realization comes from the need to develop specific mesh generation algorithms to ensure feasible solutions in very complex multi-step aero-icing calculations. The contributions are presented in chronological order of their realization. First, a new framework for a RANS-based two-dimensional ice accretion code, CANICE2D-NS, is developed. A multi-block RANS code from the U. of Liverpool (named PMB) provides the aerodynamic field using the Spalart-Allmaras turbulence model. 
The ICEM-CFD commercial tool is used for the iced airfoil remeshing and field smoothing. The new coupling is fully automated and capable of multi-step ice accretion simulations via a quasi-steady approach. In addition, the framework allows for flow analysis and aerodynamic performance prediction of the iced airfoils. The convergence of the quasi-steady algorithm is verified, identifying the need for an order of magnitude increase in the number of multi-time steps in icing simulations to achieve solver-independent solutions. Second, a multi-block Navier-Stokes code, NSMB, is coupled with the CANICE2D icing framework. Attention is paid to the implementation of the ONERA roughness model within the Spalart-Allmaras turbulence model, and to the convergence of the steady and quasi-steady iterative procedure. Effects of uniform surface roughness in quasi-steady ice accretion simulation are analyzed through different validation test cases. The results of CANICE2D-NS show good agreement with experimental data, both in terms of predicted ice shapes and in the aerodynamic analysis of predicted and experimental ice shapes. Third, an efficient single-block structured Navier-Stokes CFD code, NSCODE, is coupled with the CANICE2D-NS icing framework. Attention is paid to the implementation of the Boeing roughness model within the Spalart-Allmaras turbulence model, and to acceleration of the convergence of the steady and quasi-steady iterative procedures. Effects of uniform surface roughness in quasi-steady ice accretion simulation are analyzed through different validation test cases, including code-to-code comparisons with the same framework coupled with the NSMB Navier-Stokes solver. The efficiency of the J-multigrid approach to solve the flow equations on complex iced geometries is demonstrated. 
Since it was noted in all these calculations that the ICEM-CFD grid generation package produced a number of issues, such as poor mesh quality and smoothing deficiencies (notably grid shocks), a fourth study proposes a new mesh generation algorithm. A PDE-based multi-block structured grid generation code, NSGRID, is developed for this purpose. The study includes the development of novel mesh generation algorithms over complex glaze ice shapes containing multi-curvature ice accretion geometries, such as single/double ice horns. The twofold approach tackles surface geometry discretization as well as field mesh generation. An adaptive curvilinear curvature control algorithm is constructed by solving a 1D elliptic PDE with periodic source terms. This method controls the arclength grid spacing so that high convex and concave curvature regions around ice horns are appropriately captured, and is shown to effectively treat the grid shock problem. Then, a novel blended method is developed by defining combinations of source terms with 2D elliptic equations. The source terms include two common control functions, Sorenson and Spekreijse, and an additional third source term to improve orthogonality. This blended method is shown to be very effective for improving grid quality metrics for complex glaze ice meshes with RANS resolution. The performance in terms of residual reduction per non-linear iteration of several solution algorithms (Point-Jacobi, Gauss-Seidel, ADI, Point and Line SOR) is discussed within the context of a full multigrid operator. Details are given on the various formulations used in the linearization process. It is shown that the performance of the solution algorithm depends on the type of control function used. Finally, the algorithms are validated on standard complex experimental ice shapes, demonstrating the applicability of the methods. 
Finally, the automated framework for RANS-based two-dimensional multi-step ice accretion, CANICE2D-NS, is developed, coupled with a multi-block Navier-Stokes CFD code, NSCODE2D, a multi-block elliptic grid generation code, NSGRID2D, and a multi-block Eulerian droplet solver, NSDROP2D (developed at Polytechnique Montreal). The framework allows Lagrangian and Eulerian droplet computations within a chimera approach treating multi-element geometries. The code was tested on public and confidential validation test cases, including standard NATO cases. In addition, up to a 10-times speedup is observed in the mesh generation procedure by using the implicit line SOR and ADI smoothers within a multigrid procedure. The results demonstrate the benefits and robustness of the new framework in predicting ice shapes and aerodynamic performance parameters.
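    The quasi-steady multi-step loop underlying such frameworks can be outlined in code. The four module functions below are hypothetical stand-ins (the thesis couples real RANS, droplet, thermodynamic, and remeshing modules), so only the loop structure reflects the procedure described above:

```python
# Structural sketch of a quasi-steady multi-step icing simulation. All four
# module functions are toy placeholders for the framework's actual solvers.

def solve_flow(ice_thickness):
    """Aerodynamic module (a RANS solver in the real framework)."""
    return {"speed": 1.0}

def droplet_impingement(ice_thickness, flow):
    """Droplet module (Lagrangian or Eulerian in the real framework)."""
    return {"collection_efficiency": 0.5}

def freezing_fraction(flow, impingement):
    """Thermodynamic module (a Messinger-type model in the real framework)."""
    return 0.8

def accrete(ice_thickness, impingement, f_freeze, dt):
    """Geometry module: grow the ice; a real code also remeshes here."""
    return ice_thickness + impingement["collection_efficiency"] * f_freeze * dt

def quasi_steady_icing(ice_thickness, total_time, n_steps):
    dt = total_time / n_steps
    for _ in range(n_steps):       # each multi-time step repeats the chain
        flow = solve_flow(ice_thickness)
        imp = droplet_impingement(ice_thickness, flow)
        f = freezing_fraction(flow, imp)
        ice_thickness = accrete(ice_thickness, imp, f, dt)
    return ice_thickness

print(quasi_steady_icing(0.0, total_time=10.0, n_steps=100))
```

    The convergence study mentioned above amounts to increasing `n_steps` until the predicted shape no longer changes; in this linear toy the answer is step-count independent, which real glaze-ice cases are not.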

  8. Multi-scale modelling of non-uniform consolidation of uncured toughened unidirectional prepregs

    NASA Astrophysics Data System (ADS)

    Sorba, G.; Binetruy, C.; Syerko, E.; Leygue, A.; Comas-Cardona, S.; Belnoue, J. P.-H.; Nixon-Pearson, O. J.; Ivanov, D. S.; Hallett, S. R.; Advani, S. G.

    2018-05-01

    Consolidation is a crucial step in manufacturing of composite parts with prepregs because its role is to eliminate inter- and intra-ply gaps and porosity. Some thermoset prepreg systems are toughened with thermoplastic particles. Depending on their size, thermoplastic particles can be either located in between plies or distributed within the inter-fibre regions. When subjected to transverse compaction, resin will bleed out of low-viscosity unidirectional prepregs along the fibre direction, whereas one would expect transverse squeeze flow to dominate for higher viscosity prepregs. Recent experimental work showed that the consolidation of uncured toughened prepregs involves complex flow and deformation mechanisms where both bleeding and squeeze flow patterns are observed [1]. Micrographs of compacted and cured samples confirm these features as shown in Fig.1. A phenomenological model was proposed [2] where bleeding flow and squeeze flow are combined. A criterion for the transition from shear flow to resin bleeding was also proposed. However, the micrographs also reveal a resin rich layer between plies which may be contributing to the complex flow mechanisms during the consolidation process. In an effort to provide additional insight into these complex mechanisms, this work focuses on the 3D numerical modelling of the compaction of uncured toughened prepregs in the cross-ply configuration described in [1]. A transversely isotropic fluid model is used to describe the flow behaviour of the plies coupled with interplay resin flow of an isotropic fluid. The multi-scale flow model used is based on [3, 4]. A numerical parametric study is carried out where the resin viscosity, permeability and inter-ply thickness are varied to identify the role of important variables. The squeezing flow and the bleeding flow are compared for a range of process parameters to investigate the coupling and competition between the two flow mechanisms. 
Figure 4 shows the predicted displacement of the sample edge with the multi-scale compaction model after one time step [3]. The ply distortion and resin flow observed in Fig.1 is qualitatively retrieved by the computational model.

  9. Diversity in structure and function of tethering complexes: evidence for different mechanisms in vesicular transport regulation.

    PubMed

    Kümmel, D; Heinemann, U

    2008-04-01

    The term 'tethering factor' has been coined for a heterogeneous group of proteins that all are required for protein trafficking prior to vesicle docking and SNARE-mediated membrane fusion. Two groups of tethering factors can be distinguished, long coiled-coil proteins and multi-subunit complexes. To date, eight such protein complexes have been identified in yeast, and they are required for different trafficking steps. Homologous complexes are found in all eukaryotic organisms, but conservation seems to be less strict than for other components of the trafficking machinery. In fact, for most proposed multi-subunit tethers their ability to actually bridge two membranes remains to be shown. Here we discuss recent progress in the structural and functional characterization of tethering complexes and present the emerging view that the different complexes are quite diverse in their structure and the molecular mechanisms underlying their function. TRAPP and the exocyst are the structurally best characterized tethering complexes. Their comparison fails to reveal any similarity on a structural level. Furthermore, the interactions with regulatory Rab GTPases vary, with TRAPP acting as a nucleotide exchange factor and the exocyst being an effector. Considering these differences among the tethering complexes, as well as between their yeast and mammalian orthologs as apparent from recent studies, we suggest that tethering complexes do not mediate a strictly conserved process in vesicular transport but are diverse regulators acting after vesicle budding and prior to membrane fusion.

  10. Multi-User Spaceport Update News Conference

    NASA Image and Video Library

    2014-01-23

    CAPE CANAVERAL, Fla. – Larry Price, Lockheed Martin Space Systems deputy program manager for NASA's Orion spacecraft, joins Sierra Nevada Corporation, or SNC, Space Systems, as the company announces the steps it will take to prepare for a November 2016 orbital flight of its Dream Chaser spacecraft from Florida's Space Coast. The steps are considered substantial for SNC and important to plans by NASA and Space Florida for Kennedy Space Center's transformation into a multi-user spaceport for both commercial and government customers. SNC said it plans to work with United Launch Alliance, or ULA, to launch the Dream Chaser spacecraft into orbit atop an Atlas V rocket from Space Launch Complex 41 at Cape Canaveral Air Force Station; to land the winged spacecraft at Kennedy's 3.5-mile-long runway at the Shuttle Landing Facility; to lease office space at Exploration Park, right outside Kennedy's gates; and to process the spacecraft in the high bay of the Operations and Checkout Building at Kennedy, with Lockheed Martin performing the work. Photo credit: NASA/Kim Shiflett

  11. Multi-frequency complex network from time series for uncovering oil-water flow structure.

    PubMed

    Gao, Zhong-Ke; Yang, Yu-Xuan; Fang, Peng-Cheng; Jin, Ning-De; Xia, Cheng-Yi; Hu, Li-Dan

    2015-02-04

    Uncovering complex oil-water flow structure represents a challenge in diverse scientific disciplines. This challenge stimulates us to develop a new distributed conductance sensor for measuring local flow signals at different positions, and then to propose a novel approach based on multi-frequency complex networks to uncover the flow structures from experimental multivariate measurements. In particular, based on the Fast Fourier transform, we demonstrate how to derive multi-frequency complex networks from multivariate time series. We construct complex networks at different frequencies and then detect community structures. Our results indicate that the community structures faithfully represent the structural features of oil-water flow patterns. Furthermore, we investigate the network statistics at different frequencies for each derived network and find that the frequency clustering coefficient makes it possible to uncover the evolution of flow patterns and yields deep insights into the formation of flow structures. These results present a first step towards a network visualization of complex flow patterns from a community structure perspective.
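    A minimal sketch of the idea, under simplifying assumptions (an ideal FFT band-pass, a fixed correlation threshold for edges, and a basic clustering coefficient; the paper's actual network construction may differ):

```python
import numpy as np

def band_filter(x, fs, f_lo, f_hi):
    """Ideal band-pass: zero all Fourier components outside [f_lo, f_hi)."""
    X = np.fft.rfft(x)
    freqs = np.fft.rfftfreq(x.size, d=1.0 / fs)
    X[(freqs < f_lo) | (freqs >= f_hi)] = 0.0
    return np.fft.irfft(X, n=x.size)

def frequency_network(signals, fs, band, threshold=0.6):
    """Adjacency matrix linking sensors whose band-limited signals correlate."""
    filtered = np.array([band_filter(np.asarray(s, float), fs, *band)
                         for s in signals])
    corr = np.corrcoef(filtered)
    adj = (np.abs(corr) >= threshold).astype(int)
    np.fill_diagonal(adj, 0)
    return adj

def clustering_coefficient(adj):
    """Mean local clustering coefficient of an undirected graph."""
    coeffs = []
    for i in range(adj.shape[0]):
        nbrs = np.flatnonzero(adj[i])
        k = nbrs.size
        if k < 2:
            coeffs.append(0.0)
            continue
        links = adj[np.ix_(nbrs, nbrs)].sum() / 2
        coeffs.append(2.0 * links / (k * (k - 1)))
    return float(np.mean(coeffs))

# Synthetic "sensors": two share a 5 Hz component, two share a 20 Hz component.
t = np.arange(2000) / 200.0
rng = np.random.default_rng(1)
low, high = np.sin(2 * np.pi * 5 * t), np.sin(2 * np.pi * 20 * t)
signals = [low + 0.1 * rng.standard_normal(t.size) for _ in range(2)] + \
          [high + 0.1 * rng.standard_normal(t.size) for _ in range(2)]
adj_low = frequency_network(signals, fs=200.0, band=(4.0, 6.0))
print(adj_low)   # sensors sharing the 5 Hz component are linked in this band
```

    Repeating the construction over several bands and comparing the resulting clustering coefficients is the kind of frequency-resolved statistic the abstract describes.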

  12. ComplexQuant: high-throughput computational pipeline for the global quantitative analysis of endogenous soluble protein complexes using high resolution protein HPLC and precision label-free LC/MS/MS.

    PubMed

    Wan, Cuihong; Liu, Jian; Fong, Vincent; Lugowski, Andrew; Stoilova, Snejana; Bethune-Waddell, Dylan; Borgeson, Blake; Havugimana, Pierre C; Marcotte, Edward M; Emili, Andrew

    2013-04-09

    The experimental isolation and characterization of stable multi-protein complexes are essential to understanding the molecular systems biology of a cell. To this end, we have developed a high-throughput proteomic platform for the systematic identification of native protein complexes based on extensive fractionation of soluble protein extracts by multi-bed ion exchange high performance liquid chromatography (IEX-HPLC) combined with exhaustive label-free LC/MS/MS shotgun profiling. To support these studies, we have built a companion data analysis software pipeline, termed ComplexQuant. Proteins present in the hundreds of fractions typically collected per experiment are first identified by exhaustively interrogating MS/MS spectra using multiple database search engines within an integrative probabilistic framework, while accounting for possible post-translational modifications. Protein abundance is then measured across the fractions based on normalized total spectral counts and precursor ion intensities using a dedicated tool, PepQuant. This analysis allows co-complex membership to be inferred based on the similarity of extracted protein co-elution profiles. Each computational step has been optimized for processing large-scale biochemical fractionation datasets, and the reliability of the integrated pipeline has been benchmarked extensively. This article is part of a Special Issue entitled: From protein structures to clinical applications. Copyright © 2012 Elsevier B.V. All rights reserved.
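    The co-elution inference step can be illustrated with a toy example; the protein names and spectral counts below are invented, and Pearson correlation is just one plausible profile-similarity measure (PepQuant's actual scoring may differ):

```python
import numpy as np

# Hypothetical spectral-count profiles (one value per HPLC fraction);
# proteins that co-elute are candidates for membership in the same complex.
profiles = {
    "ProtA": [0, 2, 9, 25, 11, 3, 0, 0],
    "ProtB": [0, 1, 8, 22, 10, 2, 0, 0],   # co-elutes with ProtA
    "ProtC": [14, 6, 1, 0, 0, 1, 7, 15],   # distinct elution behaviour
}

def normalize(counts):
    """Normalized total spectral counts: fraction of the protein's total."""
    v = np.asarray(counts, dtype=float)
    return v / v.sum()

def coelution_score(a, b):
    """Pearson correlation of two normalized elution profiles."""
    return float(np.corrcoef(normalize(a), normalize(b))[0, 1])

print(coelution_score(profiles["ProtA"], profiles["ProtB"]))  # high
print(coelution_score(profiles["ProtA"], profiles["ProtC"]))  # low
```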

  13. Fully chip-embedded automation of a multi-step lab-on-a-chip process using a modularized timer circuit.

    PubMed

    Kang, Junsu; Lee, Donghyeon; Heo, Young Jin; Chung, Wan Kyun

    2017-11-07

    For highly-integrated microfluidic systems, an actuation system is necessary to control the flow; however, the bulk of actuation devices, including pumps and valves, has impeded the broad application of integrated microfluidic systems. Here, we suggest a microfluidic process control method based on built-in microfluidic circuits. The circuit is composed of a fluidic timer circuit and a pneumatic logic circuit. The fluidic timer circuit is a serial connection of modularized timer units, which sequentially pass high pressure to the pneumatic logic circuit. The pneumatic logic circuit is a NOR gate array designed to control the liquid-handling process. By using the timer circuit as a built-in signal generator, multi-step processes can be performed entirely within the microchip without any external controller. The timer circuit uses only two valves per unit, and the number of process steps can be extended without limitation by adding timer units. As a demonstration, an automation chip has been designed for a six-step droplet treatment, which entails 1) loading, 2) separation, 3) reagent injection, 4) incubation, 5) clearing and 6) unloading. Each process was successfully performed for a pre-defined step time without any external control device.
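    The sequencing principle (a serial timer chain whose state is decoded by simple Boolean logic into one active step at a time) can be modeled in software; this is a conceptual illustration, not the chip's actual pneumatic design:

```python
# Conceptual model: each timer unit passes a "high pressure" token onward,
# and NOR-based decoding selects exactly one active process step at a time.

STEPS = ["loading", "separation", "reagent injection",
         "incubation", "clearing", "unloading"]

def nor(*inputs):
    """NOR gate: output is high only when every input is low."""
    return int(not any(inputs))

def active_step(fired):
    """Select step i when unit i has fired and unit i+1 has not:
    fired[i] AND nor(fired[i+1])."""
    for i, f in enumerate(fired):
        nxt = fired[i + 1] if i + 1 < len(fired) else 0
        if f and nor(nxt):
            return STEPS[i]
    return None

# Advance the timer chain one unit at a time and watch the steps sequence.
fired = [0] * len(STEPS)
for t in range(len(STEPS)):
    fired[t] = 1
    print(t, active_step(fired))
```

    Extending the process, as the abstract notes, amounts to appending timer units (entries in the chain) with their corresponding steps.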

  14. Studies on Stress-Strain Relationships of Polymeric Materials Used in Space Applications

    NASA Technical Reports Server (NTRS)

    Jana, Sadhan C.; Freed, Alan

    2002-01-01

    A two-year research plan was undertaken in association with Polymers Branch, NASA Glenn Research Center, to carry out experimental and modeling work relating stress and strain behavior of polymeric materials, especially elastomers and vulcanized rubber. An experimental system based on MTS (Mechanical Testing and Simulation) A/T-4 test facility environment has been developed for a broader range of polymeric materials in addition to a design of laser compatible temperature control chamber for online measurements of various strains. Necessary material processing has been accomplished including rubber compounding and thermoplastic elastomer processing via injection molding. A broad suite of testing methodologies has been identified to reveal the complex non-linear mechanical behaviors of rubbery materials when subjected to complex modes of deformation. This suite of tests required the conceptualization, design and development of new specimen geometries, test fixtures, and test systems including development of a new laser based technique to measure large multi-axial deformations. Test data has been generated for some of these new fixtures and has revealed some complex coupling effects generated during multi-axial deformations. In addition, fundamental research has been conducted concerning the foundation principles of rubber thermodynamics and resulting theories of rubber elasticity. Studies have been completed on morphological properties of several thermoplastic elastomers. Finally, a series of steps have been identified to further advance the goals of NASA's ongoing effort.

  15. Processing of zero-derived words in English: an fMRI investigation.

    PubMed

    Pliatsikas, Christos; Wheeldon, Linda; Lahiri, Aditi; Hansen, Peter C

    2014-01-01

    Derivational morphological processes allow us to create new words (e.g. punish (V) to noun (N) punishment) from base forms. The number of steps from the basic units to derived words often varies (e.g., nationality vs. bridge-V, i.e., zero-derivation (Aronoff, 1980)). We compared the processing of one-step (soaking

  16. HIPS: A new hippocampus subfield segmentation method.

    PubMed

    Romero, José E; Coupé, Pierrick; Manjón, José V

    2017-12-01

    The importance of the hippocampus in the study of several neurodegenerative diseases such as Alzheimer's disease makes it a structure of great interest in neuroimaging. However, few segmentation methods have been proposed to measure its subfields due to its complex structure and the lack of high resolution magnetic resonance (MR) data. In this work, we present a new pipeline for automatic hippocampus subfield segmentation using two available hippocampus subfield delineation protocols that can work with both high and standard resolution data. The proposed method is based on multi-atlas label fusion technology that benefits from a novel multi-contrast patch match search process (using high resolution T1-weighted and T2-weighted images). The proposed method also includes as post-processing a new neural network-based error correction step to minimize systematic segmentation errors. The method has been evaluated on both high and standard resolution images and compared to other state-of-the-art methods showing better results in terms of accuracy and execution time. Copyright © 2017 Elsevier Inc. All rights reserved.
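    The multi-atlas label fusion idea can be illustrated with a minimal weighted-voting sketch; the patch-similarity weighting, patch sizes, and subfield labels below are illustrative assumptions, not the HIPS implementation:

```python
import numpy as np

def patch_similarity(p, q, h=1.0):
    """Weight from patch intensity distance (patch-match flavoured):
    well-matching patches get weights near 1, poor matches near 0."""
    return float(np.exp(-np.mean((p - q) ** 2) / h ** 2))

def fuse_labels(target_patch, atlas_patches, atlas_labels):
    """Weighted-vote label fusion: each atlas patch votes for its label,
    weighted by how well it matches the target patch."""
    votes = {}
    for patch, label in zip(atlas_patches, atlas_labels):
        w = patch_similarity(target_patch, patch)
        votes[label] = votes.get(label, 0.0) + w
    return max(votes, key=votes.get)

# Toy example with invented subfield labels and 3x3 intensity patches.
target = np.ones((3, 3))
atlases = [np.ones((3, 3)), np.zeros((3, 3)), 0.9 * np.ones((3, 3))]
labels = ["CA1", "CA3", "CA1"]
print(fuse_labels(target, atlases, labels))
```

    In the actual pipeline the patch search runs over two contrasts (T1- and T2-weighted) and the fused segmentation is then passed through the neural-network error-correction step.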

  17. Numerical simulation of machining distortions on a forged aerospace component following a one and a multi-step approaches

    NASA Astrophysics Data System (ADS)

    Prete, Antonio Del; Franchi, Rodolfo; Antermite, Fabrizio; Donatiello, Iolanda

    2018-05-01

    Residual stresses appear in a component as a consequence of thermo-mechanical processes (e.g., ring rolling), casting, and heat treatments. When machining such components, distortions arise from the redistribution of the residual stresses built up during the foregoing process history. If distortions are excessive, they can lead to a large number of scrap parts. Since dimensional accuracy directly affects engine efficiency, dimensional control of aerospace components is a non-trivial issue. In this paper, the distortion of large thin-walled aeroengine components made of nickel superalloys is addressed. In order to estimate distortions on inner diameters after internal turning operations, a 3D Finite Element Method (FEM) analysis has been developed for a real industrial test case. The entire process history has been taken into account by developing FEM models of the ring rolling process and heat treatments. Three different ring rolling strategies have been studied, and the combination of related parameters yielding the best dimensional accuracy has been found. Furthermore, grain size evolution and recrystallization phenomena during the manufacturing process have been numerically investigated using a semi-empirical Johnson-Mehl-Avrami-Kolmogorov (JMAK) model. The volume subtractions have been simulated by boolean trimming, following both a one-step and a multi-step approach. The multi-step procedure has made it possible to choose the material removal sequence that minimizes machining distortions.

  18. Electron correlations and pre-collision in the re-collision picture of high harmonic generation

    NASA Astrophysics Data System (ADS)

    Mašín, Zdeněk; Harvey, Alex G.; Spanner, Michael; Patchkovskii, Serguei; Ivanov, Misha; Smirnova, Olga

    2018-07-01

    We discuss the seminal three-step model and the re-collision picture in the context of high harmonic generation in molecules. In particular, we stress the importance of multi-electron correlation during the first and the third of the three steps of the process: (1) the strong-field ionization and (3) the recombination. We point out how an accurate account of multi-electron correlations during the third recombination step allows one to gauge the importance of pre-collision: the term coined by Eberly (n.d. private communication) to describe unusual pathways during the first, ionization, step.

  19. Video Processes in Teacher Education Programs; Scope, Techniques, and Assessment. Multi-State Teacher Education Project, Monograph III.

    ERIC Educational Resources Information Center

    Bosley, Howard E.; And Others

    "Video Processes Are Changing Teacher Education" by Howard Bosley (the first of five papers comprising this document) discusses the Multi-State Teacher Education Project (M-STEP) experimentation with media; it lists various uses of video processes, concentrating specifically on microteaching and the use of simulation and critical…

  20. Development of a Robust and Cost-Effective Friction Stir Welding Process for Use in Advanced Military Vehicles

    NASA Astrophysics Data System (ADS)

    Grujicic, M.; Arakere, G.; Pandurangan, B.; Hariharan, A.; Yen, C.-F.; Cheeseman, B. A.

    2011-02-01

    To respond to the advent of more lethal threats, recently designed aluminum-armor-based military-vehicle systems have resorted to an increasing use of higher-strength aluminum alloys (with superior ballistic resistance against armor-piercing (AP) threats and with high vehicle light-weighting potential). Unfortunately, these alloys are not very amenable to conventional fusion-based welding technologies, and in order to obtain high-quality welds, solid-state joining technologies such as friction stir welding (FSW) have to be employed. However, since FSW is a relatively new and fairly complex joining technology, its introduction into advanced military vehicle structures is not straightforward and entails a comprehensive multi-step approach. One such (three-step) approach is developed in the present work. Within the first step, experimental and computational techniques are utilized to determine the optimal tool design and the optimal FSW process parameters which result in maximal productivity of the joining process and the highest quality of the weld. Within the second step, techniques are developed for the identification and qualification of the optimal weld-joint designs in different sections of a prototypical military vehicle structure. In the third step, problems associated with the fabrication of a sub-scale military vehicle test structure and the blast survivability of the structure are assessed. The results obtained and the lessons learned are used to judge the potential of the current approach in shortening the development time and in enhancing the reliability and blast survivability of military vehicle structures.

  1. Comparison of microbial community shifts in two parallel multi-step drinking water treatment processes.

    PubMed

    Xu, Jiajiong; Tang, Wei; Ma, Jun; Wang, Hong

    2017-07-01

    Drinking water treatment processes remove undesirable chemicals and microorganisms from source water, which is vital to public health protection. The purpose of this study was to investigate the effects of treatment processes and configuration on the microbiome by comparing microbial community shifts in two series of different treatment processes operated in parallel within a full-scale drinking water treatment plant (DWTP) in Southeast China. Illumina sequencing of 16S rRNA genes of water samples demonstrated little effect of the coagulation/sedimentation and pre-oxidation steps on bacterial communities, in contrast to dramatic and concurrent microbial community shifts during ozonation, granular activated carbon treatment, sand filtration, and disinfection for both series. A large number of unique operational taxonomic units (OTUs) at these four treatment steps further illustrated their strong shaping power over the drinking water microbial communities. Interestingly, multidimensional scaling analysis revealed tight clustering of biofilm samples collected from different treatment steps, with Nitrospira, a nitrite-oxidizing bacterium, noted at higher relative abundances in biofilm compared to water samples. Overall, this study provides a snapshot of step-to-step microbial community evolution in multi-step drinking water treatment systems, and the results provide insight into the control and manipulation of the drinking water microbiome via optimization of DWTP design and operation.

  2. Design of multi-phase dynamic chemical networks

    NASA Astrophysics Data System (ADS)

    Chen, Chenrui; Tan, Junjun; Hsieh, Ming-Chien; Pan, Ting; Goodwin, Jay T.; Mehta, Anil K.; Grover, Martha A.; Lynn, David G.

    2017-08-01

    Template-directed polymerization reactions enable the accurate storage and processing of nature's biopolymer information. This mutualistic relationship of nucleic acids and proteins, a network known as life's central dogma, is now marvellously complex, and the progressive steps necessary for creating the initial sequence and chain-length-specific polymer templates are lost to time. Here we design and construct dynamic polymerization networks that exploit metastable prion cross-β phases. Mixed-phase environments have been used for constructing synthetic polymers, but these dynamic phases emerge naturally from the growing peptide oligomers and create environments suitable both to nucleate assembly and select for ordered templates. The resulting templates direct the amplification of a phase containing only chain-length-specific peptide-like oligomers. Such multi-phase biopolymer dynamics reveal pathways for the emergence, self-selection and amplification of chain-length- and possibly sequence-specific biopolymers.

  3. A preliminary model of work during initial examination and treatment planning appointments.

    PubMed

    Irwin, J Y; Torres-Urquidy, M H; Schleyer, T; Monaco, V

    2009-01-10

    Objective This study's objective was to formally describe the work process for charting and treatment planning in general dental practice to inform the design of a new clinical computing environment. Methods Using a process called contextual inquiry, researchers observed 23 comprehensive examination and treatment planning sessions during 14 visits to 12 general US dental offices. For each visit, field notes were analysed and reformulated as formalised models. Subsequently, each model type was consolidated across all offices and visits. Interruptions to the workflow, called breakdowns, were identified. Results Clinical work during dental examination and treatment planning appointments is a highly collaborative activity involving dentists, hygienists and assistants. Personnel with multiple overlapping roles complete complex multi-step tasks supported by a large and varied collection of equipment, artifacts and technology. Most of the breakdowns were related to technology, which interrupted the workflow, caused rework and increased the number of steps in work processes. Conclusion Current dental software could be significantly improved with regard to its support for communication and collaboration, workflow, information design and presentation, information content, and data entry.

  4. KEEPING A STEP AHEAD - FORMATIVE PHASE OF A WORKPLACE INTERVENTION TRIAL TO PREVENT OBESITY

    PubMed Central

    Zapka, Jane; Lemon, Stephenie C.; Estabrook, Barbara B.; Jolicoeur, Denise G.

    2008-01-01

    Background Ecological interventions hold promise for promoting overweight and obesity prevention in worksites. Given the paucity of evaluative research in the hospital worksite setting, considerable formative work is required for successful implementation and evaluation. Purpose This paper describes the formative phases of Step Ahead, a site-randomized controlled trial of a multi-level intervention that promotes physical activity and healthy eating in 6 hospitals in central Massachusetts. The purpose of the formative research phase was to increase the feasibility, effectiveness and likelihood of sustainability of the intervention. Design and Procedures The Step Ahead ecological intervention approach targets change at the organization, the interpersonal work environment and the individual levels. The intervention was developed using fundamental steps of intervention mapping and important tenets of participatory research. Formative research methods were used to engage leadership support and assistance and to develop an intervention plan that is both theoretically and practically grounded. This report uses observational data, program minutes and reports, and process tracking data. Developmental Strategies and Observations Leadership involvement (key informant interviews and advisory boards), employee focus groups and advisory boards, and quantitative environmental assessments cultivated participation and support. Determining multiple foci of change and designing measurable objectives and generic assessment tools to document progress are complex challenges encountered in planning phases. Lessons Learned Multi-level trials in diverse organizations require flexibility and balance of theory application and practice-based perspectives to affect impact and outcome objectives. Formative research is an essential component. PMID:18073339

  5. How to implement information technology in the operating room and the intensive care unit.

    PubMed

    Meyfroidt, Geert

    2009-03-01

    The number of operating rooms and intensive care units looking for a data management system to perform their increasingly complex tasks is rising. Although at this time only a minority is computerized, within the next few years many centres will start implementing information technology. The transition towards a computerized system is a major venture, which will have a major impact on workflow. This chapter reviews the present literature. Published papers on this subject are predominantly single- or multi-centre implementation reports. The general principles that should guide such a process are described. For healthcare institutions or individual practitioners that plan to undertake this venture, the implementation process is described in a practical, nine-step overview.

  6. Performance modeling & simulation of complex systems (A systems engineering design & analysis approach)

    NASA Technical Reports Server (NTRS)

    Hall, Laverne

    1995-01-01

    Modeling of the Multi-mission Image Processing System (MIPS) will be described as an example of the use of a modeling tool to design a distributed system that supports multiple application scenarios. This paper examines: (a) modeling tool selection, capabilities, and operation (namely NETWORK 2.5 by CACI), (b) pointers for building or constructing a model and how the MIPS model was developed, (c) the importance of benchmarking or testing the performance of equipment/subsystems being considered for incorporation into the design/architecture, (d) the essential step of model validation and/or calibration using the benchmark results, (e) sample simulation results from the MIPS model, and (f) how modeling and simulation analysis had a supportive and informative impact on the MIPS design process.

  7. Molecular genetic analysis of plant gravitropism

    NASA Technical Reports Server (NTRS)

    Lomax, T. L.

    1997-01-01

    The analysis of mutants is a powerful approach for elucidating the components of complex biological processes. A growing number of mutants have been isolated which affect plant gravitropism and the classes of mutants found thus far provide important information about the gravity response mechanism. The wide variety of mutants isolated, especially in Arabidopsis, indicates that gravitropism is a complex, multi-step process. The existence of mutants altered in either root gravitropism alone, shoot gravitropism alone, or both indicates that the root and shoot gravitropic mechanisms have both separate and common steps. Reduced starch mutants have confirmed the role of amyloplasts in sensing the gravity signal. The hormone auxin is thought to act as the transducing signal between the sites of gravity perception (the starch parenchyma cells surrounding the vascular tissue in shoots and the columella cells of root caps) and asymmetric growth (the epidermal cells of the elongation zone(s) of each organ). To date, all mutants that are resistant to high concentrations of auxin have also been found to exhibit a reduced gravitropic response, thus supporting the role of auxin. Not all gravitropic mutants are auxin-resistant, however, indicating that there are additional steps which do not involve auxin. Studies with mutants of tomato which exhibit either reduced or reversed gravitropic responses further support the role of auxin redistribution in gravitropism and suggest that both red light and cytokinin interact with gravitropism through controlling lateral auxin transport. Plant responses to gravity thus likely involve changes in both auxin transport and sensitivity.

  8. Multi-Time Step Service Restoration for Advanced Distribution Systems and Microgrids

    DOE PAGES

    Chen, Bo; Chen, Chen; Wang, Jianhui; ...

    2017-07-07

    Modern power systems are facing increased risk of disasters that can cause extended outages. The presence of remote control switches (RCSs), distributed generators (DGs), and energy storage systems (ESS) provides both challenges and opportunities for developing post-fault service restoration methodologies. Inter-temporal constraints of DGs, ESS, and loads under cold load pickup (CLPU) conditions impose extra complexity on problem formulation and solution. In this paper, a multi-time step service restoration methodology is proposed to optimally generate a sequence of control actions for controllable switches, ESSs, and dispatchable DGs to assist the system operator with decision making. The restoration sequence is determined to minimize the unserved customers by energizing the system step by step without violating operational constraints at each time step. The proposed methodology is formulated as a mixed-integer linear programming (MILP) model and can adapt to various operation conditions. Furthermore, the proposed method is validated through several case studies that are performed on modified IEEE 13-node and IEEE 123-node test feeders.

  9. Multi-Time Step Service Restoration for Advanced Distribution Systems and Microgrids

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chen, Bo; Chen, Chen; Wang, Jianhui

    Modern power systems are facing increased risk of disasters that can cause extended outages. The presence of remote control switches (RCSs), distributed generators (DGs), and energy storage systems (ESS) provides both challenges and opportunities for developing post-fault service restoration methodologies. Inter-temporal constraints of DGs, ESS, and loads under cold load pickup (CLPU) conditions impose extra complexity on problem formulation and solution. In this paper, a multi-time step service restoration methodology is proposed to optimally generate a sequence of control actions for controllable switches, ESSs, and dispatchable DGs to assist the system operator with decision making. The restoration sequence is determined to minimize the unserved customers by energizing the system step by step without violating operational constraints at each time step. The proposed methodology is formulated as a mixed-integer linear programming (MILP) model and can adapt to various operation conditions. Furthermore, the proposed method is validated through several case studies that are performed on modified IEEE 13-node and IEEE 123-node test feeders.
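
The step-by-step energization idea in the two records above can be sketched in a few lines. This is not the paper's MILP formulation, only a greedy stdlib toy with hypothetical section demands, a supply-capacity limit standing in for DG/ESS constraints, and a one-switch-per-step action limit:

```python
from itertools import combinations

def restore_sequence(loads, capacity, max_actions_per_step, steps):
    """Greedy multi-time-step restoration sketch.

    loads: dict section -> demand (kW); capacity: available DG/ESS supply (kW);
    at most max_actions_per_step switching actions per time step.
    Returns the cumulative set of energized sections after each step.
    """
    energized = set()
    served = 0.0
    plan = []
    for _ in range(steps):
        candidates = [s for s in loads if s not in energized]
        best, best_gain = [], 0.0
        # enumerate feasible switching combinations for this time step
        for r in range(1, max_actions_per_step + 1):
            for combo in combinations(candidates, r):
                gain = sum(loads[s] for s in combo)
                if served + gain <= capacity and gain > best_gain:
                    best, best_gain = list(combo), gain
        energized.update(best)
        served += best_gain
        plan.append(set(energized))
    return plan

# hypothetical feeder sections and demands
plan = restore_sequence({"A": 40, "B": 30, "C": 25, "D": 20},
                        capacity=95, max_actions_per_step=1, steps=3)
```

A real formulation would also encode radiality, CLPU demand inflation, and ESS state-of-charge coupling across time steps as MILP constraints; the greedy loop only conveys the sequencing intuition.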

  10. Accurate quantum Z rotations with less magic

    NASA Astrophysics Data System (ADS)

    Landahl, Andrew; Cesare, Chris

    2013-03-01

    We present quantum protocols for executing arbitrarily accurate π/2^k rotations of a qubit about its Z axis. Unlike reduced instruction set computing (RISC) protocols, which use a two-step process of synthesizing high-fidelity "magic" states from which T = Z(π/4) gates can be teleported and then compiling a sequence of adaptive stabilizer operations and T gates to approximate Z(π/2^k), our complex instruction set computing (CISC) protocol distills magic states for the Z(π/2^k) gates directly. Replacing this two-step process with a single step results in substantial reductions in the number of gates needed. The key to our construction is a family of shortened quantum Reed-Muller codes of length 2^(k+2) − 1, whose distillation threshold shrinks with k but is greater than 0.85% for k ≤ 6. AJL and CC were supported in part by the Laboratory Directed Research and Development program at Sandia National Laboratories. Sandia National Laboratories is a multi-program laboratory managed and operated by Sandia Corporation, a wholly owned subsidiary of Lockheed Martin Corporation, for the U.S. Department of Energy's National Nuclear Security Administration under contract DE-AC04-94AL85000.
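
The gate algebra behind this record can be illustrated with diagonal matrices. The sketch below shows only the elementary identity Z(π/2^k) · Z(π/2^k) = Z(π/2^(k-1)) that motivates distilling the finer rotation directly, not the distillation protocol itself:

```python
import cmath

def Z(theta):
    # single-qubit Z rotation as the diagonal diag(1, e^{i*theta}),
    # written up to an irrelevant global phase
    return (1 + 0j, cmath.exp(1j * theta))

def compose(a, b):
    # diagonal gates compose by elementwise multiplication
    return (a[0] * b[0], a[1] * b[1])

k = 3
g = Z(cmath.pi / 2**k)      # Z(pi/8), finer than the T gate
gg = compose(g, g)          # two Z(pi/2^k) gates give Z(pi/2^(k-1))
assert abs(gg[1] - Z(cmath.pi / 2**(k - 1))[1]) < 1e-12

T = Z(cmath.pi / 4)         # the standard T gate of the RISC approach
```

Exact Clifford+T circuits can only reach angles that are multiples of π/4, which is why the RISC route must approximate finer Z(π/2^k) rotations with long T-gate sequences.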

  11. Comparing Multi-Step IMAC and Multi-Step TiO2 Methods for Phosphopeptide Enrichment

    PubMed Central

    Yue, Xiaoshan; Schunter, Alissa; Hummon, Amanda B.

    2016-01-01

    Phosphopeptide enrichment from complicated peptide mixtures is an essential step for mass spectrometry-based phosphoproteomic studies to reduce sample complexity and ionization suppression effects. Typical methods for enriching phosphopeptides include immobilized metal affinity chromatography (IMAC) or titanium dioxide (TiO2) beads, which have selective affinity and can interact with phosphopeptides. In this study, the IMAC enrichment method was compared with the TiO2 enrichment method, using a multi-step enrichment strategy from whole cell lysate, to evaluate their abilities to enrich for different types of phosphopeptides. The peptide-to-bead ratios were optimized for both IMAC and TiO2 beads. Both IMAC and TiO2 enrichments were performed for three rounds to enable the maximum extraction of phosphopeptides from the whole cell lysates. The phosphopeptides that are unique to IMAC enrichment, unique to TiO2 enrichment, and identified with both IMAC and TiO2 enrichment were analyzed for their characteristics. Both IMAC and TiO2 enriched similar amounts of phosphopeptides with comparable enrichment efficiency. However, phosphopeptides that are unique to IMAC enrichment showed a higher percentage of multi-phosphopeptides, as well as a higher percentage of longer, basic, and hydrophilic phosphopeptides. Also, the IMAC and TiO2 procedures clearly enriched phosphopeptides with different motifs. Finally, further enrichment with two rounds of TiO2 from the supernatant after IMAC enrichment, or with two rounds of IMAC from the supernatant after TiO2 enrichment, does not fully recover the phosphopeptides that are not identified with the corresponding multi-step enrichment. PMID:26237447
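
The unique/shared partition analyzed in this record is plain set algebra over the two methods' identification lists. A sketch with hypothetical peptide identifiers (the names are invented; only the operations mirror the described comparison):

```python
# Toy identification sets standing in for the IMAC and TiO2 enrichment outputs
imac = {"pepA", "pepB", "pepC", "pepD"}
tio2 = {"pepC", "pepD", "pepE"}

unique_imac = imac - tio2        # phosphopeptides unique to IMAC enrichment
unique_tio2 = tio2 - imac        # phosphopeptides unique to TiO2 enrichment
shared = imac & tio2             # identified by both methods

# Jaccard-style overlap between the two enrichment methods
overlap = len(shared) / len(imac | tio2)
```

In practice each identifier would carry its sequence, charge, and phosphosite count, so the same partition supports the downstream characterization (multi-phosphorylation, length, basicity, motifs) reported above.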

  12. Macro-fingerprint analysis-through-separation of licorice based on FT-IR and 2DCOS-IR

    NASA Astrophysics Data System (ADS)

    Wang, Yang; Wang, Ping; Xu, Changhua; Yang, Yan; Li, Jin; Chen, Tao; Li, Zheng; Cui, Weili; Zhou, Qun; Sun, Suqin; Li, Huifen

    2014-07-01

    In this paper, a step-by-step analysis-through-separation method under the navigation of multi-step IR macro-fingerprint (FT-IR integrated with second derivative IR (SD-IR) and 2DCOS-IR) was developed for comprehensively characterizing the hierarchical chemical fingerprints of licorice from entirety to single active components. Subsequently, the chemical profile variation rules of three parts (flavonoids, saponins and saccharides) in the separation process were holistically revealed, and the number of matching peaks and the correlation coefficients with standards of pure compounds increased along the extraction direction. The findings were supported by UPLC results and a verification experiment of an aqueous separation process. It has been demonstrated that the developed multi-step IR macro-fingerprint analysis-through-separation approach could be a rapid, effective and integrated method not only for objectively providing comprehensive chemical characterization of licorice and all its separated parts, but also for rapidly revealing the global enrichment trend of the active components in the licorice separation process.

  13. Adding Semantics and OPM Ontology for the Provenance of Multi-sensor Merged Climate Data Records. Now What About Reproducibility?

    NASA Astrophysics Data System (ADS)

    Hua, H.; Wilson, B. D.; Manipon, G.; Pan, L.; Fetzer, E.

    2011-12-01

    Multi-decadal climate data records are critical to studying climate variability and change. These often also require merging data from multiple instruments such as those from NASA's A-Train that contain measurements covering a wide range of atmospheric conditions and phenomena. A multi-decadal climate data record of water vapor measurements from sensors on the A-Train, operational weather, and other satellites is being assembled from existing data sources, or produced using well-established methods published in the peer-reviewed literature. However, the immense volume and inhomogeneity of data often requires an "exploratory computing" approach to product generation where data is processed in a variety of different ways with varying algorithms, parameters, and code changes until an acceptable intermediate product is generated. This process is repeated until a desirable final merged product can be generated. Typically the production legacy is often lost due to the complexity of processing steps that were tried along the way. The data product information associated with source data, processing methods, parameters used, intermediate product outputs, and associated materials are often hidden in each of the trials and scattered throughout the processing system(s). We will discuss methods to help users better capture and explore the production legacy of the data, metadata, ancillary files, code, and computing environment changes used during the production of these merged and multi-sensor data products. By leveraging existing semantic and provenance tools, we can capture sufficient information to enable users to track, perform faceted searches, and visualize the provenance of the products and processing lineage. We will explore whether sufficient provenance information can be captured to enable science reproducibility of these climate data records.

  14. Inkjet Deposition of Layer by Layer Assembled Films

    PubMed Central

    Andres, Christine M.; Kotov, Nicholas A.

    2010-01-01

    Layer-by-layer assembly (LBL) can create advanced composites with exceptional properties unavailable by other means, but the laborious deposition process and multiple dipping cycles hamper their utilization in microtechnologies and electronics. Multiple rinse steps provide both structural control and thermodynamic stability to LBL multilayers, but they significantly limit practical applications and add substantially to processing time and waste. Here we demonstrate that by employing inkjet technology one can deliver the necessary quantities of LBL components required for film build-up without excess, eliminating the need for repetitive rinsing steps. This feature differentiates this approach from all other recognized LBL modalities. Using a model system of negatively charged gold nanoparticles and positively charged poly(diallyldimethylammonium) chloride, the material stability, nanoscale control over thickness and particle coverage offered by the inkjet LBL technique are shown to be equal or better than the multilayers made with traditional dipping cycles. The opportunity for fast deposition of complex metallic patterns using a simple inkjet printer was also shown. The additive nature of LBL deposition based on the formation of insoluble nanoparticle-polyelectrolyte complexes of various compositions provides an excellent opportunity for versatile, multi-component, and non-contact patterning for the simple production of stratified patterns that are much needed in advanced devices. PMID:20863114

  15. Aub and Ago3 are recruited to nuage through two mechanisms to form a ping-pong complex assembled by Krimper

    PubMed Central

    Webster, Alexandre; Li, Sisi; Hur, Junho K.; Wachsmuth, Malte; Bois, Justin S.; Perkins, Edward M.; Patel, Dinshaw J.; Aravin, Alexei A.

    2015-01-01

    In Drosophila, two Piwi proteins, Aubergine (Aub) and Argonaute-3 (Ago3) localize to perinuclear ‘nuage’ granules and use guide piRNAs to target and destroy transposable element transcripts. We find that Aub and Ago3 are recruited to nuage by two different mechanisms. Aub requires a piRNA guide for nuage recruitment, indicating that its localization depends on recognition of RNA targets. Ago3 is recruited to nuage independently of a piRNA cargo and relies on interaction with Krimper, a stable component of nuage that is able to aggregate in the absence of other nuage proteins. We show that Krimper interacts directly with Aub and Ago3 to coordinate the assembly of the ping-pong piRNA processing (4P) complex. Symmetrical dimethylated arginines are required for Aub to interact with Krimper, but are dispensable for Ago3 to bind Krimper. Our study reveals a multi-step process responsible for the assembly and function of nuage complexes in piRNA-guided transposon repression. PMID:26295961

  16. Structural insight into the role of VAL1 B3 domain for targeting to FLC locus in Arabidopsis thaliana.

    PubMed

    Wu, Baixing; Zhang, Mengmeng; Su, Shichen; Liu, Hehua; Gan, Jianhua; Ma, Jinbiao

    2018-06-22

    Vernalization is a pivotal stage for some plants, involving many epigenetic changes during cold exposure. In Arabidopsis, an essential step in vernalization for subsequent flowering is the successful silencing of the potent floral repressor Flowering Locus C (FLC) through repressive histone marks. AtVAL1 is a multi-functional protein containing five domains that participate in many recognition processes; it recruits the repressive histone modifier PHD-PRC2 complex and interacts with components of the ASAP complex, targeting the FLC nucleation region by recognizing a cis element known as the CME (cold memory element) through its plant-specific B3 domain. Here, we determine the crystal structure of the B3 domain in complex with the Sph/RY motif in the CME. Our structural analysis reveals the specific DNA recognition by the B3 domain and, combined with our in vitro experiments, provides structural insight into the important role of the AtVAL1 B3 domain in the flowering process. Copyright © 2018 Elsevier Inc. All rights reserved.

  17. Pre-eruptive magmatic processes re-timed using a non-isothermal approach to magma chamber dynamics.

    PubMed

    Petrone, Chiara Maria; Bugatti, Giuseppe; Braschi, Eleonora; Tommasini, Simone

    2016-10-05

    Constraining the timescales of pre-eruptive magmatic processes in active volcanic systems is paramount to understand magma chamber dynamics and the triggers for volcanic eruptions. Temporal information of magmatic processes is locked within the chemical zoning profiles of crystals but can be accessed by means of elemental diffusion chronometry. Mineral compositional zoning testifies to the occurrence of substantial temperature differences within magma chambers, which often bias the estimated timescales in the case of multi-stage zoned minerals. Here we propose a new Non-Isothermal Diffusion Incremental Step model to take into account the non-isothermal nature of pre-eruptive processes, deconstructing the main core-rim diffusion profiles of multi-zoned crystals into different isothermal steps. The Non-Isothermal Diffusion Incremental Step model represents a significant improvement in the reconstruction of crystal lifetime histories. Unravelling stepwise timescales at contrasting temperatures provides a novel approach to constraining pre-eruptive magmatic processes and greatly increases our understanding of magma chamber dynamics.

  18. Pre-eruptive magmatic processes re-timed using a non-isothermal approach to magma chamber dynamics

    PubMed Central

    Petrone, Chiara Maria; Bugatti, Giuseppe; Braschi, Eleonora; Tommasini, Simone

    2016-01-01

    Constraining the timescales of pre-eruptive magmatic processes in active volcanic systems is paramount to understand magma chamber dynamics and the triggers for volcanic eruptions. Temporal information of magmatic processes is locked within the chemical zoning profiles of crystals but can be accessed by means of elemental diffusion chronometry. Mineral compositional zoning testifies to the occurrence of substantial temperature differences within magma chambers, which often bias the estimated timescales in the case of multi-stage zoned minerals. Here we propose a new Non-Isothermal Diffusion Incremental Step model to take into account the non-isothermal nature of pre-eruptive processes, deconstructing the main core-rim diffusion profiles of multi-zoned crystals into different isothermal steps. The Non-Isothermal Diffusion Incremental Step model represents a significant improvement in the reconstruction of crystal lifetime histories. Unravelling stepwise timescales at contrasting temperatures provides a novel approach to constraining pre-eruptive magmatic processes and greatly increases our understanding of magma chamber dynamics. PMID:27703141
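
The incremental-step idea in the two records above can be made concrete with Arrhenius diffusivities: the same chemical zoning profile fixes a total diffusive budget (the integral of D dt), but splitting that budget across temperature stages changes the inferred durations. A minimal sketch, assuming hypothetical values of D0 and Ea and an assumed 70/30 split of the Dt budget between two thermal stages:

```python
import math

R = 8.314                  # gas constant, J/(mol*K)
YEAR = 365.25 * 86400      # seconds per year

def D(T, D0=1e-6, Ea=250e3):
    # Arrhenius diffusivity; D0 (m^2/s) and Ea (J/mol) are illustrative only
    return D0 * math.exp(-Ea / (R * T))

# An isothermal fit at 1100 K assigns the whole Dt budget to one temperature:
Dt_total = D(1100) * (100 * YEAR)   # profile consistent with ~100 yr at 1100 K

# Non-isothermal deconstruction: the same Dt budget split between a hotter
# stage (1150 K) and a cooler stage (1050 K), with an assumed 70/30 partition
t_hot = 0.7 * Dt_total / D(1150) / YEAR
t_cold = 0.3 * Dt_total / D(1050) / YEAR
```

Because D grows steeply with temperature, the hotter stage accounts for most of the diffusion in far less time, while the cooler stage stretches the residence time well beyond the single-temperature estimate; honoring the temperature steps therefore re-times the crystal's history.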

  19. Multi-Step Lithiation of Tin Sulfide: An Investigation Using In Situ Electron Microscopy

    DOE PAGES

    Hwang, Sooyeon; Yao, Zhenpeng; Zhang, Lei; ...

    2018-04-03

    Two-dimensional metal sulfides have been widely explored as promising electrodes for lithium ion batteries since their two-dimensional layered structure allows lithium ions to intercalate between layers. For tin disulfide, the lithiation process proceeds via a sequence of three different types of reactions: intercalation, conversion, and alloying, but the full scenario of reaction dynamics remains nebulous. In this paper, we investigate the dynamical process of the multi-step reactions using in situ electron microscopy and discover an intermediate rock-salt phase with the disordering of Li and Sn cations, after the initial 2-dimensional intercalation. The disordered cations occupy all the octahedral sites and block the channels for intercalation, which alter the reaction pathways during further lithiation. Our first principles calculations of the non-equilibrium lithiation of SnS2 corroborate the energetic preference of the disordered rock-salt structure over known layered polymorphs. The in situ observations and calculations suggest a two-phase reaction nature for intercalation, disordering, and following conversion reactions. In addition, in situ de-lithiation observation confirms that the alloying reaction is reversible while the conversion reaction is not, which is consistent with the ex situ analysis. This work reveals the full lithiation characteristic of SnS2 and sheds light on the understanding of complex multistep reactions in two-dimensional materials.

  20. Multi-Step Lithiation of Tin Sulfide: An Investigation Using In Situ Electron Microscopy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hwang, Sooyeon; Yao, Zhenpeng; Zhang, Lei

    Two-dimensional metal sulfides have been widely explored as promising electrodes for lithium ion batteries since their two-dimensional layered structure allows lithium ions to intercalate between layers. For tin disulfide, the lithiation process proceeds via a sequence of three different types of reactions: intercalation, conversion, and alloying, but the full scenario of reaction dynamics remains nebulous. In this paper, we investigate the dynamical process of the multi-step reactions using in situ electron microscopy and discover an intermediate rock-salt phase with the disordering of Li and Sn cations, after the initial 2-dimensional intercalation. The disordered cations occupy all the octahedral sites and block the channels for intercalation, which alter the reaction pathways during further lithiation. Our first principles calculations of the non-equilibrium lithiation of SnS2 corroborate the energetic preference of the disordered rock-salt structure over known layered polymorphs. The in situ observations and calculations suggest a two-phase reaction nature for intercalation, disordering, and following conversion reactions. In addition, in situ de-lithiation observation confirms that the alloying reaction is reversible while the conversion reaction is not, which is consistent with the ex situ analysis. This work reveals the full lithiation characteristic of SnS2 and sheds light on the understanding of complex multistep reactions in two-dimensional materials.

  1. On the use of multi-agent systems for the monitoring of industrial systems

    NASA Astrophysics Data System (ADS)

    Rezki, Nafissa; Kazar, Okba; Mouss, Leila Hayet; Kahloul, Laid; Rezki, Djamil

    2016-03-01

    The objective of the current paper is to present an intelligent system for complex process monitoring, based on artificial intelligence technologies. This system aims to accomplish all the complex process monitoring tasks: detection, diagnosis, identification and reconfiguration. For this purpose, the development of a multi-agent system that combines multiple intelligences, such as multivariate control charts, neural networks, Bayesian networks and expert systems, has become a necessity. The proposed system is evaluated on the monitoring of the Tennessee Eastman process, a complex benchmark process.

  2. Using the Binary Phase-Field Crystal Model to Describe Non-Classical Nucleation Pathways in Gold Nanoparticles

    NASA Astrophysics Data System (ADS)

    Smith, Nathan; Provatas, Nikolas

    Recent experimental work has shown that gold nanoparticles can precipitate from an aqueous solution through a non-classical, multi-step nucleation process. This multi-step process begins with spinodal decomposition into solute-rich and solute-poor liquid domains, followed by nucleation from within the solute-rich domains. We present a binary phase-field crystal theory that shows the same phenomenology and examine various cross-over regimes in the growth and coarsening of liquid and solid domains. We thank the Canada Research Chairs (CRC) program for funding this work.

  3. Dissolvable fluidic time delays for programming multi-step assays in instrument-free paper diagnostics.

    PubMed

    Lutz, Barry; Liang, Tinny; Fu, Elain; Ramachandran, Sujatha; Kauffman, Peter; Yager, Paul

    2013-07-21

    Lateral flow tests (LFTs) are an ingenious format for rapid and easy-to-use diagnostics, but they are fundamentally limited to assay chemistries that can be reduced to a single chemical step. In contrast, most laboratory diagnostic assays rely on multiple timed steps carried out by a human or a machine. Here, we use dissolvable sugar applied to paper to create programmable flow delays and present a paper network topology that uses these time delays to program automated multi-step fluidic protocols. Solutions of sucrose at different concentrations (10-70% of saturation) were added to paper strips and dried to create fluidic time delays spanning minutes to nearly an hour. A simple folding card format employing sugar delays was shown to automate a four-step fluidic process initiated by a single user activation step (folding the card); this device was used to perform a signal-amplified sandwich immunoassay for a diagnostic biomarker for malaria. The cards are capable of automating multi-step assay protocols normally used in laboratories, but in a rapid, low-cost, and easy-to-use format.

  4. Dissolvable fluidic time delays for programming multi-step assays in instrument-free paper diagnostics

    PubMed Central

    Lutz, Barry; Liang, Tinny; Fu, Elain; Ramachandran, Sujatha; Kauffman, Peter; Yager, Paul

    2013-01-01

    Lateral flow tests (LFTs) are an ingenious format for rapid and easy-to-use diagnostics, but they are fundamentally limited to assay chemistries that can be reduced to a single chemical step. In contrast, most laboratory diagnostic assays rely on multiple timed steps carried out by a human or a machine. Here, we use dissolvable sugar applied to paper to create programmable flow delays and present a paper network topology that uses these time delays to program automated multi-step fluidic protocols. Solutions of sucrose at different concentrations (10-70% of saturation) were added to paper strips and dried to create fluidic time delays spanning minutes to nearly an hour. A simple folding card format employing sugar delays was shown to automate a four-step fluidic process initiated by a single user activation step (folding the card); this device was used to perform a signal-amplified sandwich immunoassay for a diagnostic biomarker for malaria. The cards are capable of automating multi-step assay protocols normally used in laboratories, but in a rapid, low-cost, and easy-to-use format. PMID:23685876

  5. Techniques of stapler-based navigational thoracoscopic segmentectomy using virtual assisted lung mapping (VAL-MAP)

    PubMed Central

    Murayama, Tomonori; Nakajima, Jun

    2016-01-01

    Anatomical segmentectomies play an important role in oncological lung resection, particularly for ground-glass types of primary lung cancers. This operation can also be applied to metastatic lung tumors deep in the lung. Virtual assisted lung mapping (VAL-MAP) is a novel technique that allows for bronchoscopic multi-spot dye markings to provide “geometric information” to the lung surface, using three-dimensional virtual images. In addition to wedge resections, VAL-MAP has been found to be useful in thoracoscopic segmentectomies, particularly complex segmentectomies, such as combined subsegmentectomies or extended segmentectomies. There are five steps in VAL-MAP-assisted segmentectomies: (I) “standing” stitches along the resection lines; (II) cleaning hilar anatomy; (III) confirming hilar anatomy; (IV) going 1 cm deeper; (V) step-by-step stapling technique. Depending on the anatomy, segmentectomies can be classified into linear (lingular, S6, S2), V- or U-shaped (right S1, left S3, S2b + S3a), and three-dimensional (S7, S8, S9, S10) segmentectomies. Three-dimensional segmentectomies in particular are challenging because of the complexity of the stapling techniques involved. This review focuses on how VAL-MAP can be utilized in segmentectomy, and how this technique can assist the stapling process in even the most challenging ones. PMID:28066675

  6. Multi-step high-throughput conjugation platform for the development of antibody-drug conjugates.

    PubMed

    Andris, Sebastian; Wendeler, Michaela; Wang, Xiangyang; Hubbuch, Jürgen

    2018-07-20

    Antibody-drug conjugates (ADCs) form a rapidly growing class of biopharmaceuticals that attracts considerable attention throughout the industry due to its high potential for cancer therapy. They combine the specificity of a monoclonal antibody (mAb) with the cell-killing capacity of highly cytotoxic small-molecule drugs. Site-specific conjugation approaches involve a multi-step process for covalent linkage of antibody and drug via a linker. Despite the range of parameters that have to be investigated, high-throughput methods are scarcely used so far in ADC development. In this work an automated high-throughput platform for a site-specific multi-step conjugation process on a liquid-handling station is presented, using a model conjugation system. A high-throughput solid-phase buffer exchange for reagent removal was successfully incorporated by utilizing a batch cation exchange step. To ensure accurate screening of conjugation parameters, an intermediate UV/Vis-based concentration determination was established, including feedback to the process. For conjugate characterization, a high-throughput-compatible reversed-phase chromatography method with a runtime of 7 min and no sample preparation was developed. Two case studies illustrate its efficient use for mapping the operating space of a conjugation process. Due to the degree of automation and parallelization, the platform is capable of significantly reducing process development efforts and material demands and shortening development timelines for antibody-drug conjugates. Copyright © 2018 Elsevier B.V. All rights reserved.
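
    The abstract does not specify the concentration-determination math, but an intermediate UV/Vis step with feedback is typically a Beer-Lambert calculation followed by a dilution computed from c1·v1 = c2·v2. A minimal sketch, assuming a hypothetical extinction coefficient of 1.4 (mg/mL)⁻¹ cm⁻¹ (a typical value for an IgG at 280 nm, used here purely for illustration):

```python
def concentration_from_a280(a280, path_cm=1.0, ext_coeff=1.4):
    """Beer-Lambert law: c = A / (eps * l).
    ext_coeff of 1.4 (mg/mL)^-1 cm^-1 is an assumed, typical IgG value."""
    return a280 / (ext_coeff * path_cm)

def diluent_volume_ul(c_measured, v_current_ul, c_target):
    """Feedback step: diluent volume vd so that c1*v1 = c2*(v1 + vd).
    Returns 0 if the sample is already at or below the target."""
    if c_measured <= c_target:
        return 0.0
    return v_current_ul * (c_measured / c_target - 1.0)
```

    On a liquid-handling station, the returned volume would drive the next pipetting step, closing the feedback loop the abstract describes.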

  7. A multi-step approach for testing non-toxic amphiphilic antifouling coatings against marine microfouling at different levels of biological complexity.

    PubMed

    Zecher, Karsten; Aitha, Vishwa Prasad; Heuer, Kirsten; Ahlers, Herbert; Roland, Katrin; Fiedel, Michael; Philipp, Bodo

    2018-03-01

    Marine biofouling on artificial surfaces such as ship hulls or fish farming nets causes enormous economic damage. The time needed to develop antifouling coatings can be shortened by reliable laboratory assays. For designing such test systems, it is important that toxic effects can be excluded, that multiple parameters can be addressed simultaneously and that mechanistic aspects can be included. In this study, a multi-step approach for testing antifouling coatings was established employing photoautotrophic biofilm formation of marine microorganisms in micro- and mesocosms. Degree and pattern of biofilm formation were determined by quantification of chlorophyll fluorescence. For the microcosms, co-cultures of diatoms and a heterotrophic bacterium were exposed to fouling-release coatings. For the mesocosms, a novel device was developed that permits parallel quantification of a multitude of coatings under defined conditions with varying degrees of shear stress. Additionally, the antifouling coatings were tested for leaching of potentially toxic compounds and finally tested in sea trials. This multi-step approach revealed that the individual steps led to consistent results regarding antifouling activity of the coatings. Furthermore, the novel mesocosm system can be employed for advanced antifouling analysis, including metagenomic approaches for determining the diversity of microbes attaching to different coatings under changing shear forces. Copyright © 2018 Elsevier B.V. All rights reserved.

  8. Multi-step cure kinetic model of ultra-thin glass fiber epoxy prepreg exhibiting both autocatalytic and diffusion-controlled regimes under isothermal and dynamic-heating conditions

    NASA Astrophysics Data System (ADS)

    Kim, Ye Chan; Min, Hyunsung; Hong, Sungyong; Wang, Mei; Sun, Hanna; Park, In-Kyung; Choi, Hyouk Ryeol; Koo, Ja Choon; Moon, Hyungpil; Kim, Kwang J.; Suhr, Jonghwan; Nam, Jae-Do

    2017-08-01

    As packaging technologies are required to reduce the assembly area of the substrate, thin composite laminate substrates demand extremely high performance in material properties such as the coefficient of thermal expansion (CTE) and stiffness. Accordingly, thermosetting resin systems, which consist of multiple fillers, monomers and/or catalysts in thermoset-based glass fiber prepregs, are extremely complicated and closely associated with rheological properties, which depend on the temperature cycles used for cure. For process control of these complex systems, a reliable kinetic model is usually required that can be applied to complex thermal cycles, which typically include both isothermal and dynamic-heating segments. In this study, an ultra-thin prepreg with highly loaded silica beads and glass fibers in an epoxy/amine resin system was investigated as a model system by isothermal/dynamic-heating experiments. The maximum degree of cure was obtained as a function of temperature. The curing kinetics of the model prepreg system exhibited a multi-step reaction and a limited conversion as a function of isothermal curing temperature, which are often observed in epoxy cure systems because of the rate-determining diffusion of polymer chain growth. The modified kinetic equation accurately described the isothermal behavior and the beginning of the dynamic-heating behavior by integrating the obtained maximum degree of cure into the kinetic model development.
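
    The abstract does not give the authors' exact kinetic expressions. A common form for such multi-step autocatalytic cure models is the Kamal equation, dα/dt = (k1 + k2·αᵐ)(1 − α)ⁿ, with the diffusion-controlled regime emulated by capping conversion at a temperature-dependent maximum α_max. A minimal sketch with illustrative (not fitted) Arrhenius parameters:

```python
import math

R = 8.314  # gas constant, J/(mol*K)

def kamal_rate(alpha, T, A1=1e5, E1=60e3, A2=5e5, E2=55e3, m=0.5, n=1.5):
    """Kamal autocatalytic rate d(alpha)/dt = (k1 + k2*alpha^m)*(1-alpha)^n.
    Pre-exponentials and activation energies are illustrative assumptions."""
    k1 = A1 * math.exp(-E1 / (R * T))
    k2 = A2 * math.exp(-E2 / (R * T))
    return (k1 + k2 * alpha**m) * (1.0 - alpha)**n

def cure_isothermal(T, alpha_max, dt=0.1, t_end=600.0):
    """Euler-integrate conversion at constant T, damping the chemical rate
    by the remaining headroom to the diffusion-limited maximum conversion
    alpha_max(T), so the cure levels off below full conversion."""
    alpha, t = 0.0, 0.0
    while t < t_end:
        rate = kamal_rate(alpha, T) * max(0.0, 1.0 - alpha / alpha_max)
        alpha = min(alpha + rate * dt, alpha_max)
        t += dt
    return alpha
```

    Running this at two isothermal temperatures reproduces the qualitative behavior the abstract reports: faster cure at higher temperature, with conversion limited below 1.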

  9. Structural insights into NHEJ: building up an integrated picture of the dynamic DSB repair super complex, one component and interaction at a time

    PubMed Central

    Williams, Gareth J.; Hammel, Michal; Radhakrishnan, Sarvan Kumar; Ramsden, Dale; Lees-Miller, Susan P.; Tainer, John A.

    2014-01-01

    Non-homologous end joining (NHEJ) is the major pathway for repair of DNA double-strand breaks (DSBs) in human cells. NHEJ is also needed for V(D)J recombination and the development of T and B cells in vertebrate immune systems, and acts in both the generation and prevention of non-homologous chromosomal translocations, a hallmark of genomic instability and many human cancers. X-ray crystal structures, cryo-electron microscopy envelopes, and small angle X-ray scattering (SAXS) solution conformations and assemblies are defining most of the core protein components for NHEJ: Ku70/Ku80 heterodimer; the DNA dependent protein kinase catalytic subunit (DNA-PKcs); the structure-specific endonuclease Artemis along with polynucleotide kinase/phosphatase (PNKP), aprataxin and PNKP related protein (APLF); the scaffolding proteins XRCC4 and XLF (XRCC4-like factor); DNA polymerases, and DNA ligase IV (Lig IV). The dynamic assembly of multi-protein NHEJ complexes at DSBs is regulated in part by protein phosphorylation. The basic steps of NHEJ have been biochemically defined to require: 1) DSB detection by the Ku heterodimer with subsequent DNA-PKcs tethering to form the DNA-PKcs-Ku-DNA complex (termed DNA-PK), 2) lesion processing, and 3) DNA end ligation by Lig IV, which functions in complex with XRCC4 and XLF. The current integration of structures by combined methods is resolving puzzles regarding the mechanisms, coordination and regulation of these three basic steps. Overall, structural results suggest the NHEJ system forms a flexing scaffold with the DNA-PKcs HEAT repeats acting as compressible macromolecular springs suitable to store and release conformational energy to apply forces to regulate NHEJ complexes and the DNA substrate for DNA end protection, processing, and ligation. PMID:24656613

  10. 78 FR 13868 - New England Fishery Management Council; Public Meeting

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-03-01

    .... It is an important early step in what will be a multi-step process to develop the policy. The agenda... opportunity for communications between participants about management and science issues that relate to the ABC...

  11. Microarc oxidation coating covered Ti implants with micro-scale gouges formed by a multi-step treatment for improving osseointegration.

    PubMed

    Bai, Yixin; Zhou, Rui; Cao, Jianyun; Wei, Daqing; Du, Qing; Li, Baoqiang; Wang, Yaming; Jia, Dechang; Zhou, Yu

    2017-07-01

    A sub-microporous microarc oxidation (MAO) coating covered Ti implant with micro-scale gouges was fabricated via a multi-step MAO process to overcome compromised bone-implant integration. The as-prepared implant was further modified by post-heat treatment to compare the effects of the -OH functional group and the nano-scale orange peel-like morphology on osseointegration. The bone regeneration, bone-implant contact interface, and biomechanical push-out force of the modified Ti implants are discussed thoroughly in this work. The greatly improved push-out force for the MAO-coated Ti implants with micro-scale gouges can be attributed to the excellent mechanical interlocking between the implants and biologically meshed bone tissue. Because the -OH functional group promotes synostosis between the biologically meshed bone and the gouged implant surface, the multi-step MAO process could be an effective strategy to improve the osseointegration of Ti implants. Copyright © 2017 Elsevier B.V. All rights reserved.

  12. A Multi-Decadal Sample Return Campaign Will Advance Lunar and Solar System Science and Exploration by 2050

    NASA Technical Reports Server (NTRS)

    Neal, C. R.; Lawrence, S. J.

    2017-01-01

    There have been 11 missions to the Moon this century, 10 of which have been orbital, from 5 different space agencies. China became the third country to successfully soft-land on the Moon in 2013, and the second to successfully remotely operate a rover on the lunar surface. We now have significant global datasets that, coupled with the 1990s Clementine and Lunar Prospector missions, show that the sample collection is not representative of the lithologies present on the Moon. The M3 data from the Indian Chandrayaan-1 mission have identified lithologies that are not present/under-represented in the sample collection. LRO datasets show that volcanism could be as young as 100 Ma and that significant felsic complexes exist within the lunar crust. A multi-decadal sample return campaign is the next logical step in advancing our understanding of lunar origin and evolution and Solar System processes.

  13. Healthy and productive workers: using intervention mapping to design a workplace health promotion and wellness program to improve presenteeism.

    PubMed

    Ammendolia, Carlo; Côté, Pierre; Cancelliere, Carol; Cassidy, J David; Hartvigsen, Jan; Boyle, Eleanor; Soklaridis, Sophie; Stern, Paula; Amick, Benjamin

    2016-11-25

    Presenteeism is a growing problem in developed countries, mostly due to an aging workforce. The economic costs related to presenteeism exceed those of absenteeism and employer health costs. Employers are implementing workplace health promotion and wellness programs to improve health among workers and reduce presenteeism. How best to design, integrate and deliver these programs is unknown. The main purpose of this study was to use an intervention mapping approach to develop a workplace health promotion and wellness program aimed at reducing presenteeism. We partnered with a large international financial services company and used a qualitative synthesis based on an intervention mapping methodology. Evidence from systematic reviews and key articles on reducing presenteeism and implementing health promotion programs was combined with theoretical models for changing behavior and stakeholder experience. This was then systematically operationalized into a program using discussion groups and consensus among experts and stakeholders. The top health problem impacting our workplace partner was mental health. Depression and stress were the first and second highest causes of productivity loss, respectively. A multi-pronged program with detailed action steps was developed and directed at key stakeholders and health conditions. For mental health, regular sharing focus groups, social networking, monthly personal stories from leadership using webinars and multi-media communications, expert-led workshops, lunch and learn sessions, and manager and employee training were part of a comprehensive program. Comprehensive, specific and multi-pronged strategies were developed and aimed at encouraging healthy behaviours that impact presenteeism, such as regular exercise, proper nutrition, adequate sleep, smoking cessation, socialization and work-life balance.
Limitations of the intervention mapping process included high resource and time requirements, the lack of external input and viewpoints skewed towards middle and upper management, and using secondary workplace data of unknown validity and reliability. In general, intervention mapping was a useful method to develop a workplace health promotion and wellness program aimed at reducing presenteeism. The methodology provided a step-by-step process to unravel a complex problem. The process compelled participants to think critically, collaboratively and in nontraditional ways.

  14. Immobilised enzyme microreactor for screening of multi-step bioconversions: characterisation of a de novo transketolase-ω-transaminase pathway to synthesise chiral amino alcohols.

    PubMed

    Matosevic, S; Lye, G J; Baganz, F

    2011-09-20

    Complex molecules are synthesised via a number of multi-step reactions in living cells. In this work, we describe the development of a continuous-flow immobilized enzyme microreactor platform for use in the evaluation of multi-step bioconversion pathways, demonstrating a de novo transketolase/ω-transaminase-linked asymmetric amino alcohol synthesis. The prototype dual microreactor is based on the reversible attachment of His₆-tagged enzymes via Ni-NTA linkage to two surface-derivatised capillaries connected in series. Kinetic parameters established for the model transketolase (TK)-catalysed conversion of lithium hydroxypyruvate (Li-HPA) and glycolaldehyde (GA) to L-erythrulose using a continuous flow system with online monitoring of reaction output were in good agreement with kinetic parameters determined for TK in stop-flow mode. By coupling the transketolase-catalysed chiral ketone-forming reaction with the biocatalytic addition of an amine to the TK product using a transaminase (ω-TAm), it is possible to generate chiral amino alcohols from achiral starting compounds. We demonstrated this in a two-step configuration, where the TK reaction was followed by the ω-TAm-catalysed amination of L-erythrulose to synthesise 2-amino-1,3,4-butanetriol (ABT). Synthesis of the ABT product via the dual reaction and the on-line monitoring of each component provided a full profile of the de novo two-step bioconversion and demonstrated the utility of this microreactor system for in vitro multi-step pathway evaluation. Copyright © 2011 Elsevier B.V. All rights reserved.

  15. Strategic disruption of nuclear pores structure, integrity and barrier for nuclear apoptosis.

    PubMed

    Shahin, Victor

    2017-08-01

    Apoptosis is a form of programmed cell death that plays key roles in the physiology and pathophysiology of multicellular organisms. Its nuclear manifestation requires transmission of the death signals across the nuclear pore complexes (NPCs). In strategic sequential steps, apoptotic factors disrupt NPC structure, integrity and barrier function, ultimately leading to nuclear breakdown. The present review reflects on these steps. Copyright © 2017 Elsevier Ltd. All rights reserved.

  16. Towards a first implementation of the WLIMES approach in living system studies advancing the diagnostics and therapy in augmented personalized medicine.

    PubMed

    Simeonov, Plamen L

    2017-12-01

    The goal of this paper is to advance an extensible theory of living systems using an approach to biomathematics and biocomputation that suitably addresses self-organized, self-referential and anticipatory systems with multi-temporal multi-agents. Our first step is to provide foundations for modelling of emergent and evolving dynamic multi-level organic complexes and their sustentative processes in artificial and natural life systems. Main applications are in life sciences, medicine, ecology and astrobiology, as well as robotics, industrial automation, man-machine interface and creative design. Since 2011 over 100 scientists from a number of disciplines have been exploring a substantial set of theoretical frameworks for a comprehensive theory of life known as Integral Biomathics. That effort identified the need for a robust core model of organisms as dynamic wholes, using advanced and adequately computable mathematics. The work described here for that core combines the advantages of a situation and context aware multivalent computational logic for active self-organizing networks, Wandering Logic Intelligence (WLI), and a multi-scale dynamic category theory, Memory Evolutive Systems (MES), hence WLIMES. This is presented to the modeller via a formal augmented reality language as a first step towards practical modelling and simulation of multi-level living systems. Initial work focuses on the design and implementation of this visual language and calculus (VLC) and its graphical user interface. The results will be integrated within the current methodology and practices of theoretical biology and (personalized) medicine to deepen and to enhance the holistic understanding of life. Copyright © 2017 Elsevier B.V. All rights reserved.

  17. A road map for multi-way calibration models.

    PubMed

    Escandar, Graciela M; Olivieri, Alejandro C

    2017-08-07

    A large number of experimental applications of multi-way calibration are known, and a variety of chemometric models are available for the processing of multi-way data. While the main focus has been directed towards three-way data, due to the availability of various instrumental matrix measurements, a growing number of reports are being produced on higher-order signals of increasing complexity. The purpose of this review is to present a general scheme for selecting the appropriate data-processing model, according to the properties exhibited by the multi-way data. In spite of the complexity of multi-way instrumental measurements, simple criteria can be proposed for model selection, based on the presence and number of the so-called multi-linearity breaking modes (instrumental modes that break the low-rank multi-linearity of the multi-way arrays), and also on the existence of mutually dependent instrumental modes. Recent literature reports on multi-way calibration are reviewed, with emphasis on the models that were selected for data processing.
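
    The decision logic the review describes can be caricatured as a lookup keyed on the number of multi-linearity breaking modes and on mode dependence. The mapping below reflects common chemometric practice (PARAFAC for fully multi-linear arrays, PARAFAC2/MCR-ALS when one mode breaks multi-linearity, unfolded latent-variable methods otherwise); it is a toy illustration, not the authors' exact road map:

```python
def suggest_model(breaking_modes, dependent_modes=False):
    """Toy model-selection rule for multi-way calibration data.
    breaking_modes: number of multi-linearity breaking instrumental modes.
    dependent_modes: whether instrumental modes are mutually dependent."""
    if dependent_modes:
        return "MCR-ALS on augmented data matrices"
    if breaking_modes == 0:
        return "PARAFAC (multi-linear decomposition)"
    if breaking_modes == 1:
        return "PARAFAC2 or MCR-ALS"
    return "Unfolded latent-variable methods (e.g. U-PLS/N-PLS) with residual multi-linearization"
```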

  18. The parallel algorithm for the 2D discrete wavelet transform

    NASA Astrophysics Data System (ADS)

    Barina, David; Najman, Pavel; Kleparnik, Petr; Kula, Michal; Zemcik, Pavel

    2018-04-01

    The discrete wavelet transform can be found at the heart of many image-processing algorithms. Until now, the transform on general-purpose processors (CPUs) was mostly computed using a separable lifting scheme. As the lifting scheme consists of a small number of operations, it is preferred for processing on single-core CPUs. However, for parallel processing on multi-core processors, this scheme is inappropriate due to its large number of steps. On such architectures, each step corresponds to a synchronization point at which data are exchanged, and these points often form a performance bottleneck. Our approach appropriately rearranges the calculations inside the transform and thereby reduces the number of steps. In other words, we propose a new scheme that is friendly to parallel environments. When evaluated on multi-core CPUs, our scheme consistently outperforms the original lifting scheme. The evaluation was performed on 61-core Intel Xeon Phi and 8-core Intel Xeon processors.
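
    As background for the lifting scheme the abstract contrasts against, a single level of the reversible CDF 5/3 wavelet (the predict/update lifting pair used in JPEG 2000) can be sketched as follows. Each lifting step is a point where neighboring data must be exchanged, which is the synchronization cost the paper's rearranged scheme targets. This is a generic illustration with periodic boundary extension, not the authors' fused scheme:

```python
import numpy as np

def cdf53_forward(x):
    """One level of 1D CDF 5/3 lifting: a predict step then an update step.
    Assumes an even-length input and periodic boundary extension."""
    x = np.asarray(x, dtype=float)
    even, odd = x[0::2].copy(), x[1::2].copy()
    # predict: detail = odd sample minus the average of its even neighbors
    odd -= 0.5 * (even + np.roll(even, -1))
    # update: approximation = even sample plus a quarter of neighboring details
    even += 0.25 * (odd + np.roll(odd, 1))
    return even, odd  # low-pass and high-pass subbands

def cdf53_inverse(even, odd):
    """Invert the lifting steps in reverse order for perfect reconstruction."""
    even = even - 0.25 * (odd + np.roll(odd, 1))
    odd = odd + 0.5 * (even + np.roll(even, -1))
    x = np.empty(even.size + odd.size)
    x[0::2], x[1::2] = even, odd
    return x
```

    Because lifting steps are invertible in-place operations, reconstruction is exact; a separable 2D transform would apply the same pair along rows and then columns, doubling the number of synchronization points per level.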

  19. Biomimicry Promotes the Efficiency of a 10-Step Sequential Enzymatic Reaction on Nanoparticles, Converting Glucose to Lactate.

    PubMed

    Mukai, Chinatsu; Gao, Lizeng; Nelson, Jacquelyn L; Lata, James P; Cohen, Roy; Wu, Lauren; Hinchman, Meleana M; Bergkvist, Magnus; Sherwood, Robert W; Zhang, Sheng; Travis, Alexander J

    2017-01-02

    For nanobiotechnology to achieve its potential, complex organic-inorganic systems must grow to utilize the sequential functions of multiple biological components. Critical challenges exist: immobilizing enzymes can block substrate-binding sites or prohibit conformational changes, substrate composition can interfere with activity, and multistep reactions risk diffusion of intermediates. As a result, the most complex tethered reaction reported involves only 3 enzymes. Inspired by the oriented immobilization of glycolytic enzymes on the fibrous sheath of mammalian sperm, here we show a complex reaction of 10 enzymes tethered to nanoparticles. Although individual enzyme efficiency was higher in solution, the efficacy of the 10-step pathway measured by conversion of glucose to lactate was significantly higher when tethered. To our knowledge, this is the most complex organic-inorganic system described, and it shows that tethered, multi-step biological pathways can be reconstituted in hybrid systems to carry out functions such as energy production or delivery of molecular cargo. © 2017 Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.

  20. Biomimicry promotes the efficiency of a 10-step sequential enzymatic reaction on nanoparticles, converting glucose to lactate

    PubMed Central

    Mukai, Chinatsu; Gao, Lizeng; Nelson, Jacquelyn L.; Lata, James P.; Cohen, Roy; Wu, Lauren; Hinchman, Meleana M.; Bergkvist, Magnus; Sherwood, Robert W.; Zhang, Sheng; Travis, Alexander J.

    2016-01-01

    For nanobiotechnology to achieve its potential, complex organic-inorganic systems must grow to utilize the sequential functions of multiple biological components. Critical challenges exist: immobilizing enzymes can block substrate-binding sites or prohibit conformational changes, substrate composition can interfere with activity, and multistep reactions risk diffusion of intermediates. As a result, the most complex tethered reaction reported involves only 3 enzymes. Inspired by the oriented immobilization of glycolytic enzymes on the fibrous sheath of mammalian sperm, here we show a complex reaction of 10 enzymes tethered to nanoparticles. Although individual enzyme efficiency was higher in solution, the efficacy of the 10-step pathway measured by conversion of glucose to lactate was significantly higher when tethered. To our knowledge, this is the most complex organic-inorganic system described, and it shows that tethered, multi-step biological pathways can be reconstituted in hybrid systems to carry out functions such as energy production or delivery of molecular cargo. PMID:27901298

  1. Use of a biosynthetic intermediate to explore the chemical diversity of pseudo-natural fungal polyketides.

    PubMed

    Asai, Teigo; Tsukada, Kento; Ise, Satomi; Shirata, Naoki; Hashimoto, Makoto; Fujii, Isao; Gomi, Katsuya; Nakagawara, Kosuke; Kodama, Eiichi N; Oshima, Yoshiteru

    2015-09-01

    The structural complexity and diversity of natural products make them attractive sources for potential drug discovery, with their characteristics being derived from the multi-step combination of enzymatic and non-enzymatic conversions of intermediates in each biosynthetic pathway. Intermediates that exhibit multipotent behaviour have great potential for use as starting points in diversity-oriented synthesis. Inspired by the biosynthetic pathways that form complex metabolites from simple intermediates, we developed a semi-synthetic process that combines heterologous biosynthesis and artificial diversification. The heterologous biosynthesis of fungal polyketide intermediates led to the isolation of novel oligomers and provided evidence for ortho-quinonemethide equivalency in their isochromene form. The intrinsic reactivity of the isochromene polyketide enabled us to access various new chemical entities by modifying and remodelling the polyketide core and through coupling with indole molecules. We thus succeeded in generating exceptionally diverse pseudo-natural polyketides through this process and demonstrated an advanced method of using biosynthetic intermediates.

  2. Use of a biosynthetic intermediate to explore the chemical diversity of pseudo-natural fungal polyketides

    NASA Astrophysics Data System (ADS)

    Asai, Teigo; Tsukada, Kento; Ise, Satomi; Shirata, Naoki; Hashimoto, Makoto; Fujii, Isao; Gomi, Katsuya; Nakagawara, Kosuke; Kodama, Eiichi N.; Oshima, Yoshiteru

    2015-09-01

    The structural complexity and diversity of natural products make them attractive sources for potential drug discovery, with their characteristics being derived from the multi-step combination of enzymatic and non-enzymatic conversions of intermediates in each biosynthetic pathway. Intermediates that exhibit multipotent behaviour have great potential for use as starting points in diversity-oriented synthesis. Inspired by the biosynthetic pathways that form complex metabolites from simple intermediates, we developed a semi-synthetic process that combines heterologous biosynthesis and artificial diversification. The heterologous biosynthesis of fungal polyketide intermediates led to the isolation of novel oligomers and provided evidence for ortho-quinonemethide equivalency in their isochromene form. The intrinsic reactivity of the isochromene polyketide enabled us to access various new chemical entities by modifying and remodelling the polyketide core and through coupling with indole molecules. We thus succeeded in generating exceptionally diverse pseudo-natural polyketides through this process and demonstrated an advanced method of using biosynthetic intermediates.

  3. Density functional theory and RRKM calculations of decompositions of the metastable E-2,4-pentadienal molecular ions.

    PubMed

    Solano Espinoza, Eduardo A; Vallejo Narváez, Wilmer E

    2010-07-01

    The potential energy profiles for the fragmentations that lead to [C₅H₅O]⁺ and [C₄H₆]⁺• ions from the molecular ions [C₅H₆O]⁺• of E-2,4-pentadienal were obtained from calculations at the UB3LYP/6-311++G(3df,3pd)//UB3LYP/6-31G(d,p) level of theory. Kinetic barriers and harmonic frequencies obtained by the density functional method were then employed in Rice-Ramsperger-Kassel-Marcus calculations of individual rate coefficients for a large number of reaction steps. The pre-equilibrium and rate-controlling step approximations were applied to different regions of the complex potential energy surface, allowing the overall rate of decomposition to be calculated and three rival pathways to be discriminated: C-H bond cleavage, decarbonylation and cyclization. These processes compete for an equilibrated mixture of four conformers of the E-2,4-pentadienal ions. The direct dissociation, however, can only become important in the high-energy regime. In contrast, loss of CO and cyclization are observable processes in the metastable kinetic window. The former involves a slow 1,2-hydrogen shift from the carbonyl group that is immediately followed by the formation of an ion-neutral complex which, in turn, decomposes rapidly to the s-trans-1,3-butadiene ion [C₄H₆]⁺•. The predominating metastable channel is the second one, that is, a multi-step ring closure which starts with a rate-limiting cis-trans isomerization. This process yields a mixture of interconverting pyran ions that dissociates to the pyrylium ions [C₅H₅O]⁺. These results can be used to rationalize the CID mass spectrum of E-2,4-pentadienal in the low-energy regime. 2010 John Wiley & Sons, Ltd.
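
    The pre-equilibrium and rate-controlling-step approximations mentioned in the abstract reduce a multi-step mechanism to a single effective rate constant. For the generic scheme A ⇌ B → P with a fast pre-equilibrium, k_eff = K_eq·k_rds. The sketch below checks that textbook result against a brute-force integration of the full rate equations; the rate constants are arbitrary illustrative values, unrelated to the pentadienal system:

```python
import math

def k_eff_preequilibrium(k_f, k_r, k_rds):
    """Pre-equilibrium + rate-controlling-step approximation for A <=> B -> P:
    k_eff = K_eq * k_rds with K_eq = k_f / k_r, valid when k_r >> k_rds."""
    return (k_f / k_r) * k_rds

def survival_exact(k_f, k_r, k_rds, t_end, dt=1e-5):
    """Forward-Euler integration of the full two-step mechanism;
    returns the remaining reactant fraction [A] + [B] at t_end."""
    A, B, t = 1.0, 0.0, 0.0
    while t < t_end:
        dA = -k_f * A + k_r * B
        dB = k_f * A - k_r * B - k_rds * B
        A += dA * dt
        B += dB * dt
        t += dt
    return A + B
```

    With k_f = 10³ s⁻¹, k_r = 10⁴ s⁻¹ and k_rds = 1 s⁻¹, the approximate exponential decay exp(−k_eff·t) tracks the numerically integrated mechanism closely, as expected when the equilibration is fast relative to the rate-determining step.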

  4. Unifying Temporal and Structural Credit Assignment Problems

    NASA Technical Reports Server (NTRS)

    Agogino, Adrian K.; Tumer, Kagan

    2004-01-01

    Single-agent reinforcement learners in time-extended domains and multi-agent systems share a common dilemma known as the credit assignment problem. Multi-agent systems face the structural credit assignment problem of determining the contribution of a particular agent to a common task. Time-extended single-agent systems instead face the temporal credit assignment problem of determining the contribution of a particular action to the quality of the full sequence of actions. Traditionally these two problems are considered distinct and are handled in separate ways. In this article we show how these two forms of the credit assignment problem are equivalent. In this unified framework, a single-agent Markov decision process can be broken down into a single-time-step multi-agent process. Furthermore, we show that Monte-Carlo estimation or Q-learning (depending on whether the values of resulting actions in the episode are known at the time of learning) are equivalent to different agent utility functions in a multi-agent system. This equivalence shows how an often neglected issue in multi-agent systems is equivalent to a well-known deficiency in multi-time-step learning, and lays the basis for solving time-extended multi-agent problems in which both credit assignment problems are present.
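
    The two forms of temporal credit assignment the abstract contrasts, crediting each step with the full observed return (Monte-Carlo) versus a one-step bootstrapped target (Q-learning), can be illustrated on a toy chain MDP. This is a generic tabular sketch of the two update rules, not the authors' utility-function construction; the chain, rewards, and hyperparameters are assumptions for illustration:

```python
import random

# Toy 3-state chain: action 1 advances toward a terminal reward of +1,
# action 0 ends the episode immediately with reward 0.
N_STATES, GAMMA = 3, 0.9

def step(s, a):
    if a == 0:
        return None, 0.0              # episode over, no reward
    if s == N_STATES - 1:
        return None, 1.0              # terminal reward
    return s + 1, 0.0

def run_episode(Q, eps=0.2):
    """Epsilon-greedy rollout; returns the trajectory [(s, a, r), ...]."""
    s, traj = 0, []
    while s is not None:
        if random.random() < eps:
            a = random.randrange(2)
        else:
            a = max((1, 0), key=lambda x: Q[(s, x)])  # ties favor advancing
        s2, r = step(s, a)
        traj.append((s, a, r))
        s = s2
    return traj

def train(method, episodes=3000, alpha=0.1):
    """Tabular control with Monte-Carlo returns ('mc') or one-step
    Q-learning targets ('q') applied to the same kind of trajectories."""
    Q = {(s, a): 0.0 for s in range(N_STATES) for a in (0, 1)}
    for _ in range(episodes):
        traj = run_episode(Q)
        if method == "mc":
            # temporal credit via the full discounted return from each step
            G = 0.0
            for s, a, r in reversed(traj):
                G = r + GAMMA * G
                Q[(s, a)] += alpha * (G - Q[(s, a)])
        else:
            # temporal credit via bootstrapped one-step targets
            for i, (s, a, r) in enumerate(traj):
                nxt = traj[i + 1][0] if i + 1 < len(traj) else None
                boot = 0.0 if nxt is None else GAMMA * max(Q[(nxt, 0)], Q[(nxt, 1)])
                Q[(s, a)] += alpha * (r + boot - Q[(s, a)])
    return Q
```

    Both estimators converge to the same greedy policy (always advance) on this chain; they differ in how the delayed reward is propagated back to earlier actions, which is exactly the temporal credit assignment choice the paper maps onto multi-agent utility functions.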

  5. Step-by-step growth of complex oxide microstructures

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Datskos, Panos G.; Cullen, David A.; Sharma, Jaswinder K.

    The synthesis of complex and hybrid oxide microstructures is of fundamental interest and has practical applications. However, the design and synthesis of such structures is a challenging task. A solution-phase process to synthesize complex silica and silica-titania hybrid microstructures was developed by exploiting emulsion-droplet-based step-by-step growth featuring shape control. Finally, the strategy is robust and can be extended to the preparation of complex hybrid structures consisting of two or more materials, each having its own shape.

  6. Step-by-step growth of complex oxide microstructures

    DOE PAGES

    Datskos, Panos G.; Cullen, David A.; Sharma, Jaswinder K.

    2015-06-10

    The synthesis of complex and hybrid oxide microstructures is of fundamental interest and has practical applications. However, the design and synthesis of such structures is a challenging task. A solution-phase process to synthesize complex silica and silica-titania hybrid microstructures was developed by exploiting emulsion-droplet-based step-by-step growth featuring shape control. Finally, the strategy is robust and can be extended to the preparation of complex hybrid structures consisting of two or more materials, each having its own shape.

  7. Immobilized magnetic beads-based multi-target affinity selection coupled with HPLC-MS for screening active compounds from traditional Chinese medicine and natural products.

    PubMed

    Chen, Yaqi; Chen, Zhui; Wang, Yi

    2015-01-01

    Screening and identifying active compounds from traditional Chinese medicine (TCM) and other natural products plays an important role in drug discovery. Here, we describe a magnetic beads-based multi-target affinity selection-mass spectrometry approach for screening bioactive compounds from natural products. Key steps and parameters including activation of magnetic beads, enzyme/protein immobilization, characterization of functional magnetic beads, screening and identifying active compounds from a complex mixture by LC/MS, are illustrated. The proposed approach is rapid and efficient in screening and identification of bioactive compounds from complex natural products.

  8. Modeling and Grid Generation of Iced Airfoils

    NASA Technical Reports Server (NTRS)

    Vickerman, Mary B.; Baez, Marivell; Braun, Donald C.; Hackenberg, Anthony W.; Pennline, James A.; Schilling, Herbert W.

    2007-01-01

    SmaggIce Version 2.0 is a software toolkit for geometric modeling and grid generation for two-dimensional, single- and multi-element, clean and iced airfoils. A previous version of SmaggIce was described in Preparing and Analyzing Iced Airfoils, NASA Tech Briefs, Vol. 28, No. 8 (August 2004), page 32. To recapitulate: Ice shapes make it difficult to generate quality grids around airfoils, yet these grids are essential for predicting ice-induced complex flow. This software efficiently creates high-quality structured grids with tools that are uniquely tailored for various ice shapes. SmaggIce Version 2.0 significantly enhances the previous version primarily by adding the capability to generate grids for multi-element airfoils. This version of the software is an important step in streamlining the aeronautical analysis of iced airfoils using computational fluid dynamics (CFD) tools. The user may prepare the ice shape, define the flow domain, decompose it into blocks, generate grids, modify/divide/merge blocks, and control grid density and smoothness. All these steps may be performed efficiently even for the difficult glaze and rime ice shapes. Providing the means to generate highly controlled grids near rough ice, the software includes the creation of a wrap-around block (called the "viscous sublayer block"), which is a thin, C-type block around the wake line and iced airfoil. For multi-element airfoils, the software makes use of grids that wrap around and fill in the areas between the viscous sublayer blocks for all elements that make up the airfoil. A scripting feature records the history of interactive steps, which can be edited and replayed later to produce other grids. Using this version of SmaggIce, ice shape handling and grid generation can become a practical engineering process, rather than a laborious research effort.

  9. Adenoma–carcinoma sequence in intrahepatic cholangiocarcinoma

    PubMed Central

    Pinho, André Costa; Melo, Renato Bessa; Oliveira, Manuel; Almeida, Marinho; Lopes, Joanne; Graça, Luís; Costa-Maia, J.

    2012-01-01

    Introduction Cholangiocarcinoma is a rare tumor but recent data report a worldwide increase in incidence and mortality. There are several risk factors associated with cholangiocarcinoma, and chronic inflammation of the biliary tree seems to be implicated in cholangiocarcinogenesis, but little is known about this process. Presentation of case We present a 56-year-old female with a bile duct adenoma, incidentally discovered during follow-up for breast cancer, that 18 months later progressed to intrahepatic cholangiocarcinoma. Discussion This is a rare presentation of intrahepatic cholangiocarcinoma that suggests the classic adenoma-carcinoma sequence in cholangiocarcinogenesis. Furthermore, this case raises some questions about possible common ground between intrahepatic cholangiocarcinoma and breast cancer. Conclusion Cholangiocarcinogenesis is a complex multi-step mechanism and further investigations are needed to fully understand this process. PMID:22326450

  10. High-throughput protein concentration and buffer exchange: comparison of ultrafiltration and ammonium sulfate precipitation.

    PubMed

    Moore, Priscilla A; Kery, Vladimir

    2009-01-01

    High-throughput protein purification is a complex, multi-step process. There are several technical challenges in the course of this process that are not experienced when purifying a single protein. Among the most challenging are high-throughput protein concentration and buffer exchange, which are not only labor-intensive but can also result in significant losses of purified proteins. We describe two methods of high-throughput protein concentration and buffer exchange: one using ammonium sulfate precipitation and one using micro-concentrating devices based on membrane ultrafiltration. We evaluated the efficiency of both methods on a set of 18 randomly selected purified proteins from Shewanella oneidensis. While both methods provide similar yield and efficiency, ammonium sulfate precipitation is much less labor-intensive and time-consuming than ultrafiltration.

  11. A multi-step chromatographic strategy to purify three fungal endo-β-glucanases.

    PubMed

    McCarthy, Tracey; Tuohy, Maria G

    2011-01-01

    Fungi and fungal enzymes have traditionally occupied a central role in biotechnology. Understanding the biochemical properties of the variety of enzymes produced by these eukaryotes has been an area of research interest for decades and again more recently due to global interest in greener bio-production technologies. Purification of an individual enzyme allows its unique biochemical and functional properties to be determined, can provide key information as to the role of individual biocatalysts within a complex enzyme system, and can inform both protein engineering and enzyme production strategies in the development of novel green technologies based on fungal biocatalysts. Many enzymes of current biotechnological interest are secreted by fungi into the extracellular culture medium. These crude enzyme mixtures are typically complex, multi-component, and generally also contain other non-enzymatic proteins and secondary metabolites. In this chapter, we describe a multi-step chromatographic strategy required to isolate three new endo-β-glucanases (denoted EG V, EG VI, and EG VII) with activity against cereal mixed-linkage β-glucans from the thermophilic fungus Talaromyces emersonii. This work also illustrates the challenges frequently involved in isolating individual extracellular fungal proteins in general.

  12. Enhancing the functional properties of thermophilic enzymes by chemical modification and immobilization.

    PubMed

    Cowan, Don A; Fernandez-Lafuente, Roberto

    2011-09-10

    The immobilization of proteins (most typically enzymes) onto solid supports is a mature technology and has been used successfully to enhance biocatalytic processes in a wide range of industrial applications. However, continued developments in immobilization technology have led to more sophisticated and specialized applications of the process. A combination of targeted chemistries, for both the support and the protein, sometimes in combination with additional chemical and/or genetic engineering, has led to the development of methods for the modification of protein functional properties, for enhancing protein stability and for the recovery of specific proteins from complex mixtures. In particular, the development of effective methods for immobilizing large multi-subunit proteins with multiple covalent linkages (multi-point immobilization) has been effective in stabilizing proteins where subunit dissociation is the initial step in enzyme inactivation. In some instances, multiple benefits are achievable in a single process. Here we comprehensively review the literature pertaining to immobilization and chemical modification of different enzyme classes from thermophiles, with emphasis on the chemistries involved and their implications for modification of the enzyme functional properties. We also highlight the potential for synergies in the combined use of immobilization and other chemical modifications. Copyright © 2011 Elsevier Inc. All rights reserved.

  13. The advanced linked extended reconnaissance and targeting technology demonstration project

    NASA Astrophysics Data System (ADS)

    Cruickshank, James; de Villers, Yves; Maheux, Jean; Edwards, Mark; Gains, David; Rea, Terry; Banbury, Simon; Gauthier, Michelle

    2007-06-01

    The Advanced Linked Extended Reconnaissance & Targeting (ALERT) Technology Demonstration (TD) project is addressing key operational needs of the future Canadian Army's Surveillance and Reconnaissance forces by fusing multi-sensor and tactical data, developing automated processes, and integrating beyond line-of-sight sensing. We discuss concepts for displaying and fusing multi-sensor and tactical data within an Enhanced Operator Control Station (EOCS). The sensor data can originate from the Coyote's own visible-band and IR cameras, laser rangefinder, and ground-surveillance radar, as well as beyond line-of-sight systems such as a mini-UAV and unattended ground sensors. The authors address technical issues associated with the use of fully digital IR and day video cameras and discuss video-rate image processing developed to assist the operator to recognize poorly visible targets. Automatic target detection and recognition algorithms processing both IR and visible-band images have been investigated to draw the operator's attention to possible targets. The machine-generated information display requirements are presented together with the human factors engineering aspects of the user interface in this complex environment, with a view to establishing user trust in the automation. The paper concludes with a summary of achievements to date and steps to project completion.

  14. Short Note on Complexity of Multi-Value Byzantine Agreement

    DTIC Science & Technology

    2010-07-27

    which lead to nBl/D bits over the whole algorithm. Broadcasts in extended step: In the extended step, every node broadcasts D bits. Thus nDB bits... bits, as: (n−1)l + n(n−1)(k + D/k)l/D + nBl/D + nDBt(t+1) (4) = (n−1)l + O(n²kl/D + n²l/k + nBl/D + n³BD). (5) Notice that broadcast algorithm of
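    The closed-form count in Eqs. (4)-(5) can be sanity-checked numerically. The sketch below is a hedged illustration only: the reading of the symbols (n nodes, message length l, parameters k and D, broadcast cost factor B, fault bound t) is our assumption, since the record above is just a fragment of the DTIC report.

```python
# Hedged sketch: evaluate the per-run bit count from Eq. (4) of the record.
# The interpretation of the symbols (n nodes, message length l, parameters
# k and D, broadcast cost factor B, fault bound t) is our assumption.
def total_bits(n, l, D, k, B, t):
    return ((n - 1) * l
            + n * (n - 1) * (k + D / k) * l / D
            + n * B * l / D
            + n * D * B * t * (t + 1))

# Only the (k + D/k) factor depends on k, so the best k sits near sqrt(D).
n, l, D, B, t = 10, 10**6, 100, 8, 3
best_k = min(range(1, D + 1), key=lambda k: total_bits(n, l, D, k, B, t))
```

For these invented values the grid search lands on k = √D = 10, where the k + D/k factor is minimized.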

  15. Defining process design space for a hydrophobic interaction chromatography (HIC) purification step: application of quality by design (QbD) principles.

    PubMed

    Jiang, Canping; Flansburg, Lisa; Ghose, Sanchayita; Jorjorian, Paul; Shukla, Abhinav A

    2010-12-15

    The concept of design space has been taking root under the quality by design paradigm as a foundation of in-process control strategies for biopharmaceutical manufacturing processes. This paper outlines the development of a design space for a hydrophobic interaction chromatography (HIC) process step. The design space included the impact of raw material lot-to-lot variability and variations in the feed stream from cell culture. A failure modes and effects analysis was employed as the basis for the process characterization exercise. During mapping of the process design space, multi-dimensional combinations of operational variables were studied to quantify their impact on process performance in terms of yield and product quality. Variability in resin hydrophobicity was found to have a significant influence on step yield and high-molecular-weight aggregate clearance through the HIC step. A robust operating window was identified for this process step that enabled a higher step yield while ensuring acceptable product quality. © 2010 Wiley Periodicals, Inc.
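    The grid-style mapping of operational variables described above can be sketched as follows. The response function, variable names, levels, and acceptance limit are invented surrogates for the measured yield data, not values from the study.

```python
from itertools import product

# Hypothetical sketch of design-space mapping for a chromatography step:
# evaluate a response over a factorial grid of operating variables and keep
# the settings that meet the yield target. The surrogate response function,
# variable names, and limits are all invented for illustration.
def step_yield(salt_conc, load):
    """Invented surrogate for measured step yield (%)."""
    return 95.0 - 20.0 * abs(salt_conc - 1.0) - 0.3 * abs(load - 30.0)

salt_levels = (0.6, 0.8, 1.0, 1.2, 1.4)   # e.g. salt concentration, M
load_levels = (20, 30, 40)                # e.g. g protein / L resin

design_space = [(s, g) for s, g in product(salt_levels, load_levels)
                if step_yield(s, g) >= 85.0]
```

In practice each grid point would correspond to a DoE run (or a prediction from the statistical model fitted to those runs), and product-quality responses would be screened alongside yield.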

  16. Cellulosome-based, Clostridium-derived multi-functional enzyme complexes for advanced biotechnology tool development: advances and applications.

    PubMed

    Hyeon, Jeong Eun; Jeon, Sang Duck; Han, Sung Ok

    2013-11-01

    The cellulosome is one of nature's most elegant and elaborate nanomachines and a key biological and biotechnological macromolecule that can be used as a multi-functional protein complex tool. Each protein module in the cellulosome system is potentially useful in an advanced biotechnology application. The high-affinity interactions between the cohesin and dockerin domains can be used in protein-based biosensors to improve both sensitivity and selectivity. The scaffolding protein includes a carbohydrate-binding module (CBM) that attaches strongly to cellulose substrates and facilitates the purification of proteins fused with the dockerin module through a one-step CBM purification method. Although the surface layer homology (SLH) domain of CbpA is not present in other strains, replacement of the cell surface anchoring domain allows a foreign protein to be displayed on the surface of other strains. The development of a hydrolysis enzyme complex is a useful strategy for consolidated bioprocessing (CBP), endowing microorganisms with biomass hydrolysis activity. Thus, the development of various configurations of multi-functional protein complexes for use as tools in whole-cell biocatalyst systems has drawn considerable attention as an attractive strategy for bioprocess applications. This review provides a detailed summary of the current achievements in Clostridium-derived multi-functional complex development and the impact of these complexes in various areas of biotechnology. Copyright © 2013 Elsevier Inc. All rights reserved.

  17. Synthesis of highly aligned carbon nanotubes by one-step liquid-phase process: Effects of carbon sources on morphology of carbon nanotubes

    NASA Astrophysics Data System (ADS)

    Yamagiwa, Kiyofumi; Kuwano, Jun

    2017-06-01

    This paper describes a unique and innovative synthesis technique for carbon nanotubes (CNTs) by a one-step liquid-phase process under ambient pressure. Vertically aligned multi-walled CNT arrays with a maximum height of 100 µm are prepared on stainless steel substrates, which are submerged and electrically heated in straight-chain primary alcohols with nC = 1-4 (nC: number of C atoms in the molecule) containing an appropriate amount of a cobalt-based organometallic complex as a catalyst precursor. Structural isomers of butanol were also used for the synthesis to examine the effects of structural factors on the morphology of the deposited products. Notably, 2-methyl-2-propanol, which is a tertiary alcohol, produced only a small amount of low-crystallinity carbonaceous deposits, whereas vertically aligned CNTs were grown from the other isomers of butanol. These results suggest that the presence or absence of β-hydrogen in the molecular structure is a key factor for understanding the dissociation behavior of the carbon-source molecules on the catalyst.

  18. Dissolution of Fe(III) (hydr) oxides by metal-EDTA complexes

    NASA Astrophysics Data System (ADS)

    Nowack, Bernd; Sigg, Laura

    1997-03-01

    The dissolution of Fe(III)(hydr)oxides (goethite and hydrous ferric oxide) by metal-EDTA complexes occurs by ligand-promoted dissolution. The process is initiated by the adsorption of metal-EDTA complexes to the surface and is followed by the dissociation of the complex at the surface and the release of Fe(III)EDTA into solution. The dissolution rate is decreased to a great extent if EDTA is complexed by metals, in comparison to uncomplexed EDTA. The rate decreases in the order EDTA ≈ CaEDTA ≫ PbEDTA > ZnEDTA > CuEDTA > Co(II)EDTA > NiEDTA. Two different rate-limiting steps determine the dissolution process: (1) detachment of Fe(III) from the oxide structure and (2) dissociation of the metal-EDTA complexes. In the case of goethite, step 1 is slower than step 2 and the dissolution rates by various metals are similar. In the case of hydrous ferric oxide, step 2 is rate-limiting and the effect of the complexed metal is very pronounced.
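    The two regimes (step 1 vs. step 2 rate-limiting) can be illustrated with a minimal series-kinetics sketch. The series-resistance form and all rate constants below are illustrative assumptions, not values from the paper.

```python
# Minimal sketch of two sequential first-order steps: detachment of Fe(III)
# (step 1) followed by dissociation of the metal-EDTA complex (step 2).
# The series-resistance form and the numerical rate constants are invented
# for illustration only.
def overall_rate(k_detach, k_dissociate):
    return 1.0 / (1.0 / k_detach + 1.0 / k_dissociate)

# Goethite-like regime: step 1 slow, so the metal bound to EDTA barely matters
r_cu = overall_rate(1e-3, 0.5)   # "CuEDTA": slower step 2
r_ca = overall_rate(1e-3, 5.0)   # "CaEDTA": faster step 2

# HFO-like regime: step 1 fast, so step 2 (metal-dependent) controls the rate
r_cu_hfo = overall_rate(1.0, 0.5)
r_ca_hfo = overall_rate(1.0, 5.0)
```

With a slow step 1 the two overall rates are nearly identical, matching the goethite observation; with a fast step 1 the metal-dependent step 2 dominates, matching the hydrous-ferric-oxide case.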

  19. GIST Clinic Application 2018 | Center for Cancer Research

    Cancer.gov

    Clinic date: June 20-22, 2018 This Application is the first step in a multi-step process for being considered for participation in our upcoming Pediatric and wild-type GIST clinic. Please review all 3 pages and complete all questions in full.

  20. Production of multi-fiber modifying enzyme from Mamillisphaeria sp. for refining of recycled paper pulp.

    PubMed

    Laothanachareon, Thanaporn; Khonzue, Parichart; Rattanaphan, Nakul; Tinnasulanon, Phungjai; Apawasin, Saowanee; Paemanee, Atchara; Ruanglek, Vasimon; Tanapongpipat, Sutipa; Champreda, Verawat; Eurwilaichitr, Lily

    2011-01-01

    Enzymatic modification of pulp is receiving increasing interest for energy reduction at the refining step of the paper-making process. In this study, the production of a multi-fiber modifying enzyme from Mamillisphaeria sp. BCC8893 was optimized in submerged fermentation using a response-surface methodology. Maximal production was obtained in a complex medium comprising wheat bran, soybean, and rice bran supplemented with yeast extract at pH 6.0 and a harvest time of 7 d, resulting in 9.2 IU/mL of carboxymethyl cellulase (CMCase), 14.9 IU/mL of filter paper activity (FPase), and 242.7 IU/mL of xylanase. Treatment of old corrugated container pulp at 0.2-0.3 IU of CMCase/g of pulp led to reductions in refining energy of 8.5-14.8%. The major physical properties were retained, including tensile and compression strength. Proteomic analysis showed that the enzyme was a complex composite of endo-glucanases, cellobiohydrolases, beta-1,4-xylanases, and beta-glucanases belonging to various glycosyl hydrolase families, suggestive of cooperative enzyme action in fiber modification, providing the basis for refining efficiency.

  1. Decision tree-based method for integrating gene expression, demographic, and clinical data to determine disease endotypes

    PubMed Central

    2013-01-01

    Background Complex diseases are often difficult to diagnose, treat and study due to the multi-factorial nature of the underlying etiology. Large data sets are now widely available that can be used to define novel, mechanistically distinct disease subtypes (endotypes) in a completely data-driven manner. However, significant challenges exist with regard to how to segregate individuals into suitable subtypes of the disease and understand the distinct biological mechanisms of each when the goal is to maximize the discovery potential of these data sets. Results A multi-step decision tree-based method is described for defining endotypes based on gene expression, clinical covariates, and disease indicators, using childhood asthma as a case study. We attempted alternative approaches such as the Student’s t-test, single-data-domain clustering, and the Modk-prototypes algorithm, which incorporates multiple data domains into a single analysis, and none performed as well as the novel multi-step decision tree method. This new method gave the best segregation of asthmatics and non-asthmatics, and it provides easy access to all genes and clinical covariates that distinguish the groups. Conclusions The multi-step decision tree method described here will lead to better understanding of complex disease in general by allowing purely data-driven disease endotypes to facilitate the discovery of new mechanisms underlying these diseases. This application should be considered a complement to ongoing efforts to better define and diagnose known endotypes. When coupled with existing methods developed to determine the genetics of gene expression, these methods provide a mechanism for linking genetics and exposomics data and thereby accounting for both major determinants of disease. PMID:24188919
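    The stepwise segregation idea can be caricatured in a few lines: first split subjects on a disease indicator, then sub-split each branch on an expression feature. All features, thresholds, and records below are invented; the actual method selects genes and covariates in a data-driven way.

```python
# Toy sketch of a two-step segregation in the spirit of the record above.
# The gene name, threshold, and subject records are invented placeholders.
def multi_step_split(subjects, indicator, gene, cutoff):
    """Step 1: split on a disease indicator; step 2: split on expression."""
    groups = {"case_hi": [], "case_lo": [], "ctrl_hi": [], "ctrl_lo": []}
    for s in subjects:
        arm = "case" if s[indicator] else "ctrl"
        level = "hi" if s[gene] >= cutoff else "lo"
        groups[f"{arm}_{level}"].append(s["id"])
    return groups

subjects = [
    {"id": 1, "asthma": True,  "GENE_X_expr": 8.2},
    {"id": 2, "asthma": True,  "GENE_X_expr": 2.1},
    {"id": 3, "asthma": False, "GENE_X_expr": 7.5},
    {"id": 4, "asthma": False, "GENE_X_expr": 1.9},
]
endotypes = multi_step_split(subjects, "asthma", "GENE_X_expr", 5.0)
```

Each leaf of such a tree is a candidate endotype; the published method additionally scores which genes and clinical covariates best distinguish the resulting groups.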

  2. The dimerization of the yeast cytochrome bc1 complex is an early event and is independent of Rip1.

    PubMed

    Conte, Annalea; Papa, Benedetta; Ferramosca, Alessandra; Zara, Vincenzo

    2015-05-01

    In Saccharomyces cerevisiae the mature cytochrome bc1 complex exists as an obligate homo-dimer in which each monomer consists of ten distinct protein subunits inserted into or bound to the inner mitochondrial membrane. Among them, the Rieske iron-sulfur protein (Rip1), besides its catalytic role in electron transfer, may be implicated in the bc1 complex dimerization. Indeed, Rip1 has the globular domain containing the catalytic center in one monomer while the transmembrane helix interacts with the adjacent monomer. In addition, the lack of Rip1 leads to the accumulation of an immature bc1 intermediate, only loosely associated with cytochrome c oxidase. In this study we investigated the biogenesis of the yeast cytochrome bc1 complex using epitope-tagged proteins to purify native assembly intermediates. We showed that the dimerization process is an early event during bc1 complex biogenesis and that, contrary to previous proposals, the presence of Rip1 is not essential for this process. We also investigated the multi-step model of bc1 assembly, thereby lending further support to the existence of bona fide subcomplexes during bc1 maturation in the inner mitochondrial membrane. Finally, a new model of cytochrome bc1 complex assembly, in which distinct intermediates sequentially interact during bc1 maturation, has been proposed. Copyright © 2015 Elsevier B.V. All rights reserved.

  3. On the evolutionary advantage of multi-cusped teeth

    PubMed Central

    Bush, Mark B.; Barani, Amir; Lawn, Brian R.

    2016-01-01

    A hallmark of mammalian evolution is a progressive complexity in postcanine tooth morphology. However, the driving force for this complexity remains unclear: whether to expand the versatility in diet source, or to bolster tooth structural integrity. In this study, we take a quantitative approach to this question by examining the roles of number, position and height of multiple cusps in determining sustainable bite forces. Our approach is to use an extended finite-element methodology with due provision for step-by-step growth of an embedded crack to determine how fracture progresses with increasing occlusal load. We argue that multi-cusp postcanine teeth are well configured to withstand high bite forces provided that multiple cusps are contacted simultaneously to share the load. However, contact on a single near-wall cusp diminishes the strength. Location of the load points and cusp height, rather than cusp number or radius, are principal governing factors. Given these findings, we conclude that while complex tooth structures can enhance durability, increases in cusp number are more likely to be driven by the demands of food manipulation. Structural integrity of complex teeth is maintained when individual cusps remain sufficiently distant from the side walls and do not become excessively tall relative to tooth width. PMID:27558851

  4. Application of the quality by design approach to the drug substance manufacturing process of an Fc fusion protein: towards a global multi-step design space.

    PubMed

    Eon-duval, Alex; Valax, Pascal; Solacroup, Thomas; Broly, Hervé; Gleixner, Ralf; Strat, Claire L E; Sutter, James

    2012-10-01

    The article describes how Quality by Design principles can be applied to the drug substance manufacturing process of an Fc fusion protein. First, the quality attributes of the product were evaluated for their potential impact on safety and efficacy using risk management tools. Similarly, process parameters that have a potential impact on critical quality attributes (CQAs) were also identified through a risk assessment. Critical process parameters were then evaluated for their impact on CQAs, individually and in interaction with each other, using multivariate design of experiment techniques during the process characterisation phase. The global multi-step Design Space, defining operational limits for the entire drug substance manufacturing process so as to ensure that the drug substance quality targets are met, was devised using predictive statistical models developed during the characterisation study. The validity of the global multi-step Design Space was then confirmed by performing the entire process, from cell bank thawing to final drug substance, at its limits during the robustness study: the quality of the final drug substance produced under different conditions was verified against predefined targets. An adaptive strategy was devised whereby the Design Space can be adjusted to the quality of the input material to ensure reliable drug substance quality. Finally, all the data obtained during the process described above, together with data generated during additional validation studies as well as manufacturing data, were used to define the control strategy for the drug substance manufacturing process using a risk assessment methodology. Copyright © 2012 Wiley-Liss, Inc.

  5. Regulation of wound healing and fibrosis by hypoxia and hypoxia-inducible factor-1.

    PubMed

    Ruthenborg, Robin J; Ban, Jae-Jun; Wazir, Anum; Takeda, Norihiko; Kim, Jung-Whan

    2014-09-01

    Wound healing is a complex multi-step process that requires spatial and temporal orchestration of cellular and non-cellular components. Hypoxia is one of the prominent microenvironmental factors in tissue injury and wound healing. Hypoxic responses, mainly mediated by a master transcription factor of oxygen homeostasis, hypoxia-inducible factor-1 (HIF-1), have been shown to be critically involved in virtually all processes of wound healing and remodeling. Yet, mechanisms underlying hypoxic regulation of wound healing are still poorly understood. Better understanding of how the wound healing process is regulated by the hypoxic microenvironment and HIF-1 signaling pathway will provide insight into the development of a novel therapeutic strategy for impaired wound healing conditions such as diabetic wounds and fibrosis. In this review, we will discuss recent studies illuminating the roles of HIF-1 in physiologic and pathologic wound repair and, further, the therapeutic potential of HIF-1 stabilization or inhibition.

  6. 4D-SFM Photogrammetry for Monitoring Sediment Dynamics in a Debris-Flow Catchment: Software Testing and Results Comparison

    NASA Astrophysics Data System (ADS)

    Cucchiaro, S.; Maset, E.; Fusiello, A.; Cazorzi, F.

    2018-05-01

    In recent years, the combination of Structure-from-Motion (SfM) algorithms and UAV-based aerial images has revolutionised 3D topographic surveys for natural environment monitoring, offering low-cost, fast and high-quality data acquisition and processing. Continuous monitoring of morphological changes through multi-temporal (4D) SfM surveys makes it possible, for example, to analyse torrent dynamics even in complex topographic environments such as debris-flow catchments, provided that appropriate tools and procedures are employed in the data-processing steps. In this work we test two different software packages (3DF Zephyr Aerial and Agisoft Photoscan) on a dataset composed of both UAV and terrestrial images acquired on a debris-flow reach (Moscardo torrent - North-eastern Italian Alps). Unlike other papers in the literature, we evaluate the results not only on the raw point clouds generated by the Structure-from-Motion and Multi-View Stereo algorithms, but also on the Digital Terrain Models (DTMs) created after post-processing. Outcomes show differences between the DTMs that can be considered irrelevant for the geomorphological phenomena under analysis. This study confirms that SfM photogrammetry can be a valuable tool for monitoring sediment dynamics, but accurate point cloud post-processing is required to reliably localize geomorphological changes.
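    A standard post-processing step implied above is differencing two DTMs and keeping only the changes that exceed a detection limit (a DEM of Difference). A minimal sketch might look like this; the grids, uncertainty values, and threshold are synthetic.

```python
import numpy as np

# Sketch of a DEM-of-Difference (DoD) between two survey epochs, thresholded
# by a propagated elevation uncertainty. Grids and error values are synthetic.
def dod(dtm_new, dtm_old, sigma_new, sigma_old, k=1.96):
    """Elevation change, masked to zero below the combined detection limit."""
    diff = dtm_new - dtm_old
    lod = k * np.sqrt(sigma_new**2 + sigma_old**2)   # limit of detection
    return np.where(np.abs(diff) > lod, diff, 0.0)

rng = np.random.default_rng(0)
old = rng.normal(100.0, 0.02, (50, 50))   # synthetic 50x50 DTM (m)
new = old.copy()
new[10:20, 10:20] += 0.5                  # synthetic deposition patch
change = dod(new, old, 0.03, 0.03)
erosion = change[change < 0].sum()
deposition = change[change > 0].sum()
```

Summing the positive and negative cells of the thresholded raster gives volumetric deposition and erosion estimates once multiplied by the cell area.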

  7. Progress in centralised ethics review processes: Implications for multi-site health evaluations.

    PubMed

    Prosser, Brenton; Davey, Rachel; Gibson, Diane

    2015-04-01

    Increasingly, public sector programmes respond to complex social problems that intersect specific fields and individual disciplines. Such responses result in multi-site initiatives that can span nations, jurisdictions, sectors and organisations. The rigorous evaluation of public sector programmes is now a baseline expectation. For evaluations of large and complex multi-site programme initiatives, the processes of ethics review can present a significant challenge. However in recent years, there have been new developments in centralised ethics review processes in many nations. This paper provides the case study of an evaluation of a national, inter-jurisdictional, cross-sector, aged care health initiative and its encounters with Australian centralised ethics review processes. Specifically, the paper considers progress against the key themes of a previous five-year, five nation study (Fitzgerald and Phillips, 2006), which found that centralised ethics review processes would save time, money and effort, as well as contribute to more equitable workloads for researchers and evaluators. The paper concludes with insights for those charged with refining centralised ethics review processes, as well as recommendations for future evaluators of complex multi-site programme initiatives. Copyright © 2015 Elsevier Ltd. All rights reserved.

  8. EUV patterning using CAR or MOX photoresist at low dose exposure for sub 36nm pitch

    NASA Astrophysics Data System (ADS)

    Thibaut, Sophie; Raley, Angélique; Lazarrino, Frederic; Mao, Ming; De Simone, Danilo; Piumi, Daniele; Barla, Kathy; Ko, Akiteru; Metz, Andrew; Kumar, Kaushik; Biolsi, Peter

    2018-04-01

    The semiconductor industry has been pushing the limits of scalability by combining 193nm immersion lithography with multi-patterning techniques for several years. These integrations have been offered in a wide variety of options to lower their cost, but they retain their inherent variability and process complexity. EUV lithography offers a much desired path that allows for direct print of line and space at 36nm pitch and below and effectively addresses issues like cycle time, intra-level overlay and mask count costs associated with multi-patterning. However, it also brings its own set of challenges. One of the major barriers to high-volume manufacturing implementation has been reaching the 250 W exposure source power required for adequate throughput [1]. Enabling patterning using a lower-dose resist could help move us closer to the HVM throughput targets, assuming the required roughness and pattern-transfer performance can be met. As plasma etching is known to reduce line edge roughness on features printed with 193nm lithography [2], we investigate in this paper the level of roughness that can be achieved on EUV photoresist exposed at a lower dose through etch process optimization into a typical back end of line film stack. We will study 16nm lines printed at 32 and 34nm pitch. MOX and CAR photoresist performance will be compared. We will review step-by-step etch chemistry development to reach adequate selectivity and roughness reduction to successfully pattern the target layer.

  9. CorRECTreatment: A Web-based Decision Support Tool for Rectal Cancer Treatment that Uses the Analytic Hierarchy Process and Decision Tree

    PubMed Central

    Suner, A.; Karakülah, G.; Dicle, O.; Sökmen, S.; Çelikoğlu, C.C.

    2015-01-01

    Summary Background The selection of appropriate rectal cancer treatment is a complex multi-criteria decision making process, in which clinical decision support systems might be used to assist and enrich physicians’ decision making. Objective The objective of the study was to develop a web-based clinical decision support tool for physicians in the selection of potentially beneficial treatment options for patients with rectal cancer. Methods The updated decision model contained 8 and 10 criteria in the first and second steps, respectively. The decision support model, developed in our previous study by combining the Analytic Hierarchy Process (AHP) method, which determines the priority of criteria, with a decision tree formed using these priorities, was updated and applied to data collected retrospectively from 388 patients. A web-based decision support tool named corRECTreatment was then developed. The compatibility of the treatment recommendations of the expert opinion and the decision support tool was examined for consistency. Two surgeons were asked to recommend a treatment and an overall survival value for 20 different cases that we selected from the most common and rare treatment options in the patient data set and turned into scenarios. Results In the AHP analyses of the criteria, the matrices generated for both decision steps were found to be consistent (consistency ratio < 0.1). Depending on the decisions of the experts, the consistency value for the most frequent cases was 80% for the first decision step and 100% for the second; for rare cases, consistency was 50% for the first decision step and 80% for the second. Conclusions The decision model and corRECTreatment, developed by applying the model to real patient data, are expected to provide potential users with decision support in rectal cancer treatment processes and to facilitate projections about treatment options. PMID:25848413

  10. CorRECTreatment: a web-based decision support tool for rectal cancer treatment that uses the analytic hierarchy process and decision tree.

    PubMed

    Suner, A; Karakülah, G; Dicle, O; Sökmen, S; Çelikoğlu, C C

    2015-01-01

    The selection of appropriate rectal cancer treatment is a complex multi-criteria decision making process, in which clinical decision support systems might be used to assist and enrich physicians' decision making. The objective of the study was to develop a web-based clinical decision support tool for physicians in the selection of potentially beneficial treatment options for patients with rectal cancer. The updated decision model contained 8 and 10 criteria in the first and second steps respectively. The decision support model, developed in our previous study by combining the Analytic Hierarchy Process (AHP) method which determines the priority of criteria and decision tree that formed using these priorities, was updated and applied to 388 patients data collected retrospectively. Later, a web-based decision support tool named corRECTreatment was developed. The compatibility of the treatment recommendations by the expert opinion and the decision support tool was examined for its consistency. Two surgeons were requested to recommend a treatment and an overall survival value for the treatment among 20 different cases that we selected and turned into a scenario among the most common and rare treatment options in the patient data set. In the AHP analyses of the criteria, it was found that the matrices, generated for both decision steps, were consistent (consistency ratio<0.1). Depending on the decisions of experts, the consistency value for the most frequent cases was found to be 80% for the first decision step and 100% for the second decision step. Similarly, for rare cases consistency was 50% for the first decision step and 80% for the second decision step. The decision model and corRECTreatment, developed by applying these on real patient data, are expected to provide potential users with decision support in rectal cancer treatment processes and facilitate them in making projections about treatment options.
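    The AHP consistency check the study applies (accepting a pairwise comparison matrix when its consistency ratio is below 0.1) can be sketched as follows; the 3x3 matrix here is a made-up illustration, not one of the study's matrices:

```python
import numpy as np

# Saaty's random index (RI) for matrix sizes 1..10 (standard published values).
RANDOM_INDEX = [0.0, 0.0, 0.58, 0.90, 1.12, 1.24, 1.32, 1.41, 1.45, 1.49]

def consistency_ratio(matrix):
    """AHP consistency ratio CR = CI / RI of a pairwise comparison matrix,
    where CI = (lambda_max - n) / (n - 1) and lambda_max is the principal
    eigenvalue. A matrix is conventionally accepted when CR < 0.1."""
    a = np.asarray(matrix, dtype=float)
    n = a.shape[0]
    lambda_max = max(np.linalg.eigvals(a).real)  # Perron eigenvalue is real
    ci = (lambda_max - n) / (n - 1)
    return ci / RANDOM_INDEX[n - 1]

# Hypothetical pairwise comparisons of three criteria (perfectly consistent,
# since 2 * 2 = 4), so CR evaluates to 0 and the matrix would be accepted.
m = [[1, 2, 4],
     [1/2, 1, 2],
     [1/4, 1/2, 1]]
cr = consistency_ratio(m)
print(f"CR = {cr:.3f}")
```

A real decision model would compute the criteria weights from the principal eigenvector of each accepted matrix before feeding them into the decision tree.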

  11. Complex Degradation Processes Lead to Non-Exponential Decay Patterns and Age-Dependent Decay Rates of Messenger RNA

    PubMed Central

    Deneke, Carlus; Lipowsky, Reinhard; Valleriani, Angelo

    2013-01-01

    Experimental studies on mRNA stability have established several, qualitatively distinct decay patterns for the amount of mRNA within the living cell. Furthermore, a variety of different and complex biochemical pathways for mRNA degradation have been identified. The central aim of this paper is to bring together both the experimental evidence about the decay patterns and the biochemical knowledge about the multi-step nature of mRNA degradation in a coherent mathematical theory. We first introduce a mathematical relationship between the mRNA decay pattern and the lifetime distribution of individual mRNA molecules. This relationship reveals that the mRNA decay patterns at steady state expression level must obey a general convexity condition, which applies to any degradation mechanism. Next, we develop a theory, formulated as a Markov chain model, that recapitulates some aspects of the multi-step nature of mRNA degradation. We apply our theory to experimental data for yeast and explicitly derive the lifetime distribution of the corresponding mRNAs. Thereby, we show how to extract single-molecule properties of an mRNA, such as the age-dependent decay rate and the residual lifetime. Finally, we analyze the decay patterns of the whole translatome of yeast cells and show that yeast mRNAs can be grouped into three broad classes that exhibit three distinct decay patterns. This paper provides both a method to accurately analyze non-exponential mRNA decay patterns and a tool to validate different models of degradation using decay data. PMID:23408982
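    The age-dependent decay rate the theory extracts can be illustrated with the simplest multi-step case: a two-step sequential degradation pathway, whose lifetime distribution is hypoexponential. The rates below are illustrative and not fitted to the yeast data:

```python
import math

def survival(t, k1, k2):
    """Survival probability S(t) of an mRNA degraded in two sequential
    exponential steps with distinct rates k1 and k2."""
    return (k2 * math.exp(-k1 * t) - k1 * math.exp(-k2 * t)) / (k2 - k1)

def density(t, k1, k2):
    """Lifetime probability density f(t) = -dS/dt."""
    return k1 * k2 / (k2 - k1) * (math.exp(-k1 * t) - math.exp(-k2 * t))

def age_dependent_rate(t, k1, k2):
    """Instantaneous decay rate of molecules that reached age t: r(t) = f(t)/S(t).
    A one-step (exponential) model gives a constant r; here r rises from 0
    toward min(k1, k2), i.e. old molecules decay faster than young ones."""
    return density(t, k1, k2) / survival(t, k1, k2)

k1, k2 = 0.5, 2.0  # illustrative rates (per minute)
print(age_dependent_rate(0.01, k1, k2))  # near 0: a fresh mRNA rarely decays
print(age_dependent_rate(20.0, k1, k2))  # approaches min(k1, k2) = 0.5
```

The resulting decay pattern at steady state is convex, consistent with the general condition derived in the paper.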

  12. Regulation of floral stem cell termination in Arabidopsis

    PubMed Central

    Sun, Bo; Ito, Toshiro

    2015-01-01

    In Arabidopsis, floral stem cells are maintained only at the initial stages of flower development, and they are terminated at a specific time to ensure proper development of the reproductive organs. Floral stem cell termination is a dynamic and multi-step process involving many transcription factors, chromatin remodeling factors and signaling pathways. In this review, we discuss the mechanisms involved in floral stem cell maintenance and termination, highlighting the interplay between transcriptional regulation and epigenetic machinery in the control of specific floral developmental genes. In addition, we discuss additional factors involved in floral stem cell regulation, with the goal of untangling the complexity of the floral stem cell regulatory network. PMID:25699061

  13. Combining biophysical methods for the analysis of protein complex stoichiometry and affinity in SEDPHAT

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhao, Huaying, E-mail: zhaoh3@mail.nih.gov; Schuck, Peter, E-mail: zhaoh3@mail.nih.gov

    2015-01-01

    Global multi-method analysis for protein interactions (GMMA) can increase the precision and complexity of binding studies for the determination of the stoichiometry, affinity and cooperativity of multi-site interactions. The principles and recent developments of biophysical solution methods implemented for GMMA in the software SEDPHAT are reviewed, their complementarity in GMMA is described and a new GMMA simulation tool set in SEDPHAT is presented. Reversible macromolecular interactions are ubiquitous in signal transduction pathways, often forming dynamic multi-protein complexes with three or more components. Multivalent binding and cooperativity in these complexes are often key motifs of their biological mechanisms. Traditional solution biophysical techniques for characterizing the binding and cooperativity are very limited in the number of states that can be resolved. A global multi-method analysis (GMMA) approach has recently been introduced that can leverage the strengths and the different observables of different techniques to improve the accuracy of the resulting binding parameters and to facilitate the study of multi-component systems and multi-site interactions. Here, GMMA is described in the software SEDPHAT for the analysis of data from isothermal titration calorimetry, surface plasmon resonance or other biosensing, analytical ultracentrifugation, fluorescence anisotropy and various other spectroscopic and thermodynamic techniques. The basic principles of these techniques are reviewed and recent advances in view of their particular strengths in the context of GMMA are described. Furthermore, a new feature in SEDPHAT is introduced for the simulation of multi-method data. In combination with specific statistical tools for GMMA in SEDPHAT, simulations can be a valuable step in the experimental design.

  14. Joint explorative analysis of neuroreceptor subsystems in the human brain: application to receptor-transporter correlation using PET data.

    PubMed

    Cselényi, Zsolt; Lundberg, Johan; Halldin, Christer; Farde, Lars; Gulyás, Balázs

    2004-10-01

    Positron emission tomography (PET) has proved to be a highly successful technique in the qualitative and quantitative exploration of the human brain's neurotransmitter-receptor systems. In recent years, the number of PET radioligands, targeted to different neuroreceptor systems of the human brain, has increased considerably. This development paves the way for a simultaneous analysis of different receptor systems and subsystems in the same individual. The detailed exploration of the versatility of neuroreceptor systems requires novel technical approaches, capable of operating on huge parametric image datasets. An initial step of such explorative data processing and analysis should be the development of novel exploratory data-mining tools to gain insight into the "structure" of complex multi-individual, multi-receptor data sets. For practical reasons, a possible and feasible starting point of multi-receptor research can be the analysis of the pre- and post-synaptic binding sites of the same neurotransmitter. In the present study, we propose an unsupervised, unbiased data-mining tool for this task and demonstrate its usefulness by using quantitative receptor maps, obtained with positron emission tomography, from five healthy subjects on (pre-synaptic) serotonin transporters (5-HTT or SERT) and (post-synaptic) 5-HT(1A) receptors. Major components of the proposed technique include the projection of the input receptor maps to a feature space, the quasi-clustering and classification of projected data (neighbourhood formation), trans-individual analysis of neighbourhood properties (trajectory analysis), and the back-projection of the results of trajectory analysis to normal space (creation of multi-receptor maps). The resulting multi-receptor maps suggest that complex relationships and tendencies in the relationship between pre- and post-synaptic transporter-receptor systems can be revealed and classified by using this method. 
As an example, we demonstrate the regional correlation of the serotonin transporter-receptor systems. These parameter-specific multi-receptor maps can usefully guide the researchers in their endeavour to formulate models of multi-receptor interactions and changes in the human brain.

  15. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bates, Robert; McConnell, Elizabeth

    Machining methods across many industries generally require multiple operations to machine and process advanced materials, features with micron precision, and complex shapes. The resulting multiple machining platforms can significantly affect manufacturing cycle time and the precision of the final parts, with a resultant increase in cost and energy consumption. Ultrafast lasers represent a transformative and disruptive technology that removes material with micron precision in a single-step manufacturing process. Such precision results from athermal ablation without modification or damage to the remaining material, which is the key differentiator between ultrafast laser technologies and traditional laser technologies or mechanical processes. Athermal ablation without modification or damage to the material eliminates post-processing and multiple manufacturing steps. Combined with the appropriate technology to control the motion of the workpiece, ultrafast lasers are excellent candidates to provide breakthrough machining capability for difficult-to-machine materials. At the project onset in early 2012, the project team recognized that substantial effort was necessary to improve the application of ultrafast laser and precise motion control technologies (for micromachining difficult-to-machine materials) to further the aggregate throughput and yield improvements over conventional machining methods. The project described in this report advanced these leading-edge technologies through the development and verification of two platforms: a hybrid enhanced laser chassis and a multi-application testbed.

  16. A Study on Segmented Multiple-Step Forming of Doubly Curved Thick Plate by Reconfigurable Multi-Punch Dies

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ko, Young Ho; Han, Myoung Soo; Han, Jong Man

    2007-05-17

    Doubly curved thick plate forming in shipbuilding is currently performed by a thermal forming process called Line Heating, using gas flame torches. Because this is empirical manual work, the industry is eager for an alternative way to manufacture curved thick plates for ships. This study envisaged manufacturing doubly curved thick plates by multi-punch die forming. Experiments and finite element analyses were conducted to evaluate the feasibility of reconfigurable discrete die forming for thick plates. Single and segmented multiple-step forming procedures were considered from the standpoints of both forming efficiency and accuracy. A configuration of the multi-punch dies suitable for segmented multiple-step forming was also explored. As a result, segmented multiple-step forming with matched dies showed limited formability when the objective shapes became complicated, while an unmatched die configuration offered better prospects for manufacturing large curved plates for ships.

  17. Modeling the MHC class I pathway by combining predictions of proteasomal cleavage, TAP transport and MHC class I binding.

    PubMed

    Tenzer, S; Peters, B; Bulik, S; Schoor, O; Lemmel, C; Schatz, M M; Kloetzel, P-M; Rammensee, H-G; Schild, H; Holzhütter, H-G

    2005-05-01

    Epitopes presented by major histocompatibility complex (MHC) class I molecules are selected by a multi-step process. Here we present the first computational prediction of this process based on in vitro experiments characterizing proteasomal cleavage, transport by the transporter associated with antigen processing (TAP) and MHC class I binding. Our novel prediction method for proteasomal cleavages outperforms existing methods when tested on in vitro cleavage data. The analysis of our predictions for a new dataset consisting of 390 endogenously processed MHC class I ligands from cells with known proteasome composition shows that the immunological advantage of switching from constitutive to immunoproteasomes is mainly to suppress the creation of peptides in the cytosol that TAP cannot transport. Furthermore, we show that proteasomes are unlikely to generate MHC class I ligands with a C-terminal lysine residue, suggesting processing of these ligands by a different protease that may be tripeptidyl-peptidase II (TPPII).
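    The multi-step selection described above acts like a logical AND: a peptide must survive proteasomal cleavage, TAP transport, and MHC binding. The sketch below combines three step scores multiplicatively; the scoring functions and numbers are toy stand-ins for the trained predictors in the paper, not the paper's models:

```python
def pathway_score(cleavage_p, tap_p, mhc_ic50_nm):
    """Combine the three steps of the MHC class I pathway into one epitope
    score. Because a candidate must pass every step, probabilities multiply,
    and a weak step cannot be rescued by a strong one."""
    binding_p = 1.0 / (1.0 + mhc_ic50_nm / 500.0)  # map IC50 (nM) onto [0, 1]
    return cleavage_p * tap_p * binding_p

# Hypothetical candidates: (cleavage prob., TAP transport prob., MHC IC50 in nM)
candidates = {
    "good-at-all-steps": (0.9, 0.8, 50.0),
    "poor-TAP":          (0.9, 0.05, 50.0),  # cleaved and binds, not transported
}
for name, (c, t, ic50) in candidates.items():
    print(name, round(pathway_score(c, t, ic50), 3))
```

The "poor-TAP" candidate scores low despite strong cleavage and binding, mirroring the paper's finding that suppressing TAP-incompatible peptides is a key selective step.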

  18. The Ocean Observatories Initiative: Data pre-Processing: Diagnostic Tools to Prepare Data for QA/QC Processing.

    NASA Astrophysics Data System (ADS)

    Belabbassi, L.; Garzio, L. M.; Smith, M. J.; Knuth, F.; Vardaro, M.; Kerfoot, J.

    2016-02-01

    The Ocean Observatories Initiative (OOI), funded by the National Science Foundation, provides users with access to long-term datasets from a variety of deployed oceanographic sensors. The Pioneer Array in the Atlantic Ocean off the Coast of New England hosts 10 moorings and 6 gliders. Each mooring is outfitted with 6 to 19 different instruments telemetering more than 1000 data streams. These data are available to science users to collaborate on common scientific goals such as water quality monitoring and scale variability measures of continental shelf processes and coastal open ocean exchanges. To serve this purpose, the acquired datasets undergo an iterative multi-step quality assurance and quality control procedure automated to work with all types of data. Data processing involves several stages, including a fundamental pre-processing step when the data are prepared for processing. This takes a considerable amount of processing time and is often not given enough thought in development initiatives. The volume and complexity of OOI data necessitates the development of a systematic diagnostic tool to enable the management of a comprehensive data information system for the OOI arrays. We present two examples to demonstrate the current OOI pre-processing diagnostic tool. First, Data Filtering is used to identify incomplete, incorrect, or irrelevant parts of the data and then replaces, modifies or deletes the coarse data. This provides data consistency with similar datasets in the system. Second, Data Normalization occurs when the database is organized in fields and tables to minimize redundancy and dependency. At the end of this step, the data are stored in one place to reduce the risk of data inconsistency and promote easy and efficient mapping to the database.

  19. Comparing an annual and daily time-step model for predicting field-scale phosphorus loss

    USDA-ARS?s Scientific Manuscript database

    Numerous models exist for describing phosphorus (P) losses from agricultural fields. The complexity of these models varies considerably ranging from simple empirically-based annual time-step models to more complex process-based daily time step models. While better accuracy is often assumed with more...

  20. Cyclic Game Dynamics Driven by Iterated Reasoning

    PubMed Central

    Frey, Seth; Goldstone, Robert L.

    2013-01-01

    Recent theories from complexity science argue that complex dynamics are ubiquitous in social and economic systems. These claims emerge from the analysis of individually simple agents whose collective behavior is surprisingly complicated. However, economists have argued that iterated reasoning (what you think I think you think) will suppress complex dynamics by stabilizing or accelerating convergence to Nash equilibrium. We report stable and efficient periodic behavior in human groups playing the Mod Game, a multi-player game similar to Rock-Paper-Scissors. The game rewards subjects for thinking exactly one step ahead of others in their group. Groups that play this game exhibit cycles that are inconsistent with any fixed-point solution concept. These cycles are driven by a “hopping” behavior that is consistent with other accounts of iterated reasoning: agents are constrained to about two steps of iterated reasoning and learn an additional one-half step with each session. If higher-order reasoning can be complicit in complex emergent dynamics, then cyclic and chaotic patterns may be endogenous features of real-world social and economic systems. PMID:23441191
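    The Mod Game's payoff rule, rewarding a player for being exactly one step ahead of each opponent on a cycle of choices, can be sketched in a few lines; the 24-option cycle is an assumption of this illustration:

```python
def mod_game_payoffs(choices, m=24):
    """Payoffs for one round of the Mod Game with m options on a cycle.
    A player earns a point for every opponent whose choice is exactly one
    step behind theirs (mod m), rewarding one step of reasoning ahead."""
    return [sum(1 for j, other in enumerate(choices)
                if j != i and (c - other) % m == 1)
            for i, c in enumerate(choices)]

# Three players: player 0 is one ahead of player 1, who is one ahead of
# player 2, so the payoffs are [1, 1, 0].
print(mod_game_payoffs([5, 4, 3]))
```

Because the best response to any profile is always "one more than the mode", repeated play chases itself around the cycle, which is the hopping dynamic reported in the paper.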

  1. Evaluation of accuracy in implant site preparation performed in single- or multi-step drilling procedures.

    PubMed

    Marheineke, Nadine; Scherer, Uta; Rücker, Martin; von See, Constantin; Rahlf, Björn; Gellrich, Nils-Claudius; Stoetzer, Marcus

    2018-06-01

    Dental implant failure and insufficient osseointegration are proven results of mechanical and thermal damage during the surgical process. We herein performed a comparative study of a less invasive single-step drilling preparation protocol and a conventional multiple drilling sequence. The accuracy of drilling holes was precisely analyzed, and the influence of different levels of operator expertise and of additional drill template guidance was evaluated. Six experimental groups, deployed in an osseous study model, represented template-guided and freehand drilling actions in a stepwise drilling procedure in comparison to a single-drill protocol. Each experimental condition was studied through the drilling actions of three persons without surgical knowledge and three highly experienced oral surgeons. Drilling actions were performed and diameters were recorded with a precision measuring instrument. Less experienced operators were able to significantly increase drilling accuracy using a guiding template, especially when multi-step preparations were performed. Improved accuracy without template guidance was observed when experienced operators executed the single-step rather than the multi-step technique. Single-step drilling protocols were shown to produce more accurate results than multi-step procedures. The outcome of any protocol can be further improved by the use of guiding templates, and operator experience can be a contributing factor. Single-step preparations are less invasive and promote osseointegration. Even highly experienced surgeons achieve higher levels of accuracy by combining this technique with template guidance. Template guidance thereby reduces hands-on time and side effects during surgery and leads to a more predictable clinical diameter.

  2. Data fusion for QRS complex detection in multi-lead electrocardiogram recordings

    NASA Astrophysics Data System (ADS)

    Ledezma, Carlos A.; Perpiñan, Gilberto; Severeyn, Erika; Altuve, Miguel

    2015-12-01

    Heart diseases are the main cause of death worldwide. The first step in the diagnosis of these diseases is the analysis of the electrocardiographic (ECG) signal. In turn, ECG analysis begins with the detection of the QRS complex, the waveform with the most energy in the cardiac cycle. Numerous methods have been proposed in the literature for QRS complex detection, but few authors have analyzed the possibility of taking advantage of the information redundancy present in multiple, simultaneously acquired ECG leads to produce accurate QRS detection. In our previous work we presented such an approach, proposing various data fusion techniques to combine the detections made by an algorithm on multiple ECG leads. In this paper we present further studies that show the advantages of this multi-lead detection approach, analyzing how many leads are necessary in order to observe an improvement in detection performance. A well-known QRS detection algorithm was used to test the fusion techniques on the St. Petersburg Institute of Cardiological Technics database. Results, evaluated using the detection error rate (DER), show improvement in detection performance with as few as three leads, but the results become reliable only when seven or more leads are used. The multi-lead detection approach allows an improvement from DER = 3.04% to DER = 1.88%. Further work is needed to improve detection performance by implementing additional fusion steps.
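    One simple way to realize the multi-lead fusion idea is a tolerance-window majority vote across per-lead detections; this sketch is illustrative and not necessarily the fusion rule evaluated in the paper:

```python
def fuse_detections(lead_detections, tolerance=0.05, min_votes=None):
    """Fuse single-lead QRS detections (lists of beat times in seconds):
    detections from different leads within `tolerance` seconds of the first
    event of a group are merged, and a group is kept as a true beat only if
    at least `min_votes` distinct leads contributed (default: majority)."""
    if min_votes is None:
        min_votes = len(lead_detections) // 2 + 1  # simple majority
    events = sorted((t, lead) for lead, times in enumerate(lead_detections)
                    for t in times)
    fused, group, group_leads = [], [], set()
    for t, lead in events:
        if group and t - group[0] > tolerance:
            if len(group_leads) >= min_votes:
                fused.append(sum(group) / len(group))  # fused beat time
            group, group_leads = [], set()
        group.append(t)
        group_leads.add(lead)
    if group and len(group_leads) >= min_votes:
        fused.append(sum(group) / len(group))
    return fused

# Three leads; the spurious 1.40 s detection appears in only one lead and
# is voted out, leaving two fused beats near 0.80 s and 1.61 s.
leads = [[0.80, 1.62], [0.81, 1.40, 1.60], [0.79, 1.61]]
print(fuse_detections(leads))
```

With more leads, the vote becomes more robust to single-lead false positives and missed beats, which is consistent with the reliability gain the authors observe at seven or more leads.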

  3. Comparing multi-criteria decision analysis and integrated assessment to support long-term water supply planning

    PubMed Central

    Maurer, Max; Lienert, Judit

    2017-01-01

    We compare the use of multi-criteria decision analysis (MCDA), or more precisely models used in multi-attribute value theory (MAVT), with integrated assessment (IA) models for supporting long-term water supply planning in a small town case study in Switzerland. They are used to evaluate thirteen system-scale water supply alternatives in four future scenarios regarding forty-four objectives, covering technical, social, environmental, and economic aspects. The alternatives encompass both conventional and unconventional solutions and differ in technical, spatial and organizational characteristics. This paper focuses on the impact assessment and final evaluation step of the structured MCDA decision support process. We analyze the performance of the alternatives for ten stakeholders. We demonstrate the implications of model assumptions by comparing two IA and three MAVT evaluation model layouts of different complexity. For this comparison, we focus on the validity (ranking stability), desirability (value), and distinguishability (value range) of the alternatives given the five model layouts. These layouts exclude or include stakeholder preferences and uncertainties. Even though all five led us to identify the same best alternatives, they did not produce identical rankings. We found that the MAVT-type models provide higher distinguishability and a more robust basis for discussion than the IA-type models. The needed complexity of the model, however, should be determined based on the intended use of the model within the decision support process. The best-performing alternatives had consistently strong performance for all stakeholders and future scenarios, whereas the current water supply system was outperformed in all evaluation layouts. The best-performing alternatives comprise proactive pipe rehabilitation, adapted firefighting provisions, and decentralized water storage and/or treatment. 
We present recommendations for possible ways of improving water supply planning in the case study and beyond. PMID:28481881

  4. Age and gender estimation using Region-SIFT and multi-layered SVM

    NASA Astrophysics Data System (ADS)

    Kim, Hyunduk; Lee, Sang-Heon; Sohn, Myoung-Kyu; Hwang, Byunghun

    2018-04-01

    In this paper, we propose an age and gender estimation framework using the region-SIFT feature and a multi-layered SVM classifier. The suggested framework entails three steps. The first is landmark-based face alignment. The second is feature extraction, in which we introduce a region-SIFT feature extraction method based on facial landmarks: we first define sub-regions of the face and then extract SIFT features from each sub-region. To reduce the dimensionality of the features, we employ Principal Component Analysis (PCA) and Linear Discriminant Analysis (LDA). Finally, we classify age and gender using a multi-layered Support Vector Machine (SVM) for efficient classification. Rather than performing gender estimation and age estimation independently, the multi-layered SVM can improve the classification rate by constructing a classifier that estimates age according to gender. Moreover, we collected a dataset of face images, called DGIST_C, from the internet. A performance evaluation of the proposed method was carried out with the FERET database, the CACD database, and the DGIST_C database. The experimental results demonstrate that the proposed approach estimates age and gender efficiently and accurately.
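    The multi-layered idea (predict gender first, then route to a per-gender age classifier) can be sketched as follows; a tiny nearest-centroid classifier stands in for the SVMs so the example stays self-contained, and all features and labels are made up:

```python
import numpy as np

class NearestCentroid:
    """Minimal stand-in for the SVMs in the paper; swap in a real SVM
    (e.g. sklearn.svm.SVC) for an actual experiment."""
    def fit(self, X, y):
        self.labels = sorted(set(y))
        self.centroids = {c: np.mean([x for x, t in zip(X, y) if t == c], axis=0)
                          for c in self.labels}
        return self
    def predict_one(self, x):
        return min(self.labels,
                   key=lambda c: np.linalg.norm(np.asarray(x) - self.centroids[c]))

class MultiLayeredClassifier:
    """First layer predicts gender; the second layer holds one age classifier
    per gender, so age is estimated conditionally on the predicted gender."""
    def fit(self, X, genders, ages):
        self.gender_clf = NearestCentroid().fit(X, genders)
        self.age_clfs = {}
        for g in set(genders):
            idx = [i for i, gg in enumerate(genders) if gg == g]
            self.age_clfs[g] = NearestCentroid().fit([X[i] for i in idx],
                                                     [ages[i] for i in idx])
        return self
    def predict_one(self, x):
        g = self.gender_clf.predict_one(x)       # layer 1: gender
        return g, self.age_clfs[g].predict_one(x)  # layer 2: age given gender

# Toy 2-D features (think: the first two PCA/LDA components of region-SIFT).
X = [[0.0, 1.0], [0.0, 2.0], [5.0, 1.0], [5.0, 2.0]]
genders = ["F", "F", "M", "M"]
ages = ["young", "old", "young", "old"]
model = MultiLayeredClassifier().fit(X, genders, ages)
print(model.predict_one([0.1, 1.1]))  # ('F', 'young')
```

Training separate age models per gender lets each second-layer classifier specialize, which is the mechanism the abstract credits for the improved classification rate.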

  5. Event-triggered logical flow control for comprehensive process integration of multi-step assays on centrifugal microfluidic platforms.

    PubMed

    Kinahan, David J; Kearney, Sinéad M; Dimov, Nikolay; Glynn, Macdara T; Ducrée, Jens

    2014-07-07

    The centrifugal "lab-on-a-disc" concept has proven to have great potential for process integration of bioanalytical assays, in particular where ease-of-use, ruggedness, portability, fast turn-around time and cost efficiency are of paramount importance. Yet, as all liquids residing on the disc are exposed to the same centrifugal field, an inherent challenge of these systems remains the automation of multi-step, multi-liquid sample processing and subsequent detection. In order to orchestrate the underlying bioanalytical protocols, an ample palette of rotationally and externally actuated valving schemes has been developed. While excelling in the level of flow control, externally actuated valves require interaction with peripheral instrumentation, thus compromising the conceptual simplicity of the centrifugal platform. In turn, for rotationally controlled schemes, such as common capillary burst valves, typical manufacturing tolerances tend to limit the number of consecutive laboratory unit operations (LUOs) that can be automated on a single disc. In this paper, a major advancement on recently established dissolvable film (DF) valving is presented; for the very first time, a liquid handling sequence can be controlled in response to the completion of a preceding liquid transfer event, i.e. completely independent of external stimulus or changes in speed of disc rotation. The basic, event-triggered valve configuration is further adapted to leverage conditional, large-scale process integration. First, we demonstrate a fluidic network on a disc encompassing 10 discrete valving steps including logical relationships such as an AND-conditional as well as serial and parallel flow control. Then we present a disc which is capable of implementing common laboratory unit operations such as metering and selective routing of flows. 
Finally, as a pilot study, these functions are integrated on a single disc to automate a common, multi-step lab protocol for the extraction of total RNA from mammalian cell homogenate.

  6. Developing a Mind-Body Exercise Programme for Stressed Children

    ERIC Educational Resources Information Center

    Wang, Claudia; Seo, Dong-Chul; Geib, Roy W

    2017-01-01

    Objective: To describe the process of developing a Health Qigong programme for stressed children using a formative evaluation approach. Methods: A multi-step formative evaluation method was utilised. These steps included (1) identifying programme content and drafting the curriculum, (2) synthesising effective and age-appropriate pedagogies, (3)…

  7. Design and Processing of a Novel Chaos-Based Stepped Frequency Synthesized Wideband Radar Signal.

    PubMed

    Zeng, Tao; Chang, Shaoqiang; Fan, Huayu; Liu, Quanhua

    2018-03-26

    The linear stepped frequency and linear frequency shift keying (FSK) signal has been widely used in radar systems. However, such linear modulation signals suffer from the range-Doppler coupling that degrades radar multi-target resolution. Moreover, the fixed frequency-hopping or frequency-coded sequence can be easily predicted by the interception receiver in the electronic countermeasures (ECM) environments, which limits radar anti-jamming performance. In addition, the single FSK modulation reduces the radar low probability of intercept (LPI) performance, for it cannot achieve a large time-bandwidth product. To solve such problems, we propose a novel chaos-based stepped frequency (CSF) synthesized wideband signal in this paper. The signal introduces chaotic frequency hopping between the coherent stepped frequency pulses, and adopts a chaotic frequency shift keying (CFSK) and phase shift keying (PSK) composited coded modulation in a subpulse, called CSF-CFSK/PSK. Correspondingly, the processing method for the signal has been proposed. According to our theoretical analyses and the simulations, the proposed signal and processing method achieve better multi-target resolution and LPI performance. Furthermore, flexible modulation is able to increase the robustness against identification of the interception receiver and improve the anti-jamming performance of the radar.
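    A chaotic frequency-hopping sequence of the kind the signal relies on can be sketched with a logistic map; the map and its parameters are an illustrative assumption of this sketch, not the specific chaos generator of the paper:

```python
def chaotic_hop_sequence(n_pulses, n_freqs, x0=0.37, r=3.99):
    """Generate a frequency-hop index sequence from the logistic map
    x_{k+1} = r * x_k * (1 - x_k). With r near 4 the orbit is chaotic, so
    the hop pattern is reproducible by the radar (it knows x0) yet hard for
    an interception receiver to predict, unlike a fixed coded sequence."""
    x, seq = x0, []
    for _ in range(n_pulses):
        x = r * x * (1 - x)
        seq.append(int(x * n_freqs))  # quantize the orbit onto n_freqs sub-bands
    return seq

hops = chaotic_hop_sequence(n_pulses=16, n_freqs=8)
print(hops)  # indices in [0, 7]; a tiny change in x0 yields a different pattern
```

Sensitivity to initial conditions is what makes the pattern hard to extrapolate from intercepted pulses: two seeds differing by 1e-4 diverge to uncorrelated hop sequences within a few pulses.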

  8. Generation and multi-octave shaping of mid-infrared intense single-cycle pulses

    NASA Astrophysics Data System (ADS)

    Krogen, Peter; Suchowski, Haim; Liang, Houkun; Flemens, Noah; Hong, Kyung-Han; Kärtner, Franz X.; Moses, Jeffrey

    2017-03-01

    The generation of intense mid-infrared (mid-IR) optical pulses with customizable shape and spectra spanning a multiple-octave range of vibrational frequencies is an elusive technological capability. While some recent approaches to mid-IR supercontinuum generation—such as filamentation, multicolour four-wave-mixing and optical rectification—have successfully generated broad spectra, no process has been identified for achieving complex pulse shaping at the generation step. The adiabatic frequency converter allows for a one-to-one transfer of spectral phase through nonlinear frequency conversion over a larger-than-octave-spanning range and with an overall linear phase transfer function. Here, we show that we can convert shaped near-infrared (near-IR) pulses to shaped, energetic, multi-octave-spanning mid-IR pulses lasting only 1.2 optical cycles, and extendable to the sub-cycle regime. We expect this capability to enable a new class of precisely controlled nonlinear interactions in the mid-IR spectral range, from nonlinear vibrational spectroscopy to strong light-matter interactions and single-shot remote sensing.

  9. Multi-Objective Control Optimization for Greenhouse Environment Using Evolutionary Algorithms

    PubMed Central

    Hu, Haigen; Xu, Lihong; Wei, Ruihua; Zhu, Bingkun

    2011-01-01

    This paper investigates the issue of tuning the Proportional-Integral-Derivative (PID) controller parameters for a greenhouse climate control system using an Evolutionary Algorithm (EA) based on multiple performance measures, such as good static-dynamic performance specifications and smoothness of the control process. A model of nonlinear thermodynamic laws between the numerous system variables affecting the greenhouse climate is formulated. The proposed tuning scheme is tested for greenhouse climate control by minimizing the integrated time square error (ITSE) and the control increment or rate in a simulation experiment. The results show that by tuning the gain parameters the controllers can achieve good control performance in step responses, such as small overshoot, fast settling time, and short rise time and small steady-state error. Moreover, the scheme can be applied to tuning systems with different properties, such as strong interactions among variables, nonlinearities and conflicting performance criteria. The results indicate that multi-objective optimization algorithms offer an effective and promising tuning method for complex greenhouse production. PMID:22163927
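    The tuning loop can be sketched as a minimal (1+1) evolution strategy minimizing ITSE on a toy first-order plant; the plant, initial gains and mutation scale are assumptions of this illustration, not the greenhouse model or the multi-objective EA of the paper:

```python
import random

def itse(gains, dt=0.01, horizon=5.0):
    """Integrated time-squared error (ITSE) of the closed-loop unit-step
    response of a toy first-order plant dy/dt = (-y + u) / tau under PID
    control, integrated with a forward-Euler scheme."""
    kp, ki, kd = gains
    tau, y, integ, prev_err, cost, t = 0.5, 0.0, 0.0, 1.0, 0.0, 0.0
    while t < horizon:
        err = 1.0 - y                  # setpoint is a unit step
        integ += err * dt
        deriv = (err - prev_err) / dt
        u = kp * err + ki * integ + kd * deriv
        y += dt * (-y + u) / tau       # Euler step of the plant
        cost += t * err * err * dt     # ITSE penalizes late errors heavily
        prev_err, t = err, t + dt
    return cost

def evolve_pid(generations=200, seed=1):
    """(1+1) evolution strategy: mutate the gains with Gaussian noise and
    keep the child only if its ITSE improves on the parent's."""
    rng = random.Random(seed)
    best = [1.0, 0.1, 0.01]
    best_cost = itse(best)
    for _ in range(generations):
        child = [max(0.0, g + rng.gauss(0, 0.2)) for g in best]
        c = itse(child)
        if c < best_cost:
            best, best_cost = child, c
    return best, best_cost

gains, cost = evolve_pid()
print("tuned gains:", [round(g, 2) for g in gains], "ITSE:", round(cost, 4))
```

The paper's scheme additionally trades ITSE off against the control increment (for smooth actuation), which would turn this scalar objective into a weighted or Pareto multi-objective search.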

  10. Low-wave number analysis of observations and ensemble forecasts to develop metrics for the selection of most realistic members to study multi-scale interactions between the environment and the convective organization of hurricanes: Focus on Rapid Intensification

    NASA Astrophysics Data System (ADS)

    Hristova-Veleva, S. M.; Chen, H.; Gopalakrishnan, S.; Haddad, Z. S.

    2017-12-01

Tropical cyclones (TCs) are the product of complex multi-scale processes and interactions. The role of the environment has long been recognized. However, recent research has shown that convective-scale processes in the hurricane core might also play a crucial role in determining TC intensity and size. Several studies have linked Rapid Intensification (RI) to the characteristics of the convective clouds (shallow versus deep), their organization (isolated versus widespread) and their location with respect to dynamical controls (the vertical shear, the radius of maximum wind). A third set of controls involves the interaction between storm-scale and large-scale processes. Our goal is to use observations and models to advance the still-incomplete understanding of these processes. Recently, hurricane models have improved significantly. However, deterministic forecasts have limitations due to uncertainty in the representation of physical processes and initial conditions. A crucial step forward is the use of high-resolution ensembles. We adopt the following approach: i) generate a high-resolution ensemble forecast using HWRF; ii) produce synthetic data (e.g. brightness temperature) from the model fields for direct comparison to satellite observations; iii) develop metrics that allow us to sub-select the realistic members of the ensemble, based on objective measures of the similarity between observed and forecasted structures; iv) for these most-realistic members, determine the skill in forecasting TCs to provide "guidance on guidance"; v) use the members with the best predictive skill to untangle the complex multi-scale interactions. We will report on the first three goals of our research, using forecasts and observations of hurricane Edouard (2014), focusing on RI.
We will focus on describing the metrics for the selection of the most appropriate ensemble members, based on applying low-wavenumber analysis (WNA; Hristova-Veleva et al., 2016) to the observed and forecasted 2D fields to develop objective criteria for consistency. We investigate the WNA decompositions of environmental moisture, precipitation structure and surface convergence. We will present the preliminary selection of the most skillful members and will outline our future goals: analyzing the multi-scale interactions using these members.
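The low-wavenumber analysis referenced above amounts to a Fourier decomposition of a storm-centred field in azimuth. As a minimal sketch (not the cited WNA implementation), the function below assumes the field has already been sampled on a ring of equally spaced azimuths around the storm centre, and the `spectral_distance` similarity metric between observed and forecast rings is a hypothetical construction for illustration:

```python
import numpy as np

def azimuthal_wavenumber_amplitudes(samples, kmax=3):
    """Decompose values sampled on a ring of equally spaced azimuths
    into low azimuthal wavenumbers k = 0..kmax.
    Returns the real amplitude of each wavenumber component."""
    n = len(samples)
    coeffs = np.fft.fft(samples) / n
    amps = [abs(coeffs[0])]                          # k = 0: azimuthal mean
    amps += [2.0 * abs(coeffs[k]) for k in range(1, kmax + 1)]
    return np.array(amps)

def spectral_distance(obs, fcst, kmax=3):
    """Hypothetical objective consistency measure: RMS difference of the
    low-wavenumber spectra of observed vs. forecast rings."""
    a = azimuthal_wavenumber_amplitudes(obs, kmax)
    b = azimuthal_wavenumber_amplitudes(fcst, kmax)
    return float(np.sqrt(np.mean((a - b) ** 2)))
```

Ranking ensemble members by such a distance, computed over several fields (moisture, precipitation, convergence), is one way the selection criteria could be made objective.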

  11. Adaptive multiresolution modeling of groundwater flow in heterogeneous porous media

    NASA Astrophysics Data System (ADS)

    Malenica, Luka; Gotovac, Hrvoje; Srzic, Veljko; Andric, Ivo

    2016-04-01

The proposed methodology was originally developed by our scientific team in Split, who designed a multiresolution approach for analyzing flow and transport processes in highly heterogeneous porous media. The main properties of the adaptive Fup multiresolution approach are: 1) the computational capabilities of Fup basis functions with compact support, able to resolve all spatial and temporal scales; 2) multiresolution representation of the heterogeneity as well as of all other input and output variables; 3) an accurate, adaptive and efficient strategy; and 4) semi-analytical properties which increase our understanding of usually complex flow and transport processes in porous media. The main computational idea behind this approach is to separately find the minimum number of basis functions and resolution levels necessary to describe each flow and transport variable with the desired accuracy on a particular adaptive grid. Therefore, each variable is analyzed separately, and the adaptive and multi-scale nature of the methodology enables not only computational efficiency and accuracy, but also a description of subsurface processes closely tied to their physical interpretation. The methodology inherently supports a mesh-free procedure, avoiding classical numerical integration, and yields continuous velocity and flux fields, which is vitally important for flow and transport simulations. In this paper, we show recent improvements within the proposed methodology. Since state-of-the-art multiresolution approaches usually rely on the method of lines with only a spatially adaptive procedure, the temporal approximation has rarely been treated as multiscale. Therefore, a novel adaptive implicit Fup integration scheme is developed that resolves all time scales within each global time step: the algorithm uses smaller time steps only along lines where the solution changes rapidly.
The application of Fup basis functions enables continuous time approximation, simple interpolation across different temporal lines and local time-stepping control. A critical aspect of time-integration accuracy is the construction of the spatial stencil, needed for accurate calculation of spatial derivatives. Whereas the common approach for wavelets and splines uses a finite-difference operator, we develop here a collocation operator that includes both solution values and the differential operator. In this way, the improved algorithm is adaptive in space and time, enabling accurate solution of groundwater flow problems, especially in highly heterogeneous porous media with large lnK variances and different correlation length scales. In addition, differences between the collocation and finite-volume approaches are discussed. Finally, results show the application of the methodology to groundwater flow problems in highly heterogeneous confined and unconfined aquifers.
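The local time-stepping idea above (small steps only where the solution changes rapidly) can be illustrated with a generic error-controlled implicit scheme. This is a sketch of the general principle, not the Fup collocation method: it uses backward Euler on the model problem y' = λy, estimating the local error by comparing one step of size h against two steps of size h/2, and refining or coarsening accordingly.

```python
import math

def implicit_euler_step(y, h, lam):
    # one backward-Euler step for y' = lam * y (solved in closed form)
    return y / (1.0 - lam * h)

def adaptive_integrate(y0, lam, t_end, h0=0.1, tol=1e-5):
    """Advance y' = lam*y with backward Euler; the step size is chosen
    locally by step-doubling error control, mimicking 'small steps only
    where the solution changes rapidly'."""
    t, y, h = 0.0, y0, h0
    while t_end - t > 1e-12:
        h = min(h, t_end - t)
        y_big = implicit_euler_step(y, h, lam)
        y_half = implicit_euler_step(implicit_euler_step(y, h / 2, lam), h / 2, lam)
        err = abs(y_big - y_half)        # local error estimate
        if err > tol:
            h *= 0.5                     # refine: solution changes rapidly
            continue
        t += h
        y = y_half                       # accept the more accurate value
        if err < tol / 4:
            h *= 2.0                     # coarsen where the solution is smooth
    return y
```

A multiresolution scheme generalizes this scalar picture: each collocation line carries its own local step, synchronized within the global step.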

  12. In situ UV curable 3D printing of multi-material tri-legged soft bot with spider mimicked multi-step forward dynamic gait

    NASA Astrophysics Data System (ADS)

    Zeb Gul, Jahan; Yang, Bong-Su; Yang, Young Jin; Chang, Dong Eui; Choi, Kyung Hyun

    2016-11-01

Compared with rigid mechanical robots, soft bots can adopt intricate postures and fit into complex shapes. This paper presents a unique in situ UV-curing three-dimensional (3D) printed multi-material tri-legged soft bot with a spider-mimicked multi-step dynamic forward gait, using commercial bio-metal filament (BMF) as an actuator. The printed soft bot can produce controllable forward motion in response to external signals. The fundamental properties of BMF, including output force, contraction at different frequencies, initial loading rate, and displacement rate, are verified. The tri-pedal soft-bot CAD model is designed after a spider's legged structure, and its locomotion is assessed by simulating strain and displacement using finite element analysis. A customized rotational multi-head 3D printing system, assisted by curing lasers of multiple wavelengths, is used for in situ fabrication of the tri-pedal soft bot from two flexible materials (epoxy and polyurethane) in three layered steps. The soft bot is 80 mm in diameter, and each pedal measures 5 mm × 5 mm in width and depth. The maximum forward speed achieved is 2.7 mm s-1 at 5 Hz with an input of 3 V and 250 mA on a smooth surface. The fabricated tri-pedal soft bot demonstrated power-efficient and controllable locomotion at three input signal frequencies (1, 2 and 5 Hz).

  13. Governance for public health and health equity: The Tröndelag model for public health work.

    PubMed

    Lillefjell, Monica; Magnus, Eva; Knudtsen, Margunn SkJei; Wist, Guri; Horghagen, Sissel; Espnes, Geir Arild; Maass, Ruca; Anthun, Kirsti Sarheim

    2018-06-01

Multi-sectoral governance of population health is linked to the realization that health is the property of many societal systems. This study aims to contribute knowledge and methods that can strengthen the capacities of municipalities to work more systematically, knowledge-based and multi-sectorally in promoting health and health equity in the population. A process evaluation was conducted, applying a mixed-methods research design combining qualitative and quantitative data collection. Processes strengthening the systematic and multi-sectoral development, implementation and evaluation of research-based measures to promote health, quality of life and health equity in, for and with municipalities were revealed. A step-by-step model has been developed that emphasizes knowledge-based, systematic, multi-sectoral public health work, as well as joint ownership of local resources, initiatives and policies. Implementing systematic, knowledge-based and multi-sectoral governance of public health measures in municipalities demands a shared understanding of the challenges, an updated overview of population health and its impact factors, anchoring in plans, new skills and methods for selecting and implementing measures, and the development of trust, ownership, and shared ethics and goals among those involved.

  14. The neural correlates of morphological complexity processing: Detecting structure in pseudowords.

    PubMed

    Schuster, Swetlana; Scharinger, Mathias; Brooks, Colin; Lahiri, Aditi; Hartwigsen, Gesa

    2018-06-01

Morphological complexity is a highly debated issue in visual word recognition. Previous neuroimaging studies have shown that speakers are sensitive to degrees of morphological complexity. Two-step derived complex words (bridging, via bridge(N) > bridge(V) > bridging) led to stronger activation in the left inferior frontal gyrus than their one-step derived counterparts (running, via run(V) > running). However, it remains unclear whether sensitivity to degrees of morphological complexity extends to pseudowords. If this were the case, it would indicate that abstract knowledge of morphological structure is independent of lexicality. We addressed this question by investigating the processing of two sets of pseudowords in German. Both sets contained morphologically viable two-step derived pseudowords differing in the number of derivational steps required to access an existing lexical representation, and therefore in the degree of structural analysis expected during processing. Using a 2 × 2 factorial design, we found lexicality effects to be distinct from processing signatures relating to structural analysis in pseudowords. Semantically driven processes such as lexical search showed a more frontal distribution, while combinatorial processes related to structural analysis engaged more parietal parts of the network. Specifically, more complex pseudowords showed increased activation in parietal regions (right superior parietal lobe and left precuneus) relative to pseudowords that required less structural analysis to arrive at an existing lexical representation. As the two sets were matched on cohort size and surface form, these results highlight the role of internal levels of morphological structure even in forms that do not possess a lexical representation. © 2018 Wiley Periodicals, Inc.

  15. Hierarchical fractional-step approximations and parallel kinetic Monte Carlo algorithms

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Arampatzis, Giorgos, E-mail: garab@math.uoc.gr; Katsoulakis, Markos A., E-mail: markos@math.umass.edu; Plechac, Petr, E-mail: plechac@math.udel.edu

    2012-10-01

We present a mathematical framework for constructing and analyzing parallel algorithms for lattice kinetic Monte Carlo (KMC) simulations. The resulting algorithms have the capacity to simulate a wide range of spatio-temporal scales in spatially distributed, non-equilibrium physicochemical processes with complex chemistry and transport micro-mechanisms. Rather than focusing on constructing the stochastic trajectories exactly, our approach relies on approximating the evolution of observables, such as density, coverage, correlations and so on. More specifically, we develop a spatial domain decomposition of the Markov operator (generator) that describes the evolution of all observables according to the kinetic Monte Carlo algorithm. This domain decomposition corresponds to a decomposition of the Markov generator into a hierarchy of operators and can be tailored to specific hierarchical parallel architectures such as multi-core processors or clusters of Graphical Processing Units (GPUs). Based on this operator decomposition, we formulate parallel fractional-step kinetic Monte Carlo algorithms by employing the Trotter Theorem and its randomized variants; these schemes (a) are partially asynchronous on each fractional-step time window, and (b) are characterized by their communication schedule between processors. The proposed mathematical framework allows us to rigorously justify the numerical and statistical consistency of the proposed algorithms, showing the convergence of our approximating schemes to the original serial KMC. The approach also provides a systematic evaluation of different processor communication schedules. We carry out a detailed benchmarking of the parallel KMC schemes using available exact solutions, for example in Ising-type systems, and we demonstrate the capabilities of the method to simulate complex spatially distributed reactions at very large scales on GPUs.
Finally, we discuss work load balancing between processors and propose a re-balancing scheme based on probabilistic mass transport methods.
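The fractional-step schedule described in this abstract can be sketched in miniature. The toy model below is an assumption for illustration (independent spin flips on a 1D lattice, so the sublattice generators commute and the Trotter splitting is exact); it runs the two lattice blocks serially, but within each time window the blocks touch disjoint state, which is what would let them run on separate processors, with windows acting as synchronization points.

```python
import random

def kmc_block(config, sites, rate, t_window, rng):
    """Serial Gillespie KMC (independent spin flips at `rate`) restricted
    to `sites`, run for a time window of length t_window."""
    t = 0.0
    while True:
        total_rate = rate * len(sites)
        t += rng.expovariate(total_rate)   # waiting time to next event
        if t >= t_window:
            return
        i = rng.choice(sites)
        config[i] ^= 1                     # flip the chosen spin

def fractional_step_kmc(n_sites=16, rate=1.0, t_end=2.0, t_window=0.1, seed=0):
    """Lie/Trotter fractional-step schedule: within each window the two
    halves of the lattice evolve independently; windows are the
    processor-synchronisation points."""
    rng = random.Random(seed)
    config = [0] * n_sites
    blocks = [list(range(n_sites // 2)), list(range(n_sites // 2, n_sites))]
    t = 0.0
    while t < t_end:
        for block in blocks:               # each block is independent work
            kmc_block(config, block, rate, t_window, rng)
        t += t_window
    return config
```

With interacting micro-mechanisms the sublattice generators no longer commute, and the Trotter error controlled by the window length is exactly what the paper's consistency analysis quantifies.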

  16. Multi-Mission Automated Task Invocation Subsystem

    NASA Technical Reports Server (NTRS)

    Cheng, Cecilia S.; Patel, Rajesh R.; Sayfi, Elias M.; Lee, Hyun H.

    2009-01-01

Multi-Mission Automated Task Invocation Subsystem (MATIS) is software that establishes a distributed data-processing framework for automated generation of instrument data products from a spacecraft mission. Each mission may set up a set of MATIS servers for processing its data products. MATIS embodies lessons learned from experience with prior instrument-data-product-generation software. MATIS is an event-driven workflow manager that interprets project-specific, user-defined rules for managing processes. It executes programs in response to specific events under specific conditions according to the rules. Because the requirements of different missions are too diverse to be satisfied by one program, MATIS accommodates plug-in programs. MATIS is flexible in that users can control such processing parameters as how many pipelines to run and on which computing machines to run them. MATIS has a fail-safe capability: at each step, MATIS captures and retains the pertinent information needed to complete the step and start the next step. In the event of a restart, this information is retrieved so that processing can be resumed appropriately. At this writing, it is planned to develop a graphical user interface (GUI) for monitoring and controlling a product generation engine in MATIS. The GUI would enable users to schedule multiple processes and manage the data products produced in the processes. Although MATIS was initially designed for instrument data product generation,
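The event-driven rule pattern described above (execute programs in response to events, under conditions, per user-defined rules) can be sketched generically. This is not MATIS code; the class, rule shapes, and the two-step "file arrives, then calibrate" pipeline are hypothetical illustrations of the pattern.

```python
from collections import deque

class WorkflowEngine:
    """Minimal event-driven workflow manager: user-defined rules map an
    event name plus a condition to an action; an action may return a new
    (event, payload) pair, chaining the next processing step."""
    def __init__(self):
        self.rules = []          # list of (event_name, condition, action)
        self.log = []            # record of fired rules

    def add_rule(self, event, condition, action):
        self.rules.append((event, condition, action))

    def dispatch(self, event, payload):
        queue = deque([(event, payload)])
        while queue:
            name, data = queue.popleft()
            for ev, cond, action in self.rules:
                if ev == name and cond(data):
                    result = action(data)
                    self.log.append((name, result))
                    if result is not None:   # action triggered a next step
                        queue.append(result)

# Hypothetical two-step pipeline: raw file arrives -> calibrate -> done.
engine = WorkflowEngine()
engine.add_rule("file_arrived", lambda d: d["size"] > 0,
                lambda d: ("calibrated", {"size": d["size"], "level": 1}))
engine.add_rule("calibrated", lambda d: True,
                lambda d: None)
engine.dispatch("file_arrived", {"size": 42})
```

Plug-in programs would slot in as the `action` callables, and per-mission rules stay in configuration rather than code.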

  17. A hybrid multiview stereo algorithm for modeling urban scenes.

    PubMed

    Lafarge, Florent; Keriven, Renaud; Brédif, Mathieu; Vu, Hoang-Hiep

    2013-01-01

We present an original multiview stereo reconstruction algorithm which allows the 3D-modeling of urban scenes as a combination of meshes and geometric primitives. The method provides a compact model while preserving details: Irregular elements such as statues and ornaments are described by meshes, whereas regular structures such as columns and walls are described by primitives (planes, spheres, cylinders, cones, and tori). We adopt a two-step strategy consisting first in segmenting the initial mesh-based surface using a multilabel Markov Random Field-based model and second in sampling primitive and mesh components simultaneously on the obtained partition by a Jump-Diffusion process. The quality of a reconstruction is measured by a multi-object energy model which takes into account both photo-consistency and semantic considerations (i.e., geometry and shape layout). The segmentation and sampling steps are embedded into an iterative refinement procedure which provides an increasingly accurate hybrid representation. Experimental results on complex urban structures and large scenes are presented and compared to state-of-the-art multiview stereo meshing algorithms.

  18. Triggers of key calcium signals during erythrocyte invasion by Plasmodium falciparum

    PubMed Central

    Gao, Xiaohong; Gunalan, Karthigayan; Yap, Sally Shu Lin; Preiser, Peter R.

    2013-01-01

    Invasion of erythrocytes by Plasmodium falciparum merozoites is a complex multi-step process mediated by specific interactions between host receptors and parasite ligands. Reticulocyte-binding protein homologues (RHs) and erythrocyte-binding-like (EBL) proteins are discharged from specialized organelles and used in early steps of invasion. Here we show that monoclonal antibodies against PfRH1 (an RH) block merozoite invasion by specifically inhibiting calcium signalling in the parasite, whereas invasion-inhibiting monoclonal antibodies targeting EBA175 (an EBL protein) have no effect on signalling. We further show that inhibition of this calcium signalling prevents EBA175 discharge and thereby formation of the junction between parasite and host cell. Our results indicate that PfRH1 has an initial sensing as well as signal transduction role that leads to the subsequent release of EBA175. They also provide new insights on how RH–host cell interactions lead to essential downstream signalling events in the parasite, suggesting new targets for malaria intervention. PMID:24280897

  19. Three steps to gold: mechanism of protein adsorption revealed by Brownian and molecular dynamics simulations.

    PubMed

    Ozboyaci, M; Kokh, D B; Wade, R C

    2016-04-21

    The addition of three N-terminal histidines to β-lactamase inhibitor protein was shown experimentally to increase its binding potency to an Au(111) surface substantially but the binding mechanism was not resolved. Here, we propose a complete adsorption mechanism for this fusion protein by means of a multi-scale simulation approach and free energy calculations. We find that adsorption is a three-step process: (i) recognition of the surface predominantly by the histidine fusion peptide and formation of an encounter complex facilitated by a reduced dielectric screening of water in the interfacial region, (ii) adsorption of the protein on the surface and adoption of a specific binding orientation, and (iii) adaptation of the protein structure on the metal surface accompanied by induced fit. We anticipate that the mechanistic features of protein adsorption to an Au(111) surface revealed here can be extended to other inorganic surfaces and proteins and will therefore aid the design of specific protein-surface interactions.

  20. Creating Knock-outs of Conserved Oligomeric Golgi complex subunits using CRISPR-mediated gene editing paired with a selection strategy based on glycosylation defects associated with impaired COG complex function

    PubMed Central

    Blackburn, Jessica Bailey; Lupashin, Vladimir V.

    2017-01-01

Summary: The Conserved Oligomeric Golgi (COG) complex is a key evolutionarily conserved multisubunit protein machine that regulates tethering and fusion of intra-Golgi transport vesicles. The Golgi apparatus specifically promotes sorting and complex glycosylation of glycoconjugates; without proper glycosylation and processing, proteins and lipids will be mislocalized and/or have impaired function. The Golgi glycosylation machinery is kept in homeostasis by a careful balance of anterograde and retrograde trafficking to ensure proper localization of the glycosylation enzymes and their substrates. This balance, like other steps of membrane trafficking, is maintained by vesicle trafficking machinery that includes COPI vesicular coat proteins, SNAREs, Rabs, and both coiled-coil and multi-subunit vesicular tethers. The COG complex interacts with other membrane trafficking components and is essential for proper localization of the Golgi glycosylation machinery. Here we describe using CRISPR-mediated gene editing, coupled with a phenotype-based selection strategy directly linked to the COG complex's role in glycosylation homeostasis, to obtain COG complex subunit knock-outs (KOs). This has resulted in clonal KOs for each COG subunit in HEK293T cells and enables further probing of the role of the COG complex in Golgi homeostasis. PMID:27632008

  1. Complex network analysis of brain functional connectivity under a multi-step cognitive task

    NASA Astrophysics Data System (ADS)

    Cai, Shi-Min; Chen, Wei; Liu, Dong-Bai; Tang, Ming; Chen, Xun

    2017-01-01

Functional brain networks have been widely studied to understand the relationship between brain organization and behavior. In this paper, we explore the functional connectivity of the brain network under a multi-step cognitive task involving consecutive behaviors, to better understand the effect of behaviors on brain organization. The functional brain networks are constructed from a high spatial- and temporal-resolution fMRI dataset and analyzed with complex-network methods. We find that at the voxel level the functional brain network shows robust small-worldness and scale-free characteristics, while its assortativity and rich-club organization depend only slightly on the order in which the behaviors are performed. More interestingly, the functional connectivity of the network in activated ROIs correlates strongly with the behaviors and clearly depends on the order in which they are performed. These empirical results suggest that brain organization has the generic properties of small-worldness and scale-free connectivity, and that the diverse functional connectivity emerging from activated ROIs is strongly driven by behavioral activities via the plasticity of the brain.
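The first step in studies like this one is turning voxel time series into a graph. A minimal sketch of that construction (not the paper's pipeline; the 0.5 threshold is an arbitrary assumption) is:

```python
import numpy as np

def connectivity_network(timeseries, threshold=0.5):
    """Build a binary functional-connectivity graph: nodes are voxels/ROIs
    (rows of `timeseries`), and an edge links any pair whose time-series
    correlation magnitude exceeds `threshold`."""
    corr = np.corrcoef(timeseries)           # pairwise Pearson correlations
    adj = (np.abs(corr) > threshold).astype(int)
    np.fill_diagonal(adj, 0)                 # no self-loops
    return adj

def degrees(adj):
    """Node degrees, the starting point for scale-free / rich-club analysis."""
    return adj.sum(axis=1)
```

Network measures such as clustering, assortativity, and rich-club coefficients are then computed on `adj`.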

  2. DReAM: Demand Response Architecture for Multi-level District Heating and Cooling Networks

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bhattacharya, Saptarshi; Chandan, Vikas; Arya, Vijay

In this paper, we exploit the inherent hierarchy of heat exchangers in District Heating and Cooling (DHC) networks and propose DReAM, a novel Demand Response (DR) architecture for multi-level DHC networks. DReAM serves to economize system operation while still respecting the comfort requirements of individual consumers. Contrary to many present-day DR schemes that work at consumer-level granularity, DReAM works at a level of hierarchy above buildings, i.e. substations that supply heat to a group of buildings. This improves DR scalability and reduces computational complexity. In the first step of the proposed approach, mathematical models of individual substations and their downstream networks are abstracted into appropriately constructed low-complexity structural forms. In the second step, this abstracted information is employed by the utility to perform DR optimization that determines the optimal heat inflow to individual substations rather than buildings, in order to achieve the targeted objectives across the network. We validate the proposed DReAM framework through experimental results under different scenarios on a test network.
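The utility-side second step can be illustrated with a toy allocation problem. This is not the DReAM formulation: it assumes each substation's abstracted model reduces to a quadratic discomfort term w_i(h_i - d_i)^2 around its demand d_i, and solves the supply-constrained minimization by bisection on the Lagrange multiplier.

```python
import numpy as np

def allocate_heat(demand, weight, supply, tol=1e-9):
    """Distribute a limited heat supply across substations by minimising
    sum_i w_i*(h_i - d_i)^2 subject to sum(h) <= supply and h >= 0.
    The KKT solution is h_i = max(d_i - lambda/(2*w_i), 0); we find
    lambda by bisection."""
    demand, weight = np.asarray(demand, float), np.asarray(weight, float)
    if demand.sum() <= supply:
        return demand.copy()                 # no curtailment needed
    def total(lmbda):                        # total inflow at multiplier lmbda
        return np.clip(demand - lmbda / (2.0 * weight), 0.0, None).sum()
    lo, hi = 0.0, 2.0 * weight.max() * demand.max()
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if total(mid) > supply:              # still over-supplying: raise lambda
            lo = mid
        else:
            hi = mid
    return np.clip(demand - hi / (2.0 * weight), 0.0, None)
```

The weights let the utility protect comfort-critical substations: a larger w_i shifts curtailment onto the others.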

  3. Adapting hierarchical bidirectional inter prediction on a GPU-based platform for 2D and 3D H.264 video coding

    NASA Astrophysics Data System (ADS)

    Rodríguez-Sánchez, Rafael; Martínez, José Luis; Cock, Jan De; Fernández-Escribano, Gerardo; Pieters, Bart; Sánchez, José L.; Claver, José M.; de Walle, Rik Van

    2013-12-01

    The H.264/AVC video coding standard introduces some improved tools in order to increase compression efficiency. Moreover, the multi-view extension of H.264/AVC, called H.264/MVC, adopts many of them. Among the new features, variable block-size motion estimation is one which contributes to high coding efficiency. Furthermore, it defines a different prediction structure that includes hierarchical bidirectional pictures, outperforming traditional Group of Pictures patterns in both scenarios: single-view and multi-view. However, these video coding techniques have high computational complexity. Several techniques have been proposed in the literature over the last few years which are aimed at accelerating the inter prediction process, but there are no works focusing on bidirectional prediction or hierarchical prediction. In this article, with the emergence of many-core processors or accelerators, a step forward is taken towards an implementation of an H.264/AVC and H.264/MVC inter prediction algorithm on a graphics processing unit. The results show a negligible rate distortion drop with a time reduction of up to 98% for the complete H.264/AVC encoder.

  4. Dynamic behavior of the weld pool in stationary GMAW

    NASA Astrophysics Data System (ADS)

    Chapuis, J.; Romero, E.; Bordreuil, C.; Soulié, F.; Fras, G.

    2010-06-01

Because hump formation limits welding productivity, a better understanding of the humping phenomena during the welding process is needed in order to identify process modifications that decrease the tendency for hump formation and thereby allow higher-productivity welding. From a physical point of view, the mechanism identified is a Rayleigh instability initiated by a strong surface-tension gradient, which induces a variation of the kinetic flow. But the causes of the appearance of this instability are not yet well explained. Because the phenomena are complex and multi-physics, we chose as a first step to conduct an analysis of the characteristic times involved in the weld pool in pulsed stationary GMAW. The goal is to study the dynamic behavior of the weld pool using our experimental multi-physics approach. The experimental tools and methodology developed to understand these fast phenomena are presented first: frame acquisition with a high-speed digital camera and specific optical devices, and a numerical library. The analysis of the geometric parameters of the weld pool during the welding operation is presented in the last part: we observe the variations of the wetting angles (or contact-line angles), and of the base and the height of the weld pool (macro-drop) versus weld time.

  5. HARV ANSER Flight Test Data Retrieval and Processing Procedures

    NASA Technical Reports Server (NTRS)

    Yeager, Jessie C.

    1997-01-01

    Under the NASA High-Alpha Technology Program the High Alpha Research Vehicle (HARV) was used to conduct flight tests of advanced control effectors, advanced control laws, and high-alpha design guidelines for future super-maneuverable fighters. The High-Alpha Research Vehicle is a pre-production F/A-18 airplane modified with a multi-axis thrust-vectoring system for augmented pitch and yaw control power and Actuated Nose Strakes for Enhanced Rolling (ANSER) to augment body-axis yaw control power. Flight testing at the Dryden Flight Research Center (DFRC) began in July 1995 and continued until May 1996. Flight data will be utilized to evaluate control law performance and aircraft dynamics, determine aircraft control and stability derivatives using parameter identification techniques, and validate design guidelines. To accomplish these purposes, essential flight data parameters were retrieved from the DFRC data system and stored on the Dynamics and Control Branch (DCB) computer complex at Langley. This report describes the multi-step task used to retrieve and process this data and documents the results of these tasks. Documentation includes software listings, flight information, maneuver information, time intervals for which data were retrieved, lists of data parameters and definitions, and example data plots.

  6. Combining chemometric tools for assessing hazard sources and factors acting simultaneously in contaminated areas. Case study: "Mar Piccolo" Taranto (South Italy).

    PubMed

    Mali, Matilda; Dell'Anna, Maria Michela; Notarnicola, Michele; Damiani, Leonardo; Mastrorilli, Piero

    2017-10-01

Almost all marine coastal ecosystems possess complex structural and dynamic characteristics, influenced by both anthropogenic causes and natural processes. Revealing the impact of the sources and factors controlling the spatial distribution of contaminants within highly polluted areas is a fundamental propaedeutic step of their quality evaluation. The combination of different pattern-recognition techniques, applied to one of the most polluted Mediterranean coastal basins, resulted in a more reliable hazard assessment. PCA/CA and factorial ANOVA were exploited as complementary techniques for apprehending the impact of multiple sources and factors acting simultaneously and leading to similarities or differences in the spatial contamination pattern. The combination of PCA/CA and factorial ANOVA allowed, on the one hand, determination of the main processes and factors controlling the contamination trend within different layers and different basins and, on the other hand, ascertainment of possible synergistic effects. This approach showed the significance of the spatially representative overview given by the combination of PCA-CA/ANOVA in inferring the historical anthropogenic sources loading on the area. Copyright © 2017 Elsevier Ltd. All rights reserved.
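The PCA step of such a chemometric workflow is standard enough to sketch. The function below is a generic illustration (not the paper's software): it autoscales a samples-by-variables contamination matrix, as is customary in chemometrics, and extracts components via SVD.

```python
import numpy as np

def pca(X):
    """PCA on a samples-by-variables matrix after autoscaling (centre each
    variable and scale it to unit variance), computed via SVD.
    Returns (scores, loadings, explained-variance fractions)."""
    Z = (X - X.mean(axis=0)) / X.std(axis=0)
    U, s, Vt = np.linalg.svd(Z, full_matrices=False)
    scores = U * s                     # sample coordinates on the PCs
    explained = s**2 / np.sum(s**2)    # variance fraction per component
    return scores, Vt, explained
```

Cluster analysis (CA) would then group samples in the score space, and factorial ANOVA would test factor effects (layer, basin) on the scores.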

  7. Laser-induced transformation of supramolecular complexes: approach to controlled formation of hybrid multi-yolk-shell Au-Ag@a-C:H nanostructures

    PubMed Central

    Manshina, A. A.; Grachova, E. V.; Povolotskiy, A. V.; Povolotckaia, A. V.; Petrov, Y. V.; Koshevoy, I. O.; Makarova, A. A.; Vyalikh, D. V.; Tunik, S. P.

    2015-01-01

In the present work an efficient approach to the controlled formation of hybrid Au–Ag–C nanostructures, based on laser-induced transformation of an organometallic supramolecular cluster compound, is suggested. The one-step laser-induced synthesis of hybrid multi-yolk-shell Au-Ag@a-C:H nanoparticles, i.e. bimetallic gold-silver subnanoclusters dispersed in nanospheres of amorphous hydrogenated carbon (a-C:H), is reported in detail. It is demonstrated that variation of the experimental parameters, such as the type of organometallic precursor, solvent, deposition geometry and duration of laser irradiation, allows directed control of the nanoparticles' dimensions and morphology. A mechanism of Au-Ag@a-C:H nanoparticle formation is suggested: photo-excitation of the precursor molecule through metal-to-ligand charge transfer, followed by rupture of metallophilic bonds, transformation of the cluster core including an intramolecular redox reaction, and aggregation of heterometallic species, resulting in hybrid metal/carbon nanoparticles with a multi-yolk-shell architecture. The nanoparticles obtained can be used for efficient label-free Surface-Enhanced Raman Spectroscopy detection of human serum albumin in low-concentration solutions. PMID:26153347

  8. Expression of metastasis suppressor BRMS1 in breast cancer cells results in a marked delay in cellular adhesion to matrix

    USDA-ARS?s Scientific Manuscript database

    Metastatic dissemination is a multi-step process that depends on cancer cells’ ability to respond to microenvironmental cues by adapting adhesion abilities and undergoing cytoskeletal rearrangement. Breast Cancer Metastasis Suppressor 1 (BRMS1) affects several steps of the metastatic cascade: it dec...

  9. Implementing the Indiana Model. Indiana Leadership Consortium: Equity through Change.

    ERIC Educational Resources Information Center

    Indiana Leadership Consortium.

    This guide, which was developed as a part of a multi-year, statewide effort to institutionalize gender equity in various educational settings throughout Indiana, presents a step-by-step process model for achieving gender equity in the state's secondary- and postsecondary-level vocational programs through coalition building and implementation of a…

  10. Vesselness propagation: a fast interactive vessel segmentation method

    NASA Astrophysics Data System (ADS)

    Cai, Wenli; Dachille, Frank; Harris, Gordon J.; Yoshida, Hiroyuki

    2006-03-01

    With the rapid development of multi-detector computed tomography (MDCT), resulting in increasing temporal and spatial resolution of data sets, clinical use of computed tomographic angiography (CTA) is rapidly increasing. Analysis of vascular structures is much needed in CTA images; however, the basis of the analysis, vessel segmentation, can still be a challenging problem. In this paper, we present a fast interactive method for CTA vessel segmentation, called vesselness propagation. This method is a two-step procedure, with a pre-processing step and an interactive step. During the pre-processing step, a vesselness volume is computed by application of a CTA transfer function followed by a multi-scale Hessian filtering. At the interactive stage, the propagation is controlled interactively in terms of the priority of the vesselness. This method was used successfully in many CTA applications such as the carotid artery, coronary artery, and peripheral arteries. It takes less than one minute for a user to segment the entire vascular structure. Thus, the proposed method provides an effective way of obtaining an overview of vascular structures.
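The interactive stage described above (propagation "controlled in terms of the priority of the vesselness") can be sketched as a greedy region growing that always expands the voxel with the highest vesselness first. The snippet below is a hypothetical, simplified 2-D re-implementation of that idea; the precomputed vesselness map, the 4-connected neighborhood, and the threshold are illustrative assumptions, not details from the paper.

```python
import heapq
import numpy as np

def vesselness_propagation(vesselness, seed, threshold=0.2):
    """Greedy propagation: always expand the unvisited voxel with the
    highest vesselness, stopping below a threshold (2-D for brevity)."""
    visited = np.zeros(vesselness.shape, dtype=bool)
    heap = [(-vesselness[seed], seed)]   # max-heap via negated scores
    segmented = []
    while heap:
        score, (r, c) = heapq.heappop(heap)
        if visited[r, c] or -score < threshold:
            continue
        visited[r, c] = True
        segmented.append((r, c))
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if (0 <= nr < vesselness.shape[0]
                    and 0 <= nc < vesselness.shape[1]
                    and not visited[nr, nc]):
                heapq.heappush(heap, (-vesselness[nr, nc], (nr, nc)))
    return segmented

# Toy map: a bright horizontal "vessel" on a dark background.
v = np.full((5, 5), 0.05)
v[2, :] = 0.9
print(vesselness_propagation(v, (2, 0)))  # the five voxels of row 2
```

Because expansion is priority-ordered, user interaction maps naturally onto pausing or thresholding the propagation rather than editing voxels directly.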

  11. MultiDrizzle: An Integrated Pyraf Script for Registering, Cleaning and Combining Images

    NASA Astrophysics Data System (ADS)

    Koekemoer, A. M.; Fruchter, A. S.; Hook, R. N.; Hack, W.

    We present the new PyRAF-based `MultiDrizzle' script, which is aimed at providing a one-step approach to combining dithered HST images. The purpose of this script is to allow easy interaction with the complex suite of tasks in the IRAF/STSDAS `dither' package, as well as the new `PyDrizzle' task, while at the same time retaining the flexibility of these tasks through a number of parameters. These parameters control the various individual steps, such as sky subtraction, image registration, `drizzling' onto separate output images, creation of a clean median image, transformation of the median with `blot' and creation of cosmic ray masks, as well as the final image combination step using `drizzle'. The default parameters of all the steps are set so that the task will work automatically for a wide variety of different types of images, while at the same time allowing adjustment of individual parameters for special cases. The script currently works for both ACS and WFPC2 data, and is now being tested on STIS and NICMOS images. We describe the operation of the script and the effect of various parameters, particularly in the context of combining images from dithered observations using ACS and WFPC2. Additional information is also available at the `MultiDrizzle' home page: http://www.stsci.edu/~koekemoe/multidrizzle/

  12. Simplified Helium Refrigerator Cycle Analysis Using the `Carnot Step'

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    P. Knudsen; V. Ganni

    2006-05-01

    An analysis of the Claude form of an idealized helium liquefier for minimum input work reveals the "Carnot Step" for helium refrigerator cycles. Just as the "Carnot Step" for a multi-stage polytropic compression process consists of equal-pressure-ratio stages, so the "Carnot Step" for an idealized helium liquefier consists of equal-temperature-ratio stages for a given number of expansion stages. This paper presents the analytical basis and some useful equations for the preliminary examination of existing and new Claude helium refrigeration cycles.
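The equal-temperature-ratio staging can be illustrated with a few lines of arithmetic: for n expansion stages between a warm and a cold boundary temperature, each stage spans the same ratio r = (T_warm/T_cold)**(1/n). A minimal sketch follows; the 300 K and 4.5 K endpoints are illustrative, not values from the paper.

```python
def carnot_step_temperatures(t_warm, t_cold, n_stages):
    """Stage boundary temperatures for equal-temperature-ratio staging:
    each of the n_stages spans the same ratio r = (t_warm/t_cold)**(1/n)."""
    r = (t_warm / t_cold) ** (1.0 / n_stages)
    return [t_warm / r ** i for i in range(n_stages + 1)]

# 300 K down to 4.5 K in 3 expansion stages: endpoints are fixed and
# every stage carries the same temperature ratio.
temps = carnot_step_temperatures(300.0, 4.5, 3)
print([round(t, 2) for t in temps])
```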

  13. A Multi-touch Tool for Co-creation

    NASA Astrophysics Data System (ADS)

    Ludden, Geke D. S.; Broens, Tom

    Multi-touch technology provides an attractive way for knowledge workers to collaborate. Co-creation is an important collaboration process in which collecting resources, creating results and distributing these results is essential. We propose a wall-based multi-touch system (called CoCreate) in which these steps are made easy due to the notion of connected private spaces and a shared co-create space. We present our ongoing work, expert evaluation of interaction scenarios and future plans.

  14. Automated Geo/Co-Registration of Multi-Temporal Very-High-Resolution Imagery.

    PubMed

    Han, Youkyung; Oh, Jaehong

    2018-05-17

    For time-series analysis using very-high-resolution (VHR) multi-temporal satellite images, both accurate georegistration to the map coordinates and subpixel-level co-registration among the images should be conducted. However, applying well-known matching methods, such as scale-invariant feature transform and speeded up robust features for VHR multi-temporal images, has limitations. First, they cannot be used for matching an optical image to heterogeneous non-optical data for georegistration. Second, they produce a local misalignment induced by differences in acquisition conditions, such as acquisition platform stability, the sensor's off-nadir angle, and relief displacement of the considered scene. Therefore, this study addresses the problem by proposing an automated geo/co-registration framework for full-scene multi-temporal images acquired from a VHR optical satellite sensor. The proposed method comprises two primary steps: (1) a global georegistration process, followed by (2) a fine co-registration process. During the first step, two-dimensional multi-temporal satellite images are matched to three-dimensional topographic maps to assign the map coordinates. During the second step, a local analysis of registration noise pixels extracted between the multi-temporal images that have been mapped to the map coordinates is conducted to extract a large number of well-distributed corresponding points (CPs). The CPs are finally used to construct a non-rigid transformation function that enables minimization of the local misalignment existing among the images. Experiments conducted on five Kompsat-3 full scenes confirmed the effectiveness of the proposed framework, showing that the georegistration performance resulted in an approximately pixel-level accuracy for most of the scenes, and the co-registration performance further improved the results among all combinations of the georegistered Kompsat-3 image pairs by increasing the calculated cross-correlation values.
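As a toy illustration of how one corresponding point might be extracted in the fine co-registration step, the sketch below runs an exhaustive integer-shift search maximizing the correlation between a reference patch and a target patch. The patch size, circular shifts via np.roll, and the small search radius are simplifying assumptions; the paper's registration-noise analysis and non-rigid transformation are far more elaborate.

```python
import numpy as np

def best_offset(ref, tgt, search=2):
    """Return the integer (dy, dx) shift of `tgt` that maximizes its
    correlation with `ref`, plus the correlation value itself."""
    best, best_cc = (0, 0), -2.0
    for dy in range(-search, search + 1):
        for dx in range(-search, search + 1):
            shifted = np.roll(tgt, (dy, dx), axis=(0, 1))
            cc = np.corrcoef(ref.ravel(), shifted.ravel())[0, 1]
            if cc > best_cc:
                best_cc, best = cc, (dy, dx)
    return best, best_cc

rng = np.random.default_rng(0)
ref = rng.random((16, 16))
tgt = np.roll(ref, (1, -1), axis=(0, 1))  # target misaligned by (+1, -1)
print(best_offset(ref, tgt)[0])  # (-1, 1): the shift that re-aligns it
```

Repeating such a search over many well-distributed patches yields the set of CPs from which a non-rigid transformation can then be fitted.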

  15. Glial brain tumor detection by using symmetry analysis

    NASA Astrophysics Data System (ADS)

    Pedoia, Valentina; Binaghi, Elisabetta; Balbi, Sergio; De Benedictis, Alessandro; Monti, Emanuele; Minotto, Renzo

    2012-02-01

    In this work, a fully automatic algorithm to detect brain tumors by using symmetry analysis is proposed. In recent years, considerable research effort in medical imaging has focused on brain tumor segmentation. Quantitative analysis of MRI brain tumors yields useful key indicators of disease progression. The complex problem of segmenting tumors in MRI can be successfully addressed by considering modular and multi-step approaches mimicking the human visual inspection process. Tumor detection is often an essential preliminary phase for solving the segmentation problem successfully. In visual analysis of MRI, the first step of the expert's cognitive process is the detection of an anomaly with respect to normal tissue, whatever its nature. A healthy brain has a strong sagittal symmetry that is weakened by the presence of a tumor. The comparison between the healthy and ill hemispheres, considering that tumors are generally not symmetrically placed in both hemispheres, is used to detect the anomaly. A clustering method based on energy minimization through Graph-Cut is applied to the volume computed as the difference between the left hemisphere and the right hemisphere mirrored across the symmetry plane. Differential analysis entails losing the knowledge of which side the tumor is on; the ill hemisphere is then recognized through a histogram analysis. Many experiments were performed to assess the performance of the detection strategy on MRI volumes in the presence of tumors varying in shape, position, and intensity level. The experiments showed good results also in complex situations.
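The mirror-and-subtract step can be sketched in a few lines: flip the volume across the mid-sagittal plane, take the absolute difference, and flag the hemisphere carrying the extra intensity. This is a simplified stand-in for the paper's differential and histogram analysis; it assumes a pre-aligned array whose symmetry plane is simply the central left-right axis, which real data would not satisfy without prior symmetry-plane estimation.

```python
import numpy as np

def asymmetry_map(volume):
    """Symmetric difference across the (assumed central) sagittal plane,
    plus a crude decision on which hemisphere is the ill one."""
    mirrored = volume[..., ::-1]          # left-right flip
    diff = np.abs(volume - mirrored)      # symmetric difference map
    half = volume.shape[-1] // 2
    left, right = volume[..., :half], volume[..., half:]
    ill_side = "left" if left.sum() > right.sum() else "right"
    return diff, ill_side

# Toy slice: a bright "tumor" in the left hemisphere breaks the symmetry.
slice_ = np.zeros((4, 6))
slice_[1, 1] = 5.0
diff, side = asymmetry_map(slice_)
print(side, diff.max())  # left 5.0
```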

  16. Evaluation of TOPLATS on three Mediterranean catchments

    NASA Astrophysics Data System (ADS)

    Loizu, Javier; Álvarez-Mozos, Jesús; Casalí, Javier; Goñi, Mikel

    2016-08-01

    Physically based hydrological models are complex tools that provide a complete description of the different processes occurring in a catchment. The TOPMODEL-based Land-Atmosphere Transfer Scheme (TOPLATS) simulates water and energy balances at different time steps, in both lumped and distributed modes. In order to gain insight into the behavior of TOPLATS and its applicability in different conditions, a detailed evaluation needs to be carried out. This study aimed to develop a complete evaluation of TOPLATS including: (1) a detailed review of previous research works using this model; (2) a sensitivity analysis (SA) of the model with two contrasted methods (Morris and Sobol) of different complexity; (3) a 4-step calibration strategy based on a multi-start Powell optimization algorithm; and (4) an analysis of the influence of simulation time step (hourly vs. daily). The model was applied to three catchments of varying size (La Tejeria, Cidacos and Arga), located in Navarre (Northern Spain), and characterized by different levels of Mediterranean climate influence. Both Morris and Sobol methods showed very similar results that identified Brooks-Corey Pore Size distribution Index (B), Bubbling pressure (ψc) and Hydraulic conductivity decay (f) as the three overall most influential parameters in TOPLATS. After calibration and validation, adequate streamflow simulations were obtained in the two wettest catchments, but the driest (Cidacos) gave poor results in validation due to the large climatic variability between calibration and validation periods. To overcome this issue, an alternative random and discontinuous method of cal/val period selection was implemented, improving model results.
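The Morris screening used in the sensitivity analysis rests on "elementary effects": one-at-a-time perturbations of each parameter, with the mean absolute effect ranking the parameters. The sketch below is a simplified version of that idea (independent random base points instead of the usual trajectory design, and a toy linear model in place of TOPLATS); it is illustrative only.

```python
import random

def morris_mu_star(f, n_params, delta=0.1, trajectories=50, seed=0):
    """Mean absolute elementary effect per parameter, in the spirit of
    the Morris method (simplified sampling design)."""
    random.seed(seed)
    mu_star = [0.0] * n_params
    for _ in range(trajectories):
        x = [random.random() for _ in range(n_params)]
        fx = f(x)
        for i in range(n_params):
            xi = list(x)
            xi[i] += delta
            mu_star[i] += abs(f(xi) - fx) / delta   # |elementary effect|
    return [m / trajectories for m in mu_star]

# Toy model: parameter 0 dominates, parameter 2 is inert.
effects = morris_mu_star(lambda p: 10 * p[0] + 2 * p[1], 3)
print([round(e, 1) for e in effects])  # [10.0, 2.0, 0.0]
```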

  17. Uncertainties in Eddy Covariance fluxes due to post-field data processing: a multi-site, full factorial analysis

    NASA Astrophysics Data System (ADS)

    Sabbatini, S.; Fratini, G.; Arriga, N.; Papale, D.

    2012-04-01

    Eddy Covariance (EC) is the only technologically available direct method to measure carbon and energy fluxes between ecosystems and atmosphere. However, uncertainties related to this method have not been exhaustively assessed yet, including those deriving from post-field data processing. The latter arise because there is no exact processing sequence established for any given situation, and the sequence itself is long and complex, with many processing steps and options available. However, the consistency and inter-comparability of flux estimates may be largely affected by the adoption of different processing sequences. The goal of our work is to quantify the uncertainty introduced in each processing step by the fact that different options are available, and to study how the overall uncertainty propagates throughout the processing sequence. We propose an easy-to-use methodology to assign a confidence level to the calculated fluxes of energy and mass, based on the adopted processing sequence, and on available information such as the EC system type (e.g. open vs. closed path), the climate and the ecosystem type. The proposed methodology synthesizes the results of a massive full-factorial experiment. We use one year of raw data from 15 European flux stations and process them so as to cover all possible combinations of the available options across a selection of the most relevant processing steps. The 15 sites have been selected to be representative of different ecosystems (forests, croplands and grasslands), climates (mediterranean, nordic, arid and humid) and instrumental setup (e.g. open vs. closed path). The software used for this analysis is EddyPro™ 3.0 (www.licor.com/eddypro). 
The critical processing steps, selected on the basis of the different options commonly used in the FLUXNET community, are: angle of attack correction; coordinate rotation; trend removal; time lag compensation; low- and high- frequency spectral correction; correction for air density fluctuations; and length of the flux averaging interval. We illustrate the results of the full-factorial combination relative to a subset of the selected sites with particular emphasis on the total uncertainty at different time scales and aggregations, as well as a preliminary analysis of the most critical steps for their contribution to the total uncertainties and their potential relation with site set-up characteristics and ecosystem type.
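The full-factorial design over processing options amounts to enumerating every combination of every option for each step. The sketch below shows the enumeration with a hypothetical subset of option sets; the names and choices are illustrative, not the exact options implemented in EddyPro or used in the study.

```python
from itertools import product

# Hypothetical option sets for a few of the processing steps listed above.
options = {
    "rotation": ["double", "planar_fit"],
    "detrending": ["block_average", "linear"],
    "time_lag": ["constant", "covariance_max"],
    "avg_interval_min": [30, 60],
}

# Full-factorial design: one processing sequence per combination.
combinations = [dict(zip(options, values))
                for values in product(*options.values())]
print(len(combinations))  # 2 * 2 * 2 * 2 = 16 sequences per site
```

With the seven steps and multiple options per step named above, the real experiment multiplies out to a far larger set, which is why a massive automated run over 15 sites was needed.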

  18. Diastereoselective chain-elongation reactions using microreactors for applications in complex molecule assembly.

    PubMed

    Carter, Catherine F; Lange, Heiko; Sakai, Daiki; Baxendale, Ian R; Ley, Steven V

    2011-03-14

    Diastereoselective chain-elongation reactions are important transformations for the assembly of complex molecular structures, such as those present in polyketide natural products. Here we report new methods for performing crotylation reactions and homopropargylation reactions by using newly developed low-temperature flow-chemistry technology. In-line purification protocols are described, as well as the application of the crotylation protocol in an automated multi-step sequence. Copyright © 2011 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  19. CAPS and Munc13: CATCHRs that SNARE Vesicles.

    PubMed

    James, Declan J; Martin, Thomas F J

    2013-12-04

    CAPS (Calcium-dependent Activator Protein for Secretion, aka CADPS) and Munc13 (Mammalian Unc-13) proteins function to prime vesicles for Ca(2+)-triggered exocytosis in neurons and neuroendocrine cells. CAPS and Munc13 proteins contain conserved C-terminal domains that promote the assembly of SNARE complexes for vesicle priming. Similarities of the C-terminal domains of CAPS/Munc13 proteins with Complex Associated with Tethering Containing Helical Rods domains in multi-subunit tethering complexes (MTCs) have been reported. MTCs coordinate multiple interactions for SNARE complex assembly at constitutive membrane fusion steps. We review aspects of these diverse tethering and priming factors to identify common operating principles.

  20. Three-dimensional reconstruction of highly complex microscopic samples using scanning electron microscopy and optical flow estimation.

    PubMed

    Baghaie, Ahmadreza; Pahlavan Tafti, Ahmad; Owen, Heather A; D'Souza, Roshan M; Yu, Zeyun

    2017-01-01

    The Scanning Electron Microscope (SEM), as one of the major research and industrial instruments for imaging micro-scale samples and surfaces, has attracted extensive attention since its emergence. However, the acquired micrographs remain two-dimensional (2D). In the current work, a novel and highly accurate approach is proposed to recover the hidden third dimension by use of multi-view image acquisition of the microscopic samples combined with pre/post-processing steps including sparse feature-based stereo rectification, nonlocal-based optical flow estimation for dense matching, and finally depth estimation. Employing the proposed approach, three-dimensional (3D) reconstructions of highly complex microscopic samples were achieved, facilitating the interpretation of the topology and geometry of the samples' surface and shape attributes. As a byproduct of the proposed approach, high-definition 3D-printed models of the samples can be generated as a tangible means of physical understanding. Extensive comparisons with the state-of-the-art reveal the strength and superiority of the proposed method in uncovering the details of highly complex microscopic samples.

  1. CFD-Modeling of the Multistage Gasifier Capacity of 30 KW

    NASA Astrophysics Data System (ADS)

    Levin, A. A.; Kozlov, A. N.; Svishchev, D. A.; Donskoy, I. G.

    2017-11-01

    Single-stage fuel gasification processes were developed and widely studied in Russia and abroad throughout the 20th century. They are fundamental to the creation and design of modern gas-generator equipment. Many studies have shown that single-stage gasification processes have already reached their limit of refinement, so that further significant improvement of their performance becomes impossible or unprofitable. Multistage gasification technologies most fully meet modern technical requirements. In the first stage of the process, allothermal biomass pyrolysis is organized using the heat of the exhaust gas of the generating power plant; at this stage, the volatile products (gas and tar) are released from the fuel. In the second stage, the tar is decomposed in the fuel layer by the action of hot air and steam; the steam-gas mixture formed then reacts with the charcoal in the third stage of the process. The paper presents a model, developed by the authors, of a multi-stage gasifier for wood chips. The model is implemented in a CFD modeling software package (COMSOL Multiphysics). To describe the kinetics of wood pyrolysis and charcoal gasification, studies were carried out using simultaneous thermal analysis. Original methods of interpreting these measurements were developed, including methods for the technical (proximate) analysis of fuels and for determining the parameters of the detailed kinetics and mechanism of pyrolysis.

  2. Patterning control strategies for minimum edge placement error in logic devices

    NASA Astrophysics Data System (ADS)

    Mulkens, Jan; Hanna, Michael; Slachter, Bram; Tel, Wim; Kubis, Michael; Maslow, Mark; Spence, Chris; Timoshkov, Vadim

    2017-03-01

    In this paper we discuss the edge placement error (EPE) for multi-patterning semiconductor manufacturing. In a multi-patterning scheme the creation of the final pattern is the result of a sequence of lithography and etching steps, and consequently the contour of the final pattern contains error sources of the different process steps. We describe the fidelity of the final pattern in terms of EPE, which is defined as the relative displacement of the edges of two features from their intended target position. We discuss our holistic patterning optimization approach to understand and minimize the EPE of the final pattern. As an experimental test vehicle we use the 7-nm logic device patterning process flow as developed by IMEC. This patterning process is based on Self-Aligned-Quadruple-Patterning (SAQP) using ArF lithography, combined with line cut exposures using EUV lithography. The computational metrology method to determine EPE is explained. It will be shown that ArF to EUV overlay, CDU from the individual process steps, and local CD and placement of the individual pattern features, are the important contributors. Based on the error budget, we developed an optimization strategy for each individual step and for the final pattern. Solutions include overlay and CD metrology based on angle resolved scatterometry, scanner actuator control to enable high order overlay corrections and computational lithography optimization to minimize imaging induced pattern placement errors of devices and metrology targets.
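A common way to reason about such an error budget is to combine independent contributors in quadrature (root-sum-square). The toy sketch below groups the contributors named above (overlay, CDU, local CD/placement) into three terms; the RSS model and the numbers are illustrative assumptions, not the paper's actual budget or values.

```python
from math import sqrt

def epe_budget_nm(overlay, cdu_half, local_pp):
    """Toy edge-placement-error budget: independent contributors
    combined in quadrature, all in nanometers."""
    return sqrt(overlay ** 2 + cdu_half ** 2 + local_pp ** 2)

# Illustrative contributor magnitudes (nm): overlay, half-CDU,
# local pattern placement.
print(round(epe_budget_nm(2.0, 1.5, 1.0), 2))  # ≈ 2.69 nm total EPE
```

A budget of this shape makes clear why each contributor needs its own optimization strategy: shrinking the largest term yields the biggest reduction in total EPE.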

  3. Incorporating physically-based microstructures in materials modeling: Bridging phase field and crystal plasticity frameworks

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lim, Hojun; Abdeljawad, Fadi; Owen, Steven J.

    Here, the mechanical properties of materials systems are highly influenced by various features at the microstructural level. The ability to capture these heterogeneities and incorporate them into continuum-scale frameworks of the deformation behavior is considered a key step in the development of complex non-local models of failure. In this study, we present a modeling framework that incorporates physically-based realizations of polycrystalline aggregates from a phase field (PF) model into a crystal plasticity finite element (CP-FE) framework. Simulated annealing via the PF model yields ensembles of materials microstructures with various grain sizes and shapes. With the aid of a novel FE meshing technique, FE discretizations of these microstructures are generated, where several key features, such as conformity to interfaces, and triple junction angles, are preserved. The discretizations are then used in the CP-FE framework to simulate the mechanical response of polycrystalline α-iron. It is shown that the conformal discretization across interfaces reduces artificial stress localization commonly observed in non-conformal FE discretizations. The work presented herein is a first step towards incorporating physically-based microstructures in lieu of the overly simplified representations that are commonly used. In broader terms, the proposed framework provides future avenues to explore bridging models of materials processes, e.g. additive manufacturing and microstructure evolution of multi-phase multi-component systems, into continuum-scale frameworks of the mechanical properties.

  4. Uncovering Hidden Layers of Cell Cycle Regulation through Integrative Multi-omic Analysis

    PubMed Central

    Aviner, Ranen; Shenoy, Anjana; Elroy-Stein, Orna; Geiger, Tamar

    2015-01-01

    Studying the complex relationship between transcription, translation and protein degradation is essential to our understanding of biological processes in health and disease. The limited correlations observed between mRNA and protein abundance suggest pervasive regulation of post-transcriptional steps and support the importance of profiling mRNA levels in parallel to protein synthesis and degradation rates. In this work, we applied an integrative multi-omic approach to study gene expression along the mammalian cell cycle through side-by-side analysis of mRNA, translation and protein levels. Our analysis sheds new light on the significant contribution of both protein synthesis and degradation to the variance in protein expression. Furthermore, we find that translation regulation plays an important role at S-phase, while progression through mitosis is predominantly controlled by changes in either mRNA levels or protein stability. Specific molecular functions are found to be co-regulated and share similar patterns of mRNA, translation and protein expression along the cell cycle. Notably, these include genes and entire pathways not previously implicated in cell cycle progression, demonstrating the potential of this approach to identify novel regulatory mechanisms beyond those revealed by traditional expression profiling. Through this three-level analysis, we characterize different mechanisms of gene expression, discover new cycling gene products and highlight the importance and utility of combining datasets generated using different techniques that monitor distinct steps of gene expression. PMID:26439921

  5. Incorporating physically-based microstructures in materials modeling: Bridging phase field and crystal plasticity frameworks

    DOE PAGES

    Lim, Hojun; Abdeljawad, Fadi; Owen, Steven J.; ...

    2016-04-25

    Here, the mechanical properties of materials systems are highly influenced by various features at the microstructural level. The ability to capture these heterogeneities and incorporate them into continuum-scale frameworks of the deformation behavior is considered a key step in the development of complex non-local models of failure. In this study, we present a modeling framework that incorporates physically-based realizations of polycrystalline aggregates from a phase field (PF) model into a crystal plasticity finite element (CP-FE) framework. Simulated annealing via the PF model yields ensembles of materials microstructures with various grain sizes and shapes. With the aid of a novel FE meshing technique, FE discretizations of these microstructures are generated, where several key features, such as conformity to interfaces, and triple junction angles, are preserved. The discretizations are then used in the CP-FE framework to simulate the mechanical response of polycrystalline α-iron. It is shown that the conformal discretization across interfaces reduces artificial stress localization commonly observed in non-conformal FE discretizations. The work presented herein is a first step towards incorporating physically-based microstructures in lieu of the overly simplified representations that are commonly used. In broader terms, the proposed framework provides future avenues to explore bridging models of materials processes, e.g. additive manufacturing and microstructure evolution of multi-phase multi-component systems, into continuum-scale frameworks of the mechanical properties.

  6. Multi-User Spaceport Update News Conference

    NASA Image and Video Library

    2014-01-23

    CAPE CANAVERAL, Fla. – Sierra Nevada Corporation (SNC) Space Systems announces the steps the company will take to prepare for a November 2016 orbital flight of its Dream Chaser spacecraft from Florida’s Space Coast during a news conference at NASA’s Kennedy Space Center in Florida. Participants are, from left, Michael Curie, NASA spokesman; Bob Cabana, director of Kennedy; Michael Gass, president and CEO of United Launch Alliance (ULA); Frank DiBello, president and CEO of Space Florida; Mark Sirangelo, corporate vice president and head of SNC Space Systems; Larry Price, Lockheed Martin Space Systems deputy program manager for NASA's Orion spacecraft; and Steve Lindsey, Dream Chaser program manager for SNC Space Systems. The steps are considered substantial for SNC and important to plans by NASA and Space Florida for Kennedy’s transformation into a multi-user spaceport for both commercial and government customers. SNC said it plans to work with ULA to launch the Dream Chaser spacecraft into orbit atop an Atlas V rocket from Space Launch Complex 41 at Cape Canaveral Air Force Station; to land the winged spacecraft on Kennedy’s 3.5-mile-long runway at the Shuttle Landing Facility; to lease office space at Exploration Park, right outside Kennedy’s gates; and to process the spacecraft in the high bay of the Operations and Checkout Building at Kennedy, with Lockheed Martin performing the work. Photo credit: NASA/Kim Shiflett

  7. Perry's Scheme of Intellectual and Epistemological Development as a Framework for Describing Student Difficulties in Learning Organic Chemistry

    ERIC Educational Resources Information Center

    Grove, Nathaniel P.; Bretz, Stacey Lowery

    2010-01-01

    We have investigated student difficulties with the learning of organic chemistry. Using Perry's Model of Intellectual Development as a framework revealed that organic chemistry students who function as dualistic thinkers struggle with the complexity of the subject matter. Understanding substitution/elimination reactions and multi-step syntheses is…

  8. Probabilities and Predictions: Modeling the Development of Scientific Problem-Solving Skills

    ERIC Educational Resources Information Center

    Stevens, Ron; Johnson, David F.; Soller, Amy

    2005-01-01

    The IMMEX (Interactive Multi-Media Exercises) Web-based problem set platform enables the online delivery of complex, multimedia simulations, the rapid collection of student performance data, and has already been used in several genetic simulations. The next step is the use of these data to understand and improve student learning in a formative…

  9. Dual-step synthesis of 3-dimensional niobium oxide - Zinc oxide

    NASA Astrophysics Data System (ADS)

    Rani, Rozina Abdul; Zoolfakar, Ahmad Sabirin; Rusop, M.

    2018-05-01

    A facile fabrication process for constructing a 3-dimensional (3D) structure of niobium oxide - zinc oxide (Nb2O5-ZnO), consisting of branched ZnO microrods on top of nanoporous Nb2O5 films, was developed based on a dual-step synthesis approach. The preliminary procedure was anodization of sputtered niobium metal on fluorine-doped tin oxide (FTO) to produce nanoporous Nb2O5, followed by the growth of branched ZnO microrods by a hydrothermal process. This approach offers insight into the development of novel 3D metal oxide films via a dual-step synthesis process, which might potentially be used for multi-functional applications ranging from sensing to photoconversion.

  10. Strategies for the structural analysis of multi-protein complexes: lessons from the 3D-Repertoire project.

    PubMed

    Collinet, B; Friberg, A; Brooks, M A; van den Elzen, T; Henriot, V; Dziembowski, A; Graille, M; Durand, D; Leulliot, N; Saint André, C; Lazar, N; Sattler, M; Séraphin, B; van Tilbeurgh, H

    2011-08-01

    Structural studies of multi-protein complexes, whether by X-ray diffraction, scattering, NMR spectroscopy or electron microscopy, require stringent quality control of the component samples. The inability to produce 'keystone' subunits in a soluble and correctly folded form is a serious impediment to the reconstitution of the complexes. Co-expression of the components offers a valuable alternative to the expression of single proteins as a route to obtain sufficient amounts of the sample of interest. Even in cases where milligram-scale quantities of the purified complex of interest become available, there is still no guarantee that good quality crystals can be obtained. At this step, protein engineering of one or more components of the complex is frequently required to improve solubility, yield or the ability to crystallize the sample. Subsequent characterization of these constructs may be performed by solution techniques such as Small Angle X-ray Scattering and Nuclear Magnetic Resonance to identify 'well behaved' complexes. Herein, we recount our experiences with protein production and complex assembly gained during the European 3D Repertoire project (3DR). The goal of this consortium was to obtain structural information on multi-protein complexes from yeast by combining crystallography, electron microscopy, NMR and in silico modeling methods. We present here a representative set of case studies of complexes that were produced and analyzed within the 3DR project. Our experience provides useful insight into strategies that are more generally applicable for structural analysis of protein complexes. Copyright © 2011 Elsevier Inc. All rights reserved.

  11. Multiple-input single-output closed-loop isometric force control using asynchronous intrafascicular multi-electrode stimulation.

    PubMed

    Frankel, Mitchell A; Dowden, Brett R; Mathews, V John; Normann, Richard A; Clark, Gregory A; Meek, Sanford G

    2011-06-01

    Although asynchronous intrafascicular multi-electrode stimulation (IFMS) can evoke fatigue-resistant muscle force, a priori determination of the necessary stimulation parameters for precise force production is not possible. This paper presents a proportionally-modulated, multiple-input single-output (MISO) controller that was designed and experimentally validated for real-time, closed-loop force-feedback control of asynchronous IFMS. Experiments were conducted on anesthetized felines with a Utah Slanted Electrode Array implanted in the sciatic nerve, either acutely or chronically ( n = 1 for each). Isometric forces were evoked in plantar-flexor muscles, and target forces consisted of up to 7 min of step, sinusoidal, and more complex time-varying trajectories. The controller was successful in evoking steps in force with time-to-peak of less than 0.45 s, steady-state ripple of less than 7% of the mean steady-state force, and near-zero steady-state error even in the presence of muscle fatigue, but with transient overshoot of near 20%. The controller was also successful in evoking target sinusoidal and complex time-varying force trajectories with amplitude error of less than 0.5 N and time delay of approximately 300 ms. This MISO control strategy can potentially be used to develop closed-loop asynchronous IFMS controllers for a wide variety of multi-electrode stimulation applications to restore lost motor function.
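The closed-loop idea, in which the stimulation command is modulated in proportion to the force error, can be sketched as a discrete-time loop. In the toy version below the command accumulates the scaled error (integral-like action, which is what drives the steady-state error toward zero), and a first-order lag stands in for the evoked-force dynamics; the gains and the plant model are illustrative assumptions, not the paper's MISO controller or muscle model.

```python
def proportional_force_control(target, gain=0.8, pole=0.2, steps=100):
    """Minimal closed-loop force sketch: command adjusted each step in
    proportion to the force error; first-order lag as a stand-in plant."""
    command, force, history = 0.0, 0.0, []
    for _ in range(steps):
        error = target - force
        command += gain * error            # error-proportional update
        force += pole * (command - force)  # first-order plant response
        history.append(force)
    return history

trace = proportional_force_control(target=5.0)
print(round(trace[-1], 2))  # ≈ 5.0: near-zero steady-state error
```

Even this toy loop reproduces the qualitative behavior reported in the abstract: a fast rise with some transient overshoot, settling to the target despite a plant the controller never identifies explicitly.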

  12. Supramolecular reactivity in the gas phase: investigating the intrinsic properties of non-covalent complexes.

    PubMed

    Cera, Luca; Schalley, Christoph A

    2014-03-21

    The high vacuum inside a mass spectrometer offers unique conditions to broaden our view on the reactivity of supramolecules. Because dynamic exchange processes between complexes are efficiently suppressed, the intrinsic and intramolecular reactivity of the complexes of interest is observed. Besides this, the significantly higher strength of non-covalent interactions in the absence of competing solvent allows processes to occur that are unable to compete in solution. The present review highlights a series of examples illustrating different aspects of supramolecular gas-phase reactivity ranging from the dissociation and formation of covalent bonds in non-covalent complexes through the reactivity in the restricted inner phase of container molecules and step-by-step mechanistic studies of organocatalytic reaction cycles to cage contraction reactions, processes induced by electron capture, and finally dynamic molecular motion within non-covalent complexes as unravelled by hydrogen-deuterium exchange processes performed in the gas phase.

  13. STEPS: Modeling and Simulating Complex Reaction-Diffusion Systems with Python

    PubMed Central

    Wils, Stefan; Schutter, Erik De

    2008-01-01

    We describe how the use of the Python language improved the user interface of the program STEPS. STEPS is a simulation platform for modeling and stochastic simulation of coupled reaction-diffusion systems with complex 3-dimensional boundary conditions. Setting up such models is a complicated process that consists of many phases. Initial versions of STEPS relied on a static input format that did not cleanly separate these phases, limiting modelers in how they could control the simulation and becoming increasingly complex as new features and new simulation algorithms were added. We solved all of these problems by tightly integrating STEPS with Python, using SWIG to expose our existing simulation code. PMID:19623245
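    The stochastic reaction kinetics at the core of such a simulator can be illustrated in a few lines of Python. This is a minimal well-mixed sketch of Gillespie's direct method for a single reaction A + B → C; STEPS itself additionally handles diffusion and complex 3-D boundary conditions, and all parameter values here are illustrative:

```python
import random

def gillespie(a0=100, b0=100, k=0.01, t_end=5.0, seed=42):
    """Gillespie direct method for the single reaction A + B -> C.
    (Well-mixed illustration only; STEPS adds diffusion and complex
    3-D boundaries on top of this kind of stochastic kinetics.)"""
    rng = random.Random(seed)
    t, a, b, c = 0.0, a0, b0, 0
    while t < t_end:
        propensity = k * a * b                # stochastic rate of the reaction
        if propensity == 0.0:
            break                             # no reactants left
        t += rng.expovariate(propensity)      # exponential waiting time
        if t >= t_end:
            break
        a, b, c = a - 1, b - 1, c + 1         # fire the reaction once
    return a, b, c

a, b, c = gillespie()
```

    Molecule counts are conserved pairwise (a + c and b + c stay constant), which is a convenient sanity check for any stochastic reaction simulator.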

  14. Long-term memory-based control of attention in multi-step tasks requires working memory: evidence from domain-specific interference

    PubMed Central

    Foerster, Rebecca M.; Carbone, Elena; Schneider, Werner X.

    2014-01-01

    Evidence for long-term memory (LTM)-based control of attention has been found during the execution of highly practiced multi-step tasks. However, does LTM control attention directly, or are working memory (WM) processes involved? In the present study, this question was investigated with a dual-task paradigm. Participants executed either a highly practiced visuospatial sensorimotor task (speed stacking) or a verbal task (high-speed poem reciting), while maintaining visuospatial or verbal information in WM. Results revealed unidirectional and domain-specific interference. Neither speed stacking nor high-speed poem reciting was influenced by WM retention. Stacking disrupted the retention of visuospatial locations, but did not modify memory performance for verbal material (letters). Reciting reduced the retention of verbal material substantially, whereas it affected the memory performance for visuospatial locations to a smaller degree. We suggest that the selection of task-relevant information from LTM for the execution of overlearned multi-step tasks recruits domain-specific WM. PMID:24847304

  15. Novel process chain for hot metal gas forming of ferritic stainless steel 1.4509

    NASA Astrophysics Data System (ADS)

    Mosel, André; Lambarri, Jon; Degenkolb, Lars; Reuther, Franz; Hinojo, José Luis; Rößiger, Jörg; Eurich, Egbert; Albert, André; Landgrebe, Dirk; Wenzel, Holger

    2018-05-01

    Exhaust gas components of automobiles are often produced in ferritic stainless steel 1.4509 due to the low thermal expansion coefficient and the low material price. Until now, components of this stainless steel with complex geometries have been produced in series by means of multi-stage hydroforming at room temperature with intermediate annealing operations. The application of a single-stage hot-forming process, also referred to as hot metal gas forming (HMGF), offers great potential to significantly reduce the production costs of such components. The article describes a novel process chain for HMGF in which the tube is heated in two steps. After pre-heating of the semi-finished product outside the press, the tube is heated up to the forming start temperature by means of tool-integrated conductive heating before forming. For the tube of a demonstrator geometry, a simulation model of the conductive heating was set up. In addition to the tool development for this process, experimental results are also described for the production of the demonstrator geometry.

  16. Multi-tap complex-coefficient incoherent microwave photonic filters based on optical single-sideband modulation and narrow band optical filtering.

    PubMed

    Sagues, Mikel; García Olcina, Raimundo; Loayssa, Alayn; Sales, Salvador; Capmany, José

    2008-01-07

    We propose a novel scheme to implement tunable multi-tap complex coefficient filters based on optical single sideband modulation and narrow band optical filtering. A four tap filter is experimentally demonstrated to highlight the enhanced tuning performance provided by complex coefficients. Optical processing is performed by the use of a cascade of four phase-shifted fiber Bragg gratings specifically fabricated for this purpose.

  17. A Multi-Scale, Multi-Physics Optimization Framework for Additively Manufactured Structural Components

    NASA Astrophysics Data System (ADS)

    El-Wardany, Tahany; Lynch, Mathew; Gu, Wenjiong; Hsu, Arthur; Klecka, Michael; Nardi, Aaron; Viens, Daniel

    This paper proposes an optimization framework enabling the integration of multi-scale / multi-physics simulation codes to perform structural optimization design for additively manufactured components. Cold spray was selected as the additive manufacturing (AM) process and its constraints were identified and included in the optimization scheme. The developed framework first utilizes topology optimization to maximize stiffness for conceptual design. The subsequent step applies shape optimization to refine the design for stress-life fatigue. The component weight was reduced by 20% while stresses were reduced by 75% and the rigidity was improved by 37%. The framework and analysis codes were implemented using Altair software as well as an in-house loading code. The optimized design was subsequently produced by the cold spray process.

  18. Structured recording of intraoperative surgical workflows

    NASA Astrophysics Data System (ADS)

    Neumuth, T.; Durstewitz, N.; Fischer, M.; Strauss, G.; Dietz, A.; Meixensberger, J.; Jannin, P.; Cleary, K.; Lemke, H. U.; Burgert, O.

    2006-03-01

    Surgical Workflows are used for the methodical and scientific analysis of surgical interventions. The approach described here is a step towards developing surgical assist systems based on Surgical Workflows and integrated control systems for the operating room of the future. This paper describes concepts and technologies for the acquisition of Surgical Workflows by monitoring surgical interventions, and for their presentation. Establishing systems which support the Surgical Workflow in operating rooms requires a multi-staged development process beginning with the description of these workflows. A formalized description of surgical interventions is needed to create a Surgical Workflow. This description can be used to analyze and evaluate surgical interventions in detail. We discuss the subdivision of surgical interventions into work steps at different levels of granularity and propose a recording scheme for the acquisition of manual surgical work steps from running interventions. To support the recording process during the intervention, we introduce a new software architecture. The core of the architecture is our Surgical Workflow editor, which is intended to deal with the manifold, complex and concurrent relations during an intervention. Furthermore, we show a method for the automatic generation of graphs that display the recorded surgical work steps of the interventions. Finally, we conclude with considerations about extensions of our recording scheme to close the gap to S-PACS systems. The approach was used to record 83 surgical interventions of 6 intervention types from 3 different surgical disciplines: ENT surgery, neurosurgery and interventional radiology. The interventions were recorded at the University Hospital Leipzig, Germany and at the Georgetown University Hospital, Washington, D.C., USA.

  19. The handling of thin substrates and its potential for new architectures in multi-junction solar cells technology

    NASA Astrophysics Data System (ADS)

    Colin, Clément; Jaouad, Abdelatif; Darnon, Maxime; De Lafontaine, Mathieu; Volatier, Maïté; Boucherif, Abderraouf; Arès, Richard; Fafard, Simon; Aimez, Vincent

    2017-09-01

    In this paper, we investigate the development of a robust handling process for thin (<50 µm) substrates in the framework of monolithic multi-junction solar cell (MJSC) technology. The process, designed for versatility, is based on a temporary front-side bonding of the cell with a polymeric adhesive and then a permanent back-side soldering, allowing classical cell micro-fabrication steps on both sides of the wafer. We have demonstrated that the process does not degrade the performance of monolithic MJSCs with the Ge substrate thickness reduced from 170 µm to 25 µm. We then investigate a perspective unlocked by this work: the study of a 3D-interconnect architecture for multi-junction solar cells.

  20. Platform-Independence and Scheduling In a Multi-Threaded Real-Time Simulation

    NASA Technical Reports Server (NTRS)

    Sugden, Paul P.; Rau, Melissa A.; Kenney, P. Sean

    2001-01-01

    Aviation research often relies on real-time, pilot-in-the-loop flight simulation as a means to develop new flight software, flight hardware, or pilot procedures. Often these simulations become so complex that a single processor is incapable of performing the necessary computations within a fixed time-step. Threads are an elegant means to distribute the computational workload when running on a symmetric multi-processor machine. However, programming with threads often requires operating-system-specific calls that reduce code portability and maintainability. While a multi-threaded simulation allows a significant increase in simulation complexity, it also increases the workload of a simulation operator by requiring that the operator determine which models run on which thread. To address these concerns, an object-oriented design was implemented in the NASA Langley Standard Real-Time Simulation in C++ (LaSRS++) application framework. The design provides a portable and maintainable means to use threads and also provides a mechanism to automatically load balance the simulation models.
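    The fixed-time-step, multi-threaded pattern described above can be sketched with standard library primitives. This is a minimal sketch using a barrier at each frame boundary; LaSRS++ provides its own portable C++ abstraction rather than this Python code, and the `Counter` model is purely illustrative:

```python
import threading

def run_frame_synchronized(models, n_frames=10):
    """Run each model's step() on its own thread, synchronizing all
    threads at every frame boundary with a barrier (a sketch of the
    fixed-time-step multi-threading pattern, not LaSRS++ itself)."""
    barrier = threading.Barrier(len(models))

    def worker(model):
        for frame in range(n_frames):
            model.step(frame)   # this model's computation for the frame
            barrier.wait()      # wait until every model finishes the frame

    threads = [threading.Thread(target=worker, args=(m,)) for m in models]
    for t in threads:
        t.start()
    for t in threads:
        t.join()

class Counter:
    """Illustrative stand-in for a simulation model."""
    def __init__(self):
        self.frames = []
    def step(self, frame):
        self.frames.append(frame)

models = [Counter(), Counter(), Counter()]
run_frame_synchronized(models)
```

    The barrier guarantees that no model begins frame N+1 before every model has finished frame N, which is the invariant a real-time frame scheduler must preserve.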

  1. Single-step affinity purification of enzyme biotherapeutics: a platform methodology for accelerated process development.

    PubMed

    Brower, Kevin P; Ryakala, Venkat K; Bird, Ryan; Godawat, Rahul; Riske, Frank J; Konstantinov, Konstantin; Warikoo, Veena; Gamble, Jean

    2014-01-01

    Downstream sample purification for quality attribute analysis is a significant bottleneck in process development for non-antibody biologics. Multi-step chromatography process train purifications are typically required prior to many critical analytical tests. This prerequisite leads to limited throughput, long lead times to obtain purified product, and significant resource requirements. In this work, immunoaffinity purification technology has been leveraged to achieve single-step affinity purification of two different enzyme biotherapeutics (Fabrazyme® [agalsidase beta] and Enzyme 2) with polyclonal and monoclonal antibodies, respectively, as ligands. Target molecules were rapidly isolated from cell culture harvest in sufficient purity to enable analysis of critical quality attributes (CQAs). Most importantly, this is the first study that demonstrates the application of predictive analytics techniques to predict critical quality attributes of a commercial biologic. The data obtained using the affinity columns were used to generate appropriate models to predict quality attributes that would be obtained after traditional multi-step purification trains. These models empower process development decision-making with drug substance-equivalent product quality information without generation of actual drug substance. Optimization was performed to ensure maximum target recovery and minimal target protein degradation. The methodologies developed for Fabrazyme were successfully reapplied for Enzyme 2, indicating platform opportunities. The impact of the technology is significant, including reductions in time and personnel requirements, rapid product purification, and substantially increased throughput. Applications are discussed, including upstream and downstream process development support to achieve the principles of Quality by Design (QbD) as well as integration with bioprocesses as a process analytical technology (PAT). © 2014 American Institute of Chemical Engineers.

  2. Collaborative simulation method with spatiotemporal synchronization process control

    NASA Astrophysics Data System (ADS)

    Zou, Yisheng; Ding, Guofu; Zhang, Weihua; Zhang, Jian; Qin, Shengfeng; Tan, John Kian

    2016-10-01

    When designing a complex mechatronics system, such as a high-speed train, it is relatively difficult to effectively simulate the entire system's dynamic behaviors because it involves multi-disciplinary subsystems. Currently, the most practical approach to multi-disciplinary simulation is the interface-based coupling simulation method, but it faces a twofold challenge: spatial and temporal desynchronization among the multi-directional coupled simulations of the subsystems. A new collaborative simulation method with spatiotemporal synchronization process control is proposed for the coupled simulation of a given complex mechatronics system across multiple subsystems on different platforms. The method consists of 1) a coupler-based coupling mechanism to define the interfacing and interaction mechanisms among subsystems, and 2) a simulation process control algorithm to realize the coupling simulation in a spatiotemporally synchronized manner. The test results from a case study show that the proposed method 1) can be used to simulate the subsystems' interactions under different simulation conditions in an engineering system, and 2) effectively supports multi-directional coupling simulation among multi-disciplinary subsystems. This method has been successfully applied in China's high-speed train design and development processes, demonstrating that it can be applied to a wide range of engineering system design and simulation tasks with improved efficiency and effectiveness.
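    Lock-step coupling of two subsystems through a coupler can be sketched as follows. The dynamics are illustrative (two symmetric first-order subsystems), not the paper's train models; the point is that both subsystems advance with the same synchronized time step and exchange interface values only at synchronization points:

```python
def cosimulate(steps=2000, dt=0.01):
    """Lock-step co-simulation sketch: two subsystems advance with a
    common time step and exchange interface values through a 'coupler'
    at every synchronization point (illustrative dynamics, not the
    authors' coupling algorithm)."""
    x, y = 1.0, 0.0                    # states of subsystem A and subsystem B
    for _ in range(steps):
        x_sync, y_sync = x, y          # values exchanged at the sync point
        x += dt * (-x_sync + y_sync)   # subsystem A uses B's last synced state
        y += dt * (-y_sync + x_sync)   # subsystem B uses A's last synced state
    return x, y

x, y = cosimulate()
```

    Because both sides consume the values frozen at the last synchronization point (rather than each other's mid-step states), the exchange is symmetric and the coupled invariant x + y is preserved; both states relax to the shared equilibrium.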

  3. Granular computing with multiple granular layers for brain big data processing.

    PubMed

    Wang, Guoyin; Xu, Ji

    2014-12-01

    Big data is the term for a collection of datasets so huge and complex that it becomes difficult to process them using on-hand theoretical models and technique tools. Brain big data is one of the most typical and important kinds of big data, collected using powerful equipment such as functional magnetic resonance imaging, multichannel electroencephalography, magnetoencephalography, positron emission tomography, near-infrared spectroscopic imaging, as well as various other devices. Granular computing with multiple granular layers, referred to as multi-granular computing (MGrC) for short hereafter, is an emerging computing paradigm of information processing, which simulates the multi-granular intelligent thinking model of the human brain. It concerns the processing of complex information entities called information granules, which arise in the process of data abstraction and the derivation of information and even knowledge from data. This paper analyzes three basic mechanisms of MGrC, namely granularity optimization, granularity conversion, and multi-granularity joint computation, and discusses the potential of introducing MGrC into the intelligent processing of brain big data.

  4. Measurement needs guided by synthetic radar scans in high-resolution model output

    NASA Astrophysics Data System (ADS)

    Varble, A.; Nesbitt, S. W.; Borque, P.

    2017-12-01

    Microphysical and dynamical process interactions within deep convective clouds are not well understood, partly because measurement strategies often focus on statistics of cloud state rather than cloud processes. While processes cannot be directly measured, they can be inferred with sufficiently frequent and detailed scanning radar measurements focused on the life cycle of individual cloud regions. This is a primary goal of the 2018-19 DOE ARM Cloud, Aerosol, and Complex Terrain Interactions (CACTI) and NSF Remote sensing of Electrification, Lightning, And Mesoscale/microscale Processes with Adaptive Ground Observations (RELAMPAGO) field campaigns in central Argentina, where orographic deep convective initiation is frequent with some high-impact systems growing into the tallest and largest in the world. An array of fixed and mobile scanning multi-wavelength dual-polarization radars will be coupled with surface observations, sounding systems, multi-wavelength vertical profilers, and aircraft in situ measurements to characterize convective cloud life cycles and their relationship with environmental conditions. While detailed cloud processes are an observational target, the radar scan patterns that are most ideal for observing them are unclear. They depend on the locations and scales of key microphysical and dynamical processes operating within the cloud. High-resolution simulations of clouds, while imperfect, can provide information on these locations and scales that guides radar measurement needs. Radar locations are set in the model domain based on planned experiment locations, and simulated orographic deep convective initiation and upscale growth are sampled using a number of different scans involving RHIs or PPIs with predefined elevation and azimuthal angles that approximately conform with radar range and beam width specifications. Each full scan pattern is applied to output at single model time steps, with time step intervals that depend on the length of time required to complete each scan in the real world. The ability of different scans to detect key processes within the convective cloud life cycle is examined in connection with previous and subsequent dynamical and microphysical transitions. This work will guide strategic scan patterns that will be used during CACTI and RELAMPAGO.
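    The link between scan duration and model output interval can be made concrete with a small calculation. This is a minimal sketch assuming one 360° sweep per elevation angle, a constant antenna rotation rate, and a fixed repositioning overhead between sweeps; all numeric values are illustrative, not campaign settings:

```python
def ppi_volume_scan_time(n_elevations, rotation_rate_deg_s, reposition_s=1.0):
    """Approximate time to complete a PPI volume scan: one full
    360-degree sweep per elevation angle, plus a fixed antenna
    repositioning overhead between consecutive sweeps."""
    sweep_time = 360.0 / rotation_rate_deg_s
    return n_elevations * sweep_time + (n_elevations - 1) * reposition_s

# e.g. 14 elevations at 18 deg/s with 1 s repositioning between sweeps
t = ppi_volume_scan_time(14, 18.0)
```

    The resulting duration (here 293 s) sets the spacing of the model time steps at which the synthetic scan is applied, so that the simulated sampling matches what the radar could achieve in the real world.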

  5. Synthesis of PbS/TiO2 nanocomposite materials using the sol-gel process via the incorporation of lead thiolates

    NASA Astrophysics Data System (ADS)

    Patel, Khushikumari

    PbS/TiO2 nanocomposites were prepared by two methods using the sol-gel process: a one-step process and a multi-step process. The incorporation of 3-mercaptopropionic acid, followed by the addition of Pb2+, generated covalently incorporated lead thiolate precursors, which can then be converted to PbS/TiO2 nanocomposites by controlled thermal decomposition. Various ratios of bifunctional linker to matrix were used to monitor the incorporation of functional groups into the ceramic matrix, and the sol-gel process was used to produce high-yield ceramic materials. This allows solutions to chemically bind and form solid-state ceramics, while allowing complex compounds to combine with a high degree of homogeneity. 3-Mercaptopropionic acid was added to the titania gel as the source of the sulfur component that binds to the titania. The PbS/TiO2 nanocomposites were studied using FTIR spectroscopy, and the covalent bonding between PbS and the titania ceramics was confirmed by the signal intensities in the infrared spectra. The success of the covalent bond between the thiolate and the ceramics opened the possibility of forming nanocomposites. Powder X-ray diffraction was used to analyze the structure of the nanocomposites; the results showed lead sulfide nanocrystals in the ceramic matrix and allowed the particle size to be determined.

  6. Recording Approach of Heritage Sites Based on Merging Point Clouds from High Resolution Photogrammetry and Terrestrial Laser Scanning

    NASA Astrophysics Data System (ADS)

    Grussenmeyer, P.; Alby, E.; Landes, T.; Koehl, M.; Guillemin, S.; Hullo, J. F.; Assali, P.; Smigiel, E.

    2012-07-01

    Different approaches and tools are required in Cultural Heritage Documentation to deal with the complexity of monuments and sites. The documentation process has strongly changed in the last few years, always driven by technology. Accurate documentation is closely tied to advances in technology (imaging sensors, high-speed scanning, automation in recording and processing data) for the purposes of conservation works, management, appraisal, assessment of the structural condition, archiving, publication and research (Patias et al., 2008). We want to focus in this paper on the recording aspects of cultural heritage documentation, especially the generation of geometric and photorealistic 3D models for accurate reconstruction and visualization purposes. The selected approaches are based on the combination of photogrammetric dense matching and Terrestrial Laser Scanning (TLS) techniques. Both techniques have pros and cons, and recent advances have changed the recording approach. The choice of the best workflow depends on the site configuration, the performance of the sensors, and criteria such as geometry, accuracy, resolution, georeferencing, texture, and of course processing time. TLS techniques (time-of-flight or phase-shift systems) are widely used for recording large and complex objects and sites. Point cloud generation from images by dense stereo or multi-view matching can be used as an alternative or as a complementary method to TLS. Compared to TLS, the photogrammetric solution is a low-cost one, as the acquisition system is limited to a high-performance digital camera and a few accessories only. Indeed, the stereo or multi-view matching process offers a cheap, flexible and accurate solution to get 3D point clouds. Moreover, the captured images might also be used for texturing the models. Several software packages are available, whether web-based, open source or commercial. The main advantage of this photogrammetric or computer-vision-based technology is to get at the same time a point cloud (whose resolution depends on the size of the pixel on the object), and therefore an accurately meshed object with its texture. After the matching and processing steps, we can use the resulting data in much the same way as a TLS point cloud, but with additional radiometric information for textures. The discussion in this paper reviews recording and important processing steps such as geo-referencing and data merging, the essential assessment of the results, and examples of deliverables from projects of the Photogrammetry and Geomatics Group (INSA Strasbourg, France).
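    The geo-referencing step that precedes data merging amounts to applying a rigid-body transform to every point of a cloud. This is a minimal sketch assuming a heading-only (yaw) rotation plus translation; real workflows estimate a full 3-D transform from control points or cloud-to-cloud fitting, and all values here are illustrative:

```python
import math

def georeference(points, yaw_deg, tx, ty, tz):
    """Apply a rigid-body transform (heading rotation about z, then
    translation) to register a locally-referenced point cloud into a
    target coordinate frame. Sketch of the geo-referencing step only."""
    c = math.cos(math.radians(yaw_deg))
    s = math.sin(math.radians(yaw_deg))
    return [(c * x - s * y + tx,
             s * x + c * y + ty,
             z + tz) for x, y, z in points]

cloud = [(1.0, 0.0, 0.0), (0.0, 1.0, 0.0)]
merged_frame = georeference(cloud, yaw_deg=90.0, tx=10.0, ty=0.0, tz=5.0)
```

    Once both the photogrammetric and the TLS clouds are expressed in the same frame this way, merging reduces to concatenation followed by quality assessment of the overlap.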

  7. Extrinsic Repair of Injured Dendrites as a Paradigm for Regeneration by Fusion in Caenorhabditis elegans

    PubMed Central

    Oren-Suissa, Meital; Gattegno, Tamar; Kravtsov, Veronika; Podbilewicz, Benjamin

    2017-01-01

    Injury triggers regeneration of axons and dendrites. Research has identified factors required for axonal regeneration outside the CNS, but little is known about regeneration triggered by dendrotomy. Here, we study neuronal plasticity triggered by dendrotomy and determine the fate of complex PVD arbors following laser surgery of dendrites. We find that severed primary dendrites grow toward each other and reconnect via branch fusion. Simultaneously, terminal branches lose self-avoidance and grow toward each other, meeting and fusing at the tips via an AFF-1-mediated process. Ectopic branch growth is identified as a step in the regeneration process required for bypassing the lesion site. Failure of reconnection to the severed dendrites results in degeneration of the distal end of the neuron. We discover pruning of excess branches via EFF-1 that acts to recover the original wild-type arborization pattern in a late stage of the process. In contrast, AFF-1 activity during dendritic auto-fusion is derived from the lateral seam cells and not autonomously from the PVD neuron. We propose a model in which AFF-1-vesicles derived from the epidermal seam cells fuse neuronal dendrites. Thus, EFF-1 and AFF-1 fusion proteins emerge as new players in neuronal arborization and maintenance of arbor connectivity following injury in Caenorhabditis elegans. Our results demonstrate that there is a genetically determined multi-step pathway to repair broken dendrites in which EFF-1 and AFF-1 act on different steps of the pathway. EFF-1 is essential for dendritic pruning after injury and extrinsic AFF-1 mediates dendrite fusion to bypass injuries. PMID:28283540

  8. English for Everyday Activities: A Picture Process Dictionary.

    ERIC Educational Resources Information Center

    Zwier, Lawrence J.

    These books are designed to help English-as-a-Second-Language (ESL) students learn the skills they need to communicate the step-by-step aspects of daily activities. Unlike most picture dictionaries, this is a verb-based multi-skills program that uses a student text with a clear and colorful pictorial detail as a starting point and focuses on the…

  9. [Complex automatic data processing in multi-profile hospitals].

    PubMed

    Dovzhenko, Iu M; Panov, G D

    1990-01-01

    The computerization of data processing in multi-disciplinary hospitals is a key factor in raising the quality of medical care provided to the population, intensifying the work of the personnel, improving the curative and diagnostic process, and improving the use of resources. Even limited experience with complex computerization at the Botkin Hospital indicates that the automated system improves the quality of data processing, supports a high level of patient examination, speeds the training of young specialists, and creates conditions for the continuing education of physicians through the analysis of their own activity. At large hospitals, the complex solution of administrative and curative-diagnostic tasks on the basis of a hospital-wide display network and a hospital-wide data bank is the most promising form of computerization.

  10. Cleave and couple: toward fully sustainable catalytic conversion of lignocellulose to value added building blocks and fuels.

    PubMed

    Sun, Zhuohua; Barta, Katalin

    2018-06-21

    The structural complexity of lignocellulose offers unique opportunities for the development of entirely new, energy efficient and waste-free pathways in order to obtain valuable bio-based building blocks. Such sustainable catalytic methods - specifically tailored to address the efficient conversion of abundant renewable starting materials - are necessary to successfully compete, in the future, with fossil-based multi-step processes. In this contribution we give a summary of recent developments in this field and describe our "cleave and couple" strategy, where "cleave" refers to the catalytic deconstruction of lignocellulose to aromatic and aliphatic alcohol intermediates, and "couple" involves the development of novel, sustainable transformations for the formation of C-C and C-N bonds in order to obtain a range of attractive products from lignocellulose.

  11. Best geoscience approach to complex systems in environment

    NASA Astrophysics Data System (ADS)

    Mezemate, Yacine; Tchiguirinskaia, Ioulia; Schertzer, Daniel

    2017-04-01

    The environment is a social issue that continues to grow in importance. Its complexity, both cross-disciplinary and multi-scale, has given rise to a large number of scientific and technological locks that complex systems approaches can solve. Significant challenges must be met to achieve an understanding of environmental complex systems. Their study should proceed in several steps in which the use of data and models is crucial: - Exploration, observation and basic data acquisition - Identification of correlations, patterns, and mechanisms - Modelling - Model validation, implementation and prediction - Construction of a theory. Since e-learning has become a powerful tool for knowledge and best-practice sharing, we use it to teach environmental complexities and systems. In this presentation we promote an e-learning course intended for a broad audience (undergraduates, graduates, PhD students and young scientists) which gathers and puts into coherence different pedagogical materials from complex systems and environmental studies. This course describes complex processes using numerous illustrations, examples and tests that make learning "easy to enjoy". For the sake of simplicity, the course is divided into different modules, and at the end of each module a set of exercises and program codes is proposed for best practice. The graphical user interface (GUI), constructed using the open-source tool Opale Scenari, offers simple navigation through the different modules. The course treats the complex systems that can be found in the environment and their observables; we particularly highlight the extreme variability of these observables over a wide range of scales. Using the multifractal formalism through different applications (turbulence, precipitation, hydrology), we demonstrate how such extreme variability of geophysical/biological fields can be used to solve everyday (geo-)environmental challenges.

  12. Cdt1p, through its interaction with Mcm6p, is required for the formation, nuclear accumulation and chromatin loading of the MCM complex.

    PubMed

    Wu, Rentian; Wang, Jiafeng; Liang, Chun

    2012-01-01

    Regulation of DNA replication initiation is essential for the faithful inheritance of genetic information. Replication initiation is a multi-step process involving many factors including ORC, Cdt1p, Mcm2-7p and other proteins that bind to replication origins to form a pre-replicative complex (pre-RC). As a prerequisite for pre-RC assembly, Cdt1p and the Mcm2-7p heterohexameric complex accumulate in the nucleus in G1 phase in an interdependent manner in budding yeast. However, the nature of this interdependence is not clear, nor is it known whether Cdt1p is required for the assembly of the MCM complex. In this study, we provide the first evidence that Cdt1p, through its interaction with Mcm6p via the C-terminal regions of the two proteins, is crucial for the formation of the MCM complex in both the cytoplasm and nucleoplasm. We demonstrate that disruption of the interaction between Cdt1p and Mcm6p prevents the formation of the MCM complex, excludes Mcm2-7p from the nucleus, and inhibits pre-RC assembly and DNA replication. Our findings suggest a function for Cdt1p in promoting the assembly of the MCM complex and maintaining its integrity by interacting with Mcm6p.

  13. Phenol removal pretreatment process

    DOEpatents

    Hames, Bonnie R.

    2004-04-13

    A process for removing phenols from an aqueous solution is provided, which comprises the steps of contacting a mixture comprising the solution and a metal oxide, forming a phenol metal oxide complex, and removing the complex from the mixture.

  14. An automated flow system incorporating in-line acid dissolution of bismuth metal from a cyclotron irradiated target assembly for use in the isolation of astatine-211

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    O’Hara, Matthew J.; Krzysko, Anthony J.; Niver, Cynthia M.

    Astatine-211 (211At) is a promising cyclotron-produced radionuclide being investigated for use in targeted alpha therapy of blood-borne and metastatic cancers, as well as treatment of tumor remnants after surgical resections. The isolation of trace quantities of 211At, produced within several grams of a Bi metal cyclotron target, involves a complex, multi-step procedure: (1) Bi metal dissolution in strong HNO3, (2) distillation of the HNO3 to yield Bi salts containing 211At, (3) dissolution of the salts in strong HCl, (4) solvent extraction of 211At from bismuth salts with diisopropyl ether (DIPE), and (5) back-extraction of 211At from DIPE into NaOH, leading to a purified 211At product. Step (1) has been addressed first to begin the process of automating the onerous 211At isolation process. A computer-controlled Bi target dissolution system has been designed. The system performs in-line dissolution of Bi metal from the target assembly using an enclosed target dissolution block, routing the resulting solubilized 211At/Bi mixture to the subsequent process step. The primary parameters involved in Bi metal solubilization (HNO3 concentration and influent flow rate) were optimized prior to evaluation of the system performance on replicate cyclotron irradiated targets. The results indicate that the system performs reproducibly, having nearly quantitative release of 211At from irradiated targets, with cumulative 211At recoveries that follow a sigmoidal function. The predictable nature of the 211At release profile allows the user to tune the system to meet target processing requirements.

  15. Examining Food Risk in the Large using a Complex, Networked System-of-Systems Approach

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ambrosiano, John; Newkirk, Ryan; Mc Donald, Mark P

    2010-12-03

    The food production infrastructure is a highly complex system of systems. Characterizing the risks of intentional contamination in multi-ingredient manufactured foods is extremely challenging because the risks depend on the vulnerabilities of food processing facilities and on the intricacies of the supply-distribution networks that link them. A pure engineering approach to modeling the system is impractical because of the overall system complexity and paucity of data. A methodology is needed to assess food contamination risk 'in the large', based on current, high-level information about manufacturing facilities, commodities and markets, that will indicate which food categories are most at risk of intentional contamination and warrant deeper analysis. The approach begins by decomposing the system for producing a multi-ingredient food into instances of two subsystem archetypes: (1) the relevant manufacturing and processing facilities, and (2) the networked commodity flows that link them to each other and consumers. Ingredient manufacturing subsystems are modeled as generic systems dynamics models with distributions of key parameters that span the configurations of real facilities. Networks representing the distribution systems are synthesized from general information about food commodities. This is done in a series of steps. First, probability networks representing the aggregated flows of food from manufacturers to wholesalers, retailers, other manufacturers, and direct consumers are inferred from high-level approximate information. This is followed by disaggregation of the general flows into flows connecting 'large' and 'small' categories of manufacturers, wholesalers, retailers, and consumers. Optimization methods are then used to determine the most likely network flows consistent with given data. Vulnerability can be assessed for a potential contamination point using a modified CARVER + Shock model. Once the facility and commodity flow models are instantiated, a risk consequence analysis can be performed by injecting contaminant at chosen points in the system and propagating the event through the overarching system to arrive at morbidity and mortality figures. A generic chocolate snack cake model, consisting of fluid milk, liquid eggs, and cocoa, is described as an intended proof of concept for multi-ingredient food systems. We aim for an eventual tool that can be used directly by policy makers and planners.

  16. A SYSTEMATIC PROCEDURE FOR DESIGNING PROCESSES WITH MULTIPLE ENVIRONMENTAL OBJECTIVES

    EPA Science Inventory

    Evaluation and analysis of multiple objectives are very important in designing environmentally benign processes. They require a systematic procedure for solving multi-objective decision-making problems due to the complex nature of the problems and the need for complex assessment....

  17. Enhancement of Tropical Land Cover Mapping with Wavelet-Based Fusion and Unsupervised Clustering of SAR and Landsat Image Data

    NASA Technical Reports Server (NTRS)

    LeMoigne, Jacqueline; Laporte, Nadine; Netanyahuy, Nathan S.; Zukor, Dorothy (Technical Monitor)

    2001-01-01

    The characterization and the mapping of land cover/land use of forest areas, such as the Central African rainforest, is a very complex task. This complexity is mainly due to the extent of such areas and, as a consequence, to the lack of full and continuous cloud-free coverage of those large regions by any single remote sensing instrument. In order to provide improved vegetation maps of Central Africa and to develop forest monitoring techniques for applications at the local and regional scales, we propose to utilize multi-sensor remote sensing observations coupled with in-situ data. Fusion and clustering of multi-sensor data are the first steps towards the development of such a forest monitoring system. In this paper, we will describe some preliminary experiments involving the fusion of SAR and Landsat image data of the Lope Reserve in Gabon. As in previous fusion studies, our fusion method is wavelet-based. The fusion provides a new image data set which contains more detailed texture features and preserves the large homogeneous regions that are observed by the Thematic Mapper sensor. The fusion step is followed by unsupervised clustering and provides a vegetation map of the area.
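    The fusion scheme described above can be illustrated with a minimal sketch on hypothetical 4x4 patches (the study's actual wavelet basis and imagery are not reproduced here): a single-level 2D Haar transform in which the approximation band is kept from the optical (Landsat-like) patch while, for each detail coefficient, the larger-magnitude value from either sensor is retained.

```python
# Minimal wavelet-fusion sketch on hypothetical data (assumption: a
# single-level Haar basis; the paper does not specify its wavelet).

def haar2d(img):
    """Single-level 2D Haar transform of a 2n x 2n list-of-lists image."""
    h = len(img) // 2
    LL = [[0.0] * h for _ in range(h)]; LH = [[0.0] * h for _ in range(h)]
    HL = [[0.0] * h for _ in range(h)]; HH = [[0.0] * h for _ in range(h)]
    for i in range(h):
        for j in range(h):
            a, b = img[2*i][2*j], img[2*i][2*j+1]
            c, d = img[2*i+1][2*j], img[2*i+1][2*j+1]
            LL[i][j] = (a + b + c + d) / 4      # approximation
            LH[i][j] = (a - b + c - d) / 4      # horizontal detail
            HL[i][j] = (a + b - c - d) / 4      # vertical detail
            HH[i][j] = (a - b - c + d) / 4      # diagonal detail
    return LL, LH, HL, HH

def ihaar2d(LL, LH, HL, HH):
    """Exact inverse of haar2d."""
    h = len(LL)
    img = [[0.0] * (2 * h) for _ in range(2 * h)]
    for i in range(h):
        for j in range(h):
            ll, lh, hl, hh = LL[i][j], LH[i][j], HL[i][j], HH[i][j]
            img[2*i][2*j]     = ll + lh + hl + hh
            img[2*i][2*j+1]   = ll - lh + hl - hh
            img[2*i+1][2*j]   = ll + lh - hl - hh
            img[2*i+1][2*j+1] = ll - lh - hl + hh
    return img

def fuse(optical, sar):
    """Optical approximation + strongest detail coefficients of either image."""
    oLL, oLH, oHL, oHH = haar2d(optical)
    _sLL, sLH, sHL, sHH = haar2d(sar)
    pick = lambda A, B: [[a if abs(a) >= abs(b) else b
                          for a, b in zip(ra, rb)] for ra, rb in zip(A, B)]
    return ihaar2d(oLL, pick(oLH, sLH), pick(oHL, sHL), pick(oHH, sHH))

optical = [[1.0] * 4 for _ in range(4)]            # flat optical patch
sar = [[0.0, 4.0, 0.0, 4.0], [4.0, 0.0, 4.0, 0.0],
       [0.0, 4.0, 0.0, 4.0], [4.0, 0.0, 4.0, 0.0]] # high-frequency texture
fused = fuse(optical, sar)
```

    The fused patch keeps the optical mean brightness while inheriting the SAR texture, which is the property the abstract attributes to the fusion.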

  18. Combination therapies - the next logical step for the treatment of synucleinopathies?

    PubMed Central

    Valera, E.; Masliah, E.

    2015-01-01

    Currently there are no disease-modifying alternatives for the treatment of most neurodegenerative disorders. The available therapies for diseases such as Parkinson’s disease (PD), PD dementia (PDD), Dementia with Lewy bodies (DLB) and Multiple system atrophy (MSA), in which the protein alpha-synuclein (α-syn) accumulates within neurons and glial cells with toxic consequences, are focused on managing the disease symptoms. However, utilizing strategic drug combinations and/or multi-target drugs might increase the treatment efficiency when compared to monotherapies. Synucleinopathies are complex disorders that progress through several stages, and toxic α-syn aggregates exhibit prion-like behavior spreading from cell to cell. Therefore, it follows that these neurodegenerative disorders might require equally complex therapeutic approaches in order to obtain significant and long-lasting results. Hypothetically, therapies aimed at reducing α-syn accumulation and cell-to-cell transfer, such as immunotherapy against α-syn, could be combined with agents that reduce neuroinflammation with potential synergistic outcomes. Here we review the current evidence supporting this type of approach, suggesting that such rational therapy combinations, together with the use of multi-target drugs, may hold promise as the next logical step for the treatment of synucleinopathies. PMID:26388203

  19. Interactive Design Strategy for a Multi-Functional PAMAM Dendrimer-Based Nano-Therapeutic Using Computational Models and Experimental Analysis

    PubMed Central

    Lee, Inhan; Williams, Christopher R.; Athey, Brian D.; Baker, James R.

    2010-01-01

    Molecular dynamics simulations of nano-therapeutics as a final product and of all intermediates in the process of generating a multi-functional nano-therapeutic based on a poly(amidoamine) (PAMAM) dendrimer were performed along with chemical analyses of each of them. The actual structures of the dendrimers were predicted, based on potentiometric titration, gel permeation chromatography, and NMR. The chemical analyses determined the numbers of functional molecules, based on the actual structure of the dendrimer. Molecular dynamics simulations calculated the configurations of the intermediates and the radial distributions of functional molecules, based on their numbers. This interactive process between the simulation results and the chemical analyses provided a further strategy to design the next reaction steps and to gain insight into the products at each chemical reaction step. PMID:20700476

  20. Bayesian parameter estimation for the Wnt pathway: an infinite mixture models approach.

    PubMed

    Koutroumpas, Konstantinos; Ballarini, Paolo; Votsi, Irene; Cournède, Paul-Henry

    2016-09-01

    Likelihood-free methods, like Approximate Bayesian Computation (ABC), have been extensively used in model-based statistical inference with intractable likelihood functions. When combined with Sequential Monte Carlo (SMC) algorithms they constitute a powerful approach for parameter estimation and model selection of mathematical models of complex biological systems. A crucial step in the ABC-SMC algorithms, significantly affecting their performance, is the propagation of a set of parameter vectors through a sequence of intermediate distributions using Markov kernels. In this article, we employ Dirichlet process mixtures (DPMs) to design optimal transition kernels and we present an ABC-SMC algorithm with DPM kernels. We illustrate the use of the proposed methodology using real data for the canonical Wnt signaling pathway. A multi-compartment model of the pathway is developed and it is compared to an existing model. The results indicate that DPMs are more efficient in the exploration of the parameter space and can significantly improve ABC-SMC performance. In comparison to alternative sampling schemes that are commonly used, the proposed approach can bring potential benefits in the estimation of complex multimodal distributions. The method is used to estimate the parameters and the initial state of two models of the Wnt pathway and it is shown that the multi-compartment model better fits the experimental data. Python scripts for the Dirichlet Process Gaussian Mixture model and the Gibbs sampler are available at https://sites.google.com/site/kkoutroumpas/software konstantinos.koutroumpas@ecp.fr. © The Author 2016. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
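    The ABC-SMC loop described above can be sketched on a toy problem (inferring the mean of a Gaussian). This is illustrative only: it is not the paper's Wnt model, it uses a plain Gaussian perturbation kernel (the common baseline) rather than the proposed DPM kernels, and importance reweighting of particles is omitted for brevity.

```python
# Toy ABC-SMC sketch: all data and tolerances below are hypothetical.
import random
import statistics

random.seed(1)
data = [random.gauss(3.0, 1.0) for _ in range(200)]
obs = statistics.mean(data)                      # observed summary statistic

def simulate(theta):
    """Simulate the summary statistic under parameter theta."""
    return statistics.mean(random.gauss(theta, 1.0) for _ in range(200))

def abc_smc(tolerances, n_particles=100):
    # Stage 0: sample particles from a flat prior on [-10, 10].
    particles = [random.uniform(-10, 10) for _ in range(n_particles)]
    for eps in tolerances:                       # shrinking tolerance schedule
        accepted = []
        while len(accepted) < n_particles:
            theta = random.choice(particles)     # resample a particle
            theta += random.gauss(0.0, 0.5)      # perturb (transition kernel)
            if abs(simulate(theta) - obs) < eps: # accept if close to the data
                accepted.append(theta)
        particles = accepted
    return particles

posterior = abc_smc(tolerances=[2.0, 0.5, 0.2])
estimate = statistics.mean(posterior)
```

    The paper's contribution sits exactly at the perturbation line: a DPM kernel adapts its proposal to the current particle population instead of using a fixed Gaussian width.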

  1. Rubisco Accumulation Factor 1 from Thermosynechococcus elongatus participates in the final stages of ribulose-1,5-bisphosphate carboxylase/oxygenase assembly in Escherichia coli cells and in vitro.

    PubMed

    Kolesinski, Piotr; Belusiak, Iwona; Czarnocki-Cieciura, Mariusz; Szczepaniak, Andrzej

    2014-09-01

    Ribulose-1,5-bisphosphate carboxylase/oxygenase (Rubisco) biosynthesis is a multi-step process in which specific chaperones are involved. Recently, a novel polypeptide, Rubisco Accumulation Factor 1 (RAF1), has been identified as a protein that is necessary for proper assembly of this enzyme in maize cells (Zea mays). However, neither its specific function nor its mode of action has as yet been determined. The results presented here show that the prokaryotic homolog of RAF1 from Thermosynechococcus elongatus is expressed in cyanobacterial cells and interacts with the large Rubisco subunit (RbcL). Using a heterologous expression system, it was demonstrated that this protein promotes Rubisco assembly in Escherichia coli cells. Moreover, when co-expressed with RbcL alone, a stable RbcL-RAF1 complex is formed. Molecular mass determination for this Rubisco assembly intermediate by size-exclusion chromatography coupled with multi-angle light scattering indicates that it consists of an RbcL dimer and two RAF1 molecules. A purified RbcL-RAF1 complex dissociated upon addition of the small Rubisco subunit (RbcS), leading to formation of the active holoenzyme. Moreover, titration of the octameric (RbcL8) core of Rubisco with RAF1 results in disassembly of such a structure and creation of an RbcL-RAF1 intermediate. The results presented here are a first attempt to elucidate the role of cyanobacterial Rubisco Accumulation Factor 1 in the Rubisco biosynthesis process. © 2014 FEBS.

  2. Multi-Temporal Interferometry to Investigate Landslide Dynamics in a Tropical Urban Environment: Focus on Bukavu (DR Congo)

    NASA Astrophysics Data System (ADS)

    Monsieurs, E.; Dille, A.; Nobile, A.; d'Oreye, N.; Kervyn, F.; Dewitte, O.

    2017-12-01

    Landslides can lead to high impacts in less developed countries, particularly in some urban tropical environments where a combination of intense rainfall, active tectonics, steep topography and high population density can be found. However, the processes controlling landslide initiation and evolution through time remain poorly understood. Here we show the relevance of multi-temporal differential SAR interferometry (DInSAR) for characterizing ground deformations associated with landslides in the rapidly expanding city of Bukavu (DR Congo). A series of 70 COSMO-SkyMed SAR images acquired between March 2015 and April 2016 with a mean revisiting time of 8 days were used to produce displacement rate maps and ground deformation time series using the Small Baseline Subset approach. Results show that various landslide processes of different ages, mechanisms and states of activity can be identified across Bukavu city. InSAR ground deformation maps reveal, for instance, the complexity of a large (1.5 km²) active slide affecting a densely inhabited slum neighbourhood and characterized by the presence of sectors moving at different rates (ranging from 10 mm/yr up to 75 mm/yr in the LOS direction). The evaluation of the ground deformations captured by DInSAR through a two-step validation procedure combining differential GPS measurements and field observations attested to the reliability of the measurements as well as the capability of the technique to grasp the deformation pattern affecting this complex tropical urban environment. However, longer time series will be needed to infer landslide response to climatic, seismic and anthropogenic activities.
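    The Small Baseline Subset inversion behind such time series can be sketched as a least-squares problem (synthetic numbers, not the Bukavu data): each interferogram between acquisition dates i and j measures the difference in line-of-sight displacement between those dates, and solving the over-determined network recovers the displacement at each date.

```python
# SBAS-style network inversion sketch; dates, pairs and phases are invented.

def solve(A, b):
    """Gauss-Jordan elimination with partial pivoting (small systems only)."""
    n = len(A)
    M = [row[:] + [bi] for row, bi in zip(A, b)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(n):
            if r != col and M[r][col] != 0.0:
                f = M[r][col] / M[col][col]
                M[r] = [x - f * y for x, y in zip(M[r], M[col])]
    return [M[i][n] / M[i][i] for i in range(n)]

def sbas_invert(pairs, dphi, n_dates):
    """Displacement at dates 1..n_dates-1 (date 0 is the zero reference)."""
    m = n_dates - 1
    rows = []
    for i, j in pairs:                  # interferogram: disp[j] - disp[i]
        row = [0.0] * m
        if j > 0:
            row[j - 1] += 1.0
        if i > 0:
            row[i - 1] -= 1.0
        rows.append(row)
    # Normal equations (A^T A) x = A^T b of the least-squares problem.
    AtA = [[sum(r[p] * r[q] for r in rows) for q in range(m)] for p in range(m)]
    Atb = [sum(r[p] * d for r, d in zip(rows, dphi)) for p in range(m)]
    return solve(AtA, Atb)

# True LOS displacements (mm) at 4 dates and a 5-interferogram network.
true = [0.0, 5.0, 12.0, 20.0]
pairs = [(0, 1), (1, 2), (2, 3), (0, 2), (1, 3)]
dphi = [true[j] - true[i] for i, j in pairs]
disp = sbas_invert(pairs, dphi, 4)
```

    With real data the phases are noisy and unwrapped, and the redundancy of the small-baseline network is what makes the least-squares estimate robust.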

  3. Point Cloud Classification of Tesserae from Terrestrial Laser Data Combined with Dense Image Matching for Archaeological Information Extraction

    NASA Astrophysics Data System (ADS)

    Poux, F.; Neuville, R.; Billen, R.

    2017-08-01

    Reasoning from information extraction given by point cloud data mining allows contextual adaptation and fast decision making. However, to achieve this perceptive level, a point cloud must be semantically rich, retaining relevant information for the end user. This paper presents an automatic knowledge-based method for pre-processing multi-sensory data and classifying a hybrid point cloud from both terrestrial laser scanning and dense image matching. Using 18 features, including sensor bias data, each tessera in the high-density point cloud from the 3D-captured complex mosaics of Germigny-des-prés (France) is segmented via a colour multi-scale abstraction-based feature extraction exploiting connectivity. A 2D surface and outline polygon of each tessera is generated by a RANSAC plane extraction and convex hull fitting. Knowledge is then used to classify every tessera based on its size, surface, shape, material properties and its neighbours' classes. The detection and semantic enrichment method shows promising results of 94% correct semantization, a first step toward the creation of an archaeological smart point cloud.
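    The RANSAC plane-extraction step mentioned above can be sketched on synthetic points (not the actual Germigny-des-prés cloud): repeatedly fit a plane through three random points and keep the candidate with the most inliers within a distance threshold.

```python
# Toy RANSAC plane fit; the point cloud and thresholds are hypothetical.
import random

random.seed(42)

def plane_from_points(p, q, r):
    """Unit normal n and offset d of the plane through p, q, r (n . x = d)."""
    u = [q[i] - p[i] for i in range(3)]
    v = [r[i] - p[i] for i in range(3)]
    n = [u[1]*v[2] - u[2]*v[1], u[2]*v[0] - u[0]*v[2], u[0]*v[1] - u[1]*v[0]]
    norm = sum(c * c for c in n) ** 0.5
    if norm == 0:                       # degenerate (collinear) sample
        return None
    n = [c / norm for c in n]
    return n, sum(ni * pi for ni, pi in zip(n, p))

def ransac_plane(points, iters=200, tol=0.05):
    best, best_inliers = None, []
    for _ in range(iters):
        model = plane_from_points(*random.sample(points, 3))
        if model is None:
            continue
        n, d = model
        inliers = [p for p in points
                   if abs(sum(ni * pi for ni, pi in zip(n, p)) - d) < tol]
        if len(inliers) > len(best_inliers):
            best, best_inliers = model, inliers
    return best, best_inliers

# 60 points scattered tightly around the plane z = 0, plus 15 outliers.
pts = [(random.uniform(0, 1), random.uniform(0, 1), random.gauss(0, 0.01))
       for _ in range(60)]
pts += [(random.uniform(0, 1), random.uniform(0, 1), random.uniform(0.5, 2))
        for _ in range(15)]
(normal, offset), inliers = ransac_plane(pts)
```

    In the paper's pipeline the inlier set would then feed the convex-hull fit that yields each tessera's 2D outline polygon.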

  4. A novel hybrid MCDM model for performance evaluation of research and technology organizations based on BSC approach.

    PubMed

    Varmazyar, Mohsen; Dehghanbaghi, Maryam; Afkhami, Mehdi

    2016-10-01

    Balanced Scorecard (BSC) is a strategic evaluation tool using both financial and non-financial indicators to determine the business performance of organizations or companies. In this paper, a new integrated approach based on the BSC and multi-criteria decision making (MCDM) methods is proposed to evaluate the performance of research centers of a research and technology organization (RTO) in Iran. The Decision-Making Trial and Evaluation Laboratory (DEMATEL) method is employed to reflect the interdependencies among BSC perspectives. Then, the Analytic Network Process (ANP) is utilized to weight the indices influencing the considered problem. In the next step, we apply four MCDM methods, including Additive Ratio Assessment (ARAS), Complex Proportional Assessment (COPRAS), Multi-Objective Optimization by Ratio Analysis (MOORA), and Technique for Order Preference by Similarity to Ideal Solution (TOPSIS), for ranking the alternatives. Finally, the utility interval technique is applied to combine the ranking results of the MCDM methods. Weighted utility intervals are computed by constructing a correlation matrix between the ranking methods. A real case is presented to show the efficacy of the proposed approach. Copyright © 2016 Elsevier Ltd. All rights reserved.
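    Of the four ranking methods listed, TOPSIS is compact enough to sketch (the scores and weights below are illustrative, not the paper's RTO data): alternatives are ranked by their relative closeness to the ideal and anti-ideal solutions.

```python
# Minimal TOPSIS sketch with hypothetical decision data.

def topsis(matrix, weights, benefit):
    """Closeness coefficient of each alternative (rows of `matrix`)."""
    m, n = len(matrix), len(matrix[0])
    # Vector-normalize each criterion column, then apply the weights.
    norms = [sum(matrix[i][j] ** 2 for i in range(m)) ** 0.5 for j in range(n)]
    V = [[weights[j] * matrix[i][j] / norms[j] for j in range(n)]
         for i in range(m)]
    ideal = [max(col) if benefit[j] else min(col)
             for j, col in enumerate(zip(*V))]
    anti = [min(col) if benefit[j] else max(col)
            for j, col in enumerate(zip(*V))]
    scores = []
    for row in V:
        d_pos = sum((x - p) ** 2 for x, p in zip(row, ideal)) ** 0.5
        d_neg = sum((x - a) ** 2 for x, a in zip(row, anti)) ** 0.5
        scores.append(d_neg / (d_pos + d_neg))   # closeness coefficient
    return scores

# Three centers scored on two benefit criteria and one cost criterion.
matrix = [[7, 9, 9], [8, 7, 8], [9, 6, 8]]
weights = [0.5, 0.3, 0.2]
benefit = [True, True, False]          # the third criterion is a cost
scores = topsis(matrix, weights, benefit)
ranking = sorted(range(len(scores)), key=lambda i: -scores[i])
```

    In the paper's scheme, DEMATEL/ANP would supply the weights, and the utility-interval step would then combine this ranking with those of ARAS, COPRAS and MOORA.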

  5. Securing Information with Complex Optical Encryption Networks

    DTIC Science & Technology

    2015-08-11

    Network Security, Network Vulnerability, Multi-dimensional Processing, optoelectronic devices... optoelectronic devices and systems should be analyzed before the retrieval, any hostile hacker will need to possess multi-disciplinary scientific...sophisticated optoelectronic principles and systems where he/she needs to process the information. However, in the military applications, most military

  6. Modeling synthetic lethality

    PubMed Central

    Le Meur, Nolwenn; Gentleman, Robert

    2008-01-01

    Background Synthetic lethality defines a genetic interaction where the combination of mutations in two or more genes leads to cell death. The implications of synthetic lethal screens have been discussed in the context of drug development as synthetic lethal pairs could be used to selectively kill cancer cells, but leave normal cells relatively unharmed. A challenge is to assess genome-wide experimental data and integrate the results to better understand the underlying biological processes. We propose statistical and computational tools that can be used to find relationships between synthetic lethality and cellular organizational units. Results In Saccharomyces cerevisiae, we identified multi-protein complexes and pairs of multi-protein complexes that share an unusually high number of synthetic genetic interactions. As previously predicted, we found that synthetic lethality can arise from subunits of an essential multi-protein complex or between pairs of multi-protein complexes. Finally, using multi-protein complexes allowed us to take into account the pleiotropic nature of the gene products. Conclusions Modeling synthetic lethality using current estimates of the yeast interactome is an efficient approach to disentangle some of the complex molecular interactions that drive a cell. Our model in conjunction with applied statistical methods and computational methods provides new tools to better characterize synthetic genetic interactions. PMID:18789146
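    One way to flag pairs of complexes that "share an unusually high number of synthetic genetic interactions" is a hypergeometric tail test; the sketch below uses hypothetical counts, not the paper's yeast data, and the authors' exact statistic is not reproduced here.

```python
# Enrichment-test sketch with invented counts: given N tested gene pairs of
# which K are synthetic lethal overall, is observing k lethal pairs among
# the n pairs spanning two complexes more than expected by chance?
from math import comb

def hypergeom_sf(k, N, K, n):
    """Upper-tail probability P(X >= k) of the hypergeometric distribution."""
    return sum(comb(K, x) * comb(N - K, n - x)
               for x in range(k, min(K, n) + 1)) / comb(N, n)

# Hypothetical numbers: 2000 tested pairs, 100 lethal overall; 40 pairs
# span complexes A and B, 12 of them lethal (chance expectation: ~2).
p_value = hypergeom_sf(12, 2000, 100, 40)
```

    A small p-value would mark the complex pair as unusually enriched, subject to the usual multiple-testing correction across all pairs of complexes.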

  7. Advanced real-time multi-display educational system (ARMES): An innovative real-time audiovisual mentoring tool for complex robotic surgery.

    PubMed

    Lee, Joong Ho; Tanaka, Eiji; Woo, Yanghee; Ali, Güner; Son, Taeil; Kim, Hyoung-Il; Hyung, Woo Jin

    2017-12-01

    Recent scientific and technological advances have profoundly affected the training of surgeons worldwide. We describe a novel intraoperative real-time training module, the Advanced Robotic Multi-display Educational System (ARMES). We created a real-time training module, ARMES, which provides standardized step-by-step guidance for robotic distal subtotal gastrectomy with D2 lymphadenectomy procedures. Short video clips of the 20 key steps in the standardized procedure for robotic gastrectomy were created and integrated with TilePro™ software for delivery on da Vinci Surgical Systems (Intuitive Surgical, Sunnyvale, CA). We successfully performed a robotic distal subtotal gastrectomy with D2 lymphadenectomy for a patient with gastric cancer employing this new teaching method, without any transfer errors or system failures. Using this technique, the total operative time was 197 min, blood loss was 50 mL, and there were no intra- or post-operative complications. Our innovative real-time mentoring module, ARMES, enables standardized, systematic guidance during surgical procedures. © 2017 Wiley Periodicals, Inc.

  8. Discrepancy between mRNA and protein abundance: Insight from information retrieval process in computers

    PubMed Central

    Wang, Degeng

    2008-01-01

    Discrepancy between the abundance of cognate protein and RNA molecules is frequently observed. A theoretical understanding of this discrepancy remains elusive, and it is frequently described as surprises and/or technical difficulties in the literature. Protein and RNA represent different steps of the multi-stepped cellular genetic information flow process, in which they are dynamically produced and degraded. This paper explores a comparison with a similar process in computers - multi-step information flow from storage level to the execution level. Functional similarities can be found in almost every facet of the retrieval process. Firstly, common architecture is shared, as the ribonome (RNA space) and the proteome (protein space) are functionally similar to the computer primary memory and the computer cache memory respectively. Secondly, the retrieval process functions, in both systems, to support the operation of dynamic networks – biochemical regulatory networks in cells and, in computers, the virtual networks (of CPU instructions) that the CPU travels through while executing computer programs. Moreover, many regulatory techniques are implemented in computers at each step of the information retrieval process, with a goal of optimizing system performance. Cellular counterparts can be easily identified for these regulatory techniques. In other words, this comparative study attempted to utilize theoretical insight from computer system design principles as catalysis to sketch an integrative view of the gene expression process, that is, how it functions to ensure efficient operation of the overall cellular regulatory network. In context of this bird’s-eye view, discrepancy between protein and RNA abundance became a logical observation one would expect. It was suggested that this discrepancy, when interpreted in the context of system operation, serves as a potential source of information to decipher regulatory logics underneath biochemical network operation. 
PMID:18757239

  9. Discrepancy between mRNA and protein abundance: insight from information retrieval process in computers.

    PubMed

    Wang, Degeng

    2008-12-01

    Discrepancy between the abundance of cognate protein and RNA molecules is frequently observed. A theoretical understanding of this discrepancy remains elusive, and it is frequently described as surprises and/or technical difficulties in the literature. Protein and RNA represent different steps of the multi-stepped cellular genetic information flow process, in which they are dynamically produced and degraded. This paper explores a comparison with a similar process in computers-multi-step information flow from storage level to the execution level. Functional similarities can be found in almost every facet of the retrieval process. Firstly, common architecture is shared, as the ribonome (RNA space) and the proteome (protein space) are functionally similar to the computer primary memory and the computer cache memory, respectively. Secondly, the retrieval process functions, in both systems, to support the operation of dynamic networks-biochemical regulatory networks in cells and, in computers, the virtual networks (of CPU instructions) that the CPU travels through while executing computer programs. Moreover, many regulatory techniques are implemented in computers at each step of the information retrieval process, with a goal of optimizing system performance. Cellular counterparts can be easily identified for these regulatory techniques. In other words, this comparative study attempted to utilize theoretical insight from computer system design principles as catalysis to sketch an integrative view of the gene expression process, that is, how it functions to ensure efficient operation of the overall cellular regulatory network. In context of this bird's-eye view, discrepancy between protein and RNA abundance became a logical observation one would expect. It was suggested that this discrepancy, when interpreted in the context of system operation, serves as a potential source of information to decipher regulatory logics underneath biochemical network operation.

  10. On the role of the chaperonin CCT in the just-in-time assembly process of APC/CCdc20.

    PubMed

    Dekker, Carien

    2010-02-05

    The just-in-time hypothesis relates to the assembly of large multi-protein complexes and the regulation of their activation in the cell. Here I postulate that chaperonins may contribute to the timely assembly and activation of such complexes. For the case of anaphase-promoting complex/cyclosome (APC/CCdc20) assembly by the eukaryotic chaperonin CCT (chaperonin containing Tcp1), it is shown that just-in-time synthesis and chaperone-assisted folding can synergise to generate a highly regulated assembly process for a protein complex that is vital for cell cycle progression. Once dependency has been established, transcriptional regulation and chaperonin-dependency may have co-evolved to safeguard the timely activation of important multi-protein complexes. © 2009 Federation of European Biochemical Societies. Published by Elsevier B.V. All rights reserved.

  11. 75 FR 42378 - Fisheries of the South Atlantic; Southeast Data, Assessment, and Review (SEDAR); South Atlantic...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-07-21

    ... (SEDAR) process, a multi-step method for determining the status of fish stocks in the Southeast Region. SEDAR includes a Data Workshop, a Stock Assessment Process and a Review Workshop. The product of the... datasets are appropriate for assessment analyses. The product of the Stock Assessment Process is a stock...

  12. Cross-cultural adaptation of instruments assessing breastfeeding determinants: a multi-step approach

    PubMed Central

    2014-01-01

    Background Cross-cultural adaptation is a necessary process for effectively using existing instruments in other cultural and language settings. The process of cross-culturally adapting existing instruments, including translation, is considered a critical step in establishing a meaningful instrument for use in another setting. Using a multi-step approach is considered best practice in achieving cultural and semantic equivalence of the adapted version. We aimed to ensure the content validity of our instruments in the cultural context of KwaZulu-Natal, South Africa. Methods The Iowa Infant Feeding Attitudes Scale, Breastfeeding Self-Efficacy Scale-Short Form and additional items comprise our consolidated instrument, which was cross-culturally adapted utilizing a multi-step approach during August 2012. Cross-cultural adaptation was achieved through steps to maintain content validity and attain semantic equivalence in the target version. Specifically, Lynn’s recommendation to apply an item-level content validity index score was followed. The revised instrument was translated and back-translated. To ensure semantic equivalence, Brislin’s back-translation approach was utilized, followed by a committee review to address any discrepancies that emerged from translation. Results Our consolidated instrument was adapted to be culturally relevant and translated to yield more reliable and valid results for use in our larger research study to measure infant feeding determinants effectively in our target cultural context. Conclusions Undertaking rigorous steps to effectively ensure cross-cultural adaptation increases our confidence that the conclusions we make based on our self-report instrument(s) will be stronger.
In this way, our aim to achieve strong cross-cultural adaptation of our consolidated instruments was achieved while also providing a clear framework for other researchers choosing to utilize existing instruments for work in other cultural, geographic and population settings. PMID:25285151

  13. Understanding missed opportunities for more timely diagnosis of cancer in symptomatic patients after presentation

    PubMed Central

    Lyratzopoulos, G; Vedsted, P; Singh, H

    2015-01-01

    The diagnosis of cancer is a complex, multi-step process. In this paper, we highlight factors involved in missed opportunities to diagnose cancer more promptly in symptomatic patients and discuss responsible mechanisms and potential strategies to shorten intervals from presentation to diagnosis. Missed opportunities are instances in which post-hoc judgement indicates that alternative decisions or actions could have led to more timely diagnosis. They can occur in any of the three phases of the diagnostic process (initial diagnostic assessment; diagnostic test performance and interpretation; and diagnostic follow-up and coordination) and can involve patient, doctor/care team, and health-care system factors, often in combination. In this perspective article, we consider epidemiological ‘signals’ suggestive of missed opportunities and draw on evidence from retrospective case reviews of cancer patient cohorts to summarise factors that contribute to missed opportunities. Multi-disciplinary research targeting such factors is important to shorten diagnostic intervals post presentation. Insights from the fields of organisational and cognitive psychology, human factors science and informatics can be extremely valuable in this emerging research agenda. We provide a conceptual foundation for the development of future interventions to minimise the occurrence of missed opportunities in cancer diagnosis, enriching current approaches that chiefly focus on clinical decision support or on widening access to investigations. PMID:25734393

  14. Model-based engineering for laser weapons systems

    NASA Astrophysics Data System (ADS)

    Panthaki, Malcolm; Coy, Steve

    2011-10-01

    The Comet Performance Engineering Workspace is an environment that enables integrated, multidisciplinary modeling and design/simulation process automation. One of the many multi-disciplinary applications of the Comet Workspace is the integrated Structural, Thermal, Optical Performance (STOP) analysis of complex, multi-disciplinary space systems containing Electro-Optical (EO) sensors such as those which are designed and developed by and for NASA and the Department of Defense. The Comet™ software is currently able to integrate performance simulation data and processes from a wide range of 3-D CAD and analysis software programs, including CODE V™ from Optical Research Associates and SigFit™ from Sigmadyne Inc., which are used to simulate the optics performance of EO sensor systems in space-borne applications. Over the past year, Comet Solutions has been working with MZA Associates of Albuquerque, NM, under a contract with the Air Force Research Laboratories. This funded effort is a "risk reduction effort", to help determine whether the combination of Comet and WaveTrain™, a wave optics systems engineering analysis environment developed and maintained by MZA Associates and used by the Air Force Research Laboratory, will result in an effective Model-Based Engineering (MBE) environment for the analysis and design of laser weapons systems. This paper will review the results of this effort and future steps.

  15. On being green: can flow chemistry help?

    PubMed

    Ley, Steven V

    2012-08-01

    The principles of Green Chemistry are important but challenging drivers for most modern synthesis programs. To meet these challenges new flow chemistry tools are proving to be very effective by providing improved heat/mass transfer opportunities, lower solvent usage, less waste generation, hazardous compound containment, and the possibility of a 24/7 working regime. This machine-assisted approach can be used to effect repetitive or routine scale-up steps or when combined with reagent and scavenger cartridges, to achieve multi-step synthesis of complex natural products and pharmaceutical agents. Copyright © 2012 The Japan Chemical Journal Forum and Wiley Periodicals, Inc.

  16. Quasi-multi-pulse voltage source converter design with two control degrees of freedom

    NASA Astrophysics Data System (ADS)

    Vural, A. M.; Bayindir, K. C.

    2015-05-01

    In this article, the design details of a quasi-multi-pulse voltage source converter (VSC) switched at the line frequency of 50 Hz are given in a step-by-step process. The proposed converter comprises four 12-pulse converter units and is suitable for the simulation of single-/multi-converter flexible alternating current transmission system devices as well as high-voltage direct-current systems operating at the transmission level. The magnetic interface of the converter is an original design, with all parameters given for 100 MVA operation. The so-called two-angle control method is adopted to control the voltage magnitude and the phase angle of the converter independently. PSCAD simulation results verify both four-quadrant converter operation and closed-loop control of the converter operated as a static synchronous compensator (STATCOM).

  17. Simulating and mapping spatial complexity using multi-scale techniques

    USGS Publications Warehouse

    De Cola, L.

    1994-01-01

    A central problem in spatial analysis is the mapping of data for complex spatial fields using relatively simple data structures, such as those of a conventional GIS. This complexity can be measured using such indices as multi-scale variance, which reflects spatial autocorrelation, and multi-fractal dimension, which characterizes the values of fields. These indices are computed for three spatial processes: Gaussian noise, a simple mathematical function, and data for a random walk. Fractal analysis is then used to produce a vegetation map of the central region of California based on a satellite image. This analysis suggests that real world data lie on a continuum between the simple and the random, and that a major GIS challenge is the scientific representation and understanding of rapidly changing multi-scale fields. -Author
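
    The multi-scale variance index mentioned above can be sketched in a few lines: block-average the field at successively coarser dyadic scales and track how the variance of the coarsened field decays (for uncorrelated noise it falls roughly as 1/scale, while autocorrelated fields decay more slowly). A minimal illustrative sketch for a 1-D field; the function names are ours, not the paper's.

```python
import random

def block_average(values, size):
    """Average consecutive non-overlapping blocks of `size` samples."""
    n = len(values) // size
    return [sum(values[i * size:(i + 1) * size]) / size for i in range(n)]

def variance(values):
    m = sum(values) / len(values)
    return sum((v - m) ** 2 for v in values) / len(values)

def multiscale_variance(values, scales=(1, 2, 4, 8)):
    """Variance of the field after block-averaging at each scale."""
    return {s: variance(block_average(values, s)) for s in scales}

if __name__ == "__main__":
    rng = random.Random(0)
    noise = [rng.gauss(0.0, 1.0) for _ in range(4096)]
    for s, v in sorted(multiscale_variance(noise).items()):
        print(s, round(v, 3))
```

    For Gaussian noise the variance at scale s shrinks toward var/s; a smoother field such as a random walk keeps much of its variance at coarse scales, which is how the index separates the two processes.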

  18. Towards a conceptual multi-agent-based framework to simulate the spatial group decision-making process

    NASA Astrophysics Data System (ADS)

    Ghavami, Seyed Morsal; Taleai, Mohammad

    2017-04-01

    Most spatial problems are multi-actor, multi-issue and multi-phase in nature. In addition to their intrinsic complexity, spatial problems usually involve groups of actors from different organizational and cognitive backgrounds, all of whom participate in a social structure to resolve or reduce the complexity of a given problem. Hence, it is important to study and evaluate which aspects influence the spatial problem resolution process. Recently, multi-agent systems, consisting of groups of separate agent entities all interacting with each other, have been put forward as appropriate tools to study and resolve such problems. In this study, in order to generate a better understanding of the group decision-making process for spatial problems, a conceptual multi-agent-based framework is used that represents and specifies the concepts and entities needed to aid group decision making, based on a simulation of the group decision-making process as well as the relationships that exist among the different concepts involved. The study uses five main influencing entities as concepts in the simulation process: spatial influence, individual-level influence, group-level influence, negotiation influence and group performance measures. Further, it explains the relationships among the different concepts in a descriptive rather than explanatory manner. To illustrate the proposed framework, the approval process for an urban land use master plan in Zanjan—a provincial capital in Iran—is simulated using MAS, the results highlighting the effectiveness of applying an MAS-based framework when studying the group decision-making process used to resolve spatial problems.

  19. Standardized Access and Processing of Multi-Source Earth Observation Time-Series Data within a Regional Data Middleware

    NASA Astrophysics Data System (ADS)

    Eberle, J.; Schmullius, C.

    2017-12-01

    Increasing archives of global satellite data present a new challenge to handle multi-source satellite data in a user-friendly way. Any user is confronted with different data formats and data access services. In addition the handling of time-series data is complex as an automated processing and execution of data processing steps is needed to supply the user with the desired product for a specific area of interest. In order to simplify the access to data archives of various satellite missions and to facilitate the subsequent processing, a regional data and processing middleware has been developed. The aim of this system is to provide standardized and web-based interfaces to multi-source time-series data for individual regions on Earth. For further use and analysis uniform data formats and data access services are provided. Interfaces to data archives of the sensor MODIS (NASA) as well as the satellites Landsat (USGS) and Sentinel (ESA) have been integrated in the middleware. Various scientific algorithms, such as the calculation of trends and breakpoints of time-series data, can be carried out on the preprocessed data on the basis of uniform data management. Jupyter Notebooks are linked to the data and further processing can be conducted directly on the server using Python and the statistical language R. In addition to accessing EO data, the middleware is also used as an intermediary between the user and external databases (e.g., Flickr, YouTube). Standardized web services as specified by OGC are provided for all tools of the middleware. Currently, the use of cloud services is being researched to bring algorithms to the data. As a thematic example, an operational monitoring of vegetation phenology is being implemented on the basis of various optical satellite data and validation data from the German Weather Service. Other examples demonstrate the monitoring of wetlands focusing on automated discovery and access of Landsat and Sentinel data for local areas.
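
    One of the server-side analyses mentioned, trend calculation on preprocessed time series, reduces in its simplest form to an ordinary least-squares fit per pixel or region. A minimal sketch; the function name and interface are hypothetical, not part of the middleware's API.

```python
def linear_trend(t, y):
    """Ordinary least-squares slope and intercept of y over time t.

    This is the simplest per-pixel trend statistic one can compute on a
    preprocessed Earth-observation time series (illustrative only).
    """
    n = len(t)
    mt = sum(t) / n
    my = sum(y) / n
    slope = (sum((ti - mt) * (yi - my) for ti, yi in zip(t, y))
             / sum((ti - mt) ** 2 for ti in t))
    return slope, my - slope * mt

if __name__ == "__main__":
    # A vegetation index rising by 2 units per time step (toy data).
    print(linear_trend([0.0, 1.0, 2.0, 3.0], [1.0, 3.0, 5.0, 7.0]))
```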

  20. A comparison of the stochastic and machine learning approaches in hydrologic time series forecasting

    NASA Astrophysics Data System (ADS)

    Kim, T.; Joo, K.; Seo, J.; Heo, J. H.

    2016-12-01

    Hydrologic time series forecasting is an essential task in water resources management, and it becomes more difficult due to the complexity of the runoff process. Traditional stochastic models such as the ARIMA family have been used as a standard approach to time series modeling and forecasting of hydrological variables. Due to the nonlinearity in hydrologic time series data, machine learning approaches have been studied for their advantage of discovering relevant features in nonlinear relations among variables. This study aims to compare the predictability of the traditional stochastic model and the machine learning approach. A seasonal ARIMA model was used as the traditional time series model, and the Random Forest model, an ensemble of decision trees built with a multiple-predictor approach, was applied as the machine learning approach. In the application, monthly inflow data from 1986 to 2015 of Chungju Dam in South Korea were used for modeling and forecasting. In order to evaluate the performance of the models, one-step-ahead and multi-step-ahead forecasting were applied, and the root mean squared error and mean absolute error of the two models were compared.
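
    The evaluation protocol described here — one-step-ahead vs. multi-step-ahead forecasts scored by RMSE and MAE — can be sketched independently of the models. Below, a seasonal-naive forecaster stands in for the actual SARIMA and Random Forest models (which the paper uses but this sketch does not implement); the data are invented.

```python
import math

def rmse(actual, predicted):
    """Root mean squared error between two equal-length series."""
    return math.sqrt(sum((a - p) ** 2 for a, p in zip(actual, predicted))
                     / len(actual))

def mae(actual, predicted):
    """Mean absolute error between two equal-length series."""
    return sum(abs(a - p) for a, p in zip(actual, predicted)) / len(actual)

def seasonal_naive(history, horizon, period=12):
    """Forecast by repeating the value observed one season earlier
    (a stand-in for SARIMA / Random Forest in this sketch)."""
    return [history[len(history) - period + (h % period)]
            for h in range(horizon)]

def rolling_one_step(series, start, period=12):
    """One-step-ahead evaluation: the model sees all observations up to
    time t before predicting observation t."""
    return [seasonal_naive(series[:t], 1, period)[0]
            for t in range(start, len(series))]

if __name__ == "__main__":
    # Toy "monthly inflow": a fixed seasonal cycle repeated five years.
    cycle = [30, 32, 45, 60, 80, 120, 200, 180, 110, 70, 45, 35]
    series = cycle * 5
    start = 4 * 12                     # hold out the final year
    actual = series[start:]
    one = rolling_one_step(series, start)
    multi = seasonal_naive(series[:start], 12)   # 12-step-ahead at once
    print("one-step   RMSE", rmse(actual, one), "MAE", mae(actual, one))
    print("multi-step RMSE", rmse(actual, multi), "MAE", mae(actual, multi))
```

    With real inflow data the multi-step errors typically grow with the horizon, which is exactly the gap the paper measures between the stochastic and machine learning approaches.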

  1. Independent component analysis (ICA) and self-organizing map (SOM) approach to multidetection system for network intruders

    NASA Astrophysics Data System (ADS)

    Abdi, Abdi M.; Szu, Harold H.

    2003-04-01

    With the growing rate of interconnection among computer systems, network security is becoming a real challenge. An Intrusion Detection System (IDS) is designed to protect the availability, confidentiality and integrity of critical network information systems. Today's approach to network intrusion detection involves the use of rule-based expert systems to identify indications of known attacks or anomalies. However, these techniques are less successful in identifying today's attacks. Hackers are perpetually inventing new and previously unanticipated techniques to compromise information infrastructure. This paper proposes a dynamic way of detecting network intruders on time series data. The proposed approach consists of a two-step process. First, we obtain an efficient multi-user detection method, employing the recently introduced complexity minimization approach as a generalization of standard ICA. Second, we identify an unsupervised learning neural network architecture based on Kohonen's Self-Organizing Map for potential functional clustering. These two steps working together adaptively will provide a pseudo-real-time novelty detection attribute to supplement the current intrusion detection statistical methodology.
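
    The second step, Kohonen's Self-Organizing Map, is a competitive-learning clustering rule: each sample pulls its best-matching unit (BMU) and the BMU's grid neighbours toward it, with decaying learning rate and neighbourhood radius. A minimal 1-D SOM sketch with invented defaults, not the paper's configuration:

```python
import math
import random

def train_som(data, n_units=4, dim=2, epochs=200, lr0=0.5, radius0=2.0, seed=1):
    """Minimal 1-D Kohonen self-organizing map (illustrative sketch).

    For each sample, the best-matching unit and its grid neighbours are
    pulled toward the sample; the learning rate and neighbourhood
    radius decay linearly over the epochs.
    """
    rng = random.Random(seed)
    units = [[rng.random() for _ in range(dim)] for _ in range(n_units)]
    for epoch in range(epochs):
        frac = epoch / epochs
        lr = lr0 * (1.0 - frac)
        radius = max(radius0 * (1.0 - frac), 0.5)
        for x in data:
            bmu = bmu_index(units, x)
            for i in range(n_units):
                # Gaussian neighbourhood on the 1-D unit grid.
                h = math.exp(-((i - bmu) ** 2) / (2.0 * radius ** 2))
                units[i] = [w + lr * h * (v - w)
                            for w, v in zip(units[i], x)]
    return units

def bmu_index(units, x):
    """Index of the unit whose weight vector is closest to sample x."""
    return min(range(len(units)),
               key=lambda i: sum((w - v) ** 2 for w, v in zip(units[i], x)))
```

    In an IDS setting, a novelty flag could then be raised whenever a traffic sample's distance to its BMU exceeds a threshold learned from normal data.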

  2. Continuous-Time Bilinear System Identification

    NASA Technical Reports Server (NTRS)

    Juang, Jer-Nan

    2003-01-01

    The objective of this paper is to describe a new method for identification of a continuous-time multi-input and multi-output bilinear system. The approach is to make judicious use of the linear-model properties of the bilinear system when subjected to a constant input. Two steps are required in the identification process. The first step is to use a set of pulse responses resulting from a constant input of one sample period to identify the state matrix, the output matrix, and the direct transmission matrix. The second step is to use another set of pulse responses with the same constant input over multiple sample periods to identify the input matrix and the coefficient matrices associated with the coupling terms between the state and the inputs. Numerical examples are given to illustrate the concept and the computational algorithm for the identification method.
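
    The first identification step — extracting system matrices from pulse-response (Markov-parameter) data — can be illustrated in the simplest possible setting, a scalar discrete-time linear system; the bilinear case adds a second pulse-response data set under a constant input to recover the coupling terms. This is a toy sketch of the idea, not the paper's algorithm:

```python
def pulse_response(a, b, c, d, n):
    """Markov parameters of x[k+1] = a*x[k] + b*u[k], y[k] = c*x[k] + d*u[k]
    driven by a unit pulse: d, c*b, c*a*b, c*a^2*b, ..."""
    return [d] + [c * a ** k * b for k in range(n - 1)]

def identify_scalar(y):
    """Recover (a, c*b, d) from a noise-free scalar pulse response.

    Only the product c*b is identifiable here; separating b and c
    requires a state-space realization step, as in the general method.
    """
    d = y[0]
    cb = y[1]
    a = y[2] / y[1]
    return a, cb, d

if __name__ == "__main__":
    y = pulse_response(0.8, 2.0, 0.5, 0.3, 6)
    print(identify_scalar(y))
```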

  3. Eureka: Six Easy Steps to Research Success

    ERIC Educational Resources Information Center

    Hubel, Joy Alter

    2005-01-01

    Eureka is similar to the Big6™ research skills by Michael Eisenberg and Bob Berkowitz, as both methods simplify the complex process of critical information gathering into six user-friendly steps. The six research steps of Eureka are presented.

  4. Model-based MPC enables curvilinear ILT using either VSB or multi-beam mask writers

    NASA Astrophysics Data System (ADS)

    Pang, Linyong; Takatsukasa, Yutetsu; Hara, Daisuke; Pomerantsev, Michael; Su, Bo; Fujimura, Aki

    2017-07-01

    Inverse Lithography Technology (ILT) is becoming the choice for Optical Proximity Correction (OPC) of advanced technology nodes in IC design and production. Multi-beam mask writers promise significant mask writing time reduction for complex ILT-style masks. Before multi-beam mask writers become the mainstream working tools in mask production, VSB writers will continue to be the tool of choice to write both curvilinear ILT and Manhattanized ILT masks. To enable VSB mask writers for complex ILT-style masks, model-based mask process correction (MB-MPC) is required to do the following: (1) make reasonable corrections for complex edges for those features that exhibit relatively large deviations from both curvilinear ILT and Manhattanized ILT designs; (2) control and manage both Edge Placement Errors (EPE) and shot count; and (3) assist in easing the migration to future multi-beam mask writers and serve as an effective backup solution during the transition. In this paper, a solution meeting all those requirements, MB-MPC with GPU acceleration, will be presented. One model calibration per process allows accurate correction regardless of the target mask writer.

  5. Short-Chain 3-Hydroxyacyl-Coenzyme A Dehydrogenase Associates with a Protein Super-Complex Integrating Multiple Metabolic Pathways

    PubMed Central

    Narayan, Srinivas B.; Master, Stephen R.; Sireci, Anthony N.; Bierl, Charlene; Stanley, Paige E.; Li, Changhong; Stanley, Charles A.; Bennett, Michael J.

    2012-01-01

    Proteins involved in mitochondrial metabolic pathways engage in functionally relevant multi-enzyme complexes. We previously described an interaction between short-chain 3-hydroxyacyl-coenzyme A dehydrogenase (SCHAD) and glutamate dehydrogenase (GDH) explaining the clinical phenotype of hyperinsulinism in SCHAD-deficient patients and adding SCHAD to the list of mitochondrial proteins capable of forming functional, multi-pathway complexes. In this work, we provide evidence of SCHAD's involvement in additional interactions forming tissue-specific metabolic super complexes involving both membrane-associated and matrix-dwelling enzymes and spanning multiple metabolic pathways. As an example, in murine liver, we find SCHAD interaction with aspartate transaminase (AST) and GDH from amino acid metabolic pathways, carbamoyl phosphate synthase I (CPS-1) from ureagenesis, other fatty acid oxidation and ketogenesis enzymes and fructose-bisphosphate aldolase, an extra-mitochondrial enzyme of the glycolytic pathway. Most of the interactions appear to be independent of SCHAD's role in the penultimate step of fatty acid oxidation suggesting an organizational, structural or non-enzymatic role for the SCHAD protein. PMID:22496890

  6. Uncovering the stoichiometry of Pyrococcus furiosus RNase P, a multi-subunit catalytic ribonucleoprotein complex, by surface-induced dissociation and ion mobility mass spectrometry.

    PubMed

    Ma, Xin; Lai, Lien B; Lai, Stella M; Tanimoto, Akiko; Foster, Mark P; Wysocki, Vicki H; Gopalan, Venkat

    2014-10-20

    We demonstrate that surface-induced dissociation (SID) coupled with ion mobility mass spectrometry (IM-MS) is a powerful tool for determining the stoichiometry of a multi-subunit ribonucleoprotein (RNP) complex assembled in a solution containing Mg(2+). We investigated Pyrococcus furiosus (Pfu) RNase P, an archaeal RNP that catalyzes tRNA 5' maturation. Previous step-wise, Mg(2+)-dependent reconstitutions of Pfu RNase P with its catalytic RNA subunit and two interacting protein cofactor pairs (RPP21⋅RPP29 and POP5⋅RPP30) revealed functional RNP intermediates en route to the RNase P enzyme, but provided no information on subunit stoichiometry. Our native MS studies with the proteins showed RPP21⋅RPP29 and (POP5⋅RPP30)2 complexes, but indicated a 1:1 composition for all subunits when either one or both protein complexes bind the cognate RNA. These results highlight the utility of SID and IM-MS in resolving conformational heterogeneity and yielding insights on RNP assembly. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  7. Health behavior change models for HIV prevention and AIDS care: practical recommendations for a multi-level approach.

    PubMed

    Kaufman, Michelle R; Cornish, Flora; Zimmerman, Rick S; Johnson, Blair T

    2014-08-15

    Despite increasing recent emphasis on the social and structural determinants of HIV-related behavior, empirical research and interventions lag behind, partly because of the complexity of social-structural approaches. This article provides a comprehensive and practical review of the diverse literature on multi-level approaches to HIV-related behavior change in the interest of contributing to the ongoing shift to more holistic theory, research, and practice. It has the following specific aims: (1) to provide a comprehensive list of relevant variables/factors related to behavior change at all points on the individual-structural spectrum, (2) to map out and compare the characteristics of important recent multi-level models, (3) to reflect on the challenges of operating with such complex theoretical tools, and (4) to identify next steps and make actionable recommendations. Using a multi-level approach implies incorporating increasing numbers of variables and increasingly context-specific mechanisms, overall producing greater intricacies. We conclude with recommendations on how best to respond to this complexity, which include: using formative research and interdisciplinary collaboration to select the most appropriate levels and variables in a given context; measuring social and institutional variables at the appropriate level to ensure meaningful assessments of multiple levels are made; and conceptualizing intervention and research with reference to theoretical models and mechanisms to facilitate transferability, sustainability, and scalability.

  8. At a glance: cellular biology for engineers.

    PubMed

    Khoshmanesh, K; Kouzani, A Z; Nahavandi, S; Baratchi, S; Kanwar, J R

    2008-10-01

    Engineering contributions have played an important role in the rise and evolution of cellular biology. Engineering technologies have helped biologists to explore the living organisms at cellular and molecular levels, and have created new opportunities to tackle the unsolved biological problems. There is now a growing demand to further expand the role of engineering in cellular biology research. For an engineer to play an effective role in cellular biology, the first essential step is to understand the cells and their components. However, the stumbling block of this step is to comprehend the information given in the cellular biology literature because it best suits the readers with a biological background. This paper aims to overcome this bottleneck by describing the human cell components as micro-plants that form cells as micro-bio-factories. This concept can accelerate the engineers' comprehension of the subject. In this paper, first the structure and function of different cell components are described. In addition, the engineering attempts to mimic various cell components through numerical modelling or physical implementation are highlighted. Next, the interaction of different cell components that facilitate complicated chemical processes, such as energy generation and protein synthesis, are described. These complex interactions are translated into simple flow diagrams, generally used by engineers to represent multi-component processes.

  9. Long range personalized cancer treatment strategies incorporating evolutionary dynamics.

    PubMed

    Yeang, Chen-Hsiang; Beckman, Robert A

    2016-10-22

    Current cancer precision medicine strategies match therapies to static consensus molecular properties of an individual's cancer, thus determining the next therapeutic maneuver. These strategies typically maintain a constant treatment while the cancer is not worsening. However, cancers feature complicated sub-clonal structure and dynamic evolution. We have recently shown, in a comprehensive simulation of two non-cross resistant therapies across a broad parameter space representing realistic tumors, that substantial improvement in cure rates and median survival can be obtained utilizing dynamic precision medicine strategies. These dynamic strategies explicitly consider intratumoral heterogeneity and evolutionary dynamics, including predicted future drug resistance states, and reevaluate optimal therapy every 45 days. However, the optimization is performed in single 45 day steps ("single-step optimization"). Herein we evaluate analogous strategies that think multiple therapeutic maneuvers ahead, considering potential outcomes at 5 steps ahead ("multi-step optimization") or 40 steps ahead ("adaptive long term optimization (ALTO)") when recommending the optimal therapy in each 45 day block, in simulations involving both 2 and 3 non-cross resistant therapies. We also evaluate an ALTO approach for situations where simultaneous combination therapy is not feasible ("Adaptive long term optimization: serial monotherapy only (ALTO-SMO)"). Simulations utilize populations of 764,000 and 1,700,000 virtual patients for 2 and 3 drug cases, respectively. Each virtual patient represents a unique clinical presentation including sizes of major and minor tumor subclones, growth rates, evolution rates, and drug sensitivities. While multi-step optimization and ALTO provide no significant average survival benefit, cure rates are significantly increased by ALTO. 
Furthermore, in the subset of individual virtual patients demonstrating clinically significant difference in outcome between approaches, by far the majority show an advantage of multi-step or ALTO over single-step optimization. ALTO-SMO delivers cure rates superior or equal to those of single- or multi-step optimization, in 2 and 3 drug cases respectively. In selected virtual patients incurable by dynamic precision medicine using single-step optimization, analogous strategies that "think ahead" can deliver long-term survival and cure without any disadvantage for non-responders. When therapies require dose reduction in combination (due to toxicity), optimal strategies feature complex patterns involving rapidly interleaved pulses of combinations and high dose monotherapy. This article was reviewed by Wendy Cornell, Marek Kimmel, and Andrzej Swierniak. Wendy Cornell and Andrzej Swierniak are external reviewers (not members of the Biology Direct editorial board). Andrzej Swierniak was nominated by Marek Kimmel.
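
    The difference between single-step and multi-step optimization can be illustrated with a deliberately toy two-clone model: a greedy policy re-optimizes one therapy block at a time, while lookahead enumerates therapy sequences over several blocks and, by construction, can never finish with a higher tumor burden. The dynamics, therapy effects, and numbers below are invented for illustration and are unrelated to the paper's simulation:

```python
def step(state, therapy):
    """Toy tumor model: (sensitive, resistant) cell counts after one block.

    Therapy 'A' kills the sensitive clone but lets the resistant clone
    expand; therapy 'B' does the reverse. Purely illustrative numbers.
    """
    s, r = state
    if therapy == "A":
        return (s * 0.1, r * 2.0)
    return (s * 2.0, r * 0.5)

def best_plan(state, depth):
    """Exhaustive lookahead ("multi-step optimization" in miniature):
    minimize total burden after `depth` blocks."""
    if depth == 0:
        return [], state[0] + state[1]
    best = None
    for t in ("A", "B"):
        plan, burden = best_plan(step(state, t), depth - 1)
        if best is None or burden < best[1]:
            best = ([t] + plan, burden)
    return best

def single_step_policy(state, blocks):
    """Greedy single-step optimization: pick the therapy minimizing the
    burden one block ahead, then re-optimize."""
    plan = []
    for _ in range(blocks):
        t = min("AB", key=lambda th: sum(step(state, th)))
        state = step(state, t)
        plan.append(t)
    return plan, state[0] + state[1]

if __name__ == "__main__":
    start = (100.0, 1.0)   # mostly sensitive cells, small resistant clone
    print("lookahead:", best_plan(start, 4))
    print("greedy:   ", single_step_policy(start, 4))
```

    The exhaustive search grows as 2^depth, which is why the paper's 40-step ALTO horizon requires an approximate strategy rather than brute-force enumeration.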

  10. TkPl_SU: An Open-source Perl Script Builder for Seismic Unix

    NASA Astrophysics Data System (ADS)

    Lorenzo, J. M.

    2017-12-01

    TkPl_SU (beta) is a graphical user interface (GUI) to select parameters for Seismic Unix (SU) modules. Seismic Unix (Stockwell, 1999) is a widely distributed free software package for processing seismic reflection and signal processing. Perl/Tk is a mature, well-documented and free object-oriented graphical user interface for Perl. In a classroom environment, shell scripting of SU modules engages students and helps focus on the theoretical limitations and strengths of signal processing. However, complex interactive processing stages, e.g., selection of optimal stacking velocities, killing bad data traces, or spectral analysis requires advanced flows beyond the scope of introductory classes. In a research setting, special functionality from other free seismic processing software such as SioSeis (UCSD-NSF) can be incorporated readily via an object-oriented style to programming. An object oriented approach is a first step toward efficient extensible programming of multi-step processes, and a simple GUI simplifies parameter selection and decision making. Currently, in TkPl_SU, Perl 5 packages wrap 19 of the most common SU modules that are used in teaching undergraduate and first-year graduate student classes (e.g., filtering, display, velocity analysis and stacking). Perl packages (classes) can advantageously add new functionality around each module and clarify parameter names for easier usage. For example, through the use of methods, packages can isolate the user from repetitive control structures, as well as replace the names of abbreviated parameters with self-describing names. Moose, an extension of the Perl 5 object system, greatly facilitates an object-oriented style. Perl wrappers are self-documenting via Perl programming document markup language.

  11. Approaches of multilayer overlay process control for 28nm FD-SOI derivative applications

    NASA Astrophysics Data System (ADS)

    Duclaux, Benjamin; De Caunes, Jean; Perrier, Robin; Gatefait, Maxime; Le Gratiet, Bertrand; Chapon, Jean-Damien; Monget, Cédric

    2018-03-01

    Derivative technologies like embedded Non-Volatile Memories (eNVM) are raising new types of challenges on the "more than Moore" path. By their construction, overlay is critical across multiple layers; by their running mode, the use of high voltages stresses leakage and breakdown; and their targeted markets (automotive, industry automation, secure transactions…) all demand high device reliability (typically below the 1 ppm level). As a consequence, overlay specifications are tight, not only between one layer and its reference, but also among the critical layers sharing the same reference. This work describes a broad picture of the key points for multilayer overlay process control in the case of a 28nm FD-SOI technology and its derivative flows. First, the alignment trees of the different flow options were optimized using realistic process-assumption calculations for indirect overlay. Then, in the case of a complex alignment tree involving a heterogeneous scanner toolset, the criticality of tool matching between the reference layer and the critical layers of the flow was highlighted. Improving the APC control loops of these multilayer dependencies was studied with simulations of feed-forward control as well as by implementing a new rework algorithm based on multi-measures. Finally, the management of these measurement steps raises some issues for inline support, and using calculations or "virtual overlay" could help to regain some tool capability. A first step towards multilayer overlay process control has been taken.

  12. Predicting High School Completion Using Student Performance in High School Algebra: A Mixed Methods Research Study

    ERIC Educational Resources Information Center

    Chiado, Wendy S.

    2012-01-01

    Too many of our nation's youth have failed to complete high school. Determining why so many of our nation's students fail to graduate is a complex, multi-faceted problem and beyond the scope of any one study. The study presented herein utilized a thirteen-step mixed methods model developed by Leech and Onwuegbuzie (2007) to demonstrate within a…

  13. Path lumping: An efficient algorithm to identify metastable path channels for conformational dynamics of multi-body systems

    NASA Astrophysics Data System (ADS)

    Meng, Luming; Sheong, Fu Kit; Zeng, Xiangze; Zhu, Lizhe; Huang, Xuhui

    2017-07-01

    Constructing Markov state models from large-scale molecular dynamics simulation trajectories is a promising approach to dissect the kinetic mechanisms of complex chemical and biological processes. Combined with transition path theory, Markov state models can be applied to identify all pathways connecting any conformational states of interest. However, the identified pathways can be too complex to comprehend, especially for multi-body processes where numerous parallel pathways with comparable flux probability often coexist. Here, we have developed a path lumping method to group these parallel pathways into metastable path channels for analysis. We define the similarity between two pathways as the intercrossing flux between them and then apply the spectral clustering algorithm to lump these pathways into groups. We demonstrate the power of our method by applying it to two systems: a 2D-potential consisting of four metastable energy channels and the hydrophobic collapse process of two hydrophobic molecules. In both cases, our algorithm successfully reveals the metastable path channels. We expect this path lumping algorithm to be a promising tool for revealing unprecedented insights into the kinetic mechanisms of complex multi-body processes.
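
    The lumping idea — define pathway similarity as the intercrossing flux, then cluster — can be sketched with a simplified stand-in: instead of the paper's spectral clustering, the toy below thresholds the similarity matrix and takes connected components with union-find, so strongly intercrossing pathways end up in the same channel. Function name and threshold are ours:

```python
def lump_paths(similarity, threshold):
    """Group pathways whose pairwise similarity (e.g. intercrossing
    flux) exceeds `threshold`, via union-find connected components.

    A simplified stand-in for the spectral clustering step of the
    actual path-lumping algorithm.
    """
    n = len(similarity)
    parent = list(range(n))

    def find(i):
        # Path-halving union-find lookup.
        while parent[i] != i:
            parent[i] = parent[parent[i]]
            i = parent[i]
        return i

    def union(i, j):
        parent[find(i)] = find(j)

    for i in range(n):
        for j in range(i + 1, n):
            if similarity[i][j] >= threshold:
                union(i, j)

    channels = {}
    for i in range(n):
        channels.setdefault(find(i), []).append(i)
    return list(channels.values())
```

    With a flux-based similarity matrix for four parallel pathways, two strongly intercrossing pairs collapse into two metastable channels, which is the kind of coarse picture the method aims to extract from thousands of transition paths.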

  14. Reduced complexity of multi-track joint 2-D Viterbi detectors for bit-patterned media recording channel

    NASA Astrophysics Data System (ADS)

    Myint, L. M. M.; Warisarn, C.

    2017-05-01

    Two-dimensional (2-D) interference is one of the prominent challenges in ultra-high-density recording systems such as bit-patterned media recording (BPMR). The multi-track joint 2-D detection technique, with the help of array-head reading, can tackle this problem effectively by jointly processing the multiple readback signals from adjacent tracks. Moreover, it can robustly alleviate the impairments due to track mis-registration (TMR) and media noise. However, the computational complexity of such detectors is normally too high to implement in reality, even for a few tracks. Therefore, in this paper, we mainly focus on reducing the complexity of the multi-track joint 2-D Viterbi detector without paying a large penalty in performance. We propose a simplified multi-track joint 2-D Viterbi detector with a manageable complexity level for BPMR's multi-track multi-head (MTMH) system. In the proposed method, the complexity of the detector's trellis is reduced with the help of a joint-track equalization method that employs 1-D equalizers and a 2-D generalized partial response (GPR) target. Moreover, we also examine the performance of a full-fledged multi-track joint 2-D detector and conventional 2-D detection. The results show that the simplified detector can perform close to the full-fledged detector, especially when the system faces high media noise, at significantly lower complexity.
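
    The core of any such detector is the 1-D Viterbi recursion; the multi-track joint 2-D versions enlarge the trellis state across tracks, which is exactly where the complexity explosion comes from. A minimal bipolar two-tap partial-response detector (illustrative only; the tap values and trellis are not the paper's GPR target):

```python
def viterbi_pr(received, h=(1.0, 0.5)):
    """Minimal 1-D Viterbi detector for the two-tap partial-response
    channel y[k] = h0*a[k] + h1*a[k-1] with bipolar symbols a in {-1, +1}.

    State = index of the previous symbol; the survivor with the lowest
    accumulated squared error wins. Multi-track joint detectors extend
    this trellis across tracks, multiplying the state count.
    """
    h0, h1 = h
    symbols = (-1.0, 1.0)
    metrics = [0.0, 0.0]      # both initial states allowed
    paths = [[], []]
    for y in received:
        new_metrics, new_paths = [], []
        for cur in range(2):          # candidate current symbol
            best = None
            for prev in range(2):     # surviving predecessor state
                expect = h0 * symbols[cur] + h1 * symbols[prev]
                m = metrics[prev] + (y - expect) ** 2
                if best is None or m < best[0]:
                    best = (m, paths[prev] + [symbols[cur]])
            new_metrics.append(best[0])
            new_paths.append(best[1])
        metrics, paths = new_metrics, new_paths
    return paths[min(range(2), key=lambda i: metrics[i])]
```

    With two symbols per track, a joint detector over T tracks needs 2^T states per trellis stage instead of 2, which motivates the simplified trellis studied in the paper.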

  15. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhang, Jiali, E-mail: j.zhang@mpie.de; Morsdorf, Lutz, E-mail: l.morsdorf@mpie.de; Tasan, Cemal Cem, E-mail: c.tasan@mpie.de

    In-situ scanning electron microscopy observations of the microstructure evolution during heat treatments are increasingly demanded due to the growing number of alloys with complex microstructures. Post-mortem characterization of the as-processed microstructures rarely provides sufficient insight on the exact route of the microstructure formation. On the other hand, in-situ SEM approaches are often limited due to the challenges arising from an in-situ heating setup, e.g. in (i) employing different detectors, (ii) preventing specimen surface degradation, or (iii) controlling and measuring the temperature precisely. Here, we explore and expand the capabilities of the “mid-way” solution of step-wise microstructure tracking, ex-situ, at selected steps of heat treatment. This approach circumvents the limitations above, as it involves an atmosphere- and temperature-controlled dilatometer, and high-resolution microstructure characterization (using electron channeling contrast imaging, electron backscatter diffraction, atom probe tomography, etc.). We demonstrate the capabilities of this approach by focusing on three cases: (i) nano-scale carbide precipitation during low-temperature tempering of martensitic steels, (ii) formation of transformation-induced geometrically necessary dislocations in a dual-phase steel during intercritical annealing, and (iii) the partial recrystallization of a metastable β-Ti alloy. - Highlights: • A multi-probe method to track microstructures during heat treatment is developed. • It enables the analysis of various complex phenomena, even those at atomistic scale. • It circumvents some of the free surface effects of classical in-situ experiments.

  16. Effect of production management on semen quality during long-term storage in different European boar studs.

    PubMed

    Schulze, M; Kuster, C; Schäfer, J; Jung, M; Grossfeld, R

    2018-03-01

    The processing of ejaculates is a fundamental step for the fertilizing capacity of boar spermatozoa. The aim of the present study was to identify factors that affect the quality of boar semen doses. The production process during one day of semen processing in 26 European boar studs was monitored. In each boar stud, nine to 19 randomly selected ejaculates from 372 Pietrain boars were analyzed for sperm motility, acrosome and plasma membrane integrity, mitochondrial activity and thermo-resistance (TRT). Each ejaculate was monitored for production time and temperature at each step of semen processing using the specially programmed software SEQU (version 1.7, Minitüb, Tiefenbach, Germany). The dilution of ejaculates with a short-term extender was completed in one step in 10 AI centers (n = 135 ejaculates), in two steps in 11 AI centers (n = 158 ejaculates) and in three steps in five AI centers (n = 79 ejaculates). The results indicated greater semen quality with one-step isothermal dilution compared with multi-step dilution of AI semen doses (total motility TRT d7: 71.1 ± 19.2%, 64.6 ± 20.0%, 47.1 ± 27.1% for one-step, two-step and three-step dilution, respectively; P < .05). There was a marked advantage of the one-step isothermal dilution regarding time management, preservation suitability, stability and stress resistance. One-step dilution resulted in significantly lower holding times of raw ejaculates and reduced the risk of mistakes due to the lower number of processing steps. These results lead to refined recommendations for boar semen processing. Copyright © 2018 Elsevier B.V. All rights reserved.

  17. 78 FR 15707 - Fisheries of the Atlantic and Gulf of Mexico; Southeast Data, Assessment, and Review (SEDAR...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-03-12

    ... Southeast Data, Assessment and Review (SEDAR) process, a multi-step method for determining the status of... Center. Participants include: data collectors and database managers; stock assessment scientists...

  18. Cold-in-place recycling in New York State.

    DOT National Transportation Integrated Search

    2010-07-01

    Cold in-place recycling (CIPR) is a continuous multi-step process in which the existing asphalt pavement is : recycled using specialized equipment that cold mills the asphaltic pavement and blends asphalt emulsion and : aggregate (if necessary) with ...

  19. RISC assembly: Coordination between small RNAs and Argonaute proteins.

    PubMed

    Kobayashi, Hotaka; Tomari, Yukihide

    2016-01-01

    Non-coding RNAs generally form ribonucleoprotein (RNP) complexes with their partner proteins to exert their functions. Small RNAs, including microRNAs, small interfering RNAs, and PIWI-interacting RNAs, assemble with Argonaute (Ago) family proteins into the effector complex called RNA-induced silencing complex (RISC), which mediates sequence-specific target gene silencing. RISC assembly is not a simple binding between a small RNA and Ago; rather, it follows an ordered multi-step pathway that requires specific accessory factors. Some steps of RISC assembly and RISC-mediated gene silencing are dependent on or facilitated by particular intracellular platforms, suggesting their spatial regulation. In this review, we summarize the currently known mechanisms for RISC assembly of each small RNA class and propose a revised model for the role of the chaperone machinery in the duplex-initiated RISC assembly pathway. This article is part of a Special Issue entitled: Clues to long noncoding RNA taxonomy1, edited by Dr. Tetsuro Hirose and Dr. Shinichi Nakagawa. Copyright © 2015 Elsevier B.V. All rights reserved.

  20. Multi-step Variable Height Photolithography for Valved Multilayer Microfluidic Devices.

    PubMed

    Brower, Kara; White, Adam K; Fordyce, Polly M

    2017-01-27

    Microfluidic systems have enabled powerful new approaches to high-throughput biochemical and biological analysis. However, there remains a barrier to entry for non-specialists who would benefit greatly from the ability to develop their own microfluidic devices to address research questions. Particularly lacking has been the open dissemination of protocols related to photolithography, a key step in the development of a replica mold for the manufacture of polydimethylsiloxane (PDMS) devices. While the fabrication of single height silicon masters has been explored extensively in literature, fabrication steps for more complicated photolithography features necessary for many interesting device functionalities (such as feature rounding to make valve structures, multi-height single-mold patterning, or high aspect ratio definition) are often not explicitly outlined. Here, we provide a complete protocol for making multilayer microfluidic devices with valves and complex multi-height geometries, tunable for any application. These fabrication procedures are presented in the context of a microfluidic hydrogel bead synthesizer and demonstrate the production of droplets containing polyethylene glycol (PEG diacrylate) and a photoinitiator that can be polymerized into solid beads. This protocol and accompanying discussion provide a foundation of design principles and fabrication methods that enables development of a wide variety of microfluidic devices. The details included here should allow non-specialists to design and fabricate novel devices, thereby bringing a host of recently developed technologies to their most exciting applications in biological laboratories.

  1. Phase shifts in the stoichiometry of rifamycin B fermentation and correlation with the trends in the parameters measured online.

    PubMed

    Bapat, Prashant M; Das, Debasish; Dave, Nishant N; Wangikar, Pramod P

    2006-12-15

    Antibiotic fermentation processes are raw material cost intensive and the profitability is greatly dependent on the product yield per unit substrate consumed. In order to reduce costs, industrial processes use organic nitrogen substrates (ONS) such as corn steep liquor and yeast extract. Thus, although the stoichiometric analysis is the first logical step in process development, it is often difficult to achieve due to the ill-defined nature of the medium. Here, we present a black-box stoichiometric model for rifamycin B production via Amycolatopsis mediterranei S699 fermentation in complex multi-substrate medium. The stoichiometric coefficients have been experimentally evaluated for nine different media compositions. The ONS was quantified in terms of the amino acid content that it provides. Note that the black box stoichiometric model is an overall result of the metabolic reactions that occur during growth. Hence, the observed stoichiometric coefficients are liable to change during the batch cycle. To capture the shifts in stoichiometry, we carried out the stoichiometric analysis over short intervals of 8-16 h in a batch cycle of 100-200 h. An error analysis shows that there are no systematic errors in the measurements and that there are no unaccounted products in the process. The growth stoichiometry shows a shift from one substrate combination to another during the batch cycle. The shifts were observed to correlate well with the shifts in the trends of pH and exit carbon dioxide profiles. To exemplify, the ammonia uptake and nitrate uptake phases were marked by a decreasing pH trend and an increasing pH trend, respectively. Further, we find the product yield per unit carbon substrate to be greatly dependent on the nature of the nitrogen substrate. The analysis presented here can be readily applied to other fermentation systems that employ multi-substrate complex media.
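
    The "no unaccounted products" conclusion above rests on closing an elemental balance over each interval. A minimal sketch of such a carbon-recovery check, with made-up flows (the names and numbers are illustrative, not taken from the paper):

```python
def carbon_recovery(carbon_in, carbon_out):
    """Fraction of carbon fed (C-mol in substrates) that is recovered in
    products, biomass and CO2 over an interval; values close to 1.0 are
    consistent with no unaccounted products or systematic errors."""
    return sum(carbon_out.values()) / sum(carbon_in.values())

# Hypothetical C-mol flows over one 8-16 h interval of the batch
fed = {"glucose": 10.0, "amino_acids": 2.0}
recovered = {"rifamycin_B": 1.5, "biomass": 6.0, "co2": 4.3}
balance = carbon_recovery(fed, recovered)  # ~0.98, i.e. the balance closes
```

A recovery that drifts away from 1.0 in a particular interval would flag either a measurement problem or a product missing from the stoichiometric model.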

  2. 75 FR 59226 - Fisheries of the South Atlantic, Gulf of Mexico, and Caribbean; Southeastern Data, Assessment...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-09-27

    ... the Southeast Data, Assessment and Review (SEDAR) process, a multi-step method for determining the... the South Atlantic, Gulf of Mexico, and Caribbean; Southeastern Data, Assessment, and Review (SEDAR... Committee will meet to discuss the SEDAR assessment schedule, budget, and the SEDAR process. See...

  3. CNC Machining Of The Complex Copper Electrodes

    NASA Astrophysics Data System (ADS)

    Popan, Ioan Alexandru; Balc, Nicolae; Popan, Alina

    2015-07-01

    This paper presents the machining process for complex copper electrodes. Machining complex shapes in copper is difficult because the material is soft and sticky. This research presents the main steps for processing these copper electrodes with high dimensional accuracy and good surface quality. Special tooling solutions are required for this machining process, and optimal process parameters have been found for accurate CNC equipment using smart CAD/CAM software.

  4. Newmark local time stepping on high-performance computing architectures

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rietmann, Max, E-mail: max.rietmann@erdw.ethz.ch; Institute of Geophysics, ETH Zurich; Grote, Marcus, E-mail: marcus.grote@unibas.ch

    In multi-scale complex media, finite element meshes often require areas of local refinement, creating small elements that can dramatically reduce the global time-step for wave-propagation problems due to the CFL condition. Local time stepping (LTS) algorithms allow an explicit time-stepping scheme to adapt the time-step to the element size, allowing near-optimal time-steps everywhere in the mesh. We develop an efficient multilevel LTS-Newmark scheme and implement it in a widely used continuous finite element seismic wave-propagation package. In particular, we extend the standard LTS formulation with adaptations to continuous finite element methods that can be implemented very efficiently with very strong element-size contrasts (more than 100×). Capable of running on large CPU and GPU clusters, we present both synthetic validation examples and large scale, realistic application examples to demonstrate the performance and applicability of the method and implementation on thousands of CPU cores and hundreds of GPUs.
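
    The interplay between the CFL condition and the multilevel grouping can be sketched in a few lines: each element's stable step is proportional to its size, and elements are binned into levels that are advanced with the global step divided by a power of two. This is an illustrative reduction only, not the LTS-Newmark scheme of the paper:

```python
import math

def cfl_timestep(h, c, courant=0.5):
    """Largest stable explicit time step for an element of size h and
    wave speed c, under a CFL-type condition dt <= courant * h / c."""
    return courant * h / c

def lts_levels(element_sizes, wave_speed=1.0, courant=0.5):
    """Assign each element an LTS level p so that it is advanced with
    dt_global / 2**p, where dt_global is the step allowed by the
    coarsest element; fine elements get higher levels."""
    steps = [cfl_timestep(h, wave_speed, courant) for h in element_sizes]
    dt_global = max(steps)
    levels = []
    for dt in steps:
        # smallest p such that dt_global / 2**p <= dt
        p = max(0, math.ceil(math.log2(dt_global / dt)))
        levels.append(p)
    return dt_global, levels
```

With a 100x contrast in element size, the finest elements land on level 7 (2**7 = 128 substeps per global step), while the bulk of the mesh keeps the near-optimal global step.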

  5. Underground structure pattern and multi AO reaction with step feed concept for upgrading a large wastewater treatment plant

    NASA Astrophysics Data System (ADS)

    Peng, Yi; Zhang, Jie; Li, Dong

    2018-03-01

    A large wastewater treatment plant (WWTP) in China, built with a US treatment technology, could no longer meet the demands of the urban environment or the need for reclaimed water. Thus a multi AO (anaerobic/oxic/anoxic/oxic/anoxic/oxic) reaction process with an underground structure was proposed for the upgrade project. Four main new technologies were applied: (1) multi AO reaction with step feed technology; (2) deodorization; (3) new energy-saving technologies such as a water-source heat pump and an optical fiber lighting system; (4) reliable measures to maintain the old WWTP's effluent quality during construction of the new one. After construction, the upgraded WWTP occupied one third of the original land, increased treatment capacity by 80% and improved the effluent standard more than twofold. Moreover, it has become a benchmark for turning an ecological liability into an asset.

  6. TESS Spacecraft Arrival

    NASA Image and Video Library

    2018-02-12

    NASA's Transiting Exoplanet Survey Satellite (TESS) container is pressure washed at the Multi-Payload Processing Facility at the agency's Kennedy Space Center in Florida. TESS will be moved to the Payload Hazardous Servicing Facility to be processed and prepared for flight. TESS is scheduled to launch atop a SpaceX Falcon 9 rocket from Space Launch Complex 40 at Cape Canaveral Air Force Station. TESS is the next step in NASA's search for planets outside our solar system, known as exoplanets. TESS is a NASA Astrophysics Explorer mission led and operated by MIT in Cambridge, Massachusetts, and managed by NASA’s Goddard Space Flight Center in Greenbelt, Maryland. Dr. George Ricker of MIT’s Kavli Institute for Astrophysics and Space Research serves as principal investigator for the mission. Additional partners include Orbital ATK, NASA’s Ames Research Center, the Harvard-Smithsonian Center for Astrophysics and the Space Telescope Science Institute. More than a dozen universities, research institutes and observatories worldwide are participants in the mission. NASA’s Launch Services Program is responsible for launch management.

  7. A multi-approach feature extractions for iris recognition

    NASA Astrophysics Data System (ADS)

    Sanpachai, H.; Settapong, M.

    2014-04-01

    Biometrics is a promising technique used to identify individual traits and characteristics. Iris recognition is one of the most reliable biometric methods. As iris texture and color are fully developed within a year of birth, they remain unchanged throughout a person's life, unlike fingerprints, which can be altered by several factors including accidental damage, dry or oily skin and dust. Although iris recognition has been studied for more than a decade, few commercial products are available due to its demanding requirements, such as camera resolution, hardware size, expensive equipment and computational complexity. At the present time, however, technology has overcome these obstacles. Iris recognition proceeds through several sequential steps: pre-processing, feature extraction, post-processing, and matching. In this paper, we adopted a directional high-low pass filter for feature extraction. A box-counting fractal dimension and iris code have been proposed as feature representations. Our approach has been tested on the CASIA Iris Image database and the results are considered successful.
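
    The box-counting fractal dimension used as a feature representation above can be estimated generically as the slope of log N(s) against log(1/s), where N(s) is the number of boxes of side s that contain iris texture. A sketch of that estimator (a standard formulation, not the authors' exact implementation):

```python
import numpy as np

def box_count(img, box_size):
    """Count boxes of side box_size containing at least one foreground pixel."""
    h, w = img.shape
    count = 0
    for i in range(0, h, box_size):
        for j in range(0, w, box_size):
            if img[i:i + box_size, j:j + box_size].any():
                count += 1
    return count

def box_counting_dimension(img, sizes=(1, 2, 4, 8, 16)):
    """Estimate the fractal dimension of a binary image as the slope of
    log N(s) versus log(1/s) over a range of box sizes s."""
    counts = [box_count(img, s) for s in sizes]
    slope, _ = np.polyfit(np.log(1.0 / np.array(sizes)), np.log(counts), 1)
    return slope
```

As a sanity check, a filled square yields a dimension near 2 and a straight line a dimension near 1; an iris texture pattern falls in between.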

  8. Development of the Fray-Farthing-Chen Cambridge Process: Towards the Sustainable Production of Titanium and Its Alloys

    NASA Astrophysics Data System (ADS)

    Hu, Di; Dolganov, Aleksei; Ma, Mingchan; Bhattacharya, Biyash; Bishop, Matthew T.; Chen, George Z.

    2018-02-01

    The Kroll process has been employed for titanium extraction since the 1950s. It is a labour and energy intensive multi-step semi-batch process. The post-extraction processes for making the raw titanium into alloys and products are also excessive, including multiple remelting steps. Invented in the late 1990s, the Fray-Farthing-Chen (FFC) Cambridge process extracts titanium from solid oxides at lower energy consumption via electrochemical reduction in molten salts. Its ability to produce alloys and powders, while retaining the cathode shape also promises energy and material efficient manufacturing. Focusing on titanium and its alloys, this article reviews the recent development of the FFC-Cambridge process in two aspects, (1) resource and process sustainability and (2) advanced post-extraction processing.

  9. Dynamic rupture simulation of the 2016 Mw 7.8 Kaikoura (New Zealand) earthquake: Is spontaneous multi-fault rupture expected?

    NASA Astrophysics Data System (ADS)

    Ando, R.; Kaneko, Y.

    2017-12-01

    The coseismic rupture of the 2016 Kaikoura earthquake propagated over a distance of 150 km along the NE-SW striking fault system in the northern South Island of New Zealand. The analysis of InSAR, GPS and field observations (Hamling et al., 2017) revealed that most of the rupture occurred along previously mapped active faults, involving more than seven major fault segments. These fault segments, mostly dipping to the northwest, are distributed in a quite complex manner, manifested by fault branching and step-over structures. Back-projection rupture imaging shows that the rupture appears to jump between three sub-parallel fault segments in sequence from south to north (Kaiser et al., 2017). The rupture seems to have terminated on the Needles fault in Cook Strait. One of the main questions is whether this multi-fault rupture can be naturally explained on a physical basis. In order to understand the conditions responsible for the complex rupture process, we conduct fully dynamic rupture simulations that account for 3-D non-planar fault geometry embedded in an elastic half-space. The fault geometry is constrained by previous InSAR observations and geological inferences. The regional stress field is constrained by the result of stress tensor inversion based on focal mechanisms (Balfour et al., 2005). The fault is governed by a relatively simple, slip-weakening friction law. For simplicity, the frictional parameters are uniformly distributed, as there is no direct estimate of them except for a shallow portion of the Kekerengu fault (Kaneko et al., 2017). Our simulations show that the rupture can indeed propagate through the complex fault system once it is nucleated at the southernmost segment. The simulated slip distribution is quite heterogeneous, reflecting the nature of non-planar fault geometry, fault branching and step-over structures.
We find that optimally oriented faults exhibit larger slip, which is consistent with the slip model of Hamling et al. (2017). We conclude that the first order characteristics of this event may be interpreted by the effect of irregularity in the fault geometry.
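
    The slip-weakening friction law mentioned above is conventionally linear: shear strength drops from a static level to a dynamic level as slip accumulates up to a critical distance D_c, and stays at the dynamic level thereafter. A minimal sketch, with illustrative parameter values that are not reported in the abstract:

```python
def slip_weakening_strength(slip, tau_s, tau_d, d_c):
    """Linear slip-weakening friction: shear strength falls linearly from
    the static level tau_s to the dynamic level tau_d as slip accumulates
    up to the critical slip distance d_c, then remains at tau_d."""
    if slip >= d_c:
        return tau_d
    return tau_s - (tau_s - tau_d) * slip / d_c
```

The stress drop (tau_s - tau_d) and d_c together set the fracture energy, which controls whether a dynamic rupture can jump the step-overs between segments.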

  10. 3D second harmonic generation imaging tomography by multi-view excitation

    PubMed Central

    Campbell, Kirby R.; Wen, Bruce; Shelton, Emily M.; Swader, Robert; Cox, Benjamin L.; Eliceiri, Kevin; Campagnola, Paul J.

    2018-01-01

    Biological tissues have complex 3D collagen fiber architecture that cannot be fully visualized by conventional second harmonic generation (SHG) microscopy due to electric dipole considerations. We have developed a multi-view SHG imaging platform that successfully visualizes all orientations of collagen fibers. This is achieved by rotating tissues relative to the excitation laser plane of incidence, where the complete fibrillar structure is then visualized following registration and reconstruction. We evaluated high frequency and Gaussian weighted fusion reconstruction algorithms, and found the former approach performs better in terms of the resulting resolution. The new approach is a first step toward SHG tomography. PMID:29541654

  11. a Global Registration Algorithm of the Single-Closed Ring Multi-Stations Point Cloud

    NASA Astrophysics Data System (ADS)

    Yang, R.; Pan, L.; Xiang, Z.; Zeng, H.

    2018-04-01

    Aimed at the global registration problem of single-closed-ring multi-station point clouds, a formula for calculating the rotation-matrix error was constructed from the definition of error. A global registration algorithm for multi-station point clouds was then derived to minimize this error, and fast formulas for computing the transformation matrix were given, together with their implementation steps and a simulation experiment scheme. Comparing three different processing schemes for multi-station point clouds, the experimental results verified the effectiveness of the new global registration method, which could effectively complete the global registration of the point cloud.
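
    A generic way to quantify the rotation-matrix error around a single closed ring (an illustrative loop-closure metric, not necessarily the formula derived in the paper) is the angle of the residual rotation left after composing the pairwise station-to-station rotations; for a consistent ring the product is the identity:

```python
import numpy as np

def rot_z(theta):
    """Rotation by theta radians about the z axis."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])

def loop_closure_error(rotations):
    """Angle (radians) of the residual rotation obtained by composing the
    pairwise rotations around the closed ring; 0 for a consistent ring."""
    residual = np.eye(3)
    for rot in rotations:
        residual = residual @ rot
    # rotation angle from the trace: tr(R) = 1 + 2*cos(theta)
    cos_theta = np.clip((np.trace(residual) - 1.0) / 2.0, -1.0, 1.0)
    return np.arccos(cos_theta)
```

Global registration then amounts to adjusting the per-station transforms so that this residual angle (and the analogous translation residual) is minimized around the ring.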

  12. Application of multi-target phytotherapeutic concept in malaria drug discovery: a systems biology approach in biomarker identification.

    PubMed

    Tarkang, Protus Arrey; Appiah-Opong, Regina; Ofori, Michael F; Ayong, Lawrence S; Nyarko, Alexander K

    2016-01-01

    There is an urgent need for new anti-malaria drugs with broad therapeutic potential and novel modes of action, for effective treatment and to overcome emerging drug resistance. Plant-derived anti-malarials remain a significant source of bioactive molecules in this regard. The multicomponent formulation forms the basis of phytotherapy. Mechanistic reasons for the poly-pharmacological effects of plants include increased bioavailability, interference with cellular transport processes, activation of pro-drugs or deactivation of active compounds to inactive metabolites, and the action of synergistic partners at different points of the same signaling cascade. These effects are collectively known as the multi-target concept. However, due to the intrinsic complexity of natural products-based drug discovery, there is a need to rethink the approaches toward understanding their therapeutic effects. This review discusses the multi-target phytotherapeutic concept and its application in biomarker identification using the modified reverse pharmacology - systems biology approach. Considerations include the generation of a product library, high throughput screening (HTS) techniques for efficacy and interaction assessment, High Performance Liquid Chromatography (HPLC)-based anti-malarial profiling and animal pharmacology. This approach is an integrated interdisciplinary implementation of tailored technology platforms coupled to miniaturized biological assays, to track and characterize the multi-target bioactive components of botanicals as well as identify potential biomarkers. While preserving biodiversity, this will serve as a primary step towards the development of standardized phytomedicines, as well as facilitate lead discovery for chemical prioritization and downstream clinical development.

  13. C-mii: a tool for plant miRNA and target identification.

    PubMed

    Numnark, Somrak; Mhuantong, Wuttichai; Ingsriswang, Supawadee; Wichadakul, Duangdao

    2012-01-01

    MicroRNAs (miRNAs) have been known to play an important role in several biological processes in both animals and plants. Although several tools for miRNA and target identification are available, the number of tools tailored towards plants is limited, and those that are available have specific functionality, lack graphical user interfaces, and restrict the number of input sequences. Large-scale computational identifications of miRNAs and/or targets of several plants have been also reported. Their methods, however, are only described as flow diagrams, which require programming skills and the understanding of input and output of the connected programs to reproduce. To overcome these limitations and programming complexities, we proposed C-mii as a ready-made software package for both plant miRNA and target identification. C-mii was designed and implemented based on established computational steps and criteria derived from previous literature with the following distinguishing features. First, software is easy to install with all-in-one programs and packaged databases. Second, it comes with graphical user interfaces (GUIs) for ease of use. Users can identify plant miRNAs and targets via step-by-step execution, explore the detailed results from each step, filter the results according to proposed constraints in plant miRNA and target biogenesis, and export sequences and structures of interest. Third, it supplies bird's eye views of the identification results with infographics and grouping information. Fourth, in terms of functionality, it extends the standard computational steps of miRNA target identification with miRNA-target folding and GO annotation. Fifth, it provides helper functions for the update of pre-installed databases and automatic recovery. Finally, it supports multi-project and multi-thread management. 
C-mii constitutes the first complete software package with graphical user interfaces enabling computational identification of both plant miRNA genes and miRNA targets. With the provided functionalities, it can help accelerate the study of plant miRNAs and targets, especially for small and medium plant molecular labs without bioinformaticians. C-mii is freely available at http://www.biotec.or.th/isl/c-mii for both Windows and Ubuntu Linux platforms.

  14. C-mii: a tool for plant miRNA and target identification

    PubMed Central

    2012-01-01

    Background MicroRNAs (miRNAs) have been known to play an important role in several biological processes in both animals and plants. Although several tools for miRNA and target identification are available, the number of tools tailored towards plants is limited, and those that are available have specific functionality, lack graphical user interfaces, and restrict the number of input sequences. Large-scale computational identifications of miRNAs and/or targets of several plants have been also reported. Their methods, however, are only described as flow diagrams, which require programming skills and the understanding of input and output of the connected programs to reproduce. Results To overcome these limitations and programming complexities, we proposed C-mii as a ready-made software package for both plant miRNA and target identification. C-mii was designed and implemented based on established computational steps and criteria derived from previous literature with the following distinguishing features. First, software is easy to install with all-in-one programs and packaged databases. Second, it comes with graphical user interfaces (GUIs) for ease of use. Users can identify plant miRNAs and targets via step-by-step execution, explore the detailed results from each step, filter the results according to proposed constraints in plant miRNA and target biogenesis, and export sequences and structures of interest. Third, it supplies bird's eye views of the identification results with infographics and grouping information. Fourth, in terms of functionality, it extends the standard computational steps of miRNA target identification with miRNA-target folding and GO annotation. Fifth, it provides helper functions for the update of pre-installed databases and automatic recovery. Finally, it supports multi-project and multi-thread management. 
Conclusions C-mii constitutes the first complete software package with graphical user interfaces enabling computational identification of both plant miRNA genes and miRNA targets. With the provided functionalities, it can help accelerate the study of plant miRNAs and targets, especially for small and medium plant molecular labs without bioinformaticians. C-mii is freely available at http://www.biotec.or.th/isl/c-mii for both Windows and Ubuntu Linux platforms. PMID:23281648

  15. Adaptive MPC based on MIMO ARX-Laguerre model.

    PubMed

    Ben Abdelwahed, Imen; Mbarek, Abdelkader; Bouzrara, Kais

    2017-03-01

    This paper proposes a method for synthesizing an adaptive predictive controller using a reduced-complexity model. The latter is obtained by projecting the ARX model onto Laguerre bases. The resulting model, entitled MIMO ARX-Laguerre, is characterized by an easy recursive representation. The adaptive predictive control law is computed based on multi-step-ahead finite-element predictors, identified directly from experimental input/output data. The model is tuned in each iteration by online identification algorithms for both the model parameters and the Laguerre poles. The proposed approach avoids the time-consuming numerical optimization algorithms associated with most common linear predictive control strategies, which makes it suitable for real-time implementation. The method is used to synthesize and test, in numerical simulations, adaptive predictive controllers for the CSTR process benchmark. Copyright © 2016 ISA. Published by Elsevier Ltd. All rights reserved.
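
    The core ingredients, least-squares identification from input/output data and multi-step-ahead prediction that feeds predictions back as regressors, can be sketched for a first-order single-input ARX model (a toy reduction; the paper's MIMO ARX-Laguerre model and online pole adaptation are much richer):

```python
import numpy as np

def fit_arx(y, u):
    """Least-squares fit of a first-order ARX model
    y[k] = a*y[k-1] + b*u[k-1]; returns theta = [a, b]."""
    regressors = np.column_stack([y[:-1], u[:-1]])
    theta, *_ = np.linalg.lstsq(regressors, y[1:], rcond=None)
    return theta

def predict_multi_step(theta, y0, u_future):
    """Multi-step-ahead prediction: each predicted output is fed back
    as the regressor for the next step."""
    a, b = theta
    y_hat, y_prev = [], y0
    for u in u_future:
        y_prev = a * y_prev + b * u
        y_hat.append(y_prev)
    return np.array(y_hat)
```

In MPC, the controller would optimize the future input sequence u over such a multi-step prediction horizon at every sampling instant.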

  16. Computer-controlled multi-parameter mapping of 3D compressible flowfields using planar laser-induced iodine fluorescence

    NASA Technical Reports Server (NTRS)

    Donohue, James M.; Victor, Kenneth G.; Mcdaniel, James C., Jr.

    1993-01-01

    A computer-controlled technique, using planar laser-induced iodine fluorescence, for measuring complex compressible flowfields is presented. A new laser permits the use of a planar two-line temperature technique so that all parameters can be measured with the laser operated narrowband. Pressure and temperature measurements in a step flowfield show agreement within 10 percent of a CFD model except in regions close to walls. Deviation of near wall temperature measurements from the model was decreased from 21 percent to 12 percent compared to broadband planar temperature measurements. Computer-control of the experiment has been implemented, except for the frequency tuning of the laser. Image data storage and processing has been improved by integrating a workstation into the experimental setup reducing the data reduction time by a factor of 50.
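
    The two-line technique referenced above infers temperature from the ratio of fluorescence signals excited through two transitions with different lower-state energies; assuming Boltzmann-distributed populations the ratio varies as exp(-ΔE/kT), which can be inverted for T. A minimal sketch in which the calibration constant and energy gap are hypothetical stand-ins, not instrument values:

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K

def signal_ratio(temperature, delta_e=2.0e-21, c_cal=5.0):
    """Predicted two-line signal ratio at a given temperature (K),
    assuming Boltzmann-distributed lower-state populations.
    delta_e (J) and c_cal are hypothetical calibration values."""
    return c_cal * math.exp(-delta_e / (K_B * temperature))

def temperature_from_ratio(ratio, delta_e=2.0e-21, c_cal=5.0):
    """Invert the measured ratio to recover temperature (K)."""
    return delta_e / (K_B * math.log(c_cal / ratio))
```

In practice c_cal and delta_e come from a calibration against a known-temperature flow, and the ratio is formed pixel-by-pixel from the two planar images.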

  17. Multi-Dimensional Scaling based grouping of known complexes and intelligent protein complex detection.

    PubMed

    Rehman, Zia Ur; Idris, Adnan; Khan, Asifullah

    2018-06-01

    Protein-Protein Interactions (PPI) play a vital role in cellular processes and are formed because of thousands of interactions among proteins. Advancements in proteomics technologies have resulted in huge PPI datasets that need to be systematically analyzed. Protein complexes are the locally dense regions in PPI networks, which play an important role in metabolic pathways and gene regulation. In this work, a novel two-phase protein complex detection and grouping mechanism is proposed. In the first phase, topological and biological features are extracted for each complex, and prediction performance is investigated using a Bagging-based Ensemble classifier (PCD-BEns). Performance evaluation through cross-validation shows improvement in comparison to the CDIP, MCode, CFinder and PLSMC methods. The second phase employs Multi-Dimensional Scaling (MDS) for the grouping of known complexes by exploring inter-complex relations. It is experimentally observed that the combination of topological and biological features in the proposed approach greatly enhances prediction performance for protein complex detection, which may help to understand various biological processes, whereas the application of MDS-based exploration may assist in grouping potentially similar complexes. Copyright © 2018 Elsevier Ltd. All rights reserved.
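
    Classical (Torgerson) MDS, one standard formulation of the MDS step, embeds items from a pairwise-distance matrix via double centering and an eigendecomposition; a minimal numpy sketch (a generic implementation, not the authors' code):

```python
import numpy as np

def classical_mds(D, k=2):
    """Classical (Torgerson) MDS: embed n items in k dimensions from an
    n x n matrix D of pairwise distances. Double-center the squared
    distances, then keep the top-k eigenpairs of the resulting Gram matrix."""
    n = D.shape[0]
    J = np.eye(n) - np.ones((n, n)) / n          # centering matrix
    B = -0.5 * J @ (D ** 2) @ J                   # Gram matrix
    w, V = np.linalg.eigh(B)                      # ascending eigenvalues
    idx = np.argsort(w)[::-1][:k]                 # take the k largest
    scale = np.sqrt(np.clip(w[idx], 0.0, None))
    return V[:, idx] * scale
```

For distances that are exactly Euclidean the embedding reproduces them; grouping of complexes can then proceed on the low-dimensional coordinates.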

  18. Supercomputing with TOUGH2 family codes for coupled multi-physics simulations of geologic carbon sequestration

    NASA Astrophysics Data System (ADS)

    Yamamoto, H.; Nakajima, K.; Zhang, K.; Nanai, S.

    2015-12-01

    Powerful numerical codes that are capable of modeling complex coupled processes of physics and chemistry have been developed for predicting the fate of CO2 in reservoirs as well as its potential impacts on groundwater and subsurface environments. However, they are often computationally demanding for solving highly non-linear models in sufficient spatial and temporal resolutions. Geological heterogeneity and uncertainties further increase the challenges in modeling work. Two-phase flow simulations in heterogeneous media usually require much longer computational time than those in homogeneous media. Uncertainties in reservoir properties may necessitate stochastic simulations with multiple realizations. Recently, massively parallel supercomputers with more than thousands of processors have become available in scientific and engineering communities. Such supercomputers may attract attention from geoscientists and reservoir engineers for solving large, non-linear models in higher resolutions within a reasonable time. However, to make them a useful tool, it is essential to tackle several practical obstacles to utilizing large numbers of processors effectively in general-purpose reservoir simulators. We have implemented massively-parallel versions of two TOUGH2 family codes (the multi-phase flow simulator TOUGH2 and the chemically reactive transport simulator TOUGHREACT) on two different types (vector- and scalar-type) of supercomputers with a thousand to tens of thousands of processors. After completing implementation and extensive tune-up on the supercomputers, the computational performance was measured for three simulations with multi-million grid models, including a simulation of the dissolution-diffusion-convection process that requires high spatial and temporal resolutions to simulate the growth of small convective fingers of CO2-dissolved water to larger ones at reservoir scale. 
The performance measurement confirmed that both simulators exhibit excellent scalability, showing almost linear speedup with the number of processors up to over ten thousand cores. This generally allows us to perform coupled multi-physics (THC) simulations on high-resolution geologic models with multi-million grids in a practical time (e.g., less than a second per time step).
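
    The reported near-linear speedup corresponds to a strong-scaling parallel efficiency close to 1; the metric itself is simple to compute (the timings below are illustrative, not the paper's measurements):

```python
def strong_scaling_efficiency(t_base, p_base, t_p, p):
    """Strong-scaling efficiency of a run taking t_p seconds on p cores,
    relative to a baseline taking t_base seconds on p_base cores.
    1.0 means doubling the cores exactly halves the wall-clock time."""
    return (t_base * p_base) / (t_p * p)

# Hypothetical timings: 1000 s on 128 cores vs. 135 s on 1024 cores
eff = strong_scaling_efficiency(1000.0, 128, 135.0, 1024)  # ~0.93
```

Efficiencies well below 1 at high core counts usually indicate communication or load-balancing overheads starting to dominate.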

  19. Sealed-bladdered chemical processing method and apparatus

    DOEpatents

    Harless, D. Phillip

    1999-01-01

    A method and apparatus which enables a complete multi-stepped chemical treatment process to occur within a single, sealed-bladdered vessel 31. The entire chemical process occurs without interruption of the sealed-bladdered vessel 31 such as opening the sealed-bladdered vessel 31 between various steps of the process. The sealed-bladdered vessel 31 is loaded with a batch to be dissolved, treated, decanted, rinsed and/or dried. A pressure filtration step may also occur. The self-contained chemical processing apparatus 32 contains a sealed-bladder 32, a fluid pump 34, a reservoir 20, a compressed gas inlet, a vacuum pump 24, and a cold trap 23 as well as the associated piping 33, numerous valves 21,22,25,26,29,30,35,36 and other controls associated with such an apparatus. The claimed invention allows for dissolution and/or chemical treatment without the operator of the self-contained chemical processing apparatus 38 coming into contact with any of the process materials.

  20. Tungsten oxide--fly ash oxide composites in adsorption and photocatalysis.

    PubMed

    Visa, Maria; Bogatu, Cristina; Duta, Anca

    2015-05-30

    A novel composite based on tungsten oxide and fly ash was hydrothermally synthesized for use as a substrate in the advanced treatment of wastewaters with the complex load resulting from the textile industry. The proposed treatment consists of a single-step process combining photocatalysis and adsorption. The composite's crystalline structure was investigated by X-ray diffraction and FTIR, while atomic force microscopy (AFM) and scanning electron microscopy (SEM) were used to analyze the morphology. The adsorption capacity and photocatalytic properties of the material were tested on mono- and multi-pollutant systems containing two dyes (Bemacid Blau - BB and Bemacid Rot - BR) and one heavy metal ion, Cu(2+), and the optimized process conditions were identified. The results indicate better removal efficiencies using the novel composite material in the combined adsorption and photocatalysis than in the separate processes. Dye removal was significantly enhanced in the photocatalytic process by adding hydrogen peroxide, and the mechanism is presented and discussed. The pseudo-second-order kinetic model best fitted the experimental data, both in the adsorption and in the combined processes. The kinetic parameters were calculated and correlated with the properties of the composite substrate. Copyright © 2015 Elsevier B.V. All rights reserved.
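    The pseudo-second-order kinetic model mentioned above has the integrated form q(t) = k·qe²·t / (1 + k·qe·t), where qe is the equilibrium uptake and k the rate constant. The sketch below only illustrates the model's shape; the parameter values are hypothetical, not the paper's fitted constants.

```python
def pso_uptake(t, qe, k):
    """Integrated pseudo-second-order kinetic model:
    q(t) = k*qe^2*t / (1 + k*qe*t).
    qe: equilibrium uptake (e.g. mg/g), k: rate constant."""
    return (k * qe ** 2 * t) / (1.0 + k * qe * t)

# Illustrative (not measured) parameters: qe = 10 mg/g, k = 0.01.
# At t = 1/(k*qe) the uptake reaches exactly half of qe.
print(round(pso_uptake(10.0, 10.0, 0.01), 2))  # → 5.0
```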

  1. Cellulose Biosynthesis: Current Views and Evolving Concepts

    PubMed Central

    SAXENA, INDER M.; BROWN, R. MALCOLM

    2005-01-01

    • Aims To outline the current state of knowledge and discuss the evolution of various viewpoints put forth to explain the mechanism of cellulose biosynthesis. • Scope Understanding the mechanism of cellulose biosynthesis is one of the major challenges in plant biology. The simplicity in the chemical structure of cellulose belies the complexities that are associated with the synthesis and assembly of this polysaccharide. Assembly of cellulose microfibrils in most organisms is visualized as a multi-step process involving a number of proteins with the key protein being the cellulose synthase catalytic sub-unit. Although genes encoding this protein have been identified in almost all cellulose synthesizing organisms, it has been a challenge in general, and more specifically in vascular plants, to demonstrate cellulose synthase activity in vitro. The assembly of glucan chains into cellulose microfibrils of specific dimensions, viewed as a spontaneous process, necessitates the assembly of synthesizing sites unique to most groups of organisms. The steps of polymerization (requiring the specific arrangement and activity of the cellulose synthase catalytic sub-units) and crystallization (directed self-assembly of glucan chains) are certainly interlinked in the formation of cellulose microfibrils. Mutants affected in cellulose biosynthesis have been identified in vascular plants. Studies on these mutants and herbicide-treated plants suggest an interesting link between the steps of polymerization and crystallization during cellulose biosynthesis. • Conclusions With the identification of a large number of genes encoding cellulose synthases and cellulose synthase-like proteins in vascular plants and the supposed role of a number of other proteins in cellulose biosynthesis, a complete understanding of this process will necessitate a wider variety of research tools and approaches than was thought to be required a few years back. PMID:15894551

  2. Cellulose biosynthesis: current views and evolving concepts.

    PubMed

    Saxena, Inder M; Brown, R Malcolm

    2005-07-01

    • Aims To outline the current state of knowledge and discuss the evolution of various viewpoints put forth to explain the mechanism of cellulose biosynthesis. • Scope Understanding the mechanism of cellulose biosynthesis is one of the major challenges in plant biology. The simplicity in the chemical structure of cellulose belies the complexities that are associated with the synthesis and assembly of this polysaccharide. Assembly of cellulose microfibrils in most organisms is visualized as a multi-step process involving a number of proteins with the key protein being the cellulose synthase catalytic sub-unit. Although genes encoding this protein have been identified in almost all cellulose synthesizing organisms, it has been a challenge in general, and more specifically in vascular plants, to demonstrate cellulose synthase activity in vitro. The assembly of glucan chains into cellulose microfibrils of specific dimensions, viewed as a spontaneous process, necessitates the assembly of synthesizing sites unique to most groups of organisms. The steps of polymerization (requiring the specific arrangement and activity of the cellulose synthase catalytic sub-units) and crystallization (directed self-assembly of glucan chains) are certainly interlinked in the formation of cellulose microfibrils. Mutants affected in cellulose biosynthesis have been identified in vascular plants. Studies on these mutants and herbicide-treated plants suggest an interesting link between the steps of polymerization and crystallization during cellulose biosynthesis. • Conclusions With the identification of a large number of genes encoding cellulose synthases and cellulose synthase-like proteins in vascular plants and the supposed role of a number of other proteins in cellulose biosynthesis, a complete understanding of this process will necessitate a wider variety of research tools and approaches than was thought to be required a few years back.

  3. Extrinsic Repair of Injured Dendrites as a Paradigm for Regeneration by Fusion in Caenorhabditis elegans.

    PubMed

    Oren-Suissa, Meital; Gattegno, Tamar; Kravtsov, Veronika; Podbilewicz, Benjamin

    2017-05-01

    Injury triggers regeneration of axons and dendrites. Research has identified factors required for axonal regeneration outside the CNS, but little is known about regeneration triggered by dendrotomy. Here, we study neuronal plasticity triggered by dendrotomy and determine the fate of complex PVD arbors following laser surgery of dendrites. We find that severed primary dendrites grow toward each other and reconnect via branch fusion. Simultaneously, terminal branches lose self-avoidance and grow toward each other, meeting and fusing at the tips via an AFF-1-mediated process. Ectopic branch growth is identified as a step in the regeneration process required for bypassing the lesion site. Failure of reconnection to the severed dendrites results in degeneration of the distal end of the neuron. We discover pruning of excess branches via EFF-1, which acts to recover the original wild-type arborization pattern in a late stage of the process. In contrast, AFF-1 activity during dendritic auto-fusion is derived from the lateral seam cells and not autonomously from the PVD neuron. We propose a model in which AFF-1 vesicles derived from the epidermal seam cells fuse neuronal dendrites. Thus, EFF-1 and AFF-1 fusion proteins emerge as new players in neuronal arborization and maintenance of arbor connectivity following injury in Caenorhabditis elegans. Our results demonstrate that there is a genetically determined multi-step pathway to repair broken dendrites in which EFF-1 and AFF-1 act on different steps of the pathway. EFF-1 is essential for dendritic pruning after injury and extrinsic AFF-1 mediates dendrite fusion to bypass injuries. Copyright © 2017 by the Genetics Society of America.

  4. Application of pulsed multi-ion irradiations in radiation damage research: A stochastic cluster dynamics simulation study

    NASA Astrophysics Data System (ADS)

    Hoang, Tuan L.; Nazarov, Roman; Kang, Changwoo; Fan, Jiangyuan

    2018-07-01

    Under the multi-ion irradiation conditions present in accelerated material-testing facilities or fission/fusion nuclear reactors, the combined effects of atomic displacements and radiation products may induce complex synergies in structural materials. However, limited access to multi-ion irradiation facilities and the lack of computational models capable of simulating the evolution of complex defects and their synergies make it difficult to understand the actual physical processes taking place in materials under these extreme conditions. In this paper, we propose the application of pulsed single/dual-beam irradiation as a replacement for expensive steady-state triple-beam irradiation to study radiation damage in materials under multi-ion irradiation conditions.

  5. XUV-induced reactions in benzene on sub-10 fs timescale: nonadiabatic relaxation and proton migration.

    PubMed

    Galbraith, M C E; Smeenk, C T L; Reitsma, G; Marciniak, A; Despré, V; Mikosch, J; Zhavoronkov, N; Vrakking, M J J; Kornilov, O; Lépine, F

    2017-08-02

    Unraveling ultrafast dynamical processes in highly excited molecular species has an impact on our understanding of chemical processes such as combustion or the chemical composition of molecular clouds in the universe. In this article we use short (<7 fs) XUV pulses to produce excited cationic states of benzene molecules and probe their dynamics using few-cycle VIS/NIR laser pulses. The excited states produced by the XUV pulses lie in an especially complex spectral region where multi-electronic effects play a dominant role. We show that very fast (τ ≈ 20 fs) nonadiabatic processes dominate the relaxation of these states, in agreement with the timescale expected for most excited cationic states in benzene. In the CH3+ fragmentation channel of the doubly ionized benzene cation we identify pathways that involve structural rearrangement and proton migration to a specific carbon atom. Further, we observe non-trivial transient behavior in this fragment channel, which can be interpreted either in terms of propagation of the nuclear wavepacket in the initially excited electronic state of the cation or as a two-step electronic relaxation via an intermediate state.

  6. Multi-focus and multi-level techniques for visualization and analysis of networks with thematic data

    NASA Astrophysics Data System (ADS)

    Cossalter, Michele; Mengshoel, Ole J.; Selker, Ted

    2013-01-01

    Information-rich data sets bring several challenges in the areas of visualization and analysis, even when associated with node-link network visualizations. This paper presents an integration of multi-focus and multi-level techniques that enables interactive, multi-step comparisons in node-link networks. We describe NetEx, a visualization tool that enables users to simultaneously explore different parts of a network and its thematic data, such as time series or conditional probability tables. NetEx, implemented as a Cytoscape plug-in, has been applied to the analysis of electrical power networks, Bayesian networks, and the Enron e-mail repository. In this paper we briefly discuss visualization and analysis of the Enron social network, but focus on data from an electrical power network. Specifically, we demonstrate how NetEx supports the analytical task of electrical power system fault diagnosis. Results from a user study with 25 subjects suggest that NetEx enables more accurate isolation of complex faults compared to a specially designed software tool.

  7. Multi-step optimization strategy for fuel-optimal orbital transfer of low-thrust spacecraft

    NASA Astrophysics Data System (ADS)

    Rasotto, M.; Armellin, R.; Di Lizia, P.

    2016-03-01

    An effective method for the design of fuel-optimal transfers in two- and three-body dynamics is presented. The optimal control problem is formulated using the calculus of variations and primer vector theory. This leads to a multi-point boundary value problem (MPBVP), characterized by complex inner constraints and a discontinuous thrust profile. The first issue is addressed by embedding the MPBVP in a parametric optimization problem, thus allowing a simplification of the set of transversality constraints. The second problem is solved by representing the discontinuous control function by a smooth function depending on a continuation parameter. The resulting trajectory optimization method can deal with different intermediate conditions, and no a priori knowledge of the control structure is required. Test cases in both two- and three-body dynamics show the capability of the method in solving complex trajectory design problems.
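    The smoothing-by-continuation idea described above replaces the discontinuous (bang-bang) thrust profile with a smooth function of a continuation parameter. The sigmoid form and names below are illustrative assumptions, not the authors' exact formulation: as the parameter eps shrinks toward zero, the smooth control approaches the on/off profile driven by the switching function.

```python
import math

def smoothed_control(switching_fn, eps, u_max=1.0):
    """Smooth stand-in for a bang-bang thrust profile: a sigmoid of the
    switching function value, approaching discontinuous on/off control
    as the continuation parameter eps -> 0."""
    return u_max / (1.0 + math.exp(-switching_fn / eps))

# Far from the switching surface the control saturates...
print(round(smoothed_control(1.0, 0.01), 6))   # ~1.0 (full thrust)
print(round(smoothed_control(-1.0, 0.01), 6))  # ~0.0 (coast)
# ...while a larger eps gives the solver a gentler profile to follow:
print(round(smoothed_control(0.1, 0.5), 2))    # → 0.55
```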

  8. Automatic Road Gap Detection Using Fuzzy Inference System

    NASA Astrophysics Data System (ADS)

    Hashemi, S.; Valadan Zoej, M. J.; Mokhtarzadeh, M.

    2011-09-01

    Automatic feature extraction from aerial and satellite images is a high-level data processing task which is still one of the most important research topics of the field. In this area, most research is focused on the early step of road detection, where road tracking methods, morphological analysis, dynamic programming and snakes, multi-scale and multi-resolution methods, stereoscopic and multi-temporal analysis, and hyperspectral experiments are some of the mature methods. Although most research is focused on detection algorithms, none of them can extract the road network perfectly. On the other hand, post-processing algorithms, aimed at refining road detection results, are not as well developed. In this article, the main aim is to design an intelligent method to detect and compensate road gaps remaining in the early results of road detection algorithms. The proposed algorithm consists of five main steps, as follows: 1) Short gap coverage: a multi-scale morphological operation is designed that covers short gaps in a hierarchical scheme. 2) Long gap detection: the long gaps that could not be covered in the previous stage are detected using a fuzzy inference system. For this purpose, a knowledge base consisting of expert rules is designed and fired on gap candidates from the road detection results. 3) Long gap coverage: detected long gaps are compensated by two strategies, linear and polynomial; shorter gaps are filled by line fitting, while longer ones are compensated by polynomials. 4) Accuracy assessment: in order to evaluate the obtained results, some accuracy assessment criteria are proposed. These criteria are obtained by comparing the obtained results with correctly compensated ones produced by a human expert. The complete evaluation of the obtained results, with their technical discussion, is the material of the full paper.
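    The "linear" gap-coverage strategy in step 3 amounts to fitting a line through detected road-centerline points on both sides of a gap and interpolating across it. A minimal least-squares sketch; the point coordinates and function name are hypothetical, not taken from the paper:

```python
def fit_line(points):
    """Ordinary least-squares line y = a*x + b through (x, y) points,
    e.g. road-centerline samples on both sides of a short gap."""
    n = len(points)
    sx = sum(x for x, _ in points)
    sy = sum(y for _, y in points)
    sxx = sum(x * x for x, _ in points)
    sxy = sum(x * y for x, y in points)
    a = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    b = (sy - a * sx) / n
    return a, b

# Hypothetical road samples on either side of a gap at x in (2, 4):
pts = [(0, 0.0), (1, 0.5), (2, 1.0), (4, 2.0), (5, 2.5)]
a, b = fit_line(pts)
print(a, b)  # fitted line bridges the gap, e.g. y(3) = a*3 + b = 1.5
```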

  9. A Dimensionality Reduction-Based Multi-Step Clustering Method for Robust Vessel Trajectory Analysis

    PubMed Central

    Liu, Jingxian; Wu, Kefeng

    2017-01-01

    The Shipboard Automatic Identification System (AIS) is crucial for navigation safety and maritime surveillance, and data mining and pattern analysis of AIS information have attracted considerable attention in terms of both basic research and practical applications. Clustering of spatio-temporal AIS trajectories can be used to identify abnormal patterns and mine customary route data for transportation safety, thereby enhancing both navigation safety and maritime traffic monitoring. However, trajectory clustering is often sensitive to undesirable outliers and is essentially more complex than traditional point clustering. To overcome this limitation, a multi-step trajectory clustering method is proposed in this paper for robust AIS trajectory clustering. In particular, Dynamic Time Warping (DTW), a similarity measurement method, is introduced in the first step to measure the distances between different trajectories. The calculated distances, inversely proportional to the similarities, constitute a distance matrix in the second step. Furthermore, Principal Component Analysis (PCA), a widely used dimensionality reduction method, is exploited to decompose the obtained distance matrix; the top k principal components with a cumulative contribution rate above 95% are extracted, and the number of centers k is chosen accordingly. The k centers are found by the improved automatic center selection algorithm. In the last step, the improved center clustering algorithm with k clusters is applied to the distance matrix to achieve the final AIS trajectory clustering results. To improve the accuracy of the proposed multi-step clustering algorithm, an automatic algorithm for choosing the k clusters is developed according to the similarity distance. 
Numerous experiments on realistic AIS trajectory datasets in a bridge-area waterway and the Mississippi River were conducted to compare the proposed method with traditional spectral clustering and fast affinity propagation clustering. Experimental results illustrate its superior performance in terms of both quantitative and qualitative evaluations. PMID:28777353
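    The first two steps of the pipeline, computing pairwise DTW distances and assembling them into a distance matrix, can be sketched as follows. This is a minimal 1-D stand-in (the paper's trajectories are spatio-temporal, and its PCA and center-selection steps are omitted here):

```python
def dtw(a, b):
    """Dynamic Time Warping distance between two 1-D sequences,
    a minimal stand-in for the trajectory similarity measure."""
    INF = float("inf")
    n, m = len(a), len(b)
    D = [[INF] * (m + 1) for _ in range(n + 1)]
    D[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            D[i][j] = cost + min(D[i - 1][j],      # insertion
                                 D[i][j - 1],      # deletion
                                 D[i - 1][j - 1])  # match
    return D[n][m]

# Steps 1-2: pairwise DTW distances form the distance matrix.
trajs = [[0, 1, 2, 3], [0, 1, 1, 2, 3], [9, 8, 7, 6]]
M = [[dtw(p, q) for q in trajs] for p in trajs]
# The first two trajectories align perfectly despite unequal lengths,
# while the third is far from both.
print(M[0][1], M[0][2])  # → 0.0 24.0
```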

  10. (Semi-)Automated landform mapping of the alpine valley Gradental (Austria) based on LiDAR data

    NASA Astrophysics Data System (ADS)

    Strasser, T.; Eisank, C.

    2012-04-01

    Alpine valleys are typically characterised as complex, hierarchically structured systems with rapid landform changes. Detection of landform changes can be supported by automated geomorphological mapping. In particular, analysis over short time scales requires a method for standardised, unbiased geomorphological map reproduction, which is delivered by automated mapping techniques. In general, digital geomorphological mapping is a challenging task, since knowledge about landforms, with respect to their natural boundaries as well as their hierarchical and scaling relationships, has to be integrated in an objective way. A combination of very-high-spatial-resolution (VHSR) data such as LiDAR and new methods like object-based image analysis (OBIA) allows for a more standardised production of geomorphological maps. In OBIA the processing units are spatially configured objects that are created by multi-scale segmentation. Therefore, not only spectral information can be used for assigning the objects to geomorphological classes, but also spatial and topological properties can be exploited. In this study we focus on the detection of landforms, especially bedrock sediment deposits (alluvium, debris cones, talus, moraines, rock glaciers), as well as glaciers. The study site Gradental [N 46°58'29.1"/ E 12°48'53.8"] is located in the Schobergruppe (Austria, Carinthia) and is characterised by heterogeneous geological conditions and high process activity. The area is difficult to access and dominated by steep slopes, thus hindering fast and detailed geomorphological field mapping. Landforms are identified using aerial and terrestrial LiDAR data (1 m spatial resolution). These DEMs are analysed by an object-based hierarchical approach, which is structured in three main steps. The first step is to define the occurring landforms by basic land surface parameters (LSPs), topology and hierarchy relations. Based on these definitions a semantic model is created. 
Secondly, a multi-scale segmentation is performed on a three-band LSP layer that integrates slope, aspect and plan curvature, which express the driving forces of geomorphological processes. In the third step, the generated multi-level object structures are classified in order to produce the geomorphological map. The classification rules are derived from the semantic model. Due to landform-type-specific scale dependencies of LSPs, the values of LSPs used in the classification are calculated in a multi-scale manner by constantly enlarging the size of the moving window. In addition, object form properties (density, compactness, rectangular fit) are utilised as additional information for landform characterisation. Validation is performed by intersecting a visually interpreted reference map with the classification output map and calculating accuracy matrices. Validation shows an overall accuracy of 78.25% and a Kappa of 0.65. The natural borders of landforms can easily be detected using slope, aspect and plan curvature. This study illustrates the potential of OBIA for more standardised and automated mapping of surface units (landforms, land cover). The presented methodology therefore offers a prospective automated geomorphological mapping approach for alpine regions.

  11. Erosion and deposition by supercritical density flows during channel avulsion and backfilling: Field examples from coarse-grained deepwater channel-levée complexes (Sandino Forearc Basin, southern Central America)

    NASA Astrophysics Data System (ADS)

    Lang, Jörg; Brandes, Christian; Winsemann, Jutta

    2017-03-01

    Erosion and deposition by supercritical density flows can strongly impact the facies distribution and architecture of submarine fans. Field examples from coarse-grained channel-levée complexes from the Sandino Forearc Basin (southern Central America) show that cyclic-step and antidune deposits represent common sedimentary facies of these depositional systems and relate to the different stages of avulsion, bypass, levée construction and channel backfilling. During channel avulsion, large-scale scour-fill complexes (18 to 29 m deep, 18 to 25 m wide, 60 to > 120 m long) were incised by supercritical density flows. The multi-storey infill of the large-scale scour-fill complexes comprises amalgamated massive, normally coarse-tail graded or widely spaced subhorizontally stratified conglomerates and pebbly sandstones, interpreted as deposits of the hydraulic-jump zone of cyclic steps. The large-scale scour-fill complexes can be distinguished from small-scale channel fills based on the preservation of a steep upper margin and a coarse-grained infill comprising mainly amalgamated hydraulic-jump zone deposits. Channel fills include repeated successions deposited by cyclic steps with superimposed antidunes. The deposits of the hydraulic-jump zone of cyclic steps comprise regularly spaced scours (0.2 to 2.6 m deep, 0.8 to 23 m long) infilled by intraclast-rich conglomerates or pebbly sandstones, displaying normal coarse-tail grading or backsets. These deposits are laterally and vertically associated with subhorizontally stratified, low-angle cross-stratified or sinusoidally stratified sandstones and pebbly sandstones, which were deposited by antidunes on the stoss side of the cyclic steps during flow re-acceleration. The field examples indicate that so-called spaced stratified deposits may commonly represent antidune deposits with varying stratification styles controlled by the aggradation rate, grain-size distribution and amalgamation. 
The deposits of small-scale cyclic steps with superimposed antidunes form fining-upwards successions with decreasing antidune wavelengths, indicating waning flows. Such cyclic step-antidune successions form the characteristic basal infill of mid-fan channels, and are inferred to be related to successive supercritical high-density turbidity flows triggered by retrogressive slope failures.

  12. Extremely stable soluble high molecular mass multi-protein complex with DNase activity in human placental tissue.

    PubMed

    Burkova, Evgeniya E; Dmitrenok, Pavel S; Sedykh, Sergey E; Buneva, Valentina N; Soboleva, Svetlana E; Nevinsky, Georgy A

    2014-01-01

    Human placenta is an organ which protects, feeds, and regulates the growth of the embryo. Therefore, identification and characterization of placental components, including proteins and their multi-protein complexes, is an important step toward understanding placental function. We have obtained and analyzed for the first time an extremely stable multi-protein complex (SPC, ∼1000 kDa) from the soluble fraction of three human placentas. By gel filtration on Sepharose-4B, the SPC was well separated from other proteins of the placenta extract. Light scattering measurements and gel filtration showed that the SPC is stable in the presence of NaCl, MgCl2, acetonitrile, guanidinium chloride, and Triton at high concentrations, but dissociates efficiently in the presence of 8 M urea, 50 mM EDTA, and 0.5 M NaCl. Such a stable complex is unlikely to be an accidental association of different proteins. According to SDS-PAGE and MALDI mass spectrometry data, this complex contains many major glycosylated proteins with low and moderate molecular masses (MMs) of 4-14 kDa and several moderately abundant (79.3, 68.5, 52.8, and 27.2 kDa) as well as minor proteins with higher MMs. Treatment of the SPC with dithiothreitol led to the disappearance of some protein bands and revealed proteins with lower MMs. The SPCs from three placentas efficiently hydrolyzed supercoiled plasmid DNA at comparable rates and possess at least two DNA-binding sites with different affinities for a 12-mer oligonucleotide. Progress in the study of placental protein complexes can promote understanding of their biological functions.

  13. Non-aqueous Electrode Processing and Construction of Lithium-ion Coin Cells.

    PubMed

    Stein, Malcolm; Chen, Chien-Fan; Robles, Daniel J; Rhodes, Christopher; Mukherjee, Partha P

    2016-02-01

    Research into new and improved materials to be utilized in lithium-ion batteries (LIB) necessitates an experimental counterpart to any computational analysis. Testing of lithium-ion batteries in an academic setting has taken on several forms, but at the most basic level lies the coin cell construction. In traditional LIB electrode preparation, a multi-phase slurry composed of active material, binder, and conductive additive is cast out onto a substrate. An electrode disc can then be punched from the dried sheet and used in the construction of a coin cell for electrochemical evaluation. Utilization of the potential of the active material in a battery is critically dependent on the microstructure of the electrode, as an appropriate distribution of the primary components is crucial to ensuring optimal electrical conductivity, porosity, and tortuosity, such that electrochemical and transport interaction is optimized. Processing steps ranging from the combination of dry powder, wet mixing, and drying can all critically affect multi-phase interactions that influence the microstructure formation. Electrochemical probing necessitates the construction of electrodes and coin cells with the utmost care and precision. This paper aims at providing a step-by-step guide to non-aqueous electrode processing and coin cell construction for lithium-ion batteries within an academic setting, with emphasis on deciphering the influence of drying and calendering.

  14. Non-aqueous Electrode Processing and Construction of Lithium-ion Coin Cells

    PubMed Central

    Stein, Malcolm; Chen, Chien-Fan; Robles, Daniel J.; Rhodes, Christopher; Mukherjee, Partha P.

    2016-01-01

    Research into new and improved materials to be utilized in lithium-ion batteries (LIB) necessitates an experimental counterpart to any computational analysis. Testing of lithium-ion batteries in an academic setting has taken on several forms, but at the most basic level lies the coin cell construction. In traditional LIB electrode preparation, a multi-phase slurry composed of active material, binder, and conductive additive is cast out onto a substrate. An electrode disc can then be punched from the dried sheet and used in the construction of a coin cell for electrochemical evaluation. Utilization of the potential of the active material in a battery is critically dependent on the microstructure of the electrode, as an appropriate distribution of the primary components is crucial to ensuring optimal electrical conductivity, porosity, and tortuosity, such that electrochemical and transport interaction is optimized. Processing steps ranging from the combination of dry powder, wet mixing, and drying can all critically affect multi-phase interactions that influence the microstructure formation. Electrochemical probing necessitates the construction of electrodes and coin cells with the utmost care and precision. This paper aims at providing a step-by-step guide to non-aqueous electrode processing and coin cell construction for lithium-ion batteries within an academic setting, with emphasis on deciphering the influence of drying and calendering. PMID:26863503

  15. Using Complex Event Processing (CEP) and vocal synthesis techniques to improve comprehension of sonified human-centric data

    NASA Astrophysics Data System (ADS)

    Rimland, Jeff; Ballora, Mark

    2014-05-01

    The field of sonification, which uses auditory presentation of data to replace or augment visualization techniques, is gaining popularity and acceptance for analysis of "big data" and for assisting analysts who are unable to utilize traditional visual approaches due to either: 1) visual overload caused by existing displays; 2) concurrent need to perform critical visually intensive tasks (e.g. operating a vehicle or performing a medical procedure); or 3) visual impairment due to either temporary environmental factors (e.g. dense smoke) or biological causes. Sonification tools typically map data values to sound attributes such as pitch, volume, and localization to enable them to be interpreted via human listening. In more complex problems, the challenge is in creating multi-dimensional sonifications that are both compelling and listenable, and that have enough discrete features that can be modulated in ways that allow meaningful discrimination by a listener. We propose a solution to this problem that incorporates Complex Event Processing (CEP) with speech synthesis. Some of the more promising sonifications to date use speech synthesis, which is an "instrument" that is amenable to extended listening, and can also provide a great deal of subtle nuance. These vocal nuances, which can represent a nearly limitless number of expressive meanings (via a combination of pitch, inflection, volume, and other acoustic factors), are the basis of our daily communications, and thus have the potential to engage the innate human understanding of these sounds. Additionally, recent advances in CEP have facilitated the extraction of multi-level hierarchies of information, which is necessary to bridge the gap between raw data and this type of vocal synthesis. We therefore propose that CEP-enabled sonifications based on the sound of human utterances could be considered the next logical step in human-centric "big data" compression and transmission.
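    The basic parameter-mapping step underlying such sonifications maps data values onto sound attributes such as pitch. The sketch below uses a logarithmic (perceptually even) frequency mapping; the function name, frequency range, and mapping choice are illustrative assumptions, not the authors' design:

```python
def value_to_pitch(v, v_min, v_max, f_low=220.0, f_high=880.0):
    """Map a data value onto a log-spaced frequency range, so equal
    data steps correspond to equal musical intervals."""
    frac = (v - v_min) / (v_max - v_min)
    return f_low * (f_high / f_low) ** frac

# The midpoint of the data range lands on the geometric mean of the
# frequency range (one octave above f_low here):
print(value_to_pitch(0.5, 0.0, 1.0))  # → 440.0
```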

  16. Selection of suitable alternatives to reduce the environmental impact of road traffic noise using a fuzzy multi-criteria decision model

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ruiz-Padillo, Alejandro, E-mail: aruizp@correo.ugr.es; Civil Engineering Department, University of Granada, Av. Fuentenueva s/n, 18071 Granada; Ruiz, Diego P., E-mail: druiz@ugr.es

    Road traffic noise is one of the most significant environmental impacts generated by transport systems. In this regard, the recent implementation of the European Environmental Noise Directive by the Public Administrations of the European Union member countries has led to various noise action plans (NAPs) for reducing the noise exposure of EU inhabitants. Every country or administration is responsible for applying criteria based on its own experience or expert knowledge, but there is no regulated process for the prioritization of technical measures within these plans. This paper proposes a multi-criteria decision methodology for the selection of suitable alternatives against traffic noise in each of the road stretches included in the NAPs. The methodology first defines the main criteria and alternatives to be considered. Secondly, it determines the relative weights for the criteria and sub-criteria using the fuzzy extended analytical hierarchy process as applied to the results from an expert panel, thereby allowing expert knowledge to be captured in an automated way. A final step comprises the use of discrete multi-criteria analysis methods such as weighted sum, ELECTRE and TOPSIS to rank the alternatives by suitability. To illustrate an application of the proposed methodology, this paper describes its implementation in a complex real case study: the selection of optimal technical solutions against traffic noise in the top-priority road stretch included in the revision of the NAP of the regional road network in the province of Almeria (Spain).
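    Of the three discrete MCDA methods named (weighted sum, ELECTRE, TOPSIS), the weighted sum is simple enough to sketch. The alternatives, scores, and weights below are entirely hypothetical; in the methodology the weights would come from the fuzzy AHP applied to the expert panel:

```python
def weighted_sum_rank(scores, weights):
    """Rank alternatives by weighted sum of normalized criterion
    scores (higher total = more suitable)."""
    totals = {alt: sum(w * v for w, v in zip(weights, vals))
              for alt, vals in scores.items()}
    return sorted(totals, key=totals.get, reverse=True)

# Hypothetical anti-noise alternatives scored (0-1) on
# cost, noise reduction, and feasibility:
scores = {
    "noise barrier": (0.4, 0.9, 0.7),
    "low-noise asphalt": (0.7, 0.6, 0.9),
    "speed limit": (0.9, 0.4, 0.8),
}
weights = (0.2, 0.5, 0.3)  # would come from the fuzzy AHP step
print(weighted_sum_rank(scores, weights))
```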

  17. Parallel Multi-Step/Multi-Rate Integration of Two-Time Scale Dynamic Systems

    NASA Technical Reports Server (NTRS)

    Chang, Johnny T.; Ploen, Scott R.; Sohl, Garett A.; Martin, Bryan J.

    2004-01-01

    Increasing demands on the fidelity of real-time and high-fidelity simulations are stressing the capacity of modern processors. New integration techniques are required that provide maximum efficiency for systems that are parallelizable. However, many current techniques make assumptions that are at odds with non-cascadable systems. A new serial multi-step/multi-rate integration algorithm for dual-timescale continuous state systems is presented which applies to these systems, and is extended to a parallel multi-step/multi-rate algorithm. The superior performance of both algorithms is demonstrated through a representative example.
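    The structural idea of a multi-rate scheme — the fast subsystem takes several small sub-steps inside each step of the slow subsystem — can be sketched as follows. Forward Euler is used for brevity, whereas the paper develops multi-step methods, so this illustrates only the rate splitting:

```python
def multirate_step(xs, xf, t, H, k, f_slow, f_fast):
    """Advance a two-time-scale system by one macro step H: the fast
    state takes k sub-steps of size H/k with the slow state frozen,
    then the slow state takes a single step of size H.
    Forward Euler for clarity; a structural sketch only."""
    h = H / k
    for i in range(k):
        xf = xf + h * f_fast(t + i * h, xs, xf)
    xs = xs + H * f_slow(t, xs, xf)
    return xs, xf
```

    The payoff is that the expensive slow dynamics are evaluated once per macro step while only the cheap fast dynamics run at the fine rate — and, as in the paper, the fast sub-integrations are the natural candidates for parallel execution.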

  18. A Novel Multi-Sensor Environmental Perception Method Using Low-Rank Representation and a Particle Filter for Vehicle Reversing Safety

    PubMed Central

    Zhang, Zutao; Li, Yanjun; Wang, Fubing; Meng, Guanjun; Salman, Waleed; Saleem, Layth; Zhang, Xiaoliang; Wang, Chunbai; Hu, Guangdi; Liu, Yugang

    2016-01-01

    Environmental perception and information processing are two key steps of active safety for vehicle reversing. Single-sensor environmental perception cannot meet the need for vehicle reversing safety due to its low reliability. In this paper, we present a novel multi-sensor environmental perception method using low-rank representation and a particle filter for vehicle reversing safety. The proposed system consists of four main steps, namely multi-sensor environmental perception, information fusion, target recognition and tracking using low-rank representation and a particle filter, and vehicle reversing speed control modules. First of all, the multi-sensor environmental perception module, based on a binocular-camera system and ultrasonic range finders, obtains the distance data for obstacles behind the vehicle when the vehicle is reversing. Secondly, the information fusion algorithm using an adaptive Kalman filter is used to process the data obtained with the multi-sensor environmental perception module, which greatly improves the robustness of the sensors. Then the framework of a particle filter and low-rank representation is used to track the main obstacles. The low-rank representation is used to optimize an objective particle template that has the smallest L-1 norm. Finally, the electronic throttle opening and automatic braking are controlled by the proposed vehicle reversing control strategy prior to any potential collisions, making the reversing control safer and more reliable. The final system simulation and practical testing results demonstrate the validity of the proposed multi-sensor environmental perception method using low-rank representation and a particle filter for vehicle reversing safety. PMID:27294931
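    A minimal scalar sketch of the fusion step, assuming each cycle yields one camera-derived and one ultrasonic range reading. The paper's adaptive Kalman filter is more elaborate; all variances below are illustrative:

```python
import numpy as np

def fuse_distance(z_cam, z_ultra, r_cam=0.09, r_ultra=0.04, q=0.01):
    """Scalar Kalman filter fusing camera and ultrasonic range readings
    into one distance estimate per cycle (random-walk motion model).
    Variances r_cam, r_ultra, q are illustrative, in m^2."""
    x, p = z_ultra[0], r_ultra            # initialize from the first reading
    estimates = []
    for zc, zu in zip(z_cam, z_ultra):
        p += q                            # predict: distance drifts while reversing
        for z, r in ((zc, r_cam), (zu, r_ultra)):
            k = p / (p + r)               # Kalman gain for this sensor
            x += k * (z - x)              # correct with the measurement
            p *= (1.0 - k)
        estimates.append(x)
    return np.array(estimates)
```

    The design point mirrored here is that the less noisy sensor (ultrasonic, smaller r) automatically receives the larger gain, which is what makes the fused estimate more robust than either sensor alone.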

  19. A Novel Multi-Sensor Environmental Perception Method Using Low-Rank Representation and a Particle Filter for Vehicle Reversing Safety.

    PubMed

    Zhang, Zutao; Li, Yanjun; Wang, Fubing; Meng, Guanjun; Salman, Waleed; Saleem, Layth; Zhang, Xiaoliang; Wang, Chunbai; Hu, Guangdi; Liu, Yugang

    2016-06-09

    Environmental perception and information processing are two key steps of active safety for vehicle reversing. Single-sensor environmental perception cannot meet the need for vehicle reversing safety due to its low reliability. In this paper, we present a novel multi-sensor environmental perception method using low-rank representation and a particle filter for vehicle reversing safety. The proposed system consists of four main steps, namely multi-sensor environmental perception, information fusion, target recognition and tracking using low-rank representation and a particle filter, and vehicle reversing speed control modules. First of all, the multi-sensor environmental perception module, based on a binocular-camera system and ultrasonic range finders, obtains the distance data for obstacles behind the vehicle when the vehicle is reversing. Secondly, the information fusion algorithm using an adaptive Kalman filter is used to process the data obtained with the multi-sensor environmental perception module, which greatly improves the robustness of the sensors. Then the framework of a particle filter and low-rank representation is used to track the main obstacles. The low-rank representation is used to optimize an objective particle template that has the smallest L-1 norm. Finally, the electronic throttle opening and automatic braking are controlled by the proposed vehicle reversing control strategy prior to any potential collisions, making the reversing control safer and more reliable. The final system simulation and practical testing results demonstrate the validity of the proposed multi-sensor environmental perception method using low-rank representation and a particle filter for vehicle reversing safety.

  20. The role of particle jamming on the formation and stability of step-pool morphology: insight from a reduced-complexity model

    NASA Astrophysics Data System (ADS)

    Saletti, M.; Molnar, P.; Hassan, M. A.

    2017-12-01

    Granular processes have been recognized as key drivers in earth surface dynamics, especially in steep landscapes because of the large size of sediment found in channels. In this work we focus on step-pool morphologies, studying the effect of particle jamming on step formation. Starting from the jammed-state hypothesis, we assume that grains generate steps because of particle jamming and those steps are inherently more stable because of additional force chains in the transversal direction. We test this hypothesis with a particle-based reduced-complexity model, CAST2, where sediment is organized in patches and entrainment, transport and deposition of grains depend on flow stage and local topography through simplified phenomenological rules. The model operates with 2 grain sizes: fine grains, which can be mobilized by both large and moderate flows, and coarse grains, mobile only during large floods. First, we identify the minimum set of processes necessary to generate and maintain steps in a numerical channel: (a) occurrence of floods, (b) particle jamming, (c) low sediment supply, and (d) presence of sediment with different entrainment probabilities. Numerical results are compared with field observations collected in different step-pool channels in terms of step density, a variable that captures the proportion of the channel occupied by steps. Not only do the longitudinal profiles of numerical channels display step sequences similar to those observed in real step-pool streams, but the values of step density are also very similar when all the processes mentioned before are considered. Moreover, with CAST2 it is possible to run long simulations with repeated flood events, to test the effect of flood frequency on step formation. Numerical results indicate that larger step densities belong to systems more frequently perturbed by floods, compared to systems having a lower flood frequency.
Our results highlight the important interactions between external hydrological forcing and internal geomorphic adjustment (e.g. jamming) on the response of step-pool streams, showing the potential of reduced-complexity models in fluvial geomorphology.

  1. An inhibitor of eIF2 activity in the sRNA pool of eukaryotic cells.

    PubMed

    Centrella, Michael; Porter, David L; McCarthy, Thomas L

    2011-08-15

    Eukaryotic protein synthesis is a multi-step and highly controlled process that includes an early initiation complex containing eukaryotic initiation factor 2 (eIF2), GTP, and methionine-charged initiator methionyl-tRNA (met-tRNAi). During studies to reconstruct formation of the ternary complex containing these molecules, we detected a potent inhibitor in low molecular mass RNA (sRNA) preparations of eukaryotic tRNA. The ternary complex inhibitor (TCI) was retained in the total sRNA pool after met-tRNAi was charged by aminoacyl tRNA synthetase, co-eluted with sRNA by size exclusion chromatography, but resolved from met-tRNAi by ion exchange chromatography. The adverse effect of TCI was not overcome by high GTP or magnesium omission and was independent of GTP regeneration. Rather, TCI suppressed the rate of ternary complex formation, and disrupted protein synthesis and the accumulation of heavy polymeric ribosomes in reticulocyte lysates in vitro. Lastly, a component or components in ribosome depleted cell lysate significantly reversed TCI activity. Since assembly of the met-tRNAi/eIF2/GTP ternary complex is integral to protein synthesis, awareness of TCI is important to avoid confusion in studies of translation initiation. A clear definition of TCI may also allow a better appreciation of physiologic or pathologic situations, factors, and events that control protein synthesis in vivo. Copyright © 2011 Elsevier B.V. All rights reserved.

  2. Preparatory steps for a robust dynamic model for organically bound tritium dynamics in agricultural crops

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Melintescu, A.; Galeriu, D.; Diabate, S.

    2015-03-15

    The processes involved in tritium transfer in crops are complex and regulated by many feedback mechanisms. A full mechanistic model is difficult to develop due to the complexity of the processes involved in tritium transfer and environmental conditions. First, a review of existing models (ORYZA2000, CROPTRIT and WOFOST), presenting their features and limits, is made. Second, the preparatory steps for a robust model are discussed, considering the role of dry matter and photosynthesis contribution to the OBT (Organically Bound Tritium) dynamics in crops.

  3. Preparation of olanzapine and methyl-β-cyclodextrin complexes using a single-step, organic solvent-free supercritical fluid process: An approach to enhance the solubility and dissolution properties.

    PubMed

    Rudrangi, Shashi Ravi Suman; Trivedi, Vivek; Mitchell, John C; Wicks, Stephen Richard; Alexander, Bruce David

    2015-10-15

    The purpose of this study was to evaluate a single-step, organic solvent-free supercritical fluid process for the preparation of olanzapine-methyl-β-cyclodextrin complexes with an express goal to enhance the dissolution properties of olanzapine. The complexes were prepared by supercritical carbon dioxide processing, co-evaporation, freeze drying and physical mixing. The prepared complexes were then analysed by differential scanning calorimetry, X-ray powder diffraction, scanning electron microscopy, solubility and dissolution studies. Computational molecular docking studies were performed to study the formation of molecular inclusion complexation of olanzapine with methyl-β-cyclodextrin. All the binary mixtures of olanzapine with methyl-β-cyclodextrin, except physical mixture, exhibited a faster and greater extent of drug dissolution than the drug alone. Products obtained by the supercritical carbon dioxide processing method exhibited the highest apparent drug dissolution. The characterisation by different analytical techniques suggests complete complexation or amorphisation of olanzapine and methyl-β-cyclodextrin complexes prepared by supercritical carbon dioxide processing method. Therefore, organic solvent-free supercritical carbon dioxide processing method proved to be novel and efficient for the preparation of solid inclusion complexes of olanzapine with methyl-β-cyclodextrin. The preliminary data also suggests that the complexes of olanzapine with methyl-β-cyclodextrin will lead to better therapeutic efficacy due to better solubility and dissolution properties. Copyright © 2015 Elsevier B.V. All rights reserved.

  4. Multi-segmental movement patterns reflect juggling complexity and skill level.

    PubMed

    Zago, Matteo; Pacifici, Ilaria; Lovecchio, Nicola; Galli, Manuela; Federolf, Peter Andreas; Sforza, Chiarella

    2017-08-01

    The juggling action of six expert and six intermediate jugglers was recorded with a motion capture system and decomposed into its fundamental components through Principal Component Analysis. The aim was to quantify trends in movement dimensionality, multi-segmental patterns and rhythmicity as a function of proficiency level and task complexity. Dimensionality was quantified in terms of Residual Variance, while the Relative Amplitude was introduced to account for individual differences in movement components. We observed that experience-related modifications in multi-segmental actions exist, such as the progressive reduction of error-correction movements, especially in the complex task condition. The systematic identification of motor patterns sensitive to the acquisition of specific experience could accelerate the learning process. Copyright © 2017 Elsevier B.V. All rights reserved.
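    Residual Variance, the dimensionality measure named above, is the fraction of total movement variance left unexplained by the first k principal components. A minimal sketch, assuming the posture data are arranged as a frames-by-marker-coordinates matrix:

```python
import numpy as np

def residual_variance(X, k):
    """Fraction of movement variance NOT captured by the first k
    principal components of posture data X (frames x marker coords)."""
    Xc = X - X.mean(axis=0)                       # center each coordinate
    s = np.linalg.svd(Xc, compute_uv=False)       # singular values
    var = s ** 2                                  # per-component variances
    return 1.0 - var[:k].sum() / var.sum()
```

    Lower residual variance at small k indicates a lower-dimensional, more tightly coordinated movement — the kind of trend the study tracks across skill levels.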

  5. A Hybrid Optimization Framework with POD-based Order Reduction and Design-Space Evolution Scheme

    NASA Astrophysics Data System (ADS)

    Ghoman, Satyajit S.

    The main objective of this research is to develop an innovative multi-fidelity multi-disciplinary design, analysis and optimization suite that integrates certain solution generation codes and newly developed innovative tools to improve the overall optimization process. The research performed herein is divided into two parts: (1) the development of an MDAO framework by integration of variable fidelity physics-based computational codes, and (2) enhancements to such a framework by incorporating innovative features extending its robustness. The first part of this dissertation describes the development of a conceptual Multi-Fidelity Multi-Strategy and Multi-Disciplinary Design Optimization Environment (M3DOE), in context of aircraft wing optimization. M3DOE provides the user a capability to optimize configurations with a choice of (i) the level of fidelity desired, (ii) the use of a single-step or multi-step optimization strategy, and (iii) combination of a series of structural and aerodynamic analyses. The modularity of M3DOE allows it to be a part of other inclusive optimization frameworks. M3DOE is demonstrated within the context of shape and sizing optimization of the wing of a Generic Business Jet aircraft. Two different optimization objectives, viz. dry weight minimization and cruise range maximization, are studied by conducting one low-fidelity and two high-fidelity optimization runs to demonstrate the application scope of M3DOE. The second part of this dissertation describes the development of an innovative hybrid optimization framework that extends the robustness of M3DOE by employing a proper orthogonal decomposition-based design-space order reduction scheme combined with the evolutionary algorithm technique. The POD method of extracting dominant modes from an ensemble of candidate configurations is used for the design-space order reduction. The snapshot of candidate population is updated iteratively using the evolutionary algorithm technique of fitness-driven retention. This strategy capitalizes on the advantages of evolutionary algorithms as well as POD-based reduced order modeling, while overcoming the shortcomings inherent in these techniques. When linked with M3DOE, this strategy offers a computationally efficient methodology for problems with a high level of complexity and a challenging design-space. This newly developed framework is demonstrated for its robustness on a nonconventional supersonic tailless air vehicle wing shape optimization problem.
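    The POD step — extracting dominant modes from an ensemble of candidate configurations — can be sketched via the SVD. The energy threshold and the column-per-candidate layout below are assumptions for illustration, not the dissertation's implementation:

```python
import numpy as np

def pod_basis(snapshots, energy=0.99):
    """POD order reduction: SVD of the centered snapshot ensemble
    (one candidate design vector per column) keeps just enough modes
    to capture the requested fraction of the ensemble's variance."""
    mean = snapshots.mean(axis=1, keepdims=True)
    U, s, _ = np.linalg.svd(snapshots - mean, full_matrices=False)
    frac = np.cumsum(s ** 2) / np.sum(s ** 2)     # cumulative energy fraction
    r = int(np.searchsorted(frac, energy)) + 1    # modes needed to reach 'energy'
    return mean, U[:, :r]

def reconstruct(mean, modes, a):
    """Map reduced design coefficients a back to the full design space."""
    return mean + modes @ a
```

    The evolutionary search then operates on the low-dimensional coefficients `a` rather than the full design vector, which is what makes the hybrid framework computationally efficient.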

  6. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Soliman, A; Safigholi, H; Sunnybrook Health Sciences Center, Toronto, ON

    Purpose: To propose a new method that provides a positive contrast visualization of the prostate brachytherapy seeds using the phase information from MR images. Additionally, the feasibility of using the processed phase information to distinguish seeds from calcifications is explored. Methods: A gel phantom was constructed using 2% agar dissolved in 1 L of distilled water. Contrast agents were added to adjust the relaxation times. Four iodine-125 (Eckert & Ziegler SML86999) dummy seeds were placed at different orientations with respect to the main magnetic field (B0). Calcifications were obtained from a sheep femur cortical bone due to its close similarity to human bone tissue composition. Five samples of calcifications were shaped into different dimensions with lengths ranging between 1.2 – 6.1 mm. MR imaging was performed on a 3T Philips Achieva using an 8-channel head coil. Eight images were acquired at eight echo-times using a multi-gradient echo sequence. Spatial resolution was 0.7 × 0.7 × 2 mm, TR/TE/dTE = 20.0/2.3/2.3 ms and BW = 541 Hz/pixel. Complex images were acquired and fed into a two-step processing pipeline: the first includes phase unwrapping and background phase removal using a Laplacian operator (Wei et al. 2013). The second step applies a specific phase mask on the resulting tissue phase from the first step to provide the desired positive contrast of the seeds and to, potentially, differentiate them from the calcifications. Results: The phase-processing was performed in less than 30 seconds. The proposed method has successfully resulted in a positive contrast of the brachytherapy seeds. Additionally, the final processed phase image showed a difference between the appearance of seeds and calcifications. However, the shape of the seeds was slightly distorted compared to the original dimensions. Conclusion: It is feasible to provide a positive contrast of the seeds from MR images using Laplacian operator-based phase processing.

  7. Fully Burdened Cost of Fuel Using Input-Output Analysis

    DTIC Science & Technology

    2011-12-01

    ...wide extension of the Bulk Fuels Distribution Model could be used to replace the current seven-step Fully Burdened Cost of Fuel process with a single step, allowing for less complex and...

  8. Syntactic and semantic restrictions on morphological recomposition: MEG evidence from Greek.

    PubMed

    Neophytou, K; Manouilidou, C; Stockall, L; Marantz, A

    2018-05-16

    Complex morphological processing has been extensively studied in the past decades. However, most of this work has either focused on only certain steps involved in this process, or it has been conducted on a few languages, like English. The purpose of the present study is to investigate the spatiotemporal cortical processing profile of the distinct steps previously reported in the literature, from decomposition to re-composition of morphologically complex items, in a relatively understudied language, Greek. Using magnetoencephalography, we confirm the role of the fusiform gyrus in early, form-based morphological decomposition, we relate the syntactic licensing of stem-suffix combinations to the ventral visual processing stream, somewhat independent from lexical access for the stem, and we further elucidate the role of orbitofrontal regions in semantic composition. Thus, the current study offers the most comprehensive test to date of visual morphological processing and additional, crosslinguistic validation of the steps involved in it. Copyright © 2018 The Authors. Published by Elsevier Inc. All rights reserved.

  9. The multi-replication protein A (RPA) system--a new perspective.

    PubMed

    Sakaguchi, Kengo; Ishibashi, Toyotaka; Uchiyama, Yukinobu; Iwabata, Kazuki

    2009-02-01

    Replication protein A (RPA) complex has been shown, using both in vivo and in vitro approaches, to be required for most aspects of eukaryotic DNA metabolism: replication, repair, telomere maintenance and homologous recombination. Here, we review recent data concerning the function and biological importance of the multi-RPA complex. There are distinct complexes of RPA found in the biological kingdoms, although for a long time only one type of RPA complex was believed to be present in eukaryotes. Each complex probably serves a different role. In higher plants, three distinct large and medium subunits are present, but only one species of the smallest subunit. Each of these protein subunits forms stable complexes with their respective partners. As complexes, they are paralogs. Humans possess two paralogs and one analog of RPA. The multi-RPA system can be regarded as universal in eukaryotes. Among eukaryotic kingdoms, paralogs, orthologs, analogs and heterologs of many DNA synthesis-related factors, including RPA, are ubiquitous. Convergent evolution seems to be ubiquitous in these processes. Using recent findings, we review the composition and biological functions of RPA complexes.

  10. Software Defined Radio with Parallelized Software Architecture

    NASA Technical Reports Server (NTRS)

    Heckler, Greg

    2013-01-01

    This software implements software-defined radio processing over multi-core, multi-CPU systems in a way that maximizes the use of CPU resources in the system. The software treats each processing step in either a communications or navigation modulator or demodulator system as an independent, threaded block. Each threaded block is defined with a programmable number of input or output buffers; these buffers are implemented using POSIX pipes. In addition, each threaded block is assigned a unique thread upon block installation. A modulator or demodulator system is built by assembly of the threaded blocks into a flow graph, which assembles the processing blocks to accomplish the desired signal processing. This software architecture allows the software to scale effortlessly between single-CPU/single-core computers and multi-CPU/multi-core computers without recompilation. NASA spaceflight and ground communications systems currently rely exclusively on ASICs or FPGAs. This software allows low- and medium-bandwidth (100 bps to ~50 Mbps) software defined radios to be designed and implemented solely in C/C++ software, while lowering development costs and facilitating reuse and extensibility.
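    The threaded-block architecture can be sketched in Python, with `queue.Queue` standing in for the POSIX pipes of the original C/C++ design; the class and function names are illustrative. (In CPython the GIL limits true parallelism, so this shows the structure, not the performance.)

```python
import queue
import threading

class Block(threading.Thread):
    """One processing step: an independent thread with FIFO input/output
    buffers (queue.Queue here; the original design uses POSIX pipes)."""
    def __init__(self, func, inq, outq):
        super().__init__(daemon=True)
        self.func, self.inq, self.outq = func, inq, outq
    def run(self):
        while True:
            item = self.inq.get()
            if item is None:              # end-of-stream marker
                self.outq.put(None)
                return
            self.outq.put(self.func(item))

def flow_graph(funcs):
    """Assemble per-step functions into a linear flow graph of threaded
    blocks; returns the source and sink queues."""
    qs = [queue.Queue() for _ in range(len(funcs) + 1)]
    for f, qin, qout in zip(funcs, qs, qs[1:]):
        Block(f, qin, qout).start()
    return qs[0], qs[-1]
```

    Because each block runs in its own thread and communicates only through its buffers, the same graph runs unchanged on one core or many — the scaling property the abstract describes.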

  11. Characterization and multi-step transketolase-ω-transaminase bioconversions in an immobilized enzyme microreactor (IEMR) with packed tube.

    PubMed

    Halim, Amanatuzzakiah Abdul; Szita, Nicolas; Baganz, Frank

    2013-12-01

    The concept of de novo metabolic engineering through novel synthetic pathways offers new directions for multi-step enzymatic synthesis of complex molecules. This has been complemented by recent progress in performing enzymatic reactions using immobilized enzyme microreactors (IEMR). This work is concerned with the construction of de novo designed enzyme pathways in a microreactor synthesizing chiral molecules. An interesting compound, commonly used as the building block in several pharmaceutical syntheses, is a single diastereoisomer of 2-amino-1,3,4-butanetriol (ABT). This chiral amino alcohol can be synthesized from simple achiral substrates using two enzymes, transketolase (TK) and transaminase (TAm). Here we describe the development of an IEMR using His6-tagged TK and TAm immobilized onto Ni-NTA agarose beads and packed into tubes to enable multi-step enzyme reactions. The kinetic parameters of both enzymes were first determined using single IEMRs evaluated by a kinetic model developed for packed bed reactors. The Km(app) for both enzymes appeared to be flow rate dependent, while the turnover number kcat was reduced 3 fold compared to solution-phase TK and TAm reactions. For the multi-step enzyme reaction, single IEMRs were cascaded in series, whereby the first enzyme, TK, catalyzed a model reaction of lithium-hydroxypyruvate (HPA) and glycolaldehyde (GA) to L-erythrulose (ERY), and the second unit of the IEMR with immobilized TAm converted ERY into ABT using (S)-α-methylbenzylamine (MBA) as amine donor. With initial 60mM (HPA and GA each) and 6mM (MBA) substrate concentration mixture, the coupled reaction reached approximately 83% conversion in 20 min at the lowest flow rate. The ability to synthesize a chiral pharmaceutical intermediate, ABT in relatively short time proves this IEMR system as a powerful tool for construction and evaluation of de novo pathways as well as for determination of enzyme kinetics. Copyright © 2013 The Authors. 
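    For intuition on how conversion relates to residence time in such a packed-bed reactor, a sketch assuming ideal plug flow and Michaelis-Menten kinetics. Parameters are illustrative, and the flow-rate-dependent apparent Km reported above is deliberately ignored:

```python
def conversion(c0, vmax, km, residence_time, n=10000):
    """Integrate dC/dt = -vmax*C/(km + C) along an idealized plug-flow
    packed-bed reactor and return fractional substrate conversion.
    Illustrative only: constant kinetic parameters, no dispersion."""
    c = c0
    dt = residence_time / n
    for _ in range(n):
        c -= dt * vmax * c / (km + c)   # Michaelis-Menten consumption
    return 1.0 - c / c0
```

    Run at increasing residence times (i.e. decreasing flow rates), the model reproduces the qualitative trend in the paper: the highest conversions appear at the lowest flow rate.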
Published by Elsevier B.V. All rights reserved.

  12. Method for network analyzation and apparatus

    DOEpatents

    Bracht, Roger B.; Pasquale, Regina V.

    2001-01-01

    A portable network analyzer and method having multiple channel transmit and receive capability for real-time monitoring of processes which maintains phase integrity, requires low power, is adapted to provide full vector analysis, provides output frequencies of up to 62.5 MHz and provides fine sensitivity frequency resolution. The present invention includes a multi-channel means for transmitting and a multi-channel means for receiving, both in electrical communication with a software means for controlling. The means for controlling is programmed to provide a signal to a system under investigation which steps consecutively over a range of predetermined frequencies. The resulting received signal from the system provides complete time domain response information by executing a frequency transform of the magnitude and phase information acquired at each frequency step.
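    The final step described above — recovering the complete time-domain response from the magnitude and phase acquired at each frequency step — is an inverse Fourier transform. A minimal sketch, assuming uniform frequency stepping from DC through Nyquist:

```python
import numpy as np

def time_domain_response(freq_response):
    """Convert the complex (magnitude and phase) response measured at
    consecutive frequency steps into the real time-domain response
    via an inverse FFT. Illustrative of the frequency-transform step."""
    return np.fft.irfft(np.asarray(freq_response))
```

    A system under investigation that simply delays the stimulus by d samples shows up as a spike at index d of the returned response, which is how the analyzer localizes features in time.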

  13. Automatic Registration of GF4 Pms: a High Resolution Multi-Spectral Sensor on Board a Satellite on Geostationary Orbit

    NASA Astrophysics Data System (ADS)

    Gao, M.; Li, J.

    2018-04-01

    Geometric correction is an important preprocessing step in the application of GF4 PMS images. The method of geometric correction that is based on the manual selection of geometric control points is time-consuming and laborious. The more common method, based on a reference image, is automatic image registration. This method involves several steps and parameters, and for the multi-spectral sensor GF4 PMS it is necessary to identify the best combination of them. This study mainly focuses on the following issues: the necessity of Rational Polynomial Coefficients (RPC) correction before automatic registration, the choice of the base band for automatic registration, and the configuration of GF4 PMS spatial resolution.

  14. Statistical properties of multi-theta polymer chains

    NASA Astrophysics Data System (ADS)

    Uehara, Erica; Deguchi, Tetsuo

    2018-04-01

    We study statistical properties of polymer chains with complex structures whose chemical connectivities are expressed by graphs. The multi-theta curve of m subchains with two branch points connected by them is one of the simplest graphs among those having closed paths, i.e. loops. We denote it by θm; for m = 2 it is a ring. We derive analytically the pair distribution function and the scattering function for the θm-shaped polymer chains consisting of m Gaussian random walks of n steps. Surprisingly, it is shown rigorously that the mean-square radius of gyration for the Gaussian θm-shaped polymer chain does not depend on the number m of subchains if each subchain has the same fixed number of steps. For m = 3 we show the Kratky plot for the theta-shaped polymer chain consisting of hard cylindrical segments by the Monte-Carlo method including reflection at trivalent vertices.
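    The surprising m-independence of the mean-square radius of gyration can be checked numerically. The sketch below samples Gaussian θm conformations in 1D (ideal-chain dimensions decouple); the construction — branch separation drawn from the parallel-spring distribution N(0, n/m), each subchain a Brownian bridge — is our assumed form of the standard Gaussian graph measure, not the authors' code:

```python
import numpy as np

def theta_rg2(m, n, samples=4000, seed=0):
    """Monte-Carlo mean-square radius of gyration (1D, unit-variance steps)
    of a theta_m graph: two branch points joined by m Gaussian subchains
    of n steps each. Assumed construction: branch separation R ~ N(0, n/m);
    each subchain is a Brownian bridge from 0 to R."""
    rng = np.random.default_rng(seed)
    vals = np.empty(samples)
    k = np.arange(1, n)                       # interior bead indices
    for s in range(samples):
        R = rng.normal(0.0, np.sqrt(n / m))
        beads = [0.0, R]                      # the two branch points
        for _ in range(m):
            w = np.cumsum(rng.normal(size=n))             # free random walk
            beads.extend(w[:-1] + (k / n) * (R - w[-1]))  # pin the end to R
        beads = np.asarray(beads)
        vals[s] = np.mean((beads - beads.mean()) ** 2)
    return vals.mean()
```

    For m = 2 the graph is a ring of N = 2n bonds, whose continuum value per dimension is N/12; the claim above is that θ3, θ4, ... give the same number for the same n per subchain.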

  15. Spatial mapping reveals multi-step pattern of wound healing in Physarum polycephalum

    NASA Astrophysics Data System (ADS)

    Bäuerle, Felix K.; Kramar, Mirna; Alim, Karen

    2017-11-01

    Wounding is a severe impairment of function, especially for an exposed organism like the network-forming true slime mould Physarum polycephalum. The tubular network making up the organism’s body plan is entirely interconnected and shares a common cytoplasm. Oscillatory contractions of the enclosing tube walls drive the shuttle streaming of the cytoplasm. Cytoplasmic flows underlie the reorganization of the network for example by movement toward attractive stimuli or away from repellants. Here, we follow the reorganization of P. polycephalum networks after severe wounding. Spatial mapping of the contraction changes in response to wounding reveal a multi-step pattern. Phases of increased activity alternate with cessation of contractions and stalling of flows, giving rise to coordinated transport and growth at the severing site. Overall, severing surprisingly acts like an attractive stimulus enabling healing of severed tubes. The reproducible cessation of contractions arising during this wound-healing response may open up new venues to investigate the biochemical wiring underlying P. polycephalum’s complex behaviours.

  16. CAD/CAM guided surgery in implant dentistry. A review of software packages and step-by-step protocols for planning surgical guides.

    PubMed

    Scherer, Michael D; Kattadiyil, Mathew T; Parciak, Ewa; Puri, Shweta

    2014-01-01

    Three-dimensional radiographic imaging for dental implant treatment planning is gaining widespread interest and popularity. However, application of the data from 3D imaging can be a complex and daunting process initially. The purpose of this article is to describe features of three software packages and the respective computerized guided surgical templates (GST) fabricated from them. A step-by-step method of interpreting and ordering a GST to simplify the process of surgical planning and implant placement is discussed.

  17. Evolution of learning and levels of selection: A lesson from avian parent-offspring communication.

    PubMed

    Lotem, Arnon; Biran-Yoeli, Inbar

    2013-09-20

    In recent years, it has become increasingly clear that the evolution of behavior may be better understood as the evolution of the learning mechanisms that produce it, and that such mechanisms should be modeled and tested explicitly. However, this approach, which has recently been applied to animal foraging and decision-making, has rarely been applied to the social and communicative behaviors that are likely to operate in complex social environments and be subject to multi-level selection. Here we use genetic, agent-based evolutionary simulations to explore how learning mechanisms may evolve to adjust the level of nestling begging (offspring signaling of need), and to examine the possible consequences of this process for parent-offspring conflict and communication. In doing so, we also provide the first step-by-step dynamic model of parent-offspring communication. The results confirm several previous theoretical predictions and demonstrate three novel phenomena. First, negatively frequency-dependent group-level selection can generate a stable polymorphism of learning strategies and parental responses. Second, while conventional reinforcement learning models fail to cope successfully with family dynamics at the nest, a newly developed learning model (incorporating behaviors that are consistent with recent experimental results on learning in nestling begging) produced effective learning, which evolved successfully. Third, while kin-selection affects the frequency of the different learning genes, its impact on begging slope and intensity was unexpectedly negligible, demonstrating that evolution is a complex process, and showing that the effect of kin-selection on behaviors that are shaped by learning may not be predicted by simple application of Hamilton's rule. Copyright © 2013 Elsevier Inc. All rights reserved.

  18. Evolution of learning and levels of selection: a lesson from avian parent-offspring communication.

    PubMed

    Lotem, Arnon; Biran-Yoeli, Inbar

    2014-02-01

    In recent years, it has become increasingly clear that the evolution of behavior may be better understood as the evolution of the learning mechanisms that produce it, and that such mechanisms should be modeled and tested explicitly. However, this approach, which has recently been applied to animal foraging and decision-making, has rarely been applied to the social and communicative behaviors that are likely to operate in complex social environments and be subject to multi-level selection. Here we use genetic, agent-based evolutionary simulations to explore how learning mechanisms may evolve to adjust the level of nestling begging (offspring signaling of need), and to examine the possible consequences of this process for parent-offspring conflict and communication. In doing so, we also provide the first step-by-step dynamic model of parent-offspring communication. The results confirm several previous theoretical predictions and demonstrate three novel phenomena. First, negatively frequency-dependent group-level selection can generate a stable polymorphism of learning strategies and parental responses. Second, while conventional reinforcement learning models fail to cope successfully with family dynamics at the nest, a newly developed learning model (incorporating behaviors that are consistent with recent experimental results on learning in nestling begging) produced effective learning, which evolved successfully. Third, while kin-selection affects the frequency of the different learning genes, its impact on begging slope and intensity was unexpectedly negligible, demonstrating that evolution is a complex process, and showing that the effect of kin-selection on behaviors that are shaped by learning may not be predicted by simple application of Hamilton's rule. Copyright © 2013 Elsevier Inc. All rights reserved.
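
    The learning mechanism itself is only summarized above; as a rough illustration of the kind of value-updating rule such agent-based simulations employ, the sketch below lets a simulated nestling adjust its begging level by simple reinforcement. The candidate levels, learning rate, and payoff function are all hypothetical, not the authors' model.

```python
import random

def simulate_begging(reward_fn, alpha=0.2, steps=200, seed=0):
    """Minimal reinforcement-learning sketch: a nestling adjusts its
    begging toward whichever of three discrete levels yields the best
    payoff, using a simple value-update (Rescorla-Wagner-style) rule."""
    rng = random.Random(seed)
    levels = [0.2, 0.5, 0.9]             # hypothetical begging intensities
    values = {lv: 0.0 for lv in levels}  # estimated payoff per level
    for _ in range(steps):
        # epsilon-greedy choice of begging level
        if rng.random() < 0.1:
            lv = rng.choice(levels)
        else:
            lv = max(values, key=values.get)
        reward = reward_fn(lv)
        values[lv] += alpha * (reward - values[lv])  # value update
    return max(values, key=values.get)

# Hypothetical parental response: food rises with begging,
# minus a cost proportional to begging effort.
best = simulate_begging(lambda lv: lv - 0.4 * lv**2)
```

    With a payoff that declines in begging effort, the learned level settles low; with one that rises, it settles high.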

  19. Eating and drinking interventions for people at risk of lacking decision-making capacity: who decides and how?

    PubMed

    Clarke, Gemma; Galbraith, Sarah; Woodward, Jeremy; Holland, Anthony; Barclay, Stephen

    2015-06-11

    Some people with progressive neurological diseases find they need additional support with eating and drinking at mealtimes, and may require artificial nutrition and hydration. Decisions concerning artificial nutrition and hydration at the end of life are ethically complex, particularly if the individual lacks decision-making capacity. Decisions may concern issues of life and death: weighing the potential for increasing morbidity and prolonging suffering against potentially shortening life. When individuals lack decision-making capacity, the standard processes of obtaining informed consent for medical interventions are disrupted. Increasingly, multi-professional groups are being utilised to make difficult ethical decisions within healthcare. This paper reports upon a service evaluation that examined decision-making within a UK hospital Feeding Issues Multi-Professional Team, based on a three-month observation of the team and a one-year examination of its records. The key research questions are: a) How are decisions made concerning artificial nutrition for individuals at risk of lacking decision-making capacity? b) What are the key decision-making factors that are balanced? c) Who is involved in the decision-making process? Decision-making was not a single decision, but rather involved many different steps. Discussions involving relatives and other clinicians often took place outside of meetings. Topics of discussion varied, but the outcome relied upon balancing information along four interdependent axes: (1) risks, burdens and benefits; (2) treatment goals; (3) normative ethical values; (4) interested parties. Decision-making was a dynamic, ongoing process involving many people. The multiple points of decision-making, and the number of people involved in the process, mean the question of 'who decides' cannot be fully answered. There is a potential for the anonymity of multiple decision-makers to arise. Decisions in real-world clinical practice may not fit precisely into a model of decision-making. The findings from this service evaluation illustrate that within multi-professional team decision-making, decisions may contain elements of both substituted and supported decision-making, and may be better represented as existing upon a continuum.

  20. The design of multi-core DSP parallel model based on message passing and multi-level pipeline

    NASA Astrophysics Data System (ADS)

    Niu, Jingyu; Hu, Jian; He, Wenjing; Meng, Fanrong; Li, Chuanrong

    2017-10-01

    Currently, the design of embedded signal processing systems is often based on a specific application, an approach that is not conducive to the rapid development of signal processing technology. In this paper, a parallel processing model architecture based on a multi-core DSP platform is designed; it is mainly suitable for complex algorithms composed of different modules. This model combines the ideas of multi-level pipeline parallelism and message passing, and incorporates the advantages of the mainstream multi-core DSP models (the Master-Slave model and the Data Flow model), giving it better performance. This paper uses a three-dimensional image generation algorithm to validate the efficiency of the proposed model by comparing it with the Master-Slave and Data Flow models.
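
    The combination of multi-level pipelining and message passing described above can be illustrated in miniature. The sketch below (plain Python rather than DSP code, with threads standing in for cores and simple functions for processing modules) chains stages through queues, each stage running concurrently and passing messages downstream.

```python
import queue
import threading

def pipeline(data, stages):
    """Message-passing multi-stage pipeline sketch: each stage runs in its
    own thread and forwards results downstream through a queue. None is
    used as an end-of-stream marker that shuts the pipeline down."""
    qs = [queue.Queue() for _ in range(len(stages) + 1)]

    def run_stage(fn, q_in, q_out):
        while True:
            item = q_in.get()
            if item is None:          # propagate shutdown marker
                q_out.put(None)
                return
            q_out.put(fn(item))

    threads = [threading.Thread(target=run_stage, args=(fn, qs[i], qs[i + 1]))
               for i, fn in enumerate(stages)]
    for t in threads:
        t.start()
    for item in data:
        qs[0].put(item)
    qs[0].put(None)
    out = []
    while (item := qs[-1].get()) is not None:
        out.append(item)
    for t in threads:
        t.join()
    return out

# Two toy stages standing in for signal-processing modules.
result = pipeline([1, 2, 3], [lambda x: x + 1, lambda x: x * 2])
```

    Because each stage is a single thread reading a FIFO queue, item order is preserved while all stages work concurrently.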

  1. Mcqueuer

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    2016-09-12

    Mcqueuer is a simple tool that allows anyone, from researchers to experienced developers, to create multi-node/multi-core jobs by simply creating a file with a list of commands. Users combine tasks, which would otherwise each be their own job on the cluster, into a single file that is given to Mcqueuer. Mcqueuer then does the heavy lifting required to process the tasks in parallel in a single multi-node job. In addition, Mcqueuer provides load balancing, which frees the user from worrying about complex memory and CPU considerations and lets them focus on the processing itself.
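
    Mcqueuer's internals are not described here, but the basic pattern it embodies (workers pulling the next task from a shared list so that load balances dynamically) can be sketched as follows; the `execute` stand-in and the example command strings are purely illustrative.

```python
import queue
import threading

def run_tasks(commands, workers=4, execute=None):
    """Sketch of dynamic load balancing over a task list: each worker
    pulls the next command as soon as it is free, so long and short
    tasks mix without manual scheduling."""
    execute = execute or (lambda cmd: cmd.upper())  # stand-in for running a command
    tasks = queue.Queue()
    for i, cmd in enumerate(commands):
        tasks.put((i, cmd))
    results = [None] * len(commands)

    def worker():
        while True:
            try:
                i, cmd = tasks.get_nowait()
            except queue.Empty:
                return
            results[i] = execute(cmd)  # idle workers immediately pull the next task

    threads = [threading.Thread(target=worker) for _ in range(workers)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    return results

out = run_tasks(["sort input1", "sort input2", "sort input3"], workers=2)
```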

  2. Enhancing Manufacturing Process Education via Computer Simulation and Visualization

    ERIC Educational Resources Information Center

    Manohar, Priyadarshan A.; Acharya, Sushil; Wu, Peter

    2014-01-01

    Industrially significant metal manufacturing processes such as melting, casting, rolling, forging, machining, and forming are multi-stage, complex processes that are labor, time, and capital intensive. Academic research develops mathematical models of these processes that provide a theoretical framework for understanding the process variables…

  3. Improvement of the R-SWAT-FME framework to support multiple variables and multi-objective functions

    USGS Publications Warehouse

    Wu, Yiping; Liu, Shu-Guang

    2014-01-01

    Application of numerical models is a common practice in the environmental field for investigation and prediction of natural and anthropogenic processes. However, process knowledge, parameter identifiability, sensitivity, and uncertainty analyses are still a challenge for large and complex mathematical models such as the hydrological/water quality model, Soil and Water Assessment Tool (SWAT). In this study, the previously developed R program language-SWAT-Flexible Modeling Environment (R-SWAT-FME) was improved to support multiple model variables and objectives at multiple time steps (i.e., daily, monthly, and annually). This expansion is significant because there is usually more than one variable (e.g., water, nutrients, and pesticides) of interest for environmental models like SWAT. To further facilitate its easy use, we also simplified its application requirements without compromising its merits, such as the user-friendly interface. To evaluate the performance of the improved framework, we used a case study focusing on both streamflow and nitrate nitrogen in the Upper Iowa River Basin (above Marengo) in the United States. Results indicated that the R-SWAT-FME performs well and is comparable to the built-in auto-calibration tool in multi-objective model calibration. Overall, the enhanced R-SWAT-FME can be useful for the SWAT community, and the methods we used can also be valuable for wrapping potential R packages with other environmental models.
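
    The abstract does not give the objective function R-SWAT-FME uses, but multi-variable calibration of this kind typically combines a goodness-of-fit score such as the Nash-Sutcliffe efficiency (NSE) across variables. A minimal sketch, with hypothetical variable names and equal weights:

```python
def nse(obs, sim):
    """Nash-Sutcliffe efficiency: 1 is a perfect fit; values below 0 mean
    the simulation is worse than the mean of the observations."""
    mean_obs = sum(obs) / len(obs)
    num = sum((o - s) ** 2 for o, s in zip(obs, sim))
    den = sum((o - mean_obs) ** 2 for o in obs)
    return 1.0 - num / den

def multi_objective(obs_by_var, sim_by_var, weights=None):
    """Combine per-variable NSE scores (e.g. streamflow and nitrate)
    into one weighted objective, as a sketch of multi-variable calibration."""
    weights = weights or {v: 1.0 for v in obs_by_var}
    total = sum(weights.values())
    return sum(weights[v] * nse(obs_by_var[v], sim_by_var[v])
               for v in obs_by_var) / total

obs = {"flow": [2.0, 3.0, 5.0, 4.0], "nitrate": [1.0, 1.5, 2.0, 1.5]}
perfect = multi_objective(obs, obs)  # identical series give NSE = 1 per variable
```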

  4. One-step production of multilayered microparticles by tri-axial electro-flow focusing

    NASA Astrophysics Data System (ADS)

    Si, Ting; Feng, Hanxin; Li, Yang; Luo, Xisheng; Xu, Ronald

    2014-03-01

    Microencapsulation of drugs and imaging agents in the same carrier is of great significance for simultaneous detection and treatment of diseases. In this work, we have developed a tri-axial electro-flow focusing (TEFF) device using three needles in a novel concentric arrangement to form multilayered microparticles in a single step. The TEFF process can be characterized as a multi-fluidic compound cone-jet configuration in the core of a high-speed coflowing gas stream under an axial electric field. The tri-axial liquid jet eventually breaks up into multilayered droplets. To validate the method, the effect of the main process parameters on the characteristics of the cone and the jet has been studied experimentally. The applied electric field can dramatically promote the stability of the compound cone and enhance the atomization of compound liquid jets. Microparticles with three-layer, double-layer, and single-layer structures have been obtained. The results show that the TEFF technique has great benefits in fabricating multilayered microparticles at smaller scales. This method will be able to encapsulate multiple therapeutic and imaging agents in a single step for biomedical applications such as multi-modal imaging, drug delivery, and biomedicine.

  5. Complexity Optimization and High-Throughput Low-Latency Hardware Implementation of a Multi-Electrode Spike-Sorting Algorithm

    PubMed Central

    Dragas, Jelena; Jäckel, David; Hierlemann, Andreas; Franke, Felix

    2017-01-01

    Reliable real-time low-latency spike sorting with large data throughput is essential for studies of neural network dynamics and for brain-machine interfaces (BMIs), in which the stimulation of neural networks is based on the networks' most recent activity. However, the majority of existing multi-electrode spike-sorting algorithms are unsuited for processing high quantities of simultaneously recorded data. Recording from large neuronal networks using large high-density electrode sets (thousands of electrodes) imposes high demands on the data-processing hardware regarding computational complexity and data transmission bandwidth; this, in turn, entails demanding requirements in terms of chip area, memory resources and processing latency. This paper presents computational complexity optimization techniques, which facilitate the use of spike-sorting algorithms in large multi-electrode-based recording systems. The techniques are then applied to a previously published algorithm, on its own, unsuited for large electrode set recordings. Further, a real-time low-latency high-performance VLSI hardware architecture of the modified algorithm is presented, featuring a folded structure capable of processing the activity of hundreds of neurons simultaneously. The hardware is reconfigurable “on-the-fly” and adaptable to the nonstationarities of neuronal recordings. By transmitting exclusively spike time stamps and/or spike waveforms, its real-time processing offers the possibility of data bandwidth and data storage reduction. PMID:25415989
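
    The paper's algorithm is considerably more elaborate, but the core operation common to many spike sorters (matching each detected waveform to its nearest neuron template) can be sketched in a few lines; the toy templates below are illustrative only.

```python
def assign_spikes(waveforms, templates):
    """Template-matching sketch: assign each detected waveform to the
    neuron template with the smallest squared Euclidean distance."""
    def sqdist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    labels = []
    for w in waveforms:
        labels.append(min(range(len(templates)),
                          key=lambda i: sqdist(w, templates[i])))
    return labels

# Two toy spike shapes: a positive and a negative deflection.
templates = [[0.0, 1.0, 0.0], [0.0, -1.0, 0.0]]
labels = assign_spikes([[0.1, 0.9, 0.0], [0.0, -1.1, 0.1]], templates)
```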

  6. Complexity optimization and high-throughput low-latency hardware implementation of a multi-electrode spike-sorting algorithm.

    PubMed

    Dragas, Jelena; Jackel, David; Hierlemann, Andreas; Franke, Felix

    2015-03-01

    Reliable real-time low-latency spike sorting with large data throughput is essential for studies of neural network dynamics and for brain-machine interfaces (BMIs), in which the stimulation of neural networks is based on the networks' most recent activity. However, the majority of existing multi-electrode spike-sorting algorithms are unsuited for processing high quantities of simultaneously recorded data. Recording from large neuronal networks using large high-density electrode sets (thousands of electrodes) imposes high demands on the data-processing hardware regarding computational complexity and data transmission bandwidth; this, in turn, entails demanding requirements in terms of chip area, memory resources and processing latency. This paper presents computational complexity optimization techniques, which facilitate the use of spike-sorting algorithms in large multi-electrode-based recording systems. The techniques are then applied to a previously published algorithm, on its own, unsuited for large electrode set recordings. Further, a real-time low-latency high-performance VLSI hardware architecture of the modified algorithm is presented, featuring a folded structure capable of processing the activity of hundreds of neurons simultaneously. The hardware is reconfigurable “on-the-fly” and adaptable to the nonstationarities of neuronal recordings. By transmitting exclusively spike time stamps and/or spike waveforms, its real-time processing offers the possibility of data bandwidth and data storage reduction.

  7. Modeling Stepped Leaders Using a Time Dependent Multi-dipole Model and High-speed Video Data

    NASA Astrophysics Data System (ADS)

    Karunarathne, S.; Marshall, T.; Stolzenburg, M.; Warner, T. A.; Orville, R. E.

    2012-12-01

    In summer of 2011, we collected lightning data with 10 stations of electric field change meters (bandwidth of 0.16 Hz - 2.6 MHz) on and around NASA/Kennedy Space Center (KSC) covering nearly 70 km × 100 km area. We also had a high-speed video (HSV) camera recording 50,000 images per second collocated with one of the electric field change meters. In this presentation we describe our use of these data to model the electric field change caused by stepped leaders. Stepped leaders of a cloud to ground lightning flash typically create the initial path for the first return stroke (RS). Most of the time, stepped leaders have multiple complex branches, and one of these branches will create the ground connection for the RS to start. HSV data acquired with a short focal length lens at ranges of 5-25 km from the flash are useful for obtaining the 2-D location of these multiple branches developing at the same time. Using HSV data along with data from the KSC Lightning Detection and Ranging (LDAR2) system and the Cloud to Ground Lightning Surveillance System (CGLSS), the 3D path of a leader may be estimated. Once the path of a stepped leader is obtained, the time dependent multi-dipole model [ Lu, Winn,and Sonnenfeld, JGR 2011] can be used to match the electric field change at various sensor locations. Based on this model, we will present the time-dependent charge distribution along a leader channel and the total charge transfer during the stepped leader phase.
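
    The time-dependent multi-dipole model itself is in the cited reference; the underlying electrostatic step (superposing the fields of point charges placed along the reconstructed leader channel) can be sketched as follows, with a toy single-charge geometry that can be checked against Coulomb's law, E = kq/r^2. The charge values and positions are illustrative only.

```python
K = 8.9875e9  # Coulomb constant, N*m^2/C^2

def field_at(sensor, charges):
    """Superpose the electrostatic field of point charges along a toy
    leader channel. Each charge is (q, (x, y, z)); returns the field
    vector at the sensor position. This shows only the superposition
    idea, not the published time-dependent multi-dipole model."""
    ex = ey = ez = 0.0
    for q, (x, y, z) in charges:
        dx, dy, dz = sensor[0] - x, sensor[1] - y, sensor[2] - z
        r = (dx * dx + dy * dy + dz * dz) ** 0.5
        s = K * q / r ** 3
        ex += s * dx
        ey += s * dy
        ez += s * dz
    return (ex, ey, ez)

# A single -1 C charge 1 km overhead: |E| = K*|q|/r^2 = 8987.5 V/m,
# pointing upward (toward the negative charge).
E = field_at((0.0, 0.0, 0.0), [(-1.0, (0.0, 0.0, 1000.0))])
```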

  8. A practical approach for comparing management strategies in complex forest ecosystems using meta-modelling toolkits

    Treesearch

    Andrew Fall; B. Sturtevant; M.-J. Fortin; M. Papaik; F. Doyon; D. Morgan; K. Berninger; C. Messier

    2010-01-01

    The complexity and multi-scaled nature of forests pose significant challenges to understanding and management. Models can provide useful insights into processes and their interactions, and into the implications of alternative management options. Most models, particularly scientific models, focus on a relatively small set of processes and are designed to operate within a...

  9. Parameter optimization of a hydrologic model in a snow-dominated basin using a modular Python framework

    NASA Astrophysics Data System (ADS)

    Volk, J. M.; Turner, M. A.; Huntington, J. L.; Gardner, M.; Tyler, S.; Sheneman, L.

    2016-12-01

    Many distributed models that simulate watershed hydrologic processes require a collection of multi-dimensional parameters as input, some of which need to be calibrated before the model can be applied. The Precipitation Runoff Modeling System (PRMS) is a physically based and spatially distributed hydrologic model that contains a considerable number of parameters that often need to be calibrated. Modelers can also benefit from uncertainty analysis of these parameters. To meet these needs, we developed a modular framework in Python to conduct PRMS parameter optimization, uncertainty analysis, interactive visual inspection of parameters and outputs, and other common modeling tasks. Here we present results for multi-step calibration of sensitive parameters controlling solar radiation, potential evapotranspiration, and streamflow in a PRMS model that we applied to the snow-dominated Dry Creek watershed in Idaho. We also demonstrate how our modular approach enables the user to apply a variety of parameter optimization and uncertainty methods (such as Monte Carlo random sampling, uniform sampling, or optimization algorithms like the downhill simplex method and its more robust counterpart, shuffled complex evolution) or to easily define their own.
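
    As a minimal sketch of the Monte Carlo random-sampling option mentioned above (not the framework's actual API), the following draws parameter sets uniformly within bounds, runs a model, and keeps the best fit; the degree-day "model" and its `ddf` parameter are hypothetical.

```python
import random

def monte_carlo_calibrate(model, observed, bounds, n_samples=500, seed=42):
    """Monte Carlo random-sampling calibration sketch: sample parameter
    sets uniformly within bounds, run the model, keep the set with the
    lowest sum-of-squares error against the observations."""
    rng = random.Random(seed)
    best_params, best_err = None, float("inf")
    for _ in range(n_samples):
        params = {p: rng.uniform(lo, hi) for p, (lo, hi) in bounds.items()}
        sim = model(params)
        err = sum((o - s) ** 2 for o, s in zip(observed, sim))
        if err < best_err:
            best_params, best_err = params, err
    return best_params, best_err

# Toy degree-day snowmelt "model": melt = ddf * temperature (hypothetical).
temps = [1.0, 3.0, 5.0]
observed = [2.0, 6.0, 10.0]  # consistent with ddf = 2.0
best, err = monte_carlo_calibrate(lambda p: [p["ddf"] * t for t in temps],
                                  observed, {"ddf": (0.0, 5.0)})
```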

  10. Multi-step splicing of sphingomyelin synthase linear and circular RNAs.

    PubMed

    Filippenkov, Ivan B; Sudarkina, Olga Yu; Limborska, Svetlana A; Dergunova, Lyudmila V

    2018-05-15

    The SGMS1 gene encodes the enzyme sphingomyelin synthase 1 (SMS1), which is involved in the regulation of lipid metabolism, apoptosis, intracellular vesicular transport and other significant processes. The SGMS1 gene is located on chromosome 10 and has a size of 320 kb. Previously, we showed that dozens of alternative transcripts of the SGMS1 gene are present in various human tissues. In addition to mRNAs that provide synthesis of the SMS1 protein, this gene participates in the synthesis of non-coding transcripts, including circular RNAs (circRNAs), which include exons of the 5'-untranslated region (5'-UTR) and are highly represented in the brain. In this study, using the high-throughput technology RNA-CaptureSeq, many new SGMS1 transcripts were identified, including both intronic unspliced RNAs (premature RNAs) and RNAs formed via alternative splicing. Recursive exons (RS-exons) that can participate in the multi-step splicing of long introns of the gene were also identified. These exons participate in the formation of circRNAs. Thus, multi-step splicing may provide a variety of linear and circular RNAs of eukaryotic genes in tissues. Copyright © 2018 Elsevier B.V. All rights reserved.

  11. Data Acquisition System for Multi-Frequency Radar Flight Operations Preparation

    NASA Technical Reports Server (NTRS)

    Leachman, Jonathan

    2010-01-01

    A three-channel data acquisition system was developed for the NASA Multi-Frequency Radar (MFR) system. The system is based on a commercial-off-the-shelf (COTS) industrial PC (personal computer) and two dual-channel 14-bit digital receiver cards. The decimated complex envelope representations of the three radar signals are passed to the host PC via the PCI bus, and then processed in parallel by multiple cores of the PC CPU (central processing unit). The innovation is the parallelization of the radar data processing across multiple cores of a standard COTS multi-core CPU. The data processing portion of the data acquisition software was built using autonomous program modules or threads, which can run simultaneously on different cores. A master program module calculates the optimal number of processing threads, launches them, and continually supplies each with data. The benefit of this new parallel software architecture is that COTS PCs can be used to implement increasingly complex processing algorithms on an increasing number of radar range gates and data rates. As new PCs become available with higher numbers of CPU cores, the software will automatically utilize the additional computational capacity.
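
    The master-module pattern described above (choose a worker count from the available cores, partition the data, process chunks in parallel, recombine in order) might be sketched as follows; this is a generic Python illustration, not the system's actual software.

```python
import os
from concurrent.futures import ThreadPoolExecutor

def process_samples(samples, process_fn, workers=None):
    """Master-module sketch: pick a worker count from the available CPU
    cores, split the samples into chunks, process the chunks in
    parallel, and recombine the results in their original order."""
    workers = workers or os.cpu_count() or 1
    chunk = max(1, len(samples) // workers)
    chunks = [samples[i:i + chunk] for i in range(0, len(samples), chunk)]
    with ThreadPoolExecutor(max_workers=workers) as ex:
        processed = list(ex.map(lambda c: [process_fn(s) for s in c], chunks))
    return [s for c in processed for s in c]

# Toy per-sample computation standing in for real signal processing.
out = process_samples(list(range(8)), lambda s: s * s, workers=3)
```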

  12. Multi-objective optimization for an automated and simultaneous phase and baseline correction of NMR spectral data

    NASA Astrophysics Data System (ADS)

    Sawall, Mathias; von Harbou, Erik; Moog, Annekathrin; Behrens, Richard; Schröder, Henning; Simoneau, Joël; Steimers, Ellen; Neymeyr, Klaus

    2018-04-01

    Spectral data preprocessing is an integral and sometimes inevitable part of chemometric analyses. For Nuclear Magnetic Resonance (NMR) spectra a possible first preprocessing step is a phase correction which is applied to the Fourier transformed free induction decay (FID) signal. This preprocessing step can be followed by a separate baseline correction step. Especially if series of high-resolution spectra are considered, then automated and computationally fast preprocessing routines are desirable. A new method is suggested that applies the phase and the baseline corrections simultaneously in an automated form without manual input, which distinguishes this work from other approaches. The underlying multi-objective optimization or Pareto optimization provides improved results compared to consecutively applied correction steps. The optimization process uses an objective function which applies strong penalty constraints and weaker regularization conditions. The new method includes an approach for the detection of zero baseline regions. The baseline correction uses a modified Whittaker smoother. The functionality of the new method is demonstrated for experimental NMR spectra. The results are verified against gravimetric data. The method is compared to alternative preprocessing tools. Additionally, the simultaneous correction method is compared to a consecutive application of the two correction steps.
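
    The paper's simultaneous Pareto formulation is more involved, but the baseline-correction building block it modifies, the Whittaker smoother, has a compact standard form: solve (I + lam * D'D) z = y, where D is the second-difference operator and lam controls smoothness. A minimal sketch (the test signal and lam value are arbitrary):

```python
import numpy as np

def whittaker_smooth(y, lam=100.0):
    """Basic Whittaker smoother: solve (I + lam * D'D) z = y with D the
    second-difference operator. The published method modifies this
    smoother and couples it with phase correction; this shows only
    the smoothing core."""
    n = len(y)
    D = np.diff(np.eye(n), n=2, axis=0)  # (n-2) x n second-difference matrix
    return np.linalg.solve(np.eye(n) + lam * D.T @ D, np.asarray(y, float))

rng = np.random.default_rng(0)
x = np.linspace(0, 1, 50)
noisy = x + 0.05 * rng.standard_normal(50)   # linear trend plus noise
smooth = whittaker_smooth(noisy, lam=50.0)
```

    The second-difference penalty leaves linear trends untouched, which is why this smoother is a natural baseline estimator.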

  13. Factors associated with the use of cognitive aids in operating room crises: a cross-sectional study of US hospitals and ambulatory surgical centers.

    PubMed

    Alidina, Shehnaz; Goldhaber-Fiebert, Sara N; Hannenberg, Alexander A; Hepner, David L; Singer, Sara J; Neville, Bridget A; Sachetta, James R; Lipsitz, Stuart R; Berry, William R

    2018-03-26

    Operating room (OR) crises are high-acuity events requiring rapid, coordinated management. Medical judgment and decision-making can be compromised in stressful situations, and clinicians may not experience a crisis for many years. A cognitive aid (e.g., a checklist) for the most common types of crises in the OR may improve management during unexpected and rare events. While implementation strategies for routinely used innovations such as cognitive aids are becoming better understood, the implementation of cognitive aids that are rarely used is not yet well understood. We examined the organizational context and implementation process factors influencing the use of cognitive aids for OR crises. We conducted a cross-sectional study using a Web-based survey of individuals who had downloaded OR cognitive aids from the websites of Ariadne Labs or Stanford University between January 2013 and January 2016. In this paper, we report on the experience of 368 respondents from US hospitals and ambulatory surgical centers. We analyzed the relationship of more successful implementation (measured as reported regular cognitive aid use during applicable clinical events) with organizational context and with participation in a multi-step implementation process. We used multivariable logistic regression to identify significant predictors of reported regular OR cognitive aid use during OR crises. In the multivariable logistic regression, small facility size was associated with a fourfold increase in the odds of a facility reporting more successful implementation (p = 0.0092). Completing more implementation steps was also significantly associated with more successful implementation; each implementation step completed was associated with just over 50% higher odds of more successful implementation (p ≤ 0.0001). More successful implementation was associated with leadership support (p < 0.0001) and dedicated time to train staff (p = 0.0189). Less successful implementation was associated with resistance among clinical providers to using cognitive aids (p < 0.0001), absence of an implementation champion (p = 0.0126), and unsatisfactory content or design of the cognitive aid (p = 0.0112). Successful implementation of cognitive aids in ORs was associated with a supportive organizational context and following a multi-step implementation process. Building strong organizational support and following a well-planned multi-step implementation process will likely increase the use of OR cognitive aids during intraoperative crises, which may improve patient outcomes.

  14. The roles of SSU processome components and surveillance factors in the initial processing of human ribosomal RNA

    PubMed Central

    Sloan, Katherine E.; Bohnsack, Markus T.; Schneider, Claudia; Watkins, Nicholas J.

    2014-01-01

    During eukaryotic ribosome biogenesis, three of the mature ribosomal (r)RNAs are released from a single precursor transcript (pre-rRNA) by an ordered series of endonucleolytic cleavages and exonucleolytic processing steps. Production of the 18S rRNA requires the removal of the 5′ external transcribed spacer (5′ETS) by endonucleolytic cleavages at sites A0 and A1/site 1. In metazoans, an additional cleavage in the 5′ETS, at site A′, upstream of A0, has also been reported. Here, we have investigated how A′ processing is coordinated with assembly of the early preribosomal complex. We find that only the tUTP (UTP-A) complex is critical for A′ cleavage, while components of the bUTP (UTP-B) and U3 snoRNP are important, but not essential, for efficient processing at this site. All other factors involved in the early stages of 18S rRNA processing that were tested here function downstream from this processing step. Interestingly, we show that the RNA surveillance factors XRN2 and MTR4 are also involved in A′ cleavage in humans. A′ cleavage is largely bypassed when XRN2 is depleted, and we also discover that A′ cleavage is not always the initial processing event in all cell types. Together, our data suggest that A′ cleavage is not a prerequisite for downstream pre-rRNA processing steps and may, in fact, represent a quality control step for initial pre-rRNA transcripts. Furthermore, we show that components of the RNA surveillance machinery, including the exosome and TRAMP complexes, also play key roles in the recycling of excised spacer fragments and degradation of aberrant pre-rRNAs in human cells. PMID:24550520

  15. Focused-electron-beam-induced processing (FEBIP) for emerging applications in carbon nanoelectronics

    NASA Astrophysics Data System (ADS)

    Fedorov, Andrei G.; Kim, Songkil; Henry, Mathias; Kulkarni, Dhaval; Tsukruk, Vladimir V.

    2014-12-01

    Focused-electron-beam-induced processing (FEBIP), a resist-free additive nanomanufacturing technique, is an actively researched method for "direct-write" processing of a wide range of structural and functional nanomaterials, with a high degree of spatial and time-domain control. This article attempts to critically assess the capabilities and unique value proposition of FEBIP in the context of processing of electronics materials, with a particular emphasis on emerging carbon (i.e., graphene- and carbon-nanotube-based) devices and interconnect structures. One of the major hurdles in advancing carbon-based electronic materials and device fabrication is the disjoint nature of the various processing steps involved in making a functional device from the precursor graphene/CNT materials. Not only does this multi-step sequence severely limit throughput and increase cost, but it also dramatically reduces processing reproducibility and negatively impacts quality because of possible between-step contamination, especially for impurity-susceptible materials such as graphene. FEBIP provides a unique opportunity to address many challenges of carbon nanoelectronics, especially when it is employed as part of an integrated processing environment based on multiple "beams" of energetic particles, including electrons, photons, and molecules. This avenue is promising from an applications perspective, as such a multi-functional (electron/photon/molecule) beam environment enables one to define shapes (patterning), form structures (deposition/etching), and modify properties (cleaning/doping/annealing) with locally resolved control at the nanoscale using the same tool, without ever changing the processing environment. It thus will have a direct positive impact on enhancing functionality, improving quality, and reducing fabrication costs for electronic devices based on both conventional CMOS and emerging carbon (CNT/graphene) materials.

  16. MOCAT: A Metagenomics Assembly and Gene Prediction Toolkit

    PubMed Central

    Li, Junhua; Chen, Weineng; Chen, Hua; Mende, Daniel R.; Arumugam, Manimozhiyan; Pan, Qi; Liu, Binghang; Qin, Junjie; Wang, Jun; Bork, Peer

    2012-01-01

    MOCAT is a highly configurable, modular pipeline for fast, standardized processing of single or paired-end sequencing data generated by the Illumina platform. The pipeline uses state-of-the-art programs to quality control, map, and assemble reads from metagenomic samples sequenced at a depth of several billion base pairs, and predict protein-coding genes on assembled metagenomes. Mapping against reference databases allows for read extraction or removal, as well as abundance calculations. Relevant statistics for each processing step can be summarized into multi-sheet Excel documents and queryable SQL databases. MOCAT runs on UNIX machines and integrates seamlessly with the SGE and PBS queuing systems, commonly used to process large datasets. The open source code and modular architecture allow users to modify or exchange the programs that are utilized in the various processing steps. Individual processing steps and parameters were benchmarked and tested on artificial, real, and simulated metagenomes resulting in an improvement of selected quality metrics. MOCAT can be freely downloaded at http://www.bork.embl.de/mocat/. PMID:23082188
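
    MOCAT's actual steps are external programs wired together by the pipeline, but the modular idea (named, exchangeable steps applied in order, with per-step statistics collected along the way) can be sketched generically; the step names and filters below are illustrative, not MOCAT's.

```python
def run_pipeline(reads, steps):
    """Modular pipeline sketch: each named step is an exchangeable
    function applied in order, and per-step statistics (here, the
    number of reads surviving each step) are collected as it runs."""
    stats = {}
    for name, fn in steps:
        reads = fn(reads)
        stats[name] = len(reads)
    return reads, stats

# Illustrative steps: drop reads with ambiguous bases, then short reads.
steps = [
    ("quality_control", lambda rs: [r for r in rs if "N" not in r]),
    ("length_filter",   lambda rs: [r for r in rs if len(r) >= 4]),
]
reads, stats = run_pipeline(["ACGT", "ACNT", "ACG", "GGTTA"], steps)
```

    Swapping a step is just replacing its function in the list, which is the property that lets users exchange the programs used in each processing step.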

  17. MOCAT: a metagenomics assembly and gene prediction toolkit.

    PubMed

    Kultima, Jens Roat; Sunagawa, Shinichi; Li, Junhua; Chen, Weineng; Chen, Hua; Mende, Daniel R; Arumugam, Manimozhiyan; Pan, Qi; Liu, Binghang; Qin, Junjie; Wang, Jun; Bork, Peer

    2012-01-01

    MOCAT is a highly configurable, modular pipeline for fast, standardized processing of single or paired-end sequencing data generated by the Illumina platform. The pipeline uses state-of-the-art programs to quality control, map, and assemble reads from metagenomic samples sequenced at a depth of several billion base pairs, and predict protein-coding genes on assembled metagenomes. Mapping against reference databases allows for read extraction or removal, as well as abundance calculations. Relevant statistics for each processing step can be summarized into multi-sheet Excel documents and queryable SQL databases. MOCAT runs on UNIX machines and integrates seamlessly with the SGE and PBS queuing systems, commonly used to process large datasets. The open source code and modular architecture allow users to modify or exchange the programs that are utilized in the various processing steps. Individual processing steps and parameters were benchmarked and tested on artificial, real, and simulated metagenomes resulting in an improvement of selected quality metrics. MOCAT can be freely downloaded at http://www.bork.embl.de/mocat/.

  18. Systematic procedure for designing processes with multiple environmental objectives.

    PubMed

    Kim, Ki-Joo; Smith, Raymond L

    2005-04-01

    Evaluation of multiple objectives is very important in designing environmentally benign processes. It requires a systematic procedure for solving multiobjective decision-making problems due to the complex nature of the problems, the need for complex assessments, and the complicated analysis of multidimensional results. In this paper, a novel systematic procedure is presented for designing processes with multiple environmental objectives. This procedure has four steps: initialization, screening, evaluation, and visualization. The first two steps are used for systematic problem formulation based on mass and energy estimation and order of magnitude analysis. In the third step, an efficient parallel multiobjective steady-state genetic algorithm is applied to design environmentally benign and economically viable processes and to provide more accurate and uniform Pareto optimal solutions. In the last step a new visualization technique for illustrating multiple objectives and their design parameters on the same diagram is developed. Through these integrated steps the decision-maker can easily determine design alternatives with respect to his or her preferences. Most importantly, this technique is independent of the number of objectives and design parameters. As a case study, acetic acid recovery from aqueous waste mixtures is investigated by minimizing eight potential environmental impacts and maximizing total profit. After applying the systematic procedure, the most preferred design alternatives and their design parameters are easily identified.
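
    The genetic algorithm itself is beyond a short sketch, but the Pareto-optimality test at its heart (keep only designs not dominated in every objective) is compact. In this illustration all objectives are expressed so that smaller is better, with profit negated; the design tuples are hypothetical.

```python
def pareto_front(designs):
    """Keep only Pareto-optimal designs: a design is dropped if some
    other design is at least as good in every objective and strictly
    better in at least one (all objectives minimized)."""
    def dominates(a, b):
        return (all(x <= y for x, y in zip(a, b))
                and any(x < y for x, y in zip(a, b)))
    return [d for d in designs
            if not any(dominates(e, d) for e in designs if e != d)]

# (environmental impact, negated profit): (2, -5) dominates (3, -4),
# while (1, -2) survives as a trade-off.
front = pareto_front([(2, -5), (3, -4), (1, -2)])
```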

  19. Assays for myasthenia gravis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lindstrom, J.M.

    1988-12-06

This patent describes an improvement in a process for diagnosing myasthenia gravis. The process comprises the steps of preparing a complex of acetylcholine receptor protein, toxin, and a radioactive isotope; incubating the complex with a serum sample from a patient so as to join antibodies engendered in connection with myasthenia gravis to the complex; precipitating the antibody-joined complex with anti-immunoglobulin; and measuring radioactivity, from the radioactive isotope, of the precipitated complex. The improvement is that the acetylcholine receptor protein is derived from cells of the TE671 line.

  20. Spectral Collocation Time-Domain Modeling of Diffractive Optical Elements

    NASA Astrophysics Data System (ADS)

    Hesthaven, J. S.; Dinesen, P. G.; Lynov, J. P.

    1999-11-01

A spectral collocation multi-domain scheme is developed for the accurate and efficient time-domain solution of Maxwell's equations within multi-layered diffractive optical elements. Special attention is paid to the modeling of out-of-plane waveguide couplers. Emphasis is given to the proper construction of high-order schemes with the ability to handle very general problems of considerable geometric and material complexity. Central questions regarding efficient absorbing boundary conditions and time-stepping issues are also addressed. The efficacy of the overall scheme for the time-domain modeling of electrically large, and computationally challenging, problems is illustrated by solving a number of plane as well as non-plane waveguide problems.

  1. Setting the Scope of Concept Inventories for Introductory Computing Subjects

    ERIC Educational Resources Information Center

    Goldman, Ken; Gross, Paul; Heeren, Cinda; Herman, Geoffrey L.; Kaczmarczyk, Lisa; Loui, Michael C.; Zilles, Craig

    2010-01-01

    A concept inventory is a standardized assessment tool intended to evaluate a student's understanding of the core concepts of a topic. In order to create a concept inventory it is necessary to accurately identify these core concepts. A Delphi process is a structured multi-step process that uses a group of experts to achieve a consensus opinion. We…

  2. Batch and multi-step fed-batch enzymatic saccharification of Formiline-pretreated sugarcane bagasse at high solid loadings for high sugar and ethanol titers.

    PubMed

    Zhao, Xuebing; Dong, Lei; Chen, Liang; Liu, Dehua

    2013-05-01

Formiline pretreatment pertains to a biomass fractionation process. In the present work, Formiline-pretreated sugarcane bagasse was hydrolyzed with cellulases by batch and multi-step fed-batch processes at 20% solid loading. For wet pulp, after 144 h incubation with a cellulase loading of 10 FPU/g dry solid, the fed-batch process obtained ~150 g/L glucose and ~80% glucan conversion, while the batch process obtained ~130 g/L glucose with corresponding ~70% glucan conversion. Solid loading could be further increased to 30% for the acetone-dried pulp. By fed-batch hydrolysis of the dried pulp in pH 4.8 buffer solution, glucose concentration reached 247.3±1.6 g/L with corresponding 86.1±0.6% glucan conversion. The enzymatic hydrolyzates could be well converted to ethanol by a subsequent fermentation using Saccharomyces cerevisiae with an ethanol titer of 60-70 g/L. Batch and fed-batch SSF indicated that the Formiline-pretreated substrate showed excellent fermentability. The final ethanol concentration was 80 g/L with corresponding 82.7% of theoretical yield. Copyright © 2012 Elsevier Ltd. All rights reserved.
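The relation between a glucose titer and the reported glucan conversion can be checked with simple stoichiometry: hydrolysis adds one water per glucose unit, so 162 g of anhydroglucose in glucan yields 180 g of glucose. The sketch below uses hypothetical loading values, not the paper's actual masses.

```python
# Back-of-the-envelope check of glucan conversion from a glucose titer.
# The volume and glucan loading here are assumed, illustrative numbers.
# Hydrolysis adds one water per glucose: 162 g anhydroglucose -> 180 g glucose.

def glucan_conversion(glucose_g_per_l, volume_l, glucan_loaded_g):
    """Fraction of loaded glucan converted to glucose."""
    glucose_g = glucose_g_per_l * volume_l
    glucan_equiv = glucose_g * 162.0 / 180.0   # glucose mass back to glucan mass
    return glucan_equiv / glucan_loaded_g

# e.g. 150 g/L glucose in 1 L of hydrolyzate from 170 g glucan (hypothetical)
conv = glucan_conversion(150.0, 1.0, 170.0)
print(f"{conv:.1%}")
```

With these assumed numbers the conversion comes out near 80%, the same order as the fed-batch result quoted in the record.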

  3. A custom correlation coefficient (CCC) approach for fast identification of multi-SNP association patterns in genome-wide SNPs data.

    PubMed

    Climer, Sharlee; Yang, Wei; de las Fuentes, Lisa; Dávila-Román, Victor G; Gu, C Charles

    2014-11-01

Complex diseases are often associated with sets of multiple interacting genetic factors and possibly with unique sets of the genetic factors in different groups of individuals (genetic heterogeneity). We introduce a novel concept of custom correlation coefficient (CCC) between single nucleotide polymorphisms (SNPs) that addresses genetic heterogeneity by measuring subset correlations autonomously. It is used to develop a 3-step process to identify candidate multi-SNP patterns: (1) pairwise (SNP-SNP) correlations are computed using CCC; (2) clusters of correlated SNPs are identified; and (3) the frequencies of these clusters in disease cases and controls are compared to identify disease-associated multi-SNP patterns. This method identified 42 candidate multi-SNP associations with hypertensive heart disease (HHD), among which one cluster of 22 SNPs (six genes) included 13 in SLC8A1 (aka NCX1, an essential component of cardiac excitation-contraction coupling) and another of 32 SNPs had 29 from a different segment of SLC8A1. While allele frequencies show little difference between cases and controls, the cluster of 22 associated alleles was found in 20% of controls but no cases, and the other in 3% of controls but 20% of cases. These findings suggest that both protective and risk effects on HHD could be exerted by combinations of variants in different regions of SLC8A1, modified by variants from other genes. The results demonstrate that this new correlation metric identifies disease-associated multi-SNP patterns overlooked by commonly used correlation measures. Furthermore, computation time using CCC is a small fraction of that required by other methods, thereby enabling the analyses of large GWAS datasets. © 2014 WILEY PERIODICALS, INC.

  4. A custom correlation coefficient (CCC) approach for fast identification of multi-SNP association patterns in genome-wide SNPs data

    PubMed Central

    Climer, Sharlee; Yang, Wei; de las Fuentes, Lisa; Dávila-Román, Victor G.; Gu, C. Charles

    2014-01-01

Complex diseases are often associated with sets of multiple interacting genetic factors and possibly with unique sets of the genetic factors in different groups of individuals (genetic heterogeneity). We introduce a novel concept of Custom Correlation Coefficient (CCC) between single nucleotide polymorphisms (SNPs) that addresses genetic heterogeneity by measuring subset correlations autonomously. It is used to develop a 3-step process to identify candidate multi-SNP patterns: (1) pairwise (SNP-SNP) correlations are computed using CCC; (2) clusters of correlated SNPs are identified; and (3) the frequencies of these clusters in disease cases and controls are compared to identify disease-associated multi-SNP patterns. This method identified 42 candidate multi-SNP associations with hypertensive heart disease (HHD), among which one cluster of 22 SNPs (6 genes) included 13 in SLC8A1 (aka NCX1, an essential component of cardiac excitation-contraction coupling) and another of 32 SNPs had 29 from a different segment of SLC8A1. While allele frequencies show little difference between cases and controls, the cluster of 22 associated alleles was found in 20% of controls but no cases, and the other in 3% of controls but 20% of cases. These findings suggest that both protective and risk effects on HHD could be exerted by combinations of variants in different regions of SLC8A1, modified by variants from other genes. The results demonstrate that this new correlation metric identifies disease-associated multi-SNP patterns overlooked by commonly used correlation measures. Furthermore, computation time using CCC is a small fraction of that required by other methods, thereby enabling the analyses of large GWAS datasets. PMID:25168954

  5. A 3D numerical study of LO2/GH2 supercritical combustion in the ONERA-Mascotte Test-rig configuration

    NASA Astrophysics Data System (ADS)

    Benmansour, Abdelkrim; Liazid, Abdelkrim; Logerais, Pierre-Olivier; Durastanti, Jean-Félix

    2016-02-01

Cryogenic propellants LOx/H2 are used at very high pressure in rocket engine combustion. The description of the combustion process in such an application is very complex, due essentially to the supercritical regime: the ideal gas law becomes invalid. To capture the average characteristics of this combustion process, numerical computations are performed using a model based on a one-phase multi-component approach. Such work requires fluid properties and a correct definition of the mixture behavior, generally described by cubic equations of state with appropriate thermodynamic relations validated against the NIST data. In this study we consider an alternative way to capture the effect of real gas by testing the volume-weighted-mixing-law in association with the component transport properties, using fits of the NIST library data directly, including the supercritical regime range. The numerical simulations are carried out using a 3D RANS approach with two tested turbulence models, the standard k-Epsilon model and the realizable k-Epsilon one. The combustion model is also associated with two chemical reaction mechanisms: the first is a one-step generic chemical reaction and the second is a two-step chemical reaction. The obtained results, such as temperature profiles, recirculation zones, visible flame lengths and distributions of OH species, are discussed.

  6. Delamination detection by Multi-Level Wavelet Processing of Continuous Scanning Laser Doppler Vibrometry data

    NASA Astrophysics Data System (ADS)

    Chiariotti, P.; Martarelli, M.; Revel, G. M.

    2017-12-01

A novel non-destructive testing procedure for delamination detection based on the exploitation of the simultaneous time and spatial sampling provided by Continuous Scanning Laser Doppler Vibrometry (CSLDV) and the feature extraction capability of Multi-Level wavelet-based processing is presented in this paper. The processing procedure consists of a multi-step approach. Once the optimal mother-wavelet is selected as the one maximizing the Energy to Shannon Entropy Ratio criterion within the mother-wavelet space, a pruning operation aiming at identifying the best combination of nodes inside the full-binary tree given by Wavelet Packet Decomposition (WPD) is performed. The pruning algorithm combines, in a two-step way, a measure of the randomness of the point pattern distribution on the damage map space with an analysis of the energy concentration of the wavelet coefficients on those nodes provided by the first pruning operation. A combination of the point pattern distributions provided by each node of the ensemble node set from the pruning algorithm allows for setting a Damage Reliability Index associated to the final damage map. The effectiveness of the whole approach is proven on both simulated and real test cases. A sensitivity analysis related to the influence of noise on the CSLDV signal provided to the algorithm is also discussed, showing that the processing developed is sufficiently robust to measurement noise. The method is promising: damages are well identified on different materials and for different damage-structure varieties.
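The mother-wavelet selection criterion named in this record can be sketched directly. The candidate coefficient sets below are invented; in a real use they would come from a wavelet packet decomposition of the CSLDV signal for each candidate mother-wavelet.

```python
# Sketch of the Energy to Shannon Entropy Ratio criterion: among candidate
# decompositions, pick the one whose coefficient energy is most concentrated
# (high energy, low entropy). Coefficient sets here are invented.

import math

def energy_to_shannon_entropy(coeffs):
    energy = sum(c * c for c in coeffs)
    # each coefficient's share of the total energy, treated as a probability
    p = [c * c / energy for c in coeffs if c != 0.0]
    entropy = -sum(pi * math.log(pi) for pi in p)
    return energy / entropy

candidates = {
    "waveletA": [0.1, 0.1, 0.1, 0.1],     # energy spread out -> high entropy
    "waveletB": [2.0, 0.05, 0.05, 0.05],  # energy concentrated -> low entropy
}
best = max(candidates, key=lambda w: energy_to_shannon_entropy(candidates[w]))
print(best)
```

The criterion favors a mother-wavelet that packs the signal's energy into few coefficients, which is what makes the subsequent tree pruning informative.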

  7. Robust and fast nonlinear optimization of diffusion MRI microstructure models.

    PubMed

    Harms, R L; Fritz, F J; Tobisch, A; Goebel, R; Roebroeck, A

    2017-07-15

Advances in biophysical multi-compartment modeling for diffusion MRI (dMRI) have gained popularity because of greater specificity than DTI in relating the dMRI signal to underlying cellular microstructure. A large range of these diffusion microstructure models have been developed, and each of the popular models comes with its own, often different, optimization algorithm, noise model and initialization strategy to estimate its parameter maps. Since data fit, accuracy and precision are hard to verify, this creates additional challenges to comparability and generalization of results from diffusion microstructure models. In addition, non-linear optimization is computationally expensive, leading to very long run times, which can be prohibitive in large group or population studies. In this technical note we investigate the performance of several optimization algorithms and initialization strategies over a few of the most popular diffusion microstructure models, including NODDI and CHARMED. We evaluate whether a single well performing optimization approach exists that could be applied to many models and would equate both run time and fit aspects. All models, algorithms and strategies were implemented on the Graphics Processing Unit (GPU) to remove run time constraints, with which we achieve whole brain dataset fits in seconds to minutes. We then evaluated fit, accuracy, precision and run time for different models of differing complexity against three common optimization algorithms and three parameter initialization strategies. Variability of the achieved quality of fit in actual data was evaluated on ten subjects of each of two population studies with a different acquisition protocol. We find that optimization algorithms and multi-step optimization approaches have a considerable influence on performance and stability over subjects and over acquisition protocols. 
The gradient-free Powell conjugate-direction algorithm was found to outperform other common algorithms in terms of run time, fit, accuracy and precision. Parameter initialization approaches were found to be relevant especially for more complex models, such as those involving several fiber orientations per voxel. For these, a fitting cascade initializing or fixing parameter values in a later optimization step from simpler models in an earlier optimization step further improved run time, fit, accuracy and precision compared to a single step fit. This establishes and makes available standards by which robust fit and accuracy can be achieved in shorter run times. This is especially relevant for the use of diffusion microstructure modeling in large group or population studies and in combining microstructure parameter maps with tractography results. Copyright © 2017 The Authors. Published by Elsevier Inc. All rights reserved.
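The fitting-cascade idea described here, initializing a complex model's parameters from a simpler model fitted first, can be shown on a toy problem. This is not the paper's GPU implementation or a diffusion model: a constant model seeds a straight-line fit that is then refined by coordinate descent, standing in for seeding e.g. CHARMED from a simpler model's maps.

```python
# Hedged sketch of a two-step fitting cascade: fit a simple model, then use
# its parameter to initialize a more complex model refined iteratively.

def fit_constant(xs, ys):
    """Step 1: best-fit constant (the simpler model)."""
    return sum(ys) / len(ys)

def fit_line_cascaded(xs, ys, passes=50):
    b = fit_constant(xs, ys)      # intercept initialized from the simple fit
    a = 0.0                       # slope starts at the simple model's value
    for _ in range(passes):       # step 2: refine each parameter in turn
        a = sum(x * (y - b) for x, y in zip(xs, ys)) / sum(x * x for x in xs)
        b = sum(y - a * x for x, y in zip(xs, ys)) / len(xs)
    return a, b

xs = [0.0, 1.0, 2.0, 3.0]
ys = [1.0, 3.0, 5.0, 7.0]     # exactly y = 2x + 1
a, b = fit_line_cascaded(xs, ys)
print(round(a, 3), round(b, 3))
```

The cascade converges to the line's true parameters; the benefit in the microstructure setting is that the later, harder optimization starts near a plausible optimum instead of a random point.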

  8. Acoustic resonator and method of making same

    DOEpatents

    Kline, Gerald R.; Lakin, Kenneth M.

    1985-03-05

    A method of fabricating an acoustic wave resonator wherein all processing steps are accomplished from a single side of said substrate. The method involves deposition of a multi-layered Al/AlN structure on a GaAs substrate followed by a series of fabrication steps to define a resonator from said composite. The resulting resonator comprises an AlN layer between two Al layers and another layer of AlN on an exterior of one of said Al layers.

  9. Acoustic resonator and method of making same

    DOEpatents

    Kline, G.R.; Lakin, K.M.

    1983-10-13

    A method of fabricating an acoustic wave resonator wherein all processing steps are accomplished from a single side of said substrate. The method involves deposition of a multi-layered Al/AlN structure on a GaAs substrate followed by a series of fabrication steps to define a resonator from said composite. The resulting resonator comprises an AlN layer between two Al layers and another layer of AlN on an exterior of one of said Al layers.

  10. Introduction to Remote Sensing Image Registration

    NASA Technical Reports Server (NTRS)

    Le Moigne, Jacqueline

    2017-01-01

For many applications, accurate and fast image registration of large amounts of multi-source data is the first necessary step before subsequent processing and integration. Image registration is defined by several steps, and each step can be approached by various methods which all present diverse advantages and drawbacks depending on the type of data, the type of application, the a priori information known about the data, and the type of accuracy that is required. This paper first presents a general overview of remote sensing image registration and then goes over a few specific methods and their applications.
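One of the registration steps the overview refers to, estimating the geometric transformation, can be sketched in its simplest form: an exhaustive search for an integer translation. The images below are tiny invented arrays; real methods add feature extraction, similarity metrics beyond SSD, and subpixel refinement.

```python
# Minimal sketch of one registration step: estimating an integer (row, col)
# translation between two images by exhaustive search over a small shift
# range, scoring with sum-of-squared differences on the overlapping region.

def ssd(a, b, dr, dc):
    """Mean squared difference of b shifted by (dr, dc) against a."""
    rows, cols = len(a), len(a[0])
    total, n = 0.0, 0
    for r in range(rows):
        for c in range(cols):
            r2, c2 = r + dr, c + dc
            if 0 <= r2 < rows and 0 <= c2 < cols:
                total += (a[r][c] - b[r2][c2]) ** 2
                n += 1
    return total / n

def register(a, b, max_shift=2):
    shifts = [(dr, dc) for dr in range(-max_shift, max_shift + 1)
                       for dc in range(-max_shift, max_shift + 1)]
    return min(shifts, key=lambda s: ssd(a, b, *s))

base = [[0, 0, 0, 0],
        [0, 9, 9, 0],
        [0, 9, 9, 0],
        [0, 0, 0, 0]]
moved = [[0, 0, 0, 0],        # same pattern shifted down-right by one pixel
         [0, 0, 0, 0],
         [0, 0, 9, 9],
         [0, 0, 9, 9]]
shift = register(base, moved)
print(shift)
```

The recovered shift of one pixel down and one right illustrates why the choice of similarity metric and search strategy at this step drives both accuracy and speed.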

  11. Tissue-aware RNA-Seq processing and normalization for heterogeneous and sparse data.

    PubMed

    Paulson, Joseph N; Chen, Cho-Yi; Lopes-Ramos, Camila M; Kuijjer, Marieke L; Platig, John; Sonawane, Abhijeet R; Fagny, Maud; Glass, Kimberly; Quackenbush, John

    2017-10-03

Although ultrahigh-throughput RNA-Sequencing has become the dominant technology for genome-wide transcriptional profiling, the vast majority of RNA-Seq studies typically profile only tens of samples, and most analytical pipelines are optimized for these smaller studies. However, projects are generating ever-larger data sets comprising RNA-Seq data from hundreds or thousands of samples, often collected at multiple centers and from diverse tissues. These complex data sets present significant analytical challenges due to batch and tissue effects, but provide the opportunity to revisit the assumptions and methods that we use to preprocess, normalize, and filter RNA-Seq data - critical first steps for any subsequent analysis. We find that analysis of large RNA-Seq data sets requires both careful quality control and the need to account for sparsity due to the heterogeneity intrinsic in multi-group studies. We developed the Yet Another RNA Normalization software pipeline (YARN), which includes quality control and preprocessing, gene filtering, and normalization steps designed to facilitate downstream analysis of large, heterogeneous RNA-Seq data sets, and we demonstrate its use with data from the Genotype-Tissue Expression (GTEx) project. An R package instantiating YARN is available at http://bioconductor.org/packages/yarn .
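Two of the early steps such pipelines share, gene filtering and normalization, can be sketched generically. This is not YARN's R implementation: a simple low-count filter and counts-per-million scaling are labeled stand-ins, and the count matrix is invented.

```python
# Sketch of gene filtering and sample normalization for RNA-Seq counts.
# Counts-per-million scaling stands in for YARN's actual normalization;
# the genes-by-samples count matrix below is invented.

def filter_low_expression(counts, min_count=5, min_samples=2):
    """Keep genes with at least min_count reads in at least min_samples samples."""
    return {g: row for g, row in counts.items()
            if sum(c >= min_count for c in row) >= min_samples}

def counts_per_million(counts):
    """Scale each sample's counts to reads-per-million for comparability."""
    n = len(next(iter(counts.values())))
    totals = [sum(row[i] for row in counts.values()) for i in range(n)]
    return {g: [1e6 * c / t for c, t in zip(row, totals)]
            for g, row in counts.items()}

counts = {                    # genes x samples
    "geneA": [100, 200, 150],
    "geneB": [0, 1, 0],       # essentially unexpressed
    "geneC": [50, 40, 60],
}
kept = filter_low_expression(counts)
cpm = counts_per_million(kept)
print(sorted(kept))
```

The sparsity point in the record is visible even here: dropping near-zero genes first keeps the per-sample totals, and hence the normalization, from being distorted by noise.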

  12. Evaluation of a UMLS Auditing Process of Semantic Type Assignments

    PubMed Central

    Gu, Huanying; Hripcsak, George; Chen, Yan; Morrey, C. Paul; Elhanan, Gai; Cimino, James J.; Geller, James; Perl, Yehoshua

    2007-01-01

    The UMLS is a terminological system that integrates many source terminologies. Each concept in the UMLS is assigned one or more semantic types from the Semantic Network, an upper level ontology for biomedicine. Due to the complexity of the UMLS, errors exist in the semantic type assignments. Finding assignment errors may unearth modeling errors. Even with sophisticated tools, discovering assignment errors requires manual review. In this paper we describe the evaluation of an auditing project of UMLS semantic type assignments. We studied the performance of the auditors who reviewed potential errors. We found that four auditors, interacting according to a multi-step protocol, identified a high rate of errors (one or more errors in 81% of concepts studied) and that results were sufficiently reliable (0.67 to 0.70) for the two most common types of errors. However, reliability was low for each individual auditor, suggesting that review of potential errors is resource-intensive. PMID:18693845

  13. Multi-perspective smFRET reveals rate-determining late intermediates of ribosomal translocation

    PubMed Central

    Wasserman, Michael R.; Alejo, Jose L.; Altman, Roger B.; Blanchard, Scott C.

    2016-01-01

Directional translocation of the ribosome through the messenger RNA open reading frame is a critical determinant of translational fidelity. This process entails a complex interplay of large-scale conformational changes within the actively translating particle, which together coordinate the movement of transfer and messenger RNA substrates with respect to the large and small ribosomal subunits. Using pre-steady state, single-molecule fluorescence resonance energy transfer imaging, we have tracked the nature and timing of these conformational events within the Escherichia coli ribosome from five structural perspectives. Our investigations reveal direct evidence of structurally and kinetically distinct, late intermediates during substrate movement, whose resolution is rate-determining to the translocation mechanism. These steps involve intra-molecular events within the EF-G(GDP)-bound ribosome, including exaggerated, reversible fluctuations of the small subunit head domain, which ultimately facilitate peptidyl-tRNA’s movement into its final post-translocation position. PMID:26926435

  14. A conserved Mediator–CDK8 kinase module association regulates Mediator–RNA polymerase II interaction

    PubMed Central

    Tsai, Kuang-Lei; Sato, Shigeo; Tomomori-Sato, Chieri; Conaway, Ronald C.; Conaway, Joan W.; Asturias, Francisco J.

    2013-01-01

    The CDK8 kinase module (CKM) is a conserved, dissociable Mediator subcomplex whose component subunits were genetically linked to the RNA polymerase II (RNAPII) carboxy-terminal domain (CTD) and individually recognized as transcriptional repressors before Mediator was identified as a preeminent complex in eukaryotic transcription regulation. We used macromolecular electron microscopy and biochemistry to investigate the subunit organization, structure, and Mediator interaction of the Saccharomyces cerevisiae CKM. We found that interaction of the CKM with Mediator’s Middle module interferes with CTD-dependent RNAPII binding to a previously unknown Middle module CTD-binding site targeted early on in a multi-step holoenzyme formation process. Taken together, our results reveal the basis for CKM repression, clarify the origin of the connection between CKM subunits and the CTD, and suggest that a combination of competitive interactions and conformational changes that facilitate holoenzyme formation underlie the Mediator mechanism. PMID:23563140

  15. Stalking the IQ Quark.

    ERIC Educational Resources Information Center

    Sternberg, Robert J.

    1979-01-01

    An information-processing framework is presented for understanding intelligence. Two levels of processing are discussed: the steps involved in solving a complex intellectual task, and higher-order processes used to decide how to solve the problem. (MH)

  16. Managing IT service management implementation complexity: from the perspective of the Warfield Version of systems science

    NASA Astrophysics Data System (ADS)

    Wan, Jiangping; Jones, James D.

    2013-11-01

The Warfield version of systems science supports a wide variety of application areas, and is useful to practitioners who use the work program of complexity (WPOC) tool. In this article, WPOC is applied to information technology service management (ITSM) for managing the complexity of projects. In discussing the application of WPOC to ITSM, we discuss several steps of WPOC. The discovery step of WPOC consists of a description process and a diagnosis process. During the description process, 52 risk factors are identified, which are then narrowed to 20 key risk factors. All of this is done by interviews and surveys. Root risk factors (the most basic risk factors) consist of 11 kinds of common 'mindbugs' which are selected from an interpretive structural model. This is achieved by empirical analysis of 25 kinds of mindbugs. (A lesser aim of this research is to affirm that these mindbugs developed from a Western mindset have corresponding relevance in a completely different culture: the People's Republic of China.) During the diagnosis process, the relationships among the root risk factors in the implementation of the ITSM project are identified. The resolution step of WPOC consists of a design process and an implementation process. During the design process, issues related to the ITSM application are compared to both e-Government operation and maintenance, and software process improvement. The ITSM knowledge support structure is also designed at this time. During the implementation process, 10 keys to the successful implementation of ITSM projects are identified.

  17. Multifunctional nanoparticles for drug/gene delivery in nanomedicine

    NASA Astrophysics Data System (ADS)

    Seale, Mary-Margaret; Zemlyanov, Dimitry; Cooper, Christy L.; Haglund, Emily; Prow, Tarl W.; Reece, Lisa M.; Leary, James F.

    2007-02-01

    Multifunctional nanoparticles hold great promise for drug/gene delivery. Multilayered nanoparticles can act as nanomedical systems with on-board "molecular programming" to accomplish complex multi-step tasks. For example, the targeting process has only begun when the nanosystem has found the correct diseased cell of interest. Then it must pass the cell membrane and avoid enzymatic destruction within the endosomes of the cell. Since the nanosystem is only about one millionth the volume of a human cell, for it to have therapeutic efficacy with its contained package, it must deliver that drug or gene to the appropriate site within the living cell. The successive de-layering of these nanosystems in a controlled fashion allows the system to accomplish operations that would be difficult or impossible to do with even complex single molecules. In addition, portions of the nanosystem may be protected from premature degradation or mistargeting to non-diseased cells. All of these problems remain major obstacles to successful drug delivery with a minimum of deleterious side effects to the patient. This paper describes some of the many components involved in the design of a general platform technology for nanomedical systems. The feasibility of most of these components has been demonstrated by our group and others. But the integration of these interacting sub-components remains a challenge. We highlight four components of this process as examples. Each subcomponent has its own sublevels of complexity. But good nanomedical systems have to be designed/engineered as a full nanomedical system, recognizing the need for the other components.

  18. A derived heuristics based multi-objective optimization procedure for micro-grid scheduling

    NASA Astrophysics Data System (ADS)

    Li, Xin; Deb, Kalyanmoy; Fang, Yanjun

    2017-06-01

With the availability of different types of power generators to be used in an electric micro-grid system, their operation scheduling as the load demand changes with time becomes an important task. Besides satisfying load balance constraints and the generator's rated power, several other practicalities, such as limited availability of grid power and restricted ramping of power output from generators, must all be considered during the operation scheduling process, which makes it difficult to decide whether the optimization results are accurate and satisfactory. In solving such complex practical problems, heuristics-based customized optimization algorithms are suggested. However, due to nonlinear and complex interactions of variables, it is difficult to come up with heuristics in such problems off-hand. In this article, a two-step strategy is proposed in which the first task deciphers important heuristics about the problem and the second task utilizes the derived heuristics to solve the original problem in a computationally fast manner. Specifically, the operation scheduling problem is considered from a two-objective (cost and emission) point of view. The first task develops basic and advanced level knowledge bases offline from a series of prior demand-wise optimization runs and then the second task utilizes them to modify optimized solutions in an application scenario. Results on island and grid-connected modes and several pragmatic formulations of the micro-grid operation scheduling problem clearly indicate the merit of the proposed two-step procedure.

  19. Radical Cations and Acid Protection during Radiolysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mincher, Bruce J.; Zarzana, Christopher A.; Mezyk, Stephen P.

    2016-09-09

Ligand molecules for used nuclear fuel separation schemes are exposed to high radiation fields and high concentrations of acid. Thus, an understanding of the complex interactions between extraction ligands, diluent, and acid is critical to understanding the performance of a separation process. The diglycolamides are ligands with important structural similarities to CMPO; however, previous work has shown that their radiolytic degradation has important mechanistic differences from CMPO. The DGAs do not enjoy radioprotection by HNO3, and the kinetics of DGA radiolytic degradation are different. CMPO degrades with pseudo-zero-order kinetics in linear fashion with absorbed dose, while the DGAs degrade in pseudo-first-order, exponential fashion. This suggests that the DGAs degrade by simple reaction with some product of direct diluent radiolysis, while CMPO degradation is probably multi-step, with a slow step that is not dependent on the CMPO concentration, and mitigated by HNO3. It is thus believed that radio-protection and the zero-order radiolytic degradation kinetics are related, and that these phenomena are a function of either the formation of strong acid complexes with CMPO and/or the presence of the CMPO phenyl ring. Experiments to test both these hypotheses have been designed and partially conducted. This report summarizes findings related to these phenomena for FY16, in satisfaction of milestone M3FT-16IN030104053. It also reports continued kinetic measurements for the reactions of the dodecane radical cation with solvent extraction ligands.
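The kinetic distinction the report draws, linear (pseudo-zero-order) versus exponential (pseudo-first-order) loss with absorbed dose, can be illustrated numerically. The starting concentration and dose constants below are invented, not the report's measured values.

```python
# Numerical illustration of the two degradation kinetics discussed above:
# CMPO-like pseudo-zero-order loss is linear in absorbed dose, while
# DGA-like pseudo-first-order loss is exponential. Constants are assumed.

import math

def zero_order(c0, k, dose):
    """Concentration after a dose when the rate is concentration-independent."""
    return max(c0 - k * dose, 0.0)

def first_order(c0, k, dose):
    """Concentration after a dose when the rate tracks the concentration."""
    return c0 * math.exp(-k * dose)

c0 = 0.1                      # mol/L, assumed starting ligand concentration
doses = [0, 100, 200, 400]    # kGy, illustrative dose points
linear = [zero_order(c0, 1e-4, d) for d in doses]
exponential = [first_order(c0, 5e-3, d) for d in doses]
print([round(c, 4) for c in linear])
print([round(c, 4) for c in exponential])
```

Plotted against dose, the first series is a straight line and the second a decaying exponential, which is the diagnostic the report uses to separate the two mechanisms.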

  20. RAMTaB: Robust Alignment of Multi-Tag Bioimages

    PubMed Central

    Raza, Shan-e-Ahmed; Humayun, Ahmad; Abouna, Sylvie; Nattkemper, Tim W.; Epstein, David B. A.; Khan, Michael; Rajpoot, Nasir M.

    2012-01-01

    Background In recent years, new microscopic imaging techniques have evolved to allow us to visualize several different proteins (or other biomolecules) in a visual field. Analysis of protein co-localization becomes viable because molecules can interact only when they are located close to each other. We present a novel approach to align images in a multi-tag fluorescence image stack. The proposed approach is applicable to multi-tag bioimaging systems which (a) acquire fluorescence images by sequential staining and (b) simultaneously capture a phase contrast image corresponding to each of the fluorescence images. To the best of our knowledge, there is no existing method in the literature, which addresses simultaneous registration of multi-tag bioimages and selection of the reference image in order to maximize the overall overlap between the images. Methodology/Principal Findings We employ a block-based method for registration, which yields a confidence measure to indicate the accuracy of our registration results. We derive a shift metric in order to select the Reference Image with Maximal Overlap (RIMO), in turn minimizing the total amount of non-overlapping signal for a given number of tags. Experimental results show that the Robust Alignment of Multi-Tag Bioimages (RAMTaB) framework is robust to variations in contrast and illumination, yields sub-pixel accuracy, and successfully selects the reference image resulting in maximum overlap. The registration results are also shown to significantly improve any follow-up protein co-localization studies. Conclusions For the discovery of protein complexes and of functional protein networks within a cell, alignment of the tag images in a multi-tag fluorescence image stack is a key pre-processing step. The proposed framework is shown to produce accurate alignment results on both real and synthetic data. 
Our future work will use the aligned multi-channel fluorescence image data for normal and diseased tissue specimens to analyze molecular co-expression patterns and functional protein networks. PMID:22363510
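The reference-selection idea behind RIMO, pick the image whose total shift to all others is smallest so that overlap is maximized, can be sketched with an invented table of estimated pairwise shifts. The shift metric here is a plain Euclidean magnitude, a labeled stand-in for the paper's derived metric.

```python
# Sketch of reference-image selection: given estimated pairwise shifts,
# choose the reference with the smallest total shift magnitude to the
# other images. The shift table below is invented for illustration.

def total_shift(ref, shifts):
    """Sum of Euclidean shift magnitudes of pairs involving ref."""
    return sum((dx * dx + dy * dy) ** 0.5
               for (a, b), (dx, dy) in shifts.items() if a == ref or b == ref)

def pick_reference(images, shifts):
    return min(images, key=lambda ref: total_shift(ref, shifts))

images = ["img0", "img1", "img2"]
shifts = {                    # (pair): (dx, dy) estimated between image pairs
    ("img0", "img1"): (6.0, 0.0),
    ("img0", "img2"): (4.0, 3.0),
    ("img1", "img2"): (1.0, 1.0),
}
ref = pick_reference(images, shifts)
print(ref)
```

Minimizing the total shift is what minimizes the non-overlapping signal across the stack for a given number of tags, which is the criterion the paper's RIMO selection formalizes.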

  1. The Design, Development and Testing of a Multi-process Real-time Software System

    DTIC Science & Technology

    2007-03-01

    programming large systems stems from the complexity of dealing with many different details at one time. A sound engineering approach is to break...controls and 3) is portable to other OS platforms such as Microsoft Windows. Next, to reduce the complexity of the programming tasks, the system...processes depending on how often the process has to check to see if common data was modified. A good method for one process to quickly notify another

  2. Tracking Virus Particles in Fluorescence Microscopy Images Using Multi-Scale Detection and Multi-Frame Association.

    PubMed

    Jaiswal, Astha; Godinez, William J; Eils, Roland; Lehmann, Maik Jorg; Rohr, Karl

    2015-11-01

    Automatic fluorescent particle tracking is an essential task to study the dynamics of a large number of biological structures at a sub-cellular level. We have developed a probabilistic particle tracking approach based on multi-scale detection and two-step multi-frame association. The multi-scale detection scheme allows coping with particles in close proximity. For finding associations, we have developed a two-step multi-frame algorithm, which is based on a temporally semiglobal formulation as well as spatially local and global optimization. In the first step, reliable associations are determined for each particle individually in local neighborhoods. In the second step, the global spatial information over multiple frames is exploited jointly to determine optimal associations. The multi-scale detection scheme and the multi-frame association finding algorithm have been combined with a probabilistic tracking approach based on the Kalman filter. We have successfully applied our probabilistic tracking approach to synthetic as well as real microscopy image sequences of virus particles and quantified the performance. We found that the proposed approach outperforms previous approaches.
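    The Kalman-filter component of such a probabilistic tracker can be sketched for a single particle with a constant-velocity motion model. This is an illustrative sketch only: it omits the paper's multi-scale detection and two-step multi-frame association, and all noise parameters below are assumed values.

    ```python
    import numpy as np

    # Constant-velocity Kalman filter for one particle (state: x, y, vx, vy).
    dt = 1.0
    F = np.array([[1, 0, dt, 0],
                  [0, 1, 0, dt],
                  [0, 0, 1, 0],
                  [0, 0, 0, 1]], dtype=float)   # state transition
    H = np.array([[1, 0, 0, 0],
                  [0, 1, 0, 0]], dtype=float)   # we observe position only
    Q = 0.01 * np.eye(4)                        # process noise (assumed)
    R = 0.25 * np.eye(2)                        # measurement noise (assumed)

    def kalman_step(x, P, z):
        # Predict the next state and covariance.
        x = F @ x
        P = F @ P @ F.T + Q
        # Update with the associated detection z = (x_meas, y_meas).
        y = z - H @ x                           # innovation
        S = H @ P @ H.T + R
        K = P @ H.T @ np.linalg.inv(S)          # Kalman gain
        x = x + K @ y
        P = (np.eye(4) - K @ H) @ P
        return x, P

    x = np.array([0.0, 0.0, 1.0, 0.5])          # initial state guess
    P = np.eye(4)
    for z in [(1.1, 0.4), (2.0, 1.1), (2.9, 1.6)]:
        x, P = kalman_step(x, P, np.array(z))
    print(np.round(x[:2], 2))
    ```

    In a full tracker, the predicted position and innovation covariance would feed the association step, gating which detections in the next frames are plausible matches for this particle.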

  3. Generalist solutions to complex problems: generating practice-based evidence - the example of managing multi-morbidity

    PubMed Central

    2013-01-01

    Background A growing proportion of people are living with long term conditions. The majority have more than one. Dealing with multi-morbidity is a complex problem for health systems: for those designing and implementing healthcare as well as for those providing the evidence informing practice. Yet the concept of multi-morbidity (the presence of >2 diseases) is a product of the design of health care systems which define health care need on the basis of disease status. So does the solution lie in an alternative model of healthcare? Discussion Strengthening generalist practice has been proposed as part of the solution to tackling multi-morbidity. Generalism is a professional philosophy of practice, deeply known to many practitioners, and described as expertise in whole person medicine. But generalism lacks the evidence base needed by policy makers and planners to support service redesign. The challenge is to fill this practice-research gap in order to critically explore if and when generalist care offers a robust alternative to management of this complex problem. We need practice-based evidence to fill this gap. By recognising generalist practice as a ‘complex intervention’ (intervening in a complex system), we outline an approach to evaluate impact using action-research principles. We highlight the implications for those who both commission and undertake research in order to tackle this problem. Summary Answers to the complex problem of multi-morbidity won’t come from doing more of the same. We need to change systems of care, and so the systems for generating evidence to support that care. This paper contributes to that work through outlining a process for generating practice-based evidence of generalist solutions to the complex problem of person-centred care for people with multi-morbidity. PMID:23919296

  4. Generalist solutions to complex problems: generating practice-based evidence--the example of managing multi-morbidity.

    PubMed

    Reeve, Joanne; Blakeman, Tom; Freeman, George K; Green, Larry A; James, Paul A; Lucassen, Peter; Martin, Carmel M; Sturmberg, Joachim P; van Weel, Chris

    2013-08-07

    A growing proportion of people are living with long term conditions. The majority have more than one. Dealing with multi-morbidity is a complex problem for health systems: for those designing and implementing healthcare as well as for those providing the evidence informing practice. Yet the concept of multi-morbidity (the presence of >2 diseases) is a product of the design of health care systems which define health care need on the basis of disease status. So does the solution lie in an alternative model of healthcare? Strengthening generalist practice has been proposed as part of the solution to tackling multi-morbidity. Generalism is a professional philosophy of practice, deeply known to many practitioners, and described as expertise in whole person medicine. But generalism lacks the evidence base needed by policy makers and planners to support service redesign. The challenge is to fill this practice-research gap in order to critically explore if and when generalist care offers a robust alternative to management of this complex problem. We need practice-based evidence to fill this gap. By recognising generalist practice as a 'complex intervention' (intervening in a complex system), we outline an approach to evaluate impact using action-research principles. We highlight the implications for those who both commission and undertake research in order to tackle this problem. Answers to the complex problem of multi-morbidity won't come from doing more of the same. We need to change systems of care, and so the systems for generating evidence to support that care. This paper contributes to that work through outlining a process for generating practice-based evidence of generalist solutions to the complex problem of person-centred care for people with multi-morbidity.

  5. Design and Implementation of the ARTEMIS Lunar Transfer Using Multi-Body Dynamics

    NASA Technical Reports Server (NTRS)

    Folta, David; Woodard, Mark; Sweetser, Theodore; Broschart, Stephen B.; Cosgrove, Daniel

    2011-01-01

    The use of multi-body dynamics to design the transfer of spacecraft from Earth elliptical orbits to the Earth-Moon libration (L(sub 1) and L(sub 2)) orbits has been successfully demonstrated by the Acceleration, Reconnection, Turbulence and Electrodynamics of the Moon's Interaction with the Sun (ARTEMIS) mission. Operational support of the two ARTEMIS spacecraft is a final step in the realization of a design process that can be used to transfer spacecraft with restrictive operational constraints and fuel limitations. The focus of this paper is to describe in detail the processes and implementation of this successful approach.

  6. Comparison of Photoluminescence Imaging on Starting Multi-Crystalline Silicon Wafers to Finished Cell Performance: Preprint

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Johnston, S.; Yan, F.; Dorn, D.

    2012-06-01

    Photoluminescence (PL) imaging techniques can be applied to multicrystalline silicon wafers throughout the manufacturing process. Both band-to-band PL and defect-band emissions, which are longer-wavelength emissions from sub-bandgap transitions, are used to characterize wafer quality and defect content on starting multicrystalline silicon wafers and neighboring wafers processed at each step through completion of finished cells. Both PL imaging techniques spatially highlight defect regions that represent dislocations and defect clusters. The relative intensities of these imaged defect regions change with processing. Band-to-band PL on wafers in the later steps of processing shows good correlation to cell quality and performance. The defect band images show regions that change relative intensity through processing, and better correlation to cell efficiency and reverse-bias breakdown is more evident at the starting wafer stage as opposed to later process steps. We show that thermal processing in the 200-400 °C range causes impurities to diffuse to different defect regions, changing their relative defect band emissions.

  7. PRIMO: An Interactive Homology Modeling Pipeline.

    PubMed

    Hatherley, Rowan; Brown, David K; Glenister, Michael; Tastan Bishop, Özlem

    2016-01-01

    The development of automated servers to predict the three-dimensional structure of proteins has seen much progress over the years. These servers make calculations simpler, but largely exclude users from the process. In this study, we present the PRotein Interactive MOdeling (PRIMO) pipeline for homology modeling of protein monomers. The pipeline eases the multi-step modeling process, and reduces the workload required by the user, while still allowing engagement from the user during every step. Default parameters are given for each step, which can either be modified or supplemented with additional external input. PRIMO has been designed for users of varying levels of experience with homology modeling. The pipeline incorporates a user-friendly interface that makes it easy to alter parameters used during modeling. During each stage of the modeling process, the site provides suggestions for novice users to improve the quality of their models. PRIMO provides functionality that allows users to also model ligands and ions in complex with their protein targets. Herein, we assess the accuracy of the fully automated capabilities of the server, including a comparative analysis of the available alignment programs, as well as of the refinement levels used during modeling. The tests presented here demonstrate the reliability of the PRIMO server when producing a large number of protein models. While PRIMO does focus on user involvement in the homology modeling process, the results indicate that in the presence of suitable templates, good quality models can be produced even without user intervention. This gives an idea of the base level accuracy of PRIMO, which users can improve upon by adjusting parameters in their modeling runs. The accuracy of PRIMO's automated scripts is being continuously evaluated by the CAMEO (Continuous Automated Model EvaluatiOn) project. The PRIMO site is free for non-commercial use and can be accessed at https://primo.rubi.ru.ac.za/.

  8. PRIMO: An Interactive Homology Modeling Pipeline

    PubMed Central

    Glenister, Michael

    2016-01-01

    The development of automated servers to predict the three-dimensional structure of proteins has seen much progress over the years. These servers make calculations simpler, but largely exclude users from the process. In this study, we present the PRotein Interactive MOdeling (PRIMO) pipeline for homology modeling of protein monomers. The pipeline eases the multi-step modeling process, and reduces the workload required by the user, while still allowing engagement from the user during every step. Default parameters are given for each step, which can either be modified or supplemented with additional external input. PRIMO has been designed for users of varying levels of experience with homology modeling. The pipeline incorporates a user-friendly interface that makes it easy to alter parameters used during modeling. During each stage of the modeling process, the site provides suggestions for novice users to improve the quality of their models. PRIMO provides functionality that allows users to also model ligands and ions in complex with their protein targets. Herein, we assess the accuracy of the fully automated capabilities of the server, including a comparative analysis of the available alignment programs, as well as of the refinement levels used during modeling. The tests presented here demonstrate the reliability of the PRIMO server when producing a large number of protein models. While PRIMO does focus on user involvement in the homology modeling process, the results indicate that in the presence of suitable templates, good quality models can be produced even without user intervention. This gives an idea of the base level accuracy of PRIMO, which users can improve upon by adjusting parameters in their modeling runs. The accuracy of PRIMO’s automated scripts is being continuously evaluated by the CAMEO (Continuous Automated Model EvaluatiOn) project. The PRIMO site is free for non-commercial use and can be accessed at https://primo.rubi.ru.ac.za/. PMID:27855192

  9. Synchrotron x-ray study of a low roughness and high efficiency K2CsSb photocathode during film growth

    DOE PAGES

    Xie, Junqi; Demarteau, Marcel; Wagner, Robert; ...

    2017-04-24

    Reduction of roughness to the nm level is critical to achieving the ultimate performance from photocathodes used in high gradient fields. The thrust of this paper is to explore the evolution of roughness during sequential growth, and to show that deposition of multilayer structures consisting of very thin reacted layers results in an nm-level smooth photocathode. Synchrotron x-ray methods were applied to study the multi-step growth process of a high efficiency K2CsSb photocathode. We observed a transition point of the Sb film grown on Si at a film thickness of ~40 Å with the substrate temperature at 100 °C and the growth rate at 0.1 Å/s. The final K2CsSb photocathode exhibits a thickness of around five times that of the total deposited Sb film regardless of how the Sb film was grown. The film surface roughening process occurs first at the step when K diffuses into the crystalline Sb. Furthermore, the photocathode we obtained from the multi-step growth exhibits roughness an order of magnitude lower than that of the normal sequential process. X-ray diffraction measurements show that the material goes through two structural changes of the crystalline phase during formation, from crystalline Sb to K3Sb and finally to K2CsSb.

  10. Synchrotron x-ray study of a low roughness and high efficiency K2CsSb photocathode during film growth

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Xie, Junqi; Demarteau, Marcel; Wagner, Robert

    Reduction of roughness to the nm level is critical to achieving the ultimate performance from photocathodes used in high gradient fields. The thrust of this paper is to explore the evolution of roughness during sequential growth, and to show that deposition of multilayer structures consisting of very thin reacted layers results in an nm-level smooth photocathode. Synchrotron x-ray methods were applied to study the multi-step growth process of a high efficiency K2CsSb photocathode. We observed a transition point of the Sb film grown on Si at a film thickness of ~40 Å with the substrate temperature at 100 °C and the growth rate at 0.1 Å/s. The final K2CsSb photocathode exhibits a thickness of around five times that of the total deposited Sb film regardless of how the Sb film was grown. The film surface roughening process occurs first at the step when K diffuses into the crystalline Sb. Furthermore, the photocathode we obtained from the multi-step growth exhibits roughness an order of magnitude lower than that of the normal sequential process. X-ray diffraction measurements show that the material goes through two structural changes of the crystalline phase during formation, from crystalline Sb to K3Sb and finally to K2CsSb.

  11. Star sub-pixel centroid calculation based on multi-step minimum energy difference method

    NASA Astrophysics Data System (ADS)

    Wang, Duo; Han, YanLi; Sun, Tengfei

    2013-09-01

    The star centroid plays a vital role in celestial navigation. Star images acquired during daytime have a low SNR because of the strong sky background, and the star targets are nearly submerged in the background, which makes centroid localization difficult. Traditional methods such as the moment method and weighted centroid calculation are simple but have large errors, especially under low-SNR conditions; the Gaussian method has high positioning accuracy but high computational complexity. Based on an analysis of the energy distribution in star images, a localization method for star target centroids based on a multi-step minimum energy difference is proposed. This method uses linear superposition to narrow the centroid area, applies a fixed number of interpolations within that narrowed area to segment the pixels, and then exploits the symmetry of the stellar energy distribution to locate the centroid: assuming the current pixel is the star centroid, it computes the difference between the sums of energy in symmetric directions (here, the transverse and longitudinal directions) at an equal step length from the current pixel (chosen according to conditions; this paper uses a step length of 9), and takes as the centroid position in each direction the point where the minimum difference appears. Validation on simulated star images and comparison with several traditional methods show that the positioning accuracy of the method reaches 0.001 pixel and that it works well for centroid calculation under low-SNR conditions. Applied to a star map acquired at a fixed observation site during daytime in the near-infrared band and compared against the known positions of the stars, the multi-step minimum energy difference method again achieves a better result.
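    The core symmetric-energy test in the method above can be sketched at the pixel level (leaving out the interpolation-based sub-pixel refinement): for each candidate pixel, compare the summed energy on its two sides in the transverse and longitudinal directions and keep the pixel with the minimum difference. Function and parameter names are hypothetical.

    ```python
    import numpy as np

    def energy_diff_centroid(img, cand_rows, cand_cols, half=4):
        """Return the candidate pixel whose left/right and up/down energy
        sums are most symmetric: minimize |sum(left) - sum(right)| +
        |sum(up) - sum(down)| over a window of `half` pixels each side.
        """
        best, best_diff = None, np.inf
        for r in cand_rows:
            for c in cand_cols:
                left = img[r, c - half:c].sum()
                right = img[r, c + 1:c + 1 + half].sum()
                up = img[r - half:r, c].sum()
                down = img[r + 1:r + 1 + half, c].sum()
                d = abs(left - right) + abs(up - down)
                if d < best_diff:
                    best, best_diff = (r, c), d
        return best

    # Synthetic star: symmetric Gaussian blob centred at row 12, column 15.
    yy, xx = np.mgrid[0:25, 0:30]
    img = np.exp(-((yy - 12) ** 2 + (xx - 15) ** 2) / 8.0)
    print(energy_diff_centroid(img, range(8, 17), range(11, 20)))  # → (12, 15)
    ```

    In the paper, the same difference test is applied to interpolated sub-pixels within the narrowed area, which is what pushes the accuracy to the 0.001-pixel level.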

  12. Stable aqueous dispersions of functionalized multi-layer graphene by pulsed underwater plasma exfoliation of graphite

    NASA Astrophysics Data System (ADS)

    Meyer-Plath, Asmus; Beckert, Fabian; Tölle, Folke J.; Sturm, Heinz; Mülhaupt, Rolf

    2016-02-01

    A process was developed for exfoliating graphite particles in water into stably dispersed multi-layer graphene. It uses electrohydraulic shockwaves and the functionalizing effect of solution plasma discharges in water. The discharges were excited by 100 ns high voltage pulsing of graphite particle chains that bridge an electrode gap. The underwater discharges allow simultaneous exfoliation and chemical functionalization of graphite particles to partially oxidized multi-layer graphene. Exfoliation is caused by shockwaves that result from rapid evaporation of carbon and water to plasma-excited gas species. Depending on discharge energy and locus of ignition, the shockwaves cause stirring, erosion, exfoliation and/or expansion of graphite flakes. The process was optimized to produce long-term stable aqueous dispersions of multi-layer graphene from graphite in a single process step without requiring addition of intercalants, surfactants, binders or special solvents. A setup was developed that allows continuous production of aqueous dispersions of flake size-selected multi-layer graphenes. Due to the well-preserved sp2-carbon structure, thin films made from the dispersed graphene exhibited high electrical conductivity. Underwater plasma discharge processing exhibits high innovation potential for morphological and chemical modifications of carbonaceous materials and surfaces, especially for the generation of stable dispersions of two-dimensional, layered materials.

  13. [Systematization of nursing care: viewing care as interactive, complementary and multi-professional].

    PubMed

    do Nascimento, Keyla Cristiane; Backes, Dirce Stein; Koerich, Magda Santos; Erdmann, Alacoque Lorenzini

    2008-12-01

    This study is the result of an extended project named 'The systematization of nursing care in the perspective of complex thinking'. The objective of this qualitative study is to better comprehend the meaning of the systematization of nursing care among healthcare professionals. Grounded Theory was used as the methodological framework. Data were collected by interviewing three sample groups, in a total of fifteen healthcare professionals. Data codification and analysis led us to the central theme: Viewing the Systematization of Nursing Care (SNC) as an Interactive and Complex Phenomenon. This theme is complemented by two phenomena. In this article, we discuss the phenomenon: Verifying the necessity of an interactive, complementary, and multi-professional process. The Systematization of Nursing Care is part of a process that has been developed over time by nurses committed to improving the care given to the patient, since they view the necessity for interactive, complementary, and multi-professional care.

  14. Precursor binding to an 880-kDa Toc complex as an early step during active import of protein into chloroplasts.

    PubMed

    Chen, Kuan-Yu; Li, Hsou-min

    2007-01-01

    The import of protein into chloroplasts is mediated by translocon components located in the chloroplast outer (the Toc proteins) and inner (the Tic proteins) envelope membranes. To identify intermediate steps during active import, we used sucrose density gradient centrifugation and blue-native polyacrylamide gel electrophoresis (BN-PAGE) to identify complexes of translocon components associated with precursor proteins under active import conditions instead of arrested binding conditions. Importing precursor proteins in solubilized chloroplast membranes formed a two-peak distribution in the sucrose density gradient. The heavier peak was in a similar position as the previously reported Tic/Toc supercomplex and was too large to be analyzed by BN-PAGE. The BN-PAGE analyses of the lighter peak revealed that precursors accumulated in at least two complexes. The first complex migrated at a position close to the ferritin dimer (approximately 880 kDa) and contained only the Toc components. Kinetic analyses suggested that this Toc complex represented an earlier step in the import process than the Tic/Toc supercomplex. The second complex in the lighter peak migrated at the position of the ferritin trimer (approximately 1320 kDa). It contained, in addition to the Toc components, Tic110, Hsp93, and an hsp70 homolog, but not Tic40. Two different precursor proteins were shown to associate with the same complexes. Processed mature proteins first appeared in the membranes at the same fractions as the Tic/Toc supercomplex, suggesting that processing of transit peptides occurs while precursors are still associated with the supercomplex.

  15. Precursor binding to an 880-kDa Toc complex as an early step during active import of protein into chloroplasts

    PubMed Central

    Chen, Kuan-Yu; Li, Hsou-min

    2007-01-01

    The import of protein into chloroplasts is mediated by translocon components located in the chloroplast outer (the Toc proteins) and inner (the Tic proteins) envelope membranes. To identify intermediate steps during active import, we used sucrose density gradient centrifugation and blue-native polyacrylamide gel electrophoresis (BN-PAGE) to identify complexes of translocon components associated with precursor proteins under active import conditions instead of arrested binding conditions. Importing precursor proteins in solubilized chloroplast membranes formed a two-peak distribution in the sucrose density gradient. The heavier peak was in a similar position as the previously reported Tic/Toc supercomplex and was too large to be analyzed by BN-PAGE. The BN-PAGE analyses of the lighter peak revealed that precursors accumulated in at least two complexes. The first complex migrated at a position close to the ferritin dimer (approximately 880 kDa) and contained only the Toc components. Kinetic analyses suggested that this Toc complex represented an earlier step in the import process than the Tic/Toc supercomplex. The second complex in the lighter peak migrated at the position of the ferritin trimer (approximately 1320 kDa). It contained, in addition to the Toc components, Tic110, Hsp93, and an hsp70 homolog, but not Tic40. Two different precursor proteins were shown to associate with the same complexes. Processed mature proteins first appeared in the membranes at the same fractions as the Tic/Toc supercomplex, suggesting that processing of transit peptides occurs while precursors are still associated with the supercomplex. PMID:17144891

  16. Evidence of a two-step process and pathway dependency in the thermodynamics of poly(diallyldimethylammonium chloride)/poly(sodium acrylate) complexation.

    PubMed

    Vitorazi, L; Ould-Moussa, N; Sekar, S; Fresnais, J; Loh, W; Chapel, J-P; Berret, J-F

    2014-12-21

    Recent studies have pointed out the importance of polyelectrolyte assembly in the elaboration of innovative nanomaterials. Beyond their structures, many important questions on the thermodynamics of association remain unanswered. Here, we investigate the complexation between poly(diallyldimethylammonium chloride) (PDADMAC) and poly(sodium acrylate) (PANa) chains using a combination of three techniques: isothermal titration calorimetry (ITC), static and dynamic light scattering and electrophoresis. Upon addition of PDADMAC to PANa or vice versa, the results obtained by the different techniques agree well with each other, and reveal a two-step process. The primary process is the formation of highly charged polyelectrolyte complexes of size 100 nm. The secondary process is the transition towards a coacervate phase made of polymer-rich and polymer-poor droplets. The binding isotherms measured are accounted for using a phenomenological model that provides the thermodynamic parameters for each reaction. Small positive enthalpies and large positive entropies consistent with a counterion release scenario are found throughout this study. Furthermore, this work stresses the importance of the underestimated formulation pathway or mixing order in polyelectrolyte complexation.

  17. Capturing remote mixing due to internal tides using multi-scale modeling tool: SOMAR-LES

    NASA Astrophysics Data System (ADS)

    Santilli, Edward; Chalamalla, Vamsi; Scotti, Alberto; Sarkar, Sutanu

    2016-11-01

    Internal tides that are generated during the interaction of an oscillating barotropic tide with the bottom bathymetry dissipate only a fraction of their energy near the generation region. The rest is radiated away in the form of low- and high-mode internal tides. These internal tides dissipate energy at remote locations when they interact with the upper ocean pycnocline, continental slope, and large scale eddies. Capturing the wide range of length and time scales involved during the life-cycle of internal tides is computationally very expensive. A recently developed multi-scale modeling tool called SOMAR-LES combines the adaptive grid refinement features of SOMAR with the turbulence modeling features of a Large Eddy Simulation (LES) to capture multi-scale processes at a reduced computational cost. Numerical simulations of internal tide generation at idealized bottom bathymetries are performed to demonstrate this multi-scale modeling technique. Although each of the remote mixing phenomena have been considered independently in previous studies, this work aims to capture remote mixing processes during the life cycle of an internal tide in more realistic settings, by allowing multi-level (coarse and fine) grids to co-exist and exchange information during the time stepping process.

  18. Pathway of actin filament branch formation by Arp2/3 complex revealed by single-molecule imaging

    PubMed Central

    Smith, Benjamin A.; Daugherty-Clarke, Karen; Goode, Bruce L.; Gelles, Jeff

    2013-01-01

    Actin filament nucleation by actin-related protein (Arp) 2/3 complex is a critical process in cell motility and endocytosis, yet key aspects of its mechanism are unknown due to a lack of real-time observations of Arp2/3 complex through the nucleation process. Triggered by the verprolin homology, central, and acidic (VCA) region of proteins in the Wiskott-Aldrich syndrome protein (WASp) family, Arp2/3 complex produces new (daughter) filaments as branches from the sides of preexisting (mother) filaments. We visualized individual fluorescently labeled Arp2/3 complexes dynamically interacting with and producing branches on growing actin filaments in vitro. Branch formation was strikingly inefficient, even in the presence of VCA: only ∼1% of filament-bound Arp2/3 complexes yielded a daughter filament. VCA acted at multiple steps, increasing both the association rate of Arp2/3 complexes with mother filament and the fraction of filament-bound complexes that nucleated a daughter. The results lead to a quantitative kinetic mechanism for branched actin assembly, revealing the steps that can be stimulated by additional cellular factors. PMID:23292935

  19. Measurements of gluconeogenesis and glycogenolysis: A methodological review

    USDA-ARS?s Scientific Manuscript database

    Gluconeogenesis is a complex metabolic process that involves multiple enzymatic steps regulated by myriad factors, including substrate concentrations, the redox state, activation and inhibition of specific enzyme steps, and hormonal modulation. At present, the most widely accepted technique to deter...

  20. A versatile valving toolkit for automating fluidic operations in paper microfluidic devices.

    PubMed

    Toley, Bhushan J; Wang, Jessica A; Gupta, Mayuri; Buser, Joshua R; Lafleur, Lisa K; Lutz, Barry R; Fu, Elain; Yager, Paul

    2015-03-21

    Failure to utilize valving and automation techniques has restricted the complexity of fluidic operations that can be performed in paper microfluidic devices. We developed a toolkit of paper microfluidic valves and methods for automatic valve actuation using movable paper strips and fluid-triggered expanding elements. To the best of our knowledge, this is the first functional demonstration of this valving strategy in paper microfluidics. After introduction of fluids on devices, valves can actuate automatically after a) a certain period of time, or b) the passage of a certain volume of fluid. Timing of valve actuation can be tuned with greater than 8.5% accuracy by changing lengths of timing wicks, and we present timed on-valves, off-valves, and diversion (channel-switching) valves. The actuators require ~30 μl fluid to actuate and the time required to switch from one state to another ranges from ~5 s for short to ~50 s for longer wicks. For volume-metered actuation, the size of a metering pad can be adjusted to tune actuation volume, and we present two methods - both methods can achieve greater than 9% accuracy. Finally, we demonstrate the use of these valves in a device that conducts a multi-step assay for the detection of the malaria protein PfHRP2. Although slightly more complex than devices that do not have moving parts, this valving and automation toolkit considerably expands the capabilities of paper microfluidic devices. Components of this toolkit can be used to conduct arbitrarily complex, multi-step fluidic operations on paper-based devices, as demonstrated in the malaria assay device.
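    The quadratic scaling of delay with wick length follows from capillary (Lucas-Washburn) flow, under which the wetted length grows as the square root of time. A minimal sketch under that assumption; the wicking coefficient below is hypothetical, chosen only to land in the ~5 s to ~50 s range quoted above, not a value from the paper.

    ```python
    # Timing wicks meter delays by capillary flow; Lucas-Washburn gives
    # L^2 proportional to t, so actuation time scales with the square of
    # wick length. k is a material/fluid-dependent wicking coefficient.
    def actuation_time(wick_length_mm, k_mm2_per_s=4.0):
        """Time (s) for fluid to traverse a timing wick of the given length."""
        return wick_length_mm ** 2 / k_mm2_per_s

    for L in (4.5, 10.0, 14.0):
        print(f"{L:5.1f} mm wick -> {actuation_time(L):5.1f} s")
    ```

    Under this model, tuning the delay is a matter of cutting the wick to length, which is consistent with the paper's report that actuation timing is set by wick length.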

  1. A versatile valving toolkit for automating fluidic operations in paper microfluidic devices

    PubMed Central

    Toley, Bhushan J.; Wang, Jessica A.; Gupta, Mayuri; Buser, Joshua R.; Lafleur, Lisa K.; Lutz, Barry R.; Fu, Elain; Yager, Paul

    2015-01-01

    Failure to utilize valving and automation techniques has restricted the complexity of fluidic operations that can be performed in paper microfluidic devices. We developed a toolkit of paper microfluidic valves and methods for automatic valve actuation using movable paper strips and fluid-triggered expanding elements. To the best of our knowledge, this is the first functional demonstration of this valving strategy in paper microfluidics. After introduction of fluids on devices, valves can actuate automatically a) after a certain period of time, or b) after the passage of a certain volume of fluid. Timing of valve actuation can be tuned with greater than 8.5% accuracy by changing lengths of timing wicks, and we present timed on-valves, off-valves, and diversion (channel-switching) valves. The actuators require ~30 μl fluid to actuate and the time required to switch from one state to another ranges from ~5 s for short to ~50s for longer wicks. For volume-metered actuation, the size of a metering pad can be adjusted to tune actuation volume, and we present two methods – both methods can achieve greater than 9% accuracy. Finally, we demonstrate the use of these valves in a device that conducts a multi-step assay for the detection of the malaria protein PfHRP2. Although slightly more complex than devices that do not have moving parts, this valving and automation toolkit considerably expands the capabilities of paper microfluidic devices. Components of this toolkit can be used to conduct arbitrarily complex, multi-step fluidic operations on paper-based devices, as demonstrated in the malaria assay device. PMID:25606810

  2. Use of planar array electrophysiology for the development of robust ion channel cell lines.

    PubMed

    Clare, Jeffrey J; Chen, Mao Xiang; Downie, David L; Trezise, Derek J; Powell, Andrew J

    2009-01-01

    The tractability of ion channels as drug targets has been significantly improved by the advent of planar array electrophysiology platforms, which have dramatically increased the capacity for electrophysiological profiling of lead series compounds. However, the data quality and throughput obtained with these platforms are critically dependent on the robustness of the expression reagent being used. The generation of high-quality recombinant cell lines is therefore a key step in the early phase of ion channel drug discovery, and this can present significant challenges due to the diversity and organisational complexity of many channel types. This article focuses on several complex and difficult-to-express ion channels and illustrates how improved stable cell lines can be obtained by integration of planar array electrophysiology systems into the cell line generation process per se. By embedding this approach at multiple stages (e.g., during development of the expression strategy, during screening and validation of clonal lines, and during characterisation of the final cell line), the cycle time and success rate in obtaining robust expression of complex multi-subunit channels can be significantly improved. We also review how recent advances in this technology (e.g., population patch clamp) have further widened the versatility and applicability of this approach.

  3. Acoustic surface perception from naturally occurring step sounds of a dexterous hexapod robot

    NASA Astrophysics Data System (ADS)

    Cuneyitoglu Ozkul, Mine; Saranli, Afsar; Yazicioglu, Yigit

    2013-10-01

    Legged robots that exhibit dynamic dexterity naturally interact with the surface to generate complex acoustic signals carrying rich information on the surface as well as the robot platform itself. However, the nature of a legged robot, which is a complex, hybrid dynamic system, renders the more common approach of model-based system identification impractical. The present paper focuses on acoustic surface identification and proposes a non-model-based analysis and classification approach adopted from the speech processing literature. A novel feature set composed of spectral band energies augmented by their vector time derivatives and the time-domain averaged zero-crossing rate is proposed. Using a multi-dimensional vector classifier, these features carry enough information to accurately classify a range of commonly occurring indoor and outdoor surfaces without using any mechanical system model. A comparative experimental study is carried out and classification performance and computational complexity are characterized. Different feature combinations, classifiers and changes in critical design parameters are investigated. A realistic and representative acoustic data set is collected with the robot moving at different speeds on a number of surfaces. The study demonstrates promising performance of this non-model-based approach, even in an acoustically uncontrolled environment. The approach also has a good chance of performing in real time.
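    The feature set described above (spectral band energies, their frame-to-frame derivatives, and zero-crossing rate) can be sketched as follows. This is an illustrative reconstruction, not the authors' code; the frame length, hop size, and number of bands are assumed values.

```python
import numpy as np

def surface_features(signal, n_bands=8, frame_len=1024, hop=512):
    """Per-frame features: log spectral band energies, their frame-to-frame
    derivatives (deltas), and the zero-crossing rate."""
    frames = [signal[i:i + frame_len]
              for i in range(0, len(signal) - frame_len + 1, hop)]
    window = np.hanning(frame_len)
    band_energies, zcrs = [], []
    for f in frames:
        spec = np.abs(np.fft.rfft(f * window)) ** 2
        bands = np.array_split(spec, n_bands)  # equal-width frequency bands
        band_energies.append([b.sum() for b in bands])
        # Zero-crossing rate: fraction of sign changes within the frame.
        zcrs.append(np.mean(np.abs(np.diff(np.sign(f))) > 0))
    E = np.log(np.asarray(band_energies) + 1e-12)
    dE = np.vstack([np.zeros(n_bands), np.diff(E, axis=0)])  # delta features
    return np.hstack([E, dE, np.asarray(zcrs)[:, None]])
```

    The resulting per-frame vectors would then be fed to any multi-dimensional vector classifier.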

  4. A process for preparing an ultra-thin, adhesiveless, multi-layered, patterned polymer substrate

    NASA Technical Reports Server (NTRS)

    Bryant, Robert G. (Inventor); Kruse, Nancy H. M. (Inventor); Fox, Robert L. (Inventor); Tran, Sang Q. (Inventor)

    1995-01-01

    A process for preparing an ultra-thin, adhesiveless, multi-layered, patterned polymer substrate is disclosed. The process may be used to prepare both rigid and flexible cables and circuit boards. A substrate is provided and a polymeric solution comprising a self-bonding, soluble polymer and a solvent is applied to the substrate. Next, the polymer solution is dried to form a polymer coated substrate. The polymer coated substrate is metallized and patterned. At least one additional coating of the polymeric solution is applied to the metallized, patterned, polymer coated substrate and the steps of metallizing and patterning are repeated. Lastly, a cover coat is applied. When preparing a flexible cable and flexible circuit board, the polymer coating is removed from the substrate.

  5. Thermodynamics and Kinetics of Prenucleation Clusters, Classical and Non-Classical Nucleation

    PubMed Central

    Zahn, Dirk

    2015-01-01

    Recent observations of prenucleation species and multi-stage crystal nucleation processes challenge the long-established view on the thermodynamics of crystal formation. Here, we review and generalize extensions to classical nucleation theory. Going beyond the conventional implementation that has been used for more than a century, nucleation inhibitors, precursor clusters and non-classical nucleation processes are rationalized as well by analogous concepts based on competing interface and bulk energy terms. This is illustrated by recent examples of species formed prior to/instead of crystal nucleation and multi-step nucleation processes. Many of the insights discussed were obtained from molecular simulation using advanced sampling techniques, briefly summarized herein for both nucleation-controlled and diffusion-controlled aggregate formation. PMID:25914369

  6. Testing the methodology for dosimetry audit of heterogeneity corrections and small MLC-shaped fields: Results of IAEA multi-center studies

    PubMed Central

    Izewska, Joanna; Wesolowska, Paulina; Azangwe, Godfrey; Followill, David S.; Thwaites, David I.; Arib, Mehenna; Stefanic, Amalia; Viegas, Claudio; Suming, Luo; Ekendahl, Daniela; Bulski, Wojciech; Georg, Dietmar

    2016-01-01

    The International Atomic Energy Agency (IAEA) has a long tradition of supporting development of methodologies for national networks providing quality audits in radiotherapy. A series of co-ordinated research projects (CRPs) has been conducted by the IAEA since 1995 assisting national external audit groups developing national audit programs. The CRP ‘Development of Quality Audits for Radiotherapy Dosimetry for Complex Treatment Techniques’ was conducted in 2009–2012 as an extension of previously developed audit programs. Material and methods. The CRP work described in this paper focused on developing and testing two steps of dosimetry audit: verification of heterogeneity corrections, and treatment planning system (TPS) modeling of small MLC fields, which are important for the initial stages of complex radiation treatments, such as IMRT. The project involved development of a new solid slab phantom with heterogeneities containing special measurement inserts for thermoluminescent dosimeters (TLD) and radiochromic films. The phantom and the audit methodology have been developed at the IAEA and tested in multi-center studies involving the CRP participants. Results. The results of multi-center testing of the methodology for two steps of dosimetry audit show that the design of audit procedures is adequate and the methodology is feasible for meeting the audit objectives. A total of 97% of TLD results in heterogeneity situations obtained in the study were within 3%, and all results within 5%, agreement with the TPS-predicted doses. In contrast, only 64% of small beam profiles were within 3 mm agreement between the TPS-calculated and film-measured doses. Film dosimetry results have highlighted some limitations in TPS modeling of small beam profiles in the direction of MLC leaf movements. Discussion. Through multi-center testing, any challenges or difficulties in the proposed audit methodology were identified, and the methodology improved. Using the experience of these studies, the participants could incorporate the auditing procedures in their national programs. PMID:26934916

  7. Testing the methodology for dosimetry audit of heterogeneity corrections and small MLC-shaped fields: Results of IAEA multi-center studies.

    PubMed

    Izewska, Joanna; Wesolowska, Paulina; Azangwe, Godfrey; Followill, David S; Thwaites, David I; Arib, Mehenna; Stefanic, Amalia; Viegas, Claudio; Suming, Luo; Ekendahl, Daniela; Bulski, Wojciech; Georg, Dietmar

    2016-07-01

    The International Atomic Energy Agency (IAEA) has a long tradition of supporting development of methodologies for national networks providing quality audits in radiotherapy. A series of co-ordinated research projects (CRPs) has been conducted by the IAEA since 1995 assisting national external audit groups developing national audit programs. The CRP 'Development of Quality Audits for Radiotherapy Dosimetry for Complex Treatment Techniques' was conducted in 2009-2012 as an extension of previously developed audit programs. The CRP work described in this paper focused on developing and testing two steps of dosimetry audit: verification of heterogeneity corrections, and treatment planning system (TPS) modeling of small MLC fields, which are important for the initial stages of complex radiation treatments, such as IMRT. The project involved development of a new solid slab phantom with heterogeneities containing special measurement inserts for thermoluminescent dosimeters (TLD) and radiochromic films. The phantom and the audit methodology have been developed at the IAEA and tested in multi-center studies involving the CRP participants. The results of multi-center testing of the methodology for two steps of dosimetry audit show that the design of audit procedures is adequate and the methodology is feasible for meeting the audit objectives. A total of 97% of TLD results in heterogeneity situations obtained in the study were within 3%, and all results within 5%, agreement with the TPS-predicted doses. In contrast, only 64% of small beam profiles were within 3 mm agreement between the TPS-calculated and film-measured doses. Film dosimetry results have highlighted some limitations in TPS modeling of small beam profiles in the direction of MLC leaf movements. Through multi-center testing, any challenges or difficulties in the proposed audit methodology were identified, and the methodology improved. Using the experience of these studies, the participants could incorporate the auditing procedures in their national programs.

  8. A Simple Method to Simultaneously Detect and Identify Spikes from Raw Extracellular Recordings.

    PubMed

    Petrantonakis, Panagiotis C; Poirazi, Panayiota

    2015-01-01

    The ability to track, in an efficient and reliable manner, when and which neurons fire in the vicinity of an electrode could revolutionize the field of neuroscience. The current bottleneck lies in spike sorting algorithms; existing methods for detecting and discriminating the activity of multiple neurons rely on inefficient, multi-step processing of extracellular recordings. In this work, we show that a single-step processing of raw (unfiltered) extracellular signals is sufficient for both the detection and identification of active neurons, thus greatly simplifying and optimizing the spike sorting approach. The efficiency and reliability of our method are demonstrated on both real and simulated data.
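    A minimal sketch of the single-pass idea, detecting spikes directly from an unfiltered trace by thresholding against a robust noise estimate, might look like this. The threshold factor, snippet window, and refractory period are assumed parameters, and the snippet extraction merely stands in for the identification step; this is not the authors' algorithm.

```python
import numpy as np

def detect_spikes(raw, k=5.0, win=30, refractory=30):
    """Single-pass spike detection on a raw (unfiltered) trace: threshold
    at k times a robust noise estimate and cut out waveform snippets."""
    sigma = np.median(np.abs(raw - np.median(raw))) / 0.6745  # MAD -> sigma
    thresh = k * sigma
    idx, snippets, last = [], [], -refractory
    for t in range(win, len(raw) - win):
        if abs(raw[t]) > thresh and t - last >= refractory:
            idx.append(t)
            snippets.append(raw[t - win:t + win])  # waveform for later sorting
            last = t
    return np.array(idx), np.array(snippets)
```

    The extracted snippets could then be assigned to putative neurons by any clustering or template-matching scheme.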

  9. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zitney, S.E.; McCorkle, D.; Yang, C.

    Process modeling and simulation tools are widely used for the design and operation of advanced power generation systems. These tools enable engineers to solve the critical process systems engineering problems that arise throughout the lifecycle of a power plant, such as designing a new process, troubleshooting a process unit or optimizing operations of the full process. To analyze the impact of complex thermal and fluid flow phenomena on overall power plant performance, the Department of Energy’s (DOE) National Energy Technology Laboratory (NETL) has developed the Advanced Process Engineering Co-Simulator (APECS). The APECS system is an integrated software suite that combines process simulation (e.g., Aspen Plus) and high-fidelity equipment simulations such as those based on computational fluid dynamics (CFD), together with advanced analysis capabilities including case studies, sensitivity analysis, stochastic simulation for risk/uncertainty analysis, and multi-objective optimization. In this paper we discuss the initial phases of the integration of the APECS system with the immersive and interactive virtual engineering software, VE-Suite, developed at Iowa State University and Ames Laboratory. VE-Suite uses the ActiveX (OLE Automation) controls in the Aspen Plus process simulator wrapped by the CASI library developed by Reaction Engineering International to run process/CFD co-simulations and query for results. This integration represents a necessary step in the development of virtual power plant co-simulations that will ultimately reduce the time, cost, and technical risk of developing advanced power generation systems.

  10. Predictor - Predictive Reaction Design via Informatics, Computation and Theories of Reactivity

    DTIC Science & Technology

    2017-10-10

    into more complex and valuable molecules, but are limited by: 1. The extensive time it takes to design and optimize a synthesis 2. Multi-step...system. As it is fully compatible to the industry standard SQL, designing a server- based system at a later time will be trivial. Producing a JAVA front...Report: PREDICTOR - Predictive REaction Design via Informatics, Computation and Theories of Reactivity The goal of this program was to create a cyber

  11. A time to search: finding the meaning of variable activation energy.

    PubMed

    Vyazovkin, Sergey

    2016-07-28

    This review deals with the phenomenon of variable activation energy frequently observed when studying the kinetics in the liquid or solid phase. This phenomenon commonly manifests itself through nonlinear Arrhenius plots or dependencies of the activation energy on conversion computed by isoconversional methods. Variable activation energy signifies a multi-step process and has a meaning of a collective parameter linked to the activation energies of individual steps. It is demonstrated that by using appropriate models of the processes, the link can be established in algebraic form. This allows one to analyze experimentally observed dependencies of the activation energy in a quantitative fashion and, as a result, to obtain activation energies of individual steps, to evaluate and predict other important parameters of the process, and generally to gain deeper kinetic and mechanistic insights. This review provides multiple examples of such analysis as applied to the processes of crosslinking polymerization, crystallization and melting of polymers, gelation, and solid-solid morphological and glass transitions. The use of appropriate computational techniques is discussed as well.
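    The link between a nonlinear Arrhenius plot and an isoconversional estimate can be made concrete with the Friedman method: at a fixed conversion, the slope of ln(rate) versus 1/T equals -Ea/R. The sketch below recovers a known activation energy from synthetic single-step, first-order data; the pre-exponential factor, activation energy, and temperatures are assumed values chosen purely for illustration.

```python
import numpy as np

R = 8.314  # gas constant, J/(mol K)

def friedman_Ea(temps, rates_at_alpha):
    """Friedman isoconversional estimate: at a fixed conversion alpha,
    ln(rate) = const - Ea/(R*T), so the slope of ln(rate) vs 1/T gives Ea."""
    x = 1.0 / np.asarray(temps)
    y = np.log(np.asarray(rates_at_alpha))
    slope, _ = np.polyfit(x, y, 1)  # linear fit in Arrhenius coordinates
    return -slope * R

# Synthetic single-step data: first-order kinetics with Ea = 120 kJ/mol.
A, Ea, alpha = 1e10, 120e3, 0.5
temps = np.array([500.0, 550.0, 600.0])
rates = A * np.exp(-Ea / (R * temps)) * (1 - alpha)
print(friedman_Ea(temps, rates))  # ~120000 J/mol, the Ea used above
```

    For a genuinely multi-step process the same fit, repeated at many conversions, yields the Ea(alpha) dependence that the review analyzes.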

  12. Forizymes - functionalised artificial forisomes as a platform for the production and immobilisation of single enzymes and multi-enzyme complexes.

    PubMed

    Visser, Franziska; Müller, Boje; Rose, Judith; Prüfer, Dirk; Noll, Gundula A

    2016-08-09

    The immobilisation of enzymes plays an important role in many applications, including biosensors that require enzyme activity, stability and recyclability in order to function efficiently. Here we show that forisomes (plant-derived mechanoproteins) can be functionalised with enzymes by translational fusion, leading to the assembly of structures designated as forizymes. When forizymes are expressed in the yeast Saccharomyces cerevisiae, the enzymes are immobilised by the self-assembly of forisome subunits to form well-structured protein bodies. We used glucose-6-phosphate dehydrogenase (G6PDH) and hexokinase 2 (HXK2) as model enzymes for the one-step production and purification of catalytically active forizymes. These structures retain the typical stimulus-response reaction of the forisome and the enzyme remains active even after multiple assay cycles, which we demonstrated using G6PDH forizymes as an example. We also achieved the co-incorporation of both HXK2 and G6PDH in a single forizyme, facilitating a two-step reaction cascade that was 30% faster than the coupled reaction using the corresponding enzymes on different forizymes or in solution. Our novel forizyme immobilisation technique therefore not only combines the sensory properties of forisome proteins with the catalytic properties of enzymes but also allows the development of multi-enzyme complexes for incorporation into technical devices.

  13. Design of multi-body Lambert type orbits with specified departure and arrival positions

    NASA Astrophysics Data System (ADS)

    Ishii, Nobuaki; Kawaguchi, Jun'ichiro; Matsuo, Hiroki

    1991-10-01

    A new procedure for designing a multi-body Lambert-type orbit comprising a multiple-swingby process is developed, aiming at relieving a numerical difficulty inherent to a highly nonlinear swingby mechanism. The proposed algorithm, Recursive Multi-Step Linearization, first divides a whole orbit into several trajectory segments. Then, making maximum use of piecewise transition matrices, the segmented orbit is repeatedly upgraded until an approximate orbit initially based on a patched conics method eventually converges. In application to the four-body Earth-Moon system with the Sun's gravitation, one of the double lunar swingby orbits, including 12 lunar swingbys, is successfully designed without any velocity mismatch.

  14. Multi-frame partially saturated images blind deconvolution

    NASA Astrophysics Data System (ADS)

    Ye, Pengzhao; Feng, Huajun; Xu, Zhihai; Li, Qi; Chen, Yueting

    2016-12-01

    When blurred images have saturated or over-exposed pixels, conventional blind deconvolution approaches often fail to estimate an accurate point spread function (PSF) and will introduce local ringing artifacts. In this paper, we propose a method to deal with the problem under a modified multi-frame blind deconvolution framework. First, in the kernel estimation step, a light streak detection scheme using multi-frame blurred images is incorporated into the regularization constraint. Second, we deal with image regions affected by the saturated pixels separately by modeling a weighted matrix during each multi-frame deconvolution iteration. Both synthetic and real-world examples show that more accurate PSFs can be estimated and restored images have richer details and fewer negative effects compared with state-of-the-art methods.
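    Down-weighting saturated pixels during deconvolution can be illustrated with a weighted Richardson-Lucy iteration in one dimension, where clipped samples receive zero weight so they do not drive the update. This is a generic analogue of the weighting idea, not the paper's actual multi-frame framework.

```python
import numpy as np

def weighted_richardson_lucy(y, psf, weights, n_iter=50):
    """Richardson-Lucy deconvolution with per-sample weights: saturated
    (clipped) samples get weight 0 so they do not drive the update."""
    x = np.full_like(y, y[weights > 0].mean())  # flat non-negative start
    psf_flip = psf[::-1]
    for _ in range(n_iter):
        blurred = np.convolve(x, psf, mode="same")
        ratio = np.where(weights > 0, y / np.maximum(blurred, 1e-12), 1.0)
        num = np.convolve(weights * ratio, psf_flip, mode="same")
        den = np.maximum(np.convolve(weights, psf_flip, mode="same"), 1e-12)
        x = x * num / den
    return x

# Usage idea: clip a blurred signal to simulate saturation, then pass
# weights = (y < clip_level).astype(float) so clipped samples are ignored.
```

    With all weights equal to one this reduces to the standard Richardson-Lucy update, which makes the role of the weight mask easy to see.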

  15. Acoustic resonator with Al electrodes on an AlN layer and using a GaAs substrate

    DOEpatents

    Kline, Gerald R.; Lakin, Kenneth M.

    1985-12-03

    A method of fabricating an acoustic wave resonator wherein all processing steps are accomplished from a single side of said substrate. The method involves deposition of a multi-layered Al/AlN structure on a GaAs substrate followed by a series of fabrication steps to define a resonator from said composite. The resulting resonator comprises an AlN layer between two Al layers and another layer of AlN on an exterior of one of said Al layers.

  16. Acoustic resonator and method of making same

    DOEpatents

    Kline, G.R.; Lakin, K.M.

    1985-03-05

    A method is disclosed of fabricating an acoustic wave resonator wherein all processing steps are accomplished from a single side of said substrate. The method involves deposition of a multi-layered Al/AlN structure on a GaAs substrate followed by a series of fabrication steps to define a resonator from said composite. The resulting resonator comprises an AlN layer between two Al layers and another layer of AlN on an exterior of one of said Al layers. 4 figs.

  17. Large-scale modeling on the fate and transport of polycyclic aromatic hydrocarbons (PAHs) in multimedia over China

    NASA Astrophysics Data System (ADS)

    Huang, Y.; Liu, M.; Wada, Y.; He, X.; Sun, X.

    2017-12-01

    In recent decades, with rapid economic growth, industrial development and urbanization, expanding pollution of polycyclic aromatic hydrocarbons (PAHs) has become a diversified and complicated phenomenon in China. However, sufficient monitoring of PAHs across multiple environmental compartments and of the corresponding multi-interface migration processes is still limited, especially over a large geographic area. In this study, we couple the Multimedia Fate Model (MFM) to the Community Multi-Scale Air Quality (CMAQ) model in order to consider the fugacity and the transient contamination processes. This coupled dynamic contaminant model can evaluate the detailed local variations and mass fluxes of PAHs in different environmental media (e.g., air, surface film, soil, sediment, water and vegetation) across different spatial (a county to country) and temporal (days to years) scales. This model has been applied to a large geographical domain of China at a 36 km by 36 km grid resolution. The model considers the response characteristics of typical environmental media to a complex underlying surface. Results suggest that direct emission is the main input pathway of PAHs entering the atmosphere, while advection is the main outward flow of pollutants from the environment. In addition, both soil and sediment act as the main sinks of PAHs and have the longest retention times. Importantly, the highest PAH loadings are found in urbanized and densely populated regions of China, such as the Yangtze River Delta and Pearl River Delta. This model can provide a good scientific basis towards a better understanding of the large-scale dynamics of environmental pollutants for land conservation and sustainable development.
In a next step, the dynamic contaminant model will be integrated with the continental-scale hydrological and water resources model (i.e., Community Water Model, CWatM) to quantify a more accurate representation and feedbacks between the hydrological cycle and water quality at even larger geographical domains. Keywords: PAHs; Community multi-scale air quality model; Multimedia fate model; Land use

  18. One step DNA assembly for combinatorial metabolic engineering.

    PubMed

    Coussement, Pieter; Maertens, Jo; Beauprez, Joeri; Van Bellegem, Wouter; De Mey, Marjan

    2014-05-01

    The rapid and efficient assembly of multi-step metabolic pathways for generating microbial strains with desirable phenotypes is a critical procedure for metabolic engineering, and remains a significant challenge in synthetic biology. Although several DNA assembly methods have been developed and applied for metabolic pathway engineering, many of them are limited by their suitability for combinatorial pathway assembly. The introduction of transcriptional (promoters), translational (ribosome binding site (RBS)) and enzyme (mutant genes) variability to modulate pathway expression levels is essential for generating balanced metabolic pathways and maximizing the productivity of a strain. We report a novel, highly reliable and rapid single strand assembly (SSA) method for pathway engineering. The method was successfully optimized and applied to create constructs containing promoter, RBS and/or mutant enzyme libraries. To demonstrate its efficiency and reliability, the method was applied to fine-tune multi-gene pathways. Two promoter libraries were simultaneously introduced in front of two target genes, enabling orthogonal expression as demonstrated by principal component analysis. This shows that SSA will increase our ability to tune multi-gene pathways at all control levels for the biotechnological production of complex metabolites, achievable through the combinatorial modulation of transcription, translation and enzyme activity. Copyright © 2014 International Metabolic Engineering Society. Published by Elsevier Inc. All rights reserved.

  19. IceTrendr: a linear time-series approach to monitoring glacier environments using Landsat

    NASA Astrophysics Data System (ADS)

    Nelson, P.; Kennedy, R. E.; Nolin, A. W.; Hughes, J. M.; Braaten, J.

    2017-12-01

    Arctic glaciers in Alaska and Canada have experienced some of the greatest ice mass loss of any region in recent decades. A challenge to understanding these changing ecosystems, however, is developing globally consistent, multi-decadal monitoring of glacier ice. We present a toolset and approach that captures, labels, and maps glacier change for use in climate science, hydrology, and Earth science education using Landsat Time Series (LTS). The core step is "temporal segmentation," wherein a yearly LTS is cleaned using pre-processing steps, converted to a snow/ice index, and then simplified into the salient shape of the change trajectory ("temporal signature") using linear segmentation. Such signatures range from simple (`stable' or `transition of glacier ice to rock') to complex multi-year changes (`transition of glacier ice to debris-covered glacier ice to open water to bare rock to vegetation'). This pilot study demonstrates the potential for interactively mapping, visualizing, and labeling glacier changes. What is truly innovative is that IceTrendr not only maps the changes but also uses expert knowledge to label them, and such labels can be applied to other glaciers exhibiting statistically similar temporal signatures. Our key findings are that the IceTrendr concept and software can provide important functionality for glaciologists and educators interested in studying glacier changes during the Landsat TM timeframe (1984-present). Issues of concern with using dense Landsat time-series approaches for glacier monitoring include many missing images during the period 1984-1995 and the fact that automated cloud masks are challenged, requiring the user to manually identify cloud-free images. IceTrendr is much more than a simple "then and now" approach to glacier mapping. This process is a means of integrating the power of computing, remote sensing, and expert knowledge to "tell the story" of glacier changes.
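    The temporal-segmentation step can be sketched as a simple top-down piecewise-linear splitter: recursively break a yearly index series at its worst-fit year until every piece is well described by one line. This is an illustrative stand-in, not the IceTrendr implementation; the RMSE tolerance is an assumed parameter.

```python
import numpy as np

def segment(years, index, max_rmse=0.05):
    """Top-down temporal segmentation: recursively split a yearly index
    series at its worst-fit year until each piece fits a single line."""
    years = np.asarray(years, dtype=float)
    index = np.asarray(index, dtype=float)

    def split(i, j):
        coef = np.polyfit(years[i:j + 1], index[i:j + 1], 1)
        resid = np.abs(index[i:j + 1] - np.polyval(coef, years[i:j + 1]))
        if j - i < 3 or np.sqrt(np.mean(resid ** 2)) <= max_rmse:
            return [(i, j)]  # short segment, or already fits one line
        k = i + int(np.argmax(resid))       # split at worst-fit year
        k = min(max(k, i + 1), j - 1)       # keep both halves non-empty
        return split(i, k) + split(k, j)

    return split(0, len(years) - 1)
```

    The returned breakpoint pairs trace out the "temporal signature" that can then be labeled (e.g., stable vs. ice-to-rock transition).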

  20. A game theory-reinforcement learning (GT-RL) method to develop optimal operation policies for multi-operator reservoir systems

    NASA Astrophysics Data System (ADS)

    Madani, Kaveh; Hooshyar, Milad

    2014-11-01

    Reservoir systems with multiple operators can benefit from coordination of operation policies. To maximize the total benefit of these systems the literature has normally used the social planner's approach. Based on this approach operation decisions are optimized using a multi-objective optimization model with a compound system's objective. While the utility of the system can be increased this way, fair allocation of benefits among the operators remains challenging for the social planner who has to assign controversial weights to the system's beneficiaries and their objectives. Cooperative game theory provides an alternative framework for fair and efficient allocation of the incremental benefits of cooperation. To determine the fair and efficient utility shares of the beneficiaries, cooperative game theory solution methods consider the gains of each party in the status quo (non-cooperation) as well as what can be gained through the grand coalition (social planner's solution or full cooperation) and partial coalitions. Nevertheless, estimation of the benefits of different coalitions can be challenging in complex multi-beneficiary systems. Reinforcement learning can be used to address this challenge and determine the gains of the beneficiaries for different levels of cooperation, i.e., non-cooperation, partial cooperation, and full cooperation, providing the essential input for allocation based on cooperative game theory. This paper develops a game theory-reinforcement learning (GT-RL) method for determining the optimal operation policies in multi-operator multi-reservoir systems with respect to fairness and efficiency criteria. As the first step to underline the utility of the GT-RL method in solving complex multi-agent multi-reservoir problems without a need for developing compound objectives and weight assignment, the proposed method is applied to a hypothetical three-agent three-reservoir system.
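    Once the gains of every coalition have been estimated (e.g., by reinforcement learning for each level of cooperation), cooperative game theory can allocate them fairly; the Shapley value is one standard solution concept. The sketch below computes Shapley shares for a hypothetical three-operator system; all coalition benefit numbers are invented for illustration and are not from the paper.

```python
from itertools import permutations
from math import factorial

def shapley(values, players):
    """Shapley-value allocation: average each player's marginal
    contribution over all orders in which the grand coalition can form."""
    shares = {p: 0.0 for p in players}
    for order in permutations(players):
        coalition = frozenset()
        for p in order:
            grown = coalition | {p}
            shares[p] += values[grown] - values[coalition]
            coalition = grown
    n_orders = factorial(len(players))
    return {p: s / n_orders for p, s in shares.items()}

# Hypothetical coalition benefits for three reservoir operators A, B, C
# (status quo singletons through the grand coalition).
v = {frozenset(): 0.0,
     frozenset('A'): 10.0, frozenset('B'): 12.0, frozenset('C'): 8.0,
     frozenset('AB'): 30.0, frozenset('AC'): 25.0, frozenset('BC'): 27.0,
     frozenset('ABC'): 50.0}
shares = shapley(v, 'ABC')
```

    By construction the shares sum to the grand-coalition value, so the full cooperative gain is distributed with no surplus or deficit.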

  1. Application of advanced diffraction based optical metrology overlay capabilities for high-volume manufacturing

    NASA Astrophysics Data System (ADS)

    Chen, Kai-Hsiung; Huang, Guo-Tsai; Hsieh, Hung-Chih; Ni, Wei-Feng; Chuang, S. M.; Chuang, T. K.; Ke, Chih-Ming; Huang, Jacky; Rao, Shiuan-An; Cumurcu Gysen, Aysegul; d'Alfonso, Maxime; Yueh, Jenny; Izikson, Pavel; Soco, Aileen; Wu, Jon; Nooitgedagt, Tjitte; Ottens, Jeroen; Kim, Yong Ho; Ebert, Martin

    2017-03-01

    On-product overlay requirements are becoming more challenging with every next technology node due to the continued decrease of device dimensions and process tolerances. Therefore, current and future technology nodes require demanding metrology capabilities such as target designs that are robust towards process variations and high overlay measurement density (e.g. for higher order process corrections) to enable advanced process control solutions. The impact of advanced control solutions based on YieldStar overlay data is presented in this paper. Multi-patterning techniques are applied for critical layers, leading to additional overlay measurement demands. The use of 1D process steps results in the need for overlay measurements relative to more than one layer. Dealing with the increased number of overlay measurements while keeping the high measurement density and metrology accuracy at the same time presents a challenge for high volume manufacturing (HVM). These challenges are addressed by the capability to measure multi-layer targets with the recently introduced YieldStar metrology tool, YS350. On-product overlay results of such multi-layer and standard targets are presented, including measurement stability performance.

  2. Method for sequentially processing a multi-level interconnect circuit in a vacuum chamber

    NASA Technical Reports Server (NTRS)

    Routh, D. E.; Sharma, G. C. (Inventor)

    1982-01-01

    The processing of wafer devices to form multilevel interconnects for microelectronic circuits is described. The method is directed to performing the sequential steps of etching the via, removing the photo resist pattern, back sputtering the entire wafer surface and depositing the next layer of interconnect material under common vacuum conditions without exposure to atmospheric conditions. Apparatus for performing the method includes a vacuum system having a vacuum chamber in which wafers are processed on rotating turntables. The vacuum chamber is provided with an RF sputtering system and a DC magnetron sputtering system. A gas inlet is provided in the chamber for the introduction of various gases to the vacuum chamber and the creation of various gas plasma during the sputtering steps.

  3. Engineering the cell surface display of cohesins for assembly of cellulosome-inspired enzyme complexes on Lactococcus lactis

    PubMed Central

    2010-01-01

    Background The assembly and spatial organization of enzymes in naturally occurring multi-protein complexes is of paramount importance for the efficient degradation of complex polymers and biosynthesis of valuable products. The degradation of cellulose into fermentable sugars by Clostridium thermocellum is achieved by means of a multi-protein "cellulosome" complex. Assembled via dockerin-cohesin interactions, the cellulosome is associated with the cell surface during cellulose hydrolysis, forming ternary cellulose-enzyme-microbe complexes for enhanced activity and synergy. The assembly of recombinant cell surface displayed cellulosome-inspired complexes in surrogate microbes is highly desirable. The model organism Lactococcus lactis is of particular interest as it has been metabolically engineered to produce a variety of commodity chemicals including lactic acid and bioactive compounds, and can efficiently secrete an array of recombinant proteins and enzymes of varying sizes. Results Fragments of the scaffoldin protein CipA were functionally displayed on the cell surface of Lactococcus lactis. Scaffolds were engineered to contain a single cohesin module, two cohesin modules, one cohesin and a cellulose-binding module, or only a cellulose-binding module. Cell toxicity from over-expression of the proteins was circumvented by use of the nisA inducible promoter, and incorporation of the C-terminal anchor motif of the streptococcal M6 protein resulted in the successful surface-display of the scaffolds. The facilitated detection of successfully secreted scaffolds was achieved by fusion with the export-specific reporter staphylococcal nuclease (NucA). Scaffolds retained their ability to associate in vivo with an engineered hybrid reporter enzyme, E. coli β-glucuronidase fused to the type 1 dockerin motif of the cellulosomal enzyme CelS. 
Surface-anchored complexes exhibited dual enzyme activities (nuclease and β-glucuronidase), and were displayed with efficiencies approaching 10⁴ complexes/cell. Conclusions: We report the successful display of cellulosome-inspired recombinant complexes on the surface of Lactococcus lactis. Significant differences in display efficiency among constructs were observed and attributed to their structural characteristics including protein conformation and solubility, scaffold size, and the inclusion and exclusion of non-cohesin modules. The surface-display of functional scaffold proteins described here represents a key step in the development of recombinant microorganisms capable of carrying out a variety of metabolic processes including the direct conversion of cellulosic substrates into fuels and chemicals. PMID:20840763

  4. Uranium Pyrophoricity Phenomena and Prediction (FAI/00-39)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    PLYS, M.G.

    2000-10-10

    The purpose of this report is to provide a topical reference on the phenomena and prediction of uranium pyrophoricity for the Hanford Spent Nuclear Fuel (SNF) Project with specific applications to SNF Project processes and situations. Spent metallic uranium nuclear fuel is currently stored underwater at the K basins in the Hanford 100 Area, and planned processing steps include: (1) At the basins, cleaning and placing fuel elements and scrap into stainless steel multi-canister overpacks (MCOs) holding about 6 MT of fuel apiece; (2) At nearby cold vacuum drying (CVD) stations, draining, vacuum drying, and mechanically sealing the MCOs; (3) Shipping the MCOs to the Canister Storage Building (CSB) on the 200 Area plateau; and (4) Welding shut and placing the MCOs for interim (40 year) dry storage in closed CSB storage tubes cooled by natural air circulation through the surrounding vault. Damaged fuel elements have exposed and corroded fuel surfaces, which can exothermically react with water vapor and oxygen during normal process steps and in off-normal situations. A key process safety concern is the rate of reaction of damaged fuel and the potential for self-sustaining or runaway reactions, also known as uranium fires or fuel ignition. Uranium metal and one of its corrosion products, uranium hydride, are potentially pyrophoric materials. Dangers of pyrophoricity of uranium and its hydride have long been known in the U.S. Department of Energy (Atomic Energy Commission/DOE) complex and will be discussed more below; it is sufficient here to note that there are numerous documented instances of uranium fires during normal operations. The motivation for this work is to place the safety of the present process in proper perspective given past operational experience.
Steps in development of such a perspective are: (1) Description of underlying physical causes for runaway reactions, (2) Modeling physical processes to explain runaway reactions, (3) Validation of the method against experimental data, (4) Application of the method to plausibly explain operational experience, and (5) Application of the method to present process steps to demonstrate process safety and margin. Essentially, the logic above is used to demonstrate that runaway reactions cannot occur during normal SNF Project process steps, and to illustrate the depth of the technical basis for such a conclusion. Some off-normal conditions are identified here that could potentially lead to runaway reactions. However, this document is not intended to provide an exhaustive analysis of such cases. In summary, this report provides a "toolkit" of models and approaches for analysis of pyrophoricity safety issues at Hanford, and the technical basis for the recommended approaches. A summary of recommended methods appears in Section 9.0.

  5. Hot melt extrusion of ion-exchange resin for taste masking.

    PubMed

    Tan, David Cheng Thiam; Ong, Jeremy Jianming; Gokhale, Rajeev; Heng, Paul Wan Sia

    2018-05-30

    Taste masking is important for some unpleasant tasting bioactives in oral dosage forms. Among many methods available for taste-masking, use of ion-exchange resin (IER) holds promise. IER combined with hot melt extrusion (HME) may offer additional advantages over solvent methods. IER provides taste masking by complexing with the drug ions and preventing drug dissolution in the mouth. Drug-IER complexation approaches described in the literature are mainly based either on batch processing or column eluting. These methods of drug-IER complexation have obvious limitations such as high solvent volume requirements, multiple processing steps and extended processing time. Thus, the objective of this study was to develop a single-step, solvent-free, continuous HME process for complexation of drug-IER. The screening study evaluated drug to IER ratio, types of IER and drug complexation methods. In the screening study, a potassium salt of a weakly acidic carboxylate-based cationic IER was found suitable for the HME method. Thereafter, an optimization study was conducted by varying HME process parameters such as screw speed, extrusion temperature and drug to IER ratio. It was observed that extrusion temperature and drug to IER ratio are imperative in drug-IER complexation through HME. In summary, this study has established the feasibility of a continuous complexation method for drug to IER using HME for taste masking. Copyright © 2018 Elsevier B.V. All rights reserved.

  6. A fractional-N frequency divider for multi-standard wireless transceiver fabricated in 0.18 μm CMOS process

    NASA Astrophysics Data System (ADS)

    Wang, Jiafeng; Fan, Xiangning; Shi, Xiaoyang; Wang, Zhigong

    2017-12-01

    With the rapid evolution of wireless communication technology, integrating various communication modes in a mobile terminal has become the popular trend. Because of this, multi-standard wireless technology is one of the hot spots in current research. This paper presents a wideband fractional-N frequency divider of the multi-standard wireless transceiver for many applications. A high-speed divide-by-2 stage based on traditional source-coupled logic is designed for very wideband usage. A phase-switching technique and a chain of divide-by-2/3 cells give the programmable frequency divider a step size of 0.5; this narrower step reduces the phase noise of the whole frequency synthesizer. The Δ-Σ modulator is realized with an improved MASH 1-1-1 structure, which performs well in terms of noise, spurs and input dynamic range. Fabricated in a TSMC 0.18 μm CMOS process, the fractional-N frequency divider occupies a chip area of 1130 × 510 μm² and can correctly divide within the frequency range of 0.8-9 GHz. With a 1.8 V supply voltage, its division ratio ranges from 62.5 to 254 and the total current consumption is 29 mA.
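The MASH 1-1-1 structure mentioned in the abstract can be sketched behaviorally: three cascaded first-order accumulators whose carry outputs are recombined through digital differentiators, so the division-ratio offsets average to the fractional word while quantization noise is shaped toward high frequencies. The following is a minimal, generic Python model of a MASH 1-1-1 (the function name, word width and cycle count are illustrative assumptions, not details from the paper):

```python
def mash_111(frac, nbits, ncycles):
    """Behavioral model of a MASH 1-1-1 delta-sigma modulator.

    frac: fractional tuning word, 0 <= frac < 2**nbits.
    Returns the sequence of integer division-ratio offsets, whose
    long-run average approximates frac / 2**nbits.
    """
    mod = 1 << nbits
    a1 = a2 = a3 = 0                # accumulator states
    c2_d1 = c3_d1 = c3_d2 = 0      # delayed carries for the differentiators
    out = []
    for _ in range(ncycles):
        a1 += frac                  # first accumulator
        c1, a1 = a1 // mod, a1 % mod
        a2 += a1                    # second accumulator
        c2, a2 = a2 // mod, a2 % mod
        a3 += a2                    # third accumulator
        c3, a3 = a3 // mod, a3 % mod
        # combine carries: y = c1 + (1 - z^-1)*c2 + (1 - z^-1)^2 * c3
        y = c1 + (c2 - c2_d1) + (c3 - 2 * c3_d1 + c3_d2)
        c3_d2, c3_d1, c2_d1 = c3_d1, c3, c2
        out.append(y)
    return out
```

In a fractional-N synthesizer the divider's integer ratio would be N plus this offset each reference cycle; the output values span a small integer range (here -3 to +4), which is the price paid for pushing quantization noise out of band.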

  7. Systems Engineering Lessons Learned for Class D Missions

    NASA Technical Reports Server (NTRS)

    Rojdev, Kristina; Piatek, Irene; Moore, Josh; Calvert, Derek

    2015-01-01

    One of NASA's goals within human exploration is to determine how to get humans to Mars safely and to live and work on the Martian surface. To accomplish this goal, several smaller missions act as stepping-stones to the larger end goal. NASA uses these smaller missions to develop new technologies and learn about how to survive outside of Low Earth Orbit for long periods. Additionally, keeping a cadence of these missions allows the team to maintain proficiency in the complex art of bringing spacecraft to fruition. Many of these smaller missions are robotic in nature and have smaller timescales, whereas there are others that involve crew and have longer mission timelines. Given the timelines associated with these various missions, different levels of risk and rigor need to be implemented to be more in line with what is appropriate for the mission. Thus, NASA has four different classifications that range from Class A to Class D based on the mission details. One of these projects is the Resource Prospector (RP) Mission, which is a multi-center and multi-institution collaborative project to search for volatiles in the polar regions of the Moon. The RP mission is classified as a Class D mission and as such, has the opportunity to more tightly manage, and therefore accept, greater levels of risk. The requirements for Class D missions were at the forefront of the design and thus presented unique challenges in vehicle development and systems engineering processes. This paper will discuss the systems engineering process at NASA and how that process is tailored for Class D missions, specifically the RP mission.

  8. Collaboration Modality, Cognitive Load, and Science Inquiry Learning in Virtual Inquiry Environments

    ERIC Educational Resources Information Center

    Erlandson, Benjamin E.; Nelson, Brian C.; Savenye, Wilhelmina C.

    2010-01-01

    Educational multi-user virtual environments (MUVEs) have been shown to be effective platforms for situated science inquiry curricula. While researchers find MUVEs to be supportive of collaborative scientific inquiry processes, the complex mix of multi-modal messages present in MUVEs can lead to cognitive overload, with learners unable to…

  9. Improving IT Portfolio Management Decision Confidence Using Multi-Criteria Decision Making and Hypervariate Display Techniques

    ERIC Educational Resources Information Center

    Landmesser, John Andrew

    2014-01-01

    Information technology (IT) investment decision makers are required to process large volumes of complex data. An existing body of knowledge relevant to IT portfolio management (PfM), decision analysis, visual comprehension of large volumes of information, and IT investment decision making suggest Multi-Criteria Decision Making (MCDM) and…

  10. A Futures Study of Internationalization of the Carlson School of Management: Diverse Perspectives of Key Stakeholders

    ERIC Educational Resources Information Center

    D'Angelo, Anne Marie

    2010-01-01

    Internationalization is a multi-faceted, multi-dimensional and complex concept described most notably as a higher educational process that integrates an international perspective into its organizational leadership, vision, and curricular goals. Success is dependent upon ongoing engagement of a multitude of internal and external stakeholders with…

  11. Quantifying multi-dimensional attributes of human activities at various geographic scales based on smartphone tracking.

    PubMed

    Zhou, Xiaolu; Li, Dongying

    2018-05-09

    Advancements in location-aware technologies and in information and communication technology over the past decades have furthered our knowledge of the interaction between human activities and the built environment. An increasing number of studies have collected data regarding individual activities to better understand how the environment shapes human behavior. Despite this growing interest, some challenges exist in collecting and processing individuals' activity data, e.g., capturing people's precise environmental contexts and analyzing data at multiple spatial scales. In this study, we propose and implement an innovative system that integrates smartphone-based step tracking with an app and sequential tile-scan techniques to collect and process activity data. We apply the OpenStreetMap tile system to aggregate positioning points at various scales. We also propose duration, step and probability surfaces to quantify the multi-dimensional attributes of activities. Results show that, by running the app in the background, smartphones can measure multi-dimensional attributes of human activities, including space, duration, step, and location uncertainty at various spatial scales. By coordinating the Global Positioning System (GPS) sensor with the accelerometer, the app conserves battery power that the GPS sensor alone would quickly drain. Based on a test dataset, we were able to detect the recreational center and sports center as the spaces where the user was most active, among other places visited. The methods provide techniques to address key issues in analyzing human activity data. The system can support future studies on behavioral and health consequences related to individuals' environmental exposure.
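The tile-based aggregation step can be illustrated with the standard OpenStreetMap "slippy map" formula, which maps a latitude/longitude fix to integer tile indices at a given zoom level; coarser zoom levels merge nearby fixes into the same tile, which is what enables multi-scale aggregation of activity points. A minimal sketch (the function name is an assumption; the formula is the standard OSM one, not code from the study):

```python
import math

def latlon_to_tile(lat, lon, zoom):
    """Convert WGS84 coordinates to OpenStreetMap tile indices at a zoom level.

    Lower zoom values give coarser tiles, so nearby GPS fixes collapse
    into the same (x, y) index: the basis of multi-scale aggregation.
    """
    n = 2 ** zoom
    x = int((lon + 180.0) / 360.0 * n)
    lat_rad = math.radians(lat)
    y = int((1.0 - math.asinh(math.tan(lat_rad)) / math.pi) / 2.0 * n)
    return x, y
```

A useful property of this scheme is that tiles nest: integer-dividing a tile index at zoom z+1 by 2 yields the enclosing tile at zoom z, so counts accumulated at a fine scale can be rolled up to coarser scales without revisiting the raw points.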

  12. Salient contour extraction from complex natural scene in night vision image

    NASA Astrophysics Data System (ADS)

    Han, Jing; Yue, Jiang; Zhang, Yi; Bai, Lian-fa

    2014-03-01

    The theory of center-surround interaction in the non-classical receptive field can be applied to night vision information processing. In this work, an optimized compound receptive field modulation method is proposed to extract salient contours from complex natural scenes in low-light-level (LLL) and infrared images. The central idea is that multi-feature analysis can recognize the inhomogeneity in modulatory coverage more accurately, and that center-surround grouping structures satisfying Gestalt rules deserve a high connection probability. Computationally, a multi-feature contrast weighted inhibition model is presented to suppress background and lower mutual inhibition among contour elements; a fuzzy connection facilitation model is proposed to achieve the enhancement of contour response, the connection of discontinuous contours and the further elimination of randomly distributed noise and texture; and a multi-scale iterative attention method is designed to accomplish the dynamic modulation process and extract contours of targets of multiple sizes. This work provides a series of high-performance, biologically motivated computational vision models for contour detection from cluttered scenes in night vision images.

  13. Multi-step routes of capuchin monkeys in a laser pointer traveling salesman task.

    PubMed

    Howard, Allison M; Fragaszy, Dorothy M

    2014-09-01

    Prior studies have claimed that nonhuman primates plan their routes multiple steps in advance. However, a recent reexamination of multi-step route planning in nonhuman primates indicated that there is no evidence for planning more than one step ahead. We tested multi-step route planning in capuchin monkeys using a pointing device to "travel" to distal targets while stationary. This device enabled us to determine whether capuchins distinguish the spatial relationship between goals and themselves and spatial relationships between goals and the laser dot, allocentrically. In Experiment 1, two subjects were presented with identical food items in Near-Far (one item nearer to subject) and Equidistant (both items equidistant from subject) conditions with a laser dot visible between the items. Subjects moved the laser dot to the items using a joystick. In the Near-Far condition, one subject demonstrated a bias for items closest to self but the other subject chose efficiently. In the second experiment, subjects retrieved three food items in similar Near-Far and Equidistant arrangements. Both subjects preferred food items nearest the laser dot and showed no evidence of multi-step route planning. We conclude that these capuchins do not make choices on the basis of multi-step look ahead strategies. © 2014 Wiley Periodicals, Inc.
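The distinction the study draws between a one-step (nearest-target) strategy and genuine multi-step planning can be made concrete with a small sketch. The layout and function names below are invented for illustration, not taken from the experiment: a greedy chooser that always moves to the nearest remaining item can travel strictly farther than a planner that evaluates entire visiting orders before moving.

```python
import itertools
import math

def path_len(start, order):
    """Total travel distance visiting the targets in the given order."""
    pts = [start] + list(order)
    return sum(math.dist(pts[i], pts[i + 1]) for i in range(len(pts) - 1))

def greedy_route(start, targets):
    """One-step strategy: always move to the nearest remaining target."""
    remaining, route, cur = list(targets), [], start
    while remaining:
        nxt = min(remaining, key=lambda p: math.dist(cur, p))
        remaining.remove(nxt)
        route.append(nxt)
        cur = nxt
    return route

def planned_route(start, targets):
    """Multi-step look-ahead: evaluate every visiting order exhaustively."""
    return min(itertools.permutations(targets),
               key=lambda order: path_len(start, order))
```

With start (0, 0) and targets at (1, 0), (-2, 0) and (5, 0), the greedy route grabs the closest item first and pays for it later (total distance 11), while the look-ahead planner clears the left side first (total distance 9): exactly the kind of efficiency gap used to diagnose whether an animal plans more than one step ahead.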

  14. An ESL Audio-Script Writing Workshop

    ERIC Educational Resources Information Center

    Miller, Carla

    2012-01-01

    The roles of dialogue, collaborative writing, and authentic communication have been explored as effective strategies in second language writing classrooms. In this article, the stages of an innovative, multi-skill writing method, which embeds students' personal voices into the writing process, are explored. A 10-step ESL Audio Script Writing Model…

  15. Energy--What to Do until the Computer Comes.

    ERIC Educational Resources Information Center

    Johnston, Archie B.

    Drawing from Tallahassee Community College's (TCC's) experiences with energy conservation, this paper offers suggestions for reducing energy costs through computer-controlled systems and other means. After stating the energy problems caused by TCC's multi-zone heating and cooling system, the paper discusses the five-step process by which TCC…

  16. A Quantitative Tunneling/Desorption Model for the Exchange Current at the Porous Electrode/Beta - Alumina/Alkali Metal Gas Three Phase Zone at 700-1300K

    NASA Technical Reports Server (NTRS)

    Williams, R. M.; Ryan, M. A.; Saipetch, C.; LeDuc, H. G.

    1996-01-01

    The exchange current observed at porous metal electrodes on sodium or potassium beta -alumina solid electrolytes in alkali metal vapor is quantitatively modeled with a multi-step process with good agreement with experimental results.

  17. Simulating Carbon cycle and phenology in complex forests using a multi-layer process based ecosystem model; evaluation and use of 3D-CMCC-Forest Ecosystem Model in a deciduous and an evergreen neighboring forests, within the area of Brasschaat (Be)

    NASA Astrophysics Data System (ADS)

    Marconi, S.; Collalti, A.; Santini, M.; Valentini, R.

    2013-12-01

    3D-CMCC-Forest Ecosystem Model is a process-based model formerly developed for complex forest ecosystems to estimate growth, water and carbon cycles, phenology and competition processes on a daily/monthly time scale. The model integrates some characteristics of functional-structural tree models with the robustness of the light use efficiency approach. It treats different heights, ages and species as discrete classes, in competition for light (vertical structure) and space (horizontal structure). The present work evaluates the results of the recently developed daily version of 3D-CMCC-FEM for two neighboring, even-aged, mono-specific case studies. The former is a heterogeneous Pedunculate oak forest (Quercus robur L.), the latter a more homogeneous Scots pine forest (Pinus sylvestris L.). The multi-layer approach has been evaluated against a series of simplified versions to determine whether the added model complexity in canopy structure definition increases its predictive ability. Results show that a more complex structure (three height layers) is preferable for simulating heterogeneous scenarios (the Pedunculate oak stand), where the height distribution within the canopy justifies the distinction into dominant, dominated and sub-dominated layers. On the contrary, it seems that using a multi-layer approach for more homogeneous stands (the Scots pine stand) may be disadvantageous. Forcing the structure of a homogeneous stand into a multi-layer approach may in fact increase sources of uncertainty. On the other hand, forcing complex forests into a simplified mono-layer model may cause an increase in mortality and a reduction in average DBH and height. Compared with measured CO2 flux data, model results show good ability in estimating carbon sequestration trends, on both monthly/seasonal and daily time scales. Moreover, the model simulates leaf phenology quite well, as well as the combined effects of the two different forest stands on CO2 fluxes.
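The vertical competition for light that motivates the multi-layer structure is commonly modeled with Beer-Lambert attenuation: each canopy layer absorbs a fraction of the incoming light and transmits the rest to the layer below, so dominant layers intercept disproportionately more. The sketch below is a generic illustration of that principle under assumed values, not the actual 3D-CMCC-FEM equations:

```python
import math

def light_through_layers(par_top, lai_per_layer, k=0.5):
    """Beer-Lambert light attenuation through stacked canopy layers.

    par_top: photosynthetically active radiation above the canopy.
    lai_per_layer: leaf area index of each layer, top (dominant) first.
    k: light-extinction coefficient (an assumed, typical value).
    Returns the PAR absorbed by each layer, top first.
    """
    absorbed, par = [], par_top
    for lai in lai_per_layer:
        transmitted = par * math.exp(-k * lai)  # light passing through this layer
        absorbed.append(par - transmitted)
        par = transmitted
    return absorbed
```

With three equal layers the top (dominant) layer absorbs the most light and the sub-dominated layer the least, which is why distinguishing layers matters for a heterogeneous canopy but adds little, and possibly noise, for a homogeneous single-story stand.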

  18. Update of KDBI: Kinetic Data of Bio-molecular Interaction database

    PubMed Central

    Kumar, Pankaj; Han, B. C.; Shi, Z.; Jia, J.; Wang, Y. P.; Zhang, Y. T.; Liang, L.; Liu, Q. F.; Ji, Z. L.; Chen, Y. Z.

    2009-01-01

    Knowledge of the kinetics of biomolecular interactions is important for facilitating the study of cellular processes and underlying molecular events, and is essential for quantitative study and simulation of biological systems. The Kinetic Data of Bio-molecular Interaction database (KDBI) has been developed to provide information about experimentally determined kinetic data of protein–protein, protein–nucleic acid, protein–ligand, nucleic acid–ligand binding or reaction events described in the literature. To accommodate increasing demand for studying and simulating biological systems, numerous improvements and updates have been made to KDBI, including new ways to access data by pathway and molecule names, data files in Systems Biology Markup Language format, a more efficient search engine, access to published parameter sets of simulation models of 63 pathways, and a 2.3-fold increase of data (19 263 entries of 10 532 distinctive biomolecular binding and 11 954 interaction events, involving 2635 proteins/protein complexes, 847 nucleic acids, 1603 small molecules and 45 multi-step processes). KDBI is publicly available at http://bidd.nus.edu.sg/group/kdbi/kdbi.asp. PMID:18971255

  19. Deposition of tungsten metal by an immersion process

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Small, Leo J.; Brumbach, Michael T.; Clem, Paul G.

    A new multi-step, solution-phase method for the spontaneous deposition of tungsten from a room temperature ethereal solution is reported. This immersion process relies on the deposition of a sacrificial zinc coating which is galvanically displaced by the ether-mediated reduction of oxophilic WCl 6. Subsequent thermal treatment renders a crystalline, metallic tungsten film. The chemical evolution of the surface and formation of a complex intermediate tungsten species is characterized by X-ray diffraction, infrared spectroscopy, and X-ray photoelectron spectroscopy. Efficient metallic tungsten deposition is first characterized on a graphite substrate and then demonstrated on a functional carbon foam electrode. The resulting electrochemical performance of the modified electrode is interrogated with the canonical aqueous ferricyanide system. A tungsten-coated carbon foam electrode showed that both electrode resistance and overall electrochemical cell resistance were reduced by 50%, resulting in a concomitant decrease in redox peak separation from 1.902 V to 0.783 V. Furthermore, this process promises voltage efficiency gains in electrodes for energy storage technologies and demonstrates the viability of a new route to tungsten coating for technologies and industries where high conductivity and chemical stability are paramount.

  20. Genetic approaches of the Fe-S cluster biogenesis process in bacteria: Historical account, methodological aspects and future challenges.

    PubMed

    Py, Béatrice; Barras, Frédéric

    2015-06-01

    Since their discovery in the 1950s, Fe-S cluster proteins have attracted much attention from chemists, biophysicists and biochemists. In the 1980s they were joined by geneticists, who helped to realize that in vivo maturation of Fe-S cluster bound proteins required assistance of a large number of factors defining complex multi-step pathways. The question of how clusters are formed and distributed in vivo has since been the focus of much effort. Here we review how genetics, in discovering genes and investigating processes as they unfold in vivo, has provoked seminal advances toward our understanding of Fe-S cluster biogenesis. The power and limitations of genetic approaches are discussed. As a final comment, we argue how the marriage of classic strategies and new high-throughput technologies should allow genetics of Fe-S cluster biology to be even more insightful in the future. This article is part of a Special Issue entitled: Fe/S proteins: Analysis, structure, function, biogenesis and diseases. Copyright © 2015 Elsevier B.V. All rights reserved.

  1. Recent advances in the chemistry of Rh carbenoids: multicomponent reactions of diazocarbonyl compounds

    NASA Astrophysics Data System (ADS)

    Medvedev, J. J.; Nikolaev, V. A.

    2015-07-01

    Multicomponent reactions of diazo compounds catalyzed by Rh(II) complexes have become a powerful tool for organic synthesis. They enable three- or four-step processes to be carried out as one-pot procedures (actually as one step) with high stereoselectivity to give complex organic molecules, including biologically active compounds. This review addresses recent results in the chemistry of Rh-catalyzed multicomponent reactions of diazocarbonyl compounds with the intermediate formation of N-, O- and C=O-ylides. The diastereo- and enantioselectivity of these reactions and the possibility of using various co-catalysts to increase the efficiency of the processes under consideration are discussed. The bibliography includes 120 references.

  2. Environmental Signals and Regulatory Pathways That Influence Exopolysaccharide Production in Rhizobia

    PubMed Central

    Janczarek, Monika

    2011-01-01

    Rhizobia are Gram-negative bacteria that can exist either as free-living bacteria or as nitrogen-fixing symbionts inside root nodules of leguminous plants. The composition of the rhizobial outer surface, containing a variety of polysaccharides, plays a significant role in the adaptation of these bacteria to both habitats. Among rhizobial polymers, exopolysaccharide (EPS) is indispensable for the invasion of the great majority of host plants which form indeterminate-type nodules. Various functions are ascribed to this heteropolymer, including protection against environmental stress and host defense, attachment to abiotic and biotic surfaces, and signaling. The synthesis of EPS in rhizobia is a multi-step process regulated by several proteins at both the transcriptional and post-transcriptional levels. Also, some environmental factors (carbon source, nitrogen and phosphate starvation, flavonoids) and stress conditions (osmolarity, ionic strength) affect EPS production. This paper discusses recent data concerning the function of the genes required for EPS synthesis and the regulation of this process by several environmental signals. To date, the synthesis of rhizobial EPS has been best studied in two species, Sinorhizobium meliloti and Rhizobium leguminosarum. The latest data indicate that EPS synthesis in rhizobia undergoes very complex hierarchical regulation, in which proteins engaged in quorum sensing and the regulation of motility genes also participate. This finding enables a better understanding of the complex processes occurring in the rhizosphere which are crucial for successful colonization and infection of host plant roots. PMID:22174640

  3. A method for discrimination of noise and EMG signal regions recorded during rhythmic behaviors.

    PubMed

    Ying, Rex; Wall, Christine E

    2016-12-08

    Analyses of muscular activity during rhythmic behaviors provide critical data for biomechanical studies. Electrical potentials measured from muscles using electromyography (EMG) require discrimination of noise regions as the first step in analysis. An experienced analyst can accurately identify the onset and offset of EMG, but this process takes hours for a short (10-15 s) record of rhythmic EMG bursts. Existing computational techniques reduce this time but have limitations. These include a universal threshold for delimiting noise regions (i.e., a single signal value for identifying the EMG signal onset and offset), pre-processing using wide time intervals that dampen sensitivity to EMG signal characteristics, poor performance when a low-frequency component (e.g., DC offset) is present, and high computational complexity leading to a lack of time efficiency. We present a new statistical method and MATLAB script (EMG-Extractor) that includes an adaptive algorithm to discriminate noise regions from EMG, avoids these limitations, and allows multi-channel datasets to be processed. We evaluate the EMG-Extractor with EMG data on mammalian jaw-adductor muscles during mastication, a rhythmic behavior typified by low-amplitude onsets/offsets and a complex signal pattern. The EMG-Extractor consistently and accurately distinguishes noise from EMG in a manner similar to that of an experienced analyst. It outputs the raw EMG signal region in a form ready for further analysis. Copyright © 2016 Elsevier Ltd. All rights reserved.
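The difference between a universal threshold and an adaptive one can be illustrated with a toy burst detector. This is not the EMG-Extractor algorithm (which is a MATLAB script); it is a minimal Python sketch, with assumed window and scaling parameters, of the general envelope-plus-adaptive-threshold idea: a moving RMS envelope is computed, and the burst/noise threshold is derived from the record's own noise floor rather than fixed in advance.

```python
import statistics

def burst_regions(signal, win=25, k=3.0):
    """Flag samples as EMG burst (True) or noise (False).

    A moving RMS envelope is thresholded at k robust spreads above
    the envelope's baseline (median), so the cutoff adapts to each
    record instead of being a single universal value.
    """
    half = win // 2
    env = []
    for i in range(len(signal)):
        seg = signal[max(0, i - half): i + half + 1]
        env.append((sum(x * x for x in seg) / len(seg)) ** 0.5)
    base = statistics.median(env)                       # noise-floor estimate
    spread = statistics.median([abs(e - base) for e in env]) or 1e-12
    thresh = base + k * spread
    return [e > thresh for e in env]
```

On a synthetic record of low-amplitude noise with one high-amplitude burst, the flags bracket the burst; real EMG would additionally need smoothing of brief threshold crossings, which is omitted here for brevity.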

  4. Artificial concurrent catalytic processes involving enzymes.

    PubMed

    Köhler, Valentin; Turner, Nicholas J

    2015-01-11

    The concurrent operation of multiple catalysts can lead to enhanced reaction features, including (i) simultaneous linear multi-step transformations in a single reaction flask; (ii) the control of intermediate equilibria; (iii) stereoconvergent transformations; and (iv) rapid processing of labile reaction products. Enzymes occupy a prominent position for the development of such processes, due to their high potential compatibility with other biocatalysts. Genes for different enzymes can be co-expressed to reconstruct natural or construct artificial pathways and applied in the form of engineered whole-cell biocatalysts to carry out complex transformations or, alternatively, the enzymes can be combined in vitro after isolation. Moreover, enzyme variants provide a wider substrate scope for a given reaction and often display altered selectivities and specificities. Man-made transition metal catalysts and engineered or artificial metalloenzymes also widen the range of reactivities and catalysed reactions that are potentially employable. Cascades for simultaneous cofactor or co-substrate regeneration or co-product removal are now firmly established. Many applications of more ambitious concurrent cascade catalysis are only just beginning to appear in the literature. The current review presents some of the most recent examples, with an emphasis on the combination of transition metal catalysis with enzymatic catalysis, and aims to encourage researchers to contribute to this emerging field.

  5. Deposition of tungsten metal by an immersion process

    DOE PAGES

    Small, Leo J.; Brumbach, Michael T.; Clem, Paul G.; ...

    2017-03-23

    A new multi-step, solution-phase method for the spontaneous deposition of tungsten from a room temperature ethereal solution is reported. This immersion process relies on the deposition of a sacrificial zinc coating which is galvanically displaced by the ether-mediated reduction of oxophilic WCl 6. Subsequent thermal treatment renders a crystalline, metallic tungsten film. The chemical evolution of the surface and formation of a complex intermediate tungsten species is characterized by X-ray diffraction, infrared spectroscopy, and X-ray photoelectron spectroscopy. Efficient metallic tungsten deposition is first characterized on a graphite substrate and then demonstrated on a functional carbon foam electrode. The resulting electrochemical performance of the modified electrode is interrogated with the canonical aqueous ferricyanide system. A tungsten-coated carbon foam electrode showed that both electrode resistance and overall electrochemical cell resistance were reduced by 50%, resulting in a concomitant decrease in redox peak separation from 1.902 V to 0.783 V. Furthermore, this process promises voltage efficiency gains in electrodes for energy storage technologies and demonstrates the viability of a new route to tungsten coating for technologies and industries where high conductivity and chemical stability are paramount.

  6. Discovery of multi-ring basins - Gestalt perception in planetary science

    NASA Technical Reports Server (NTRS)

    Hartmann, W. K.

    1981-01-01

    Early selenographers resolved individual structural components of multi-ring basin systems but missed the underlying large-scale multi-ring basin patterns. The recognition of multi-ring basins as a general class of planetary features can be divided into five steps. Gilbert (1893) took a first step in recognizing radial 'sculpture' around the Imbrium basin system. Several writers through the 1940's rediscovered the radial sculpture and extended this concept by describing concentric rings around several circular maria. Some reminiscences are given about the fourth step - discovery of the Orientale basin and other basin systems by rectified lunar photography at the University of Arizona in 1961-62. Multi-ring basins remained a lunar phenomenon until the fifth step - discovery of similar systems of features on other planets, such as Mars (1972), Mercury (1974), and possibly Callisto and Ganymede (1979). This sequence is an example of gestalt recognition whose implications for scientific research are discussed.

  7. Efficient brain lesion segmentation using multi-modality tissue-based feature selection and support vector machines.

    PubMed

    Fiot, Jean-Baptiste; Cohen, Laurent D; Raniga, Parnesh; Fripp, Jurgen

    2013-09-01

    Support vector machines (SVM) are machine learning techniques that have been used for segmentation and classification of medical images, including segmentation of white matter hyper-intensities (WMH). Current approaches using SVM for WMH segmentation extract features from the brain and classify these, followed by complex post-processing steps to remove false positives. The method presented in this paper combines advanced pre-processing, tissue-based feature selection and SVM classification to obtain efficient and accurate WMH segmentation. Features from 125 patients, generated from up to four MR modalities [T1-w, T2-w, proton-density and fluid-attenuated inversion recovery (FLAIR)], differing neighbourhood sizes and the use of multi-scale features were compared. We found that although using all four modalities gave the best overall classification (average Dice scores of 0.54 ± 0.12, 0.72 ± 0.06 and 0.82 ± 0.06 for small, moderate and severe lesion loads, respectively), this was not significantly different (p = 0.50) from using just the T1-w and FLAIR sequences (Dice scores of 0.52 ± 0.13, 0.71 ± 0.08 and 0.81 ± 0.07). Furthermore, there was a negligible difference between using 5 × 5 × 5 and 3 × 3 × 3 features (p = 0.93). Finally, we show that careful consideration of features and pre-processing techniques not only saves storage space and computation time but also leads to more efficient classification, which outperforms the one based on all features with post-processing. Copyright © 2013 John Wiley & Sons, Ltd.
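
    The voxel-classification step described above can be sketched with scikit-learn. The two-dimensional features below are synthetic stand-ins for per-voxel T1-w and FLAIR intensities; the feature distributions, class separation, and SVM parameters are illustrative assumptions, not the paper's data or pipeline:

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
# Synthetic stand-ins: each voxel is described by two features
# (e.g. T1-w and FLAIR intensity); lesion voxels appear brighter on FLAIR.
healthy = rng.normal([0.0, 0.0], 0.5, size=(400, 2))
lesion = rng.normal([0.3, 1.5], 0.5, size=(400, 2))
X = np.vstack([healthy, lesion])
y = np.array([0] * 400 + [1] * 400)  # 0 = normal tissue, 1 = WMH

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
clf = SVC(kernel="rbf", C=1.0).fit(X_tr, y_tr)
acc = clf.score(X_te, y_te)
```

    In a real pipeline, tissue-based feature selection would restrict which voxels and features reach the classifier before this step.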

  8. Single-Run Single-Mask Inductively-Coupled-Plasma Reactive-Ion-Etching Process for Fabricating Suspended High-Aspect-Ratio Microstructures

    NASA Astrophysics Data System (ADS)

    Yang, Yao-Joe; Kuo, Wen-Cheng; Fan, Kuang-Chao

    2006-01-01

    In this work, we present a single-run single-mask (SRM) process for fabricating suspended high-aspect-ratio structures on standard silicon wafers using an inductively coupled plasma-reactive ion etching (ICP-RIE) etcher. This process eliminates extra fabrication steps which are required for structure release after trench etching. Released microstructures with 120 μm thickness are obtained by this process. The corresponding maximum aspect ratio of the trench is 28. The SRM process is an extended version of the standard process proposed by BOSCH GmbH (BOSCH process). The first step of the SRM process is a standard BOSCH process for trench etching, then a polymer layer is deposited on trench sidewalls as a protective layer for the subsequent structure-releasing step. The structure is released by dry isotropic etching after the polymer layer on the trench floor is removed. All the steps can be integrated into a single-run ICP process. Also, only one mask is required. Therefore, the process complexity and fabrication cost can be effectively reduced. Discussions on each SRM step and considerations for avoiding undesired etching of the silicon structures during the release process are also presented.

  9. Multi-objective design optimization of antenna structures using sequential domain patching with automated patch size determination

    NASA Astrophysics Data System (ADS)

    Koziel, Slawomir; Bekasiewicz, Adrian

    2018-02-01

    In this article, a simple yet efficient and reliable technique for fully automated multi-objective design optimization of antenna structures using sequential domain patching (SDP) is discussed. The optimization procedure according to SDP is a two-step process: (i) obtaining the initial set of Pareto-optimal designs representing the best possible trade-offs between considered conflicting objectives, and (ii) Pareto set refinement for yielding the optimal designs at the high-fidelity electromagnetic (EM) simulation model level. For the sake of computational efficiency, the first step is realized at the level of a low-fidelity (coarse-discretization) EM model by sequential construction and relocation of small design space segments (patches) in order to create a path connecting the extreme Pareto front designs obtained beforehand. The second stage involves response correction techniques and local response surface approximation models constructed by reusing EM simulation data acquired in the first step. A major contribution of this work is an automated procedure for determining the patch dimensions. It allows for appropriate selection of the number of patches for each geometry variable so as to ensure reliability of the optimization process while maintaining its low cost. The importance of this procedure is demonstrated by comparing it with uniform patch dimensions.
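
    The Pareto-optimality notion underlying step (i) can be illustrated with a minimal non-dominated filter, a brute-force sketch suitable for small design sets. The objective values below are hypothetical (e.g. antenna size versus reflection coefficient), and both objectives are minimized:

```python
def pareto_front(designs):
    """Return the non-dominated subset of `designs` (minimize all objectives).

    A design is dominated if some other design is no worse in every
    objective and strictly better in at least one.
    """
    front = []
    for i, d in enumerate(designs):
        dominated = any(
            all(o[k] <= d[k] for k in range(len(d))) and
            any(o[k] < d[k] for k in range(len(d)))
            for j, o in enumerate(designs) if j != i
        )
        if not dominated:
            front.append(d)
    return front

# Hypothetical (size, reflection-coefficient) trade-offs for five designs
designs = [(1.0, 5.0), (2.0, 4.0), (3.0, 3.0), (2.0, 6.0), (4.0, 4.0)]
front = pareto_front(designs)
```

    In the SDP procedure itself, candidate points come from low-fidelity EM simulations rather than a fixed list, but the dominance test is the same.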

  10. Multi-Criteria Approach in Multifunctional Building Design Process

    NASA Astrophysics Data System (ADS)

    Gerigk, Mateusz

    2017-10-01

    The paper presents a new approach to the multifunctional building design process. It defines problems related to the design of complex multifunctional buildings. Contemporary urban areas are characterized by very intensive use of space: buildings are being built larger and contain more diverse functions to meet the needs of a large number of users within a single structure. These trends show the need to treat design objects as an organized structure that must meet current design criteria. The design process, viewed as a complex system, is a theoretical model that forms the basis for optimizing solutions over the entire life cycle of the building. From the concept phase through the exploitation phase to the disposal phase, multipurpose spaces should guarantee aesthetics, functionality, system efficiency, system safety and environmental protection in the best possible way. The result of the analysis of the design process is presented as a theoretical model of the multifunctional structure. Expressing the multi-criteria model as a Cartesian product allows a holistic representation of the designed building to be created in the form of a graph model. The proposed network is a theoretical basis that can be used in the design process of complex engineering systems. The systematic multi-criteria approach makes it possible to maintain control over the entire design process and to achieve the best possible performance. With respect to current design requirements, there are no established design rules for multifunctional buildings in relation to their operating phase. Enriching the basic criteria with a functional flexibility criterion makes it possible to extend the exploitation phase, which brings advantages on many levels.

  11. A novel method for a multi-level hierarchical composite with brick-and-mortar structure

    PubMed Central

    Brandt, Kristina; Wolff, Michael F. H.; Salikov, Vitalij; Heinrich, Stefan; Schneider, Gerold A.

    2013-01-01

    The fascination with hierarchically structured hard tissues such as enamel or nacre arises from their unique structure-property relationship. Over recent decades this has motivated numerous syntheses of composites mimicking the brick-and-mortar structure of nacre. However, there is still a lack of synthetic engineering materials displaying a true hierarchical structure. Here, we present a novel multi-step processing route for anisotropic 2-level hierarchical composites by combining different coating techniques on different length scales. It comprises polymer-encapsulated ceramic particles as building blocks for the first level, followed by spouted bed spray granulation for a second level, and finally directional hot pressing to anisotropically consolidate the composite. The microstructure achieved reveals a brick-and-mortar hierarchical structure with distinct, though not yet optimized, mechanical properties on each level. It opens up a completely new processing route for the synthesis of multi-level hierarchically structured composites, giving prospects for multi-functional structure-property relationships. PMID:23900554

  12. A novel method for a multi-level hierarchical composite with brick-and-mortar structure.

    PubMed

    Brandt, Kristina; Wolff, Michael F H; Salikov, Vitalij; Heinrich, Stefan; Schneider, Gerold A

    2013-01-01

    The fascination with hierarchically structured hard tissues such as enamel or nacre arises from their unique structure-property relationship. Over recent decades this has motivated numerous syntheses of composites mimicking the brick-and-mortar structure of nacre. However, there is still a lack of synthetic engineering materials displaying a true hierarchical structure. Here, we present a novel multi-step processing route for anisotropic 2-level hierarchical composites by combining different coating techniques on different length scales. It comprises polymer-encapsulated ceramic particles as building blocks for the first level, followed by spouted bed spray granulation for a second level, and finally directional hot pressing to anisotropically consolidate the composite. The microstructure achieved reveals a brick-and-mortar hierarchical structure with distinct, though not yet optimized, mechanical properties on each level. It opens up a completely new processing route for the synthesis of multi-level hierarchically structured composites, giving prospects for multi-functional structure-property relationships.

  13. A novel method for a multi-level hierarchical composite with brick-and-mortar structure

    NASA Astrophysics Data System (ADS)

    Brandt, Kristina; Wolff, Michael F. H.; Salikov, Vitalij; Heinrich, Stefan; Schneider, Gerold A.

    2013-07-01

    The fascination with hierarchically structured hard tissues such as enamel or nacre arises from their unique structure-property relationship. Over recent decades this has motivated numerous syntheses of composites mimicking the brick-and-mortar structure of nacre. However, there is still a lack of synthetic engineering materials displaying a true hierarchical structure. Here, we present a novel multi-step processing route for anisotropic 2-level hierarchical composites by combining different coating techniques on different length scales. It comprises polymer-encapsulated ceramic particles as building blocks for the first level, followed by spouted bed spray granulation for a second level, and finally directional hot pressing to anisotropically consolidate the composite. The microstructure achieved reveals a brick-and-mortar hierarchical structure with distinct, though not yet optimized, mechanical properties on each level. It opens up a completely new processing route for the synthesis of multi-level hierarchically structured composites, giving prospects for multi-functional structure-property relationships.

  14. Multi-scale heat and mass transfer modelling of cell and tissue cryopreservation

    PubMed Central

    Xu, Feng; Moon, Sangjun; Zhang, Xiaohui; Shao, Lei; Song, Young Seok; Demirci, Utkan

    2010-01-01

    Cells and tissues undergo complex physical processes during cryopreservation. Understanding the underlying physical phenomena is critical to improving current cryopreservation methods and to developing new techniques. Here, we describe multi-scale approaches for modelling cell and tissue cryopreservation, including heat transfer at the macroscale and crystallization, cell volume change and mass transport across cell membranes at the microscale. These multi-scale approaches allow cell and tissue cryopreservation to be studied across the relevant length scales. PMID:20047939
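
    The microscale water-transport component mentioned above is commonly described by a two-parameter osmotic model in which cell volume changes in proportion to the transmembrane osmolality difference. The sketch below integrates such a model with a forward-Euler step; all quantities are dimensionless and purely illustrative, and the lumped rate constant `k` stands in for the product of membrane permeability, surface area and RT:

```python
def osmotic_volume_response(V0=1.0, Vb=0.3, Mi0=0.3, Me=0.6,
                            k=0.1, t_end=60.0, dt=0.01):
    """Forward-Euler integration of a two-parameter osmotic model:
    dV/dt = -k * (Me - Mi), with intracellular osmolality
    Mi = n_s / (V - Vb), where Vb is the osmotically inactive volume.
    All parameters are dimensionless and illustrative.
    """
    n_s = Mi0 * (V0 - Vb)  # intracellular solute content (conserved)
    V, trajectory = V0, [V0]
    for _ in range(int(t_end / dt)):
        Mi = n_s / (V - Vb)
        V += -k * (Me - Mi) * dt
        trajectory.append(V)
    return trajectory

traj = osmotic_volume_response()
# Equilibrium volume: V_eq = Vb + n_s / Me = 0.3 + 0.21 / 0.6 = 0.65
```

    With a hypertonic external medium (Me > Mi0), the cell shrinks monotonically toward the equilibrium volume, the qualitative behaviour exploited when loading cryoprotectants.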

  15. Neurophysiological Basis of Multi-Scale Entropy of Brain Complexity and Its Relationship With Functional Connectivity.

    PubMed

    Wang, Danny J J; Jann, Kay; Fan, Chang; Qiao, Yang; Zang, Yu-Feng; Lu, Hanbing; Yang, Yihong

    2018-01-01

    Recently, non-linear statistical measures such as multi-scale entropy (MSE) have been introduced as indices of the complexity of electrophysiology and fMRI time series across multiple time scales. In this work, we investigated the neurophysiological underpinnings of the complexity (MSE) of electrophysiology and fMRI signals and their relations to functional connectivity (FC). MSE and FC analyses were performed on simulated data using a neural-mass-model-based brain network model with the Brain Dynamics Toolbox, on animal models with concurrent recording of fMRI and electrophysiology in conjunction with pharmacological manipulations, and on resting-state fMRI data from the Human Connectome Project. Our results show that the complexity of regional electrophysiology and fMRI signals is positively correlated with network FC. The associations between MSE and FC depend on the temporal scales or frequencies, with stronger associations at lower temporal frequencies. Our results from theoretical modeling, animal experiments and human fMRI indicate that (1) regional neural complexity and network FC may be two related aspects of the brain's information processing: the more complex a region's neural activity, the higher its FC with other brain regions; (2) MSE at high and low frequencies may represent local and distributed information processing across brain regions, respectively. Based on the literature and our data, we propose that the complexity of regional neural signals may serve as an index of the brain's capacity for information processing: increased complexity may indicate greater transition or exploration between different states of brain networks, and thereby a greater propensity for information processing.
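
    The MSE measure discussed above coarse-grains a time series and computes sample entropy at each scale. A minimal NumPy sketch follows; the parameter choices (m = 2, tolerance r = 0.2·SD) are conventional defaults assumed here, and the pairwise-matching scheme is simplified relative to production implementations:

```python
import numpy as np

def coarse_grain(x, scale):
    """Average consecutive non-overlapping windows of length `scale`."""
    n = len(x) // scale
    return x[:n * scale].reshape(n, scale).mean(axis=1)

def sample_entropy(x, m=2, r=0.2):
    """Negative log of the conditional probability that segments matching
    for m points (within tolerance r * SD, Chebyshev distance) also match
    for m + 1 points. Simplified: all embeddings used, self-matches excluded.
    """
    x = np.asarray(x, dtype=float)
    tol = r * x.std()

    def match_pairs(mm):
        emb = np.array([x[i:i + mm] for i in range(len(x) - mm + 1)])
        d = np.abs(emb[:, None, :] - emb[None, :, :]).max(axis=2)
        return (np.count_nonzero(d < tol) - len(emb)) / 2  # drop i == j

    b, a = match_pairs(m), match_pairs(m + 1)
    return -np.log(a / b) if a > 0 and b > 0 else np.inf

def multiscale_entropy(x, scales=(1, 2, 3, 4), m=2, r=0.2):
    return [sample_entropy(coarse_grain(np.asarray(x, float), s), m, r)
            for s in scales]
```

    Plotting the returned values against scale gives the MSE curve whose low- and high-frequency ends the study relates to distributed and local processing.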

  16. Effects of the Acrylic Polyol Structure and the Selectivity of the Employed Catalyst on the Performance of Two-Component Aqueous Polyurethane Coatings

    PubMed Central

    Cakic, Suzana; Lacnjevac, Caslav; Stamenkovic, Jakov; Ristic, Nikola; Takic, Ljiljana; Barac, Miroljub; Gligoric, Miladin

    2007-01-01

    Two kinds of aqueous acrylic polyols (single-step and multi-step synthesis types) have been investigated for their performance in two-component aqueous polyurethane applications, using more selective catalysts. Aliphatic polyfunctional isocyanates based on hexamethylene diisocyanate have been employed as suitable hardeners. The complex of zirconium, commercially known as K-KAT®XC-6212, and manganese(III) complexes with mixed ligands based on a derivative of maleic acid have been used as catalysts in this study. Both of the aqueous polyols give good results, in terms of application and hardness, when elevated temperatures and more selective catalysts are applied. A more selective catalyst promotes the reaction between the isocyanate and polyol components. This increases the percentage of urethane bonds and the degree of hardness in the films formed from the two components of aqueous polyurethane lacquers. The polyol based on the single-step synthesis route is favourable in terms of pot life and hardness. The obtained results show that the performance of two-component aqueous polyurethane coatings depends on the polymer structure of the polyols as well as on the selectivity of the employed catalyst.

  17. Determination of helix orientations in a flexible DNA by multi-frequency EPR spectroscopy.

    PubMed

    Grytz, C M; Kazemi, S; Marko, A; Cekan, P; Güntert, P; Sigurdsson, S Th; Prisner, T F

    2017-11-15

    Distance measurements are performed between a pair of spin labels attached to nucleic acids using Pulsed Electron-Electron Double Resonance (PELDOR, also called DEER) spectroscopy which is a complementary tool to other structure determination methods in structural biology. The rigid spin label Ç, when incorporated pairwise into two helical parts of a nucleic acid molecule, allows the determination of both the mutual orientation and the distance between those labels, since Ç moves rigidly with the helix to which it is attached. We have developed a two-step protocol to investigate the conformational flexibility of flexible nucleic acid molecules by multi-frequency PELDOR. In the first step, a library with a broad collection of conformers, which are in agreement with topological constraints, NMR restraints and distances derived from PELDOR, was created. In the second step, a weighted structural ensemble of these conformers was chosen, such that it fits the multi-frequency PELDOR time traces of all doubly Ç-labelled samples simultaneously. This ensemble reflects the global structure and the conformational flexibility of the two-way DNA junction. We demonstrate this approach on a flexible bent DNA molecule, consisting of two short helical parts with a five adenine bulge at the center. The kink and twist motions between both helical parts were quantitatively determined and showed high flexibility, in agreement with a Förster Resonance Energy Transfer (FRET) study on a similar bent DNA motif. The approach presented here should be useful to describe the relative orientation of helical motifs and the conformational flexibility of nucleic acid structures, both alone and in complexes with proteins and other molecules.

  18. Gynecologic oncology group strategies to improve timeliness of publication.

    PubMed

    Bialy, Sally; Blessing, John A; Stehman, Frederick B; Reardon, Anne M; Blaser, Kim M

    2013-08-01

    The Gynecologic Oncology Group (GOG) is a multi-institution cooperative group funded by the National Cancer Institute to conduct clinical trials encompassing clinical and basic scientific research in gynecologic malignancies. These results are disseminated via publication in peer-reviewed journals. This process requires collaboration of numerous investigators located in diverse cancer research centers. Coordination of manuscript development is positioned within the Statistical and Data Center (SDC), thus allowing the SDC personnel to manage the process and refine strategies to promote earlier dissemination of results. A major initiative to improve timeliness, utilizing the assignment, monitoring, and enforcement of deadlines for each phase of manuscript development, is the focus of this investigation. The objective was to document improvement in timeliness by comparing deadline compliance and time to journal submission following expanded administrative and technological initiatives implemented in 2006. Major steps in the publication process include generation of first draft by the First Author and submission to SDC, Co-author review, editorial review by Publications Subcommittee, response to journal critique, and revision. Associated with each step are responsibilities of First Author to write or revise, collaborating Biostatistician to perform analysis and interpretation, and assigned SDC Clinical Trials Editorial Associate to format/revise according to journal requirements. Upon the initiation of each step, a deadline for completion is assigned. In order to improve efficiency, a publications database was developed to track potential steps in manuscript development that enables the SDC Director of Administration and the Publications Subcommittee Chair to assign, monitor, and enforce deadlines. They, in turn, report progress to Group Leadership through the Operations Committee.
The success of the strategies utilized to improve the GOG publication process was assessed by comparing the timeliness of each potential step in the development of primary Phase II manuscripts during 2003-2006 versus 2007-2010. Improvement was noted in 10 of 11 identified steps resulting in a cumulative average improvement of 240 days from notification of data maturity to First Author through first submission to a journal. Moreover, the average time to journal acceptance has improved by an average of 346 days. The investigation is based on only Phase II trials to ensure comparability of manuscript complexity. Nonetheless, the procedures employed are applicable to the development of any clinical trials manuscript. The assignment, monitoring, and enforcement of deadlines for all stages of manuscript development have resulted in increased efficiency and timeliness. The positioning and support of manuscript development within the SDC provide a valuable resource to authors in meeting assigned deadlines, accomplishing peer review, and complying with journal requirements.

  19. Benzimidazoles: an ideal privileged drug scaffold for the design of multitargeted anti-inflammatory ligands.

    PubMed

    Kaur, Gaganpreet; Kaur, Maninder; Silakari, Om

    2014-01-01

    Recent research endeavors to discover multi-target ligands, an increasingly feasible and attractive alternative to existing mono-targeted drugs, for the treatment of the complex, multi-factorial inflammation process that underlies a plethora of debilitating health conditions. To realize this option, exploration of a relevant chemical core scaffold is essential. The privileged benzimidazole scaffold, a historically versatile structural motif, could offer a viable starting point in the search for novel multi-target ligands against the multi-factorial inflammation process since, when appropriately substituted, it can selectively modulate diverse receptors, pathways and enzymes associated with the pathogenesis of inflammation. Despite this remarkable capability, the multi-target capacity of the benzimidazole scaffold remains largely unexploited. With this in focus, the present review article provides a synopsis of published research to exemplify the valuable use of the benzimidazole nucleus and focuses on its suitability as a starting scaffold for developing multi-targeted anti-inflammatory ligands.

  20. Resolution of singularities for multi-loop integrals

    NASA Astrophysics Data System (ADS)

    Bogner, Christian; Weinzierl, Stefan

    2008-04-01

    We report on a program for the numerical evaluation of divergent multi-loop integrals. The program is based on iterated sector decomposition. We improve the original algorithm of Binoth and Heinrich such that the program is guaranteed to terminate. The program can be used to compute numerically the Laurent expansion of divergent multi-loop integrals regulated by dimensional regularisation. The symbolic and the numerical steps of the algorithm are combined into one program. Program summary: Program title: sector_decomposition. Catalogue identifier: AEAG_v1_0. Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEAG_v1_0.html. Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland. Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html. No. of lines in distributed program, including test data, etc.: 47 506. No. of bytes in distributed program, including test data, etc.: 328 485. Distribution format: tar.gz. Programming language: C++. Computer: all. Operating system: Unix. RAM: depending on the complexity of the problem. Classification: 4.4. External routines: GiNaC, available from http://www.ginac.de; GNU Scientific Library, available from http://www.gnu.org/software/gsl. Nature of problem: computation of divergent multi-loop integrals. Solution method: sector decomposition. Restrictions: only limited by the available memory and CPU time. Running time: depending on the complexity of the problem.

  1. Communication in diagnostic radiology: meeting the challenges of complexity.

    PubMed

    Larson, David B; Froehle, Craig M; Johnson, Neil D; Towbin, Alexander J

    2014-11-01

    As patients and information flow through the imaging process, value is added step-by-step when information is acquired, interpreted, and communicated back to the referring clinician. However, radiology information systems are often plagued with communication errors and delays. This article presents theories and recommends strategies to continuously improve communication in the complex environment of modern radiology. Communication theories, methods, and systems that have proven their effectiveness in other environments can serve as models for radiology.

  2. Environmental hazard mapping using GIS and AHP - A case study of Dong Trieu District in Quang Ninh Province, Vietnam

    NASA Astrophysics Data System (ADS)

    Anh, N. K.; Phonekeo, V.; My, V. C.; Duong, N. D.; Dat, P. T.

    2014-02-01

    In recent years, the Vietnamese economy has been growing rapidly, causing a serious decline in environmental quality, especially in industrial and mining areas. This poses an enormous threat to socially sustainable development and to human health. Environmental quality assessment and protection are complex and dynamic processes, since they involve spatial information from multi-sector, multi-region and multi-field sources and require complicated data processing. Therefore, an effective environmental protection information system is needed, in which the relevant factors hidden in these complex relationships become clear and visible. In this paper, the authors present the methodology used to generate environmental hazard maps by integrating the Analytic Hierarchy Process (AHP) and Geographic Information Systems (GIS). We demonstrate the results obtained for the study area in Dong Trieu district. This research contributes an overall perspective of environmental quality and identifies the most affected areas, where the administration urgently needs to establish appropriate policy to improve and protect the environment.
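
    The AHP step combines expert pairwise comparisons into criterion weights via the principal eigenvector of the comparison matrix. A minimal sketch follows; the 3×3 matrix and the criteria named in the comments are hypothetical illustrations, not judgments from the study:

```python
import numpy as np

# Hypothetical Saaty-scale (1-9) pairwise comparisons for three criteria,
# e.g. (air quality, water quality, soil contamination).
A = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 3.0],
              [1/5, 1/3, 1.0]])

def ahp_weights(A):
    """Criterion weights = normalized principal eigenvector of A."""
    vals, vecs = np.linalg.eig(A)
    k = int(np.argmax(vals.real))
    w = np.abs(vecs[:, k].real)
    return w / w.sum()

def consistency_ratio(A):
    """CR = CI / RI; judgments are usually deemed acceptable when CR < 0.1."""
    n = A.shape[0]
    lam = float(np.max(np.linalg.eigvals(A).real))
    ci = (lam - n) / (n - 1)
    ri = {3: 0.58, 4: 0.90, 5: 1.12}[n]  # Saaty's random consistency index
    return ci / ri

w = ahp_weights(A)           # weights for the hazard-map overlay
cr = consistency_ratio(A)    # sanity check on the judgments
```

    In the GIS stage, the resulting weights would multiply the corresponding criterion raster layers cell by cell to produce the composite hazard score.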

  3. Comprehensive proteomic analysis of the human spliceosome

    NASA Astrophysics Data System (ADS)

    Zhou, Zhaolan; Licklider, Lawrence J.; Gygi, Steven P.; Reed, Robin

    2002-09-01

    The precise excision of introns from pre-messenger RNA is performed by the spliceosome, a macromolecular machine containing five small nuclear RNAs and numerous proteins. Much has been learned about the protein components of the spliceosome from analysis of individual purified small nuclear ribonucleoproteins and salt-stable spliceosome `core' particles. However, the complete set of proteins that constitutes intact functional spliceosomes has yet to be identified. Here we use maltose-binding protein affinity chromatography to isolate spliceosomes in highly purified and functional form. Using nanoscale microcapillary liquid chromatography tandem mass spectrometry, we identify ~145 distinct spliceosomal proteins, making the spliceosome the most complex cellular machine so far characterized. Our spliceosomes comprise all previously known splicing factors and 58 newly identified components. The spliceosome contains at least 30 proteins with known or putative roles in gene expression steps other than splicing. This complexity may be required not only for splicing multi-intronic metazoan pre-messenger RNAs, but also for mediating the extensive coupling between splicing and other steps in gene expression.

  4. A modular platform for one-step assembly of multi-component membrane systems by fusion of charged proteoliposomes

    NASA Astrophysics Data System (ADS)

    Ishmukhametov, Robert R.; Russell, Aidan N.; Berry, Richard M.

    2016-10-01

    An important goal in synthetic biology is the assembly of biomimetic cell-like structures, which combine multiple biological components in synthetic lipid vesicles. A key limiting assembly step is the incorporation of membrane proteins into the lipid bilayer of the vesicles. Here we present a simple method for delivery of membrane proteins into a lipid bilayer within 5 min. Fusogenic proteoliposomes, containing charged lipids and membrane proteins, fuse with oppositely charged bilayers, with no requirement for detergent or fusion-promoting proteins, and deliver large, fragile membrane protein complexes into the target bilayers. We demonstrate the feasibility of our method by assembling a minimal electron transport chain capable of adenosine triphosphate (ATP) synthesis, combining Escherichia coli F1Fo ATP-synthase and the primary proton pump bo3-oxidase, into synthetic lipid vesicles with sizes ranging from 100 nm to ~10 μm. This provides a platform for the combination of multiple sets of membrane protein complexes into cell-like artificial structures.

  5. A step-by-step methodology for enterprise interoperability projects

    NASA Astrophysics Data System (ADS)

    Chalmeta, Ricardo; Pazos, Verónica

    2015-05-01

    Enterprise interoperability is one of the key factors for enhancing enterprise competitiveness. Achieving enterprise interoperability is an extremely complex process which involves different technological, human and organisational elements. In this paper we present a framework to support enterprise interoperability. The framework has been developed taking into account the three domains of interoperability: Enterprise Modelling, Architecture and Platform, and Ontologies. The main novelty of the framework in comparison to existing ones is that it includes a step-by-step methodology that explains how to carry out an enterprise interoperability project taking into account different interoperability views, such as business, process, human resources, technology, knowledge and semantics.

  6. Development of a 3-step straight-through purification strategy combining membrane adsorbers and resins.

    PubMed

    Hughson, Michael D; Cruz, Thayana A; Carvalho, Rimenys J; Castilho, Leda R

    2017-07-01

    The pressures to efficiently produce complex biopharmaceuticals at reduced costs are driving the development of novel techniques, such as in downstream processing with straight-through processing (STP). This method involves directly and sequentially purifying a particular target with minimal holding steps. This work developed and compared six different 3-step STP strategies, combining membrane adsorbers, monoliths, and resins, to purify a large, complex, and labile glycoprotein from Chinese hamster ovary cell culture supernatant. The best performing pathway was cation exchange chromatography to hydrophobic interaction chromatography to affinity chromatography with an overall product recovery of up to 88% across the process and significant clearance of DNA and protein impurities. This work establishes a platform and considerations for the development of STP of biopharmaceutical products and highlights its suitability for integration with single-use technologies and continuous production methods. © 2017 American Institute of Chemical Engineers Biotechnol. Prog., 33:931-940, 2017. © 2017 American Institute of Chemical Engineers.

  7. Imaging Study of Multi-Crystalline Silicon Wafers Throughout the Manufacturing Process: Preprint

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Johnston, S.; Yan, F.; Zaunbracher, K.

    2011-07-01

    Imaging techniques are applied to multi-crystalline silicon bricks, wafers at various process steps, and finished solar cells. Photoluminescence (PL) imaging is used to characterize defects and material quality on bricks and wafers. Defect regions within the wafers are influenced by brick position within an ingot and height within the brick. The defect areas in as-cut wafers are compared to imaging results from reverse-bias electroluminescence and dark lock-in thermography and cell parameters of near-neighbor finished cells. Defect areas are also characterized by defect band emissions. The defect areas measured by these techniques on as-cut wafers are shown to correlate to finished cell performance.

  8. Tracking children's mental states while solving algebra equations.

    PubMed

    Anderson, John R; Betts, Shawn; Ferris, Jennifer L; Fincham, Jon M

    2012-11-01

    Behavioral and functional magnetic resonance imaging (fMRI) data were combined to infer the mental states of students as they interacted with an intelligent tutoring system. Sixteen children interacted with a computer tutor for solving linear equations over a six-day period (days 0-5), with days 1 and 5 occurring in an fMRI scanner. Hidden Markov model algorithms combined a model of student behavior with multi-voxel imaging pattern data to predict the mental states of students. We separately assessed the algorithms' ability to predict which step in a problem-solving sequence was performed and whether the step was performed correctly. For day 1, the data patterns of other students were used to predict the mental states of a target student. These predictions were improved on day 5 by adding information about the target student's behavioral and imaging data from day 1. Successful tracking of mental states depended on using the combination of a behavioral model and multi-voxel pattern analysis, illustrating the effectiveness of an integrated approach to tracking the cognition of individuals in real time as they perform complex tasks. Copyright © 2011 Wiley Periodicals, Inc.
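
    Hidden-state decoding of the kind described above is typically performed with the Viterbi algorithm. The NumPy sketch below is a textbook-style illustration; the two-state model and all probabilities are assumptions, not the study's fitted model:

```python
import numpy as np

def viterbi(obs, pi, A, B):
    """Most likely hidden-state sequence for observation indices `obs`.
    pi: initial state probs (S,), A: transitions (S, S), B: emissions (S, O).
    """
    S, T = len(pi), len(obs)
    logd = np.log(pi) + np.log(B[:, obs[0]])  # log-prob of best path so far
    back = np.zeros((T, S), dtype=int)        # backpointers
    for t in range(1, T):
        scores = logd[:, None] + np.log(A)    # scores[i, j]: reach j via i
        back[t] = np.argmax(scores, axis=0)
        logd = scores[back[t], np.arange(S)] + np.log(B[:, obs[t]])
    path = [int(np.argmax(logd))]
    for t in range(T - 1, 0, -1):
        path.append(int(back[t, path[-1]]))
    return path[::-1]

# Illustrative two-state model (e.g. "on-track" vs "struggling") with three
# observable outcomes per problem-solving step; all values are assumptions.
pi = np.array([0.6, 0.4])
A = np.array([[0.7, 0.3],
              [0.4, 0.6]])
B = np.array([[0.5, 0.4, 0.1],
              [0.1, 0.3, 0.6]])
path = viterbi([0, 1, 2], pi, A, B)
```

    In the study's setting, the emission model would be replaced by likelihoods derived from multi-voxel imaging patterns and the behavioral model.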

  9. Simulation of dynamic processes when machining transition surfaces of stepped shafts

    NASA Astrophysics Data System (ADS)

    Maksarov, V. V.; Krasnyy, V. A.; Viushin, R. V.

    2018-03-01

    The paper addresses the characteristics of stepped surfaces of parts categorized as "solids of revolution". It is noted that under transition modes, during the switch to end-surface machining, cutting proceeds with varying load intensity over the section of the cut layer, which leads to changes in cutting force, the onset of vibrations, increased surface layer roughness, reduced dimensional precision, and increased wear of the tool's cutting edge. This work proposes a method of generating CNC program output code that allows complex stepped-shaft forms to be machined in a single machine setup. The authors developed and justified a mathematical model of a technological system for mechanical processing, accounting for the resolution of tool movement during transition processes, to assess the dynamic stability of the system when manufacturing stepped surfaces of "solid of revolution" parts.

  10. Fabrication of hybrid molecular devices using multi-layer graphene break junctions.

    PubMed

    Island, J O; Holovchenko, A; Koole, M; Alkemade, P F A; Menelaou, M; Aliaga-Alcalde, N; Burzurí, E; van der Zant, H S J

    2014-11-26

    We report on the fabrication of hybrid molecular devices employing multi-layer graphene (MLG) flakes which are patterned with a constriction using a helium ion microscope or an oxygen plasma etch. The patterning step allows for the localization of a few-nanometer gap, created by electroburning, that can host single molecules or molecular ensembles. By controlling the width of the sculpted constriction, we regulate the critical power at which the electroburning process begins. We estimate the flake temperature given the critical power and find that at low powers it is possible to electroburn MLG with superconducting contacts in close proximity. Finally, we demonstrate the fabrication of hybrid devices with superconducting contacts and anthracene-functionalized copper curcuminoid molecules. This method is extendable to spintronic devices with ferromagnetic contacts and a first step towards molecular integrated circuits.

  12. Responsible innovation in port development: the Rotterdam Maasvlakte 2 and the Dalian Dayao Bay extension projects.

    PubMed

    Ravesteijn, Wim; Liu, Yi; Yan, Ping

    2015-01-01

    The paper outlines and specifies 'responsible port innovation', introducing a methodological and procedural step-by-step plan for the implementation and evaluation of (responsible) innovations. It then uses this plan as a guideline for the analysis and evaluation of two case studies. The construction of the Rotterdam Maasvlakte 2 Port meets most of the formulated requirements, though making values more explicit, and treating this as a process right from the start, could have benefited the project. The Dalian Dayao Port could improve its decision-making procedures in several respects, including the introduction of new methods to handle value tensions. Both projects show that public support is crucial in responsible port innovation and that it requires not only a multi-faceted but also a multi-level strategy.

  13. Impact of user influence on information multi-step communication in a micro-blog

    NASA Astrophysics Data System (ADS)

    Wu, Yue; Hu, Yong; He, Xiao-Hai; Deng, Ken

    2014-06-01

    User influence is generally considered one of the most critical factors affecting information cascade spreading. Based on this common assumption, this paper proposes a theoretical model to examine user influence on multi-step information communication in a micro-blog. The steps of information communication are divided into first-step and non-first-step, and user influence is classified into five dimensions. Actual data from the Sina micro-blog are collected to construct the model using a structural equation approach based on the Partial Least Squares (PLS) technique. Our experimental results indicate that the number of fans and their authority significantly affect first-step communication. LeaderRank has a positive impact on both first-step and non-first-step communication. Moreover, global centrality and weight of friends are positively related to non-first-step communication, whereas authority shows a much weaker relationship with it.

  14. An analysis of hydrogen production via closed-cycle schemes. [thermochemical processings from water

    NASA Technical Reports Server (NTRS)

    Chao, R. E.; Cox, K. E.

    1975-01-01

    A thermodynamic analysis and state-of-the-art review of three basic schemes for producing hydrogen from water are presented: electrolysis, thermal water-splitting, and multi-step thermochemical closed cycles. Criteria for work-saving thermochemical closed-cycle processes are established, and several schemes are reviewed in light of these criteria. An economic analysis is also presented in the context of energy costs.

  15. Complexity of line-seru conversion for different scheduling rules and two improved exact algorithms for the multi-objective optimization.

    PubMed

    Yu, Yang; Wang, Sihan; Tang, Jiafu; Kaku, Ikou; Sun, Wei

    2016-01-01

    Productivity can be greatly improved by converting a traditional assembly line to a seru system, especially in business environments with short product life cycles, uncertain product types, and fluctuating production volumes. Line-seru conversion includes two decision processes, i.e., seru formation and seru load. For simplicity, however, previous studies focus on seru formation with a given scheduling rule for seru load. We select ten scheduling rules commonly used in seru load to investigate the influence of different scheduling rules on the performance of line-seru conversion. Moreover, we clarify the complexities of line-seru conversion for the ten scheduling rules from a theoretical perspective. In addition, multi-objective decisions are often used in line-seru conversion. To obtain Pareto-optimal solutions of multi-objective line-seru conversion, we develop two improved exact algorithms based on reducing time complexity and space complexity, respectively. Compared with enumeration based on non-dominated sorting for the multi-objective problem, the two improved exact algorithms save considerable computation time. Several numerical simulation experiments demonstrate the performance improvement brought by the two proposed exact algorithms.

  16. Multi-site Stochastic Simulation of Daily Streamflow with Markov Chain and KNN Algorithm

    NASA Astrophysics Data System (ADS)

    Mathai, J.; Mujumdar, P.

    2017-12-01

    A key focus of this study is to develop a method that is physically consistent with the hydrologic processes and can capture the short-term characteristics of the daily hydrograph as well as the correlation of streamflow in the temporal and spatial domains. In complex water resource systems, flow fluctuations at small time intervals require that discretisation be done at small time scales, such as daily scales. Simultaneous generation of synthetic flows at different sites in the same basin is also required. We propose a method to equip water managers with a streamflow generator within a stochastic streamflow simulation framework. The motivation for the proposed method is to generate sequences that extend beyond the variability represented in the historical record of the streamflow time series. The method has two steps: in step 1, daily flow is generated independently at each station by a two-state Markov chain, with rising-limb increments randomly sampled from a Gamma distribution and the falling limb modelled as an exponential recession; in step 2, the streamflow generated in step 1 is input to a nonparametric K-nearest neighbour (KNN) time series bootstrap resampler. The KNN model, being data driven, does not require assumptions on the dependence structure of the time series. A major limitation of KNN-based streamflow generators is that they do not produce new values, but merely reshuffle the historical data to generate realistic streamflow sequences. Daily flow generated using the Markov chain approach, however, is capable of producing a rich variety of streamflow sequences. Furthermore, the rising and falling limbs of the daily hydrograph represent different physical processes, and hence they need to be modelled individually. Thus, our method combines the strengths of the two approaches. We show the utility of the method and the improvement over the traditional KNN by simulating daily streamflow sequences at 7 locations in the Godavari River basin in India.
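
Step 1 of the method above can be sketched as follows. This is a hedged illustration only: the state-transition probabilities and the Gamma/recession parameters below are placeholders, not calibrated values from the Godavari study.

```python
import random

def generate_daily_flow(n_days, q0=100.0, p_rise=0.3, p_stay_rise=0.5,
                        gamma_shape=2.0, gamma_scale=10.0, recession_k=0.9,
                        seed=42):
    """Step-1 sketch: a two-state (rising/falling) Markov chain with
    Gamma-distributed rising-limb increments and exponential recession
    on the falling limb. All parameter values are illustrative."""
    rng = random.Random(seed)
    flows, q, rising = [q0], q0, False
    for _ in range(n_days - 1):
        # transition probability into the rising state depends on current state
        p = p_stay_rise if rising else p_rise
        rising = rng.random() < p
        if rising:
            q += rng.gammavariate(gamma_shape, gamma_scale)  # rising limb
        else:
            q *= recession_k  # exponential recession on the falling limb
        flows.append(q)
    return flows

flows = generate_daily_flow(365)
```

In the full method these synthetic series would then be fed to the KNN bootstrap resampler (step 2), which restores the observed spatial and temporal dependence structure.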

  17. Applications of ortho-phenylisonitrile and ortho-N-Boc aniline for the two-step preparation of novel bis-heterocyclic chemotypes.

    PubMed

    Xu, Zhigang; Shaw, Arthur Y; Nichol, Gary S; Cappelli, Alexandra P; Hulme, Christopher

    2012-08-01

    Concise routes to five pharmacologically relevant bis-heterocyclic scaffolds are described. Significant molecular complexity is generated in a mere two synthetic operations enabling access to each scaffold. Routes are often improved by microwave irradiation and all utilize isocyanide-based multi-component reaction methods to incorporate the required diversity elements. Common reagents in all initial condensation reactions include 2-(N-Boc-amino)-phenyl-isocyanide 1, mono-Boc-phenylenediamine 2 and ethyl glyoxalate 3.

  18. Spherical Panoramas for Astrophysical Data Visualization

    NASA Astrophysics Data System (ADS)

    Kent, Brian R.

    2017-05-01

    Data immersion has advantages in astrophysical visualization. Complex multi-dimensional data and phase spaces can be explored in a seamless and interactive viewing environment. Putting the user in the data is a first step toward immersive data analysis. We present a technique for creating 360° spherical panoramas with astrophysical data. The three-dimensional software package Blender and the Google Spatial Media module are used together to immerse users in data exploration. Several examples employing these methods exhibit how the technique works using different types of astronomical data.

  19. Numerical study of multi-point forming of thick sheet using remeshing procedure

    NASA Astrophysics Data System (ADS)

    Cherouat, A.; Ma, X.; Borouchaki, H.; Zhang, Q.

    2018-05-01

    Multi-point forming (MPF) is an innovative technology for manufacturing complex thick sheet metal products without the need for solid tools. The central component of this system is a pair of discrete matrices of punches forming the desired die surface, constructed by changing the positions of the tools through CAD and a control system. Because reconfigurable discrete tools are used, part-manufacturing costs are reduced and manufacturing time is shortened substantially. Firstly, in this work we develop constitutive equations that couple isotropic ductile damage with various flow stress models based on Continuum Damage Mechanics theory. The modified Johnson-Cook flow model, fully coupled with isotropic ductile damage, is established using quasi-unilateral damage evolution to account for both the opening and closing of micro-cracks. During the forming process, severe mesh distortion of elements occurs after a few incremental forming steps. Secondly, we introduce a 3D adaptive remeshing procedure based on linear tetrahedral elements and geometrical/physical error estimation to optimize element quality, to refine the mesh size in the whole model, and to adapt the deformed mesh to the tool geometry. Simulation of the MPF process (see Fig. 1) and the unloading spring-back are carried out with an adaptive remeshing scheme using the commercial finite element package ABAQUS and the OPTIFORM mesher. Subsequently, the factors influencing MPF spring-back are investigated with the proposed remeshing procedure.

  20. Isotopic and trace element characteristics of an unusual refractory inclusion from Essebi

    NASA Technical Reports Server (NTRS)

    Deloule, E.; Kennedy, A. K.; Hutcheon, I. D.; Elgoresy, A.

    1993-01-01

    The isotopic and chemical properties of Ca-Al-rich inclusions (CAI) provide important clues to the early solar nebula environment. While the abundances of refractory major and trace elements are similar to those expected for high temperature condensates, the variety of textural, chemical, and isotopic signatures indicate most CAI experienced complex, multi-stage histories involving repeated episodes of condensation, evaporation, and metamorphism. Evidence of multiple processes is especially apparent in an unusual refractory inclusion from Essebi (URIE) described by El Goresy et al. The melilite (mel)-rich core of URIE contains polygonal framboids of spinel (sp) and hibonite (hb) or sp and fassaite (fas) and is surrounded by a rim sequence consisting of five layers. In contrast to rims on Allende, the mineralogy of the URIE rim layers becomes increasingly refractory from the core outwards, ending in a layer of spinel-Al2O3 solid solution + Sc-rich fassaite. The chemical and mineralogical features of URIE are inconsistent with crystallization from a homogeneous melt, and El Goresy et al. proposed a multi-step history involving condensation of sp + hb and aggregation into framboids, capture of framboids by a refractory silicate melt droplet, condensation of rim layers, and alteration of mel to calcite and feldspathoid. The PANURGE ion probe was used to investigate the isotopic and trace element characteristics of URIE to develop a more complete picture of the multiple processes leading to formation and metamorphism.

  1. Real Time, On Line Crop Monitoring and Analysis with Near Global Landsat-class Mosaics

    NASA Astrophysics Data System (ADS)

    Varlyguin, D.; Hulina, S.; Crutchfield, J.; Reynolds, C. A.; Frantz, R.

    2015-12-01

    The presentation will discuss the current status of GDA technology for the operational, automated generation of 10-30 meter near-global mosaics of Landsat-class data for visualization, monitoring, and analysis. The current version of the mosaic combines Landsat 8 and Landsat 7; Sentinel-2A imagery will be added once it is operationally available. The mosaics are surface-reflectance calibrated and analysis ready. They offer the full spatial resolution and all multi-spectral bands of the source imagery. Each mosaic covers all major agricultural regions of the world and a 16-day time window; dates from 2014 to the present are supported. The mosaics are updated in real time, as soon as GDA downloads Landsat imagery, calibrates it to surface reflectance, and generates data-gap masks (all typically under 10 minutes for a Landsat scene). The technology eliminates the complex, multi-step, hands-on process of data preparation and provides imagery ready for repetitive, field-to-country analysis of crop conditions, progress, acreages, yield, and production. The mosaics can be used for real-time, on-line interactive mapping and time-series drilling via the GeoSynergy webGIS platform. The imagery is of great value for improved, persistent monitoring of global croplands and for operational in-season analysis and mapping of crops across the globe within the USDA FAS purview, as mandated by the US government. The presentation will overview operational processing of Landsat-class mosaics in support of USDA FAS efforts and will look into 2015 and beyond.

  2. Kernel Regression Estimation of Fiber Orientation Mixtures in Diffusion MRI

    PubMed Central

    Cabeen, Ryan P.; Bastin, Mark E.; Laidlaw, David H.

    2016-01-01

    We present and evaluate a method for kernel regression estimation of fiber orientations and associated volume fractions for diffusion MR tractography and population-based atlas construction in clinical imaging studies of brain white matter. This is a model-based image processing technique in which representative fiber models are estimated from collections of component fiber models in model-valued image data. This extends prior work in nonparametric image processing and multi-compartment processing to provide computational tools for image interpolation, smoothing, and fusion with fiber orientation mixtures. In contrast to related work on multi-compartment processing, this approach is based on directional measures of divergence and includes data-adaptive extensions for model selection and bilateral filtering. This is useful for reconstructing complex anatomical features in clinical datasets analyzed with the ball-and-sticks model, and our framework’s data-adaptive extensions are potentially useful for general multi-compartment image processing. We experimentally evaluate our approach with both synthetic data from computational phantoms and in vivo clinical data from human subjects. With synthetic data experiments, we evaluate performance based on errors in fiber orientation, volume fraction, compartment count, and tractography-based connectivity. With in vivo data experiments, we first show improved scan-rescan reproducibility and reliability of quantitative fiber bundle metrics, including mean length, volume, streamline count, and mean volume fraction. We then demonstrate the creation of a multi-fiber tractography atlas from a population of 80 human subjects. In comparison to single tensor atlasing, our multi-fiber atlas shows more complete features of known fiber bundles and includes reconstructions of the lateral projections of the corpus callosum and complex fronto-parietal connections of the superior longitudinal fasciculus I, II, and III. PMID:26691524

  3. An experimental investigation on the thermal field of overlapping layers in laser-assisted tape winding process

    NASA Astrophysics Data System (ADS)

    Hosseini, S. M. A.; Baran, I.; Akkerman, R.

    2018-05-01

    Laser-assisted tape winding (LATW) is an automated process for manufacturing fiber-reinforced thermoplastic tubular products, such as pipes and pressure vessels. Multi-physical phenomena such as heat transfer, mechanical bonding, phase changes, and solid mechanics take place during the process. These phenomena need to be understood and described well for improved product reliability. Temperature is one of the important parameters to control and optimize product quality, and it can be employed in an intelligent model-based inline control system. Depending on the lay-up configuration, the incoming tape can overlap with the already wound layer during the process. In this situation, the incoming tape can step on or step off an already deposited layer/laminate. During overlapping, the part temperature changes due to the geometry variation caused by the previously deposited layer, i.e., a bump geometry. In order to characterize the temperature behavior at the bump regions, an experimental setup was designed on a flat laminate. Artificial bumps/steps are formed on the laminate with various thicknesses and fiber orientations. As the laser head passes the step-on and step-off, an infrared (IR) camera and embedded thermocouples measure the temperature on the surface and inside the laminate, respectively. During step-on, a small drop in temperature is observed, while during step-off a higher temperature peak is observed. It can be concluded that the change in temperature during overlapping is due to the change in laser incident angle caused by the bump geometry. The effect of the step thickness on the temperature peak is quantified and found to be significant.

  4. How thermal stress alters the confinement of polymers vitrificated in nanopores

    NASA Astrophysics Data System (ADS)

    Teng, Chao; Li, Linling; Wang, Yong; Wang, Rong; Chen, Wei; Wang, Xiaoliang; Xue, Gi

    2017-05-01

    Understanding and controlling the glass transition temperature (Tg) and dynamics of polymers in confined geometries are of significance in both academia and industry. Here, we investigate how the thermal stress induced by a mismatch in the coefficient of thermal expansion affects the Tg behavior of polystyrene (PS) nanorods located inside cylindrical alumina nanopores. The size effects and molecular weight dependence of the Tg are also studied. A multi-step relaxation process was employed to study the relationship between thermal stress and cooling rate. At fast cooling rates, the imparted thermal stress overcomes the yield stress of PS and peels chains off the pore walls, while at slow cooling rates, chains remain in contact with the pore walls due to timely dissipation of the produced thermal stress during vitrification. In smaller nanopores, more PS chains are in close contact with the pore walls, so a stronger internal thermal stress is generated between the core and shell of the PS nanorod, which results in a larger deviation between the two Tgs. The core of the PS nanorod shows a lower Tg than the bulk value, which can induce faster dynamics in the center region. Stress is thus supposed to play a complex and important role under complex confinement conditions, e.g., in nanopores, during vitrification.
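
The scale of the mismatch stress driving this behavior can be estimated with the elementary relation sigma ≈ E·Δα·ΔT. The material constants below are rough, assumed literature-style values for PS and alumina, used only to illustrate that the resulting stress is of the same order as the yield stress of PS:

```python
# Order-of-magnitude estimate of CTE-mismatch stress: sigma ≈ E * delta_alpha * delta_T.
# All numbers below are assumed round values, not measurements from the study.
E_ps = 3.0e9            # Young's modulus of PS, Pa (assumed)
alpha_ps = 7.0e-5       # thermal expansion coefficient of PS, 1/K (assumed)
alpha_alumina = 8.0e-6  # thermal expansion coefficient of alumina, 1/K (assumed)
delta_T = 80.0          # cooling from above Tg to room temperature, K (assumed)

sigma = E_ps * (alpha_ps - alpha_alumina) * delta_T  # Pa
print(f"estimated mismatch stress: {sigma / 1e6:.1f} MPa")
```

A value of order 10 MPa is comparable to the yield stress of glassy PS, which is consistent with the narrative above that fast cooling can peel chains off the pore walls while slow cooling lets the stress dissipate.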

  5. Analyzing critical material demand: A revised approach.

    PubMed

    Nguyen, Ruby Thuy; Fishman, Tomer; Zhao, Fu; Imholte, D D; Graedel, T E

    2018-07-15

    Apparent consumption has been widely used as a metric to estimate material demand. However, with technology advancement and complexity of material use, this metric has become less useful in tracking material flows, estimating recycling feedstocks, and conducting life cycle assessment of critical materials. We call for future research efforts to focus on building a multi-tiered consumption database for the global trade network of critical materials. This approach will help track how raw materials are processed into major components (e.g., motor assemblies) and eventually incorporated into complete pieces of equipment (e.g., wind turbines). Foreseeable challenges would involve: 1) difficulty in obtaining a comprehensive picture of trade partners due to business sensitive information, 2) complexity of materials going into components of a machine, and 3) difficulty maintaining such a database. We propose ways to address these challenges such as making use of digital design, learning from the experience of building similar databases, and developing a strategy for financial sustainability. We recommend that, with the advancement of information technology, small steps toward building such a database will contribute significantly to our understanding of material flows in society and the associated human impacts on the environment. Copyright © 2018 Elsevier B.V. All rights reserved.

  6. From master slave interferometry to complex master slave interferometry: theoretical work

    NASA Astrophysics Data System (ADS)

    Rivet, Sylvain; Bradu, Adrian; Maria, Michael; Feuchter, Thomas; Leick, Lasse; Podoleanu, Adrian

    2018-03-01

    A general theoretical framework is described to establish the advantages and drawbacks of two novel Fourier-domain optical coherence tomography (OCT) methods, denoted Master/Slave Interferometry (MSI) and its extension, Complex Master/Slave Interferometry (CMSI). Instead of linearizing the digital data representing the channeled spectrum before a Fourier transform is applied to it (as in standard OCT methods), the channeled spectrum is decomposed on a basis of local oscillations. This replaces the need for linearization, which is generally time consuming, before any calculation of the depth profile in the range of interest. In this model two functions, g and h, are introduced. The function g describes the modulation chirp of the channeled spectrum signal due to nonlinearities in the decoding process from wavenumber to time. The function h describes the dispersion in the interferometer. The use of these two functions brings two major improvements over previous implementations of the MSI method. The paper details the steps to obtain the functions g and h, and expresses the CMSI in a matrix formulation that enables easy implementation of the method in LabVIEW using parallel programming on multiple cores.

  7. The Effects of Marzano's Six Step Vocabulary Process, on Fourth Grade Students' Vocabulary Knowledge, Fluency, and Sentence Complexity

    ERIC Educational Resources Information Center

    Suing, Janet S.

    2012-01-01

    This exploratory study examined the ways in which fourth grade students, in an urban setting, responded to a nine-week implementation of Marzano's Six Step Vocabulary Process. The purpose of this study was to explore the relationship between the direct instruction of vocabulary and the effects on student achievement as measured by Vocabulary…

  8. A 3D modeling approach to complex faults with multi-source data

    NASA Astrophysics Data System (ADS)

    Wu, Qiang; Xu, Hua; Zou, Xukai; Lei, Hongzhuan

    2015-04-01

    Fault modeling is a very important step in building an accurate and reliable 3D geological model. Typical existing methods demand sufficient fault data to construct complex fault models; however, it is well known that the available fault data are generally sparse and undersampled. In this paper, we propose a fault-modeling workflow that can integrate multi-source data to construct fault models. For the faults that are not modeled with these data, especially small-scale faults or those approximately parallel to the sections, we propose a fault deduction method to infer the hanging wall and footwall lines after displacement calculation. Moreover, the fault cutting algorithm can supplement the available fault points at locations where faults cut each other. Increasing fault points in poorly sampled areas not only allows fault models to be constructed efficiently but also reduces manual intervention. By using fault-based interpolation and remeshing of the horizons, an accurate 3D geological model can be constructed. The method can naturally simulate geological structures regardless of whether the available geological data are sufficient. A concrete example of using the method in Tangshan, China, shows that it can be applied to broad and complex geological areas.

  9. Analysis of a municipal wastewater treatment plant using a neural network-based pattern analysis

    USGS Publications Warehouse

    Hong, Y.-S.T.; Rosen, Michael R.; Bhamidimarri, R.

    2003-01-01

    This paper addresses the problem of how to capture the complex relationships that exist between process variables and to diagnose the dynamic behaviour of a municipal wastewater treatment plant (WTP). Due to the complex biological reaction mechanisms and the highly time-varying, multivariable nature of a real WTP, its diagnosis is still difficult in practice. The application of intelligent techniques that can analyse multi-dimensional process data with sophisticated visualisation can be useful for analysing and diagnosing an activated-sludge WTP. In this paper, the Kohonen Self-Organising Feature Map (KSOFM) neural network is applied to analyse the multi-dimensional process data and to diagnose the inter-relationships of the process variables in a real activated-sludge WTP. Using component planes, detailed local relationships between the process variables, e.g., responses of the process variables under different operating conditions, as well as global information, are discovered. The operating conditions and the inter-relationships among the process variables in the WTP have been diagnosed and extracted from the information obtained by clustering analysis of the maps. It is concluded that the KSOFM technique provides an effective analysis and diagnosis tool to understand system behaviour and to extract the knowledge contained in multi-dimensional data from a large-scale WTP. © 2003 Elsevier Science Ltd. All rights reserved.
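
As a rough illustration of the KSOFM idea (not the authors' implementation; the grid size, learning schedule, and toy data below are invented), a self-organising map projects multi-dimensional process vectors onto a 2-D grid so that similar operating conditions land on nearby nodes:

```python
import math
import random

def train_som(data, grid_w=4, grid_h=4, epochs=100, lr0=0.5, sigma0=2.0, seed=0):
    """Minimal Kohonen self-organising map sketch with a decaying learning
    rate and a shrinking Gaussian neighbourhood (illustrative parameters)."""
    rng = random.Random(seed)
    dim = len(data[0])
    weights = {(i, j): [rng.random() for _ in range(dim)]
               for i in range(grid_w) for j in range(grid_h)}
    for t in range(epochs):
        lr = lr0 * (1 - t / epochs)
        sigma = sigma0 * (1 - t / epochs) + 0.5
        for x in data:
            # best-matching unit = node whose weight vector is closest to x
            bmu = min(weights,
                      key=lambda n: sum((w - v) ** 2 for w, v in zip(weights[n], x)))
            for node, w in weights.items():
                d2 = (node[0] - bmu[0]) ** 2 + (node[1] - bmu[1]) ** 2
                h = math.exp(-d2 / (2 * sigma ** 2))  # neighbourhood function
                for k in range(dim):
                    w[k] += lr * h * (x[k] - w[k])
    return weights

def best_matching_unit(weights, x):
    return min(weights, key=lambda n: sum((w - v) ** 2 for w, v in zip(weights[n], x)))

# toy "process variables": two distinct operating conditions
data = [[0.10, 0.20, 0.10], [0.15, 0.25, 0.05],
        [0.90, 0.80, 0.95], [0.85, 0.90, 0.90]]
som = train_som(data)
```

After training, the two operating conditions map to different grid nodes; in the paper's setting, component planes of the trained weights are then inspected to read off relationships between individual process variables.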

  10. Vibration and acoustic frequency spectra for industrial process modeling using selective fusion multi-condition samples and multi-source features

    NASA Astrophysics Data System (ADS)

    Tang, Jian; Qiao, Junfei; Wu, ZhiWei; Chai, Tianyou; Zhang, Jian; Yu, Wen

    2018-01-01

    Frequency spectral data of mechanical vibration and acoustic signals relate to difficult-to-measure production quality and quantity parameters of complex industrial processes. A selective ensemble (SEN) algorithm can be used to build a soft sensor model of these process parameters by selectively fusing valuable information from different perspectives. However, a combination of several optimized ensemble sub-models with SEN cannot guarantee the best prediction model. In this study, we use several techniques to construct a data-driven industrial process parameter model from mechanical vibration and acoustic frequency spectra, based on the selective fusion of multi-condition samples and multi-source features. A multi-layer SEN (MLSEN) strategy is used to simulate the cognitive process of a domain expert. A genetic algorithm and kernel partial least squares are used to construct the inside-layer SEN sub-model for each mechanical vibration and acoustic frequency spectral feature subset. Branch-and-bound and adaptive weighted fusion algorithms are integrated to select and combine the outputs of the inside-layer SEN sub-models; the outside-layer SEN is then constructed. Thus, ensemble construction methods based on "sub-sampling training examples" and on "manipulating input features" are integrated, realizing selective information fusion based on multi-condition history samples and multi-source input features. This novel approach is applied to a laboratory-scale ball mill grinding process. A comparison with other methods indicates that the proposed MLSEN approach effectively models mechanical vibration and acoustic signals.

  11. Multi-dimensional Fokker-Planck equation analysis using the modified finite element method

    NASA Astrophysics Data System (ADS)

    Náprstek, J.; Král, R.

    2016-09-01

    The Fokker-Planck equation (FPE) is a frequently used tool for obtaining the joint probability density function (PDF) of the response of a dynamic system excited by a vector of random processes. FEM represents a very effective solution approach, particularly when transition processes are investigated or a more detailed solution is needed. Existing papers deal with single-degree-of-freedom (SDOF) systems only, so the respective FPE includes only two independent space variables. Stepping beyond this limit to MDOF systems, a number of specific problems related to true multi-dimensionality must be overcome. Unlike earlier studies, multi-dimensional simplex elements in arbitrary dimension should be deployed and rectangular (multi-brick) elements abandoned. Simple closed formulae for integration over a multi-dimensional domain have been derived. Another specific problem is the generation of the multi-dimensional finite element mesh. The assembly of the global system matrices requires newly composed algorithms due to the multi-dimensionality. The system matrices are quite full, so no advantage can be taken of sparsity, as is common in conventional FEM applications to 2D/3D problems. After verification of the partial algorithms, an illustrative example dealing with a 2DOF non-linear aeroelastic system under combined random and deterministic excitation is discussed.
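
The closed-form integration referred to is presumably the classical identity for monomials of barycentric (simplex) coordinates over a d-dimensional simplex T of volume V_T, a standard FEM result stated here for reference:

```latex
\int_{T} \lambda_1^{a_1}\,\lambda_2^{a_2}\cdots\lambda_{d+1}^{a_{d+1}}\,\mathrm{d}V
  \;=\; \frac{a_1!\,a_2!\cdots a_{d+1}!\;d!}{\bigl(a_1+a_2+\cdots+a_{d+1}+d\bigr)!}\,V_T
```

where the \(\lambda_i\) are the barycentric coordinates of the simplex. Products of linear shape functions on simplex elements reduce to sums of such monomials, which is what makes element integration, and hence assembly, tractable in arbitrary dimension.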

  12. Analysis of the phosphorescent dye concentration dependence of triplet-triplet annihilation in organic host-guest systems

    NASA Astrophysics Data System (ADS)

    Zhang, L.; van Eersel, H.; Bobbert, P. A.; Coehoorn, R.

    2016-10-01

    Using a novel method for analyzing transient photoluminescence (PL) experiments, a microscopic description is obtained for the dye concentration dependence of triplet-triplet annihilation (TTA) in phosphorescent host-guest systems. It is demonstrated that the TTA-mechanism, which could be a single-step dominated process or a diffusion-mediated multi-step process, can be deduced for any given dye concentration from a recently proposed PL intensity analysis. A comparison with the results of kinetic Monte Carlo simulations provides the TTA-Förster radius and shows that the TTA enhancement due to triplet diffusion can be well described in a microscopic manner assuming Förster- or Dexter-type energy transfer.
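    The competition between monomolecular triplet decay and TTA can be illustrated with the standard rate equation dT/dt = -T/tau - (gamma/2) T^2. The sketch below uses simple Euler integration with purely illustrative parameter values (the rate constants are assumptions, not taken from the paper):

    ```python
    import math

    def triplet_decay(T0, tau, gamma, dt=1e-9, steps=2000):
        """Euler integration of dT/dt = -T/tau - 0.5*gamma*T^2:
        monomolecular decay plus triplet-triplet annihilation (TTA)."""
        T = T0
        for _ in range(steps):
            T += dt * (-T / tau - 0.5 * gamma * T * T)
        return T

    T0, tau = 1e24, 1e-6                    # m^-3, s (illustrative values)
    with_tta = triplet_decay(T0, tau, gamma=1e-18)
    no_tta   = triplet_decay(T0, tau, gamma=0.0)
    ```

    Comparing the two runs shows the TTA term accelerating the decay, which is the signature extracted from transient PL traces.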

  13. Color sensitivity of the multi-exposure HDR imaging process

    NASA Astrophysics Data System (ADS)

    Lenseigne, Boris; Jacobs, Valéry Ann; Withouck, Martijn; Hanselaer, Peter; Jonker, Pieter P.

    2013-04-01

    Multi-exposure high dynamic range (HDR) imaging builds HDR radiance maps by stitching together different views of a same scene with varying exposures. Practically, this process involves converting raw sensor data into low dynamic range (LDR) images, estimating the camera response curves, and using them to recover the irradiance for every pixel. During the export, white balance settings and image stitching are applied, both of which influence the color balance of the final image. In this paper, we use a calibrated quasi-monochromatic light source, an integrating sphere, and a spectrograph to evaluate and compare the average spectral response of the image sensor. We finally draw some conclusions about the color consistency of HDR imaging and the additional steps necessary to use multi-exposure HDR imaging as a tool to measure physical quantities such as radiance and luminance.
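    The per-pixel irradiance recovery step can be sketched in the Debevec-Malik style. This minimal version assumes a linear camera response (in general the estimated response curve would be applied first) and uses a triangle weighting to de-emphasize under- and over-exposed samples; the data values are hypothetical.

    ```python
    def recover_irradiance(pixel_values, exposure_times, z_max=255):
        """Estimate scene irradiance for one pixel from multiple
        exposures, assuming a linear camera response.  The triangle
        weight trusts mid-range pixel values most."""
        def w(z):                          # triangle weighting function
            return z if z <= z_max / 2 else z_max - z
        num = sum(w(z) * (z / t) for z, t in zip(pixel_values, exposure_times))
        den = sum(w(z) for z in pixel_values)
        return num / den

    # same pixel captured at three exposure times (hypothetical data,
    # chosen so all exposures agree on the underlying irradiance)
    E = recover_irradiance([40, 80, 160], [0.25, 0.5, 1.0])
    ```

    When the exposures are mutually consistent, as here, the weighted average simply reproduces the common irradiance value.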

  14. MREG V1.1 : a multi-scale image registration algorithm for SAR applications.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Eichel, Paul H.

    2013-08-01

    MREG V1.1 is the sixth generation SAR image registration algorithm developed by the Signal Processing & Technology Department for Synthetic Aperture Radar applications. Like its predecessor algorithm REGI, it employs a powerful iterative multi-scale paradigm to achieve the competing goals of sub-pixel registration accuracy and the ability to handle large initial offsets. Since it is not model based, it allows for high fidelity tracking of spatially varying terrain-induced misregistration. Since it does not rely on image domain phase, it is equally adept at coherent and noncoherent image registration. This document provides a brief history of the registration processors developed by Dept. 5962 leading up to MREG V1.1, a full description of the signal processing steps involved in the algorithm, and a user's manual with application specific recommendations for CCD, TwoColor MultiView, and SAR stereoscopy.
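    The coarse-to-fine idea behind the multi-scale paradigm can be sketched with a toy translation-only registrar: estimate the offset on downsampled images (cheaply covering large offsets), then refine with a small search at full resolution. This is a generic illustration with SSD matching and circular shifts, not MREG's actual signal processing.

    ```python
    import numpy as np

    def best_shift(ref, img, search):
        """Integer (dy, dx) minimizing SSD between ref and img
        circularly shifted by (dy, dx), over a +/- search window."""
        best, best_err = (0, 0), np.inf
        for dy in range(-search, search + 1):
            for dx in range(-search, search + 1):
                shifted = np.roll(np.roll(img, dy, axis=0), dx, axis=1)
                err = np.sum((ref - shifted) ** 2)
                if err < best_err:
                    best, best_err = (dy, dx), err
        return best

    def multiscale_register(ref, img, levels=2, search=2):
        """Coarse-to-fine: estimate the offset on a downsampled pair,
        scale it up, then refine with a small full-resolution search."""
        dy = dx = 0
        for level in range(levels - 1, -1, -1):
            f = 2 ** level
            r = ref[::f, ::f]
            m = np.roll(np.roll(img, dy, axis=0), dx, axis=1)[::f, ::f]
            sdy, sdx = best_shift(r, m, search)
            dy += sdy * f
            dx += sdx * f
        return dy, dx

    # smooth synthetic scene and a misregistered copy of it
    x = np.arange(32)
    ref = np.sin(2 * np.pi * x / 32)[:, None] + np.cos(2 * np.pi * x / 32)[None, :]
    img = np.roll(np.roll(ref, 3, axis=0), -2, axis=1)
    dy, dx = multiscale_register(ref, img)
    ```

    The coarse pass only needs to land within the fine search window of the true offset, which is what lets a small per-level search handle large initial offsets.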

  15. Software Defined Radio with Parallelized Software Architecture

    NASA Technical Reports Server (NTRS)

    Heckler, Greg

    2013-01-01

    This software implements software-defined radio processing over multi-core, multi-CPU systems in a way that maximizes the use of CPU resources in the system. The software treats each processing step in either a communications or navigation modulator or demodulator system as an independent, threaded block. Each threaded block is defined with a programmable number of input or output buffers; these buffers are implemented using POSIX pipes. In addition, each threaded block is assigned a unique thread upon block installation. A modulator or demodulator system is built by assembling the threaded blocks into a flow graph, which connects the processing blocks to accomplish the desired signal processing. This software architecture allows the software to scale effortlessly between single-CPU/single-core and multi-CPU/multi-core computers without recompilation. NASA spaceflight and ground communications systems currently rely exclusively on ASICs or FPGAs. This software allows low- and medium-bandwidth (100 bps to approx. 50 Mbps) software defined radios to be designed and implemented solely in C/C++ software, while lowering development costs and facilitating reuse and extensibility.
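    The threaded-block flow graph pattern can be sketched as below. This is a language-neutral illustration, not the NASA code: thread-safe queues stand in for the POSIX pipes described in the abstract, and the "modulator"/"amplifier" stages are hypothetical placeholder functions.

    ```python
    import threading
    import queue

    class Block(threading.Thread):
        """One processing step: reads items from an input queue,
        applies a function, writes results to an output queue.
        A None sentinel shuts the block down and is forwarded."""
        def __init__(self, func, inbox, outbox):
            super().__init__(daemon=True)
            self.func, self.inbox, self.outbox = func, inbox, outbox
        def run(self):
            while True:
                item = self.inbox.get()
                if item is None:           # sentinel: propagate and stop
                    self.outbox.put(None)
                    return
                self.outbox.put(self.func(item))

    # flow graph: source -> "modulator" -> "amplifier" -> sink
    q1, q2, q3 = queue.Queue(), queue.Queue(), queue.Queue()
    blocks = [Block(lambda s: s * 2, q1, q2),   # hypothetical stage 1
              Block(lambda s: s + 1, q2, q3)]   # hypothetical stage 2
    for b in blocks:
        b.start()
    for sample in [1, 2, 3]:
        q1.put(sample)
    q1.put(None)
    out = []
    while (item := q3.get()) is not None:
        out.append(item)
    ```

    Because each stage runs in its own thread and communicates only through buffers, the same graph scales across cores without code changes, which is the property the abstract emphasizes.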

  16. ICME — A Mere Coupling of Models or a Discipline of Its Own?

    NASA Astrophysics Data System (ADS)

    Bambach, Markus; Schmitz, Georg J.; Prahl, Ulrich

    Technically, ICME — Integrated computational materials engineering — is an approach for solving advanced engineering problems related to the design of new materials and processes by combining individual materials and process models. To date, the combination of models is mainly achieved by manual transformation of the output of a simulation to form the input to a subsequent one. This subsequent simulation is either performed at a different length scale or constitutes a subsequent step along the process chain. Is ICME thus just a synonym for the coupling of simulations? In fact, most ICME publications up to now are examples of the joint application of selected models and software codes to a specific problem. However, from a systems point of view, the coupling of individual models and/or software codes across length scales and along material processing chains leads to highly complex meta-models. Their viability has to be ensured by joint efforts from science, industry, software developers and independent organizations. This paper identifies some developments that seem necessary to make future ICME simulations viable, sustainable and broadly accessible and accepted. The main conclusion is that ICME is not merely a multi-disciplinary subject but a discipline of its own, for which a generic structural framework has to be elaborated and established.

  17. Carotene Degradation and Isomerization during Thermal Processing: A Review on the Kinetic Aspects.

    PubMed

    Colle, Ines J P; Lemmens, Lien; Knockaert, Griet; Van Loey, Ann; Hendrickx, Marc

    2016-08-17

    Kinetic models are important tools for process design and optimization to balance desired and undesired reactions taking place in complex food systems during food processing and preservation. This review covers the state of the art on kinetic models available to describe heat-induced conversion of carotenoids, in particular lycopene and β-carotene. First, relevant properties of these carotenoids are discussed. Second, some general aspects of kinetic modeling are introduced, including both empirical single-response modeling and mechanism-based multi-response modeling. The merits of multi-response modeling to simultaneously describe carotene degradation and isomerization are demonstrated. The future challenge in this research field lies in the extension of the current multi-response models to better approach the real reaction pathway and in the integration of kinetic models with mass transfer models in case of reaction in multi-phase food systems.
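    The merit of multi-response over single-response modeling is that degradation and isomerization are tracked simultaneously. A minimal sketch of such a reaction scheme is given below, with reversible trans/cis isomerization plus first-order degradation of both forms; the rate constants are illustrative assumptions, not values from the reviewed studies.

    ```python
    def simulate(k_iso, k_rev, k_deg, trans0=1.0, cis0=0.0,
                 dt=0.01, steps=1000):
        """Euler integration of a minimal multi-response scheme:
        trans <-> cis isomerization with first-order degradation
        of both forms (illustrative rate constants, per minute)."""
        trans, cis = trans0, cis0
        for _ in range(steps):
            iso = k_iso * trans - k_rev * cis   # net isomerization flux
            trans += dt * (-iso - k_deg * trans)
            cis   += dt * ( iso - k_deg * cis)
        return trans, cis

    trans, cis = simulate(k_iso=0.05, k_rev=0.02, k_deg=0.01)
    ```

    A single-response fit to the all-trans concentration alone would lump the isomerization loss into an apparent degradation constant; the multi-response model separates the two pathways, since here the total (trans + cis) decays only through k_deg.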

  18. Visible CWDM system design for Multi-Gbit/s transmission over SI-POF

    NASA Astrophysics Data System (ADS)

    Vázquez, Carmen; Pinzón, Plinio Jesús; Pérez, Isabel

    2015-01-01

    In order to increase the data rates of Multi-Gbit/s links based on large core step index (SI) plastic optical fibers (POF), different modulation schemes have been proposed. Another option is to use multiple optical carriers for parallel transmission of communication channels over the same fiber. Designs reaching data rates of 14.77 Gb/s over 50 m with 4 channels have been developed using off-line processing. In this work, designs to test the potential of real Multi-Gbit/s transmission systems using commercial products are reported. Special care is taken in designing low insertion loss multiplexers and demultiplexers to allow for greener solutions in terms of power consumption.
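    The role of multiplexer/demultiplexer insertion loss in such a link can be seen from a per-channel optical power budget. The sketch below uses entirely illustrative numbers (transmit power, sensitivity, losses, and attenuation are assumptions, not values from the paper):

    ```python
    def link_margin_db(tx_dbm, rx_sens_dbm, mux_il_db, demux_il_db,
                       fiber_att_db_per_m, length_m):
        """Optical power budget for one CWDM channel over SI-POF:
        margin = Tx power - insertion losses - fiber attenuation
                 - Rx sensitivity.  All values are in dB/dBm."""
        loss = mux_il_db + demux_il_db + fiber_att_db_per_m * length_m
        return tx_dbm - loss - rx_sens_dbm

    # e.g. 50 m of SI-POF at 0.2 dB/m, 3 dB per mux/demux (illustrative)
    margin = link_margin_db(tx_dbm=6.0, rx_sens_dbm=-18.0,
                            mux_il_db=3.0, demux_il_db=3.0,
                            fiber_att_db_per_m=0.2, length_m=50.0)
    ```

    Every dB shaved off the mux/demux insertion loss goes directly into the margin, which is why low-loss designs permit lower transmit powers and hence lower power consumption.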

  19. Multi-parameter phenotypic profiling: using cellular effects to characterize small-molecule compounds.

    PubMed

    Feng, Yan; Mitchison, Timothy J; Bender, Andreas; Young, Daniel W; Tallarico, John A

    2009-07-01

    Multi-parameter phenotypic profiling of small molecules provides important insights into their mechanisms of action, as well as a systems level understanding of biological pathways and their responses to small molecule treatments. It therefore deserves more attention at an early step in the drug discovery pipeline. Here, we summarize the technologies that are currently in use for phenotypic profiling, including mRNA-, protein- and imaging-based multi-parameter profiling, in the drug discovery context. We think that an earlier integration of phenotypic profiling technologies, combined with effective experimental and in silico target identification approaches, can improve success rates of lead selection and optimization in the drug discovery process.

  20. A Microsoft Project-Based Planning, Tracking, and Management Tool for the National Transonic Facility's Model Changeover Process

    NASA Technical Reports Server (NTRS)

    Vairo, Daniel M.

    1998-01-01

    The removal and installation of sting-mounted wind tunnel models in the National Transonic Facility (NTF) is a multi-task process having a large impact on the annual throughput of the facility. Approximately ten model removal and installation cycles occur annually at the NTF, with each cycle requiring slightly over five days to complete. The various tasks of the model changeover process were modeled in Microsoft Project as a template to provide a planning, tracking, and management tool. The template can also be used as a tool to evaluate improvements to this process. This document describes the development of the template and provides step-by-step instructions on its use as a planning and tracking tool. A secondary role of this document is to provide an overview of the model changeover process and briefly describe the tasks associated with it.
