Sample records for data-driven batch processing

  1. Data-driven monitoring for stochastic systems and its application on batch process

    NASA Astrophysics Data System (ADS)

    Yin, Shen; Ding, Steven X.; Haghani Abandan Sari, Adel; Hao, Haiyang

    2013-07-01

    Batch processes are characterised by a prescribed processing of raw materials into final products for a finite duration, and they play an important role in many industrial sectors owing to their low-volume, high-value products. Process dynamics and stochastic disturbances are inherent characteristics of batch processes, which make monitoring of batch processes a challenging problem in practice. To solve this problem, a subspace-aided data-driven approach is presented in this article for batch process monitoring. The advantages of the proposed approach lie in its simple form and its ability to deal with the stochastic disturbances and process dynamics present in the process. Kernel density estimation, which serves as a non-parametric way of estimating the probability density function, is utilised for threshold calculation. An industrial benchmark of fed-batch penicillin production is finally utilised to verify the effectiveness of the proposed approach.
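
    As an illustration of the thresholding step above, the following minimal Python sketch estimates a monitoring-statistic control limit with kernel density estimation; the statistic values, names, and the 99% confidence level are illustrative assumptions, not values from the paper.

      import numpy as np
      from scipy.stats import gaussian_kde

      rng = np.random.default_rng(0)
      t2_normal = rng.chisquare(df=5, size=1000)    # stand-in monitoring statistic from normal batches

      kde = gaussian_kde(t2_normal)                 # non-parametric density estimate
      grid = np.linspace(0.0, t2_normal.max() * 2, 4000)
      cdf = np.cumsum(kde(grid))
      cdf /= cdf[-1]                                # numerical CDF of the estimated density
      threshold = grid[np.searchsorted(cdf, 0.99)]  # 99% control limit

      t2_new = 18.3                                 # statistic computed for a new sample
      print("fault" if t2_new > threshold else "normal")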

  2. Parameterized data-driven fuzzy model based optimal control of a semi-batch reactor.

    PubMed

    Kamesh, Reddi; Rani, K Yamuna

    2016-09-01

    A parameterized data-driven fuzzy (PDDF) model structure is proposed for semi-batch processes, and its application to optimal control is illustrated. The orthonormally parameterized input trajectories, initial states, and process parameters are the inputs to the model, which predicts the output trajectories in terms of Fourier coefficients. Fuzzy rules are formulated based on the signs of a linear data-driven model, while the defuzzification step incorporates a linear regression model to shift from the input domain to the output domain. The fuzzy model is employed to formulate an optimal control problem for single-rate as well as multi-rate systems. A simulation study on a multivariable semi-batch reactor system reveals that the proposed PDDF modeling approach captures the nonlinear and time-varying behavior inherent in the semi-batch system fairly accurately. The results of operating-trajectory optimization using the proposed model are comparable to those obtained using the exact first-principles model, and comparable to or better than those based on a parameterized data-driven artificial neural network model. Copyright © 2016 ISA. Published by Elsevier Ltd. All rights reserved.

  3. CLARA: CLAS12 Reconstruction and Analysis Framework

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gyurjyan, Vardan; Matta, Sebastian Mancilla; Oyarzun, Ricardo

    2016-11-01

    In this paper we present the SOA-based CLAS12 event Reconstruction and Analysis (CLARA) framework. The CLARA design focuses on two main traits: real-time data stream processing, and a service-oriented architecture (SOA) within a flow-based programming (FBP) paradigm. The data-driven, data-centric architecture of CLARA provides an environment for developing agile, elastic, multilingual data processing applications. The CLARA framework offers solutions capable of processing large volumes of data interactively and substantially faster than batch systems.

  4. Unifying cancer and normal RNA sequencing data from different sources

    PubMed Central

    Wang, Qingguo; Armenia, Joshua; Zhang, Chao; Penson, Alexander V.; Reznik, Ed; Zhang, Liguo; Minet, Thais; Ochoa, Angelica; Gross, Benjamin E.; Iacobuzio-Donahue, Christine A.; Betel, Doron; Taylor, Barry S.; Gao, Jianjiong; Schultz, Nikolaus

    2018-01-01

    Driven by recent advances in next-generation sequencing (NGS) technologies and an urgent need to decode complex human diseases, a multitude of large-scale studies have been conducted recently, resulting in an unprecedented volume of whole-transcriptome sequencing (RNA-seq) data, such as the Genotype-Tissue Expression project (GTEx) and The Cancer Genome Atlas (TCGA). While these data offer new opportunities to identify the mechanisms underlying disease, the comparison of data from different sources remains challenging, due to differences in sample and data processing. Here, we developed a pipeline that processes and unifies RNA-seq data from different studies, which includes uniform realignment, gene expression quantification, and batch effect removal. We find that uniform alignment and quantification are not sufficient when combining RNA-seq data from different sources and that the removal of other batch effects is essential to facilitate data comparison. We have processed data from GTEx and TCGA and successfully corrected for study-specific biases, enabling comparative analysis between TCGA and GTEx. The normalized datasets are available for download on figshare. PMID:29664468
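
    The pipeline's correction step is more sophisticated than this, but a toy sketch conveys the idea of removing a study-specific batch effect after uniform quantification; the data are synthetic, and the location-only centering is a simplified stand-in for dedicated tools such as ComBat.

      import numpy as np
      import pandas as pd

      rng = np.random.default_rng(1)
      expr = pd.DataFrame(rng.normal(size=(100, 8)))   # genes x samples, log-scale expression
      study = pd.Series(["GTEx"] * 4 + ["TCGA"] * 4)   # study label per sample
      expr[study.index[study == "TCGA"]] += 0.7        # simulated study-specific bias

      # Remove the per-study mean for every gene, then restore the global mean.
      global_mean = expr.mean(axis=1)
      corrected = expr.copy()
      for s in study.unique():
          cols = study.index[study == s]
          corrected[cols] = expr[cols].sub(expr[cols].mean(axis=1), axis=0)
      corrected = corrected.add(global_mean, axis=0)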

  5. Computational Testing for Automated Preprocessing 2: Practical Demonstration of a System for Scientific Data-Processing Workflow Management for High-Volume EEG

    PubMed Central

    Cowley, Benjamin U.; Korpela, Jussi

    2018-01-01

    Existing tools for the preprocessing of EEG data provide a large choice of methods to suitably prepare and analyse a given dataset. Yet it remains a challenge for the average user to integrate methods for batch processing of the increasingly large datasets of modern research, and compare methods to choose an optimal approach across the many possible parameter configurations. Additionally, many tools still require a high degree of manual decision making for, e.g., the classification of artifacts in channels, epochs or segments. This introduces extra subjectivity, is slow, and is not reproducible. Batching and well-designed automation can help to regularize EEG preprocessing, and thus reduce human effort, subjectivity, and consequent error. The Computational Testing for Automated Preprocessing (CTAP) toolbox facilitates: (i) batch processing that is easy for experts and novices alike; (ii) testing and comparison of preprocessing methods. Here we demonstrate the application of CTAP to high-resolution EEG data in three modes of use. First, a linear processing pipeline with mostly default parameters illustrates ease-of-use for naive users. Second, a branching pipeline illustrates CTAP's support for comparison of competing methods. Third, a pipeline with built-in parameter-sweeping illustrates CTAP's capability to support data-driven method parameterization. CTAP extends the existing functions and data structure from the well-known EEGLAB toolbox, based on Matlab, and produces extensive quality control outputs. CTAP is available under MIT open-source licence from https://github.com/bwrc/ctap. PMID:29692705
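
    CTAP itself is a Matlab/EEGLAB toolbox; the Python sketch below only illustrates the general idea of its third mode of use, a branching pipeline with a built-in parameter sweep, where every parameter combination yields a comparable branch. All names are illustrative assumptions.

      from itertools import product

      def highpass(data, cutoff_hz):
          return {**data, "steps": data["steps"] + [f"highpass({cutoff_hz} Hz)"]}

      def reject_channels(data, z_thresh):
          return {**data, "steps": data["steps"] + [f"reject(z > {z_thresh})"]}

      raw = {"steps": []}
      branches = {}
      # Sweep candidate parameter values and keep every branch, so competing
      # preprocessing choices can be compared on the same recording.
      for cutoff, z in product([0.5, 1.0], [3.0, 4.0]):
          branches[(cutoff, z)] = reject_channels(highpass(raw, cutoff), z)

      for cfg, out in branches.items():
          print(cfg, out["steps"])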

  7. Data-driven model reference control of MIMO vertical tank systems with model-free VRFT and Q-Learning.

    PubMed

    Radac, Mircea-Bogdan; Precup, Radu-Emil; Roman, Raul-Cristian

    2018-02-01

    This paper proposes a combined Virtual Reference Feedback Tuning-Q-learning model-free control approach, which tunes nonlinear static state feedback controllers to achieve output model reference tracking in an optimal control framework. The novel iterative Batch Fitted Q-learning strategy uses two neural networks to represent the value function (critic) and the controller (actor), and it is referred to as a mixed Virtual Reference Feedback Tuning-Batch Fitted Q-learning approach. Learning convergence of Q-learning schemes generally depends, among other settings, on efficient exploration of the state-action space. Handcrafting test signals for efficient exploration is difficult, even for input-output stable unknown processes. Virtual Reference Feedback Tuning can ensure that an initial stabilizing controller is learned from few input-output data, and this controller can then be used to collect substantially more input-state data in a controlled mode, in a constrained environment, by compensating the process dynamics. These data are used to learn significantly superior nonlinear state feedback neural network controllers for model reference tracking, using the proposed Batch Fitted Q-learning iterative tuning strategy, motivating the original combination of the two techniques. The mixed Virtual Reference Feedback Tuning-Batch Fitted Q-learning approach is experimentally validated for water level control of a multi-input multi-output nonlinear constrained coupled two-tank system. Discussions on the observed control behavior are offered. Copyright © 2018 ISA. Published by Elsevier Ltd. All rights reserved.
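
    A minimal batch fitted Q-iteration sketch on logged transitions (s, a, r, s') is shown below, with a generic scikit-learn regressor standing in for the paper's actor/critic neural networks; the toy system, reward, and hyperparameters are assumptions for illustration only.

      import numpy as np
      from sklearn.neural_network import MLPRegressor

      rng = np.random.default_rng(0)
      S = rng.normal(size=(500, 2))                     # visited states
      A = rng.choice([0.0, 1.0], size=500)              # logged actions
      S2 = S + rng.normal(scale=0.1, size=S.shape)      # successor states
      R = -np.abs(S[:, 0])                              # toy reward: drive state 0 to zero
      gamma, actions = 0.9, (0.0, 1.0)

      X = np.column_stack([S, A])
      q = MLPRegressor(hidden_layer_sizes=(32,), max_iter=500).fit(X, R)
      for _ in range(10):                               # fitted Q-iteration sweeps
          q_next = np.max(
              [q.predict(np.column_stack([S2, np.full(len(S2), a)])) for a in actions],
              axis=0,
          )
          q = MLPRegressor(hidden_layer_sizes=(32,), max_iter=500).fit(X, R + gamma * q_next)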

  8. Raw material variability of an active pharmaceutical ingredient and its relevance for processability in secondary continuous pharmaceutical manufacturing.

    PubMed

    Stauffer, F; Vanhoorne, V; Pilcer, G; Chavez, P-F; Rome, S; Schubert, M A; Aerts, L; De Beer, T

    2018-06-01

    Active Pharmaceutical Ingredient (API) raw material variability is not always thoroughly considered during pharmaceutical process development, mainly due to the low quantities of drug substance available. However, synthesis, crystallization routes, and production sites evolve during product development and the product life cycle, leading to changes in physical material attributes that can potentially affect processability. Recent literature highlights the need for a global approach to understand the link between material synthesis, material variability, process, and product quality. The study described in this article aims at explaining the raw material variability of an API through extensive material characterization of a restricted number of representative batches, analyzed using multivariate data analysis. It is part of a larger investigation seeking to link the API drug substance manufacturing process, the resulting physical API raw material attributes, and the drug product continuous manufacturing process. Eight API batches produced using different synthetic routes, crystallization, drying, and delumping processes and processing equipment were characterized extensively. Seventeen properties from seven characterization techniques were retained for further analysis using Principal Component Analysis (PCA). Three principal components (PCs) were sufficient to explain 92.9% of the API raw material variability. The first PC was related to crystal length, agglomerate size and fraction, flowability, and electrostatic charging. The second PC was driven by the span of the particle size distribution and the agglomerate strength. The third PC was related to surface energy. Additionally, the PCA summarized the API batch-to-batch variability in only three PCs, which can be used in future drug product development studies to quantitatively evaluate the impact of API raw material variability on the drug product process. The approach described in this article could be applied to any other compound that is prone to batch-to-batch variability. Copyright © 2018 Elsevier B.V. All rights reserved.
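
    A sketch of the PCA step on standardized material attributes follows; the data are synthetic stand-ins for the study's 8 batches and 17 retained properties.

      import numpy as np
      from sklearn.decomposition import PCA
      from sklearn.preprocessing import StandardScaler

      rng = np.random.default_rng(0)
      X = rng.normal(size=(8, 17))                 # 8 API batches x 17 properties
      Xs = StandardScaler().fit_transform(X)

      pca = PCA(n_components=3).fit(Xs)
      scores = pca.transform(Xs)                   # three PCs summarizing batch variability
      loadings = pca.components_                   # which properties drive each PC
      print(pca.explained_variance_ratio_.sum())   # the study reports 92.9% for 3 PCs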

  9. Batch statistical process control of a fluid bed granulation process using in-line spatial filter velocimetry and product temperature measurements.

    PubMed

    Burggraeve, A; Van den Kerkhof, T; Hellings, M; Remon, J P; Vervaet, C; De Beer, T

    2011-04-18

    Fluid bed granulation is a batch process characterized by the processing of raw materials for a predefined period of time, consisting of a fixed spraying phase and a subsequent drying period. The present study shows the multivariate statistical modeling and control of a fluid bed granulation process based on in-line particle size distribution (PSD) measurements (using spatial filter velocimetry) combined with continuous product temperature registration, using a partial least squares (PLS) approach. Via continuous in-line monitoring of the PSD and product temperature during granulation of various reference batches, a statistical batch model was developed, allowing the real-time evaluation and acceptance or rejection of future batches. Continuously monitored PSD and product temperature process data of 10 reference batches (X-data) were used to develop a reference batch PLS model, regressing the X-data versus the batch process time (Y-data). Two PLS components captured 98.8% of the variation in the X-data block. Score control charts displaying the average batch trajectory and upper and lower control limits were developed. Next, these control charts were used to monitor 4 new test batches in real time and to immediately detect any deviations from the expected batch trajectory. By real-time evaluation of new batches using the developed control charts, and by computation of contribution plots for deviating process behavior at a certain time point, batch losses or reprocessing can be prevented. Immediately after batch completion, all PSD and product temperature information (i.e., a batch progress fingerprint) was used to estimate granule properties (density and flowability) at an early stage, which can improve batch release time. Individual PLS models were developed relating the computed scores (X) of the reference PLS model (based on the 10 reference batches) to density and flowability, respectively, as the Y-matrix. The scores of the 4 test batches were used to examine the predictive ability of the model. Copyright © 2011 Elsevier B.V. All rights reserved.
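
    The following hedged sketch shows the flavor of the batch PLS model and score control chart: regress in-line measurements (X) on batch time (y) for reference batches, then derive trajectory limits. The data are synthetic and the ±3 sigma band is an illustrative choice, not the paper's exact limits.

      import numpy as np
      from sklearn.cross_decomposition import PLSRegression

      rng = np.random.default_rng(0)
      n_batches, n_times = 10, 100
      t = np.tile(np.arange(n_times), n_batches)             # batch process time (Y-data)
      X = np.column_stack([50 + 0.3 * t, 25 + 0.05 * t])     # e.g. granule d50 and product temperature
      X += rng.normal(scale=0.5, size=X.shape)

      pls = PLSRegression(n_components=2).fit(X, t)
      scores = pls.transform(X)[:, 0].reshape(n_batches, n_times)

      center = scores.mean(axis=0)                           # average batch trajectory
      sd = scores.std(axis=0)
      upper, lower = center + 3 * sd, center - 3 * sd        # score control chart limits

      new_batch = scores[0] + rng.normal(scale=2.0, size=n_times)
      out_of_control = (new_batch > upper) | (new_batch < lower)  # real-time flags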

  10. Plant-Scale Concentration Column Designs for SHINE Target Solution Utilizing AG 1 Anion Exchange Resin

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Stepinski, Dominique C.; Vandegrift, G. F.

    2015-09-30

    Argonne is assisting SHINE Medical Technologies (SHINE) in their efforts to develop SHINE, an accelerator-driven process that will utilize a uranyl-sulfate solution for the production of fission product Mo-99. An integral part of the process is the development of a column for the separation and recovery of Mo-99, followed by a concentration column to reduce the product volume from 15-25 L to <1 L. Argonne has collected data from batch studies and breakthrough column experiments to utilize the VERSE (Versatile Reaction Separation) simulation program (Purdue University) to design plant-scale product recovery and concentration processes.

  11. Fault detection and diagnosis in an industrial fed-batch cell culture process.

    PubMed

    Gunther, Jon C; Conner, Jeremy S; Seborg, Dale E

    2007-01-01

    A flexible process monitoring method was applied to industrial pilot plant cell culture data for the purpose of fault detection and diagnosis. Data from 23 batches, 20 normal operating conditions (NOC) and three abnormal, were available. A principal component analysis (PCA) model was constructed from 19 NOC batches, and the remaining NOC batch was used for model validation. Subsequently, the model was used to successfully detect (both offline and online) abnormal process conditions and to diagnose the root causes. This research demonstrates that data from a relatively small number of batches (approximately 20) can still be used to monitor for a wide range of process faults.
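
    In the same spirit, a minimal PCA monitoring sketch: fit the model on unfolded NOC batch data, then flag new observations with Hotelling's T². The data, component count, and chi-square control limit are illustrative assumptions.

      import numpy as np
      from scipy import stats
      from sklearn.decomposition import PCA
      from sklearn.preprocessing import StandardScaler

      rng = np.random.default_rng(0)
      X_noc = rng.normal(size=(19 * 50, 6))        # 19 NOC batches, time-unfolded
      scaler = StandardScaler().fit(X_noc)
      pca = PCA(n_components=3).fit(scaler.transform(X_noc))

      def t2(x):
          s = pca.transform(scaler.transform(x))
          return np.sum(s ** 2 / pca.explained_variance_, axis=1)

      limit = stats.chi2.ppf(0.99, df=3)           # approximate T^2 control limit
      x_new = rng.normal(loc=3.0, size=(1, 6))     # abnormal observation
      print(t2(x_new) > limit)                     # True indicates a detected fault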

  12. U.S.-MEXICO BORDER PROGRAM ARIZONA BORDER STUDY--STANDARD OPERATING PROCEDURE FOR BATCHING OF FIELD DATA FORMS (UA-C-4.0)

    EPA Science Inventory

    The purpose of this SOP is to describe the assembly of household (HH) packets into data processing batches. The batching process enables orderly tracking of packets or forms through data processing and limits the potential for packet or form loss. This procedure was used for th...

  13. NHEXAS PHASE I ARIZONA STUDY--STANDARD OPERATING PROCEDURE FOR BATCHING OF FIELD DATA FORMS (UA-C-4.0)

    EPA Science Inventory

    The purpose of this SOP is to describe the assembly of household (HH) packets into data processing batches. The batching process enables orderly tracking of packets or forms through data processing and limits the potential for packet or form loss. This procedure was used for th...

  14. IceProd 2 Usage Experience

    NASA Astrophysics Data System (ADS)

    Delventhal, D.; Schultz, D.; Diaz Velez, J. C.

    2017-10-01

    IceProd is a data processing and management framework developed by the IceCube Neutrino Observatory for processing of Monte Carlo simulations, detector data, and data driven analysis. It runs as a separate layer on top of grid and batch systems. This is accomplished by a set of daemons which process job workflow, maintaining configuration and status information on the job before, during, and after processing. IceProd can also manage complex workflow DAGs across distributed computing grids in order to optimize usage of resources. IceProd has recently been rewritten to increase its scaling capabilities, handle user analysis workflows together with simulation production, and facilitate the integration with 3rd party scheduling tools. IceProd 2, the second generation of IceProd, has been running in production for several months now. We share our experience setting up the system and things we’ve learned along the way.

  15. Improving tablet coating robustness by selecting critical process parameters from retrospective data.

    PubMed

    Galí, A; García-Montoya, E; Ascaso, M; Pérez-Lozano, P; Ticó, J R; Miñarro, M; Suñé-Negre, J M

    2016-09-01

    Although tablet coating processes are widely used in the pharmaceutical industry, they often lack adequate robustness. Up-scaling can be challenging, as minor changes in parameters can lead to varying quality results. Our objectives were to select critical process parameters (CPPs) using retrospective data for a commercial product and to establish a design of experiments (DoE) that would improve the robustness of the coating process. The study was a retrospective analysis of data from 36 commercial batches. Batches were selected based on the quality results generated during batch release, some of which revealed quality deviations concerning the appearance of the coated tablets. The product is already marketed and belongs to the portfolio of a multinational pharmaceutical company. The Statgraphics 5.1 software was used for data processing to determine critical process parameters and to propose new working ranges. This study confirms that it is possible to determine the critical process parameters and create design spaces based on retrospective data from commercial batches. This type of analysis thus becomes a tool to optimize the robustness of existing processes. Our results show that a design space can be established with minimal investment in experiments, since current commercial batch data are processed statistically.

  16. Lifelong learning of human actions with deep neural network self-organization.

    PubMed

    Parisi, German I; Tani, Jun; Weber, Cornelius; Wermter, Stefan

    2017-12-01

    Lifelong learning is fundamental in autonomous robotics for the acquisition and fine-tuning of knowledge through experience. However, conventional deep neural models for action recognition from videos do not account for lifelong learning; rather, they learn a batch of training data with a predefined number of action classes and samples. Thus, there is a need to develop learning systems with the ability to incrementally process available perceptual cues and to adapt their responses over time. We propose a self-organizing neural architecture for incrementally learning to classify human actions from video sequences. The architecture comprises growing self-organizing networks equipped with recurrent neurons for processing time-varying patterns. We use a set of hierarchically arranged recurrent networks for the unsupervised learning of action representations with increasingly large spatiotemporal receptive fields. Lifelong learning is achieved in terms of prediction-driven neural dynamics in which the growth and adaptation of the recurrent networks are driven by their capability to reconstruct temporally ordered input sequences. Experimental results on a classification task using two action benchmark datasets show that our model is competitive with state-of-the-art methods for batch learning, even when a significant number of sample labels are missing or corrupted during training sessions. Additional experiments show the ability of our model to adapt to non-stationary input, avoiding catastrophic interference. Copyright © 2017 The Author(s). Published by Elsevier Ltd. All rights reserved.

  17. Batch process fault detection and identification based on discriminant global preserving kernel slow feature analysis.

    PubMed

    Zhang, Hanyuan; Tian, Xuemin; Deng, Xiaogang; Cao, Yuping

    2018-05-16

    As an attractive nonlinear dynamic data analysis tool, global preserving kernel slow feature analysis (GKSFA) has achieved great success in extracting the high nonlinearity and inherently time-varying dynamics of batch processes. However, GKSFA is an unsupervised feature extraction method and lacks the ability to utilize batch process class label information, which may not offer the most effective means for dealing with batch process monitoring. To overcome this problem, we propose a novel batch process monitoring method based on a modified GKSFA, referred to as discriminant global preserving kernel slow feature analysis (DGKSFA), which closely integrates discriminant analysis and GKSFA. The proposed DGKSFA method can extract discriminant features of a batch process as well as preserve the global and local geometrical structure information of observed data. For the purpose of fault detection, a monitoring statistic is constructed based on the distance between the optimal kernel feature vectors of test data and normal data. To tackle the challenging issue of nonlinear fault variable identification, a new nonlinear contribution plot method is also developed to help identify the fault variables after a fault is detected, derived from the idea of variable pseudo-sample trajectory projection in the DGKSFA nonlinear biplot. Simulation results on a numerical nonlinear dynamic system and the benchmark fed-batch penicillin fermentation process demonstrate that the proposed process monitoring and fault diagnosis approach can effectively detect faults and distinguish fault variables from normal variables. Copyright © 2018 ISA. Published by Elsevier Ltd. All rights reserved.

  18. SEWAGE OFF-GAS-DRIVEN FUEL CELLS TO STIMULATE RURAL ELECTRIFICATION

    EPA Science Inventory

    Literature reviews confirmed the feasibility of the system relying on methane to supply the fuel cell and the waste heat from the subsequent fuel cell operation driving the decomposition process. A batch bioreactor and a proton exchange fuel cell at the lab scale are used to c...

  20. Continuous Manufacturing in Pharmaceutical Process Development and Manufacturing.

    PubMed

    Burcham, Christopher L; Florence, Alastair J; Johnson, Martin D

    2018-06-07

    The pharmaceutical industry has found new applications for the use of continuous processing for the manufacture of new therapies currently in development. The transformation has been encouraged by regulatory bodies as well as driven by cost reduction, decreased development cycles, access to new chemistries not practical in batch, improved safety, flexible manufacturing platforms, and improved product quality assurance. The transformation from batch to continuous manufacturing processing is the focus of this review. The review is limited to small, chemically synthesized organic molecules and encompasses the manufacture of both active pharmaceutical ingredients (APIs) and the subsequent drug product. Continuous drug product is currently used in approved processes. A few examples of production of APIs under current good manufacturing practice conditions using continuous processing steps have been published in the past five years, but they are lagging behind continuous drug product with respect to regulatory filings.

  1. Application of a continuous twin screw-driven process for dilute acid pretreatment of rape straw.

    PubMed

    Choi, Chang Ho; Oh, Kyeong Keun

    2012-04-01

    Rape straw, a processing residue generated by the bio-oil industry, was used as a model biomass for application of continuous twin screw-driven dilute acid pretreatment. The screw rotation speed and feeding rate were adjusted to 19.7 rpm and 0.5 g/min, respectively, to maintain a residence time of 7.2 min in the reaction zone. The sulfuric acid concentration was 3.5 wt% and the reaction temperature was 165 °C. The enzymatic digestibility of the glucan in the pretreated solids was 70.9%. The continuous process routinely gave around 28.8% higher glucan digestibility yield than the batch processing method. Copyright © 2012 Elsevier Ltd. All rights reserved.

  2. Tier 3 batch system data locality via managed caches

    NASA Astrophysics Data System (ADS)

    Fischer, Max; Giffels, Manuel; Jung, Christopher; Kühn, Eileen; Quast, Günter

    2015-05-01

    Modern data processing increasingly relies on data locality for performance and scalability, whereas the common HEP approaches aim for uniform resource pools with minimal locality, recently even across site boundaries. To combine the advantages of both, the High-Performance Data Analysis (HPDA) Tier 3 concept opportunistically establishes data locality via coordinated caches. In accordance with HEP Tier 3 activities, the design incorporates two major assumptions: first, only a fraction of the data is accessed regularly and is thus the deciding factor for overall throughput; second, data access may fall back to non-local sources, making permanent local data availability an inefficient resource usage strategy. Based on this, the HPDA design generically extends available storage hierarchies into the batch system. Using the batch system itself for scheduling file locality, an array of independent caches on the worker nodes is dynamically populated with high-profile data. Cache state information is exposed to the batch system both for managing caches and for scheduling jobs. As a result, users directly work with a regular, adequately sized storage system, while their automated batch processes are presented with local replications of data whenever possible.

  3. Direct analysis in real time mass spectrometry, a process analytical technology tool for real-time process monitoring in botanical drug manufacturing.

    PubMed

    Wang, Lu; Zeng, Shanshan; Chen, Teng; Qu, Haibin

    2014-03-01

    A promising process analytical technology (PAT) tool has been introduced for batch process monitoring. Direct analysis in real time mass spectrometry (DART-MS), a means of rapid fingerprint analysis, was applied to a percolation process with multi-constituent substances for an anti-cancer botanical preparation. Fifteen batches were carried out, including ten normal operations and five abnormal batches with artificial variations. The obtained multivariate data were analyzed with a multi-way partial least squares (MPLS) model. Control trajectories were derived from eight normal batches, and the qualification was tested by R² and Q². The accuracy and diagnostic capability of the batch model were then validated on the remaining batches. Assisted by high performance liquid chromatography (HPLC) determination, process faults were explained by the corresponding variable contributions. Furthermore, a batch-level model was developed to compare and assess the model performance. The present study has demonstrated that DART-MS is very promising for process monitoring in botanical manufacturing. Compared with general PAT tools, DART-MS offers direct information on effective constituents and can potentially be used to improve batch quality and process consistency for samples in complex matrices. Copyright © 2014 Elsevier B.V. All rights reserved.

  4. Batch Model for Batched Timestamps Data Analysis with Application to the SSA Disability Program

    PubMed Central

    Yue, Qingqi; Yuan, Ao; Che, Xuan; Huynh, Minh; Zhou, Chunxiao

    2016-01-01

    The Office of Disability Adjudication and Review (ODAR) is responsible for holding hearings, issuing decisions, and reviewing appeals as part of the Social Security Administration's disability determination process. In order to control and process cases, the ODAR established a Case Processing and Management System (CPMS) to record management information in December 2003. The CPMS provides a detailed case status history for each case. Due to the large number of appeal requests and limited resources, the number of pending claims at ODAR was over one million cases by March 31, 2015. Our National Institutes of Health (NIH) team collaborated with SSA on a Case Status Change Model (CSCM) project to meet the ODAR's urgent need to reduce backlogs and improve the hearings and appeals process. One of the key issues in the CSCM project is to estimate the expected service time and its variation for each case status code. The challenge is that the system's recorded job departure times may not be the true job completion times. As the CPMS timestamp data for case status codes showed apparent batch patterns, we proposed a batch model and applied a constrained least squares method to estimate the mean service times and variances. We also proposed a batch search algorithm to determine the optimal batch partition, as no batch partition was given in the real data. Simulation studies were conducted to evaluate the performance of the proposed methods. Finally, we applied the method to analyze real CPMS data from ODAR/SSA. PMID:27747132
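
    The following toy sketch shows only the batch pattern that motivates the model: identical recorded departure timestamps are grouped into candidate batches whose sizes then feed the estimation step. The constrained least squares fit and the batch search algorithm themselves are not reproduced here, and the data are invented.

      import numpy as np

      # Recorded departure times; equal values suggest jobs released as one batch.
      stamps = np.array([3.2, 3.2, 3.2, 7.9, 7.9, 12.4, 12.4, 12.4, 12.4])
      breaks = np.where(np.diff(stamps) > 0)[0] + 1
      batches = np.split(stamps, breaks)     # candidate batch partition
      print([len(b) for b in batches])       # batch sizes: [3, 2, 4]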

  5. Future supply chains enabled by continuous processing--opportunities and challenges. May 20-21, 2014 Continuous Manufacturing Symposium.

    PubMed

    Srai, Jagjit Singh; Badman, Clive; Krumme, Markus; Futran, Mauricio; Johnston, Craig

    2015-03-01

    This paper examines the opportunities and challenges facing the pharmaceutical industry in moving to a primarily "continuous processing"-based supply chain. The current predominantly "large batch" and centralized manufacturing system designed for the "blockbuster" drug has driven a slow-paced, inventory-heavy operating model that is increasingly regarded as inflexible and unsustainable. Indeed, new markets and the rapidly evolving technology landscape will drive more product variety, shorter product life-cycles, and smaller drug volumes, which will exacerbate an already unsustainable economic model. Future supply chains will be required to enhance affordability and availability for patients and healthcare providers alike despite the increased product complexity. In this more challenging supply scenario, we examine the potential for a more pull-driven, near real-time demand-based supply chain, utilizing continuous processing where appropriate as a key element of a more "flow-through" operating model. In this discussion paper on future supply chain models underpinned by developments in the continuous manufacture of pharmaceuticals, we have set out: the significant opportunities in moving to a supply chain flow-through operating model, with substantial opportunities in inventory reduction, lead-time to patient, and radically different product assurance/stability regimes; scenarios for decentralized production models producing a greater variety of products with enhanced volume flexibility; production, supply, and value chain footprints that are radically different from today's monolithic and centralized batch manufacturing operations; clinical trial and drug product development cost savings that support more rapid scale-up and market entry models, with early involvement of SC designers within New Product Development; and the major supply chain and industrial transformational challenges that need to be addressed. The paper recognizes that although current batch operational performance in pharma is far from optimal and not necessarily an appropriate end-state benchmark for batch technology, the adoption of continuous supply chain operating models underpinned by continuous production processing, as full or hybrid solutions in selected product supply chains, can support industry transformations to deliver right-first-time quality at substantially lower inventory profiles. © 2015 Wiley Periodicals, Inc. and the American Pharmacists Association.

  6. Big data analytics in hyperspectral imaging for detection of microbial colonies on agar plates (Conference Presentation)

    NASA Astrophysics Data System (ADS)

    Yoon, Seung-Chul; Park, Bosoon; Lawrence, Kurt C.

    2017-05-01

    Various types of optical imaging techniques measuring light reflectivity and scattering can detect microbial colonies of foodborne pathogens on agar plates. Until recently, these techniques were developed to provide solutions for hypothesis-driven studies, which focused on developing tools and batch/offline machine learning methods with well-defined sets of data. These have relatively high accuracy and rapid response times because the tools and methods are often optimized for the collected data. However, they often need to be retrained or recalibrated when new untrained data and/or features are added. A big-data-driven technique is more suitable for online learning of new/ambiguous samples and for mining unknown or hidden features. Although big data research in hyperspectral imaging is emerging in remote sensing, and many tools and methods have been developed in other applications such as bioinformatics, the tools and methods still need to be evaluated and adjusted in applications where conventional batch machine learning algorithms have been dominant. The primary objective of this study is to evaluate appropriate big data analytic tools and methods for online learning and mining of foodborne pathogens on agar plates. After the tools and methods are successfully identified, they will be applied to rapidly search big color and hyperspectral image data of microbial colonies collected in-house over the past 5 years and find the most probable colony or group of colonies in the collected big data. The metadata, such as collection time, and any unstructured data (e.g., comments) will also be analyzed and presented with the output results. The expected result is a novel, big data-driven technology to correctly detect and recognize microbial colonies of various foodborne pathogens on agar plates.

  7. Bioprocessing Data for the Production of Marine Enzymes

    PubMed Central

    Sarkar, Sreyashi; Pramanik, Arnab; Mitra, Anindita; Mukherjee, Joydeep

    2010-01-01

    This review is a synopsis of the different bioprocess engineering approaches adopted for the production of marine enzymes. Three major modes of operation, namely batch, fed-batch, and continuous, have been used for the production of enzymes (such as protease, chitinase, agarase, and peroxidase), mainly from marine bacteria and fungi, at laboratory bioreactor and pilot plant scales. Submerged, immobilized, and solid-state processes in batch mode were widely employed. The fed-batch process was also applied in several bioprocesses. Continuous processes with suspended cells as well as with immobilized cells have been used. Investigations in shake flasks were conducted with the prospect of large-scale processing in reactors. PMID:20479981

  8. Free Factories: Unified Infrastructure for Data Intensive Web Services

    PubMed Central

    Zaranek, Alexander Wait; Clegg, Tom; Vandewege, Ward; Church, George M.

    2010-01-01

    We introduce the Free Factory, a platform for deploying data-intensive web services using small clusters of commodity hardware and free software. Independently administered virtual machines called Freegols give application developers the flexibility of a general purpose web server, along with access to distributed batch processing, cache and storage services. Each cluster exploits idle RAM and disk space for cache, and reserves disks in each node for high bandwidth storage. The batch processing service uses a variation of the MapReduce model. Virtualization allows every CPU in the cluster to participate in batch jobs. Each 48-node cluster can achieve 4-8 gigabytes per second of disk I/O. Our intent is to use multiple clusters to process hundreds of simultaneous requests on multi-hundred terabyte data sets. Currently, our applications achieve 1 gigabyte per second of I/O with 123 disks by scheduling batch jobs on two clusters, one of which is located in a remote data center. PMID:20514356

  9. Network acceleration techniques

    NASA Technical Reports Server (NTRS)

    Crowley, Patricia (Inventor); Maccabe, Arthur Barney (Inventor); Awrach, James Michael (Inventor)

    2012-01-01

    Splintered offloading techniques with receive batch processing are described for network acceleration. Such techniques offload specific functionality to a NIC while maintaining the bulk of the protocol processing in the host operating system ("OS"). The resulting protocol implementation allows the application to bypass the protocol processing of the received data. This can be accomplished by moving data from the NIC directly to the application through direct memory access ("DMA") and batch processing the receive headers in the host OS when the host OS is interrupted to perform other work. Batch processing receive headers allows the data path to be separated from the control path. Unlike operating system bypass, however, the operating system still fully manages the network resource and has relevant feedback about traffic and flows. Embodiments of the present disclosure can therefore address the challenges of networks with extreme bandwidth delay products (BWDP).

  10. How Many Batches Are Needed for Process Validation under the New FDA Guidance?

    PubMed

    Yang, Harry

    2013-01-01

    The newly updated FDA Guidance for Industry on Process Validation: General Principles and Practices ushers in a life cycle approach to process validation. While the guidance no longer considers the use of traditional three-batch validation appropriate, it does not prescribe the number of validation batches for a prospective validation protocol, nor does it provide specific methods to determine it. This potentially could leave manufacturers in a quandary. In this paper, I develop a Bayesian method to address the issue. By combining process knowledge gained from Stage 1 Process Design (PD) with expected outcomes of Stage 2 Process Performance Qualification (PPQ), the number of validation batches for PPQ is determined to provide a high level of assurance that the process will consistently produce future batches meeting quality standards. Several examples based on simulated data are presented to illustrate the use of the Bayesian method in helping manufacturers make risk-based decisions for Stage 2 PPQ, and they highlight the advantages of the method over traditional Frequentist approaches. The discussions in the paper lend support for a life cycle and risk-based approach to process validation recommended in the new FDA guidance.

  11. Evaluation of intensity drift correction strategies using MetaboDrift, a normalization tool for multi-batch metabolomics data.

    PubMed

    Thonusin, Chanisa; IglayReger, Heidi B; Soni, Tanu; Rothberg, Amy E; Burant, Charles F; Evans, Charles R

    2017-11-10

    In recent years, mass spectrometry-based metabolomics has increasingly been applied to large-scale epidemiological studies of human subjects. However, the successful use of metabolomics in this context is subject to the challenge of detecting biologically significant effects despite substantial intensity drift that often occurs when data are acquired over a long period or in multiple batches. Numerous computational strategies and software tools have been developed to aid in correcting for intensity drift in metabolomics data, but most of these techniques are implemented using command-line driven software and custom scripts which are not accessible to all end users of metabolomics data. Further, it has not yet become routine practice to assess the quantitative accuracy of drift correction against techniques which enable true absolute quantitation such as isotope dilution mass spectrometry. We developed an Excel-based tool, MetaboDrift, to visually evaluate and correct for intensity drift in a multi-batch liquid chromatography - mass spectrometry (LC-MS) metabolomics dataset. The tool enables drift correction based on either quality control (QC) samples analyzed throughout the batches or using QC-sample independent methods. We applied MetaboDrift to an original set of clinical metabolomics data from a mixed-meal tolerance test (MMTT). The performance of the method was evaluated for multiple classes of metabolites by comparison with normalization using isotope-labeled internal standards. QC sample-based intensity drift correction significantly improved correlation with IS-normalized data, and resulted in detection of additional metabolites with significant physiological response to the MMTT. The relative merits of different QC-sample curve fitting strategies are discussed in the context of batch size and drift pattern complexity. Our drift correction tool offers a practical, simplified approach to drift correction and batch combination in large metabolomics studies. Copyright © 2017 Elsevier B.V. All rights reserved.
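
    MetaboDrift itself is Excel-based; the Python sketch below merely illustrates QC-based drift correction, fitting a low-order trend through QC-sample intensities across injection order and normalizing all injections by it. The data and the polynomial degree are assumptions.

      import numpy as np

      rng = np.random.default_rng(0)
      order = np.arange(100)                       # injection order within a batch
      drift = 1.0 + 0.004 * order                  # simulated intensity drift
      y = rng.normal(1000, 30, size=100) * drift   # one metabolite across the batch
      qc = order[::10]                             # every 10th injection is a QC sample

      coef = np.polyfit(qc, y[qc], deg=2)          # low-order fit to the QC intensities
      trend = np.polyval(coef, order)
      corrected = y * trend.mean() / trend         # flatten the drift, preserve scale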

  12. Bayesian assurance and sample size determination in the process validation life-cycle.

    PubMed

    Faya, Paul; Seaman, John W; Stamey, James D

    2017-01-01

    Validation of pharmaceutical manufacturing processes is a regulatory requirement and plays a key role in the assurance of drug quality, safety, and efficacy. The FDA guidance on process validation recommends a life-cycle approach which involves process design, qualification, and verification. The European Medicines Agency makes similar recommendations. The main purpose of process validation is to establish scientific evidence that a process is capable of consistently delivering a quality product. A major challenge faced by manufacturers is the determination of the number of batches to be used for the qualification stage. In this article, we present a Bayesian assurance and sample size determination approach where prior process knowledge and data are used to determine the number of batches. An example is presented in which potency uniformity data is evaluated using a process capability metric. By using the posterior predictive distribution, we simulate qualification data and make a decision on the number of batches required for a desired level of assurance.
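
    A toy posterior-predictive sketch of the assurance calculation follows: simulate qualification batches from posterior draws and estimate the probability that all n batches meet an acceptance criterion. The posterior, specification limits, and candidate n values are illustrative assumptions, not the paper's values.

      import numpy as np

      rng = np.random.default_rng(0)
      mu_post = rng.normal(100.0, 0.8, size=20000)   # posterior draws of batch mean potency
      sigma = 1.5                                    # assumed batch-to-batch SD

      def assurance(n, lo=95.0, hi=105.0):
          sims = rng.normal(mu_post[:, None], sigma, size=(mu_post.size, n))
          return ((sims > lo) & (sims < hi)).all(axis=1).mean()

      for n in (3, 5, 10):                           # pick the smallest n meeting the target
          print(n, round(assurance(n), 3))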

  13. The operable modeling of simultaneous saccharification and fermentation of ethanol production from cellulose.

    PubMed

    Shen, Jiacheng; Agblevor, Foster A

    2010-03-01

    An operable batch model of simultaneous saccharification and fermentation (SSF) for ethanol production from cellulose has been developed. The model includes four ordinary differential equations that describe the changes in cellobiose, glucose, yeast, and ethanol concentrations with respect to time. These equations were used to simulate the experimental data for the four main components in the SSF process of ethanol production from microcrystalline cellulose (Avicel PH101). The model parameters, at 95% confidence intervals, were determined by a MATLAB program based on the batch experimental data of the SSF. Both experimental data and model simulations showed that cell growth was the rate-controlling step in the initial period of the series of reactions from cellulose to ethanol, and that later the conversion of cellulose to cellobiose controlled the process. The batch model was extended to continuous and fed-batch operating modes. For continuous operation in the SSF, the ethanol productivity increased with increasing dilution rate until a maximum value was attained, then decreased rapidly as the dilution rate approached the washout point. The model also predicted a higher ethanol mass for fed-batch operation than for batch operation.
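
    A hedged sketch of such a four-state batch SSF model is given below, integrating cellobiose (B), glucose (G), yeast (X), and ethanol (P) balances with scipy; the rate laws and constants are illustrative stand-ins, not the paper's fitted parameters.

      import numpy as np
      from scipy.integrate import solve_ivp

      def ssf(t, y):
          B, G, X, P = y                        # concentrations, g/L
          r_cb = 0.25 * max(1 - P / 80, 0)      # cellulose hydrolysis to cellobiose
          r_g = 0.60 * B / (0.4 + B)            # cellobiose hydrolysis to glucose
          mu = 0.30 * G / (1.5 + G)             # Monod growth of yeast on glucose
          return [r_cb - r_g,                   # dB/dt
                  1.05 * r_g - (mu / 0.1) * X,  # dG/dt (yield-scaled uptake)
                  mu * X,                       # dX/dt
                  1.9 * mu * X]                 # dP/dt (growth-associated ethanol)

      sol = solve_ivp(ssf, (0, 72), [0.5, 1.0, 0.1, 0.0], dense_output=True)
      print(sol.y[:, -1])                       # concentrations after 72 h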

  14. A framework supporting the development of a Grid portal for analysis based on ROI.

    PubMed

    Ichikawa, K; Date, S; Kaishima, T; Shimojo, S

    2005-01-01

    In our research on brain function analysis, users require two different simultaneous types of processing: interactive processing of a specific part of the data and high-performance batch processing of an entire dataset. The difference between these two types of processing is whether or not the analysis targets data in a region of interest (ROI). In this study, we propose a Grid portal that has a mechanism to freely assign computing resources to users in a Grid environment according to these two different processing requirements. We constructed a Grid portal that integrates interactive processing and batch processing through the following two mechanisms. First, a job steering mechanism controls job execution based on user-tagged priority among organizations with heterogeneous computing resources; interactive jobs are processed in preference to batch jobs by this mechanism. Second, a priority-based result delivery mechanism administers a ranking of data significance. The portal ensures a turn-around time for interactive processing via the priority-based job control mechanism, and provides users with quality of service (QoS) for interactive processing. Users can access the analysis results of interactive jobs in preference to the analysis results of batch jobs. The Grid portal has also achieved high-performance computation for MEG analysis with batch processing in the Grid environment. The priority-based job control mechanism makes it possible to assign computing resources freely according to users' requirements. Furthermore, the achievement of high-performance computation contributes greatly to the overall progress of brain science. The portal has thus made it possible for users to flexibly bring large computational power to bear on whatever they want to analyze.
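
    A minimal sketch of the priority idea: interactive (ROI) jobs are dequeued ahead of batch jobs. The heap-based queue is only a stand-in for the portal's job steering mechanism; all names are assumptions.

      import heapq
      import itertools

      INTERACTIVE, BATCH = 0, 1
      queue, counter = [], itertools.count()

      def submit(kind, name):
          # Lower kind sorts first, so interactive jobs outrank batch jobs;
          # the counter keeps submission order within each priority class.
          heapq.heappush(queue, (kind, next(counter), name))

      submit(BATCH, "meg-full-dataset")
      submit(INTERACTIVE, "roi-window-42")

      while queue:
          kind, _, name = heapq.heappop(queue)
          print("run", name)                  # roi-window-42 runs before meg-full-dataset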

  15. Workload-Driven Design and Evaluation of Large-Scale Data-Centric Systems

    DTIC Science & Technology

    2012-05-09

    in the batch zone in and out of a low-power state, e.g., sending a "hibernate" command via ssh and using Wake-on-LAN or related technologies [85]. If... parameter values for experiments with stand-alone jobs. The mapred.child.java.opts parameter sets the maximum virtual memory of the Java child processes

  16. Perfusion cell culture decreases process and product heterogeneity in a head-to-head comparison with fed-batch.

    PubMed

    Walther, Jason; Lu, Jiuyi; Hollenbach, Myles; Yu, Marcella; Hwang, Chris; McLarty, Jean; Brower, Kevin

    2018-05-30

    In this study, we compared the impacts of fed-batch and perfusion platforms on process and product attributes for IgG1- and IgG4-producing cell lines. A "plug-and-play" approach was applied to both platforms at bench scale, using commercially available basal and feed media, a standard feed strategy for fed-batch, and ATF filtration for perfusion. Product concentration in fed-batch was 2.5 times greater than in perfusion, while average productivity in perfusion was 7.5 times greater than in fed-batch. PCA revealed more variability in the cell environment and metabolism during the fed-batch run. LDH measurements showed that exposure of product to cell lysate was 7-10 times greater in fed-batch. Product analysis shows larger abundances of neutral species in perfusion, likely due to decreased bioreactor residence times and extracellular exposure. The IgG1 perfusion product also had higher purity and lower half-antibody. Glycosylation was similar across both culture modes. The first perfusion harvest slice for both product types showed different glycosylation than subsequent harvests, suggesting that product quality lags behind metabolism. In conclusion, the process and product data indicate that intra-lot heterogeneity is decreased in perfusion cultures. Additional data and discussion are required to understand the developmental, clinical, and commercial implications, and in what situations increased uniformity would be beneficial. This article is protected by copyright. All rights reserved.

  17. Soft sensor modeling based on variable partition ensemble method for nonlinear batch processes

    NASA Astrophysics Data System (ADS)

    Wang, Li; Chen, Xiangguang; Yang, Kai; Jin, Huaiping

    2017-01-01

    Batch processes are typically characterized by nonlinear and uncertain system properties; therefore, a conventional single model may be ill-suited. A local-learning soft sensor based on a variable partition ensemble method is developed for quality prediction in nonlinear and non-Gaussian batch processes. A set of input variable sets is obtained by bootstrapping and the PMI criterion. Multiple local GPR models are then developed, one for each local input variable set. When a new test sample arrives, the posterior probability of each best-performing local model is estimated based on Bayesian inference and used to combine these local GPR models into the final prediction result. The proposed soft sensor is demonstrated by application to an industrial fed-batch chlortetracycline fermentation process.
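
    A minimal sketch of the ensemble idea follows: several local GPR models on different input-variable subsets, combined with weights derived from each model's predictive uncertainty. The synthetic data and the precision-based weighting are simplifying assumptions rather than the paper's exact Bayesian scheme.

      import numpy as np
      from sklearn.gaussian_process import GaussianProcessRegressor

      rng = np.random.default_rng(0)
      X = rng.normal(size=(80, 4))
      y = np.sin(X[:, 0]) + 0.5 * X[:, 2] + rng.normal(scale=0.05, size=80)

      subsets = [(0, 1), (0, 2), (2, 3)]       # candidate input-variable partitions
      models = [GaussianProcessRegressor().fit(X[:, s], y) for s in subsets]

      x_new = rng.normal(size=(1, 4))
      preds = [m.predict(x_new[:, s], return_std=True) for m, s in zip(models, subsets)]
      mu = np.array([p[0][0] for p in preds])
      sd = np.array([p[1][0] for p in preds])

      w = 1 / sd ** 2                          # precision-based combination weights
      w /= w.sum()
      y_hat = float(w @ mu)                    # combined soft-sensor prediction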

  18. FDA 2011 process validation guidance: lifecycle compliance model.

    PubMed

    Campbell, Cliff

    2014-01-01

    This article has been written as a contribution to the industry's efforts in migrating from a document-driven to a data-driven compliance mindset. A combination of target product profile, control engineering, and general sum principle techniques is presented as the basis of a simple but scalable lifecycle compliance model in support of modernized process validation. Unit operations and significant variables occupy pole position within the model, with documentation requirements treated as a derivative or consequence of the modeling process. The quality system is repositioned as a subordinate of system quality, this being defined as the integral of related "system qualities". The article represents a structured interpretation of the U.S. Food and Drug Administration's 2011 Guidance for Industry on Process Validation and is based on the author's educational background and his manufacturing/consulting experience in the validation field. The U.S. Food and Drug Administration's Guidance for Industry on Process Validation (2011) provides a wide-ranging and rigorous outline of compliant drug manufacturing requirements relative to its 20th-century predecessor (1987). Its declared focus is patient safety, and it identifies three inter-related (and obvious) stages of the compliance lifecycle. Firstly, processes must be designed, both from a technical and a quality perspective. Secondly, processes must be qualified, providing evidence that the manufacturing facility is fully "roadworthy" and fit for its intended purpose. Thirdly, processes must be verified, meaning that commercial batches must be monitored to ensure that processes remain in a state of control throughout their lifetime.

  19. Radar Unix: a complete package for GPR data processing

    NASA Astrophysics Data System (ADS)

    Grandjean, Gilles; Durand, Herve

    1999-03-01

    A complete package for ground penetrating radar data interpretation, including data processing, forward modeling, and a case history database, is presented. Running on a Unix operating system, its architecture consists of a graphical user interface that generates batch files transmitted to a library of processing routines. This design allows better software maintenance and gives users the ability to run processing or modeling batch files by themselves, deferred in time. A case history database is available as a hypertext document that can be consulted using a standard HTML browser. All the software specifications are presented through a realistic example.

  20. Gas-driven microturbine

    DOEpatents

    Sniegowski, Jeffrey J.; Rodgers, Murray S.; McWhorter, Paul J.; Aeschliman, Daniel P.; Miller, William M.

    2002-01-01

    A microturbine fabricated by a three-level semiconductor batch-fabrication process based on polysilicon surface-micromachining. The microturbine comprises microelectromechanical elements formed from three polysilicon multi-layer surfaces applied to a silicon substrate. Interleaving sacrificial oxide layers provides electrical and physical isolation, and selective etching of both the sacrificial layers and the polysilicon layers allows formation of individual mechanical and electrical elements as well as the required space for necessary movement of rotating turbine parts and linear elements.

  1. 40 CFR 63.1325 - Batch process vents-performance test methods and procedures to determine compliance.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... either total organic HAP or TOC. For purposes of this paragraph (c), the term “batch emission episode... ensure the measurement of total organic HAP or TOC (minus methane and ethane) concentrations in all batch... organic HAP or TOC, as appropriate. Alternatively, any other method or data that has been validated...

  2. Interactive Parallel Data Analysis within Data-Centric Cluster Facilities using the IPython Notebook

    NASA Astrophysics Data System (ADS)

    Pascoe, S.; Lansdowne, J.; Iwi, A.; Stephens, A.; Kershaw, P.

    2012-12-01

    The data deluge is making traditional analysis workflows for many researchers obsolete. Support for parallelism within popular tools such as MATLAB, IDL and NCO is not well developed and rarely used, yet parallelism is necessary for processing modern data volumes on a timescale conducive to curiosity-driven analysis. Furthermore, for peta-scale datasets such as the CMIP5 archive, it is no longer practical to bring an entire dataset to a researcher's workstation for analysis, or even to their institutional cluster. There is therefore an increasing need for analysis platforms that enable processing at the point of data storage and provide parallelism. Such an environment should, where possible, maintain the convenience and familiarity of our current analysis environments to encourage curiosity-driven research. We describe how we are combining the interactive Python shell (IPython) with our JASMIN data-cluster infrastructure. IPython has been specifically designed to bridge the gap between HPC-style parallel workflows and the opportunistic curiosity-driven analysis usually carried out using domain-specific languages and scriptable tools. IPython offers a web-based interactive environment, the IPython notebook, and a cluster engine for parallelism, all underpinned by the well-respected Python/SciPy scientific programming stack. JASMIN is designed to support the data analysis requirements of the UK and European climate and earth system modeling community; its sister facility CEMS focuses on the earth observation community. JASMIN has 4.5 PB of fast parallel disk storage alongside over 370 computing cores providing local computation. Through the IPython interface to JASMIN, users can make efficient use of JASMIN's multi-core virtual machines to perform interactive analysis on all cores simultaneously, or can configure IPython clusters across multiple VMs. Larger-scale clusters can be provisioned through JASMIN's batch scheduling system. Outputs can be summarised and visualised using the full power of Python's many scientific tools, including SciPy, Matplotlib, Pandas and CDAT. This rich user experience is delivered through the user's web browser, maintaining the interactive feel of a workstation-based environment with the parallel power of a remote data-centric processing facility.
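
    For illustration, a hedged sketch of the kind of interactive parallelism described above, using the ipyparallel package that underpins IPython's cluster engines. It assumes a cluster has already been started (e.g. with `ipcluster start`); the per-task analysis function is a stand-in:

```python
# Fan a per-block analysis out over all engines of an IPython cluster.
import ipyparallel as ipp

rc = ipp.Client()      # connect to an already-running IPython cluster
dview = rc[:]          # direct view on all engines

def block_mean(seed):
    # Stand-in per-task analysis; a real workflow would open a data chunk here.
    import numpy as np
    return float(np.random.default_rng(seed).normal(size=100_000).mean())

means = dview.map_sync(block_mean, range(64))   # one task per block, all cores
print(sum(means) / len(means))
```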

  3. Applicability of the Monocyte Activation Test (MAT) in the quality control of the 17DD yellow fever vaccine.

    PubMed

    de Mattos, Katherine Antunes; Navega, Elaine Cristina Azevedo; Silva, Vitor Fernandes; Almeida, Alessandra Santos; da Silva, Cristiane Caldeira; Presgrave, Octavio Augusto França; Junior, Daniel da Silva Guedes; Delgado, Isabella Fernandes

    2018-03-01

    The need for alternatives to animal use in pyrogen testing has been driven by the Three Rs concept, and resulted in the inclusion of the monocyte activation test (MAT) in the European Pharmacopoeia in 2010. However, some technical and regulatory obstacles must be overcome to ensure effective implementation of the MAT by industry, especially for the testing of biological products. The yellow fever (YF) vaccine (17DD-YFV) was chosen for evaluation in this study in view of: a) the 2016-2018 outbreak of YF in Brazil; b) the increase in demand for 17DD-YFV doses; c) the complex production process with live attenuated virus; d) the presence of possible test interference factors, such as residual process components (e.g. ovalbumin); and e) the need to investigate pyrogens that are not detectable by the methods prescribed in the YF vaccine monograph. Product-specific testing was carried out using cryopreserved and fresh whole blood, with IL-6 and IL-1β levels as the marker readouts. After assessing the applicability of the MAT on a 1:10 dilution of 17DD-YFV, endotoxin and non-endotoxin pyrogens were quantified in spiked batches using the lipopolysaccharide and lipoteichoic acid standards, respectively. The quantitative analysis demonstrated the correlation between the MAT and Limulus amoebocyte lysate (LAL) assays with respect to the limits of endotoxin recovery in spiked batches, and detected no pyrogenic contamination in commercial batches of 17DD-YFV. The data demonstrate the applicability of the MAT for 17DD-YFV pyrogen testing, as an alternative method that can contribute to biological quality control studies. 2018 FRAME.

  4. Multivariate statistical process control (MSPC) using Raman spectroscopy for in-line culture cell monitoring considering time-varying batches synchronized with correlation optimized warping (COW).

    PubMed

    Liu, Ya-Juan; André, Silvère; Saint Cristau, Lydia; Lagresle, Sylvain; Hannas, Zahia; Calvosa, Éric; Devos, Olivier; Duponchel, Ludovic

    2017-02-01

    Multivariate statistical process control (MSPC) is increasingly popular as a way to handle the challenge posed by the large multivariate datasets that analytical instruments such as Raman spectroscopy produce when monitoring complex cell cultures in the biopharmaceutical industry. However, Raman spectroscopy for in-line monitoring often produces unsynchronized datasets, resulting in time-varying batches. Unsynchronized datasets are common in cell culture monitoring because spectroscopic measurements are generally recorded in an alternating fashion, with more than one optical probe connected in parallel to the same spectrometer. Synchronized batches are a prerequisite for applying multivariate analyses such as multi-way principal component analysis (MPCA) in MSPC monitoring. Correlation optimized warping (COW) is a popular alignment method with satisfactory performance; however, it had not previously been applied to synchronize the acquisition times of spectroscopic datasets in an MSPC application. In this paper we propose, for the first time, using COW to synchronize batches of varying duration analyzed with Raman spectroscopy. In a second step, we developed MPCA models at different time intervals based on the normal operating condition (NOC) batches synchronized by COW. New batches are finally projected onto the corresponding MPCA model. We monitored the evolution of the batches using two multivariate control charts based on Hotelling's T² and Q. As the results illustrate, the MSPC model was able to identify abnormal operating conditions, including contaminated batches, which is of prime importance in cell culture monitoring. We show that Raman-based MSPC monitoring can be used to diagnose batches deviating from the normal condition with higher efficacy than traditional diagnosis, saving time and money in the biopharmaceutical industry. Copyright © 2016 Elsevier B.V. All rights reserved.
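
    As an illustration of the control-chart machinery, here is a simplified stand-in for the multi-way PCA models above: an ordinary PCA is fitted on normal-operating-condition data, and Hotelling's T² and Q (SPE) statistics are computed with empirical 95% limits. The data, dimensions and limit estimation are assumptions:

```python
# PCA-based T²/Q monitoring sketch on (synthetic) synchronized batch data.
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(1)
X_noc = rng.normal(size=(50, 200))          # 50 NOC batches x 200 unfolded variables
pca = PCA(n_components=3).fit(X_noc)

def t2_q(x):
    t = pca.transform(x[None, :])[0]
    t2 = float(np.sum(t**2 / pca.explained_variance_))   # Hotelling's T²
    resid = x - pca.inverse_transform(t[None, :])[0]
    q = float(resid @ resid)                             # Q (SPE) statistic
    return t2, q

# Empirical 95% limits from the NOC set; new batches exceeding them are flagged.
t2s, qs = zip(*(t2_q(x) for x in X_noc))
t2_lim, q_lim = np.percentile(t2s, 95), np.percentile(qs, 95)
t2_new, q_new = t2_q(rng.normal(0.5, 1.2, size=200))     # a deviating batch
print(t2_new > t2_lim or q_new > q_lim)
```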

  5. Multivariate statistical process control in product quality review assessment - A case study.

    PubMed

    Kharbach, M; Cherrah, Y; Vander Heyden, Y; Bouklouze, A

    2017-11-01

    According to the Food and Drug Administration and the European Good Manufacturing Practices (GMP) guidelines, the Annual Product Review (APR) is a mandatory GMP requirement. It consists of evaluating a large collection of qualitative or quantitative data in order to verify the consistency of an existing process. According to the Code of Federal Regulations (21 CFR 211.180), all finished products should be reviewed annually against quality standards to determine the need for any change in the specification or manufacturing of drug products. Conventional Statistical Process Control (SPC) evaluates the pharmaceutical production process by examining the effect of only a single factor at a time using a Shewhart chart, neglecting interactions between the variables. In order to overcome this issue, Multivariate Statistical Process Control (MSPC) can be used. Our case study concerns an APR assessment in which 164 historical batches containing six active ingredients, manufactured in Morocco, were collected during one year. Each batch was checked by assaying the six active ingredients by High Performance Liquid Chromatography according to European Pharmacopoeia monographs. The data matrix was evaluated both by SPC and MSPC. The SPC indicated that all batches were under control, while the MSPC, based on Principal Component Analysis (PCA), with the data either autoscaled or robustly scaled, showed four and seven batches, respectively, outside the Hotelling T² 95% ellipse. An improvement in process capability is also observed when the most extreme batches are excluded. MSPC can be used for monitoring subtle changes in the manufacturing process during an APR assessment. Copyright © 2017 Académie Nationale de Pharmacie. Published by Elsevier Masson SAS. All rights reserved.

  6. The development of an industrial-scale fed-batch fermentation simulation.

    PubMed

    Goldrick, Stephen; Ştefan, Andrei; Lovett, David; Montague, Gary; Lennox, Barry

    2015-01-10

    This paper describes a simulation of an industrial-scale fed-batch fermentation that can be used as a benchmark in process systems analysis and control studies. The simulation was developed using a mechanistic model and validated using historical data collected from an industrial-scale penicillin fermentation process. Each batch was carried out in a 100,000 L bioreactor that used an industrial strain of Penicillium chrysogenum. The manipulated variables recorded during each batch were used as inputs to the simulator and the predicted outputs were then compared with the on-line and off-line measurements recorded in the real process. The simulator adapted a previously published structured model to describe the penicillin fermentation and extended it to include the main environmental effects of dissolved oxygen, viscosity, temperature, pH and dissolved carbon dioxide. In addition, the effects of nitrogen and phenylacetic acid concentrations on the biomass and penicillin production rates were also included. The simulated model predictions of all the on-line and off-line process measurements, including the off-gas analysis, were in good agreement with the batch records. The simulator and industrial process data are available to download at www.industrialpenicillinsimulation.com and can be used to evaluate, study and improve on the current control strategy implemented at this facility. Crown Copyright © 2014. Published by Elsevier B.V. All rights reserved.
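
    A toy fed-batch mass balance integrated with SciPy illustrates the kind of mechanistic simulation such a benchmark is built on; all kinetic constants and feed settings below are invented for the sketch and are not taken from the published model:

```python
# Minimal fed-batch fermentation mass balance (Monod growth + feed dilution).
from scipy.integrate import solve_ivp

mu_max, Ks, Yxs, qp = 0.11, 0.05, 0.47, 0.004   # assumed kinetic constants
F, Sf = 50.0, 400.0                              # feed rate (L/h), feed substrate (g/L)

def rhs(t, z):
    X, S, P, V = z
    S = max(S, 0.0)                             # guard against tiny overshoots
    mu = mu_max * S / (Ks + S)                  # Monod growth kinetics
    D = F / V                                   # dilution caused by the feed
    return [mu * X - D * X,                     # biomass
            -mu * X / Yxs + D * (Sf - S),       # substrate
            qp * X - D * P,                     # product
            F]                                  # broth volume

sol = solve_ivp(rhs, (0, 200), [1.0, 10.0, 0.0, 60000.0])
X, S, P, V = sol.y[:, -1]
print(f"final biomass {X:.1f} g/L, product {P:.2f} g/L, volume {V:.0f} L")
```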

  7. Bioelectrochemical conversion of CO2 to chemicals: CO2 as a next generation feedstock for electricity-driven bioproduction in batch and continuous modes.

    PubMed

    Bajracharya, Suman; Vanbroekhoven, Karolien; Buisman, Cees J N; Strik, David P B T B; Pant, Deepak

    2017-09-21

    The recent concept of microbial electrosynthesis (MES) has evolved as an electricity-driven production technology for chemicals from low-value carbon dioxide (CO2) using micro-organisms as biocatalysts. MES from CO2 comprises bioelectrochemical reduction of CO2 to multi-carbon organic compounds using the reducing equivalents produced at the electrically-polarized cathode. The use of CO2 as a feedstock for chemicals is gaining much attention, since CO2 is abundantly available and its use is independent of the food supply chain. MES based on CO2 reduction produces acetate as a primary product. In order to elucidate the performance of the bioelectrochemical CO2 reduction process using different operation modes (batch vs. continuous), an investigation was carried out using a MES system with a flow-through biocathode supplied with 20:80 (v/v) or 80:20 (v/v) CO2:N2 gas. The highest acetate production rate of 149 mg L⁻¹ d⁻¹ was observed with a 3.1 V applied cell-voltage under batch mode. While running in continuous mode, high acetate production was achieved with a maximum rate of 100 mg L⁻¹ d⁻¹. In the continuous mode, the acetate production was not sustained over long-term operation, likely due to insufficient microbial biocatalyst retention within the biocathode compartment (i.e. suspended micro-organisms were washed out of the system). Restarting batch mode operations resulted in a renewed production of acetate. This showed an apparent domination of suspended biocatalysts over the attached (biofilm-forming) biocatalysts. Long-term CO2 reduction at the biocathode resulted in the accumulation of acetate, and more reduced compounds like ethanol and butyrate were also formed. Improvements in the production rate and different biomass retention strategies (e.g. selecting for biofilm-forming micro-organisms) should be investigated to enable continuous biochemical production from CO2 using MES. Certainly, other process optimizations will be required to establish MES as an innovative sustainable technology for manufacturing biochemicals from CO2 as a next-generation feedstock.

  8. The Use of Lean Six Sigma Methodology in Increasing Capacity of a Chemical Production Facility at DSM.

    PubMed

    Meeuwse, Marco

    2018-03-30

    Lean Six Sigma is an improvement method combining Lean, which focuses on removing 'waste' from a process, with Six Sigma, a data-driven approach making use of statistical tools. Traditionally it is used to improve the quality of products (reducing defects) or processes (reducing variability), but it can also be used to increase the productivity or capacity of a production plant. The Lean Six Sigma methodology is therefore an important pillar of continuous improvement within DSM. In the example shown here, a multistep batch process is improved by analyzing the duration of the relevant process steps and optimizing the procedures. Process steps were performed in parallel instead of sequentially, and some steps were shortened. The variability was reduced, enabling tighter planning and thereby reducing waiting times. Without any investment in new equipment or technical modifications, the productivity of the plant was improved by more than 20%, solely by changing procedures and the programming of the process control system.

  9. Direct infusion mass spectrometry metabolomics dataset: a benchmark for data processing and quality control

    PubMed Central

    Kirwan, Jennifer A; Weber, Ralf J M; Broadhurst, David I; Viant, Mark R

    2014-01-01

    Direct-infusion mass spectrometry (DIMS) metabolomics is an important approach for characterising molecular responses of organisms to disease, drugs and the environment. Increasingly large-scale metabolomics studies are being conducted, necessitating improvements in both bioanalytical and computational workflows to maintain data quality. This dataset represents a systematic evaluation of the reproducibility of a multi-batch DIMS metabolomics study of cardiac tissue extracts. It comprises twenty biological samples (cow vs. sheep) that were analysed repeatedly, in 8 batches across 7 days, together with a concurrent set of quality control (QC) samples. Data are presented from each step of the workflow and are available in MetaboLights. The strength of the dataset is that intra- and inter-batch variation can be corrected using QC spectra, and the quality of this correction assessed independently using the repeatedly-measured biological samples. Originally designed to test the efficacy of a batch-correction algorithm, it will enable others to evaluate novel data processing algorithms. Furthermore, this dataset serves as a benchmark for DIMS metabolomics, derived using best-practice workflows and rigorous quality assessment. PMID:25977770
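
    A much-simplified sketch of QC-based batch correction in the spirit the dataset was designed to test (the published work targets more sophisticated QC-based algorithms): each feature in each batch is rescaled so the batch's QC median matches the global QC median. All data below are simulated:

```python
# QC-median batch correction on a synthetic multi-batch intensity matrix.
import numpy as np

rng = np.random.default_rng(2)
batches = np.repeat(np.arange(8), 20)                 # 8 batches x 20 samples
is_qc = np.tile(np.arange(20) < 4, 8)                 # first 4 per batch are QCs
batch_factor = rng.uniform(0.7, 1.3, size=(8, 500))   # simulated batch effects
data = rng.lognormal(3.0, 0.3, size=(160, 500)) * batch_factor[batches]

def qc_correct(x, batch_ids, qc_mask):
    out = x.astype(float).copy()
    global_qc = np.median(x[qc_mask], axis=0)         # global QC median per feature
    for b in np.unique(batch_ids):
        sel = batch_ids == b
        batch_qc = np.median(x[sel & qc_mask], axis=0)
        out[sel] *= global_qc / np.where(batch_qc == 0, 1.0, batch_qc)
    return out

corrected = qc_correct(data, batches, is_qc)
```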

  10. Modelling of Batch Lactic Acid Fermentation in the Presence of Anionic Clay

    PubMed Central

    Jinescu, Cosmin; Aruş, Vasilica Alisa; Nistor, Ileana Denisa

    2014-01-01

    Batch fermentation of milk inoculated with lactic acid bacteria was conducted in the presence of hydrotalcite-type anionic clay under static and ultrasonic conditions. An experimental study of the effect of fermentation temperature (t=38–43 °C), clay/milk ratio (R=1–7.5 g/L) and ultrasonic field (ν=0 and 35 kHz) on process dynamics was performed. A mathematical model was selected to describe the fermentation process kinetics and its parameters were estimated based on experimental data. A good agreement between the experimental and simulated results was achieved. Consequently, the model can be employed to predict the dynamics of batch lactic acid fermentation with values of process variables in the studied ranges. A statistical analysis of the data based on a 2³ factorial experiment was performed in order to express experimental and model-regressed process responses depending on the t, R and ν factors. PMID:27904318

  11. Low-Rank Matrix Recovery Approach for Clutter Rejection in Real-Time IR-UWB Radar-Based Moving Target Detection

    PubMed Central

    Sabushimike, Donatien; Na, Seung You; Kim, Jin Young; Bui, Ngoc Nam; Seo, Kyung Sik; Kim, Gil Gyeom

    2016-01-01

    The detection of a moving target using an IR-UWB radar involves the core task of separating the waves reflected by the static background from those reflected by the moving target. This paper investigates the capacity of the low-rank and sparse matrix decomposition approach to separate the background and the foreground in UWB radar-based moving target detection. Robust PCA models are criticized for being batch-data-oriented, which makes them inconvenient in realistic environments where frames need to be processed as they are recorded in real time. In this paper, a novel method based on overlapping-windows processing is proposed to cope with online processing. The method consists of processing a small batch of frames which is continually updated, without changing its size, as new frames are captured. We prove that RPCA (via its Inexact Augmented Lagrange Multiplier (IALM) model) can successfully separate the two subspaces, which enhances the accuracy of target detection. The overlapping-windows processing method converges to the same optimal solution as its batch counterpart (i.e., processing batched data with RPCA), and both methods demonstrate the robustness and efficiency of RPCA over the classic PCA and the commonly used exponential averaging method. PMID:27598159
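
    Here is a compact sketch of robust PCA via an inexact augmented-Lagrange-multiplier iteration (principal component pursuit), the decomposition family used in the paper; the parameter choices follow common defaults rather than the paper's exact IALM settings:

```python
# Robust PCA (low-rank + sparse) via an IALM-style iteration.
import numpy as np

def shrink(M, tau):
    return np.sign(M) * np.maximum(np.abs(M) - tau, 0.0)

def rpca_ialm(D, n_iter=200, tol=1e-7):
    m, n = D.shape
    lam = 1.0 / np.sqrt(max(m, n))
    mu = 0.25 * m * n / (np.abs(D).sum() + 1e-12)
    L = np.zeros_like(D); S = np.zeros_like(D); Y = np.zeros_like(D)
    for _ in range(n_iter):
        U, sig, Vt = np.linalg.svd(D - S + Y / mu, full_matrices=False)
        L = (U * shrink(sig, 1.0 / mu)) @ Vt        # singular value thresholding
        S = shrink(D - L + Y / mu, lam / mu)        # entrywise shrinkage
        R = D - L - S
        Y += mu * R
        if np.linalg.norm(R) <= tol * np.linalg.norm(D):
            break
    return L, S   # static background (low rank) and moving target (sparse)

frames = np.random.default_rng(3).normal(size=(40, 256))  # 40 stacked radar frames
background, foreground = rpca_ialm(frames)
```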

  12. A short term quality control tool for biodegradable microspheres.

    PubMed

    D'Souza, Susan; Faraj, Jabar A; Dorati, Rossella; DeLuca, Patrick P

    2014-06-01

    Accelerated in vitro release testing methodology has been developed as an indicator of product performance, to be used as a discriminatory quality control (QC) technique for the release of clinical and commercial batches of biodegradable microspheres. While product performance of biodegradable microspheres can be verified by in vivo and/or in vitro experiments, such evaluation can be particularly challenging because of slow polymer degradation, resulting in extended study times, labor, and expense. Three batches of leuprolide poly(lactic-co-glycolic acid) (PLGA) microspheres with varying morphology (process variants having different particle size and specific surface area) were manufactured by the solvent extraction/evaporation technique. Tests involving in vitro release, polymer degradation and hydration of the microspheres were performed on the three batches at 55°C. In vitro peptide release at 55°C was analyzed using a previously derived modification of the Weibull function, termed the modified Weibull equation (MWE). Experimental observations and data analysis confirm excellent reproducibility within and between batches of the microsphere formulations, demonstrating the predictability of the accelerated experiments at 55°C. The accelerated test method was also able to distinguish the in vitro product performance of the three batches having varying morphology (process variants), indicating that it is a suitable QC tool to discriminate product or process variants in clinical or commercial batches of microspheres. Additionally, the MWE was used to further quantify the differences obtained from the accelerated in vitro product performance test between process variants, thereby enhancing the discriminatory power of the accelerated methodology at 55°C.
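
    For illustration, a generic Weibull release function can be fitted to accelerated release data with SciPy; the published modified Weibull equation differs in detail, so the functional form and the data points below are assumptions:

```python
# Fit a generic Weibull release curve to (made-up) accelerated release data.
import numpy as np
from scipy.optimize import curve_fit

def weibull_release(t, f_max, tau, beta):
    # Fraction released: f_max * (1 - exp(-(t/tau)^beta))
    return f_max * (1.0 - np.exp(-(t / tau) ** beta))

t = np.array([0.5, 1, 2, 4, 8, 16, 24, 48])             # hours at 55 °C
released = np.array([8, 15, 27, 45, 66, 84, 91, 98])     # % peptide released
(f_max, tau, beta), _ = curve_fit(weibull_release, t, released, p0=[100, 8, 1])
print(f"plateau {f_max:.0f}%, time scale {tau:.1f} h, shape beta {beta:.2f}")
```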

  13. Missing data and technical variability in single-cell RNA-sequencing experiments.

    PubMed

    Hicks, Stephanie C; Townes, F William; Teng, Mingxiang; Irizarry, Rafael A

    2017-11-06

    Until recently, high-throughput gene expression technologies such as RNA-sequencing (RNA-seq) required hundreds of thousands of cells to produce reliable measurements. Recent technical advances permit genome-wide gene expression measurement at the single-cell level. Single-cell RNA-seq (scRNA-seq) is the most widely used of these approaches, and numerous publications are based on data produced with this technology. However, RNA-seq and scRNA-seq data are markedly different. In particular, unlike RNA-seq, the majority of reported expression levels in scRNA-seq are zeros, which could be either biologically driven (genes not expressing RNA at the time of measurement) or technically driven (genes expressing RNA, but not at a level sufficient to be detected by the sequencing technology). Another difference is that the proportion of genes reporting an expression level of zero varies substantially across single cells compared with RNA-seq samples. However, it remains unclear to what extent this cell-to-cell variation is driven by technical rather than biological variation. Furthermore, while systematic errors, including batch effects, have been widely reported as a major challenge in high-throughput technologies, these issues have received minimal attention in published studies based on scRNA-seq technology. Here, we use an assessment experiment to examine data from published studies and demonstrate that systematic errors can explain a substantial percentage of observed cell-to-cell expression variability. Specifically, we present evidence that some of these reported zeros are driven by technical variation, by demonstrating that scRNA-seq produces more zeros than expected and that this bias is greater for lower-expressed genes. This missing-data problem is exacerbated by the fact that the technical variation varies cell-to-cell. We then show how this technical cell-to-cell variability can be confused with novel biological results. Finally, we demonstrate and discuss how batch effects and confounded experiments can intensify the problem. © The Author 2017. Published by Oxford University Press. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.

  14. Combining microwave resonance technology to multivariate data analysis as a novel PAT tool to improve process understanding in fluid bed granulation.

    PubMed

    Lourenço, Vera; Herdling, Thorsten; Reich, Gabriele; Menezes, José C; Lochmann, Dirk

    2011-08-01

    A set of 192 industrial-scale fluid bed granulation batches was monitored in-line using microwave resonance technology (MRT) to determine the moisture, temperature and density of the granules. Multivariate data analysis techniques such as multiway partial least squares (PLS), multiway principal component analysis (PCA) and multivariate batch control charts were applied to the collected batch data sets. The combination of all these techniques, along with off-line particle size measurements, led to significantly increased process understanding. A seasonality effect was evidenced that impacted further processing through its influence on the final granule size. Moreover, a PLS model demonstrated that a quantitative relation between the particle size and the MRT measurements can be defined, highlighting the potential of the MRT sensor to predict information about the final granule size. This study has contributed to improving a fluid bed granulation process, and the process knowledge obtained shows that product quality can be built in by process design, following Quality by Design (QbD) and Process Analytical Technology (PAT) principles. Copyright © 2011. Published by Elsevier B.V.
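
    A hedged sketch of the last step described above, relating unfolded in-line MRT features to final granule size with a PLS model; the data, dimensions and train/test split are invented:

```python
# PLS regression from (synthetic) MRT trajectories to final granule size.
import numpy as np
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(4)
# 192 batches x 60 unfolded MRT features (moisture/temperature/density curves)
X = rng.normal(size=(192, 60))
d50 = X[:, :5].sum(axis=1) * 3.0 + 150 + rng.normal(0, 2, 192)  # final size, µm

pls = PLSRegression(n_components=4).fit(X[:150], d50[:150])
pred = pls.predict(X[150:]).ravel()
rmsep = float(np.sqrt(np.mean((pred - d50[150:]) ** 2)))
print(f"RMSEP on held-out batches: {rmsep:.2f} µm")
```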

  15. The influence of data-driven versus conceptually-driven processing on the development of PTSD-like symptoms.

    PubMed

    Kindt, Merel; van den Hout, Marcel; Arntz, Arnoud; Drost, Jolijn

    2008-12-01

    Ehlers and Clark [(2000). A cognitive model of posttraumatic stress disorder. Behaviour Research and Therapy, 38, 319-345] propose that a predominance of data-driven processing during the trauma predicts subsequent PTSD. We wondered whether, apart from data-driven encoding, sustained data-driven processing after the trauma is also crucial for the development of PTSD. Both hypotheses were tested in two analogue experiments. Experiment 1 demonstrated that relative to conceptually-driven processing (n=20), data-driven processing after the film (n=14), resulted in more intrusions. Experiment 2 demonstrated that relative to the neutral condition (n=24) and the data-driven encoding condition (n=24), conceptual encoding (n=25) reduced suppression of intrusions and a trend emerged for memory fragmentation. The difference between the two encoding styles was due to the beneficial effect of induced conceptual encoding and not to the detrimental effect of data-driven encoding. The data support the viability of the distinction between data-driven/conceptually-driven processing for the understanding of the development of PTSD.

  16. Integrated approaches to the application of advanced modeling technology in process development and optimization

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Allgor, R.J.; Feehery, W.F.; Tolsma, J.E.

    The batch process development problem serves as a good candidate to guide the development of process modeling environments. It demonstrates that very robust numerical techniques are required within an environment that can collect, organize, and maintain the data and models needed to address the batch process development problem. This paper focuses on improving the robustness and efficiency of the numerical algorithms required in such a modeling environment through the development of hybrid numerical and symbolic strategies.

  17. B827 Chemical Synthesis Project - Industrial Control System Integration - Statement of Work & Specification with Attachments 1-14

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wade, F. E.

    The Chemical Synthesis Pilot Process at the Lawrence Livermore National Laboratory (LLNL) Site 300 827 Complex will be used to synthesize small quantities of material to support research and development. The project will modernize and increase current capabilities for chemical synthesis at LLNL. The primary objective of this project is the conversion of a non-automated hands-on process to a remote-operation process, while providing enhanced batch process step control, stored recipe-specific parameter sets, process variable visibility, monitoring, alarm and warning handling, and comprehensive batch record data logging. This Statement of Work and Specification provides the industrial-grade process control requirements for the chemical synthesis batching control system, hereafter referred to as the “Control System”, to be delivered by the System Integrator.

  18. A preliminary evaluation of a reusable digital sterilization indicator prototype.

    PubMed

    Puttaiah, R; Griggs, J; D'Onofrio, M

    2014-09-01

    Critical and semicritical instruments used in patient care must undergo a terminal process of sterilization. The use of chemical and physical indicators is important in providing information on the sterilizer's performance during each cycle, and regular, periodic monitoring of sterilizers using biological indicators is necessary to periodically validate sterilizer performance. Data loggers, or independent digital parametric indicators, are innovative devices that provide more information than the various classes of chemical indicators. In this study we evaluated a prototype of an independent digital parametric indicator for use in autoclaves. The purpose of this study was to evaluate the performance of an independent digital indicator/data logger prototype (DS1922F) that could be used for multiple cycles within an autoclave. Materials and methods: Three batches of the DS1922F (150 samples) were used in this study, which was conducted in series. The first batch was challenged with 300 sterilization cycles within an autoclave, and the data loggers were evaluated to study failures and their causes, make corrections and improve the prototype design. After changes based on the first batch, a second batch of the prototype (150 samples) was again challenged with 300 sterilization cycles within an autoclave, and the failures were studied to further improve the prototype. The final (third) batch of the prototype (150 samples) was challenged with 600 cycles to see how long the devices would last. Kaplan-Meier survival analyses of all three batches were conducted (α = 0.05) and failed samples were studied qualitatively to understand the variables involved in prototype failure and to improve quality. Each tested batch provided crucial information on device failure and helped in improving the prototype. The mean lifetime survival of the final batch (Batch 3) was 498 (480, 516) sterilization cycles in an autoclave. In this study, the final batch of the DS1922F prototype data logger was found to be robust in withstanding the challenge of 600 autoclave cycles, with a mean lifetime of more than 450 cycles, several times the prescribed number of cycles. Instrument reprocessing is among the most important aspects of infection control. While stringent procedures are followed in instrument reprocessing within the clinic to assure patient safety, regular use of sterilization process indicators and periodic biological validation of the sterilizer's performance are necessary. Chemical indicators for use in autoclaves provide information on whether a particular cycle's parameters were achieved, but not the specific point in time or temperature at which a failure occurred. Data loggers and the associated reader software, such as the prototype tested in this evaluation (DS1922F), are designed to provide continuous information on the time and temperature of the prescribed cycle, and provide immediate information on the process, as opposed to biological indicators, which take days to a week to yield a confirmatory result. Further, many countries do not have the sterilization monitoring service infrastructure to meet the demands of end users; in the absence of such services, the use of digital data loggers for each sterilization cycle is more pragmatic.
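
    A minimal Kaplan-Meier sketch using the lifelines package mirrors the survival analysis applied to the data-logger batches; the cycle counts and censoring pattern below are made up:

```python
# Kaplan-Meier estimate of data-logger lifetime in autoclave cycles.
import numpy as np
from lifelines import KaplanMeierFitter

cycles = np.array([310, 420, 455, 480, 500, 516, 540, 600, 600, 600])  # cycles survived
failed = np.array([1, 1, 1, 1, 1, 1, 1, 0, 0, 0])  # 0 = still working at test end (censored)

kmf = KaplanMeierFitter()
kmf.fit(durations=cycles, event_observed=failed, label="prototype batch")
print(kmf.median_survival_time_)       # median lifetime in cycles
print(kmf.survival_function_.tail())   # estimated survival curve
```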

  19. Image data-processing system for solar astronomy

    NASA Technical Reports Server (NTRS)

    Wilson, R. M.; Teuber, D. L.; Watkins, J. R.; Thomas, D. T.; Cooper, C. M.

    1977-01-01

    The paper describes an image data processing system (IDAPS), its hardware/software configuration, and interactive and batch modes of operation for the analysis of the Skylab/Apollo Telescope Mount S056 X-Ray Telescope experiment data. Interactive IDAPS is primarily designed to provide on-line interactive user control of image processing operations for image familiarization, sequence and parameter optimization, and selective feature extraction and analysis. Batch IDAPS follows the normal conventions of card control and data input and output, and is best suited where the desired parameters and sequence of operations are known and when long image-processing times are required. Particular attention is given to the way in which this system has been used in solar astronomy and other investigations. Some recent results obtained by means of IDAPS are presented.

  20. Transfer of a three step mAb chromatography process from batch to continuous: Optimizing productivity to minimize consumable requirements.

    PubMed

    Gjoka, Xhorxhi; Gantier, Rene; Schofield, Mark

    2017-01-20

    The goal of this study was to adapt a batch mAb purification chromatography platform for continuous operation. The experiments and rationale used to convert from batch to continuous operation are described. Experimental data were used to design chromatography methods for continuous operation that would exceed the thresholds for critical quality attributes and minimize the consumables required compared with the batch mode of operation. Four unit operations, comprising Protein A capture, viral inactivation, flow-through anion exchange (AEX), and mixed-mode cation exchange chromatography (MMCEX), were integrated across two Cadence BioSMB PD multi-column chromatography systems in order to process a 25 L volume of harvested cell culture fluid (HCCF) in less than 12 h. Transfer from batch to continuous increased the productivity of the Protein A step from 13 to 50 g/L/h and of the MMCEX step from 10 to 60 g/L/h, with no impact on purification performance in terms of contaminant removal (4.5 log reduction of host cell proteins, 50% reduction in soluble product aggregates) and an overall chromatography process recovery yield of 75%. The increase in productivity, combined with continuous operation, reduced the resin volume required for Protein A and MMCEX chromatography by more than 95% compared to batch. The volume of AEX membrane required for flow-through operation was reduced by 74%. Moreover, the continuous process required 44% less buffer than an equivalent batch process. This significant reduction in consumables enables cost-effective, disposable, single-use manufacturing. Copyright © 2016 The Authors. Published by Elsevier B.V. All rights reserved.

  1. Unobtrusive integration of data management with fMRI analysis.

    PubMed

    Poliakov, Andrew V; Hertzenberg, Xenia; Moore, Eider B; Corina, David P; Ojemann, George A; Brinkley, James F

    2007-01-01

    This note describes a software utility, called X-batch, which addresses two pressing issues typically faced by functional magnetic resonance imaging (fMRI) neuroimaging laboratories: (1) analysis automation and (2) data management. The first issue is addressed by providing a simple batch-mode processing tool for the popular SPM software package (http://www.fil.ion.ucl.ac.uk/spm/; Wellcome Department of Imaging Neuroscience, London, UK). The second is addressed by transparently recording metadata describing all aspects of the batch job (e.g., subject demographics, analysis parameters, locations and names of created files, date and time of analysis, and so on). These metadata are recorded as instances of an extended version of the Protégé-based Experiment Lab Book ontology created by the Dartmouth fMRI Data Center. The resulting instantiated ontology provides a detailed record of all fMRI analyses performed, and as such can be part of larger systems for neuroimaging data management, sharing, and visualization. The X-batch system is in use in our own fMRI research, and is available for download at http://X-batch.sourceforge.net/.

  2. Pulse-Flow Microencapsulation System

    NASA Technical Reports Server (NTRS)

    Morrison, Dennis R.

    2006-01-01

    The pulse-flow microencapsulation system (PFMS) is an automated system that continuously produces a stream of liquid-filled microcapsules for delivery of therapeutic agents to target tissues. Prior microencapsulation systems have relied on batch processes that involve transfer of batches between different apparatuses for different stages of production followed by sampling for acquisition of quality-control data, including measurements of size. In contrast, the PFMS is a single, microprocessor-controlled system that performs all processing steps, including acquisition of quality-control data. The quality-control data can be used as real-time feedback to ensure the production of large quantities of uniform microcapsules.

  3. Near infrared spectroscopy based monitoring of extraction processes of raw material with the help of dynamic predictive modeling

    NASA Astrophysics Data System (ADS)

    Wang, Haixia; Suo, Tongchuan; Wu, Xiaolin; Zhang, Yue; Wang, Chunhua; Yu, Heshui; Li, Zheng

    2018-03-01

    The control of batch-to-batch quality variation remains a challenging task for pharmaceutical industries such as traditional Chinese medicine (TCM) manufacturing. One difficult problem is producing pharmaceutical products of consistent quality from raw material with large quality variations. In this paper, an integrated methodology combining near infrared spectroscopy (NIRS) and dynamic predictive modeling is developed for the monitoring and control of the batch extraction process of licorice. With the spectral data in hand, the initial state of the process is first estimated with a state-space model to construct a process monitoring strategy for the early detection of variations induced by the initial process inputs, such as raw materials. Secondly, the quality property of the end product is predicted at mid-course during the extraction process with a partial least squares (PLS) model. The batch-end-time (BET) is then adjusted accordingly to minimize the quality variations. In conclusion, our study shows that, with the help of dynamic predictive modeling, NIRS can offer past and future information about the process, enabling more accurate monitoring and control of process performance and product quality.
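
    As a sketch of the state-space step described above, a scalar Kalman filter can estimate a batch-specific, initial-state-driven process variable from noisy spectroscopy-derived measurements; the model coefficients and noise levels are assumptions:

```python
# Scalar Kalman filter tracking a decaying state from noisy measurements.
import numpy as np

a, c = 0.98, 1.0          # state transition and observation coefficients
q, r = 1e-4, 1e-2         # process and measurement noise variances

def kalman(z, x0=0.0, p0=1.0):
    x, p, xs = x0, p0, []
    for zk in z:
        x, p = a * x, a * p * a + q                    # predict
        k = p * c / (c * p * c + r)                    # Kalman gain
        x, p = x + k * (zk - c * x), (1 - k * c) * p   # update
        xs.append(x)
    return np.array(xs)

rng = np.random.default_rng(5)
true0 = 0.8                                # batch-specific initial state
truth = true0 * a ** np.arange(100)
estimates = kalman(truth + rng.normal(0, 0.1, 100))
print(f"estimated early state: {estimates[10]:.3f} (true {truth[10]:.3f})")
```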

  4. Method for automatically evaluating a transition from a batch manufacturing technique to a lean manufacturing technique

    DOEpatents

    Ivezic, Nenad; Potok, Thomas E.

    2003-09-30

    A method for automatically evaluating a manufacturing technique comprises the steps of: receiving from a user manufacturing process step parameters characterizing a manufacturing process; accepting from the user a selection for an analysis of a particular lean manufacturing technique; automatically compiling process step data for each process step in the manufacturing process; automatically calculating process metrics from a summation of the compiled process step data for each process step; and, presenting the automatically calculated process metrics to the user. A method for evaluating a transition from a batch manufacturing technique to a lean manufacturing technique can comprise the steps of: collecting manufacturing process step characterization parameters; selecting a lean manufacturing technique for analysis; communicating the selected lean manufacturing technique and the manufacturing process step characterization parameters to an automatic manufacturing technique evaluation engine having a mathematical model for generating manufacturing technique evaluation data; and, using the lean manufacturing technique evaluation data to determine whether to transition from an existing manufacturing technique to the selected lean manufacturing technique.

  5. Modeling of the pyruvate production with Escherichia coli: comparison of mechanistic and neural networks-based models.

    PubMed

    Zelić, B; Bolf, N; Vasić-Racki, D

    2006-06-01

    Three different models: the unstructured mechanistic black-box model, the input-output neural network-based model and the externally recurrent neural network model were used to describe the pyruvate production process from glucose and acetate using the genetically modified Escherichia coli YYC202 ldhA::Kan strain. The experimental data were taken from the recently described batch and fed-batch experiments [ Zelić B, Study of the process development for Escherichia coli-based pyruvate production. PhD Thesis, University of Zagreb, Faculty of Chemical Engineering and Technology, Zagreb, Croatia, July 2003. (In English); Zelić et al. Bioproc Biosyst Eng 26:249-258 (2004); Zelić et al. Eng Life Sci 3:299-305 (2003); Zelić et al Biotechnol Bioeng 85:638-646 (2004)]. The neural networks were built from the experimental data obtained in the fed-batch pyruvate production experiments with a constant glucose feed rate. The model validation was performed using the experimental results obtained from the batch and fed-batch pyruvate production experiments with a constant acetate feed rate. The dynamics of the substrate and product concentration changes were estimated using two neural network-based models for biomass and pyruvate. It was shown that neural networks can be used for the modeling of complex microbial fermentation processes, even in conditions in which mechanistic unstructured models cannot be applied.

  6. Hadoop distributed batch processing for Gaia: a success story

    NASA Astrophysics Data System (ADS)

    Riello, Marco

    2015-12-01

    The DPAC Cambridge Data Processing Centre (DPCI) is responsible for the photometric calibration of the Gaia data, including the low-resolution spectra. The large data volume produced by Gaia (~26 billion transits/year), the complexity of its data stream and the self-calibrating approach pose unique challenges for the scalability, reliability and robustness of both the software pipelines and the operations infrastructure. DPCI was the first in DPAC to realise the potential of Hadoop and MapReduce and to adopt them as the core technologies for its infrastructure. This has proven to be a winning choice, giving DPCI unmatched processing throughput and reliability within DPAC, to the point that other DPCs have started following in our footsteps. In this talk we will present the software infrastructure developed to build the distributed and scalable batch data processing system that is currently used in production at DPCI, and the excellent performance results of the system.

  7. A GPU-Accelerated Approach for Feature Tracking in Time-Varying Imagery Datasets.

    PubMed

    Peng, Chao; Sahani, Sandip; Rushing, John

    2017-10-01

    We propose a novel parallel connected component labeling (CCL) algorithm along with efficient out-of-core data management to detect and track feature regions of large time-varying imagery datasets. Our approach contributes to the big data field with parallel algorithms tailored for GPU architectures. We remove the data dependency between frames and achieve pixel-level parallelism. Due to the large size, the entire dataset cannot fit into cached memory. Frames have to be streamed through the memory hierarchy (disk to CPU main memory and then to GPU memory), partitioned, and processed as batches, where each batch is small enough to fit into the GPU. To reconnect the feature regions that are separated due to data partitioning, we present a novel batch merging algorithm to extract the region connection information across multiple batches in a parallel fashion. The information is organized in a memory-efficient structure and supports fast indexing on the GPU. Our experiment uses a commodity workstation equipped with a single GPU. The results show that our approach can efficiently process a weather dataset composed of terabytes of time-varying radar images. The advantages of our approach are demonstrated by comparing to the performance of an efficient CPU cluster implementation which is being used by the weather scientists.
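
    For reference, here is the sequential two-pass, union-find form of connected component labeling that the paper parallelizes on the GPU (the pixel-parallel kernels and the batch-merging step across partitions are not shown); the thresholded frame is synthetic:

```python
# Two-pass connected component labeling with union-find (4-connectivity).
import numpy as np

def label_components(mask):
    parent = {}
    def find(a):
        while parent[a] != a:
            parent[a] = parent[parent[a]]; a = parent[a]   # path halving
        return a
    def union(a, b):
        ra, rb = find(a), find(b)
        if ra != rb: parent[rb] = ra

    labels = np.zeros(mask.shape, dtype=int); nxt = 1
    for i, j in np.argwhere(mask):                 # first pass: provisional labels
        up = labels[i - 1, j] if i > 0 and mask[i - 1, j] else 0
        left = labels[i, j - 1] if j > 0 and mask[i, j - 1] else 0
        if not up and not left:
            labels[i, j] = nxt; parent[nxt] = nxt; nxt += 1
        else:
            labels[i, j] = up or left
            if up and left: union(up, left)
    for i, j in np.argwhere(mask):                 # second pass: resolve equivalences
        labels[i, j] = find(labels[i, j])
    return labels

frame = np.random.default_rng(6).random((64, 64)) > 0.7   # thresholded radar frame
print(len(np.unique(label_components(frame))) - 1, "feature regions")
```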

  8. 40 CFR 63.1406 - Reactor batch process vent provisions.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 40 Protection of Environment 11 2011-07-01 2011-07-01 false Reactor batch process vent provisions... § 63.1406 Reactor batch process vent provisions. (a) Emission standards. Owners or operators of reactor... reactor batch process vent located at a new affected source shall control organic HAP emissions by...

  9. 40 CFR 63.1406 - Reactor batch process vent provisions.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 40 Protection of Environment 11 2010-07-01 2010-07-01 true Reactor batch process vent provisions... § 63.1406 Reactor batch process vent provisions. (a) Emission standards. Owners or operators of reactor... reactor batch process vent located at a new affected source shall control organic HAP emissions by...

  10. 40 CFR 63.1322 - Batch process vents-reference control technology.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 40 Protection of Environment 12 2013-07-01 2013-07-01 false Batch process vents-reference control... (CONTINUED) National Emission Standards for Hazardous Air Pollutant Emissions: Group IV Polymers and Resins § 63.1322 Batch process vents—reference control technology. (a) Batch process vents. The owner or...

  11. 40 CFR 63.1322 - Batch process vents-reference control technology.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 40 Protection of Environment 12 2012-07-01 2011-07-01 true Batch process vents-reference control technology. 63.1322 Section 63.1322 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR... Batch process vents—reference control technology. (a) Batch process vents. The owner or operator of a...

  12. 40 CFR 63.1322 - Batch process vents-reference control technology.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 40 Protection of Environment 12 2014-07-01 2014-07-01 false Batch process vents-reference control technology. 63.1322 Section 63.1322 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR... § 63.1322 Batch process vents—reference control technology. (a) Batch process vents. The owner or...

  13. 40 CFR 63.1322 - Batch process vents-reference control technology.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 40 Protection of Environment 11 2011-07-01 2011-07-01 false Batch process vents-reference control technology. 63.1322 Section 63.1322 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR... Batch process vents—reference control technology. (a) Batch process vents. The owner or operator of a...

  14. 40 CFR 63.1322 - Batch process vents-reference control technology.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 40 Protection of Environment 11 2010-07-01 2010-07-01 true Batch process vents-reference control technology. 63.1322 Section 63.1322 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR... Batch process vents—reference control technology. (a) Batch process vents. The owner or operator of a...

  15. Actual Waste Demonstration of the Nitric-Glycolic Flowsheet for Sludge Batch 9 Qualification

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    J. D. Newell; Pareizs, J. M.; Martino, C. J.

    For each sludge batch that is processed in the Defense Waste Processing Facility (DWPF), the Savannah River National Laboratory (SRNL) performs qualification testing to demonstrate that the sludge batch is processable. Testing performed by the Savannah River National Laboratory has shown glycolic acid to be effective in replacing the function of formic acid in the DWPF chemical process. The nitric-glycolic flowsheet reduces mercury, significantly lowers the catalytic generation of hydrogen and ammonia which could allow purge reduction in the Sludge Receipt and Adjustment Tank (SRAT), stabilizes the pH and chemistry in the SRAT and the Slurry Mix Evaporator (SME), allows for effective rheology adjustment, and is favorable with respect to melter flammability. In order to implement the new flowsheet, SRAT and SME cycles, designated SC-18, were performed using a Sludge Batch (SB) 9 slurry blended from SB8 Tank 40H and Tank 51H samples. The SRAT cycle involved adding nitric and glycolic acids to the sludge, refluxing to steam strip mercury, and dewatering to a targeted solids concentration. Data collected during the SRAT cycle included offgas analyses, process temperatures, heat transfer, and pH measurements. The SME cycle demonstrated the addition of glass frit and the replication of six canister decontamination additions. The demonstration concluded with dewatering to a targeted solids concentration. Data collected during the SME cycle included offgas analyses, process temperatures, heat transfer, and pH measurements. Slurry and condensate samples were collected for subsequent analysis.

  16. Kinetic studies on batch cultivation of Trichoderma reesei and application to enhance cellulase production by fed-batch fermentation.

    PubMed

    Ma, Lijuan; Li, Chen; Yang, Zhenhua; Jia, Wendi; Zhang, Dongyuan; Chen, Shulin

    2013-07-20

    Reducing the production cost of cellulase, the key enzyme for cellulose hydrolysis to fermentable sugars, remains a major challenge for biofuel production. Because of the complexity of cellulase production, kinetic modeling and mass balance calculation can be used as effective tools for process design and optimization. In this study, kinetic models for cell growth, substrate consumption and cellulase production in batch fermentation were developed and then applied in fed-batch fermentation to enhance cellulase production. The inhibitory effect of the substrate was considered, and a modified Luedeking-Piret model was developed for cellulase production and substrate consumption according to the growth characteristics of Trichoderma reesei. The model predictions fit well with the experimental data. Simulation results showed that a higher initial substrate concentration led to a decrease in the cellulase production rate. Mass balance and kinetic simulation results were applied to determine the feeding strategy. Cellulase production and its corresponding productivity increased by 82.13% after employing the proper feeding strategy in fed-batch fermentation. This method, combining mathematics and chemometrics through kinetic modeling and mass balance, can not only improve the cellulase fermentation process but also help to better understand it. The model development can also provide insight into other similar fermentation processes. Copyright © 2013 Elsevier B.V. All rights reserved.
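
    A hedged sketch of the Luedeking-Piret structure named above, with logistic biomass growth standing in for the paper's full modified model: product formation has a growth-associated (alpha) and a non-growth-associated (beta) term. All parameter values are invented:

```python
# Logistic growth + Luedeking-Piret product formation, integrated with SciPy.
import numpy as np
from scipy.integrate import solve_ivp

mu_max, X_max = 0.25, 30.0           # 1/h, g/L (assumed)
alpha, beta = 1.1, 0.02              # growth- and non-growth-associated terms

def rhs(t, z):
    X, P = z
    dX = mu_max * X * (1 - X / X_max)          # logistic biomass growth
    dP = alpha * dX + beta * X                 # Luedeking-Piret production
    return [dX, dP]

sol = solve_ivp(rhs, (0, 60), [0.5, 0.0], t_eval=np.linspace(0, 60, 7))
for t, X, P in zip(sol.t, *sol.y):
    print(f"t={t:5.1f} h  biomass={X:5.2f} g/L  product={P:5.2f} g/L")
```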

  17. Real-time imaging as an emerging process analytical technology tool for monitoring of fluid bed coating process.

    PubMed

    Naidu, Venkata Ramana; Deshpande, Rucha S; Syed, Moinuddin R; Wakte, Pravin S

    2018-07-01

    A direct imaging system (Eyecon™) was used as a Process Analytical Technology (PAT) tool to monitor a fluid bed coating process. Eyecon™ generated real-time on-screen images and particle size and shape information for two identically manufactured laboratory-scale batches. Eyecon™ measures particle size increases with an accuracy of ±1 μm on particles in the 50-3000 μm size range, and captured data every 2 s during the entire process. The moving average of the D90 particle size values recorded by Eyecon™ was calculated every 30 min to compute the radial coating thickness of the coated particles. After completion of the coating process, the radial coating thickness was found to be 11.3 and 9.11 μm, with standard deviations of ±0.68 and ±1.8 μm, for Batch 1 and Batch 2, respectively. The coating thickness was also correlated with percent weight build-up by gel permeation chromatography (GPC) and dissolution. GPC indicated weight build-ups of 10.6% and 9.27% for Batch 1 and Batch 2, respectively. In conclusion, a weight build-up of 10% can be correlated with a 10 ± 2 μm increase in the coating thickness of pellets, indicating the potential applicability of real-time imaging as an endpoint-determination tool for the fluid bed coating process.
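
    The 30-minute moving average of D90 used above is straightforward to reproduce in outline; in this sketch only the 2 s sampling interval is taken from the abstract, while the data, drift and thickness conversion (radial growth = ΔD90/2) are illustrative assumptions:

```python
# Rolling 30-min mean of D90 and an implied radial coating thickness.
import numpy as np
import pandas as pd

n = 3 * 3600 // 2                                     # 3 h of data, one point per 2 s
t = pd.date_range("2018-01-01", periods=n, freq="2s")
d90 = pd.Series(900 + 0.002 * np.arange(n) + np.random.default_rng(7).normal(0, 3, n),
                index=t, name="D90_um")

d90_30min = d90.rolling("30min").mean()               # time-based moving average
coating_thickness = (d90_30min - d90_30min.iloc[0]) / 2   # radial growth = ΔD90 / 2
print(f"final radial coating thickness ~ {coating_thickness.iloc[-1]:.1f} µm")
```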

  18. Identifying and mitigating batch effects in whole genome sequencing data.

    PubMed

    Tom, Jennifer A; Reeder, Jens; Forrest, William F; Graham, Robert R; Hunkapiller, Julie; Behrens, Timothy W; Bhangale, Tushar R

    2017-07-24

    Large sample sets of whole genome sequencing with deep coverage are being generated; however, assembling datasets from different sources inevitably introduces batch effects. These batch effects are not well understood and can be due to changes in the sequencing protocol or in the bioinformatics tools used to process the data. No systematic algorithms or heuristics exist to detect and filter batch effects or to remove associations impacted by batch effects in whole genome sequencing data. We describe key quality metrics, provide a freely available software package to compute them, and demonstrate that identification of batch effects is aided by principal components analysis of these metrics. To mitigate batch effects, we developed new site-specific filters that identified and removed variants falsely associated with the phenotype due to batch effects. These include filtering based on: a haplotype-based genotype correction, a differential genotype-quality test, and removal of sites with a missing genotype rate greater than 30% after setting genotypes with quality scores less than 20 to missing. This method removed 96.1% of unconfirmed genome-wide significant SNP associations and 97.6% of unconfirmed genome-wide significant indel associations. We performed analyses to demonstrate that: 1) these filters impacted variants known to be disease associated, as 2 out of 16 confirmed associations in an AMD candidate-SNP analysis were filtered, representing a reduction in power of 12.5%; 2) in the absence of batch effects, these filters removed only a small proportion of variants across the genome (type I error rate of 3%); and 3) in an independent dataset, the method removed 90.2% of unconfirmed genome-wide SNP associations and 89.8% of unconfirmed genome-wide indel associations. Researchers currently do not have effective tools to identify and mitigate batch effects in whole genome sequencing data. We developed and validated methods and filters to address this deficiency.
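
    Two of the site filters described above are easy to sketch with NumPy: genotypes with quality below 20 are set to missing, then sites with more than 30% missingness are dropped. The array layout and all values are assumptions:

```python
# Genotype-quality masking followed by a per-site missingness filter.
import numpy as np

rng = np.random.default_rng(8)
gt = rng.integers(0, 3, size=(1000, 500)).astype(float)   # sites x samples, 0/1/2 dosages
gq = rng.integers(0, 60, size=(1000, 500))                # genotype quality scores

gt[gq < 20] = np.nan                         # low-quality genotypes -> missing
missing_rate = np.isnan(gt).mean(axis=1)     # per-site missingness
kept = gt[missing_rate <= 0.30]
print(f"{kept.shape[0]} of {gt.shape[0]} sites pass the missingness filter")
```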

  19. Enzyme recycle and fed-batch addition for high-productivity soybean flour processing to produce enriched soy protein and concentrated hydrolysate of fermentable sugars.

    PubMed

    Loman, Abdullah Al; Islam, S M Mahfuzul; Li, Qian; Ju, Lu-Kwang

    2017-10-01

    Despite its high protein and carbohydrate content, soybean flour has to date been limited to partial replacement of animal feed. Enzymatic processing can be exploited to increase its value by enriching the protein content and separating the carbohydrate for use as a fermentation feedstock. Enzyme hydrolysis with fed-batch and recycle designs was evaluated here for achieving this goal with high productivities. The fed-batch process improved carbohydrate conversion, particularly at high substrate loadings of 250-375 g/L. In the recycle process, the hydrolysate retained a significant portion of the limiting enzyme α-galactosidase, accelerating the carbohydrate monomerization rate. At a single-pass retention time of 6 h and a recycle rate of 62.5%, the reducing sugar concentration reached up to 120 g/L using 4 ml/g enzyme. When compared with the batch and fed-batch processes, the recycle process increased the volumetric productivity of reducing sugar by 36% (vs. fed-batch) to 57% (vs. batch) and that of the protein product by 280% (vs. fed-batch) to 300% (vs. batch). Copyright © 2017 Elsevier Ltd. All rights reserved.

  20. 40 CFR 63.1407 - Non-reactor batch process vent provisions.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 40 Protection of Environment 11 2010-07-01 2010-07-01 true Non-reactor batch process vent... § 63.1407 Non-reactor batch process vent provisions. (a) Emission standards. (1) Owners or operators of non-reactor batch process vents located at new or existing affected sources with 0.25 tons per year (0...

  1. 40 CFR 63.1407 - Non-reactor batch process vent provisions.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 40 Protection of Environment 11 2011-07-01 2011-07-01 false Non-reactor batch process vent... § 63.1407 Non-reactor batch process vent provisions. (a) Emission standards. (1) Owners or operators of non-reactor batch process vents located at new or existing affected sources with 0.25 tons per year (0...

  2. 40 CFR 63.487 - Batch front-end process vents-reference control technology.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... § 63.487 Batch front-end process vents—reference control technology. (a) Batch front-end process vents... 40 Protection of Environment 9 2010-07-01 2010-07-01 false Batch front-end process vents-reference control technology. 63.487 Section 63.487 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY...

  3. A Model-based B2B (Batch to Batch) Control for An Industrial Batch Polymerization Process

    NASA Astrophysics Data System (ADS)

    Ogawa, Morimasa

    This paper gives an overview of a model-based B2B (batch-to-batch) control for an industrial batch polymerization process. In order to control the reaction temperature precisely, several methods based on a rigorous process dynamics model are employed at all design stages of the B2B control, such as modeling and parameter estimation of the reaction kinetics, which is an important part of the process dynamics model. The designed B2B control consists of gain-scheduled I-PD/II2-PD control (I-PD with double integral control), feed-forward compensation at the batch start time, and model adaptation utilizing the results of the last batch operation. Throughout actual batch operations, the B2B control provided superior control performance compared with conventional control methods.

  4. Feasibility of using continuous chromatography in downstream processing: Comparison of costs and product quality for a hybrid process vs. a conventional batch process.

    PubMed

    Ötes, Ozan; Flato, Hendrik; Winderl, Johannes; Hubbuch, Jürgen; Capito, Florian

    2017-10-10

    The protein A capture step is the main cost driver in downstream processing, with high attrition costs especially when protein A resin is not used to the end of its lifetime. Here we describe a feasibility study transferring a batch downstream process to a hybrid process, aimed at replacing batch protein A capture chromatography with a continuous capture step while leaving the polishing steps unchanged, to minimize the process adaptations required relative to a batch process. 35 g of antibody were purified using the hybrid approach, resulting in product quality and step yield comparable to the batch process. Productivity for the protein A step could be increased by up to 420%, buffer consumption was reduced by 30-40%, and the process ran robustly for at least 48 h of continuous operation. Additionally, to enable potential application in a clinical trial manufacturing environment, the cost of goods for the protein A step was compared between the hybrid and batch processes, showing a threefold (300%) cost reduction, depending on processed volumes and batch cycles. Copyright © 2017 Elsevier B.V. All rights reserved.

  5. Verification Of The Defense Waste Processing Facility's (DWPF) Process Digestion Methods For The Sludge Batch 8 Qualification Sample

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Click, D. R.; Edwards, T. B.; Wiedenman, B. J.

    2013-03-18

    This report contains the results and comparison of data generated from inductively coupled plasma – atomic emission spectroscopy (ICP-AES) analysis of Aqua Regia (AR), Sodium Peroxide/Sodium Hydroxide Fusion Dissolution (PF) and Cold Chem (CC) method digestions and Cold Vapor Atomic Absorption analysis of Hg digestions from the DWPF Hg digestion method of Sludge Batch 8 (SB8) Sludge Receipt and Adjustment Tank (SRAT) Receipt and SB8 SRAT Product samples. The SB8 SRAT Receipt and SB8 SRAT Product samples were prepared in the SRNL Shielded Cells, and the SRAT Receipt material is representative of the sludge that constitutes the SB8 Batch or qualification composition. This is the sludge in Tank 51 that is to be transferred into Tank 40, which will contain the heel of Sludge Batch 7b (SB7b), to form the SB8 Blend composition.

  6. Effects of region, demography, and protection from fishing on batch fecundity of common coral trout ( Plectropomus leopardus)

    NASA Astrophysics Data System (ADS)

    Carter, Alex B.; Davies, Campbell R.; Mapstone, Bruce D.; Russ, Garry R.; Tobin, Andrew J.; Williams, Ashley J.

    2014-09-01

    Batch fecundity of female Plectropomus leopardus, a coral reef fish targeted by commercial and recreational fishing, was compared between reefs open to fishing and reefs within no-take marine reserves within three regions of the Great Barrier Reef (GBR), Australia. Length, weight, and age had positive effects on batch fecundity of spawners from northern and central reefs but negligible effects on spawners from southern reefs. Females were least fecund for a given length, weight, and age in the southern GBR. Batch fecundity of a 500-mm fork length female was 430 % greater on central reefs and 207 % greater on northern reefs than on southern reefs. The effects of length and age on batch fecundity did not differ significantly between reserve and fished reefs in any region, but weight-specific fecundity was 100 % greater for large 2.0 kg females on reserve reefs compared with fished reefs in the central GBR. We hypothesize that regional variation in batch fecundity is likely driven by water temperature and prey availability. Significant regional variation in batch fecundity highlights the need for understanding spatial variation in reproductive output where single conservation or fishery management strategies cover large, potentially diverse, spatial scales.

  7. 40 CFR 63.1326 - Batch process vents-recordkeeping provisions.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... (CONTINUED) National Emission Standards for Hazardous Air Pollutant Emissions: Group IV Polymers and Resins § 63.1326 Batch process vents—recordkeeping provisions. (a) Group determination records for batch... this section for each batch process vent subject to the group determination procedures of § 63.1323...

  8. 40 CFR 63.1321 - Batch process vents provisions.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... Standards for Hazardous Air Pollutant Emissions: Group IV Polymers and Resins § 63.1321 Batch process vents..., owners and operators of new and existing affected sources with batch process vents shall comply with the... applicable reporting requirements in § 63.1327. (b) New SAN batch affected sources. Owners and operators of...

  9. Hierarchical Bayesian models to assess between- and within-batch variability of pathogen contamination in food.

    PubMed

    Commeau, Natalie; Cornu, Marie; Albert, Isabelle; Denis, Jean-Baptiste; Parent, Eric

    2012-03-01

    Assessing within-batch and between-batch variability is of major interest for risk assessors and risk managers in the context of microbiological contamination of food. For example, the ratio between the within-batch variability and the between-batch variability has a large impact on the results of a sampling plan. Here, we designed hierarchical Bayesian models to represent such variability. Compatible priors were built mathematically to obtain sound model comparisons. A numeric criterion is proposed to assess the contamination structure comparing the ability of the models to replicate grouped data at the batch level using a posterior predictive loss approach. Models were applied to two case studies: contamination by Listeria monocytogenes of pork breast used to produce diced bacon and contamination by the same microorganism on cold smoked salmon at the end of the process. In the first case study, a contamination structure clearly exists and is located at the batch level, that is, between batches variability is relatively strong, whereas in the second a structure also exists but is less marked. © 2012 Society for Risk Analysis.
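
    The between-/within-batch variance structure can be illustrated outside the Bayesian machinery. A minimal simulation sketch (not the authors' hierarchical Bayesian models): two-level normal data on the log scale, with the variance components recovered by a one-way method-of-moments decomposition; all parameter values are invented:

        import numpy as np

        rng = np.random.default_rng(0)

        # two-level model: log-concentration = batch effect + within-batch noise
        mu, sigma_between, sigma_within = 2.0, 1.0, 0.5   # assumed values
        n_batches, n_per_batch = 50, 8
        batch_means = rng.normal(mu, sigma_between, n_batches)
        data = rng.normal(batch_means[:, None], sigma_within,
                          (n_batches, n_per_batch))

        # one-way ANOVA method-of-moments estimates of the variance components
        within_var = data.var(axis=1, ddof=1).mean()
        between_var = data.mean(axis=1).var(ddof=1) - within_var / n_per_batch
        print(f"within  ~ {within_var:.2f} (true {sigma_within**2:.2f})")
        print(f"between ~ {between_var:.2f} (true {sigma_between**2:.2f})")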

  10. Navigation Operations with Prototype Components of an Automated Real-Time Spacecraft Navigation System

    NASA Technical Reports Server (NTRS)

    Cangahuala, L.; Drain, T. R.

    1999-01-01

    At present, ground navigation support for interplanetary spacecraft requires human intervention for data pre-processing, filtering, and post-processing activities; these actions must be repeated each time a new batch of data is collected by the ground data system.

  11. Access to small size distributions of nanoparticles by microwave-assisted synthesis. Formation of Ag nanoparticles in aqueous carboxymethylcellulose solutions in batch and continuous-flow reactors

    NASA Astrophysics Data System (ADS)

    Horikoshi, Satoshi; Abe, Hideki; Torigoe, Kanjiro; Abe, Masahiko; Serpone, Nick

    2010-08-01

    This article examines the effect(s) of the 2.45-GHz microwave (MW) radiation in the synthesis of silver nanoparticles in aqueous media by reduction of the diaminesilver(I) complex, [Ag(NH3)2]+, with carboxymethylcellulose (CMC) in both batch-type and continuous-flow reactor systems with a particular emphasis on the characteristics of the microwaves in this process and the size distributions. This microwave thermally-assisted synthesis is compared to a conventional heating (CH) method, both requiring a reaction temperature of 100 °C to produce the nanoparticles, in both cases leading to the formation of silver colloids with different size distributions. Reduction of the diaminesilver(I) precursor complex, [Ag(NH3)2]+, by CMC depended on the solution temperature. Cooling the reactor during the heating process driven with 390-Watt microwaves (MW-390W/Cool protocol) yielded silver nanoparticles with sizes spanning the range 1-2 nm. By contrast, the size distribution of Ag nanoparticles with 170-Watt microwaves (no cooling; MW-170W protocol) was in the range 1.4-3.6 nm (average size ~3 nm). The overall results suggest the potential for a scale-up process in the microwave-assisted synthesis of nanoparticles. Based on the present data, a flow-through microwave reactor system is herein proposed for the continuous production of silver nanoparticles. The novel flow reactor system (flow rate, 600 mL min-1) coupled to 1200-Watt microwave radiation generated silver nanoparticles with a size distribution 0.7-2.8 nm (average size ca. 1.5 nm).

  12. Fate and transport with material response characterization of green sorption media for copper removal via desorption process.

    PubMed

    Chang, Ni-Bin; Houmann, Cameron; Lin, Kuen-Song; Wanielista, Martin

    2016-07-01

    Multiple adsorption and desorption cycles are required to achieve the reliable operation of copper removal and recovery. A green sorption media mixture composed of recycled tire chunk, expanded clay aggregate, and coconut coir was evaluated in this study for its desorptive characteristics as a companion study of the corresponding adsorption process in an earlier publication. We conducted a screening of potential desorbing agents, batch desorption equilibrium and kinetic studies, and batch tests through 3 adsorption/desorption cycles. The desorbing agent screening revealed that hydrochloric acid has good potential for copper desorption. Equilibrium data fit the Freundlich isotherm, whereas kinetic data had high correlation with the Lagergren pseudo second-order model and revealed a rapid desorption reaction. Batch equilibrium data over 3 adsorption/desorption cycles showed that the coconut coir and media mixture were the most resilient, demonstrating they could be used through 3 or more adsorption/desorption cycles. FE-SEM imaging, XRD, and EDS analyses supported the batch adsorption and desorption results showing significant surface sorption of CuO species in the media mixture and coconut coir, followed by partial desorption using 0.1 M HCl as a desorbing agent. Copyright © 2016 Elsevier Ltd. All rights reserved.

  13. Enhancing tablet disintegration characteristics of a highly water-soluble high-drug-loading formulation by granulation process.

    PubMed

    Pandey, Preetanshu; Levins, Christopher; Pafiakis, Steve; Zacour, Brian; Bindra, Dilbir S; Trinh, Jade; Buckley, David; Gour, Shruti; Sharif, Shasad; Stamato, Howard

    2018-07-01

    The objective of this study was to improve the disintegration and dissolution characteristics of a highly water-soluble tablet matrix by altering the manufacturing process. A high disintegration time, along with a high dependence of the disintegration time on tablet hardness, was observed for a high-drug-loading (70% w/w) API when formulated using a high-shear wet granulation (HSWG) process. Keeping the formulation composition mostly constant, a fluid-bed granulation (FBG) process was explored as an alternate granulation method using a 2^(4-1) fractional factorial design with two center points. FBG batches (10 batches) were manufactured using varying disintegrant amount, spray rate, inlet temperature (T), and atomization air pressure. The resultant final blend particle size was affected significantly by spray rate (p = .0009), inlet T (p = .0062), atomization air pressure (p = .0134), and the interaction effect between inlet T*spray rate (p = .0241). The compactibility of the final blend was affected significantly by disintegrant amount (p < .0001), atomization air pressure (p = .0013), and spray rate (p = .05). The fluid-bed batches gave significantly lower disintegration times than the HSWG batches, and mercury intrusion porosimetry data revealed that this was caused by the greater internal porosity of tablets manufactured from the FBG batches.

  14. Multivariate data analysis on historical IPV production data for better process understanding and future improvements.

    PubMed

    Thomassen, Yvonne E; van Sprang, Eric N M; van der Pol, Leo A; Bakker, Wilfried A M

    2010-09-01

    Historical manufacturing data can potentially harbor a wealth of information for process optimization and enhancement of efficiency and robustness. To extract useful information, multivariate data analysis (MVDA) using projection methods is often applied. In this contribution, the results obtained from applying MVDA to data from inactivated polio vaccine (IPV) production runs are described. Data from over 50 batches at two different production scales (700-L and 1,500-L) were available. The explorative analysis performed on single unit operations indicated consistent manufacturing. Known outliers (e.g., rejected batches) were identified using principal component analysis (PCA). The source of operational variation was pinpointed to variation of inputs such as media. Other relevant process parameters were in control and, using this manufacturing data, could not be correlated to product quality attributes. The knowledge gained about the IPV production process, not only from the MVDA but also from digitalizing the available historical data, has proven to be useful for troubleshooting, understanding the limitations of available data, and identifying opportunities for improvement. 2010 Wiley Periodicals, Inc.
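
    A minimal sketch of the PCA screening step, assuming a batch-by-parameter matrix of manufacturing records; the data are synthetic and the 95th-percentile flagging rule is an arbitrary illustration, not the authors' procedure:

        import numpy as np

        def pca_scores(X, n_pc=2):
            # center/scale a (batches x process parameters) matrix and
            # project onto the first principal components via SVD
            Z = (X - X.mean(axis=0)) / X.std(axis=0, ddof=1)
            U, s, Vt = np.linalg.svd(Z, full_matrices=False)
            return Z @ Vt[:n_pc].T

        rng = np.random.default_rng(1)
        X = rng.normal(0, 1, (50, 6))   # 50 batches, 6 process parameters
        X[7] += 6                       # one deviating (e.g. rejected) batch
        scores = pca_scores(X)
        d2 = (scores**2 / scores.var(axis=0, ddof=1)).sum(axis=1)  # T2-like
        print("flagged:", np.where(d2 > np.percentile(d2, 95))[0])  # incl. 7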

  15. Potential of biogenic hydrogen production for hydrogen driven remediation strategies in marine environments.

    PubMed

    Hosseinkhani, Baharak; Hennebel, Tom; Boon, Nico

    2014-09-25

    Fermentative production of bio-hydrogen (bio-H2) from organic residues has emerged as a promising alternative for providing the required electron source for hydrogen driven remediation strategies. Unlike the widely used production of H2 by bacteria in fresh water systems, few reports are available regarding the generation of biogenic H2 and optimisation processes in marine systems. The present research aims to optimise the capability of an indigenous marine bacterium for the production of bio-H2 in marine environments and subsequently develop this process for hydrogen driven remediation strategies. Fermentative conversion of organics in marine media to H2 using a marine isolate, Pseudoalteromonas sp. BH11, was determined. A Taguchi design of experimental methodology was employed to evaluate the optimal nutritional composition in batch tests to improve bio-H2 yields. Further optimisation experiments showed that alginate-immobilised bacterial cells were able to produce bio-H2 at the same rate as suspended cells over a period of several weeks. Finally, bio-H2 was used as electron donor to successfully dehalogenate trichloroethylene (TCE) using biogenic palladium nanoparticles as a catalyst. Fermentative production of bio-H2 can be a promising technique for concomitant generation of an electron source for hydrogen driven remediation strategies and treatment of organic residue in marine ecosystems. Copyright © 2014 Elsevier B.V. All rights reserved.

  16. A neural network strategy for end-point optimization of batch processes.

    PubMed

    Krothapally, M; Palanki, S

    1999-01-01

    The traditional way of operating batch processes has been to utilize an open-loop "golden recipe". However, there can be substantial batch-to-batch variation in process conditions, and this open-loop strategy can lead to non-optimal operation. In this paper, a new approach is presented for end-point optimization of batch processes by utilizing neural networks. This strategy involves the training of two neural networks: one to predict switching times and the other to predict the input profile in the singular region. This approach alleviates the computational problems associated with the classical Pontryagin's approach and the nonlinear programming approach. The efficacy of this scheme is illustrated via simulation of a fed-batch fermentation.
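
    As a toy illustration of the first of the two networks (mapping initial batch conditions to a switching time), the sketch below trains scikit-learn's MLPRegressor on a synthetic mapping; the relation between initial state and switching time is invented, not taken from the paper:

        import numpy as np
        from sklearn.neural_network import MLPRegressor

        rng = np.random.default_rng(2)

        # synthetic training set: initial (substrate, biomass) -> switch time
        X = rng.uniform([5.0, 0.1], [25.0, 2.0], size=(400, 2))
        t_switch = 0.3 * X[:, 0] - 1.5 * X[:, 1] + 4.0   # assumed relation
        y = t_switch + rng.normal(0, 0.1, 400)           # measurement noise

        net = MLPRegressor(hidden_layer_sizes=(16,), max_iter=5000,
                           random_state=0)
        net.fit(X, y)
        print(net.predict([[15.0, 1.0]]))   # predicted switching time (h)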

  17. Estimating Animal Abundance in Ground Beef Batches Assayed with Molecular Markers

    PubMed Central

    Hu, Xin-Sheng; Simila, Janika; Platz, Sindey Schueler; Moore, Stephen S.; Plastow, Graham; Meghen, Ciaran N.

    2012-01-01

    Estimating animal abundance in industrial scale batches of ground meat is important for mapping meat products through the manufacturing process and for effectively tracing the finished product during a food safety recall. The processing of ground beef involves a potentially large number of animals from diverse sources in a single product batch, which produces a high heterogeneity in capture probability. In order to estimate animal abundance through DNA profiling of ground beef constituents, two parameter-based statistical models were developed for incidence data. Simulations were applied to evaluate the maximum likelihood estimate (MLE) of a joint likelihood function from multiple surveys, showing superiority in the presence of high capture heterogeneity with small sample sizes, or comparable estimation in the presence of low capture heterogeneity with a large sample size when compared to other existing models. Our model employs the full information on the pattern of the capture-recapture frequencies from multiple samples. We applied the proposed models to estimate animal abundance in six manufacturing beef batches, genotyped using 30 single nucleotide polymorphism (SNP) markers, from a large scale beef grinding facility. Results show that between 411 and 1367 animals were present in the six manufacturing beef batches. These estimates are informative as a reference for improving recall processes and tracing finished meat products back to source. PMID:22479559
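
    A simplified version of the estimation idea can be written down for the homogeneous-capture case (the paper's models additionally handle capture heterogeneity, which this sketch ignores). The capture probability is profiled out and the zero-truncated likelihood is maximized over N; all data are simulated:

        import numpy as np
        from scipy.special import gammaln

        def abundance_mle(counts, n_surveys, n_max=5000):
            # MLE of N when each animal appears in each of n_surveys samples
            # independently with probability p; zero-detection animals are
            # unobserved. counts = detections per observed animal.
            counts = np.asarray(counts)
            D, T = len(counts), counts.sum()
            Ns = np.arange(D, n_max + 1)
            p = T / (Ns * n_surveys)          # profile MLE of p given N
            loglik = (gammaln(Ns + 1) - gammaln(Ns - D + 1)
                      + T * np.log(p) + (Ns * n_surveys - T) * np.log1p(-p))
            return Ns[np.argmax(loglik)]

        # toy data: true N = 800, capture probability 5% per survey, 30 surveys
        rng = np.random.default_rng(3)
        k = rng.binomial(30, 0.05, size=800)
        print(abundance_mle(k[k > 0], n_surveys=30))  # estimate near 800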

  18. Energy efficiency of batch and semi-batch (CCRO) reverse osmosis desalination.

    PubMed

    Warsinger, David M; Tow, Emily W; Nayar, Kishor G; Maswadeh, Laith A; Lienhard V, John H

    2016-12-01

    As reverse osmosis (RO) desalination capacity increases worldwide, the need to reduce its specific energy consumption becomes more urgent. In addition to the incremental changes attainable with improved components such as membranes and pumps, more significant reduction of energy consumption can be achieved through time-varying RO processes including semi-batch processes such as closed-circuit reverse osmosis (CCRO) and fully-batch processes that have not yet been commercialized or modelled in detail. In this study, numerical models of the energy consumption of batch RO (BRO), CCRO, and the standard continuous RO process are detailed. Two new energy-efficient configurations of batch RO are analyzed. Batch systems use significantly less energy than continuous RO over a wide range of recovery ratios and source water salinities. Relative to continuous RO, models predict that CCRO and batch RO demonstrate up to 37% and 64% energy savings, respectively, for brackish water desalination at high water recovery. For batch RO and CCRO, the primary reductions in energy use stem from atmospheric pressure brine discharge and reduced streamwise variation in driving pressure. Fully-batch systems further reduce energy consumption by not mixing streams of different concentrations, which CCRO does. These results demonstrate that time-varying processes can significantly raise RO energy efficiency. Copyright © 2016 Elsevier Ltd. All rights reserved.
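
    The origin of the batch advantage can be seen in an idealized limit (van 't Hoff osmotic pressure pi(x) = pi0/(1-x) at recovery x, perfect pumps and energy recovery, no losses). This textbook-style sketch is not the paper's numerical model, which accounts for real-component behavior:

        import numpy as np

        def sec_continuous(pi0, r):
            # least work per unit permeate for single-stage continuous RO with
            # ideal brine energy recovery: constant pressure at the final
            # brine osmotic pressure pi0/(1-r)
            return pi0 / (1.0 - r)

        def sec_batch(pi0, r):
            # ideal batch RO: pressure tracks the instantaneous osmotic
            # pressure; integrate pi(x) over recovery, divide by r
            return (pi0 / r) * np.log(1.0 / (1.0 - r))

        pi0 = 2.0  # feed osmotic pressure, bar (roughly brackish water)
        for r in (0.5, 0.7, 0.9):
            c, b = sec_continuous(pi0, r), sec_batch(pi0, r)
            print(f"r={r:.1f}: continuous {c:.2f} bar, batch {b:.2f} bar "
                  f"({100 * (c - b) / c:.0f}% saving)")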

  19. 40 CFR 63.161 - Definitions.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... subpart that references this subpart. Batch process means a process in which the equipment is fed... generally emptied. Examples of industries that use batch processes include pharmaceutical production and pesticide production. Batch product-process equipment train means the collection of equipment (e.g...

  20. 40 CFR 63.161 - Definitions.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... subpart that references this subpart. Batch process means a process in which the equipment is fed... generally emptied. Examples of industries that use batch processes include pharmaceutical production and pesticide production. Batch product-process equipment train means the collection of equipment (e.g...

  1. 40 CFR 63.161 - Definitions.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... subpart that references this subpart. Batch process means a process in which the equipment is fed... generally emptied. Examples of industries that use batch processes include pharmaceutical production and pesticide production. Batch product-process equipment train means the collection of equipment (e.g...

  2. 40 CFR 63.161 - Definitions.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... subpart that references this subpart. Batch process means a process in which the equipment is fed... generally emptied. Examples of industries that use batch processes include pharmaceutical production and pesticide production. Batch product-process equipment train means the collection of equipment (e.g...

  3. A Model of Batch Scheduling for a Single Batch Processor with Additional Setups to Minimize Total Inventory Holding Cost of Parts of a Single Item Requested at Multi-due-date

    NASA Astrophysics Data System (ADS)

    Hakim Halim, Abdul; Ernawati; Hidayat, Nita P. A.

    2018-03-01

    This paper deals with a model of batch scheduling for a single batch processor on which a number of parts of a single item are to be processed. The process needs two kinds of setups: main setups required before processing any batches, and additional setups required repeatedly after the batch processor completes a certain number of batches. The parts to be processed arrive at the shop floor at times coinciding with their respective processing start times, and the completed parts are to be delivered at multiple due dates. The objective adopted for the model is minimization of the total inventory holding cost, consisting of the holding cost per unit time for a part in completed batches and that for a part in in-process batches. The formulation of the total inventory holding cost is derived from the so-called actual flow time, defined as the interval between the arrival times of parts at the production line and the delivery times of the completed parts. The actual flow time satisfies not only minimum inventory but also just-in-time arrival and delivery. An algorithm to solve the model is proposed and a numerical example is shown.
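
    The cost structure is easy to state concretely: with the actual flow time defined as delivery time minus arrival time, the objective is a weighted sum over batches. A minimal sketch with hypothetical data (the paper's algorithm chooses the batches; this only evaluates a given schedule):

        def total_holding_cost(batches, holding_cost_per_part_hour):
            # batches: list of (n_parts, arrival_time_h, delivery_time_h);
            # actual flow time = delivery - arrival for the parts in a batch
            return sum(n * holding_cost_per_part_hour * (deliver - arrive)
                       for n, arrive, deliver in batches)

        # three batches of one item delivered at two due dates (invented data)
        batches = [(40, 0.0, 6.0), (25, 2.0, 6.0), (35, 5.0, 10.0)]
        print(total_holding_cost(batches, holding_cost_per_part_hour=0.1))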

  4. Quantitative imaging reveals heterogeneous growth dynamics and treatment-dependent residual tumor distributions in a three-dimensional ovarian cancer model

    NASA Astrophysics Data System (ADS)

    Celli, Jonathan P.; Rizvi, Imran; Evans, Conor L.; Abu-Yousif, Adnan O.; Hasan, Tayyaba

    2010-09-01

    Three-dimensional tumor models have emerged as valuable in vitro research tools, though the power of such systems as quantitative reporters of tumor growth and treatment response has not been adequately explored. We introduce an approach combining a 3-D model of disseminated ovarian cancer with high-throughput processing of image data for quantification of growth characteristics and cytotoxic response. We developed custom MATLAB routines to analyze longitudinally acquired dark-field microscopy images containing thousands of 3-D nodules. These data reveal a reproducible bimodal log-normal size distribution. Growth behavior is driven by migration and assembly, causing an exponential decay in spatial density concomitant with increasing mean size. At day 10, cultures are treated with either carboplatin or photodynamic therapy (PDT). We quantify size-dependent cytotoxic response for each treatment on a nodule-by-nodule basis using automated segmentation combined with ratiometric batch-processing of calcein and ethidium bromide fluorescence intensity data (indicating live and dead cells, respectively). Both treatments reduce viability, though carboplatin leaves micronodules largely structurally intact with a size distribution similar to untreated cultures. In contrast, PDT treatment disrupts micronodular structure, causing punctate regions of toxicity, shifting the distribution toward smaller sizes, and potentially increasing vulnerability to subsequent chemotherapeutic treatment.
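
    The ratiometric step can be sketched in a few lines, assuming registered calcein (live) and ethidium bromide (dead) intensity images plus a binary nodule mask. This Python/SciPy sketch on synthetic data only stands in for the authors' custom MATLAB routines:

        import numpy as np
        from scipy import ndimage

        def nodule_viability(calcein, ethidium, mask):
            # label connected nodules, then compute a per-nodule live
            # fraction from summed channel intensities
            labels, n = ndimage.label(mask)
            idx = np.arange(1, n + 1)
            live = ndimage.sum(calcein, labels, index=idx)
            dead = ndimage.sum(ethidium, labels, index=idx)
            return live / (live + dead)

        rng = np.random.default_rng(4)
        mask = np.zeros((64, 64), bool)
        mask[5:15, 5:15] = mask[30:45, 30:45] = True  # two synthetic nodules
        calcein = rng.uniform(0, 1, mask.shape) * mask
        ethidium = rng.uniform(0, 1, mask.shape) * mask
        print(nodule_viability(calcein, ethidium, mask))  # ~0.5 for noise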

  5. A universal piezo-driven ultrasonic cell microinjection system.

    PubMed

    Huang, Haibo; Mills, James K; Lu, Cong; Sun, Dong

    2011-08-01

    Over the past decade, the rapid development of biotechnologies such as gene injection, in-vitro fertilization, intracytoplasmic sperm injection (ICSI), and drug development has led to great demand for highly automated, high-precision equipment for microinjection. Recently a new cell injection technology using piezo-driven pipettes with a very small mercury column was proposed and successfully applied in ICSI to a variety of mammal species. Although this technique significantly improves the survival rates of the ICSI process, shortcomings due to the toxicity of mercury and damage to the cell membrane caused by large lateral tip oscillations of the injector pipette may limit its application. In this paper, a new cell injection system for automatic batch injection of suspended cells is developed. A new design of the piezo-driven cell injector is proposed for automated suspended-cell injection. This new design relocates the piezo oscillation actuator to the injector pipette, which eliminates the vibration effect on other parts of the micromanipulator. A small piezo stack is sufficient to perform the cell injection process. Harmful lateral tip oscillations of the injector pipette are reduced substantially without the use of a mercury column. Furthermore, ultrasonic vibration micro-dissection (UVM) theory is utilized to analyze the piezo-driven cell injection process, and the source of the lateral oscillations of the injector pipette is investigated. In preliminary experiments on a large number of zebrafish embryos (n = 200), the injector pipette easily pierced the cell membrane at a low injection speed with almost no deformation of the cell wall, and with a high success rate (96%) and survival rate (80.7%). This new injection approach shows good potential for precision injection with less damage to the injected cells.

  6. In vitro growth of Curcuma longa L. in response to five mineral elements and plant density in fed-batch culture systems.

    PubMed

    El-Hawaz, Rabia F; Bridges, William C; Adelberg, Jeffrey W

    2015-01-01

    Plant density was varied with P, Ca, Mg, and KNO3 in a multifactor experiment to improve Curcuma longa L. micropropagation, biomass and microrhizome development in fed-batch liquid culture. The experiment had two paired D-optimal designs, testing sucrose fed-batch and nutrient sucrose fed-batch techniques. When sucrose became depleted, volume was restored to 5% m/v sucrose in 200 ml of modified liquid MS medium by adding sucrose solutions. Similarly, nutrient sucrose fed-batch was restored to set points with double concentration of treatments' macronutrient and MS micronutrient solutions, along with sucrose solutions. Changes in the amounts of water and sucrose supplementations were driven by the interaction of P and KNO3 concentrations. Increasing P from 1.25 to 6.25 mM increased both multiplication and biomass. The multiplication ratio was greatest in the nutrient sucrose fed-batch technique with the highest level of P, 6 buds/vessel, and the lowest level of Ca and KNO3. The highest density (18 buds/vessel) produced the highest fresh biomass at the highest concentrations of KNO3 and P with nutrient sucrose fed-batch, and moderate Ca and Mg concentrations. However, maximal rhizome dry biomass required highest P, sucrose fed-batch, and a moderate plant density. Different media formulations and fed-batch techniques were identified to maximize the propagation and storage organ responses. A single experimental design was used to optimize these dual purposes.

  7. In Vitro Growth of Curcuma longa L. in Response to Five Mineral Elements and Plant Density in Fed-Batch Culture Systems

    PubMed Central

    El-Hawaz, Rabia F.; Bridges, William C.; Adelberg, Jeffrey W.

    2015-01-01

    Plant density was varied with P, Ca, Mg, and KNO3 in a multifactor experiment to improve Curcuma longa L. micropropagation, biomass and microrhizome development in fed-batch liquid culture. The experiment had two paired D-optimal designs, testing sucrose fed-batch and nutrient sucrose fed-batch techniques. When sucrose became depleted, volume was restored to 5% m/v sucrose in 200 ml of modified liquid MS medium by adding sucrose solutions. Similarly, nutrient sucrose fed-batch was restored to set points with double concentration of treatments’ macronutrient and MS micronutrient solutions, along with sucrose solutions. Changes in the amounts of water and sucrose supplementations were driven by the interaction of P and KNO3 concentrations. Increasing P from 1.25 to 6.25 mM increased both multiplication and biomass. The multiplication ratio was greatest in the nutrient sucrose fed-batch technique with the highest level of P, 6 buds/vessel, and the lowest level of Ca and KNO3. The highest density (18 buds/vessel) produced the highest fresh biomass at the highest concentrations of KNO3 and P with nutrient sucrose fed-batch, and moderate Ca and Mg concentrations. However, maximal rhizome dry biomass required highest P, sucrose fed-batch, and a moderate plant density. Different media formulations and fed-batch techniques were identified to maximize the propagation and storage organ responses. A single experimental design was used to optimize these dual purposes. PMID:25830292

  8. Superstructure-based Design and Optimization of Batch Biodiesel Production Using Heterogeneous Catalysts

    NASA Astrophysics Data System (ADS)

    Nuh, M. Z.; Nasir, N. F.

    2017-08-01

    Biodiesel is a fuel comprising mono-alkyl esters of long-chain fatty acids derived from renewable lipid feedstocks such as vegetable oil and animal fat. Biodiesel production is a complex process that needs systematic design and optimization, yet no case study has applied process system engineering (PSE) tools to superstructure optimization of the batch process, which involves complex problems formulated as mixed-integer nonlinear programs (MINLP). PSE offers a solution to complex engineering systems by enabling the use of viable tools and techniques to better manage and comprehend the complexity of the system. This study aims, first, to apply PSE tools to the simulation and optimization of the biodiesel process and to develop mathematical models for plant components in cases A, B, and C using published kinetic data; second, to perform an economic analysis of biodiesel production, focusing on heterogeneous catalysts; and finally, to develop the superstructure for biodiesel production using a heterogeneous catalyst. The mathematical models are developed from the superstructure, and the resulting mixed-integer nonlinear model is solved and the economic analysis estimated using MATLAB. The optimization with the objective of minimizing annual production cost gives 23.2587 million USD for the batch process of case C. Overall, the application of PSE has streamlined the modelling, design, and cost estimation, helping to solve the complex batch production and processing of biodiesel.

  9. Design of penicillin fermentation process simulation system

    NASA Astrophysics Data System (ADS)

    Qi, Xiaoyu; Yuan, Zhonghu; Qi, Xiaoxuan; Zhang, Wenqi

    2011-10-01

    Real-time monitoring of batch processes attracts increasing attention. It can ensure safety and provide products with consistent quality. The design of a simulation system for batch process fault diagnosis is therefore of great significance. In this paper, penicillin fermentation, a typical non-linear, dynamic, multi-stage batch production process, is taken as the research object. A visual human-machine interactive simulation software system based on the Windows operating system is developed. The simulation system provides an effective platform for research on batch process fault diagnosis.

  10. 40 CFR 63.1020 - Definitions.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... shall have the meaning given them in the Act and in this section. Batch process means a process in which... which the equipment is generally emptied. Examples of industries that use batch processes include pharmaceutical production and pesticide production. Batch product-process equipment train means the collection of...

  11. 40 CFR 63.1020 - Definitions.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... shall have the meaning given them in the Act and in this section. Batch process means a process in which... which the equipment is generally emptied. Examples of industries that use batch processes include pharmaceutical production and pesticide production. Batch product-process equipment train means the collection of...

  12. 40 CFR 63.1020 - Definitions.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... shall have the meaning given them in the Act and in this section. Batch process means a process in which... which the equipment is generally emptied. Examples of industries that use batch processes include pharmaceutical production and pesticide production. Batch product-process equipment train means the collection of...

  13. 40 CFR 63.1020 - Definitions.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... shall have the meaning given them in the Act and in this section. Batch process means a process in which... which the equipment is generally emptied. Examples of industries that use batch processes include pharmaceutical production and pesticide production. Batch product-process equipment train means the collection of...

  14. DEVELOPMENT OF AN ANTIFOAM TRACKING SYSTEM AS AN OPTION TO SUPPORT THE MELTER OFF-GAS FLAMMABILITY CONTROL STRATEGY AT THE DWPF

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Edwards, T.; Lambert, D.

    The Savannah River National Laboratory (SRNL) has been working with the Savannah River Remediation (SRR) Defense Waste Processing Facility (DWPF) in the development and implementation of an additional strategy for confidently satisfying the flammability controls for DWPF’s melter operation. An initial strategy for implementing the operational constraints associated with flammability control in DWPF was based upon an analytically determined carbon concentration from antifoam. Due to the conservative error structure associated with the analytical approach, its implementation has significantly reduced the operating window for processing and has led to recurrent Slurry Mix Evaporator (SME) and Melter Feed Tank (MFT) remediation. To address the adverse operating impact of the current implementation strategy, SRR issued a Technical Task Request (TTR) to SRNL requesting the development and documentation of an alternate strategy for evaluating the carbon contribution from antifoam. The proposed strategy presented in this report was developed under the guidance of a Task Technical and Quality Assurance Plan (TTQAP) and involves calculating the carbon concentration from antifoam based upon the actual mass of antifoam added to the process assuming 100% retention. The mass of antifoam in the Additive Mix Feed Tank (AMFT), in the Sludge Receipt and Adjustment Tank (SRAT), and in the SME is tracked by mass balance as part of this strategy. As these quantities are monitored, the random and bias uncertainties affecting their values are also maintained and accounted for. This report documents: 1) the development of an alternate implementation strategy and associated equations describing the carbon concentration from antifoam in each SME batch derived from the actual amount of antifoam introduced into the AMFT, SRAT, and SME during the processing of the batch. 2) the equations and error structure for incorporating the proposed strategy into melter off-gas flammability assessments. Sample calculations of the system are also included in this report. Please note that the system developed and documented in this report is intended as an alternative to the current, analytically-driven system being utilized by DWPF; the proposed system is not intended to eliminate the current system. Also note that the system developed in this report to track antifoam mass in the AMFT, SRAT, and SME will be applicable beyond just Sludge Batch 8. While the model used to determine acceptability of the SME product with respect to melter off-gas flammability controls must be reassessed for each change in sludge batch, the antifoam mass tracking methodology is independent of sludge batch composition and as such will be transferable to future sludge batches.
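
    The tracking strategy amounts to a running mass balance across the three vessels. A toy sketch assuming 100% antifoam retention as stated; the class, quantities, and transfer fractions are invented for illustration, and the report's random/bias uncertainty bookkeeping is omitted:

        class AntifoamTracker:
            # track antifoam mass through AMFT -> SRAT -> SME by mass balance
            def __init__(self):
                self.mass = {"AMFT": 0.0, "SRAT": 0.0, "SME": 0.0}

            def add(self, vessel, kg):
                self.mass[vessel] += kg

            def transfer(self, src, dst, fraction):
                moved = self.mass[src] * fraction
                self.mass[src] -= moved
                self.mass[dst] += moved

        t = AntifoamTracker()
        t.add("AMFT", 12.0)              # antifoam charged (illustrative kg)
        t.transfer("AMFT", "SRAT", 0.5)  # partial transfer to the SRAT
        t.add("SRAT", 3.0)               # direct SRAT addition
        t.transfer("SRAT", "SME", 1.0)   # full transfer to the SME
        print(t.mass)                    # antifoam mass per vessel, this batch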

  15. Boosted structured additive regression for Escherichia coli fed-batch fermentation modeling.

    PubMed

    Melcher, Michael; Scharl, Theresa; Luchner, Markus; Striedner, Gerald; Leisch, Friedrich

    2017-02-01

    The quality of biopharmaceuticals and patients' safety are of highest priority, and there are tremendous efforts to replace empirical production process designs by knowledge-based approaches. The main challenge in this context is that real-time access to process variables related to product quality and quantity is severely limited. To date, comprehensive on- and offline monitoring platforms are used to generate process data sets that allow for development of mechanistic and/or data-driven models for real-time prediction of these important quantities. The ultimate goal is to implement model-based feedback control loops that facilitate online control of product quality. In this contribution, we explore structured additive regression (STAR) models in combination with boosting as a variable selection tool for modeling the cell dry mass, product concentration, and optical density on the basis of online available process variables and two-dimensional fluorescence spectroscopic data. STAR models are powerful extensions of linear models allowing for inclusion of smooth effects or interactions between predictors. Boosting constructs the final model in a stepwise manner and provides a variable importance measure via predictor selection frequencies. Our results show that the cell dry mass can be modeled with a relative error of about ±3%, the optical density with ±6%, the soluble protein with ±16%, and the insoluble product with an accuracy of ±12%. Biotechnol. Bioeng. 2017;114: 321-334. © 2016 Wiley Periodicals, Inc.
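
    The boosting-as-variable-selection idea can be illustrated with plain componentwise L2 boosting (STAR models generalize this with smooth and interaction terms); the data and coefficients below are synthetic:

        import numpy as np

        def l2_boost(X, y, n_steps=200, nu=0.1):
            # componentwise L2 boosting: at each step, fit the single
            # standardized predictor that best reduces the residual;
            # selection frequencies act as a variable-importance measure
            Z = (X - X.mean(0)) / X.std(0)
            resid = y - y.mean()
            coef, freq = np.zeros(X.shape[1]), np.zeros(X.shape[1])
            for _ in range(n_steps):
                b = Z.T @ resid / len(y)       # per-predictor LS slopes
                j = np.argmax(np.abs(b))       # best-fitting predictor
                resid -= nu * b[j] * Z[:, j]
                coef[j] += nu * b[j]
                freq[j] += 1
            return coef, freq / n_steps

        rng = np.random.default_rng(5)
        X = rng.normal(size=(300, 10))         # e.g. online process variables
        y = 2 * X[:, 0] - X[:, 3] + rng.normal(0, 0.5, 300)  # dry-mass proxy
        coef, freq = l2_boost(X, y)
        print(freq.round(2))  # selection mass concentrates on variables 0, 3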

  16. HPLC-MS Examination of Impurities in Pentaerythritol Tetranitrate

    NASA Astrophysics Data System (ADS)

    Brown, Geoffrey W.; Giambra, Anna M.

    2014-04-01

    Pentaerythritol tetranitrate (PETN) has trace homolog impurities that can be detected by high-performance liquid chromatography-mass spectrometry. Consideration of observed impurity masses and candidate structures based on known pentaerythritol impurities allows identification of 22 compounds in the data. These are all consistent with either fully nitrated homologs or derivatives substituted with methyl, methoxy, or hydroxyl groups in place of a nitric ester. Examining relative impurity concentrations in three starting batches of PETN and six subsequently processed batches shows that it is possible to use relative concentration profiles as a fingerprint to differentiate batches and follow them through recrystallization steps.

  17. Statistical process control for residential treated wood

    Treesearch

    Patricia K. Lebow; Timothy M. Young; Stan Lebow

    2017-01-01

    This paper is the first stage of a study that attempts to improve the process of manufacturing treated lumber through the use of statistical process control (SPC). Analysis of industrial and auditing agency data sets revealed there are differences between the industry and agency probability density functions (pdf) for normalized retention data. Resampling of batches of...

  18. Impact of Radio Frequency Identification (RFID) on the Marine Corps’ Supply Process

    DTIC Science & Technology

    2006-09-01

    [Table-of-contents excerpts only; no abstract recovered: "Hypothetical Improvement Using a Real-Time Order Processing System Vice a Batch Order Processing System"; "As-Is: The Current ... Processing System"; "V. RESULTS"; "A. SIMULATION ..."]

  19. Batch manufacturing: Six strategic needs

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ash, R.H.; Chappell, D.A.

    1995-08-01

    Since the advent of industrial digital control systems in the mid-1970s, industry has had the promise of integrated, configurable digital batch control systems to replace the morass of electromechanical devices like relays and stepping switches, recorders, and indicators which comprised the components of previous generations of batch control systems - the "monolithic monsters" of the 1960s and earlier. To help fulfill that promise, there have been many wide-ranging proprietary automation solutions for batch control since 1975, many of them technically excellent. However, even the best examples suffered from the lack of a common language and unifying concept permitting separate systems to be interconnected and work together. Today, some 20 years after the digital revolution began, industry has microprocessors, memory chips, data highways, and other marvelous technology to help automate the control of discontinuous processes. They also are on the way to having an accepted standard for batch automation, ISA S88. Batching systems are at once conceptually simple but executionally complex. The notion of adding ingredients one at a time to a vat, mixing, and then processing into final form is as old as the stone age. Every homemaker on earth, male or female, is familiar with how to follow a recipe to create some sumptuous item of culinary delight. Food recipes, so familiar and ubiquitous, are really just microcosms of the S88 recipe standard. They contain the same components: (1) Header (name and description of item being prepared, sometimes serving size); (2) Formula (list and amount of ingredients); (3) Equipment requirements (pans, mixing and cooking equipment); (4) Procedure (description of order of ingredient addition, mixing and other processing steps, baking/cooling time, and other processing steps); and (5) Other information (safety, cautions, and other miscellaneous instructions).

  20. A high-throughput media design approach for high performance mammalian fed-batch cultures

    PubMed Central

    Rouiller, Yolande; Périlleux, Arnaud; Collet, Natacha; Jordan, Martin; Stettler, Matthieu; Broly, Hervé

    2013-01-01

    An innovative high-throughput medium development method based on media blending was successfully used to improve the performance of a Chinese hamster ovary fed-batch medium in shaking 96-deepwell plates. Starting from a proprietary chemically-defined medium, 16 formulations testing 43 of 47 components at 3 different levels were designed. Media blending was performed following a custom-made mixture design of experiments considering binary blends, resulting in 376 different blends that were tested during both the cell expansion and fed-batch production phases in one single experiment. Three approaches were chosen to make the best use of the large amount of data obtained. A simple ranking of conditions was first used as a quick approach to select new formulations with promising features. Then, prediction of the best mixes was done to maximize both growth and titer using the Design Expert software. Finally, a multivariate analysis enabled identification of individual potentially critical components for further optimization. Applying this high-throughput method to a fed-batch, rather than a simple batch, process opens new perspectives for medium and feed development and enables identification of an optimized process in a short time frame. PMID:23563583
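
    The reported count of 376 blends is consistent with one plausible design: the 16 pure media plus all C(16,2) = 120 binary pairs at three mixing ratios (16 + 120 x 3 = 376). That ratio scheme is an assumption, not stated in the abstract; the sketch below merely enumerates such a design:

        from itertools import combinations

        media = [f"M{i:02d}" for i in range(1, 17)]   # the 16 formulations
        ratios = (0.25, 0.50, 0.75)                   # assumed blend levels

        blends = [((m, 1.0),) for m in media]         # pure media first
        for a, b in combinations(media, 2):           # all 120 binary pairs
            blends += [((a, r), (b, 1.0 - r)) for r in ratios]

        print(len(blends))   # 376
        print(blends[16])    # first binary blend: (('M01', 0.25), ('M02', 0.75))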

  1. Recommendation of ruthenium source for sludge batch flowsheet studies

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Woodham, W.

    Included herein is a preliminary analysis of previously-generated data from sludge batches 7a, 7b, 8, and 9 sludge simulant and real-waste testing, performed to recommend a form of ruthenium for future sludge batch simulant testing under the nitric-formic flowsheet. Focus is given to reactions present in the Sludge Receipt and Adjustment Tank cycle, given that this cycle historically produces the most changes in chemical composition during Chemical Process Cell processing. Data is presented and analyzed for several runs performed under the nitric-formic flowsheet, with consideration given to effects on the production of hydrogen gas, nitrous oxide gas, consumption of formate, conversion of nitrite to nitrate, and the removal and recovery of mercury during processing. Additionally, a brief discussion is given to the effect of ruthenium source selection under the nitric-glycolic flowsheet. An analysis of data generated from scaled demonstration testing, sludge batch 9 qualification testing, and antifoam degradation testing under the nitric-glycolic flowsheet is presented. Experimental parameters of interest under the nitric-glycolic flowsheet include N2O production, glycolate destruction, conversion of glycolate to formate and oxalate, and the conversion of nitrite to nitrate. To date, the number of real-waste experiments that have been performed under the nitric-glycolic flowsheet is insufficient to provide a complete understanding of the effects of ruthenium source selection in simulant experiments with regard to fidelity to real-waste testing. Therefore, a determination of comparability between the two ruthenium sources as employed under the nitric-glycolic flowsheet is made based on available data in order to inform ruthenium source selection for future testing under the nitric-glycolic flowsheet.

  2. SLUDGE WASHING AND DEMONSTRATION OF THE DWPF FLOWSHEET IN THE SRNL SHIELDED CELLS FOR SLUDGE BATCH 7A QUALIFICATION

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pareizs, J.; Billings, A.; Click, D.

    2011-07-08

    Waste Solidification Engineering (WSE) has requested that characterization and a radioactive demonstration of the next batch of sludge slurry (Sludge Batch 7a*) be completed in the Shielded Cells Facility of the Savannah River National Laboratory (SRNL) via a Technical Task Request (TTR). This characterization and demonstration, or sludge batch qualification process, is required prior to transfer of the sludge from Tank 51 to the Defense Waste Processing Facility (DWPF) feed tank (Tank 40). The current WSE practice is to prepare sludge batches in Tank 51 by transferring sludge from other tanks. Discharges of nuclear materials from H Canyon are often added to Tank 51 during sludge batch preparation. The sludge is washed and transferred to Tank 40, the current DWPF feed tank. Prior to transfer of Tank 51 to Tank 40, SRNL simulates the Tank Farm and DWPF processes with a Tank 51 sample (referred to as the qualification sample). Sludge Batch 7a (SB7a) is composed of portions of Tanks 4, 7, and 12; the Sludge Batch 6 heel in Tank 51; and a plutonium stream from H Canyon. SRNL received the Tank 51 qualification sample (sample ID HTF-51-10-125) following sludge additions to Tank 51. This report documents: (1) The washing (addition of water to dilute the sludge supernate) and concentration (decanting of supernate) of the SB7a - Tank 51 qualification sample to adjust sodium content and weight percent insoluble solids to Tank Farm projections. (2) The performance of a DWPF Chemical Process Cell (CPC) simulation using the washed Tank 51 sample. The simulation included a Sludge Receipt and Adjustment Tank (SRAT) cycle, where acid was added to the sludge to destroy nitrite and reduce mercury, and a Slurry Mix Evaporator (SME) cycle, where glass frit was added to the sludge in preparation for vitrification. The SME cycle also included replication of five canister decontamination additions and concentrations. Processing parameters were based on work with a non-radioactive simulant. (3) Vitrification of a portion of the SME product and characterization and durability testing (as measured by the Product Consistency Test (PCT)) of the resulting glass. (4) Rheology measurements of the initial slurry samples and samples after each phase of CPC processing. This program was controlled by a Task Technical and Quality Assurance Plan (TTQAP), and analyses were guided by an Analytical Study Plan. This work is Technical Baseline Research and Development (R&D) for the DWPF. It should be noted that much of the data in this document has been published in interoffice memoranda. The intent of this technical report is to bring all of the SB7a related data together in a single permanent record and to discuss the overall aspects of SB7a processing.

  3. Batch and multi-step fed-batch enzymatic saccharification of Formiline-pretreated sugarcane bagasse at high solid loadings for high sugar and ethanol titers.

    PubMed

    Zhao, Xuebing; Dong, Lei; Chen, Liang; Liu, Dehua

    2013-05-01

    Formiline pretreatment pertains to a biomass fractionation process. In the present work, Formiline-pretreated sugarcane bagasse was hydrolyzed with cellulases by batch and multi-step fed-batch processes at 20% solid loading. For wet pulp, after 144 h of incubation with a cellulase loading of 10 FPU/g dry solid, the fed-batch process obtained ~150 g/L glucose and ~80% glucan conversion, while the batch process obtained ~130 g/L glucose with a corresponding ~70% glucan conversion. Solid loading could be further increased to 30% for the acetone-dried pulp. By fed-batch hydrolysis of the dried pulp in pH 4.8 buffer solution, glucose concentration reached 247.3±1.6 g/L with a corresponding 86.1±0.6% glucan conversion. The enzymatic hydrolyzates could be well converted to ethanol by a subsequent fermentation using Saccharomyces cerevisiae with an ethanol titer of 60-70 g/L. Batch and fed-batch SSF indicated that the Formiline-pretreated substrate showed excellent fermentability. The final ethanol concentration was 80 g/L, corresponding to 82.7% of theoretical yield. Copyright © 2012 Elsevier Ltd. All rights reserved.
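
    For orientation, glucan conversion figures like those above follow from the standard anhydro correction (180/162 ≈ 1.111 g glucose per g glucan). A small sketch; the solids composition used here is assumed for illustration, not taken from the paper:

        def glucan_conversion(glucose_g_per_l, solids_g_per_l, glucan_fraction):
            # glucose released over the theoretical maximum, using the
            # 1.111 (180/162) glucan-to-glucose hydration mass factor
            theoretical = solids_g_per_l * glucan_fraction * 1.111
            return glucose_g_per_l / theoretical

        # 20% solids (200 g/L) at an assumed 85% glucan content in the pulp
        print(f"{glucan_conversion(150, 200, 0.85):.0%}")  # ~79% conversion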

  4. Effects of HPMC substituent pattern on water up-take, polymer and drug release: An experimental and modelling study.

    PubMed

    Caccavo, Diego; Lamberti, Gaetano; Barba, Anna Angela; Abrahmsén-Alami, Susanna; Viridén, Anna; Larsson, Anette

    2017-08-07

    The purpose of this study was to investigate the hydration behavior of two matrix formulations containing the cellulose derivative hydroxypropyl methylcellulose (HPMC). The two HPMC batches investigated had different substitution patterns along the backbone; the first is referred to as heterogeneous and the second as homogeneous. The release of both the drug molecule theophylline and the polymer was determined. Additionally, the water concentrations at different positions in the swollen gel layers were determined by magnetic resonance imaging. The experimental data were compared to values predicted by the extension of a mechanistic Fickian-based model. The hydration of tablets containing the more homogeneous HPMC batch showed a gradual water concentration gradient in the gel layer and could be well predicted. The hydration process for the more heterogeneous batch showed a very abrupt step change in the water concentration in the gel layer and could not be well predicted. Based on the comparison between the experimental and predicted data, this study suggests, for the first time, that formulations with HPMC of different heterogeneities form gels in different ways. The homogeneous HPMC batch exhibits water sorption behavior ascribable to Fick's law of diffusion, whereas the more heterogeneous HPMC batch does not. This conclusion is important for the future development of simulation models and for understanding the drug release mechanism from hydrophilic matrices. Copyright © 2017 Elsevier B.V. All rights reserved.

  5. Large-scale manufacturing of GMP-compliant anti-EGFR targeted nanocarriers: production of doxorubicin-loaded anti-EGFR-immunoliposomes for a first-in-man clinical trial.

    PubMed

    Wicki, Andreas; Ritschard, Reto; Loesch, Uli; Deuster, Stefanie; Rochlitz, Christoph; Mamot, Christoph

    2015-04-30

    We describe the large-scale, GMP-compliant production process of doxorubicin-loaded and anti-EGFR-coated immunoliposomes (anti-EGFR-ILs-dox) used in a first-in-man, dose escalation clinical trial. 10 batches of this nanoparticle have been produced in clean room facilities. Stability data from the pre-GMP and the GMP batch indicate that the anti-EGFR-ILs-dox nanoparticle was stable for at least 18 months after release. Release criteria included visual inspection, sterility testing, as well as measurements of pH (pH 5.0-7.0), doxorubicin HCl concentration (0.45-0.55 mg/ml), endotoxin concentration (<1.21 IU/ml), leakage (<10%), particle size (Z-average of Caelyx ± 20 nm), and particle uptake (uptake absolute: >0.50 ng doxorubicin/μg protein; uptake relatively to PLD: >5 fold). All batches fulfilled the defined release criteria, indicating a high reproducibility as well as batch-to-batch uniformity of the main physico-chemical features of the nanoparticles in the setting of the large-scale GMP process. In the clinical trial, 29 patients were treated with this nanoparticle between 2007 and 2010. Pharmacokinetic data of anti-EGFR-ILs-dox collected during the clinical study revealed stability of the nanocarrier in vivo. Thus, reliable and GMP-compliant production of anti-EGFR-targeted nanoparticles for clinical application is feasible. Copyright © 2015 Elsevier B.V. All rights reserved.

  6. Active Job Monitoring in Pilots

    NASA Astrophysics Data System (ADS)

    Kuehn, Eileen; Fischer, Max; Giffels, Manuel; Jung, Christopher; Petzold, Andreas

    2015-12-01

    Recent developments in high energy physics (HEP), including multi-core jobs and multi-core pilots, require data centres to gain a deep understanding of the system in order to monitor, design, and upgrade computing clusters. Networking is a critical component. Especially the increased usage of data federations, for example in diskless computing centres or as a fallback solution, relies on WAN connectivity and availability. The specific demands of different experiments and communities, but also the need to identify misbehaving batch jobs, require active monitoring. Existing monitoring tools are not capable of measuring fine-grained information at batch job level, which complicates network-aware scheduling and optimisation. In addition, pilots add another layer of abstraction: they behave like batch systems themselves by managing and executing payloads of jobs internally. The number of real jobs being executed is unknown, as the original batch system has no access to internal information about the scheduling process inside the pilots. Therefore, the comparability of jobs and pilots for predicting run-time behaviour or network performance cannot be ensured. Hence, identifying the actual payload is important. At the GridKa Tier 1 centre a specific tool is in use that allows the monitoring of network traffic information at batch job level. This contribution presents the current monitoring approach, discusses recent efforts to identify pilots and their substructures inside the batch system, and explains why this identification matters. It also shows how to determine monitoring data of specific jobs from identified pilots. Finally, the approach is evaluated.

  7. Does Warming a Lysozyme Solution Cook One's Data?

    NASA Technical Reports Server (NTRS)

    Pusey, Marc; Burke, Michael; Judge, Russell

    2000-01-01

    Chicken egg white lysozyme has a well characterized thermally driven phase transition. Between pH 4.0 and 5.2, the transition temperature, as defined by the point where the tetragonal and orthorhombic solubilities are equal, is a function of the pH, salt (precipitant) type and concentration, and most likely of the buffer concentration as well. This phase transition can be carried out with the protein solution alone, prior to the initiation of the crystallization process. We have now measured the kinetics of this process and investigated its reversibility. An aliquot of a stock protein solution was held at a given temperature and, at periodic intervals, used to set up batch crystallization experiments. The batch solutions were incubated at 20 C until macroscopic crystals were obtained, at which point the number of crystals in each well was counted. The transition effects increased with temperature: the crystal count fell off slowly at 30 C, with a half time (time to approx. 1/2 the t = 0 number of crystals) of approx. 5 hours, and with an estimated half time of approx. 0.5 hours at 43 C. Further, the process was not reversible by simple cooling. After holding a lysozyme solution at 37 C (prior to addition of precipitant) for 16 hours, then cooling and holding it at 4 C, no return to the pre-warmed nucleation kinetics was observed after at least 4 weeks. Thus every thermal excursion above the phase transition point results in a further decrease in the nucleation rate of that solution, the extent being a function of the time and temperature. Orthorhombic lysozyme crystals apparently do not undergo the flow-induced growth cessation of tetragonal lysozyme crystals. We have previously shown that putting the protein in the orthorhombic form does not affect the averaged face growth kinetics, only nucleation, for tetragonal crystals. We may be able to use this differential behavior to elucidate how flow affects the lysozyme crystal growth process.

  8. Annual ADP planning document

    NASA Technical Reports Server (NTRS)

    Mogilevsky, M.

    1973-01-01

    The Category A computer systems at KSC (A1 and A2) which perform scientific and business/administrative operations are described. This data division is responsible for scientific requirements supporting Saturn, Atlas/Centaur, Titan/Centaur, Titan III, and Delta vehicles, and includes real-time functions, the Apollo-Soyuz Test Project (ASTP), and the Space Shuttle. The work is performed chiefly on the GE-635 (A1) system located in the Central Instrumentation Facility (CIF). The A1 system can perform computations and process data in three modes: (1) real-time critical mode; (2) real-time batch mode; and (3) batch mode. The Division's IBM-360/50 (A2) system, also at the CIF, performs business/administrative data processing such as personnel, procurement, reliability, financial management and payroll, real-time inventory management, GSE accounting, preventive maintenance, and integrated launch vehicle modification status.

  9. Fault Detection for Nonlinear Process With Deterministic Disturbances: A Just-In-Time Learning Based Data Driven Method.

    PubMed

    Yin, Shen; Gao, Huijun; Qiu, Jianbin; Kaynak, Okyay

    2017-11-01

    Data-driven fault detection plays an important role in industrial systems due to its applicability in the case of unknown physical models. In fault detection, disturbances must be taken into account as an inherent characteristic of processes. Nevertheless, fault detection for nonlinear processes with deterministic disturbances still receives little attention, especially in the data-driven field. To solve this problem, a just-in-time learning-based data-driven (JITL-DD) fault detection method for nonlinear processes with deterministic disturbances is proposed in this paper. JITL-DD employs a JITL scheme for process description, with local model structures to cope with process dynamics and nonlinearity. The proposed method provides a data-driven fault detection solution for nonlinear processes with deterministic disturbances, and offers inherent online adaptation and high fault detection accuracy. Two nonlinear systems, i.e., a numerical example and a sewage treatment process benchmark, are employed to show the effectiveness of the proposed method.

  10. Rigorous buoyancy driven bubble mixing for centrifugal microfluidics.

    PubMed

    Burger, S; Schulz, M; von Stetten, F; Zengerle, R; Paust, N

    2016-01-21

    We present batch-mode mixing for centrifugal microfluidics operated at fixed rotational frequency. Gas is generated by the disk-integrated decomposition of hydrogen peroxide (H2O2) to liquid water (H2O) and gaseous oxygen (O2) and inserted into a mixing chamber. There, bubbles are formed that ascend through the liquid in the artificial gravity field and lead to drag flow. Additionally, strong buoyancy causes deformation and rupture of the gas bubbles and induces strong mixing flows in the liquid. Buoyancy driven bubble mixing is quantitatively compared to shake mode mixing, mixing by reciprocation, and vortex mixing. To determine mixing efficiencies in a meaningful way, the different mixers are employed for mixing of a lysis reagent and human whole blood. Subsequently, DNA is extracted from the lysate and the amount of DNA recovered is taken as a measure of mixing efficiency. Relative to standard vortex mixing, DNA extraction based on buoyancy driven bubble mixing resulted in yields of 92 ± 8% (100 s mixing time) and 100 ± 8% (600 s) at 130g centrifugal acceleration. Shake mode mixing yields 96 ± 11% and is thus equal to buoyancy driven bubble mixing; an advantage of buoyancy driven bubble mixing, however, is that it can be operated at fixed rotational frequency. The additional costs of implementing buoyancy driven bubble mixing are low, since both the activation liquid and the catalyst are very low cost and no external means are required in the processing device. Furthermore, buoyancy driven bubble mixing can easily be integrated in a monolithic manner and is compatible with scalable manufacturing technologies such as injection moulding or thermoforming. We consider buoyancy driven bubble mixing an excellent alternative to shake mode mixing, in particular if the processing device is not capable of providing fast changes of rotational frequency or if a low average rotational frequency is challenging for the other integrated fluidic operations.
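
    As a back-of-envelope companion to the gas generation step, the stoichiometry 2 H2O2 -> 2 H2O + O2 fixes how much mixing gas a given peroxide aliquot can release; the helper below estimates that volume under an ideal-gas assumption (the function name and default conditions are ours, not from the paper).

      def o2_volume_from_h2o2(v_liquid_ul, conc_mol_per_l, T=298.15, P=101325.0):
          """Ideal-gas estimate of the O2 volume (in microlitres) released by
          complete decomposition of an H2O2 aliquot: 2 H2O2 -> 2 H2O + O2."""
          R = 8.314                                      # J/(mol K)
          n_h2o2 = conc_mol_per_l * v_liquid_ul * 1e-6   # uL -> L, then mol
          n_o2 = 0.5 * n_h2o2
          return n_o2 * R * T / P * 1e9                  # m^3 -> uL

    For example, 10 uL of a 1 M H2O2 solution yields roughly 120 uL of O2 at ambient conditions, which illustrates why a small activation volume suffices to drive vigorous bubble mixing.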

  11. Research on key technologies of data processing in internet of things

    NASA Astrophysics Data System (ADS)

    Zhu, Yangqing; Liang, Peiying

    2017-08-01

    Data in the Internet of Things (IOT) are polymorphic, heterogeneous, large in volume, and must be processed in real time. Traditional structured, static batch processing methods cannot meet these requirements. This paper studies a middleware that integrates heterogeneous IOT data into a unified format, designs an IOT data processing model based on the Storm stream-computing architecture, and integrates existing Internet security technology to build a security system for IOT data processing, providing a reference for the efficient transmission and processing of IOT data.
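
    The middleware idea, normalising heterogeneous device payloads into one format before stream processing, can be sketched in a few lines of Python; this is an illustrative stand-in (field names and formats are assumed), not the paper's implementation, which builds on the Storm architecture.

      import json

      def unify(record):
          """Map heterogeneous IOT payloads (dict, JSON text, key=value pairs,
          or bare CSV) onto one dict format."""
          if isinstance(record, dict):
              return record
          if record.lstrip().startswith("{"):
              return json.loads(record)
          if "=" in record:
              return dict(kv.split("=", 1) for kv in record.split(";"))
          return dict(zip(("sensor_id", "timestamp", "value"), record.split(",")))

      def stream_processor(source):
          """Consume and normalise records one at a time, mimicking a stream
          (rather than static batch) processing model."""
          for raw in source:
              yield unify(raw)

    Example: list(stream_processor(['{"sensor_id": "a"}', 'sensor_id=b;value=3', 'c,170000,2.5'])) returns three dicts in the unified format.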

  12. Process Analytical Technology (PAT): batch-to-batch reproducibility of fermentation processes by robust process operational design and control.

    PubMed

    Gnoth, S; Jenzsch, M; Simutis, R; Lübbert, A

    2007-10-31

    The Process Analytical Technology (PAT) initiative of the FDA is a reaction to the increasing discrepancy between the current possibilities in process supervision and control of pharmaceutical production processes and their actual application in industrial manufacturing. With rigid approval practices based on standard operational procedures, adaptation of production reactors towards the state of the art was more or less inhibited for many years. Now PAT paves the way for continuous process and product improvements through improved process supervision based on knowledge-based data analysis, "Quality-by-Design" concepts, and, finally, feedback control. Examples of up-to-date implementations of this concept are presented. They are taken from one key group of processes in recombinant pharmaceutical protein manufacturing, the cultivation of genetically modified Escherichia coli bacteria.

  13. Design of two-column batch-to-batch recirculation to enhance performance in ion-exchange chromatography.

    PubMed

    Persson, Oliver; Andersson, Niklas; Nilsson, Bernt

    2018-01-05

    Preparative liquid chromatography is a separation technique widely used in the manufacturing of fine chemicals and pharmaceuticals. A major drawback of the traditional single-column batch chromatography step is the trade-off between product purity and process performance. Recirculation of impure product can be utilized to make this trade-off more favorable. The aim of the present study was to investigate the use of a two-column batch-to-batch recirculation process step to increase performance compared to single-column batch chromatography at a high purity requirement. The separation of a ternary protein mixture on ion-exchange chromatography columns was used to evaluate the proposed process. The investigation comprised modelling and simulation of the process step, experimental validation, and optimization of the simulated process. In the presented case the yield increases from 45.4% to 93.6% and the productivity increases 3.4 times compared to the performance of a batch run for a nominal case. A rapid build-up of product concentration can be seen during the first cycles, before the process reaches a cyclic steady state with reoccurring concentration profiles. The optimization of the simulation model predicts that the recirculated salt can be used as a flying start for the elution, which would enhance process performance. The proposed process is more complex than a batch process, but may improve separation performance, especially while operating at cyclic steady state. The recirculation of impure fractions reduces product losses and ensures separation of product to a high degree of purity. Copyright © 2017 Elsevier B.V. All rights reserved.

  14. Application of a mechanistic model as a tool for on-line monitoring of pilot scale filamentous fungal fermentation processes-The importance of evaporation effects.

    PubMed

    Mears, Lisa; Stocks, Stuart M; Albaek, Mads O; Sin, Gürkan; Gernaey, Krist V

    2017-03-01

    A mechanistic model-based soft sensor is developed and validated for 550L filamentous fungus fermentations operated at Novozymes A/S. The soft sensor comprises a parameter estimation block based on a stoichiometric balance, coupled to a dynamic process model. The on-line parameter estimation block models the changing rates of formation of product, biomass, and water, and the rate of consumption of feed, using standard, available on-line measurements. This parameter estimation block is coupled to a mechanistic process model, which solves the current states of biomass, product, substrate, dissolved oxygen, and mass, as well as other process parameters including kLa, viscosity, and the partial pressure of CO2. State estimation at this scale requires a robust mass model including evaporation, a factor not often considered at smaller scales of operation. The model is developed using a historical data set of 11 batches from the fermentation pilot plant (550L) at Novozymes A/S. The model is then implemented on-line in 550L fermentation processes operated at Novozymes A/S in order to validate the state estimator on 14 new batches utilizing a new strain. The product concentration in the validation batches was predicted with an average root mean sum of squared error (RMSSE) of 16.6%. In addition, calculation of the Janus coefficient for the validation batches shows a suitably calibrated model. The robustness of the model prediction is assessed with respect to the accuracy of the input data, and parameter estimation uncertainty is also quantified. The application of this on-line state estimator allows for on-line monitoring of pilot scale batches, including real-time estimates of multiple parameters that cannot be monitored on-line. Successful application of a soft sensor at this scale allows for improved process monitoring, as well as opening up further possibilities for on-line control algorithms utilizing these on-line model outputs. Biotechnol. Bioeng. 2017;114:589-599. © 2016 Wiley Periodicals, Inc.
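
    Two of the figures of merit quoted above can be written down compactly. The snippet below shows one plausible reading of the RMSSE (RMSE normalised by the mean observation, reported as a percentage) and a common definition of the Janus coefficient (validation-to-calibration mean squared error ratio); the exact formulas used in the paper may differ.

      import numpy as np

      def relative_rmse_percent(pred, obs):
          """Root-mean-squared prediction error scaled by the mean observation."""
          pred, obs = np.asarray(pred, float), np.asarray(obs, float)
          return 100.0 * np.sqrt(np.mean((pred - obs) ** 2)) / np.mean(obs)

      def janus_coefficient(resid_calibration, resid_validation):
          """Ratio of validation to calibration MSE; values near 1 suggest the
          model generalises from the calibration batches to new batches."""
          return (np.mean(np.square(resid_validation)) /
                  np.mean(np.square(resid_calibration)))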

  15. 40 CFR 63.487 - Batch front-end process vents-reference control technology.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ...-reference control technology. 63.487 Section 63.487 Protection of Environment ENVIRONMENTAL PROTECTION... SOURCE CATEGORIES National Emission Standards for Hazardous Air Pollutant Emissions: Group I Polymers and Resins § 63.487 Batch front-end process vents—reference control technology. (a) Batch front-end process...

  16. 40 CFR 63.491 - Batch front-end process vents-recordkeeping requirements.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... CATEGORIES National Emission Standards for Hazardous Air Pollutant Emissions: Group I Polymers and Resins § 63.491 Batch front-end process vents—recordkeeping requirements. (a) Group determination records for...) through (a)(6) of this section for each batch front-end process vent subject to the group determination...

  17. 40 CFR 63.487 - Batch front-end process vents-reference control technology.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... control technology. 63.487 Section 63.487 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY... CATEGORIES National Emission Standards for Hazardous Air Pollutant Emissions: Group I Polymers and Resins § 63.487 Batch front-end process vents—reference control technology. (a) Batch front-end process vents...

  18. 40 CFR 63.491 - Batch front-end process vents-recordkeeping requirements.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... CATEGORIES National Emission Standards for Hazardous Air Pollutant Emissions: Group I Polymers and Resins § 63.491 Batch front-end process vents—recordkeeping requirements. (a) Group determination records for...) through (a)(6) of this section for each batch front-end process vent subject to the group determination...

  19. 40 CFR 63.487 - Batch front-end process vents-reference control technology.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ...-reference control technology. 63.487 Section 63.487 Protection of Environment ENVIRONMENTAL PROTECTION... SOURCE CATEGORIES National Emission Standards for Hazardous Air Pollutant Emissions: Group I Polymers and Resins § 63.487 Batch front-end process vents—reference control technology. (a) Batch front-end process...

  20. 40 CFR 63.487 - Batch front-end process vents-reference control technology.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ...-reference control technology. 63.487 Section 63.487 Protection of Environment ENVIRONMENTAL PROTECTION... SOURCE CATEGORIES National Emission Standards for Hazardous Air Pollutant Emissions: Group I Polymers and Resins § 63.487 Batch front-end process vents—reference control technology. (a) Batch front-end process...

  1. 40 CFR 63.491 - Batch front-end process vents-recordkeeping requirements.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... CATEGORIES National Emission Standards for Hazardous Air Pollutant Emissions: Group I Polymers and Resins § 63.491 Batch front-end process vents—recordkeeping requirements. (a) Group determination records for...) through (a)(6) of this section for each batch front-end process vent subject to the group determination...

  2. 40 CFR 63.491 - Batch front-end process vents-recordkeeping requirements.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... CATEGORIES National Emission Standards for Hazardous Air Pollutant Emissions: Group I Polymers and Resins § 63.491 Batch front-end process vents—recordkeeping requirements. (a) Group determination records for...) through (a)(6) of this section for each batch front-end process vent subject to the group determination...

  3. Production of citric acid using its extraction wastewater treated by anaerobic digestion and ion exchange in an integrated citric acid-methane fermentation process.

    PubMed

    Xu, Jian; Chen, Yang-Qiu; Zhang, Hong-Jian; Tang, Lei; Wang, Ke; Zhang, Jian-Hua; Chen, Xu-Sheng; Mao, Zhong-Gui

    2014-08-01

    In order to solve the problem of extraction wastewater pollution in the citric acid industry, an integrated citric acid-methane fermentation process is proposed in this study. Extraction wastewater was treated by mesophilic anaerobic digestion and then used to make mash for the next batch of citric acid fermentation. The recycling process was run for seven batches. Citric acid production (82.4 g/L on average) decreased by 34.1% in the recycling batches (2nd-7th) compared with the first batch, and the residual reducing sugar exceeded 40 g/L on average in the recycling batches. Pigment substances, acetic acid, ammonium, and metal ions in the anaerobic digestion effluent (ADE) were considered to be the inhibitors, and their effects on the fermentation were studied. Results indicated that ammonium, Na(+), and K(+) in the ADE significantly inhibited citric acid fermentation. Therefore, the ADE was treated by acidic cation exchange resin prior to reuse for making mash for citric acid fermentation. This recycling process was performed for ten batches, and citric acid production in the recycling batches averaged 126.6 g/L, an increase of 1.7% compared with the first batch. This process could eliminate extraction wastewater discharge and reduce water resource consumption.

  4. Stochastic growth logistic model with aftereffect for batch fermentation process

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rosli, Norhayati; Ayoubi, Tawfiqullah; Bahar, Arifah

    2014-06-19

    In this paper, the stochastic growth logistic model with aftereffect for the cell growth of C. acetobutylicum P262 and the Luedeking-Piret equations for solvent production in a batch fermentation system are introduced. The parameter values of the mathematical models are estimated via the Levenberg-Marquardt optimization method of non-linear least squares. We apply the Milstein scheme to solve the stochastic models numerically. The efficiency of the mathematical models is measured by comparing the simulated results with the experimental data of microbial growth and solvent production in the batch system. Low values of the Root Mean-Square Error (RMSE) for the stochastic models with aftereffect indicate good fits.

  5. Stochastic growth logistic model with aftereffect for batch fermentation process

    NASA Astrophysics Data System (ADS)

    Rosli, Norhayati; Ayoubi, Tawfiqullah; Bahar, Arifah; Rahman, Haliza Abdul; Salleh, Madihah Md

    2014-06-01

    In this paper, the stochastic growth logistic model with aftereffect for the cell growth of C. acetobutylicum P262 and the Luedeking-Piret equations for solvent production in a batch fermentation system are introduced. The parameter values of the mathematical models are estimated via the Levenberg-Marquardt optimization method of non-linear least squares. We apply the Milstein scheme to solve the stochastic models numerically. The efficiency of the mathematical models is measured by comparing the simulated results with the experimental data of microbial growth and solvent production in the batch system. Low values of the Root Mean-Square Error (RMSE) for the stochastic models with aftereffect indicate good fits.
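
    For readers who want to reproduce the numerical scheme, the sketch below integrates a stochastic logistic SDE, dX = r*X*(1 - X/K) dt + sigma*X dW, with the Milstein method; note that it omits the aftereffect (time-delay) term of the paper's model, and all parameter values are placeholders to be replaced by Levenberg-Marquardt estimates.

      import numpy as np

      def milstein_logistic(x0, r, K, sigma, T, n_steps, seed=0):
          """One path of dX = r*X*(1 - X/K) dt + sigma*X dW via Milstein.
          The aftereffect (delay) term of the published model is omitted."""
          rng = np.random.default_rng(seed)
          dt = T / n_steps
          x = np.empty(n_steps + 1)
          x[0] = x0
          for n in range(n_steps):
              dW = rng.normal(0.0, np.sqrt(dt))
              drift = r * x[n] * (1.0 - x[n] / K)
              diff = sigma * x[n]
              # Milstein correction 0.5*b*b'*(dW^2 - dt), with b'(x) = sigma
              x[n + 1] = (x[n] + drift * dt + diff * dW
                          + 0.5 * diff * sigma * (dW**2 - dt))
          return x

    Averaging many such paths and computing the RMSE against measured cell densities mirrors the model-efficiency check described in the abstract.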

  6. Detection and recognition of mechanical, digging and vehicle signals in the optical fiber pre-warning system

    NASA Astrophysics Data System (ADS)

    Tian, Qing; Yang, Dan; Zhang, Yuan; Qu, Hongquan

    2018-04-01

    This paper presents a detection and recognition method to locate and identify harmful intrusions in the optical fiber pre-warning system (OFPS). Inspired by the visual attention architecture (VAA), the process flow is divided into two parts, i.e., a data-driven process and a task-driven process. First, the data-driven process takes all the measurements collected by the system as input signals, which are handled by the detection method to locate harmful intrusions in both the spatial domain and the time domain. Then, these detected intrusion signals are taken over by the task-driven process. Specifically, we use the pitch period (PP) and duty cycle (DC) of the intrusion signals to identify mechanical and manual digging (MD) intrusions, respectively. For passing vehicle (PV) intrusions, their strong low frequency component serves as a good feature. In general, since the harmful intrusion signals account for only a small part of the whole measurements, the data-driven process considerably reduces the amount of input data for the subsequent task-driven process. Furthermore, the task-driven process handles the harmful intrusions in order of severity, which establishes a priority mechanism for the system as well as targeted processing for the different harmful intrusions. Finally, real experiments are performed to validate the effectiveness of this method.

  7. Production of laccase by Coriolus versicolor and its application in decolorization of dyestuffs: (I). Production of laccase by batch and repeated-batch processes.

    PubMed

    Lin, Jian-Ping; Wei, Lian; Xia, Li-Ming; Cen, Pei-Lin

    2003-01-01

    The production of laccase by Coriolus versicolor was studied. The effect of cultivation conditions on laccase production was examined to obtain an optimal medium and cultivation conditions. Both batch and repeated-batch processes were performed for laccase production. In repeated-batch fermentation with self-immobilized mycelia, a total of 14 cycles was performed, with laccase activity in the range between 3.4 and 14.8 U/ml.

  8. Parallel-Batch Scheduling and Transportation Coordination with Waiting Time Constraint

    PubMed Central

    Gong, Hua; Chen, Daheng; Xu, Ke

    2014-01-01

    This paper addresses a parallel-batch scheduling problem that incorporates the transportation of raw materials or semifinished products before processing, with a waiting time constraint. The orders located at the different suppliers are transported by vehicles to a manufacturing facility for further processing; one vehicle can load only one order per shipment. Each order arriving at the facility must be processed within the limited waiting time. The orders are processed in batches on a parallel-batch machine, where a batch contains several orders and the processing time of the batch is the largest processing time of the orders in it. The goal is to find a schedule that minimizes the sum of the total flow time and the production cost. We prove that the general problem is NP-hard in the strong sense. We also demonstrate that the problem with equal processing times on the machine is NP-hard. Furthermore, a pseudopolynomial-time dynamic programming algorithm is provided to prove its ordinary NP-hardness. An optimal polynomial-time algorithm is presented to solve a special case with equal processing times and equal transportation times for each order. PMID:24883385
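
    The batching rule in the abstract (a parallel batch takes as long as its longest order) is easy to state in code; the sketch below evaluates the total flow time of a given batch sequence on one machine, ignoring the transportation and waiting-time constraints that make the full problem NP-hard.

      def total_flow_time(batches, proc_times, arrivals):
          """Sum of (completion - arrival) over all orders when the given
          batches run back-to-back; a batch's processing time is the max
          processing time of the orders in it."""
          t, total = 0.0, 0.0
          for batch in batches:
              t += max(proc_times[j] for j in batch)   # batch completes at t
              total += sum(t - arrivals[j] for j in batch)
          return total

    For instance, total_flow_time([[0, 1], [2]], [3, 5, 2], [0, 0, 1]) evaluates the schedule in which orders 0 and 1 share the first batch.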

  9. Advances in industrial biopharmaceutical batch process monitoring: Machine-learning methods for small data problems.

    PubMed

    Tulsyan, Aditya; Garvin, Christopher; Ündey, Cenk

    2018-04-06

    Biopharmaceutical manufacturing comprises multiple distinct processing steps that require effective and efficient monitoring of many variables simultaneously in real-time. State-of-the-art real-time multivariate statistical batch process monitoring (BPM) platforms have been in use in recent years to ensure comprehensive monitoring is in place as a complementary tool for continued process verification to detect weak signals. This article addresses a longstanding, industry-wide problem in BPM, referred to as the "Low-N" problem, wherein a product has a limited production history. The current best industrial practice to address the Low-N problem is to switch from a multivariate to a univariate BPM until sufficient product history is available to build and deploy a multivariate BPM platform. Every batch run without a robust multivariate BPM platform poses the risk of not detecting potential weak signals developing in the process that might have an impact on process and product performance. In this article, we propose an approach to solve the Low-N problem by generating an arbitrarily large number of in silico batches through a combination of hardware exploitation and machine-learning methods. To the best of the authors' knowledge, this is the first article to provide a solution to the Low-N problem in biopharmaceutical manufacturing using machine-learning methods. Several industrial case studies from bulk drug substance manufacturing are presented to demonstrate the efficacy of the proposed approach for BPM under various Low-N scenarios. © 2018 Wiley Periodicals, Inc.
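
    The paper's in silico batch generator is not specified in the abstract; as a deliberately simple stand-in, the sketch below fits a multivariate normal to the few available batches and samples synthetic ones from it, which conveys the Low-N idea (augmenting a thin production history) without claiming to match the authors' machine-learning method.

      import numpy as np

      def in_silico_batches(X, n_new, seed=0):
          """X: (n_batches, n_variables) matrix of historical batch summaries.
          Returns n_new synthetic batches drawn from a fitted Gaussian."""
          rng = np.random.default_rng(seed)
          mu = X.mean(axis=0)
          cov = np.cov(X, rowvar=False) + 1e-6 * np.eye(X.shape[1])  # regularise
          return rng.multivariate_normal(mu, cov, size=n_new)

    The synthetic batches can then be used to set multivariate control limits that would otherwise be unreliable with only a handful of real batches.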

  10. A novel process-based model of microbial growth: self-inhibition in Saccharomyces cerevisiae aerobic fed-batch cultures.

    PubMed

    Mazzoleni, Stefano; Landi, Carmine; Cartenì, Fabrizio; de Alteriis, Elisabetta; Giannino, Francesco; Paciello, Lucia; Parascandola, Palma

    2015-07-30

    Microbial population dynamics in bioreactors depend on both nutrient availability and changes in the growth environment. Research is still ongoing into the optimization of bioreactor yields, focusing on increasing the maximum achievable cell density. A new process-based model is proposed to describe the aerobic growth of Saccharomyces cerevisiae cultured on glucose as carbon and energy source. The model considers the main metabolic routes of glucose assimilation (fermentation to ethanol and respiration) and the occurrence of inhibition due to the accumulation of both ethanol and other self-produced toxic compounds in the medium. Model simulations reproduced data from classic and new experiments of yeast growth in batch and fed-batch cultures. Model and experimental results showed that the growth decline observed in prolonged fed-batch cultures had to be ascribed to self-produced inhibitory compounds other than ethanol. The presented results clarify the dynamics of microbial growth under different feeding conditions and highlight the relevance of the negative feedback by self-produced inhibitory compounds on the maximum cell densities achieved in a bioreactor.

  11. 40 CFR 63.491 - Batch front-end process vents-recordkeeping requirements.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... (b)(2): (i) For an incinerator or non-combustion control device, the percent reduction of organic HAP... the process vent stream is introduced with combustion air or is used as a secondary fuel and is not... combustion device to control halogenated batch front-end process vents or halogenated aggregate batch vent...

  12. Root Cause Analysis of Quality Defects Using HPLC-MS Fingerprint Knowledgebase for Batch-to-batch Quality Control of Herbal Drugs.

    PubMed

    Yan, Binjun; Fang, Zhonghua; Shen, Lijuan; Qu, Haibin

    2015-01-01

    The batch-to-batch quality consistency of herbal drugs has always been an important issue. This work proposes a methodology for batch-to-batch quality control based on HPLC-MS fingerprints and a process knowledgebase. The extraction process of Compound E-jiao Oral Liquid was taken as a case study. After establishing the HPLC-MS fingerprint analysis method, the fingerprints of the extract solutions produced under normal and abnormal operation conditions were obtained. Multivariate statistical models were built for fault detection, and a discriminant analysis model was built using the probabilistic discriminant partial-least-squares method for fault diagnosis. Based on multivariate statistical analysis, process knowledge was acquired and the cause-effect relationship between process deviations and quality defects was revealed. The quality defects were detected successfully by multivariate statistical control charts, and the types of process deviations were diagnosed correctly by discriminant analysis. This work has demonstrated the benefits of combining HPLC-MS fingerprints, process knowledge, and multivariate analysis for the quality control of herbal drugs. Copyright © 2015 John Wiley & Sons, Ltd.

  13. Data mining for rapid prediction of facility fit and debottlenecking of biomanufacturing facilities.

    PubMed

    Yang, Yang; Farid, Suzanne S; Thornhill, Nina F

    2014-06-10

    Higher titre processes can pose facility fit challenges in legacy biopharmaceutical purification suites with capacities originally matched to lower titre processes. Bottlenecks caused by mismatches in equipment sizes, combined with process fluctuations upon scale-up, can result in discarding expensive product. This paper describes a data mining decisional tool for rapid prediction of facility fit issues and debottlenecking of biomanufacturing facilities exposed to batch-to-batch variability and higher titres. The predictive tool comprised advanced multivariate analysis techniques to interrogate Monte Carlo stochastic simulation datasets that mimicked batch fluctuations in cell culture titres, step yields and chromatography eluate volumes. A decision tree classification method, CART (classification and regression tree) was introduced to explore the impact of these process fluctuations on product mass loss and reveal the root causes of bottlenecks. The resulting pictorial decision tree determined a series of if-then rules for the critical combinations of factors that lead to different mass loss levels. Three different debottlenecking strategies were investigated involving changes to equipment sizes, using higher capacity chromatography resins and elution buffer optimisation. The analysis compared the impact of each strategy on mass output, direct cost of goods per gram and processing time, as well as consideration of extra capital investment and space requirements. Copyright © 2014 The Authors. Published by Elsevier B.V. All rights reserved.
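
    The CART step is straightforward to reproduce with standard tooling. The sketch below trains a depth-limited decision tree on simulated batch data and prints the resulting if-then rules; the variable names, distributions, and mass-loss rule are invented for illustration, standing in for the paper's Monte Carlo outputs.

      import numpy as np
      from sklearn.tree import DecisionTreeClassifier, export_text

      rng = np.random.default_rng(1)
      n = 2000
      titre = rng.normal(5.0, 0.6, n)        # g/L, batch-to-batch variability
      step_yield = rng.normal(0.85, 0.05, n)
      eluate_vol = rng.normal(1.2, 0.2, n)   # column volumes
      # toy rule: mass loss occurs when a high-titre batch meets a large eluate
      mass_loss = ((titre * step_yield > 4.6) & (eluate_vol > 1.3)).astype(int)

      X = np.column_stack([titre, step_yield, eluate_vol])
      tree = DecisionTreeClassifier(max_depth=3).fit(X, mass_loss)
      print(export_text(tree, feature_names=["titre", "step_yield", "eluate_vol"]))

    The printed tree is the pictorial set of if-then rules the abstract refers to: each leaf names a combination of factors leading to a mass-loss level.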

  14. Mathematical Modeling to Reduce Waste of Compounded Sterile Products in Hospital Pharmacies

    PubMed Central

    Dobson, Gregory; Haas, Curtis E.; Tilson, David

    2014-01-01

    In recent years, many US hospitals embarked on “lean” projects to reduce waste. One advantage of the lean operational improvement methodology is that it relies on process observation by those engaged in the work and requires relatively little data. However, the thoughtful analysis of the data captured by operational systems allows the modeling of many potential process options. Such models permit the evaluation of likely waste reductions and financial savings before actual process changes are made. Thus the most promising options can be identified prospectively, change efforts targeted accordingly, and realistic targets set. This article provides one example of such a data-driven process redesign project focusing on waste reduction in an in-hospital pharmacy. A mathematical model of the medication prepared and delivered by the pharmacy is used to estimate the savings from several potential redesign options (rescheduling the start of production, scheduling multiple batches, or reordering production within a batch) as well as the impact of information system enhancements. The key finding is that mathematical modeling can indeed be a useful tool. In one hospital setting, it estimated that waste could be realistically reduced by around 50% by using several process changes and that the greatest benefit would be gained by rescheduling the start of production (for a single batch) away from the period when most order cancellations are made. PMID:25477580

  15. Algorithms for Autonomous GPS Orbit Determination and Formation Flying: Investigation of Initialization Approaches and Orbit Determination for HEO

    NASA Technical Reports Server (NTRS)

    Axelrad, Penina; Speed, Eden; Leitner, Jesse A. (Technical Monitor)

    2002-01-01

    This report summarizes the efforts to date in processing GPS measurements in High Earth Orbit (HEO) applications by the Colorado Center for Astrodynamics Research (CCAR). Two specific projects were conducted: initialization of the orbit propagation software, GEODE, using nominal orbital elements for the IMEX orbit, and processing of actual and simulated GPS data from the AMSAT satellite using a Doppler-only batch filter. CCAR has investigated a number of approaches for initialization of the GEODE orbit estimator with little a priori information. This document describes a batch solution approach that uses pseudorange or Doppler measurements collected over an orbital arc to compute an epoch state estimate. The algorithm is based on limited orbital element knowledge, from which a coarse estimate of satellite position and velocity can be determined and used to initialize GEODE. This algorithm assumes knowledge of the nominal orbital elements (a, e, i, Ω, ω) and uses a search on the time of perigee passage (τp) to estimate the host satellite position within the orbit and the approximate receiver clock bias. Results of the method are shown for a simulation including large orbital uncertainties and measurement errors. In addition, CCAR has attempted to process GPS data from the AMSAT satellite to obtain an initial estimate of the orbit. Limited GPS data have been received to date, with few satellites tracked and no computed point solutions. Unknown variables in the received data have made computation of a precise orbit using the recovered pseudoranges difficult. This document describes the Doppler-only batch approach used to compute the AMSAT orbit. Both actual flight data from AMSAT and simulated data generated using the Satellite Tool Kit and Goddard Space Flight Center's Flight Simulator were processed. Results for each case and conclusions are presented.
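
    The search on time of perigee passage can be expressed generically: sweep candidate values and keep the one whose predicted Doppler profile best fits the measurements. In the sketch below, residual_fn is a placeholder for the user's propagator-plus-measurement model, so only the search logic itself is shown.

      import numpy as np

      def search_perigee_passage(tau_grid, residual_fn):
          """Coarse grid search over candidate times of perigee passage.
          residual_fn(tau) must return the Doppler residual vector obtained
          by propagating the nominal elements with that tau and differencing
          against the measurements."""
          costs = [float(np.sum(residual_fn(tau) ** 2)) for tau in tau_grid]
          return tau_grid[int(np.argmin(costs))]

    The winning tau, together with the nominal elements, fixes an approximate position within the orbit that can seed GEODE.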

  16. VERIFICATION OF THE DEFENSE WASTE PROCESSING FACILITY PROCESS DIGESTION METHOD FOR THE SLUDGE BATCH 6 QUALIFICATION SAMPLE

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Click, D.; Jones, M.; Edwards, T.

    2010-06-09

    For each sludge batch that is processed in the Defense Waste Processing Facility (DWPF), the Savannah River National Laboratory (SRNL) confirms applicability of the digestion method to be used by the DWPF lab for elemental analysis of Sludge Receipt and Adjustment Tank (SRAT) receipt samples and SRAT product process control samples.1 DWPF SRAT samples are typically dissolved using a room temperature HF-HNO3 acid dissolution (i.e., DWPF Cold Chem (CC) Method, see DWPF Procedure SW4-15.201) and then analyzed by inductively coupled plasma - atomic emission spectroscopy (ICP-AES). In addition to the CC method confirmation, the DWPF lab's mercury (Hg) digestion method was also evaluated for applicability to SB6 (see DWPF procedure 'Mercury System Operating Manual', Manual: SW4-15.204, Section 6.1, Revision 5, Effective date: 12-04-03). This report contains the results and comparison of data generated from performing the Aqua Regia (AR), Sodium Peroxide/Hydroxide Fusion (PF) and DWPF Cold Chem (CC) method digestion of Sludge Batch 6 (SB6) SRAT Receipt and SB6 SRAT Product samples. For validation of the DWPF lab's Hg method, only SRAT receipt material was used and compared to AR digestion results. The SB6 SRAT Receipt and SB6 SRAT Product samples were prepared in the SRNL Shielded Cells, and the SRAT Receipt material is representative of the sludge that constitutes the SB6 Batch or qualification composition. This is the sludge in Tank 51 that is to be transferred into Tank 40, which will contain the heel of Sludge Batch 5 (SB5), to form the SB6 Blend composition. In addition to the 16 elements currently measured by the DWPF, this report includes Hg and thorium (Th) data (Th comprising approximately 2.5-3 wt% of the total solids in SRAT Receipt and SRAT Product, respectively) and provides specific details of the ICP-AES analysis of Th. Thorium was found to interfere with the U 367.007 nm emission line, and an inter-element correction (IEC) had to be applied to the U data, which is also discussed. The results for any one particular element should not be used in any way to identify the form or speciation of a particular element without support from XRD analysis, or used to estimate ratios of compounds in the sludge.

  17. [A SAS macro program for batch processing of univariate Cox regression analysis for large databases].

    PubMed

    Yang, Rendong; Xiong, Jie; Peng, Yangqin; Peng, Xiaoning; Zeng, Xiaomin

    2015-02-01

    To realize batch processing of univariate Cox regression analysis for large databases with a SAS macro program. We wrote a SAS macro program in SAS 9.2 that can filter and integrate the results and export the P values to Excel. The program was used for screening survival-correlated RNA molecules in ovarian cancer. The SAS macro program accomplished the batch processing of univariate Cox regression analyses as well as the selection and export of the results. The SAS macro program has potential applications in reducing the workload of statistical analysis and providing a basis for batch processing of univariate Cox regression analysis.
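
    A Python analogue of the macro's batch loop, using the lifelines package (an assumption; the original is SAS), makes the idea concrete: fit one univariate Cox model per molecule, collect the p-values, and export them.

      import pandas as pd
      from lifelines import CoxPHFitter

      def batch_univariate_cox(df, duration_col, event_col, feature_cols):
          """One univariate Cox regression per feature; returns a table of
          p-values sorted for screening, mirroring the macro's batch run."""
          rows = []
          for col in feature_cols:
              cph = CoxPHFitter()
              cph.fit(df[[duration_col, event_col, col]],
                      duration_col=duration_col, event_col=event_col)
              rows.append({"feature": col, "p": cph.summary.loc[col, "p"]})
          return pd.DataFrame(rows).sort_values("p")

    The resulting frame can be written out with to_excel, matching the macro's Excel export step.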

  18. Laser Micro and Nano Processing of Metals, Ceramics, and Polymers

    NASA Astrophysics Data System (ADS)

    Pfleging, Wilhelm; Kohler, Robert; Südmeyer, Isabelle; Rohde, Magnus

    Laser-based material processing is well investigated for structuring, modification, and bonding of metals, ceramics, glasses, and polymers. Especially for material processing on the micrometer and nanometer scale, laser-assisted processes will very likely become more prevalent, as lasers offer cost-effective solutions for advanced material research and application. Laser ablation and surface modification are suitable for direct patterning of materials and their surface properties. Lasers allow rapid prototyping and small-batch manufacturing. They can also be used to pattern moving substrates, permitting on-the-fly processing of large areas at reasonable speed. Different types of laser processes such as ablation, modification, and welding can be successfully combined in order to enable a high grade of bulk and surface functionality. With ultraviolet lasers, favored for precise and debris-free patterning, patterns can be generated without the need for masks, resist materials, or chemicals. For faster machining of materials, thermally driven laser processes using NIR and IR laser radiation could become increasingly attractive for true rapid manufacturing.

  19. Validation of a 30-year-old process for the manufacture of L-asparaginase from Erwinia chrysanthemi.

    PubMed

    Gervais, David; Allison, Nigel; Jennings, Alan; Jones, Shane; Marks, Trevor

    2013-04-01

    A 30-year-old manufacturing process for the biologic product L-asparaginase from the plant pathogen Erwinia chrysanthemi was rigorously qualified and validated, with a high level of agreement between validation data and the 6-year process database. L-Asparaginase exists in its native state as a tetrameric protein and is used as a chemotherapeutic agent in the treatment regimen for Acute Lymphoblastic Leukaemia (ALL). The manufacturing process involves fermentation of the production organism, extraction and purification of the L-asparaginase to make drug substance (DS), and finally formulation and lyophilisation to generate drug product (DP). The extensive manufacturing experience with the product was used to establish ranges for all process parameters and product quality attributes. The product and in-process intermediates were rigorously characterised, and new assays, such as size-exclusion and reversed-phase UPLC, were developed, validated, and used to analyse several pre-validation batches. Finally, three prospective process validation batches were manufactured and product quality data generated using both the existing and the new analytical methods. These data demonstrated the process to be robust, highly reproducible and consistent, and the validation was successful, contributing to the granting of an FDA product license in November, 2011.

  20. 40 CFR Table 1 to Subpart H of... - Batch Processes

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 40 Protection of Environment 9 2010-07-01 2010-07-01 false Batch Processes 1 Table 1 to Subpart H... Standards for Organic Hazardous Air Pollutants for Equipment Leaks Pt. 63, Subpt. H, Table 1 Table 1 to Subpart H of Part 63—Batch Processes Monitoring Frequency for Equipment Other than Connectors Operating...

  1. Batch Scheduling for Hybrid Assembly Differentiation Flow Shop to Minimize Total Actual Flow Time

    NASA Astrophysics Data System (ADS)

    Maulidya, R.; Suprayogi; Wangsaputra, R.; Halim, A. H.

    2018-03-01

    A hybrid assembly differentiation flow shop is a three-stage flow shop consisting of machining, assembly, and differentiation stages and producing different types of products. In the machining stage, parts are processed in batches on different (unrelated) machines. In the assembly stage, the different parts are assembled into an assembly product. Finally, the assembled products are further processed into different types of final products in the differentiation stage. In this paper, we develop a batch scheduling model for a hybrid assembly differentiation flow shop to minimize the total actual flow time, defined as the total time parts spend on the shop floor from their arrival times until their due dates. We also propose a heuristic algorithm for solving the problem. The proposed algorithm is tested using a set of hypothetical data. The solution shows that the algorithm can solve the problem effectively.

  2. Use of near-infrared spectroscopy (NIRs) in the biopharmaceutical industry for real-time determination of critical process parameters and integration of advanced feedback control strategies using MIDUS control.

    PubMed

    Vann, Lucas; Sheppard, John

    2017-12-01

    Control of biopharmaceutical processes is critical to achieving consistent product quality. The most challenging unit operation to control is cell growth in bioreactors, due to the exquisitely sensitive and complex nature of the cells that are converting raw materials into new cells and products. Monitoring capabilities are increasing; the main challenge is now becoming the ability to use the generated data effectively. Contributors to this challenge include the integration of different monitoring systems as well as the functionality to perform data analytics in real-time to generate process knowledge and understanding. In addition, there is a lack of ability to easily generate strategies and close the loop to feed back into the process for advanced process control (APC). The current research aims to demonstrate the use of advanced monitoring tools along with data analytics to generate process understanding in an Escherichia coli fermentation process. NIR spectroscopy was used to measure glucose and critical amino acids in real-time to help determine the root cause of failures associated with different lots of yeast extract. First, scale-down of the process was required to execute a simple design of experiments, followed by scale-up to build NIR models as well as soft sensors for advanced process control. In addition, the research demonstrates the potential for a novel platform technology that enables manufacturers to consistently achieve "golden batch" performance through monitoring, integration, data analytics, understanding, strategy design and control (MIDUS control). MIDUS control was employed to increase batch-to-batch consistency in final product titers, decrease the coefficient of variability from 8.49 to 1.16%, predict possible exhaust filter failures, and close the loop to prevent their occurrence and avoid lost batches.

  3. Batch-batch stable microbial community in the traditional fermentation process of huyumei broad bean pastes.

    PubMed

    Zhu, Linjiang; Fan, Zihao; Kuai, Hui; Li, Qi

    2017-09-01

    During natural fermentation processes, a characteristic microbial community structure (MCS) forms naturally, and it is interesting to know about its batch-to-batch stability. This issue was explored in a traditional semi-solid-state fermentation process of huyumei, a Chinese broad bean paste product. The results showed that this MCS mainly contained four aerobic Bacillus species (8 log CFU per g), including B. subtilis, B. amyloliquefaciens, B. methylotrophicus, and B. tequilensis, and the facultative anaerobe B. cereus at a low concentration (4 log CFU per g), besides a very small amount of the yeast Zygosaccharomyces rouxii (2 log CFU per g). The dynamic change of the MCS in the brine fermentation process showed that the abundance of the dominant species varied within a small range, and at the beginning of the process the growth of lactic acid bacteria was inhibited and Staphylococcus spp. lost their viability. The MCS and its dynamic change were also shown to be highly reproducible among seven batches of fermentation. Therefore, the MCS forms naturally and stably across batches of the traditional semi-solid-state fermentation of huyumei. Revealing the microbial community structure and its batch-to-batch stability is helpful for understanding the mechanisms of community formation and flavour production in a traditional fermentation. This issue was explored here for the first time in a traditional semi-solid-state fermentation of huyumei broad bean paste. The fermentation process was revealed to be dominated by a high concentration of four aerobic species of Bacillus, a low concentration of B. cereus, and a small amount of Zygosaccharomyces rouxii. Lactic acid bacteria were inhibited and Staphylococcus spp. lost their viability at the beginning of fermentation. This community structure proved highly reproducible among seven batches. © 2017 The Society for Applied Microbiology.

  4. 27 CFR 19.748 - Dump/batch records.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 27 Alcohol, Tobacco Products and Firearms 1 2010-04-01 2010-04-01 false Dump/batch records. 19.748... OF THE TREASURY LIQUORS DISTILLED SPIRITS PLANTS Records and Reports Processing Account § 19.748 Dump/batch records. (a) Format of dump/batch records. Proprietor's dump/batch records shall contain, as...

  5. Scale-up of industrial biodiesel production to 40 m³ using a liquid lipase formulation.

    PubMed

    Price, Jason; Nordblad, Mathias; Martel, Hannah H; Chrabas, Brent; Wang, Huali; Nielsen, Per Munk; Woodley, John M

    2016-08-01

    In this work, we demonstrate the scale-up from an 80 L fed-batch scale to 40 m³, along with the design of a 4 m³ continuous process, for enzymatic biodiesel production catalyzed by NS-40116 (a liquid formulation of a modified Thermomyces lanuginosus lipase). Based on the analysis of actual pilot plant data for the transesterification of used cooking oil and brown grease, we propose a method applying first-order integral analysis to fed-batch data based on either the bound glycerol or free fatty acid content in the oil. This method greatly simplifies the modeling process and gives an indication of the effect of mixing at the various scales (80 L to 40 m³), along with the prediction of the residence time needed to reach a desired conversion in a CSTR. Suitable process metrics reflecting commercial performance, such as reaction time, enzyme efficiency, and reactor productivity, were evaluated for both the fed-batch and CSTR cases. Given similar operating conditions, the CSTR operation on average has a reaction time 1.3 times greater than the fed-batch operation. We also showed how the process metrics can be used to quickly estimate the selling price of the enzyme. Assuming a biodiesel selling price of 0.6 USD/kg and a one-time use of the enzyme (0.1% (w/w oil) enzyme dosage), the enzyme can then be sold for 30 USD/kg, which ensures that the enzyme cost is not more than 5% of the biodiesel revenue. Biotechnol. Bioeng. 2016;113:1719-1728. © 2016 Wiley Periodicals, Inc.
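
    The first-order integral analysis mentioned above amounts to a straight-line fit of the logged concentration, and the standard CSTR design equation then gives the residence time; a hedged sketch follows (variable names and the first-order assumption on free fatty acid content are ours).

      import numpy as np

      def first_order_rate(t, ffa):
          """Integral method: the slope of ln(FFA/FFA0) versus time gives -k."""
          y = np.log(np.asarray(ffa, float) / ffa[0])
          return -np.polyfit(t, y, 1)[0]

      def cstr_residence_time(conversion, k):
          """First-order CSTR design equation: tau = X / (k * (1 - X))."""
          return conversion / (k * (1.0 - conversion))

    For example, a fed-batch rate constant of k = 0.5 per hour implies a residence time of 18 hours for 90% conversion in a CSTR, consistent with the abstract's observation that continuous operation needs longer reaction times than fed-batch at comparable conditions.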

  6. Influences of operational parameters on phosphorus removal in batch and continuous electrocoagulation process performance.

    PubMed

    Nguyen, Dinh Duc; Yoon, Yong Soo; Bui, Xuan Thanh; Kim, Sung Su; Chang, Soon Woong; Guo, Wenshan; Ngo, Huu Hao

    2017-11-01

    The performance of an electrocoagulation (EC) process in batch and continuous operating modes was thoroughly investigated and evaluated for enhancing wastewater phosphorus removal under various operating conditions, considered individually or in combination: initial phosphorus concentration, wastewater conductivity, current density, and electrolysis time. The results revealed excellent phosphorus removal (72.7-100%) for both processes within 3-6 min of electrolysis, with relatively low energy requirements, i.e., less than 0.5 kWh/m³ of treated wastewater. However, the phosphorus removal efficiency in the continuous EC operation mode was better than that in batch mode within the scope of the study. Additionally, the rate and efficiency of phosphorus removal strongly depended on the operational parameters, including wastewater conductivity, initial phosphorus concentration, current density, and electrolysis time. Based on the experimental data, a statistically verified response surface methodology (RSM) model (multiple-factor optimization) was also established to provide further insight, accurately describe the interactive relationships between the process variables, and thereby optimize EC process performance. The EC process using iron electrodes is promising for improving wastewater phosphorus removal efficiency, and RSM can be a sustainable tool for predicting the performance of the EC process and explaining the influence of the process variables.
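
    The RSM model referred to above is, in its standard form, a second-order polynomial in the process variables; the sketch below fits such a surface to made-up data (the ranges and coefficients are illustrative, not the paper's), which is enough to show the mechanics.

      import numpy as np
      from sklearn.linear_model import LinearRegression
      from sklearn.pipeline import make_pipeline
      from sklearn.preprocessing import PolynomialFeatures

      rng = np.random.default_rng(0)
      # columns: current density, electrolysis time, conductivity, initial P
      X = rng.uniform([5, 1, 0.5, 5], [25, 6, 3.0, 15], size=(30, 4))
      y = (70 + 2.0 * X[:, 0] + 3.5 * X[:, 1]
           - 0.06 * X[:, 0] ** 2 + rng.normal(0, 1.0, 30))  # removal, %

      rsm = make_pipeline(PolynomialFeatures(degree=2), LinearRegression())
      rsm.fit(X, y)                 # second-order RSM model
      print(rsm.predict(X[:3]))     # predicted phosphorus removal for 3 settings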

  7. 40 CFR 63.1326 - Batch process vents-recordkeeping provisions.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... PROGRAMS (CONTINUED) NATIONAL EMISSION STANDARDS FOR HAZARDOUS AIR POLLUTANTS FOR SOURCE CATEGORIES (CONTINUED) National Emission Standards for Hazardous Air Pollutant Emissions: Group IV Polymers and Resins... requirements for Group 2 batch process vents that are exempt from the batch mass input limitation provisions...

  8. Scheduling algorithm for flow shop with two batch-processing machines and arbitrary job sizes

    NASA Astrophysics Data System (ADS)

    Cheng, Bayi; Yang, Shanlin; Hu, Xiaoxuan; Li, Kai

    2014-03-01

    This article considers the problem of scheduling two batch-processing machines in a flow shop where the jobs have arbitrary sizes and the machines have limited capacity. The jobs are processed in batches and the total size of the jobs in each batch cannot exceed the machine capacity. Once a batch is being processed, no interruption is allowed until all the jobs in it are completed. The problem of minimising makespan is NP-hard in the strong sense. First, we present a mathematical model of the problem using an integer programme. We show the scale of feasible solutions of the problem and provide optimality properties. Then, we propose a polynomial time algorithm with running time in O(n log n). The jobs are first assigned to feasible batches and then scheduled on the machines. For the general case, we prove that the proposed algorithm has a performance guarantee of 4. For the special case where the processing times of each job on the two machines satisfy p1j = a·p2j, the performance guarantee is ? for a > 0.
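
    The abstract's algorithm first forms feasible batches under the capacity constraint and then schedules them. A common greedy way to form such batches, shown purely to illustrate the constraint (it is not the authors' O(n log n) procedure), is first-fit decreasing:

      def first_fit_decreasing(sizes, capacity):
          """Pack jobs into batches so each batch's total size stays within
          the machine capacity; jobs are placed largest-first into the first
          batch that still fits them."""
          batches = []   # each entry: [used_capacity, [job indices]]
          for j in sorted(range(len(sizes)), key=lambda i: -sizes[i]):
              for b in batches:
                  if b[0] + sizes[j] <= capacity:
                      b[0] += sizes[j]
                      b[1].append(j)
                      break
              else:
                  batches.append([sizes[j], [j]])
          return [jobs for _, jobs in batches]

    How the resulting batches are then sequenced on the two machines is where the article's performance-guarantee analysis takes over.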

  9. Intact cell mass spectrometry as a progress tracking tool for batch and fed-batch fermentation processes.

    PubMed

    Helmel, Michaela; Marchetti-Deschmann, Martina; Raus, Martin; Posch, Andreas E; Herwig, Christoph; Šebela, Marek; Allmaier, Günter

    2015-02-01

    Penicillin production during fermentation processes using industrial strains of Penicillium chrysogenum has been a continually discussed research topic since the accidental discovery of the antibiotic. Intact cell mass spectrometry (ICMS) can be a fast, novel monitoring tool for the fermentation progress during penicillin V production, operating in a nearly real-time fashion. This method is already used for the characterization of microorganisms and the differentiation of fungal strains; therefore, applying ICMS to samples directly harvested from a fermenter is a promising way to get fast information about the progress of fungal growth. After optimization of the ICMS method for penicillin V fermentation broth samples, the obtained ICMS data were evaluated by hierarchical cluster analysis or an in-house software solution written especially for ICMS data comparison. Growth stages of a batch and a fed-batch fermentation of Penicillium chrysogenum are differentiated by one of these statistical approaches. The application of two matrix-assisted laser desorption/ionization time-of-flight (MALDI-TOF) instruments from different vendors, operated in the linear positive ion mode, demonstrated the universal applicability of the developed ICMS method. This ICMS method, developed especially for fermentation broth samples, creates the basis for a fast and easy-to-use way of monitoring the fermentation progress of P. chrysogenum. Copyright © 2014 Elsevier Inc. All rights reserved.

  10. Future Supply Chains Enabled by Continuous Processing—Opportunities and Challenges. May 20–21, 2014 Continuous Manufacturing Symposium

    PubMed Central

    Srai, Jagjit Singh; Badman, Clive; Krumme, Markus; Futran, Mauricio; Johnston, Craig

    2015-01-01

    This paper examines the opportunities and challenges facing the pharmaceutical industry in moving to a primarily "continuous processing"-based supply chain. The current predominantly "large batch" and centralized manufacturing system designed for the "blockbuster" drug has driven a slow-paced, inventory-heavy operating model that is increasingly regarded as inflexible and unsustainable. Indeed, new markets and the rapidly evolving technology landscape will drive more product variety, shorter product life-cycles, and smaller drug volumes, which will exacerbate an already unsustainable economic model. Future supply chains will be required to enhance affordability and availability for patients and healthcare providers alike despite the increased product complexity. In this more challenging supply scenario, we examine the potential for a more pull-driven, near real-time demand-based supply chain, utilizing continuous processing where appropriate as a key element of a more "flow-through" operating model. In this discussion paper on future supply chain models underpinned by developments in the continuous manufacture of pharmaceuticals, we set out: the significant opportunities in moving to a supply chain flow-through operating model, with substantial gains in inventory reduction, lead-time to patient, and radically different product assurance/stability regimes; scenarios for decentralized production models producing a greater variety of products with enhanced volume flexibility; production, supply, and value chain footprints that are radically different from today's monolithic and centralized batch manufacturing operations; clinical trial and drug product development cost savings that support more rapid scale-up and market-entry models with early involvement of supply chain designers within New Product Development; and the major supply chain and industrial transformational challenges that need to be addressed. The paper recognizes that although current batch operational performance in pharma is far from optimal and not necessarily an appropriate end-state benchmark for batch technology, the adoption of continuous supply chain operating models underpinned by continuous production processing, as full or hybrid solutions in selected product supply chains, can support industry transformations to deliver right-first-time quality at substantially lower inventory profiles. © 2015 The Authors. Journal of Pharmaceutical Sciences published by Wiley Periodicals, Inc. and the American Pharmacists Association. J Pharm Sci 104:840-849, 2015. PMID:25631279

  11. 40 CFR Table 2 to Subpart Ffff of... - Emission Limits and Work Practice Standards for Batch Process Vents

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ...: Miscellaneous Organic Chemical Manufacturing Pt. 63, Subpt. FFFF, Table 2 Table 2 to Subpart FFFF of Part 63... vents a. Reduce collective uncontrolled organic HAP emissions from the sum of all batch process vents... applicable. b. Reduce collective uncontrolled organic HAP emissions from the sum of all batch process vents...

  12. 40 CFR Table 2 to Subpart Ffff of... - Emission Limits and Work Practice Standards for Batch Process Vents

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ...: Miscellaneous Organic Chemical Manufacturing Pt. 63, Subpt. FFFF, Table 2 Table 2 to Subpart FFFF of Part 63... vents a. Reduce collective uncontrolled organic HAP emissions from the sum of all batch process vents... applicable. b. Reduce collective uncontrolled organic HAP emissions from the sum of all batch process vents...

  13. Leaching Behavior Of Mineral Processing Waste: Comparison Of Batch And Column Investigations

    EPA Science Inventory

    In this study, a comparison of laboratory batch and column experiments on metal release profile from a mineral processing waste (MPW) is presented. Batch (equilibrium) and column (dynamic) leaching tests were conducted on ground MPW at different liquid–solid ratios (LS) to determ...

  14. Robust high-throughput batch screening method in 384-well format with optical in-line resin quantification.

    PubMed

    Kittelmann, Jörg; Ottens, Marcel; Hubbuch, Jürgen

    2015-04-15

    High-throughput batch screening technologies have become an important tool in downstream process development. Although continued miniaturization saves time and sample consumption, no screening process has yet been described in the 384-well microplate format. Several processes are established in the 96-well format to investigate protein-adsorbent interactions, utilizing between 6.8 and 50 μL of resin per well. However, as sample consumption scales with resin volume and throughput scales with experiments per microplate, these processes are limited in the costs and time they can save. In this work, a new method for in-well resin quantification by optical means, applicable in the 384-well format and to resin volumes as small as 0.1 μL, is introduced. An HTS batch isotherm process is described that utilizes this new method, in combination with optical sample volume quantification, for screening of isotherm parameters in 384-well microplates. Results are qualified by confidence bounds determined by bootstrap analysis and a comprehensive Monte Carlo study of error propagation. This new approach opens the door to a variety of screening processes in the 384-well format on HTS stations, higher-quality screening data and an increase in throughput. Copyright © 2015 Elsevier B.V. All rights reserved.

  15. Chemical and Toxicological Fate of Fumonisin B1 during Extrusion Processing of Corn Grits

    USDA-ARS?s Scientific Manuscript database

    Two batches of flaking corn grits were prepared by growing Fusarium verticillioides to contain low and high levels of fumonisin B1 (FB1), Batch-1 at 9.7 ppm and Batch-2 at 50 ppm FB1 as determined by HPLC. These two batches were extruded (Batch-1E; Batch-2E) or extruded with 10% w/w glucose supplem...

  16. Comparison of the release of constituents from granular materials under batch and column testing.

    PubMed

    Lopez Meza, Sarynna; Garrabrants, Andrew C; van der Sloot, Hans; Kosson, David S

    2008-01-01

    Column leaching testing provides a better basis for assessing field impacts than available batch test methods, and thus a fundamental basis from which to estimate constituent release under a variety of field conditions. However, column testing is time-intensive compared to the simpler batch testing and may not always be a viable option when making decisions for material reuse. Batch tests are most frequently used as a simple tool for compliance or quality-control purposes. It is therefore important to compare the release that occurs under batch and column testing, and to establish conservative interpretation protocols for extrapolating from batch data when column data are not available. Five different materials (concrete, construction debris, aluminum recycling residue, coal fly ash and bottom ash) were evaluated via batch and column testing, including different column flow regimes (continuously saturated and intermittent unsaturated flow). Constituent release data from batch and column tests were compared, and no significant difference was found between the two column flow regimes. In most cases batch and column testing agreed when presented in the form of cumulative release. For arsenic in carbonated materials, however, batch testing underestimates the column constituent release at most LS ratios and also on a cumulative basis. For cases where As is a constituent of concern, column testing may be required.

  17. Batching alternatives for Phase I retrieval wastes to be processed in WRAP Module 1

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mayancsik, B.A.

    1994-10-13

    During the next two decades, the transuranic (TRU) waste now stored in the 200 Area burial trenches and storage buildings is to be retrieved, processed in the Waste Receiving and Processing (WRAP) Module 1 facility, and shipped to a final disposal facility. The purpose of this document is to identify the criteria that can be used to batch suspect TRU waste, currently in retrievable storage, for processing through the WRAP Module 1 facility. These criteria are then used to generate a batch plan for Phase 1 Retrieval operations, which will retrieve the waste located in Trench 4C-04 of the 200 West Area burial ground. The reasons for batching wastes for processing in WRAP Module 1 include reducing the exposure of workers and the environment to hazardous material and ionizing radiation; maximizing the efficiency of the retrieval, processing, and disposal processes by reducing costs, time, and space throughout the process; reducing analytical sampling and analysis; and reducing the amount of cleanup and decontamination between process runs. The criteria selected for batching the drums of retrieved waste entering WRAP Module 1 are based on the available records for the wastes sent to storage as well as knowledge of the processes that generated these wastes. The batching criteria identified in this document include the following: waste generator; type of process used to generate or package the waste; physical waste form; content of hazardous/dangerous chemicals in the waste; radiochemical type and quantity of waste; drum weight; and special waste types. These criteria were applied to the waste drums currently stored in Trench 4C-04. At least one batching scheme is shown for each of the criteria listed above.

  18. Batch effects in single-cell RNA-sequencing data are corrected by matching mutual nearest neighbors.

    PubMed

    Haghverdi, Laleh; Lun, Aaron T L; Morgan, Michael D; Marioni, John C

    2018-06-01

    Large-scale single-cell RNA sequencing (scRNA-seq) data sets that are produced in different laboratories and at different times contain batch effects that may compromise the integration and interpretation of the data. Existing scRNA-seq analysis methods incorrectly assume that the composition of cell populations is either known or identical across batches. We present a strategy for batch correction based on the detection of mutual nearest neighbors (MNNs) in the high-dimensional expression space. Our approach does not rely on predefined or equal population compositions across batches; instead, it requires only that a subset of the population be shared between batches. We demonstrate the superiority of our approach compared with existing methods by using both simulated and real scRNA-seq data sets. Using multiple droplet-based scRNA-seq data sets, we demonstrate that our MNN batch-effect-correction method can be scaled to large numbers of cells.

  19. Methods that remove batch effects while retaining group differences may lead to exaggerated confidence in downstream analyses

    PubMed Central

    Nygaard, Vegard; Rødland, Einar Andreas; Hovig, Eivind

    2016-01-01

    Removal of, or adjustment for, batch effects or center differences is generally required when such effects are present in data. In particular, when preparing microarray gene expression data from multiple cohorts, array platforms, or batches for later analyses, batch effects can have confounding effects, inducing spurious differences between study groups. Many methods and tools exist for removing batch effects from data. However, when study groups are not evenly distributed across batches, actual group differences may induce apparent batch differences, in which case batch adjustments may bias, usually deflate, group differences. Some tools therefore have the option of preserving the difference between study groups, e.g. using a two-way ANOVA model to simultaneously estimate both group and batch effects. Unfortunately, this approach may systematically induce incorrect group differences in downstream analyses when groups are distributed between the batches in an unbalanced manner. The scientific community seems to be largely unaware of how this approach may lead to false discoveries. PMID:26272994

  20. Surface-micromachined and high-aspect ratio electrostatic actuators for aeronautic and space applications: design and lifetime considerations

    NASA Astrophysics Data System (ADS)

    Vescovo, P.; Joseph, E.; Bourbon, G.; Le Moal, P.; Minotti, P.; Hibert, C.; Pont, G.

    2003-09-01

    This paper focuses on recent advances in the field of MEMS-based actuators and distributed microelectromechanical systems (MEMS). IC-processed actuators (i.e. actuators machined using integrated-circuit batch processes) are expected to open a wide range of industrial applications in the near term. The most promising investigations deal with high-aspect-ratio, electric-field-driven microactuators suitable for use in numerous technical fields such as the aeronautics and space industries. Because silicon micromachining technologies have the potential to integrate both mechanical components and control circuits within a single process, MEMS-based active control of microscopic and macroscopic structures appears to be one of the most promising challenges for the next decade. As a first step towards new generations of MEMS-based smart structures, recent investigations dealing with silicon mechanisms involving MEMS-based actuators are briefly discussed in this paper.

  1. Fuzzy logic feedback control for fed-batch enzymatic hydrolysis of lignocellulosic biomass.

    PubMed

    Tai, Chao; Voltan, Diego S; Keshwani, Deepak R; Meyer, George E; Kuhar, Pankaj S

    2016-06-01

    A fuzzy logic feedback control system was developed for process monitoring and feeding control in fed-batch enzymatic hydrolysis of a lignocellulosic biomass, dilute acid-pretreated corn stover. Digested glucose from the hydrolysis reaction served as the input, while the feeding time and speed of the pretreated-biomass doser were the responses of the fuzzy logic control system. Membership functions for these three variables and the rule base were created from batch hydrolysis data. The system response was first tested in the LabVIEW environment, and performance was then evaluated in real-time hydrolysis reactions. Feeding operations were timed appropriately by the fuzzy logic control system, which responded effectively to plateau phases during hydrolysis. Feeding the proper amount of cellulose and maintaining the solids content were well balanced. Fuzzy logic proved to be a robust and effective online feeding control tool for fed-batch enzymatic hydrolysis.
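
    A minimal sketch of a Mamdani-style fuzzy controller of the kind described above, with triangular membership functions and centroid defuzzification; the variable ranges, rule base, and output scale are invented for illustration, not taken from the study.

    ```python
    # Minimal Mamdani-style fuzzy feedback controller (illustrative only).
    import numpy as np

    def tri(x, a, b, c):
        """Triangular membership function with feet a, c and peak b."""
        return np.clip(np.minimum((x - a) / (b - a), (c - x) / (c - b)), 0.0, 1.0)

    def feed_speed(glucose_g_per_l):
        """Map the digested-glucose level to a doser feeding speed on a 0-10 scale."""
        # Fuzzify the input: low / medium / high glucose (hypothetical ranges, g/L).
        low = tri(glucose_g_per_l, -10, 0, 10)
        med = tri(glucose_g_per_l, 5, 15, 25)
        high = tri(glucose_g_per_l, 20, 40, 60)

        # Mamdani rules: clip each output set by its antecedent's firing strength.
        speeds = np.linspace(0, 10, 101)
        fast = np.minimum(tri(speeds, 6, 10, 14), low)     # low glucose -> feed fast
        moderate = np.minimum(tri(speeds, 2, 5, 8), med)
        slow = np.minimum(tri(speeds, -4, 0, 4), high)     # high glucose -> feed slow

        # Aggregate (max) and defuzzify by centroid.
        agg = np.maximum.reduce([fast, moderate, slow])
        return float((speeds * agg).sum() / (agg.sum() + 1e-12))

    print(feed_speed(3.0))   # low glucose -> high feeding speed
    print(feed_speed(30.0))  # high glucose -> low feeding speed
    ```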

  2. Rapid characterization of chemical markers for discrimination of Moutan Cortex and its processed products by direct injection-based mass spectrometry profiling and metabolomic method.

    PubMed

    Li, Chao-Ran; Li, Meng-Ning; Yang, Hua; Li, Ping; Gao, Wen

    2018-06-01

    Processing of herbal medicines is a characteristic pharmaceutical technique in Traditional Chinese Medicine, which can reduce toxicity and side effects, improve flavor and efficacy, and even change the pharmacological action entirely. A method for finding chemical markers that differentiate herbal medicines at different degrees of processing is therefore of real importance. The aim of this study was to develop a rapid and reasonable method to discriminate Moutan Cortex and its processed products, and to reveal the characteristics of the chemical components on the basis of chemical markers. Thirty batches of Moutan Cortex and its processed products, including 11 batches of Raw Moutan Cortex (RMC), 9 batches of Moutan Cortex Tostus (MCT) and 10 batches of Moutan Cortex Carbonisatus (MCC), were directly injected into an electrospray ionization quadrupole time-of-flight mass spectrometer (ESI-QTOF MS) for rapid analysis in positive and negative mode. Without chromatographic separation, each run was completed within 3 min. The raw MS data were automatically extracted by background deduction and a molecular feature (MF) extraction algorithm. In negative mode, a total of 452 MFs were obtained and then pretreated by data filtration and differential analysis. The filtered 85 MFs were then treated by principal component analysis (PCA) to reduce the dimensions. Subsequently, a partial least squares discriminant analysis (PLS-DA) model was constructed for differentiation and chemical-marker detection of Moutan Cortex at different degrees of processing. The positive-mode data were treated in the same way. RMC, MCT and MCC were successfully classified. Moreover, 14 and 3 chemical markers from negative and positive mode, respectively, were screened by combining their relative peak areas with the variable importance in the projection (VIP) values in the PLS-DA model. The content changes of these chemical markers were used to illustrate the chemical changes of Moutan Cortex after processing. These results show that the proposed method, which combines non-targeted metabolomics analysis with multivariate statistical analysis, is reasonable and effective. It can be applied not only to discriminate herbal medicines and their processed products, but also to reveal the characteristics of the chemical components during processing. Copyright © 2018. Published by Elsevier GmbH.
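
    A compact sketch of the chemometric core of this workflow (PCA for dimension reduction, then PLS-DA on the scores) using scikit-learn. The data are synthetic stand-ins shaped like the study's table (30 batches x 85 filtered molecular features); VIP scores are not built into scikit-learn, so weight magnitudes serve as a rough proxy here.

    ```python
    # PCA + PLS-DA sketch on synthetic peak-area data (not the study's data).
    import numpy as np
    from sklearn.decomposition import PCA
    from sklearn.preprocessing import StandardScaler
    from sklearn.cross_decomposition import PLSRegression

    rng = np.random.default_rng(0)
    X = rng.normal(size=(30, 85))                   # peak areas (synthetic)
    groups = np.repeat([0, 1, 2], [11, 9, 10])      # RMC / MCT / MCC labels
    Y = np.eye(3)[groups]                           # one-hot targets for PLS-DA

    scores = PCA(n_components=5).fit_transform(StandardScaler().fit_transform(X))
    pls = PLSRegression(n_components=2).fit(scores, Y)

    predicted = pls.predict(scores).argmax(axis=1)  # class = largest response
    print("training accuracy:", (predicted == groups).mean())
    print("weight magnitudes (VIP proxy):\n", np.abs(pls.x_weights_).round(2))
    ```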

  3. Parallel steady state studies on a milliliter scale accelerate fed-batch bioprocess design for recombinant protein production with Escherichia coli.

    PubMed

    Schmideder, Andreas; Cremer, Johannes H; Weuster-Botz, Dirk

    2016-11-01

    In general, fed-batch processes are applied for recombinant protein production with Escherichia coli (E. coli). However, state-of-the-art methods for identifying suitable reaction conditions suffer from severe drawbacks: direct transfer of process information from parallel batch studies is often defective, and sequential fed-batch studies are time-consuming and cost-intensive. In this study, continuously operated stirred-tank reactors on a milliliter scale were applied to identify suitable reaction conditions for fed-batch processes. Isopropyl β-d-1-thiogalactopyranoside (IPTG) induction strategies were varied in parallel-operated stirred-tank bioreactors to study the effects on the continuous production of the recombinant protein photoactivatable mCherry (PAmCherry) with E. coli. The best-performing induction strategies were transferred from the continuous processes on a milliliter scale to liter-scale fed-batch processes. Inducing recombinant protein expression by dynamically increasing the IPTG concentration to 100 µM led to an increase in product concentration of 21% (8.4 g L -1 ) compared to an established high-performance production process using the most frequently applied induction strategy, a single addition of 1000 µM IPTG. Thus, identifying feasible reaction conditions for fed-batch processes in parallel continuous studies on a milliliter scale was shown to be a powerful, novel method to accelerate bioprocess design in a cost-reducing manner. © 2016 American Institute of Chemical Engineers Biotechnol. Prog., 32:1426-1435, 2016. © 2016 American Institute of Chemical Engineers.

  4. ARTS: automated randomization of multiple traits for study design.

    PubMed

    Maienschein-Cline, Mark; Lei, Zhengdeng; Gardeux, Vincent; Abbasi, Taimur; Machado, Roberto F; Gordeuk, Victor; Desai, Ankit A; Saraf, Santosh; Bahroos, Neil; Lussier, Yves

    2014-06-01

    Collecting data from large studies on high-throughput platforms, such as microarray or next-generation sequencing, typically requires processing samples in batches. There are often systematic but unpredictable biases from batch-to-batch, so proper randomization of biologically relevant traits across batches is crucial for distinguishing true biological differences from experimental artifacts. When a large number of traits are biologically relevant, as is common for clinical studies of patients with varying sex, age, genotype and medical background, proper randomization can be extremely difficult to prepare by hand, especially because traits may affect biological inferences, such as differential expression, in a combinatorial manner. Here we present ARTS (automated randomization of multiple traits for study design), which aids researchers in study design by automatically optimizing batch assignment for any number of samples, any number of traits and any batch size. ARTS is implemented in Perl and is available at github.com/mmaiensc/ARTS. ARTS is also available in the Galaxy Tool Shed, and can be used at the Galaxy installation hosted by the UIC Center for Research Informatics (CRI) at galaxy.cri.uic.edu. © The Author 2014. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
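
    ARTS itself is implemented in Perl; as a language-agnostic illustration of the underlying idea, the Python sketch below randomly searches batch assignments and keeps the one minimizing a simple trait-imbalance score (the between-batch variance of trait means). The scoring function and trait data are simplified stand-ins, not ARTS's actual objective.

    ```python
    # Random-search sketch of trait-balanced batch assignment (illustrative).
    import numpy as np

    def imbalance(traits, assignment, n_batches):
        """Sum over traits of the between-batch variance of trait means."""
        return sum(np.var([traits[assignment == b, t].mean()
                           for b in range(n_batches)])
                   for t in range(traits.shape[1]))

    rng = np.random.default_rng(1)
    n_batches, batch_size = 6, 8
    traits = np.column_stack([rng.integers(0, 2, n_batches * batch_size),   # e.g. sex
                              rng.normal(50, 10, n_batches * batch_size)])  # e.g. age

    base = np.repeat(np.arange(n_batches), batch_size)   # 8 samples per batch
    best, best_score = base, imbalance(traits, base, n_batches)
    for _ in range(5000):                                # pure random search
        cand = rng.permutation(base)
        s = imbalance(traits, cand, n_batches)
        if s < best_score:
            best, best_score = cand, s
    print("best imbalance score:", best_score)
    ```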

  5. The influence of pH adjustment on kinetics parameters in tapioca wastewater treatment using aerobic sequencing batch reactor system

    NASA Astrophysics Data System (ADS)

    Mulyani, Happy; Budianto, Gregorius Prima Indra; Margono, Kaavessina, Mujtahid

    2018-02-01

    The present investigation deals with aerobic sequencing batch reactor treatment of tapioca wastewater under varying influent pH conditions. The project was carried out to evaluate the effect of pH on the kinetics parameters of the system. This was done by operating the aerobic sequencing batch reactor system for 8 hours under several tapioca wastewater conditions (pH 4.91, pH 7, pH 8). The Chemical Oxygen Demand (COD) and Mixed Liquor Volatile Suspended Solids (MLVSS) of the reactor effluent at steady state were determined at two-hour intervals to generate data for substrate-inhibition kinetics parameters. Values of the kinetic constants were determined using the Monod and Andrews models. No inhibition constant (Ki) was detected in any process variation of the aerobic sequencing batch reactor system for tapioca wastewater treatment in this study. Furthermore, pH 8 was selected as the preferred operating condition within the pH range investigated, based on the resulting kinetic parameter values of µmax = 0.010457/hour and Ks = 255.0664 mg/L COD.
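
    For reference, the two kinetic forms named above are standard. The Monod model and its Andrews (Haldane) extension with a substrate-inhibition term are

    $$\mu = \frac{\mu_{\max} S}{K_s + S} \ \ \text{(Monod)}, \qquad \mu = \frac{\mu_{\max} S}{K_s + S + S^2/K_i} \ \ \text{(Andrews)},$$

    where $S$ is the substrate (COD) concentration. The absence of a detectable $K_i$ reported above means the data were adequately described by the Monod form, with the fitted $\mu_{\max}$ and $K_s$ quoted.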

  6. Processing Ultra Wide Band Synthetic Aperture Radar Data with Motion Detectors

    NASA Technical Reports Server (NTRS)

    Madsen, Soren Norvang

    1996-01-01

    Several issues make the processing of ultra wide band (UWB) SAR data acquired from an airborne platform difficult. The character of UWB data invalidates many of the usual SAR batch processing techniques, leading to the application of wavenumber domain type processors... This paper suggests and evaluates an algorithm which combines a wavenumber domain processing algorithm with a motion compensation procedure that enables motion compensation to be applied as a function of target range and azimuth angle.

  7. InSAR Deformation Time Series Processed On-Demand in the Cloud

    NASA Astrophysics Data System (ADS)

    Horn, W. B.; Weeden, R.; Dimarchi, H.; Arko, S. A.; Hogenson, K.

    2017-12-01

    During this past year, ASF has developed a cloud-based on-demand processing system known as HyP3 (http://hyp3.asf.alaska.edu/), the Hybrid Pluggable Processing Pipeline, for Synthetic Aperture Radar (SAR) data. The system makes it easy for a user who doesn't have the time or inclination to install and use complex SAR processing software to leverage SAR data in their research or operations. One such processing algorithm is the generation of a deformation time series product, a series of images representing ground displacements over time, which can be computed from a time series of interferometric SAR (InSAR) products. The set of software tools necessary to generate this useful product is difficult to install, configure, and use. Moreover, for a long time series with many images, processing just the interferograms can take days. Principally built by three undergraduate students at the ASF DAAC, the deformation time series processing relies on the new Amazon Batch service, which enables processing of jobs with complex interconnected dependencies in a straightforward and efficient manner. In the case of generating a deformation time series product from a stack of single-look complex SAR images, the system uses Batch to serialize the up-front processing, interferogram generation, optional tropospheric correction, and deformation time series generation. The most time-consuming portion is the interferogram generation, because even for a fairly small stack of images many interferograms need to be processed. By using AWS Batch, the interferograms are all generated in parallel; the entire process completes in hours rather than days. Additionally, the individual interferograms are saved in Amazon's cloud storage, so that when new data are acquired in the stack, an updated time series product can be generated with minimal additional processing. This presentation will focus on the development techniques and enabling technologies that were used in developing the time series processing in the ASF HyP3 system. Data and process flow from job submission through to order completion will be shown, highlighting the benefits of the cloud for each step.
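
    The fan-out/fan-in structure described here maps naturally onto AWS Batch array jobs and job dependencies. The sketch below is a hedged illustration using the real boto3 submit_job call; the queue and job-definition names are placeholders, not HyP3's, and credentials/region are assumed to be configured in the environment.

    ```python
    # Fan-out/fan-in with AWS Batch via boto3 (placeholder names, illustrative).
    import boto3

    batch = boto3.client("batch")   # assumes configured AWS credentials/region

    # Fan out: one array job generates all interferograms in parallel.
    ifg = batch.submit_job(
        jobName="insar-interferograms",
        jobQueue="hyp3-like-queue",                # placeholder queue name
        jobDefinition="interferogram-job-def",     # placeholder job definition
        arrayProperties={"size": 50},              # one child job per pair
    )

    # Fan in: the time-series job starts only after every child job finishes.
    batch.submit_job(
        jobName="insar-time-series",
        jobQueue="hyp3-like-queue",
        jobDefinition="time-series-job-def",       # placeholder job definition
        dependsOn=[{"jobId": ifg["jobId"]}],
    )
    ```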

  8. Inferring mixed-culture growth from total biomass data in a wavelet approach

    NASA Astrophysics Data System (ADS)

    Ibarra-Junquera, V.; Escalante-Minakata, P.; Murguía, J. S.; Rosu, H. C.

    2006-10-01

    It is shown that the presence of mixed-culture growth in batch fermentation processes can be inferred very accurately from total biomass data by means of wavelet analysis for singularity detection. This is accomplished by considering simple phenomenological models for mixed growth and the more complicated case of mixed growth on a mixture of substrates. The main quantity provided by the wavelet analysis is the Hölder exponent of the singularity, which we determine for our illustrative examples. The numerical results point to the possibility that Hölder exponents can be used to characterize the nature of mixed-culture growth in batch fermentation processes, with potential industrial applications. Moreover, analysis of the same data affected by common additive Gaussian noise still leads to wavelet detection of the singularities, although the Hölder exponent is no longer a useful parameter.
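
    A minimal sketch of the approach with PyWavelets: take the continuous wavelet transform of a biomass-like signal with a kink, track the modulus maxima across scales at the singularity, and read a Hölder-type exponent off the log-log slope (up to a normalization-dependent offset). The signal and wavelet choice are illustrative, not the authors' code.

    ```python
    # Wavelet singularity detection sketch with PyWavelets (illustrative).
    import numpy as np
    import pywt

    t = np.linspace(0, 1, 1024)
    biomass = np.where(t < 0.5, t, 0.5 + 2.0 * (t - 0.5))   # slope change at t = 0.5

    scales = np.arange(2, 64)
    coefs, _ = pywt.cwt(biomass, scales, "gaus1")   # derivative-of-Gaussian wavelet

    idx = np.argmin(np.abs(t - 0.5))                # location of the singularity
    maxima = np.abs(coefs[:, idx])

    # log|W(s, t0)| grows roughly linearly in log s; the slope relates to the
    # Hölder exponent up to a normalization-dependent offset.
    slope = np.polyfit(np.log(scales), np.log(maxima), 1)[0]
    print("log-log scaling exponent:", slope)
    ```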

  9. AOIPS - An interactive image processing system. [Atmospheric and Oceanic Information Processing System

    NASA Technical Reports Server (NTRS)

    Bracken, P. A.; Dalton, J. T.; Quann, J. J.; Billingsley, J. B.

    1978-01-01

    The Atmospheric and Oceanographic Information Processing System (AOIPS) was developed to help applications investigators perform required interactive image data analysis rapidly and to eliminate the inefficiencies and problems associated with batch operation. This paper describes the configuration and processing capabilities of AOIPS and presents unique subsystems for displaying, analyzing, storing, and manipulating digital image data. Applications of AOIPS to research investigations in meteorology and earth resources are featured.

  10. Improved solution accuracy for TDRSS-based TOPEX/Poseidon orbit determination

    NASA Technical Reports Server (NTRS)

    Doll, C. E.; Mistretta, G. D.; Hart, R. C.; Oza, D. H.; Bolvin, D. T.; Cox, C. M.; Nemesure, M.; Niklewski, D. J.; Samii, M. V.

    1994-01-01

    Orbit determination results are obtained by the Goddard Space Flight Center (GSFC) Flight Dynamics Division (FDD) using a batch-least-squares estimator available in the Goddard Trajectory Determination System (GTDS) and an extended Kalman filter estimation system to process Tracking and Data Relay Satellite (TDRS) System (TDRSS) measurements. GTDS is the operational orbit determination system used by the FDD in support of the Ocean Topography Experiment (TOPEX)/Poseidon spacecraft navigation and health and safety operations. The extended Kalman filter was implemented in an orbit determination analysis prototype system, closely related to the Real-Time Orbit Determination System/Enhanced (RTOD/E) system. In addition, the Precision Orbit Determination (POD) team within the GSFC Space Geodesy Branch generated an independent set of high-accuracy trajectories to support the TOPEX/Poseidon scientific data. These latter solutions use the geodynamics (GEODYN) orbit determination system with laser ranging and Doppler Orbitography and Radiopositioning integrated by satellite (DORIS) tracking measurements. The TOPEX/Poseidon trajectories were estimated for November 7 through November 11, 1992, the timeframe under study. Independent assessments were made of the consistencies of solutions produced by the batch and sequential methods. The batch-least-squares solutions were assessed based on the solution residuals, while the sequential solutions were assessed based on primarily the estimated covariances. The batch-least-squares and sequential orbit solutions were compared with the definitive POD orbit solutions. The solution differences were generally less than 2 meters for the batch-least-squares and less than 13 meters for the sequential estimation solutions. After the sequential estimation solutions were processed with a smoother algorithm, position differences with POD orbit solutions of less than 7 meters were obtained. The differences among the POD, GTDS, and filter/smoother solutions can be traced to differences in modeling and tracking data types, which are being analyzed in detail.
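
    As a toy contrast of the two estimator structures compared above — a batch least-squares solve over all measurements versus a sequential, Kalman-style measurement-by-measurement update — the sketch below fits the same linear model both ways. It is purely schematic, with none of the dynamics, force models, or smoothing of GTDS, RTOD/E, or GEODYN.

    ```python
    # Batch least-squares vs. sequential estimation on a toy linear model.
    import numpy as np

    rng = np.random.default_rng(2)
    x_true = np.array([1.0, -0.5])
    H = rng.normal(size=(100, 2))                 # measurement partials
    y = H @ x_true + 0.1 * rng.normal(size=100)   # noisy measurements

    # Batch least-squares: one solve over all measurements at once.
    x_batch = np.linalg.lstsq(H, y, rcond=None)[0]

    # Sequential estimation: process measurements one at a time.
    x = np.zeros(2)
    P = 10.0 * np.eye(2)                          # prior covariance
    R = 0.1 ** 2                                  # measurement variance
    for h, yi in zip(H, y):
        S = h @ P @ h + R                         # innovation variance
        K = P @ h / S                             # gain
        x = x + K * (yi - h @ x)                  # state update
        P = P - np.outer(K, h @ P)                # covariance update
    print(x_batch, x)                             # both close to x_true
    ```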

  11. 21 CFR 111.530 - When must an investigation be conducted of your manufacturing processes and other batches?

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... PRACTICE IN MANUFACTURING, PACKAGING, LABELING, OR HOLDING OPERATIONS FOR DIETARY SUPPLEMENTS Returned Dietary Supplements § 111.530 When must an investigation be conducted of your manufacturing processes and other batches? If the reason for a dietary supplement being returned implicates other batches, you must...

  12. 21 CFR 111.530 - When must an investigation be conducted of your manufacturing processes and other batches?

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... PRACTICE IN MANUFACTURING, PACKAGING, LABELING, OR HOLDING OPERATIONS FOR DIETARY SUPPLEMENTS Returned Dietary Supplements § 111.530 When must an investigation be conducted of your manufacturing processes and other batches? If the reason for a dietary supplement being returned implicates other batches, you must...

  13. 21 CFR 111.530 - When must an investigation be conducted of your manufacturing processes and other batches?

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... PRACTICE IN MANUFACTURING, PACKAGING, LABELING, OR HOLDING OPERATIONS FOR DIETARY SUPPLEMENTS Returned Dietary Supplements § 111.530 When must an investigation be conducted of your manufacturing processes and other batches? If the reason for a dietary supplement being returned implicates other batches, you must...

  14. 21 CFR 111.530 - When must an investigation be conducted of your manufacturing processes and other batches?

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... PRACTICE IN MANUFACTURING, PACKAGING, LABELING, OR HOLDING OPERATIONS FOR DIETARY SUPPLEMENTS Returned Dietary Supplements § 111.530 When must an investigation be conducted of your manufacturing processes and other batches? If the reason for a dietary supplement being returned implicates other batches, you must...

  15. 21 CFR 111.530 - When must an investigation be conducted of your manufacturing processes and other batches?

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... PRACTICE IN MANUFACTURING, PACKAGING, LABELING, OR HOLDING OPERATIONS FOR DIETARY SUPPLEMENTS Returned Dietary Supplements § 111.530 When must an investigation be conducted of your manufacturing processes and other batches? If the reason for a dietary supplement being returned implicates other batches, you must...

  16. Searching CA Condensates, On-Line and Batch.

    ERIC Educational Resources Information Center

    Kaminecki, Ronald M.; And Others

    Batch mode processing is compared, using cost-effectiveness, with on-line processing for computer-aided searching of chemical abstracts. Consideration for time, need, coverage, and adaptability are found to be the criteria by which a searcher selects a method, and sometimes both methods are used. There is a tradeoff between batch mode's slower…

  17. Explanatory Variables Associated with Campylobacter and Escherichia coli Concentrations on Broiler Chicken Carcasses during Processing in Two Slaughterhouses.

    PubMed

    Pacholewicz, Ewa; Swart, Arno; Wagenaar, Jaap A; Lipman, Len J A; Havelaar, Arie H

    2016-12-01

    This study aimed at identifying explanatory variables that were associated with Campylobacter and Escherichia coli concentrations throughout processing in two commercial broiler slaughterhouses. Quantitative data on Campylobacter and E. coli along the processing line were collected. Moreover, information on batch characteristics, slaughterhouse practices, process performance, and environmental variables was collected through questionnaires, observations, and measurements, resulting in data on 19 potential explanatory variables. Analysis was conducted separately in each slaughterhouse to identify which variables were related to changes in concentrations of Campylobacter and E. coli during the processing steps: scalding, defeathering, evisceration, and chilling. Associations with explanatory variables differed between the slaughterhouses studied. In the first slaughterhouse, there was only one significant association: poorer uniformity of carcass weight within a batch was associated with a smaller decrease in E. coli concentrations after defeathering. In the second slaughterhouse, significant statistical associations were found with variables including age, uniformity, average weight of carcasses, Campylobacter concentrations in excreta and ceca, and E. coli concentrations in excreta. Bacterial concentrations in excreta and ceca were the most prominent variables, because they were associated with concentrations on carcasses at various processing points. Although the slaughterhouses produced specific products and had different batch characteristics and processing parameters, the effect of the significant variables was not always the same for each slaughterhouse. Therefore, each slaughterhouse needs to determine its own relevant measures for hygiene control and process management. This identification could be supported by monitoring changes in bacterial concentrations during processing in individual slaughterhouses. In addition, the possibility that management and food handling practices in slaughterhouses contribute to the differences in bacterial contamination between slaughterhouses needs further investigation.

  18. Medication Waste Reduction in Pediatric Pharmacy Batch Processes

    PubMed Central

    Veltri, Michael A.; Hamrock, Eric; Mollenkopf, Nicole L.; Holt, Kristen; Levin, Scott

    2014-01-01

    OBJECTIVES: To inform pediatric cart-fill batch scheduling for reductions in pharmaceutical waste using a case study and simulation analysis. METHODS: A pre and post intervention and simulation analysis was conducted during 3 months at a 205-bed children's center. An algorithm was developed to detect wasted medication based on time-stamped computerized provider order entry information. The algorithm was used to quantify pharmaceutical waste and associated costs for both preintervention (1 batch per day) and postintervention (3 batches per day) schedules. Further, simulation was used to systematically test 108 batch schedules outlining general characteristics that have an impact on the likelihood for waste. RESULTS: Switching from a 1-batch-per-day to a 3-batch-per-day schedule resulted in a 31.3% decrease in pharmaceutical waste (28.7% to 19.7%) and annual cost savings of $183,380. Simulation results demonstrate how increasing batch frequency facilitates a more just-in-time process that reduces waste. The most substantial gains are realized by shifting from a schedule of 1 batch per day to at least 2 batches per day. The simulation exhibits how waste reduction is also achievable by avoiding batch preparation during daily time periods where medication administration or medication discontinuations are frequent. Last, the simulation was used to show how reducing batch preparation time per batch provides some, albeit minimal, opportunity to decrease waste. CONCLUSIONS: The case study and simulation analysis demonstrate characteristics of batch scheduling that may support pediatric pharmacy managers in redesign toward minimizing pharmaceutical waste. PMID:25024671

  19. Medication waste reduction in pediatric pharmacy batch processes.

    PubMed

    Toerper, Matthew F; Veltri, Michael A; Hamrock, Eric; Mollenkopf, Nicole L; Holt, Kristen; Levin, Scott

    2014-04-01

    To inform pediatric cart-fill batch scheduling for reductions in pharmaceutical waste using a case study and simulation analysis. A pre and post intervention and simulation analysis was conducted during 3 months at a 205-bed children's center. An algorithm was developed to detect wasted medication based on time-stamped computerized provider order entry information. The algorithm was used to quantify pharmaceutical waste and associated costs for both preintervention (1 batch per day) and postintervention (3 batches per day) schedules. Further, simulation was used to systematically test 108 batch schedules outlining general characteristics that have an impact on the likelihood for waste. Switching from a 1-batch-per-day to a 3-batch-per-day schedule resulted in a 31.3% decrease in pharmaceutical waste (28.7% to 19.7%) and annual cost savings of $183,380. Simulation results demonstrate how increasing batch frequency facilitates a more just-in-time process that reduces waste. The most substantial gains are realized by shifting from a schedule of 1 batch per day to at least 2 batches per day. The simulation exhibits how waste reduction is also achievable by avoiding batch preparation during daily time periods where medication administration or medication discontinuations are frequent. Last, the simulation was used to show how reducing batch preparation time per batch provides some, albeit minimal, opportunity to decrease waste. The case study and simulation analysis demonstrate characteristics of batch scheduling that may support pediatric pharmacy managers in redesign toward minimizing pharmaceutical waste.
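
    A toy Monte Carlo along the lines of the simulation described above: each dose is prepared at the last batch time before its administration time, and is wasted if the order is discontinued in between. The rates and times below are invented; the point is only that more batches per day shrink the preparation-to-administration window and hence the waste.

    ```python
    # Toy batch-schedule waste simulation (invented parameters).
    import numpy as np

    rng = np.random.default_rng(3)

    def wasted_fraction(batch_times, n=100_000, p_dc=0.3):
        """Fraction of doses wasted under a daily batch schedule (hours)."""
        batch_times = np.sort(np.asarray(batch_times, float))
        admin = rng.uniform(0, 24, n)                 # administration times
        # Each dose is prepared at the latest batch at or before its admin
        # time (wrapping to the previous day's last batch when needed).
        prep = batch_times[np.searchsorted(batch_times, admin, side="right") - 1]
        dc_time = rng.uniform(0, 24, n)               # discontinuation time of day
        discontinued = rng.random(n) < p_dc           # which orders get stopped
        in_window = np.where(prep <= admin,
                             (dc_time > prep) & (dc_time < admin),
                             (dc_time > prep) | (dc_time < admin))  # wraps midnight
        return float((discontinued & in_window).mean())

    print(wasted_fraction([6.0]))              # 1 batch/day
    print(wasted_fraction([6.0, 14.0, 22.0]))  # 3 batches/day -> less waste
    ```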

  20. An Integer Batch Scheduling Model for a Single Machine with Simultaneous Learning and Deterioration Effects to Minimize Total Actual Flow Time

    NASA Astrophysics Data System (ADS)

    Yusriski, R.; Sukoyo; Samadhi, T. M. A. A.; Halim, A. H.

    2016-02-01

    In the manufacturing industry, several identical parts can be processed in batches, and a setup time is needed between two consecutive batches. Since the processing times of batches are not always fixed during a scheduling period, due to learning and deterioration effects, this research deals with batch scheduling problems under simultaneous learning and deterioration effects. The objective is to minimize total actual flow time, defined as the time interval between the arrival of all parts at the shop and their common due date. The decision variables are the number of batches, the integer batch sizes, and the sequence of the resulting batches. This research proposes a heuristic algorithm based on Lagrangian relaxation. The effectiveness of the proposed algorithm is assessed by comparing its solutions to the optimal solutions obtained from an enumeration method. Numerical experiments show that the average difference between the two solutions is 0.05%.
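
    The abstract does not spell out the time model, but a common way to encode simultaneous learning and deterioration in this scheduling literature is a position- and time-dependent actual processing time, for example

    $$p_{[r]}(t) = p \cdot r^{a} \cdot (1 + b\,t), \qquad a \le 0,\ b \ge 0,$$

    where $r$ is the position of the batch in the sequence (the learning effect, $a \le 0$) and $t$ its start time (the deterioration effect, $b \ge 0$). This form is illustrative only, not necessarily the exact model used in the paper.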

  1. Towards a consensus-based biokinetic model for green microalgae - The ASM-A.

    PubMed

    Wágner, Dorottya S; Valverde-Pérez, Borja; Sæbø, Mariann; Bregua de la Sotilla, Marta; Van Wagenen, Jonathan; Smets, Barth F; Plósz, Benedek Gy

    2016-10-15

    Cultivation of microalgae in open ponds and closed photobioreactors (PBRs) using wastewater resources offers an opportunity for biochemical nutrient recovery. Effective reactor system design and process control of PBRs requires process models. Several models with different complexities have been developed to predict microalgal growth. However, none of these models can effectively describe all the relevant processes when microalgal growth is coupled with nutrient removal and recovery from wastewaters. Here, we present a mathematical model developed to simulate green microalgal growth (ASM-A) using the systematic approach of the activated sludge modelling (ASM) framework. The process model - identified based on a literature review and using new experimental data - accounts for factors influencing photoautotrophic and heterotrophic microalgal growth, nutrient uptake and storage (i.e. Droop model) and decay of microalgae. Model parameters were estimated using laboratory-scale batch and sequenced batch experiments using the novel Latin Hypercube Sampling based Simplex (LHSS) method. The model was evaluated using independent data obtained in a 24-L PBR operated in sequenced batch mode. Identifiability of the model was assessed. The model can effectively describe microalgal biomass growth, ammonia and phosphate concentrations as well as the phosphorus storage using a set of average parameter values estimated with the experimental data. A statistical analysis of simulation and measured data suggests that culture history and substrate availability can introduce significant variability on parameter values for predicting the reaction rates for bulk nitrate and the intracellularly stored nitrogen state-variables, thereby requiring scenario specific model calibration. ASM-A was identified using standard cultivation medium and it can provide a platform for extensions accounting for factors influencing algal growth and nutrient storage using wastewater resources. Copyright © 2016 Elsevier Ltd. All rights reserved.
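
    For reference, the Droop (cell quota) model cited above ties growth to the intracellular nutrient quota rather than to the external substrate concentration; in its standard form

    $$\mu = \mu_{\max}\left(1 - \frac{Q_0}{Q}\right),$$

    where $Q$ is the internal quota of the limiting nutrient and $Q_0$ the minimum (subsistence) quota at which growth ceases.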

  2. The use of Optical Character Recognition (OCR) in the digitisation of herbarium specimen labels.

    PubMed

    Drinkwater, Robyn E; Cubey, Robert W N; Haston, Elspeth M

    2014-01-01

    At the Royal Botanic Garden Edinburgh (RBGE) the use of Optical Character Recognition (OCR) to aid the digitisation process has been investigated. This was tested using a herbarium specimen digitisation process with two stages of data entry. Records were initially batch-processed to add data extracted from the OCR text before being sorted based on Collector and/or Country. Using images of the specimens, a team of six digitisers then added data to the specimen records. To investigate whether the data from OCR aid the digitisation process, the team completed a series of trials which compared the efficiency of data entry between sorted and unsorted batches of specimens. A survey was carried out to explore the opinions of the digitisation staff on the different sorting options. In total 7,200 specimens were processed. When compared to an unsorted, random set of specimens, those which were sorted based on data added from the OCR were quicker to digitise. Of the methods tested here, the most successful in terms of efficiency used a protocol which required entering data into a limited set of fields and in which the records were filtered by Collector and Country. The survey and subsequent discussions with the digitisation staff highlighted their preference for working with sorted specimens, in which label layout, locations and handwriting are likely to be similar, so that a familiarity with the Collector or Country is rapidly established.

  3. [Monitoring method of extraction process for Schisandrae Chinensis Fructus based on near infrared spectroscopy and multivariate statistical process control].

    PubMed

    Xu, Min; Zhang, Lei; Yue, Hong-Shui; Pang, Hong-Wei; Ye, Zheng-Liang; Ding, Li

    2017-10-01

    To establish an on-line monitoring method for the extraction process of Schisandrae Chinensis Fructus, one of the medicinal materials in the formula of Yiqi Fumai lyophilized injection, near infrared spectroscopy was combined with multivariate data analysis. A multivariate statistical process control (MSPC) model was established based on 5 normal production batches, and 2 test batches were monitored by PC-score, DModX and Hotelling T2 control charts. The results showed that the MSPC model had good monitoring ability for the extraction process. Applying the MSPC model to the actual production process can effectively achieve on-line monitoring of the Schisandrae Chinensis Fructus extraction process and reflect changes in material properties during production in real time. The established process monitoring method could serve as a reference for the application of process analytical technology in the process quality control of traditional Chinese medicine injections. Copyright© by the Chinese Pharmaceutical Association.
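
    A minimal sketch of the MSPC machinery named here: fit PCA on normal batches, then chart the Hotelling T2 of new batches against an F-distribution control limit. The data are synthetic stand-ins for NIR spectra; DModX (residual) monitoring would be constructed analogously from the reconstruction error.

    ```python
    # Hotelling T^2 control chart on PCA scores (synthetic spectra).
    import numpy as np
    from scipy import stats
    from sklearn.decomposition import PCA

    rng = np.random.default_rng(4)
    # Synthetic stand-in for NIR spectra: 3 latent factors over 200 wavelengths.
    loadings = rng.normal(size=(3, 200))
    normal = rng.normal(size=(50, 3)) @ loadings + 0.1 * rng.normal(size=(50, 200))
    fault = (rng.normal(size=(10, 3)) @ loadings
             + 0.1 * rng.normal(size=(10, 200))
             + 5 * loadings[0])                  # systematic process deviation

    pca = PCA(n_components=3).fit(normal)

    def hotelling_t2(X):
        scores = pca.transform(X)
        return np.sum(scores ** 2 / pca.explained_variance_, axis=1)

    # F-based control limit for monitoring new observations with k components.
    n, k, alpha = normal.shape[0], 3, 0.01
    limit = k * (n - 1) * (n + 1) / (n * (n - k)) * stats.f.ppf(1 - alpha, k, n - k)

    print("limit:", limit)
    print("in-control T2 (median):", np.median(hotelling_t2(normal)))
    print("faulty-batch T2 (median):", np.median(hotelling_t2(fault)))
    ```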

  4. Investigation of vinegar production using a novel shaken repeated batch culture system.

    PubMed

    Schlepütz, Tino; Büchs, Jochen

    2013-01-01

    Nowadays, bioprocesses are developed or optimized at small scale, and the vinegar industry is likewise motivated to reinvestigate its established repeated batch fermentation process. As yet, there has been no small-scale culture system for optimizing fermentation conditions for repeated batch bioprocesses. Thus, the aim of this study is to propose a new shaken culture system for parallel repeated batch vinegar fermentation. A new operation mode - the flushing repeated batch - was developed. Parallel repeated batch vinegar production could be established in shaken overflow vessels in a completely automated operation with only one pump per vessel. This flushing repeated batch was first investigated theoretically and then tested empirically. The ethanol concentration was monitored online during repeated batch fermentation by semiconductor gas sensors. It was shown that the switch from one ethanol substrate quality to different ethanol substrate qualities resulted in prolonged lag phases and durations of the first batches. In the subsequent batches the length of the fermentations decreased considerably. This decrease in the respective lag phases indicates an adaptation of the acetic acid bacteria mixed culture to the specific ethanol substrate quality. Consequently, flushing repeated batch fermentations on a small scale are valuable for screening fermentation conditions and thereby improving industrial-scale bioprocesses such as vinegar production in terms of process robustness, stability, and productivity. Copyright © 2013 American Institute of Chemical Engineers.

  5. The kinetics, current efficiency, and power consumption of electrochemical dye decolorization by BD-NCD film electrode

    NASA Astrophysics Data System (ADS)

    Nurhayati, Ervin; Juang, Yaju; Huang, Chihpin

    2017-06-01

    Diamond film electrodes are known for a very wide potential window for water electrolysis, which makes them applicable in numerous electrochemical processes. Their capability to produce hydroxyl radicals, very strong oxidants, underlies their popular application in wastewater treatment. Batch and batch recirculation reactors were used for bulk electrolysis experiments to investigate the kinetics of dye decolorization under different operating conditions, such as pH, active species, and current density. Furthermore, COD degradation data from batch recirculation reactor operation were used as the basis for calculating the current efficiency and power consumption of the decolorization process. The kinetics of the decolorization process using a boron-doped nanocrystalline diamond (BD-NCD) film electrode revealed that acidic conditions favor dye degradation, and the presence of chloride ion in solution was found to be more advantageous than sulfate as the active species, as evidenced by the higher reaction rate constants. Applying current densities of 10, 20 and 30 mA cm-2 showed that the higher the current density, the faster the decolorization rate. The general current efficiency achieved after nearly total decolorization and 80% COD removal in the batch recirculation reactor was around 74%, with a specific power consumption of 4.4 kWh m-3 (in terms of volume of solution treated) or 145 kWh kg-1 (in terms of kg COD treated).
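
    For context, the COD-based general current efficiency referred to above is conventionally computed as

    $$\mathrm{GCE} = \frac{(\mathrm{COD}_0 - \mathrm{COD}_t)\, F\, V}{8\, I\, t},$$

    where $F$ is the Faraday constant, $V$ the solution volume, $I$ the applied current, $t$ the electrolysis time, and 8 g mol$^{-1}$ the equivalent mass of oxygen. This is the standard definition; the authors' exact computation is not given in the abstract.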

  6. Removing Batch Effects from Longitudinal Gene Expression - Quantile Normalization Plus ComBat as Best Approach for Microarray Transcriptome Data

    PubMed Central

    Müller, Christian; Schillert, Arne; Röthemeier, Caroline; Trégouët, David-Alexandre; Proust, Carole; Binder, Harald; Pfeiffer, Norbert; Beutel, Manfred; Lackner, Karl J.; Schnabel, Renate B.; Tiret, Laurence; Wild, Philipp S.; Blankenberg, Stefan

    2016-01-01

    Technical variation plays an important role in microarray-based gene expression studies, and batch effects explain a large proportion of this noise. It is therefore mandatory to eliminate technical variation while maintaining biological variability. Several strategies have been proposed for the removal of batch effects, although they have not been evaluated in large-scale longitudinal gene expression data. In this study, we aimed at identifying a suitable method for batch effect removal in a large study of microarray-based longitudinal gene expression. Monocytic gene expression was measured in 1092 participants of the Gutenberg Health Study at baseline and 5-year follow up. Replicates of selected samples were measured at both time points to identify technical variability. Deming regression, Passing-Bablok regression, linear mixed models, non-linear models as well as ReplicateRUV and ComBat were applied to eliminate batch effects between replicates. In a second step, quantile normalization prior to batch effect correction was performed for each method. Technical variation between batches was evaluated by principal component analysis. Associations between body mass index and transcriptomes were calculated before and after batch removal. Results from association analyses were compared to evaluate maintenance of biological variability. Quantile normalization, separately performed in each batch, combined with ComBat successfully reduced batch effects and maintained biological variability. ReplicateRUV performed perfectly in the replicate data subset of the study, but failed when applied to all samples. All other methods did not substantially reduce batch effects in the replicate data subset. Quantile normalization plus ComBat appears to be a valuable approach for batch correction in longitudinal gene expression data. PMID:27272489
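
    A minimal numpy sketch of quantile normalization, the first step of the pipeline favored above: each sample (column) is rank-mapped onto the mean quantile profile. The ComBat location/scale adjustment that follows it is omitted here, and the data are synthetic.

    ```python
    # Quantile normalization sketch (synthetic expression matrix).
    import numpy as np

    def quantile_normalize(X):
        """Rows = genes, columns = samples; ties are ignored for simplicity."""
        ranks = np.argsort(np.argsort(X, axis=0), axis=0)   # per-sample ranks
        mean_profile = np.sort(X, axis=0).mean(axis=1)      # average distribution
        return mean_profile[ranks]

    rng = np.random.default_rng(5)
    X = rng.lognormal(size=(1000, 12))        # 12 samples with skewed expression
    Xq = quantile_normalize(X)
    # After normalization every sample shares one empirical distribution:
    print(np.allclose(np.sort(Xq[:, 0]), np.sort(Xq[:, 3])))   # True
    ```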

  7. Batch Statistical Process Monitoring Approach to a Cocrystallization Process.

    PubMed

    Sarraguça, Mafalda C; Ribeiro, Paulo R S; Dos Santos, Adenilson O; Lopes, João A

    2015-12-01

    Cocrystals are defined as crystalline structures composed of two or more compounds that are solid at room temperature held together by noncovalent bonds. Their main advantages are the increase of solubility, bioavailability, permeability, stability, and at the same time retaining active pharmaceutical ingredient bioactivity. The cocrystallization between furosemide and nicotinamide by solvent evaporation was monitored on-line using near-infrared spectroscopy (NIRS) as a process analytical technology tool. The near-infrared spectra were analyzed using principal component analysis. Batch statistical process monitoring was used to create control charts to perceive the process trajectory and define control limits. Normal and non-normal operating condition batches were performed and monitored with NIRS. The use of NIRS associated with batch statistical process models allowed the detection of abnormal variations in critical process parameters, like the amount of solvent or amount of initial components present in the cocrystallization. © 2015 Wiley Periodicals, Inc. and the American Pharmacists Association.

  8. Integrated production of lactic acid and biomass on distillery stillage.

    PubMed

    Djukić-Vuković, Aleksandra P; Mojović, Ljiljana V; Vukašinović-Sekulić, Maja S; Nikolić, Svetlana B; Pejin, Jelena D

    2013-09-01

    The possibilities of parallel lactic acid and biomass production in batch and fed-batch fermentation on distillery stillage from bioethanol production were studied. The highest lactic acid yield and productivity, 92.3 % and 1.49 g L(-1) h(-1), were achieved in batch fermentation with an initial sugar concentration of 55 g L(-1). A significant improvement was achieved in fed-batch fermentation, where the lactic acid concentration was increased by 47.6 % and the volumetric productivity by 21 % over the batch process. A high number of viable Lactobacillus rhamnosus ATCC 7469 cells, 10(9) CFU ml(-1), was attained at the end of fed-batch fermentation. The survival of 92.9 % of L. rhamnosus cells after 3 h of incubation at pH 2.5 supports the idea that the fermentation medium remaining after lactic acid removal could be used as a biomass-enriched animal feed, adding further value to the process.

  9. Continuous and scalable polymer capsule processing for inertial fusion energy target shell fabrication using droplet microfluidics.

    PubMed

    Li, Jin; Lindley-Start, Jack; Porch, Adrian; Barrow, David

    2017-07-24

    High-specification polymer capsules for inertial fusion energy targets were continuously fabricated using surfactant-free inertial centralisation and ultrafast polymerisation in a scalable flow reactor. Laser-driven inertial confinement fusion depends upon the interaction of high-energy lasers and hydrogen isotopes, contained within small, spherical and concentric target shells, causing a nuclear fusion reaction at ~150 M°C. Potentially, targets will be consumed at ~1 M per day per reactor, demanding a 5000x unit cost reduction to ~$0.20, which is a critical key challenge. Experimentally, double emulsions were used as templates for the capsule shells and were formed at 20 Hz on a fluidic chip. Droplets were centralised in a dynamic flow, and their shapes both evaluated and mathematically modelled, before subsequent shell solidification. The shells were photo-cured individually, on-the-fly, with precisely actuated, millisecond-length (70 ms), uniform-intensity UV pulses delivered through eight radially orchestrated light-pipes. Uniform shells were obtained at a near-100% yield with a minimum of 99.0% concentricity and sphericity, and the solidification processing period was significantly reduced compared with conventional batch methods. The data suggest the new possibility of a continuous, on-the-fly IFE target fabrication process, employing sequential processing operations within a continuous enclosed duct system, which may include cryogenic fuel-filling and shell curing, to produce ready-to-use IFE targets.

  10. An approach to optimize the batch mixing process for improving the quality consistency of the products made from traditional Chinese medicines*

    PubMed Central

    Yan, Bin-jun; Qu, Hai-bin

    2013-01-01

    The efficacy of traditional Chinese medicine (TCM) is based on the combined effects of its constituents. Variation in chemical composition between batches of TCM has always been the deterring factor in achieving consistency in efficacy. The batch mixing process can significantly reduce the batch-to-batch quality variation in TCM extracts by mixing them in a well-designed proportion. However, reducing the quality variation without sacrificing too much of the production efficiency is one of the challenges. Accordingly, an innovative and practical batch mixing method aimed at providing acceptable efficiency for industrial production of TCM products is proposed in this work, which uses a minimum number of batches of extracts to meet the content limits. The important factors affecting the utilization ratio of the extracts (URE) were studied by simulations. The results have shown that URE was affected by the correlation between the contents of constituents, and URE decreased with the increase in the number of targets and the relative standard deviations of the contents. URE could be increased by increasing the number of storage tanks. The results have provided a reference for designing the batch mixing process. The proposed method has possible application value in reducing the quality variation in TCM and providing acceptable production efficiency simultaneously. PMID:24190450

  11. An approach to optimize the batch mixing process for improving the quality consistency of the products made from traditional Chinese medicines.

    PubMed

    Yan, Bin-jun; Qu, Hai-bin

    2013-11-01

    The efficacy of traditional Chinese medicine (TCM) is based on the combined effects of its constituents. Variation in chemical composition between batches of TCM has always been the deterring factor in achieving consistency in efficacy. The batch mixing process can significantly reduce the batch-to-batch quality variation in TCM extracts by mixing them in a well-designed proportion. However, reducing the quality variation without sacrificing too much of the production efficiency is one of the challenges. Accordingly, an innovative and practical batch mixing method aimed at providing acceptable efficiency for industrial production of TCM products is proposed in this work, which uses a minimum number of batches of extracts to meet the content limits. The important factors affecting the utilization ratio of the extracts (URE) were studied by simulations. The results have shown that URE was affected by the correlation between the contents of constituents, and URE decreased with the increase in the number of targets and the relative standard deviations of the contents. URE could be increased by increasing the number of storage tanks. The results have provided a reference for designing the batch mixing process. The proposed method has possible application value in reducing the quality variation in TCM and providing acceptable production efficiency simultaneously.
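
    A toy sketch of the batch-mixing idea above: find the smallest set of extract batches whose blend meets the content limits for every target constituent. For simplicity the blend below uses equal proportions (the paper designs the proportions); the contents and limits are invented.

    ```python
    # Brute-force smallest feasible batch mix (equal proportions, invented data).
    import itertools
    import numpy as np

    def smallest_mix(contents, lo, hi):
        """Smallest equal-proportion blend within [lo, hi] for every constituent."""
        n = contents.shape[0]
        for size in range(1, n + 1):
            for combo in itertools.combinations(range(n), size):
                blend = contents[list(combo)].mean(axis=0)
                if np.all((blend >= lo) & (blend <= hi)):
                    return size, combo
        return None

    rng = np.random.default_rng(6)
    contents = rng.normal(1.0, 0.15, size=(8, 3))   # 8 batches x 3 constituents
    print(smallest_mix(contents, 0.9, 1.1))         # e.g. (2, (0, 3)); None if infeasible
    ```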

  12. High solid fed-batch butanol fermentation with simultaneous product recovery: part II - process integration.

    USDA-ARS?s Scientific Manuscript database

    In these studies liquid hot water (LHW) pretreated and enzymatically hydrolyzed Sweet Sorghum Bagasse (SSB) hydrolyzates were fermented in a fed-batch reactor. As reported in the preceding paper, the culture was not able to ferment the hydrolyzate I in a batch process due to presence of high level o...

  13. User's guide to the UTIL-ODRC tape processing program. [for the Orbital Data Reduction Center

    NASA Technical Reports Server (NTRS)

    Juba, S. M. (Principal Investigator)

    1981-01-01

    The UTIL-ODRC computer compatible tape processing program, its input/output requirements, and its interface with the EXEC 8 operating system are described. It is a multipurpose orbital data reduction center (ODRC) tape processing program enabling the user to create either exact duplicate tapes and/or tapes in SINDA/HISTRY format. Input data elements for PRAMPT/FLOPLT and/or BATCH PLOT programs, a temperature summary, and a printed summary can also be produced.

  14. Actual waste demonstration of the nitric-glycolic flowsheet for sludge batch 9 qualification

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Newell, D.; Pareizs, J.; Martino, C.

    For each sludge batch that is processed in the Defense Waste Processing Facility (DWPF), the Savannah River National Laboratory (SRNL) performs qualification testing to demonstrate that the sludge batch is processable. Based on the results of this actual-waste qualification and previous simulant studies, SRNL recommends implementation of the nitric-glycolic acid flowsheet in DWPF. Other recommendations resulting from this demonstration are reported in section 5.0.

  15. A novel data-driven learning method for radar target detection in nonstationary environments

    DOE PAGES

    Akcakaya, Murat; Nehorai, Arye; Sen, Satyabrata

    2016-04-12

    Most existing radar algorithms are developed under the assumption that the environment (clutter) is stationary. However, in practice, the characteristics of the clutter can vary enormously depending on the radar-operational scenarios. If unaccounted for, these nonstationary variabilities may drastically hinder the radar performance. Therefore, to overcome such shortcomings, we develop a data-driven method for target detection in nonstationary environments. In this method, the radar dynamically detects changes in the environment and adapts to these changes by learning the new statistical characteristics of the environment and by intelligently updating its statistical detection algorithm. Specifically, we employ drift detection algorithms to detect changes in the environment, and incremental learning (in particular, learning under concept drift) to learn the new statistical characteristics of the environment from the new radar data that become available in batches over a period of time. The newly learned environment characteristics are then integrated into the detection algorithm. Furthermore, we use Monte Carlo simulations to demonstrate that the developed method provides a significant improvement in the detection performance compared with detection techniques that are not aware of the environmental changes.
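
    As an illustration of the batch-wise adaptation idea described above (a hedged sketch, not the authors' algorithm; the drift test, threshold rule, and all names are assumptions), a change in clutter statistics between successive batches can be flagged with a two-sample test and the detection threshold refit when drift is detected:

        import numpy as np
        from scipy.stats import ks_2samp

        def update_detector(reference, new_batch, alpha=0.01, pfa=1e-3):
            # Two-sample drift test between the reference and the new batch.
            _, p_value = ks_2samp(reference, new_batch)
            if p_value < alpha:          # drift detected: relearn the environment
                reference = new_batch
            # CFAR-style threshold: empirical (1 - pfa) quantile of clutter power.
            threshold = np.quantile(reference, 1.0 - pfa)
            return reference, threshold

        rng = np.random.default_rng(0)
        stationary = rng.exponential(1.0, 5000)   # clutter before the change
        drifted = rng.exponential(2.0, 5000)      # clutter after the change
        reference, threshold = update_detector(stationary, drifted)

    Here the threshold simply tracks the most recent accepted batch; an incremental learner that blends old and new batches would be a closer analogue of learning under concept drift.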

  16. Computerized Serial Processing System at the University of California, Berkeley

    ERIC Educational Resources Information Center

    Silberstein, Stephen M.

    1975-01-01

    The extreme flexibility of the MARC format coupled with the simplicity of a batch-oriented processing system centered around a sequential master file has enabled the University of California, Berkeley, library to gradually build an unusually large serials data base in support of both technical and public services. (Author)

  17. Converting the H. W. Wilson Company Indexes to an Automated System: A Functional Analysis.

    ERIC Educational Resources Information Center

    Regazzi, John J.

    1984-01-01

    Description of the computerized information system that supports the editorial and manufacturing processes involved in creation of Wilson's subject indexes and catalogs includes the major subsystems--online data entry, batch input processing, validation and release, file generation and database management, online and offline retrieval, publication…

  18. Investigation for improving Global Positioning System (GPS) orbits using a discrete sequential estimator and stochastic models of selected physical processes

    NASA Technical Reports Server (NTRS)

    Goad, Clyde C.; Chadwell, C. David

    1993-01-01

    GEODYNII is a conventional batch least-squares differential corrector computer program with deterministic models of the physical environment. Conventional algorithms were used to process differenced phase and pseudorange data to determine eight-day Global Positioning System (GPS) orbits with several-meter accuracy. However, the errors are driven by random physical processes whose magnitudes prevent improving the GPS orbit accuracy. To improve the orbit accuracy, these random processes should be modeled stochastically. The conventional batch least-squares algorithm cannot accommodate stochastic models; only a stochastic estimation algorithm, such as a sequential filter/smoother, is suitable. Also, GEODYNII cannot currently model the correlation among data values. Differenced pseudorange, and especially differenced phase, are precise data types that can be used to improve the GPS orbit precision. To overcome these limitations and improve the accuracy of GPS orbits computed using GEODYNII, we proposed to develop a sequential stochastic filter/smoother processor by using GEODYNII as a type of trajectory preprocessor. Our proposed processor is now completed. It contains a correlated double-difference range processing capability, first-order Gauss-Markov models for the solar radiation pressure scale coefficient and y-bias acceleration, and a random walk model for the tropospheric refraction correction. The development approach was to interface the standard GEODYNII output files (measurement partials and variationals) with software modules containing the stochastic estimator, the stochastic models, and a double-differenced phase range processing routine. Thus, no modifications to the original GEODYNII software were required. The observational data are edited in the preprocessor and passed to GEODYNII as one of its standard data types. A reference orbit is determined using GEODYNII as a batch least-squares processor, and the GEODYNII measurement partial (FTN90) and variational (FTN80, V-matrix) files are generated. These two files, along with a control statement file and a satellite identification and mass file, are passed to the filter/smoother to estimate time-varying parameter states at each epoch, improved satellite initial elements, and improved estimates of constant parameters.
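
    For orientation, the discrete-time propagation of a first-order Gauss-Markov state with correlation time \tau and steady-state variance \sigma^2 over a step \Delta t is conventionally written as (a standard form, not quoted from the report):

        x_{k+1} = e^{-\Delta t/\tau}\, x_k + w_k, \qquad \operatorname{Var}(w_k) = \sigma^2 \left(1 - e^{-2\Delta t/\tau}\right)

    The random walk used for the tropospheric refraction correction is the limiting case without mean reversion, with \operatorname{Var}(w_k) = q\,\Delta t.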

  19. Simultaneous Transformation of Commingled Trichloroethylene, Tetrachloroethylene, and 1,4-Dioxane by a Microbially Driven Fenton Reaction in Batch Liquid Cultures.

    PubMed

    Sekar, Ramanan; Taillefert, Martial; DiChristina, Thomas J

    2016-11-01

    Improper disposal of 1,4-dioxane and the chlorinated organic solvents trichloroethylene (TCE) and tetrachloroethylene (also known as perchloroethylene [PCE]) has resulted in widespread contamination of soil and groundwater. In the present study, a previously designed microbially driven Fenton reaction system was reconfigured to generate hydroxyl (HO˙) radicals for simultaneous transformation of source zone levels of single, binary, and ternary mixtures of TCE, PCE, and 1,4-dioxane. The reconfigured Fenton reaction system was driven by fed batch cultures of the Fe(III)-reducing facultative anaerobe Shewanella oneidensis amended with lactate, Fe(III), and contaminants and exposed to alternating anaerobic and aerobic conditions. To avoid contaminant loss due to volatility, the Fe(II)-generating, hydrogen peroxide-generating, and contaminant transformation phases of the microbially driven Fenton reaction system were separated. The reconfigured Fenton reaction system transformed TCE, PCE, and 1,4-dioxane either as single contaminants or as binary and ternary mixtures. In the presence of equimolar concentrations of PCE and TCE, the ratio of the experimentally derived rates of PCE and TCE transformation was nearly identical to the ratio of the corresponding HO˙ radical reaction rate constants. The reconfigured Fenton reaction system may be applied as an ex situ platform for simultaneous degradation of commingled TCE, PCE, and 1,4-dioxane and provides valuable information for future development of in situ remediation technologies. A microbially driven Fenton reaction system [driven by the Fe(III)-reducing facultative anaerobe S. oneidensis] was reconfigured to transform source zone levels of TCE, PCE, and 1,4-dioxane as single contaminants or as binary and ternary mixtures. The microbially driven Fenton reaction may thus be applied as an ex situ platform for simultaneous degradation of at least three (and potentially more) commingled contaminants. Additional targets for ex situ and in situ degradation by the microbially driven Fenton reaction developed in the present study include multiple combinations of environmental contaminants susceptible to attack by Fenton reaction-generated HO˙ radicals, including commingled plumes of 1,4-dioxane, pentachlorophenol (PCP), PCE, TCE, 1,1,2-trichloroethane (TCA), and perfluoroalkylated substances (PFAS). Copyright © 2016, American Society for Microbiology. All Rights Reserved.

  20. Simultaneous Transformation of Commingled Trichloroethylene, Tetrachloroethylene, and 1,4-Dioxane by a Microbially Driven Fenton Reaction in Batch Liquid Cultures

    PubMed Central

    Sekar, Ramanan; Taillefert, Martial

    2016-01-01

    ABSTRACT Improper disposal of 1,4-dioxane and the chlorinated organic solvents trichloroethylene (TCE) and tetrachloroethylene (also known as perchloroethylene [PCE]) has resulted in widespread contamination of soil and groundwater. In the present study, a previously designed microbially driven Fenton reaction system was reconfigured to generate hydroxyl (HO˙) radicals for simultaneous transformation of source zone levels of single, binary, and ternary mixtures of TCE, PCE, and 1,4-dioxane. The reconfigured Fenton reaction system was driven by fed batch cultures of the Fe(III)-reducing facultative anaerobe Shewanella oneidensis amended with lactate, Fe(III), and contaminants and exposed to alternating anaerobic and aerobic conditions. To avoid contaminant loss due to volatility, the Fe(II)-generating, hydrogen peroxide-generating, and contaminant transformation phases of the microbially driven Fenton reaction system were separated. The reconfigured Fenton reaction system transformed TCE, PCE, and 1,4-dioxane either as single contaminants or as binary and ternary mixtures. In the presence of equimolar concentrations of PCE and TCE, the ratio of the experimentally derived rates of PCE and TCE transformation was nearly identical to the ratio of the corresponding HO˙ radical reaction rate constants. The reconfigured Fenton reaction system may be applied as an ex situ platform for simultaneous degradation of commingled TCE, PCE, and 1,4-dioxane and provides valuable information for future development of in situ remediation technologies. IMPORTANCE A microbially driven Fenton reaction system [driven by the Fe(III)-reducing facultative anaerobe S. oneidensis] was reconfigured to transform source zone levels of TCE, PCE, and 1,4-dioxane as single contaminants or as binary and ternary mixtures. The microbially driven Fenton reaction may thus be applied as an ex situ platform for simultaneous degradation of at least three (and potentially more) commingled contaminants. Additional targets for ex situ and in situ degradation by the microbially driven Fenton reaction developed in the present study include multiple combinations of environmental contaminants susceptible to attack by Fenton reaction-generated HO˙ radicals, including commingled plumes of 1,4-dioxane, pentachlorophenol (PCP), PCE, TCE, 1,1,2-trichloroethane (TCA), and perfluoroalkylated substances (PFAS). PMID:27542932

  1. Quick generation of Raman spectroscopy based in-process glucose control to influence biopharmaceutical protein product quality during mammalian cell culture.

    PubMed

    Berry, Brandon N; Dobrowsky, Terrence M; Timson, Rebecca C; Kshirsagar, Rashmi; Ryll, Thomas; Wiltberger, Kelly

    2016-01-01

    Mitigating risks to biotherapeutic protein production processes and products has driven the development of targeted process analytical technology (PAT); however implementing PAT during development without significantly increasing program timelines can be difficult. The development of a monoclonal antibody expressed in a Chinese hamster ovary (CHO) cell line via fed-batch processing presented an opportunity to demonstrate capabilities of altering percent glycated protein product. Glycation is caused by pseudo-first order, non-enzymatic reaction of a reducing sugar with an amino group. Glucose is the highest concentration reducing sugar in the chemically defined media (CDM), thus a strategy controlling glucose in the production bioreactor was developed utilizing Raman spectroscopy for feedback control. Raman regions for glucose were determined by spiking studies in water and CDM. Calibration spectra were collected during 8 bench scale batches designed to capture a wide glucose concentration space. Finally, a PLS model capable of translating Raman spectra to glucose concentration was built using the calibration spectra and spiking study regions. Bolus feeding in mammalian cell culture results in wide glucose concentration ranges. Here we describe the development of process automation enabling glucose setpoint control. Glucose-free nutrient feed was fed daily, however glucose stock solution was fed as needed according to online Raman measurements. Two feedback control conditions were executed where glucose was controlled at constant low concentration or decreased stepwise throughout. Glycation was reduced from ∼9% to 4% using a low target concentration but was not reduced in the stepwise condition as compared to the historical bolus glucose feeding regimen. © 2015 American Institute of Chemical Engineers.
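
    A minimal sketch of the PLS calibration step described above, assuming preprocessed spectra restricted to the glucose-relevant Raman regions; the arrays, component count, and variable names are illustrative stand-ins, not the authors' settings:

        import numpy as np
        from sklearn.cross_decomposition import PLSRegression

        rng = np.random.default_rng(0)
        X = rng.normal(size=(80, 400))    # stand-in calibration spectra
        y = X[:, 120:130].sum(axis=1) + rng.normal(scale=0.1, size=80)  # mock glucose

        pls = PLSRegression(n_components=5)   # choose components by cross-validation
        pls.fit(X, y)                         # spectra -> glucose concentration
        new_spectrum = rng.normal(size=400)
        glucose_online = pls.predict(new_spectrum.reshape(1, -1))

    In the process described above, the predicted concentration would drive the glucose feed decision at each measurement interval.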

  2. Enabling a high throughput real time data pipeline for a large radio telescope array with GPUs

    NASA Astrophysics Data System (ADS)

    Edgar, R. G.; Clark, M. A.; Dale, K.; Mitchell, D. A.; Ord, S. M.; Wayth, R. B.; Pfister, H.; Greenhill, L. J.

    2010-10-01

    The Murchison Widefield Array (MWA) is a next-generation radio telescope currently under construction in the remote Western Australia Outback. Raw data will be generated continuously at 5 GiB s-1, grouped into 8 s cadences. This high throughput motivates the development of on-site, real time processing and reduction in preference to archiving, transport and off-line processing. Each batch of 8 s data must be completely reduced before the next batch arrives. Maintaining real time operation will require a sustained performance of around 2.5 TFLOP s-1 (including convolutions, FFTs, interpolations and matrix multiplications). We describe a scalable heterogeneous computing pipeline implementation, exploiting both the high computing density and FLOP-per-Watt ratio of modern GPUs. The architecture is highly parallel within and across nodes, with all major processing elements performed by GPUs. Necessary scatter-gather operations along the pipeline are loosely synchronized between the nodes hosting the GPUs. The MWA will be a frontier scientific instrument and a pathfinder for planned peta- and exa-scale facilities.
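
    To make the real-time constraint concrete, simple arithmetic on the figures quoted above gives 5 GiB/s × 8 s = 40 GiB of raw data per cadence, all of which must be reduced within the following 8 s window; this is what drives the sustained requirement of around 2.5 TFLOP s-1.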

  3. Modeling of the adsorptive removal of arsenic(III) using plant biomass: a bioremedial approach

    NASA Astrophysics Data System (ADS)

    Roy, Palas; Dey, Uttiya; Chattoraj, Soumya; Mukhopadhyay, Debasis; Mondal, Naba Kumar

    2017-06-01

    In the present work, the possibility of using a non-conventional, finely ground (250 μm) Azadirachta indica (neem) bark powder [AiBP] has been tested as a low-cost biosorbent for the removal of arsenic(III) from water. The removal of As(III) was studied by performing a series of biosorption experiments (batch and column). The biosorption behavior of As(III) in batch and column operations was examined in the concentration ranges of 50-500 µg L-1 and 500.0-2000.0 µg L-1, respectively. Under optimized batch conditions, the AiBP could remove up to 89.96% of the As(III) in the water system. An artificial neural network (ANN) model developed from the batch experimental data sets provided reasonable predictive performance (R2 = 0.961; 0.954) for As(III) biosorption. In batch operation, the initial As(III) concentration had the most significant impact on the biosorption process. For column operation, a central composite design (CCD) was applied to investigate the influences on the breakthrough time, to optimize the As(III) biosorption process and to evaluate the interacting effects of the different operating variables. The optimized CCD result revealed that the AiBP was an effective and economically feasible biosorbent, with a maximum breakthrough time of 653.9 min when the independent variables were held at a 2.0 g AiBP dose, a 2000.0 µg L-1 initial As(III) concentration, and a 3.0 mL min-1 flow rate, at a maximum desirability value of 0.969.
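
    The following is a rough stand-in for the kind of ANN fit reported above (the authors' network architecture and software are not specified here, and the data are synthetic placeholders):

        import numpy as np
        from sklearn.neural_network import MLPRegressor
        from sklearn.metrics import r2_score

        rng = np.random.default_rng(1)
        # Columns: initial As(III) concentration, pH, dose, contact time (illustrative).
        X = rng.uniform(size=(60, 4))
        y = 0.9 - 0.3 * X[:, 0] + 0.05 * rng.normal(size=60)  # mock removal fraction

        ann = MLPRegressor(hidden_layer_sizes=(8,), max_iter=5000, random_state=1)
        ann.fit(X, y)
        print("R2 =", r2_score(y, ann.predict(X)))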

  4. An adsorption diffusion model for removal of para-chlorophenol by activated carbon derived from bituminous coal.

    PubMed

    Sze, M F F; McKay, G

    2010-05-01

    Batch adsorption experiments were carried out to study the adsorptive removal and diffusion mechanism of para-chlorophenol (p-CP) onto Calgon Filtrasorb 400 (F400) activated carbon. The external mass transfer resistance is negligible in the adsorption process carried out under different conditions in batch operation. Intraparticle diffusion model plots were used to correlate the batch p-CP adsorption data; three distinct linear sections were obtained for every batch operation. The textural properties of F400 activated carbon showed that it has a large portion of supermicropores, which is comparable to the size of the p-CP molecules. Due to the stronger interactions between p-CP molecules and F400 micropores, p-CP molecules predominantly diffused and occupied active sites in micropore region by hopping mechanism, and eventually followed by a slow filling of mesopores and micropores. This hypothesis is proven by the excellent agreement of the intraparticle diffusion model plots and the textural properties of F400 activated carbon. Copyright 2009 Elsevier Ltd. All rights reserved.
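
    The multi-linear plots referred to above are conventionally based on the Weber-Morris intraparticle diffusion relation (a standard form, not quoted from the paper):

        q_t = k_{id}\, t^{1/2} + C

    where q_t is the amount adsorbed at time t, k_{id} is the intraparticle diffusion rate constant, and the intercept C reflects the boundary-layer contribution; each linear section of the q_t versus t^{1/2} plot then corresponds to one diffusion stage.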

  5. Optimization of semi-continuous anaerobic digestion of sugarcane straw co-digested with filter cake: Effects of macronutrients supplementation on conversion kinetics.

    PubMed

    Janke, Leandro; Weinrich, Sören; Leite, Athaydes F; Schüch, Andrea; Nikolausz, Marcell; Nelles, Michael; Stinner, Walter

    2017-12-01

    Anaerobic digestion of sugarcane straw co-digested with sugarcane filter cake was investigated with a special focus on macronutrient supplementation for an optimized conversion process. Experimental data from batch tests and a semi-continuous experiment operated in different supplementation phases were used for modeling the conversion kinetics based on continuous stirred-tank reactors. The semi-continuous experiment showed an overall decrease in performance as the inoculum washed out of the reactors. By supplementing nitrogen, either alone or in combination with phosphorus and sulfur, the specific methane production significantly increased (P<0.05), by 17% and 44%, respectively. Although the two-pool one-step model fitted the batch experimental data well (R2 > 0.99), the use of the depicted kinetics did not provide a good estimate for process simulation of the semi-continuous process (in any supplementation phase), possibly due to the different feeding modes and to the inoculum source, activity and adaptation. Copyright © 2017 Elsevier Ltd. All rights reserved.
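
    A common parameterisation of a two-pool first-order model for cumulative methane production, given here for orientation only (the authors' exact formulation may differ), is

        B(t) = B_{\max}\left[\alpha\left(1 - e^{-k_f t}\right) + (1 - \alpha)\left(1 - e^{-k_s t}\right)\right]

    where \alpha is the rapidly degradable fraction and k_f and k_s are the rate constants of the fast and slow pools.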

  6. Geomagnetic field modeling by optimal recursive filtering

    NASA Technical Reports Server (NTRS)

    1980-01-01

    Data sets selected for mini-batches and the software modifications required for processing these sets are described. Initial analysis was performed on mini-batch field model recovery. Studies are being performed to examine the convergence of the solutions and the maximum expansion order the data will support in the constant and secular terms.

  7. Enhanced production and immunological characterization of recombinant West Nile virus envelope domain III protein.

    PubMed

    Tripathi, Nagesh K; Karothia, Divyanshi; Shrivastava, Ambuj; Banger, Swati; Kumar, Jyoti S

    2018-05-13

    West Nile virus (WNV) is an emerging mosquito-borne virus which is responsible for severe and fatal encephalitis in humans and for which there is no licensed vaccine or therapeutic available to prevent infection. The envelope domain III protein (EDIII) of WNV was over-expressed in Escherichia coli and purified using a two-step chromatography process which included immobilized metal affinity chromatography and ion exchange chromatography. E. coli cells were grown in a bioreactor to high density using batch and fed-batch cultivation. The wet biomass obtained after the batch and fed-batch cultivation processes was 11.2 and 84 g/L of culture, respectively. The protein yield after affinity purification was 5.76 and 5.81 mg/g wet cell weight after the batch and fed-batch processes, respectively. The purified WNV EDIII elicited specific antibodies in rabbits, confirming its immunogenicity. Moreover, the antibodies were able to neutralize WNV in vitro. These results established that the refolded and purified WNV EDIII could be a potential vaccine candidate. Copyright © 2018 Elsevier B.V. All rights reserved.

  8. Kinetics and thermodynamics studies of silver ions adsorption onto coconut shell activated carbon.

    PubMed

    Silva-Medeiros, Flávia V; Consolin-Filho, Nelson; Xavier de Lima, Mateus; Bazzo, Fernando Previato; Barros, Maria Angélica S D; Bergamasco, Rosângela; Tavares, Célia R G

    2016-12-01

    The presence of silver in the natural water environment has been of great concern because of its toxicity, especially when it is in the free ion form (Ag(+)). This paper aims to study the adsorption kinetics of silver ions from an aqueous solution onto coconut shell activated carbon using batch methods. Batch kinetic data were fitted to the first-order model and the pseudo-second-order model, with the latter fitting the experimental data well. Equilibrium experiments were carried out at 30°C, 40°C, and 50°C. The adsorption isotherms were reasonably well fitted by the Langmuir model, and the adsorption process was slightly influenced by changes in temperature. Thermodynamic parameters (ΔH°, ΔG°, and ΔS°) were determined. The adsorption process appears to be unfavorable, exothermic, and accompanied by an increase in order.
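
    For reference, the linearised pseudo-second-order model and the thermodynamic relations behind the reported parameters are (textbook forms, not taken from the paper):

        \frac{t}{q_t} = \frac{1}{k_2 q_e^2} + \frac{t}{q_e}, \qquad \Delta G^\circ = -RT \ln K = \Delta H^\circ - T\, \Delta S^\circ

    A positive \Delta G^\circ together with negative \Delta H^\circ and \Delta S^\circ is consistent with the unfavorable, exothermic, order-increasing behaviour described above.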

  9. Fully automated processing of fMRI data in SPM: from MRI scanner to PACS.

    PubMed

    Maldjian, Joseph A; Baer, Aaron H; Kraft, Robert A; Laurienti, Paul J; Burdette, Jonathan H

    2009-01-01

    Here we describe the Wake Forest University Pipeline, a fully automated method for the processing of fMRI data using SPM. The method includes fully automated data transfer and archiving from the point of acquisition, real-time batch script generation, distributed grid processing, interface to SPM in MATLAB, error recovery and data provenance, DICOM conversion and PACS insertion. It has been used for automated processing of fMRI experiments, as well as for the clinical implementation of fMRI and spin-tag perfusion imaging. The pipeline requires no manual intervention, and can be extended to any studies requiring offline processing.

  10. The use of Optical Character Recognition (OCR) in the digitisation of herbarium specimen labels

    PubMed Central

    Drinkwater, Robyn E.; Cubey, Robert W. N.; Haston, Elspeth M.

    2014-01-01

    Abstract At the Royal Botanic Garden Edinburgh (RBGE) the use of Optical Character Recognition (OCR) to aid the digitisation process has been investigated. This was tested using a herbarium specimen digitisation process with two stages of data entry. Records were initially batch-processed to add data extracted from the OCR text prior to being sorted based on Collector and/or Country. Using images of the specimens, a team of six digitisers then added data to the specimen records. To investigate whether the data from OCR aid the digitisation process, they completed a series of trials which compared the efficiency of data entry between sorted and unsorted batches of specimens. A survey was carried out to explore the opinion of the digitisation staff to the different sorting options. In total 7,200 specimens were processed. When compared to an unsorted, random set of specimens, those which were sorted based on data added from the OCR were quicker to digitise. Of the methods tested here, the most successful in terms of efficiency used a protocol which required entering data into a limited set of fields and where the records were filtered by Collector and Country. The survey and subsequent discussions with the digitisation staff highlighted their preference for working with sorted specimens, in which label layout, locations and handwriting are likely to be similar, and so a familiarity with the Collector or Country is rapidly established. PMID:25009435

  11. Batching System for Superior Service

    NASA Technical Reports Server (NTRS)

    2001-01-01

    Veridian's Portable Batch System (PBS) was the recipient of the 1997 NASA Space Act Award for outstanding software. A batch system is a set of processes for managing queues and jobs. Without a batch system, it is difficult to manage the workload of a computer system. By bundling the enterprise's computing resources, the PBS technology offers users a single coherent interface, resulting in efficient management of the batch services. Users choose which information to package into "containers" for system-wide use. PBS also provides detailed system usage data, a procedure not easily executed without this software. PBS operates on networked, multi-platform UNIX environments. Veridian's new version, PBS Pro™, has additional features and enhancements, including support for additional operating systems. Veridian distributes the original version of PBS as Open Source software via the PBS website. Customers can register and download the software at no cost. PBS Pro is also available via the web and offers additional features such as increased stability, reliability, and fault tolerance. A company using PBS can expect a significant increase in the effective management of its computing resources. Tangible benefits include increased utilization of costly resources and enhanced understanding of computational requirements and user needs.
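
    A minimal sketch of driving a PBS-style batch system from a script; the #PBS directives shown (-N, -l, -q) are standard PBS options, but the resource values, queue name, and executable are hypothetical:

        import subprocess

        job_lines = [
            "#!/bin/sh",
            "#PBS -N demo_job",                         # job name
            "#PBS -l nodes=1:ppn=4,walltime=00:10:00",  # resource request
            "#PBS -q batch",                            # queue (site-specific)
            "cd $PBS_O_WORKDIR",
            "./run_analysis",                           # hypothetical executable
        ]

        with open("demo.pbs", "w") as f:
            f.write("\n".join(job_lines) + "\n")
        subprocess.run(["qsub", "demo.pbs"], check=True)  # enqueue the job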

  12. A Multi-mission Event-Driven Component-Based System for Support of Flight Software Development, ATLO, and Operations first used by the Mars Science Laboratory (MSL) Project

    NASA Technical Reports Server (NTRS)

    Dehghani, Navid; Tankenson, Michael

    2006-01-01

    This paper details an architectural description of the Mission Data Processing and Control System (MPCS), an event-driven, multi-mission ground data processing components providing uplink, downlink, and data management capabilities which will support the Mars Science Laboratory (MSL) project as its first target mission. MPCS is developed based on a set of small reusable components, implemented in Java, each designed with a specific function and well-defined interfaces. An industry standard messaging bus is used to transfer information among system components. Components generate standard messages which are used to capture system information, as well as triggers to support the event-driven architecture of the system. Event-driven systems are highly desirable for processing high-rate telemetry (science and engineering) data, and for supporting automation for many mission operations processes.

  13. Batch calculations in CalcHEP

    NASA Astrophysics Data System (ADS)

    Pukhov, A.

    2003-04-01

    CalcHEP is a clone of the CompHEP project, developed by the author outside of the CompHEP group. CompHEP/CalcHEP are packages for the automatic calculation of elementary particle decay and collision properties in the lowest order of perturbation theory. The main idea built into the packages is to make the passage from the Lagrangian to the final distributions effective, with a high level of automation. Accordingly, the packages were created as menu-driven, user-friendly programs for calculations in the interactive mode. On the other hand, long-running calculations should be done in a non-interactive regime. Thus, from the beginning CompHEP has had the problem of batch calculations. In CompHEP 33.23 the batch session was realized by means of an interactive menu which allowed the user to formulate the task for the batch; after that, the non-interactive session was launched. This approach is too restricted and inflexible, and leads to duplication in programming. In this article I discuss another approach: how one can force an interactive program to work in non-interactive mode. This approach was realized in CalcHEP 2.1, available at http://theory.sinp.msu.ru/~pukhov/calchep.html.

  14. EMAT enhanced dispersion of particles in liquid

    DOEpatents

    Kisner, Roger A.; Rios, Orlando; Melin, Alexander M.; Ludtka, Gerard Michael; Ludtka, Gail Mackiewicz; Wilgen, John B.

    2016-11-29

    Particulate matter is dispersed in a fluid material. A sample including a first material in a fluid state and a second material comprising particulate matter is placed into a chamber. The second material is spatially dispersed in the first material utilizing EMAT force. The dispersion process continues until the spatial distribution of the second material enables the sample to meet a specified criterion. The chamber and/or the sample is electrically conductive. The EMAT force is generated by placing the chamber coaxially within an induction coil driven by an applied alternating current and placing the chamber and induction coil coaxially within a high-field magnet. The EMAT force is coupled to the sample without any other physical object making physical contact with the sample or the chamber. Batch and continuous processing are utilized. The chamber may be folded within the bore of the magnet. Acoustic force frequency and/or temperature may be controlled.

  15. The Montana experience

    NASA Technical Reports Server (NTRS)

    Dundas, T. R.

    1981-01-01

    The development and capabilities of the Montana geodata system are discussed. The system is entirely dependent on the state's central data processing facility which serves all agencies and is therefore restricted to batch mode processing. The computer graphics equipment is briefly described along with its application to state lands and township mapping and the production of water quality interval maps.

  16. 12 CFR 7.5004 - Sale of excess electronic capacity and by-products.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... because a bank engages in batch processing of banking transactions or because a bank must have capacity to... bank's needs for banking purposes include: (1) Data processing services; (2) Production and..., records, or media (such as electronic images) developed by the bank for or during the performance of its...

  17. 12 CFR 7.5004 - Sale of excess electronic capacity and by-products.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... because a bank engages in batch processing of banking transactions or because a bank must have capacity to... bank's needs for banking purposes include: (1) Data processing services; (2) Production and..., records, or media (such as electronic images) developed by the bank for or during the performance of its...

  18. 12 CFR 7.5004 - Sale of excess electronic capacity and by-products.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... because a bank engages in batch processing of banking transactions or because a bank must have capacity to... bank's needs for banking purposes include: (1) Data processing services; (2) Production and..., records, or media (such as electronic images) developed by the bank for or during the performance of its...

  19. 12 CFR 7.5004 - Sale of excess electronic capacity and by-products.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... because a bank engages in batch processing of banking transactions or because a bank must have capacity to... bank's needs for banking purposes include: (1) Data processing services; (2) Production and..., records, or media (such as electronic images) developed by the bank for or during the performance of its...

  20. NHEXAS PHASE I ARIZONA STUDY--STANDARD OPERATING PROCEDURE FOR KEYPUNCH TRACKING, CUSTODY AND DATA TRANSFER (UA-C-6.0)

    EPA Science Inventory

    The purpose of this SOP is to describe this sub-routine within the overall field data flow and custody plan. Its purpose is to establish a uniform procedure for the tracking of physical field forms and questionnaires while at keypunch. It applies to all data processing batches ...

  1. A new statistic for identifying batch effects in high-throughput genomic data that uses guided principal component analysis.

    PubMed

    Reese, Sarah E; Archer, Kellie J; Therneau, Terry M; Atkinson, Elizabeth J; Vachon, Celine M; de Andrade, Mariza; Kocher, Jean-Pierre A; Eckel-Passow, Jeanette E

    2013-11-15

    Batch effects are due to probe-specific systematic variation between groups of samples (batches) resulting from experimental features that are not of biological interest. Principal component analysis (PCA) is commonly used as a visual tool to determine whether batch effects exist after applying a global normalization method. However, PCA yields linear combinations of the variables that contribute maximum variance and thus will not necessarily detect batch effects if they are not the largest source of variability in the data. We present an extension of PCA to quantify the existence of batch effects, called guided PCA (gPCA). We describe a test statistic that uses gPCA to test whether a batch effect exists. We apply our proposed test statistic derived using gPCA to simulated data and to two copy number variation case studies: the first study consisted of 614 samples from a breast cancer family study using Illumina Human 660 bead-chip arrays, whereas the second case study consisted of 703 samples from a family blood pressure study that used Affymetrix SNP Array 6.0. We demonstrate that our statistic has good statistical properties and is able to identify significant batch effects in two copy number variation case studies. We developed a new statistic that uses gPCA to identify whether batch effects exist in high-throughput genomic data. Although our examples pertain to copy number data, gPCA is general and can be used on other data types as well. The gPCA R package (Available via CRAN) provides functionality and data to perform the methods in this article. reesese@vcu.edu
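
    A rough numpy illustration of the guided-PCA idea (not the published statistic or the gPCA package itself): compare the variance captured along a first principal component guided by the batch indicator matrix with the unguided first principal component, and calibrate the ratio with a permutation null over batch labels.

        import numpy as np

        def gpca_delta(X, batch):
            # Ratio of variance along the batch-guided first PC to the
            # unguided first PC (larger suggests a stronger batch effect).
            Xc = X - X.mean(axis=0)
            Y = np.eye(int(batch.max()) + 1)[batch]   # one-hot batch matrix
            _, _, Vg = np.linalg.svd(Y.T @ Xc, full_matrices=False)
            _, _, Vu = np.linalg.svd(Xc, full_matrices=False)
            return np.var(Xc @ Vg[0]) / np.var(Xc @ Vu[0])

        rng = np.random.default_rng(0)
        X = rng.normal(size=(100, 50))
        X[:50] += 1.5                                 # inject a batch shift
        batch = np.repeat([0, 1], 50)
        delta = gpca_delta(X, batch)
        null = np.array([gpca_delta(X, rng.permutation(batch)) for _ in range(200)])
        p_value = float((null >= delta).mean())       # permutation p-value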

  2. Continuous flow operation with appropriately adjusting composites in influent for recovery of Cr(VI), Cu(II) and Cd(II) in self-driven MFC-MEC system.

    PubMed

    Li, Ming; Pan, Yuzhen; Huang, Liping; Zhang, Yong; Yang, Jinhui

    2017-03-01

    A self-driven microbial fuel cell (MFC) - microbial electrolysis cell (MEC) system, where electricity generated from MFCs is utilized in situ to power MECs, has previously been reported for recovering Cr(VI), Cu(II) and Cd(II), with the individual metals fed into different units of the system in batch operation. Here, the system was advanced to treat a synthetic mixed-metal solution, with appropriately adjusted influent compositions, in fed-batch and continuous flow operations, achieving complete separation of Cr(VI), Cu(II) and Cd(II) from each other. Under an optimal condition of a hydraulic residence time of 4 h, the matching of two serially connected MFCs with one MEC, and feeding with a composite of either 5 mg L-1 Cr(VI), 1 mg L-1 Cu(II) and 5 mg L-1 Cd(II), or 1 mg L-1 Cr(VI), 5 mg L-1 Cu(II) and 5 mg L-1 Cd(II), the self-driven MFC-MEC system can completely and sequentially recover Cu(II), Cr(VI) and Cd(II) from the mixed metals. This study provides a truly sustainable, zero-energy-consumption approach to using bioelectrochemical systems for completely recovering and separating Cr(VI), Cu(II) and Cd(II) from each other, and from wastes or contaminated sites.

  3. Influence of salt content and processing time on sensory characteristics of cooked "lacón".

    PubMed

    Purriños, Laura; Bermúdez, Roberto; Temperán, Sara; Franco, Daniel; Carballo, Javier; Lorenzo, José M

    2011-04-01

    The influence of salt content and processing time on the sensory properties of cooked "lacón" was determined. "Lacón" is a traditional dry-cured and ripened meat product made in the north-west of Spain from the fore leg of the pig, following a similar process to that of dry-cured ham. Six batches of "lacón" were salted with different amounts of salt (LS (3 days of salting), MS (4 days of salting) and HS (5 days of salting)) and ripened for two different periods (56 and 84 days of dry-ripening). Cured odour in all batches studied, red colour and rancid odour in the MS and HS batches, flavour intensity in the MS batch, and fat yellowness, rancid flavour and hardness in the HS batch were significantly different with respect to the time of processing. Appearance, odour, flavour and texture were not significantly affected by the salt content (P>0.05). However, the saltiness score showed significant differences with respect to the salt levels in all studied batches (56 and 84 days of process). © 2010 The American Meat Science Association. Published by Elsevier Ltd. All rights reserved.

  4. Continuous processing of recombinant proteins: Integration of inclusion body solubilization and refolding using simulated moving bed size exclusion chromatography with buffer recycling.

    PubMed

    Wellhoefer, Martin; Sprinzl, Wolfgang; Hahn, Rainer; Jungbauer, Alois

    2013-12-06

    An integrated process which combines continuous inclusion body dissolution with NaOH and continuous matrix-assisted refolding based on closed-loop simulated moving bed size exclusion chromatography was designed and experimentally evaluated at laboratory scale. Inclusion bodies from N(pro) fusion pep6His and N(pro) fusion MCP1 from high cell density fermentation were continuously dissolved with NaOH, filtered and mixed with concentrated refolding buffer prior to refolding by size exclusion chromatography (SEC). This process enabled an isocratic operation of the simulated moving bed (SMB) system with a closed-loop set-up with refolding buffer as the desorbent buffer and buffer recycling by concentrating the raffinate using tangential flow filtration. With this continuous refolding process, we increased the refolding and cleavage yield of both model proteins by 10% compared to batch dilution refolding. Furthermore, more than 99% of the refolding buffer of the raffinate could be recycled which reduced the buffer consumption significantly. Based on the actual refolding data, we compared throughput, productivity, and buffer consumption between two batch dilution refolding processes - one using urea for IB dissolution, the other one using NaOH for IB dissolution - and our continuous refolding process. The higher complexity of the continuous refolding process was rewarded with higher throughput and productivity as well as significantly lower buffer consumption compared to the batch dilution refolding processes. Copyright © 2013 Elsevier B.V. All rights reserved.

  5. Retention of neodymium by dolomite at variable ionic strength as probed by batch and column experiments.

    PubMed

    Emerson, H P; Zengotita, F; Richmann, M; Katsenovich, Y; Reed, D T; Dittrich, T M

    2018-10-01

    The results presented in this paper highlight the complexity of adsorption and incorporation processes of Nd with dolomite and significantly improve upon previous work investigating trivalent actinide and lanthanide interactions with dolomite. Both batch and mini column experiments were conducted at variable ionic strength. These data highlight the strong chemisorption of Nd to the dolomite surface (equilibrium Kd values > 3000 mL/g) and suggest that equilibrium adsorption processes may not be affected by ionic strength based on similar results at 0.1 and 5.0 M ionic strength in column breakthrough and equilibrium batch (>5 days) results. Mini column experiments conducted over approximately one year also represent a significant development in measurement of sorption of Nd in the presence of flow as previous large-scale column experiments did not achieve breakthrough likely due to the high loading capacity of dolomite for Nd (up to 240 μg/g). Batch experiments in the absence of flow show that the rate of Nd removal increases with increasing ionic strength (up to 5.0 M) with greater removal at greater ionic strength for a 24 h sampling point. We suggest that the increasing ionic strength induces increased mineral dissolution and re-precipitation caused by changes in activity with ionic strength that lead to increased removal of Nd through co-precipitation processes. Copyright © 2018 Elsevier Ltd. All rights reserved.
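
    For context, the distribution coefficient quoted above is conventionally obtained from batch data as (a standard definition, not specific to this study)

        K_d = \frac{C_0 - C_e}{C_e} \cdot \frac{V}{m}

    where C_0 and C_e are the initial and equilibrium solution concentrations, V is the solution volume, and m is the sorbent mass; mL/g units follow from V in mL and m in g.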

  6. The Mechanical Property of Batch Annealed High Strength Low Alloy Steel HC260LA

    NASA Astrophysics Data System (ADS)

    Yang, Xiaojiang; Xia, Mingsheng; Zhang, Hongbo; Han, Bin; Li, Guilan

    Cold rolled high strength low alloy steel is widely applied in automotive parts due to its excellent formability and weldability. In this paper, the steel grade HC260LA according to the European Norm was developed with a batch annealing process. With commercial C-Mn mild steel as a benchmark, three different chemistry groups, namely C-Mn-Si, C-Mn-Nb-Ti and C-Mn-Nb, were compared in terms of yield-to-tensile strength (Y/T) ratio. Microstructure and mechanical properties were characterized as well. Based on industrial production results, the chemistry and detailed process parameters for batch annealing were identified. Finally, the optimal Y/T ratio was proposed for this steel grade under the batch annealing process.

  7. Lactate production as representative of the fermentation potential of Corynebacterium glutamicum 2262 in a one-step process.

    PubMed

    Khuat, Hoang Bao Truc; Kaboré, Abdoul Karim; Olmos, Eric; Fick, Michel; Boudrant, Joseph; Goergen, Jean-Louis; Delaunay, Stéphane; Guedon, Emmanuel

    2014-01-01

    The fermentative properties of the thermo-sensitive strain Corynebacterium glutamicum 2262 were investigated in processes coupling aerobic cell growth with an anaerobic fermentation phase. In particular, the influence of two modes of fermentation on the production of lactate, chosen as a model fermentation product, was studied. In both processes, lactate was produced in significant amounts: 27 g/L in batch culture and up to 55.8 g/L in fed-batch culture, although the specific production rate in the fed-batch culture was four times lower than that in the batch culture. Compared to other investigated fermentation processes, our strategy resulted in the highest yield of lactic acid from biomass. Lactate production by C. glutamicum 2262 thus revealed the capability of the strain to produce various fermentation products from pyruvate.

  8. Risk-based Methodology for Validation of Pharmaceutical Batch Processes.

    PubMed

    Wiles, Frederick

    2013-01-01

    In January 2011, the U.S. Food and Drug Administration published new process validation guidance for pharmaceutical processes. The new guidance debunks the long-held industry notion that three consecutive validation batches or runs are all that are required to demonstrate that a process is operating in a validated state. Instead, the new guidance now emphasizes that the level of monitoring and testing performed during process performance qualification (PPQ) studies must be sufficient to demonstrate statistical confidence both within and between batches. In some cases, three qualification runs may not be enough. Nearly two years after the guidance was first published, little has been written defining a statistical methodology for determining the number of samples and qualification runs required to satisfy Stage 2 requirements of the new guidance. This article proposes using a combination of risk assessment, control charting, and capability statistics to define the monitoring and testing scheme required to show that a pharmaceutical batch process is operating in a validated state. In this methodology, an assessment of process risk is performed through application of a process failure mode, effects, and criticality analysis (PFMECA). The output of PFMECA is used to select appropriate levels of statistical confidence and coverage which, in turn, are used in capability calculations to determine when significant Stage 2 (PPQ) milestones have been met. The achievement of Stage 2 milestones signals the release of batches for commercial distribution and the reduction of monitoring and testing to commercial production levels. Individuals, moving range, and range/sigma charts are used in conjunction with capability statistics to demonstrate that the commercial process is operating in a state of statistical control. The new process validation guidance published by the U.S. Food and Drug Administration in January of 2011 indicates that the number of process validation batches or runs required to demonstrate that a pharmaceutical process is operating in a validated state should be based on sound statistical principles. The old rule of "three consecutive batches and you're done" is no longer sufficient. The guidance, however, does not provide any specific methodology for determining the number of runs required, and little has been published to augment this shortcoming. The paper titled "Risk-based Methodology for Validation of Pharmaceutical Batch Processes" describes a statistically sound methodology for determining when a statistically valid number of validation runs has been acquired based on risk assessment and calculation of process capability.
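
    The capability statistic at the centre of such Stage 2 decisions is typically of the C_pk form (a standard definition; the article's specific acceptance criteria are not reproduced here):

        C_{pk} = \min\left(\frac{USL - \hat{\mu}}{3\hat{\sigma}},\ \frac{\hat{\mu} - LSL}{3\hat{\sigma}}\right)

    with the number of PPQ batches and samples chosen so that a one-sided confidence bound on C_pk, at the confidence and coverage levels selected through PFMECA, clears a predefined target.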

  9. Central waste processing system

    NASA Technical Reports Server (NTRS)

    Kester, F. L.

    1973-01-01

    A new concept for processing spacecraft type wastes has been evaluated. The feasibility of reacting various waste materials with steam at temperatures of 538 - 760 C in both a continuous and batch reactor with residence times from 3 to 60 seconds has been established. Essentially complete gasification is achieved. Product gases are primarily hydrogen, carbon dioxide, methane, and carbon monoxide. Water soluble synthetic wastes are readily processed in a continuous tubular reactor at concentrations up to 20 weight percent. The batch reactor is able to process wet and dry wastes at steam to waste weight ratios from 2 to 20. Feces, urine, and synthetic wastes have been successfully processed in the batch reactor.

  10. Computional algorithm for lifetime exposure to antimicrobials in pigs using register data-The LEA algorithm.

    PubMed

    Birkegård, Anna Camilla; Andersen, Vibe Dalhoff; Halasa, Tariq; Jensen, Vibeke Frøkjær; Toft, Nils; Vigre, Håkan

    2017-10-01

    Accurate and detailed data on antimicrobial exposure in pig production are essential when studying the association between antimicrobial exposure and antimicrobial resistance. Due to difficulties in obtaining primary data on antimicrobial exposure in a large number of farms, there is a need for a robust and valid method to estimate the exposure using register data. An approach that estimates the antimicrobial exposure in every rearing period during the lifetime of a pig using register data was developed into a computational algorithm. In this approach data from national registers on antimicrobial purchases, movements of pigs and farm demographics registered at farm level are used. The algorithm traces batches of pigs retrospectively from slaughter to the farm(s) that housed the pigs during their finisher, weaner, and piglet period. Subsequently, the algorithm estimates the antimicrobial exposure as the number of Animal Defined Daily Doses for treatment of one kg pig in each of the rearing periods. Thus, the antimicrobial purchase data at farm level are translated into antimicrobial exposure estimates at batch level. A batch of pigs is defined here as pigs sent to slaughter on the same day from the same farm. In this study we present, validate, and optimise a computational algorithm that calculates the lifetime exposure of antimicrobials for slaughter pigs. The algorithm was evaluated by comparing the computed estimates to data on antimicrobial usage from farm records in 15 farm units. We found a good positive correlation between the two estimates. The algorithm was run for Danish slaughter pigs sent to slaughter in January to March 2015 from farms with more than 200 finishers to estimate the proportion of farms that it was applicable for. In the final process, the algorithm was successfully run for batches of pigs originating from 3026 farms with finisher units (77% of the initial population). This number can be increased if more accurate register data can be obtained. The algorithm provides a systematic and repeatable approach to estimating the antimicrobial exposure throughout the rearing period, independent of rearing site for finisher batches, as a lifetime exposure measurement. Copyright © 2017 Elsevier B.V. All rights reserved.
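
    A highly simplified skeleton of the exposure calculation described above; the published algorithm traces batches through movement and demographic registers, whereas here a single hypothetical purchase table is apportioned into Animal Defined Daily Doses (ADD), and every number is a placeholder:

        # Illustrative only: not the LEA code.
        purchases_mg = {"piglet": 5.0e5, "weaner": 2.0e6, "finisher": 1.5e6}
        mg_per_add_kg = 10.0   # hypothetical dose treating 1 kg of pig for 1 day
        kg_days = {"piglet": 8.0e4, "weaner": 6.0e5, "finisher": 4.0e6}  # herd mass-days

        for period, mg in purchases_mg.items():
            add_kg = mg / mg_per_add_kg          # number of 1-kg daily doses bought
            exposure = add_kg / kg_days[period]  # doses per kg-day in the period
            print(f"{period}: {exposure:.5f} ADD per kg-day")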

  11. A multi-run chemistry module for the production of [18F]FDG

    NASA Astrophysics Data System (ADS)

    Sipe, B.; Murphy, M.; Best, B.; Zigler, S.; Lim, J.; Dorman, E.; Mangner, T.; Weichelt, M.

    2001-07-01

    We have developed a new chemistry module for the production of up to four batches of [18F]FDG. Prior to starting a batch sequence, the module automatically performs a series of self-diagnostic tests, including a reagent detection sequence. The module then executes a user-defined production sequence followed by an automated process to rinse tubing, valves, and the reaction vessel prior to the next production sequence. Process feedback from the module is provided to a graphical user interface by mass flow controllers, radiation detectors, a pressure switch, a pressure transducer, and an IR temperature sensor. This paper will describe the module, the operating system, and the results of multi-site trials, including production data and quality control results.

  12. Using Forensics to Untangle Batch Effects in TCGA Data - TCGA

    Cancer.gov

    Rehan Akbani, Ph.D., and colleagues at the University of Texas MD Anderson Cancer Center developed a tool called MBatch to detect, diagnose, and correct batch effects in TCGA data. Read more about batch effects in this Case Study.

  13. [Evaluation of pipetting systems. III. Micropipette precision in a routine task].

    PubMed

    Salas, R; Loría, A; Rocha, C

    1995-01-01

    To establish a norm of the precision achievable with a micropipette in an IRMA assay under routine conditions. A micropipette (Gilson) adjusted to dispense 100 microL was used by a single analyst with experience in its use. In each assay, ten aliquots of radioactive antiprolactin were pipetted in clean tubes (PRE-batch tubes), followed by pipetting of the tubes being processed in the assay, and at the end, a second pipetting of 10 aliquots in clean tubes (POST-batch tubes). The study includes the data of 15 consecutive batches during a seven month period with an overall mean of 283 tubes per batch. The PRE- and POST-tubes were read in a gamma counter (Crystal plus). The mean, SD and CV for PRE, POST and global (PRE+POST) tubes were calculated for each batch. The global CV of the 15 batches ranged from 1.6 to 6.9%, mean of 3.1%. We found no evidence of increased imprecision due to fatigue of the analyst, but surprisingly, we observed that in nine of the 15 batches there was a significant difference in the means of the PRE-tubes vs the POST-tubes (t test) without differences in precision. Thus, part of the global variability is due to what we have called pseudoimprecision (i.e. an increase in CV due to differences in means). In addition, the POST-tubes had higher values in the first 7 batches but the opposite occurred in the last 8 batches (table 2). This shift in the sign of the PRE-POST differences suggests the presence of opposite factors operating in time, i.e. one or more factors increased the volume of pipetting after using the pipette more than 150 times (batches 1-7) whereas other/others decreased it (batches 8-15). 1. Our first approximation to a norm of micropipetting precision in batches of 200-300 tubes was a CV of 3.1%. 2. This norm was influenced by a problem of pseudoimprecision detected ex-post-facto. 3. Our findings justify continuation studies to detect the pseudoimprecision and evaluate its causes prospectively.
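
    The pseudoimprecision observed above has a simple arithmetic basis: pooling two equal-sized groups with a common within-group SD s but means separated by d gives a pooled variance of

        s_{pooled}^2 = s^2 + \frac{d^2}{4}

    so, for example, a within-group CV of 2% combined with a 3% shift between the PRE and POST means yields a global CV of sqrt(2^2 + 1.5^2) = 2.5%, even though neither group became less precise.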

  14. Photochemical transformations accelerated in continuous-flow reactors: basic concepts and applications.

    PubMed

    Su, Yuanhai; Straathof, Natan J W; Hessel, Volker; Noël, Timothy

    2014-08-18

    Continuous-flow photochemistry is used increasingly by researchers in academia and industry to facilitate photochemical processes and their subsequent scale-up. However, without detailed knowledge concerning the engineering aspects of photochemistry, it can be quite challenging to develop a suitable photochemical microreactor for a given reaction. In this review, we provide an up-to-date overview of both technological and chemical aspects associated with photochemical processes in microreactors. Important design considerations, such as light sources, material selection, and solvent constraints are discussed. In addition, a detailed description of photon and mass-transfer phenomena in microreactors is made and fundamental principles are deduced for making a judicious choice for a suitable photomicroreactor. The advantages of microreactor technology for photochemistry are described for UV and visible-light driven photochemical processes and are compared with their batch counterparts. In addition, different scale-up strategies and limitations of continuous-flow microreactors are discussed. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  15. Turbine blade processing

    NASA Technical Reports Server (NTRS)

    1975-01-01

    Space processing of directionally solidified eutectic-alloy type turbine blades is envisioned as a simple remelt operation in which precast blades are remelted in a preformed mold. Process systems based on induction melting, continuous resistance furnaces, and batch resistance furnaces were evaluated. The batch resistance furnace type process using a multiblade mold is considered to offer the best possibility for turbine blade processing.

  16. Quality-by-Design approach to monitor the operation of a batch bioreactor in an industrial avian vaccine manufacturing process.

    PubMed

    Largoni, Martina; Facco, Pierantonio; Bernini, Donatella; Bezzo, Fabrizio; Barolo, Massimiliano

    2015-10-10

    Monitoring batch bioreactors is a complex task, due to the fact that several sources of variability can affect a running batch and impact on the final product quality. Additionally, the product quality itself may not be measurable on line, but requires sampling and lab analysis taking several days to be completed. In this study we show that, by using appropriate process analytical technology tools, the operation of an industrial batch bioreactor used in avian vaccine manufacturing can be effectively monitored as the batch progresses. Multivariate statistical models are built from historical databases of batches already completed, and they are used to enable the real time identification of the variability sources, to reliably predict the final product quality, and to improve process understanding, paving the way to a reduction of final product rejections, as well as to a reduction of the product cycle time. It is also shown that the product quality "builds up" mainly during the first half of a batch, suggesting on the one side that reducing the variability during this period is crucial, and on the other side that the batch length can possibly be shortened. Overall, the study demonstrates that, by using a Quality-by-Design approach centered on the appropriate use of mathematical modeling, quality can indeed be built "by design" into the final product, whereas the role of end-point product testing can progressively reduce its importance in product manufacturing. Copyright © 2015 Elsevier B.V. All rights reserved.
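
    A compact sketch of the multivariate batch-monitoring idea (batch-wise unfolded data, PCA, and a Hotelling T-squared limit); the preprocessing, variable set, and limits used in the article are not reproduced here, and all data are synthetic:

        import numpy as np
        from scipy.stats import f as f_dist

        rng = np.random.default_rng(2)
        X = rng.normal(size=(40, 12))      # 40 historical batches x 12 unfolded features
        mu, sd = X.mean(axis=0), X.std(axis=0)
        Xc = (X - mu) / sd
        _, S, Vt = np.linalg.svd(Xc, full_matrices=False)
        a = 3                              # retained principal components
        lam = (S[:a] ** 2) / (len(X) - 1)  # score variances

        def t2(batch_row):
            t = ((batch_row - mu) / sd) @ Vt[:a].T
            return float(np.sum(t ** 2 / lam))

        n = len(X)
        limit = a * (n - 1) * (n + 1) / (n * (n - a)) * f_dist.ppf(0.99, a, n - a)
        print(t2(X[0]), "99% limit:", limit)

    A new batch whose T-squared exceeds the limit would be flagged for investigation of its variability sources.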

  17. Development of High Temperature (3400F) and High Pressure (27,000 PSI) Gas Venting Process for Nitrogen Batch Heater

    DTIC Science & Technology

    2018-01-01

    ...for Mach 14, possibly degrading the seals' ability to contain pressure due to exposure to high temperatures; a different solution for the Mach 14 case will be... (AEDC-TR-18-H-1; contract FA9101-10-D-0001-0010)

  18. Bootsie: estimation of coefficient of variation of AFLP data by bootstrap analysis

    USDA-ARS?s Scientific Manuscript database

    Bootsie is an English-native replacement for ASG Coelho’s “DBOOT” utility for estimating coefficient of variation of a population of AFLP marker data using bootstrapping. Bootsie improves on DBOOT by supporting batch processing, time-to-completion estimation, built-in graphs, and a suite of export t...
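
    The record is truncated, but the bootstrap procedure such tools implement is compact enough to sketch. A minimal sketch with synthetic presence/absence marker data and a per-individual band-frequency statistic; this illustrates the resampling idea only, not Bootsie's exact estimator.

        # Bootstrap estimate of a coefficient of variation (CV): resample
        # individuals with replacement, recompute the CV each time, and read
        # a confidence interval off the resulting distribution.
        import numpy as np

        rng = np.random.default_rng(42)
        markers = rng.binomial(1, 0.3, size=(50, 200))  # 50 individuals x 200 bands
        scores = markers.mean(axis=1)                   # per-individual band frequency

        def cv(x):
            return x.std(ddof=1) / x.mean()

        boot = np.array([cv(rng.choice(scores, size=scores.size, replace=True))
                         for _ in range(2000)])
        print(f"CV = {cv(scores):.3f}, 95% CI = "
              f"({np.quantile(boot, 0.025):.3f}, {np.quantile(boot, 0.975):.3f})")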

  19. Slide Set: Reproducible image analysis and batch processing with ImageJ.

    PubMed

    Nanes, Benjamin A

    2015-11-01

    Most imaging studies in the biological sciences rely on analyses that are relatively simple. However, manual repetition of analysis tasks across multiple regions in many images can complicate even the simplest analysis, making record keeping difficult, increasing the potential for error, and limiting reproducibility. While fully automated solutions are necessary for very large data sets, they are sometimes impractical for the small- and medium-sized data sets common in biology. Here we present the Slide Set plugin for ImageJ, which provides a framework for reproducible image analysis and batch processing. Slide Set organizes data into tables, associating image files with regions of interest and other relevant information. Analysis commands are automatically repeated over each image in the data set, and multiple commands can be chained together for more complex analysis tasks. All analysis parameters are saved, ensuring transparency and reproducibility. Slide Set includes a variety of built-in analysis commands and can be easily extended to automate other ImageJ plugins, reducing the manual repetition of image analysis without the set-up effort or programming expertise required for a fully automated solution.

  20. Results of Hg speciation testing on DWPF SMECT-8, OGCT-1, AND OGCT-2 samples

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bannochie, C.

    2016-02-22

    The Savannah River National Laboratory (SRNL) was tasked with preparing and shipping samples for Hg speciation by Eurofins Frontier Global Sciences, Inc. in Seattle, WA on behalf of the Savannah River Remediation (SRR) Mercury Task Team. The sixteenth shipment of samples was designated to include a Defense Waste Processing Facility (DWPF) Slurry Mix Evaporator Condensate Tank (SMECT) sample from Sludge Receipt and Adjustment Tank (SRAT) Batch 738 processing and two Off-Gas Condensate Tank (OGCT) samples, one following Batch 736 and one following Batch 738. The DWPF sample designations for the three samples analyzed are provided. The Batch 738 ‘End of SME Cycle’ SMECT sample was taken at the conclusion of Slurry Mix Evaporator (SME) operations for this batch and represents the fourth SMECT sample examined from Batch 738. Batch 738 experienced a sludge slurry carryover event, which introduced sludge solids to the SMECT that were particularly evident in the SMECT-5 sample, but less evident in the ‘End of SME Cycle’ SMECT-8 sample.

  1. Fenton-like Degradation of MTBE: Effects of Iron Counter Anion and Radical Scavengers

    EPA Science Inventory

    Fenton-driven oxidation of Methyl tert-butyl ether (MTBE) (0.11-0.16 mM) in batch reactors containing ferric iron (5 mM), hydrogen peroxide (H2O2) (6 mM) (pH=3) was performed to investigate MTBE transformation mechanisms. Independent variables included the form of iron (Fe) (Fe2(...

  2. Cell-controlled hybrid perfusion fed-batch CHO cell process provides significant productivity improvement over conventional fed-batch cultures.

    PubMed

    Hiller, Gregory W; Ovalle, Ana Maria; Gagnon, Matthew P; Curran, Meredith L; Wang, Wenge

    2017-07-01

    A simple method originally designed to control lactate accumulation in fed-batch cultures of Chinese Hamster Ovary (CHO) cells has been modified and extended to allow cells in culture to control their own rate of perfusion to precisely deliver nutritional requirements. The method allows for very fast expansion of cells to high density while using a minimal volume of concentrated perfusion medium. When the short-duration cell-controlled perfusion is performed in the production bioreactor and is immediately followed by a conventional fed-batch culture using highly concentrated feeds, the overall productivity of the culture is approximately doubled when compared with a highly optimized state-of-the-art fed-batch process. The technology was applied with near uniform success to five CHO cell processes producing five different humanized monoclonal antibodies. The increases in productivity were due to the increases in sustained viable cell densities. Biotechnol. Bioeng. 2017;114: 1438-1447. © 2017 Wiley Periodicals, Inc.
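
    The abstract does not give the control law, so the following is only a schematic stand-in for a culture-driven feed: it assumes that a pH rise past a setpoint signals lactate/nutrient depletion (cells consuming lactate raise the pH) and answers it with a fixed pulse of concentrated medium. Thresholds and volumes are invented for illustration.

        # Schematic sketch (not the published controller): a pH reading above
        # a trigger level releases one pulse of concentrated perfusion medium.
        def feed_pulse_controller(pH, pH_high=7.05, pulse_ml=5.0):
            """Return medium volume (mL) to deliver for one control interval."""
            return pulse_ml if pH >= pH_high else 0.0

        for pH in [6.98, 7.00, 7.06, 7.02, 7.07]:   # toy logged pH readings
            print(pH, "->", feed_pulse_controller(pH), "mL")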

  3. Application of process analytical technology for monitoring freeze-drying of an amorphous protein formulation: use of complementary tools for real-time product temperature measurements and endpoint detection.

    PubMed

    Schneid, Stefan C; Johnson, Robert E; Lewis, Lavinia M; Stärtzel, Peter; Gieseler, Henning

    2015-05-01

    Process analytical technology (PAT) and quality by design have gained importance in all areas of pharmaceutical development and manufacturing. One important method for monitoring critical product attributes and for process optimization in laboratory-scale freeze-drying is manometric temperature measurement (MTM). A drawback of this innovative technology is that problems are encountered when processing highly concentrated amorphous materials, particularly protein formulations. In this study, a model solution of bovine serum albumin and sucrose was lyophilized at both conservative and aggressive primary drying conditions. Different temperature sensors were employed to monitor product temperatures. The residual moisture content at the primary drying endpoints indicated by the temperature sensors and by the batch PAT methods was quantified from extracted sample vials. The data from the temperature probes were then used to recalculate critical product parameters, and the results were compared with MTM data. The drying endpoints indicated by the temperature sensors were not suitable for endpoint indication, in contrast to the batch methods' endpoints. The accuracy of the MTM ice vapor pressure (Pice) data was found to be influenced by water reabsorption. Recalculation of product resistance (Rp) and Pice values based on data from temperature sensors and weighed vials was possible. Overall, extensive information about critical product parameters could be obtained using data from complementary PAT tools. © 2015 Wiley Periodicals, Inc. and the American Pharmacists Association.

  4. Assessment of NASA Dual Microstructure Heat Treatment Method for Multiple Forging Batch Heat Treatment

    NASA Technical Reports Server (NTRS)

    Gayda, John (Technical Monitor); Lemsky, Joe

    2004-01-01

    NASA dual microstructure heat treatment (DMHT) technology, previously demonstrated on single-forging heat-treat batches of a generic disk shape, was successfully applied to a multiple-disk batch of a production-shape component. A group of four Rolls-Royce Corporation 3rd Stage AE2100 forgings produced from alloy ME209 were successfully dual microstructure heat treated as a single heat-treat batch. The forgings responded uniformly, as evidenced by part-to-part consistent thermocouple recordings, resultant macrostructures, and ultrasonic examination. Multiple-disk DMHT processing offers a low-cost alternative to other published dual microstructure processing techniques.

  5. A standard-driven approach for electronic submission to pharmaceutical regulatory authorities.

    PubMed

    Lin, Ching-Heng; Chou, Hsin-I; Yang, Ueng-Cheng

    2018-03-01

    Using standards is not only useful for data interchange during the process of a clinical trial, but also useful for analyzing data in a review process. Any step that speeds up approval of new drugs may benefit patients. As a result, adopting standards for regulatory submission becomes mandatory in some countries. However, preparing standard-compliant documents, such as an annotated case report form (aCRF), needs a great deal of knowledge and experience. The process is complex and labor-intensive. Therefore, there is a need to use information technology to facilitate this process. Instead of standardizing data after the completion of a clinical trial, this study proposed a standard-driven approach. This approach was achieved by implementing a computer-assisted "standard-driven pipeline (SDP)" in an existing clinical data management system. SDP used CDISC standards to drive all processes of a clinical trial, such as the design, data acquisition, tabulation, etc. A completed phase I/II trial was used to prove the concept and to evaluate the effects of this approach. By using the CDISC-compliant question library, aCRFs were generated automatically when the eCRFs were completed. For comparison purposes, the data collection process was simulated and the collected data were transformed by the SDP. This new approach reduced the number of missing data fields during tabulation from sixty-two to eight and the number of controlled-terminology mismatch fields from eight to zero. This standard-driven approach accelerated CRF annotation and assured data tabulation integrity. The benefits of this approach include an improvement in the use of standards during the clinical trial and a reduction in missing and unexpected data during tabulation. The standard-driven approach is an advanced design idea that can be used for future clinical information system development. Copyright © 2018 Elsevier Inc. All rights reserved.

  6. SLUDGE BATCH 7B QUALIFICATION ACTIVITIES WITH SRS TANK FARM SLUDGE

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pareizs, J.; Click, D.; Lambert, D.

    2011-11-16

    Waste Solidification Engineering (WSE) has requested that characterization and a radioactive demonstration of the next batch of sludge slurry - Sludge Batch 7b (SB7b) - be completed in the Shielded Cells Facility of the Savannah River National Laboratory (SRNL) via a Technical Task Request (TTR). This characterization and demonstration, or sludge batch qualification process, is required prior to transfer of the sludge from Tank 51 to the Defense Waste Processing Facility (DWPF) feed tank (Tank 40). The current WSE practice is to prepare sludge batches in Tank 51 by transferring sludge from other tanks. Discharges of nuclear materials from H Canyon are often added to Tank 51 during sludge batch preparation. The sludge is washed and transferred to Tank 40, the current DWPF feed tank. Prior to transfer of Tank 51 to Tank 40, SRNL typically simulates the Tank Farm and DWPF processes with a Tank 51 sample (referred to as the qualification sample). With the tight schedule constraints for SB7b and the potential need for caustic addition to allow for an acceptable glass processing window, the qualification for SB7b was approached differently than past batches. For SB7b, SRNL prepared a Tank 51 and a Tank 40 sample for qualification. SRNL did not receive the qualification sample from Tank 51 nor did it simulate all of the Tank Farm washing and decanting operations. Instead, SRNL prepared a Tank 51 SB7b sample from samples of Tank 7 and Tank 51, along with a wash solution to adjust the supernatant composition to the final SB7b Tank 51 Tank Farm projections. SRNL then prepared a sample to represent SB7b in Tank 40 by combining portions of the SRNL-prepared Tank 51 SB7b sample and a Tank 40 Sludge Batch 7a (SB7a) sample. The blended sample was 71% Tank 40 (SB7a) and 29% Tank 7/Tank 51 on an insoluble solids basis. This sample is referred to as the SB7b Qualification Sample. The blend represented the highest projected Tank 40 heel (as of May 25, 2011), and thus, the highest projected noble metals content for SB7b. Characterization was performed on the Tank 51 SB7b samples and SRNL performed DWPF simulations using the Tank 40 SB7b material. This report documents: (1) The preparation and characterization of the Tank 51 SB7b and Tank 40 SB7b samples. (2) The performance of a DWPF Chemical Process Cell (CPC) simulation using the SB7b Tank 40 sample. The simulation included a Sludge Receipt and Adjustment Tank (SRAT) cycle, where acid was added to the sludge to destroy nitrite and reduce mercury, and a Slurry Mix Evaporator (SME) cycle, where glass frit was added to the sludge in preparation for vitrification. The SME cycle also included replication of five canister decontamination additions and concentrations. Processing parameters were based on work with a nonradioactive simulant. (3) Vitrification of a portion of the SME product and characterization and durability testing (as measured by the Product Consistency Test (PCT)) of the resulting glass. (4) Rheology measurements of the SRAT receipt, SRAT product, and SME product. This program was controlled by a Task Technical and Quality Assurance Plan (TTQAP), and analyses were guided by an Analytical Study Plan. This work is Technical Baseline Research and Development (R&D) for the DWPF. It should be noted that much of the data in this document has been published in interoffice memoranda. The intent of this technical report is to bring all of the SB7b-related data together in a single permanent record and to discuss the overall aspects of SB7b processing.

  7. U.S.-MEXICO BORDER PROGRAM ARIZONA BORDER STUDY--STANDARD OPERATING PROCEDURE FOR KEYPUNCH TRACKING, CUSTODY, AND DATA TRANSFER (UA-C-6.0)

    EPA Science Inventory

    The purpose of this SOP is to describe this sub-routine within the overall field data flow and custody plan. Its purpose is to establish a uniform procedure for the tracking of physical field forms and questionnaires while at keypunch. It applies to all data processing batches ...

  8. Batch-to-batch uniformity of bacterial community succession and flavor formation in the fermentation of Zhenjiang aromatic vinegar.

    PubMed

    Wang, Zong-Min; Lu, Zhen-Ming; Yu, Yong-Jian; Li, Guo-Quan; Shi, Jin-Song; Xu, Zheng-Hong

    2015-09-01

    Solid-state fermentation of traditional Chinese vinegar is a mixed-culture refreshment process that has proceeded for many centuries without spoilage. Here, we investigated bacterial community succession and flavor formation in three batches of Zhenjiang aromatic vinegar using pyrosequencing and metabolomics approaches. Temporal patterns of bacterial succession in the Pei (solid-state vinegar culture) showed no significant difference (P > 0.05) among the three batches of fermentation. In all the batches investigated, the average number of community operational taxonomic units (OTUs) decreased dramatically from 119 ± 11 on day 1 to 48 ± 16 on day 3, and then remained in the range of 61 ± 9 from day 5 to the end of fermentation. We confirmed that, within a single batch of the fermentation process, the patterns of bacterial diversity between the starter (taken from the previous batch of vinegar culture on day 7) and the Pei on day 7 were similar (90%). The relative abundance dynamics of two dominant members, Lactobacillus and Acetobacter, showed high correlation (coefficients of 0.90 and 0.98, respectively) among different batches. Furthermore, statistical analysis revealed that the dynamics of 16 main flavor metabolites were stable among different batches. The findings validate that batch-to-batch uniformity of bacterial community succession and flavor formation accounts for the quality of Zhenjiang aromatic vinegar. To our knowledge, this is the first study that helps to explain the rationality of this age-old artistry from a scientific perspective. Copyright © 2015 Elsevier Ltd. All rights reserved.

  9. Correcting for batch effects in case-control microbiome studies

    PubMed Central

    Gibbons, Sean M.; Duvallet, Claire

    2018-01-01

    High-throughput data generation platforms, such as mass spectrometry, microarrays, and second-generation sequencing, are susceptible to batch effects due to run-to-run variation in reagents, equipment, protocols, or personnel. Currently, batch correction methods are not commonly applied to microbiome sequencing datasets. In this paper, we compare different batch-correction methods applied to microbiome case-control studies. We introduce a model-free normalization procedure where features (i.e., bacterial taxa) in case samples are converted to percentiles of the equivalent features in control samples within a study prior to pooling data across studies. We examine how this percentile-normalization method compares to traditional meta-analysis methods for combining independent p-values and to limma and ComBat, widely used batch-correction models developed for RNA microarray data. Overall, we show that percentile-normalization is a simple, non-parametric approach for correcting batch effects and improving sensitivity in case-control meta-analyses. PMID:29684016
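
    The percentile-normalization procedure described above is simple enough to sketch directly. This version assumes abundance matrices with samples in rows and taxa in columns, and uses scipy's percentileofscore; tie handling may differ from the authors' implementation.

        # Each case sample's feature value is replaced by its percentile within
        # the study's control samples, per feature, before pooling studies.
        import numpy as np
        from scipy.stats import percentileofscore

        def percentile_normalize(cases, controls):
            """cases: (n_case, n_taxa); controls: (n_ctrl, n_taxa)."""
            out = np.empty(cases.shape, dtype=float)
            for j in range(cases.shape[1]):
                out[:, j] = [percentileofscore(controls[:, j], v)
                             for v in cases[:, j]]
            return out

        rng = np.random.default_rng(1)
        controls = rng.lognormal(size=(30, 4))
        cases = rng.lognormal(mean=0.5, size=(20, 4))
        print(percentile_normalize(cases, controls)[:2].round(1))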

  10. Adaptive Batch Mode Active Learning.

    PubMed

    Chakraborty, Shayok; Balasubramanian, Vineeth; Panchanathan, Sethuraman

    2015-08-01

    Active learning techniques have gained popularity to reduce human effort in labeling data instances for inducing a classifier. When faced with large amounts of unlabeled data, such algorithms automatically identify the exemplar and representative instances to be selected for manual annotation. More recently, there have been attempts toward a batch mode form of active learning, where a batch of data points is simultaneously selected from an unlabeled set. Real-world applications require adaptive approaches for batch selection in active learning, depending on the complexity of the data stream in question. However, the existing work in this field has primarily focused on static or heuristic batch size selection. In this paper, we propose two novel optimization-based frameworks for adaptive batch mode active learning (BMAL), where the batch size as well as the selection criteria are combined in a single formulation. We exploit gradient-descent-based optimization strategies as well as properties of submodular functions to derive the adaptive BMAL algorithms. The solution procedures have the same computational complexity as existing state-of-the-art static BMAL techniques. Our empirical results on the widely used VidTIMIT and the mobile biometric (MOBIO) data sets portray the efficacy of the proposed frameworks and also certify the potential of these approaches in being used for real-world biometric recognition applications.
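
    The paper's contribution is an optimization-based formulation, which is not reproduced here. As a rough illustration of what any batch-mode selector must balance, the greedy stand-in below scores unlabeled points by uncertainty plus a diversity bonus; the function name and the trade-off weight beta are hypothetical.

        # Greedy batch selection: repeatedly pick the point maximizing
        # uncertainty + beta * (distance to points already in the batch).
        import numpy as np

        def select_batch(X, uncertainty, k, beta=1.0):
            chosen = []
            for _ in range(k):
                if chosen:
                    d = np.min(np.linalg.norm(
                        X[:, None, :] - X[None, chosen, :], axis=2), axis=1)
                else:
                    d = np.zeros(len(X))
                score = uncertainty + beta * d
                score[chosen] = -np.inf          # never pick a point twice
                chosen.append(int(np.argmax(score)))
            return chosen

        rng = np.random.default_rng(0)
        X = rng.normal(size=(100, 8))
        u = rng.uniform(size=100)                # e.g. 1 - max class probability
        print(select_batch(X, u, k=5))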

  11. Reduction of extended-spectrum-β-lactamase- and AmpC-β-lactamase-producing Escherichia coli through processing in two broiler chicken slaughterhouses.

    PubMed

    Pacholewicz, Ewa; Liakopoulos, Apostolos; Swart, Arno; Gortemaker, Betty; Dierikx, Cindy; Havelaar, Arie; Schmitt, Heike

    2015-12-23

    Whilst broilers are recognised as a reservoir of extended-spectrum-β-lactamase (ESBL)- and AmpC-β-lactamase (AmpC)-producing Escherichia coli, there is currently limited knowledge on the effect of slaughtering on its concentrations on poultry meat. The aim of this study was to establish the concentration of ESBL/AmpC producing E. coli on broiler chicken carcasses through processing. In addition the changes in ESBL/AmpC producing E. coli concentrations were compared with generic E. coli and Campylobacter. In two slaughterhouses, the surface of the whole carcasses was sampled after 5 processing steps: bleeding, scalding, defeathering, evisceration and chilling. In total, 17 batches were sampled in two different slaughterhouses during the summers of 2012 and 2013. ESBL/AmpC producing E. coli was enumerated on MacConkey agar with 1 mg/l cefotaxime, and the ESBL/AmpC phenotypes and genotypes were characterised. The ESBL/AmpC producing E. coli concentrations varied significantly between the incoming batches in both slaughterhouses. The concentrations on broiler chicken carcasses were significantly reduced during processing. In Slaughterhouse 1, all subsequent processing steps reduced the concentrations except evisceration, which led to a slight increase that was statistically not significant. The changes in concentration between processing steps were relatively similar for all sampled batches in this slaughterhouse. In contrast, changes varied between batches in Slaughterhouse 2, and the overall reduction through processing was higher in Slaughterhouse 2. Changes in ESBL/AmpC producing E. coli along the processing line were similar to changes in generic E. coli in both slaughterhouses. The effect of defeathering differed between ESBL/AmpC producing E. coli and Campylobacter. ESBL/AmpC producing E. coli decreased after defeathering, whereas Campylobacter concentrations increased. The genotypes of ESBL/AmpC producing E. coli (blaCTX-M-1, blaSHV-12, blaCMY-2, blaTEM-52c, blaTEM-52cvar) from both slaughterhouses match typical poultry genotypes. Their distribution differed between batches and changed throughout processing for some batches. The concentration levels found after chilling were between 10^2 and 10^5 CFU/carcass. To conclude, changes in ESBL/AmpC producing E. coli concentrations on broiler chicken carcasses during processing are influenced by batch and slaughterhouse, pointing to the role of both primary production and process control for reducing ESBL/AmpC producing E. coli levels in final products. Due to similar changes upon processing, E. coli can be used as a process indicator of ESBL/AmpC producing E. coli, because the processing steps had similar impact on both organisms. Cross contamination may potentially explain shifts in genotypes within some batches through the processing. Copyright © 2015 Elsevier B.V. All rights reserved.

  12. Neuro-estimator based GMC control of a batch reactive distillation.

    PubMed

    Prakash, K J Jithin; Patle, Dipesh S; Jana, Amiya K

    2011-07-01

    In this paper, an artificial neural network (ANN)-based nonlinear control algorithm is proposed for a simulated batch reactive distillation (RD) column. In the homogeneously catalyzed reactive process, an esterification reaction takes place for the production of ethyl acetate. The fundamental model has been derived by incorporating the reaction term in the model structure of the nonreactive distillation process. The process operation is simulated at the startup phase under total reflux conditions. The open-loop process dynamics is also addressed by running the batch process in the production phase under partial reflux conditions. In this study, a neuro-estimator based generic model controller (GMC), which consists of an ANN-based state predictor and the GMC law, has been synthesized. Finally, this proposed control law has been tested on the representative batch reactive distillation column and compared with a gain-scheduled proportional integral (GSPI) controller and with its ideal performance (ideal GMC). Copyright © 2011 ISA. Published by Elsevier Ltd. All rights reserved.
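
    The GMC law itself is standard and compact: choose the input u so that the model's predicted dy/dt tracks a PI-shaped reference trajectory. A sketch for a scalar process dy/dt = f(y) + g(y)u, with toy f and g standing in for the column model whose unmeasured states the paper supplies through the ANN estimator.

        # Generic model control for dy/dt = f(y) + g(y)*u: the input makes the
        # closed loop follow dy/dt = K1*e + K2*integral(e), with e = y_sp - y.
        def gmc_control(y, y_sp, e_int, f, g, K1=2.0, K2=0.1):
            e = y_sp - y
            return (K1 * e + K2 * e_int - f(y)) / g(y)

        f = lambda y: -0.5 * y                   # toy process dynamics
        g = lambda y: 1.0
        y, e_int, dt = 0.0, 0.0, 0.05
        for _ in range(200):                     # simple Euler closed loop
            u = gmc_control(y, 1.0, e_int, f, g)
            y += (f(y) + g(y) * u) * dt
            e_int += (1.0 - y) * dt
        print(round(y, 3))                       # approaches the setpoint 1.0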

  13. Efficient arachidonic acid-rich oil production by Mortierella alpina through a repeated fed-batch fermentation strategy.

    PubMed

    Ji, Xiao-Jun; Zhang, Ai-Hui; Nie, Zhi-Kui; Wu, Wen-Jia; Ren, Lu-Jing; Huang, He

    2014-10-01

    Arachidonic acid (ARA)-rich oil production by Mortierella alpina requires a long fermentation period because of the low growth rate of the filamentous fungus, which results in low ARA-rich oil productivity and hinders industrial mass-scale production. In the present study, different fed-batch strategies were tested to shorten the fermentation period. The results showed that, compared with the batch culture, the fed-batch fermentation strategy shortened the fermentation period from 7 days to 5 days and increased the productivity of ARA-rich oil from 0.9 g/(L·d) to 1.3 g/(L·d). Furthermore, a repeated fed-batch fermentation strategy was adopted to achieve continuous production; with this strategy, the fermentation period was shortened from 40 days to 26 days over a four-cycle repeated fed-batch fermentation. This strategy proved to be convenient and economical for the commercial production of ARA-rich oil. Copyright © 2014 Elsevier Ltd. All rights reserved.

  14. Stepwise heating in Stille polycondensation toward no batch-to-batch variations in polymer solar cell performance.

    PubMed

    Lee, Sang Myeon; Park, Kwang Hyun; Jung, Seungon; Park, Hyesung; Yang, Changduk

    2018-05-14

    For a given π-conjugated polymer, the batch-to-batch variations in molecular weight (Mw) and polydispersity index (Ð) can lead to inconsistent process-dependent material properties and consequent performance variations in the device application. Using a stepwise-heating protocol in the Stille polycondensation in conjunction with optimized processing, we obtained an ultrahigh-quality PTB7 polymer having high Mw and very narrow Ð. The resulting ultrahigh-quality polymer-based solar cells demonstrate up to 9.97% power conversion efficiencies (PCEs), which is over 24% enhancement from the control devices fabricated with commercially available PTB7. Moreover, we observe almost negligible batch-to-batch variations in the overall PCE values from ultrahigh-quality polymer-based devices. The proposed stepwise polymerization demonstrates a facile and effective strategy for synthesizing high-quality semiconducting polymers that can significantly improve device yield in polymer-based solar cells, an important factor for the commercialization of organic solar cells, by mitigating device-to-device variations.

  15. 40 CFR Table 6 to Subpart Jjj of... - Known Organic HAP Emitted From the Production of Thermoplastic Products

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    [Table excerpt] Among the known organic HAP listed (including styrene, CAS 100-42-5), check marks apply to ABS latex and to ABS produced by batch emulsion, batch suspension, continuous emulsion, and continuous mass processes; the excerpt continues with ASA and is truncated in the source.

  16. 40 CFR Table 6 to Subpart Jjj of... - Known Organic HAP Emitted From the Production of Thermoplastic Products

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    [Table excerpt] Among the known organic HAP listed (including styrene, CAS 100-42-5), check marks apply to ABS latex and to ABS produced by batch emulsion, batch suspension, continuous emulsion, and continuous mass processes; the excerpt continues with ASA and is truncated in the source.

  17. 40 CFR Table 6 to Subpart Jjj of... - Known Organic HAP Emitted From the Production of Thermoplastic Products

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    [Table excerpt] Among the known organic HAP listed (including styrene, CAS 100-42-5), check marks apply to ABS latex and to ABS produced by batch emulsion, batch suspension, continuous emulsion, and continuous mass processes; the excerpt continues with ASA and is truncated in the source.

  18. Bacteriophage PRD1 batch experiments to study attachment, detachment and inactivation processes

    NASA Astrophysics Data System (ADS)

    Sadeghi, Gholamreza; Schijven, Jack F.; Behrends, Thilo; Hassanizadeh, S. Majid; van Genuchten, Martinus Th.

    2013-09-01

    Knowledge of virus removal in subsurface environments is pivotal for assessing the risk of viral contamination of water resources and developing appropriate protection measures. Columns packed with sand are frequently used to quantify attachment, detachment and inactivation rates of viruses. Since column transport experiments are very laborious, a common alternative is to perform batch experiments where usually one or two measurements are done assuming equilibrium is reached. It is also possible to perform kinetic batch experiments. In that case, however, it is necessary to monitor changes in the concentration with time. This means that kinetic batch experiments will be almost as laborious as column experiments. Moreover, attachment and detachment rate coefficients derived from batch experiments may differ from those determined using column experiments. The aim of this study was to determine the utility of kinetic batch experiments and investigate the effects of different designs of the batch experiments on estimated attachment, detachment and inactivation rate coefficients. The experiments involved various combinations of container size, sand-water ratio, and mixing method (i.e., rolling or tumbling by pivoting the tubes around their horizontal or vertical axes, respectively). Batch experiments were conducted with clean quartz sand, water at pH 7 and ionic strength of 20 mM, and using the bacteriophage PRD1 as a model virus. Values of attachment, detachment and inactivation rate coefficients were found by fitting an analytical solution of the kinetic model equations to the data. Attachment rate coefficients were found to be systematically higher under tumbling than under rolling conditions because of better mixing and more efficient contact of phages with the surfaces of the sand grains. In both mixing methods, more sand in the container yielded higher attachment rate coefficients. A linear increase in the detachment rate coefficient was observed with increased solid-water ratio using the tumbling method. Given the differences in the attachment rate coefficients, and assuming the same sticking efficiencies since chemical conditions of the batch and column experiments were the same, our results show that collision efficiencies of batch experiments are not the same as those of column experiments. Upscaling of the attachment rate from batch to column experiments hence requires proper understanding of the mixing conditions. Because batch experiments, in which the kinetics are monitored, are as laborious as column experiments, there seems to be no major advantage in performing batch instead of column experiments.
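
    The kinetic model fitted in this study (attachment, detachment, and phase-specific inactivation) reduces to two coupled linear ODEs for free and attached phages. A sketch with illustrative rate constants, not the paper's estimates; in practice the analytical solution would be fitted to measured concentrations over time.

        # dC/dt = -k_att*C + k_det*S - mu_l*C   (free phage)
        # dS/dt =  k_att*C - k_det*S - mu_s*S   (attached phage)
        import numpy as np
        from scipy.integrate import solve_ivp

        def rhs(t, y, k_att, k_det, mu_l, mu_s):
            C, S = y
            return [-k_att * C + k_det * S - mu_l * C,
                    k_att * C - k_det * S - mu_s * S]

        sol = solve_ivp(rhs, (0, 48), [1e6, 0.0],
                        args=(0.2, 0.01, 0.02, 0.05), dense_output=True)
        t = np.linspace(0, 48, 7)
        print(np.log10(sol.sol(t)[0]).round(2))  # log10 free-phage concentration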

  19. JOB BUILDER remote batch processing subsystem

    NASA Technical Reports Server (NTRS)

    Orlov, I. G.; Orlova, T. L.

    1980-01-01

    The functions of the JOB BUILDER remote batch processing subsystem are described. Instructions are given for using it as a component of a display system developed by personnel of the System Programming Laboratory, Institute of Space Research, USSR Academy of Sciences.

  20. CALIBRATION OF SUBSURFACE BATCH AND REACTIVE-TRANSPORT MODELS INVOLVING COMPLEX BIOGEOCHEMICAL PROCESSES

    EPA Science Inventory

    In this study, the calibration of subsurface batch and reactive-transport models involving complex biogeochemical processes was systematically evaluated. Two hypothetical nitrate biodegradation scenarios were developed and simulated in numerical experiments to evaluate the perfor...

  1. Mixture model normalization for non-targeted gas chromatography/mass spectrometry metabolomics data.

    PubMed

    Reisetter, Anna C; Muehlbauer, Michael J; Bain, James R; Nodzenski, Michael; Stevens, Robert D; Ilkayeva, Olga; Metzger, Boyd E; Newgard, Christopher B; Lowe, William L; Scholtens, Denise M

    2017-02-02

    Metabolomics offers a unique integrative perspective for health research, reflecting genetic and environmental contributions to disease-related phenotypes. Identifying robust associations in population-based or large-scale clinical studies demands large numbers of subjects and therefore sample batching for gas-chromatography/mass spectrometry (GC/MS) non-targeted assays. When run over weeks or months, technical noise due to batch and run-order threatens data interpretability. Application of existing normalization methods to metabolomics is challenged by unsatisfied modeling assumptions and, notably, failure to address batch-specific truncation of low abundance compounds. To curtail technical noise and make GC/MS metabolomics data amenable to analyses describing biologically relevant variability, we propose mixture model normalization (mixnorm) that accommodates truncated data and estimates per-metabolite batch and run-order effects using quality control samples. Mixnorm outperforms other approaches across many metrics, including improved correlation of non-targeted and targeted measurements and superior performance when metabolite detectability varies according to batch. For some metrics, particularly when truncation is less frequent for a metabolite, mean centering and median scaling demonstrate comparable performance to mixnorm. When quality control samples are systematically included in batches, mixnorm is uniquely suited to normalizing non-targeted GC/MS metabolomics data due to explicit accommodation of batch effects, run order and varying thresholds of detectability. Especially in large-scale studies, normalization is crucial for drawing accurate conclusions from non-targeted GC/MS metabolomics data.
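
    The mixture model itself (which handles batch-specific truncation) is beyond a short sketch, but the QC-anchored logic it shares with the simpler baselines mentioned above can be shown: estimate each batch's offset from its quality control samples and remove it, per metabolite.

        # Median-center each batch on its QC pools, column-wise; a simplified
        # stand-in for per-metabolite batch-effect removal, not mixnorm itself.
        import numpy as np

        def qc_center(log_abund, batch_ids, is_qc):
            out = log_abund.astype(float).copy()
            for b in np.unique(batch_ids):
                in_b = batch_ids == b
                out[in_b] -= np.nanmedian(log_abund[in_b & is_qc], axis=0)
            return out

        rng = np.random.default_rng(3)
        X = rng.normal(size=(12, 5)) + np.repeat([[0.0], [1.0]], 6, axis=0)
        batches = np.repeat([0, 1], 6)            # two batches of six samples
        qc = np.tile([True, False, False], 4)     # two QC pools per batch
        print(qc_center(X, batches, qc).round(2))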

  2. Dynamic genome-scale metabolic modeling of the yeast Pichia pastoris.

    PubMed

    Saitua, Francisco; Torres, Paulina; Pérez-Correa, José Ricardo; Agosin, Eduardo

    2017-02-21

    Pichia pastoris shows physiological advantages in producing recombinant proteins, compared to other commonly used cell factories. This yeast is mostly grown in dynamic cultivation systems, where the cell's environment is continuously changing and many variables influence process productivity. In this context, a model capable of explaining and predicting cell behavior for the rational design of bioprocesses is highly desirable. Currently, there are five genome-scale metabolic reconstructions of P. pastoris which have been used to predict extracellular cell behavior in stationary conditions. In this work, we assembled a dynamic genome-scale metabolic model for glucose-limited, aerobic cultivations of Pichia pastoris. Starting from an initial model structure for batch and fed-batch cultures, we performed pre/post regression diagnostics to ensure that model parameters were identifiable, significant and sensitive. Once identified, the non-relevant ones were iteratively fixed until a priori robust modeling structures were found for each type of cultivation. Next, the robustness of these reduced structures was confirmed by calibrating the model with new datasets, where no sensitivity, identifiability or significance problems appeared in their parameters. Afterwards, the model was validated for the prediction of batch and fed-batch dynamics in the studied conditions. Lastly, the model was employed as a case study to analyze the metabolic flux distribution of a fed-batch culture and to unravel genetic and process engineering strategies to improve the production of recombinant Human Serum Albumin (HSA). Simulation of single knock-outs indicated that deviation of carbon towards cysteine and tryptophan formation improves HSA production. The deletion of methylene tetrahydrofolate dehydrogenase could increase the HSA volumetric productivity by 630%. Moreover, given specific bioprocess limitations and strain characteristics, the model suggests that implementation of a decreasing specific growth rate during the feed phase of a fed-batch culture results in a 25% increase of the volumetric productivity of the protein. In this work, we formulated a dynamic genome scale metabolic model of Pichia pastoris that yields realistic metabolic flux distributions throughout dynamic cultivations. The model can be calibrated with experimental data to rationally propose genetic and process engineering strategies to improve the performance of a P. pastoris strain of interest.
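
    A dynamic genome-scale simulation of this kind is typically a loop of flux balance analysis solves with concentration updates in between. A minimal dynamic-FBA sketch using cobrapy; the SBML file name, the BiGG-style exchange-reaction ID, and the uptake kinetics are placeholders, not the authors' calibrated model.

        # Each step: bound glucose uptake by Michaelis-Menten kinetics, solve
        # FBA for the growth rate, then Euler-update biomass and glucose.
        import cobra

        model = cobra.io.read_sbml_model("pichia_model.xml")  # placeholder file
        X, glc, dt = 0.1, 20.0, 0.5            # gDW/L, mmol/L, h (illustrative)

        for _ in range(20):
            v_max = 10.0 * glc / (glc + 0.5)   # assumed uptake kinetics
            model.reactions.get_by_id("EX_glc__D_e").lower_bound = -v_max
            sol = model.optimize()
            mu = sol.objective_value           # growth rate, 1/h
            v_glc = sol.fluxes["EX_glc__D_e"]  # mmol/gDW/h, negative = uptake
            X *= 1 + mu * dt
            glc = max(glc + v_glc * X * dt, 0.0)
            if glc == 0.0:
                break
        print(round(X, 3), round(glc, 2))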

  3. Monitoring WLCG with lambda-architecture: a new scalable data store and analytics platform for monitoring at petabyte scale.

    NASA Astrophysics Data System (ADS)

    Magnoni, L.; Suthakar, U.; Cordeiro, C.; Georgiou, M.; Andreeva, J.; Khan, A.; Smith, D. R.

    2015-12-01

    Monitoring the WLCG infrastructure requires the gathering and analysis of a high volume of heterogeneous data (e.g. data transfers, job monitoring, site tests) coming from different services and experiment-specific frameworks to provide a uniform and flexible interface for scientists and sites. The current architecture, where relational database systems are used to store, to process and to serve monitoring data, has limitations in coping with the foreseen increase in the volume (e.g. higher LHC luminosity) and the variety (e.g. new data-transfer protocols and new resource-types, as cloud-computing) of WLCG monitoring events. This paper presents a new scalable data store and analytics platform designed by the Support for Distributed Computing (SDC) group, at the CERN IT department, which uses a variety of technologies each one targeting specific aspects of big-scale distributed data-processing (commonly referred as lambda-architecture approach). Results of data processing on Hadoop for WLCG data activities monitoring are presented, showing how the new architecture can easily analyze hundreds of millions of transfer logs in a few minutes. Moreover, a comparison of data partitioning, compression and file format (e.g. CSV, Avro) is presented, with particular attention given to how the file structure impacts the overall MapReduce performance. In conclusion, the evolution of the current implementation, which focuses on data storage and batch processing, towards a complete lambda-architecture is discussed, with consideration of candidate technology for the serving layer (e.g. Elasticsearch) and a description of a proof of concept implementation, based on Apache Spark and Esper, for the real-time part which compensates for batch-processing latency and automates problem detection and failures.

  4. Kinetics and modeling of hexavalent chromium reduction in Enterobacter cloacae

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yamamoto, Koji; Kato, Junichi; Yano, Takuo

    1993-01-05

    Kinetics of bacterial reduction of toxic hexavalent chromium (chromate: CrO4^2-) was investigated using batch and fed-batch cultures of Enterobacter cloacae strain HO1. In fed-batch cultures, the CrO4^2- feed was controlled on the basis of the rate of pH change. This control strategy has proven to be useful for avoiding toxic CrO4^2- overload. A simple mathematical model was developed to describe the bacterial process of CrO4^2- reduction. In this model, two types of bacterial cells were considered: induced, CrO4^2--resistant cells and uninduced, sensitive ones. Only resistant cells were assumed to be able to reduce CrO4^2-. These fundamental ideas were supported by the model predictions, which approximated all experimental data well. In a simulation study, the model was also used to optimize fed-batch cultures, instead of lengthy and expensive laboratory experiments.
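
    The two-population structure of the model (uninduced cells become induced, and only induced cells reduce chromate) is easy to state as ODEs. A sketch with illustrative parameter values, not those identified in the paper.

        # Xu: uninduced cells, Xi: induced (resistant) cells, Cr: CrO4^2-.
        # Uninduced cells convert at rate k_i; induced cells reduce chromate
        # with Monod-type kinetics.
        from scipy.integrate import solve_ivp

        def rhs(t, y, mu, k_i, q_max, K):
            Xu, Xi, Cr = y
            q = q_max * Cr / (K + Cr)
            return [mu * Xu - k_i * Xu,
                    mu * Xi + k_i * Xu,
                    -q * Xi]

        sol = solve_ivp(rhs, (0, 24), [0.5, 0.0, 1.0],
                        args=(0.2, 0.3, 0.05, 0.1))
        print(sol.y[:, -1].round(3))     # final Xu, Xi and residual Cr(VI)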

  5. Composition and origin of basaltic magma of the Hawaiian Islands

    USGS Publications Warehouse

    Powers, H.A.

    1955-01-01

    Silica-saturated basaltic magma is the source of the voluminous lava flows, erupted frequently and rapidly in the primitive shield-building stage of activity, that form the bulk of each Hawaiian volcano. This magma may be available in batches that differ slightly in free silica content from batch to batch both at the same and at different volcanoes; differentiation by fractionation of olivine does not occur within this primitive magma. Silica-deficient basaltic magma, enriched in alkali, is the source of commonly porphyritic lava flows erupted less frequently and in relatively negligible volume during a declining and decadent stage of activity at some Hawaiian volcanoes. Differentiation by fractionation of olivine, plagioclase and augite is evident among these lavas, but does not account for the silica deficiency or the alkali enrichment. Most of the data of Hawaiian volcanism and petrology can be explained by a hypothesis that batches of magma are melted from crystalline peridotite by a recurrent process (distortion of the equatorial bulge by forced and free nutational stresses) that accomplishes the melting only of the plagioclase and pyroxene component but not the excess olivine and more refractory components within a zone of fixed and limited depth. Eruption exhausts the supply of meltable magma under a given locality and, in the absence of more violent melting processes, leaves a stratum of crystalline refractory components. © 1955.

  6. Bioreactors for high cell density and continuous multi-stage cultivations: options for process intensification in cell culture-based viral vaccine production.

    PubMed

    Tapia, Felipe; Vázquez-Ramírez, Daniel; Genzel, Yvonne; Reichl, Udo

    2016-03-01

    With an increasing demand for efficacious, safe, and affordable vaccines for human and animal use, process intensification in cell culture-based viral vaccine production demands advanced process strategies to overcome the limitations of conventional batch cultivations. However, the use of fed-batch, perfusion, or continuous modes to drive processes at high cell density (HCD) over extended operating times has so far been little explored in large-scale viral vaccine manufacturing. Also, possible reductions in cell-specific virus yields for HCD cultivations have been reported frequently. Taking into account that vaccine production is one of the most heavily regulated industries in the pharmaceutical sector with tough margins to meet, it is understandable that process intensification has only recently been considered by both academia and industry as a next step toward more efficient viral vaccine production processes. Compared to conventional batch processes, fed-batch and perfusion strategies could result in ten to a hundred times higher product yields. Both cultivation strategies can be implemented to achieve cell concentrations exceeding 10^7 cells/mL or even 10^8 cells/mL, while keeping low levels of metabolites that potentially inhibit cell growth and virus replication. The trend towards HCD processes is supported by development of GMP-compliant cultivation platforms, i.e., acoustic settlers, hollow fiber bioreactors, and hollow fiber-based perfusion systems including tangential flow filtration (TFF) or alternating tangential flow (ATF) technologies. In this review, these process modes are discussed in detail and compared with conventional batch processes based on productivity indicators such as space-time yield, cell concentration, and product titers. In addition, options for the production of viral vaccines in continuous multi-stage bioreactors such as two- and three-stage systems are addressed. While such systems have shown similar virus titers compared to batch cultivations, keeping high yields for extended production times is still a challenge. Overall, we demonstrate that process intensification of cell culture-based viral vaccine production can be realized by the consistent application of fed-batch, perfusion, and continuous systems with a significant increase in productivity. The potential for even further improvements is high, considering recent developments in establishment of new (designer) cell lines, better characterization of host cell metabolism, advances in media design, and the use of mathematical models as a tool for process optimization and control.

  7. Petri net model for analysis of concurrently processed complex algorithms

    NASA Technical Reports Server (NTRS)

    Stoughton, John W.; Mielke, Roland R.

    1986-01-01

    This paper presents a Petri-net model suitable for analyzing the concurrent processing of computationally complex algorithms. The decomposed operations are to be processed in a multiple processor, data driven architecture. Of particular interest is the application of the model to both the description of the data/control flow of a particular algorithm, and to the general specification of the data driven architecture. A candidate architecture is also presented.

  8. Event-driven time-optimal control for a class of discontinuous bioreactors.

    PubMed

    Moreno, Jaime A; Betancur, Manuel J; Buitrón, Germán; Moreno-Andrade, Iván

    2006-07-05

    Discontinuous bioreactors may be further optimized for processing inhibitory substrates using a convenient fed-batch mode. To do so the filling rate must be controlled in such a way as to push the reaction rate to its maximum value, by increasing the substrate concentration just up to the point where inhibition begins. However, an exact optimal controller requires measuring several variables (e.g., substrate concentrations in the feed and in the tank) and also good model knowledge (e.g., yield and kinetic parameters), requirements rarely satisfied in real applications. An environmentally important case, that exemplifies all these handicaps, is toxicant wastewater treatment. There the lack of online practical pollutant sensors may allow unforeseen high shock loads to be fed to the bioreactor, causing biomass inhibition that slows down the treatment process and, in extreme cases, even renders the biological process useless. In this work an event-driven time-optimal control (ED-TOC) is proposed to circumvent these limitations. We show how to detect a "there is inhibition" event by using some computable function of the available measurements. This event drives the ED-TOC to stop the filling. Later, by detecting the symmetric event, "there is no inhibition," the ED-TOC may restart the filling. A fill-react cycling then maintains the process safely hovering near its maximum reaction rate, allowing a robust and practically time-optimal operation of the bioreactor. An experimental study case of a wastewater treatment process application is presented. There the dissolved oxygen concentration was used to detect the events needed to drive the controller. (c) 2006 Wiley Periodicals, Inc.
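
    At its core the ED-TOC is a two-event switch around the fill pump. The sketch below uses the sign of a dissolved-oxygen trend as the inhibition indicator purely for illustration; the paper's actual detection functions and thresholds are not reproduced here.

        # Toggle filling on "inhibition" / "no inhibition" events derived from
        # a DO-based indicator (here simply its slope; sign conventions and
        # thresholds are illustrative assumptions).
        def ed_toc_step(filling, do_slope, on_thresh=-0.02, off_thresh=0.01):
            if filling and do_slope < on_thresh:
                return False                  # event: inhibition -> stop fill
            if not filling and do_slope > off_thresh:
                return True                   # event: inhibition over -> refill
            return filling

        state = True
        for slope in [0.00, -0.03, -0.01, 0.02, 0.00]:
            state = ed_toc_step(state, slope)
            print(f"DO slope {slope:+.2f} -> filling={state}")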

  9. U.S. Seismic Design Maps Web Application

    NASA Astrophysics Data System (ADS)

    Martinez, E.; Fee, J.

    2015-12-01

    The application computes earthquake ground motion design parameters compatible with the International Building Code and other seismic design provisions, and it is the primary method for design engineers across the country to obtain ground motion parameters for multiple building codes. Users specify the design code of interest, location, and other parameters to obtain the necessary ground motion information, consisting of a high-level executive summary as well as detailed information including maps, data, and graphs. Results are formatted such that they can be directly included in a final engineering report. In addition to single-site analysis, the application supports a batch mode for simultaneous consideration of multiple locations. Finally, an application programming interface (API) is available that allows other application developers to integrate this application's results into larger applications for additional processing. Development on the application has proceeded in an iterative manner, working with engineers through email, meetings, and workshops. Each iteration provided new features, improved performance, and usability enhancements. This development approach positioned the application to be integral to the structural design process, and it is now used to produce over 1800 reports daily. Recent efforts have enhanced the application to be a data-driven, mobile-first, responsive web application. Development is ongoing, and the source code has recently been published to the open-source community on GitHub. Open-sourcing the code facilitates improved incorporation of user feedback to add new features, ensuring the application's continued success.

  10. 40 CFR 63.1361 - Definitions.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... can be one release or a series of releases over a short time period due to a malfunction in the... or a series of devices. Examples include incinerators, carbon adsorption units, condensers, flares... do not occur simultaneously in a batch operation. A batch process consists of a series of batch...

  11. Quantifying Hydro-biogeochemical Model Sensitivity in Assessment of Climate Change Effect on Hyporheic Zone Processes

    NASA Astrophysics Data System (ADS)

    Song, X.; Chen, X.; Dai, H.; Hammond, G. E.; Song, H. S.; Stegen, J.

    2016-12-01

    The hyporheic zone is an active region for biogeochemical processes such as carbon and nitrogen cycling, where groundwater and surface water with distinct biogeochemical and thermal properties mix and interact with each other. The biogeochemical dynamics within the hyporheic zone are driven by both river water and groundwater hydraulic dynamics, which are directly affected by climate change scenarios. Besides that, the hydraulic and thermal properties of local sediments and microbial and chemical processes also play important roles in biogeochemical dynamics. Thus, for a comprehensive understanding of the biogeochemical processes in the hyporheic zone, a coupled thermo-hydro-biogeochemical model is needed. As multiple uncertainty sources are involved in the integrated model, it is important to identify its key modules/parameters through sensitivity analysis. In this study, we develop a 2D cross-section model of the hyporheic zone at the DOE Hanford site adjacent to the Columbia River and use this model to quantify module and parametric sensitivity in the assessment of climate change. To achieve this purpose, we 1) develop a facies-based groundwater flow and heat transfer model that incorporates facies geometry and heterogeneity characterized from a field data set, 2) derive multiple reaction networks/pathways from batch experiments with in-situ samples and integrate temperature-dependent reactive transport modules into the flow model, 3) assign multiple climate change scenarios to the coupled model by analyzing historical river stage data, and 4) apply a variance-based global sensitivity analysis to quantify scenario/module/parameter uncertainty in a hierarchical manner. The objectives of the research are to 1) identify the key control factors of the coupled thermo-hydro-biogeochemical model in the assessment of climate change, and 2) quantify the carbon consumption under different climate change scenarios in the hyporheic zone.
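
    The variance-based global sensitivity analysis mentioned in step 4 is commonly carried out with Saltelli sampling and Sobol indices. A sketch using SALib on a stand-in response function, since the coupled thermo-hydro-biogeochemical model itself cannot be reproduced here; the parameter names and bounds are invented for illustration.

        # First-order Sobol indices of a toy response standing in for, e.g.,
        # simulated carbon consumption in the hyporheic zone.
        import numpy as np
        from SALib.sample import saltelli
        from SALib.analyze import sobol

        problem = {
            "num_vars": 3,
            "names": ["permeability", "rate_const", "porosity"],
            "bounds": [[1e-12, 1e-10], [0.01, 1.0], [0.2, 0.4]],
        }
        X = saltelli.sample(problem, 512)
        Y = np.log(X[:, 0]) * X[:, 1] + 5.0 * X[:, 2]   # stand-in model output
        Si = sobol.analyze(problem, Y)
        print(dict(zip(problem["names"], Si["S1"].round(2))))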

  12. Vanderbilt University Institute of Imaging Science Center for Computational Imaging XNAT: A multimodal data archive and processing environment.

    PubMed

    Harrigan, Robert L; Yvernault, Benjamin C; Boyd, Brian D; Damon, Stephen M; Gibney, Kyla David; Conrad, Benjamin N; Phillips, Nicholas S; Rogers, Baxter P; Gao, Yurui; Landman, Bennett A

    2016-01-01

    The Vanderbilt University Institute for Imaging Science (VUIIS) Center for Computational Imaging (CCI) has developed a database, built on XNAT, housing over a quarter of a million scans. The database provides a framework for (1) rapid prototyping, (2) large-scale batch processing of images, and (3) scalable project management. The system uses the web-based interfaces of XNAT and REDCap to allow for graphical interaction. A Python middleware layer, the Distributed Automation for XNAT (DAX) package, distributes computation across the Vanderbilt Advanced Computing Center for Research and Education high performance computing center. All software is made available as open source for use in combining portable batch scripting (PBS) grids and XNAT servers. Copyright © 2015 Elsevier Inc. All rights reserved.

  13. ELIMINATION OF THE CHARACTERIZATION OF DWPF POUR STREAM SAMPLE AND THE GLASS FABRICATION AND TESTING OF THE DWPF SLUDGE BATCH QUALIFICATION SAMPLE

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Amoroso, J.; Peeler, D.; Edwards, T.

    2012-05-11

    A recommendation to eliminate all characterization of pour stream glass samples and the glass fabrication and Product Consistency Test (PCT) of the sludge batch qualification sample was made by a Six-Sigma team chartered to eliminate non-value-added activities for the Defense Waste Processing Facility (DWPF) sludge batch qualification program and is documented in the report SS-PIP-2006-00030. That recommendation was supported through a technical data review by the Savannah River National Laboratory (SRNL) and is documented in the memorandums SRNL-PSE-2007-00079 and SRNL-PSE-2007-00080. At the time of writing those memorandums, the DWPF was processing sludge-only waste but has since transitioned to a coupled operation (sludge and salt). The SRNL was recently tasked to perform a similar data review relevant to coupled operations and re-evaluate the previous recommendations. This report evaluates the validity of eliminating the characterization of pour stream glass samples and the glass fabrication and Product Consistency Test (PCT) of the sludge batch qualification samples based on sludge-only and coupled operations. The pour stream sample has confirmed the DWPF's ability to produce an acceptable waste form from Slurry Mix Evaporator (SME) blending and product composition/durability predictions for the previous sixteen years, but ultimately the pour stream analysis has added minimal value to the DWPF's waste qualification strategy. Similarly, the information gained from the glass fabrication and PCT of the sludge batch qualification sample was determined to add minimal value to the waste qualification strategy since that sample is routinely not representative of the waste composition ultimately processed at the DWPF due to blending and salt processing considerations. Moreover, the qualification process has repeatedly confirmed minimal differences in glass behavior from actual radioactive waste to glasses fabricated from simulants or batch chemicals. In contrast, the variability study has significantly added value to the DWPF's qualification strategy. The variability study has evolved to become the primary aspect of the DWPF's compliance strategy as it has been shown to be versatile and capable of adapting to the DWPF's various and diverse waste streams and blending strategies. The variability study, which aims to ensure that the durability requirements and the PCT and chemical composition correlations are valid for the compositional region to be processed at the DWPF, must continue to be performed. Due to the importance of the variability study and its place in the DWPF's qualification strategy, it will also be discussed in this report. An analysis of historical data and Production Records indicated that the recommendation of the Six Sigma team to eliminate all characterization of pour stream glass samples and the glass fabrication and PCT performed with the qualification glass does not compromise the DWPF's current compliance plan. Furthermore, the DWPF should continue to produce an acceptable waste form following the remaining elements of the Glass Product Control Program, regardless of a sludge-only or coupled operations strategy. If the DWPF does decide to eliminate the characterization of pour stream samples, pour stream samples should continue to be collected for archival reasons, which would allow testing to be performed should any issues arise or new repository test methods be developed.

  14. Interactive computer methods for generating mineral-resource maps

    USGS Publications Warehouse

    Calkins, James Alfred; Crosby, A.S.; Huffman, T.E.; Clark, A.L.; Mason, G.T.; Bascle, R.J.

    1980-01-01

    Inasmuch as maps are a basic tool of geologists, the U.S. Geological Survey's CRIB (Computerized Resources Information Bank) was constructed so that the data it contains can be used to generate mineral-resource maps. However, by the standard methods used (batch processing and off-line plotting), the production of a finished map commonly takes 2-3 weeks. To produce computer-generated maps more rapidly, cheaply, and easily, and also to provide an effective demonstration tool, we have devised two related methods for plotting maps as alternatives to conventional batch methods. These methods are: 1. Quick-Plot, an interactive program whose output appears on a CRT (cathode-ray-tube) device, and 2. The Interactive CAM (Cartographic Automatic Mapping system), which combines batch and interactive runs. The output of the Interactive CAM system is final compilation (not camera-ready) paper copy. Both methods are designed to use data from the CRIB file in conjunction with a map-plotting program. Quick-Plot retrieves a user-selected subset of data from the CRIB file, immediately produces an image of the desired area on a CRT device, and plots data points according to a limited set of user-selected symbols. This method is useful for immediate evaluation of the map and for demonstrating how trial maps can be made quickly. The Interactive CAM system links the output of an interactive CRIB retrieval to a modified version of the CAM program, which runs in the batch mode and stores plotting instructions on a disk, rather than on a tape. The disk can be accessed by a CRT, and, thus, the user can view and evaluate the map output on a CRT immediately after a batch run, without waiting 1-3 days for an off-line plot. The user can, therefore, do most of the layout and design work in a relatively short time by use of the CRT, before generating a plot tape and having the map plotted on an off-line plotter.

  15. Implementation of a repeated fed-batch process for the production of chitin-glucan complex by Komagataella pastoris.

    PubMed

    Farinha, Inês; Freitas, Filomena; Reis, Maria A M

    2017-07-25

    The yeast Komagataella pastoris was cultivated under different fed-batch strategies for the production of chitin-glucan complex (CGC), a co-polymer of chitin and β-glucan. The tested fed-batch strategies included DO-stat mode, predefined feeding profile and repeated fed-batch operation. Although high cell dry mass and high CGC production were obtained under the tested DO-stat strategy in a 94 h cultivation (159 and 29 g/L, respectively), the overall biomass and CGC productivities were low (41 and 7.4 g/L/day, respectively). Cultivation with a predefined profile significantly improved both biomass and CGC volumetric productivity (87 and 10.8 g/L/day, respectively). Hence, this strategy was used to implement a repeated fed-batch process comprising 7 consecutive cycles. A daily production of 119-126 g/L of biomass with a CGC content of 11-16 wt% was obtained, thus proving this cultivation strategy is adequate to reach a high CGC productivity that ranged between 11 and 18 g/L/day. The process was stable and reproducible in terms of CGC productivity and polymer composition, making it a promising strategy for further process development. Copyright © 2016 Elsevier B.V. All rights reserved.

  16. Quantitative modeling of viable cell density, cell size, intracellular conductivity, and membrane capacitance in batch and fed-batch CHO processes using dielectric spectroscopy.

    PubMed

    Opel, Cary F; Li, Jincai; Amanullah, Ashraf

    2010-01-01

    Dielectric spectroscopy was used to analyze typical batch and fed-batch CHO cell culture processes. Three methods of analysis (linear modeling, Cole-Cole modeling, and partial least squares regression) were used to correlate the spectroscopic data with routine biomass measurements [viable packed cell volume, viable cell concentration (VCC), cell size, and oxygen uptake rate (OUR)]. All three models predicted offline biomass measurements accurately during the growth phase of the cultures. However, during the stationary and decline phases of the cultures, the models decreased in accuracy to varying degrees. Offline cell radius measurements were unsuccessfully used to correct for the deviations from the linear model, indicating that physiological changes affecting permittivity were occurring. The beta-dispersion was analyzed using the Cole-Cole distribution parameters Δε (magnitude of the permittivity drop), fc (critical frequency), and α (Cole-Cole parameter). Furthermore, the dielectric parameters static internal conductivity (σi) and membrane capacitance per area (Cm) were calculated for the cultures. Finally, the relationship between permittivity, OUR, and VCC was examined, demonstrating how the definition of viability is critical when analyzing biomass online. The results indicate that the common assumptions of constant size and dielectric properties used in dielectric analysis are not always valid during later phases of cell culture processes. The findings also demonstrate that dielectric spectroscopy, while not a substitute for VCC, is a complementary measurement of viable biomass, providing useful auxiliary information about the physiological state of a culture. (c) 2010 American Institute of Chemical Engineers.
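
    For orientation, the Cole-Cole distribution named above is conventionally written as the relaxation model below (standard textbook form, not quoted from this record); its three fitted parameters are exactly the Δε, fc and α referred to:

        \varepsilon^{*}(f) = \varepsilon_{\infty} + \frac{\Delta\varepsilon}{1 + \left( j f / f_c \right)^{1-\alpha}}

    Here ε∞ is the high-frequency permittivity; setting α = 0 recovers the single-relaxation Debye model.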

  17. 40 CFR Table 6 to Subpart Jjj of... - Known Organic HAP Emitted From the Production of Thermoplastic Products

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ...-crylate (80-62-6) Styrene (100-42-5) ABS latex ✔ ✔ ✔ ABS using a batch emulsion process ✔ ✔ ✔ ABS using a batch suspension process ✔ ✔ ✔ ABS using a continuous emulsion process ✔ ✔ ✔ ABS using a continuous mass...

  18. 40 CFR Table 6 to Subpart Jjj of... - Known Organic HAP Emitted From the Production of Thermoplastic Products

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ...-crylate (80-62-6) Styrene (100-42-5) ABS latex ✔ ✔ ✔ ABS using a batch emulsion process ✔ ✔ ✔ ABS using a batch suspension process ✔ ✔ ✔ ABS using a continuous emulsion process ✔ ✔ ✔ ABS using a continuous mass...

  19. Heterotrophs are key contributors to nitrous oxide production in activated sludge under low C-to-N ratios during nitrification-Batch experiments and modeling.

    PubMed

    Domingo-Félez, Carlos; Pellicer-Nàcher, Carles; Petersen, Morten S; Jensen, Marlene M; Plósz, Benedek G; Smets, Barth F

    2017-01-01

    Nitrous oxide (N2O), a by-product of biological nitrogen removal during wastewater treatment, is produced by ammonia-oxidizing bacteria (AOB) and heterotrophic denitrifying bacteria (HB). Mathematical models are used to predict N2O emissions, often including AOB as the main N2O producer. Several model structures have been proposed without consensus calibration procedures. Here, we present a new experimental design that was used to calibrate AOB-driven N2O dynamics of a mixed culture. Even though AOB activity was favoured with respect to HB, oxygen uptake rates indicated HB activity. Hence, rigorous experimental design for calibration of autotrophic N2O production from mixed cultures is essential. The proposed N2O production pathways were examined by confronting five alternative process models with the experimental data. Individually, both the autotrophic and the heterotrophic denitrification pathway could describe the observed data. In the best-fit model, which combined the two denitrification pathways, the heterotrophic contribution to N2O production was stronger than the autotrophic one. Importantly, the individual contributions of autotrophs and heterotrophs to the total N2O pool could not be unambiguously elucidated solely on the basis of bulk N2O measurements. Data on NO would increase the practical identifiability of N2O production pathways. Biotechnol. Bioeng. 2017;114: 132-140. © 2016 Wiley Periodicals, Inc.

  20. Analytical solution of Luedeking-Piret equation for a batch fermentation obeying Monod growth kinetics.

    PubMed

    Garnier, Alain; Gaillet, Bruno

    2015-12-01

    Few fermentation models allow analytical solutions of batch process dynamics. The most widely used is the combination of logistic microbial growth kinetics with the Luedeking-Piret bioproduct synthesis relation. However, the logistic equation is principally based on formalistic similarities and only fits a limited range of fermentation types. In this article, we have developed an analytical solution for the combination of Monod growth kinetics with the Luedeking-Piret relation, which can be identified by linear regression and used to simulate batch fermentation evolution. Two classical examples are used to show the quality of fit and the simplicity of the proposed method. A solution for the Haldane substrate-limited growth model combined with the Luedeking-Piret relation is also provided. These models could prove useful for the analysis of fermentation data in industry as well as academia. © 2015 Wiley Periodicals, Inc.
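
    For context, the two building blocks combined in this record are conventionally written as follows (standard forms, not quoted from the paper):

        \frac{dX}{dt} = \mu X, \qquad \mu = \frac{\mu_{\max} S}{K_S + S} \quad \text{(Monod)}

        \frac{dP}{dt} = \alpha \frac{dX}{dt} + \beta X \quad \text{(Luedeking-Piret)}

    where X, S and P are biomass, substrate and product concentrations, μmax is the maximum specific growth rate, KS the half-saturation constant, and α and β the growth-associated and non-growth-associated product coefficients.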

  1. Prediction of acid hydrolysis of lignocellulosic materials in batch and plug flow reactors.

    PubMed

    Jaramillo, Oscar Johnny; Gómez-García, Miguel Ángel; Fontalvo, Javier

    2013-08-01

    This study unifies contradictory conclusions reported in the literature on acid hydrolysis of lignocellulosic materials in batch and plug flow reactors, regarding the influence of the initial ratio of acid aqueous solution to solid lignocellulosic material on sugar yield and concentration. The proposed model takes into account the volume change of the reaction medium during the hydrolysis process. An error lower than 8% was found between predictions, using a single set of kinetic parameters for several liquid-to-solid ratios, and reported experimental data for batch and plug flow reactors. At low liquid-to-solid ratios, poor wetting and acid neutralization by the ash present in the solid both reduce the sugar yield. This study also shows that the two reactor types are essentially equivalent in terms of the influence of the liquid-to-solid ratio on xylose and glucose yields. Copyright © 2013 Elsevier Ltd. All rights reserved.
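
    A minimal sketch of the consecutive first-order (Saeman-type) kinetics that underlie most dilute-acid hydrolysis models of this kind, integrated with a simple Euler scheme. The rate constants are illustrative placeholders, and the paper's actual model additionally tracks the volume change of the reaction medium:

        import numpy as np

        # Saeman-type consecutive first-order kinetics:
        # polymer -> sugar -> degradation products.
        # k1, k2 are illustrative placeholders, not values from the study.
        def simulate(k1=0.05, k2=0.01, c_polymer0=100.0, t_end=120.0, dt=0.1):
            t = np.arange(0.0, t_end, dt)
            polymer = np.empty_like(t)
            sugar = np.empty_like(t)
            polymer[0], sugar[0] = c_polymer0, 0.0
            for i in range(1, t.size):
                dp = -k1 * polymer[i - 1]
                ds = k1 * polymer[i - 1] - k2 * sugar[i - 1]
                polymer[i] = polymer[i - 1] + dp * dt
                sugar[i] = sugar[i - 1] + ds * dt
            return t, polymer, sugar

        t, polymer, sugar = simulate()
        print(f"peak sugar: {sugar.max():.1f} at t = {t[sugar.argmax()]:.0f} min")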

  2. [Monitoring method for macroporous resin column chromatography process of salvianolic acids based on near infrared spectroscopy].

    PubMed

    Hou, Xiang-Mei; Zhang, Lei; Yue, Hong-Shui; Ju, Ai-Chun; Ye, Zheng-Liang

    2016-07-01

    To study and establish a monitoring method for the macroporous resin column chromatography process of salvianolic acids, near infrared spectroscopy (NIR) was used as a process analytical technology (PAT). A multivariate statistical process control (MSPC) model was developed based on 7 normal operation batches, and 2 test batches (one normal operation batch and one abnormal operation batch) were used to verify the monitoring performance of this model. The results showed that the MSPC model had good monitoring ability for the column chromatography process. Meanwhile, an NIR quantitative calibration model was established for three key quality indexes (rosmarinic acid, lithospermic acid and salvianolic acid B) using the partial least squares (PLS) algorithm. The verification results demonstrated that this model had satisfactory prediction performance. The combined application of the above two models can effectively achieve real-time monitoring of the macroporous resin column chromatography process of salvianolic acids and can be used for on-line analysis of the key quality indexes. The established process monitoring method could serve as a reference for the development of process analytical technology for traditional Chinese medicine manufacturing. Copyright© by the Chinese Pharmaceutical Association.
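
    A minimal sketch of the PLS calibration step described above, using scikit-learn on synthetic stand-in spectra; the library choice, data, and component count are assumptions for illustration, not details from the paper:

        import numpy as np
        from sklearn.cross_decomposition import PLSRegression

        rng = np.random.default_rng(0)
        # Synthetic stand-in for NIR spectra: 50 samples x 200 wavelengths,
        # with the analyte concentration buried in one band plus noise.
        conc = rng.uniform(0.1, 2.0, size=50)  # analyte level, arbitrary units
        band = np.exp(-0.5 * ((np.arange(200) - 80) / 8.0) ** 2)
        spectra = conc[:, None] * band[None, :] + 0.02 * rng.standard_normal((50, 200))

        pls = PLSRegression(n_components=3)
        pls.fit(spectra[:40], conc[:40])            # calibration set
        pred = pls.predict(spectra[40:]).ravel()    # validation set
        rmsep = np.sqrt(np.mean((pred - conc[40:]) ** 2))
        print(f"RMSEP: {rmsep:.3f}")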

  3. Anaerobic sequencing batch reactors for wastewater treatment: a developing technology.

    PubMed

    Zaiat, M; Rodrigues, J A; Ratusznei, S M; de Camargo, E F; Borzani, W

    2001-01-01

    This paper describes and discusses the main problems related to anaerobic batch and fed-batch processes for wastewater treatment. A critical analysis of the literature evaluated the viability of industrial application and proposed alternatives to improve the operation and control of this system. Two approaches were presented in order to make this anaerobic discontinuous process feasible for industrial application: (1) optimization of the operating procedures in reactors containing self-immobilized sludge as granules, and (2) design of bioreactors with inert support media for biomass immobilization.

  4. 40 CFR 98.128 - Definitions.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... processing. An isolated intermediate is usually a product of chemical synthesis. Storage of an isolated... withdrawal of product do not occur simultaneously in a batch operation. Batch emission episode means a... process operations. By-product means a chemical that is produced coincidentally during the production of...

  5. 40 CFR 98.128 - Definitions.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... stored before subsequent processing. An isolated intermediate is usually a product of chemical synthesis... withdrawal of product do not occur simultaneously in a batch operation. Batch emission episode means a... process operations. By-product means a chemical that is produced coincidentally during the production of...

  6. 40 CFR 98.128 - Definitions.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... processing. An isolated intermediate is usually a product of chemical synthesis. Storage of an isolated... withdrawal of product do not occur simultaneously in a batch operation. Batch emission episode means a... process operations. By-product means a chemical that is produced coincidentally during the production of...

  7. CONVERTING FROM BATCH TO CONTINUOUS INTENSIFIED PROCESSING IN THE STT? REACTOR

    EPA Science Inventory


    The fluid dynamics, the physical dimensions and characteristics of the reaction zones of continuous process intensification reactors are often quite different from those of the batch reactors they replace. Understanding these differences is critical to the successful transit...

  8. Detailed requirements document for the problem reporting data system (PDS). [space shuttle and batch processing]

    NASA Technical Reports Server (NTRS)

    West, R. S.

    1975-01-01

    The system is described as a computer-based system designed to track the status of problems and corrective actions pertinent to space shuttle hardware. The input, processing, output, and performance requirements of the system are presented along with standard display formats and examples. Operational requirements, hardware requirements, and test requirements are also included.

  9. Guide to a Student-Family-School-Community Partnership: Using a Student & Data Driven Process to Improve School Environments & Promote Student Success

    ERIC Educational Resources Information Center

    Burgoa, Carol; Izu, Jo Ann

    2010-01-01

    This guide presents a data-driven, research-based process--referred to as the "school-community forum process"--for increasing youth voice, promoting resilience, strengthening adult-youth connections, and ultimately, for improving schools. It uses a "student listening circle"--a special type of focus group involving eight to…

  10. A Middle School Principal's and Teachers' Perceptions of Leadership Practices in Data-Driven Decision Making

    ERIC Educational Resources Information Center

    Godreau Cimma, Kelly L.

    2011-01-01

    The purpose of this qualitative case study was to describe one Connecticut middle school's voluntary implementation of a data-driven decision making process in order to improve student academic performance. Data-driven decision making is a component of Connecticut's accountability system to assist schools in meeting the requirements of the No…

  11. Effectiveness and Usability of the Sensory Processing Measure-Preschool Quick Tips: Data-Driven Intervention Following the Use of the SPM-Preschool in an Early Childhood, Multiple-Case Study

    ERIC Educational Resources Information Center

    Olson, Carol H.; Henry, Diana A.; Kliner, Ashley Peck; Kyllo, Alissa; Richter, Chelsea Munson; Charley, Jane; Whitcher, Meagan Chapman; Reinke, Katherine Roth; Tysver, Chelsay Horner; Wagner, Lacey; Walworth, Jessica

    2016-01-01

    This pre- and posttest multiple-case study examined the effectiveness and usability of the Sensory Processing Measure-Preschool Quick Tips (SPM-P QT) by key stakeholders (parents and teachers) for implementing data-driven intervention to address sensory processing challenges. The Sensory Processing Measure-Preschool (SPM-P) was administered as an…

  12. Passing in Command Line Arguments and Parallel Cluster/Multicore Batching in R with batch.

    PubMed

    Hoffmann, Thomas J

    2011-03-01

    It is often useful to rerun a command line R script with some slight change in the parameters used to run it: a new set of parameters for a simulation, a different dataset to process, etc. The R package batch provides a means to pass multiple command line options, including vectors of values in the usual R format, easily into R. The same script can be set up to run things in parallel via different command line arguments. The R package batch also simplifies this parallel batching by allowing one to use R and an R-like syntax for arguments to spread a script across a cluster or local multicore/multiprocessor computer, with automated syntax for several popular cluster types. Finally, it provides a means to aggregate the results of multiple processes run on a cluster.

  13. An integer batch scheduling model considering learning, forgetting, and deterioration effects for a single machine to minimize total inventory holding cost

    NASA Astrophysics Data System (ADS)

    Yusriski, R.; Sukoyo; Samadhi, T. M. A. A.; Halim, A. H.

    2018-03-01

    This research deals with a single machine batch scheduling model considering the influence of learning, forgetting, and machine deterioration effects. The objective of the model is to minimize total inventory holding cost, and the decision variables are the number of batches (N), the batch sizes (Q[i], i = 1, 2, ..., N) and the sequence in which the resulting batches are processed. The parts to be processed are received at the right times and in the right quantities, and all completed parts must be delivered at a common due date. We propose a heuristic procedure based on the Lagrange method to solve the problem. The effectiveness of the procedure is evaluated by comparing the resulting solution to the optimal solution obtained from an enumeration procedure using the integer composition technique; the average effectiveness is 94%.
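
    As an illustration of the enumeration baseline mentioned above: the integer composition technique lists all ordered ways to split a lot of n parts into k positive batch sizes, which can then be scored exhaustively. A minimal sketch with a placeholder cost function (the paper's actual holding-cost model is more detailed):

        from itertools import combinations

        def compositions(n, k):
            # All ordered ways to write n as a sum of k positive integers:
            # choose k-1 cut points among the n-1 gaps between unit items.
            for cuts in combinations(range(1, n), k - 1):
                bounds = (0,) + cuts + (n,)
                yield tuple(b - a for a, b in zip(bounds, bounds[1:]))

        # Score every batch-size vector for a lot of 10 parts in 3 batches
        # with a placeholder holding-cost function (illustrative only).
        def holding_cost(batches):
            return sum(size * i for i, size in enumerate(batches, start=1))

        best = min(compositions(10, 3), key=holding_cost)
        print(best, holding_cost(best))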

  14. A feasibility study on biological nitrogen removal (BNR) via integrated thiosulfate-driven denitratation with anammox.

    PubMed

    Qian, Jin; Zhang, Mingkuan; Wu, Yaoguo; Niu, Juntao; Chang, Xing; Yao, Hairui; Hu, Sihai; Pei, Xiangjun

    2018-06-12

    To exploit the advantages of lower electron donor consumption in partial denitrification (denitratation, NO3⁻ → NO2⁻) as well as lower sludge production in autotrophic denitrification (AD) and anammox, a novel biological nitrogen removal (BNR) process combining anammox and thiosulfate-driven denitratation was proposed here. In this study, the S2O3²⁻-S/NO3⁻-N ratio and pH are confirmed to be two key factors affecting the thiosulfate-driven denitratation activity and nitrite accumulation. Simultaneously high denitratation activity and substantial nitrite accumulation were observed at an initial S2O3²⁻-S/NO3⁻-N ratio of 1.5:1 and pH of 8.0. The optimal pH for the anammox reaction is determined to be 8.0. A sequencing batch reactor (SBR) and an up-flow anaerobic sludge blanket (UASB) reactor were established to carry out the anammox reaction and the high-rate thiosulfate-driven denitratation, respectively. At an ambient temperature of 35 °C, the total nitrogen removal efficiency and capacity are 73% and 0.35 kg N/day/m³ in the anammox SBR. At an HRT of 30 min, the NO3⁻ removal efficiency could reach above 90% with a nitrate-to-nitrite transformation ratio of 0.8, implying the great potential of applying the thiosulfate-driven denitratation and anammox system for BNR with minimal sludge production. Without the occurrence of denitritation (NO2⁻ → N2O → N2), theoretically no N2O could be emitted from this BNR system. This study could shed light on how to operate a high-rate BNR system targeting electron donor and energy savings as well as biowaste minimization and greenhouse gas reduction. Copyright © 2018. Published by Elsevier Ltd.

  15. DWPF Simulant CPC Studies For SB8

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Newell, J. D.

    2013-09-25

    Prior to processing a Sludge Batch (SB) in the Defense Waste Processing Facility (DWPF), flowsheet studies using simulants are performed. Typically, the flowsheet studies are conducted based on projected composition(s). The results from the flowsheet testing are used to 1) guide decisions during sludge batch preparation, 2) serve as a preliminary evaluation of potential processing issues, and 3) provide a basis to support the Shielded Cells qualification runs performed at the Savannah River National Laboratory (SRNL). SB8 was initially projected to be a combination of the Tank 40 heel (Sludge Batch 7b), Tank 13, Tank 12, and the Tank 51 heel. In order to accelerate preparation of SB8, the decision was made to delay the oxalate-rich material from Tank 12 to a future sludge batch. SB8 simulant studies without Tank 12 were reported in a separate report.1 The data presented in this report will be useful when processing future sludge batches containing Tank 12. The wash endpoint target for SB8 was set at a significantly higher sodium concentration to allow acceptable glass compositions at the targeted waste loading. Four non-coupled tests were conducted using simulant representing Tank 40 at 110-146% of the Koopman Minimum Acid requirement. Hydrogen was generated during high acid stoichiometry (146% acid) SRAT testing up to 31% of the DWPF hydrogen limit. SME hydrogen generation reached 48% of the DWPF limit for the high acid run. Two non-coupled tests were conducted using simulant representing Tank 51 at 110-146% of the Koopman Minimum Acid requirement. Hydrogen was generated during high acid stoichiometry SRAT testing up to 16% of the DWPF limit. SME hydrogen generation reached 49% of the DWPF limit for the high acid run. Simulant processing was successful using the previously established antifoam addition strategy. Foaming during formic acid addition was not observed in any of the runs. Nitrite was destroyed in all runs and no N2O was detected during SME processing. Mercury behavior was consistent with that seen in previous SRAT runs. Mercury was stripped below the DWPF limit of 0.8 wt% for all runs. Rheology yield stress fell within or below the design basis of 1-5 Pa. The low acid Tank 40 run (106% acid stoichiometry) had the highest yield stress at 3.78 Pa.

  16. Electrical resistivity tomography to quantify in situ liquid content in a full-scale dry anaerobic digestion reactor.

    PubMed

    André, L; Lamy, E; Lutz, P; Pernier, M; Lespinard, O; Pauss, A; Ribeiro, T

    2016-02-01

    The electrical resistivity tomography (ERT) method is a non-intrusive method widely used in landfills to detect and locate liquid content. An experimental set-up was implemented on a dry batch anaerobic digestion reactor to investigate the liquid distribution in the process and to map the spatial distribution of the inoculum. Two electrode arrays were used: pole-dipole and gradient arrays. A technical adaptation of the ERT method was necessary. Measured resistivity data were inverted and modeled with the RES2DINV software to obtain resistivity sections. Continuous calibration along the resistivity sections, involving sampling and physicochemical analysis, was necessary to interpret the data. Samples were analyzed for both biochemical methane potential and fiber content. Correlations were established between the reactor preparation protocol, resistivity values, liquid content, methane potential and fiber content, delineating the liquid distribution, high-methane-potential zones and degradation zones. The ERT method showed strong relevance for monitoring and optimizing the dry batch anaerobic digestion process. Copyright © 2015 Elsevier Ltd. All rights reserved.

  17. Development of clean coal and clean soil technologies using advanced agglomeration techniques

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ignasiak, B.; Ignasiak, T.; Szymocha, K.

    1990-01-01

    Three major topics are discussed in this report: (1) Upgrading of Low Rank Coals by the Agflotherm Process. Test data, procedures, equipment, etc., are described for co-upgrading of subbituminous coals and heavy oil; (2) Upgrading of Bituminous Coals by the Agflotherm Process. Experimental procedures and data, bench and pilot scale equipment, etc., for beneficiating bituminous coals are described; (3) Soil Clean-up and Hydrocarbon Waste Treatment Process. Batch and pilot plant tests are described for soil contaminated by tar refuse from manufactured gas plant sites. (VC)

  18. dada - a web-based 2D detector analysis tool

    NASA Astrophysics Data System (ADS)

    Osterhoff, Markus

    2017-06-01

    The data daemon, dada, is a server backend for unified access to 2D pixel detector image data stored by different detectors, in different file formats, and saved with varying naming conventions and folder structures across instruments. Furthermore, dada implements basic pre-processing and analysis routines, from pixel binning over azimuthal integration to raster scan processing. Users commonly interact with dada through a web frontend, but all parameters for an analysis are encoded into a Uniform Resource Identifier (URI), which can also be written by hand or by scripts for batch processing.
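
    Since every analysis is addressed by a URI, batch processing reduces to generating URIs programmatically. A minimal sketch in Python; the host name, path and query parameter names below are hypothetical placeholders, not dada's actual interface:

        from urllib.parse import urlencode

        # Hypothetical endpoint and parameter names, for illustration only.
        BASE = "https://dada.example.org/analyze"

        def analysis_uri(scan_id, mode="azimuthal", bins=256):
            # Encode one analysis request as a self-contained URI.
            return f"{BASE}?{urlencode({'scan': scan_id, 'mode': mode, 'bins': bins})}"

        # A script can emit one URI per scan for batch processing.
        for scan in range(101, 104):
            print(analysis_uri(scan))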

  19. Optimization of Process Parameters for High Efficiency Laser Forming of Advanced High Strength Steels within Metallurgical Constraints

    NASA Astrophysics Data System (ADS)

    Sheikholeslami, Ghazal; Griffiths, Jonathan; Dearden, Geoff; Edwardson, Stuart P.

    Laser forming (LF) has been shown to be a viable alternative for forming automotive grade advanced high strength steels (AHSS). Due to their high strength, heat sensitivity and low conventional formability, these steels show early fractures, larger springback, batch-to-batch inconsistency and high tool wear when formed conventionally. In this paper, optimisation of the LF process parameters has been conducted to further understand the impact of a surface heat treatment on DP1000. An FE numerical simulation has been developed to analyse the dynamic thermo-mechanical effects; it has been verified against empirical data. The goal of the optimisation has been to develop a usable process window for the LF of AHSS within strict metallurgical constraints. Results indicate that it is possible to laser form this material; however, a complex relationship has been found between the generation and maintenance of hardness values in the heated zone. A laser surface hardening effect has been observed that could be beneficial to the efficiency of the process.

  20. Sequencing batch-reactor control using Gaussian-process models.

    PubMed

    Kocijan, Juš; Hvala, Nadja

    2013-06-01

    This paper presents a Gaussian-process (GP) model for the design of sequencing batch-reactor (SBR) control for wastewater treatment. The GP model is a probabilistic, nonparametric model with uncertainty predictions. In the case of SBR control, it is used for the on-line optimisation of the batch-phase durations. The control algorithm follows the course of the indirect process variables (pH, redox potential and dissolved oxygen concentration) and recognises the characteristic patterns in their time profiles. The control algorithm uses GP-based regression to smooth the signals and GP-based classification for the pattern recognition. When tested on signals from an SBR laboratory pilot plant, the control algorithm provided a satisfactory agreement between the proposed completion times and the actual termination times of the biodegradation processes. In a set of tested batches, the final ammonia and nitrate concentrations were below 1 and 0.5 mg L(-1), respectively, while the aeration time was shortened considerably. Copyright © 2013 Elsevier Ltd. All rights reserved.
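
    A minimal sketch of the GP-regression smoothing step described above, in plain NumPy with an RBF kernel and fixed, illustrative hyperparameters; the paper's model, and its GP-classification step, are more elaborate:

        import numpy as np

        def gp_smooth(t, y, t_star, length=5.0, sigma_f=1.0, sigma_n=0.1):
            # Posterior mean of a zero-mean GP with an RBF kernel: smooths a
            # noisy signal y(t) and evaluates it on a new grid t_star.
            k = lambda a, b: sigma_f**2 * np.exp(
                -0.5 * (a[:, None] - b[None, :])**2 / length**2)
            K = k(t, t) + sigma_n**2 * np.eye(t.size)
            return k(t_star, t) @ np.linalg.solve(K, y)

        # Noisy stand-in for an indirect process variable (e.g., pH) in a phase.
        t = np.linspace(0.0, 60.0, 120)
        y = np.tanh((t - 30.0) / 10.0) \
            + 0.05 * np.random.default_rng(1).standard_normal(t.size)
        y_smooth = gp_smooth(t, y, t)
        print(f"max abs smoothing correction: {np.abs(y_smooth - y).max():.3f}")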

  1. Mechanistic simulation of batch acetone-butanol-ethanol (ABE) fermentation with in situ gas stripping using Aspen Plus™.

    PubMed

    Darkwah, Kwabena; Nokes, Sue E; Seay, Jeffrey R; Knutson, Barbara L

    2018-05-22

    Process simulations of batch fermentations with in situ product separation traditionally decouple these interdependent steps by simulating separate "steady state" continuous fermentation and separation units. In this study, an integrated batch fermentation and separation process was simulated for a model system of acetone-butanol-ethanol (ABE) fermentation with in situ gas stripping, such that the fermentation kinetics are linked in real time to the gas stripping process. Time-dependent cell growth, substrate utilization, and product formation are translated into an Aspen Plus batch reactor. This approach capitalizes on the phase equilibria calculations of Aspen Plus to predict the effect of stripping on the ABE fermentation kinetics. The product profiles of the integrated fermentation and separation are shown to be sensitive to gas flow rate, unlike separate steady-state fermentation and separation simulations. This study demonstrates the importance of coupled fermentation and separation simulation approaches for the systematic analysis of unsteady-state processes.

  2. Pad ultrasonic batch dyeing of causticized lyocell fabric with reactive dyes.

    PubMed

    Babar, Aijaz Ahmed; Peerzada, Mazhar Hussain; Jhatial, Abdul Khalique; Bughio, Noor-Ul-Ain

    2017-01-01

    Conventionally, cellulosic fabric dyed with reactive dyes requires a significant amount of salt, and the dyeing of a solvent-spun regenerated cellulosic fiber is a critical process. This paper presents the dyeing results of lyocell fabrics dyed with conventional pad batch (CPB) and pad ultrasonic batch (PUB) processes. The dyeing of lyocell fabrics was carried out with two commercial dyes, namely Drimarine Blue CL-BR and Ramazol Blue RGB. Dyeing parameters including the concentrations of sodium hydroxide and sodium carbonate and the dwell time were compared for the two processes. The outcomes show that PUB dyed samples offered reasonably higher color yield and dye fixation than CPB dyed samples. A remarkable reduction of 12 h in batching time, 18 ml/l in NaOH and 5 g/l in Na2CO3 quantity was observed for PUB processed samples producing similar results compared to the CPB process, making PUB a more economical, productive and environmentally friendly process. Color fastness examination showed identical results for both PUB and CPB methods. No significant change in the surface morphology of PUB processed samples was observed through scanning electron microscope (SEM) analysis. Copyright © 2016 Elsevier B.V. All rights reserved.

  3. An Inversion Analysis of Recent Variability in Natural CO2 Fluxes Using GOSAT and In Situ Observations

    NASA Astrophysics Data System (ADS)

    Wang, J. S.; Kawa, S. R.; Baker, D. F.; Collatz, G. J.; Ott, L. E.

    2015-12-01

    About one-half of the global CO2 emissions from fossil fuel combustion and deforestation accumulates in the atmosphere, where it contributes to global warming. The rest is taken up by vegetation and the ocean. The precise contribution of the two sinks, and their location and year-to-year variability are, however, not well understood. We use two different approaches, batch Bayesian synthesis inversion and variational data assimilation, to deduce the global spatiotemporal distributions of CO2 fluxes during 2009-2010. One of our objectives is to assess different sources of uncertainties in inferred fluxes, including uncertainties in prior flux estimates and observations, and differences in inversion techniques. For prior constraints, we utilize fluxes and uncertainties from the CASA-GFED model of the terrestrial biosphere and biomass burning driven by satellite observations and interannually varying meteorology. We also use measurement-based ocean flux estimates and two sets of fixed fossil CO2 emissions. Here, our inversions incorporate column CO2 measurements from the GOSAT satellite (ACOS retrieval, filtered and bias-corrected) and in situ observations (individual flask and afternoon-average continuous observations) to estimate fluxes in 108 regions over 8-day intervals for the batch inversion and at 3° x 3.75° weekly for the variational system. Relationships between fluxes and atmospheric concentrations are derived consistently for the two inversion systems using the PCTM atmospheric transport model driven by meteorology from the MERRA reanalysis. We compare the posterior fluxes and uncertainties derived using different data sets and the two inversion approaches, and evaluate the posterior atmospheric concentrations against independent data including aircraft measurements. The optimized fluxes generally resemble those from other studies. For example, the results indicate that the terrestrial biosphere is a net CO2 sink, and a GOSAT-only inversion suggests a shift in the global sink from the tropics/south to the north relative to the prior and to an in-situ-only inversion. We also find a smaller terrestrial sink in higher-latitude northern regions in boreal summer of 2010 relative to 2009.

  4. An Inversion Analysis of Recent Variability in Natural CO2 Fluxes Using GOSAT and In Situ Observations

    NASA Technical Reports Server (NTRS)

    Wang, James S.; Kawa, S. Randolph; Collatz, G. James; Baker, David F.; Ott, Lesley

    2015-01-01

    About one-half of the global CO2 emissions from fossil fuel combustion and deforestation accumulates in the atmosphere, where it contributes to global warming. The rest is taken up by vegetation and the ocean. The precise contribution of the two sinks, and their location and year-to-year variability are, however, not well understood. We use two different approaches, batch Bayesian synthesis inversion and variational data assimilation, to deduce the global spatiotemporal distributions of CO2 fluxes during 2009-2010. One of our objectives is to assess different sources of uncertainties in inferred fluxes, including uncertainties in prior flux estimates and observations, and differences in inversion techniques. For prior constraints, we utilize fluxes and uncertainties from the CASA-GFED model of the terrestrial biosphere and biomass burning driven by satellite observations and interannually varying meteorology. We also use measurement-based ocean flux estimates and two sets of fixed fossil CO2 emissions. Here, our inversions incorporate column CO2 measurements from the GOSAT satellite (ACOS retrieval, filtered and bias-corrected) and in situ observations (individual flask and afternoon-average continuous observations) to estimate fluxes in 108 regions over 8-day intervals for the batch inversion and at 3° x 3.75° weekly for the variational system. Relationships between fluxes and atmospheric concentrations are derived consistently for the two inversion systems using the PCTM atmospheric transport model driven by meteorology from the MERRA reanalysis. We compare the posterior fluxes and uncertainties derived using different data sets and the two inversion approaches, and evaluate the posterior atmospheric concentrations against independent data including aircraft measurements. The optimized fluxes generally resemble those from other studies. For example, the results indicate that the terrestrial biosphere is a net CO2 sink, and a GOSAT-only inversion suggests a shift in the global sink from the tropics/south to the north relative to the prior and to an in-situ-only inversion. We also find a smaller terrestrial sink in higher-latitude northern regions in boreal summer of 2010 relative to 2009.

  5. Bio-plasticizer production by hybrid acetone-butanol-ethanol fermentation with full cell catalysis of Candida sp. 99-125.

    PubMed

    Chen, Changjing; Cai, Di; Qin, Peiyong; Chen, Biqiang; Wang, Zheng; Tan, Tianwei

    2018-06-01

    A hybrid process integrating fermentation, pervaporation and esterification was established, aiming to improve the economic feasibility of the conventional acetone-butanol-ethanol (ABE) fermentation process. Candida sp. 99-125 cells were used as the full-cell catalyst. The feasibility of batch and fed-batch esterification using the ABE permeate of pervaporation (ranging from 286.9 g/L to 402.9 g/L) as substrate was compared. Valuable butyl oleate was produced along with ethyl oleate. For the batch esterification, due to severe substrate inhibition of the lipase, the yields of butyl oleate and ethyl oleate were only 24.9% and 3.3%, respectively. In contrast, butyl oleate and ethyl oleate yields of 75% and 11.8%, respectively, were obtained at the end of the fed-batch esterification. The novel integrated process provides a promising strategy for in situ upgrading of ABE products. Copyright © 2018 Elsevier Ltd. All rights reserved.

  6. 40 CFR 63.1361 - Definitions.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... pressure relief device. This release can be one release or a series of releases over a short time period... reduces the mass of HAP emitted to the air. The equipment may consist of an individual device or a series... do not occur simultaneously in a batch operation. A batch process consists of a series of batch...

  7. 40 CFR 63.1361 - Definitions.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... can be one release or a series of releases over a short time period due to a malfunction in the... reduces the mass of HAP emitted to the air. The equipment may consist of an individual device or a series... do not occur simultaneously in a batch operation. A batch process consists of a series of batch...

  8. A Batch Feeder for Inhomogeneous Bulk Materials

    NASA Astrophysics Data System (ADS)

    Vislov, I. S.; Kladiev, S. N.; Slobodyan, S. M.; Bogdan, A. M.

    2016-04-01

    The work presents an analysis of mechanical feeders and batchers that find application in various technological processes and industrial fields. Feeders are usually classified according to their design features into two groups: conveyor-type feeders and non-conveyor feeders. Batchers are used to batch solid bulk materials and, less frequently, liquids. In terms of batching method, they are divided into volumetric and weighing batchers. Weighing batchers do not provide sufficient batching accuracy on their own; automatic weighing batchers therefore include a mass-controlling sensor and systems for automatic material feed and automatic mass discharge control. In terms of operating principle, batchers are divided into gravitational batchers and batchers with forced feed of material using conveyors and pumps. Improved consumption of raw materials, decreased loss of materials, and ease of use in automatic control systems of industrial facilities allow increasing the quality of technological processes and improving labor conditions. The batch feeder suggested by the authors is a volumetric batcher that has no comparable counterparts among conveyor-type feeders and allows solving the problem of targeted feeding of bulk material batches while increasing the reliability and hermeticity of the device.

  9. Modeling Lab-sized Anaerobic Fluidized Bed Reactor (AFBR) for Palm Oil Mill Effluent (POME) treatment: from Batch to Continuous Reactors

    NASA Astrophysics Data System (ADS)

    Mufti Azis, Muhammad; Sudibyo, Hanifrahmawan; Budhijanto, Wiratni

    2018-03-01

    Indonesia is aiming to produce 30 million tonnes/year of crude palm oil (CPO) by 2020. As a result, 90 million tonnes/year of POME will be produced. POME is a highly polluting wastewater which may cause severe environmental problems due to its high chemical oxygen demand (COD) and biochemical oxygen demand (BOD). Due to the limitations of open pond treatment, the use of AFBR has been considered as a potential technology to treat POME. This study aims to develop mathematical models of a lab-sized Anaerobic Fluidized Bed Reactor (AFBR) in batch and continuous processes. In addition, the AFBR utilized natural zeolite as an immobilization medium for the microbes. To initiate biomass growth, biodiesel waste was used as an inoculum. In the first part of this study, a batch AFBR was operated to evaluate the COD, VFA, and CH4 concentrations. Comparing the batch results with and without zeolite showed that the addition of 17 g/gSCOD zeolite gave a larger COD decrease within 20 days of operation. In order to elucidate the mechanism, estimates of 12 kinetic parameters were proposed to describe the batch reactor performance. The model could in general describe the batch experimental data well. In the second part of this study, the kinetic parameters obtained from the batch reactor were used to simulate the performance of a double-column AFBR in which the acidogenic and methanogenic biomass were separated. The simulation showed that a relatively long residence time (Hydraulic Residence Time, HRT) was required to treat POME using the proposed double-column AFBR. Sensitivity analysis revealed that μm1 was the most sensitive parameter for reducing the HRT of the double-column AFBR.

  10. Environmental impact of mushroom compost production.

    PubMed

    Leiva, Francisco; Saenz-Díez, Juan-Carlos; Martínez, Eduardo; Jiménez, Emilio; Blanco, Julio

    2016-09-01

    This research analyses the environmental impact of the creation of Agaricus bisporus compost packages. The composting process is the intermediate stage of the mushroom production process, subsequent to the mycelium cultivation stage and prior to the fruiting bodies cultivation stage. A full life cycle assessment model of the Agaricus bisporus composting process has been developed through the identification and analysis of the inputs-outputs and energy consumption of the activities involved in the production process. The study has been developed based on data collected from a plant during a 1 year campaign, thereby obtaining accurate information used to analyse the environmental impact of the process. A global analysis of the main stages of the process shows that the process that has the greatest impact in most categories is the compost batch preparation process. This is due to an increased consumption of energy resources by the machinery that mixes the raw materials to create the batch. At the composting process inside the tunnel stage, the activity that has the greatest impact in almost all categories studied is the initial stage of composting. This is due to higher energy consumption during the process compared to the other stages. © 2015 Society of Chemical Industry.

  11. Inorganic fouling mitigation by salinity cycling in batch reverse osmosis.

    PubMed

    Warsinger, David M; Tow, Emily W; Maswadeh, Laith A; Connors, Grace B; Swaminathan, Jaichander; Lienhard V, John H

    2018-06-15

    Enhanced fouling resistance has been observed in recent variants of reverse osmosis (RO) desalination which use time-varying batch or semi-batch processes, such as closed-circuit RO (CCRO) and pulse flow RO (PFRO). However, the mechanisms of batch processes' fouling resistance are not well-understood, and models have not been developed for prediction of their fouling performance. Here, a framework for predicting reverse osmosis fouling is developed by comparing the fluid residence time in batch and continuous (conventional) reverse osmosis systems to the nucleation induction times for crystallization of sparingly soluble salts. This study considers the inorganic foulants calcium sulfate (gypsum), calcium carbonate (calcite), and silica, and the work predicts maximum recovery ratios for the treatment of typical water sources using batch reverse osmosis (BRO) and continuous reverse osmosis. The prediction method is validated through comparisons to the measured time delay for CaSO4 membrane scaling in a bench-scale, recirculating reverse osmosis unit. The maximum recovery ratio for each salt solution (CaCO3, CaSO4) is individually predicted as a function of inlet salinity, as shown in contour plots. Next, the maximum recovery ratios of batch and conventional RO are compared across several water sources, including seawater, brackish groundwater, and RO brine. Batch RO's shorter residence times, associated with cycling from low to high salinity during each batch, enable significantly higher recovery ratios and higher salinity than in continuous RO for all cases examined. Finally, representative brackish RO brine samples were analyzed to determine the maximum possible recovery with batch RO. Overall, the induction time modeling methodology provided here can be used to allow batch RO to operate at high salinity and high recovery, while controlling scaling. The results show that, in addition to its known energy efficiency improvement, batch RO has superior inorganic fouling resistance relative to conventional RO. Copyright © 2018 Elsevier Ltd. All rights reserved.
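
    A minimal sketch of the framework's core comparison, under the stated premise that scaling is avoided while the fluid's residence time at supersaturation stays below the nucleation induction time. The induction-time correlation below is a hypothetical placeholder, not the paper's fitted model:

        def induction_time(saturation_index, a=2.0, b=4.0):
            # Hypothetical placeholder correlation: induction time (min)
            # falls steeply as the saturation index SI rises above zero.
            return a / max(saturation_index, 1e-6) ** b

        def scaling_risk(residence_time_min, saturation_index):
            # Core comparison behind the recovery-ratio limit: scaling is
            # predicted once residence time exceeds the induction time.
            return residence_time_min >= induction_time(saturation_index)

        # Batch RO spends only part of each cycle at high salinity, so its
        # effective residence time at supersaturation is shorter than in
        # continuous RO at the same recovery ratio (illustrative numbers).
        for label, t_res in [("batch RO", 3.0), ("continuous RO", 40.0)]:
            print(label, "scaling predicted:", scaling_risk(t_res, 0.8))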

  12. Optimal bioprocess design through a gene regulatory network - growth kinetic hybrid model: Towards Replacing Monod kinetics.

    PubMed

    Tsipa, Argyro; Koutinas, Michalis; Usaku, Chonlatep; Mantalaris, Athanasios

    2018-05-02

    Currently, design and optimisation of biotechnological processes is performed either through exhaustive experimentation and/or with the use of empirical, unstructured growth kinetics models. While elaborate systems biology approaches have recently been explored, mixed-substrate utilisation is predominantly ignored despite its significance in enhancing bioprocess performance. Herein, bioprocess optimisation for an industrially relevant bioremediation process involving a mixture of highly toxic substrates, m-xylene and toluene, was achieved through application of a novel experimental-modelling gene regulatory network - growth kinetic (GRN-GK) hybrid framework. The GRN model described the TOL and ortho-cleavage pathways in Pseudomonas putida mt-2 and captured the transcriptional kinetics of the promoters. The GRN model informed the formulation of the growth kinetics model, replacing the empirical and unstructured Monod kinetics. The GRN-GK framework's predictive capability, and its potential as a systematic optimal bioprocess design tool, was demonstrated by effectively predicting bioprocess performance in agreement with experimental values, whereas four commonly used models deviated significantly from the experimental values. Significantly, a fed-batch biodegradation process was designed and optimised through model-based control of the TOL Pr promoter expression, resulting in 61% and 60% enhanced pollutant removal and biomass formation, respectively, compared to the batch process. This provides strong evidence of model-based bioprocess optimisation at the gene level, rendering the GRN-GK framework a novel and applicable approach to optimal bioprocess design. Finally, model analysis using global sensitivity analysis (GSA) suggests an alternative, systematic approach for model-driven strain modification for synthetic biology and metabolic engineering applications. Copyright © 2018. Published by Elsevier Inc.

  13. Cognitive load privileges memory-based over data-driven processing, not group-level over person-level processing.

    PubMed

    Skorich, Daniel P; Mavor, Kenneth I

    2013-09-01

    In the current paper, we argue that categorization and individuation, as traditionally discussed and as experimentally operationalized, are defined in terms of two confounded underlying dimensions: a person/group dimension and a memory-based/data-driven dimension. In a series of three experiments, we unconfound these dimensions and impose a cognitive load. Across the three experiments, two with laboratory-created targets and one with participants' friends as the target, we demonstrate that cognitive load privileges memory-based over data-driven processing, not group- over person-level processing. We discuss the results in terms of their implications for conceptualizations of the categorization/individuation distinction, for the equivalence of person and group processes, for the ultimate 'purpose' and meaningfulness of group-based perception and, fundamentally, for the process of categorization, broadly defined. © 2012 The British Psychological Society.

  14. 40 CFR 63.1323 - Batch process vents-methods and procedures for group determination.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... accepted chemical engineering principles, measurable process parameters, or physical or chemical laws or... recovering monomer, reaction products, by-products, or solvent from a stripper operated in batch mode, and the primary condenser recovering monomer, reaction products, by-products, or solvent from a...

  15. Do lab-derived distribution coefficient values of pesticides match distribution coefficient values determined from column and field-scale experiments? A critical analysis of relevant literature.

    PubMed

    Vereecken, H; Vanderborght, J; Kasteel, R; Spiteller, M; Schäffer, A; Close, M

    2011-01-01

    In this study, we analyzed sorption parameters for pesticides that were derived from batch and column or batch and field experiments. The batch experiments analyzed in this study were run with the same pesticide and soil as in the column and field experiments. We analyzed the relationship between the pore water velocity of the column and field experiments, solute residence times, and sorption parameters, such as the organic carbon normalized distribution coefficient (Koc) and the mass exchange coefficient in kinetic models, as well as the predictability of sorption parameters from basic soil properties. The batch/column analysis included 38 studies with a total of 139 observations. The batch/field analysis included five studies, resulting in a dataset of 24 observations. For the batch/column data, power law relationships between pore water velocity, residence time, and sorption constants were derived. The unexplained variability in these equations was reduced by taking into account the saturation status and the packing status (disturbed-undisturbed) of the soil sample. A new regression equation was derived that allows estimating the Koc values derived from column experiments using organic matter and bulk density, with an R2 value of 0.56. Regression analysis of the batch/column data showed that the relationship between batch- and column-derived Koc values depends on the saturation status and packing of the soil column. Analysis of the batch/field data showed that as the batch-derived Koc value becomes larger, field-derived Koc values tend to be lower than the corresponding batch-derived values, and vice versa. The present dataset also showed that the variability in the ratio of batch- to column-derived Koc values increases with increasing pore water velocity, with a maximum value approaching 3.5. American Society of Agronomy, Crop Science Society of America, and Soil Science Society of America.

  16. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Peters, T. B.; Bannochie, C. J.

    Savannah River National Laboratory (SRNL) analyzed samples from Tank 21H in support of verification of Macrobatch (Salt Batch) 11 for the Interim Salt Disposition Program (ISDP) for processing. This document reports characterization data on the samples of Tank 21H and fulfills the requirements of Deliverable 3 of the Technical Task Request (TTR).

  17. Adsorptive removal of Cu(II) from aqueous solution and industrial effluent using natural/agricultural wastes.

    PubMed

    Singha, Biswajit; Das, Sudip Kumar

    2013-07-01

    The potential of low-cost natural/agricultural waste biomasses for the removal of Cu(II) ions from aqueous solution has been investigated in batch experiments. The effects of various physico-chemical parameters such as initial pH, initial Cu(II) concentration, adsorbent dosage, contact time and temperature have been studied. The optimum pH for adsorption was found to be 6 for all adsorbents used. Kinetics data were best described by the pseudo-second-order model. The experimental data were fitted well by the Freundlich and Halsey isotherm models. The diffusion coefficient and sorption energy indicated that the adsorption process was chemical in nature. Thermodynamic parameters such as ΔG°, ΔH° and ΔS° were calculated, and it was observed that the adsorption process was spontaneous and endothermic. The mean sorption energy calculated using the Dubinin-Radushkevich isotherm model confirmed that the sorption process was chemical in nature. Different active functional groups responsible for the Cu(II) ion adsorption process were identified by FTIR studies. An application study using electroplating industrial wastewater and a regeneration experiment for the adsorbent were also carried out. A design procedure for the batch process is also reported. Copyright © 2013 Elsevier B.V. All rights reserved.
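
    For reference, the pseudo-second-order kinetic model named above is conventionally written, together with the linearized form used for fitting (standard equations, not quoted from the paper):

        \frac{dq_t}{dt} = k_2 \left( q_e - q_t \right)^2 \qquad\Longrightarrow\qquad \frac{t}{q_t} = \frac{1}{k_2 q_e^{2}} + \frac{t}{q_e}

    where qt and qe (mg/g) are the amounts adsorbed at time t and at equilibrium and k2 is the rate constant; plotting t/qt against t yields qe and k2 from the slope and intercept.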

  18. SIMULANT DEVELOPMENT FOR SAVANNAH RIVER SITE HIGH LEVEL WASTE

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Stone, M; Russell Eibling, R; David Koopman, D

    2007-09-04

    The Defense Waste Processing Facility (DWPF) at the Savannah River Site vitrifies High Level Waste (HLW) for repository internment. The process consists of three major steps: waste pretreatment, vitrification, and canister decontamination/sealing. The HLW consists of insoluble metal hydroxides (primarily iron, aluminum, magnesium, manganese, and uranium) and soluble sodium salts (carbonate, hydroxide, nitrite, nitrate, and sulfate). The HLW is processed in large batches through DWPF; DWPF has recently completed processing Sludge Batch 3 (SB3) and is currently processing Sludge Batch 4 (SB4). The composition of metal species in SB4 is shown in Table 1 as a function of the ratio of a metal to iron. Simulants remove radioactive species and renormalize the remaining species. Supernate composition is shown in Table 2.

  19. Bacteriophage PRD1 batch experiments to study attachment, detachment and inactivation processes.

    PubMed

    Sadeghi, Gholamreza; Schijven, Jack F; Behrends, Thilo; Hassanizadeh, S Majid; van Genuchten, Martinus Th

    2013-09-01

    Knowledge of virus removal in subsurface environments is pivotal for assessing the risk of viral contamination of water resources and developing appropriate protection measures. Columns packed with sand are frequently used to quantify attachment, detachment and inactivation rates of viruses. Since column transport experiments are very laborious, a common alternative is to perform batch experiments in which usually one or two measurements are made, assuming equilibrium is reached. It is also possible to perform kinetic batch experiments; in that case, however, it is necessary to monitor changes in the concentration with time, which makes kinetic batch experiments almost as laborious as column experiments. Moreover, attachment and detachment rate coefficients derived from batch experiments may differ from those determined using column experiments. The aim of this study was to determine the utility of kinetic batch experiments and investigate the effects of different designs of the batch experiments on estimated attachment, detachment and inactivation rate coefficients. The experiments involved various combinations of container size, sand-water ratio, and mixing method (i.e., rolling or tumbling by pivoting the tubes around their horizontal or vertical axes, respectively). Batch experiments were conducted with clean quartz sand, water at pH 7 and ionic strength of 20 mM, and the bacteriophage PRD1 as a model virus. Values of the attachment, detachment and inactivation rate coefficients were found by fitting an analytical solution of the kinetic model equations to the data. Attachment rate coefficients were found to be systematically higher under tumbling than under rolling conditions because of better mixing and more efficient contact of phages with the surfaces of the sand grains. With both mixing methods, more sand in the container yielded higher attachment rate coefficients. A linear increase in the detachment rate coefficient was observed with increased solid-water ratio using the tumbling method. Given the differences in the attachment rate coefficients, and assuming the same sticking efficiencies since the chemical conditions of the batch and column experiments were the same, our results show that collision efficiencies of batch experiments are not the same as those of column experiments. Upscaling of the attachment rate from batch to column experiments hence requires proper understanding of the mixing conditions. Because batch experiments in which the kinetics are monitored are as laborious as column experiments, there seems to be no major advantage in performing batch instead of column experiments. Copyright © 2013 Elsevier B.V. All rights reserved.
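
    A generic form of the kinetic model fitted in such batch studies, written for the aqueous-phase concentration C and the attached-phase concentration S (a standard first-order attachment-detachment-inactivation formulation; the authors' exact notation may differ):

        \frac{dC}{dt} = -k_{att}\, C + \frac{m_s}{V}\, k_{det}\, S - \mu_l\, C, \qquad \frac{dS}{dt} = \frac{V}{m_s}\, k_{att}\, C - k_{det}\, S - \mu_s\, S

    where katt and kdet are the attachment and detachment rate coefficients, μl and μs the inactivation rate coefficients of free and attached phages, ms the sand mass and V the water volume.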

  20. Preparation and Characterization of a Master Blend of Plutonium Oxide for the 3013 Large Scale Shelf-Life Surveillance Project

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gillispie, Obie William; Worl, Laura Ann; Veirs, Douglas Kirk

    A mixture of chlorine-containing, impure plutonium oxides has been produced and has been given the name Master Blend. This large quantity of well-characterized chlorine-containing material is available for use in the Integrated Surveillance and Monitoring Program for shelf-life experiments. It is intended to be representative of materials packaged to meet DOE-STD-3013.1 The Master Blend contains a mixture of items produced in Los Alamos National Laboratory’s (LANL) electro-refining pyrochemical process in the late 1990s. Twenty items were crushed and sieved, calcined to 800ºC for four hours, and blended multiple times. This process resulted in four batches of Master Blend. Calorimetry and density data on material from the four batches indicate homogeneity.

  1. Comparison of TOPEX/Poseidon orbit determination solutions obtained by the Goddard Space Flight Center Flight Dynamics Division and Precision Orbit Determination Teams

    NASA Technical Reports Server (NTRS)

    Doll, C.; Mistretta, G.; Hart, R.; Oza, D.; Cox, C.; Nemesure, M.; Bolvin, D.; Samii, Mina V.

    1993-01-01

    Orbit determination results are obtained by the Goddard Space Flight Center (GSFC) Flight Dynamics Division (FDD) using the Goddard Trajectory Determination System (GTDS) and a real-time extended Kalman filter estimation system to process Tracking and Data Relay Satellite (TDRS) System (TDRSS) measurements in support of Ocean Topography Experiment (TOPEX)/Poseidon spacecraft navigation and health and safety operations. GTDS is the operational orbit determination system used by the FDD, and the extended Kalman filter was implemented in an analysis prototype system, the Real-Time Orbit Determination System/Enhanced (RTOD/E). The Precision Orbit Determination (POD) team within the GSFC Space Geodesy Branch generates an independent set of high-accuracy trajectories to support the TOPEX/Poseidon science data. These latter solutions use the Geodynamics (GEODYN) orbit determination system with laser ranging tracking data. The TOPEX/Poseidon trajectories were estimated for the October 22 - November 1, 1992, timeframe, for which the latest preliminary POD results were available. Independent assessments were made of the consistency of the solutions produced by the batch and sequential methods: the batch cases were assessed using overlap comparisons, while the sequential cases were assessed with covariances and first-measurement residuals. The batch least-squares and forward-filtered RTOD/E orbit solutions were compared with the definitive POD orbit solutions. The solution differences were generally less than 10 meters (m) for the batch least-squares solutions and less than 18 m for the sequential estimation solutions. The differences among the POD, GTDS, and RTOD/E solutions can be traced to differences in modeling and tracking data types, which are being analyzed in detail.
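
    The batch-versus-sequential comparison at the heart of this record can be illustrated on a toy problem. The sketch below (all values hypothetical; no relation to GTDS or RTOD/E) fits the same linear measurement set with batch least-squares and with a Kalman filter under a diffuse prior and no process noise, in which case the two epoch-state estimates should nearly coincide:

        import numpy as np

        # Toy 1-D constant-velocity surrogate: state x = [position, velocity].
        rng = np.random.default_rng(1)
        t = np.arange(10.0); sigma = 0.5
        x_true = np.array([2.0, 0.7])
        z = x_true[0] + x_true[1] * t + rng.normal(0, sigma, t.size)

        # Batch least-squares: z = H @ x + noise, with H rows = [1, t_i]
        H = np.column_stack([np.ones_like(t), t])
        x_batch, *_ = np.linalg.lstsq(H, z, rcond=None)

        # Kalman filter with a diffuse prior and no process noise
        x = np.zeros(2); P = np.eye(2) * 1e6
        for ti, zi in zip(t, z):
            h = np.array([1.0, ti])                   # measurement row at time ti
            K = P @ h / (h @ P @ h + sigma**2)        # scalar-measurement gain
            x = x + K * (zi - h @ x)
            P = (np.eye(2) - np.outer(K, h)) @ P
        print(x_batch, x)  # near-identical estimates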

  2. Linear Covariance Analysis and Epoch State Estimators

    NASA Technical Reports Server (NTRS)

    Markley, F. Landis; Carpenter, J. Russell

    2014-01-01

    This paper extends in two directions the results of prior work on generalized linear covariance analysis of both batch least-squares and sequential estimators. The first is an improved treatment of process noise in the batch, or epoch state, estimator with an epoch time that may be later than some or all of the measurements in the batch. The second is to account for process noise in specifying the gains in the epoch state estimator. We establish the conditions under which the latter estimator is equivalent to the Kalman filter.
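
    For reference, a standard textbook form of the epoch-state (batch least-squares) estimate, not necessarily the authors' exact notation, with prior \((\bar{x}_0, P_0)\) and stacked measurements \(y = H x_0 + v\), \(v \sim (0, R)\):

        \hat{x}_0 = \left( P_0^{-1} + H^{\top} R^{-1} H \right)^{-1}
                    \left( P_0^{-1} \bar{x}_0 + H^{\top} R^{-1} y \right)

    Absent process noise, propagating \(\hat{x}_0\) forward reproduces the estimate a Kalman filter obtains by processing the same measurements sequentially; with process noise \(Q \neq 0\), the point of the paper is that the epoch-state gains must be modified for this equivalence to hold.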

  3. Linear Covariance Analysis and Epoch State Estimators

    NASA Technical Reports Server (NTRS)

    Markley, F. Landis; Carpenter, J. Russell

    2012-01-01

    This paper extends in two directions the results of prior work on generalized linear covariance analysis of both batch least-squares and sequential estimators. The first is an improved treatment of process noise in the batch, or epoch state, estimator with an epoch time that may be later than some or all of the measurements in the batch. The second is to account for process noise in specifying the gains in the epoch state estimator. We establish the conditions under which the latter estimator is equivalent to the Kalman filter.

  4. Ready-to-eat vegetables production with low-level water chlorination. An evaluation of water quality, and of its impact on end products.

    PubMed

    D'Acunzo, Francesca; Del Cimmuto, Angela; Marinelli, Lucia; Aurigemma, Caterina; De Giusti, Maria

    2012-01-01

    We evaluated the microbiological impact of low-level chlorination (1 ppm free chlorine) on the production of ready-to-eat (RTE) vegetables by monitoring the microbiological quality of irrigation and processing water in two production plants over a 4-season period, as well as the microbiological quality of unprocessed vegetables and the RTE product. Water samples were also characterized in terms of some chemical and physico-chemical parameters of relevance in chlorination management. Both producers use water with a maximum of 1 ppm free chlorine for vegetable rinsing, while the two processes differ in the number of washing cycles. Salmonella spp. and Campylobacter spp. were each detected once, in two different irrigation water samples out of nine from one producer. No pathogens were found in the vegetable samples. As expected, the procedure encompassing more washing cycles performed slightly better in terms of total mesophilic count (TMC) when comparing unprocessed and RTE vegetables of the same batch. However, the data suggest that low-level chlorination may be insufficient to prevent microbial build-up in the washing equipment and/or batch-to-batch cross-contamination.

  5. Simulation of OSCM Concepts for HQ SACT

    DTIC Science & Technology

    2007-06-01

    effective method for creating understanding, identifying problems and developing solutions. • Simulation of a goal-driven organization is a cost-effective method to visualize some aspects of the problem space. • Toolbox: the team used Extend™, a COTS product from Imagine That!® (http... [remaining fragment is flow-diagram label residue: Nations flow Model; OSCM ATARES flow; Batching A/C & Pallets Model; ISAF Airbridge flow; Flying and unbatching A/C Fleet; Create resources; Calculate flight]

  6. 40 CFR 63.2460 - What requirements must I meet for batch process vents?

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    .... (ii) When you conduct a performance test or design evaluation for a non-flare control device used to... paragraphs (c)(9)(ii)(A) through (D) of this section. The design evaluation option for small control devices...) of this section. (b) Group status. If a process has batch process vents, as defined in § 63.2550, you...

  7. 40 CFR 63.2460 - What requirements must I meet for batch process vents?

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    .... (ii) When you conduct a performance test or design evaluation for a non-flare control device used to... paragraphs (c)(9)(ii)(A) through (D) of this section. The design evaluation option for small control devices...) of this section. (b) Group status. If a process has batch process vents, as defined in § 63.2550, you...

  8. 40 CFR 63.2460 - What requirements must I meet for batch process vents?

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    .... (ii) When you conduct a performance test or design evaluation for a non-flare control device used to... paragraphs (c)(9)(ii)(A) through (D) of this section. The design evaluation option for small control devices... (c) of this section. (b) Group status. If a process has batch process vents, as defined in § 63.2550...

  9. 40 CFR 63.2460 - What requirements must I meet for batch process vents?

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    .... (ii) When you conduct a performance test or design evaluation for a non-flare control device used to... paragraphs (c)(9)(ii)(A) through (D) of this section. The design evaluation option for small control devices... (c) of this section. (b) Group status. If a process has batch process vents, as defined in § 63.2550...

  10. 40 CFR 63.2460 - What requirements must I meet for batch process vents?

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    .... (ii) When you conduct a performance test or design evaluation for a non-flare control device used to... paragraphs (c)(9)(ii)(A) through (D) of this section. The design evaluation option for small control devices... (c) of this section. (b) Group status. If a process has batch process vents, as defined in § 63.2550...

  11. 40 CFR 63.488 - Methods and procedures for batch front-end process vent group determination.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... engineering principles, measurable process parameters, or physical or chemical laws or properties. Examples of... primary condenser recovering monomer, reaction products, by-products, or solvent from a stripper operated in batch mode, and the primary condenser recovering monomer, reaction products, by-products, or...

  12. Integrating complex business processes for knowledge-driven clinical decision support systems.

    PubMed

    Kamaleswaran, Rishikesan; McGregor, Carolyn

    2012-01-01

    This paper presents in detail the component of the Complex Business Process for Stream Processing framework that is responsible for integrating complex business processes to enable knowledge-driven Clinical Decision Support System (CDSS) recommendations. CDSSs aid the clinician in supporting the care of patients by providing accurate data analysis and evidence-based recommendations. However, the incorporation of a dynamic knowledge-management system that supports the definition and enactment of complex business processes and real-time data streams has not been researched. In this paper we discuss the process web service as an innovative method of providing contextual information to a real-time data stream processing CDSS.

  13. A practical approach for exploration and modeling of the design space of a bacterial vaccine cultivation process.

    PubMed

    Streefland, M; Van Herpen, P F G; Van de Waterbeemd, B; Van der Pol, L A; Beuvery, E C; Tramper, J; Martens, D E; Toft, M

    2009-10-15

    A licensed pharmaceutical process is required to be executed within the validated ranges throughout the lifetime of product manufacturing. Changes to the process, especially for processes involving biological products, usually require the manufacturer to demonstrate that the safety and efficacy of the product remain unchanged, by new or additional clinical testing. Recent changes in the regulations for pharmaceutical processing allow broader ranges of process settings to be submitted for regulatory approval, the so-called process design space, so that a manufacturer can optimize the process within the submitted ranges after the product has entered the market, allowing flexible processes. In this article, the applicability of this concept of the process design space is investigated for the cultivation process step of a vaccine against whooping cough disease. A design of experiments (DoE) is applied to investigate the ranges of critical process parameters that still result in a product that meets specifications. The on-line process data, including near-infrared spectroscopy, are used to build a descriptive model of the processes used in the experimental design. Finally, the data of all processes are integrated in a multivariate batch monitoring model that represents the investigated process design space. This article demonstrates how the general principles of PAT and process design space can be applied to an undefined biological product such as a whole-cell vaccine. The approach chosen for model development described here allows on-line monitoring and control of cultivation batches, in order to assure in real time that a process is running within the process design space.
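
    A minimal sketch of the DoE idea, assuming two coded critical process parameters and a hypothetical quality response; the fitted quadratic model is then used to map the region where predicted quality meets specification, a toy stand-in for the submitted design space:

        import itertools
        import numpy as np

        # Hypothetical 3-level full factorial for two coded factors (-1..+1),
        # e.g. temperature and pH; y is a hypothetical quality response per run.
        levels = [-1, 0, 1]
        X = np.array(list(itertools.product(levels, levels)), float)
        y = np.array([78, 83, 80, 85, 92, 88, 81, 86, 82], float)

        # Quadratic response-surface model:
        # y ~ b0 + b1*x1 + b2*x2 + b12*x1*x2 + b11*x1^2 + b22*x2^2
        A = np.column_stack([np.ones(len(X)), X[:, 0], X[:, 1],
                             X[:, 0] * X[:, 1], X[:, 0]**2, X[:, 1]**2])
        b, *_ = np.linalg.lstsq(A, y, rcond=None)

        # "Design space": coded grid points where predicted response meets spec
        grid = np.linspace(-1, 1, 21)
        ok = [(x1, x2) for x1 in grid for x2 in grid
              if b @ [1, x1, x2, x1 * x2, x1**2, x2**2] >= 85.0]
        print(f"{len(ok)} of {21 * 21} grid points inside the (toy) design space")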

  14. BATCH-GE: Batch analysis of Next-Generation Sequencing data for genome editing assessment

    PubMed Central

    Boel, Annekatrien; Steyaert, Wouter; De Rocker, Nina; Menten, Björn; Callewaert, Bert; De Paepe, Anne; Coucke, Paul; Willaert, Andy

    2016-01-01

    Targeted mutagenesis by the CRISPR/Cas9 system is currently revolutionizing genetics. The ease of this technique has enabled genome engineering in vitro and in a range of model organisms and has pushed experimental dimensions to unprecedented proportions. Due to its tremendous progress in terms of speed, read length, throughput and cost, Next-Generation Sequencing (NGS) has been increasingly used for the analysis of CRISPR/Cas9 genome editing experiments. However, the current tools for genome editing assessment lack flexibility and fall short in the analysis of large amounts of NGS data. Therefore, we designed BATCH-GE, an easy-to-use bioinformatics tool for batch analysis of NGS-generated genome editing data, available from https://github.com/WouterSteyaert/BATCH-GE.git. BATCH-GE detects and reports indel mutations and other precise genome editing events and calculates the corresponding mutagenesis efficiencies for a large number of samples in parallel. Furthermore, this new tool provides flexibility by allowing the user to adapt a number of input variables. The performance of BATCH-GE was evaluated in two genome editing experiments, aiming to generate knock-out and knock-in zebrafish mutants. This tool will not only contribute to the evaluation of CRISPR/Cas9-based experiments, but will be of use in any genome editing experiment and has the ability to analyze data from every organism with a sequenced genome. PMID:27461955
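
    A toy illustration of the efficiency calculation (not BATCH-GE's actual pipeline, which works on aligned NGS reads and user-supplied variables): count reads whose CIGAR string carries an insertion or deletion and report the edited fraction per sample. All sample names and CIGAR strings below are hypothetical.

        import re

        # Hypothetical per-sample CIGAR strings from reads spanning the cut site.
        samples = {
            "ko_fish_1": ["100M", "45M3D55M", "100M", "60M2I38M", "48M12D40M"],
            "ko_fish_2": ["100M", "100M", "70M1D29M", "100M"],
        }

        def has_indel(cigar: str) -> bool:
            # True if any CIGAR operation is an insertion (I) or deletion (D)
            return any(op in "ID"
                       for _, op in re.findall(r"(\d+)([MIDNSHP=X])", cigar))

        for name, cigars in samples.items():
            edited = sum(has_indel(c) for c in cigars)
            print(f"{name}: {edited}/{len(cigars)} reads edited "
                  f"({100 * edited / len(cigars):.0f}% efficiency)")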

  15. Work flow analysis of around-the-clock processing of blood culture samples and integrated MALDI-TOF mass spectrometry analysis for the diagnosis of bloodstream infections.

    PubMed

    Schneiderhan, Wilhelm; Grundt, Alexander; Wörner, Stefan; Findeisen, Peter; Neumaier, Michael

    2013-11-01

    Because sepsis has a high mortality rate, rapid microbiological diagnosis is required to enable efficient therapy. The effectiveness of MALDI-TOF mass spectrometry (MALDI-TOF MS) analysis in reducing turnaround times (TATs) for blood culture (BC) pathogen identification when available in a 24-h hospital setting has not been determined. On the basis of data from a total of 912 positive BCs collected within 140 consecutive days, and work flow analyses of laboratory diagnostics, we evaluated different models to assess the TATs for batch-wise and for immediate-response (real-time) MALDI-TOF MS pathogen identification of positive BC results during the night shifts. The results were compared to TATs from routine BC processing and biochemical identification performed during regular working hours. Continuous BC incubation together with batch-wise MALDI-TOF MS analysis enabled significant reductions of up to 58.7 h in the mean TATs for the reporting of the bacterial species. The mean TAT of batch-wise MALDI-TOF MS analysis was 4.9 h longer than that of the immediate work flow model under ideal conditions with no constraints on staff availability. Together with continuous cultivation of BCs, the 24-h availability of MALDI-TOF MS can reduce the TAT for microbial pathogen identification within a routine clinical laboratory setting. Batch-wise testing of positive BCs loses a few hours compared to real-time identification but is still far superior to classical BC processing. Larger prospective studies are required to evaluate the contribution of rapid around-the-clock pathogen identification to medical decision-making for septicemic patients.
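
    The batch-versus-immediate TAT gap can be reasoned about with a small Monte-Carlo sketch; all timings below are hypothetical and chosen only to show the mechanics, not to reproduce the study's 4.9 h figure:

        import numpy as np

        # Positive cultures flag at random times in an 8 h night shift.
        # Batch runs start every 4 h; immediate runs start at flagging time.
        rng = np.random.default_rng(0)
        flag = rng.uniform(0, 8, 10_000)        # h after start of night shift
        analysis = 0.5                          # h per MALDI-TOF identification
        batch_starts = np.array([4.0, 8.0])

        immediate_tat = analysis
        next_batch = batch_starts[np.searchsorted(batch_starts, flag)]
        batch_tat = (next_batch - flag) + analysis
        print(f"mean extra TAT of batching: {batch_tat.mean() - immediate_tat:.2f} h")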

  16. Pharmaceutical Product Lead Optimization for Better In vivo Bioequivalence Performance: A case study of Diclofenac Sodium Extended Release Matrix Tablets.

    PubMed

    Shahiwala, Aliasgar; Zarar, Aisha

    2018-01-01

    In order to prove the validity of a new formulation, a considerable amount of effort is required to study bioequivalence, which not only increases the burden of carrying out a number of bioequivalence studies but also eventually increases the cost of the optimization process. The aim of the present study was to develop sustained-release matrix tablets containing diclofenac sodium using natural polymers, and to demonstrate the step-by-step process of product development up to the prediction of in vivo equivalence of the developed product with the marketed product. Different batches of tablets were prepared by direct compression. In vitro drug release studies were performed as per USP. The drug release data were assessed using model-dependent, model-independent and convolution approaches. Drug release profiles showed that extended-release action followed the order: Gum Tragacanth > Sodium Alginate > Gum Acacia. Amongst the different batches prepared, only F1 and F8 passed the USP criteria for drug release. The developed formulas were found to fit the Higuchi kinetics model with a Fickian (case I) diffusion-mediated release mechanism. Model-independent analysis confirmed that a total of four batches passed, based on similarity factors computed by comparison with the marketed diclofenac product. The results of the in vivo predictive convolution model indicated that the predicted AUC, Cmax and Tmax values for batch F8 were similar to those of the marketed product. This study provides a simple yet effective outline of the pharmaceutical product development process that will minimize formulation development trials and maximize product success in bioequivalence studies. Copyright © Bentham Science Publishers; For any queries, please email at epub@benthamscience.org.
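
    The similarity factor behind the model-independent comparison is the standard f2 statistic; a minimal sketch, with hypothetical dissolution profiles standing in for the reference and test batches:

        import numpy as np

        # Similarity factor f2 (FDA/EMA convention): profiles are % released
        # at common time points; f2 >= 50 is conventionally read as "similar".
        def f2(ref, test):
            ref, test = np.asarray(ref, float), np.asarray(test, float)
            msd = np.mean((ref - test) ** 2)     # mean squared difference
            return 50.0 * np.log10(100.0 / np.sqrt(1.0 + msd))

        ref = [18, 32, 47, 60, 71, 82]   # hypothetical marketed product
        f8  = [16, 30, 45, 62, 73, 80]   # hypothetical batch F8
        score = f2(ref, f8)
        print(f"f2 = {score:.1f} -> {'similar' if score >= 50 else 'not similar'}")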

  17. Optical Tracking Data Validation and Orbit Estimation for Sparse Observations of Satellites by the OWL-Net.

    PubMed

    Choi, Jin; Jo, Jung Hyun; Yim, Hong-Suh; Choi, Eun-Jung; Cho, Sungki; Park, Jang-Hyun

    2018-06-07

    An Optical Wide-field patroL-Network (OWL-Net) has been developed for maintaining the orbital ephemerides of Korean low Earth orbit (LEO) satellites. The OWL-Net consists of five optical tracking stations. Brightness signals of sunlight reflected by the targets were detected with a charge-coupled device (CCD). A chopper system was adopted for fast astrometric data sampling, at a maximum of 50 Hz, within a short observation time. The astrometric accuracy of the optical observation data was validated against precise orbital ephemerides such as Consolidated Prediction File (CPF) data and precise orbit determination results based on onboard Global Positioning System (GPS) data from the target satellite. In the optical observation simulation of the OWL-Net for 2017, the average observation span for a single arc of 11 LEO observation targets was about 5 min, while the average separation between optical observations was 5 h. We estimated the position and velocity, together with an atmospheric drag coefficient, of the LEO observation targets using a sequential-batch orbit estimation technique applied after multi-arc batch orbit estimation. Post-fit residuals for the multi-arc batch and sequential-batch orbit estimations were analyzed for the optical measurements and the reference orbits (CPF and GPS data). The post-fit residuals against the reference orbits show errors of a few tens of meters in the in-track direction for both the multi-arc batch and sequential-batch orbit estimation results.

  18. Framework for developing hybrid process-driven, artificial neural network and regression models for salinity prediction in river systems

    NASA Astrophysics Data System (ADS)

    Hunter, Jason M.; Maier, Holger R.; Gibbs, Matthew S.; Foale, Eloise R.; Grosvenor, Naomi A.; Harders, Nathan P.; Kikuchi-Miller, Tahali C.

    2018-05-01

    Salinity modelling in river systems is complicated by a number of processes, including in-stream salt transport and various mechanisms of saline accession that vary dynamically as a function of water level and flow, often at different temporal scales. Traditionally, salinity models in rivers have either been process- or data-driven. The primary problem with process-based models is that in many instances, not all of the underlying processes are fully understood or able to be represented mathematically. There are also often insufficient historical data to support model development. The major limitation of data-driven models, such as artificial neural networks (ANNs) in comparison, is that they provide limited system understanding and are generally not able to be used to inform management decisions targeting specific processes, as different processes are generally modelled implicitly. In order to overcome these limitations, a generic framework for developing hybrid process and data-driven models of salinity in river systems is introduced and applied in this paper. As part of the approach, the most suitable sub-models are developed for each sub-process affecting salinity at the location of interest based on consideration of model purpose, the degree of process understanding and data availability, which are then combined to form the hybrid model. The approach is applied to a 46 km reach of the Murray River in South Australia, which is affected by high levels of salinity. In this reach, the major processes affecting salinity include in-stream salt transport, accession of saline groundwater along the length of the reach and the flushing of three waterbodies in the floodplain during overbank flows of various magnitudes. Based on trade-offs between the degree of process understanding and data availability, a process-driven model is developed for in-stream salt transport, an ANN model is used to model saline groundwater accession and three linear regression models are used to account for the flushing of the different floodplain storages. The resulting hybrid model performs very well on approximately 3 years of daily validation data, with a Nash-Sutcliffe efficiency (NSE) of 0.89 and a root mean squared error (RMSE) of 12.62 mg L-1 (over a range from approximately 50 to 250 mg L-1). Each component of the hybrid model results in noticeable improvements in model performance corresponding to the range of flows for which they are developed. The predictive performance of the hybrid model is significantly better than that of a benchmark process-driven model (NSE = -0.14, RMSE = 41.10 mg L-1, Gbench index = 0.90) and slightly better than that of a benchmark data-driven (ANN) model (NSE = 0.83, RMSE = 15.93 mg L-1, Gbench index = 0.36). Apart from improved predictive performance, the hybrid model also has advantages over the ANN benchmark model in terms of increased capacity for improving system understanding and greater ability to support management decisions.
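
    The two reported scores are standard and easy to reproduce; a minimal sketch with hypothetical observed and simulated salinity series (the Gbench index is analogous, with the benchmark model's squared errors in the denominator):

        import numpy as np

        # Nash-Sutcliffe efficiency and RMSE as used to score the salinity models.
        def nse(obs, sim):
            obs, sim = np.asarray(obs, float), np.asarray(sim, float)
            return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

        def rmse(obs, sim):
            obs, sim = np.asarray(obs, float), np.asarray(sim, float)
            return np.sqrt(np.mean((obs - sim) ** 2))

        obs = np.array([110.0, 95.0, 130.0, 160.0, 120.0])  # hypothetical, mg/L
        sim = np.array([105.0, 101.0, 124.0, 150.0, 118.0])
        print(f"NSE = {nse(obs, sim):.2f}, RMSE = {rmse(obs, sim):.1f} mg/L")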

  19. Toward Higher QA: From Parametric Release of Sterile Parenteral Products to PAT for Other Pharmaceutical Dosage Forms.

    PubMed

    Hock, Sia Chong; Constance, Neo Xue Rui; Wah, Chan Lai

    2012-01-01

    Pharmaceutical products are generally subjected to end-product batch testing as a means of quality control. Due to the inherent limitations of conventional batch testing, this is not the most ideal approach for determining the pharmaceutical quality of the finished dosage form. In the case of terminally sterilized parenteral products, the limitations of conventional batch testing have been successfully addressed with the application of parametric release (the release of a product based on control of process parameters instead of batch sterility testing at the end of the manufacturing process). Consequently, there has been an increasing interest in applying parametric release to other pharmaceutical dosage forms, beyond terminally sterilized parenteral products. For parametric release to be possible, manufacturers must be capable of designing quality into the product, monitoring the manufacturing processes, and controlling the quality of intermediates and finished products in real-time. Process analytical technology (PAT) has been thought to be capable of contributing to these prerequisites. It is believed that the appropriate use of PAT tools can eventually lead to the possibility of real-time release of other pharmaceutical dosage forms, by-passing the need for end-product batch testing. Hence, this literature review attempts to present the basic principles of PAT, introduce the various PAT tools that are currently available, present their recent applications to pharmaceutical processing, and explain the potential benefits that PAT can bring to conventional ways of processing and quality assurance of pharmaceutical products. Last but not least, current regulations governing the use of PAT and the manufacturing challenges associated with PAT implementation are also discussed.

  20. Sludge Washing and Demonstration of the DWPF Nitric/Formic Flowsheet in the SRNL Shielded Cells for Sludge Batch 9 Qualification

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pareizs, J.; Newell, D.; Martino, C.

    Savannah River National Laboratory (SRNL) was requested by Savannah River Remediation (SRR) to qualify the next batch of sludge – Sludge Batch 9 (SB9). Current practice is to prepare sludge batches in Tank 51 by transferring sludge to Tank 51 from other tanks. The sludge is washed and transferred to Tank 40, the current Defense Waste Process Facility (DWPF) feed tank. Prior to sludge transfer from Tank 51 to Tank 40, the Tank 51 sludge must be qualified. SRNL qualifies the sludge in multiple steps. First, a Tank 51 sample is received, then characterized, washed, and again characterized. SRNL then demonstrates the DWPF Chemical Process Cell (CPC) flowsheet with the sludge. The final step of qualification involves chemical durability measurements of glass fabricated in the DWPF CPC demonstrations. In past sludge batches, SRNL had completed the DWPF demonstration with Tank 51 sludge. For SB9, SRNL has been requested to process a blend of Tank 51 and Tank 40 at a targeted ratio of 44% Tank 51 and 56% Tank 40 on an insoluble solids basis.

  1. Electrochemical oxidation of ampicillin antibiotic at boron-doped diamond electrodes and process optimization using response surface methodology.

    PubMed

    Körbahti, Bahadır K; Taşyürek, Selin

    2015-03-01

    Electrochemical oxidation and process optimization of ampicillin antibiotic at boron-doped diamond electrodes (BDD) were investigated in a batch electrochemical reactor. The influence of operating parameters, such as ampicillin concentration, electrolyte concentration, current density, and reaction temperature, on ampicillin removal, COD removal, and energy consumption was analyzed in order to optimize the electrochemical oxidation process under specified cost-driven constraints using response surface methodology. Quadratic models for the responses satisfied the assumptions of the analysis of variance well according to normal probability, studentized residuals, and outlier t residual plots. Residual plots followed a normal distribution, and outlier t values indicated that the approximations of the fitted models to the quadratic response surfaces were very good. Optimum operating conditions were determined at 618 mg/L ampicillin concentration, 3.6 g/L electrolyte concentration, 13.4 mA/cm² current density, and 36 °C reaction temperature. Under response surface optimized conditions, ampicillin removal, COD removal, and energy consumption were obtained as 97.1 %, 92.5 %, and 71.7 kWh/kg CODr, respectively.
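
    Once a quadratic response surface has been fitted, its stationary point can be located analytically by setting the gradient to zero; a minimal sketch with hypothetical coefficients in coded units (not the paper's fitted model):

        import numpy as np

        # For y = b0 + b.x + x'Bx (B symmetric), the stationary point solves
        # 2Bx + b = 0. Hypothetical coefficients for two coded factors, e.g.
        # current density and temperature.
        b0 = 90.0
        b = np.array([2.0, 1.5])                  # linear terms
        B = np.array([[-3.0, 0.4],                # quadratic/interaction terms
                      [0.4, -2.0]])
        x_opt = np.linalg.solve(-2.0 * B, b)
        y_opt = b0 + b @ x_opt + x_opt @ B @ x_opt
        print(f"stationary point {x_opt}, predicted response {y_opt:.1f}")
        # B is negative definite here, so the point is a maximum; check the
        # eigenvalues of B before trusting a fitted surface's optimum.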

  2. Request queues for interactive clients in a shared file system of a parallel computing system

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bent, John M.; Faibish, Sorin

    Interactive requests are processed from users of log-in nodes. A metadata server node is provided for use in a file system shared by one or more interactive nodes and one or more batch nodes. The interactive nodes comprise interactive clients to execute interactive tasks and the batch nodes execute batch jobs for one or more batch clients. The metadata server node comprises a virtual machine monitor; an interactive client proxy to store metadata requests from the interactive clients in an interactive client queue; a batch client proxy to store metadata requests from the batch clients in a batch client queue; and a metadata server to store the metadata requests from the interactive client queue and the batch client queue in a metadata queue based on an allocation of resources by the virtual machine monitor. The metadata requests can be prioritized, for example, based on one or more of a predefined policy and predefined rules.
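
    The queueing arrangement in the claim can be sketched with a single prioritized heap; the policy below (interactive requests outrank batch requests, FIFO within a class) is an assumption for illustration, not the patent's specific rules:

        import heapq
        import itertools

        # One prioritized metadata queue fed from an interactive and a batch class.
        counter = itertools.count()     # tie-breaker preserving FIFO within a class
        metadata_queue = []

        def enqueue(request, interactive):
            priority = 0 if interactive else 1      # lower value is served first
            heapq.heappush(metadata_queue, (priority, next(counter), request))

        enqueue("stat /home/alice", interactive=True)
        enqueue("create /scratch/job42/out", interactive=False)
        enqueue("readdir /home/alice", interactive=True)
        while metadata_queue:
            _, _, request = heapq.heappop(metadata_queue)
            print("serving:", request)   # both interactive requests drain first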

  3. High-concentration sugars production from corn stover based on combined pretreatments and fed-batch process.

    PubMed

    Yang, Maohua; Li, Wangliang; Liu, Binbin; Li, Qiang; Xing, Jianmin

    2010-07-01

    In this paper, high-concentration sugars were produced from pretreated corn stover. The raw corn stover was pretreated by a process combining steam explosion and alkaline hydrogen peroxide. The hemicellulose and lignin were largely removed, and the cellulose content increased to 73.2%. Fed-batch enzymatic hydrolysis was initiated at 12% (w/v) solids loading and 20 FPU/g solids; an additional 6% solids were fed at 12, 36 and 60 h. After 144 h, the final concentrations of reducing sugar, glucose, cellobiose and xylose reached 220, 175, 22 and 20 g/L, respectively. The final total biomass conversion was 60% in the fed-batch process. Copyright 2009 Elsevier Ltd. All rights reserved.

  4. Data quality and processing for decision making: divergence between corporate strategy and manufacturing processes

    NASA Astrophysics Data System (ADS)

    McNeil, Ronald D.; Miele, Renato; Shaul, Dennis

    2000-10-01

    Information technology is driving improvements in manufacturing systems, with higher productivity and quality as the results. However, corporate strategy is driven by a number of factors and includes data and pressure from multiple stakeholders, including employees, managers, executives, stockholders, boards, suppliers and customers. It is also driven by information about competitors and emerging technology. Much information is based on the processing of data and the resulting biases of the processors. Thus, stakeholders can base inputs on faulty perceptions that are not reality-based. Prior to processing, the data used may be inaccurate. Sources of data and information may include demographic reports, statistical analyses, intelligence reports (e.g., marketing data), technology and primary data collection. The reliability and validity of data, as well as the management of sources and information, are critical elements of strategy formulation. The paper explores data collection, processing and analyses from secondary and primary sources, information generation and report presentation for strategy formulation, and contrasts this with the data and information utilized to drive internal processes such as manufacturing. The hypothesis is that internal processes, such as manufacturing, are subordinate to corporate strategies. The impact of possible divergence in the quality of decisions at the corporate level on IT-driven, quality-manufacturing processes based on measurable outcomes is significant. Recommendations for IT improvements at the corporate strategy level are given.

  5. Tank 241-AY-101 Privatization Push Mode Core Sampling and Analysis Plan

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    TEMPLETON, A.M.

    2000-01-12

    This sampling and analysis plan (SAP) identifies characterization objectives pertaining to sample collection, laboratory analytical evaluation, and reporting requirements for samples obtained from tank 241-AY-101. The purpose of this sampling event is to obtain information about the characteristics of the contents of 241-AY-101 required to satisfy Data Quality Objectives For RPP Privatization Phase I: Confirm Tank T Is An Appropriate Feed Source For High-Level Waste Feed Batch X (HLW DQO) (Nguyen 1999a), Data Quality Objectives For TWRS Privatization Phase I: Confirm Tank T Is An Appropriate Feed Source For Low-Activity Waste Feed Batch X (LAW DQO) (Nguyen 1999b), Low Activity Waste and High-Level Waste Feed Data Quality Objectives (L and H DQO) (Patello et al. 1999), and Characterization Data Needs for Development, Design, and Operation of Retrieval Equipment Developed through the Data Quality Objective Process (Equipment DQO) (Bloom 1996). Special instructions regarding support to the LAW and HLW DQOs are provided by Baldwin (1999). Push mode core samples will be obtained from risers 15G and 150 to provide sufficient material for the chemical analyses and tests required to satisfy these data quality objectives. The 222-S Laboratory will extrude core samples; composite the liquids and solids; perform chemical analyses on composite and segment samples; archive half-segment samples; and provide subsamples to the Process Chemistry Laboratory. The Process Chemistry Laboratory will prepare test plans and perform process tests to evaluate the behavior of the 241-AY-101 waste undergoing the retrieval and treatment scenarios defined in the applicable DQOs. Requirements for analyses of samples originating in the process tests will be documented in the corresponding test plans and are not within the scope of this SAP.

  6. Dynamic Resource Allocation to Improve Service Performance in Order Fulfillment Systems

    DTIC Science & Technology

    2009-01-01

    efficient system uses economies of scale at two points: orders are batched before processing, which reduces processing costs, and processed orders ... the effects of batching on order picking processes are well-researched and well-understood (van den Berg and Gademann, 1999). Because orders are ... a final sojourn time distribution. Our work builds on existing research in matrix-geometric methods by Neuts (1981), Asmussen and Møller (2001

  7. Scenario driven data modelling: a method for integrating diverse sources of data and data streams

    PubMed Central

    2011-01-01

    Background Biology is rapidly becoming a data intensive, data-driven science. It is essential that data is represented and connected in ways that best represent its full conceptual content and allows both automated integration and data driven decision-making. Recent advancements in distributed multi-relational directed graphs, implemented in the form of the Semantic Web, make it possible to deal with complicated heterogeneous data in new and interesting ways. Results This paper presents a new approach, scenario driven data modelling (SDDM), that integrates multi-relational directed graphs with data streams. SDDM can be applied to virtually any data integration challenge with widely divergent types of data and data streams. In this work, we explored integrating genetics data with reports from traditional media. SDDM was applied to the New Delhi metallo-beta-lactamase gene (NDM-1), an emerging global health threat. The SDDM process constructed a scenario, created a RDF multi-relational directed graph that linked diverse types of data to the Semantic Web, implemented RDF conversion tools (RDFizers) to bring content into the Semantic Web, identified data streams and analytical routines to analyse those streams, and identified user requirements and graph traversals to meet end-user requirements. Conclusions We provided an example where SDDM was applied to a complex data integration challenge. The process created a model of the emerging NDM-1 health threat, identified and filled gaps in that model, and constructed reliable software that monitored data streams based on the scenario derived multi-relational directed graph. The SDDM process significantly reduced the software requirements phase by letting the scenario and resulting multi-relational directed graph define what is possible and then set the scope of the user requirements. Approaches like SDDM will be critical to the future of data intensive, data-driven science because they automate the process of converting massive data streams into usable knowledge. PMID:22165854
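
    A minimal sketch of the multi-relational directed graph step, using the rdflib library; all URIs, predicates and the traversal below are hypothetical stand-ins for the NDM-1 scenario graph:

        from rdflib import Graph, Literal, Namespace

        # Link a resistance gene to a news report in one RDF graph.
        EX = Namespace("http://example.org/sddm/")
        g = Graph()
        gene = EX["NDM-1"]
        report = EX["report/2010-08-11"]
        g.add((gene, EX.encodedIn, EX["Klebsiella_pneumoniae"]))
        g.add((report, EX.mentions, gene))
        g.add((report, EX.source, Literal("traditional media")))

        # Graph traversal answering an end-user requirement:
        # which reports mention the gene?
        for subj in g.subjects(EX.mentions, gene):
            print("report mentioning NDM-1:", subj)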

  8. 40 CFR 63.1323 - Batch process vents-methods and procedures for group determination.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... accepted chemical engineering principles, measurable process parameters, or physical or chemical laws or... paragraph (b)(5) of this section. Engineering assessment may be used to estimate emissions from a batch... defined in paragraph (b)(5) of this section, through engineering assessment, as defined in paragraph (b)(6...

  9. 40 CFR 63.1323 - Batch process vents-methods and procedures for group determination.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... accepted chemical engineering principles, measurable process parameters, or physical or chemical laws or... paragraph (b)(5) of this section. Engineering assessment may be used to estimate emissions from a batch... defined in paragraph (b)(5) of this section, through engineering assessment, as defined in paragraph (b)(6...

  10. 40 CFR 63.1323 - Batch process vents-methods and procedures for group determination.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... accepted chemical engineering principles, measurable process parameters, or physical or chemical laws or... paragraph (b)(5) of this section. Engineering assessment may be used to estimate emissions from a batch... defined in paragraph (b)(5) of this section, through engineering assessment, as defined in paragraph (b)(6...

  11. 40 CFR 63.1323 - Batch process vents-methods and procedures for group determination.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... accepted chemical engineering principles, measurable process parameters, or physical or chemical laws or... paragraph (b)(5) of this section. Engineering assessment may be used to estimate emissions from a batch... defined in paragraph (b)(5) of this section, through engineering assessment, as defined in paragraph (b)(6...

  12. Fermentation of Saccharomyces cerevisiae - Combining kinetic modeling and optimization techniques points out avenues to effective process design.

    PubMed

    Scheiblauer, Johannes; Scheiner, Stefan; Joksch, Martin; Kavsek, Barbara

    2018-09-14

    A combined experimental/theoretical approach is presented for improving the predictability of Saccharomyces cerevisiae fermentations. In particular, a mathematical model was developed that explicitly takes into account the main mechanisms of the fermentation process, allowing for continuous computation of key process variables, including the biomass concentration and the respiratory quotient (RQ). For model calibration and experimental validation, batch and fed-batch fermentations were carried out. Comparison of the model-predicted biomass concentrations and RQ developments with the corresponding experimentally recorded values shows remarkably good agreement for both batch and fed-batch processes, confirming the adequacy of the model. Furthermore, sensitivity studies were performed in order to identify model parameters whose variations have significant effects on the model predictions: our model responds with significant sensitivity to variations of only six parameters. These studies provide a valuable basis for model reduction, as also demonstrated in this paper. Finally, optimization-based parametric studies demonstrate how our model can be utilized for improving the efficiency of Saccharomyces cerevisiae fermentations. Copyright © 2018 Elsevier Ltd. All rights reserved.
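
    A stripped-down flavor of such a kinetic model, reduced to Monod growth on glucose (the paper's model is far richer, covering the main fermentation mechanisms and the RQ); parameters below are hypothetical but typical:

        import numpy as np
        from scipy.integrate import solve_ivp

        # X = biomass (g/L), S = glucose (g/L); simple Monod batch kinetics.
        mu_max, Ks, Yxs = 0.4, 0.2, 0.5   # 1/h, g/L, g biomass per g glucose

        def rhs(t, y):
            X, S = y
            mu = mu_max * S / (Ks + S)    # Monod specific growth rate
            return [mu * X, -mu * X / Yxs]

        sol = solve_ivp(rhs, (0, 20), [0.1, 20.0], dense_output=True)
        for ti in np.linspace(0, 20, 5):
            X, S = sol.sol(ti)
            print(f"t={ti:5.1f} h  X={X:6.2f} g/L  S={max(S, 0):6.2f} g/L")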

  13. Method and apparatus for melting glass batch

    DOEpatents

    Fassbender, Alexander G.; Walkup, Paul C.; Mudge, Lyle K.

    1988-01-01

    A glass melting system involving preheating, precalcining, and prefluxing of batch materials prior to injection into a glass furnace. The precursors are heated by convection rather than by radiation, as in present furnaces. Upon injection into the furnace, batch materials are intimately coated with molten flux so as to undergo, or at least begin, the dissolution reaction prior to entering the melt pool.

  14. Kinetics of sugars consumption and ethanol inhibition in carob pulp fermentation by Saccharomyces cerevisiae in batch and fed-batch cultures.

    PubMed

    Lima-Costa, Maria Emília; Tavares, Catarina; Raposo, Sara; Rodrigues, Brígida; Peinado, José M

    2012-05-01

    The waste materials from the carob processing industry are a potential resource for second-generation bioethanol production. These by-products are small carob kibbles with a high content of soluble sugars (45-50%). Batch and fed-batch Saccharomyces cerevisiae fermentations of high density sugar from carob pods were analyzed in terms of the kinetics of sugars consumption and ethanol inhibition. In all the batch runs, 90-95% of the total sugar was consumed and transformed into ethanol with a yield close to the theoretical maximum (0.47-0.50 g/g), and a final ethanol concentration of 100-110 g/l. In fed-batch runs, fresh carob extract was added when glucose had been consumed. This addition and the subsequent decrease of ethanol concentrations by dilution increased the final ethanol production up to 130 g/l. It seems that invertase activity and yeast tolerance to ethanol are the main factors to be controlled in carob fermentations. The efficiency of highly concentrated carob fermentation makes it a very promising process for use in a second-generation ethanol biorefinery.

  15. On-line monitoring of fluid bed granulation by photometric imaging.

    PubMed

    Soppela, Ira; Antikainen, Osmo; Sandler, Niklas; Yliruusi, Jouko

    2014-11-01

    This paper introduces and discusses a photometric surface imaging approach for on-line monitoring of fluid bed granulation. Five granule batches consisting of paracetamol and varying amounts of lactose and microcrystalline cellulose were manufactured with an instrumented fluid bed granulator. Photometric images and NIR spectra were continuously captured on-line, and particle size information was extracted from them. Key process parameters were also recorded. The images provided direct real-time information on the growth, attrition and packing behaviour of the batches. Moreover, decreasing image brightness in the drying phase was found to indicate granule drying. The changes observed in the image data were also linked to the moisture and temperature profiles of the processes. Combined with complementary process analytical tools, photometric imaging opens up possibilities for improved real-time evaluation of fluid bed granulation. Furthermore, images can give valuable insight into the behaviour of excipients or formulations during product development. Copyright © 2014 Elsevier B.V. All rights reserved.

  16. A model of strength

    USGS Publications Warehouse

    Johnson, Douglas H.; Cook, R.D.

    2013-01-01

    In her AAAS News & Notes piece "Can the Southwest manage its thirst?" (26 July, p. 362), K. Wren quotes Ajay Kalra, who advocates a particular method for predicting Colorado River streamflow "because it eschews complex physical climate models for a statistical data-driven modeling approach." A preference for data-driven models may be appropriate in this individual situation, but it is not so generally. Data-driven models often come with a warning against extrapolating beyond the range of the data used to develop the models. When the future is like the past, data-driven models can work well for prediction, but it is easy to over-model local or transient phenomena, often leading to predictive inaccuracy (1). Mechanistic models are built on established knowledge of the process that connects the response variables with the predictors, using information obtained outside of an extant data set. One may shy away from a mechanistic approach when the underlying process is judged to be too complicated, but good predictive models can be constructed with statistical components that account for ingredients missing in the mechanistic analysis. Models with sound mechanistic components are more generally applicable and robust than data-driven models.

  17. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Newell, J; Miller, D; Stone, M

    The Savannah River National Laboratory (SRNL) was tasked to provide an assessment of the downstream impacts to the Defense Waste Processing Facility (DWPF) of decisions regarding the implementation of Al-dissolution to support sludge mass reduction and processing. Based on future sludge batch compositional projections from the Liquid Waste Organization's (LWO) sludge batch plan, assessments have been made with respect to the ability to maintain comparable projected operating windows for sludges with and without Al-dissolution. As part of that previous assessment, candidate frits were identified to provide insight into melt rate for average sludge batches representing the with and without Al-dissolution flowsheets. Initial melt rate studies using the melt rate furnace (MRF) were performed using five frits each for Cluster 2 and Cluster 4 compositions, representing the average without and with Al-dissolution, respectively. It was determined, however, that the REDOX endpoint (Fe²⁺/ΣFe for the glass) for Clusters 2 and 4 resulted in an overly oxidized feed, which negatively affected the initial melt rate tests. After the sludge was adjusted to a more reduced state, additional testing was performed with frits that contained both high and low concentrations of sodium and boron oxides. These frits were selected strictly based on the ability to ascertain compositional trends in melt rate and did not necessarily satisfy any acceptability criteria for DWPF processing. The melt rate data are in general agreement with historical trends observed at SRNL and during processing of SB3 (Sludge Batch 3) and SB4 in DWPF. When MAR acceptability criteria were applied, Frit 510 was seen to have the highest melt rate at 0.67 in/hr for Cluster 2 (without Al-dissolution), which is compositionally similar to SB4. For Cluster 4 (with Al-dissolution), which is compositionally similar to SB3, Frit 418 had the highest melt rate at 0.63 in/hr. Based on these data, there appears to be a slight advantage of the Frit 510 based system without Al-dissolution relative to the Frit 418 based system with Al-dissolution. Though the without-Al-dissolution scenario suggests a slightly higher melt rate with Frit 510, several points must be taken into consideration: (1) The MRF does not have the ability to assess liquid feeds and, thus, rheology impacts. Instead, the MRF is a 'static' test bed in which a mass of dried melter feed (SRAT product plus frit) is placed in an 'isothermal' furnace for a period of time to assess melt rate. These conditions, although historically effective in terms of identifying candidate frits for specific sludge batches and mapping out melt rate versus waste loading trends, do not allow for assessments of the potential impact of feed rheology on melt rate. That is, if the rheological properties of the slurried melter feed resulted in mounding of the feed in the melter (i.e., the melter feed was thick and did not flow across the cold cap), melt rate and/or melter operations (i.e., surges) could be negatively impacted. This could affect one or both flowsheets. (2) Waste throughput factors were not determined for Frit 510 and Frit 418 over multiple waste loadings. In order to provide insight into the mission life versus canister count question, one needs to define the maximum waste throughput for both flowsheets. Due to funding limitations, the melt rate testing only evaluated melt rate at a fixed waste loading. (3) DWPF will be processing SB5 through the facility in mid-November 2008. Insight into the overarching questions of melt rate, waste throughput, and mission life can be obtained directly from the facility. It is recommended that processing of SB5 through the facility be monitored closely and that those data be used as input into the decision-making process on whether to implement Al-dissolution for future sludge batches.

  18. Results Of Initial Analyses Of The Salt (Macro) Batch 9 Tank 21H Qualification Samples

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Peters, T.

    2015-10-08

    Savannah River National Laboratory (SRNL) analyzed samples from Tank 21H in support of qualification of Interim Salt Disposition Project (ISDP) Salt (Macro) Batch 9 for processing through the Actinide Removal Process (ARP) and the Modular Caustic-Side Solvent Extraction Unit (MCU). This document reports the initial results of the analyses of samples of Tank 21H. Analysis of the Tank 21H Salt (Macro) Batch 9 composite sample indicates that the material does not display any unusual characteristics. Further results on the chemistry and other tests will be issued in the future.

  19. Approach to design space from retrospective quality data.

    PubMed

    Puñal Peces, Daniel; García-Montoya, Encarna; Manich, Albert; Suñé-Negre, Josep Maria; Pérez-Lozano, Pilar; Miñarro, Montse; Ticó, Josep Ramon

    2016-01-01

    Nowadays, the entire manufacturing process is based on the current GMPs, which emphasize the reproducibility of the process, and companies hold large amounts of recorded data about their processes. The objective of this work was the establishment of the design space (DS) from retrospective data for a wet compression process. A design of experiments (DoE) with historical data from 4 years of industrial production has been carried out, using as experimental factors the results of a previous risk analysis, and as responses eight key parameters (quality specifications) that encompassed process and quality-control data. Software Statgraphics 5.0 was applied, and the data were processed to obtain eight DSs as well as their safe and working ranges. Experience shows that it is possible to determine the DS retrospectively, the greatest difficulty being the handling and processing of large amounts of data; the practical interest of this approach is considerable, however, as it yields the DS with minimal investment in experiments, since actual production batch data are processed statistically.

  20. Process performance of high-solids batch anaerobic digestion of sewage sludge.

    PubMed

    Liao, Xiaocong; Li, Huan; Cheng, Yingchao; Chen, Nan; Li, Chenchen; Yang, Yuning

    2014-01-01

    The characteristics of high-solids anaerobic digestion (AD) of sewage sludge were investigated by comparison with conventional low-solids processes. A series of batch experiments were conducted under mesophilic conditions, with the initial solid contents controlled at four levels: 1.79%, 4.47%, 10.28% and 15.67%. During these experiments, biogas production, organic degradation and intermediate products were monitored. The results verified that high-solids batch AD of sewage sludge is feasible. Compared with low-solids AD at solid contents of 1.79% or 4.47%, the high-solids processes decreased the specific biogas yield per gram of sludge volatile solids slightly and achieved the same organic degradation rate of about 40% within an extended degradation time, but increased the volumetric biogas production rate and the treatment capability of the digesters significantly. Restricted mass and energy transfer, a low substrate-to-inoculum ratio and excessive accumulated free ammonia were the main factors limiting the performance of high-solids batch AD.

  1. Systematic optimization of fed-batch simultaneous saccharification and fermentation at high-solid loading based on enzymatic hydrolysis and dynamic metabolic modeling of Saccharomyces cerevisiae.

    PubMed

    Unrean, Pornkamol; Khajeeram, Sutamat; Laoteng, Kobkul

    2016-03-01

    An integrative simultaneous saccharification and fermentation (SSF) model is a useful guiding tool for rapid process optimization to meet the techno-economic requirements of industrial-scale lignocellulosic ethanol production. In this work, we have developed an SSF model comprising a metabolic network of the Saccharomyces cerevisiae cell coupled with fermentation kinetics and an enzyme hydrolysis model, to quantitatively capture the dynamic responses of yeast cell growth and fermentation during SSF. By using model-based design of the feeding profiles for substrate and yeast cells in the fed-batch SSF process, efficient ethanol production with a high titer of up to 65 g/L and a high yield of 85% of the theoretical maximum was accomplished. The ethanol titer and productivity were increased by 47% and 41%, respectively, in the optimized fed-batch SSF as compared to the batch process. The developed integrative SSF model is, therefore, a promising approach for the systematic design of economical and sustainable SSF bioprocessing of lignocellulose.

  2. Adsorptive Removal of Cadmium (II) from Aqueous Solution by Multi-Carboxylic-Functionalized Silica Gel: Equilibrium, Kinetics and Thermodynamics

    NASA Astrophysics Data System (ADS)

    Li, Min; Meng, Xiaojing; Yuan, Jinhai; Deng, Wenwen; Liang, Xiuke

    2018-01-01

    In the present study, the adsorption behavior of cadmium (II) ion from aqueous solution onto multi-carboxylic-functionalized silica gel (SG-MCF) has been investigated in detail by means of batch and column experiments. Batch experiments were performed to evaluate the effects of various experimental parameters, such as pH value, contact time and initial concentration, on the adsorption capacity for cadmium (II) ion. The kinetic data were analyzed on the basis of the pseudo-first-order and pseudo-second-order kinetic models; the pseudo-second-order model describes the adsorption process better than the pseudo-first-order model. Equilibrium isotherms for the adsorption of cadmium (II) ion were analyzed by the Freundlich and Langmuir isotherm models, and the results indicate that the Langmuir model better expresses the data for the adsorption of cadmium (II) ion from aqueous solution onto the SG-MCF. Various thermodynamic parameters of the adsorption process, including the free energy of adsorption (ΔG°), the enthalpy of adsorption (ΔH°) and the standard entropy change (ΔS°), were calculated to predict the nature of the adsorption. The positive value of the enthalpy change and the negative value of the free energy change indicate that the adsorption is an endothermic and spontaneous process.
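
    The isotherm fit described above is straightforward to reproduce; a minimal sketch fitting the Langmuir form to hypothetical equilibrium data:

        import numpy as np
        from scipy.optimize import curve_fit

        # Langmuir isotherm: q = q_max * K_L * Ce / (1 + K_L * Ce)
        # Ce = equilibrium concentration (mg/L), q = uptake (mg/g); data hypothetical.
        def langmuir(Ce, q_max, K_L):
            return q_max * K_L * Ce / (1.0 + K_L * Ce)

        Ce = np.array([5, 10, 20, 40, 80, 160], float)
        q = np.array([12, 21, 33, 45, 55, 60], float)
        (q_max, K_L), _ = curve_fit(langmuir, Ce, q, p0=[60, 0.05])
        print(f"q_max = {q_max:.1f} mg/g, K_L = {K_L:.3f} L/mg")
        # Thermodynamics then follow from van't Hoff: ln K vs 1/T gives dH°,
        # and dG° = -R * T * ln K at each temperature.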

  3. Improved Mannanase Production from Penicillium occitanis by Fed-Batch Fermentation Using Acacia Seeds

    PubMed Central

    Blibech, Monia; Ellouz Ghorbel, Raoudha; Chaari, Fatma; Dammak, Ilyes; Bhiri, Fatma; Neifar, Mohamed; Ellouz Chaabouni, Semia

    2011-01-01

    By applying a fed-batch strategy, production of Penicillium occitanis mannanases could be almost doubled as compared to a batch cultivation on acacia seeds (76 versus 41 U/mL). Also, a 10-fold increase of enzyme activities was observed from shake flask fermentation to the fed-batch fermentation. These production levels were 3-fold higher than those obtained on coconut meal. The high mannanase production using acacia seeds powder as inducer substrate showed the suitability of this culture process for industrial-scale development. PMID:23724314

  4. Semiautomated, Reproducible Batch Processing of Soy

    NASA Technical Reports Server (NTRS)

    Thoerne, Mary; Byford, Ivan W.; Chastain, Jack W.; Swango, Beverly E.

    2005-01-01

    A computer-controlled apparatus processes batches of soybeans into one or more of a variety of food products, under conditions that can be chosen by the user and reproduced from batch to batch. Examples of products include soy milk, tofu, okara (an insoluble protein and fiber byproduct of soy milk), and whey. Most processing steps take place without intervention by the user. This apparatus was developed for use in research on the processing of soy. It is also a prototype of other soy-processing apparatuses for research, industrial, and home use. Prior soy-processing equipment includes household devices that automatically produce soy milk but do not automatically produce tofu. The designs of prior soy-processing equipment require users to manually transfer intermediate solid soy products and to press them manually and, hence, under conditions that are not consistent from batch to batch. Prior designs do not afford choices of processing conditions: users cannot use previously developed soy-processing equipment to investigate the effects of variations of techniques used to produce soy milk (e.g., cold grinding, hot grinding, and pre-cook blanching) and of such process parameters as cooking times and temperatures, grinding times, soaking times and temperatures, rinsing conditions, and sizes of particles generated by grinding. In contrast, the present apparatus is amenable to such investigations. The apparatus (see figure) includes a processing tank and a jacketed holding or coagulation tank. The processing tank can be capped by either of two different heads and can contain either of two different insertable mesh baskets. The first head includes a grinding blade and heating elements. The second head includes an automated press piston. One mesh basket, designated the okara basket, has oblong holes with a size equivalent to about 40 mesh [40 openings per inch (16 openings per centimeter)]. The second mesh basket, designated the tofu basket, has holes of 70 mesh [70 openings per inch (28 openings per centimeter)] and is used in conjunction with the press-piston head. Supporting equipment includes a soy-milk heat exchanger for maintaining selected coagulation temperatures, a filter system for separating okara from other particulate matter and from soy milk, two pumps, and various thermocouples, flowmeters, level indicators, pressure sensors, valves, tubes, and sample ports.

  5. 40 CFR 420.91 - Specialized definitions.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... wire, rods, and tubes in discrete batches or bundles. (f) The term continuous means those pickling operations which process steel products other than in discrete batches or bundles. (g) The term acid recovery...

  6. Human serum albumin (HSA) nanoparticles: reproducibility of preparation process and kinetics of enzymatic degradation.

    PubMed

    Langer, K; Anhorn, M G; Steinhauser, I; Dreis, S; Celebi, D; Schrickel, N; Faust, S; Vogel, V

    2008-01-22

    Nanoparticles prepared from human serum albumin (HSA) are versatile carrier systems for drug delivery and can be prepared by an established desolvation process. A reproducible process with low batch-to-batch variability is required for transfer from the lab to industrial production. In the present study the influence of batch-to-batch variability of the starting material HSA on the preparation of nanoparticles was investigated. HSA can form dimers and higher aggregates because of a free thiol group present in the molecule. Therefore, the quality of different HSA batches was analysed by size exclusion chromatography (SEC) and analytical ultracentrifugation (AUC). The amount of dimerised HSA detected by SEC did not affect particle preparation. Higher aggregates of the protein, detected in two batches by AUC, disturbed nanoparticle formation at pH values below 8.0. At pH 8.0 and above, monodisperse particles between 200 and 300 nm could be prepared with all batches, with higher pH values leading to smaller particles. Besides human-derived albumin, particle preparation was also feasible with recombinant human serum albumin (rHSA). Under comparable preparation conditions monodisperse nanoparticles could be achieved, and the same effects of protein aggregates on particle formation were observed. For nanoparticulate drug delivery systems, enzymatic degradation is a crucial parameter for the release of an embedded drug. For this reason, besides the particle preparation process, particle degradation in the presence of different enzymes was studied. Under acidic conditions HSA as well as rHSA nanoparticles could be digested by pepsin and cathepsin B. At neutral pH trypsin, proteinase K, and protease were suitable for particle degradation. The kinetics of particle degradation was shown to depend on the degree of particle stabilisation, which will therefore influence drug release after cellular accumulation of HSA nanoparticles.

  7. Pc as Physics Computer for Lhc ?

    NASA Astrophysics Data System (ADS)

    Jarp, Sverre; Simmins, Antony; Tang, Hong; Yaari, R.

    In the last five years, we have seen RISC workstations take over the computing scene that was once controlled by mainframes and supercomputers. In this paper we argue that the same phenomenon might happen again. We describe a project, active since March of this year in the Physics Data Processing group of CERN's CN division, in which ordinary desktop PCs running Windows (NT and 3.11) have been used to create an environment for running large LHC batch jobs (initially the DICE simulation job of Atlas). The problems encountered in porting both the CERN library and the specific Atlas codes are described, together with some encouraging benchmark results comparing against existing RISC workstations in use by the Atlas collaboration. The issues of establishing the batch environment (batch monitor, staging software, etc.) are also covered. Finally, a quick extrapolation of the commodity computing power available in the future is touched upon to indicate what kind of cost envelope could be sufficient for the simulation farms required by the LHC experiments.

  8. Exploiting the metabolism of PYC expressing HEK293 cells in fed-batch cultures.

    PubMed

    Vallée, Cédric; Durocher, Yves; Henry, Olivier

    2014-01-01

    The expression of recombinant yeast pyruvate carboxylase (PYC) in animal cell lines was shown in previous studies to significantly reduce the formation of waste metabolites, although it has translated into mixed results in terms of improved cellular growth and productivity. In this work, we demonstrate that the unique phenotype of PYC-expressing cells can be exploited through the application of a dynamic fed-batch strategy, leading to significant process enhancements. Metabolically engineered HEK293 cells stably producing human recombinant IFNα2b and expressing the PYC enzyme were cultured in batch and fed-batch modes. Compared to parental cells, the maximum cell density in batch was increased 1.5-fold and the culture duration was extended by 2.5 days, but the product yield was only marginally increased. Further improvements were achieved by developing and implementing a dynamic fed-batch strategy using a concentrated feed solution. The feeding was based on an automatic control loop to maintain a constant glucose concentration. This strategy led to a further 2-fold increase in maximum cell density (up to 10.7 × 10^6 cells/mL) and a final product titer of 160 mg/L, representing nearly a 3-fold yield increase compared to the batch process with the parental cell clone. Copyright © 2013 Elsevier B.V. All rights reserved.
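
    The glucose control loop at the heart of such a dynamic fed-batch strategy can be illustrated with a minimal closed-loop simulation; the controller gain, setpoint, feed concentration and uptake rate below are illustrative assumptions, not the authors' values:

```python
# Sketch of a closed-loop glucose feeding scheme for fed-batch culture.
# All parameter values are illustrative assumptions.

def feed_rate(glucose, setpoint=2.0, kp=0.05, max_rate=0.2):
    """Proportional controller: returns feed rate in L/h."""
    error = setpoint - glucose           # below target -> positive error
    return min(max_rate, max(0.0, kp * error))

glucose, volume = 2.0, 1.0               # g/L, L
feed_conc = 200.0                        # g/L glucose in concentrated feed
uptake = 0.8                             # g/L/h consumption by the cells
dt = 0.1                                 # h per control step

for _ in range(int(24 / dt)):            # simulate a 24 h feeding window
    rate = feed_rate(glucose)
    # glucose balance: feed addition (diluted into the reactor) minus uptake
    glucose += dt * (rate * (feed_conc - glucose) / volume - uptake)
    volume += dt * rate
print(f"final glucose {glucose:.2f} g/L, volume {volume:.2f} L")
```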

  9. A new experimental design method to optimize formulations focusing on a lubricant for hydrophilic matrix tablets.

    PubMed

    Choi, Du Hyung; Shin, Sangmun; Khoa Viet Truong, Nguyen; Jeong, Seong Hoon

    2012-09-01

    A robust experimental design method was developed with well-established response surface methodology and time-series modeling to facilitate the formulation development process with magnesium stearate incorporated into hydrophilic matrix tablets. Two directional analyses and a time-oriented model were utilized to optimize the experimental responses. Evaluations of tablet gelation and drug release were conducted with two factors, x₁ and x₂: a formulation factor (the amount of magnesium stearate) and a processing factor (mixing time). Moreover, two batch sizes (100 and 500 tablets) were evaluated to investigate the effect of batch size. The selected input control factors were arranged in a mixture simplex lattice design with 13 experimental runs. The obtained optimal settings of magnesium stearate and mixing time for gelation were 0.46 g and 2.76 min for a 100-tablet batch and 1.54 g and 6.51 min for a 500-tablet batch. The optimal settings for drug release were 0.33 g and 7.99 min for a 100-tablet batch and 1.54 g and 6.51 min for a 500-tablet batch. The exact ratio and mixing time of magnesium stearate could thus be set according to the desired hydrophilic matrix tablet properties. The newly designed experimental method provided very useful information for characterizing significant factors and hence for obtaining optimum formulations through a systematic and reliable experimental design.
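
    Although the study uses a more elaborate design with time-series modeling, the core response-surface step amounts to a quadratic least-squares fit over the two factors followed by a search for the optimum. A sketch with hypothetical data (the numbers below are not the study's measurements):

```python
# Sketch: fitting a two-factor quadratic response surface (x1 = lubricant
# amount, x2 = mixing time) and locating its optimum on a grid.
import numpy as np

x1 = np.array([0.2, 0.2, 0.5, 0.5, 0.35, 0.35, 0.35])   # g (hypothetical)
x2 = np.array([2.0, 8.0, 2.0, 8.0, 5.0, 2.0, 8.0])      # min (hypothetical)
y = np.array([72.0, 80.0, 78.0, 75.0, 83.0, 76.0, 79.0])  # response, e.g. % release

# design matrix for y = b0 + b1*x1 + b2*x2 + b11*x1^2 + b22*x2^2 + b12*x1*x2
X = np.column_stack([np.ones_like(x1), x1, x2, x1**2, x2**2, x1 * x2])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)

g1, g2 = np.meshgrid(np.linspace(0.2, 0.5, 61), np.linspace(2, 8, 61))
G = np.column_stack([np.ones(g1.size), g1.ravel(), g2.ravel(),
                     g1.ravel()**2, g2.ravel()**2, (g1 * g2).ravel()])
i = np.argmax(G @ beta)
print(f"predicted optimum ~ x1 = {g1.ravel()[i]:.2f} g, x2 = {g2.ravel()[i]:.2f} min")
```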

  10. User-driven integrated software lives: ``Paleomag'' paleomagnetics analysis on the Macintosh

    NASA Astrophysics Data System (ADS)

    Jones, Craig H.

    2002-12-01

    "PaleoMag," a paleomagnetics analysis package originally developed for the Macintosh operating system in 1988, allows examination of demagnetization of individual samples and analysis of directional data from collections of samples. Prior to recent reinvigorated development of the software for both Macintosh and Windows, it was widely used despite not running properly on machines and operating systems sold after 1995. This somewhat surprising situation demonstrates that there is a continued need for integrated analysis software within the earth sciences, in addition to well-developed scripting and batch-mode software. One distinct advantage of software like PaleoMag is in the ability to combine quality control with analysis within a unique graphical environment. Because such demands are frequent within the earth sciences, means of nurturing the development of similar software should be found.

  11. Perfusion seed cultures improve biopharmaceutical fed-batch production capacity and product quality.

    PubMed

    Yang, William C; Lu, Jiuyi; Kwiatkowski, Chris; Yuan, Hang; Kshirsagar, Rashmi; Ryll, Thomas; Huang, Yao-Ming

    2014-01-01

    Volumetric productivity and product quality are two key performance indicators for any biopharmaceutical cell culture process. In this work, we showed proof-of-concept for improving both through the use of alternating tangential flow perfusion seed cultures coupled with high-seed fed-batch production cultures. First, we optimized the perfusion N-1 stage, the seed train bioreactor stage immediately prior to the production bioreactor stage, to minimize the consumption of perfusion media for one CHO cell line, and then successfully applied the optimized perfusion process to a different CHO cell line. Exponential growth was observed throughout the N-1 duration, reaching >40 × 10^6 vc/mL at the end of the perfusion N-1 stage. The cultures were subsequently split into high-seed (10 × 10^6 vc/mL) fed-batch production cultures. This strategy significantly shortened the culture duration. The high-seed fed-batch production processes for cell lines A and B reached 5 g/L titer in 12 days, while their respective low-seed processes reached the same titer in 17 days. The shortened production culture duration potentially generates a 30% increase in manufacturing capacity while yielding comparable product quality. When perfusion N-1 and high-seed fed-batch production were applied to cell line C, higher levels of the active protein were obtained compared to the low-seed process. This, combined with correspondingly lower levels of the inactive species, can enhance the overall process yield for the active species. Using three different CHO cell lines, we showed that perfusion seed cultures can optimize capacity utilization and improve process efficiency by increasing volumetric productivity while maintaining or improving product quality. © 2014 American Institute of Chemical Engineers.

  12. Evaluation of enzymatic reactors for large-scale panose production.

    PubMed

    Fernandes, Fabiano A N; Rodrigues, Sueli

    2007-07-01

    Panose is a trisaccharide constituted by a maltose molecule bonded to a glucose molecule by an alpha-1,6-glycosidic bond. This trisaccharide has potential for use in the food industry as a noncariogenic sweetener, as the oral flora does not ferment it. Panose can also be considered prebiotic for stimulating the growth of beneficial microorganisms, such as lactobacilli and bifidobacteria, and for inhibiting the growth of undesired microorganisms such as E. coli and Salmonella. In this paper, the production of panose by enzymatic synthesis in a batch and a fed-batch reactor was optimized using a mathematical model developed to simulate the process. Results show that optimum production is obtained in a fed-batch process, with an optimum productivity of 11.23 g/(L·h) of panose, which is 51.5% higher than that of the batch reactor.

  13. Removal of batch effects using distribution-matching residual networks.

    PubMed

    Shaham, Uri; Stanton, Kelly P; Zhao, Jun; Li, Huamin; Raddassi, Khadir; Montgomery, Ruth; Kluger, Yuval

    2017-08-15

    Sources of variability in experimentally derived data include measurement error in addition to the physical phenomena of interest. This measurement error is a combination of systematic components originating from the measuring instrument and random measurement errors. Several novel biological technologies, such as mass cytometry and single-cell RNA-seq (scRNA-seq), are plagued with systematic errors that may severely affect statistical analysis if the data are not properly calibrated. We propose a novel deep learning approach for removing systematic batch effects. Our method is based on a residual neural network, trained to minimize the Maximum Mean Discrepancy between the multivariate distributions of two replicates, measured in different batches. We apply our method to mass cytometry and scRNA-seq datasets, and demonstrate that it effectively attenuates batch effects. Our code and data are publicly available at https://github.com/ushaham/BatchEffectRemoval.git. Supplementary data are available at Bioinformatics online. © The Author (2017). Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com
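
    The loss at the core of this approach, the squared Maximum Mean Discrepancy between two batches, can be sketched in a few lines of NumPy. Toy data; the paper minimizes this quantity through a residual network rather than computing it once, and the kernel bandwidth here is an arbitrary assumption:

```python
# Sketch: squared Maximum Mean Discrepancy (MMD) with a Gaussian kernel,
# the distribution-matching quantity minimized during training.
import numpy as np

def gaussian_kernel(a, b, sigma=1.0):
    # pairwise squared distances between rows of a and b
    d2 = ((a[:, None, :] - b[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * sigma ** 2))

def mmd2(x, y, sigma=1.0):
    """Biased estimator of squared MMD between samples x and y."""
    kxx = gaussian_kernel(x, x, sigma).mean()
    kyy = gaussian_kernel(y, y, sigma).mean()
    kxy = gaussian_kernel(x, y, sigma).mean()
    return kxx + kyy - 2 * kxy

rng = np.random.default_rng(0)
batch1 = rng.normal(0.0, 1.0, size=(200, 10))
batch2 = rng.normal(0.5, 1.0, size=(200, 10))   # shifted: simulated batch effect
print(f"MMD^2, shifted batches: {mmd2(batch1, batch2):.4f}")
print(f"MMD^2, matched batches: {mmd2(batch1, rng.normal(0, 1, (200, 10))):.4f}")
```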

  14. A KINETIC MODEL FOR H2O2/UV PROCESS IN A COMPLETELY MIXED BATCH REACTOR. (R825370C076)

    EPA Science Inventory

    A dynamic kinetic model for the advanced oxidation process (AOP) using hydrogen peroxide and ultraviolet irradiation (H2O2/UV) in a completely mixed batch reactor (CMBR) is developed. The model includes the known elementary chemical and photochemical reac...

  15. 40 CFR Table 2 to Subpart Sssss of... - Operating Limits

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... activity level of the catalyst at least every 12 months. 7. Each affected batch process unit For each batch... activity level of the catalyst at least every 12 months. 10. Each new kiln that is used to process clay... across the DLA at or above the minimum levels established during the most recent performance test; and b...

  16. 40 CFR Table 2 to Subpart Sssss of... - Operating Limits

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... activity level of the catalyst at least every 12 months. 7. Each affected batch process unit For each batch... activity level of the catalyst at least every 12 months. 10. Each new kiln that is used to process clay... across the DLA at or above the minimum levels established during the most recent performance test; and b...

  17. 40 CFR Table 2 to Subpart Ffff of... - Emission Limits and Work Practice Standards for Batch Process Vents

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... Pollutants: Miscellaneous Organic Chemical Manufacturing Pt. 63, Subpt. FFFF, Table 2 Table 2 to Subpart FFFF... Group 1 batch process vents a. Reduce collective uncontrolled organic HAP emissions from the sum of all... a flare); or Not applicable. b. Reduce collective uncontrolled organic HAP emissions from the sum of...

  18. Case Studies of Auditing in a Computer-Based Systems Environment.

    ERIC Educational Resources Information Center

    General Accounting Office, Washington, DC.

    In response to a growing need for effective and efficient means for auditing computer-based systems, a number of studies dealing primarily with batch-processing type computer operations have been conducted to explore the impact of computers on auditing activities in the Federal Government. This report first presents some statistical data on…

  19. Modeling of feed-forward control using the partial least squares regression method in the tablet compression process.

    PubMed

    Hattori, Yusuke; Otsuka, Makoto

    2017-05-30

    In the pharmaceutical industry, the implementation of continuous manufacturing has been widely promoted in lieu of the traditional batch manufacturing approach. More specifically, in recent years, the innovative concept of feed-forward control has been introduced in relation to process analytical technology. In the present study, we successfully developed a feed-forward control model for the tablet compression process by integrating data obtained from near-infrared (NIR) spectra and the physical properties of granules. In the pharmaceutical industry, batch manufacturing routinely allows for the preparation of granules with the desired properties through the manual control of process parameters; continuous manufacturing, on the other hand, demands the automatic determination of these process parameters. Here, we propose a control model based on the partial least squares regression (PLSR) method. The most significant feature of this method is the use of a dataset integrating both the NIR spectra and the physical properties of the granules. Using our model, we determined that the properties of products, such as tablet weight and thickness, need to be included as independent variables in the PLSR analysis in order to predict unknown process parameters. Copyright © 2017 Elsevier B.V. All rights reserved.
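
    A minimal sketch of such a PLSR model, concatenating NIR spectra, granule properties and target tablet properties into one predictor block, might look as follows with scikit-learn; the data are synthetic placeholders and the component count and variable names are assumptions:

```python
# Sketch: PLS regression on a dataset integrating NIR spectra with granule
# physical properties and target tablet properties, in the spirit of the
# feed-forward control model. All data are synthetic placeholders.
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(1)
n = 60
nir = rng.normal(size=(n, 200))          # NIR absorbance at 200 wavelengths
granule = rng.normal(size=(n, 4))        # e.g. particle size, bulk density, ...
targets = rng.normal(size=(n, 2))        # desired tablet weight and thickness
X = np.hstack([nir, granule, targets])   # integrated predictor block

w = rng.normal(size=(X.shape[1], 1)) * 0.05
y = X @ w + rng.normal(scale=0.1, size=(n, 1))   # process parameter to predict

pls = PLSRegression(n_components=5)
print("cross-validated R2:", cross_val_score(pls, X, y, cv=5, scoring="r2").mean())
pls.fit(X, y)                            # final model for feed-forward prediction
```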

  20. High Solid Fed-batch Butanol Fermentation with Simultaneous Product Recovery: Part II - Process Integration.

    PubMed

    Qureshi, Nasib; Klasson, K Thomas; Saha, Badal C; Liu, Siqing

    2018-04-25

    In these studies, liquid hot water (LHW) pretreated and enzymatically hydrolyzed sweet sorghum bagasse (SSB) hydrolyzates were fermented in a fed-batch reactor. As reported in the preceding paper, the culture was not able to ferment hydrolyzate I in a batch process due to the presence of high levels of toxic chemicals, in particular acetic acid released from SSB during the hydrolytic process. To be able to ferment hydrolyzate I, obtained from hydrolysis of 250 g/L SSB, a fed-batch reactor with in-situ butanol recovery was devised. The process was started with hydrolyzate II, and when good cell growth and vigorous fermentation were observed, hydrolyzate I was slowly fed to the reactor. In this manner the culture was able to ferment all the sugars present in both hydrolyzates to acetone butanol ethanol (ABE). In a control batch reactor in which ABE was produced from glucose, an ABE productivity of 0.42 g/(L·h) and a yield of 0.36 were obtained. In the fed-batch reactor fed with SSB hydrolyzates, the productivity and yield were 0.44 g/(L·h) and 0.45, respectively. ABE yield in the integrated system was high due to the conversion of acetic acid to ABE. In summary, we were able to utilize both hydrolyzates obtained from LHW-pretreated and enzymatically hydrolyzed SSB (250 g/L) and convert them to ABE. Complete fermentation was possible due to simultaneous recovery of ABE by vacuum. This article is protected by copyright. All rights reserved. © 2018 American Institute of Chemical Engineers.

  1. ASTEP user's guide and software documentation

    NASA Technical Reports Server (NTRS)

    Gliniewicz, A. S.; Lachowski, H. M.; Pace, W. H., Jr.; Salvato, P., Jr.

    1974-01-01

    The Algorithm Simulation Test and Evaluation Program (ASTEP) is a modular computer program developed for the purpose of testing and evaluating methods of processing remotely sensed multispectral scanner earth resources data. ASTEP is written in FORTRAN V on the UNIVAC 1110 under the EXEC 8 operating system and may be operated in either a batch or interactive mode. The program currently contains over one hundred subroutines consisting of data classification and display algorithms, statistical analysis algorithms, utility support routines, and feature selection capability. The current program can accept data in LARSC1, LARSC2, ERTS, and Universal formats, and can output processed image or data tapes in Universal format.

  2. Catalytic wet oxidation of phenol in a trickle bed reactor over a Pt/TiO2 catalyst.

    PubMed

    Maugans, Clayton B; Akgerman, Aydin

    2003-01-01

    Catalytic wet oxidation of phenol was studied in a batch and a trickle bed reactor using a 4.45% Pt/TiO2 catalyst in the temperature range 150-205 degrees C. Kinetic data were obtained from batch reactor studies and used to model the reaction kinetics of phenol disappearance and of total organic carbon disappearance. Trickle bed experiments were then performed to generate data from a heterogeneous flow reactor. Catalyst deactivation was observed in the trickle bed reactor, although the exact cause was not determined. Deactivation was observed to increase linearly with the cumulative amount of phenol that had passed over the catalyst bed. Trickle bed reactor modeling was performed using a three-phase heterogeneous model. Model parameters were determined from literature correlations, batch-derived kinetic data, and trickle-bed-derived catalyst deactivation data. The model equations were solved using orthogonal collocation on finite elements. Trickle bed performance was successfully predicted using the batch-derived kinetic model and the three-phase reactor model. Thus, using the kinetics determined from limited data in batch mode, it is possible to predict continuous flow multiphase reactor performance.

  3. Modeling of Fusarium redolens Dzf2 mycelial growth kinetics and optimal fed-batch fermentation for beauvericin production.

    PubMed

    Xu, Li-Jian; Liu, Yuan-Shuai; Zhou, Li-Gang; Wu, Jian-Yong

    2011-09-01

    Beauvericin (BEA) is a cyclic hexadepsipeptide mycotoxin with notable phytotoxic and insecticidal activities. Fusarium redolens Dzf2 is a highly BEA-producing fungus isolated from a medicinal plant. The aim of the current study was to develop a simple and valid kinetic model for F. redolens Dzf2 mycelial growth and the optimal fed-batch operation for efficient BEA production. A modified Monod model with substrate (glucose) and product (BEA) inhibition was constructed based on the culture characteristics of F. redolens Dzf2 mycelia in a liquid medium. Model parameters were derived by simulation of the experimental data from batch culture. The model fitted closely with the experimental data over the 20-50 g/L glucose concentration range in batch fermentation. The kinetic model, together with the stoichiometric relationships for biomass, substrate and product, was applied to predict the optimal feeding scheme for fed-batch fermentation, leading to 54% higher BEA yield (299 mg/L) than in the batch culture (194 mg/L). The modified Monod model incorporating substrate and product inhibition was proven adequate for describing the growth kinetics of F. redolens Dzf2 mycelial culture at suitable but not excessive initial glucose levels in batch and fed-batch cultures.
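
    As a rough illustration of this model class, a Monod-type growth law with an Andrews substrate-inhibition term and a simple product-inhibition factor can be integrated with SciPy. The functional forms and all parameter values below are assumptions for illustration, not the fitted values from the paper:

```python
# Sketch: modified Monod growth with substrate and product inhibition,
# integrated as a batch-culture ODE system. Parameters are illustrative.
from scipy.integrate import solve_ivp

mu_max, Ks, Ki, Kp = 0.12, 5.0, 80.0, 150.0   # 1/h, g/L, g/L, mg/L (assumed)
Yxs, Ypx = 0.45, 8.0                          # g biomass/g glucose, mg product/g biomass

def rates(t, y):
    X, S, P = y                                # biomass g/L, glucose g/L, product mg/L
    # Andrews substrate inhibition times a product-inhibition factor
    mu = mu_max * S / (Ks + S + S**2 / Ki) * Kp / (Kp + P)
    return [mu * X, -mu * X / Yxs, Ypx * mu * X]

sol = solve_ivp(rates, (0, 120), [0.2, 30.0, 0.0])   # 120 h batch from X0, S0, P0
X, S, P = sol.y[:, -1]
print(f"after 120 h: biomass {X:.1f} g/L, glucose {S:.1f} g/L, product {P:.0f} mg/L")
```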

  4. SEAPAK user's guide, version 2.0. Volume 1: System description

    NASA Technical Reports Server (NTRS)

    Mcclain, Charles R.; Darzi, Michael; Firestone, James K.; Fu, Gary; Yeh, Eueng-Nan; Endres, Daniel L.

    1991-01-01

    SEAPAK is a user-interactive satellite data analysis package that was developed for the processing and interpretation of Nimbus-7/Coastal Zone Color Scanner (CZCS) and NOAA Advanced Very High Resolution Radiometer (AVHRR) data. Significant revisions were made to version 1.0 of the guide, and the ancillary environmental data analysis module was expanded. The package continues to emphasize user friendliness and interactive data analyses. Additionally, because the scientific goals of the ocean color research being conducted have shifted to large space and time scales, batch processing capabilities for both satellite and ancillary environmental data analyses were enhanced, thus allowing large quantities of data to be ingested and analyzed in the background.

  5. The synthesis of cadmium sulfide nanoplatelets using a novel continuous flow sonochemical reactor

    DOE PAGES

    Palanisamy, Barath; Paul, Brian; Chang, Chih -hung

    2015-01-21

    A continuous flow sonochemical reactor was developed capable of producing metastable cadmium sulfide (CdS) nanoplatelets with thicknesses at or below 10 nm. The continuous flow sonochemical reactor included passive in-line micromixing of reagents prior to sonochemical reaction. Synthesis results were compared with those from reactors involving batch conventional heating and batch ultrasound-induced heating. The continuous sonochemical synthesis was found to result in high aspect ratio hexagonal platelets of CdS possessing cubic crystal structures with thicknesses well below 10 nm. The unique shape and crystal structure of the nanoplatelets are suggestive of high localized temperatures within the sonochemical process. As a result, the particle size uniformity and product throughput are much higher for the continuous sonochemical process in comparison to the batch sonochemical process and conventional synthesis processes.

  6. Characterization Of The As-Received Sludge Batch 9 Qualification Sample (Htf-51-15-81)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pareizs, J.

    Savannah River National Laboratory (SRNL) personnel have been requested to qualify the next sludge batch (Sludge Batch 9 – SB9) for processing at the Defense Waste Processing Facility (DWPF). To accomplish this task, Savannah River Remediation (SRR) has sent SRNL a 3-L slurried sample of Tank 51H (HTF-51-15-81) to be characterized, washed, and then used in a lab-scale demonstration of the DWPF flowsheet (potentially after combining with Tank 40H sludge). This report documents the first steps of the qualification process – characterization of the as-received Tank 51H qualification sample. These results will be used to support a reprojection of SB9 by SRR from which final Tank 51H washing, frit development, and Chemical Processing Cell (CPC) activities will be based.

  7. Production and purification of an untagged recombinant pneumococcal surface protein A (PspA4Pro) with high-purity and low endotoxin content.

    PubMed

    Figueiredo, Douglas B; Carvalho, Eneas; Santos, Mauricio P; Kraschowetz, Stefanie; Zanardo, Rafaela T; Campani, Gilson; Silva, Gabriel G; Sargo, Cíntia R; Horta, Antonio Carlos L; de C Giordano, Roberto; Miyaji, Eliane N; Zangirolami, Teresa C; Cabrera-Crespo, Joaquin; Gonçalves, Viviane Maimoni

    2017-03-01

    Streptococcus pneumoniae is the main cause of pneumonia, meningitis, and other conditions that kill thousands of children every year worldwide. The replacement of pneumococcal serotypes among the vaccinated population has evidenced the need for new vaccines with broader coverage and driven the research for protein-based vaccines. Pneumococcal surface protein A (PspA) protects S. pneumoniae from the bactericidal effect of human apolactoferrin and prevents complement deposition. Several studies indicate that PspA is a very promising target for novel vaccine formulations. Here we describe a production and purification process for an untagged recombinant fragment of PspA from clade 4 (PspA4Pro), which has been shown to be cross-reactive with several PspA variants. PspA4Pro was obtained using lactose as inducer in Phytone auto-induction batch or glycerol limited fed-batch in 5-L bioreactor. The purification process includes two novel steps: (i) clarification using a cationic detergent to precipitate contaminant proteins, nucleic acids, and other negatively charged molecules as the lipopolysaccharide, which is the major endotoxin; and (ii) cryoprecipitation that eliminates aggregates and contaminants, which precipitate at -20 °C and pH 4.0, leaving PspA4Pro in the supernatant. The final process consisted of cell rupture in a continuous high-pressure homogenizer, clarification, anion exchange chromatography, cryoprecipitation, and cation exchange chromatography. This process avoided costly tag removal steps and recovered 35.3 ± 2.5% of PspA4Pro with 97.8 ± 0.36% purity and reduced endotoxin concentration by >99.9%. Circular dichroism and lactoferrin binding assay showed that PspA4Pro secondary structure and biological activity were preserved after purification and remained stable in a wide range of temperatures and pH values.

  8. Establishing column batch repeatability according to Quality by Design (QbD) principles using modeling software.

    PubMed

    Rácz, Norbert; Kormány, Róbert; Fekete, Jenő; Molnár, Imre

    2015-04-10

    Column technology needs further improvement even today. To obtain information on batch-to-batch repeatability, intelligent modeling software was applied. Twelve columns from the same production process but from different batches were compared in this work. In this paper, the retention parameters of these columns with real-life sample solutes were studied. The following parameters were selected for measurement: gradient time, temperature and pH. Based on the calculated results, the batch-to-batch repeatability of BEH columns was evaluated. Two parallel measurements on two columns from the same batch were performed to obtain information about the quality of packing. Calculating the average of the individual working points at the highest critical resolution (Rs,crit), it was found that the robustness, calculated with a newly released robustness module, had a success rate >98% among the predicted 3^6 = 729 experiments for all 12 columns. With the help of retention modeling, all substances could be separated under the same conditions regardless of batch and/or packing, with high robustness of the experiments. Copyright © 2015 Elsevier B.V. All rights reserved.
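
    The 3^6 = 729 robustness experiments correspond to a full three-level factorial around the working point; enumerating such a grid for re-simulation is straightforward. The factor names, nominal values and deltas below are hypothetical, as is the model_rs helper referenced in the comment:

```python
# Sketch: enumerating the 3^6 = 729 three-level factorial combinations
# behind a robustness calculation. Factors and tolerances are hypothetical.
from itertools import product

nominal = {"tG": 15.0, "T": 30.0, "pH": 3.0,
           "flow": 0.4, "B_start": 10.0, "B_end": 90.0}
delta = {"tG": 1.0, "T": 2.0, "pH": 0.1,
         "flow": 0.02, "B_start": 1.0, "B_end": 1.0}

# three levels per factor: nominal - delta, nominal, nominal + delta
levels = {k: (v - delta[k], v, v + delta[k]) for k, v in nominal.items()}
grid = list(product(*levels.values()))
print(len(grid))  # 729 combinations to re-simulate and check Rs,crit against

# e.g., count combinations a retention model predicts to keep Rs,crit >= 1.5:
# ok = sum(model_rs(dict(zip(levels, combo))) >= 1.5 for combo in grid)
```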

  9. Adaptation to high throughput batch chromatography enhances multivariate screening.

    PubMed

    Barker, Gregory A; Calzada, Joseph; Herzer, Sibylle; Rieble, Siegfried

    2015-09-01

    High throughput process development offers unique approaches to explore complex process design spaces with relatively low material consumption. Batch chromatography is one technique that can be used to screen chromatographic conditions in a 96-well plate. Typical batch chromatography workflows examine variations in buffer conditions or comparison of multiple resins in a given process, as opposed to the assessment of protein loading conditions in combination with other factors. A modification to the batch chromatography paradigm is described here where experimental planning, programming, and a staggered loading approach increase the multivariate space that can be explored with a liquid handling system. The iterative batch chromatography (IBC) approach is described, which treats every well in a 96-well plate as an individual experiment, wherein protein loading conditions can be varied alongside other factors such as wash and elution buffer conditions. As all of these factors are explored in the same experiment, the interactions between them are characterized and the number of follow-up confirmatory experiments is reduced. This in turn improves statistical power and throughput. Two examples of the IBC method are shown and the impact of the load conditions are assessed in combination with the other factors explored. Copyright © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  10. Continuous Cellulosic Bioethanol Fermentation by Cyclic Fed-Batch Cocultivation

    PubMed Central

    Jiang, He-Long; He, Qiang; He, Zhili; Hemme, Christopher L.; Wu, Liyou

    2013-01-01

    Cocultivation of cellulolytic and saccharolytic microbial populations is a promising strategy to improve bioethanol production from the fermentation of recalcitrant cellulosic materials. Earlier studies have demonstrated the effectiveness of cocultivation in enhancing ethanolic fermentation of cellulose in batch fermentation. To further enhance process efficiency, a semicontinuous cyclic fed-batch fermentor configuration was evaluated for its potential in enhancing the efficiency of cellulose fermentation using cocultivation. Cocultures of cellulolytic Clostridium thermocellum LQRI and saccharolytic Thermoanaerobacter pseudethanolicus strain X514 were tested in the semicontinuous fermentor as a model system. Initial cellulose concentration and pH were identified as the key process parameters controlling cellulose fermentation performance in the fixed-volume cyclic fed-batch coculture system. At an initial cellulose concentration of 40 g liter−1, the concentration of ethanol produced with pH control was 4.5-fold higher than that without pH control. It was also found that efficient cellulosic bioethanol production by cocultivation was sustained in the semicontinuous configuration, with bioethanol production reaching 474 mM in 96 h with an initial cellulose concentration of 80 g liter−1 and pH controlled at 6.5 to 6.8. These results suggested the advantages of the cyclic fed-batch process for cellulosic bioethanol fermentation by the cocultures. PMID:23275517

  11. 40 CFR 63.486 - Batch front-end process vent provisions.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... National Emission Standards for Hazardous Air Pollutant Emissions: Group I Polymers and Resins § 63.486... paragraph (b) of this section, owners and operators of new and existing affected sources with batch front...

  12. Genetic programming assisted stochastic optimization strategies for optimization of glucose to gluconic acid fermentation.

    PubMed

    Cheema, Jitender Jit Singh; Sankpal, Narendra V; Tambe, Sanjeev S; Kulkarni, Bhaskar D

    2002-01-01

    This article presents two hybrid strategies for the modeling and optimization of the glucose to gluconic acid batch bioprocess. In the hybrid approaches, first a novel artificial intelligence formalism, namely, genetic programming (GP), is used to develop a process model solely from the historic process input-output data. In the next step, the input space of the GP-based model, representing process operating conditions, is optimized using two stochastic optimization (SO) formalisms, viz., genetic algorithms (GAs) and simultaneous perturbation stochastic approximation (SPSA). These SO formalisms possess certain unique advantages over the commonly used gradient-based optimization techniques. The principal advantage of the GP-GA and GP-SPSA hybrid techniques is that process modeling and optimization can be performed exclusively from the process input-output data without invoking the detailed knowledge of the process phenomenology. The GP-GA and GP-SPSA techniques have been employed for modeling and optimization of the glucose to gluconic acid bioprocess, and the optimized process operating conditions obtained thereby have been compared with those obtained using two other hybrid modeling-optimization paradigms integrating artificial neural networks (ANNs) and GA/SPSA formalisms. Finally, the overall optimized operating conditions given by the GP-GA method, when verified experimentally resulted in a significant improvement in the gluconic acid yield. The hybrid strategies presented here are generic in nature and can be employed for modeling and optimization of a wide variety of batch and continuous bioprocesses.
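
    Of the two stochastic optimizers, SPSA is notable for needing only two noisy objective evaluations per iteration regardless of the problem dimension. A minimal sketch, with a toy quadratic standing in for the GP-based process model; the gain sequences are conventional textbook choices, not the paper's settings:

```python
# Sketch: Simultaneous Perturbation Stochastic Approximation (SPSA),
# maximizing a noisy toy objective that stands in for the GP process model.
import numpy as np

rng = np.random.default_rng(42)

def objective(x):
    # toy stand-in: peak "yield" at x = (1.0, -0.5), with measurement noise
    return -((x[0] - 1.0) ** 2 + (x[1] + 0.5) ** 2) + rng.normal(scale=0.01)

x = np.zeros(2)
for k in range(1, 501):
    a_k = 0.1 / k ** 0.602                   # standard SPSA gain sequences
    c_k = 0.1 / k ** 0.101
    delta = rng.choice([-1.0, 1.0], size=x.shape)   # Rademacher perturbation
    # gradient estimate from just two evaluations, any dimension
    g_hat = (objective(x + c_k * delta) - objective(x - c_k * delta)) / (2 * c_k * delta)
    x = x + a_k * g_hat                      # ascent step (maximization)
print(f"estimated optimum: {x.round(3)}")
```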

  13. Application of process monitoring to anomaly detection in nuclear material processing systems via system-centric event interpretation of data from multiple sensors of varying reliability

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Garcia, Humberto E.; Simpson, Michael F.; Lin, Wen-Chiao

    In this paper, we apply an advanced safeguards approach and associated methods for process monitoring to a hypothetical nuclear material processing system. The assessment regarding the state of the processing facility is conducted at a system-centric level formulated in a hybrid framework. This utilizes an architecture for integrating both time- and event-driven data and analysis for decision making. While the time-driven layers of the proposed architecture encompass more traditional process monitoring methods based on time series data and analysis, the event-driven layers encompass operation monitoring methods based on discrete event data and analysis. By integrating process- and operation-related information and methodologies within a unified framework, the task of anomaly detection is greatly improved. This is because decision-making can benefit not only from known time-series relationships among measured signals but also from known event sequence relationships among generated events. This available knowledge at both the time series and discrete event layers can then be effectively used to synthesize observation solutions that optimally balance sensor and data processing requirements. The application of the proposed approach is then implemented on an illustrative monitored system based on pyroprocessing, and results are discussed.

  14. PanDA for ATLAS distributed computing in the next decade

    NASA Astrophysics Data System (ADS)

    Barreiro Megino, F. H.; De, K.; Klimentov, A.; Maeno, T.; Nilsson, P.; Oleynik, D.; Padolski, S.; Panitkin, S.; Wenaus, T.; ATLAS Collaboration

    2017-10-01

    The Production and Distributed Analysis (PanDA) system has been developed to meet ATLAS production and analysis requirements for a data-driven workload management system capable of operating at the Large Hadron Collider (LHC) data processing scale. Heterogeneous resources used by the ATLAS experiment are distributed worldwide at hundreds of sites, thousands of physicists analyse the data remotely, the volume of processed data is beyond the exabyte scale, dozens of scientific applications are supported, and data processing requires more than a few billion hours of computing usage per year. PanDA performed very well over the last decade, including the LHC Run 1 data-taking period. However, it was decided to upgrade the whole system concurrently with the LHC's first long shutdown in order to cope with a rapidly changing computing infrastructure. After two years of reengineering efforts, PanDA has embedded capabilities for fully dynamic and flexible workload management. The static batch job paradigm was discarded in favor of a more automated and scalable model. Workloads are dynamically tailored for optimal usage of resources, with the brokerage taking network traffic and forecasts into account. Computing resources are partitioned based on dynamic knowledge of their status and characteristics. The pilot has been re-factored around a plugin structure for easier development and deployment. Bookkeeping is handled with both coarse and fine granularities for efficient utilization of pledged or opportunistic resources. An in-house security mechanism authenticates the pilot and data management services in off-grid environments such as volunteer computing and private local clusters. The PanDA monitor has been extensively optimized for performance and extended with analytics to provide aggregated summaries of the system as well as drill-down to operational details. Many other features are planned or have recently been implemented, and the system has been adopted by non-LHC experiments, such as bioinformatics groups successfully running the Paleomix (microbial genome and metagenome) payload on supercomputers. In this paper we focus on the new and planned features that are most important to the next decade of distributed computing workload management.

  15. Batch Effect Confounding Leads to Strong Bias in Performance Estimates Obtained by Cross-Validation

    PubMed Central

    Delorenzi, Mauro

    2014-01-01

    Background With the large amount of biological data that is currently publicly available, many investigators combine multiple data sets to increase the sample size and potentially also the power of their analyses. However, technical differences (“batch effects”) as well as differences in sample composition between the data sets may significantly affect the ability to draw generalizable conclusions from such studies. Focus The current study focuses on the construction of classifiers, and the use of cross-validation to estimate their performance. In particular, we investigate the impact of batch effects and differences in sample composition between batches on the accuracy of the classification performance estimate obtained via cross-validation. The focus on estimation bias is a main difference compared to previous studies, which have mostly focused on the predictive performance and how it relates to the presence of batch effects. Data We work on simulated data sets. To have realistic intensity distributions, we use real gene expression data as the basis for our simulation. Random samples from this expression matrix are selected and assigned to group 1 (e.g., ‘control’) or group 2 (e.g., ‘treated’). We introduce batch effects and select some features to be differentially expressed between the two groups. We consider several scenarios for our study, most importantly different levels of confounding between groups and batch effects. Methods We focus on well-known classifiers: logistic regression, Support Vector Machines (SVM), k-nearest neighbors (kNN) and Random Forests (RF). Feature selection is performed with the Wilcoxon test or the lasso. Parameter tuning and feature selection, as well as the estimation of the prediction performance of each classifier, is performed within a nested cross-validation scheme. The estimated classification performance is then compared to what is obtained when applying the classifier to independent data. PMID:24967636
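
    The estimation protocol studied here, feature selection kept inside the cross-validation loop, can be sketched with scikit-learn on simulated data in which batch is partially confounded with the group label. All shapes, effect sizes and confounding levels below are illustrative assumptions:

```python
# Sketch: cross-validated performance estimation with feature selection
# inside the CV loop, on data where batch is confounded with the label.
import numpy as np
from sklearn.pipeline import Pipeline
from sklearn.feature_selection import SelectKBest
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(7)
n, p = 100, 500
y = np.repeat([0, 1], n // 2)                  # two groups, e.g. control/treated
batch = (rng.random(n) < np.where(y == 1, 0.8, 0.2)).astype(int)  # confounded

X = rng.normal(size=(n, p))
X[:, :10] += y[:, None] * 0.8                  # true group signal in 10 features
X[:, 10:60] += batch[:, None] * 1.5            # batch effect in other features

pipe = Pipeline([("select", SelectKBest(k=20)),    # selection inside each fold
                 ("clf", LogisticRegression(max_iter=1000))])
acc = cross_val_score(pipe, X, y, cv=5)
print(f"CV accuracy {acc.mean():.2f} -- optimistic when batch tracks the label")
```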

  16. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Martino, C

    The Department of Energy (DOE) recognizes the need for the characterization of High-Level Waste (HLW) saltcake in the Savannah River Site (SRS) F- and H-area tank farms to support upcoming salt processing activities. As part of the enhanced characterization efforts, Tank 25F will be sampled and the samples analyzed at the Savannah River National Laboratory (SRNL). This Task Technical and Quality Assurance Plan documents the planned activities for the physical, chemical, and radiological analysis of the Tank 25F saltcake core samples. This plan does not cover other characterization activities that do not involve core sample analysis, and it does not address issues regarding sampling or sample transportation. The objectives of this report are: (1) Provide information useful in projecting the composition of dissolved salt batches by quantifying important components (such as actinides, ¹³⁷Cs, and ⁹⁰Sr) on a per batch basis. This will assist in process selection for the treatment of salt batches and provide data for the validation of dissolution modeling. (2) Determine the properties of the heel resulting from dissolution of the bulk saltcake. Also note tendencies toward post-mixing precipitation. (3) Provide a basis for determining the number of samples needed for the characterization of future saltcake tanks, and gather information useful towards performing characterization in a manner that is more cost- and time-effective.

  17. Removal of trivalent chromium from water using low-cost natural diatomite.

    PubMed

    Gürü, Metin; Venedik, Duygu; Murathan, Ayşe

    2008-12-30

    Trivalent chromium was removed from artificial wastewater using low-cost diatomite in batch and continuous systems. In the batch system, four different sorbent sizes and five different sorbent amounts were used. The effect of temperature on sorption was evaluated at three different temperatures. As a result of the experiments, 85% of the trivalent chromium was removed from the wastewater using 1.29 mm grain material at 30 degrees C for 60 min in the batch system, whereas in the continuous system chromium removal was 82% at 30 degrees C after 22 min and 97% after 80 min. The equilibrium adsorption isotherms were analyzed with the Langmuir and Freundlich models; the Langmuir isotherms had the highest correlation coefficients. Langmuir isotherm constants corresponding to the adsorption capacity, q0, were found to be 28.1, 26.5 and 21.8 mg Cr3+/g diatomite at 15, 30 and 45 degrees C, respectively. Thermodynamic parameter calculations showed that adsorption was an exothermic process. The kinetic data of the sorption showed that the pseudo-second-order equation was the more appropriate, indicating that intraparticle diffusion is the rate-limiting factor.

  18. Functional Description for the Department of the Army Movements Management System. Redesign Phase 1 (DAMMS-R1). Volume 2.

    DTIC Science & Technology

    1987-12-31

    DEFINE PROCESS Maint-SpotFactor-Tbl. DESCRIPTION: Maintain Spot Factor Table. This is an interactive process that receives SpotFactorCd(s) and... DEFINE PROCESS Update-SpotFactor-Tbl. DESCRIPTION: Update Spot Factor Table. This is a batch process that creates records or changes existing records in the...

  19. Microfluidic biolector-microfluidic bioprocess control in microtiter plates.

    PubMed

    Funke, Matthias; Buchenauer, Andreas; Schnakenberg, Uwe; Mokwa, Wilfried; Diederichs, Sylvia; Mertens, Alan; Müller, Carsten; Kensy, Frank; Büchs, Jochen

    2010-10-15

    In industrial-scale biotechnological processes, active control of the pH value combined with controlled feeding of substrate solutions (fed-batch) is the standard strategy for cultivating both prokaryotic and eukaryotic cells. In contrast, for small-scale cultivations, much simpler batch experiments with no process control are performed. This lack of process control often hinders researchers in scaling fermentation experiments up and down, because the microbial metabolism, and thereby the growth and production kinetics, changes drastically depending on the cultivation strategy applied. While small-scale batches are typically performed highly parallel and in high throughput, large-scale cultivations demand sophisticated equipment for process control, which is in most cases costly and difficult to handle. Currently, there is no technical system on the market that realizes simple process control in high throughput. The novel concept of a microfermentation system described in this work combines a fiber-optic online-monitoring device for microtiter plates (MTPs)--the BioLector technology--with microfluidic control of cultivation processes in volumes below 1 mL. In the microfluidic chip, a micropump is integrated to realize distinct substrate flow rates during fed-batch cultivation at microscale. Hence, a cultivation system with several distinct advantages could be established: (1) high information output on a microscale; (2) many experiments can be performed in parallel and be automated using MTPs; (3) the system is user-friendly and can easily be transferred to a disposable single-use system. This article elucidates this new concept and illustrates applications in fermentations of Escherichia coli under pH-controlled and fed-batch conditions in shaken MTPs. Copyright 2010 Wiley Periodicals, Inc.

  20. Headspace solid-phase microextraction (HS-SPME) and liquid-liquid extraction (LLE): comparison of the performance in classification of ecstasy tablets. Part 2.

    PubMed

    Bonadio, Federica; Margot, Pierre; Delémont, Olivier; Esseiva, Pierre

    2008-11-20

    Headspace solid-phase microextraction (HS-SPME) is assessed as an alternative to the liquid-liquid extraction (LLE) currently used for 3,4-methylenedioxymethamphetamine (MDMA) profiling. Both methods were compared by evaluating their performance in discriminating and classifying samples. For this purpose, 62 different seizures were analysed using both extraction techniques followed by gas chromatography-mass spectrometry (GC-MS). A previously validated method provided data for HS-SPME, whereas LLE data were collected applying a harmonized methodology developed and used in the European project CHAMP. After suitable pre-treatment, similarities between sample pairs were studied using the Pearson correlation. Both methods make it possible to distinguish samples coming from the same pre-tabletting batch from samples coming from different pre-tabletting batches. This finding supports the use of HS-SPME as an effective alternative to LLE, with additional advantages such as simpler sample preparation and a solvent-free process.

  1. Convolutional neural networks with balanced batches for facial expressions recognition

    NASA Astrophysics Data System (ADS)

    Battini Sönmez, Elena; Cangelosi, Angelo

    2017-03-01

    This paper considers the issue of fully automatic emotion classification on 2D faces. Despite the great effort made in recent years, traditional machine learning approaches based on hand-crafted feature extraction followed by a classification stage have failed to deliver a real-time automatic facial expression recognition system. The proposed architecture uses Convolutional Neural Networks (CNNs), which are built as collections of interconnected processing elements loosely modeled on the human brain. The basic idea of CNNs is to learn a hierarchical representation of the input data, which results in better classification performance. In this work we present a block-based CNN algorithm that uses noise as a data augmentation technique and builds batches with a balanced number of samples per class. The proposed architecture is a very simple yet powerful CNN that can yield state-of-the-art accuracy on the very competitive benchmark of the Extended Cohn-Kanade database.
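
    The batch-construction scheme described, balanced per-class counts with noise augmentation, can be sketched as a simple generator. The image shapes, class count and noise level below are assumptions, not the paper's settings:

```python
# Sketch: class-balanced mini-batches with additive Gaussian noise as
# augmentation. Shapes and noise level are illustrative assumptions.
import numpy as np

def balanced_batches(X, y, per_class=8, noise_std=0.05, rng=None):
    """Yield batches containing `per_class` samples of every class."""
    rng = rng or np.random.default_rng()
    classes = np.unique(y)
    while True:
        idx = np.concatenate([
            rng.choice(np.flatnonzero(y == c), per_class, replace=True)
            for c in classes
        ])
        rng.shuffle(idx)
        xb = X[idx] + rng.normal(scale=noise_std, size=X[idx].shape)  # noise aug
        yield xb, y[idx]

# usage: 7 expression classes -> each batch holds 7 * 8 = 56 face images
rng = np.random.default_rng(3)
X = rng.random((700, 48, 48), dtype=np.float32)   # placeholder face crops
y = rng.integers(0, 7, size=700)                  # placeholder labels
xb, yb = next(balanced_batches(X, y, rng=rng))
print(xb.shape, np.bincount(yb))                  # (56, 48, 48), 8 per class
```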

  2. Acceptance Test Data for the AGR-5/6/7 Irradiation Test Fuel Composite Defective IPyC Fraction and Pyrocarbon Anisotropy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Helmreich, Grant W.; Hunn, John D.; Skitt, Darren J.

    Coated particle composite J52R-16-98005 was produced by Babcock and Wilcox Technologies (BWXT) as fuel for the Advanced Gas Reactor Fuel Development and Qualification (AGR) Program’s AGR-5/6/7 irradiation test in the Idaho National Laboratory (INL) Advanced Test Reactor (ATR). This composite comprised four coated particle fuel batches J52O-16-93165B (26%), 93168B (26%), 93169B (24%), and 93170B (24%), chosen based on the Quality Control (QC) data acquired for each individual candidate AGR-5/6/7 batch. Each batch was coated in a 150-mm-diameter production-scale fluidized-bed chemical vapor deposition (CVD) furnace. Tristructural isotropic (TRISO) coatings were deposited on 425-μm-nominal-diameter spherical kernels from BWXT Lot J52R-16-69317 containing a mixture of 15.5%-enriched uranium carbide and uranium oxide (UCO). The TRISO coatings consisted of four consecutive CVD layers: a ~50% dense carbon buffer layer with 100-μm-nominal thickness, a dense inner pyrolytic carbon (IPyC) layer with 40-μm-nominal thickness, a silicon carbide (SiC) layer with 35-μm-nominal thickness, and a dense outer pyrolytic carbon (OPyC) layer with 40-μm-nominal thickness. The TRISO-coated particle batches were sieved to upgrade the particles by removing over-sized and under-sized material, and the upgraded batches were designated by appending the letter A to the end of the batch number (e.g., 93165A). Secondary upgrading by sieving was performed on the A-designated batches to remove particles with missing or very-thin buffer layers that were identified during previous analysis of the individual batches for defective IPyC, as reported in the acceptance test data report for the AGR-5/6/7 production batches [Hunn et al. 2017]. The additionally-upgraded batches were designated by appending the letter B to the end of the batch number (e.g., 93165B).

  3. Manufacturing Study for a Four Meter Lightweight Mirror

    DTIC Science & Technology

    1980-04-01

    Preparation 2.1.1 The batch consists of the proper mixture of SiCl4 and TiCl4. This is accomplished by a weight process using electronic scales. The batch is...the requirement for this program. 2.2 Glass Laydown - Flame Hydrolysis 2.2.1 The laydown process is the operation where the SiCl4 and TiCl4 are

  4. Electron Driven Processes in Atmospheric Behaviour

    NASA Astrophysics Data System (ADS)

    Campbell, L.; Brunger, M. J.; Teubner, P. J. O.

    2006-11-01

    Electron impact plays an important role in many atmospheric processes. Calculation of these processes is important for basic understanding, atmospheric modeling and remote sensing. Accurate atomic and molecular data, including electron impact cross sections, are required for such calculations. Five electron-driven processes are considered: auroral and dayglow emissions, the reduction of atmospheric electron density by vibrationally excited N2, NO production, and infrared emission from NO. In most cases the predictions are compared with measurements. The dependence on experimental atomic and molecular data is also investigated.

  5. Improvement of l-lactic acid productivity from sweet sorghum juice by repeated batch fermentation coupled with membrane separation.

    PubMed

    Wang, Yong; Meng, Hongyu; Cai, Di; Wang, Bin; Qin, Peiyong; Wang, Zheng; Tan, Tianwei

    2016-07-01

    In order to efficiently produce l-lactic acid from non-food feedstocks, sweet sorghum juice (SSJ), which is rich in fermentable sugars, was directly used for l-lactic acid fermentation by Lactobacillus rhamnosus LA-04-1. A membrane-integrated repeated batch (MIRB) fermentation process was developed to improve productivity. High-cell-density fermentation was achieved with a final cell density (OD620) of 42.3, and the carbon catabolite repression (CCR) effect was overcome. When SSJ (6.77 g/L glucose, 4.51 g/L fructose and 50.46 g/L sucrose) was used as the carbon source in the MIRB process, l-lactic acid productivity increased significantly from 1.45 g/(L·h) (batch 1) to 17.55 g/(L·h) (batch 6). This process provides an effective way to produce l-lactic acid from SSJ. Copyright © 2016 Elsevier Ltd. All rights reserved.

  6. Fumaric Acid Production from Alkali-Pretreated Corncob by Fed-Batch Simultaneous Saccharification and Fermentation Combined with Separated Hydrolysis and Fermentation at High Solids Loading.

    PubMed

    Li, Xin; Zhou, Jin; Ouyang, Shuiping; Ouyang, Jia; Yong, Qiang

    2017-02-01

    Production of fumaric acid from alkali-pretreated corncob (APC) at high solids loading was investigated using a combination of separated hydrolysis and fermentation (SHF) and fed-batch simultaneous saccharification and fermentation (SSF) by Rhizopus oryzae. Four different fermentation modes were tested to maximize fumaric acid concentration at high solids loading. The highest concentration, 41.32 g/L fumaric acid, was obtained from 20% (w/v) APC at 38 °C in the combined SHF and fed-batch SSF process, compared with 19.13 g/L in batch SSF alone. The results indicated that combining SHF with fed-batch SSF significantly improved production of fumaric acid from lignocellulose by R. oryzae compared with batch SSF at high solids loading.

  7. Influence of Powder Characteristics on Processability of AlSi12 Alloy Fabricated by Selective Laser Melting

    PubMed Central

    Zherebtsov, Dmitry; Radionova, Ludmila

    2018-01-01

    Selective laser melting (SLM) is one of the additive manufacturing technologies that allows for the production of parts with complex shapes from either powder feedstock or from wires. Aluminum alloys have a great potential for use in SLM, especially in the automotive and aerospace fields. This paper studies the influence of starting powder characteristics on the processability of SLM-fabricated AlSi12 alloy. Three different batches of gas-atomized powders from different manufacturers were processed by SLM. The powders differ in particle size and its distribution, morphology and chemical composition. Cubic specimens (10 mm × 10 mm × 10 mm) were fabricated by SLM from the three different powder batches using optimized process parameters. The fabrication conditions were kept similar for the three powder batches. The influence of powder characteristics on the porosity and microstructure of the obtained specimens was studied in detail. The SLM samples produced from the three different powder batches do not show any significant variations in their structural aspects. However, the microstructural aspects differ and the amount of porosity in these three specimens varies significantly. This shows that both the flowability of the powder and the apparent density play an influential role in the processability of AlSi12 SLM samples. PMID:29735932

  8. Influence of Powder Characteristics on Processability of AlSi12 Alloy Fabricated by Selective Laser Melting.

    PubMed

    Baitimerov, Rustam; Lykov, Pavel; Zherebtsov, Dmitry; Radionova, Ludmila; Shultc, Alexey; Prashanth, Konda Gokuldoss

    2018-05-07

    Selective laser melting (SLM) is one of the additive manufacturing technologies that allows for the production of parts with complex shapes from either powder feedstock or from wires. Aluminum alloys have a great potential for use in SLM, especially in the automotive and aerospace fields. This paper studies the influence of starting powder characteristics on the processability of SLM-fabricated AlSi12 alloy. Three different batches of gas-atomized powders from different manufacturers were processed by SLM. The powders differ in particle size and its distribution, morphology and chemical composition. Cubic specimens (10 mm × 10 mm × 10 mm) were fabricated by SLM from the three different powder batches using optimized process parameters. The fabrication conditions were kept similar for the three powder batches. The influence of powder characteristics on the porosity and microstructure of the obtained specimens was studied in detail. The SLM samples produced from the three different powder batches do not show any significant variations in their structural aspects. However, the microstructural aspects differ and the amount of porosity in these three specimens varies significantly. This shows that both the flowability of the powder and the apparent density play an influential role in the processability of AlSi12 SLM samples.

  9. Improving lactate metabolism in an intensified CHO culture process: productivity and product quality considerations.

    PubMed

    Xu, Sen; Hoshan, Linda; Chen, Hao

    2016-11-01

    In this study, we discussed the development and optimization of an intensified CHO culture process, highlighting medium and control strategies to improve lactate metabolism. A few strategies, including supplementing glucose with other sugars (fructose, maltose, and galactose), controlling the glucose level at <0.2 mM, and supplementing the medium with copper sulfate, were found to be effective in reducing lactate accumulation. Among them, copper sulfate supplementation was found to be critical for process optimization when glucose was in excess. When copper sulfate was supplemented in the new process, a two-fold increase in cell density (66.5 ± 8.4 × 10⁶ cells/mL) and titer (11.9 ± 0.6 g/L) was achieved. Differences in productivity and product quality attributes between batch, fed-batch, and concentrated fed-batch cultures were discussed. The study demonstrated the importance of understanding the process and cell metabolism when adapting an existing process to a new operational mode.

  10. Fed-batch hydrolysate addition and cell separation by settling in high cell density lignocellulosic ethanol fermentations on AFEX™ corn stover in the Rapid Bioconversion with Integrated recycling Technology process.

    PubMed

    Sarks, Cory; Jin, Mingjie; Balan, Venkatesh; Dale, Bruce E

    2017-09-01

    The Rapid Bioconversion with Integrated recycling Technology (RaBIT) process uses enzyme and yeast recycling to improve cellulosic ethanol production economics. Previous versions of the RaBIT process exhibited decreased xylose consumption under cell recycle for a variety of different micro-organisms. Process changes were tested in an attempt to eliminate the xylose consumption decrease. Three different RaBIT process changes were evaluated in this work, including (1) shortening the fermentation time, (2) fed-batch hydrolysate addition, and (3) selective cell recycling using a settling method. Shortening the RaBIT fermentation process to 11 h and introducing fed-batch hydrolysate addition eliminated any xylose consumption decrease over ten fermentation cycles; otherwise, decreased xylose consumption was apparent by the third cell recycle event. However, partial removal of yeast cells during recycle was not economical when compared to recycling all yeast cells.

  11. Fossil fuel furnace reactor

    DOEpatents

    Parkinson, William J.

    1987-01-01

    A fossil fuel furnace reactor is provided for simulating a continuous processing plant with a batch reactor. An internal reaction vessel contains a batch of shale oil, with the vessel having a relatively thin wall thickness for a heat transfer rate effective to simulate a process temperature history in the selected continuous processing plant. A heater jacket is disposed about the reactor vessel and defines a number of independent controllable temperature zones axially spaced along the reaction vessel. Each temperature zone can be energized to simulate a time-temperature history of process material through the continuous plant. A pressure vessel contains both the heater jacket and the reaction vessel at an operating pressure functionally selected to simulate the continuous processing plant. The process yield from the oil shale may be used as feedback information to software simulating operation of the continuous plant to provide operating parameters, i.e., temperature profiles, ambient atmosphere, operating pressure, material feed rates, etc., for simulation in the batch reactor.

  12. Multivariate fault isolation of batch processes via variable selection in partial least squares discriminant analysis.

    PubMed

    Yan, Zhengbing; Kuang, Te-Hui; Yao, Yuan

    2017-09-01

    In recent years, multivariate statistical monitoring of batch processes has become a popular research topic, wherein multivariate fault isolation is an important step aiming at the identification of the faulty variables contributing most to the detected process abnormality. Although contribution plots have been commonly used in statistical fault isolation, such methods suffer from the smearing effect between correlated variables. In particular, in batch process monitoring, the high autocorrelations and cross-correlations that exist in variable trajectories make the smearing effect unavoidable. To address this problem, a variable selection-based fault isolation method is proposed in this research, which transforms the fault isolation problem into a variable selection problem in partial least squares discriminant analysis and solves it by calculating a sparse partial least squares model. Unlike traditional methods, the proposed method emphasizes the relative importance of each process variable. Such information may help process engineers in conducting root-cause diagnosis. Copyright © 2017 ISA. Published by Elsevier Ltd. All rights reserved.

  13. SEAPAK user's guide, version 2.0. Volume 2: Descriptions of programs

    NASA Technical Reports Server (NTRS)

    Mcclain, Charles R.; Darzi, Michael; Firestone, James K.; Fu, Gary; Yeh, Eueng-Nan; Endres, Daniel L.

    1991-01-01

    SEAPAK is a user-interactive satellite data analysis package that was developed for the processing and interpretation of Nimbus-7/Coastal Zone Color Scanner (CZCS) and NOAA Advanced Very High Resolution Radiometer (AVHRR) data. Significant revisions have been made since version 1.0, and the ancillary environmental data analysis module has been greatly expanded. The package continues to be user friendly and user interactive. Also, because the scientific goals of the ocean color research being conducted have shifted to large space and time scales, batch processing capabilities for both satellite and ancillary environmental data analyses were enhanced, allowing large quantities of data to be ingested and analyzed.

  14. Biosorption of Congo Red from aqueous solution onto burned root of Eichhornia crassipes biomass

    NASA Astrophysics Data System (ADS)

    Roy, Tapas Kumar; Mondal, Naba Kumar

    2017-07-01

    Biosorption is becoming a promising alternative to replace or supplement present dye removal processes for dye-containing waste water. In this work, adsorption of Congo Red (CR) from aqueous solution on burned root of Eichhornia crassipes (BREC) biomass was investigated. A series of batch experiments were performed utilizing BREC biomass to remove CR dye from aqueous systems. Under optimized batch conditions, the BREC could remove up to 94.35% of CR from waste water. The effects of operating parameters such as initial concentration, pH, adsorbent dose and contact time on the adsorption of CR were analyzed using response surface methodology. The proposed quadratic model for central composite design fitted the experimental data very well. Response surface plots were used to determine the interaction effects of main factors and optimum conditions of the process. The optimum adsorption conditions were found to be initial CR concentration = 5 mg/L, pH = 7, adsorbent dose = 0.125 g and contact time = 45 min. The experimental isotherm data were analyzed using the Langmuir, Freundlich, Temkin and Dubinin-Radushkevich (D-R) isotherm equations, and the results indicated that the Freundlich isotherm showed a better fit for CR adsorption. Thermodynamic parameters were calculated from the Van't Hoff plot, confirming that the adsorption process was spontaneous and exothermic. The high CR adsorptive removal ability and regeneration efficiency of this adsorbent suggest its applicability in industrial/household systems, and the data generated would help in further upscaling of the adsorption process.
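
    For reference, the Langmuir and Freundlich isotherms named above take their conventional forms; the sketch below uses the standard symbols (equilibrium uptake q_e, equilibrium concentration C_e, monolayer capacity q_m, constants K_L, K_F, n), not values fitted in this record:

        q_e = \frac{q_m K_L C_e}{1 + K_L C_e}   % Langmuir: monolayer adsorption onto a finite number of sites
        q_e = K_F C_e^{1/n}                     % Freundlich: empirical power law for heterogeneous surfaces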

  15. The Potential of Knowing More: A Review of Data-Driven Urban Water Management.

    PubMed

    Eggimann, Sven; Mutzner, Lena; Wani, Omar; Schneider, Mariane Yvonne; Spuhler, Dorothee; Moy de Vitry, Matthew; Beutler, Philipp; Maurer, Max

    2017-03-07

    The promise of collecting and utilizing large amounts of data has never been greater in the history of urban water management (UWM). This paper reviews several data-driven approaches which play a key role in bringing forward a sea change. It critically investigates whether data-driven UWM offers a promising foundation for addressing current challenges and supporting fundamental changes in UWM. We discuss the examples of better rain-data management, urban pluvial flood-risk management and forecasting, drinking water and sewer network operation and management, integrated design and management, increasing water productivity, wastewater-based epidemiology and on-site water and wastewater treatment. The accumulated evidence from literature points toward a future UWM that offers significant potential benefits thanks to increased collection and utilization of data. The findings show that data-driven UWM allows us to develop and apply novel methods, to optimize the efficiency of the current network-based approach, and to extend functionality of today's systems. However, generic challenges related to data-driven approaches (e.g., data processing, data availability, data quality, data costs) and the specific challenges of data-driven UWM need to be addressed, namely data access and ownership, current engineering practices and the difficulty of assessing the cost benefits of data-driven UWM.

  16. A novel model-based control strategy for aerobic filamentous fungal fed-batch fermentation processes.

    PubMed

    Mears, Lisa; Stocks, Stuart M; Albaek, Mads O; Cassells, Benny; Sin, Gürkan; Gernaey, Krist V

    2017-07-01

    A novel model-based control strategy has been developed for filamentous fungal fed-batch fermentation processes. The system of interest is a pilot scale (550 L) filamentous fungus process operating at Novozymes A/S. In such processes, it is desirable to maximize the total product achieved in a batch in a defined process time. In order to achieve this goal, it is important to maximize both the product concentration and the total final mass in the fed-batch system. To this end, we describe the development of a control strategy which aims to achieve maximum tank fill while avoiding oxygen-limited conditions. This requires a two-stage approach: (i) calculation of the tank start fill; and (ii) on-line control in order to maximize fill subject to oxygen transfer limitations. First, a mechanistic model was applied off-line in order to determine the appropriate start fill for processes with four different sets of process operating conditions for the stirrer speed, headspace pressure, and aeration rate. The start fills were tested with eight pilot scale experiments using a reference process operation. An on-line control strategy was then developed, utilizing the mechanistic model, which is recursively updated using on-line measurements. The model was applied in order to predict the current system states, including the biomass concentration, and to simulate the expected future trajectory of the system until a specified end time. In this way, the desired feed rate is updated along the progress of the batch, taking into account the oxygen mass transfer conditions and the expected future trajectory of the mass. The final results show that the target fill was achieved to within 5% under the maximum fill when tested using eight pilot scale batches, and overfilling was avoided. The results were reproducible, unlike the reference experiments, which show over 10% variation in the final tank fill, including overfilling. The variance of the final tank fill is reduced by over 74%, meaning that it is possible to target the final maximum fill reproducibly. The product concentration achieved at a given set of process conditions was unaffected by the control strategy. Biotechnol. Bioeng. 2017;114: 1459-1468. © 2017 Wiley Periodicals, Inc.
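
    The recursive predict-then-adjust feed logic described above can be illustrated with a minimal sketch; the crude one-state growth model, all parameter values, and the names (predict_peak_our, MAX_OTR) are illustrative assumptions, not the authors' pilot-plant implementation.

        # Minimal sketch of model-based feed control: step a crude growth model
        # forward, and cut the feed whenever predicted oxygen demand would exceed
        # the transfer capacity; otherwise push toward maximum tank fill.
        def predict_peak_our(X, V, feed, mu=0.1, y_o2=1.0, dt=0.1, horizon=10.0):
            """Forward-simulate biomass X (g/L) and volume V (L); return the
            peak predicted oxygen uptake rate (g/L/h) over the horizon (h)."""
            peak, t = 0.0, 0.0
            while t < horizon:
                growth = mu * X                      # first-order growth term
                X += (growth - X * feed / V) * dt    # growth minus feed dilution
                V += feed * dt
                peak = max(peak, y_o2 * growth)      # O2 demand tracks growth
                t += dt
            return peak

        feed, X, V = 0.02, 10.0, 350.0   # L/h, g/L, L (illustrative values)
        MAX_OTR = 2.5                    # g/L/h at fixed stirrer speed/pressure
        for _ in range(100):             # recursive on-line adjustment loop
            feed *= 0.95 if predict_peak_our(X, V, feed) > MAX_OTR else 1.02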

  17. Acceptance Test Data for BWXT Coated Particle Batches 93172B and 93173B—Defective IPyC and Pyrocarbon Anisotropy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hunn, John D.; Helmreich, Grant W.; Dyer, John A.

    Coated particle batches J52O-16-93172B and J52O-16-93173B were produced by Babcock and Wilcox Technologies (BWXT) as part of the production campaign for the Advanced Gas Reactor Fuel Development and Qualification (AGR) Program’s AGR-5/6/7 irradiation test in the Idaho National Laboratory (INL) Advanced Test Reactor (ATR), but were not used in the final fuel composite. However, these batches may be used as demonstration production-scale coated particle fuel for other experiments. Each batch was coated in a 150-mm-diameter production-scale fluidized-bed chemical vapor deposition (CVD) furnace. Tristructural isotropic (TRISO) coatings were deposited on 425-μm-nominal-diameter spherical kernels from BWXT lot J52R-16-69317 containing a mixture of 15.5%-enriched uranium carbide and uranium oxide (UCO). The TRISO coatings consisted of four consecutive CVD layers: a ~50% dense carbon buffer layer with 100-μm-nominal thickness, a dense inner pyrolytic carbon (IPyC) layer with 40-μm-nominal thickness, a silicon carbide (SiC) layer with 35-μm-nominal thickness, and a dense outer pyrolytic carbon (OPyC) layer with 40-μm-nominal thickness. The TRISO-coated particle batches were sieved to upgrade the particles by removing over-sized and under-sized material, and the upgraded batches were designated by appending the letter A to the end of the batch number (e.g., 93172A). Secondary upgrading by sieving was performed on the A-designated batches to remove particles with missing or very-thin buffer layers that were identified during previous analysis of the individual batches for defective IPyC, as reported in the acceptance test data report for the AGR-5/6/7 production batches [Hunn et al. 2017b]. The additionally-upgraded batches were designated by appending the letter B to the end of the batch number (e.g., 93172B).

  18. Relationship between mozzarella yield and milk composition, processing factors, and recovery of whey constituents.

    PubMed

    Sales, D C; Rangel, A H N; Urbano, S A; Freitas, Alfredo R; Tonhati, Humberto; Novaes, L P; Pereira, M I B; Borba, L H F

    2017-06-01

    Our aim was to identify the relationship between mozzarella cheese yield and buffalo milk composition, processing factors, and recovery of whey constituents. The production of 30 batches of mozzarella cheese at a dairy plant in northeast Brazil (Rio Grande do Norte) was monitored between March and November 2015. Mozzarella yield and 32 other variables were observed for each batch and divided into 3 groups: milk composition variables (12); variables involved in the cheesemaking process (14); and variables for recovery of whey constituents (6). Data were analyzed using descriptive statistics, Pearson correlation, and principal component analysis. Most of the correlations between milk composition variables and between the variables of the manufacturing processes were not significant. Significant correlations were mostly observed between variables for recovery of whey constituents. Yield showed significant correlation only with the time elapsed between curd cuttings and the age of the starter culture, and it showed the greatest association with the age of the starter culture, the time elapsed between curd cuttings and during stretching, and milk pH and density. Thus, processing factors and milk characteristics are closely related to dairy efficiency in mozzarella manufacturing. Copyright © 2017 American Dairy Science Association. Published by Elsevier Inc. All rights reserved.

  19. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Herman, D.

    The Savannah River Site (SRS) Actinide Removal Process has been processing salt waste since 2008. This process includes a filtration step in the 512-S facility. Initial operations included the addition, or strike, of monosodium titanate (MST) to remove soluble actinides and strontium. The added MST and any entrained sludge solids were then separated from the supernate by cross flow filtration. During this time, the filter operations have, on many occasions, been the bottleneck process limiting the rate of salt processing. Recently, 512-S has started operations utilizing “No-MST”, in which the MST actinide removal strike is not performed and the supernate is simply pre-filtered prior to Cs removal processing. Direct filtration of decanted tank supernate, as demonstrated in 512-S, is the proposed method of operation for the Hanford Low Activity Waste Pretreatment System (LAWPS) facility. Processing decanted supernate without MST solids has been demonstrated for cross flow filtration to provide a significant improvement in production with the SRS Salt Batches 8 and 9 feed chemistries. The average filtration rate for the first 512-S batch processing cycle using No-MST has increased filtrate production by over 35% relative to the historical average. The increase was sustained for more than double the number of filtrate batches processed before cleaning of the filter was necessary. While there are differences in the design of the 512-S and Hanford filter systems, the 512-S system should provide a reasonable indication of LAWPS filter performance with similar feed properties. Based on the data from the 512-S facility and with favorable feed properties, the LAWPS filter, as currently sized at over twice the size of the 512-S filter (532 square feet of filtration area versus 235 square feet), has the potential to provide sustained filtrate production at the upper range of the planned LAWPS production rate of 17 gpm.

  20. Teachers' Experiences with the Data-Driven Decision Making Process in Increasing Students' Reading Achievement in a Title I Elementary Public School

    ERIC Educational Resources Information Center

    Atkinson, Linton

    2015-01-01

    This paper is a research dissertation based on a qualitative case study conducted on Teachers' Experiences within a Data-Driven Decision Making (DDDM) process. The study site was a Title I elementary school in a large school district in Central Florida. Background information is given in relation to the need for research that was conducted on the…

  1. Laboratory-scale integrated ARP filter test

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Poirier, M.; Burket, P.

    2016-03-01

    The Savannah River Site (SRS) is currently treating radioactive liquid waste with the Actinide Removal Process (ARP) and the Modular Caustic Side Solvent Extraction Unit (MCU). Recently, the low filter flux through the ARP of approximately 5 gallons per minute has limited the rate at which radioactive liquid waste can be treated. Salt Batch 6 had a lower processing rate and required frequent filter cleaning. There is a desire to understand the causes of the low filter flux and to increase ARP/MCU throughput. This task attempted to simulate the entire ARP process, including multiple batches (5), washing, chemical cleaning, and blending the feed with heels and recycle streams. The objective of the tests was to determine whether one of these processes is causing excessive fouling of the crossflow or secondary filter. The authors conducted the tests with feed solutions containing 6.6 M sodium Salt Batch 6 simulant supernate with no MST.

  2. Batch fabrication process development for ferrite logic conductors

    NASA Technical Reports Server (NTRS)

    Heckler, C. H., Jr.; Bhiwandker, N. C.

    1972-01-01

    A process for fabricating ultrareliable magnetic ferrite logic circuits is described in which the conductors are formed by a combination of two batch type processes - photolithography and electroplating - and a mechanized writing process for completing conductors in the third dimension. Up to 4 turns, through an aperture 1 mm in diameter, are formed by the described process. The number of joints in the conductors is reduced by use of this process to only those which are required for input, output and power connections of a logic block. To demonstrate feasibility, 8-stage magnetic ring counter circuits have been fabricated.

  3. Analyzing data flows of WLCG jobs at batch job level

    NASA Astrophysics Data System (ADS)

    Kuehn, Eileen; Fischer, Max; Giffels, Manuel; Jung, Christopher; Petzold, Andreas

    2015-05-01

    With the introduction of federated data access to the workflows of WLCG, it is becoming increasingly important for data centers to understand specific data flows regarding storage element accesses, firewall configurations, as well as the scheduling of batch jobs themselves. As existing batch system monitoring and related system monitoring tools do not support measurements at batch job level, a new tool has been developed and put into operation at the GridKa Tier 1 center for monitoring continuous data streams and characteristics of WLCG jobs and pilots. Long term measurements and data collection are in progress. These measurements have already proven useful for analyzing misbehavior and various issues. We therefore aim for an automated, real-time approach to anomaly detection. As a prerequisite, prototypes for standard workflows have to be examined. Based on measurements over several months, different features of HEP jobs are evaluated regarding their effectiveness for data mining approaches to identify these common workflows. The paper introduces the actual measurement approach and statistics, as well as the general concept and first results in classifying different HEP job workflows derived from the measurements at GridKa.

  4. Image-based information, communication, and retrieval

    NASA Technical Reports Server (NTRS)

    Bryant, N. A.; Zobrist, A. L.

    1980-01-01

    The IBIS/VICAR system combines video image processing and information management. Flexible programs require the user to supply only parameters specific to a particular application. Special-purpose input/output routines transfer image data with reduced memory requirements. New application programs are easily incorporated. The program is written in FORTRAN IV, Assembler, and OS JCL for batch execution and has been implemented on the IBM 360.

  5. SHOEBOX: A Personal File Handling System for Textual Data. Information System Language Studies, Number 23.

    ERIC Educational Resources Information Center

    Glantz, Richard S.

    Until recently, the emphasis in information storage and retrieval systems has been towards batch-processing of large files. In contrast, SHOEBOX is designed for the unformatted, personal file collection of the computer-naive individual. Operating through display terminals in a time-sharing, interactive environment on the IBM 360, the user can…

  6. A comparison of abundance estimates from extended batch-marking and Jolly–Seber-type experiments

    PubMed Central

    Cowen, Laura L E; Besbeas, Panagiotis; Morgan, Byron J T; Schwarz, Carl J

    2014-01-01

    Little attention has been paid to the use of multi-sample batch-marking studies, as it is generally assumed that an individual's capture history is necessary for fully efficient estimates. However, recently, Huggins et al. (2010) present a pseudo-likelihood for a multi-sample batch-marking study where they used estimating equations to solve for survival and capture probabilities and then derived abundance estimates using a Horvitz–Thompson-type estimator. We have developed and maximized the likelihood for batch-marking studies. We use data simulated from a Jolly–Seber-type study and convert this to what would have been obtained from an extended batch-marking study. We compare our abundance estimates obtained from the Crosbie–Manly–Arnason–Schwarz (CMAS) model with those of the extended batch-marking model to determine the efficiency of collecting and analyzing batch-marking data. We found that estimates of abundance were similar for all three estimators: CMAS, Huggins, and our likelihood. Gains are made when using unique identifiers and employing the CMAS model in terms of precision; however, the likelihood typically had lower mean square error than the pseudo-likelihood method of Huggins et al. (2010). When faced with designing a batch-marking study, researchers can be confident in obtaining unbiased abundance estimators. Furthermore, they can design studies in order to reduce mean square error by manipulating capture probabilities and sample size. PMID:24558576
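
    In its simplest form, the Horvitz–Thompson-type abundance estimator referred to above divides the count of captured animals by the estimated capture probability; a minimal sketch in standard notation (n_i animals captured in sample i, \hat{p}_i the estimated capture probability), not the full estimator derived in the paper:

        \hat{N}_i = \frac{n_i}{\hat{p}_i}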

  7. Data-Driven Decision-Making: Facilitating Teacher Use of Student Data to Inform Classroom Instruction

    ERIC Educational Resources Information Center

    Schifter, Catherine C.; Natarajan, Uma; Ketelhut, Diane Jass; Kirchgessner, Amanda

    2014-01-01

    Data-driven decision making is essential in K-12 education today, but teachers often do not know how to make use of extensive data sets. Research shows that teachers are not taught how to use extensive data (i.e., multiple data sets) to reflect on student progress or to differentiate instruction. This paper presents a process used in a National…

  8. Efficient provisioning for multi-core applications with LSF

    NASA Astrophysics Data System (ADS)

    Dal Pra, Stefano

    2015-12-01

    Tier-1 sites providing computing power for HEP experiments are usually tightly designed for high-throughput performance. This is pursued by reducing the variety of supported use cases and tuning for performance the most important ones, chief among which has been the single-core job. Moreover, the usual workload is saturation: each available core in the farm is in use and there are queued jobs waiting for their turn to run. Enabling multi-core jobs thus requires dedicating a number of hosts on which to run and waiting for them to free the needed number of cores. This drain time introduces a loss of computing power driven by the number of unusable empty cores. As an increasing demand for multi-core capable resources has emerged, a Task Force has been constituted in WLCG with the goal of defining a simple and efficient multi-core resource provisioning model. This paper details the work done at the INFN Tier-1 to enable multi-core support for the LSF batch system, with the intent of reducing to the minimum the average number of unused cores. The adopted strategy has been to dedicate to multi-core a dynamic set of nodes, whose size is mainly driven by the number of pending multi-core requests and the fair-share priority of the submitting user. The node status transition, from single-core to multi-core and vice versa, is driven by a finite state machine implemented in a custom multi-core director script running in the cluster. After describing and motivating both the implementation and the details specific to the LSF batch system, results about performance are reported. Factors having positive and negative impact on the overall efficiency are discussed, and solutions to reduce the negative ones as much as possible are proposed.
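
    The single/multi-core node transitions described above can be sketched as a small finite state machine; the state names, arguments, and thresholds below are illustrative assumptions, not the INFN Tier-1 director script itself.

        # Sketch of the state transitions: a node drains its single-core jobs
        # before joining the multi-core pool, and returns when demand vanishes.
        SINGLE, DRAINING, MULTI = "single", "draining", "multi"

        def next_state(state, pending_mcore, running_single, mcore_nodes, target):
            if state == SINGLE and pending_mcore > 0 and mcore_nodes < target:
                return DRAINING              # stop accepting new single-core jobs
            if state == DRAINING and running_single == 0:
                return MULTI                 # fully drained, usable for multi-core
            if state == MULTI and pending_mcore == 0:
                return SINGLE                # no demand, rejoin single-core pool
            return state

        print(next_state(SINGLE, pending_mcore=3, running_single=8,
                         mcore_nodes=1, target=4))   # -> 'draining'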

  9. Preliminary design of axial turbine discs for aircraft engines [Conception preliminaire de disques de turbine axiale pour moteurs d'aeronefs]

    NASA Astrophysics Data System (ADS)

    Ouellet, Yannick

    The preliminary design phase of a turbine rotor has an important impact on the architecture of a new engine definition, as it sets the technical orientation right from the start and provides a good estimate of product performance, weight and cost. In addition, execution speed at this preliminary phase has become critical to capturing business opportunities. Improving upfront accuracy also alleviates downstream detailed design work and therefore reduces overall product development cycle time. This preliminary phase contains elements that slow down the process, including low interoperability of currently used systems, incompatibility of software and ineffective management of data. In order to overcome these barriers, we have developed the first module of a new Design and Analysis (D&A) platform for the rotor disc. This complete platform integrates the different tools, which run in batch mode, and is driven from a single graphical user interface. The platform has been linked with different optimization methods (algorithms, configuration) in order to automate the disc design and propose best practices for rotor structural optimization. This methodology reduced design cycle time and improved performance. It was applied to two reference P&WC axial discs. The platform's architecture was also used in the development of reference charts to better understand disc performance within a given design space. Four high pressure rotor discs of P&WC turbofan and turboprop engines were used to generate the technical charts and understand the effect of various parameters. The new tools supporting disc D&A, combined with the optimization process and reference charts, have proven beneficial in terms of component performance and engineering effort.

  10. Adsorption and removal of clofibric acid and diclofenac from water with MIEX resin.

    PubMed

    Lu, Xian; Shao, Yisheng; Gao, Naiyun; Chen, Juxiang; Zhang, Yansen; Wang, Qiongfang; Lu, Yuqi

    2016-10-01

    This study demonstrates the use of MIEX resin as an efficient adsorbent for the removal of clofibric acid (CA) and diclofenac (DCF). The adsorption performance of CA and DCF was investigated in batch mode in single-component and bi-component adsorption systems. Various factors influencing the adsorption of CA and DCF, including initial concentration, contact time, adsorbent dosage, initial solution pH, agitation speed, natural organic matter and coexistent anions, are studied. The Langmuir model can describe CA adsorption well in the single-component system, while the Freundlich model gives a better fit in the bi-component system. The DCF adsorption can be well fitted by the Freundlich model in both systems. Thermodynamic analyses show that the adsorption of CA and DCF is an endothermic (ΔH° > 0), entropy-driven (ΔS° > 0) process and that more randomness exists in the DCF adsorption process. The values of Gibbs free energy (ΔG° < 0) indicate that the adsorption of DCF is spontaneous but that CA adsorption is nonspontaneous (ΔG° > 0). The kinetic data suggest that the adsorption of CA and DCF follows the pseudo-first-order model in both systems and that intra-particle diffusion is not the only rate-limiting step. The adsorption process is controlled simultaneously by external mass transfer and surface diffusion, according to the surface diffusion modified Biot number (Bis) ranging from 1.06 to 26.15. Moreover, possible removal mechanisms for CA and DCF are proposed based on the ion exchange stoichiometry. Copyright © 2016 Elsevier Ltd. All rights reserved.
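
    The pseudo-first-order kinetic model and the thermodynamic relations invoked above follow their standard forms; a sketch in conventional notation (q_t adsorbed amount at time t, k_1 rate constant, K the equilibrium constant), not values fitted in this record:

        \ln(q_e - q_t) = \ln q_e - k_1 t                                   % pseudo-first-order kinetics
        \ln K = \frac{\Delta S^{\circ}}{R} - \frac{\Delta H^{\circ}}{RT}   % Van't Hoff relation
        \Delta G^{\circ} = \Delta H^{\circ} - T \Delta S^{\circ}           % Gibbs free energy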

  11. U.S.-MEXICO BORDER PROGRAM ARIZONA BORDER STUDY--STANDARD OPERATING PROCEDURE FOR BATCHING OF LABORATORY DATA (UA-C-7.0)

    EPA Science Inventory

    The purpose of this SOP is to describe the steps involved in batching the physical laboratory data forms generated by the Arizona Border Study and slated for data entry. It applies to all physical laboratory data forms entered for this study. This procedure was followed to ensu...

  12. NHEXAS PHASE I ARIZONA STUDY--STANDARD OPERATING PROCEDURE FOR BATCHING OF LAB DATA (UA-C-7.0)

    EPA Science Inventory

    The purpose of this SOP is to describe the steps involved in batching the physical laboratory data forms generated by NHEXAS Arizona and slated for data entry at the primary NHEXAS Arizona office. It applies to all physical laboratory data forms entered at this site. This proced...

  13. Application of integrated ozone and granular activated carbon for decolorization and chemical oxygen demand reduction of vinasse from alcohol distilleries.

    PubMed

    Hadavifar, Mojtaba; Younesi, Habibollah; Zinatizadeh, Ali Akbar; Mahdad, Faezeh; Li, Qin; Ghasemi, Zahra

    2016-04-01

    This study investigates the treatment of distillery vinasse using a hybrid process integrating ozone oxidation and granular activated carbon (GAC) in both batch and continuous operation modes. The batch-process studies were carried out to optimize the initial influent pH, GAC dose, and the effect of ozone (O3) and hydrogen peroxide (H2O2) concentrations on chemical oxygen demand (COD) and color removal from the distillery vinasse. The continuous process was carried out with GAC and ozone treatment alone, as well as with the hybrid process combining both methods, to investigate the synergistic effectiveness of the two methods for vinasse COD reduction and color removal. In the continuous process, the Yan model described the experimental data better than the Thomas model. Ozonation of the distillery vinasse was more effective for color removal (74.4%) than COD removal (25%). The O3/H2O2 process was not considerably more effective for COD and color removal. Moreover, the O3/GAC process negatively affected the efficiency of COD and color removal from the distillery vinasse. This negative effect decreased with increasing influent pH. Copyright © 2016 Elsevier Ltd. All rights reserved.

  14. Using high throughput screening to define virus clearance by chromatography resins.

    PubMed

    Connell-Crowley, Lisa; Larimore, Elizabeth A; Gillespie, Ron

    2013-07-01

    High throughput screening (HTS) of chromatography resins can accelerate downstream process development by rapidly providing information on product and impurity partitioning over a wide range of experimental conditions. In addition to the removal of typical product and process-related impurities, chromatography steps are also used to remove potential adventitious viral contaminants and non-infectious retrovirus-like particles expressed by rodent cell lines used for production. This article evaluates the feasibility of using HTS in a 96-well batch-binding format to study removal of the model retrovirus xenotropic murine leukemia virus (xMuLV) from product streams. Two resins were examined: the anion exchange resin Q Sepharose Fast Flow™ (QSFF) and Capto adhere™, a mixed mode resin. QSFF batch-binding HTS data were generated using two mAbs at various pHs, NaCl concentrations, and levels of impurities. Comparison of HTS data to that generated using the column format showed good agreement with respect to virus retention at different pHs, NaCl concentrations and impurity levels. Results indicate that NaCl concentration and impurity level, but not pH, are key parameters that can impact xMuLV binding to both resins. Binding of xMuLV to Capto adhere appeared to tolerate higher levels of NaCl and impurity than QSFF, and showed some product-specific impact on binding that was not observed with QSFF. Overall, the results demonstrate that the 96-well batch-binding HTS technique can be an effective tool for rapidly defining conditions for robust virus clearance on chromatographic resins. Copyright © 2013 Wiley Periodicals, Inc.

  15. Using the scanning electron microscope on the production line to assure quality semiconductors

    NASA Technical Reports Server (NTRS)

    Adolphsen, J. W.; Anstead, R. J.

    1972-01-01

    The use of the scanning electron microscope to detect metallization defects introduced during batch processing of semiconductor devices is discussed. A method of determining metallization integrity was developed which culminates in a procurement specification using the scanning microscope on the production line as a quality control tool. Batch process control of the metallization operation is monitored early in the manufacturing cycle.

  16. Establishment of replacement batches for heparin low-molecular-mass for calibration CRS, and the International Standard Low Molecular Weight Heparin for Calibration.

    PubMed

    Mulloy, B; Heath, A; Behr-Gross, M-E

    2007-12-01

    An international collaborative study involving fourteen laboratories has taken place, organised by the European Directorate for the Quality of Medicines & HealthCare (EDQM) with National Institute for Biological Standards & Control (NIBSC) (in its capacity as a World Health Organisation (WHO) Laboratory for Biological Standardisation) to provide supporting data for the establishment of replacement batches of Heparin Low-Molecular-Mass (LMM) for Calibration Chemical Reference Substance (CRS), and of the International Reference Reagent (IRR) Low Molecular Weight Heparin for Molecular Weight Calibration. A batch of low-molecular-mass heparin was donated to the organisers and candidate preparations of freeze-dried heparin were produced at NIBSC and EDQM. The establishment study was organised in two phases: a prequalification (phase 1, performed in 3 laboratories in 2005) followed by an international collaborative study (phase 2). In phase 2, started in March 2006, molecular mass parameters were determined for seven different LMM heparin samples using the current CRS batch and two batches of candidate replacement material with a defined number average relative molecular mass (Mn) of 3,700, determined in phase 1. The values calculated using the candidates as standard were systematically different from values calculated using the current batch with its assigned number-average molecular mass (Mna) of 3,700. Using raw data supplied by participants, molecular mass parameters were recalculated using the candidates as standard with values for Mna of 3,800 and 3,900. Values for these parameters agreed more closely with those calculated using the current batch supporting the fact that the candidates, though similar to batch 1 in view of the production processes used, differ slightly in terms of molecular mass distribution. Therefore establishment of the candidates was recommended with an assigned Mna value of 3,800 that is both consistent with phase 1 results and guarantees continuity with the current CRS batch. In phase 2, participants also determined molecular weight parameters for the seven different LMM heparin samples using both the 1st IRR (90/686) and its Broad Standard Table and the candidate World Health Organization (WHO) 2nd International Standard (05/112) (2nd IS) using a Broad Standard Table established in phase 1. Mean molecular weights calculated using 2nd IS were slightly higher than with 1st IRR, and participants in the study indicated that this systematic difference precluded establishment of 2nd IS with the table supplied. A replacement Broad Standard Table has been devised on the basis of the central recalculations of raw data supplied by participants; this table gives improved agreement between values derived using the 1st IRR and the candidate 2nd IS. On the basis of this study a recommendation was made for the establishment of 2nd IS and its proposed Broad Standard Table as a replacement for the 1st International Reference Reagent Low Molecular Weight Heparin for Molecular Weight Calibration. Unlike the 1st IRR however, the candidate material 2nd IS is not suitable for use with the method of Nielsen. The candidate materials were established as heparin low-molecular-mass for calibration batches 2 and 3 by the Ph. Eur. Commission in March 2007 and as 2nd IS low-molecular-weight heparin for molecular weight calibration (05/112) by the Expert Committee on Biological Standardization in November 2007.
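
    The number-average molecular mass (Mn) used throughout this record is the first moment of the chain-mass distribution; in standard notation, with N_i chains of molecular mass M_i:

        M_n = \frac{\sum_i N_i M_i}{\sum_i N_i}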

  17. Process-Structure Linkages Using a Data Science Approach: Application to Simulated Additive Manufacturing Data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Popova, Evdokia; Rodgers, Theron M.; Gong, Xinyi

    A novel data science workflow is developed and demonstrated to extract process-structure linkages (i.e., a reduced-order model) for microstructure evolution problems when the final microstructure depends on (simulation or experimental) processing parameters. Our workflow consists of four main steps: data pre-processing, microstructure quantification, dimensionality reduction, and extraction/validation of process-structure linkages. The methods that can be employed within each step vary based on the type and amount of available data. In this paper, this data-driven workflow is applied to a set of synthetic additive manufacturing microstructures obtained using the Potts-kinetic Monte Carlo (kMC) approach. Additive manufacturing techniques inherently produce complex microstructures that can vary significantly with processing conditions. Using the developed workflow, a low-dimensional data-driven model was established to correlate process parameters with the predicted final microstructure. In addition, the modular workflows developed and presented in this work facilitate easy dissemination and curation by the broader community.
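
    The four-step workflow described above can be sketched in a few lines; the random arrays below are placeholders standing in for microstructure statistics of kMC simulations, and the choice of PCA plus linear regression is one plausible reading of "dimensionality reduction" and "process-structure linkage", not the authors' exact pipeline.

        # Sketch: quantify -> reduce dimensionality (PCA) -> link process
        # parameters to the low-dimensional microstructure representation.
        import numpy as np
        from sklearn.decomposition import PCA
        from sklearn.linear_model import LinearRegression

        rng = np.random.default_rng(0)
        params = rng.uniform(size=(50, 3))     # e.g., laser power, speed, spacing
        stats = rng.normal(size=(50, 1000))    # microstructure statistics vectors

        pca = PCA(n_components=5).fit(stats)             # dimensionality reduction
        scores = pca.transform(stats)                    # reduced-order coordinates
        linkage = LinearRegression().fit(params, scores) # process-structure model
        predicted = pca.inverse_transform(linkage.predict(params[:1]))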

  19. [Identification of the authentic quality of Longdanxiegan pill by systematic quantified fingerprint method based on three wavelength fusion chromatogram].

    PubMed

    Sun, Guoxiang; Zhang, Jingxian

    2009-05-01

    The three-wavelength fusion high performance liquid chromatographic fingerprint (TWFFP) of Longdanxiegan pill (LDXGP) was established to identify the quality of LDXGP by the systematic quantified fingerprint method. The chromatographic fingerprints (CFPs) of 12 batches of LDXGP were determined by reversed-phase high performance liquid chromatography. The technique of multi-wavelength fusion fingerprinting was applied in processing the fingerprints. TWFFPs containing 63 co-possessing peaks were obtained when the baicalin peak was chosen as the reference peak. The 12 batches of LDXGP were identified by hierarchical clustering analysis using the macro qualitative similarity (S(m)) as the variable. According to the classification results, the referential fingerprint (RFP) was synthesized from 10 batches of LDXGP. Taking the RFP as the qualified model, all 12 batches of LDXGP were evaluated by the systematic quantified fingerprint method. Among the 12 batches of LDXGP, 9 batches were completely qualified, the contents of 1 batch were markedly higher, and the quantity and distributed proportion of chemical constituents in 2 batches were not qualified. The systematic quantified fingerprint method based on the technique of multi-wavelength fusion fingerprinting can effectively identify the authentic quality of traditional Chinese medicine.

  20. Revisiting Statistical Aspects of Nuclear Material Accounting

    DOE PAGES

    Burr, T.; Hamada, M. S.

    2013-01-01

    Nuclear material accounting (NMA) is the only safeguards system whose benefits are routinely quantified. Process monitoring (PM) is another safeguards system that is increasingly used, and one challenge is how to quantify its benefit. This paper considers PM in the role of enabling frequent NMA, which is referred to as near-real-time accounting (NRTA). We quantify NRTA benefits using period-driven and data-driven testing. Period-driven testing makes a decision to alarm or not at fixed periods. Data-driven testing decides as the data arrive whether to alarm or continue testing. The difference between period-driven and data-driven viewpoints is illustrated by using one-year and two-year periods. For both one-year and two-year periods, period-driven NMA using once-per-year cumulative material unaccounted for (CUMUF) testing is compared to more frequent Shewhart and joint sequential cusum testing using either MUF or standardized, independently transformed MUF (SITMUF) data. We show that the data-driven viewpoint is appropriate for NRTA and that it can be used to compare safeguards effectiveness. In addition to providing period-driven and data-driven viewpoints, new features include assessing the impact of uncertainty in the estimated covariance matrix of the MUF sequence and the impact of both random and systematic measurement errors.
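
    The sequential cusum testing referenced above builds on the standard one-sided recursion; a sketch in conventional notation, with reference value k and decision threshold h (an alarm is raised when S_n exceeds h), not the exact joint test used in the paper:

        S_0 = 0, \qquad S_n = \max\left(0,\; S_{n-1} + \mathrm{MUF}_n - k\right)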

  1. Production of orthophosphate suspension fertilizers from wet-process acid

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jones, T.M.; Burnell, J.R.

    1984-01-01

    For many years, the Tennessee Valley Authority (TVA) has worked toward development of suspension fertilizers. TVA has two plants for production of base suspension fertilizers from wet-process orthophosphoric acid. One is a demonstration-scale plant where a 13-38-0 grade base suspension is produced by a three-stage ammoniation process. The other is a new batch-type pilot plant which is capable of producing high-grade base suspensions of various ratios and grades from wet-process acid. In this batch plant, suspensions and solutions can also be produced from solid intermediates.

  3. Time delay and noise explaining the behaviour of the cell growth in fermentation process

    NASA Astrophysics Data System (ADS)

    Ayuobi, Tawfiqullah; Rosli, Norhayati; Bahar, Arifah; Salleh, Madihah Md

    2015-02-01

    This paper investigates the interplay between time delay and external noise in explaining the behaviour of microbial growth in a batch fermentation process. Time delay and noise are modelled jointly via stochastic delay differential equations (SDDEs). The typical behaviour of cell concentration in batch fermentation under this model is investigated. The Milstein scheme is applied to solve the model numerically. Simulation results illustrate the effects of time delay and external noise in explaining the lag and stationary phases, respectively, of cell growth in the fermentation process.
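
    The Milstein discretisation referred to above can be sketched for a delayed logistic growth SDDE, dX = r X(t) (1 - X(t-\tau)/K) dt + \sigma X(t) dW; the equation form and all parameter values below are illustrative assumptions, not the model fitted in this record.

        # Milstein scheme for a delayed logistic SDE with multiplicative noise:
        # the correction term 0.5*g*g'*(dW^2 - dt) uses g(X) = sigma*X.
        import numpy as np

        r, K, sigma, tau = 0.5, 10.0, 0.1, 2.0
        dt, T = 0.01, 30.0
        n, lag = int(T / dt), int(tau / dt)
        rng = np.random.default_rng(1)

        X = np.empty(n + 1)
        X[:lag + 1] = 0.1                    # constant history on [-tau, 0]
        for i in range(lag, n):
            dW = rng.normal(0.0, np.sqrt(dt))
            drift = r * X[i] * (1.0 - X[i - lag] / K)
            X[i + 1] = (X[i] + drift * dt + sigma * X[i] * dW
                        + 0.5 * sigma**2 * X[i] * (dW**2 - dt))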

  4. Traditionally used medicinal plants against uncomplicated urinary tract infections: Are unusual, flavan-4-ol- and derhamnosylmaysin derivatives responsible for the antiadhesive activity of extracts obtained from stigmata of Zea mays L. against uropathogenic E. coli and Benzethonium chloride as frequent contaminant faking potential antibacterial activities?

    PubMed

    Rafsanjany, Nasli; Sendker, Jandirk; Lechtenberg, Matthias; Petereit, Frank; Scharf, Birte; Hensel, Andreas

    2015-09-01

    The dried stigmata of Zea mays L. are used traditionally for the treatment of uncomplicated urinary tract infections. A recent screening indicated that a hydroalcoholic extract of the herbal material inhibits the adhesion of uropathogenic Escherichia coli (UPEC) to T24 bladder cells. For verification of these data, EtOH-water (1:1) extracts from 4 different batches of Maydis stigmata were investigated. Within an in vitro adhesion assay (UPEC strain 2980 and human T24 bladder cells), a dose-dependent antiadhesive activity against UPEC was verified (IC50 1040 μg/mL). Bioassay-guided fractionation of M. stigmata, batch S1, by EtOH-water extraction, followed by chromatography on Sephadex LH20, revealed two active fractions (I and XI). Further purification of fraction I and structure elucidation of the isolated compound revealed the presence of significant amounts of the biocide benzethonium chloride as a contaminant. Benzethonium chloride was also identified in subsequent investigations of 2 different batches of M. stigmata. The presence of such nondeclared and illegal contaminants on the herbal raw material market has to be discussed intensively. From benzethonium-free raw material (batch S2), as well as from batch S1, fraction XI was further fractionated by MPLC and preparative HPLC, leading to a still complex subfraction XIG, which was analyzed by UHPLC/+ESI-QTOF-MS. Advanced data processing and a species-metabolite relationship database tentatively revealed the existence of the unusual C-glycosidic flavones derhamnosylmaysin (6), 3'-deoxyrhamnosylmaysin (4), 3'-O-methylderhamnosylmaysin (3), apiferol (2) and alternanthin (8), which might be related to the antiadhesive activity of this subfraction against UPEC. Copyright © 2015 Elsevier B.V. All rights reserved.

  5. Data quality objectives for TWRS privatization phase 1: confirm tank T is an appropriate feed source for low-activity waste feed batch X

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    NGUYEN, D.M.

    1999-06-01

    The U.S. Department of Energy, Richland Operations Office (DOE-RL) has initiated Phase 1 of a two-phase privatization strategy for treatment and immobilization of low-activity waste (LAW) currently being managed by the Hanford Tank Waste Remediation System (TWRS) Project. In this strategy, DOE will purchase services from a contractor-owned and operated facility under a fixed price. The Phase 1 TWRS privatization contract requires that the Project Hanford Management Contract (PHMC) contractors, on behalf of DOE, deliver LAW feed in specified quantities and composition to the Privatization Contractor in a timely manner (DOE-RL 1996). Additional requirements are imposed by the interface control document (ICD-19) for LAW feed (PHMC 1997). In response to these requirements, the Tank Waste Remediation System Operation and Utilization Plan (TWRSO and UP) (Kirkbride et al. 1997) was prepared by the PHMC. The TWRSO and UP, as updated by the Readiness-To-Proceed deliverable (Payne et al. 1998), establishes the baseline operating scenario for the delivery of LAW feed to the Privatization Contractor. The scenario specifies tanks from which LAW will be provided for each feed batch, the operational activities needed to prepare and deliver each batch, and the timing of these activities. The operating scenario was developed based on current knowledge of waste composition and chemistry, waste transfer methods, and operating constraints, such as tank farm logistics and availability of tank space. A project master baseline schedule (PMBS) has been developed to implement the operating scenario. The PMBS also includes activities aimed at reducing programmatic risks. One of the activities, "Confirm Plans and Requirements," was identified to verify the basis used to develop the scenario. Additional data on waste quantity, physical and chemical characteristics, and transfer properties will be needed to support this activity. This document describes the data quality objective (DQO) process undertaken to assure that appropriate data will be collected to support the activity, "Confirm Tank Plans and Requirements." The DQO process was implemented in accordance with the TWRS DQO process (Banning 1997), with some modifications to accommodate project- or tank-specific requirements and constraints.

  6. Enzymatic saccharification of pretreated wheat straw: comparison of solids-recycling, sequential hydrolysis and batch hydrolysis.

    PubMed

    Pihlajaniemi, Ville; Sipponen, Satu; Sipponen, Mika H; Pastinen, Ossi; Laakso, Simo

    2014-02-01

    In the enzymatic hydrolysis of lignocellulose materials, the recycling of the solid residue has previously been considered within the context of enzyme recycling. In this study, a steady-state investigation of a solids-recycling process was made with pretreated wheat straw and compared to sequential and batch hydrolysis at constant reaction times, substrate feed and liquid and enzyme consumption. Compared to batch hydrolysis, the recycling and sequential processes showed roughly equal hydrolysis yields, while the volumetric productivity was significantly increased. In the 72 h process the improvement was 90% due to an increased reaction consistency, while the solids feed was 16% of the total process constituents. The improvement resulted primarily from product removal, which was equally efficient in the solids-recycling and sequential hydrolysis processes. No evidence of accumulation of enzymes beyond the accumulation of the substrate was found in recycling. A mathematical model of solids-recycling was constructed, based on a geometrical series. Copyright © 2013 Elsevier Ltd. All rights reserved.
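
    The geometric-series form of such a recycling model can be sketched as follows: assuming a constant fraction r of the solid residue survives each round and is carried into the next (an assumption for illustration, not the fitted model), the steady-state solids level S relative to the fresh feed per round F is

        S = F \sum_{k=0}^{\infty} r^k = \frac{F}{1 - r}, \qquad 0 \le r < 1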

  7. Assessment Data-Driven Inquiry: A Review of How to Use Assessment Results to Inform Chemistry Teaching

    ERIC Educational Resources Information Center

    Harshman, Jordan; Yezierski, Ellen

    2017-01-01

    With abundant access to assessments of all kinds, many high school chemistry teachers have the opportunity to gather data from their students on a daily basis. This data can serve multiple purposes, such as informing teachers of students' content difficulties and guiding instruction in a process of data-driven inquiry. In this paper, 83 resources…

  8. Omega-3 production by fermentation of Yarrowia lipolytica: From fed-batch to continuous.

    PubMed

    Xie, Dongming; Miller, Edward; Sharpe, Pamela; Jackson, Ethel; Zhu, Quinn

    2017-04-01

    The omega-3 fatty acid, cis-5,8,11,14,17-eicosapentaenoic acid (C20:5; EPA), has wide-ranging benefits in improving heart health, immune function, and mental health. A sustainable source of EPA production through fermentation of metabolically engineered Yarrowia lipolytica has been developed. In this paper, key fed-batch fermentation conditions were identified to achieve 25% EPA in the yeast biomass, which is so far the highest EPA titer reported in the literature. Dynamic models of the EPA fermentation process were established for analyzing, optimizing, and scaling up the fermentation process. In addition, model simulations were used to develop a two-stage continuous process and to compare it with single-stage continuous and fed-batch processes. The two-stage continuous process, which is equipped with a smaller growth fermentor (Stage 1) and a larger production fermentor (Stage 2), was found to be a superior process to achieve high titer, rate, and yield of EPA. A two-stage continuous fermentation experiment with Y. lipolytica strain Z7334 was designed using the model simulation and then tested in a 2 L and 5 L fermentation system for 1,008 h. Compared with the standard 2 L fed-batch process, the two-stage continuous fermentation process improved the overall EPA productivity by 80% and the EPA concentration in the fermenter by 40%, while achieving comparable EPA titer in biomass and similar conversion yield from glucose. During the long-term experiment it was also found that the Y. lipolytica strain evolved to reduce byproduct and increase lipid production. This is one of the few continuous fermentation examples that demonstrated improved productivity and concentration of a final product with similar conversion yield compared with a fed-batch process. This paper suggests the two-stage continuous fermentation could be an effective process to achieve improved production of omega-3 and other fermentation products where non-growth or partially growth-associated kinetics characterize the process. Biotechnol. Bioeng. 2017;114: 798-812. © 2016 Wiley Periodicals, Inc.
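
    For the continuous stages discussed above, the standard chemostat relations apply; in conventional notation (a sketch, not the authors' dynamic model), the dilution rate of each stage is set by the feed flow F and stage volume V_i, and at steady state the growth stage runs at its dilution rate:

        D_i = \frac{F}{V_i}, \qquad \mu = D_1 \quad \text{(steady state, Stage 1)}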

  9. GPU Accelerated Clustering for Arbitrary Shapes in Geoscience Data

    NASA Astrophysics Data System (ADS)

    Pankratius, V.; Gowanlock, M.; Rude, C. M.; Li, J. D.

    2016-12-01

    Clustering algorithms have become a vital component in intelligent systems for geoscience, helping scientists discover and track phenomena of various kinds. Here, we outline advances in Density-Based Spatial Clustering of Applications with Noise (DBSCAN), which detects clusters of arbitrary shape that are common in geospatial data. In particular, we propose a hybrid CPU-GPU implementation of DBSCAN and highlight new optimization approaches on the GPU that allow cluster detection in parallel while optimizing data transport during CPU-GPU interactions. We employ an efficient batching scheme between the host and GPU so that limited GPU memory is not prohibitive when processing large and/or dense datasets. To minimize data transfer overhead, we estimate the total workload size and generate optimized batches that will not overflow the GPU buffer. This work is demonstrated on space weather Total Electron Content (TEC) datasets containing over 5 million measurements from instruments worldwide, and allows scientists to spot spatially coherent phenomena with ease. Our approach is up to 30 times faster than a sequential implementation and therefore accelerates discoveries in large datasets. We acknowledge support from NSF ACI-1442997.
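
    The abstract describes, but does not list, the batching scheme; a minimal sketch of the idea is below. The greedy grouping by estimated result size is an assumption about how such a scheme can be organised, not the authors' code.

```python
# Sketch: split points into batches whose estimated result sets (neighbour
# pairs) fit a fixed GPU result buffer, so no batch overflows device memory.

def make_batches(est_pairs_per_point, buffer_capacity):
    """Greedily group point indices so that the estimated result size of
    each batch stays within the GPU buffer capacity."""
    batches, current, load = [], [], 0
    for idx, est in enumerate(est_pairs_per_point):
        if current and load + est > buffer_capacity:
            batches.append(current)
            current, load = [], 0
        current.append(idx)
        load += est
    if current:
        batches.append(current)
    return batches

# 8 points with varying estimated neighbour counts, buffer of 10 pairs
print(make_batches([3, 1, 4, 1, 5, 9, 2, 6], 10))
# -> [[0, 1, 2, 3], [4], [5], [6, 7]]  (each batch fits the buffer)
```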

  10. Kinetic model-based feed-forward controlled fed-batch fermentation of Lactobacillus rhamnosus for the production of lactic acid from Arabic date juice.

    PubMed

    Choi, Minsung; Al-Zahrani, Saeed M; Lee, Sang Yup

    2014-06-01

    Arabic date is overproduced in Arabic countries such as Saudi Arabia and Iraq and is mostly composed of sugars (70-80 wt%). Here we developed a fed-batch fermentation process, using a kinetic model, for the efficient production of lactic acid to a high concentration from Arabic date juice. First, a kinetic model of Lactobacillus rhamnosus grown on date juice in batch fermentation was constructed in EXCEL so that parameter estimation and model simulation can be performed easily. Then, several fed-batch fermentations were conducted employing different feeding strategies, including pulsed feeding, exponential feeding, and modified exponential feeding. Based on the results of the fed-batch fermentations, a kinetic model for fed-batch fermentation was also developed. This new model was used to perform feed-forward controlled fed-batch fermentation, which resulted in the production of 171.79 g l(-1) of lactic acid with a productivity of 1.58 g l(-1) h(-1) and a yield of 0.87 g g(-1).
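
    The abstract names exponential feeding among the strategies tried; the textbook form of that feeding law is easy to state, and a sketch is given below. All parameter values are illustrative assumptions, not the paper's fitted kinetics.

```python
import math

def feed_rate(t, mu_set=0.2, X0=5.0, V0=1.0, Y_xs=0.5, S_f=500.0):
    """Feed rate F(t) [L/h] that holds the specific growth rate at mu_set:
    X0 [g/L] biomass and V0 [L] volume at the start of feeding,
    Y_xs [g/g] biomass yield on sugar, S_f [g/L] feed sugar concentration."""
    return (mu_set * X0 * V0) / (Y_xs * S_f) * math.exp(mu_set * t)

for t in (0.0, 5.0, 10.0):
    print(f"t = {t:4.1f} h  F = {feed_rate(t) * 1000:6.2f} mL/h")
```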

  11. Correcting for intra-experiment variation in Illumina BeadChip data is necessary to generate robust gene-expression profiles.

    PubMed

    Kitchen, Robert R; Sabine, Vicky S; Sims, Andrew H; Macaskill, E Jane; Renshaw, Lorna; Thomas, Jeremy S; van Hemert, Jano I; Dixon, J Michael; Bartlett, John M S

    2010-02-24

    Microarray technology is a popular means of producing whole genome transcriptional profiles; however, high cost and scarcity of mRNA have led many studies to be conducted based on the analysis of single samples. We exploit the design of the Illumina platform, specifically multiple arrays on each chip, to evaluate intra-experiment technical variation using repeated hybridisations of universal human reference RNA (UHRR) and duplicate hybridisations of primary breast tumour samples from a clinical study. A clear batch-specific bias was detected in the measured expressions of both the UHRR and clinical samples. This bias was found to persist following standard microarray normalisation techniques. However, when mean-centering or empirical Bayes batch-correction methods (ComBat) were applied to the data, inter-batch variation in the UHRR and clinical samples was greatly reduced. Correlation between replicate UHRR samples improved by two orders of magnitude following batch-correction using ComBat (from 0.9833-0.9991 to 0.9997-0.9999), and the consistency of the gene-lists from the duplicate clinical samples increased from 11.6% in quantile-normalised data to 66.4% in batch-corrected data. The use of UHRR as an inter-batch calibrator provided a small additional benefit when used in conjunction with ComBat, further increasing the agreement between the two gene-lists, up to 74.1%. In the interests of practicality and cost, these results suggest that single samples can generate reliable data, but only after careful compensation for technical bias in the experiment. We recommend that investigators appreciate the propensity for such variation in the design stages of a microarray experiment and that the use of suitable correction methods become routine during the statistical analysis of the data.
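
    Of the two corrections compared above, mean-centering is simple enough to sketch directly (ComBat additionally shrinks the per-batch estimates with empirical Bayes). The data frame below is an illustrative toy, not the study's data.

```python
import pandas as pd

expr = pd.DataFrame({            # samples x genes, after normalisation
    "geneA": [5.1, 5.3, 6.0, 6.2],
    "geneB": [2.0, 2.2, 2.9, 3.1],
}, index=["s1", "s2", "s3", "s4"])
batch = pd.Series(["b1", "b1", "b2", "b2"], index=expr.index)

# remove each batch's additive offset, then restore the global gene means
corrected = expr - expr.groupby(batch).transform("mean") + expr.mean()
print(corrected)
```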

  12. Correcting for intra-experiment variation in Illumina BeadChip data is necessary to generate robust gene-expression profiles

    PubMed Central

    2010-01-01

    Background Microarray technology is a popular means of producing whole genome transcriptional profiles; however, high cost and scarcity of mRNA have led many studies to be conducted based on the analysis of single samples. We exploit the design of the Illumina platform, specifically multiple arrays on each chip, to evaluate intra-experiment technical variation using repeated hybridisations of universal human reference RNA (UHRR) and duplicate hybridisations of primary breast tumour samples from a clinical study. Results A clear batch-specific bias was detected in the measured expressions of both the UHRR and clinical samples. This bias was found to persist following standard microarray normalisation techniques. However, when mean-centering or empirical Bayes batch-correction methods (ComBat) were applied to the data, inter-batch variation in the UHRR and clinical samples was greatly reduced. Correlation between replicate UHRR samples improved by two orders of magnitude following batch-correction using ComBat (from 0.9833-0.9991 to 0.9997-0.9999), and the consistency of the gene-lists from the duplicate clinical samples increased from 11.6% in quantile-normalised data to 66.4% in batch-corrected data. The use of UHRR as an inter-batch calibrator provided a small additional benefit when used in conjunction with ComBat, further increasing the agreement between the two gene-lists, up to 74.1%. Conclusion In the interests of practicality and cost, these results suggest that single samples can generate reliable data, but only after careful compensation for technical bias in the experiment. We recommend that investigators appreciate the propensity for such variation in the design stages of a microarray experiment and that the use of suitable correction methods become routine during the statistical analysis of the data. PMID:20181233

  13. Operation and model description of a sequencing batch reactor treating reject water for biological nitrogen removal via nitrite.

    PubMed

    Dosta, J; Galí, A; Benabdallah El-Hadj, T; Macé, S; Mata-Alvarez, J

    2007-08-01

    The aim of this study was the operation and model description of a sequencing batch reactor (SBR) for biological nitrogen removal (BNR) from reject water (800-900 mg NH(4)(+)-N L(-1)) from a municipal wastewater treatment plant (WWTP). The SBR was operated with three cycles per day, at a temperature of 30 degrees C, an SRT of 11 days and an HRT of 1 day. During the operational cycle, three alternating oxic/anoxic periods were performed to avoid alkalinity restrictions. Oxygen supply and the working pH range were controlled to achieve BNR via nitrite, which makes the process more economical. Under steady state conditions, a total nitrogen removal rate of 0.87 kg N (m(3) day)(-1) was reached. A four-step nitrogen removal model was developed to describe the process. This model extends the IWA activated sludge models to give a more detailed description of the nitrogen elimination processes and their inhibitions. A closed intermittent-flow respirometer was set up for the estimation of the most relevant model parameters. Once calibrated, the model predictions reproduced the experimental data accurately.

  14. Design and scaleup of downstream processing of monoclonal antibodies for cancer therapy: from research to clinical proof of principle.

    PubMed

    Horenstein, Alberto L; Crivellin, Federico; Funaro, Ada; Said, Marcela; Malavasi, Fabio

    2003-04-01

    Murine monoclonal antibodies (mAb) from cell culture supernatants have been purified to clinical grade for in vivo cancer treatment. The starting material was purified by high performance liquid chromatography (HPLC) systems, ranging from an analytical scale process to a scale-up of 1 g per batch. Three columns (Protein A affinity chromatography with single-step elution, hydroxyapatite (HA) chromatography followed by linear gradient elution, and endotoxin-removing gel chromatography), exploiting different properties of the mAb, were applied. The final batches of antibody were subjected to a large panel of tests to evaluate the efficacy of the downstream processing. The resulting data have allowed us to determine the maximum number of times each column can be used and to characterize antibody integrity, specificity, and potency precisely and thoroughly according to in-house reference standards. The optimized bioprocessing is rapid, efficient, and reproducible. Not less importantly, all the techniques applied are characterized by costs affordable to medium-sized laboratories. They represent the basis for implementing immunotherapeutic protocols transferable to clinical medicine.

  15. Feasibility of nitrification/denitrification in a sequencing batch biofilm reactor with liquid circulation applied to post-treatment.

    PubMed

    Andrade do Canto, Catarina Simone; Rodrigues, José Alberto Domingues; Ratusznei, Suzana Maria; Zaiat, Marcelo; Foresti, Eugênio

    2008-02-01

    An investigation was performed on the biological removal of ammonium nitrogen from synthetic wastewater by the simultaneous nitrification/denitrification (SND) process, using a sequencing batch biofilm reactor (SBBR). System behavior was analyzed as to the effects of the sludge type used as inoculum (autotrophic/heterotrophic), the wastewater feed strategy (batch/fed-batch) and the aeration strategy (continuous/intermittent). The presence of an autotrophic aerobic sludge proved essential for nitrification startup, despite publications stating the existence of heterotrophic organisms capable of nitrifying organic and inorganic nitrogen compounds at low dissolved oxygen concentrations. As to feed strategy, batch operation (synthetic wastewater containing 100 mg COD/L and 50 mg N-NH(4)(+)/L) followed by fed-batch operation (synthetic wastewater with 100 mg COD/L) during a whole cycle seemed to be the most adequate, mainly during the denitrification phase. Regarding aeration strategy, an intermittent mode, with a dissolved oxygen concentration of 2.0 mg/L in the aeration phase, showed the best results. Under these optimal conditions, 97% of influent ammonium nitrogen (80% of total nitrogen) was removed at a rate of 86.5 mg N-NH(4)(+)/(L d). In the treated effluent only 0.2 mg N-NO(2)(-)/L, 4.6 mg N-NO(3)(-)/L and 1.0 mg N-NH(4)(+)/L remained, demonstrating the potential viability of this process in the post-treatment of wastewaters containing ammonium nitrogen.

  16. The application of data mining and cloud computing techniques in data-driven models for structural health monitoring

    NASA Astrophysics Data System (ADS)

    Khazaeli, S.; Ravandi, A. G.; Banerji, S.; Bagchi, A.

    2016-04-01

    Recently, data-driven models for Structural Health Monitoring (SHM) have been of great interest among many researchers. In data-driven models, the sensed data are processed to determine the structural performance and evaluate the damage of an instrumented structure without requiring a mathematical model of the structure. A framework of data-driven models for online assessment of the condition of a structure has been developed here. The framework is intended for automated evaluation of the monitoring data and structural performance using Internet technologies and resources. The main challenges in developing such a framework include: (a) utilizing the sensor measurements to estimate and localize the induced damage in a structure by means of signal processing and data mining techniques, and (b) optimizing the computing and storage resources with the aid of cloud services. The main focus in this paper is to demonstrate the efficiency of the proposed framework for real-time damage detection of a multi-story shear-building structure in two damage scenarios (change in mass and stiffness) in various locations. Several features are extracted from the sensed data by signal processing techniques and statistical methods. Machine learning algorithms are deployed to select damage-sensitive features as well as to classify the data to trace anomalies in the response of the structure. Here, the cloud computing resources from Amazon Web Services (AWS) have been used to implement the proposed framework.
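
    The feature-extraction-plus-classification step described above can be sketched compactly; the particular features and classifier below are illustrative assumptions (the abstract does not fix them), and the data are synthetic.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)

def features(window):
    # simple damage-sensitive statistics of one acceleration window
    return [window.std(), np.abs(window).max(), np.square(window).mean()]

healthy = [features(rng.normal(0, 1.0, 256)) for _ in range(100)]
damaged = [features(rng.normal(0, 1.4, 256)) for _ in range(100)]

X = np.array(healthy + damaged)
y = np.array([0] * 100 + [1] * 100)   # 0 = healthy, 1 = damaged

clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, y)
print("training accuracy:", clf.score(X, y))
```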

  17. Hybrid intelligent control of substrate feeding for industrial fed-batch chlortetracycline fermentation process.

    PubMed

    Jin, Huaiping; Chen, Xiangguang; Yang, Jianwen; Wu, Lei; Wang, Li

    2014-11-01

    The lack of accurate process models and reliable online sensors for substrate measurements poses significant challenges for controlling substrate feeding accurately, automatically and optimally in fed-batch fermentation industries. It is still a common practice to regulate the feeding rate based upon manual operations. To address this issue, a hybrid intelligent control method is proposed to enable automatic substrate feeding. The resulting control system consists of three modules: a presetting module for providing initial set-points; a predictive module for estimating substrate concentration online based on a new time interval-varying soft sensing algorithm; and a feedback compensator using expert rules. The effectiveness of the proposed approach is demonstrated through its successful applications to the industrial fed-batch chlortetracycline fermentation process. Copyright © 2014 ISA. Published by Elsevier Ltd. All rights reserved.

  18. Online intelligent controllers for an enzyme recovery plant: design methodology and performance.

    PubMed

    Leite, M S; Fujiki, T L; Silva, F V; Fileti, A M F

    2010-12-27

    This paper focuses on the development of intelligent controllers for use in a process of enzyme recovery from pineapple rind. The proteolytic enzyme bromelain (EC 3.4.22.4) is precipitated with alcohol at low temperature in a fed-batch jacketed tank. Temperature control is crucial to avoid irreversible protein denaturation. Fuzzy or neural controllers offer a way of implementing solutions that cover dynamic and nonlinear processes. The design methodology and a comparative study on the performance of fuzzy-PI, neurofuzzy, and neural network intelligent controllers are presented. To tune the fuzzy PI Mamdani controller, various universes of discourse, rule bases, and membership function support sets were tested. A neurofuzzy inference system (ANFIS), based on Takagi-Sugeno rules, and a model predictive controller, based on neural modeling, were developed and tested as well. Using a Fieldbus network architecture, a coolant variable speed pump was driven by the controllers. The experimental results show the effectiveness of fuzzy controllers in comparison to the neural predictive control. The fuzzy PI controller exhibited a reduced error parameter (ITAE), lower power consumption, and better recovery of enzyme activity.
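
    The ITAE criterion used above to rank the controllers penalises errors that persist late in the run; a minimal sketch of its computation from sampled data follows (the settling curve and numbers are illustrative assumptions):

```python
import numpy as np

def itae(t, setpoint, measured):
    """Integral of Time-weighted Absolute Error over a sampled run."""
    e = np.abs(np.asarray(setpoint) - np.asarray(measured))
    return np.trapz(t * e, t)

t = np.linspace(0, 60, 601)               # a 60 min run
y = 8.0 + 2.0 * np.exp(-t / 5.0)          # temperature settling to 8 degC
print(f"ITAE = {itae(t, 8.0, y):.1f}")    # lower is better
```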

  19. Online Intelligent Controllers for an Enzyme Recovery Plant: Design Methodology and Performance

    PubMed Central

    Leite, M. S.; Fujiki, T. L.; Silva, F. V.; Fileti, A. M. F.

    2010-01-01

    This paper focuses on the development of intelligent controllers for use in a process of enzyme recovery from pineapple rind. The proteolytic enzyme bromelain (EC 3.4.22.4) is precipitated with alcohol at low temperature in a fed-batch jacketed tank. Temperature control is crucial to avoid irreversible protein denaturation. Fuzzy or neural controllers offer a way of implementing solutions that cover dynamic and nonlinear processes. The design methodology and a comparative study on the performance of fuzzy-PI, neurofuzzy, and neural network intelligent controllers are presented. To tune the fuzzy PI Mamdani controller, various universes of discourse, rule bases, and membership function support sets were tested. A neurofuzzy inference system (ANFIS), based on Takagi-Sugeno rules, and a model predictive controller, based on neural modeling, were developed and tested as well. Using a Fieldbus network architecture, a coolant variable speed pump was driven by the controllers. The experimental results show the effectiveness of fuzzy controllers in comparison to the neural predictive control. The fuzzy PI controller exhibited a reduced error parameter (ITAE), lower power consumption, and better recovery of enzyme activity. PMID:21234106

  20. A natural macroalgae consortium for biosorption of copper from aqueous solution: Optimization, modeling and design studies.

    PubMed

    Deniz, Fatih; Ersanli, Elif Tezel

    2018-03-21

    In this study, the capacity of a natural macroalgae consortium consisting of Chaetomorpha sp., Polysiphonia sp., Ulva sp. and Cystoseira sp. species for the removal of copper ions from an aqueous environment was investigated at different operating conditions, such as solution pH, copper ion concentration and contact time. These environmental parameters affecting the biosorption process were optimized on the basis of batch experiments. The experimentally obtained data for the biosorption of copper ions onto the macroalgae-based biosorbent were modeled using the isotherm models of Freundlich, Langmuir, Sips and Dubinin-Radushkevich and the kinetic models of pseudo-first-order, pseudo-second-order, Elovich and Weber-Morris. The pseudo-first-order and Sips equations were the most suitable models to describe the copper biosorption from aqueous solution. The thermodynamic data revealed the feasibility, spontaneity and physical nature of the biosorption process. Based on the Sips isotherm model, the biosorption capacity of the biosorbent for copper ions was calculated as 105.370 mg g(-1) under the optimum operating conditions. A single-stage batch biosorption system was developed to predict the real-scale copper removal performance of the biosorbent. The results of this investigation showed the potential utility of the macroalgae consortium for the biosorption of copper ions from aqueous medium.
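
    Fitting the isotherms named above to batch equilibrium data is a standard least-squares exercise; a sketch with the Langmuir and Sips forms follows. The data points are synthetic illustrations, not the study's measurements.

```python
import numpy as np
from scipy.optimize import curve_fit

def langmuir(Ce, qmax, KL):
    return qmax * KL * Ce / (1 + KL * Ce)

def sips(Ce, qmax, Ks, n):
    return qmax * (Ks * Ce) ** n / (1 + (Ks * Ce) ** n)

Ce = np.array([2.0, 5.0, 10.0, 25.0, 50.0, 100.0])    # mg/L at equilibrium
qe = np.array([20.0, 41.0, 62.0, 85.0, 97.0, 103.0])  # mg/g adsorbed

pL, _ = curve_fit(langmuir, Ce, qe, p0=[100, 0.05])
pS, _ = curve_fit(sips, Ce, qe, p0=[100, 0.05, 1.0])
print("Langmuir qmax, KL:", pL)
print("Sips qmax, Ks, n :", pS)
```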

  1. Covalently bonded ionic liquid onto cellulose for fast adsorption and efficient separation of Cr(VI): Batch, column and mechanism investigation.

    PubMed

    Dong, Zhen; Zhao, Long

    2018-06-01

    Combining the advantages of both cellulose and ionic liquids, an ionic liquid functionalized cellulose (ILFC) adsorbent was prepared by radiation grafting glycidyl methacrylate onto cellulose microspheres, followed by reaction with the ionic liquid 1-aminopropyl-3-methyl imidazolium nitrate. Its adsorption properties towards Cr(VI) were investigated in batch and column experiments. In the batch experiments, the adsorption kinetics was well fitted by a pseudo-second-order model with an equilibrium time of 2 h, and the adsorption capacity calculated from the Langmuir model reached 181.8 mg/g at pH 2. In the fixed column, both the Yoon-Nelson and Thomas models gave satisfactory fits to the experimental data and breakthrough curves, and the equilibrium adsorption capacity calculated by the Thomas model was 161.0 mg/g. Moreover, ILFC exhibited high selectivity towards Cr(VI) even in synthetic chrome-plating wastewater. Adsorption/desorption tests revealed that ILFC can be regenerated and reused several times without an obvious decrease in the adsorbed amount. The adsorption process was shown by XPS analysis to follow an anion exchange-reduction mechanism. Copyright © 2018 Elsevier Ltd. All rights reserved.
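
    The Thomas model used above for the breakthrough curve has a closed form that is easy to evaluate; a sketch follows, with the 161.0 mg/g capacity from the abstract and the other parameters as illustrative assumptions.

```python
import numpy as np

def thomas(V, k_th, q0, m, C0, Q):
    """Effluent/influent ratio Ct/C0 after V litres have passed the column:
    k_th [L/(mg h)] rate constant, q0 [mg/g] capacity, m [g] adsorbent mass,
    C0 [mg/L] inlet concentration, Q [L/h] flow rate."""
    return 1.0 / (1.0 + np.exp(k_th / Q * (q0 * m - C0 * V)))

V = np.linspace(0, 5, 6)   # throughput volume, L
print(thomas(V, k_th=0.01, q0=161.0, m=1.0, C0=100.0, Q=0.3))
```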

  2. A Comprehensive Strategy to Construct In-house Database for Accurate and Batch Identification of Small Molecular Metabolites.

    PubMed

    Zhao, Xinjie; Zeng, Zhongda; Chen, Aiming; Lu, Xin; Zhao, Chunxia; Hu, Chunxiu; Zhou, Lina; Liu, Xinyu; Wang, Xiaolin; Hou, Xiaoli; Ye, Yaorui; Xu, Guowang

    2018-05-29

    Identification of metabolites is an essential step in metabolomics studies to interpret the regulatory mechanisms of pathological and physiological processes. However, it remains a major challenge in LC-MS(n)-based studies because of the complexity of the mass spectra, the chemical diversity of metabolites, and the deficiency of standards databases. In this work, a comprehensive strategy is developed for accurate and batch metabolite identification in non-targeted metabolomics studies. First, a well-defined procedure was applied to generate reliable and standard LC-MS2 data, including tR, MS1 and MS2 information, under a standard operating procedure (SOP). An in-house database including about 2000 metabolites was constructed and used to identify the metabolites in non-targeted metabolic profiling by retention time calibration using internal standards, precursor ion alignment and ion fusion, auto-MS2 information extraction and selection, and database batch searching and scoring. As an application example, a pooled serum sample was analyzed to demonstrate the strategy; 202 metabolites were identified in the positive ion mode. The results show that the strategy is useful for LC-MS(n)-based non-targeted metabolomics studies.
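
    Of the steps listed above, the retention-time calibration against internal standards reduces to a linear fit plus tolerance matching; a sketch is below. The standards, tolerances and feature values are illustrative assumptions.

```python
import numpy as np

# internal standards: (observed tR, library tR) in minutes
obs = np.array([1.10, 4.95, 9.20, 14.30])
lib = np.array([1.00, 4.80, 9.00, 14.00])
slope, intercept = np.polyfit(obs, lib, 1)   # linear tR calibration

def match(feature_tr, feature_mz, entry_tr, entry_mz,
          tr_tol=0.3, mz_tol=0.01):
    """Does a measured feature match a database entry within tolerances?"""
    tr_cal = slope * feature_tr + intercept
    return abs(tr_cal - entry_tr) < tr_tol and abs(feature_mz - entry_mz) < mz_tol

print(match(5.02, 180.0634, entry_tr=4.85, entry_mz=180.0633))  # True
```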

  3. High stability of yellow fever 17D-204 vaccine: a 12-year retrospective analysis of large-scale production.

    PubMed

    Barban, V; Girerd, Y; Aguirre, M; Gulia, S; Pétiard, F; Riou, P; Barrere, B; Lang, J

    2007-04-12

    We have retrospectively analyzed 12 bulk lots of the yellow fever vaccine Stamaril, produced between 1990 and 2002 and prepared from the same seed lot, which has been in continuous use since 1990. All vaccine batches displayed identical genome sequences. Only four nucleotide substitutions were observed compared to the previously published sequence, with no effect at the amino-acid level. Fine analysis of the viral plaque size distribution was used as an additional marker of genetic stability and demonstrated a remarkable homogeneity of the viral population. The total virus load, measured by qRT-PCR, was also homogeneous, pointing to the reproducibility of the vaccine production process. Mice inoculated intracerebrally with the different bulks exhibited a similar average survival time, and the ratio between in vitro potency and mouse LD(50) titers remained constant from batch to batch. Taken together, these data demonstrate the genetic stability of the strain at mass production level over a period of 12 years and reinforce the generally accepted view of the safety of YF17D-based vaccines.

  4. Dynamic model of temperature impact on cell viability and major product formation during fed-batch and continuous ethanolic fermentation in Saccharomyces cerevisiae.

    PubMed

    Amillastre, Emilie; Aceves-Lara, César-Arturo; Uribelarrea, Jean-Louis; Alfenore, Sandrine; Guillouet, Stéphane E

    2012-08-01

    The impact of temperature on an industrial yeast strain was investigated in a very-high-ethanol-performance fed-batch fermentation process over the range 30-47 °C. As previously observed with a lab strain, decoupling between growth and glycerol formation occurred at temperatures of 36 °C and higher. A dynamic model was proposed to describe the impact of temperature on total and viable biomass and on ethanol and glycerol production. Model validation was carried out with experimental data sets from independent cultures under different temperatures, temperature variation profiles and cultivation modes. The proposed model accurately fitted the dynamic evolution of product and biomass concentrations over a wide range of temperature profiles. R2 values were above 0.96 for ethanol and glycerol in most experiments. The best results were obtained at 37 °C in fed-batch and chemostat cultures. This dynamic model could be further used for optimizing and monitoring ethanol fermentation at larger scale. Copyright © 2012 Elsevier Ltd. All rights reserved.

  5. miRNA profiling of high, low and non-producing CHO cells during biphasic fed-batch cultivation reveals process relevant targets for host cell engineering.

    PubMed

    Stiefel, Fabian; Fischer, Simon; Sczyrba, Alexander; Otte, Kerstin; Hesse, Friedemann

    2016-05-10

    Fed-batch cultivation of recombinant Chinese hamster ovary (CHO) cell lines is one of the most widely used production modes for commercial manufacturing of recombinant protein therapeutics. Furthermore, fed-batch cultivations are often conducted as biphasic processes where the culture temperature is decreased to maximize volumetric product yields. However, it remains to be elucidated which intracellular regulatory elements actually control the observed pro-productive phenotypes. Recently, several studies have revealed microRNAs (miRNAs) to be important molecular switches of cell phenotypes. In this study, we analyzed miRNA profiles of two different recombinant CHO cell lines (high and low producer), and compared them to a non-producing CHO DG44 host cell line during fed-batch cultivation at 37°C versus a temperature shift to 30°C. Taking advantage of next-generation sequencing combined with cluster, correlation and differential expression analyses, we could identify 89 different miRNAs, which were differentially expressed in the different cell lines and cultivation phases. Functional validation experiments using 19 validated target miRNAs confirmed that these miRNAs indeed induced changes in process relevant phenotypes. Furthermore, computational miRNA target prediction combined with functional clustering identified putative target genes and cellular pathways, which might be regulated by these miRNAs. This study systematically identified novel target miRNAs during different phases and conditions of a biphasic fed-batch production process and functionally evaluated their potential for host cell engineering. Copyright © 2016. Published by Elsevier B.V.

  6. NIGHTHAWK - A Program for Modeling Saturated Batch and Column Experiments Incorporating Equilibrium and Kinetic Biogeochemistry

    EPA Science Inventory

    NIGHTHAWK simulates the fate and transport of biogeochemically reactive contaminants in the saturated subsurface. Version 1.2 supports batch and one-dimensional advective-dispersive-reactive transport involving a number of biogeochemical processes, including: microbially-mediate...

  7. Efficacy of rabies immunoglobulins in an experimental post-exposure prophylaxis rodent model.

    PubMed

    Servat, Alexandre; Lutsch, Charles; Delore, Valentine; Lang, Jean; Veitch, Keith; Cliquet, Florence

    2003-12-12

    In a recently published Syrian hamster animal challenge study [Vaccine 19 (2001) 2273], a highly purified, heat-treated equine rabies immunoglobulin (pERIG HT, Favirab) did not elicit satisfactory protection. The efficacies of this batch, a second-stage pERIG HT batch and reference RIG preparations (Imorab, Imogam Rage pasteurised, Berna antiserum) were compared in mice challenged with either the Ariana canine field strain or the CVS strain. Survival rates against Ariana challenge with the second-stage pERIG HT batch were indistinguishable from those of other licensed preparations (83-90% survival), but the deficient batch did not provide satisfactory protection (53%). These data confirm the inadequate response to the first-stage pERIG HT batch, while a current batch provides protection equivalent to that afforded by licensed HRIG and ERIG preparations.

  8. A data-driven multiplicative fault diagnosis approach for automation processes.

    PubMed

    Hao, Haiyang; Zhang, Kai; Ding, Steven X; Chen, Zhiwen; Lei, Yaguo

    2014-09-01

    This paper presents a new data-driven method for diagnosing multiplicative key performance degradation in automation processes. Different from the well-established additive fault diagnosis approaches, the proposed method aims at identifying those low-level components which increase the variability of process variables and cause performance degradation. Based on process data, features of multiplicative fault are extracted. To identify the root cause, the impact of fault on each process variable is evaluated in the sense of contribution to performance degradation. Then, a numerical example is used to illustrate the functionalities of the method and Monte-Carlo simulation is performed to demonstrate the effectiveness from the statistical viewpoint. Finally, to show the practical applicability, a case study on the Tennessee Eastman process is presented. Copyright © 2013. Published by Elsevier Ltd.

  9. A system identification approach for developing model predictive controllers of antibody quality attributes in cell culture processes

    PubMed Central

    Schmitt, John; Beller, Justin; Russell, Brian; Quach, Anthony; Hermann, Elizabeth; Lyon, David; Breit, Jeffrey

    2017-01-01

    As the biopharmaceutical industry evolves to include more diverse protein formats and processes, more robust control of Critical Quality Attributes (CQAs) is needed to maintain processing flexibility without compromising quality. Active control of CQAs has been demonstrated using model predictive control techniques, which allow the development of processes that are robust against disturbances associated with raw material variability and other potentially flexible operating conditions. Wide adoption of model predictive control in biopharmaceutical cell culture processes has been hampered, however, in part by the large amount of data and expertise required to build a predictive model of controlled CQAs, a prerequisite for model predictive control. Here we developed a highly automated perfusion apparatus to systematically and efficiently generate predictive models through the application of system identification approaches. We successfully created a predictive model of %galactosylation using data obtained by manipulating the galactose concentration in the perfusion apparatus in serialized step-change experiments. We then demonstrated the use of the model in a model predictive controller in a simulated control scenario to successfully achieve a %galactosylation set point in a simulated fed-batch culture. The automated model identification approach demonstrated here can potentially be generalized to many CQAs, and could be a more efficient, faster, and highly automated alternative to batch experiments for developing predictive models in cell culture processes, allowing wider adoption of model predictive control in biopharmaceutical processes. © 2017 The Authors Biotechnology Progress published by Wiley Periodicals, Inc. on behalf of American Institute of Chemical Engineers Biotechnol. Prog., 33:1647-1661, 2017 PMID:28786215
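
    The system identification step can be illustrated with the simplest possible case: fitting a first-order ARX model to step-change data by least squares. The plant, noise level and model order below are assumptions for illustration, not the authors' model.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 200
u = np.zeros(n); u[20:] = 1.0     # a serialized step change in the input
y = np.zeros(n)
for k in range(1, n):             # assumed "true" plant generating the data
    y[k] = 0.9 * y[k-1] + 0.5 * u[k-1] + rng.normal(0, 0.02)

# least-squares ARX fit y[k] = a*y[k-1] + b*u[k-1]
Phi = np.column_stack([y[:-1], u[:-1]])
a, b = np.linalg.lstsq(Phi, y[1:], rcond=None)[0]
print(f"a = {a:.3f}, b = {b:.3f}  (true values 0.9, 0.5)")
```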

  10. Rational development of solid dispersions via hot-melt extrusion using screening, material characterization, and numeric simulation tools.

    PubMed

    Zecevic, Damir E; Wagner, Karl G

    2013-07-01

    Effective and predictive small-scale selection tools are indispensable during the development of a solubility-enhanced drug product. For hot-melt extrusion, this selection process can start with a microscale performance evaluation on a hot-stage microscope (HSM). A batch size of 400 mg can provide sufficient material to assess drug product attributes such as solid-state properties, solubility enhancement, and physical stability, as well as process-related attributes such as the processing temperature in a twin-screw extruder (TSE). Prototype formulations are then fed into a 5 mm TSE (~1-2 g) to confirm the performance observed on the HSM under additional shear stress. Small-scale stress stability testing may be performed with these samples or with a larger batch (20-40 g) made on a 9 or 12 mm TSE. Simultaneously, numeric process simulations are performed using process data as well as the rheological and thermal properties of the formulations. Further scale-up work to 16 and 18 mm TSEs confirmed and refined the simulation model. Thus, at the end of laboratory-scale development, not only can the clinical trial supply be manufactured, but one can also form a sound risk assessment to support further scale-up even without decades of process experience. Copyright © 2013 Wiley Periodicals, Inc.

  11. Heater Validation for the NEXT-C Hollow Cathodes

    NASA Technical Reports Server (NTRS)

    Verhey, Timothy R.; Soulas, George C.; Mackey, Jonathan A.

    2018-01-01

    Swaged cathode heaters whose design was successfully demonstrated under a prior flight project are to be provided by the NASA Glenn Research Center for the NEXT-C ion thruster being fabricated by Aerojet Rocketdyne. Extensive requalification activities were performed to validate process controls that had to be re-established or revised because systemic changes prevented reuse of the past approaches. A development batch of heaters was successfully fabricated based on the new process controls. Acceptance and cyclic life testing of multiple discharge- and neutralizer-sized heaters extracted from the development batch was initiated in August 2016, with the last heater completing testing in April 2017. Cyclic life testing results substantially exceeded the NEXT-C thruster requirement as well as all past experience for GRC-fabricated units. The heaters demonstrated an ultimate cyclic life capability of 19,050 to 33,500 cycles. A qualification batch of heaters is now being fabricated using the finalized process controls. A set of six heaters will undergo acceptance and cyclic testing to verify conformance with the behavior observed for the development heaters. The heaters for flight use will then be provided to the contractor from the remainder of the qualification batch. This paper summarizes the fabrication process control activities and the acceptance and life testing of the development heater units.

  12. Cation exchange concentration of the americium product from TRUEX

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Barney, G.S.; Cooper, T.D.; Fisher, F.D.

    1991-06-01

    A transuranic extraction (TRUEX) process has been developed to separate and recover plutonium, americium, and other transuranic (TRU) elements from acid wastes. The main objective of the process is to reduce the effluent to below the TRU limit for actinide concentrations (<100 nCi/g of material) so it can be disposed of inexpensively. The process yields a dilute nitric acid stream containing low concentrations of the extracted americium product. This solution also contains residual plutonium and trace amounts of iron. The americium will be absorbed onto a cation exchange resin bed to concentrate it for disposal or for future use. The overall objective of these laboratory tests was to determine the performance of the cation exchange process under the expected conditions of the TRUEX process. The effects of acid, iron, and americium concentrations on americium absorption on the resin were determined. Distribution coefficients for americium absorption from acid solutions on the resin were measured using batch equilibrations. Batch equilibrations were also used to measure americium absorption in the presence of complexants. These data will be used to identify complexants and solution conditions that can be used to elute the americium from the columns. The rate of absorption was measured by passing solutions containing americium through small columns of resin, varying the flow rates, and measuring the concentrations of americium in the effluent. The rate data will be used to estimate the minimum bed size of the columns required to concentrate the americium product. 11 refs., 10 figs., 2 tabs.
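
    The batch-equilibration distribution coefficient mentioned above is conventionally computed as Kd = (C0 - Ce)/Ce x V/m; a one-function sketch with illustrative numbers:

```python
def distribution_coefficient(C0, Ce, V_mL, m_g):
    """Kd in mL/g from initial/equilibrium concentrations and phase ratio."""
    return (C0 - Ce) / Ce * (V_mL / m_g)

# e.g. americium activity dropping from 1000 to 50 (arbitrary units) when
# 10 mL of solution contacts 0.1 g of resin
print(distribution_coefficient(1000.0, 50.0, V_mL=10.0, m_g=0.1))  # 1900.0 mL/g
```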

  13. Stochastic, compartmental, and dynamic modeling of cross-contamination during mechanical smearing of cheeses.

    PubMed

    Aziza, Fanny; Mettler, Eric; Daudin, Jean-Jacques; Sanaa, Moez

    2006-06-01

    Cheese smearing is a complex process and the potential for cross-contamination with pathogenic or undesirable microorganisms is critical. During ripening, cheeses are salted and washed with brine to develop flavor and remove molds that could develop on the surfaces. Considering the potential for cross-contamination of this process in quantitative risk assessments could contribute to a better understanding of this phenomenon and, eventually, improve its control. The purpose of this article is to model the cross-contamination of smear-ripened cheeses due to the smearing operation under industrial conditions. A compartmental, dynamic, and stochastic model is proposed for mechanical brush smearing. This model has been developed to describe the exchange of microorganisms between compartments. Based on the analytical solution of the model equations and on experimental data collected with an industrial smearing machine, we assessed the values of the transfer parameters of the model. Monte Carlo simulations, using the distributions of transfer parameters, provide the final number of contaminated products in a batch and their final level of contamination for a given scenario taking into account the initial number of contaminated cheeses of the batch and their contaminant load. Based on analytical results, the model provides indicators for smearing efficiency and propensity of the process for cross-contamination. Unlike traditional approaches in mechanistic models, our approach captures the variability and uncertainty inherent in the process and the experimental data. More generally, this model could represent a generic base to use in modeling similar processes prone to cross-contamination.

  14. Real-time monitoring of a coffee roasting process with near infrared spectroscopy using multivariate statistical analysis: A feasibility study.

    PubMed

    Catelani, Tiago A; Santos, João Rodrigo; Páscoa, Ricardo N M J; Pezza, Leonardo; Pezza, Helena R; Lopes, João A

    2018-03-01

    This work proposes the use of near infrared (NIR) spectroscopy in diffuse reflectance mode and multivariate statistical process control (MSPC) based on principal component analysis (PCA) for real-time monitoring of the coffee roasting process. The main objective was the development of an MSPC methodology able to detect disturbances to the roasting process early, resorting to real-time acquisition of NIR spectra. A total of fifteen roasting batches were defined according to an experimental design to develop the MSPC models. This methodology was tested on a set of five batches where disturbances of different natures were imposed to simulate real faulty situations. Some of these batches were used to optimize the model while the remaining ones were used to test the methodology. A modelling strategy based on a sliding time window provided the best results in terms of distinguishing batches with and without disturbances, resorting to typical MSPC charts: Hotelling's T2 and squared prediction error statistics. A PCA model encompassing a time window of four minutes with three principal components was able to efficiently detect all disturbances assayed. NIR spectroscopy combined with the MSPC approach proved to be an adequate auxiliary tool for coffee roasters to detect faults in a conventional roasting process in real time. Copyright © 2017 Elsevier B.V. All rights reserved.
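
    The two MSPC statistics named above follow directly from a PCA model of in-control batches; a sketch is below. The data are synthetic and the control limits are omitted; only the choice of three components follows the abstract.

```python
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)
X_train = rng.normal(0, 1, (100, 20))   # in-control spectra (already centred)
pca = PCA(n_components=3).fit(X_train)

def t2_spe(x):
    t = pca.transform(x.reshape(1, -1))[0]        # scores of the new spectrum
    t2 = np.sum(t**2 / pca.explained_variance_)   # Hotelling's T2
    residual = x - pca.inverse_transform(t.reshape(1, -1))[0]
    return t2, np.sum(residual**2)                # (T2, SPE/Q)

print("normal   :", t2_spe(rng.normal(0, 1, 20)))
print("disturbed:", t2_spe(rng.normal(0, 1, 20) + 3.0))  # offset fault
```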

  15. Impact of utilisation of uncompleted handouts on power point presentations (PPT) in rural Indian medical institute.

    PubMed

    Bhaisare, Roshan; Kamble, Bhavna

    2016-07-01

    Note taking while attending a PPT places heavy demands on memory and writing, which ultimately leads to what is called "death by PowerPoint", referring to boredom and fatigue. To overcome this, we planned to evaluate the impact of uncompleted handouts given prior to PPT presentations. Final-year MBBS students were divided into two batches, batch A and batch B. For a set of lectures, one batch was provided with handouts before the lecture while the other batch was given the lectures only. Crossover was done to avoid bias, with all lectures given by the same presenter. At the end of each lecture, a short questionnaire of 10 multiple choice questions (MCQ) was given to the students. Mean scores were calculated for lectures with and without handouts. For one set of lectures, when batch A was provided with handouts, the mean score was 28.2; for batch B, to which no handouts were given, the mean score was 23.4. Similarly, when batch B was provided with handouts its mean score was 29.1, while batch A, which was not provided with handouts, scored 24. There was an average increase of 4.2 marks; the actual gain when handouts were provided was 1.2 marks per lecture. The gain was greater for the batch comprising repeater students than for the batch of fresher students. An increase in attendance was also noted. Providing uncompleted handouts before a didactic lecture clearly increases knowledge gain; repeater students benefit more from uncompleted handouts.

  16. Quantitative Microbial Risk Assessment of Pharmaceutical Products.

    PubMed

    Eissa, Mostafa Essam

    2017-01-01

    Monitoring of microbiological quality in the pharmaceutical industry is an important criterion that is required to justify safe product release to the drug market. Good manufacturing practice and efficient control of the bioburden level of product components are critical parameters that influence the microbiological cleanliness of medicinal products. However, because microbial dispersion through the samples follows a Poisson distribution, the rate of detection of microbiologically defective samples lambda (λ) decreases when the number of defective units per batch decreases. When integrating a dose-response model of infection (P(inf)) of a specific objectionable microbe with a contamination module, the overall probability of infection from a single batch of pharmaceutical product can be estimated. The combination of P(inf) with the detectability chance of the test (P(det)) yields a value that can be used as a quantitative measure of the possibility of passing contaminated batch units of product with a certain load of a specific pathogen and infecting the final consumer without being detected in the firm. The simulation study can be used to assess the risk of contamination and infection from objectionable microorganisms for sterile and non-sterile products. LAY ABSTRACT: Microbial contamination of pharmaceutical products is a global problem that may lead to infection and possibly death. While reputable pharmaceutical companies strive to deliver microbiologically safe products, it would be helpful to apply an assessment system for the current risk associated with pharmaceutical batches delivered to the drug market. The current methodology may also be helpful in determining the degree of improvement or deterioration along the batch processing flow until reaching the final consumer. Moreover, the present system is flexible and can be applied to other industries such as food, cosmetics, or medical device manufacturing and processing to assess the microbiological risk of the processed and manufactured batch. © PDA, Inc. 2017.
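
    The Poisson argument above can be made concrete in a few lines; all parameter values below are illustrative assumptions, and the exponential dose-response form is one common choice.

```python
import math

lam = 0.05       # mean contaminated units per tested sample (Poisson), assumed
n_samples = 20   # units tested per batch, assumed
r = 1e-4         # dose-response parameter of the pathogen, assumed
dose = 100       # CFU ingested from a contaminated unit, assumed

p_miss = math.exp(-lam * n_samples)   # every tested unit comes up clean
p_inf = 1 - math.exp(-r * dose)       # P(infection | exposure), exponential model
print(f"P(batch passes undetected) = {p_miss:.3f}")
print(f"P(infection per exposure)  = {p_inf:.4f}")
print(f"combined per-batch risk    = {p_miss * p_inf:.5f}")
```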

  17. Integration Of PanDA Workload Management System With Supercomputers for ATLAS and Data Intensive Science

    NASA Astrophysics Data System (ADS)

    Klimentov, A.; De, K.; Jha, S.; Maeno, T.; Nilsson, P.; Oleynik, D.; Panitkin, S.; Wells, J.; Wenaus, T.

    2016-10-01

    The LHC, operating at CERN, is leading Big Data driven scientific explorations. Experiments at the LHC explore the fundamental nature of matter and the basic forces that shape our universe. ATLAS, one of the largest collaborations ever assembled in the sciences, is at the forefront of research at the LHC. To address an unprecedented multi-petabyte data processing challenge, the ATLAS experiment relies on a heterogeneous distributed computational infrastructure. The ATLAS experiment uses the PanDA (Production and Data Analysis) Workload Management System for managing the workflow for all data processing on over 150 data centers. Through PanDA, ATLAS physicists see a single computing facility that enables rapid scientific breakthroughs for the experiment, even though the data centers are physically scattered all over the world. While PanDA currently uses more than 250,000 cores with a peak performance of 0.3 petaFLOPS, LHC data taking runs require more resources than the grid can possibly provide. To alleviate these challenges, LHC experiments are engaged in an ambitious program to expand the current computing model to include additional resources such as the opportunistic use of supercomputers. We describe a project aimed at the integration of PanDA WMS with supercomputers in the United States, in particular with the Titan supercomputer at the Oak Ridge Leadership Computing Facility. The current approach utilizes a modified PanDA pilot framework for job submission to the supercomputers' batch queues and for local data management, with light-weight MPI wrappers to run single-threaded workloads in parallel on the LCFs' multi-core worker nodes. This implementation was tested with a variety of Monte-Carlo workloads on several supercomputing platforms for the ALICE and ATLAS experiments and has been in full production for ATLAS since September 2015. We present our current accomplishments with running PanDA at supercomputers and demonstrate our ability to use PanDA as a portal independent of the computing facilities infrastructure for High Energy and Nuclear Physics as well as other data-intensive science applications, such as bioinformatics and astro-particle physics.

  18. Different cultivation methods to acclimatise ammonia-tolerant methanogenic consortia.

    PubMed

    Tian, Hailin; Fotidis, Ioannis A; Mancini, Enrico; Angelidaki, Irini

    2017-05-01

    Bioaugmentation with ammonia-tolerant methanogenic consortia has recently been proposed as a solution to overcome ammonia inhibition during the anaerobic digestion process. However, an appropriate technology to generate ammonia-tolerant methanogenic consortia is still lacking. In this study, three basic reactor types (i.e. batch, fed-batch and continuous stirred-tank reactors (CSTR)) operated at mesophilic (37°C) and thermophilic (55°C) conditions were assessed, based on methane production efficiency, incubation time, TAN/FAN (total ammonium nitrogen/free ammonia nitrogen) levels and maximum methanogenic activity. Overall, fed-batch cultivation was clearly the most efficient method compared to batch and CSTR cultivation. Specifically, while saving up to 150% of the incubation time, fed-batch reactors were acclimatised to nearly 2-fold higher FAN levels with a 37%-153% improvement in methanogenic activity, compared to the batch method. Meanwhile, CSTR reactors were inhibited at lower ammonia levels. Finally, specific methanogenic activity tests showed that hydrogenotrophic methanogens were more active than aceticlastic methanogens at all FAN levels above 540 mg NH(3)-N L(-1). Copyright © 2017 Elsevier Ltd. All rights reserved.

  19. High-accuracy microassembly by intelligent vision systems and smart sensor integration

    NASA Astrophysics Data System (ADS)

    Schilp, Johannes; Harfensteller, Mark; Jacob, Dirk; Schilp, Michael

    2003-10-01

    Innovative production processes and strategies, from batch production to high-volume scale, play a decisive role in producing microsystems economically. In particular, assembly processes are crucial operations during the production of microsystems. Due to large batch sizes, many microsystems can be produced economically by conventional assembly techniques using specialized and highly automated assembly systems. At the laboratory stage, microsystems are mostly assembled by hand. Between these extremes there is a wide field of small and medium-sized batch production for which common automated solutions are rarely profitable. For assembly processes at these batch sizes, a flexible automated assembly system has been developed at the iwb. It is based on a modular design. Actuators like grippers, dispensers or other process tools can easily be attached thanks to a special tool-changing system, so new joining techniques can easily be implemented. A force sensor and a vision system are integrated into the tool head. The automated assembly processes are based on different optical sensors and smart actuators like high-accuracy robots or linear motors. A fiber optic sensor is integrated in the dispensing module to measure, without contact, the clearance between the dispense needle and the substrate. Robot vision systems using optical pattern recognition are also implemented as modules. In combination with relative positioning strategies, an assembly accuracy of less than 3 μm can be realized. A laser system is used for manufacturing processes like soldering.

  20. Simulated Batch Production of Penicillin

    ERIC Educational Resources Information Center

    Whitaker, A.; Walker, J. D.

    1973-01-01

    Describes a program in applied biology in which the simulation of the production of penicillin in a batch fermentor is used as a teaching technique to give students experience before handling a genuine industrial fermentation process. Details are given for the calculation of minimum production cost. (JR)
